WorldWideScience

Sample records for source detection threshold

  1. Optimizing Systems of Threshold Detection Sensors

    National Research Council Canada - National Science Library

    Banschbach, David C

    2008-01-01

    .... Below the threshold all signals are ignored. We develop a mathematical model for setting individual sensor thresholds to obtain optimal probability of detecting a significant event, given a limit on the total number of false positives allowed...

  2. Detection thresholds of macaque otolith afferents.

    Science.gov (United States)

    Yu, Xiong-Jie; Dickman, J David; Angelaki, Dora E

    2012-06-13

    The vestibular system is our sixth sense and is important for spatial perception functions, yet the sensory detection and discrimination properties of vestibular neurons remain relatively unexplored. Here we have used signal detection theory to measure detection thresholds of otolith afferents using 1 Hz linear accelerations delivered along three cardinal axes. Direction detection thresholds were measured by comparing mean firing rates centered on response peak and trough (full-cycle thresholds) or by comparing peak/trough firing rates with spontaneous activity (half-cycle thresholds). Thresholds were similar for utricular and saccular afferents, as well as for lateral, fore/aft, and vertical motion directions. When computed along the preferred direction, full-cycle direction detection thresholds were 7.54 and 3.01 cm/s² for regular and irregular firing otolith afferents, respectively. Half-cycle thresholds were approximately double, with excitatory thresholds being half as large as inhibitory thresholds. The variability in threshold among afferents was directly related to neuronal gain and did not depend on spike count variance. The exact threshold values depended on both the time window used for spike count analysis and the filtering method used to calculate mean firing rate, although differences between regular and irregular afferent thresholds were independent of analysis parameters. The fact that minimum thresholds measured in macaque otolith afferents are of the same order of magnitude as human behavioral thresholds suggests that the vestibular periphery might determine the limit on our ability to detect or discriminate small differences in head movement, with little noise added during downstream processing.

  3. Doubler system quench detection threshold

    International Nuclear Information System (INIS)

    Kuepke, K.; Kuchnir, M.; Martin, P.

    1983-01-01

    The experimental study leading to the determination of the sensitivity needed for protecting the Fermilab Doubler from damage during quenches is presented. The quench voltage thresholds involved were obtained from measurements made on Doubler cable of resistance versus temperature and voltage versus time during quenches at several currents, and from data collected during operation of the Doubler Quench Protection System as implemented in the B-12 string of 20 magnets. At 4 kA, a quench voltage threshold in excess of 5.0 V will limit the peak Doubler cable temperature to 452 K for quenches originating in the magnet coils, whereas a threshold of 0.5 V is required for quenches originating outside of the coils.

  4. QRS Detection Based on Improved Adaptive Threshold

    Directory of Open Access Journals (Sweden)

    Xuanyu Lu

    2018-01-01

    Cardiovascular disease is the leading cause of death around the world. In accomplishing quick and accurate diagnosis, automatic electrocardiogram (ECG) analysis algorithms play an important role, and their first step is QRS detection. The threshold algorithm for QRS complex detection is known for its high-speed computation and minimal memory storage. In this mobile era, the threshold algorithm can easily be ported to portable, wearable, and wireless ECG systems. However, the detection rate of the threshold algorithm still calls for improvement. An improved adaptive threshold algorithm for QRS detection is reported in this paper. The main steps of this algorithm are preprocessing, peak finding, and adaptive-threshold QRS detection. The detection rate is 99.41%, the sensitivity (Se) is 99.72%, and the specificity (Sp) is 99.69% on the MIT-BIH Arrhythmia database. A comparison is also made with two other algorithms to demonstrate its superiority. The suspicious abnormal area is indicated at the end of the algorithm, and an RR-Lorenz plot is drawn for doctors and cardiologists to use as an aid for diagnosis.
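    The record does not give the paper's exact procedure, but adaptive-threshold QRS detection is commonly implemented by keeping running estimates of the signal-peak and noise-peak levels of a preprocessed (filtered, rectified, integrated) ECG and placing the threshold between them. The sketch below is a minimal Pan-Tompkins-style illustration of that general idea; the update weights, the 0.25 threshold factor, and the refractory period are assumed values, not parameters from this paper.

```python
import numpy as np

def detect_qrs_adaptive(envelope, fs, refractory_s=0.2):
    """Toy adaptive-threshold QRS detector on a preprocessed ECG envelope.

    Illustrative only: the threshold sits between running estimates of the
    signal-peak and noise-peak levels and is updated after each local maximum.
    """
    spk = np.max(envelope[: int(2 * fs)])   # initial signal-peak estimate
    npk = np.mean(envelope[: int(2 * fs)])  # initial noise-peak estimate
    threshold = npk + 0.25 * (spk - npk)
    refractory = int(refractory_s * fs)     # ignore peaks too close to the last QRS
    qrs_idx, last = [], -refractory
    for i in range(1, len(envelope) - 1):
        is_peak = envelope[i] >= envelope[i - 1] and envelope[i] > envelope[i + 1]
        if not is_peak:
            continue
        if envelope[i] > threshold and i - last > refractory:
            qrs_idx.append(i)
            last = i
            spk = 0.125 * envelope[i] + 0.875 * spk   # update signal level
        else:
            npk = 0.125 * envelope[i] + 0.875 * npk   # update noise level
        threshold = npk + 0.25 * (spk - npk)
    return np.array(qrs_idx)
```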

  5. Moving Sources Detection System

    International Nuclear Information System (INIS)

    Coulon, Romain; Kondrasovs, Vladimir; Boudergui, Karim; Normand, Stephane

    2013-06-01

    To monitor radioactivity passing through a pipe or in a given container such as a train or a truck, radiation detection systems are commonly employed. These detectors can be used in a network set along the source track to increase the overall detection efficiency. However, detection methods are based on counting statistics analysis. The method usually implemented consists in triggering an alarm when an individual signal rises above a threshold initially estimated with respect to the natural background signal. The detection efficiency is then proportional to the number of detectors in use, because each sensor is treated as a standalone sensor. A new approach is presented in this paper that takes into account the temporal periodicity of the signals from all distributed sensors as a whole. This detection method is based not only on counting statistics but also on time-series analysis. A specific algorithm was therefore developed in our lab for this kind of application and shows a significant improvement, especially in terms of detection efficiency and false alarm reduction. We also plan to extract information about the source vector. This paper presents the theoretical approach and some preliminary results obtained in our laboratory. (authors)
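    The record does not spell out the algorithm, so the sketch below only illustrates the underlying idea of treating the distributed sensors as one temporal series: counts are shifted by the transit delay expected for a source moving at a known speed and summed, so a real moving source adds coherently across sensors while background fluctuations do not. The sensor geometry, speed handling, and scoring are invented for illustration.

```python
import numpy as np

def moving_source_score(counts, sensor_positions, speed, dt):
    """Toy coherent-summation score for a source moving past distributed sensors.

    counts:           array of shape (n_sensors, n_samples) of count rates
    sensor_positions: positions of the sensors along the track (metres)
    speed:            assumed source speed (m/s); dt: sample interval (s)
    """
    delays = (sensor_positions - sensor_positions[0]) / speed   # transit delays
    shifts = np.round(delays / dt).astype(int)
    aligned = np.empty_like(counts, dtype=float)
    for k in range(counts.shape[0]):
        # remove each sensor's expected delay so a moving source lines up
        aligned[k] = np.roll(counts[k], -shifts[k])
    summed = aligned.sum(axis=0)
    # z-score of the aligned sum; background stays near zero, a source peaks
    return (summed - summed.mean()) / summed.std()
```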

  6. Statistical Algorithm for the Adaptation of Detection Thresholds

    DEFF Research Database (Denmark)

    Stotsky, Alexander A.

    2008-01-01

    Many event detection mechanisms in spark ignition automotive engines are based on the comparison of the engine signals to the detection threshold values. Different signal qualities for new and aged engines necessitate the development of an adaptation algorithm for the detection thresholds... remains constant regardless of engine age and changing detection threshold values. This, in turn, guarantees the same event detection performance for new and aged engines/sensors. Adaptation of the engine knock detection threshold is given as an example. Publication date: 2008...

  7. Phosphatase activity tunes two-component system sensor detection threshold.

    Science.gov (United States)

    Landry, Brian P; Palanki, Rohan; Dyulgyarov, Nikola; Hartsough, Lucas A; Tabor, Jeffrey J

    2018-04-12

    Two-component systems (TCSs) are the largest family of multi-step signal transduction pathways in biology, and a major source of sensors for biotechnology. However, the input concentrations to which biosensors respond are often mismatched with application requirements. Here, we utilize a mathematical model to show that TCS detection thresholds increase with the phosphatase activity of the sensor histidine kinase. We experimentally validate this result in engineered Bacillus subtilis nitrate and E. coli aspartate TCS sensors by tuning their detection threshold up to two orders of magnitude. We go on to apply our TCS tuning method to recently described tetrathionate and thiosulfate sensors by mutating a widely conserved residue previously shown to impact phosphatase activity. Finally, we apply TCS tuning to engineer B. subtilis to sense and report a wide range of fertilizer concentrations in soil. This work will enable the engineering of tailor-made biosensors for diverse synthetic biology applications.

  8. Predicting visual acuity from detection thresholds.

    Science.gov (United States)

    Newacheck, J S; Haegerstrom-Portnoy, G; Adams, A J

    1990-03-01

    Visual performance based exclusively on high luminance and high contrast letter acuity measures often fails to predict individual performance at low contrast and low luminance. Here we measured visual acuity over a wide range of contrasts and luminances (low mesopic to photopic) for 17 young normal observers. Acuity vs. contrast functions appear to fit a single template which can be displaced laterally along the log contrast axis. The magnitude of this lateral displacement for different luminances was well predicted by the contrast threshold difference for a 4 min arc spot. The acuity vs. contrast template, taken from the mean of all 17 subjects, was used in conjunction with individual spot contrast threshold measures to predict an individual's visual acuity over a wide range of luminance and contrast levels. The accuracy of the visual acuity predictions from this simple procedure closely approximates test-retest accuracy for both positive (projected Landolt rings) and negative contrast (Bailey-Lovie charts).

  9. Threshold-based Adaptive Detection for WSN

    KAUST Repository

    Abuzaid, Abdulrahman I.

    2014-01-06

    Efficient receiver designs for wireless sensor networks (WSNs) are becoming increasingly important. Cooperative WSNs communicate with the use of L sensors. As the receiver is constrained, it can only process U out of the L sensors. Channel shortening and reduced-rank techniques were employed to design the preprocessing matrix. In this work, a receiver structure is proposed which combines the joint iterative optimization (JIO) algorithm and our proposed threshold selection criteria. This receiver structure assists in determining the optimal Uopt. It also provides the freedom to choose U

  10. Threshold-based Adaptive Detection for WSN

    KAUST Repository

    Abuzaid, Abdulrahman I.; Ahmed, Qasim Zeeshan; Alouini, Mohamed-Slim

    2014-01-01

    Efficient receiver designs for wireless sensor networks (WSNs) are becoming increasingly important. Cooperative WSNs communicate with the use of L sensors. As the receiver is constrained, it can only process U out of the L sensors. Channel shortening and reduced-rank techniques were employed to design the preprocessing matrix. In this work, a receiver structure is proposed which combines the joint iterative optimization (JIO) algorithm and our proposed threshold selection criteria. This receiver structure assists in determining the optimal Uopt. It also provides the freedom to choose U

  11. Neutron threshold activation detectors (TAD) for the detection of fissions

    Science.gov (United States)

    Gozani, Tsahi; Stevenson, John; King, Michael J.

    2011-10-01

    , called Threshold Activation Detection (TAD), is to utilize appropriate substances that can be selectively activated by the fission neutrons and not by the source radiation and then measure the radioactively decaying activation products (typically beta and gamma rays) well after the source pulse. The activation material should possess certain properties: a suitable half-life of the order of seconds; an energy threshold below which the numerous source neutrons will not activate it (e.g., 3 MeV); easily detectable activation products (typically >1 MeV beta and gamma rays) and have a usable cross-section for the selected reaction. Ideally the substance would be a part of the scintillator. There are several good material candidates for the TAD, including fluorine, which is a major constituent of available scintillators such as BaF2, CaF2 and hydrogen-free liquid fluorocarbon. Thus the fluorine activation products, in particular the beta particles, can be measured with a very high efficiency in the detector. The principles, applications and experimental results obtained with the fluorine-based TAD are discussed.

  12. Neutron threshold activation detectors (TAD) for the detection of fissions

    International Nuclear Information System (INIS)

    Gozani, Tsahi; Stevenson, John; King, Michael J.

    2011-01-01

    , called Threshold Activation Detection (TAD), is to utilize appropriate substances that can be selectively activated by the fission neutrons and not by the source radiation and then measure the radioactively decaying activation products (typically beta and gamma rays) well after the source pulse. The activation material should possess certain properties: a suitable half-life of the order of seconds; an energy threshold below which the numerous source neutrons will not activate it (e.g., 3 MeV); easily detectable activation products (typically >1 MeV beta and gamma rays) and have a usable cross-section for the selected reaction. Ideally the substance would be a part of the scintillator. There are several good material candidates for the TAD, including fluorine, which is a major constituent of available scintillators such as BaF2, CaF2 and hydrogen-free liquid fluorocarbon. Thus the fluorine activation products, in particular the beta particles, can be measured with a very high efficiency in the detector. The principles, applications and experimental results obtained with the fluorine-based TAD are discussed.

  13. Neutron threshold activation detectors (TAD) for the detection of fissions

    Energy Technology Data Exchange (ETDEWEB)

    Gozani, Tsahi, E-mail: tgozani@rapiscansystems.com [Rapiscan Laboratories, Inc., 520 Almanor Ave., Sunnyvale, CA 94085 (United States); Stevenson, John; King, Michael J. [Rapiscan Laboratories, Inc., 520 Almanor Ave., Sunnyvale, CA 94085 (United States)

    2011-10-01

    material. The technique, called Threshold Activation Detection (TAD), is to utilize appropriate substances that can be selectively activated by the fission neutrons and not by the source radiation and then measure the radioactively decaying activation products (typically beta and gamma rays) well after the source pulse. The activation material should possess certain properties: a suitable half-life of the order of seconds; an energy threshold below which the numerous source neutrons will not activate it (e.g., 3 MeV); easily detectable activation products (typically >1 MeV beta and gamma rays) and have a usable cross-section for the selected reaction. Ideally the substance would be a part of the scintillator. There are several good material candidates for the TAD, including fluorine, which is a major constituent of available scintillators such as BaF2, CaF2 and hydrogen-free liquid fluorocarbon. Thus the fluorine activation products, in particular the beta particles, can be measured with a very high efficiency in the detector. The principles, applications and experimental results obtained with the fluorine-based TAD are discussed.

  14. Experimental determination of alpha particle threshold detection in cellulose nitrate

    International Nuclear Information System (INIS)

    Knoefell, T.M.J.

    1978-01-01

    LR 115, type II, Kodak-Pathe cellulose nitrate pellicles were irradiated perpendicularly with monoenergetic alpha beams in the energy range 2.5-5.5 MeV. The alpha particle beams were produced by an intense Am-241 source using argon as the energy attenuator. After irradiation, samples were etched with NaOH solutions without agitation at 60 °C for time periods varying from 15 minutes to 3.5 hours. Measurements of track density and track diameter were made using optical microscopy. The sample composition was determined by the CHN method of combustion gas analysis, showing good agreement with the composition of cellulose trinitrate. From the detection threshold and the obtained results, it was verified that latent tracks develop only for alpha particles with a stopping power greater than 0.87 ± 0.06 MeV/(mg·cm⁻²). (M.C.K.) [pt

  15. Effect of strong fragrance on olfactory detection threshold.

    Science.gov (United States)

    Fasunla, Ayotunde James; Douglas, David Dayo; Adeosun, Aderemi Adeleke; Steinbach, Silke; Nwaorgu, Onyekwere George Benjamin

    2014-09-01

    To assess the olfactory threshold of healthy volunteers at the University College Hospital, Ibadan and to investigate the effect of perfume on their olfactory detection thresholds. A quasi-experimental study on olfactory detection thresholds of healthy volunteers from September 2013 to November 2013. Tertiary health institution. A structured questionnaire was administered to the participants in order to obtain information on sociodemographics, occupation, ability to perceive smell, use of perfume, effects of perfume on appetite and self-confidence, history of allergy, and previous nasal surgery. Participants subjectively rated their olfactory performance. Subsequently, they had olfactory detection threshold testing done at baseline and after exposure to perfume with varied concentrations of n-butanol in a forced triple response and staircase fashion. Healthy volunteers, 37 males and 63 females, were evaluated. Their ages ranged from 19 to 59 years with a mean of 31 ± 8 years. Subjectively, 94% of the participants had excellent olfactory function. In the pre-exposure forced triple response, 88% were able to detect the odor at ≤.25 mmol/l concentration, while in the post-exposure forced triple response, only 66% were able to detect the odor at ≤.25 mmol/l concentration. There was also a statistically significant difference in the olfactory detection threshold scores between the pre-exposure and post-exposure periods. Exposure to strong fragrances thus affects the olfactory detection threshold, and patients and clinicians should be aware of this and its effects on the outcome of tests of olfaction.

  16. Rate modulation detection thresholds for cochlear implant users.

    Science.gov (United States)

    Brochier, Tim; McKay, Colette; McDermott, Hugh

    2018-02-01

    The perception of temporal amplitude modulations is critical for speech understanding by cochlear implant (CI) users. The present study compared the ability of CI users to detect sinusoidal modulations of the electrical stimulation rate and current level, at different presentation levels (80% and 40% of the dynamic range) and modulation frequencies (10 and 100 Hz). Rate modulation detection thresholds (RMDTs) and amplitude modulation detection thresholds (AMDTs) were measured and compared to assess whether there was a perceptual advantage to either modulation method. Both RMDTs and AMDTs improved with increasing presentation level and decreasing modulation frequency. RMDTs and AMDTs were correlated, indicating that a common processing mechanism may underlie the perception of rate modulation and amplitude modulation, or that some subject-dependent factors affect both types of modulation detection.

  17. Variable threshold method for ECG R-peak detection.

    Science.gov (United States)

    Kew, Hsein-Ping; Jeong, Do-Un

    2011-10-01

    In this paper, a wearable belt-type ECG electrode worn around the chest for measuring the real-time ECG is produced in order to minimize the inconvenience of wearing it. The ECG signal is detected using a potential-measurement instrument system. The measured ECG signal is transmitted to a personal computer via an ultra-low-power wireless data communication unit using a Zigbee-compatible wireless sensor node. ECG signals carry a lot of clinical information for a cardiologist, and R-peak detection in the ECG is especially important. R-peak detection generally uses a fixed threshold value. There will be errors in peak detection when the baseline changes due to motion artifacts and when the signal size changes. A preprocessing stage comprising differentiation and the Hilbert transform is used as the signal preprocessing algorithm. Thereafter, a variable threshold method, which is more accurate and efficient than a fixed-threshold method, is used to detect the R-peak. R-peak detection using the MIT-BIH databases and long-term real-time ECG recordings is performed in this research in order to evaluate the performance.
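    A minimal sketch of the pipeline described (differentiation, a Hilbert-transform envelope, then a threshold that tracks the local envelope level) is given below. The window length, threshold fraction, and refractory period are assumptions rather than values from the paper.

```python
import numpy as np
from scipy.signal import hilbert

def r_peaks_variable_threshold(ecg, fs, win_s=2.0, k=0.6):
    """R-peak detection with a threshold that follows the local envelope level."""
    diff = np.diff(ecg, prepend=ecg[0])      # differentiation step
    env = np.abs(hilbert(diff))              # Hilbert-transform envelope
    win = int(win_s * fs)
    refractory = int(0.25 * fs)
    peaks, last = [], -refractory
    for i in range(1, len(env) - 1):
        lo, hi = max(0, i - win // 2), min(len(env), i + win // 2)
        thr = k * env[lo:hi].max()           # variable (locally scaled) threshold
        if env[i] > thr and env[i] >= env[i - 1] and env[i] > env[i + 1]:
            if i - last > refractory:
                peaks.append(i)
                last = i
    return np.array(peaks)
```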

  18. The problem of the detection threshold in radiation measurement

    International Nuclear Information System (INIS)

    Rose, E.; Wueneke, C.D.

    1983-01-01

    In all cases encountered in practical radiation measurement, the basic problem is to differentiate between the lowest measured value and the zero value (background, natural background radiation, etc.). For this purpose, on the mathematical side, hypothesis tests are to be applied. These give the probability of differentiating between two values having the same random spread. By means of these tests and the corresponding error theory, a uniform treatment of the subject, applicable to all problems of measuring technique alike, can be found. Two basic concepts arise in this process, which have to be defined in terms of semantics and nomenclature: the decision threshold and the detection threshold, or 'minimum detectable mean value'. At the decision threshold, one has to decide (with a given statistical error probability) whether a measured value is to be attributed to the background radiation, accepting the null hypothesis, or whether this value differs significantly from the background radiation (error of the first kind). The minimum detectable mean value is the value which, with a given decision threshold, can be determined with sufficient significance to be a measured value and thus cannot be mistaken for background radiation (alternative hypothesis, error of the second kind). Normally, the two error types are of equal importance. It may happen, however, that one type of error gains more importance, depending on the approach. (orig.) [de
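    In the usual counting-statistics notation (ISO 11929-style conventions, not taken from this article), the two concepts can be written as follows, with σ₀ the standard deviation of the net signal under the background-only hypothesis and k the standard normal quantiles for the chosen error probabilities α and β:

```latex
% Decision threshold: the null (background-only) hypothesis is rejected when
% the net signal exceeds L_C, fixing the probability of an error of the first
% kind at \alpha:
L_C = k_{1-\alpha}\,\sigma_0
% Detection threshold ("minimum detectable mean value"): the smallest true
% signal detected with probability 1-\beta (error of the second kind \beta):
L_D = L_C + k_{1-\beta}\,\sigma(L_D) \approx 2\,k_{1-\alpha}\,\sigma_0
% the approximation holding for \alpha = \beta and a signal-independent variance.
```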

  19. Passive Sonar Target Detection Using Statistical Classifier and Adaptive Threshold

    Directory of Open Access Journals (Sweden)

    Hamed Komari Alaie

    2018-01-01

    This paper presents the results of an experimental investigation of target detection with passive sonar in the Persian Gulf. Detecting propagated sounds in the water is one of the basic challenges for researchers in the sonar field. The challenge is more complex in shallow water (like the Persian Gulf) and for low-noise vessels. Generally, in passive sonar, targets are detected by the sonar equation (with a constant threshold), which increases the detection error in shallow water. The purpose of this study is to propose a new method for detecting targets in passive sonar using an adaptive threshold. In this method, the target signal (sound) is processed in the time and frequency domains. For classification, Bayesian classification is used and the posterior distribution is estimated by a Maximum Likelihood Estimation algorithm. Finally, the target is detected by combining the detection points in both domains using a Least Mean Square (LMS) adaptive filter. The results show that the proposed method improves the true detection rate by about 24% compared with the best existing detection method.
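    As a generic illustration of the LMS building block mentioned at the end of the abstract (not the authors' actual configuration), the sketch below fuses two detection-score streams, one from the time domain and one from the frequency domain, by adapting a pair of weights against a reference signal. The two-input structure, the names, and the step size are assumptions.

```python
import numpy as np

def lms_fuse(score_time, score_freq, reference, mu=0.01):
    """Generic two-input LMS filter that learns weights combining time- and
    frequency-domain detection scores so the output tracks a reference signal."""
    w = np.zeros(2)
    out = np.empty(len(reference))
    for n in range(len(reference)):
        x = np.array([score_time[n], score_freq[n]])
        out[n] = w @ x
        err = reference[n] - out[n]   # instantaneous error
        w += 2 * mu * err * x         # standard LMS weight update
    return out, w
```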

  20. Algorithmic detectability threshold of the stochastic block model

    Science.gov (United States)

    Kawamoto, Tatsuro

    2018-03-01

    The assumption that the values of model parameters are known or correctly learned, i.e., the Nishimori condition, is one of the requirements for the detectability analysis of the stochastic block model in statistical inference. In practice, however, there is no example demonstrating that we can know the model parameters beforehand, and there is no guarantee that the model parameters can be learned accurately. In this study, we consider the expectation-maximization (EM) algorithm with belief propagation (BP) and derive its algorithmic detectability threshold. Our analysis is not restricted to the community structure but includes general modular structures. Because the algorithm cannot always learn the planted model parameters correctly, the algorithmic detectability threshold is qualitatively different from the one with the Nishimori condition.

  1. Modern Adaptive Analytics Approach to Lowering Seismic Network Detection Thresholds

    Science.gov (United States)

    Johnson, C. E.

    2017-12-01

    Modern seismic networks present a number of challenges, but perhaps the most notable are those related to 1) extreme variation in station density, 2) temporal variation in station availability, and 3) the need to achieve detectability for much smaller events of strategic importance. The first of these has been reasonably addressed in the development of modern seismic associators, such as GLASS 3.0 by the USGS/NEIC, though some work still remains to be done in this area. However, the latter two challenges demand special attention. Station availability is impacted by weather, equipment failure, or the addition or removal of stations, and while thresholds have been pushed to increasingly smaller magnitudes, new algorithms are needed to achieve even lower thresholds. Station availability can be addressed by a modern, adaptive architecture that maintains specified performance envelopes using adaptive analytics coupled with complexity theory. Finally, detection thresholds can be lowered using a novel approach that tightly couples waveform analytics with the event detection and association processes based on a principled repicking algorithm that uses particle realignment for enhanced phase discrimination.

  2. Exploring light mediators with low-threshold direct detection experiments

    Energy Technology Data Exchange (ETDEWEB)

    Kahlhoefer, Felix [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); RWTH Aachen Univ. (Germany). Inst. for Theoretical Particle Physics and Cosmology; Kulkarni, Suchita [Oesterreichische Akademie der Wissenschaften, Vienna (Austria). Inst. fuer Hochenergiephysik; Wild, Sebastian [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)

    2017-11-15

    We explore the potential of future cryogenic direct detection experiments to determine the properties of the mediator that communicates the interactions between dark matter and nuclei. Due to their low thresholds and large exposures, experiments like CRESST-III, SuperCDMS SNOLAB and EDELWEISS-III will have excellent capability to reconstruct mediator masses in the MeV range for a large class of models. Combining the information from several experiments further improves the parameter reconstruction, even when taking into account additional nuisance parameters related to background uncertainties and the dark matter velocity distribution. These observations may offer the intriguing possibility of studying dark matter self-interactions with direct detection experiments.

  3. Exploring light mediators with low-threshold direct detection experiments

    International Nuclear Information System (INIS)

    Kahlhoefer, Felix

    2017-11-01

    We explore the potential of future cryogenic direct detection experiments to determine the properties of the mediator that communicates the interactions between dark matter and nuclei. Due to their low thresholds and large exposures, experiments like CRESST-III, SuperCDMS SNOLAB and EDELWEISS-III will have excellent capability to reconstruct mediator masses in the MeV range for a large class of models. Combining the information from several experiments further improves the parameter reconstruction, even when taking into account additional nuisance parameters related to background uncertainties and the dark matter velocity distribution. These observations may offer the intriguing possibility of studying dark matter self-interactions with direct detection experiments.

  4. Dynamic multiple thresholding breast boundary detection algorithm for mammograms

    International Nuclear Information System (INIS)

    Wu, Yi-Ta; Zhou Chuan; Chan, Heang-Ping; Paramagul, Chintana; Hadjiiski, Lubomir M.; Daly, Caroline Plowden; Douglas, Julie A.; Zhang Yiheng; Sahiner, Berkman; Shi Jiazheng; Wei Jun

    2010-01-01

    Purpose: Automated detection of breast boundary is one of the fundamental steps for computer-aided analysis of mammograms. In this study, the authors developed a new dynamic multiple thresholding based breast boundary (MTBB) detection method for digitized mammograms. Methods: A large data set of 716 screen-film mammograms (442 CC view and 274 MLO view) obtained from consecutive cases of an Institutional Review Board approved project were used. An experienced breast radiologist manually traced the breast boundary on each digitized image using a graphical interface to provide a reference standard. The initial breast boundary (MTBB-Initial) was obtained by dynamically adapting the threshold to the gray level range in local regions of the breast periphery. The initial breast boundary was then refined by using gradient information from horizontal and vertical Sobel filtering to obtain the final breast boundary (MTBB-Final). The accuracy of the breast boundary detection algorithm was evaluated by comparison with the reference standard using three performance metrics: The Hausdorff distance (HDist), the average minimum Euclidean distance (AMinDist), and the area overlap measure (AOM). Results: In comparison with the authors' previously developed gradient-based breast boundary (GBB) algorithm, it was found that 68%, 85%, and 94% of images had HDist errors less than 6 pixels (4.8 mm) for GBB, MTBB-Initial, and MTBB-Final, respectively. 89%, 90%, and 96% of images had AMinDist errors less than 1.5 pixels (1.2 mm) for GBB, MTBB-Initial, and MTBB-Final, respectively. 96%, 98%, and 99% of images had AOM values larger than 0.9 for GBB, MTBB-Initial, and MTBB-Final, respectively. The improvement by the MTBB-Final method was statistically significant for all the evaluation measures by the Wilcoxon signed rank test (p<0.0001). Conclusions: The MTBB approach that combined dynamic multiple thresholding and gradient information provided better performance than the breast boundary

  5. Optimal threshold detection for Málaga turbulent optical links

    DEFF Research Database (Denmark)

    Jurado-Navas, Antonio; Garrido-Balsellss, José María; del Castillo Vázquez, Miguel

    2016-01-01

    A new and generalized statistical model, called Málaga distribution (M distribution), has been derived recently to characterize the irradiance fluctuations of an unbounded optical wave front propagating through a turbulent medium under all irradiance fluctuation conditions. As great advantages associated to that model, we can indicate that it is written in a simple tractable closed-form expression and that it is able to unify most of the proposed statistical models for free-space optical communications derived until now in the scientific literature. Based on that Málaga model, we have analyzed in this paper the role of the detection threshold in a free-space optical system employing an on-off keying modulation technique and involved in different scenarios, taking into account the extinction ratio associated to the employed laser. First we have derived some analytical expressions for the lower...

  6. Computerized detection of masses on mammograms by entropy maximization thresholding

    International Nuclear Information System (INIS)

    Kom, Guillaume; Tiedeu, Alain; Feudjio, Cyrille; Ngundam, J.

    2010-03-01

    In many cases, masses in X-ray mammograms are subtle and their detection can benefit from an automated system serving as a diagnostic aid. It is to this end that the authors propose, in this paper, a new computer-aided mass detection method for breast cancer diagnosis. The first step focuses on wavelet filter enhancement, which removes the bright background due to dense breast tissues and some film artifacts while preserving features and patterns related to the masses. In the second step, the enhanced image is processed by Entropy Maximization Thresholding (EMT) to obtain segmented masses. An efficiency of 98.181% is achieved by analyzing a database of 84 mammograms previously marked by radiologists and digitized at a pixel size of 343 μm x 343 μm. The segmentation results, in terms of size of detected masses, give a relative error on mass area that is less than 8%. The performance of the proposed method has also been evaluated by means of receiver operating characteristic (ROC) analysis. This yielded areas under the ROC curve (Az) of 0.9224 and 0.9295, depending on whether the enhancement step is applied or not. Furthermore, we observe that the EMT yields excellent segmentation results compared to those found in the literature. (author)
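    Entropy-maximization thresholding is commonly formulated as Kapur's method: choose the gray level that maximizes the summed entropies of the background and foreground histogram partitions. The sketch below shows that generic formulation; it is not the exact EMT procedure of the paper, and the 8-bit histogram assumption is an added one.

```python
import numpy as np

def entropy_max_threshold(image, levels=256):
    """Kapur-style maximum-entropy threshold for an 8-bit grayscale image:
    pick the gray level maximizing the summed entropies of the background
    and foreground histogram partitions."""
    hist, _ = np.histogram(image, bins=levels, range=(0, levels))
    p = hist.astype(float) / hist.sum()
    best_t, best_h = 1, -np.inf
    for t in range(1, levels - 1):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 <= 0 or w1 <= 0:
            continue
        p0, p1 = p[:t] / w0, p[t:] / w1
        h0 = -np.sum(p0[p0 > 0] * np.log(p0[p0 > 0]))  # background entropy
        h1 = -np.sum(p1[p1 > 0] * np.log(p1[p1 > 0]))  # foreground entropy
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t
```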

  7. Optimization of Second Fault Detection Thresholds to Maximize Mission POS

    Science.gov (United States)

    Anzalone, Evan

    2018-01-01

    both magnitude and time. As such, the Navigation team is taking advantage of the INS's capability to schedule and change fault detection thresholds in flight. These values are optimized along a nominal trajectory in order to maximize probability of mission success, and reducing the probability of false positives (defined as when the INS would report a second fault condition resulting in loss of mission, but the vehicle would still meet insertion requirements within system-level margins). This paper will describe an optimization approach using Genetic Algorithms to tune the threshold parameters to maximize vehicle resilience to second fault events as a function of potential fault magnitude and time of fault over an ascent mission profile. The analysis approach, and performance assessment of the results will be presented to demonstrate the applicability of this process to second fault detection to maximize mission probability of success.

  8. How to detect and visualize extinction thresholds for structured PVA models

    NARCIS (Netherlands)

    Hildenbrandt, H.; Grimm, V.

    2006-01-01

    An extinction threshold is a population size below which extinction risk increases to beyond critical values. However, detecting extinction thresholds for structured population models is not straightforward because many different population structures may correspond to the same population size.

  9. Optimal threshold functions for fault detection and isolation

    DEFF Research Database (Denmark)

    Stoustrup, J.; Niemann, Hans Henrik; Cour-Harbo, A. la

    2003-01-01

    Fault diagnosis systems usually comprises two parts: a filtering part and a decision part, the latter typically based on threshold functions. In this paper, systematic ways to choose the threshold values are proposed. Two different test functions for the filtered signals are discussed and a method...

  10. Odour Detection Threshold Determination of Volatile Compounds in Topical Skin Formulations

    DEFF Research Database (Denmark)

    Thomsen, Birgitte Raagaard; Hyldig, Grethe; Taylor, Robert

    2018-01-01

    determination and also odour description by a trained sensory panel. In one case, the odour detection threshold value was 50 times higher (less detectable) in skin care products than in water, whereas for other volatile compounds the odour detection threshold value was only 1.5 times higher. The odour...

  11. Electrophysiological gap detection thresholds: effects of age and comparison with a behavioral measure.

    Science.gov (United States)

    Palmer, Shannon B; Musiek, Frank E

    2014-01-01

    Temporal processing ability has been linked to speech understanding ability and older adults often complain of difficulty understanding speech in difficult listening situations. Temporal processing can be evaluated using gap detection procedures. There is some research showing that gap detection can be evaluated using an electrophysiological procedure. However, there is currently no research establishing gap detection threshold using the N1-P2 response. The purposes of the current study were to 1) determine gap detection thresholds in younger and older normal-hearing adults using an electrophysiological measure, 2) compare the electrophysiological gap detection threshold and behavioral gap detection threshold within each group, and 3) investigate the effect of age on each gap detection measure. This study utilized an older adult group and younger adult group to compare performance on an electrophysiological and behavioral gap detection procedure. The subjects in this study were 11 younger, normal-hearing adults (mean = 22 yrs) and 11 older, normal-hearing adults (mean = 64.36 yrs). All subjects completed an adaptive behavioral gap detection procedure in order to determine their behavioral gap detection threshold (BGDT). Subjects also completed an electrophysiologic gap detection procedure to determine their electrophysiologic gap detection threshold (EGDT). Older adults demonstrated significantly larger gap detection thresholds than the younger adults. However, EGDT and BGDT were not significantly different in either group. The mean difference between EGDT and BGDT for all subjects was 0.43 msec. Older adults show poorer gap detection ability when compared to younger adults. However, this study shows that gap detection thresholds can be measured using evoked potential recordings and yield results similar to a behavioral measure. American Academy of Audiology.

  12. Effective temperature of an ultracold electron source based on near-threshold photoionization

    NARCIS (Netherlands)

    Engelen, W.J.; Smakman, E.P.; Bakker, D.J.; Luiten, O.J.; Vredenbregt, E.J.D.

    2014-01-01

    We present a detailed description of measurements of the effective temperature of a pulsed electron source, based on near-threshold photoionization of laser-cooled atoms. The temperature is determined by electron beam waist scans, source size measurements with ion beams, and analysis with an

  13. A system dynamics model of clinical decision thresholds for the detection of developmental-behavioral disorders

    Directory of Open Access Journals (Sweden)

    R. Christopher Sheldrick

    2016-11-01

    screening trials. Consistent with prior theory, virtual experiments suggest that physicians’ decision thresholds can be influenced and detection of disabilities improved by increasing access to referral sources and enhancing feedback regarding false negative cases. Conclusions The SD model of clinical decision-making offers a theoretically based framework to improve understanding of physicians’ behavior and the results of screening implementation trials. The SD model is also useful for initial testing of hypothesized strategies to increase detection of under-identified medical conditions.

  14. Histogram-based automatic thresholding for bruise detection of apples by structured-illumination reflectance imaging

    Science.gov (United States)

    Thresholding is an important step in the segmentation of image features, and the existing methods are not all effective when the image histogram exhibits a unimodal pattern, which is common in defect detection of fruit. This study was aimed at developing a general automatic thresholding methodology ...

  15. Threshold-Based Relay Selection for Detect-and-Forward Relaying in Cooperative Wireless Networks

    Directory of Open Access Journals (Sweden)

    Fan Yijia

    2010-01-01

    This paper studies two-hop cooperative demodulate-and-forward relaying using multiple relays in wireless networks. A threshold-based relay selection scheme is considered, in which the reliable relays are determined by comparing the source-relay SNR to a threshold, and one of the reliable relays is selected by the destination based on the relay-destination SNR. The exact bit error rate of this scheme is derived, and a simple threshold function is proposed. It is shown that the network achieves full diversity order under the proposed threshold, where the diversity order is determined by the number of relays in the network. Unlike some other full-diversity-achieving protocols in the literature, the requirement that the instantaneous/average SNRs of the source-relay links be known at the destination is eliminated by using the appropriate SNR threshold.
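    The selection rule described in the abstract can be illustrated with a small Monte Carlo sketch: relays whose source-relay SNR clears the threshold form the reliable set, and the destination picks the reliable relay with the largest relay-destination SNR. The Rayleigh-fading model (exponential SNRs) and the numeric threshold below are assumptions for illustration only.

```python
import numpy as np

def select_relay(snr_sr, snr_rd, threshold):
    """Return the index of the selected relay, or None if no relay is reliable.

    snr_sr: instantaneous source-relay SNRs (one per relay)
    snr_rd: instantaneous relay-destination SNRs (one per relay)
    """
    reliable = np.where(snr_sr >= threshold)[0]        # relays that decoded reliably
    if reliable.size == 0:
        return None
    return reliable[np.argmax(snr_rd[reliable])]       # best relay-destination link

# toy usage under Rayleigh fading (exponentially distributed SNRs)
rng = np.random.default_rng(0)
snr_sr = rng.exponential(10.0, size=4)
snr_rd = rng.exponential(10.0, size=4)
print(select_relay(snr_sr, snr_rd, threshold=3.0))
```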

  16. Detecting fatigue thresholds from electromyographic signals: A systematic review on approaches and methodologies.

    Science.gov (United States)

    Ertl, Peter; Kruse, Annika; Tilp, Markus

    2016-10-01

    The aim of the current paper was to systematically review the relevant existing electromyographic threshold concepts within the literature. The electronic databases MEDLINE and SCOPUS were screened for papers published between January 1980 and April 2015 including the keywords: neuromuscular fatigue threshold, anaerobic threshold, electromyographic threshold, muscular fatigue, aerobic-anaerobic transition, ventilatory threshold, exercise testing, and cycle ergometer. Thirty-two articles were assessed with regard to their electromyographic methodologies, description of results, statistical analysis and test protocols. Only one article was of very good quality, 21 were of good quality, and two articles were of very low quality. The review process revealed that: (i) there is consistent evidence of one or two non-linear increases of EMG that might reflect the additional recruitment of motor units (MU) or different fiber types during fatiguing cycle ergometer exercise, (ii) most studies reported no statistically significant difference between electromyographic and metabolic thresholds, (iii) one-minute protocols with increments between 10 and 25 W appear most appropriate to detect the muscular threshold, (iv) threshold detection from the vastus medialis, vastus lateralis, and rectus femoris is recommended, and (v) there is a great variety in study protocols, measurement techniques, and data processing. Therefore, we recommend further research and standardization in the detection of EMGTs.

  17. Data-Driven Jump Detection Thresholds for Application in Jump Regressions

    Directory of Open Access Journals (Sweden)

    Robert Davies

    2018-03-01

    This paper develops a method to select the threshold in threshold-based jump detection methods. The method is motivated by an analysis of threshold-based jump detection methods in the context of jump-diffusion models. We show that, over the range of sampling frequencies a researcher is most likely to encounter, the usual in-fill asymptotics provide a poor guide for selecting the jump threshold. Because of this, we develop a sample-based method. Our method estimates the number of jumps over a grid of thresholds and selects the optimal threshold at what we term the ‘take-off’ point in the estimated number of jumps. We show that this method consistently estimates the jumps and their indices as the sampling interval goes to zero. In several Monte Carlo studies we evaluate the performance of our method based on its ability to accurately locate jumps and its ability to distinguish between true jumps and large diffusive moves. In one of these Monte Carlo studies we evaluate the performance of our method in a jump regression context. Finally, we apply our method in two empirical studies. In one we estimate the number of jumps and report the jump threshold our method selects for three commonly used market indices. In the other empirical application we perform a series of jump regressions using our method to select the jump threshold.
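    A rough sketch of the "count jumps over a grid of thresholds" idea follows. Locating the take-off point as the threshold just past the largest relative drop in the jump count is only one plausible reading of the procedure, not the authors' estimator.

```python
import numpy as np

def jump_counts(returns, grid):
    """Estimated number of jumps at each candidate threshold."""
    return np.array([np.sum(np.abs(returns) > g) for g in grid], dtype=float)

def takeoff_threshold(returns, grid):
    """Crude 'take-off' heuristic: pick the threshold just past the largest
    relative drop in the estimated jump count, where the count curve flattens."""
    counts = jump_counts(returns, grid)
    rel_change = np.diff(counts) / np.maximum(counts[:-1], 1.0)
    return grid[int(np.argmin(rel_change)) + 1]
```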

  18. Induced Temporal Signatures for Point-Source Detection

    International Nuclear Information System (INIS)

    Stephens, Daniel L.; Runkle, Robert C.; Carlson, Deborah K.; Peurrung, Anthony J.; Seifert, Allen; Wyatt, Cory R.

    2005-01-01

    Detection of radioactive point-sized sources is inherently divided into two regimes encompassing stationary and moving detectors. The two cases differ in their treatment of background radiation and its influence on detection sensitivity. In the stationary detector case the statistical fluctuation of the background determines the minimum detectable quantity. In the moving detector case the detector may be subjected to widely and irregularly varying background radiation, as a result of geographical and environmental variation. This significant systematic variation, in conjunction with the statistical variation of the background, requires a conservative threshold to be selected to yield the same false-positive rate as the stationary detection case. This results in lost detection sensitivity for real sources. This work focuses on a simple and practical modification of the detector geometry that increases point-source recognition via a distinctive temporal signature. A key part of this effort is the integrated development of both detector geometries that induce a highly distinctive signature for point sources and statistical algorithms able to optimize detection of this signature amidst varying background. The identification of temporal signatures for point sources has been demonstrated and compared with the canonical method, showing good results. This work demonstrates that temporal signatures are efficient at increasing point-source discrimination in a moving detector system.

  19. CLEAR: Prospects for a low threshold neutrino experiment at the Spallation Neutron Source

    International Nuclear Information System (INIS)

    Scholberg, Kate

    2008-01-01

    A low-threshold neutrino scattering experiment at a high intensity stopped-pion neutrino source has the potential to measure coherent neutral current neutrino-nucleus elastic scattering. A promising prospect for the measurement of this process is a proposed noble-liquid-based experiment, dubbed CLEAR (Coherent Low Energy A (Nuclear) Recoils), at the Spallation Neutron Source. This poster will describe the CLEAR proposal and its physics reach.

  20. Detection Thresholds of Falling Snow From Satellite-Borne Active and Passive Sensors

    Science.gov (United States)

    Skofronick-Jackson, Gail M.; Johnson, Benjamin T.; Munchak, S. Joseph

    2013-01-01

    There is an increased interest in detecting and estimating the amount of falling snow reaching the Earth's surface in order to fully capture the global atmospheric water cycle. An initial step toward global spaceborne falling snow algorithms for current and future missions includes determining the thresholds of detection for various active and passive sensor channel configurations and falling snow events over land surfaces and lakes. In this paper, cloud resolving model simulations of lake effect and synoptic snow events were used to determine the minimum amount of snow (threshold) that could be detected by the following instruments: the W-band radar of CloudSat, the Global Precipitation Measurement (GPM) Dual-Frequency Precipitation Radar (DPR) Ku- and Ka-bands, and the GPM Microwave Imager. Eleven different nonspherical snowflake shapes were used in the analysis. Notable results include the following: 1) The W-band radar has detection thresholds more than an order of magnitude lower than the future GPM radars; 2) the cloud structure macrophysics influences the thresholds of detection for passive channels (e.g., snow events with larger ice water paths and thicker clouds are easier to detect); 3) the snowflake microphysics (mainly shape and density) plays a large role in the detection threshold for active and passive instruments; 4) with reasonable assumptions, the passive 166-GHz channel has detection threshold values comparable to those of the GPM DPR Ku- and Ka-band radars, with approximately 0.05 g·m⁻³ detected at the surface, or an approximately 0.5-1.0 mm·h⁻¹ melted snow rate. This paper provides information on the light snowfall events missed by the sensors and not captured in global estimates.

  1. Detecting wood surface defects with fusion algorithm of visual saliency and local threshold segmentation

    Science.gov (United States)

    Wang, Xuejuan; Wu, Shuhang; Liu, Yunpeng

    2018-04-01

    This paper presents a new method for wood defect detection. It solves the over-segmentation problem of local threshold segmentation methods by effectively combining visual saliency and local threshold segmentation. Firstly, defect areas are coarsely located by using the spectral residual method to calculate their global visual saliency. Then, threshold segmentation with the maximum inter-class variance (Otsu) method is adopted to position and segment the wood surface defects precisely around the coarsely located areas. Lastly, we use mathematical morphology to process the binary images after segmentation, which reduces noise and small false objects. Experiments on test images of insect holes, dead knots and sound knots show that the proposed method obtains ideal segmentation results and is superior to existing segmentation methods based on edge detection, Otsu thresholding and threshold segmentation.
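    The spectral residual step mentioned above follows Hou and Zhang's formulation: subtract a smoothed log-amplitude spectrum from the original and invert the result with the original phase. The sketch below is a minimal NumPy version of that step only; the 3x3 smoothing kernel is an assumption, and the fusion with Otsu thresholding and morphology is not reproduced.

```python
import numpy as np

def spectral_residual_saliency(gray):
    """Minimal spectral-residual saliency map for a 2-D grayscale array."""
    h, w = gray.shape
    f = np.fft.fft2(gray)
    log_amp = np.log(np.abs(f) + 1e-8)
    phase = np.angle(f)
    # spectral residual: log amplitude minus its 3x3 local average
    pad = np.pad(log_amp, 1, mode="edge")
    avg = sum(pad[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    residual = log_amp - avg
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    return sal / sal.max()
```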

  2. The Reduction of Vertical Interchannel Crosstalk: The Analysis of Localisation Thresholds for Natural Sound Sources

    Directory of Open Access Journals (Sweden)

    Rory Wallis

    2017-03-01

    In subjective listening tests, natural sound sources were presented to subjects as vertically-oriented phantom images from two layers of loudspeakers, ‘height’ and ‘main’. Subjects were required to reduce the amplitude of the height layer until the position of the resultant sound source matched that of the same source presented from the main layer only (the localisation threshold). Delays of 0, 1 and 10 ms were applied to the height layer with respect to the main, with vertical stereophonic and quadraphonic conditions being tested. The results of the study showed that the localisation thresholds obtained were not significantly affected by sound source or presentation method. Instead, the only variable whose effect was significant was interchannel time difference (ICTD). For ICTD of 0 ms, the median threshold was −9.5 dB, which was significantly lower than the −7 dB found for both 1 and 10 ms. The results of the study have implications both for the recording of sound sources for three-dimensional (3D) audio reproduction formats and also for the rendering of 3D images.

  3. The ship edge feature detection based on high and low threshold for remote sensing image

    Science.gov (United States)

    Li, Xuan; Li, Shengyang

    2018-05-01

    In this paper, a method based on high and low thresholds is proposed to detect ship edge features, addressing the low accuracy caused by noise. The relationship between the human visual system and the target features is analyzed, and the ship target is determined by detecting edge features. Firstly, the second-order differential method is used to enhance image quality. Secondly, to improve the edge operator, high and low threshold contrast is introduced to enhance edge and non-edge points; the edges are treated as the foreground and the non-edges as the background, and image segmentation is used to achieve edge detection and remove false edges. Finally, the edge features are described based on the edge detection results, and the ship target is determined. The experimental results show that the proposed method can effectively reduce the number of false edges in edge detection and achieves high accuracy in remote sensing ship edge detection.

  4. Effects of visual erotic stimulation on vibrotactile detection thresholds in men.

    Science.gov (United States)

    Jiao, Chuanshu; Knight, Peter K; Weerakoon, Patricia; Turman, A Bulent

    2007-12-01

    This study examined the effects of sexual arousal on vibration detection thresholds in the right index finger of 30 healthy, heterosexual males who reported no sexual dysfunction. Vibrotactile detection thresholds at frequencies of 30, 60, and 100 Hz were assessed before and after watching erotic and control videos using a forced-choice, staircase method. A mechanical stimulator was used to produce the vibratory stimulus. Results were analyzed using repeated measures analysis of variance. After watching the erotic video, the vibrotactile detection thresholds at 30, 60, and 100 Hz were significantly reduced following the erotic stimulus. The results show that sexual arousal resulted in an increase in vibrotactile sensitivity to low frequency stimuli in the index finger of sexually functional men.

  5. The Chandra Source Catalog: Background Determination and Source Detection

    Science.gov (United States)

    McCollough, Michael; Rots, Arnold; Primini, Francis A.; Evans, Ian N.; Glotfelty, Kenny J.; Hain, Roger; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Danny G. Gibbs, II; Grier, John D.; Hall, Diane M.; Harbo, Peter N.; He, Xiang Qun (Helen); Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Refsdal, Brian L.; Siemiginowska, Aneta L.; Sundheim, Beth A.; Tibbetts, Michael S.; van Stone, David W.; Winkelman, Sherry L.; Zografou, Panagoula

    2009-09-01

    The Chandra Source Catalog (CSC) is a major project in which all of the pointed imaging observations taken by the Chandra X-Ray Observatory are used to generate one of the most extensive X-ray source catalogs produced to date. Early in the development of the CSC it was recognized that the ability to estimate local background levels in an automated fashion would be critical for essential CSC tasks such as source detection, photometry, sensitivity estimates, and source characterization. We present a discussion of how such background maps are created directly from the Chandra data and how they are used in source detection. The general background for Chandra observations is rather smoothly varying, containing only low spatial frequency components. However, in the case of ACIS data, a high spatial frequency component is added that is due to the readout streaks of the CCD chips. We discuss how these components can be estimated reliably using the Chandra data and what limitations and caveats should be considered in their use. We will discuss the source detection algorithm used for the CSC and the effects of the background images on the detection results. We will also touch on some of the Catalog Inclusion and Quality Assurance criteria applied to the source detection results. This work is supported by NASA contract NAS8-03060 (CXC).

  6. Chandra Source Catalog: Background Determination and Source Detection

    Science.gov (United States)

    McCollough, Michael L.; Rots, A. H.; Primini, F. A.; Evans, I. N.; Glotfelty, K. J.; Hain, R.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, J. D.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Refsdal, B. L.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; Van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-01-01

    The Chandra Source Catalog (CSC) is a major project in which all of the pointed imaging observations taken by the Chandra X-Ray Observatory will be used to generate the most extensive X-ray source catalog produced to date. Early in the development of the CSC it was recognized that the ability to estimate local background levels in an automated fashion would be critical for essential CSC tasks such as source detection, photometry, sensitivity estimates, and source characterization. We present a discussion of how such background maps are created directly from the Chandra data and how they are used in source detection. The general background for Chandra observations is rather smoothly varying, containing only low spatial frequency components. However, in the case of ACIS data, a high spatial frequency component is added that is due to the readout streaks of the CCD chips. We discuss how these components can be estimated reliably using the Chandra data and what limitations and caveats should be considered in their use. We will discuss the source detection algorithm used for the CSC and the effects of the background images on the detection results. We will also touch on some of the Catalog Inclusion and Quality Assurance criteria applied to the source detection results. This work is supported by NASA contract NAS8-03060 (CXC).

  7. Defect Detection of Steel Surfaces with Global Adaptive Percentile Thresholding of Gradient Image

    Science.gov (United States)

    Neogi, Nirbhar; Mohanta, Dusmanta K.; Dutta, Pranab K.

    2017-12-01

    Steel strips are used extensively for white goods, auto bodies and other purposes where surface defects are not acceptable. On-line surface inspection systems can effectively detect and classify defects and help in taking corrective actions. For detection of defects, the use of gradients is very popular for highlighting and subsequently segmenting areas of interest in a surface inspection system. Most of the time, segmentation by a fixed-value threshold leads to unsatisfactory results. As defects can be both very small and large in size, segmentation of a gradient image based on percentile thresholding can lead to inadequate or excessive segmentation of defective regions. A global adaptive percentile thresholding of the gradient image has been formulated for blister defects and water deposits (a pseudo defect) in steel strips. The developed method adaptively changes the percentile value used for thresholding depending on the number of pixels above specific gray levels of the gradient image. The method is able to segment defective regions selectively, preserving the characteristics of defects irrespective of their size. The developed method performs better than the Otsu method of thresholding and an adaptive thresholding method based on local properties.
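    As a rough illustration of the idea, the sketch below applies a percentile threshold to a Sobel gradient image and adapts the percentile according to how many gradient pixels exceed a fixed gray-level cut. The cut value, percentile schedule and function name are hypothetical placeholders, not the parameters used by the authors.

```python
import numpy as np
from scipy import ndimage

def adaptive_percentile_threshold(image, base_percentile=99.0,
                                  gray_cut=64, busy_fraction=0.05):
    """Segment bright defect regions in a gradient image.

    The percentile used for thresholding is lowered when many gradient
    pixels exceed `gray_cut` (a large or strongly textured defect), so
    that big defects are not under-segmented.  All numeric values are
    illustrative placeholders, not the published parameters.
    """
    # Gradient magnitude via Sobel filters.
    gx = ndimage.sobel(image.astype(float), axis=1)
    gy = ndimage.sobel(image.astype(float), axis=0)
    grad = np.hypot(gx, gy)

    # Fraction of pixels with a "high" gradient decides the percentile.
    frac_high = np.mean(grad > gray_cut)
    percentile = base_percentile - 10.0 * min(frac_high / busy_fraction, 1.0)

    threshold = np.percentile(grad, percentile)
    return grad > threshold, threshold

# Usage on a synthetic strip image with one bright blister-like blob.
if __name__ == "__main__":
    img = np.zeros((200, 200))
    img[80:120, 90:130] = 200.0
    mask, thr = adaptive_percentile_threshold(img)
    print("threshold =", round(thr, 1), "defect pixels =", int(mask.sum()))
```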

  8. Comparisons between detection threshold and loudness perception for individual cochlear implant channels

    Science.gov (United States)

    Bierer, Julie Arenberg; Nye, Amberly D

    2014-01-01

    Objective: The objective of the present study, performed in cochlear implant listeners, was to examine how the level of current required to detect single-channel electrical pulse trains relates to loudness perception on the same channel. The working hypothesis was that channels with relatively high thresholds, when measured with a focused current pattern, interface poorly to the auditory nerve. For such channels a smaller dynamic range between perceptual threshold and the most comfortable loudness would result, in part, from a greater sensitivity to changes in electrical field spread compared to low-threshold channels. The narrower range of comfortable listening levels may have important implications for speech perception. Design: Data were collected from eight adult cochlear implant listeners implanted with the HiRes90k cochlear implant (Advanced Bionics Corp.). The partial tripolar (pTP) electrode configuration, consisting of one intracochlear active electrode, two flanking electrodes carrying a fraction (σ) of the return current, and an extracochlear ground, was used for stimulation. Single-channel detection thresholds and most comfortable listening levels were acquired using the most focused pTP configuration possible (σ ≥ 0.8) to identify three channels for further testing – those with the highest, median, and lowest thresholds – for each subject. Threshold, equal-loudness contours (at 50% of the monopolar dynamic range), and loudness growth functions were measured for each of these three test channels using various partial tripolar fractions. Results: For all test channels, thresholds increased as the electrode configuration became more focused. The rate of increase with the focusing parameter σ was greatest for the high-threshold channel compared to the median- and low-threshold channels. The 50% equal-loudness contours exhibited similar rates of increase in level across test channels and subjects. Additionally, test channels with the highest

  9. Stress lowers the detection threshold for foul-smelling 2-mercaptoethanol.

    Science.gov (United States)

    Pacharra, Marlene; Schäper, Michael; Kleinbeck, Stefan; Blaszkewicz, Meinolf; Wolf, Oliver T; van Thriel, Christoph

    2016-01-01

    Previous studies have reported enhanced vigilance for threat-related information in response to acute stress. While it is known that acute stress modulates sensory systems in humans, its impact on olfaction and the olfactory detection of potential threats is less clear. Two psychophysical experiments examined whether acute stress lowers the detection threshold for foul-smelling 2-mercaptoethanol. Participants in Experiment 1 (N = 30) and Experiment 2 (N = 32) were randomly allocated to a control group or a stress group. Participants in the stress group underwent a purely psychosocial stressor (public mental arithmetic) in Experiment 1 and a stressor that combined a physically demanding task with social-evaluative threat (the socially evaluated cold-pressor test) in Experiment 2. In both experiments, olfactory detection thresholds were repeatedly assessed by means of dynamic dilution olfactometry. Each threshold measurement consisted of three trials conducted using an ascending method of limits. Participants in the stress groups showed the expected changes in heart rate, salivary cortisol, and mood measures in response to stress. About 20 min after the stressor, participants in the stress groups could detect 2-mercaptoethanol at a lower concentration than participants in the corresponding control groups. Our results show that acute stress lowers the detection threshold for a malodor.

  10. Automatic Threshold Determination for a Local Approach of Change Detection in Long-Term Signal Recordings

    Directory of Open Access Journals (Sweden)

    David Hewson

    2007-01-01

    Full Text Available CUSUM (cumulative sum) is a well-known method that can be used to detect changes in a signal when the parameters of this signal are known. This paper presents an adaptation of the CUSUM-based change detection algorithms to long-term signal recordings where the various hypotheses contained in the signal are unknown. The starting point of the work was the dynamic cumulative sum (DCS) algorithm, previously developed for application to long-term electromyography (EMG) recordings. DCS has been improved in two ways. The first was a new procedure to estimate the distribution parameters in order to ensure that the detectability property is respected. The second was the definition of two separate, automatically determined thresholds. One of them (the lower threshold) acted to stop the estimation process; the other (the upper threshold) was applied to the detection function. The automatic determination of the thresholds was based on the Kullback-Leibler distance, which gives information about the distance between the detected segments (events). Tests on simulated data demonstrated the efficiency of these improvements of the DCS algorithm.
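    For readers unfamiliar with the underlying detector, a minimal one-sided CUSUM for a mean shift in Gaussian noise is sketched below; the DCS variant and the automatic Kullback-Leibler-based thresholds described in the paper are not reproduced here, and the drift and threshold values are illustrative.

```python
import numpy as np

def cusum_detect(x, target_mean, drift=0.5, threshold=5.0):
    """One-sided CUSUM: flag the first sample where the cumulative
    positive deviation from `target_mean` (minus a drift allowance)
    exceeds `threshold`.  Returns the alarm index or None."""
    s = 0.0
    for i, xi in enumerate(x):
        s = max(0.0, s + (xi - target_mean) - drift)
        if s > threshold:
            return i
    return None

# Usage: a signal whose mean jumps from 0 to 2 at sample 300.
rng = np.random.default_rng(0)
signal = np.concatenate([rng.normal(0, 1, 300), rng.normal(2, 1, 300)])
print("change detected at sample", cusum_detect(signal, target_mean=0.0))
```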

  11. Radioactive sealed sources: Reasonable accountability, exemption, and licensing activity thresholds -- A technical basis

    International Nuclear Information System (INIS)

    Lee, D.W.; Shingleton, K.L.

    1996-01-01

    Perhaps owing to their small size and portability, some radiation accidents/incidents have involved radioactive sealed sources (RSSs). As a result, programs for the control and accountability of RSSs have come to be recommended and put in place that essentially require RSSs to be controlled in a manner different from bulk, unsealed radioactive material. Crucial to determining the total number of RSSs for which manpower-intensive radiation protection surveillance is provided is the individual RSS activity above which such surveillance is required and below which such effort is not considered cost effective. Individual RSS activity thresholds are typically determined through scenarios which impart a chosen internal or external limiting dose to Reference Man under specified exposure conditions. The resultant RSS threshold activity levels are meaningful only to the extent that the assumed scenario exposure parameters are realistic and technically based. A review of how the Department of Energy (DOE), the International Atomic Energy Agency (IAEA), and the Nuclear Regulatory Commission (NRC) have determined their respective accountability, exemption, and licensing threshold activity values is provided. Finally, a fully explained method, using references readily available to practicing health physicists, is developed with realistic, technically based calculation parameters by which RSS threshold activities may be locally generated.

  12. Effective temperature of an ultracold electron source based on near-threshold photoionization.

    Science.gov (United States)

    Engelen, W J; Smakman, E P; Bakker, D J; Luiten, O J; Vredenbregt, E J D

    2014-01-01

    We present a detailed description of measurements of the effective temperature of a pulsed electron source based on near-threshold photoionization of laser-cooled atoms. The temperature is determined by electron beam waist scans, source size measurements with ion beams, and analysis with an accurate beam line model. Experimental data are presented for the source temperature as a function of the wavelength of the photoionization laser, for both nanosecond and femtosecond ionization pulses. For the nanosecond laser, temperatures as low as 14 ± 3 K were found; for femtosecond photoionization, 30 ± 5 K is possible. With a typical source size of 25 μm, this results in electron bunches with a relative transverse coherence length in the 10⁻⁴ range and an emittance of a few nm rad. © 2013 Elsevier B.V. All rights reserved.

  13. [A cloud detection algorithm for MODIS images combining Kmeans clustering and multi-spectral threshold method].

    Science.gov (United States)

    Wang, Wei; Song, Wei-Guo; Liu, Shi-Xing; Zhang, Yong-Ming; Zheng, Hong-Yang; Tian, Wei

    2011-04-01

    An improved cloud detection method combining K-means clustering and a multi-spectral threshold approach is described. On the basis of landmark spectrum analysis, MODIS data are initially categorized into two major classes by the K-means method. The first class includes clouds, smoke and snow, and the second class includes vegetation, water and land. A multi-spectral threshold detection is then applied to the first class to eliminate interference such as smoke and snow. The method is tested with MODIS data acquired at different times and under different underlying surface conditions. Visual assessment of the algorithm's performance shows that it can effectively detect small areas of cloud pixels and exclude the interference of the underlying surface, which provides a good foundation for a subsequent fire detection approach.
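    A toy version of the two-stage idea is sketched below, assuming the relevant MODIS bands have already been extracted into per-pixel arrays; the band choice, threshold values and function name are placeholders, not the ones used in the paper.

```python
import numpy as np
from sklearn.cluster import KMeans

def detect_cloud(reflectance_vis, brightness_temp_11um,
                 bt_cloud_max=285.0, vis_cloud_min=0.3):
    """Two-stage cloud mask: K-means separates bright/cold pixels from
    the background, then simple spectral thresholds remove snow/smoke
    lookalikes.  Threshold values are illustrative only."""
    h, w = reflectance_vis.shape
    feats = np.column_stack([reflectance_vis.ravel(),
                             brightness_temp_11um.ravel()])

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(feats)
    labels = labels.reshape(h, w)

    # The cluster with the higher mean visible reflectance is the
    # "cloud/smoke/snow" candidate class.
    cloudy_label = int(np.argmax([reflectance_vis[labels == k].mean()
                                  for k in (0, 1)]))
    candidate = labels == cloudy_label

    # Multi-spectral thresholds: clouds are bright in the visible band
    # and cold in the 11 um brightness temperature.
    return candidate & (reflectance_vis > vis_cloud_min) \
                     & (brightness_temp_11um < bt_cloud_max)
```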

  14. Verification of threshold activation detection (TAD) technique in prompt fission neutron detection using scintillators containing 19F

    Science.gov (United States)

    Sibczynski, P.; Kownacki, J.; Moszyński, M.; Iwanowska-Hanke, J.; Syntfeld-Każuch, A.; Gójska, A.; Gierlik, M.; Kaźmierczak, Ł.; Jakubowska, E.; Kędzierski, G.; Kujawiński, Ł.; Wojnarowicz, J.; Carrel, F.; Ledieu, M.; Lainé, F.

    2015-09-01

    In the present study, ⌀ 5'' × 3'' and ⌀ 2'' × 2'' EJ-313 liquid fluorocarbon scintillators, as well as a ⌀ 2'' × 3'' BaF2 scintillator, were exposed to neutrons from a 252Cf neutron source and a Sodern Genie 16GT deuterium-tritium (D+T) neutron generator. The scintillators' responses to β- particles with a maximum endpoint energy of 10.4 MeV from n+19F reactions were studied. The response of a ⌀ 5'' × 3'' BC-408 plastic scintillator was also studied as a reference. The β- particles are products of the interaction of fast neutrons with 19F, which is a component of the EJ-313 and BaF2 scintillators. This method of fast neutron detection via fluorine activation is known as Threshold Activation Detection (TAD) and was proposed for the detection of prompt photofission neutrons from fissionable and Special Nuclear Materials (SNM) in the field of Homeland Security and Border Monitoring. Measurements of the number of counts between 6.0 and 10.5 MeV with a 252Cf source showed that the relative neutron detection efficiency ratio, defined as ε(BaF2)/ε(EJ-313, 5''), is 32.0% ± 2.3% and 44.6% ± 3.4% for front-on and side-on orientation of the BaF2, respectively. Moreover, the ⌀ 5'' EJ-313 and the side-on oriented BaF2 were also exposed to neutrons from the D+T neutron generator, and the relative efficiency ε(BaF2)/ε(EJ-313, 5'') was estimated to be 39.3%. Measurements of prompt photofission neutrons with the BaF2 detector, by means of data acquisition after irradiation (out-of-beam) of nuclear material and between the beam pulses (beam-off), were also conducted on the 9 MeV LINAC of the SAPHIR facility.

  15. Effect of gap detection threshold on consistency of speech in children with speech sound disorder.

    Science.gov (United States)

    Sayyahi, Fateme; Soleymani, Zahra; Akbari, Mohammad; Bijankhan, Mahmood; Dolatshahi, Behrooz

    2017-02-01

    The present study examined the relationship between gap detection threshold and speech error consistency in children with speech sound disorder. The participants were children five to six years of age who were categorized into three groups: typical speech, consistent speech disorder (CSD) and inconsistent speech disorder (ISD). The phonetic gap detection threshold test used in this study is a valid test comprising six syllables with inter-stimulus intervals between 20 and 300 ms. The participants were asked to listen to the recorded stimuli three times and indicate whether they heard one or two sounds. There was no significant difference between the typical and CSD groups (p=0.55), but there were significant differences in performance between the ISD and CSD groups and between the ISD and typical groups (p=0.00). The ISD group discriminated between speech sounds at a higher threshold. Children with inconsistent speech errors could not distinguish speech sounds during time-limited phonetic discrimination. It is suggested that inconsistency in speech is a representation of inconsistency in auditory perception, which is caused by a high gap detection threshold. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Threshold-based system for noise detection in multilead ECG recordings

    International Nuclear Information System (INIS)

    Jekova, Irena; Krasteva, Vessela; Christov, Ivaylo; Abächerli, Roger

    2012-01-01

    This paper presents a system for detection of the most common noise types seen on the electrocardiogram (ECG) in order to evaluate whether an episode from 12-lead ECG is reliable for diagnosis. It implements criteria for estimation of the noise corruption level in specific frequency bands, aiming to identify the main sources of ECG quality disruption, such as missing signal or limited dynamics of the QRS components above 4 Hz; presence of high amplitude and steep artifacts seen above 1 Hz; baseline drift estimated at frequencies below 1 Hz; power–line interference in a band ±2 Hz around its central frequency; high-frequency and electromyographic noises above 20 Hz. All noise tests are designed to process the ECG series in the time domain, including 13 adjustable thresholds for amplitude and slope criteria which are evaluated in adjustable time intervals, as well as number of leads. The system allows flexible extension toward application-specific requirements for the noise levels in acceptable quality ECGs. Training of different thresholds’ settings to determine different positive noise detection rates is performed with the annotated set of 1000 ECGs from the PhysioNet database created for the Computing in Cardiology Challenge 2011. Two implementations are highlighted on the receiver operating characteristic (area 0.968) to fit to different applications. The implementation with high sensitivity (Se = 98.7%, Sp = 80.9%) appears as a reliable alarm when there are any incidental problems with the ECG acquisition, while the implementation with high specificity (Sp = 97.8%, Se = 81.8%) is less susceptible to transient problems but rather validates noisy ECGs with acceptable quality during a small portion of the recording. (paper)
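    The published system combines 13 tuned thresholds across several leads; the fragment below only illustrates the flavor of two such time-domain checks (baseline drift below 1 Hz and high-frequency noise above 20 Hz) on a single lead, with made-up limit values and a hypothetical function name.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def lead_quality_flags(ecg, fs=500.0, drift_limit_mv=0.3, hf_limit_mv=0.05):
    """Return noise flags for one ECG lead (amplitudes in mV).

    Both limits are illustrative; the published system uses 13 tuned
    thresholds evaluated over adjustable time windows and lead counts."""
    # Baseline drift: energy below 1 Hz.
    b, a = butter(2, 1.0 / (fs / 2), btype="low")
    baseline = filtfilt(b, a, ecg)
    drift_flag = np.ptp(baseline) > drift_limit_mv

    # High-frequency / electromyographic noise: energy above 20 Hz.
    b, a = butter(2, 20.0 / (fs / 2), btype="high")
    hf = filtfilt(b, a, ecg)
    hf_flag = np.std(hf) > hf_limit_mv

    return {"baseline_drift": bool(drift_flag), "hf_noise": bool(hf_flag)}

# Usage with a synthetic 10 s lead: slow drift plus broadband noise.
fs = 500.0
t = np.arange(0, 10, 1 / fs)
lead = 0.4 * np.sin(2 * np.pi * 0.2 * t) + 0.02 * np.random.randn(t.size)
print(lead_quality_flags(lead, fs))
```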

  17. Olfaction and environment: Tsimane' of Bolivian rainforest have lower threshold of odor detection than industrialized German people.

    Directory of Open Access Journals (Sweden)

    Agnieszka Sorokowska

    Full Text Available Olfactory sensitivity varies between individuals. However, data regarding cross-cultural and inter-group differences are scarce. We compared the thresholds of odor detection of the traditional society of Tsimane' (native Amazonians of the Bolivian rainforest; n = 151) and people living in Dresden (Germany; n = 286) using the "Sniffin' Sticks" threshold subtest. Tsimane' detected n-butanol at significantly lower concentrations than the German subjects. The distribution of thresholds of the Tsimane' was very specific, with 25% of Tsimane' obtaining better results in the olfactory test than any member of the German group. These data suggest that differences in olfactory sensitivity seem to be especially salient between industrialized and non-industrialized populations inhabiting different environmental conditions. We hypothesize that the possible sources of such differences are: (i) the impact of pollution, which impairs the olfactory abilities of people from industrialized countries; (ii) better training of olfaction because of the higher importance of smell in traditional populations; (iii) environmental pressures shaping olfactory abilities in these populations.

  18. LOWERING ICECUBE'S ENERGY THRESHOLD FOR POINT SOURCE SEARCHES IN THE SOUTHERN SKY

    Energy Technology Data Exchange (ETDEWEB)

    Aartsen, M. G. [Department of Physics, University of Adelaide, Adelaide, 5005 (Australia); Abraham, K. [Physik-department, Technische Universität München, D-85748 Garching (Germany); Ackermann, M. [DESY, D-15735 Zeuthen (Germany); Adams, J. [Department of Physics and Astronomy, University of Canterbury, Private Bag 4800, Christchurch (New Zealand); Aguilar, J. A.; Ansseau, I. [Université Libre de Bruxelles, Science Faculty CP230, B-1050 Brussels (Belgium); Ahlers, M. [Department of Physics and Wisconsin IceCube Particle Astrophysics Center, University of Wisconsin, Madison, WI 53706 (United States); Ahrens, M. [Oskar Klein Centre and Department of Physics, Stockholm University, SE-10691 Stockholm (Sweden); Altmann, D.; Anton, G. [Erlangen Centre for Astroparticle Physics, Friedrich-Alexander-Universität Erlangen-Nürnberg, D-91058 Erlangen (Germany); Andeen, K. [Department of Physics, Marquette University, Milwaukee, WI, 53201 (United States); Anderson, T.; Arlen, T. C. [Department of Physics, Pennsylvania State University, University Park, PA 16802 (United States); Archinger, M.; Baum, V. [Institute of Physics, University of Mainz, Staudinger Weg 7, D-55099 Mainz (Germany); Arguelles, C. [Department of Physics, Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Auffenberg, J. [III. Physikalisches Institut, RWTH Aachen University, D-52056 Aachen (Germany); Bai, X. [Physics Department, South Dakota School of Mines and Technology, Rapid City, SD 57701 (United States); Barwick, S. W. [Department of Physics and Astronomy, University of California, Irvine, CA 92697 (United States); Bay, R., E-mail: jacob.feintzeig@gmail.com, E-mail: naoko@icecube.wisc.edu [Department of Physics, University of California, Berkeley, CA 94720 (United States); Collaboration: IceCube Collaboration; and others

    2016-06-20

    Observation of a point source of astrophysical neutrinos would be a “smoking gun” signature of a cosmic-ray accelerator. While IceCube has recently discovered a diffuse flux of astrophysical neutrinos, no localized point source has been observed. Previous IceCube searches for point sources in the southern sky were restricted by either an energy threshold above a few hundred TeV or poor neutrino angular resolution. Here we present a search for southern sky point sources with greatly improved sensitivities to neutrinos with energies below 100 TeV. By selecting charged-current ν_μ events interacting inside the detector, we reduce the atmospheric background while retaining efficiency for astrophysical neutrino-induced events reconstructed with sub-degree angular resolution. The new event sample covers three years of detector data and leads to a factor of 10 improvement in sensitivity to point sources emitting below 100 TeV in the southern sky. No statistically significant evidence of point sources was found, and upper limits are set on neutrino emission from individual sources. A posteriori analysis of the highest-energy (∼100 TeV) starting event in the sample found that this event alone represents a 2.8 σ deviation from the hypothesis that the data consists only of atmospheric background.

  19. Oxytocin administration selectively improves olfactory detection thresholds for lyral in patients with schizophrenia.

    Science.gov (United States)

    Woolley, J D; Lam, O; Chuang, B; Ford, J M; Mathalon, D H; Vinogradov, S

    2015-03-01

    Olfaction plays an important role in mammalian social behavior. Olfactory deficits are common in schizophrenia and correlate with negative symptoms and low social drive. Despite their prominence and possible clinical relevance, little is understood about the pathological mechanisms underlying olfactory deficits in schizophrenia and there are currently no effective treatments for these deficits. The prosocial neuropeptide oxytocin may affect the olfactory system when administered intranasally to humans and there is growing interest in its therapeutic potential in schizophrenia. To examine this model, we administered 40 IU of oxytocin and placebo intranasally to 31 patients with a schizophrenia spectrum illness and 34 age-matched healthy control participants in a randomized, double-blind, placebo-controlled, cross-over study. On each test day, participants completed an olfactory detection threshold test for two different odors: (1) lyral, a synthetic fragrance compound for which patients with schizophrenia have specific olfactory detection threshold deficits, possibly related to decreased cyclic adenosine 3',5'-monophosphate (cAMP) signaling; and (2) anise, a compound for which olfactory detection thresholds change with menstrual cycle phase in women. On the placebo test day, patients with schizophrenia did not significantly differ from healthy controls in detection of either odor. We found that oxytocin administration significantly and selectively improved olfactory detection thresholds for lyral but not for anise in patients with schizophrenia. In contrast, oxytocin had no effect on detection of either odor in healthy controls. Our data indicate that oxytocin administration may ameliorate olfactory deficits in schizophrenia and suggest that the effects of intranasal oxytocin may extend to influencing the olfactory system. Given that oxytocin has been found to increase cAMP signaling in vitro, a possible mechanism for these effects is discussed. Published by Elsevier Ltd.

  20. Self-Tuning Threshold Method for Real-Time Gait Phase Detection Based on Ground Contact Forces Using FSRs

    Directory of Open Access Journals (Sweden)

    Jing Tang

    2018-02-01

    Full Text Available This paper presents a novel methodology for detecting the gait phase of human walking on level ground. The previous threshold method (TM) sets a threshold to divide the ground contact forces (GCFs) into on-ground and off-ground states. However, previous methods for gait phase detection demonstrate no adaptability to different people and different walking speeds. Therefore, this paper presents a self-tuning triple threshold algorithm (STTTA) that calculates adjustable thresholds to adapt to human walking. Two force sensitive resistors (FSRs) were placed on the ball and heel to measure GCFs. Three thresholds (i.e., high-threshold, middle-threshold and low-threshold) were used to search out the maximum and minimum GCFs for the self-adjustment of the thresholds. The high-threshold was the main threshold used to divide the GCFs into on-ground and off-ground statuses. Then, the gait phases were obtained through the gait phase detection algorithm (GPDA), which provides the rules that determine the calculations for the STTTA. Finally, the STTTA's reliability is determined by comparing its results with those of the Mariani method, referenced as the timing analysis module (TAM), and the Lopez–Meyer method. Experimental results show that the proposed method can be used to detect gait phases in real time and achieves high reliability compared with the previous methods in the literature. In addition, the proposed method exhibits strong adaptability to different wearers walking at different walking speeds.
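    A compressed sketch of the self-tuning idea is given below: the decision threshold is re-derived from the running maximum and minimum of the summed FSR forces, and the main (high) threshold splits on-ground from off-ground samples. The fractions, decay factors and function name are hypothetical, and the published middle/low thresholds used to steer the extrema search are not reproduced.

```python
import numpy as np

def gait_phases(gcf, high_frac=0.5):
    """Classify each sample of a summed ground-contact-force trace as
    on-ground (True) or off-ground (False).

    The threshold is re-derived from a running maximum and minimum of
    the force, so it adapts to the wearer and to the walking speed.
    Only the main (high) threshold of the STTTA is sketched here and
    the 0.5 fraction is illustrative."""
    gcf = np.asarray(gcf, dtype=float)
    on_ground = np.zeros(gcf.size, dtype=bool)
    f_min = f_max = gcf[0]
    for i, f in enumerate(gcf):
        # Slowly forget old extrema so the threshold keeps self-tuning.
        f_max = max(f, 0.999 * f_max)
        f_min = min(f, f_min + 0.001 * max(f_max - f_min, 0.0))
        high = f_min + high_frac * max(f_max - f_min, 1e-6)
        on_ground[i] = f > high
    return on_ground

# Usage: simulated stance/swing force from ball + heel FSRs (newtons).
t = np.linspace(0, 4, 400)
force = np.clip(np.sin(2 * np.pi * t), 0, None) * 600
print("on-ground fraction:", round(gait_phases(force).mean(), 2))
```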

  1. Thresholds for human detection of patient setup errors in digitally reconstructed portal images of prostate fields

    International Nuclear Information System (INIS)

    Phillips, Brooke L.; Jiroutek, Michael R.; Tracton, Gregg; Elfervig, Michelle; Muller, Keith E.; Chaney, Edward L.

    2002-01-01

    Purpose: Computer-assisted methods to analyze electronic portal images for the presence of treatment setup errors should be studied in controlled experiments before use in the clinical setting. Validation experiments using images that contain known errors usually report the smallest errors that can be detected by the image analysis algorithm. This paper offers human error-detection thresholds as one benchmark for evaluating the smallest errors detected by algorithms. Unfortunately, reliable data are lacking describing human performance. The most rigorous benchmarks for human performance are obtained under conditions that favor error detection. To establish such benchmarks, controlled observer studies were carried out to determine the thresholds of detectability for in-plane and out-of-plane translation and rotation setup errors introduced into digitally reconstructed portal radiographs (DRPRs) of prostate fields. Methods and Materials: Seventeen observers comprising radiation oncologists, radiation oncology residents, physicists, and therapy students participated in a two-alternative forced choice experiment involving 378 DRPRs computed using the National Library of Medicine Visible Human data sets. An observer viewed three images at a time displayed on adjacent computer monitors. Each image triplet included a reference digitally reconstructed radiograph displayed on the central monitor and two DRPRs displayed on the flanking monitors. One DRPR was error free. The other DRPR contained a known in-plane or out-of-plane error in the placement of the treatment field over a target region in the pelvis. The range for each type of error was determined from pilot observer studies based on a Probit model for error detection. The smallest errors approached the limit of human visual capability. The observer was told what kind of error was introduced, and was asked to choose the DRPR that contained the error. Observer decisions were recorded and analyzed using repeated

  2. Detecting hidden sources-STUK/HUT team

    Energy Technology Data Exchange (ETDEWEB)

    Nikkinen, M.; Aarnio, P. [Helsinki Univ. of Technology, Espoo (Finland); Honkamaa, T.; Tiilikainen, H. [Finnish Centre for Radiation and Nuclear Safety, Helsinki (Finland)

    1997-12-31

    The task of the team was to locate and identify hidden sources in a specified area in the Padasjoki Auttoinen village. The team used an AB-420 helicopter of the Finnish Frontier Guard and had two measuring systems: an HPGe system (relative efficiency 18%) and a 5'' × 5'' NaI system. The team found two sources in real time and two additional sources after 24 h of analysis. After the locations and characteristics of the sources were announced, it turned out that altogether six sources could have been found using the measured data; the total number of sources was ten. The NaI detector was good at detecting and locating the sources, and the HPGe was most useful for identification and calculation of the activity estimates. The following developments should be made: 1) larger detectors are needed; 2) the software has to be improved (this has been done after the exercise); and 3) the navigation must be based on DGPS, since visual navigation easily leaves gaps between the flight lines and some sources may not be detected. (au).

  3. Detecting hidden sources-STUK/HUT team

    Energy Technology Data Exchange (ETDEWEB)

    Nikkinen, M; Aarnio, P [Helsinki Univ. of Technology, Espoo (Finland); Honkamaa, T; Tiilikainen, H [Finnish Centre for Radiation and Nuclear Safety, Helsinki (Finland)

    1998-12-31

    The task of the team was to locate and identify hidden sources in a specified area in the Padasjoki Auttoinen village. The team used an AB-420 helicopter of the Finnish Frontier Guard and had two measuring systems: an HPGe system (relative efficiency 18%) and a 5'' × 5'' NaI system. The team found two sources in real time and two additional sources after 24 h of analysis. After the locations and characteristics of the sources were announced, it turned out that altogether six sources could have been found using the measured data; the total number of sources was ten. The NaI detector was good at detecting and locating the sources, and the HPGe was most useful for identification and calculation of the activity estimates. The following developments should be made: 1) larger detectors are needed; 2) the software has to be improved (this has been done after the exercise); and 3) the navigation must be based on DGPS, since visual navigation easily leaves gaps between the flight lines and some sources may not be detected. (au).

  4. Characterizing multi-photon quantum interference with practical light sources and threshold single-photon detectors

    Science.gov (United States)

    Navarrete, Álvaro; Wang, Wenyuan; Xu, Feihu; Curty, Marcos

    2018-04-01

    The experimental characterization of multi-photon quantum interference effects in optical networks is essential in many applications of photonic quantum technologies, which include quantum computing and quantum communication as two prominent examples. However, such characterization often requires technologies which are beyond our current experimental capabilities, and today's methods suffer from errors due to the use of imperfect sources and photodetectors. In this paper, we introduce a simple experimental technique to characterize multi-photon quantum interference by means of practical laser sources and threshold single-photon detectors. Our technique is based on well-known methods in quantum cryptography which use decoy settings to tightly estimate the statistics provided by perfect devices. As an illustration of its practicality, we use this technique to obtain a tight estimation of both the generalized Hong‑Ou‑Mandel dip in a beamsplitter with six input photons and the three-photon coincidence probability at the output of a tritter.

  5. Acoustical source reconstruction from non-synchronous sequential measurements by Fast Iterative Shrinkage Thresholding Algorithm

    Science.gov (United States)

    Yu, Liang; Antoni, Jerome; Leclere, Quentin; Jiang, Weikang

    2017-11-01

    Acoustical source reconstruction is a typical inverse problem, whose minimum frequency of reconstruction hinges on the size of the array and whose maximum frequency depends on the spacing between the microphones. To enlarge the frequency range of reconstruction and reduce the cost of the acquisition system, Cyclic Projection (CP), a method of sequential measurements without references, was recently investigated (JSV, 2016, 372:31-49). In this paper, the Propagation-based Fast Iterative Shrinkage Thresholding Algorithm (Propagation-FISTA) is introduced, which improves CP in two aspects: (1) the number of acoustic sources is no longer needed and the only assumption made is that of a "weakly sparse" eigenvalue spectrum; (2) the construction of the spatial basis is much easier and adapts to practical scenarios of acoustical measurements, benefiting from the introduction of a propagation-based spatial basis. The proposed Propagation-FISTA is first investigated with different simulations and experimental setups and is then illustrated with an industrial case.
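    The propagation-based basis construction is specific to the paper, but the FISTA core it relies on is standard; a generic FISTA for the l1-regularized least-squares problem min_q 0.5*||p - G q||^2 + lam*||q||_1 is sketched below, where G would hold the assumed propagation basis (here just a random matrix for illustration) and all names are ours, not the authors'.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of the l1 norm (element-wise shrinkage)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista(G, p, lam, n_iter=200):
    """Minimize 0.5*||p - G q||^2 + lam*||q||_1 with FISTA (Beck-Teboulle).

    G : (m, n) propagation/measurement matrix
    p : (m,)  measured pressures
    """
    L = np.linalg.norm(G, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(G.shape[1])
    z = x.copy()
    t = 1.0
    for _ in range(n_iter):
        x_prev = x
        x = soft_threshold(z - G.T @ (G @ z - p) / L, lam / L)   # gradient + prox
        t_prev, t = t, 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))  # momentum update
        z = x + ((t_prev - 1.0) / t) * (x - x_prev)
    return x

# Usage: recover a sparse source vector from noisy measurements.
rng = np.random.default_rng(1)
G = rng.standard_normal((80, 200))
q_true = np.zeros(200); q_true[[10, 50, 120]] = [1.0, -2.0, 1.5]
p = G @ q_true + 0.01 * rng.standard_normal(80)
q_hat = fista(G, p, lam=0.5)
print("recovered support:", np.flatnonzero(np.abs(q_hat) > 0.2))
```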

  6. Endogenous attention signals evoked by threshold contrast detection in human superior colliculus.

    Science.gov (United States)

    Katyal, Sucharit; Ress, David

    2014-01-15

    Human superior colliculus (SC) responds in a retinotopically selective manner when attention is deployed on a high-contrast visual stimulus using a discrimination task. To further elucidate the role of SC in endogenous visual attention, high-resolution fMRI was used to demonstrate that SC also exhibits a retinotopically selective response for covert attention in the absence of significant visual stimulation using a threshold-contrast detection task. SC neurons have a laminar organization according to their function, with visually responsive neurons present in the superficial layers and visuomotor neurons in the intermediate layers. The results show that the response evoked by the threshold-contrast detection task is significantly deeper than the response evoked by the high-contrast speed discrimination task, reflecting a functional dissociation of the attentional enhancement of visuomotor and visual neurons, respectively. Such a functional dissociation of attention within SC laminae provides a subcortical basis for the oculomotor theory of attention.

  7. Automatic video shot boundary detection using k-means clustering and improved adaptive dual threshold comparison

    Science.gov (United States)

    Sa, Qila; Wang, Zhihui

    2018-03-01

    At present, content-based video retrieval (CBVR) is the most mainstream video retrieval method, using the features of the video itself to perform automatic identification and retrieval. This method involves a key technology, i.e., shot segmentation. In this paper, a method for automatic video shot boundary detection with K-means clustering and improved adaptive dual threshold comparison is proposed. First, the visual features of every frame are extracted and divided into two categories using the K-means clustering algorithm, namely, one with significant change and one with no significant change. Then, on the classification results, the improved adaptive dual threshold comparison method is used to determine the abrupt as well as gradual shot boundaries. Finally, an automatic video shot boundary detection system is achieved.
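    The twin-comparison logic behind adaptive dual thresholding is easy to state in code; the sketch below works on a pre-computed sequence of inter-frame difference scores (for instance histogram distances) and derives both thresholds from their mean and standard deviation, which is one common adaptive choice rather than the exact rule of the paper, and the multipliers are invented.

```python
import numpy as np

def shot_boundaries(diffs, k_high=3.0, k_low=1.0):
    """Detect abrupt cuts and gradual transitions from inter-frame
    difference scores using an adaptive dual threshold (twin comparison).

    Returns (cuts, graduals): frame indices and (start, end) pairs.
    The k_high/k_low multipliers are illustrative."""
    diffs = np.asarray(diffs, dtype=float)
    t_high = diffs.mean() + k_high * diffs.std()
    t_low = diffs.mean() + k_low * diffs.std()

    cuts, graduals = [], []
    start, acc = None, 0.0
    for i, d in enumerate(diffs):
        if d >= t_high:                      # abrupt cut
            cuts.append(i)
            start, acc = None, 0.0
        elif d >= t_low:                     # inside a possible gradual transition
            if start is None:
                start = i
            acc += d
        else:
            if start is not None and acc >= t_high:
                graduals.append((start, i - 1))
            start, acc = None, 0.0
    if start is not None and acc >= t_high:  # transition running to the last frame
        graduals.append((start, len(diffs) - 1))
    return cuts, graduals

# Usage: quiet scene, one hard cut at frame 40, one dissolve over 70-75.
d = np.full(100, 0.05); d[40] = 0.9; d[70:76] = 0.25
print(shot_boundaries(d))
```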

  8. Is heat pain detection threshold associated with the area of secondary hyperalgesia following brief thermal sensitization?

    DEFF Research Database (Denmark)

    Hansen, Morten Sejer; Wetterslev, Jørn; Pipper, Christian Bressen

    2016-01-01

    role in the development of secondary hyperalgesia; however, a possible association of secondary hyperalgesia following brief thermal sensitization with other heat pain models remains unknown. Our aim with this study is to investigate how closely the heat pain detection threshold is associated with the size of the area of secondary hyperalgesia induced by the clinical heat pain model of brief thermal sensitization. METHODS AND DESIGN: We aim to include 120 healthy participants. The participants will be tested on two separate study days with the following procedures: i) brief thermal sensitization, ii) heat pain detection threshold and iii) pain during thermal stimulation. Additionally, the participants will be tested with the Pain Catastrophizing Scale and the Hospital Anxiety and Depression Scale questionnaires. We conducted statistical simulations based on data from our previous study to estimate

  9. Detection of OH radicals from IRAS sources

    International Nuclear Information System (INIS)

    Lewis, B.M.; Eder, J.; Terzian, Y.

    1985-01-01

    An efficient method for detecting new OH/infrared stars is to begin with IRAS source positions, selected for appropriate infrared colours, and then use radio-line observations to confirm the OH properties. The authors demonstrate the validity of this approach here, using the Arecibo 305 m radio telescope for 1,612 MHz line observations of sources in IRAS Circulars 8 and 9; the present observations identify 21 new OH/infrared stars. The new sources have weaker 1,612 MHz fluxes, bluer (60-25) μm colours and a smaller mean separation between the principal emission peaks than previous samples. (author)

  10. A comparison of signal detection theory to the objective threshold/strategic model of unconscious perception.

    Science.gov (United States)

    Haase, Steven J; Fisk, Gary D

    2011-08-01

    A key problem in unconscious perception research is ruling out the possibility that weak conscious awareness of stimuli might explain the results. In the present study, signal detection theory was compared with the objective threshold/strategic model as explanations of results for detection and identification sensitivity in a commonly used unconscious perception task. In the task, 64 undergraduate participants detected and identified one of four briefly displayed, visually masked letters. Identification was significantly above baseline (i.e., proportion correct > .25) at the highest detection confidence rating. This result is most consistent with signal detection theory's continuum of sensory states and serves as a possible index of conscious perception. However, there was limited support for the other model in the form of a predicted "looker's inhibition" effect, which produced identification performance that was significantly below baseline. One additional result, an interaction between the target stimulus and type of mask, raised concerns for the generality of unconscious perception effects.

  11. Adaptive thresholding and dynamic windowing method for automatic centroid detection of digital Shack-Hartmann wavefront sensor

    International Nuclear Information System (INIS)

    Yin Xiaoming; Li Xiang; Zhao Liping; Fang Zhongping

    2009-01-01

    A Shack-Hartmann wavefront sensor (SHWS) splits the incident wavefront into many subsections and transfers the distorted wavefront detection into the centroid measurement. The accuracy of the centroid measurement determines the accuracy of the SHWS. Many methods have been presented to improve the accuracy of the wavefront centroid measurement. However, most of these methods are discussed from the point of view of optics, based on the assumption that the spot intensity of the SHWS has a Gaussian distribution, which is not applicable to the digital SHWS. In this paper, we present a centroid measurement algorithm based on adaptive thresholding and a dynamic windowing method, utilizing image processing techniques for practical application of the digital SHWS in surface profile measurement. The method can detect the centroid of each focal spot precisely and robustly by eliminating the influence of various noise sources, such as diffraction of the digital SHWS, unevenness and instability of the light source, as well as deviation between the centroid of the focal spot and the center of the detection area. The experimental results demonstrate that the algorithm has better precision, repeatability, and stability compared with other commonly used centroid methods, such as the statistical averaging, thresholding, and windowing algorithms.
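    To make the two ingredients concrete, the sketch below thresholds a single spot image adaptively (threshold = image mean plus a multiple of its standard deviation, one simple choice) and then computes the intensity-weighted centroid inside a window grown around the brightest pixel; neither the threshold rule, the window rule nor the function name is taken from the paper.

```python
import numpy as np

def spot_centroid(img, k_sigma=3.0, half_window=8):
    """Centroid of one Shack-Hartmann focal spot.

    img         : 2-D array holding a single sub-aperture image
    k_sigma     : adaptive threshold = mean + k_sigma * std of the image
    half_window : half-size of the dynamic window around the peak pixel
    Both parameters are illustrative choices, not the published ones."""
    img = img.astype(float)
    thr = img.mean() + k_sigma * img.std()      # adaptive threshold

    # Dynamic window centred on the brightest pixel.
    py, px = np.unravel_index(np.argmax(img), img.shape)
    y0, y1 = max(py - half_window, 0), min(py + half_window + 1, img.shape[0])
    x0, x1 = max(px - half_window, 0), min(px + half_window + 1, img.shape[1])

    win = img[y0:y1, x0:x1] - thr
    win[win < 0] = 0.0                          # suppress background/noise
    if win.sum() == 0:
        return float(py), float(px)             # fall back to the peak pixel

    ys, xs = np.mgrid[y0:y1, x0:x1]
    cy = (ys * win).sum() / win.sum()
    cx = (xs * win).sum() / win.sum()
    return cy, cx

# Usage: noisy Gaussian spot centred at (21.3, 34.7).
yy, xx = np.mgrid[0:48, 0:64]
spot = np.exp(-((yy - 21.3) ** 2 + (xx - 34.7) ** 2) / 8.0)
spot += 0.02 * np.random.default_rng(2).standard_normal(spot.shape)
print([round(c, 2) for c in spot_centroid(spot)])
```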

  12. Fast Image Edge Detection based on Faber Schauder Wavelet and Otsu Threshold

    Directory of Open Access Journals (Sweden)

    Assma Azeroual

    2017-12-01

    Full Text Available Edge detection is a critical stage in many computer vision systems, such as image segmentation and object detection. As it is difficult to detect image edges with precision and with low complexity, it is appropriate to find new methods for edge detection. In this paper, we take advantage of the Faber Schauder Wavelet (FSW) and the Otsu threshold to detect edges in a multi-scale way with low complexity, since the extrema coefficients of this wavelet are located on edge points and involve only arithmetic operations. First, the image is smoothed using a bilateral filter depending on noise estimation. Second, the FSW extrema coefficients are selected based on the Otsu threshold. Finally, the edge points are linked using a predictive edge linking algorithm to obtain the image edges. The effectiveness of the proposed method is supported by the experimental results, which show that our method is faster than many competing state-of-the-art approaches and can be used in real-time applications.
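    Since the method leans on Otsu's criterion to decide which FSW extrema count as edge points, a self-contained Otsu computation over a 1-D array of coefficient magnitudes is sketched below; treating the wavelet extrema as a ready-made input array, rather than computing a full FSW transform, is our simplification.

```python
import numpy as np

def otsu_threshold(values, n_bins=256):
    """Return the threshold that maximizes between-class variance
    (Otsu's method) for a 1-D array of non-negative values."""
    hist, edges = np.histogram(values, bins=n_bins)
    hist = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])

    best_thr, best_var = centers[0], -1.0
    for i in range(1, n_bins):
        w0, w1 = hist[:i].sum(), hist[i:].sum()      # class weights
        if w0 == 0 or w1 == 0:
            continue
        m0 = (hist[:i] * centers[:i]).sum() / w0     # class means
        m1 = (hist[i:] * centers[i:]).sum() / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_thr = var_between, edges[i]
    return best_thr

# Usage: magnitudes from a weak background and a strong "edge" population.
rng = np.random.default_rng(3)
mags = np.concatenate([rng.exponential(0.1, 5000), 1.0 + rng.exponential(0.3, 500)])
thr = otsu_threshold(mags)
print("Otsu threshold:", round(thr, 3), "edge coefficients:", int((mags > thr).sum()))
```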

  13. Detecting modulated signals in modulated noise: (II) neural thresholds in the songbird forebrain.

    Science.gov (United States)

    Bee, Mark A; Buschermöhle, Michael; Klump, Georg M

    2007-10-01

    Sounds in the real world fluctuate in amplitude. The vertebrate auditory system exploits patterns of amplitude fluctuations to improve signal detection in noise. One experimental paradigm demonstrating these general effects has been used in psychophysical studies of 'comodulation detection difference' (CDD). The CDD effect refers to the fact that thresholds for detecting a modulated, narrowband noise signal are lower when the envelopes of flanking bands of modulated noise are comodulated with each other, but fluctuate independently of the signal compared with conditions in which the envelopes of the signal and flanking bands are all comodulated. Here, we report results from a study of the neural correlates of CDD in European starlings (Sturnus vulgaris). We manipulated: (i) the envelope correlations between a narrowband noise signal and a masker comprised of six flanking bands of noise; (ii) the signal onset delay relative to masker onset; (iii) signal duration; and (iv) masker spectrum level. Masked detection thresholds were determined from neural responses using signal detection theory. Across conditions, the magnitude of neural CDD ranged between 2 and 8 dB, which is similar to that reported in a companion psychophysical study of starlings [U. Langemann & G.M. Klump (2007) Eur. J. Neurosci., 26, 1969-1978]. We found little evidence to suggest that neural CDD resulted from the across-channel processing of auditory grouping cues related to common envelope fluctuations and synchronous onsets between the signal and flanking bands. We discuss a within-channel model of peripheral processing that explains many of our results.

  14. Pain and sensory detection threshold response to acupuncture is modulated by coping strategy and acupuncture sensation.

    Science.gov (United States)

    Lee, Jeungchan; Napadow, Vitaly; Park, Kyungmo

    2014-09-01

    Acupuncture has been shown to reduce pain, and acupuncture-induced sensation may be important for this analgesia. In addition, cognitive coping strategies can influence sensory perception. However, the role of coping strategy on acupuncture modulation of pain and sensory thresholds, and the association between acupuncture sensation and these modulatory effects, is currently unknown. Electroacupuncture (EA) was applied at acupoints ST36 and GB39 of 61 healthy adults. Different coping conditions were experimentally designed to form an active coping strategy group (AC group), who thought they could control EA stimulation intensity, and a passive coping strategy group (PC group), who did not think they had such control. Importantly, neither group was actually able to control EA stimulus intensity. Quantitative sensory testing was performed before and after EA, and consisted of vibration (VDT), mechanical (MDT), warm (WDT), and cold (CDT) detection thresholds, and pressure (PPT), mechanical (MPT), heat (HPT) and cold (CPT) pain thresholds. Autonomic measures (e.g. skin conductance response, SCR) were also acquired to quantify physiological response to EA under different coping conditions. Subjects also reported the intensity of any acupuncture-induced sensations. Coping strategy was induced with successful blinding in 58% of AC subjects. Compared to PC, AC showed greater SCR to EA. Under AC, EA reduced PPT and CPT. In the AC group, improved pain and sensory thresholds were correlated with acupuncture sensation (VDT change vs. MI: r=0.58, CDT change vs. tingling: r=0.53, CPT change vs. tingling: r=0.55, CPT change vs. dull: r=0.55). However, in the PC group, improved sensory thresholds were negatively correlated with acupuncture sensation (CDT change vs. intensity sensitization: r=-0.52, WDT change vs. fullness: r=-0.57). Our novel approach was able to successfully induce AC and PC strategies to EA stimulation. The interaction between psychological coping strategy and

  15. Magnetotelluric Detection Thresholds as a Function of Leakage Plume Depth, TDS and Volume

    Energy Technology Data Exchange (ETDEWEB)

    Yang, X. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Buscheck, T. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mansoor, K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Carroll, S. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-04-21

    We conducted a synthetic magnetotelluric (MT) data analysis to establish a set of specific thresholds of plume depth, TDS concentration and volume for the detection of brine and CO2 leakage from legacy wells into shallow aquifers, in support of Strategic Monitoring Subtask 4.1 of the US DOE National Risk Assessment Partnership (NRAP Phase II), which is to develop geophysical forward modeling tools. The 900 synthetic MT data sets span 9 plume depths, 10 TDS concentrations and 10 plume volumes. The monitoring protocol consisted of 10 MT stations in a 2×5 grid laid out along the flow direction. We model the MT response in the audio frequency range of 1 Hz to 10 kHz with a 50 Ωm baseline resistivity and a maximum depth of 2000 m. Scatter plots show the MT detection thresholds for each trio of plume depth, TDS concentration and volume. Plumes with a large volume and high TDS located at a shallow depth produce a strong MT signal. We demonstrate that the MT method with surface-based sensors can detect a brine and CO2 plume so long as the plume depth, TDS concentration and volume are above the thresholds. However, the method is unlikely to detect a plume at a depth greater than 1000 m when the change in TDS concentration is smaller than 10%. Simulated aquifer impact data based on the Kimberlina site provide a more realistic view of the leakage plume distribution than the rectangular synthetic plumes in this sensitivity study, and they will be used to estimate MT responses over simulated brine and CO2 plumes and to evaluate leakage detectability. Integration of the simulated aquifer impact data and the MT method into the NRAP DREAM tool may provide an optimized survey configuration for MT data collection. This study presents a viable approach for sensitivity studies of geophysical monitoring methods for leakage detection, and its results enable rapid assessment of leakage detectability.

  16. A Fiber Bragg Grating Interrogation System with Self-Adaption Threshold Peak Detection Algorithm.

    Science.gov (United States)

    Zhang, Weifang; Li, Yingwu; Jin, Bo; Ren, Feifei; Wang, Hongxun; Dai, Wei

    2018-04-08

    A Fiber Bragg Grating (FBG) interrogation system with a self-adaption threshold peak detection algorithm is proposed and experimentally demonstrated in this study. This system is composed of a field programmable gate array (FPGA) and advanced RISC machine (ARM) platform, a tunable Fabry-Perot (F-P) filter and an optical switch. To improve system resolution, the F-P filter was employed. As this filter is non-linear, it causes a shift of the central wavelengths, and the deviation is compensated by parts of the circuit. Time-division multiplexing (TDM) of FBG sensors is achieved by an optical switch, with the system able to multiplex up to 256 FBG sensors. A wavelength scanning speed of 800 Hz can be achieved by the FPGA+ARM platform. In addition, a peak detection algorithm based on a self-adaption threshold is designed and the peak recognition rate is 100%. Experiments at different temperatures were conducted to demonstrate the effectiveness of the system. Four FBG sensors were examined in the thermal chamber without stress. When the temperature changed from 0 °C to 100 °C, the degree of linearity between central wavelengths and temperature was about 0.999, with a temperature sensitivity of 10 pm/°C. The static interrogation precision was able to reach 0.5 pm. Through the comparison of different peak detection algorithms and interrogation approaches, the system was verified to have an optimum comprehensive performance in terms of precision, capacity and speed.
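    The interrogation hardware aside, the self-adaption threshold idea for peak finding can be illustrated on a sampled reflection spectrum: set the threshold from the spectrum's own statistics, keep only samples above it, and take the local maximum of each surviving region. The "mean plus k standard deviations" rule and the function name below are one plausible self-adaptive choice, not necessarily what the paper implements.

```python
import numpy as np

def find_fbg_peaks(wavelengths_nm, intensity, k_sigma=4.0):
    """Locate FBG reflection peaks above a self-adaptive threshold.

    The threshold adapts to each scan (mean + k_sigma * std of the
    intensity trace), so it follows changes in source power without
    manual re-tuning.  k_sigma = 4 is an illustrative setting."""
    intensity = np.asarray(intensity, dtype=float)
    thr = intensity.mean() + k_sigma * intensity.std()

    above = intensity > thr
    peaks = []
    i = 0
    while i < above.size:
        if above[i]:
            j = i
            while j < above.size and above[j]:
                j += 1
            seg = slice(i, j)                       # one contiguous peak region
            peaks.append(wavelengths_nm[seg][np.argmax(intensity[seg])])
            i = j
        else:
            i += 1
    return peaks

# Usage: four simulated FBG peaks on a noisy baseline, 1540-1560 nm scan.
wl = np.linspace(1540, 1560, 4000)
spec = 0.02 * np.random.default_rng(4).standard_normal(wl.size)
for c in (1543.0, 1548.5, 1552.2, 1557.8):
    spec += np.exp(-((wl - c) / 0.05) ** 2)
print([round(p, 2) for p in find_fbg_peaks(wl, spec)])
```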

  17. A Fiber Bragg Grating Interrogation System with Self-Adaption Threshold Peak Detection Algorithm

    Directory of Open Access Journals (Sweden)

    Weifang Zhang

    2018-04-01

    Full Text Available A Fiber Bragg Grating (FBG) interrogation system with a self-adaption threshold peak detection algorithm is proposed and experimentally demonstrated in this study. This system is composed of a field programmable gate array (FPGA) and advanced RISC machine (ARM) platform, a tunable Fabry–Perot (F–P) filter and an optical switch. To improve system resolution, the F–P filter was employed. As this filter is non-linear, it causes a shift of the central wavelengths, and the deviation is compensated by parts of the circuit. Time-division multiplexing (TDM) of FBG sensors is achieved by an optical switch, with the system able to multiplex up to 256 FBG sensors. A wavelength scanning speed of 800 Hz can be achieved by the FPGA+ARM platform. In addition, a peak detection algorithm based on a self-adaption threshold is designed and the peak recognition rate is 100%. Experiments at different temperatures were conducted to demonstrate the effectiveness of the system. Four FBG sensors were examined in the thermal chamber without stress. When the temperature changed from 0 °C to 100 °C, the degree of linearity between central wavelengths and temperature was about 0.999, with a temperature sensitivity of 10 pm/°C. The static interrogation precision was able to reach 0.5 pm. Through the comparison of different peak detection algorithms and interrogation approaches, the system was verified to have an optimum comprehensive performance in terms of precision, capacity and speed.

  18. Perceptual thresholds for detecting modifications applied to the acoustical properties of a violin.

    Science.gov (United States)

    Fritz, Claudia; Cross, Ian; Moore, Brian C J; Woodhouse, Jim

    2007-12-01

    This study is the first step in the psychoacoustic exploration of perceptual differences between the sounds of different violins. A method was used which enabled the same performance to be replayed on different "virtual violins," so that the relationships between acoustical characteristics of violins and perceived qualities could be explored. Recordings of real performances were made using a bridge-mounted force transducer, giving an accurate representation of the signal from the violin string. These were then played through filters corresponding to the admittance curves of different violins. Initially, limits of listener performance in detecting changes in acoustical characteristics were characterized. These consisted of shifts in frequency or increases in amplitude of single modes or frequency bands that have been proposed previously to be significant in the perception of violin sound quality. Thresholds were significantly lower for musically trained than for nontrained subjects but were not significantly affected by the violin used as a baseline. Thresholds for the musicians typically ranged from 3 to 6 dB for amplitude changes and 1.5%-20% for frequency changes. Interpretation of the results using excitation patterns showed that thresholds for the best subjects were quite well predicted by a multichannel model based on optimal processing.

  19. Direction detection thresholds of passive self-motion in artistic gymnasts.

    Science.gov (United States)

    Hartmann, Matthias; Haller, Katia; Moser, Ivan; Hossner, Ernst-Joachim; Mast, Fred W

    2014-04-01

    In this study, we compared direction detection thresholds of passive self-motion in the dark between artistic gymnasts and controls. Twenty-four professional female artistic gymnasts (ranging from 7 to 20 years) and age-matched controls were seated on a motion platform and asked to discriminate the direction of angular (yaw, pitch, roll) and linear (leftward-rightward) motion. Gymnasts showed lower thresholds for the linear leftward-rightward motion. Interestingly, there was no difference for the angular motions. These results show that the outstanding self-motion abilities in artistic gymnasts are not related to an overall higher sensitivity in self-motion perception. With respect to vestibular processing, our results suggest that gymnastic expertise is exclusively linked to superior interpretation of otolith signals when no change in canal signals is present. In addition, thresholds were overall lower for the older (14-20 years) than for the younger (7-13 years) participants, indicating the maturation of vestibular sensitivity from childhood to adolescence.

  20. Motion Detection from Mobile Robots with Fuzzy Threshold Selection in Consecutive 2D Laser Scans

    Directory of Open Access Journals (Sweden)

    María A. Martínez

    2015-01-01

    Full Text Available Motion detection and tracking is a relevant problem for mobile robots during navigation, to avoid collisions in dynamic environments or in applications where service robots interact with humans. This paper presents a simple method to distinguish mobile obstacles from the environment, based on applying fuzzy threshold selection to consecutive two-dimensional (2D) laser scans previously matched with robot odometry. The proposed method has been tested indoors with the Auriga-α mobile robot to estimate the motion of nearby pedestrians.

  1. Experimental evaluation of the detection threshold of uranium in urine samples

    International Nuclear Information System (INIS)

    Ferreyra, M. D.; Suarez Mendez, Sebastian; Tossi, Mirta H.

    1999-01-01

    The routine internal dosimetric tests for nuclear installation workers include the determination of uranium in urine. The analysis is carried out, after chemical treatment, by UV fluorometry, comparing the results with urine blank samples from workers not occupationally exposed to contamination. The fluctuation of the results of the uranium content in the blank samples greatly affects the determinations. The uranium content was determined in 30 blank samples and the results were evaluated by three calculation methods: 1) the procedure recommended by IUPAC; 2) the graphical method; and 3) the error propagation method. The last one has been adopted for the calculation of the detection threshold. (authors)

  2. Critical review and hydrologic application of threshold detection methods for the generalized Pareto (GP) distribution

    Science.gov (United States)

    Mamalakis, Antonios; Langousis, Andreas; Deidda, Roberto

    2016-04-01

    Estimation of extreme rainfall from data constitutes one of the most important issues in statistical hydrology, as it is associated with the design of hydraulic structures and flood water management. To that extent, based on asymptotic arguments from Extreme Excess (EE) theory, several studies have focused on developing new, or improving existing, methods to fit a generalized Pareto (GP) distribution model to rainfall excesses above a properly selected threshold u. The latter is generally determined using various approaches, such as non-parametric methods that are intended to locate the changing point between extreme and non-extreme regions of the data, graphical methods where one studies the dependence of GP distribution parameters (or related metrics) on the threshold level u, and Goodness of Fit (GoF) metrics that, for a certain level of significance, locate the lowest threshold u above which a GP distribution model is applicable. In this work, we review representative methods for GP threshold detection, discuss fundamental differences in their theoretical bases, and apply them to 1714 daily rainfall records from the NOAA-NCDC open-access database, with more than 110 years of data. We find that non-parametric methods intended to locate the changing point between extreme and non-extreme regions of the data are generally not reliable, while methods based on asymptotic properties of the upper distribution tail lead to unrealistically high threshold and shape parameter estimates. The latter is justified by theoretical arguments, and it is especially the case in rainfall applications, where the shape parameter of the GP distribution is low, i.e. on the order of 0.1-0.2. Better performance is demonstrated by graphical methods and GoF metrics that rely on pre-asymptotic properties of the GP distribution. For daily rainfall, we find that GP threshold estimates range between 2 and 12 mm/d with a mean value of 6.5 mm/d, while the existence of quantization in the
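    As a concrete starting point for the graphical (stability-plot) approach the authors favor, the fragment below fits a GP distribution to daily-rainfall excesses above a grid of candidate thresholds with scipy and reports the estimated shape parameter at each level; in a stability plot one looks for the range of u over which the shape estimate stays roughly constant. The data, threshold grid and function name here are synthetic illustrations, not the NOAA-NCDC records or the authors' procedure.

```python
import numpy as np
from scipy import stats

def gp_shape_vs_threshold(daily_rain_mm, thresholds_mm):
    """Fit a generalized Pareto distribution to excesses above each
    candidate threshold u and return (u, shape, n_excesses) triples,
    as needed for a shape-stability plot used in threshold selection."""
    rows = []
    for u in thresholds_mm:
        excess = daily_rain_mm[daily_rain_mm > u] - u
        if excess.size < 50:          # too few excesses for a stable fit
            continue
        shape, _, scale = stats.genpareto.fit(excess, floc=0.0)
        rows.append((u, shape, excess.size))
    return rows

# Usage with a synthetic "daily rainfall" record: many dry/light days,
# gamma-distributed wet-day amounts (an upper tail with shape near 0).
rng = np.random.default_rng(5)
rain = np.where(rng.random(40000) < 0.3, rng.gamma(0.8, 8.0, 40000), 0.0)
for u, xi, n in gp_shape_vs_threshold(rain, thresholds_mm=[2, 4, 6, 8, 10, 12]):
    print(f"u = {u:4.1f} mm/d   shape = {xi:+.3f}   n = {n}")
```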

  3. Effects of aging on vibration detection thresholds at various body regions

    Directory of Open Access Journals (Sweden)

    Walsh Natalie

    2003-02-01

    Full Text Available Abstract Background The ability to detect sinusoidal vibrations on the skin surface is dependent on the activation of two classes of receptors. The density of such receptors varies across the skin surface and is a factor in determining the sensory acuity of each skin area. However, the acuity of many sensory systems is known to deteriorate with advancing age. The aim of this study was to determine if vibrotactile sensibility of several skin surfaces deteriorated equally with advancing age. Methods Vibration detection thresholds for two frequencies of vibration (30 Hz and 200 Hz) were determined using a method of limits protocol, in two groups of healthy adults, one group aged 17 to 27 years and the other aged 55 to 90 years. Sinusoidal vibrations were generated by a computer and delivered to the skin surface via the probe (diameter = 2 mm) of a mechanical vibrator. Four skin sites (palmar surface of the tip of the middle finger, volar surface of the forearm, lateral aspect of the shoulder, cheek just caudal to the zygoma) were tested. Results The fingertip was the most sensitive site for vibrotactile detection at both frequencies in a substantial majority of subjects. The older group of subjects showed significantly higher detection thresholds for both frequencies at all sites, except the fingertip, when compared to young subjects. Conclusion The study confirms the deterioration of vibrotactile acuity at several skin sites previously reported in the literature. However, there appears to be no significant reduction in vibrotactile detection at the fingertips in older subjects. This may reflect the high receptor density of this area, or the functional importance of vibrotactile sensibility of the fingertips, or some combination of both of these factors.

  4. Reduced visual surround suppression in schizophrenia shown by measuring contrast detection thresholds

    Directory of Open Access Journals (Sweden)

    Ignacio eSerrano-Pedraza

    2014-12-01

    Full Text Available Visual perception in schizophrenia is attracting broad interest given the deep knowledge that we have about the visual system in the healthy population. In visual science it is known that the visibility of a grating located in the visual periphery is impaired by the presence of a surrounding grating of the same spatial frequency and orientation. Previous studies have suggested abnormal visual surround suppression in patients with schizophrenia. Given that schizophrenia patients have cortical alterations including hypofunction of NMDA receptors and reduced concentration of the GABA neurotransmitter, which affect lateral inhibitory connections, they should perform better than controls in visual suppression tasks. We tested this hypothesis by measuring contrast detection thresholds using a new stimulus configuration. We tested two groups: 21 schizophrenia patients and 24 healthy subjects. Thresholds were obtained using Bayesian staircases in a 4AFC detection task where the target was a grating within a 3 deg Butterworth window that appeared in one of four possible positions at 5 deg eccentricity. We compared three conditions: (a) a target with no surround (NS), (b) a target on top of a surrounding grating of 20 deg diameter and 25% contrast with the same spatial frequency and orthogonal orientation (OS), and (c) a target on top of a surrounding grating with parallel (same) orientation (PS). Our results show significantly lower thresholds for controls than for patients in the NS and OS conditions. We also found significantly lower suppression ratios PS/NS in patients. Our results support the hypothesis that inhibitory lateral connections in early visual cortex are impaired in schizophrenia patients.

  5. Flood Extent Mapping for Namibia Using Change Detection and Thresholding with SAR

    Science.gov (United States)

    Long, Stephanie; Fatoyinbo, Temilola E.; Policelli, Frederick

    2014-01-01

    A new method for flood detection, change detection and thresholding (CDAT), was used with synthetic aperture radar (SAR) imagery to delineate the extent of flooding for the Chobe floodplain in the Caprivi region of Namibia. This region experiences annual seasonal flooding and has seen a recent renewal of severe flooding after a long dry period in the 1990s. Flooding in this area has caused loss of life and livelihoods for the surrounding communities and has caught the attention of disaster relief agencies. There is a need for flood extent mapping techniques that can be used to process images quickly, providing near real-time flooding information to relief agencies. ENVISAT/ASAR and Radarsat-2 images were acquired for several flooding seasons from February 2008 to March 2013. The CDAT method was used to determine flooding from these images and includes the use of image subtraction, decision-based classification with threshold values, and segmentation of SAR images. The total extent of flooding determined for 2009, 2011 and 2012 was about 542 km2, 720 km2, and 673 km2, respectively. Pixels determined to be flooded in vegetation were typically <0.5% of the entire scene, with the exception of 2009, where the detection of flooding in vegetation was much greater (almost one third of the total flooded area). The time to maximum flooding for the 2013 flood season was determined to be about 27 days. Landsat water classification was used to compare the results from the new CDAT with SAR method; the results show good spatial agreement with Landsat scenes.
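
    The core CDAT step (image subtraction followed by threshold classification) can be sketched as below, assuming two co-registered SAR backscatter scenes in decibels; the threshold values are placeholders, not those calibrated in the study.

```python
import numpy as np

def cdat_flood_mask(flood_db, reference_db, diff_threshold=-3.0, open_water_db=-15.0):
    """Flag flooding where backscatter drops strongly relative to a dry reference
    scene (smooth open water scatters away from the sensor), or where the absolute
    backscatter is very low."""
    difference = flood_db - reference_db           # image subtraction
    change_flood = difference < diff_threshold     # large decrease in backscatter
    open_water = flood_db < open_water_db          # very dark pixels in the flood scene
    return change_flood | open_water

# Hypothetical usage with synthetic 100 x 100 scenes
reference = np.random.normal(-8.0, 1.5, (100, 100))
flood = reference.copy()
flood[40:70, 20:60] -= 7.0                         # simulated inundated patch
print("flooded fraction:", cdat_flood_mask(flood, reference).mean())
```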

  6. A Robust Automated Cataract Detection Algorithm Using Diagnostic Opinion Based Parameter Thresholding for Telemedicine Application

    Directory of Open Access Journals (Sweden)

    Shashwat Pathak

    2016-09-01

    Full Text Available This paper proposes and evaluates an algorithm to automatically detect cataracts from color images of adult human subjects. Currently, methods available for cataract detection are based on the use of either a fundus camera or a Digital Single-Lens Reflex (DSLR) camera; both are very expensive. The main motive behind this work is to develop an inexpensive, robust and convenient algorithm which, in conjunction with suitable devices, will be able to diagnose the presence of cataract from true color images of an eye. An algorithm is proposed for cataract screening based on texture features: uniformity, intensity and standard deviation. These features are first computed and mapped with diagnostic opinion by the eye expert to define the basic threshold of the screening system, and later tested on real subjects in an eye clinic. Finally, a tele-ophthalmology model using our proposed system has been suggested, which confirms the telemedicine application of the proposed system.
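
    The screening rule described above can be sketched as follows: compute uniformity, mean intensity and standard deviation of a grayscale lens region and compare them against expert-derived thresholds. The numeric thresholds and the decision rule below are illustrative placeholders, not the paper's values.

```python
import numpy as np

def cataract_screen(gray_lens, uniformity_thr=0.25, intensity_thr=140.0, std_thr=35.0):
    """Compute three texture features of a grayscale (0-255) lens/pupil region and
    flag the image as suspicious when all three cross their thresholds. A cataractous
    lens tends to appear brighter, less uniform and more variable than a clear one."""
    hist, _ = np.histogram(gray_lens, bins=256, range=(0, 256), density=True)
    uniformity = float(np.sum(hist ** 2))          # energy of the intensity histogram
    intensity = float(gray_lens.mean())
    spread = float(gray_lens.std())
    suspicious = (uniformity < uniformity_thr) and (intensity > intensity_thr) and (spread > std_thr)
    return suspicious, {"uniformity": uniformity, "intensity": intensity, "std": spread}
```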

  7. MOA-2010-BLG-311: A PLANETARY CANDIDATE BELOW THE THRESHOLD OF RELIABLE DETECTION

    International Nuclear Information System (INIS)

    Yee, J. C.; Hung, L.-W.; Gould, A.; Gaudi, B. S.; Bond, I. A.; Allen, W.; Monard, L. A. G.; Albrow, M. D.; Fouqué, P.; Dominik, M.; Tsapras, Y.; Udalski, A.; Zellem, R.; Bos, M.; Christie, G. W.; DePoy, D. L.; Dong, Subo; Drummond, J.; Gorbikov, E.; Han, C.

    2013-01-01

    We analyze MOA-2010-BLG-311, a high magnification (A_max > 600) microlensing event with complete data coverage over the peak, making it very sensitive to planetary signals. We fit this event with both a point lens and a two-body lens model and find that the two-body lens model is a better fit but with only Δχ² ∼ 80. The preferred mass ratio between the lens star and its companion is q = 10^(−3.7±0.1), placing the candidate companion in the planetary regime. Despite the formal significance of the planet, we show that because of systematics in the data the evidence for a planetary companion to the lens is too tenuous to claim a secure detection. When combined with analyses of other high-magnification events, this event helps empirically define the threshold for reliable planet detection in high-magnification events, which remains an open question.

  8. MOA-2010-BLG-311: A PLANETARY CANDIDATE BELOW THE THRESHOLD OF RELIABLE DETECTION

    Energy Technology Data Exchange (ETDEWEB)

    Yee, J. C.; Hung, L.-W.; Gould, A.; Gaudi, B. S. [Department of Astronomy, Ohio State University, 140 West 18th Avenue, Columbus, OH 43210 (United States); Bond, I. A. [Institute for Information and Mathematical Sciences, Massey University, Private Bag 102-904, Auckland 1330 (New Zealand); Allen, W. [Vintage Lane Observatory, Blenheim (New Zealand); Monard, L. A. G. [Bronberg Observatory, Centre for Backyard Astrophysics, Pretoria (South Africa); Albrow, M. D. [Department of Physics and Astronomy, University of Canterbury, Private Bag 4800, Christchurch 8020 (New Zealand); Fouque, P. [IRAP, CNRS, Universite de Toulouse, 14 avenue Edouard Belin, F-31400 Toulouse (France); Dominik, M. [SUPA, University of St. Andrews, School of Physics and Astronomy, North Haugh, St. Andrews, KY16 9SS (United Kingdom); Tsapras, Y. [Las Cumbres Observatory Global Telescope Network, 6740B Cortona Drive, Goleta, CA 93117 (United States); Udalski, A. [Warsaw University Observatory, Al. Ujazdowskie 4, 00-478 Warszawa (Poland); Zellem, R. [Department of Planetary Sciences/LPL, University of Arizona, 1629 East University Boulevard, Tucson, AZ 85721 (United States); Bos, M. [Molehill Astronomical Observatory, North Shore City, Auckland (New Zealand); Christie, G. W. [Auckland Observatory, P.O. Box 24-180, Auckland (New Zealand); DePoy, D. L. [Department of Physics, Texas A and M University, 4242 TAMU, College Station, TX 77843-4242 (United States); Dong, Subo [Institute for Advanced Study, Einstein Drive, Princeton, NJ 08540 (United States); Drummond, J. [Possum Observatory, Patutahi (New Zealand); Gorbikov, E. [School of Physics and Astronomy, Raymond and Beverley Sackler Faculty of Exact Sciences, Tel-Aviv University, Tel Aviv 69978 (Israel); Han, C., E-mail: liweih@astro.ucla.edu, E-mail: rzellem@lpl.arizona.edu, E-mail: tim.natusch@aut.ac.nz [Department of Physics, Chungbuk National University, 410 Seongbong-Rho, Hungduk-Gu, Chongju 371-763 (Korea, Republic of); Collaboration: muFUN Collaboration; MOA Collaboration; OGLE Collaboration; PLANET Collaboration; RoboNet Collaboration; MiNDSTEp Consortium; and others

    2013-05-20

    We analyze MOA-2010-BLG-311, a high magnification (A_max > 600) microlensing event with complete data coverage over the peak, making it very sensitive to planetary signals. We fit this event with both a point lens and a two-body lens model and find that the two-body lens model is a better fit but with only Δχ² ∼ 80. The preferred mass ratio between the lens star and its companion is q = 10^(−3.7±0.1), placing the candidate companion in the planetary regime. Despite the formal significance of the planet, we show that because of systematics in the data the evidence for a planetary companion to the lens is too tenuous to claim a secure detection. When combined with analyses of other high-magnification events, this event helps empirically define the threshold for reliable planet detection in high-magnification events, which remains an open question.

  9. The relationship between intelligence and creativity: New support for the threshold hypothesis by means of empirical breakpoint detection

    Science.gov (United States)

    Jauk, Emanuel; Benedek, Mathias; Dunst, Beate; Neubauer, Aljoscha C.

    2013-01-01

    The relationship between intelligence and creativity has been subject to empirical research for decades. Nevertheless, there is yet no consensus on how these constructs are related. One of the most prominent notions concerning the interplay between intelligence and creativity is the threshold hypothesis, which assumes that above-average intelligence represents a necessary condition for high-level creativity. While earlier research mostly supported the threshold hypothesis, it has come under fire in recent investigations. The threshold hypothesis is commonly investigated by splitting a sample at a given threshold (e.g., at 120 IQ points) and estimating separate correlations for lower and upper IQ ranges. However, there is no compelling reason why the threshold should be fixed at an IQ of 120, and to date, no attempts have been made to detect the threshold empirically. Therefore, this study examined the relationship between intelligence and different indicators of creative potential and of creative achievement by means of segmented regression analysis in a sample of 297 participants. Segmented regression allows for the detection of a threshold in continuous data by means of iterative computational algorithms. We found thresholds only for measures of creative potential but not for creative achievement. For the former the thresholds varied as a function of criteria: When investigating a liberal criterion of ideational originality (i.e., two original ideas), a threshold was detected at around 100 IQ points. In contrast, a threshold of 120 IQ points emerged when the criterion was more demanding (i.e., many original ideas). Moreover, an IQ of around 85 IQ points was found to form the threshold for a purely quantitative measure of creative potential (i.e., ideational fluency). These results confirm the threshold hypothesis for qualitative indicators of creative potential and may explain some of the observed discrepancies in previous research. In addition, we obtained
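
    The breakpoint detection described above can be sketched as a grid-search segmented regression; the estimator, synthetic IQ/creativity data and variable names below are illustrative assumptions, not the study's iterative algorithm.

```python
import numpy as np

def detect_breakpoint(iq, creativity, candidates):
    """For each candidate breakpoint, fit a continuous two-segment linear model
    (intercept, slope below the breakpoint, additional slope above it) and keep
    the breakpoint with the lowest residual sum of squares."""
    best_bp, best_rss = None, np.inf
    for bp in candidates:
        X = np.column_stack([np.ones_like(iq), iq, np.clip(iq - bp, 0, None)])
        coef, residuals, *_ = np.linalg.lstsq(X, creativity, rcond=None)
        rss = float(residuals[0]) if residuals.size else float(np.sum((creativity - X @ coef) ** 2))
        if rss < best_rss:
            best_bp, best_rss = bp, rss
    return best_bp, best_rss

# Hypothetical data with a true breakpoint near IQ 120 (slope flattens above it)
rng = np.random.default_rng(0)
iq = rng.uniform(80, 150, 300)
creativity = 0.08 * np.minimum(iq, 120) + 0.005 * np.clip(iq - 120, 0, None) + rng.normal(0, 0.5, 300)
bp, _ = detect_breakpoint(iq, creativity, candidates=np.arange(90, 140, 1.0))
print("estimated breakpoint:", bp)
```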

  10. Low tube voltage CT for improved detection of pancreatic cancer: detection threshold for small, simulated lesions

    International Nuclear Information System (INIS)

    Holm, Jon; Loizou, Louiza; Albiin, Nils; Kartalis, Nikolaos; Leidner, Bertil; Sundin, Anders

    2012-01-01

    Pancreatic ductal adenocarcinoma is associated with a dismal prognosis. The detection of small pancreatic tumors which are still resectable remains a challenging problem. The aim of this study was to investigate the effect of decreasing the tube voltage from 120 to 80 kV on the detection of pancreatic tumors. Three scanning protocols were used: one using the standard tube voltage (120 kV) and current (160 mA), and two using 80 kV but with different tube currents (500 and 675 mA) to achieve equivalent dose (15 mGy) and noise (15 HU) as that of the standard protocol. Tumors were simulated into collected CT phantom images. The attenuation in normal parenchyma at 120 kV was set at 130 HU, as measured previously in clinical examinations, and the tumor attenuation was assumed to differ by 20 HU and was set at 110 HU. By scanning and measuring iodine solutions of different concentrations, the corresponding tumor and parenchyma attenuation at 80 kV was found to be 185 and 219 HU, respectively. To objectively evaluate the differences between the three protocols, a multi-reader multi-case receiver operating characteristic study was conducted, using three readers and 100 cases, each containing 0–3 lesions. The highest reader-averaged figure-of-merit (FOM) was achieved for 80 kV and 675 mA (FOM = 0.850), and the lowest for 120 kV (FOM = 0.709). There was a significant difference between the three protocols (p < 0.0001) when making an analysis of variance (ANOVA). Post-hoc analysis (Student's t-test) shows that there was a significant difference between 120 and 80 kV, but not between the two levels of tube currents at 80 kV. We conclude that when decreasing the tube voltage there is a significant improvement in tumor conspicuity

  11. Flood extent mapping for Namibia using change detection and thresholding with SAR

    International Nuclear Information System (INIS)

    Long, Stephanie; Fatoyinbo, Temilola E; Policelli, Frederick

    2014-01-01

    A new method for flood detection, change detection and thresholding (CDAT), was used with synthetic aperture radar (SAR) imagery to delineate the extent of flooding for the Chobe floodplain in the Caprivi region of Namibia. This region experiences annual seasonal flooding and has seen a recent renewal of severe flooding after a long dry period in the 1990s. Flooding in this area has caused loss of life and livelihoods for the surrounding communities and has caught the attention of disaster relief agencies. There is a need for flood extent mapping techniques that can be used to process images quickly, providing near real-time flooding information to relief agencies. ENVISAT/ASAR and Radarsat-2 images were acquired for several flooding seasons from February 2008 to March 2013. The CDAT method was used to determine flooding from these images and includes the use of image subtraction, decision-based classification with threshold values, and segmentation of SAR images. The total extent of flooding determined for 2009, 2011 and 2012 was about 542 km2, 720 km2, and 673 km2, respectively. Pixels determined to be flooded in vegetation were typically <0.5% of the entire scene, with the exception of 2009 where the detection of flooding in vegetation was much greater (almost one third of the total flooded area). The time to maximum flooding for the 2013 flood season was determined to be about 27 days. Landsat water classification was used to compare the results from the new CDAT with SAR method; the results show good spatial agreement with Landsat scenes. (paper)

  12. Mouse epileptic seizure detection with multiple EEG features and simple thresholding technique

    Science.gov (United States)

    Tieng, Quang M.; Anbazhagan, Ashwin; Chen, Min; Reutens, David C.

    2017-12-01

    Objective. Epilepsy is a common neurological disorder characterized by recurrent, unprovoked seizures. The search for new treatments for seizures and epilepsy relies upon studies in animal models of epilepsy. To capture data on seizures, many applications require prolonged electroencephalography (EEG) with recordings that generate voluminous data. The desire for efficient evaluation of these recordings motivates the development of automated seizure detection algorithms. Approach. A new seizure detection method is proposed, based on multiple features and a simple thresholding technique. The features are derived from chaos theory, information theory and the power spectrum of EEG recordings and optimally exploit both linear and nonlinear characteristics of EEG data. Main result. The proposed method was tested with real EEG data from an experimental mouse model of epilepsy and distinguished seizures from other patterns with high sensitivity and specificity. Significance. The proposed approach introduces two new features: negative logarithm of adaptive correlation integral and power spectral coherence ratio. The combination of these new features with two previously described features, entropy and phase coherence, improved seizure detection accuracy significantly. Negative logarithm of adaptive correlation integral can also be used to compute the duration of automatically detected seizures.
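
    The feature-plus-threshold structure of such a detector can be sketched as below. The paper's specific features (negative logarithm of the adaptive correlation integral, phase coherence) are not reproduced here; spectral entropy and line length are used as stand-in features, and the threshold values are placeholders.

```python
import numpy as np
from scipy.signal import welch

def eeg_features(segment, fs=256.0):
    """Two illustrative per-segment features: spectral entropy (tends to drop during
    rhythmic seizure activity) and line length (rises with large, fast deflections)."""
    _, psd = welch(segment, fs=fs, nperseg=min(len(segment), 512))
    p = psd / psd.sum()
    spectral_entropy = float(-np.sum(p * np.log(p + 1e-12)))
    line_length = float(np.sum(np.abs(np.diff(segment))))
    return spectral_entropy, line_length

def is_seizure(segment, fs=256.0, entropy_thr=3.0, line_thr=500.0):
    # Simple AND-combination of per-feature thresholds (values hypothetical).
    entropy, line_length = eeg_features(segment, fs)
    return entropy < entropy_thr and line_length > line_thr
```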

  13. Threshold disorder as a source of diverse and complex behavior in random nets

    DEFF Research Database (Denmark)

    McGuire, P.C.; Bohr, Henrik; Clark, J.W.

    2002-01-01

    We study the diversity of complex spatio-temporal patterns in the behavior of random synchronous asymmetric neural networks (RSANNs). Special attention is given to the impact of disordered threshold values on limit-cycle diversity and limit-cycle complexity in RSANNs which have 'normal' thresholds...... systems. In order to reach beyond this seemingly disabling 'stable and small' aspect of the limit-cycle repertoire of RSANNs, we have found that if an RSANN has threshold disorder above a critical level, then there is a rapid increase of the size of the repertoire of patterns. The repertoire size...... initially follows a power-law function of the magnitude of the threshold disorder. As the disorder increases further, the limit-cycle patterns themselves become simpler until at a second critical level most of the limit cycles become simple fixed points. Nonetheless, for moderate changes in the threshold...

  14. Modelling single shot damage thresholds of multilayer optics for high-intensity short-wavelength radiation sources

    NARCIS (Netherlands)

    Loch, R.A.; Sobierajski, R.; Louis, Eric; Bosgra, J.; Bosgra, J.; Bijkerk, Frederik

    2012-01-01

    The single shot damage thresholds of multilayer optics for high-intensity short-wavelength radiation sources are theoretically investigated, using a model developed on the basis of experimental data obtained at the FLASH and LCLS free electron lasers. We compare the radiation hardness of commonly

  15. Heat pain detection threshold is associated with the area of secondary hyperalgesia following brief thermal sensitization

    DEFF Research Database (Denmark)

    Hansen, Morten Sejer; Wetterslev, Jørn; Pipper, Christian Bressen

    2017-01-01

    INTRODUCTION: The area of secondary hyperalgesia following brief thermal sensitization (BTS) of the skin and heat pain detection thresholds (HPDT) may both have predictive abilities in regards to pain sensitivity and clinical pain states. The association between HPDT and secondary hyperalgesia......, however, remains unsettled, and the dissimilarities in physiologic properties suggest that they may represent 2 distinctively different pain entities. The aim of this study was to investigate the association between HPDT and BTS-induced secondary hyperalgesia. METHODS: A sample of 121 healthy male...... participants was included and tested on 2 separate study days with BTS (45°C, 3 minutes), HPDT, and pain during thermal stimulation (45°C, 1 minute). Areas of secondary hyperalgesia were quantified after monofilament pinprick stimulation. The pain catastrophizing scale (PCS) and hospital anxiety and depression...

  16. Air Traffic Controller Acceptability of Unmanned Aircraft System Detect-and-Avoid Thresholds

    Science.gov (United States)

    Mueller, Eric R.; Isaacson, Douglas R.; Stevens, Derek

    2016-01-01

    A human-in-the-loop experiment was conducted with 15 retired air traffic controllers to investigate two research questions: (a) what procedures are appropriate for the use of unmanned aircraft system (UAS) detect-and-avoid systems, and (b) how long in advance of a predicted close encounter should pilots request or execute a separation maneuver. The controller participants managed a busy Oakland air route traffic control sector with mixed commercial/general aviation and manned/UAS traffic, providing separation services, miles-in-trail restrictions and issuing traffic advisories. Controllers filled out post-scenario and post-simulation questionnaires, and metrics were collected on the acceptability of procedural options and temporal thresholds. The states of aircraft were also recorded when controllers issued traffic advisories. Subjective feedback indicated a strong preference for pilots to request maneuvers to remain well clear from intruder aircraft rather than deviate from their IFR clearance. Controllers also reported that maneuvering at 120 seconds until closest point of approach (CPA) was too early; maneuvers executed with less than 90 seconds until CPA were more acceptable. The magnitudes of the requested maneuvers were frequently judged to be too large, indicating a possible discrepancy between the quantitative UAS well clear standard and the one employed subjectively by manned pilots. The ranges between pairs of aircraft and the times to CPA at which traffic advisories were issued were used to construct empirical probability distributions of those metrics. Given these distributions, we propose that UAS pilots wait until an intruder aircraft is approximately 80 seconds to CPA or 6 nmi away before requesting a maneuver, and maneuver immediately if the intruder is within 60 seconds and 4 nmi. These thresholds should make the use of UAS detect and avoid systems compatible with current airspace procedures and controller expectations.

  17. Mass Detection in Mammographic Images Using Wavelet Processing and Adaptive Threshold Technique.

    Science.gov (United States)

    Vikhe, P S; Thool, V R

    2016-04-01

    Detection of masses in mammograms for early diagnosis of breast cancer is a significant task in reducing the mortality rate. However, in some cases, screening for masses is a difficult task for radiologists, due to variation in contrast, fuzzy edges and noisy mammograms. Masses and micro-calcifications are the distinctive signs for diagnosis of breast cancer. This paper presents a method for mass enhancement using a piecewise linear operator in combination with wavelet processing of mammographic images. The method includes artifact suppression and pectoral muscle removal based on morphological operations. Finally, mass segmentation using an adaptive threshold technique is carried out to separate the mass from the background. The proposed method has been tested on 130 (45 + 85) images with 90.9% and 91% True Positive Fraction (TPF) at 2.35 and 2.1 average False Positives Per Image (FP/I) from two different databases, namely the Mammographic Image Analysis Society (MIAS) and the Digital Database for Screening Mammography (DDSM). The obtained results show that the proposed technique gives improved diagnosis in early breast cancer detection.

  18. Threshold-based generic scheme for encrypted and tunneled Voice Flows Detection over IP Networks

    Directory of Open Access Journals (Sweden)

    M. Mazhar U. Rathore

    2015-07-01

    Full Text Available VoIP usage is rapidly growing due to its cost effectiveness, dramatic functionality over the traditional telephone network and its compatibility with the public switched telephone network (PSTN). In some countries, like Pakistan, the commercial usage of VoIP is prohibited. Internet service providers (ISPs) and telecommunication authorities are interested in detecting VoIP calls to either block or prioritize them, so detection of VoIP calls is important for both types of authorities. Signature-based, port-based, and pattern-based VoIP detection techniques are inefficient due to the complex and confidential security and tunneling mechanisms used by VoIP. In this paper, we propose a generic, robust, efficient, and practically implementable statistical analysis-based solution to identify encrypted, non-encrypted, or tunneled VoIP media (voice) flows using threshold values of flow statistical parameters. We have made a comparison with existing techniques and evaluated our system with respect to accuracy and efficiency. Our system has a 97.54% direct rate and a 0.00015% false positive rate.

  19. Cocaine Promotes Coincidence Detection and Lowers Induction Threshold during Hebbian Associative Synaptic Potentiation in Prefrontal Cortex.

    Science.gov (United States)

    Ruan, Hongyu; Yao, Wei-Dong

    2017-01-25

    Addictive drugs usurp neural plasticity mechanisms that normally serve reward-related learning and memory, primarily by evoking changes in glutamatergic synaptic strength in the mesocorticolimbic dopamine circuitry. Here, we show that repeated cocaine exposure in vivo does not alter synaptic strength in the mouse prefrontal cortex during an early period of withdrawal, but instead modifies a Hebbian quantitative synaptic learning rule by broadening the temporal window and lowers the induction threshold for spike-timing-dependent LTP (t-LTP). After repeated, but not single, daily cocaine injections, t-LTP in layer V pyramidal neurons is induced at +30 ms, a normally ineffective timing interval for t-LTP induction in saline-exposed mice. This cocaine-induced, extended-timing t-LTP lasts for ∼1 week after terminating cocaine and is accompanied by an increased susceptibility to potentiation by fewer pre-post spike pairs, indicating a reduced t-LTP induction threshold. Basal synaptic strength and the maximal attainable t-LTP magnitude remain unchanged after cocaine exposure. We further show that the cocaine facilitation of t-LTP induction is caused by sensitized D1-cAMP/protein kinase A dopamine signaling in pyramidal neurons, which then pathologically recruits voltage-gated l-type Ca 2+ channels that synergize with GluN2A-containing NMDA receptors to drive t-LTP at extended timing. Our results illustrate a mechanism by which cocaine, acting on a key neuromodulation pathway, modifies the coincidence detection window during Hebbian plasticity to facilitate associative synaptic potentiation in prefrontal excitatory circuits. By modifying rules that govern activity-dependent synaptic plasticity, addictive drugs can derail the experience-driven neural circuit remodeling process important for executive control of reward and addiction. It is believed that addictive drugs often render an addict's brain reward system hypersensitive, leaving the individual more susceptible to

  20. Indirect detection of radiation sources through direct detection of radiolysis products

    Science.gov (United States)

    Farmer, Joseph C [Tracy, CA; Fischer, Larry E [Los Gatos, CA; Felter, Thomas E [Livermore, CA

    2010-04-20

    A system for indirectly detecting a radiation source by directly detecting radiolytic products. The radiation source emits radiation and the radiation produces the radiolytic products. A fluid is positioned to receive the radiation from the radiation source. When the fluid is irradiated, radiolytic products are produced. By directly detecting the radiolytic products, the radiation source is detected.

  1. Psychophysical estimate of plantar vibration sensitivity brings additional information to the detection threshold in young and elderly subjects

    Directory of Open Access Journals (Sweden)

    Yves Jammes

    Full Text Available Objective: Vibration detection threshold of the foot sole was compared to the psychophysical estimate of vibration over a wide range of amplitudes in young (20–34 years old) and elderly subjects (53–67 years old). Methods: The vibration detection threshold was determined on the hallux, 5th metatarsal head, and heel at frequencies of 25, 50 and 150 Hz. For vibrations of higher amplitude (reaching 360 μm), the Stevens power function (Ψ = k · Φ^n) allowed regression equations to be obtained between the vibration estimate (Ψ) and its physical magnitude (Φ), with the n coefficient giving the subjective intensity of vibration perception. We searched for age-related changes in vibration perception by the foot sole. Results: In all participants, higher n values were measured at a vibration frequency of 150 Hz and, compared to the young adults, the elderly had lower n values measured at this frequency. Only in the young participants was the vibration detection threshold lowered at 150 Hz. Conclusion: The psychophysical estimate brings further information than the vibration detection threshold, which is less affected by age. Significance: The clinical interest of the psychophysical vibration estimate was assessed in a patient with a unilateral alteration of foot sensitivity. Keywords: Vibration sensitivity, Vibration detection threshold, Foot sole, Elderly
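
    Fitting the Stevens power function Ψ = k · Φ^n reduces to linear regression in log-log coordinates; the sketch below assumes NumPy and entirely hypothetical magnitude-estimation data.

```python
import numpy as np

def fit_stevens_exponent(amplitude_um, estimate):
    """Fit Psi = k * Phi**n by ordinary least squares on the log-transformed data:
    log(Psi) = log(k) + n * log(Phi)."""
    n, log_k = np.polyfit(np.log(amplitude_um), np.log(estimate), deg=1)
    return n, np.exp(log_k)

# Hypothetical magnitude-estimation data at 150 Hz
amplitude = np.array([20, 40, 80, 160, 320], dtype=float)   # micrometres
rating = np.array([8, 14, 22, 38, 60], dtype=float)         # subjective estimates
n, k = fit_stevens_exponent(amplitude, rating)
print(f"exponent n = {n:.2f}, scaling constant k = {k:.2f}")
```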

  2. CISN ShakeAlert: Faster Warning Information Through Multiple Threshold Event Detection in the Virtual Seismologist (VS) Early Warning Algorithm

    Science.gov (United States)

    Cua, G. B.; Fischer, M.; Caprio, M.; Heaton, T. H.; Cisn Earthquake Early Warning Project Team

    2010-12-01

    The Virtual Seismologist (VS) earthquake early warning (EEW) algorithm is one of 3 EEW approaches being incorporated into the California Integrated Seismic Network (CISN) ShakeAlert system, a prototype EEW system that could potentially be implemented in California. The VS algorithm, implemented by the Swiss Seismological Service at ETH Zurich, is a Bayesian approach to EEW, wherein the most probable source estimate at any given time is a combination of contributions from a likehihood function that evolves in response to incoming data from the on-going earthquake, and selected prior information, which can include factors such as network topology, the Gutenberg-Richter relationship or previously observed seismicity. The VS codes have been running in real-time at the Southern California Seismic Network since July 2008, and at the Northern California Seismic Network since February 2009. We discuss recent enhancements to the VS EEW algorithm that are being integrated into CISN ShakeAlert. We developed and continue to test a multiple-threshold event detection scheme, which uses different association / location approaches depending on the peak amplitudes associated with an incoming P pick. With this scheme, an event with sufficiently high initial amplitudes can be declared on the basis of a single station, maximizing warning times for damaging events for which EEW is most relevant. Smaller, non-damaging events, which will have lower initial amplitudes, will require more picks to initiate an event declaration, with the goal of reducing false alarms. This transforms the VS codes from a regional EEW approach reliant on traditional location estimation (and the requirement of at least 4 picks as implemented by the Binder Earthworm phase associator) into an on-site/regional approach capable of providing a continuously evolving stream of EEW information starting from the first P-detection. Real-time and offline analysis on Swiss and California waveform datasets indicate that the

  3. Cool, warm, and heat-pain detection thresholds: testing methods and inferences about anatomic distribution of receptors.

    Science.gov (United States)

    Dyck, P J; Zimmerman, I; Gillen, D A; Johnson, D; Karnes, J L; O'Brien, P C

    1993-08-01

    We recently found that vibratory detection threshold is greatly influenced by the algorithm of testing. Here, we study the influence of stimulus characteristics and algorithm of testing and estimating threshold on cool (CDT), warm (WDT), and heat-pain (HPDT) detection thresholds. We show that continuously decreasing (for CDT) or increasing (for WDT) thermode temperature to the point at which cooling or warming is perceived and signaled by depressing a response key ("appearance" threshold) overestimates threshold with rapid rates of thermal change. The mean of the appearance and disappearance thresholds also does not perform well for insensitive sites and patients. Pyramidal (or flat-topped pyramidal) stimuli ranging in magnitude, in 25 steps, from near skin temperature to 9 degrees C for 10 seconds (for CDT), from near skin temperature to 45 degrees C for 10 seconds (for WDT), and from near skin temperature to 49 degrees C for 10 seconds (for HPDT) provide ideal stimuli for use in several algorithms of testing and estimating threshold. Near threshold, only the initial direction of thermal change from skin temperature is perceived, and not its return to baseline. Use of steps of stimulus intensity allows the subject or patient to take the needed time to decide whether the stimulus was felt or not (in 4, 2, and 1 stepping algorithms), or whether it occurred in stimulus interval 1 or 2 (in two-alternative forced-choice testing). Thermal thresholds were generally significantly lower with a large (10 cm2) than with a small (2.7 cm2) thermode.(ABSTRACT TRUNCATED AT 250 WORDS)

  4. Bayesian-statistical decision threshold, detection limit, and confidence interval in nuclear radiation measurement

    International Nuclear Information System (INIS)

    Weise, K.

    1998-01-01

    When a contribution of a particular nuclear radiation is to be detected, for instance, a spectral line of interest for some purpose of radiation protection, and quantities and their uncertainties must be taken into account which, such as influence quantities, cannot be determined by repeated measurements or by counting nuclear radiation events, then conventional statistics of event frequencies is not sufficient for defining the decision threshold, the detection limit, and the limits of a confidence interval. These characteristic limits are therefore redefined on the basis of Bayesian statistics for a wider applicability and in such a way that the usual practice remains as far as possible unaffected. The principle of maximum entropy is applied to establish probability distributions from available information. Quantiles of these distributions are used for defining the characteristic limits. But such a distribution must not be interpreted as a distribution of event frequencies such as the Poisson distribution. It rather expresses the actual state of incomplete knowledge of a physical quantity. The different definitions and interpretations and their quantitative consequences are presented and discussed with two examples. The new approach provides a theoretical basis for the DIN 25482-10 standard presently in preparation for general applications of the characteristic limits. (orig.) [de

  5. Quantitative prediction of perceptual decisions during near-threshold fear detection

    Science.gov (United States)

    Pessoa, Luiz; Padmala, Srikanth

    2005-04-01

    A fundamental goal of cognitive neuroscience is to explain how mental decisions originate from basic neural mechanisms. The goal of the present study was to investigate the neural correlates of perceptual decisions in the context of emotional perception. To probe this question, we investigated how fluctuations in functional MRI (fMRI) signals were correlated with behavioral choice during a near-threshold fear detection task. fMRI signals predicted behavioral choice independently of stimulus properties and task accuracy in a network of brain regions linked to emotional processing: posterior cingulate cortex, medial prefrontal cortex, right inferior frontal gyrus, and left insula. We quantified the link between fMRI signals and behavioral choice in a whole-brain analysis by determining choice probabilities by means of signal-detection theory methods. Our results demonstrate that voxel-wise fMRI signals can reliably predict behavioral choice in a quantitative fashion (choice probabilities ranged from 0.63 to 0.78) at levels comparable to neuronal data. We suggest that the conscious decision that a fearful face has been seen is represented across a network of interconnected brain regions that prepare the organism to appropriately handle emotionally challenging stimuli and that regulate the associated emotional response. decision making | emotion | functional MRI

  6. Comparison of edge detection techniques for M7 subtype Leukemic cell in terms of noise filters and threshold value

    Directory of Open Access Journals (Sweden)

    Abdul Salam Afifah Salmi

    2017-01-01

    Full Text Available This paper focuses on studying and identifying various threshold values for two commonly used edge detection techniques, Sobel and Canny edge detection. The idea is to determine which values give accurate results in identifying a particular leukemic cell. In addition, evaluating the suitability of edge detectors is also essential, as feature extraction of the cell depends greatly on image segmentation (edge detection). First, an image of the M7 subtype of Acute Myelocytic Leukemia (AML) is chosen because its diagnosis has been found lacking. Next, to enhance image quality, noise filters are applied, and useful information is acquired by comparing images with no filter, a median filter and an average filter. Threshold values are fixed at 0, 0.25 and 0.5. The investigation found that, without any filter, Canny with a threshold value of 0.5 yields the best result.
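
    The comparison described above can be sketched with scikit-image and SciPy: Sobel and Canny edge maps are computed after no filtering, median filtering and mean filtering at a chosen threshold. How the abstract's normalised threshold values map onto the Canny hysteresis thresholds is an assumption of this sketch.

```python
import numpy as np
from scipy import ndimage
from skimage import feature, filters

def edge_maps(cell_image, threshold=0.5):
    """Compare Sobel and Canny edge maps of a float grayscale (0..1) blood-cell image
    after three pre-filters: none, 3x3 median, and 3x3 mean."""
    variants = {
        "no_filter": cell_image,
        "median": ndimage.median_filter(cell_image, size=3),
        "average": ndimage.uniform_filter(cell_image, size=3),
    }
    results = {}
    for name, img in variants.items():
        sobel_magnitude = filters.sobel(img)
        results[name] = {
            # Binarise the Sobel gradient magnitude at a fraction of its maximum.
            "sobel": sobel_magnitude > threshold * sobel_magnitude.max(),
            # Use the chosen value as the upper Canny hysteresis threshold.
            "canny": feature.canny(img, sigma=1.0,
                                   low_threshold=0.5 * threshold,
                                   high_threshold=threshold),
        }
    return results
```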

  7. A de-noising algorithm based on wavelet threshold-exponential adaptive window width-fitting for ground electrical source airborne transient electromagnetic signal

    Science.gov (United States)

    Ji, Yanju; Li, Dongsheng; Yu, Mingmei; Wang, Yuan; Wu, Qiong; Lin, Jun

    2016-05-01

    The ground electrical source airborne transient electromagnetic system (GREATEM) on an unmanned aircraft enjoys considerable prospecting depth, lateral resolution and detection efficiency, etc. In recent years it has become an important technical means of rapid resources exploration. However, GREATEM data are extremely vulnerable to stationary white noise and non-stationary electromagnetic noise (sferics noise, aircraft engine noise and other human electromagnetic noises). These noises will cause degradation of the imaging quality for data interpretation. Based on the characteristics of the GREATEM data and major noises, we propose a de-noising algorithm utilizing wavelet threshold method and exponential adaptive window width-fitting. Firstly, the white noise is filtered in the measured data using the wavelet threshold method. Then, the data are segmented using data window whose step length is even logarithmic intervals. The data polluted by electromagnetic noise are identified within each window based on the discriminating principle of energy detection, and the attenuation characteristics of the data slope are extracted. Eventually, an exponential fitting algorithm is adopted to fit the attenuation curve of each window, and the data polluted by non-stationary electromagnetic noise are replaced with their fitting results. Thus the non-stationary electromagnetic noise can be effectively removed. The proposed algorithm is verified by the synthetic and real GREATEM signals. The results show that in GREATEM signal, stationary white noise and non-stationary electromagnetic noise can be effectively filtered using the wavelet threshold-exponential adaptive window width-fitting algorithm, which enhances the imaging quality.
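
    The first, white-noise stage of the de-noising chain can be sketched with PyWavelets as below; the universal soft threshold shown here is a common default standing in for the paper's choice, and the later windowing, energy-detection and exponential-fitting stages are not reproduced.

```python
import numpy as np
import pywt

def wavelet_denoise(trace, wavelet="db4", level=5):
    """Soft-threshold the detail coefficients of a 1-D transient signal using the
    universal threshold, with the noise level estimated robustly from the
    finest-scale coefficients."""
    coeffs = pywt.wavedec(trace, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745           # robust noise estimate
    thr = sigma * np.sqrt(2.0 * np.log(len(trace)))          # universal threshold
    denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(trace)]
```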

  8. Real-time detection of faecally contaminated drinking water with tryptophan-like fluorescence: defining threshold values.

    Science.gov (United States)

    Sorensen, James P R; Baker, Andy; Cumberland, Susan A; Lapworth, Dan J; MacDonald, Alan M; Pedley, Steve; Taylor, Richard G; Ward, Jade S T

    2018-05-01

    We assess the use of fluorescent dissolved organic matter at excitation-emission wavelengths of 280 nm and 360 nm, termed tryptophan-like fluorescence (TLF), as an indicator of faecally contaminated drinking water. A significant logistic regression model was developed using TLF as a predictor of thermotolerant coliforms (TTCs) using data from groundwater- and surface water-derived drinking water sources in India, Malawi, South Africa and Zambia. A TLF threshold of 1.3 ppb dissolved tryptophan was selected to classify TTC contamination. Validation of the TLF threshold indicated a false-negative error rate of 15% and a false-positive error rate of 18%. The threshold was unsuccessful at classifying contaminated sources containing water globally. Copyright © 2017 Natural Environment Research Council (NERC), as represented by the British Geological Survey (BGS). Published by Elsevier B.V. All rights reserved.
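
    The threshold-setting idea can be sketched as a logistic regression of TTC presence on TLF; the code below assumes scikit-learn and entirely hypothetical data, and the 50% operating point it derives is not the study's 1.3 ppb value, which was chosen by balancing false-negative and false-positive rates on the real dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical TLF readings (ppb dissolved tryptophan) and TTC presence/absence.
tlf = np.array([0.2, 0.5, 0.8, 1.0, 1.2, 1.5, 2.0, 3.5, 5.0, 8.0]).reshape(-1, 1)
ttc_present = np.array([0, 0, 0, 0, 1, 1, 1, 1, 1, 1])

model = LogisticRegression().fit(tlf, ttc_present)
# TLF value at which the modelled probability of contamination crosses 50%.
threshold_ppb = -model.intercept_[0] / model.coef_[0, 0]
print(f"illustrative decision threshold: {threshold_ppb:.2f} ppb")
```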

  9. Histogram-Based Thresholding for Detection and Quantification of Hemorrhages in Retinal Images

    Directory of Open Access Journals (Sweden)

    Hussain Fadhel Hamdan Jaafar

    2016-12-01

    Full Text Available Retinal image analysis is commonly used for the detection and quantification of diabetic retinopathy. In retinal images, dark lesions including hemorrhages and microaneurysms are the earliest warnings of vision loss. In this paper, a new algorithm for the extraction and quantification of hemorrhages in fundus images is presented. Hemorrhage candidates are extracted in a preliminary step as a coarse segmentation followed by a fine segmentation step. Local variation processes are applied in the coarse segmentation step to determine the boundaries of all candidates with distinct edges. Fine segmentation processes are based on histogram thresholding to extract real hemorrhages from the segmented candidates locally. The proposed method was trained and tested using an image dataset of 153 manually labeled retinal images. At the pixel level, the proposed method could identify abnormal retinal images with 90.7% sensitivity and 85.1% predictive value. Due to its distinctive performance measurements, this technique demonstrates that it could be used for computer-aided mass screening of retinal diseases.
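
    The fine-segmentation idea (a local histogram-derived threshold inside each coarse candidate region) can be sketched as follows, assuming scikit-image; Otsu's histogram threshold on the green channel is used here as a stand-in for the paper's local histogram rule.

```python
import numpy as np
from skimage.filters import threshold_otsu

def fine_segment_candidate(green_channel, candidate_mask):
    """Within a coarse candidate region, keep only the darker (hemorrhage-like)
    pixels by thresholding the local intensity histogram."""
    region = green_channel[candidate_mask]
    if region.size == 0:
        return np.zeros_like(candidate_mask, dtype=bool)
    local_thr = threshold_otsu(region)
    refined = np.zeros_like(candidate_mask, dtype=bool)
    refined[candidate_mask] = green_channel[candidate_mask] < local_thr
    return refined
```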

  10. Enhanced detection threshold for in vivo cortical stimulation produced by Hebbian conditioning

    Science.gov (United States)

    Rebesco, James M.; Miller, Lee E.

    2011-02-01

    Normal brain function requires constant adaptation, as an organism learns to associate important sensory stimuli with the appropriate motor actions. Neurological disorders may disrupt these learned associations and require the nervous system to reorganize itself. As a consequence, neural plasticity is a crucial component of normal brain function and a critical mechanism for recovery from injury. Associative, or Hebbian, pairing of pre- and post-synaptic activity has been shown to alter stimulus-evoked responses in vivo; however, to date, such protocols have not been shown to affect the animal's subsequent behavior. We paired stimulus trains separated by a brief time delay to two electrodes in rat sensorimotor cortex, which changed the statistical pattern of spikes during subsequent behavior. These changes were consistent with strengthened functional connections from the leading electrode to the lagging electrode. We then trained rats to respond to a microstimulation cue, and repeated the paradigm using the cue electrode as the leading electrode. This pairing lowered the rat's ICMS-detection threshold, with the same dependence on intra-electrode time lag that we found for the functional connectivity changes. The timecourse of the behavioral effects was very similar to that of the connectivity changes. We propose that the behavioral changes were a consequence of strengthened functional connections from the cue electrode to other regions of sensorimotor cortex. Such paradigms might be used to augment recovery from a stroke, or to promote adaptation in a bidirectional brain-machine interface.

  11. A Method for Harmonic Sources Detection based on Harmonic Distortion Power Rate

    Science.gov (United States)

    Lin, Ruixing; Xu, Lin; Zheng, Xian

    2018-03-01

    Harmonic source detection at the point of common coupling is an essential step for harmonic contribution determination and harmonic mitigation. In this paper, the harmonic distortion power rate index is proposed for harmonic source location based on IEEE Std 1459-2010. A method based only on harmonic distortion power is not suitable when the background harmonic is large. To solve this problem, a threshold is determined from prior information: when the harmonic distortion power is larger than the threshold, the customer side is considered the main harmonic source; otherwise, the utility side is. A simple model of a public power system was built in MATLAB/Simulink, and field test results of typical harmonic loads verified the effectiveness of the proposed method.

  12. Assessment of hearing threshold in adults with hearing loss using an automated system of cortical auditory evoked potential detection

    Directory of Open Access Journals (Sweden)

    Alessandra Spada Durante

    Full Text Available Abstract Introduction: The use of hearing aids by individuals with hearing loss brings a better quality of life. Access to and benefit from these devices may be compromised in patients who present difficulties or limitations in traditional behavioral audiological evaluation, such as newborns and small children, individuals with auditory neuropathy spectrum, autism, and intellectual deficits, and in adults and the elderly with dementia. These populations (or individuals) are unable to undergo a behavioral assessment, and generate a growing demand for objective methods to assess hearing. Cortical auditory evoked potentials have been used for decades to estimate hearing thresholds. Current technological advances have led to the development of equipment that allows their clinical use, with features that enable greater accuracy, sensitivity, and specificity, and the possibility of automated detection, analysis, and recording of cortical responses. Objective: To determine and correlate behavioral auditory thresholds with cortical auditory thresholds obtained from an automated response analysis technique. Methods: The study included 52 adults, divided into two groups: 21 adults with moderate to severe hearing loss (study group); and 31 adults with normal hearing (control group). An automated system of detection, analysis, and recording of cortical responses (HEARLab®) was used to record the behavioral and cortical thresholds. The subjects remained awake in an acoustically treated environment. Altogether, 150 tone bursts at 500, 1000, 2000, and 4000 Hz were presented through insert earphones in descending-ascending intensity. The lowest level at which the subject detected the sound stimulus was defined as the behavioral (hearing) threshold (BT). The lowest level at which a cortical response was observed was defined as the cortical electrophysiological threshold. These two responses were correlated using linear regression. Results: The cortical

  13. Assessment of hearing threshold in adults with hearing loss using an automated system of cortical auditory evoked potential detection.

    Science.gov (United States)

    Durante, Alessandra Spada; Wieselberg, Margarita Bernal; Roque, Nayara; Carvalho, Sheila; Pucci, Beatriz; Gudayol, Nicolly; de Almeida, Kátia

    The use of hearing aids by individuals with hearing loss brings a better quality of life. Access to and benefit from these devices may be compromised in patients who present difficulties or limitations in traditional behavioral audiological evaluation, such as newborns and small children, individuals with auditory neuropathy spectrum, autism, and intellectual deficits, and in adults and the elderly with dementia. These populations (or individuals) are unable to undergo a behavioral assessment, and generate a growing demand for objective methods to assess hearing. Cortical auditory evoked potentials have been used for decades to estimate hearing thresholds. Current technological advances have led to the development of equipment that allows their clinical use, with features that enable greater accuracy, sensitivity, and specificity, and the possibility of automated detection, analysis, and recording of cortical responses. To determine and correlate behavioral auditory thresholds with cortical auditory thresholds obtained from an automated response analysis technique. The study included 52 adults, divided into two groups: 21 adults with moderate to severe hearing loss (study group); and 31 adults with normal hearing (control group). An automated system of detection, analysis, and recording of cortical responses (HEARLab®) was used to record the behavioral and cortical thresholds. The subjects remained awake in an acoustically treated environment. Altogether, 150 tone bursts at 500, 1000, 2000, and 4000 Hz were presented through insert earphones in descending-ascending intensity. The lowest level at which the subject detected the sound stimulus was defined as the behavioral (hearing) threshold (BT). The lowest level at which a cortical response was observed was defined as the cortical electrophysiological threshold. These two responses were correlated using linear regression. The cortical electrophysiological threshold was, on average, 7.8 dB higher than the

  14. Detection threshold for sound distortion resulting from noise reduction in normal-hearing and hearing-impaired listeners.

    Science.gov (United States)

    Brons, Inge; Dreschler, Wouter A; Houben, Rolph

    2014-09-01

    Hearing-aid noise reduction should reduce background noise, but not disturb the target speech. This objective is difficult because noise reduction suffers from a trade-off between the amount of noise removed and signal distortion. It is unknown if this important trade-off differs between normal-hearing (NH) and hearing-impaired (HI) listeners. This study separated the negative effect of noise reduction (distortion) from the positive effect (reduction of noise) to allow the measurement of the detection threshold for noise-reduction (NR) distortion. Twelve NH subjects and 12 subjects with mild to moderate sensorineural hearing loss participated in this study. The detection thresholds for distortion were determined using an adaptive procedure with a three-interval, two-alternative forced-choice paradigm. Different levels of distortion were obtained by changing the maximum amount of noise reduction. Participants were also asked to indicate their preferred NR strength. The detection threshold for overall distortion was higher for HI subjects than for NH subjects, suggesting that stronger noise reduction can be applied for HI listeners without affecting the perceived sound quality. However, the preferred NR strength of HI listeners was closer to their individual detection threshold for distortion than in NH listeners. This implies that HI listeners tolerate fewer audible distortions than NH listeners.

  15. Asymptomatic loss of intraepidermal nerve fibers with preserved thermal detection thresholds after repeated exposure to severe cold

    DEFF Research Database (Denmark)

    Krøigård, Thomas; Wirenfeldt, Martin; Svendsen, Toke K.

    2018-01-01

    Background: Cold-induced peripheral neuropathy has been described in individuals exposed to severe cold resulting in pain, hypersensitivity to cold, hyperhidrosis, numbness, and skin changes. Nerve conduction studies and thermal detection thresholds are abnormal in symptomatic patients......-induced peripheral neuropathy may be prevalent in subjects living in or near polar regions which could have implications for the recruitment of healthy subjects....

  16. Exploring three faint source detections methods for aperture synthesis radio images

    Science.gov (United States)

    Peracaula, M.; Torrent, A.; Masias, M.; Lladó, X.; Freixenet, J.; Martí, J.; Sánchez-Sutil, J. R.; Muñoz-Arjonilla, A. J.; Paredes, J. M.

    2015-04-01

    Wide-field radio interferometric images often contain a large population of faint compact sources. Due to their low intensity/noise ratio, these objects can be easily missed by automated detection methods, which have been classically based on thresholding techniques after local noise estimation. The aim of this paper is to present and analyse the performance of several alternative or complementary techniques to thresholding. We compare three different algorithms to increase the detection rate of faint objects. The first technique consists of combining wavelet decomposition with local thresholding. The second technique is based on the structural behaviour of the neighbourhood of each pixel. Finally, the third algorithm uses local features extracted from a bank of filters and a boosting classifier to perform the detections. The methods' performances are evaluated using simulations and radio mosaics from the Giant Metrewave Radio Telescope and the Australia Telescope Compact Array. We show that the new methods perform better than well-known state of the art methods such as SEXTRACTOR, SAD and DUCHAMP at detecting faint sources of radio interferometric images.

  17. Estimation of Signal Coherence Threshold and Concealed Spectral Lines Applied to Detection of Turbofan Engine Combustion Noise

    Science.gov (United States)

    Miles, Jeffrey Hilton

    2010-01-01

    Combustion noise from turbofan engines has become important as the noise from sources like the fan and jet is reduced. An aligned and un-aligned coherence technique has been developed to determine a threshold level for the coherence and thereby help to separate the coherent combustion noise source from other noise sources measured with far-field microphones. This method is compared with a statistics-based coherence threshold estimation method. In addition, the un-aligned coherence procedure at the same time also reveals periodicities, spectral lines, and undamped sinusoids hidden by broadband turbofan engine noise. In calculating the coherence threshold using a statistical method, one may use either the number of independent records or a larger number corresponding to the number of overlapped records used to create the average. Using data from a turbofan engine and a simulation, this paper shows that applying the Fisher z-transform to the un-aligned coherence can aid in making the proper selection of samples and produce a reasonable statistics-based coherence threshold. Examples are presented showing that the underlying tonal and coherent broadband structure that is buried under random broadband noise and jet noise can be determined. The method also shows the possible presence of indirect combustion noise.
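
    A statistics-based coherence threshold of the kind referred to above can be sketched as follows, assuming SciPy and non-overlapping segment averaging; the bound 1 − α^(1/(N−1)) is the standard significance level for magnitude-squared coherence estimated from N independent averages.

```python
import numpy as np
from scipy import signal

def coherence_with_threshold(x, y, fs, nperseg=1024, alpha=0.05):
    """Magnitude-squared coherence plus a significance threshold: estimates below
    the threshold are consistent with zero true coherence at level alpha."""
    freqs, coh = signal.coherence(x, y, fs=fs, nperseg=nperseg, noverlap=0)
    n_segments = max(len(x) // nperseg, 2)          # independent (non-overlapping) averages
    threshold = 1.0 - alpha ** (1.0 / (n_segments - 1))
    return freqs, coh, threshold
```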

  18. Simulated annealing CFAR threshold selection for South African ship detection in ASAR imagery

    CSIR Research Space (South Africa)

    Schwegmann, CP

    2014-07-01

    Full Text Available. Figure 3 (caption): The iterative procedure of simulated annealing. Starting at some initial threshold plane Ti(x, y), each iteration tests whether a new candidate solution T is better than the previous best solution Tb(x, y); a "bad" candidate can still replace the current best with the Boltzmann probability. The accepted threshold plane Tb(x, y) is then mapped to the 2D distribution map...
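
    The acceptance rule in the caption (always keep a better candidate, keep a worse one with the Boltzmann probability) is the core of simulated annealing. A generic sketch is given below; the cost function, perturbation scale and cooling schedule are placeholders rather than the CFAR-specific choices of the paper.

```python
import numpy as np

def anneal_threshold_plane(cost, t0, n_iter=5000, temp0=1.0, cooling=0.999, rng=None):
    """Generic simulated-annealing search over a threshold plane.
    cost(plane) returns a scalar score (lower is better); t0 is the initial plane."""
    rng = np.random.default_rng() if rng is None else rng
    current, curr_c = t0.copy(), cost(t0)
    best, best_c = current.copy(), curr_c
    temp = temp0
    for _ in range(n_iter):
        candidate = current + rng.normal(scale=0.05, size=current.shape)
        cand_c = cost(candidate)
        # accept improvements always; accept worse candidates with probability
        # exp(-delta / temperature) (the Boltzmann criterion)
        if cand_c < curr_c or rng.random() < np.exp(-(cand_c - curr_c) / temp):
            current, curr_c = candidate, cand_c
            if curr_c < best_c:
                best, best_c = current.copy(), curr_c
        temp *= cooling
    return best, best_c
```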

  19. Intrusion Detection using Open Source Tools

    OpenAIRE

    Jack TIMOFTE

    2008-01-01

    We have witnessed in the recent years that open source tools have gained popularity among all types of users, from individuals or small businesses to large organizations and enterprises. In this paper we will present three open source IDS tools: OSSEC, Prelude and SNORT.

  20. Automated detection of macular drusen using geometric background leveling and threshold selection.

    Science.gov (United States)

    Smith, R Theodore; Chan, Jackie K; Nagasaki, Takayuki; Ahmad, Umer F; Barbazetto, Irene; Sparrow, Janet; Figueroa, Marta; Merriam, Joanna

    2005-02-01

    Age-related macular degeneration (ARMD) is the most prevalent cause of visual loss in patients older than 60 years in the United States. Observation of drusen is the hallmark finding in the clinical evaluation of ARMD. The aims were to segment and quantify drusen found in patients with ARMD using image analysis and to compare the efficacy of image analysis segmentation with that of stereoscopic manual grading of drusen. This retrospective study was conducted at a university referral center. Photographs were randomly selected from an available database of patients with known ARMD in the ongoing Columbia University Macular Genetics Study. All patients were white and older than 60 years. Twenty images from 17 patients were selected as representative of common manifestations of drusen. Image preprocessing included automated color balancing and, where necessary, manual segmentation of confounding lesions such as geographic atrophy (3 images). The operator then chose among 3 automated processing options suggested by the predominant drusen type. Automated processing consisted of elimination of background variability by a mathematical model and subsequent histogram-based threshold selection. A retinal specialist using a graphic tablet while viewing stereo pairs constructed digital drusen drawings for each image. The sensitivity and specificity of drusen segmentation using the automated method with respect to the manual stereoscopic drusen drawings were calculated on a rigorous pixel-by-pixel basis. The median sensitivity and specificity of automated segmentation were 70% and 81%, respectively. After preprocessing and option choice, reproducibility of automated drusen segmentation was necessarily 100%. Automated drusen segmentation can be reliably performed on digital fundus photographs and, with only minor preprocessing requirements, results in quantification of drusen in a more precise manner than is traditionally possible with manual stereoscopic grading of drusen.
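
    The two automated steps described above (removal of background variability by a mathematical model, then histogram-based threshold selection) can be caricatured as follows. The low-order polynomial surface and the mean-plus-k-sigma rule are stand-ins for the paper's geometric leveling model and its drusen-type-dependent threshold choice, not a reproduction of them.

```python
import numpy as np

def level_background(image, order=2):
    """Fit and subtract a low-order 2-D polynomial surface to remove slow
    background variation from a gray-scale fundus image (numpy array)."""
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    x, y = xx.ravel() / w, yy.ravel() / h
    cols = [x**i * y**j for i in range(order + 1) for j in range(order + 1 - i)]
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, image.ravel().astype(float), rcond=None)
    return image - (A @ coeffs).reshape(h, w)

def histogram_threshold(leveled, k=2.0):
    """Simple global threshold from the leveled image's histogram:
    mean + k * standard deviation of the (mostly background) pixels."""
    return leveled.mean() + k * leveled.std()
```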

  1. Gap detection threshold in the rat before and after auditory cortex ablation.

    Science.gov (United States)

    Syka, J; Rybalko, N; Mazelová, J; Druga, R

    2002-10-01

    Gap detection threshold (GDT) was measured in adult female pigmented rats (strain Long-Evans) by an operant conditioning technique with food reinforcement, before and after bilateral ablation of the auditory cortex. GDT depended on the frequency spectrum and intensity of the continuously present noise in which the gaps were embedded. The mean values of GDT for gaps embedded in white noise or low-frequency noise (upper cutoff frequency 3 kHz) at 70 dB sound pressure level (SPL) were 1.57+/-0.07 ms and 2.9+/-0.34 ms, respectively. Decreasing noise intensity from 80 dB SPL to 20 dB SPL produced a significant increase in GDT. The increase in GDT was relatively small in the range of 80-50 dB SPL for white noise and in the range of 80-60 dB for low-frequency noise. The minimal intensity level of the noise that enabled GDT measurement was 20 dB SPL for white noise and 30 dB SPL for low-frequency noise; mean GDT values at these intensities were 10.6+/-3.9 ms and 31.3+/-4.2 ms, respectively. Bilateral ablation of the primary auditory cortex (complete destruction of the Te1 and partial destruction of the Te2 and Te3 areas) resulted in an increase in GDT values. By the fifth day after surgery, the rats were able to detect gaps in the noise; the GDT values observed at this time were 4.2+/-1.1 ms for white noise and 7.4+/-3.1 ms for low-frequency noise at 70 dB SPL. During the first month after cortical ablation, recovery of GDT was observed. However, 1 month after cortical ablation GDT still remained slightly higher than in controls (1.8+/-0.18 ms for white noise, 3.22+/-0.15 ms for low-frequency noise); a further decrease in GDT values during the subsequent months was not observed.

  2. Multiple-Threshold Event Detection and Other Enhancements to the Virtual Seismologist (VS) Earthquake Early Warning Algorithm

    Science.gov (United States)

    Fischer, M.; Caprio, M.; Cua, G. B.; Heaton, T. H.; Clinton, J. F.; Wiemer, S.

    2009-12-01

    The Virtual Seismologist (VS) algorithm is a Bayesian approach to earthquake early warning (EEW) being implemented by the Swiss Seismological Service at ETH Zurich. The application of Bayes’ theorem in earthquake early warning states that the most probable source estimate at any given time is a combination of contributions from a likelihood function, which evolves in response to incoming data from the on-going earthquake, and selected prior information, which can include factors such as network topology, the Gutenberg-Richter relationship or previously observed seismicity. The VS algorithm was one of three EEW algorithms involved in the California Integrated Seismic Network (CISN) real-time EEW testing and performance evaluation effort. Its compelling real-time performance in California over the last three years has led to its inclusion in the new USGS-funded effort to develop key components of CISN ShakeAlert, a prototype EEW system that could potentially be implemented in California. A significant portion of VS code development was supported by the SAFER EEW project in Europe. We discuss recent enhancements to the VS EEW algorithm. We developed and continue to test a multiple-threshold event detection scheme, which uses different association/location approaches depending on the peak amplitudes associated with an incoming P pick. With this scheme, an event with sufficiently high initial amplitudes can be declared on the basis of a single station, maximizing warning times for the damaging events for which EEW is most relevant. Smaller, non-damaging events, which will have lower initial amplitudes, will require more picks to be declared an event, in order to reduce false alarms. This transforms the VS codes from a regional EEW approach reliant on traditional location estimation (and its requirement of at least 4 picks, as implemented by the Binder Earthworm phase associator) to a hybrid on-site/regional approach capable of providing a continuously evolving stream of EEW...
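
    The multiple-threshold declaration logic sketched in this abstract reduces to a simple rule: a single pick with a very large amplitude suffices, otherwise several associated picks are required. The toy function below illustrates the idea; the amplitude cutoff and pick count are hypothetical parameters, not the values used in the VS implementation.

```python
def should_declare_event(pick_amplitudes, single_station_amp, min_picks=4):
    """Declare an event from one station if any associated P pick has a peak
    amplitude above single_station_amp; otherwise require at least min_picks
    associated picks before declaring, to limit false alarms."""
    if any(a >= single_station_amp for a in pick_amplitudes):
        return True
    return len(pick_amplitudes) >= min_picks
```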

  3. Ionizing radiation source detection by personal TLD

    International Nuclear Information System (INIS)

    Marinkovic, O.; Mirkov, Z.

    2002-01-01

    The Laboratory for Personal Dosimetry has about 3000 workers under control, most of whom work in medicine. Some institutions, such as big health centers, have different ionizing radiation sources, so it is useful to analyze what the source of irradiation has been, especially when a dosimeter shows a high dose. The personal dosimetry equipment is a Harshaw TLD Reader Model 6600, and the dosimeters consist of two LiF TLD-100 chips assembled in bar-coded cards which are worn in holders with one tissue-equivalent filter (to determine H(10)) and one skin-equivalent filter (to determine H(0.07)). The calibration dosimeters were irradiated in holders by different sources: x-rays (80 keV and 100 keV), 60Co, 90Sr (at different distances from the beta source) and photon beams (at a radiotherapy accelerator, 6 MeV, 10 MeV and 18 MeV). The dose ratio for the two LiF crystals was calculated and represented in graphs. It is therefore possible to calculate the ratio H(10)/H(0.07) for a personal TLD and analyze what the source of irradiation has been. There is also a calibration for determining the time of irradiation, based on glow curve deconvolution.

  4. Beam Loss Detection at Radiation Source ELBE

    CERN Document Server

    Michel, P; Schurig, R; Langenhagen, H

    2003-01-01

    The Rossendorf superconducting Electron Linac of high Brilliance and low Emittance (ELBE) delivers a 40 MeV, 1 mA cw beam for different applications such as bremsstrahlung production, electron channelling, free-electron lasers or secondary particle beam generation. In this energy region, in the case of collisions of the electron beam with the pipe, nearly all beam power is deposited in the pipe material. Therefore reliable beam loss monitoring is essential for machine protection at ELBE. Different systems based on photomultipliers, Compton diodes and long ionization chambers were studied, and the pros and cons of the different systems are discussed. Ionization chambers based on air-insulated RF cables, installed a few cm away and parallel to the beam line, turned out to be the optimal solution. The beam shut-off threshold was adjusted to 1 μC integral charge loss during a 100 ms time interval. Due to the favourable geometry, the monitor sensitivity varies less than ±50% along the beam line...

  5. Automatic Semiconductor Wafer Image Segmentation for Defect Detection Using Multilevel Thresholding

    Directory of Open Access Journals (Sweden)

    Saad N.H.

    2016-01-01

    Full Text Available Quality control is one of the important processes in semiconductor manufacturing, and many issues remain to be solved regarding the rate of production with respect to time. In most semiconductor assemblies, a lot of wafers from various processes in semiconductor wafer manufacturing need to be inspected manually by human experts, a procedure that requires the full concentration of the operators. This human inspection procedure, however, is time consuming and highly subjective. In order to overcome this problem, implementation of machine vision is the best solution. This paper presents automatic defect segmentation of semiconductor wafer images based on a multilevel thresholding algorithm which can be further adopted in a machine vision system. In this work, the defect image, which is initially an RGB image, is first converted to a gray-scale image. Median filtering is then applied to enhance the gray-scale image, and the modified multilevel thresholding algorithm is performed on the enhanced image. The algorithm works in three main stages: determination of the peak locations of the histogram, segmentation of the histogram between the peaks, and determination of the first global minimum of the histogram, which corresponds to the threshold value of the image. The proposed approach is evaluated using defective wafer images. The experimental results show that it can segment the defects correctly and that it outperforms other thresholding techniques such as Otsu and iterative thresholding.
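
    The three stages listed above can be sketched roughly as follows: find the dominant histogram peaks and take the deepest minimum between them as the threshold. This is an approximation of the idea rather than the paper's exact "first global minimum" rule; bin count, smoothing width and peak prominence are arbitrary choices.

```python
import numpy as np
from scipy.signal import find_peaks

def valley_threshold(gray_image, bins=256, smooth=5):
    """Histogram-valley threshold: locate the two most prominent peaks of the
    smoothed gray-level histogram and return the deepest minimum between them."""
    hist, edges = np.histogram(gray_image.ravel(), bins=bins)
    hist_s = np.convolve(hist, np.ones(smooth) / smooth, mode="same")
    peaks, _ = find_peaks(hist_s, prominence=hist_s.max() * 0.05)
    if len(peaks) < 2:                       # degenerate case: single mode
        return edges[int(np.argmin(hist_s))]
    top2 = peaks[np.argsort(hist_s[peaks])[-2:]]
    lo, hi = int(min(top2)), int(max(top2))
    valley = lo + int(np.argmin(hist_s[lo:hi + 1]))
    return edges[valley]
```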

  6. Gaps-in-Noise test: gap detection thresholds in 9-year-old normal-hearing children.

    Science.gov (United States)

    Marculino, Carolina Finetti; Rabelo, Camila Maia; Schochat, Eliane

    2011-12-01

    To establish the standard criteria for the Gaps-in-Noise (GIN) test in 9-year-old normal-hearing children; to obtain the mean gap detection thresholds; and to verify the influence of the variables gender and ear on the gap detection thresholds. Forty normal-hearing individuals, 20 male and 20 female, with ages ranging from 9 years to 9 years and 11 months, were evaluated. The procedures performed were: anamnesis, audiological evaluation, acoustic immittance measures (tympanometry and acoustic reflex), Dichotic Digits Test, and GIN test. The results obtained were statistically analyzed. The results revealed similar performance of right and left ears in the population studied. There was also no difference regarding the variable gender. In the subjects evaluated, the mean gap detection thresholds were 4.4 ms for the right ear, and 4.2 ms for the left ear. The values obtained for right and left ear, as well as their standard deviations, can be used as standard criteria for 9-year-old children, regardless of ear or gender.

  7. Detection limits for real-time source water monitoring using indigenous freshwater microalgae

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez Jr, Miguel [ORNL; Greenbaum, Elias [ORNL

    2009-01-01

    This research identified toxin detection limits using the variable fluorescence of naturally occurring microalgae in source drinking water for five chemical toxins with different molecular structures and modes of toxicity. The five chemicals investigated were atrazine, Diuron, paraquat, methyl parathion, and potassium cyanide. Absolute threshold sensitivities of the algae for detection of the toxins in unmodified source drinking water were measured. Differential kinetics between the rate of action of the toxins and natural changes in algal physiology, such as diurnal photoinhibition, are significant enough that effects of the toxin can be detected and distinguished from the natural variance. This is true even for physiologically impaired algae, where diminished photosynthetic capacity may arise from uncontrollable external factors such as nutrient starvation. Photoinhibition induced by high levels of solar radiation is a predictable and reversible phenomenon that can be dealt with using a period of dark adaptation of 30 minutes or more.

  8. Correlator bank detection of gravitational wave chirps--False-alarm probability, template density, and thresholds: Behind and beyond the minimal-match issue

    International Nuclear Information System (INIS)

    Croce, R.P.; Demma, Th.; Pierro, V.; Pinto, I.M.; Longo, M.; Marano, S.; Matta, V.

    2004-01-01

    The general problem of computing the false-alarm probability vs the detection-threshold relationship for a bank of correlators is addressed, in the context of maximum-likelihood detection of gravitational waves in additive stationary Gaussian noise. Specific reference is made to chirps from coalescing binary systems. Accurate (lower-bound) approximants for the cumulative distribution of the whole-bank supremum are deduced from a class of Bonferroni-type inequalities. The asymptotic properties of the cumulative distribution are obtained, in the limit where the number of correlators goes to infinity. The validity of numerical simulations made on small-size banks is extended to banks of any size, via a Gaussian-correlation inequality. The result is used to readdress the problem of relating the template density to the fraction of potentially observable sources which could be dismissed as an effect of template space discreteness
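
    A crude version of the false-alarm probability vs detection-threshold relationship for a bank of correlators is the first-order (union/Bonferroni) bound, shown below under the assumption that each correlator output is a standard normal statistic in noise. The paper derives much sharper lower-bound approximants and accounts for template correlations; this sketch only illustrates the scaling with bank size.

```python
from scipy.stats import norm

def union_bound_false_alarm(threshold, n_templates):
    """Upper bound on the probability that at least one of n_templates
    correlator outputs exceeds `threshold`, assuming each output ~ N(0, 1)
    under the noise-only hypothesis."""
    return min(1.0, n_templates * norm.sf(threshold))

def threshold_for_false_alarm(p_fa_total, n_templates):
    """Invert the union bound: threshold giving at most p_fa_total overall."""
    return norm.isf(p_fa_total / n_templates)

print(threshold_for_false_alarm(1e-3, 10000))   # grows slowly with bank size
```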

  9. Tetrodotoxin: Chemistry, Toxicity, Source, Distribution and Detection

    Directory of Open Access Journals (Sweden)

    Vaishali Bane

    2014-02-01

    Full Text Available Tetrodotoxin (TTX is a naturally occurring toxin that has been responsible for human intoxications and fatalities. Its usual route of toxicity is via the ingestion of contaminated puffer fish which are a culinary delicacy, especially in Japan. TTX was believed to be confined to regions of South East Asia, but recent studies have demonstrated that the toxin has spread to regions in the Pacific and the Mediterranean. There is no known antidote to TTX which is a powerful sodium channel inhibitor. This review aims to collect pertinent information available to date on TTX and its analogues with a special emphasis on the structure, aetiology, distribution, effects and the analytical methods employed for its detection.

  10. Deblending of simultaneous-source data using iterative seislet frame thresholding based on a robust slope estimation

    Science.gov (United States)

    Zhou, Yatong; Han, Chunying; Chi, Yue

    2018-06-01

    In a simultaneous-source survey, no limitation is imposed on the shot scheduling of nearby sources, so a huge gain in acquisition efficiency can be obtained, but at the same time the recorded seismic data are contaminated by strong blending interference. In this paper, we propose a multi-dip seislet frame based sparse inversion algorithm to iteratively separate simultaneous sources. We overcome two inherent drawbacks of the traditional seislet transform. For the multi-dip problem, we propose to apply a multi-dip seislet frame thresholding strategy instead of the traditional seislet transform for deblending simultaneous-source data that contain multiple dips, e.g., data containing multiple reflections. The multi-dip seislet frame strategy solves the conflicting-dip problem that degrades the performance of the traditional seislet transform. For the noise issue, we propose to use a robust dip estimation algorithm that is based on velocity-slope transformation. Instead of calculating the local slope directly using the plane-wave destruction (PWD) based method, we first apply NMO-based velocity analysis and obtain NMO velocities for multi-dip components that correspond to multiples of different orders; a fairly accurate slope estimate can then be obtained using the velocity-slope conversion equation. An iterative deblending framework is given and validated through a comprehensive analysis of both numerical synthetic and field data examples.
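
    For a hyperbolic (NMO) event the velocity-slope conversion mentioned above can be written in closed form: with t(x) = sqrt(t0^2 + x^2 / v^2), the local slope is p = dt/dx = x / (v^2 t). The snippet below states this textbook relation, which may differ in detail from the exact conversion equation used in the paper.

```python
import numpy as np

def slope_from_velocity(offset, traveltime, v_nmo):
    """Local slope p = dt/dx = x / (v^2 * t) of a hyperbolic NMO event,
    given offset x, two-way traveltime t at that offset, and NMO velocity v."""
    return np.asarray(offset) / (v_nmo**2 * np.asarray(traveltime))
```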

  11. Applicability Determinations on the PSD 100 tpy Major Source Threshold Category for Fossil Fuel Boilers Industries

    Science.gov (United States)

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  12. Detecting fragmentation extinction thresholds for forest understory plant species in peninsular Spain.

    Science.gov (United States)

    Rueda, Marta; Moreno Saiz, Juan Carlos; Morales-Castilla, Ignacio; Albuquerque, Fabio S; Ferrero, Mila; Rodríguez, Miguel Á

    2015-01-01

    Ecological theory predicts that fragmentation aggravates the effects of habitat loss, yet empirical results show mixed evidence, failing to support the theory and instead reinforcing the primary importance of habitat loss. Fragmentation hypotheses have received much attention due to their potential implications for biodiversity conservation; however, animal studies have traditionally been their main focus. Here we assess variation in species sensitivity to forest amount and fragmentation and evaluate whether fragmentation is related to extinction thresholds in forest understory herbs and ferns. Our expectation was that forest herbs would be more sensitive to fragmentation than ferns due to their lower dispersal capabilities. Using the forest cover percentage and the proportion of this percentage occurring in the largest patch within UTM cells of 10-km resolution covering Peninsular Spain, we partitioned the effects of forest amount versus fragmentation and applied logistic regression to model the occurrences of 16 species. For nine models showing robustness according to a set of quality criteria, we subsequently defined two empirical fragmentation scenarios, minimum and maximum, and quantified species' sensitivity to forest contraction with no fragmentation, and to fragmentation under constant forest cover. We finally assessed how the extinction threshold of each species (the habitat amount below which it cannot persist) varies under no and maximum fragmentation. Consistent with their preference for forest habitats, the occurrence probabilities of all species decreased as forest cover contracted. On average, herbs did not show significant sensitivity to fragmentation, whereas ferns were favored. In line with theory, fragmentation yielded higher extinction thresholds for two species. For the remaining species, fragmentation had either positive or non-significant effects. We interpret these differences as reflecting species-specific traits and conclude that although forest amount is of...

  13. Detecting fragmentation extinction thresholds for forest understory plant species in peninsular Spain.

    Directory of Open Access Journals (Sweden)

    Marta Rueda

    Full Text Available Ecological theory predicts that fragmentation aggravates the effects of habitat loss, yet empirical results show mixed evidence, failing to support the theory and instead reinforcing the primary importance of habitat loss. Fragmentation hypotheses have received much attention due to their potential implications for biodiversity conservation; however, animal studies have traditionally been their main focus. Here we assess variation in species sensitivity to forest amount and fragmentation and evaluate whether fragmentation is related to extinction thresholds in forest understory herbs and ferns. Our expectation was that forest herbs would be more sensitive to fragmentation than ferns due to their lower dispersal capabilities. Using the forest cover percentage and the proportion of this percentage occurring in the largest patch within UTM cells of 10-km resolution covering Peninsular Spain, we partitioned the effects of forest amount versus fragmentation and applied logistic regression to model the occurrences of 16 species. For nine models showing robustness according to a set of quality criteria, we subsequently defined two empirical fragmentation scenarios, minimum and maximum, and quantified species' sensitivity to forest contraction with no fragmentation, and to fragmentation under constant forest cover. We finally assessed how the extinction threshold of each species (the habitat amount below which it cannot persist) varies under no and maximum fragmentation. Consistent with their preference for forest habitats, the occurrence probabilities of all species decreased as forest cover contracted. On average, herbs did not show significant sensitivity to fragmentation, whereas ferns were favored. In line with theory, fragmentation yielded higher extinction thresholds for two species. For the remaining species, fragmentation had either positive or non-significant effects. We interpret these differences as reflecting species-specific traits and conclude that although...

  14. Threshold-based detection for amplify-and-forward cooperative communication systems with channel estimation error

    KAUST Repository

    Abuzaid, Abdulrahman I.

    2014-09-01

    Efficient receiver designs for cooperative communication systems are becoming increasingly important. In previous work, cooperative networks communicated with the use of L relays. As the receiver is constrained, it can only process U out of the L relays. Channel shortening and reduced-rank techniques were employed to design the preprocessing matrix. In this paper, a receiver structure is proposed which combines the joint iterative optimization (JIO) algorithm with our proposed threshold selection criteria. This receiver structure assists in determining the optimal U_opt. Furthermore, this receiver provides the freedom to choose U ≤ U_opt for each frame depending upon the tolerable difference allowed in mean square error (MSE). Our study and simulation results show that by choosing an appropriate threshold, it is possible to gain in terms of complexity savings without affecting the BER performance of the system. Furthermore, the effect of channel estimation errors on the MSE performance of the amplify-and-forward (AF) cooperative relaying system is investigated.

  15. Automatic internal crack detection from a sequence of infrared images with a triple-threshold Canny edge detector

    Science.gov (United States)

    Wang, Gaochao; Tse, Peter W.; Yuan, Maodan

    2018-02-01

    Visual inspection and assessment of the condition of metal structures are essential for safety. Pulse thermography produces visible infrared images, which have been widely applied to detect and characterize defects in structures and materials. When active thermography, a non-destructive testing tool, is applied, the necessity of considerable manual checking can be avoided. However, detecting an internal crack with active thermography remains difficult, since it is usually invisible in the collected sequence of infrared images, which makes the automatic detection of internal cracks even harder. In addition, the detection of an internal crack can be hindered by a complicated inspection environment. With the purpose of putting forward a robust and automatic visual inspection method, a computer vision-based thresholding method is proposed. In this paper, the image signals are a sequence of infrared images collected from the experimental setup with a thermal camera and two flash lamps as stimulus. The contrast of pixels in each frame is enhanced by the Canny operator and then reconstructed by a triple-threshold system. Two features, mean value in the time domain and maximal amplitude in the frequency domain, are extracted from the reconstructed signal to help distinguish the crack pixels from others. Finally, a binary image indicating the location of the internal crack is generated by a K-means clustering method. The proposed procedure has been applied to an iron pipe, which contains two internal cracks and surface abrasion. Some improvements have been made for the computer vision-based automatic crack detection methods. In the future, the proposed method can be applied to realize the automatic detection of internal cracks from many infrared images for the industry.
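
    The processing chain described above (edge enhancement of each frame, a time-domain mean and a frequency-domain peak amplitude per pixel, then clustering into crack and non-crack pixels) can be sketched as below. OpenCV's standard two-threshold Canny is used here as a stand-in for the paper's triple-threshold variant, the frames are assumed to be already scaled to 8-bit, and K-means from scikit-learn replaces the paper's specific clustering setup.

```python
import numpy as np
import cv2
from sklearn.cluster import KMeans

def crack_candidate_mask(frames):
    """frames: array of shape (T, H, W) of 8-bit infrared images over time.
    Returns a boolean mask of pixels in the minority ('crack-like') cluster."""
    edges = np.stack([cv2.Canny(f, 40, 120) for f in frames]).astype(float)
    mean_t = edges.mean(axis=0)                         # time-domain mean per pixel
    spectrum = np.abs(np.fft.rfft(edges, axis=0))[1:]   # drop the DC component
    max_f = spectrum.max(axis=0)                        # peak spectral amplitude per pixel
    feats = np.column_stack([mean_t.ravel(), max_f.ravel()])
    labels = KMeans(n_clusters=2, n_init=10).fit_predict(feats)
    minority = np.argmin(np.bincount(labels))
    return (labels == minority).reshape(mean_t.shape)
```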

  16. OpenMC In Situ Source Convergence Detection

    Energy Technology Data Exchange (ETDEWEB)

    Aldrich, Garrett Allen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Univ. of California, Davis, CA (United States); Dutta, Soumya [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); The Ohio State Univ., Columbus, OH (United States); Woodring, Jonathan Lee [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-07

    We designed and implemented an in situ version of particle source convergence detection for the OpenMC particle transport simulator. OpenMC is a Monte Carlo-based particle simulator for neutron criticality calculations. For the transport simulation to be accurate, the source particles must converge on a spatial distribution. Typically, convergence is obtained by iterating the simulation for a user-settable, fixed number of steps, and it is assumed that convergence is achieved. We instead implement a method to detect convergence, using the stochastic oscillator to identify convergence of the source particles based on their accumulated Shannon entropy. Using our in situ convergence detection, we are able to detect convergence and begin tallying results for the full simulation once the proper source distribution has been confirmed. Our method ensures that the simulation is not started too early, by a user setting too optimistic parameters, or too late, by setting too conservative a parameter.
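
    The quantity tracked is the Shannon entropy of the fission source distribution over a spatial mesh, batch by batch. The sketch below computes that entropy and applies a simple plateau check; the plateau rule is a stand-in for illustration only, not the stochastic-oscillator criterion actually implemented for OpenMC.

```python
import numpy as np

def shannon_entropy(counts):
    """Shannon entropy (bits) of the source-site counts over spatial mesh cells."""
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def entropy_converged(entropies, window=20, tol=0.01):
    """Declare convergence once the last `window` batch entropies stay within
    a small relative band around their mean (a toy plateau criterion)."""
    if len(entropies) < window:
        return False
    recent = np.asarray(entropies[-window:])
    return np.ptp(recent) < tol * abs(recent.mean())
```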

  17. Comparison of Threshold Detection Methods for the Generalized Pareto Distribution (GPD): Application to the NOAA-NCDC Daily Rainfall Dataset

    Science.gov (United States)

    Deidda, Roberto; Mamalakis, Antonis; Langousis, Andreas

    2015-04-01

    One of the most crucial issues in statistical hydrology is the estimation of extreme rainfall from data. To that end, based on asymptotic arguments from Extreme Excess (EE) theory, several studies have focused on developing new, or improving existing, methods to fit a Generalized Pareto Distribution (GPD) model to rainfall excesses above a properly selected threshold u. The latter is generally determined using various approaches that can be grouped into three basic classes: a) non-parametric methods that locate the changing point between extreme and non-extreme regions of the data, b) graphical methods where one studies the dependence of the GPD parameters (or related metrics) on the threshold level u, and c) Goodness of Fit (GoF) metrics that, for a certain level of significance, locate the lowest threshold u for which a GPD model is applicable. In this work, we review representative methods for GPD threshold detection, discuss fundamental differences in their theoretical bases, and apply them to daily rainfall records from the NOAA-NCDC open-access database (http://www.ncdc.noaa.gov/oa/climate/ghcn-daily/). We find that non-parametric methods that locate the changing point between extreme and non-extreme regions of the data are generally not reliable, while graphical methods and GoF metrics that rely on limiting arguments for the upper distribution tail lead to unrealistically high thresholds u. The latter is expected, since one checks the validity of the limiting arguments rather than the applicability of a GPD distribution model. Better performance is demonstrated by graphical methods and GoF metrics that rely on GPD properties. Finally, we discuss the effects of data quantization (common in hydrologic applications) on the estimated thresholds.
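
    The GoF-based idea of locating the lowest threshold u for which a GPD fit is acceptable can be sketched as below, using a Kolmogorov-Smirnov test on the excesses. Note that the KS p-value with fitted parameters is only approximate, and the paper compares several more refined graphical and GoF approaches; this is only an illustration of the scanning logic.

```python
import numpy as np
from scipy import stats

def lowest_gpd_threshold(data, candidate_thresholds, alpha=0.05, min_excesses=50):
    """Scan candidate thresholds from low to high and return the lowest u for
    which the excesses above u are not rejected as GPD at level alpha.
    data: 1-D numpy array of daily rainfall values."""
    for u in sorted(candidate_thresholds):
        excesses = data[data > u] - u
        if excesses.size < min_excesses:
            break
        c, _, scale = stats.genpareto.fit(excesses, floc=0.0)
        _, p_value = stats.kstest(excesses, "genpareto", args=(c, 0.0, scale))
        if p_value > alpha:
            return u
    return None
```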

  18. Plagiarism and Source Deception Detection Based on Syntax Analysis

    Directory of Open Access Journals (Sweden)

    Eman Salih Al-Shamery

    2017-02-01

    Full Text Available In this research, the shingle algorithm with the Jaccard method is employed as a new approach to detect deception in sources, in addition to detecting plagiarism. Source deception occurs when a particular text is taken from one source and attributed to another, while plagiarism occurs in documents when part or all of the text is taken from another work. The approach is based on the shingle algorithm with the Jaccard coefficient. Shingling is an efficient way to compare the sets of shingles in files containing text, which are used as features to measure the syntactic similarity of documents, and it works together with the Jaccard coefficient, which measures the similarity between sample sets. In the proposed system, a text is checked for syntactic plagiarism and a percentage of similarity with other documents is reported. Research sources are also checked to detect source deception, by matching them against the available sources from the Turnitin report of the same research using the shingle algorithm with the Jaccard coefficient. The motivation of this work is the discovery of literary theft in research papers, especially those written by students, as well as of the deception that occurs in cited sources.
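
    The shingle-plus-Jaccard comparison at the heart of this approach is compact enough to show directly; the sketch below uses word 4-grams as shingles, which is an arbitrary choice rather than the shingle size used by the authors.

```python
def shingles(text, k=4):
    """Set of k-word shingles (overlapping word n-grams) extracted from a text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard coefficient |A intersect B| / |A union B| of two shingle sets."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

# example: similarity between a suspect passage and a candidate source
suspect = "the quick brown fox jumps over the lazy dog near the river bank"
source = "a quick brown fox jumps over the lazy dog by the river bank"
print(round(jaccard(shingles(suspect), shingles(source)), 3))
```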

  19. Taste detection and recognition thresholds in Japanese patients with Alzheimer-type dementia.

    Science.gov (United States)

    Ogawa, Takao; Irikawa, Naoya; Yanagisawa, Daijiro; Shiino, Akihiko; Tooyama, Ikuo; Shimizu, Takeshi

    2017-04-01

    Alzheimer-type dementia (AD) is pathologically characterized by massive neuronal loss in the brain, and the taste cortex is thought to be affected. However, there are only a few reports regarding the gustatory function of AD patients, and their conclusions are inconsistent. This prospective study enrolled 22 consecutive patients with mild to moderately severe Alzheimer-type dementia (AD), with a mean age of 84.0 years, and 49 elderly volunteers without dementia, with a mean age of 71.0 years, as control subjects. The control subjects were divided into two groups according to age: a younger group (N=28, mean age: 68.5) and an older group (N=21, mean age: 83.0). Gustatory function was investigated using the filter paper disc method (FPD) and electrogustometry (EGM). Gustatory function as measured by the FPD was significantly impaired in patients with AD compared with age-matched control subjects; no such difference was found between the younger and the older control groups. As for the EGM thresholds, there were no differences between the AD patient group and the age-matched controls. The FPD method thus demonstrated decreased gustatory function in AD patients beyond that of aging, whereas EGM thresholds did not differ between the AD patient group and the age-matched controls. These results suggest that failure of taste processing in the brain, but not of taste transmission in the peripheral taste system, occurs in patients with AD.

  20. In vivo transcranial cavitation threshold detection during ultrasound-induced blood-brain barrier opening in mice

    International Nuclear Information System (INIS)

    Tung, Yao-Sheng; Vlachos, Fotios; Choi, James J; Deffieux, Thomas; Selert, Kirsten; Konofagou, Elisa E

    2010-01-01

    The in vivo cavitation response associated with blood-brain barrier (BBB) opening as induced by transcranial focused ultrasound (FUS) in conjunction with microbubbles was studied in order to better identify the underlying mechanism in its noninvasive application. A cylindrically focused hydrophone, confocal with the FUS transducer, was used as a passive cavitation detector (PCD) to identify the threshold of inertial cavitation (IC) in the presence of Definity® microbubbles (mean diameter range: 1.1-3.3 μm, Lantheus Medical Imaging, MA, USA). A vessel phantom was first used to determine the reliability of the PCD prior to in vivo use. A cerebral blood vessel was simulated by generating a cylindrical channel of 610 μm in diameter inside a polyacrylamide gel and by saturating its volume with microbubbles. The microbubbles were sonicated through an excised mouse skull. Second, the same PCD setup was employed for in vivo noninvasive (i.e. transdermal and transcranial) cavitation detection during BBB opening. After the intravenous administration of Definity® microbubbles, pulsed FUS was applied (frequency: 1.525 or 1.5 MHz, peak-rarefactional pressure: 0.15-0.60 MPa, duty cycle: 20%, PRF: 10 Hz, duration: 1 min with a 30 s interval) to the right hippocampus of twenty-six (n = 26) mice in vivo through intact scalp and skull. T1 and T2-weighted MR images were used to verify the BBB opening. A spectrogram was generated at each pressure in order to detect the IC onset and duration. The threshold of BBB opening was found to be at a 0.30 MPa peak-rarefactional pressure in vivo. Both the phantom and in vivo studies indicated that the IC pressure threshold had a peak-rarefactional amplitude of 0.45 MPa. This indicated that BBB opening may not require IC at or near the threshold. Histological analysis showed that BBB opening could be induced without any cellular damage at 0.30 and 0.45 MPa. In conclusion, the cavitation response could be detected without...

  1. In vivo transcranial cavitation threshold detection during ultrasound-induced blood-brain barrier opening in mice

    Energy Technology Data Exchange (ETDEWEB)

    Tung, Yao-Sheng; Vlachos, Fotios; Choi, James J; Deffieux, Thomas; Selert, Kirsten; Konofagou, Elisa E, E-mail: ek2191@columbia.ed [Department of Biomedical Engineering, Columbia University, New York, NY (United States)

    2010-10-21

    The in vivo cavitation response associated with blood-brain barrier (BBB) opening as induced by transcranial focused ultrasound (FUS) in conjunction with microbubbles was studied in order to better identify the underlying mechanism in its noninvasive application. A cylindrically focused hydrophone, confocal with the FUS transducer, was used as a passive cavitation detector (PCD) to identify the threshold of inertial cavitation (IC) in the presence of Definity® microbubbles (mean diameter range: 1.1-3.3 μm, Lantheus Medical Imaging, MA, USA). A vessel phantom was first used to determine the reliability of the PCD prior to in vivo use. A cerebral blood vessel was simulated by generating a cylindrical channel of 610 μm in diameter inside a polyacrylamide gel and by saturating its volume with microbubbles. The microbubbles were sonicated through an excised mouse skull. Second, the same PCD setup was employed for in vivo noninvasive (i.e. transdermal and transcranial) cavitation detection during BBB opening. After the intravenous administration of Definity® microbubbles, pulsed FUS was applied (frequency: 1.525 or 1.5 MHz, peak-rarefactional pressure: 0.15-0.60 MPa, duty cycle: 20%, PRF: 10 Hz, duration: 1 min with a 30 s interval) to the right hippocampus of twenty-six (n = 26) mice in vivo through intact scalp and skull. T1 and T2-weighted MR images were used to verify the BBB opening. A spectrogram was generated at each pressure in order to detect the IC onset and duration. The threshold of BBB opening was found to be at a 0.30 MPa peak-rarefactional pressure in vivo. Both the phantom and in vivo studies indicated that the IC pressure threshold had a peak-rarefactional amplitude of 0.45 MPa. This indicated that BBB opening may not require IC at or near the threshold. Histological analysis showed that BBB opening could be induced without any cellular damage at 0.30 and 0.45 MPa. In conclusion, the cavitation response could be detected...

  2. In vivo transcranial cavitation threshold detection during ultrasound-induced blood-brain barrier opening in mice.

    Science.gov (United States)

    Tung, Yao-Sheng; Vlachos, Fotios; Choi, James J; Deffieux, Thomas; Selert, Kirsten; Konofagou, Elisa E

    2010-10-21

    The in vivo cavitation response associated with blood-brain barrier (BBB) opening as induced by transcranial focused ultrasound (FUS) in conjunction with microbubbles was studied in order to better identify the underlying mechanism in its noninvasive application. A cylindrically focused hydrophone, confocal with the FUS transducer, was used as a passive cavitation detector (PCD) to identify the threshold of inertial cavitation (IC) in the presence of Definity® microbubbles (mean diameter range: 1.1-3.3 µm, Lantheus Medical Imaging, MA, USA). A vessel phantom was first used to determine the reliability of the PCD prior to in vivo use. A cerebral blood vessel was simulated by generating a cylindrical channel of 610 µm in diameter inside a polyacrylamide gel and by saturating its volume with microbubbles. The microbubbles were sonicated through an excised mouse skull. Second, the same PCD setup was employed for in vivo noninvasive (i.e. transdermal and transcranial) cavitation detection during BBB opening. After the intravenous administration of Definity® microbubbles, pulsed FUS was applied (frequency: 1.525 or 1.5 MHz, peak-rarefactional pressure: 0.15-0.60 MPa, duty cycle: 20%, PRF: 10 Hz, duration: 1 min with a 30 s interval) to the right hippocampus of twenty-six (n = 26) mice in vivo through intact scalp and skull. T1 and T2-weighted MR images were used to verify the BBB opening. A spectrogram was generated at each pressure in order to detect the IC onset and duration. The threshold of BBB opening was found to be at a 0.30 MPa peak-rarefactional pressure in vivo. Both the phantom and in vivo studies indicated that the IC pressure threshold had a peak-rarefactional amplitude of 0.45 MPa. This indicated that BBB opening may not require IC at or near the threshold. Histological analysis showed that BBB opening could be induced without any cellular damage at 0.30 and 0.45 MPa. In conclusion, the cavitation response could be detected without craniotomy in mice

  3. Limit of detection and threshold for positivity of the Centers for Disease Control and Prevention assay for factor VIII inhibitors.

    Science.gov (United States)

    Miller, C H; Boylan, B; Shapiro, A D; Lentz, S R; Wicklund, B M

    2017-10-01

    Essentials: Immunologic methods detect factor VIII (FVIII) antibodies in some inhibitor-negative specimens. Specimens were tested by a modified Nijmegen-Bethesda assay (NBA) and a fluorescence immunoassay. The NBA with preanalytical heat inactivation detects FVIII inhibitors down to 0.2 NBU. IgG4 frequency validates the established threshold for positivity of ≥ 0.5 NBU for this NBA. Background: The Bethesda assay for measurement of factor VIII inhibitors called for quantification of positive inhibitors by using dilutions producing 25-75% residual activity (RA), corresponding to 0.4-2.0 Bethesda units, with the use of 'more sensitive methods' for samples with RA closer to 100% being recommended. The Nijmegen modification (Nijmegen-Bethesda assay [NBA]) changed the reagents used but not these calculations. Some specimens negative by the NBA have been shown to have FVIII antibodies detectable with sensitive immunologic methods. Objective: To examine the performance at very low inhibitor titers of the Centers for Disease Control and Prevention (CDC)-modified NBA (CDC-NBA), which includes preanalytic heat inactivation to liberate bound anti-FVIII antibodies. Methods: Specimens with known inhibitors were tested with the CDC-NBA. IgG4 anti-FVIII antibodies were measured by fluorescence immunoassay (FLI). Results: Diluted inhibitors showed linearity below 0.4 Nijmegen-Bethesda units (NBU). With four statistical methods, the limit of detection of the CDC-NBA was determined to be 0.2 NBU. IgG4 anti-FVIII antibodies, which correlate most strongly with functional inhibitors, were present at rates above the background rate of healthy controls in specimens with titers ≥ 0.2 NBU and showed an increase in frequency from 14.3% at 0.4 NBU to 67% at the established threshold for positivity of 0.5 NBU. Conclusions: The CDC-NBA can detect inhibitors down to 0.2 NBU. The FLI, which is more sensitive, demonstrates anti-FVIII IgG4 in some patients with a negative NBA, supporting the need for...

  4. Research on point sources simulating the γ-ray detection efficiencies of a standard source

    International Nuclear Information System (INIS)

    Tian Zining; Jia Mingyan; Shen Maoquan; Yang Xiaoyan; Cheng Zhiwei

    2010-01-01

    For a φ75 mm x 25 mm sample, the full-energy peak efficiencies at different heights along the sample radius were obtained using point sources, and the function parameters describing the full-energy peak efficiencies of the point sources as a function of radius were determined. The detection efficiencies for the 59.54 keV, 661.66 keV, 1173.2 keV and 1332.5 keV γ-rays at different sample heights were then obtained from the full-energy peak efficiencies of the point sources and their heights, and the function parameters describing the full-energy peak efficiencies of the surface sources as a function of sample height were determined. The detection efficiency of the φ75 mm x 25 mm calibration source can then be obtained by integration; the detection efficiencies simulated with point sources are consistent with the results for the standard source to within 10%. Therefore, the calibration method using a standard source can be replaced by the point-source simulation method, which is feasible when no standard source is available. (authors)

  5. Development and utility of an internal threshold control (ITC) real-time PCR assay for exogenous DNA detection.

    Directory of Open Access Journals (Sweden)

    Weiyi Ni

    Full Text Available Sensitive and specific tests for detecting exogenous DNA molecules are useful for infectious disease diagnosis, gene therapy clinical trial safety, and gene doping surveillance. Taqman real-time PCR using sequence-specific probes provides an effective approach to accurately and quantitatively detect exogenous DNA. However, one of the major challenges in these analyses is to eliminate false positive signals caused by non-targeted exogenous or endogenous DNA sequences, as well as false negative signals caused by impurities that inhibit PCR. Although multiplex Taqman PCR assays have been applied to address these problems by adding extra primer-probe sets targeted to endogenous DNA sequences, the differences between targets can lead to different detection efficiencies. To avoid these complications, a Taqman PCR-based approach that incorporates an internal threshold control (ITC) has been developed. In this single-reaction format, the target sequence and the ITC template are co-amplified by the same primers but are detected by different probes, each with a unique fluorescent dye. Sample DNA, a prescribed number of ITC template molecules set near the limit of sensitivity, a single pair of primers, the target probe and the ITC probe are added to one reaction. Fluorescence emission signals are obtained simultaneously to determine the cycle thresholds (Ct) for amplification of the target and ITC sequences. Comparison of the target Ct with the ITC Ct indicates whether a sample is a true positive for the target (i.e. target Ct less than or equal to the ITC Ct) or negative (i.e. target Ct greater than the ITC Ct). The utility of this approach was demonstrated in a nonhuman primate model of rAAV vector-mediated gene doping in vivo and in human genomic DNA spiked with plasmid DNA.
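
    The decision rule described above (positive when the target amplifies at or before the co-amplified internal threshold control) is simple to state in code; the sketch below is an illustration of that comparison only, with Ct values given as plain numbers.

```python
def classify_sample(target_ct, itc_ct):
    """Call a sample positive for the exogenous target when its cycle threshold
    is less than or equal to that of the internal threshold control (ITC),
    whose copy number is set near the limit of sensitivity."""
    if target_ct is None:            # target did not amplify at all
        return "negative"
    return "positive" if target_ct <= itc_ct else "negative"

print(classify_sample(31.2, 34.0))   # -> positive
print(classify_sample(36.5, 34.0))   # -> negative
```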

  6. Optimal Threshold for a Positive Hybrid Capture 2 Test for Detection of Human Papillomavirus: Data from the ARTISTIC Trial

    Science.gov (United States)

    Sargent, A.; Bailey, A.; Turner, A.; Almonte, M.; Gilham, C.; Baysson, H.; Peto, J.; Roberts, C.; Thomson, C.; Desai, M.; Mather, J.; Kitchener, H.

    2010-01-01

    We present data on the use of the Hybrid Capture 2 (HC2) test for the detection of high-risk human papillomavirus (HR HPV) with different thresholds for positivity within a primary screening setting and as a method of triage for low-grade cytology. In the ARTISTIC population-based trial, 18,386 women were screened by cytology and for HPV. Cervical intraepithelial neoplasia lesions of grade two and higher (CIN2+ lesions) were identified for 453 women within 30 months of an abnormal baseline sample. When a relative light unit/cutoff (RLU/Co) ratio of ≥1 was used as the threshold for considering an HC2 result positive, 15.6% of results were positive, and the proportion of CIN2+ lesions in this group was 14.7%. The relative sensitivity for CIN2+ lesion detection was 93.4%. When an RLU/Co ratio of ≥2 was used as the threshold, there was a 2.5% reduction in positivity, with an increase in the proportion of CIN2+ lesions detected. The relative sensitivity decreased slightly, to 90.3%. Among women with low-grade cytology, HPV prevalences were 43.7% and 40.3% at RLU/Co ratios of ≥1 and ≥2, respectively. The proportions of CIN2+ lesions detected were 17.3% and 18.0%, with relative sensitivities of 87.7% at an RLU/Co ratio of ≥1 and 84.2% at an RLU/Co ratio of ≥2. At an RLU/Co ratio of ≥1, 68.3% of HC2-positive results were confirmed by the Roche line blot assay, compared to 77.2% of those at an RLU/Co ratio of ≥2. Fewer HC2-positive results were confirmed for 35- to 64-year-olds (50.3% at an RLU/Co ratio of ≥1 and 63.2% at an RLU/Co ratio of >2) than for 20- to 34-year-olds (78.7% at an RLU/Co ratio of ≥1 and 83.7% at an RLU/Co ratio of >2). If the HC2 test is used for routine screening as an initial test or as a method of triage for low-grade cytology, we would suggest increasing the threshold for positivity from the RLU/Co ratio of ≥1, recommended by the manufacturer, to an RLU/Co ratio of ≥2, since this study has shown that a beneficial balance

  7. Threshold-selecting strategy for best possible ground state detection with genetic algorithms

    Science.gov (United States)

    Lässig, Jörg; Hoffmann, Karl Heinz

    2009-04-01

    Genetic algorithms are a standard heuristic for finding states of low energy in complex state spaces, as given by physical systems such as spin glasses, but also in combinatorial optimization. The paper considers the problem of selecting individuals in the current population of a genetic algorithm for crossover. Many schemes have been considered in the literature as possible crossover selection strategies. We show, for a large class of quality measures, that the best possible probability distribution for selecting individuals in each generation of the algorithm execution is a rectangular distribution over the individuals sorted by their energy values. This means that uniform probabilities have to be assigned to a group of the lowest-energy individuals in the population, while probabilities equal to zero are assigned to individuals whose energy values lie above a fixed cutoff, corresponding to a certain rank in the energy-sorted vector of states in the current population. The considered strategy is dubbed threshold selecting. The proof applies basic arguments of Markov chains and linear optimization, makes only a few assumptions on the underlying principles, and hence applies to a large class of algorithms.
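
    Threshold selecting as described above assigns uniform selection probability to the individuals below a rank cutoff and zero probability above it. A minimal sketch, with the cutoff rank and parent count as free parameters:

```python
import numpy as np

def threshold_select(population, energies, cutoff_rank, n_parents, rng=None):
    """Sort individuals by energy (lowest first) and draw crossover parents
    uniformly from the best cutoff_rank individuals; all others get zero
    selection probability."""
    rng = np.random.default_rng() if rng is None else rng
    order = np.argsort(energies)
    eligible = order[:cutoff_rank]
    picks = rng.choice(eligible, size=n_parents, replace=True)
    return [population[i] for i in picks]
```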

  8. Detection of supernova neutrinos at spallation neutron sources

    Science.gov (United States)

    Huang, Ming-Yang; Guo, Xin-Heng; Young, Bing-Lin

    2016-07-01

    After considering supernova shock effects, Mikheyev-Smirnov-Wolfenstein effects, neutrino collective effects, and Earth matter effects, the detection of supernova neutrinos at the China Spallation Neutron Source is studied and the expected numbers of different flavor supernova neutrinos observed through various reaction channels are calculated with the neutrino energy spectra described by the Fermi-Dirac distribution and the “beta fit” distribution respectively. Furthermore, the numerical calculation method of supernova neutrino detection on Earth is applied to some other spallation neutron sources, and the total expected numbers of supernova neutrinos observed through different reactions channels are given. Supported by National Natural Science Foundation of China (11205185, 11175020, 11275025, 11575023)

  9. Configuration of electro-optic fire source detection system

    Science.gov (United States)

    Fabian, Ram Z.; Steiner, Zeev; Hofman, Nir

    2007-04-01

    The recent fighting activities in various parts of the world have highlighted the need for accurate fire source detection on one hand and a fast "sensor to shooter cycle" on the other. Both needs can be met by the SPOTLITE system, which dramatically enhances the capability to rapidly engage a hostile fire source with a minimum of casualties to friendly forces and to innocent bystanders. The modular system design makes it possible to meet each customer's specific requirements and provides excellent future growth and upgrade potential. The design and build of a fire source detection system is governed by sets of requirements issued by the operators. These can be translated into the following design criteria: I) long-range, fast and accurate fire source detection capability; II) detection and classification capability for different threats; III) threat investigation capability; IV) fire source data distribution capability (location, direction, video image, voice); V) man-portability. In order to meet these design criteria, an optimized concept was presented and exercised for the SPOTLITE system. Three major modular components were defined: I) Electro-Optical Unit, including FLIR camera, CCD camera, laser range finder and marker; II) Electronic Unit, including the system computer and electronics; III) Controller Station Unit, including the HMI of the system. This article discusses the system's component definition and optimization processes, and also shows how SPOTLITE designers successfully managed to introduce excellent solutions for other system parameters.

  10. Acetic Acid Detection Threshold in Synthetic Wine Samples of a Portable Electronic Nose

    Directory of Open Access Journals (Sweden)

    Miguel Macías Macías

    2012-12-01

    Full Text Available Wine quality is related to its intrinsic visual, taste, and aroma characteristics and is reflected in the price paid for that wine. One of the most important wine faults is an excessive concentration of acetic acid, which can cause a wine to take on vinegar aromas and reduce its varietal character. It is therefore very important for the wine industry to have methods, such as electronic noses, for real-time monitoring of excessive concentrations of acetic acid in wines. However, aroma characterization of alcoholic beverages with sensor-array electronic noses is a difficult challenge due to the masking effect of ethanol. In this work, in order to detect the presence of acetic acid in synthetic wine samples (aqueous ethanol solution at 10% v/v), we use a detection unit which consists of a commercial electronic nose and an HSS32 autosampler, in combination with a neural network classifier (MLP). To find the characteristic vector representing the sample to be classified, we first select the sensors, and the section of the sensors' response curves, where the probability of detecting the presence of acetic acid is highest, and then apply Principal Component Analysis (PCA) such that each sensor response curve is represented by the coefficients of its first principal components. Results show that the PEN3 electronic nose is able to detect and discriminate wine samples doped with acetic acid in concentrations equal to or greater than 2 g/L.
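
    The feature pipeline described above (a selected section of the selected sensors' response curves, compressed by PCA and fed to an MLP) maps directly onto standard tooling. The sketch below uses synthetic random data in place of real e-nose curves, and the number of components and hidden-layer size are illustrative, not the parameters of the paper's classifier.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Hypothetical data: each row is one measurement (the selected portion of the
# selected sensors' response curves, concatenated); label 1 = doped with
# acetic acid, label 0 = clean synthetic wine sample.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 200))
y = rng.integers(0, 2, size=120)

clf = make_pipeline(PCA(n_components=5),
                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000))
clf.fit(X[:100], y[:100])
print(clf.score(X[100:], y[100:]))   # held-out accuracy on the toy data
```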

  11. Nuisance Source Population Modeling for Radiation Detection System Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sokkappa, P; Lange, D; Nelson, K; Wheeler, R

    2009-10-05

    A major challenge facing the prospective deployment of radiation detection systems for homeland security applications is the discrimination of radiological or nuclear 'threat sources' from radioactive, but benign, 'nuisance sources'. Common examples of such nuisance sources include naturally occurring radioactive material (NORM), medical patients who have received radioactive drugs for either diagnostics or treatment, and industrial sources. A sensitive detector that cannot distinguish between 'threat' and 'benign' classes will generate false positives which, if sufficiently frequent, will preclude it from being operationally deployed. In this report, we describe a first-principles physics-based modeling approach that is used to approximate the physical properties and corresponding gamma ray spectral signatures of real nuisance sources. Specific models are proposed for the three nuisance source classes - NORM, medical and industrial. The models can be validated against measured data - that is, energy spectra generated with the model can be compared to actual nuisance source data. We show by example how this is done for NORM and medical sources, using data sets obtained from spectroscopic detector deployments for cargo container screening and urban area traffic screening, respectively. In addition to capturing the range of radioactive signatures of individual nuisance sources, a nuisance source population model must generate sources with a frequency of occurrence consistent with that found in actual movement of goods and people. Measured radiation detection data can indicate these frequencies, but, at present, such data are available only for a very limited set of locations and time periods. In this report, we make more general estimates of frequencies for NORM and medical sources using a range of data sources such as shipping manifests and medical treatment statistics. We also identify potential data sources for industrial

  12. Verification of Minimum Detectable Activity for Radiological Threat Source Search

    Science.gov (United States)

    Gardiner, Hannah; Myjak, Mitchell; Baciak, James; Detwiler, Rebecca; Seifert, Carolyn

    2015-10-01

    The Department of Homeland Security's Domestic Nuclear Detection Office is working to develop advanced technologies that will improve the ability to detect, localize, and identify radiological and nuclear sources from airborne platforms. The Airborne Radiological Enhanced-sensor System (ARES) program is developing advanced data fusion algorithms for analyzing data from a helicopter-mounted radiation detector. This detector platform provides a rapid, wide-area assessment of radiological conditions at ground level. The NSCRAD (Nuisance-rejection Spectral Comparison Ratios for Anomaly Detection) algorithm was developed to distinguish low-count sources of interest from benign naturally occurring radiation and irrelevant nuisance sources. It uses a number of broad, overlapping regions of interest to statistically compare each newly measured spectrum with the current estimate for the background to identify anomalies. We recently developed a method to estimate the minimum detectable activity (MDA) of NSCRAD in real time. We present this method here and report on the MDA verification using both laboratory measurements and simulated injects on measured backgrounds at or near the detection limits. This work is supported by the US Department of Homeland Security, Domestic Nuclear Detection Office, under competitively awarded contract/IAA HSHQDC-12-X-00376. This support does not constitute an express or implied endorsement on the part of the Gov't.

  13. Comparison of electrochemical skin conductance and vibration perception threshold measurement in the detection of early diabetic neuropathy.

    Directory of Open Access Journals (Sweden)

    Amit Goel

The early diagnosis of diabetic peripheral neuropathy (DPN) is challenging. Sudomotor dysfunction is one of the earliest detectable abnormalities in DPN. The present study aimed to determine the diagnostic performance of the electrochemical skin conductance (ESC) test in detecting early DPN, compared with the vibration perception threshold (VPT) test and the diabetic neuropathy symptom (DNS) score, using the modified neuropathy disability score (NDS) as the reference standard. Five hundred and twenty-three patients with type 2 diabetes underwent an NDS-based clinical assessment for neuropathy. Participants were classified into the DPN and non-DPN groups based on the NDS (≥ 6). Both groups were evaluated further using the DNS, and VPT and ESC testing. A receiver-operator characteristic (ROC) curve analysis was performed to compare the efficacy of ESC measurements with those of DNS and VPT testing in detecting DPN. The DPN group (n = 110, 21%) had significantly higher HbA1c levels and longer diabetes durations compared with the non-DPN group (n = 413). The sensitivities of feet ESC below the cut-off, VPT above 15 V, and DNS ≥ 1 were 16.4, 10.9 and 1.8, respectively. ESC measurement is an objective and sensitive technique for the early detection of DPN. Feet ESC measurement was superior to VPT testing for identifying patients with early DPN.

  14. Energy-minimum sub-threshold self-timed circuits using current-sensing completion detection

    DEFF Research Database (Denmark)

    Akgun, O. C.; Rodrigues, J. N.; Sparsø, Jens

    2011-01-01

    This study addresses the design of self-timed energy-minimum circuits, operating in the sub-VT domain and a generic implementation template using bundled-data circuitry and current sensing completion detection (CSCD). Furthermore, a fully decoupled latch controller was developed, which integrates......V. Spice simulations indicate a gain of 52.58% in throughput because of asynchronous operation. By trading the throughput improvement, energy dissipation is reduced by 16.8% at the energy-minimum supply voltage....

  15. Calibration experiments of neutron source identification and detection in soil

    International Nuclear Information System (INIS)

    Gorin, N. V.; Lipilina, E. N.; Rukavishnikov, G. V.; Shmakov, D. V.; Ulyanov, A. I.

    2007-01-01

In the course of detecting fissile materials in soil, a series of calibration experiments was carried out under laboratory conditions on an experimental installation representing a mock-up of an effectively unbounded soil volume containing various heterogeneous bodies, fissile material, and measuring boreholes. The design of the detecting device and the neutron detection methods are described. The conditions of neutron background measurement are given. Soil density, humidity, and the chemical composition of the soil were measured. The sensitivity of the methods for detecting and identifying fissile materials in soil was estimated in the calibration experiments. The minimum detectable activity and the distance at which it can be detected were determined. Characteristics of neutron radiation in a borehole mock-up were measured; the dependence of method sensitivity on water content in the soil, source-detector distance, and the presence of heterogeneous bodies was examined. The possibility of determining the direction to a fissile material acting as a neutron source from a borehole using a collimator is shown. Identification of the fissile material was carried out by measuring the gamma spectrum. Mathematical modeling was carried out using the PRIZMA code (developed at RFNC-VNIITF) and the MCNP code (developed at LANL). Good agreement between calculated and experimental values was shown. The methods were shown to be applicable under field conditions

  16. Detection of aeroacoustic sound sources on aircraft and wind turbines

    NARCIS (Netherlands)

    Oerlemans, Stefan

    2009-01-01

    This thesis deals with the detection of aeroacoustic sound sources on aircraft and wind turbines using phased microphone arrays. First, the reliability of the array technique is assessed using airframe noise measurements in open and closed wind tunnels. It is demonstrated that quantitative acoustic

  17. Plagiarism Detection Algorithm for Source Code in Computer Science Education

    Science.gov (United States)

    Liu, Xin; Xu, Chan; Ouyang, Boyu

    2015-01-01

Nowadays, computer programming is increasingly important in program design courses in college education. However, some students plagiarize source code in their homework, adding only minor modifications, and it is not easy for teachers to judge whether source code has been plagiarized. Traditional detection algorithms cannot fit this…

  18. Geometric effects in alpha particle detection from distributed air sources

    International Nuclear Information System (INIS)

    Gil, L.R.; Leitao, R.M.S.; Marques, A.; Rivera, A.

    1994-08-01

Geometric effects associated with the detection of alpha particles from distributed air sources, as occurs in radon and thoron measurements, are revisited. The volume outside which no alpha particle may reach the entrance window of the detector is defined and determined analytically for rectangular and cylindrical symmetry geometries. (author). 3 figs

  19. Complex Event Detection via Multi Source Video Attributes (Open Access)

    Science.gov (United States)

    2013-10-03

Complex Event Detection via Multi-Source Video Attributes. Zhigang Ma, Yi Yang, Zhongwen Xu, Shuicheng Yan, Nicu Sebe, Alexander G. Hauptmann. … under its International Research Centre @ Singapore Funding Initiative and administered by the IDM Programme Office, and the Intelligence Advanced

  20. Threshold quantum cryptography

    International Nuclear Information System (INIS)

    Tokunaga, Yuuki; Okamoto, Tatsuaki; Imoto, Nobuyuki

    2005-01-01

    We present the concept of threshold collaborative unitary transformation or threshold quantum cryptography, which is a kind of quantum version of threshold cryptography. Threshold quantum cryptography states that classical shared secrets are distributed to several parties and a subset of them, whose number is greater than a threshold, collaborates to compute a quantum cryptographic function, while keeping each share secretly inside each party. The shared secrets are reusable if no cheating is detected. As a concrete example of this concept, we show a distributed protocol (with threshold) of conjugate coding

  1. Multilevel Thresholding Method Based on Electromagnetism for Accurate Brain MRI Segmentation to Detect White Matter, Gray Matter, and CSF

    Directory of Open Access Journals (Sweden)

    G. Sandhya

    2017-01-01

This work presents an advanced and accurate brain MRI segmentation method. MR brain image segmentation is used to understand anatomical structure, identify abnormalities, and delineate the various tissues that inform treatment planning prior to radiation therapy. The proposed technique is a Multilevel Thresholding (MT) method based on the phenomenon of electromagnetism, and it segments the image into three tissues: White Matter (WM), Gray Matter (GM), and CSF. The approach incorporates skull stripping and filtering with an anisotropic diffusion filter in the preprocessing stage. The thresholding method uses the force of attraction-repulsion between charged particles to increase the population. It combines the Electromagnetism-Like optimization algorithm with the Otsu and Kapur objective functions. The results obtained with the proposed method were compared with ground-truth images and gave the best values for the measures sensitivity, specificity, and segmentation accuracy. Results on 10 MR brain images showed that the proposed method segments the three brain tissues more accurately than existing segmentation methods such as K-means, fuzzy C-means, Otsu MT, Particle Swarm Optimization (PSO), Bacterial Foraging Algorithm (BFA), Genetic Algorithm (GA), and the Fuzzy Local Gaussian Mixture Model (FLGMM).
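    The Otsu part of the objective can be illustrated with a short sketch; the brute-force threshold search below stands in for the electromagnetism-like optimizer used in the paper, and the synthetic histogram is an assumption for demonstration.

```python
import numpy as np
from itertools import combinations

def between_class_variance(hist, thresholds):
    """Otsu objective for multilevel thresholding: weighted between-class variance."""
    p = hist / hist.sum()
    levels = np.arange(len(hist))
    cuts = [0, *thresholds, len(hist)]
    mu_t = (p * levels).sum()
    total = 0.0
    for lo, hi in zip(cuts[:-1], cuts[1:]):
        w = p[lo:hi].sum()
        if w == 0:
            continue
        mu = (p[lo:hi] * levels[lo:hi]).sum() / w
        total += w * (mu - mu_t) ** 2
    return total

def multilevel_otsu(hist, n_thresholds=2):
    """Exhaustive search over threshold tuples (thresholds are in histogram-bin units)."""
    best, best_t = -1.0, None
    for t in combinations(range(1, len(hist)), n_thresholds):
        v = between_class_variance(hist, t)
        if v > best:
            best, best_t = v, t
    return best_t, best

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic intensities with three classes (stand-ins for CSF/GM/WM).
    img = np.concatenate([rng.normal(40, 8, 4000), rng.normal(110, 10, 5000), rng.normal(180, 9, 3000)])
    hist, _ = np.histogram(np.clip(img, 0, 255), bins=64, range=(0, 256))
    print(multilevel_otsu(hist, 2))
```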

  2. Theoretical detection threshold of the proton-acoustic range verification technique

    International Nuclear Information System (INIS)

    Ahmad, Moiz; Yousefi, Siavash; Xing, Lei; Xiang, Liangzhong

    2015-01-01

Purpose: Range verification in proton therapy using the proton-acoustic signal induced in the Bragg peak was investigated for typical clinical scenarios. The signal generation and detection processes were simulated in order to determine the signal-to-noise limits. Methods: An analytical model was used to calculate the dose distribution and local pressure rise (per proton) for beams of different energy (100 and 160 MeV) and spot widths (1, 5, and 10 mm) in a water phantom. In this method, the acoustic waves propagating from the Bragg peak were generated by the general 3D pressure wave equation implemented using a finite element method. Various beam pulse widths (0.1–10 μs) were simulated by convolving the acoustic waves with Gaussian kernels. A realistic PZT ultrasound transducer (5 cm diameter) was simulated with a Butterworth bandpass filter with consideration of random noise based on a model of thermal noise in the transducer. The signal-to-noise ratio on a per-proton basis was calculated, determining the minimum number of protons required to generate a detectable pulse. The maximum spatial resolution of the proton-acoustic imaging modality was also estimated from the signal spectrum. Results: The calculated noise in the transducer was 12–28 mPa, depending on the transducer central frequency (70–380 kHz). The minimum number of protons detectable by the technique was on the order of 3–30 × 10⁶ per pulse, with 30–800 mGy dose per pulse at the Bragg peak. Wider pulses produced signal with lower acoustic frequencies, with 10 μs pulses producing signals with frequency less than 100 kHz. Conclusions: The proton-acoustic process was simulated using a realistic model and the minimal detection limit was established for proton-acoustic range validation. These limits correspond to a best case scenario with a single large detector with no losses and detector thermal noise as the sensitivity limiting factor. Our study indicated practical proton-acoustic range
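    As a rough illustration of how the minimum proton number follows from the noise floor, here is a back-of-envelope sketch; the per-proton pressure and required SNR are placeholder assumptions, and only the 12–28 mPa noise range comes from the abstract.

```python
def minimum_protons(noise_pa, pressure_per_proton_pa, required_snr=5.0):
    """Protons per pulse needed so that the peak signal equals required_snr * noise."""
    return required_snr * noise_pa / pressure_per_proton_pa

if __name__ == "__main__":
    noise = 20e-3          # Pa, mid-range of the quoted 12-28 mPa noise floor
    p_per_proton = 2e-8    # Pa per proton at the detector (assumed value for illustration)
    print(f"{minimum_protons(noise, p_per_proton):.2e} protons per pulse")
```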

  3. Theoretical detection threshold of the proton-acoustic range verification technique

    Energy Technology Data Exchange (ETDEWEB)

    Ahmad, Moiz; Yousefi, Siavash; Xing, Lei, E-mail: lei@stanford.edu [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, California 94305-5847 (United States); Xiang, Liangzhong [Center for Bioengineering and School of Electrical and Computer Engineering, University of Oklahoma, Norman, Oklahoma 73019-1101 (United States)

    2015-10-15

    Purpose: Range verification in proton therapy using the proton-acoustic signal induced in the Bragg peak was investigated for typical clinical scenarios. The signal generation and detection processes were simulated in order to determine the signal-to-noise limits. Methods: An analytical model was used to calculate the dose distribution and local pressure rise (per proton) for beams of different energy (100 and 160 MeV) and spot widths (1, 5, and 10 mm) in a water phantom. In this method, the acoustic waves propagating from the Bragg peak were generated by the general 3D pressure wave equation implemented using a finite element method. Various beam pulse widths (0.1–10 μs) were simulated by convolving the acoustic waves with Gaussian kernels. A realistic PZT ultrasound transducer (5 cm diameter) was simulated with a Butterworth bandpass filter with consideration of random noise based on a model of thermal noise in the transducer. The signal-to-noise ratio on a per-proton basis was calculated, determining the minimum number of protons required to generate a detectable pulse. The maximum spatial resolution of the proton-acoustic imaging modality was also estimated from the signal spectrum. Results: The calculated noise in the transducer was 12–28 mPa, depending on the transducer central frequency (70–380 kHz). The minimum number of protons detectable by the technique was on the order of 3–30 × 10{sup 6} per pulse, with 30–800 mGy dose per pulse at the Bragg peak. Wider pulses produced signal with lower acoustic frequencies, with 10 μs pulses producing signals with frequency less than 100 kHz. Conclusions: The proton-acoustic process was simulated using a realistic model and the minimal detection limit was established for proton-acoustic range validation. These limits correspond to a best case scenario with a single large detector with no losses and detector thermal noise as the sensitivity limiting factor. Our study indicated practical proton

  4. Effects of tempo, swing density, and listener's drumming experience, on swing detection thresholds for drum rhythms.

    Science.gov (United States)

    Frane, Andrew V; Shams, Ladan

    2017-06-01

    Swing, a popular technique in music performance, has been said to enhance the "groove" of the rhythm. Swing works by delaying the onsets of even-numbered subdivisions of each beat (e.g., 16th-note swing delays the onsets of the second and fourth 16th-note subdivisions of each quarter-note beat). The "swing magnitude" (loosely speaking, the amount of delay) is often quite small. And there has been little investigation, using musical stimuli, into what swing magnitudes listeners can detect. To that end, this study presented continually-looped electronic drum rhythms, with 16th-note swing in the hi-hat on every other bar, to drummers and non-drummers. Swing magnitude was adjusted using a staircase procedure, to determine the magnitude where the difference between swinging and not-swinging bars was just-noticeable. Different tempi (60 to 140 quarter-notes per minute) and swing densities (how often notes occurred at even-numbered subdivisions) were used. Results showed that all subjects could detect smaller swing magnitudes when swing density was higher, thus confirming a previous speculation that the perceptual salience of swing increases with swing density. The just-noticeable magnitudes of swing for drummers differed from those of non-drummers, in terms of both overall magnitude and sensitivity to tempo, thus prompting questions for further exploration.
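    The adaptive staircase idea can be illustrated with a short simulation; the 2-down/1-up rule, step size, and simulated observer below are generic assumptions, not the authors' exact procedure.

```python
import math
import random

def simulate_staircase(true_threshold_ms, start_ms=40.0, step_ms=2.0, reversals_needed=8):
    """2-down/1-up adaptive staircase: two detections in a row make the swing
    magnitude smaller, one miss makes it larger; the run converges near the
    just-noticeable magnitude."""
    def observer_detects(magnitude_ms):
        # Toy psychometric function: detection probability rises with swing magnitude.
        p = 1.0 / (1.0 + math.exp(-(magnitude_ms - true_threshold_ms) / 3.0))
        return random.random() < p

    level, streak, direction, reversals = start_ms, 0, None, []
    while len(reversals) < reversals_needed:
        if observer_detects(level):
            streak += 1
            if streak == 2:                    # two correct in a row -> harder
                streak = 0
                if direction == +1:
                    reversals.append(level)
                direction = -1
                level = max(step_ms, level - step_ms)
        else:                                  # one miss -> easier
            streak = 0
            if direction == -1:
                reversals.append(level)
            direction = +1
            level += step_ms
    tail = reversals[-6:]
    return sum(tail) / len(tail)               # average of the last reversals

if __name__ == "__main__":
    random.seed(0)
    print(f"estimated just-noticeable swing magnitude: {simulate_staircase(15.0):.1f} ms")
```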

  5. Passive Detection of Narrowband Sources Using a Sensor Array

    Energy Technology Data Exchange (ETDEWEB)

    Chambers, D H; Candy, J V; Guidry, B L

    2007-10-24

    In this report we derive a model for a highly scattering medium, implemented as a set of MATLAB functions. This model is used to analyze an approach for using time-reversal to enhance the detection of a single frequency source in a highly scattering medium. The basic approach is to apply the singular value decomposition to the multistatic response matrix for a time-reversal array system. We then use the array in a purely passive mode, measuring the response to the presence of a source. The measured response is projected onto the singular vectors, creating a time-reversal pseudo-spectrum. We can then apply standard detection techniques to the pseudo-spectrum to determine the presence of a source. If the source is close to a particular scatterer in the medium, then we would expect an enhancement of the inner product between the array response to the source with the singular vector associated with that scatterer. In this note we begin by deriving the Foldy-Lax model of a highly scattering medium, calculate both the field emitted by the source and the multistatic response matrix of a time-reversal array system in the medium, then describe the initial analysis approach.
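    A minimal sketch of the passive detection step, assuming a toy multistatic response matrix: the measured array response is projected onto the left singular vectors to form a pseudo-spectrum, whose largest entries indicate proximity of the source to a dominant scatterer.

```python
import numpy as np

def timereversal_pseudospectrum(K, g):
    """Project a passively measured array response g onto the left singular
    vectors of the multistatic response matrix K (simplified illustration)."""
    U, s, Vh = np.linalg.svd(K)
    return np.abs(U.conj().T @ g), s

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    n = 16                                         # array elements
    # Toy multistatic response: a few dominant scatterers plus weak clutter.
    scat = [rng.standard_normal(n) + 1j * rng.standard_normal(n) for _ in range(3)]
    K = sum(np.outer(a, a) for a in scat) \
        + 0.05 * (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n)))
    # Passive measurement: a source near scatterer 0, plus measurement noise.
    g = 0.8 * scat[0] + 0.05 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
    spectrum, singvals = timereversal_pseudospectrum(K, g)
    print("largest projections:", np.round(np.sort(spectrum)[::-1][:4], 3))
```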

  6. Limit of detection of a fiber optics gyroscope using a super luminescent radiation source

    International Nuclear Information System (INIS)

    Sandoval R, G.E.; Nikolaev, V.A.

    2003-01-01

The main objective of this work is to establish how the characteristics of the fiber optic gyroscope (FOG) depend on the parameters of a superluminescent emission source based on optical fiber doped with rare-earth elements (superluminescent fiber source, SFS), and to justify the choice of the SFS pumping regime needed to reach the limiting sensitivity of the FOG. When this type of emission source is used in the FOG, it is recommended to use the regime in which the direction of the pumping signal coincides with that of the superluminescent signal. The main results are the proposal of, and the argument for, the SFS as the emission source to be used in a phase-type FOG. This choice improves the sensitivity characteristics of the FOG in comparison with the semiconductor luminescent sources that are extensively used at present. The use of an SFS-type emission source makes it possible to approach the sensitivity limit (detection limit), which is determined by the shot noise. (Author)

  7. Limit of detection of a fiber optics gyroscope using a super luminescent radiation source

    CERN Document Server

    Sandoval, G E

    2003-01-01

The main objective of this work is to establish how the characteristics of the fiber optic gyroscope (FOG) depend on the parameters of a superluminescent emission source based on optical fiber doped with rare-earth elements (superluminescent fiber source, SFS), and to justify the choice of the SFS pumping regime needed to reach the limiting sensitivity of the FOG. When this type of emission source is used in the FOG, it is recommended to use the regime in which the direction of the pumping signal coincides with that of the superluminescent signal. The main results are the proposal of, and the argument for, the SFS as the emission source to be used in a phase-type FOG. This choice improves the sensitivity characteristics of the FOG in comparison with the semiconductor luminescent sources that are extensively used at present. The use of an SFS-type emission source makes it possible to approach the sensitivity limit (detection limit), which is determined by the shot noise.

  8. Thresholds of Detection and Identification of Halite Nodule Habitats in the Atacama Desert Using Remote Imaging

    Science.gov (United States)

    Phillips, M. S.; Moersch, J. E.; Cabrol, N. A.; Davila, A. F.

    2018-01-01

The guiding theme of Mars exploration is shifting from global and regional habitability assessment to biosignature detection. To locate features likely to contain biosignatures, it is useful to focus on the reliable identification of specific habitats with high biosignature preservation potential. Proposed chloride deposits on Mars may represent evaporitic environments conducive to the preservation of biosignatures. Analogous chloride-bearing, salt-encrusted playas (salars) are a habitat for life in the driest parts of the Atacama Desert, and are also environments with a taphonomic window. The specific geologic features that harbor and preserve microorganisms in Atacama salars are sub-meter to meter scale salt protuberances, or halite nodules. This study focuses on the ability to recognize and map halite nodules using images acquired from an unmanned aerial vehicle (UAV) at spatial resolutions ranging from mm/pixel to that of the highest resolution orbital images available for Mars.

  9. Optimization of input parameters of supra-threshold stochastic resonance image processing algorithm for the detection of abdomino-pelvic tumors on PET/CT scan

    International Nuclear Information System (INIS)

    Pandey, Anil Kumar; Saroha, Kartik; Patel, C.D.; Bal, C.S.; Kumar, Rakesh

    2016-01-01

Administration of diuretics increases the urine output to clear radioactive urine from the kidneys and bladder. Hence, a post-diuretic pelvic PET/CT scan enhances the probability of detection of abdomino-pelvic tumors. However, it causes discomfort in patients and also has some side effects. Application of the supra-threshold stochastic resonance (SSR) image processing algorithm on a pre-diuretic PET/CT scan may also increase the probability of detection of these tumors. The amount of noise and the threshold are the two variable parameters that affect the final image quality. This study was conducted to investigate the effect of these two parameters on the detection of abdomino-pelvic tumors.
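    Suprathreshold stochastic resonance itself is simple to sketch; the comparator-array model below, with its unit count, noise level, and synthetic image, is a generic illustration rather than the authors' implementation, but it shows the two tunable parameters (threshold and noise amount) at work.

```python
import numpy as np

def ssr_enhance(image, threshold, noise_sigma, n_units=32, seed=0):
    """Suprathreshold stochastic resonance sketch: each pixel is passed through
    n_units binary comparators with independent additive noise; the fraction of
    units that fire becomes the output intensity."""
    rng = np.random.default_rng(seed)
    out = np.zeros_like(image, dtype=float)
    for _ in range(n_units):
        noisy = image + rng.normal(0.0, noise_sigma, size=image.shape)
        out += (noisy > threshold)
    return out / n_units

if __name__ == "__main__":
    # Toy "scan": a faint lesion (0.30) on a background (0.25).
    img = np.full((64, 64), 0.25)
    img[28:36, 28:36] = 0.30
    enhanced = ssr_enhance(img, threshold=0.27, noise_sigma=0.05)
    print("mean output inside lesion :", float(enhanced[28:36, 28:36].mean()))
    print("mean output outside lesion:", float(enhanced[:20, :20].mean()))
```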

  10. Detecting people of interest from internet data sources

    Science.gov (United States)

    Cardillo, Raymond A.; Salerno, John J.

    2006-04-01

    In previous papers, we have documented success in determining the key people of interest from a large corpus of real-world evidence. Our recent efforts focus on exploring additional domains and data sources. Internet data sources such as email, web pages, and news feeds make it easier to gather a large corpus of documents for various domains, but detecting people of interest in these sources introduces new challenges. Analyzing these massive sources magnifies entity resolution problems, and demands a storage management strategy that supports efficient algorithmic analysis and visualization techniques. This paper discusses the techniques we used in order to analyze the ENRON email repository, which are also applicable to analyzing web pages returned from our "Buddy" meta-search engine.

  11. An overview of gravitational waves: theory, sources and detection

    CERN Document Server

    Auger, Gerard

    2017-01-01

    This book describes detection techniques used to search for and analyze gravitational waves (GW). It covers the whole domain of GW science, starting from the theory and ending with the experimental techniques (both present and future) used to detect them. The theoretical sections of the book address the theory of general relativity and of GW, followed by the theory of GW detection. The various sources of GW are described as well as the methods used to analyse them and to extract their physical parameters. It includes an analysis of the consequences of GW observations in terms of astrophysics as well as a description of the different detectors that exist and that are planned for the future. With the recent announcement of GW detection and the first results from LISA Pathfinder, this book will allow non-specialists to understand the present status of the field and the future of gravitational wave science

  12. Non-human primate skull effects on the cavitation detection threshold of FUS-induced blood-brain barrier opening

    Science.gov (United States)

    Wu, Shih-Ying; Tung, Yao-Sheng; Marquet, Fabrice; Chen, Cherry C.; Konofagou, Elisa E.

    2012-11-01

Microbubble (MB)-assisted focused ultrasound is a promising technique for delivering drugs to the brain by noninvasively and transiently opening the blood-brain barrier (BBB), and monitoring BBB opening using passive cavitation detection (PCD) is critical in detecting its occurrence and extent as well as assessing its mechanism. One of the main obstacles to achieving those objectives in large animals is transcranial attenuation. To study these effects, the cavitation response through an in-vitro non-human primate (NHP) skull was investigated. In-house manufactured lipid-shelled MB (median diameter: 4-5 μm) were injected into a 4-mm channel of a phantom below a degassed monkey skull. A hydrophone confocally aligned with the FUS transducer served as PCD during sonication (frequency: 0.50 MHz, peak rarefactional pressures: 0.05-0.60 MPa, pulse length: 100 cycles, PRF: 10 Hz, duration: 2 s) for four cases: water without skull, water with skull, MB without skull, and MB with skull. A 5.1-MHz linear-array transducer was also used to monitor the MB disruption. The frequency spectra, spectrograms, stable cavitation dose (SCD), and inertial cavitation dose (ICD) were quantified. Results showed that the onset of stable and inertial cavitation in the experiments occurred at 50 kPa and was detectable through the NHP skull, since both detection thresholds for stable and inertial cavitation remained unchanged compared with the non-skull case; however, the SCD and ICD acquired transcranially may not adequately represent the true extent of stable and inertial cavitation due to the skull attenuation.

  13. Accelerating fissile material detection with a neutron source

    Science.gov (United States)

    Rowland, Mark S.; Snyderman, Neal J.

    2018-01-30

A neutron detector system for discriminating fissile material from non-fissile material, wherein a digital data acquisition unit collects data at a high rate and, in real time, processes large volumes of data directly to count neutrons from the unknown source and to detect excess grouped neutrons in order to identify fission in the unknown source. The system includes a Poisson neutron generator for in-beam interrogation of a possible fissile neutron source and a DC power supply that exhibits electrical ripple on the order of less than one part per million. Certain voltage multiplier circuits, such as Cockcroft-Walton voltage multipliers, are used to enhance the effectiveness of series resistor-inductor circuit components to reduce the ripple associated with traditional AC-rectified, high-voltage DC power supplies.

  14. A Comparison of Source Code Plagiarism Detection Engines

    Science.gov (United States)

    Lancaster, Thomas; Culwin, Fintan

    2004-06-01

    Automated techniques for finding plagiarism in student source code submissions have been in use for over 20 years and there are many available engines and services. This paper reviews the literature on the major modern detection engines, providing a comparison of them based upon the metrics and techniques they deploy. Generally the most common and effective techniques are seen to involve tokenising student submissions then searching pairs of submissions for long common substrings, an example of what is defined to be a paired structural metric. Computing academics are recommended to use one of the two Web-based detection engines, MOSS and JPlag. It is shown that whilst detection is well established there are still places where further research would be useful, particularly where visual support of the investigation process is possible.
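    The tokenise-then-match idea described above can be sketched as follows; the crude tokeniser and the single longest-common-run score are simplified stand-ins for what engines such as MOSS and JPlag actually compute.

```python
import re
from difflib import SequenceMatcher

def tokenize(source):
    """Crude tokeniser: identifiers are collapsed to a single token class so that
    renaming variables does not hide copying (one common normalisation step)."""
    tokens = re.findall(r"[A-Za-z_]\w*|\d+|[^\s\w]", source)
    keywords = {"def", "return", "if", "else", "for", "while", "import"}
    return ["ID" if t.isidentifier() and t not in keywords else t for t in tokens]

def longest_common_run(a_tokens, b_tokens):
    """Length of the longest common substring of the two token streams,
    i.e. a paired structural metric of the kind mentioned above."""
    m = SequenceMatcher(None, a_tokens, b_tokens, autojunk=False)
    match = m.find_longest_match(0, len(a_tokens), 0, len(b_tokens))
    return match.size

if __name__ == "__main__":
    original = "def area(w, h):\n    return w * h\n"
    suspect = "def compute(x, y):\n    return x * y\n"   # renamed copy
    a, b = tokenize(original), tokenize(suspect)
    print(f"longest shared token run: {longest_common_run(a, b)} of {min(len(a), len(b))} tokens")
```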

  15. Galactic Sources Detected in the NuSTAR Serendipitous Survey

    Energy Technology Data Exchange (ETDEWEB)

    Tomsick, John A.; Clavel, Maïca; Chiu, Jeng-Lun [Space Sciences Laboratory, 7 Gauss Way, University of California, Berkeley, CA 94720-7450 (United States); Lansbury, George B.; Aird, James [Institute of Astronomy, University of Cambridge, Madingley Road, Cambridge CB3 0HA (United Kingdom); Rahoui, Farid [Department of Astronomy, Harvard University, 60 Garden Street, Cambridge, MA 02138 (United States); Fornasini, Francesca M.; Hong, JaeSub; Grindlay, Jonathan E. [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Alexander, David M. [Centre for Extragalactic Astronomy, Department of Physics, University of Durham, South Road, Durham DH1 3LE (United Kingdom); Bodaghee, Arash [Georgia College and State University, Milledgeville, GA 31061 (United States); Hailey, Charles J.; Mori, Kaya [Columbia Astrophysics Laboratory, Columbia University, New York, NY 10027 (United States); Harrison, Fiona A. [California Institute of Technology, 1200 East California Boulevard, Pasadena, CA 91125 (United States); Krivonos, Roman A. [Space Research Institute of the Russian Academy of Sciences, Profsoyuznaya Str. 84/32, 117997, Moscow (Russian Federation); Stern, Daniel [Jet Propulsion Laboratory, California Institute of Technology, 4800 Oak Grove Drive, Pasadena, CA 91109 (United States)

    2017-06-01

The Nuclear Spectroscopic Telescope Array (NuSTAR) provides an improvement in sensitivity at energies above 10 keV by two orders of magnitude over non-focusing satellites, making it possible to probe deeper into the Galaxy and universe. Lansbury and collaborators recently completed a catalog of 497 sources serendipitously detected in the 3–24 keV band using 13 deg² of NuSTAR coverage. Here, we report on an optical and X-ray study of 16 Galactic sources in the catalog. We identify 8 of them as stars (but some or all could have binary companions), and use information from Gaia to report distances and X-ray luminosities for 3 of them. There are 4 CVs or CV candidates, and we argue that NuSTAR J233426–2343.9 is a relatively strong CV candidate based partly on an X-ray spectrum from XMM-Newton. NuSTAR J092418–3142.2, which is the brightest serendipitous source in the Lansbury catalog, and NuSTAR J073959–3147.8 are low-mass X-ray binary candidates, but it is also possible that these 2 sources are CVs. One of the sources is a known high-mass X-ray binary (HMXB), and NuSTAR J105008–5958.8 is a new HMXB candidate that has strong Balmer emission lines in its optical spectrum and a hard X-ray spectrum. We discuss the implications of finding these HMXBs for the surface density (log N–log S) and luminosity function of Galactic HMXBs. We conclude that with the large fraction of unclassified sources in the Galactic plane detected by NuSTAR in the 8–24 keV band, there could be a significant population of low-luminosity HMXBs.

  16. An Analytical Threshold Voltage Model of Fully Depleted (FD) Recessed-Source/Drain (Re-S/D) SOI MOSFETs with Back-Gate Control

    Science.gov (United States)

    Saramekala, Gopi Krishna; Tiwari, Pramod Kumar

    2016-10-01

This paper presents an analytical threshold voltage model for back-gated fully depleted (FD), recessed-source/drain (Re-S/D) silicon-on-insulator metal-oxide-semiconductor field-effect transistors (MOSFETs). Analytical surface potential models have been developed at the front and back surfaces of the channel by solving the two-dimensional (2-D) Poisson's equation in the channel region with appropriate boundary conditions, assuming a parabolic potential profile in the transverse direction of the channel. The strong inversion criterion is applied to the front surface potential as well as to the back one in order to find two separate threshold voltages for the front and back channels of the device, respectively. The device threshold voltage has been assumed to be associated with the surface that offers the lower threshold voltage. The developed model was analyzed extensively for a variety of device geometry parameters like the oxide and silicon channel thicknesses, the thickness of the source/drain extension in the buried oxide, and the applied bias voltages with back-gate control. The proposed model has been validated by comparing the analytical results with numerical simulation data obtained from ATLAS™, a 2-D device simulator from SILVACO.

  17. Electron Signal Detection for the Beam-Finder Wire of the Linac Coherent Light Source Undulator

    International Nuclear Information System (INIS)

    Wu, Juhao; Emma, P.; Field, R.C.; SLAC

    2006-01-01

The Linac Coherent Light Source (LCLS) is a SASE x-ray Free-Electron Laser (FEL) based on the final kilometer of the Stanford Linear Accelerator. The tight tolerances for positioning the electron beam close to the undulator axis call for the introduction of a Beam Finder Wire (BFW) device. A BFW device close to the upstream end of the undulator segment and a quadrupole close to the downstream end of the undulator segment will allow a beam-based undulator segment alignment. Based on the scattering of the electrons on the BFW, we can detect the electron signal in the main dump bends after the undulator to find the beam position. We propose to use a threshold Cherenkov counter for this purpose. According to the signal strength at such a Cherenkov counter, we then suggest the choice of material and size for such a BFW device in the undulator

  18. Some statistical problems inherent in radioactive-source detection

    International Nuclear Information System (INIS)

    Barnett, C.S.

    1978-01-01

    Some of the statistical questions associated with problems of detecting random-point-process signals embedded in random-point-process noise are examined. An example of such a problem is that of searching for a lost radioactive source with a moving detection system. The emphasis is on theoretical questions, but some experimental and Monte Carlo results are used to test the theoretical results. Several idealized binary decision problems are treated by starting with simple, specific situations and progressing toward more general problems. This sequence of decision problems culminates in the minimum-cost-expectation rule for deciding between two Poisson processes with arbitrary intensity functions. As an example, this rule is then specialized to the detector-passing-a-point-source decision problem. Finally, Monte Carlo techniques are used to develop and test one estimation procedure: the maximum-likelihood estimation of a parameter in the intensity function of a Poisson process. For the Monte Carlo test this estimation procedure is specialized to the detector-passing-a-point-source case. Introductory material from probability theory is included so as to make the report accessible to those not especially conversant with probabilistic concepts and methods. 16 figures

  19. How to select a proper early warning threshold to detect infectious disease outbreaks based on the China infectious disease automated alert and response system (CIDARS).

    Science.gov (United States)

    Wang, Ruiping; Jiang, Yonggen; Michael, Engelgau; Zhao, Genming

    2017-06-12

The China Centre for Disease Control and Prevention (CDC) developed the China Infectious Disease Automated Alert and Response System (CIDARS) in 2005. The CIDARS is used to strengthen infectious disease surveillance and aid in the early warning of outbreaks. The CIDARS has been integrated into the routine outbreak monitoring efforts of the CDC at all levels in China. The early warning threshold is crucial for outbreak detection in the CIDARS, but CDCs at all levels are currently using thresholds recommended by the China CDC, and these recommended thresholds have recognized limitations. Our study therefore seeks to explore an operational method for selecting the proper early warning threshold according to the epidemic features of local infectious diseases. The data used in this study were extracted from the web-based Nationwide Notifiable Infectious Diseases Reporting Information System (NIDRIS), and data for infectious disease cases were organized by calendar week (1-52) and year (2009-2015) in Excel format; Px was calculated using a percentile-based moving window (moving window [5 week*5 year], x), where x represents one of 12 centiles (0.40, 0.45, 0.50…0.95). Outbreak signals for the 12 Px were calculated using the moving percentile method (MPM) based on data from the CIDARS. When the outbreak signals generated by the 'mean + 2SD' gold standard were in line with a Px-generated outbreak signal for each week during the year 2014, this Px was defined as the proper threshold for the infectious disease. Finally, the performance of the newly selected thresholds for each infectious disease was evaluated using simulated outbreak signals based on 2015 data. Six infectious diseases were selected in this study (chickenpox, mumps, hand, foot and mouth disease (HFMD), scarlet fever, influenza and rubella). Proper thresholds for chickenpox (P75), mumps (P80), influenza (P75), rubella (P45), HFMD (P75), and scarlet fever (P80) were identified. The selected proper thresholds for these
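    A minimal sketch of the moving percentile method, assuming weekly case counts arranged as a years-by-weeks array; the synthetic data and the centiles chosen here are illustrative only.

```python
import numpy as np

def moving_percentile_threshold(history, week, centile, half_window=2):
    """Threshold for `week` from a 5-week x 5-year moving window of historical
    weekly case counts (history has shape [years, 52])."""
    weeks = [(week + d) % 52 for d in range(-half_window, half_window + 1)]
    window = history[:, weeks].ravel()
    return np.percentile(window, centile)

def signals(history, current_year, centile):
    """Weeks of the current year whose counts exceed the percentile threshold."""
    return [w for w in range(52)
            if current_year[w] > moving_percentile_threshold(history, w, centile)]

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    seasonal = 20 + 15 * np.sin(np.linspace(0, 2 * np.pi, 52))   # seasonal baseline
    history = rng.poisson(seasonal, size=(5, 52))                 # five historical years
    current = rng.poisson(seasonal)
    current[30:33] += 40                                          # simulated outbreak
    for p in (75, 80, 90):
        print(f"P{p} signal weeks:", signals(history, current, p))
```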

  20. Magnetoencephalographic accuracy profiles for the detection of auditory pathway sources.

    Science.gov (United States)

    Bauer, Martin; Trahms, Lutz; Sander, Tilmann

    2015-04-01

The detection limits for cortical and brain stem sources associated with the auditory pathway are examined in order to analyse brain responses at the limits of the audible frequency range. The results obtained from this study are also relevant to other issues of auditory brain research. A complementary approach consisting of recordings of magnetoencephalographic (MEG) data and simulations of magnetic field distributions is presented in this work. A biomagnetic phantom consisting of a spherical volume filled with a saline solution and four current dipoles is built. The magnetic fields outside of the phantom generated by the current dipoles are then measured for a range of applied electric dipole moments with a planar multichannel SQUID magnetometer device and a helmet MEG gradiometer device. A magnetometer system is expected to be more sensitive to brain stem sources than a gradiometer system. The same electrical and geometrical configuration is simulated in a forward calculation. From both the measured and the simulated data, the dipole positions are estimated using an inverse calculation. Results are obtained for the reconstruction accuracy as a function of applied electric dipole moment and depth of the current dipole. We found that both systems can localize cortical and subcortical sources at physiological dipole strength, even for brain stem sources. Further, we found that a planar magnetometer system is more suitable if the position of the brain source can be restricted to a limited region of the brain. If this is not the case, a helmet-shaped sensor system offers more accurate source estimation.

  1. Stochastic Industrial Source Detection Using Lower Cost Methods

    Science.gov (United States)

    Thoma, E.; George, I. J.; Brantley, H.; Deshmukh, P.; Cansler, J.; Tang, W.

    2017-12-01

Hazardous air pollutants (HAPs) can be emitted from a variety of sources in industrial facilities, energy production, and commercial operations. Stochastic industrial sources (SISs) represent a subcategory of emissions from fugitive leaks, variable area sources, malfunctioning processes, and improperly controlled operations. From the shared perspective of industries and communities, cost-effective detection of mitigable SIS emissions can yield benefits such as safer working environments, cost saving through reduced product loss, lower air shed pollutant impacts, and improved transparency and community relations. Methods for SIS detection can be categorized by their spatial regime of operation, ranging from component-level inspection to high-sensitivity kilometer scale surveys. Methods can be temporally intensive (providing snap-shot measures) or sustained in both time-integrated and continuous forms. Each method category has demonstrated utility; however, broad adoption (or routine use) has thus far been limited by cost and implementation viability. Described here is a subset of SIS methods explored by the U.S. EPA's next generation emission measurement (NGEM) program, focusing on lower cost methods and models. An emerging systems approach that combines multiple forms to help compensate for reduced performance factors of lower cost systems is discussed. A case study of a multi-day HAP emission event observed by a combination of low cost sensors, open-path spectroscopy, and passive samplers is detailed. Early field results of a novel field gas chromatograph coupled with a fast HAP concentration sensor are described. Progress toward near real-time inverse source triangulation assisted by pre-modeled facility profiles using the Los Alamos Quick Urban & Industrial Complex (QUIC) model is discussed.

  2. The influence of olfactory concept on the probability of detecting sub- and peri-threshold odorants in a complex mixture

    NARCIS (Netherlands)

    Bult, J.H.F.; Schifferstein, H.N.J.; Roozen, J.P.; Voragen, A.G.J.; Kroeze, J.H.A.

    2001-01-01

    The headspace of apple juice was analysed to obtain an ecologically relevant stimulus model mixture of apple volatiles. Two sets of volatiles were made up: a set of eight supra-threshold volatiles (MIX) and a set of three sub-threshold volatiles. These sets were used to test the hypothesis that

  3. Detection, Source Location, and Analysis of Volcano Infrasound

    Science.gov (United States)

    McKee, Kathleen F.

    The study of volcano infrasound focuses on low frequency sound from volcanoes, how volcanic processes produce it, and the path it travels from the source to our receivers. In this dissertation we focus on detecting, locating, and analyzing infrasound from a number of different volcanoes using a variety of analysis techniques. These works will help inform future volcano monitoring using infrasound with respect to infrasonic source location, signal characterization, volatile flux estimation, and back-azimuth to source determination. Source location is an important component of the study of volcano infrasound and in its application to volcano monitoring. Semblance is a forward grid search technique and common source location method in infrasound studies as well as seismology. We evaluated the effectiveness of semblance in the presence of significant topographic features for explosions of Sakurajima Volcano, Japan, while taking into account temperature and wind variations. We show that topographic obstacles at Sakurajima cause a semblance source location offset of 360-420 m to the northeast of the actual source location. In addition, we found despite the consistent offset in source location semblance can still be a useful tool for determining periods of volcanic activity. Infrasonic signal characterization follows signal detection and source location in volcano monitoring in that it informs us of the type of volcanic activity detected. In large volcanic eruptions the lowermost portion of the eruption column is momentum-driven and termed the volcanic jet or gas-thrust zone. This turbulent fluid-flow perturbs the atmosphere and produces a sound similar to that of jet and rocket engines, known as jet noise. We deployed an array of infrasound sensors near an accessible, less hazardous, fumarolic jet at Aso Volcano, Japan as an analogue to large, violent volcanic eruption jets. We recorded volcanic jet noise at 57.6° from vertical, a recording angle not normally feasible
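    Semblance as a forward grid search can be sketched briefly; the sensor geometry, sound speed, and window length below are illustrative assumptions, not taken from the dissertation.

```python
import numpy as np

def semblance(waveforms, delays_samples, win):
    """Semblance of the aligned traces: energy of the stack divided by N times
    the summed energies of the individual aligned traces."""
    n = waveforms.shape[0]
    aligned = np.array([np.roll(w, -d)[:win] for w, d in zip(waveforms, delays_samples)])
    num = np.sum(aligned.sum(axis=0) ** 2)
    den = n * np.sum(aligned ** 2)
    return num / den if den > 0 else 0.0

def grid_search(waveforms, sensors, grid, c, fs, win):
    """Forward grid search: the candidate point whose predicted relative travel
    times give the highest semblance is the estimated source location."""
    best, best_pt = -1.0, None
    for pt in grid:
        t = np.linalg.norm(sensors - pt, axis=1) / c                # travel times (s)
        d = np.round((t - t.min()) * fs).astype(int)                # relative delays
        s = semblance(waveforms, d, win)
        if s > best:
            best, best_pt = s, pt
    return best_pt, best

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    fs, c = 100.0, 340.0                                            # Hz, m/s
    sensors = np.array([[0, 0], [50, 0], [0, 50], [50, 50]], float)
    true_src = np.array([400.0, 120.0])
    pulse = np.exp(-0.5 * ((np.arange(200) - 30) / 5.0) ** 2)       # infrasonic transient
    traces = []
    for s in sensors:
        delay = int(round(np.linalg.norm(s - true_src) / c * fs))
        tr = np.zeros(600)
        tr[delay:delay + 200] += pulse
        traces.append(tr + 0.05 * rng.standard_normal(600))
    grid = [np.array([x, y], float) for x in range(300, 501, 20) for y in range(0, 201, 20)]
    print(grid_search(np.array(traces), sensors, grid, c, fs, win=400))
```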

  4. Transmission thresholds for dengue in terms of Aedes aegypti pupae per person with discussion of their utility in source reduction efforts.

    Science.gov (United States)

    Focks, D A; Brenner, R J; Hayes, J; Daniels, E

    2000-01-01

The expense and ineffectiveness of drift-based insecticide aerosols to control dengue epidemics have led to suppression strategies based on eliminating larval breeding sites. With the notable but short-lived exceptions of Cuba and Singapore, these source reduction efforts have met with little documented success; failure has chiefly been attributed to inadequate participation of the communities involved. The present work attempts to estimate transmission thresholds for dengue based on an easily-derived statistic, the standing crop of Aedes aegypti pupae per person in the environment. We have developed these thresholds for use in the assessment of risk of transmission and to provide targets for the actual degree of suppression required to prevent or eliminate transmission in source reduction programs. The notion of thresholds is based on two concepts: the mass action principle (the course of an epidemic is dependent on the rate of contact between susceptible hosts and infectious vectors) and threshold theory (the introduction of a few infectious individuals into a community of susceptible individuals will not give rise to an outbreak unless the density of vectors exceeds a certain critical level). We use validated transmission models to estimate thresholds as a function of pre-existing antibody levels in human populations, ambient air temperatures, and the size and frequency of viral introduction. Threshold levels were estimated to range between about 0.5 and 1.5 Ae. aegypti pupae per person for ambient air temperatures of 28 degrees C and initial seroprevalences ranging between 0% and 67%. Surprisingly, the size of the viral introduction used in these studies, ranging between 1 and 12 infectious individuals per year, was not seen to significantly influence the magnitude of the threshold. From a control perspective, these results are not particularly encouraging. The ratio of Ae. aegypti pupae to human density has been observed in limited field studies to range

  5. Detecting analogical resemblance without retrieving the source analogy.

    Science.gov (United States)

    Kostic, Bogdan; Cleary, Anne M; Severin, Kaye; Miller, Samuel W

    2010-06-01

    We examined whether people can detect analogical resemblance to an earlier experimental episode without being able to recall the experimental source of the analogical resemblance. We used four-word analogies (e.g., robin-nest/beaver-dam), in a variation of the recognition-without-cued-recall method (Cleary, 2004). Participants studied word pairs (e.g., robin-nest) and were shown new word pairs at test, half of which analogically related to studied word pairs (e.g., beaver-dam) and half of which did not. For each test pair, participants first attempted to recall an analogically similar pair from the study list. Then, regardless of whether successful recall occurred, participants were prompted to rate the familiarity of the test pair, which was said to indicate the likelihood that a pair that was analogically similar to the test pair had been studied. Across three experiments, participants demonstrated an ability to detect analogical resemblance without recalling the source analogy. Findings are discussed in terms of their potential relevance to the study of analogical reasoning and insight, as well as to the study of familiarity and recognition memory.

  6. A national survey of HDR source knowledge among practicing radiation oncologists and residents: Establishing a willingness-to-pay threshold for cobalt-60 usage.

    Science.gov (United States)

    Mailhot Vega, Raymond; Talcott, Wesley; Ishaq, Omar; Cohen, Patrice; Small, Christina J; Duckworth, Tamara; Sarria Bardales, Gustavo; Perez, Carmen A; Schiff, Peter B; Small, William; Harkenrider, Matthew M

Ir-192 is the predominant source for high-dose-rate (HDR) brachytherapy in United States markets. Co-60, with a longer half-life and fewer source exchanges, has been piloted abroad with comparable clinical dosimetry but increased shielding requirements. We sought to identify practitioner knowledge of Co-60 and establish acceptable willingness-to-pay (WTP) thresholds for additional shielding requirements, for use in a future cost-benefit analysis. A nationwide survey of U.S. radiation oncologists was conducted from June to July 2015, assessing knowledge of HDR sources, brachytherapy unit shielding, and factors that may influence source-selection decision-making. Self-identified decision makers in radiotherapy equipment purchase and acquisition were asked their WTP on shielding should a more cost-effective source become available. Four hundred forty surveys were completed and included. Forty-four percent were ABS members. Twenty percent of respondents identified Co-60 as an HDR source. Respondents who identified Co-60 were significantly more likely to be ABS members, have attended a national brachytherapy conference, and be involved in brachytherapy selection. Sixty-six percent of self-identified decision makers stated that their facility would switch to a more cost-effective source than Ir-192, if available. Cost and experience were the most common reasons provided for not switching. The most common WTP value selected by respondents was … The survey queried decision makers to establish WTP for the shielding costs that a source change to Co-60 may require. These results will be used to establish a WTP threshold for future cost-benefit analysis. Copyright © 2017 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  7. Detection of aeroacoustic sound sources on aircraft and wind turbines

    International Nuclear Information System (INIS)

    Oerlemans, S.

    2009-01-01

    This thesis deals with the detection of aeroacoustic sound sources on aircraft and wind turbines using phased microphone arrays. First, the reliability of the array technique is assessed using airframe noise measurements in open and closed wind tunnels. It is demonstrated that quantitative acoustic measurements are possible in both wind tunnels. Then, the array technique is applied to characterize the noise sources on two modern large wind turbines. It is shown that practically all noise emitted to the ground is produced by the outer part of the blades during their downward movement. This asymmetric source pattern, which causes the typical swishing noise during the passage of the blades, can be explained by trailing edge noise directivity and convective amplification. Next, a semi-empirical prediction method is developed for the noise from large wind turbines. The prediction code is successfully validated against the experimental results, not only with regard to sound levels, spectra, and directivity, but also with regard to the noise source distribution in the rotor plane and the temporal variation in sound level (swish). The validated prediction method is then applied to calculate wind turbine noise footprints, which show that large swish amplitudes can occur even at large distance. The influence of airfoil shape on blade noise is investigated through acoustic wind tunnel tests on a series of wind turbine airfoils. Measurements are carried out at various wind speeds and angles of attack, with and without upstream turbulence and boundary layer tripping. The speed dependence, directivity, and tonal behaviour are determined for both trailing edge noise and inflow turbulence noise. Finally, two noise reduction concepts are tested on a large wind turbine: acoustically optimized airfoils and trailing edge serrations. Both blade modifications yield a significant trailing edge noise reduction at low frequencies, but also cause increased tip noise at high frequencies

  8. Neutron activation analysis detection limits using 252Cf sources

    International Nuclear Information System (INIS)

    DiPrete, D.P.; Sigg, R.A.

    2000-01-01

The Savannah River Technology Center (SRTC) developed a neutron activation analysis (NAA) facility several decades ago using low-flux 252Cf neutron sources. Through this time, the facility has addressed areas of applied interest in managing the Savannah River Site (SRS). Some applications are unique because of the site's operating history and its chemical-processing facilities. Because sensitivity needs for many applications are not severe, they can be accomplished using an ∼6-mg 252Cf NAA facility. The SRTC 252Cf facility continues to support applied research programs at SRTC as well as other SRS programs for environmental and waste management customers. Samples analyzed by NAA include organic compounds, metal alloys, sediments, site process solutions, and many other materials. Numerous radiochemical analyses also rely on the facility for production of short-lived tracers, yielding by activation of carriers, and small-scale isotope production for separation methods testing. These applications are more fully reviewed in Ref. 1. Although the flux (approximately 2 × 10⁷ n/cm²·s) is low relative to reactor facilities, more than 40 elements can be detected at low and sub-part-per-million levels. Detection limits provided by the facility are adequate for many analytical projects. Other multielement analysis methods, particularly inductively coupled plasma atomic emission and inductively coupled plasma mass spectrometry, can now provide sensitivities on dissolved samples that are often better than those available by NAA using low-flux isotopic sources. Because NAA allows analysis of bulk samples, (a) it is a more cost-effective choice than methods that require digestion when its sensitivity is adequate, and (b) it eliminates uncertainties that can be introduced by digestion processes.

  9. Detection of embedded radiation sources using temporal variation of gamma spectral data.

    Energy Technology Data Exchange (ETDEWEB)

    Shokair, Isaac R.

    2011-09-01

    Conventional full spectrum gamma spectroscopic analysis has the objective of quantitative identification of all the isotopes present in a measurement. For low energy resolution detectors, when photopeaks alone are not sufficient for complete isotopic identification, such analysis requires template spectra for all the isotopes present in the measurement. When many isotopes are present it is difficult to make the correct identification and this process often requires many trial solutions by highly skilled spectroscopists. This report investigates the potential of a new analysis method which uses spatial/temporal information from multiple low energy resolution measurements to test the hypothesis of the presence of a target spectrum of interest in these measurements without the need to identify all the other isotopes present. This method is referred to as targeted principal component analysis (TPCA). For radiation portal monitor applications, multiple measurements of gamma spectra are taken at equally spaced time increments as a vehicle passes through the portal and the TPCA method is directly applicable to this type of measurement. In this report we describe the method and investigate its application to the problem of detection of a radioactive localized source that is embedded in a distributed source in the presence of an ambient background. Examples using simulated spectral measurements indicate that this method works very well and has the potential for automated analysis for RPM applications. This method is also expected to work well for isotopic detection in the presence of spectrally and spatially varying backgrounds as a result of vehicle-induced background suppression. Further work is needed to include effects of shielding, to understand detection limits, setting of thresholds, and to estimate false positive probability.
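    A simplified illustration of testing for a target spectrum against the leading principal components of a measurement series follows; this is an assumption-laden stand-in, not the specific TPCA formulation in the report.

```python
import numpy as np

def tpca_score(measurements, target, n_bkg_components=3):
    """Score the hypothesis that `target` (a template spectrum) is present in the
    sequence of `measurements` (rows = time slices). The leading principal
    components of the measurement set are treated as background variation and
    projected out; the residual spectra are then correlated with the similarly
    projected target."""
    X = measurements - measurements.mean(axis=0)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    B = Vt[:n_bkg_components]                       # background subspace (spectral)
    P = np.eye(X.shape[1]) - B.T @ B                # projector onto its complement
    t_res = P @ target
    t_res /= np.linalg.norm(t_res) + 1e-12
    return (measurements @ P) @ t_res               # per-time-slice match scores

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    n_t, n_e = 40, 128
    bkg_shape = np.exp(-np.arange(n_e) / 50.0)
    drift = 1.0 + 0.3 * np.sin(np.linspace(0, 3, n_t))[:, None]    # background variation
    data = rng.poisson(drift * bkg_shape * 200).astype(float)
    target = np.zeros(n_e); target[60:64] = 1.0                    # line-like template
    data[20] += 80 * target                                        # embedded source at slice 20
    print("slice with maximum score:", int(np.argmax(tpca_score(data, target))))
```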

  10. Measuring mouse retina response near the detection threshold to direct stimulation of photons with sub-poisson statistics

    Science.gov (United States)

    Tavala, Amir; Dovzhik, Krishna; Schicker, Klaus; Koschak, Alexandra; Zeilinger, Anton

Probing the visual system of humans and animals in the very low photon rate regime has recently attracted the quantum optics community. In an experiment on isolated photoreceptor cells of Xenopus, the cell output signal was measured while stimulating it with pulses of sub-Poisson distributed photons. The results showed a single photon detection efficiency of 29 +/- 4.7% [1]. Another behavioral experiment on humans suggests a lower detection capability at the perception level, with a chance of 0.516 +/- 0.01 (i.e. slightly better than a random guess) [2]. Although the species are different, both biological models and experimental observations with classical light stimuli suggest that a fraction of single photon responses is filtered somewhere within the retina network and/or during the neural processes in the brain. In this ongoing experiment, we look for a quantitative answer to this question by measuring the output signals of the last neural layer of WT mouse retina using microelectrode arrays. We use a heralded downconversion single-photon source. We stimulate the retina directly since the eye lens (responsible for 20-50% of optical loss and scattering [2]) has been removed. Here, we demonstrate our first results, which confirm the response to sub-Poisson distributed pulses. This project was supported by the Austrian Academy of Sciences, SFB FoQuS F 4007-N23 funded by FWF, and ERC QIT4QAD 227844 funded by the EU Commission.

  11. An analytical threshold voltage model for a short-channel dual-metal-gate (DMG) recessed-source/drain (Re-S/D) SOI MOSFET

    Science.gov (United States)

    Saramekala, G. K.; Santra, Abirmoya; Dubey, Sarvesh; Jit, Satyabrata; Tiwari, Pramod Kumar

    2013-08-01

    In this paper, an analytical short-channel threshold voltage model is presented for a dual-metal-gate (DMG) fully depleted recessed source/drain (Re-S/D) SOI MOSFET. For the first time, the advantages of recessed source/drain (Re-S/D) and of dual-metal-gate structure are incorporated simultaneously in a fully depleted SOI MOSFET. The analytical surface potential model at Si-channel/SiO2 interface and Si-channel/buried-oxide (BOX) interface have been developed by solving the 2-D Poisson’s equation in the channel region with appropriate boundary conditions assuming parabolic potential profile in the transverse direction of the channel. Thereupon, a threshold voltage model is derived from the minimum surface potential in the channel. The developed model is analyzed extensively for a variety of device parameters like the oxide and silicon channel thicknesses, thickness of source/drain extension in the BOX, control and screen gate length ratio. The validity of the present 2D analytical model is verified with ATLAS™, a 2D device simulator from SILVACO Inc.
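    For reference, the parabolic-potential formulation mentioned above generally takes the following form; the notation here is assumed for illustration, not copied from the paper.

```latex
% Parabolic (Young-type) ansatz for the channel potential; x runs along the
% channel and y across the silicon film of thickness t_Si:
\[
  \phi(x,y) = \phi_f(x) + c_1(x)\,y + c_2(x)\,y^2 , \qquad 0 \le y \le t_{\mathrm{Si}} ,
\]
% which is substituted into the 2-D Poisson equation (depletion approximation):
\[
  \frac{\partial^2 \phi}{\partial x^2} + \frac{\partial^2 \phi}{\partial y^2}
  = \frac{q\,N_A}{\varepsilon_{\mathrm{Si}}} .
\]
% The oxide-interface boundary conditions fix c_1(x) and c_2(x) in terms of the
% front-surface potential \phi_f(x) and the gate voltages; solving the resulting
% ODE for \phi_f(x) and requiring its minimum to reach 2\phi_F yields the
% threshold voltage.
```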

  12. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources

    Energy Technology Data Exchange (ETDEWEB)

    Klumpp, John [Colorado State University, Department of Environmental and Radiological Health Sciences, Molecular and Radiological Biosciences Building, Colorado State University, Fort Collins, Colorado, 80523 (United States)

    2013-07-01

    We propose a radiation detection system which generates its own discrete sampling distribution based on past measurements of background. The advantage to this approach is that it can take into account variations in background with respect to time, location, energy spectra, detector-specific characteristics (i.e. different efficiencies at different count rates and energies), etc. This would therefore be a 'machine learning' approach, in which the algorithm updates and improves its characterization of background over time. The system would have a 'learning mode,' in which it measures and analyzes background count rates, and a 'detection mode,' in which it compares measurements from an unknown source against its unique background distribution. By characterizing and accounting for variations in the background, general purpose radiation detectors can be improved with little or no increase in cost. The statistical and computational techniques to perform this kind of analysis have already been developed. The necessary signal analysis can be accomplished using existing Bayesian algorithms which account for multiple channels, multiple detectors, and multiple time intervals. Furthermore, Bayesian machine-learning techniques have already been developed which, with trivial modifications, can generate appropriate decision thresholds based on the comparison of new measurements against a nonparametric sampling distribution. (authors)
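
    As a rough illustration of the 'learning mode'/'detection mode' idea described above, the sketch below builds a nonparametric background distribution per energy channel and flags measurements with improbably high counts. The class name, the add-one-smoothed tail probability and the per-channel alarm rule are assumptions made for the example; the abstract's full Bayesian multi-detector, multi-interval treatment is not reproduced here.

      import numpy as np

      class BackgroundMonitor:
          """Learn an empirical background distribution per energy channel, then
          flag new measurements whose counts are improbably high for background.
          Simplified stand-in for the Bayesian machine-learning system described
          in the abstract."""

          def __init__(self, n_channels, alarm_p=1e-3):
              self.history = [[] for _ in range(n_channels)]
              self.alarm_p = alarm_p

          def learn(self, counts):
              # "Learning mode": store one background measurement per channel.
              for ch, c in enumerate(counts):
                  self.history[ch].append(c)

          def detect(self, counts):
              # "Detection mode": empirical P(background >= observed), per channel.
              flags = []
              for ch, c in enumerate(counts):
                  past = np.asarray(self.history[ch])
                  p = (np.sum(past >= c) + 1) / (len(past) + 1)   # add-one smoothing
                  flags.append(bool(p < self.alarm_p))
              return flags

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          bkg_rates = np.array([20.0, 5.0, 2.0])       # mean counts per interval, 3 channels

          monitor = BackgroundMonitor(n_channels=3, alarm_p=0.01)
          for _ in range(5000):                         # learning mode
              monitor.learn(rng.poisson(bkg_rates))

          print("background-only:", monitor.detect(rng.poisson(bkg_rates)))
          print("with source    :", monitor.detect(rng.poisson(bkg_rates + [0.0, 15.0, 0.0])))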

  13. Calculation of left ventricular volume and ejection fraction from ECG-gated myocardial SPECT. Automatic detection of endocardial borders by threshold method

    International Nuclear Information System (INIS)

    Fukushi, Shoji; Teraoka, Satomi.

    1997-01-01

    A new method has been designed which calculates end-diastolic volume (EDV), end-systolic volume (ESV) and ejection fraction (LVEF) of the left ventricle from myocardial short axis images of ECG-gated SPECT using a 99mTc myocardial perfusion tracer. ECG-gated 180-degree SPECT was performed with eight frames per cardiac cycle. A threshold method was used to detect myocardial borders automatically; the optimal threshold, determined with a myocardial SPECT phantom, was 45%. To determine whether EDV, ESV and LVEF can be calculated reliably by this method, results for 12 patients were correlated with left ventriculography (LVG) performed within 10 days. The correlation coefficients with LVG were 0.918 (EDV), 0.935 (ESV) and 0.900 (LVEF). This method offers excellent objectivity and reproducibility because of the automatic detection of myocardial borders. It also provides useful information on heart function in addition to myocardial perfusion. (author)
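
    The volume and ejection-fraction arithmetic referred to above is straightforward once the endocardial border has been segmented; the sketch below (hypothetical names, toy spherical masks standing in for thresholded short-axis SPECT data) shows the voxel-counting step and the formula LVEF = (EDV - ESV) / EDV.

      import numpy as np

      def volume_from_mask(cavity_mask, voxel_size_mm):
          """Volume (mL) of a segmented left-ventricular cavity: count the voxels
          inside the detected endocardial border and multiply by the voxel volume.
          Obtaining the border (the 45% threshold in the abstract) is assumed done."""
          voxel_ml = float(np.prod(voxel_size_mm)) / 1000.0      # mm^3 -> mL
          return cavity_mask.sum() * voxel_ml

      def ejection_fraction(edv_ml, esv_ml):
          """LVEF (%) from end-diastolic and end-systolic volumes."""
          return 100.0 * (edv_ml - esv_ml) / edv_ml

      if __name__ == "__main__":
          # Toy masks: spheres of different radii stand in for the ED and ES cavities.
          grid = np.indices((40, 40, 40)) - 20
          r = np.sqrt((grid ** 2).sum(axis=0))
          edv = volume_from_mask(r < 15, voxel_size_mm=(2.0, 2.0, 2.0))
          esv = volume_from_mask(r < 11, voxel_size_mm=(2.0, 2.0, 2.0))
          print(f"EDV = {edv:.0f} mL, ESV = {esv:.0f} mL, LVEF = {ejection_fraction(edv, esv):.0f}%")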

  14. Detection and classification of alarm threshold violations in condition monitoring systems working in highly varying operational conditions

    Science.gov (United States)

    Strączkiewicz, M.; Barszcz, T.; Jabłoński, A.

    2015-07-01

    All commonly used condition monitoring systems (CMS) enable defining alarm thresholds that support efficient surveillance and maintenance of the dynamic state of machinery. The thresholds are imposed on measured values such as vibration-based indicators, temperature, pressure, etc. For complex machinery such as a wind turbine (WT), the total number of thresholds may run into the hundreds, multiplied by the number of operational states. All the parameters vary not only due to possible machinery malfunctions, but also due to changes in operating conditions, and these changes are typically much stronger than the former ones. Such behavior can lead to hundreds of false alarms. The authors therefore propose a novel approach based on a parameterized description of the threshold violation. For this purpose the novelty and severity factors are introduced. The first parameter refers to the time of violation occurrence, while the second one describes the impact of the indicator increase on the entire machine. This approach increases the reliability of the CMS by providing the operator with the most useful information about system events. The idea of the procedure is presented on simulated data similar to those from a wind turbine.
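
    The abstract names the novelty and severity factors but not their formulas; the sketch below shows one plausible parameterization (how recently the indicator first crossed the threshold, and by how much it exceeds it), purely to illustrate the idea. The function name, the window-based novelty definition and the relative-exceedance severity are assumptions, not the published definitions.

      import numpy as np

      def classify_violation(indicator, threshold, timestamps):
          """Parameterize an alarm-threshold violation with (novelty, severity).

          Illustrative definitions:
            novelty  - 1 when the first exceedance is at the very end of the
                       observation window (brand-new event), 0 when it happened
                       at the start of the window,
            severity - peak relative exceedance above the threshold.
          """
          indicator = np.asarray(indicator, float)
          timestamps = np.asarray(timestamps, float)
          over = indicator > threshold
          if not over.any():
              return None                                      # no violation

          t_first = timestamps[over][0]
          window = timestamps[-1] - timestamps[0]
          novelty = 1.0 - (timestamps[-1] - t_first) / window
          severity = (indicator[over].max() - threshold) / threshold
          return {"novelty": round(novelty, 3), "severity": round(severity, 3)}

      if __name__ == "__main__":
          t = np.arange(0.0, 100.0)                             # e.g. hours of operation
          vib = 1.0 + 0.05 * np.random.default_rng(2).standard_normal(t.size)
          vib[90:] += 0.8                                       # step change near the end
          print(classify_violation(vib, threshold=1.3, timestamps=t))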

  15. Detection and classification of alarm threshold violations in condition monitoring systems working in highly varying operational conditions

    International Nuclear Information System (INIS)

    Strączkiewicz, M; Barszcz, T; Jabłoński, A

    2015-01-01

    All commonly used condition monitoring systems (CMS) enable defining alarm thresholds that support efficient surveillance and maintenance of the dynamic state of machinery. The thresholds are imposed on measured values such as vibration-based indicators, temperature, pressure, etc. For complex machinery such as a wind turbine (WT), the total number of thresholds may run into the hundreds, multiplied by the number of operational states. All the parameters vary not only due to possible machinery malfunctions, but also due to changes in operating conditions, and these changes are typically much stronger than the former ones. Such behavior can lead to hundreds of false alarms. The authors therefore propose a novel approach based on a parameterized description of the threshold violation. For this purpose the novelty and severity factors are introduced. The first parameter refers to the time of violation occurrence, while the second one describes the impact of the indicator increase on the entire machine. This approach increases the reliability of the CMS by providing the operator with the most useful information about system events. The idea of the procedure is presented on simulated data similar to those from a wind turbine. (paper)

  16. Mechano-sensitive nociceptors are required to detect heat pain thresholds and cowhage itch in human skin.

    Science.gov (United States)

    Weinkauf, B; Dusch, M; van der Ham, J; Benrath, J; Ringkamp, M; Schmelz, M; Rukwied, R

    2016-02-01

    Mechano-sensitive and mechano-insensitive C-nociceptors in human skin differ in receptive field sizes and electrical excitation thresholds, but their distinct functional roles are yet unclear. After blocking the lateral femoral cutaneous nerve (NCFL) in eight healthy male subjects (3-mL Naropin(®) 1%), we mapped the skin innervation territory being anaesthetic to mechanical pin prick but sensitive to painful transcutaneous electrical stimuli. Such 'differentially anaesthetic zones' indicated that the functional innervation with mechano-sensitive nociceptors was absent but the innervation with mechano-insensitive nociceptors remained intact. In these areas, we explored heat pain thresholds, low pH-induced pain, cowhage- and histamine-induced itch, and axon reflex flare. In differentially anaesthetic skin, heat pain thresholds were above the cut-off of 50°C (non-anaesthetized skin 47 ± 0.4°C). Pain ratings to 30 μL pH 4 injections were reduced compared to non-anaesthetized skin (48 ± 9 vs. 79 ± 6 VAS; p pain. The mechano-sensitive nociceptors are crucial for cowhage-induced itch and for the assessment of heat pain thresholds. © 2015 European Pain Federation - EFIC®

  17. Detection of extended galactic sources with an underwater neutrino telescope

    International Nuclear Information System (INIS)

    Leisos, A.; Tsirigotis, A. G.; Tzamarias, S. E.; Lenis, D.

    2014-01-01

    In this study we investigate the discovery capability of a Very Large Volume Neutrino Telescope for Galactic extended sources. We focus on the brightest HESS gamma ray sources, which are also considered to be very high energy neutrino emitters. We use the unbinned method, taking into account both the spatial and the energy distribution of high energy neutrinos, and we investigate parts of the Galactic plane where nearby potential neutrino emitters form neutrino source clusters. Neutrino source clusters as well as isolated neutrino sources are combined to estimate the observation period required for a 5 sigma discovery of neutrino signals from these objects.

  18. Comparison of two threshold detection criteria methodologies for determination of probe positivity for intraoperative in situ identification of presumed abnormal 18F-FDG-avid tissue sites during radioguided oncologic surgery.

    Science.gov (United States)

    Chapman, Gregg J; Povoski, Stephen P; Hall, Nathan C; Murrey, Douglas A; Lee, Robert; Martin, Edward W

    2014-09-13

    Intraoperative in situ identification of (18)F-FDG-avid tissue sites during radioguided oncologic surgery remains a significant challenge for surgeons. The purpose of our study was to evaluate the 1.5-to-1 ratiometric threshold criteria method versus the three-sigma statistical threshold criteria method for determination of gamma detection probe positivity for intraoperative in situ identification of presumed abnormal (18)F-FDG-avid tissue sites in a manner that was independent of the specific type of gamma detection probe used. From among 52 patients undergoing appropriate in situ evaluation of presumed abnormal (18)F-FDG-avid tissue sites during (18)F-FDG-directed surgery using 6 available gamma detection probe systems, a total of 401 intraoperative gamma detection probe measurement sets of in situ counts per second measurements were cumulatively taken. For the 401 intraoperative gamma detection probe measurement sets, probe positivity was successfully met by the 1.5-to-1 ratiometric threshold criteria method in 150/401 instances (37.4%) and by the three-sigma statistical threshold criteria method in 259/401 instances (64.6%) (P < 0.001). Likewise, the three-sigma statistical threshold criteria method detected true positive results at target-to-background ratios much lower than the 1.5-to-1 target-to-background ratio of the 1.5-to-1 ratiometric threshold criteria method. The three-sigma statistical threshold criteria method was significantly better than the 1.5-to-1 ratiometric threshold criteria method for determination of gamma detection probe positivity for intraoperative in situ detection of presumed abnormal (18)F-FDG-avid tissue sites during radioguided oncologic surgery. This finding may be extremely important for reshaping the ongoing and future research and development of gamma detection probe systems that are necessary for optimizing the in situ detection of radioisotopes of higher-energy gamma photon emissions used during radioguided oncologic surgery.
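
    The two decision rules compared above can be stated compactly; the sketch below encodes them in the form suggested by the abstract (a target-to-background ratio of at least 1.5-to-1, and a target exceeding background by at least three Poisson standard deviations). Details such as counting times and background averaging are simplified assumptions, not taken from the paper.

      import math

      def ratiometric_positive(target_counts, background_counts, ratio=1.5):
          """1.5-to-1 ratiometric criterion: target-to-background ratio >= 1.5."""
          return target_counts >= ratio * background_counts

      def three_sigma_positive(target_counts, background_counts):
          """Three-sigma statistical criterion: target exceeds background by at least
          three standard deviations, treating the background as Poisson counts."""
          return target_counts >= background_counts + 3.0 * math.sqrt(background_counts)

      if __name__ == "__main__":
          for bkg, tgt in [(400, 500), (400, 650), (25, 45)]:
              print(f"background={bkg:4d}  target={tgt:4d}  "
                    f"ratiometric={ratiometric_positive(tgt, bkg)}  "
                    f"three-sigma={three_sigma_positive(tgt, bkg)}")

    The first example row (500 counts against a background of 400) illustrates the abstract's finding: the three-sigma rule can declare positivity at target-to-background ratios well below 1.5.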

  19. Effects of Active and Passive Hearing Protection Devices on Sound Source Localization, Speech Recognition, and Tone Detection.

    Directory of Open Access Journals (Sweden)

    Andrew D Brown

    Full Text Available Hearing protection devices (HPDs) such as earplugs offer to mitigate noise exposure and reduce the incidence of hearing loss among persons frequently exposed to intense sound. However, distortions of spatial acoustic information and reduced audibility of low-intensity sounds caused by many existing HPDs can make their use untenable in high-risk (e.g., military or law enforcement) environments where auditory situational awareness is imperative. Here we assessed (1) sound source localization accuracy using a head-turning paradigm, (2) speech-in-noise recognition using a modified version of the QuickSIN test, and (3) tone detection thresholds using a two-alternative forced-choice task. Subjects were 10 young normal-hearing males. Four different HPDs were tested (two active, two passive), including two new and previously untested devices. Relative to unoccluded (control) performance, all tested HPDs significantly degraded performance across tasks, although one active HPD slightly improved high-frequency tone detection thresholds and did not degrade speech recognition. Behavioral data were examined with respect to head-related transfer functions measured using a binaural manikin with and without tested HPDs in place. Data reinforce previous reports that HPDs significantly compromise a variety of auditory perceptual facilities, particularly sound localization due to distortions of high-frequency spectral cues that are important for the avoidance of front-back confusions.

  20. Detection of infarct size safety threshold for left ventricular ejection fraction impairment in acute myocardial infarction successfully treated with primary percutaneous coronary intervention.

    Science.gov (United States)

    Sciagrà, Roberto; Cipollini, Fabrizio; Berti, Valentina; Migliorini, Angela; Antoniucci, David; Pupi, Alberto

    2013-04-01

    In acute myocardial infarction (AMI) treated by primary percutaneous coronary intervention (PCI), there is a direct relationship between myocardial damage and consequent left ventricular (LV) functional impairment. It is however unclear whether there is a safety threshold below which infarct size does not significantly affect LV ejection fraction (EF). The aim of this study was to evaluate the relationship between infarct size and LVEF in AMI patients treated by successful PCI using a specific statistical approach to identify a possible safety threshold. Among patients with recent AMI submitted to perfusion gated single photon emission computed tomography (SPECT) to define the infarct size, the data of 427 subjects with sizable infarct size were considered. The relationship between infarct size and LVEF was analysed using a simple segmented regression (SSR) model and an iterative algorithm based on robust least squares (RLS) for parameter estimation. The RLS algorithm detected two break points in the SSR model, set at infarct size values of 11.0 and 51.5 %. Because the slope coefficients of the two extreme segments of the regression line were not significant, by constraining such segments to zero slope in the SSR model, the lower break point was identified at infarct size = 8 % and the upper one at 45 %. Using a rigorous statistical approach, it is possible to demonstrate that below a threshold of 8 % the infarct size apparently does not affect the LVEF and therefore a safety threshold could be set at this value. Furthermore, the same analysis suggests that the relationship between infarct size and LVEF impairment is lost for an infarct size > 45 %.
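
    The paper's break-point estimation uses an iterative robust-least-squares algorithm; the sketch below fits the same constrained shape (flat below the lower break, linear in between, flat above the upper break) by a simple grid search over candidate break points with ordinary least squares, on synthetic data only. Names and the grid-search strategy are assumptions made for illustration.

      import numpy as np

      def fit_two_break_plateau(x, y, grid):
          """Fit y = a + s * (clip(x, b1, b2) - b1): constant below b1, linear
          between b1 and b2, constant above b2. Returns (sse, b1, b2, coef)."""
          x, y = np.asarray(x, float), np.asarray(y, float)
          best = None
          for i, b1 in enumerate(grid):
              for b2 in grid[i + 1:]:
                  z = np.clip(x, b1, b2) - b1
                  A = np.column_stack([np.ones_like(z), z])
                  coef, *_ = np.linalg.lstsq(A, y, rcond=None)
                  sse = float(np.sum((A @ coef - y) ** 2))
                  if best is None or sse < best[0]:
                      best = (sse, b1, b2, coef)
          return best

      if __name__ == "__main__":
          rng = np.random.default_rng(3)
          size = rng.uniform(1, 60, 400)                         # infarct size, % of LV
          lvef = 55 - 0.9 * (np.clip(size, 8, 45) - 8) + rng.normal(0, 4, size.size)
          _, b1, b2, _ = fit_two_break_plateau(size, lvef, grid=np.arange(2.0, 58.0, 1.0))
          print(f"estimated break points: {b1:.0f}% and {b2:.0f}% of the left ventricle")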

  1. Structural Health Monitoring System Trade Space Analysis Tool with Consideration for Crack Growth, Sensor Degradation and a Variable Detection Threshold

    Science.gov (United States)

    2014-09-18

    [Record text not available: the indexed fragment contains only table-of-contents and list-of-figures entries, e.g. "Fatigue crack growth", "Probability of detection", "Fatigue crack growth simulation results for 10 runs", and "Linear regression fit of ln() vs. ln() data for SHM using PZT sensors (Kuhn, 2009)".]

  2. Detecting the leakage source of a reservoir using isotopes.

    Science.gov (United States)

    Yi, Peng; Yang, Jing; Wang, Yongdong; Mugwanezal, Vincent de Paul; Chen, Li; Aldahan, Ala

    2018-07-01

    A good monitoring method is vital for understanding the sources of a water reservoir leakage and planning for effective restoration. Here we present a combination of several tracers (222Rn, oxygen and hydrogen isotopes, anions and temperature) for identification of water leakage sources in the Pushihe pumped storage power station, located in Liaoning province, China. The results show an average 222Rn activity of 6843 Bq/m3 in the leakage water, 3034 Bq/m3 in the reservoir water, and 41,759 Bq/m3 in the groundwater. Considering that 222Rn activity in surface water is typically less than 5000 Bq/m3, the low average 222Rn activity in the leakage water suggests the reservoir water as the main source of water. Results of the oxygen and hydrogen isotopes show comparable ranges and values in the reservoir and the leakage water samples. However, an important contribution from the groundwater (up to 36%) was present in some samples from the bottom and upper parts of the underground powerhouse, while the leakage water from some other parts indicates the reservoir water as the dominant source. The isotopic finding suggests that the reservoir water is the main source of the leakage water, which is confirmed by the analysis of anions (nitrate, sulfate, and chloride) in the water samples. The combination of these tracer methods for studying dam water leakage improves the accuracy of identifying the source of leaks and provides a scientific reference for engineering solutions to ensure dam safety. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. Cortical Local Field Potential Power Is Associated with Behavioral Detection of Near-threshold Stimuli in the Rat Whisker System: Dissociation between Orbitofrontal and Somatosensory Cortices.

    Science.gov (United States)

    Rickard, Rachel E; Young, Andrew M J; Gerdjikov, Todor V

    2018-01-01

    There is growing evidence that ongoing brain oscillations may represent a key regulator of attentional processes and as such may contribute to behavioral performance in psychophysical tasks. The orbitofrontal cortex (OFC) appears to be involved in the top-down modulation of sensory processing; however, the specific contribution of ongoing OFC oscillations to perception has not been characterized. Here we used the rat whiskers as a model system to further characterize the relationship between cortical state and tactile detection. Head-fixed rats were trained to report the presence of a vibrotactile stimulus (frequency = 60 Hz, duration = 2 sec, deflection amplitude = 0.01-0.5 mm) applied to a single vibrissa. We calculated power spectra of local field potentials preceding the onset of near-threshold stimuli from microelectrodes chronically implanted in OFC and somatosensory cortex. We found a dissociation between slow oscillation power in the two regions in relation to detection probability: higher OFC, but not somatosensory, delta power was associated with increased detection probability. Furthermore, coherence between OFC and barrel cortex was reduced preceding successful detection. Consistent with the role of OFC in attention, our results identify a cortical network whose activity is differentially modulated before successful tactile detection.

  4. Fire management and research in the Kruger National Park, with suggestions on the detection of thresholds of potential concern

    Directory of Open Access Journals (Sweden)

    B.W. Van Wilgen

    1998-07-01

    Full Text Available This paper reviews the options for management of the savanna ecosystems of the Kruger National Park using fire. The major goals of management have shifted from attempts to use fire to achieve a stable vegetation composition, to one of recognising that savanna ecosystems are in constant flux. Fire is a major form of disturbance that helps to maintain a state of flux, and thus to conserve biodiversity. Three candidate approaches for fire management have been put forward: the lightning fire approach, the patch mosaic burning approach, and an approach based on the assessment of ecological criteria. These approaches differ in their underlying philosophies, but not necessarily in their outcomes, although this cannot be predicted with confidence. We propose, therefore, that patterns of fire frequency, season, intensity and spatial distribution be recorded and monitored, and that these patterns should serve as surrogate measures of biodiversity. Guidelines for the definition of thresholds of potential concern with regard to these patterns are discussed. The monitoring of both fire patterns and trends in plant and animal populations can be used to identify interactions between fire and the components of the ecosystem, and these in turn can be used to define a relevant research agenda. The role of management in monitoring and assessing fire patterns (previously regarded as a research responsibility) is emphasised. Convergence in the patterns of fire that result from the different management approaches could also serve as a basis for merging some or all of these approaches in order to simplify management.

  5. Measuring temporal summation in visual detection with a single-photon source.

    Science.gov (United States)

    Holmes, Rebecca; Victora, Michelle; Wang, Ranxiao Frances; Kwiat, Paul G

    2017-11-01

    Temporal summation is an important feature of the visual system which combines visual signals that arrive at different times. Previous research estimated complete summation to last for 100 ms for stimuli judged "just detectable." We measured the full range of temporal summation for much weaker stimuli using a new paradigm and a novel light source, developed in the field of quantum optics for generating small numbers of photons with precise timing characteristics and reduced variance in photon number. Dark-adapted participants judged whether a light was presented to the left or right of their fixation in each trial. In Experiment 1, stimuli contained a stream of photons delivered at a constant rate while the duration was systematically varied. Accuracy should increase with duration as long as the later photons can be integrated with the preceding ones into a single signal. The temporal integration window was estimated as the point at which performance no longer improved, and was found to be 650 ms on average. In Experiment 2, the duration of the visual stimuli was kept short (100 ms or less) while the rate of photons was varied to explore the efficiency of summation over the integration window compared to Experiment 1. There was some indication that temporal summation remains efficient over the integration window, although there is variation between individuals. The relatively long integration window measured in this study may be relevant to studies of the absolute visual threshold, i.e., tests of single-photon vision, where "single" photons should be separated by more than the integration window to avoid summation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Thermal detection thresholds of Aδ- and C-fibre afferents activated by brief CO2 laser pulses applied onto the human hairy skin.

    Directory of Open Access Journals (Sweden)

    Maxim Churyukanov

    Full Text Available Brief high-power laser pulses applied onto the hairy skin of the distal end of a limb generate a double sensation related to the activation of Aδ- and C-fibres, referred to as first and second pain. However, neurophysiological and behavioural responses related to the activation of C-fibres can be studied reliably only if the concomitant activation of Aδ-fibres is avoided. Here, using a novel CO2 laser stimulator able to deliver constant-temperature heat pulses through a feedback regulation of laser power by an online measurement of skin temperature at the target site, combined with an adaptive staircase algorithm using reaction time to distinguish between responses triggered by Aδ- and C-fibre input, we show that it is possible to estimate robustly and independently the thermal detection thresholds of Aδ-fibres (46.9±1.7°C) and C-fibres (39.8±1.7°C). Furthermore, we show that both thresholds are dependent on the skin temperature preceding and/or surrounding the test stimulus, indicating that the Aδ- and C-fibre afferents triggering the behavioural responses to brief laser pulses behave, at least partially, as detectors of a change in skin temperature rather than as pure level detectors. Most importantly, our results show that the difference in threshold between Aδ- and C-fibre afferents activated by brief laser pulses can be exploited to activate C-fibres selectively and reliably, provided that the rise in skin temperature generated by the laser stimulator is well-controlled. Our approach could constitute a tool to explore, in humans, the physiological and pathophysiological mechanisms involved in processing C- and Aδ-fibre input, respectively.

  7. Detecting New Pedestrian Facilities from VGI Data Sources

    Science.gov (United States)

    Zhong, S.; Xie, Z.

    2017-12-01

    Pedestrian facility (e.g. footbridge, pedestrian crossing and underground passage) information is an important basic data source for location based services (LBS) for pedestrians. However, keeping pedestrian facility information up to date is challenging because facilities change frequently. Previous pedestrian facility information collection and updating tasks have mainly been completed by highly trained specialists. This conventional approach has several disadvantages, such as high cost and a long update cycle. Volunteered Geographic Information (VGI) has proven efficient at providing new, free and fast-growing spatial data. Pedestrian trajectories, which can be seen as measurements of real pedestrian routes, are among the most valuable VGI data. Although the accuracy of individual trajectories is not high, an improvement in the quality of the road information can be achieved because of the large number of measurements. Thus, we develop a method for detecting new pedestrian facilities based on the current road network and pedestrian trajectories. Specifically, 1) outliers in the pedestrian trajectories are removed by analyzing speed, distance and direction, 2) a road network matching algorithm is developed for eliminating redundant trajectories, and 3) a space-time cluster algorithm is adopted for detecting new walking facilities. The performance of the method is evaluated with a series of experiments conducted on a part of the road network of Hefei and a large number of real pedestrian trajectories, and the results are verified using Tencent Street map. The results show that the proposed method is able to detect new pedestrian facilities from VGI data accurately. We believe that the proposed method provides an alternative way for general road data acquisition, and can improve the quality of LBS for pedestrians.
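
    As an illustration of the first step (outlier removal), the sketch below drops trajectory points that would imply an unrealistic walking speed; the distance- and direction-based checks mentioned in the abstract would follow the same pattern. The 3 m/s cap, the coordinates and the function names are assumptions for the example, not values from the paper.

      import math

      def filter_by_speed(track, max_speed_mps=3.0):
          """Drop trajectory points implying an unrealistic walking speed.
          `track` is a list of (lat, lon, unix_time) tuples in chronological order."""
          def haversine_m(p, q):
              R = 6371000.0
              phi1, phi2 = math.radians(p[0]), math.radians(q[0])
              dphi = math.radians(q[0] - p[0])
              dlmb = math.radians(q[1] - p[1])
              a = (math.sin(dphi / 2) ** 2
                   + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
              return 2 * R * math.asin(math.sqrt(a))

          kept = [track[0]]
          for point in track[1:]:
              dist = haversine_m(kept[-1], point)
              dt = point[2] - kept[-1][2]
              if dt > 0 and dist / dt <= max_speed_mps:
                  kept.append(point)
          return kept

      if __name__ == "__main__":
          track = [(31.8206, 117.2272, 0), (31.8207, 117.2273, 10),
                   (31.8300, 117.2400, 20),      # GPS glitch: ~1.6 km in 10 s
                   (31.8208, 117.2274, 30)]
          print(len(filter_by_speed(track)), "of", len(track), "points kept")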

  8. EEG recordings as a source for the detection of IRBD

    DEFF Research Database (Denmark)

    Bisgaard, Sissel; Duun-Christensen, Bolette; Kempfner, Lykke

    2015-01-01

    The purpose of this pilot study was to develop a supportive algorithm for the detection of idiopathic Rapid Eye-Movement (REM) sleep Behaviour Disorder (iRBD) from EEG recordings. iRBD is defined as REM sleep without atonia with no current sign of neurodegenerative disease, and is one...... of the earliest known biomarkers of Parkinson's Disease (PD). It is currently diagnosed by polysomnography (PSG), primarily based on EMG recordings during REM sleep. The algorithm was developed using data collected from 42 control subjects and 34 iRBD subjects. A feature was developed to represent high amplitude...

  9. Study allowing a decision-making from the activity calculation of a iodine 131 source detected in a dump at the incineration facility

    International Nuclear Information System (INIS)

    Houy, J.C.; Laugle, S.

    2000-01-01

    This study is divided into six parts: the first details the determination of the different thresholds used to make the decision; the second describes the gantry installed at the incineration facility; the third is devoted to the calibration of the gantry, detector by detector; the fourth concerns the theoretical determination of the count-rate ratios as a function of the source position in the truck; the fifth compares the theoretical calculations with the practical measurements; the sixth covers the determination of the source activity and position in the truck and the resulting decision-making. In conclusion, the alarm threshold of the Rennes incineration facility is set to twice the background level, without taking into account the position of the source in the domestic waste truck. The alarm can thus be triggered by a low-activity source located close to the truck wall while, conversely, a MBq source located in the middle of the truck may not be detected. The alarm decision should therefore be based on a calculation program which uses the detector readings to estimate the activity and the position of the source in the truck and to determine the decision-making for the management of these wastes. (N.C.)

  10. Is heat pain detection threshold associated with the area of secondary hyperalgesia following brief thermal sensitization? A study of healthy volunteers - design and detailed plan of analysis.

    Science.gov (United States)

    Hansen, Morten Sejer; Wetterslev, Jørn; Pipper, Christian Bressen; Asghar, Mohammad Sohail; Dahl, Jørgen Berg

    2016-05-31

    Several factors are believed to influence the development and experience of pain. Human clinical pain models are central tools in the investigation of basic physiologic pain responses, and can be applied in patients as well as in healthy volunteers. Each clinical pain model investigates different aspects of the human pain response. Brief thermal sensitization induces a mild burn injury, resulting in development of primary hyperalgesia at the site of stimulation, and secondary hyperalgesia surrounding the site of stimulation. Central sensitization is believed to play an important role in the development of secondary hyperalgesia; however, a possible association of secondary hyperalgesia following brief thermal sensitization and other heat pain models remains unknown. Our aim with this study is to investigate how closely the heat pain detection threshold (HPDT) is associated with the size of the area of secondary hyperalgesia induced by the clinical heat pain model: brief thermal sensitization. We aim to include 120 healthy participants. The participants will be tested on two separate study days with the following procedures: i) brief thermal sensitization, ii) heat pain detection threshold and iii) pain during thermal stimulation. Additionally, the participants will be tested with the Pain Catastrophizing Scale and Hospital Anxiety and Depression Scale questionnaires. We conducted statistical simulations based on data from our previous study, to estimate an empirical power of 99.9 % with α of 0.05. We define that an R(2) heat stimulation, and thus may be a biomarker of an individual's pain sensitivity. The number of studies investigating secondary hyperalgesia is growing; however, basic knowledge of the physiologic aspects of secondary hyperalgesia in humans is still incomplete. We therefore find it interesting to investigate whether HPDT, a known quantitative sensory test, is associated with areas of secondary hyperalgesia following brief thermal sensitization Clinicaltrials

  11. Highly sensitive colour change system within slight differences in metal ion concentrations based on homo-binuclear complex formation equilibrium for visual threshold detection of trace metal ions

    International Nuclear Information System (INIS)

    Mizuguchi, Hitoshi; Atsumi, Hiroshi; Hashimoto, Keigo; Shimada, Yasuhiro; Kudo, Yuki; Endo, Masatoshi; Yokota, Fumihiko; Shida, Junichi; Yotsuyanagi, Takao

    2004-01-01

    A new technique of expressing slight differences in metal ion concentrations by clear difference in colour was established for visual threshold detection of trace metal ions. The proposed method is based on rapid change of the mole fraction of the homo-binuclear complex (M2L) about a ligand in a narrow range of the total metal ion concentration (MT) in a small excess, in case the second metal ion is bound to the reagent molecule which can bind two metal ions. Theoretical simulations showed that the highly sensitive colour change within slight differences in metal ion concentrations would be realized under the following conditions: (i) both of the stepwise formation constants of complex species are sufficiently large; (ii) the stepwise formation constant of the 1:1 complex (ML) is larger than that of M2L; and (iii) the absorption spectrum of M2L is far apart from the other species in the visible region. Furthermore, the boundary of the colour region in MT would be readily controlled by the total ligand concentration (LT). Based on this theory, the proposed model was verified with the 3,3'-bis[bis(carboxymethyl)amino]methyl derivatives of sulphonephthalein dyes such as xylenol orange (XO), methylthymol blue (MTB), and methylxylenol blue (MXB), which can bind two metal ions at both ends of a π-electron conjugated system. The above-mentioned model was proved with the iron(III)-XO system at pH 2. In addition, MTB and MXB were suitable reagents for the visual threshold detection of trivalent metal ions such as iron(III), aluminium(III), gallium(III) and indium(III) ion in slightly acidic media. The proposed method has been applied successfully as a screening test for aluminium(III) ion in river water sampled at the downstream area of an old mine

  12. Reliability, standard error, and minimum detectable change of clinical pressure pain threshold testing in people with and without acute neck pain.

    Science.gov (United States)

    Walton, David M; Macdermid, Joy C; Nielson, Warren; Teasell, Robert W; Chiasson, Marco; Brown, Lauren

    2011-09-01

    Clinical measurement. To evaluate the intrarater, interrater, and test-retest reliability of an accessible digital algometer, and to determine the minimum detectable change in normal healthy individuals and a clinical population with neck pain. Pressure pain threshold testing may be a valuable assessment and prognostic indicator for people with neck pain. To date, most of this research has been completed using algometers that are too resource intensive for routine clinical use. Novice raters (physiotherapy students or clinical physiotherapists) were trained to perform algometry testing over 2 clinically relevant sites: the angle of the upper trapezius and the belly of the tibialis anterior. A convenience sample of normal healthy individuals and a clinical sample of people with neck pain were tested by 2 different raters (all participants) and on 2 different days (healthy participants only). Intraclass correlation coefficient (ICC), standard error of measurement, and minimum detectable change were calculated. A total of 60 healthy volunteers and 40 people with neck pain were recruited. Intrarater reliability was almost perfect (ICC = 0.94-0.97), interrater reliability was substantial to near perfect (ICC = 0.79-0.90), and test-retest reliability was substantial (ICC = 0.76-0.79). Smaller change was detectable in the trapezius compared to the tibialis anterior. This study provides evidence that novice raters can perform digital algometry with adequate reliability for research and clinical use in people with and without neck pain.
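
    The standard error of measurement and minimum detectable change reported above follow from the usual formulas SEM = SD·sqrt(1 - ICC) and MDC95 = 1.96·sqrt(2)·SEM; the snippet below applies them to illustrative numbers (the SD is an assumed value, not the study's data).

      import math

      def sem(sd, icc):
          """Standard error of measurement from the between-subject SD and the ICC."""
          return sd * math.sqrt(1.0 - icc)

      def mdc95(sem_value):
          """Minimum detectable change at the 95% confidence level."""
          return 1.96 * math.sqrt(2.0) * sem_value

      if __name__ == "__main__":
          # Illustrative values only: pressure pain threshold (kPa) over the upper
          # trapezius, with a test-retest ICC in the range reported by the study.
          sd_kpa, icc = 120.0, 0.78
          s = sem(sd_kpa, icc)
          print(f"SEM = {s:.1f} kPa, MDC95 = {mdc95(s):.1f} kPa")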

  13. Identification of threshold prostate specific antigen levels to optimize the detection of clinically significant prostate cancer by magnetic resonance imaging/ultrasound fusion guided biopsy.

    Science.gov (United States)

    Shakir, Nabeel A; George, Arvin K; Siddiqui, M Minhaj; Rothwax, Jason T; Rais-Bahrami, Soroush; Stamatakis, Lambros; Su, Daniel; Okoro, Chinonyerem; Raskolnikov, Dima; Walton-Diaz, Annerleim; Simon, Richard; Turkbey, Baris; Choyke, Peter L; Merino, Maria J; Wood, Bradford J; Pinto, Peter A

    2014-12-01

    Prostate specific antigen sensitivity increases with lower threshold values but with a corresponding decrease in specificity. Magnetic resonance imaging/ultrasound targeted biopsy detects prostate cancer more efficiently and of higher grade than standard 12-core transrectal ultrasound biopsy but the optimal population for its use is not well defined. We evaluated the performance of magnetic resonance imaging/ultrasound targeted biopsy vs 12-core biopsy across a prostate specific antigen continuum. We reviewed the records of all patients enrolled in a prospective trial who underwent 12-core transrectal ultrasound and magnetic resonance imaging/ultrasound targeted biopsies from August 2007 through February 2014. Patients were stratified by each of 4 prostate specific antigen cutoffs. The greatest Gleason score using either biopsy method was compared in and across groups as well as across the population prostate specific antigen range. Clinically significant prostate cancer was defined as Gleason 7 (4 + 3) or greater. Univariate and multivariate analyses were performed. A total of 1,003 targeted and 12-core transrectal ultrasound biopsies were performed, of which 564 diagnosed prostate cancer for a 56.2% detection rate. Targeted biopsy led to significantly more upgrading to clinically significant disease compared to 12-core biopsy. This trend increased more with increasing prostate specific antigen, specifically in patients with prostate specific antigen 4 to 10 and greater than 10 ng/ml. Prostate specific antigen 5.2 ng/ml or greater captured 90% of upgrading by targeted biopsy, corresponding to 64% of patients who underwent multiparametric magnetic resonance imaging and subsequent fusion biopsy. Conversely a greater proportion of clinically insignificant disease was detected by 12-core vs targeted biopsy overall. These differences persisted when controlling for potential confounders on multivariate analysis. Prostate cancer upgrading with targeted biopsy increases

  14. Optimization of CW-OSL parameters for improved dose detection threshold in Al2O3:C

    International Nuclear Information System (INIS)

    Rawat, N.S.; Dhabekar, B.; Kulkarni, M.S.; Muthe, K.P.; Mishra, D.R.; Soni, A.; Gupta, S.K.; Babu, D.A.R.

    2014-01-01

    Continuous wave optically stimulated luminescence (CW-OSL) is a relatively simple technique that offers good signal to noise ratio (SNR) and involves simple instrumentation. This study reports the influence and optimization of CW-OSL parameters on the minimum detectable dose (MDD) using the α-Al2O3:C phosphor. It is found that at a given stimulation intensity the MDD in CW-OSL mode depends on the signal integration time. At lower integration times the MDD is inferior; it improves for intermediate values, shows a plateau region and deteriorates as the integration time increases further. The MDD is found to be ∼127 μGy at 4 mW/cm2 stimulation intensity for an integration time of 0.1 s, which improves to ∼10.5 μGy for 60 s. At a stimulation intensity of 72 mW/cm2, the MDD is 37 μGy for an integration time of 60 s and improves significantly to 7 μGy for 1 s. - Highlights: • CW-OSL parameters are optimized to obtain the best SNR and MDD in Al2O3:C. • MDD is found to depend on signal integration time and stimulation intensity. • With time, MDD initially improves, stabilizes, then deteriorates. • At a given intensity, MDD is optimum for a certain range of integration time

  15. Nodule Detection in a Lung Region that's Segmented with Using Genetic Cellular Neural Networks and 3D Template Matching with Fuzzy Rule Based Thresholding

    International Nuclear Information System (INIS)

    Ozekes, Serhat; Osman, Onur; Ucan, N.

    2008-01-01

    The purpose of this study was to develop a new method for automated lung nodule detection in serial section CT images, using the characteristics of the 3D appearance of nodules that distinguish them from vessels. Lung nodules were detected in four steps. First, to reduce the number of regions of interest (ROIs) and the computation time, the lung regions of the CTs were segmented using Genetic Cellular Neural Networks (G-CNN). Then, for each lung region, ROIs were specified using an 8-directional search; +1 or -1 values were assigned to each voxel. The 3D ROI image was obtained by combining all the 2-dimensional (2D) ROI images. A 3D template was created to find nodule-like structures in the 3D ROI image. Convolution of the 3D ROI image with the proposed template strengthens shapes that are similar to the template and weakens the other ones. Finally, fuzzy rule based thresholding was applied and the ROIs were found. To test the system's efficiency, we used 16 cases with a total of 425 slices, taken from the Lung Image Database Consortium (LIDC) dataset. The computer aided diagnosis (CAD) system achieved 100% sensitivity with 13.375 FPs per case when the nodule thickness was greater than or equal to 5.625 mm. Our results indicate that the detection performance of our algorithm is satisfactory, and this may well improve the performance of computer aided detection of lung nodules
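
    The convolution-and-threshold step described above can be illustrated with a toy example: a zero-mean spherical template strengthens blob-like (nodule-like) structures more than elongated (vessel-like) ones. The template shape and the crisp cut below stand in for the paper's 3D template and fuzzy rule based thresholding; names and sizes are assumptions.

      import numpy as np
      from scipy.signal import fftconvolve

      def nodule_response(volume, radius_vox=3):
          """Convolve a CT sub-volume with a zero-mean spherical template so that
          compact spherical structures give a stronger response than tubes."""
          r = radius_vox
          zz, yy, xx = np.indices((2 * r + 1,) * 3) - r
          template = (np.sqrt(zz**2 + yy**2 + xx**2) <= r).astype(float)
          template -= template.mean()                  # zero mean: flat regions score ~0
          return fftconvolve(volume, template, mode="same")

      if __name__ == "__main__":
          vol = np.zeros((40, 40, 40))
          zz, yy, xx = np.indices(vol.shape)
          vol[(zz - 20)**2 + (yy - 12)**2 + (xx - 12)**2 <= 9] = 1.0     # nodule-like sphere
          vol[(np.abs(zz - 20) <= 1) & (np.abs(yy - 28) <= 1)] = 1.0     # vessel-like tube
          resp = nodule_response(vol)
          print("strongest response at voxel", np.unravel_index(resp.argmax(), resp.shape))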

  16. Coded moderator approach for fast neutron source detection and localization at standoff

    Energy Technology Data Exchange (ETDEWEB)

    Littell, Jennifer [Department of Nuclear Engineering, University of Tennessee, 305 Pasqua Engineering Building, Knoxville, TN 37996 (United States); Lukosi, Eric, E-mail: elukosi@utk.edu [Department of Nuclear Engineering, University of Tennessee, 305 Pasqua Engineering Building, Knoxville, TN 37996 (United States); Institute for Nuclear Security, University of Tennessee, 1640 Cumberland Avenue, Knoxville, TN 37996 (United States); Hayward, Jason; Milburn, Robert; Rowan, Allen [Department of Nuclear Engineering, University of Tennessee, 305 Pasqua Engineering Building, Knoxville, TN 37996 (United States)

    2015-06-01

    Considering the need for directional sensing at standoff for some security applications and scenarios where a neutron source may be shielded by high Z material that nearly eliminates the source gamma flux, this work focuses on investigating the feasibility of using thermal neutron sensitive boron straw detectors for fast neutron source detection and localization. We utilized MCNPX simulations to demonstrate that, through surrounding the boron straw detectors by a HDPE coded moderator, a source-detector orientation-specific response enables potential 1D source localization in a high neutron detection efficiency design. An initial test algorithm has been developed in order to confirm the viability of this detector system's localization capabilities which resulted in identification of a 1 MeV neutron source with a strength equivalent to 8 kg WGPu at 50 m standoff within ±11°.

  17. Process Model Improvement for Source Code Plagiarism Detection in Student Programming Assignments

    Science.gov (United States)

    Kermek, Dragutin; Novak, Matija

    2016-01-01

    In programming courses there are various ways in which students attempt to cheat. The most commonly used method is copying source code from other students and making minimal changes in it, like renaming variable names. Several tools like Sherlock, JPlag and Moss have been devised to detect source code plagiarism. However, for larger student…

  18. Goaf water detection using the grounded electrical source airborne transient electromagnetic system

    Science.gov (United States)

    Li, D.; Ji, Y.; Guan, S.; Wu, Y.; Wang, A.

    2017-12-01

    To detect the geoelectric characteristics of goaf water, the grounded electrical source airborne transient electromagnetic (GREATEM) system (developed by Jilin University, China) is applied to goaf water detection because of its advantages of considerable prospecting depth, lateral resolution and detection efficiency. To test the GREATEM system for goaf water detection, an experimental survey was conducted at the Qinshui coal mine (Shanxi province, China). After data acquisition, noise reduction and inversion, resistivity profiles of the survey area are presented. The results agree well with the investigation information provided by the Shanxi Coal Geology Geophysical Surveying Exploration Institute (China), confirming that the GREATEM system is an effective technique for resistivity detection of goaf water.

  19. Early warning system for detection of protozoal contamination of source waters

    DEFF Research Database (Denmark)

    Al-Sabi, Mohammad Nafi Solaiman; Mogensen, Claus; Berg, Tommy W.

    2012-01-01

    Ensuring water quality is an ever increasing important issue world-wide. Currently, detection of protozoa in drinking water is a costly and time consuming process. We have developed an online, real-time sensor for detection of Cryptosporidium and Giardia spp. in a range of source waters. The novel...

  20. Electrophysiological Correlates of the Threshold to Detection of Passive Motion: An Investigation in Professional Volleyball Athletes with and without Atrophy of the Infraspinatus Muscle

    Science.gov (United States)

    Salles, José Inácio; Cossich, Victor Rodrigues Amaral; Amaral, Marcus Vinicius; Monteiro, Martim T.; Cagy, Maurício; Motta, Geraldo; Velasques, Bruna; Piedade, Roberto; Ribeiro, Pedro

    2013-01-01

    The goal of the present study is to compare the electrophysiological correlates of the threshold to detection of passive motion (TTDPM) among three groups: healthy individuals (control group), professional volleyball athletes with atrophy of the infraspinatus muscle on the dominant side, and athletes with no shoulder pathologies. More specifically, the study aims at assessing the effects of infraspinatus muscle atrophy on the cortical representation of the TTDPM. A proprioception testing device (PTD) was used to measure the TTDPM. The device passively moved the shoulder and participants were instructed to respond as soon as movement was detected (TTDPM) by pressing a button switch. Response latency was established as the delay between the stimulus (movement) and the response (button press). Electroencephalographic (EEG) and electromyographic (EMG) activities were recorded simultaneously. An analysis of variance (ANOVA) and subsequent post hoc tests indicated a significant difference in latency between the group of athletes without the atrophy when compared both to the group of athletes with the atrophy and to the control group. Furthermore, distinct patterns of cortical activity were observed in the three experimental groups. The results suggest that systematically trained motor abilities, as well as the atrophy of the infraspinatus muscle, change the cortical representation of the different stages of proprioceptive information processing and, ultimately, the cortical representation of the TTDPM. PMID:23484136

  1. Adaptive thresholding with inverted triangular area for real-time detection of the heart rate from photoplethysmogram traces on a smartphone.

    Science.gov (United States)

    Jiang, Wen Jun; Wittek, Peter; Zhao, Li; Gao, Shi Chao

    2014-01-01

    Photoplethysmogram (PPG) signals acquired by smartphone cameras are weaker than those acquired by dedicated pulse oximeters. Furthermore, the signals have lower sampling rates, have notches in the waveform and are more severely affected by baseline drift, leading to specific morphological characteristics. This paper introduces a new feature, the inverted triangular area, to address these specific characteristics. The new feature enables real-time adaptive waveform detection using an algorithm of linear time complexity. It can also recognize notches in the waveform and it is inherently robust to baseline drift. An implementation of the algorithm on Android is available for free download. We collected data from 24 volunteers and compared our algorithm in peak detection with two competing algorithms designed for PPG signals, Incremental-Merge Segmentation (IMS) and Adaptive Thresholding (ADT). A sensitivity of 98.0% and a positive predictive value of 98.8% were obtained, which were 7.7% higher than the IMS algorithm in sensitivity, and 8.3% higher than the ADT algorithm in positive predictive value. The experimental results confirmed the applicability of the proposed method.
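
    The abstract does not spell out the inverted triangular area feature; the sketch below shows one plausible reading of it: the signed area of the triangle formed by each sample and its neighbours a fixed number of samples away, which becomes strongly negative at sharp pulse peaks and is unaffected by linear baseline drift. The window width, the adaptive threshold and the simulated waveform are assumptions made for illustration, not the authors' definitions.

      import numpy as np

      def triangular_area(signal, width=10):
          """Signed area of the triangle through (i-width, x[i-width]), (i, x[i]),
          (i+width, x[i+width]); a sharp local maximum gives a large negative
          ("inverted") area, and a purely linear trend gives zero."""
          x = np.asarray(signal, float)
          area = np.zeros(x.size)
          i = np.arange(width, x.size - width)
          area[i] = 0.5 * (width * (x[i + width] - x[i - width])
                           - 2.0 * width * (x[i] - x[i - width]))
          return area

      def detect_peaks(signal, width=10):
          """Flag samples whose inverted triangular area falls below an adaptive
          (mean - 2*std) threshold and is a local minimum of the feature."""
          area = triangular_area(signal, width)
          thr = area.mean() - 2.0 * area.std()
          return [i for i in range(1, area.size - 1)
                  if area[i] < thr and area[i] <= area[i - 1] and area[i] <= area[i + 1]]

      if __name__ == "__main__":
          fs = 30                                             # smartphone camera frame rate, Hz
          t = np.arange(0, 10, 1 / fs)
          ppg = np.sin(2 * np.pi * 1.2 * t) ** 21 + 0.05 * t  # peaky pulses + baseline drift
          print("beats detected:", len(detect_peaks(ppg)), "(12 pulses simulated)")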

  2. Global dust sources detection using MODIS Deep Blue Collection 6 aerosol products

    Science.gov (United States)

    Pérez García-Pando, C.; Ginoux, P. A.

    2015-12-01

    Our understanding of the global dust cycle is limited by a dearth of information about dust sources, especially small-scale features which could account for a large fraction of global emissions. Remote sensing sensors are the most useful tool to locate dust sources. These sensors include microwave and visible channels, and lidar. On the global scale, major dust source regions have been identified using polar orbiting satellite instruments. The MODIS Deep Blue algorithm has been particularly useful for detecting small-scale sources such as floodplains, alluvial fans, rivers, and wadis, as well as for identifying anthropogenic sources from agriculture. The recent release of the Collection 6 MODIS aerosol products allows dust source detection to be extended to all land surfaces, which is quite useful for identifying mid- to high-latitude dust sources and for detecting not only dust from agriculture but also fugitive dust from transport and industrial activities. This presentation will give an overview of the advantages and drawbacks of using MODIS Deep Blue for dust detection, compared to other instruments (polar orbiting and geostationary). The results of Collection 6 with a new dust screening will be compared against AERONET. Applications to long range transport of anthropogenic dust will be presented.

  3. Automated detection of extended sources in radio maps: progress from the SCORPIO survey

    Science.gov (United States)

    Riggi, S.; Ingallinera, A.; Leto, P.; Cavallaro, F.; Bufano, F.; Schillirò, F.; Trigilio, C.; Umana, G.; Buemi, C. S.; Norris, R. P.

    2016-08-01

    Automated source extraction and parametrization represents a crucial challenge for the next-generation radio interferometer surveys, such as those performed with the Square Kilometre Array (SKA) and its precursors. In this paper, we present a new algorithm, called CAESAR (Compact And Extended Source Automated Recognition), to detect and parametrize extended sources in radio interferometric maps. It is based on a pre-filtering stage, allowing image denoising, compact source suppression and enhancement of diffuse emission, followed by an adaptive superpixel clustering stage for final source segmentation. A parametrization stage provides source flux information and a wide range of morphology estimators for post-processing analysis. We developed CAESAR in a modular software library, also including different methods for local background estimation and image filtering, along with alternative algorithms for both compact and diffuse source extraction. The method was applied to real radio continuum data collected at the Australian Telescope Compact Array (ATCA) within the SCORPIO project, a pathfinder of the Evolutionary Map of the Universe (EMU) survey at the Australian Square Kilometre Array Pathfinder (ASKAP). The source reconstruction capabilities were studied over different test fields in the presence of compact sources, imaging artefacts and diffuse emission from the Galactic plane and compared with existing algorithms. When compared to a human-driven analysis, the designed algorithm was found capable of detecting known target sources and regions of diffuse emission, outperforming alternative approaches over the considered fields.

  4. Detection of fission signatures induced by a low-energy neutron source

    International Nuclear Information System (INIS)

    Ocherashvili, A.; Becka, A.; Mayorovb, V.; Roesgen, E.; Crochemoreb, J.-M.; Mosconi, M.; Pedersen, B.; Heger, C.

    2015-01-01

    We present a method for the detection of special nuclear materials (SNM) in shielded containers which is both sensitive and applicable under field conditions. The method uses an external pulsed neutron source to induce fission in SNM and subsequent detection of the fast prompt fission neutrons. The detectors surrounding the container under investigation are liquid scintillation detectors able to distinguish gamma rays from fast neutrons by means of the pulse shape discrimination method (PSD). One advantage of these detectors, besides the ability for PSD analysis, is that the analogue signal from a detection event is of very short duration (typically a few tens of nanoseconds). This allows the use of very short coincidence gates for the detection of the prompt fission neutrons in multiple detectors while benefiting from a low accidental (background) coincidence rate, yielding a low detection limit. Another principal advantage of this method derives from the fact that the external neutron source is pulsed. By proper time gating, the interrogation can be conducted with epithermal and thermal source neutrons only. These source neutrons do not appear in the fast neutron signal following the PSD analysis, thus providing a fundamental method for separating the interrogating source neutrons from the sample response in the form of fast fission neutrons. The paper describes laboratory tests with a configuration of eight detectors in the Pulsed Neutron Interrogation Test Assembly (PUNITA). The sensitivity of the coincidence signal to fissile mass is investigated for different sample configurations and interrogation regimes.

  5. [Demand for and the Development of Detection Techniques for Source of Schistosome Infection in China].

    Science.gov (United States)

    Wang, Shi-ping; He, Xin; Zhou, Yun-fei

    2015-12-01

    Schistosomiasis is a zoonotic parasitosis that severely impairs human health. Rapid detection of infection sources is key to the control of schistosomiasis. With the effective control of schistosomiasis in China, the detection techniques for infection sources have also developed. The rate and intensity of infection among humans and livestock have decreased significantly in China, as the control program has entered the transmission control stage in most of the endemic areas. In this situation, traditional etiological diagnostic techniques and common immunological methods cannot provide rapid detection of infection sources of schistosomiasis. Instead, detection methods are needed with higher sensitivity, specificity and stability, while being less time-consuming, more convenient and less costly. In recent years, many improved or novel detection methods have been applied to the epidemiological surveillance of schistosomiasis, such as the automatic scanning microscopic image acquisition system, PCR-ELISA, immunosensors, loop-mediated isothermal amplification, etc. The development of new monitoring techniques can facilitate rapid detection of schistosome infection sources in endemic areas.

  6. Determination of the detection limit and decision threshold for ionizing radiation measurements. Part 2: Fundamentals and application to counting measurements with the influence of sample treatment

    International Nuclear Information System (INIS)

    2000-01-01

    This part of ISO 11929 addresses the field of ionizing radiation measurements in which events (in particular pulses) on samples are counted after treating them (e.g. aliquotation, solution, enrichment, separation). It considers, besides the random character of radioactive decay and of pulse counting, all other influences arising from sample treatment, (e.g. weighing, enrichment, calibration or the instability of the test setup). ISO 11929 consists of the following parts, under the general title Determination of the detection limit and decision threshold for ionizing radiation measurements: Part 1: Fundamentals and application to counting measurements without the influence of sample treatment; Part 2: Fundamentals and application to counting measurements with the influence of sample treatment; Part 3: Fundamentals and application to counting measurements by high resolution gamma spectrometry, without the influence of sample treatment; Part 4: Fundamentals and application to measurements by use of linear scale analogue ratemeters, without the influence of sample treatment. This part of ISO 11929 was prepared in parallel with other International Standards prepared by WG 2 (now WG 17): ISO 11932:1996, Activity measurements of solid materials considered for recycling, re-use or disposal as non radioactive waste, and ISO 11929-1, ISO 11929-3 and ISO 11929-4 and is, consequently, complementary to these documents

  7. Detection of Point Sources on Two-Dimensional Images Based on Peaks

    Directory of Open Access Journals (Sweden)

    R. B. Barreiro

    2005-09-01

    Full Text Available This paper considers the detection of point sources in two-dimensional astronomical images. The detection scheme we propose is based on peak statistics. We discuss the example of the detection of far galaxies in cosmic microwave background experiments throughout the paper, although the method we present is totally general and can be used in many other fields of data analysis. We consider sources with a Gaussian profile—that is, a fair approximation of the profile of a point source convolved with the detector beam in microwave experiments—on a background modeled by a homogeneous and isotropic Gaussian random field characterized by a scale-free power spectrum. Point sources are enhanced with respect to the background by means of linear filters. After filtering, we identify local maxima and apply our detection scheme, a Neyman-Pearson detector that defines our region of acceptance based on the a priori pdf of the sources and the ratio of number densities. We study the different performances of some linear filters that have been used in this context in the literature: the Mexican hat wavelet, the matched filter, and the scale-adaptive filter. We consider as well an extension to two dimensions of the biparametric scale-adaptive filter (BSAF). The BSAF depends on two parameters which are determined by maximizing the number density of real detections while fixing the number density of spurious detections. For our detection criterion the BSAF outperforms the other filters in the interesting case of white noise.
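
    A minimal version of the filtering-plus-peak-selection pipeline discussed above, restricted to the simplest case: Gaussian profiles on white noise, for which the matched filter is just the normalised profile, and a plain n-sigma cut instead of the Neyman-Pearson rule on peak statistics. Names and parameter values are assumptions made for the example.

      import numpy as np
      from scipy.signal import fftconvolve

      def matched_filter_detect(image, beam_sigma, n_sigma=4.0):
          """Matched-filter detection of Gaussian point sources on white noise."""
          half = int(4 * beam_sigma)
          y, x = np.mgrid[-half:half + 1, -half:half + 1]
          psf = np.exp(-(x**2 + y**2) / (2.0 * beam_sigma**2))
          psf /= np.sqrt((psf**2).sum())                 # unit-norm template

          filtered = fftconvolve(image, psf, mode="same")
          noise = np.median(np.abs(filtered)) / 0.6745   # robust noise estimate
          return filtered, np.argwhere(filtered > n_sigma * noise)

      if __name__ == "__main__":
          rng = np.random.default_rng(4)
          img = rng.normal(0.0, 1.0, (256, 256))
          yy, xx = np.mgrid[0:256, 0:256]
          for cy, cx, amp in [(60, 80, 1.2), (180, 200, 1.5)]:            # faint sources
              img += amp * np.exp(-((yy - cy)**2 + (xx - cx)**2) / (2 * 3.0**2))
          filt, hits = matched_filter_detect(img, beam_sigma=3.0)
          print(f"{len(hits)} pixels above threshold; brightest at "
                f"{np.unravel_index(filt.argmax(), filt.shape)}")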

  8. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources.

    Science.gov (United States)

    Klumpp, John; Brandl, Alexander

    2015-03-01

    A particle counting and detection system is proposed that searches for elevated count rates in multiple energy regions simultaneously. The system analyzes time-interval data (e.g., time between counts), as this was shown to be a more sensitive technique for detecting low count rate sources compared to analyzing counts per unit interval (Luo et al. 2013). Two distinct versions of the detection system are developed. The first is intended for situations in which the sample is fixed and can be measured for an unlimited amount of time. The second version is intended to detect sources that are physically moving relative to the detector, such as a truck moving past a fixed roadside detector or a waste storage facility under an airplane. In both cases, the detection system is expected to be active indefinitely; i.e., it is an online detection system. Both versions of the multi-energy detection systems are compared to their respective gross count rate detection systems in terms of Type I and Type II error rates and sensitivity.
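
    As a rough illustration of why time-interval data can react faster than fixed-interval counts, the sketch below (not the authors' algorithm) flags an elevated rate whenever the sum of the last k inter-arrival times falls below an Erlang quantile computed for a known background rate; the rate, k and false-alarm level are placeholders:

        import numpy as np
        from scipy.stats import gamma

        background_rate = 5.0   # assumed mean background rate (counts/s), placeholder
        k = 16                  # number of most recent inter-arrival times tested
        alpha = 1e-3            # per-test false-alarm probability

        # Under a pure Poisson background the sum of k inter-arrival times is
        # Erlang(k, 1/background_rate); an improbably small sum signals a source.
        threshold = gamma.ppf(alpha, a=k, scale=1.0 / background_rate)

        def elevated_rate(event_times):
            """True if the last k intervals are too short to be background alone."""
            if len(event_times) < k + 1:
                return False
            intervals = np.diff(event_times[-(k + 1):])
            return intervals.sum() < threshold

        # Example: background events followed by a stronger source passing by.
        rng = np.random.default_rng(1)
        bkg = np.cumsum(rng.exponential(1.0 / background_rate, 200))
        src = bkg[-1] + np.cumsum(rng.exponential(1.0 / 25.0, 50))
        print(elevated_rate(np.concatenate([bkg, src])))   # expected: True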

  9. 21-cm signature of the first sources in the Universe: prospects of detection with SKA

    Science.gov (United States)

    Ghara, Raghunath; Choudhury, T. Roy; Datta, Kanan K.

    2016-07-01

    Currently several low-frequency experiments are being planned to study the nature of the first stars using the redshifted 21-cm signal from the cosmic dawn and Epoch of Reionization. Using a one-dimensional radiative transfer code, we model the 21-cm signal pattern around the early sources for different source models, i.e. the metal-free Population III (PopIII) stars, primordial galaxies consisting of Population II (PopII) stars, mini-QSOs and high-mass X-ray binaries (HMXBs). We investigate the detectability of these sources by comparing the 21-cm visibility signal with the system noise appropriate for a telescope like the SKA1-low. Upon integrating the visibility around a typical source over all baselines and over a frequency interval of 16 MHz, we find that it will be possible to make a ~9σ detection of isolated sources like PopII galaxies, mini-QSOs and HMXBs at z ~ 15 with the SKA1-low in 1000 h. The exact value of the signal-to-noise ratio (SNR) will depend on the source properties, in particular on the mass and age of the source and the escape fraction of ionizing photons. The predicted SNR decreases with increasing redshift. We provide simple scaling laws to estimate the SNR for different values of the parameters which characterize the source and the surrounding medium. We also argue that it will be possible to achieve an SNR ~9 even in the presence of the astrophysical foregrounds by subtracting out the frequency-independent component of the observed signal. These calculations will be useful in planning 21-cm observations to detect the first sources.

  10. Calculated Absolute Detection Efficiencies of Cylindrical NaI(Tl) Scintillation Crystals for Aqueous Spherical Sources

    Energy Technology Data Exchange (ETDEWEB)

    Strindehag, O; Tollander, B

    1968-08-15

    Calculated values of the absolute total detection efficiencies of cylindrical scintillation crystals viewing spherical sources of various sizes are presented. The calculation is carried out for 2 x 2 inch and 3 x 3 inch NaI(Tl) crystals and for sources whose radii are 1/4, 1/2, 3/4 and 1 times the crystal radius. Source-detector distances of 5-20 cm and gamma energies in the range 0.1 - 5 MeV are considered. The correction factor for absorption in the sample container wall and in the detector housing is derived and calculated for a practical case.
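
    A rough Monte Carlo sketch of the geometric part of such a calculation: points are sampled uniformly inside the spherical source, emission directions are sampled isotropically, and rays crossing the front face of a coaxial cylindrical crystal are scored. The dimensions and the constant intrinsic efficiency factor below are placeholders, not the values tabulated in the report:

        import numpy as np

        rng = np.random.default_rng(2)
        R_det = 3.81    # crystal radius, cm (nominal 3 x 3 inch NaI(Tl)), illustrative
        r_src = 1.9     # source sphere radius, cm
        d = 10.0        # source centre to detector front face distance, cm
        eps_intr = 0.6  # assumed constant intrinsic efficiency (placeholder)
        N = 1_000_000

        # Uniform points inside the source sphere (centre at origin, face at z = d).
        u = rng.random(N) ** (1.0 / 3.0)
        ct = rng.uniform(-1, 1, N); ph = rng.uniform(0, 2 * np.pi, N)
        st = np.sqrt(1 - ct**2)
        px, py, pz = r_src * u * st * np.cos(ph), r_src * u * st * np.sin(ph), r_src * u * ct

        # Isotropic emission directions.
        c = rng.uniform(-1, 1, N); a = rng.uniform(0, 2 * np.pi, N)
        s = np.sqrt(1 - c**2)
        dx, dy, dz = s * np.cos(a), s * np.sin(a), c

        # Photons heading towards the detector plane that cross it inside the crystal face.
        fwd = dz > 0
        t = np.where(fwd, (d - pz) / np.where(fwd, dz, 1.0), 0.0)
        hit = fwd & ((px + t * dx) ** 2 + (py + t * dy) ** 2 <= R_det**2)

        geom = hit.mean()
        print(f"geometric efficiency ~ {geom:.4f}, total ~ {eps_intr * geom:.4f}")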

  11. Source detection at 100 meter standoff with a time-encoded imaging system

    International Nuclear Information System (INIS)

    Brennan, J.; Brubaker, E.; Gerling, M.; Marleau, P.; Monterial, M.

    2017-01-01

    Here, we present the design, characterization, and testing of a laboratory prototype radiological search and localization system. The system, based on time-encoded imaging, uses the attenuation signature of neutrons in time, induced by the geometrical layout and motion of the system. We have demonstrated the ability to detect a ~1 mCi 252Cf radiological source at 100 m standoff with 90% detection efficiency and 10% false positives against background in 12 min. Correspondingly, the same detection efficiency is met in 15 s at a 40 m standoff, and in 1.2 s at a 20 m standoff.

  12. The resolution of point sources of light as analyzed by quantum detection theory

    Science.gov (United States)

    Helstrom, C. W.

    1972-01-01

    The resolvability of point sources of incoherent light is analyzed by quantum detection theory in terms of two hypothesis-testing problems. In the first, the observer must decide whether there are two sources of equal radiant power at given locations, or whether there is only one source of twice the power located midway between them. In the second problem, either one, but not both, of two point sources is radiating, and the observer must decide which it is. The decisions are based on optimum processing of the electromagnetic field at the aperture of an optical instrument. In both problems the density operators of the field under the two hypotheses do not commute. The error probabilities, determined as functions of the separation of the points and the mean number of received photons, characterize the ultimate resolvability of the sources.
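
    For reference, the minimum attainable error probability in such a binary test between density operators rho_0 and rho_1 with prior probabilities p_0 and p_1 is given by the Helstrom bound, quoted here as the standard result of quantum detection theory rather than as a formula taken from this abstract:

        P_e^{\min} \;=\; \tfrac{1}{2}\Bigl(1 - \bigl\lVert p_1\rho_1 - p_0\rho_0 \bigr\rVert_1\Bigr),
        \qquad \lVert A \rVert_1 = \operatorname{Tr}\sqrt{A^\dagger A},

    which for equal priors reduces to P_e^min = (1/2)(1 - (1/2)||rho_1 - rho_0||_1); the error probabilities discussed in the abstract are of this optimal type, evaluated for the field states under the two hypotheses.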

  13. Resolution of point sources of light as analyzed by quantum detection theory.

    Science.gov (United States)

    Helstrom, C. W.

    1973-01-01

    The resolvability of point sources of incoherent thermal light is analyzed by quantum detection theory in terms of two hypothesis-testing problems. In the first, the observer must decide whether there are two sources of equal radiant power at given locations, or whether there is only one source of twice the power located midway between them. In the second problem, either one, but not both, of two point sources is radiating, and the observer must decide which it is. The decisions are based on optimum processing of the electromagnetic field at the aperture of an optical instrument. In both problems the density operators of the field under the two hypotheses do not commute. The error probabilities, determined as functions of the separation of the points and the mean number of received photons, characterize the ultimate resolvability of the sources.

  14. Atmospheric Pressure Chemical Ionization Sources Used in The Detection of Explosives by Ion Mobility Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Waltman, Melanie J. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    2010-05-01

    Explosives detection is a necessary and wide spread field of research. From large shipping containers to airline luggage, numerous items are tested for explosives every day. In the area of trace explosives detection, ion mobility spectrometry (IMS) is the technique employed most often because it is a quick, simple, and accurate way to test many items in a short amount of time. Detection by IMS is based on the difference in drift times of product ions through the drift region of an IMS instrument. The product ions are created when the explosive compounds, introduced to the instrument, are chemically ionized through interactions with the reactant ions. The identity of the reactant ions determines the outcomes of the ionization process. This research investigated the reactant ions created by various ionization sources and looked into ways to manipulate the chemistry occurring in the sources.

  15. Research and Development of Landmine Detection System by a Compact Fusion Neutron Source

    International Nuclear Information System (INIS)

    Yoshikawa, Kiyoshi; Masuda, Kai; Toku, Hisayuki; Nagasaki, Kazunobu; Mizutani, Toshiyuki; Takamatsu, Teruhisa; Imoto, Masaki; Yamamoto, Yasushi; Ohnishi, Masami; Osawa, Hodaka; Hotta, Eiki; Kohno, Toshiyuki; Okino, Akitoshi; Watanabe, Masato; Yamauchi, Kunihito; Yuura, Morimasa; Shiroya, Seiji; Misawa, Tsuyoshi; Mori, Takamasa

    2005-01-01

    Current results are described on the research and development of an advanced anti-personnel landmine detection system using a compact discharge-type fusion neutron source called IECF (Inertial-Electrostatic Confinement Fusion). Landmines are to be identified through backscattering of neutrons and through specific-energy capture γ-rays from hydrogen and nitrogen atoms in the landmine explosives. For this purpose, improvements in the IECF were made by various methods to achieve a drastic enhancement of neutron yields of more than 10^8 n/s in pulsed operation. This required R and D on the power source, as well as analysis of envisaged detection systems with multi-sensors. The results suggest promising and practical features for humanitarian landmine detection, particularly in Afghanistan.

  16. An automated multi-scale network-based scheme for detection and location of seismic sources

    Science.gov (United States)

    Poiata, N.; Aden-Antoniow, F.; Satriano, C.; Bernard, P.; Vilotte, J. P.; Obara, K.

    2017-12-01

    We present a recently developed method - BackTrackBB (Poiata et al. 2016) - that allows imaging of energy radiation from different seismic sources (e.g., earthquakes, LFEs, tremors) in different tectonic environments using continuous seismic records. The method exploits multi-scale frequency-selective coherence in the wave field recorded by regional seismic networks or local arrays. The detection and location scheme is based on space-time reconstruction of the seismic sources through an imaging function built from the sum of station-pair time-delay likelihood functions, projected onto theoretical 3D time-delay grids. This imaging function is interpreted as the location likelihood of the seismic source. A signal pre-processing step constructs a multi-band statistical representation of the non-stationary time series by means of higher-order statistics or energy envelope characteristic functions. This signal processing is designed to detect signal transients in time - of different scales and a priori unknown predominant frequency - potentially associated with a variety of sources (e.g., earthquakes, LFEs, tremors), and to improve the performance and robustness of the detection-and-location step. The initial detection and location, based on a single-phase analysis with the P- or S-phase only, can then be improved recursively in a station selection scheme. This scheme - exploiting the 3-component records - makes use of P- and S-phase characteristic functions, extracted after a polarization analysis of the event waveforms, and combines the single-phase imaging functions with the S-P differential imaging functions. The performance of the method is demonstrated here in different tectonic environments: (1) analysis of the one-year-long precursory phase of the 2014 Iquique earthquake in Chile; (2) detection and location of tectonic tremor sources and low-frequency earthquakes during multiple episodes of tectonic tremor activity in southwestern Japan.
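
    A minimal sketch of the back-projection idea underlying such a scheme (a simplified stand-in, not the BackTrackBB implementation): characteristic functions from several stations are shifted by theoretical travel times to each trial grid node and stacked, and the node/time with the largest stack value is taken as the detection and location candidate. The station geometry, wave speed and grid below are placeholder values:

        import numpy as np

        rng = np.random.default_rng(3)
        fs, v = 100.0, 3.0          # sampling rate (Hz) and assumed wave speed (km/s)
        stations = np.array([[0, 0], [10, 0], [0, 10], [10, 10]], float)   # km
        grid = np.array([[x, y] for x in range(0, 11, 2) for y in range(0, 11, 2)], float)

        # Straight-ray travel time from every grid node to every station (s).
        tt = np.linalg.norm(grid[:, None, :] - stations[None, :, :], axis=2) / v

        def backproject(cf):
            """Stack characteristic functions cf (n_sta, n_samp) over all grid nodes."""
            n_nodes, n_sta = tt.shape
            n_samp = cf.shape[1]
            stack = np.zeros((n_nodes, n_samp))
            for i in range(n_nodes):
                for j in range(n_sta):
                    shift = int(round(tt[i, j] * fs))     # undo the travel-time delay
                    stack[i, :n_samp - shift] += cf[j, shift:]
            return stack

        # Synthetic test: a transient radiated from node 20 at origin time 2 s, plus noise.
        cf = 0.2 * rng.random((len(stations), 1000))
        for j in range(len(stations)):
            cf[j, 200 + int(round(tt[20, j] * fs))] = 1.0

        stack = backproject(cf)
        node, t0 = np.unravel_index(stack.argmax(), stack.shape)
        print("located node:", grid[node], "origin time:", t0 / fs, "s")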

  17. Fissile material detection and control facility with pulsed neutron sources and digital data processing

    International Nuclear Information System (INIS)

    Romodanov, V.L.; Chernikova, D.N.; Afanasiev, V.V.

    2010-01-01

    Full text: In connection with possible nuclear terrorism, there is a long-felt need for devices for effective control of radioactive and fissile materials at key border-crossing points (airports, seaports, etc.), as well as at various customs check-points. In International Science and Technology Center Projects No. 596 and No. 2978, a new physical method and digital technology have been developed for the detection of fissile and radioactive materials in models of customs facilities with a graphite moderator, a pulsed neutron source and digital processing of responses from scintillation PSD detectors. Detectability of fissile materials, even those shielded with various radiation-absorbing screens, has been shown. The use of digital processing of scintillation signals in this facility is a necessary element, as neutrons and photons are discriminated in the time dependence of the fissile-material responses at electronic-channel loads for which standard types of spectrometers are inapplicable. Digital processing of neutron and photon responses practically resolves the problem of dead time and allows implementing devices in which various energy groups of neutrons exist for some time after a pulse of source neutrons. Thus, it is possible to detect fissile materials deliberately concealed with shields having a large cross-section for absorption of photons and thermal neutrons. Two models for the detection and control of fissile materials were advanced: 1. a model based on a graphite neutron moderator and PSD scintillators with digital separation of neutron and photon responses; 2. a model based on plastic scintillators and digital detection of time coincidences of fission particles. Facilities that count time coincidences of neutrons and photons occurring in the fission of fissile materials can use an Am-Li neutron source, as is the case, e.g., with the AWCC system. The disadvantages of the facility are related to the issues

  18. Detection prospects for high energy neutrino sources from the anisotropic matter distribution in the local universe

    DEFF Research Database (Denmark)

    Mertsch, Philipp; Rameez, Mohamed; Tamborra, Irene

    2017-01-01

    Constraints on the number and luminosity of the sources of the cosmic neutrinos detected by IceCube have been set by targeted searches for point sources. We set complementary constraints by using the 2MASS Redshift Survey (2MRS) catalogue, which maps the matter distribution of the local Universe. Assuming that the distribution of the neutrino sources follows that of matter, we look for correlations between 'warm' spots on the IceCube skymap and the 2MRS matter distribution. Through Monte Carlo simulations of the expected number of neutrino multiplets and careful modelling of the detector performance (including that of IceCube-Gen2), we demonstrate that sources with local density exceeding 10^-6 Mpc^-3 and neutrino luminosity L_ν ≲ 10^42 erg s^-1 (10^41 erg s^-1) will be efficiently revealed by our method using IceCube (IceCube-Gen2).

  19. A dual neutron/gamma source for the Fissmat Inspection for Nuclear Detection (FIND) system.

    Energy Technology Data Exchange (ETDEWEB)

    Doyle, Barney Lee (Sandia National Laboratories, Albuquerque, NM); King, Michael; Rossi, Paolo (Sandia National Laboratories, Albuquerque, NM); McDaniel, Floyd Del (Sandia National Laboratories, Albuquerque, NM); Morse, Daniel Henry; Antolak, Arlyn J.; Provencio, Paula Polyak (Sandia National Laboratories, Albuquerque, NM); Raber, Thomas N.

    2008-12-01

    Shielded special nuclear material (SNM) is very difficult to detect and new technologies are needed to clear alarms and verify the presence of SNM. High-energy photons and neutrons can be used to actively interrogate for heavily shielded SNM, such as highly enriched uranium (HEU), since neutrons can penetrate gamma-ray shielding and gamma-rays can penetrate neutron shielding. Both source particles then induce unique detectable signals from fission. In this LDRD, we explored a new type of interrogation source that uses low-energy proton- or deuteron-induced nuclear reactions to generate high fluxes of mono-energetic gammas or neutrons. Accelerator-based experiments, computational studies, and prototype source tests were performed to obtain a better understanding of (1) the flux requirements, (2) fission-induced signals, background, and interferences, and (3) operational performance of the source. The results of this research led to the development and testing of an axial-type gamma tube source and the design/construction of a high power coaxial-type gamma generator based on the 11B(p,γ)12C nuclear reaction.

  20. Research and development of a compact fusion neutron source for humanitarian landmine detection

    International Nuclear Information System (INIS)

    Yoshikawa, K.; Masuda, K.; Yamamoto, Y.; Takamatsu, T.; Toku, H.; Nagasaki, K.; Hotta, E.; Yamauchi, K.; Ohnishi, M.; Osawa, H.

    2005-01-01

    Research and development of an advanced anti-personnel landmine detection system using a compact discharge-type D-D fusion neutron source called IECF (Inertial-Electrostatic Confinement Fusion) are described. Landmines are to be identified through increased backscattering of neutrons by the hydrogen atoms, and through specific-energy capture γ-ray emission by hydrogen and nitrogen atoms with thermalized neutrons in the landmine explosives. For this purpose, improvements of the IECF device were studied for a drastic enhancement of neutron production rates to more than 10^8 n/s in pulsed operation, including R and D of robust power sources, as well as analyses of the envisaged detection system with multi-sensors in parallel, in order to show promising and practical features of this detection system for humanitarian landmine detection, particularly in the arid, dry desert areas of Afghanistan, where the soil moisture remains between 3 and 8%, which enables effective detection of the hydrogen anomaly inherent in the landmine explosives. In this paper, the focus is on describing the improvements of the IECF. (author)

  1. Detection of emission sources using passive-remote Fourier transform infrared spectroscopy

    International Nuclear Information System (INIS)

    Demirgian, J.C.; Macha, S.M.; Darby, S.M.; Ditillo, J.

    1995-01-01

    The detection and identification of toxic chemicals released in the environment is important for public safety. Passive-remote Fourier transform infrared (FTIR) spectrometers can be used to detect these releases. Their primary advantages are their small size and ease of setup and use. Open-path FTIR spectrometers are used to detect concentrations of pollutants from a fixed frame of reference. These instruments detect plumes, but they are too large and difficult to aim to be used to track a plume to its source. Passive-remote FTIR spectrometers contain an interferometer, optics, and a detector. They can be used on tripods and in some cases can be hand-held. A telescope can be added to most units. We will discuss the capability of passive-remote FTIR spectrometers to detect the origin of plumes. Low-concentration plumes were released using a custom-constructed vaporizer. These plumes were detected with different spectrometers from different distances. Passive-remote spectrometers were able to detect small chemical releases (10 cm on a side) at concentration-pathlengths at the low parts-per-million-meter (ppm-m) level.

  2. Psychophysical evaluation of the image quality of a dynamic flat-panel digital x-ray image detector using the threshold contrast detail detectability (TCDD) technique

    Science.gov (United States)

    Davies, Andrew G.; Cowen, Arnold R.; Bruijns, Tom J. C.

    1999-05-01

    We are currently in an era of active development of the digital X-ray imaging detectors that will serve the radiological communities in the new millennium. The rigorous comparative physical evaluations of such devices are therefore becoming increasingly important from both the technical and clinical perspectives. The authors have been actively involved in the evaluation of a clinical demonstration version of a flat-panel dynamic digital X-ray image detector (or FDXD). Results of the objective physical evaluation of this device have been presented elsewhere at this conference. The imaging performance of FDXD under radiographic exposure conditions has been previously reported, and in this paper a psychophysical evaluation of the FDXD detector operating under continuous fluoroscopic conditions is presented. The evaluation technique employed was the threshold contrast detail detectability (TCDD) technique, which enables image quality to be measured on devices operating in the clinical environment. This approach addresses image quality in the context of both the image acquisition and display processes, and uses human observers to measure performance. The Leeds test objects TO[10] and TO[10+] were used to obtain comparative measurements of performance on the FDXD and two digital spot fluorography (DSF) systems, one utilizing a Plumbicon camera and the other a state-of-the-art CCD camera. Measurements were taken at a range of detector entrance exposure rates, namely 6, 12, 25 and 50 µR/s. In order to facilitate comparisons between the systems, all fluoroscopic image processing, such as noise reduction algorithms, was disabled during the experiments. At the highest dose rate FDXD significantly outperformed the DSF comparison systems in the TCDD comparisons. At 25 and 12 µR/s all three systems performed in an equivalent manner, and at the lowest exposure rate FDXD was inferior to the two DSF systems. At standard fluoroscopic exposures, FDXD performed in an equivalent

  3. Real-time implementation of logo detection on open source BeagleBoard

    Science.gov (United States)

    George, M.; Kehtarnavaz, N.; Estevez, L.

    2011-03-01

    This paper presents the real-time implementation of our previously developed logo detection and tracking algorithm on the open source BeagleBoard mobile platform. This platform has an OMAP processor that incorporates an ARM Cortex processor. The algorithm combines Scale Invariant Feature Transform (SIFT) with k-means clustering, online color calibration and moment invariants to robustly detect and track logos in video. Various optimization steps that are carried out to allow the real-time execution of the algorithm on BeagleBoard are discussed. The results obtained are compared to the PC real-time implementation results.

  4. Application of digital lock-in detection to Hefei Light Source turn-by-turn system

    International Nuclear Information System (INIS)

    Yang Yongliang; Wang Junhua; Sun Baogen; Chen Yuanbo; Zhou Zeran

    2010-01-01

    This paper introduces the digital lock-in detection theory and discusses its feasibility for obtaining the damping rate in turn-by-turn measurement systems. Numerical simulations of this method were carried out with Matlab. Proof-of-principle beam experiments were then conducted on the Hefei Light Source (HLS) storage ring. The measured beta oscillation growth time is about 0.26 ms and the damping time is about 1.2 ms. Simulation and experimental results show that the digital lock-in detection method is effective for damping rate measurement in turn-by-turn measurement systems. (authors)
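
    A minimal sketch of digital lock-in detection on synthetic turn-by-turn data (the tune, damping time and noise level are placeholders, not HLS parameters): the signal is mixed with quadrature references at the betatron frequency, low-pass filtered, and the amplitude envelope is fit with an exponential to recover the damping time.

        import numpy as np
        from scipy.signal import butter, filtfilt

        # Synthetic turn-by-turn position: damped betatron oscillation plus noise.
        n_turns, tune, tau = 4096, 0.31, 1200.0        # tau in turns (placeholder)
        t = np.arange(n_turns)
        rng = np.random.default_rng(4)
        x = np.exp(-t / tau) * np.cos(2 * np.pi * tune * t) + 0.05 * rng.standard_normal(n_turns)

        # Digital lock-in: mix with quadrature references at the tune, then low-pass.
        i_mix = x * np.cos(2 * np.pi * tune * t)
        q_mix = x * np.sin(2 * np.pi * tune * t)
        b, a = butter(4, 0.02)                         # cutoff well below twice the tune
        envelope = 2.0 * np.sqrt(filtfilt(b, a, i_mix) ** 2 + filtfilt(b, a, q_mix) ** 2)

        # Log-linear fit of the envelope gives the damping rate (1/tau).
        sel = slice(200, 3000)                         # avoid filter edge effects
        slope, _ = np.polyfit(t[sel], np.log(envelope[sel]), 1)
        print(f"fitted damping time ~ {-1.0 / slope:.0f} turns (true value {tau:.0f})")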

  5. An open source cryostage and software analysis method for detection of antifreeze activity

    DEFF Research Database (Denmark)

    Lørup Buch, Johannes; Ramløv, H

    2016-01-01

    The aim of this study is to provide the reader with a simple setup that can detect antifreeze proteins (AFP) by inhibition of ice recrystallisation in very small sample sizes. This includes an open source cryostage, a method for preparing and loading samples, as well as a software analysis method. AFP could reliably be told apart from controls after only two minutes of recrystallisation. The goal of providing a fast, cheap and easy method for detecting antifreeze proteins in solution was met, and further development of the system can be followed at https://github.com/pechano/cryostage.

  6. Adapting astronomical source detection software to help detect animals in thermal images obtained by unmanned aerial systems

    Science.gov (United States)

    Longmore, S. N.; Collins, R. P.; Pfeifer, S.; Fox, S. E.; Mulero-Pazmany, M.; Bezombes, F.; Goodwind, A.; de Juan Ovelar, M.; Knapen, J. H.; Wich, S. A.

    2017-02-01

    In this paper we describe an unmanned aerial system equipped with a thermal-infrared camera and software pipeline that we have developed to monitor animal populations for conservation purposes. Taking a multi-disciplinary approach to tackle this problem, we use freely available astronomical source detection software and the associated expertise of astronomers, to efficiently and reliably detect humans and animals in aerial thermal-infrared footage. Combining this astronomical detection software with existing machine learning algorithms into a single, automated, end-to-end pipeline, we test the software using aerial video footage taken in a controlled, field-like environment. We demonstrate that the pipeline works reliably and describe how it can be used to estimate the completeness of different observational datasets to objects of a given type as a function of height, observing conditions etc. - a crucial step in converting video footage to scientifically useful information such as the spatial distribution and density of different animal species. Finally, having demonstrated the potential utility of the system, we describe the steps we are taking to adapt the system for work in the field, in particular systematic monitoring of endangered species at National Parks around the world.
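
    As a minimal illustration of what the astronomy-style extraction step can look like on a single thermal frame, the sketch below assumes the freely available sep package (a Python library exposing SExtractor's core routines); the frame is synthetic and the threshold is arbitrary, so this is only a stand-in for the pipeline described in the paper:

        import numpy as np
        import sep   # Source Extractor as a Python library

        # Synthetic thermal frame: smooth warm background plus two brighter "animals".
        rng = np.random.default_rng(5)
        frame = 20.0 + rng.normal(0.0, 0.5, (200, 200))
        yy, xx = np.mgrid[:200, :200]
        for cx, cy in [(60, 80), (150, 40)]:
            frame += 5.0 * np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * 4.0 ** 2))
        frame = frame.astype(np.float32)

        # Background estimation and subtraction, then thresholded source extraction.
        bkg = sep.Background(frame)
        data = frame - bkg.back()
        objects = sep.extract(data, thresh=3.0, err=bkg.globalrms)

        for obj in objects:
            print(f"detection at x={obj['x']:.1f}, y={obj['y']:.1f}, flux={obj['flux']:.1f}")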

  7. Data-Fusion for a Vision-Aided Radiological Detection System: Sensor dependence and Source Tracking

    Science.gov (United States)

    Stadnikia, Kelsey; Martin, Allan; Henderson, Kristofer; Koppal, Sanjeev; Enqvist, Andreas

    2018-01-01

    The University of Florida is taking a multidisciplinary approach to fuse the data between 3D vision sensors and radiological sensors in hopes of creating a system capable of not only detecting the presence of a radiological threat, but also tracking it. The key to developing such a vision-aided radiological detection system lies in the count rate being inversely proportional to the square of the source-detector distance. Presented in this paper are the results of the calibration algorithm used to predict the location of the radiological detectors based on the 3D distance from the source to the detector (vision data) and the detector's count rate (radiological data). Also presented are the results of two correlation methods used to explore source tracking.
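
    A minimal sketch of the inverse-square relationship the fusion relies on: given source-detector distances from the vision data and measured count rates, fit C(r) = S/r^2 + B to recover a source-strength term and a flat background. The numbers below are synthetic placeholders, not calibration data from the paper:

        import numpy as np
        from scipy.optimize import curve_fit

        def count_rate(r, strength, background):
            """Expected count rate at distance r: point source plus flat background."""
            return strength / r**2 + background

        # Synthetic calibration data: distances from the 3D vision sensor (m) and
        # Poisson-fluctuating count rates from the radiological sensor (counts/s).
        rng = np.random.default_rng(6)
        r = np.linspace(0.5, 5.0, 20)
        true_strength, true_background = 400.0, 30.0
        rates = rng.poisson(count_rate(r, true_strength, true_background)).astype(float)

        popt, pcov = curve_fit(count_rate, r, rates, p0=(100.0, 10.0))
        print(f"fitted strength = {popt[0]:.0f} (true {true_strength}), "
              f"background = {popt[1]:.1f} (true {true_background})")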

  8. WASTK: A Weighted Abstract Syntax Tree Kernel Method for Source Code Plagiarism Detection

    Directory of Open Access Journals (Sweden)

    Deqiang Fu

    2017-01-01

    Full Text Available In this paper, we introduce a source code plagiarism detection method, named WASTK (Weighted Abstract Syntax Tree Kernel), for computer science education. Different from other plagiarism detection methods, WASTK takes into account aspects other than the similarity between programs. WASTK first transforms the source code of a program into an abstract syntax tree and then obtains the similarity by calculating the tree kernel of the two abstract syntax trees. To avoid misjudgment caused by trivial code snippets or frameworks given by instructors, an idea similar to TF-IDF (Term Frequency-Inverse Document Frequency) in the field of information retrieval is applied. Each node in an abstract syntax tree is assigned a weight by TF-IDF. WASTK is evaluated on different datasets and, as a result, performs much better than other popular methods like Sim and JPlag.
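
    A toy sketch of the weighting idea, using Python's built-in ast module and a simple bag-of-node-types cosine rather than the paper's tree kernel: node types that occur in every submission (boilerplate, instructor-provided scaffolding) receive low inverse-document-frequency weights and therefore contribute little to the similarity score.

        import ast
        import math
        from collections import Counter

        def node_type_counts(source):
            """Term frequencies of AST node types in a piece of source code."""
            return Counter(type(node).__name__ for node in ast.walk(ast.parse(source)))

        def tfidf_cosine(a, b, corpus):
            """Cosine similarity of two programs using IDF-weighted node-type counts."""
            docs = [node_type_counts(src) for src in corpus]
            idf = {t: math.log(len(docs) / sum(1 for d in docs if t in d))
                   for d in docs for t in d}
            ca, cb = node_type_counts(a), node_type_counts(b)
            keys = set(ca) | set(cb)
            va = [ca.get(k, 0) * idf.get(k, 0.0) for k in keys]
            vb = [cb.get(k, 0) * idf.get(k, 0.0) for k in keys]
            dot = sum(x * y for x, y in zip(va, vb))
            norm = math.sqrt(sum(x * x for x in va)) * math.sqrt(sum(y * y for y in vb))
            return dot / norm if norm else 0.0

        corpus = [
            "def add(a, b):\n    return a + b",
            "def mul(a, b):\n    return a * b",
            "for i in range(10):\n    print(i)",
        ]
        print(tfidf_cosine(corpus[0], corpus[1], corpus))   # high: same structure
        print(tfidf_cosine(corpus[0], corpus[2], corpus))   # low: different structure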

  9. Detection limits of pollutants in water for PGNAA using Am-Be source

    International Nuclear Information System (INIS)

    Khelifi, R.; Amokrane, A.; Bode, P.

    2007-01-01

    A basic PGNAA facility with an Am-Be neutron source is described for analyzing pollutants in water. The properties of the neutron flux were determined by MCNP calculations. In order to determine the efficiency curve of the HPGe detector, the prompt-gamma rays from chlorine were used and an exponential curve was fitted. The detection limits for a typical water sample are also estimated from the statistical fluctuations of the background level in the relevant regions of the recorded prompt-gamma spectrum.
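
    Detection limits estimated from background fluctuations in this way are commonly expressed through Currie's relations; for a measurement with B background counts in the peak region, paired blank subtraction and 5% false-positive and false-negative risks, the standard approximations (quoted here for context, not as the exact expressions used in this paper) are

        L_C \approx 2.33\,\sqrt{B} \qquad\text{(decision threshold, counts)}
        L_D \approx 2.71 + 4.65\,\sqrt{B} \qquad\text{(detection limit, counts)}

    and the corresponding concentration detection limit follows by dividing L_D by the counting time, the detection efficiency and the element sensitivity (counts per unit concentration).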

  10. Detection of a Very Bright Source Close to the LMC Supernova SN 1987A: Erratum

    Science.gov (United States)

    Nisenson, P.; Papaliolios, C.; Karovska, M.; Noyes, R.

    1988-01-01

    In the Letter "Detection of a Very Bright Source Close to the LMC Supernova SN 1987A" by P. Nisenson, C. Papaliolios, M. Karovska, and R. Noyes (1987 Ap. J. [Letters], 320, L15), two of the figure labels for Figure 1 were inadvertently transposed in the production process. A corrected version of the figure appears as Plate L4. The Journal regrets the error.

  11. Event Coverage Detection and Event Source Determination in Underwater Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Zhangbing Zhou

    2015-12-01

    Full Text Available With the advent of the Internet of Underwater Things, smart things are deployed in the ocean space and establish underwater wireless sensor networks for the monitoring of vast and dynamic underwater environments. When events are found to have possibly occurred, accurate event coverage should be detected, and potential event sources should be determined for the enactment of prompt and proper responses. To address this challenge, a technique that detects event coverage and determines event sources is developed in this article. Specifically, the occurrence of possible events corresponds to a set of neighboring sensor nodes whose sensory data may deviate from a normal sensing range in a collective fashion. An appropriate sensor node is selected as the relay node for gathering and routing sensory data to sink node(s). When sensory data are collected at sink node(s), the event coverage is detected and represented as a weighted graph, where the vertices in this graph correspond to sensor nodes and the weight specified on the edges reflects the extent to which sensory data deviate from a normal sensing range. Event sources are determined, which correspond to the barycenters in this graph. The results of the experiments show that our technique is more energy efficient, especially when the network topology is relatively steady.
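
    A small sketch of the final step described above, assuming the networkx package: once the detected event coverage has been assembled as a weighted graph, the barycenter (the node set minimizing the sum of weighted shortest-path distances to all other nodes) is reported as the candidate event source. The graph, and the convention that smaller edge weights stand for stronger joint deviation, are illustrative assumptions rather than the paper's exact formulation.

        import networkx as nx

        # Synthetic "event coverage" graph: vertices are sensor nodes; edge weights
        # are treated as distances, with smaller values standing for stronger joint
        # deviation from the normal sensing range (a simplifying assumption here).
        G = nx.Graph()
        G.add_weighted_edges_from([
            ("s1", "s2", 1.0), ("s2", "s3", 0.4), ("s3", "s4", 0.3),
            ("s4", "s5", 0.5), ("s3", "s5", 0.6), ("s2", "s6", 1.2),
        ])

        # The barycenter minimizes the total weighted shortest-path distance to all
        # other nodes and is taken as the candidate event source location.
        print("candidate event source node(s):", nx.barycenter(G, weight="weight"))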

  12. CARA Risk Assessment Thresholds

    Science.gov (United States)

    Hejduk, M. D.

    2016-01-01

    Warning remediation threshold (Red threshold): Pc level at which warnings are issued, and active remediation considered and usually executed. Analysis threshold (Green to Yellow threshold): Pc level at which analysis of event is indicated, including seeking additional information if warranted. Post-remediation threshold: Pc level to which remediation maneuvers are sized in order to achieve event remediation and obviate any need for immediate follow-up maneuvers. Maneuver screening threshold: Pc compliance level for routine maneuver screenings (more demanding than regular Red threshold due to additional maneuver uncertainty).

  13. Detection and monitoring of pollutant sources with Lidar/Dial techniques

    International Nuclear Information System (INIS)

    Gaudio, P; Gelfusa, M; Malizia, A; Parracino, S; Richetta, M; De Leo, L; Perrimezzi, C; Bellecci, C

    2015-01-01

    It is well known that air pollution due to anthropogenic sources can have adverse effects on humans and the ecosystem. Therefore, in recent years, surveying large regions of the atmosphere in an automatic way has become a strategic objective of various public health organizations for early detection of pollutant sources in urban and industrial areas. The Lidar and Dial techniques have become well-established laser-based methods for the remote sensing of the atmosphere. They are often implemented to probe almost any level of the atmosphere and to acquire information to validate theoretical models about different topics of atmospheric physics. They can also be used for environment surveying by monitoring particles, aerosols and molecules. The aim of the present work is to demonstrate the potential of these methods to detect pollutants emitted from local sources (such as particulate and/or chemical compounds) and to evaluate their concentration. This is exemplified with the help of experimental data acquired in an industrial area in the south of Italy during an experimental campaign using a simulated pollutant source. For this purpose, two mobile systems, Lidar and Dial, have been developed by the authors. This paper presents the operating principles of the systems and the results of the experimental campaign. (paper)

  14. Acoustic emission sensor radiation damage threshold experiment

    International Nuclear Information System (INIS)

    Beeson, K.M.; Pepper, C.E.

    1994-01-01

    Determination of the threshold for damage to acoustic emission sensors exposed to radiation is important in their application to leak detection in radioactive waste transport and storage. Proper response to system leaks is necessary to ensure the safe operation of these systems. A radiation impaired sensor could provide ''false negative or false positive'' indication of acoustic signals from leaks within the system. Research was carried out in the Radiochemical Technology Division at Oak Ridge National Laboratory to determine the beta/gamma radiation damage threshold for acoustic emission sensor systems. The individual system consisted of an acoustic sensor mounted with a two part epoxy onto a stainless steel waveguide. The systems were placed in an irradiation fixture and exposed to a Cobalt-60 source. After each irradiation, the sensors were recalibrated by Physical Acoustics Corporation. The results were compared to the initial calibrations performed prior to irradiation and a control group, not exposed to radiation, was used to validate the results. This experiment determines the radiation damage threshold of each acoustic sensor system and verifies its life expectancy, usefulness and reliability for many applications in radioactive environments

  15. Hypothesis tests for the detection of constant speed radiation moving sources

    Energy Technology Data Exchange (ETDEWEB)

    Dumazert, Jonathan; Coulon, Romain; Kondrasovs, Vladimir; Boudergui, Karim; Sannie, Guillaume; Gameiro, Jordan; Normand, Stephane [CEA, LIST, Laboratoire Capteurs Architectures Electroniques, 99 Gif-sur-Yvette, (France); Mechin, Laurence [CNRS, UCBN, Groupe de Recherche en Informatique, Image, Automatique et Instrumentation de Caen, 4050 Caen, (France)

    2015-07-01

    Radiation Portal Monitors are deployed in linear networks to detect radiological material in motion. As a complement to single-channel and multichannel detection algorithms, which are inefficient at very low signal-to-noise ratios, temporal correlation algorithms have been introduced. Hypothesis-test methods based on empirically estimated means and variances of the signals delivered by the different channels have shown a significant gain in the tradeoff between detection sensitivity and false alarm probability. This paper discloses the concept of a new hypothesis test for temporal correlation detection methods, taking advantage of the Poisson nature of the registered counting signals, and establishes a benchmark between this test and its empirical counterpart. The simulation study validates that, in the four relevant configurations (a pedestrian source carrier under high and low count-rate radioactive backgrounds, and a vehicle source carrier under the same high and low count-rate backgrounds), the newly introduced hypothesis test ensures a significantly improved compromise between sensitivity and false alarm, while guaranteeing the stability of its optimization parameter for signal-to-noise ratio variations between 2 and 0.8. (authors)
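
    A minimal sketch of a Poisson-based test in this spirit (not the authors' exact statistic): counts from several portal channels are time-aligned for an assumed carrier speed and summed, and the summed count is compared with a Poisson quantile of the known background to raise an alarm. Rates, speeds and the false-alarm level below are placeholders.

        import numpy as np
        from scipy.stats import poisson

        rng = np.random.default_rng(7)

        n_portals, spacing, speed = 4, 5.0, 2.0   # portals every 5 m, carrier at 2 m/s
        n_bins = 120                              # 1 s counting bins
        background = 5.0                          # counts per bin per portal (assumed known)
        alpha = 1e-4                              # target per-bin false-alarm probability

        # Simulated data: Poisson background plus a weak source adding a few counts to
        # each portal as it passes, offset in time by spacing / speed.
        counts = rng.poisson(background, size=(n_portals, n_bins))
        t_pass = 40
        delays = [int(p * spacing / speed) for p in range(n_portals)]
        for p in range(n_portals):
            counts[p, t_pass + delays[p]] += 6    # too weak to trip a single channel

        # Align the channels for the assumed speed and sum them; under H0 the aligned
        # sum in each bin is Poisson with mean n_portals * background.
        aligned = np.array([np.roll(counts[p], -delays[p]) for p in range(n_portals)])
        summed = aligned.sum(axis=0)
        threshold = poisson.ppf(1.0 - alpha, mu=n_portals * background)

        print("alarm bins:", np.where(summed > threshold)[0], "(source at bin", t_pass, ")")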

  16. Detecting new γ-ray sources based on multi-frequency data: the case of 1WHSPJ031423.9+061956

    Science.gov (United States)

    Arsioli, Bruno; Chang, Yu Ling

    2015-12-01

    We use the Fermi Science Tools in an attempt to unveil faint γ-ray blazars that may be above the threshold for detectability with Fermi-LAT but are not identified by automated methods. Our search for new sources in the 100 MeV-300 GeV band is mainly driven by the 1/2WHSP catalogs, which list high synchrotron peaked blazars expected to be emitters of VHE photons. Here we present the γ-ray detection of 1WHSP J031423.9+061956, modelling its high-energy spectrum as a power law. We describe an example where multi-frequency selection, performed at much lower energies (from radio to X-ray), helps to pinpoint a high-energy source. The 1/2WHSP catalogs are built with the aim of providing a list of TeV targets for the VHE arrays of Cherenkov telescopes. Moreover, these catalogs provide useful seeds for identifying new high-energy sources within the raw data from Fermi. With the aid of multi-frequency data, we can explore the very high energy domain in greater detail, improving the description of the γ-ray sky.

  17. Space-Based Detection of Missing Sulfur Dioxide Sources of Global Air Pollution

    Science.gov (United States)

    McLinden, Chris A.; Fioletov, Vitali; Shephard, Mark W.; Krotkov, Nick; Li, Can; Martin, Randall V.; Moran, Michael D.; Joiner, Joanna

    2016-01-01

    Sulfur dioxide is designated a criteria air contaminant (or equivalent) by virtually all developed nations. When released into the atmosphere, sulfur dioxide forms sulfuric acid and fine particulate matter, secondary pollutants that have significant adverse effects on human health, the environment and the economy. The conventional, bottom-up emissions inventories used to assess impacts, however, are often incomplete or outdated, particularly for developing nations that lack comprehensive emission reporting requirements and infrastructure. Here we present a satellite-based, global emission inventory for SO2 that is derived through a simultaneous detection, mapping and emission-quantifying procedure, and is thereby independent of conventional information sources. We find that of the 500 or so large sources in our inventory, nearly 40 are not captured in leading conventional inventories. These missing sources are scattered throughout the developing world (over a third are clustered around the Persian Gulf) and add up to 7 to 14 Tg of SO2 per year, or roughly 6-12% of the global anthropogenic source. Our estimates of national total emissions are generally in line with conventional numbers, but for some regions, and for SO2 emissions from volcanoes, discrepancies can be as large as a factor of three or more. We anticipate that our inventory will help eliminate gaps in bottom-up inventories, independent of geopolitical borders and source types.

  18. Detection prospects for high energy neutrino sources from the anisotropic matter distribution in the local Universe

    Energy Technology Data Exchange (ETDEWEB)

    Mertsch, Philipp; Rameez, Mohamed; Tamborra, Irene, E-mail: mertsch@nbi.ku.dk, E-mail: mohamed.rameez@nbi.ku.dk, E-mail: tamborra@nbi.ku.dk [Niels Bohr International Academy, Niels Bohr Institute, Blegdamsvej 17, 2100 Copenhagen (Denmark)

    2017-03-01

    Constraints on the number and luminosity of the sources of the cosmic neutrinos detected by IceCube have been set by targeted searches for point sources. We set complementary constraints by using the 2MASS Redshift Survey (2MRS) catalogue, which maps the matter distribution of the local Universe. Assuming that the distribution of the neutrino sources follows that of matter, we look for correlations between 'warm' spots on the IceCube skymap and the 2MRS matter distribution. Through Monte Carlo simulations of the expected number of neutrino multiplets and careful modelling of the detector performance (including that of IceCube-Gen2), we demonstrate that sources with local density exceeding 10^-6 Mpc^-3 and neutrino luminosity L_ν ≲ 10^42 erg s^-1 (10^41 erg s^-1) will be efficiently revealed by our method using IceCube (IceCube-Gen2). At low luminosities such as will be probed by IceCube-Gen2, the sensitivity of this analysis is superior to requiring statistically significant direct observation of a point source.

  19. Design and construction of an explosive detection system by TNA methods, using a 252Cf radioisotope source

    International Nuclear Information System (INIS)

    Tavakkoli Farsouli, A.

    1999-01-01

    Bombs concealed in luggage threaten human life and property in transport systems throughout the world. Plastic explosives cannot be detected by X-ray screening devices. The thermal neutron activation (TNA) method has been tested in the present work for non-destructive detection of explosives. A 252Cf radioisotope neutron source and two gamma spectroscopy systems have been used as a tool to find explosives, regardless of the bomb's shape and the packing materials. The MCNP code has been used to design the neutronic section of the system. The thermal neutron fluxes measured with gold foils at several locations in the system were in good agreement with the data obtained by the MCNP code. Also, detection limits for nitrogen at various counting times were measured. The measurements show that the system is capable of detecting 417 g of HMX explosive material (158 g of nitrogen) with 10 minutes of counting time. Suggestions are given for modifying the system and decreasing the detection limits.

  20. Shadow Detection Based on Regions of Light Sources for Object Extraction in Nighttime Video

    Directory of Open Access Journals (Sweden)

    Gil-beom Lee

    2017-03-01

    Full Text Available Intelligent video surveillance systems detect pre-configured surveillance events through background modeling, foreground and object extraction, object tracking, and event detection. Shadow regions inside video frames sometimes appear as foreground objects, interfere with ensuing processes, and finally degrade the event detection performance of the systems. Conventional studies have mostly used intensity, color, texture, and geometric information to perform shadow detection in daytime video, but these methods lack the capability of removing shadows in nighttime video. In this paper, a novel shadow detection algorithm for nighttime video is proposed; this algorithm partitions each foreground object based on the object’s vertical histogram and screens out shadow objects by validating their orientations heading toward regions of light sources. From the experimental results, it can be seen that the proposed algorithm shows more than 93.8% shadow removal and 89.9% object extraction rates for nighttime video sequences, and the algorithm outperforms conventional shadow removal algorithms designed for daytime videos.

  1. Automatic landslide detection from LiDAR DTM derivatives by geographic-object-based image analysis based on open-source software

    Science.gov (United States)

    Knevels, Raphael; Leopold, Philip; Petschko, Helene

    2017-04-01

    With high-resolution airborne Light Detection and Ranging (LiDAR) data more commonly available, many studies have been performed to exploit the detailed information on the earth surface and to analyse its limitations. Specifically in the field of natural hazards, digital terrain models (DTM) have been used to map hazardous processes such as landslides, mainly by visual interpretation of LiDAR DTM derivatives. However, new approaches are striving towards automatic detection of landslides to speed up the process of generating landslide inventories. These studies usually use a combination of optical imagery and terrain data, and are designed in commercial software packages such as ESRI ArcGIS, Definiens eCognition, or MathWorks MATLAB. The objective of this study was to investigate the potential of open-source software for automatic landslide detection based only on high-resolution LiDAR DTM derivatives in a study area within the federal state of Burgenland, Austria. The study area is very prone to landslides, which have been mapped with different methodologies in recent years. The free development environment R was used to integrate open-source geographic information system (GIS) software, such as SAGA (System for Automated Geoscientific Analyses), GRASS (Geographic Resources Analysis Support System), or TauDEM (Terrain Analysis Using Digital Elevation Models). The implemented geographic-object-based image analysis (GEOBIA) consisted of (1) derivation of land surface parameters, such as slope, surface roughness, curvature, or flow direction, (2) finding the optimal scale parameter by the use of an objective function, (3) multi-scale segmentation, (4) classification of landslide parts (main scarp, body, flanks) by k-means thresholding, (5) assessment of the classification performance using a pre-existing landslide inventory, and (6) post-processing analysis for further use in landslide inventories. The results of the developed open-source approach demonstrated good

  2. Passive monitoring for near surface void detection using traffic as a seismic source

    Science.gov (United States)

    Zhao, Y.; Kuzma, H. A.; Rector, J.; Nazari, S.

    2009-12-01

    In this poster we present preliminary results from several of our field experiments in which we study seismic detection of voids using a passive array of surface geophones. The source of seismic excitation is vehicle traffic on nearby roads, which we model as a continuous line source of seismic energy. Our passive seismic technique is based on cross-correlation of surface wave fields and studying the resulting power spectra, looking for "shadows" caused by the scattering effect of a void. High-frequency noise masks this effect in the time domain, so it is difficult to see on conventional traces. Our technique does not rely on phase distortions caused by small voids because they are generally too tiny to measure. Unlike traditional impulsive seismic sources, which generate highly coherent broadband signals, perfect for resolving phase but too weak for resolving amplitude, vehicle traffic affords a high-power signal in a frequency range which is optimal for finding shallow structures. Our technique results in clear detections of an abandoned railroad tunnel and a septic tank. The ultimate goal of this project is to develop a technology for the simultaneous imaging of shallow underground structures and traffic monitoring near these structures.

  3. Use of Portal Monitors for Detection of Technogenic Radioactive Sources in Scrap Metal

    Science.gov (United States)

    Solovev, D. B.; Merkusheva, A. E.

    2017-11-01

    The article considers how primary radiation control of scrap metal is organized at specialized enterprises engaged in its processing and storage, using radiation portal monitors as the primary technical equipment. The relevance of this topic, the rationale for implementing radiation control with radiation portal monitors, and the physical and organizational bases of radiation control are considered in detail. Emphasis is placed on the considerable increase in the number of technogenic radioactive sources detected in scrap metal, which results in radioactive metal structures entering service as various building products. One of the reasons for this increase in the number of technogenic radioactive sources arriving for processing with scrap metal is the absence of recommendations on the operation of radiation portal monitors. The practical section of the article offers recommendations on tuning the operating modes of radiation portal monitors depending on the influence of weather factors, thus allowing a considerable increase in the percentage of technogenic radioactive sources detected.

  4. Doppler ultrasound for detection of renal transplant artery stenosis - Threshold peak systolic velocity needs to be higher in a low-risk or surveillance population

    International Nuclear Information System (INIS)

    Patel, U.; Khaw, K.K.; Hughes, N.C.

    2003-01-01

    AIMS: To establish the ideal threshold arterial velocity for the diagnosis of renal transplant artery stenosis in a surveillance population with a low pre-test probability of stenosis. METHODS: Retrospective review of Doppler ultrasound, angiographic and clinical outcome data of patients transplanted over a 3-year period. Data were used to calculate sensitivity, specificity, positive predictive values (PPV) and negative predictive values (NPV) for various threshold peak systolic velocity values. RESULTS: Of 144 patients transplanted, full data were available in 117 cases. Five cases had renal transplant artery stenosis--incidence 4.2% [stenosis identified at a mean of 6.5 months (range 2-10 months)]. All five cases had a significant arterial pressure gradient across the narrowing and underwent angioplasty. Threshold peak systolic velocity of ≥2.5 m/s is not ideal [specificity=79% (CI 65-82%), PPV=18% (CI 6-32%), NPV=100% (CI 94-100%)], subjecting many patients to unnecessary angiography--8/117 (6%) in our population. Comparable values if the threshold is set at ≥3.0 m/s are 93% (CI 77-96%), 33% (CI 7-44%) and 99% (CI 93-100%), respectively. The clinical outcome of all patients was satisfactory, with no unexplained graft failures or loss. CONCLUSIONS: In a surveillance population with a low pre-test probability of stenosis, absolute renal artery velocity ≥2.5 m/s is a limited surrogate marker for significant renal artery stenosis. The false-positive rate is high, and ≥3.0 m/s is a better choice which will halve the number of patients enduring unnecessary angiography. Close clinical follow-up of patients in the 2.5-3.0 m/s range, with repeat Doppler ultrasound if necessary, will identify the test false-negatives.
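
    The threshold comparison reported above can be reproduced from a 2 x 2 table; the sketch below computes sensitivity, specificity, PPV and NPV for two candidate peak-systolic-velocity cut-offs, using invented counts for illustration rather than the study's data.

        def threshold_metrics(tp, fp, fn, tn):
            """Diagnostic performance of a cut-off from its 2 x 2 confusion table."""
            sensitivity = tp / (tp + fn)
            specificity = tn / (tn + fp)
            ppv = tp / (tp + fp)
            npv = tn / (tn + fn)
            return sensitivity, specificity, ppv, npv

        # Hypothetical counts (tp, fp, fn, tn) for a low-prevalence surveillance cohort
        # of 117 grafts with 5 true stenoses; the numbers are invented for illustration.
        for label, table in {">=2.5 m/s": (5, 23, 0, 89), ">=3.0 m/s": (5, 10, 0, 102)}.items():
            se, sp, ppv, npv = threshold_metrics(*table)
            print(f"{label}: sensitivity={se:.2f} specificity={sp:.2f} PPV={ppv:.2f} NPV={npv:.2f}")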

  5. Detection of environmental sources of Histoplasma capsulatum in Chiang Mai, Thailand, by nested PCR.

    Science.gov (United States)

    Norkaew, Treepradab; Ohno, Hideaki; Sriburee, Pojana; Tanabe, Koichi; Tharavichitkul, Prasit; Takarn, Piyawan; Puengchan, Tanpalang; Bumrungsri, Sara; Miyazaki, Yoshitsugu

    2013-12-01

    Histoplasmosis is a systemic mycosis caused by inhaling spores of Histoplasma capsulatum, a dimorphic fungus. This fungus grows in soil contaminated with bat and avian excreta. Each year, patients with disseminated histoplasmosis have been diagnosed in Chiang Mai, northern Thailand. No published information is currently available on the environmental sources of this fungus in Chiang Mai or anywhere else in Thailand. The aim of this study was to detect H. capsulatum in soil samples contaminated with bat guano and avian droppings by nested PCR. Two hundred and sixty-five samples were collected from the following three sources: soil contaminated with bat guano, 88 samples; soil contaminated with bird droppings, 86 samples; and soil contaminated with chicken droppings, 91 samples. Genomic DNA was directly extracted from each sample, and H. capsulatum was detected by nested PCR using a primer set specific to a gene encoding 100-kDa-like protein (HcI, HcII and HcIII, HcIV). Histoplasma capsulatum was detected in seven of 88 soil samples contaminated with bat guano, one of 21 soil samples contaminated with pigeon droppings and 10 of 91 soil samples contaminated with chicken droppings. The results indicate the possibility of the association of bat guano and chicken droppings with H. capsulatum in this area of Thailand.

  6. Astronomers Detect Powerful Bursting Radio Source Discovery Points to New Class of Astronomical Objects

    Science.gov (United States)

    2005-03-01

    Astronomers at Sweet Briar College and the Naval Research Laboratory (NRL) have detected a powerful new bursting radio source whose unique properties suggest the discovery of a new class of astronomical objects. The researchers have monitored the center of the Milky Way Galaxy for several years and reveal their findings in the March 3, 2005 edition of the journal Nature. This radio image of the central region of the Milky Way Galaxy holds a new radio source, GCRT J1745-3009. The arrow points to an expanding ring of debris expelled by a supernova. CREDIT: N.E. Kassim et al., Naval Research Laboratory, NRAO/AUI/NSF. Principal investigator Dr. Scott Hyman, professor of physics at Sweet Briar College, said the discovery came after analyzing some additional observations from 2002 provided by researchers at Northwestern University. "We hit the jackpot!" Hyman said, referring to the observations. "An image of the Galactic center, made by collecting radio waves of about 1-meter in wavelength, revealed multiple bursts from the source during a seven-hour period from Sept. 30 to Oct. 1, 2002 - five bursts in fact, repeating at remarkably constant intervals." Hyman, four Sweet Briar students, and his NRL collaborators, Drs. Namir Kassim and Joseph Lazio, happened upon transient emission from two radio sources while studying the Galactic center in 1998. This prompted the team to propose an ongoing monitoring program using the National Science Foundation's Very Large Array (VLA) radio telescope in New Mexico. The National Radio Astronomy Observatory, which operates the VLA, approved the program. The data collected laid the groundwork for the detection of the new radio source. "Amazingly, even though the sky is known to be full of transient objects emitting at X- and gamma-ray wavelengths," NRL astronomer Dr. Joseph Lazio pointed out, "very little has been done to look for radio bursts, which are often easier for astronomical objects to produce

  7. Development of the detection technology of the source area derived from nuclear activities

    International Nuclear Information System (INIS)

    Suh, Kyungsuk; Kim, Ingyu; Keum, Dongkwon; Lim, Kwangmuk; Lee, Jinyong

    2012-07-01

    - It is necessary to establish overall preparedness for the analysis of nuclear activities in neighboring countries, given the increasing construction of nuclear power plants and reprocessing facilities in China, North Korea, Japan and Russia. - In Korea, analysis and measurements of nuclear activities have been conducted; however, the detection technology to identify the source area has not been developed. It is important to estimate the source origin of radioisotopes from the neighboring countries, including Korea, for the surveillance and safety of covert nuclear activities in the Northeast Asia region. - In this study, in the first year of the research, a database was built, weather data were processed, and a connection module was developed to track the origin of the radioisotopes. The database covers the reactor types and locations in China, Taiwan, Japan and Korea, and the amounts of noble gas released into the air

  8. Development of the detection technology of the source area derived from nuclear activities

    Energy Technology Data Exchange (ETDEWEB)

    Suh, Kyungsuk; Kim, Ingyu; Keum, Dongkwon; Lim, Kwangmuk; Lee, Jinyong

    2012-07-15

    - It is necessary to establish overall preparedness for the analysis of nuclear activities in neighboring countries, given the increasing construction of nuclear power plants and reprocessing facilities in China, North Korea, Japan and Russia. - In Korea, analysis and measurements of nuclear activities have been conducted; however, the detection technology to identify the source area has not been developed. It is important to estimate the source origin of radioisotopes from the neighboring countries, including Korea, for the surveillance and safety of covert nuclear activities in the Northeast Asia region. - In this study, in the first year of the research, a database was built, weather data were processed, and a connection module was developed to track the origin of the radioisotopes. The database covers the reactor types and locations in China, Taiwan, Japan and Korea, and the amounts of noble gas released into the air.

  9. Using an alternate light source to detect electrically singed feathers and hair in a forensic setting.

    Science.gov (United States)

    Viner, Tabitha C; Kagan, Rebecca A; Johnson, Jennifer L

    2014-01-01

    Mortality due to electrical injury in wildlife may occur in the form of lightning strike or power line contact. Evidence of electrical contact may be grossly obvious, with extensive singeing, curling, and blackening of feathers, fur, or skin. Occasionally, changes may be subtle, owing to lower current or reduced conductivity, making a definitive diagnosis of electrocution more difficult. We describe the use of an alternate light source in the examination of cases of lightning strike and power line contact in wildlife, and the enhanced detection of changes due to electrical currents in the hair and feathers of affected animals. Subtle changes in the wing feathers of 12 snow geese and 1 wolf that were struck by separate lightning events were made obvious by the use of an alternate light source. Similarly, this technique can be used to strengthen the evidence for power line exposure in birds. Published by Elsevier Ireland Ltd.

  10. Molecular detection of Fasciola hepatica in water sources of District Nowshehra Khyber Pakhtunkhwa Pakistan

    Science.gov (United States)

    Khan, Imran; Khan, Amir Muhammad; Ayaz; Khan, Sanaullah; Anees, Muhammad; Khan, Shaukat Ali

    2012-12-01

    Fascioliasis is spread through contamination of water sources and causes morbidity throughout the world. In the current study, 300 water samples were processed by PCR for the detection of Fasciola hepatica. The overall prevalence in the different water sources was 9.66% (29/300). The highest prevalence was recorded in drain water, 16% (16/100), followed by tube well water, 10% (4/40), and open well water, 8% (8/100); the lowest was recorded in tap water, 1.66% (1/60). The difference was statistically significant (P < 0.05). The highest prevalence was recorded in summer. It was concluded from the study that cleaning and filtration should be adopted to avoid health hazards from waterborne zoonotic parasites.

  11. Detection of Nuclear Sources by UAV Teleoperation Using a Visuo-Haptic Augmented Reality Interface

    Directory of Open Access Journals (Sweden)

    Jacopo Aleotti

    2017-09-01

    Full Text Available A visuo-haptic augmented reality (VHAR) interface is presented enabling an operator to teleoperate an unmanned aerial vehicle (UAV) equipped with a custom CdZnTe-based spectroscopic gamma-ray detector in outdoor environments. The task is to localize nuclear radiation sources, whose location is unknown to the user, without the close exposure of the operator. The developed detector also enables identification of the localized nuclear sources. The aim of the VHAR interface is to increase the situation awareness of the operator. The user teleoperates the UAV using a 3DOF haptic device that provides an attractive force feedback around the location of the most intense detected radiation source. Moreover, a fixed camera on the ground observes the environment where the UAV is flying. A 3D augmented reality scene is displayed on a computer screen accessible to the operator. Multiple types of graphical overlays are shown, including sensor data acquired by the nuclear radiation detector, a virtual cursor that tracks the UAV and geographical information, such as buildings. Experiments performed in a real environment are reported using an intense nuclear source.

  12. Detection of Nuclear Sources by UAV Teleoperation Using a Visuo-Haptic Augmented Reality Interface.

    Science.gov (United States)

    Aleotti, Jacopo; Micconi, Giorgio; Caselli, Stefano; Benassi, Giacomo; Zambelli, Nicola; Bettelli, Manuele; Zappettini, Andrea

    2017-09-29

    A visuo-haptic augmented reality (VHAR) interface is presented enabling an operator to teleoperate an unmanned aerial vehicle (UAV) equipped with a custom CdZnTe-based spectroscopic gamma-ray detector in outdoor environments. The task is to localize nuclear radiation sources, whose location is unknown to the user, without the close exposure of the operator. The developed detector also enables identification of the localized nuclear sources. The aim of the VHAR interface is to increase the situation awareness of the operator. The user teleoperates the UAV using a 3DOF haptic device that provides an attractive force feedback around the location of the most intense detected radiation source. Moreover, a fixed camera on the ground observes the environment where the UAV is flying. A 3D augmented reality scene is displayed on a computer screen accessible to the operator. Multiple types of graphical overlays are shown, including sensor data acquired by the nuclear radiation detector, a virtual cursor that tracks the UAV and geographical information, such as buildings. Experiments performed in a real environment are reported using an intense nuclear source.

  13. Design and construction of tank robot for detection and searching of radiation sources

    International Nuclear Information System (INIS)

    Rio Isman; Djiwo Harsono; Adi Abimanyu

    2016-01-01

    Developments in robotics technology can be applied to searching for lost radiation sources. A lost-source incident can expose radiation workers to doses exceeding the permissible limit (NBD) while security and recovery measures are carried out. This research proposes a tank robot that can assist in the detection of, and search for, radiation sources. The robot consists of an Arduino Mega 2560 microcontroller board, XBee Pro S radio frequency modules, a u-blox NEO-series GPS receiver, a servo motor and two DC motors. In this research, the radiation level is represented as a 0-5 V analog voltage, simulated using a potentiometer, and converted to a digital value (0-1023) by the 10-bit ADC of the Arduino Mega 2560. Results of the research show that the robot measures 28.7 cm x 24.8 cm x 11 cm, moves forward at a speed of 0.477 m/s and can rotate to 24 discrete angles. Data can be transmitted wirelessly up to 113 m in an open area without any change in the format or length of the data. The robot rotates and moves toward the angle with the largest voltage reading and can thus estimate the location of the radiation source. (author)
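
    The scan-and-point logic summarised above can be illustrated with a short sketch. The 0-5 V to 10-bit ADC mapping and the 24 discrete headings come from the abstract; everything else (function names, the simulated detector response) is a hypothetical stand-in rather than the authors' firmware, and the sketch is written in Python rather than Arduino C for brevity.

    ```python
    # Illustrative sketch (not the authors' firmware): emulate the robot's
    # scan-and-point logic. The 10-bit ADC mapping (0-5 V -> 0-1023) and the
    # 24 discrete headings are taken from the abstract; the detector response
    # below is a toy stand-in for the potentiometer-simulated signal.
    import random

    ADC_MAX = 1023        # 10-bit ADC full scale
    V_REF = 5.0           # 0-5 V analog input range
    N_ANGLES = 24         # number of discrete headings the robot can face

    def to_adc_counts(voltage):
        """Convert a 0-5 V analog level to 10-bit ADC counts."""
        voltage = max(0.0, min(V_REF, voltage))
        return round(voltage / V_REF * ADC_MAX)

    def simulated_detector_voltage(angle_index, source_angle_index):
        """Toy radiation signal: highest near the source direction, plus noise."""
        separation = min(abs(angle_index - source_angle_index),
                         N_ANGLES - abs(angle_index - source_angle_index))
        return max(0.0, 4.5 - 0.4 * separation) + random.uniform(0.0, 0.2)

    def scan_for_source(source_angle_index):
        """Sweep all headings, record ADC counts, return the best heading."""
        readings = [to_adc_counts(simulated_detector_voltage(i, source_angle_index))
                    for i in range(N_ANGLES)]
        best = max(range(N_ANGLES), key=lambda i: readings[i])
        return best, readings

    if __name__ == "__main__":
        heading, counts = scan_for_source(source_angle_index=7)
        print("estimated source heading index:", heading)
    ```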

  14. CERN Web Application Detection. Refactoring and release as open source software

    CERN Document Server

    Lizonczyk, Piotr

    2015-01-01

    This paper covers my work during my assignment as a participant in the CERN Summer Students 2015 programme. The project was aimed at refactoring and publication of the Web Application Detection tool, which was developed at CERN and previously used internally by the Computer Security team. The tasks performed ranged from the initial refactoring of the code, which had been developed as a script rather than a Python package, and the extraction of components that were not specific to CERN usage, to the final release of the source code on GitHub and its integration with third-party software, i.e. the w3af tool. Ultimately, the Web Application Detection software received positive responses, being downloaded ca. 1500 times at the time of writing this report.

  15. Symbiotic Stars in X-rays. II. Faint Sources Detected with XMM-Newton and Chandra

    Science.gov (United States)

    Nunez, N. E.; Luna, G. J. M.; Pillitteri, I.; Mukai, K.

    2014-01-01

    We report the detection of X-ray emission from four symbiotic stars that were not previously known to be X-ray sources. These four objects show a β-type X-ray spectrum, that is, their spectra can be modeled with an absorbed optically thin thermal emission with temperatures of a few million degrees. Photometric series obtained with the Optical Monitor on board XMM-Newton from V2416 Sgr and NSV 25735 support the proposed scenario where the X-ray emission is produced in a shock-heated region inside the symbiotic nebulae.

  16. Detection of air pollution sources in Israel by isotope-induced XRF

    International Nuclear Information System (INIS)

    Boazi, M.; Shenberg, C.

    1978-01-01

    Aerosol samples from urban, industrial and traffic areas were collected and analyzed for various elements using isotope-induced XRF. A 10 mCi ²⁴¹Am annular source, enclosed in lead housing, was used to excite the K X-rays of an iodine target. A lithium-drifted silicon diode (25 mm² area, 3.0 mm depletion depth) with a 0.125 mm beryllium window was used for detection. The detector was coupled through a low-noise preamplifier and a linear amplifier to a 400-channel analyzer. The resolution (FWHM) for the 6.4 keV Fe Kα X-rays was approximately 180 eV. The correlation between the multielement content of the samples and their sources was studied. It was found necessary to analyze samples before and after ashing because, while ashing improves the peak-to-background ratio, volatile elements may be lost. Samples collected at heights of 1.5 and 15 m were found to have the same elemental composition. Samples collected on consecutive filters showed sharp elemental fractionation. An example of source identification is given for a high-traffic area as compared with an area in which a large bromine plant is located. In both cases different Pb:Br ratios were observed, with high bromine concentration in the latter. The analytical method used was based on the detection of characteristic K and L X-rays induced by a ²⁴¹Am-I source-target assembly. (T.G.)

  17. Single sources in the low-frequency gravitational wave sky: properties and time to detection by pulsar timing arrays

    Science.gov (United States)

    Kelley, Luke Zoltan; Blecha, Laura; Hernquist, Lars; Sesana, Alberto; Taylor, Stephen R.

    2018-06-01

    We calculate the properties, occurrence rates and detection prospects of individually resolvable `single sources' in the low-frequency gravitational wave (GW) spectrum. Our simulations use the population of galaxies and massive black hole binaries from the Illustris cosmological hydrodynamic simulations, coupled to comprehensive semi-analytic models of the binary merger process. Using mock pulsar timing arrays (PTA) with, for the first time, varying red-noise models, we calculate plausible detection prospects for GW single sources and the stochastic GW background (GWB). Contrary to previous results, we find that single sources are at least as detectable as the GW background. Using mock PTA, we find that these `foreground' sources (also `deterministic'/`continuous') are likely to be detected with ∼20 yr total observing baselines. Detection prospects, and indeed the overall properties of single sources, are only moderately sensitive to binary evolution parameters - namely eccentricity and environmental coupling, which can lead to differences of ∼5 yr in times to detection. Red noise has a stronger effect, roughly doubling the time to detection of the foreground between a white-noise only model (∼10-15 yr) and severe red noise (∼20-30 yr). The effect of red noise on the GWB is even stronger, suggesting that single source detections may be more robust. We find that typical signal-to-noise ratios for the foreground peak near f = 0.1 yr⁻¹, and are much less sensitive to the continued addition of new pulsars to PTA.

  18. Theory of threshold phenomena

    International Nuclear Information System (INIS)

    Hategan, Cornel

    2002-01-01

    Theory of Threshold Phenomena in Quantum Scattering is developed in terms of Reduced Scattering Matrix. Relationships of different types of threshold anomalies both to nuclear reaction mechanisms and to nuclear reaction models are established. Magnitude of threshold effect is related to spectroscopic factor of zero-energy neutron state. The Theory of Threshold Phenomena, based on Reduced Scattering Matrix, does establish relationships between different types of threshold effects and nuclear reaction mechanisms: the cusp and non-resonant potential scattering, s-wave threshold anomaly and compound nucleus resonant scattering, p-wave anomaly and quasi-resonant scattering. A threshold anomaly related to resonant or quasi resonant scattering is enhanced provided the neutron threshold state has large spectroscopic amplitude. The Theory contains, as limit cases, Cusp Theories and also results of different nuclear reactions models as Charge Exchange, Weak Coupling, Bohr and Hauser-Feshbach models. (author)

  19. Patient Perceptions of Breast Cancer Risk in Imaging-Detected Low-Risk Scenarios and Thresholds for Desired Intervention: A Multi-Institution Survey.

    Science.gov (United States)

    Grimm, Lars J; Shelby, Rebecca A; Knippa, Emily E; Langman, Eun L; Miller, Lauren S; Whiteside, Beth E; Soo, Mary Scott C

    2018-06-01

    To determine women's perceptions of breast cancer risk and thresholds for desiring biopsy when considering BI-RADS 3 and 4A scenarios and recommendations, respectively. Women presenting for screening mammography from five geographically diverse medical centers were surveyed. Demographic information and baseline anxiety were queried. Participants were presented with scenarios of short-term imaging follow-up recommendations (ie, BI-RADS 3) and biopsy recommendations (ie, BI-RADS 4A) for low-risk mammographic abnormalities and asked to estimate their breast cancer risk for each scenario. Participants reported the threshold (ie, likelihood of cancer) at which they would feel comfortable undergoing short-term imaging follow-up and biopsy, and their anticipated regret for choosing short-term follow-up versus biopsy. Analysis of 2,747 surveys showed that participants estimated breast cancer risks of 32.8% for the BI-RADS 3 scenario and 41.1% for the BI-RADS 4A scenario, significantly greater than the clinically established rates (<2% [P < .001] and 2%-10% [P < .001], respectively). Over one-half (55.4%) of participants reported they would never want imaging follow-up if there was any chance of cancer; two-thirds (66.2%) reported they would desire biopsy if there was any chance of cancer. Participants reported greater anticipated regret (P < .001) and less relief and confidence (P < .001) with the decision to undergo follow-up imaging versus biopsy. Women overestimate the breast cancer risk associated with both BI-RADS 3 and 4A scenarios and desire very low biopsy thresholds. Greater anticipated regret and less relief and confidence were reported with the choice to undergo short-term imaging follow-up compared with biopsy. Copyright © 2018 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  20. Enhancing the performance of a tensioned metastable fluid detector based active interrogation system for the detection of SNM in interrogation source in moderated/reflected geometries

    Science.gov (United States)

    Grimes, T. F.; Hagen, A. R.; Archambault, B. C.; Taleyarkhan, R. P.

    2018-03-01

    This paper describes the development of a SNM detection system for interrogating 1 m³ cargos via the combination of a D-D neutron interrogation source (with and without reflectors) and tensioned metastable fluid detectors (TMFDs). TMFDs have been previously shown (Taleyarkhan et al., 2008; Grimes et al., 2015; Grimes and Taleyarkhan, 2016; Archambault et al., 2017; Hagen et al., 2016) to be capable of using Threshold Energy Neutron Analysis (TENA) techniques to reject the ∼2.45 MeV D-D interrogating neutrons while still remaining sensitive to >2.45 MeV neutrons resulting from fission in the target (HEU) material. In order to enhance the performance, a paraffin reflector was included around the accelerator head. This reflector was used to direct neutrons into the package to increase the fission signal, lower the energy of the interrogating neutrons to increase the fission cross-section with HEU, and also to direct interrogating neutrons away from the detectors in order to enhance the required discrimination between interrogating and fission neutrons. Experiments performed with a ²³⁹Pu-Be neutron source and MnO₂ indicated that impressive performance gains could be made by placing a parabolic paraffin moderator between the interrogation source and an air-filled cargo container with HEU placed at the center. However, experiments with other cargo fillers (as specified in the well-known ANSI N42.41-2007 report), and with HEU placed in locations other than the center of the package, indicated that other reflector geometries might be superior due to over-"focusing" and the increased solid angle effects due to the accommodation of the moderator geometry. The best performance for the worst case of source location and box fill was obtained by placing the reflector only behind the D-D neutron source rather than in front of it. Finally, it was shown that there could be significant gains in the ability to detect concealed SNM by operating the system in multiple geometric

  1. Detection of pulmonary nodules at paediatric CT: maximum intensity projections and axial source images are complementary

    International Nuclear Information System (INIS)

    Kilburn-Toppin, Fleur; Arthurs, Owen J.; Tasker, Angela D.; Set, Patricia A.K.

    2013-01-01

    Maximum intensity projection (MIP) images might be useful in helping to differentiate small pulmonary nodules from adjacent vessels on thoracic multidetector CT (MDCT). The aim was to evaluate the benefits of axial MIP images over axial source images for the paediatric chest in an interobserver variability study. We included 46 children with extra-pulmonary solid organ malignancy who had undergone thoracic MDCT. Three radiologists independently read 2-mm axial and 10-mm MIP image datasets, recording the number of nodules, size and location, overall time taken and confidence. There were 83 nodules (249 total reads among three readers) in 46 children (mean age 10.4 ± 4.98 years, range 0.3-15.9 years; 24 boys). Consensus read was used as the reference standard. Overall, three readers recorded significantly more nodules on MIP images (228 vs. 174; P < 0.05), improving sensitivity from 67% to 77.5% (P < 0.05) but with lower positive predictive value (96% vs. 85%, P < 0.005). MIP images took significantly less time to read (71.6 ± 43.7 s vs. 92.9 ± 48.7 s; P < 0.005) but did not improve confidence levels. Using 10-mm axial MIP images for nodule detection in the paediatric chest enhances diagnostic performance, improving sensitivity and reducing reading time when compared with conventional axial thin-slice images. Axial MIP and axial source images are complementary in thoracic nodule detection. (orig.)

  2. Nasu 1.4 GHz Interferometer Transient Radio Source Survey and Improvement in Detection of Radio Sources

    International Nuclear Information System (INIS)

    Matsumura, Nobuo; Kuniyoshi, Masaya; Takefuji, Kazuhiro; Niinuma, Kotaro; Kida, Sumiko; Takeuchi, Akihiko; Asuma, Kuniyuki; Daishido, Tsuneaki

    2006-01-01

    We have surveyed 1.4 GHz transient radio sources at the Nasu Pulsar Observatory. To investigate such sources, both immediacy and accuracy are strictly required. We have developed a Data Transfer System and improved the antenna control system. We have now received fringe data from transient radio source candidates. To obtain reliable information, we carefully analyze the data with Fringe Band Pass Filter software and the Fringe Fitting method

  3. Open-Source Radiation Exposure Extraction Engine (RE3) with Patient-Specific Outlier Detection.

    Science.gov (United States)

    Weisenthal, Samuel J; Folio, Les; Kovacs, William; Seff, Ari; Derderian, Vana; Summers, Ronald M; Yao, Jianhua

    2016-08-01

    We present an open-source, picture archiving and communication system (PACS)-integrated radiation exposure extraction engine (RE3) that provides study-, series-, and slice-specific data for automated monitoring of computed tomography (CT) radiation exposure. RE3 was built using open-source components and seamlessly integrates with the PACS. RE3 calculations of dose length product (DLP) from the Digital Imaging and Communications in Medicine (DICOM) headers showed high agreement (R² = 0.99) with the vendor dose pages. For study-specific outlier detection, RE3 constructs robust, automatically updating multivariable regression models to predict DLP in the context of patient gender and age, scan length, water-equivalent diameter (Dw), and scanned body volume (SBV). As proof of concept, the model was trained on 811 CT chest, abdomen + pelvis (CAP) exams and 29 outliers were detected. The continuous variables used in the outlier detection model were scan length (R² = 0.45), Dw (R² = 0.70), SBV (R² = 0.80), and age (R² = 0.01). The categorical variables were gender (male average 1182.7 ± 26.3 and female 1047.1 ± 26.9 mGy cm) and pediatric status (pediatric average 710.7 ± 73.6 mGy cm and adult 1134.5 ± 19.3 mGy cm).
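
    The outlier-detection step lends itself to a compact illustration: regress DLP on the continuous covariates and flag exams whose residuals are far from the prediction. The sketch below is a minimal stand-in, not the released RE3 code; the covariate choice, the synthetic data and the 3-sigma flagging rule are assumptions for illustration.

    ```python
    # Minimal sketch of study-specific outlier detection in the spirit of RE3
    # (not the released code): fit DLP on covariates, flag large residuals.
    import numpy as np

    def fit_dlp_model(X, dlp):
        """Ordinary least-squares fit of DLP on covariates (with intercept)."""
        A = np.column_stack([np.ones(len(X)), X])
        coef, *_ = np.linalg.lstsq(A, dlp, rcond=None)
        return coef

    def flag_outliers(X, dlp, coef, n_sigma=3.0):
        """Flag exams whose DLP deviates from the prediction by > n_sigma residual SDs."""
        A = np.column_stack([np.ones(len(X)), X])
        residuals = dlp - A @ coef
        sigma = residuals.std(ddof=A.shape[1])
        return np.abs(residuals) > n_sigma * sigma

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # toy covariates: scan length [cm], water-equivalent diameter [cm], age [yr]
        X = rng.uniform([30, 20, 20], [70, 40, 80], size=(500, 3))
        dlp = 10 * X[:, 0] + 25 * X[:, 1] + rng.normal(0, 60, 500)
        dlp[:5] += 800                      # inject a few anomalously high exams
        coef = fit_dlp_model(X, dlp)
        print("flagged exams:", np.flatnonzero(flag_outliers(X, dlp, coef)))
    ```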

  4. Detecting and analyzing soil phosphorus loss associated with critical source areas using a remote sensing approach.

    Science.gov (United States)

    Lou, Hezhen; Yang, Shengtian; Zhao, Changsen; Shi, Liuhua; Wu, Linna; Wang, Yue; Wang, Zhiwei

    2016-12-15

    The detection of critical source areas (CSAs) is a key step in managing soil phosphorus (P) loss and preventing the long-term eutrophication of water bodies at regional scale. Most related studies, however, focus on a local scale, which prevents a clear understanding of the spatial distribution of CSAs for soil P loss at regional scale. Moreover, the continual, long-term variation in CSAs has scarcely been reported. It is impossible to identify the factors driving the variation in CSAs, or to collect the land surface information essential for CSA detection, by merely using conventional methodologies at regional scale. This study proposes a new regional-scale approach, based on three satellite sensors (ASTER, TM/ETM and MODIS), that was implemented successfully to detect CSAs at regional scale over 15 years (2000-2014). The approach incorporated five factors (precipitation, slope, soil erosion, land use, soil total phosphorus) that drive soil P loss from CSAs. Results show that the average area of critical phosphorus source areas (CPSAs) was 15,056 km² over the 15-year period, and it occupied 13.8% of the total area, with a range varying from 1.2% to 23.0%, in a representative, intensive agricultural area of China. In contrast to previous studies, we found that the locations of CSAs with P loss are spatially variable, and are more dispersed in their distribution over the long term. We also found that precipitation acts as a key driving factor in the variation of CSAs at regional scale. The regional-scale method can provide scientific guidance for managing soil phosphorus loss and preventing the long-term eutrophication of water bodies at regional scale, and shows great potential for exploring factors that drive the variation in CSAs at global scale. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. A novel method for detecting light source for digital images forensic

    Science.gov (United States)

    Roy, A. K.; Mitra, S. K.; Agrawal, R.

    2011-06-01

    Manipulation of images has been practiced for centuries. These manipulated images are intended to alter facts: facts of ethics, morality, politics, sex, celebrity or chaos. Image forensic science is used to detect these manipulations in a digital image. There are several standard ways to analyze an image for manipulation, each with some limitations. Also, very rarely has any method tried to capitalize on the way the image was taken by the camera. We propose a new method that is based on light and its shade, as light and shade are the fundamental input resources that may carry all the information of the image. The proposed method measures the direction of the light source and uses this light-based technique to identify any intentional partial manipulation in the digital image. The method is tested on known manipulated images to verify that it correctly identifies the light sources. The light source of an image is measured in terms of an angle. The experimental results show the robustness of the methodology.
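
    The abstract does not spell out the estimator, so the following is only a hedged illustration of the general idea: a crude proxy for the dominant illumination direction is the magnitude-weighted mean direction of the image intensity gradient, and disagreement between regions can then hint at splicing. The function name and the synthetic test image are assumptions, not the authors' method.

    ```python
    # Hedged illustration only (not the paper's algorithm): estimate a 2D
    # illumination angle from the magnitude-weighted mean intensity gradient.
    import numpy as np

    def dominant_light_angle(gray):
        """Estimate a 2D illumination angle (degrees) from intensity gradients."""
        gy, gx = np.gradient(gray.astype(float))
        mag = np.hypot(gx, gy)
        # gradients point toward brighter pixels, i.e. roughly toward the light
        angle = np.degrees(np.arctan2((gy * mag).sum(), (gx * mag).sum()))
        return angle

    if __name__ == "__main__":
        # synthetic image lit from the left: intensity falls off with x
        x = np.linspace(1.0, 0.0, 256)
        img = np.tile(x, (256, 1))
        print("estimated illumination angle: %.1f deg" % dominant_light_angle(img))
    ```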

  6. Particle Filter with Integrated Voice Activity Detection for Acoustic Source Tracking

    Directory of Open Access Journals (Sweden)

    Anders M. Johansson

    2007-01-01

    Full Text Available In noisy and reverberant environments, the problem of acoustic source localisation and tracking (ASLT) using an array of microphones presents a number of challenging difficulties. One of the main issues when considering real-world situations involving human speakers is the temporally discontinuous nature of speech signals: the presence of silence gaps in the speech can easily misguide the tracking algorithm, even in practical environments with low to moderate noise and reverberation levels. A natural extension of currently available sound source tracking algorithms is the integration of a voice activity detection (VAD) scheme. We describe a new ASLT algorithm based on a particle filtering (PF) approach, where VAD measurements are fused within the statistical framework of the PF implementation. Tracking accuracy results for the proposed method are presented on the basis of synthetic audio samples generated with the image method, whereas performance results obtained with a real-time implementation of the algorithm, and using real audio data recorded in a reverberant room, are published elsewhere. Compared to a previously proposed PF algorithm, the experimental results demonstrate the improved robustness of the method described in this work when tracking sources emitting real-world speech signals, which typically involve significant silence gaps between utterances.
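
    As a rough sketch of how a VAD flag can be fused into a bootstrap particle filter (a minimal illustration under stated assumptions, not the authors' algorithm), the filter below applies the measurement update and resampling only when the current frame is flagged as speech, so silence gaps merely diffuse the particles instead of misguiding the tracker. All motion and measurement models and parameter values are assumptions.

    ```python
    # Rough sketch: bootstrap particle filter whose measurement update is
    # gated by a VAD flag; during silence the particles only diffuse.
    import numpy as np

    rng = np.random.default_rng(1)

    def predict(particles, motion_std=0.05):
        """Random-walk motion model for the (x, y) source position."""
        return particles + rng.normal(0.0, motion_std, particles.shape)

    def update(particles, weights, measurement, meas_std=0.3):
        """Weight particles by a Gaussian likelihood around the measured position."""
        d2 = np.sum((particles - measurement) ** 2, axis=1)
        weights *= np.exp(-0.5 * d2 / meas_std**2) + 1e-300
        weights /= weights.sum()
        return weights

    def resample(particles, weights):
        idx = rng.choice(len(particles), size=len(particles), p=weights)
        return particles[idx], np.full(len(particles), 1.0 / len(particles))

    def track(measurements, vad_flags, n_particles=500):
        particles = rng.uniform(-1, 1, size=(n_particles, 2))
        weights = np.full(n_particles, 1.0 / n_particles)
        estimates = []
        for z, speech_active in zip(measurements, vad_flags):
            particles = predict(particles)
            if speech_active:                 # fuse localization only during speech
                weights = update(particles, weights, z)
                particles, weights = resample(particles, weights)
            estimates.append(weights @ particles)
        return np.array(estimates)

    if __name__ == "__main__":
        true_path = np.column_stack([np.linspace(0, 1, 50), np.linspace(0, 0.5, 50)])
        noisy = true_path + rng.normal(0, 0.2, true_path.shape)
        vad = rng.random(50) > 0.3            # ~70% of frames contain speech
        print(track(noisy, vad)[-1])          # final position estimate
    ```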

  7. Application of leftover sample material from waterborne protozoa monitoring for the molecular detection of Bacteroidales and fecal source tracking markers

    Science.gov (United States)

    In this study, we examined the potential for detecting fecal bacteria and microbial source tracking markers in samples discarded during the concentration of Cryptosporidium and Giardia using USEPA Method 1623. Recovery rates for different fecal bacteria were determined using sp...

  8. Near-infrared Polarimetry of the Outflow Source AFGL 6366S: Detection of Circular Polarization

    Science.gov (United States)

    Kwon, Jungmi; Nakagawa, Takao; Tamura, Motohide; Hough, James H.; Kandori, Ryo; Choi, Minho; Kang, Miju; Cho, Jungyeon; Nakajima, Yasushi; Nagata, Tetsuya

    2018-07-01

    We have carried out near-infrared circular and linear imaging polarimetry of the AFGL 6366S region. There is one large infrared reflection nebula associated with the AFGL 6366S cluster and one small infrared reflection nebula associated with AFGL 6366S NE. Prominent and extended polarized nebulosities over the AFGL 6366S cluster field are found to be composed of several components and local nebula peaks, and those nebulosities are illuminated by at least three sources, which is roughly consistent with a previous study. However, the detailed linear polarization patterns and their degrees differ from the earlier study. The brightest regions of the nebulae are illuminated by the IRAS/WISE source. In addition, we report the first detection of circular polarization (CP) in the reflection nebula associated with AFGL 6366S. The CP is as large as approximately 4% in the Ks band, and the maximum CP extent is approximately 0.45 pc, which is comparable to that for the largest CP regions known to date, such as Orion and Mon R2, although the CP degrees are much smaller. The CP pattern is mostly quadrupolar, and its morphology resembles the shape of the C¹⁸O dense core. Therefore, the CP region is probably illuminated by the IRAS/WISE source and its polarization is amplified by the dichroic absorption of the dense core associated with the cluster. This is the ninth source whose degrees of CPs are measured to be greater than 3%, suggesting that large and extended infrared CP regions are common among mid- to high-mass young stellar objects.

  9. In-source collision induced dissociation of inorganic explosives for mass spectrometric signature detection and chemical imaging

    Energy Technology Data Exchange (ETDEWEB)

    Forbes, Thomas P., E-mail: thomas.forbes@nist.gov; Sisco, Edward

    2015-09-10

    The trace detection, bulk quantification, and chemical imaging of inorganic explosives and components was demonstrated utilizing in-source collision induced dissociation (CID) coupled with laser desorption/ionization mass spectrometry (LDI-MS). The incorporation of in-source CID provided direct control over the extent of adduct and cluster fragmentation as well as organic noise reduction for the enhanced detection of both the elemental and molecular ion signatures of fuel-oxidizer mixtures and other inorganic components of explosive devices. Investigation of oxidizer molecular anions, specifically, nitrates, chlorates, and perchlorates, identified that the optimal in-source CID existed at the transition between fragmentation of the ionic salt bonds and molecular anion bonds. The chemical imaging of oxidizer particles from latent fingerprints was demonstrated, including both cation and anion components in positive and negative mode mass spectrometry, respectively. This investigation demonstrated LDI-MS with in-source CID as a versatile tool for security fields, as well as environmental monitoring and nuclear safeguards, facilitating the detection of elemental and molecular inorganic compounds at nanogram levels. - Highlights: • In-source CID enhanced detection of elemental inorganics up to 1000-fold. • In-source CID optimization of polyatomic oxidizers enhanced detection up to 100-fold. • Optimal CID identified at transition from breaking ionic salt to molecular anion bonds. • Trace detection of inorganic explosives at nanogram levels was demonstrated. • Oxidizer particles were chemically imaged directly from latent fingerprints.

  10. Sources

    International Nuclear Information System (INIS)

    Duffy, L.P.

    1991-01-01

    This paper discusses the sources of radiation in the narrow perspective of radioactivity and the even narrower perspective of those sources that concern environmental management and restoration activities at DOE facilities, as well as a few related sources: sources of irritation, sources of inflammatory jingoism, and sources of information. First, the sources of irritation fall into three categories: no reliable scientific ombudsman to speak without bias and prejudice for the public good; technical jargon with unclear definitions within the radioactive nomenclature; and a scientific community that keeps a low profile with regard to public information. The next area of personal concern is the sources of inflammation. These include such things as: plutonium being described as the most dangerous substance known to man; the amount of plutonium required to make a bomb; talk of transuranic waste containing plutonium and its health effects; TMI-2 and Chernobyl being described as Siamese twins; inadequate information on low-level disposal sites and current regulatory requirements under 10 CFR 61; and enhanced engineered waste disposal not being presented to the public accurately. Finally, there are numerous sources of disinformation regarding low-level and high-level radiation, the elusive nature of the scientific community, the Federal and State health agencies' resources to address comparative risk, and regulatory agencies speaking out without the support of the scientific community.

  11. Testing seismic amplitude source location for fast debris-flow detection at Illgraben, Switzerland

    Science.gov (United States)

    Walter, Fabian; Burtin, Arnaud; McArdell, Brian W.; Hovius, Niels; Weder, Bianca; Turowski, Jens M.

    2017-06-01

    Heavy precipitation can mobilize tens to hundreds of thousands of cubic meters of sediment in steep Alpine torrents in a short time. The resulting debris flows (mixtures of water, sediment and boulders) move downstream with velocities of several meters per second and have a high destruction potential. Warning protocols for affected communities rely on raising awareness about the debris-flow threat, precipitation monitoring and rapid detection methods. The latter, in particular, is a challenge because debris-flow-prone torrents have their catchments in steep and inaccessible terrain, where instrumentation is difficult to install and maintain. Here we test amplitude source location (ASL) as a processing scheme for seismic network data for early warning purposes. We use debris-flow and noise seismograms from the Illgraben catchment, Switzerland, a torrent system which produces several debris-flow events per year. Automatic in situ detection is currently based on geophones mounted on concrete check dams and radar stage sensors suspended above the channel. The ASL approach has the advantage that it uses seismometers, which can be installed at more accessible locations where a stable connection to mobile phone networks is available for data communication. Our ASL processing uses time-averaged ground vibration amplitudes to estimate the location of the debris-flow front. Applied to continuous data streams, inversion of the seismic amplitude decay throughout the network is robust and efficient, requires no manual identification of seismic phase arrivals and eliminates the need for a local seismic velocity model. We apply the ASL technique to a small debris-flow event on 19 July 2011, which was captured with a temporary seismic monitoring network. The processing rapidly detects the debris-flow event half an hour before arrival at the outlet of the torrent and several minutes before detection by the in situ alarm system. An analysis of continuous seismic records furthermore
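
    The inversion step can be illustrated with a simplified amplitude-decay model: assume each station records an amplitude governed by geometrical spreading plus exponential attenuation, and grid-search the source position that best explains the observed amplitudes. The decay law, station layout and parameter values in the sketch below are assumptions for illustration, not the configuration used in the study.

    ```python
    # Simplified illustration of the amplitude source location (ASL) idea:
    # grid-search the position minimising the amplitude-decay misfit.
    import numpy as np

    def predicted_amplitude(a0, dist, b=0.02):
        """Amplitude model: geometrical spreading plus exponential attenuation."""
        return a0 * np.exp(-b * dist) / np.sqrt(dist)

    def locate(stations, observed, grid, b=0.02):
        """Return the grid node minimising the least-squares amplitude misfit."""
        best, best_cost = None, np.inf
        for node in grid:
            dist = np.linalg.norm(stations - node, axis=1) + 1e-6
            shape = np.exp(-b * dist) / np.sqrt(dist)
            a0 = (observed @ shape) / (shape @ shape)   # closed-form best A0
            cost = np.sum((observed - a0 * shape) ** 2)
            if cost < best_cost:
                best, best_cost = node, cost
        return best

    if __name__ == "__main__":
        stations = np.array([[0., 0.], [1000., 0.], [0., 1000.], [1000., 1000.]])
        true_src = np.array([300., 700.])
        dist = np.linalg.norm(stations - true_src, axis=1)
        observed = predicted_amplitude(5.0, dist)
        xs = np.linspace(0, 1000, 51)
        grid = np.array([[x, y] for x in xs for y in xs])
        print("estimated source position:", locate(stations, observed, grid))
    ```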

  12. Testing seismic amplitude source location for fast debris-flow detection at Illgraben, Switzerland

    Directory of Open Access Journals (Sweden)

    F. Walter

    2017-06-01

    Full Text Available Heavy precipitation can mobilize tens to hundreds of thousands of cubic meters of sediment in steep Alpine torrents in a short time. The resulting debris flows (mixtures of water, sediment and boulders) move downstream with velocities of several meters per second and have a high destruction potential. Warning protocols for affected communities rely on raising awareness about the debris-flow threat, precipitation monitoring and rapid detection methods. The latter, in particular, is a challenge because debris-flow-prone torrents have their catchments in steep and inaccessible terrain, where instrumentation is difficult to install and maintain. Here we test amplitude source location (ASL) as a processing scheme for seismic network data for early warning purposes. We use debris-flow and noise seismograms from the Illgraben catchment, Switzerland, a torrent system which produces several debris-flow events per year. Automatic in situ detection is currently based on geophones mounted on concrete check dams and radar stage sensors suspended above the channel. The ASL approach has the advantage that it uses seismometers, which can be installed at more accessible locations where a stable connection to mobile phone networks is available for data communication. Our ASL processing uses time-averaged ground vibration amplitudes to estimate the location of the debris-flow front. Applied to continuous data streams, inversion of the seismic amplitude decay throughout the network is robust and efficient, requires no manual identification of seismic phase arrivals and eliminates the need for a local seismic velocity model. We apply the ASL technique to a small debris-flow event on 19 July 2011, which was captured with a temporary seismic monitoring network. The processing rapidly detects the debris-flow event half an hour before arrival at the outlet of the torrent and several minutes before detection by the in situ alarm system. An analysis of continuous seismic

  13. Advanced Laser-Compton Gamma-Ray Sources for Nuclear Materials Detection, Assay and Imaging

    Science.gov (United States)

    Barty, C. P. J.

    2015-10-01

    Highly-collimated, polarized, mono-energetic beams of tunable gamma-rays may be created via the optimized Compton scattering of pulsed lasers off of ultra-bright, relativistic electron beams. Above 2 MeV, the peak brilliance of such sources can exceed that of the world's largest synchrotrons by more than 15 orders of magnitude and can enable for the first time the efficient pursuit of nuclear science and applications with photon beams, i.e. Nuclear Photonics. Potential applications are numerous and include isotope-specific nuclear materials management, element-specific medical radiography and radiology, non-destructive, isotope-specific, material assay and imaging, precision spectroscopy of nuclear resonances and photon-induced fission. This review covers activities at the Lawrence Livermore National Laboratory related to the design and optimization of mono-energetic, laser-Compton gamma-ray systems and introduces isotope-specific nuclear materials detection and assay applications enabled by them.

  14. The optimal on-source region size for detections with counting-type telescopes

    Energy Technology Data Exchange (ETDEWEB)

    Klepser, Stefan

    2017-01-15

    The on-source region is typically a circular area with radius θ in which the signal is expected to appear with the shape of the instrument point spread function (PSF). This paper addresses the question of what is the θ that maximises the probability of detection for a given PSF width and background event density. In the high count number limit and assuming a Gaussian PSF profile, the optimum is found to be at ζ²_∞ ∼ 2.51 times the squared PSF width σ²_PSF39. While this number is shown to be a good choice in many cases, a dynamic formula for cases of lower count numbers, which favour larger on-source regions, is given. The recipe to get to this parametrisation can also be applied to cases with a non-Gaussian PSF. This result can standardise and simplify analysis procedures, reduce trials and eliminate the need for experience-based ad hoc cut definitions or expensive case-by-case Monte Carlo simulations.
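
    The quoted optimum can be reproduced with a few lines of numerics: for a 2D Gaussian PSF the expected excess inside radius θ grows as 1 − exp(−θ²/2σ²), while the background grows with the enclosed area, so the high-count-limit significance S/√B peaks near θ² ≈ 2.51 σ². The snippet below is an illustrative check, not the paper's code.

    ```python
    # Quick numerical check of the high-count-limit optimum for a circular
    # on-source region with a 2D Gaussian PSF (illustrative, not the paper's code).
    import numpy as np

    sigma = 1.0                       # Gaussian PSF width (arbitrary units)
    theta = np.linspace(0.05, 5.0, 5000)

    signal = 1.0 - np.exp(-theta**2 / (2.0 * sigma**2))   # enclosed signal fraction
    background = np.pi * theta**2                          # proportional to area
    significance = signal / np.sqrt(background)

    theta_opt = theta[np.argmax(significance)]
    print("optimal theta^2 / sigma^2 = %.2f" % (theta_opt**2 / sigma**2))
    # prints ~2.51, matching the quoted optimum
    ```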

  15. Application of wireless sensor network to problems of detection and tracking of radioactive sources

    International Nuclear Information System (INIS)

    Dupuy, P

    2006-01-01

    International efforts are being conducted to guarantee continuous control of radioactive sources. A theoretical and practical study has been carried out on the feasibility of installing wireless sensor networks in nuclear installations, or plants that use radioactive material. The study is approached through the implementation of a system designed on the relatively new platform of motes, which gives great flexibility for distributing sensors while taking advantage of new wireless technologies and high-level programming. The work presents an analysis of the state of the art of sensors, detectors, antennas and power supplies, including nuclear power supplies. It also presents contributions to these fields through experimentation and proposed designs. Three applications that justify the technology are shown and a demonstration project is proposed. The societal benefits of the system are basically a technical approach to the continuous control of radioactive sources during their life cycle and the online monitoring of staff, with the possibility of identifying and optimizing the procedures that produce the highest exposures in practice, or of detecting potential exposures [es]

  16. The optimal on-source region size for detections with counting-type telescopes

    International Nuclear Information System (INIS)

    Klepser, Stefan

    2017-01-01

    The on-source region is typically a circular area with radius θ in which the signal is expected to appear with the shape of the instrument point spread function (PSF). This paper addresses the question of what is the θ that maximises the probability of detection for a given PSF width and background event density. In the high count number limit and assuming a Gaussian PSF profile, the optimum is found to be at ζ²_∞ ∼ 2.51 times the squared PSF width σ²_PSF39. While this number is shown to be a good choice in many cases, a dynamic formula for cases of lower count numbers, which favour larger on-source regions, is given. The recipe to get to this parametrisation can also be applied to cases with a non-Gaussian PSF. This result can standardise and simplify analysis procedures, reduce trials and eliminate the need for experience-based ad hoc cut definitions or expensive case-by-case Monte Carlo simulations.

  17. THE EARLY DETECTION OF THE EMERALD ASH BORER (EAB) USING MULTI-SOURCE REMOTELY SENSED DATA

    Directory of Open Access Journals (Sweden)

    B. Hu

    2018-04-01

    Full Text Available The objectives of this study were to exploit the synergy of hyperspectral imagery, Light Detection And Ranging (LiDAR) and high spatial resolution data in the early detection of the EAB (Emerald Ash Borer) presence in trees within urban areas, and to develop a framework to combine information extracted from multiple data sources. To achieve these, an object-oriented framework was developed to combine information derived from the available data sets to characterize ash trees. Within this framework, an advanced individual tree delineation method was developed to delineate individual trees using the high spatial resolution WorldView-3 imagery together with the LiDAR data. Individual trees were then classified as ash or non-ash trees using spectral and spatial information. In order to characterize the health state of individual ash trees, leaves from ash trees with various health states were sampled and measured using a field spectrometer. Based on the field measurements, the indices most sensitive to leaf chlorophyll content were selected. The developed framework and methods were tested using WorldView-3 imagery and airborne LiDAR data over the Keele campus of York University, Toronto, Canada. Satisfactory results were obtained in terms of individual tree crown delineation, ash tree identification and characterization of the health state of individual ash trees. Quantitative evaluation is being carried out.

  18. The Early Detection of the Emerald Ash Borer (eab) Using Multi-Source Remotely Sensed Data

    Science.gov (United States)

    Hu, B.; Naveed, F.; Tasneem, F.; Xing, C.

    2018-04-01

    The objectives of this study were to exploit the synergy of hyperspectral imagery, Light Detection And Ranging (LiDAR) and high spatial resolution data in the early detection of the EAB (Emerald Ash Borer) presence in trees within urban areas, and to develop a framework to combine information extracted from multiple data sources. To achieve these, an object-oriented framework was developed to combine information derived from the available data sets to characterize ash trees. Within this framework, an advanced individual tree delineation method was developed to delineate individual trees using the high spatial resolution WorldView-3 imagery together with the LiDAR data. Individual trees were then classified as ash or non-ash trees using spectral and spatial information. In order to characterize the health state of individual ash trees, leaves from ash trees with various health states were sampled and measured using a field spectrometer. Based on the field measurements, the indices most sensitive to leaf chlorophyll content were selected. The developed framework and methods were tested using WorldView-3 imagery and airborne LiDAR data over the Keele campus of York University, Toronto, Canada. Satisfactory results were obtained in terms of individual tree crown delineation, ash tree identification and characterization of the health state of individual ash trees. Quantitative evaluation is being carried out.

  19. Using Stable Isotopes to Detect Land Use Change and Nitrogen Sources in Aquatic Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rogers, K. M. [National Isotope Center, GNS Science, Lower Hutt (New Zealand)

    2013-05-15

    Changing land use is one of the primary causes of increased sedimentation and nutrient levels in aquatic systems, resulting in contamination and reduction of biodiversity. Detecting and quantifying these inputs is the first step towards remediation, and enabling targeted reductions of transport processes into waterways from human impacted land surfaces. More recently, stable isotope analyses are being used as detection and quantification tools in aquatic environments. Carbon (δ¹³C) and nitrogen (δ¹⁵N) isotopes of sediments, as well as algae and invertebrates from aquatic systems, can be used as proxies to record both short and long term environmental change. Excess nitrogen (or nitrogen-compounds) derived from urbanization, industry, forestry, farming and agriculture increases the bioavailability of nitrogen to aquatic organisms, changing their natural δ¹⁵N isotopic signatures. Allochthonous (terrestrial) input from soil destabilization and human activity in surrounding catchments changes δ¹³C isotopic compositions and increases the C:N ratio of sediments. Heavy metal and other organic pollutants can also be used to indicate urbanization and industrial contamination. The combined use of carbon and nitrogen isotopes, C:N ratios and heavy metals are powerful environmental monitoring tools, which are useful indicators of source and transport pathways of terrestrial derived material and anthropogenic pollutants into streams, rivers and estuaries. (author)

  20. Application of random-point processes to the detection of radiation sources

    International Nuclear Information System (INIS)

    Woods, J.W.

    1978-01-01

    In this report the mathematical theory of random-point processes is reviewed and it is shown how use of the theory can obtain optimal solutions to the problem of detecting radiation sources. As noted, the theory also applies to image processing in low-light-level or low-count-rate situations. Paralleling Snyder's work, the theory is extended to the multichannel case of a continuous, two-dimensional (2-D), energy-time space. This extension essentially involves showing that the data are doubly stochastic Poisson (DSP) point processes in energy as well as time. Further, a new 2-D recursive formulation is presented for the radiation-detection problem with large computational savings over nonrecursive techniques when the number of channels is large (greater than or equal to 30). Finally, some adaptive strategies for on-line ''learning'' of unknown, time-varying signal and background-intensity parameters and statistics are presented and discussed. These adaptive procedures apply when a complete statistical description is not available a priori
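
    For readers unfamiliar with Poisson detection theory, the simplest fixed-parameter version of the problem (a textbook likelihood-ratio detector, not the report's recursive, adaptive estimator) compares the observed counts against background-only and source-plus-background hypotheses. The sketch below illustrates it; the channel count, rates and the injected signature are assumptions.

    ```python
    # Textbook Poisson likelihood-ratio detector (illustrative only): for counts
    # k_i with known background b_i and candidate source contribution s_i,
    #   LLR = sum_i [ k_i * ln(1 + s_i / b_i) - s_i ],
    # and the source is declared present when LLR exceeds a threshold.
    import numpy as np

    def poisson_llr(counts, background, source):
        """Log-likelihood ratio of 'source + background' vs 'background only'."""
        return np.sum(counts * np.log1p(source / background) - source)

    if __name__ == "__main__":
        rng = np.random.default_rng(2)
        background = np.full(32, 5.0)               # expected background per channel
        source = np.zeros(32)
        source[10:14] = 4.0                         # hypothetical line-like signature
        k_bkg = rng.poisson(background)
        k_src = rng.poisson(background + source)
        print("LLR background only :", round(poisson_llr(k_bkg, background, source), 2))
        print("LLR with source     :", round(poisson_llr(k_src, background, source), 2))
    ```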

  1. Threshold Signature Schemes Application

    Directory of Open Access Journals (Sweden)

    Anastasiya Victorovna Beresneva

    2015-10-01

    Full Text Available This work is devoted to an investigation of threshold signature schemes. A systematization of threshold signature schemes was carried out, and cryptographic constructions based on Lagrange interpolation polynomials, elliptic curves and bilinear pairings were examined. Different methods of generating and verifying threshold signatures were explored, and the practical applicability of threshold schemes to mobile agents, Internet banking and e-currency was shown. Topics for further investigation were identified, which could reduce the level of counterfeit electronic documents signed by a group of users.

  2. Particles near threshold

    International Nuclear Information System (INIS)

    Bhattacharya, T.; Willenbrock, S.

    1993-01-01

    We propose returning to the definition of the width of a particle in terms of the pole in the particle's propagator. Away from thresholds, this definition of width is equivalent to the standard perturbative definition, up to next-to-leading order; however, near a threshold, the two definitions differ significantly. The width as defined by the pole position provides more information in the threshold region than the standard perturbative definition and, in contrast with the perturbative definition, does not vanish when a two-particle s-wave threshold is approached from below

  3. Implementing statistical analysis in multi-channel acoustic impact-echo testing of concrete bridge decks: Determining thresholds for delamination detection

    Science.gov (United States)

    Hendricks, Lorin; Spencer Guthrie, W.; Mazzeo, Brian

    2018-04-01

    An automated acoustic impact-echo testing device with seven channels has been developed for faster surveying of bridge decks. Due to potential variations in bridge deck overlay thickness, varying conditions between testing passes, and occasional imprecise equipment calibrations, a method that can account for variations in deck properties and testing conditions was necessary to correctly interpret the acoustic data. A new methodology involving statistical analyses was therefore developed. After acoustic impact-echo data are collected and analyzed, the results are normalized by the median for each channel, a Gaussian distribution is fit to the histogram of the data, and the Kullback-Leibler divergence test or Otsu's method is then used to determine the optimum threshold for differentiating between intact and delaminated concrete. The new methodology was successfully applied to individual channels of previously unusable acoustic impact-echo data obtained from a three-lane interstate bridge deck surfaced with a polymer overlay, and the resulting delamination map compared very favorably with the results of a manual deck sounding survey.
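
    As a rough, self-contained illustration of two of the steps described above (per-channel median normalization followed by Otsu's between-class-variance threshold), the sketch below uses synthetic feature values; the Gaussian-fit and Kullback-Leibler variant is omitted, and all numbers are assumptions rather than the study's data.

    ```python
    # Illustrative sketch of median normalization + Otsu thresholding for
    # separating "intact" from "delaminated" acoustic readings (synthetic data).
    import numpy as np

    def otsu_threshold(values, n_bins=256):
        """Return the threshold maximising between-class variance (Otsu's method)."""
        hist, edges = np.histogram(values, bins=n_bins)
        p = hist.astype(float) / hist.sum()
        centers = 0.5 * (edges[:-1] + edges[1:])
        w0 = np.cumsum(p)
        w1 = 1.0 - w0
        mu0 = np.cumsum(p * centers) / np.where(w0 > 0, w0, 1)
        mu_total = (p * centers).sum()
        mu1 = (mu_total - np.cumsum(p * centers)) / np.where(w1 > 0, w1, 1)
        between = w0 * w1 * (mu0 - mu1) ** 2
        return centers[np.argmax(between)]

    def normalize_by_channel_median(features):
        """features: (n_channels, n_points) acoustic feature values."""
        return features / np.median(features, axis=1, keepdims=True)

    if __name__ == "__main__":
        rng = np.random.default_rng(3)
        intact = rng.normal(1.0, 0.1, (7, 900))    # 7 channels, mostly intact deck
        delam = rng.normal(2.0, 0.3, (7, 100))     # smaller delaminated population
        features = normalize_by_channel_median(np.concatenate([intact, delam], axis=1))
        thr = otsu_threshold(features.ravel())
        print("Otsu threshold on median-normalized feature: %.2f" % thr)
        print("fraction flagged as delaminated: %.3f" % (features > thr).mean())
    ```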

  4. sources

    Directory of Open Access Journals (Sweden)

    Shu-Yin Chiang

    2002-01-01

    Full Text Available In this paper, we study simplified models of an ATM (Asynchronous Transfer Mode) multiplexer network with Bernoulli random traffic sources. Based on the model, the performance measures are analyzed under different output service schemes.

  5. High-field quench behavior and dependence of hot spot temperature on quench detection voltage threshold in a Bi2Sr2CaCu2Ox coil

    International Nuclear Information System (INIS)

    Shen, Tengming; Ye, Liyang; Turrioni, Daniele; Li, Pei

    2015-01-01

    Small insert solenoids have been built using a multifilamentary Ag/Bi₂Sr₂CaCu₂Oₓ round wire insulated with a mullite sleeve (∼100 μm in thickness) and characterized in background fields to explore the quench behaviors and limits of Bi₂Sr₂CaCu₂Oₓ superconducting magnets, with an emphasis on assessing the impact of slow normal zone propagation on quench detection. Using heaters of various lengths to initiate a small normal zone, a coil was quenched safely more than 70 times without degradation, with the maximum coil temperature reaching 280 K. Coils withstood a resistive voltage of tens of mV for seconds without quenching, showing the high stability of these coils and suggesting that the quench detection voltage should be greater than 50 mV in order not to falsely trigger protection. The hot spot temperature for the resistive voltage of the normal zone to reach 100 mV increased from ∼40 K to ∼80 K while increasing the operating wire current density J_o from 89 A mm⁻² to 354 A mm⁻², whereas for the voltage to reach 1 V, it increased from ∼60 K to ∼140 K. This shows the increasing negative impact of slow normal zone propagation on quench detection with increasing J_o and the need to limit the quench detection voltage to <1 V. These measurements, coupled with an analytical quench model, were used to assess the impact of the maximum allowable detection voltage and temperature upon quench detection on the quench protection, assuming a limit of the hot spot temperature to <300 K. (paper)

  6. Flash-sourcing or the rapid detection and characterisation of earthquake effects through clickstream data analysis

    Science.gov (United States)

    Bossu, R.; Mazet-Roux, G.; Roussel, F.; Frobert, L.

    2011-12-01

    crowd-to-agency system, but unlike them it is not based on declarative information (e.g. answers to a questionnaire) but on implicit data, the clickstream observed on our website. We first present the main improvements of the method: improved detection of traffic surges and a way to instantly map areas affected by severe damage or network disruptions. The second part describes how the derived information improves and speeds up public earthquake information and, beyond seismology, what it can teach us about public behaviour when facing an earthquake. Finally, the discussion focuses on future evolutions and how flash-sourcing could ultimately improve earthquake response.

  7. Use of basic principle of nucleation in determining temperature-threshold neutron energy relationship in superheated emulsions

    CERN Document Server

    Das, M; Chatterjee, B K; Roy, S C

    2003-01-01

    Detection of neutrons through use of superheated emulsions has been known for about two decades. The minimum neutron energy (threshold) required to nucleate drops of a given liquid has a dependence on the temperature of the liquid. The basic principle of nucleation has been utilized to find the relationship between the operating temperature and threshold neutron energy for superheated emulsions made of R-114 liquid. The threshold energy thus determined for different temperatures has been compared with accurate experimental results obtained using monoenergetic neutron sources. The agreement is found to be satisfactory and confirms the applicability of the present simple method to other liquids.

  8. CUTEX: CUrvature Thresholding EXtractor

    Science.gov (United States)

    Molinari, S.; Schisano, E.; Faustini, F.; Pestalozzi, M.; di Giorgio, A. M.; Liu, S.

    2017-08-01

    CuTEx analyzes images in the infrared bands and extracts sources from complex backgrounds, particularly star-forming regions, which offer the challenges of crowding, a highly spatially variable background, and sources with non-PSF profiles such as protostars in their accreting phase. The code is composed of two main algorithms, the first for source detection and the second for flux extraction. The code was originally written in IDL and has been ported to the license-free GDL language. CuTEx can be used in other bands or in scientific cases different from the native case. This software is also available as an on-line tool from the Multi-Mission Interactive Archive web pages dedicated to the Herschel Observatory.

  9. Expert and crowd-sourced validation of an individualized sleep spindle detection method employing complex demodulation and individualized normalization

    Directory of Open Access Journals (Sweden)

    Laura eRay

    2015-09-01

    Full Text Available A spindle detection method was developed that: 1) extracts the signal of interest (i.e., spindle-related phasic changes in sigma relative to ongoing background sigma activity) using complex demodulation, 2) accounts for variations of spindle characteristics across the night, scalp derivations and between individuals, and 3) employs a minimum number of sometimes arbitrary, user-defined parameters. Complex demodulation was used to extract instantaneous power in the spindle band. To account for intra- and inter-individual differences, the signal was z-score transformed using a 60 s sliding window, per channel, over the course of the recording. Spindle events were detected with a z-score threshold corresponding to a low probability (e.g., 99th percentile). Spindle characteristics, such as amplitude, duration and oscillatory frequency, were derived for each individual spindle following detection, which permits spindles to be subsequently and flexibly categorized as slow or fast spindles from a single detection pass. Spindles were automatically detected in 15 young healthy subjects. Two experts manually identified spindles from C3 during Stage 2 sleep, from each recording; one employing conventional guidelines, and the other identifying spindles with the aid of a sigma (11-16 Hz) filtered channel. These spindles were then compared between raters and to the automated detection to identify the presence of true positives, true negatives, false positives and false negatives. This method of automated spindle detection resolves or avoids many of the limitations that complicate automated spindle detection, and performs well compared to a group of non-experts, and importantly, has good external validity with respect to the extant literature in terms of the characteristics of automatically detected spindles.
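
    A minimal numpy-only sketch of the pipeline described above (complex demodulation at a sigma-band centre frequency, sliding-window z-scoring, percentile threshold) is given below. The filter choice, window lengths, sampling rate and 13.5 Hz centre frequency are illustrative assumptions, not the published parameters.

    ```python
    # Minimal sketch: complex demodulation + 60 s sliding z-score + 99th-percentile
    # threshold for spindle-like burst detection (all parameters are assumptions).
    import numpy as np

    FS = 256          # sampling rate [Hz], assumed
    F_CENTRE = 13.5   # sigma-band centre frequency [Hz], assumed

    def complex_demodulate(eeg, fs=FS, fc=F_CENTRE, smooth_sec=0.25):
        """Instantaneous sigma-band power via complex demodulation."""
        t = np.arange(len(eeg)) / fs
        demod = eeg * np.exp(-2j * np.pi * fc * t)
        win = int(smooth_sec * fs)
        kernel = np.ones(win) / win                 # crude low-pass (moving average)
        return np.abs(np.convolve(demod, kernel, mode="same")) ** 2

    def sliding_zscore(x, fs=FS, window_sec=60):
        """Z-score each sample against a 60 s window centred on it."""
        half = int(window_sec * fs) // 2
        z = np.empty_like(x)
        for i in range(len(x)):
            seg = x[max(0, i - half): i + half]
            z[i] = (x[i] - seg.mean()) / (seg.std() + 1e-12)
        return z

    if __name__ == "__main__":
        rng = np.random.default_rng(4)
        t = np.arange(0, 120, 1 / FS)
        eeg = rng.normal(0, 1, t.size)
        burst = (t > 60) & (t < 60.8)               # insert a 0.8 s "spindle"
        eeg[burst] += 3 * np.sin(2 * np.pi * 13.5 * t[burst])
        z = sliding_zscore(complex_demodulate(eeg))
        thr = np.percentile(z, 99)                  # low-probability threshold
        mask = z > thr
        print("detected samples in inserted spindle window: %d" % mask[burst].sum())
    ```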

  10. Rockfall hazard assessment integrating probabilistic physically based rockfall source detection (Norddal municipality, Norway).

    Science.gov (United States)

    Yugsi Molina, F. X.; Oppikofer, T.; Fischer, L.; Hermanns, R. L.; Taurisano, A.

    2012-04-01

    Traditional techniques to assess rockfall hazard are partially based on probabilistic analysis. Stochastic methods have been used for run-out analysis of rock blocks to estimate the trajectories that a detached block will follow during its fall until it stops due to kinetic energy loss. However, the selection of rockfall source areas is usually defined either by multivariate analysis or by field observations. In either case, a physically based approach is not used for the source area detection. We present an example of rockfall hazard assessment that integrates a probabilistic rockfall run-out analysis with a stochastic assessment of the rockfall source areas using kinematic stability analysis in a GIS environment. The method has been tested for a steep, more than 200 m high rock wall located in the municipality of Norddal (Møre og Romsdal county, Norway), where a large number of people are exposed to snow avalanches, rockfalls, or debris flows. The area was selected following the recently published hazard mapping plan of Norway. The cliff is formed by medium- to coarse-grained quartz-dioritic to granitic gneisses of Proterozoic age. Scree deposits, the product of recent rockfall activity, are found at the bottom of the rock wall. Large blocks can be found several tens of meters away from the cliff in Sylte, the main locality in the Norddal municipality. Structural characterization of the rock wall was done using terrestrial laser scanning (TLS) point clouds in the software Coltop3D (www.terranum.ch), and the results were validated with field data. Orientation data sets from the structural characterization were analyzed separately to assess best-fit probability density functions (PDF) for both the dip angle and the dip direction angle of each discontinuity set. A GIS-based stochastic kinematic analysis was then carried out using the discontinuity set orientations and the friction angle as random variables. An airborne laser scanning digital elevation model (ALS-DEM) with 1 m

  11. Laser Scribed Graphene Biosensor for Detection of Biogenic Amines in Food Samples Using Locally Sourced Materials

    Directory of Open Access Journals (Sweden)

    Diana C. Vanegas

    2018-04-01

    Full Text Available In foods, high levels of biogenic amines (BA) are the result of microbial metabolism, which can be affected by temperature and storage conditions. Thus, the level of BA is commonly used as an indicator of food safety and quality. This manuscript outlines the development of laser scribed graphene electrodes, with locally sourced materials, for reagent-free food safety biosensing. To fabricate the biosensors, the graphene surface was functionalized with copper microparticles and diamine oxidase purchased from a local supermarket, and the devices were then compared to biosensors fabricated with analytical grade materials. The amperometric biosensor exhibits good electrochemical performance, with an average histamine sensitivity of 23.3 µA/mM, a lower detection limit of 11.6 µM, and a response time of 7.3 s, showing similar performance to biosensors constructed from analytical grade materials. We demonstrated the application of the biosensor by testing total BA concentration in fish paste samples subjected to fermentation with lactic acid bacteria. Biogenic amine concentrations prior to lactic acid fermentation were below the detection limit of the biosensor, while the concentration after fermentation was 19.24 ± 8.21 mg histamine/kg, confirming that the sensor was selective in a complex food matrix. The low-cost, rapid, and accurate device is a promising tool for biogenic amine estimation in food samples, particularly in situations where standard laboratory techniques are unavailable or cost prohibitive. This biosensor can be used for screening food samples, potentially limiting food waste while reducing chances of foodborne outbreaks.

  12. Real breakthrough in detection of radioactive sources by portal monitors with plastic detectors and New Advanced Source Identification Algorithm (ASIA-New)

    Energy Technology Data Exchange (ETDEWEB)

    Stavrov, Andrei; Yamamoto, Eugene [Rapiscan Systems, Inc., 14000 Mead Street, Longmont, CO, 80504 (United States)

    2015-07-01

    Radiation Portal Monitors (RPM) with plastic detectors represent the main instruments used for primary border (customs) radiation control. RPMs are widely used because they are simple, reliable, relatively inexpensive and highly sensitive. However, experience using RPMs in various countries has revealed that the systems have some grave shortcomings. There is a dramatic decrease in the probability of detecting radioactive sources under strong suppression of the natural gamma background (radiation control of heavy cargoes, containers and, especially, trains). NORM (Naturally Occurring Radioactive Material) present in objects under control triggers so-called 'nuisance alarms', requiring a secondary inspection for source verification. At a number of sites, the rate of such alarms is so high that it significantly complicates the work of customs and border officers. This paper presents a brief description of a new variant of the algorithm ASIA-New (New Advanced Source Identification Algorithm), which was developed by the Rapiscan company. It also demonstrates results of different tests and the capability of the new system to overcome the shortcomings stated above. New electronics and ASIA-New enable RPMs to detect radioactive sources under high background suppression (tested at 15-30%) and to verify the detected NORM (KCl) and artificial isotopes (Co-57, Ba-133 and others). The new variant of ASIA is based on physical principles, a phenomenological approach and analysis of changes in some important parameters during the vehicle passage through the monitor control area. Thanks to this, a main advantage of the new system is that it can be easily installed into any RPM with plastic detectors. Taking into account that more than 4000 RPMs have been installed worldwide, upgrading them with ASIA-New may significantly increase the probability of detection and verification of radioactive sources, even those masked by NORM. This algorithm was tested for 1,395 passages of

  13. Using Deep Learning for Gamma Ray Source Detection at the First G-APD Cherenkov Telescope (FACT)

    Science.gov (United States)

    Bieker, Jacob

    2018-06-01

    Finding gamma-ray sources is of paramount importance for Imaging Air Cherenkov Telescopes (IACTs). This study looks at using deep neural networks on data from the First G-APD Cherenkov Telescope (FACT) as a proof of concept of finding gamma-ray sources with deep learning for the upcoming Cherenkov Telescope Array (CTA). In this study, FACT's individual photon-level observation data from the last 5 years were used with convolutional neural networks to determine whether one or more sources were present. Various network architectures were compared to determine which were most successful in finding sources. Neural networks offer a promising method for finding faint and extended gamma-ray sources for IACTs. With further improvement and modification, they offer a compelling method of source detection for the next generation of IACTs.

  14. Point source detection using the Spherical Mexican Hat Wavelet on simulated all-sky Planck maps

    Science.gov (United States)

    Vielva, P.; Martínez-González, E.; Gallegos, J. E.; Toffolatti, L.; Sanz, J. L.

    2003-09-01

    We present an estimation of the point source (PS) catalogue that could be extracted from the forthcoming ESA Planck mission data. We have applied the Spherical Mexican Hat Wavelet (SMHW) to simulated all-sky maps that include cosmic microwave background (CMB), Galactic emission (thermal dust, free-free and synchrotron), thermal Sunyaev-Zel'dovich effect and PS emission, as well as instrumental white noise. This work is an extension of the one presented in Vielva et al. We have developed an algorithm focused on a fast local optimal-scale determination, which is crucial to achieve a PS catalogue with a large number of detections and a low flux limit. A significant effort has also been made to reduce the CPU time required for spherical harmonic transformations, in order to perform the PS detection in a reasonable time. The presented algorithm is able to provide a PS catalogue above fluxes: 0.48 Jy (857 GHz), 0.49 Jy (545 GHz), 0.18 Jy (353 GHz), 0.12 Jy (217 GHz), 0.13 Jy (143 GHz), 0.16 Jy (100 GHz HFI), 0.19 Jy (100 GHz LFI), 0.24 Jy (70 GHz), 0.25 Jy (44 GHz) and 0.23 Jy (30 GHz). We detect around 27 700 PS at the highest-frequency Planck channel and 2900 at the 30-GHz one. The completeness levels are: 70 per cent (857 GHz), 75 per cent (545 GHz), 70 per cent (353 GHz), 80 per cent (217 GHz), 90 per cent (143 GHz), 85 per cent (100 GHz HFI), 80 per cent (100 GHz LFI), 80 per cent (70 GHz), 85 per cent (44 GHz) and 80 per cent (30 GHz). In addition, we can find several PS at different channels, allowing the study of the spectral behaviour and the physical processes acting on them. We also present the basic procedure to apply the method in maps convolved with asymmetric beams. The algorithm takes ~72 h for the most CPU-time-demanding channel (857 GHz) on a Compaq HPC320 (Alpha EV68 1-GHz processor) and requires 4 GB of RAM memory; the CPU time scales as O[N_Ro N_pix^(3/2) log(N_pix)], where N_pix is the number of pixels in the map and N_Ro is the number of optimal scales needed.
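
    The essence of the approach (filter at several candidate scales, keep the scale that amplifies the source most over the background) can be illustrated on a flat patch with the planar Mexican hat, i.e. the Laplacian of a Gaussian; this is only a hedged planar analogue of the spherical wavelet used in the paper, and the scales and the 256x256 toy map below are invented.

      # Flat-sky illustration: multi-scale Mexican hat filtering, keep the best SNR per pixel.
      import numpy as np
      from scipy.ndimage import gaussian_laplace

      def best_scale_snr(patch, scales_pix):
          best = np.full(patch.shape, -np.inf)
          for r in scales_pix:
              filtered = -gaussian_laplace(patch, sigma=r)   # minus sign keeps peaks positive
              best = np.maximum(best, filtered / filtered.std())
          return best                                        # threshold e.g. at 5 sigma for detections

      rng = np.random.default_rng(1)
      sky = rng.normal(0.0, 1.0, (256, 256))
      sky[128, 128] += 30.0                                  # a bright toy "point source"
      print(best_scale_snr(sky, scales_pix=[1, 2, 4]).max())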

  15. Detection and identification of enteroviruses from various drinking water sources in Taiwan

    Science.gov (United States)

    Hsu, Bing-Mu; Chen, Chien-Hsien; Wan, Min-Tao; Chang, Po-Jen; Fan, Cheng-Wei

    2009-02-01

    Twenty-three water samples, including seventeen from surface water reservoirs, three from the raw water of groundwater treatment plants, and three from small water systems, were collected in Taiwan and investigated for the presence and species of enteroviruses. RT-PCR was used for the detection of enteroviruses. Results revealed that 23.5% of raw water samples from reservoirs were positive for enteroviruses. In addition, one of the three groundwater samples and two of the three small-system water samples were positive for enteroviruses. Water samples that were positive for enteroviruses were subsequently evaluated by real-time PCR. The results indicated that the enterovirus concentration in groundwater was lower than that in samples obtained from surface water sources. Enteroviruses were identified by nucleic acid sequencing in the 5'-untranslated regions. Three clusters of enteroviruses were identified as coxsackievirus A2, coxsackievirus A6, and enterovirus 71. The presence of enteroviruses indicates the possibility of waterborne transmission of enteroviruses in Taiwan, if water is not adequately treated.

  16. YAP scintillators for resonant detection of epithermal neutrons at pulsed neutron sources

    International Nuclear Information System (INIS)

    Tardocchi, M.; Gorini, G.; Pietropaolo, A.; Andreani, C.; Senesi, R.; Rhodes, N.; Schooneveld, E. M.

    2004-01-01

    Recent studies indicate the resonance detector (RD) technique as an interesting approach for neutron spectroscopy in the electron-volt energy region. This work summarizes the results of a series of experiments in which RDs consisting of YAlO3 (YAP) scintillators were used to detect scattered neutrons with energies in the range 1-200 eV. The response of YAP scintillators to radiative-capture γ emission from a 238U analyzer foil was characterized in a series of experiments performed on the VESUVIO spectrometer at the ISIS pulsed neutron source. In these experiments a biparametric data acquisition allowed the simultaneous measurement of both neutron time-of-flight and γ pulse-height (energy) spectra. The analysis of the γ pulse-height and neutron time-of-flight spectra made it possible to identify and distinguish the signal and background components. These measurements showed that a significant improvement in the signal-to-background ratio can be achieved by setting a lower-level discrimination on the pulse height at about 600 keV equivalent photon energy. The present results strongly indicate YAP scintillators as an ideal candidate for neutron scattering studies with epithermal neutrons at both very low (<5 deg.) and intermediate scattering angles

  17. Early Detection of Apathetic Phenotypes in Huntington's Disease Knock-in Mice Using Open Source Tools.

    Science.gov (United States)

    Minnig, Shawn; Bragg, Robert M; Tiwana, Hardeep S; Solem, Wes T; Hovander, William S; Vik, Eva-Mari S; Hamilton, Madeline; Legg, Samuel R W; Shuttleworth, Dominic D; Coffey, Sydney R; Cantle, Jeffrey P; Carroll, Jeffrey B

    2018-02-02

    Apathy is one of the most prevalent and progressive psychiatric symptoms in Huntington's disease (HD) patients. However, preclinical work in HD mouse models tends to focus on molecular and motor, rather than affective, phenotypes. Measuring behavior in mice often produces noisy data and requires large cohorts to detect phenotypic rescue with appropriate power. The operant equipment necessary for measuring affective phenotypes is typically expensive, proprietary to commercial entities, and bulky, which can render adequately sized mouse cohorts cost-prohibitive. Thus, we describe here a home-built, open-source alternative to commercial hardware that is reliable, scalable, and reproducible. Using off-the-shelf hardware, we adapted and built several of the rodent operant buckets (ROBucket) to test Htt Q111/+ mice for attention deficits in fixed ratio (FR) and progressive ratio (PR) tasks. We find that, despite normal reward attainment in the FR task, Htt Q111/+ mice exhibit reduced PR performance at 9-11 months of age, suggesting motivational deficits. We replicated this in two independent cohorts, demonstrating the reliability and utility of both the apathetic phenotype and these ROBuckets for preclinical HD studies.

  18. Beamspace dual signal space projection (bDSSP): a method for selective detection of deep sources in MEG measurements

    Science.gov (United States)

    Sekihara, Kensuke; Adachi, Yoshiaki; Kubota, Hiroshi K.; Cai, Chang; Nagarajan, Srikantan S.

    2018-06-01

    Objective. Magnetoencephalography (MEG) has a well-recognized weakness at detecting deeper brain activities. This paper proposes a novel algorithm for selective detection of deep sources by suppressing interference signals from superficial sources in MEG measurements. Approach. The proposed algorithm combines the beamspace preprocessing method with the dual signal space projection (DSSP) interference suppression method. A prerequisite of the proposed algorithm is prior knowledge of the location of the deep sources. The proposed algorithm first derives the basis vectors that span a local region just covering the locations of the deep sources. It then estimates the time-domain signal subspace of the superficial sources by using the projector composed of these basis vectors. Signals from the deep sources are extracted by projecting the row space of the data matrix onto the direction orthogonal to the signal subspace of the superficial sources. Main results. Compared with the previously proposed beamspace signal space separation (SSS) method, the proposed algorithm is capable of suppressing much stronger interference from superficial sources. This capability is demonstrated in our computer simulation as well as experiments using phantom data. Significance. The proposed bDSSP algorithm can be a powerful tool in studies of physiological functions of midbrain and deep brain structures.

  19. Double Photoionization Near Threshold

    Science.gov (United States)

    Wehlitz, Ralf

    2007-01-01

    The threshold region of the double-photoionization cross section is of particular interest because both ejected electrons move slowly in the Coulomb field of the residual ion. Near threshold both electrons have time to interact with each other and with the residual ion. Also, different theoretical models compete to describe the double-photoionization cross section in the threshold region. We have investigated that cross section for lithium and beryllium and have analyzed our data with respect to the latest results in the Coulomb-dipole theory. We find that our data support the idea of a Coulomb-dipole interaction.

  20. Thresholds in radiobiology

    International Nuclear Information System (INIS)

    Katz, R.; Hofmann, W.

    1982-01-01

    Interpretations of biological radiation effects frequently use the word 'threshold'. The meaning of this word is explored together with its relationship to the fundamental character of radiation effects and to the question of perception. It is emphasised that although the existence of either a dose or an LET threshold can never be settled by experimental radiobiological investigations, it may be argued on fundamental statistical grounds that for all statistical processes, and especially where the number of observed events is small, the concept of a threshold is logically invalid. (U.K.)

  1. Realization of rapid debugging for detection circuit of optical fiber gas sensor: Using an analog signal source

    Science.gov (United States)

    Tian, Changbin; Chang, Jun; Wang, Qiang; Wei, Wei; Zhu, Cunguang

    2015-03-01

    An optical fiber gas sensor mainly consists of two parts: the optical part and the detection circuit. In debugging the detection circuit, the optical part usually serves as the signal source. However, during debugging the optical part is easily influenced by many factors: fluctuations of the ambient temperature or the driving current cause instability in the laser wavelength and intensity; for a dual-beam sensor, different bends and stresses of the optical fiber lead to fluctuations in intensity and phase; and intensity noise from the collimator, coupler, and other optical devices in the system further degrades the purity of the optics-based signal source. In order to dramatically improve the debugging efficiency of the detection circuit and shorten the period of research and development, this paper describes an analog signal source consisting of a single chip microcomputer (SCM), an amplifier circuit, and a voltage-to-current conversion circuit. It can be used in place of the optics-based signal source to realize rapid debugging of the detection circuit of the optical fiber gas sensor. This analog signal source performs well and offers further advantages, such as simple operation, small size, and light weight.

  2. Optimal Matched Filter in the Low-number Count Poisson Noise Regime and Implications for X-Ray Source Detection

    Science.gov (United States)

    Ofek, Eran O.; Zackay, Barak

    2018-04-01

    Detection of templates (e.g., sources) embedded in low-number count Poisson noise is a common problem in astrophysics. Examples include source detection in X-ray images, γ-rays, UV, neutrinos, and search for clusters of galaxies and stellar streams. However, the solutions in the X-ray-related literature are sub-optimal in some cases by considerable factors. Using the lemma of Neyman–Pearson, we derive the optimal statistics for template detection in the presence of Poisson noise. We demonstrate that, for known template shape (e.g., point sources), this method provides higher completeness, for a fixed false-alarm probability value, compared with filtering the image with the point-spread function (PSF). In turn, we find that filtering by the PSF is better than filtering the image using the Mexican-hat wavelet (used by wavdetect). For some background levels, our method improves the sensitivity of source detection by more than a factor of two over the popular Mexican-hat wavelet filtering. This filtering technique can also be used for fast PSF photometry and flare detection; it is efficient and straightforward to implement. We provide an implementation in MATLAB. The development of a complete code that works on real data, including the complexities of background subtraction and PSF variations, is deferred for future publication.
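
    As a hedged illustration of the kind of statistic the Neyman-Pearson lemma yields in this regime, the sketch below evaluates the generic Poisson log-likelihood ratio for a known template on top of a known background; it follows the standard Poisson likelihood-ratio form and is not the authors' MATLAB implementation, and all array names are placeholders.

      # Generic Poisson likelihood-ratio statistic for a known template s over background b:
      #   log L(source) / L(background only) = sum_i [ n_i * ln(1 + s_i/b_i) - s_i ]
      import numpy as np

      def poisson_lr_statistic(counts, template, background):
          """counts, template, background: arrays of the same shape (counts per pixel)."""
          return np.sum(counts * np.log1p(template / background) - template)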

  3. Design of Threshold Controller Based Chaotic Circuits

    DEFF Research Database (Denmark)

    Mohamed, I. Raja; Murali, K.; Sinha, Sudeshna

    2010-01-01

    We propose a very simple implementation of a second-order nonautonomous chaotic oscillator, using a threshold controller as the only source of nonlinearity. We demonstrate the efficacy and simplicity of our design through numerical and experimental results. Further, we show that this approach of using a threshold controller as a nonlinear element can be extended to obtain autonomous and multiscroll chaotic attractor circuits as well.

  4. Herschel-ATLAS: Dust Temperature and Redshift Distribution of SPIRE and PACS Detected Sources Using Submillimetre Colours

    Science.gov (United States)

    Amblard, A.; Cooray, Asantha; Serra, P.; Temi, P.; Barton, E.; Negrello, M.; Auld, R.; Baes, M.; Baldry, I. K.; Bamford, S.; hide

    2010-01-01

    We present colour-colour diagrams of detected sources in the Herschel-ATLAS Science Demonstration Field from 100 to 500 μm using both PACS and SPIRE. We fit isothermal modified-blackbody spectral energy distribution (SED) models in order to extract the dust temperature of sources with counterparts in GAMA or SDSS with either a spectroscopic or a photometric redshift. For a subsample of 331 sources detected in at least three FIR bands with significance greater than 3 sigma, we find an average dust temperature of (28 ± 8) K. For sources with no known redshifts, we populate the colour-colour diagram with a large number of SEDs generated with a broad range of dust temperatures and emissivity parameters and compare to the colours of observed sources to establish the redshift distribution of those samples. For another subsample of 1686 sources with fluxes above 35 mJy at 350 μm and detected at 250 and 500 μm with a significance greater than 3 sigma, we find an average redshift of 2.2 ± 0.6.
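
    The isothermal modified-blackbody model used in such fits is S_nu proportional to nu^beta B_nu(T_dust). A minimal sketch of that model is given below; the constants are in SI units and the temperature, emissivity index and amplitude in the example call are placeholders, not values taken from the paper's fits.

      # Modified blackbody: S_nu ~ amp * nu^beta * B_nu(T_dust)
      import numpy as np

      H, K_B, C = 6.626e-34, 1.381e-23, 2.998e8   # Planck, Boltzmann, speed of light (SI)

      def modified_blackbody(nu_hz, t_dust_k, beta, amp=1.0):
          b_nu = (2 * H * nu_hz**3 / C**2) / np.expm1(H * nu_hz / (K_B * t_dust_k))
          return amp * nu_hz**beta * b_nu

      # e.g. evaluate at the SPIRE wavelengths (250, 350, 500 micron) for a 28 K, beta = 1.5 source
      nu = C / np.array([250e-6, 350e-6, 500e-6])
      print(modified_blackbody(nu, 28.0, 1.5))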

  5. Regional Seismic Threshold Monitoring

    National Research Council Canada - National Science Library

    Kvaerna, Tormod

    2006-01-01

    ... model to be used for predicting the travel times of regional phases. We have applied these attenuation relations to develop and assess a regional threshold monitoring scheme for selected subregions of the European Arctic...

  6. Tracking the MSL-SAM methane detection source location Through Mars Regional Atmospheric Modeling System (MRAMS)

    Science.gov (United States)

    Pla-García, Jorge

    2016-04-01

    1. Introduction: The putative in situ detection of methane by the Sample Analysis at Mars (SAM) instrument suite on Curiosity at Gale crater has garnered significant attention because of the potential implications for the presence of geological methane sources or indigenous Martian organisms [1, 2]. SAM reported detection of background levels of atmospheric methane with a mean value of 0.69±0.25 parts per billion by volume (ppbv) at the 95% confidence interval (CI). Additionally, in four sequential measurements spanning a 60-sol period, SAM observed elevated methane levels of 7.2±2.1 ppbv (95% CI), implying that Mars is episodically producing methane from an additional unknown source. There are many major unresolved questions regarding this detection: 1) What are the potential sources of the methane release? 2) What causes the rapid decrease in concentration? 3) Where is the release location? 4) How spatially extensive is the release? 5) For how long is CH4 released? Regarding the first question, the source of the methane is so far not identified. It could be related to geological processes like methane release from clathrates [3], serpentinisation [4] and volcanism [5], or due to biological activity from methanogenesis [6]. To answer the second question, the rapid decrease in concentration, it is important to note that the photochemical lifetime of methane is of order 100 years, much longer than the atmospheric mixing time scale, and thus the gas should tend to be well mixed except near a source or shortly after an episodic release. The observed spike of 7 ppbv above the background was investigated with the Mars Regional Atmospheric Modeling System (MRAMS). The model was focused on rover locations using nested grids with a spacing of 330 meters on the innermost grid, which is centered over the landing site [8, 9]. MRAMS is ideally suited for this investigation; the model is explicitly designed to simulate Mars' atmospheric circulations at the mesoscale and smaller with realistic, high-resolution surface properties [10, 11

  7. Observations of the Hubble Deep Field with the Infrared Space Observatory .2. Source detection and photometry

    DEFF Research Database (Denmark)

    Goldschmidt, P.; Oliver, S.J.; Serjeant, S.B.G.

    1997-01-01

    We present positions and fluxes of point sources found in the Infrared Space Observatory (ISO) images of the Hubble Deep Field (HDF) at 6.7 and 15 μm. We have constructed algorithmically selected 'complete' flux-limited samples of 19 sources in the 15-μm image and seven sources in the 6.7-μm image.

  8. Computational gestalts and perception thresholds.

    Science.gov (United States)

    Desolneux, Agnès; Moisan, Lionel; Morel, Jean-Michel

    2003-01-01

    In 1923, Max Wertheimer proposed a research programme and method in visual perception. He conjectured the existence of a small set of geometric grouping laws governing the perceptual synthesis of phenomenal objects, or "gestalts", from the atomic retinal input. In this paper, we review this set of geometric grouping laws, using the works of Metzger, Kanizsa and their schools. We then explain why the Gestalt theory research programme can be translated into a Computer Vision programme. This translation is not straightforward, since Gestalt theory never addressed two fundamental matters: image sampling and image information measurements. Using these advances, we show that gestalt grouping laws can be translated into quantitative laws allowing the automatic computation of gestalts in digital images. From the psychophysical viewpoint, a main issue arises: the computer vision gestalt detection methods deliver predictable perception thresholds. Thus, we are set in a position where we can build artificial images and check whether some kind of agreement can be found between the computationally predicted thresholds and the psychophysical ones. We describe and discuss two preliminary sets of experiments, in which we compared the gestalt detection performance of several subjects with the predicted detection curve.
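
    The predictable thresholds in this line of work come from an a-contrario "number of false alarms" (NFA) computation; the hedged sketch below shows the usual binomial-tail form, where an event in which k of n atoms agree by chance with probability p, tested over n_tests candidates, is declared meaningful when NFA <= 1. The numbers in the example are illustrative only.

      # NFA = n_tests * P[Binomial(n, p) >= k]; detect when NFA <= 1.
      from math import comb

      def nfa(n_tests, n, k, p):
          tail = sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))
          return n_tests * tail

      print(nfa(n_tests=1e4, n=20, k=15, p=0.1))   # far below 1, so such a grouping is detected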

  9. Detecting Source Code Plagiarism on .NET Programming Languages using Low-level Representation and Adaptive Local Alignment

    Directory of Open Access Journals (Sweden)

    Oscar Karnalim

    2017-01-01

    Full Text Available Even though there are various source code plagiarism detection approaches, only a few works focus on low-level representation for deducing similarity. Most of them focus only on the lexical token sequence extracted from the source code. From our point of view, a low-level representation is more beneficial than lexical tokens since its form is more compact than the source code itself: it only considers semantic-preserving instructions and ignores many source code delimiter tokens. This paper proposes a source code plagiarism detection approach that relies on low-level representation. As a case study, we focus our work on .NET programming languages with the Common Intermediate Language as their low-level representation. In addition, we also incorporate Adaptive Local Alignment for detecting similarity. According to Lim et al., this algorithm outperforms the state-of-the-art code similarity algorithm (i.e., Greedy String Tiling) in terms of effectiveness. According to our evaluation, which involves various plagiarism attacks, our approach is more effective and efficient when compared with the standard lexical-token approach.
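
    To illustrate the kind of similarity scoring such an approach builds on, the sketch below runs classic Smith-Waterman local alignment over two low-level opcode sequences; the adaptive weighting of the Adaptive Local Alignment variant is not reproduced here, and the two short CIL-like opcode streams in the example are invented.

      # Classic local alignment score over token sequences (sketch, not the adaptive variant).
      def local_alignment_score(a, b, match=2, mismatch=-1, gap=-1):
          rows, cols = len(a) + 1, len(b) + 1
          h = [[0] * cols for _ in range(rows)]
          best = 0
          for i in range(1, rows):
              for j in range(1, cols):
                  diag = h[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                  h[i][j] = max(0, diag, h[i - 1][j] + gap, h[i][j - 1] + gap)
                  best = max(best, h[i][j])
          return best

      # e.g. two CIL-like opcode streams from different submissions
      print(local_alignment_score(["ldarg.0", "ldc.i4.1", "add", "ret"],
                                  ["nop", "ldarg.0", "ldc.i4.1", "add", "ret"]))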

  10. Detection of a compact radio source near the center of a gravitational lens: quasar image or galactic core

    International Nuclear Information System (INIS)

    Gorenstein, M.V.; Shapiro, I.I.; Cohen, N.L.

    1983-01-01

    By use of a new, very sensitive interferometric system, a faint, compact radio source has been detected near the center of the galaxy that acts as the main part of a gravitational lens. This lens forms two previously discovered images of the quasar Q0957 + 561, which lies in the direction of the constellation Ursa Major. The newly detected source has a core smaller than 0.002 arc second in diameter with a flux density of 0.6 ± 0.1 millijansky at the 13-centimeter wavelength of the radio observations. This source could be the predicted third image of the transparent gravitational lens, the central core of the galaxy, or some combination of the two. It is not yet possible to choose reliably between these alternatives

  11. Practical Application of Aptamer-Based Biosensors in Detection of Low Molecular Weight Pollutants in Water Sources

    Directory of Open Access Journals (Sweden)

    Wei Zhang

    2018-02-01

    Full Text Available Water pollution has become one of the leading causes of human health problems. Low molecular weight pollutants, even at trace concentrations in water sources, have aroused global attention due to their toxicity after long-term exposure. There is an increased demand for appropriate methods to detect these pollutants in aquatic systems. Aptamers, single-stranded DNA or RNA, have high affinity and specificity for their target molecules, similar to antigen-antibody interactions. Aptamers can be selected using a method called Systematic Evolution of Ligands by EXponential enrichment (SELEX). In recent years we have witnessed great progress in developing aptamer selection and aptamer-based sensors for low molecular weight pollutants in water sources, such as tap water, seawater, lake water, river water, as well as wastewater and its effluents. This review provides an overview of aptamer-based methods as a novel approach for detecting low molecular weight pollutants in water sources.

  12. Change Detection for Remote Monitoring of Underground Nuclear Testing: Comparison with Seismic and Associated Explosion Source Phenomenological Data

    DEFF Research Database (Denmark)

    Canty, M.; Jahnke, G.; Nielsen, Allan Aasbjerg

    2005-01-01

    The analysis of open-source satellite imagery is in the process of establishing itself as an important tool for monitoring nuclear activities throughout the world which are relevant to disarmament treaties, e.g. the Comprehensive Nuclear-Test-Ban Treaty (CTBT). However, the detection of anthrop... ... of conventional multispectral satellite platforms with moderate ground resolution (Landsat TM, ASTER) to detect changes over wide areas. We chose the Nevada Test Site (NTS), USA, for a case study because of the large amount of available ground truth information. The analysis is based on the multivariate alteration...

  13. Limit of detection of a fiber optics gyroscope using a super luminescent radiation source; Limite de deteccion de un giroscopio de fibra optica usando una fuente de radiacion superluminiscente

    Energy Technology Data Exchange (ETDEWEB)

    Sandoval R, G.E. [Laboratorio de Optica Aplicada, Centro de Ciencias Aplicadas y Desarrollo Tecnologico, Universidad Nacional Autonoma de Mexico, Apartado Postal 70-186, 04510 Mexico D.F. (Mexico); Nikolaev, V.A. [Departamento de Optica y Radiofisica Cuantica, Universidad Estatal de Telecomunicaciones de San Petersburgo, M.A. Bonch-Bruyevich, Kanal Moika 61, Saint Petersburg 191186, (Russian Federation)

    2003-07-01

    The main objective of this work is to establish how the characteristics of the fiber-optic gyroscope (FOG) depend on the parameters of a superluminescent emission source based on optical fiber doped with rare-earth elements (superluminescent fiber source, SFS), and to justify the choice of SFS pumping scheme that yields the limiting sensitivity of the FOG. When using this type of emission source in the FOG, it is recommended to use the configuration in which the direction of the pumping signal coincides with that of the superluminescent signal. The main results are the proposal and justification of the choice of an SFS as the emission source to be used in a phase-type FOG. Such a choice makes it possible to improve the FOG sensitivity compared with the semiconductor luminescent sources that are extensively used at present. The use of an SFS-type emission source allows one to approach the sensitivity limit (detection limit), which is determined by the shot noise. (Author)

  14. Threshold guidance update

    International Nuclear Information System (INIS)

    Wickham, L.E.

    1986-01-01

    The Department of Energy (DOE) is developing the concept of threshold quantities for use in determining which waste materials must be handled as radioactive waste and which may be disposed of as nonradioactive waste at its sites. Waste above this concentration level would be managed as radioactive or mixed waste (if hazardous chemicals are present); waste below this level would be handled as sanitary waste. Last year's activities (1984) included the development of a threshold guidance dose, the development of threshold concentrations corresponding to the guidance dose, the development of supporting documentation, review by a technical peer review committee, and review by the DOE community. As a result of the comments, areas have been identified for more extensive analysis, including an alternative basis for selection of the guidance dose and the development of quality assurance guidelines. Development of quality assurance guidelines will provide a reasonable basis for determining that a given waste stream qualifies as a threshold waste stream and can then be the basis for a more extensive cost-benefit analysis. The threshold guidance and supporting documentation will be revised, based on the comments received. The revised documents will be provided to DOE by early November. DOE-HQ has indicated that the revised documents will be available for review by DOE field offices and their contractors

  15. Applications of Monte Carlo technique in the detection of explosives, narcotics and fissile material using neutron sources

    International Nuclear Information System (INIS)

    Sinha, Amar; Kashyap, Yogesh; Roy, Tushar; Agrawal, Ashish; Sarkar, P.S.; Shukla, Mayank

    2009-01-01

    The problem of illicit trafficking of explosives, narcotics or fissile materials represents a real challenge to civil security. Neutron-based detection systems are being actively explored worldwide as a confirmatory tool for the detection of explosives hidden inside a vehicle or a cargo container or buried in soil. The development of such a system and its experimental testing is a tedious process, and each experimental condition needs to be theoretically simulated. Monte Carlo based methods are used to find an optimized design for such detection systems. In order to design such systems, it is necessary to optimize the source and detector system for each specific application. The present paper deals with such optimization studies using the Monte Carlo technique for a tagged-neutron-based system for the detection of explosives and narcotics hidden in cargo, and for landmine detection using backscattered neutrons. We will also discuss some simulation studies on the detection of fissile material and on photo-neutron source design for applications in cargo scanning. (author)

  16. Detection of hidden sources. Prompt reports by airborne teams in Resume95

    International Nuclear Information System (INIS)

    Toivonen, H.

    1997-01-01

    An exercise to locate and identify lost radioactive sources was arranged near Padasjoki Auttoinen village. Ten sources, consisting of caesium, cobalt, iridium and technetium, were hidden. The teams (10) were asked to report their findings immediately after landing and 24 h later. The teams that had a large NaI detector at their disposal could locate more sources than the teams with HPGe detectors. However, for source identification and activity calculation an HPGe detector is superior. Thus, it is highly recommended for operational purposes that both measuring systems are used simultaneously. The best location results were provided by the Danish Emergency Management Agency; the team reported four sources at landing and two other sources were found in prompt data processing after the landing. (au)

  17. Detection of hidden sources. Prompt reports by airborne teams in Resume95

    Energy Technology Data Exchange (ETDEWEB)

    Toivonen, H. [Finnish Centre for Radiation and Nuclear Safety, Helsinki (Finland)

    1997-12-31

    An exercise to locate and identify lost radioactive sources was arranged near Padasjoki Auttoinen village. Ten sources, consisting of caesium, cobalt, iridium and technetium, were hidden. The teams (10) were asked to report their findings immediately after landing and 24 h later. The teams that had a large NaI detector at their disposal could locate more sources than the teams with HPGe detectors. However, for source identification and activity calculation an HPGe detector is superior. Thus, it is highly recommended for operational purposes that both measuring systems are used simultaneously. The best location results were provided by the Danish Emergency Management Agency; the team reported four sources at landing and two other sources were found in prompt data processing after the landing. (au).

  18. Detection of hidden sources. Prompt reports by airborne teams in Resume95

    Energy Technology Data Exchange (ETDEWEB)

    Toivonen, H [Finnish Centre for Radiation and Nuclear Safety, Helsinki (Finland)

    1998-12-31

    An exercise to locate and identify lost radioactive sources was arranged near Padasjoki Auttoinen village. Ten sources, consisting of caesium, cobalt, iridium and technetium, were hidden. The teams (10) were asked to report their findings immediately after landing and 24 h later. The teams that had a large NaI detector at their disposal could locate more sources than the teams with HPGe detectors. However, for source identification and activity calculation an HPGe detector is superior. Thus, it is highly recommended for operational purposes that both measuring systems are used simultaneously. The best location results were provided by the Danish Emergency Management Agency; the team reported four sources at landing and two other sources were found in prompt data processing after the landing. (au).

  19. Near threshold fatigue testing

    Science.gov (United States)

    Freeman, D. C.; Strum, M. J.

    1993-01-01

    Measurement of the near-threshold fatigue crack growth rate (FCGR) behavior provides a basis for the design and evaluation of components subjected to high cycle fatigue. Typically, the near-threshold fatigue regime describes crack growth rates below approximately 10(exp -5) mm/cycle (4 x 10(exp -7) inch/cycle). One such evaluation was recently performed for the binary alloy U-6Nb. The procedures developed for this evaluation are described in detail to provide a general test method for near-threshold FCGR testing. In particular, techniques for high-resolution measurements of crack length performed in-situ through a direct current, potential drop (DCPD) apparatus, and a method which eliminates crack closure effects through the use of loading cycles with constant maximum stress intensity are described.

  20. Rapid detection of illegal colorants on traditional Chinese pastries through mass spectrometry with an interchangeable thermal desorption electrospray ionization source.

    Science.gov (United States)

    Chao, Yu-Ying; Chen, Yen-Ling; Chen, Wei-Chu; Chen, Bai-Hsiun; Huang, Yeou-Lih

    2018-06-30

    Ambient mass spectrometry using an interchangeable thermal desorption/electrospray ionization source (TD-ESI) is a relatively new technique that has had only a limited number of applications to date. Nevertheless, this direct-analysis technique has potential for wider use in analytical chemistry (e.g., in the rapid direct detection of contaminants, residues, and adulterants on and in food) when operated in dual-working mode (pretreatment-free qualitative screening and conventional quantitative confirmation) after switching to a TD-ESI source from a conventional ESI source. Herein, we describe the benefits and challenges associated with the use of a TD-ESI source to detect adulterants on traditional Chinese pastries (TCPs), as a proof-of-concept for the detection of illegal colorants. While TD-ESI can offer direct (i.e., without any sample preparation) qualitative screening analyses for TCPs with adequate sensitivity within 30 s, the use of TD-ESI for semi-quantification is applicable only for homogeneous matrices (e.g., tang yuan). Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Prevalence and methodologies for detection, characterization and subtyping of Listeria monocytogenes and L. ivanovii in foods and environmental sources

    Directory of Open Access Journals (Sweden)

    Jin-Qiang Chen

    2017-09-01

    Full Text Available Listeria monocytogenes, one of the most important foodborne pathogens, can cause listeriosis, a lethal disease for humans. L. ivanovii, which is closely related to L. monocytogenes, is also widely distributed in nature and infects mainly warm-blooded ruminants, causing economic loss. Thus, there are high-priority needs for methodologies for rapid, specific, cost-effective and accurate detection, characterization and subtyping of L. monocytogenes and L. ivanovii in foods and environmental sources. In this review, we (A) describe L. monocytogenes and L. ivanovii, the worldwide incidence of listeriosis, and the prevalence of various L. monocytogenes strains in food and environmental sources; (B) comprehensively review different types of traditional and newly developed methodologies, including culture-based, antigen/antibody-based, LOOP-mediated isothermal amplification, matrix-assisted laser desorption ionization-time of flight-mass spectrometry, DNA microarray, and genomic sequencing, for detection and characterization of L. monocytogenes in foods and environmental sources; (C) comprehensively summarize different subtyping methodologies, including pulsed-field gel electrophoresis, multi-locus sequence typing, ribotyping, phage typing, and whole genomic sequencing, for subtyping of L. monocytogenes strains from food and environmental sources; and (D) describe the applications of these methodologies in detection and subtyping of L. monocytogenes in foods and food processing facilities.

  2. Improved Point-source Detection in Crowded Fields Using Probabilistic Cataloging

    Science.gov (United States)

    Portillo, Stephen K. N.; Lee, Benjamin C. G.; Daylan, Tansu; Finkbeiner, Douglas P.

    2017-10-01

    Cataloging is challenging in crowded fields because sources are extremely covariant with their neighbors and blending makes even the number of sources ambiguous. We present the first optical probabilistic catalog, cataloging a crowded (˜0.1 sources per pixel brighter than 22nd mag in F606W) Sloan Digital Sky Survey r-band image from M2. Probabilistic cataloging returns an ensemble of catalogs inferred from the image and thus can capture source-source covariance and deblending ambiguities. By comparing to a traditional catalog of the same image and a Hubble Space Telescope catalog of the same region, we show that our catalog ensemble better recovers sources from the image. It goes more than a magnitude deeper than the traditional catalog while having a lower false-discovery rate brighter than 20th mag. We also present an algorithm for reducing this catalog ensemble to a condensed catalog that is similar to a traditional catalog, except that it explicitly marginalizes over source-source covariances and nuisance parameters. We show that this condensed catalog has a similar completeness and false-discovery rate to the catalog ensemble. Future telescopes will be more sensitive, and thus more of their images will be crowded. Probabilistic cataloging performs better than existing software in crowded fields and so should be considered when creating photometric pipelines in the Large Synoptic Survey Telescope era.

  3. Protection and fault detection for Lawrence Berkeley Laboratory neutral beam sources

    International Nuclear Information System (INIS)

    Hopkins, D.B.; Baker, W.R.; Berkner, K.H.; Ehlers, K.W.; Honey, V.J.; Lietzke, A.F.; Milnes, K.A.; Owren, H.M.

    1979-11-01

    Testing of TFTR neutral beam (NB) sources has begun at the LBL Neutral Beam System Test Facility (NBSTF). Operation at 120 kV, 65 A, 0.5 sec should be achieved soon. Because NB sources spark down frequently during conditioning, the main accelerating (accel) power supply must be interrupted within a few microseconds to avoid degrading the voltage-holding capability of, or even damaging, the NB source. A variety of improper magnitudes and/or ratios of voltages, currents, and times can occur and must be recognized as fault conditions in order to initiate a prompt interruption of the accel power supply. This paper discusses in detail the key signals which must be monitored and the manner in which they are processed in the fault detector circuitry for safe operation of LBL NB sources. The paper also reviews the more standard interlocks and protective features recommended for these sources

  4. THE SOURCE STRUCTURE OF 0642+449 DETECTED FROM THE CONT14 OBSERVATIONS

    International Nuclear Information System (INIS)

    Xu, Ming H.; Wang, Guang L.; Heinkelmann, Robert; Anderson, James M.; Mora-Diaz, Julian; Schuh, Harald

    2016-01-01

    The CONT14 campaign with state-of-the-art very long baseline interferometry (VLBI) data has observed the source 0642+449 with about 1000 observables each day during a continuous observing period of 15 days, providing tens of thousands of closure delays—the sum of the delays around a closed loop of baselines. The closure delay is independent of the instrumental and propagation delays and provides valuable additional information about the source structure. We demonstrate the use of this new “observable” for the determination of the structure in the radio source 0642+449. This source, as one of the defining sources in the second realization of the International Celestial Reference Frame, is found to have two point-like components with a relative position offset of −426 microarcseconds (μas) in R.A. and −66 μas in decl. The two components are almost equally bright, with a flux-density ratio of 0.92. The standard deviation of closure delays for source 0642+449 was reduced from 139 to 90 ps by using this two-component model. Closure delays larger than 1 ns are found to be related to the source structure, demonstrating that structure effects for a source with this simple structure could be up to tens of nanoseconds. The method described in this paper does not rely on a priori source structure information, such as knowledge of source structure determined from direct (Fourier) imaging of the same observations or observations at other epochs. We anticipate our study to be a starting point for more effective determination of the structure effect in VLBI observations.
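
    The reason the closure delay isolates structure-related contributions is that station-based error terms cancel when observed delays are summed around a closed triangle of baselines. The tiny sketch below illustrates this with invented per-station error terms and structure delays; the sign convention and the numbers are assumptions made only for the illustration.

      # Observed delay model (sketch): tau_ij = tau_ij(structure) + c_j - c_i,
      # so summing AB + BC + CA cancels the station-based c terms.
      c = {"A": 3.2e-9, "B": -1.1e-9, "C": 0.7e-9}                 # per-station errors (s)
      tau_struct = {("A", "B"): 5e-12, ("B", "C"): -2e-12, ("C", "A"): 9e-12}

      def observed(i, j):
          return tau_struct[(i, j)] + c[j] - c[i]

      closure = observed("A", "B") + observed("B", "C") + observed("C", "A")
      print(closure)   # equals the sum of the structure terms only (12 ps here)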

  5. THE SOURCE STRUCTURE OF 0642+449 DETECTED FROM THE CONT14 OBSERVATIONS

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Ming H.; Wang, Guang L. [Shanghai Astronomical Observatory, Chinese Academy of Sciences, No. 80 Nandan Road, 200030, Shanghai (China); Heinkelmann, Robert; Anderson, James M.; Mora-Diaz, Julian; Schuh, Harald, E-mail: mhxu@shao.ac.cn [DeutschesGeoForschungsZentrum (GFZ), Potsdam, Telegrafenberg, D-14473 Potsdam (Germany)

    2016-11-01

    The CONT14 campaign with state-of-the-art very long baseline interferometry (VLBI) data has observed the source 0642+449 with about 1000 observables each day during a continuous observing period of 15 days, providing tens of thousands of closure delays—the sum of the delays around a closed loop of baselines. The closure delay is independent of the instrumental and propagation delays and provides valuable additional information about the source structure. We demonstrate the use of this new “observable” for the determination of the structure in the radio source 0642+449. This source, as one of the defining sources in the second realization of the International Celestial Reference Frame, is found to have two point-like components with a relative position offset of −426 microarcseconds (μas) in R.A. and −66 μas in decl. The two components are almost equally bright, with a flux-density ratio of 0.92. The standard deviation of closure delays for source 0642+449 was reduced from 139 to 90 ps by using this two-component model. Closure delays larger than 1 ns are found to be related to the source structure, demonstrating that structure effects for a source with this simple structure could be up to tens of nanoseconds. The method described in this paper does not rely on a priori source structure information, such as knowledge of source structure determined from direct (Fourier) imaging of the same observations or observations at other epochs. We anticipate our study to be a starting point for more effective determination of the structure effect in VLBI observations.

  6. The detection of orphan radioactive sources and the regulatory attitude; La deteccion de fuentes radiactivas huerfanas y la actitud regulatoria

    Energy Technology Data Exchange (ETDEWEB)

    Truppa, W.; Amodei, A.; Castro, L.; Rojas, C. [Autoridad Regulatoria Nuclear, Subgerencia Control de Instalaciones Radiactivas Clase ll y III, Av. Del Libertador 8250 Ciudad de Buenos Aires (C1429BNP) (Argentina)]. e-mail: wtruppa@sede.arn.gov.ar

    2006-07-01

    In the last decade, the appearance of orphan radioactive sources has been a constant concern in the regulatory control environment. Among the well-known cases worldwide, the most common has been the appearance of industrial-type sources which, through lack of control, negligence or abandonment, were left without due protection and custody. This work presents the detection, by a portal-type detector, of a Cs-137 radioactive source picked up among the scrap that entered an important steelworks of the country. Starting from there, the Nuclear Regulatory Authority (ARN) carried out a thorough investigation to determine the origin of the radioactive source, which led to the detection and regulatory control of three other radioactive sources of the same type used in level measurement, originally housed in a daily-consumption gas-oil tank inside a vessel that was being broken up for sale as scrap. During the execution of these tasks, regulatory actions were taken in accordance with the regulations of the Argentine Republic, harmonized with international requirements for the control of radioactive material. (Author)

  7. A study on multi-data source fusion method for petroleum pipeline leak detection

    Energy Technology Data Exchange (ETDEWEB)

    Liang, Wei; Zhang, Laibin [Research Center of Oil and Gas Safety Engineering Technology, China University of Petroleum, Beijing, (China)

    2010-07-01

    The detection of leaks in petroleum pipelines is a very important safety issue. Several studies have been commissioned to develop new monitoring procedures for leakage detection. This paper sets out a new leak detection process. The approach developed takes into consideration steady and transient states. The study investigated leak diagnosis problems in product pipelines using multi-sensor measurements (pressure, flux, density and temperature). The information collected from each sensor was treated as a piece of evidence describing the operational condition of the pipeline. The Dempster-Shafer (D-S) theory is used to associate multi-sensor data with pipe health indices. Experimental pressure and flow rate data were recorded using a Pipeline Leak Detection System (PLDS) acquisition card and used to verify the accuracy and reliability of this new detection method. The results showed that the degree of credibility was as high as 0.877. It was also found that multi-feature information fusion improves recognition of pipeline conditions.
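
    Dempster's rule of combination is the core operation when evidence from several sensors is fused in this way. The sketch below shows the standard rule for two basic probability assignments over the same frame of discernment; the sensor names and the "leak"/"normal" hypotheses and masses are illustrative assumptions, not values from the paper.

      # Dempster's rule of combination for two basic probability assignments (BPAs).
      from itertools import product

      def dempster_combine(m1, m2):
          # m1, m2: dicts mapping frozenset hypotheses -> mass, each summing to 1
          combined, conflict = {}, 0.0
          for (a, ma), (b, mb) in product(m1.items(), m2.items()):
              inter = a & b
              if inter:
                  combined[inter] = combined.get(inter, 0.0) + ma * mb
              else:
                  conflict += ma * mb
          return {k: v / (1.0 - conflict) for k, v in combined.items()}

      # Example: pressure-sensor evidence fused with flow-sensor evidence about a leak
      m_pressure = {frozenset({"leak"}): 0.7, frozenset({"leak", "normal"}): 0.3}
      m_flow     = {frozenset({"leak"}): 0.6, frozenset({"normal"}): 0.2,
                    frozenset({"leak", "normal"}): 0.2}
      print(dempster_combine(m_pressure, m_flow))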

  8. Influence of multi-microphone signal enhancement algorithms on the acoustics and detectability of angular and radial source movements

    DEFF Research Database (Denmark)

    Lundbeck, Micha; Hartog, Laura; Grimm, Giso

    2018-01-01

    Hearing-impaired listeners are known to have difficulties not only with understanding speech in noise, but also with judging source distance and movement, and these deficits are related to perceived handicap. It is possible that the perception of spatially dynamic sounds can be improved with hearing aids (HAs), but so far this has not been investigated. In a previous study, older hearing-impaired (OHI) listeners showed poorer detectability for virtual left-right (angular) and near-far (radial) source movements due to lateral interfering sounds and reverberation, respectively (Lundbeck, Grimm

  9. Semi-empirical Determination of Detection Efficiency for Voluminous Source by Effective Solid Angle Method

    Energy Technology Data Exchange (ETDEWEB)

    Kang, M. Y.; Kim, J. H.; Choi, H. D. [Seoul National Univ., Seoul (Korea, Republic of); Sun, G. M. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-05-15

    In the field of γ-ray measurements, the determination of the full-energy (FE) absorption peak efficiency for a voluminous sample is difficult, because the preparation of a certified radiation source with the same chemical composition and geometry as the original voluminous sample is not easy. In order to overcome this inconvenience, simulation or semi-empirical methods are preferred in many cases. The Effective Solid Angle (ESA) code, which implements a semi-empirical approach, has been developed by the Applied Nuclear Physics Group at Seoul National University. In this study, we validated the ESA code using Marinelli-type voluminous KRISS (Korea Research Institute of Standards and Science) CRM (Certified Reference Material) sources and IAEA standard γ-ray point sources, and the efficiency curve for a voluminous source determined semi-empirically with the ESA code is compared with the experimental values. We calculated the efficiency curve of the voluminous source from the measured efficiency of a standard point source using the ESA code. We will continue the validation of the ESA code by measuring various CRM volume sources with detectors of different efficiencies.
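
    The underlying efficiency-transfer idea can be sketched as scaling a measured point-source full-energy-peak efficiency by the ratio of effective solid angles of the two geometries. This is only a hedged summary of that idea: the ESA code itself computes the effective solid angles (including attenuation and geometry terms), whereas the numbers passed in below are purely illustrative.

      # Efficiency transfer via effective solid angles (sketch).
      def transfer_efficiency(eff_point, omega_eff_point, omega_eff_volume):
          return eff_point * omega_eff_volume / omega_eff_point

      # e.g. a measured point-source efficiency of 2.1e-3 at some energy
      print(transfer_efficiency(2.1e-3, omega_eff_point=0.35, omega_eff_volume=0.21))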

  10. Threshold factorization redux

    Science.gov (United States)

    Chay, Junegone; Kim, Chul

    2018-05-01

    We reanalyze the factorization theorems for the Drell-Yan process and for deep inelastic scattering near threshold, as constructed in the framework of the soft-collinear effective theory (SCET), from a new, consistent perspective. In order to formulate the factorization near threshold in SCET, we should include an additional degree of freedom with small energy, collinear to the beam direction. The corresponding collinear-soft mode is included to describe the parton distribution function (PDF) near threshold. The soft function is modified by subtracting the contribution of the collinear-soft modes in order to avoid double counting on the overlap region. As a result, the proper soft function becomes infrared finite, and all the factorized parts are free of rapidity divergence. Furthermore, the separation of the relevant scales in each factorized part becomes manifest. We apply the same idea to the dihadron production in e+e- annihilation near threshold, and show that the resultant soft function is also free of infrared and rapidity divergences.

  11. Elaborating on Threshold Concepts

    Science.gov (United States)

    Rountree, Janet; Robins, Anthony; Rountree, Nathan

    2013-01-01

    We propose an expanded definition of Threshold Concepts (TCs) that requires the successful acquisition and internalisation not only of knowledge, but also its practical elaboration in the domains of applied strategies and mental models. This richer definition allows us to clarify the relationship between TCs and Fundamental Ideas, and to account…

  12. Comparing Natural Gas Leakage Detection Technologies Using an Open-Source "Virtual Gas Field" Simulator.

    Science.gov (United States)

    Kemp, Chandler E; Ravikumar, Arvind P; Brandt, Adam R

    2016-04-19

    We present a tool for modeling the performance of methane leak detection and repair programs that can be used to evaluate the effectiveness of detection technologies and proposed mitigation policies. The tool uses a two-state Markov model to simulate the evolution of methane leakage from an artificial natural gas field. Leaks are created stochastically, drawing from the current understanding of the frequency and size distributions at production facilities. Various leak detection and repair programs can be simulated to determine the rate at which each would identify and repair leaks. Integrating the methane leakage over time enables a meaningful comparison between technologies, using both economic and environmental metrics. We simulate four existing or proposed detection technologies: flame ionization detection, manual infrared camera, automated infrared drone, and distributed detectors. Comparing these four technologies, we found that over 80% of simulated leakage could be mitigated with a positive net present value, although the maximum benefit is realized by selectively targeting larger leaks. Our results show that low-cost leak detection programs can rely on high-cost technology, as long as it is applied in a way that allows for rapid detection of large leaks. Any strategy to reduce leakage should require a careful consideration of the differences between low-cost technologies and low-cost programs.
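
    A toy version of the two-state (leaking / not-leaking) component model at the heart of such a simulator is sketched below; the leak-onset probability, leak-size distribution, survey interval and detection limit are all made-up placeholders rather than values from the paper, and the repair step simply zeroes every detected leak.

      # Toy "virtual gas field": stochastic leak onset, periodic detect-and-repair surveys.
      import numpy as np

      rng = np.random.default_rng(0)
      n_components, days = 10_000, 365
      p_new_leak_per_day = 1e-4                  # chance a tight component starts leaking
      survey_every, min_detectable = 90, 0.1     # days; kg/day

      leak_rate = np.zeros(n_components)         # kg/day; 0 means not leaking
      emitted = 0.0
      for day in range(days):
          new = (leak_rate == 0) & (rng.random(n_components) < p_new_leak_per_day)
          leak_rate[new] = rng.lognormal(mean=0.0, sigma=1.5, size=int(new.sum()))
          emitted += leak_rate.sum()             # kg emitted this day
          if (day + 1) % survey_every == 0:      # survey repairs everything it can detect
              leak_rate[leak_rate >= min_detectable] = 0.0

      print(f"total emissions over one year: {emitted:,.0f} kg")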

  13. Archival Legacy Investigations of Circumstellar Environments (ALICE): Statistical assessment of point source detections

    Science.gov (United States)

    Choquet, Élodie; Pueyo, Laurent; Soummer, Rémi; Perrin, Marshall D.; Hagan, J. Brendan; Gofas-Salas, Elena; Rajan, Abhijith; Aguilar, Jonathan

    2015-09-01

    The ALICE program, for Archival Legacy Investigation of Circumstellar Environment, is currently conducting a virtual survey of about 400 stars by re-analyzing the HST-NICMOS coronagraphic archive with advanced post-processing techniques. We present here the strategy that we adopted to identify detections and potential candidates for follow-up observations, and we give a preliminary overview of our detections. We also present a statistical analysis conducted to evaluate the confidence level of these detections and the completeness of our candidate search.

  14. Nuclear Material Detection by One-Short-Pulse-Laser-Driven Neutron Source

    International Nuclear Information System (INIS)

    Favalli, Andrea; Aymond, F.; Bridgewater, Jon S.; Croft, Stephen; Deppert, O.; Devlin, Matthew James; Falk, Katerina; Fernandez, Juan Carlos; Gautier, Donald Cort; Gonzales, Manuel A.; Goodsell, Alison Victoria; Guler, Nevzat; Hamilton, Christopher Eric; Hegelich, Bjorn Manuel; Henzlova, Daniela; Ianakiev, Kiril Dimitrov; Iliev, Metodi; Johnson, Randall Philip; Jung, Daniel; Kleinschmidt, Annika; Koehler, Katrina Elizabeth; Pomerantz, Ishay; Roth, Markus; Santi, Peter Angelo; Shimada, Tsutomu; Swinhoe, Martyn Thomas; Taddeucci, Terry Nicholas; Wurden, Glen Anthony; Palaniyappan, Sasikumar; McCary, E.

    2015-01-01

    Covered in the PowerPoint presentation are the following areas: Motivation and requirements for active interrogation of nuclear material; laser-driven neutron source; neutron diagnostics; active interrogation of nuclear material; and, conclusions, remarks, and future works.

  15. Multi-Sensor Integration to Map Odor Distribution for the Detection of Chemical Sources

    Directory of Open Access Journals (Sweden)

    Xiang Gao

    2016-07-01

    Full Text Available This paper addresses the problem of mapping the odor distribution derived from a chemical source using multi-sensor integration and reasoning system design. Odor localization is the problem of finding the source of an odor or other volatile chemical. Most localization methods require a mobile vehicle to follow an odor plume along its entire path, which is time consuming and may be especially difficult in a cluttered environment. To address both of these challenges, this paper proposes a novel algorithm that combines data from odor and anemometer sensors and fuses sensor readings taken at different positions. Initially, a multi-sensor integration method, together with the airflow path, is used to map the pattern of odor particle movement. Then, more sensors are introduced at specific regions to determine the probable location of the odor source. Finally, the results of an odor source location simulation and a real experiment are presented.

  16. Probabilistic Deviation Detection and Optimal Thresholds

    Science.gov (United States)

    2014-01-01

    The StarCraft: Brood War videogame is used as the domain for the case-based planning research conducted in the DEEP project. StarCraft was selected for a number…

  17. Detection of Nuclear Sources by UAV Teleoperation Using a Visuo-Haptic Augmented Reality Interface

    OpenAIRE

    Jacopo Aleotti; Giorgio Micconi; Stefano Caselli; Giacomo Benassi; Nicola Zambelli; Manuele Bettelli; Andrea Zappettini

    2017-01-01

    A visuo-haptic augmented reality (VHAR) interface is presented enabling an operator to teleoperate an unmanned aerial vehicle (UAV) equipped with a custom CdZnTe-based spectroscopic gamma-ray detector in outdoor environments. The task is to localize nuclear radiation sources, whose location is unknown to the user, without the close exposure of the operator. The developed detector also enables identification of the localized nuclear sources. The aim of the VHAR interface is to increase the sit...

  18. Locating sensors for detecting source-to-target patterns of special nuclear material smuggling: a spatial information theoretic approach.

    Science.gov (United States)

    Przybyla, Jay; Taylor, Jeffrey; Zhou, Xuesong

    2010-01-01

    In this paper, a spatial information-theoretic model is proposed to locate sensors for detecting source-to-target patterns of special nuclear material (SNM) smuggling. In order to ship the nuclear materials from a source location with SNM production to a target city, the smugglers must employ global and domestic logistics systems. This paper focuses on locating a limited set of fixed and mobile radiation sensors in a transportation network, with the intent to maximize the expected information gain and minimize the estimation error for the subsequent nuclear material detection stage. A Kalman filtering-based framework is adapted to assist the decision-maker in quantifying the network-wide information gain and SNM flow estimation accuracy.

  19. Locating Sensors for Detecting Source-to-Target Patterns of Special Nuclear Material Smuggling: A Spatial Information Theoretic Approach

    Directory of Open Access Journals (Sweden)

    Xuesong Zhou

    2010-08-01

    Full Text Available In this paper, a spatial information-theoretic model is proposed to locate sensors for detecting source-to-target patterns of special nuclear material (SNM) smuggling. In order to ship the nuclear materials from a source location with SNM production to a target city, the smugglers must employ global and domestic logistics systems. This paper focuses on locating a limited set of fixed and mobile radiation sensors in a transportation network, with the intent to maximize the expected information gain and minimize the estimation error for the subsequent nuclear material detection stage. A Kalman filtering-based framework is adapted to assist the decision-maker in quantifying the network-wide information gain and SNM flow estimation accuracy.
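
    Both copies of this record mention quantifying network-wide information gain within a Kalman filtering framework. One common way to express that gain, shown in the hedged Python sketch below, is the entropy reduction of a Gaussian state estimate after a linear measurement update; the three-link "flow" example and all matrices are invented and do not reproduce the paper's transportation-network model.

```python
import numpy as np

def information_gain(P_prior, H, R):
    """Entropy reduction (nats) of a Gaussian state estimate after one linear
    measurement z = H x + v, v ~ N(0, R), using a standard Kalman update."""
    S = H @ P_prior @ H.T + R                                # innovation covariance
    K = P_prior @ H.T @ np.linalg.inv(S)                     # Kalman gain
    P_post = (np.eye(P_prior.shape[0]) - K @ H) @ P_prior
    return 0.5 * (np.log(np.linalg.det(P_prior)) - np.log(np.linalg.det(P_post)))

# toy example: three uncertain link flows; a candidate sensor observes link 0 only
P_prior = np.diag([4.0, 4.0, 4.0])     # prior variance of the three link flows
H = np.array([[1.0, 0.0, 0.0]])        # placement choice: measure link 0
R = np.array([[0.5]])                  # sensor noise variance
print(information_gain(P_prior, H, R))
```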

  20. Acute gastrointestinal bleeding: detection of source and etiology with multi-detector-row CT

    Energy Technology Data Exchange (ETDEWEB)

    Scheffel, Hans; Pfammatter, Thomas; Marincek, Borut; Alkadhi, Hatem [University Hospital Zurich, Institute of Diagnostic Radiology, Zurich (Switzerland); Wildi, Stefan [University Hospital Zurich, Department of Visceral and Transplant Surgery, Zurich (Switzerland); Bauerfeind, Peter [University Hospital Zurich, Division of Gastroenterology, Zurich (Switzerland)

    2007-06-15

    This study was conducted to determine the ability of multi-detector-row computed tomography (CT) to identify the source and etiology of acute gastrointestinal bleeding. Eighteen patients with acute upper (n = 10) and lower (n = 8) gastrointestinal bleeding underwent 4-detector-row CT (n = 6), 16-detector-row CT (n = 11), and 64-slice CT (n = 1) with an arterial and portal venous phase of contrast enhancement. Unenhanced scans were performed in nine patients. CT scans were reviewed to determine conspicuity of bleeding source, underlying etiology, and for potential causes of false-negative prospective interpretations. Bleeding sources were prospectively identified with CT in 15 (83%) patients, and three (17%) bleeding sources were visualized in retrospect, allowing the characterization of all sources of bleeding with CT. Contrast extravasation was demonstrated with CT in all 11 patients with severe bleeding, but only in 1 of 7 patients with mild bleeding. The etiology could not be identified on unenhanced CT scans in any patient, whereas arterial-phase and portal venous-phase CT depicted etiology in 15 (83%) patients. Underlying etiology was correctly identified in all eight patients with mild GI bleeding. Multi-detector-row CT enables the identification of bleeding source and precise etiology in patients with acute gastrointestinal bleeding. (orig.)

  1. Acute gastrointestinal bleeding: detection of source and etiology with multi-detector-row CT

    International Nuclear Information System (INIS)

    Scheffel, Hans; Pfammatter, Thomas; Marincek, Borut; Alkadhi, Hatem; Wildi, Stefan; Bauerfeind, Peter

    2007-01-01

    This study was conducted to determine the ability of multi-detector-row computed tomography (CT) to identify the source and etiology of acute gastrointestinal bleeding. Eighteen patients with acute upper (n = 10) and lower (n = 8) gastrointestinal bleeding underwent 4-detector-row CT (n = 6), 16-detector-row CT (n = 11), and 64-slice CT (n = 1) with an arterial and portal venous phase of contrast enhancement. Unenhanced scans were performed in nine patients. CT scans were reviewed to determine conspicuity of bleeding source, underlying etiology, and for potential causes of false-negative prospective interpretations. Bleeding sources were prospectively identified with CT in 15 (83%) patients, and three (17%) bleeding sources were visualized in retrospect, allowing the characterization of all sources of bleeding with CT. Contrast extravasation was demonstrated with CT in all 11 patients with severe bleeding, but only in 1 of 7 patients with mild bleeding. The etiology could not be identified on unenhanced CT scans in any patient, whereas arterial-phase and portal venous-phase CT depicted etiology in 15 (83%) patients. Underlying etiology was correctly identified in all eight patients with mild GI bleeding. Multi-detector-row CT enables the identification of bleeding source and precise etiology in patients with acute gastrointestinal bleeding. (orig.)

  2. Improvement of bunch-by-bunch beam current detection system in Hefei light source

    International Nuclear Information System (INIS)

    Zheng Kai; Wang Junhua; Liu Zuping; Li Weimin; Zhou Zeran; Yang Yongliang; Huang Longjun; Chen Yuanbo

    2006-01-01

    The bunch-by-bunch beam current detection system is an important facility in a multibunch storage ring. In this paper, the bunch-by-bunch beam current detection systems established for accelerators such as Cornell, SLAC and KEKB are compared and studied. The design of the bunch-by-bunch beam current detection system for HLS, which is based on the bunch-by-bunch tracing measurement system in HLS, is given. Both sine-wave and square-wave demodulation were applied; the deviation of the detection system is determined by the longitudinal oscillation. By comparing the data acquired from the ADC with the data from the DCCT, the ADC data were scaled by the bunch current. The standard deviations of the linear fit were about 1%, and those of the polynomial fit were less than 0.5%, for both sine-wave and square-wave demodulation. Some analysis of the measurement results is also presented. (authors)

  3. Sensitivities and detection limits of X-ray fluorescence analysis with a 10 mCi 241Am source

    International Nuclear Information System (INIS)

    Wundt, K.; Janghorbani, M.; Starke, K.

    1976-01-01

    Seven trace elements ranging from chromium to barium were preconcentrated on Amberlite IR-120 cation exchange paper and determined in an energy dispersive X-ray fluorescence system using a 10 mCi 241 Am source. Sensitivities were experimentally determined and checked with theoretically calculated values. The detection limits are compared with elemental levels present in typical surface waters and those allowed in drinking water. Appropriate conclusions as to feasibility of such a system for environmental monitoring are drawn. (orig.) [de

  4. Multi-Source Fusion for Explosive Hazard Detection in Forward Looking Sensors

    Science.gov (United States)

    2016-12-01

    The aim of this research is to improve the U.S. Army’s ability to detect landmines and explosive hazards in different scenarios using…

  5. Psychophysical thresholds of face visibility during infancy

    DEFF Research Database (Denmark)

    Gelskov, Sofie; Kouider, Sid

    2010-01-01

    The ability to detect and focus on faces is a fundamental prerequisite for developing social skills. But how well can infants detect faces? Here, we address this question by studying the minimum duration at which faces must appear to trigger a behavioral response in infants. We used a preferential looking method in conjunction with masking and brief presentations (300 ms and below) to establish the temporal thresholds of visibility at different stages of development. We found that 5 and 10 month-old infants have remarkably similar visibility thresholds, about three times higher than those of adults. By contrast, 15 month-olds not only revealed adult-like thresholds, but also improved their performance through memory-based strategies. Our results imply that the development of face visibility follows a non-linear course and is determined by a radical improvement occurring between 10 and 15 months.

  6. Determination of the detection threshold of Pseudococcus viburni (Hemiptera: Pseudococcidae) by PCR

    Directory of Open Access Journals (Sweden)

    Diana Vera

    2012-06-01

    …inspections often result in rejections of fresh fruit in countries to which the fruit is exported. In this study the detection threshold of P. viburni was determined by PCR. The DNA extraction kit DNAzol™ was used as a fast DNA extraction method. With this kit, DNA of good quality and concentration was obtained from specimens of P. viburni at different developmental stages (including eggs, nymphs and adults), which were preserved in pro-analysis ethanol at -20ºC, following the manufacturer's protocol. An aliquot of this DNA was used as a template in a PCR reaction with specific P. viburni primers reported in the literature, and DNA of P. viburni from an entomological collection was used as a control. In all stages the amplification generated a band of the expected size (650 bp), visualized on a 1.5% agarose gel. The detection threshold of P. viburni using our method was a single egg. These results provide a specific detection technique for P. viburni and reliable species identification within 48 h.

  7. Shifts in the relationship between motor unit recruitment thresholds versus derecruitment thresholds during fatigue.

    Science.gov (United States)

    Stock, Matt S; Mota, Jacob A

    2017-12-01

    Muscle fatigue is associated with diminished twitch force amplitude. We examined changes in the motor unit recruitment versus derecruitment threshold relationship during fatigue. Nine men (mean age = 26 years) performed repeated isometric contractions at 50% maximal voluntary contraction (MVC) knee extensor force until exhaustion. Surface electromyographic signals were detected from the vastus lateralis, and were decomposed into their constituent motor unit action potential trains. Motor unit recruitment and derecruitment thresholds and firing rates at recruitment and derecruitment were evaluated at the beginning, middle, and end of the protocol. On average, 15 motor units were studied per contraction. For the initial contraction, three subjects showed greater recruitment thresholds than derecruitment thresholds for all motor units. Five subjects showed greater recruitment thresholds than derecruitment thresholds for only low-threshold motor units at the beginning, with a mean cross-over of 31.6% MVC. As the muscle fatigued, many motor units were derecruited at progressively higher forces. In turn, decreased slopes and increased y-intercepts were observed. These shifts were complemented by increased firing rates at derecruitment relative to recruitment. As the vastus lateralis fatigued, the central nervous system's compensatory adjustments resulted in a shift of the regression line of the recruitment versus derecruitment threshold relationship. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.
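
    The shift described above is summarized by a regression of derecruitment threshold on recruitment threshold and the force at which the two coincide (the "cross-over"). A minimal Python sketch of that bookkeeping follows, using invented thresholds; it is not the authors' analysis pipeline.

```python
import numpy as np

def crossover_force(recruit, derecruit):
    """Fit derecruitment threshold as a linear function of recruitment threshold
    (both in %MVC) and return slope, intercept and the cross-over force at which
    the fitted derecruitment threshold equals the recruitment threshold."""
    slope, intercept = np.polyfit(recruit, derecruit, 1)
    crossover = intercept / (1.0 - slope) if slope != 1.0 else float("inf")
    return slope, intercept, crossover

# invented thresholds (%MVC) for a handful of motor units from one contraction
recruit = np.array([10.0, 15.0, 22.0, 28.0, 35.0, 41.0, 47.0])
derecruit = np.array([14.0, 18.0, 23.0, 26.0, 30.0, 33.0, 36.0])
print(crossover_force(recruit, derecruit))
```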

  8. Compact quasi-monoenergetic photon sources from laser-plasma accelerators for nuclear detection and characterization

    Energy Technology Data Exchange (ETDEWEB)

    Geddes, Cameron G.R., E-mail: cgrgeddes@lbl.gov; Rykovanov, Sergey; Matlis, Nicholas H.; Steinke, Sven; Vay, Jean-Luc; Esarey, Eric H.; Ludewigt, Bernhard; Nakamura, Kei; Quiter, Brian J.; Schroeder, Carl B.; Toth, Csaba; Leemans, Wim P.

    2015-05-01

    Near-monoenergetic photon sources at MeV energies offer improved sensitivity at greatly reduced dose for active interrogation, and new capabilities in treaty verification, nondestructive assay of spent nuclear fuel and emergency response. Thomson (also referred to as Compton) scattering sources are an established method to produce appropriate photon beams. Applications are however restricted by the size of the required high-energy electron linac, scattering (photon production) system, and shielding for disposal of the high energy electron beam. Laser-plasma accelerators (LPAs) produce GeV electron beams in centimeters, using the plasma wave driven by the radiation pressure of an intense laser. Recent LPA experiments are presented which have greatly improved beam quality and efficiency, rendering them appropriate for compact high-quality photon sources based on Thomson scattering. Designs for MeV photon sources utilizing the unique properties of LPAs are presented. It is shown that control of the scattering laser, including plasma guiding, can increase photon production efficiency. This reduces scattering laser size and/or electron beam current requirements to scale compatible with the LPA. Lastly, the plasma structure can decelerate the electron beam after photon production, reducing the size of shielding required for beam disposal. Together, these techniques provide a path to a compact photon source system.

  9. Self-assembled films containing crude extract of avocado as a source of tyrosinase for monophenol detection

    Energy Technology Data Exchange (ETDEWEB)

    Vieira, Nirton C.S., E-mail: nirtoncristi@gmail.com [Instituto de Física de São Carlos/Universidade de São Paulo, CP 369, 13560-970 São Carlos, SP (Brazil); Ferreira, Reginaldo A. [Centro de Estudos e Inovação em Materiais Biofuncionais Avançados/Universidade Federal de Itajubá, CP 50, 37500-903 Itajubá, MG (Brazil); Cruz Rodrigues, Valquiria da; Guimarães, Francisco E.G. [Instituto de Física de São Carlos/Universidade de São Paulo, CP 369, 13560-970 São Carlos, SP (Brazil); Queiroz, Alvaro A.A. de [Centro de Estudos e Inovação em Materiais Biofuncionais Avançados/Universidade Federal de Itajubá, CP 50, 37500-903 Itajubá, MG (Brazil)

    2013-10-15

    This paper reports on the use of the crude extract of avocado (CEA) fruit (Persea americana) as a source of tyrosinase enzyme. CEA was immobilized via layer by layer (LbL) technique onto indium tin oxide (ITO) substrates and applied in the detection of monophenol using a potentiometric biosensor. Poly(propylene imine) dendrimer of generation 3 (PPI-G3) was used as a counter ion in the layer by layer process due to its highly porous structure and functional groups suitable for enzyme linkage. After the immobilization of the crude CEA as multilayered films, standard samples of monophenol were detected in the 0.25–4.00 mM linear range with approximately 28 mV mM⁻¹ of sensitivity. This sensitivity is 14 times higher than the values found in the literature for a similar system. The results show that it is possible to obtain efficient and low-cost biosensors for monophenol detection using potentiometric transducers and alternative sources of enzymes without purification. - Highlights: • ITO films were functionalized with multilayers of PPI dendrimer and crude extract of avocado. • The films were applied as potentiometric biosensor for the detection of monophenol. • The proposed system presented an excellent sensitivity to monophenol (27 mV mM⁻¹)

  10. Self-assembled films containing crude extract of avocado as a source of tyrosinase for monophenol detection

    International Nuclear Information System (INIS)

    Vieira, Nirton C.S.; Ferreira, Reginaldo A.; Cruz Rodrigues, Valquiria da; Guimarães, Francisco E.G.; Queiroz, Alvaro A.A. de

    2013-01-01

    This paper reports on the use of the crude extract of avocado (CEA) fruit (Persea americana) as a source of tyrosinase enzyme. CEA was immobilized via layer by layer (LbL) technique onto indium tin oxide (ITO) substrates and applied in the detection of monophenol using a potentiometric biosensor. Poly(propylene imine) dendrimer of generation 3 (PPI-G3) was used as a counter ion in the layer by layer process due to its highly porous structure and functional groups suitable for enzyme linkage. After the immobilization of the crude CEA as multilayered films, standard samples of monophenol were detected in the 0.25–4.00 mM linear range with approximately 28 mV mM⁻¹ of sensitivity. This sensitivity is 14 times higher than the values found in the literature for a similar system. The results show that it is possible to obtain efficient and low-cost biosensors for monophenol detection using potentiometric transducers and alternative sources of enzymes without purification. - Highlights: • ITO films were functionalized with multilayers of PPI dendrimer and crude extract of avocado. • The films were applied as potentiometric biosensor for the detection of monophenol. • The proposed system presented an excellent sensitivity to monophenol (27 mV mM⁻¹)

  11. Evaluation of different analysis and identification methods for Salmonella detection in surface drinking water sources

    International Nuclear Information System (INIS)

    Hsu, Bing-Mu; Huang, Kuan-Hao; Huang, Shih-Wei; Tseng, Kuo-Chih; Su, Ming-Jen; Lin, Wei-Chen; Ji, Dar-Der; Shih, Feng-Cheng; Chen, Jyh-Larng; Kao, Po-Min

    2011-01-01

    The standard method for detecting Salmonella generally analyzes food or fecal samples. Salmonella often occur in relatively low concentrations in environmental waters. Therefore, some form of concentration and proliferation may be needed. This study compares three Salmonella analysis methods and develops a new Salmonella detection procedure for use in environmental water samples. The new procedure for Salmonella detection includes water concentration, nutrient broth enrichment, selection of Salmonella-containing broth by PCR, isolation of Salmonella strains on selective culture plates, detection of possible Salmonella isolates by PCR, and biochemical testing. Serological assay and pulsed-field gel electrophoresis (PFGE) can be used to identify Salmonella serotype and genotype, respectively. This study analyzed 116 raw water samples taken from 18 water plants and belonging to 5 watersheds. Of these 116, 10 water samples (8.6%) taken from 7 water plants and belonging to 4 watersheds were positive for a Salmonella-specific polymerase chain reaction targeting the invA gene. Guided by serological assay results, this study identified 7 cultured Salmonella isolates as Salmonella enterica serovar: Alnaby, Enteritidis, Houten, Montevideo, Newport, Paratyphi B var. Java, and Victoria. These seven Salmonella serovars were identified in clinical cases for the same geographical areas, but only one of them was 100% homologous with clinical cases in the PFGE pattern. - Research highlights: → A new Salmonella detection procedure for environmental water is developed. → Salmonella isolates are identified by serological assay and PFGE. → A total of seven Salmonella serovars is isolated from environmental water.

  12. Evaluation of different analysis and identification methods for Salmonella detection in surface drinking water sources

    Energy Technology Data Exchange (ETDEWEB)

    Hsu, Bing-Mu, E-mail: bmhsu@ccu.edu.tw [Department of Earth and Environmental Sciences, National Chung Cheng University, Chiayi, Taiwan, ROC (China); Huang, Kuan-Hao; Huang, Shih-Wei [Department of Earth and Environmental Sciences, National Chung Cheng University, Chiayi, Taiwan, ROC (China); Tseng, Kuo-Chih [Department of Internal Medicine, Buddhist Dalin Tzu Chi General Hospital, Chiayi, Taiwan, ROC (China); Su, Ming-Jen [Department of Clinical Pathology, Buddhist Dalin Tzu Chi General Hospital, Chiayi, Taiwan, ROC (China); Lin, Wei-Chen; Ji, Dar-Der [Research and Diagnostic Center, Centers for Disease Control, Taipei, Taiwan, ROC (China); Shih, Feng-Cheng; Chen, Jyh-Larng [Department of Environmental Engineering and Health, Yuanpei University of Science and Technology, HsinChu, Taiwan, ROC (China); Kao, Po-Min [Department of Earth and Environmental Sciences, National Chung Cheng University, Chiayi, Taiwan, ROC (China)

    2011-09-15

    The standard method for detecting Salmonella generally analyzes food or fecal samples. Salmonella often occur in relatively low concentrations in environmental waters. Therefore, some form of concentration and proliferation may be needed. This study compares three Salmonella analysis methods and develops a new Salmonella detection procedure for use in environmental water samples. The new procedure for Salmonella detection includes water concentration, nutrient broth enrichment, selection of Salmonella-containing broth by PCR, isolation of Salmonella strains on selective culture plates, detection of possible Salmonella isolates by PCR, and biochemical testing. Serological assay and pulsed-field gel electrophoresis (PFGE) can be used to identify Salmonella serotype and genotype, respectively. This study analyzed 116 raw water samples taken from 18 water plants and belonging to 5 watersheds. Of these 116, 10 water samples (8.6%) taken from 7 water plants and belonging to 4 watersheds were positive for a Salmonella-specific polymerase chain reaction targeting the invA gene. Guided by serological assay results, this study identified 7 cultured Salmonella isolates as Salmonella enterica serovar: Alnaby, Enteritidis, Houten, Montevideo, Newport, Paratyphi B var. Java, and Victoria. These seven Salmonella serovars were identified in clinical cases for the same geographical areas, but only one of them was 100% homologous with clinical cases in the PFGE pattern. - Research highlights: → A new Salmonella detection procedure for environmental water is developed. → Salmonella isolates are identified by serological assay and PFGE. → A total of seven Salmonella serovars is isolated from environmental water.

  13. Semi-empirical Calculation of Detection Efficiency for Voluminous Source Based on Effective Solid Angle Concept

    Energy Technology Data Exchange (ETDEWEB)

    Kang, M. Y.; Kim, J. H.; Choi, H. D.; Sun, G. M. [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    To calculate the full energy (FE) absorption peak efficiency for an arbitrary volume sample, we developed and verified the Effective Solid Angle (ESA) Code. The procedure for semi-empirical determination of the FE efficiency for arbitrary volume sources, together with the calculation principles and processes of the ESA code, was described in previous studies, in which the code was validated with an HPGe detector (relative efficiency 32%, n-type). In this study, we use HPGe detectors of different types and efficiencies in order to verify the performance of the ESA code for various detectors. We calculated the efficiency curve of the voluminous source and compared it with experimental data. We will carry out additional validation by measuring CRM volume sources of various media, volumes and shapes with detectors of different efficiencies and types, and we will incorporate the effect of the dead layer of the p-type HPGe detector and a coincidence summing correction technique in the near future.

  14. Developing laser ablation in an electron cyclotron resonance ion source for actinide detection with AMS

    Energy Technology Data Exchange (ETDEWEB)

    Bauder, W. [Argonne National Laboratory, Physics Division, 9600 S. Cass Ave, Lemont, IL 60439 (United States); University of Notre Dame, Nuclear Science Laboratory, 124 Nieuwland Science Hall, Notre Dame, IN 46556 (United States); Pardo, R.C.; Kondev, F.G.; Kondrashev, S.; Nair, C.; Nusair, O. [Argonne National Laboratory, Physics Division, 9600 S. Cass Ave, Lemont, IL 60439 (United States); Palchan, T. [Hebrew University, Racah Institute of Physics, Jerusalem 91904 (Israel); Scott, R.; Seweryniak, D.; Vondrasek, R. [Argonne National Laboratory, Physics Division, 9600 S. Cass Ave, Lemont, IL 60439 (United States); Collon, P. [University of Notre Dame, Nuclear Science Laboratory, 124 Nieuwland Science Hall, Notre Dame, IN 46556 (United States); Paul, M. [Hebrew University, Racah Institute of Physics, Jerusalem 91904 (Israel)

    2015-10-15

    A laser ablation material injection system has been developed at the ATLAS electron cyclotron resonance (ECR) ion source for use in accelerator mass spectrometry experiments. Beam production with laser ablation initially suffered from instabilities due to fluctuations in laser energy and cratering on the sample surface by the laser. However, these instabilities were rectified by applying feedback correction for the laser energy and rastering the laser across the sample surface. An initial experiment successfully produced and accelerated low intensity actinide beams with up to 1000 counts per second. With continued development, laser ablation shows promise as an alternative material injection scheme for ECR ion sources and may help substantially reduce cross talk in the source.

  15. Combined sensors for the detection, identification and monitoring of radiation sources

    International Nuclear Information System (INIS)

    Yaar, I.

    2006-01-01

    Radiation sources widely used in industry, medicine, agriculture, research and education are the most dangerous from the viewpoint of their widespread availability and easy access. The probability that these sources will be stolen and used to assemble a radiological dispersal device (RDD) is not negligible. Such a device can be used by terrorist groups to contaminate industrial centers, airports, seaports and residential areas, which can affect a large sector of the economy of a country. Detonation of an RDD can lead to death and exposure of the population to radiation, but, as a whole, the use of such a bomb is aimed at creating panic among the population, causing economic damage and social shock to the society. In this work, ways to reduce the threat from radiation sources obtained outside and within a country are discussed.

  16. Microplastic effect thresholds for freshwater benthic macroinvertebrates

    NARCIS (Netherlands)

    Redondo Hasselerharm, P.E.; Dede Falahudin, Dede; Peeters, E.T.H.M.; Koelmans, A.A.

    2018-01-01

    Now that microplastics have been detected in lakes, rivers and estuaries all over the globe, evaluating their effects on biota has become an urgent research priority. This is the first study that aims at determining the effect thresholds for a battery of six freshwater benthic macroinvertebrates

  17. Control of emissions from stationary combustion sources: Pollutant detection and behavior in the atmosphere

    International Nuclear Information System (INIS)

    Licht, W.; Engel, A.J.; Slater, S.M.

    1979-01-01

    Stationary combustion sources continue to be significant sources of NOx and SOx pollutants in the ambient atmosphere. This volume considers four problem areas: (1) control of emissions from stationary combustion sources, particularly SOx and NOx; (2) pollutant behavior in the atmosphere; (3) advances in air pollution analysis; and (4) air quality management. Topics of interest include carbon slurries for sulfur dioxide abatement, mass transfer in the Kellogg-Weir air quality control system, oxidation/inhibition of sulfite ion in aqueous solution, some micrometeorological methods of measuring dry deposition rates, Spanish moss as an indicator of airborne metal contamination, and air quality impacts from future electric power generation in Texas.

  18. Hadron production near threshold

    Indian Academy of Sciences (India)

    Abstract. Final state interaction effects in pp → pΛK+ and pd → 3He η reactions are explored near threshold to study the sensitivity of the cross-sections to the pΛ potential and the ηN scattering matrix. The final state scattering wave functions between Λ and p and η and 3He are described rigorously. The Λ production is ...

  19. Casualties and threshold effects

    International Nuclear Information System (INIS)

    Mays, C.W.; National Cancer Inst., Bethesda

    1988-01-01

    Radiation effects like cancer are denoted as casualties. Other radiation effects occur in almost everyone when the radiation dose is sufficiently high; one then speaks of radiation effects with a threshold dose. In this article the author questions this classification of radiation effects, arguing that some effects of exposure to radiation do not fit into it. (H.W.). 19 refs.; 2 figs.; 1 tab

  20. Resonance phenomena near thresholds

    International Nuclear Information System (INIS)

    Persson, E.; Mueller, M.; Rotter, I.; Technische Univ. Dresden

    1995-12-01

    The trapping effect is investigated close to the elastic threshold. The nucleus is described as an open quantum mechanical many-body system embedded in the continuum of decay channels. An ensemble of compound nucleus states with both discrete and resonance states is investigated in an energy-dependent formalism. It is shown that the discrete states can trap the resonance ones and also that the discrete states can directly influence the scattering cross section. (orig.)

  1. Non-Darwinian evolution for the source detection of atmospheric releases

    Science.gov (United States)

    Cervone, Guido; Franzese, Pasquale

    2011-08-01

    A non-Darwinian evolutionary algorithm is presented as search engine to identify the characteristics of a source of atmospheric pollutants, given a set of concentration measurements. The algorithm drives iteratively a forward dispersion model from tentative sources toward the real source. The solutions of non-Darwinian evolution processes are not generated through pseudo-random operators, unlike traditional evolutionary algorithms, but through a reasoning process based on machine learning rule generation and instantiation. The new algorithm is tested with both a synthetic case and with the Prairie Grass field experiment. To further test the capabilities of the algorithm to work in real-world scenarios, the source identification of all Prairie Grass releases was performed with a decreasing number of sensor measurements, and a relationship is found between the precision of the solution, the number of sensors available, and the levels of concentration measured by the sensors. The proposed methodology can be used for a variety of optimization problems, and is particularly suited for problems where the operations needed for evaluating new candidate solutions are computationally expensive.
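
    The record above couples an evolutionary search to a forward dispersion model. The hedged Python sketch below keeps that structure (score candidate (x, y, strength) hypotheses against sensor readings with a forward model, then regenerate candidates around the best ones) but substitutes a very crude plume formula and a plain random perturbation for the paper's dispersion model and its machine-learning, rule-based candidate generation; all numbers are invented.

```python
import numpy as np

def gaussian_plume(source_xy, q, sensors, u=2.0):
    """Very crude steady-state plume: concentration at each sensor row (x, y) from
    a ground-level source of strength q at source_xy, with wind along +x."""
    dx = sensors[:, 0] - source_xy[0]
    dy = sensors[:, 1] - source_xy[1]
    sig = 0.1 * np.maximum(dx, 1.0)                   # lateral spread grows downwind
    c = q / (2 * np.pi * sig ** 2 * u) * np.exp(-dy ** 2 / (2 * sig ** 2))
    return np.where(dx > 0, c, 0.0)

def misfit(candidate, sensors, observed):
    return np.sum((gaussian_plume(candidate[:2], candidate[2], sensors) - observed) ** 2)

def search_source(sensors, observed, n_iter=300, pop=20, keep=5, seed=0):
    """Iterative search for (x, y, q): keep the best candidates under the forward
    model and perturb them to form the next generation. A plain random perturbation
    stands in for the rule-based, machine-learning-driven evolution of the paper."""
    rng = np.random.default_rng(seed)
    cand = np.column_stack([rng.uniform(0, 1000, pop), rng.uniform(0, 1000, pop),
                            rng.uniform(0.1, 10.0, pop)])
    for _ in range(n_iter):
        order = np.argsort([misfit(c, sensors, observed) for c in cand])
        best = cand[order[:keep]]
        cand = best[rng.integers(0, keep, pop)] + rng.normal(scale=[20.0, 20.0, 0.3],
                                                             size=(pop, 3))
        cand[:, 2] = np.clip(cand[:, 2], 0.01, None)
    return min(cand, key=lambda c: misfit(c, sensors, observed))

# synthetic check: ten sensors on a north-south line downwind of a source at (300, 500)
sensors = np.column_stack([np.full(10, 800.0), np.linspace(300.0, 700.0, 10)])
observed = gaussian_plume(np.array([300.0, 500.0]), 2.0, sensors)
print(search_source(sensors, observed))
```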

  2. A systems approach for detecting sources of Phytophthora contamination in nurseries

    Science.gov (United States)

    Jennifer L. Parke; Niklaus Grünwald; Carrie Lewis; Val Fieland

    2010-01-01

    Nursery plants are also important long-distance vectors of non-indigenous pathogens such as P. ramorum and P. kernoviae. Pre-shipment inspections have not been adequate to ensure that shipped plants are free from Phytophthora, nor has this method informed growers about sources of contamination in their...

  3. Bootstrap inversion technique for atmospheric trace gas source detection and quantification using long open-path laser measurements

    Directory of Open Access Journals (Sweden)

    C. B. Alden

    2018-03-01

    Full Text Available Advances in natural gas extraction technology have led to increased activity in the production and transport sectors in the United States and, as a consequence, an increased need for reliable monitoring of methane leaks to the atmosphere. We present a statistical methodology in combination with an observing system for the detection and attribution of fugitive emissions of methane from distributed potential source location landscapes such as natural gas production sites. We measure long (> 500 m), integrated open-path concentrations of atmospheric methane using a dual frequency comb spectrometer and combine measurements with an atmospheric transport model to infer leak locations and strengths using a novel statistical method, the non-zero minimum bootstrap (NZMB). The new statistical method allows us to determine whether the empirical distribution of possible source strengths for a given location excludes zero. Using this information, we identify leaking source locations (i.e., natural gas wells) through rejection of the null hypothesis that the source is not leaking. The method is tested with a series of synthetic data inversions with varying measurement density and varying levels of model–data mismatch. It is also tested with field observations of (1) a non-leaking source location and (2) a source location where a controlled emission of 3.1 × 10⁻⁵ kg s⁻¹ of methane gas is released over a period of several hours. This series of synthetic data tests and outdoor field observations using a controlled methane release demonstrates the viability of the approach for the detection and sizing of very small leaks of methane across large distances (4+ km² in synthetic tests). The field tests demonstrate the ability to attribute small atmospheric enhancements of 17 ppb to the emitting source location against a background of combined atmospheric (e.g., background methane variability) and measurement uncertainty of 5 ppb (1σ), when…

  4. Bootstrap inversion technique for atmospheric trace gas source detection and quantification using long open-path laser measurements

    Science.gov (United States)

    Alden, Caroline B.; Ghosh, Subhomoy; Coburn, Sean; Sweeney, Colm; Karion, Anna; Wright, Robert; Coddington, Ian; Rieker, Gregory B.; Prasad, Kuldeep

    2018-03-01

    Advances in natural gas extraction technology have led to increased activity in the production and transport sectors in the United States and, as a consequence, an increased need for reliable monitoring of methane leaks to the atmosphere. We present a statistical methodology in combination with an observing system for the detection and attribution of fugitive emissions of methane from distributed potential source location landscapes such as natural gas production sites. We measure long (> 500 m), integrated open-path concentrations of atmospheric methane using a dual frequency comb spectrometer and combine measurements with an atmospheric transport model to infer leak locations and strengths using a novel statistical method, the non-zero minimum bootstrap (NZMB). The new statistical method allows us to determine whether the empirical distribution of possible source strengths for a given location excludes zero. Using this information, we identify leaking source locations (i.e., natural gas wells) through rejection of the null hypothesis that the source is not leaking. The method is tested with a series of synthetic data inversions with varying measurement density and varying levels of model-data mismatch. It is also tested with field observations of (1) a non-leaking source location and (2) a source location where a controlled emission of 3.1 × 10⁻⁵ kg s⁻¹ of methane gas is released over a period of several hours. This series of synthetic data tests and outdoor field observations using a controlled methane release demonstrates the viability of the approach for the detection and sizing of very small leaks of methane across large distances (4+ km² in synthetic tests). The field tests demonstrate the ability to attribute small atmospheric enhancements of 17 ppb to the emitting source location against a background of combined atmospheric (e.g., background methane variability) and measurement uncertainty of 5 ppb (1σ), when measurements are averaged over 2 min. The…
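
    As a loose illustration of the statistical test described in this record and the previous one, the Python sketch below applies a generic bootstrap to a single candidate location: source-strength estimates are recomputed from resampled observation/footprint pairs and the location is flagged as leaking only if the empirical distribution excludes zero. It is a stand-in for, not a reproduction of, the non-zero minimum bootstrap; the footprint values, noise level and true strength are invented.

```python
import numpy as np

def bootstrap_excludes_zero(footprint, observed, n_boot=2000, alpha=0.05, seed=0):
    """Generic bootstrap stand-in for the paper's NZMB on one candidate source:
    re-estimate its strength from resampled (footprint, observation) pairs and
    test whether the empirical distribution of estimates excludes zero."""
    rng = np.random.default_rng(seed)
    n = len(observed)
    est = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, n)
        f, y = footprint[idx], observed[idx]
        est[b] = (f @ y) / (f @ f)              # one-parameter least-squares strength
    return np.quantile(est, alpha) > 0.0, float(np.median(est))

# toy data: modelled enhancement per unit emission rate (from a transport model)
# and noisy path-integrated observations; units are arbitrary
rng = np.random.default_rng(1)
footprint = rng.uniform(0.5, 2.0, 60)
observed = footprint * 3.0 + rng.normal(0.0, 1.0, 60)   # true strength = 3.0
print(bootstrap_excludes_zero(footprint, observed))
```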

  5. Theory of the detection of the field surrounding half-dressed sources

    International Nuclear Information System (INIS)

    Compagno, G.; Passante, R.; Persico, F.

    1988-01-01

    Half-dressed sources are defined as sources deprived partially or totally of the cloud of virtual quanta which surrounds them in the ground state of the total system. Two models of a half-dressed point source S are considered, the first in the framework of the theory of massive scalar fields and the second in quantum electrodynamics (QED). In both cases the detector is modeled by a second fully dressed source T of the same field, which is also bound to an oscillation center by harmonic forces. It is shown that when S at time t = 0 is suddenly coupled to or decoupled from the field, the detector T, which is initially at rest, is set in motion after a time t = R 0 /c, where R 0 is the S-T distance. Neglecting the reaction back on the field due to the oscillatory motion of T, the amplitude of oscillation for t = ∞ is obtained as a function of R 0 . Thus the time-varying virtual field of S is shown to be capable of exerting a force which excites the model detector. For the QED case, this force is related to the properties of the energy density of the virtual field. This energy density displays a singularity at r = ct, and the mathematical nature of this singularity is studied in detail. In this way it is shown that the energy density of the time-dependent virtual field is rather different from that of a pulse of radiation emitted by a source during energy-conserving processes. The differences are discussed in detail, as well as the limitations of the model

  6. Risk thresholds for alcohol consumption

    DEFF Research Database (Denmark)

    Wood, Angela M; Kaptoge, Stephen; Butterworth, Adam S

    2018-01-01

    BACKGROUND: Low-risk limits recommended for alcohol consumption vary substantially across different national guidelines. To define thresholds associated with lowest risk for all-cause mortality and cardiovascular disease, we studied individual-participant data from 599 912 current drinkers without previous cardiovascular disease. METHODS: We did a combined analysis of individual-participant data from three large-scale data sources in 19 high-income countries (the Emerging Risk Factors Collaboration, EPIC-CVD, and the UK Biobank). We characterised dose-response associations and calculated hazard… …·4 million person-years of follow-up. For all-cause mortality, we recorded a positive and curvilinear association with the level of alcohol consumption, with the minimum mortality risk around or below 100 g per week. Alcohol consumption was roughly linearly associated with a higher risk of stroke (HR per 100…

  7. Pulsed Operation of a Compact Fusion Neutron Source Using a High-Voltage Pulse Generator Developed for Landmine Detection

    International Nuclear Information System (INIS)

    Yamauchi, Kunihito; Watanabe, Masato; Okino, Akitoshi; Kohno, Toshiyuki; Hotta, Eiki; Yuura, Morimasa

    2005-01-01

    Preliminary experimental results of a pulsed neutron source based on a discharge-type beam fusion called Inertial Electrostatic Confinement Fusion (IECF) for landmine detection are presented. In Japan, a research and development project for constructing an advanced anti-personnel landmine detection system using IECF, which is effective not only for metal landmines but also for plastic ones, is now in progress. This project consists of several R and D topics, and one of them is the R and D of a high-voltage pulse generator system specialized for landmine detection, which can be used in severe environments such as the field in Afghanistan. Thus a prototype of the system for landmine detection was designed and fabricated in consideration of compactness, lightness, cooling performance, dustproofing and robustness. Using this prototype pulse generator system, a conventional IECF device was operated as a preliminary experiment. As a result, it was confirmed that the suggested pulse generator system is suitable for the landmine detection system, and the results follow the empirical law obtained in previous experiments. The maximum neutron production rate of 2.0×10⁸ n/s was obtained at a pulsed discharge of -51 kV, 7.3 A

  8. Robust fault detection for the dynamics of high-speed train with multi-source finite frequency interference.

    Science.gov (United States)

    Bai, Weiqi; Dong, Hairong; Yao, Xiuming; Ning, Bin

    2018-04-01

    This paper proposes a composite fault detection scheme for the dynamics of a high-speed train (HST), using an unknown input observer-like (UIO-like) fault detection filter, in the presence of wind gusts and operating noises, which are modeled as a disturbance generated by an exogenous system plus an unknown multi-source disturbance within a finite frequency domain. Using system input and system output measurements, the fault detection filter is designed to generate the needed residual signals. In order to decouple the disturbance from the residual signals without truncating the influence of faults, this paper proposes a method to partition the disturbance into two parts. One subset of the disturbance does not appear in the residual dynamics, and the influence of the other subset is constrained by an H∞ performance index in a finite frequency domain. A set of detection subspaces is defined, and each fault is assigned to its own detection subspace to guarantee that the residual signals are diagonally and promptly affected by the faults. Simulations are conducted to demonstrate the effectiveness and merits of the proposed method. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  9. Development of portable mass spectrometer with electron cyclotron resonance ion source for detection of chemical warfare agents in air.

    Science.gov (United States)

    Urabe, Tatsuya; Takahashi, Kazuya; Kitagawa, Michiko; Sato, Takafumi; Kondo, Tomohide; Enomoto, Shuichi; Kidera, Masanori; Seto, Yasuo

    2014-01-01

    A portable mass spectrometer with an electron cyclotron resonance ion source (miniECRIS-MS) was developed. It was used for in situ monitoring of trace amounts of chemical warfare agents (CWAs) in atmospheric air. Instrumental construction and parameters were optimized to realize a fast response, high sensitivity, and a small body size. Three types of CWAs, i.e., phosgene, mustard gas, and hydrogen cyanide were examined to check if the mass spectrometer was able to detect characteristic elements and atomic groups. From the results, it was found that CWAs were effectively ionized in the miniECRIS-MS, and their specific signals could be discerned over the background signals of air. In phosgene, the signals of the 35Cl+ and 37Cl+ ions were clearly observed with high dose-response relationships in the parts-per-billion level, which could lead to the quantitative on-site analysis of CWAs. A parts-per-million level of mustard gas, which was far lower than its lethal dosage (LCt50), was successfully detected with a high signal-stability of the plasma ion source. It was also found that the chemical forms of CWAs ionized in the plasma, i.e., monoatomic ions, fragment ions, and molecular ions, could be detected, thereby enabling the effective identification of the target CWAs. Despite the disadvantages associated with miniaturization, the overall performance (sensitivity and response time) of the miniECRIS-MS in detecting CWAs exceeded those of sector-type ECRIS-MS, showing its potential for on-site detection in the future. Copyright © 2013 Elsevier B.V. All rights reserved.

  10. Use of an external source (60Co) for 32P detection efficiency determination by the Cerenkov effect, in soil extracts

    International Nuclear Information System (INIS)

    Nascimento Filho, V.F. do.

    1975-03-01

    The detection of 32 P in aqueous extracts is usually made with the aid of a Geiger-Müller detector with a thin window and the sample on a planchet. A technique is presently being developed for detecting high-energy beta emitters ( 32 P, 42 K, 86 Rb) through the Cerenkov effect, using a commercial liquid scintillation system. This technique, despite being approximately 30 times more sensitive, has the inconvenience that the detection efficiency varies, mainly for colored samples (soil extracts, for instance). Hence the need to determine the detection efficiency for each sample. The internal standardization and channels-ratio methods show a series of drawbacks, mainly the non-reutilization of the samples (first method) and statistical uncertainty for low-activity samples (second method). These drawbacks can be eliminated through the external standardization method. A 60 Co source with 1.4 μCi activity was adapted to the sample elevator of the detector system, and a comparison was made with the channels-ratio method to evaluate the efficiency of 32 P detection in soil extracts (P extraction and fractionation). The external standardization method proved to be more accurate, besides being less influenced by high-voltage variation, sample volume and vial type. In the case of large samples, it is advisable to carry out detection in vials filled up to their full capacity; in the case of small samples, the whole volume should be transferred to the vials and completed up to 9 ml for nylon vials, 10 ml for glass vials and 11 to 14 ml for polyethylene vials. On the other hand, plastic vials showed higher detection efficiency than glass ones. As to background radiation, the lowest rates were given by nylon vials and the highest by Beckman glass vials [pt

  11. Fast and accurate detection of spread source in large complex networks.

    Science.gov (United States)

    Paluch, Robert; Lu, Xiaoyan; Suchecki, Krzysztof; Szymański, Bolesław K; Hołyst, Janusz A

    2018-02-06

    Spread over complex networks is a ubiquitous process with increasingly wide applications. Locating spread sources is often important, e.g. finding patient one in an epidemic, or the source of rumor spreading in a social network. Pinto, Thiran and Vetterli introduced an algorithm (PTVA) to solve the important case of this problem in which a limited set of nodes act as observers and report the times at which the spread reached them. PTVA uses all observers to find a solution. Here we propose a new approach in which observers with low-quality information (i.e. with large spread encounter times) are ignored and potential sources are selected based on the likelihood gradient from high-quality observers. The original complexity of PTVA is O(N^α), where α ∈ (3,4) depends on the network topology and the number of observers (N denotes the number of nodes in the network). Our Gradient Maximum Likelihood Algorithm (GMLA) reduces this complexity to O(N² log N). Extensive numerical tests performed on synthetic networks and the real Gnutella network, with the limitation that the IDs of spreaders are unknown to observers, demonstrate that for scale-free networks with this limitation GMLA yields higher-quality localization results than PTVA does.
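
    The following Python sketch (using networkx) illustrates, in a heavily simplified form, the two ingredients highlighted above: restricting attention to the few earliest observers and climbing the network toward candidates that explain their arrival times better. The linear delay-versus-distance score is an ad hoc substitute for the Gaussian propagation likelihood used by PTVA/GMLA, and the random-graph test case is invented.

```python
import networkx as nx
import numpy as np

def gmla_like_search(G, obs_times, k=5):
    """Simplified gradient-style search: keep only the k earliest observers, score
    a candidate source by how well the observers' delays follow a straight line in
    their graph distance from the candidate, and hill-climb from the earliest
    observer toward better-scoring neighbours."""
    observers = sorted(obs_times, key=obs_times.get)[:k]
    t0 = obs_times[observers[0]]
    delays = np.array([obs_times[o] - t0 for o in observers])

    def score(node):
        d = nx.single_source_shortest_path_length(G, node)
        dists = np.array([d[o] for o in observers], dtype=float)
        if len(set(dists)) < 2:
            return -np.sum(delays ** 2)
        a, b = np.polyfit(dists, delays, 1)
        return -np.sum((delays - (a * dists + b)) ** 2)

    current, current_score = observers[0], score(observers[0])
    improved = True
    while improved:                                # strict improvement guarantees termination
        improved = False
        for nb in G.neighbors(current):
            s = score(nb)
            if s > current_score:
                current, current_score, improved = nb, s, True
                break
    return current

# toy check on the giant component of a random graph, unit propagation time per hop
G = nx.erdos_renyi_graph(200, 0.03, seed=2)
G = G.subgraph(max(nx.connected_components(G), key=len)).copy()
true_source = next(iter(G.nodes))
dist = nx.single_source_shortest_path_length(G, true_source)
observers = list(dist)[10:40]
obs_times = {o: float(dist[o]) for o in observers}
print(gmla_like_search(G, obs_times), "vs true source", true_source)
```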

  12. Detection of anomalies in radio tomography of asteroids: Source count and forward errors

    Science.gov (United States)

    Pursiainen, S.; Kaasalainen, M.

    2014-09-01

    The purpose of this study was to advance numerical methods for radio tomography in which asteroid's internal electric permittivity distribution is to be recovered from radio frequency data gathered by an orbiter. The focus was on signal generation via multiple sources (transponders) providing one potential, or even essential, scenario to be implemented in a challenging in situ measurement environment and within tight payload limits. As a novel feature, the effects of forward errors including noise and a priori uncertainty of the forward (data) simulation were examined through a combination of the iterative alternating sequential (IAS) inverse algorithm and finite-difference time-domain (FDTD) simulation of time evolution data. Single and multiple source scenarios were compared in two-dimensional localization of permittivity anomalies. Three different anomaly strengths and four levels of total noise were tested. Results suggest, among other things, that multiple sources can be necessary to obtain appropriate results, for example, to distinguish three separate anomalies with permittivity less or equal than half of the background value, relevant in recovery of internal cavities.

  13. Characterization of sealed radioactive sources. Uncertainty analysis to improve detection methods

    International Nuclear Information System (INIS)

    Cummings, D.G.; Sommers, J.D.; Adamic, M.L.; Jimenez, M.; Giglio, J.J.; Carney, K.P.

    2009-01-01

    A radioactive 137 Cs source has been analyzed for the radioactive parent 137 Cs and the stable decay daughter 137 Ba. The ratio of daughter to parent atoms is used to estimate the date when the Cs was purified prior to source encapsulation (an 'age' since purification). The isotopes were analyzed by inductively coupled plasma mass spectrometry (ICP-MS) after chemical separation. In addition, Ba was analyzed by isotope dilution ICP-MS (ID-ICP-MS). A detailed error analysis of the mass spectrometric work has been undertaken to identify areas of improvement and to quantify the effect the errors have on the determined 'age'. This paper reports an uncertainty analysis identifying areas of improvement and alternative techniques that may reduce the uncertainties. In particular, work on isotope dilution using ICP-MS for the 'age' determination of sealed sources is presented. The results will be compared to the original work done using external standards to calibrate the ICP-MS instrument. (author)
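
    Under the simple assumption that all of the stable 137Ba in the source grew in from 137Cs decay after purification, the 'age' follows directly from the atom ratio, as in the short sketch below; the half-life value and the example ratio are illustrative, not taken from the record.

```python
import math

CS137_HALF_LIFE_Y = 30.08       # years; illustrative value, not from the record

def age_from_daughter_parent_ratio(n_ba_per_n_cs):
    """Years since purification assuming every 137Ba atom grew in from 137Cs decay:
    N_Ba / N_Cs = exp(lambda * t) - 1  =>  t = ln(1 + ratio) / lambda."""
    lam = math.log(2.0) / CS137_HALF_LIFE_Y
    return math.log(1.0 + n_ba_per_n_cs) / lam

# a hypothetical measured atom ratio of 0.6 corresponds to roughly 20 years
print(age_from_daughter_parent_ratio(0.6))
```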

  14. Stories about breast cancer in Australian women's magazines: information sources for risk, early detection and treatment.

    Science.gov (United States)

    Wilkes, L; Withnall, J; Harris, R; White, K; Beale, B; Hobson, J; Durham, M; Kristjanson, L

    2001-06-01

    Sixty articles in five Australian women's magazines were analyzed for journalistic qualities, metaphors, narrative features and accuracy of clinical facts related to risk, early detection and treatment of breast cancer. The stories were features, news features or soft news stories. The stories reflected the 'good news' editorial style of women's magazines. A dominant theme in the stories was that early detection of breast cancer is crucial and equals survival. While there were few inaccuracies in the stories, there was little detail of treatment modalities, an emphasis on lifestyle as a risk factor and a prevailing message that a genetic history of breast cancer means you will get it. A major implication of the findings is that nurses, who provide information to women, must be aware of the goals of journalists and the educational power of narrative logic of stories in women's magazines.

  15. Highly sensitive time resolved singlet oxygen luminescence detection using LEDs as the excitation source

    International Nuclear Information System (INIS)

    Hackbarth, S; Schlothauer, J; Preuss, A; Röder, B

    2013-01-01

    For the first time singlet oxygen luminescence kinetics in living cells were detected at high precision using LED light for excitation. As LED technology evolves, the light intensity emitted by standard LEDs allows photosensitized singlet oxygen luminescence detection in solution and cell suspensions. We present measurements superior to those of most actual laser powered setups regarding precision of singlet oxygen kinetics in solutions and cell suspensions. Data presented here show that LED based setups allow the determination of the photosensitizer triplet and singlet oxygen decay times in vitro with an accuracy of 0.1 μs. This enables monitoring of the photosensitizer efficiency and interaction with the cellular components using illumination doses small enough not to cause cell death. (letter)

  16. Study on development and actual application of scientific crime detection technique using small scale neutron radiation source

    International Nuclear Information System (INIS)

    Suzuki, Yasuhiro; Kishi, Toru; Tachikawa, Noboru; Ishikawa, Isamu.

    1997-01-01

    PGA (prompt γ-ray analysis) is a method that analyzes the γ-rays emitted from the atomic nuclei of elements in a specimen immediately (within 10^-14 s) after neutron irradiation. Because neutrons, with their excellent transmission, are used as the exciting source, the method can non-destructively inspect materials inside closed containers, and it can also detect light elements such as boron and nitrogen that are difficult to measure with other non-destructive analyses. In particular, it can detect the high concentrations of nitrogen and chlorine that are characteristic of explosives. However, because of the many constraints of working at a nuclear reactor site, development of an analytical apparatus based on a small-scale neutron source was undertaken first. In this fiscal year, analysis of light elements such as nitrogen and chlorine by PGA was attempted using 252Cf, the neutron source that is simplest to operate. Because the 252Cf neutron flux is considerably lower than that of a nuclear reactor, the analytical sensitivity achievable with it was also investigated. (G.K.)

  17. Developing an Open Source, Reusable Platform for Distributed Collaborative Information Management in the Early Detection Research Network

    Science.gov (United States)

    Hart, Andrew F.; Verma, Rishi; Mattmann, Chris A.; Crichton, Daniel J.; Kelly, Sean; Kincaid, Heather; Hughes, Steven; Ramirez, Paul; Goodale, Cameron; Anton, Kristen

    2012-01-01

    For the past decade, the NASA Jet Propulsion Laboratory, in collaboration with Dartmouth University, has served as the center for informatics for the Early Detection Research Network (EDRN). The EDRN is a multi-institution research effort funded by the U.S. National Cancer Institute (NCI) and tasked with identifying and validating biomarkers for the early detection of cancer. As the distributed network has grown, increasingly formal processes have been developed for the acquisition, curation, storage, and dissemination of heterogeneous research information assets, and an informatics infrastructure has emerged. In this paper we discuss the evolution of EDRN informatics, its success as a mechanism for distributed information integration, and the potential sustainability and reuse benefits of emerging efforts to make the platform components themselves open source. We describe our experience transitioning a large closed-source software system to a community-driven, open source project at the Apache Software Foundation, and point to lessons learned that will guide our present efforts to promote the reuse of the EDRN informatics infrastructure by a broader community.

  18. Detecting Large-Scale Brain Networks Using EEG: Impact of Electrode Density, Head Modeling and Source Localization

    Science.gov (United States)

    Liu, Quanying; Ganzetti, Marco; Wenderoth, Nicole; Mantini, Dante

    2018-01-01

    Resting state networks (RSNs) in the human brain were recently detected using high-density electroencephalography (hdEEG). This was done by using an advanced analysis workflow to estimate neural signals in the cortex and to assess functional connectivity (FC) between distant cortical regions. FC analyses were conducted either using temporal (tICA) or spatial independent component analysis (sICA). Notably, EEG-RSNs obtained with sICA were very similar to RSNs retrieved with sICA from functional magnetic resonance imaging data. It still remains to be clarified, however, what technological aspects of hdEEG acquisition and analysis primarily influence this correspondence. Here we examined to what extent the detection of EEG-RSN maps by sICA depends on the electrode density, the accuracy of the head model, and the source localization algorithm employed. Our analyses revealed that the collection of EEG data using a high-density montage is crucial for RSN detection by sICA, but also the use of appropriate methods for head modeling and source localization have a substantial effect on RSN reconstruction. Overall, our results confirm the potential of hdEEG for mapping the functional architecture of the human brain, and highlight at the same time the interplay between acquisition technology and innovative solutions in data analysis. PMID:29551969

  19. Novel human-associated Lachnospiraceae genetic markers improve detection of fecal pollution sources in urban waters.

    Science.gov (United States)

    Feng, Shuchen; Bootsma, Melinda; McLellan, Sandra L

    2018-05-04

    The human microbiome contains many organisms that could potentially be used as indicators of human fecal pollution. Here we report the development of two novel human-associated genetic marker assays that target organisms within the family Lachnospiraceae. Next-generation sequencing of the V6 region of the 16S rRNA gene from sewage and animal stool samples identified 40 human-associated marker candidates with a robust signal in sewage and low or no occurrence in nonhuman hosts. Two were chosen for quantitative PCR (qPCR) assay development using longer sequences (V2 to V9 regions) generated from clone libraries. Validation of these assays, designated Lachno3 and Lachno12, was performed using fecal samples (n=55) from cat, dog, pig, cow, deer, and gull sources, and compared with established host-associated assays (Lachno2 and two human Bacteroides assays, HB and HF183/BacR287). Each of the established assays cross-reacted with at least one other animal, including animals common in urban areas. Lachno3 and Lachno12 were primarily human-associated; however, Lachno12 demonstrated low levels of cross-reactivity with select cows and non-specific amplification in pigs. This limitation may not be problematic when testing urban waters. These novel markers resolved ambiguous results from previous investigations in stormwater-impacted waters, demonstrating their utility. The complexity of the microbiome in humans and animals suggests no single organism is strictly specific to humans, and multiple complementary markers used in combination will provide the highest resolution and specificity for assessing fecal pollution sources. IMPORTANCE Traditional fecal indicator bacteria do not distinguish animal from human fecal pollution, which is necessary to evaluate health risks and mitigate pollution sources. Assessing urban areas is challenging since water can be impacted by sewage, which has a high likelihood of carrying human pathogens, as well as pet waste and urban wildlife. We

  20. ON COMPUTING UPPER LIMITS TO SOURCE INTENSITIES

    International Nuclear Information System (INIS)

    Kashyap, Vinay L.; Siemiginowska, Aneta; Van Dyk, David A.; Xu Jin; Connors, Alanna; Freeman, Peter E.; Zezas, Andreas

    2010-01-01

    A common problem in astrophysics is determining how bright a source could be and still not be detected in an observation. Despite the simplicity with which the problem can be stated, the solution involves complicated statistical issues that require careful analysis. In contrast to the more familiar confidence bound, this concept has never been formally analyzed, leading to a great variety of often ad hoc solutions. Here we formulate and describe the problem in a self-consistent manner. Detection significance is usually defined by the acceptable proportion of false positives (background fluctuations that are claimed as detections, or Type I error), and we invoke the complementary concept of false negatives (real sources that go undetected, or Type II error), based on the statistical power of a test, to compute an upper limit to the detectable source intensity. To determine the minimum intensity that a source must have for it to be detected, we first define a detection threshold and then compute the probabilities of detecting sources of various intensities at the given threshold. The intensity that corresponds to the specified Type II error probability defines that minimum intensity and is identified as the upper limit. Thus, an upper limit is a characteristic of the detection procedure rather than the strength of any particular source. It should not be confused with confidence intervals or other estimates of source intensity. This is particularly important given the large number of catalogs that are being generated from increasingly sensitive surveys. We discuss, with examples, the differences between these upper limits and confidence bounds. Both measures are useful quantities that should be reported in order to extract the most science from catalogs, though they answer different statistical questions: an upper bound describes an inference range on the source intensity, while an upper limit calibrates the detection process. We provide a recipe for computing upper
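
    The distinction the authors draw between a detection threshold (set by the Type I error α) and an upper limit (set by the Type II error β) can be made concrete with Poisson counts. A minimal sketch, assuming a known expected background b in the source aperture; the numbers are illustrative and not taken from the paper.

        from scipy.stats import poisson

        def detection_threshold(b, alpha=0.01):
            """Smallest count n* with P(N >= n* | background b) <= alpha (Type I)."""
            n = int(b)
            while poisson.sf(n - 1, b) > alpha:   # sf(n-1) = P(N >= n)
                n += 1
            return n

        def upper_limit(b, alpha=0.01, beta=0.5, step=0.01):
            """Smallest source intensity s whose detection probability at the
            threshold reaches 1 - beta (i.e. Type II error beta)."""
            n_star = detection_threshold(b, alpha)
            s = 0.0
            while poisson.sf(n_star - 1, b + s) < 1.0 - beta:
                s += step
            return n_star, s

        thr, s_min = upper_limit(b=3.0, alpha=0.01, beta=0.5)
        print(f"threshold = {thr} counts, upper limit ~ {s_min:.2f} counts")

    The upper limit so obtained characterizes the detection procedure (threshold plus required detection probability), not the brightness of any particular source, which is exactly the point made in the abstract.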

  1. Recent progress of below-threshold harmonic generation

    International Nuclear Information System (INIS)

    Xiong, Wei-Hao; Peng, Liang-You; Gong, Qihuang

    2017-01-01

    The harmonics generated from the interaction of a strong laser field with atoms and molecules in the gas phase can be applied as coherent light sources and as probes of structures and dynamics in matter. In the last three decades, most experimental and theoretical studies have focused on high-order harmonic generation because of its applications in attosecond science. However, low-order harmonics near the ionization threshold of the target have been less explored, partially because the spectrum in this region is more complicated from both the theoretical and the experimental point of view. After several pioneering investigations in the mid 1990s, near-threshold harmonics (NTHs) began to draw great attention again because of the development of high-repetition-rate cavity-enhanced harmonics about 10 years ago. Very recently, NTHs have attracted many experimental and theoretical studies due to their potential applications as light sources and their complicated generation mechanisms. In this topical review, we summarize the progress on NTHs, including the early and recent experimental measurements in atoms and molecules, as well as the relevant theoretical explorations of these harmonics. (topical review)

  2. Plutonium Assay in Soil at the BRC Threshold

    International Nuclear Information System (INIS)

    Miller, T.

    2003-01-01

    The Atomic Weapons Establishment (AWE) at Aldermaston has investigated the performance of low- and high-resolution gamma-ray detectors for plutonium (Pu) assay in soil at the UK Below Regulatory Concern (BRC) threshold (0.4 Bq/g above the natural background activity level). The goal was a rapid and economical technique for sorting large volumes of lightly contaminated soils into fractions above and below the BRC threshold. The strategy involved utilizing the relatively high-yield 60 keV emission from Am-241 ingrowth (the Pu-241 daughter) and known isotopic ratios. This paper covers the determination of detector response factors for an Am-241 source positioned at various locations within a circular tray of soil. These factors were weighted, according to the relative volumes that they represent, in order to derive a uniform response factor and quantify the systematic error for non-uniform activity distributions. Detection limits and random errors were also derived from the counting data. The high-resolution detector was shown to have the best detection levels and the lowest systematic and random errors. However, uncertainties for non-uniform distributions of contamination were relatively large. Hence, analyzing soils at the BRC threshold would only be feasible if contamination were well distributed throughout the soil sample being monitored. Fortunately, contaminated land at AWE is generally homogeneous and so the technique has wide applicability.
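
    The weighting scheme described, deriving a single uniform response factor from point-source measurements at several positions plus a systematic-error band for non-uniform distributions, amounts to a volume-weighted average. A minimal sketch under assumed, purely illustrative response factors and ring volumes (not AWE data).

        # Hypothetical response factors (counts/s per Bq) measured with an Am-241
        # point source at several radial positions in the soil tray, and the
        # relative volume of the annular ring each position represents.
        response = [0.85, 0.78, 0.64, 0.51]     # illustrative values
        rel_volume = [0.05, 0.20, 0.35, 0.40]   # must sum to 1

        uniform_rf = sum(r * v for r, v in zip(response, rel_volume))

        # Bound the systematic error if all activity sat at the most / least
        # sensitive position instead of being uniformly distributed.
        sys_low = min(response) / uniform_rf - 1.0
        sys_high = max(response) / uniform_rf - 1.0

        print(f"uniform response factor = {uniform_rf:.3f}")
        print(f"systematic error for non-uniform activity: "
              f"{sys_low:+.0%} to {sys_high:+.0%}")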

  3. Image thresholding in the high resolution target movement monitor

    Science.gov (United States)

    Moss, Randy H.; Watkins, Steve E.; Jones, Tristan H.; Apel, Derek B.; Bairineni, Deepti

    2009-03-01

    Image thresholding in the High Resolution Target Movement Monitor (HRTMM) is examined. The HRTMM was developed at the Missouri University of Science and Technology to detect and measure wall movements in underground mines to help reduce fatality and injury rates. The system detects the movement of a target with sub-millimeter accuracy based on the images of one or more laser dots projected on the target and viewed by a high-resolution camera. The relative position of the centroid of the laser dot (determined by software using thresholding concepts) in the images is the key factor in detecting the target movement. Prior versions of the HRTMM set the image threshold based on a manual, visual examination of the images. This work systematically examines the effect of varying threshold on the calculated centroid position and describes an algorithm for determining a threshold setting. First, the thresholding effects on the centroid position are determined for a stationary target. Plots of the centroid positions as a function of varying thresholds are obtained to identify clusters of thresholds for which the centroid position does not change for stationary targets. Second, the target is moved away from the camera in sub-millimeter increments and several images are obtained at each position and analyzed as a function of centroid position, target movement and varying threshold values. With this approach, the HRTMM can accommodate images in batch mode without the need for manual intervention. The capability for the HRTMM to provide automated, continuous monitoring of wall movement is enhanced.
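
    The key computation described, the centroid of the laser dot after thresholding and the search for a threshold range over which that centroid is stable, can be sketched in a few lines. A minimal sketch using numpy; the plateau-detection details (candidate grid, tolerance) are assumptions for illustration, not the HRTMM's actual settings.

        import numpy as np

        def dot_centroid(image, threshold):
            """Intensity-weighted centroid of pixels at or above the threshold."""
            mask = image >= threshold
            if not mask.any():
                return None
            ys, xs = np.nonzero(mask)
            w = image[mask].astype(float)
            return (np.average(xs, weights=w), np.average(ys, weights=w))

        def stable_threshold(image, candidates, tol=0.1):
            """Pick a threshold inside a plateau where the centroid stops moving."""
            prev, chosen = None, None
            for t in candidates:
                c = dot_centroid(image, t)
                if c is None:
                    break
                if prev is not None and np.hypot(c[0] - prev[0], c[1] - prev[1]) < tol:
                    chosen = t
                prev = c
            return chosen

        # Synthetic example: a Gaussian "laser dot" on a noisy background.
        yy, xx = np.mgrid[0:200, 0:200]
        img = 200 * np.exp(-((xx - 120.3)**2 + (yy - 80.7)**2) / 50) \
              + np.random.rand(200, 200) * 10
        t = stable_threshold(img, candidates=range(20, 180, 10))
        if t is not None:
            print("chosen threshold:", t, "centroid:", dot_centroid(img, t))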

  4. Radioactive sources and contaminated materials in scrap: monitoring, detection and remedial actions

    International Nuclear Information System (INIS)

    Gallini, R.; Berna, V.; Bonora, A.; Santini, M.

    1999-01-01

    Scrap recycling in steel and other metal mills is one of the most significant activities in the Province of Brescia (Lombardy, Italy). In our province more than 20 million tonnes of metal scrap are recycled every year by melting. Since 1990, many accidents have been caused by the unintended melting of radioactive sources that were probably hidden in metal scrap. In 1993, the Italian Government issued directives to monitor metal scrap imported from non-EC countries because of suspected illegal traffic of radioactive materials. In 1996, a law imposed the control of all metal scrap, regardless of its origin. Since 1993, our staff have inspected thousands of railway wagons and trucks. Approximately a hundred steel mills and foundries of aluminium, copper, brass, etc. have also been inspected and many samples have been collected (flue dust, slag, finished products). During these controls, contaminated areas were found in two warehouses (Cs 137), in 6 companies (Cs 137 and Am 241), in two landfills of industrial waste (Cs 137) and in a quarry (Cs 137). Up to now the contaminated areas have been cleaned, except for the last one. About 150 radioactive sources or contaminated materials have been found in metal scrap. We found radioactive sources of Co 60, Ra 226, Ir 192, Kr 85 and Am 241, while the contamination of metals was mainly due to Ra 226. The situation described above justifies accurate control of scrap in order to reduce the risk of contamination of workers in the working areas, of the environment and of the general public. (author)

  5. Determination of the detection limit and decision threshold for ionizing radiation measurements. Part 3: Fundamentals and application to counting measurements by high resolution gamma spectrometry, without the influence of sample treatment

    International Nuclear Information System (INIS)

    2000-01-01

    This part of ISO 11929 addresses the field of ionizing radiation measurements in which events (in particular pulses) are counted by high-resolution gamma spectrometry registering a pulse-height distribution (acquisition of a multichannel spectrum), for example on samples. It considers exclusively the random character of radioactive decay and of pulse counting and ignores all other influences (e.g. arising from sample treatment, weighing, enrichment or the instability of the test setup). It assumes that the separation between neighbouring gamma-line peaks is not smaller than four times the full width at half maximum (FWHM) of the gamma line, and that the background near the gamma line is nearly a straight line. Otherwise ISO 11929-1 or ISO 11929-2 should be used. ISO 11929 consists of the following parts, under the general title Determination of the detection limit and decision threshold for ionizing radiation measurements: Part 1: Fundamentals and application to counting measurements without the influence of sample treatment; Part 2: Fundamentals and application to counting measurements with the influence of sample treatment; Part 3: Fundamentals and application to counting measurements by high resolution gamma spectrometry, without the influence of sample treatment; Part 4: Fundamentals and application to measurements by use of linear scale analogue ratemeters, without the influence of sample treatment. This part of ISO 11929 was prepared in parallel with other International Standards prepared by WG2 (now WG 17): ISO 11932:1996, Activity measurements of solid materials considered for recycling, re-use or disposal as non-radioactive waste, and ISO 11929-1, ISO 11929-2 and ISO 11929-4, and is, consequently, complementary to these documents.
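
    For orientation only, the decision threshold and detection limit of the simple gross/background counting case (the Part 1 situation, not the peak-on-continuum formulae treated in this Part 3) can be written down directly. A minimal sketch assuming quantiles k_(1-α) = k_(1-β) = 1.645, pure Poisson counting statistics and no calibration-factor uncertainty; the numerical inputs are illustrative.

        import math

        def decision_threshold(n0, t0, tg, k_alpha=1.645):
            """Decision threshold (net count rate) for gross counting time tg and
            n0 background counts in time t0, assuming a true net rate of zero."""
            r0 = n0 / t0
            u0 = math.sqrt(r0 / tg + r0 / t0)   # std. dev. of net rate if no activity
            return k_alpha * u0

        def detection_limit(n0, t0, tg, k_alpha=1.645, k_beta=1.645, iters=50):
            """Smallest true net rate reliably detected; fixed-point iteration."""
            r0 = n0 / t0
            y_star = decision_threshold(n0, t0, tg, k_alpha)
            y = y_star
            for _ in range(iters):
                u_y = math.sqrt((y + r0) / tg + r0 / t0)
                y = y_star + k_beta * u_y
            return y

        # Illustrative numbers: 3600 background counts in 3600 s, 600 s sample count.
        print("decision threshold y* =", round(decision_threshold(3600, 3600, 600), 4), "cps")
        print("detection limit    y# =", round(detection_limit(3600, 3600, 600), 4), "cps")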

  6. On the Nature of Off-limb Flare Continuum Sources Detected by SDO /HMI

    Energy Technology Data Exchange (ETDEWEB)

    Heinzel, P.; Kašparová, J. [Astronomical Institute, Czech Academy of Sciences, 25165 Ondřejov (Czech Republic); Kleint, L.; Krucker, S., E-mail: pheinzel@asu.cas.cz [University of Applied Sciences and Arts Northwestern Switzerland, Bahnhofstrasse 6, 5210 Windisch (Switzerland)

    2017-09-20

    The Helioseismic and Magnetic Imager on board the Solar Dynamics Observatory has provided unique observations of off-limb flare emission. White-light continuum enhancements were detected in the “continuum” channel of the Fe 6173 Å line during the impulsive phase of the observed flares. In this paper we aim to determine which radiation mechanism is responsible for such enhancement being seen above the limb, at chromospheric heights around or below 1000 km. Using a simple analytical approach, we compare two candidate mechanisms, the hydrogen recombination continuum (Paschen) and the Thomson continuum due to scattering of disk radiation on flare electrons. Both mechanisms depend on the electron density, which is typically enhanced during the impulsive phase of a flare as the result of collisional ionization (both thermal and also non-thermal due to electron beams). We conclude that for electron densities higher than 10{sup 12} cm{sup −3}, the Paschen recombination continuum significantly dominates the Thomson scattering continuum and there is some contribution from the hydrogen free–free emission. This is further supported by detailed radiation-hydrodynamical (RHD) simulations of the flare chromosphere heated by the electron beams. We use the RHD code FLARIX to compute the temporal evolution of the flare-heating in a semi-circular loop. The synthesized continuum structure above the limb resembles the off-limb flare structures detected by HMI, namely their height above the limb, as well as the radiation intensity. These results are consistent with recent findings related to hydrogen Balmer continuum enhancements, which were clearly detected in disk flares by the IRIS near-ultraviolet spectrometer.

  7. Design of equipment for the detection of nickel-63 source of the gc-2010 electron capture detector

    International Nuclear Information System (INIS)

    Lopez O, A.

    2016-01-01

    A system was required that is able to detect the presence of the nickel-63 radioactive source located within the GC-2010 electron capture detector, using Arduino technology together with presence and motion sensors. This system in turn must be interfaced with a tracking and tracing system, so that together they can be mounted inside the mobile laboratory for research and technological development of systems and equipment for measuring radiation, gases and particles. The program has a structure of reading the sensors, processing the acquired data and executing an action if necessary. The system receives data autonomously; the data are processed and, at the end, it determines whether the source is in normal operating conditions, whether it is subject to movements that may indicate undesired operation or handling, or whether it has been removed. (Author)

  8. Search Strategy of Detector Position For Neutron Source Multiplication Method by Using Detected-Neutron Multiplication Factor

    International Nuclear Information System (INIS)

    Endo, Tomohiro

    2011-01-01

    In this paper, an alternative definition of a neutron multiplication factor, the detected-neutron multiplication factor kdet, is introduced for the neutron source multiplication method (NSM). Using kdet, a search strategy for an appropriate detector position for NSM is also proposed. The NSM is one of the practical subcriticality measurement techniques, i.e., it does not require any special equipment other than a stationary external neutron source and an ordinary neutron detector. Additionally, the NSM is based on steady-state analysis, so the technique is very suitable for quasi-real-time measurement. The correction factors play important roles in accurately estimating subcriticality from the measured neutron count rates. The present paper aims to clarify how to correct the subcriticality measured by the NSM, the physical meaning of the correction factors, and how to reduce the impact of the correction factors by setting a neutron detector at an appropriate position.
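
    The idealized relation behind source multiplication (point-kinetics, stationary source, single detector) is worth stating, since the correction factors discussed above quantify departures from it; the notation below is a textbook-style sketch and is not the paper's own definition of kdet:

        C \;\propto\; \frac{\varepsilon\, S}{1 - k_{\mathrm{eff}}},
        \qquad
        \frac{1 - k_{\mathrm{eff}}}{1 - k_{\mathrm{eff,ref}}} \;=\; \frac{C_{\mathrm{ref}}}{C},

    where C is the measured count rate, ε the detector efficiency and S the external source strength; the second relation assumes a reference state of known subcriticality measured with the same source and detector. Spatial and spectral deviations from this simple proportionality are precisely what the correction factors, and the choice of detector position, must absorb.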

  9. Prospects of detection of the first sources with SKA using matched filters

    Science.gov (United States)

    Ghara, Raghunath; Choudhury, T. Roy; Datta, Kanan K.; Mellema, Garrelt; Choudhuri, Samir; Majumdar, Suman; Giri, Sambit K.

    2018-05-01

    The matched filtering technique is an efficient method to detect H ii bubbles and absorption regions in radio interferometric observations of the redshifted 21-cm signal from the epoch of reionization and the Cosmic Dawn. Here, we present an implementation of this technique for upcoming observations, such as those with SKA1-low, for a blind search of absorption regions at the Cosmic Dawn. The pipeline explores a four-dimensional parameter space on simulated mock visibilities using an MCMC algorithm. The framework is able to efficiently determine the positions and sizes of the absorption/H ii regions in the field of view.
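
    The core of the matched-filtering step can be illustrated in image space: correlate the data with a template of the expected region and quote the filter output in units of its noise. A minimal sketch assuming a 2-D image, a circular top-hat template and white noise; the actual pipeline works on visibilities and explores the template parameters with MCMC, none of which is reproduced here. A strongly negative output at the true position signals an absorption region.

        import numpy as np

        def tophat_template(shape, x0, y0, radius):
            yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
            return ((xx - x0)**2 + (yy - y0)**2 <= radius**2).astype(float)

        def matched_filter_snr(image, template, sigma_noise):
            """Matched-filter estimate of signal-to-noise for a known template."""
            t = template - template.mean()           # remove the flat mode
            norm = np.sqrt((t**2).sum()) * sigma_noise
            return (image * t).sum() / norm

        rng = np.random.default_rng(0)
        sigma = 1.0
        img = rng.normal(0.0, sigma, (128, 128))
        img -= 3.0 * tophat_template(img.shape, 64, 64, 10)   # absorption region

        snr_on = matched_filter_snr(img, tophat_template(img.shape, 64, 64, 10), sigma)
        snr_off = matched_filter_snr(img, tophat_template(img.shape, 20, 20, 10), sigma)
        print(f"filter output at true position: {snr_on:.1f}, at wrong position: {snr_off:.1f}")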

  10. Use of FDG-PET to detect a chronic odontogenic infection as a possible source of the brain abscess.

    Science.gov (United States)

    Sato, Jun; Kuroshima, Takeshi; Wada, Mayumi; Satoh, Akira; Watanabe, Shiro; Okamoto, Shozo; Shiga, Tohru; Tamaki, Nagara; Kitagawa, Yoshimasa

    2016-05-01

    This study describes the use of (18)F-fluoro-2-deoxyglucose positron emission tomography (FDG-PET) to detect a chronic odontogenic infection as the possible origin of a brain abscess (BA). A 74-year-old man with esophageal carcinoma was referred to our department to determine whether the origin of a BA lay in his oral cavity. He had no acute odontogenic infections. The BA was drained, and bacteria of the Staphylococcus milleri group were detected. Whole-body FDG-PET revealed that the only sites of definite uptake of FDG were the esophageal carcinoma and the left upper maxillary region (SUVmax: 4.5). These findings suggested that the BA may have originated from a chronic periodontal infection. Six teeth with progressive chronic periodontal disease were extracted to remove the possible source of the BA. These findings excluded the possibility of direct spread of bacteria from the odontogenic infectious lesion to the intracranial cavity. After extraction, there was no relapse of the BA.

  11. Experimental optimisation of a moderated 252Cf source for land mine detection

    CERN Document Server

    Zuin, L; Fabris, D; Lunardon, M; Nebbia, G; Viesti, G; Cinausero, M; Fioretto, E; Prete, G; Palomba, M; Pantaleo, A

    2000-01-01

    A moderated 252Cf source to be used for the detection of buried land mines has been experimentally characterised. The moderating structure was obtained by using an inner sphere of Pb embedded in a high-density polyethylene (HDPe) brick assembly. The number of capture events in a Cd sample placed outside the structure was used, by varying the dimensions of the brick assembly, to determine the best moderator geometry. Furthermore, the detection of land mines by neutron capture reactions on nitrogen nuclei contained in the explosive was simulated by in-field measurements placing the Cd sample at different depths in the soil. The obtained results are compared with predictions from Monte Carlo calculations.

  12. Tianma 65-m telescope detection of new OH maser features towards the water fountain source IRAS 18286-0959

    Science.gov (United States)

    Chen, Xi; Shen, Zhi-Qiang; Li, Xiao-Qiong; Yang, Kai; Nakashima, Jun-ichi; Wu, Ya-Jun; Zhao, Rong-Bin; Li, Juan; Wang, Jun-Zhi; Jiang, Dong-Rong; Wang, Jin-Qing; Li, Bin; Zhong, Wei-Ye; Yung, Bosco H. K.

    2017-07-01

    We report the results of an OH maser observation towards the water fountain source IRAS 18286-0959 using the newly built Shanghai Tianma 65-m Radio Telescope. We observed the three OH ground-state transition lines at frequencies of 1612, 1665 and 1667 MHz. Compared with the spectra of previous observations, we find new maser spectral components at velocity channels largely shifted from the systemic velocity: the velocity offsets of the newly found components lie in the range 20-40 km s-1 with respect to the systemic velocity. Besides maser variability, another possible interpretation of the newly detected maser features is that part of the molecular gas in the circumstellar envelope has been accelerated. The acceleration is probably caused by the passage of a high-velocity molecular jet, which was detected in previous Very Long Baseline Interferometry observations in the H2O maser line.

  13. Intermediate structure and threshold phenomena

    International Nuclear Information System (INIS)

    Hategan, Cornel

    2004-01-01

    The Intermediate Structure, evidenced through microstructures of the neutron strength function, is reflected in open reaction channels as fluctuations in the excitation functions of nuclear threshold effects. The intermediate state supporting both the neutron strength function microstructure and the nuclear threshold effect is a micro-giant neutron threshold state. (author)

  14. Flash sourcing, or rapid detection and characterization of earthquake effects through website traffic analysis

    Directory of Open Access Journals (Sweden)

    Laurent Frobert

    2011-06-01

    Full Text Available

    This study presents the latest developments of an approach called 'flash sourcing', which provides information on the effects of an earthquake within minutes of its occurrence. The information is derived from an analysis of the traffic surges on the European–Mediterranean Seismological Centre website after felt earthquakes. These surges are caused by eyewitnesses to a felt earthquake, who are the first to be informed of, and hence the first concerned by, an earthquake occurrence. Flash sourcing maps the felt area and, at least in some circumstances, the regions affected by severe damage or network disruption. We illustrate how the flash-sourced information improves and speeds up the delivery of public earthquake information, and, beyond seismology, we consider what it can teach us about public responses when experiencing an earthquake. Future developments should improve the description of the earthquake effects and potentially contribute to improving the efficiency of earthquake responses by filling the information gap after the occurrence of an earthquake.

  15. Open-source algorithm for detecting sea ice surface features in high-resolution optical imagery

    Directory of Open Access Journals (Sweden)

    N. C. Wright

    2018-04-01

    Full Text Available Snow, ice, and melt ponds cover the surface of the Arctic Ocean in fractions that change throughout the seasons. These surfaces control albedo and exert tremendous influence over the energy balance in the Arctic. Increasingly available meter- to decimeter-scale resolution optical imagery captures the evolution of the ice and ocean surface state visually, but methods for quantifying coverage of key surface types from raw imagery are not yet well established. Here we present an open-source system designed to provide a standardized, automated, and reproducible technique for processing optical imagery of sea ice. The method classifies surface coverage into three main categories: snow and bare ice, melt ponds and submerged ice, and open water. The method is demonstrated on imagery from four sensor platforms and on imagery spanning from spring thaw to fall freeze-up. Tests show the classification accuracy of this method typically exceeds 96 %. To facilitate scientific use, we evaluate the minimum observation area required for reporting a representative sample of surface coverage. We provide an open-source distribution of this algorithm and associated training datasets and suggest the community consider this a step towards standardizing optical sea ice imagery processing. We hope to encourage future collaborative efforts to improve the code base and to analyze large datasets of optical sea ice imagery.
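
    As a toy illustration of the three-category surface classification described here, one can threshold simple per-pixel features (brightness and blueness); the real algorithm is an open-source, trained classifier, so the features and thresholds below are placeholder assumptions only.

        import numpy as np

        def classify_surface(rgb):
            """Toy 3-class map: 0 = snow/bare ice, 1 = melt pond/submerged ice,
            2 = open water. rgb is an (H, W, 3) float array in [0, 1]."""
            brightness = rgb.mean(axis=2)
            blueness = rgb[..., 2] - rgb[..., 0]          # blue excess over red
            labels = np.full(brightness.shape, 2, dtype=np.uint8)   # default: water
            labels[(brightness > 0.35) & (blueness > 0.08)] = 1     # ponds
            labels[brightness > 0.6] = 0                             # snow / bare ice
            return labels

        def area_fractions(labels):
            counts = np.bincount(labels.ravel(), minlength=3)
            return counts / counts.sum()

        # Synthetic 2x2 scene: bright snow, blue pond, dark water, bright ice.
        scene = np.array([[[0.9, 0.9, 0.9], [0.4, 0.5, 0.7]],
                          [[0.05, 0.1, 0.15], [0.7, 0.75, 0.8]]])
        print(area_fractions(classify_surface(scene)))

    Reporting per-class area fractions over a sufficiently large footprint is the quantity of interest; the paper's evaluation of a minimum observation area addresses exactly how large that footprint must be to be representative.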

  16. Lead chromate detected as a source of atmospheric Pb and Cr (VI) pollution

    Science.gov (United States)

    Lee, Pyeong-Koo; Yu, Soonyoung; Chang, Hye Jung; Cho, Hye Young; Kang, Min-Ju; Chae, Byung-Gon

    2016-10-01

    Spherical black carbon aggregates were frequently observed in dust dry deposition in Daejeon, Korea. They were tens of micrometers in diameter and presented a mixture of black carbon and several mineral phases. Transmission electron microscopy (TEM) observations with energy-dispersive X-ray spectroscopy (EDS) and selected area diffraction pattern (SADP) analyses confirmed that the aggregates were compact and included significant amounts of lead chromate (PbCrO4). The compositions and morphologies of the nanosized lead chromate particles suggest that they probably originated from traffic paint used in roads and were combined as discrete minerals with black carbon. Based on Pb isotope analysis and air-mass backward trajectories, the dust in Daejeon received a considerable input of anthropogenic pollutants from heavily industrialized Chinese cities, which implies that long-range transported aerosols containing PbCrO4 were a possible source of the lead and hexavalent chromium levels in East Asia. Lead chromate should be considered to be a source of global atmospheric Pb and Cr(VI) pollution, especially given its toxicity.

  17. Lead chromate detected as a source of atmospheric Pb and Cr (VI) pollution.

    Science.gov (United States)

    Lee, Pyeong-Koo; Yu, Soonyoung; Chang, Hye Jung; Cho, Hye Young; Kang, Min-Ju; Chae, Byung-Gon

    2016-10-25

    Spherical black carbon aggregates were frequently observed in dust dry deposition in Daejeon, Korea. They were tens of micrometers in diameter and presented a mixture of black carbon and several mineral phases. Transmission electron microscopy (TEM) observations with energy-dispersive X-ray spectroscopy (EDS) and selected area diffraction pattern (SADP) analyses confirmed that the aggregates were compact and included significant amounts of lead chromate (PbCrO4). The compositions and morphologies of the nanosized lead chromate particles suggest that they probably originated from traffic paint used in roads and were combined as discrete minerals with black carbon. Based on Pb isotope analysis and air-mass backward trajectories, the dust in Daejeon received a considerable input of anthropogenic pollutants from heavily industrialized Chinese cities, which implies that long-range transported aerosols containing PbCrO4 were a possible source of the lead and hexavalent chromium levels in East Asia. Lead chromate should be considered to be a source of global atmospheric Pb and Cr(VI) pollution, especially given its toxicity.

  18. Open-source algorithm for detecting sea ice surface features in high-resolution optical imagery

    Science.gov (United States)

    Wright, Nicholas C.; Polashenski, Chris M.

    2018-04-01

    Snow, ice, and melt ponds cover the surface of the Arctic Ocean in fractions that change throughout the seasons. These surfaces control albedo and exert tremendous influence over the energy balance in the Arctic. Increasingly available meter- to decimeter-scale resolution optical imagery captures the evolution of the ice and ocean surface state visually, but methods for quantifying coverage of key surface types from raw imagery are not yet well established. Here we present an open-source system designed to provide a standardized, automated, and reproducible technique for processing optical imagery of sea ice. The method classifies surface coverage into three main categories: snow and bare ice, melt ponds and submerged ice, and open water. The method is demonstrated on imagery from four sensor platforms and on imagery spanning from spring thaw to fall freeze-up. Tests show the classification accuracy of this method typically exceeds 96 %. To facilitate scientific use, we evaluate the minimum observation area required for reporting a representative sample of surface coverage. We provide an open-source distribution of this algorithm and associated training datasets and suggest the community consider this a step towards standardizing optical sea ice imagery processing. We hope to encourage future collaborative efforts to improve the code base and to analyze large datasets of optical sea ice imagery.

  19. Detection of Traveling Ionospheric Disturbances (TIDs) from various man-made sources using Global Navigation Satellite System (GNSS)

    Science.gov (United States)

    Helmboldt, J.; Park, J.; von Frese, R. R. B.; Grejner-Brzezinska, D. A.

    2016-12-01

    Traveling ionospheric disturbances (TIDs) are generated by various sources and are detectable by observing the spatial and temporal changes of electron content in the ionosphere. This study focused on detecting and analyzing TIDs generated by acoustic-gravity waves from man-made events including underground nuclear explosions (UNEs), mine collapses, mine blasts, and large chemical explosions (LCEs) using the Global Navigation Satellite System (GNSS). For case studies we selected different types of events, covering two US and three North Korean UNEs, two large US mine collapses, three large US mine blasts, an LCE in northern China and a second LCE at the Nevada Test Site. In most cases, we successfully detected the TIDs as array signatures from multiple nearby GNSS stations. The array-based TID signatures from these studies were found to yield event-appropriate TID propagation speeds ranging from a few hundred m/s to roughly a km/s. In addition, the event TID waveforms, propagation angles and directions were established. The TID waveforms, together with the maximum angle between each event and the ionospheric pierce point (IPP) of its TID with the longest travel distance from the source, may help differentiate UNEs and LCEs, but the uneven distributions of the observing GNSS stations complicate these results. Thus, further analysis is required of the utility of the apertures of event signatures in the ionosphere for discriminating these events. In general, the results of this study show the potential utility of GNSS observations for detecting and mapping the ionospheric signatures of large-energy anthropogenic explosions and subsurface collapses.

  20. Coloring geographical threshold graphs

    Energy Technology Data Exchange (ETDEWEB)

    Bradonjic, Milan [Los Alamos National Laboratory; Percus, Allon [Los Alamos National Laboratory; Muller, Tobias [EINDHOVEN UNIV. OF TECH

    2008-01-01

    We propose a coloring algorithm for sparse random graphs generated by the geographical threshold graph (GTG) model, a generalization of random geometric graphs (RGG). In a GTG, nodes are distributed in a Euclidean space, and edges are assigned according to a threshold function involving the distance between nodes as well as randomly chosen node weights. The motivation for analyzing this model is that many real networks (e.g., wireless networks, the Internet, etc.) need to be studied by using a 'richer' stochastic model (which in this case includes both a distance between nodes and weights on the nodes). Here, we analyze the GTG coloring algorithm together with the graph's clique number, showing formally that in spite of the differences in structure between GTG and RGG, the asymptotic behavior of the chromatic number is identical: χ = (1 + o(1)) ln n / ln ln n. Finally, we consider the leading corrections to this expression, again using the coloring algorithm and clique number to provide bounds on the chromatic number. We show that the gap between the lower and upper bound is within C ln n / (ln ln n)², and specify the constant C.
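
    To make the model concrete, a small sketch that samples a geographical threshold graph and colors it greedily is given below; the particular connection rule (additive weights with an inverse-power distance kernel) is one common GTG variant and is an assumption here, not necessarily the exact rule analyzed in the paper.

        import random
        import math

        def sample_gtg(n, theta, alpha=2.0, seed=1):
            """Nodes get uniform positions in the unit square and exponential
            weights; i~j iff (w_i + w_j) * d_ij**(-alpha) >= theta."""
            rng = random.Random(seed)
            pos = [(rng.random(), rng.random()) for _ in range(n)]
            w = [rng.expovariate(1.0) for _ in range(n)]
            adj = {i: set() for i in range(n)}
            for i in range(n):
                for j in range(i + 1, n):
                    d = math.dist(pos[i], pos[j])
                    if d > 0 and (w[i] + w[j]) * d**(-alpha) >= theta:
                        adj[i].add(j)
                        adj[j].add(i)
            return adj

        def greedy_coloring(adj):
            """Color vertices in decreasing-degree order with the smallest free color."""
            color = {}
            for v in sorted(adj, key=lambda v: -len(adj[v])):
                used = {color[u] for u in adj[v] if u in color}
                c = 0
                while c in used:
                    c += 1
                color[v] = c
            return color

        adj = sample_gtg(n=300, theta=400.0)
        colors = greedy_coloring(adj)
        print("colors used:", max(colors.values()) + 1)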

  1. Detection of interferon alpha protein reveals differential levels and cellular sources in disease.

    Science.gov (United States)

    Rodero, Mathieu P; Decalf, Jérémie; Bondet, Vincent; Hunt, David; Rice, Gillian I; Werneke, Scott; McGlasson, Sarah L; Alyanakian, Marie-Alexandra; Bader-Meunier, Brigitte; Barnerias, Christine; Bellon, Nathalia; Belot, Alexandre; Bodemer, Christine; Briggs, Tracy A; Desguerre, Isabelle; Frémond, Marie-Louise; Hully, Marie; van den Maagdenberg, Arn M J M; Melki, Isabelle; Meyts, Isabelle; Musset, Lucile; Pelzer, Nadine; Quartier, Pierre; Terwindt, Gisela M; Wardlaw, Joanna; Wiseman, Stewart; Rieux-Laucat, Frédéric; Rose, Yoann; Neven, Bénédicte; Hertel, Christina; Hayday, Adrian; Albert, Matthew L; Rozenberg, Flore; Crow, Yanick J; Duffy, Darragh

    2017-05-01

    Type I interferons (IFNs) are essential mediators of antiviral responses. These cytokines have been implicated in the pathogenesis of autoimmunity, most notably systemic lupus erythematosus (SLE), diabetes mellitus, and dermatomyositis, as well as monogenic type I interferonopathies. Despite a fundamental role in health and disease, the direct quantification of type I IFNs has been challenging. Using single-molecule array (Simoa) digital ELISA technology, we recorded attomolar concentrations of IFNα in healthy donors, viral infection, and complex and monogenic interferonopathies. IFNα protein correlated well with functional activity and IFN-stimulated gene expression. High circulating IFNα levels were associated with increased clinical severity in SLE patients, and a study of the cellular source of IFNα protein indicated disease-specific mechanisms. Measurement of IFNα attomolar concentrations by digital ELISA will enhance our understanding of IFN biology and potentially improve the diagnosis and stratification of pathologies associated with IFN dysregulation. © 2017 Rodero et al.

  2. Detection of a very bright source close to the LMC supernova SN 1987A

    Science.gov (United States)

    Nisenson, P.; Papaliolios, C.; Karovska, M.; Noyes, R.

    1987-01-01

    High angular resolution observations of the supernova in the Large Magellanic Cloud, SN 1987A, have revealed a bright source separated from the SN by approximately 60 mas with a magnitude difference of 2.7 at 656 nm (H-alpha). Speckle imaging techniques were applied to data recorded with the CfA two-dimensional photon counting detector on the CTIO 4 m telescope on March 25 and April 2 to allow measurements in H-alpha on both nights and at 533 nm and 450 nm on the second night. The nature of this object is as yet unknown, though it is almost certainly a phenomenon related to the SN.

  3. Multichannel deconvolution and source detection using sparse representations: application to Fermi project

    International Nuclear Information System (INIS)

    Schmitt, Jeremy

    2011-01-01

    This thesis presents new methods for spherical Poisson data analysis for the Fermi mission. Fermi's main scientific objectives, the study of the diffuse galactic background and the building of the source catalog, are complicated by the weakness of the photon flux and the point spread function of the instrument. This thesis proposes a new multi-scale representation for Poisson data on the sphere, the Multi-Scale Variance Stabilizing Transform on the Sphere (MS-VSTS), consisting of the combination of a spherical multi-scale transform (wavelets, curvelets) with a variance stabilizing transform (VST). This method is applied to mono- and multichannel Poisson noise removal, missing data interpolation, background extraction and multichannel deconvolution. Finally, this thesis deals with the problem of component separation using sparse representations (template fitting). (author) [fr]
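
    The variance-stabilization idea at the heart of MS-VSTS can be shown with the classical Anscombe transform, which maps Poisson counts to approximately unit-variance Gaussian data before any multiscale processing. Using the pixel-wise Anscombe form here is an illustrative simplification: the MS-VSTS couples the VST with the spherical wavelet/curvelet scales rather than applying it pixel by pixel.

        import numpy as np

        def anscombe(counts):
            """Approximate variance-stabilizing transform for Poisson data."""
            return 2.0 * np.sqrt(np.asarray(counts, dtype=float) + 3.0 / 8.0)

        rng = np.random.default_rng(0)
        for mu in (2.0, 10.0, 50.0):
            x = rng.poisson(mu, size=200_000)
            print(f"mu={mu:5.1f}  var(counts)={x.var():6.2f}  "
                  f"var(anscombe)={anscombe(x).var():5.2f}")   # ~1 once mu is not tiny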

  4. Radial Photonic Crystal for detection of frequency and position of radiation sources.

    Science.gov (United States)

    Carbonell, J; Díaz-Rubio, A; Torrent, D; Cervera, F; Kirleis, M A; Piqué, A; Sánchez-Dehesa, J

    2012-01-01

    Based on the concept of artificially microstructured materials, i.e. metamaterials, we present here the first practical realization of a radial wave crystal. This type of device was introduced as a theoretical proposal in the field of acoustics, and can be briefly defined as a structured medium with radial symmetry, where the constitutive parameters are invariant under radial geometrical translations. Our practical demonstration is realized in the electromagnetic microwave spectrum, because of the equivalence between the wave problems in both fields. A device has been designed, fabricated and experimentally characterized. It is able to perform beam shaping of point-like wave sources, and also to sense the position and frequency of external radiators. Owing to the flexibility offered by the design concept, other possible applications are discussed.

  5. Detection of new southern SiO maser sources associated with Mira and symbiotic stars

    International Nuclear Information System (INIS)

    Allen, D.A.; Hall, P.J.; Norris, R.P.; Troup, E.R.; Wark, R.M.; Wright, A.E.

    1989-01-01

    In 1987 July the Parkes radio telescope was used to search for 43.12 GHz SiO maser emission from southern late-type stars. We report the discovery of such emission from 12 Mira-like systems, including the symbiotic star H1-36, and discuss the implications of our data for the symbiotic stars. We identify several M-type Mira variables with unusually low SiO/infrared flux ratios, but with present data are not able to discredit the correlation between the two parameters. In addition, we present line profiles for the only other known symbiotic maser, R Aqr, at unprecedented signal-to-noise ratio; these profiles show linearly polarized emission from several components of the source. (author)

  6. Metamaterials-based sensor to detect and locate nonlinear elastic sources

    Energy Technology Data Exchange (ETDEWEB)

    Gliozzi, Antonio S.; Scalerandi, Marco [Department of Applied Science and Technology, Politecnico di Torino, Corso Duca degli Abruzzi 24, 10129 Torino (Italy); Miniaci, Marco; Bosia, Federico [Department of Physics, University of Torino, Via Pietro Giuria 1, 10125 Torino (Italy); Pugno, Nicola M. [Laboratory of Bio-Inspired and Graphene Nanomechanics, Department of Civil, Environmental and Mechanical Engineering, University of Trento, Via Mesiano 77, 38123 Trento (Italy); Center for Materials and Microsystems, Fondazione Bruno Kessler, Via Sommarive 18, 38123 Povo (Trento) (Italy); School of Engineering and Materials Science, Queen Mary University of London, Mile End Road, London E1 4NS (United Kingdom)

    2015-10-19

    In recent years, acoustic metamaterials have attracted increasing scientific interest for very diverse technological applications ranging from sound abatement to ultrasonic imaging, mainly due to their ability to act as band-stop filters. At the same time, the concept of chaotic cavities has been recently proposed as an efficient tool to enhance the quality of nonlinear signal analysis, particularly in the ultrasonic/acoustic case. The goal of the present paper is to merge the two concepts in order to propose a metamaterial-based device that can be used as a natural and selective linear filter for the detection of signals resulting from the propagation of elastic waves in nonlinear materials, e.g., in the presence of damage, and as a detector for the damage itself in time reversal experiments. Numerical simulations demonstrate the feasibility of the approach and the potential of the device in providing improved signal-to-noise ratios and enhanced focusing on the defect locations.

  7. Metamaterials-based sensor to detect and locate nonlinear elastic sources

    International Nuclear Information System (INIS)

    Gliozzi, Antonio S.; Scalerandi, Marco; Miniaci, Marco; Bosia, Federico; Pugno, Nicola M.

    2015-01-01

    In recent years, acoustic metamaterials have attracted increasing scientific interest for very diverse technological applications ranging from sound abatement to ultrasonic imaging, mainly due to their ability to act as band-stop filters. At the same time, the concept of chaotic cavities has been recently proposed as an efficient tool to enhance the quality of nonlinear signal analysis, particularly in the ultrasonic/acoustic case. The goal of the present paper is to merge the two concepts in order to propose a metamaterial-based device that can be used as a natural and selective linear filter for the detection of signals resulting from the propagation of elastic waves in nonlinear materials, e.g., in the presence of damage, and as a detector for the damage itself in time reversal experiments. Numerical simulations demonstrate the feasibility of the approach and the potential of the device in providing improved signal-to-noise ratios and enhanced focusing on the defect locations

  8. High-power, photofission-inducing bremsstrahlung source for intense pulsed active detection of fissile material

    Directory of Open Access Journals (Sweden)

    J. C. Zier

    2014-06-01

    Full Text Available Intense pulsed active detection (IPAD is a promising technique for detecting fissile material to prevent the proliferation of special nuclear materials. With IPAD, fissions are induced in a brief, intense radiation burst and the resulting gamma ray or neutron signals are acquired during a short period of elevated signal-to-noise ratio. The 8 MV, 200 kA Mercury pulsed-power generator at the Naval Research Laboratory coupled to a high-power vacuum diode produces an intense 30 ns bremsstrahlung beam to study this approach. The work presented here reports on Mercury experiments designed to maximize the photofission yield in a depleted-uranium (DU object in the bremsstrahlung far field by varying the anode-cathode (AK diode gap spacing and by adding an inner-diameter-reducing insert in the outer conductor wall. An extensive suite of diagnostics was fielded to measure the bremsstrahlung beam and DU fission yield as functions of diode geometry. Delayed fission neutrons from the DU proved to be a valuable diagnostic for measuring bremsstrahlung photons above 5 MeV. The measurements are in broad agreement with particle-in-cell and Monte Carlo simulations of electron dynamics and radiation transport. These show that with increasing AK gap, electron losses to the insert and outer conductor wall increase and that the electron angles impacting the bremsstrahlung converter approach normal incidence. The diode conditions for maximum fission yield occur when the gap is large enough to produce electron angles close to normal, yet small enough to limit electron losses.

  9. High-power, photofission-inducing bremsstrahlung source for intense pulsed active detection of fissile material

    Science.gov (United States)

    Zier, J. C.; Mosher, D.; Allen, R. J.; Commisso, R. J.; Cooperstein, G.; Hinshelwood, D. D.; Jackson, S. L.; Murphy, D. P.; Ottinger, P. F.; Richardson, A. S.; Schumer, J. W.; Swanekamp, S. B.; Weber, B. V.

    2014-06-01

    Intense pulsed active detection (IPAD) is a promising technique for detecting fissile material to prevent the proliferation of special nuclear materials. With IPAD, fissions are induced in a brief, intense radiation burst and the resulting gamma ray or neutron signals are acquired during a short period of elevated signal-to-noise ratio. The 8 MV, 200 kA Mercury pulsed-power generator at the Naval Research Laboratory coupled to a high-power vacuum diode produces an intense 30 ns bremsstrahlung beam to study this approach. The work presented here reports on Mercury experiments designed to maximize the photofission yield in a depleted-uranium (DU) object in the bremsstrahlung far field by varying the anode-cathode (AK) diode gap spacing and by adding an inner-diameter-reducing insert in the outer conductor wall. An extensive suite of diagnostics was fielded to measure the bremsstrahlung beam and DU fission yield as functions of diode geometry. Delayed fission neutrons from the DU proved to be a valuable diagnostic for measuring bremsstrahlung photons above 5 MeV. The measurements are in broad agreement with particle-in-cell and Monte Carlo simulations of electron dynamics and radiation transport. These show that with increasing AK gap, electron losses to the insert and outer conductor wall increase and that the electron angles impacting the bremsstrahlung converter approach normal incidence. The diode conditions for maximum fission yield occur when the gap is large enough to produce electron angles close to normal, yet small enough to limit electron losses.

  10. Threshold quantum cryptograph based on Grover's algorithm

    International Nuclear Information System (INIS)

    Du Jianzhong; Qin Sujuan; Wen Qiaoyan; Zhu Fuchen

    2007-01-01

    We propose a threshold quantum protocol based on Grover's operator and a permutation operator acting on a single two-qubit signal. The protocol is secure because dishonest parties can extract only 2 of the 3 bits of information encoded in the operation on one two-qubit signal, while introducing an error probability of 3/8. The protocol includes a detection scheme to resist Trojan horse attacks. With probability 1/2, the detection scheme can detect a multi-qubit signal that is used to replace a single-qubit signal, while leaving every legitimate qubit invariant.

  11. A robust physiology-based source separation method for QRS detection in low amplitude fetal ECG recordings

    International Nuclear Information System (INIS)

    Vullings, R; Bergmans, J W M; Peters, C H L; Hermans, M J M; Wijn, P F F; Oei, S G

    2010-01-01

    The use of the non-invasively obtained fetal electrocardiogram (ECG) in fetal monitoring is complicated by the low signal-to-noise ratio (SNR) of ECG signals. Even after removal of the predominant interference (i.e. the maternal ECG), the SNR is generally too low for medical diagnostics, and hence additional signal processing is still required. To this end, several methods for exploiting the spatial correlation of multi-channel fetal ECG recordings from the maternal abdomen have been proposed in the literature, of which principal component analysis (PCA) and independent component analysis (ICA) are the most prominent. Both PCA and ICA, however, suffer from the drawback that they are blind source separation (BSS) techniques and as such suboptimal, in that they do not consider a priori knowledge of the abdominal electrode configuration and fetal heart activity. In this paper we propose a source separation technique that is based on the physiology of the fetal heart and on knowledge of the electrode configuration. This technique operates by calculating the spatial fetal vectorcardiogram (VCG) and approximating the VCG for several overlaid heartbeats by an ellipse. By subsequently projecting the VCG onto the long axis of this ellipse, a source signal of the fetal ECG can be obtained. To evaluate the developed technique, its performance is compared to that of both PCA and ICA and to that of augmented versions of these techniques (aPCA and aICA; PCA and ICA applied to preprocessed signals) in generating a fetal ECG source signal with enhanced SNR that can be used to detect fetal QRS complexes. The evaluation shows that the developed source separation technique performs slightly better than aPCA and aICA and outperforms PCA and ICA; its main advantage over aPCA/PCA and aICA/ICA is that it performs more robustly. This advantage renders it favorable for use in automated, real-time fetal monitoring applications.
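
    The geometric step described, fitting the beat-averaged VCG loop with an ellipse and projecting onto its long axis, is essentially a principal-axis projection. A minimal sketch of that step only, assuming the abdominal signals have already been cleaned of the maternal ECG and mapped to VCG coordinates; the channel-to-VCG mapping and all preprocessing are outside this sketch.

        import numpy as np

        def long_axis_projection(vcg):
            """vcg: (n_samples, n_dims) VCG samples from several overlaid beats.
            Returns the 1-D source signal obtained by projecting the loop onto
            the long axis of the best-fitting (least-squares) ellipse."""
            centered = vcg - vcg.mean(axis=0)
            # SVD of the point cloud: the first right-singular vector is the
            # direction of maximum variance, i.e. the ellipse's long axis.
            _, _, vt = np.linalg.svd(centered, full_matrices=False)
            long_axis = vt[0]
            return centered @ long_axis

        # Synthetic two-lead loop: an inclined ellipse plus noise.
        t = np.linspace(0, 2 * np.pi, 500)
        loop = np.c_[2.0 * np.cos(t), 0.5 * np.sin(t)] @ np.array([[0.8, -0.6],
                                                                   [0.6,  0.8]])
        loop += 0.05 * np.random.randn(*loop.shape)
        src = long_axis_projection(loop)
        print("projected signal range:", src.min().round(2), "to", src.max().round(2))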

  12. Energy based source location by using acoustic emission for damage detection in steel and composite CNG tank

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Il Sik; Han, Byeong Hee; Park, Choon Su; Yoon, Dong Jin [Center for Safety Measurements, Korea Research Institute of Standards and Science, Daejeon (Korea, Republic of)

    2015-10-15

    Acoustic emission (AE) is an effective nondestructive test that uses the transient elastic waves generated by the rapid release of energy within a material to detect any further growth or expansion of existing defects. Over the past decades, the use of compressed natural gas (CNG) as an alternative fuel for vehicles has been increasing because of environmental issues. For this reason, the importance and necessity of detecting defects in a CNG fuel tank have also come to the fore. The conventional AE method used for source location is highly affected by the wave speed in the structure, which creates problems in inspecting a composite CNG fuel tank, because the speed and dispersion characteristics of the wave differ according to the direction of the structure and the laminated layers. In this study, both the conventional AE method and the energy-based contour map method were used for source location. This new method, based on a pre-acquired database, was used to overcome the limitation of damage localization in a composite CNG fuel tank specimen consisting of a steel liner cylinder overwrapped by GFRP. From the experimental results, it is observed that the damage localization is determined with a small error at all tested points by using the energy-based contour map method, while there were a number of mis-locations or large errors at many tested points with the conventional AE method. Therefore, the energy-based contour map method used in this work is a more suitable technology for inspecting composite structures.

  13. Crossing the Petawatt threshold

    International Nuclear Information System (INIS)

    Perry, M.

    1996-01-01

    A revolutionary new laser called the Petawatt, developed by Lawrence Livermore researchers after an intensive three-year development effort, has produced more than 1,000 trillion ("peta") watts of power, a world record. By crossing the petawatt threshold, the extraordinarily powerful laser heralds a new age in laser research. Lasers that provide a petawatt of power or more in a picosecond may make it possible to achieve fusion using significantly less energy than currently envisioned, through a novel Livermore concept called "fast ignition." The petawatt laser will also enable researchers to study the fundamental properties of matter, thereby aiding the Department of Energy's Stockpile Stewardship efforts and opening entirely new physical regimes to study. The technology developed for the Petawatt has also provided several spinoff technologies, including a new approach to laser material processing.

  14. Marine controlled source electromagnetic (mCSEM) detects hydrocarbon reservoirs in the Santos Basin - Brazil

    Energy Technology Data Exchange (ETDEWEB)

    Buonora, Marco Polo Pereira; Rodrigues, Luiz Felipe [PETROBRAS, Rio de Janeiro, RJ (Brazil); Zerilli, Andrea; Labruzzo, Tiziano [WesternGeco, Houston, TX (United States)

    2008-07-01

    In recent years marine Controlled Source Electromagnetic (mCSEM) surveying has drawn the attention of an increasing number of operators due to its sensitivity for mapping resistive structures, such as hydrocarbon reservoirs beneath the ocean floor, and successful case histories have been reported. The Santos basin mCSEM survey was performed as part of a technical co-operation project between PETROBRAS and Schlumberger to assess the integration of selected deep-reading electromagnetic technologies into the full cycle of oil field exploration and development. The survey design was based on an in-depth sensitivity study, built on known reservoir parameters, such as thickness, lateral extent, overburden and resistivities derived from seismic and well data. In this context, the mCSEM data were acquired to calibrate the technology over the area's known reservoirs and quantify the resistivity anomalies associated with those reservoirs, with the expectation that new prospective locations could be found. We show that the mCSEM response of the known reservoirs yields signatures that can be clearly imaged and accurately quantified, and there are evident correlations between the mCSEM anomalies and the reservoirs. (author)

  15. Portable gamma-ray detection system for location of radioactive sources

    International Nuclear Information System (INIS)

    Worth, G.M.; Henery, C.N.; Hastings, R.D.; France, S.W.

    1976-01-01

    A portable, battery-powered gamma radiation detection system, the RBM 1100 Road Block Monitor, is described. The detector is a 12.7-cm diameter by 2.54-cm thick NaI(Tl) integral assembly, housed in a weatherproof, insulated case with high and low voltage batteries, amplifier, lower level discriminator, low power CMOS logic electronics, and a VHF FM transmitter. Alarms indicating the presence of radioactive material are generated by a logic system that periodically stores a trip level based on an average background plus an internally calculated standard deviation. The operator may select the optimum counting time, the trip level in numbers of standard deviations above the average background, the background updating mode, and the method of alarm annunciation. Alarms may be indicated locally by panel indication and/or audible tone, or remotely via a VHF FM transmitter with a maximum range of 1.6-km line-of-sight to a suitable receiver. Count rate may be read on a top panel meter or recorded on a chart recorder. A single frame camera may be attached to provide a photographic record. Battery lifetime is 100 to 300 hours, depending on alarm rate and use of the transmitter
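
    The alarm logic described above, a trip level equal to the average background plus a chosen number of standard deviations, can be sketched as follows. The count values and the sigma multiplier are illustrative and are not the RBM 1100's actual settings.

```python
# Minimal sketch of a background-plus-k-sigma trip level, as used in
# gross-count roadblock/portal monitors; counts and k are illustrative.
import numpy as np

def trip_level(background_counts, k=4.0):
    """Trip level = mean background + k * sqrt(mean), assuming Poisson counting."""
    mean_bkg = np.mean(background_counts)
    return mean_bkg + k * np.sqrt(mean_bkg)

background = np.random.poisson(lam=120, size=60)   # 60 background counting intervals
level = trip_level(background, k=4.0)

new_count = 205                                    # latest counting interval
if new_count > level:
    print(f"ALARM: {new_count} counts exceeds trip level {level:.1f}")
else:
    print(f"background: {new_count} counts, trip level {level:.1f}")
```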

  16. Initial study with Tc99m antigranulocyte antibody (MAK-47) in detection of sources of infection

    International Nuclear Information System (INIS)

    Cwikla, J.B.; Buscombe, J.R.; Hilson, A.J.; Janoki, G.A.

    1997-01-01

    Antigranulocyte antibodies (AGAB) are antibodies directed against glycoproteins on the surface of granulocytes and as such provide in vivo cell labelling. They are easily labelled with Tc99m using a one-step labelling technique. Twenty patients were studied 1 h and 4-6 h after administration of 200 MBq of Tc99m AGAB (MAK-47). Less than 0.5 mg of antibody was given to each patient. Sites of uptake outside the reticuloendothelial system were reported as showing positive accumulation. Clinical results were confirmed by microbiological and pathological examinations, clinical follow-up and autopsy. Eight patients had sites of infection confirmed by additional examination, and all were visualized by Tc99m AGAB (MAK-47). There were 4 cases of osteomyelitis and septic arthritis and 4 cases of focal intra-abdominal infection. Two patients had uptake in non-infected inflammatory arthritis, both in the knee. The remaining patients had true negative studies. The diagnostic accuracy of this study was as follows: sensitivity 100%, specificity 83%, positive predictive value (PPV) 80% and negative predictive value (NPV) 100%. The antigranulocyte antibody (MAK-47) appears to be a promising tool for detecting focal infection in bone and soft tissue, apart from physiological accumulation in some parts of the body. It should be considered that antibodies can show non-specific uptake at non-infected inflammation sites. The agent is easy to use, and no allergic reactions or HAMA (human anti-mouse antibody) responses were observed. (author)

  17. Detection of the Diversity of Cytoplasmic Male Sterility Sources in Broccoli (Brassica Oleracea var. Italica) Using Mitochondrial Markers.

    Science.gov (United States)

    Shu, Jinshuai; Liu, Yumei; Li, Zhansheng; Zhang, Lili; Fang, Zhiyuan; Yang, Limei; Zhuang, Mu; Zhang, Yangyong; Lv, Honghao

    2016-01-01

    Broccoli (Brassica oleracea var. italica) is an important commercial vegetable crop. As part of an efficient pollination system, cytoplasmic male sterility (CMS) has been widely used for broccoli hybrid production. Identifying the original sources of CMS in broccoli accessions has become an important part of broccoli breeding. In this study, the diversity of the CMS sources of 39 broccoli accessions, including 19 CMS lines and 20 hybrids, were analyzed using mitochondrial markers. All CMS accessions contained the ogu orf138-related DNA fragment and the key genes of nap CMS, pol CMS, and tour CMS were not detected. The 39 CMS accessions were divided into five groups using six orf138-related and two simple sequence repeat markers. We observed that ogu CMS R3 constituted 79.49% of the CMS sources. CMS6 and CMS26 were differentiated from the other accessions using a specific primer. CMS32 was distinguished from the other accessions based on a 78-nucleotide deletion at the same locus as the orf138-related sequence. When the coefficient was about 0.90, five CMS accessions (13CMS6, 13CMS23, 13CMS24, 13CMS37, and 13CMS39) exhibiting abnormal floral organs with poor seed setting were grouped together. The polymerase chain reaction amplification profiles for these five accessions differed from those of the other accessions. We identified eight useful molecular markers that can be used to detect CMS types during broccoli breeding. Our data also provide important information relevant to future studies on the possible origins and molecular mechanisms of CMS in broccoli.

  18. Greenhouse Gas Source Detection and Attribution in the San Francisco Bay Area of California Using a Mobile Measurement Platform

    Science.gov (United States)

    Martien, P. T.; Guha, A.; Newman, S.; Young, A.; Bower, J.; Perkins, I.; Randall, S.; Stevenson, E.; Hilken, H.

    2017-12-01

    The Bay Area Air Quality Management District, the San Francisco Bay Area's air quality regulatory agency, has set a goal to reduce the region's greenhouse gas (GHG) emissions 80% below 1990 levels by 2050, consistent with the State of California's climate goals. Recently, the Air District's governing board adopted a 2017 Clean Air Plan advancing the agency's vision and including actions to put the region on a path to achieving the 2050 goal while also reducing air pollution and related health impacts. The Plan includes GHG rule-making efforts, policy initiatives, local government partnerships, outreach, grants and incentives, encompassing over 250 specific implementation actions across all economic sectors to effect ambitious emission reductions in the region. To support the 2017 Plan, the Air District has built a mobile measurement platform (GHG research van) to perform targeted CH4 emissions hotspot detection and source attribution. Instruments in the van measure CH4, CO2 and N2O in ambient plumes. Coincident measurements of source tracers like isotopic methane (13C - CH4), CO and ethane (C2H6) provide the capability to distinguish between biogenic, combustion-based and fossil-based fugitive methane sources. We report observations of CH4 plumes from source-specific measurements in and around facilities including a wastewater treatment plant, a composting operation, a waste-to-energy anaerobic digestion plant and a few refineries. We performed leak surveys inside several electric utility-operated facilities including a power plant and an underground natural gas storage facility. We sampled exhaust from a roadway tunnel and computed fleet-averaged automobile-related CH4 and N2O emission factors. We used tracer-to-tracer emission ratios to create chemical signatures of emissions from each sampled source category. We compare measurement-based ratios with those used to derive the regional GHG inventory. Data from these and other sources will lead to an improved
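
    The tracer-to-tracer signatures mentioned above boil down to ratios of plume-minus-background enhancements. A minimal sketch is given below; the concentrations are invented and do not come from the Air District's measurements, and a real analysis would regress many coincident in-plume samples rather than use single values.

```python
# Illustrative tracer-to-tracer ratio from plume enhancements above background.
# All concentrations are hypothetical (ppb).
def enhancement_ratio(plume_a, bkg_a, plume_b, bkg_b):
    return (plume_a - bkg_a) / (plume_b - bkg_b)

# e.g. an ethane-to-methane ratio to help separate fossil from biogenic CH4
c2h6_ch4 = enhancement_ratio(plume_a=12.0, bkg_a=2.0,       # C2H6, ppb
                             plume_b=2600.0, bkg_b=1900.0)  # CH4, ppb
print(f"C2H6/CH4 enhancement ratio: {c2h6_ch4:.3f}")
```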

  19. Detection and characterization of lightning-based sources using continuous wavelet transform: application to audio-magnetotellurics

    Science.gov (United States)

    Larnier, H.; Sailhac, P.; Chambodut, A.

    2018-01-01

    Atmospheric electromagnetic waves created by global lightning activity contain information about electrical processes of the inner and the outer Earth. Large signal-to-noise ratio events are particularly interesting because they convey information about electromagnetic properties along their path. We introduce a new methodology to automatically detect and characterize lightning-based waves using a time-frequency decomposition obtained through the application of the continuous wavelet transform. We focus specifically on three types of sources, namely, atmospherics, slow tails and whistlers, that cover the frequency range 10 Hz to 10 kHz. Each wave has distinguishable characteristics in the time-frequency domain due to source shape and dispersion processes. Our methodology allows automatic detection of each type of event in the time-frequency decomposition thanks to their specific signature. Horizontal polarization attributes are also recovered in the time-frequency domain. This procedure is first applied to synthetic extremely low frequency time-series with different signal-to-noise ratios to test for robustness. We then apply it to real data: three stations of audio-magnetotelluric data acquired in Guadeloupe, overseas French territories. Most of the analysed atmospherics and slow tails display linear polarization, whereas analysed whistlers are elliptically polarized. The diversity of lightning activity is finally analysed in an audio-magnetotelluric data processing framework, as used in subsurface prospecting, through estimation of the impedance response functions. We show that audio-magnetotelluric processing results depend mainly on the frequency content of electromagnetic waves observed in processed time-series, with an emphasis on the difference between morning and afternoon acquisition. Our new methodology based on the time-frequency signature of lightning-induced electromagnetic waves allows automatic detection and characterization of events in audio
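
    A time-frequency decomposition of the kind used to isolate lightning transients can be sketched with a continuous wavelet transform. The synthetic signal, the PyWavelets Morlet wavelet, the scale range and the detection threshold below are all assumptions for illustration, not the authors' processing chain.

```python
# Minimal sketch: flag a transient in a synthetic time series from its
# continuous wavelet transform (CWT) power. Signal, scales and threshold
# are illustrative assumptions.
import numpy as np
import pywt

fs = 10_000.0                                   # sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)
signal = 0.1 * np.random.randn(t.size)
signal[5000:5050] += np.sin(2 * np.pi * 1500 * t[5000:5050])   # injected transient

scales = np.arange(2, 64)
coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1.0 / fs)
power = np.abs(coeffs) ** 2

col_max = power.max(axis=0)                     # strongest response at each sample
threshold = 10 * np.median(col_max)             # crude, noise-based threshold
detected = np.where(col_max > threshold)[0]
if detected.size:
    print(f"event flagged between samples {detected.min()} and {detected.max()}")
else:
    print("no event above threshold")
```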

  20. A LIMIT ON THE NUMBER OF ISOLATED NEUTRON STARS DETECTED IN THE ROSAT ALL-SKY-SURVEY BRIGHT SOURCE CATALOG

    International Nuclear Information System (INIS)

    Turner, Monica L.; Rutledge, Robert E.; Letcavage, Ryan; Shevchuk, Andrew S. H.; Fox, Derek B.

    2010-01-01

    Using new and archival observations made with the Swift satellite and other facilities, we examine 147 X-ray sources selected from the ROSAT All-Sky-Survey Bright Source Catalog (RASS/BSC) to produce a new limit on the number of isolated neutron stars (INSs) in the RASS/BSC, the most constraining such limit to date. Independent of X-ray spectrum and variability, the number of INSs is ≤48 (90% confidence). Restricting attention to soft (kT_eff < 200 eV), non-variable X-ray sources, as in a previous study, yields an all-sky limit of ≤31 INSs. In the course of our analysis, we identify five new high-quality INS candidates for targeted follow-up observations. A future all-sky X-ray survey with eROSITA, or another mission with similar capabilities, can be expected to increase the detected population of X-ray-discovered INSs from the 8-50 in the BSC, to (for a disk population) 240-1500, which will enable a more detailed study of neutron star population models.

  1. FijiWingsPolarity: An open source toolkit for semi-automated detection of cell polarity.

    Science.gov (United States)

    Dobens, Leonard L; Shipman, Anna; Axelrod, Jeffrey D

    2018-01-02

    Epithelial cells are defined by apical-basal and planar cell polarity (PCP) signaling, the latter of which establishes an orthogonal plane of polarity in the epithelial sheet. PCP signaling is required for normal cell migration, differentiation, stem cell generation and tissue repair, and defects in PCP have been associated with developmental abnormalities, neuropathologies and cancers. While the molecular mechanism of PCP is incompletely understood, the deepest insights have come from Drosophila, where PCP is manifest in hairs and bristles across the adult cuticle and organization of the ommatidia in the eye. Fly wing cells are marked by actin-rich trichome structures produced at the distal edge of each cell in the developing wing epithelium and in a mature wing the trichomes orient collectively in the distal direction. Genetic screens have identified key PCP signaling pathway components that disrupt trichome orientation, which has been measured manually in a tedious and error prone process. Here we describe a set of image processing and pattern-recognition macros that can quantify trichome arrangements in micrographs and mark these directly by color, arrow or colored arrow to indicate trichome location, length and orientation. Nearest neighbor calculations are made to exploit local differences in orientation to better and more reliably detect and highlight local defects in trichome polarity. We demonstrate the use of these tools on trichomes in adult wing preps and on actin-rich developing trichomes in pupal wing epithelia stained with phalloidin. FijiWingsPolarity is freely available and will be of interest to a broad community of fly geneticists studying the effect of gene function on PCP.

  2. 64 x 64 thresholding photodetector array for optical pattern recognition

    Science.gov (United States)

    Langenbacher, Harry; Chao, Tien-Hsin; Shaw, Timothy; Yu, Jeffrey W.

    1993-10-01

    A high performance 32 X 32 peak detector array is introduced. This detector consists of a 32 X 32 array of thresholding photo-transistor cells, manufactured with a standard MOSIS digital 2-micron CMOS process. A built-in thresholding function that is able to perform 1024 thresholding operations in parallel strongly distinguishes this chip from available CCD detectors. This high-speed detector offers response times of one to 10 milliseconds, much faster than commercially available CCD detectors operating at a TV frame rate. The parallel multiple-peak thresholding detection capability makes it particularly suitable for optical correlators and optoelectronically implemented neural networks. The principle of operation, circuit design and the performance characteristics are described. An experimental demonstration of correlation peak detection is also provided. Recently, we have also designed and built an advanced version, a 64 X 64 thresholding photodetector array chip. Experimental investigation of using this chip for pattern recognition is ongoing.

  3. Search for Spatially Extended Fermi-LAT Sources Using Two Years of Data

    Energy Technology Data Exchange (ETDEWEB)

    Lande, Joshua; Ackermann, Markus; Allafort, Alice; Ballet, Jean; Bechtol, Keith; Burnett, Toby; Cohen-Tanugi, Johann; Drlica-Wagner, Alex; Funk, Stefan; Giordano, Francesco; Grondin, Marie-Helene; Kerr, Matthew; Lemoine-Goumard, Marianne

    2012-07-13

    Spatial extension is an important characteristic for correctly associating γ-ray-emitting sources with their counterparts at other wavelengths and for obtaining an unbiased model of their spectra. We present a new method for quantifying the spatial extension of sources detected by the Large Area Telescope (LAT), the primary science instrument on the Fermi Gamma-ray Space Telescope (Fermi). We perform a series of Monte Carlo simulations to validate this tool and calculate the LAT threshold for detecting the spatial extension of sources. We then test all sources in the second Fermi-LAT catalog (2FGL) for extension. We report the detection of seven new spatially extended sources.

  4. Source-specific sewage pollution detection in urban river waters using pharmaceuticals and personal care products as molecular indicators.

    Science.gov (United States)

    Kiguchi, Osamu; Sato, Go; Kobayashi, Takashi

    2016-11-01

    Source-specific elucidation of domestic sewage pollution caused by various effluent sources in urban river water, as conducted for this study, demands knowledge of the relation between concentrations of pharmaceuticals and personal care products (PPCPs) as molecular indicators (caffeine, carbamazepine, triclosan) and water quality concentrations of total nitrogen (T-N) and total phosphorous (T-P). River water and wastewater samples from the Asahikawa River Basin in northern Japan were analyzed using derivatization-gas chromatography/mass spectrometry. Caffeine, used as an indicator of domestic sewage in the Asahikawa River Basin, was more ubiquitous than either carbamazepine or triclosan (92-100%). Its concentration was higher than that of any other target compound used to assess the basin. The caffeine concentrations detected in wastewater effluents and the strongly positive linear correlation between caffeine and T-N or T-P (R² > 0.759) reflect the contribution of septic tank system effluents to the lower Asahikawa River Basin. Results of relative molecular indicators combining different molecular indicators (caffeine/carbamazepine and triclosan/carbamazepine) with cluster analysis better reflect the contribution of sewage than results obtained using the concentrations of the respective molecular indicators and cluster analysis. Relative molecular indicators used with water quality parameters (e.g., the caffeine/T-N ratio) in this study provide results more clearly, relatively, and quantitatively than results obtained using molecular indicators alone. Moreover, the caffeine/T-N ratio reflects variations of caffeine flux from effluent sources. These results strongly suggest that relative molecular indicators are also useful, reflecting differences in the spatial contributions of domestic sources of PPCPs in urban areas.

  5. Upgrade for detection system of JARREL ASH 70-000 atomic emission spectrography with source of arc - spark

    International Nuclear Information System (INIS)

    Santa Cruz D E; Grau F N; Bellavigna H J; Garavaglia R N; Fernandez R O; Servant R

    2012-01-01

    Methodologies used in spectrochemical analysis have shown important advances in recent decades thanks to new digital technologies. The simultaneous determination of multiple elements in a short time has been made possible by the development of solid-state multichannel detectors. Since its beginning, CNEA has developed several spectroscopic methodologies that have been applied to fuels and to materials specific to the nuclear field. The Analytical Chemistry Department has an atomic emission spectrograph with an arc/spark source, a focal length of 3.4 meters and a spectral dispersion of 2.5 to 5 A/mm. This equipment was originally fitted with photographic detection. Although this allowed the simultaneous detection of multiple elements, the response of the photographic plate was not linear and data treatment was very complex. Two digital detection alternatives were examined, CCD and CMOS, following the progress achieved in instrumentation applied to similar techniques. After an exhaustive evaluation, an arrangement of 9 linear CCD detectors located in the focal plane originally occupied by the 2 x 10 inch photographic plates was chosen. The software provided by the manufacturer was insufficient to cover our analytical needs, given the requirements of our instrumental application, which led us to develop our own program. Today, our detection system includes an assembly of 7 detectors and an in-house acquisition program with basic control. Calibration curves for some chemical elements have shown very promising results: the sensitivity has increased at least 10 times and an important improvement in the accuracy of the measurements has also been achieved thanks to this modification. An upgrade with an associated database, which will allow spectra to be obtained in a 3D configuration and extend the instrumental capabilities to second order, is being prepared. (author)

  6. A critical experimental study of the classical tactile threshold theory

    Directory of Open Access Journals (Sweden)

    Medina Leonel E

    2010-06-01

    Background: The tactile sense is being used in a variety of applications involving tactile human-machine interfaces. In a significant number of publications the classical threshold concept plays a central role in modelling and explaining psychophysical experimental results such as stochastic resonance (SR) phenomena. In SR, noise enhances detection of sub-threshold stimuli, and the phenomenon is explained by stating that the amplitude required to exceed the sensory threshold barrier can be reached by adding noise to a sub-threshold stimulus. We designed an experiment to test the validity of the classical vibrotactile threshold. Using a second-choice experiment, we show that individuals can order sensorial events below the level known as the classical threshold. If the observer's sensorial system is not activated by stimuli below the threshold, then a second choice could not be above the chance level. Nevertheless, our experimental results are above that chance level, contradicting the definition of the classical tactile threshold. Results: We performed a three-alternative forced-choice detection experiment on 6 subjects, asking them for first and second choices. In each trial, only one of the intervals contained a stimulus and the others contained only noise. According to the classical threshold assumptions, a correct second-choice response corresponds to a guess with a statistical frequency of 50%. Results show an average of 67.35% (STD = 1.41%) correct second-choice responses, which is not explained by the classical threshold definition. Additionally, for low stimulus amplitudes, second-choice correct detection is above chance level for any detectability level. Conclusions: Using a second-choice experiment, we show that individuals can order sensorial events below the level known as a classical threshold. If the observer's sensorial system is not activated by stimuli below the threshold, then a second choice could not be above the chance
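
    The comparison of the observed second-choice rate with the 50% guessing level can be checked with a simple binomial test. The trial counts below are invented to mimic the reported ~67% rate and are not the study's raw data; scipy.stats.binomtest is assumed to be available (recent SciPy versions).

```python
# Minimal sketch: test whether second-choice correct responses exceed the 50%
# chance level predicted by a classical threshold; trial counts are invented.
from scipy.stats import binomtest

n_trials = 300          # second-choice trials (hypothetical)
n_correct = 202         # correct second choices (hypothetical, ~67%)

result = binomtest(n_correct, n_trials, p=0.5, alternative="greater")
print(f"observed rate = {n_correct / n_trials:.3f}, p-value = {result.pvalue:.2e}")
```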

  7. Recognition Memory zROC Slopes for Items with Correct versus Incorrect Source Decisions Discriminate the Dual Process and Unequal Variance Signal Detection Models

    Science.gov (United States)

    Starns, Jeffrey J.; Rotello, Caren M.; Hautus, Michael J.

    2014-01-01

    We tested the dual process and unequal variance signal detection models by jointly modeling recognition and source confidence ratings. The 2 approaches make unique predictions for the slope of the recognition memory zROC function for items with correct versus incorrect source decisions. The standard bivariate Gaussian version of the unequal…

  8. Crossing the threshold

    Science.gov (United States)

    Bush, John; Tambasco, Lucas

    2017-11-01

    First, we summarize the circumstances in which chaotic pilot-wave dynamics gives rise to quantum-like statistical behavior. For "closed" systems, in which the droplet is confined to a finite domain either by boundaries or applied forces, quantum-like features arise when the persistence time of the waves exceeds the time required for the droplet to cross its domain. Second, motivated by the similarities between this hydrodynamic system and stochastic electrodynamics, we examine the behavior of a bouncing droplet above the Faraday threshold, where a stochastic element is introduced into the drop dynamics by virtue of its interaction with a background Faraday wave field. With a view to extending the dynamical range of pilot-wave systems to capture more quantum-like features, we consider a generalized theoretical framework for stochastic pilot-wave dynamics in which the relative magnitudes of the drop-generated pilot-wave field and a stochastic background field may be varied continuously. We gratefully acknowledge the financial support of the NSF through their CMMI and DMS divisions.

  9. Albania - Thresholds I and II

    Data.gov (United States)

    Millennium Challenge Corporation — From 2006 to 2011, the government of Albania (GOA) received two Millennium Challenge Corporation (MCC) Threshold Programs totaling $29.6 million. Albania received...

  10. Fermi Non-detections of Four X-Ray Jet Sources and Implications for the IC/CMB Mechanism

    Science.gov (United States)

    Breiding, Peter; Meyer, Eileen T.; Georganopoulos, Markos; Keenan, M. E.; DeNigris, N. S.; Hewitt, Jennifer

    2017-11-01

    Since its launch in 1999, the Chandra X-ray observatory has discovered several dozen X-ray jets associated with powerful quasars. In many cases, the X-ray spectrum is hard and appears to come from a second spectral component. The most popular explanation for the kpc-scale X-ray emission in these cases has been inverse-Compton (IC) scattering of Cosmic Microwave Background (CMB) photons by relativistic electrons in the jet (the IC/CMB model). Requiring the IC/CMB emission to reproduce the observed X-ray flux density inevitably predicts a high level of gamma-ray emission, which should be detectable with the Fermi Large Area Telescope (LAT). In previous work, we found that gamma-ray upper limits from the large-scale jets of 3C 273 and PKS 0637-752 violate the predictions of the IC/CMB model. Here, we present Fermi/LAT flux density upper limits for the X-ray jets of four additional sources: PKS 1136-135, PKS 1229-021, PKS 1354+195, and PKS 2209+080. We show that these limits violate the IC/CMB predictions at a very high significance level. We also present new Hubble Space Telescope observations of the quasar PKS 2209+080 showing a newly detected optical jet, and Atacama Large Millimeter/submillimeter Array band 3 and 6 observations of all four sources, which provide key constraints on the spectral shape that enable us to rule out the IC/CMB model.

  11. Methane Flux Estimation from Point Sources using GOSAT Target Observation: Detection Limit and Improvements with Next Generation Instruments

    Science.gov (United States)

    Kuze, A.; Suto, H.; Kataoka, F.; Shiomi, K.; Kondo, Y.; Crisp, D.; Butz, A.

    2017-12-01

    Atmospheric methane (CH4) plays an important role in global radiative forcing of climate, but its emission estimates have larger uncertainties than those of carbon dioxide (CO2). The area of anthropogenic emission sources is usually much smaller than 100 km2. The Thermal And Near infrared Sensor for carbon Observation Fourier-Transform Spectrometer (TANSO-FTS) onboard the Greenhouse gases Observing SATellite (GOSAT) has measured CO2 and CH4 column density using sunlight reflected from the Earth's surface. It has an agile pointing system and its footprint can cover 87 km2 with a single detector. By specifying pointing angles and observation times for every orbit, TANSO-FTS can target various CH4 point sources together with reference points every 3 days over years. We selected a reference point that represents the CH4 background density before or after targeting a point source. By combining the satellite-measured enhancement of the CH4 column density with surface-measured wind data or estimates from the Weather Research and Forecasting (WRF) model, we estimated CH4 emission amounts. Here, we picked two sites on the US West Coast, where clear-sky frequency is high and a series of data are available. The natural gas leak at Aliso Canyon showed a large enhancement and its decrease with time since the initial blowout. We present a time series of flux estimates assuming the source is a single point without influx. The cattle feedlot in Chino, California has a weather station within the TANSO-FTS footprint. The wind speed is monitored continuously and the wind direction is stable at the time of the GOSAT overpass. The large TANSO-FTS footprint and strong wind decrease the enhancement below the noise level. Weak wind shows enhancements in CH4, but the velocity data have large uncertainties. We show the detection limit of single samples and how to reduce uncertainty using time series of satellite data. We will propose that the next generation instruments for accurate anthropogenic CO2 and CH
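
    A crude mass-balance sketch of the kind of flux estimate described above is given below: emission rate ≈ column enhancement × air column density × wind speed × effective plume width. The enhancement, wind speed and plume width are assumed placeholder values, and a real retrieval would involve averaging kernels, background selection and transport modelling.

```python
# Crude mass-balance sketch: point-source CH4 emission rate from a column
# enhancement and surface wind, Q ~ dX * N_air * M_CH4 * U * W.
# The enhancement, wind and plume width are hypothetical placeholders.
G = 9.81                 # m s^-2
P_SURF = 101_325.0       # Pa
M_AIR = 0.02896          # kg mol^-1
M_CH4 = 0.01604          # kg mol^-1

n_air_column = P_SURF / (G * M_AIR)     # ~3.6e5 mol of air per m^2

delta_xch4 = 20e-9       # column-averaged CH4 enhancement, 20 ppb (assumed)
wind_speed = 3.0         # m s^-1 (assumed)
plume_width = 5_000.0    # effective cross-wind width, m (assumed)

flux_kg_s = delta_xch4 * n_air_column * M_CH4 * wind_speed * plume_width
print(f"emission estimate: {flux_kg_s:.2f} kg CH4 / s "
      f"({flux_kg_s * 3600:.0f} kg / h)")
```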

  12. Detection and dosimetry studies on the response of silicon diodes to an 241Am-Be source

    International Nuclear Information System (INIS)

    Lotfi, Y; Dizaji, H Zaki; Davani, F Abbasi

    2014-01-01

    Silicon diode detectors show potential for the development of an active personal dosimeter for neutron and photon radiation. Photons interact with the constituents of the diode detector and produce electrons. Fast neutrons interact with the constituents of the diode detector and converter, producing recoil nuclei and causing (n,α) and (n,p) reactions. These photon- and neutron-induced charged particles contribute to the response of diode detectors. In this work, a silicon pin diode was used as a detector to register pulses created by photons and neutrons. A polyethylene fast-neutron converter was used as a recoil proton source in front of the detector. The total registered photon and neutron efficiency and the partial contributions to the efficiency, due to interactions with the diode and converter, were calculated. The results show that the efficiency of the converter-diode is a function of the incident photon and neutron energy. The optimized thicknesses of the converter for neutron detection and neutron dosimetry were found to be 1 mm and 0.1 mm, respectively. The neutron records caused by the (n,α) and (n,p) reactions were negligible. The photon records were strongly dependent upon the energy and the depletion layer of the diode. The photon and neutron efficiency of the diode-based dosimeter was calculated with the MCNPX code, and the results were in good agreement with experimental results for photons and neutrons from a 241Am-Be source

  13. Norm based Threshold Selection for Fault Detectors

    DEFF Research Database (Denmark)

    Rank, Mike Lind; Niemann, Henrik

    1998-01-01

    The design of fault detectors for fault detection and isolation (FDI) in dynamic systems is considered from a norm-based point of view. An analysis of norm-based threshold selection is given based on different formulations of FDI problems. Both the nominal FDI problem and the uncertain FDI problem are considered. Based on this analysis, a performance index based on norms of the involved transfer functions is given. The performance index also allows us to optimize the structure of the fault detection filter directly.

  14. Detection of special nuclear material from delayed neutron emission induced by a dual-particle monoenergetic source

    Energy Technology Data Exchange (ETDEWEB)

    Mayer, M. [Department of Mechanical and Nuclear Engineering, The Pennsylvania State University, University Park, Pennsylvania 16802 (United States); Nattress, J.; Jovanovic, I., E-mail: ijov@umich.edu [Department of Nuclear Engineering and Radiological Sciences, University of Michigan, Ann Arbor, Michigan 48109 (United States)

    2016-06-27

    Detection of unique signatures of special nuclear materials is critical for their interdiction in a variety of nuclear security and nonproliferation scenarios. We report on the observation of delayed neutrons from fission of uranium induced in dual-particle active interrogation based on the 11B(d,nγ)12C nuclear reaction. The majority of fissions are attributed to fast fission induced by the incident quasi-monoenergetic neutrons. A Li-doped glass-polymer composite scintillation neutron detector, which displays excellent neutron/γ discrimination at low energies, was used in the measurements, along with a recoil-based liquid scintillation detector. Time-dependent buildup and decay of delayed neutron emission from 238U were measured between the interrogating beam pulses and after the interrogating beam was turned off, respectively. Characteristic buildup and decay time profiles were compared to the common parametrization into six delayed neutron groups, finding good agreement between the measurement and nuclear data. This method is promising for detecting fissile and fissionable materials in cargo scanning applications and can be readily integrated with transmission radiography using low-energy nuclear reaction sources.
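
    The six-group parametrization mentioned above amounts to a sum of exponentials: precursor buildup during irradiation and free decay afterwards. A minimal sketch follows; the group yields and decay constants are illustrative placeholders, not evaluated nuclear data for 238U.

```python
# Minimal sketch of the six-group delayed-neutron model: buildup during a
# square irradiation pulse and decay after it ends. The yields a and decay
# constants lam are illustrative placeholders, not evaluated nuclear data.
import numpy as np

a = np.array([0.01, 0.10, 0.12, 0.30, 0.32, 0.15])     # relative group yields
lam = np.array([0.013, 0.031, 0.12, 0.32, 1.4, 3.9])   # decay constants, 1/s

def delayed_neutron_rate(t, t_irr):
    """Emission rate (arbitrary units) at time t for a square irradiation 0..t_irr."""
    rate = 0.0
    for ai, li in zip(a, lam):
        if t <= t_irr:                      # precursor buildup while the beam is on
            inventory = ai * (1 - np.exp(-li * t))
        else:                               # free decay after the beam stops
            inventory = ai * (1 - np.exp(-li * t_irr)) * np.exp(-li * (t - t_irr))
        rate += li * inventory
    return rate

times = [1, 5, 10, 11, 15, 30]              # seconds; beam on for the first 10 s
print([round(delayed_neutron_rate(tt, t_irr=10.0), 4) for tt in times])
```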

  15. Study on the performance of infrared thermal imaging light source for detection of impact defects in CFRP composite sandwich panels

    Energy Technology Data Exchange (ETDEWEB)

    Park, Hee Sang [R and D, Korea Research Institute of Smart Material and Structures System Association, Daejeon (Korea, Republic of); Choi, Man Yong; Kwon, Koo Ahn; Park, Jeong Hak; Choi, Won Jae [Safety measurement center, Korea research Institute of Standards and Science, Daejeon (Korea, Republic of); Jung, Hyun Chul [Dept. of Mechanical Engineering Chosun University, Gwangju (Korea, Republic of)

    2017-04-15

    Recently, composite materials have mainly been used in the main wings, ailerons, and fuselages of aircraft and in the rotor blades of helicopters. Composite materials used in fast-moving structures are subject to impact by hail, lightning, and bird strike. Such an impact can destroy fiber tissue in the composite as well as deform it, resulting in various problems such as weakened rigidity of the composite structure and penetration of water into tiny cracks. In this study, experiments were conducted using a 2 kW halogen lamp, which is the most frequently used light source; a 2 kW near-infrared lamp, which is used for heating to a high temperature; and a 6 kW xenon flash lamp, which emits a large amount of energy in a brief flash. CFRP composite sandwich panels with a Nomex honeycomb core were used as the specimens. Experiments were carried out with impact damage of 1, 4 and 8 J. It was found that defect detection was fast when the xenon flash lamp was used, while the detection of damaged regions was best with the halogen lamp. Furthermore, the near-infrared lamp is effective for revealing the surface condition of a test object.

  16. MRI-detection rate and incidence of lumbar bleeding sources in 190 patients with non-aneurysmal SAH.

    Directory of Open Access Journals (Sweden)

    Sepide Kashefiolasl

    Up to 15% of all spontaneous subarachnoid hemorrhages (SAH) are non-aneurysmal SAH (NASAH). The evaluation of SAH patients with negative digital subtraction angiography (DSA) is sometimes a diagnostic challenge. Our goal in this study was to reassess the yield of standard MR-imaging of the complete spinal axis to rule out spinal bleeding sources in patients with NASAH. We retrospectively analyzed the spinal MRI findings in 190 patients with spontaneous NASAH, comprising perimesencephalic (PM) and non-perimesencephalic (NPM) SAH, diagnosed by computed tomography (CT) and/or lumbar puncture (LP), and a negative 2nd DSA. 190 NASAH patients were included in the study, divided into PM-SAH (n = 87; 46%) and NPM-SAH (n = 103; 54%). Overall, 23 (22%) patients had a CT-negative SAH, diagnosed by positive LP. MR-imaging of the spinal axis detected two patients with lumbar ependymoma (n = 2; 1.05%). Both patients complained of radicular sciatic pain. The detection rate rose to 25% if only patients with radicular sciatic pain received an MRI. Routine radiological investigation of the complete spinal axis in NASAH patients is expensive and cannot be recommended as a standard procedure. However, patients with clinical signs of low-back/sciatic pain should be worked up for a spinal pathology.

  17. Study on the performance of infrared thermal imaging light source for detection of impact defects in CFRP composite sandwich panels

    International Nuclear Information System (INIS)

    Park, Hee Sang; Choi, Man Yong; Kwon, Koo Ahn; Park, Jeong Hak; Choi, Won Jae; Jung, Hyun Chul

    2017-01-01

    Recently, composite materials have mainly been used in the main wings, ailerons, and fuselages of aircraft and in the rotor blades of helicopters. Composite materials used in fast-moving structures are subject to impact by hail, lightning, and bird strike. Such an impact can destroy fiber tissue in the composite as well as deform it, resulting in various problems such as weakened rigidity of the composite structure and penetration of water into tiny cracks. In this study, experiments were conducted using a 2 kW halogen lamp, which is the most frequently used light source; a 2 kW near-infrared lamp, which is used for heating to a high temperature; and a 6 kW xenon flash lamp, which emits a large amount of energy in a brief flash. CFRP composite sandwich panels with a Nomex honeycomb core were used as the specimens. Experiments were carried out with impact damage of 1, 4 and 8 J. It was found that defect detection was fast when the xenon flash lamp was used, while the detection of damaged regions was best with the halogen lamp. Furthermore, the near-infrared lamp is effective for revealing the surface condition of a test object.

  18. Detection of the source of hemorrhage using postmortem computerized tomographic angiography in a case of a giant juvenile nasopharyngeal angiofibroma after surgical treatment.

    Science.gov (United States)

    do Nascimento, Felipe Barjud Pereira; dos Santos, Glaucia Aparecida Bento; Melo, Nelson Almeida d'Ávila; Damasceno, Eduarda Bittencourt; Mauad, Thais

    2015-09-01

    Postmortem computerized tomographic angiography (PMCTA) has been increasingly used in forensic medicine to detect and locate the source of bleeding in cases of fatal acute hemorrhage. In this paper, we report a case of postoperative complication in a patient with a giant juvenile nasopharyngeal angiofibroma in which the source of bleeding was detected by PMCTA. A case description and evaluations of the pre- and postoperative exams, postmortem CT angiogram, and conventional autopsy results are provided. The source of bleeding was identified by postmortem CT angiography but not by conventional autopsy. The established protocol, injecting contrast medium into the femoral artery, was effective in identifying the source of bleeding. Postoperative bleeding is a rare and frequently fatal complication of juvenile nasopharyngeal angiofibroma. As a complement to conventional autopsy, postmortem angiography is a valuable tool for the detection of lethal acute hemorrhagic foci, and establishing a routine procedure for PMCTA may improve its efficiency.

  19. Using additional external inputs to forecast water quality with an artificial neural network for contamination event detection in source water

    Science.gov (United States)

    Schmidt, F.; Liu, S.

    2016-12-01

    Source water quality plays an important role for the safety of drinking water and early detection of its contamination is vital to taking appropriate countermeasures. However, compared to drinking water, it is more difficult to detect contamination events because its environment is less controlled and numerous natural causes contribute to a high variability of the background values. In this project, Artificial Neural Networks (ANNs) and a Contamination Event Detection Process (CED Process) were used to identify events in river water. The ANN models the response of basic water quality sensors obtained in laboratory experiments in an off-line learning stage and continuously forecasts future values of the time line in an on-line forecasting step. During this second stage, the CED Process compares the forecast to the measured value and classifies it as regular background or event value, which modifies the ANN's continuous learning and influences its forecasts. In addition to this basic setup, external information is fed to the CED Process: A so-called Operator Input (OI) is provided to inform about unusual water quality levels that are unrelated to the presence of contamination, for example due to cooling water discharge from a nearby power plant. This study's primary goal is to evaluate how well the OI fits into the design of the combined forecasting ANN and CED Process and to understand its effects on the online forecasting stage. To test this, data from laboratory experiments conducted previously at the School of Environment, Tsinghua University, have been used to perform simulations highlighting features and drawbacks of this method. Applying the OI has been shown to have a positive influence on the ANN's ability to handle a sudden change in background values, which is unrelated to contamination. However, it might also mask the presence of an event, an issue that underlines the necessity to have several instances of the algorithm run in parallel. Other difficulties
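
    The classification step described above, comparing a forecast against the measurement and letting an Operator Input explain otherwise suspicious deviations, can be sketched as follows. The tolerance and the example values are illustrative, and this is not the authors' ANN implementation.

```python
# Minimal sketch of the classification step: compare a forecast water-quality
# value with the measurement and flag an event unless an operator input (OI)
# explains the deviation. Thresholds and values are illustrative only.
def classify(measured, forecast, tolerance, operator_flag=False):
    residual = abs(measured - forecast)
    if residual <= tolerance:
        return "background"
    return "explained deviation (OI)" if operator_flag else "possible contamination event"

# e.g. conductivity in uS/cm, with an operator note about cooling-water discharge
print(classify(measured=512.0, forecast=455.0, tolerance=25.0, operator_flag=True))
print(classify(measured=512.0, forecast=455.0, tolerance=25.0, operator_flag=False))
```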

  20. The German system to prevent, detect and respond to illicit uses of nuclear materials and radioactive sources

    International Nuclear Information System (INIS)

    Fechner, J.B.

    2001-01-01

    The German system to prevent, detect and respond to illicit uses of nuclear materials and radioactive sources consists of a variety of different elements: International and national laws and regulations covering safeguards, physical protection, and import/export control; Licensing and regulatory supervision of all activities related to nuclear materials and radioactive sources, including import and export; Responsibility of the licensee to ensure compliance with licensing conditions; sanctions; Law enforcement by police, security and customs authorities; prosecution and penalties; Detection of illicitly trafficked radioactive materials through intelligence and technical means; analysis capabilities; Response arrangements for normal and for severe cases of illicit use of nuclear materials; Participation in international programmes and POC-systems. Safeguards measures have been implemented in Germany in accordance with the Non- Proliferation Treaty and with safeguards agreements based on INFCIRC/153. As Germany is a member of the European Union, the Euratom Treaty and the Euratom-Ordinance Nr. 3227/76 together with the Verification Agreement between the IAEA, the European Commission and the European Member States have led to safeguards measures jointly implemented by the IAEA and by Euratom. The relevant international law for the physical protection of nuclear material in force in Germany is the Convention on the Physical Protection of Nuclear Material. The recommendations on physical protection objectives and fundamentals and on physical protection measures specified in INFCIRC/225/Rev. 4 have been taken into account in various national regulations pertaining to the national design basis threat, the physical protection of LWR nuclear power plants, of interim spent fuel storage facilities, of facilities containing category III material, of nuclear material and radioactive waste transports by road or railway vehicles, aircraft or sea vessels; additional guidelines

  1. The investigation of fast neutron Threshold Activation Detectors (TAD)

    International Nuclear Information System (INIS)

    Gozani, T; King, M J; Stevenson, J

    2012-01-01

    The detection of fast neutrons is usually done with liquid hydrogenous organic scintillators, where the separation between the ever-present gamma rays and the neutrons is achieved by pulse shape discrimination (PSD). In many practical situations the detection of fast neutrons has to be carried out while the intense source (be it neutrons, gamma rays or x-rays) that creates these neutrons, for example by the fission process, is present. This source, or "flash", usually blinds the neutron detectors and temporarily incapacitates them. By the time the detectors recover, the prompt neutron signature no longer exists. Thus, to overcome the blinding background, one needs to search for processes whereby the desired signature, such as fission neutrons, can in some way be measured long after the fission occurred, when the neutron detector has fully recovered from the overload. A new approach was proposed and demonstrated good sensitivity for the detection of fast neutrons in adverse overload situations where it normally could not be done. A temporal separation of the fission event from the prompt neutron detection is achieved via the activation process. The main idea, called Threshold Activation Detection (or detector), TAD, is to find appropriate substances that can be selectively activated by the fission neutrons and not by the source radiation, and then to measure the radioactively decaying activation products (typically beta and γ-rays) well after the source pulse has ended. The activation material should possess certain properties: a suitable half-life; an energy threshold below which the numerous source neutrons will not activate it (e.g. about 3 MeV); easily detectable activation products; and a usable cross section for the selected reaction. Ideally the substance would be part of the scintillator. There are several good candidates for TAD. The first one we have selected is based on fluorine. One of the major advantages of this element is the fact that it is a major

  2. The investigation of fast neutron Threshold Activation Detectors (TAD)

    Science.gov (United States)

    Gozani, T.; King, M. J.; Stevenson, J.

    2012-02-01

    The detection of fast neutrons is usually done with liquid hydrogenous organic scintillators, where the separation between the ever-present gamma rays and the neutrons is achieved by pulse shape discrimination (PSD). In many practical situations the detection of fast neutrons has to be carried out while the intense source (be it neutrons, gamma rays or x-rays) that creates these neutrons, for example by the fission process, is present. This source, or "flash", usually blinds the neutron detectors and temporarily incapacitates them. By the time the detectors recover, the prompt neutron signature no longer exists. Thus, to overcome the blinding background, one needs to search for processes whereby the desired signature, such as fission neutrons, can in some way be measured long after the fission occurred, when the neutron detector has fully recovered from the overload. A new approach was proposed and demonstrated good sensitivity for the detection of fast neutrons in adverse overload situations where it normally could not be done. A temporal separation of the fission event from the prompt neutron detection is achieved via the activation process. The main idea, called Threshold Activation Detection (or detector), TAD, is to find appropriate substances that can be selectively activated by the fission neutrons and not by the source radiation, and then to measure the radioactively decaying activation products (typically beta and γ-rays) well after the source pulse has ended. The activation material should possess certain properties: a suitable half-life; an energy threshold below which the numerous source neutrons will not activate it (e.g. about 3 MeV); easily detectable activation products; and a usable cross section for the selected reaction. Ideally the substance would be part of the scintillator. There are several good candidates for TAD. The first one we have selected is based on fluorine. One of the major advantages of this element is the fact that it is a major

  3. Threshold Concepts and Information Literacy

    Science.gov (United States)

    Townsend, Lori; Brunetti, Korey; Hofer, Amy R.

    2011-01-01

    What do we teach when we teach information literacy in higher education? This paper describes a pedagogical approach to information literacy that helps instructors focus content around transformative learning thresholds. The threshold concept framework holds promise for librarians because it grounds the instructor in the big ideas and underlying…

  4. Multimodal distribution of human cold pain thresholds.

    Science.gov (United States)

    Lötsch, Jörn; Dimova, Violeta; Lieb, Isabel; Zimmermann, Michael; Oertel, Bruno G; Ultsch, Alfred

    2015-01-01

    It is assumed that different pain phenotypes are based on varying molecular pathomechanisms. Distinct ion channels seem to be associated with the perception of cold pain; in particular, TRPM8 and TRPA1 have been highlighted previously. The present study analyzed the distribution of cold pain thresholds with a focus on describing its multimodality, based on the hypothesis that it reflects the contribution of distinct ion channels. Cold pain thresholds (CPT) were available from 329 healthy volunteers (aged 18-37 years; 159 men) enrolled in previous studies. The distribution of the pooled and log-transformed threshold data was described using a kernel density estimation (Pareto Density Estimation, PDE) and, subsequently, the log data were modeled as a mixture of Gaussian distributions using the expectation maximization (EM) algorithm to optimize the fit. CPTs were clearly multimodally distributed. Fitting a Gaussian Mixture Model (GMM) to the log-transformed threshold data revealed that the best fit is obtained with a three-component distribution pattern. The modes of the three identified Gaussian distributions, retransformed from the log domain to the mean stimulation temperatures at which the subjects had indicated pain thresholds, were obtained at 23.7 °C, 13.2 °C and 1.5 °C for Gaussians #1, #2 and #3, respectively. The localization of the first and second Gaussians was interpreted as reflecting the contribution of two different cold sensors. From the calculated localization of the modes of the first two Gaussians, the hypothesis of an involvement of TRPM8, sensing temperatures from 25-24 °C, and TRPA1, sensing cold from 17 °C, can be derived. In that case, subjects belonging to either Gaussian would possess a dominance of one or the other receptor at the skin area where the cold stimuli had been applied. The findings therefore support the suitability of complex analytical approaches for detecting mechanistically determined patterns from pain phenotype data.
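
    The fitting procedure described above (a Gaussian mixture fitted to log-transformed thresholds) can be sketched with scikit-learn. The synthetic thresholds below are drawn around the reported modes purely for illustration and are not the study's measurements.

```python
# Minimal sketch: fit a 3-component Gaussian mixture to log-transformed cold
# pain thresholds (synthetic data, not the study's measurements).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
thresholds = np.concatenate([
    rng.normal(23.7, 2.0, 150),    # synthetic cluster near the first reported mode
    rng.normal(13.2, 2.5, 120),
    rng.normal(1.5, 1.0, 60),
]).clip(min=0.5)                   # keep temperatures positive for the log transform

x = np.log(thresholds).reshape(-1, 1)
gmm = GaussianMixture(n_components=3, random_state=0).fit(x)
centres_degC = np.exp(gmm.means_.ravel())
print("component centres (deg C):", np.sort(centres_degC)[::-1].round(1))
```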

  5. Identifying thresholds for ecosystem-based management.

    Directory of Open Access Journals (Sweden)

    Jameal F Samhouri

    BACKGROUND: One of the greatest obstacles to moving ecosystem-based management (EBM) from concept to practice is the lack of a systematic approach to defining ecosystem-level decision criteria, or reference points that trigger management action. METHODOLOGY/PRINCIPAL FINDINGS: To assist resource managers and policymakers in developing EBM decision criteria, we introduce a quantitative, transferable method for identifying utility thresholds. A utility threshold is the level of human-induced pressure (e.g., pollution) at which small changes produce substantial improvements toward the EBM goal of protecting an ecosystem's structural (e.g., diversity) and functional (e.g., resilience) attributes. The analytical approach is based on the detection of nonlinearities in relationships between ecosystem attributes and pressures. We illustrate the method with a hypothetical case study of (1) fishing and (2) nearshore habitat pressure using an empirically validated marine ecosystem model for British Columbia, Canada, and derive numerical threshold values in terms of the density of two empirically tractable indicator groups, sablefish and jellyfish. We also describe how to incorporate uncertainty into the estimation of utility thresholds and highlight their value in the context of understanding EBM trade-offs. CONCLUSIONS/SIGNIFICANCE: For any policy scenario, an understanding of utility thresholds provides insight into the amount and type of management intervention required to make significant progress toward improved ecosystem structure and function. The approach outlined in this paper can be applied in the context of single or multiple human-induced pressures, to any marine, freshwater, or terrestrial ecosystem, and should facilitate more effective management.
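
    One simple way to operationalize a utility threshold is to fit or smooth the attribute-versus-pressure relationship and take the pressure at which the curve bends most sharply. The logistic-shaped synthetic data below are illustrative only and are not output of the ecosystem model used in the paper.

```python
# Minimal sketch: take the candidate utility threshold as the pressure where
# the attribute-pressure curve bends most sharply (largest |second derivative|).
# The noise-free logistic curve is synthetic; real model output would need
# smoothing before differentiation.
import numpy as np

pressure = np.linspace(0.0, 1.0, 200)                       # e.g. relative fishing pressure
attribute = 1.0 / (1.0 + np.exp(25.0 * (pressure - 0.6)))   # e.g. relative indicator density

curvature = np.gradient(np.gradient(attribute, pressure), pressure)
threshold = pressure[np.argmax(np.abs(curvature))]
print(f"candidate utility threshold at pressure ~ {threshold:.2f}")
```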

  6. Differential equation models for sharp threshold dynamics.

    Science.gov (United States)

    Schramm, Harrison C; Dimitrov, Nedialko B

    2014-01-01

    We develop an extension to differential equation models of dynamical systems to allow us to analyze probabilistic threshold dynamics that fundamentally and globally change system behavior. We apply our novel modeling approach to two cases of interest: a model of infectious disease modified for malware where a detection event drastically changes dynamics by introducing a new class in competition with the original infection; and the Lanchester model of armed conflict, where the loss of a key capability drastically changes the effectiveness of one of the sides. We derive and demonstrate a step-by-step, repeatable method for applying our novel modeling approach to an arbitrary system, and we compare the resulting differential equations to simulations of the system's random progression. Our work leads to a simple and easily implemented method for analyzing probabilistic threshold dynamics using differential equations. Published by Elsevier Inc.
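
    A generic sketch of coupling an ordinary differential equation model to a threshold event is shown below: an infection-style system whose dynamics change when a detection event introduces a competing class. The equations and parameters are arbitrary stand-ins, not the specific models analyzed by the authors.

```python
# Minimal sketch: an infection-style ODE whose dynamics change at a detection
# (threshold) event, integrated in two segments. All parameters are arbitrary.
import numpy as np
from scipy.integrate import solve_ivp

beta, gamma, patch_rate = 0.5, 0.1, 0.4

def pre_detection(t, y):
    s, i, p = y
    return [-beta * s * i, beta * s * i - gamma * i, 0.0]

def post_detection(t, y):        # detection introduces a competing "patched" class
    s, i, p = y
    return [-beta * s * i - patch_rate * s,
            beta * s * i - gamma * i,
            patch_rate * s]

t_detect = 12.0                  # detection time; could be drawn from a distribution
seg1 = solve_ivp(pre_detection, (0.0, t_detect), [0.99, 0.01, 0.0])
seg2 = solve_ivp(post_detection, (t_detect, 60.0), seg1.y[:, -1])
print("final (S, I, P):", np.round(seg2.y[:, -1], 3))
```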

  7. Seminal plasma as a source of prostate cancer peptide biomarker candidates for detection of indolent and advanced disease.

    Directory of Open Access Journals (Sweden)

    Jochen Neuhaus

    BACKGROUND: Extensive prostate-specific antigen screening for prostate cancer generates a high number of unnecessary biopsies and over-treatment due to insufficient differentiation between indolent and aggressive tumours. We hypothesized that seminal plasma is a robust source of novel prostate cancer (PCa) biomarkers with the potential to improve primary diagnosis and to distinguish advanced from indolent disease. METHODOLOGY/PRINCIPAL FINDINGS: In an open-label case/control study, 125 patients (70 PCa, 21 benign prostate hyperplasia, 25 chronic prostatitis, 9 healthy controls) were enrolled in 3 centres. Biomarker panels (a) for PCa diagnosis (comparison of PCa patients versus benign controls) and (b) for advanced disease (comparison of patients with post-surgery Gleason score 7) were sought. Independent cohorts were used for proteomic biomarker discovery and for testing the performance of the identified biomarker profiles. Seminal plasma was profiled using capillary electrophoresis mass spectrometry. Pre-analytical stability and analytical precision of the proteome analysis were determined. Support vector machine learning was used for classification. Stepwise application of two biomarker signatures with 21 and 5 biomarkers provided 83% sensitivity and 67% specificity for PCa detection in a test set of samples. A panel of 11 biomarkers for advanced disease discriminated between patients with Gleason score 7 and organ-confined (source of potential peptide markers
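
    The support-vector-machine classification step can be sketched with scikit-learn on synthetic marker intensities. The data, panel size and classifier settings below are assumptions for illustration and do not reproduce the authors' CE-MS profiles or biomarker panels.

```python
# Minimal sketch: SVM classification of case/control samples from a small
# peptide-marker panel. The data are synthetic, not CE-MS measurements.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)
n_markers = 21
controls = rng.normal(0.0, 1.0, size=(55, n_markers))
cases = rng.normal(0.6, 1.0, size=(70, n_markers))     # shifted marker intensities
X = np.vstack([controls, cases])
y = np.array([0] * len(controls) + [1] * len(cases))

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```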

  8. Detection of Intrinsic Source Structure at ∼3 Schwarzschild Radii with Millimeter-VLBI Observations of SAGITTARIUS A*

    Science.gov (United States)

    Lu, Ru-Sen; Krichbaum, Thomas P.; Roy, Alan L.; Fish, Vincent L.; Doeleman, Sheperd S.; Johnson, Michael D.; Akiyama, Kazunori; Psaltis, Dimitrios; Alef, Walter; Asada, Keiichi; Beaudoin, Christopher; Bertarini, Alessandra; Blackburn, Lindy; Blundell, Ray; Bower, Geoffrey C.; Brinkerink, Christiaan; Broderick, Avery E.; Cappallo, Roger; Crew, Geoffrey B.; Dexter, Jason; Dexter, Matt; Falcke, Heino; Freund, Robert; Friberg, Per; Greer, Christopher H.; Gurwell, Mark A.; Ho, Paul T. P.; Honma, Mareki; Inoue, Makoto; Kim, Junhan; Lamb, James; Lindqvist, Michael; Macmahon, David; Marrone, Daniel P.; Martí-Vidal, Ivan; Menten, Karl M.; Moran, James M.; Nagar, Neil M.; Plambeck, Richard L.; Primiani, Rurik A.; Rogers, Alan E. E.; Ros, Eduardo; Rottmann, Helge; SooHoo, Jason; Spilker, Justin; Stone, Jordan; Strittmatter, Peter; Tilanus, Remo P. J.; Titus, Michael; Vertatschitsch, Laura; Wagner, Jan; Weintroub, Jonathan; Wright, Melvyn; Young, Ken H.; Zensus, J. Anton; Ziurys, Lucy M.

    2018-05-01

    We report results from very long baseline interferometric (VLBI) observations of the supermassive black hole in the Galactic center, Sgr A*, at 1.3 mm (230 GHz). The observations were performed in 2013 March using six VLBI stations in Hawaii, California, Arizona, and Chile. Compared to earlier observations, the addition of the APEX telescope in Chile almost doubles the longest baseline length in the array, provides additional uv coverage in the N–S direction, and leads to a spatial resolution of ∼30 μas (∼3 Schwarzschild radii) for Sgr A*. The source is detected even at the longest baselines with visibility amplitudes of ∼4%–13% of the total flux density. We argue that such flux densities cannot result from interstellar refractive scattering alone, but indicate the presence of compact intrinsic source structure on scales of ∼3 Schwarzschild radii. The measured nonzero closure phases rule out point-symmetric emission. We discuss our results in the context of simple geometric models that capture the basic characteristics and brightness distributions of disk- and jet-dominated models and show that both can reproduce the observed data. Common to these models are the brightness asymmetry, the orientation, and characteristic sizes, which are comparable to the expected size of the black hole shadow. Future 1.3 mm VLBI observations with an expanded array and better sensitivity will allow more detailed imaging of the horizon-scale structure and bear the potential for a deep insight into the physical processes at the black hole boundary.

  9. Fatty acids oxidation and alternative energy sources detected in Taenia crassiceps cysticerci after host treatment with antihelminthic drugs.

    Science.gov (United States)

    Fraga, Carolina Miguel; Costa, Tatiane Luiza; Bezerra, José Clecildo Barreto; de Souza Lino Junior, Ruy; Vinaud, Marina Clare

    2012-05-01

    Human cysticercosis caused by Taenia crassiceps is rare however it is considered of zoonotic risk. The treatment of the infected patients was successful when using albendazole or praziquantel. The active forms of albendazole inhibit the glucose uptake and the active forms of praziquantel alter glycogen levels and nutrients absorption. The aim of this study was to analyze the production of organic acids that indicate the oxidation of fatty acids and the use of alternative energy sources from T. crassiceps cysticerci removed from the peritoneal cavity of mice treated with low dosages of albendazole (5.75 and 11.5mg/kg) or praziquantel (3.83 and 7.67 mg/kg). The beta-hydroxibutyrate production was higher by the larval stage cysticerci in all treated groups and the propionate production was higher in final stage cysticerci treated with 11.5mg/kg of albendazole when compared to the control group. The larval stages of cysticerci from the groups treated with 5.75 mg/kg of albendazole and 3.83 mg/kg of praziquantel produced more urea than the initial and final stages which indicate amino acids breakdown. We conclude that it was possible to detect the fatty acid oxidation and amino acids breakdown which indicate the use of alternative energy production sources as the used dosages only cause a partial blockage of the glucose uptake and leads to metabolic alterations in the cysticerci. The metabolic behavior observed after host treatment was different from former descriptions of the in vitro one which indicates great host-parasite interaction. Copyright © 2012 Elsevier Inc. All rights reserved.

  10. Convection Weather Detection by General Aviation Pilots with Convectional and Data-Linked Graphical Weather Information Sources

    Science.gov (United States)

    Chamberlain, James P.; Latorella, Kara A.

    2001-01-01

    This study compares how well general aviation (GA) pilots detect convective weather in flight with different weather information sources. A flight test was conducted in which GA pilot test subjects were given different in-flight weather information cues and flown toward convective weather of moderate or greater intensity. The test subjects were not actually flying the aircraft, but were given pilot tasks representative of the workload and position awareness requirements of the en route portion of a cross country GA flight. On each flight, one test subject received weather cues typical of a flight in visual meteorological conditions (VMC), another received cues typical of flight in instrument meteorological conditions (IMC), and a third received cues typical of flight in IMC but augmented with a graphical weather information system (GWIS). The GWIS provided the subject with near real time data-linked weather products, including a weather radar mosaic superimposed on a moving map with a symbol depicting the aircraft's present position and direction of track. At several points during each flight, the test subjects completed short questionnaires which included items addressing their weather situation awareness and flight decisions. In particular, test subjects were asked to identify the location of the nearest convective cells. After the point of nearest approach to convective weather, the test subjects were asked to draw the location of convective weather on an aeronautical chart, along with the aircraft's present position. This paper reports preliminary results on how accurately test subjects provided with these different weather sources could identify the nearest cell of moderate or greater intensity along their route of flight. Additional flight tests are currently being conducted to complete the data set.

  11. Biomarkers of Atrial Cardiopathy and Atrial Fibrillation Detection on Mobile Outpatient Continuous Telemetry After Embolic Stroke of Undetermined Source.

    Science.gov (United States)

    Sebasigari, Denise; Merkler, Alexander; Guo, Yang; Gialdini, Gino; Kummer, Benjamin; Hemendinger, Morgan; Song, Christopher; Chu, Antony; Cutting, Shawna; Silver, Brian; Elkind, Mitchell S V; Kamel, Hooman; Furie, Karen L; Yaghi, Shadi

    2017-06-01

    Biomarkers of atrial dysfunction or "cardiopathy" are associated with embolic stroke risk. However, it is unclear if this risk is mediated by undiagnosed paroxysmal atrial fibrillation or flutter (AF). We aim to determine whether atrial cardiopathy biomarkers predict AF on continuous heart-rhythm monitoring after embolic stroke of undetermined source (ESUS). This was a single-center retrospective study including all patients with ESUS undergoing 30 days of ambulatory heart-rhythm monitoring to look for AF between January 1, 2013 and December 31, 2015. We reviewed medical records for clinical, radiographic, and cardiac variables. The primary outcome was a new diagnosis of AF detected during heart-rhythm monitoring. The primary predictors were atrial biomarkers: left atrial diameter on echocardiography, P-wave terminal force in electrocardiogram (ECG) lead V1, and P wave - R wave (PR) interval on ECG. A multiple logistic regression model was used to assess the relationship between atrial biomarkers and AF detection. Among 196 eligible patients, 23 (11.7%) were diagnosed with AF. In unadjusted analyses, patients with AF were older (72.4 years versus 61.4 years) and had a larger left atrial diameter (39.2 mm versus 35.7 mm, P = .03). In a multivariable model, the only predictor of AF was age ≥ 60 years (odds ratio, 3.0; 95% CI, 1.06-8.5; P = .04). Atrial biomarkers were weakly associated with AF after ESUS. This suggests that previously reported associations between these markers and stroke may reflect independent cardiac pathways leading to stroke. Prospective studies are needed to investigate these mechanisms. Copyright © 2017 National Stroke Association. Published by Elsevier Inc. All rights reserved.
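    The multivariable analysis described above is a standard multiple logistic regression of AF detection on age and atrial biomarkers. A minimal sketch of that type of model, using synthetic data and hypothetical variable names rather than the study cohort, could be:

      # Logistic regression of AF detection on an age indicator and two atrial
      # biomarkers, reporting odds ratios. All values are synthetic placeholders.
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(1)
      n = 196
      age_ge_60 = rng.integers(0, 2, size=n)        # age >= 60 indicator
      la_diameter = rng.normal(37.0, 4.0, size=n)   # left atrial diameter (mm)
      ptfv1 = rng.normal(3000.0, 1500.0, size=n)    # P-wave terminal force in V1 (uV*ms)
      af = rng.integers(0, 2, size=n)               # AF detected on monitoring (0/1)

      X = sm.add_constant(np.column_stack([age_ge_60, la_diameter, ptfv1]))
      model = sm.Logit(af, X).fit(disp=False)
      odds_ratios = np.exp(model.params)            # e.g., odds ratio for age >= 60
      print(odds_ratios)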

  12. Distributed Denial of Service Attack Source Detection Using Efficient Traceback Technique (ETT) in Cloud-Assisted Healthcare Environment.

    Science.gov (United States)

    Latif, Rabia; Abbas, Haider; Latif, Seemab; Masood, Ashraf

    2016-07-01

    Security and privacy are the first and foremost concerns that should be given special attention when dealing with Wireless Body Area Networks (WBANs). As WBAN sensors operate in an unattended environment and carry critical patient health information, the Distributed Denial of Service (DDoS) attack is one of the major attacks in the WBAN environment: it not only exhausts the available resources but also influences the reliability of the information being transmitted. This research work is an extension of our previous work, in which a machine learning based attack detection algorithm was proposed to detect DDoS attacks in the WBAN environment; however, in order to avoid complexity, no consideration was given to the traceback mechanism. During traceback, the challenge lies in reconstructing the attack path in order to identify the attack source. Among existing traceback techniques, the Probabilistic Packet Marking (PPM) approach is the most commonly used in conventional IP-based networks. However, since the marking probability assignment has a significant effect on both the convergence time and the performance of a scheme, PPM is not directly applicable in the WBAN environment due to high convergence time and overhead on intermediate nodes. Therefore, in this paper we propose a new scheme called the Efficient Traceback Technique (ETT), which is based on the Dynamic Probability Packet Marking (DPPM) approach and uses the MAC header in place of the IP header. Instead of using a fixed marking probability, the proposed scheme uses a variable marking probability based on the number of hops travelled by a packet to reach the target node. Finally, path reconstruction algorithms are proposed to trace back an attacker. Evaluation and simulation results indicate that the proposed solution outperforms fixed PPM in terms of convergence time and computational overhead on nodes.
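    The core idea of DPPM, as described above, is that the marking probability varies with the number of hops a packet has travelled rather than being fixed. The following is a minimal sketch of hop-count-based marking; the field names and path are hypothetical, and the real scheme stores the mark in a MAC-header field rather than a Python dictionary:

      # Dynamic probability packet marking (DPPM): each forwarding node marks a
      # packet with probability 1/d, where d is the number of hops travelled so
      # far, instead of a fixed probability as in classic PPM.
      import random

      def mark_packet(packet, node_id):
          """Mark the packet with probability 1/hops; later marks overwrite earlier ones."""
          packet["hops"] = packet.get("hops", 0) + 1
          p_mark = 1.0 / packet["hops"]      # variable marking probability
          if random.random() < p_mark:
              packet["mark"] = node_id       # stands in for the MAC-header mark field
          return packet

      # A packet forwarded along a hypothetical 5-hop path toward the target node.
      path = ["sensor_A", "relay_B", "relay_C", "relay_D", "coordinator"]
      pkt = {"payload": "vitals"}
      for node in path:
          pkt = mark_packet(pkt, node)
      print(pkt)

    With the 1/d rule, marks end up roughly uniformly distributed over the path nodes, which is what lets the victim reconstruct the attack path from fewer packets than fixed-probability marking.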

  13. Wafer plane inspection with soft resist thresholding

    Science.gov (United States)

    Hess, Carl; Shi, Rui-fang; Wihl, Mark; Xiong, Yalin; Pang, Song

    2008-10-01

    Wafer Plane Inspection (WPI) is an inspection mode on the KLA-Tencor TeraScan platform that uses the high signal-to-noise-ratio images from the high-numerical-aperture microscope and then models the entire lithographic process to enable defect detection on the wafer plane [1]. This technology meets the needs of some advanced mask manufacturers to identify the lithographically significant defects while ignoring other, non-lithographically-significant defects. WPI accomplishes this goal by performing defect detection based on a modeled image of how the mask features would actually print in the photoresist. There are several advantages to this approach: (1) the high fidelity of the images provides a sensitivity advantage over competing approaches; (2) the ability to perform defect detection on the wafer plane allows one to see only those defects that have a printing impact on the wafer; (3) the use of modeling in the lithographic portion of the flow enables unprecedented flexibility to support arbitrary illumination profiles, process-window inspection in unit time, and combination modes to find both printing and non-printing defects. WPI is proving to be a valuable addition to the KLA-Tencor detection algorithm suite. The modeling portion of WPI uses a single resist threshold as the final step in the processing. This has been shown to be adequate on several advanced customer layers, but it is not ideal for all layers. Actual resist chemistry involves complicated processes, including acid and base diffusion and quench, that are not consistently well modeled with a single resist threshold. We considered the use of an advanced resist model for WPI, but rejected it because the burdensome calibration requirements of such a model are not practical for reticle inspection. This paper describes an alternative approach that applies a "soft" resist threshold, providing a more robust solution for the most challenging processes. This approach is just
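    The contrast between a single hard resist threshold and a "soft" threshold can be illustrated on a toy aerial-image profile. The sketch below is an assumption-laden illustration (the intensity profile, threshold value, and sigmoid steepness are made up), not the WPI implementation:

      # Hard vs. "soft" (sigmoid) resist thresholding of a modeled aerial image.
      import numpy as np

      x = np.linspace(-200, 200, 401)                       # position across a feature (nm)
      intensity = 0.5 + 0.45 * np.cos(2 * np.pi * x / 400)  # synthetic aerial image (a.u.)

      threshold = 0.6
      hard_print = intensity >= threshold                   # binary: resist prints or not

      steepness = 25.0                                      # softness of the transition
      soft_print = 1.0 / (1.0 + np.exp(-steepness * (intensity - threshold)))

      # The soft threshold gives a graded "printability" near feature edges, which
      # is more robust when the real resist response is not a simple step function.
      print(hard_print.sum(), float(soft_print.sum()))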

  14. Music effect on pain threshold evaluated with current perception threshold

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    AIM: Music relieves anxiety and psychological tension. This effect of music is applied to surgical operations in hospitals and dental offices. It is still unclear whether this effect is limited to the psychological aspect or extends to the physical aspect, and whether it is influenced by the mood or emotion of the listener. To elucidate these issues, we evaluated the effect of music on pain threshold using the current perception threshold (CPT) and the Profile of Mood States (POMS) test. METHODS: Thirty healthy subjects (12 men, 18 women, 25-49 years old, mean age 34.9) were tested. (1) After the POMS test, the pain thresholds of all subjects were evaluated with CPT using a Neurometer (Radionics, USA) under 6 conditions (silence, slow-tempo classical music, nursery music, hard rock music, classical piano music, and relaxation music) with 30-second intervals. (2) After the Stroop color-word test was applied as a stressor, pain thresholds were evaluated with CPT under 2 conditions: silence and listening to slow-tempo classical music. RESULTS: While listening to music, CPT scores increased, especially at the 2,000 Hz level, which is related to compression, warmth, and pain sensation. Type of music, music preference, and stress also affected CPT scores. CONCLUSION: The present study demonstrated that concentration on the music raises the pain threshold and that stress and mood influence the effect of music on pain threshold.

  15. A Soft Computing Approach to Crack Detection and Impact Source Identification with Field-Programmable Gate Array Implementation

    Directory of Open Access Journals (Sweden)

    Arati M. Dixit

    2013-01-01

    Full Text Available Real-time nondestructive testing (NDT) for crack detection and impact source identification (CDISI) has attracted researchers from diverse areas, as is apparent from the current work in the literature. CDISI has usually been performed by visual assessment of waveforms generated by a standard data acquisition system. In this paper we suggest an automation of CDISI for metal armor plates using a soft computing approach, developing a fuzzy inference system to deal effectively with this problem. It is also advantageous to develop a chip that can contribute towards real-time CDISI. The objective of this paper is to report on efforts to develop an automated CDISI procedure and to formulate the technique such that the proposed method can be easily implemented on a chip. The CDISI fuzzy inference system is developed using MATLAB's fuzzy logic toolbox. A VLSI circuit for CDISI is developed on the basis of the fuzzy logic model using Verilog, a hardware description language (HDL). Xilinx ISE WebPACK 9.1i is used for design, synthesis, implementation, and verification. The CDISI field-programmable gate array (FPGA) implementation is done using Xilinx's Spartan 3 FPGA. SynaptiCAD's Verilog simulators, VeriLogger PRO and ModelSim, are used as the software simulation and debug environment.
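    To make the fuzzy-inference idea concrete, the following is a minimal Sugeno-style sketch that maps two hypothetical waveform features to a crack index; the membership functions, rules, and outputs are illustrative assumptions rather than the paper's calibrated MATLAB/Verilog system:

      # Sugeno-style fuzzy inference for crack detection from two hypothetical
      # waveform features: normalized peak amplitude and normalized decay rate.
      def tri(x, a, b, c):
          """Triangular membership function over [a, c] peaking at b."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      def crack_index(amplitude, decay):
          low_amp, high_amp = tri(amplitude, 0.0, 0.0, 0.5), tri(amplitude, 0.4, 1.0, 1.0)
          slow_dec, fast_dec = tri(decay, 0.0, 0.0, 0.5), tri(decay, 0.4, 1.0, 1.0)

          # Rule strengths (min as AND) and crisp rule outputs: 0 = intact, 1 = crack.
          rules = [
              (min(low_amp, slow_dec), 0.0),   # weak, slowly decaying signal -> intact
              (min(high_amp, fast_dec), 1.0),  # strong, quickly decaying signal -> crack
              (min(high_amp, slow_dec), 0.5),  # ambiguous -> possible impact damage
          ]
          num = sum(w * out for w, out in rules)
          den = sum(w for w, _ in rules) or 1.0
          return num / den                     # weighted-average defuzzification

      print(crack_index(amplitude=0.8, decay=0.7))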

  16. Detection of silver nanoparticles in parsley by solid sampling high-resolution-continuum source atomic absorption spectrometry.

    Science.gov (United States)

    Feichtmeier, Nadine S; Leopold, Kerstin

    2014-06-01

    In this work, we present a fast and simple approach for the detection of silver nanoparticles (AgNPs) in biological material (parsley) by solid sampling high-resolution continuum source atomic absorption spectrometry (HR-CS AAS). A novel evaluation strategy was developed in order to distinguish AgNPs from ionic silver and to estimate AgNP size. For this purpose, the atomisation delay was introduced as a significant indication of AgNPs, whereas atomisation rates allow distinction of 20-, 60-, and 80-nm AgNPs. Atomisation delays were found to be higher for samples containing silver ions than for samples containing silver nanoparticles. A maximum difference in atomisation delay, normalised by the sample weight, of 6.27 ± 0.96 s mg(-1) was obtained after optimisation of the furnace program of the AAS. For this optimisation, a multivariate experimental design was used, varying the atomisation temperature, atomisation heating rate, and pyrolysis temperature. Atomisation rates were calculated as the slope at the first inflection point of the absorbance signals and correlated with the size of the AgNPs in the biological sample. Hence, solid sampling HR-CS AAS proved to be a promising tool for identifying and distinguishing silver nanoparticles from ionic silver directly in solid biological samples.
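    The signal evaluation described above (atomisation delay normalised by sample mass, and atomisation rate as the slope at the first inflection point of the absorbance transient) can be sketched on a synthetic transient; the values below are hypothetical and serve only to illustrate the computation:

      # Atomisation delay and atomisation rate from a synthetic absorbance transient.
      import numpy as np

      t = np.linspace(0.0, 5.0, 500)                        # time after atomisation start (s)
      absorbance = 0.4 / (1.0 + np.exp(-(t - 1.8) / 0.15))  # synthetic rising transient
      sample_mass_mg = 0.12                                  # hypothetical sample weight

      onset_idx = np.argmax(absorbance > 0.05 * absorbance.max())
      atomisation_delay = t[onset_idx] / sample_mass_mg      # s per mg (mass-normalised delay)

      slope = np.gradient(absorbance, t)
      inflection_idx = np.argmax(slope)                       # steepest (first) rise of the signal
      atomisation_rate = slope[inflection_idx]

      print(f"delay = {atomisation_delay:.2f} s/mg, rate = {atomisation_rate:.3f} A/s")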

  17. The development of a XEOL and TR XEOL detection system for the I18 microfocus beamline Diamond light source

    International Nuclear Information System (INIS)

    Taylor, R.P.; Finch, A.A.; Mosselmans, J.F.W.; Quinn, P.D.

    2013-01-01

    We describe the design and capabilities of a new Continuous Wave X-ray Excited Optical Luminescence (CW-XEOL) and Time Resolved X-ray Excited Optical Luminescence (TR-XEOL) facility on the I18 beamline at the Diamond Light Source, the UK national synchrotron facility. Experimental data from a suite of framework silicates are presented to illustrate the capabilities of the system. The experiments include simple CW-XEOL spectroscopy, TR-XEOL lifetime measurements, dose and dose-rate dependence (CW-XEOL) experiments, spatially resolved TR-XEOL on a heterogeneous sample, wavelength-resolved TR-XEOL, and Optically Derived X-ray Absorption Spectroscopy (OD-XAS). - Highlights: ► We describe the capabilities of a new CW-XEOL and TR-XEOL detection system for a hard X-ray beamline. ► We model TR-XEOL with luminescence lifetimes from ∼25 ps to ∼400 ps from framework silicates. ► CW-XEOL over 200–900 nm with a resolution of ∼0.9 nm is used for dose and dose-rate experiments. ► Micro-beam, high-spatial-resolution XEOL within X-ray excitation energies of 2–20 keV.

  18. A non-parametric framework for estimating threshold limit values