WorldWideScience

Sample records for micro-spectroscopic detection analysis

  1. Micro-Spectroscopic Chemical Imaging of Individual Identified Marine Biogenic and Ambient Organic Ice Nuclei (Invited)

    Science.gov (United States)

    Knopf, D. A.; Alpert, P. A.; Wang, B.; O'Brien, R. E.; Moffet, R. C.; Aller, J. Y.; Laskin, A.; Gilles, M.

    2013-12-01

    Atmospheric ice formation represents one of the least understood atmospheric processes with important implications for the hydrological cycle and climate. Current freezing descriptions assume that ice active sites on the particle surface initiate ice nucleation; however, the nature of these sites remains elusive. Here, we present a new experimental method that allows us to relate physical and chemical properties of individual particles with observed water uptake and ice nucleation ability using a combination of micro-spectroscopic and optical single particle analytical techniques. We apply this method to field-collected particles and particles generated via bursting of bubbles produced by glass frit aeration and plunging water impingement jets in a mesocosm containing artificial sea water and bacteria and/or phytoplankton. The most efficient ice nuclei (IN) within a particle population are identified and characterized. Single particle characterization is achieved by computer controlled scanning electron microscopy with energy dispersive analysis of X-rays (CCSEM/EDX) and scanning transmission X-ray microscopy with near edge X-ray absorption fine structure spectroscopy. A vapor controlled cooling-stage coupled to an optical microscope is used to determine the onsets of water uptake, immersion freezing, and deposition ice nucleation of the individual particles as a function of temperature (T) as low as 200 K and relative humidity (RH) up to water saturation. In addition, we perform CCSEM/EDX to obtain, on a single-particle level, the elemental composition of the entire particle population. Thus, we can determine if the IN are exceptional in nature or belong to a major particle type class with respect to composition and size. We find that ambient and sea spray particles are coated by organic material and can induce ice formation under tropospherically relevant conditions. Micro-spectroscopic single particle analysis of the investigated particle samples invokes a potential

  2. Activation analysis. Detection limits

    International Nuclear Information System (INIS)

    Revel, G.

    1999-01-01

    Numerical data and limits of detection related to the four irradiation modes, often used in activation analysis (reactor neutrons, 14 MeV neutrons, photon gamma and charged particles) are presented here. The technical presentation of the activation analysis is detailed in the paper P 2565 of Techniques de l'Ingenieur. (A.L.B.)

  3. Visible-IR and Raman micro-spectroscopic investigation of three Itokawa particles collected by Hayabusa

    Science.gov (United States)

    Brunetto, R.; Bonal, L.; Beck, P.; Dartois, E.; Dionnet, Z.; Djouadi, Z.; Füri, E.; Kakazu, Y.; Oudayer, P.; Quirico, E.; Engrand, C.

    2014-07-01

    distinct Hayabusa particles [e.g., 1]. The Itokawa materials are compatible with an LL4-6 chondrite classification based on O isotopes and chemical compositions of minerals (e.g., [1,2]). In particular, -0163 might be related to the least metamorphosed particles (LL4), based on the high Fo content of the olivine [1]. The diffuse reflectance VIS-NIR spectra are consistent with the presence of the mineral groups detected via Raman and IR. In particular, the spectra of particles -0163 and -0213 are also compatible with the ground-based observations of the asteroid Itokawa [3] both in terms of the 1-μm band depth and the spectral slope. Particle -0174 has a similar 1-μm band depth but higher (redder) spectral slope, possibly indicative of the presence of a larger amount of nanophase metallic iron, a by-product of space weathering induced by solar wind, similarly to what has been detected on other Itokawa particles [4]. Future work: A noble gas study of the particles will be performed. We will determine the noble gas (He-Ne-Ar) and nitrogen abundance and isotope characteristics of the two grains by CO2 laser heating or UV laser ablation. By identifying and quantifying the proportion of solar and cosmogenic volatiles in Itokawa samples, we will be able to better constrain the residence time of dust particles on the surface of the asteroid, and to determine if any primordial volatile component has survived in the regolith material.

  4. Malware detection and analysis

    Science.gov (United States)

    Chiang, Ken; Lloyd, Levi; Crussell, Jonathan; Sanders, Benjamin; Erickson, Jeremy Lee; Fritz, David Jakob

    2016-03-22

    Embodiments of the invention describe systems and methods for malicious software detection and analysis. A binary executable comprising obfuscated malware on a host device may be received, and incident data indicating a time when the binary executable was received and identifying processes operating on the host device may be recorded. The binary executable is analyzed via a scalable plurality of execution environments, including one or more non-virtual execution environments and one or more virtual execution environments, to generate runtime data and deobfuscation data attributable to the binary executable. At least some of the runtime data and deobfuscation data attributable to the binary executable is stored in a shared database, while at least some of the incident data is stored in a private, non-shared database.

  5. Vibrational Micro-Spectroscopy of Human Tissues Analysis: Review.

    Science.gov (United States)

    Bunaciu, Andrei A; Hoang, Vu Dang; Aboul-Enein, Hassan Y

    2017-05-04

    Vibrational spectroscopy (Infrared (IR) and Raman) and, in particular, micro-spectroscopy and micro-spectroscopic imaging have been used to characterize developmental changes in tissues, to monitor these changes in cell cultures and to detect disease and drug-induced modifications. The conventional methods for biochemical and histopathological tissue characterization necessitate complex and "time-consuming" sample manipulations, and the results are rarely quantifiable. The spectroscopy of molecular vibrations using mid-IR or Raman techniques has been applied to samples of human tissue. This article reviews the application of these vibrational spectroscopic techniques for analysis of biological tissue published between 2005 and 2015.

  6. Signal analysis for failure detection

    International Nuclear Information System (INIS)

    Parpaglione, M.C.; Perez, L.V.; Rubio, D.A.; Czibener, D.; D'Attellis, C.E.; Brudny, P.I.; Ruzzante, J.E.

    1994-01-01

    Several methods for analysis of acoustic emission signals are presented. They are mainly oriented to detection of changes in noisy signals and characterization of higher amplitude discrete pulses or bursts. The aim was to relate changes and events with failure, crack or wear in materials, the final goal being to obtain automatic means of detecting such changes and/or events. Performance evaluation was made using both simulated and laboratory test signals. The methods being presented are the following: 1. Application of the Hopfield Neural Network (NN) model for classifying faults in pipes and detecting wear of a bearing. 2. Application of the Kohonen and Back Propagation Neural Network models for the same problem. 3. Application of Kalman filtering to determine the time occurrence of bursts. 4. Application of a bank of Kalman filters (KF) for failure detection in pipes. 5. Study of amplitude distribution of signals for detecting changes in their shape. 6. Application of the entropy distance to measure differences between signals. (author). 10 refs, 11 figs
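
    As a rough illustration of item 3 above (using Kalman filtering to determine the time occurrence of bursts), the sketch below flags burst onsets as abnormally large innovations of a simple scalar Kalman filter; the signal, noise variances and threshold are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Synthetic acoustic-emission-like record: background noise plus two bursts.
rng = np.random.default_rng(4)
signal = rng.normal(0.0, 0.1, size=2000)
signal[800:820] += 2.0
signal[1500:1515] += 1.5

# Scalar Kalman filter with a constant-level model; bursts show up as large innovations.
q, r = 1e-4, 0.1 ** 2          # process / measurement noise variances (assumed)
x_hat, p = 0.0, 1.0
innovations = np.empty(signal.size)
for k, z in enumerate(signal):
    p += q                     # predict
    nu = z - x_hat             # innovation (measurement residual)
    gain = p / (p + r)
    x_hat += gain * nu         # update
    p *= 1.0 - gain
    innovations[k] = nu

onsets = np.where(np.abs(innovations) > 6 * np.sqrt(r))[0]
print("burst activity flagged near samples:", onsets[:5], "...")
```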

  7. Development of synchrotron x-ray micro-spectroscopic techniques and application to problems in low temperature geochemistry. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    1993-10-01

    The focus of the technical development effort has been the development of apparatus and techniques for the utilization of X-ray Fluorescence (XRF), Extended X-ray Absorption Fine Structure (EXAFS) and X-ray Absorption Near Edge Structure (XANES) spectroscopies in a microprobe mode. The present XRM uses white synchrotron radiation (3 to 30 keV) from a bending magnet for trace element analyses using the x-ray fluorescence technique. Two significant improvements to this device have been recently implemented. Focusing Mirror: An 8:1 ellipsoidal mirror was installed in the X26A beamline to focus the incident synchrotron radiation and thereby increase the flux on the sample by about a factor of 30. Incident Beam Monochromator: The monochromator has been successfully installed and commissioned in the X26A beamline upstream of the mirror to permit analyses with focused monochromatic radiation. The monochromator consists of a channel-cut silicon (111) crystal driven by a Klinger stepping motor translator. We have demonstrated that the operating range of this instrument is 4 to 20 keV with 0.01 eV steps and that it produces a beam with an energy bandwidth of approximately 10^-4. The primary purpose of the monochromator is for x-ray absorption spectroscopy (XAS) measurements, but it is also used for selective excitation in trace element microanalysis. To date, we have conducted XANES studies on Ti, Cr, Fe, Ce and U, spanning the entire accessible energy range and including both K and L edge spectra. Practical detection limits for microXANES are 10-100 ppm for 100 μm spots.

  8. Micro-spectroscopic investigation of valence change processes in resistive switching SrTiO3 thin films

    International Nuclear Information System (INIS)

    Koehl, Annemarie

    2014-01-01

    oxide film and the electrode is investigated by photoemission electron microscopy. Within this work devices with different thickness of the oxide layer are studied. While the results for thicker films can be explained by a localization of the switching effect within growth defects, for films with a lower oxide thickness we observe a considerable modification of the chemical structure up to phase formation on an extended lateral scale. In particular, we detect the formation of a new, Sr-rich phase which can be modeled by a special Ruddlesden-Popper phase using ab-initio theory. While most switching models assume only the diffusion of oxygen vacancies, our experiments clearly reveal that (at least) during forming diffusion is also enabled within the cation sublattice.

  9. Micro-spectroscopic investigation of valence change processes in resistive switching SrTiO{sub 3} thin films

    Energy Technology Data Exchange (ETDEWEB)

    Koehl, Annemarie

    2014-05-15

    between the oxide film and the electrode is investigated by photoemission electron microscopy. Within this work devices with different thickness of the oxide layer are studied. While the results for thicker films can be explained by a localization of the switching effect within growth defects, for films with a lower oxide thickness we observe a considerable modification of the chemical structure up to phase formation on an extended lateral scale. In particular, we detect the formation of a new, Sr-rich phase which can be modeled by a special Ruddlesden-Popper phase using ab-initio theory. While most switching models assume only the diffusion of oxygen vacancies, our experiments clearly reveal that (at least) during forming diffusion is also enabled within the cation sublattice.

  10. Traffic sign detection and analysis

    DEFF Research Database (Denmark)

    Møgelmose, Andreas; Trivedi, Mohan M.; Moeslund, Thomas B.

    2012-01-01

    Traffic sign recognition (TSR) is a research field that has seen much activity in the recent decade. This paper introduces the problem and presents 4 recent papers on traffic sign detection and 4 recent papers on traffic sign classification. It attempts to extract recent trends in the field...

  11. Radar fall detection using principal component analysis

    Science.gov (United States)

    Jokanovic, Branka; Amin, Moeness; Ahmad, Fauzia; Boashash, Boualem

    2016-05-01

    Falls are a major cause of fatal and nonfatal injuries in people aged 65 years and older. Radar has the potential to become one of the leading technologies for fall detection, thereby enabling the elderly to live independently. Existing techniques for fall detection using radar are based on manual feature extraction and require significant parameter tuning in order to provide successful detections. In this paper, we employ principal component analysis for fall detection, wherein eigen images of observed motions are employed for classification. Using real data, we demonstrate that the PCA based technique provides performance improvement over the conventional feature extraction methods.
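
    A minimal sketch of the eigen-image idea described above, assuming the radar time-frequency images are available as flattened NumPy arrays; the data, labels and classifier choice are hypothetical placeholders, not the authors' implementation.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

# Placeholder data: one flattened time-frequency image per observed motion,
# with hypothetical labels (1 = fall, 0 = other motion).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64 * 64))
y = rng.integers(0, 2, size=200)
X_train, X_test, y_train, y_test = X[:150], X[150:], y[:150], y[150:]

pca = PCA(n_components=20).fit(X_train)        # leading "eigen images"
clf = KNeighborsClassifier(n_neighbors=5).fit(pca.transform(X_train), y_train)
print("held-out accuracy:", clf.score(pca.transform(X_test), y_test))
```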

  12. Crack Detection with Lamb Wave Wavenumber Analysis

    Science.gov (United States)

    Tian, Zhenhua; Leckey, Cara; Rogge, Matt; Yu, Lingyu

    2013-01-01

    In this work, we present our study of Lamb wave crack detection using wavenumber analysis. The aim is to demonstrate the application of wavenumber analysis to 3D Lamb wave data to enable damage detection. The 3D wavefields (including vx, vy and vz components) in time-space domain contain a wealth of information regarding the propagating waves in a damaged plate. For crack detection, three wavenumber analysis techniques are used: (i) two dimensional Fourier transform (2D-FT) which can transform the time-space wavefield into frequency-wavenumber representation while losing the spatial information; (ii) short space 2D-FT which can obtain the frequency-wavenumber spectra at various spatial locations, resulting in a space-frequency-wavenumber representation; (iii) local wavenumber analysis which can provide the distribution of the effective wavenumbers at different locations. All of these concepts are demonstrated through a numerical simulation example of an aluminum plate with a crack. The 3D elastodynamic finite integration technique (EFIT) was used to obtain the 3D wavefields, of which the vz (out-of-plane) wave component is compared with the experimental measurement obtained from a scanning laser Doppler vibrometer (SLDV) for verification purposes. The experimental and simulated results are found to be in close agreement. The application of wavenumber analysis on 3D EFIT simulation data shows the effectiveness of the analysis for crack detection. Keywords: Lamb wave, crack detection, wavenumber analysis, EFIT modeling
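
    The sketch below illustrates technique (i), the 2D Fourier transform of a time-space wavefield into the frequency-wavenumber domain, on a synthetic single-mode wavefield; the sampling steps and mode parameters are illustrative assumptions.

```python
import numpy as np

# Synthetic 1D-space + time wavefield with a single propagating mode.
dx, dt = 1e-3, 1e-6                      # 1 mm spatial step, 1 us time step
x = np.arange(0.0, 0.5, dx)              # 0.5 m scan line
t = np.arange(0.0, 2e-4, dt)             # 200 us record
k0, f0 = 400.0, 200e3                    # wavenumber (rad/m), frequency (Hz)
T, X = np.meshgrid(t, x, indexing="ij")
v = np.sin(2 * np.pi * f0 * T - k0 * X)  # v(t, x)

# 2D FFT into the frequency-wavenumber domain.
V = np.fft.fftshift(np.fft.fft2(v))
freqs = np.fft.fftshift(np.fft.fftfreq(t.size, dt))
wavenumbers = np.fft.fftshift(np.fft.fftfreq(x.size, dx)) * 2 * np.pi

i, j = np.unravel_index(np.argmax(np.abs(V)), V.shape)
print(f"dominant mode: f = {abs(freqs[i]):.3g} Hz, k = {abs(wavenumbers[j]):.3g} rad/m")
```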

  13. Social network analysis community detection and evolution

    CERN Document Server

    Missaoui, Rokia

    2015-01-01

    This book is devoted to recent progress in social network analysis with a high focus on community detection and evolution. The eleven chapters cover the identification of cohesive groups, core components and key players either in static or dynamic networks of different kinds and levels of heterogeneity. Other important topics in social network analysis such as influential detection and maximization, information propagation, user behavior analysis, as well as network modeling and visualization are also presented. Many studies are validated through real social networks such as Twitter. This edit

  14. Tornado detection data reduction and analysis

    Science.gov (United States)

    Davisson, L. D.

    1977-01-01

    Data processing and analysis was provided in support of tornado detection by analysis of radio frequency interference in various frequency bands. Sea state determination data from short pulse radar measurements were also processed and analyzed. A backscatter simulation was implemented to predict radar performance as a function of wind velocity. Computer programs were developed for the various data processing and analysis goals of the effort.

  15. Network Anomaly Detection Based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Ali A. Ghorbani

    2008-11-01

    Signal processing techniques have been applied recently for analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining the wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network where five attack types are successfully detected from over 30 million flows.

  16. Network Anomaly Detection Based on Wavelet Analysis

    Science.gov (United States)

    Lu, Wei; Ghorbani, Ali A.

    2008-12-01

    Signal processing techniques have been applied recently for analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining the wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network where five attack types are successfully detected from over 30 million flows.
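
    As a loose illustration of the wavelet-approximation idea described in the two records above (not the authors' system-identification model), the sketch below flags points in one traffic feature series that deviate strongly from their wavelet approximation; it assumes the PyWavelets (pywt) package, and the feature series and threshold are invented for the example.

```python
import numpy as np
import pywt   # PyWavelets, assumed installed

# Placeholder network feature: flows per minute, with an injected "attack" burst.
rng = np.random.default_rng(1)
flows = rng.poisson(100, size=512).astype(float)
flows[300:310] += 400

# Keep only the level-4 wavelet approximation as a smooth baseline.
coeffs = pywt.wavedec(flows, "db4", level=4)
coeffs[1:] = [np.zeros_like(c) for c in coeffs[1:]]
baseline = pywt.waverec(coeffs, "db4")[: flows.size]

residual = flows - baseline
threshold = 5 * np.median(np.abs(residual))   # simple robust threshold
print("anomalous minutes:", np.where(np.abs(residual) > threshold)[0])
```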

  17. Linear discriminant analysis for welding fault detection

    International Nuclear Information System (INIS)

    Li, X.; Simpson, S.W.

    2010-01-01

    This work presents a new method for real time welding fault detection in industry based on Linear Discriminant Analysis (LDA). A set of parameters was calculated from one second blocks of electrical data recorded during welding and based on control data from reference welds under good conditions, as well as faulty welds. Optimised linear combinations of the parameters were determined with LDA and tested with independent data. Short arc welds in overlap joints were studied with various power sources, shielding gases, wire diameters, and process geometries. Out-of-position faults were investigated. Application of LDA fault detection to a broad range of welding procedures was investigated using a similarity measure based on Principal Component Analysis. The measure determines which reference data are most similar to a given industrial procedure and the appropriate LDA weights are then employed. Overall, results show that Linear Discriminant Analysis gives an effective and consistent performance in real-time welding fault detection.
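
    A minimal sketch of the classification step, assuming per-second parameter blocks have already been extracted from the welding electrical data; the features, labels and scikit-learn usage are illustrative, not the authors' implementation.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Placeholder per-second feature blocks: 8 parameters per block.
rng = np.random.default_rng(2)
X_good = rng.normal(0.0, 1.0, size=(300, 8))    # reference welds, good conditions
X_fault = rng.normal(1.5, 1.0, size=(100, 8))   # reference faulty welds
X = np.vstack([X_good, X_fault])
y = np.r_[np.zeros(300), np.ones(100)]

# LDA finds an optimised linear combination of the parameters.
lda = LinearDiscriminantAnalysis().fit(X, y)
new_blocks = rng.normal(1.5, 1.0, size=(5, 8))  # blocks from a weld under test
print("fault flags:", lda.predict(new_blocks).astype(bool))
```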

  18. Analysis and detection of climate change

    International Nuclear Information System (INIS)

    Thejll, P.; Stendel, M.

    2001-01-01

    The authors first discuss the concepts 'climate' and 'climate change detection', outlining the difficulties of the latter in terms of the properties of the former. In more detail they then discuss the analysis and detection, carried out at the Danish Climate Centre, of anthropogenic climate change and of non-anthropogenic changes. Regarding anthropogenic climate change, the emphasis is on the improvement of global and regional climate models and the reconstruction of past climates; regarding non-anthropogenic changes, the authors describe two case studies of potential solar influence on climate. (LN)

  19. Acoustic analysis assessment in speech pathology detection

    Directory of Open Access Journals (Sweden)

    Panek Daria

    2015-09-01

    Automatic detection of voice pathologies enables non-invasive, low cost and objective assessments of the presence of disorders, as well as accelerating and improving the process of diagnosis and clinical treatment given to patients. In this work, a vector made up of 28 acoustic parameters is evaluated using principal component analysis (PCA), kernel principal component analysis (kPCA) and an auto-associative neural network (NLPCA) in four kinds of pathology detection (hyperfunctional dysphonia, functional dysphonia, laryngitis, vocal cord paralysis) using the a, i and u vowels, spoken at a high, low and normal pitch. The results indicate that the kPCA and NLPCA methods can be considered a step towards pathology detection of the vocal folds. The results show that such an approach provides acceptable results for this purpose, with the best efficiency levels of around 100%. The study brings the most commonly used approaches to speech signal processing together and leads to a comparison of the machine learning methods for determining the health status of the patient.

  20. EXOPLANETARY DETECTION BY MULTIFRACTAL SPECTRAL ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Agarwal, Sahil; Wettlaufer, John S. [Program in Applied Mathematics, Yale University, New Haven, CT (United States); Sordo, Fabio Del [Department of Astronomy, Yale University, New Haven, CT (United States)

    2017-01-01

    Owing to technological advances, the number of exoplanets discovered has risen dramatically in the last few years. However, when trying to observe Earth analogs, it is often difficult to test the veracity of detection. We have developed a new approach to the analysis of exoplanetary spectral observations based on temporal multifractality, which identifies timescales that characterize planetary orbital motion around the host star and those that arise from stellar features such as spots. Without fitting stellar models to spectral data, we show how the planetary signal can be robustly detected from noisy data using noise amplitude as a source of information. For observation of transiting planets, combining this method with simple geometry allows us to relate the timescales obtained to primary and secondary eclipse of the exoplanets. Making use of data obtained with ground-based and space-based observations we have tested our approach on HD 189733b. Moreover, we have investigated the use of this technique in measuring planetary orbital motion via Doppler shift detection. Finally, we have analyzed synthetic spectra obtained using the SOAP 2.0 tool, which simulates a stellar spectrum and the influence of the presence of a planet or a spot on that spectrum over one orbital period. We have demonstrated that, so long as the signal-to-noise-ratio ≥ 75, our approach reconstructs the planetary orbital period, as well as the rotation period of a spot on the stellar surface.

  1. Detection of irradiated spices by thermoluminescence analysis

    International Nuclear Information System (INIS)

    Hammerton, K.M.; Banos, C.

    1996-01-01

    Spices are used extensively in prepared foods. The high levels of contamination of many spices with microorganisms poses a problem for the food industry. Irradiation treatment is the most effective means of reducing the microbial load to safe levels. Although the process is currently subject to a moratorium in Australia, it is used in several countries for the decontamination of spices. Methods for detecting irradiation treatment of spices are necessary to enforce compliance with labelling requirements or with a prohibition on the sale of irradiated foods. Thermoluminescence (TL) analysis of spice samples has been shown to be an applicable method for the detection of all irradiated spices. It was established that the TL response originates from the adhering mineral dust in the sample. Definitive identification of many irradiated spices requires the separation of a mineral extract from the organic fraction of the spice sample. This separation can be achieved by using density centrifugation with a heavy liquid, sodium polytungstate. Clear discrimination between untreated and irradiated spice samples has been obtained by re-irradiation of the mineral extract after the first TL analysis with an absorbed dose of about 1 kGy (normalisation). The ratio of the first to second TL response was about one for irradiated samples and well below one for untreated samples. These methods have been investigated with a range of spices to establish the most suitable method for routine control purposes. (author)
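
    A tiny worked example of the normalisation ratio described above (first TL response divided by the response after re-irradiation); the glow intensities and the decision cutoff are hypothetical illustrative values.

```python
def classify_spice(tl1, tl2, cutoff=0.5):
    """Glow-ratio rule: TL1/TL2 near one suggests prior irradiation,
    well below one suggests an untreated sample. The cutoff is illustrative."""
    ratio = tl1 / tl2
    return ratio, ("irradiated" if ratio > cutoff else "untreated")

print(classify_spice(tl1=9.2e4, tl2=1.0e5))   # ratio ~0.9  -> irradiated
print(classify_spice(tl1=2.0e3, tl2=1.0e5))   # ratio ~0.02 -> untreated
```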

  2. Chemical detection, identification, and analysis system

    International Nuclear Information System (INIS)

    Morel, R.S.; Gonzales, D.; Mniszewski, S.

    1990-01-01

    The chemical detection, identification, and analysis system (CDIAS) has three major goals. The first is to display safety information regarding chemical environment before personnel entry. The second is to archive personnel exposure to the environment. Third, the system assists users in identifying the stage of a chemical process in progress and suggests safety precautions associated with that process. In addition to these major goals, the system must be sufficiently compact to provide transportability, and it must be extremely simple to use in order to keep user interaction at a minimum. The system created to meet these goals includes several pieces of hardware and the integration of four software packages. The hardware consists of a low-oxygen, carbon monoxide, explosives, and hydrogen sulfide detector; an ion mobility spectrometer for airborne vapor detection; and a COMPAQ 386/20 portable computer. The software modules are a graphics kernel, an expert system shell, a data-base management system, and an interface management system. A supervisory module developed using the interface management system coordinates the interaction of the other software components. The system determines the safety of the environment using conventional data acquisition and analysis techniques. The low-oxygen, carbon monoxide, hydrogen sulfide, explosives, and vapor detectors are monitored for hazardous levels, and warnings are issued accordingly

  3. Detection and analysis of CRISPRs of Shigella.

    Science.gov (United States)

    Guo, Xiangjiao; Wang, Yingfang; Duan, Guangcai; Xue, Zerun; Wang, Linlin; Wang, Pengfei; Qiu, Shaofu; Xi, Yuanlin; Yang, Haiyan

    2015-01-01

    The recently discovered CRISPRs (Clustered regularly interspaced short palindromic repeats) and Cas (CRISPR-associated) proteins are a novel genetic barrier that limits horizontal gene transfer in prokaryotes, and the CRISPR loci provide a historical view of the exposure of prokaryotes to a variety of foreign genetic elements. The aim of this study was to investigate the occurrence and distribution of the CRISPRs in Shigella. A collection of 61 strains of Shigella was screened for the existence of CRISPRs. Three CRISPR loci were identified among the 61 Shigella strains. CRISPR1/cas loci are detected in 49 strains of Shigella. Yet, IS elements were detected in the cas genes of some strains. In the remaining 12 Shigella flexneri strains, the CRISPR1/cas locus is deleted and only a cas3' pseudo gene and a repeat sequence are present. The presence of CRISPR2 is frequently accompanied by the emergence of CRISPR1. CRISPR3 loci were present in almost all strains (52/61). The length of CRISPR arrays varied from 1 to 9 spacers. Sequence analysis of the CRISPR arrays revealed that few spacers had matches in the GenBank databases. However, one spacer in the CRISPR3 loci matches the cognate cas3 genes, and no cas gene was present around the CRISPR3 region. Analysis of the CRISPR sequences shows that the CRISPRs have changed little, which makes CRISPRs poor genotyping markers. The present study is the first attempt to determine and analyze CRISPRs of Shigella isolated from clinical patients.

  4. Elastic recoil detection analysis of ferroelectric films

    Energy Technology Data Exchange (ETDEWEB)

    Stannard, W.B.; Johnston, P.N.; Walker, S.R.; Bubb, I.F. [Royal Melbourne Inst. of Tech., VIC (Australia); Scott, J.F. [New South Wales Univ., Kensington, NSW (Australia); Cohen, D.D.; Dytlewski, N. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1996-12-31

    There has been considerable progress in developing SrBi2Ta2O9 (SBT) and Ba0.7Sr0.3TiO3 (BST) ferroelectric films for use as nonvolatile memory chips and for capacitors in dynamic random access memories (DRAMs). Ferroelectric materials have a very large dielectric constant (~1000), approximately one hundred times greater than that of silicon dioxide. Devices made from these materials have been known to experience breakdown after repeated voltage pulsing. It has been suggested that this is related to stoichiometric changes within the material. To accurately characterise these materials, Elastic Recoil Detection Analysis (ERDA) is being developed. This technique employs a high energy heavy ion beam to eject nuclei from the target and uses a time of flight and energy dispersive (ToF-E) detector telescope to detect these nuclei. The recoil nuclei carry both energy and mass information, which enables the determination of separate energy spectra for individual elements or for small groups of elements. In this work ERDA employing 77 MeV 127I ions has been used to analyse Strontium Bismuth Tantalate thin films at the heavy ion recoil facility at ANSTO, Lucas Heights. 9 refs., 5 figs.

  5. Elastic recoil detection analysis of ferroelectric films

    Energy Technology Data Exchange (ETDEWEB)

    Stannard, W B; Johnston, P N; Walker, S R; Bubb, I F [Royal Melbourne Inst. of Tech., VIC (Australia); Scott, J F [New South Wales Univ., Kensington, NSW (Australia); Cohen, D D; Dytlewski, N [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1997-12-31

    There has been considerable progress in developing SrBi2Ta2O9 (SBT) and Ba0.7Sr0.3TiO3 (BST) ferroelectric films for use as nonvolatile memory chips and for capacitors in dynamic random access memories (DRAMs). Ferroelectric materials have a very large dielectric constant (~1000), approximately one hundred times greater than that of silicon dioxide. Devices made from these materials have been known to experience breakdown after repeated voltage pulsing. It has been suggested that this is related to stoichiometric changes within the material. To accurately characterise these materials, Elastic Recoil Detection Analysis (ERDA) is being developed. This technique employs a high energy heavy ion beam to eject nuclei from the target and uses a time of flight and energy dispersive (ToF-E) detector telescope to detect these nuclei. The recoil nuclei carry both energy and mass information, which enables the determination of separate energy spectra for individual elements or for small groups of elements. In this work ERDA employing 77 MeV 127I ions has been used to analyse Strontium Bismuth Tantalate thin films at the heavy ion recoil facility at ANSTO, Lucas Heights. 9 refs., 5 figs.

  6. Advances in face detection and facial image analysis

    CERN Document Server

    Celebi, M; Smolka, Bogdan

    2016-01-01

    This book presents the state-of-the-art in face detection and analysis. It outlines new research directions, including in particular psychology-based facial dynamics recognition, aimed at various applications such as behavior analysis, deception detection, and diagnosis of various psychological disorders. Topics of interest include face and facial landmark detection, face recognition, facial expression and emotion analysis, facial dynamics analysis, face classification, identification, and clustering, and gaze direction and head pose estimation, as well as applications of face analysis.

  7. Computerized Analysis and Detection of Missed Cancer in Screening Mammogram

    National Research Council Canada - National Science Library

    Li, Lihua

    2005-01-01

    This project is to explore an innovative CAD strategy for improving early detection of breast cancer in screening mammograms by focusing on computerized analysis and detection of cancers missed by radiologists...

  8. Computerized Analysis and Detection of Missed Cancer in Screening Mammogram

    National Research Council Canada - National Science Library

    Li, Lihua

    2004-01-01

    This project is to explore an innovative CAD strategy for improving early detection of breast cancer in screening mammograms by focusing on computerized analysis and detection of cancers missed by radiologists...

  9. Trojan detection model based on network behavior analysis

    International Nuclear Information System (INIS)

    Liu Junrong; Liu Baoxu; Wang Wenjin

    2012-01-01

    Based on an analysis of existing Trojan detection technology, this paper presents a Trojan detection model based on network behavior analysis. First, we abstract a description of Trojan network behavior; then, according to certain rules, we establish a characteristic behavior library; finally, we use a support vector machine algorithm to determine whether a Trojan intrusion has occurred. Intrusion detection experiments show that this model can effectively detect Trojans. (authors)
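
    A rough sketch of the final classification step, assuming network behaviour features have already been abstracted into numeric vectors; the features, labels and scikit-learn SVM settings are placeholders, not the paper's behaviour library.

```python
import numpy as np
from sklearn.svm import SVC

# Placeholder behaviour vectors (e.g. connection rate, upload/download ratio, ...).
rng = np.random.default_rng(3)
benign = rng.normal(0.0, 1.0, size=(500, 6))
trojan = rng.normal(2.0, 1.0, size=(50, 6))
X = np.vstack([benign, trojan])
y = np.r_[np.zeros(500), np.ones(50)]

model = SVC(kernel="rbf", class_weight="balanced").fit(X, y)
new_flows = rng.normal(2.0, 1.0, size=(3, 6))   # behaviour of a suspicious host
print("Trojan suspected:", model.predict(new_flows).astype(bool))
```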

  10. Sequential Analysis: Hypothesis Testing and Changepoint Detection

    Science.gov (United States)

    2014-07-11

    maintains the flexibility of deciding sooner than the fixed sample size procedure at the price of some lower power [13, 514]. The sequential probability... markets , detection of signals with unknown arrival time in seismology, navigation, radar and sonar signal processing, speech segmentation, and the... skimming cruise missile can yield a significant increase in the probability of raid annihilation. Furthermore, usually detection systems are

  11. Abnormal traffic flow data detection based on wavelet analysis

    Directory of Open Access Journals (Sweden)

    Xiao Qian

    2016-01-01

    Because traffic flow data are non-stationary, abnormal data are difficult to detect. This paper proposes an abnormal traffic flow data detection method based on wavelet analysis and the least squares method. Wavelet analysis is first used to separate the traffic flow data into high-frequency and low-frequency components; the least squares method is then applied to find abnormal points in the reconstructed signal data. Simulation results show that detecting abnormal traffic flow data with wavelet analysis effectively reduces both the misjudgment rate and the false negative rate of the detection results.

  12. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, Markov Random Field (MRF) modeling, watershed segmentation and merging techniques was presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmented result is obtained based on the K-means clustering technique and the minimum distance. The region process is then modeled by an MRF to obtain an image that contains different intensity regions. The gradient values are calculated and the watershed technique is then used. The DIS calculation is performed for each pixel to define all the edges (weak or strong) in the image, and the DIS map is obtained. This serves as prior knowledge about the likely region segmentation for the next step (MRF), which gives an image that carries all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the MRF-segmented image are used and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.
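
    A small sketch of the first step, computing an edge-strength map from Sobel gradients as a stand-in for the DIS map, on a synthetic test image; the real pipeline continues with K-means clustering, MRF modeling and watershed segmentation.

```python
import numpy as np
from scipy import ndimage

# Synthetic test image: bright square on a dark background, plus noise.
image = np.zeros((128, 128))
image[32:96, 32:96] = 1.0
image += np.random.default_rng(5).normal(0.0, 0.05, image.shape)

# Edge strength per pixel from Sobel gradients (stand-in for the DIS map).
gx = ndimage.sobel(image, axis=1)
gy = ndimage.sobel(image, axis=0)
edge_strength = np.hypot(gx, gy)

strong_edges = edge_strength > 0.5 * edge_strength.max()
print("strong edge pixels:", int(strong_edges.sum()))
```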

  13. Website Detection Using Remote Traffic Analysis

    OpenAIRE

    Gong, Xun; Kiyavash, Negar; Schear, Nabíl; Borisov, Nikita

    2011-01-01

    Recent work in traffic analysis has shown that traffic patterns leaked through side channels can be used to recover important semantic information. For instance, attackers can find out which website, or which page on a website, a user is accessing simply by monitoring the packet size distribution. We show that traffic analysis is even a greater threat to privacy than previously thought by introducing a new attack that can be carried out remotely. In particular, we show that, to perform traffi...

  14. Analysis of Exhaled Breath for Disease Detection

    Science.gov (United States)

    Amann, Anton; Miekisch, Wolfram; Schubert, Jochen; Buszewski, Bogusław; Ligor, Tomasz; Jezierski, Tadeusz; Pleil, Joachim; Risby, Terence

    2014-06-01

    Breath analysis is a young field of research with great clinical potential. As a result of this interest, researchers have developed new analytical techniques that permit real-time analysis of exhaled breath with breath-to-breath resolution in addition to the conventional central laboratory methods using gas chromatography-mass spectrometry. Breath tests are based on endogenously produced volatiles, metabolites of ingested precursors, metabolites produced by bacteria in the gut or the airways, or volatiles appearing after environmental exposure. The composition of exhaled breath may contain valuable information for patients presenting with asthma, renal and liver diseases, lung cancer, chronic obstructive pulmonary disease, inflammatory lung disease, or metabolic disorders. In addition, oxidative stress status may be monitored via volatile products of lipid peroxidation. Measurement of enzyme activity provides phenotypic information important in personalized medicine, whereas breath measurements provide insight into perturbations of the human exposome and can be interpreted as preclinical signals of adverse outcome pathways.

  15. Kernel principal component analysis for change detection

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Morton, J.C.

    2008-01-01

    region acquired at two different time points. If change over time does not dominate the scene, the projection of the original two bands onto the second eigenvector will show change over time. In this paper a kernel version of PCA is used to carry out the analysis. Unlike ordinary PCA, kernel PCA...... with a Gaussian kernel successfully finds the change observations in a case where nonlinearities are introduced artificially....
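
    A hedged sketch of the kernel PCA idea on synthetic data: pixel pairs from two co-registered acquisitions are projected with an RBF (Gaussian) kernel, and a minor component is used as a change indicator; the data, kernel width and thresholding are assumptions for illustration only.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

# Synthetic co-registered pixel values at two times; a small region changes.
rng = np.random.default_rng(6)
band_t1 = rng.normal(0.5, 0.1, size=2000)
band_t2 = band_t1 + rng.normal(0.0, 0.02, size=2000)
band_t2[:100] += 0.4                                   # changed pixels

X = np.column_stack([band_t1, band_t2])
kpca = KernelPCA(n_components=2, kernel="rbf", gamma=15.0)
scores = kpca.fit_transform(X)

change_score = np.abs(scores[:, 1])                    # minor component as change indicator
threshold = np.percentile(change_score, 95)
print("pixels flagged as change:", int((change_score > threshold).sum()))
```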

  16. A statistical analysis on the leak detection performance of ...

    Indian Academy of Sciences (India)

    Chinedu Duru

    2017-11-09

    Nov 9, 2017 ... of underground and overground pipelines with wireless sensor networks through the .... detection performance analysis of pipeline leakage. This study and ..... case and apply to all materials transported through the pipeline.

  17. Design and implementation of network attack analysis and detect system

    International Nuclear Information System (INIS)

    Lu Zhigang; Wu Huan; Liu Baoxu

    2007-01-01

    This paper first analyzes the present research state of IDS (intrusion detection system), classifies and compares existing methods. According to the problems existing in IDS, such as false-positives, false-negatives and low information visualization, this paper suggests a system named NAADS which supports multi data sources. Through a series of methods such as clustering analysis, association analysis and visualization, rate of detection and usability of NAADS are increased. (authors)

  18. Social Media Sentiment Analysis and Topic Detection for Singapore English

    Science.gov (United States)

    2013-09-01

    ... have been made possible via social-media applications. Sentiment analysis and topic detection are two growing areas in Natural Language Processing, and there are increasing

  19. Optimizing detection and analysis of slow waves in sleep EEG.

    Science.gov (United States)

    Mensen, Armand; Riedner, Brady; Tononi, Giulio

    2016-12-01

    Analysis of individual slow waves in EEG recordings during sleep provides both greater sensitivity and specificity compared to spectral power measures. However, parameters for detection and analysis have not been widely explored and validated. We present a new, open-source, Matlab-based toolbox for the automatic detection and analysis of slow waves, with adjustable parameter settings, as well as manual correction and exploration of the results using a multi-faceted visualization tool. We explore a large search space of parameter settings for slow wave detection and measure their effects on a selection of outcome parameters. Every choice of parameter setting had some effect on at least one outcome parameter. In general, the largest effect sizes were found when choosing the EEG reference, type of canonical waveform, and amplitude thresholding. Previously published methods accurately detect large, global waves but are conservative and miss the detection of smaller amplitude, local slow waves. The toolbox has additional benefits in terms of speed, user-interface, and visualization options to compare and contrast slow waves. The exploration of parameter settings in the toolbox highlights the importance of careful selection of detection methods. The sensitivity and specificity of the automated detection can be improved by manually adding or deleting entire waves and/or specific channels using the toolbox visualization functions. The toolbox standardizes the detection procedure, sets the stage for reliable results and comparisons and is easy to use without previous programming experience. Copyright © 2016 Elsevier B.V. All rights reserved.
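
    A simplified sketch of amplitude-threshold slow-wave detection on synthetic data: band-pass the EEG in the delta range and flag negative half-waves crossing a threshold; the filter band and the -40 uV threshold are illustrative choices, not the toolbox defaults.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 256                                        # sampling rate (Hz)
rng = np.random.default_rng(7)
t = np.arange(0, 30, 1 / fs)                    # 30 s of synthetic "sleep EEG" (in uV)
eeg = 60 * np.sin(2 * np.pi * 0.8 * t) + rng.normal(0, 15, t.size)

# Band-pass in the delta range, then flag negative half-waves below -40 uV.
b, a = butter(3, [0.5, 4.0], btype="bandpass", fs=fs)
delta = filtfilt(b, a, eeg)

below = delta < -40.0
onsets = np.where(np.diff(below.astype(int)) == 1)[0]
print(f"detected {onsets.size} candidate slow waves in 30 s")
```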

  20. Gold Nanoparticles-Based Barcode Analysis for Detection of Norepinephrine.

    Science.gov (United States)

    An, Jeung Hee; Lee, Kwon-Jai; Choi, Jeong-Woo

    2016-02-01

    Nanotechnology-based bio-barcode amplification analysis offers an innovative approach for detecting neurotransmitters. We evaluated the efficacy of this method for detecting norepinephrine in normal and oxidative-stress damaged dopaminergic cells. Our approach uses a combination of DNA barcodes and bead-based immunoassays for detecting neurotransmitters with surface-enhanced Raman spectroscopy (SERS), and provides polymerase chain reaction (PCR)-like sensitivity. This method relies on magnetic Dynabeads containing antibodies and nanoparticles that are loaded both with DNA barcodes and with antibodies that can sandwich the target protein captured by the Dynabead-bound antibodies. The aggregate sandwich structures are magnetically separated from the solution and treated to remove the conjugated barcode DNA. The DNA barcodes are then identified by SERS and PCR analysis. The concentration of norepinephrine in dopaminergic cells can be readily detected using the bio-barcode assay, which is a rapid, high-throughput screening tool for detecting neurotransmitters.

  1. Data analysis of inertial sensor for train positioning detection system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seong Jin; Park, Sung Soo; Lee, Jae Ho; Kang, Dong Hoon [Korea Railroad Research Institute, Uiwang (Korea, Republic of)

    2015-02-15

    Train positioning detection information is fundamental for high-speed railroad inspection, making it possible to simultaneously determine the status and evaluate the integrity of railroad equipment. This paper presents the results of measurements and an analysis of an inertial measurement unit (IMU) used as a positioning detection sensor. Acceleration and angular rate measurements from the IMU were analyzed in the amplitude and frequency domains, with a discussion on vibration and train motions. Using these results and GPS information, the positioning detection of a Korean tilting train express was performed from Naju station to Illo station on the Honam-line. The results of a synchronized analysis of sensor measurements and train motion can help in the design of a train location detection system and improve the positioning detection performance.

  2. Digital Printing Quality Detection and Analysis Technology Based on CCD

    Science.gov (United States)

    He, Ming; Zheng, Liping

    2017-12-01

    With the help of CCD-based digital printing quality detection and analysis technology, rapid evaluation and objective detection of printing quality can be carried out, and a degree of control over printing quality can be exercised. The rational application of CCD digital printing quality testing and analysis technology therefore plays a very positive role in improving the quality of digital printing on a variety of printing equipment and materials. In this paper, we present an in-depth study and discussion based on CCD digital print quality testing and analysis technology.

  3. Detecting fire in video stream using statistical analysis

    Directory of Open Access Journals (Sweden)

    Koplík Karel

    2017-01-01

    Real-time fire detection in a video stream is one of the most interesting problems in computer vision. In most cases it would be desirable to have a fire detection algorithm implemented in ordinary industrial cameras and/or to be able to replace standard industrial cameras with ones implementing the fire detection algorithm. In this paper, we present a new algorithm for detecting fire in video. The algorithm is based on tracking suspicious regions in time with statistical analysis of their trajectories. False alarms are minimized by combining multiple detection criteria: pixel brightness, trajectories of suspicious regions for evaluating characteristic fire flickering, and persistence of the alarm state over a sequence of frames. The resulting implementation is fast and can therefore run on a wide range of affordable hardware.

  4. Steam leak detection method in pipeline using histogram analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Se Oh; Jeon, Hyeong Seop; Son, Ki Sung; Chae, Gyung Sun [Saean Engineering Corp, Seoul (Korea, Republic of); Park, Jong Won [Dept. of Information Communications Engineering, Chungnam NationalUnversity, Daejeon (Korea, Republic of)

    2015-10-15

    Leak detection in a pipeline usually involves acoustic emission sensors such as contact type sensors. These contact type sensors pose difficulties for installation and cannot operate in areas having high temperature and radiation. Therefore, recently, many researchers have studied the leak detection phenomenon by using a camera. Leak detection by using a camera has the advantages of long distance monitoring and wide area surveillance. However, the conventional leak detection method by using difference images often mistakes the vibration of a structure for a leak. In this paper, we propose a method for steam leakage detection by using the moving average of difference images and histogram analysis. The proposed method can separate the leakage and the vibration of a structure. The working performance of the proposed method is verified by comparing with experimental results.
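
    An illustrative sketch (not the paper's exact criterion) of the moving average of difference images followed by histogram analysis: a persistent wide-area change suggests a leak, whereas structural vibration largely cancels in the average; the synthetic frames and thresholds are assumptions.

```python
import numpy as np

# Synthetic video block: 31 frames with a slowly growing "plume" region.
rng = np.random.default_rng(8)
frames = rng.normal(0.0, 0.5, size=(31, 120, 160))
frames[:, 40:80, 60:100] += np.linspace(0.0, 60.0, 31)[:, None, None]

diffs = np.abs(np.diff(frames, axis=0))   # difference images
avg_diff = diffs.mean(axis=0)             # moving average over the block

# Histogram analysis: a leak leaves a persistent, wide-area change.
hist, edges = np.histogram(avg_diff, bins=32)
high_tail = avg_diff > edges[len(edges) // 2]
print("leak suspected:", bool(high_tail.mean() > 0.05))
```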

  5. Human detection and motion analysis at security points

    Science.gov (United States)

    Ozer, I. Burak; Lv, Tiehan; Wolf, Wayne H.

    2003-08-01

    This paper presents a real-time video surveillance system for the recognition of specific human activities. Specifically, the proposed automatic motion analysis is used as an on-line alarm system to detect abnormal situations in a campus environment. A smart multi-camera system developed at Princeton University is extended for use in smart environments in which the camera detects the presence of multiple persons as well as their gestures and their interaction in real-time.

  6. Integrated polymer waveguides for absorbance detection in chemical analysis systems

    DEFF Research Database (Denmark)

    Mogensen, Klaus Bo; El-Ali, Jamil; Wolff, Anders

    2003-01-01

    A chemical analysis system for absorbance detection with integrated polymer waveguides is reported for the first time. The fabrication procedure relies on structuring of a single layer of the photoresist SU-8, so both the microfluidic channel network and the optical components, which include planar....... The emphasis of this paper is on the signal-to-noise ratio of the detection and its relation to the sensitivity. Two absorbance cells with an optical path length of 100 μm and 1000 μm were characterized and compared in terms of sensitivity, limit of detection and effective path length for measurements...

  7. Cascaded image analysis for dynamic crack detection in material testing

    Science.gov (United States)

    Hampel, U.; Maas, H.-G.

    Concrete probes in civil engineering material testing often show fissures or hairline-cracks. These cracks develop dynamically. Starting at a width of a few microns, they usually cannot be detected visually or in an image of a camera imaging the whole probe. Conventional image analysis techniques will detect fissures only if they show a width in the order of one pixel. To be able to detect and measure fissures with a width of a fraction of a pixel at an early stage of their development, a cascaded image analysis approach has been developed, implemented and tested. The basic idea of the approach is to detect discontinuities in dense surface deformation vector fields. These deformation vector fields between consecutive stereo image pairs, which are generated by cross correlation or least squares matching, show a precision in the order of 1/50 pixel. Hairline-cracks can be detected and measured by applying edge detection techniques such as a Sobel operator to the results of the image matching process. Cracks will show up as linear discontinuities in the deformation vector field and can be vectorized by edge chaining. In practical tests of the method, cracks with a width of 1/20 pixel could be detected, and their width could be determined at a precision of 1/50 pixel.
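
    A small sketch of the crack-localisation step, assuming the dense surface deformation field has already been computed by cross correlation or least squares matching (here it is synthesised); a Sobel operator applied to the displacement component highlights the linear discontinuity.

```python
import numpy as np
from scipy import ndimage

# Synthetic horizontal displacement field (in pixels) with a hairline discontinuity:
# the right half of the probe surface has moved by ~1/16 pixel.
ux = np.zeros((200, 200))
ux[:, 100:] = 0.06
ux += np.random.default_rng(9).normal(0.0, 0.005, ux.shape)   # matching noise

# A Sobel operator on the displacement field highlights the linear discontinuity.
discontinuity = np.abs(ndimage.sobel(ux, axis=1))
column_response = discontinuity.mean(axis=0)
crack_columns = np.where(column_response > 5 * column_response.mean())[0]
print("crack located near column(s):", crack_columns)
```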

  8. Multicriteria Similarity-Based Anomaly Detection Using Pareto Depth Analysis.

    Science.gov (United States)

    Hsiao, Ko-Jen; Xu, Kevin S; Calder, Jeff; Hero, Alfred O

    2016-06-01

    We consider the problem of identifying patterns in a data set that exhibits anomalous behavior, often referred to as anomaly detection. Similarity-based anomaly detection algorithms detect abnormally large amounts of similarity or dissimilarity, e.g., as measured by the nearest neighbor Euclidean distances between a test sample and the training samples. In many application domains, there may not exist a single dissimilarity measure that captures all possible anomalous patterns. In such cases, multiple dissimilarity measures can be defined, including nonmetric measures, and one can test for anomalies by scalarizing using a nonnegative linear combination of them. If the relative importance of the different dissimilarity measures are not known in advance, as in many anomaly detection applications, the anomaly detection algorithm may need to be executed multiple times with different choices of weights in the linear combination. In this paper, we propose a method for similarity-based anomaly detection using a novel multicriteria dissimilarity measure, the Pareto depth. The proposed Pareto depth analysis (PDA) anomaly detection algorithm uses the concept of Pareto optimality to detect anomalies under multiple criteria without having to run an algorithm multiple times with different choices of weights. The proposed PDA approach is provably better than using linear combinations of the criteria, and shows superior performance on experiments with synthetic and real data sets.
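
    A toy illustration of the Pareto-optimality idea on synthetic data (not the full PDA algorithm with dyads): each sample receives two nearest-neighbour dissimilarities, and samples dominated by few others in both criteria at once are treated as anomaly candidates; the data and criteria are invented for the example.

```python
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(10)
X = rng.normal(0.0, 1.0, size=(300, 5))
X[-3:] *= 6.0                                   # three injected anomalies

# Two dissimilarity criteria per sample: Euclidean and cosine nearest-neighbour distance.
d_euc = cdist(X, X, metric="euclidean")
d_cos = cdist(X, X, metric="cosine")
np.fill_diagonal(d_euc, np.inf)
np.fill_diagonal(d_cos, np.inf)
crit = np.column_stack([d_euc.min(axis=1), d_cos.min(axis=1)])

# A sample dominated by few others (larger in both criteria) is a Pareto-style outlier.
dominated_by = np.array([np.sum(np.all(crit > crit[i], axis=1)) for i in range(len(crit))])
print("median dominator count:", int(np.median(dominated_by)))
print("dominator counts of the injected anomalies:", dominated_by[-3:])
```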

  9. TB case detection in Tajikistan – analysis of existing obstacles

    Directory of Open Access Journals (Sweden)

    Alexei Korobitsyn

    2013-10-01

    Background: Tajikistan National TB Control Program. Objective: (1) To identify the main obstacles to increasing TB detection in Tajikistan. (2) To identify interventions that improve TB detection. Methods: Review of the available original research data, health normative base, health systems performance and national economic data, following the WHO framework for detection of TB cases, which is based on three scenarios of why incident cases of TB may not be notified. Results: Data analysis revealed that some aspects of TB case detection are more problematic than others and that there are gaps in the knowledge of specific obstacles to TB case detection. The phenomenon of “initial default” in Tajikistan has been documented; however, it needs to be studied further. The laboratory services detect infectious TB cases effectively; however, referrals of appropriate suspects for TB diagnosis may lag behind. The knowledge about TB in the general population has improved. Yet, the problem of TB-related stigma persists, thus being an obstacle to effective TB detection. The high economic cost of health services, driven by under-the-table payments, was identified as another barrier to access to health services. Conclusion: Health system strengthening should become a primary intervention to improve case detection in Tajikistan. More research is needed on the reasons contributing to the failure to register TB cases, as well as on the factors underlying stigma.

  10. Microstructuring of piezoresistive cantilevers for gas detection and analysis

    International Nuclear Information System (INIS)

    Sarov, Y.; Sarova, V.; Bitterlich, Ch.; Richter, O.; Guliyev, E.; Zoellner, J.-P.; Rangelow, I. W.; Andok, R.; Bencurova, A.

    2011-01-01

    In this work we report on the design and fabrication of cantilevers for gas detection and analysis. The cantilevers have an expanded area of interaction with the gas, while the signal transduction is realized by an integrated piezoresistive deflection sensor placed at the narrowed cantilever base, where the stress along the cantilever is highest. Moreover, the cantilevers have an integrated bimorph micro-actuator for detection in static and dynamic modes. The cantilevers are feasible as pressure, temperature and flow sensors and, under chemical functionalization, for gas recognition, tracing and composition analysis. (authors)

  11. Analysis of Android Device-Based Solutions for Fall Detection

    Directory of Open Access Journals (Sweden)

    Eduardo Casilari

    2015-07-01

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation on automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions.

  12. Analysis of Android Device-Based Solutions for Fall Detection.

    Science.gov (United States)

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-07-23

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation on automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions.

  13. Analysis of Android Device-Based Solutions for Fall Detection

    Science.gov (United States)

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-01-01

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation on automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions. PMID:26213928

  14. Guided Wave Delamination Detection and Quantification With Wavefield Data Analysis

    Science.gov (United States)

    Tian, Zhenhua; Campbell Leckey, Cara A.; Seebo, Jeffrey P.; Yu, Lingyu

    2014-01-01

    Unexpected damage can occur in aerospace composites due to impact events or material stress during off-nominal loading events. In particular, laminated composites are susceptible to delamination damage due to weak transverse tensile and inter-laminar shear strengths. The development of reliable and quantitative techniques to detect delamination damage in laminated composites is imperative for safe and functional optimally-designed next-generation composite structures. In this paper, we investigate guided wave interactions with delamination damage and develop quantification algorithms by using wavefield data analysis. The trapped guided waves in the delamination region are observed from the wavefield data and further quantitatively interpreted by using different wavenumber analysis methods. The frequency-wavenumber representation of the wavefield shows that new wavenumbers are present and correlate to trapped waves in the damage region. These new wavenumbers are used to detect and quantify the delamination damage through the wavenumber analysis, which can show how the wavenumber changes as a function of wave propagation distance. The location and spatial duration of the new wavenumbers can be identified, providing a useful means not only for detecting the presence of delamination damage but also allowing for estimation of the delamination size. Our method has been applied to detect and quantify real delamination damage with complex geometry (grown using a quasi-static indentation technique). The detection and quantification results show the location, size, and shape of the delamination damage.
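
    The frequency-wavenumber analysis described above can be illustrated with a short, hedged sketch. The block below is a minimal example with assumed array names, dimensions and sampling steps (none of these values come from the paper): it applies a 2D Fourier transform to a space-time wavefield slice to obtain its frequency-wavenumber representation, in which trapped-wave energy from a delamination would appear as additional wavenumber components.

        import numpy as np

        # Hypothetical wavefield slice u(x, t): rows = spatial scan points, cols = time samples.
        nx, nt = 256, 1024            # assumed scan length and record length
        dx, dt = 1.0e-3, 1.0e-7       # assumed spatial step [m] and time step [s]
        u = np.random.randn(nx, nt)   # placeholder for measured wavefield data

        # 2D FFT: space -> wavenumber k, time -> frequency f.
        U = np.fft.fftshift(np.fft.fft2(u))
        k = np.fft.fftshift(np.fft.fftfreq(nx, d=dx))   # cycles per metre
        f = np.fft.fftshift(np.fft.fftfreq(nt, d=dt))   # Hz

        # Magnitude of the frequency-wavenumber spectrum; extra ridges at wavenumbers
        # off the baseline dispersion curves would indicate waves trapped in a
        # delaminated region.
        spectrum = np.abs(U)
        fi = np.argmin(np.abs(f - 200e3))               # inspect an assumed 200 kHz line
        print("wavenumber with peak energy at 200 kHz:", k[np.argmax(spectrum[:, fi])])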

  15. Experimental investigation of thermal neutron analysis based landmine detection technology

    International Nuclear Information System (INIS)

    Zeng Jun; Chu Chengsheng; Ding Ge; Xiang Qingpei; Hao Fanhua; Luo Xiaobing

    2013-01-01

    Background: Recently, the prompt gamma-ray neutron activation analysis method has been widely used in coal analysis and explosive detection; however, there have been fewer applications of neutron methods to landmine detection, especially in domestic research. Purpose: To verify the feasibility of the Thermal Neutron Analysis (TNA) method for landmine detection and to explore the characteristics of this technology. Methods: An experimental TNA landmine detection system was built based on a LaBr3(Ce) fast scintillator detector and a 252Cf isotopic neutron source. The system comprises the thermal neutron transition system, the shielding system, and the detector system. Results: On the basis of the TNA, the wide-energy-area calibration method, particularly for the high-energy region, was investigated, and the minimum detection time for a typical mine was determined. In this study, the 72-type anti-tank mine, a 500 g TNT sample and several interfering objects were tested in loess, red soil, magnetic soil and sand respectively. Conclusions: The experimental results indicate that TNA is a reliable demining method, and it can be used to confirm the existence of Anti-Tank Mines (ATM) and large Anti-Personnel Mines (APM) in complicated conditions. (authors)

  16. Train integrity detection risk analysis based on PRISM

    Science.gov (United States)

    Wen, Yuan

    2018-04-01

    A GNSS-based Train Integrity Monitoring System (TIMS) is an effective and low-cost scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by external environments, such as the uncertainty of wireless communication channels, which may lead to failures of communication and positioning. In order to guarantee the reliability and safety of train operation, a risk analysis method for train integrity detection based on PRISM is proposed in this article. First, we analyze the risk factors (in the GNSS communication process and the on-board communication process) and model them. Then, we evaluate the performance of the model in PRISM based on field data. Finally, we discuss how these risk factors influence the train integrity detection process.

  17. Detection of charged particles through a photodiode: design and analysis

    International Nuclear Information System (INIS)

    Angoli, A.; Quirino, L.L.; Hernandez, V.M.; Lopez del R, H.; Mireles, F.; Davila, J.I.; Rios, C.; Pinedo, J.L.

    2006-01-01

    This project develops and constructs a charged-particle detector based on a PIN photodiode array. The design and analysis use a silicon PIN photodiode that is ordinarily employed to detect visible light; its good efficiency, compact size and low cost make it particularly suitable for radiation monitoring and alpha-particle detection. Both the design of the detection system and its characterization for alpha particles are presented, reporting the alpha energy resolution and the detection efficiency. The equipment used in this work consists of a triple alpha source composed of Am-241, Pu-239 and Cm-244 with a total activity of 5.55 kBq, the Maestro 32 software made by ORTEC, a Triumph multi-channel card from ORTEC, and a low-activity electroplated uranium sample. (Author)

  18. Live face detection based on the analysis of Fourier spectra

    Science.gov (United States)

    Li, Jiangwei; Wang, Yunhong; Tan, Tieniu; Jain, Anil K.

    2004-08-01

    Biometrics is a rapidly developing technology that is used to identify a person based on his or her physiological or behavioral characteristics. To ensure the correctness of authentication, the biometric system must be able to detect and reject the use of a copy of a biometric instead of the live biometric. This function is usually termed "liveness detection". This paper describes a new method for live face detection. Using structure and movement information of a live face, an effective live face detection algorithm is presented. Compared to existing approaches, which concentrate on the measurement of 3D depth information, this method is based on the analysis of Fourier spectra of a single face image or face image sequences. Experimental results show that the proposed method has an encouraging performance.
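
    The Fourier-spectrum idea above can be sketched in a few lines. The following is a hedged illustration, not the authors' algorithm: it assumes that a flat printed photograph of a face tends to contain relatively less high-frequency energy than a live face captured under the same conditions, and compares the high-frequency energy ratio of a single image against a threshold. The cutoff frequency, the threshold and the image are all illustrative assumptions.

        import numpy as np

        def high_frequency_ratio(face: np.ndarray, cutoff: float = 0.25) -> float:
            """Fraction of 2D spectral energy above `cutoff` (normalized radial frequency)."""
            spectrum = np.abs(np.fft.fftshift(np.fft.fft2(face))) ** 2
            h, w = spectrum.shape
            fy = np.fft.fftshift(np.fft.fftfreq(h))[:, None]
            fx = np.fft.fftshift(np.fft.fftfreq(w))[None, :]
            radius = np.sqrt(fx ** 2 + fy ** 2)      # 0 at DC, about 0.7 at the corners
            return float(spectrum[radius > cutoff].sum() / spectrum.sum())

        # Placeholder grayscale face crop; a real system would use the detected face region.
        face = np.random.rand(128, 128)
        print("live" if high_frequency_ratio(face) > 0.1 else "possible photo spoof")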

  19. Detection of Abnormal Events via Optical Flow Feature Analysis

    Directory of Open Access Journals (Sweden)

    Tian Wang

    2015-03-01

    Full Text Available In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of the optical flow orientation descriptor and the classification method. The details of the histogram of the optical flow orientation descriptor are illustrated for describing movement information of the global video frame or foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The different abnormal-detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm.
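
    A minimal sketch of two of the ingredients named in the abstract is given below: a magnitude-weighted histogram of optical-flow orientations as a per-frame descriptor, and a one-class SVM trained only on normal frames. The flow fields are random placeholders (a real system would use an optical-flow estimator), the bin count and SVM parameters are assumptions, and the kernel-PCA step mentioned in the abstract is omitted.

        import numpy as np
        from sklearn.svm import OneClassSVM

        def flow_orientation_histogram(fx, fy, bins=16):
            """Histogram of optical-flow orientations, weighted by flow magnitude."""
            angles = np.arctan2(fy, fx).ravel()
            weights = np.hypot(fx, fy).ravel()
            hist, _ = np.histogram(angles, bins=bins, range=(-np.pi, np.pi), weights=weights)
            total = hist.sum()
            return hist / total if total > 0 else hist

        # Descriptors from frames of normal behaviour (placeholder flow fields).
        normal_flows = [(np.random.randn(120, 160), np.random.randn(120, 160)) for _ in range(200)]
        X_train = np.array([flow_orientation_histogram(fx, fy) for fx, fy in normal_flows])

        # Learn the envelope of "normal" motion; nu is an assumed outlier fraction.
        model = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(X_train)

        fx_new, fy_new = np.random.randn(120, 160), np.random.randn(120, 160)
        label = model.predict([flow_orientation_histogram(fx_new, fy_new)])[0]
        print("abnormal event" if label == -1 else "normal frame")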

  20. Detection of Abnormal Events via Optical Flow Feature Analysis

    Science.gov (United States)

    Wang, Tian; Snoussi, Hichem

    2015-01-01

    In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of the optical flow orientation descriptor and the classification method. The details of the histogram of the optical flow orientation descriptor are illustrated for describing movement information of the global video frame or foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The different abnormal-detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227

  1. Elastic recoil detection analysis of hydrogen in polymers

    Energy Technology Data Exchange (ETDEWEB)

    Winzell, T R.H.; Whitlow, H J [Lund Univ. (Sweden); Bubb, I F; Short, R; Johnston, P N [Royal Melbourne Inst. of Tech., VIC (Australia)

    1997-12-31

    Elastic recoil detection analysis (ERDA) of hydrogen in thick polymeric films has been performed using 2.5 MeV He{sup 2+} ions from the tandem accelerator at the Royal Melbourne Institute of Technology. The technique enables the use of the same equipment as in Rutherford backscattering analysis, but instead of detecting the incident backscattered ion, the lighter recoiled ion is detected at a small forward angle. The purpose of this work is to investigate how selected polymers react when irradiated by helium ions. The polymers are to be evaluated for their suitability as reference standards for hydrogen depth profiling. Films investigated were Du Pont's Kapton and Mylar, and polystyrene. 11 refs., 3 figs.
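
    As a brief aside on the kinematics behind ERDA (a standard elastic-scattering relation, not a formula quoted from the record): an incident ion of mass M1 and energy E0 transfers to a recoiling target atom of mass M2, detected at angle phi, the energy E2 = E0 * 4*M1*M2*cos^2(phi) / (M1 + M2)^2. The sketch below evaluates this for 2.5 MeV He ions recoiling hydrogen; the 30 degree recoil angle is an illustrative assumption, not the geometry used in the measurements.

        import math

        def recoil_energy(e0_mev, m1, m2, phi_deg):
            """Elastic recoil energy E2 = E0 * 4*M1*M2*cos^2(phi) / (M1 + M2)^2."""
            phi = math.radians(phi_deg)
            return e0_mev * 4.0 * m1 * m2 * math.cos(phi) ** 2 / (m1 + m2) ** 2

        # 2.5 MeV He-4 (M1 = 4 u) recoiling H-1 (M2 = 1 u) at an assumed 30 degree recoil angle.
        print(f"H recoil energy: {recoil_energy(2.5, 4.0, 1.0, 30.0):.2f} MeV")   # about 1.2 MeV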

  2. Elastic recoil detection analysis of hydrogen in polymers

    Energy Technology Data Exchange (ETDEWEB)

    Winzell, T.R.H.; Whitlow, H.J. [Lund Univ. (Sweden); Bubb, I.F.; Short, R.; Johnston, P.N. [Royal Melbourne Inst. of Tech., VIC (Australia)

    1996-12-31

    Elastic recoil detection analysis (ERDA) of hydrogen in thick polymeric films has been performed using 2.5 MeV He{sup 2+} ions from the tandem accelerator at the Royal Melbourne Institute of Technology. The technique enables the use of the same equipment as in Rutherford backscattering analysis, but instead of detecting the incident backscattered ion, the lighter recoiled ion is detected at a small forward angle. The purpose of this work is to investigate how selected polymers react when irradiated by helium ions. The polymers are to be evaluated for their suitability as reference standards for hydrogen depth profiling. Films investigated were Du Pont's Kapton and Mylar, and polystyrene. 11 refs., 3 figs.

  3. Continuous Fraud Detection in Enterprise Systems through Audit Trail Analysis

    Directory of Open Access Journals (Sweden)

    Peter J. Best

    2009-03-01

    Full Text Available Enterprise systems, real time recording and real time reporting pose new and significant challenges to the accounting and auditing professions. This includes developing methods and tools for continuous assurance and fraud detection. In this paper we propose a methodology for continuous fraud detection that exploits security audit logs, changes in master records and accounting audit trails in enterprise systems. The steps in this process are: (1) threat monitoring, i.e. surveillance of security audit logs for ‘red flags’; (2) automated extraction and analysis of data from audit trails; and (3) using forensic investigation techniques to determine whether a fraud has actually occurred. We demonstrate how mySAP, an enterprise system, can be used for audit trail analysis in detecting financial frauds; afterwards we use a case study of a suspected fraud to illustrate how to implement the methodology.
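
    To make step (2) concrete, the sketch below shows one way automated audit-trail analysis could look. The table layouts, column names and the chosen red flag (the same user changing vendor bank details and posting a payment shortly afterwards) are illustrative assumptions for this sketch, not details taken from the mySAP case study.

        import pandas as pd

        # Hypothetical extract of a change-document audit trail and a payment log.
        changes = pd.DataFrame({
            "user": ["U1", "U2", "U1"],
            "object": ["VENDOR_BANK", "VENDOR_BANK", "PRICE"],
            "time": pd.to_datetime(["2024-01-03 09:00", "2024-01-05 11:30", "2024-01-07 10:00"]),
        })
        payments = pd.DataFrame({
            "user": ["U1", "U3"],
            "vendor": ["V100", "V200"],
            "time": pd.to_datetime(["2024-01-03 09:40", "2024-01-09 15:00"]),
        })

        # Red flag: the same user changes vendor bank details and posts a payment within 2 hours.
        flagged = changes[changes["object"] == "VENDOR_BANK"].merge(
            payments, on="user", suffixes=("_chg", "_pay"))
        flagged = flagged[(flagged["time_pay"] - flagged["time_chg"])
                          .between(pd.Timedelta(0), pd.Timedelta(hours=2))]
        print(flagged[["user", "vendor", "time_chg", "time_pay"]])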

  4. Detection of land mines using fast and thermal neutron analysis

    International Nuclear Information System (INIS)

    Bach, P.

    1998-01-01

    The detection of land mines is made possible by using a nuclear sensor based on neutron interrogation. Neutron interrogation makes it possible to detect the sensitive elements (C, H, O, N) of the explosives in land mines or in unexploded shells: the evaluation of the characteristic ratios N/O and C/O in a volume element gives a signature of high explosives. Fast neutron interrogation has been qualified in our laboratories as a powerful close-distance method for identifying the presence of a mine or explosive. This method could be implemented together with a multisensor detection system - for instance IR or microwave - to reduce the false alarm rate by addressing the suspected area. The principle of operation is based on the measurement of gamma rays induced by neutron interaction with irradiated nuclei from the soil and from a possible mine. The specific energies of these gamma rays allow the elements at the origin of the neutron interaction to be recognised. Several detection methods can be used, depending on the nuclei to be identified. Analysis of physical data, computations with simulation codes, and experiments performed in our laboratory have shown the interest of Fast Neutron Analysis (FNA) combined with Thermal Neutron Analysis (TNA) techniques, especially for the detection of nitrogen 14N, carbon 12C and oxygen 16O. The FNA technique can be implemented using a 14 MeV sealed neutron tube and a set of detectors. Mine detection has been demonstrated in our investigations, using a low-power neutron generator working in the 10^8 n/s range, which is reasonable when considering safety rules. A fieldable demonstrator would be made with a detection head including the tube and detectors, and with remote electronics, power supplies and computer installed in a vehicle. (author)
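
    The ratio-signature idea stated above can be illustrated with a toy decision rule: compare the N/O and C/O ratios inferred from the characteristic gamma lines of a volume element with a window expected for high explosives. The numeric windows below are illustrative assumptions only (explosives such as TNT are comparatively rich in nitrogen and oxygen), not thresholds from the paper.

        def looks_like_explosive(n_counts, c_counts, o_counts,
                                 n_o_window=(0.3, 1.2), c_o_window=(0.8, 2.5)):
            """Flag a voxel whose N/O and C/O ratios fall inside assumed explosive-like windows."""
            if o_counts <= 0:
                return False
            n_o = n_counts / o_counts
            c_o = c_counts / o_counts
            return n_o_window[0] <= n_o <= n_o_window[1] and c_o_window[0] <= c_o <= c_o_window[1]

        # Example: element yields (arbitrary units) inferred from characteristic gamma lines.
        print(looks_like_explosive(n_counts=120.0, c_counts=310.0, o_counts=260.0))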

  5. Gravitational wave detection and data analysis for pulsar timing arrays

    NARCIS (Netherlands)

    Haasteren, Rutger van

    2011-01-01

    Long-term precise timing of Galactic millisecond pulsars holds great promise for measuring long-period (months-to-years) astrophysical gravitational waves. In this work we develop a Bayesian data analysis method for projects called pulsar timing arrays: projects aimed at detecting these gravitational waves.

  6. Detecting bots using multi-level traffic analysis

    DEFF Research Database (Denmark)

    Stevanovic, Matija; Pedersen, Jens Myrup

    2016-01-01

    This paper introduces a novel multi-level botnet detection approach that performs network traffic analysis of three protocols widely considered as the main carriers of botnet Command and Control (C&C) and attack traffic, i.e. TCP, UDP and DNS. The proposed method relies on supervised machine learning for identifying...

  7. Meter Detection in Symbolic Music Using Inner Metric Analysis

    NARCIS (Netherlands)

    de Haas, W.B.; Volk, A.

    2016-01-01

    In this paper we present PRIMA: a new model tailored to symbolic music that detects the meter and the first downbeat position of a piece. Given onset data, the metrical structure of a piece is interpreted using the Inner Metric Analysis (IMA) model. IMA identifies the strong and weak metrical

  8. The Detection and Analysis of Chromosome Fragile Sites

    DEFF Research Database (Denmark)

    Bjerregaard, Victoria A; Özer, Özgün; Hickson, Ian D

    2018-01-01

    A fragile site is a chromosomal locus that is prone to form a gap or constriction visible within a condensed metaphase chromosome, particularly following exposure of cells to DNA replication stress. Based on their frequency, fragile sites are classified as either common (CFSs; present in all...... for detection and analysis of chromosome fragile sites....

  9. Use of Sparse Principal Component Analysis (SPCA) for Fault Detection

    DEFF Research Database (Denmark)

    Gajjar, Shriram; Kulahci, Murat; Palazoglu, Ahmet

    2016-01-01

    Principal component analysis (PCA) has been widely used for data dimension reduction and process fault detection. However, interpreting the principal components and the outcomes of PCA-based monitoring techniques is a challenging task since each principal component is a linear combination of the ...

  10. Information theoretic analysis of canny edge detection in visual communication

    Science.gov (United States)

    Jiang, Bo; Rahman, Zia-ur

    2011-06-01

    In general edge detection evaluation, the edge detectors are examined, analyzed, and compared either visually or with a metric for a specific application. This analysis is usually independent of the characteristics of the image-gathering, transmission and display processes that do impact the quality of the acquired image and thus, the resulting edge image. We propose a new information theoretic analysis of edge detection that unites the different components of the visual communication channel and assesses edge detection algorithms in an integrated manner based on Shannon's information theory. The edge detection algorithm here is considered to achieve high performance only if the information rate from the scene to the edge approaches the maximum possible. Thus, by setting the initial conditions of the visual communication system as constant, different edge detection algorithms can be evaluated. This analysis is normally limited to linear shift-invariant filters, so in order to examine the Canny edge operator in our proposed system, we need to estimate its "power spectral density" (PSD). Since the Canny operator is non-linear and shift variant, we perform the estimation for a set of different system environment conditions using simulations. In our paper we will first introduce the PSD of the Canny operator for a range of system parameters. Then, using the estimated PSD, we will assess the Canny operator using information theoretic analysis. The information-theoretic metric is also used to compare the performance of the Canny operator with other edge-detection operators. This also provides a simple tool for selecting appropriate edge-detection algorithms based on system parameters, and for adjusting their parameters to maximize information throughput.

  11. Rapid Detection of Biological and Chemical Threat Agents Using Physical Chemistry, Active Detection, and Computational Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Myung; Dong, Li; Fu, Rong; Liotta, Lance; Narayanan, Aarthi; Petricoin, Emanuel; Ross, Mark; Russo, Paul; Zhou, Weidong; Luchini, Alessandra; Manes, Nathan; Chertow, Jessica; Han, Suhua; Kidd, Jessica; Senina, Svetlana; Groves, Stephanie

    2007-01-01

    Basic technologies have been successfully developed within this project: rapid collection of aerosols and a rapid ultra-sensitive immunoassay technique. Water-soluble, humidity-resistant polyacrylamide nano-filters were shown to (1) capture aerosol particles as small as 20 nm, (2) work in humid air and (3) completely liberate their captured particles in an aqueous solution compatible with the immunoassay technique. The immunoassay technology developed within this project combines electrophoretic capture with magnetic bead detection. It allows detection of as few as 150-600 analyte molecules or viruses in only three minutes, something no other known method can duplicate. The technology can be used in a variety of applications where speed of analysis and/or extremely low detection limits are of great importance: in rapid analysis of donor blood for hepatitis, HIV and other blood-borne infections in emergency blood transfusions, in trace analysis of pollutants, or in search of biomarkers in biological fluids. Combined in a single device, the water-soluble filter and ultra-sensitive immunoassay technique may solve the problem of early warning type detection of aerosolized pathogens. These two technologies are protected with five patent applications and are ready for commercialization.

  12. Citation-based plagiarism detection detecting disguised and cross-language plagiarism using citation pattern analysis

    CERN Document Server

    Gipp, Bela

    2014-01-01

    Plagiarism is a problem with far-reaching consequences for the sciences. However, even today's best software-based systems can only reliably identify copy & paste plagiarism. Disguised plagiarism forms, including paraphrased text, cross-language plagiarism, as well as structural and idea plagiarism often remain undetected. This weakness of current systems results in a large percentage of scientific plagiarism going undetected. Bela Gipp provides an overview of the state-of-the art in plagiarism detection and an analysis of why these approaches fail to detect disguised plagiarism forms. The aut

  13. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    Science.gov (United States)

    West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  14. Multiple scattering problems in heavy ion elastic recoil detection analysis

    International Nuclear Information System (INIS)

    Johnston, P.N.; El Bouanani, M.; Stannard, W.B.; Bubb, I.F.; Cohen, D.D.; Dytlewski, N.; Siegele, R.

    1998-01-01

    A number of groups use Heavy Ion Elastic Recoil Detection Analysis (HIERDA) to study materials science problems. Nevertheless, there is no standard methodology for the analysis of HIERDA spectra. To overcome this deficiency we have been establishing codes for 2-dimensional data analysis. A major problem involves the effects of multiple and plural scattering which are very significant, even for quite thin (∼100 nm) layers of the very heavy elements. To examine the effects of multiple scattering we have made comparisons between the small-angle model of Sigmund et al. and TRIM calculations. (authors)

  15. Confirmation of identity and detection limit in neutron activation analysis

    International Nuclear Information System (INIS)

    Yustina Tri Handayani; Slamet Wiyuniati; Tulisna

    2010-01-01

    Neutron Activation Analysis (NAA) is based on neutron capture by nuclides. Of the various radionuclides that can be produced, the radionuclide and gamma-ray line that provide the identity of the element and the best sensitivity have to be determined. Confirmation for elements in sediment samples was done theoretically and experimentally. The result of the confirmation shows that Al, V, Cr, K, Na, Ca and Zn were analyzed based on the radionuclides Al-28, V-52, Cr-51, K-42, Na-24, Ca-48 and Zn-65. The elements Mg, Mn, Fe and Co were analyzed based on the radionuclides Mg-27, Mn-56, Fe-59 and Co-60, through the peak with the highest combined value of gamma emission probability and detection efficiency. Cu can be analyzed through Cu-64 or Cu-66, but the second is more sensitive. The detection limit is determined for the specific measurement conditions used by a laboratory. The detection limit in NAA is determined from the Compton continuum area by the Currie method. The detection limits of Al, V, Ca, Mg, Mn, As, K, Na, Mg, Ce, Co, Cr, Fe, La, Sc, and Zn in sediment samples are 240, 27, 4750, 2600, 21, 3.3, 75, 1.4, 1.8, 0.5, 2.7, 29, 1, 0.05, and 37 ppm. In the analysis of Cu in sediment with a concentration of 98.6 ppm, Cu-66 was not detected. Tests using pure Cu standard solutions gave a detection limit of 0.12 µg, or 7.9 ppm in 15 mg samples. In general, the detection limits obtained were higher than those of the reference, which is attributed to differences in the sample matrix and analytical conditions. (author)
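
    The Currie-type detection limit mentioned above is commonly written, for a well-known background, as L_D ≈ 2.71 + 4.65·sqrt(B) counts, where B is the background (Compton continuum) count under the peak; dividing by the sensitivity (counts per µg) converts it to a mass limit. The sketch below is a generic illustration with assumed numbers, not values reconstructed from the record.

        import math

        def currie_detection_limit_counts(background_counts):
            """Currie detection limit L_D (counts) for a paired observation with known background."""
            return 2.71 + 4.65 * math.sqrt(background_counts)

        def detection_limit_mass(background_counts, sensitivity_counts_per_ug):
            """Convert the count limit to a mass limit using the measured sensitivity."""
            return currie_detection_limit_counts(background_counts) / sensitivity_counts_per_ug

        # Assumed example: 400 counts of Compton continuum under the peak, 800 counts/ug sensitivity.
        print(f"L_D = {currie_detection_limit_counts(400):.0f} counts, "
              f"mass limit = {detection_limit_mass(400, 800):.3f} ug")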

  16. Detection of Organophosphorus Pesticides with Colorimetry and Computer Image Analysis.

    Science.gov (United States)

    Li, Yanjie; Hou, Changjun; Lei, Jincan; Deng, Bo; Huang, Jing; Yang, Mei

    2016-01-01

    Organophosphorus pesticides (OPs) represent a very important class of pesticides that are widely used in agriculture because of their relatively high performance and moderate environmental persistence; hence the sensitive and specific detection of OPs is highly significant. Based on the inhibitory effect on acetylcholinesterase (AChE) induced by inhibitors, including OPs and carbamates, a colorimetric analysis was used for detection of OPs with computer image analysis of color density in CMYK (cyan, magenta, yellow and black) color space and non-linear modeling. The results showed that there was a gradually weakened trend of yellow intensity with the increase of the concentration of dichlorvos. The quantitative analysis of dichlorvos was achieved by Artificial Neural Network (ANN) modeling, and the results showed that the established model had good predictive ability between training and prediction sets. Real cabbage samples containing dichlorvos were detected by colorimetry and gas chromatography (GC), respectively. The results showed that there was no significant difference between colorimetry and GC (P > 0.05). Experiments on accuracy, precision and repeatability revealed good performance for detection of OPs. AChE can also be inhibited by carbamates, and therefore this method has potential applications to real samples for OPs and carbamates because of its high selectivity and sensitivity.
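
    A minimal sketch of the image-analysis step is given below: convert an RGB photograph of the assay to CMYK with the standard naive mapping and take the mean yellow-channel density as the colorimetric response. The regression from response to concentration (an ANN in the record) is left out, and the image here is a random placeholder.

        import numpy as np

        def mean_yellow_density(rgb):
            """Mean Y component of a naive RGB->CMYK conversion; rgb is a float array in [0, 1]."""
            b = rgb[..., 2]
            k = 1.0 - np.max(rgb, axis=-1)
            denom = np.clip(1.0 - k, 1e-9, None)
            y = (1.0 - b - k) / denom          # yellow channel of CMYK
            return float(y.mean())

        # Placeholder assay image; in practice this is a photo of the reaction wells.
        img = np.random.rand(64, 64, 3)
        print(f"mean yellow density: {mean_yellow_density(img):.3f}")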

  17. Early detection of foot ulcers through asymmetry analysis

    Science.gov (United States)

    Kaabouch, Naima; Chen, Yi; Hu, Wen-Chen; Anderson, Julie; Ames, Forrest; Paulson, Rolf

    2009-02-01

    Foot ulcers affect millions of Americans annually. Areas that are likely to ulcerate have been associated with increased local skin temperatures due to inflammation and enzymatic autolysis of tissue. Conventional methods to assess skin, including inspection and palpation, may be valuable approaches, but usually they do not detect changes in skin integrity until an ulcer has already developed. Conversely, infrared imaging is a technology able to assess the integrity of the skin and its many layers, thus having the potential to index the cascade of physiological events in the prevention, assessment, and management of foot ulcers. In this paper, we propose a technique, asymmetry analysis, to automatically analyze the infrared images in order to detect inflammation. Preliminary results show that the proposed technique can be reliable and efficient to detect inflammation and, hence, predict potential ulceration.
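
    One plausible reading of the asymmetry analysis is sketched below: mirror the thermogram of one foot, subtract it from the contralateral one, and flag regions where the temperature difference exceeds a threshold. The images are simulated placeholders, perfect registration is assumed, and the 2.2 °C threshold is a commonly quoted clinical rule of thumb used here only as an illustrative assumption, not a value from the paper.

        import numpy as np

        def asymmetry_map(left_temp, right_temp, threshold_c=2.2):
            """Flag pixels where left/right temperatures differ by more than threshold_c (deg C).

            Assumes both thermograms are already registered to a common template; the right
            foot image is mirrored so that anatomically corresponding points line up.
            """
            mirrored_right = np.fliplr(right_temp)
            diff = np.abs(left_temp - mirrored_right)
            return diff > threshold_c, diff

        # Placeholder thermograms (deg C); real data would come from an IR camera.
        left = 30.0 + np.random.randn(240, 320) * 0.3
        right = 30.0 + np.random.randn(240, 320) * 0.3
        right[100:120, 150:180] += 3.0               # simulated inflamed spot on one foot only
        mask, diff = asymmetry_map(left, right, 2.2)
        print("suspected inflammation pixels:", int(mask.sum()))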

  18. Automatic detection and analysis of nuclear plant malfunctions

    International Nuclear Information System (INIS)

    Bruschi, R.; Di Porto, P.; Pallottelli, R.

    1985-01-01

    In this paper a system is proposed which dynamically performs the detection and analysis of malfunctions in a nuclear plant. The proposed method was developed and implemented on a reactor simulator, instead of on a real one, thus allowing a wide range of tests. For all variables under control, a simulation module was identified and implemented on the reactor on-line computer. In the malfunction identification phase all modules run separately, processing plant input variables and producing their output variables in real time; continuous comparison of the computed variables with the plant variables allows malfunction detection. At this point the second phase can occur: when a malfunction is detected, all modules are connected, except the module simulating the faulty variable, and a fast simulation is carried out to analyse the consequences. (author)

  19. Analysis of accelerants and fire debris using aroma detection technology

    Energy Technology Data Exchange (ETDEWEB)

    Barshick, S.A.

    1997-01-17

    The purpose of this work was to investigate the utility of electronic aroma detection technologies for the detection and identification of accelerant residues in suspected arson debris. Through the analysis of known accelerant residues, a trained neural network was developed for classifying suspected arson samples. Three unknown fire debris samples were classified using this neural network. The item corresponding to diesel fuel was correctly identified every time. For the other two items, wide variations in sample concentration and excessive water content, producing high sample humidities, were shown to influence the sensor response. Sorbent sampling prior to aroma detection was demonstrated to reduce these problems and to allow proper neural network classification of the remaining items corresponding to kerosene and gasoline.

  20. Reliability analysis for the quench detection in the LHC machine

    CERN Document Server

    Denz, R; Vergara-Fernández, A

    2002-01-01

    The Large Hadron Collider (LHC) will incorporate a large number of superconducting elements that require protection in case of a quench. Key elements in the quench protection system are the electronic quench detectors. Their reliability will have an important impact on the downtime as well as on the operational cost of the collider. The expected rates of both false and missed quenches have been computed for several redundant detection schemes. The developed model takes account of the maintainability of the system to optimise the frequency of foreseen checks, and evaluates their influence on the performance of different detection topologies. Given the uncertainty of the failure rate of the components combined with the LHC tunnel environment, the study has been completed with a sensitivity analysis of the results. The chosen detection scheme and the maintainability strategy for each detector family are given.

  1. SNIa detection in the SNLS photometric analysis using Morphological Component Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Möller, A.; Ruhlmann-Kleider, V.; Neveu, J.; Palanque-Delabrouille, N. [Irfu, SPP, CEA Saclay, F-91191 Gif sur Yvette cedex (France); Lanusse, F.; Starck, J.-L., E-mail: anais.moller@cea.fr, E-mail: vanina.ruhlmann-kleider@cea.fr, E-mail: francois.lanusse@cea.fr, E-mail: jeremy.neveu@cea.fr, E-mail: nathalie.palanque-delabrouille@cea.fr, E-mail: jstarck@cea.fr [Laboratoire AIM, UMR CEA-CNRS-Paris 7, Irfu, SAp, CEA Saclay, F-91191 Gif sur Yvette cedex (France)

    2015-04-01

    Detection of supernovae (SNe) and, more generally, of transient events in large surveys can provide numerous false detections. In the case of a deferred processing of survey images, this implies reconstructing complete light curves for all detections, requiring sizable processing time and resources. Optimizing the detection of transient events is thus an important issue for both present and future surveys. We present here the optimization done in the SuperNova Legacy Survey (SNLS) for the 5-year data deferred photometric analysis. In this analysis, detections are derived from stacks of subtracted images with one stack per lunation. The 3-year analysis provided 300,000 detections dominated by signals of bright objects that were not perfectly subtracted. Allowing these artifacts to be detected leads not only to a waste of resources but also to possible signal coordinate contamination. We developed a subtracted image stack treatment to reduce the number of non SN-like events using morphological component analysis. This technique exploits the morphological diversity of objects to be detected to extract the signal of interest. At the level of our subtraction stacks, SN-like events are rather circular objects while most spurious detections exhibit different shapes. A two-step procedure was necessary to have a proper evaluation of the noise in the subtracted image stacks and thus a reliable signal extraction. We also set up a new detection strategy to obtain coordinates with good resolution for the extracted signal. SNIa Monte-Carlo (MC) generated images were used to study detection efficiency and coordinate resolution. When tested on SNLS 3-year data this procedure decreases the number of detections by a factor of two, while losing only 10% of SN-like events, almost all faint ones. MC results show that SNIa detection efficiency is equivalent to that of the original method for bright events, while the coordinate resolution is improved.

  2. Information theoretic analysis of edge detection in visual communication

    Science.gov (United States)

    Jiang, Bo; Rahman, Zia-ur

    2010-08-01

    Generally, the designs of digital image processing algorithms and image gathering devices remain separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the artifacts introduced into the process by the image gathering process. However, experiments show that the image gathering process profoundly impacts the performance of digital image processing and the quality of the resulting images. Huck et al. proposed a definitive theoretical analysis of visual communication channels, where the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. In this paper, we perform an end-to-end information theory based system analysis to assess edge detection methods. We evaluate the performance of the different algorithms as a function of the characteristics of the scene and the parameters, such as sampling, additive noise etc., that define the image gathering system. The edge detection algorithm is regarded to have high performance only if the information rate from the scene to the edge approaches the maximum possible. This goal can be achieved only by jointly optimizing all processes. People generally use subjective judgment to compare different edge detection methods. There is no common tool that can be used to evaluate the performance of the different algorithms, and to give people a guide for selecting the best algorithm for a given system or scene. Our information-theoretic assessment becomes this new tool, which allows us to compare the different edge detection operators in a common environment.

  3. On damage detection in wind turbine gearboxes using outlier analysis

    Science.gov (United States)

    Antoniadou, Ifigeneia; Manson, Graeme; Dervilis, Nikolaos; Staszewski, Wieslaw J.; Worden, Keith

    2012-04-01

    The proportion of worldwide installed wind power in power systems increases over the years as a result of the steadily growing interest in renewable energy sources. Still, the advantages offered by the use of wind power are overshadowed by the high operational and maintenance costs, resulting in the low competitiveness of wind power in the energy market. In order to reduce the costs of corrective maintenance, the application of condition monitoring to gearboxes becomes highly important, since gearboxes are among the wind turbine components with the most frequent failure observations. While condition monitoring of gearboxes in general is common practice, with various methods having been developed over the last few decades, wind turbine gearbox condition monitoring faces a major challenge: the detection of faults under the time-varying load conditions prevailing in wind turbine systems. Classical time and frequency domain methods fail to detect faults under variable load conditions, due to the temporary effect that these faults have on vibration signals. This paper uses the statistical discipline of outlier analysis for the damage detection of gearbox tooth faults. A simplified two-degree-of-freedom gearbox model considering nonlinear backlash, time-periodic mesh stiffness and static transmission error, simulates the vibration signals to be analysed. Local stiffness reduction is used for the simulation of tooth faults and statistical processes determine the existence of intermittencies. The lowest level of fault detection, the threshold value, is considered and the Mahalanobis squared-distance is calculated for the novelty detection problem.
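
    A minimal sketch of the novelty-detection step named above: fit the mean and covariance of features extracted from the healthy-gearbox condition, compute the Mahalanobis squared distance of new feature vectors, and declare novelty when the distance exceeds a threshold. The features are random placeholders and the percentile-based threshold rule is an assumption, not the paper's exact procedure.

        import numpy as np

        rng = np.random.default_rng(0)

        # Feature vectors from the healthy condition (e.g. statistics of vibration signals).
        X_train = rng.normal(size=(500, 4))
        mean = X_train.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(X_train, rowvar=False))

        def mahalanobis_sq(x):
            d = x - mean
            return float(d @ cov_inv @ d)

        # Threshold from the training data itself (assumed rule: 99th percentile).
        threshold = np.percentile([mahalanobis_sq(x) for x in X_train], 99)

        x_new = rng.normal(size=4) + np.array([0.0, 3.0, 0.0, 0.0])   # simulated faulty-tooth feature
        print("novel (possible tooth fault)" if mahalanobis_sq(x_new) > threshold else "normal")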

  4. Application of factor analysis to the explosive detection

    International Nuclear Information System (INIS)

    Park, Yong Joon; Song, Byung Chul; Im, Hee Jung; Kim, Won Ho; Cho, Jung Hwan

    2005-01-01

    The detection of explosive devices hidden in airline baggage is a significant problem, particularly in view of the development of modern plastic explosives which can be formed into various innocent-appearing shapes and which are sufficiently powerful that small quantities can destroy an aircraft in flight. Besides, the biggest difficulty arises from the long detection time required by explosive detection systems based on thermal neutron interrogation, which involves exposing baggage to slow neutrons having energies of the order of 0.025 eV. The elemental compositions of explosives can be determined by the Neutron Induced Prompt gamma Spectroscopy (NIPS) system which has been installed in the Korea Atomic Energy Research Institute as a tool for the detection of explosives in passenger baggage. In this work, factor analysis has been applied to the NIPS system to increase the signal-to-noise ratio of the prompt gamma spectrum for the detection of explosives hidden in a passenger's baggage, especially for noisy prompt gamma spectra obtained with short measurement times.

  5. Hypernasal Speech Detection by Acoustic Analysis of Unvoiced Plosive Consonants

    Directory of Open Access Journals (Sweden)

    Alexander Sepúlveda-Sepúlveda

    2009-12-01

    Full Text Available People with a defective velopharyngeal mechanism speak with abnormal nasal resonance (hypernasal speech). Voice analysis methods for hypernasality detection commonly use vowels and nasalized vowels. However, to obtain a more general assessment of this abnormality it is necessary to analyze stops and fricatives. This study describes a method with high generalization capability for hypernasality detection by analyzing unvoiced Spanish stop consonants. The importance of phoneme-by-phoneme analysis is shown, in contrast with whole-word parametrization which includes irrelevant segments from the classification point of view. Parameters that correlate with the imprints of Velopharyngeal Incompetence (VPI) over voiceless stop consonants were used in the feature estimation stage. Classification was carried out using a Support Vector Machine (SVM), including the Rademacher complexity model with the aim of increasing the generalization capability. Performances of 95.2% and 92.7% were obtained in the processing and verification stages for a repeated cross-validation classifier evaluation.

  6. Leak detection in pipelines through spectral analysis of pressure signals

    Directory of Open Access Journals (Sweden)

    Souza A.L.

    2000-01-01

    Full Text Available The development and testing of a technique for leak detection in pipelines is presented. The technique is based on the spectral analysis of pressure signals measured in pipeline sections where the formation of stationary waves is favoured, allowing leakage detection during the start/stop of pumps. Experimental tests were performed in a 1250 m long pipeline for various operational conditions of the pipeline (liquid flow rate and leakage configuration). Pressure transients were obtained by four transducers connected to a PC computer. The obtained results show that the spectral analysis of pressure transients, together with knowledge of the reflection points, provides a simple and efficient way of identifying leaks during the start/stop of pumps in pipelines.
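
    The spectral-analysis step can be sketched as follows: take the FFT of a pressure transient recorded during a pump start/stop and inspect the dominant resonance frequencies; peaks that shift or appear relative to the no-leak spectrum point to an additional reflection (the leak). The sampling rate and the synthetic transient below are placeholders, not data from the 1250 m test pipeline.

        import numpy as np

        fs = 100.0                                   # assumed sampling rate [Hz]
        t = np.arange(0, 60.0, 1.0 / fs)

        # Placeholder pressure transient after a pump stop: decaying standing waves plus noise.
        pressure = (np.exp(-t / 20.0) * (np.sin(2 * np.pi * 0.42 * t)
                                         + 0.4 * np.sin(2 * np.pi * 1.26 * t))
                    + 0.02 * np.random.randn(t.size))

        spectrum = np.abs(np.fft.rfft(pressure - pressure.mean()))
        freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

        # Report the strongest peaks; comparing them with the reflection pattern of the
        # intact pipeline is what identifies an extra reflection point (a leak).
        peaks = freqs[np.argsort(spectrum)[-3:]]
        print("dominant frequencies [Hz]:", np.sort(peaks))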

  7. Outlier Detection with Space Transformation and Spectral Analysis

    DEFF Research Database (Denmark)

    Dang, Xuan-Hong; Micenková, Barbora; Assent, Ira

    2013-01-01

    Detecting a small number of outliers from a set of data observations is always challenging. In this paper, we present an approach that exploits space transformation and uses spectral analysis in the newly transformed space for outlier detection. Unlike most existing techniques in the literature, which rely on notions of distances or densities, this approach introduces a novel concept based on local quadratic entropy for evaluating the similarity of a data object with its neighbors. This information theoretic quantity is used to regularize the closeness amongst data instances and subsequently benefits the process of mapping data into a usually lower dimensional space. Outliers are then identified by spectral analysis of the eigenspace spanned by the set of leading eigenvectors derived from the mapping procedure. The proposed technique is purely data-driven and imposes no assumptions regarding...

  8. Establishment of analysis method for methane detection by gas chromatography

    Science.gov (United States)

    Liu, Xinyuan; Yang, Jie; Ye, Tianyi; Han, Zeyu

    2018-02-01

    The study focused on the establishment of an analysis method for methane determination by gas chromatography. Methane was detected by a hydrogen flame ionization detector, and the quantitative relationship was determined by the working curve y = 2041.2x + 2187 with a correlation coefficient of 0.9979. A relative standard deviation of 2.60-6.33% and a recovery rate of 96.36%-105.89% were obtained during the parallel determination of standard gas. This method is not quite suitable for biogas content analysis because the methane content in biogas would be over the measurement range of this method.
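
    The working curve quoted above relates the detector response y to the methane amount x, so a measured response can be inverted directly, as in the sketch below. The record does not state the units of x or y, so they are treated as a generic response and amount, and the example response value is an assumption.

        SLOPE, INTERCEPT = 2041.2, 2187.0        # working curve y = 2041.2*x + 2187 from the record

        def methane_amount(response):
            """Invert the linear working curve to estimate the methane amount from the response."""
            return (response - INTERCEPT) / SLOPE

        # Example: an assumed detector response of 150000 (same arbitrary units as y).
        print(f"estimated methane amount: {methane_amount(150000.0):.1f} (units of x)")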

  9. Nonlinear damage detection in composite structures using bispectral analysis

    Science.gov (United States)

    Ciampa, Francesco; Pickering, Simon; Scarselli, Gennaro; Meo, Michele

    2014-03-01

    The literature offers a considerable number of diagnostic methods that can continuously provide detailed information on material defects and damage in aerospace and civil engineering applications. Indeed, low velocity impact damage can considerably degrade the integrity of structural components and, if not detected, can result in catastrophic failure conditions. This paper presents a nonlinear Structural Health Monitoring (SHM) method, based on ultrasonic guided waves (GW), for the detection of the nonlinear signature in a damaged composite structure. The proposed technique, based on a bispectral analysis of ultrasonic input waveforms, allows for the evaluation of the nonlinear response due to the presence of cracks and delaminations. Indeed, such a methodology was used to characterize the nonlinear behaviour of the structure, by exploiting the frequency mixing of the original waveform acquired from a sparse array of sensors. The robustness of bispectral analysis was experimentally demonstrated on a damaged carbon fibre reinforced plastic (CFRP) composite panel, and the nonlinear source was retrieved with a high level of accuracy. Unlike other linear and nonlinear ultrasonic methods for damage detection, this methodology does not require any baseline with the undamaged structure for the evaluation of the nonlinear source, nor a priori knowledge of the mechanical properties of the specimen. Moreover, bispectral analysis can be considered as a nonlinear elastic wave spectroscopy (NEWS) technique for materials showing either classical or non-classical nonlinear behaviour.
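
    A minimal direct (FFT-based) bispectrum estimate for a single sensor channel is sketched below; significant bispectral peaks at frequency pairs (f1, f2) whose sum also carries energy indicate quadratic phase coupling, the kind of frequency mixing exploited above. The segment length, window and synthetic test signal are assumptions, not the processing settings used on the CFRP panel.

        import numpy as np

        def bispectrum(x, nfft=256, nseg=32):
            """Direct bispectrum estimate B(f1, f2) = E[X(f1) X(f2) conj(X(f1+f2))], segment averaged."""
            x = x[: nfft * nseg].reshape(nseg, nfft)
            x = x - x.mean(axis=1, keepdims=True)
            X = np.fft.fft(x * np.hanning(nfft), axis=1)
            half = nfft // 2
            f1 = np.arange(half)[:, None]
            f2 = np.arange(half)[None, :]
            B = np.zeros((half, half), dtype=complex)
            for seg in X:
                B += seg[f1] * seg[f2] * np.conj(seg[f1 + f2])
            return np.abs(B) / nseg

        # Synthetic signal with quadratic coupling: components at 60, 100 and 160 Hz.
        fs, n = 1024.0, 256 * 32
        t = np.arange(n) / fs
        sig = (np.sin(2 * np.pi * 60 * t) + np.sin(2 * np.pi * 100 * t)
               + 0.5 * np.sin(2 * np.pi * 160 * t) + 0.1 * np.random.randn(n))
        B = bispectrum(sig)
        i, j = np.unravel_index(np.argmax(B), B.shape)
        df = fs / 256
        print(f"strongest bispectral peak near ({i * df:.0f} Hz, {j * df:.0f} Hz)")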

  10. Development of vibrational analysis for detection of antisymmetric shells

    International Nuclear Information System (INIS)

    Esmailzadeh Khadem, S.; Mahmoodi, M.; Rezaee, M.

    2002-01-01

    In this paper, the vibrational behavior of bodies of revolution with different types of structural faults is studied. Since the vibrational characteristics of structures are natural properties of the system, the existence of any structural fault causes measurable changes in these properties. Here, this matter is demonstrated. In other words, the vibrational behavior of a body of revolution with no structural faults is analyzed by two methods: (i) numerical analysis using Super SAP software, and (ii) experimental modal analysis; natural frequencies and mode shapes are obtained. Then, different types of cracks are introduced in the structure, the analysis is repeated and the results are compared. Based on this study, one may perform crack detection by measuring the natural frequencies and mode shapes of the samples and comparing them with reference information obtained from the vibration analysis of the original structure with no fault.

  11. Limits of detection in instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    Guinn, V.P.

    1990-01-01

    Lower limits of detection (LLODs), frequently referred to simply as limits of detection and abbreviated as LODs, often appear in the literature of analytical chemistry - for numerous different methods of elemental and/or molecular analysis. In this chapter, one particular method of quantitative elemental analysis, that of instrumental neutron activation analysis (INAA), is the subject discussed, with reference to LODs. Particularly in the literature of neutron activation analysis (NAA), many tables of 'interference-free' NAA LODs are available. Not all of these are of much use, because (1) for many the definition used for LOD is not clear, or reasonable, (2) for many, the analysis conditions used are not clearly specified, and (3) for many, the analysis conditions used are specified, but not very practicable for most laboratories. For NAA work, such tables of interference-free LODs are, in any case, only applicable to samples in which, at the time of counting, only one radionuclide is present to any significant extent in the activated sample. It is important to note that tables of INAA LODs, per se, do not exist - since the LOD for a given element, under stated analysis conditions, can vary by orders of magnitude, depending on the elemental composition of the matrix in which it is present. For any given element, its INAA LOD will always be as large as, and usually much larger than, its tabulated 'interference-free' NAA LOD - how much larger depending upon the elemental composition of the matrix in which it is present. As discussed in this chapter, however, an INAA computer program exists that can calculate realistic INAA LODs for any elements of interest, in any kind of specified sample matrix, under any given set of analysis conditions

  12. Normality Analysis for RFI Detection in Microwave Radiometry

    Directory of Open Access Journals (Sweden)

    Adriano Camps

    2009-12-01

    Full Text Available Radio-frequency interference (RFI) present in microwave radiometry measurements leads to erroneous radiometric results. Sources of RFI include spurious signals and harmonics from lower frequency bands, spread-spectrum signals overlapping the “protected” band of operation, or out-of-band emissions not properly rejected by the pre-detection filters due to their finite rejection. The presence of RFI in the radiometric signal modifies the detected power and therefore the estimated antenna temperature from which the geophysical parameters will be retrieved. In recent years, techniques to detect the presence of RFI in radiometric measurements have been developed. They include time- and/or frequency-domain analyses, or time- and/or frequency-domain statistical analysis of the received signal which, in the absence of RFI, must be a zero-mean Gaussian process. Statistical analyses performed to date include the calculation of the kurtosis and the Shapiro-Wilk normality test of the received signal. Nevertheless, statistical analysis of the received signal could be more extensive, as reported in the Statistics literature. The objective of this work is the study of the performance of a number of normality tests encountered in the Statistics literature when applied to the detection of the presence of RFI in the radiometric signal, which is Gaussian by nature. A description of the normality tests and the RFI detection results for different kinds of RFI are presented in view of determining an omnibus test that can deal with the blind spots of the currently used methods.
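
    The kind of statistical check discussed above can be sketched with a few standard normality indicators from scipy.stats applied to a block of radiometer samples. The simulated noise, the sinusoidal interference and the 0.01 significance level are illustrative assumptions; the record evaluates a broader set of tests than the three shown here.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        n = 4096
        clean = rng.normal(0.0, 1.0, n)                                # RFI-free radiometric noise
        rfi = clean + 0.3 * np.sin(2 * np.pi * 0.05 * np.arange(n))    # sinusoidal interference added

        for name, x in [("clean", clean), ("with RFI", rfi)]:
            kurt = stats.kurtosis(x, fisher=False)        # Gaussian noise gives a value near 3
            _, p_sw = stats.shapiro(x[:500])              # Shapiro-Wilk on a sub-block
            _, p_nt = stats.normaltest(x)                 # D'Agostino-Pearson omnibus test
            flag = (p_sw < 0.01) or (p_nt < 0.01)
            print(f"{name}: kurtosis={kurt:.2f}, Shapiro p={p_sw:.3f}, omnibus p={p_nt:.3f}, RFI flag={flag}")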

  13. RADIA: RNA and DNA integrated analysis for somatic mutation detection.

    Directory of Open Access Journals (Sweden)

    Amie J Radenbaugh

    Full Text Available The detection of somatic single nucleotide variants is a crucial component of the characterization of the cancer genome. Mutation calling algorithms thus far have focused on comparing the normal and tumor genomes from the same individual. In recent years, it has become routine for projects like The Cancer Genome Atlas (TCGA) to also sequence the tumor RNA. Here we present RADIA (RNA and DNA Integrated Analysis), a novel computational method combining the patient-matched normal and tumor DNA with the tumor RNA to detect somatic mutations. The inclusion of the RNA increases the power to detect somatic mutations, especially at low DNA allelic frequencies. By integrating an individual's DNA and RNA, we are able to detect mutations that would otherwise be missed by traditional algorithms that examine only the DNA. We demonstrate high sensitivity (84%) and very high precision (98% and 99%) for RADIA in patient data from endometrial carcinoma and lung adenocarcinoma from TCGA. Mutations with both high DNA and RNA read support have the highest validation rate of over 99%. We also introduce a simulation package that spikes in artificial mutations to patient data, rather than simulating sequencing data from a reference genome. We evaluate sensitivity on the simulation data and demonstrate our ability to rescue back mutations at low DNA allelic frequencies by including the RNA. Finally, we highlight mutations in important cancer genes that were rescued due to the incorporation of the RNA.

  14. Sensor Failure Detection of FASSIP System using Principal Component Analysis

    Science.gov (United States)

    Sudarno; Juarsa, Mulya; Santosa, Kussigit; Deswandri; Sunaryo, Geni Rina

    2018-02-01

    In the nuclear reactor accident at Fukushima Daiichi in Japan, the damage to the core and pressure vessel was caused by the failure of the active cooling system (the diesel generators were inundated by the tsunami). Thus, research on passive cooling systems for nuclear power plants is performed to improve the safety aspects of nuclear reactors. The FASSIP system (Passive System Simulation Facility) is an installation used to study the characteristics of passive cooling systems at nuclear power plants. The accuracy of the sensor measurements of the FASSIP system is essential, because they are the basis for determining the characteristics of a passive cooling system. In this research, a sensor failure detection method for the FASSIP system is developed, so that indications of sensor failures can be detected early. The method used is Principal Component Analysis (PCA) to reduce the dimension of the sensor data, with the Squared Prediction Error (SPE) and Hotelling statistic as criteria for detecting sensor failure indications. The results show that the PCA method is capable of detecting the occurrence of a failure at any sensor.
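
    A minimal sketch of the monitoring scheme named above: fit PCA on normal-operation sensor data, then compute Hotelling's T2 (variation within the retained subspace) and the squared prediction error SPE (residual outside it) for each new sample, flagging a sensor fault when either exceeds its control limit. The synthetic data, the number of retained components and the empirical 99th-percentile limits are assumptions, not details of the FASSIP analysis.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(2)

        # Normal-operation data: 8 correlated "sensors", 1000 samples (placeholder).
        latent = rng.normal(size=(1000, 3))
        mixing = rng.normal(size=(3, 8))
        X = latent @ mixing + 0.1 * rng.normal(size=(1000, 8))

        pca = PCA(n_components=3).fit(X)

        def t2_and_spe(x):
            scores = pca.transform(x.reshape(1, -1))[0]
            t2 = float(np.sum(scores ** 2 / pca.explained_variance_))    # Hotelling's T2
            residual = x - pca.mean_ - scores @ pca.components_
            spe = float(residual @ residual)                             # Q / squared prediction error
            return t2, spe

        # Empirical control limits from the training data (assumed 99th-percentile rule).
        stats_train = np.array([t2_and_spe(x) for x in X])
        t2_lim, spe_lim = np.percentile(stats_train, 99, axis=0)

        x_new = X[0].copy()
        x_new[4] += 5.0                                                  # simulated stuck/offset sensor
        t2, spe = t2_and_spe(x_new)
        print(f"T2={t2:.1f} (limit {t2_lim:.1f}), SPE={spe:.2f} (limit {spe_lim:.2f})",
              "-> sensor fault indicated" if (t2 > t2_lim or spe > spe_lim) else "-> normal")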

  15. Analysis of the theoretical bias in dark matter direct detection

    International Nuclear Information System (INIS)

    Catena, Riccardo

    2014-01-01

    Fitting the model ''A'' to dark matter direct detection data, when the model that underlies the data is ''B'', introduces a theoretical bias in the fit. We perform a quantitative study of the theoretical bias in dark matter direct detection, with a focus on assumptions regarding the dark matter interactions, and velocity distribution. We address this problem within the effective theory of isoscalar dark matter-nucleon interactions mediated by a heavy spin-1 or spin-0 particle. We analyze 24 benchmark points in the parameter space of the theory, using frequentist and Bayesian statistical methods. First, we simulate the data of future direct detection experiments assuming a momentum/velocity dependent dark matter-nucleon interaction, and an anisotropic dark matter velocity distribution. Then, we fit a constant scattering cross section, and an isotropic Maxwell-Boltzmann velocity distribution to the simulated data, thereby introducing a bias in the analysis. The best fit values of the dark matter particle mass differ from their benchmark values up to 2 standard deviations. The best fit values of the dark matter-nucleon coupling constant differ from their benchmark values up to several standard deviations. We conclude that common assumptions in dark matter direct detection are a source of potentially significant bias

  16. Phishing Detection: Analysis of Visual Similarity Based Approaches

    Directory of Open Access Journals (Sweden)

    Ankit Kumar Jain

    2017-01-01

    Full Text Available Phishing is one of the major problems faced by the cyber-world and leads to financial losses for both industries and individuals. Detection of phishing attacks with high accuracy has always been a challenging issue. At present, visual similarity based techniques are very useful for detecting phishing websites efficiently. A phishing website looks very similar in appearance to its corresponding legitimate website to deceive users into believing that they are browsing the correct website. Visual similarity based phishing detection techniques utilise feature sets like text content, text format, HTML tags, Cascading Style Sheets (CSS), images, and so forth, to make the decision. These approaches compare the suspicious website with the corresponding legitimate website by using various features, and if the similarity is greater than the predefined threshold value then it is declared phishing. This paper presents a comprehensive analysis of phishing attacks, their exploitation, some of the recent visual similarity based approaches for phishing detection, and a comparative study of them. Our survey provides a better understanding of the problem, the current solution space, and the scope of future research to deal with phishing attacks efficiently using visual similarity based approaches.

  17. Detection of irradiated chicken by 2-alkylcyclobutanone analysis

    International Nuclear Information System (INIS)

    Tanabe, Hiroko; Goto, Michiko; Miyahara, Makoto

    2001-01-01

    Chicken meat irradiated at 0.5 kGy or higher doses was identified by a GC/MS method analyzing 2-dodecylcyclobutanone (2-DCB) and 2-tetradecylcyclobutanone (2-TCB), which are formed from palmitic acid and stearic acid respectively and were isolated using Soxhlet extraction followed by Florisil chromatography. Many fat-containing foods are rich in oleic acid as a parent fatty acid, and chicken meat contains palmitoleic acid in an amount comparable to stearic acid. In this study, we detected 2-tetradec-5'-enylcyclobutanone (2-TeCB) and 2-dodec-5'-enylcyclobutanone (2-DeCB) in chicken meat, which are formed by irradiation from oleic acid and palmitoleic acid respectively, using the GC/MS method. The detection sensitivity for both 2-TeCB and 2-DeCB was lower than that for 2-DCB. However, at least 0.57 μg/g fat of 2-TeCB was detected in chicken meat irradiated at 0.5 kGy, so 2-TeCB appears to be a useful marker for the identification of irradiated fat-containing foods. In contrast, 2-DeCB was not detected clearly at low doses. This suggests that 2-DeCB may be a useful marker for irradiated fat only in foods containing enough palmitoleic acid for the analysis. In addition, 2-tetradecadienylcyclobutanone, which is formed from linoleic acid, was also found in chicken meat. (author)

  18. Detection of hidden explosives by fast neutron activation analysis

    International Nuclear Information System (INIS)

    Li Xinnian; Guo Junpeng; Luo Wenyun; Wang Chuanshan; Fang Xiaoming; Yu Tailiu

    2008-01-01

    The paper describes the method and principle of detecting hidden explosives by fast neutron activation analysis (FNAA). Detection of explosives by FNAA has the advantages of simple measurement equipment, high reliability, and low detection cost, and would be suitable for widespread application in national security and protection. The nitrogen and oxygen contents of four explosives, more than ten common materials, and TNT samples covered with soil were measured by FNAA. The 14 MeV fast neutrons were generated from the (d, t) reaction with a 400 kV Cockcroft-Walton type accelerator. Two-dimensional distributions of the nitrogen and oxygen counting rates per unit mass of the examined materials were obtained, and characteristic areas for explosives and non-explosives could be defined. By computer-aided pattern recognition, the samples were identified with low false-alarm and omission rates. Monte Carlo simulation indicates that there is no significant radiation at 15 m from the neutron source, and that irradiated samples are safe after 1 h. It is suggested that FNAA has potential for a remotely controlled hidden-explosive detection system with a large multi-probe array. (authors)

  19. LANDSAT-8 OPERATIONAL LAND IMAGER CHANGE DETECTION ANALYSIS

    Directory of Open Access Journals (Sweden)

    W. Pervez

    2017-05-01

    Full Text Available This paper investigated the potential utility of the Landsat-8 Operational Land Imager (OLI) for change detection analysis and mapping applications, given its superior technical design compared with previous Landsat series. The OLI data were successfully classified with a support vector machine (SVM) with regard to all six test classes (i.e., bare land, built-up land, mixed trees, bushes, dam water and channel water). The OLI SVM-classified data for the four seasons (i.e., spring, autumn, winter, and summer) were used to derive change detection results for six cases: (1) winter to spring, which resulted in a reduction in mapped dam water and an increase in bushes; (2) winter to summer, which resulted in a reduction in mapped dam water and an increase in vegetation; (3) winter to autumn, which resulted in an increase in mapped dam water; (4) spring to summer, which resulted in a reduction of vegetation and shallow water; (5) spring to autumn, which resulted in a decrease of vegetation; and (6) summer to autumn, which resulted in an increase of bushes and vegetation. The OLI SVM-classified data yielded high overall accuracy and kappa coefficient values and were thus found suitable for change detection analysis.
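    A hedged sketch of the classification step described above is shown below: training a support vector machine on labelled pixels and differencing the per-season label maps to obtain change classes. The spectral values, class labels, and SVM settings are synthetic placeholders, not the paper's OLI data or parameters.

```python
# Minimal sketch of SVM land-cover classification followed by post-classification
# change detection; all data here are synthetic stand-ins for OLI pixels.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
classes = ["bare land", "built-up", "mixed trees", "bushes", "dam water", "channel water"]

# Synthetic training pixels: 6 spectral bands per pixel, 50 samples per class.
X_train = np.vstack([rng.normal(loc=i, scale=0.3, size=(50, 6)) for i in range(len(classes))])
y_train = np.repeat(np.arange(len(classes)), 50)

clf = SVC(kernel="rbf", C=10, gamma="scale").fit(X_train, y_train)

# Classify two "seasonal" images and difference the label maps for change detection.
img_winter = rng.normal(loc=2, scale=0.4, size=(100, 6))
img_spring = rng.normal(loc=3, scale=0.4, size=(100, 6))
labels_winter = clf.predict(img_winter)
labels_spring = clf.predict(img_spring)
changed = labels_winter != labels_spring
print(f"{changed.mean():.0%} of pixels changed class between the two dates")
```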

  20. Automated analysis for detecting beams in laser wakefield simulations

    International Nuclear Information System (INIS)

    Ushizima, Daniela M.; Rubel, Oliver; Prabhat, Mr.; Weber, Gunther H.; Bethel, E. Wes; Aragon, Cecilia R.; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Hamann, Bernd; Messmer, Peter; Hagen, Hans

    2008-01-01

    Laser wakefield particle accelerators have shown the potential to generate electric fields thousands of times higher than those of conventional accelerators. The resulting extremely short particle acceleration distance could yield a potential new compact source of energetic electrons and radiation, with wide applications from medicine to physics. Physicists investigate laser-plasma internal dynamics by running particle-in-cell simulations; however, this generates a large dataset that requires time-consuming, manual inspection by experts in order to detect key features such as beam formation. This paper describes a framework to automate the data analysis and classification of simulation data. First, we propose a new method to identify locations with high density of particles in the space-time domain, based on maximum extremum point detection on the particle distribution. We analyze high density electron regions using a lifetime diagram by organizing and pruning the maximum extrema as nodes in a minimum spanning tree. Second, we partition the multivariate data using fuzzy clustering to detect time steps in an experiment that may contain a high quality electron beam. Finally, we combine results from fuzzy clustering and bunch lifetime analysis to estimate spatially confined beams. We demonstrate our algorithms successfully on four different simulation datasets.

  1. Detection and Analysis of Threats to the Energy Sector: DATES

    Energy Technology Data Exchange (ETDEWEB)

    Alfonso Valdes

    2010-03-31

    This report summarizes Detection and Analysis of Threats to the Energy Sector (DATES), a project sponsored by the United States Department of Energy and performed by a team led by SRI International, with collaboration from Sandia National Laboratories, ArcSight, Inc., and Invensys Process Systems. DATES sought to advance the state of the practice in intrusion detection and situational awareness with respect to cyber attacks in energy systems. This was achieved through adaptation of detection algorithms for process systems, as well as development of novel anomaly detection techniques suited for such systems, into a detection suite. These detection components, together with third-party commercial security systems, were interfaced with the commercial Security Information Event Management (SIEM) solution from ArcSight. The efficacy of the integrated solution was demonstrated on two testbeds, one based on a Distributed Control System (DCS) from Invensys, and the other based on the Virtual Control System Environment (VCSE) from Sandia. These achievements advance the DOE Cybersecurity Roadmap [DOE2006] goals in the area of security monitoring. The project ran from October 2007 until March 2010, with the final six months focused on experimentation. In the validation phase, team members from SRI and Sandia coupled the two test environments and carried out a number of distributed and cross-site attacks against various points in one or both testbeds. Alert messages from the distributed, heterogeneous detection components were correlated using the ArcSight SIEM platform, providing within-site and cross-site views of the attacks. In particular, the team demonstrated detection and visualization of network zone traversal and denial-of-service attacks. These capabilities were presented at the DistribuTech Conference and Exhibition in March 2010. The project was hampered by an interruption of funding for four months in 2008, due to continuing resolution issues and the need to reach agreement on cost share.

  2. A factor analysis to detect factors influencing building national brand

    Directory of Open Access Journals (Sweden)

    Naser Azad

    Full Text Available Developing a national brand is one of the most important issues for the development of a brand. In this study, we present a factor analysis to detect the most important factors in building a national brand. The proposed study uses factor analysis to extract the most influential factors, and the sample has been chosen from two major auto makers in Iran called Iran Khodro and Saipa. The questionnaire was designed on a Likert scale and distributed among 235 experts. Cronbach's alpha is calculated as 0.84, which is well above the minimum desirable limit of 0.70. The implementation of factor analysis provides six factors including “cultural image of customers”, “exciting characteristics”, “competitive pricing strategies”, “perception image” and “previous perceptions”.

  3. Analysis of Public Datasets for Wearable Fall Detection Systems.

    Science.gov (United States)

    Casilari, Eduardo; Santoyo-Ramón, José-Antonio; Cano-García, José-Manuel

    2017-06-27

    Due to the boom of wireless handheld devices such as smartwatches and smartphones, wearable Fall Detection Systems (FDSs) have become a major focus of attention among the research community in recent years. The effectiveness of a wearable FDS must be contrasted against a wide variety of measurements obtained from inertial sensors during the occurrence of falls and Activities of Daily Living (ADLs). In this regard, access to public databases constitutes the basis for an open and systematic assessment of fall detection techniques. This paper reviews and appraises twelve available data repositories containing measurements of ADLs and emulated falls envisaged for the evaluation of fall detection algorithms in wearable FDSs. The analysis of the found datasets is performed in a comprehensive way, taking into account the multiple factors involved in the definition of the testbeds deployed for the generation of the mobility samples. The study of the traces brings to light the lack of a common experimental benchmarking procedure and, consequently, the large heterogeneity of the datasets from a number of perspectives (length and number of samples, typology of the emulated falls and ADLs, characteristics of the test subjects, features and positions of the sensors, etc.). Concerning this, the statistical analysis of the samples reveals the impact of the sensor range on the reliability of the traces. In addition, the study evidences the importance of the selection of the ADLs and the need for categorizing the ADLs depending on the intensity of the movements in order to evaluate the capability of a certain detection algorithm to discriminate falls from ADLs.

  4. Analysis of Public Datasets for Wearable Fall Detection Systems

    Directory of Open Access Journals (Sweden)

    Eduardo Casilari

    2017-06-01

    Full Text Available Due to the boom of wireless handheld devices such as smartwatches and smartphones, wearable Fall Detection Systems (FDSs) have become a major focus of attention among the research community in recent years. The effectiveness of a wearable FDS must be contrasted against a wide variety of measurements obtained from inertial sensors during the occurrence of falls and Activities of Daily Living (ADLs). In this regard, access to public databases constitutes the basis for an open and systematic assessment of fall detection techniques. This paper reviews and appraises twelve available data repositories containing measurements of ADLs and emulated falls envisaged for the evaluation of fall detection algorithms in wearable FDSs. The analysis of the found datasets is performed in a comprehensive way, taking into account the multiple factors involved in the definition of the testbeds deployed for the generation of the mobility samples. The study of the traces brings to light the lack of a common experimental benchmarking procedure and, consequently, the large heterogeneity of the datasets from a number of perspectives (length and number of samples, typology of the emulated falls and ADLs, characteristics of the test subjects, features and positions of the sensors, etc.). Concerning this, the statistical analysis of the samples reveals the impact of the sensor range on the reliability of the traces. In addition, the study evidences the importance of the selection of the ADLs and the need for categorizing the ADLs depending on the intensity of the movements in order to evaluate the capability of a certain detection algorithm to discriminate falls from ADLs.

  5. Detecting depression stigma on social media: A linguistic analysis.

    Science.gov (United States)

    Li, Ang; Jiao, Dongdong; Zhu, Tingshao

    2018-05-01

    Efficient detection of depression stigma in mass media is important for designing effective stigma reduction strategies. Using linguistic analysis methods, this paper aims to build computational models for detecting stigma expressions in Chinese social media posts (Sina Weibo). A total of 15,879 Weibo posts containing relevant keywords were collected and analyzed. First, a content analysis was conducted on all 15,879 posts to determine whether each of them reflected depression stigma or not. Second, using four algorithms (Simple Logistic Regression, Multilayer Perceptron Neural Networks, Support Vector Machine, and Random Forest), two groups of classification models were built based on selected linguistic features: one for differentiating between posts with and without depression stigma, and one for differentiating among posts with three specific types of depression stigma. First, 967 of the 15,879 posts (6.09%) indicated depression stigma. Of these, 39.30%, 15.82%, and 14.99% endorsed the stigmatizing views that "People with depression are unpredictable", "Depression is a sign of personal weakness", and "Depression is not a real medical illness", respectively. Second, the highest F-Measure value for differentiating between stigma and non-stigma reached 75.2%. The highest F-Measure value for differentiating among the three specific types of stigma reached 86.2%. Due to the limited and imbalanced dataset of Chinese Weibo posts, the findings of this study might have limited generalizability. This paper confirms that incorporating linguistic analysis methods into online detection of stigma can be beneficial for improving the performance of stigma reduction programs. Copyright © 2018 Elsevier B.V. All rights reserved.
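    The following toy sketch mirrors one of the modelling steps described above, training a simple classifier on linguistic (word-count) features to separate stigma from non-stigma posts; the English example posts, the feature extraction, and the classifier choice are illustrative assumptions, not the authors' Chinese-language feature set, algorithms, or data.

```python
# Toy stigma/non-stigma text classifier on word-count features (illustrative only).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score

posts = [
    "people with depression are unpredictable and dangerous",
    "depression is just a sign of personal weakness",
    "depression is not a real medical illness",
    "depression is a treatable medical condition",
    "support and therapy help people recover from depression",
    "talking openly about depression reduces stigma",
]
labels = [1, 1, 1, 0, 0, 0]          # 1 = stigma, 0 = non-stigma

X = CountVectorizer().fit_transform(posts)   # bag-of-words linguistic features
clf = LogisticRegression(max_iter=1000).fit(X, labels)
pred = clf.predict(X)
print("F1 on the toy training set:", f1_score(labels, pred))
```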

  6. Improvement of retinal blood vessel detection using morphological component analysis.

    Science.gov (United States)

    Imani, Elaheh; Javidi, Malihe; Pourreza, Hamid-Reza

    2015-03-01

    Detection and quantitative measurement of variations in the retinal blood vessels can help diagnose several diseases including diabetic retinopathy. Intrinsic characteristics of abnormal retinal images make blood vessel detection difficult. The major problem with traditional vessel segmentation algorithms is producing false positive vessels in the presence of diabetic retinopathy lesions. To overcome this problem, a novel scheme for extracting retinal blood vessels based on the morphological component analysis (MCA) algorithm is presented in this paper. MCA was developed based on sparse representation of signals. This algorithm assumes that each signal is a linear combination of several morphologically distinct components. In the proposed method, the MCA algorithm with appropriate transforms is adopted to separate vessels and lesions from each other. Afterwards, the Morlet Wavelet Transform is applied to enhance the retinal vessels. The final vessel map is obtained by adaptive thresholding. The performance of the proposed method is measured on the publicly available DRIVE and STARE datasets and compared with several state-of-the-art methods. An accuracy of 0.9523 and 0.9590 has been respectively achieved on the DRIVE and STARE datasets, which is not only greater than that of most methods, but is also superior to the second human observer's performance. The results show that the proposed method can achieve improved detection in abnormal retinal images and decrease false positive vessels in pathological regions compared to other methods. Also, the robustness of the method in the presence of noise is shown via experimental results. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  7. Automated rice leaf disease detection using color image analysis

    Science.gov (United States)

    Pugoy, Reinald Adrian D. L.; Mariano, Vladimir Y.

    2011-06-01

    In rice-related institutions such as the International Rice Research Institute, assessing the health condition of a rice plant through its leaves, which is usually done as a manual eyeball exercise, is important to come up with good nutrient and disease management strategies. In this paper, an automated system that can detect diseases present in a rice leaf using color image analysis is presented. In the system, the outlier region is first obtained from a rice leaf image to be tested using histogram intersection between the test and healthy rice leaf images. Upon obtaining the outlier, it is then subjected to a threshold-based K-means clustering algorithm to group related regions into clusters. Then, these clusters are subjected to further analysis to finally determine the suspected diseases of the rice leaf.
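    A minimal sketch of the two steps named above, histogram intersection against a healthy reference followed by clustering of outlier pixels, is given below; the synthetic colours, the outlier threshold, and the number of clusters are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch: histogram intersection against a healthy reference image,
# then K-means clustering of "outlier" pixels to group suspected lesion regions.
import numpy as np
from sklearn.cluster import KMeans

def histogram_intersection(h1, h2):
    """Similarity of two normalised colour histograms (1.0 = identical)."""
    return np.minimum(h1, h2).sum()

rng = np.random.default_rng(2)
healthy = rng.normal(loc=[60, 160, 60], scale=10, size=(5000, 3))   # greenish pixels
test = np.vstack([healthy[:4000],
                  rng.normal(loc=[140, 120, 40], scale=10, size=(1000, 3))])  # lesion-like pixels

bins = [np.linspace(0, 255, 17)] * 3
h_healthy, _ = np.histogramdd(healthy, bins=bins)
h_test, _ = np.histogramdd(test, bins=bins)
sim = histogram_intersection(h_healthy / h_healthy.sum(), h_test / h_test.sum())
print(f"histogram intersection with healthy reference: {sim:.2f}")

# Pixels far from the healthy colour model are "outliers"; cluster them into regions.
dist = np.linalg.norm(test - healthy.mean(axis=0), axis=1)
outliers = test[dist > 60]                       # crude threshold for illustration
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(outliers)
print("outlier pixels per cluster:", np.bincount(clusters))
```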

  8. Streak detection and analysis pipeline for optical images

    Science.gov (United States)

    Virtanen, J.; Granvik, M.; Torppa, J.; Muinonen, K.; Poikonen, J.; Lehti, J.; Säntti, T.; Komulainen, T.; Flohrer, T.

    2014-07-01

    We describe a novel data processing and analysis pipeline for optical observations of moving objects, either of natural (asteroids, meteors) or artificial origin (satellites, space debris). The monitoring of the space object populations requires reliable acquisition of observational data to support the development and validation of population models, and to build and maintain catalogues of orbital elements. The orbital catalogues are, in turn, needed for the assessment of close approaches (for asteroids, with the Earth; for satellites, with each other) and for the support of contingency situations or launches. For both types of populations, there is also increasing interest to detect fainter objects corresponding to the small end of the size distribution. We focus on the low signal-to-noise (SNR) detection of objects with high angular velocities, resulting in long and faint object trails, or streaks, in the optical images. The currently available, mature image processing algorithms for detection and astrometric reduction of optical data cover objects that cross the sensor field-of-view comparably slowly, and, particularly for satellites, within a rather narrow, predefined range of angular velocities. By applying specific tracking techniques, the objects appear point-like or as short trails in the exposures. However, the general survey scenario is always a 'track-before-detect' problem, resulting in streaks of arbitrary lengths. Although some considerations for low-SNR processing of streak-like features are available in the current image processing and computer vision literature, algorithms are not readily available yet. In the ESA-funded StreakDet (Streak detection and astrometric reduction) project, we develop and evaluate an automated processing pipeline applicable to single images (as compared to consecutive frames of the same field) obtained with any observing scenario, including space-based surveys and both low- and high-altitude populations. The algorithmic

  9. Detection and monitoring of neurotransmitters--a spectroscopic analysis.

    Science.gov (United States)

    Manciu, Felicia S; Lee, Kendall H; Durrer, William G; Bennet, Kevin E

    2013-01-01

    We demonstrate that confocal Raman mapping spectroscopy provides rapid, detailed, and accurate neurotransmitter analysis, enabling millisecond time resolution monitoring of biochemical dynamics. As a prototypical demonstration of the power of the method, we present real-time in vitro serotonin, adenosine, and dopamine detection, and dopamine diffusion in an inhomogeneous organic gel, which was used as a substitute for neurologic tissue.  Dopamine, adenosine, and serotonin were used to prepare neurotransmitter solutions in distilled water. The solutions were applied to the surfaces of glass slides, where they interdiffused. Raman mapping was achieved by detecting nonoverlapping spectral signatures characteristic of the neurotransmitters with an alpha 300 WITec confocal Raman system, using 532 nm neodymium-doped yttrium aluminum garnet laser excitation. Every local Raman spectrum was recorded in milliseconds and complete Raman mapping in a few seconds.  Without damage, dyeing, or preferential sample preparation, confocal Raman mapping provided positive detection of each neurotransmitter, allowing association of the high-resolution spectra with specific microscale image regions. Such information is particularly important for complex, heterogeneous samples, where changes in composition can influence neurotransmission processes. We also report an estimated dopamine diffusion coefficient two orders of magnitude smaller than that calculated by the flow-injection method.  Accurate nondestructive characterization for real-time detection of neurotransmitters in inhomogeneous environments without the requirement of sample labeling is a key issue in neuroscience. Our work demonstrates the capabilities of Raman spectroscopy in biological applications, possibly providing a new tool for elucidating the mechanism and kinetics of deep brain stimulation. © 2012 International Neuromodulation Society.

  10. Overlapping communities detection based on spectral analysis of line graphs

    Science.gov (United States)

    Gui, Chun; Zhang, Ruisheng; Hu, Rongjing; Huang, Guoming; Wei, Jiaxuan

    2018-05-01

    Communities in networks often overlap, with one vertex belonging to several clusters. Meanwhile, many networks show hierarchical structure, with communities recursively grouped into a hierarchical organization. In order to obtain overlapping communities from a global hierarchy of vertices, a new algorithm (named SAoLG) is proposed to build the hierarchical organization while detecting the overlap of community structure. SAoLG applies spectral analysis to line graphs to unify the overlapping and hierarchical structure of the communities. In order to avoid the limitations of absolute distances such as the Euclidean distance, SAoLG employs the angular distance to compute the similarity between vertices. Furthermore, we introduce a minor improvement to partition density and use it to evaluate the quality of community structure and to obtain more reasonable and sensible community numbers. The proposed SAoLG algorithm achieves a balance between overlap and hierarchy by applying spectral analysis to edge community detection. The experimental results on one standard network and six real-world networks show that the SAoLG algorithm achieves higher modularity and more reasonable community numbers than Ahn's algorithm, the classical CPM and the GN algorithm.
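    The following sketch illustrates the core idea of edge-community detection via a line graph: spectral clustering of the line graph's vertices (i.e., the original edges) and mapping the resulting edge communities back to overlapping vertex communities. It is a generic illustration on a standard test network, not the SAoLG algorithm itself.

```python
# Generic illustration: spectral clustering on a line graph yields edge communities,
# which induce overlapping vertex communities in the original network.
import networkx as nx
import numpy as np
from sklearn.cluster import SpectralClustering

G = nx.karate_club_graph()
L = nx.line_graph(G)                      # vertices of L are the edges of G
edges = list(L.nodes())
A = nx.to_numpy_array(L, nodelist=edges)  # adjacency of the line graph as affinity

k = 3
labels = SpectralClustering(n_clusters=k, affinity="precomputed",
                            random_state=0).fit_predict(A)

# Map edge communities back to vertices: a vertex belongs to every community
# that one of its incident edges belongs to, so overlaps arise naturally.
communities = {c: set() for c in range(k)}
for (u, v), c in zip(edges, labels):
    communities[c].update([u, v])

overlapping = [v for v in G if sum(v in members for members in communities.values()) > 1]
print("vertices in more than one community:", overlapping)
```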

  11. Network structure detection and analysis of Shanghai stock market

    Directory of Open Access Journals (Sweden)

    Sen Wu

    2015-04-01

    Full Text Available Purpose: In order to investigate the community structure of the component stocks of the SSE (Shanghai Stock Exchange) 180 index, a stock correlation network is built to find the intra-community and inter-community relationships. Design/methodology/approach: The stock correlation network is built by taking stocks as vertices and the correlation coefficients of the logarithmic returns of stock prices as edge weights. It is first built as an undirected weighted network. The GN algorithm is selected to detect community structure after converting the network into unweighted networks using different thresholds. Findings: The result of the network community structure analysis shows that the stock market has obvious industrial characteristics. Most of the stocks in the same industry or in the same supply chain are assigned to the same community. The correlation of stock price fluctuations within a community is closer than that between different communities. The result of community structure detection also reflects correlations among different industries. Originality/value: Based on the analysis of the community structure of the Shanghai stock market, the result reflects some industrial characteristics, which has reference value for the relationships among industries or sub-sectors of listed companies.
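    A minimal sketch of the network construction described above, correlating logarithmic returns, thresholding to an unweighted graph, and running the GN (Girvan-Newman) algorithm, is given below; the synthetic prices and the threshold value are placeholders, not the SSE 180 data or the thresholds used in the study.

```python
# Sketch of a stock correlation network: correlate log-returns, threshold the
# correlation matrix to an unweighted graph, then detect communities (Girvan-Newman).
import numpy as np
import networkx as nx
from networkx.algorithms.community import girvan_newman

rng = np.random.default_rng(3)
n_stocks, n_days = 20, 250
prices = np.cumprod(1 + rng.normal(0, 0.01, size=(n_days, n_stocks)), axis=0) * 100
log_returns = np.diff(np.log(prices), axis=0)

corr = np.corrcoef(log_returns, rowvar=False)    # correlation of log-returns

threshold = 0.05                                  # illustrative cut-off only
G = nx.Graph()
G.add_nodes_from(range(n_stocks))
for i in range(n_stocks):
    for j in range(i + 1, n_stocks):
        if corr[i, j] >= threshold:
            G.add_edge(i, j)                      # unweighted edge above threshold

communities = next(girvan_newman(G))              # first split of the GN algorithm
print([sorted(c) for c in communities])
```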

  12. Integrated situational awareness for cyber attack detection, analysis, and mitigation

    Science.gov (United States)

    Cheng, Yi; Sagduyu, Yalin; Deng, Julia; Li, Jason; Liu, Peng

    2012-06-01

    Real-time cyberspace situational awareness is critical for securing and protecting today's enterprise networks from various cyber threats. When a security incident occurs, network administrators and security analysts need to know what exactly has happened in the network, why it happened, and what actions or countermeasures should be taken to quickly mitigate the potential impacts. In this paper, we propose an integrated cyberspace situational awareness system for efficient cyber attack detection, analysis and mitigation in large-scale enterprise networks. Essentially, a cyberspace common operational picture will be developed, which is a multi-layer graphical model and can efficiently capture and represent the statuses, relationships, and interdependencies of various entities and elements within and among different levels of a network. Once shared among authorized users, this cyberspace common operational picture can provide an integrated view of the logical, physical, and cyber domains, and a unique visualization of disparate data sets to support decision makers. In addition, advanced analyses, such as Bayesian Network analysis, will be explored to address the information uncertainty, dynamic and complex cyber attack detection, and optimal impact mitigation issues. All the developed technologies will be further integrated into an automatic software toolkit to achieve near real-time cyberspace situational awareness and impact mitigation in large-scale computer networks.

  13. Detection of Adult Green Sturgeon Using Environmental DNA Analysis.

    Directory of Open Access Journals (Sweden)

    Paul S Bergman

    Full Text Available Environmental DNA (eDNA) is an emerging sampling method that has been used successfully for detection of rare aquatic species. The identification of sampling tools that are less stressful for target organisms has become increasingly important for rare and endangered species. A decline in abundance of the Southern Distinct Population Segment (DPS) of North American Green Sturgeon located in California's Central Valley led to its listing as Threatened under the Federal Endangered Species Act in 2006. While visual surveys of spawning Green Sturgeon in the Central Valley are effective at monitoring fish densities in concentrated pool habitats, the results do not scale well to the watershed level, providing limited spatial and temporal context. Unlike most traditional survey methods, environmental DNA analysis provides a relatively quick, inexpensive tool that could efficiently monitor the presence and distribution of aquatic species. We positively identified Green Sturgeon DNA at two locations of known presence in the Sacramento River, showing that eDNA can be effective for monitoring the presence of adult sturgeon. While further study is needed to understand the uncertainties of the sampling method, our study represents the first documented detection of Green Sturgeon eDNA, indicating that eDNA analysis could provide a new tool for monitoring Green Sturgeon distribution in the Central Valley, complementing traditional ongoing survey methods.

  14. Non-Harmonic Fourier Analysis for bladed wheels damage detection

    Science.gov (United States)

    Neri, P.; Peeters, B.

    2015-11-01

    The interaction between bladed wheels and the fluid distributed by the stator vanes results in cyclic loading of the rotating components. Compressor and turbine wheels are subject to vibration and fatigue issues, especially when resonance conditions are excited. Even if resonance conditions can often be predicted and avoided, high cycle fatigue failures can occur, causing safety issues and economic loss. Rigorous maintenance programs are then needed, forcing expensive system shut-downs. Blade crack detection methods are beneficial for condition-based maintenance. While contact measurement systems are not always usable in operating conditions (e.g. high temperature), non-contact methods can be more suitable. One (or more) stator-fixed sensor can measure all the blades as they pass by, in order to detect the damaged ones. The main drawback of this configuration is the short acquisition time available for each blade, which is shortened further by the high rotational speed of the components. A traditional Discrete Fourier Transform (DFT) analysis would result in a poor frequency resolution. A Non-Harmonic Fourier Analysis (NHFA) can instead be executed with an arbitrary frequency resolution, allowing frequency information to be obtained even from short-time data samples. This paper presents an analytical investigation of the NHFA method. A data processing algorithm is then proposed to obtain frequency shift information from short time samples. The performance of this algorithm is then studied by experimental and numerical tests.
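    As a brief illustration of the idea behind NHFA, the sketch below evaluates Fourier coefficients on an arbitrarily fine frequency grid rather than only at the DFT bin frequencies, so that a short record can still localise a tone; the sampling rate, record length, and frequencies are illustrative assumptions, not values from the paper.

```python
# Minimal numpy sketch: project a short record onto complex exponentials at
# arbitrary (non-harmonic) frequencies instead of the coarse DFT grid.
import numpy as np

fs = 10_000.0                    # sampling rate [Hz]
t = np.arange(0, 0.02, 1 / fs)   # only 20 ms of data, as for a passing blade
f_true = 1234.0                  # tone to recover [Hz]
x = np.sin(2 * np.pi * f_true * t)

# The DFT bin spacing would be fs/N = 50 Hz here; scan a 1 Hz grid instead.
freqs = np.arange(1000.0, 1500.0, 1.0)
coeffs = np.array([np.abs(np.sum(x * np.exp(-2j * np.pi * f * t))) for f in freqs])
print("estimated frequency:", freqs[np.argmax(coeffs)], "Hz")
```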

  15. Frontiers in In-Situ Cosmic Dust Detection and Analysis

    International Nuclear Information System (INIS)

    Sternovsky, Zoltan; Auer, Siegfried; Drake, Keith; Gruen, Eberhard; Horanyi, Mihaly; Le, Huy; Xie Jianfeng; Srama, Ralf

    2011-01-01

    In-situ cosmic dust instruments and measurements played a critical role in the emergence of the field of dusty plasmas. The major breakthroughs included the discovery of β-meteoroids, interstellar dust particles within the solar system, Jovian stream particles, and the detection and analysis of Enceladus's plumes. The science goals of cosmic dust research require the measurements of the charge, the spatial, size and velocity distributions, and the chemical and isotopic compositions of individual dust particles. In-situ dust instrument technology has improved significantly in the last decade. Modern dust instruments with high sensitivity can detect submicron-sized particles even at low impact velocities. Innovative ion optics methods deliver high mass resolution, m/dm>100, for chemical and isotopic analysis. The accurate trajectory measurement of cosmic dust is made possible even for submicron-sized grains using the Dust Trajectory Sensor (DTS). This article is a brief review of the current capabilities of modern dust instruments, future challenges and opportunities in cosmic dust research.

  16. Detection and analysis of diamond fingerprinting feature and its application

    Energy Technology Data Exchange (ETDEWEB)

    Li Xin; Huang Guoliang; Li Qiang; Chen Shengyi, E-mail: tshgl@tsinghua.edu.cn [Department of Biomedical Engineering, the School of Medicine, Tsinghua University, Beijing, 100084 (China)

    2011-01-01

    Before becoming jewelry, diamonds need to be artistically cut with special geometric features forming a polyhedral structure. There are subtle differences in this polyhedral structure in each diamond. With spatial frequency spectrum analysis of the diamond surface structure, we can obtain diamond fingerprint information, which represents a 'Diamond ID' and has good specificity. Based on optical Fourier transform spatial spectrum analysis, fingerprint identification of the surface structure of diamonds in the spatial frequency domain was studied in this paper. We constructed both a completely coherent diamond fingerprinting detection system illuminated by a laser and a partially coherent diamond fingerprinting detection system illuminated by an LED, and analyzed the effect of the coherence of the light source on the diamond fingerprinting feature. We studied the rotation invariance and translation invariance of the diamond fingerprint and verified the feasibility of real-time and accurate identification of diamond fingerprints. With the benefit of this work, we can provide customs, jewelers and consumers with a real-time and reliable diamond identification instrument, which will curb diamond smuggling, theft and other crimes, and ensure the healthy development of the diamond industry.

  17. Piezoresistive microcantilever aptasensor for ricin detection and kinetic analysis

    Directory of Open Access Journals (Sweden)

    Zhi-Wei Liu

    2015-04-01

    Full Text Available To date, there has been no report of target molecule detection by a piezoresistive microcantilever aptasensor. In order to evaluate the test performance and investigate the dynamic response characteristics of a piezoresistive microcantilever aptasensor, a novel method for ricin detection and kinetic analysis based on a piezoresistive microcantilever aptasensor is proposed, in which a ricin aptamer was immobilised on the microcantilever surface by a biotin-avidin binding system. Results showed that the detection limit for ricin was 0.04 μg L−1 (S/N ≥ 3). A linear relationship between the response voltage and the concentration of ricin in the range of 0.2 μg L−1 to 40 μg L−1 was obtained, with the linear regression equation ΔUe = 0.904C + 5.852 (n = 5, R = 0.991, p < 0.001). The sensor showed no response to abrin or BSA, and could overcome the influence of complex environmental interferents, indicating high specificity and good selectivity. Recovery and reproducibility in the determination of simulated samples (simulated water, soil, and flour samples) met the analysis requirements, being 90.5∼95.5% and 7.85%∼9.39%, respectively. On this basis, a reaction kinetic model based on ligand-receptor binding and its relationship with the response voltage was established. The model could reflect the dynamic response of the sensor well. The correlation coefficient (R) was greater than or equal to 0.9456 (p < 0.001). The response voltage (ΔUe) and response time (t0) obtained from the fitting equation for different concentrations of ricin fitted well with the measured values.

  18. Dynamic analysis methods for detecting anomalies in asynchronously interacting systems

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Akshat; Solis, John Hector; Matschke, Benjamin

    2014-01-01

    Detecting modifications to digital system designs, whether malicious or benign, is problematic due to the complexity of the systems being analyzed. Moreover, static analysis techniques and tools can only be used during the initial design and implementation phases to verify safety and liveness properties. It is computationally intractable to guarantee that any previously verified properties still hold after a system, or even a single component, has been produced by a third-party manufacturer. In this paper we explore new approaches for creating a robust system design by investigating highly-structured computational models that simplify verification and analysis. Our approach avoids the need to fully reconstruct the implemented system by incorporating a small verification component that dynamically detects deviations from the design specification at run-time. The first approach encodes information extracted from the original system design algebraically into a verification component. During run-time this component randomly queries the implementation for trace information and verifies that no design-level properties have been violated. If any deviation is detected then a pre-specified fail-safe or notification behavior is triggered. Our second approach utilizes a partitioning methodology to view liveness and safety properties as a distributed decision task and the implementation as a proposed protocol that solves this task. Thus the problem of verifying safety and liveness properties is translated to that of verifying that the implementation solves the associated decision task. We develop upon results from distributed systems and algebraic topology to construct a learning mechanism for verifying safety and liveness properties from samples of run-time executions.

  19. "Textural analysis of multiparametric MRI detects transition zone prostate cancer".

    Science.gov (United States)

    Sidhu, Harbir S; Benigno, Salvatore; Ganeshan, Balaji; Dikaios, Nikos; Johnston, Edward W; Allen, Clare; Kirkham, Alex; Groves, Ashley M; Ahmed, Hashim U; Emberton, Mark; Taylor, Stuart A; Halligan, Steve; Punwani, Shonit

    2017-06-01

    To evaluate multiparametric-MRI (mpMRI) derived histogram textural-analysis parameters for detection of transition zone (TZ) prostatic tumour. Sixty-seven consecutive men with suspected prostate cancer underwent 1.5T mpMRI prior to template-mapping-biopsy (TPM). Twenty-six men had 'significant' TZ tumour. Two radiologists in consensus matched TPM to the single axial slice best depicting tumour, or largest TZ diameter for those with benign histology, to define single-slice whole TZ-regions-of-interest (ROIs). Textural-parameter differences between single-slice whole TZ-ROI containing significant tumour versus benign/insignificant tumour were analysed using Mann Whitney U test. Diagnostic accuracy was assessed by receiver operating characteristic area under curve (ROC-AUC) analysis cross-validated with leave-one-out (LOO) analysis. ADC kurtosis was significantly lower (p Textural features of the whole prostate TZ can discriminate significant prostatic cancer through reduced kurtosis of the ADC-histogram where significant tumour is included in TZ-ROI and reduced T1 entropy independent of tumour inclusion. • MR textural features of prostate transition zone may discriminate significant prostatic cancer. • Transition zone (TZ) containing significant tumour demonstrates a less peaked ADC histogram. • TZ containing significant tumour reveals higher post-contrast T1-weighted homogeneity. • The utility of MR texture analysis in prostate cancer merits further investigation.

  20. Non destructive defect detection by spectral density analysis.

    Science.gov (United States)

    Krejcar, Ondrej; Frischer, Robert

    2011-01-01

    The potential for nondestructive diagnostics of solid objects is discussed in this article. The whole process is accomplished in consecutive steps involving software analysis of the vibration power spectrum (or of acoustic emissions) created during the normal operation of the diagnosed device or under unexpected situations. Another option is to create an artificial pulse, which can help determine the actual state of the diagnosed device. The main idea of the method is based on analysis of the power spectral density of the received signal and its postprocessing in the Matlab environment, followed by sample comparison in the Statistica software environment. The last step, the comparison of samples, is the most important, because it makes it possible to determine the status of the examined object at a given time. At present, samples are compared only visually, but this approach does not produce good results. Furthermore, the presented filter can select the relevant data from the large volume of data produced by applying the FFT (Fast Fourier Transform). The selected data can then be analyzed with the assistance of a neural network. If correct and high-quality starting data are provided to the initial network, we are able to analyze other samples and state the condition of a given object. The success rate of this approximation, based on our testing of the solution, is currently 85.7%. With further improvement of the filter, it could be even greater. Finally, it is possible to detect defective conditions or approaching limit states of examined objects/materials by using a single device containing the HW and SW parts. This kind of detection can provide significant financial savings in certain cases (such as continuous casting of iron, where it could save hundreds of thousands of USD).
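    The signal-processing step described above can be illustrated with the following hedged sketch: estimating the vibration power spectral density of a reference and a suspect signal and comparing them numerically rather than visually. The synthetic signals, Welch parameters, and comparison rule are assumptions for the example, not the authors' Matlab/Statistica workflow.

```python
# Sketch: compare power spectral densities of a healthy and a "defective" signal.
import numpy as np
from scipy.signal import welch

fs = 5_000.0
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(4)

healthy = np.sin(2 * np.pi * 120 * t) + 0.3 * rng.normal(size=t.size)
defective = healthy + 0.8 * np.sin(2 * np.pi * 780 * t)   # extra resonance from a flaw

f, psd_healthy = welch(healthy, fs=fs, nperseg=1024)
_, psd_defective = welch(defective, fs=fs, nperseg=1024)

# A simple numeric comparison: relative change of spectral power per frequency band.
rel_change = (psd_defective - psd_healthy) / (psd_healthy + 1e-12)
suspect = f[rel_change > 5]                               # bands with large growth
print("frequency bands with strongly increased power:", suspect[:5], "Hz")
```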

  1. Detection and Analysis of Circular RNAs by RT-PCR.

    Science.gov (United States)

    Panda, Amaresh C; Gorospe, Myriam

    2018-03-20

    Gene expression in eukaryotic cells is tightly regulated at the transcriptional and posttranscriptional levels. Posttranscriptional processes, including pre-mRNA splicing, mRNA export, mRNA turnover, and mRNA translation, are controlled by RNA-binding proteins (RBPs) and noncoding (nc)RNAs. The vast family of ncRNAs comprises diverse regulatory RNAs, such as microRNAs and long noncoding (lnc)RNAs, but also the poorly explored class of circular (circ)RNAs. Although first discovered more than three decades ago by electron microscopy, only the advent of high-throughput RNA-sequencing (RNA-seq) and the development of innovative bioinformatic pipelines have begun to allow the systematic identification of circRNAs (Szabo and Salzman, 2016; Panda et al., 2017b; Panda et al., 2017c). However, the validation of true circRNAs identified by RNA sequencing requires other molecular biology techniques including reverse transcription (RT) followed by conventional or quantitative (q) polymerase chain reaction (PCR), and Northern blot analysis (Jeck and Sharpless, 2014). RT-qPCR analysis of circular RNAs using divergent primers has been widely used for the detection, validation, and sometimes quantification of circRNAs (Abdelmohsen et al., 2015 and 2017; Panda et al., 2017b). As detailed here, divergent primers designed to span the circRNA backsplice junction sequence can specifically amplify the circRNAs and not the counterpart linear RNA. In sum, RT-PCR analysis using divergent primers allows direct detection and quantification of circRNAs.

  2. Neutron activation analysis detection limits using 252Cf sources

    International Nuclear Information System (INIS)

    DiPrete, D.P.; Sigg, R.A.

    2000-01-01

    The Savannah River Technology Center (SRTC) developed a neutron activation analysis (NAA) facility several decades ago using low-flux 252Cf neutron sources. Through this time, the facility has addressed areas of applied interest in managing the Savannah River Site (SRS). Some applications are unique because of the site's operating history and its chemical-processing facilities. Because sensitivity needs for many applications are not severe, they can be accomplished using an ∼6-mg 252Cf NAA facility. The SRTC 252Cf facility continues to support applied research programs at SRTC as well as other SRS programs for environmental and waste management customers. Samples analyzed by NAA include organic compounds, metal alloys, sediments, site process solutions, and many other materials. Numerous radiochemical analyses also rely on the facility for production of short-lived tracers, yielding by activation of carriers, and small-scale isotope production for separation methods testing. These applications are more fully reviewed in Ref. 1. Although the flux [approximately 2 × 10⁷ n/cm²·s] is low relative to reactor facilities, more than 40 elements can be detected at low and sub-part-per-million levels. Detection limits provided by the facility are adequate for many analytical projects. Other multielement analysis methods, particularly inductively coupled plasma atomic emission and inductively coupled plasma mass spectrometry, can now provide sensitivities on dissolved samples that are often better than those available by NAA using low-flux isotopic sources. However, because NAA allows analysis of bulk samples, (a) it is a more cost-effective choice than methods that require digestion when its sensitivity is adequate, and (b) it eliminates uncertainties that can be introduced by digestion processes.

  3. Linkage analysis: Inadequate for detecting susceptibility loci in complex disorders?

    Energy Technology Data Exchange (ETDEWEB)

    Field, L.L.; Nagatomi, J. [Univ. of Calgary, Alberta (Canada)

    1994-09-01

    Insulin-dependent diabetes mellitus (IDDM) may provide valuable clues about approaches to detecting susceptibility loci in other oligogenic disorders. Numerous studies have demonstrated significant association between IDDM and a VNTR in the 5′ flanking region of the insulin (INS) gene. Paradoxically, all attempts to demonstrate linkage of IDDM to this VNTR have failed. Lack of linkage has been attributed to insufficient marker locus information, genetic heterogeneity, or high frequency of the IDDM-predisposing allele in the general population. Tyrosine hydroxylase (TH) is located 2.7 kb from INS on the 5′ side of the VNTR and shows linkage disequilibrium with INS region loci. We typed a highly polymorphic microsatellite within TH in 176 multiplex families, and performed parametric (lod score) linkage analysis using various intermediate reduced penetrance models for IDDM (including rare and common disease allele frequencies), as well as non-parametric (affected sib pair) linkage analysis. The scores significantly reject linkage for recombination values of 0.05 or less, excluding the entire 19 kb region containing TH, the 5′ VNTR, the INS gene, and IGF2 on the 3′ side of INS. Non-parametric linkage analysis also provided no significant evidence for linkage (mean TH allele sharing 52.5%, P = 0.12). These results have important implications for efforts to locate genes predisposing to complex disorders, strongly suggesting that regions which are significantly excluded by linkage methods may nevertheless contain predisposing genes readily detectable by association methods. We advocate that investigators routinely perform association analyses in addition to linkage analyses.

  4. Growth Curve Analysis and Change-Points Detection in Extremes

    KAUST Repository

    Meng, Rui

    2016-05-15

    The thesis consists of two coherent projects. The first project presents the results of evaluating salinity tolerance in barley using growth curve analysis, where different growth trajectories are observed within barley families. The study of salinity tolerance in plants is crucial to understanding plant growth and productivity. Because fully-automated smarthouses with conveyor systems allow non-destructive and high-throughput phenotyping of large numbers of plants, it is now possible to apply advanced statistical tools to analyze daily measurements and to study salinity tolerance. To compare different growth patterns of barley varieties, we use functional data analysis techniques to analyze the daily projected shoot areas. In particular, we apply the curve registration method to align all the curves from the same barley family in order to summarize the family-wise features. We also illustrate how to use statistical modeling to account for spatial variation in microclimate in smarthouses and for temporal variation across runs, which is crucial for identifying traits of the barley varieties. In our analysis, we show that the concentrations of sodium and potassium in leaves are negatively correlated, and their interactions are associated with the degree of salinity tolerance. The second project studies change-point detection methods in extremes when multiple time series data are available. Motivated by the scientific question of whether the chances of experiencing extreme weather are different in different seasons of a year, we develop a change-point detection model to study changes in extremes or in the tail of a distribution. Most existing models identify seasons from multiple yearly time series assuming that a season or a change-point location remains exactly the same across years. In this work, we propose a random effect model that allows the change-point to vary from year to year, following a given distribution. Both parametric and nonparametric methods are developed.

  5. Integrating Pavement Crack Detection and Analysis Using Autonomous Unmanned Aerial Vehicle Imagery

    Science.gov (United States)

    2015-03-27

    AFIT-ENV-MS-15-M-195: Integrating Pavement Crack Detection and Analysis Using Autonomous Unmanned Aerial Vehicle Imagery. Approved for public release; distribution unlimited.

  6. Towards the automatic detection and analysis of sunspot rotation

    Science.gov (United States)

    Brown, Daniel S.; Walker, Andrew P.

    2016-10-01

    Torsional rotation of sunspots has been noted by many authors over the past century. Sunspots have been observed to rotate by up to the order of 200 degrees over 8-10 days, and these rotations have often been linked with eruptive behaviour such as solar flares and coronal mass ejections. However, most studies in the literature are case studies or small-number studies which suffer from selection bias. In order to better understand sunspot rotation and its impact on the corona, unbiased large-sample statistical studies are required (including both rotating and non-rotating sunspots). While this can be done manually, a better approach is to automate the detection and analysis of rotating sunspots using robust methods with well characterised uncertainties. The SDO/HMI instrument provides long-duration, high-resolution and high-cadence continuum observations suitable for extracting a large number of examples of rotating sunspots. This presentation will outline the analysis of SDO/HMI data to determine the rotation (and non-rotation) profiles of sunspots for the complete duration of their transit across the solar disk, along with how this can be extended to automatically identify sunspots and initiate their analysis.

  7. Indirect Detection Analysis: Wino Dark Matter Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Hryczuk, Andrzej [Munich, Tech. U.; Cholis, Ilias [Fermilab; Iengo, Roberto [SISSA, Trieste; Tavakoli, Maryam [IPM, Tehran; Ullio, Piero [INFN, Trieste

    2014-07-15

    We perform a multichannel analysis of the indirect signals for Wino Dark Matter, including one-loop electroweak and Sommerfeld enhancement corrections. We derive limits from cosmic ray antiprotons and positrons, from continuum galactic and extragalactic diffuse γ-ray spectra, from the absence of γ-ray line features at the galactic center above 500 GeV in energy, from γ-rays toward nearby dwarf spheroidal galaxies and galaxy clusters, and from CMB power spectra. Additionally, we show the future prospects for neutrino observations toward the inner Galaxy and from antideuteron searches. For each of these indirect detection probes we include and discuss the relevance of the most important astrophysical uncertainties that can impact the strength of the derived limits. We find that the Wino as a dark matter candidate is excluded in the mass range below ≃ 800 GeV from antiprotons and between 1.8 and 3.5 TeV from the absence of a γ-ray line feature toward the galactic center. Limits from other indirect detection probes confirm the main bulk of the excluded mass ranges.

  8. Joint Attributes and Event Analysis for Multimedia Event Detection.

    Science.gov (United States)

    Ma, Zhigang; Chang, Xiaojun; Xu, Zhongwen; Sebe, Nicu; Hauptmann, Alexander G

    2017-06-15

    Semantic attributes have been increasingly used in the past few years for multimedia event detection (MED) with promising results. The motivation is that multimedia events generally consist of lower level components such as objects, scenes, and actions. By characterizing multimedia event videos with semantic attributes, one could exploit more informative cues for improved detection results. Much existing work obtains semantic attributes from images, which may be suboptimal for video analysis since these image-inferred attributes do not carry dynamic information that is essential for videos. To address this issue, we propose to learn semantic attributes from external videos using their semantic labels. We name them video attributes in this paper. In contrast with multimedia event videos, these external videos depict lower level contents such as objects, scenes, and actions. To harness video attributes, we propose an algorithm established on a correlation vector that correlates them to a target event. Consequently, we could incorporate video attributes latently as extra information into the event detector learnt from multimedia event videos in a joint framework. To validate our method, we perform experiments on the real-world large-scale TRECVID MED 2013 and 2014 data sets and compare our method with several state-of-the-art algorithms. The experiments show that our method is advantageous for MED.

  9. Fraud diamond: Detection analysis on the fraudulent financial reporting

    Directory of Open Access Journals (Sweden)

    Stefani Lily Indarto

    2016-11-01

    Full Text Available The accounting scandals became one of the reasons for analyzing financial statements in order to minimize fraud in financial reporting. Therefore, companies use the services of public accountants to audit their financial statements, which is expected to limit fraudulent practices and increase the public's confidence in the companies' financial statements. This study aims to detect fraud by using fraud diamond analysis. This study takes banking companies listed on the Indonesian Stock Exchange in 2009-2014, with a total sample of 149 banks. Based on the results, external pressure, financial stability and capability have an influence on fraudulent financial reporting, while financial targets, ineffective monitoring and rationalization do not affect fraudulent financial reporting.

  10. Packer Detection for Multi-Layer Executables Using Entropy Analysis

    Directory of Open Access Journals (Sweden)

    Munkhbayar Bat-Erdene

    2017-03-01

    Full Text Available Packing algorithms are broadly used to evade anti-malware systems, and the proportion of packed malware has been growing rapidly. However, only a few studies have been conducted on detecting various types of packing algorithms in a systematic way. Following this understanding, we elaborate a method to classify the packing algorithm of a given executable into three categories: single-layer packing, re-packing, or multi-layer packing. We convert the entropy values of the executable file loaded into memory into symbolic representations, for which we use SAX (Symbolic Aggregate Approximation). Based on experiments with 2196 programs and 19 packing algorithms, we find that the precision (97.7%), accuracy (97.5%), and recall (96.8%) of our method are high enough to confirm that entropy analysis is applicable to identifying packing algorithms.
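    A short sketch of the two ingredients named above, a sliding-window Shannon-entropy profile of an executable's bytes and a SAX-style symbolisation of that profile, is given below; the window size, alphabet, breakpoints, and the synthetic byte stream are assumptions for illustration, not the authors' settings.

```python
# Illustrative entropy profile of a byte stream plus a SAX-style symbolisation.
import numpy as np

def shannon_entropy(block: bytes) -> float:
    counts = np.bincount(np.frombuffer(block, dtype=np.uint8), minlength=256)
    p = counts[counts > 0] / len(block)
    return float(-(p * np.log2(p)).sum())

def entropy_profile(data: bytes, window: int = 4096) -> np.ndarray:
    return np.array([shannon_entropy(data[i:i + window])
                     for i in range(0, len(data) - window + 1, window)])

def sax_symbols(profile: np.ndarray, alphabet: str = "abcd") -> str:
    z = (profile - profile.mean()) / (profile.std() + 1e-12)   # z-normalise
    breakpoints = [-0.67, 0.0, 0.67]                           # 4-symbol Gaussian cuts
    return "".join(alphabet[np.searchsorted(breakpoints, v)] for v in z)

# Packed sections tend to look random (entropy near 8 bits/byte); plain code does not.
rng = np.random.default_rng(5)
fake_exe = bytes(rng.integers(0, 64, 64 * 1024, dtype=np.uint8)) + rng.bytes(64 * 1024)
profile = entropy_profile(fake_exe)
print(sax_symbols(profile))    # low symbols for code-like data, high symbols for packed data
```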

  11. GOTRES: an expert system for fault detection and analysis

    International Nuclear Information System (INIS)

    Chung, D.T.; Modarres, M.

    1989-01-01

    This paper describes a deep-knowledge expert system shell for diagnosing faults in process operations. The expert program shell is called GOTRES (GOal TRee Expert System) and uses a goal tree-success tree deep-knowledge structure to model its knowledge-base. To demonstrate GOTRES, we have built an on-line fault diagnosis expert system for an experimental nuclear reactor facility using this shell. The expert system is capable of diagnosing fault conditions using system goal tree as well as utilizing accumulated operating knowledge to predict plant causal and temporal behaviours. The GOTRES shell has also been used for root-cause detection and analysis in a nuclear plant. (author)

  12. Remote calorimetric detection of urea via flow injection analysis.

    Science.gov (United States)

    Gaddes, David E; Demirel, Melik C; Reeves, W Brian; Tadigadapa, Srinivas

    2015-12-07

    The design and development of a calorimetric biosensing system enabling relatively high throughput sample analysis are reported. The calorimetric biosensor system consists of a thin (∼20 μm) micromachined Y-cut quartz crystal resonator (QCR) as a temperature sensor placed in close proximity to a fluidic chamber packed with an immobilized enzyme. Layer-by-layer enzyme immobilization of urease is demonstrated and its activity as a function of the number of layers, pH, and time has been evaluated. This configuration enables a sensing system where the transducer element is physically separated from the analyte solution of interest and is thereby free from the fouling effects typically associated with biochemical reactions occurring on the sensor surface. The performance of this biosensing system is demonstrated by detection of 1-200 mM urea in phosphate buffer via a flow injection analysis (FIA) technique. Miniaturized fluidic systems were used to provide continuous flow through a reaction column. Under this configuration the biosensor has an ultimate resolution of less than 1 mM urea and showed a linear response between 0 and 50 mM. This work demonstrates a sensing modality in which the sensor itself is not fouled or contaminated by the solution of interest, and the enzyme-immobilized Kapton® fluidic reaction column can be used as a disposable cartridge. Such a system enables reuse and reliability for long term sampling measurements. Based on this concept, a biosensing system is envisioned which can perform rapid measurements to detect biomarkers such as glucose, creatinine, cholesterol, urea and lactate in urine and blood continuously over extended periods of time.

  13. Image analysis and machine learning for detecting malaria.

    Science.gov (United States)

    Poostchi, Mahdieh; Silamut, Kamolrat; Maude, Richard J; Jaeger, Stefan; Thoma, George

    2018-04-01

    Malaria remains a major burden on global health, with roughly 200 million cases worldwide and more than 400,000 deaths per year. Besides biomedical research and political efforts, modern information technology is playing a key role in many attempts at fighting the disease. One of the barriers toward a successful mortality reduction has been inadequate malaria diagnosis in particular. To improve diagnosis, image analysis software and machine learning methods have been used to quantify parasitemia in microscopic blood slides. This article gives an overview of these techniques and discusses the current developments in image analysis and machine learning for microscopic malaria diagnosis. We organize the different approaches published in the literature according to the techniques used for imaging, image preprocessing, parasite detection and cell segmentation, feature computation, and automatic cell classification. Readers will find the different techniques listed in tables, with the relevant articles cited next to them, for both thin and thick blood smear images. We also discuss the latest developments in sections devoted to deep learning and smartphone technology for future malaria diagnosis. Published by Elsevier Inc.

  14. URBAN DETECTION, DELIMITATION AND MORPHOLOGY: COMPARATIVE ANALYSIS OF SELECTIVE "MEGACITIES"

    Directory of Open Access Journals (Sweden)

    B. Alhaddad

    2012-08-01

    Full Text Available Over the last 50 years, the world has faced an impressive growth of urban population. The walled city, closed to the outside, an "island" of economic activity and population density within the rural land, has given way to the spread of urban life and urban networks across almost all the territory. There has been, as Margalef (1999) put it, "a topological inversion of the landscape". The "urban" has gone from being an island in the ocean of rural land vastness to representing the totality of the space in which natural and rural "systems" are embedded. New phenomena such as the fall of the Fordist model of production, the spread of urbanization known as urban sprawl, and the change of scale of the metropolis, covering increasingly large regions called "megalopolis" (Gottmann, 1961), have characterized the century. However, there are no rigorous databases capable of measuring and evaluating the phenomenon of megacities and, in general, the process of urbanization in the contemporary world. The aim of this paper is to detect, identify and analyze the morphology of megacities through remote sensing instruments as well as various landscape indicators. To understand the structure of these heterogeneous landscapes called megacities, land consumption and spatial complexity need to be quantified accurately. Remote sensing might be helpful in evaluating how the different land covers shape urban megaregions. Morphological landscape analysis allows establishing the analogies and differences between patterns of cities and studying the symmetry, growth direction, linearity, complexity and compactness of the urban form. The main objective of this paper is to develop a new methodology to detect the urbanized land of some megacities around the world (Tokyo, Mexico, Chicago, New York, London, Moscow, Sao Paulo and Shanghai) using Landsat 7 images.

  15. Statistical Analysis of Data with Non-Detectable Values

    Energy Technology Data Exchange (ETDEWEB)

    Frome, E.L.

    2004-08-26

    Environmental exposure measurements are, in general, positive and may be subject to left censoring, i.e. the measured value is less than a "limit of detection". In occupational monitoring, strategies for assessing workplace exposures typically focus on the mean exposure level or the probability that any measurement exceeds a limit. A basic problem of interest in environmental risk assessment is to determine if the mean concentration of an analyte is less than a prescribed action level. Parametric methods, used to determine acceptable levels of exposure, are often based on a two parameter lognormal distribution. The mean exposure level and/or an upper percentile (e.g. the 95th percentile) are used to characterize exposure levels, and upper confidence limits are needed to describe the uncertainty in these estimates. In certain situations it is of interest to estimate the probability of observing a future (or "missed") value of a lognormal variable. Statistical methods for random samples (without non-detects) from the lognormal distribution are well known for each of these situations. In this report, methods for estimating these quantities based on the maximum likelihood method for randomly left censored lognormal data are described and graphical methods are used to evaluate the lognormal assumption. If the lognormal model is in doubt and an alternative distribution for the exposure profile of a similar exposure group is not available, then nonparametric methods for left censored data are used. The mean exposure level, along with the upper confidence limit, is obtained using the product limit estimate, and the upper confidence limit on the 95th percentile (i.e. the upper tolerance limit) is obtained using a nonparametric approach. All of these methods are well known but computational complexity has limited their use in routine data analysis with left censored data. The recent development of the R environment for statistical
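
    A small illustration of the maximum likelihood idea described above for left-censored lognormal data, written with SciPy rather than the report's R-based tools; the detection limit, the data values, and the function name are hypothetical.

        import numpy as np
        from scipy import stats, optimize

        def censored_lognormal_mle(values, detected):
            """ML estimates (mu, sigma) of the log-scale parameters with left censoring.

            values   : measured value if detected, else the limit of detection
            detected : boolean array, False where the value is a non-detect
            """
            y = np.log(np.asarray(values, dtype=float))
            detected = np.asarray(detected, dtype=bool)

            def negloglik(theta):
                mu, log_sigma = theta
                sigma = np.exp(log_sigma)
                ll = stats.norm.logpdf(y[detected], mu, sigma).sum()
                ll += stats.norm.logcdf(y[~detected], mu, sigma).sum()   # P(X < LOD)
                return -ll

            res = optimize.minimize(negloglik, x0=[y.mean(), np.log(y.std() + 1e-6)])
            return res.x[0], np.exp(res.x[1])

        # Hypothetical exposure data: three non-detects reported at a LOD of 0.5
        vals = [0.5, 0.5, 0.5, 0.8, 1.2, 2.5, 3.1, 4.0]
        det  = [False, False, False, True, True, True, True, True]
        mu, sigma = censored_lognormal_mle(vals, det)
        print("estimated mean exposure:", np.exp(mu + sigma**2 / 2))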

  16. A signal detection theory analysis of an unconscious perception effect.

    Science.gov (United States)

    Haase, S J; Theios, J; Jenison, R

    1999-07-01

    The independent observation model (Macmillan & Creelman, 1991) is fitted to detection-identification data collected under conditions of heavy masking. The model accurately predicts a quantitative relationship between stimulus detection and stimulus identification over a wide range of detection performance. This model can also be used to offer a signal detection interpretation of the common finding of above-chance identification following a missed signal. While our finding is not a new one, the stimuli used in this experiment (redundant three-letter strings) differ slightly from those used in traditional signal detection work. Also, the stimuli were presented very briefly and heavily masked, conditions typical in the study of unconscious perception effects.

  17. Lower detectable limit of sulfur by fast neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Shani, G; Cohen, D [Ben-Gurion Univ. of the Negev, Beersheba (Israel). Dept. of Nuclear Engineering

    1976-07-01

    For the purpose of air pollution research, the possibility of fast neutron activation analysis of sulfur was investigated. The only reaction that can be used for this purpose is ³⁴S(n,p)³⁴P. A rabbit system was installed, synchronized with a 150 kV D-T neutron generator and an electronic analysing system. The whole system was operated so that the sample was irradiated for 10 s and the 2.13 MeV γ-ray was counted for 10 s. Five samples were prepared containing between 0.1 and 0.5 g of sulfur. Each measurement lasted 30 min and the activity was plotted as a function of sulfur weight. The relative error increases sharply when the amount of sulfur is below 0.1 g, which sets the lower detectable limit. Collecting more than 0.1 g of sulfur, even over a long collection time, implies a very high SO₂ concentration in the air.

  18. Detection of mastitis pathogens by analysis of volatile bacterial metabolites.

    Science.gov (United States)

    Hettinga, K A; van Valenberg, H J F; Lam, T J G M; van Hooijdonk, A C M

    2008-10-01

    The ability to detect mastitis pathogens based on their volatile metabolites was studied. Milk samples from cows with clinical mastitis, caused by Staphylococcus aureus, coagulase-negative staphylococci, Streptococcus uberis, Streptococcus dysgalactiae, and Escherichia coli were collected. In addition, samples from cows without clinical mastitis and with low somatic cell count (SCC) were collected for comparison. All mastitis samples were examined by using classical microbiological methods, followed by headspace analysis for volatile metabolites. Milk from culture-negative samples contained a lower number and amount of volatile components compared with milk from cows with clinical mastitis. Because of variability between samples within a group, comparisons between pathogens were not sufficient for classification of the samples by univariate statistics. Therefore, an artificial neural network was trained to classify the pathogen in the milk samples based on the bacterial metabolites. The trained network differentiated milk from uninfected and infected quarters very well. When comparing pathogens, Staph. aureus produced a very different pattern of volatile metabolites compared with the other samples. Samples with coagulase-negative staphylococci and E. coli had enough dissimilarity with the other pathogens, making it possible to separate these 2 pathogens from each other and from the other samples. The 2 streptococcus species did not show significant differences between each other but could be identified as a different group from the other pathogens. Five groups can thus be identified based on the volatile bacterial metabolites: Staph. aureus, coagulase-negative staphylococci, streptococci (Strep. uberis and Strep. dysgalactiae as one group), E. coli, and uninfected quarters.
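
    A rough sketch of the classification step, training a small neural network on a matrix of headspace metabolite features; the feature matrix and class labels below are randomly generated placeholders rather than the study's data, and scikit-learn's MLPClassifier merely stands in for whatever network architecture the authors used.

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        # Hypothetical matrix: rows = milk samples, columns = headspace metabolite peak areas
        rng = np.random.default_rng(0)
        X = rng.lognormal(size=(60, 12))
        y = rng.integers(0, 5, size=60)   # 5 classes: S. aureus, CNS, streptococci, E. coli, uninfected

        clf = make_pipeline(StandardScaler(),
                            MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
        clf.fit(X, y)
        print(clf.predict(X[:3]))          # predicted pathogen group for the first three samples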

  19. Detection Of Alterations In Audio Files Using Spectrograph Analysis

    Directory of Open Access Journals (Sweden)

    Anandha Krishnan G

    2015-08-01

    Full Text Available This study was carried out to detect changes in audio files using spectrographs. An audio file format is a file format for storing digital audio data on a computer system. A sound spectrograph is a laboratory instrument that displays a graphical representation of the strengths of the various component frequencies of a sound as time passes. The objectives of the study were to find the changes in the spectrographs of audio files after altering them, to compare those changes with the spectrographs of the original files, and to check for similarities and differences between MP3 and WAV. Five different alterations were carried out on each audio file to analyze the differences between the original and the altered file. For the cut-copy alteration, the MP3 or WAV file was opened in Audacity and a different audio clip was pasted into it; the new file was then analyzed to view the differences. Noise reduction and the other edits were made by adjusting the necessary parameters in the dialog box, and the differences between each new file and the original file were analyzed. Each edited audio file was opened in the software Spek, which produces a spectrograph of that particular file; the graph was saved for further analysis. The spectrograph of the original audio was then compared with that of the edited file to identify the alterations.
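
    A minimal sketch of the spectrograph comparison idea, assuming synthetic original and edited signals in place of the study's MP3/WAV files; the signal content, threshold, and parameters are illustrative only.

        import numpy as np
        from scipy.signal import spectrogram

        fs = 44100
        t = np.arange(0, 3.0, 1 / fs)
        original = np.sin(2 * np.pi * 440 * t)                 # hypothetical original: a 440 Hz tone
        edited = original.copy()
        cut = slice(fs, 2 * fs)
        edited[cut] = np.sin(2 * np.pi * 880 * t[cut])         # simulated cut-copy alteration

        def db_spec(x):
            f, tt, Sxx = spectrogram(x, fs=fs, nperseg=2048)
            return f, tt, 10 * np.log10(Sxx + 1e-12)

        f, tt, S1 = db_spec(original)
        _, _, S2 = db_spec(edited)
        diff = np.abs(S1 - S2)
        # Time frames with a large spectral difference flag the altered region
        altered_frames = tt[diff.max(axis=0) > 20]
        print(f"altered span roughly {altered_frames.min():.2f}-{altered_frames.max():.2f} s")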

  20. Trajectory Shape Analysis and Anomaly Detection Utilizing Information Theory Tools

    Directory of Open Access Journals (Sweden)

    Yuejun Guo

    2017-06-01

    Full Text Available In this paper, we propose to improve trajectory shape analysis by explicitly considering the speed attribute of trajectory data, and to successfully achieve anomaly detection. The shape of object motion trajectory is modeled using Kernel Density Estimation (KDE), making use of both the angle attribute of the trajectory and the speed of the moving object. An unsupervised clustering algorithm, based on the Information Bottleneck (IB) method, is employed for trajectory learning to obtain an adaptive number of trajectory clusters through maximizing the Mutual Information (MI) between the clustering result and a feature set of the trajectory data. Furthermore, we propose to effectively enhance the performance of IB by taking into account the clustering quality in each iteration of the clustering procedure. The trajectories are determined as either abnormal (infrequently observed) or normal by a measure based on Shannon entropy. Extensive tests on real-world and synthetic data show that the proposed technique behaves very well and outperforms the state-of-the-art methods.
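
    A simplified sketch of the angle-and-speed density idea, assuming synthetic trajectories: a Gaussian KDE is fitted to features of "normal" trajectories, and a trajectory's mean surprisal (negative log density) stands in for the paper's entropy-based abnormality measure; the Information Bottleneck clustering stage is not reproduced here.

        import numpy as np
        from scipy.stats import gaussian_kde

        def features(traj):
            """Angle and speed samples from an (N, 2) array of x, y positions (unit time step assumed)."""
            d = np.diff(traj, axis=0)
            speed = np.hypot(d[:, 0], d[:, 1])
            angle = np.arctan2(d[:, 1], d[:, 0])
            return np.vstack([angle, speed])

        # Hypothetical reference set of normal trajectories
        rng = np.random.default_rng(1)
        normal = [np.cumsum(rng.normal(1.0, 0.1, size=(50, 2)), axis=0) for _ in range(30)]
        kde = gaussian_kde(np.hstack([features(t) for t in normal]))

        def surprisal(traj):
            # Mean negative log density; high values mean "infrequently observed"
            return -np.log(kde(features(traj)) + 1e-300).mean()

        erratic = np.cumsum(rng.normal(0.0, 2.0, size=(50, 2)), axis=0)
        print(surprisal(normal[0]), surprisal(erratic))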

  1. Detection and analysis of ancient segmental duplications in mammalian genomes.

    Science.gov (United States)

    Pu, Lianrong; Lin, Yu; Pevzner, Pavel A

    2018-05-07

    Although segmental duplications (SDs) represent hotbeds for genomic rearrangements and emergence of new genes, there are still no easy-to-use tools for identifying SDs. Moreover, while most previous studies focused on recently emerged SDs, detection of ancient SDs remains an open problem. We developed an SDquest algorithm for SD finding and applied it to analyzing SDs in human, gorilla, and mouse genomes. Our results demonstrate that previous studies missed many SDs in these genomes and show that SDs account for at least 6.05% of the human genome (version hg19), a 17% increase as compared to the previous estimate. Moreover, SDquest classified 6.42% of the latest GRCh38 version of the human genome as SDs, a large increase as compared to previous studies. We thus propose to re-evaluate evolution of SDs based on their accurate representation across multiple genomes. Toward this goal, we analyzed the complex mosaic structure of SDs and decomposed mosaic SDs into elementary SDs, a prerequisite for follow-up evolutionary analysis. We also introduced the concept of the breakpoint graph of mosaic SDs that revealed SD hotspots and suggested that some SDs may have originated from circular extrachromosomal DNA (ecDNA), not unlike ecDNA that contributes to accelerated evolution in cancer. © 2018 Pu et al.; Published by Cold Spring Harbor Laboratory Press.

  2. Unsupervised EEG analysis for automated epileptic seizure detection

    Science.gov (United States)

    Birjandtalab, Javad; Pouyan, Maziyar Baran; Nourani, Mehrdad

    2016-07-01

    Epilepsy is a neurological disorder which can, if not controlled, potentially cause unexpected death. It is therefore crucial to have accurate automatic pattern recognition and data mining techniques to detect the onset of seizures and inform care-givers so they can help the patients. EEG signals are the preferred biosignals for diagnosis of epileptic patients. Most of the existing pattern recognition techniques used in EEG analysis rely on supervised machine learning algorithms. Since seizure data are heavily under-represented, such techniques are not always practical, particularly when labeled data are not sufficiently available or when disease progression is rapid and the corresponding EEG footprint pattern is not robust. Furthermore, EEG pattern change is highly individual-dependent and requires experienced specialists to annotate the seizure and non-seizure events. In this work, we present an unsupervised technique to discriminate seizure and non-seizure events. We employ the power spectral density of EEG signals in different frequency bands as informative features to accurately cluster seizure and non-seizure events. The experimental results obtained so far indicate more than 90% accuracy in clustering seizure and non-seizure events without any prior knowledge of the patient's history.
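
    A minimal sketch of the unsupervised pipeline outlined above, assuming a synthetic single-channel recording: Welch band powers are computed per window and clustered; k-means is used here as a stand-in since the abstract does not name a specific clustering algorithm, and the window length, bands, and cluster count are illustrative.

        import numpy as np
        from scipy.signal import welch
        from sklearn.cluster import KMeans

        BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
                 "beta": (13, 30), "gamma": (30, 45)}

        def band_powers(window, fs=256):
            """Power spectral density summed in the classical EEG bands for one window."""
            f, psd = welch(window, fs=fs, nperseg=fs * 2)
            return [psd[(f >= lo) & (f < hi)].sum() for lo, hi in BANDS.values()]

        # Hypothetical single-channel EEG cut into 2-second windows
        fs = 256
        rng = np.random.default_rng(0)
        eeg = rng.normal(size=fs * 600)
        windows = eeg.reshape(-1, fs * 2)
        X = np.log(np.array([band_powers(w, fs) for w in windows]) + 1e-12)

        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
        # The smaller cluster would be inspected as candidate seizure segments
        print(np.bincount(labels))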

  3. Detection and phylogenetic analysis of bacteriophage WO in spiders (Araneae).

    Science.gov (United States)

    Yan, Qian; Qiao, Huping; Gao, Jin; Yun, Yueli; Liu, Fengxiang; Peng, Yu

    2015-11-01

    Phage WO is a bacteriophage found in Wolbachia. Herein, we present the first phylogenetic study of WOs that infect spiders (Araneae). Seven species of spiders (Araneus alternidens, Nephila clavata, Hylyphantes graminicola, Prosoponoides sinensis, Pholcus crypticolens, Coleosoma octomaculatum, and Nurscia albofasciata) from six families were infected by Wolbachia and WO, and were subjected to comprehensive sequence analysis. Interestingly, WO could only be detected in Wolbachia-infected spiders. The relative infection rates of those seven species of spiders were 75, 100, 88.9, 100, 62.5, 72.7, and 100 %, respectively. Our results indicated that both Wolbachia and WO were found in three different body parts of N. clavata, and that WO could be passed to the next generation of H. graminicola by vertical transmission. There were three different WO sequences in A. alternidens and two different WO sequences in C. octomaculatum. Only one WO sequence was found in each of the other five species of spiders. The discovered WO sequences ranged from 239 to 311 bp. A phylogenetic tree was generated using maximum likelihood (ML) based on the orf7 gene sequences. According to the phylogenetic tree, WOs in N. clavata and H. graminicola were clustered in the same group. WOs from A. alternidens (WAlt1) and C. octomaculatum (WOct2) were closely related to another clade, whereas WO in P. sinensis was classified as a sole cluster.

  4. Behavior Analysis Usage with Behavior Tures Adoption for Malicious Code Detection on JAVASCRIPT Scenarios Example

    Directory of Open Access Journals (Sweden)

    Y. M. Tumanov

    2010-03-01

    Full Text Available The article proposes a method for malicious JavaScript code detection based on behavior analysis. The concepts of program behavior and program state, and an algorithm for malicious code detection, are described.

  5. Application and Analysis of Wavelet Transform in Image Edge Detection

    Institute of Scientific and Technical Information of China (English)

    Jianfang Gao

    2016-01-01

    In image processing, technicians have long been looking for convenient and simple detection methods, especially through innovative research on image edge detection. Because edges carry much of the original information in an image, the real image data can be recovered from the data acquired there. Edge detection is often applied to irregular geometric objects, and the contour of the image is determined by combining it with the transmitted signal data. At present there are many different algorithms for image edge detection, but each type has its own drawbacks, making it difficult to detect image changes within a reasonable range. We apply the wavelet transform to image edge detection, making full use of its high-resolution characteristics and combining multiple images, in order to improve the accuracy of image edge detection.
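
    A minimal single-level sketch of wavelet-based edge detection, using the third-party PyWavelets package; the wavelet choice, threshold, and test image are illustrative, and the multi-image combination described above is not included.

        import numpy as np
        import pywt  # PyWavelets

        def wavelet_edges(image, wavelet="haar", threshold=0.2):
            """Single-level 2-D DWT; edge strength from horizontal/vertical detail coefficients."""
            image = np.asarray(image, dtype=float)
            _, (cH, cV, cD) = pywt.dwt2(image, wavelet)
            magnitude = np.sqrt(cH ** 2 + cV ** 2)
            magnitude /= magnitude.max() + 1e-12
            return magnitude > threshold          # boolean edge map at half resolution

        # Hypothetical test image: a bright square on a dark background
        img = np.zeros((128, 128))
        img[32:96, 32:96] = 1.0
        print(wavelet_edges(img).sum(), "edge pixels detected")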

  6. Attack Pattern Analysis Framework for a Multiagent Intrusion Detection System

    Directory of Open Access Journals (Sweden)

    Krzysztof Juszczyszyn

    2008-08-01

    Full Text Available The paper proposes the use of an attack pattern ontology and a formal framework for network traffic anomaly detection within a distributed multi-agent Intrusion Detection System architecture. Our framework assumes ontology-based attack definition and a distributed processing scheme with exchange of messages between agents. The role of traffic anomaly detection is presented, and it is discussed how specific values characterizing network communication can be used to detect network anomalies caused by security incidents (worm attack, virus spreading). Finally, it is defined how to use the proposed techniques in a distributed IDS using the attack pattern ontology.

  7. Analysis of the Detectability of Sonar Under the Virtual Battlefield

    Directory of Open Access Journals (Sweden)

    Hou Chengyu

    2014-08-01

    Full Text Available Because sound propagates quickly and attenuates little in water, sonar plays a crucial role in developing ocean resources and detecting marine targets. Simulation of sonar detectability is therefore indispensable to the virtual battlefield. This paper combines the ocean background noise model, the reverberation model, the target strength model and the transmission loss model to build a sonar performance model and to calculate sonar detectability. Finally, the effect of the parameters of the sonar equation on sonar detection performance is analyzed, and the validity of the model is verified against the parameters of two sonars in service.
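
    A toy illustration of evaluating a noise-limited active sonar equation of the kind combined in the model above; the spreading law, absorption coefficient, and parameter values are hypothetical placeholders, not the paper's models.

        import numpy as np

        def transmission_loss(r_m, alpha_db_per_km=1.0):
            """Spherical spreading plus absorption, in dB (simplified)."""
            return 20 * np.log10(r_m) + alpha_db_per_km * r_m / 1000.0

        def signal_excess(SL, TS, NL, DI, DT, r_m):
            """Noise-limited active sonar equation: SE = SL - 2*TL + TS - (NL - DI) - DT."""
            TL = transmission_loss(r_m)
            return SL - 2 * TL + TS - (NL - DI) - DT

        # Hypothetical parameter values (dB, re 1 uPa conventions); detection expected where SE > 0
        for r in (1000, 5000, 10000):
            print(r, "m ->", round(signal_excess(SL=220, TS=15, NL=70, DI=20, DT=10, r_m=r), 1), "dB")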

  8. Ultrasensitive Detection of Infrared Photon Using Microcantilever: Theoretical Analysis

    International Nuclear Information System (INIS)

    Li-Xin, Cao; Feng-Xin, Zhang; Yin-Fang, Zhu; Jin-Ling, Yang

    2010-01-01

    We present a new method for detecting near-infrared, mid-infrared, and far-infrared photons with an ultrahigh sensitivity. The infrared photon detection was carried out by monitoring the displacement change of a vibrating microcantilever under light pressure using a laser Doppler vibrometer. Ultrathin silicon cantilevers with high sensitivity were produced using micro/nano-fabrication technology. The photon detection system was set up. The response of the microcantilever to the photon illumination is theoretically estimated, and a nanowatt resolution for the infrared photon detection is expected at room temperature with this method

  9. Detection, Source Location, and Analysis of Volcano Infrasound

    Science.gov (United States)

    McKee, Kathleen F.

    The study of volcano infrasound focuses on low frequency sound from volcanoes, how volcanic processes produce it, and the path it travels from the source to our receivers. In this dissertation we focus on detecting, locating, and analyzing infrasound from a number of different volcanoes using a variety of analysis techniques. These works will help inform future volcano monitoring using infrasound with respect to infrasonic source location, signal characterization, volatile flux estimation, and back-azimuth to source determination. Source location is an important component of the study of volcano infrasound and in its application to volcano monitoring. Semblance is a forward grid search technique and common source location method in infrasound studies as well as seismology. We evaluated the effectiveness of semblance in the presence of significant topographic features for explosions of Sakurajima Volcano, Japan, while taking into account temperature and wind variations. We show that topographic obstacles at Sakurajima cause a semblance source location offset of 360-420 m to the northeast of the actual source location. In addition, we found despite the consistent offset in source location semblance can still be a useful tool for determining periods of volcanic activity. Infrasonic signal characterization follows signal detection and source location in volcano monitoring in that it informs us of the type of volcanic activity detected. In large volcanic eruptions the lowermost portion of the eruption column is momentum-driven and termed the volcanic jet or gas-thrust zone. This turbulent fluid-flow perturbs the atmosphere and produces a sound similar to that of jet and rocket engines, known as jet noise. We deployed an array of infrasound sensors near an accessible, less hazardous, fumarolic jet at Aso Volcano, Japan as an analogue to large, violent volcanic eruption jets. We recorded volcanic jet noise at 57.6° from vertical, a recording angle not normally feasible

  10. Image analysis for the detection of Barré

    Science.gov (United States)

    Barré is a major problem for the textile industry. Barré is detectable after fabric is dyed and the detection of barré can depend upon the color of the dyed fabric, lighting conditions, fabric pattern, and/or the color perception of the person viewing the fabric. The standard method for measuring ...

  11. Feynman-α correlation analysis by prompt-photon detection

    International Nuclear Information System (INIS)

    Hashimoto, Kengo; Yamada, Sumasu; Hasegawa, Yasuhiro; Horiguchi, Tetsuo

    1998-01-01

    Two-detector Feynman-α measurements were carried out using the UTR-KINKI reactor, a light-water-moderated and graphite-reflected reactor, by detecting high-energy prompt gamma rays. For comparison, conventional measurements by detecting neutrons were also performed. These measurements were carried out in the subcriticality range from 0 to $1.8. The gate-time dependences of the variance- and covariance-to-mean ratios measured by gamma-ray detection were nearly identical to those obtained using standard neutron-detection techniques. Consequently, the prompt-neutron decay constants inferred from the gamma-ray correlation data agreed with those from the neutron data. Furthermore, the correlated-to-uncorrelated amplitude ratios obtained by gamma-ray detection depended significantly on the low-energy discriminator level of the single-channel analyzer. The discriminator level was determined as optimum for obtaining a maximum value of the amplitude ratio. The maximum amplitude ratio was much larger than that obtained by neutron detection. The subcriticality dependence of the decay constant obtained by gamma-ray detection was consistent with that obtained by neutron detection and followed the linear relation based on the one-point kinetic model in the vicinity of delayed critical. These experimental results suggest that the gamma-ray correlation technique can be applied to measure reactor kinetic parameters more efficiently.
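
    For reference, the gate-time dependence referred to above is conventionally written in its standard point-kinetics form (a textbook expression, not quoted from the paper) as

        Y(T) \;=\; \frac{\sigma^2(T)}{\mu(T)} - 1 \;=\; Y_\infty \left[ 1 - \frac{1 - e^{-\alpha T}}{\alpha T} \right],

    where T is the gate time, α the prompt-neutron decay constant, and Y_∞ the correlated-to-uncorrelated amplitude ratio; fitting this form to the measured variance-to-mean ratios yields α and Y_∞.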

  12. Probabilistic Anomaly Detection Based On System Calls Analysis

    Directory of Open Access Journals (Sweden)

    Przemysław Maciołek

    2007-01-01

    Full Text Available We present an application of a probabilistic approach to anomaly detection (PAD). By analyzing selected system calls (and their arguments), the chosen applications are monitored in the Linux environment. This allows us to estimate the "(ab)normality" of their behavior (by comparison to previously collected profiles). We have attached results of threat detection in a typical computer environment.

  13. Statistical analysis of DNT detection using chemically functionalized microcantilever arrays

    DEFF Research Database (Denmark)

    Bosco, Filippo; Bache, M.; Hwu, E.-T.

    2012-01-01

    The need for miniaturized and sensitive sensors for explosives detection is increasing in areas such as security and demining. Micrometer sized cantilevers are often used for label-free detection, and have previously been reported to be able to detect explosives. However, only a few measurements...... on the chemically treated surfaces results in significant bending of the cantilevers and in a decrease of their resonant frequencies. We present averaged measurements obtained from up to 72 cantilevers being simultaneously exposed to the same sample. Compared to integrated reference cantilevers with non...

  14. Fault Analysis and Detection in Microgrids with High PV Penetration

    Energy Technology Data Exchange (ETDEWEB)

    El Khatib, Mohamed [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hernandez Alvidrez, Javier [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ellis, Abraham [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-05-01

    In this report we focus on analyzing the behaviour of current-controlled PV inverters under faults in order to develop fault detection schemes for microgrids with high PV penetration. An inverter model suitable for steady-state fault studies is presented, and the impact of PV inverters on two protection elements is analyzed. The studied protection elements are the superimposed-quantities-based directional element and the negative-sequence directional element. Additionally, several non-overcurrent fault detection schemes are discussed in this report for microgrids with high PV penetration. A detailed time-domain simulation study is presented to assess the performance of the presented fault detection schemes under different microgrid modes of operation.

  15. Nuclear reaction analysis (NRA) for trace element detection

    Energy Technology Data Exchange (ETDEWEB)

    Doebeli, M. [Paul Scherrer Inst. (PSI), Villigen (Switzerland); Noll, K. [Bern Univ. (Switzerland)

    1997-09-01

    Ion beam induced nuclear reactions can be used to analyse trace element concentrations in materials. The method is especially suited for the detection of light contaminants in heavy matrices. (author) 3 figs., 2 refs.

  16. Fast Change Point Detection for Electricity Market Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Berkeley, UC; Gu, William; Choi, Jaesik; Gu, Ming; Simon, Horst; Wu, Kesheng

    2013-08-25

    Electricity is a vital part of our daily life; therefore it is important to avoid irregularities such as the California Electricity Crisis of 2000 and 2001. In this work, we seek to predict anomalies using advanced machine learning algorithms. These algorithms are effective, but computationally expensive, especially if we plan to apply them on hourly electricity market data covering a number of years. To address this challenge, we significantly accelerate the computation of the Gaussian Process (GP) for time series data. In the context of a Change Point Detection (CPD) algorithm, we reduce its computational complexity from O(n⁵) to O(n²). Our efficient algorithm makes it possible to compute the Change Points using the hourly price data from the California Electricity Crisis. By comparing the detected Change Points with known events, we show that the Change Point Detection algorithm is indeed effective in detecting signals preceding major events.
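
    The paper's contribution is the fast Gaussian Process computation itself; as a much simpler point of comparison, the sketch below flags mean shifts in an hourly price series with a two-sided CUSUM detector, which is not the paper's method. The threshold and the synthetic series are illustrative.

        import numpy as np

        def cusum_change_points(x, threshold=5.0, drift=0.0):
            """Two-sided CUSUM on a standardised series; returns indices of detected shifts."""
            x = np.asarray(x, dtype=float)
            z = (x - x.mean()) / (x.std() + 1e-12)
            pos = neg = 0.0
            changes = []
            for i, v in enumerate(z):
                pos = max(0.0, pos + v - drift)
                neg = max(0.0, neg - v - drift)
                if pos > threshold or neg > threshold:
                    changes.append(i)
                    pos = neg = 0.0
            return changes

        # Hypothetical hourly price series with a level shift halfway through
        rng = np.random.default_rng(0)
        prices = np.r_[rng.normal(30, 2, 500), rng.normal(120, 10, 500)]
        print(cusum_change_points(prices))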

  17. Receiver Operating Characteristic Analysis for Detecting Explosives-related Threats

    Energy Technology Data Exchange (ETDEWEB)

    Oxley, Mark E; Venzin, Alexander M

    2012-11-14

    The Department of Homeland Security (DHS) and the Transportation Security Administration (TSA) are interested in developing a standardized testing procedure for determining the performance of candidate detection systems. This document outlines a potential method for judging detection system performance as well as determining if combining the information from a legacy system with a new system can significantly improve performance. In this document, performance corresponds to the Neyman-Pearson criterion applied to the Receiver Operating Characteristic (ROC) curves of the detection systems in question. A simulation was developed to investigate how the amount of data provided by the vendor in the form of the ROC curve affects the performance of the combined detection system. Furthermore, the simulation also takes into account the potential effects of correlation and how this information can also impact the performance of the combined system.

  18. Unsupervised land cover change detection: meaningful sequential time series analysis

    CSIR Research Space (South Africa)

    Salmon, BP

    2011-06-01

    Full Text Available An automated land cover change detection method is proposed that uses coarse spatial resolution hyper-temporal earth observation satellite time series data. The study compared three different unsupervised clustering approaches that operate on short...

  19. Detection of mastitis pathogens by analysis of volatile bacterial metabolites

    NARCIS (Netherlands)

    Hettinga, K.A.; Valenberg, van H.J.F.; Lam, T.J.G.M.; Hooijdonk, van A.C.M.

    2008-01-01

    The ability to detect mastitis pathogens based on their volatile metabolites was studied. Milk samples from cows with clinical mastitis, caused by Staphylococcus aureus, coagulase-negative staphylococci, Streptococcus uberis, Streptococcus dysgalactiae, and Escherichia coli were collected. In

  20. Helper T lymphocyte precursor frequency analysis in alloreactivity detection

    International Nuclear Information System (INIS)

    Cukrova, V.; Dolezalova, L.; Loudova, M.; Vitek, A.

    1998-01-01

    The utility of testing the frequency of IL-2 secreting helper T lymphocyte precursors (HTLp) has been evaluated for detecting alloreactivity. The frequency of HTLp was estimated by limiting dilution assay. High HTLp frequency was detected in 20 out of 30 HLA-matched unrelated pairs (67%). The comparison of HTLp and CTLp (cytotoxic T lymphocyte precursor) frequencies in HLA-matched unrelated pairs showed that the two examinations are not fully interchangeable in detecting alloreactivity. This suggests the utility of combined testing of both HTLp and CTLp frequencies for alloreactivity assessment. In contrast, only five positive HTLp values were found among 28 HLA genotypically identical siblings (18%). Previous CTLp limiting dilution studies showed very low or undetectable CTLp frequencies in that group. Therefore, the HTLp assay remains the only cellular in vitro technique detecting alloreactivity in these combinations. (authors)

  1. Sensitive Detection of Deliquescent Bacterial Capsules through Nanomechanical Analysis.

    Science.gov (United States)

    Nguyen, Song Ha; Webb, Hayden K

    2015-10-20

    Encapsulated bacteria usually exhibit strong resistance to a wide range of sterilization methods, and are often virulent. Early detection of encapsulation can be crucial in microbial pathology. This work demonstrates a fast and sensitive method for the detection of encapsulated bacterial cells. Nanoindentation force measurements were used to confirm the presence of deliquescent bacterial capsules surrounding bacterial cells. Force/distance approach curves contained characteristic linear-nonlinear-linear domains, indicating co-compression of the capsular layer and cell, indentation of the capsule, and compression of the cell alone. This is a sensitive method for the detection and verification of the encapsulation status of bacterial cells. Given that this method was successful in detecting the nanomechanical properties of two different layers of cell material, i.e. distinguishing between the capsule and the remainder of the cell, further development may potentially lead to the ability to analyze even thinner cellular layers, e.g. lipid bilayers.

  2. Detecting failure events in buildings: a numerical and experimental analysis

    OpenAIRE

    Heckman, V. M.; Kohler, M. D.; Heaton, T. H.

    2010-01-01

    A numerical method is used to investigate an approach for detecting the brittle fracture of welds associated with beam-column connections in instrumented buildings in real time through the use of time-reversed Green’s functions and wave propagation reciprocity. The approach makes use of a prerecorded catalog of Green’s functions for an instrumented building to detect failure events in the building during a later seismic event by screening continuous data for the presence of wavef...

  3. Detection and Analysis of the Quality of Ibuprofen Granules

    Science.gov (United States)

    Yu-bin, Ji; Xin, LI; Guo-song, Xin; Qin-bing, Xue

    2017-12-01

    Comprehensive quality testing of Ibuprofen Granules was carried out to ensure that they comply with the provisions of the Chinese Pharmacopoeia. With reference to the Chinese Pharmacopoeia, the Ibuprofen Granules were tested by UV and HPLC, covering grain size, volume deviation, loss on drying, dissolution rate, and overall quality evaluation. The results indicated that the Ibuprofen Granules conform to the standards; they are qualified and should be permitted to be marketed.

  4. Comparative analysis of methods for detecting interacting loci.

    Science.gov (United States)

    Chen, Li; Yu, Guoqiang; Langefeld, Carl D; Miller, David J; Guy, Richard T; Raghuram, Jayaram; Yuan, Xiguo; Herrington, David M; Wang, Yue

    2011-07-05

    Interactions among genetic loci are believed to play an important role in disease risk. While many methods have been proposed for detecting such interactions, their relative performance remains largely unclear, mainly because different data sources, detection performance criteria, and experimental protocols were used in the papers introducing these methods and in subsequent studies. Moreover, there have been very few studies strictly focused on comparison of existing methods. Given the importance of detecting gene-gene and gene-environment interactions, a rigorous, comprehensive comparison of performance and limitations of available interaction detection methods is warranted. We report a comparison of eight representative methods, of which seven were specifically designed to detect interactions among single nucleotide polymorphisms (SNPs), with the last a popular main-effect testing method used as a baseline for performance evaluation. The selected methods, multifactor dimensionality reduction (MDR), full interaction model (FIM), information gain (IG), Bayesian epistasis association mapping (BEAM), SNP harvester (SH), maximum entropy conditional probability modeling (MECPM), logistic regression with an interaction term (LRIT), and logistic regression (LR) were compared on a large number of simulated data sets, each, consistent with complex disease models, embedding multiple sets of interacting SNPs, under different interaction models. The assessment criteria included several relevant detection power measures, family-wise type I error rate, and computational complexity. There are several important results from this study. First, while some SNPs in interactions with strong effects are successfully detected, most of the methods miss many interacting SNPs at an acceptable rate of false positives. In this study, the best-performing method was MECPM. Second, the statistical significance assessment criteria, used by some of the methods to control the type I error rate

  5. Comparative analysis of methods for detecting interacting loci

    Directory of Open Access Journals (Sweden)

    Yuan Xiguo

    2011-07-01

    Full Text Available Abstract Background Interactions among genetic loci are believed to play an important role in disease risk. While many methods have been proposed for detecting such interactions, their relative performance remains largely unclear, mainly because different data sources, detection performance criteria, and experimental protocols were used in the papers introducing these methods and in subsequent studies. Moreover, there have been very few studies strictly focused on comparison of existing methods. Given the importance of detecting gene-gene and gene-environment interactions, a rigorous, comprehensive comparison of performance and limitations of available interaction detection methods is warranted. Results We report a comparison of eight representative methods, of which seven were specifically designed to detect interactions among single nucleotide polymorphisms (SNPs), with the last a popular main-effect testing method used as a baseline for performance evaluation. The selected methods, multifactor dimensionality reduction (MDR), full interaction model (FIM), information gain (IG), Bayesian epistasis association mapping (BEAM), SNP harvester (SH), maximum entropy conditional probability modeling (MECPM), logistic regression with an interaction term (LRIT), and logistic regression (LR), were compared on a large number of simulated data sets, each, consistent with complex disease models, embedding multiple sets of interacting SNPs, under different interaction models. The assessment criteria included several relevant detection power measures, family-wise type I error rate, and computational complexity. There are several important results from this study. First, while some SNPs in interactions with strong effects are successfully detected, most of the methods miss many interacting SNPs at an acceptable rate of false positives. In this study, the best-performing method was MECPM. Second, the statistical significance assessment criteria, used by some of the

  6. Fall detection in walking robots by multi-way principal component analysis

    NARCIS (Netherlands)

    Karssen, J.G.; Wisse, M.

    2008-01-01

    Large disturbances can cause a biped to fall. If an upcoming fall can be detected, damage can be minimized or the fall can be prevented. We introduce the multi-way principal component analysis (MPCA) method for the detection of upcoming falls. We study the detection capability of the MPCA method in

  7. An integrated microfluidic analysis microsystems with bacterial capture enrichment and in-situ impedance detection

    Science.gov (United States)

    Liu, Hai-Tao; Wen, Zhi-Yu; Xu, Yi; Shang, Zheng-Guo; Peng, Jin-Lan; Tian, Peng

    2017-09-01

    In this paper, an integrated microfluidic analysis microsystem with bacterial capture enrichment and in-situ impedance detection is proposed, based on the microfluidic-chip dielectrophoresis technique and the electrochemical impedance detection principle. The microsystem includes a microfluidic chip, a main control module, a drive and control module, a signal detection and processing module, and a result display unit. The main control module produces the work sequence of the impedance detection system parts and handles data communication; the drive and control circuit generates an AC signal with adjustable amplitude and frequency, which is applied to the foodborne-pathogen impedance analysis microsystem to realize capture enrichment and impedance detection. The signal detection and processing circuit translates the current signal into the impedance of the bacteria and transfers it to a computer, where the final detection result is displayed. The experimental sample was prepared by adding an Escherichia coli standard sample into a chicken sample solution, and the samples were tested on the dielectrophoresis-chip capture enrichment and in-situ impedance detection microsystem with micro-array electrode microfluidic chips. The experiments show that the Escherichia coli detection limit of the microsystem is 5 × 10⁴ CFU/mL and the detection time is within 6 min under the optimized operating conditions of 10 V detection voltage and 500 kHz detection frequency. The integrated microfluidic analysis microsystem lays a solid foundation for rapid real-time in-situ detection of bacteria.

  8. Real time risk analysis of kick detection: Testing and validation

    International Nuclear Information System (INIS)

    Islam, Rakibul; Khan, Faisal; Venkatesan, Ramchandran

    2017-01-01

    Oil and gas development is moving into harsh and remote locations where the highest level of safety is required. A blowout is one of the most feared accidents in oil and gas development projects. The main objective of this paper is to test and validate the kick detection component of a blowout risk assessment model using uniquely developed experimental results. Kick detection is a major part of the blowout risk assessment model. The accuracy and timeliness of kick detection depend on the monitoring of multiple downhole parameters such as downhole pressure, fluid density, fluid conductivity and mass flow rate. In the present study these four parameters are considered in different logical combinations to assess the occurrence of a kick and the associated blowout risk. The assessed results are compared against the experimental observations. It is observed that simultaneous monitoring of mass flow rate combined with any one of the other three parameters provides the most reliable detection of kick and potential blowout likelihood. The current work presents the framework for a dynamic risk assessment and management model. Upon successful testing of this approach at the pilot and field levels, it could provide a paradigm shift in drilling safety. - Highlights: • A novel dynamic risk model of kick detection and blowout prediction. • Testing and validation of the risk model. • Application of the dynamic risk model.

  9. Lane marking detection based on waveform analysis and CNN

    Science.gov (United States)

    Ye, Yang Yang; Chen, Hou Jin; Hao, Xiao Li

    2017-06-01

    Lane marking detection is a very important part of advanced driver assistance systems (ADAS) for avoiding traffic accidents. In order to obtain accurate lane markings, a novel and efficient algorithm is proposed in this work, which analyses the waveform generated from the road image after inverse perspective mapping (IPM). The algorithm includes two main stages: the first stage uses image preprocessing, including a CNN, to reduce the background and enhance the lane markings. The second stage obtains the waveform of the road image and analyzes it to extract the lanes. The contribution of this work is that we introduce local and global features of the waveform to detect the lane markings. The results indicate that the proposed method is robust in detecting and fitting lane markings.
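
    A toy version of the waveform stage, assuming a binarised bird's-eye-view (IPM) image: the column-wise sum forms the waveform and its peaks give candidate lane positions. The CNN preprocessing and the paper's local/global waveform features are not reproduced, and the thresholds are illustrative.

        import numpy as np
        from scipy.signal import find_peaks

        def lane_waveform(ipm_binary):
            """Column-wise sum of a binarised bird's-eye-view image (the 'waveform')."""
            return ipm_binary.astype(float).sum(axis=0)

        def lane_positions(ipm_binary, min_separation=40):
            w = lane_waveform(ipm_binary)
            peaks, _ = find_peaks(w, height=0.3 * w.max(), distance=min_separation)
            return peaks   # column indices of candidate lane markings

        # Hypothetical IPM image with two vertical lane markings
        img = np.zeros((200, 300), dtype=np.uint8)
        img[:, 95:100] = 1
        img[:, 195:200] = 1
        print(lane_positions(img))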

  10. Detection and size analysis of proteins with switchable DNA layers.

    Science.gov (United States)

    Rant, Ulrich; Pringsheim, Erika; Kaiser, Wolfgang; Arinaga, Kenji; Knezevic, Jelena; Tornow, Marc; Fujita, Shozo; Yokoyama, Naoki; Abstreiter, Gerhard

    2009-04-01

    We introduce a chip-compatible scheme for the label-free detection of proteins in real-time that is based on the electrically driven conformation switching of DNA oligonucleotides on metal surfaces. The switching behavior is a sensitive indicator for the specific recognition of IgG antibodies and antibody fragments, which can be detected in quantities of less than 10⁻¹⁸ mol on the sensor surface. Moreover, we show how the dynamics of the induced molecular motion can be monitored by measuring the high-frequency switching response. When proteins bind to the layer, the increase in hydrodynamic drag slows the switching dynamics, which allows us to determine the size of the captured proteins. We demonstrate the identification of different antibody fragments by means of their kinetic fingerprint. The switchDNA method represents a generic approach to simultaneously detect and size target molecules using a single analytical platform.

  11. Water pollution analysis and detection. (Latest citations from the NTIS bibliographic database). Published Search

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-08-01

    The bibliography contains citations concerning water pollution analysis, detection, monitoring, and regulation. Citations review online systems, bioassay monitoring, laser-based detection, sensor and biosensor systems, metabolic analyzers, and microsystem techniques. References cover fiber-optic portable detection instruments and rapid detection of toxicants in drinking water. (Contains 50-250 citations and includes a subject term index and title list.) (Copyright NERAC, Inc. 1995)

  12. Can Technical Analysis Signals Detect Price Reactions Around Earnings Announcement?: Evidence from Indonesia

    OpenAIRE

    Dedhy Sulistiawan; Jogiyanto Hartono

    2014-01-01

    This study examines whether technical analysis signals can detect price reactions before and after earnings announcement dates in Indonesian stock market. Earnings announcements produce reactions, both before and after the announcements. Informed investors may use private information before earnings announcements (Christophe, Ferri and Angel, 2004; Porter, 1992). Using technical analysis signals, this study expects that retail investors (uninformed investors) can detect preannouncements react...

  13. Fusion of optical flow based motion pattern analysis and silhouette classification for person tracking and detection

    NARCIS (Netherlands)

    Tangelder, J.W.H.; Lebert, E.; Burghouts, G.J.; Zon, K. van; Den Uyl, M.J.

    2014-01-01

    This paper presents a novel approach to detect persons in video by combining optical flow based motion analysis and silhouette based recognition. A new fast optical flow computation method is described, and its application in a motion based analysis framework unifying human tracking and detection is

  14. Mediacampaign: A Multimodal Semantic Analysis System for Advertisement Campaign Detection

    NARCIS (Netherlands)

    Rehatschek, Herwig; Sorschag, Robert; Rettenbacher, Bernhard; Zeiner, Herwig; Nioche, Julien; de Jong, Franciska M.G.; Ordelman, Roeland J.F.; van Leeuwen, David A.

    MediaCampaign's scope is on discovering and inter-relating advertisements and campaigns, i.e. to relate advertisements semantically belonging together, across different countries and different media. The project’s main goal is to automate to a large degree the detection and tracking of advertisement

  15. Mediacampaign - A multimodal semantic analysis system for advertisement campaign detection

    NARCIS (Netherlands)

    Rehatschek, H.; Sorschag, R.; Rettenbacher, B.; Zeiner, H.; Nioche, J.; Jong, F. de; Ordelmann, R.; Leeuwen, D. van

    2008-01-01

    MediaCampaign's scope is on discovering and inter-relating advertisements and campaigns, i.e. to relate advertisements semantically belonging together, across different countries and different media. The project's main goal is to automate to a large degree the detection and tracking of advertisement

  16. Molecular analysis of cross-bacterial contamination detected in ...

    African Journals Online (AJOL)

    ... the isolate Delftia acidovorans BP(R2) and it is also coupled to protein with molecular weight 25-26 KDa. As well as, this bacterial contamination was the reason for the false positive results observed during the detection of HCV infections. Journal of Applied Sciences and Environmental Management Vol. 9(1) 2005: 5-10.

  17. Trend analysis and change point detection of annual and seasonal ...

    Indian Academy of Sciences (India)

    elevation ranges from 0 m in the coastal areas of the. Persian Gulf to over ... been explained by Kang and Yusof (2012); Dhorde ...... J L 2004 Detection of statistically significant trends in ... Sun H, Chen Y, Li W, Li F, Chen Y, Hao X and Yang Y.

  18. Acoustic detection of intracranial aneurysms : A decision analysis

    NARCIS (Netherlands)

    vanBruggen, AC; Dippel, DWJ; Habbema, JDF; Mooij, JJA

    1996-01-01

    We present a further evaluation of an improved recording method for the acoustic detection of intracranial aneurysms (ADA). A sensor was applied to the patient's eyes. Two measures were derived to summarize the power spectral density functions of the sound frequencies that were obtained from each

  19. Analysis of Scattered Radiation Influence on Detectability in Diagnostic Radiology

    Energy Technology Data Exchange (ETDEWEB)

    Gurvich, V [ALVIM R and D Ltd., P.O.B. 801 Jerusalem 91007 (Israel); Manevich, I [Jerusalem College of Technology, 21 Havaad Haleumi St. P.O.B. 16031, Jerusalem 91160 (Israel)

    1994-12-31

    The calculation of the detectability of holes in tissue-equivalent materials on an X-ray image is implemented. Various values of the scatter accumulation factor are used in the calculation. The obtained results, confirmed by experimental data, may be useful for choosing the physical and technical conditions of X-ray examinations. (authors). 7 refs, 1 fig, 1 tab.

  20. Detecting gallbladders in chicken livers using spectral analysis

    DEFF Research Database (Denmark)

    Jørgensen, Anders; Mølvig Jensen, Eigil; Moeslund, Thomas B.

    2015-01-01

    This paper presents a method for detecting gallbladders attached to chicken livers using spectral imaging. Gallbladders can contaminate good livers, making them unfit for human consumption. A data set consisting of chicken livers with and without gallbladders, has been captured using 33 wavelengths...

  1. cDNA cloning, structural analysis, SNP detection and tissue ...

    Indian Academy of Sciences (India)

    THOMAS NAICY

    detection and tissue expression profile of the IGF1 gene in Malabari and Attappady Black goats of India. J. Genet. ... Keywords. gene cloning; gene expression; goat; insulin-like growth factor 1; mRNA; single-nucleotide ..... cle tenderness (Koohmaraie et al. .... growth factor (IGF) system in the bovine oviduct at oestrus and.

  2. Binary pattern analysis for 3D facial action unit detection

    NARCIS (Netherlands)

    Sandbach, Georgia; Zafeiriou, Stefanos; Pantic, Maja

    2012-01-01

    In this paper we propose new binary pattern features for use in the problem of 3D facial action unit (AU) detection. Two representations of 3D facial geometries are employed, the depth map and the Azimuthal Projection Distance Image (APDI). To these the traditional Local Binary Pattern is applied,

  3. An analysis of network traffic classification for botnet detection

    DEFF Research Database (Denmark)

    Stevanovic, Matija; Pedersen, Jens Myrup

    2015-01-01

    of detecting botnet network traffic using three methods that target protocols widely considered as the main carriers of botnet Command and Control (C&C) and attack traffic, i.e. TCP, UDP and DNS. We propose three traffic classification methods based on capable Random Forests classifier. The proposed methods...

  4. Specializing network analysis to detect anomalous insider actions

    Science.gov (United States)

    Chen, You; Nyemba, Steve; Zhang, Wen; Malin, Bradley

    2012-01-01

    Collaborative information systems (CIS) enable users to coordinate efficiently over shared tasks in complex distributed environments. For flexibility, they provide users with broad access privileges, which, as a side-effect, leave such systems vulnerable to various attacks. Some of the more damaging malicious activities stem from internal misuse, where users are authorized to access system resources. A promising class of insider threat detection models for CIS focuses on mining access patterns from audit logs, however, current models are limited in that they assume organizations have significant resources to generate labeled cases for training classifiers or assume the user has committed a large number of actions that deviate from “normal” behavior. In lieu of the previous assumptions, we introduce an approach that detects when specific actions of an insider deviate from expectation in the context of collaborative behavior. Specifically, in this paper, we introduce a specialized network anomaly detection model, or SNAD, to detect such events. This approach assesses the extent to which a user influences the similarity of the group of users that access a particular record in the CIS. From a theoretical perspective, we show that the proposed model is appropriate for detecting insider actions in dynamic collaborative systems. From an empirical perspective, we perform an extensive evaluation of SNAD with the access logs of two distinct environments: the patient record access logs of a large electronic health record system (6,015 users, 130,457 patients and 1,327,500 accesses) and the editing logs of Wikipedia (2,394,385 revisors, 55,200 articles and 6,482,780 revisions). We compare our model with several competing methods and demonstrate SNAD is significantly more effective: on average it achieves 20–30% greater area under an ROC curve. PMID:23399988

  5. Analysis of ESR measurement parameters for detecting irradiated spices

    International Nuclear Information System (INIS)

    Kameya, Hiromi; Hagiwara, Shoji; Todoriki, Setsuko

    2015-01-01

    The side signals from the irradiated cellulose radical are used for detecting irradiated spices with electron spin resonance (ESR). The side signals are two signals observed on both sides of a singlet signal (g≒2.00) from organic free radicals. Since the intensities of the side signals are weak, if the width of the singlet signal is large, these signals are covered and cannot be observed. In this study, we analyzed the ESR measurement parameters of seven kinds of spices (oregano, basil, parsley, coriander, cumin, white pepper, and black pepper) that would lead to a narrow width of the singlet signal for detecting the side signals. The results were as follows: 4 mW microwave power for basil, parsley, oregano, coriander, and cumin, and 8 mW for white pepper and black pepper, while a modulation amplitude of 4 G and a time constant of 20 ms were determined to be the optimal ESR measurement parameters. (author)

  6. On Textual Analysis and Machine Learning for Cyberstalking Detection.

    Science.gov (United States)

    Frommholz, Ingo; Al-Khateeb, Haider M; Potthast, Martin; Ghasem, Zinnar; Shukla, Mitul; Short, Emma

    2016-01-01

    Cyber security has become a major concern for users and businesses alike. Cyberstalking and harassment have been identified as a growing anti-social problem. Besides detecting cyberstalking and harassment, there is the need to gather digital evidence, often by the victim. To this end, we provide an overview of and discuss relevant technological means, in particular coming from text analytics as well as machine learning, that are capable to address the above challenges. We present a framework for the detection of text-based cyberstalking and the role and challenges of some core techniques such as author identification, text classification and personalisation. We then discuss PAN, a network and evaluation initiative that focusses on digital text forensics, in particular author identification.

  7. Software Analysis of Mining Images for Objects Detection

    Directory of Open Access Journals (Sweden)

    Jan Tomecek

    2013-11-01

    Full Text Available The contribution deals with the development of a new module of the robust FOTOMNG system for editing images from a video or mining images from measurements, for subsequent improvement of the detection of required objects in a 2D image. The generated module allows the creation of a final high-quality picture by combining multiple images containing the searched objects. Input data can be combined according to parameters or based on reference frames. Correction of detected 2D objects is also part of this module. The solution is implemented into the FOTOMNG system, and the finished work has been tested in appropriate frames, which validated the core functionality and usability. Tests confirmed the function of each part of the module, its accuracy and the implications of integration.

  8. Comparison of public peak detection algorithms for MALDI mass spectrometry data analysis.

    Science.gov (United States)

    Yang, Chao; He, Zengyou; Yu, Weichuan

    2009-01-06

    In mass spectrometry (MS) based proteomic data analysis, peak detection is an essential step for subsequent analysis. Recently, there has been significant progress in the development of various peak detection algorithms. However, neither a comprehensive survey nor an experimental comparison of these algorithms is yet available. The main objective of this paper is to provide such a survey and to compare the performance of single spectrum based peak detection methods. In general, we can decompose a peak detection procedure into three consecutive parts: smoothing, baseline correction and peak finding. We first categorize existing peak detection algorithms according to the techniques used in different phases. Such a categorization reveals the differences and similarities among existing peak detection algorithms. Then, we choose five typical peak detection algorithms to conduct a comprehensive experimental study using both simulation data and real MALDI MS data. The results of comparison show that the continuous wavelet-based algorithm provides the best average performance.
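
    As a concrete illustration of the smoothing / baseline-correction / peak-finding decomposition, the sketch below applies a Savitzky-Golay smoother, a crude percentile baseline, and SciPy's continuous-wavelet peak finder to a synthetic spectrum; it is not any of the specific algorithms benchmarked in the paper, and the window sizes and widths are assumptions.

```python
# Minimal sketch of the three-stage peak detection pipeline on a synthetic
# MALDI-like spectrum: smoothing -> baseline correction -> CWT peak finding.
import numpy as np
from scipy.signal import savgol_filter, find_peaks_cwt

mz = np.linspace(1000, 5000, 4000)
spectrum = (np.exp(-0.5 * ((mz - 2000) / 5) ** 2) * 50      # peak 1
            + np.exp(-0.5 * ((mz - 3500) / 5) ** 2) * 30    # peak 2
            + 0.002 * (5000 - mz)                           # sloping baseline
            + np.random.default_rng(0).normal(0, 0.5, mz.size))

smoothed = savgol_filter(spectrum, window_length=21, polyorder=3)   # smoothing
baseline = np.percentile(smoothed, 10)                              # crude baseline
corrected = np.clip(smoothed - baseline, 0, None)                   # correction
peak_idx = find_peaks_cwt(corrected, widths=np.arange(3, 20))       # peak finding
print(mz[peak_idx])
```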

  9. INCREMENTAL PRINCIPAL COMPONENT ANALYSIS BASED OUTLIER DETECTION METHODS FOR SPATIOTEMPORAL DATA STREAMS

    Directory of Open Access Journals (Sweden)

    A. Bhushan

    2015-07-01

    Full Text Available In this paper, we address outliers in spatiotemporal data streams obtained from sensors placed across geographically distributed locations. Outliers may appear in such sensor data for various reasons, such as instrumental error and environmental change. Real-time detection of these outliers is essential to prevent the propagation of errors in subsequent analyses and results. Incremental Principal Component Analysis (IPCA) is one possible approach for detecting outliers in this type of spatiotemporal data stream. IPCA has been widely used in many real-time applications such as credit card fraud detection, pattern recognition, and image analysis. However, the suitability of applying IPCA for outlier detection in spatiotemporal data streams is unknown and needs to be investigated. To fill this research gap, this paper contributes by presenting two new IPCA-based outlier detection methods and performing a comparative analysis with existing IPCA-based outlier detection methods to assess their suitability for spatiotemporal sensor data streams.
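
    The sketch below illustrates the generic IPCA idea for streaming outliers (fit the principal subspace incrementally, flag samples with large reconstruction error); the paper's two new methods differ in detail, so the thresholding and the simulated stream below are assumptions for illustration only.

```python
# Sketch of reconstruction-error outlier flagging with incremental PCA on a
# simulated stream of correlated sensor readings (not the paper's methods).
import numpy as np
from sklearn.decomposition import IncrementalPCA

rng = np.random.default_rng(1)
ipca = IncrementalPCA(n_components=1)

def process_batch(batch, threshold=3.0):
    """Update the model with a mini-batch of sensor readings, then return a
    boolean mask of suspected outliers based on reconstruction error."""
    ipca.partial_fit(batch)
    recon = ipca.inverse_transform(ipca.transform(batch))
    err = np.linalg.norm(batch - recon, axis=1)
    return err > err.mean() + threshold * err.std()

# simulated spatiotemporal stream: 3 correlated sensors, one corrupted reading
for _ in range(5):
    base = rng.normal(size=(64, 1))
    batch = np.hstack([base, 0.9 * base, 1.1 * base]) + rng.normal(0, 0.05, (64, 3))
    batch[0, 0] += 5.0   # corrupt one sensor, breaking the cross-sensor correlation
    print(process_batch(batch).sum(), "flagged in this batch")
```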

  10. Network Intrusion Forensic Analysis Using Intrusion Detection System

    OpenAIRE

    Manish Kumar; Dr. M. Hanumanthappa; Dr. T.V. Suresh Kumar

    2011-01-01

    The need for computer intrusion forensics arises from the alarming increase in the number of computer crimes that are committed annually. After a computer system has been breached and an intrusion has been detected, there is a need for a computer forensics investigation to follow. Computer forensics is used to bring to justice those responsible for conducting attacks on computer systems throughout the world. Because of this, the law must be followed precisely when conducting a forensics investi...

  11. Studying Fake News via Network Analysis: Detection and Mitigation

    OpenAIRE

    Shu, Kai; Bernard, H. Russell; Liu, Huan

    2018-01-01

    Social media for news consumption is becoming increasingly popular due to its easy access, fast dissemination, and low cost. However, social media also enable the wide propagation of "fake news", i.e., news with intentionally false information. Fake news on social media poses significant negative societal effects, and also presents unique challenges. To tackle the challenges, many existing works exploit various features, from a network perspective, to detect and mitigate fake news. In essence...

  12. Detection and Monitoring of Neurotransmitters - a Spectroscopic Analysis

    Science.gov (United States)

    Manciu, Felicia; Lee, Kendall; Durrer, William; Bennet, Kevin

    2012-10-01

    In this work we demonstrate the capability of confocal Raman mapping spectroscopy for simultaneously and locally detecting important compounds in neuroscience such as dopamine, serotonin, and adenosine. The Raman results show shifting of the characteristic vibrations of the compounds, observations consistent with previous spectroscopic studies. Although some vibrations are common in these neurotransmitters, Raman mapping was achieved by detecting non-overlapping characteristic spectral signatures of the compounds, as follows: for dopamine the vibration attributed to C-O stretching, for serotonin the indole ring stretching vibration, and for adenosine the adenine ring vibrations. Without damage, dyeing, or preferential sample preparation, confocal Raman mapping provided positive detection of each neurotransmitter, allowing association of the high-resolution spectra with specific micro-scale image regions. Such information is particularly important for complex, heterogeneous samples, where modification of the chemical or physical composition can influence the neurotransmission processes. We also report an estimated dopamine diffusion coefficient two orders of magnitude smaller than that calculated by the flow-injection method.

  13. Plagiarism and Source Deception Detection Based on Syntax Analysis

    Directory of Open Access Journals (Sweden)

    Eman Salih Al-Shamery

    2017-02-01

    Full Text Available In this research, the shingle algorithm with the Jaccard method is employed as a new approach to detect deception in sources in addition to detecting plagiarism. Source deception occurs when a particular text is taken from one source but attributed to another, while plagiarism occurs in documents when part or all of the text is taken from another work. The approach is based on the shingle algorithm with the Jaccard coefficient: shingling is an efficient way to compare the sets of shingles in text files, which are used as a feature to measure the syntactic similarity of documents, and it works together with the Jaccard coefficient, which measures the similarity between sample sets. In the proposed system, a text is checked for syntactic plagiarism and a percentage of similarity with other documents is reported. Research sources are also checked to detect source deception by matching them with the available sources from the Turnitin report of the same work, using the shingle algorithm with the Jaccard coefficient. The motivation of this work is the discovery of literary theft occurring in research, especially in student work, and of the deception that occurs in cited sources.
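
    A minimal sketch of w-shingling combined with the Jaccard coefficient, the core mechanism described above; the tokenisation, shingle size and example texts are assumptions chosen only for illustration.

```python
# Word-level shingling plus Jaccard similarity between two short documents.
def shingles(text, w=3):
    words = text.lower().split()
    return {tuple(words[i:i + w]) for i in range(len(words) - w + 1)}

def jaccard(a, b):
    return len(a & b) / len(a | b) if (a | b) else 0.0

doc1 = "the shingle algorithm compares sets of overlapping word sequences"
doc2 = "the shingle algorithm compares sets of word sequences for plagiarism"
print(f"similarity: {jaccard(shingles(doc1), shingles(doc2)):.2f}")
```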

  14. Detecting Anomaly in Traffic Flow from Road Similarity Analysis

    KAUST Repository

    Liu, Xinran

    2016-06-02

    Taxis equipped with GPS devices are considered 24-hour moving sensors widely distributed in urban road networks. Large numbers of accurate, real-time taxi trajectories are recorded by GPS devices and are commonly studied for understanding traffic dynamics. This paper focuses on anomaly detection in traffic volume, especially non-recurrent traffic anomalies caused by unexpected or transient incidents, such as traffic accidents, celebrations and disasters. It is important to detect such sharp changes of traffic status for sensing abnormal events and assessing their impact on traffic flow. Unlike existing anomaly detection approaches that mainly monitor the deviation of current traffic status from its history, the proposed method evaluates the abnormality score of traffic on one road by comparing its current traffic volume not only with its historical data but also with its neighbors. We define the neighbors as the roads that are close in terms of both geo-location and traffic patterns, which are extracted by matrix factorization. The evaluation results on trajectory data of 12,286 taxis over four weeks in Beijing show that our approach outperforms other baseline methods with higher precision and recall.
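
    A hedged sketch of the scoring idea: compare a road's current volume against its own history and against "neighbour" roads found via matrix factorisation of the road-by-time history. The exact factorisation and scoring in the paper differ; the NMF rank, the z-score combination and the simulated data below are assumptions.

```python
# Toy neighbour-aware anomaly score for one road's current traffic volume.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(2)
history = rng.poisson(lam=50, size=(30, 96)).astype(float)   # roads x time slots

# latent traffic-pattern factors per road, used to find similar roads
factors = NMF(n_components=5, init="nndsvda", max_iter=500).fit_transform(history)

def anomaly_score(road, current_volume, k=3):
    # deviation from the road's own history (z-score)
    own = (current_volume - history[road].mean()) / (history[road].std() + 1e-9)
    # deviation from the typical volume on the k most similar roads
    dists = np.linalg.norm(factors - factors[road], axis=1)
    neighbours = np.argsort(dists)[1:k + 1]
    neigh = (current_volume - history[neighbours].mean()) / (history[neighbours].std() + 1e-9)
    return abs(own) + abs(neigh)

print(anomaly_score(road=0, current_volume=120.0))
```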

  15. Determination of detection limits for a VPD ICPMS method of analysis

    International Nuclear Information System (INIS)

    Badard, M.; Veillerot, M.

    2007-01-01

    This training course report presents the different methods for detecting and quantifying metallic impurities in semiconductors. One of the most precise techniques is the collection of metal impurities by vapor phase decomposition (VPD) followed by their analysis by ICPMS (inductively coupled plasma mass spectrometry). The study shows the importance of detection limits in the domain of chemical analysis and how to determine them for ICPMS analysis. The detection limits found are excellent. Even if the detection limits reached with ICPMS performed after manual or automatic VPD are much higher than the detection limits of ICPMS alone, this method remains one of the most sensitive for ultra-trace analysis. (J.S.)
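
    The abstract does not spell out the procedure, but a common convention for a detection limit, sketched below under that assumption, is three times the standard deviation of replicate blank measurements divided by the calibration slope; the numerical values are invented for illustration.

```python
# Hedged sketch of the 3-sigma detection-limit convention for an ICPMS channel.
import numpy as np

blank_counts = np.array([102., 98., 110., 95., 105., 99., 101., 97.])  # cps, blanks
calibration_slope = 2.5e4        # counts per second per ng/g (assumed value)

detection_limit = 3 * blank_counts.std(ddof=1) / calibration_slope
print(f"estimated detection limit: {detection_limit * 1e3:.2f} pg/g")
```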

  16. Automatic Detection of Optic Disc in Retinal Image by Using Keypoint Detection, Texture Analysis, and Visual Dictionary Techniques

    Directory of Open Access Journals (Sweden)

    Kemal Akyol

    2016-01-01

    Full Text Available With the advances in the computer field, methods and techniques in automatic image processing and analysis provide the opportunity to detect automatically the change and degeneration in retinal images. Localization of the optic disc is extremely important for determining hard exudate lesions or neovascularization, the later phase of diabetic retinopathy, in computer-aided eye disease diagnosis systems. Whereas optic disc detection is a fairly easy process in normal retinal images, detecting this region in retinal images affected by diabetic retinopathy may be difficult. Sometimes the information related to the optic disc and to hard exudates may appear the same in terms of machine learning. We present a novel approach for efficient and accurate localization of the optic disc in retinal images containing noise and other lesions. This approach comprises five main steps: image processing, keypoint extraction, texture analysis, visual dictionary, and classifier techniques. We tested our proposed technique on 3 public datasets and obtained quantitative results. Experimental results show that an average optic disc detection accuracy of 94.38%, 95.00%, and 90.00% is achieved, respectively, on the following public datasets: DIARETDB1, DRIVE, and ROC.

  17. Detection of the irradiated domestic potato by thermoluminescence analysis

    International Nuclear Information System (INIS)

    Nakauma, M.; Saitou, K.; Todoriki, S.

    2004-01-01

    The applicability of the thermoluminescence (TL) method to the detection of irradiated Japanese domestic potatoes was examined. Potatoes of nine different origins were exposed to gamma rays at 0 to 150 Gy, and the minerals separated from the potatoes were subjected to TL measurements. The fading of the TL emission intensity during post-irradiation storage was also examined. The irradiated samples gave higher TL signal intensities than the non-irradiated ones, and the intensity was proportional to the dose. However, the TL yield per unit weight and the shape of the glow curves varied among samples from different producing regions, so the irradiated samples were difficult to detect using those parameters alone. TL ratios obtained from the irradiated samples of all regions were 0.15-1.00, whereas those obtained from non-irradiated samples were less than 0.15. Therefore, the irradiated and non-irradiated samples can be distinguished by normalization (TL ratio). The signal intensity decreased with storage time, and the reduction was more rapid in samples stored in the light than in darkness. However, samples irradiated at 150 Gy and stored for five months could still be distinguished from non-irradiated samples by their TL ratio. It was also possible to apply this method to the detection of domestic potatoes that were irradiated in a commercial facility and purchased at a local market. (Received Oct. 22, 2003; Accepted Mar. 18, 2004)
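
    A minimal sketch of the normalisation criterion reported above; it assumes the TL ratio is the first (natural) glow signal divided by the signal measured after a normalising re-irradiation, and it reuses the 0.15 threshold quoted in the abstract.

```python
# Toy classifier based on the TL ratio ranges quoted in the abstract.
def classify_by_tl_ratio(glow1_intensity, glow2_intensity, threshold=0.15):
    """glow1: natural TL signal; glow2: signal after a normalising re-irradiation
    (assumed definition). Samples above the threshold are called irradiated."""
    tl_ratio = glow1_intensity / glow2_intensity
    return ("irradiated" if tl_ratio > threshold else "not irradiated"), tl_ratio

print(classify_by_tl_ratio(4.2e4, 1.1e5))
```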

  18. Teardown analysis for detecting shelf-life degradation

    Science.gov (United States)

    Eckstein, A. S.

    1971-01-01

    The analysis serves as a guideline for examining component materials, analytically determining physical properties and chemical compositions, and developing the control data necessary for ascertaining the effects of environments and their influence on deterioration and degradation mechanisms.

  19. Application of thermoluminescence analysis for detection of irradiated foodstuffs

    International Nuclear Information System (INIS)

    Malec-Czechowska, K.; Dancewicz, M.; Szot, Z.

    1996-01-01

    Investigations on the conditions necessary to obtain reliable results in the detection of irradiated herbs, spices, mushrooms and strawberries by the thermoluminescence (TL) method, in whole samples and/or in minerals isolated from them, were carried out. TL intensities of whole samples were measured between 40 and 355°C. Threshold values for non-irradiated samples were obtained by multiplying the TL values of control samples by 3 (safety factor). Samples in which TL intensities were higher than the threshold values were identified as irradiated. In 4 out of 10 kinds of herbs and spices examined, the results of TL measurements led to false identification. (author). 14 refs, 1 fig., 5 tabs

  20. Expert knowledge and data analysis for detecting advanced persistent threats

    Directory of Open Access Journals (Sweden)

    Moya Juan Ramón

    2017-08-01

    Full Text Available Critical infrastructures in public administration can be compromised by Advanced Persistent Threats (APTs), which today constitute one of the most sophisticated ways of stealing information. This paper presents an effective, learning-based tool that uses inductive techniques to analyze the information provided by firewall log files in an IT infrastructure and to detect suspicious activity in order to mark it as a potential APT. The experiments were carried out by mixing real and synthetic data traffic to represent different proportions of normal and anomalous activity.

  1. Applied network security monitoring collection, detection, and analysis

    CERN Document Server

    Sanders, Chris

    2013-01-01

    Applied Network Security Monitoring is the essential guide to becoming an NSM analyst from the ground up. This book takes a fundamental approach to NSM, complete with dozens of real-world examples that teach you the key concepts of NSM. Network security monitoring is based on the principle that prevention eventually fails. In the current threat landscape, no matter how much you try, motivated attackers will eventually find their way into your network. At that point, it is your ability to detect and respond to that intrusion that can be the difference between a small incident and a major di

  2. Current methods of handling less-than-detectable measurements and detection limits in statistical analysis of environmental data

    International Nuclear Information System (INIS)

    Hertzler, C.L.; Atwood, C.L.; Harris, G.A.

    1989-09-01

    A search was made of statistical literature that might be applicable in environmental assessment contexts, when some of the measured quantities are reported as less than detectable (LTD). Over 60 documents were reviewed, and the findings are described in this report. The methodological areas considered are parameter estimation (point estimates and confidence intervals), tolerance intervals and prediction intervals, regression, trend analysis, comparisons of populations (including two-sample comparisons and analysis of variance), and goodness of fit tests. The conclusions are summarized at the end of the report. 68 refs., 1 tab

  3. Face Liveness Detection Based on Skin Blood Flow Analysis

    Directory of Open Access Journals (Sweden)

    Shun-Yi Wang

    2017-12-01

    Full Text Available Face recognition systems have been widely adopted for user authentication in security systems due to their simplicity and effectiveness. However, spoofing attacks, including printed photos, displayed photos, and replayed video attacks, are critical challenges to authentication, and these spoofing attacks allow malicious invaders to gain access to the system. This paper proposes two novel features for face liveness detection systems to protect against printed photo attacks and replayed attacks for biometric authentication systems. The first feature obtains the texture difference between red and green channels of face images inspired by the observation that skin blood flow in the face has properties that enable distinction between live and spoofing face images. The second feature estimates the color distribution in the local regions of face images, instead of whole images, because image quality might be more discriminative in small areas of face images. These two features are concatenated together, along with a multi-scale local binary pattern feature, and a support vector machine classifier is trained to discriminate between live and spoofing face images. The experimental results show that the performance of the proposed method for face spoof detection is promising when compared with that of previously published methods. Furthermore, the proposed system can be implemented in real time, which is valuable for mobile applications.
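
    The toy sketch below mimics the flavour of the described pipeline (a red/green texture difference from local binary patterns plus simple colour statistics, fed to an SVM); the paper's actual feature definitions and multi-scale LBP are richer, so every parameter and the random training data here are assumptions.

```python
# Toy liveness-feature pipeline: red/green LBP-texture difference + colour stats -> SVM.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def lbp_hist(channel, P=8, R=1):
    lbp = local_binary_pattern(channel, P, R, method="uniform")
    hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
    return hist

def liveness_features(rgb_face):
    red, green = rgb_face[..., 0], rgb_face[..., 1]
    rg_texture = lbp_hist(red) - lbp_hist(green)     # red/green texture difference
    color_stats = np.array([red.mean(), green.mean(), red.std(), green.std()])
    return np.concatenate([rg_texture, color_stats])

# train on hypothetical labelled face crops (1 = live, 0 = spoof)
faces = np.random.default_rng(3).integers(0, 256, size=(20, 64, 64, 3))
labels = np.array([1, 0] * 10)
X = np.array([liveness_features(f) for f in faces])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict(X[:2]))
```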

  4. Skin Lesion Analysis towards Melanoma Detection Using Deep Learning Network

    Directory of Open Access Journals (Sweden)

    Yuexiang Li

    2018-02-01

    Full Text Available Skin lesions are a severe disease globally. Early detection of melanoma in dermoscopy images significantly increases the survival rate. However, the accurate recognition of melanoma is extremely challenging due to the following reasons: low contrast between lesions and skin, visual similarity between melanoma and non-melanoma lesions, etc. Hence, reliable automatic detection of skin tumors is very useful to increase the accuracy and efficiency of pathologists. In this paper, we proposed two deep learning methods to address three main tasks emerging in the area of skin lesion image processing, i.e., lesion segmentation (task 1), lesion dermoscopic feature extraction (task 2) and lesion classification (task 3). A deep learning framework consisting of two fully convolutional residual networks (FCRN) is proposed to simultaneously produce the segmentation result and the coarse classification result. A lesion index calculation unit (LICU) is developed to refine the coarse classification results by calculating the distance heat-map. A straight-forward CNN is proposed for the dermoscopic feature extraction task. The proposed deep learning frameworks were evaluated on the ISIC 2017 dataset. Experimental results show the promising accuracies of our frameworks, i.e., 0.753 for task 1, 0.848 for task 2 and 0.912 for task 3 were achieved.

  5. Skin Lesion Analysis towards Melanoma Detection Using Deep Learning Network.

    Science.gov (United States)

    Li, Yuexiang; Shen, Linlin

    2018-02-11

    Skin lesions are a severe disease globally. Early detection of melanoma in dermoscopy images significantly increases the survival rate. However, the accurate recognition of melanoma is extremely challenging due to the following reasons: low contrast between lesions and skin, visual similarity between melanoma and non-melanoma lesions, etc. Hence, reliable automatic detection of skin tumors is very useful to increase the accuracy and efficiency of pathologists. In this paper, we proposed two deep learning methods to address three main tasks emerging in the area of skin lesion image processing, i.e., lesion segmentation (task 1), lesion dermoscopic feature extraction (task 2) and lesion classification (task 3). A deep learning framework consisting of two fully convolutional residual networks (FCRN) is proposed to simultaneously produce the segmentation result and the coarse classification result. A lesion index calculation unit (LICU) is developed to refine the coarse classification results by calculating the distance heat-map. A straight-forward CNN is proposed for the dermoscopic feature extraction task. The proposed deep learning frameworks were evaluated on the ISIC 2017 dataset. Experimental results show the promising accuracies of our frameworks, i.e., 0.753 for task 1, 0.848 for task 2 and 0.912 for task 3 were achieved.

  6. Skin Lesion Analysis towards Melanoma Detection Using Deep Learning Network

    Science.gov (United States)

    2018-01-01

    Skin lesions are a severe disease globally. Early detection of melanoma in dermoscopy images significantly increases the survival rate. However, the accurate recognition of melanoma is extremely challenging due to the following reasons: low contrast between lesions and skin, visual similarity between melanoma and non-melanoma lesions, etc. Hence, reliable automatic detection of skin tumors is very useful to increase the accuracy and efficiency of pathologists. In this paper, we proposed two deep learning methods to address three main tasks emerging in the area of skin lesion image processing, i.e., lesion segmentation (task 1), lesion dermoscopic feature extraction (task 2) and lesion classification (task 3). A deep learning framework consisting of two fully convolutional residual networks (FCRN) is proposed to simultaneously produce the segmentation result and the coarse classification result. A lesion index calculation unit (LICU) is developed to refine the coarse classification results by calculating the distance heat-map. A straight-forward CNN is proposed for the dermoscopic feature extraction task. The proposed deep learning frameworks were evaluated on the ISIC 2017 dataset. Experimental results show the promising accuracies of our frameworks, i.e., 0.753 for task 1, 0.848 for task 2 and 0.912 for task 3 were achieved. PMID:29439500

  7. Equivalent Dipole Vector Analysis for Detecting Pulmonary Hypertension

    Science.gov (United States)

    Harlander, Matevz; Salobir, Barbara; Toplisek, Janez; Schlegel, Todd T.; Starc, Vito

    2010-01-01

    Various 12-lead ECG criteria have been established to detect right ventricular hypertrophy as a marker of pulmonary hypertension (PH). While some criteria offer good specificity they lack sensitivity because of a low prevalence of positive findings in the PH population. We hypothesized that a three-dimensional equivalent dipole (ED) model could serve as a better detection tool of PH. We enrolled: 1) 17 patients (12 female, 5 male, mean age 57 years, range 19-79 years) with echocardiographically detected PH (systolic pulmonary arterial pressure greater than 35 mmHg) and no significant left ventricular disease; and 2) 19 healthy controls (7 female, 12 male, mean age 44, range 31-53 years) with no known heart disease. In each subject we recorded a 5-minute high-resolution 12-lead conventional ECG and constructed principal signals using singular value decomposition. Assuming a standard thorax dimension of an adult person with homogenous and isotropic distribution of thorax conductance, we determined moving equivalent dipoles (ED), characterized by the 3D location in the thorax, dipolar strength and the spatial orientation, in time intervals of 5 ms. We used the sum of all ED vectors in the second half of the QRS complex to derive the amplitude of the right-sided ED vector (RV), if the orientation of ED was to the right side of the thorax, and in the first half of the QRS to derive the amplitude of the left-sided vector (LV), if the orientation was leftward. Finally, the parameter RV/LV ratio was determined over an average of 256 complexes. The groups differed in age and gender to some extent. There was a non-significant trend toward higher RV in patients with PH (438 plus or minus 284) than in controls (280 plus or minus 140) (p = 0.066) but the overlap was such that RV alone was not a good predictor of PH. On the other hand, the RV/LV ratio was a better predictor of PH, with 11/17 (64.7%) of PH patients but only 1/19 (5.3%) of control subjects having RV/LV ratio greater than or

  8. Steam turbine coupling misalignment detection by vibrational analysis

    International Nuclear Information System (INIS)

    Behzad, M.; Asoyesh, M.

    2001-01-01

    Machinery troubleshooting and diagnostics via vibration analysis have historically been proven and have once again become prominent topics with the recent popularity of predictive maintenance programs. Among the several causes of vibration in turbomachinery, coupling misalignment plays an important role. The results of a theoretical analysis of coupling misalignment and its frequency spectrum characteristics, which can be used for predictive maintenance programs, are compared with other numerical investigations and practical results. The analytical method used in this research is very straightforward and does not need any computer programming.

  9. Efficient hemodynamic event detection utilizing relational databases and wavelet analysis

    Science.gov (United States)

    Saeed, M.; Mark, R. G.

    2001-01-01

    Development of a temporal query framework for time-oriented medical databases has hitherto been a challenging problem. We describe a novel method for the detection of hemodynamic events in multiparameter trends utilizing wavelet coefficients in a MySQL relational database. Storage of the wavelet coefficients allowed for a compact representation of the trends, and provided robust descriptors for the dynamics of the parameter time series. A data model was developed to allow for simplified queries along several dimensions and time scales. Of particular importance, the data model and wavelet framework allowed for queries to be processed with minimal table-join operations. A web-based search engine was developed to allow for user-defined queries. Typical queries required between 0.01 and 0.02 seconds, with at least two orders of magnitude improvement in speed over conventional queries. This powerful and innovative structure will facilitate research on large-scale time-oriented medical databases.
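
    A self-contained sketch of the idea of storing wavelet coefficients of a vital-sign trend in a relational table and querying them for events; the paper used MySQL and a richer data model, so the SQLite schema, wavelet choice and query below are illustrative assumptions.

```python
# Decompose a trend with a discrete wavelet transform, store the detail
# coefficients in a relational table, then query for the largest coefficients
# (coarse-scale changes such as a sudden pressure drop stand out).
import sqlite3
import numpy as np
import pywt

trend = np.concatenate([np.full(128, 80.0), np.full(128, 55.0)])  # e.g. a MAP drop
coeffs = pywt.wavedec(trend, "db4", level=3)

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE coeffs (level INTEGER, idx INTEGER, value REAL)")
for level, arr in enumerate(coeffs[1:], start=1):        # detail coefficients only
    db.executemany("INSERT INTO coeffs VALUES (?, ?, ?)",
                   [(level, i, float(v)) for i, v in enumerate(arr)])

rows = db.execute("SELECT level, idx, value FROM coeffs "
                  "ORDER BY ABS(value) DESC LIMIT 3").fetchall()
print(rows)   # largest coefficients mark where the trend changes abruptly
```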

  10. Facial image analysis to detect gestational alcohol exposure

    African Journals Online (AJOL)

    of structural anomalies and neurocognitive and behavioural disabilities, known as ... A simple, cost-effective method for large-scale FAS and FASD screening ... may be introduced to address the learning needs of affected children. ... early interventions for children with FAS. .... with FAS.15 Landmark-based shape analysis.

  11. Methods of Detecting Outliers in A Regression Analysis Model. | Ogu ...

    African Journals Online (AJOL)

    A boiler dataset with dependent variable Y (man-hours) and four independent variables X1 (boiler capacity), X2 (design pressure), X3 (boiler type), X4 (drum type) was used. The analysis of the boiler data revealed an unexpected group of outliers. The results from the findings showed that an observation can be outlying ...

  12. Robust detection of discordant sites in regional frequency analysis

    NARCIS (Netherlands)

    Neykov, N.M.; Neytchev, P.N.; Van Gelder, P.H.A.J.M.; Todorov, V.K.

    2007-01-01

    The discordancy measure in terms of the sample L-moment ratios (L-CV, L-skewness, L-kurtosis) of the at-site data is widely recommended in the screening process of atypical sites in the regional frequency analysis (RFA). The sample mean and the covariance matrix of the L-moment ratios, on which the

  13. Multi-granular trend detection for time-series analysis

    NARCIS (Netherlands)

    van Goethem, A.I.; Staals, F.; Löffler, M.; Dykes, J.; Speckmann, B.

    2017-01-01

    Time series (such as stock prices) and ensembles (such as model runs for weather forecasts) are two important types of one-dimensional time-varying data. Such data is readily available in large quantities but visual analysis of the raw data quickly becomes infeasible, even for moderately sized data

  14. Knowledge-Base Semantic Gap Analysis for the Vulnerability Detection

    Science.gov (United States)

    Wu, Raymond; Seki, Keisuke; Sakamoto, Ryusuke; Hisada, Masayuki

    Web security has become a pressing concern in Internet computing. To cope with ever-rising security complexity, semantic analysis is proposed to fill the gap that current approaches fail to address. Conventional methods limit their focus to the physical source code instead of the abstraction of its semantics; this misses new types of vulnerability and causes tremendous business loss.

  15. Detection of Wind Turbine Power Performance Abnormalities Using Eigenvalue Analysis

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Sweeney, Christian Walsted; Marhadi, Kun Saptohartyadi

    2014-01-01

    Condition monitoring of wind turbines is a field of continuous research and development as new turbine configurations enter into the market and new failure modes appear. Systems utilising well established techniques from the energy and industry sector, such as vibration analysis...

  16. Mutation Analysis in Classical Phenylketonuria Patients Followed by Detecting Haplotypes Linked to Some PAH Mutations.

    Science.gov (United States)

    Dehghanian, Fatemeh; Silawi, Mohammad; Tabei, Seyed M B

    2017-02-01

    Deficiency of the phenylalanine hydroxylase (PAH) enzyme and elevation of phenylalanine in body fluids cause phenylketonuria (PKU). The gold standard for confirming PKU and PAH deficiency is detecting causal mutations by direct sequencing of the coding exons and splice-involved sequences of the PAH gene. Furthermore, haplotype analysis could be considered an auxiliary approach for detecting PKU-causative mutations before direct sequencing of the PAH gene, by comparing previously identified mutation-linked haplotypes with the haplotypes of new PKU cases whose mutations are undetermined. In this study, 13 unrelated classical PKU patients took part in the detection of causative mutations. Mutations were identified by polymerase chain reaction (PCR) and direct sequencing in all patients. After that, haplotype analysis was performed by studying VNTR and PAHSTR markers (linked genetic markers of the PAH gene) through application of PCR and capillary electrophoresis (CE). Mutation analysis was performed successfully and the detected mutations were as follows: c.782G>A, c.754C>T, c.842C>G, c.113-115delTCT, c.688G>A, and c.696A>G. Additionally, PAHSTR/VNTR haplotypes were determined to discover the haplotypes linked to each mutation. Mutation detection is the best approach for confirming PAH enzyme deficiency in PKU patients. Due to the relatively large size of the PAH gene and the high cost of direct sequencing in developing countries, haplotype analysis could be used before DNA sequencing and mutation detection as a faster and cheaper way of identifying probably mutated exons.

  17. An RFI Detection Algorithm for Microwave Radiometers Using Sparse Component Analysis

    Science.gov (United States)

    Mohammed-Tano, Priscilla N.; Korde-Patel, Asmita; Gholian, Armen; Piepmeier, Jeffrey R.; Schoenwald, Adam; Bradley, Damon

    2017-01-01

    Radio Frequency Interference (RFI) is a threat to passive microwave measurements and if undetected, can corrupt science retrievals. The sparse component analysis (SCA) for blind source separation has been investigated to detect RFI in microwave radiometer data. Various techniques using SCA have been simulated to determine detection performance with continuous wave (CW) RFI.

  18. A light detection cell to be used in a micro analysis system for ammonia

    NARCIS (Netherlands)

    Tiggelaar, Roald M.; Veenstra, T.T.; Sanders, Remco G.P.; Gardeniers, Johannes G.E.; Elwenspoek, Michael Curt; van den Berg, Albert

    2002-01-01

    This paper describes the design, realization and characterization of a micromachined light detection cell. This light detection cell is designed to meet the specifications needed for a micro total analysis system in which ammonia is converted to indophenol blue. The concentration of indophenol blue

  19. A parylene-based dual channel microelectrophoresis system for rapid mutation detection via heteroduplex analysis

    NARCIS (Netherlands)

    Sukas, S.; Erson, Ayse Elif; Sert, Cuneyt; Kulah, Haluk

    2008-01-01

    A new dual channel micro-electrophoresis system for rapid mutation detection based on heteroduplex analysis was designed and implemented. Mutation detection was successfully achieved in a total separation length of 250 μm in less than 3 min for a 590 bp DNA sample harboring a 3 bp mutation causing

  20. AN IMAGE-ANALYSIS TECHNIQUE FOR DETECTION OF RADIATION-INDUCED DNA FRAGMENTATION AFTER CHEF ELECTROPHORESIS

    NARCIS (Netherlands)

    ROSEMANN, M; KANON, B; KONINGS, AWT; KAMPINGA, HH

    CHEF electrophoresis was used as a technique to detect radiation-induced DNA breakage, with special emphasis on biologically relevant X-ray doses (0-10 Gy). Fluorescence detection of DNA fragments using a sensitive image analysis system was directly compared with conventional scintillation counting of

  1. Detection of COL III in Parchment by Amino Acid Analysis

    DEFF Research Database (Denmark)

    Vestergaard Poulsen Sommer, Dorte; Larsen, René

    2016-01-01

    Cultural heritage parchments made from the reticular dermis of animals have been subject to studies of deterioration and conservation by amino acid analysis. The reticular dermis contains a varying mixture of collagen I and III (COL I and III). When dealing with the results of the amino acid...... analyses, till now the COL III content has not been taken into account. Based on the available amino acid sequences we present a method for determining the amount of COL III in the reticular dermis of new and historical parchments calculated from the ratio of Ile/Val. We find COL III contents between 7...... and 32 % in new parchments and between 0.2 and 40 % in the historical parchments. This is consistent with results in the literature. The varying content of COL III has a significant influence on the uncertainty of the amino acid analysis. Although we have not found a simple correlation between the COL...

  2. Nuisance Source Population Modeling for Radiation Detection System Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sokkappa, P; Lange, D; Nelson, K; Wheeler, R

    2009-10-05

    A major challenge facing the prospective deployment of radiation detection systems for homeland security applications is the discrimination of radiological or nuclear 'threat sources' from radioactive, but benign, 'nuisance sources'. Common examples of such nuisance sources include naturally occurring radioactive material (NORM), medical patients who have received radioactive drugs for either diagnostics or treatment, and industrial sources. A sensitive detector that cannot distinguish between 'threat' and 'benign' classes will generate false positives which, if sufficiently frequent, will preclude it from being operationally deployed. In this report, we describe a first-principles physics-based modeling approach that is used to approximate the physical properties and corresponding gamma ray spectral signatures of real nuisance sources. Specific models are proposed for the three nuisance source classes - NORM, medical and industrial. The models can be validated against measured data - that is, energy spectra generated with the model can be compared to actual nuisance source data. We show by example how this is done for NORM and medical sources, using data sets obtained from spectroscopic detector deployments for cargo container screening and urban area traffic screening, respectively. In addition to capturing the range of radioactive signatures of individual nuisance sources, a nuisance source population model must generate sources with a frequency of occurrence consistent with that found in actual movement of goods and people. Measured radiation detection data can indicate these frequencies, but, at present, such data are available only for a very limited set of locations and time periods. In this report, we make more general estimates of frequencies for NORM and medical sources using a range of data sources such as shipping manifests and medical treatment statistics. We also identify potential data sources for industrial

  3. Bioinformatics analysis and detection of gelatinase encoded gene in Lysinibacillussphaericus

    Science.gov (United States)

    Repin, Rul Aisyah Mat; Mutalib, Sahilah Abdul; Shahimi, Safiyyah; Khalid, Rozida Mohd.; Ayob, Mohd. Khan; Bakar, Mohd. Faizal Abu; Isa, Mohd Noor Mat

    2016-11-01

    In this study, we performed a bioinformatics analysis of the genome sequence of Lysinibacillus sphaericus (L. sphaericus) to determine the gene encoding gelatinase. L. sphaericus was isolated from soil and produces gelatinase that is species-specific toward porcine and bovine gelatin. This bacterium therefore offers the possibility of producing enzymes specific to each species of meat. The main focus of this research is to identify the gelatinase-encoding gene within L. sphaericus using bioinformatics analysis of its partially sequenced genome. From this study, three candidate genes were identified: gelatinase candidate gene 1 (P1), NODE_71_length_93919_cov_158.931839_21, containing 1563 base pairs (bp) and encoding a 520-amino-acid sequence; gelatinase candidate gene 2 (P2), NODE_23_length_52851_cov_190.061386_17, containing 1776 bp and encoding a 591-amino-acid sequence; and gelatinase candidate gene 3 (P3), NODE_106_length_32943_cov_169.147919_8, containing 1701 bp and encoding a 566-amino-acid sequence. Three pairs of oligonucleotide primers, namely F1, R1, F2, R2, F3 and R3, were designed to target short cDNA sequences by PCR. The amplicons reliably resulted in 1563 bp for candidate gene P1 and 1701 bp for candidate gene P3. Therefore, the bioinformatics analysis of L. sphaericus identified candidate gelatinase-encoding genes.

  4. Familial cases of Norrie disease detected by copy number analysis.

    Science.gov (United States)

    Arai, Eisuke; Fujimaki, Takuro; Yanagawa, Ai; Fujiki, Keiko; Yokoyama, Toshiyuki; Okumura, Akihisa; Shimizu, Toshiaki; Murakami, Akira

    2014-09-01

    Norrie disease (ND, MIM#310600) is an X-linked disorder characterized by severe vitreoretinal dysplasia at birth. We report the results of causative NDP gene analysis in three male siblings with Norrie disease and describe the associated phenotypes. Three brothers with suspected Norrie disease and their mother presented for clinical examination. After obtaining informed consent, DNA was extracted from the peripheral blood of the proband, one of his brothers and his unaffected mother. Exons 1-3 of the NDP gene were amplified by polymerase chain reaction (PCR), and direct sequencing was performed. Multiplex ligation-dependent probe amplification (MLPA) was also performed to search for copy number variants in the NDP gene. The clinical findings of the three brothers included no light perception, corneal opacity, shallow anterior chamber, leukocoria, total retinal detachment and mental retardation. Exon 2 of the NDP gene was not amplified in the proband and one brother, even when the PCR primers for exon 2 were changed, whereas the other two exons showed no mutations by direct sequencing. MLPA analysis showed deletion of exon 2 of the NDP gene in the proband and one brother, while there was only one copy of exon 2 in the mother. Norrie disease was diagnosed in three patients from a Japanese family by clinical examination and was confirmed by genetic analysis. To localize the defect, confirmation of copy number variation by the MLPA method was useful in the present study.

  5. 4. International symposium on analysis and detection of explosives, Jerusalem (Israel)

    International Nuclear Information System (INIS)

    1992-09-01

    From all the presentations given at the 4. International Symposium on Analysis and Detection of Explosives (September 7-10, 1992, Jerusalem, Israel), three were considered in INIS scope and separately indexed

  6. Research on data auto-analysis algorithms in the explosive detection system

    International Nuclear Information System (INIS)

    Wang Haidong; Li Yuanjing; Yang Yigang; Li Tiezhu; Chen Boxian; Cheng Jianping

    2006-01-01

    This paper mainly describes some auto-analysis algorithms in an explosive detection system based on the TNA method. These include an auto-calibration algorithm for use when the system is disturbed by other factors, an MCA auto-calibration algorithm using a calibrated spectrum, and the auto-fitting and integration of hydrogen and nitrogen element data. With these numerical algorithms, the authors can automatically and precisely analyze the gamma spectra and ultimately achieve automatic explosive detection. (authors)

  7. Detecting errors in micro and trace analysis by using statistics

    DEFF Research Database (Denmark)

    Heydorn, K.

    1993-01-01

    By assigning a standard deviation to each step in an analytical method it is possible to predict the standard deviation of each analytical result obtained by this method. If the actual variability of replicate analytical results agrees with the expected, the analytical method is said...... to be in statistical control. Significant deviations between analytical results from different laboratories reveal the presence of systematic errors, and agreement between different laboratories indicate the absence of systematic errors. This statistical approach, referred to as the analysis of precision, was applied...
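
    The quadrature-sum sketch below illustrates the stated principle that per-step standard deviations predict the standard deviation of the final result, which can then be compared with the observed scatter of replicates; the step values and the control check are assumptions, not figures from the paper.

```python
# Predicted vs. observed standard deviation for an analytical method whose
# steps contribute independent uncertainties (variances add in quadrature).
import numpy as np

step_sd = {"sampling": 0.8, "dissolution": 0.5, "irradiation": 0.3, "counting": 1.2}
predicted_sd = np.sqrt(sum(s ** 2 for s in step_sd.values()))

replicates = np.array([10.2, 11.5, 9.8, 10.9, 11.1])
observed_sd = replicates.std(ddof=1)

# crude check for statistical control: is the observed scatter consistent
# with the prediction from the individual steps?
print(predicted_sd, observed_sd, observed_sd < 2 * predicted_sd)
```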

  8. Fiber optic system design for vehicle detection and analysis

    Science.gov (United States)

    Nedoma, Jan; Zboril, Ondrej; Fajkus, Marcel; Zavodny, Petr; Kepak, Stanislav; Bednarek, Lukas; Martinek, Radek; Vasinek, Vladimir

    2016-04-01

    Fiber optic interferometers belong to a group of highly sensitive and precise devices that can measure small changes in deformation, pressure, temperature, vibration and so on. The basis of their operation is to evaluate the number of fringes over time, not changes in the intensity of the optical signal. The methodology described in the article is based on using an interferometer to monitor traffic density. The core of the solution is a Mach-Zehnder interferometer operating with single-mode G.652 optical fiber at a wavelength of 1550 nm, excited by a DFB laser. The power distribution of the laser light into the individual arms of the interferometer is in the ratio 1:1. The realized measuring scheme was terminated by an optical receiver including an InGaAs PIN photodiode. The signal registered by the photodetector was fed through an 8 Hz high-pass filter to a measuring card that captures the analog input voltage using an application written in the LabView development environment. The interferometer was placed in a waterproof box at the side of the road, where it registered the individual transits of cars in its vicinity. A removable contact belt simulating a speed retarder was placed across the road and used to create a sufficient vibration response in the detecting interferometer as cars passed. The results demonstrated that the individual vehicles passing the box showed characteristic amplitude spectra, unique for each object and with a sufficient signal-to-noise ratio (SNR). The signal was processed by applications developed for the amplitude-frequency spectrum; the maximum amplitude of the signal was evaluated and compared to the noise. The results were verified by repeated transits of different types of cars.

  9. Technoeconomic analysis of renewable hydrogen production, storage, and detection systems

    Energy Technology Data Exchange (ETDEWEB)

    Mann, M.K.; Spath, P.L.; Kadam, K. [National Renewable Energy Lab., Golden, CO (United States)

    1996-10-01

    Technical and economic feasibility studies of different degrees of completeness and detail have been performed on several projects being funded by the Department of Energy's Hydrogen Program. Work this year focused on projects at the National Renewable Energy Laboratory, although analyses of projects at other institutions are underway or planned. Highly detailed analyses were completed on a fiber optic hydrogen leak detector and a process to produce hydrogen from biomass via pyrolysis followed by steam reforming of the pyrolysis oil. Less detailed economic assessments of solar and biologically-based hydrogen production processes have been performed and focused on the steps that need to be taken to improve the competitive position of these technologies. Sensitivity analyses were conducted on all analyses to reveal the degree to which the cost results are affected by market changes and technological advances. For hydrogen storage by carbon nanotubes, a survey of the competing storage technologies was made in order to set a baseline for cost goals. A determination of the likelihood of commercialization was made for nearly all systems examined. Hydrogen from biomass via pyrolysis and steam reforming was found to have significant economic potential if a coproduct option could be co-commercialized. Photoelectrochemical hydrogen production may have economic potential, but only if low-cost cells can be modified to split water and to avoid surface oxidation. The use of bacteria to convert the carbon monoxide in biomass syngas to hydrogen was found to be slightly more expensive than the high end of currently commercial hydrogen, although there are significant opportunities to reduce costs. Finally, the cost of installing a fiber-optic chemochromic hydrogen detection system in passenger vehicles was found to be very low and competitive with alternative sensor systems.

  10. Cluster analysis for DNA methylation profiles having a detection threshold

    Directory of Open Access Journals (Sweden)

    Siegmund Kimberly D

    2006-07-01

    Full Text Available Abstract Background DNA methylation, a molecular feature used to investigate tumor heterogeneity, can be measured on many genomic regions using the MethyLight technology. Due to the combination of the underlying biology of DNA methylation and the MethyLight technology, the measurements, while being generated on a continuous scale, have a large number of 0 values. This suggests that conventional clustering methodology may not perform well on this data. Results We compare performance of existing methodology (such as k-means) with two novel methods that explicitly allow for the preponderance of values at 0. We also consider how the ability to successfully cluster such data depends upon the number of informative genes for which methylation is measured and the correlation structure of the methylation values for those genes. We show that when data is collected for a sufficient number of genes, our models do improve clustering performance compared to methods, such as k-means, that do not explicitly respect the supposed biological realities of the situation. Conclusion The performance of analysis methods depends upon how well the assumptions of those methods reflect the properties of the data being analyzed. Differing technologies will lead to data with differing properties, and should therefore be analyzed differently. Consequently, it is prudent to give thought to what the properties of the data are likely to be, and which analysis method might therefore be likely to best capture those properties.

  11. Shaft cracks detection on operating centrifugal pumps by vibration analysis

    International Nuclear Information System (INIS)

    Serra, Reynaldo Cavalcanti.

    1995-01-01

    This study gives an account of the vibratory behaviour of a centrifugal pump representative of those employed in nuclear reactors, whose shaft contained a fatigue crack with critical orientation. Two crack depths were included in the study, in addition to the uncracked shaft. Four other machined discontinuities with varying depths were also included to allow a direct comparison. The data acquisition was carried out with a system using eight accelerometers and a tape recorder. The signals were then processed and interpreted with a dynamic signal analysis workstation. Data analysis in the time domain was unsuccessful as a result of the signal complexity. The fundamental frequency and its harmonics were identified from the frequency spectra. The corresponding amplitudes were recorded and tabulated for future reference. A method was proposed to identify the evolution of the discontinuities based on departures from a reference state, and a procedure is suggested to replace the standards and practices presently in use, which are unreliable. (author). 46 refs., 48 figs., 24 tabs

  12. Genome-Wide Detection and Analysis of Multifunctional Genes

    Science.gov (United States)

    Pritykin, Yuri; Ghersi, Dario; Singh, Mona

    2015-01-01

    Many genes can play a role in multiple biological processes or molecular functions. Identifying multifunctional genes at the genome-wide level and studying their properties can shed light upon the complexity of molecular events that underpin cellular functioning, thereby leading to a better understanding of the functional landscape of the cell. However, to date, genome-wide analysis of multifunctional genes (and the proteins they encode) has been limited. Here we introduce a computational approach that uses known functional annotations to extract genes playing a role in at least two distinct biological processes. We leverage functional genomics data sets for three organisms—H. sapiens, D. melanogaster, and S. cerevisiae—and show that, as compared to other annotated genes, genes involved in multiple biological processes possess distinct physicochemical properties, are more broadly expressed, tend to be more central in protein interaction networks, tend to be more evolutionarily conserved, and are more likely to be essential. We also find that multifunctional genes are significantly more likely to be involved in human disorders. These same features also hold when multifunctionality is defined with respect to molecular functions instead of biological processes. Our analysis uncovers key features about multifunctional genes, and is a step towards a better genome-wide understanding of gene multifunctionality. PMID:26436655

  13. Detection of non-stationary leak signals at NPP primary circuit by cross-correlation analysis

    International Nuclear Information System (INIS)

    Shimanskij, S.B.

    2007-01-01

    A leak-detection system employing high-temperature microphones has been developed for the RBMK and ATR (Japan) reactors. Further improvement of the system focused on using cross-correlation analysis of the spectral components of the signal to detect a small leak at an early stage of development. Since envelope processes are less affected by distortions than are wave processes, they give a higher degree of correlation and can be used to detect leaks with lower signal-to-noise ratios. Many simulation tests performed at nuclear power plants have shown that the proposed methods can be used to detect and find the location of a small leak [ru

  14. Detecting Network Communities: An Application to Phylogenetic Analysis

    Science.gov (United States)

    Andrade, Roberto F. S.; Rocha-Neto, Ivan C.; Santos, Leonardo B. L.; de Santana, Charles N.; Diniz, Marcelo V. C.; Lobão, Thierry Petit; Goés-Neto, Aristóteles; Pinho, Suani T. R.; El-Hani, Charbel N.

    2011-01-01

    This paper proposes a new method to identify communities in generally weighted complex networks and apply it to phylogenetic analysis. In this case, weights correspond to the similarity indexes among protein sequences, which can be used for network construction so that the network structure can be analyzed to recover phylogenetically useful information from its properties. The analyses discussed here are mainly based on the modular character of protein similarity networks, explored through the Newman-Girvan algorithm, with the help of the neighborhood matrix . The most relevant networks are found when the network topology changes abruptly revealing distinct modules related to the sets of organisms to which the proteins belong. Sound biological information can be retrieved by the computational routines used in the network approach, without using biological assumptions other than those incorporated by BLAST. Usually, all the main bacterial phyla and, in some cases, also some bacterial classes corresponded totally (100%) or to a great extent (>70%) to the modules. We checked for internal consistency in the obtained results, and we scored close to 84% of matches for community pertinence when comparisons between the results were performed. To illustrate how to use the network-based method, we employed data for enzymes involved in the chitin metabolic pathway that are present in more than 100 organisms from an original data set containing 1,695 organisms, downloaded from GenBank on May 19, 2007. A preliminary comparison between the outcomes of the network-based method and the results of methods based on Bayesian, distance, likelihood, and parsimony criteria suggests that the former is as reliable as these commonly used methods. We conclude that the network-based method can be used as a powerful tool for retrieving modularity information from weighted networks, which is useful for phylogenetic analysis. PMID:21573202

  15. Detection of dopamine in dopaminergic cell using nanoparticles-based barcode DNA analysis.

    Science.gov (United States)

    An, Jeung Hee; Kim, Tae-Hyung; Oh, Byung-Keun; Choi, Jeong Woo

    2012-01-01

    Nanotechnology-based bio-barcode-amplification analysis may be an innovative approach to dopamine detection. In this study, we evaluated the efficacy of this bio-barcode DNA method in detecting dopamine from dopaminergic cells. Herein, a combination DNA barcode and bead-based immunoassay for neurotransmitter detection with PCR-like sensitivity is described. This method relies on magnetic nanoparticles with antibodies and nanoparticles that are encoded with DNA, and antibodies that can sandwich the target protein captured by the nanoparticle-bound antibodies. The aggregate sandwich structures are magnetically separated from solution, and treated in order to remove the conjugated barcode DNA. The DNA barcodes were then identified via PCR analysis. The dopamine concentration in dopaminergic cells can be readily and rapidly detected via the bio-barcode assay method. The bio-barcode assay method is, therefore, a rapid and high-throughput screening tool for the detection of neurotransmitters such as dopamine.

  16. Improvement in Limit of Detection of Enzymatic Biogas Sensor Utilizing Chromatography Paper for Breath Analysis.

    Science.gov (United States)

    Motooka, Masanobu; Uno, Shigeyasu

    2018-02-02

    Breath analysis is considered to be an effective method for point-of-care diagnosis due to its noninvasiveness, quickness and simplicity. Gas sensors for breath analysis require detection of low-concentration substances. In this paper, we propose that reduction of the background current improves the limit of detection of enzymatic biogas sensors utilizing chromatography paper. After clarifying the cause of the background current, we reduced the background current by improving the fabrication process of the sensors utilizing paper. Finally, we evaluated the limit of detection of the sensor with the sample vapor of ethanol gas. The experiment showed about a 50% reduction of the limit of detection compared to the previously reported sensor. This result presents the possibility of the sensor being applied in diagnosis, such as for diabetes, by further lowering the limit of detection.

  17. Isotope Enrichment Detection by Laser Ablation - Laser Absorption Spectrometry: Automated Environmental Sampling and Laser-Based Analysis for HEU Detection

    International Nuclear Information System (INIS)

    Anheier, Norman C.; Bushaw, Bruce A.

    2010-01-01

    The global expansion of nuclear power, and consequently the uranium enrichment industry, requires the development of new safeguards technology to mitigate proliferation risks. Current enrichment monitoring instruments provide only yes/no detection of highly enriched uranium (HEU) production. More accurate accountancy measurements are typically restricted to gamma-ray and weight measurements taken in cylinder storage yards. Analysis of environmental and cylinder content samples has much higher effectiveness, but this approach requires onsite sampling, shipping, and time-consuming laboratory analysis and reporting. Given that large modern gaseous centrifuge enrichment plants (GCEPs) can quickly produce a significant quantity (SQ) of HEU, these limitations in verification suggest the need for more timely detection of potential facility misuse. The Pacific Northwest National Laboratory (PNNL) is developing an unattended safeguards instrument concept, combining continuous aerosol particulate collection with uranium isotope assay, to provide timely analysis of enrichment levels within low enriched uranium facilities. This approach is based on laser vaporization of aerosol particulate samples, followed by wavelength-tuned laser diode spectroscopy to characterize the uranium isotopic ratio through subtle differences in atomic absorption wavelengths. Environmental sampling (ES) media from an integrated aerosol collector is introduced into a small, reduced-pressure chamber, where a focused pulsed laser vaporizes material from a 10 to 20-μm diameter spot on the surface of the sampling media. The plume of ejected material begins as high-temperature plasma that yields ions and atoms, as well as molecules and molecular ions. We concentrate on the plume of atomic vapor that remains after the plasma has expanded and then been cooled by the surrounding cover gas. Tunable diode lasers are directed through this plume and each isotope is detected by monitoring its absorbance.

  18. Potku – New analysis software for heavy ion elastic recoil detection analysis

    International Nuclear Information System (INIS)

    Arstila, K.; Julin, J.; Laitinen, M.I.; Aalto, J.; Konu, T.; Kärkkäinen, S.; Rahkonen, S.; Raunio, M.; Itkonen, J.; Santanen, J.-P.; Tuovinen, T.; Sajavaara, T.

    2014-01-01

    Time-of-flight elastic recoil detection (ToF-ERD) analysis software has been developed. The software combines a Python-language graphical front-end with a C code computing back-end in a user-friendly way. The software uses a list of coincident time-of-flight–energy (ToF–E) events as an input. The ToF calibration can be determined with a simple graphical procedure. The graphical interface allows the user to select different elements and isotopes from a ToF–E histogram and to convert the selections to individual elemental energy and depth profiles. The resulting sample composition can be presented as relative or absolute concentrations by integrating the depth profiles over user-defined ranges. Beam induced composition changes can be studied by displaying the event-based data in fractions relative to the substrate reference data. Optional angular input data allows for kinematic correction of the depth profiles. This open source software is distributed under the GPL license for Linux, Mac, and Windows environments

  19. Potku – New analysis software for heavy ion elastic recoil detection analysis

    Energy Technology Data Exchange (ETDEWEB)

    Arstila, K., E-mail: kai.arstila@jyu.fi [Department of Physics, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland); Julin, J.; Laitinen, M.I. [Department of Physics, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland); Aalto, J.; Konu, T.; Kärkkäinen, S.; Rahkonen, S.; Raunio, M.; Itkonen, J.; Santanen, J.-P.; Tuovinen, T. [Department of Mathematical Information Technology, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland); Sajavaara, T. [Department of Physics, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland)

    2014-07-15

    Time-of-flight elastic recoil detection (ToF-ERD) analysis software has been developed. The software combines a Python-language graphical front-end with a C code computing back-end in a user-friendly way. The software uses a list of coincident time-of-flight–energy (ToF–E) events as an input. The ToF calibration can be determined with a simple graphical procedure. The graphical interface allows the user to select different elements and isotopes from a ToF–E histogram and to convert the selections to individual elemental energy and depth profiles. The resulting sample composition can be presented as relative or absolute concentrations by integrating the depth profiles over user-defined ranges. Beam induced composition changes can be studied by displaying the event-based data in fractions relative to the substrate reference data. Optional angular input data allows for kinematic correction of the depth profiles. This open source software is distributed under the GPL license for Linux, Mac, and Windows environments.

  20. Cardiac arrhythmia detection using combination of heart rate variability analyses and PUCK analysis.

    Science.gov (United States)

    Mahananto, Faizal; Igasaki, Tomohiko; Murayama, Nobuki

    2013-01-01

    This paper presents cardiac arrhythmia detection using the combination of a heart rate variability (HRV) analysis and a "potential of unbalanced complex kinetics" (PUCK) analysis. Detection performance was improved by adding features extracted from the PUCK analysis. Initially, R-R interval data were extracted from the original electrocardiogram (ECG) recordings and were cut into small segments and marked as either normal or arrhythmia. HRV analyses then were conducted using the segmented R-R interval data, including a time-domain analysis, frequency-domain analysis, and nonlinear analysis. In addition to the HRV analysis, PUCK analysis, which has been implemented successfully in a foreign exchange market series to characterize change, was employed. A decision-tree algorithm was applied to all of the obtained features for classification. The proposed method was tested using the MIT-BIH arrhythmia database and had an overall classification accuracy of 91.73%. After combining features obtained from the PUCK analysis, the overall accuracy increased to 92.91%. Therefore, we suggest that the use of a PUCK analysis in conjunction with HRV analysis might improve performance accuracy for the detection of cardiac arrhythmia.
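
    A minimal sketch of the HRV-feature-plus-decision-tree stage described above, assuming synthetic R-R interval segments in place of the MIT-BIH recordings and omitting the PUCK features; all segment parameters are illustrative only.

```python
# Minimal sketch: time-domain HRV features per R-R segment, classified by a decision tree.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

def hrv_features(rr_ms):
    """Basic time-domain HRV features from one segment of R-R intervals (ms)."""
    diffs = np.diff(rr_ms)
    return [rr_ms.mean(),                  # mean R-R interval
            rr_ms.std(ddof=1),             # SDNN
            np.sqrt(np.mean(diffs ** 2)),  # RMSSD
            np.mean(np.abs(diffs) > 50)]   # pNN50 as a fraction

def synthetic_segment(arrhythmic):
    """Synthetic R-R segment; 'arrhythmic' segments get irregular beats injected."""
    rr = rng.normal(800.0, 20.0, 60)       # roughly 75 bpm
    if arrhythmic:
        idx = rng.choice(60, size=10, replace=False)
        rr[idx] += rng.normal(0.0, 200.0, 10)
    return rr

labels = rng.integers(0, 2, 400)
X = np.array([hrv_features(synthetic_segment(bool(lab))) for lab in labels])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.25, random_state=0)

clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
print("hold-out accuracy:", round(clf.score(X_te, y_te), 3))
```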

  1. A method based on temporal concept analysis for detecting and profiling human trafficking suspects

    NARCIS (Netherlands)

    Poelmans, J.; Elzinga, P.; Viaene, S.; Dedene, G.; Hamza, M.H.

    2010-01-01

    Human trafficking and forced prostitution are a serious problem for the Amsterdam-Amstelland police (the Netherlands). In this paper, we present a method based on Temporal Concept Analysis for detecting and profiling human trafficking suspects. Using traditional Formal Concept Analysis, we first

  2. Sensitive and reliable detection of genomic imbalances in human neuroblastomas using comparative genomic hybridisation analysis

    NARCIS (Netherlands)

    van Gele, M.; van Roy, N.; Jauch, A.; Laureys, G.; Benoit, Y.; Schelfhout, V.; de Potter, C. R.; Brock, P.; Uyttebroeck, A.; Sciot, R.; Schuuring, E.; Versteeg, R.; Speleman, F.

    1997-01-01

    Deletions of the short arm of chromosome 1, extra copies of chromosome 17q and MYCN amplification are the most frequently encountered genetic changes in neuroblastomas. Standard techniques for detection of one or more of these genetic changes are karyotyping, FISH analysis and LOH analysis by

  3. Signal Detection Analysis of Factors Associated with Diabetes among Semirural Mexican American Adults

    Science.gov (United States)

    Hanni, K. D.; Ahn, D. A.; Winkleby, M. A.

    2013-01-01

    Signal detection analysis was used to evaluate a combination of sociodemographic, acculturation, mental health, health care, and chronic disease risk factors potentially associated with diabetes in a sample of 4,505 semirural Mexican American adults. Overall, 8.9% of adults had been diagnosed with diabetes. The analysis resulted in 12 mutually…

  4. Detection method of nonlinearity errors by statistical signal analysis in heterodyne Michelson interferometer.

    Science.gov (United States)

    Hu, Juju; Hu, Haijiang; Ji, Yinghua

    2010-03-15

    Periodic nonlinearity, ranging from a few nanometers to tens of nanometers, limits the use of the heterodyne interferometer in high-accuracy measurement. A novel method is studied to detect the nonlinearity errors based on electrical subdivision and statistical signal analysis in a heterodyne Michelson interferometer. When the micropositioning platform moves at uniform velocity, the method can detect the nonlinearity errors using regression analysis and Jackknife estimation. Based on analysis of the simulations, the method can estimate the influence of nonlinearity errors and other noise on dimensional measurement in the heterodyne Michelson interferometer.

  5. Wavelength modulation spectroscopy--digital detection of gas absorption harmonics based on Fourier analysis.

    Science.gov (United States)

    Mei, Liang; Svanberg, Sune

    2015-03-20

    This work presents a detailed study of the theoretical aspects of the Fourier analysis method, which has been utilized for gas absorption harmonic detection in wavelength modulation spectroscopy (WMS). The lock-in detection of the harmonic signal is accomplished by studying the phase term of the inverse Fourier transform of the Fourier spectrum that corresponds to the harmonic signal. The mathematics and the corresponding simulation results are given for each procedure when applying the Fourier analysis method. The present work provides a detailed view of the WMS technique when applying the Fourier analysis method.

  6. Quantile regression for the statistical analysis of immunological data with many non-detects.

    Science.gov (United States)

    Eilers, Paul H C; Röder, Esther; Savelkoul, Huub F J; van Wijk, Roy Gerth

    2012-07-07

    Immunological parameters are hard to measure. A well-known problem is the occurrence of values below the detection limit, the non-detects. Non-detects are a nuisance, because classical statistical analyses, like ANOVA and regression, cannot be applied. The more advanced statistical techniques currently available for the analysis of datasets with non-detects can only be used if a small percentage of the data are non-detects. Quantile regression, a generalization of percentiles to regression models, models the median or higher percentiles and tolerates very high numbers of non-detects. We present a non-technical introduction and illustrate it with an application to real data from a clinical trial. We show that by using quantile regression, groups can be compared and that meaningful linear trends can be computed, even if more than half of the data consists of non-detects. Quantile regression is a valuable addition to the statistical methods that can be used for the analysis of immunological datasets with non-detects.
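
    A minimal sketch of how quantile regression handles non-detects, using statsmodels on synthetic data; the covariate, detection limit, and chosen quantile are assumptions for illustration, not values from the clinical trial.

```python
# Minimal sketch: quantile regression on data containing non-detects.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
dose = rng.uniform(0.0, 10.0, 200)                         # hypothetical covariate
conc = (2.0 + 1.5 * dose) * rng.lognormal(0.0, 0.6, 200)   # skewed immunological readout

LOD = 6.0                                                  # assumed detection limit
censored = conc < LOD
observed = np.where(censored, LOD, conc)                   # non-detects recorded at the LOD
print(f"fraction of non-detects: {censored.mean():.0%}")

# Fit the 75th percentile; as long as the fitted quantile lies above the LOD,
# the exact values assigned to the non-detects do not influence the estimate.
X = sm.add_constant(dose)
fit_q75 = sm.QuantReg(observed, X).fit(q=0.75)
print(fit_q75.params)                                      # intercept and slope for the covariate
```

    The design point, as the abstract suggests, is that the modelled quantile must stay above the censored fraction of the data at each covariate value; the non-detects then act only as "small" values and never determine the fitted line.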

  7. Study of relationship between MUF correlation and detection sensitivity of statistical analysis

    International Nuclear Information System (INIS)

    Tamura, Toshiaki; Ihara, Hitoshi; Yamamoto, Yoichi; Ikawa, Koji

    1989-11-01

    Various kinds of statistical analysis have been proposed for NRTA (Near Real Time Materials Accountancy), which was devised to satisfy the timeliness goal, one of the detection goals of the IAEA. It is presumed that statistical analysis results will differ between the case of rigorous error propagation (with MUF correlation) and the case of simplified error propagation (without MUF correlation). Therefore, measurement simulation and decision analysis were performed using a flow simulation of an 800 MTHM/Y model reprocessing plant, and the relationship between MUF correlation and the detection sensitivity and false alarm rate of statistical analysis was studied. The specific character of material accountancy for the 800 MTHM/Y model reprocessing plant was captured by this simulation. It also became clear that MUF correlation decreases not only the false alarm rate but also the detection probability for protracted loss when the CUMUF test and Page's test are applied to NRTA. (author)
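
    A minimal sketch of Page's test applied to a standardised MUF sequence with an injected protracted loss; the allowance k and decision threshold h are illustrative assumptions, not the values used in the NRTA study.

```python
# Minimal sketch: one-sided Page's test (CUSUM) on a standardised MUF sequence.
import numpy as np

rng = np.random.default_rng(2)
muf = rng.normal(0.0, 1.0, 60)        # standardised MUF values, in-control at first
muf[30:] += 0.5                       # injected protracted loss of 0.5 sigma per period

k, h = 0.25, 5.0                      # allowance and decision threshold (assumed)
s, alarms = 0.0, []
for t, x in enumerate(muf):
    s = max(0.0, s + x - k)           # Page's one-sided CUSUM recursion
    if s > h:
        alarms.append(t)

print("first alarm at period:", alarms[0] if alarms else None)
```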

  8. QUALITATIVE ANALYSIS METHOD OF DETECTION OF WAX CONTENT IN GORENGAN USING SMARTPHONE

    Directory of Open Access Journals (Sweden)

    Yulia Yulia

    2018-05-01

    Full Text Available Wax is one of the compounds that can be misused as an additive in gorengan (Indonesian fritters) to keep them crispy. Gorengan containing wax is difficult to identify visually, so a quick and easy method of detecting wax content is required. The purpose of this research is to develop and evaluate the analytical performance of detecting wax content in gorengan using a smartphone. The gorengan sample was dissolved in hexane, a colour-forming reagent was added, and the result was analysed using a smartphone. Several analytical performance parameters were evaluated in terms of linearity and detection limit, qualitative analysis capability, precision, and selectivity. The developed method was also applied to some gorengan samples. The results show that the detection of wax content in gorengan can be conducted using a reagent consisting of NaOH, Schiff, and curcumin (1:2:2). Performance analysis shows that the linearity measurement at concentrations between 10% and 25% has a correlation coefficient (r) of 0.9537, with a detection limit at a concentration of 2% and precision (%RSD) of less than 3%. The developed method can be applied for the detection of wax content in gorengan in the market.

  9. Fast and objective detection and analysis of structures in downhole images

    Science.gov (United States)

    Wedge, Daniel; Holden, Eun-Jung; Dentith, Mike; Spadaccini, Nick

    2017-09-01

    Downhole acoustic and optical televiewer images, and formation microimager (FMI) logs are important datasets for structural and geotechnical analyses in the mineral and petroleum industries. Within these data, dipping planar structures appear as sinusoids, often in incomplete form and in abundance. Their detection is a labour-intensive and hence expensive task, and as such is a significant bottleneck in data processing, as companies may have hundreds of kilometres of logs to process each year. We present an image analysis system that harnesses the power of automated image analysis and provides an interactive user interface to support the analysis of televiewer images by users with different objectives. Our algorithm rapidly produces repeatable, objective results. We have embedded it in an interactive workflow to complement geologists' intuition and experience in interpreting data, to improve efficiency and to assist, rather than replace, the geologist. The main contributions include a new image quality assessment technique for highlighting image areas most suited to automated structure detection and for detecting boundaries of geological zones, and a novel sinusoid detection algorithm for detecting and selecting sinusoids with given confidence levels. Further tools are provided for rapid analysis and additional detection of structures, e.g. limited to specific orientations.

  10. Foreign object detection and removal to improve automated analysis of chest radiographs

    International Nuclear Information System (INIS)

    Hogeweg, Laurens; Sánchez, Clara I.; Melendez, Jaime; Maduskar, Pragnya; Ginneken, Bram van; Story, Alistair; Hayward, Andrew

    2013-01-01

    Purpose: Chest radiographs commonly contain projections of foreign objects, such as buttons, brassiere clips, jewellery, or pacemakers and wires. The presence of these structures can substantially affect the output of computer analysis of these images. An automated method is presented to detect, segment, and remove foreign objects from chest radiographs. Methods: Detection is performed using supervised pixel classification with a kNN classifier, resulting in a per-pixel probability estimate of belonging to a projected foreign object. Segmentation is performed by grouping and post-processing pixels with a probability above a certain threshold. Next, the objects are replaced by texture inpainting. Results: The method is evaluated in experiments on 257 chest radiographs. The detection at pixel level is evaluated with receiver operating characteristic analysis on pixels within the unobscured lung fields and an Az value of 0.949 is achieved. Free-response operating characteristic analysis is performed at the object level, and 95.6% of objects are detected with on average 0.25 false positive detections per image. To investigate the effect of removing the detected objects through inpainting, a texture analysis system for tuberculosis detection is applied to images with and without pathology and with and without foreign object removal. Unprocessed, the texture analysis abnormality score of normal images with foreign objects is comparable to those with pathology. After removing foreign objects, the texture score of normal images with and without foreign objects is similar, while abnormal images, whether they contain foreign objects or not, achieve on average higher scores. Conclusions: The authors conclude that removal of foreign objects from chest radiographs is feasible and beneficial for automated image analysis.

  11. Improved Principal Component Analysis for Anomaly Detection: Application to an Emergency Department

    KAUST Repository

    Harrou, Fouzi; Kadri, Farid; Chaabane, Sondé s; Tahon, Christian; Sun, Ying

    2015-01-01

    Monitoring of production systems, such as those in hospitals, is essential for ensuring the best management and maintaining the desired product quality. Detection of emergent abnormalities allows preemptive actions that can prevent more serious consequences. The principal component analysis (PCA)-based anomaly-detection approach has been used successfully for monitoring systems with highly correlated variables. However, conventional PCA-based detection indices, such as the Hotelling's T2 and the Q statistics, are ill suited to detect small abnormalities because they use only information from the most recent observations. Other multivariate statistical metrics, such as the multivariate cumulative sum (MCUSUM) control scheme, are more suitable for detecting small anomalies. In this paper, a generic anomaly detection scheme based on PCA is proposed to monitor demands to an emergency department. In such a framework, the MCUSUM control chart is applied to the uncorrelated residuals obtained from the PCA model. The proposed PCA-based MCUSUM anomaly detection strategy is successfully applied to practical data collected from the database of the pediatric emergency department in the Lille Regional Hospital Centre, France. The detection results show that the proposed method is more effective than conventional PCA-based anomaly-detection methods.
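
    A minimal sketch of the PCA-plus-MCUSUM idea on synthetic correlated data (not the emergency-department dataset): a PCA model is fitted on in-control data and Crosier's multivariate CUSUM is run on the PCA residuals; the reference value and threshold are assumptions.

```python
# Minimal sketch: PCA residual monitoring with a multivariate CUSUM (Crosier's scheme).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
mix = rng.normal(size=(5, 5))                     # induces correlation between variables
train = rng.normal(size=(300, 5)) @ mix           # in-control data
test = rng.normal(size=(100, 5)) @ mix
test[50:] += 0.8                                  # small persistent shift from sample 50 on

pca = PCA(n_components=2).fit(train)

def residuals(x):
    """Part of the data not explained by the retained principal components."""
    return x - pca.inverse_transform(pca.transform(x))

sigma_inv = np.linalg.pinv(np.cov(residuals(train), rowvar=False))

k, h = 0.5, 5.0                                   # assumed reference value and threshold
s = np.zeros(train.shape[1])
first_alarm = None
for t, x in enumerate(residuals(test)):
    c = np.sqrt((s + x) @ sigma_inv @ (s + x))    # Crosier's MCUSUM update
    s = np.zeros_like(s) if c <= k else (s + x) * (1.0 - k / c)
    if first_alarm is None and np.sqrt(s @ sigma_inv @ s) > h:
        first_alarm = t
print("first MCUSUM alarm at sample:", first_alarm)
```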

  12. Improved Principal Component Analysis for Anomaly Detection: Application to an Emergency Department

    KAUST Repository

    Harrou, Fouzi

    2015-07-03

    Monitoring of production systems, such as those in hospitals, is essential for ensuring the best management and maintaining the desired product quality. Detection of emergent abnormalities allows preemptive actions that can prevent more serious consequences. The principal component analysis (PCA)-based anomaly-detection approach has been used successfully for monitoring systems with highly correlated variables. However, conventional PCA-based detection indices, such as the Hotelling's T2 and the Q statistics, are ill suited to detect small abnormalities because they use only information from the most recent observations. Other multivariate statistical metrics, such as the multivariate cumulative sum (MCUSUM) control scheme, are more suitable for detecting small anomalies. In this paper, a generic anomaly detection scheme based on PCA is proposed to monitor demands to an emergency department. In such a framework, the MCUSUM control chart is applied to the uncorrelated residuals obtained from the PCA model. The proposed PCA-based MCUSUM anomaly detection strategy is successfully applied to practical data collected from the database of the pediatric emergency department in the Lille Regional Hospital Centre, France. The detection results show that the proposed method is more effective than conventional PCA-based anomaly-detection methods.

  13. THE ANALYSIS OF DETECTIVE GENRE IN MEDIA STUDIES IN THE STUDENT AUDIENCE

    Directory of Open Access Journals (Sweden)

    Alexander Fedorov

    2011-11-01

    Full Text Available Developing skills for the critical analysis of media texts is an important task of media education. However, media literacy practice shows that students have problems with the discussion and analysis of entertainment genres in the early stages of media studies, for example difficulties in understanding and interpreting the author's conception, plot, and genre features. This article substantiates methodological approaches to developing analysis skills for the detective/thriller genre in media studies with student audiences.

  14. Analysis of arecoline in Semen Arecae decoction pieces by microchip capillary electrophoresis with contactless conductivity detection

    Directory of Open Access Journals (Sweden)

    Zi-You Cai

    2012-10-01

    Full Text Available A new method for the determination of arecoline in Semen Arecae decoction pieces by microchip capillary electrophoresis with contactless conductivity detection (MCE-CCD) was proposed. The effects of various electrophoretic operating parameters on the analysis of arecoline were studied. Under the optimal conditions, arecoline was rapidly separated and detected in 1 min with good linearity over the concentration range of 20–1500 μM (r2 = 0.9991) and a detection limit of 5 μM (S/N = 3). The method was used for the analysis of arecoline satisfactorily with a recovery of 96.8–104%. Keywords: Microchip capillary electrophoresis, Contactless conductivity detection, Arecoline, Semen Arecae

  15. Using discriminant analysis to detect intrusions in external communication for self-driving vehicles

    Directory of Open Access Journals (Sweden)

    Khattab M.Ali Alheeti

    2017-08-01

    Full Text Available Security systems are a necessity for the deployment of smart vehicles in our society. Security in vehicular ad hoc networks is crucial to the reliable exchange of information and control data. In this paper, we propose an intelligent Intrusion Detection System (IDS) to protect the external communication of self-driving and semi self-driving vehicles. This technology has the ability to detect Denial of Service (DoS) and black hole attacks on vehicular ad hoc networks (VANETs). The advantage of the proposed IDS over existing security systems is that it detects attacks before they cause significant damage. The intrusion prediction technique is based on Linear Discriminant Analysis (LDA) and Quadratic Discriminant Analysis (QDA), which are used to predict attacks based on observed vehicle behavior. We perform simulations using Network Simulator 2 to demonstrate that the IDS achieves a low rate of false alarms and high accuracy in detection.
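
    A minimal sketch of the LDA/QDA prediction step on synthetic vehicle-behaviour features; the feature set and attack signatures below are assumptions for illustration, not the NS-2 simulation outputs.

```python
# Minimal sketch: LDA and QDA classifiers distinguishing normal traffic from attacks.
import numpy as np
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
# Hypothetical features: packet delivery ratio, forwarding delay (ms), drop rate.
normal = rng.normal([0.95, 10.0, 0.02], [0.02, 2.0, 0.01], size=(500, 3))
attack = rng.normal([0.60, 25.0, 0.30], [0.10, 5.0, 0.08], size=(500, 3))  # DoS / black hole
X = np.vstack([normal, attack])
y = np.array([0] * 500 + [1] * 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
for model in (LinearDiscriminantAnalysis(), QuadraticDiscriminantAnalysis()):
    model.fit(X_tr, y_tr)
    print(type(model).__name__, "accuracy:", round(model.score(X_te, y_te), 3))
```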

  16. Identifying Time Measurement Tampering in the Traversal Time and Hop Count Analysis (TTHCA Wormhole Detection Algorithm

    Directory of Open Access Journals (Sweden)

    Jonny Karlsson

    2013-05-01

    Full Text Available Traversal time and hop count analysis (TTHCA) is a recent wormhole detection algorithm for mobile ad hoc networks (MANET) which provides enhanced detection performance against all wormhole attack variants and network types. TTHCA involves each node measuring the processing time of routing packets during the route discovery process and then delivering the measurements to the source node. In a participation mode (PM) wormhole where malicious nodes appear in the routing tables as legitimate nodes, the time measurements can potentially be altered so preventing TTHCA from successfully detecting the wormhole. This paper analyses the prevailing conditions for time tampering attacks to succeed for PM wormholes, before introducing an extension to the TTHCA detection algorithm called ∆T Vector which is designed to identify time tampering, while preserving low false positive rates. Simulation results confirm that the ∆T Vector extension is able to effectively detect time tampering attacks, thereby providing an important security enhancement to the TTHCA algorithm.

  17. Marginal space learning for medical image analysis efficient detection and segmentation of anatomical structures

    CERN Document Server

    Zheng, Yefeng

    2014-01-01

    Presents an award-winning image analysis technology (Thomas Edison Patent Award, MICCAI Young Investigator Award) that achieves object detection and segmentation with state-of-the-art accuracy and efficiency. Flexible, machine learning-based framework, applicable across multiple anatomical structures and imaging modalities. Thirty-five clinical applications on detecting and segmenting anatomical structures such as heart chambers and valves, blood vessels, liver, kidney, prostate, lymph nodes, and sub-cortical brain structures, in CT, MRI, X-Ray and Ultrasound.

  18. Detection of cardiac wall motion defects with combined amplitude/phase analysis

    International Nuclear Information System (INIS)

    Bacharach, S.L.; Green, M.V.; Bonow, R.O.; Pace, L.; Brunetti, A.; Larson, S.M.

    1985-01-01

    Fourier phase images have been used with some success to detect and quantify left ventricular (LV) wall motion defects. In abnormal regions of the LV, wall motion asynchronies often cause the time activity curve (TAC) to be shifted in phase. Such regional shifts are detected by analysis of the distribution function of phase values over the LV. However, not all wall motion defects result in detectable regional phase abnormalities. Such abnormalities may cause a reduction in the magnitude of contraction (and hence TAC amplitude) without any appreciable change in TAC shape (and hence phase). In an attempt to improve the sensitivity of the Fourier phase method for the detection of wall motion defects, the authors analyzed the distribution function of Fourier amplitude as well as phase. 26 individuals with normal cardiac function and no history of cardiac disease served as controls. The goal was to detect and quantify wall motion defects as compared to the consensus of 3 independent observers viewing the scintigraphic cines. 26 subjects with coronary artery disease and mild wall motion defects (22 with normal EF) were studied at rest. They found that analysis of the skew of the amplitude distribution function improved the sensitivity for the detection of wall motion abnormalities at rest in the group from 65% to 85% (17/26 detected by phase alone, 22/26 by combined phase and amplitude analysis) while retaining a zero false positive rate in the normal group. The authors conclude that analysis of Fourier amplitude distribution functions can significantly increase the sensitivity of phase imaging for detection of wall motion abnormalities.

  19. Determination of detection limits for a VPD ICPMS method of analysis; Determination des limites de detection d'une methode d'analyse VPD ICPMS

    Energy Technology Data Exchange (ETDEWEB)

    Badard, M.; Veillerot, M

    2007-07-01

    This training course report presents the different methods of detecting and quantifying metallic impurities in semiconductors. One of the most precise techniques is the collection of metal impurities by vapor phase decomposition (VPD) followed by their analysis by ICPMS (inductively coupled plasma mass spectrometry). The study shows the importance of detection limits in the domain of chemical analysis and the way to determine them for the ICPMS analysis. The results found on detection limits are excellent. Even if the detection limits reached with ICPMS performed after manual or automatic VPD are much higher than the detection limits of ICPMS alone, this method remains one of the most sensitive for ultra-trace analysis. (J.S.)

  20. Using recurrence plot analysis for software execution interpretation and fault detection

    Science.gov (United States)

    Mosdorf, M.

    2015-09-01

    This paper presents a method targeted at software execution interpretation and fault detection using recurrence plot analysis. In the proposed approach, recurrence plot analysis is applied to a software execution trace that contains the executed assembly instructions. Results of this analysis are further processed with the principal component analysis (PCA) method, which reduces the number of coefficients used for software execution classification. This method was used for the analysis of five algorithms: Bubble Sort, Quick Sort, Median Filter, FIR, SHA-1. Results show that some of the collected traces could be easily assigned to particular algorithms (logs from the Bubble Sort and FIR algorithms) while others are more difficult to distinguish.
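
    A minimal sketch of building a recurrence plot from a symbolic execution trace; the toy traces and window length are assumptions, and only a single recurrence-rate descriptor (the kind of coefficient later amenable to PCA across traces) is computed.

```python
# Minimal sketch: recurrence plot of an instruction trace and a simple descriptor.
import numpy as np

def recurrence_plot(trace, window=3):
    """Binary recurrence matrix over sliding windows of an instruction trace."""
    codes = {op: i for i, op in enumerate(sorted(set(trace)))}
    seq = np.array([codes[op] for op in trace])
    embeds = np.lib.stride_tricks.sliding_window_view(seq, window)
    # A point (i, j) recurs when two windows contain exactly the same instruction pattern.
    return (embeds[:, None, :] == embeds[None, :, :]).all(axis=2).astype(int)

# Toy traces: a loop-heavy "sort" and a streaming "filter" produce different textures.
sort_trace = ["cmp", "jle", "mov", "mov", "add"] * 20
filter_trace = (["load", "mul", "add", "store"] * 25)[:100]

for name, trace in [("sort-like", sort_trace), ("filter-like", filter_trace)]:
    rp = recurrence_plot(trace)
    recurrence_rate = rp.mean()            # simple descriptor for later classification
    print(name, "recurrence rate:", round(float(recurrence_rate), 3))
```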

  1. Analysis of individual brain activation maps using hierarchical description and multiscale detection

    International Nuclear Information System (INIS)

    Poline, J.B.; Mazoyer, B.M.

    1994-01-01

    The authors propose a new method for the analysis of brain activation images that aims at detecting activated volumes rather than pixels. The method is based on Poisson process modeling, hierarchical description, and multiscale detection (MSD). Its performances have been assessed using both Monte Carlo simulated images and experimental PET brain activation data. As compared to other methods, the MSD approach shows enhanced sensitivity with a controlled overall type I error, and has the ability to provide an estimate of the spatial limits of the detected signals. It is applicable to any kind of difference image for which the spatial autocorrelation function can be approximated by a stationary Gaussian function

  2. Using Order Tracking Analysis Method to Detect the Angle Faults of Blades on Wind Turbine

    DEFF Research Database (Denmark)

    Li, Pengfei; Hu, Weihao; Liu, Juncheng

    2016-01-01

    Angle faults of wind turbine blades usually comprise set angle faults and pitch angle faults, and they account for a high proportion of all wind turbine faults. Compared with traditional fault detection methods, the order tracking analysis method is used here to detect angle faults. By analyzing and reconstructing the fault signals, the fault characteristic frequency is easily detected, and the characteristic frequencies of angle faults are seen to depend on the shaft rotating frequency, appearing distinctly at the 1P and 3P frequencies.

  3. Change detection for synthetic aperture radar images based on pattern and intensity distinctiveness analysis

    Science.gov (United States)

    Wang, Xiao; Gao, Feng; Dong, Junyu; Qi, Qiang

    2018-04-01

    Synthetic aperture radar (SAR) imagery is independent of atmospheric conditions, which makes it an ideal image source for change detection. Existing methods directly analyse all the regions in the speckle-noise-contaminated difference image, so their performance is easily affected by small noisy regions. In this paper, we propose a novel saliency-guided change detection framework based on pattern and intensity distinctiveness analysis. The saliency analysis step removes small noisy regions and therefore makes the proposed method more robust to speckle noise. In the proposed method, the log-ratio operator is first utilized to obtain a difference image (DI). Then, the saliency detection method based on pattern and intensity distinctiveness analysis is utilized to obtain the changed-region candidates. Finally, principal component analysis and k-means clustering are employed to analyse pixels in the changed-region candidates. Thus, the final change map can be obtained by classifying these pixels into the changed or unchanged class. Experimental results on two real SAR image datasets demonstrate the effectiveness of the proposed method.
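
    A minimal sketch of the core pipeline on synthetic speckled images: a log-ratio difference image, simple per-pixel features, and k-means clustering into changed/unchanged classes; the saliency-detection stage is omitted and all parameters are assumptions.

```python
# Minimal sketch: log-ratio difference image followed by k-means change mapping.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
before = rng.gamma(shape=4.0, scale=25.0, size=(64, 64))     # speckle-like intensities
after = before.copy()
after[20:40, 20:40] *= 3.0                                   # simulated change region
after *= rng.gamma(shape=10.0, scale=0.1, size=after.shape)  # multiplicative speckle

di = np.abs(np.log(after + 1.0) - np.log(before + 1.0))      # log-ratio difference image

# Per-pixel feature: the DI value and its local mean (crude "pattern" context).
pad = np.pad(di, 1, mode="edge")
local_mean = sum(pad[i:i + di.shape[0], j:j + di.shape[1]]
                 for i in range(3) for j in range(3)) / 9.0
features = np.stack([di.ravel(), local_mean.ravel()], axis=1)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
# Convention: the cluster with the larger mean DI is taken as the "changed" class.
changed_cluster = np.argmax([di.ravel()[labels == c].mean() for c in (0, 1)])
change_map = (labels.reshape(di.shape) == changed_cluster)
print("changed pixels:", int(change_map.sum()))
```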

  4. SWAN - Detection of explosives by means of fast neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gierlik, M., E-mail: m.gierlik@ncbj.gov.pl; Borsuk, S.; Guzik, Z.; Iwanowska, J.; Kaźmierczak, Ł.; Korolczuk, S.; Kozłowski, T.; Krakowski, T.; Marcinkowski, R.; Swiderski, L.; Szeptycka, M.; Szewiński, J.; Urban, A.

    2016-10-21

    In this work we report on SWAN, an experimental, portable device for explosives detection. The device was created as part of the EU Structural Funds Project “Accelerators & Detectors” (POIG.01.01.02-14-012/08-00), with the goal of increasing the beneficiary's expertise and competencies in the field of neutron activation analysis. Previous experience and budget limitations led toward a less advanced design based on fast neutron interactions and unsophisticated data analysis, with the emphasis on the latest gamma detection and spectrometry solutions. The final device has been designed as a portable, fast neutron activation analyzer, with the software optimized for detection of carbon, nitrogen and oxygen. SWAN's performance in the role of an explosives detector is elaborated in this paper. We demonstrate that the unique features offered by neutron activation analysis might not be impressive enough when confronted with the practical demands and expectations of a generic homeland security customer.

  5. Analysis Spectrum of ECG Signal and QRS Detection during Running on Treadmill

    Science.gov (United States)

    Agung Suhendra, M.; Ilham R., M.; Simbolon, Artha I.; Faizal A., M.; Munandar, A.

    2018-03-01

    The heart is a vital organ that controls circulation and oxygen delivery in our metabolism. Heart exercise, for example using a treadmill, is needed to maintain cardiac health, and electrocardiography (ECG) is used to investigate and diagnose anomalies of the heart. In this paper, we analyse ECG signals recorded during running on a treadmill at various speeds. Two analyses are applied to the ECG signals: QRS detection and power spectral density (PSD) estimation. The PSD results showed that subject 3 had the highest values among all subjects, and the QRS detection results using the Pan-Tompkins algorithm showed a percentage of failed detections approaching 0% for all subjects.
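
    A minimal sketch of the two analyses on a synthetic ECG-like signal: Welch power spectral density and a simplified band-pass-plus-threshold QRS detector in the spirit of Pan-Tompkins; the sampling rate, beat timing, and thresholds are assumptions, not treadmill recordings.

```python
# Minimal sketch: PSD estimation and simplified QRS detection on a synthetic ECG.
import numpy as np
from scipy import signal

fs = 250                                           # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
ecg = 0.1 * np.sin(2 * np.pi * 1.0 * t)            # baseline wander
for beat in np.arange(0.5, 10, 0.8):               # ~75 bpm of narrow "QRS" spikes
    ecg += np.exp(-((t - beat) ** 2) / (2 * 0.01 ** 2))
ecg += 0.05 * np.random.default_rng(6).normal(size=t.size)

freqs, psd = signal.welch(ecg, fs=fs, nperseg=1024)          # power spectral density

sos = signal.butter(4, [5, 15], btype="bandpass", fs=fs, output="sos")
filtered = signal.sosfiltfilt(sos, ecg)                      # QRS band emphasised
peaks, _ = signal.find_peaks(filtered, height=0.3 * filtered.max(),
                             distance=int(0.3 * fs))          # refractory period
print("detected beats:", len(peaks), "| dominant PSD frequency (Hz):", freqs[np.argmax(psd)])
```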

  6. SWAN - Detection of explosives by means of fast neutron activation analysis

    International Nuclear Information System (INIS)

    Gierlik, M.; Borsuk, S.; Guzik, Z.; Iwanowska, J.; Kaźmierczak, Ł.; Korolczuk, S.; Kozłowski, T.; Krakowski, T.; Marcinkowski, R.; Swiderski, L.; Szeptycka, M.; Szewiński, J.; Urban, A.

    2016-01-01

    In this work we report on SWAN, an experimental, portable device for explosives detection. The device was created as part of the EU Structural Funds Project “Accelerators & Detectors” (POIG.01.01.02-14-012/08-00), with the goal of increasing the beneficiary's expertise and competencies in the field of neutron activation analysis. Previous experience and budget limitations led toward a less advanced design based on fast neutron interactions and unsophisticated data analysis, with the emphasis on the latest gamma detection and spectrometry solutions. The final device has been designed as a portable, fast neutron activation analyzer, with the software optimized for detection of carbon, nitrogen and oxygen. SWAN's performance in the role of an explosives detector is elaborated in this paper. We demonstrate that the unique features offered by neutron activation analysis might not be impressive enough when confronted with the practical demands and expectations of a generic homeland security customer.

  7. Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.

    Science.gov (United States)

    Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing

    2017-01-01

    Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.

  8. Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.

    Directory of Open Access Journals (Sweden)

    Shameng Wen

    Full Text Available Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.

  9. Costs of a community-based glaucoma detection programme: analysis of the Philadelphia Glaucoma Detection and Treatment Project.

    Science.gov (United States)

    Pizzi, Laura T; Waisbourd, Michael; Hark, Lisa; Sembhi, Harjeet; Lee, Paul; Crews, John E; Saaddine, Jinan B; Steele, Deon; Katz, L Jay

    2018-02-01

    Glaucoma is the foremost cause of irreversible blindness, and more than 50% of cases remain undiagnosed. Our objective was to report the costs of a glaucoma detection programme operationalised through Philadelphia community centres. The analysis was performed using a healthcare system perspective in 2013 US dollars. Costs of examination and educational workshops were captured. Measures were total programme costs, cost/case of glaucoma detected and cost/case of any ocular disease detected (including glaucoma). Diagnoses are reported at the individual level (therefore representing a diagnosis made in one or both eyes). Staff time was captured during site visits to 15 of 43 sites and included time to deliver examinations and workshops, supervision, training and travel. Staff time was converted to costs by applying wage and fringe benefit costs from the US Bureau of Labor Statistics. Non-staff costs (equipment and mileage) were collected using study logs. Participants with previously diagnosed glaucoma were excluded. 1649 participants were examined. Mean total per-participant examination time was 56 min (SD 4). Mean total examination cost/participant was $139. The cost/case of glaucoma newly identified (open-angle glaucoma, angle-closure glaucoma, glaucoma suspect, or primary angle closure) was $420 and cost/case for any ocular disease identified was $273. Glaucoma examinations delivered through this programme provided significant health benefit to hard-to-reach communities. On a per-person basis, examinations were fairly low cost, though opportunities exist to improve efficiency. Findings serve as an important benchmark for planning future community-based glaucoma examination programmes.

  10. Power Load Event Detection and Classification Based on Edge Symbol Analysis and Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Lei Jiang

    2012-01-01

    Full Text Available Energy signature analysis of power appliances is the core of nonintrusive load monitoring (NILM), where detailed data on the appliances used in houses are obtained by analyzing changes in the voltage and current. This paper focuses on developing automatic power load event detection and appliance classification based on machine learning. For power load event detection, the paper presents a new transient detection algorithm. By analyzing turn-on and turn-off transient waveforms, it can accurately detect the edge point at which a device is switched on or off. The proposed load classification technique can identify different power appliances with improved recognition accuracy and computational speed. The load classification method is composed of two processes: frequency feature analysis and support vector machine classification. The experimental results indicated that the incorporation of the new edge detection and turn-on and turn-off transient signature analysis into NILM revealed more information than traditional NILM methods. The load classification method achieved a recognition rate of more than ninety percent.
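
    A minimal sketch of the two NILM stages on synthetic aggregate power data: edge (event) detection from the differenced signal, followed by an SVM classifier over simple transient features; the appliance profiles and feature choices are assumptions, not the paper's implementation.

```python
# Minimal sketch: power-edge event detection and SVM appliance classification.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(7)

def make_event(step_w, spike_w):
    """One labelled event: steady level, a turn-on transient, then a new level."""
    x = np.full(60, 100.0) + rng.normal(0, 2, 60)
    x[30] += step_w + spike_w                      # inrush spike on the edge sample
    x[31:] += step_w
    return x

appliances = {"kettle": (2000, 150), "fridge": (120, 400), "lamp": (60, 5)}
X, y = [], []
for label, (step, spike) in appliances.items():
    for _ in range(100):
        sig = make_event(step + rng.normal(0, 20), spike + rng.normal(0, 10))
        edges = np.diff(sig)
        k = int(np.argmax(np.abs(edges)))          # edge (event) detection
        step_size = sig[k + 2:k + 12].mean() - sig[max(k - 10, 0):k].mean()
        spike_size = edges[k] - step_size          # transient overshoot
        X.append([step_size, spike_size])
        y.append(label)

clf = SVC(kernel="rbf", gamma="scale").fit(X[::2], y[::2])   # even rows for training
print("hold-out accuracy:", round(clf.score(X[1::2], y[1::2]), 3))
```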

  11. Volatile Organic Compound (VOC) Analysis For Disease Detection: Proof Of Principle For Field Studies Detecting Paratuberculosis And Brucellosis

    Science.gov (United States)

    Knobloch, Henri; Köhler, Heike; Nicola, Commander; Reinhold, Petra; Turner, Claire; Chambers, Mark

    2009-05-01

    A proof of concept investigation was performed to demonstrate that two independent infectious diseases of cattle result in different patterns of volatile organic compounds (VOC) in the headspace of serum samples detectable using an electronic nose (e-nose). A total of 117 sera from cattle naturally infected with Mycobacterium avium subsp. paratuberculosis (paraTB, n = 43) or Brucella sp. (n = 26) and sera from corresponding control animals (n = 48) were analysed in random order and blind to infection status using a ST214 e-nose (Scensive Ltd, Leeds, UK). Samples were collected under non-standardised conditions on different farms in the UK (brucellosis) and Germany (paraTB). The e-nose could differentiate the sera from brucellosis-infected, paraTB-infected and healthy animals at the population level, but the technology used was not suitable for determining the disease status of individual animals. Nevertheless, the data indicate that there are differences in the sensor responses depending on the disease status, and therefore show the potential of VOC analysis from serum headspace samples for disease detection.

  12. Conflict Detection Performance Analysis for Function Allocation Using Time-Shifted Recorded Traffic Data

    Science.gov (United States)

    Guerreiro, Nelson M.; Butler, Ricky W.; Maddalon, Jeffrey M.; Hagen, George E.; Lewis, Timothy A.

    2015-01-01

    The performance of the conflict detection function in a separation assurance system is dependent on the content and quality of the data available to perform that function. Specifically, data quality and data content available to the conflict detection function have a direct impact on the accuracy of the prediction of an aircraft's future state or trajectory, which, in turn, impacts the ability to successfully anticipate potential losses of separation (detect future conflicts). Consequently, other separation assurance functions that rely on the conflict detection function - namely, conflict resolution - are prone to negative performance impacts. The many possible allocations and implementations of the conflict detection function between centralized and distributed systems drive the need to understand the key relationships that impact conflict detection performance, with respect to differences in data available. This paper presents the preliminary results of an analysis technique developed to investigate the impacts of data quality and data content on conflict detection performance. Flight track data recorded from a day of the National Airspace System is time-shifted to create conflicts not present in the un-shifted data. A methodology is used to smooth and filter the recorded data to eliminate sensor fusion noise, data drop-outs and other anomalies in the data. The metrics used to characterize conflict detection performance are presented and a set of preliminary results is discussed.

  13. Leak detection and localization in a pipeline system by application of statistical analysis techniques

    International Nuclear Information System (INIS)

    Fukuda, Toshio; Mitsuoka, Toyokazu.

    1985-01-01

    The detection of leaks in piping systems is an important diagnostic technique for facilities to prevent accidents and to plan maintenance measures, since the occurrence of a leak lowers productivity and causes environmental damage. The first step is to detect the occurrence of a leak without delay; the second step is to estimate the location of the leak in the piping system, which makes accident countermeasures easier. Pressure-based detection is usually used for detecting large leaks, but because a pressure-based method is simple and advantageous, this study examined extending the pressure gradient technique, using statistical analysis, to the detection of smaller leaks in a pipeline in steady operation. Since the flow in a pipe varies irregularly during pumping, statistical means are required for the detection of small leaks by pressure. The index for detecting a leak proposed in this paper is the difference between the pressure gradients at the two ends of the pipeline. The experimental results on water and air in nylon tubes are reported. (Kako, I.)

  14. The Application of Helicopter Rotor Defect Detection Using Wavelet Analysis and Neural Network Technique

    Directory of Open Access Journals (Sweden)

    Jin-Li Sun

    2014-06-01

    Full Text Available When detect the helicopter rotor beam with ultrasonic testing, it is difficult to realize the noise removing and quantitative testing. This paper used the wavelet analysis technique to remove the noise among the ultrasonic detection signal and highlight the signal feature of defect, then drew the curve of defect size and signal amplitude. Based on the relationship of defect size and signal amplitude, a BP neural network was built up and the corresponding estimated value of the simulate defect was obtained by repeating training. It was confirmed that the wavelet analysis and neural network technique met the requirements of practical testing.

  15. Theoretical detection limit of PIXE analysis using 20 MeV proton beams

    Science.gov (United States)

    Ishii, Keizo; Hitomi, Keitaro

    2018-02-01

    Particle-induced X-ray emission (PIXE) analysis is usually performed using proton beams with energies in the range 2∼3 MeV because at these energies, the detection limit is low. The detection limit of PIXE analysis depends on the X-ray production cross-section, the continuous background of the PIXE spectrum and the experimental parameters such as the beam currents and the solid angle and detector efficiency of X-ray detector. Though the continuous background increases as the projectile energy increases, the cross-section of the X-ray increases as well. Therefore, the detection limit of high energy proton PIXE is not expected to increase significantly. We calculated the cross sections of continuous X-rays produced in several bremsstrahlung processes and estimated the detection limit of a 20 MeV proton PIXE analysis by modelling the Compton tail of the γ-rays produced in the nuclear reactions, and the escape effect on the secondary electron bremsstrahlung. We found that the Compton tail does not affect the detection limit when a thin X-ray detector is used, but the secondary electron bremsstrahlung escape effect does have an impact. We also confirmed that the detection limit of the PIXE analysis, when used with 4 μm polyethylene backing film and an integrated beam current of 1 μC, is 0.4∼2.0 ppm for proton energies in the range 10∼30 MeV and elements with Z = 16-90. This result demonstrates the usefulness of several 10 MeV cyclotrons for performing PIXE analysis. Cyclotrons with these properties are currently installed in positron emission tomography (PET) centers.

  16. A Mass Spectrometric Analysis Method Based on PPCA and SVM for Early Detection of Ovarian Cancer.

    Science.gov (United States)

    Wu, Jiang; Ji, Yanju; Zhao, Ling; Ji, Mengying; Ye, Zhuang; Li, Suyi

    2016-01-01

    Background. Surface-enhanced laser desorption/ionization time-of-flight mass spectrometry (SELDI-TOF-MS) technology plays an important role in the early diagnosis of ovarian cancer. However, the raw MS data is highly dimensional and redundant. Therefore, it is necessary to study rapid and accurate detection methods for the massive MS data. Methods. The clinical data set used in the experiments for early cancer detection consisted of 216 SELDI-TOF-MS samples. An MS analysis method based on probabilistic principal components analysis (PPCA) and support vector machine (SVM) was proposed and applied to early ovarian cancer classification on the data set. Additionally, using the same data set, we also established a traditional PCA-SVM model. Finally we compared the two models in detection accuracy, specificity, and sensitivity. Results. Using 10 independent training and testing experiments to evaluate the ovarian cancer detection models, the average prediction accuracy, sensitivity, and specificity of the PCA-SVM model were 83.34%, 82.70%, and 83.88%, respectively. In contrast, those of the PPCA-SVM model were 90.80%, 92.98%, and 88.97%, respectively. Conclusions. The PPCA-SVM model had better detection performance, and the model combined with the SELDI-TOF-MS technology shows promise for early clinical detection and diagnosis of ovarian cancer.
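
    A minimal sketch of the PPCA-plus-SVM pipeline on synthetic spectra: closed-form probabilistic PCA (in the Tipping and Bishop sense) produces latent scores that an SVM classifies; the data dimensions, labels, and latent dimensionality are assumptions, not the clinical SELDI-TOF-MS set.

```python
# Minimal sketch: closed-form probabilistic PCA followed by SVM classification.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(9)
n, d, q = 216, 300, 10                       # samples, m/z bins (assumed), latent dims
latent = rng.normal(size=(n, q))
y = (latent[:, 0] > 0).astype(int)           # synthetic "cancer" vs "control" labels
X = latent @ rng.normal(size=(q, d)) + 0.5 * rng.normal(size=(n, d))

def ppca_fit(data, q):
    """Maximum-likelihood PPCA; returns a function mapping samples to latent scores."""
    mu = data.mean(axis=0)
    vals, vecs = np.linalg.eigh(np.cov(data - mu, rowvar=False))
    vals, vecs = vals[::-1], vecs[:, ::-1]                   # descending eigenvalues
    sigma2 = vals[q:].mean()                                 # ML noise variance
    W = vecs[:, :q] * np.sqrt(np.maximum(vals[:q] - sigma2, 0.0))
    M_inv = np.linalg.inv(W.T @ W + sigma2 * np.eye(q))
    return lambda Z: (Z - mu) @ W @ M_inv                    # posterior mean scores

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
to_latent = ppca_fit(X_tr, q)
clf = SVC(kernel="rbf", gamma="scale").fit(to_latent(X_tr), y_tr)
print("PPCA-SVM accuracy:", round(clf.score(to_latent(X_te), y_te), 3))
```

    Compared with plain PCA, the probabilistic formulation adds an explicit noise variance term, which is what allows the latent scores to down-weight directions dominated by measurement noise; sklearn's PCA could be substituted here at the cost of that noise model.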

  17. Detection of fast oscillating magnetic fields using dynamic multiple TR imaging and Fourier analysis.

    Directory of Open Access Journals (Sweden)

    Ki Hwan Kim

    Full Text Available Neuronal oscillations produce oscillating magnetic fields. There have been trials to detect neuronal oscillations using MRI, but their in vivo detectability is still under debate. Major obstacles to detecting neuronal oscillations are (i) weak amplitudes, (ii) fast oscillations, which are faster than the MRI temporal resolution, and (iii) random frequencies and on/off intervals. In this study, we proposed a new approach for direct detection of weak and fast oscillating magnetic fields. The approach consists of (i) dynamic acquisitions using multiple repetition times (TRs) and (ii) an expanded frequency spectral analysis. Gradient echo echo-planar imaging was used to test the feasibility of the proposed approach with a phantom generating oscillating magnetic fields with various frequencies and amplitudes and random on/off intervals. The results showed that the proposed approach could precisely detect the weak and fast oscillating magnetic fields with random frequencies and on/off intervals. Complex and phase spectra showed reliable signals, while no meaningful signals were observed in magnitude spectra. A two-TR approach provided an absolute frequency spectrum above the Nyquist sampling frequency pixel by pixel with no a priori target frequency information. The proposed dynamic multiple-TR imaging and Fourier analysis are promising for direct detection of neuronal oscillations and potentially applicable to any pulse sequence.

  18. Power spectrum weighted edge analysis for straight edge detection in images

    Science.gov (United States)

    Karvir, Hrishikesh V.; Skipper, Julie A.

    2007-04-01

    Most man-made objects provide characteristic straight line edges and, therefore, edge extraction is a commonly used target detection tool. However, noisy images often yield broken edges that lead to missed detections, and extraneous edges that may contribute to false target detections. We present a sliding-block approach for target detection using weighted power spectral analysis. In general, straight line edges appearing at a given frequency are represented as a peak in the Fourier domain at a radius corresponding to that frequency, and a direction corresponding to the orientation of the edges in the spatial domain. Knowing the edge width and spacing between the edges, a band-pass filter is designed to extract the Fourier peaks corresponding to the target edges and suppress image noise. These peaks are then detected by amplitude thresholding. The frequency band width and the subsequent spatial filter mask size are variable parameters to facilitate detection of target objects of different sizes under known imaging geometries. Many military objects, such as trucks, tanks and missile launchers, produce definite signatures with parallel lines and the algorithm proves to be ideal for detecting such objects. Moreover, shadow-casting objects generally provide sharp edges and are readily detected. The block operation procedure offers advantages of significant reduction in noise influence, improved edge detection, faster processing speed and versatility to detect diverse objects of different sizes in the image. With Scud missile launcher replicas as target objects, the method has been successfully tested on terrain board test images under different backgrounds, illumination and imaging geometries with cameras of differing spatial resolution and bit-depth.
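
    A minimal sketch of the Fourier-domain idea on a synthetic block containing parallel stripes: the 2-D power spectrum is band-pass filtered at the radius implied by the known edge spacing, and a peak above threshold indicates straight parallel edges; the block size, spacing, and threshold are assumptions, not the paper's parameters.

```python
# Minimal sketch: band-pass power-spectrum test for parallel straight edges in a block.
import numpy as np

size, spacing = 64, 8                                   # block size and edge spacing (px)
stripes = ((np.arange(size) // spacing) % 2).astype(float)   # alternating vertical bands
block = np.tile(stripes, (size, 1))
block += 0.2 * np.random.default_rng(10).normal(size=block.shape)

spectrum = np.abs(np.fft.fftshift(np.fft.fft2(block - block.mean()))) ** 2

u, v = np.meshgrid(np.arange(size) - size // 2, np.arange(size) - size // 2)
radius = np.hypot(u, v)
target = size / (2 * spacing)                           # fundamental frequency of the stripes
band = (radius > target - 1.5) & (radius < target + 1.5)

peak = spectrum[band].max()
background = np.median(spectrum[band])
print("peak-to-median ratio in the pass band:", round(float(peak / (background + 1e-9)), 1))
print("parallel edges detected:", bool(peak > 10 * background))
```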

  19. High-energy elastic recoil detection heavy ions for light element analysis

    International Nuclear Information System (INIS)

    Goppelt-Langer, P.; Yamamoto, S.; Takeshita, H.; Aoki, Y.; Naramoto, H.

    1994-01-01

    The detection of light and medium-heavy elements in inhomogeneous solids is a severe problem in ion beam analysis. Heavy elements can be detected by the well-established Rutherford backscattering technique (RBS). In a homogeneous host material most impurities can be easily analyzed by secondary ion mass spectrometry (SIMS). Some isotopes (3He, 6Li, 10B) can be measured by nuclear reaction analysis (NRA) using thermal neutrons inducing (n, p) or (n, α) reactions. Others can be detected with energetic ion beams by nuclear reactions (e.g. 15N(1H, αγ)12C for the analysis of hydrogen). A high content of H, D or T can also be determined by elastic recoil detection using an energetic He beam. The latter technique has been developed into a universal method for the detection of light and heavy elements in any target, using a high-energy heavy ion beam and a detector system that is able to identify the recoils and deliver the energy and position of the particles. (author)

  20. Combined DECS Analysis and Next-Generation Sequencing Enable Efficient Detection of Novel Plant RNA Viruses

    Directory of Open Access Journals (Sweden)

    Hironobu Yanagisawa

    2016-03-01

    Full Text Available The presence of high molecular weight double-stranded RNA (dsRNA) within plant cells is an indicator of infection with RNA viruses as these possess genomic or replicative dsRNA. DECS (dsRNA isolation, exhaustive amplification, cloning, and sequencing) analysis has been shown to be capable of detecting unknown viruses. We postulated that a combination of DECS analysis and next-generation sequencing (NGS) would improve detection efficiency and usability of the technique. Here, we describe a model case in which we efficiently detected the presumed genome sequence of Blueberry shoestring virus (BSSV), a member of the genus Sobemovirus, which has not so far been reported. dsRNAs were isolated from BSSV-infected blueberry plants using the dsRNA-binding protein, reverse-transcribed, amplified, and sequenced using NGS. A contig of 4,020 nucleotides (nt) that shared similarities with sequences from other Sobemovirus species was obtained as a candidate of the BSSV genomic sequence. Reverse transcription (RT)-PCR primer sets based on sequences from this contig enabled the detection of BSSV in all BSSV-infected plants tested but not in healthy controls. A recombinant protein encoded by the putative coat protein gene was bound by the BSSV-antibody, indicating that the candidate sequence was that of BSSV itself. Our results suggest that a combination of DECS analysis and NGS, designated here as “DECS-C,” is a powerful method for detecting novel plant viruses.

  1. Detection of neovascularization based on fractal and texture analysis with interaction effects in diabetic retinopathy.

    Science.gov (United States)

    Lee, Jack; Zee, Benny Chung Ying; Li, Qing

    2013-01-01

    Diabetic retinopathy is a major cause of blindness. Proliferative diabetic retinopathy is a result of severe vascular complication and is visible as neovascularization of the retina. Automatic detection of such new vessels would be useful for the severity grading of diabetic retinopathy, and it is an important part of the screening process to identify those who may require immediate treatment for their diabetic retinopathy. We proposed a novel new-vessel detection method including statistical texture analysis (STA), high order spectrum analysis (HOS), fractal analysis (FA), and, most importantly, we have shown that by incorporating their associated interactions the accuracy of new-vessel detection can be greatly improved. To assess its performance, the sensitivity, specificity and accuracy (AUC) are obtained. They are 96.3%, 99.1% and 98.5% (99.3%), respectively. It is found that the proposed method can improve the accuracy of new-vessel detection significantly over previous methods. The algorithm can be automated and is valuable to detect relatively severe cases of diabetic retinopathy among diabetes patients.

  2. Detection of neovascularization based on fractal and texture analysis with interaction effects in diabetic retinopathy.

    Directory of Open Access Journals (Sweden)

    Jack Lee

    Full Text Available Diabetic retinopathy is a major cause of blindness. Proliferative diabetic retinopathy is a result of severe vascular complication and is visible as neovascularization of the retina. Automatic detection of such new vessels would be useful for the severity grading of diabetic retinopathy, and it is an important part of the screening process to identify those who may require immediate treatment for their diabetic retinopathy. We proposed a novel new-vessel detection method including statistical texture analysis (STA), high order spectrum analysis (HOS), fractal analysis (FA), and, most importantly, we have shown that by incorporating their associated interactions the accuracy of new-vessel detection can be greatly improved. To assess its performance, the sensitivity, specificity and accuracy (AUC) are obtained. They are 96.3%, 99.1% and 98.5% (99.3%), respectively. It is found that the proposed method can improve the accuracy of new-vessel detection significantly over previous methods. The algorithm can be automated and is valuable to detect relatively severe cases of diabetic retinopathy among diabetes patients.

  3. A New MANET Wormhole Detection Algorithm Based on Traversal Time and Hop Count Analysis

    Directory of Open Access Journals (Sweden)

    Göran Pulkkis

    2011-11-01

    Full Text Available As demand increases for ubiquitous network facilities, infrastructure-less and self-configuring systems like Mobile Ad hoc Networks (MANET) are gaining popularity. MANET routing security, however, is one of the most significant challenges to wide-scale adoption, with wormhole attacks being an especially severe MANET routing threat. This is because wormholes are able to disrupt a major component of network traffic, while concomitantly being extremely difficult to detect. This paper introduces a new wormhole detection paradigm based upon Traversal Time and Hop Count Analysis (TTHCA), which, in comparison to existing algorithms, consistently affords superior detection performance, allied with low false positive rates for all wormhole variants. Simulation results confirm that the TTHCA model exhibits robust wormhole route detection in various network scenarios, while incurring only a small network overhead. This feature makes TTHCA an attractive choice for MANET environments which generally comprise devices, such as wireless sensors, which possess a limited processing capability.
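    A minimal sketch of the traversal-time/hop-count idea behind TTHCA (an illustration under stated assumptions, not the authors' algorithm): if the measured per-hop traversal time of a route greatly exceeds what the hop count and radio range permit, the route may contain a wormhole tunnel. The range, processing delay and margin below are assumed constants.

    SPEED_OF_LIGHT = 3.0e8      # m/s
    MAX_RADIO_RANGE = 250.0     # m, assumed single-hop radio range
    PROCESSING_DELAY = 0.5e-3   # s, assumed per-hop forwarding delay

    def per_hop_time(round_trip_time_s, hop_count):
        """Average one-way traversal time per hop from a measured round trip."""
        return round_trip_time_s / (2 * hop_count)

    def looks_like_wormhole(round_trip_time_s, hop_count, margin=2.0):
        """Flag routes whose per-hop time exceeds the physical expectation."""
        expected = MAX_RADIO_RANGE / SPEED_OF_LIGHT + PROCESSING_DELAY
        return per_hop_time(round_trip_time_s, hop_count) > margin * expected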

  4. Detecting Distributed Network Traffic Anomaly with Network-Wide Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Yang Dan

    2008-12-01

    Full Text Available Distributed network traffic anomaly refers to an abnormal traffic behavior involving many links of a network and caused by the same source (e.g., DDoS attack, worm propagation). The anomaly transiting in a single link might be unnoticeable and hard to detect, while the anomalous aggregation from many links can be prevailing, and does more harm to the networks. Aiming at the similar features of distributed traffic anomaly on many links, this paper proposes a network-wide detection method by performing anomalous correlation analysis of traffic signals' instantaneous parameters. In our method, traffic signals' instantaneous parameters are firstly computed, and their network-wide anomalous space is then extracted via traffic prediction. Finally, an anomaly is detected by a global correlation coefficient of anomalous space. Our evaluation using Abilene traffic traces demonstrates the excellent performance of this approach for distributed traffic anomaly detection.
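    A rough sketch (assumptions, not the paper's implementation) of the final detection step named above: given per-link residuals after traffic prediction, a global correlation coefficient close to 1 suggests that the same anomaly is present on many links. The threshold is a placeholder.

    import numpy as np

    def global_correlation(residuals):
        """residuals: links x time array of prediction residuals."""
        corr = np.corrcoef(residuals)                    # link-by-link correlation
        upper = corr[np.triu_indices_from(corr, k=1)]    # off-diagonal entries
        return float(np.mean(upper))

    def distributed_anomaly(residuals, threshold=0.8):
        return global_correlation(residuals) > threshold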

  5. Efficiency of Airborne Sample Analysis Platform (ASAP) Bioaerosol Sampler for Pathogen Detection

    Directory of Open Access Journals (Sweden)

    Anurag eSharma

    2015-05-01

    Full Text Available The threat of bioterrorism and pandemics has highlighted the urgency for rapid and reliable bioaerosol detection in different environments. Safeguarding against such threats requires continuous sampling of the ambient air for pathogen detection. In this study we investigated the efficacy of the Airborne Sample Analysis Platform (ASAP 2800) bioaerosol sampler to collect representative samples of air and identify specific viruses suspended as bioaerosols. To test this concept, we aerosolized an innocuous replication-defective bovine adenovirus serotype 3 (BAdV3) in a controlled laboratory environment. The ASAP efficiently trapped the surrogate virus at concentrations of 5×10³ plaque-forming units (p.f.u.) [2×10⁵ genome copy equivalents] or more, resulting in the successful detection of the virus using quantitative PCR. These results support the further development of ASAP for bioaerosol pathogen detection.

  6. Data for automated, high-throughput microscopy analysis of intracellular bacterial colonies using spot detection.

    Science.gov (United States)

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H; Nørregaard, Rikke; Møller-Jensen, Jakob; Nejsum, Lene N

    2017-10-01

    Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion and intracellular proliferation. An automated, high-throughput microscopy method was established to quantify the number and size of intracellular bacterial colonies in infected host cells (Detection and quantification of intracellular bacterial colonies by automated, high-throughput microscopy, Ernstsen et al., 2017 [1]). The infected cells were imaged with a 10× objective, and the number of intracellular bacterial colonies, their size distribution and the number of cell nuclei were automatically quantified using a spot-detection tool. The spot-detection output was exported to Excel, where data analysis was performed. In this article, micrographs and spot-detection data are made available to facilitate implementation of the method.
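    The data set described above was produced with a commercial spot-detection tool; purely as an illustration, a Laplacian-of-Gaussian blob detector from scikit-image can provide a comparable per-image colony count and size estimate. The file path and all parameters are assumptions that would need tuning to the actual micrographs.

    import numpy as np
    from skimage import io, feature

    def count_colonies(path, min_sigma=2, max_sigma=10, threshold=0.05):
        """Count bright spots (colonies) in a grayscale micrograph."""
        img = io.imread(path, as_gray=True).astype(float)
        blobs = feature.blob_log(img, min_sigma=min_sigma,
                                 max_sigma=max_sigma, threshold=threshold)
        radii = blobs[:, 2] * np.sqrt(2)   # convert sigma to approximate radius
        return len(blobs), radii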

  7. Comparative Analysis of Automatic Exudate Detection between Machine Learning and Traditional Approaches

    Science.gov (United States)

    Sopharak, Akara; Uyyanonvara, Bunyarit; Barman, Sarah; Williamson, Thomas

    To prevent blindness from diabetic retinopathy, periodic screening and early diagnosis are necessary. Due to the lack of expert ophthalmologists in rural areas, automated early detection of exudates (one of the visible signs of diabetic retinopathy) could help to reduce the number of cases of blindness among diabetic patients. Traditional automatic exudate detection methods are based on specific parameter configurations, while machine learning approaches, which seem more flexible, may be computationally costly. A comparative analysis of traditional and machine learning methods for exudate detection, namely mathematical morphology, fuzzy c-means clustering, naive Bayesian classifier, Support Vector Machine and Nearest Neighbor classifier, is presented. Detected exudates are validated against expert ophthalmologists' hand-drawn ground truths. The sensitivity, specificity, precision, accuracy and time complexity of each method are also compared.

  8. Results of Automated Retinal Image Analysis for Detection of Diabetic Retinopathy from the Nakuru Study, Kenya

    DEFF Research Database (Denmark)

    Juul Bøgelund Hansen, Morten; Abramoff, M. D.; Folk, J. C.

    2015-01-01

    Objective Digital retinal imaging is an established method of screening for diabetic retinopathy (DR). It has been established that currently about 1% of the world's blind or visually impaired is due to DR. However, the increasing prevalence of diabetes mellitus and DR is creating an increased...... workload on those with expertise in grading retinal images. Safe and reliable automated analysis of retinal images may support screening services worldwide. This study aimed to compare the Iowa Detection Program (IDP) ability to detect diabetic eye diseases (DED) to human grading carried out at Moorfields...... predictive value of IDP versus the human grader as reference standard. Results Altogether 3,460 participants were included. 113 had DED, giving a prevalence of 3.3%(95% CI, 2.7-3.9%). Sensitivity of the IDP to detect DED as by the human grading was 91.0%(95% CI, 88.0-93.4%). The IDP ability to detect DED...

  9. Hidden corrosion detection in aircraft aluminum structures using laser ultrasonics and wavelet transform signal analysis.

    Science.gov (United States)

    Silva, M Z; Gouyon, R; Lepoutre, F

    2003-06-01

    Preliminary results of hidden corrosion detection in aircraft aluminum structures using a noncontact laser based ultrasonic technique are presented. A short laser pulse focused to a line spot is used as a broadband source of ultrasonic guided waves in an aluminum 2024 sample cut from an aircraft structure and prepared with artificially corroded circular areas on its back surface. The out of plane surface displacements produced by the propagating ultrasonic waves were detected with a heterodyne Mach-Zehnder interferometer. Time-frequency analysis of the signals using a continuous wavelet transform allowed the identification of the generated Lamb modes by comparison with the calculated dispersion curves. The presence of back surface corrosion was detected by noting the loss of the S(1) mode near its cutoff frequency. This method is applicable to fast scanning inspection techniques and it is particularly suited for early corrosion detection.

  10. Shilling Attacks Detection in Recommender Systems Based on Target Item Analysis.

    Science.gov (United States)

    Zhou, Wei; Wen, Junhao; Koh, Yun Sing; Xiong, Qingyu; Gao, Min; Dobbie, Gillian; Alam, Shafiq

    2015-01-01

    Recommender systems are highly vulnerable to shilling attacks, both by individuals and groups. Attackers who introduce biased ratings in order to affect recommendations have been shown to negatively affect collaborative filtering (CF) algorithms. Previous research focuses only on the differences between genuine profiles and attack profiles, ignoring the group characteristics in attack profiles. In this paper, we study the use of statistical metrics to detect rating patterns of attackers and group characteristics in attack profiles. A further issue is that most existing detection methods are model specific. Two metrics, Rating Deviation from Mean Agreement (RDMA) and Degree of Similarity with Top Neighbors (DegSim), are used for analyzing rating patterns between malicious profiles and genuine profiles in attack models. Building upon this, we also propose and evaluate a detection structure called RD-TIA for detecting shilling attacks in recommender systems using a statistical approach. In order to detect more complicated attack models, we propose a novel metric called DegSim' based on DegSim. The experimental results show that our detection model based on target item analysis is an effective approach for detecting shilling attacks.
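    For illustration only (not the paper's code), the two named metrics can be computed from a users x items rating matrix with missing ratings stored as NaN; RDMA measures how far a profile's ratings deviate from the item means, weighted by item popularity, and DegSim averages the Pearson similarity with the k most similar profiles. The value of k is an assumption.

    import numpy as np

    def rdma(ratings, user):
        """Rating Deviation from Mean Agreement for one user profile."""
        item_mean = np.nanmean(ratings, axis=0)
        item_count = np.sum(~np.isnan(ratings), axis=0)
        rated = ~np.isnan(ratings[user])
        dev = np.abs(ratings[user, rated] - item_mean[rated]) / item_count[rated]
        return dev.sum() / rated.sum()

    def degsim(ratings, user, k=5):
        """Average Pearson similarity with the top-k most similar profiles."""
        sims = []
        for other in range(ratings.shape[0]):
            if other == user:
                continue
            both = ~np.isnan(ratings[user]) & ~np.isnan(ratings[other])
            if both.sum() < 2:
                continue
            r = np.corrcoef(ratings[user, both], ratings[other, both])[0, 1]
            if not np.isnan(r):
                sims.append(r)
        top = sorted(sims, reverse=True)[:k]
        return float(np.mean(top)) if top else 0.0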

  12. Fault Detection Algorithm based on Null-Space Analysis for On-Line Structural Health Monitoring

    OpenAIRE

    Yan, Ai-Min; Golinval, Jean-Claude; Marin, Frédéric

    2005-01-01

    Early diagnosis of structural damage or machinery malfunction allows the maintenance cost of systems to be reduced and their reliability and safety to be increased. This paper addresses the damage detection problem by statistical analysis of output-only measurements of structures. The developed method is based on subspace analysis of the Hankel matrices constructed from vibration measurement data. The column active subspace of the Hankel matrix defined by the first principal components is orthonormal...
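    A hedged sketch of the general subspace idea mentioned in this (truncated) abstract: build a Hankel matrix from reference vibration data, keep its dominant left singular vectors as the healthy-state subspace, and flag later measurements whose component outside that subspace grows. Block size, model order and any threshold are illustrative assumptions.

    import numpy as np

    def hankel(signal, rows):
        cols = len(signal) - rows + 1
        return np.column_stack([signal[i:i + rows] for i in range(cols)])

    def healthy_subspace(reference_signal, rows=50, order=8):
        H = hankel(np.asarray(reference_signal, dtype=float), rows)
        U, _, _ = np.linalg.svd(H, full_matrices=False)
        return U[:, :order]                    # active subspace of the healthy state

    def damage_indicator(test_signal, U_active, rows=50):
        H = hankel(np.asarray(test_signal, dtype=float), rows)
        residual = H - U_active @ (U_active.T @ H)   # part lying outside the active subspace
        return np.linalg.norm(residual) / np.linalg.norm(H)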

  13. Transient pattern analysis for fault detection and diagnosis of HVAC systems

    International Nuclear Information System (INIS)

    Cho, Sung-Hwan; Yang, Hoon-Cheol; Zaheer-uddin, M.; Ahn, Byung-Cheon

    2005-01-01

    Modern building HVAC systems are complex and consist of a large number of interconnected sub-systems and components. In the event of a fault, it becomes very difficult for the operator to locate and isolate the faulty component in such large systems using conventional fault detection methods. In this study, transient pattern analysis is explored as a tool for fault detection and diagnosis of an HVAC system. Several tests involving different fault replications were conducted in an environmental chamber test facility. The results show that the evolution of fault residuals forms clear and distinct patterns that can be used to isolate faults. It was found that the time needed to reach steady state for a typical building HVAC system is at least 50-60 min. This means incorrect diagnosis of faults can happen during online monitoring if the transient pattern responses are not considered in the fault detection and diagnosis analysis

  14. Fast EEG spike detection via eigenvalue analysis and clustering of spatial amplitude distribution

    Science.gov (United States)

    Fukami, Tadanori; Shimada, Takamasa; Ishikawa, Bunnoshin

    2018-06-01

    Objective. In the current study, we tested a proposed method for fast spike detection in electroencephalography (EEG). Approach. We performed eigenvalue analysis in two-dimensional space spanned by gradients calculated from two neighboring samples to detect high-amplitude negative peaks. We extracted the spike candidates by imposing restrictions on parameters regarding spike shape and eigenvalues reflecting detection characteristics of individual medical doctors. We subsequently performed clustering, classifying detected peaks by considering the amplitude distribution at 19 scalp electrodes. Clusters with a small number of candidates were excluded. We then defined a score for eliminating spike candidates for which the pattern of detected electrodes differed from the overall pattern in a cluster. Spikes were detected by setting the score threshold. Main results. Based on visual inspection by a psychiatrist experienced in EEG, we evaluated the proposed method using two statistical measures of precision and recall with respect to detection performance. We found that precision and recall exhibited a trade-off relationship. The average recall value was 0.708 in eight subjects with the score threshold that maximized the F-measure, with 58.6  ±  36.2 spikes per subject. Under this condition, the average precision was 0.390, corresponding to a false positive rate 2.09 times higher than the true positive rate. Analysis of the required processing time revealed that, using a general-purpose computer, our method could be used to perform spike detection in 12.1% of the recording time. The process of narrowing down spike candidates based on shape occupied most of the processing time. Significance. Although the average recall value was comparable with that of other studies, the proposed method significantly shortened the processing time.

  15. Analysis of Endocrine Disrupting Pesticides by Capillary GC with Mass Spectrometric Detection

    Directory of Open Access Journals (Sweden)

    Svetlana Hrouzková

    2012-09-01

    Full Text Available Endocrine disrupting chemicals, among them many pesticides, alter the normal functioning of the endocrine system of both wildlife and humans at very low concentration levels. Therefore, the importance of method development for their analysis in food and the environment is increasing. This also covers contributions in the field of ultra-trace analysis of multicomponent mixtures of organic pollutants in complex matrices. Consequently, conventional capillary gas chromatography (CGC) and fast CGC with mass spectrometric detection (MS) have acquired real importance in the analysis of endocrine disrupting pesticide (EDP) residues. This paper provides an overview of GC methods, including sample preparation steps, for the analysis of EDPs in a variety of matrices at ultra-trace concentration levels. Emphasis is put on the separation method, the mode of MS detection and ionization, and the obtained limits of detection and quantification. Analysis time is one of the most important aspects that should be considered in the choice of analytical methods for routine analysis. Therefore, the benefits of the developed fast GC methods are important.

  16. Context-based object-of-interest detection for a generic traffic surveillance analysis system

    NARCIS (Netherlands)

    Bao, X.; Javanbakhti, S.; Zinger, S.; Wijnhoven, R.G.J.; With, de P.H.N.

    2014-01-01

    We present a new traffic surveillance video analysis system, focusing on building a framework with robust and generic techniques, based on both scene understanding and moving object-of-interest detection. Since traffic surveillance is widely applied, we want to design a single system that can be

  17. Denial-of-service attack detection based on multivariate correlation analysis

    NARCIS (Netherlands)

    Tan, Zhiyuan; Jamdagni, Aruna; He, Xiangjian; Nanda, Priyadarsi; Liu, Ren Ping; Lu, Bao-Liang; Zhang, Liqing; Kwok, James

    2011-01-01

    The reliability and availability of network services are being threatened by the growing number of Denial-of-Service (DoS) attacks. Effective mechanisms for DoS attack detection are demanded. Therefore, we propose a multivariate correlation analysis approach to investigate and extract second-order

  18. Comprehension-Driven Program Analysis (CPA) for Malware Detection in Android Phones

    Science.gov (United States)

    2015-07-01

    Android source. 3.1.2.2 Analyzers: An analyzer conforms to specifications defined by the Security Toolbox. Specifically, an analyzer encapsulates a... Final report, Iowa State University, July 2015.

  19. [SWOT analysis of laboratory certification and accreditation on detection of parasitic diseases].

    Science.gov (United States)

    Xiong, Yan-hong; Zheng, Bin

    2014-04-01

    This study comprehensively analyzes the strengths, weaknesses, opportunities and threats (SWOT) of laboratory certification and accreditation for the detection of parasitic diseases, and puts forward specific development strategies in order to provide indicative references for further development.

  20. Retrospective detection of exposure to organophosphorus anti-cholinesterases: Mass spectrometric analysis of phosphylated human butyrylcholinesterase

    NARCIS (Netherlands)

    Fidder, A.; Hulst, A.G.; Noort, D.; Ruiter, R. de; Schans, M.J. van der; Benschop, H.P.; Langenberg, J.P.

    2002-01-01

    In this paper a novel and general procedure is presented for detection of organophosphate-inhibited human butyrylcholinesterase (HuBuChE), which is based on electrospray tandem mass spectrometric analysis of phosphylated nonapeptides obtained after pepsin digestion of the enzyme. The utility of this

  1. EVALUATION OF A COMPUTER-AIDED SEMEN ANALYSIS SYSTEM WITH SPERM TAIL DETECTION

    NARCIS (Netherlands)

    WIJCHMAN, JG; DEWOLF, BTHM; JAGER, S

    The aim of this study was to evaluate the Stroemberg-Mika cell motion analyser (SM-CMA) which uses tail detection in order to discriminate between immotile spermatozoa and other particles. Analysis of the spermatozoa by the SM-CMA can easily be checked on a video monitor. The semen samples were from

  2. Biomedical journals lack a consistent method to detect outcome reporting bias: a cross-sectional analysis.

    Science.gov (United States)

    Huan, L N; Tejani, A M; Egan, G

    2014-10-01

    An increasing amount of recently published literature has implicated outcome reporting bias (ORB) as a major contributor to skewing data in both randomized controlled trials and systematic reviews; however, little is known about the current methods in place to detect ORB. This study aims to gain insight into the detection and management of ORB by biomedical journals. This was a cross-sectional analysis involving standardized questions via email or telephone with the top 30 biomedical journals (2012) ranked by impact factor. The Cochrane Database of Systematic Reviews was excluded leaving 29 journals in the sample. Of 29 journals, 24 (83%) responded to our initial inquiry of which 14 (58%) answered our questions and 10 (42%) declined participation. Five (36%) of the responding journals indicated they had a specific method to detect ORB, whereas 9 (64%) did not have a specific method in place. The prevalence of ORB in the review process seemed to differ with 4 (29%) journals indicating ORB was found commonly, whereas 7 (50%) indicated ORB was uncommon or never detected by their journal previously. The majority (n = 10/14, 72%) of journals were unwilling to report or make discrepancies found in manuscripts available to the public. Although the minority, there were some journals (n = 4/14, 29%) which described thorough methods to detect ORB. Many journals seemed to lack a method with which to detect ORB and its estimated prevalence was much lower than that reported in literature suggesting inadequate detection. There exists a potential for overestimation of treatment effects of interventions and unclear risks. Fortunately, there are journals within this sample which appear to utilize comprehensive methods for detection of ORB, but overall, the data suggest improvements at the biomedical journal level for detecting and minimizing the effect of this bias are needed. © 2014 John Wiley & Sons Ltd.

  3. CEST ANALYSIS: AUTOMATED CHANGE DETECTION FROM VERY-HIGH-RESOLUTION REMOTE SENSING IMAGES

    Directory of Open Access Journals (Sweden)

    M. Ehlers

    2012-08-01

    Full Text Available A fast detection, visualization and assessment of change in areas of crisis or catastrophes are important requirements for coordination and planning of help. Through the availability of new satellites and/or airborne sensors with very high spatial resolutions (e.g., WorldView, GeoEye), new remote sensing data are available for a better detection, delineation and visualization of change. For automated change detection, a large number of algorithms has been proposed and developed. From previous studies, however, it is evident that to-date no single algorithm has the potential for being a reliable change detector for all possible scenarios. This paper introduces the Combined Edge Segment Texture (CEST) analysis, a decision-tree based cooperative suite of algorithms for automated change detection that is especially designed for the generation of new satellites with very high spatial resolution. The method incorporates frequency based filtering, texture analysis, and image segmentation techniques. For the frequency analysis, different band pass filters can be applied to identify the relevant frequency information for change detection. After transforming the multitemporal images via a fast Fourier transform (FFT) and applying the most suitable band pass filter, different methods are available to extract changed structures: differencing and correlation in the frequency domain and correlation and edge detection in the spatial domain. Best results are obtained using edge extraction. For the texture analysis, different 'Haralick' parameters can be calculated (e.g., energy, correlation, contrast, inverse distance moment), with 'energy' so far providing the most accurate results. These algorithms are combined with a prior segmentation of the image data as well as with morphological operations for a final binary change result. A rule-based combination (CEST) of the change algorithms is applied to calculate the probability of change for a particular location. CEST

  4. Cost-Effectiveness Analysis of Three Leprosy Case Detection Methods in Northern Nigeria

    Science.gov (United States)

    Ezenduka, Charles; Post, Erik; John, Steven; Suraj, Abdulkarim; Namadi, Abdulahi; Onwujekwe, Obinna

    2012-01-01

    Background Despite several leprosy control measures in Nigeria, child proportion and disability grade 2 cases remain high while new cases have not significantly reduced, suggesting continuous spread of the disease. Hence, there is the need to review detection methods to enhance identification of early cases for effective control and prevention of permanent disability. This study evaluated the cost-effectiveness of three leprosy case detection methods in Northern Nigeria to identify the most cost-effective approach for detection of leprosy. Methods A cross-sectional study was carried out to evaluate the additional benefits of using several case detection methods in addition to routine practice in two north-eastern states of Nigeria. Primary and secondary data were collected from routine practice records and the Nigerian Tuberculosis and Leprosy Control Programme of 2009. The methods evaluated were Rapid Village Survey (RVS), Household Contact Examination (HCE) and Traditional Healers incentive method (TH). Effectiveness was measured as number of new leprosy cases detected and cost-effectiveness was expressed as cost per case detected. Costs were measured from both providers' and patients' perspectives. Additional costs and effects of each method were estimated by comparing each method against routine practise and expressed as incremental cost-effectiveness ratio (ICER). All costs were converted to the U.S. dollar at the 2010 exchange rate. Univariate sensitivity analysis was used to evaluate uncertainties around the ICER. Results The ICER for HCE was $142 per additional case detected at all contact levels and it was the most cost-effective method. At ICER of $194 per additional case detected, THs method detected more cases at a lower cost than the RVS, which was not cost-effective at $313 per additional case detected. Sensitivity analysis showed that varying the proportion of shared costs and subsistent wage for valuing unpaid time did not significantly change the
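    The incremental cost-effectiveness ratio used throughout this record is a simple quotient; a minimal illustration is given below with placeholder figures (not data from the study), chosen so the result matches the $142 per additional case reported for household contact examination.

    def icer(cost_method, cases_method, cost_routine, cases_routine):
        """Additional cost per additional case detected versus routine practice."""
        return (cost_method - cost_routine) / (cases_method - cases_routine)

    # hypothetical numbers: $14,200 extra spend finding 100 extra cases -> $142/case
    print(icer(20000, 120, 5800, 20))   # 142.0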

  5. Automatic Pedestrian Crossing Detection and Impairment Analysis Based on Mobile Mapping System

    Science.gov (United States)

    Liu, X.; Zhang, Y.; Li, Q.

    2017-09-01

    Pedestrian crossing, as an important part of transportation infrastructures, serves to secure pedestrians' lives and possessions and keep traffic flow in order. As a prominent feature in the street scene, detection of pedestrian crossing contributes to 3D road marking reconstruction and diminishing the adverse impact of outliers in 3D street scene reconstruction. Since pedestrian crossing is subject to wear and tear from heavy traffic flow, it is imperative to monitor its status quo. On this account, an approach of automatic pedestrian crossing detection using images from a vehicle-based Mobile Mapping System is put forward and its defilement and impairment are analyzed in this paper. Firstly, a pedestrian crossing classifier is trained with a low recall rate. Then initial detections are refined by utilizing projection filtering, contour information analysis, and monocular vision. Finally, a pedestrian crossing detection and analysis system with high recall rate, precision and robustness will be achieved. This system works for pedestrian crossing detection under different situations and light conditions. It can recognize defiled and impaired crossings automatically in the meanwhile, which facilitates monitoring and maintenance of traffic facilities, so as to reduce potential traffic safety problems and secure lives and property.

  6. STEM - software test and evaluation methods: fault detection using static analysis techniques

    International Nuclear Information System (INIS)

    Bishop, P.G.; Esp, D.G.

    1988-08-01

    STEM is a software reliability project with the objective of evaluating a number of fault detection and fault estimation methods which can be applied to high integrity software. This Report gives some interim results of applying both manual and computer-based static analysis techniques, in particular SPADE, to an early CERL version of the PODS software containing known faults. The main results of this study are that: The scope for thorough verification is determined by the quality of the design documentation; documentation defects become especially apparent when verification is attempted. For well-defined software, the thoroughness of SPADE-assisted verification for detecting a large class of faults was successfully demonstrated. For imprecisely-defined software (not recommended for high-integrity systems) the use of tools such as SPADE is difficult and inappropriate. Analysis and verification tools are helpful, through their reliability and thoroughness. However, they are designed to assist, not replace, a human in validating software. Manual inspection can still reveal errors (such as errors in specification and errors of transcription of systems constants) which current tools cannot detect. There is a need for tools to automatically detect typographical errors in system constants, for example by reporting outliers to patterns. To obtain the maximum benefit from advanced tools, they should be applied during software development (when verification problems can be detected and corrected) rather than retrospectively. (author)

  7. AUTOMATIC PEDESTRIAN CROSSING DETECTION AND IMPAIRMENT ANALYSIS BASED ON MOBILE MAPPING SYSTEM

    Directory of Open Access Journals (Sweden)

    X. Liu

    2017-09-01

    Full Text Available Pedestrian crossing, as an important part of transportation infrastructures, serves to secure pedestrians' lives and possessions and keep traffic flow in order. As a prominent feature in the street scene, detection of pedestrian crossing contributes to 3D road marking reconstruction and diminishing the adverse impact of outliers in 3D street scene reconstruction. Since pedestrian crossing is subject to wear and tear from heavy traffic flow, it is imperative to monitor its status quo. On this account, an approach of automatic pedestrian crossing detection using images from a vehicle-based Mobile Mapping System is put forward and its defilement and impairment are analyzed in this paper. Firstly, a pedestrian crossing classifier is trained with a low recall rate. Then initial detections are refined by utilizing projection filtering, contour information analysis, and monocular vision. Finally, a pedestrian crossing detection and analysis system with high recall rate, precision and robustness will be achieved. This system works for pedestrian crossing detection under different situations and light conditions. It can recognize defiled and impaired crossings automatically in the meanwhile, which facilitates monitoring and maintenance of traffic facilities, so as to reduce potential traffic safety problems and secure lives and property.

  8. Bilinear Time-frequency Analysis for Lamb Wave Signal Detected by Electromagnetic Acoustic Transducer

    Science.gov (United States)

    Sun, Wenxiu; Liu, Guoqiang; Xia, Hui; Xia, Zhengwu

    2018-03-01

    Accurate acquisition of the detection signal travel time plays a very important role in cross-hole tomography. An experimental platform of an aluminum plate under a perpendicular magnetic field is established, and the bilinear time-frequency analysis methods, the Wigner-Ville distribution (WVD) and the pseudo-Wigner-Ville distribution (PWVD), are applied to analyse the Lamb wave signals detected by an electromagnetic acoustic transducer (EMAT). By extracting the frequency component of the time-frequency spectrum equal to the excitation frequency, the travel time information can be obtained. In comparison with traditional linear time-frequency analysis methods such as the short-time Fourier transform (STFT), the bilinear time-frequency analysis method PWVD is more appropriate for extracting travel time and recognizing patterns of Lamb waves.

  9. VIBRATIONS DETECTION IN INDUSTRIAL PUMPS BASED ON SPECTRAL ANALYSIS TO INCREASE THEIR EFFICIENCY

    Directory of Open Access Journals (Sweden)

    Belhadef RACHID

    2016-01-01

    Full Text Available Spectral analysis is the key tool for the study of vibration signals in rotating machinery. In this work, vibration analysis applied to the condition-based preventive maintenance of such machines is proposed, as part of resolving problems related to vibration detection on the components of these machines. The vibration signal of a centrifugal pump was processed to demonstrate the benefits of the proposed approach. The obtained results present the estimation of the pump vibration signal using the Fourier transform technique, compared with spectral analysis methods based on the Prony approach.
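    A generic sketch of the spectral step described (not the authors' code): estimate the amplitude spectrum of a pump vibration signal with the FFT and read off the amplitude near a frequency of interest, e.g. a multiple of the shaft rotation frequency. Sampling rate and target frequencies are assumptions.

    import numpy as np

    def vibration_spectrum(signal, fs):
        """Windowed single-sided amplitude spectrum of a vibration signal."""
        signal = np.asarray(signal, dtype=float) - np.mean(signal)
        window = np.hanning(len(signal))
        spec = np.abs(np.fft.rfft(signal * window)) * 2 / np.sum(window)
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
        return freqs, spec

    def amplitude_at(freqs, spec, target_hz, tol_hz=1.0):
        """Peak amplitude within +/- tol_hz of the target frequency."""
        band = np.abs(freqs - target_hz) <= tol_hz
        return spec[band].max() if band.any() else 0.0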

  10. Cardiac monitoring for detection of atrial fibrillation after TIA: A systematic review and meta-analysis.

    Science.gov (United States)

    Korompoki, Eleni; Del Giudice, Angela; Hillmann, Steffi; Malzahn, Uwe; Gladstone, David J; Heuschmann, Peter; Veltkamp, Roland

    2017-01-01

    Background and purpose The detection rate of atrial fibrillation has not been studied specifically in transient ischemic attack (TIA) patients, although extrapolation from ischemic stroke may be inadequate. We conducted a systematic review and meta-analysis to determine the rate of newly diagnosed atrial fibrillation using different methods of ECG monitoring in TIA. Methods A comprehensive literature search was performed following a pre-specified protocol and the PRISMA statement. Prospective observational studies and randomized controlled trials were considered that included TIA patients who underwent cardiac monitoring for >12 h. Primary outcome was frequency of detection of atrial fibrillation ≥30 s. Analyses of subgroups and of duration and type of monitoring were performed. Results Seventeen studies enrolling 1163 patients were included. The pooled atrial fibrillation detection rate for all methods was 4% (95% CI: 2-7%). Yield of monitoring was higher in selected (higher age, more extensive testing for arrhythmias before enrolment, or presumed cardioembolic/cryptogenic cause) than in unselected cohorts (7% vs 3%). Pooled mean atrial fibrillation detection rates rose with duration of monitoring: 4% (24 h), 5% (24 h to 7 days) and 6% (>7 days), respectively. The yield of non-invasive monitoring was significantly lower than that of invasive monitoring (4% vs. 11%). Significant heterogeneity was observed among studies (I² = 60.61%). Conclusion This first meta-analysis of atrial fibrillation detection in TIA patients finds a lower atrial fibrillation detection rate in TIA than reported for ischemic stroke and TIA cohorts in previous meta-analyses. Prospective studies are needed to determine the actual prevalence of atrial fibrillation and the optimal diagnostic procedure for atrial fibrillation detection in TIA.

  11. Time from first detectable PSA following radical prostatectomy to biochemical recurrence: A competing risk analysis

    Science.gov (United States)

    de Boo, Leonora; Pintilie, Melania; Yip, Paul; Baniel, Jack; Fleshner, Neil; Margel, David

    2015-01-01

    Introduction: In this study, we estimated the time from first detectable prostate-specific antigen (PSA) following radical prostatectomy (RP) to commonly used definitions of biochemical recurrence (BCR). We also identified the predictors of time to BCR. Methods: We identified subjects who underwent a RP and had an undetectable PSA after surgery followed by at least 1 detectable PSA between 2000 and 2011. The primary outcome was time to BCR (PSA ≥0.2 and successive PSA ≥0.2) and prediction of the rate of PSA rise. Outcomes were calculated using a competing risk analysis, with univariable and multivariable Fine and Grey models. We employed a mixed effect model to test clinical predictors that are associated with the rate of PSA rise. Results: The cohort included 376 patients. The median follow-up from surgery was 60.5 months (interquartile range [IQR] 40.8–91.5) and from detectable PSA was 18 months (IQR 11–32). Only 45.74% (n = 172) had PSA values ≥0.2 ng/mL, while 15.16% (n = 57) reached the PSA level of ≥0.4 ng/mL and rising. On multivariable analysis, the value of the first detectable PSA and pathologic Gleason grade 8 or higher were consistently independent predictors of time to BCR. In the mixed effect model, the rate of PSA rise was associated with time from surgery to first detectable PSA, Gleason score, and prostate volume. The main limitation of this study is the large proportion of patients that received treatment without reaching BCR. It is plausible that shorter estimated median times would occur at a centre that does not use salvage therapy at such an early state. Conclusion: The time from first detectable PSA to BCR may be lengthy. Our analyses of the predictors of the rate of PSA rise can help determine a personalized approach for patients with a detectable PSA after surgery. PMID:25624961

  12. Mental-disorder detection using chaos and nonlinear dynamical analysis of photoplethysmographic signals

    International Nuclear Information System (INIS)

    Pham, Tuan D.; Thang, Truong Cong; Oyama-Higa, Mayumi; Sugiyama, Masahide

    2013-01-01

    Highlights: • Chaos and nonlinear dynamical analysis are applied for mental-disorder detection. • Experimental results show significant detection improvement with feature synergy. • Proposed approach is effective for analysis of photoplethysmographic signals. • Proposed approach is promising for developing automated mental-health systems. -- Abstract: Mental disorder can be defined as a psychological disturbance of thought or emotion. In particular, depression is a mental disease which can ultimately lead to death from suicide. If depression is identified, it can be treated with medication and psychotherapy. However, the diagnosis of depression is difficult and there are currently no quick and reliable medical tests to detect if someone is depressed. This is because the exact cause of depression is still unknown, given the belief that depression results from chemical brain changes, genetic disorders, stress, or a combination of these problems. Photoplethysmography has recently been realized as a non-invasive optical technique that can give new insights into the physiology and pathophysiology of the central and peripheral nervous systems. We present in this paper an automated mental-disorder detection approach in a general sense based on a novel synergy of chaos and nonlinear dynamical methods for the analysis of photoplethysmographic finger pulse waves of mental and control subjects. Such an approach can be applied for automated detection of depression as a special case. Because of the computational effectiveness of the studied methods and the low cost of generation of the physiological signals, the proposed automated detection of mental illness is feasible for real-life applications including self-assessment, self-monitoring, and computerized health care

  13. Improved target detection and bearing estimation utilizing fast orthogonal search for real-time spectral analysis

    International Nuclear Information System (INIS)

    Osman, Abdalla; El-Sheimy, Naser; Nourledin, Aboelamgd; Theriault, Jim; Campbell, Scott

    2009-01-01

    The problem of target detection and tracking in the ocean environment has attracted considerable attention due to its importance in military and civilian applications. Sonobuoys are one of the capable passive sonar systems used in underwater target detection. Target detection and bearing estimation are mainly obtained through spectral analysis of received signals. The frequency resolution introduced by current techniques is limited which affects the accuracy of target detection and bearing estimation at a relatively low signal-to-noise ratio (SNR). This research investigates the development of a bearing estimation method using fast orthogonal search (FOS) for enhanced spectral estimation. FOS is employed in this research in order to improve both target detection and bearing estimation in the case of low SNR inputs. The proposed methods were tested using simulated data developed for two different scenarios under different underwater environmental conditions. The results show that the proposed method is capable of enhancing the accuracy for target detection as well as bearing estimation especially in cases of a very low SNR

  14. Bladed wheels damage detection through Non-Harmonic Fourier Analysis improved algorithm

    Science.gov (United States)

    Neri, P.

    2017-05-01

    Recent papers introduced Non-Harmonic Fourier Analysis for bladed wheel damage detection. This technique showed its potential in estimating the frequency of sinusoidal signals even when the acquisition time is short with respect to the vibration period, provided that some hypotheses are fulfilled. However, previously proposed algorithms showed severe limitations in detecting cracks at their early stage. The present paper proposes an improved algorithm which allows detection of a blade vibration frequency shift due to a crack whose size is really small compared to the blade width. Such a technique could be implemented for condition-based maintenance, allowing non-contact methods to be used for vibration measurements. A stator-fixed laser sensor could monitor all the blades as they pass in front of the spot, giving precious information about the wheel health. This configuration determines an acquisition time for each blade which becomes shorter as the machine rotational speed increases. In this situation, traditional Discrete Fourier Transform analysis results in poor frequency resolution and is not suitable for small frequency shift detection. Non-Harmonic Fourier Analysis instead showed high reliability in vibration frequency estimation even with data samples collected in a short time range. A description of the improved algorithm is provided in the paper, along with a comparison with the previous one. Finally, a validation of the method is presented, based on finite element simulation results.
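    A loose sketch of the core idea attributed here to Non-Harmonic Fourier Analysis: evaluate the Fourier coefficient on a fine, non-integer frequency grid instead of only at the DFT bins, so that small frequency shifts remain visible even for short records. The grid spacing and search band are assumptions, and the actual published algorithm contains refinements not shown.

    import numpy as np

    def nonharmonic_spectrum(signal, fs, f_lo, f_hi, df=0.01):
        """Amplitude estimate on an arbitrary (non-bin) frequency grid."""
        x = np.asarray(signal, dtype=float)
        n = np.arange(len(x))
        freqs = np.arange(f_lo, f_hi, df)
        amps = [np.abs(np.sum(x * np.exp(-2j * np.pi * f * n / fs)))
                for f in freqs]
        return freqs, 2.0 * np.array(amps) / len(x)

    def dominant_frequency(signal, fs, f_lo, f_hi, df=0.01):
        freqs, amps = nonharmonic_spectrum(signal, fs, f_lo, f_hi, df)
        return freqs[np.argmax(amps)]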

  15. Detection of Impaired Cerebral Autoregulation Using Selected Correlation Analysis: A Validation Study.

    Science.gov (United States)

    Proescholdt, Martin A; Faltermeier, Rupert; Bele, Sylvia; Brawanski, Alexander

    2017-01-01

    Multimodal brain monitoring has been utilized to optimize treatment of patients with critical neurological diseases. However, the amount of data requires an integrative tool set to unmask pathological events in a timely fashion. Recently we have introduced a mathematical model allowing the simulation of pathophysiological conditions such as reduced intracranial compliance and impaired autoregulation. Utilizing a mathematical tool set called selected correlation analysis (sca), correlation patterns, which indicate impaired autoregulation, can be detected in patient data sets (scp). In this study we compared the results of the sca with the pressure reactivity index (PRx), an established marker for impaired autoregulation. Mean PRx values were significantly higher in time segments identified as scp compared to segments showing no selected correlations (nsc). The sca based approach predicted cerebral autoregulation failure with a sensitivity of 78.8% and a specificity of 62.6%. Autoregulation failure, as detected by the results of both analysis methods, was significantly correlated with poor outcome. Sca of brain monitoring data detects impaired autoregulation with high sensitivity and sufficient specificity. Since the sca approach allows the simultaneous detection of both major pathological conditions, disturbed autoregulation and reduced compliance, it may become a useful analysis tool for brain multimodal monitoring data.
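    For context, the comparator index PRx is conventionally computed as a moving Pearson correlation between slow waves of arterial blood pressure (ABP) and intracranial pressure (ICP); a sketch of that computation is shown below. The window length is an assumption, and the selected correlation analysis itself is not reproduced here.

    import numpy as np

    def prx(abp, icp, window=30):
        """Moving correlation between ABP and ICP samples (PRx-style index)."""
        abp = np.asarray(abp, dtype=float)
        icp = np.asarray(icp, dtype=float)
        values = []
        for start in range(len(abp) - window + 1):
            a = abp[start:start + window]
            i = icp[start:start + window]
            values.append(np.corrcoef(a, i)[0, 1])
        return np.array(values)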

  16. Detection of Impaired Cerebral Autoregulation Using Selected Correlation Analysis: A Validation Study

    Directory of Open Access Journals (Sweden)

    Martin A. Proescholdt

    2017-01-01

    Full Text Available Multimodal brain monitoring has been utilized to optimize treatment of patients with critical neurological diseases. However, the amount of data requires an integrative tool set to unmask pathological events in a timely fashion. Recently we have introduced a mathematical model allowing the simulation of pathophysiological conditions such as reduced intracranial compliance and impaired autoregulation. Utilizing a mathematical tool set called selected correlation analysis (sca), correlation patterns, which indicate impaired autoregulation, can be detected in patient data sets (scp). In this study we compared the results of the sca with the pressure reactivity index (PRx), an established marker for impaired autoregulation. Mean PRx values were significantly higher in time segments identified as scp compared to segments showing no selected correlations (nsc). The sca based approach predicted cerebral autoregulation failure with a sensitivity of 78.8% and a specificity of 62.6%. Autoregulation failure, as detected by the results of both analysis methods, was significantly correlated with poor outcome. Sca of brain monitoring data detects impaired autoregulation with high sensitivity and sufficient specificity. Since the sca approach allows the simultaneous detection of both major pathological conditions, disturbed autoregulation and reduced compliance, it may become a useful analysis tool for brain multimodal monitoring data.

  17. A Novel Method for Surface Defect Detection of Photovoltaic Module Based on Independent Component Analysis

    Directory of Open Access Journals (Sweden)

    Xuewu Zhang

    2013-01-01

    Full Text Available This paper proposed a new method for surface defect detection of photovoltaic modules based on an independent component analysis (ICA) reconstruction algorithm. Firstly, a faultless image is used as the training image. The demixing matrix and corresponding ICs are obtained by applying ICA to the training image. The ICs are then reordered according to their range values and the demixing matrix is re-formed. The re-formed demixing matrix is then used to reconstruct the defect image. The resulting image can remove the background structures and enhance the local anomalies. Experimental results have shown that the proposed method can effectively detect the presence of defects in periodically patterned surfaces.
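    A loose sketch of the reconstruction idea described above, using scikit-learn's FastICA as a stand-in: learn components from a defect-free reference image (rows treated as observations), keep those with the largest value ranges as the periodic background, and subtract the reconstructed background from a test image so local anomalies stand out. The number of components retained is an assumption, and the test image must have the same width as the reference.

    import numpy as np
    from sklearn.decomposition import FastICA

    def train_ica(reference_image, n_components=10):
        ica = FastICA(n_components=n_components, random_state=0, max_iter=1000)
        sources = ica.fit_transform(reference_image)          # rows x components
        ranges = sources.max(axis=0) - sources.min(axis=0)
        order = np.argsort(ranges)[::-1]                      # widest range first
        return ica, order

    def highlight_defects(test_image, ica, order, keep=3):
        sources = ica.transform(test_image)
        kept = np.zeros_like(sources)
        kept[:, order[:keep]] = sources[:, order[:keep]]
        background = ica.inverse_transform(kept)
        return test_image - background      # residual emphasises local anomalies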

  18. Detection of gamma-irradiated peanuts by ESR spectroscopy and GC analysis of hydrocarbons

    Energy Technology Data Exchange (ETDEWEB)

    Wei Mingli; An Li [Institute of Agro-food Science and Technology, Chinese Academy of Agricultural Sciences, 100193 Beijing (China); Yi Mingha, E-mail: wangyilwm@163.co [Institute of Agro-food Science and Technology, Chinese Academy of Agricultural Sciences, 100193 Beijing (China); Feng Wang [Institute of Agro-food Science and Technology, Chinese Academy of Agricultural Sciences, 100193 Beijing (China); Yan Lizhang [Division of Metrology in Ionizing Radiation and Medicine, National Institute of Metrology, 100013 Beijing (China)

    2011-03-15

    Peanuts were analyzed by electron spin resonance (ESR) spectroscopy and gas chromatography (GC) before and after gamma irradiation. Using European protocols, the validity and effectiveness of these two techniques were compared with regard to sample preparation, sample and solvent consumption and dose-response curves after irradiation. The results showed the possibility of using ESR and GC for distinguishing between irradiated and unirradiated peanuts. A radiation dose of 0.1 kGy could be detected by ESR but not by GC. The results also indicated that GC is an effective method for qualitative analysis of irradiated peanut, while ESR is suitable for the rapid detection of irradiated peanuts.

  19. Flow cytometric analysis of RNA synthesis by detection of bromouridine incorporation

    DEFF Research Database (Denmark)

    Larsen, J K; Jensen, Peter Østrup; Larsen, J

    2001-01-01

    RNA synthesis has traditionally been investigated by a laborious and time-consuming radiographic method involving incorporation of tritiated uridine. Now a faster non-radioactive alternative has emerged, based on immunocytochemical detection. This method utilizes the brominated RNA precursor...... bromouridine, which is taken into a cell, phosphorylated, and incorporated into nascent RNA. The BrU-substituted RNA is detected by permeabilizing the cells and staining with certain anti-BrdU antibodies. This dynamic approach yields information complementing that provided by cellular RNA content analysis...

  20. An open source cryostage and software analysis method for detection of antifreeze activity

    DEFF Research Database (Denmark)

    Lørup Buch, Johannes; Ramløv, H

    2016-01-01

    The aim of this study is to provide the reader with a simple setup that can detect antifreeze proteins (AFP) by inhibition of ice recrystallisation in very small sample sizes. This includes an open source cryostage, a method for preparing and loading samples as well as a software analysis method... AFP could reliably be told apart from controls after only two minutes of recrystallisation. The goal of providing a fast, cheap and easy method for detecting antifreeze proteins in solution was met, and further development of the system can be followed at https://github.com/pechano/cryostage.

  1. Detection of negative ions in glow discharge mass spectrometry for analysis of solid specimens

    DEFF Research Database (Denmark)

    Canulescu, Stela; Molchan, Igor S.; Tauziede, C.

    2010-01-01

    A new method is presented for elemental and molecular analysis of halogen-containing samples by glow discharge time-of-flight mass spectrometry, consisting of detection of negative ions from a pulsed RF glow discharge in argon. Analyte signals are mainly extracted from the afterglow regime...... be used to study the distribution of a tantalum fluoride layer within the anodized tantala layer. Further, comparison is made with data obtained using glow-discharge optical emission spectroscopy, where elemental fluorine can only be detected using a neon plasma. The ionization mechanisms responsible...... for the formation of negative ions in glow discharge time-of-flight mass spectrometry are briefly discussed....

  2. Limitations to depth resolution in high-energy, heavy-ion elastic recoil detection analysis

    International Nuclear Information System (INIS)

    Elliman, R.G.; Palmer, G.R.; Ophel, T.R.; Timmers, H.

    1998-01-01

    The depth resolution of heavy-ion elastic recoil detection analysis was examined for Al and Co thin films ranging in thickness from 100 to 400 nm. Measurements were performed with 154 MeV Au ions as the incident beam, and recoils were detected using a gas ionisation detector. Energy spectra were extracted for the Al and Co recoils and the depth resolution determined as a function of film thickness from the width of the high- and low-energy edges. These results were compared with theoretical estimates calculated using the computer program DEPTH. (authors)

  3. Analysis of ecstasy tablets using capillary electrophoresis with capacitively coupled contactless conductivity detection.

    Science.gov (United States)

    Porto, Suely K S S; Nogueira, Thiago; Blanes, Lucas; Doble, Philip; Sabino, Bruno D; do Lago, Claudimir L; Angnes, Lúcio

    2014-11-01

    A method for the identification of 3,4-methylenedioxymethamphetamine (MDMA) and meta-chlorophenylpiperazine (mCPP) was developed employing capillary electrophoresis (CE) with capacitively coupled contactless conductivity detection (C4D). Sample extraction, separation, and detection of "Ecstasy" tablets were performed in the presence of fenproporex, caffeine, lidocaine, and cocaine. Separation was performed in <90 sec. The advantages of using C4D instead of traditional CE-UV methods for in-field analysis are also discussed. © 2014 American Academy of Forensic Sciences.

  4. Reduction of Compton background from hydrogen in prompt gamma-ray analysis by multiple photon detection

    International Nuclear Information System (INIS)

    Toh, Y.; Oshima, M.; Kimura, A.; Koizumi, M.; Furutaka, K.; Hatsukawa, Y.

    2008-01-01

    Low-energy photons produced by the Compton scattering from hydrogen increase the background in the lower-energy region of the gamma-ray spectrum. This results in an increase in the detection limit for trace elements. In multiple photon detection prompt gamma-ray analysis (MPGA), only those elements that simultaneously emit two or more prompt gamma-rays, which have cascade relation and are emitted within a short interval, can be measured. Therefore, the influence of hydrogen can be reduced. In this study, standard polymer and food samples are measured. The hydrogen background is reduced in MPGA. (author)

  5. Non-destructive analysis and detection of internal characteristics of spruce logs through X computerized tomography

    International Nuclear Information System (INIS)

    Longuetaud, F.

    2005-10-01

    Computerized tomography allows direct access to the internal features of scanned logs on the basis of density and moisture content variations. The objective of this work is to assess the feasibility of automatic detection of internal characteristics with the final aim of conducting scientific analyses. The database consists of CT images of 24 spruces obtained with a medical CT scanner. The studied trees are representative of several social statuses and come from four stands located in North-Eastern France, themselves representative of several age, density and fertility classes. The automatic processing steps developed are the following. First, pith detection in logs, dealing with the problem of knot presence and ring eccentricity; the accuracy of the localisation was less than one mm. Secondly, the detection of the sapwood/heartwood limit in logs, dealing with the problem of knot presence (the main source of difficulty); the error on the diameter was 1.8 mm, which corresponds to a relative error of 1.3 per cent. Thirdly, the detection of the whorl locations and comparison with an optical method. Fourthly, the detection of individualized knots; this process allows knots to be counted and located in a log (longitudinal position and azimuth), however, the validation of the method and the extraction of branch diameter and inclination are still to be developed. An application of this work was a variability analysis of the sapwood content in the trunk: at the within-tree level, the sapwood width was found to be constant under the living crown; at the between-tree level, a strong correlation was found with the amount of living branches. A great number of analyses are possible from our results, among others: architectural analysis with pith tracking and apex death occurrence; analysis of radial variations of the heartwood shape; analysis of the knot distribution in logs. (author)

  6. [Detection of UGT1A1*28 Polymorphism Using Fragment Analysis].

    Science.gov (United States)

    Huang, Ying; Su, Jian; Huang, Xiaosui; Lu, Danxia; Xie, Zhi; Yang, Suqing; Guo, Weibang; Lv, Zhiyi; Wu, Hongsui; Zhang, Xuchao

    2017-12-20

    The UGT1A1*28 polymorphism of uridine-diphosphoglucuronosyl transferase 1A1 (UGT1A1) can reduce UGT1A1 enzymatic activity, which may lead to severe toxicities in patients who receive irinotecan. This study aimed to establish a fragment analysis method to detect the UGT1A1*28 polymorphism. A total of 286 blood specimens from lung cancer patients hospitalized in Guangdong General Hospital between April 2014 and May 2015 were tested for the UGT1A1*28 polymorphism by the fragment analysis method. Compared with Sanger sequencing, the precision and accuracy of the fragment analysis method were both 100%. Of the 286 patients, 236 (82.5%) harbored the TA6/6 genotype, 48 (16.8%) the TA6/7 genotype and 2 (0.7%) the TA7/7 genotype. Our data suggest that the fragment analysis method is robust for detecting the UGT1A1*28 polymorphism in clinical practice; it is simple, time-saving, and easy to carry out.

  7. Molecular Detection of Bladder Cancer by Fluorescence Microsatellite Analysis and an Automated Genetic Analyzing System

    Directory of Open Access Journals (Sweden)

    Sarel Halachmi

    2007-01-01

    To investigate the ability of an automated fluorescent analyzing system to detect microsatellite alterations in patients with bladder cancer, we investigated 11 patients with pathology-proven bladder transitional cell carcinoma (TCC) for microsatellite alterations in blood, urine, and tumor biopsies. DNA was prepared by standard methods from blood, urine and resected tumor specimens and was used for microsatellite analysis. After the primers were fluorescently labeled, amplification of the DNA was performed by PCR. The PCR products were loaded into the automated genetic analyzer (ABI Prism 310, Perkin Elmer, USA) and subjected to fluorescent scanning with argon-ion laser beams. From the fluorescent signal intensity, the genetic analyzer measured the product size in base pairs. Using fluorescent microsatellite analysis and the automated analyzing system, we found loss of heterozygosity (LOH) or microsatellite alterations (a loss or gain of nucleotides that alters the original normal locus size) in all patients. In each case the genetic changes found in the urine samples were identical to those found in the resected tumor sample. These studies demonstrate the ability to detect bladder tumors non-invasively by fluorescent microsatellite analysis of urine samples. Our study supports the worldwide trend in the search for non-invasive methods to detect bladder cancer. We have overcome major obstacles that prevented the clinical use of an experimental system; with our newly tested system, microsatellite analysis can be done more cheaply, faster, more easily and with higher scientific accuracy.

  8. Rapid detection of Listeria monocytogenes in milk using confocal micro-Raman spectroscopy and chemometric analysis.

    Science.gov (United States)

    Wang, Junping; Xie, Xinfang; Feng, Jinsong; Chen, Jessica C; Du, Xin-jun; Luo, Jiangzhao; Lu, Xiaonan; Wang, Shuo

    2015-07-02

    Listeria monocytogenes is a facultatively anaerobic, Gram-positive, rod-shaped foodborne bacterium causing an invasive infection, listeriosis, in susceptible populations. Rapid and high-throughput detection of this pathogen in dairy products is critical, as milk and other dairy products have been implicated as food vehicles in several outbreaks. Here we evaluated confocal micro-Raman spectroscopy (785 nm laser) coupled with chemometric analysis to distinguish six closely related Listeria species, including L. monocytogenes, in both liquid media and milk. Raman spectra of different Listeria species and other bacteria (i.e., Staphylococcus aureus, Salmonella enterica and Escherichia coli) were collected to create two independent databases for detection in media and milk, respectively. Unsupervised chemometric models, including principal component analysis and hierarchical cluster analysis, were applied to differentiate L. monocytogenes from other Listeria species and other bacteria. To further evaluate the performance and reliability of the unsupervised chemometric analyses, supervised chemometrics were performed, including two discriminant analyses (DA) and soft independent modeling of class analogies (SIMCA). By analyzing Raman spectra via the two DA-based chemometric models, average identification accuracies of 97.78% and 98.33% for L. monocytogenes in media, and 95.28% and 96.11% in milk, were obtained, respectively. SIMCA analysis also resulted in satisfactory average classification accuracies (over 93% in both media and milk). This Raman spectroscopy-based detection of L. monocytogenes in media and milk can be completed within a few hours and requires no extensive sample preparation. Copyright © 2015 Elsevier B.V. All rights reserved.
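
    As a rough illustration of the unsupervised chemometric step described above, the sketch below applies principal component analysis followed by hierarchical cluster analysis to a matrix of preprocessed Raman spectra (one row per spectrum). It is only a schematic outline: the input file name, the number of components and the linkage settings are illustrative assumptions, not the authors' implementation.

        import numpy as np
        from sklearn.decomposition import PCA
        from scipy.cluster.hierarchy import linkage, fcluster

        # spectra: (n_samples, n_wavenumbers) matrix of preprocessed Raman intensities
        spectra = np.loadtxt("raman_spectra.csv", delimiter=",")   # hypothetical input file

        # Reduce dimensionality; a handful of PCs usually captures most spectral variance
        pca = PCA(n_components=5)
        scores = pca.fit_transform(spectra)

        # Hierarchical cluster analysis on the PC scores (Ward linkage, Euclidean distance)
        Z = linkage(scores, method="ward")
        labels = fcluster(Z, t=6, criterion="maxclust")   # e.g. one cluster per Listeria species

        print("explained variance:", pca.explained_variance_ratio_.round(3))
        print("cluster assignments:", labels)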

  9. Data-driven fault detection for industrial processes canonical correlation analysis and projection based methods

    CERN Document Server

    Chen, Zhiwen

    2017-01-01

    Zhiwen Chen aims to develop advanced fault detection (FD) methods for the monitoring of industrial processes. With ever increasing demands on reliability and safety in industrial processes, fault detection has become an important issue. Although model-based fault detection theory has been well studied in past decades, its application to large-scale industrial processes is limited because it is difficult to build accurate models. Furthermore, motivated by the limitations of existing data-driven FD methods, novel canonical correlation analysis (CCA) and projection-based methods are proposed from the perspectives of process input and output data, less engineering effort and wide application scope. For performance evaluation of FD methods, a new index is also developed. Contents: A New Index for Performance Evaluation of FD Methods; CCA-based FD Method for the Monitoring of Stationary Processes; Projection-based FD Method for the Monitoring of Dynamic Processes; Benchmark Study and Real-Time Implementat...

  10. Image edge detection based tool condition monitoring with morphological component analysis.

    Science.gov (United States)

    Yu, Xiaolong; Lin, Xin; Dai, Yiquan; Zhu, Kunpeng

    2017-07-01

    The measurement and monitoring of tool condition are key to product precision in automated manufacturing. To meet this need, this study proposes a novel tool wear monitoring approach based on edge detection in the monitored image. Image edge detection is a fundamental tool for obtaining image features. The approach extracts the tool edge with morphological component analysis: by decomposing the original tool wear image, it reduces the influence of texture and noise on the edge measurement. Based on sparse representation of the target image and edge detection, the approach can accurately extract the tool wear edge with a continuous and complete contour, and is convenient for characterizing tool conditions. Compared with established algorithms in the literature, this approach improves the integrity and connectivity of edges, and the results show that it achieves better geometric accuracy and a lower error rate in the estimation of tool conditions. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  11. Detection of masses in mammograms by analysis of gradient vector convergence using sector filter

    International Nuclear Information System (INIS)

    Fakhari, Y.; Karimian, A.; Mohammadbeigi, M.

    2012-01-01

    Although mammography is the main diagnostic method for breast cancer, the interpretation of mammograms is a difficult task that depends on the experience and skill of the radiologist. Computer Aided Detection (CADe) systems have been proposed to help radiologists interpret mammograms. In this paper a novel filter, called the Sector filter, is proposed to detect masses. The filter works by analyzing the convergence of gradient vectors toward the center of the filter. Using this filter, rounded convex regions, which are more likely to correspond to a mass, can be detected in a gray-scale image. After applying the filter to the images at two scales, together with their linear combination, suspicious points were selected by a dedicated process. The proposed method yielded promising results; its performance was competitive with, and in some cases better than, that of other methods suggested in the literature. (authors)
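
    To make the idea of gradient-vector convergence concrete, the snippet below computes a simplified convergence score: how well the intensity gradients in a circular neighbourhood point toward a candidate centre pixel. It is only a sketch of the principle, not the Sector filter itself; the window radius and the requirement that the centre lie away from the image border are illustrative assumptions.

        import numpy as np

        def convergence_score(img, cy, cx, radius=15):
            """Mean alignment of local gradients with the direction toward (cy, cx).

            Assumes (cy, cx) lies at least `radius` pixels from the image border.
            """
            gy, gx = np.gradient(img.astype(float))
            ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
            mask = (ys**2 + xs**2 <= radius**2) & ((ys != 0) | (xs != 0))
            dy, dx = ys[mask], xs[mask]
            # unit vectors pointing from each neighbour back toward the centre
            norm = np.hypot(dy, dx)
            uy, ux = -dy / norm, -dx / norm
            # gradients point from dark to bright, so around a bright convex region
            # they converge on its centre and the mean dot product is high
            g = np.stack([gy[cy + dy, cx + dx], gx[cy + dy, cx + dx]])
            gnorm = np.linalg.norm(g, axis=0) + 1e-9
            return float(np.mean((g[0] * uy + g[1] * ux) / gnorm))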

  12. Molecular analysis of the Duchenne muscular dystrophy gene in Spanish individuals: Deletion detection and familial diagnosis

    Energy Technology Data Exchange (ETDEWEB)

    Patino, A.; Garcia-Delgado, M.; Narbona, J. [Univ. of Navarra, Pamplona (Spain)

    1995-11-06

    Deletion studies were performed in 26 Duchenne muscular dystrophy (DMD) patients through amplification of nine different exons by multiplex polymerase chain reaction (PCR). DNA from paraffin-embedded muscle biopsies was analyzed in 12 of the 26 patients studied. Optimization of this technique is of great utility because it enables analysis of material stored in pathology archives. PCR deletion detection, useful in DMD-affected boys, is problematic in determining the carrier state in female relatives. For this reason, to perform familial linkage diagnosis, we made use of a dinucleotide repeat polymorphism (STRP, or short tandem repeat polymorphism) located in intron 49 of the gene. We designed a new pair of primers that enabled the detection of 22 different alleles in relatives in the 14 DMD families studied. The use of this marker allowed familial diagnosis in 11 of the 14 DMD families and detection of de novo deletions in 3 of the probands. 8 refs., 5 figs., 2 tabs.

  13. Non-invasive method for screening and early detection of breast tumors using thermal field analysis

    Directory of Open Access Journals (Sweden)

    O. Drosu

    2009-10-01

    The paper gives a general overview of international and European assessments of breast cancer incidence and mortality, as well as recommendations for prevention, screening, detection and treatment. In recent years, international research in biomedical engineering has put particular emphasis on the use of thermography in breast pathology diagnosis and its main advantages, such as: early diagnosis of breast cancer, at a stage when mammography or ultrasound cannot easily detect the changes in the tissue; a totally non-invasive interaction with the human body; very low costs; and the possibility for women to perform a self thermographic test. We also present some important results of our research in the field of breast tumor detection using numerical analysis of the inverse thermal problem.

  14. Fault Detection of Reciprocating Compressors using a Model from Principles Component Analysis of Vibrations

    International Nuclear Information System (INIS)

    Ahmed, M; Gu, F; Ball, A D

    2012-01-01

    Traditional vibration monitoring techniques have found it difficult to determine a set of effective diagnostic features due to the high complexity of vibration signals originating from many different impact sources and wide ranges of practical operating conditions. In this paper Principal Component Analysis (PCA) is used for selecting vibration features and detecting different faults in a reciprocating compressor. Vibration datasets were collected from the compressor under the baseline condition and five common faults: valve leakage, inter-cooler leakage, suction valve leakage, loose drive belt combined with inter-cooler leakage, and loose drive belt combined with suction valve leakage. A model using five PCs was developed from the baseline data sets, and the presence of faults can be detected by comparing the T² and Q values computed from the features of fault vibration signals with the corresponding thresholds developed from the baseline data. However, the Q-statistic procedure produces better detection, as it can separate the five faults completely.
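
    The T² / Q monitoring logic summarised above can be sketched in a few lines: fit a PCA model on baseline vibration features, then flag a new feature vector when either statistic exceeds a threshold estimated from the baseline data. The percentile-based control limits, the number of retained PCs and the feature matrix below are illustrative assumptions rather than the authors' exact procedure.

        import numpy as np

        def fit_pca_monitor(X_baseline, n_pc=5, alpha=0.99):
            """Fit a PCA fault-detection model on baseline (healthy) feature vectors."""
            mu, sigma = X_baseline.mean(0), X_baseline.std(0) + 1e-12
            Z = (X_baseline - mu) / sigma
            U, s, Vt = np.linalg.svd(Z, full_matrices=False)
            P = Vt[:n_pc].T                          # retained loadings
            lam = (s[:n_pc] ** 2) / (len(Z) - 1)     # variances of the retained PCs
            t2 = np.sum((Z @ P) ** 2 / lam, axis=1)  # baseline T^2 values
            resid = Z - (Z @ P) @ P.T
            q = np.sum(resid ** 2, axis=1)           # baseline Q (SPE) values
            # empirical control limits from the baseline distribution
            return dict(mu=mu, sigma=sigma, P=P, lam=lam,
                        t2_lim=np.quantile(t2, alpha), q_lim=np.quantile(q, alpha))

        def is_faulty(model, x):
            """Flag a new feature vector if it violates either control limit."""
            z = (x - model["mu"]) / model["sigma"]
            t2 = np.sum((z @ model["P"]) ** 2 / model["lam"])
            q = np.sum((z - (z @ model["P"]) @ model["P"].T) ** 2)
            return t2 > model["t2_lim"] or q > model["q_lim"]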

  15. Comparison of virtual cystoscopy and ultrasonography for bladder cancer detection: A meta-analysis

    International Nuclear Information System (INIS)

    Qu Xinhua; Huang Xiaolu; Wu Lianming; Huang Gang; Ping Xiong; Yan Weili

    2011-01-01

    Background and purpose: Bladder cancer is the most commonly diagnosed malignancy in patients presenting with haematuria. Early detection is crucial for improving patient prognosis. We therefore performed a meta-analysis to evaluate and compare the detection validity (sensitivity and specificity) of virtual cystoscopy (VC) and ultrasonography (US). Methods: We searched MEDLINE, EMBASE, PubMed and the Cochrane Library for studies evaluating the diagnostic validity of VC and US between January 1966 and December 2009. Meta-analysis methods were used to pool sensitivity and specificity and to construct a summary receiver-operating characteristic (SROC) curve. Results: A total of 26 studies, including 3084 patients, fulfilled all of the inclusion criteria and were included in the analysis. The pooled sensitivity for bladder cancer detection using CT virtual cystoscopy (CTVC), MR virtual cystoscopy (MRVC) and US was 0.939 (95% CI, 0.919-0.956), 0.908 (95% CI, 0.827-0.959) and 0.779 (95% CI, 0.744-0.812), respectively. The pooled specificity for bladder cancer detection using CTVC, MRVC and US was 0.981 (95% CI, 0.973-0.988), 0.948 (95% CI, 0.884-0.983) and 0.962 (95% CI, 0.953-0.969), respectively. The pooled diagnostic odds ratio (DOR) estimate for CTVC (604.22) was significantly higher than for MRVC (144.35, P < 0.001) and US (72.472, P < 0.001). Conclusion: Our results showed that both CTVC and MRVC are better imaging methods for diagnosing bladder cancer than US. CTVC has higher diagnostic value (sensitivity, specificity and DOR) for the detection of bladder cancer than either MRVC or US.

  16. Comparison of virtual cystoscopy and ultrasonography for bladder cancer detection: A meta-analysis

    Energy Technology Data Exchange (ETDEWEB)

    Qu Xinhua; Huang Xiaolu [Department of Nuclear Medicine, Renji Hospital, Shanghai Jiaotong University School of Medicine, Shanghai 200127 (China); Department of Ultrasonic, Ninth People's Hospital, Shanghai Jiaotong University School of Medicine, Shanghai 200011 (China); Wu Lianming [Department of Radiology, Renji Hospital, Shanghai Jiaotong University School of Medicine, Shanghai 200127 (China); Huang Gang [Department of Nuclear Medicine, Renji Hospital, Shanghai Jiaotong University School of Medicine, Shanghai 200127 (China); Ping Xiong, E-mail: pxiong6@126.com [Department of Ultrasonic, Ninth People's Hospital, Shanghai Jiaotong University School of Medicine, Shanghai 200011 (China); Yan Weili, E-mail: wl_yan67@126.com [Department of Nuclear Medicine, Renji Hospital, Shanghai Jiaotong University School of Medicine, Shanghai 200127 (China)

    2011-11-15

    Background and purpose: Bladder cancer is the most commonly diagnosed malignancy in patients presenting with haematuria. Early detection is crucial for improving patient prognosis. We therefore performed a meta-analysis to evaluate and compare the detection validity (sensitivity and specificity) of virtual cystoscopy (VC) and ultrasonography (US). Methods: We searched MEDLINE, EMBASE, PubMed and the Cochrane Library for studies evaluating the diagnostic validity of VC and US between January 1966 and December 2009. Meta-analysis methods were used to pool sensitivity and specificity and to construct a summary receiver-operating characteristic (SROC) curve. Results: A total of 26 studies, including 3084 patients, fulfilled all of the inclusion criteria and were included in the analysis. The pooled sensitivity for bladder cancer detection using CT virtual cystoscopy (CTVC), MR virtual cystoscopy (MRVC) and US was 0.939 (95% CI, 0.919-0.956), 0.908 (95% CI, 0.827-0.959) and 0.779 (95% CI, 0.744-0.812), respectively. The pooled specificity for bladder cancer detection using CTVC, MRVC and US was 0.981 (95% CI, 0.973-0.988), 0.948 (95% CI, 0.884-0.983) and 0.962 (95% CI, 0.953-0.969), respectively. The pooled diagnostic odds ratio (DOR) estimate for CTVC (604.22) was significantly higher than for MRVC (144.35, P < 0.001) and US (72.472, P < 0.001). Conclusion: Our results showed that both CTVC and MRVC are better imaging methods for diagnosing bladder cancer than US. CTVC has higher diagnostic value (sensitivity, specificity and DOR) for the detection of bladder cancer than either MRVC or US.

  17. Detecting, reporting, and analysis of priority diseases for routine public health surveillance in Liberia.

    Science.gov (United States)

    Frimpong, Joseph Asamoah; Park, Meeyoung Mattie; Amo-Addae, Maame Pokuah; Adewuyi, Peter Adebayo; Nagbe, Thomas Knue

    2017-01-01

    An essential component of a public health surveillance system is its ability to detect priority diseases which fall within the mandate of public health officials at all levels. Early detection, reporting and response to public health events help to reduce the burden of mortality and morbidity on communities. Analysis of reliable surveillance data provides relevant information which can enable implementation of timely and appropriate public health interventions. To ensure that a resilient system is in place, the World Health Organization (WHO) has provided guidelines for detection, reporting and response to public health events in the Integrated Disease Surveillance and Response (IDSR) strategy. This case study provides training on detection, reporting and analysis of priority diseases for routine public health surveillance in Liberia and highlights potential errors and challenges which can hinder effective surveillance. Table-top exercises and group discussion lead participants through simulated verification and analysis of summary case reports in the role of the District Surveillance Officer. This case study is intended for public health training in a classroom setting and can be accomplished within 2 hours 30 minutes. The target audience includes residents in Frontline Epidemiology Training Programs (FETP-Frontline), Field Epidemiology and Laboratory Training Programs (FELTPs), and others who are interested in this topic.

  18. Noninvasive detection of inhomogeneities in turbid media with time-resolved log-slope analysis

    International Nuclear Information System (INIS)

    Wan, S.K.; Guo Zhixiong; Kumar, Sunil; Aber, Janice; Garetz, B.A.

    2004-01-01

    Detecting foreign objects embedded in turbid media using noninvasive optical tomography techniques is of great importance in many practical applications, such as biomedical imaging and diagnosis, safety inspection of aircraft and submarines, and LIDAR techniques. In this paper we develop a novel optical tomography approach based on slope analysis of time-resolved back-scattered signals collected at the medium boundaries, where the light source is an ultrafast, short-pulse laser. As the optical field induced by the laser pulse propagates, the detected temporal signals are influenced by the optical properties of the medium traversed. The detected temporal signatures therefore contain information that can indicate the presence of an inhomogeneity as well as its size and location relative to the laser source and detection systems. Log-slope analysis of the time-resolved back-scattered intensity is shown to be an effective method for extracting the information contained in the signal. The technique is validated by experimental results and by Monte Carlo simulations
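
    As a toy illustration of the log-slope idea, the snippet below fits the slope of the logarithm of a time-resolved back-scattered signal over a chosen window; a change in this slope relative to a reference measured on a homogeneous medium can then flag an embedded inhomogeneity. The window bounds, array names and tolerance are illustrative assumptions, not the authors' procedure.

        import numpy as np

        def log_slope(t, intensity, t_start, t_end):
            """Slope of ln(intensity) over a chosen time window of the temporal signal."""
            sel = (t >= t_start) & (t <= t_end) & (intensity > 0)
            slope, _ = np.polyfit(t[sel], np.log(intensity[sel]), 1)
            return slope

        # Hypothetical usage: compare against a slope measured on a homogeneous reference.
        # ref_slope = log_slope(t_ref, I_ref, 200e-12, 800e-12)
        # inhomogeneity_suspected = abs(log_slope(t, I, 200e-12, 800e-12) - ref_slope) > tol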

  19. Detection of Low Molecular Weight Adulterants in Beverages by Direct Analysis in Real Time Mass Spectrometry.

    Science.gov (United States)

    Sisco, Edward; Dake, Jeffrey

    2016-04-14

    Direct Analysis in Real Time Mass Spectrometry (DART-MS) has been used to detect the presence of non-narcotic adulterants in beverages. The non-narcotic adulterants examined in this work comprised a number of low molecular weight alcohols, acetone, ammonium hydroxide, and sodium hypochlorite. Analysis of the adulterants was performed by pipetting 1 µL deposits onto glass microcapillaries along with an appropriate dopant species, followed by introduction into the DART gas stream. It was found that detection of these compounds in the complex matrices of common beverages (soda, energy drinks, etc.) was simplified through the use of a dopant species to allow adduct formation with the compound(s) of interest. Other parameters investigated included DART gas stream temperature, in-source collision-induced dissociation, ion polarity, and DART needle voltage. Sensitivities of the technique ranged from 0.001 % volume fraction to 0.1 % volume fraction, comparable to traditional analyses completed using headspace gas chromatography mass spectrometry (HS-GC/MS). Once a method was established using aqueous solutions, fifteen beverages were spiked with each of the nine adulterants to simulate real-world detection, and in nearly all cases the adulterant could be detected either in pure form or complexed with the added dopant species. This technique provides a rapid way to directly analyze beverages believed to be contaminated with non-narcotic adulterants at sensitivities similar to or exceeding those of traditional confirmatory analyses.

  20. CUSUM-Logistic Regression analysis for the rapid detection of errors in clinical laboratory test results.

    Science.gov (United States)

    Sampson, Maureen L; Gounden, Verena; van Deventer, Hendrik E; Remaley, Alan T

    2016-02-01

    The main drawback of the periodic analysis of quality control (QC) material is that test performance is not monitored in time periods between QC analyses, potentially leading to the reporting of faulty test results. The objective of this study was to develop a patient based QC procedure for the more timely detection of test errors. Results from a Chem-14 panel measured on the Beckman LX20 analyzer were used to develop the model. Each test result was predicted from the other 13 members of the panel by multiple regression, which resulted in correlation coefficients between the predicted and measured result of >0.7 for 8 of the 14 tests. A logistic regression model, which utilized the measured test result, the predicted test result, the day of the week and time of day, was then developed for predicting test errors. The output of the logistic regression was tallied by a daily CUSUM approach and used to predict test errors, with a fixed specificity of 90%. The mean average run length (ARL) before error detection by CUSUM-Logistic Regression (CSLR) was 20 with a mean sensitivity of 97%, which was considerably shorter than the mean ARL of 53 (sensitivity 87.5%) for a simple prediction model that only used the measured result for error detection. A CUSUM-Logistic Regression analysis of patient laboratory data can be an effective approach for the rapid and sensitive detection of clinical laboratory errors. Published by Elsevier Inc.
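
    A bare-bones sketch of the pipeline described above follows: predict each analyte from the other panel members by multiple regression, feed the measured and predicted values (plus time-of-day features) to a logistic regression, and tally its output with a one-sided CUSUM. The labelled error data, the drift term and the alarm threshold are illustrative assumptions, not the published model.

        import numpy as np
        from sklearn.linear_model import LinearRegression, LogisticRegression

        # X_panel: (n_samples, 14) patient results; column j is the test being monitored.
        def build_models(X_panel, j, error_labels, hour, weekday):
            others = np.delete(X_panel, j, axis=1)
            reg = LinearRegression().fit(others, X_panel[:, j])    # predict test j from the other 13
            feats = np.column_stack([X_panel[:, j], reg.predict(others), hour, weekday])
            clf = LogisticRegression(max_iter=1000).fit(feats, error_labels)  # error_labels assumed known
            return reg, clf

        def cusum_alarm(error_probs, drift=0.05, threshold=3.0):
            """One-sided CUSUM over the logistic-regression error probabilities."""
            s = 0.0
            for i, p in enumerate(error_probs):
                s = max(0.0, s + (p - drift))
                if s > threshold:
                    return i          # index of the result that triggers the alarm
            return None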

  1. NINJA data analysis with a detection pipeline based on the Hilbert-Huang transform

    International Nuclear Information System (INIS)

    Stroeer, Alexander; Camp, Jordan

    2009-01-01

    The NINJA data analysis challenge allowed the study of the sensitivity of data analysis pipelines to binary black hole numerical relativity waveforms in simulated Gaussian noise at the design level of the LIGO and VIRGO observatories. We analyzed NINJA data with a pipeline based on the Hilbert-Huang transform, utilizing a detection stage and a characterization stage: detection is performed by triggering on excess instantaneous power, and characterization is performed by displaying the kernel density enhanced (KD) time-frequency trace of the signal. Using the simulated data based on the two LIGO detectors, we were able to detect 77 of the 126 signals above a signal-to-noise ratio (SNR) of 5 in coincidence, with 43 missed events characterized by SNR < 10. Characterization of the detected signals revealed the merger part of the waveform in high time and frequency resolution, free from time-frequency uncertainty. We estimated the time lag of the signals between the detectors based on the optimal overlap of the individual KD time-frequency maps, yielding estimates accurate to within a fraction of a millisecond for half of the events. A coherent addition of the data sets according to the estimated time lag was eventually used in a final characterization of the event.

  2. Detector location selection based on VIP analysis in near-infrared detection of dural hematoma

    Directory of Open Access Journals (Sweden)

    Qiuming Sun

    2018-03-01

    Detection of dural hematoma based on multi-channel near-infrared differential absorbance has the advantages of rapid and non-invasive detection. The location and number of detectors around the light source are critical for the prediction of the pathological characteristic, i.e. the degree of dural hematoma. Therefore, rational selection of detector numbers and their distances from the light source is very important. In this paper, a detector position screening method based on Variable Importance in the Projection (VIP) analysis is proposed. A preliminary model for the prediction of the dural-position μa, based on the Partial Least Squares (PLS) method, was established using light absorbance information from 30 detectors located 2.0–5.0 cm from the light source at 0.1 cm intervals. The mean relative error (MRE) of the dural-position μa prediction model was 4.08%. After VIP analysis, the number of detectors was reduced from 30 to 4, and the MRE of the dural-position μa prediction was reduced from 4.08% to 2.06%. The prediction model after VIP detector screening still showed good prediction of the epidural-position μa. This study provides a new approach and an important reference for the selection of detector locations in near-infrared dural hematoma detection. Keywords: Detector location screening, Epidural hematoma detection, Variable importance in the projection
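
    The VIP screening step can be sketched with the standard VIP expression computed from a fitted partial least squares model; the snippet below uses scikit-learn's PLSRegression and keeps only detector channels whose VIP score exceeds 1. The matrix names, the number of latent components and the cut-off of 1 are illustrative assumptions rather than the authors' exact settings.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        def vip_scores(pls):
            """Variable Importance in the Projection for a fitted PLSRegression model."""
            T = pls.x_scores_       # (n_samples, n_components) latent scores
            W = pls.x_weights_      # (n_features, n_components) X weights
            Q = pls.y_loadings_     # (n_targets, n_components) y loadings
            p = W.shape[0]
            # amount of y variance explained by each latent component
            ssy = np.diag(T.T @ T @ Q.T @ Q)
            wnorm = W / np.linalg.norm(W, axis=0)
            return np.sqrt(p * (wnorm ** 2 @ ssy) / ssy.sum())

        # Hypothetical usage: X is (n_samples, 30) absorbance from the 30 candidate
        # detector positions and y is the dural-position mu_a to be predicted.
        # pls = PLSRegression(n_components=3).fit(X, y)
        # keep = vip_scores(pls) > 1.0   # retain only the most informative detector locations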

  3. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    Science.gov (United States)

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

    Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets, and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactive query, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable for a diverse set of satellite data and will be made publicly available for scientists in early 2017.
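
    A simplified sketch of a clustering-based, unsupervised outlier detector of the kind described above: fit k-means to the samples, then flag those that lie unusually far from every cluster centre. The number of clusters and the percentile threshold are illustrative assumptions; the abstract does not specify the framework's exact algorithm.

        import numpy as np
        from sklearn.cluster import KMeans

        def detect_outliers(X, n_clusters=8, pct=99.5):
            """Flag samples whose distance to the nearest cluster centre is unusually large."""
            km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X)
            # distance of each sample to its nearest cluster centre
            d = np.min(km.transform(X), axis=1)
            return d > np.percentile(d, pct)     # boolean mask of candidate anomalies

        # Grouping flagged samples that are adjacent in space and time would then turn
        # individual outliers into candidate "anomalous events" to rank and inspect.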

  4. Comparative analysis of peak-detection techniques for comprehensive two-dimensional chromatography.

    Science.gov (United States)

    Latha, Indu; Reichenbach, Stephen E; Tao, Qingping

    2011-09-23

    Comprehensive two-dimensional gas chromatography (GC×GC) is a powerful technology for separating complex samples. The typical goal of GC×GC peak detection is to aggregate data points of analyte peaks based on their retention times and intensities. Two techniques commonly used for two-dimensional peak detection are the two-step algorithm and the watershed algorithm. A recent study [4] compared the performance of the two-step and watershed algorithms for GC×GC data with retention-time shifts in the second-column separations. In that analysis, the peak retention-time shifts were corrected while applying the two-step algorithm but the watershed algorithm was applied without shift correction. The results indicated that the watershed algorithm has a higher probability of erroneously splitting a single two-dimensional peak than the two-step approach. This paper reconsiders the analysis by comparing peak-detection performance for resolved peaks after correcting retention-time shifts for both the two-step and watershed algorithms. Simulations with wide-ranging conditions indicate that when shift correction is employed with both algorithms, the watershed algorithm detects resolved peaks with greater accuracy than the two-step method. Copyright © 2011 Elsevier B.V. All rights reserved.

  5. Detection of Yersinia enterocolitica in milk powders by cross-priming amplification combined with immunoblotting analysis.

    Science.gov (United States)

    Zhang, Hongwei; Feng, Shaolong; Zhao, Yulong; Wang, Shuo; Lu, Xiaonan

    2015-12-02

    Yersinia enterocolitica (Y. enterocolitica) is frequently isolated from a wide variety of foods and can cause human yersiniosis. Biochemical and culture-based assays are common detection methods, but require a long incubation time and easily misidentify Y. enterocolitica as other non-pathogenic Yersinia species. Alternatively, cross-priming amplification (CPA) under isothermal conditions combined with immunoblotting analysis enables a more sensitive detection in a relatively short time period. A set of specific displacement primers, cross primers and testing primers was designed on the basis of six specific sequences in the Y. enterocolitica 16S-23S rDNA internal transcribed spacer. Under isothermal conditions, amplification and hybridization were conducted simultaneously at 63°C for 60 min. The specificity of CPA was tested for 96 different bacterial strains and 165 commercial milk powder samples. Two red lines developed on the BioHelix Express strip for all of the Y. enterocolitica strains, and one red line was shown for non-Y. enterocolitica strains. The limit of detection of CPA was 10⁰ fg for genomic DNA (1000 times more sensitive than the PCR assay), 10¹ CFU/ml for pure bacterial culture, and 10⁰ CFU per 100 g milk powder with pre-enrichment at 37°C for 24 h. CPA combined with immunoblotting analysis can achieve highly specific and sensitive detection of Y. enterocolitica in milk powder in 90 min after pre-enrichment. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. SDS-PAGE in conjunction with match lane statistical analysis for the detection of meat adulteration

    International Nuclear Information System (INIS)

    Hegazy, R.A.; Nassef, A.E.

    2003-01-01

    Sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE) was carried out on seven meat types and on two-component mixtures of them. The banding patterns of the resulting densitograms, in conjunction with cluster analysis and match lane statistical analysis, were used for the detection of meat adulteration. Using beef as a reference meat gave a clear distinction from goat, pork, chicken, turkey and camel meats and their mixtures. Using pork as the reference was more accurate because of the low degrees of matching with all meats and their mixtures and, consequently, a higher ability to differentiate. The purpose of identifying meat species arises from the general desire of consumers to confirm what they eat; for Muslims, establishing that meat is free from pork is most important. Another, economic, purpose is the detection of adulteration of valuable meat with less valuable types. Several attempts have been made in different laboratories to address this problem with various analytical techniques. Barbieri and Formi (1999) were able to detect 5% of a meat type in mixtures by isoelectric focusing and 1% of a meat type by a PCR technique in beef, pork, chicken and turkey meats. By the crossover immunoelectrophoresis technique, Zanon and Vianello (1998) were also able to detect a limit of 5% of a specific meat in mixtures of beef, pork, mutton/lamb, horse and chicken meats

  7. Controversies in using urine samples for prostate cancer detection: PSA and PCA3 expression analysis

    Directory of Open Access Journals (Sweden)

    S. Fontenete

    2011-12-01

    PURPOSE: Prostate cancer (PCa) is one of the most commonly diagnosed malignancies in the world. Although the use of PSA as a serum marker has improved prostate cancer detection, it still has some limitations, mainly regarding its specificity. The expression of this marker, along with the detection of PCA3 mRNA in urine samples, has been suggested as a new approach for PCa detection. The goal of this work was to evaluate the efficacy of urinary detection of PCA3 mRNA and PSA mRNA without performing the somewhat embarrassing prostate massage. It was also intended to optimize and implement a methodological protocol for this kind of sampling. MATERIALS AND METHODS: Urine samples from 57 patients with suspected prostate disease were collected without prior prostate massage. Increased serum PSA levels were confirmed by review of the medical records. RNA was extracted by different methods, and a preamplification step was included in order to improve gene detection by real-time PCR. RESULTS: An increase in RNA concentration was obtained with the use of TriPure Isolation Reagent. Despite this optimization, only 15.8% of the cases showed expression of PSA mRNA, and only 3.8% of prostate cancer patients presented detectable levels of PCA3 mRNA. The use of a preamplification step revealed no improvement in the results obtained. CONCLUSION: This work confirms that prostate massage is important before urine collection for gene expression analysis. Since PSA and PCA3 are prostate specific, it is necessary to promote the passage of cells from the prostate to the urinary tract in order to detect these genetic markers in urine samples.

  8. Automated analysis of retinal images for detection of referable diabetic retinopathy.

    Science.gov (United States)

    Abràmoff, Michael D; Folk, James C; Han, Dennis P; Walker, Jonathan D; Williams, David F; Russell, Stephen R; Massin, Pascale; Cochener, Beatrice; Gain, Philippe; Tang, Li; Lamard, Mathieu; Moga, Daniela C; Quellec, Gwénolé; Niemeijer, Meindert

    2013-03-01

    The diagnostic accuracy of computer detection programs has been reported to be comparable to that of specialists and expert readers, but no computer detection programs have been validated in an independent cohort using an internationally recognized diabetic retinopathy (DR) standard. To determine the sensitivity and specificity of the Iowa Detection Program (IDP) to detect referable diabetic retinopathy (RDR). In primary care DR clinics in France, from January 1, 2005, through December 31, 2010, patients were photographed consecutively, and retinal color images were graded for retinopathy severity according to the International Clinical Diabetic Retinopathy scale and macular edema by 3 masked independent retinal specialists and regraded with adjudication until consensus. The IDP analyzed the same images at a predetermined and fixed set point. We defined RDR as more than mild nonproliferative retinopathy and/or macular edema. A total of 874 people with diabetes at risk for DR. Sensitivity and specificity of the IDP to detect RDR, area under the receiver operating characteristic curve, sensitivity and specificity of the retinal specialists' readings, and mean interobserver difference (κ). The RDR prevalence was 21.7% (95% CI, 19.0%-24.5%). The IDP sensitivity was 96.8% (95% CI, 94.4%-99.3%) and specificity was 59.4% (95% CI, 55.7%-63.0%), corresponding to 6 of 874 false-negative results (none met treatment criteria). The area under the receiver operating characteristic curve was 0.937 (95% CI, 0.916-0.959). Before adjudication and consensus, the sensitivity/specificity of the retinal specialists were 0.80/0.98, 0.71/1.00, and 0.91/0.95, and the mean intergrader κ was 0.822. The IDP has high sensitivity and specificity to detect RDR. Computer analysis of retinal photographs for DR and automated detection of RDR can be implemented safely into the DR screening pipeline, potentially improving access to screening and health care productivity and reducing visual loss

  9. An assessment of independent component analysis for detection of military targets from hyperspectral images

    Science.gov (United States)

    Tiwari, K. C.; Arora, M. K.; Singh, D.

    2011-10-01

    Hyperspectral data, acquired over hundreds of narrow contiguous wavelength bands, are extremely suitable for target detection due to their high spectral resolution. Though the spectral response of every material is expected to be unique, in practice it exhibits variations, which are known as spectral variability. Most target detection algorithms depend on spectral modelling using a priori available target spectra; in practice, however, target spectra are seldom available a priori. Independent component analysis (ICA) is an evolving technique that aims at finding components which are statistically independent, or as independent as possible. The technique therefore has the potential to be used for target detection applications. An assessment of target detection from hyperspectral images using ICA and other algorithms based on spectral modelling is therefore of immense interest, since ICA does not require a priori target information. The aim of this paper is thus to assess the potential of an ICA-based algorithm vis-à-vis other prevailing algorithms for military target detection. Four spectral matching algorithms, namely Orthogonal Subspace Projection (OSP), Constrained Energy Minimisation (CEM), Spectral Angle Mapper (SAM) and Spectral Correlation Mapper (SCM), and four anomaly detection algorithms, namely the OSP anomaly detector (OSPAD), the Reed-Xiaoli anomaly detector (RXD), the Uniform Target Detector (UTD) and a combination of the Reed-Xiaoli anomaly detector and Uniform Target Detector (RXD-UTD), were considered. The experiments were conducted using a set of synthetic and AVIRIS hyperspectral images containing aircraft as military targets. A comparison of the true positive and false positive rates of the target detections obtained from ICA and the other algorithms, plotted in receiver operating characteristic (ROC) space, indicates the superior performance of ICA over the other algorithms.
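
    Of the spectral-matching detectors listed, the Spectral Angle Mapper is the simplest to illustrate: it thresholds the angle between each pixel spectrum and a reference target spectrum. The sketch below is a generic SAM; the angle threshold and array layout are assumptions for illustration, not the authors' implementation.

        import numpy as np

        def sam_detect(cube, target, max_angle_rad=0.10):
            """Spectral Angle Mapper: flag pixels whose spectrum lies within a small
            angle of the reference target spectrum.

            cube   : (rows, cols, bands) hyperspectral image
            target : (bands,) reference spectrum of the target material
            """
            flat = cube.reshape(-1, cube.shape[-1]).astype(float)
            cosang = (flat @ target) / (np.linalg.norm(flat, axis=1)
                                        * np.linalg.norm(target) + 1e-12)
            angles = np.arccos(np.clip(cosang, -1.0, 1.0))
            return (angles < max_angle_rad).reshape(cube.shape[:2])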

  10. Evaluation of different analysis and identification methods for Salmonella detection in surface drinking water sources

    International Nuclear Information System (INIS)

    Hsu, Bing-Mu; Huang, Kuan-Hao; Huang, Shih-Wei; Tseng, Kuo-Chih; Su, Ming-Jen; Lin, Wei-Chen; Ji, Dar-Der; Shih, Feng-Cheng; Chen, Jyh-Larng; Kao, Po-Min

    2011-01-01

    The standard method for detecting Salmonella generally analyzes food or fecal samples. Salmonella often occur in relatively low concentrations in environmental waters. Therefore, some form of concentration and proliferation may be needed. This study compares three Salmonella analysis methods and develops a new Salmonella detection procedure for use in environmental water samples. The new procedure for Salmonella detection includes water concentration, nutrient broth enrichment, selection of Salmonella-containing broth by PCR, isolation of Salmonella strains by selective culture plates, detection of possible Salmonella isolates by PCR, and biochemical testing. Serological assay and pulsed-field gel electrophoresis (PFGE) can be used to identify Salmonella serotype and genotype, respectively. This study analyzed 116 raw water samples taken from 18 water plants and belonging to 5 watersheds. Of these 116, 10 water samples (8.6%) taken from 7 water plants and belonging to 4 watersheds were positive for a Salmonella-specific polymerase chain reaction targeting the invA gene. Guided by serological assay results, this study identified 7 cultured Salmonella isolates as Salmonella enterica serovar: Alnaby, Enteritidis, Houten, Montevideo, Newport, Paratyphi B var. Java, and Victoria. These seven Salmonella serovars were identified in clinical cases for the same geographical areas, but only one of them was 100% homologous with clinical cases in the PFGE pattern. - Research highlights: → A new Salmonella detecting procedure for environmental water is developed. → Salmonella isolates are identified by serological assay and PFGE. → A total of seven Salmonella serovars is isolated from environmental water.

  11. Online Detection of Peroxidase Using 3D Printing, Active Magnetic Mixing, and Spectra Analysis

    Directory of Open Access Journals (Sweden)

    Shanshan Bai

    2017-01-01

    A new method for online detection of peroxidase (POD) using 3D printing, active magnetic mixing, fluidic control, and optical detection was developed and demonstrated in this study. The proposed POD detection system consisted of a fluidic chip based on 3D printing and active magnetic mixing for the online catalytic reaction, an optical detector with a fluidic flow cell for quantitative determination of the final catalysate, and a single-chip microcontroller-based controller for automatic control of two rotating magnetic fields and four precise peristaltic pumps. Horseradish peroxidase (HRP) was used as the research model, and a linear relationship between the absorbance at the characteristic wavelength of 450 nm and the HRP concentration over 1/4–1/128 μg mL⁻¹ was obtained as A = 0.257 ln(C) + 1.425 (R² = 0.976). For the HRP-spiked pork tests, the recoveries of HRP ranged from 93.5% to 110.4%, indicating that the proposed system is capable of detecting HRP in real samples. It has the potential to be extended to online detection of the activity of other enzymes and to integration with ELISA methods for biological and chemical analysis.

  12. Dynamical scene analysis with a moving camera: mobile targets detection system

    International Nuclear Information System (INIS)

    Hennebert, Christine

    1996-01-01

    This thesis work deals with the detection of moving objects in monocular image sequences acquired with a mobile camera. We propose a method able to detect small moving objects in visible or infrared images of real outdoor scenes. In order to detect objects with very low apparent motion, we consider an analysis over a large temporal interval. We chose to compensate for the dominant motion due to the camera displacement over several consecutive images, in order to form a sub-sequence of images for which the camera appears virtually static. We also developed a new approach for extracting the different layers of a real scene, in order to deal with cases where the 2D motion due to the camera displacement cannot be globally compensated for. To this end, we use a hierarchical model with two levels: a local merging step and a global merging step. An appropriate temporal filtering is then applied to the registered image sub-sequence to enhance the signals corresponding to moving objects. The detection issue is stated as a labeling problem within a statistical regularization framework based on Markov Random Fields. Our method has been validated on numerous real image sequences depicting complex outdoor scenes. Finally, the feasibility of an integrated circuit for mobile object detection has been demonstrated; this circuit could lead to the creation of an ASIC. (author) [fr]

  13. Analysis on Target Detection and Classification in LTE Based Passive Forward Scattering Radar

    Directory of Open Access Journals (Sweden)

    Raja Syamsul Azmir Raja Abdullah

    2016-09-01

    The passive bistatic radar (PBR) system can utilize illuminators of opportunity to enhance radar capability. Incorporating the forward scattering technique and procedure into a specific mode of PBR can provide an improvement in target detection and classification; the resulting system is known as passive Forward Scattering Radar (FSR). The passive FSR system can exploit the peculiar advantage of the enhancement in forward scatter radar cross section (FSRCS) for target detection. Thus, the aim of this paper is to show the feasibility of passive FSR for moving target detection and classification through experimental analysis and results. The signal source comes from the latest technology of 4G Long-Term Evolution (LTE) base stations. A detailed explanation of the passive FSR receiver circuit, the detection scheme and the classification algorithm is given. In addition, the proposed passive FSR circuit employs a self-mixing technique at the receiver; hence the synchronization signal from the transmitter is not required. The experimental results confirm the passive FSR system's capability for ground target detection and classification. Furthermore, this paper illustrates the first classification result in a passive FSR system. The great potential of the passive FSR system provides a new research area in passive radar that can be used for diverse remote monitoring applications.

  14. Trace elements detection in whole food samples by Neutron Activation Analysis, k0-method

    International Nuclear Information System (INIS)

    Sathler, Márcia Maia; Menezes, Maria Ângela de Barros Correia; Salles, Paula Maria Borges de

    2017-01-01

    Inorganic elements, from natural and anthropogenic sources, are present in foods in different concentrations. With the increase in anthropogenic activities, there has also been a considerable increase in the emission of these elements into the environment, leading to the need to monitor the elemental composition of foods available for consumption. Numerous techniques have been used to detect inorganic elements in biological and environmental matrices, always aiming at reaching lower detection limits in order to evaluate the trace element content of the sample. Neutron activation analysis (INAA), applying the k0-method, produces accurate and precise results without the need for chemical preparation of the samples, which could cause their contamination. This study evaluated the presence of inorganic elements, mainly at trace levels, in whole food samples. For this purpose, seven samples of different types of whole foods were irradiated in the TRIGA MARK I IPR-R1 research reactor located at CDTN/CNEN, in Belo Horizonte, MG. It was possible to detect twenty-two elements above the limit of detection in at least one of the samples analyzed. This study reaffirms the INAA k0-method as a safe and efficient technique for detecting trace elements in food samples. (author)

  15. Evaluation of different analysis and identification methods for Salmonella detection in surface drinking water sources

    Energy Technology Data Exchange (ETDEWEB)

    Hsu, Bing-Mu, E-mail: bmhsu@ccu.edu.tw [Department of Earth and Environmental Sciences, National Chung Cheng University, Chiayi, Taiwan, ROC (China); Huang, Kuan-Hao; Huang, Shih-Wei [Department of Earth and Environmental Sciences, National Chung Cheng University, Chiayi, Taiwan, ROC (China); Tseng, Kuo-Chih [Department of Internal Medicine, Buddhist Dalin Tzu Chi General Hospital, Chiayi, Taiwan, ROC (China); Su, Ming-Jen [Department of Clinical Pathology, Buddhist Dalin Tzu Chi General Hospital, Chiayi, Taiwan, ROC (China); Lin, Wei-Chen; Ji, Dar-Der [Research and Diagnostic Center, Centers for Disease Control, Taipei, Taiwan, ROC (China); Shih, Feng-Cheng; Chen, Jyh-Larng [Department of Environmental Engineering and Health, Yuanpei University of Science and Technology, HsinChu, Taiwan, ROC (China); Kao, Po-Min [Department of Earth and Environmental Sciences, National Chung Cheng University, Chiayi, Taiwan, ROC (China)

    2011-09-15

    The standard method for detecting Salmonella generally analyzes food or fecal samples. Salmonella often occur in relatively low concentrations in environmental waters. Therefore, some form of concentration and proliferation may be needed. This study compares three Salmonella analysis methods and develops a new Salmonella detection procedure for use in environmental water samples. The new procedure for Salmonella detection includes water concentration, nutrient broth enrichment, selection of Salmonella-containing broth by PCR, isolation of Salmonella strains by selective culture plates, detection of possible Salmonella isolates by PCR, and biochemical testing. Serological assay and pulsed-field gel electrophoresis (PFGE) can be used to identify Salmonella serotype and genotype, respectively. This study analyzed 116 raw water samples taken from 18 water plants and belonging to 5 watersheds. Of these 116, 10 water samples (8.6%) taken from 7 water plants and belonging to 4 watersheds were positive for a Salmonella-specific polymerase chain reaction targeting the invA gene. Guided by serological assay results, this study identified 7 cultured Salmonella isolates as Salmonella enterica serovar: Alnaby, Enteritidis, Houten, Montevideo, Newport, Paratyphi B var. Java, and Victoria. These seven Salmonella serovars were identified in clinical cases for the same geographical areas, but only one of them was 100% homologous with clinical cases in the PFGE pattern. - Research highlights: → A new Salmonella detecting procedure for environmental water is developed. → Salmonella isolates are identified by serological assay and PFGE. → A total of seven Salmonella serovars is isolated from environmental water.

  16. Temperature noise analysis and sodium boiling detection in the fuel failure mockup

    International Nuclear Information System (INIS)

    Sides, W.H. Jr.; Fry, D.N.; Leavell, W.H.; Mathis, M.V.; Saxe, R.F.

    1976-01-01

    Sodium temperature noise was measured at the exit of simulated, fast-reactor fuel subassemblies in the Fuel Failure Mockup (FFM) to determine the feasibility of using temperature noise monitors to detect flow blockages in fast reactors. Also, acoustic noise was measured to determine whether sodium boiling in the FFM could be detected acoustically and whether noncondensable gas entrained in the sodium coolant would affect the sensitivity of the acoustic noise detection system. Information from these studies would be applied to the design of safety systems for operating liquid-metal fast breeder reactors (LMFBRs). It was determined that the statistical properties of temperature noise are dependent on the shape of temperature profiles across the subassemblies, and that a blockage upstream of a thermocouple that increases the gradient of the profile near the blockage will also increase the temperature noise at the thermocouple. Amplitude probability analysis of temperature noise shows a skewed amplitude density function about the mean temperature that varies with the location of the thermocouple with respect to the blockage location. It was concluded that sodium boiling in the FFM could be detected acoustically. However, entrained noncondensable gas in the sodium coolant at void fractions greater than 0.4 percent attenuated the acoustic signals sufficiently that boiling was not detected. At a void fraction of 0.1 percent, boiling was indicated only by the two acoustic detectors closest to the boiling site

  17. Analysis of the Chirplet Transform-Based Algorithm for Radar Detection of Accelerated Targets

    Science.gov (United States)

    Galushko, V. G.; Vavriv, D. M.

    2017-06-01

    Purpose: Efficiency analysis of an optimal algorithm for chirp signal processing based on the chirplet transform, as applied to detection of radar targets in uniformly accelerated motion. Design/methodology/approach: Standard methods of optimal filtration theory are used to investigate the ambiguity function of chirp signals. Findings: An analytical expression has been derived for the ambiguity function of chirp signals and analyzed with respect to detection of radar targets moving at a constant acceleration. The sidelobe level and the characteristic width of the ambiguity function with respect to the frequency and frequency change rate coordinates have been estimated. The gain in signal-to-noise ratio provided by the algorithm under consideration has been assessed in comparison with application of the standard Fourier transform to detection of chirp signals against a "white" noise background. It is shown that already with a comparatively small number of processing channels (elementary filters with respect to the frequency change rate) the gain in signal-to-noise ratio exceeds 10 dB. A block diagram of an implementation of the algorithm under consideration is suggested on the basis of a multichannel weighted Fourier transform. Recommendations for the selection of the detection algorithm parameters have been developed. Conclusions: The obtained results testify to the efficiency of the algorithm under consideration for detection of radar targets moving at a constant acceleration. Nevertheless, it seems expedient to perform computer simulations of its operability, with account taken of noise impact, along with trial measurements in real conditions.
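
    The multichannel implementation mentioned in the conclusions amounts to a bank of de-chirp-then-Fourier-transform channels, one per trial frequency change rate. A schematic version of such a filter bank is shown below; the signal array, sampling rate and chirp-rate grid are illustrative assumptions, not the authors' design.

        import numpy as np

        def chirp_rate_bank(x, fs, chirp_rates):
            """Return the peak spectral magnitude for each trial chirp rate (Hz/s).

            Each channel multiplies the signal by exp(-j*pi*k*t^2) to cancel a linear
            frequency drift k, then applies an FFT; the channel whose rate matches the
            target's acceleration concentrates the energy into a single frequency bin.
            """
            t = np.arange(len(x)) / fs
            peaks = []
            for k in chirp_rates:
                dechirped = x * np.exp(-1j * np.pi * k * t**2)
                peaks.append(np.abs(np.fft.fft(dechirped)).max())
            return np.array(peaks)

        # Hypothetical usage: pick the chirp rate whose channel shows the strongest peak.
        # best_rate = chirp_rates[np.argmax(chirp_rate_bank(echo, fs, chirp_rates))]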

  18. Sensitivity Analysis Based SVM Application on Automatic Incident Detection of Rural Road in China

    Directory of Open Access Journals (Sweden)

    Xingliang Liu

    2018-01-01

    Traditional automatic incident detection methods, such as artificial neural networks, backpropagation neural networks, and Markov chains, are not well suited to the incident detection problem of rural roads in China, which have a relatively high accident rate and a low reaction speed caused by their characteristically small traffic volumes. This study applies the support vector machine (SVM) and parameter sensitivity analysis methods to build an accident detection algorithm for rural road conditions, based on real-time data collected in a field experiment. The sensitivity of four parameters (speed, front distance, vehicle group time interval, and free driving ratio) is analyzed, and the data sets of the two parameters with significant sensitivity are chosen to form the traffic state feature vector. The SVM and k-fold cross validation (K-CV) methods are used to build the accident detection algorithm, which shows excellent performance in detection accuracy (98.15% on the training data set and 87.5% on the testing data set). Therefore, the problem of the low incident reaction speed of rural roads in China could be solved to some extent.
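
    A compact sketch of the training and validation scheme described above, an SVM classifier on the two most sensitive traffic-state features evaluated with k-fold cross-validation, is given below. The feature columns, kernel choice and value of k are assumptions for illustration, not the published settings.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # X: (n_samples, 2) feature vectors, e.g. [speed, front distance]; y: 1 = incident, 0 = normal
        def train_incident_detector(X, y, k=5):
            clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
            scores = cross_val_score(clf, X, y, cv=k)      # k-fold cross-validation accuracy
            clf.fit(X, y)                                  # final model trained on all data
            return clf, scores.mean()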

  19. Trace elements detection in whole food samples by Neutron Activation Analysis, k0-method

    Energy Technology Data Exchange (ETDEWEB)

    Sathler, Márcia Maia; Menezes, Maria Ângela de Barros Correia, E-mail: maia.sathler@gmail.com, E-mail: menezes@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil); Salles, Paula Maria Borges de, E-mail: pauladesalles@yahoo.com.br [Universidade Federal de Minas Gerais (DEN/UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear

    2017-07-01

    Inorganic elements, from natural and anthropogenic sources, are present in foods in different concentrations. With the increase in anthropogenic activities, there has also been a considerable increase in the emission of these elements into the environment, leading to the need to monitor the elemental composition of foods available for consumption. Numerous techniques have been used to detect inorganic elements in biological and environmental matrices, always aiming at reaching lower detection limits in order to evaluate the trace element content of the sample. Neutron activation analysis (INAA), applying the k0-method, produces accurate and precise results without the need for chemical preparation of the samples, which could cause their contamination. This study evaluated the presence of inorganic elements, mainly at trace levels, in whole food samples. For this purpose, seven samples of different types of whole foods were irradiated in the TRIGA MARK I IPR-R1 research reactor located at CDTN/CNEN, in Belo Horizonte, MG. It was possible to detect twenty-two elements above the limit of detection in at least one of the samples analyzed. This study reaffirms the INAA k0-method as a safe and efficient technique for detecting trace elements in food samples. (author)

  20. Integrated Kidney Exosome Analysis for the Detection of Kidney Transplant Rejection.

    Science.gov (United States)

    Park, Jongmin; Lin, Hsing-Ying; Assaker, Jean Pierre; Jeong, Sangmoo; Huang, Chen-Han; Kurdi, A; Lee, Kyungheon; Fraser, Kyle; Min, Changwook; Eskandari, Siawosh; Routray, Sujit; Tannous, Bakhos; Abdi, Reza; Riella, Leonardo; Chandraker, Anil; Castro, Cesar M; Weissleder, Ralph; Lee, Hakho; Azzi, Jamil R

    2017-11-28

    Kidney transplant patients require life-long surveillance to detect allograft rejection. Repeated biopsy, albeit the clinical gold standard, is an invasive procedure with a risk of complications and comparatively high cost. Conversely, serum creatinine or urinary proteins are noninvasive alternatives but are late markers with low specificity. We report a urine-based platform to detect kidney transplant rejection. Termed iKEA (integrated kidney exosome analysis), the approach detects extracellular vesicles (EVs) released by immune cells into urine; we reasoned that T cells attacking kidney allografts would shed EVs, which in turn can be used as a surrogate marker for inflammation. We optimized iKEA to detect T-cell-derived EVs and implemented a portable sensing system. When applied to clinical urine samples, iKEA revealed high levels of CD3-positive EVs in kidney rejection patients and achieved high detection accuracy (91.1%). Fast, noninvasive, and cost-effective, iKEA could offer new opportunities in managing transplant recipients, perhaps even in a home setting.

  1. Analysis on Target Detection and Classification in LTE Based Passive Forward Scattering Radar.

    Science.gov (United States)

    Raja Abdullah, Raja Syamsul Azmir; Abdul Aziz, Noor Hafizah; Abdul Rashid, Nur Emileen; Ahmad Salah, Asem; Hashim, Fazirulhisyam

    2016-09-29

    The passive bistatic radar (PBR) system can utilize an illuminator of opportunity to enhance radar capability. Incorporating the forward scattering technique into a specific mode of PBR can provide an improvement in target detection and classification; the resulting system is known as passive Forward Scattering Radar (FSR). The passive FSR system can exploit the peculiar advantage of the enhancement in forward scatter radar cross section (FSRCS) for target detection. Thus, the aim of this paper is to show the feasibility of passive FSR for moving target detection and classification by experimental analysis and results. The signal source comes from the latest technology of 4G Long-Term Evolution (LTE) base stations. A detailed explanation of the passive FSR receiver circuit, the detection scheme and the classification algorithm is given. In addition, the proposed passive FSR circuit employs the self-mixing technique at the receiver; hence the synchronization signal from the transmitter is not required. The experimental results confirm the passive FSR system's capability for ground target detection and classification. Furthermore, this paper illustrates the first classification result in the passive FSR system. The great potential of the passive FSR system provides a new research area in passive radar that can be used for diverse remote monitoring applications.

  2. FTIR gas analysis with improved sensitivity and selectivity for CWA and TIC detection

    Science.gov (United States)

    Phillips, Charles M.; Tan, Huwei

    2010-04-01

    This presentation describes the use of an FTIR (Fourier Transform Infrared)-based spectrometer designed to continuously monitor ambient air for the presence of chemical warfare agents (CWAs) and toxic industrial chemicals (TICs). The necessity of a reliable system capable of quickly and accurately detecting very low levels of CWAs and TICs while simultaneously retaining a negligible false alarm rate will be explored. Technological advancements in FTIR sensing have reduced noise while increasing selectivity and speed of detection. These novel analyzer design characteristics are discussed in detail and descriptions are provided which show how optical throughput, gas cell form factor, and detector response are optimized. The hardware and algorithms described here will explain why this FTIR system is very effective for the simultaneous detection and speciation of a wide variety of toxic compounds at ppb concentrations. Analytical test data will be reviewed demonstrating the system's sensitivity to and selectivity for specific CWAs and TICs; this will include recent data acquired as part of the DHS ARFCAM (Autonomous Rapid Facility Chemical Agent Monitor) project. These results include analyses of the data from live agent testing for the determination of CWA detection limits, immunity to interferences, detection times, residual noise analysis and false alarm rates. Sensing systems such as this are critical for effective chemical hazard identification which is directly relevant to the CBRNE community.

  3. Time series analysis of infrared satellite data for detecting thermal anomalies: a hybrid approach

    Science.gov (United States)

    Koeppen, W. C.; Pilger, E.; Wright, R.

    2011-07-01

    We developed and tested an automated algorithm that analyzes thermal infrared satellite time series data to detect and quantify the excess energy radiated from thermal anomalies such as active volcanoes. Our algorithm enhances the previously developed MODVOLC approach, a simple point operation, by adding a more complex time series component based on the methods of the Robust Satellite Techniques (RST) algorithm. Using test sites at Anatahan and Kīlauea volcanoes, the hybrid time series approach detected ~15% more thermal anomalies than MODVOLC with very few, if any, known false detections. We also tested gas flares in the Cantarell oil field in the Gulf of Mexico as an end-member scenario representing very persistent thermal anomalies. At Cantarell, the hybrid algorithm showed only a slight improvement, but it did identify flares that were undetected by MODVOLC. We estimate that at least 80 MODIS images for each calendar month are required to create good reference images necessary for the time series analysis of the hybrid algorithm. The improved performance of the new algorithm over MODVOLC will result in the detection of low temperature thermal anomalies that will be useful in improving our ability to document Earth's volcanic eruptions, as well as detecting low temperature thermal precursors to larger eruptions.
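
    The hybrid idea, a fixed point-operation threshold (MODVOLC-style) combined with a per-pixel time-series test against reference statistics (RST-style), can be sketched as follows. The brightness-temperature values, thresholds, and index definition are illustrative assumptions rather than the published algorithm's parameters.

      import numpy as np

      rng = np.random.default_rng(1)

      # Synthetic nighttime brightness-temperature series for one pixel (K),
      # one value per image over ~80 reference scenes, plus the scene being tested.
      reference = rng.normal(285.0, 2.0, size=80)
      current = 297.0

      # 1) Point operation (MODVOLC-like): flag if the value exceeds an absolute
      #    threshold.  The threshold here is illustrative.
      point_flag = current > 295.0

      # 2) Time-series operation (RST-like): compare the current value with the
      #    pixel's own reference mean and standard deviation for this calendar month.
      mu, sigma = reference.mean(), reference.std(ddof=1)
      z_index = (current - mu) / sigma      # standardized anomaly index
      series_flag = z_index > 3.0           # illustrative K-sigma threshold

      print(f"point test: {point_flag}, time-series z = {z_index:.1f}, "
            f"hybrid detection: {point_flag or series_flag}")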

  4. A Comparative Analysis for Selection of Appropriate Mother Wavelet for Detection of Stationary Disturbances

    Science.gov (United States)

    Kamble, Saurabh Prakash; Thawkar, Shashank; Gaikwad, Vinayak G.; Kothari, D. P.

    2017-12-01

    Detection of disturbances is the first step of mitigation. Power electronics plays a crucial role in the modern power system: it makes system operation efficient, but it also introduces stationary disturbances and adds impurities to the supply. This happens because of the non-linear loads used in modern-day power systems, which inject disturbances such as harmonics, flicker, sag, etc. into the power grid. These impurities can damage equipment, so it is necessary to mitigate them very quickly, and digital signal processing techniques are therefore employed for detection. Signal processing techniques like the fast Fourier transform, short-time Fourier transform, wavelet transform, etc. are widely used for the detection of disturbances. Among them, the wavelet transform is widely used because of its better detection capabilities. However, which mother wavelet should be used for detection remains an open question. Depending on their periodicity, disturbances are classified as stationary or non-stationary. This paper presents the importance of the selection of the mother wavelet for analyzing stationary disturbances using the discrete wavelet transform. Signals with stationary disturbances of various frequencies are generated using MATLAB. The analysis of these signals is done using various mother wavelets, such as Daubechies and bi-orthogonal wavelets, and the measured root mean square value of the stationary disturbance is obtained. The value measured by the discrete wavelet transform is compared with the exact RMS value of the frequency component, and the percentage differences are presented, which helps to select the optimum mother wavelet.
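
    A small PyWavelets sketch of the comparison described above: the RMS of a stationary harmonic disturbance is estimated from the detail coefficients of the band that contains it and compared with the exact value for several mother wavelets. The test signal, sampling rate, decomposition level, and wavelet list are assumptions for illustration.

      import numpy as np
      import pywt

      fs = 3200.0
      t = np.arange(0, 0.64, 1.0 / fs)               # a few cycles of a 50 Hz supply
      fundamental = np.sin(2 * np.pi * 50 * t)
      harmonic = 0.2 * np.sin(2 * np.pi * 250 * t)    # stationary 5th-harmonic disturbance
      signal = fundamental + harmonic
      exact_rms = 0.2 / np.sqrt(2)

      # With fs = 3200 Hz, detail level d3 spans roughly 200-400 Hz, so the 250 Hz
      # disturbance energy should concentrate there (Parseval holds exactly only
      # for orthogonal wavelets such as Daubechies).
      for name in ["db4", "db8", "bior3.5", "bior6.8"]:
          coeffs = pywt.wavedec(signal, name, level=5)
          d3 = coeffs[-3]                             # ordering: [a5, d5, d4, d3, d2, d1]
          measured_rms = np.sqrt(np.sum(d3 ** 2) / len(signal))
          err = 100 * abs(measured_rms - exact_rms) / exact_rms
          print(f"{name:8s} RMS = {measured_rms:.4f} ({err:.1f}% from exact {exact_rms:.4f})")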

  5. Feasibility analysis of EDXRF method to detect heavy metal pollution in ecological environment

    Science.gov (United States)

    Hao, Zhixu; Qin, Xulei

    2018-02-01

    Changes in heavy metal content in the water environment, soil and plants can reflect changes in heavy metal pollution in the ecological environment, so it is important to monitor trends in heavy metal pollution by using the heavy metal content of water, soil and plants. However, the content of heavy metals in nature is very low, the background elements of water, soil and plant samples are complex, and there are many interfering factors in the EDXRF system that affect the spectral analysis results and reduce the detection accuracy. Through a comparative analysis of several detection methods for heavy metal elements, it is concluded that the EDXRF method is superior to other chemical methods in testing accuracy and feasibility when heavy metal pollution in soil is tested in the ecological environment.

  6. Theoretical analysis about early detection of hepatocellular carcinoma by medical imaging procedure

    Energy Technology Data Exchange (ETDEWEB)

    Odano, Ikuo; Hinata, Hiroshi; Hara, Keiji; Sakai, Kunio [Niigata Univ. (Japan). School of Medicine

    1983-04-01

    It is well known that patients with chronic hepatitis and liver cirrhosis are frequently accompanied by hepatocellular carcinoma (hepatoma); they are called the high-risk group for hepatoma. In order to detect a small hepatoma, it is reasonable to perform screening examinations on these high-risk patients. The optimal screening interval, however, has not been established. In this report, a theoretical analysis was made to estimate the optimal screening interval for imaging procedures such as ultrasonography, x-ray computed tomography and scintigraphy. From the analysis of eight cases, the mean doubling time of hepatoma was estimated at about four months (73 - 143 days). If we want to detect a hepatoma not greater than 3.0 cm in diameter, a medical screening procedure combining ultrasonography and scintigraphy should be performed about once every nine months.

  7. Novelty detection methods for online health monitoring and post data analysis of turbopumps

    International Nuclear Information System (INIS)

    Lei Hu; Niaoqing, Hu; Xinpeng, Zhang; Fengshou, Gu; Ming, Gao

    2013-01-01

    As novelty detection works when only normal data are available, it holds considerable promise for health monitoring in cases lacking fault samples and prior knowledge. We present two novelty detection methods for health monitoring of turbopumps in large-scale liquid propellant rocket engines. The first method is the adaptive Gaussian threshold model. This method is designed to monitor the vibration of the turbopumps online because it has minimal computational complexity and is easy to implement in real time. The second method is the one-class support vector machine (OCSVM), which is developed for post analysis of historical vibration signals. Via post analysis the method not only confirms the online monitoring results but also provides diagnostic results, so that faults originating from sensors are separated from those actually arising in the turbopumps. Both methods are validated to be efficient for health monitoring of the turbopumps.
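
    A minimal scikit-learn sketch of the second (post-analysis) method: a one-class SVM is trained only on features from healthy operation and then flags departures as novelties. The features, their distributions, and the nu parameter are synthetic placeholders, not values from the turbopump study.

      import numpy as np
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import OneClassSVM

      rng = np.random.default_rng(7)

      # Training data: features (e.g. RMS level, kurtosis) extracted from vibration
      # signals recorded during healthy turbopump operation only.
      healthy = np.column_stack([rng.normal(1.0, 0.1, 500), rng.normal(3.0, 0.3, 500)])

      # Test data: a mix of healthy segments and segments with elevated RMS/kurtosis.
      test_healthy = np.column_stack([rng.normal(1.0, 0.1, 50), rng.normal(3.0, 0.3, 50)])
      test_faulty = np.column_stack([rng.normal(1.8, 0.2, 10), rng.normal(6.0, 0.8, 10)])

      std = StandardScaler().fit(healthy)
      ocsvm = OneClassSVM(kernel="rbf", nu=0.02, gamma="scale")  # nu ~ expected outlier fraction
      ocsvm.fit(std.transform(healthy))

      pred_h = ocsvm.predict(std.transform(test_healthy))   # +1 = normal, -1 = novelty
      pred_f = ocsvm.predict(std.transform(test_faulty))
      print("false alarms on healthy segments:", np.sum(pred_h == -1))
      print("detected faulty segments:", np.sum(pred_f == -1), "of", len(pred_f))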

  8. Detecting chaos in particle accelerators through the frequency map analysis method.

    Science.gov (United States)

    Papaphilippou, Yannis

    2014-06-01

    The motion of beams in particle accelerators is dominated by a plethora of non-linear effects, which can enhance chaotic motion and limit their performance. The application of advanced non-linear dynamics methods for detecting and correcting these effects, and thereby increasing the region of beam stability, plays an essential role during the accelerator design phase as well as during operation. After describing the nature of non-linear effects and their impact on the performance parameters of different particle accelerator categories, the theory of non-linear particle motion is outlined. The recent developments in the methods employed for the analysis of chaotic beam motion are detailed. In particular, the ability of the frequency map analysis method to detect chaotic motion and guide the correction of non-linear effects is demonstrated in particle tracking simulations as well as in experimental data.

  9. Heteroduplex analysis of the dystrophin gene: Application to point mutation and carrier detection

    Energy Technology Data Exchange (ETDEWEB)

    Prior, T.W.; Papp, A.C.; Snyder, P.J.; Sedra, M.S.; Western, L.M.; Bartolo, C.; Mendell, J.R. [Ohio State Univ., Columbus, OH (United States); Moxley, R.T. [Univ. of Rochester Medical Center, NY (United States)

    1994-03-01

    Approximately one-third of Duchenne muscular dystrophy patients have undefined mutations in the dystrophin gene. For carrier and prenatal studies in families without detectable mutations, the indirect restriction fragment length polymorphism linkage approach is used. Using a multiplex amplification and heteroduplex analysis of dystrophin exons, the authors identified nonsense mutations in two DMD patients. Although the nonsense mutations are predicted to severely truncate the dystrophin protein, both patients presented with mild clinical courses of the disease. As a result of identifying the mutation in the affected boys, direct carrier studies by heteroduplex analysis were extended to other relatives. The authors conclude that the technique is not only ideal for mutation detection but is also useful for diagnostic testing. 29 refs., 4 figs.

  10. Detecting long-range correlation with detrended fluctuation analysis: Application to BWR stability

    Energy Technology Data Exchange (ETDEWEB)

    Espinosa-Paredes, Gilberto [Departamento de Ingenieria de Procesos e Hidraulica, Universidad Autonoma Metropolitana-Iztapalapa, Apartado Postal 55-534, Mexico, DF 09340 (Mexico)]. E-mail: gepe@xanum.uam.mx; Alvarez-Ramirez, Jose [Departamento de Ingenieria de Procesos e Hidraulica, Universidad Autonoma Metropolitana-Iztapalapa, Apartado Postal 55-534, Mexico, DF 09340 (Mexico); Vazquez, Alejandro [Departamento de Ingenieria de Procesos e Hidraulica, Universidad Autonoma Metropolitana-Iztapalapa, Apartado Postal 55-534, Mexico, DF 09340 (Mexico)

    2006-11-15

    The aim of this paper is to explore the application of detrended fluctuation analysis (DFA) to study boiling water reactor stability. DFA is a scaling method commonly used for detecting long-range correlations in non-stationary time series. The method is based on random walk theory and was applied to the neutronic power signal of the Forsmark stability benchmark. Our results show that the scaling properties break down during unstable oscillations.
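
    A compact NumPy sketch of the DFA scaling exponent follows; the window sizes and the synthetic test signals (white noise and a random walk) are illustrative and are not the Forsmark benchmark data.

      import numpy as np

      def dfa_exponent(x, scales):
          """Detrended fluctuation analysis: return the scaling exponent alpha."""
          y = np.cumsum(x - np.mean(x))           # integrated profile (random-walk analogy)
          fluctuations = []
          for s in scales:
              n_seg = len(y) // s
              segs = y[: n_seg * s].reshape(n_seg, s)
              t = np.arange(s)
              f2 = []
              for seg in segs:
                  coef = np.polyfit(t, seg, 1)    # local linear trend
                  f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
              fluctuations.append(np.sqrt(np.mean(f2)))
          alpha, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
          return alpha

      rng = np.random.default_rng(3)
      scales = np.unique(np.logspace(2, 3.2, 12).astype(int))
      white = rng.standard_normal(8192)           # uncorrelated noise, expect alpha ~ 0.5
      walk = np.cumsum(white)                     # strongly correlated, expect alpha ~ 1.5
      print("alpha(white noise) =", round(dfa_exponent(white, scales), 2))
      print("alpha(random walk) =", round(dfa_exponent(walk, scales), 2))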

  11. Trend analysis and change point detection of annual and seasonal temperature series in Peninsular Malaysia

    Science.gov (United States)

    Suhaila, Jamaludin; Yusop, Zulkifli

    2017-06-01

    Most trend analyses that have been conducted have not considered the existence of a change point in the time series. If a change point exists, the trend analysis will not be able to detect an obvious increasing or decreasing trend over certain parts of the series. Furthermore, the lack of discussion of the possible factors that influenced either the decreasing or the increasing trend in the series needs to be addressed in any trend analysis. Hence, this study investigates the trends and change point detection of mean, maximum and minimum temperature series, both annually and seasonally, in Peninsular Malaysia, and determines the possible factors that could contribute to the significant trends. In this study, the Pettitt and sequential Mann-Kendall (SQ-MK) tests were used to examine the occurrence of any abrupt climate changes in the independent series. The analyses of the abrupt changes in the temperature series suggested that most of the change points in Peninsular Malaysia were detected during the years 1996, 1997 and 1998. These change points captured by the Pettitt and SQ-MK tests are possibly related to climatic factors, such as El Niño and La Niña events. The findings also showed that the majority of the significant change points that exist in the series are related to significant trends at the stations. Significant increasing trends of annual and seasonal mean, maximum and minimum temperatures in Peninsular Malaysia were found, with a range of 2-5 °C/100 years during the last 32 years. It was observed that the magnitudes of the increasing trends in minimum temperatures were larger than those of the maximum temperatures at most of the studied stations, particularly the urban stations. These increases are suspected to be linked to the urban heat island effect in addition to El Niño events.
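
    The two tests named above can be sketched compactly: the Mann-Kendall statistic for a monotonic trend and the Pettitt statistic for a single abrupt shift, applied here to a synthetic annual temperature series with an imposed step change. The series, the ties-ignored variance formula, and the approximate p-values are simplifications for illustration.

      import numpy as np
      from scipy import stats

      def mann_kendall_z(x):
          """Mann-Kendall trend test statistic Z (ties ignored for brevity)."""
          n = len(x)
          s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
          var_s = n * (n - 1) * (2 * n + 5) / 18.0
          return (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0

      def pettitt_change_point(x):
          """Pettitt test: most probable change-point index and approximate p-value."""
          n = len(x)
          signs = np.sign(x[None, :] - x[:, None])          # sign(x_j - x_i)
          u = np.array([signs[: t + 1, t + 1 :].sum() for t in range(n - 1)])
          k = np.abs(u).max()
          p = 2.0 * np.exp(-6.0 * k**2 / (n**3 + n**2))
          return np.abs(u).argmax(), p

      rng = np.random.default_rng(2)
      years = np.arange(1980, 2012)
      temp = 26.5 + 0.02 * (years - 1980) + rng.normal(0, 0.15, len(years))
      temp[years >= 1997] += 0.4                            # imposed abrupt shift

      z = mann_kendall_z(temp)
      cp, p = pettitt_change_point(temp)
      print(f"MK Z = {z:.2f} (two-sided p = {2 * (1 - stats.norm.cdf(abs(z))):.4f})")
      print(f"Pettitt change point at year {years[cp]}, p ~ {p:.4f}")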

  12. State of the art in hair analysis for detection of drug and alcohol abuse.

    Science.gov (United States)

    Pragst, Fritz; Balikova, Marie A

    2006-08-01

    Hair differs from other materials used for toxicological analysis because of its unique ability to serve as a long-term storage of foreign substances with respect to the temporal appearance in blood. Over the last 20 years, hair testing has gained increasing attention and recognition for the retrospective investigation of chronic drug abuse as well as intentional or unintentional poisoning. In this paper, we review the physiological basics of hair growth, mechanisms of substance incorporation, analytical methods, result interpretation and practical applications of hair analysis for drugs and other organic substances. Improved chromatographic-mass spectrometric techniques with increased selectivity and sensitivity and new methods of sample preparation have improved detection limits from the ng/mg range to below pg/mg. These technical advances have substantially enhanced the ability to detect numerous drugs and other poisons in hair. For example, it was possible to detect previous administration of a single very low dose in drug-facilitated crimes. In addition to its potential application in large scale workplace drug testing and driving ability examination, hair analysis is also used for detection of gestational drug exposure, cases of criminal liability of drug addicts, diagnosis of chronic intoxication and in postmortem toxicology. Hair has only limited relevance in therapy compliance control. Fatty acid ethyl esters and ethyl glucuronide in hair have proven to be suitable markers for alcohol abuse. Hair analysis for drugs is, however, not a simple routine procedure and needs substantial guidelines throughout the testing process, i.e., from sample collection to results interpretation.

  13. Defect Detection in Alphonso using Statistical Method and Principal Component Analysis: A Non-Destructive Approach

    OpenAIRE

    Sandeep S. Musale; Pradeep M. Patil

    2014-01-01

    Natural image analysis uses textural property of the surface. Texture is defined as a spatial arrangement of local intensity attributes that are correlated within areas of visual scene corresponding to surface regions. Texture exhibits some sort of periodicity of the basic pattern of Spongy Tissue in alphonso mango. This leads to use textural property to identify different patterns of Spongy Tissue in alphonso for detection of defects in alphonso mango. Visual assessment of texture made by hu...

  14. Detecting long-range correlation with detrended fluctuation analysis: Application to BWR stability

    International Nuclear Information System (INIS)

    Espinosa-Paredes, Gilberto; Alvarez-Ramirez, Jose; Vazquez, Alejandro

    2006-01-01

    The aim of this paper is to explore the application of detrended fluctuation analysis (DFA) to study boiling water reactor stability. DFA is a scaling method commonly used for detecting long-range correlations in non-stationary time series. The method is based on random walk theory and was applied to the neutronic power signal of the Forsmark stability benchmark. Our results show that the scaling properties break down during unstable oscillations.

  15. Network analysis to detect common strategies in Italian foreign direct investment

    Science.gov (United States)

    De Masi, G.; Giovannetti, G.; Ricchiuti, G.

    2013-03-01

    In this paper we reconstruct and discuss the network of Italian firms investing abroad, exploiting information from complex network analysis. This method, detecting the key nodes of the system (both in terms of firms and countries of destination), allows us to single out the linkages among firms without ex-ante priors. Moreover, through the examination of affiliates’ economic activity, it allows us to highlight different internationalization strategies of “leaders” in different manufacturing sectors.

  16. An automatized frequency analysis for vine plot detection and delineation in remote sensing

    OpenAIRE

    Delenne , Carole; Rabatel , G.; Deshayes , M.

    2008-01-01

    The availability of an automatic tool for vine plot detection, delineation, and characterization would be very useful for management purposes. An automatic and recursive process using frequency analysis (with Fourier transform and Gabor filters) has been developed to meet this need. This results in the determination of vine plot boundary and accurate estimation of interrow width and row orientation. To foster large-scale applications, tests and validation have been carried out on standard ver...

  17. Caenorhabditis elegans Egg-Laying Detection and Behavior Study Using Image Analysis

    Directory of Open Access Journals (Sweden)

    Palm Megan

    2005-01-01

    Full Text Available Egg laying is an important phase of the life cycle of the nematode Caenorhabditis elegans (C. elegans. Previous studies examined egg-laying events manually. This paper presents a method for automatic detection of egg-laying onset using deformable template matching and other morphological image analysis techniques. Some behavioral changes surrounding egg-laying events are also studied. The results demonstrate that the computer vision tools and the algorithm developed here can be effectively used to study C. elegans egg-laying behaviors. The algorithm developed is an essential part of a machine-vision system for C. elegans tracking and behavioral analysis.

  18. Detection of colorectal cancer (CRC) by urinary volatile organic compound analysis.

    Science.gov (United States)

    Arasaradnam, Ramesh P; McFarlane, Michael J; Ryan-Fisher, Courtenay; Westenbrink, Erik; Hodges, Phoebe; Hodges, Paula; Thomas, Matthew G; Chambers, Samantha; O'Connell, Nicola; Bailey, Catherine; Harmston, Christopher; Nwokolo, Chuka U; Bardhan, Karna D; Covington, James A

    2014-01-01

    Colorectal cancer (CRC) is a leading cause of cancer related death in Europe and the USA. There is no universally accepted effective non-invasive screening test for CRC. Guaiac based faecal occult blood (gFOB) testing has largely been superseded by Faecal Immunochemical testing (FIT), but sensitivity still remains poor. The uptake of population based FOBt testing in the UK is also low at around 50%. The detection of volatile organic compounds (VOCs) signature(s) for many cancer subtypes is receiving increasing interest using a variety of gas phase analytical instruments. One such example is FAIMS (Field Asymmetric Ion Mobility Spectrometer). FAIMS is able to identify Inflammatory Bowel disease (IBD) patients by analysing shifts in VOCs patterns in both urine and faeces. This study extends this concept to determine whether CRC patients can be identified through non-invasive analysis of urine, using FAIMS. 133 patients were recruited; 83 CRC patients and 50 healthy controls. Urine was collected at the time of CRC diagnosis and headspace analysis undertaken using a FAIMS instrument (Owlstone, Lonestar, UK). Data was processed using Fisher Discriminant Analysis (FDA) after feature extraction from the raw data. FAIMS analyses demonstrated that the VOC profiles of CRC patients were tightly clustered and could be distinguished from healthy controls. Sensitivity and specificity for CRC detection with FAIMS were 88% and 60% respectively. This study suggests that VOC signatures emanating from urine can be detected in patients with CRC using ion mobility spectroscopy technology (FAIMS) with potential as a novel screening tool.

  19. Spike detection, characterization, and discrimination using feature analysis software written in LabVIEW.

    Science.gov (United States)

    Stewart, C M; Newlands, S D; Perachio, A A

    2004-12-01

    Rapid and accurate discrimination of single units from extracellular recordings is a fundamental process for the analysis and interpretation of electrophysiological recordings. We present an algorithm that performs detection, characterization, discrimination, and analysis of action potentials from extracellular recording sessions. The program was written entirely in LabVIEW (National Instruments) and requires no external hardware devices or a priori information about action potential shapes. Waveform events are detected by scanning the digital record for voltages that exceed a user-adjustable trigger. Detected events are characterized to determine nine different time and voltage levels for each event. Various algebraic combinations of these waveform features are used as axis choices for 2-D Cartesian plots of events. The user selects axis choices that generate distinct clusters. Multiple clusters may be defined as action potentials by manually generating boundaries of arbitrary shape. Events defined as action potentials are validated by visual inspection of overlain waveforms. Stimulus-response relationships may be identified by selecting any recorded channel for comparison to continuous and average cycle histograms of binned unit data. The algorithm includes novel aspects of feature analysis and acquisition, including higher acquisition rates for electrophysiological data compared to other channels. The program confirms that electrophysiological data may be discriminated with high speed and efficiency using algebraic combinations of waveform features derived from high-speed digital records.
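
    The LabVIEW code itself is not reproduced in the record, so the following NumPy sketch only illustrates the same pipeline: threshold-triggered event detection on a synthetic extracellular trace, waveform capture, and extraction of a few peak/trough/width features of the kind that would feed 2-D cluster plots. The sampling rate, spike shape, and robust threshold rule are assumptions.

      import numpy as np

      fs = 25_000                           # assumed sampling rate, Hz
      rng = np.random.default_rng(5)
      trace = rng.normal(0.0, 5.0, fs)      # 1 s of background noise (microvolts)

      # Inject a few stereotyped spikes.
      spike = 60.0 * np.exp(-((np.arange(40) - 10) ** 2) / 18.0) - \
              25.0 * np.exp(-((np.arange(40) - 24) ** 2) / 60.0)
      for onset in (3000, 9100, 17650):
          trace[onset:onset + 40] += spike

      threshold = 5.0 * np.median(np.abs(trace)) / 0.6745   # robust noise estimate
      crossings = np.flatnonzero((trace[1:] > threshold) & (trace[:-1] <= threshold))

      events = []
      for c in crossings:
          window = trace[max(c - 10, 0): c + 30]            # capture waveform around trigger
          peak, trough = window.max(), window.min()
          width = np.argmin(window) - np.argmax(window)     # peak-to-trough spacing (samples)
          events.append((c / fs, peak, trough, width))       # features for 2-D cluster plots

      for t0, peak, trough, width in events:
          print(f"event at {t0:.4f} s: peak={peak:.1f}, trough={trough:.1f}, width={width}")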

  20. Detection of toxin genes and RAPD analysis of bacillus cereus isolates from different soil types

    Directory of Open Access Journals (Sweden)

    Savic Dejana

    2015-01-01

    Full Text Available The aim of this study was to detect genes for enterotoxins (hbla, entFM and bceT) and for the emetic toxin (cer), to determine antibiotic resistance, and to estimate intraspecies diversity in B. cereus isolates by RAPD analysis. B. cereus was identified in 12 out of 117 indigenous Bacillus spp. using classical microbiological methods and PCR. All isolates were resistant to penicillin and ampicillin, two to tetracycline and four to trimethoprim-sulphamethoxazole. Also, all isolates produced inducible penicillinases and β-lactamase. Toxin genes were detected with PCR. The entFM and cer genes were present in all isolates, hbla in all but two, and bceT in none. RAPD analysis was performed with four different primers, two of them designed for this study. The intraspecies diversity revealed 10 different patterns at the 90% similarity level. Two separate clusters were formed regardless of soil type or utilization. The detection of genes encoding toxins in all B. cereus isolates indicates that these bacteria are potentially pathogenic and a serious risk to human health. Regardless of soil type or utilization, the RAPD analysis showed high intraspecies heterogeneity in B. cereus isolates. To the best of our knowledge, this is the first study to analyse the presence of entero- and emetic toxin genes and genetic heterogeneity in B. cereus isolates from different soil types and different soil utilization in Serbia. [Project of the Ministry of Science of the Republic of Serbia, No. TR37006]

  1. Screening Performance Characteristic of Ultrasonography and Radiography in Detection of Pleural Effusion; a Meta-Analysis.

    Science.gov (United States)

    Yousefifard, Mahmoud; Baikpour, Masoud; Ghelichkhani, Parisa; Asady, Hadi; Shahsavari Nia, Kavous; Moghadas Jafari, Ali; Hosseini, Mostafa; Safari, Saeed

    2016-01-01

    The role of ultrasonography in the detection of pleural effusion has long been a subject of interest, but controversial results have been reported. Accordingly, this study aims to conduct a systematic review of the available literature on the diagnostic value of ultrasonography and radiography in the detection of pleural effusion through a meta-analytic approach. An extended search was done in the databases of Medline, EMBASE, ISI Web of Knowledge, Scopus, Cochrane Library, and ProQuest. Two reviewers independently extracted the data and assessed the quality of the articles. Meta-analysis was performed using a mixed-effects binary regression model. Finally, subgroup analysis was carried out in order to find the sources of heterogeneity between the included studies. 12 studies were included in this meta-analysis (1554 subjects, 58.6% male). The pooled sensitivity of ultrasonography in the detection of pleural effusion was 0.94 (95% CI: 0.88-0.97; I2 = 84.23). Ultrasonography was found to perform better when the procedure was carried out by an intensivist or a radiologist using 5-10 MHz transducers. Chest ultrasonography, as a screening tool, has a higher diagnostic accuracy in the identification of pleural effusion compared to radiography. The sensitivity of this imaging modality was found to be higher when performed by a radiologist or an intensivist and using 5-10 MHz probes.

  2. Possible Detection of Perchlorates by the Sample Analysis at Mars (SAM) Instrument: Comparison with Previous Missions

    Science.gov (United States)

    Navarro-Gonzalez, Rafael; Sutter, Brad; Archer, Doug; Ming, Doug; Eigenbrode, Jennifer; Franz, Heather; Glavin, Daniel; McAdam, Amy; Stern, Jennifer; McKay, Christopher

    2013-01-01

    The first chemical analysis of soluble salts in the soil was carried out by the Phoenix Lander in the Martian Arctic [1]. Surprisingly, chlorine was present as magnesium or calcium perchlorate at 0.4 to 0.6 percent. Additional support for the identification of perchlorate came from the evolved gas analysis, which detected the release of molecular oxygen at 350-550C [1]. When Mars-like soils from the Atacama Desert were spiked with magnesium perchlorate (1 percent) and heated using the Viking GC-MS protocol, nearly all the organics were combusted but a small amount was chlorinated, forming chloromethane and dichloromethane [2]. These chlorohydrocarbons were detected by the Viking GC-MS experiments when the Martian soil was analyzed, but they were considered to be terrestrial contaminants [3]. Reinterpretation of the Viking results suggests that perchlorates may also have been present at the Viking landing sites. The Sample Analysis at Mars (SAM) instrument on board the Mars Science Laboratory (MSL) ran four samples from an aeolian bedform named Rocknest. The samples analyzed were portioned from the fifth scoop at this location. The samples were heated to 835C at 35C/min with a He flow. The SAM QMS detected a major oxygen release (300-500C) [5], coupled with the release of chlorinated hydrocarbons (chloromethane, dichloromethane, trichloromethane, and chloromethylpropene) detected both by SAM QMS and GC-MS, derived from known Earth organic contaminants in the instrument [6]. Calcium perchlorate appears to be the best candidate for the evolved O2 in the Rocknest samples at this time, but other Cl species (e.g., chlorates) are possible and must be evaluated. The potential detection of perchlorates in Rocknest material adds weight to the argument that both Viking Landers measured signatures of perchlorates. Even if the source of the organic carbon detected is still unknown, the chlorine source was likely Martian. Two mechanisms have been hypothesized for the formation of soil perchlorate: (1) Atmospheric oxidation of chlorine; and (2) UV photooxidation of

  3. Image analysis for the detection and quantification of concrete bugholes in a tunnel lining

    Directory of Open Access Journals (Sweden)

    Isamu Yoshitake

    2018-06-01

    Full Text Available A measurement and quantification system for concrete bugholes (surface air voids) on sidewalls was developed to quantify the surface quality of tunnel-lining concrete. The developed system uses and evaluates the red/green/blue values of color images taken by a commercial digital still camera. A comparative test shows that the developed system has higher accuracy than image analyses using thresholding and can estimate bugholes with accuracy almost equal to that of a detailed visual inspection. The results confirm that even small bugholes (<1 mm) can be detected in color image analysis, whereas such bugholes are hardly detected in a detailed visual survey. In addition, color image analysis improves the calculation of the area of multiple bugholes distributed randomly over a concrete surface. Fundamental tests employing image analysis demonstrate that the prevalence of bugholes increases with an increase in the negative angle of the concrete form and a decrease in concrete workability. The system is applicable to the quantitative evaluation of a concrete surface having visible and invisible bugholes. The results indicate that the developed color image analysis can contribute to a reasonable and appropriate evaluation of bugholes and can replace a detailed survey that requires considerable human resources and a long inspection time. Keywords: Bughole, Image analysis, Surface quality, Tunnel lining concrete, Laboratory test, Inspection

  4. Sequential capillary electrophoresis analysis using optically gated sample injection and UV/vis detection.

    Science.gov (United States)

    Liu, Xiaoxia; Tian, Miaomiao; Camara, Mohamed Amara; Guo, Liping; Yang, Li

    2015-10-01

    We present sequential CE analysis of amino acids and an L-asparaginase-catalyzed enzyme reaction, combining on-line derivatization, optically gated (OG) injection and commercially available UV-Vis detection. Various experimental conditions for sequential OG-UV/vis CE analysis were investigated and optimized by analyzing a standard mixture of amino acids. High reproducibility of the sequential CE analysis was demonstrated with RSD values (n = 20) of 2.23, 2.57, and 0.70% for peak heights, peak areas, and migration times, respectively, and LODs of 5.0 μM (for asparagine) and 2.0 μM (for aspartic acid) were obtained. With the application of the OG-UV/vis CE analysis, a sequential online CE enzyme assay of the L-asparaginase-catalyzed reaction was carried out by automatically and continuously monitoring the substrate consumption and the product formation every 12 s from the beginning to the end of the reaction. The Michaelis constants for the reaction were obtained and were found to be in good agreement with the results of traditional off-line enzyme assays. The study demonstrated the feasibility and reliability of integrating OG injection with UV/vis detection for sequential online CE analysis, which could be of potential value for online monitoring of various chemical reactions and bioprocesses. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. [Comparison of Discriminant Analysis and Decision Trees for the Detection of Subclinical Keratoconus].

    Science.gov (United States)

    Kleinhans, Sonja; Herrmann, Eva; Kohnen, Thomas; Bühren, Jens

    2017-08-15

    Background Iatrogenic keratectasia is one of the most dreaded complications of refractive surgery. In most cases, keratectasia develops after refractive surgery of eyes suffering from subclinical stages of keratoconus with few or no signs. Unfortunately, there has been no reliable procedure for the early detection of keratoconus. In this study, we used binary decision trees (recursive partitioning) to assess their suitability for discrimination between normal eyes and eyes with subclinical keratoconus. Patients and Methods The method of decision tree analysis was compared with discriminant analysis which has shown good results in previous studies. Input data were 32 eyes of 32 patients with newly diagnosed keratoconus in the contralateral eye and preoperative data of 10 eyes of 5 patients with keratectasia after laser in-situ keratomileusis (LASIK). The control group was made up of 245 normal eyes after LASIK and 12-month follow-up without any signs of iatrogenic keratectasia. Results Decision trees gave better accuracy and specificity than did discriminant analysis. The sensitivity of decision trees was lower than the sensitivity of discriminant analysis. Conclusion On the basis of the patient population of this study, decision trees did not prove to be superior to linear discriminant analysis for the detection of subclinical keratoconus. Georg Thieme Verlag KG Stuttgart · New York.

  6. Detection of somatic mutations by high-resolution DNA melting (HRM) analysis in multiple cancers.

    Science.gov (United States)

    Gonzalez-Bosquet, Jesus; Calcei, Jacob; Wei, Jun S; Garcia-Closas, Montserrat; Sherman, Mark E; Hewitt, Stephen; Vockley, Joseph; Lissowska, Jolanta; Yang, Hannah P; Khan, Javed; Chanock, Stephen

    2011-01-17

    Identification of somatic mutations in cancer is a major goal for understanding and monitoring the events related to cancer initiation and progression. High resolution melting (HRM) curve analysis represents a fast, post-PCR high-throughput method for scanning somatic sequence alterations in target genes. The aim of this study was to assess the sensitivity and specificity of HRM analysis for tumor mutation screening in a range of tumor samples, which included 216 frozen pediatric small rounded blue-cell tumors as well as 180 paraffin-embedded tumors from breast, endometrial and ovarian cancers (60 of each). HRM analysis was performed in exons of the following candidate genes known to harbor established commonly observed mutations: PIK3CA, ERBB2, KRAS, TP53, EGFR, BRAF, GATA3, and FGFR3. Bi-directional sequencing analysis was used to determine the accuracy of the HRM analysis. For the 39 mutations observed in frozen samples, the sensitivity and specificity of HRM analysis were 97% and 87%, respectively. There were 67 mutation/variants in the paraffin-embedded samples, and the sensitivity and specificity for the HRM analysis were 88% and 80%, respectively. Paraffin-embedded samples require higher quantity of purified DNA for high performance. In summary, HRM analysis is a promising moderate-throughput screening test for mutations among known candidate genomic regions. Although the overall accuracy appears to be better in frozen specimens, somatic alterations were detected in DNA extracted from paraffin-embedded samples.

  7. Detection of somatic mutations by high-resolution DNA melting (HRM) analysis in multiple cancers.

    Directory of Open Access Journals (Sweden)

    Jesus Gonzalez-Bosquet

    Full Text Available Identification of somatic mutations in cancer is a major goal for understanding and monitoring the events related to cancer initiation and progression. High resolution melting (HRM) curve analysis represents a fast, post-PCR high-throughput method for scanning somatic sequence alterations in target genes. The aim of this study was to assess the sensitivity and specificity of HRM analysis for tumor mutation screening in a range of tumor samples, which included 216 frozen pediatric small rounded blue-cell tumors as well as 180 paraffin-embedded tumors from breast, endometrial and ovarian cancers (60 of each). HRM analysis was performed in exons of the following candidate genes known to harbor established commonly observed mutations: PIK3CA, ERBB2, KRAS, TP53, EGFR, BRAF, GATA3, and FGFR3. Bi-directional sequencing analysis was used to determine the accuracy of the HRM analysis. For the 39 mutations observed in frozen samples, the sensitivity and specificity of HRM analysis were 97% and 87%, respectively. There were 67 mutation/variants in the paraffin-embedded samples, and the sensitivity and specificity for the HRM analysis were 88% and 80%, respectively. Paraffin-embedded samples require higher quantity of purified DNA for high performance. In summary, HRM analysis is a promising moderate-throughput screening test for mutations among known candidate genomic regions. Although the overall accuracy appears to be better in frozen specimens, somatic alterations were detected in DNA extracted from paraffin-embedded samples.

  8. Multivariate image analysis of laser-induced photothermal imaging used for detection of caries tooth

    Science.gov (United States)

    El-Sherif, Ashraf F.; Abdel Aziz, Wessam M.; El-Sharkawy, Yasser H.

    2010-08-01

    Time-resolved photothermal imaging has been investigated to characterize teeth for the purpose of discriminating between normal and carious areas of the hard tissue using a thermal camera. Ultrasonic thermoelastic waves were generated in hard tissue by the absorption of fiber-coupled Q-switched Nd:YAG laser pulses operating at 1064 nm, in conjunction with a laser-induced photothermal technique used to detect the thermal radiation waves for diagnosis of human teeth. The concepts behind the use of photo-thermal techniques for off-line detection of carious tooth features were presented by our group in earlier work. This paper illustrates the application of multivariate image analysis (MIA) techniques to detect the presence of caries. MIA is used to rapidly detect the presence and quantity of common caries features as the teeth are scanned by high-resolution color (RGB) thermal cameras. Multivariate principal component analysis is used to decompose the acquired three-channel tooth images into a two-dimensional principal component (PC) space. Masking score point clusters in the score space and highlighting the corresponding pixels in the image space of the two dominant PCs enables isolation of caries defect pixels based on contrast and color information. The technique provides a qualitative result that can be used for early-stage caries detection. The proposed technique can potentially be used on-line or in real time to prescreen for the existence of caries through vision-based systems such as a real-time thermal camera. Experimental results on a large number of extracted teeth, as well as a thermal image panorama of the teeth of a human volunteer, are investigated and presented.
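
    A minimal sketch of the multivariate image analysis step described above: each pixel's three channel values form one observation, the pixels are projected onto the first two principal components, and a simple mask in score space is mapped back to pixel coordinates to localise the anomalous region. The synthetic image and the score threshold are assumptions for illustration.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(11)

      # Synthetic 3-channel (RGB) image, 64 x 64: background plus a locally different patch.
      h, w = 64, 64
      img = rng.normal([120, 90, 60], 8.0, size=(h, w, 3))
      img[20:30, 35:48] += np.array([45.0, 30.0, -10.0])    # "defect" region

      # Unfold the image: one row per pixel, one column per channel.
      X = img.reshape(-1, 3)
      scores = PCA(n_components=2).fit_transform(X)

      # Mask a cluster in score space (extreme values along PC1) and map it back
      # to pixel coordinates to localise the candidate defect pixels.
      t1 = scores[:, 0]
      mask = (np.abs(t1 - t1.mean()) > 3.0 * t1.std()).reshape(h, w)
      if mask.any():
          ys, xs = np.nonzero(mask)
          print(f"{mask.sum()} pixels flagged; rows {ys.min()}-{ys.max()}, cols {xs.min()}-{xs.max()}")
      else:
          print("no pixels flagged")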

  9. Dissociation between recognition and detection advantage for facial expressions: a meta-analysis.

    Science.gov (United States)

    Nummenmaa, Lauri; Calvo, Manuel G

    2015-04-01

    Happy facial expressions are recognized faster and more accurately than other expressions in categorization tasks, whereas detection in visual search tasks is widely believed to be faster for angry than happy faces. We used meta-analytic techniques for resolving this categorization versus detection advantage discrepancy for positive versus negative facial expressions. Effect sizes were computed on the basis of the r statistic for a total of 34 recognition studies with 3,561 participants and 37 visual search studies with 2,455 participants, yielding a total of 41 effect sizes for recognition accuracy, 25 for recognition speed, and 125 for visual search speed. Random effects meta-analysis was conducted to estimate effect sizes at population level. For recognition tasks, an advantage in recognition accuracy and speed for happy expressions was found for all stimulus types. In contrast, for visual search tasks, moderator analysis revealed that a happy face detection advantage was restricted to photographic faces, whereas a clear angry face advantage was found for schematic and "smiley" faces. A robust detection advantage for nonhappy faces was observed even when stimulus emotionality was distorted by inversion or rearrangement of the facial features, suggesting that visual features primarily drive the search. We conclude that the recognition advantage for happy faces is a genuine phenomenon related to processing of facial expression category and affective valence. In contrast, detection advantages toward either happy (photographic stimuli) or nonhappy (schematic) faces are contingent on visual stimulus features rather than facial expression, and may not involve categorical or affective processing. (c) 2015 APA, all rights reserved.

  10. Similarity ratio analysis for early stage fault detection with optical emission spectrometer in plasma etching process.

    Directory of Open Access Journals (Sweden)

    Jie Yang

    Full Text Available A Similarity Ratio Analysis (SRA) method is proposed for early-stage Fault Detection (FD) in plasma etching processes using real-time Optical Emission Spectrometer (OES) data as input. The SRA method can help to realise a highly precise control system by detecting abnormal etch-rate faults in real-time during an etching process. The method processes spectrum scans at successive time points and uses a windowing mechanism over the time series to alleviate problems with timing uncertainties due to process shift from one process run to another. A SRA library is first built to capture features of a healthy etching process. By comparing with the SRA library, a Similarity Ratio (SR) statistic is then calculated for each spectrum scan as the monitored process progresses. A fault detection mechanism, named 3-Warning-1-Alarm (3W1A), takes the SR values as inputs and triggers a system alarm when certain conditions are satisfied. This design reduces the chance of false alarm, and provides a reliable fault reporting service. The SRA method is demonstrated on a real semiconductor manufacturing dataset. The effectiveness of SRA-based fault detection is evaluated using a time-series SR test and also using a post-process SR test. The time-series SR provides an early-stage fault detection service, so less energy and materials will be wasted by faulty processing. The post-process SR provides a fault detection service with higher reliability than the time-series SR, but with fault testing conducted only after each process run completes.
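
    The abstract does not give the exact definition of the Similarity Ratio statistic, so the sketch below substitutes a cosine-similarity ratio of each incoming OES scan against a windowed library of healthy reference scans, followed by a simple "three warnings then alarm" rule. All spectra, thresholds, and the similarity measure are illustrative assumptions, not the published method.

      import numpy as np

      rng = np.random.default_rng(8)
      n_channels = 512                      # OES wavelength channels (assumed)

      def make_scan(fault=0.0):
          wl = np.arange(n_channels)
          base = np.exp(-0.5 * ((wl - 200) / 30.0) ** 2)
          extra = fault * np.exp(-0.5 * ((wl - 320) / 20.0) ** 2)   # fault-related emission line
          return base + extra + 0.01 * rng.standard_normal(n_channels)

      # SRA-style library: healthy reference scans, several per time window.
      library = [np.stack([make_scan() for _ in range(20)]) for _ in range(50)]

      def similarity_ratio(scan, window):
          """Best cosine similarity between a scan and the healthy scans of one window."""
          ref = library[window]
          sims = ref @ scan / (np.linalg.norm(ref, axis=1) * np.linalg.norm(scan))
          return sims.max()

      # Monitored run: normal at first, then a growing spectral change from window 30 on.
      alarms, warnings = [], 0
      for w in range(50):
          scan = make_scan(fault=0.0 if w < 30 else 0.1 * (w - 29))
          sr = similarity_ratio(scan, w)
          warnings = warnings + 1 if sr < 0.99 else 0      # illustrative SR threshold
          if warnings >= 3:                                # "3 warnings, 1 alarm" style rule
              alarms.append(w)

      print("first alarm raised at window:", alarms[0] if alarms else "none")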

  11. Towards quantitative SERS detection of hydrogen cyanide at ppb level for human breath analysis

    Directory of Open Access Journals (Sweden)

    Rikke Kragh Lauridsen

    2015-09-01

    Full Text Available Lung infections with Pseudomonas aeruginosa (PA) are the most common cause of morbidity and mortality in cystic fibrosis (CF) patients. Due to its ready adaptation to the dehydrated mucosa of CF airways, PA infections tend to become chronic, eventually killing the patient. Hydrogen cyanide (HCN) at ppb level has been reported to be a PA biomarker. For early PA detection in CF children not yet chronically lung-infected, a non-invasive Surface-Enhanced Raman Spectroscopy (SERS)-based breath nanosensor is being developed. The triple bond between C and N in cyanide, with its characteristic band at ∼2133 cm−1, is an excellent case for SERS-based detection due to the infrequent occurrence of triple bonds in nature. For demonstration of direct HCN detection in the gas phase, a gold-coated silicon nanopillar substrate was exposed to 5 ppm HCN in N2. Results showed that HCN adsorbed on the SERS substrate can be consistently detected under different experimental conditions and up to 9 days after exposure. For detection of lower cyanide concentrations, serial dilution experiments using potassium cyanide (KCN) demonstrated cyanide quantification down to 1 μM in solution (corresponding to 18 ppb). Lower KCN concentrations of 10 and 100 nM (corresponding to 0.18 and 1.8 ppb) produced SERS intensities that were relatively similar to the reference signal. Since the HCN concentration in the breath of PA-colonized CF children is reported to be ∼13.5 ppb, the detection of cyanide is within the required range. Keywords: Surface-Enhanced Raman Spectroscopy, Hydrogen cyanide, Pseudomonas aeruginosa, Cystic fibrosis, Breath analysis

  12. Streak detection and analysis pipeline for space-debris optical images

    Science.gov (United States)

    Virtanen, Jenni; Poikonen, Jonne; Säntti, Tero; Komulainen, Tuomo; Torppa, Johanna; Granvik, Mikael; Muinonen, Karri; Pentikäinen, Hanna; Martikainen, Julia; Näränen, Jyri; Lehti, Jussi; Flohrer, Tim

    2016-04-01

    We describe a novel data-processing and analysis pipeline for optical observations of moving objects, either of natural (asteroids, meteors) or artificial origin (satellites, space debris). The monitoring of the space object populations requires reliable acquisition of observational data, to support the development and validation of population models and to build and maintain catalogues of orbital elements. The orbital catalogues are, in turn, needed for the assessment of close approaches (for asteroids, with the Earth; for satellites, with each other) and for the support of contingency situations or launches. For both types of populations, there is also increasing interest to detect fainter objects corresponding to the small end of the size distribution. The ESA-funded StreakDet (streak detection and astrometric reduction) activity has aimed at formulating and discussing suitable approaches for the detection and astrometric reduction of object trails, or streaks, in optical observations. Our two main focuses are objects in lower altitudes and space-based observations (i.e., high angular velocities), resulting in long (potentially curved) and faint streaks in the optical images. In particular, we concentrate on single-image (as compared to consecutive frames of the same field) and low-SNR detection of objects. Particular attention has been paid to the process of extraction of all necessary information from one image (segmentation), and subsequently, to efficient reduction of the extracted data (classification). We have developed an automated streak detection and processing pipeline and demonstrated its performance with an extensive database of semisynthetic images simulating streak observations both from ground-based and space-based observing platforms. The average processing time per image is about 13 s for a typical 2k-by-2k image. For long streaks (length >100 pixels), primary targets of the pipeline, the detection sensitivity (true positives) is about 90% for

  13. EEG analysis of seizure patterns using visibility graphs for detection of generalized seizures.

    Science.gov (United States)

    Wang, Lei; Long, Xi; Arends, Johan B A M; Aarts, Ronald M

    2017-10-01

    The traditional EEG features in the time and frequency domain show limited seizure detection performance in the epileptic population with intellectual disability (ID). In addition, the influence of EEG seizure patterns on detection performance was less studied. A single-channel EEG signal can be mapped into visibility graphs (VGS), including basic visibility graph (VG), horizontal VG (HVG), and difference VG (DVG). These graphs were used to characterize different EEG seizure patterns. To demonstrate its effectiveness in identifying EEG seizure patterns and detecting generalized seizures, EEG recordings of 615 h on one EEG channel from 29 epileptic patients with ID were analyzed. A novel feature set with discriminative power for seizure detection was obtained by using the VGS method. The degree distributions (DDs) of DVG can clearly distinguish EEG of each seizure pattern. The degree entropy and power-law degree power in DVG were proposed here for the first time, and they show a significant difference between seizure and non-seizure EEG. The connecting structure measured by HVG can better distinguish seizure EEG from background than that measured by VG and DVG. A traditional EEG feature set based on frequency analysis was used here as a benchmark feature set. With a support vector machine (SVM) classifier, the seizure detection performance of the benchmark feature set (sensitivity of 24%, FDt/h of 1.8 s) can be improved by combining our proposed VGS features extracted from one EEG channel (sensitivity of 38%, FDt/h of 1.4 s). The proposed VGS-based features can help improve seizure detection for ID patients. Copyright © 2017 Elsevier B.V. All rights reserved.
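
    Of the three graph types mentioned, the sketch below constructs only the natural visibility graph, using the textbook O(n^2) definition, and reports the mean degree and degree entropy for a noise-like and a rhythmic synthetic signal. The signals and summary statistics are illustrative; the optimized implementation and the HVG/DVG variants of the study are not reproduced.

      import numpy as np

      def visibility_degrees(x):
          """Degree of each node in the natural visibility graph of series x."""
          n = len(x)
          degree = np.zeros(n, dtype=int)
          for i in range(n - 1):
              for j in range(i + 1, n):
                  k = np.arange(i + 1, j)
                  # (i, j) are connected if all intermediate samples lie strictly below
                  # the straight line joining (i, x[i]) and (j, x[j]).
                  line = x[j] + (x[i] - x[j]) * (j - k) / (j - i)
                  if k.size == 0 or np.all(x[k] < line):
                      degree[i] += 1
                      degree[j] += 1
          return degree

      rng = np.random.default_rng(4)
      t = np.arange(400)
      background = rng.standard_normal(400)                       # EEG-like background
      rhythmic = 3.0 * np.sin(2 * np.pi * 3 * t / 256) + 0.3 * rng.standard_normal(400)

      for label, sig in [("background", background), ("rhythmic (seizure-like)", rhythmic)]:
          deg = visibility_degrees(sig)
          counts = np.bincount(deg)
          p = counts[counts > 0] / deg.size
          entropy = -np.sum(p * np.log(p))                        # degree entropy
          print(f"{label:24s} mean degree = {deg.mean():.2f}, degree entropy = {entropy:.2f}")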

  14. Soil Carbon Variability and Change Detection in the Forest Inventory Analysis Database of the United States

    Science.gov (United States)

    Wu, A. M.; Nater, E. A.; Dalzell, B. J.; Perry, C. H.

    2014-12-01

    The USDA Forest Service's Forest Inventory Analysis (FIA) program is a national effort assessing current forest resources to ensure sustainable management practices, to assist planning activities, and to report critical status and trends. For example, estimates of carbon stocks and stock change in FIA are reported as the official United States submission to the United Nations Framework Convention on Climate Change. While the main effort in FIA has been focused on aboveground biomass, soil is a critical component of this system. FIA sampled forest soils in the early 2000s and has remeasurement now underway. However, soil sampling is repeated on a 10-year interval (or longer), and it is uncertain what magnitude of changes in soil organic carbon (SOC) may be detectable with the current sampling protocol. We aim to identify the sensitivity and variability of SOC in the FIA database, and to determine the amount of SOC change that can be detected with the current sampling scheme. For this analysis, we attempt to answer the following questions: 1) What is the sensitivity (power) of SOC data in the current FIA database? 2) How does the minimum detectable change in forest SOC respond to changes in sampling intervals and/or sample point density? Soil samples in the FIA database represent 0-10 cm and 10-20 cm depth increments with a 10-year sampling interval. We are investigating the variability of SOC and its change over time for composite soil data in each FIA region (Pacific Northwest, Interior West, Northern, and Southern). To guide future sampling efforts, we are employing statistical power analysis to examine the minimum detectable change in SOC storage. We are also investigating the sensitivity of SOC storage changes under various scenarios of sample size and/or sample frequency. This research will inform the design of future FIA soil sampling schemes and improve the information available to international policy makers, university and industry partners, and the public.
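
    A hedged sketch of the kind of power calculation posed in the second question: for a paired remeasurement design analysed as a one-sample t-test on plot-level SOC differences, solve for the detectable effect size at a given alpha and power and convert it to a minimum detectable change. The standard deviation of change, plot counts, and design choice are placeholders, not FIA values (statsmodels is assumed to be available).

      from statsmodels.stats.power import TTestPower

      # Assumed inputs (placeholders, not FIA values):
      sd_change = 12.0      # standard deviation of per-plot SOC change, Mg C / ha
      alpha, power = 0.05, 0.80

      analysis = TTestPower()   # paired design: one-sample t-test on plot-level differences
      for n_plots in (100, 500, 1000, 3000):
          d = analysis.solve_power(effect_size=None, nobs=n_plots, alpha=alpha, power=power)
          mdc = d * sd_change   # minimum detectable mean change in original units
          print(f"n = {n_plots:5d} plots -> minimum detectable SOC change ~ {mdc:.2f} Mg C/ha")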

  15. Testing continuous earthquake detection and location in Alentejo (South Portugal) by waveform coherency analysis

    Science.gov (United States)

    Matos, Catarina; Grigoli, Francesco; Cesca, Simone; Custódio, Susana

    2015-04-01

    In the last decade a permanent seismic network of 30 broadband stations, complemented by dense temporary deployments, has covered Portugal. This extraordinary network coverage now enables the computation of a high-resolution image of the seismicity of Portugal, which in turn will shed light on the seismotectonics of Portugal. The large data volumes available cannot be analyzed by traditional, time-consuming manual location procedures. In this presentation we show first results on the automatic detection and location of earthquakes that occurred in a selected region in the south of Portugal. Our main goal is to implement an automatic earthquake detection and location routine in order to have a tool to quickly process large data sets, while at the same time detecting low-magnitude earthquakes (i.e., lowering the detection threshold). We present a modified version of the automatic seismic event location by waveform coherency analysis developed by Grigoli et al. (2013, 2014), designed to perform earthquake detections and locations on continuous data. The event detection is performed by continuously computing the short-term-average/long-term-average of two different characteristic functions (CFs). For the P phases we used a CF based on the vertical energy trace, while for S phases we used a CF based on the maximum eigenvalue of the instantaneous covariance matrix (Vidale 1991). Seismic event detection and location is obtained by performing waveform coherence analysis scanning different hypocentral coordinates. We apply this technique to earthquakes in the Alentejo region (South Portugal), taking advantage of a small-aperture seismic network installed in the south of Portugal for two years (2010 - 2011) during the DOCTAR experiment. In addition to the good network coverage, the Alentejo region was chosen for its simple tectonic setting and also because the relationship between seismicity, tectonics and local lithospheric structure is intriguing and still poorly understood. Inside
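
    The detection step relies on STA/LTA ratios of characteristic functions. Below is a compact non-recursive sketch: a vertical-energy CF and a moving-average STA/LTA with an illustrative trigger threshold, applied to a synthetic trace with one P-like arrival. Window lengths, sampling rate, and the threshold are assumptions.

      import numpy as np

      def sta_lta(cf, fs, sta_win=1.0, lta_win=30.0):
          """Classic STA/LTA ratio of a characteristic function via moving averages."""
          nsta, nlta = int(sta_win * fs), int(lta_win * fs)
          csum = np.concatenate(([0.0], np.cumsum(cf)))
          sta = (csum[nsta:] - csum[:-nsta]) / nsta
          lta = (csum[nlta:] - csum[:-nlta]) / nlta
          m = min(len(sta), len(lta))
          return sta[-m:] / np.maximum(lta[:m], 1e-12)   # aligned so both windows end together

      fs = 100.0
      rng = np.random.default_rng(6)
      n = int(120 * fs)                       # 2 minutes of vertical-component data
      trace = rng.normal(0, 1.0, n)
      onset = int(70 * fs)                    # synthetic P arrival at t = 70 s
      trace[onset:onset + int(5 * fs)] += 6.0 * rng.standard_normal(int(5 * fs))

      cf = trace ** 2                         # vertical-energy characteristic function
      ratio = sta_lta(cf, fs)
      trigger = np.flatnonzero(ratio > 5.0)   # illustrative trigger threshold
      if trigger.size:
          print(f"trigger at ~ {trigger[0] / fs + 30.0:.1f} s")   # offset by LTA window length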

  16. Hyperspectral wide gap second derivative analysis for in vivo detection of cervical intraepithelial neoplasia

    Science.gov (United States)

    Zheng, Wenli; Wang, Chaojian; Chang, Shufang; Zhang, Shiwu; Xu, Ronald X.

    2015-12-01

    Hyperspectral reflectance imaging has been used for in vivo detection of cervical intraepithelial neoplasia. However, the clinical outcome of this technique is suboptimal owing to multiple limitations such as nonuniform illumination, high-cost and bulky setups, and time-consuming data acquisition and processing. To overcome these limitations, we acquired the hyperspectral data cube over a wavelength range from 600 to 800 nm and processed it with a wide gap second derivative analysis method. This method effectively reduced the image artifacts caused by nonuniform illumination and background absorption. Furthermore, with second derivative analysis, only three specific wavelengths (620, 696, and 772 nm) are needed for tissue classification with optimal separability. Clinical feasibility of the proposed image analysis and classification method was tested in a clinical trial where cervical hyperspectral images from three patients were used for classification analysis. Our proposed method successfully classified the cervix tissue into three categories of normal, inflammation, and high-grade lesion. These classification results were consistent with those made by an experienced gynecologic oncologist after applying acetic acid. Our preliminary clinical study has demonstrated the technical feasibility of in vivo and noninvasive detection of cervical neoplasia without acetic acid. Further clinical research is needed in order to establish a large-scale diagnostic database and optimize the tissue classification technique.
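
    A minimal sketch of a wide-gap second-derivative transform of a reflectance spectrum; the gap size, band sampling, and data below are assumptions for illustration and are not taken from the study.

        import numpy as np

        def wide_gap_second_derivative(reflectance, wavelengths, gap_nm=20.0):
            """Second difference R(l - g) - 2*R(l) + R(l + g) evaluated with a wide gap g.
            A wide gap suppresses slowly varying illumination/background terms."""
            step = np.median(np.diff(wavelengths))
            g = int(round(gap_nm / step))
            d2 = np.full_like(reflectance, np.nan)
            d2[g:-g] = reflectance[:-2 * g] - 2.0 * reflectance[g:-g] + reflectance[2 * g:]
            return d2

        # Example: hyperspectral cube flattened to (n_pixels, n_bands), bands 600-800 nm.
        wavelengths = np.arange(600, 801, 2.0)               # 2 nm sampling (assumed)
        cube = np.random.rand(1000, wavelengths.size)        # placeholder data
        d2 = np.apply_along_axis(wide_gap_second_derivative, 1, cube, wavelengths)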

  17. Phase analysis in gated blood pool tomography. Detection of accessory conduction pathway

    Energy Technology Data Exchange (ETDEWEB)

    Nakajima, Kenichi; Bunko, Hisashi; Tada, Akira; Taki, Junichi; Nanbu, Ichiro (Kanazawa Univ. (Japan). School of Medicine)

    1984-02-01

    Phase analysis of gated blood pool studies has been applied to detect the site of the accessory conduction pathway (ACP) in the Wolff-Parkinson-White (WPW) syndrome; however, phase analysis alone is limited in detecting the precise location of the ACP. In this study, we applied phase analysis to gated blood pool tomography using seven-pinhole tomography (7PT) and gated emission computed tomography (GECT) in 21 patients with WPW syndrome and 3 normal subjects. In 17 patients, the sites of the ACPs were confirmed by epicardial mapping and the result of the surgical division of the ACP. In 7PT, the site of the ACP grossly agreed with the abnormal initial phase in the phase image in 5 out of 6 patients with the left cardiac type. In GECT, phase images were generated in short axial, vertical and horizontal long axial sections. In 8 out of 9 patients, the site of the ACP was correctly identified by phase images, and in a patient who had two ACPs, the initial phase corresponded to one of the two locations. Phase analysis of gated blood pool tomography has advantages in avoiding overlap of blood pools and in estimating the three-dimensional propagation of the contraction, and can be a good adjunctive method in patients with WPW syndrome.
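
    Phase analysis of gated studies is conventionally based on a first-harmonic Fourier fit of each pixel's time-activity curve; the sketch below shows that generic step with synthetic data and is not the authors' implementation.

        import numpy as np

        def first_harmonic_phase(gated_frames):
            """gated_frames: array (n_frames, ny, nx) of counts over one cardiac cycle.
            Returns phase (degrees) and amplitude images from the first Fourier harmonic."""
            spectrum = np.fft.fft(gated_frames, axis=0)
            first = spectrum[1]                               # first harmonic, per pixel
            phase = np.degrees(np.angle(first)) % 360.0
            amplitude = 2.0 * np.abs(first) / gated_frames.shape[0]
            return phase, amplitude

        # Example with synthetic 16-frame gated data on a 64x64 matrix.
        frames = np.random.poisson(100, size=(16, 64, 64)).astype(float)
        phase_img, amp_img = first_harmonic_phase(frames)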

  18. Detection of Early Faults in Rotating Machinery Based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Meng Hee Lim

    2013-01-01

    This paper explores the application of wavelet analysis for the detection of early changes in rotor dynamics caused by common machinery faults, namely rotor unbalance and minor blade rubbing conditions. In this paper, the time-synchronised wavelet analysis method was formulated and its effectiveness in detecting machinery faults at an early stage was evaluated based on signal simulation and experimental study. The proposed method provides a more standardised approach to visualise the current state of rotor dynamics of a rotating machine by taking into account the effects of time shift, wavelet edge distortion, and system noise suppression. The experimental results showed that this method is able to reveal subtle changes of the vibration signal characteristics in both the frequency content distribution and the amplitude distortion caused by minor rotor unbalance and blade rubbing conditions. In addition, this method also appeared to be an effective tool to diagnose and to discriminate between the different types of machinery faults based on the unique pattern of the wavelet contours. This study shows that the proposed wavelet analysis method is promising for revealing machinery faults at an early stage as compared to vibration spectrum analysis.

  19. Automatic facial pore analysis system using multi-scale pore detection.

    Science.gov (United States)

    Sun, J Y; Kim, S W; Lee, S H; Choi, J E; Ko, S J

    2017-08-01

    As facial pore widening and its treatments have become common concerns in the beauty care field, the need for an objective pore-analyzing system has increased. Conventional apparatuses lack usability, requiring strong light sources and a cumbersome photographing process, and they often yield unsatisfactory analysis results. This study was conducted to develop an image processing technique for automatic facial pore analysis. The proposed method detects facial pores using a multi-scale detection and optimal scale selection scheme and then extracts pore-related features such as total area, average size, depth, and the number of pores. Facial photographs of 50 subjects were graded by two expert dermatologists, and correlation analyses between the features and clinical grading were conducted. We also compared our analysis results with those of conventional pore-analyzing devices. The number of large pores and the average pore size were highly correlated with the severity of pore enlargement. In comparison with the conventional devices, the proposed analysis system achieved better performance, showing stronger correlation with the clinical grading. The proposed system is highly accurate and reliable for measuring the severity of skin pore enlargement. It can be suitably used for objective assessment of pore tightening treatments. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  20. Censoring approach to the detection limits in X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Pajek, M.; Kubala-Kukus, A.

    2004-01-01

    We demonstrate that the effect of detection limits in X-ray fluorescence analysis (XRF), which limits the determination of very low concentrations of trace elements and results in the appearance of so-called 'nondetects', can be accounted for using the statistical concept of censoring. More precisely, the results of such measurements can be viewed as left randomly censored data, which can further be analyzed using the Kaplan-Meier method correcting the data for the presence of nondetects. Using this approach, the results of measured, detection-limit-censored concentrations can be interpreted in a nonparametric manner including the correction for the nondetects, i.e. the measurements in which the concentrations were found to be below the actual detection limits. Moreover, using the Monte Carlo simulation technique we show that by using the Kaplan-Meier approach the corrected mean concentrations for a population of samples can be estimated within a few percent uncertainty with respect to the simulated, uncensored data. This practically means that the final uncertainties of the estimated mean values are limited by the number of studied samples and not by the correction procedure itself. The discussed random left-censoring approach was applied to analyze the XRF detection-limit-censored concentration measurements of trace elements in biomedical samples.
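
    As an illustration of the censoring idea, the sketch below estimates the mean of left-censored concentrations by flipping them into right-censored "survival" data and applying a plain Kaplan-Meier estimator; it is a generic sketch with invented data, not the authors' code.

        import numpy as np

        def km_mean(times, observed):
            """Kaplan-Meier estimate of the mean for right-censored data.
            times: observed values; observed: 1 if observed, 0 if censored."""
            order = np.argsort(times)
            t, d = times[order], observed[order]
            n = len(t)
            at_risk = n - np.arange(n)
            surv = np.cumprod(1.0 - d / at_risk)
            # Restricted mean = area under the survival curve up to the largest value.
            t_prev = np.concatenate(([0.0], t[:-1]))
            s_prev = np.concatenate(([1.0], surv[:-1]))
            return np.sum(s_prev * (t - t_prev))

        def mean_left_censored(conc, detected, flip_at=None):
            """conc: measured concentration, or the detection limit when not detected.
            detected: 1 for a real measurement, 0 for a nondetect (left-censored)."""
            conc = np.asarray(conc, dtype=float)
            detected = np.asarray(detected)
            M = flip_at if flip_at is not None else conc.max() + 1.0
            flipped = M - conc            # left-censoring becomes right-censoring
            return M - km_mean(flipped, detected)

        # Invented example: trace-element concentrations with two nondetects (<0.5).
        conc     = np.array([2.1, 0.5, 1.3, 0.8, 0.5, 3.4])
        detected = np.array([1,   0,   1,   1,   0,   1])
        print(mean_left_censored(conc, detected))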

  1. Accuracy analysis of the CTBTO nuclear test detection scale and Improvement

    Energy Technology Data Exchange (ETDEWEB)

    Ryu, Young Kwang [Seoul National Univ., Seoul (Korea, Republic of)

    2014-10-15

    The CTBTO (Comprehensive Nuclear-Test-Ban Treaty Organization) is in charge of nuclear test monitoring for nuclear non-proliferation. The CTBTO operates 170 seismic stations in 76 countries in order to detect artificial earthquakes caused by underground nuclear tests. Korea uses a formula based on the equations used by the CTBTO's IMS (International Monitoring System) to analyze explosive yield, adjusted to reflect the nature of the terrain, such as the rock type. However, an expression for calculating the exact explosive yield has not yet been established. Moreover, the CTBTO generally does not follow up on low-yield artificial explosions that fall below its nuclear detection criteria. However, when North Korea conducts a nuclear test, seismic events below the detection criteria should not be overlooked. Because the DPRK is trying to conceal its nuclear development capability, there is a possibility of a low-yield nuclear test, or of installing a buffer to hide the actual explosive yield. Radionuclide observations were consistent with a DPRK low-yield nuclear test in May 2010, even though no seismic signals from such a test were detected; however, a few low-magnitude events (magnitude 1.39-1.93) did occur around the DPRK nuclear test site at that time.

  2. Rapid detection and identification of four major Schistosoma species by high-resolution melt (HRM) analysis.

    Science.gov (United States)

    Li, Juan; Zhao, Guang-Hui; Lin, RuiQing; Blair, David; Sugiyama, Hiromu; Zhu, Xing-Quan

    2015-11-01

    Schistosomiasis, caused by blood flukes belonging to several species of the genus Schistosoma, is a serious and widespread parasitic disease. Accurate and rapid differentiation of these etiological agents of animal and human schistosomiasis to species level can be difficult. We report a real-time PCR assay coupled with a high-resolution melt (HRM) assay targeting a portion of the nuclear 18S rDNA to detect, identify, and distinguish between four major blood fluke species (Schistosoma japonicum, Schistosoma mansoni, Schistosoma haematobium, and Schistosoma mekongi). Using this system, the Schistosoma spp. were accurately identified and could also be distinguished from all other trematode species with which they were compared. As little as 10⁻⁵ ng of genomic DNA from a Schistosoma sp. could be detected. This process is inexpensive, easy, and can be completed within 3 h. Examination of 21 representative Schistosoma samples from 15 geographical localities in seven endemic countries validated the value of the HRM detection assay and proved its reliability. The melting curves were characterized by peaks of 83.65 °C for S. japonicum and S. mekongi, 85.65 °C for S. mansoni, and 85.85 °C for S. haematobium. The present study developed a real-time PCR assay coupled with HRM analysis for detection and differential identification of S. mansoni, S. haematobium, S. japonicum, and S. mekongi. This method is rapid, sensitive, and inexpensive. It has important implications for epidemiological studies of Schistosoma.
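
    A toy sketch of how melting temperatures are usually read from HRM data (the peak of the negative derivative of fluorescence with respect to temperature) and matched to the species-specific peaks reported above; the smoothing-free derivative, tolerance, and synthetic melt curve are placeholders.

        import numpy as np

        REFERENCE_TM = {"S. japonicum / S. mekongi": 83.65,
                        "S. mansoni": 85.65,
                        "S. haematobium": 85.85}

        def melting_temperature(temperature, fluorescence):
            """Tm = temperature at the peak of -dF/dT."""
            dF = -np.gradient(fluorescence, temperature)
            return temperature[np.argmax(dF)]

        def classify(tm, tolerance=0.15):
            """Assign the nearest reference melting peak, or 'unresolved' if too far away."""
            name, ref = min(REFERENCE_TM.items(), key=lambda kv: abs(kv[1] - tm))
            return name if abs(ref - tm) <= tolerance else "unresolved"

        # Synthetic melt curve with a transition near 85.6 degC.
        T = np.arange(75.0, 95.0, 0.05)
        F = 1.0 / (1.0 + np.exp((T - 85.6) / 0.4))
        print(classify(melting_temperature(T, F)))   # -> "S. mansoni"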

  3. Power analysis of QTL detection in half-sib families using selective DNA pooling

    Directory of Open Access Journals (Sweden)

    López Teresa

    2001-05-01

    Individual loci of economic importance (QTL) can be detected by comparing the inheritance of a trait and the inheritance of loci with alleles readily identifiable by laboratory methods (genetic markers). Data on allele segregation at the individual level are costly, and alternatives have been proposed that make use of allele frequencies among progeny rather than individual genotypes. Among the factors that may affect the power of the setup, the most important are those intrinsic to the QTL: the additive effect of the QTL, its dominance, and the distance between markers and QTL. Other factors relate to the choice of animals and markers, such as the frequency of the QTL and marker alleles among dams and sires. Data collection may affect the detection power through the size of half-sib families, the selection rate within families, and the technical error incurred when estimating genetic frequencies. We present results of a sensitivity analysis for QTL detection using pools of DNA from selected half-sibs. Simulations showed that conclusive detection may be achieved with families of at least 500 half-sibs if sires are chosen on the criterion that, among dams, most of their marker alleles are either both missing or one is fixed.

  4. The detection of great crested newts year round via environmental DNA analysis.

    Science.gov (United States)

    Rees, Helen C; Baker, Claire A; Gardner, David S; Maddison, Ben C; Gough, Kevin C

    2017-07-26

    Analysis of environmental DNA (eDNA) is a method that has been used for the detection of various species within water bodies. The great crested newt (Triturus cristatus) has a short eDNA survey season (mid-April to June). Here we investigate whether this season could be extended into other months using the current methodology as stipulated by Natural England. Here we present data to show that in monthly water samples taken from two ponds (March 2014-February 2015) we were able to detect great crested newt DNA in all months in at least one of the ponds. Similar levels of great crested newt eDNA (i.e. highly positive identification) were detected through the months of March-August, suggesting it may be possible to extend the current survey window. In order to determine how applicable these observations are for ponds throughout the rest of the UK, further work in multiple other ponds over multiple seasons is suggested. Nevertheless, the current work clearly demonstrates, in two ponds, the efficacy and reproducibility of eDNA detection for determining the presence of great crested newts.

  5. Experimental study on the crack detection with optimized spatial wavelet analysis and windowing

    Science.gov (United States)

    Ghanbari Mardasi, Amir; Wu, Nan; Wu, Christine

    2018-05-01

    In this paper, a highly sensitive crack detection method is experimentally realized and presented for a beam under a certain deflection by optimizing spatial wavelet analysis. Due to the existence of a crack in the beam structure, a perturbation/slope singularity is induced in the deflection profile. Spatial wavelet transformation works as a magnifier to amplify the small perturbation signal at the crack location in order to detect and localize the damage. The profile of a deflected aluminum cantilever beam is obtained for both intact and cracked beams by a high-resolution laser profile sensor. Gabor wavelet transformation is applied to the subtraction of the intact and cracked data sets. To improve detection sensitivity, the scale factor in the spatial wavelet transformation and the number of transformation repetitions are optimized. Furthermore, to detect a possible crack close to the measurement boundaries, the wavelet transformation edge effect, which induces large values of the wavelet coefficient around the measurement boundaries, is efficiently reduced by introducing different windowing functions. The result shows that a small crack with a depth of less than 10% of the beam height can be localized with a clear perturbation. Moreover, the perturbation caused by a crack at 0.85 mm away from one end of the measurement range, which is covered by the wavelet transform edge effect, emerges when proper window functions are applied.

  6. Analysis of image heterogeneity using 2D Minkowski functionals detects tumor responses to treatment.

    Science.gov (United States)

    Larkin, Timothy J; Canuto, Holly C; Kettunen, Mikko I; Booth, Thomas C; Hu, De-En; Krishnan, Anant S; Bohndiek, Sarah E; Neves, André A; McLachlan, Charles; Hobson, Michael P; Brindle, Kevin M

    2014-01-01

    The acquisition of ever-increasing volumes of high resolution magnetic resonance imaging (MRI) data has created an urgent need to develop automated and objective image analysis algorithms that can assist in determining tumor margins, diagnosing tumor stage, and detecting treatment response. We have shown previously that Minkowski functionals, which are precise morphological and structural descriptors of image heterogeneity, can be used to enhance the detection, in T1-weighted images, of a targeted Gd(3+)-chelate-based contrast agent for detecting tumor cell death. We have used Minkowski functionals here to characterize heterogeneity in T2-weighted images acquired before and after drug treatment, and obtained without contrast agent administration. We show that Minkowski functionals can be used to characterize the changes in image heterogeneity that accompany treatment of tumors with a vascular disrupting agent, combretastatin A4-phosphate, and with a cytotoxic drug, etoposide. Parameterizing changes in the heterogeneity of T2-weighted images can be used to detect early responses of tumors to drug treatment, even when there is no change in tumor size. The approach provides a quantitative and therefore objective assessment of treatment response that could be used with other types of MR image and also with other imaging modalities. Copyright © 2013 Wiley Periodicals, Inc.
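
    The three 2D Minkowski functionals of a thresholded image are its area, perimeter, and Euler characteristic; the sketch below computes simple discrete approximations with NumPy/SciPy only and is a generic illustration, not the analysis pipeline used in the study.

        import numpy as np
        from scipy import ndimage

        def minkowski_functionals_2d(mask):
            """mask: 2D boolean array (image thresholded at some intensity level).
            Returns (area, perimeter, euler) as simple discrete approximations."""
            mask = np.asarray(mask, dtype=bool)
            area = int(mask.sum())
            # Perimeter: foreground/background edge count plus foreground pixels on the image border.
            perimeter = int(np.sum(mask[:, 1:] != mask[:, :-1]) +
                            np.sum(mask[1:, :] != mask[:-1, :]) +
                            mask[0, :].sum() + mask[-1, :].sum() +
                            mask[:, 0].sum() + mask[:, -1].sum())
            # Euler characteristic: connected components minus enclosed holes.
            n_objects = ndimage.label(mask)[1]
            bg_labels, _ = ndimage.label(~mask)
            touching_border = set(np.unique(np.concatenate(
                [bg_labels[0, :], bg_labels[-1, :], bg_labels[:, 0], bg_labels[:, -1]]))) - {0}
            holes = len(set(np.unique(bg_labels)) - {0} - touching_border)
            return area, perimeter, n_objects - holes

        # Heterogeneity signature: the functionals as a function of the threshold level.
        image = np.random.rand(128, 128)
        for level in (0.3, 0.5, 0.7):
            print(level, minkowski_functionals_2d(image > level))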

  7. Wind turbine blade shear web disbond detection using rotor blade operational sensing and data analysis.

    Science.gov (United States)

    Myrent, Noah; Adams, Douglas E; Griffith, D Todd

    2015-02-28

    A wind turbine blade's structural dynamic response is simulated and analysed with the goal of characterizing the presence and severity of a shear web disbond. Computer models of a 5 MW offshore utility-scale wind turbine were created to develop effective algorithms for detecting such damage. Through data analysis and with the use of blade measurements, a shear web disbond was quantified according to its length. An aerodynamic sensitivity study was conducted to ensure robustness of the detection algorithms. In all analyses, the blade's flap-wise acceleration and root-pitching moment were the clearest indicators of the presence and severity of a shear web disbond. A combination of blade and non-blade measurements was formulated into a final algorithm for the detection and quantification of the disbond. The probability of detection was 100% for the optimized wind speed ranges in laminar, 30% horizontal shear and 60% horizontal shear conditions. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  8. Damage detection using piezoelectric transducers and the Lamb wave approach: I. System analysis

    International Nuclear Information System (INIS)

    Wang, X; Lu, Y; Tang, J

    2008-01-01

    Structural damage detection using piezoelectric transducers and the Lamb wave approach has been under intensive investigation. A commonly pursued topic is the selection of system parameters such that the detection performance can be optimized. Previous studies have indicated that the excitation center frequency plays a critical role, and suggested the use of the 'sweet spot' frequency to maximize the peak wave amplitude ratio between the S0 and the A0 modes. In this paper, the analytical formulation of Lamb wave propagation on a narrow-strip beam excited and sensed by piezoelectric transducers is outlined first. Then, the antisymmetric and symmetric contents of the wave propagation response are analyzed in detail with respect to system parameters. In particular, the parametric influence on the 'sweet spot' frequency is investigated systematically. The complicated interaction of the wave components with respect to damage is illustrated through case studies. The analytical study is supported by numerical analysis using the finite element method and by experimental investigation. This research provides the mechanistic basis for robust damage detection using data processing and statistical analysis tools, which is the focus of the second paper of this two-paper series.

  9. Spectral methods for the detection of network community structure: a comparative analysis

    International Nuclear Information System (INIS)

    Shen, Hua-Wei; Cheng, Xue-Qi

    2010-01-01

    Spectral analysis has been successfully applied to the detection of community structure in networks, based respectively on the adjacency matrix, the standard Laplacian matrix, the normalized Laplacian matrix, the modularity matrix, the correlation matrix and several other variants of these matrices. However, comparisons between these spectral methods are rarely reported. More importantly, it is still unclear which matrix is more appropriate for the detection of community structure. This paper addresses this question by evaluating the effectiveness of these five matrices against benchmark networks with heterogeneous distributions of node degree and community size. Test results demonstrate that the normalized Laplacian matrix and the correlation matrix significantly outperform the other three matrices at identifying the community structure of networks. This indicates that it is crucial to take into account the heterogeneous distribution of node degree when using spectral analysis for the detection of community structure. In addition, to our surprise, the modularity matrix exhibits very similar performance to the adjacency matrix, which indicates that the modularity matrix does not gain benefits from using the configuration model as a reference network with the consideration of node degree heterogeneity.
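
    A minimal sketch of community detection using the leading eigenvectors of the normalized Laplacian followed by k-means, the variant the comparison above finds most effective; the toy two-block network and the number of communities are placeholders.

        import numpy as np
        from scipy.cluster.vq import kmeans2

        def normalized_laplacian_communities(A, k):
            """A: symmetric adjacency matrix; k: number of communities."""
            deg = A.sum(axis=1)
            s = 1.0 / np.sqrt(np.maximum(deg, 1e-12))
            L = np.eye(len(A)) - s[:, None] * A * s[None, :]   # I - D^-1/2 A D^-1/2
            vals, vecs = np.linalg.eigh(L)
            embedding = vecs[:, :k]                            # k smallest eigenvectors
            _, labels = kmeans2(embedding, k, minit="++")
            return labels

        # Toy network: two dense blocks weakly connected to each other.
        rng = np.random.default_rng(0)
        def block(p, n=20):
            return (rng.random((n, n)) < p).astype(float)
        A = np.block([[block(0.4), block(0.02)], [block(0.02), block(0.4)]])
        A = np.triu(A, 1)
        A = A + A.T
        print(normalized_laplacian_communities(A, k=2))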

  10. Damage detection of engine bladed-disks using multivariate statistical analysis

    Science.gov (United States)

    Fang, X.; Tang, J.

    2006-03-01

    The timely detection of damage in aero-engine bladed-disks is an extremely important and challenging research topic. Bladed-disks have high modal density and, particularly, their vibration responses are subject to significant uncertainties due to manufacturing tolerance (blade-to-blade difference or mistuning), operating condition change and sensor noise. In this study, we present a new methodology for the on-line damage detection of engine bladed-disks using their vibratory responses during spin-up or spin-down operations, which can be measured by the blade-tip-timing sensing technique. We apply a principal component analysis (PCA)-based approach for data compression, feature extraction, and denoising. The non-model-based damage detection is achieved by analyzing the change between response features of the healthy structure and of the damaged one. We facilitate such comparison by incorporating Hotelling's T2 statistical analysis, which yields damage declaration with a given confidence level. The effectiveness of the method is demonstrated by case studies.
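
    A generic sketch of the PCA-plus-Hotelling's T² scheme named above: a model is fitted to baseline (healthy) response features and a new observation is flagged when its T² exceeds a control limit. Feature extraction from blade-tip-timing data is outside the scope of this sketch, and the data and thresholds are invented.

        import numpy as np
        from scipy.stats import f as f_dist

        def fit_pca(X_healthy, n_components):
            mu = X_healthy.mean(axis=0)
            _, s, Vt = np.linalg.svd(X_healthy - mu, full_matrices=False)
            var = (s ** 2) / (len(X_healthy) - 1)
            return mu, Vt[:n_components], var[:n_components]

        def hotelling_t2(x, mu, components, var):
            scores = components @ (x - mu)
            return float(np.sum(scores ** 2 / var))

        def control_limit(n, k, alpha=0.01):
            """Upper control limit for T2 with k retained components and n training samples."""
            return k * (n - 1) * (n + 1) / (n * (n - k)) * f_dist.ppf(1 - alpha, k, n - k)

        # Toy example: 200 healthy feature vectors and one shifted "damaged" observation.
        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 10))
        mu, comps, var = fit_pca(X, n_components=3)
        limit = control_limit(n=200, k=3)
        damaged = rng.normal(size=10) + 2.0
        print(hotelling_t2(damaged, mu, comps, var) > limit)   # likely True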

  11. High resolution melting analysis: a rapid and accurate method to detect CALR mutations.

    Directory of Open Access Journals (Sweden)

    Cristina Bilbao-Sieyro

    The recent discovery of CALR mutations in essential thrombocythemia (ET) and primary myelofibrosis (PMF) patients without JAK2/MPL mutations has emerged as a relevant finding for the molecular diagnosis of these myeloproliferative neoplasms (MPN). We tested the feasibility of high-resolution melting (HRM) as a screening method for rapid detection of CALR mutations. CALR was studied in wild-type JAK2/MPL patients including 34 with ET, 21 with persistent thrombocytosis suggestive of MPN and 98 with suspected secondary thrombocytosis. CALR mutation analysis was performed through HRM and Sanger sequencing. We compared clinical features of CALR-mutated versus 45 JAK2/MPL-mutated subjects in ET. Nineteen samples showed HRM patterns distinct from wild-type. Of them, 18 were mutations and one was a polymorphism, as confirmed by direct sequencing. CALR mutations were present in 44% of ET (15/34), 14% of persistent thrombocytosis suggestive of MPN (3/21) and none of the secondary thrombocytosis (0/98). Of the 18 mutants, 9 were 52 bp deletions, 8 were 5 bp insertions and the other was a complex mutation with an insertion/deletion. No mutations were found after sequencing analysis of 45 samples displaying wild-type HRM curves. The HRM technique was reproducible, no false positives or negatives were detected, and the limit of detection was 3%. This study establishes a sensitive, reliable and rapid HRM method to screen for the presence of CALR mutations.

  12. Numerical analysis of the resonance mechanism of the lumped parameter system model for acoustic mine detection

    International Nuclear Information System (INIS)

    Wang Chi; Zhou Yu-Qiu; Shen Gao-Wei; Wu Wen-Wen; Ding Wei

    2013-01-01

    The method of numerical analysis is employed to study the resonance mechanism of the lumped parameter system model for acoustic mine detection. Based on the basic principle of the acoustic resonance technique for mine detection and the characteristics of low-frequency acoustics, the 'soil-mine' system can be treated as equivalent to a damped 'mass-spring' resonance model using a lumped parameter analysis method. The dynamic simulation software Adams is adopted to analyze the lumped parameter system model numerically. The simulated resonance frequency and anti-resonance frequency are 151 Hz and 512 Hz respectively, basically in agreement with the published resonance frequency of 155 Hz and anti-resonance frequency of 513 Hz, which were measured in experiments. Therefore, the technique of numerical simulation is validated to have the potential for analyzing the acoustic mine detection model quantitatively. The influences of the soil and mine parameters on the resonance characteristics of the soil-mine system can be investigated by changing the parameter setup in a flexible manner. (electromagnetism, optics, acoustics, heat transfer, classical mechanics, and fluid dynamics)
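
    For orientation, the undamped natural frequency of an equivalent lumped mass-spring model is f = (1/2π)·√(k/m); the sketch below evaluates it for illustrative parameter values chosen to land near the reported 151 Hz, and these are not the soil or mine properties used in the paper.

        import numpy as np

        def natural_frequency(stiffness, mass):
            """Undamped natural frequency (Hz) of a single-degree-of-freedom mass-spring model."""
            return np.sqrt(stiffness / mass) / (2.0 * np.pi)

        # Illustrative values only: an effective stiffness of ~9e5 N/m acting on ~1 kg
        # of soil above the mine gives a resonance in the ~150 Hz range.
        print(natural_frequency(stiffness=9.0e5, mass=1.0))   # about 151 Hz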

  13. Myeloperoxidase mRNA detection for lineage determination of leukemic blasts: retrospective analysis.

    Science.gov (United States)

    Crisan, D; Anstett, M J

    1995-07-01

    Myeloperoxidase (MPO) mRNA is an early myeloid marker; its detection in the morphologically and immunophenotypically primitive blasts of acute undifferentiated leukemia (AUL) establishes myeloid lineage and allows reclassification as acute myelogenous leukemia with minimal differentiation (AML-M0). We have previously reported a procedure for MPO mRNA detection by RT-PCR (reverse transcription-polymerase chain reaction) and an adaptation for use with routine hematology smears. This variant procedure allows retrospective analysis of mRNA and is used in the present study to evaluate the lineage of leukemic blasts in seven cases with morphology and cytochemistry consistent with AUL. All hematology smears used in this study were air-dried, unstained or Wright-stained, and stored at room temperature for periods varying between 3 days and 2 years. MPO mRNA was detected in six cases, establishing the myeloid lineage of the blasts and the diagnosis of AML-M0. In the remaining case, the blasts were MPO mRNA negative, confirming the diagnosis of AUL. The RT-PCR procedure for retrospective mRNA analysis is useful in the clinical setting, due to its high specificity and sensitivity, speed (less than 24 h), safety (no radioactivity) and convenient use of routine hematology smears; it is particularly attractive in clinical situations when fresh or frozen specimens are no longer available at the time when the need for molecular diagnostics becomes apparent.

  14. An Experimental Study of Cavitation Detection in a Centrifugal Pump Using Envelope Analysis

    Science.gov (United States)

    Tan, Chek Zin; Leong, M. Salman

    Cavitation represents one of the most common faults in pumps and could potentially lead to a series of failures in the mechanical seal, impeller, bearing, shaft, motor, etc. In this work, an experimental rig was set up to investigate cavitation detection using the vibration envelope analysis method, and measured parameters included sound, pressure and flow rate to assess the feasibility of cavitation detection. The experimental testing included 3 operating points of the centrifugal pump (B.E.P., 90% of B.E.P. and 80% of B.E.P.). The suction pressure of the centrifugal pump was decreased gradually until the inception point of cavitation. Vibration measurements were undertaken at various locations including the casing, bearing, and suction and discharge flanges of the centrifugal pump. Comparisons of envelope spectrums under cavitating and non-cavitating conditions are presented. Envelope analysis proved useful in detecting cavitation over the 3 testing conditions. During normal operating conditions, the vibration peak synchronous to the rotational speed was more pronounced. During the cavitation condition, however, the half-order sub-harmonic vibration component was clearly evident in the envelope spectrums undertaken at all measurement locations except at the pump bearing. A possible explanation for the strong sub-harmonic (½ of BPF) during cavitation in the centrifugal pump is that there was insufficient time for the bubbles to collapse completely before the end of the single cycle.
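
    A generic sketch of envelope analysis via the Hilbert transform, the technique named above: band-pass filter the vibration signal, take the magnitude of the analytic signal, and inspect the envelope spectrum for sub-harmonic components; the filter band, speeds, and signal are placeholders.

        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        def envelope_spectrum(x, fs, band=(1000.0, 4000.0)):
            """Return (frequencies, amplitude spectrum of the envelope) of signal x."""
            b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="bandpass")
            envelope = np.abs(hilbert(filtfilt(b, a, x)))
            envelope -= envelope.mean()
            spec = np.abs(np.fft.rfft(envelope)) / len(envelope)
            freqs = np.fft.rfftfreq(len(envelope), d=1.0 / fs)
            return freqs, spec

        # Synthetic signal: carrier modulated at half an assumed shaft frequency of 49.5 Hz.
        fs, shaft_hz = 20000.0, 49.5
        t = np.arange(0, 2.0, 1.0 / fs)
        x = (1.0 + 0.5 * np.cos(2 * np.pi * 0.5 * shaft_hz * t)) * np.sin(2 * np.pi * 2500.0 * t)
        freqs, spec = envelope_spectrum(x, fs)
        print(freqs[np.argmax(spec[1:]) + 1])   # peak near 24.75 Hz (half-order component)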

  15. Anomalous Traffic Detection and Self-Similarity Analysis in the Environment of ATMSim

    Directory of Open Access Journals (Sweden)

    Hae-Duck J. Jeong

    2017-12-01

    Internet utilisation has steadily increased, predominantly due to the rapid recent development of information and communication networks and the widespread distribution of smartphones. As a result of this increase in Internet consumption, various types of services, including web services, social networking services (SNS), Internet banking, and remote processing systems have been created. These services have significantly enhanced global quality of life. However, as a negative side-effect of this rapid development, serious information security problems have also surfaced, which has led to serious Internet privacy invasions and network attacks. In an attempt to contribute to the process of addressing these problems, this paper proposes a process to detect anomalous traffic using self-similarity analysis in the Anomaly Teletraffic detection Measurement analysis Simulator (ATMSim) environment as a research method. Simulations were performed to measure normal and anomalous traffic. First, normal traffic and traffic for each attack, including Address Resolution Protocol (ARP) spoofing and distributed denial-of-service (DDoS) attacks, were measured for 48 h over 10 iterations. Hadoop was used to facilitate processing of the large amount of collected data, after which MapReduce was utilised after storing the data in the Hadoop Distributed File System (HDFS). A new platform on Hadoop, the detection system ATMSim, was used to identify anomalous traffic, after which a comparative analysis of the normal and anomalous traffic was performed through a self-similarity analysis. There were four categories of collected traffic, divided according to the attack methods used: normal local area network (LAN) traffic, DDoS attack, ARP spoofing, as well as combined DDoS and ARP attack. ATMSim, the anomaly traffic detection system, was used to determine whether real attacks could be identified effectively. To achieve this, ATMSim was used in simulations for each scenario to test its ability to
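
    Self-similarity of traffic is commonly quantified by the Hurst parameter; the sketch below estimates it with the aggregated-variance method as a generic illustration, not the ATMSim implementation, and the traffic series is a placeholder.

        import numpy as np

        def hurst_aggregated_variance(counts, block_sizes=(1, 2, 4, 8, 16, 32, 64)):
            """Estimate the Hurst parameter H of a stationary count series.
            The variance of the m-aggregated series scales as m^(2H - 2) for self-similar traffic."""
            logs_m, logs_v = [], []
            for m in block_sizes:
                n_blocks = len(counts) // m
                if n_blocks < 2:
                    continue
                agg = counts[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
                logs_m.append(np.log(m))
                logs_v.append(np.log(agg.var()))
            slope = np.polyfit(logs_m, logs_v, 1)[0]
            return 1.0 + slope / 2.0

        # Placeholder traffic: packets per 10 ms interval. H near 0.5 indicates no
        # long-range dependence; anomalous (e.g. DDoS) traffic typically shifts H.
        traffic = np.random.poisson(50, size=100_000)
        print(hurst_aggregated_variance(traffic))   # about 0.5 for Poisson traffic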

  16. Improvement on Exoplanet Detection Methods and Analysis via Gaussian Process Fitting Techniques

    Science.gov (United States)

    Van Ross, Bryce; Teske, Johanna

    2018-01-01

    Planetary signals in radial velocity (RV) data are often accompanied by signals coming solely from stellar photospheric or chromospheric variation. Such variation can reduce the precision of planet detection and mass measurements, and cause misidentification of planetary signals. Recently, several authors have demonstrated the utility of Gaussian Process (GP) regression for disentangling planetary signals in RV observations (Aigrain et al. 2012; Angus et al. 2017; Czekala et al. 2017; Faria et al. 2016; Gregory 2015; Haywood et al. 2014; Rajpaul et al. 2015; Foreman-Mackey et al. 2017). GP models the covariance of multivariate data to make predictions about likely underlying trends in the data, which can be applied to regions where there are no existing observations. The potency of GP has been used to infer stellar rotation periods; to model and disentangle time series spectra; and to determine physical aspects, populations, and detection of exoplanets, among other astrophysical applications. Here, we implement similar analysis techniques on time series of the Ca II H and K activity indicator measured simultaneously with RVs in a small sample of stars from the large Keck/HIRES RV planet search program. Our goal is to characterize the pattern(s) of non-planetary variation in order to know what is and is not a planetary signal. We investigated ten different GP kernels and their respective hyperparameters to determine the optimal combination (e.g., the lowest Bayesian Information Criterion value) in each stellar data set. To assess the hyperparameters' error, we sampled their posterior distributions using Markov chain Monte Carlo (MCMC) analysis on the optimized kernels. Our results demonstrate how GP analysis of stellar activity indicators alone can contribute to exoplanet detection in RV data, and highlight the challenges in applying GP analysis to relatively small, irregularly sampled time series.
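
    A minimal sketch of comparing GP kernels on an irregularly sampled activity time series using scikit-learn and a BIC-style score; the kernels, synthetic data, and hyperparameter counting are illustrative assumptions and not those of the study.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ExpSineSquared, WhiteKernel

        rng = np.random.default_rng(2)
        t = np.sort(rng.uniform(0, 100, 60))[:, None]        # irregular observation times (days)
        y = np.sin(2 * np.pi * t[:, 0] / 25.0) + 0.2 * rng.normal(size=60)   # toy activity series

        kernels = {
            "squared exponential": 1.0 * RBF(length_scale=10.0) + WhiteKernel(noise_level=0.1),
            "quasi-periodic": (1.0 * RBF(length_scale=50.0)
                               * ExpSineSquared(length_scale=1.0, periodicity=25.0)
                               + WhiteKernel(noise_level=0.1)),
        }

        for name, kernel in kernels.items():
            gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, y)
            log_l = gp.log_marginal_likelihood_value_
            k = len(gp.kernel_.theta)                        # number of fitted hyperparameters
            bic = k * np.log(len(t)) - 2.0 * log_l
            print(f"{name}: BIC = {bic:.1f}")                # lower BIC preferred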

  17. Sensitivity of the Positive and Negative Syndrome Scale (PANSS) in Detecting Treatment Effects via Network Analysis.

    Science.gov (United States)

    Esfahlani, Farnaz Zamani; Sayama, Hiroki; Visser, Katherine Frost; Strauss, Gregory P

    2017-12-01

    Objective: The Positive and Negative Syndrome Scale is a primary outcome measure in clinical trials examining the efficacy of antipsychotic medications. Although the Positive and Negative Syndrome Scale has demonstrated sensitivity as a measure of treatment change in studies using traditional univariate statistical approaches, its sensitivity to detecting network-level changes in dynamic relationships among symptoms has yet to be demonstrated using more sophisticated multivariate analyses. In the current study, we examined the sensitivity of the Positive and Negative Syndrome Scale to detecting antipsychotic treatment effects as revealed through network analysis. Design: Participants included 1,049 individuals diagnosed with psychotic disorders from the Phase I portion of the Clinical Antipsychotic Trials of Intervention Effectiveness (CATIE) study. Of these participants, 733 were clinically determined to be treatment-responsive and 316 were found to be treatment-resistant. Item level data from the Positive and Negative Syndrome Scale were submitted to network analysis, and macroscopic, mesoscopic, and microscopic network properties were evaluated for the treatment-responsive and treatment-resistant groups at baseline and post-phase I antipsychotic treatment. Results: Network analysis indicated that treatment-responsive patients had more densely connected symptom networks after antipsychotic treatment than did treatment-responsive patients at baseline, and that symptom centralities increased following treatment. In contrast, symptom networks of treatment-resistant patients behaved more randomly before and after treatment. Conclusions: These results suggest that the Positive and Negative Syndrome Scale is sensitive to detecting treatment effects as revealed through network analysis. Its findings also provide compelling new evidence that strongly interconnected symptom networks confer an overall greater probability of treatment responsiveness in patients with

  18. Toward the detection of gravitational waves under non-Gaussian noises II. Independent component analysis.

    Science.gov (United States)

    Morisaki, Soichiro; Yokoyama, Jun'ichi; Eda, Kazunari; Itoh, Yousuke

    2016-01-01

    We introduce a new analysis method to deal with stationary non-Gaussian noises in gravitational wave detectors in terms of the independent component analysis. First, we consider the simplest case where the detector outputs are linear combinations of the inputs, consisting of signals and various noises, and show that this method may be helpful to increase the signal-to-noise ratio. Next, we take into account the time delay between the inputs and the outputs. Finally, we extend our method to nonlinearly correlated noises and show that our method can identify the coupling coefficients and remove non-Gaussian noises. Although we focus on gravitational wave data analysis, our methods are applicable to the detection of any signals under non-Gaussian noises.
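
    A generic sketch of the first (linear, instantaneous-mixing) case using FastICA: the detector output channels are modeled as linear mixtures of a signal and an independent non-Gaussian noise source, and ICA recovers the sources up to scale and order. This illustrates the idea only and is not the authors' algorithm; the sources and mixing matrix are invented.

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(3)
        n = 20000
        t = np.arange(n) / 4096.0

        # Independent sources: a chirp-like "signal" and a heavy-tailed (non-Gaussian) noise source.
        signal = np.sin(2 * np.pi * (30.0 + 20.0 * t) * t)
        glitch = rng.laplace(scale=1.0, size=n)
        S = np.c_[signal, glitch]

        # Two detector output channels as linear combinations of the sources.
        A = np.array([[1.0, 0.8],
                      [0.3, 1.0]])
        X = S @ A.T

        ica = FastICA(n_components=2, random_state=0)
        recovered = ica.fit_transform(X)     # columns estimate the sources (up to order/scale)
        print(ica.mixing_.shape)             # estimated mixing matrix, shape (2, 2)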

  19. Filterless preconcentration, flow injection analysis and detection by inductively-coupled plasma mass spectrometry

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    The influence of interferences in the analysis of elements by inductively-coupled-plasma mass-spectrometry (ICP-MS) may be significantly diminished by utilising a protocol of flow-injection analysis (FIA). The method is based on filterless preconcentration of metallic elements at the walls of a knotted reactor made of nylon tubing. In the load mode, the preconcentration was accomplished by precipitation of metallic species in alkaline-buffered carriers onto the inner walls of the hydrophilic tube. After a preconcentration period of 40-120 seconds using sample volumes of 4-10 m... of 10-30 were obtained in the analysis of aluminium, of chromium and of iron, which resulted in detection limits (3) down to 20 g/L at a sampling frequency of 50 per hour. The preconcentration protocol improves the selectivity thus allowing direct determination of the elements in saline media. Anionic

  20. Analysis of rhythmic variance - ANORVA. A new simple method for detecting rhythms in biological time series

    Directory of Open Access Journals (Sweden)

    Peter Celec

    2004-01-01

    Cyclic variations of variables are ubiquitous in biomedical science. A number of methods for detecting rhythms have been developed, but they are often difficult to interpret. A simple procedure for detecting cyclic variations in biological time series and quantifying their probability is presented here. Analysis of rhythmic variance (ANORVA) is based on the premise that the variance in groups of data from rhythmic variables is low when a time distance of one period exists between the data entries. A detailed stepwise calculation is presented, including data entry and preparation, variance calculation, and difference testing. An example of the application of the procedure is provided, and a real dataset of the number of papers published per day in January 2003 using selected keywords is compared to randomized datasets. Randomized datasets show no cyclic variations. The number of papers published daily, however, shows a clear and significant (p < 0.03) circaseptan (period of 7 days) rhythm, probably of social origin.
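
    A minimal sketch of the ANORVA idea as described above: for a candidate period, group the series into phase bins whose members lie exactly one period apart and compare the pooled within-group variance with that of randomized (shuffled) series; the dataset and permutation count are placeholders.

        import numpy as np

        def within_period_variance(series, period):
            """Mean variance of groups whose members are exactly one period apart."""
            groups = [series[phase::period] for phase in range(period)]
            return float(np.mean([g.var() for g in groups if len(g) > 1]))

        def anorva_p_value(series, period, n_randomizations=1000, seed=0):
            """One-sided p-value: probability that shuffled data give an equally low variance."""
            rng = np.random.default_rng(seed)
            observed = within_period_variance(series, period)
            shuffled = [within_period_variance(rng.permutation(series), period)
                        for _ in range(n_randomizations)]
            return float(np.mean(np.array(shuffled) <= observed))

        # Toy daily series with a weekly (circaseptan) component plus noise.
        rng = np.random.default_rng(4)
        days = np.arange(365)
        counts = 100 + 15 * np.sin(2 * np.pi * days / 7.0) + rng.normal(0, 5, size=365)
        print(anorva_p_value(counts, period=7))   # small p-value indicates a 7-day rhythm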

  1. Bienzymatic Biosensor for Rapid Detection of Aspartame by Flow Injection Analysis

    Directory of Open Access Journals (Sweden)

    Maria-Cristina Radulescu

    2014-01-01

    A rapid, simple and stable biosensor for aspartame detection was developed. Alcohol oxidase (AOX), carboxyl esterase (CaE) and bovine serum albumin (BSA) were immobilised with glutaraldehyde (GA) onto screen-printed electrodes modified with cobalt-phthalocyanine (CoPC). The biosensor response was fast. The sample throughput using a flow injection analysis (FIA) system was 40 h⁻¹ with an RSD of 2.7%. The detection limits for both batch and FIA measurements were 0.1 µM for methanol and 0.2 µM for aspartame, respectively. The enzymatic biosensor was successfully applied for aspartame determination in different sample matrices/commercial products (liquid and solid samples) without any pre-treatment step prior to measurement.

  2. Laboratory Detection and Analysis of Organic Compounds in Rocks Using HPLC and XRD Methods

    Science.gov (United States)

    Dragoi, D.; Kanik, I.; Bar-Cohen, Y.; Sherrit, S.; Tsapin, A.; Kulleck, J.

    2004-01-01

    In this work we describe an analytical method for determining the presence of organic compounds in rocks, limestone, and other composite materials. Our preliminary laboratory experiments on different rocks/limestone show that the organic component in mineralogical matrices is a minor phase on the order of hundreds of ppm and can be better detected using high-performance liquid chromatography (HPLC). The matrix, which is the major phase, plays an important role in embedding and protecting the organic molecules from the harsh Martian environment. Some rocks bear significant amounts of amino acids; therefore, it is possible to identify these phases using powder X-ray diffraction (XRD) by crystallizing the organics. The method of detection/analysis of organics, in particular amino acids that have been associated with life, will be shown in the next section.

  3. Spatial statistical analysis of organs for intelligent CAD and its application to disease detection

    International Nuclear Information System (INIS)

    Takizawa, Hotaka

    2009-01-01

    The present article reports our research performed in a project supported by a Grant-in-Aid for Scientific Research on Priority Areas from the Ministry of Education, Culture, Sports, Science and Technology, Japan, from 2003 to 2006. The method developed in this research acquired the trend of variation of spatial relations between true diseases, false positives and image features through statistical analysis of a set of medical images, and improved the accuracy of disease detection by predicting their occurrence positions in an image based on that trend. This article describes the formulation of the method in general form and shows the results obtained by applying the method to chest X-ray CT images for the detection of pulmonary nodules. (author)

  4. GPR Detection of Buried Symmetrically Shaped Mine-like Objects using Selective Independent Component Analysis

    DEFF Research Database (Denmark)

    Karlsen, Brian; Sørensen, Helge Bjarup Dissing; Larsen, Jan

    2003-01-01

    This paper addresses the detection of mine-like objects in stepped-frequency ground penetrating radar (SF-GPR) data as a function of object size, object content, and burial depth. The detection approach is based on a Selective Independent Component Analysis (SICA). SICA provides an automatic ranking of components, which enables the suppression of clutter, hence extraction of components carrying mine information. The goal of the investigation is to evaluate various time and frequency domain ICA approaches based on SICA. Performance comparison is based on a series of mine-like objects ranging from small-scale anti-personnel (AP) mines to large-scale anti-tank (AT) mines, which were designed for this purpose. Large-scale SF-GPR measurements on this series of mine-like objects buried in soil were performed. The SF-GPR data was acquired using a wideband monostatic bow-tie antenna operating in the frequency range 750 ...

  5. Detection of Moving Targets Based on Doppler Spectrum Analysis Technique for Passive Coherent Radar

    Directory of Open Access Journals (Sweden)

    Zhao Yao-dong

    2013-06-01

    A novel method of moving target detection using the Doppler spectrum analysis technique for Passive Coherent Radar (PCR) is provided. After dividing the received signals into segments as pulse series, it utilizes the techniques of pulse compression and Doppler processing to detect and locate the targets. Based on the algorithm for Pulse-Doppler (PD) radar, the equivalence between continuous and pulsed waveforms in matched filtering is proved and details of this method are introduced. To compare it with the traditional method of Cross-Ambiguity Function (CAF) calculation, their relationship and mathematical models are analyzed, with some suggestions on parameter choices. With little influence on target gain, the method can greatly improve processing efficiency. The validity of the proposed method is demonstrated by offline processing of real collected data sets and by simulation results.

  6. Bienzymatic biosensor for rapid detection of aspartame by flow injection analysis.

    Science.gov (United States)

    Radulescu, Maria-Cristina; Bucur, Bogdan; Bucur, Madalina-Petruta; Radu, Gabriel Lucian

    2014-01-09

    A rapid, simple and stable biosensor for aspartame detection was developed. Alcohol oxidase (AOX), carboxyl esterase (CaE) and bovine serum albumin (BSA) were immobilised with glutaraldehyde (GA) onto screen-printed electrodes modified with cobalt-phthalocyanine (CoPC). The biosensor response was fast. The sample throughput using a flow injection analysis (FIA) system was 40 h⁻¹ with an RSD of 2.7%. The detection limits for both batch and FIA measurements were 0.1 µM for methanol and 0.2 µM for aspartame, respectively. The enzymatic biosensor was successfully applied for aspartame determination in different sample matrices/commercial products (liquid and solid samples) without any pre-treatment step prior to measurement.

  7. Detection of generator bearing inner race creep by means of vibration and temperature analysis

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Dragiev, Ivaylo G.; Hilmisson, Reynir

    2015-01-01

    Vibration and temperature analysis are the two dominating condition monitoring techniques applied to fault detection of bearing failures in wind turbine generators. Relative movement between the bearing inner ring and the generator axle is one of the most severe failure modes in terms of secondary damages and development. Detection of bearing creep can be achieved reliably based on continuous trending of the amplitude of the vibration running-speed harmonic and of absolute temperature values. In order to decrease the number of condition indicators which need to be assessed, it is proposed to exploit a weighted average descriptor calculated based on the 3rd up to the 6th harmonic orders. Two cases of different bearing creep severity are presented, showing the consistency of the combined vibration and temperature data utilization. In general, vibration monitoring reveals early signs of abnormality several ...

  8. Prospects of Frequency-Time Correlation Analysis for Detecting Pipeline Leaks by Acoustic Emission Method

    International Nuclear Information System (INIS)

    Faerman, V A; Cheremnov, A G; Avramchuk, V V; Luneva, E E

    2014-01-01

    In the current work the relevance of developing nondestructive test methods for pipeline leak detection is considered. It was shown that acoustic emission testing is currently one of the most widely used leak detection methods. The main disadvantage of this method is that it cannot be applied to monitoring long pipeline sections, which in turn complicates and slows down the inspection of the line pipe sections of main pipelines. The prospects of developing alternative techniques and methods based on the spectral analysis of signals were considered, and their possible application in leak detection on the basis of the correlation method was outlined. As an alternative, the calculation of a time-frequency correlation function is proposed. This function represents the correlation between the spectral components of the analyzed signals. In this work, the technique of time-frequency correlation function calculation is described. Experimental data demonstrating the clear advantage of the time-frequency correlation function over the simple correlation function are presented. The time-frequency correlation function is more effective in suppressing the noise components in the frequency range of the useful signal, which makes the maximum of the function more pronounced. The main drawback of applying time-frequency correlation analysis to leak detection problems is the large number of calculations, which may further increase the pipeline inspection time. However, this drawback can be partially mitigated by the development and implementation of efficient (including parallel) algorithms for computing the fast Fourier transform on the computer's central processing unit and graphics processing unit.
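
    A rough sketch of one way to correlate the spectral components of two acoustic-emission channels: cross-correlate their STFT magnitudes per frequency bin and sum the result over a band assumed to contain the leak signal to estimate the propagation delay. The parameters, band, and signals are placeholders, and this is a generic illustration rather than the authors' formulation.

        import numpy as np
        from scipy.signal import stft

        def time_frequency_correlation(x1, x2, fs, nperseg=256):
            """Cross-correlation of the STFT magnitudes of two signals, per frequency bin.
            Returns (freqs, lags in STFT hops, correlation map of shape (n_freqs, n_lags))."""
            f, _, Z1 = stft(x1, fs=fs, nperseg=nperseg)
            _, _, Z2 = stft(x2, fs=fs, nperseg=nperseg)
            M1 = np.abs(Z1) - np.abs(Z1).mean(axis=1, keepdims=True)
            M2 = np.abs(Z2) - np.abs(Z2).mean(axis=1, keepdims=True)
            n_frames = M1.shape[1]
            corr = np.array([np.correlate(M1[i], M2[i], mode="full") for i in range(len(f))])
            lags = np.arange(-(n_frames - 1), n_frames)
            return f, lags, corr

        # Placeholder leak-noise records from two sensors, the second shifted by 400 samples.
        fs = 10000.0
        rng = np.random.default_rng(5)
        noise = rng.normal(size=int(5 * fs))
        delay = 400                                     # assumed propagation delay (samples)
        x1 = noise[:-delay]
        x2 = noise[delay:]
        f, lags, corr = time_frequency_correlation(x1, x2, fs)
        band = (f > 500) & (f < 2000)                   # band assumed to contain the leak signal
        print(lags[np.argmax(corr[band].sum(axis=0))])  # delay estimate in STFT hops (about 3 here)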

  9. Clinical evaluation of a novel population-based regression analysis for detecting glaucomatous visual field progression.

    Science.gov (United States)

    Kovalska, M P; Bürki, E; Schoetzau, A; Orguel, S F; Orguel, S; Grieshaber, M C

    2011-04-01

    The distinction of real progression from test variability in visual field (VF) series may be based on clinical judgment, on trend analysis based on follow-up of test parameters over time, or on identification of a significant change related to the mean of baseline exams (event analysis). The aim of this study was to compare a new population-based method (Octopus field analysis, OFA) with classic regression analyses and clinical judgment for detecting glaucomatous VF changes. 240 VF series of 240 patients with at least 9 consecutive examinations available were included in this study. They were independently classified by two experienced investigators. The results of this classification served as a reference for comparison for the following statistical tests: (a) t-test global, (b) r-test global, (c) regression analysis of 10 VF clusters and (d) point-wise linear regression analysis. 32.5 % of the VF series were classified as progressive by the investigators. The sensitivity and specificity were 89.7 % and 92.0 % for the r-test, and 73.1 % and 93.8 % for the t-test, respectively. In the point-wise linear regression analysis, the specificity was comparable (89.5 % versus 92 %), but the sensitivity was clearly lower than in the r-test (22.4 % versus 89.7 %) at a significance level of p = 0.01. A regression analysis for the 10 VF clusters showed a markedly higher sensitivity for the r-test (37.7 %) than the t-test (14.1 %) at a similar specificity (88.3 % versus 93.8 %) for a significant trend (p = 0.005). In regard to the cluster distribution, the paracentral clusters and the superior nasal hemifield progressed most frequently. The population-based regression analysis seems to be superior to the trend analysis in detecting VF progression in glaucoma, and may eliminate the drawbacks of the event analysis. Further, it may assist the clinician in the evaluation of VF series and may allow better visualization of the correlation between function and structure owing to VF

  10. Fast and sensitive detection of foodborne pathogen using electrochemical impedance analysis, urease catalysis and microfluidics.

    Science.gov (United States)

    Chen, Qi; Wang, Dan; Cai, Gaozhe; Xiong, Yonghua; Li, Yuntao; Wang, Maohua; Huo, Huiling; Lin, Jianhan

    2016-12-15

    Early screening for pathogenic bacteria is key to the prevention and control of foodborne diseases. In this study, we developed a fast and sensitive bacteria detection method integrating electrochemical impedance analysis and urease catalysis with microfluidics, using Listeria as a model. The Listeria cells, the anti-Listeria monoclonal antibodies modified magnetic nanoparticles (MNPs), and the anti-Listeria polyclonal antibodies and urease modified gold nanoparticles (AuNPs) were incubated in a fluidic separation chip with active mixing to form MNP-Listeria-AuNP-urease sandwich complexes. The complexes were captured in the separation chip by applying a high gradient magnetic field, and urea was injected to resuspend the complexes; the urea was hydrolyzed, under the catalysis of the urease on the complexes, into ammonium ions and carbonate ions, which were transported into a microfluidic detection chip with an interdigitated microelectrode for impedance measurement to determine the amount of Listeria cells. The capture efficiency of the Listeria cells in the separation chip was ∼93% within a shorter time of 30 min, due to the faster immuno-reaction enabled by the active magnetic mixing. The changes in both impedance magnitude and phase angle were demonstrated to be able to detect Listeria cells at levels as low as 1.6×10² CFU/mL. The detection time was reduced from the original ∼2 h to the current ∼1 h. The recoveries of the spiked lettuce samples ranged from 82.1% to 89.6%, indicating the applicability of this proposed biosensor. This microfluidic impedance biosensor has shown potential for online, automatic and sensitive bacteria separation and detection. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. Change detection of medical images using dictionary learning techniques and principal component analysis.

    Science.gov (United States)

    Nika, Varvara; Babyn, Paul; Zhu, Hongmei

    2014-07-01

    Automatic change detection methods for identifying the changes of serial MR images taken at different times are of great interest to radiologists. The majority of existing change detection methods in medical imaging, and those of brain images in particular, include many preprocessing steps and rely mostly on statistical analysis of magnetic resonance imaging (MRI) scans. Although most methods utilize registration software, tissue classification remains a difficult and overwhelming task. Recently, dictionary learning techniques are being used in many areas of image processing, such as image surveillance, face recognition, remote sensing, and medical imaging. We present an improved version of the EigenBlockCD algorithm, named the EigenBlockCD-2. The EigenBlockCD-2 algorithm performs an initial global registration and identifies the changes between serial MR images of the brain. Blocks of pixels from a baseline scan are used to train local dictionaries to detect changes in the follow-up scan. We use PCA to reduce the dimensionality of the local dictionaries and the redundancy of data. Choosing the appropriate distance measure significantly affects the performance of our algorithm. We examine the differences between [Formula: see text] and [Formula: see text] norms as two possible similarity measures in the improved EigenBlockCD-2 algorithm. We show the advantages of the [Formula: see text] norm over the [Formula: see text] norm both theoretically and numerically. We also demonstrate the performance of the new EigenBlockCD-2 algorithm for detecting changes of MR images and compare our results with those provided in the recent literature. Experimental results with both simulated and real MRI scans show that our improved EigenBlockCD-2 algorithm outperforms the previous methods. It detects clinical changes while ignoring the changes due to the patient's position and other acquisition artifacts.

  12. Regression analysis for LED color detection of visual-MIMO system

    Science.gov (United States)

    Banik, Partha Pratim; Saha, Rappy; Kim, Ki-Doo

    2018-04-01

    Color detection from a light emitting diode (LED) array using a smartphone camera is very difficult in a visual multiple-input multiple-output (visual-MIMO) system. In this paper, we propose a method to determine the LED color from a smartphone camera image by applying regression analysis. We employ a multivariate regression model to identify the LED color. After taking a picture of an LED array, we select the LED array region and detect the LEDs using an image processing algorithm. We then apply the k-means clustering algorithm to determine the number of potential colors for feature extraction of each LED. Finally, we apply the multivariate regression model to predict the color of the transmitted LEDs. In this paper, we show our results for three types of environmental light conditions: room environmental light, low environmental light (560 lux), and strong environmental light (2450 lux). We compare the results of our proposed algorithm through analysis of the training and test R-squared (%) values and the percentage closeness of the transmitted and predicted colors, and we also report the number of distorted test data points based on the distortion bar graph in the CIE1931 color space.
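
    A toy sketch of the two steps named above: k-means to obtain candidate color features for each detected LED region, and a multivariate linear regression mapping those features to the transmitted RGB value. The feature definition and data are invented for illustration and are not the authors' pipeline.

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.linear_model import LinearRegression

        def led_color_features(pixels_rgb, n_colors=3):
            """pixels_rgb: (n_pixels, 3) array from one detected LED region.
            Returns the k-means cluster centers (candidate colors) as a flat feature vector."""
            km = KMeans(n_clusters=n_colors, n_init=10, random_state=0).fit(pixels_rgb)
            order = np.argsort(km.cluster_centers_.sum(axis=1))   # stable ordering
            return km.cluster_centers_[order].ravel()

        # Invented training data: camera-observed LED regions and the RGB actually transmitted.
        rng = np.random.default_rng(6)
        transmitted = rng.integers(0, 256, size=(200, 3)).astype(float)
        regions = [t + rng.normal(0, 20, size=(500, 3)) for t in transmitted]   # noisy pixels
        X = np.array([led_color_features(r) for r in regions])

        model = LinearRegression().fit(X, transmitted)            # multivariate regression
        print("training R^2:", model.score(X, transmitted))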

  13. Detection of irradiated mushrooms by GC/MS analysis of lipid-derived hydrocarbons

    International Nuclear Information System (INIS)

    Delincee, H.; Koller, W.D.

    1993-01-01

    A number of methods have been developed for the detection of irradiated foods in recent years, and in the case of mushrooms several methods have been proposed, of which thermoluminescence (TL) measurement seems to be the most valuable. However, in several cases the mineral contamination of fresh mushrooms is so extremely low that not enough minerals can be isolated for TL analysis. In that case an alternative method is needed to detect the radiation treatment of mushrooms. Several methods, including TTC (2,3,5-triphenyl-tetrazolium-chloride) staining, kinetin treatment, dropping out of spores and micro-gel electrophoresis of spores, were tested, but the most promising method was the GC/MS analysis of radiation-induced lipid-derived hydrocarbons, in spite of the low fat content of mushrooms - around 0.2-0.3%. Successful results were achieved by GC/MS analysis of the radiolytic hydrocarbons. Although mushrooms have a low fat content, by extracting a large quantity, in this case 500 g of mushrooms, about 1.2-1.5 g of fat could be obtained. The main fatty acids of mushroom fat and some of their expected cleavage products on irradiation - the Cn-1 hydrocarbon, which has one C atom less than the parent fatty acid, and the Cn-2:1 hydrocarbon, which has two C atoms less and an additional double bond in position 1 - are given. (orig./Vhe)

  14. On-road anomaly detection by multimodal sensor analysis and multimedia processing

    Science.gov (United States)

    Orhan, Fatih; Eren, P. E.

    2014-03-01

    The use of smartphones in Intelligent Transportation Systems is gaining popularity, yet many challenges exist in developing functional applications. Due to the dynamic nature of transportation, vehicular social applications face complexities such as developing robust sensor management, performing signal and image processing tasks, and sharing information among users. This study utilizes a multimodal sensor analysis framework which enables the analysis of sensors in multimodal aspect. It also provides plugin-based analyzing interfaces to develop sensor and image processing based applications, and connects its users via a centralized application as well as to social networks to facilitate communication and socialization. With the usage of this framework, an on-road anomaly detector is being developed and tested. The detector utilizes the sensors of a mobile device and is able to identify anomalies such as hard brake, pothole crossing, and speed bump crossing. Upon such detection, the video portion containing the anomaly is automatically extracted in order to enable further image processing analysis. The detection results are shared on a central portal application for online traffic condition monitoring.

  15. Spectral analysis to detection of short circuit fault of solar photovoltaic modules in strings

    International Nuclear Information System (INIS)

    Sevilla-Camacho, P.Y.; Robles-Ocampo, J.B.; Zuñiga-Reyes, Marco A.

    2017-01-01

    This research work presents a method to detect the number of short-circuit-faulted solar photovoltaic modules in the strings of a photovoltaic system, taking into account speed, safety, and the avoidance of sensors and specialized, expensive equipment. The method consists of applying spectral analysis and statistical techniques to the alternating-current output voltage of a string and detecting the number of failed modules through changes in the amplitude of the 12 kHz frequency component. To do so, the analyzed string is disconnected from the array, and a small pulsed voltage signal with a frequency of 12 kHz is applied to it under dark conditions and controlled temperature. Prior to the analysis, the signal is analog filtered in order to reduce the direct-current signal component. The spectral analysis technique used is the Fast Fourier Transform. The experimental results were validated through simulation of the alternating-current equivalent circuit of a solar cell. In all experimental and simulated tests, the method correctly identified the number of photovoltaic modules with a short circuit in the analyzed string. (author)
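
    A minimal sketch of the core signal step, assuming a synthetic sampled waveform: estimate the amplitude of the 12 kHz component of the string's AC output voltage with an FFT. The sampling rate and signal model are placeholders, not the authors' acquisition settings.

        import numpy as np

        fs = 200_000                        # sampling rate in Hz (assumed)
        t = np.arange(0, 0.01, 1 / fs)
        voltage = 0.35 * np.sin(2 * np.pi * 12_000 * t) + 0.02 * np.random.randn(t.size)

        spectrum = np.abs(np.fft.rfft(voltage)) * 2 / t.size      # single-sided amplitude spectrum
        freqs = np.fft.rfftfreq(t.size, 1 / fs)
        amp_12k = spectrum[np.argmin(np.abs(freqs - 12_000))]
        print(f"12 kHz amplitude: {amp_12k:.3f} V")               # compared against per-fault reference levels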

  16. Scientific literature addressing detection of monosialoganglioside: A 10-year bibliometric analysis.

    Science.gov (United States)

    Xu, Yanli; Li, Miaojing; Liu, Zhijun; Xi, Aiping; Zhao, Chaoxian; Zhang, Jianzhong

    2012-04-05

    The study was undertaken to explore a bibliometric approach to quantitatively assess the research on detection of monosialoganglioside from 2002 to 2011. A bibliometric analysis based on publications in Web of Science was performed using key words such as "monosialoganglioside", "colloidal gold", "high performance liquid chromatography" and "detection". Inclusion criteria: (1) research articles on the detection of monosialoganglioside; (2) research on human and animal fundamentals, clinical trials and case reports; (3) article types: article, review, proceedings paper, note, letter, editorial material, discussion, book chapter; (4) publication year: 2002-2011. Exclusion criteria: (1) unrelated articles; (2) type of articles: correction; (3) articles from the following databases: all databases related to social science and arts & humanities in Web of Science were excluded. Outcome measures: (1) distribution of subject areas; (2) number of publications annually; (3) document type and language of publications; (4) distribution of institutions; (5) distribution of output in journals; (6) the number of countries in which the articles were published; (7) top cited papers. In total, 1,880 research articles addressing detection of monosialoganglioside were found in Web of Science during the study period. Articles (1,599) were the most frequently used document type, comprising 85.05%, followed by meeting abstracts, reviews and proceedings papers. The distribution of subject categories showed that monosialoganglioside research covered both clinical and basic science research. The USA, Japan, and Italy were the three most productive countries, and the publication number of the USA was the highest with 559 papers. The University of Milan, Nagoya University, and Kinki University are the most productive institutions regarding detection of monosialoganglioside. Among the 559 articles published by Americans, the Medical College of Georgia ranked first with 30 articles, followed by the University of Medicine and Dentistry of New Jersey (28

  17. Hair analysis for the detection of drug use-is there potential for evasion?

    Science.gov (United States)

    Marrinan, Shanna; Roman-Urrestarazu, Andres; Naughton, Declan; Levari, Emerlinda; Collins, John; Chilcott, Robert; Bersani, Giuseppe; Corazza, Ornella

    2017-05-01

    Hair analysis for illicit substances is widely used to detect chronic drug consumption or abstention from drugs. Testees are increasingly seeking ways to avoid detection by using a variety of untested adulterant products (e.g., shampoos, cleansers) widely sold online. This study aims to investigate adulteration of hair samples and to assess the effectiveness of such methods. The literature on hair test evasion was searched for on PubMed or MEDLINE, Psycinfo, and Google Scholar. Given the sparse nature of peer-reviewed data on this subject, results were integrated with a qualitative assessment of online sources, including user-orientated information or commercial websites, drug fora and "chat rooms". Over four million web sources were identified in a Google search using "beat hair drug test", and the first 86 were monitored on a regular basis and considered for further analysis. Attempts to influence hair test results are widespread. Various "shampoos" and "cleansers", among other products, were found for sale which claim to remove analytes; these are often advertised with aggressive marketing strategies, including discounts, testimonials, and unsupported claims of efficacy. However, these products may pose serious health hazards and are also potentially toxic. In addition, many anecdotal reports suggest that Novel Psychoactive Substances are also consumed as an evasion technique, as these are not easily detectable via standard drug tests. Recent changes to Novel Psychoactive Substances legislation, such as the new Psychoactive Substances Bill in the UK, might further challenge the testing process. Further research is needed by way of chemical analysis and trials of the adulterant products sold online and their effects, as well as the development of more sophisticated hair testing techniques. Copyright © 2017 John Wiley & Sons, Ltd.

  18. Screening Performance Characteristic of Ultrasonography and Radiography in Detection of Pleural Effusion; a Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Mahmoud Yousefifard

    2016-01-01

    Full Text Available Introduction: The role of ultrasonography in detection of pleural effusion has long been a subject of interest, but controversial results have been reported. Accordingly, this study aims to conduct a systematic review of the available literature on the diagnostic value of ultrasonography and radiography in detection of pleural effusion through a meta-analytic approach. Methods: An extended search was done in the databases of Medline, EMBASE, ISI Web of Knowledge, Scopus, Cochrane Library, and ProQuest. Two reviewers independently extracted the data and assessed the quality of the articles. Meta-analysis was performed using a mixed-effects binary regression model. Finally, subgroup analysis was carried out in order to find the sources of heterogeneity between the included studies. Results: 12 studies were included in this meta-analysis (1554 subjects, 58.6% male). Pooled sensitivity of ultrasonography in detection of pleural effusion was 0.94 (95% CI: 0.88-0.97; I2 = 84.23, p < 0.001) and its pooled specificity was calculated to be 0.98 (95% CI: 0.92-1.0; I2 = 88.65, p < 0.001), while the sensitivity and specificity of chest radiography were 0.51 (95% CI: 0.33-0.68; I2 = 91.76, p < 0.001) and 0.91 (95% CI: 0.68-0.98; I2 = 92.86, p < 0.001), respectively. Sensitivity of ultrasonography was found to be higher when the procedure was carried out by an intensivist or a radiologist using 5-10 MHz transducers. Conclusion: Chest ultrasonography, as a screening tool, has a higher diagnostic accuracy in identification of pleural effusion compared to radiography. The sensitivity of this imaging modality was found to be higher when performed by a radiologist or an intensivist and using 5-10 MHz probes.

  19. Evaluating fuzzy operators of an object-based image analysis for detecting landslides and their changes

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Blaschke, Thomas; Tiede, Dirk; Moghaddam, Mohammad Hossein Rezaei

    2017-09-01

    This article presents a method of object-based image analysis (OBIA) for landslide delineation and landslide-related change detection from multi-temporal satellite images. It uses both spatial and spectral information on landslides, through spectral analysis, shape analysis, textural measurements using a gray-level co-occurrence matrix (GLCM), and fuzzy logic membership functionality. Following an initial segmentation step, particular combinations of various information layers were investigated to generate objects. This was achieved by applying multi-resolution segmentation to IRS-1D, SPOT-5, and ALOS satellite imagery in sequential steps of feature selection and object classification, and using slope and flow direction derivatives from a digital elevation model together with topographically-oriented gray level co-occurrence matrices. Fuzzy membership values were calculated for 11 different membership functions using 20 landslide objects from a landslide training data. Six fuzzy operators were used for the final classification and the accuracies of the resulting landslide maps were compared. A Fuzzy Synthetic Evaluation (FSE) approach was adapted for validation of the results and for an accuracy assessment using the landslide inventory database. The FSE approach revealed that the AND operator performed best with an accuracy of 93.87% for 2005 and 94.74% for 2011, closely followed by the MEAN Arithmetic operator, while the OR and AND (*) operators yielded relatively low accuracies. An object-based change detection was then applied to monitor landslide-related changes that occurred in northern Iran between 2005 and 2011. Knowledge rules to detect possible landslide-related changes were developed by evaluating all possible landslide-related objects for both time steps.
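
    As a small illustration of the operators compared in the study, the snippet below combines made-up per-feature membership values for one image object; showing AND (*) as an algebraic product is one common implementation but an assumption on our part.

        import numpy as np

        memberships = np.array([0.9, 0.7, 0.85, 0.6])      # e.g. slope, brightness, GLCM texture, shape

        fuzzy_and = memberships.min()                       # AND: most pessimistic evidence
        fuzzy_or = memberships.max()                        # OR: most optimistic evidence
        fuzzy_mean = memberships.mean()                     # MEAN Arithmetic
        fuzzy_and_star = memberships.prod()                 # AND (*), here taken as the algebraic product

        print(fuzzy_and, fuzzy_or, fuzzy_mean, fuzzy_and_star)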

  20. Detection of colorectal cancer (CRC) by urinary volatile organic compound analysis.

    Directory of Open Access Journals (Sweden)

    Ramesh P Arasaradnam

    Full Text Available Colorectal cancer (CRC) is a leading cause of cancer-related death in Europe and the USA. There is no universally accepted effective non-invasive screening test for CRC. Guaiac-based faecal occult blood (gFOB) testing has largely been superseded by Faecal Immunochemical Testing (FIT), but sensitivity still remains poor. The uptake of population-based FOBt testing in the UK is also low, at around 50%. The detection of volatile organic compound (VOC) signature(s) for many cancer subtypes is receiving increasing interest using a variety of gas-phase analytical instruments. One such example is FAIMS (Field Asymmetric Ion Mobility Spectrometer). FAIMS is able to identify Inflammatory Bowel Disease (IBD) patients by analysing shifts in VOC patterns in both urine and faeces. This study extends this concept to determine whether CRC patients can be identified through non-invasive analysis of urine, using FAIMS. 133 patients were recruited; 83 CRC patients and 50 healthy controls. Urine was collected at the time of CRC diagnosis and headspace analysis undertaken using a FAIMS instrument (Owlstone, Lonestar, UK). Data were processed using Fisher Discriminant Analysis (FDA) after feature extraction from the raw data. FAIMS analyses demonstrated that the VOC profiles of CRC patients were tightly clustered and could be distinguished from healthy controls. Sensitivity and specificity for CRC detection with FAIMS were 88% and 60%, respectively. This study suggests that VOC signatures emanating from urine can be detected in patients with CRC using ion mobility spectrometry technology (FAIMS), with potential as a novel screening tool.
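
    A minimal sketch of the classification step named in the abstract (Fisher discriminant analysis on features extracted from FAIMS spectra). The feature matrix below is random placeholder data with the study's sample sizes; it is not the authors' pipeline or data.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        X = rng.normal(size=(133, 40))            # 133 urine samples x 40 extracted VOC features (assumed)
        y = np.r_[np.ones(83), np.zeros(50)]      # 83 CRC patients, 50 healthy controls

        fda = LinearDiscriminantAnalysis()
        print("cross-validated AUC:", cross_val_score(fda, X, y, cv=5, scoring="roc_auc").mean())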

  1. Detection and genetic analysis of human sapoviruses in river water in Japan.

    Science.gov (United States)

    Kitajima, Masaaki; Oka, Tomoichiro; Haramoto, Eiji; Katayama, Hiroyuki; Takeda, Naokazu; Katayama, Kazuhiko; Ohgaki, Shinichiro

    2010-04-01

    We investigated the prevalence of sapoviruses (SaVs) in the Tamagawa River in Japan from April 2003 to March 2004 and performed genetic analysis of the SaV genes identified in river water. A total of 60 river water samples were collected from five sites along the river, and 500 ml was concentrated using the cation-coated filter method. By use of a real-time reverse transcription (RT)-PCR assay, 12 (20%) of the 60 samples were positive for SaV. SaV sequences were obtained from 15 (25%) samples, and a total of 30 SaV strains were identified using six RT-PCR assays followed by cloning and sequence analysis. A newly developed nested RT-PCR assay utilizing a broadly reactive forward primer showed the highest detection efficiency and amplified more diverse SaV genomes in the samples. SaV sequences were frequently detected from November to March, whereas none were obtained in April, July, September, or October. No SaV sequences were detected in the upstream portion of the river, whereas the midstream portion showed high positive rates. Based on phylogenetic analysis, SaV strains identified in the river water samples were classified into nine genotypes, namely, GI/1, GI/2, GI/3, GI/5, GI/untyped, GII/1, GII/2, GII/3, and GV/1. To our knowledge, this is the first study describing seasonal and spatial distributions and genetic diversity of SaVs in river water. A combination of real-time RT-PCR assay and newly developed nested RT-PCR assay is useful for identifying and characterizing SaV strains in a water environment.

  2. A novel approach for the detection and genetic analysis of live melanoma circulating tumor cells.

    Directory of Open Access Journals (Sweden)

    Melody J Xu

    Full Text Available Circulating tumor cell (CTC) detection and genetic analysis may complement currently available disease assessments in patients with melanoma to improve risk stratification and monitoring. We therefore sought to establish the feasibility of a telomerase-based assay for detecting and isolating live melanoma CTCs. The telomerase-based CTC assay utilizes an adenoviral vector that, in the presence of elevated human telomerase activity, drives the amplification of green fluorescent protein. Tumor cells are then identified via an image processing system. The protocol was tested on melanoma cells in culture or spiked into control blood, and on samples from patients with metastatic melanoma. Genetic analysis of the isolated melanoma CTCs was then performed for BRAF mutation status. The adenoviral vector was effective for all melanoma cell lines tested, with a sensitivity of 88.7% (95% CI 85.6-90.4%) and a specificity of 99.9% (95% CI 99.8-99.9%). In a pilot trial of patients with metastatic disease, CTCs were identified in 9 of 10 patients, with a mean of 6.0 CTCs/mL. At a cutoff of 1.1 CTCs/mL, the telomerase-based assay exhibits a test performance of 90.0% sensitivity and 91.7% specificity. BRAF mutation analysis of melanoma cells isolated from culture or spiked control blood, or from pilot patient samples, was found to match the known BRAF mutation status of the cell lines and primary tumors. To our knowledge, this is the first report of a telomerase-based assay effective for detecting and isolating live melanoma CTCs. These promising findings support further studies, including towards integrating this assay into the management of patients with melanoma receiving multimodality therapy.

  3. Breath Analysis Using Laser Spectroscopic Techniques: Breath Biomarkers, Spectral Fingerprints, and Detection Limits

    Directory of Open Access Journals (Sweden)

    Peeyush Sahay

    2009-10-01

    Full Text Available Breath analysis, a promising new field of medicine and medical instrumentation, potentially offers noninvasive, real-time, and point-of-care (POC) disease diagnostics and metabolic status monitoring. Numerous breath biomarkers have been detected and quantified so far by using the GC-MS technique. Recent advances in laser spectroscopic techniques and laser sources have driven breath analysis to new heights, moving from laboratory research to commercial reality. Laser spectroscopic detection techniques not only have high sensitivity and high selectivity, as equivalently offered by the MS-based techniques, but also have the advantageous features of near real-time response, low instrument costs, and POC function. Of the approximately 35 established breath biomarkers, such as acetone, ammonia, carbon dioxide, ethane, methane, and nitric oxide, 14 species in exhaled human breath have been analyzed by high-sensitivity laser spectroscopic techniques, namely, tunable diode laser absorption spectroscopy (TDLAS), cavity ringdown spectroscopy (CRDS), integrated cavity output spectroscopy (ICOS), cavity enhanced absorption spectroscopy (CEAS), cavity leak-out spectroscopy (CALOS), photoacoustic spectroscopy (PAS), quartz-enhanced photoacoustic spectroscopy (QEPAS), and optical frequency comb cavity-enhanced absorption spectroscopy (OFC-CEAS). Spectral fingerprints of the measured biomarkers span from the UV to the mid-IR spectral regions, and the detection limits achieved by the laser techniques range from parts-per-million to parts-per-billion levels. Sensors using laser spectroscopic techniques for a few breath biomarkers, e.g., carbon dioxide, nitric oxide, etc., are commercially available. This review presents an update on the latest developments in laser-based breath analysis.

  4. Performance of computer-aided diagnosis for detection of lacunar infarcts on brain MR images: ROC analysis of radiologists' detection

    International Nuclear Information System (INIS)

    Uchiyama, Y.; Yokoyama, R.; Hara, T.; Fujita, H.; Asano, T.; Kato, H.; Hoshi, H.; Yamakawa, H.; Iwama, T.; Ando, H.; Yamakawa, H.

    2007-01-01

    The detection and management of asymptomatic lacunar infarcts on magnetic resonance (MR) images are important tasks for radiologists to ensure the prevention of severe cerebral infarctions. However, accurate identification of lacunar infarcts is difficult. Therefore, we developed a computer-aided diagnosis (CAD) scheme for the detection of lacunar infarcts. The purpose of this study was to evaluate radiologists' performance in the detection of lacunar infarcts without and with the use of the CAD scheme. 30 T1- and 30 T2-weighted images obtained from 30 patients were used for an observer study, which consisted of 15 cases with a single lacunar infarct and 15 cases without any lacunar infarct. Six radiologists participated in the observer study. They interpreted lacunar infarcts first without and then with use of the scheme. For all six observers, the average area under the receiver operating characteristic curve increased from 0.920 to 0.965 when they used the computer output. This CAD scheme might have the potential to improve the accuracy of radiologists' performance in the detection of lacunar infarcts on MR images. (orig.)

  5. Contrast-enhanced Ultrasound for Detection of Crohn's Disease Activity: Systematic Review and Meta-analysis.

    Science.gov (United States)

    Serafin, Zbigniew; Białecki, Marcin; Białecka, Agnieszka; Sconfienza, Luca Maria; Kłopocka, Maria

    2016-03-01

    Reports on imaging of active Crohn's disease (aCD) using contrast-enhanced ultrasound (CEUS) are encouraging. However, the statistical power of most published papers is limited due to the small size of the patient groups included. This study was performed to verify the diagnostic value of CEUS in detecting aCD. A systematic literature search was performed by two independent reviewers for articles on the test characteristics of CEUS for the identification of aCD. The quality of the analysed studies was evaluated using a quality assessment tool for diagnostic accuracy studies (QUADAS-2). Pooling was performed using a diagnostic random-effect model and bivariate analysis. Eight articles were included in the final analysis, with a total of 332 patients. There was no significant publication bias. Significant heterogeneity was found regarding CEUS methodology and sonographic definitions of aCD. In a bivariate analysis, pooled sensitivity was 0.94 (95% CI 0.87-0.97) and pooled specificity was 0.79 (95% CI 0.67-0.88). Spearman correlation statistics presented no significant diagnostic threshold effect (r = 0.12, p > 0.9). Subgroup analysis showed that relative intestine wall enhancement had the highest diagnostic value (area under the curve 94%), while the presence of enhancement and analysis of the slope were less useful (area under the curve 91 and 90%, respectively). CEUS presents good sensitivity and moderate specificity in the detection of the aCD. Large-scale randomized trials with quantitative evaluation of CEUS images are necessary to promote this technique in clinical practice. Copyright © 2015 European Crohn’s and Colitis Organisation (ECCO). Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  6. Capillary electrophoresis fragment analysis and clone sequencing in detection of dynamic mutations of spinocerebellar ataxia

    Directory of Open Access Journals (Sweden)

    Yuan-yuan CHEN

    2018-04-01

    Full Text Available Objective: To estimate the accuracy and stability of capillary electrophoresis fragment analysis and clone sequencing in detecting dynamic mutations of spinocerebellar ataxia (SCA). Methods: Capillary electrophoresis fragment analysis and clone sequencing were used to detect the trinucleotide repeat sequences of 14 SCA patients (3 cases of SCA2, 2 cases of SCA7, 7 cases of SCA8 and 2 cases of SCA17). Results: Capillary electrophoresis fragment analysis of the 3 SCA2 cases showed expanded cytosine-adenine-guanine (CAG) repeats of 31, 30 and 32, and the copy numbers from clone sequencing of 3 colonies in each case were 37/40/40, 37/38/39 and 38/39/40, respectively. Capillary electrophoresis fragment analysis of the 2 SCA7 cases showed expanded CAG repeats of 57 and 34, and the copy numbers of repeats were 69, 74 and 75 in 3 colonies of one case, and 45 in the other case. For the 7 SCA8 cases with expanded cytosine-thymine-adenine (CTA)/cytosine-thymine-guanine (CTG) repeats of 99, 111, 104, 92, 89, 104 and 75, the results of clone sequencing were 97, 116, 104, 90, 90, 102 and 76, respectively. For the 2 SCA17 cases with short/expanded CAG repeats of 37/50 and 36/45, the results of clone sequencing were 51/50/52 and 45/44 for 3 and 2 colonies. Conclusions: Although the higher mobility of polymerase chain reaction (PCR) products containing dynamic mutations in capillary electrophoresis fragment analysis might cause a deviation in the analysis of copy numbers, the deviation was predictable and the results were repeatable. The clone sequencing results showed obvious instability, especially for the SCA2 and SCA7 genes, which might be owing to their simple CAG repeats. Consequently, clone sequencing is not suited for the detection of dynamic mutations, let alone providing quantitative criteria for dynamic mutation sequencing. DOI: 10.3969/j.issn.1672-6731.2018.03.008

  7. Detecting Lactococcus lactis Prophages by Mitomycin C-Mediated Induction Coupled to Flow Cytometry Analysis

    Directory of Open Access Journals (Sweden)

    Joana Oliveira

    2017-07-01

    Full Text Available Most analyzed Lactococcus lactis strains are predicted to harbor one or more prophage genomes within their chromosome; however, the true extent of the inducibility and functionality of such prophages cannot easily be deduced from sequence analysis alone. Chemical treatment of lysogenic strains with Mitomycin C is known to cause induction of temperate phages, though it is not always easy to clearly identify a lysogenic strain or to measure the number of released phage particles. Here, we report the application of flow cytometry as a reliable tool for the detection and enumeration of released lactococcal prophages using the green dye SYTO-9.

  8. Elemental analysis technique based on detecting gamma-rays from interactions of neutrons with medium

    International Nuclear Information System (INIS)

    Pospisil, S.; Janout, Z.; Vobecky, M.

    1979-01-01

    The methods are discussed of carbon content determination in large amounts of material by detecting 4438 keV gamma radiation accompanying inelastic scattering of neutrons from a radionuclide neutron source. Presented are the methodological analysis of the problem, the results of test measurements, and methodological recommendations for the practical application of the method. Test measurements were conducted on fly ash, limestone and brown coal in amounts of approximately 5 kg for each material sample, using an Am-Be neutron source. The determined sensitivity thresholds corresponded to the carbon concentration of 5 to 10% w.w. (S.P.)

  9. Combining Trust and Behavioral Analysis to Detect Security Threats in Open Environments

    Science.gov (United States)

    2010-11-01

    behavioral feature values. This would provide a baseline notional object trust and is formally defined as follows: T_O(1) ∈ [0, 1] = Σ_{t=0,…,n: ν_bt} w_t P(S) (8) and T_O(2) ∈ [0, 1] = Σ_{t=0,…,n: ν_bt} w_t P(S) · identity(O, P) (9), respectively. The w_t weight function determines the significance of a particular behavioral feature in the final trust calculation. Note that the weight
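
    A toy illustration of the weighted-sum form of the two object-trust scores in equations (8) and (9) above; the feature names, weights, and scores are invented for the example, since the record is only a fragment of the report.

        weights = {"msg_rate": 0.5, "payload_entropy": 0.3, "route_stability": 0.2}       # w_t per feature
        p_s = {"msg_rate": 0.9, "payload_entropy": 0.6, "route_stability": 0.8}           # P(S) per feature, in [0, 1]
        identity_match = 0.7                                                              # identity(O, P)

        t1 = sum(weights[f] * p_s[f] for f in weights)                     # eq. (8): behaviour only
        t2 = sum(weights[f] * p_s[f] * identity_match for f in weights)    # eq. (9): behaviour weighted by identity
        print(t1, t2)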

  10. Resonance detection of EEG signals using two-layer wavelet analysis

    International Nuclear Information System (INIS)

    Abdallah, H. M; Odeh, F.S.

    2000-01-01

    This paper presents the hybrid quadrature mirror filter (HQMF) algorithm applied to the electroencephalogram (EEG) signal during mental activity. The information content of this signal, i.e., its medical diagnostic value, lies in its power spectral density (PSD). The HQMF algorithm is a modified technique that is based on the shape and the details of the signal. If applied efficiently, the HQMF algorithm produces much better results than conventional wavelet methods in detecting (diagnosing) the information of the EEG signal from its PSD. This technique is applicable not only to EEG signals, but is also well suited to compression analysis and denoising techniques. (authors). 16 refs., 9 figs
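
    The HQMF filter bank itself is not reproduced here; the sketch below only shows the generic pipeline the abstract refers to, namely a wavelet decomposition of an EEG trace and a power spectral density estimate, applied to a synthetic alpha-band signal.

        import numpy as np
        import pywt
        from scipy.signal import welch

        fs = 256                                                     # Hz, assumed EEG sampling rate
        t = np.arange(0, 4, 1 / fs)
        eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)   # synthetic 10 Hz alpha rhythm + noise

        coeffs = pywt.wavedec(eeg, "db4", level=4)                   # subband decomposition (A4, D4, D3, D2, D1)
        powers = [float(np.mean(c ** 2)) for c in coeffs]            # mean power per subband
        print(dict(zip(["A4", "D4", "D3", "D2", "D1"], np.round(powers, 3))))

        freqs, psd = welch(eeg, fs=fs, nperseg=512)                  # PSD of the raw trace
        print("dominant frequency:", freqs[np.argmax(psd)], "Hz")    # ~10 Hz for this synthetic trace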

  11. Design, Simulation and Analysis of Cantilever Sensor for in-Vitro LDL Detection

    Directory of Open Access Journals (Sweden)

    Dr. S. Hosimin Thilagar

    2011-07-01

    Full Text Available This work is focused on the design, simulation and analysis of a microcantilever integrated with piezoresistors in a Wheatstone bridge arrangement to detect low density lipoprotein (LDL) in blood, which is responsible for cholesterol accumulation in arteries. This paper uses the Finite Element Method (FEM) to obtain the performance of the piezoresistive microcantilever sensor in measuring the surface stress corresponding to the adsorption of LDL molecules. The FEM results are compared with analytical solutions. The results suggest that the designed sensor can effectively sense LDL molecules in vitro with a few microlitres of blood sample.

  12. Palladium configuration dependence of hydrogen detection sensitivity based on graphene FET for breath analysis

    Science.gov (United States)

    Sakamoto, Yuri; Uemura, Kohei; Ikuta, Takashi; Maehashi, Kenzo

    2018-04-01

    We have succeeded in fabricating a hydrogen gas sensor based on palladium-modified graphene field-effect transistors (FETs). The negative-voltage shift in the transfer characteristics was observed with exposure to hydrogen gas, which was explained by the change in work function. The hydrogen concentration dependence of the voltage shift was investigated using graphene FETs with palladium deposited by three different evaporation processes. The results indicate that the hydrogen detection sensitivity of the palladium-modified graphene FETs is strongly dependent on the palladium configuration. Therefore, the palladium-modified graphene FET is a candidate for breath analysis.

  13. Early detection of abnormality by a logistic analysis of atomic bomb survivor's serially performed examinations

    International Nuclear Information System (INIS)

    Mori, Hiroyuki; Nakamura, Tsuyoshi; Toyoda, Shigeki; Morikawa, Akira.

    1986-01-01

    It is important to establish a method of using periodic medical examination results for disease screening. A method is described of logistically analyzing results from four or more periodic examinations for the early detection of abnormality. This method was applied to 308 patients who died of gastric cancer and 3,002 normal controls. The logistic analysis showed that the ability to differentiate normal from abnormal findings was better when using the results of the periodic examinations than when using the data of the one particular examination. (Namekawa, K.)
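
    A hedged sketch of the underlying idea, assuming simulated data: a logistic model fed with a marker measured at four consecutive periodic examinations, so that the longitudinal trend rather than a single visit carries the discriminative signal.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n = 500
        exams = rng.normal(size=(n, 4))                  # one marker measured at 4 consecutive exams
        trend = exams[:, -1] - exams[:, 0]               # deterioration over time carries the signal
        y = (trend + rng.normal(0, 0.5, n) > 1.0).astype(int)

        model = LogisticRegression()
        print("cross-validated AUC:", cross_val_score(model, exams, y, cv=5, scoring="roc_auc").mean())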

  14. Detection of icing on wind turbine blades by means of vibration and power curve analysis

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Kleani, Karolina; Mijatovic, Nenad

    2016-01-01

    Ice accretion on wind turbines' blades is one of the main challenges of systems installed in cold climate locations, resulting in power performance deterioration and excessive nacelle oscillation. In this work, consistent detection of icing events is achieved utilizing indications from the nacelle accelerometers and power performance analysis. Features extracted from these two techniques serve as inputs in a decision-making scheme, allowing early activation of de-icing systems or shut down of the wind turbine. An additional parameter is the month of operation, assuring consistent outcomes in both winter ...

  15. A neurophysiological method of rapid detection and analysis of marine algal toxins

    DEFF Research Database (Denmark)

    Kerr, DS; Bødtkjer, Donna Briggs; Saba, HI

    1999-01-01

    a robust, reversible increase in amplitude mic spikes, and the appearance of multiple spikes (i.e., epileptiform activity) within minutes of toxin wash-in. Other notable features of the domoic acid signature included a significant decrease in amplitude of the field EPSPs, and a complete absence of effect...... responsive fashion at toxin concentrations of 25-200 nM, and tests of naturally contaminated shellfish confirmed the utility of this assay as a screening method for PSP. Our findings suggest that the in vitro hippocampal slice preparation has potential in the detection and analysis of three marine algal...

  16. Privacy-Preserved Behavior Analysis and Fall Detection by an Infrared Ceiling Sensor Network

    Directory of Open Access Journals (Sweden)

    Mineichi Kudo

    2012-12-01

    Full Text Available An infrared ceiling sensor network system is reported in this study to realize behavior analysis and fall detection of a single person in the home environment. The sensors output multiple binary sequences from which we know the existence/non-existence of persons under the sensors. The short-duration averages of the binary responses can be regarded as pixel values of a top-view camera, but are more advantageous in the sense of preserving privacy. Using the "pixel values" as features, support vector machine classifiers succeeded in recognizing eight activities (walking, reading, etc.) performed by five subjects at an average recognition rate of 80.65%. In addition, we proposed a martingale framework for detecting falls in this system. The experimental results showed that we attained the best performance of 95.14% (F1 value), with a FAR of 7.5% and an FRR of 2.0%. This accuracy is not sufficient in general but surprisingly high with such low-level information. In summary, it is shown that this system has the potential to be used in the home environment to provide personalized services and to detect abnormalities of elders who live alone.

  17. Capillary electrophoresis method with UV-detection for analysis of free amino acids concentrations in food.

    Science.gov (United States)

    Omar, Mei Musa Ali; Elbashir, Abdalla Ahmed; Schmitz, Oliver J

    2017-01-01

    A simple and inexpensive capillary electrophoresis method with UV detection (CE-UV) was optimized and validated for the determination of six amino acids (alanine, asparagine, glutamine, proline, serine and valine) in Sudanese food. Amino acids in the samples were derivatized with 4-chloro-7-nitro-2,1,3-benzoxadiazole (NBD-Cl) prior to CE-UV analysis. Labeling reaction conditions (100 mM borate buffer at pH 8.5, labeling reaction time 60 min, temperature 70°C and NBD-Cl concentration 40 mM) were systematically investigated. The optimal conditions for the separation were 100 mM borate buffer at pH 9.7, with detection at 475 nm. The method was validated in terms of linearity, limit of detection (LOD), limit of quantification (LOQ), precision (repeatability, RSD%) and accuracy (recovery). Good linearity was achieved for all amino acids (r(2) > 0.9981) in the concentration range of 2.5-40 mg/L. LODs in the range of 0.32-0.56 mg/L were obtained. Recoveries of amino acids ranging from 85% to 108% (n = 3) were obtained. The validated method was successfully applied for the determination of amino acids in Sudanese food samples. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Detection of heart disease by open access echocardiography: a retrospective analysis of general practice referrals.

    Science.gov (United States)

    Chambers, John; Kabir, Saleha; Cajeat, Eric

    2014-02-01

    Heart disease is difficult to detect clinically and it has been suggested that echocardiography should be available to all patients with possible cardiac symptoms or signs. To analyse the results of 2 years of open access echocardiography for the frequency of structural heart disease according to request. Retrospective database analysis in a teaching hospital open access echocardiography service. Reports of all open access transthoracic echocardiograms between January 2011 and December 2012 were categorised as normal, having minor abnormalities, or significant abnormalities according to the indication. There were 2343 open access echocardiograms performed and there were significant abnormalities in 29%, predominantly valve disease (n = 304, 13%), LV systolic dysfunction (n = 179, 8%), aortic dilatation (n = 80, 3%), or pulmonary hypertension (n = 91, 4%). If echocardiography had been targeted at a high-risk group, 267 with valve disease would have been detected (compared to 127 with murmur alone) and 139 with LV systolic dysfunction (compared to 91 with suspected heart failure alone). Most GP practices requested fewer than 10 studies, but 6 practices requested over 70 studies. Open access echocardiograms are often abnormal but structural disease may not be suspected from the clinical request. Uptake by individual practices is patchy. A targeted expansion of echocardiography in patients with a high likelihood of disease is therefore likely to increase the detection of clinically important pathology.

  19. Change Analysis and Decision Tree Based Detection Model for Residential Objects across Multiple Scales

    Directory of Open Access Journals (Sweden)

    CHEN Liyan

    2018-03-01

    Full Text Available Change analysis and detection plays an important role in the updating of multi-scale databases. When an updated larger-scale dataset is overlaid on a to-be-updated smaller-scale dataset, people usually focus on temporal changes caused by the evolution of spatial entities. Little attention is paid to the representation changes introduced by map generalization. Using polygonal building data as an example, this study examines the changes from different perspectives, such as the reasons for their occurrence and the forms in which they appear. Based on this knowledge, we employ a decision tree, from the field of machine learning, to establish a change detection model. The aim of the proposed model is to distinguish temporal changes that need to be applied as updates to the smaller-scale dataset from representation changes. The proposed method is validated through tests using real-world building data from Guangzhou city. The experimental results show the overall precision of change detection is more than 90%, which indicates that our method is effective in identifying changed objects.
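
    An illustrative sketch only (not the authors' feature set): a decision tree trained to separate temporal changes from generalization-induced representation changes using invented geometric features of matched building objects.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(2)
        # invented features per changed building object: area ratio, shape-index difference, overlap ratio
        X = rng.uniform(0, 1, size=(300, 3))
        y = (X[:, 0] > 0.4) & (X[:, 2] < 0.6)      # pretend labelling rule standing in for training labels

        tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
        print("training accuracy:", tree.score(X, y))   # objects flagged True would be applied as updates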

  20. Drug sales data analysis for outbreak detection of infectious diseases: a systematic literature review.

    Science.gov (United States)

    Pivette, Mathilde; Mueller, Judith E; Crépey, Pascal; Bar-Hen, Avner

    2014-11-18

    This systematic literature review aimed to summarize evidence for the added value of drug sales data analysis for the surveillance of infectious diseases. A search for relevant publications was conducted in Pubmed, Embase, Scopus, Cochrane Library, African Index Medicus and Lilacs databases. Retrieved studies were evaluated in terms of objectives, diseases studied, data sources, methodologies and performance for real-time surveillance. Most studies compared drug sales data to reference surveillance data using correlation measurements or indicators of outbreak detection performance (sensitivity, specificity, timeliness of the detection). We screened 3266 articles and included 27 in the review. Most studies focused on acute respiratory and gastroenteritis infections. Nineteen studies retrospectively compared drug sales data to reference clinical data, and significant correlations were observed in 17 of them. Four studies found that over-the-counter drug sales preceded clinical data in terms of incidence increase. Five studies developed and evaluated statistical algorithms for selecting drug groups to monitor specific diseases. Another three studies developed models to predict incidence increase from drug sales. Drug sales data analyses appear to be a useful tool for surveillance of gastrointestinal and respiratory disease, and OTC drugs have the potential for early outbreak detection. Their utility remains to be investigated for other diseases, in particular those poorly surveyed.

  1. Detection of heart disease by open access echocardiography: a retrospective analysis of general practice referrals

    Science.gov (United States)

    Chambers, John; Kabir, Saleha; Cajeat, Eric

    2014-01-01

    Background Heart disease is difficult to detect clinically and it has been suggested that echocardiography should be available to all patients with possible cardiac symptoms or signs. Aim To analyse the results of 2 years of open access echocardiography for the frequency of structural heart disease according to request. Design and setting Retrospective database analysis in a teaching hospital open access echocardiography service. Method Reports of all open access transthoracic echocardiograms between January 2011 and December 2012 were categorised as normal, having minor abnormalities, or significant abnormalities according to the indication. Results There were 2343 open access echocardiograms performed and there were significant abnormalities in 29%, predominantly valve disease (n = 304, 13%), LV systolic dysfunction (n = 179, 8%), aortic dilatation (n = 80, 3%), or pulmonary hypertension (n = 91, 4%). If echocardiography had been targeted at a high-risk group, 267 with valve disease would have been detected (compared to 127 with murmur alone) and 139 with LV systolic dysfunction (compared to 91 with suspected heart failure alone). Most GP practices requested fewer than 10 studies, but 6 practices requested over 70 studies. Conclusion Open access echocardiograms are often abnormal but structural disease may not be suspected from the clinical request. Uptake by individual practices is patchy. A targeted expansion of echocardiography in patients with a high likelihood of disease is therefore likely to increase the detection of clinically important pathology. PMID:24567615

  2. Comparative Analysis of ACAS-Xu and DAIDALUS Detect-and-Avoid Systems

    Science.gov (United States)

    Davies, Jason T.; Wu, Minghong G.

    2018-01-01

    The Detect and Avoid (DAA) capability of a recent version (Run 3) of the Airborne Collision Avoidance System-Xu (ACAS-Xu) is measured against that of the Detect and AvoID Alerting Logic for Unmanned Systems (DAIDALUS), a reference algorithm for the Phase 1 Minimum Operational Performance Standards (MOPS) for DAA. This comparative analysis of the two systems' alerting and horizontal guidance outcomes is conducted through the lens of the Detect and Avoid mission using flight data of scripted encounters from a recent flight test. Results indicate comparable timelines and outcomes between ACAS-Xu's Remain Well Clear alert and guidance and DAIDALUS's corrective alert and guidance, although ACAS-Xu's guidance appears to be more conservative. ACAS-Xu's Collision Avoidance alert and guidance occurs later than DAIDALUS's warning alert and guidance, and overlaps with DAIDALUS's timeline of maneuver to remain Well Clear. Interesting discrepancies between ACAS-Xu's directive guidance and DAIDALUS's "Regain Well Clear" guidance occur in some scenarios.

  3. Hippocampus shape analysis for temporal lobe epilepsy detection in magnetic resonance imaging

    Science.gov (United States)

    Kohan, Zohreh; Azmi, Reza

    2016-03-01

    There is evidence in the literature that Temporal Lobe Epilepsy (TLE) causes lateralized atrophy and deformation of the hippocampus and other substructures of the brain. Magnetic Resonance Imaging (MRI), due to its high-contrast soft tissue imaging, is one of the most popular imaging modalities used in TLE diagnosis and treatment procedures. Using an algorithm to help clinicians perform better and more effective shape deformation analysis could improve the diagnosis and treatment of the disease. In this project our purpose is to design, implement and test a classification algorithm for MRIs based on hippocampal asymmetry detection using shape- and size-based features. Our method consists of two main parts: (1) shape feature extraction, and (2) image classification. We tested 11 different shape and size features and selected four of them that detect the asymmetry in the hippocampus significantly in a randomly selected subset of the dataset. Then, we employed a support vector machine (SVM) classifier to classify the remaining images of the dataset into normal and epileptic images using our selected features. The dataset contains 25 patient images, of which 12 cases were used as a training set and the remaining 13 cases for testing the performance of the classifier. We measured accuracy, specificity and sensitivity of, respectively, 76%, 100%, and 70% for our algorithm. The preliminary results show that using shape and size features for detecting hippocampal asymmetry could be helpful in TLE diagnosis in MRI.
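
    A small sketch of the evaluation step, computing accuracy, sensitivity, and specificity from a confusion matrix over the 13 held-out cases; the prediction vectors below are placeholders, not the study's results.

        import numpy as np
        from sklearn.metrics import confusion_matrix

        y_true = np.array([1, 1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0])   # assumed split of the 13 test cases
        y_pred = np.array([1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0])   # placeholder SVM predictions

        tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
        print("accuracy", (tp + tn) / len(y_true),
              "sensitivity", tp / (tp + fn),
              "specificity", tn / (tn + fp))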

  4. 3D depth image analysis for indoor fall detection of elderly people

    Directory of Open Access Journals (Sweden)

    Lei Yang

    2016-02-01

    Full Text Available This paper presents a new fall detection method for elderly people in a room environment based on shape analysis of 3D depth images captured by a Kinect sensor. Depth images are pre-processed by a median filter both for background and target. The silhouette of the moving individual in the depth images is obtained by subtracting the background frames. The depth images are converted to a disparity map, which is obtained from the horizontal and vertical projection histogram statistics. The initial floor plane information is obtained from the V-disparity map, and the floor plane equation is estimated by the least squares method. Shape information of the human subject in the depth images is analyzed by a set of moment functions. Coefficients of ellipses are calculated to determine the direction of the individual. The centroid of the human body is calculated, and the angle between the human body and the floor plane is computed. When both the distance from the centroid of the human body to the floor plane and the angle between the human body and the floor plane are lower than given thresholds, a fall incident is detected. Experiments with different falling directions were performed. Experimental results show that the proposed method can detect fall incidents effectively.
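
    A minimal geometric sketch of the decision rule described, assuming a known floor plane: a fall is flagged when both the centroid height above the floor and the body-floor angle drop below thresholds. The plane, thresholds, and test vectors are invented for illustration.

        import numpy as np

        def fall_detected(centroid, body_axis, floor_normal=(0.0, 1.0, 0.0), floor_d=0.0,
                          height_thresh=0.4, angle_thresh_deg=30.0):
            n = np.asarray(floor_normal, float)
            n /= np.linalg.norm(n)
            height = abs(np.dot(n, centroid) + floor_d)                    # centroid distance to the floor plane
            axis = np.asarray(body_axis, float) / np.linalg.norm(body_axis)
            angle = np.degrees(np.arcsin(abs(np.dot(n, axis))))            # body axis angle to the floor plane
            return height < height_thresh and angle < angle_thresh_deg

        # lying close to the floor with a near-horizontal body axis -> flagged as a fall
        print(fall_detected(centroid=[0.1, 0.3, 2.0], body_axis=[0.9, 0.1, 0.4]))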

  5. Maximum likelihood fitting of FROC curves under an initial-detection-and-candidate-analysis model

    International Nuclear Information System (INIS)

    Edwards, Darrin C.; Kupinski, Matthew A.; Metz, Charles E.; Nishikawa, Robert M.

    2002-01-01

    We have developed a model for FROC curve fitting that relates the observer's FROC performance not to the ROC performance that would be obtained if the observer's responses were scored on a per image basis, but rather to a hypothesized ROC performance that the observer would obtain in the task of classifying a set of 'candidate detections' as positive or negative. We adopt the assumptions of the Bunch FROC model, namely that the observer's detections are all mutually independent, as well as assumptions qualitatively similar to, but different in nature from, those made by Chakraborty in his AFROC scoring methodology. Under the assumptions of our model, we show that the observer's FROC performance is a linearly scaled version of the candidate analysis ROC curve, where the scaling factors are just given by the FROC operating point coordinates for detecting initial candidates. Further, we show that the likelihood function of the model parameters given observational data takes on a simple form, and we develop a maximum likelihood method for fitting a FROC curve to this data. FROC and AFROC curves are produced for computer vision observer datasets and compared with the results of the AFROC scoring method. Although developed primarily with computer vision schemes in mind, we hope that the methodology presented here will prove worthy of further study in other applications as well
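
    One hedged way to write the linear scaling stated above in LaTeX, where (NLF_0, LLF_0) denotes the FROC operating point for detecting initial candidates and (FPF_c, TPF_c) the hypothesized candidate-analysis ROC curve; the notation is ours, not necessarily the authors'.

        \mathrm{LLF}(\zeta) \;=\; \mathrm{LLF}_{0}\,\mathrm{TPF}_{c}(\zeta),
        \qquad
        \mathrm{NLF}(\zeta) \;=\; \mathrm{NLF}_{0}\,\mathrm{FPF}_{c}(\zeta)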

  6. Detection and phylogenetic analysis of a new adenoviral polymerase gene in reptiles in Korea.

    Science.gov (United States)

    Bak, Eun-Jung; Jho, Yeonsook; Woo, Gye-Hyeong

    2018-06-01

    Over a period of 7 years (2004-2011), samples from 34 diseased reptiles provided by local governments, zoos, and pet shops were tested for viral infection. Animals were diagnosed based on clinical signs, including loss of appetite, diarrhea, rhinorrhea, and unexpected sudden death. Most of the exotic animals had gastrointestinal problems, such as mucosal redness and ulcers, while the native animals had no clinical symptoms. Viral sequences were found in seven animals. Retroviral genes were amplified from samples from five Burmese pythons (Python molurus bivittatus), an adenovirus was detected in a panther chameleon (Furcifer pardalis), and an adenovirus and a paramyxovirus were detected in a tropical girdled lizard (Cordylus tropidosternum). Phylogenetic analysis of retroviruses and paramyxoviruses showed the highest sequence identity to both a Python molurus endogenous retrovirus and a Python curtus endogenous retrovirus and to a lizard isolate, respectively. Partial sequencing of an adenoviral DNA polymerase gene from the lizard isolate suggested that the corresponding virus was a novel isolate different from the reference strain (accession no. AY576677.1). The virus was not isolated but was detected, using molecular genetic techniques, in a lizard raised in a pet shop. This animal was also coinfected with a paramyxovirus.

  7. Microplate-reader method for the rapid analysis of copper in natural waters with chemiluminescence detection

    Directory of Open Access Journals (Sweden)

    Axel eDurand

    2013-01-01

    Full Text Available We have developed a method for the determination of copper in natural waters at nanomolar levels. The use of a microplate reader minimises sample processing time (~25 s per sample), reagent consumption (~120 μL per sample) and sample volume (~700 μL). Copper is detected by chemiluminescence. This technique is based on the formation of a complex between copper and 1,10-phenanthroline and the subsequent emission of light during the oxidation of the complex by hydrogen peroxide. Samples are acidified to pH 1.7 and then introduced directly into a 24-well plate. Reagents are added during data acquisition via two reagent injectors. When trace-metal-clean protocols are employed, the reproducibility is generally less than 7% on blanks and the detection limit is 0.7 nM for seawater and 0.4 nM for freshwater. More than 100 samples per hour can be analyzed with this technique, which is simple, robust, and amenable to at-sea analysis. Seawater samples from Storm Bay in Tasmania illustrate the utility of the method for environmental science. Indeed, other trace metals for which optical detection methods exist (e.g. chemiluminescence, fluorescence and absorbance) could be adapted to the microplate reader.

  8. Light emitting diode, photodiode-based fluorescence detection system for DNA analysis with microchip electrophoresis.

    Science.gov (United States)

    Hall, Gordon H; Glerum, D Moira; Backhouse, Christopher J

    2016-02-01

    Electrophoretic separation of fluorescently end-labeled DNA after a PCR serves as a gold standard in genetic diagnostics. Because of their size and cost, instruments for this type of analysis have had limited market uptake, particularly for point-of-care applications. This might be changed through a higher level of system integration and lower instrument costs that can be realized through the use of LEDs for excitation and photodiodes for detection--if they provide sufficient sensitivity. Here, we demonstrate an optimized microchip electrophoresis instrument using polymeric fluidic chips with fluorescence detection of end-labeled DNA with a LOD of 0.15 nM of Alexa Fluor 532. This represents orders of magnitude improvement over previously reported instruments of this type. We demonstrate the system with an electrophoretic separation of two PCR products and their respective primers. We believe that this is the first LED-induced fluorescence microchip electrophoresis system with photodiode-based detection that could be used for standard applications of PCR and electrophoresis. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Tools for the quantitative analysis of sedimentation boundaries detected by fluorescence optical analytical ultracentrifugation.

    Directory of Open Access Journals (Sweden)

    Huaying Zhao

    Full Text Available Fluorescence optical detection in sedimentation velocity analytical ultracentrifugation allows the study of macromolecules at nanomolar concentrations and below. This has significant promise, for example, for the study of systems of high-affinity protein interactions. Here we describe adaptations of the direct boundary modeling analysis approach implemented in the software SEDFIT that were developed to accommodate unique characteristics of the confocal fluorescence detection system. These include spatial gradients of signal intensity due to scanner movements out of the plane of rotation, temporal intensity drifts due to instability of the laser and fluorophores, and masking of the finite excitation and detection cone by the sample holder. In an extensive series of experiments with enhanced green fluorescent protein ranging from low nanomolar to low micromolar concentrations, we show that the experimental data provide sufficient information to determine the parameters required for first-order approximation of the impact of these effects on the recorded data. Systematic deviations of fluorescence optical sedimentation velocity data analyzed using conventional sedimentation models developed for absorbance and interference optics are largely removed after these adaptations, resulting in excellent fits that highlight the high precision of fluorescence sedimentation velocity data, thus allowing a more detailed quantitative interpretation of the signal boundaries that is otherwise not possible for this system.

  10. Study on Analysis and Pattern Recognition of the Manifestation of the Pulse Detection of Cerebrovascular Disease

    Energy Technology Data Exchange (ETDEWEB)

    Jing, J; Wang, Y C; Hong, W X; Zhang, W P [Department of Biomedical Engineering, University of Yanshan, Qinhuangdao, Hebei Province, 066004 (China)

    2006-10-15

    Cerebrovascular Disease (CVD) is also called stroke in Traditional Chinese Medicine (TCM). CVD is a common disease with a high incidence, high death rate, high disability rate and high relapse rate. The pathogenesis of CVD is related to many factors. In modern medicine, various instruments can be used to check many biochemical parameters. However, at present, the early detection of CVD is mostly done manually by specialists. In TCM, an experienced practitioner can assess the state of a CVD patient by feeling his or her pulse. Applying modern information and engineering techniques to the early discovery of CVD is significant, and it is also a practical challenge. In this paper, the authors present a detection method for CVD based on analysis and pattern recognition of the Manifestation of the Pulse of TCM using wavelet technology and neural networks. Pulse signals from normal healthy persons and CVD patients were studied comparatively. This research method is flexible enough to deal with other physiological signals.

  11. Analysis of human reticulocyte genes reveals altered erythropoiesis: potential use to detect recombinant human erythropoietin doping.

    Science.gov (United States)

    Varlet-Marie, Emmanuelle; Audran, Michel; Lejeune, Mireille; Bonafoux, Béatrice; Sicart, Marie-Therese; Marti, Jacques; Piquemal, David; Commes, Thérèse

    2004-08-01

    Enhancement of oxygen delivery to tissues is associated with improved sporting performance. One way of enhancing oxygen delivery is to take recombinant human erythropoietin (rHuEpo), which is an unethical and potentially dangerous practice. However, detection of the use of rHuEpo remains difficult in situations such as: i) several days after the end of treatment, ii) when a treatment with low doses is conducted, iii) if the rHuEpo effect is increased by other substances. In an attempt to detect rHuEpo abuse, we selected erythroid gene markers from a SAGE library and analyzed the effects of rHuEpo administration on expression of the HBB, FTL and OAZ genes. Ten athletes were assigned to the rHuEpo or placebo group. The rHuEpo group received subcutaneous injections of rHuEpo (50 UI/kg three times a week, 4 weeks; 20 UI/kg three times a week, 2 weeks). HBB, FTL and OAZ gene profiles were monitored by real-time polymerase chain reaction (PCR) quantification during and for 3 weeks after drug administration. The global analysis of these targeted genes detected in whole blood samples showed a characteristic profile of subjects misusing rHuEpo, with an increase above the threshold levels. The individual analysis of OAZ mRNA seemed indicative of rHuEpo treatment. The performance-enhancing effect of rHuEpo treatment outlasts the hematologic changes associated with rHuEpo misuse. Although direct electrophoretic methods to detect rHuEpo have been developed, recombinant isoforms of rHuEpo are not detectable some days after the last subcutaneous injection. To overcome these limitations, indirect OFF models have been developed. Our data suggest that, in the near future, it will be possible to consolidate results achievable with the OFF models by analyzing selected erythroid gene markers as a supplement to indirect methods.

  12. Nuclear Smuggling Detection and Deterrence FY 2016 Data Analysis Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Enders, Alexander L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Harris, Tyrone C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Pope, Thomas C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Patterson, Jeremy B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-01-01

    The National Nuclear Security Administration’s Office of Nuclear Smuggling Detection and Deterrence (NSDD) has facilitated the installation of more than 3,500 radiation portal monitors (RPMs) at 606 sites in 56 countries worldwide. This collection of RPMs represents the world’s largest network of radiation detectors and provides one element in the defense-in-depth approach that supports the Global Nuclear Detection Architecture. These systems support NSDD’s mission to build partner country capability to deter, detect, and interdict the illicit transport of radiological and fissile material through strategic points of entry and exit at seaports, airports, and border crossings. NSDD works collaboratively with partner countries and international organizations to optimize the operation of these RPMs. The large amount of data provided by NSDD partner countries highlights the close cooperation and partnerships NSDD has built with 56 countries around the world. Thirty-seven of these countries shared at least some RPM-related data with NSDD in fiscal year 2016. This significant level of data sharing is a key element that distinguishes the NSDD office as unique among nuclear nonproliferation programs and initiatives: NSDD can provide specific, objective, data-driven decisions and support for sustaining the radiation detection systems it helped deploy. This data analysis report summarizes and aggregates the RPM data provided to the NSDD office for analysis and review in fiscal year 2016. The data can be used to describe RPM performance and characterize the wide diversity of NSDD deployment sites. For example, NSDD deploys detector systems across sites with natural background radiation levels that can vary by a factor of approximately six from site to site. Some lanes have few occupancies, whereas others have approximately 8,000 occupancies per day and the different types of cargo that travel through a site can result in site-wide alarm rates that range from near 0% at

  13. Comparative analysis on the selection of number of clusters in community detection

    Science.gov (United States)

    Kawamoto, Tatsuro; Kabashima, Yoshiyuki

    2018-02-01

    We conduct a comparative analysis on various estimates of the number of clusters in community detection. An exhaustive comparison requires testing of all possible combinations of frameworks, algorithms, and assessment criteria. In this paper we focus on the framework based on a stochastic block model, and investigate the performance of greedy algorithms, statistical inference, and spectral methods. For the assessment criteria, we consider modularity, the map equation, the Bethe free energy, prediction errors, and isolated eigenvalues. The analysis reveals the tendencies toward overfitting and underfitting exhibited by the assessment criteria and algorithms. In addition, we propose that the alluvial diagram is a suitable tool to visualize statistical inference results and can be useful to determine the number of clusters.
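
    As an illustration of how an assessment criterion can select the number of clusters, the sketch below scores successively finer partitions of a standard test graph by modularity; this is only a toy example with one greedy algorithm and one criterion, not the stochastic-block-model framework studied in the paper.

```python
# Minimal sketch: modularity evaluated over a range of candidate cluster
# counts produced by the Girvan-Newman algorithm on a standard test graph.
import networkx as nx
from networkx.algorithms.community import girvan_newman, modularity
from itertools import islice

G = nx.karate_club_graph()  # stand-in network

best_q, best_partition = -1.0, None
for partition in islice(girvan_newman(G), 8):      # partitions with 2..9 clusters
    q = modularity(G, partition)
    print(f"{len(partition)} clusters: modularity = {q:.3f}")
    if q > best_q:
        best_q, best_partition = q, partition

print("selected number of clusters:", len(best_partition))
```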

  14. Hyperspectral Imaging and SPA-LDA Quantitative Analysis for Detection of Colon Cancer Tissue

    Science.gov (United States)

    Yuan, X.; Zhang, D.; Wang, Ch.; Dai, B.; Zhao, M.; Li, B.

    2018-05-01

    Hyperspectral imaging (HSI) has been demonstrated to provide a rapid, precise, and noninvasive method for cancer detection. However, because HSI produces large volumes of data, quantitative analysis is often necessary to distill the information useful for distinguishing cancerous from normal tissue. To demonstrate that HSI with our proposed algorithm can make this distinction, we built a Vis-NIR HSI setup and acquired numerous spectral images of colon tissue, and then used a successive projections algorithm (SPA) to analyze the hyperspectral image data of the tissues. This was used to build an identification model based on linear discriminant analysis (LDA) using the relative reflectance values at the effective wavelengths. Other tissues were used as a prediction set to verify the reliability of the identification model. The results suggest that Vis-NIR hyperspectral images, together with the spectroscopic classification method, provide a new approach for reliable and safe diagnosis of colon cancer and could lead to advances in cancer diagnosis generally.
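
    A hedged sketch of the SPA-LDA idea follows: a minimal successive projections algorithm selects a subset of minimally collinear wavelengths and an LDA classifier is trained on them. The spectra and labels are synthetic placeholders, and the paper's preprocessing and chosen bands are not reproduced.

```python
# Sketch of SPA-based wavelength selection followed by LDA classification.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def spa_select(X, n_select, start=0):
    """Successive Projections Algorithm: greedily pick columns (wavelengths)
    whose projections onto the orthogonal complement of the already selected
    set have maximal norm, which limits collinearity."""
    Xp = X.copy()
    selected = [start]
    for _ in range(n_select - 1):
        v = Xp[:, selected[-1]][:, None]
        # project every column onto the complement of the last selected one
        Xp = Xp - v @ (v.T @ Xp) / (v.T @ v)
        norms = np.linalg.norm(Xp, axis=0)
        norms[selected] = -1.0            # do not reselect
        selected.append(int(norms.argmax()))
    return selected

rng = np.random.default_rng(1)
spectra = rng.normal(size=(120, 256))     # 120 pixels x 256 bands (synthetic)
labels = rng.integers(0, 2, size=120)     # 0 = normal, 1 = cancerous (hypothetical)

bands = spa_select(spectra, n_select=10)
lda = LinearDiscriminantAnalysis()
print("selected bands:", bands)
print("CV accuracy:", cross_val_score(lda, spectra[:, bands], labels, cv=5).mean())
```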

  15. Applications of capillary electrophoresis with chemiluminescence detection in clinical, environmental and food analysis. A review

    International Nuclear Information System (INIS)

    Lara, Francisco J.; Airado-Rodríguez, Diego; Moreno-González, David; Huertas-Pérez, José F.; García-Campaña, Ana M.

    2016-01-01

    This paper reviews the latest developments and analytical applications of chemiluminescence detection coupled to capillary electrophoresis (CE-CL). Different sections considering the most common CL systems have been included, such as the tris(2,2′-bipyridine)ruthenium(II) system, the luminol and acridinium derivative reactions, the peroxyoxalate CL or direct oxidations. Improvements in instrumental designs, new strategies for improving both resolution and sensitivity, and applications in different fields such as clinical, pharmaceutical, environmental and food analysis have been included. This review covers the literature from 2010 to 2015.
    Highlights:
    • An up-to-date critical review about the evolution of CE-CL is presented.
    • Tris(2,2′-bipyridine)ruthenium(II) and luminol as the most used CL systems.
    • Instrumental designs and strategies for improving resolution and sensitivity.
    • Applications in clinical, pharmaceutical, environmental and food analysis.

  16. Colour and shape analysis techniques for weed detection in cereal fields

    DEFF Research Database (Denmark)

    Pérez, A.J; López, F; Benlloch, J.V.

    2000-01-01

    Information on weed distribution within the field is necessary to implement spatially variable herbicide application. This paper deals with the development of near-ground image capture and processing techniques in order to detect broad-leaved weeds in cereal crops under actual field conditions. The proposed methods use colour information to discriminate between vegetation and background, whilst shape analysis techniques are applied to distinguish between crop and weeds. The determination of crop row position helps to reduce the number of objects to which shape analysis techniques are applied. The performance of the algorithms was assessed by comparing the results with a human classification, providing an acceptable success rate. The study has shown that, despite the difficulties in accurately determining the number of seedlings (as in visual surveys), it is feasible to use image processing techniques.
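
    The sketch below illustrates the colour-then-shape idea with an excess-green index and simple shape descriptors; the published method's actual colour features, thresholds and row-detection step are not reproduced, and the input image is a random placeholder.

```python
# Illustrative sketch: excess-green segmentation plus per-object shape features.
import numpy as np
from scipy import ndimage

def segment_vegetation(rgb):
    """rgb: float array (H, W, 3) scaled to [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    exg = 2 * g - r - b                     # excess-green index
    mask = exg > exg.mean() + exg.std()     # crude global threshold
    labeled, n = ndimage.label(mask)
    objects = []
    for i in range(1, n + 1):
        area = np.sum(labeled == i)
        ys, xs = np.nonzero(labeled == i)
        height, width = ys.ptp() + 1, xs.ptp() + 1
        elongation = max(height, width) / max(1, min(height, width))
        objects.append({"area": int(area), "elongation": float(elongation)})
    return mask, objects

# Hypothetical image: random field patch standing in for a real photograph.
image = np.random.default_rng(2).random((64, 64, 3))
mask, objs = segment_vegetation(image)
print(len(objs), "vegetation objects found")
```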

  17. Complementary scattered and recoiled ion data from TOF-E heavy ion elastic recoil detection analysis

    International Nuclear Information System (INIS)

    Johnston, P.N.; El Bouanani, M.; Stannard, W.B.; Bubb, I.F.; Cohen, D.D.; Dytlewski, N.; Siegele, R.

    1998-01-01

    The advantage of Time of Flight and Energy (ToF-E) Heavy Ion Elastic Recoil Detection Analysis (HIERDA) over Rutherford Backscattering (RBS) analysis is its mass and energy dispersive capabilities. The mass resolution of ToF-E HIERDA deteriorates for very heavy elements; the limitation is related to the poor energy resolution of Si detectors for heavy ions. While the energy spectra from ToF-E HIERDA data are normally used to extract depth profiles, this work discusses the benefits of using the time spectra of both the recoiled and the scattered ions for depth profiling. Simulation of the complementary scattered and recoiled ion time spectra improves depth profiling and reduces the current limitations when dealing with very heavy ions such as Pt, Bi, and Ta. (authors)

  18. DETECTION OF MALNUTRITION IN PATIENTS UNDERGOING MAINTENANCE HAEMODIALYSIS: A QUANTITATIVE DATA ANALYSIS ON 12 PARAMETERS.

    Science.gov (United States)

    Nafzger, Sonja; Fleury, Lea-Angelica; Uehlinger, Dominik E; Plüss, Petra; Scura, Ninetta; Kurmann, Silvia

    2015-09-01

    Protein-energy malnutrition (PEM) is common in people with end stage kidney disease (ESKD) undergoing maintenance haemodialysis (MHD) and correlates strongly with mortality. To this day, there is no gold standard for detecting PEM in patients on MHD. The aim of this study was to evaluate whether Nutritional Risk Screening 2002 (NRS-2002), handgrip strength measurement, mid-upper arm muscle area (MUAMA), triceps skin fold measurement (TSF), serum albumin, normalised protein catabolic rate (nPCR), Kt/V and eKt/V, dry body weight, body mass index (BMI), age and time since start of MHD are relevant for assessing PEM in patients on MHD. The predictive value of the selected parameters on mortality, and on mortality or weight loss of more than 5%, was assessed. Quantitative data analysis of the 12 parameters in the same patients on MHD in autumn 2009 (n = 64) and spring 2011 (n = 40), with paired statistical analysis and multivariate logistic regression analysis, was performed. Paired data analysis showed significant reductions in dry body weight, BMI and nPCR. Kt/Vtot did not change, while eKt/V and handgrip strength measurements were significantly higher in spring 2011. No changes were detected in TSF, serum albumin, NRS-2002 and MUAMA. Serum albumin was shown to be the only predictor of death and of the combined endpoint "death or weight loss of more than 5%". We now screen patients biannually for serum albumin, nPCR, Kt/V, handgrip measurement of the shunt-free arm, dry body weight, age and time since initiation of MHD. © 2015 European Dialysis and Transplant Nurses Association/European Renal Care Association.

  19. Analysis of Oriented Texture with Application to the Detection of Architectural Distortion in Mammograms

    CERN Document Server

    Ayres, Fabio; Desautels, JE Leo

    2011-01-01

    The presence of oriented features in images often conveys important information about the scene or the objects contained; the analysis of oriented patterns is an important task in the general framework of image understanding. As in many other applications of computer vision, the general framework for the understanding of oriented features in images can be divided into low- and high-level analysis. In the context of the study of oriented features, low-level analysis includes the detection of oriented features in images; a measure of the local magnitude and orientation of oriented features over

  20. Detection of Anti-Hepatitis B Virus Drug Resistance Mutations Based on Multicolor Melting Curve Analysis.

    Science.gov (United States)

    Mou, Yi; Athar, Muhammad Ammar; Wu, Yuzhen; Xu, Ye; Wu, Jianhua; Xu, Zhenxing; Hayder, Zulfiqar; Khan, Saeed; Idrees, Muhammad; Nasir, Muhammad Israr; Liao, Yiqun; Li, Qingge

    2016-11-01

    Detection of anti-hepatitis B virus (HBV) drug resistance mutations is critical for therapeutic decisions in chronic hepatitis B virus infection. We describe a real-time PCR-based assay using multicolor melting curve analysis (MMCA) that could accurately detect 24 HBV nucleotide mutations at 10 amino acid positions in the reverse transcriptase region of the HBV polymerase gene. The two-reaction assay had a limit of detection of 5 copies per reaction and could detect a minor mutant population (5% of the total population) carrying the reverse transcriptase M204V amino acid mutation in the presence of the major wild-type population when the overall concentration was 10^4 copies/μl. The assay could be finished within 3 h, and the cost of materials for each sample was less than $10. Clinical validation studies using three groups of samples from both nucleos(t)ide analog-treated and -untreated patients showed that the results for 99.3% (840/846) of the samples and 99.9% (8,454/8,460) of the amino acids were concordant with those of Sanger sequencing of the PCR amplicon from the HBV reverse transcriptase region (PCR Sanger sequencing). HBV DNA in six samples with mixed infections consisting of minor mutant subpopulations was undetected by the PCR Sanger sequencing method but was detected by MMCA, and the results were confirmed by coamplification at a lower denaturation temperature-PCR Sanger sequencing. Among the treated patients, 48.6% (103/212) harbored viruses that displayed lamivudine monoresistance, adefovir monoresistance, entecavir resistance, or lamivudine and adefovir resistance. Among the untreated patients, the Chinese group had more mutation-containing samples than the Pakistani group (3.3% versus 0.56%). Because of its accuracy, rapidity, wide coverage, and cost-effectiveness, the real-time PCR assay could be a robust tool for the detection of anti-HBV drug resistance mutations in resource-limited countries. Copyright © 2016, American Society for Microbiology.

  1. Transmission and selection of macrolide resistant Mycoplasma genitalium infections detected by rapid high resolution melt analysis.

    Directory of Open Access Journals (Sweden)

    Jimmy Twin

    Full Text Available BACKGROUND: Mycoplasma genitalium (MG) causes urethritis, cervicitis and pelvic inflammatory disease. The MG treatment failure rate using 1 g azithromycin at an Australian Sexual Health clinic in 2007-9 was 31% (95% CI 23-40%). We developed a rapid high resolution melt analysis (HRMA) assay targeting resistance mutations in the MG 23S rRNA gene, and validated it against DNA sequencing by examining pre- and post-treatment archived samples from MG-infected patients. METHODOLOGY/PRINCIPAL FINDINGS: Available MG-positive pre-treatment samples (n = 82) and post-treatment samples from individuals with clinical treatment failure (n = 20) were screened for 23S rRNA gene mutations. Sixteen (20%) pre-treatment samples possessed resistance mutations (A2058G, A2059G, A2059C), which were significantly more common in patients with symptomatic azithromycin-treatment failure (12/26; 44%) than in those clinically cured (4/56; 7%; p<0.001). All 20 patients experiencing azithromycin failure had detectable mutations in their post-treatment samples. In 9 of these cases, the same mutational types were present in both pre- and post-treatment samples, indicating transmitted resistance, whilst in 11 of these cases (55%), mutations were absent in pre-treatment samples, indicating that selection of resistant isolates had likely occurred. HRMA was able to detect all mutational changes determined in this study by DNA sequencing. An additional HRMA assay incorporating an unlabelled probe was also developed to detect type 4 single-nucleotide polymorphisms found in other populations, with a slightly lower sensitivity of 90%. CONCLUSIONS/SIGNIFICANCE: Treatment failure is associated with the detection of macrolide resistance mutations, which appear to be almost equally due to selection of resistant isolates following exposure to 1 g azithromycin and to pre-existing transmitted resistance. The application of a rapid molecular assay to detect resistance at the time of initial detection of infection allows

  2. Performance Analysis of Hierarchical Group Key Management Integrated with Adaptive Intrusion Detection in Mobile ad hoc Networks

    Science.gov (United States)

    2016-04-05

    Applications in wireless networks such as military battlefields, emergency response, mobile commerce, online gaming, and collaborative work are based on the... Keywords: Mobile ad hoc networks; Intrusion detection; Group communication systems; Group...

  3. Diagnostic accuracy of transesophageal echocardiogram for the detection of patent foramen ovale: a meta-analysis.

    Science.gov (United States)

    Mojadidi, Mohammad Khalid; Bogush, Nikolay; Caceres, Jose Diego; Msaouel, Pavlos; Tobis, Jonathan M

    2014-07-01

    Patent foramen ovale (PFO) is a remnant of the fetal circulation present in 20% of the population. Right-to-left shunting (RLS) through a PFO has been linked to the pathophysiology of stroke, migraine with aura, and hypoxemia. While different imaging modalities including transcranial Doppler, intra-cardiac echo, and transthoracic echo (TTE) have often been used to detect RLS, transesophageal echo (TEE) bubble study remains the gold standard for diagnosing PFO. The aim of this study was to determine the relative accuracy of TEE in the detection of PFO. A systematic review of Medline, using a standard approach for meta-analysis, was performed for all prospective studies assessing the accuracy of TEE in the detection of PFO using confirmation by autopsy, cardiac surgery, and/or catheterization as the reference. The search revealed 3105 studies; 4 met the inclusion criteria, comprising a total of 164 patients. TEE had a weighted sensitivity of 89.2% (95% CI: 81.1-94.7%) and specificity of 91.4% (95% CI: 82.3-96.8%) for detecting PFO. The overall positive likelihood ratio (LR+) was 5.93 (95% CI: 1.30-27.09) and the overall negative likelihood ratio (LR-) was 0.22 (95% CI: 0.08-0.56). While the TEE bubble study is considered the gold standard modality for diagnosing PFO, some PFOs may still be missed or misdiagnosed. It is important to understand the limitations of TEE and perhaps use other highly sensitive screening tests, such as transcranial Doppler (TCD), in conjunction with TEE before scheduling a patient for transcatheter PFO closure. © 2013, Wiley Periodicals, Inc.

  4. Results of Automated Retinal Image Analysis for Detection of Diabetic Retinopathy from the Nakuru Study, Kenya.

    Science.gov (United States)

    Hansen, Morten B; Abràmoff, Michael D; Folk, James C; Mathenge, Wanjiku; Bastawrous, Andrew; Peto, Tunde

    2015-01-01

    Digital retinal imaging is an established method of screening for diabetic retinopathy (DR). It has been established that currently about 1% of the world's blindness or visual impairment is due to DR. However, the increasing prevalence of diabetes mellitus and DR is creating an increased workload for those with expertise in grading retinal images. Safe and reliable automated analysis of retinal images may support screening services worldwide. This study aimed to compare the ability of the Iowa Detection Program (IDP) to detect diabetic eye diseases (DED) with human grading carried out at Moorfields Reading Centre on the population of the Nakuru Study from Kenya. Retinal images were taken from participants of the Nakuru Eye Disease Study in Kenya in 2007/08 (n = 4,381 participants [NW6 Topcon Digital Retinal Camera]). First, human grading was performed for the presence or absence of DR, and for those with DR this was subdivided into referable or non-referable DR. The automated IDP software was deployed to identify those with DR and also to categorize the severity of DR. The primary outcomes were sensitivity, specificity, and positive and negative predictive value of the IDP versus the human grader as the reference standard. Altogether 3,460 participants were included; 113 had DED, giving a prevalence of 3.3% (95% CI, 2.7-3.9%). Sensitivity of the IDP to detect DED as determined by human grading was 91.0% (95% CI, 88.0-93.4%). The IDP's ability to detect DED gave an AUC of 0.878 (95% CI 0.850-0.905) and a negative predictive value of 98%. The IDP missed no vision-threatening retinopathy in any patient, and none of the false negative cases met criteria for treatment. In this epidemiological sample, the IDP's grading was comparable to that of the human graders. It therefore might be feasible to consider its inclusion into usual epidemiological grading.
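
    For readers unfamiliar with the reported screening metrics, the short example below computes sensitivity, specificity, PPV and NPV from a 2x2 confusion matrix; the counts are hypothetical and are not the study's data.

```python
# Worked example of the screening metrics named above, computed from a
# hypothetical 2x2 confusion matrix (these counts are NOT the study's data).
def screening_metrics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sensitivity, specificity, ppv, npv

sens, spec, ppv, npv = screening_metrics(tp=103, fp=310, fn=10, tn=3037)
print(f"sensitivity={sens:.1%}  specificity={spec:.1%}  PPV={ppv:.1%}  NPV={npv:.1%}")
```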

  5. Detecting altered connectivity patterns in HIV associated neurocognitive impairment using mutual connectivity analysis

    Science.gov (United States)

    Abidin, Anas Zainul; D'Souza, Adora M.; Nagarajan, Mahesh B.; Wismüller, Axel

    2016-03-01

    The use of functional Magnetic Resonance Imaging (fMRI) has provided interesting insights into our understanding of the brain. In clinical setups these scans have been used to detect and study changes in brain network properties in various neurological disorders. A large percentage of subjects infected with HIV present cognitive deficits, known as HIV-associated neurocognitive disorder (HAND). In this study we propose to use our novel technique, named Mutual Connectivity Analysis (MCA), to detect differences in brain networks in subjects with and without HIV infection. Resting-state functional MRI scans acquired from 10 subjects (5 HIV+ and 5 HIV-) were subjected to standard preprocessing routines. Subsequently, the average time series for each brain region of the Automated Anatomical Labeling (AAL) atlas were extracted and used with the MCA framework to obtain a graph characterizing the interactions between the regions. The network graphs obtained for different subjects are then compared using Network-Based Statistics (NBS), an approach for detecting differences between graph edges while controlling the family-wise error rate when mass univariate testing is performed. Applying this approach to the graphs obtained yields a single network encompassing 42 nodes and 65 edges that is significantly different between the two subject groups. Specifically, connections to the regions in and around the basal ganglia are significantly decreased, and some nodes corresponding to the posterior cingulate cortex are also affected. These results are in line with our current understanding of the pathophysiological mechanisms of HAND and with other HIV-related fMRI connectivity studies. Hence, we illustrate the applicability of our novel approach with network-based statistics in a clinical case-control study to detect differences in connectivity patterns.

  6. Automated multi-radionuclide separation and analysis with combined detection capability

    Science.gov (United States)

    Plionis, Alexander Asterios

    The radiological dispersal device (RDD) is a weapon of great concern to those agencies responsible for protecting the public from the modern age of terrorism. In order to respond effectively to an RDD event, these agencies need the capability to rapidly identify the radiological agents involved in the incident and assess the uptake of each individual victim. Since medical treatment for internal radiation poisoning is radionuclide-specific, it is critical to identify and quantify the radiological uptake of each individual victim. This dissertation describes the development of automated analytical components that could be used to determine and quantify multiple radionuclides in human urine bioassays. This is accomplished through the use of extraction chromatography plumbed in-line with one of a variety of detection instruments. Flow scintillation analysis is used for 90Sr and 210Po determination, flow gamma analysis is used to assess 60Co and 137Cs, and inductively coupled plasma mass spectrometry is used to determine actinides. Detection limits for these analytes were determined for the appropriate technique and related to their implications for health physics.

  7. Fast Edge Detection and Segmentation of Terrestrial Laser Scans Through Normal Variation Analysis

    Science.gov (United States)

    Che, E.; Olsen, M. J.

    2017-09-01

    Terrestrial Laser Scanning (TLS) utilizes light detection and ranging (lidar) to effectively and efficiently acquire point cloud data for a wide variety of applications. Segmentation is a common post-processing procedure that groups the point cloud into a number of clusters, simplifying the data for the subsequent modelling and analysis needed by most applications. This paper presents a novel method to rapidly segment TLS data based on edge detection and region growing. First, by computing the projected incidence angles and performing normal variation analysis, the silhouette edges and intersection edges are separated from the smooth surfaces. Then a modified region growing algorithm groups the points lying on the same smooth surface. The proposed method efficiently exploits the gridded scan pattern utilized during acquisition of TLS data from most sensors and takes advantage of parallel programming to process approximately 1 million points per second. Moreover, the proposed segmentation does not require estimation of the normal at each point, which prevents errors in normal estimation from propagating into the segmentation. Both an indoor and an outdoor scene are used in an experiment to demonstrate and discuss the effectiveness and robustness of the proposed segmentation method.
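
    A simplified sketch of the general edge-then-region-growing idea is given below; unlike the published method, it estimates a normal per grid cell, and the incidence-angle computation, parallelisation and thresholds of the paper are not reproduced. The input grid is synthetic.

```python
# Simplified sketch: normal-variation edge mask on a gridded scan, then BFS
# region growing over the remaining smooth cells.
import numpy as np
from collections import deque

def grid_normals(pts):
    """pts: (H, W, 3) gridded point cloud. Normal from right/down neighbours."""
    dx = np.roll(pts, -1, axis=1) - pts
    dy = np.roll(pts, -1, axis=0) - pts
    n = np.cross(dx, dy)
    n /= np.linalg.norm(n, axis=2, keepdims=True) + 1e-12
    return n

def segment(pts, edge_angle_deg=20.0):
    normals = grid_normals(pts)
    h, w, _ = pts.shape
    # normal variation: angle to right/bottom neighbour marks an edge cell
    cos_r = np.sum(normals * np.roll(normals, -1, axis=1), axis=2)
    cos_d = np.sum(normals * np.roll(normals, -1, axis=0), axis=2)
    angle = np.degrees(np.arccos(np.clip(np.minimum(cos_r, cos_d), -1, 1)))
    edge = angle > edge_angle_deg

    labels = -np.ones((h, w), dtype=int)
    current = 0
    for i in range(h):
        for j in range(w):
            if edge[i, j] or labels[i, j] >= 0:
                continue
            queue = deque([(i, j)])           # grow a new region from this seed
            labels[i, j] = current
            while queue:
                y, x = queue.popleft()
                for dy2, dx2 in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    v, u = y + dy2, x + dx2
                    if 0 <= v < h and 0 <= u < w and labels[v, u] < 0 and not edge[v, u]:
                        labels[v, u] = current
                        queue.append((v, u))
            current += 1
    return labels

pts = np.random.default_rng(3).random((50, 50, 3))   # synthetic gridded scan
print("segments found:", segment(pts).max() + 1)
```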

  8. Development of safety analysis and constraint detection techniques for process interaction errors

    International Nuclear Information System (INIS)

    Fan, Chin-Feng; Tsai, Shang-Lin; Tseng, Wan-Hui

    2011-01-01

    Among the new failure modes introduced by computers into safety systems, the process interaction error is the most unpredictable and complicated failure mode, and it may cause disastrous consequences. This paper presents safety analysis and constraint detection techniques for process interaction errors among hardware, software, and human processes. Among interaction errors, the most dreadful ones are those that involve run-time misinterpretation from a logic process; we call them 'semantic interaction errors'. Such abnormal interaction is not adequately emphasized in current research. In our static analysis, we provide a fault tree template focusing on semantic interaction errors by checking conflicting pre-conditions and post-conditions among interacting processes. Thus, far-fetched but highly risky interaction scenarios involving interpretation errors can be identified. For run-time monitoring, a range of constraint types is proposed for checking abnormal signs at run time. We extend current constraints to a broader relational level and a global level, considering process/device dependencies and physical conservation rules in order to detect process interaction errors. The proposed techniques can reduce abnormal interactions; they can also be used to assist in safety-case construction.
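
    The toy sketch below illustrates the kind of static check described above: flagging a potential semantic interaction error when one process's post-condition contradicts another process's pre-condition on a shared variable. The notation and process names are invented for illustration.

```python
# Toy sketch of a pre-/post-condition conflict check among interacting processes.
def find_condition_conflicts(processes):
    """processes: {name: {"pre": {var: value}, "post": {var: value}}}"""
    conflicts = []
    for writer, w in processes.items():
        for reader, r in processes.items():
            if writer == reader:
                continue
            for var, produced in w["post"].items():
                expected = r["pre"].get(var)
                if expected is not None and expected != produced:
                    conflicts.append((writer, reader, var, produced, expected))
    return conflicts

# Hypothetical processes: a conflicting assumption about the valve state.
processes = {
    "valve_controller": {"pre": {}, "post": {"valve_state": "closed"}},
    "cooling_logic":    {"pre": {"valve_state": "open"}, "post": {}},
}
for writer, reader, var, got, want in find_condition_conflicts(processes):
    print(f"{writer} leaves {var}={got!r} but {reader} assumes {var}={want!r}")
```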

  9. Real-time movement detection and analysis for video surveillance applications

    Science.gov (United States)

    Hueber, Nicolas; Hennequin, Christophe; Raymond, Pierre; Moeglin, Jean-Pierre

    2014-06-01

    Pedestrian movement along critical infrastructures such as pipes, railways or highways, as well as pedestrian behaviour in urban environments, is of major interest in surveillance applications. The goal is to anticipate illicit or dangerous human activities. For this purpose, we propose an all-in-one small autonomous system which delivers high-level statistics and reports alerts in specific cases. This situational awareness project requires efficient management of the scene through movement analysis. A dynamic background extraction algorithm is developed to achieve robustness against natural and urban environmental perturbations and to satisfy the constraints of an embedded implementation. When changes are detected in the scene, specific patterns are applied to detect and highlight relevant movements. Depending on the application, specific descriptors can be extracted and fused in order to reach a high level of interpretation. In this paper, our approach is applied to two operational use cases: pedestrian urban statistics and railway surveillance. In the first case, a grid of prototypes is deployed over a city centre to collect pedestrian movement statistics up to a macroscopic level of analysis. The results demonstrate the relevance of the delivered information; in particular, the flow density map highlights pedestrians' preferential paths along the streets. In the second case, one prototype is set next to high-speed train tracks to secure the area. The results exhibit a low false alarm rate and support our approach of using a large sensor network to deliver a precise operational picture without overwhelming a supervisor.
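
    As a minimal illustration of the dynamic-background-extraction step, the sketch below uses OpenCV's adaptive MOG2 background subtractor on a hypothetical video file; the prototype's own embedded algorithm and descriptors are not reproduced.

```python
# Minimal sketch: adaptive background subtraction and blob detection per frame.
import cv2

cap = cv2.VideoCapture("surveillance.avi")          # hypothetical input file
subtractor = cv2.createBackgroundSubtractorMOG2(history=500, detectShadows=True)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)                   # adaptive background model
    mask = cv2.medianBlur(mask, 5)                   # suppress isolated noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    movers = [c for c in contours if cv2.contourArea(c) > 200]
    if movers:
        print(f"{len(movers)} moving object(s) detected in this frame")

cap.release()
```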

  10. Fault detection of flywheel system based on clustering and principal component analysis

    Directory of Open Access Journals (Sweden)

    Wang Rixin

    2015-12-01

    Full Text Available Considering the nonlinear, multifunctional properties of a double flywheel with closed-loop control, a two-step method combining clustering and principal component analysis is proposed to detect the two faults in the multifunctional flywheels. In the first step of the proposed algorithm, clustering is used as feature recognition to identify the commands of the "integrated power and attitude control" system, such as attitude control, energy storage or energy discharge. These commands require the flywheel system to work in different operation modes; therefore, the relationships among parameters in the different operations define the cluster structure of the training data. Ordering points to identify the clustering structure (OPTICS) can automatically identify these clusters from the reachability plot, and the K-means algorithm then divides the training data into the corresponding operations according to the reachability plot. Finally, the last step of the proposed model defines the relationship of parameters in each operation through the principal component analysis (PCA) method. Compared with the PCA model, the proposed approach is capable of identifying new clusters and learning the new behavior of incoming data. The simulation results show that it can effectively detect the faults in the multifunctional flywheel system.
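
    A small sketch of the two-step idea with scikit-learn stand-ins follows: OPTICS suggests the number of operating modes, K-means assigns samples to modes, and a per-mode PCA reconstruction error flags anomalous samples. The telemetry is synthetic, not flywheel data.

```python
# Sketch: clustering into operation modes, then per-mode PCA residual check.
import numpy as np
from sklearn.cluster import OPTICS, KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
# synthetic telemetry: three operating modes with different parameter levels
X = np.vstack([rng.normal(loc=m, scale=0.3, size=(200, 5)) for m in (0.0, 3.0, 6.0)])

labels = OPTICS(min_samples=20).fit(X).labels_
n_modes = len(set(labels) - {-1})                   # -1 marks noise points
modes = KMeans(n_clusters=max(n_modes, 1), n_init=10, random_state=0).fit_predict(X)

for mode in np.unique(modes):
    data = X[modes == mode]
    pca = PCA(n_components=2).fit(data)
    residual = np.linalg.norm(data - pca.inverse_transform(pca.transform(data)), axis=1)
    threshold = residual.mean() + 3 * residual.std()
    print(f"mode {mode}: {np.sum(residual > threshold)} suspected faulty samples")
```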

  11. STRUCTURE LINE DETECTION FROM LIDAR POINT CLOUDS USING TOPOLOGICAL ELEVATION ANALYSIS

    Directory of Open Access Journals (Sweden)

    C. Y. Lo

    2012-07-01

    Full Text Available Airborne LIDAR point clouds, which provide considerable numbers of points on object surfaces, are essential to building modeling. In the last two decades, studies have developed different ways to identify structure lines using two main approaches, data-driven and model-driven. These studies have shown that automatic modeling processes depend on certain considerations, such as the thresholds used, initial values, designed formulas, and predefined cues. Following the development of laser scanning systems, scanning rates have increased and can provide point clouds with higher point density. Therefore, this study proposes using topological elevation analysis (TEA) to detect structure lines instead of threshold-dependent concepts and predefined constraints. The analysis contains two parts: data pre-processing and structure line detection. To preserve the original elevation information, a pseudo-grid for generating digital surface models is produced during the first part. The highest point in each grid cell is set as the elevation value, and its original three-dimensional position is preserved. In the second part, using TEA, the structure lines are identified based on the topology of local elevation changes in two directions. Because structure lines possess certain geometric properties, their locations show small relief in the radial direction and steep elevation changes in the circular direction. Following the proposed approach, TEA can be used to determine 3D line information without selecting thresholds. For validation, the TEA results are compared with those of a region growing approach. The results indicate that the proposed method can produce structure lines from dense point clouds.
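
    The pseudo-grid pre-processing step lends itself to a short sketch: each cell keeps its highest point together with the original 3D coordinates. The cell size and point cloud below are placeholders, and the TEA detection step itself is not reproduced.

```python
# Sketch of the pseudo-grid step: keep the highest point per cell, preserving
# its original 3D coordinates.
import numpy as np

def pseudo_grid(points, cell=1.0):
    """points: (N, 3) array of x, y, z. Returns {(row, col): highest point in cell}."""
    grid = {}
    cols = np.floor(points[:, 0] / cell).astype(int)
    rows = np.floor(points[:, 1] / cell).astype(int)
    for (i, j), p in zip(zip(rows, cols), points):
        if (i, j) not in grid or p[2] > grid[(i, j)][2]:
            grid[(i, j)] = p
    return grid

cloud = np.random.default_rng(5).random((10000, 3)) * [100, 100, 20]  # synthetic
dsm = pseudo_grid(cloud, cell=2.0)
print("occupied cells:", len(dsm))
```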

  12. Detecting and correcting for publication bias in meta-analysis - A truncated normal distribution approach.

    Science.gov (United States)

    Zhu, Qiaohao; Carriere, K C

    2016-01-01

    Publication bias can significantly limit the validity of meta-analysis when trying to draw conclusions about a research question from independent studies. Most research on detection of and correction for publication bias in meta-analysis focuses mainly on funnel plot-based methodologies or selection models. In this paper, we formulate publication bias as a truncated distribution problem and propose new parametric solutions. We develop methodologies for estimating the underlying overall effect size and the severity of publication bias. We distinguish two major situations, in which publication bias may be induced by: (1) small effect size or (2) large p-value. We consider both fixed and random effects models, and derive estimators for the overall mean and the truncation proportion. These estimators are obtained using maximum likelihood estimation and the method of moments under fixed- and random-effects models, respectively. We carried out extensive simulation studies to evaluate the performance of our methodology and to compare it with the non-parametric trim-and-fill method based on the funnel plot. We find that our methods based on the truncated normal distribution perform consistently well, both in detecting and in correcting publication bias under various situations.
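
    A minimal sketch of the parametric idea follows, assuming a known truncation point and a simple fixed-effect setting rather than the paper's full estimator: effect sizes observed only above the cutoff are fitted by maximum likelihood to a left-truncated normal to recover the underlying mean.

```python
# Sketch: maximum-likelihood fit of a left-truncated normal to "published" effects.
import numpy as np
from scipy import stats
from scipy.optimize import minimize

rng = np.random.default_rng(6)
true_mu, true_sigma, cutoff = 0.1, 0.4, 0.2          # only effects > cutoff get published
effects = rng.normal(true_mu, true_sigma, 5000)
observed = effects[effects > cutoff]                  # the published studies

def neg_log_lik(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    a = (cutoff - mu) / sigma                         # standardized truncation point
    return -np.sum(stats.truncnorm.logpdf(observed, a, np.inf, loc=mu, scale=sigma))

fit = minimize(neg_log_lik, x0=[observed.mean(), np.log(observed.std())])
mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
print(f"naive mean of published effects: {observed.mean():.3f}")
print(f"truncation-corrected mean: {mu_hat:.3f} (true {true_mu}), sd {sigma_hat:.3f}")
```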

  13. WAVELET ANALYSIS AND NEURAL NETWORK CLASSIFIERS TO DETECT MID-SAGITTAL SECTIONS FOR NUCHAL TRANSLUCENCY MEASUREMENT

    Directory of Open Access Journals (Sweden)

    Giuseppa Sciortino

    2016-04-01

    Full Text Available We propose a methodology to support the physician in the automatic identification of mid-sagittal sections of the fetus in ultrasound videos acquired during the first trimester of pregnancy. A good mid-sagittal section is a key requirement for correct measurement of nuchal translucency (NT), which is one of the main markers for screening of chromosomal defects such as trisomy 13, 18 and 21; NT measurement itself is beyond the scope of this article. The proposed methodology is mainly based on wavelet analysis and neural network classifiers to detect the jawbone, and on radial symmetry analysis to detect the choroid plexus. These steps allow identification of the frames that represent correct mid-sagittal sections for further processing. The performance of the proposed methodology was analyzed on 3000 random frames uniformly extracted from 10 real clinical ultrasound videos. With respect to a ground truth provided by an expert physician, we obtained a true positive rate, a true negative rate and a balanced accuracy of 87.26%, 94.98% and 91.12%, respectively.

  14. Optimization of heteroduplex analysis for the detection of BRCA mutations and SNPs

    Directory of Open Access Journals (Sweden)

    Lucian Negura

    2011-02-01

    Full Text Available BRCA1 and BRCA2 are tumour suppressor genes whose mutant phenotypes predispose to breast and ovarian cancer. Screening for mutations in these genes is now standard practice for hereditary breast and ovarian cancer (HBOC) cases in Europe, and permits medical follow-up and genetic counselling adapted to the needs of individuals in such families. Currently, most laboratories performing diagnostic analysis of the BRCA genes use PCR of exons and intron-exon boundaries coupled to a pre-screening step to identify anomalous amplicons. The techniques employed for the detection of mutations and SNPs have evolved over time and vary in sensitivity, specificity and cost-effectiveness. As a variant of the pre-screening techniques, we chose the recently developed Surveyor® heteroduplex cleavage method as a sensitive and specific technique to reveal anomalous amplicons of the BRCA genes, using only basic laboratory equipment and agarose gel electrophoresis. Here we present the detection of either mutations or SNPs within BRCA1 exon 7, using heteroduplex analysis (HA) by mismatch-specific endonuclease, confirmed by dideoxy sequencing.

  15. Application of wavelet analysis to detect dysfunction in cerebral blood flow autoregulation during experimental hyperhomocysteinaemia.

    Science.gov (United States)

    Aleksandrin, Valery V; Ivanov, Alexander V; Virus, Edward D; Bulgakova, Polina O; Kubatiev, Aslan A

    2018-04-03

    The purpose of the present study was to investigate the use of laser Doppler flowmetry (LDF) signals coupled with spectral wavelet analysis to detect endothelial link dysfunction in the autoregulation of cerebral blood flow in the setting of hyperhomocysteinaemia (HHcy). Fifty-one rats were assigned to three groups (intact, control, and HHcy) according to the results of biochemical assays of homocysteine level in blood plasma. LDF signals on the rat brain were recorded with a LAKK-02 device to measure microcirculatory blood flow. The laser operating wavelength and output power density were 1064 nm and 0.051 W/mm², respectively. A Morlet mother wavelet transform was applied to the measured 8-min LDF signals, and periodic oscillations in five frequency intervals were identified (0.01-0.04 Hz, 0.04-0.15 Hz, 0.15-0.4 Hz, 0.4-2 Hz, and 2-5 Hz), corresponding to endothelial, neurogenic, myogenic, respiratory, and cardiac origins, respectively. In the initial state, the amplitude of the oscillations decreased by 38% (P ...). Wavelet analysis may be successfully applied to detect dysfunction of the endothelial link in cerebral vessel tone and to reveal the pathological shift of the lower limit of autoregulation.
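
    The band-amplitude computation can be sketched with PyWavelets' continuous Morlet transform, as below; the sampling rate, scales and signal are placeholders rather than the study's LDF recordings.

```python
# Sketch: Morlet CWT of an LDF-like signal, averaged within the five bands
# named above (endothelial, neurogenic, myogenic, respiratory, cardiac).
import numpy as np
import pywt

fs = 20.0                                              # Hz, assumed sampling rate
t = np.arange(0, 8 * 60, 1 / fs)                       # 8-minute record
signal = np.sin(2 * np.pi * 0.03 * t) + 0.5 * np.sin(2 * np.pi * 1.2 * t)  # synthetic

frequencies = np.geomspace(0.01, 5.0, 200)             # Hz range of interest
scales = pywt.scale2frequency("morl", 1) * fs / frequencies
coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1 / fs)

bands = {"endothelial": (0.01, 0.04), "neurogenic": (0.04, 0.15),
         "myogenic": (0.15, 0.4), "respiratory": (0.4, 2.0), "cardiac": (2.0, 5.0)}
for name, (lo, hi) in bands.items():
    sel = (freqs >= lo) & (freqs < hi)
    amplitude = np.mean(np.abs(coeffs[sel]))           # mean amplitude in the band
    print(f"{name:12s} {lo}-{hi} Hz: mean amplitude {amplitude:.3f}")
```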

  16. Development of safety analysis and constraint detection techniques for process interaction errors

    Energy Technology Data Exchange (ETDEWEB)

    Fan, Chin-Feng, E-mail: csfanc@saturn.yzu.edu.tw [Computer Science and Engineering Dept., Yuan-Ze University, Taiwan (China); Tsai, Shang-Lin; Tseng, Wan-Hui [Computer Science and Engineering Dept., Yuan-Ze University, Taiwan (China)

    2011-02-15

    Among the new failure modes introduced by computers into safety systems, the process interaction error is the most unpredictable and complicated failure mode, and it may cause disastrous consequences. This paper presents safety analysis and constraint detection techniques for process interaction errors among hardware, software, and human processes. Among interaction errors, the most dreadful ones are those that involve run-time misinterpretation from a logic process; we call them 'semantic interaction errors'. Such abnormal interaction is not adequately emphasized in current research. In our static analysis, we provide a fault tree template focusing on semantic interaction errors by checking conflicting pre-conditions and post-conditions among interacting processes. Thus, far-fetched but highly risky interaction scenarios involving interpretation errors can be identified. For run-time monitoring, a range of constraint types is proposed for checking abnormal signs at run time. We extend current constraints to a broader relational level and a global level, considering process/device dependencies and physical conservation rules in order to detect process interaction errors. The proposed techniques can reduce abnormal interactions; they can also be used to assist in safety-case construction.

  17. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources

    Energy Technology Data Exchange (ETDEWEB)

    Klumpp, John [Colorado State University, Department of Environmental and Radiological Health Sciences, Molecular and Radiological Biosciences Building, Colorado State University, Fort Collins, Colorado, 80523 (United States)

    2013-07-01

    We propose a radiation detection system which generates its own discrete sampling distribution based on past measurements of background. The advantage to this approach is that it can take into account variations in background with respect to time, location, energy spectra, detector-specific characteristics (i.e. different efficiencies at different count rates and energies), etc. This would therefore be a 'machine learning' approach, in which the algorithm updates and improves its characterization of background over time. The system would have a 'learning mode,' in which it measures and analyzes background count rates, and a 'detection mode,' in which it compares measurements from an unknown source against its unique background distribution. By characterizing and accounting for variations in the background, general purpose radiation detectors can be improved with little or no increase in cost. The statistical and computational techniques to perform this kind of analysis have already been developed. The necessary signal analysis can be accomplished using existing Bayesian algorithms which account for multiple channels, multiple detectors, and multiple time intervals. Furthermore, Bayesian machine-learning techniques have already been developed which, with trivial modifications, can generate appropriate decision thresholds based on the comparison of new measurements against a nonparametric sampling distribution. (authors)
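
    A toy sketch of the idea follows, using a conjugate Gamma-Poisson model in place of the proposed nonparametric, multi-channel machinery: past background counts define a posterior, and a new measurement is flagged when it is improbable under the posterior predictive distribution.

```python
# Toy sketch: "learning mode" builds a background posterior, "detection mode"
# scores new measurements against the posterior predictive distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
background_counts = rng.poisson(lam=12.0, size=10_000)    # learning-mode data

# Gamma posterior for the background rate, with a weak Gamma(1, 1) prior
alpha = 1.0 + background_counts.sum()
beta = 1.0 + background_counts.size

def alarm_probability(new_counts):
    """P(background would produce >= new_counts) under the posterior predictive
    (a negative binomial, since the model is a Gamma-Poisson mixture)."""
    p = beta / (beta + 1.0)
    return stats.nbinom.sf(new_counts - 1, alpha, p)

for counts in (14, 25, 40):                               # detection-mode samples
    print(f"counts={counts}: tail probability {alarm_probability(counts):.2e}")
```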

  18. Flow-cytometric identification of vinegars using a multi-parameter analysis optical detection module

    Science.gov (United States)

    Verschooten, T.; Ottevaere, H.; Vervaeke, M.; Van Erps, J.; Callewaert, M.; De Malsche, W.; Thienpont, H.

    2015-09-01

    We show a proof-of-concept demonstration of a multi-parameter, low-cost optical detection system for the flow-cytometric identification of vinegars. This multi-parameter analysis system can simultaneously measure laser-induced fluorescence, absorption and scattering excited by two time-multiplexed lasers of different wavelengths. To our knowledge, no other polymer optofluidic chip based system offers more simultaneous measurements. The design of the optofluidic channels is aimed at countering the effects that viscous fingering, air bubbles, and emulsion samples can have on the correct operation of such a detection system. Unpredictable variations in the viscosity and refractive index of the channel content can be turned into a source of information. The sample is excited by two laser diodes that are driven by custom-made low-cost laser drivers. The optofluidic chip is built to be robust and easy to handle and is reproducible using hot embossing. We show a custom optomechanical holder for the optofluidic chip that ensures correct alignment and automatic connection to the external fluidic system. We show an experiment in which 92 samples of vinegar are measured; we are able to identify 9 different kinds of vinegar with an accuracy of 94%. Thus we show an alternative to the classic optical spectroscopy solution at a lowered cost. Furthermore, we have shown the possibility of predicting the viscosity and turbidity of vinegars with a goodness-of-fit R² over 0.947.

  19. Leakage detection in galvanized iron pipelines using ensemble empirical mode decomposition analysis

    Science.gov (United States)

    Amin, Makeen; Ghazali, M. Fairusham

    2015-05-01

    There are a number of possible approaches to detecting leaks. Some leaks are immediately noticeable when liquid or water appears on the surface; however, many leaks do not find their way to the surface, and their existence has to be checked by analysing the fluid flow in the pipeline. The first step is to determine the approximate position of the leak. This can be done by isolating sections of the mains in turn and noting which section causes a drop in the flow. The next approach is to use sensors to locate leaks; this involves strain-gauge pressure transducers and piezoelectric sensors. The occurrence of leaks and their exact location in the pipeline can be determined using specific methods, namely the acoustic leak detection method and the transient method. The objective is to utilize signal processing techniques in order to analyse leakage in the pipeline. To this end, an ensemble empirical mode decomposition (EEMD) method is applied as the analysis method to the collected data.
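
    A hedged sketch of the decomposition step is given below, assuming the third-party PyEMD ("EMD-signal") package provides an EEMD implementation; the pressure trace is a synthetic stand-in, and the leak-feature extraction used in the paper is not reproduced.

```python
# Sketch: EEMD decomposition of a synthetic pressure trace into IMFs, then a
# crude look at which IMF carries the most energy.
import numpy as np
from PyEMD import EEMD   # assumed dependency: pip install EMD-signal

fs = 1000.0                                   # Hz, assumed sampling rate
t = np.arange(0, 2.0, 1 / fs)
# synthetic trace: slow trend + oscillation + a transient burst standing in for a leak
signal = 0.2 * t + np.sin(2 * np.pi * 50 * t)
signal[800:850] += 2.0 * np.exp(-np.linspace(0, 5, 50))

imfs = EEMD(trials=50).eemd(signal, t)        # intrinsic mode functions
energies = (imfs ** 2).sum(axis=1)
print("IMF energies:", np.round(energies, 2))
print("most energetic IMF (candidate transient band):", int(energies.argmax()))
```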

  20. Spatial correlation analysis of urban traffic state under a perspective of community detection

    Science.gov (United States)

    Yang, Yanfang; Cao, Jiandong; Qin, Yong; Jia, Limin; Dong, Honghui; Zhang, Aomuhan

    2018-05-01

    Understanding the spatial correlation of urban traffic state is essential for identifying the evolution patterns of urban traffic state. However, the distribution of traffic state is characterized by large spatial span and heterogeneity. This paper adapts the concept of community detection to the correlation network of urban traffic state and proposes a new perspective for identifying the spatial correlation patterns of traffic state. In the proposed urban traffic network, the nodes represent road segments, and an edge between a pair of nodes is added depending on the result of a significance test for the corresponding correlation of traffic state. Further, a community detection process for the urban traffic network (named GWPA-K-means) is applied to analyze the spatial dependency of traffic state. The proposed method extends the traditional K-means algorithm in two steps: (i) it redefines the initial cluster centers using two node properties (the GWPA value and the minimum shortest path length); (ii) it utilizes a weight signal propagation process to transfer the topological information of the urban traffic network into a node similarity matrix. Finally, numerical experiments are conducted on a simple network and a real urban road network in Beijing. The results show that the GWPA-K-means algorithm is valid for spatial correlation analysis of traffic state, and that network science and community structure analysis describe the spatial heterogeneity of traffic state well on a large spatial scale.
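
    The network-construction step described above can be sketched briefly: road segments become nodes, and an edge is added when the Pearson correlation between two segments' traffic-state series passes a significance test. The series below are synthetic, and the GWPA-K-means step itself is not reproduced.

```python
# Sketch: build a correlation network of road segments from traffic-state series.
import numpy as np
import networkx as nx
from scipy import stats

rng = np.random.default_rng(8)
n_segments, n_times = 30, 288                 # e.g. 5-minute speeds over one day
base = rng.normal(size=n_times)
speeds = 0.6 * base + 0.4 * rng.normal(size=(n_segments, n_times))  # correlated series

G = nx.Graph()
G.add_nodes_from(range(n_segments))
for i in range(n_segments):
    for j in range(i + 1, n_segments):
        r, p = stats.pearsonr(speeds[i], speeds[j])
        if p < 0.01:                          # the significance test decides the edge
            G.add_edge(i, j, weight=r)

print(f"{G.number_of_nodes()} nodes, {G.number_of_edges()} edges in the traffic-state network")
```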