WorldWideScience

Sample records for analysis detection processing

  1. Data-driven fault detection for industrial processes canonical correlation analysis and projection based methods

    CERN Document Server

    Chen, Zhiwen

    2017-01-01

    Zhiwen Chen aims to develop advanced fault detection (FD) methods for the monitoring of industrial processes. With ever-increasing demands on reliability and safety in industrial processes, fault detection has become an important issue. Although model-based fault detection theory has been well studied in past decades, its application to large-scale industrial processes is limited because accurate models are difficult to build. Motivated by the limitations of existing data-driven FD methods, novel canonical correlation analysis (CCA) and projection-based methods are proposed from the perspectives of process input and output data, reduced engineering effort and wide application scope. For performance evaluation of FD methods, a new index is also developed. Contents: A New Index for Performance Evaluation of FD Methods; CCA-based FD Method for the Monitoring of Stationary Processes; Projection-based FD Method for the Monitoring of Dynamic Processes; Benchmark Study and Real-Time Implementat...
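
    The CCA-based residual idea summarized in this record can be illustrated with a small sketch: canonical directions are fitted on fault-free input/output data, and a residual whose healthy variance is known in closed form feeds a T²-style test statistic. The whitening recipe, the toy static process, and the statistic normalization below are illustrative assumptions, not Chen's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_cca_fd(U, Y, reg=1e-8):
    """Fit a CCA-based residual generator on fault-free input/output data."""
    mu_u, mu_y = U.mean(0), Y.mean(0)
    Uc, Yc = U - mu_u, Y - mu_y
    n = len(Uc)
    Lu = np.linalg.cholesky(Uc.T @ Uc / n + reg * np.eye(U.shape[1]))
    Ly = np.linalg.cholesky(Yc.T @ Yc / n + reg * np.eye(Y.shape[1]))
    # Whitened cross-covariance; its singular values are the canonical correlations
    C = np.linalg.solve(Lu, Uc.T @ Yc / n) @ np.linalg.inv(Ly).T
    R, s, Vt = np.linalg.svd(C, full_matrices=False)
    A = np.linalg.solve(Lu.T, R)       # input-side canonical directions
    B = np.linalg.solve(Ly.T, Vt.T)    # output-side canonical directions
    return {"mu_u": mu_u, "mu_y": mu_y, "A": A, "B": B, "s": s}

def t2(model, u, y):
    """Residual r = A'u - diag(s) B'y has variance 1 - s_i^2 under healthy data."""
    zu = model["A"].T @ (u - model["mu_u"])
    zy = model["B"].T @ (y - model["mu_y"])
    r = zu - model["s"] * zy
    return float(np.sum(r ** 2 / (1.0 - model["s"] ** 2 + 1e-12)))

# Toy static process: two outputs driven by three inputs plus measurement noise
U = rng.standard_normal((500, 3))
Y = U @ rng.standard_normal((3, 2)) + 0.1 * rng.standard_normal((500, 2))
model = fit_cca_fd(U, Y)
healthy = t2(model, U[0], Y[0])
faulty = t2(model, U[0], Y[0] + 2.0)   # additive sensor fault on the outputs
```

    A fault that breaks the learned input/output correlation inflates the statistic far beyond its healthy range, which is the detection principle the record describes.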

  2. Similarity ratio analysis for early stage fault detection with optical emission spectrometer in plasma etching process.

    Science.gov (United States)

    Yang, Jie; McArdle, Conor; Daniels, Stephen

    2014-01-01

    A Similarity Ratio Analysis (SRA) method is proposed for early-stage Fault Detection (FD) in plasma etching processes using real-time Optical Emission Spectrometer (OES) data as input. The SRA method can help to realise a highly precise control system by detecting abnormal etch-rate faults in real-time during an etching process. The method processes spectrum scans at successive time points and uses a windowing mechanism over the time series to alleviate problems with timing uncertainties due to process shift from one process run to another. A SRA library is first built to capture features of a healthy etching process. By comparing with the SRA library, a Similarity Ratio (SR) statistic is then calculated for each spectrum scan as the monitored process progresses. A fault detection mechanism, named 3-Warning-1-Alarm (3W1A), takes the SR values as inputs and triggers a system alarm when certain conditions are satisfied. This design reduces the chance of false alarm, and provides a reliable fault reporting service. The SRA method is demonstrated on a real semiconductor manufacturing dataset. The effectiveness of SRA-based fault detection is evaluated using a time-series SR test and also using a post-process SR test. The time-series SR provides an early-stage fault detection service, so less energy and materials will be wasted by faulty processing. The post-process SR provides a fault detection service with higher reliability than the time-series SR, but with fault testing conducted only after each process run completes.
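
    The windowing and alarm logic described above can be sketched as follows. The cosine-based SR statistic, the warn threshold, and the three-consecutive-warning escalation rule are assumptions for illustration, not the paper's exact definitions of SR or of the 3W1A conditions.

```python
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

class SRAMonitor:
    """Windowed similarity-ratio monitoring against a healthy-run library."""

    def __init__(self, library, window=2, warn_threshold=0.99):
        self.library = library            # healthy scans, shape (T, n_wavelengths)
        self.window = window              # tolerates run-to-run timing shift
        self.warn_threshold = warn_threshold
        self.warnings = 0
        self.alarm = False

    def step(self, t, scan):
        # Best match within a +/-window slice of the library around time t
        lo, hi = max(0, t - self.window), min(len(self.library), t + self.window + 1)
        sr = max(cosine(scan, ref) for ref in self.library[lo:hi])
        self.warnings = self.warnings + 1 if sr < self.warn_threshold else 0
        if self.warnings >= 3:            # repeated warnings escalate to an alarm
            self.alarm = True
        return sr

# Toy library: 10 healthy scans over 5 "wavelength" channels
base = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
library = np.tile(base, (10, 1))
monitor = SRAMonitor(library)
monitor.step(0, base)                     # healthy scan: no warning
for t in range(1, 4):                     # drifted scans: three warnings -> alarm
    monitor.step(t, base + np.array([0.0, 0.0, 5.0, 0.0, 0.0]))
```

    Requiring several consecutive warnings before an alarm is what reduces the false-alarm rate, at the cost of a small detection delay.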

  3. Similarity ratio analysis for early stage fault detection with optical emission spectrometer in plasma etching process.

    Directory of Open Access Journals (Sweden)

    Jie Yang

    Full Text Available A Similarity Ratio Analysis (SRA) method is proposed for early-stage Fault Detection (FD) in plasma etching processes using real-time Optical Emission Spectrometer (OES) data as input. The SRA method can help to realise a highly precise control system by detecting abnormal etch-rate faults in real-time during an etching process. The method processes spectrum scans at successive time points and uses a windowing mechanism over the time series to alleviate problems with timing uncertainties due to process shift from one process run to another. An SRA library is first built to capture features of a healthy etching process. By comparing with the SRA library, a Similarity Ratio (SR) statistic is then calculated for each spectrum scan as the monitored process progresses. A fault detection mechanism, named 3-Warning-1-Alarm (3W1A), takes the SR values as inputs and triggers a system alarm when certain conditions are satisfied. This design reduces the chance of false alarm, and provides a reliable fault reporting service. The SRA method is demonstrated on a real semiconductor manufacturing dataset. The effectiveness of SRA-based fault detection is evaluated using a time-series SR test and also using a post-process SR test. The time-series SR provides an early-stage fault detection service, so less energy and materials will be wasted by faulty processing. The post-process SR provides a fault detection service with higher reliability than the time-series SR, but with fault testing conducted only after each process run completes.

  4. Development of safety analysis and constraint detection techniques for process interaction errors

    International Nuclear Information System (INIS)

    Fan, Chin-Feng; Tsai, Shang-Lin; Tseng, Wan-Hui

    2011-01-01

    Among the new failure modes introduced by computers into safety systems, the process interaction error is the most unpredictable and complicated failure mode, and it may cause disastrous consequences. This paper presents safety analysis and constraint detection techniques for process interaction errors among hardware, software, and human processes. Among interaction errors, the most dreadful ones are those that involve run-time misinterpretation by a logic process. We call them 'semantic interaction errors'. Such abnormal interaction is not adequately emphasized in current research. In our static analysis, we provide a fault tree template focusing on semantic interaction errors by checking conflicting pre-conditions and post-conditions among interacting processes. Thus, far-fetched but highly risky interaction scenarios involving interpretation errors can be identified. For run-time monitoring, a range of constraint types is proposed for checking abnormal signs at run time. We extend current constraints to a broader relational level and a global level, considering process/device dependencies and physical conservation rules in order to detect process interaction errors. The proposed techniques can reduce abnormal interactions; they can also be used to assist in safety-case construction.
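
    The static check described in this record, comparing pre-conditions and post-conditions among interacting processes, can be sketched as a toy conflict scan. The step representation and the example process names (`valve_ctrl`, `pump_start`) are hypothetical, chosen only to show the shape of the check.

```python
# Each process step declares pre/post conditions over shared state variables;
# we flag pairs where one step's postcondition contradicts another step's
# precondition -- a candidate semantic interaction error.
def conflicts(steps):
    found = []
    for a in steps:
        for b in steps:
            if a is b:
                continue
            for var, val in a["post"].items():
                if var in b["pre"] and b["pre"][var] != val:
                    found.append((a["name"], b["name"], var))
    return found

steps = [
    {"name": "valve_ctrl", "pre": {}, "post": {"valve": "closed"}},
    {"name": "pump_start", "pre": {"valve": "open"}, "post": {"pump": "on"}},
]
# conflicts(steps) flags that valve_ctrl leaves the valve closed while
# pump_start assumes it is open.
```

    A real fault-tree template would attach each detected conflict to an interaction-error branch rather than just listing it.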

  5. Application of Ensemble Detection and Analysis to Modeling Uncertainty in Non Stationary Process

    Science.gov (United States)

    Racette, Paul

    2010-01-01

    Characterization of non-stationary and nonlinear processes is a challenge in many engineering and scientific disciplines. Climate change modeling and projection, retrieving information from Doppler measurements of hydrometeors, and modeling calibration architectures and algorithms in microwave radiometers are example applications that can benefit from improvements in the modeling and analysis of non-stationary processes. Analyses of measured signals have traditionally been limited to a single measurement series. Ensemble Detection is a technique whereby mixing calibrated noise produces an ensemble measurement set. The collection of ensemble data sets enables new methods for analyzing random signals and offers powerful new approaches to studying and analyzing non-stationary processes. Derived information contained in the dynamic stochastic moments of a process will enable many novel applications.
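
    The core idea, estimating time-resolved moments from an ensemble of realizations rather than from a single measurement series, can be sketched on a toy process. The drift-plus-growing-noise model below is an illustrative assumption, not the radiometer calibration setup of the record.

```python
import numpy as np

rng = np.random.default_rng(1)
T, N = 200, 500                       # time samples, ensemble members
t = np.linspace(0.0, 1.0, T)

# Toy non-stationary process: drifting mean and growing standard deviation.
# A single realization cannot separate the drift from the noise; the
# ensemble recovers both moments at every instant.
ensemble = 2.0 * t + (0.2 + t) * rng.standard_normal((N, T))

mean_t = ensemble.mean(axis=0)        # time-resolved first moment
std_t = ensemble.std(axis=0)          # time-resolved second moment
```

    These per-instant moments are the "dynamic stochastic moments" the abstract refers to: statistics that are themselves functions of time.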

  6. Analysis of Space Shuttle Ground Support System Fault Detection, Isolation, and Recovery Processes and Resources

    Science.gov (United States)

    Gross, Anthony R.; Gerald-Yamasaki, Michael; Trent, Robert P.

    2009-01-01

    As part of the FDIR (Fault Detection, Isolation, and Recovery) Project for the Constellation Program, a task called the Legacy Benchmarking Task was designed to document as accurately as possible the FDIR processes and resources used by Space Shuttle ground support equipment (GSE) during the Shuttle flight program. These results served as a comparison with results obtained from the new FDIR capability. The task team assessed Shuttle and EELV (Evolved Expendable Launch Vehicle) historical data for GSE-related launch delays to identify expected benefits and impact. This analysis included a study of complex fault isolation situations that required a lengthy troubleshooting process. Specifically, four elements of that system were considered: LH2 (liquid hydrogen), LO2 (liquid oxygen), hydraulic test, and ground special power.

  7. Improvement on Exoplanet Detection Methods and Analysis via Gaussian Process Fitting Techniques

    Science.gov (United States)

    Van Ross, Bryce; Teske, Johanna

    2018-01-01

    Planetary signals in radial velocity (RV) data are often accompanied by signals coming solely from stellar photo- or chromospheric variation. Such variation can reduce the precision of planet detection and mass measurements, and cause misidentification of planetary signals. Recently, several authors have demonstrated the utility of Gaussian Process (GP) regression for disentangling planetary signals in RV observations (Aigrain et al. 2012; Angus et al. 2017; Czekala et al. 2017; Faria et al. 2016; Gregory 2015; Haywood et al. 2014; Rajpaul et al. 2015; Foreman-Mackey et al. 2017). GP regression models the covariance of multivariate data to make predictions about likely underlying trends in the data, which can be applied to regions where there are no existing observations. The potency of GP has been used to infer stellar rotation periods; to model and disentangle time series spectra; and to determine physical aspects, populations, and detection of exoplanets, among other astrophysical applications. Here, we implement similar analysis techniques on time series of the Ca II H and K activity indicator measured simultaneously with RVs in a small sample of stars from the large Keck/HIRES RV planet search program. Our goal is to characterize the pattern(s) of non-planetary variation to be able to know what is and is not a planetary signal. We investigated ten different GP kernels and their respective hyperparameters to determine the optimal combination (e.g., the lowest Bayesian Information Criterion value) in each stellar data set. To assess the hyperparameters' error, we sampled their posterior distributions using Markov chain Monte Carlo (MCMC) analysis on the optimized kernels. Our results demonstrate how GP analysis of stellar activity indicators alone can contribute to exoplanet detection in RV data, and highlight the challenges in applying GP analysis to relatively small, irregularly sampled time series.
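
    The kernel-selection step described here (choosing the GP kernel with the lowest BIC) can be sketched with scikit-learn's GaussianProcessRegressor, which is an assumed stand-in for the authors' tooling. The synthetic quasi-periodic "activity" series and the three candidate kernels are illustrative.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import (RBF, ExpSineSquared,
                                              RationalQuadratic, WhiteKernel)

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 10, 60))[:, None]   # irregularly sampled times
y = np.sin(2 * np.pi * t[:, 0] / 3.0) + 0.1 * rng.standard_normal(60)

def bic(kernel, X, y):
    """BIC = k ln(n) - 2 ln(L), with k the number of fitted hyperparameters."""
    gp = GaussianProcessRegressor(kernel=kernel + WhiteKernel(),
                                  normalize_y=True).fit(X, y)
    k = gp.kernel_.theta.size
    return k * np.log(len(y)) - 2.0 * gp.log_marginal_likelihood_value_

scores = {name: bic(kern, t, y) for name, kern in
          [("squared-exp", RBF()),
           ("periodic", ExpSineSquared()),
           ("rational-quadratic", RationalQuadratic())]}
best = min(scores, key=scores.get)
```

    The WhiteKernel term absorbs measurement noise; without it, the other hyperparameters tend to overfit the scatter, which is one of the small-sample pitfalls the abstract highlights.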

  8. Formation of hydrocarbons in irradiated Brazilian beans: gas chromatographic analysis to detect radiation processing

    International Nuclear Information System (INIS)

    Villavicencio, A.L.C.H.; Mancini-Filho, J.; Hartmann, M.; Ammon, J.; Delincee, H.

    1997-01-01

    Radiation processing of beans, which are a major source of dietary protein in Brazil, is a valuable alternative to chemical fumigation to combat postharvest losses due to insect infestation. To ensure free consumer choice, irradiated food will be labeled as such, and to enforce labeling, analytical methods to detect the irradiation treatment in the food product itself are desirable. In two varieties of Brazilian beans, Carioca and Macacar beans, the radiolytic formation of hydrocarbons, formed by cleavage alpha and beta to the carbonyl group in triglycerides, has been studied. Using gas chromatographic analysis of these radiolytic hydrocarbons, different yields per precursor fatty acid are observed for the two types of beans. However, the typical degradation pattern allows the identification of the irradiation treatment in both bean varieties, even after 6 months of storage.

  9. Surface defect detection in tiling Industries using digital image processing methods: analysis and evaluation.

    Science.gov (United States)

    Karimi, Mohammad H; Asemani, Davud

    2014-05-01

    Ceramic and tile industries should indispensably include a grading stage to quantify the quality of products. Currently, human inspection is often used for grading purposes. An automatic grading system is essential to enhance the quality control and marketing of the products. Since there generally exist six different types of defects, originating from various stages of tile manufacturing lines, with distinct textures and morphologies, many image processing techniques have been proposed for defect detection. In this paper, a survey has been made of the pattern recognition and image processing algorithms which have been used to detect surface defects. Each method appears to be limited to detecting some subgroup of defects. The detection techniques may be divided into three main groups: statistical pattern recognition, feature vector extraction and texture/image classification. Methods such as the wavelet transform, filtering, morphology and the contourlet transform are more effective for pre-processing tasks. Others, including statistical methods, neural networks and model-based algorithms, can be applied to extract the surface defects. Although statistical methods are often appropriate for identification of large defects such as spots, techniques such as wavelet processing provide an acceptable response for detection of small defects such as pinholes. A thorough survey is made in this paper of the existing algorithms in each subgroup. Also, the evaluation parameters are discussed, including supervised and unsupervised parameters. Using various performance parameters, different defect detection algorithms are compared and evaluated. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  10. Automated detection method for architectural distortion areas on mammograms based on morphological processing and surface analysis

    Science.gov (United States)

    Ichikawa, Tetsuko; Matsubara, Tomoko; Hara, Takeshi; Fujita, Hiroshi; Endo, Tokiko; Iwase, Takuji

    2004-05-01

    As well as masses and microcalcifications, architectural distortion is a very important finding for the early detection of breast cancer via mammograms, and such distortions can be classified into three typical types: spiculation, retraction, and distortion. The purpose of this work is to develop an automatic method for detecting areas of architectural distortion with spiculation. The suspect areas are detected by concentration indexes of line structures extracted by using mean curvature. After that, discriminant analysis of nine features is employed for the classification of true and false positives. The employed features are the size, the mean pixel value, the mean concentration index, the mean isotropic index, the contrast, and four other features based on the power spectrum. As a result of this work, the accuracy of the classification was 76% and the sensitivity was 80% with 0.9 false positives per image in our database in regard to spiculation. It was concluded that our method was effective in detecting areas of architectural distortion; however, some architectural distortions were not detected accurately because of the size, the density, or the different appearance of the distorted areas.

  11. Analysis and detection of an incorrect profile shape in a classical scatterometric process

    Science.gov (United States)

    Fawzi, Zaki Sabit; Robert, Stéphane; El Kalyoubi, Ismail; Bayard, Bernard

    2017-01-01

    Scatterometry has become an efficient alternative method for subwavelength diffraction grating characterisation in the semiconductor industry. It is based on the reconstruction of the periodic surface from its optical response. Ellipsometry seems to be a particularly powerful technique for the optical measurement, and neural networks have proved their effectiveness in the inverse scattering problem. However, in all cases, inverse characterisation processing needs a previously defined geometrical model. The aim of this work is to study the impact of an incorrect profile shape in the characterisation process and to measure the ability to detect it. Two types of neural networks are treated, based respectively on a fixed trapezoidal profile and a generic one involving four different shapes. Theoretical results are presented for different simulated samples including several profile shapes. Experimental results are obtained on a photoresist grating with a period of 140 nm on a silicon substrate.

  12. Acoustic emission analysis for the detection of appropriate cutting operations in honing processes

    Science.gov (United States)

    Buj-Corral, Irene; Álvarez-Flórez, Jesús; Domínguez-Fernández, Alejandro

    2018-01-01

    In the present paper, acoustic emission was studied in honing experiments with different abrasive densities: 15, 30, 45 and 60. In addition, 2D and 3D roughness, material removal rate and tool wear were determined. In order to treat the sound signal emitted during the machining process, two methods of analysis were compared: the Fast Fourier Transform (FFT) and the Hilbert-Huang Transform (HHT). When density 15 is used, the number of cutting grains is insufficient to provide correct cutting, while clogging appears with densities 45 and 60. The results were confirmed by means of treatment of the sound signal. In addition, a new parameter S was defined as the ratio between the energy at low and high frequencies contained within the emitted sound. The selected density of 30 corresponds to S values between 0.1 and 1. Correct cutting operations in honing processes depend on the density of the abrasive employed. The density value to be used can be selected by means of measurement and analysis of acoustic emissions during the honing operation. Thus, honing processes can be monitored without needing to stop the process.
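
    A low-to-high frequency energy ratio of the kind the parameter S describes can be computed directly from an FFT of the sound signal. The split frequency, the band layout, and the toy signals below are illustrative assumptions, not the paper's definitions or data.

```python
import numpy as np

def s_parameter(signal, fs, split_hz=2000.0):
    """Ratio of spectral energy below vs above a split frequency."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return power[freqs < split_hz].sum() / power[freqs >= split_hz].sum()

fs = 20000
t = np.arange(fs) / fs                      # one second of "sound"
# Toy signals: assume clogging shifts acoustic energy toward low frequencies
normal_cut = np.sin(2 * np.pi * 500 * t) + 0.5 * np.sin(2 * np.pi * 5000 * t)
clogged = 3.0 * np.sin(2 * np.pi * 500 * t) + 0.1 * np.sin(2 * np.pi * 5000 * t)
```

    Tracking such a ratio during the operation is what allows monitoring without stopping the process: only the microphone signal is needed, not a roughness measurement.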

  13. Distributed dendritic processing facilitates object detection: a computational analysis on the visual system of the fly.

    Directory of Open Access Journals (Sweden)

    Patrick Hennig

    Full Text Available BACKGROUND: Detecting objects is an important task when moving through a natural environment. Flies, for example, may land on salient objects or may avoid collisions with them. The neuronal ensemble of Figure Detection cells (FD-cells) in the visual system of the fly is likely to be involved in controlling these behaviours, as these cells are more sensitive to objects than to extended background structures. Until now the computations in the presynaptic neuronal network of FD-cells and, in particular, the functional significance of the experimentally established distributed dendritic processing of excitatory and inhibitory inputs are not understood. METHODOLOGY/PRINCIPAL FINDINGS: We use model simulations to analyse the neuronal computations responsible for the preference of FD-cells for small objects. We employed a new modelling approach which allowed us to account for the spatial spread of electrical signals in the dendrites while avoiding detailed compartmental modelling. The models are based on available physiological and anatomical data. Three models were tested, each implementing an inhibitory neural circuit, but differing by the spatial arrangement of the inhibitory interaction. Parameter optimisation with an evolutionary algorithm revealed that only distributed dendritic processing satisfies the constraints arising from electrophysiological experiments. In contrast to a direct dendro-dendritic inhibition of the FD-cell (Direct Distributed Inhibition model), an inhibition of its presynaptic retinotopic elements (Indirect Distributed Inhibition model) requires smaller changes in input resistance in the inhibited neurons during visual stimulation. CONCLUSIONS/SIGNIFICANCE: Distributed dendritic inhibition of retinotopic elements as implemented in our Indirect Distributed Inhibition model is the most plausible wiring scheme for the neuronal circuit of FD-cells. This microcircuit is computationally similar to lateral inhibition between the

  14. Distributed dendritic processing facilitates object detection: a computational analysis on the visual system of the fly.

    Science.gov (United States)

    Hennig, Patrick; Möller, Ralf; Egelhaaf, Martin

    2008-08-28

    Detecting objects is an important task when moving through a natural environment. Flies, for example, may land on salient objects or may avoid collisions with them. The neuronal ensemble of Figure Detection cells (FD-cells) in the visual system of the fly is likely to be involved in controlling these behaviours, as these cells are more sensitive to objects than to extended background structures. Until now the computations in the presynaptic neuronal network of FD-cells and, in particular, the functional significance of the experimentally established distributed dendritic processing of excitatory and inhibitory inputs are not understood. We use model simulations to analyse the neuronal computations responsible for the preference of FD-cells for small objects. We employed a new modelling approach which allowed us to account for the spatial spread of electrical signals in the dendrites while avoiding detailed compartmental modelling. The models are based on available physiological and anatomical data. Three models were tested, each implementing an inhibitory neural circuit, but differing by the spatial arrangement of the inhibitory interaction. Parameter optimisation with an evolutionary algorithm revealed that only distributed dendritic processing satisfies the constraints arising from electrophysiological experiments. In contrast to a direct dendro-dendritic inhibition of the FD-cell (Direct Distributed Inhibition model), an inhibition of its presynaptic retinotopic elements (Indirect Distributed Inhibition model) requires smaller changes in input resistance in the inhibited neurons during visual stimulation. Distributed dendritic inhibition of retinotopic elements as implemented in our Indirect Distributed Inhibition model is the most plausible wiring scheme for the neuronal circuit of FD-cells. This microcircuit is computationally similar to lateral inhibition between the retinotopic elements. Hence, distributed inhibition might be an alternative explanation of

  15. Analysis of In-Situ Vibration Monitoring for End-Point Detection of CMP Planarization Processes

    International Nuclear Information System (INIS)

    Hetherington, Dale L.; Lauffer, James P.; Shingledecker, David M.; Stein, David J.; Wyckoff, Edward E.

    1999-01-01

    This paper details the analysis of vibration monitoring for end-point control in oxide CMP processes. Two piezoelectric accelerometers were integrated onto the backside of a stainless steel polishing head of an IPEC 472 polisher. One sensor was placed perpendicular to the carrier plate (vertical) and the other parallel to the plate (horizontal). Wafers patterned with metal and coated with oxide material were polished at different speeds and pressures. Our results show that it is possible to sense a change in the vibration signal over time during planarization of oxide material on patterned wafers. The horizontal accelerometer showed more sensitivity to changes in vibration amplitude compared to the vertical accelerometer for a given polish condition. At low carrier and platen rotation rates, the change in vibration signal over time at fixed frequencies decreased by approximately ½ to 1 order of magnitude (over the 2 to 10 psi polish pressure range). At high rotation speeds, the vibration signal remained essentially constant, indicating that other factors dominated the vibration signal. These results show that while it is possible to sense changes in acceleration during polishing, more robust hardware and signal processing algorithms are required to ensure its use over a wide range of process conditions.
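
    One simple way to turn such a vibration signal into an end-point cue is to track band RMS amplitude in successive windows and flag when the trend drops below a fraction of its starting value. The exponential-decay vibration model and the 80% threshold below are illustrative assumptions, not the paper's processing chain.

```python
import numpy as np

def windowed_rms(signal, fs, win_s=0.5):
    """RMS amplitude in successive non-overlapping windows."""
    n = int(fs * win_s)
    m = len(signal) // n
    return np.sqrt((signal[: m * n].reshape(m, n) ** 2).mean(axis=1))

fs = 1000
t = np.arange(0, 30, 1.0 / fs)
# Toy polish run: assume vibration amplitude decays as the pattern planarizes
vib = (1.0 + np.exp(-t / 10.0)) * np.sin(2 * np.pi * 120.0 * t)

rms = windowed_rms(vib, fs)
endpoint_window = int(np.argmax(rms < 0.8 * rms[0]))   # first window below 80%
```

    In practice the paper's caveat applies: at some process conditions the signal barely changes, so a fixed-fraction rule like this one would need adaptive thresholds and better signal conditioning.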

  16. Research and Analysis Laser Target Optics Characteristics and Signal Recognition Processing in Detection Screen System

    Directory of Open Access Journals (Sweden)

    Hanshan LI

    2014-02-01

    Full Text Available In order to improve the measurement accuracy of the laser distance measurement system, this paper studies the laser target optical characteristics based on the laser detection principle. A calculation model of the laser reflective echo signal is put forward by analysing the factors that influence the detector output value, and the relationships between the detector-to-target distance, the laser wavelength, the laser transmission power, the detector output power and the radiation intensity are discussed. Fisher discrimination and the modulus-maxima method based on wavelet analysis are used to distinguish and identify the received echo signals. Theoretical calculation and experiment show that the laser target optical characteristics are consistent with the radiation calculation method. The real reflective signal can be identified by using the wavelet transform, and the larger the distance between the target and the detector, the smaller the echo signal.
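
    The distance dependence noted at the end of the record (a larger target distance gives a smaller echo) follows from an inverse-square range equation for a diffusely reflecting target. The coefficient form and the parameter values below (reflectivity `rho`, receiver aperture `d_rx`, optical efficiency `eta`) are illustrative assumptions, not the paper's calculation model.

```python
def echo_power(p_t, r, rho=0.3, d_rx=0.05, eta=0.8):
    """Toy range-equation estimate of received echo power from a diffuse
    target: proportional to transmit power and to 1/r^2 in target distance."""
    return p_t * rho * eta * d_rx ** 2 / (4.0 * r ** 2)
```

    Doubling the range divides the received echo power by four, which is why far targets need either more transmit power or a more sensitive detector.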

  17. Leak Detection and Model Analysis for Nonlinear Spherical Tank Process Using Conductivity Sensor

    Directory of Open Access Journals (Sweden)

    P. Madhavasarma

    2008-03-01

    Full Text Available Model parameters are affected by leaks in process plants, and such a leak can be considered as a disturbance in model evaluation. A nonlinear spherical tank of 40 liter capacity was used for this study. The process model was generated experimentally using the step response technique with online conductivity as the monitored parameter. An ATMEGA16-based microcontroller was used for the measurement of conductivity with and without a leak of process liquid. The process model parameters changed by up to seventeen percent due to the leak, over the range of flow rates studied.
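
    A step-response model of the kind the record describes can be fitted with the classic 63.2% method for a first-order process; comparing the fitted gain and time constant with and without a leak is then what flags the leak. The first-order, zero-dead-time model and the synthetic response below are illustrative assumptions.

```python
import numpy as np

def first_order_fit(t, y, u_step):
    """Estimate gain K and time constant tau from a step response
    (63.2% method; assumes zero initial condition and no dead time)."""
    K = y[-1] / u_step                              # steady-state gain
    tau = float(np.interp(0.632 * y[-1], y, t))     # time to 63.2% of final value
    return K, tau

# Synthetic step response of a first-order process, K=2, tau=8 s
t = np.linspace(0.0, 50.0, 501)
y = 2.0 * (1.0 - np.exp(-t / 8.0))
K, tau = first_order_fit(t, y, u_step=1.0)
```

    A leak drains part of the inflow, so the same step in inlet flow produces a different steady-state level and settling time; a shift in the fitted (K, tau) pair beyond normal variation is the leak signature.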

  18. A Fast Independent Component Analysis Algorithm for Geochemical Anomaly Detection and Its Application to Soil Geochemistry Data Processing

    Directory of Open Access Journals (Sweden)

    Bin Liu

    2014-01-01

    Full Text Available A fast independent component analysis algorithm (FICAA) is introduced to process geochemical data for anomaly detection. In geochemical data processing, the geological significance of separated geochemical elements must be explicit. This requires that correlation coefficients be used to overcome the limitation of indeterminacy in the sequences of signals decomposed by the FICAA, so that the sequences of the decomposed signals can be correctly reflected. Meanwhile, the problem of indeterminacy in the scaling of the signals decomposed by the FICAA can be solved by the cumulative frequency method (CFM). To classify surface geochemical samples into true anomalies and false anomalies, assays of the 1:10 000 soil geochemical data in the area of Dachaidan in the Qinghai province of China are processed. The CFM and FICAA are used to detect the anomalies of Cu and Au. The results of this research demonstrate that the FICAA can demultiplex the mixed signals and achieve results similar to actual mineralization when 85%, 95%, and 98% are chosen as three levels of anomaly delineation. However, the traditional CFM failed to produce realistic results and has no significant use for prospecting indication. It is shown that application of the FICAA to geochemical data processing is effective.
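
    The two indeterminacies the record mentions (component order and component scale) can be illustrated with scikit-learn's FastICA as an assumed stand-in for the FICAA: order is recovered by correlation with reference signals, and scale is sidestepped by delineating anomalies at cumulative-frequency (percentile) levels. The toy "Cu"/"Au" sources and the mixing matrix are fabricated for illustration.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
n = 2000
s_cu = np.sin(np.arange(n) * 0.05)          # toy "Cu" source signal
s_au = rng.laplace(size=n)                  # toy "Au" source signal
S = np.c_[s_cu, s_au]
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = S @ A.T                                 # observed mixed element assays

S_hat = FastICA(n_components=2, random_state=0).fit_transform(X)

# ICA leaves component order undetermined: recover it by correlation
corr = np.corrcoef(S.T, S_hat.T)[:2, 2:]
order = np.abs(corr).argmax(axis=1)

# ICA also leaves scale (and sign) undetermined: delineate anomalies by
# cumulative frequency (percentile) levels rather than absolute values
comp_au = S_hat[:, order[1]]
anomaly_85 = np.abs(comp_au) > np.quantile(np.abs(comp_au), 0.85)
```

    Thresholding at the 85%, 95% and 98% cumulative-frequency levels, as in the record, gives nested anomaly maps that are invariant to ICA's arbitrary scaling.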

  19. The rSPA Processes of River Water-quality Analysis System for Critical Contaminate Detection, Classification Multiple-water-quality-parameter Values and Real-time Notification

    OpenAIRE

    Chalisa VEESOMMAI; Yasushi KIYOKI

    2016-01-01

    The water quality analysis is one of the most important aspects of designing environmental systems. It is necessary to realize detection and classification processes and systems for water quality analysis. An important direction is to lead to uncomplicated understanding for public utilization. This paper presents the river Sensing Processing Actuation processes (rSPA) for determination and classification of multiple water-quality parameters in the Chaophraya River. According to rSPA processes of multip...

  20. Unconscious processes improve lie detection.

    Science.gov (United States)

    Reinhard, Marc-André; Greifeneder, Rainer; Scharmach, Martin

    2013-11-01

    The capacity to identify cheaters is essential for maintaining balanced social relationships, yet humans have been shown to be generally poor deception detectors. In fact, a plethora of empirical findings holds that individuals are only slightly better than chance when discerning lies from truths. Here, we report 5 experiments showing that judges' ability to detect deception greatly increases after periods of unconscious processing. Specifically, judges who were kept from consciously deliberating outperformed judges who were encouraged to do so or who made a decision immediately; moreover, unconscious thinkers' detection accuracy was significantly above chance level. The reported experiments further show that this improvement comes about because unconscious thinking processes allow for integrating the particularly rich information basis necessary for accurate lie detection. These findings suggest that the human mind is not unfit to distinguish between truth and deception but that this ability resides in previously overlooked processes. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  1. Malware detection and analysis

    Science.gov (United States)

    Chiang, Ken; Lloyd, Levi; Crussell, Jonathan; Sanders, Benjamin; Erickson, Jeremy Lee; Fritz, David Jakob

    2016-03-22

    Embodiments of the invention describe systems and methods for malicious software detection and analysis. A binary executable comprising obfuscated malware on a host device may be received, and incident data indicating a time when the binary executable was received and identifying processes operating on the host device may be recorded. The binary executable is analyzed via a scalable plurality of execution environments, including one or more non-virtual execution environments and one or more virtual execution environments, to generate runtime data and deobfuscation data attributable to the binary executable. At least some of the runtime data and deobfuscation data attributable to the binary executable is stored in a shared database, while at least some of the incident data is stored in a private, non-shared database.

  2. Phase analysis of coherent radial-breathing-mode phonons in carbon nanotubes: Implications for generation and detection processes

    Science.gov (United States)

    Shimura, Akihiko; Yanagi, Kazuhiro; Yoshizawa, Masayuki

    2018-01-01

    In time-resolved pump-probe spectroscopy of carbon nanotubes, the fundamental understanding of the optical generation and detection processes of radial-breathing-mode (RBM) phonons has been inconsistent among the previous reports. In this study, the tunable-pumping/broadband-probing scheme was used to fully reveal the amplitude and phase of the phonon-modulated signals. We observed that signals detected off resonantly to excitonic transitions are delayed by π/2 radians with respect to resonantly detected signals, which demonstrates that RBM phonons are detected through dynamically modulating the linear response, not through adiabatically modulating the light absorption. Furthermore, we found that the initial phases are independent of the pump detuning across the first (E11) and the second (E22) excitonic resonances, evidencing that the RBM phonons are generated by the displacive excitation rather than the stimulated Raman process.

  3. Activation analysis. Detection limits

    International Nuclear Information System (INIS)

    Revel, G.

    1999-01-01

    Numerical data and limits of detection related to the four irradiation modes often used in activation analysis (reactor neutrons, 14 MeV neutrons, gamma photons and charged particles) are presented here. The technical presentation of activation analysis is detailed in paper P 2565 of Techniques de l'Ingenieur. (A.L.B.)

  4. Crack detection using image processing

    International Nuclear Information System (INIS)

    Moustafa, M.A.A

    2010-01-01

    This thesis contains five main subjects in eight chapters and two appendices. The first subject discusses the Wiener filter for filtering images. In the second subject, we examine different methods, such as the Steepest Descent Algorithm (SDA) and the wavelet transformation, for detecting and filling cracks, and their applications in areas such as nanotechnology and biotechnology. In the third subject, we attempt to reconstruct 3-D images from 1-D or 2-D images using texture mapping with OpenGL under the Visual C++ programming language. The fourth subject covers the use of image-warping methods for finding the depth of 2-D images using affine transformation, bilinear transformation, projective mapping, mosaic warping and similarity transformation; more details about this subject are discussed below. The fifth subject, Bezier curves and surfaces, is discussed in detail, including methods for creating Bezier curves and surfaces with unknown distribution using only control points. At the end of the discussion we obtain the solid form using the so-called NURBS (Non-Uniform Rational B-Spline) representation, which depends on the degree of freedom, control points, knots, and an evaluation rule, and which is defined as a mathematical representation of 3-D geometry that can accurately describe any shape, from a simple 2-D line, circle, arc or curve to the most complex 3-D organic free-form surface or solid; it relies on finding the Bezier curve and creating a family of curves (a surface), then filling in between them to obtain the solid form. Another part of this subject concerns building 3-D geometric models from physical objects using image-based techniques, whose advantage is that they require no expensive equipment; we use NURBS, subdivision surfaces and meshes for finding the depth of any image from one still view or 2-D image. The quality of filtering depends on the way the data is incorporated into the model. The data should be treated with

  5. Novel image processing approach to detect malaria

    Science.gov (United States)

    Mas, David; Ferrer, Belen; Cojoc, Dan; Finaurini, Sara; Mico, Vicente; Garcia, Javier; Zalevsky, Zeev

    2015-09-01

    In this paper we present a novel image processing algorithm providing good preliminary capabilities for in vitro detection of malaria. The proposed concept is based upon analysis of the temporal variation of each pixel: changes in dark pixels indicate intracellular activity, and hence the presence of the malaria parasite inside the cell. Preliminary experimental results involving analysis of red blood cells, either healthy or infected with malaria parasites, validated the potential benefit of the proposed numerical approach.
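The per-pixel temporal analysis described above can be sketched in a few lines of NumPy. This is a hedged illustration only: the variance statistic and both thresholds are assumptions, not values taken from the paper.

```python
import numpy as np

def flag_active_dark_pixels(frames, dark_thresh=0.3, var_thresh=0.01):
    """Flag dark pixels whose intensity varies over time.

    frames: array of shape (T, H, W) with intensities in [0, 1].
    A pixel is 'dark' if its mean intensity is below dark_thresh and
    'active' if its temporal variance exceeds var_thresh (both
    thresholds are illustrative, not from the paper).
    """
    frames = np.asarray(frames, dtype=float)
    mean = frames.mean(axis=0)   # per-pixel mean over time
    var = frames.var(axis=0)     # per-pixel temporal variance
    return (mean < dark_thresh) & (var > var_thresh)

# A 4-frame, 2x2 toy sequence: pixel (0, 0) is dark and flickers,
# pixel (1, 1) is bright and static.
frames = np.zeros((4, 2, 2))
frames[:, 0, 0] = [0.0, 0.3, 0.0, 0.3]
frames[:, 1, 1] = 0.9
mask = flag_active_dark_pixels(frames)
```

Only the dark, flickering pixel is flagged, which is the behaviour the abstract attributes to infected cells.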

  6. A rapid automatic processing platform for bead label-assisted microarray analysis: application for genetic hearing-loss mutation detection.

    Science.gov (United States)

    Zhu, Jiang; Song, Xiumei; Xiang, Guangxin; Feng, Zhengde; Guo, Hongju; Mei, Danyang; Zhang, Guohao; Wang, Dong; Mitchelson, Keith; Xing, Wanli; Cheng, Jing

    2014-04-01

    Molecular diagnostics using microarrays are increasingly being used in clinical diagnosis because of their high throughput, sensitivity, and accuracy. However, standard microarray processing takes several hours and involves manual steps during hybridization, slide clean up, and imaging. Here we describe the development of an integrated platform that automates these individual steps as well as significantly shortens the processing time and improves reproducibility. The platform integrates such key elements as a microfluidic chip, flow control system, temperature control system, imaging system, and automated analysis of clinical results. Bead labeling of microarray signals required a simple imaging system and allowed continuous monitoring of the microarray processing. To demonstrate utility, the automated platform was used to genotype hereditary hearing-loss gene mutations. Compared with conventional microarray processing procedures, the platform increases the efficiency and reproducibility of hybridization, speeding microarray processing through to result analysis. The platform also continuously monitors the microarray signals, which can be used to facilitate optimization of microarray processing conditions. In addition, the modular design of the platform lends itself to development of simultaneous processing of multiple microfluidic chips. We believe the novel features of the platform will benefit its use in clinical settings in which fast, low-complexity molecular genetic testing is required.

  7. Detecting periodicities with Gaussian processes

    Directory of Open Access Journals (Sweden)

    Nicolas Durrande

    2016-04-01

    Full Text Available We consider the problem of detecting and quantifying the periodic component of a function given noise-corrupted observations of a limited number of input/output tuples. Our approach is based on Gaussian process regression, which provides a flexible non-parametric framework for modelling periodic data. We introduce a novel decomposition of the covariance function as the sum of periodic and aperiodic kernels. This decomposition allows for the creation of sub-models which capture the periodic nature of the signal and its complement. To quantify the periodicity of the signal, we derive a periodicity ratio which reflects the uncertainty in the fitted sub-models. Although the method can be applied to many kernels, we give a special emphasis to the Matérn family, from the expression of the reproducing kernel Hilbert space inner product to the implementation of the associated periodic kernels in a Gaussian process toolkit. The proposed method is illustrated by considering the detection of periodically expressed genes in the arabidopsis genome.
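The kernel decomposition idea can be illustrated with a small NumPy sketch. The periodic/aperiodic pair below (exponentiated-sine plus squared-exponential) is an illustrative stand-in for the Matérn-based decomposition derived in the paper, and the "periodicity ratio" shown is a crude variance share rather than the paper's uncertainty-based ratio.

```python
import numpy as np

def k_periodic(x1, x2, period=1.0, ls=1.0):
    # Exponentiated-sine ("periodic") kernel.
    d = np.subtract.outer(x1, x2)
    return np.exp(-2.0 * np.sin(np.pi * d / period) ** 2 / ls ** 2)

def k_aperiodic(x1, x2, ls=1.0):
    # Squared-exponential kernel as the aperiodic complement.
    d = np.subtract.outer(x1, x2)
    return np.exp(-0.5 * d ** 2 / ls ** 2)

def periodic_submodel_mean(x, y, xs, noise=1e-2):
    """Posterior mean of the periodic sub-model under k = k_per + k_aper."""
    K = k_periodic(x, x) + k_aperiodic(x, x) + noise * np.eye(len(x))
    alpha = np.linalg.solve(K, y)
    return k_periodic(xs, x) @ alpha   # project the fit onto the periodic kernel

x = np.linspace(0.0, 4.0, 40)
y = np.sin(2.0 * np.pi * x)            # a purely periodic signal, period 1
f_per = periodic_submodel_mean(x, y, x)
ratio = np.var(f_per) / np.var(y)      # share of variance in the periodic part
```

For this purely periodic input the periodic sub-model absorbs essentially all of the signal, so the ratio is close to one; adding an aperiodic trend to `y` would lower it.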

  8. Integration of sensitivity and bifurcation analysis to detect critical processes in a model combining signalling and cell population dynamics

    Science.gov (United States)

    Nikolov, S.; Lai, X.; Liebal, U. W.; Wolkenhauer, O.; Vera, J.

    2010-01-01

    In this article we present and test a strategy to integrate, in a sequential manner, sensitivity analysis, bifurcation analysis and predictive simulations. Our strategy uses some of these methods in a coordinated way such that information, generated in one step, feeds into the definition of further analyses and helps refining the structure of the mathematical model. The aim of the method is to help in the designing of more informative predictive simulations, which focus on critical model parameters and the biological effects of their modulation. We tested our methodology with a multilevel model, accounting for the effect of erythropoietin (Epo)-mediated JAK2-STAT5 signalling in erythropoiesis. Our analysis revealed that time-delays associated with the proliferation-differentiation process are critical to induce pathological sustained oscillations, whereas the modulation of time-delays related to intracellular signalling and hypoxia-controlled physiological dynamics is not enough to induce self-oscillations in the system. Furthermore, our results suggest that the system is able to compensate (through the physiological-level feedback loop on hypoxia) the partial impairment of intracellular signalling processes (downregulation or overexpression of Epo receptor complex and STAT5), but cannot control impairment in some critical physiological-level processes, which provoke the emergence of pathological oscillations.

  9. Metabonomics for detection of nuclear materials processing

    International Nuclear Information System (INIS)

    Alam, Todd Michael; Luxon, Bruce A.; Neerathilingam, Muniasamy; Ansari, S.; Volk, David; Sarkar, S.; Alam, Mary Kathleen

    2010-01-01

    Tracking nuclear materials production and processing, particularly covert operations, is a key national security concern, given that nuclear materials processing can be a signature of nuclear weapons activities by US adversaries. Covert trafficking can also result in homeland security threats, most notably allowing terrorists to assemble devices such as dirty bombs. Existing methods depend on isotope analysis and do not necessarily detect chronic low-level exposure. In this project, indigenous organisms such as plants, small mammals, and bacteria are utilized as living sensors for the presence of chemicals used in nuclear materials processing. Such 'metabolic fingerprinting' (or 'metabonomics') employs nuclear magnetic resonance (NMR) spectroscopy to assess alterations in organismal metabolism provoked by the environmental presence of nuclear materials processing, for example the tributyl phosphate employed in the processing of spent reactor fuel rods to extract and purify uranium and plutonium for weaponization.

  10. Planetary Crater Detection and Registration Using Marked Point Processes, Multiple Birth and Death Algorithms, and Region-Based Analysis

    Science.gov (United States)

    Solarna, David; Moser, Gabriele; Le Moigne-Stewart, Jacqueline; Serpico, Sebastiano B.

    2017-01-01

    Because of the large variety of sensors and spacecraft collecting data, planetary science needs to integrate various multi-sensor and multi-temporal images. These multiple data represent a precious asset, as they allow the study of targets' spectral responses and of changes in surface structure; because of their variety, they also require accurate and robust registration. A new crater detection algorithm, used to extract features that will be integrated in an image registration framework, is presented. A marked point process-based method has been developed to model the spatial distribution of elliptical objects (i.e. the craters), and a birth-death Markov chain Monte Carlo method, coupled with a region-based scheme aiming at computational efficiency, is used to find the optimal configuration fitting the image. The extracted features are exploited, together with a newly defined fitness function based on a modified Hausdorff distance, by an image registration algorithm whose architecture has been designed to minimize computational time.
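The modified Hausdorff distance underlying the fitness function can be sketched as follows. This is the standard Dubuisson-Jain formulation; the point sets below are invented crater centres for illustration, not data from the paper.

```python
import numpy as np

def modified_hausdorff(A, B):
    """Modified Hausdorff distance between two 2-D point sets.

    Instead of the max of point-to-set distances (classical Hausdorff),
    it uses the mean, which is much less sensitive to outlier points --
    a common choice for feature-matching fitness functions.
    """
    A, B = np.asarray(A, float), np.asarray(B, float)
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)  # pairwise distances
    d_ab = d.min(axis=1).mean()   # mean distance from A to nearest point of B
    d_ba = d.min(axis=0).mean()   # mean distance from B to nearest point of A
    return max(d_ab, d_ba)

# Two crater-centre sets offset by one pixel in x.
A = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
B = A + np.array([1.0, 0.0])
```

With every point shifted by one pixel the distance is exactly 1.0, and it is zero for identical sets, which is the behaviour a registration fitness function needs.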

  11. BEAP profiles as rapid test system for status analysis and early detection of process incidents in biogas plants.

    Science.gov (United States)

    Refai, Sarah; Berger, Stefanie; Wassmann, Kati; Hecht, Melanie; Dickhaus, Thomas; Deppenmeier, Uwe

    2017-03-01

    A method was developed to quantify the performance of microorganisms involved in different digestion levels in biogas plants. The test system was based on the addition of butyrate (BCON), ethanol (ECON), acetate (ACON) or propionate (PCON) to biogas sludge samples and the subsequent analysis of CH4 formation in comparison to control samples. The combination of the four values was referred to as a BEAP profile. Determination of BEAP profiles enabled rapid testing of a biogas plant's metabolic state within 24 h and an accurate mapping of all degradation levels in a lab-scale experimental setup. Furthermore, it was possible to distinguish between specific BEAP profiles for standard biogas plants and for biogas reactors with process incidents (onset of NH4+-N inhibition, start of acidification, insufficient hydrolysis and potential mycotoxin effects). Finally, BEAP profiles also functioned as a warning system for the early prediction of critical NH4+-N concentrations leading to a drop in CH4 formation.

  12. Object Detection using Image Processing

    OpenAIRE

    Jalled, Fares; Voronkov, Ilia

    2016-01-01

    An Unmanned Aerial Vehicle (UAV) has great importance for army border security. The main objective of this article is to develop an OpenCV-Python code using the Haar cascade algorithm for object and face detection. Currently, UAVs are used for detecting and attacking infiltrated ground targets. The main drawback of this type of UAV is that sometimes objects are not properly detected, which can cause the object to hit the UAV. This project aims to avoid such unwanted collisio...

  13. Random Deterioration Process of Conveyor Belt Evaluated by Statistical Analysis of Core Failures Detected Along Belt Axis and Elapsed Time

    Science.gov (United States)

    Blazej, Ryszard; Jurdziak, Leszek; Kirjanów, Agata; Kozlowski, Tomasz

    2017-12-01

    Magnetic diagnostic methods have been used for steel-cord belt condition evaluation since the beginning of the 1970s. Initially they generated an analogue signal, either for several tens of centimetres of conveyor belt scanned sequentially with one measuring head in several cycles, or for the whole width of the belt at one time thanks to the installation of many measuring heads across the entire cross-section. This did not allow identification of single-centimetre failures, but rather an aggregate assessment of the state of a quite wide band. Modern diagnostic devices, thanks to miniaturization, allow up to 200 heads per belt width and can identify damage to individual cords. Instead of analogue signals, they generate a zero-one digital signal corresponding to a change in the magnetic field sign, which can illustrate damage on 2D images. This makes it easier to identify the location and size of the damage in the belt image. Statistical analysis of the digital signals, summed for consecutive sections along the belt axis, allows presentation of both the source signal and its aggregation over a band of a given width, forming aggregate measures of belt damage such as the damage density per 1 metre of belt. Observation of changes in these measures at different times allows evaluation of their rate of change over time, which can be used to forecast future belt condition and to select the proper moment for preventive belt replacement, avoiding emergency downtime (e.g. in underground mines) or allowing reconditioning of belts (e.g. in lignite surface mines). The paper presents the results of investigations of the damage condition of the core of a single belt segment working in one of the copper ore underground mines. Scanning of the belt condition was performed a few times at intervals of several months. The paper presents the results of the analysis of the changes in core condition, showing the random character of the damage process along the axis and its change over time.
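The aggregation step described above, summing the zero-one head signals across the belt width and binning them along the axis into a damage density per metre, might be sketched like this. The array shapes and the sampling rate are assumptions for illustration.

```python
import numpy as np

def damage_density_per_metre(hits, samples_per_metre):
    """Aggregate a zero-one damage signal into damages per metre of belt.

    hits: 2D array (measuring heads across belt width x samples along the
    belt axis); each entry is 1 where a magnetic sign change (cord damage)
    was detected. samples_per_metre: scanner samples per metre of belt
    travel (an assumed calibration constant).
    """
    per_sample = hits.sum(axis=0)   # collapse the belt-width dimension
    n = (len(per_sample) // samples_per_metre) * samples_per_metre
    return per_sample[:n].reshape(-1, samples_per_metre).sum(axis=1)

# 4 heads, 6 axial samples, 3 samples per metre -> 2 metres of belt.
hits = np.zeros((4, 6), dtype=int)
hits[1, 0] = hits[2, 1] = 1   # two damages in the first metre
hits[0, 5] = 1                # one damage in the second metre
density = damage_density_per_metre(hits, samples_per_metre=3)
```

Comparing such density vectors from scans taken months apart gives the rate-of-change measure the abstract uses for replacement forecasting.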

  14. Post-factum detection of radiation treatment in processed food by analysis of radiation-induced hydrocarbons. Pt. 1. Applying the method L 06.00-37 defined in Para. 35 LMBG (German Act on Food Irradiation) to processed food

    International Nuclear Information System (INIS)

    Hartmann, M.; Ammon, J.; Berg, H.

    1995-01-01

    The German official method L 06.00-37 (Para. 35, German Act on Food Irradiation) is used for the identification of irradiated fat-containing food by GC analysis of radiation-induced hydrocarbons. Simple modifications in sample preparation allow a distinct improvement in detection capabilities and detection limits. The applicability of the modified method to the detection of irradiated ingredients in model processed foods is shown. Identification of as little as 3% (ratio of irradiated fat to total fat) of an irradiated ingredient (1.5 kGy) in processed food was possible. Additionally, the kind of irradiated ingredient could be identified from the pattern of radiation-induced hydrocarbons, whose concentrations correspond to the fatty acid composition of the irradiated component. (orig.) [de

  15. Badge Office Process Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-12

    The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.

  16. Process energy analysis

    International Nuclear Information System (INIS)

    Kaiser, V.

    1993-01-01

    In Chapter 2 process energy cost analysis for chemical processing is treated in a general way, independent of the specific form of energy and power production. Especially, energy data collection and data treatment, energy accounting (metering, balance setting), specific energy input, and utility energy costs and prices are discussed. (R.P.) 14 refs., 4 figs., 16 tabs

  17. Aluminium Process Fault Detection and Diagnosis

    Directory of Open Access Journals (Sweden)

    Nazatul Aini Abd Majid

    2015-01-01

    The challenges in developing a fault detection and diagnosis system for industrial applications are not inconsiderable, particularly for complex materials-processing operations such as aluminium smelting. However, organizing the various fault detection and diagnostic systems of the aluminium smelting process into groups can assist in identifying the key elements of an effective monitoring system. This paper reviews aluminium process fault detection and diagnosis systems and proposes a taxonomy that includes four key elements: knowledge, techniques, usage frequency, and results presentation. Each element is explained together with examples of existing systems. A fault detection and diagnosis system developed on the basis of the proposed taxonomy is demonstrated using aluminium smelting data. A potential new strategy for improving fault diagnosis is discussed, based on the ability of a new technology, augmented reality, to augment operators' view of an industrial plant so as to permit situation-oriented action in real working environments.

  18. Chemical process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

    The Office of Worker Health and Safety (EH-5), under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE), has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs and should not be considered a complete resource on PrHA methods. Likewise, to determine whether a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule, to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule; nowhere have requirements been added beyond what is specifically required by the rule.

  19. Chemical analysis of raw and processed Fructus arctii by high-performance liquid chromatography/diode array detection-electrospray ionization-mass spectrometry

    Science.gov (United States)

    Qin, Kunming; Liu, Qidi; Cai, Hao; Cao, Gang; Lu, Tulin; Shen, Baojia; Shu, Yachun; Cai, Baochang

    2014-01-01

    Background: In traditional Chinese medicine (TCM), raw and processed herbs are used to treat different diseases. Fructus Arctii, the dried fruit of Arctium lappa L. (Compositae), is widely used in TCM. Stir-frying is the most common processing method, which might modify the chemical composition of Fructus Arctii. Materials and Methods: To test this hypothesis, we focused on analysis and identification of the main chemical constituents in raw and processed Fructus Arctii (PFA) by high-performance liquid chromatography/diode array detection-electrospray ionization-mass spectrometry. Results: The results indicated that there was less arctiin in stir-fried material than in raw material, but higher levels of arctigenin. Conclusion: We suggest that arctiin is reduced significantly through thermal conversion of arctiin to arctigenin. This finding may shed some light on the differences in the therapeutic values of raw versus processed Fructus Arctii in TCM. PMID:25422559

  20. BURAR: Detection and signal processing capabilities

    International Nuclear Information System (INIS)

    Ghica, Daniela; Radulian, Mircea; Popa, Mihaela

    2004-01-01

    Since July 2002, a new seismic monitoring station, the Bucovina Seismic Array (BURAR), has been operating in the northern part of Romania, installed in a joint effort of the Air Force Technical Applications Center, USA, and the National Institute for Earth Physics (NIEP), Romania. The array consists of 10 seismic sensors (9 short-period and one broadband) located in boreholes and distributed over a 5 x 5 km² area. At present, the seismic data are continuously recorded by BURAR and transmitted in real time to the Romanian National Data Centre (ROMNDC) in Bucharest and to the National Data Center of the USA in Florida. Statistical analysis of the seismic information gathered at ROMNDC by BURAR in the August 2002 - December 2003 time interval points to a much better efficiency of the BURAR system in detecting teleseismic events and local events occurring in the N-NE part of Romanian territory, in comparison with the current Romanian Telemetered Network. Furthermore, the BURAR monitoring system has proven to be an important source of reliable data for NIEP's efforts in issuing seismic bulletins. The signal processing capability of the system provides useful information for improving the location of local seismic events using the array beamforming procedure, which significantly increases the signal-to-noise ratio by summing the coherent signals from the array components; in this way, possible source nucleation phases can be detected. At the same time, using the slowness and back-azimuth estimates from f-k analysis, locations for seismic events can be established based only on the information recorded by the BURAR array, acting as a single-station recording system. (authors)

  1. The Data Analysis in Gravitational Wave Detection

    Science.gov (United States)

    Wang, Xiao-ge; Lebigot, Eric; Du, Zhi-hui; Cao, Jun-wei; Wang, Yun-yong; Zhang, Fan; Cai, Yong-zhi; Li, Mu-zi; Zhu, Zong-hong; Qian, Jin; Yin, Cong; Wang, Jian-bo; Zhao, Wen; Zhang, Yang; Blair, David; Ju, Li; Zhao, Chun-nong; Wen, Lin-qing

    2017-01-01

    Gravitational-wave (GW) astronomy, based on GW detection, is a rising interdisciplinary field and a new window for humanity to observe the universe, following traditional astronomy, which uses electromagnetic waves as its detection means. It has quite important significance for studying the origin and evolution of the universe and for extending the field of astronomical research. The appearance of the laser-interferometer GW detector has opened a new era of GW detection, and the data processing and analysis of GWs have developed quickly around the world, providing a sharp weapon for GW astronomy. This paper systematically introduces the software tools commonly used for GW data analysis, and discusses in detail the basic methods used in GW data analysis, such as time-frequency analysis, composite analysis, pulsar timing analysis, the matched filter, templates, the χ2 test, and Monte Carlo simulation.
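Of the methods listed, the matched filter is the core detection tool. A minimal white-noise sketch in NumPy follows; real GW pipelines whiten the data by the detector's noise power spectral density and use physical waveform templates, so everything below (the wave-packet template, noise level, and lag) is an invented toy setup.

```python
import numpy as np

def matched_filter(data, template):
    """Circular cross-correlation of data with a template via the FFT.

    The lag of the correlation peak estimates the signal arrival time.
    This toy version assumes white noise; real GW searches divide by
    the detector noise spectrum before correlating.
    """
    D = np.fft.rfft(data)
    T = np.fft.rfft(template)
    corr = np.fft.irfft(D * np.conj(T), len(data))
    return corr / np.linalg.norm(template)

# A Gaussian-windowed wave packet hidden in noise at lag 40.
n = 256
t = np.arange(n)
template = np.exp(-0.5 * ((t - 128) / 10.0) ** 2) * np.sin(2 * np.pi * t / 8.0)
rng = np.random.default_rng(0)
data = 0.05 * rng.standard_normal(n) + np.roll(template, 40)
snr = matched_filter(data, template)
lag = int(np.argmax(snr))   # recovered arrival lag
```

Even with the signal buried in noise, the correlation peak sits at the true lag, which is why matched filtering is optimal for known waveforms in Gaussian noise.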

  2. Proactive detection of bones in poultry processing

    Science.gov (United States)

    Daley, W. D. R.; Stewart, John

    2009-05-01

    Bones continue to be a problem of concern for the poultry industry. Most further-processed products begin with the requirement for raw material with minimal bones. The current process for generating deboned product requires systems for monitoring and inspecting the output product. The current detection systems are either people palpating the product or X-ray systems, and their performance is below the desired level of accuracy as well as costly. We propose a technique for monitoring bones that conducts the inspection operation within the deboning process, so that there is enough time to take action to reduce the probability that bones end up in the final product. This is accomplished by developing active cones with built-in illumination to backlight the cage (skeleton) on the deboning line: if the bones of interest are still on the cage, then they are not in the associated meat. This approach also allows process control to be practised on the deboning operation, keeping the process under control, as opposed to the current system where detection is done post-production and does not easily present the opportunity to adjust the process. The proposed approach shows overall accuracies of about 94% for the detection of clavicle bones.

  3. Signal processing aspects of windshear detection

    Science.gov (United States)

    Aalfs, David D.; Baxa, Ernest G., Jr.; Bracalente, Emedio M.

    1993-01-01

    Low-altitude windshear (LAWS) has been identified as a major hazard to aircraft, particularly during takeoff and landing. The Federal Aviation Administration (FAA) has been involved with developing technology to detect LAWS. A key element in this technology is high resolution pulse Doppler weather radar equipped with signal and data processing to provide timely information about possible hazardous conditions.

  4. Change Detection Analysis With Spectral Thermal Imagery

    National Research Council Canada - National Science Library

    Behrens, Richard

    1998-01-01

    ... (LWIR) region. This study used analysis techniques of differencing, histograms, and principal components analysis to detect spectral changes and investigate the utility of spectral change detection...

  5. PCB Fault Detection Using Image Processing

    Science.gov (United States)

    Nayak, Jithendra P. R.; Anitha, K.; Parameshachari, B. D., Dr.; Banu, Reshma, Dr.; Rashmi, P.

    2017-08-01

    The importance of the printed circuit board (PCB) inspection process has been magnified by the requirements of the modern manufacturing environment, where delivery of 100% defect-free PCBs is the expectation. To meet such expectations, identifying the various defects and their types becomes the first step. In this PCB inspection system the inspection algorithm focuses mainly on defect detection using natural images. Many practical issues, such as tilt of the images, bad lighting conditions, and the height at which images are taken, must be considered to ensure image quality good enough for defect detection. PCB fabrication is a multidisciplinary process, and etching is the most critical part of the PCB manufacturing process; its main objective is to remove the exposed unwanted copper other than the required circuit pattern. In order to minimize scrap caused by wrongly etched PCB panels, inspection has to be done at an early stage. However, all of the inspections are currently done after the etching process, where any defective PCB found is no longer useful and is simply thrown away. Since the etching process accounts for a significant portion of the cost of PCB fabrication, it is uneconomical to simply discard defective PCBs. In this paper a method to identify defects in natural PCB images is presented and the associated practical issues are addressed using software tools. Some of the major types of single-layer PCB defects are pattern cut, pinhole, pattern short, and nick; such defects should be identified before the etching process so that the PCB can be reprocessed. The present approach is expected to improve the efficiency of the system in detecting defects, even in low-quality images.
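A common starting point for the defect-detection step, once the test image has been registered and binarized, is an XOR comparison against a defect-free reference board. The sketch below is a hedged illustration, with invented images and defect positions; it is not the system described in the paper, which works on natural images with the practical complications noted above.

```python
import numpy as np

def pcb_defects(reference, test):
    """Locate defects as pixels where a binarized test image differs
    from the defect-free reference (simple XOR comparison).

    Registration and tilt correction, mentioned in the text as
    practical issues, are assumed to have been done already.
    """
    return np.logical_xor(reference.astype(bool), test.astype(bool))

reference = np.zeros((5, 5), dtype=int)
reference[2, :] = 1            # a horizontal copper track
test = reference.copy()
test[2, 2] = 0                 # pattern cut (open circuit)
test[4, 4] = 1                 # spurious copper (potential short)
defects = pcb_defects(reference, test)
```

Missing copper (pattern cut, pinhole) and extra copper (pattern short, nick) both light up in the XOR map; classifying which is which requires comparing against the reference at each flagged pixel.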

  6. Network Anomaly Detection Based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Ali A. Ghorbani

    2008-11-01

    Signal processing techniques have recently been applied to analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network, where five attack types are successfully detected from over 30 million flows.
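As a toy illustration of the wavelet side of such an approach: one level of the Haar transform splits a traffic signal into a smooth approximation and a detail band, and abrupt changes stand out in the detail coefficients. The Haar filter and the median-based threshold below are assumptions for illustration; the paper itself combines wavelet approximation with system identification over fifteen traffic features.

```python
import numpy as np

def haar_level1(signal):
    """One level of the Haar wavelet transform: pairwise averages
    (approximation) and pairwise half-differences (detail)."""
    s = np.asarray(signal, float)
    pairs = s[: len(s) // 2 * 2].reshape(-1, 2)
    approx = pairs.mean(axis=1)               # low-pass band
    detail = (pairs[:, 0] - pairs[:, 1]) / 2.0  # high-pass band
    return approx, detail

# Per-interval traffic counts with an abrupt spike in the 6th slot.
traffic = np.array([10.0, 10.0, 11.0, 11.0, 10.0, 60.0, 10.0, 11.0])
approx, detail = haar_level1(traffic)
# Flag pairs whose detail coefficient is far above the typical level.
score = np.abs(detail)
anomalous_pairs = np.where(score > 3.0 * np.median(score))[0]
```

Only the pair containing the spike is flagged; the smooth baseline passes into the approximation band untouched, which is the property that makes wavelets attractive for traffic anomaly detection.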

  7. Improvement of lung abnormality detection in computed radiography using multi-objective frequency processing: Evaluation by receiver operating characteristics (ROC) analysis

    International Nuclear Information System (INIS)

    Tagashira, Hiroyuki; Arakawa, Kenji; Yoshimoto, Masahiro; Mochizuki, Teruhito; Murase, Kenya; Yoshida, Hiroyuki

    2008-01-01

    Computed radiography (CR) has been shown to have relatively low sensitivity for the detection of pulmonary nodules. This poor sensitivity precludes its use as a screening modality despite the low cost, low dose and wide availability of devices. The purpose of this study was to apply multi-objective frequency processing (MFP) to CR images and to evaluate its usefulness for diagnosing subtle lung abnormalities. Fifty CR images with simulated subtle lung abnormalities were obtained from 50 volunteers, and each image was processed with MFP. Each chest image was then divided into right and left halves, giving a total of 200 half-chest images (100 MFP-processed and 100 unprocessed). Five radiologists participated in this study. ROC analyses demonstrated that the detection rate of simulated subtle lung abnormalities on the CR images was significantly better with MFP (Az = 0.8508) than without MFP (Az = 0.7925). CR images processed with MFP could therefore be useful for diagnosing subtle lung abnormalities. In conclusion, MFP appears to be useful for increasing the sensitivity and specificity of the detection of pulmonary nodules, ground-glass opacity (GGO) and reticular shadow
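The Az statistic reported above is the area under the ROC curve. For discrete confidence ratings it can be computed directly as a Mann-Whitney statistic; the rating values below are invented for illustration, not data from the study.

```python
import numpy as np

def az_auc(abnormal_scores, normal_scores):
    """Area under the ROC curve (Az): the probability that a randomly
    chosen abnormal case is rated higher than a randomly chosen normal
    case, with ties counted as one half (Mann-Whitney form)."""
    a = np.asarray(abnormal_scores, float)[:, None]
    n = np.asarray(normal_scores, float)[None, :]
    wins = (a > n).sum() + 0.5 * (a == n).sum()
    return wins / (a.size * n.size)

# Hypothetical 5-point confidence ratings from one reader.
abnormal = [3, 4, 5, 2]
normal = [1, 2, 3, 1]
az = az_auc(abnormal, normal)
```

An Az of 0.5 means chance-level discrimination and 1.0 means perfect separation, which is how values such as 0.8508 versus 0.7925 are compared between the processed and unprocessed readings.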

  8. Maintenance Process Strategic Analysis

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, M.; Stachowiak, A.

    2016-08-01

    The performance and competitiveness of manufacturing companies depend on the availability, reliability and productivity of their production facilities. Low productivity, downtime and poor machine performance are often linked to inadequate plant maintenance, which in turn can lead to reduced production levels, increasing costs, lost market opportunities and lower profits. These pressures have given firms worldwide the motivation to explore and embrace proactive maintenance strategies over the traditional reactive firefighting methods. The traditional view of maintenance has shifted into an overall view that encompasses Overall Equipment Efficiency, Stakeholders Management and Life Cycle Assessment. From a practical point of view this requires changes in the approach to maintenance represented by managers and changes in the actions performed within the maintenance area. Managers have to understand that maintenance is not only about repairs and conservation of machines and devices, but also about actions striving for more efficient resource management and care for the safety and health of employees. The purpose of this work is to present a strategic analysis based on SWOT analysis to identify the opportunities and strengths of the maintenance process, so as to benefit from them as much as possible, as well as to identify weaknesses and threats, so that they can be eliminated or minimized.

  9. Detection of Plant Diseases Using Image Processing Tools - An Overview

    OpenAIRE

    Asha R. Patil; Varsha I. Pati; B. S. Panchbhai

    2017-01-01

    Analysis of plant diseases is a key goal for increasing the productivity of grain, fruit and vegetable crops. Detection of plant diseases using image processing proceeds through several steps: image acquisition, image enhancement, segmentation, feature extraction, and classification. An RGB image is acquired and converted for processing, and the plant disease is diagnosed by a CR-Network. Segmentation with k-means clustering determines which areas, and how many, are affected by disease. Feature extraction...
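
    The segmentation step mentioned above rests on k-means clustering of pixel values. A minimal sketch of Lloyd's algorithm over RGB pixels is given below; the pixel values, the k=2 choice (healthy vs. diseased regions) and the naive initialisation are illustrative assumptions, not the paper's pipeline.

    ```python
    # Minimal k-means (Lloyd's algorithm) over RGB pixels, sketching the
    # clustering step used to separate diseased from healthy leaf regions.

    def kmeans(pixels, k, iters=20):
        centers = pixels[:k]  # naive initialisation: first k pixels
        for _ in range(iters):
            clusters = [[] for _ in range(k)]
            for p in pixels:
                # assign each pixel to the nearest center (squared distance)
                d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
                clusters[d.index(min(d))].append(p)
            # recompute each center as the mean of its cluster
            centers = [
                tuple(sum(ch) / len(cl) for ch in zip(*cl)) if cl else centers[i]
                for i, cl in enumerate(clusters)
            ]
        return centers, clusters

    # two greenish (healthy) and two brownish (diseased) pixels
    leaf_pixels = [(40, 160, 50), (42, 150, 55), (150, 120, 40), (160, 110, 35)]
    centers, clusters = kmeans(leaf_pixels, k=2)
    print(len(clusters[0]), len(clusters[1]))  # the two color groups separate
    ```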

  10. Successively detected events of the multiple muon catalysis process

    International Nuclear Information System (INIS)

    Zinov, V.G.; Somov, L.N.; Fil'chenkov, V.V.

    1984-01-01

    The kinetics of the multiple muon catalysis process is considered. Expressions are given for the yields and time distributions of successive events of mu-catalysis reactions detected in a D2+T2 mixture. These results can also be applied to the synthesis reactions p+d -> 3He+γ in an H2+D2 mixture and p+t -> 4He+γ in an H2+T2 mixture. It is shown that the mu-catalysis process parameters and the detection efficiency can be independently determined in the analysis of the experimental data

  11. Statistical methods for anomaly detection in the complex process

    International Nuclear Information System (INIS)

    Al Mouhamed, Mayez

    1977-09-01

    In a number of complex physical systems the accessible signals are often characterized by random fluctuations about a mean value. The fluctuations (signature) often transmit information about the state of the system that the mean value cannot predict. This study was undertaken to elaborate statistical methods of anomaly detection based on signature analysis of the noise inherent in the process. The algorithm presented first learns the characteristics of normal operation of a complex process. It then detects small deviations from the normal behavior. The algorithm can be implemented in a medium-sized computer for on-line application. (author) [fr
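
    The two-phase scheme described above (learn normal operation, then flag small deviations) can be sketched with a simple baseline-plus-z-score test. The signature feature, the training samples and the 3-sigma threshold are illustrative assumptions, not the paper's actual statistics.

    ```python
    # Learning phase: fit a baseline (mean, std) of a noise-signature feature
    # from normal operation. Detection phase: flag samples whose z-score
    # exceeds a limit.
    import statistics

    def learn_baseline(normal_samples):
        return statistics.mean(normal_samples), statistics.stdev(normal_samples)

    def detect(sample, baseline, z_limit=3.0):
        mu, sigma = baseline
        return abs(sample - mu) / sigma > z_limit

    normal = [1.0, 1.1, 0.9, 1.05, 0.95, 1.0, 1.02, 0.98]
    base = learn_baseline(normal)
    print(detect(1.02, base), detect(1.6, base))  # in-band vs. deviant sample
    ```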

  12. Detection of microparticles in dynamic processes

    International Nuclear Information System (INIS)

    Ten, K A; Pruuel, E R; Kashkarov, A O; Rubtsov, I A; Shechtman, L I; Zhulanov, V V; Tolochko, B P; Rykovanov, G N; Muzyrya, A K; Smirnov, E B; Stolbikov, M Yu; Prosvirnin, K M

    2016-01-01

    When a metal plate is subjected to a strong shock impact, its free surface emits a flow of particles of different sizes (shock-wave “dusting”). Traditionally, the dusting process is investigated by pulsed x-ray methods, piezoelectric sensors or optical techniques. The particle size ranges from a few microns to hundreds of microns. The flow is assumed to also include finer particles, which cannot yet be detected with the existing methods. At the accelerator complex VEPP-3/VEPP-4 at the BINP there are two experimental stations for research on fast processes, including explosive ones. The stations enable measurement of both transmitted radiation (absorption) and small-angle x-ray scattering of synchrotron radiation (SR). Radiation is detected with a precision high-speed detector, DIMEX. The detector has an internal memory of 32 frames, which enables recording of the dynamics of the process (shooting of movies) with intervals of 250 ns to 2 μs. Flows of nano- and microparticles from free surfaces of various materials (copper and tin) have been examined. Microparticle flows were emitted from grooves 50-200 μm in size and from joints (gaps) between metal parts. With the soft x-ray spectrum of SR one can explore the dynamics of a single microjet of micron size. The dynamics of the density distribution along microjets were determined. Under a shock wave (∼60 GPa) acting on tin disks, flows of microparticles from a smooth surface were recorded. (paper)

  13. Amalgamation of Anomaly-Detection Indices for Enhanced Process Monitoring

    KAUST Repository

    Harrou, Fouzi

    2016-01-29

    Accurate and effective anomaly detection and diagnosis of modern industrial systems are crucial for ensuring reliability and safety and for maintaining desired product quality. Anomaly detection based on principal component analysis (PCA) has been studied intensively and largely applied to multivariate processes with highly cross-correlated process variables; however, conventional PCA-based methods often fail to detect small or moderate anomalies. In this paper, the proposed approach integrates two popular process-monitoring detection tools: the conventional PCA-based monitoring indices, Hotelling’s T2 and Q, and the exponentially weighted moving average (EWMA). We develop two EWMA tools based on the Q and T2 statistics, T2-EWMA and Q-EWMA, to detect anomalies in the process mean. The performance of the proposed methods was compared with that of conventional PCA-based anomaly-detection methods by applying each method to two examples: a synthetic data set and experimental data collected from a flow heating system. The results clearly show the benefits and effectiveness of the proposed methods over conventional PCA-based methods.
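
    The EWMA side of the amalgamation is the standard recursion z_t = λ·x_t + (1-λ)·z_{t-1}, applied in the paper to the PCA Q or Hotelling T2 statistic of each new sample. The sketch below uses a synthetic statistic and a made-up control limit, purely to show why the smoothing catches small, sustained shifts.

    ```python
    # EWMA recursion over a monitoring statistic. The sustained small shift
    # after t=5 stays below a raw per-sample threshold but accumulates in
    # the EWMA, which eventually crosses the (illustrative) limit of 0.4.

    def ewma(xs, lam=0.2, z0=0.0):
        zs, z = [], z0
        for x in xs:
            z = lam * x + (1 - lam) * z
            zs.append(z)
        return zs

    stat = [0.1, -0.2, 0.0, 0.15, -0.1, 0.6, 0.7, 0.65, 0.7, 0.75]
    smoothed = ewma(stat)
    alarms = [t for t, z in enumerate(smoothed) if abs(z) > 0.4]
    print(alarms)
    ```

    A smaller λ gives more smoothing (better for small shifts, slower to react); λ = 1 reduces the chart to the raw statistic.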

  14. Early skin tumor detection from microscopic images through image processing

    International Nuclear Information System (INIS)

    Siddiqi, A.A.; Narejo, G.B.; Khan, A.M.

    2017-01-01

    The research is done to provide an appropriate detection technique for skin tumor detection. The work is done using the image processing toolbox of MATLAB. Skin tumors are unwanted skin growths with different causes and varying extents of malignant cells. A tumor is a syndrome in which skin cells lose the ability to divide and grow normally. Early detection of a tumor is the most important factor affecting the survival of a patient. Studying the pattern of the skin cells is a fundamental problem in medical image analysis. The study of skin tumors has been of great interest to researchers. DIP (Digital Image Processing) allows the use of much more complex algorithms for image processing, and hence can offer both more sophisticated performance at simple tasks and the implementation of methods which would be impossible by analog means. It allows a much wider range of algorithms to be applied to the input data and can avoid problems such as build-up of noise and signal distortion during processing. The study shows that little work has been done on the cellular scale for images of skin. This research proposes a few checks for the early detection of skin tumors using microscopic images, after testing and observing various algorithms. After analytical evaluation, the proposed checks were observed to be time-efficient techniques appropriate for tumor detection. The algorithm applied provides promising results in less time with accuracy. The GUI (Graphical User Interface) that is generated for the algorithm makes the system user friendly. (author)

  15. Early Skin Tumor Detection from Microscopic Images through Image Processing

    Directory of Open Access Journals (Sweden)

    AYESHA AMIR SIDDIQI

    2017-10-01

    Full Text Available The research is done to provide an appropriate detection technique for skin tumor detection. The work is done using the image processing toolbox of MATLAB. Skin tumors are unwanted skin growths with different causes and varying extents of malignant cells. A tumor is a syndrome in which skin cells lose the ability to divide and grow normally. Early detection of a tumor is the most important factor affecting the survival of a patient. Studying the pattern of the skin cells is a fundamental problem in medical image analysis. The study of skin tumors has been of great interest to researchers. DIP (Digital Image Processing) allows the use of much more complex algorithms for image processing, and hence can offer both more sophisticated performance at simple tasks and the implementation of methods which would be impossible by analog means. It allows a much wider range of algorithms to be applied to the input data and can avoid problems such as build-up of noise and signal distortion during processing. The study shows that little work has been done on the cellular scale for images of skin. This research proposes a few checks for the early detection of skin tumors using microscopic images, after testing and observing various algorithms. After analytical evaluation, the proposed checks were observed to be time-efficient techniques appropriate for tumor detection. The algorithm applied provides promising results in less time with accuracy. The GUI (Graphical User Interface) that is generated for the algorithm makes the system user friendly.

  16. Automatic detection of NIL defects using microscopy and image processing

    KAUST Repository

    Pietroy, David

    2013-12-01

    Nanoimprint Lithography (NIL) is a promising technology for low-cost and large-scale nanostructure fabrication. This technique is based on a contact molding-demolding process that can produce a number of defects such as incomplete filling, negative patterns and sticking. In this paper, microscopic imaging combined with a specific processing algorithm is used to numerically detect defects in printed patterns. Results obtained for 1D and 2D imprinted gratings with different microscopic image magnifications are presented. The results are independent of the device which captures the image (optical, confocal or electron microscope). The use of numerical images makes it possible to automate the detection and to compute a statistical analysis of defects. This method provides a fast analysis of printed gratings and could be used to monitor the production of such structures. © 2013 Elsevier B.V. All rights reserved.

  17. State-based Event Detection Optimization for Complex Event Processing

    Directory of Open Access Journals (Sweden)

    Shanglian PENG

    2014-02-01

    Full Text Available Detection of patterns in high-speed, large-volume event streams has been an important paradigm in many application areas of Complex Event Processing (CEP), including security monitoring, financial markets analysis and health-care monitoring. To assure real-time responsive complex pattern detection over high-volume and high-speed event streams, efficient event detection techniques have to be designed. Unfortunately, evaluation of the Nondeterministic Finite Automaton (NFA) based event detection model mainly considers a single event query and its optimization. In this paper, we propose multiple event query evaluation on event streams. In particular, we consider a scalable multiple event detection model that shares NFA transfer states of different event queries. Each event query is parsed into an NFA, and the states of the NFA are partitioned into different units. With this partition, the same individual NFA state is run on different processing nodes, providing state sharing and reducing partial-match maintenance. We compare our state-based approach with Stream-based And Shared Event processing (SASE). Our experiments demonstrate that the state-based approach outperforms SASE in both CPU time usage and memory consumption.
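
    To make the NFA evaluation model concrete, here is a toy matcher for a single sequence query SEQ(A, B, C) with skip-till-any-match semantics. Each partial match is an NFA "run" waiting in a state; the state-based optimisation described above would distribute these runs across processing nodes. The pattern, event encoding and semantics here are illustrative, not the paper's system.

    ```python
    # Toy NFA-style evaluation of SEQ(A, B, C) over an event stream.
    # A run is (index of the next pattern symbol, event indices collected).

    PATTERN = ["A", "B", "C"]

    def match_seq(stream):
        runs, matches = [], []
        for i, ev in enumerate(stream):
            new_runs = []
            for state, seen in runs:
                if ev == PATTERN[state]:
                    if state + 1 == len(PATTERN):
                        matches.append(seen + [i])    # run reached accepting state
                    else:
                        new_runs.append((state + 1, seen + [i]))
                new_runs.append((state, seen))        # nondeterminism: also skip ev
            if ev == PATTERN[0]:
                new_runs.append((1, [i]))             # start a fresh run
            runs = new_runs
        return matches

    print(match_seq(["A", "x", "B", "A", "C"]))  # one match at indices 0, 2, 4
    ```

    Keeping every run alive on each event is what makes partial-match maintenance expensive, and why sharing NFA states across queries pays off.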

  18. Detection of Glaucoma Using Image Processing Techniques: A Critique.

    Science.gov (United States)

    Kumar, B Naveen; Chauhan, R P; Dahiya, Nidhi

    2018-01-01

    The primary objective of this article is to present a summary of different types of image processing methods employed for the detection of glaucoma, a serious eye disease. Glaucoma affects the optic nerve in which retinal ganglion cells become dead, and this leads to loss of vision. The principal cause is the increase in intraocular pressure, which occurs in open-angle and angle-closure glaucoma, the two major types affecting the optic nerve. In the early stages of glaucoma, no perceptible symptoms appear. As the disease progresses, vision starts to become hazy, leading to blindness. Therefore, early detection of glaucoma is needed for prevention. Manual analysis of ophthalmic images is fairly time-consuming and accuracy depends on the expertise of the professionals. Automatic analysis of retinal images is an important tool. Automation aids in the detection, diagnosis, and prevention of risks associated with the disease. Fundus images obtained from a fundus camera have been used for the analysis. Requisite pre-processing techniques have been applied to the image and, depending upon the technique, various classifiers have been used to detect glaucoma. The techniques mentioned in the present review have certain advantages and disadvantages. Based on this study, one can determine which technique provides an optimum result.

  19. Review of Leaf Unhealthy Region Detection Using Image Processing Techniques

    OpenAIRE

    A. Dhole, S; Shaikh, Rukaiyya Pyarelal

    2016-01-01

    - In the agricultural field, plants come under attack from various pests and from bacterial and micro-organism diseases. These diseases attack the leaves, stems and fruit of the plant. This review paper discusses the image processing techniques used to perform early detection of plant diseases through leaf feature inspection. The basic objective of this work is to develop image analysis and classification techniques for feature extraction and finally to classify the diseases pre...

  20. DETECTION OF CRACKS IN ASPHALT PAVEMENT DURING ROAD INSPECTION PROCESSES

    Directory of Open Access Journals (Sweden)

    Marcin STANIEK

    2017-09-01

    Full Text Available Road inspection is one of key processes of a pavement management system, whose function is to examine and describe the road infrastructure condition. When thoroughly performed, it provides the information required to implement an adequate road infrastructure maintenance policy and plan ad hoc repairs or refurbishments. This article discusses a solution for automatic asphalt pavement cracking detection, based on image-processing technology. This solution makes it possible to identify different crack types, i.e., transverse, longitudinal, alligator-type and technological cracks. The detection process is based on the application of various methods, including statistical difference identification for pre-assumed image analysis directions, i.e., in and opposite to the test vehicle running direction. The purpose of the morphological and filtering operations applied was to reduce the image noise level. The solution proposed was verified using video material in the form of a sequence of images recorded using the test vehicle.

  1. Lameness detection in dairy cattle: single predictor v. multivariate analysis of image-based posture processing and behaviour and performance sensing.

    Science.gov (United States)

    Van Hertem, T; Bahr, C; Schlageter Tello, A; Viazzi, S; Steensels, M; Romanini, C E B; Lokhorst, C; Maltz, E; Halachmi, I; Berckmans, D

    2016-09-01

    The objective of this study was to evaluate whether a multi-sensor system (milk, activity, body posture) was a better classifier for lameness than the single-sensor-based detection models. Between September 2013 and August 2014, 3629 cow observations were collected on a commercial dairy farm in Belgium. Human locomotion scoring was used as the reference for model development and evaluation. Cow behaviour and performance were measured with existing sensors that were already present at the farm. A prototype three-dimensional video recording system was used to automatically quantify the back posture of a cow. For the single-predictor comparisons, a receiver operating characteristic curve was made. For the multivariate detection models, logistic regression and generalized linear mixed models (GLMM) were developed. The best lameness classification model was obtained by the multi-sensor analysis (area under the receiver operating characteristic curve (AUC) = 0.757±0.029), containing a combination of milk and milking variables, activity, and gait and posture variables from videos. Second, the multivariate video-based system (AUC = 0.732±0.011) performed better than the multivariate milk sensors (AUC = 0.604±0.026) and the multivariate behaviour sensors (AUC = 0.633±0.018). The video-based system performed better than the combined behaviour and performance-based detection model (AUC = 0.669±0.028), indicating that it is worthwhile to consider a video-based lameness detection system, regardless of the presence of other existing sensors on the farm. The results suggest that Θ2, the feature variable for the back curvature around the hip joints, with an AUC of 0.719, is the best single predictor variable for lameness detection based on locomotion scoring. In general, this study showed that the video-based back posture monitoring system outperforms the behaviour and performance sensing techniques for locomotion scoring-based lameness detection. A GLMM with seven specific

  2. Flow injection analysis of ethyl xanthate by gas diffusion and UV detection as CS2 for process monitoring of sulfide ore flotation.

    Science.gov (United States)

    Cordeiro, Thiago G; Hidalgo, Pilar; Gutz, Ivano G R; Pedrotti, Jairo J

    2010-07-15

    A sensitive and robust analytical method for the spectrophotometric determination of ethyl xanthate, CH(3)CH(2)OCS(2)(-), at trace concentrations in pulp solutions from the froth flotation process is proposed. The analytical method is based on the decomposition of ethyl xanthate, EtX(-), with 2.0 mol L(-1) HCl, generating ethanol and carbon disulfide, CS(2). A gas diffusion cell assures that only the volatile compounds diffuse through a PTFE membrane towards an acceptor stream of deionized water, thus avoiding interference from non-volatile compounds and suspended particles. The CS(2) is selectively detected by UV absorbance at 206 nm (epsilon=65,000 L mol(-1) cm(-1)). The measured absorbance is directly proportional to the EtX(-) concentration present in the sample solutions. Beer's law is obeyed in the 1x10(-6) to 2x10(-4) mol L(-1) concentration range of ethyl xanthate in the pulp, with an excellent correlation coefficient (r=0.999) and a detection limit of 3.1x10(-7) mol L(-1), corresponding to 38 microg L(-1). At flow rates of 200 microL min(-1) for the donor stream and 100 microL min(-1) for the acceptor channel, a sampling rate of 15 injections per hour could be achieved. Two applications demonstrate the versatility of the FIA method: (i) evaluation of the free EtX(-) concentration during a laboratory study of the EtX(-) adsorption capacity on pulverized sulfide ore (pyrite), and (ii) monitoring of EtX(-) at different stages (from starting load to washing effluents) of a flotation pilot plant processing a Cu-Zn sulfide ore. Copyright 2010 Elsevier B.V. All rights reserved.
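
    The reported figures can be cross-checked with the Beer-Lambert law, A = ε·l·c. The sketch below assumes a 1 cm path length for the illustrative conversion function (the abstract does not state the cell path) and computes the molar mass of the ethyl xanthate ion to convert the molar detection limit into the stated mass concentration.

    ```python
    # Beer-Lambert cross-check: epsilon = 65,000 L mol^-1 cm^-1 for CS2 at
    # 206 nm; converting the detection limit 3.1e-7 mol/L to micrograms per
    # litre via the molar mass of CH3CH2OCS2- (C3H5OS2-).

    EPSILON = 65000.0  # L mol^-1 cm^-1
    MOLAR_MASS_ETX = 3 * 12.011 + 5 * 1.008 + 15.999 + 2 * 32.06  # ~121.2 g/mol

    def concentration_from_absorbance(A, path_cm=1.0):
        """c = A / (epsilon * l), in mol/L (path length assumed, illustrative)."""
        return A / (EPSILON * path_cm)

    detection_limit_mol = 3.1e-7
    print(round(detection_limit_mol * MOLAR_MASS_ETX * 1e6))  # -> 38 (microg/L)
    ```

    The result reproduces the abstract's 38 µg L⁻¹ figure, confirming the molar-to-mass conversion is internally consistent.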

  3. Automatic flow analysis method to determine traces of Mn²⁺ in sea and drinking waters by a kinetic catalytic process using LWCC-spectrophotometric detection.

    Science.gov (United States)

    Chaparro, Laura; Ferrer, Laura; Leal, Luz O; Cerdà, Víctor

    2016-02-01

    A new automatic kinetic catalytic method has been developed for the measurement of Mn(2+) in drinking and seawater samples. The method is based on the catalytic effect of Mn(2+) on the oxidation of tiron by hydrogen peroxide in the presence of Pb(2+) as an activator. The optimum conditions were obtained at pH 10 with 0.019 mol L(-1) 2,2'-bipyridyl, 0.005 mol L(-1) tiron and 0.38 mol L(-1) hydrogen peroxide. The flow system is based on multisyringe flow injection analysis (MSFIA) coupled with a lab-on-valve (LOV) device, exploiting on-line spectrophotometric detection with a Liquid Waveguide Capillary Cell (LWCC) of 1 m optical path length, performed at 445 nm. Under the conditions optimized by a multivariate approach, the method allowed the measurement of Mn(2+) in a range of 0.03-35 µg L(-1) with a detection limit of 0.010 µg L(-1), attaining a repeatability of 1.4% RSD. The method was satisfactorily applied to the determination of Mn(2+) in environmental water samples. Its reliability was also verified by determining the manganese content of the certified standard reference seawater sample CASS-4. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Automated Windowing Processing for Pupil Detection

    National Research Council Canada - National Science Library

    Ebisawa, Y

    2001-01-01

    .... The pupil center in the video image is a focal point used to determine the eye gaze. Recently, to overcome the disadvantages of traditional pupil detection methods, a pupil detection technique using two light sources (LEDs...

  5. Fake currency detection using image processing

    Science.gov (United States)

    Agasti, Tushar; Burand, Gajanan; Wade, Pratik; Chitra, P.

    2017-11-01

    The advancement of color printing technology has increased the rate at which fake currency notes are printed and duplicated on a very large scale. A few years back, such printing could only be done in a print house, but now anyone can print a currency note with high accuracy using a simple laser printer. As a result, the circulation of fake notes in place of genuine ones has increased greatly. India has unfortunately been cursed with problems like corruption and black money, and counterfeiting of currency notes is a big problem as well. This motivates the design of a system that detects fake currency notes quickly and efficiently. The proposed system gives an approach to verify Indian currency notes. Verification of a currency note is done using the concepts of image processing. This article describes the extraction of various features of Indian currency notes. MATLAB software is used to extract the features of the note. The proposed system has the advantages of simplicity and high processing speed. The result predicts whether the currency note is fake or not.

  6. Computerization of the safeguards analysis decision process

    International Nuclear Information System (INIS)

    Ehinger, M.H.

    1990-01-01

    This paper reports that safeguards regulations are evolving to meet new demands for timeliness and sensitivity in detecting the loss or unauthorized use of sensitive nuclear materials. The opportunities to meet new rules, particularly in bulk processing plants, involve developing techniques which use modern, computerized process control and information systems. Using these computerized systems in the safeguards analysis involves all the challenges of the man-machine interface experienced in the typical process control application and adds new dimensions to accuracy requirements, data analysis, and alarm resolution in the regulatory environment

  7. Entanglement of identical particles and the detection process

    DEFF Research Database (Denmark)

    Tichy, Malte C.; de Melo, Fernando; Kus, Marek

    2013-01-01

    We introduce detector-level entanglement, a unified entanglement concept for identical particles that takes into account the possible deletion of many-particle which-way information through the detection process. The concept implies a measure for the effective indistinguishability of the particles, which is controlled by the measurement setup and which quantifies the extent to which the (anti-)symmetrization of the wavefunction impacts on physical observables. Initially indistinguishable particles can gain or lose entanglement on their transition to distinguishability, and their quantum statistical behavior depends on their initial entanglement. Our results show that entanglement cannot be attributed to a state of identical particles alone, but that the detection process has to be incorporated in the analysis.

  8. Calculation of the detection limits for radionuclides identified in gamma-ray spectra based on post-processing peak analysis results.

    Science.gov (United States)

    Korun, M; Vodenik, B; Zorko, B

    2018-03-01

    A new method for calculating the detection limits of gamma-ray spectrometry measurements is presented. The method is applicable to gamma-ray emitters irrespective of the influence of a peaked background, the origin of the background, and overlap with other peaks. For multi-gamma-ray emitters, it offers the opportunity to calculate a common detection limit corresponding to several peaks. The detection limit is calculated by approximating the dependence of the uncertainty of the indication on its value with a second-order polynomial. In this approach the relation between the input quantities and the detection limit is described by an explicit expression and can be easily investigated. The detection limit is calculated from the data usually provided in the reports of peak-analysis programs: the peak areas and their uncertainties. As a result, the need to use individual channel contents for calculating the detection limit is bypassed. Copyright © 2017 Elsevier Ltd. All rights reserved.
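
    The construction can be sketched in the spirit of ISO 11929-style detection limits: approximate the uncertainty of the indication by a second-order polynomial, u²(x) = c0 + c1·x + c2·x², set the decision threshold x* = k·u(0), and solve x# = x* + k·u(x#) for the detection limit. The coefficients, the k = 1.645 coverage factor and the fixed-point solver below are illustrative assumptions, not the paper's explicit expression.

    ```python
    # Detection limit from a polynomial uncertainty model, solved by
    # fixed-point iteration on x# = x* + k*u(x#), with x* = k*u(0).
    import math

    def detection_limit(c0, c1, c2, k=1.645, iters=50):
        u = lambda x: math.sqrt(c0 + c1 * x + c2 * x * x)
        x_star = k * u(0.0)         # decision threshold
        x = x_star                  # iterate towards the detection limit x#
        for _ in range(iters):
            x = x_star + k * u(x)
        return x

    ld = detection_limit(c0=100.0, c1=1.0, c2=0.0004)
    print(ld)
    ```

    The iteration converges quickly because k·u'(x) is small for realistic coefficients; an explicit quadratic-formula solution exists for this polynomial form as well.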

  9. Automatic Gap Detection in Friction Stir Welding Processes (Preprint)

    National Research Council Canada - National Science Library

    Yang, Yu; Kalya, Prabhanjana; Landers, Robert G; Krishnamurthy, K

    2006-01-01

    .... This paper develops a monitoring algorithm to detect gaps in Friction Stir Welding (FSW) processes. Experimental studies are conducted to determine how the process parameters and the gap width affect the welding process...

  10. Semantic multimedia analysis and processing

    CERN Document Server

    Spyrou, Evaggelos; Mylonas, Phivos

    2014-01-01

    Broad in scope, Semantic Multimedia Analysis and Processing provides a complete reference of techniques, algorithms, and solutions for the design and the implementation of contemporary multimedia systems. Offering a balanced, global look at the latest advances in semantic indexing, retrieval, analysis, and processing of multimedia, the book features the contributions of renowned researchers from around the world. Its contents are based on four fundamental thematic pillars: 1) information and content retrieval, 2) semantic knowledge exploitation paradigms, 3) multimedia personalization, and 4)

  11. An automated process for deceit detection

    Science.gov (United States)

    Nwogu, Ifeoma; Frank, Mark; Govindaraju, Venu

    2010-04-01

    In this paper we present a prototype for an automated deception detection system. Similar to polygraph examinations, we attempt to take advantage of the theory that false answers will produce distinctive measurements in certain physiological manifestations. We investigate the role of dynamic eye-based features such as eye closure/blinking and lateral movements of the iris in detecting deceit. The features are recorded both when the test subjects are having non-threatening conversations as well as when they are being interrogated about a crime they might have committed. The rates of the behavioral changes are blindly clustered into two groups. Examining the clusters and their characteristics, we observe that the dynamic features selected for deception detection show promising results with an overall deceptive/non-deceptive prediction rate of 71.43% from a study consisting of 28 subjects.

  12. Image processing and analysis software development

    International Nuclear Information System (INIS)

    Shahnaz, R.

    1999-01-01

    The work presented in this project is aimed at developing a software package, 'IMAGE GALLERY', to investigate various image processing and analysis techniques. The work was divided into two parts, namely image processing techniques and pattern recognition, the latter further comprising character and face recognition. Various image enhancement techniques, including negative imaging, contrast stretching, compression of dynamic range, neon, diffuse, emboss, etc., have been studied. Segmentation techniques including point detection, line detection and edge detection have been studied. Also, some of the smoothing and sharpening filters have been investigated. All these imaging techniques have been implemented in a window-based computer program written in Visual Basic. Neural network techniques based on the Perceptron model have been applied for face and character recognition. (author)
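
    The point-detection step among the segmentation techniques listed above is typically a 3x3 mask convolution: a Laplacian-style kernel yields a large response at isolated bright points. A pure-Python sketch on a tiny synthetic image (not the project's Visual Basic code) follows.

    ```python
    # 3x3 point-detection mask: convolve the kernel over the image interior;
    # large responses mark isolated points against a uniform background.

    KERNEL = [[-1, -1, -1],
              [-1,  8, -1],
              [-1, -1, -1]]

    def convolve3x3(img, kernel):
        h, w = len(img), len(img[0])
        out = [[0] * w for _ in range(h)]
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                out[y][x] = sum(kernel[j][i] * img[y + j - 1][x + i - 1]
                                for j in range(3) for i in range(3))
        return out

    img = [[10, 10, 10, 10],
           [10, 10, 10, 10],
           [10, 10, 90, 10],
           [10, 10, 10, 10]]
    resp = convolve3x3(img, KERNEL)
    print(resp[2][2])  # -> 640, a strong response at the isolated bright pixel
    ```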

  13. Traffic sign detection and analysis

    DEFF Research Database (Denmark)

    Møgelmose, Andreas; Trivedi, Mohan M.; Moeslund, Thomas B.

    2012-01-01

    Traffic sign recognition (TSR) is a research field that has seen much activity in the recent decade. This paper introduces the problem and presents 4 recent papers on traffic sign detection and 4 recent papers on traffic sign classification. It attempts to extract recent trends in the field and t...

  14. Crack Length Detection by Digital Image Processing

    DEFF Research Database (Denmark)

    Lyngbye, Janus; Brincker, Rune

    1990-01-01

    It is described how digital image processing is used for measuring the length of fatigue cracks. The system is installed in a Personal Computer equipped with image processing hardware and performs automated measuring on plane metal specimens used in fatigue testing. Normally one cannot achieve a resolution better than that of the image processing equipment. To overcome this problem an extrapolation technique is used, resulting in a better resolution. The system was tested on a specimen loaded with different loads. The error σa was less than 0.031 mm, which is of the same size as human measuring...
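
    The sub-pixel idea behind such an extrapolation can be sketched as follows: along the 1-D intensity profile of the crack line, the crack tip is located where the profile crosses a threshold, and linear interpolation between the two straddling pixels gives a position finer than the pixel grid. The profile values and threshold are illustrative; this is not the paper's specific technique, only an interpolation sketch in the same spirit.

    ```python
    # Sub-pixel localisation of a threshold crossing in a 1-D intensity
    # profile: dark crack pixels (low values) end where the intact, bright
    # material begins; linear interpolation refines the crossing position.

    def subpixel_tip(profile, threshold):
        """Return the interpolated index where the profile first rises above threshold."""
        for i in range(1, len(profile)):
            a, b = profile[i - 1], profile[i]
            if a <= threshold < b:
                return (i - 1) + (threshold - a) / (b - a)
        return None

    profile = [20, 22, 25, 30, 90, 200, 210]
    print(subpixel_tip(profile, threshold=128))  # lands between pixels 4 and 5
    ```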

  15. Crack Detection by Digital Image Processing

    DEFF Research Database (Denmark)

    Lyngbye, Janus; Brincker, Rune

    It is described how digital image processing is used for measuring the length of fatigue cracks. The system is installed in a Personal Computer equipped with image processing hardware and performs automated measuring on plane metal specimens used in fatigue testing. Normally one cannot achieve a resolution better than that of the image processing equipment. To overcome this problem an extrapolation technique is used, resulting in a better resolution. The system was tested on a specimen loaded with different loads. The error σa was less than 0.031 mm, which is of the same size as human measuring...

  16. Processing Ocean Images to Detect Large Drift Nets

    Science.gov (United States)

    Veenstra, Tim

    2009-01-01

    A computer program processes the digitized outputs of a set of downward-looking video cameras aboard an aircraft flying over the ocean. The purpose served by this software is to facilitate the detection of large drift nets that have been lost, abandoned, or jettisoned. The development of this software and of the associated imaging hardware is part of a larger effort to develop means of detecting and removing large drift nets before they cause further environmental damage to the ocean and to the shores on which they sometimes impinge. The software is capable of near-real-time processing of as many as three video feeds at a rate of 30 frames per second. After a user sets the parameters of an adjustable algorithm, the software analyzes each video stream, detects any anomaly, issues a command to point a high-resolution camera toward the location of the anomaly, and, once the camera has been so aimed, issues a command to trigger the camera shutter. The resulting high-resolution image is digitized, and the resulting data are automatically uploaded to the operator's computer for analysis.

  17. Employing image processing techniques for cancer detection using microarray images.

    Science.gov (United States)

    Dehghan Khalilabad, Nastaran; Hassanpour, Hamid

    2017-02-01

Microarray technology is a powerful genomic tool for simultaneously studying and analyzing the behavior of thousands of genes. The analysis of images obtained from this technology plays a critical role in the detection and treatment of diseases. The aim of the current study is to develop an automated system for analyzing data from microarray images in order to detect cancerous cases. The proposed system consists of three main phases, namely image processing, data mining, and detection of the disease. The image processing phase performs operations such as refining image rotation, gridding (locating genes), and extracting raw data from images; the data mining phase includes normalizing the extracted data and selecting the more effective genes. Finally, via the extracted data, cancerous cells are recognized. To evaluate the performance of the proposed system, a microarray database is employed which includes Breast cancer, Myeloid Leukemia and Lymphoma cases from the Stanford Microarray Database. The results indicate that the proposed system is able to identify the type of cancer from the data set with an accuracy of 95.45%, 94.11%, and 100%, respectively. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Fourier analysis and stochastic processes

    CERN Document Server

    Brémaud, Pierre

    2014-01-01

This work is unique as it provides a uniform treatment of the Fourier theories of functions (Fourier transforms and series, z-transforms), finite measures (characteristic functions, convergence in distribution), and stochastic processes (including ARMA series and point processes). It emphasises the links between these three themes. The chapter on the Fourier theory of point processes and signals structured by point processes is a novel addition to the literature on Fourier analysis of stochastic processes. It also connects the theory with recent lines of research such as biological spike signals and ultrawide-band communications. Although the treatment is mathematically rigorous, the convivial style makes the book accessible to a large audience. In particular, it will be interesting to anyone working in electrical engineering and communications, biology (point process signals) and econometrics (ARMA models). A careful review of the prerequisites (integration and probability theory in the appendix, Hilbert spa...

  19. Matrix Characterization in Threat Material Detection Processes

    International Nuclear Information System (INIS)

    Obhodas, J.; Sudac, D.; Valkovic, V.

    2009-01-01

Matrix characterization in threat material detection is of utmost importance: it generates the background against which the threat material signal has to be identified. Threat materials (explosives, chemical warfare agents, ...) are usually contained within small volumes inside large volumes of variable matrices. We have studied the influence of matrix materials on the capability of neutron systems to identify hidden threat material. Three specific scenarios are considered in some detail: case 1, contraband material in sea containers; case 2, explosives in soil (landmines); case 3, explosives and chemical warfare agents on the sea bottom. Effects of container cargo material on a tagged neutron system are seen in the increase of gamma background and the decrease of neutron beam intensity. Detection of landmines is more complex because of variable soil properties. We have studied in detail space and time variations of soil elemental compositions, and in particular hydrogen content (humidity). Of special interest are ammunitions and chemical warfare agents on the sea bottom, dumping sites and leftovers from previous conflicts (WW-I, WW-II and local). In this case the sea sediment is the background source and its role is similar to the role of the soil in landmine detection. In addition to geochemical cycling of chemical elements in a semi-enclosed sea, like the Adriatic Sea, one has to consider also anthropogenic influence, especially when studying small-scale variations in concentration levels. Some preliminary experimental results obtained with a tagged neutron sensor inside an underwater vehicle are presented, as well as data on sediment characterization by X-Ray Fluorescence.

  20. Detecting jaundice by using digital image processing

    Science.gov (United States)

    Castro-Ramos, J.; Toxqui-Quitl, C.; Villa Manriquez, F.; Orozco-Guillen, E.; Padilla-Vivanco, A.; Sánchez-Escobar, JJ.

    2014-03-01

When strong jaundice is present, babies or adults must undergo clinical exams such as the "serum bilirubin" test, which can cause trauma in patients. Jaundice often occurs in liver diseases such as hepatitis or liver cancer. In order to avoid additional trauma, we propose to detect jaundice (icterus) in newborns or adults by using a painless method. By acquiring digital color images of the palms, soles and forehead, we analyze RGB attributes and diffuse reflectance spectra as the parameters to characterize patients with or without jaundice, and we correlate those parameters with the bilirubin level. By applying a support vector machine we distinguish between healthy and sick patients.

  1. Real-Time Plasma Process Condition Sensing and Abnormal Process Detection

    Directory of Open Access Journals (Sweden)

    Ryan Yang

    2010-06-01

Full Text Available The plasma process is often used in the fabrication of semiconductor wafers. However, the lack of real-time etching control may result in unacceptable process performance and thus lead to significant waste and lower wafer yield. In order to maximize the product wafer yield, timely and accurate detection of process faults or abnormalities in a plasma reactor is needed. Optical emission spectroscopy (OES) is one of the most frequently used metrologies for in-situ process monitoring. Even though OES has the advantage of non-invasiveness, it provides a huge amount of information. As a result, the data analysis of OES becomes a big challenge. To accomplish real-time detection, this work employed the sigma matching technique on the time series of the OES full-spectrum intensity. First, a response model of a healthy plasma spectrum was developed. Then, we defined a matching rate as an indicator for comparing the difference between a tested wafer's response and the healthy sigma model. The experimental results showed that the proposed method can detect process faults in real time, even in plasma etching tools.
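The matching-rate idea can be illustrated with a toy sketch. The abstract does not give the exact sigma matching formula, so the definition below, the fraction of spectral channels whose intensity falls within mean ± k·sigma of healthy reference runs, is an assumption:

```python
# Illustrative sketch of a "sigma model" built from healthy OES scans and a
# matching rate for a new wafer's full-spectrum scan (formula assumed).
from statistics import mean, stdev

def sigma_model(healthy_runs):
    """Per-channel (mean, sigma) from a list of healthy full-spectrum scans."""
    return [(mean(ch), stdev(ch)) for ch in zip(*healthy_runs)]

def matching_rate(model, test_scan, k=3.0):
    """Fraction of channels within mean +/- k*sigma of the healthy model."""
    ok = sum(1 for (m, s), v in zip(model, test_scan) if abs(v - m) <= k * s)
    return ok / len(model)

healthy = [[100, 200, 300], [102, 198, 301], [98, 202, 299]]
model = sigma_model(healthy)
print(matching_rate(model, [101, 199, 300]))           # healthy wafer -> 1.0
print(round(matching_rate(model, [101, 199, 500]), 2)) # one bad channel -> 0.67
```

A low matching rate would then be flagged as a process fault; the threshold would be tuned per tool.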

  2. Processing of Graphene combining Optical Detection and Scanning Probe Lithography

    Directory of Open Access Journals (Sweden)

    Zimmermann Sören

    2015-01-01

Full Text Available This paper presents an experimental setup tailored for robotic processing of graphene with in-situ vision-based control. A robust graphene detection approach is presented, applying multiple image processing operations to the visual feedback provided by a high-resolution light microscope. Detected graphene flakes can be modified using a scanning-probe-based lithographical process that is directly linked to the in-situ optical images. The results of this process are discussed with respect to further application scenarios.

  3. Recent developments in analytical detection methods for radiation processed foods

    International Nuclear Information System (INIS)

    Wu Jilan

    1993-01-01

A short summary is given of the 'ADMIT' programme (FAO/IAEA) and of the developments in analytical detection methods for radiation processed foods. It is suggested that, to promote the commercialization of radiation processed foods and to control their quality, one must pay more attention to the study of analytical detection methods for irradiated food.

  4. REVIEW ON LUNG CANCER DETECTION USING IMAGE PROCESSING TECHNIQUE

    OpenAIRE

    Anam Quadri; Rashida Shujaee; Nishat Khan

    2016-01-01

This paper presents a review of lung cancer detection methods using image processing. In recent years, image processing mechanisms have been used widely in several medical areas for improving earlier detection and treatment stages. The different procedures and design methodologies for the same have also been discussed.

  5. Numerical analysis of Eucalyptus grandis × E. urophylla heat-treatment: A dynamically detecting method of mass loss during the process

    Directory of Open Access Journals (Sweden)

    Zijian Zhao

Full Text Available Eucalyptus particles, lamellas and boards were used to explore a simply implemented method, neglecting heat and mass transfer, for inspecting the mass loss during the heat-treatment course. The results revealed that the mass loss over a certain period is theoretically the definite integral of the loss rate over time in this period, and a monitoring model for the mass loss rate was developed with the particles and validated with the lamellas and boards. In the model, the loss rate was correlated to the temperature and the temperature-evolution speed, and was composed of three functions covering different temperature-evolution periods. The sample mass loss was calculated in MATLAB for the lamellas and boards, and the model was validated and adjusted based on the difference between the computed results and the practically measured loss values. The error ranges of the new models were −16.30% to 18.35% for wood lamellas and −9.86% to 6.80% for wood boards. This method makes it possible to acquire the instantaneous loss value by continuously monitoring the wood temperature evolution. This approach could provide a reference for Eucalyptus heat treatment, to track the treating course and control the final material characteristics. Keywords: Heat treatment, Successively monitoring, Mass loss, Eucalyptus, Regression
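The core relation stated above, that the mass loss over a period is the definite integral of the loss rate over time, can be sketched numerically. The rate values below are illustrative, not the paper's fitted three-function model:

```python
# Trapezoidal integration of an instantaneous mass-loss rate over time,
# mirroring the abstract's relation: loss = integral of rate dt.

def mass_loss(times, rates):
    """Integrate loss rate (%/min) over time (min) with the trapezoidal rule."""
    return sum((rates[i] + rates[i + 1]) / 2 * (times[i + 1] - times[i])
               for i in range(len(times) - 1))

t = [0, 10, 20, 30]          # minutes into the heat treatment (made up)
r = [0.0, 0.2, 0.4, 0.4]     # instantaneous loss rate, % per minute (made up)
print(round(mass_loss(t, r), 6))   # -> 8.0 (% total mass loss)
```

Continuously sampling the wood temperature, converting it to a rate through the fitted model, and accumulating this integral is what lets the method report mass loss on line.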

  6. Numerical analysis of Eucalyptus grandis × E. urophylla heat-treatment: A dynamically detecting method of mass loss during the process

    Science.gov (United States)

    Zhao, Zijian; Ma, Qing; Mu, Jun; Yi, Songlin; He, Zhengbin

Eucalyptus particles, lamellas and boards were used to explore a simply implemented method, neglecting heat and mass transfer, for inspecting the mass loss during the heat-treatment course. The results revealed that the mass loss over a certain period is theoretically the definite integral of the loss rate over time in this period, and a monitoring model for the mass loss rate was developed with the particles and validated with the lamellas and boards. In the model, the loss rate was correlated to the temperature and the temperature-evolution speed, and was composed of three functions covering different temperature-evolution periods. The sample mass loss was calculated in MATLAB for the lamellas and boards, and the model was validated and adjusted based on the difference between the computed results and the practically measured loss values. The error ranges of the new models were -16.30% to 18.35% for wood lamellas and -9.86% to 6.80% for wood boards. This method makes it possible to acquire the instantaneous loss value by continuously monitoring the wood temperature evolution. This approach could provide a reference for Eucalyptus heat treatment, to track the treating course and control the final material characteristics.

  7. Advanced Statistical Signal Processing Techniques for Landmine Detection Using GPR

    Science.gov (United States)

    2014-07-12

Advanced Statistical Signal Processing Techniques for Landmine Detection Using GPR. The views, opinions and/or findings contained in this report are those of the author(s) and should not... Related publications: "Aggregation Operator For Humanitarian Demining Using Hand-Held GPR" (2008); D. Ho, P. Gader, J. Wilson, H. Frigui, "Subspace Processing..."

  8. GNSS Spoofing Detection Based on Signal Power Measurements: Statistical Analysis

    Directory of Open Access Journals (Sweden)

    V. Dehghanian

    2012-01-01

Full Text Available A threat to GNSS receivers is posed by a spoofing transmitter that emulates authentic signals but with randomized code phase and Doppler values over a small range. Such spoofing signals can result in large navigational solution errors that are passed on to the unsuspecting user with potentially dire consequences. An effective spoofing detection technique, based on signal power measurements, is developed in this paper; it can be readily applied to present consumer-grade GNSS receivers with minimal firmware changes. An extensive statistical analysis is carried out based on formulating a multihypothesis detection problem. Expressions are developed to devise the set of thresholds required for signal detection and identification. The detection methods developed are further extended to exploit incidental antenna motion arising from user interaction with a GNSS handheld receiver, to further enhance the detection performance of the proposed algorithm. The statistical analysis supports the effectiveness of the proposed spoofing detection technique under various multipath conditions.

  9. Crack Detection with Lamb Wave Wavenumber Analysis

    Science.gov (United States)

    Tian, Zhenhua; Leckey, Cara; Rogge, Matt; Yu, Lingyu

    2013-01-01

In this work, we present our study of Lamb wave crack detection using wavenumber analysis. The aim is to demonstrate the application of wavenumber analysis to 3D Lamb wave data to enable damage detection. The 3D wavefields (including vx, vy and vz components) in the time-space domain contain a wealth of information regarding the propagating waves in a damaged plate. For crack detection, three wavenumber analysis techniques are used: (i) the two-dimensional Fourier transform (2D-FT), which can transform the time-space wavefield into a frequency-wavenumber representation while losing the spatial information; (ii) the short-space 2D-FT, which can obtain the frequency-wavenumber spectra at various spatial locations, resulting in a space-frequency-wavenumber representation; (iii) local wavenumber analysis, which can provide the distribution of the effective wavenumbers at different locations. All of these concepts are demonstrated through a numerical simulation example of an aluminum plate with a crack. The 3D elastodynamic finite integration technique (EFIT) was used to obtain the 3D wavefields, of which the vz (out-of-plane) wave component is compared with the experimental measurement obtained from a scanning laser Doppler vibrometer (SLDV) for verification purposes. The experimental and simulated results are found to be in close agreement. The application of wavenumber analysis on 3D EFIT simulation data shows the effectiveness of the analysis for crack detection. Keywords: Lamb wave, crack detection, wavenumber analysis, EFIT modeling
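Technique (i), the 2D Fourier transform of a time-space wavefield into the frequency-wavenumber domain, can be demonstrated on a tiny synthetic traveling wave. A naive DFT is written out here for self-containment; real analyses would use an FFT:

```python
# A propagating wave u[t][x] appears as a peak in the frequency-wavenumber
# plane after a 2D Fourier transform (naive DFT, fine for an 8x8 example).
import cmath
import math

def dft2(u):
    T, N = len(u), len(u[0])
    return [[sum(u[t][x] * cmath.exp(-2j * math.pi * (m * t / T + n * x / N))
                 for t in range(T) for x in range(N))
             for n in range(N)] for m in range(T)]

T = N = 8
f0, k0 = 2, 3  # cycles per record in time (frequency) and space (wavenumber)
u = [[math.cos(2 * math.pi * (f0 * t / T + k0 * x / N)) for x in range(N)]
     for t in range(T)]
U = dft2(u)
peak = max(((m, n) for m in range(T) for n in range(N)),
           key=lambda mn: abs(U[mn[0]][mn[1]]))
print(peak)  # one of (2, 3) or its mirrored counterpart (6, 5)
```

In the crack-detection setting, a crack perturbs the local wavenumber content, which is why the short-space and local variants (ii) and (iii) restrict this transform to neighborhoods of each spatial location.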

  10. Command Process Modeling & Risk Analysis

    Science.gov (United States)

    Meshkat, Leila

    2011-01-01

    Commanding Errors may be caused by a variety of root causes. It's important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  11. Preliminary hazards analysis -- vitrification process

    Energy Technology Data Exchange (ETDEWEB)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1994-06-01

This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during project conceptual design and provides input to it. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  12. Preliminary hazards analysis -- vitrification process

    International Nuclear Information System (INIS)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P.

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment

  13. Risk analysis: opening the process

    International Nuclear Information System (INIS)

    Hubert, Ph.; Mays, C.

    1998-01-01

This conference on risk analysis took place in Paris, 11-14 October 1999. Over 200 papers were presented in the seven following sessions: perception; environment and health; persuasive risks; objects and products; personal and collective involvement; assessment and valuation; management. A rational approach to risk analysis has been developed over the last three decades. Techniques for risk assessment have been thoroughly enhanced, risk management approaches have been developed, decision making processes have been clarified, and the social dimensions of risk perception and management have been investigated. Nevertheless this construction is being challenged by recent events which reveal how deficits in stakeholder involvement, openness and democratic procedures can undermine risk management actions. Indeed, the global process and most components of risk analysis may be radically called into question. Food safety has lately been a prominent issue, but now debates appear, or old debates are revisited, in the domains of public health, consumer products safety, waste management, environmental risks, nuclear installations, automobile safety and pollution. To meet the growing pressures for efficiency, openness, accountability, and multi-partner communication in risk analysis, institutional changes are underway in many European countries. However, the need for stakeholders to develop better insight into the process may lead to an evolution of all the components of risk analysis, even in its most 'technical' steps. For stakeholders of different professional backgrounds, political projects, and responsibilities, risk identification procedures must be rendered understandable, and quantitative risk assessment must be intelligible and accommodated in action proposals, ranging from countermeasures to educational programs to insurance mechanisms. Management formats must be open to local and political input and other types of operational feedback. (authors)

  14. Detection of cracks on concrete surfaces by hyperspectral image processing

    Science.gov (United States)

    Santos, Bruno O.; Valença, Jonatas; Júlio, Eduardo

    2017-06-01

All large infrastructures worldwide must have a suitable monitoring and maintenance plan, aiming to evaluate their behaviour and to predict timely interventions. In the particular case of concrete infrastructures, the detection and characterization of crack patterns is a major indicator of their structural response. In this scope, methods based on image processing have been applied and presented. Usually, these methods focus on image binarization followed by applications of mathematical morphology to identify cracks on the concrete surface. In most cases, publications focus on restricted areas of concrete surfaces and on a single crack. On site, the methods and algorithms have to deal with several factors that interfere with the results, namely dirt and biological colonization. Thus, the automation of a procedure for on-site characterization of crack patterns is of great interest. This advance may result in an effective tool to support maintenance strategies and intervention planning. This paper presents research based on the analysis and processing of hyperspectral images for the detection and classification of cracks on concrete structures. The objective of the study is to evaluate the applicability of several wavelengths of the electromagnetic spectrum for the classification of cracks in concrete surfaces. An image survey considering highly discretized wavelengths between 425 nm and 950 nm, with bandwidths of 25 nm, was performed on concrete specimens. The concrete specimens were produced with a crack pattern induced by applying a load with displacement control. The tests were conducted to simulate usual on-site drawbacks; in this context, the surface of the specimen was subjected to biological colonization (leaves and moss). To evaluate the results and enhance crack patterns, a clustering method, namely the k-means algorithm, is being applied. The research conducted allows defining the suitability of using the k-means clustering algorithm combined with hyperspectral images highly
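The clustering step can be sketched with a tiny one-dimensional k-means (k = 2) that separates dark crack pixels from bright surface pixels by intensity; a hyperspectral version would cluster per-pixel spectral vectors in the same way. The pixel values below are made up:

```python
# Minimal 1-D k-means, k=2: iterate assign-to-nearest-center / recompute-mean.

def kmeans1d(values, c0, c1, iters=10):
    for _ in range(iters):
        a = [v for v in values if abs(v - c0) <= abs(v - c1)]
        b = [v for v in values if abs(v - c0) > abs(v - c1)]
        c0 = sum(a) / len(a) if a else c0
        c1 = sum(b) / len(b) if b else c1
    return c0, c1

pixels = [12, 15, 10, 220, 230, 14, 225, 218]   # dark cracks, bright surface
crack_c, surf_c = kmeans1d(pixels, min(pixels), max(pixels))
print(round(crack_c, 2), round(surf_c, 2))       # -> 12.75 223.25
```

Pixels assigned to the darker center would be labeled as crack candidates, which is the enhancement effect the abstract describes.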

  15. Measurement Bias Detection through Factor Analysis

    Science.gov (United States)

    Barendse, M. T.; Oort, F. J.; Werner, C. S.; Ligtvoet, R.; Schermelleh-Engel, K.

    2012-01-01

    Measurement bias is defined as a violation of measurement invariance, which can be investigated through multigroup factor analysis (MGFA), by testing across-group differences in intercepts (uniform bias) and factor loadings (nonuniform bias). Restricted factor analysis (RFA) can also be used to detect measurement bias. To also enable nonuniform…

  16. Social network analysis community detection and evolution

    CERN Document Server

    Missaoui, Rokia

    2015-01-01

    This book is devoted to recent progress in social network analysis with a high focus on community detection and evolution. The eleven chapters cover the identification of cohesive groups, core components and key players either in static or dynamic networks of different kinds and levels of heterogeneity. Other important topics in social network analysis such as influential detection and maximization, information propagation, user behavior analysis, as well as network modeling and visualization are also presented. Many studies are validated through real social networks such as Twitter. This edit

  17. Signal processing techniques for sodium boiling noise detection

    International Nuclear Information System (INIS)

    1989-05-01

    At the Specialists' Meeting on Sodium Boiling Detection organized by the International Working Group on Fast Reactors (IWGFR) of the International Atomic Energy Agency at Chester in the United Kingdom in 1981 various methods of detecting sodium boiling were reported. But, it was not possible to make a comparative assessment of these methods because the signal condition in each experiment was different from others. That is why participants of this meeting recommended that a benchmark test should be carried out in order to evaluate and compare signal processing methods for boiling detection. Organization of the Co-ordinated Research Programme (CRP) on signal processing techniques for sodium boiling noise detection was also recommended at the 16th meeting of the IWGFR. The CRP on Signal Processing Techniques for Sodium Boiling Noise Detection was set up in 1984. Eight laboratories from six countries have agreed to participate in this CRP. The overall objective of the programme was the development of reliable on-line signal processing techniques which could be used for the detection of sodium boiling in an LMFBR core. During the first stage of the programme a number of existing processing techniques used by different countries have been compared and evaluated. In the course of further work, an algorithm for implementation of this sodium boiling detection system in the nuclear reactor will be developed. It was also considered that the acoustic signal processing techniques developed for boiling detection could well make a useful contribution to other acoustic applications in the reactor. This publication consists of two parts. Part I is the final report of the co-ordinated research programme on signal processing techniques for sodium boiling noise detection. Part II contains two introductory papers and 20 papers presented at four research co-ordination meetings since 1985. A separate abstract was prepared for each of these 22 papers. Refs, figs and tabs

  18. Fault Management: Degradation Signature Detection, Modeling, and Processing Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Fault to Failure Progression (FFP) signature modeling and processing is a new method for applying condition-based signal data to detect degradation, to identify...

  19. Fault Management: Degradation Signature Detection, Modeling, and Processing, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Fault to Failure Progression (FFP) signature modeling and processing is a new method for applying condition-based signal data to detect degradation, to identify...

  20. HIGH RESOLUTION RESISTIVITY LEAK DETECTION DATA PROCESSING & EVALUATION METHODS & REQUIREMENTS

    Energy Technology Data Exchange (ETDEWEB)

    SCHOFIELD JS

    2007-10-04

This document has two purposes: (1) describe how data generated by High Resolution Resistivity (HRR) leak detection (LD) systems deployed during single-shell tank (SST) waste retrieval operations are processed and evaluated, and (2) provide the basic review requirements for HRR data when HRR is deployed as a leak detection method during SST waste retrievals.

  1. Plagiarism Detection for Indonesian Language using Winnowing with Parallel Processing

    Science.gov (United States)

    Arifin, Y.; Isa, S. M.; Wulandhari, L. A.; Abdurachman, E.

    2018-03-01

Plagiarism takes many forms: not only copy-paste, but also changing passive into active voice, or paraphrasing without appropriate acknowledgment. It happens in all languages, including Indonesian. There is much previous research related to plagiarism detection for Indonesian using different methods, but some parts still offer opportunities for improvement. This research proposes a solution that improves the plagiarism detection technique so that it can detect not only the copy-paste form but more advanced forms as well. The proposed solution uses Winnowing with additional steps in the pre-processing stage: stemming for Indonesian, and fingerprint generation in parallel, which saves processing time and produces the plagiarism result for the suspected document.
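The Winnowing fingerprinting named above can be sketched in a few lines (single-threaded; the Indonesian stemming and the parallel fingerprint generation described in the abstract are omitted here):

```python
# Winnowing: hash all k-grams of the normalized text, then in each sliding
# window of w hashes record the (rightmost) minimum as a fingerprint.

def winnow(text, k=4, w=4):
    text = "".join(ch.lower() for ch in text if ch.isalnum())
    hashes = [hash(text[i:i + k]) for i in range(len(text) - k + 1)]
    prints = set()
    for i in range(len(hashes) - w + 1):
        window = hashes[i:i + w]
        # rightmost index holding the window minimum
        j = max(range(w), key=lambda idx: (window[idx] == min(window), idx))
        prints.add((window[j], i + j))
    return prints

doc  = "Plagiarism detection with winnowing"
copy = "   plagiarism DETECTION with winnowing!! "
print(winnow(doc) == winnow(copy))   # -> True: fingerprints survive case/spacing
```

Two documents are then compared by the overlap of their fingerprint sets; the normalization step is where language-specific processing such as stemming would plug in.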

  2. Leucocyte classification for leukaemia detection using image processing techniques.

    Science.gov (United States)

    Putzu, Lorenzo; Caocci, Giovanni; Di Ruberto, Cecilia

    2014-11-01

The counting and classification of blood cells allow for the evaluation and diagnosis of a vast number of diseases. The analysis of white blood cells (WBCs) allows for the detection of acute lymphoblastic leukaemia (ALL), a blood cancer that can be fatal if left untreated. Currently, the morphological analysis of blood cells is performed manually by skilled operators. However, this method has numerous drawbacks, such as slow analysis, non-standard accuracy, and dependence on the operator's skill. Few examples of automated systems that can analyse and classify blood cells have been reported in the literature, and most of these systems are only partially developed. This paper presents a complete and fully automated method for WBC identification and classification using microscopic images. In contrast to other approaches that identify the nuclei first, which are more prominent than other components, the proposed approach isolates the whole leucocyte and then separates the nucleus and cytoplasm. This approach is necessary to analyse each cell component in detail. From each cell component, different features, such as shape, colour and texture, are extracted using a new approach for background pixel removal. This feature set was used to train different classification models in order to determine which one is most suitable for the detection of leukaemia. Using our method, 245 of 267 total leucocytes were properly identified (92% accuracy) from 33 images taken with the same camera and under the same lighting conditions. Performing this evaluation using different classification models allowed us to establish that the support vector machine with a Gaussian radial basis kernel is the most suitable model for the identification of ALL, with an accuracy of 93% and a sensitivity of 98%. Furthermore, we evaluated the goodness of our new feature set, which displayed better performance with each evaluated classification model. 
The proposed method permits the analysis of blood cells

  3. A signal processing method for the friction-based endpoint detection system of a CMP process

    International Nuclear Information System (INIS)

    Xu Chi; Guo Dongming; Jin Zhuji; Kang Renke

    2010-01-01

A signal processing method for the friction-based endpoint detection system of a chemical mechanical polishing (CMP) process is presented. The signal processing method uses wavelet threshold denoising to reduce the noise contained in the measured original signal, extracts the Kalman filter innovation from the denoised signal as the feature signal, and judges the CMP endpoint based on the behavior of the Kalman filter innovation sequence during the CMP process. Applying this signal processing method, endpoint detection experiments for the Cu CMP process were carried out. The results show that the method can identify the endpoint of the Cu CMP process. (semiconductor technology)
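The abstract does not give the filter equations, but the innovation idea can be sketched with a one-dimensional random-walk Kalman filter: while the friction signal is steady the normalised innovations stay small, and the step change at the endpoint produces a run of large innovations. All signal parameters and thresholds below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def kalman_innovations(z, q=1e-4, r=1e-2):
    """Run a 1-D random-walk Kalman filter over signal z and return
    the normalised innovation (measurement residual) sequence."""
    x, p = z[0], 1.0                    # state estimate and its variance
    innov = np.zeros(len(z))
    for k in range(1, len(z)):
        p += q                          # predict: state is a random walk
        nu = z[k] - x                   # innovation: measurement minus prediction
        s = p + r                       # innovation variance
        g = p / s                       # Kalman gain
        x += g * nu                     # state update
        p *= (1 - g)                    # variance update
        innov[k] = nu / np.sqrt(s)
    return innov

def detect_endpoint(z, thresh=2.5, run=3):
    """Flag the first sample opening a run of `run` consecutive
    normalised innovations larger than `thresh` in magnitude."""
    hits = np.abs(kalman_innovations(z)) > thresh
    for k in range(len(hits) - run + 1):
        if hits[k:k + run].all():
            return k
    return -1

# Synthetic friction trace: steady polishing, then a step at sample 300.
rng = np.random.default_rng(0)
sig = np.concatenate([np.full(300, 1.0), np.full(100, 0.6)])
sig += 0.01 * rng.standard_normal(sig.size)
print(detect_endpoint(sig))             # → 300 (the first post-step sample)
```

Requiring a short run of large innovations, rather than a single exceedance, trades a few samples of latency for robustness against isolated noise spikes.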

  4. Sequential Detection of Fission Processes for Harbor Defense

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J V; Walston, S E; Chambers, D H

    2015-02-12

With the large increase in terrorist activities throughout the world, the timely and accurate detection of special nuclear material (SNM) has become an extremely high priority for many countries concerned with national security. The detection of radionuclide contraband based on their γ-ray emissions has been attacked vigorously with some interesting and feasible results; however, the fission process of SNM has not received as much attention due to its inherent complexity and required predictive nature. In this paper, on-line, sequential Bayesian detection and parameter estimation techniques are developed to rapidly and reliably detect unknown fissioning sources with high statistical confidence.
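A minimal sketch of sequential detection in this spirit is Wald's sequential probability ratio test on a stream of Poisson counts. The paper develops Bayesian techniques; the SPRT shown here is a simpler classical stand-in, with illustrative count rates and error levels.

```python
import math
import numpy as np

def sprt_counts(counts, bkg_rate, src_rate, alpha=1e-3, beta=1e-3):
    """Wald SPRT on a stream of Poisson counts per time bin.
    H0: mean rate = bkg_rate; H1: mean rate = bkg_rate + src_rate.
    Returns ('H0'|'H1', deciding bin) or ('undecided', n_bins)."""
    lam0, lam1 = bkg_rate, bkg_rate + src_rate
    upper = math.log((1 - beta) / alpha)      # accept H1 at or above this
    lower = math.log(beta / (1 - alpha))      # accept H0 at or below this
    llr = 0.0
    for i, n in enumerate(counts):
        # Poisson log-likelihood ratio for one bin (factorials cancel)
        llr += n * math.log(lam1 / lam0) - (lam1 - lam0)
        if llr >= upper:
            return 'H1', i
        if llr <= lower:
            return 'H0', i
    return 'undecided', len(counts)

rng = np.random.default_rng(1)
bkg, src = 5.0, 3.0                           # counts per bin (illustrative)
print(sprt_counts(rng.poisson(bkg + src, 200), bkg, src))  # typically 'H1'
print(sprt_counts(rng.poisson(bkg, 200), bkg, src))        # typically 'H0'
```

The decision usually arrives within a few tens of bins here; the thresholds guarantee the chosen false-alarm and miss probabilities regardless of how long that takes.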

  5. A visual analysis of the process of process modeling

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.; Pinggera, J.; Reijers, H.A.; Weber, B.; Poels, G.

    2015-01-01

The construction of business process models has become an important requisite in the analysis and optimization of processes. The success of the analysis and optimization efforts heavily depends on the quality of the models. Therefore, a research domain emerged that studies the process of process modeling.

  6. Influence analysis in quantitative trait loci detection.

    Science.gov (United States)

    Dou, Xiaoling; Kuriki, Satoshi; Maeno, Akiteru; Takada, Toyoyuki; Shiroishi, Toshihiko

    2014-07-01

This paper presents systematic methods for the detection of influential individuals that affect the log odds (LOD) score curve. We derive general formulas of influence functions for profile likelihoods and introduce them into two standard quantitative trait locus detection methods: the interval mapping method and single marker analysis. Besides influence analysis on specific LOD scores, we also develop influence analysis methods on the shape of the LOD score curves. A simulation-based method is proposed to assess the significance of the influence of the individuals. These methods are shown to be useful in the influence analysis of a real dataset of an experimental population from an F2 mouse cross. By receiver operating characteristic analysis, we confirm that the proposed methods show better performance than existing diagnostics. © 2014 The Author. Biometrical Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Detection of optimum maturity of maize using image processing and ...

    African Journals Online (AJOL)

    Detection of optimum maturity of maize using image processing and artificial neural networks. ... The leaves of maize are also very good source of food for grazing livestock like cows, goats, sheep, etc. However, in Nigeria ... of maturity. Keywords: Maize, Maturity, CCD Camera, Image Processing, Artificial Neural Network ...

  8. Pre-Processes for Urban Areas Detection in SAR Images

    Science.gov (United States)

    Altay Açar, S.; Bayır, Ş.

    2017-11-01

In this study, pre-processes for urban area detection in synthetic aperture radar (SAR) images are examined. These pre-processes are image smoothing, thresholding, and determination of white-coloured regions. Image smoothing is carried out to remove noise; thresholding is then applied to obtain a binary image. Finally, candidate urban areas are detected by determining the white-coloured regions. All pre-processes are applied using the developed software. Two different SAR images acquired by TerraSAR-X are used in the experimental study. The obtained results are shown visually.
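The three pre-processes can be sketched end to end on a toy image. The filter size, threshold and minimum region size below are illustrative choices, not values from the study.

```python
import numpy as np
from collections import deque

def smooth(img, k=3):
    """Box (mean) filter, a simple stand-in for SAR speckle smoothing."""
    pad = k // 2
    p = np.pad(img, pad, mode='edge')
    h, w = img.shape
    out = np.zeros((h, w))
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + h, dx:dx + w]
    return out / (k * k)

def white_regions(binary, min_size=4):
    """4-connected components of white (1) pixels, keeping large ones."""
    h, w = binary.shape
    seen = np.zeros((h, w), dtype=bool)
    regions = []
    for y in range(h):
        for x in range(w):
            if binary[y, x] and not seen[y, x]:
                comp, queue = [], deque([(y, x)])
                seen[y, x] = True
                while queue:
                    cy, cx = queue.popleft()
                    comp.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and binary[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                if len(comp) >= min_size:
                    regions.append(comp)
    return regions

# Toy "SAR" image: dark speckled background with one bright 4x4 block.
rng = np.random.default_rng(3)
img = rng.uniform(0.0, 0.2, (16, 16))
img[5:9, 5:9] += 0.8
binary = (smooth(img) >= 0.5).astype(np.uint8)   # smoothing + thresholding
print(len(white_regions(binary)))                # → 1 candidate urban area
```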

  9. Application of image processing technology in yarn hairiness detection

    Directory of Open Access Journals (Sweden)

    Guohong ZHANG

    2016-02-01

Digital image processing technology is one of the newer methods for yarn detection, capable of digital characterization and objective evaluation of yarn appearance. This paper reviews the current status of development and application of digital image processing technology for yarn hairiness evaluation, and analyzes and compares the traditional detection methods with this newly developed method. Compared with the traditional methods, the image-processing-based method is more objective, fast and accurate, and represents a key development trend in yarn appearance evaluation.

  10. Image Pre-processing in Vertical Traffic Signs Detection System

    Directory of Open Access Journals (Sweden)

    Dávid Solus

    2015-06-01

The aim of this paper is to present the first steps in the design of a system able to detect vertical traffic signs and to describe the applied data processing methods. This system includes various functional blocks that are described in this paper. The basis of the Vertical Traffic Signs Detection System is pre-processing of the traffic scene captured ahead of the vehicle. The main part of this paper describes a user-friendly software interface for image pre-processing.

  11. Use of Sparse Principal Component Analysis (SPCA) for Fault Detection

    DEFF Research Database (Denmark)

    Gajjar, Shriram; Kulahci, Murat; Palazoglu, Ahmet

    2016-01-01

    Principal component analysis (PCA) has been widely used for data dimension reduction and process fault detection. However, interpreting the principal components and the outcomes of PCA-based monitoring techniques is a challenging task since each principal component is a linear combination of the ...

  12. Generating functional analysis of CDMA detection dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Mimura, Kazushi [Faculty of Information Sciences, Hiroshima City University, Hiroshima 731-3194 (Japan); Okada, Masato [Graduate School of Frontier Sciences, University of Tokyo, Chiba 277-5861 (Japan); Brain Science Institute, RIKEN, Saitama 351-0198 (Japan); PRESTO, Japan Science and Technology Agency, Chiba 277-8561 (Japan)

    2005-11-18

    We investigate the detection dynamics of the parallel interference canceller (PIC) for code-division multiple-access (CDMA) multiuser detection, applied to a randomly spread, fully synchronous base-band uncoded CDMA channel model with additive white Gaussian noise (AWGN) under perfect power control in the large-system limit. It is known that the predictions of the density evolution (DE) can fairly explain the detection dynamics only in the case where the detection dynamics converge. At transients, though, the predictions of DE systematically deviate from computer simulation results. Furthermore, when the detection dynamics fail to converge, the deviation of the predictions of DE from the results of numerical experiments becomes large. As an alternative, generating functional analysis (GFA) can take into account the effect of the Onsager reaction term exactly and does not need the Gaussian assumption of the local field. We present GFA to evaluate the detection dynamics of PIC for CDMA multiuser detection. The predictions of GFA exhibit good consistency with the computer simulation result for any condition, even if the dynamics fail to converge.
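The object of the analysis, the PIC iteration itself, is simple to state: each stage subtracts the interference implied by the previous tentative decisions from every user's matched-filter output. A hard-decision sketch for a small synchronous system (dimensions and noise level are illustrative, and this shows the detector being analysed, not the generating functional analysis itself):

```python
import numpy as np

def pic_detect(y, S, n_iter=10):
    """Hard-decision parallel interference cancellation (PIC) for a
    synchronous CDMA channel y = S @ b + noise, with S: chips x users."""
    R = S.T @ S                        # user cross-correlation matrix
    mf = S.T @ y                       # matched-filter outputs
    b = np.sign(mf)                    # stage-0 tentative decisions
    off_diag = R - np.diag(np.diag(R))
    for _ in range(n_iter):
        # Subtract the interference implied by the previous decisions.
        b = np.sign(mf - off_diag @ b)
    return b

rng = np.random.default_rng(5)
n_chips, n_users = 32, 4
S = rng.choice([-1.0, 1.0], size=(n_chips, n_users))   # random spreading
bits = rng.choice([-1.0, 1.0], size=n_users)
y = S @ bits + 0.3 * rng.normal(size=n_chips)          # AWGN channel
print(np.array_equal(pic_detect(y, S), bits))
```

Whether and how quickly this iteration converges is exactly what DE approximates and GFA characterises exactly.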

  13. PRECLOSURE CRITICALITY ANALYSIS PROCESS REPORT

    International Nuclear Information System (INIS)

    Danise, A.E.

    2004-01-01

This report describes a process for performing preclosure criticality analyses for a repository at Yucca Mountain, Nevada. These analyses will be performed from the time of receipt of fissile material until permanent closure of the repository (preclosure period). The process describes how criticality safety analyses will be performed for various configurations of waste in or out of waste packages that could occur during preclosure as a result of normal operations or event sequences. The criticality safety analysis considers those event sequences resulting in unanticipated moderation, loss of neutron absorber, geometric changes, or administrative errors in waste form placement (loading) of the waste package. The report proposes a criticality analysis process for preclosure to allow a consistent transition from preclosure to postclosure, thereby possibly reducing potential cost increases and delays in licensing of Yucca Mountain. The proposed approach provides the advantage of using a parallel regulatory framework for evaluation of preclosure and postclosure performance and is consistent with the U.S. Nuclear Regulatory Commission's approach of supporting risk-informed, performance-based regulation for fuel cycle facilities, ''Yucca Mountain Review Plan, Final Report'', and 10 CFR Part 63. The criticality-related criteria for ensuring subcriticality are also described, as well as the guidance documents that will be utilized. Preclosure operations and facilities have significant similarities to existing facilities and operations currently regulated by the U.S. Nuclear Regulatory Commission; therefore, the design approach for preclosure criticality safety will be dictated by existing regulatory requirements while using a risk-informed approach with burnup credit for in-package operations.

  14. Pipeline Processing with an Iterative, Context-Based Detection Model

    Science.gov (United States)

    2016-01-22

[Extraction fragments from report AFRL-RV-PS-TR-2016-0080, "Pipeline Processing with an Iterative, Context-Based Detection Model," T. Kværna et al. The recoverable passages state that the optimum choice for steering vectors is the adaptive (minimum-variance) beamformer weighting [Capon et al., 1967], report the capability to detect events down to magnitude 2.0 [Gibbons et al., 2011], and caption a figure (Figure 3) showing the location of the SPITS array in relation to Novaya …]

  15. Simulation of land mine detection processes using nuclear techniques

    International Nuclear Information System (INIS)

    Aziz, M.

    2005-01-01

Computer models were designed to study land mine detection processes using nuclear techniques. Parameters that affect the detection were analyzed. Mines of different masses at different depths in the soil are considered, using two types of sources: 252Cf and a 14 MeV neutron source. The capability to differentiate between mines and other objects, such as concrete, iron, wood, aluminium, water and polyethylene, was analyzed and studied.

  16. Fault detection of Tennessee Eastman process based on topological features and SVM

    Science.gov (United States)

    Zhao, Huiyang; Hu, Yanzhu; Ai, Xinbo; Hu, Yu; Meng, Zhen

    2018-03-01

Fault detection in industrial processes is a popular research topic. Although the distributed control system (DCS) has been introduced to monitor the state of industrial processes, it still cannot satisfy all the requirements for fault detection in every industrial system. In this paper, we propose a novel method for fault detection in industrial processes based on topological features and a support vector machine (SVM). The proposed method takes global information about the measured variables into account through a complex network model, and predicts with the SVM whether a system has developed a fault. The method can be divided into four steps: network construction, network analysis, model training and model testing. Finally, we apply the model to the Tennessee Eastman process (TEP). The results show that the method works well and can be a useful supplement for fault detection in industrial processes.
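A minimal version of the network-construction and network-analysis steps correlates the measured variables, links strongly correlated pairs, and summarises the resulting graph. The SVM training and testing steps are omitted here, and the specific features (edge count and degrees) are an illustrative choice, not necessarily those used in the paper.

```python
import numpy as np

def network_features(X, corr_thresh=0.6):
    """Link variables whose absolute correlation exceeds a threshold
    and summarise the graph: [edge count, degree of each variable]."""
    C = np.corrcoef(X, rowvar=False)
    A = (np.abs(C) > corr_thresh).astype(int)
    np.fill_diagonal(A, 0)
    degrees = A.sum(axis=0)
    return np.concatenate([[A.sum() // 2], degrees])

rng = np.random.default_rng(8)
# Normal operation: variables 0-2 share a latent driver; 3-4 independent.
f = rng.normal(size=500)
cols = [f + 0.3 * rng.normal(size=500) for _ in range(3)]
cols += [rng.normal(size=500) for _ in range(2)]
normal = np.column_stack(cols)

faulty = normal.copy()
faulty[:, 2] = rng.normal(size=500)              # variable 2 decouples
print(network_features(normal), network_features(faulty))
# → [3 2 2 2 0 0] [1 1 1 0 0 0]
```

A fault that breaks a correlation changes the graph's topology, which is what a downstream classifier such as an SVM would pick up.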

  17. Molecular sieves analysis by elastic recoil detection

    International Nuclear Information System (INIS)

    Salah, H.; Azzouz, A.

    1992-01-01

The opportunity of determining water in zeolites via hydrogen detection using elastic recoil detection analysis (ERDA) was investigated. The radiation effect upon the desorption rate of hydrogen in miscellaneous types of zeolites (e.g. Y-faujasite, ZSM-5, SK) and in a natural clay, an Algerian bentonite, was discussed. Quantitative measurements were carried out in order to determine the amount and distribution of hydrogen in each material. Various explanations dealing with hydration and constitution water in such a crystalline framework were proposed. The experimental results are in good agreement with the corresponding theoretical values.

  18. Detection of genetically modified organisms in foreign-made processed foods containing corn and potato.

    Science.gov (United States)

    Monma, Kimio; Araki, Rie; Sagi, Naoki; Satoh, Masaki; Ichikawa, Hisatsugu; Satoh, Kazue; Tobe, Takashi; Kamata, Kunihiro; Hino, Akihiro; Saito, Kazuo

    2005-06-01

Investigations of the validity of labeling regarding genetically modified (GM) products were conducted using polymerase chain reaction (PCR) methods on foreign-made processed foods made from corn and potato purchased in the Tokyo area and in the USA. Several kinds of GM crops were detected in 12 of 32 processed corn samples. More than two GM events for which safety reviews have been completed in Japan were simultaneously detected in 10 samples. GM events MON810 and Bt11 were most frequently detected in the samples by qualitative PCR methods. MON810 was detected in 11 of the 12 samples, and Bt11 was detected in 6 of the 12 samples. In addition, Roundup Ready soy was detected in one of the 12 samples. On the other hand, CBH351, for which the safety assessment was withdrawn in Japan, was not detected in any of the 12 samples. A trial quantitative analysis was performed on six of the qualitatively GM-positive maize samples. The estimated amounts of GM maize in these samples ranged from 0.2 to 2.8%, except for one sample, which contained 24.1%. For this sample, the total amount found by event-specific quantitative analysis was 23.8%. Additionally, Roundup Ready soy was detected in one of 21 processed potato foods, although GM potatoes were not detected in any sample.

  19. Digital Image Processing Technique for Breast Cancer Detection

    Science.gov (United States)

    Guzmán-Cabrera, R.; Guzmán-Sepúlveda, J. R.; Torres-Cisneros, M.; May-Arrioja, D. A.; Ruiz-Pinales, J.; Ibarra-Manzano, O. G.; Aviña-Cervantes, G.; Parada, A. González

    2013-09-01

Breast cancer is the most common cause of death in women and the second leading cause of cancer deaths worldwide. Primary prevention in the early stages of the disease becomes complex as the causes remain almost unknown. However, some typical signatures of this disease, such as masses and microcalcifications appearing on mammograms, can be used to improve early diagnostic techniques, which is critical for women’s quality of life. X-ray mammography is the main test used for screening and early diagnosis, and its analysis and processing are the keys to improving breast cancer prognosis. As masses and benign glandular tissue typically appear with low contrast and often very blurred, several computer-aided diagnosis schemes have been developed to support radiologists and internists in their diagnosis. In this article, an approach is proposed to effectively analyze digital mammograms based on texture segmentation for the detection of early stage tumors. The proposed algorithm was tested over several images taken from the digital database for screening mammography for cancer research and diagnosis, and it was found to be well suited to distinguishing masses and microcalcifications from the background tissue using morphological operators, and then extracting them through machine learning techniques and a clustering algorithm for intensity-based segmentation.

  20. Hyperspectral Analysis for Standoff Detection of Dimethyl ...

    Science.gov (United States)

Journal Article. Detecting organophosphates in indoor settings requires more efficient and faster methods of surveying large surface areas than conventional approaches, which sample small surface areas followed by extraction and analysis. This study examined a standoff detection technique utilizing hyperspectral imaging for near-real-time analysis of building materials. In this proof-of-concept study, dimethyl methylphosphonate (DMMP) was applied to stainless steel and laminate coupons and spectra were collected during active illumination. Absorbance bands at approximately 1275 cm-1 and 1050 cm-1 were associated with phosphorus-oxygen double bond (P=O) and phosphorus-oxygen-carbon (P-O-C) bond stretches of DMMP, respectively. The magnitude of these bands increased linearly (r2 = 0.93) with DMMP across the full absorbance spectrum, between ν1 = 877 cm-1 and ν2 = 1262 cm-1. Comparisons between bare and contaminated surfaces on stainless steel using the spectral contrast angle technique indicated that the bare samples showed no sign of contamination, with large, uniformly distributed contrast angles of 45˚-55˚, while the contaminated samples had smaller spectral contrast angles of 40° in the uncontaminated region. The laminate contaminated region exhibited contrast angles of … [text truncated] … detect DMMP on building materials, with detection levels similar to c…
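The spectral contrast angle used above is simply the angle between two spectra viewed as vectors. A sketch with synthetic absorbance bands at the reported positions; all amplitudes, widths and the baseline are invented for illustration.

```python
import numpy as np

def spectral_contrast_angle(a, b):
    """Angle (degrees) between two spectra viewed as vectors;
    small angles mean similar spectra."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

wn = np.linspace(850, 1300, 200)                      # wavenumber axis, cm^-1
band = lambda c, w: np.exp(-((wn - c) / w) ** 2)      # toy Gaussian band

clean = np.full_like(wn, 0.05)                        # bare-surface baseline
dmmp = clean + band(1050, 15) + 0.8 * band(1275, 15)  # P-O-C and P=O bands
print(round(spectral_contrast_angle(clean, clean), 3))   # → 0.0 (identical)
print(spectral_contrast_angle(clean, dmmp) > 30)         # large angle: contaminated
```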

  1. PROBABILISTIC APPROACH TO OBJECT DETECTION AND RECOGNITION FOR VIDEOSTREAM PROCESSING

    Directory of Open Access Journals (Sweden)

    Volodymyr Kharchenko

    2017-07-01

Purpose: The presented research results aim to improve the theoretical basics of computer vision and artificial intelligence for dynamical systems. The proposed approach to object detection and recognition is based on probabilistic fundamentals to ensure the required level of correct object recognition. Methods: The presented approach is grounded in probabilistic methods, statistical methods of probability density estimation, and computer-based simulation at the verification stage of development. Results: The proposed approach to object detection and recognition for video stream processing has shown several advantages over existing methods owing to its simple realization and short data-processing time. The presented results of experimental verification look plausible for object detection and recognition in video streams. Discussion: The approach can be implemented in dynamical systems within changeable environments, such as remotely piloted aircraft systems, and can be part of the artificial intelligence in navigation and control systems.

  2. Counterfeit Electronics Detection Using Image Processing and Machine Learning

    International Nuclear Information System (INIS)

    Asadizanjani, Navid; Tehranipoor, Mark; Forte, Domenic

    2017-01-01

Counterfeiting is an increasing concern for businesses and governments as greater numbers of counterfeit integrated circuits (IC) infiltrate the global market. There is an ongoing effort in experimental and national labs inside the United States to detect and prevent such counterfeits as efficiently as possible. However, a piece is still missing: the ability to automatically detect and properly keep records of detected counterfeit ICs. Here, we introduce a web application database that allows users to share previous examples of counterfeits through an online database and to obtain statistics regarding the prevalence of known defects. We also investigate automated techniques based on image processing and machine learning to detect different physical defects and to determine whether or not an IC is counterfeit. (paper)

  3. Counterfeit Electronics Detection Using Image Processing and Machine Learning

    Science.gov (United States)

    Asadizanjani, Navid; Tehranipoor, Mark; Forte, Domenic

    2017-01-01

Counterfeiting is an increasing concern for businesses and governments as greater numbers of counterfeit integrated circuits (IC) infiltrate the global market. There is an ongoing effort in experimental and national labs inside the United States to detect and prevent such counterfeits as efficiently as possible. However, a piece is still missing: the ability to automatically detect and properly keep records of detected counterfeit ICs. Here, we introduce a web application database that allows users to share previous examples of counterfeits through an online database and to obtain statistics regarding the prevalence of known defects. We also investigate automated techniques based on image processing and machine learning to detect different physical defects and to determine whether or not an IC is counterfeit.

  4. System for detecting and processing abnormality in electromagnetic shielding

    International Nuclear Information System (INIS)

    Takahashi, T.; Nakamura, M.; Yabana, Y.; Ishikawa, T.; Nagata, K.

    1991-01-01

The present invention relates to a system for detecting and processing an abnormality in the electromagnetic shielding of an intelligent building, which is constructed using an electromagnetic shielding material for the skeleton and for openings such as windows and doorways, so that the whole building forms an electromagnetic shielding structure. (author). 4 figs

  5. Kernel principal component analysis for change detection

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Morton, J.C.

    2008-01-01

Principal component analysis (PCA) is often used to detect change over time in remotely sensed images. A commonly used technique consists of finding the projections along the two eigenvectors for data consisting of two variables which represent the same spectral band covering the same geographical region acquired at two different time points. If change over time does not dominate the scene, the projection of the original two bands onto the second eigenvector will show change over time. In this paper a kernel version of PCA is used to carry out the analysis. Unlike ordinary PCA, kernel PCA...
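A compact sketch of kernel PCA applied to such two-variable (same band, two dates) data follows. The RBF kernel, its width, and the synthetic data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def kernel_pca_scores(X, n_comp=2, gamma=None):
    """Project samples onto the leading kernel (RBF) principal components."""
    if gamma is None:
        gamma = 1.0 / X.shape[1]
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    n = K.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J                               # double-centre the kernel
    w, V = np.linalg.eigh(Kc)
    order = np.argsort(w)[::-1][:n_comp]
    alphas = V[:, order] / np.sqrt(np.maximum(w[order], 1e-12))
    return Kc @ alphas                           # component scores

rng = np.random.default_rng(2)
t1 = rng.normal(0.5, 0.1, 400)                   # one band at time 1
t2 = t1 + rng.normal(0.0, 0.02, 400)             # time 2: mostly unchanged...
t2[:20] += 0.4                                   # ...except 20 changed pixels
scores = kernel_pca_scores(np.column_stack([t1, t2]))

# The no-change direction dominates component 1; change shows up on 2.
pc2 = np.abs(scores[:, 1])
print(pc2[:20].mean() > pc2[20:].mean())
```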

  6. SNIa detection in the SNLS photometric analysis using Morphological Component Analysis

    International Nuclear Information System (INIS)

    Möller, A.; Ruhlmann-Kleider, V.; Neveu, J.; Palanque-Delabrouille, N.; Lanusse, F.; Starck, J.-L.

    2015-01-01

Detection of supernovae (SNe) and, more generally, of transient events in large surveys can produce numerous false detections. In the case of a deferred processing of survey images, this implies reconstructing complete light curves for all detections, requiring sizable processing time and resources. Optimizing the detection of transient events is thus an important issue for both present and future surveys. We present here the optimization done in the SuperNova Legacy Survey (SNLS) for the 5-year deferred photometric analysis. In this analysis, detections are derived from stacks of subtracted images, with one stack per lunation. The 3-year analysis provided 300,000 detections dominated by signals of bright objects that were not perfectly subtracted. Allowing these artifacts to be detected leads not only to a waste of resources but also to possible contamination of the signal coordinates. We developed a subtracted-image-stack treatment to reduce the number of non-SN-like events using morphological component analysis. This technique exploits the morphological diversity of the objects to be detected to extract the signal of interest. At the level of our subtraction stacks, SN-like events are rather circular objects, while most spurious detections exhibit different shapes. A two-step procedure was necessary to obtain a proper evaluation of the noise in the subtracted image stacks and thus a reliable signal extraction. We also set up a new detection strategy to obtain coordinates with good resolution for the extracted signal. SNIa Monte Carlo (MC) generated images were used to study detection efficiency and coordinate resolution. When tested on SNLS 3-year data, this procedure decreases the number of detections by a factor of two, while losing only 10% of SN-like events, almost all of them faint. MC results show that SNIa detection efficiency is equivalent to that of the original method for bright events, while the coordinate resolution is improved.

  7. Fall detection for multiple pedestrians using depth image processing technique.

    Science.gov (United States)

    Yang, Shih-Wei; Lin, Shir-Kuan

    2014-04-01

A fall detection method based on depth image analysis is proposed in this paper. Unlike conventional methods, the proposed method is still able to detect fall events when pedestrians are partially overlapped or partially occluded, and it has the following advantages: (1) single or multiple pedestrian detection; (2) recognition of human and non-human objects; (3) compensation for illumination, which is applicable in scenarios using indoor light sources of different colors; (4) using the central line of a human silhouette to obtain the pedestrian's tilt angle; and (5) avoiding misrecognition of a squat or stoop as a fall. According to the experimental results, the precision of the proposed fall detection method is 94.31% and the recall is 85.57%. The proposed method is verified to be robust and specifically suitable for application in family homes, corridors and other public places. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
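Advantage (4), estimating the tilt angle from the silhouette's central line, can be sketched by taking the principal axis of the silhouette pixels. The blob shapes and the 60° fall threshold below are illustrative assumptions, not the paper's values.

```python
import numpy as np

def tilt_angle(mask):
    """Tilt (degrees from vertical) of a silhouette's main axis,
    taken as the principal axis of its pixel coordinates."""
    ys, xs = np.nonzero(mask)
    pts = np.column_stack([xs, ys]).astype(float)
    pts -= pts.mean(axis=0)
    w, V = np.linalg.eigh(np.cov(pts, rowvar=False))
    axis = V[:, np.argmax(w)]                    # leading eigenvector
    return np.degrees(np.arctan2(abs(axis[0]), abs(axis[1])))

def is_fall(mask, thresh=60.0):
    """Flag a fall when the body axis leans past `thresh` degrees.
    A squat keeps a near-vertical axis, so it is not flagged."""
    return tilt_angle(mask) > thresh

standing = np.zeros((40, 40), np.uint8)
standing[5:35, 18:22] = 1                        # tall, narrow silhouette
lying = np.zeros((40, 40), np.uint8)
lying[18:22, 5:35] = 1                           # short, wide silhouette
print(round(tilt_angle(standing)), round(tilt_angle(lying)))  # → 0 90
print(is_fall(standing), is_fall(lying))                      # → False True
```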

  8. SCADA alarms processing for wind turbine component failure detection

    Science.gov (United States)

    Gonzalez, E.; Reder, M.; Melero, J. J.

    2016-09-01

Wind turbine failures and downtime can often compromise the profitability of a wind farm due to their high impact on operation and maintenance (O&M) costs. Early detection of failures can facilitate the changeover from corrective maintenance towards a predictive approach. This paper presents a cost-effective methodology that combines various alarm analysis techniques, using data from the Supervisory Control and Data Acquisition (SCADA) system, in order to detect component failures. The approach categorises the alarms according to a reviewed taxonomy, turning overwhelming data into valuable information to assess component status. Different alarm analysis techniques are then applied for two purposes: evaluating the capability of the SCADA alarm system to detect failures, and investigating the relation between faults in some components and subsequent failures in others. Various case studies are presented and discussed. The study highlights the relationship between faulty behaviour in different components, and between failures and adverse environmental conditions.

  9. Detection of small atom numbers through image processing

    NARCIS (Netherlands)

    Ockeloen, C.F.; Tauschinsky, A.F.; Spreeuw, R.J.C.; Whitlock, S.

    2010-01-01

    We demonstrate improved detection of small trapped atomic ensembles through advanced postprocessing and optimal analysis of absorption images. A fringe-removal algorithm reduces imaging noise to the fundamental photon-shot-noise level and proves beneficial even in the absence of fringes. A
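A common form of such a fringe-removal algorithm, consistent with the abstract's description though not necessarily the authors' exact implementation, builds for each probe shot the least-squares combination of reference (atom-free) images. Because fringe phase drifts shot to shot, a fitted composite reference matches the probe far better than any single reference frame. The fringe model below is synthetic and purely illustrative.

```python
import numpy as np

def best_reference(probe, refs):
    """Least-squares combination of reference (atom-free) images that
    best matches a probe image; using this composite instead of a
    single reference suppresses interference fringes."""
    R = refs.reshape(refs.shape[0], -1).T        # pixels x n_refs
    c, *_ = np.linalg.lstsq(R, probe.ravel(), rcond=None)
    return (R @ c).reshape(probe.shape)

rng = np.random.default_rng(6)
x = np.linspace(0.0, 10.0, 64)
fringe = lambda ph: (1 + 0.2 * np.sin(x[None, :] + ph)) * np.ones((64, 1))

refs = np.stack([fringe(ph) for ph in rng.uniform(0, 2 * np.pi, 20)])
probe = fringe(1.234)                            # fringe phase has drifted

resid_single = np.std(probe - refs[0])           # naive single reference
resid_fit = np.std(probe - best_reference(probe, refs))
print(f"residual: single ref {resid_single:.3f}, fitted {resid_fit:.2e}")
```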

  10. Process and device for detecting tumours of the eyes

    International Nuclear Information System (INIS)

    Safi, Nour; Thoreson, Elisabeth.

    1975-01-01

This invention relates to a process and system for detecting tumours of the eye likely to take up radioelements. To this end, the invention proposes a detection process whereby a molecule labelled with a radioelement emitting beta radiation, with an energy spectrum extending beyond the Cerenkov threshold in the vitreous humour of the eye, is introduced into the circulatory system of the patient under examination, and the Cerenkov emission is measured through the lens and pupil. A β-emitting radioelement, notably 32P, which is taken up strongly by tissue with high metabolic activity, can be employed in particular. The invention also proposes a system to implement the process described above, comprising a dioptric system for transmitting the light produced in the vitreous humour by the Cerenkov effect to a light detector of a type enabling the luminous flux it receives to be integrated. [fr]

  11. Detecting Difference between Process Models Based on the Refined Process Structure Tree

    Directory of Open Access Journals (Sweden)

    Jing Fan

    2017-01-01

The development of mobile workflow management systems (mWfMS) leads to a large number of business process models. In the meantime, the location restrictions embedded in mWfMS may result in different process models for a single business process. In order to help users quickly locate the differences and rebuild the process model, detecting the difference between process models is needed. Existing detection methods either provide a dissimilarity value to represent the difference or use predefined difference templates to generate the result, which cannot reflect the entire composition of the difference. Hence, in this paper, we present a new approach to solve this problem. Firstly, we parse the process models into their corresponding refined process structure trees (PSTs), that is, we decompose a process model into a hierarchy of subprocess models. Then we design a method to convert the PST to its corresponding task-based process structure tree (TPST). As a consequence, the problem of detecting the difference between two process models is transformed into detecting the difference between their corresponding TPSTs. Finally, we obtain the difference between two TPSTs based on a divide-and-conquer strategy, where the difference is described by an edit script whose cost we keep close to the minimum. The extensive experimental evaluation shows that our method can meet real requirements in terms of precision and efficiency.
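The parse-then-compare idea can be sketched on small label-tuple trees: canonical string forms let identical subtrees be pruned (the divide-and-conquer step), and only differing subtrees yield edit operations. The positional child alignment below is a deliberate simplification of the paper's minimum-cost matching.

```python
def canon(tree):
    """Canonical string form of a labelled tree ('label', child, ...)."""
    label, children = tree[0], tree[1:]
    return '(' + label + ''.join(canon(c) for c in children) + ')'

def diff(t1, t2, path='root'):
    """Divide and conquer: recurse only into subtrees whose canonical
    forms differ, emitting a simple relabel/insert/delete edit script."""
    if canon(t1) == canon(t2):
        return []                                # identical subtrees: prune
    ops = []
    if t1[0] != t2[0]:
        ops.append(('relabel', path, t1[0], t2[0]))
    c1, c2 = t1[1:], t2[1:]
    for i in range(min(len(c1), len(c2))):
        ops += diff(c1[i], c2[i], f'{path}.{i}')
    for i in range(len(c2), len(c1)):
        ops.append(('delete', f'{path}.{i}', canon(c1[i])))
    for i in range(len(c1), len(c2)):
        ops.append(('insert', f'{path}.{i}', canon(c2[i])))
    return ops

# Two toy process trees: only the second branch of the XOR changed.
p1 = ('seq', ('task', ('A',), ('B',)), ('xor', ('C',), ('D',)))
p2 = ('seq', ('task', ('A',), ('B',)), ('xor', ('C',), ('E',)))
print(diff(p1, p2))   # → [('relabel', 'root.1.1', 'D', 'E')]
```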

  12. Continuous Fraud Detection in Enterprise Systems through Audit Trail Analysis

    Directory of Open Access Journals (Sweden)

    Peter J. Best

    2009-03-01

Enterprise systems, real-time recording and real-time reporting pose new and significant challenges to the accounting and auditing professions, including the development of methods and tools for continuous assurance and fraud detection. In this paper we propose a methodology for continuous fraud detection that exploits security audit logs, changes in master records and accounting audit trails in enterprise systems. The steps in this process are: (1) threat monitoring: surveillance of security audit logs for ‘red flags’; (2) automated extraction and analysis of data from audit trails; and (3) using forensic investigation techniques to determine whether a fraud has actually occurred. We demonstrate how mySAP, an enterprise system, can be used for audit trail analysis in detecting financial frauds; we then use a case study of a suspected fraud to illustrate how to implement the methodology.
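Step (1), threat monitoring, can be sketched as a rule-based scan over audit-log lines. The log format ('USER message...'), the patterns and the escalation threshold below are hypothetical, not mySAP's actual log schema.

```python
import re
from collections import Counter

# Hypothetical red-flag rules; patterns and limits are illustrative.
RED_FLAGS = [
    (re.compile(r'master record .* changed', re.I), 'master data change'),
    (re.compile(r'authorization .* granted', re.I), 'authorization change'),
    (re.compile(r'logon failed', re.I), 'failed logon'),
]
FAILED_LOGON_LIMIT = 3

def threat_monitor(log_lines):
    """Scan security-audit-log lines for red flags and escalate
    users with repeated failed logons."""
    alerts, failures = [], Counter()
    for line in log_lines:
        user = line.split()[0]                   # assume 'USER message...'
        for pattern, name in RED_FLAGS:
            if pattern.search(line):
                alerts.append((user, name))
                if name == 'failed logon':
                    failures[user] += 1
    escalate = [u for u, n in failures.items() if n >= FAILED_LOGON_LIMIT]
    return alerts, escalate

log = [
    'JDOE logon failed at terminal T1',
    'JDOE logon failed at terminal T2',
    'JDOE logon failed at terminal T3',
    'ASMITH vendor master record BANK1 changed',
]
alerts, escalate = threat_monitor(log)
print(len(alerts), escalate)   # → 4 ['JDOE']
```

Escalated users would then feed steps (2) and (3): targeted audit-trail extraction and forensic investigation.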

  13. The effect of image processing on the detection of cancers in digital mammography.

    Science.gov (United States)

    Warren, Lucy M; Given-Wilson, Rosalind M; Wallis, Matthew G; Cooke, Julie; Halling-Brown, Mark D; Mackenzie, Alistair; Chakraborty, Dev P; Bosmans, Hilde; Dance, David R; Young, Kenneth C

    2014-08-01

    OBJECTIVE. The objective of our study was to investigate the effect of image processing on the detection of cancers in digital mammography images. MATERIALS AND METHODS. Two hundred seventy pairs of breast images (both breasts, one view) were collected from eight systems using Hologic amorphous selenium detectors: 80 image pairs showed breasts containing subtle malignant masses; 30 image pairs, biopsy-proven benign lesions; 80 image pairs, simulated calcification clusters; and 80 image pairs, no cancer (normal). The 270 image pairs were processed with three types of image processing: standard (full enhancement), low contrast (intermediate enhancement), and pseudo-film-screen (no enhancement). Seven experienced observers inspected the images, locating and rating regions they suspected to be cancer for likelihood of malignancy. The results were analyzed using a jackknife-alternative free-response receiver operating characteristic (JAFROC) analysis. RESULTS. The detection of calcification clusters was significantly affected by the type of image processing: The JAFROC figure of merit (FOM) decreased from 0.65 with standard image processing to 0.63 with low-contrast image processing (p = 0.04) and from 0.65 with standard image processing to 0.61 with film-screen image processing (p = 0.0005). The detection of noncalcification cancers was not significantly different among the image-processing types investigated (p > 0.40). CONCLUSION. These results suggest that image processing has a significant impact on the detection of calcification clusters in digital mammography. For the three image-processing versions and the system investigated, standard image processing was optimal for the detection of calcification clusters. The effect on cancer detection should be considered when selecting the type of image processing in the future.

  14. Analysis of multiparty mediation processes

    NARCIS (Netherlands)

    Vuković, Siniša

    2013-01-01

    Crucial challenges for multiparty mediation processes include the achievement of adequate cooperation among the mediators and consequent coordination of their activities in the mediation process. Existing literature goes only as far as to make it clear that successful mediation requires necessary

  15. Experimental analysis of armouring process

    Science.gov (United States)

    Lamberti, Alberto; Paris, Ennio

    Preliminary results from an experimental investigation on armouring processes are presented. Particularly, the process of development and formation of the armour layer under different steady flow conditions has been analyzed in terms of grain size variations and sediment transport rate associated to each size fraction.

  16. Gaussian process regression analysis for functional data

    CERN Document Server

    Shi, Jian Qing

    2011-01-01

    Gaussian Process Regression Analysis for Functional Data presents nonparametric statistical methods for functional regression analysis, specifically the methods based on a Gaussian process prior in a functional space. The authors focus on problems involving functional response variables and mixed covariates of functional and scalar variables.Covering the basics of Gaussian process regression, the first several chapters discuss functional data analysis, theoretical aspects based on the asymptotic properties of Gaussian process regression models, and new methodological developments for high dime

  17. Discontinuity Detection for Analysis of Telerobot Trajectories

    Science.gov (United States)

    Yeom, Kiwon; Ellis, Stephen R.; Adelstein, Bernard D.

    2013-01-01

    The goal is to identify spatial and temporal discontinuities in telerobot movement in order to describe the shift in operators' control and error-correction strategies from continuous control to move-and-wait strategies. This shift was studied under conditions of simulated, increasingly time-delayed teleoperation. The ultimate goal is to determine whether the time delay associated with the shift is invariant with independently imposed control difficulty. We expect this shift to manifest itself as changes in the number of discontinuities in the movement path. We propose a spatial and temporal discontinuity detection algorithm for the analysis of teleoperated trajectories in three-dimensional space. The algorithm provides a simple and potentially objective method for detecting discontinuities during telerobot operation and for evaluating the difficulty of the rotational coordinate condition in teleoperation.
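    One plausible way to flag the temporal discontinuities (move-and-wait pauses) in a sampled trajectory is to threshold frame-to-frame speed. This is a hedged sketch, not the paper's algorithm; the threshold, sampling interval, and trajectory are illustrative.

```python
import numpy as np

def count_pauses(positions, dt, speed_thresh):
    """Count segments where speed drops below speed_thresh (move-and-wait)."""
    pos = np.asarray(positions, dtype=float)
    speed = np.linalg.norm(np.diff(pos, axis=0), axis=1) / dt
    slow = speed < speed_thresh
    # Count rising edges of the 'slow' mask = number of distinct pauses.
    return int(np.sum(np.diff(slow.astype(int)) == 1) + slow[0])

# A toy 3-D trajectory: steady motion, one pause, then motion again.
traj = [(0, 0, 0), (1, 0, 0), (2, 0, 0), (2, 0, 0), (2, 0, 0), (3, 0, 0)]
pauses = count_pauses(traj, dt=0.1, speed_thresh=1.0)
```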

  18. Process analysis and metrics of complex organisational processes

    OpenAIRE

    Bošković, Dražen

    2008-01-01

    The model of process analysis and process metrics is presented for complex organisational processes, as applied in the management of construction contract documents. The model was applied on four test samples of public-works clients. The qualitative data collected during the survey were converted into quantitative ones using the modified Likert scale. It was demonstrated that the organisational capacity is at the basic level, with an insufficient integration and optimisation of processes, and...

  19. Automatic detection and severity measurement of eczema using image processing.

    Science.gov (United States)

    Alam, Md Nafiul; Munia, Tamanna Tabassum Khan; Tavakolian, Kouhyar; Vasefi, Fartash; MacKinnon, Nick; Fazel-Rezai, Reza

    2016-08-01

    Chronic skin diseases like eczema may lead to severe health and financial consequences for patients if not detected and controlled early. Early measurement of disease severity, combined with a recommendation for skin protection and use of appropriate medication, can prevent the disease from worsening. Current diagnosis can be costly and time-consuming. In this paper, an automatic eczema detection and severity measurement model is presented using modern image processing and computer algorithms. The system can successfully detect regions of eczema and classify the identified region as mild or severe based on image color and texture features. The model then automatically measures the skin parameters used in the most common assessment tool, the "Eczema Area and Severity Index (EASI)," by computing an eczema affected-area score, an eczema intensity score, and a body region score, allowing both patients and physicians to accurately assess the affected skin.

  20. Detecting damped Ly α absorbers with Gaussian processes

    Science.gov (United States)

    Garnett, Roman; Ho, Shirley; Bird, Simeon; Schneider, Jeff

    2017-12-01

    We develop an automated technique for detecting damped Ly α absorbers (DLAs) along spectroscopic lines of sight to quasi-stellar objects (QSOs or quasars). The detection of DLAs in large-scale spectroscopic surveys such as SDSS III sheds light on galaxy formation at high redshift, showing the nucleation of galaxies from diffuse gas. We use nearly 50 000 QSO spectra to learn a novel tailored Gaussian process model for quasar emission spectra, which we apply to the DLA detection problem via Bayesian model selection. We propose models for identifying an arbitrary number of DLAs along a given line of sight. We demonstrate our method's effectiveness using a large-scale validation experiment, with excellent performance. We also provide a catalogue of our results applied to 162 858 spectra from SDSS-III data release 12.
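    A greatly simplified stand-in for the method can be sketched with plain numpy: instead of Bayesian model selection over a learned quasar-emission GP, fit a long-length-scale GP posterior mean to a synthetic spectrum and flag wavelengths where the flux falls far below that smooth continuum. The kernel, the synthetic absorber, and the 0.3 threshold are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
wave = np.linspace(0.0, 1.0, 200)
dip = 0.7 * np.exp(-0.5 * ((wave - 0.6) / 0.01) ** 2)    # narrow absorber
flux = 1.0 - dip + 0.02 * rng.standard_normal(wave.size)

def rbf_kernel(x, length_scale=0.3):
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / length_scale) ** 2)

# GP posterior mean: a long length scale models only the smooth continuum,
# so a narrow absorption feature is not tracked by the fit.
K = rbf_kernel(wave)
y = flux - flux.mean()
mean = K @ np.linalg.solve(K + 0.02**2 * np.eye(wave.size), y) + flux.mean()

candidates = wave[(mean - flux) > 0.3]    # flux well below the GP continuum
```

    The paper's actual approach compares model evidences for spectra with and without absorbers rather than thresholding residuals.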

  1. Analysis of Android Device-Based Solutions for Fall Detection.

    Science.gov (United States)

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-07-23

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation on automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm, or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions.

  2. Detection, information fusion, and temporal processing for intelligence in recognition

    Energy Technology Data Exchange (ETDEWEB)

    Casasent, D. [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    1996-12-31

    The use of intelligence in vision recognition involves many different techniques or tools. This presentation discusses several of these techniques for recognition. The recognition process is generally separated into several steps or stages when implemented in hardware, e.g. detection, segmentation and enhancement, and recognition. Several new distortion-invariant filters, biologically inspired Gabor wavelet filter techniques, and morphological operations that have been found very useful for detection and clutter rejection are discussed. These are all shift-invariant operations that allow multiple object regions of interest in a scene to be located in parallel. We also discuss new algorithm fusion concepts by which the results from different detection algorithms are combined to reduce detection false alarms; these fusion methods utilize hierarchical processing and fuzzy logic concepts. We have found this to be most necessary, since no single detection algorithm is best for all cases. For the final recognition stage, we describe a new method of representing all distorted versions of different classes of objects and determining the object class and pose that most closely matches that of a given input. Besides being efficient in terms of storage and on-line computations required, it overcomes many of the problems that other classifiers have in terms of the required training set size, poor generalization with many hidden-layer neurons, etc. It is also attractive in its ability to reject input regions as clutter (non-objects) and to learn new object descriptions. We also discuss its use in processing a temporal sequence of input images of the contents of each local region of interest. We note how this leads to robust results in which estimation errors in individual frames can be overcome. This seems very practical, because in many scenarios a decision need not be made after only one frame of data; subsequent frames of data enter immediately in sequence.

  3. Detection of subsurface defects in metal materials using infrared thermography; Image processing and finite element modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ranjit, Shrestha; Kim, Won Tae [Dept. of Mechanical Engineering, Kongju National University, Cheonan (Korea, Republic of)

    2014-04-15

    Infrared thermography is an emerging approach to non-contact, non-intrusive, and non-destructive inspection of various solid materials such as metals, composites, and semiconductors for industrial and research interests. In this study, data processing was applied to infrared thermography measurements to detect defects in metals that were widely used in industrial fields. When analyzing experimental data from infrared thermographic testing, raw images were often not appropriate. Thus, various data analysis methods were used at the pre-processing and processing levels in data processing programs for quantitative analysis of defect detection and characterization; these increased the infrared non-destructive testing capabilities since subtle defects signature became apparent. A 3D finite element simulation was performed to verify and analyze the data obtained from both the experiment and the image processing techniques.

  4. Sound Event Detection for Music Signals Using Gaussian Processes

    Directory of Open Access Journals (Sweden)

    Pablo A. Alvarado-Durán

    2013-11-01

    Full Text Available In this paper we present a new methodology for detecting sound events in music signals using Gaussian Processes. Our method first takes a time-frequency representation, i.e. the spectrogram, of the input audio signal. Second, the spectrogram dimension is reduced by translating the linear Hertz frequency scale into the logarithmic Mel frequency scale using a triangular filter bank. Finally, every short-time spectrum, i.e. every Mel spectrogram column, is classified as “Event” or “Not Event” by a Gaussian Process Classifier. We compare our method with other widely used event detection techniques. To do so, we use MATLAB® to program each technique and test them using two datasets of music with different levels of complexity. Results show that the new methodology outperforms the standard approaches, achieving an improvement of about 1.66% on dataset one and 0.45% on dataset two in terms of F-measure.
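    The triangular Mel filter bank front end described above can be sketched in a few lines of numpy. The filter count, FFT size, and sample rate are illustrative choices, and the GP classifier itself is not reproduced here.

```python
import numpy as np

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_filter_bank(n_filters=20, n_fft=512, sr=16000):
    """Rows are triangular filters over the positive-frequency FFT bins."""
    mels = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2), n_filters + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mels) / sr).astype(int)
    fb = np.zeros((n_filters, n_fft // 2 + 1))
    for i in range(1, n_filters + 1):
        lo, mid, hi = bins[i - 1], bins[i], bins[i + 1]
        for k in range(lo, mid):             # rising edge of the triangle
            fb[i - 1, k] = (k - lo) / max(mid - lo, 1)
        for k in range(mid, hi):             # falling edge of the triangle
            fb[i - 1, k] = (hi - k) / max(hi - mid, 1)
    return fb

fb = mel_filter_bank()
frame = np.random.default_rng(2).standard_normal(512)   # one audio frame
spectrum = np.abs(np.fft.rfft(frame))
mel_spectrum = fb @ spectrum          # one Mel spectrogram column
```

    Each resulting Mel column would then be fed to the classifier as a feature vector.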

  5. IMAGE PROCESSING FOR DETECTION OF ORAL WHITE SPONGE NEVUS LESIONS

    Directory of Open Access Journals (Sweden)

    Rajdeep Mitra

    2016-12-01

    Full Text Available White Sponge Nevus is a rare hereditary disease in humans that causes incurable white lesions of the oral mucosa. An appropriate history and clinical examination, along with biopsy and cytological studies, are helpful for diagnosis of this disorder. Identification can also be made in an alternative way by applying image processing techniques, using watershed segmentation with MATLAB software. The applied techniques are effective and reliable for early, accurate detection of the disease as an alternative to expert clinical examination and time-consuming laboratory investigations.

  6. Process and apparatus for detecting presence of plant substances

    Energy Technology Data Exchange (ETDEWEB)

    Kirby, J.A.

    1990-01-01

    Disclosed is a process for detecting the presence of plant substances in a particular environment which comprises the steps of: (1) Measuring the background K40 gamma ray radiation level in a particular environment with a 1.46 MeV gamma ray counter system; (2) measuring the amount of K40 gamma ray radiation emanating from a package containing said plant substance being passed through said environment with said counter; and (3) generating an alarm signal when the total K40 gamma ray radiation reaches a predetermined level over and above the background level. Also disclosed is the apparatus and system used to conduct the process.
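    The three steps of the disclosed process reduce to simple threshold logic. A minimal sketch, with invented count values and alarm margin:

```python
# Step 3 of the disclosed process: raise an alarm when the package's K-40
# gamma counts exceed the background level by a predetermined margin.

def k40_alarm(background_counts, package_counts, margin=50):
    """True when package K-40 counts exceed background by the preset margin."""
    return (package_counts - background_counts) > margin

bg = 120          # step 1: background 1.46 MeV gamma count
pkg = 310         # step 2: count with the package passing the counter
alarm = k40_alarm(bg, pkg)   # step 3: compare against the preset level
```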

  8. [Application of image processing technique in scoliosis detection].

    Science.gov (United States)

    Lu, Donghui; Xu, Chaojing; Sun, Jinai

    2012-08-01

    Scoliosis, the abnormal lateral curvature of the spine, is an idiopathic disease often suffered by teenagers. Normally medical doctors use X-rays to measure the Cobb angle, and then assess the severity of scoliosis with it. In this paper, we point out the advantages of image processing techniques by analyzing the existing methods for the diagnosis of scoliosis. Two kinds of image processing technique are mainly introduced for scoliosis detection. Moiré stripe images show an asymmetric deformation pattern between the left-hand side and the right-hand side of the human back, and this asymmetry is correlated with the Cobb angle to detect scoliosis. The second technique checks for scoliosis through accurate three-dimensional surface features of the human back: different optical imaging methods are used to reconstruct a three-dimensional surface model of the patient's back, and the characteristics of the reconstructed surface are extracted to diagnose the disease. Both approaches try to describe the symmetry of the human back and correlate it with the Cobb angle. Finally, we look forward to future developments in the application of image processing techniques for scoliosis detection.

  9. Detection of outliers by neural network on the gas centrifuge experimental data of isotopic separation process

    International Nuclear Information System (INIS)

    Andrade, Monica de Carvalho Vasconcelos

    2004-01-01

    This work presents and discusses a neural network technique aimed at the detection of outliers in a set of gas centrifuge isotope separation experimental data. In order to evaluate the application of this new technique, the result of the detection is compared to the result of statistical analysis combined with cluster analysis. This method for the detection of outliers presents considerable potential in the field of data analysis; it is at the same time easier and faster to use and requires much less knowledge of the physics involved in the process. This work established a procedure for detecting experiments suspected to contain gross errors within a data set where the usual techniques for identifying such errors cannot be applied, or where their use would demand excessive effort. (author)

  10. EXOPLANETARY DETECTION BY MULTIFRACTAL SPECTRAL ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Agarwal, Sahil; Wettlaufer, John S. [Program in Applied Mathematics, Yale University, New Haven, CT (United States); Sordo, Fabio Del [Department of Astronomy, Yale University, New Haven, CT (United States)

    2017-01-01

    Owing to technological advances, the number of exoplanets discovered has risen dramatically in the last few years. However, when trying to observe Earth analogs, it is often difficult to test the veracity of detection. We have developed a new approach to the analysis of exoplanetary spectral observations based on temporal multifractality, which identifies timescales that characterize planetary orbital motion around the host star and those that arise from stellar features such as spots. Without fitting stellar models to spectral data, we show how the planetary signal can be robustly detected from noisy data using noise amplitude as a source of information. For observation of transiting planets, combining this method with simple geometry allows us to relate the timescales obtained to primary and secondary eclipse of the exoplanets. Making use of data obtained with ground-based and space-based observations we have tested our approach on HD 189733b. Moreover, we have investigated the use of this technique in measuring planetary orbital motion via Doppler shift detection. Finally, we have analyzed synthetic spectra obtained using the SOAP 2.0 tool, which simulates a stellar spectrum and the influence of the presence of a planet or a spot on that spectrum over one orbital period. We have demonstrated that, so long as the signal-to-noise-ratio ≥ 75, our approach reconstructs the planetary orbital period, as well as the rotation period of a spot on the stellar surface.

  11. Detection and analysis of CRISPRs of Shigella.

    Science.gov (United States)

    Guo, Xiangjiao; Wang, Yingfang; Duan, Guangcai; Xue, Zerun; Wang, Linlin; Wang, Pengfei; Qiu, Shaofu; Xi, Yuanlin; Yang, Haiyan

    2015-01-01

    The recently discovered CRISPRs (clustered regularly interspaced short palindromic repeats) and Cas (CRISPR-associated) proteins are a novel genetic barrier that limits horizontal gene transfer in prokaryotes, and CRISPR loci provide a historical view of the exposure of prokaryotes to a variety of foreign genetic elements. The aim of this study was to investigate the occurrence and distribution of CRISPRs in Shigella. A collection of 61 Shigella strains was screened for the existence of CRISPRs. Three CRISPR loci were identified among the 61 strains. CRISPR1/cas loci were detected in 49 strains, although IS elements were detected in the cas genes of some strains. In the remaining 12 Shigella flexneri strains, the CRISPR1/cas locus is deleted and only a cas3' pseudogene and a repeat sequence are present. The presence of CRISPR2 is frequently accompanied by the presence of CRISPR1. CRISPR3 loci were present in almost all strains (52/61). The length of the CRISPR arrays varied from 1 to 9 spacers. Sequence analysis of the CRISPR arrays revealed that few spacers had matches in the GenBank databases. However, one spacer in the CRISPR3 loci matches the cognate cas3 genes, and no cas gene is present around the CRISPR3 region. Analysis of the CRISPR sequences shows that the CRISPRs change little, which makes them poor genotyping markers. The present study is the first attempt to determine and analyze CRISPRs of Shigella isolated from clinical patients.

  12. Process Correlation Analysis Model for Process Improvement Identification

    Directory of Open Access Journals (Sweden)

    Su-jin Choi

    2014-01-01

    software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  13. Combustion Analysis and Knock Detection in Single Cylinder DI-Diesel Engine Using Vibration Signature Analysis

    OpenAIRE

    Y.V.V.SatyanarayanaMurthy

    2011-01-01

    The purpose of this paper is to detect the “knock” in diesel engines, which adversely deteriorates engine performance. The methodology introduced in the present work suggests a newly developed approach to the vibration analysis of diesel engines. The method is based on the fundamental relationship between the engine vibration pattern and the relative characteristics of the combustion process in each or different cylinders. Knock in a diesel engine is detected by measuring the vibra...

  14. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  15. Improved Anomaly Detection using Integrated Supervised and Unsupervised Processing

    Science.gov (United States)

    Hunt, B.; Sheppard, D. G.; Wetterer, C. J.

    There are two broad technologies of signal processing applicable to space object feature identification using nonresolved imagery: supervised processing analyzes a large set of data for common characteristics that can then be used to identify, transform, and extract information from new data taken of the same given class (e.g. support vector machine); unsupervised processing utilizes detailed physics-based models that generate comparison data that can then be used to estimate parameters presumed to be governed by the same models (e.g. estimation filters). Both processes have been used in non-resolved space object identification and yield similar results, yet arrived at using vastly different processes. The goal of integrating the results of the two is to achieve an even greater performance by building on the process diversity. Specifically, both supervised processing and unsupervised processing will jointly operate on the analysis of brightness (radiometric flux intensity) measurements reflected by space objects and observed by a ground station to determine whether a particular day conforms to a nominal operating mode (as determined from a training set) or exhibits anomalous behavior where a particular parameter (e.g. attitude, solar panel articulation angle) has changed in some way. It is demonstrated in a variety of different scenarios that the integrated process achieves a greater performance than each of the separate processes alone.

  16. Algorithms Development in Detection of the Gelatinization Process during Enzymatic ‘Dodol’ Processing

    Directory of Open Access Journals (Sweden)

    Azman Hamzah

    2013-09-01

    Full Text Available Computer vision systems have found wide application in the food processing industry to perform quality evaluation. Such systems make it possible to replace human inspectors for the evaluation of a variety of quality attributes. This paper describes the implementation of Fast Fourier Transform and Kalman filtering algorithms to detect glutinous rice flour slurry (GRFS) gelatinization in enzymatic 'dodol' processing. The onset of GRFS gelatinization is critical in determining the quality of an enzymatic 'dodol'. Combinations of these two algorithms were able to detect the gelatinization of the GRFS. The results show that gelatinization of the GRFS occurred in the time range of 11.75 minutes to 14.75 minutes for 24 batches of processing. This paper highlights the capability of computer vision, using our proposed algorithms, in the monitoring and control of enzymatic 'dodol' processing via image processing technology.
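    The Kalman filtering half of the approach can be sketched as a scalar filter tracking a noisy feature (for instance an FFT-derived texture measure), with the gelatinization onset read off as a threshold crossing. The noise variances, synthetic feature, and 0.5 threshold are illustrative assumptions, not the paper's values.

```python
import numpy as np

def kalman_1d(z, q=1e-4, r=0.04):
    """Constant-state scalar Kalman filter; returns the filtered estimates."""
    x, p = z[0], 1.0
    out = []
    for zi in z:
        p += q                      # predict: add process noise variance
        k = p / (p + r)             # Kalman gain
        x += k * (zi - x)           # update with measurement zi
        p *= (1 - k)
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(3)
t = np.arange(200)
# Noisy feature that steps up when gelatinization (hypothetically) begins.
feature = np.where(t < 120, 0.2, 0.8) + 0.1 * rng.standard_normal(t.size)
smooth = kalman_1d(feature)
onset = int(np.argmax(smooth > 0.5))    # first crossing = estimated onset
```

    The filtering lag (here a few tens of samples) is the price paid for suppressing the measurement noise.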

  17. Does facial processing prioritize change detection?: change blindness illustrates costs and benefits of holistic processing.

    Science.gov (United States)

    Wilford, Miko M; Wells, Gary L

    2010-11-01

    There is broad consensus among researchers both that faces are processed more holistically than other objects and that this type of processing is beneficial. We predicted that holistic processing of faces also involves a cost, namely, a diminished ability to localize change. This study (N = 150) utilized a modified change-blindness paradigm in which some trials involved a change in one feature of an image (nose, chin, mouth, hair, or eyes for faces; chimney, porch, window, roof, or door for houses), whereas other trials involved no change. People were better able to detect the occurrence of a change for faces than for houses, but were better able to localize which feature had changed for houses than for faces. Half the trials used inverted images, a manipulation that disrupts holistic processing. With inverted images, the critical interaction between image type (faces vs. houses) and task (change detection vs. change localization) disappeared. The results suggest that holistic processing reduces change-localization abilities.

  18. Image processing, analysis, measurement, and quality

    International Nuclear Information System (INIS)

    Hughes, G.W.; Mantey, P.E.; Rogowitz, B.E.

    1988-01-01

    Topics covered in these proceedings include: image acquisition, image processing and analysis, electronic vision, IR imaging, measurement and quality, spatial vision and spatial sampling, and contrast-detail curve measurement and analysis in radiological imaging

  19. Mesh Processing in Medical Image Analysis

    DEFF Research Database (Denmark)

    The following topics are dealt with: mesh processing; medical image analysis; interactive freeform modeling; statistical shape analysis; clinical CT images; statistical surface recovery; automated segmentation; cerebral aneurysms; and real-time particle-based representation.

  20. Detection of outliers in processing of small size data

    Directory of Open Access Journals (Sweden)

    Popukaylo V. S.

    2016-10-01

    Full Text Available This article examines the power of outlier-detection criteria as a function of small sample size. Removing outliers is one of the stages of signal pre-processing. To address this problem, a statistical experiment was conducted in which a random number generator produced arrays of data containing several thousand normally distributed samples, with given means and standard deviations for each value of n. The capabilities of the Grubbs, Dixon, Tietjen-Moore, Irwin, Chauvenet, Lvovsky, and Romanovsky criteria were thus investigated and vividly illustrated for sample sizes from 5 to 20 measurements. Conclusions were drawn about the applicability of each criterion for outlier detection in small samples. The Lvovsky criterion was recognized as optimal. Dixon's criterion is recommended for n ≤ 10 and Irwin's criterion when n ≥ 10. The Tietjen-Moore criterion can be recommended for detecting outliers in small samples for n > 5, since it recognizes errors well at values of x̄ + 4σ and has the fewest Type I errors. Grubbs' criterion with an unknown standard deviation may be used for samples with n ≥ 15. The Chauvenet and Romanovsky criteria cannot be recommended for detecting outliers in small samples.
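    A minimal two-sided Grubbs-style screen of the kind compared above can be sketched with the standard library. The 1.89 limit approximates the tabulated Grubbs critical value for n = 6 at the 5% level; a real implementation would look the critical value up for each n and significance level.

```python
import statistics

def flag_outlier(sample, limit=1.89):
    """Return the most extreme value if its Grubbs statistic exceeds limit."""
    mean = statistics.fmean(sample)
    sd = statistics.stdev(sample)
    worst = max(sample, key=lambda v: abs(v - mean))
    g = abs(worst - mean) / sd          # Grubbs test statistic
    return worst if g > limit else None

data = [9.8, 10.1, 10.0, 9.9, 10.2, 14.0]   # n = 6, one gross error
outlier = flag_outlier(data)
```

    After removal the test is normally repeated on the reduced sample, since a single screen can mask a second outlier.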

  1. Stochastic process for white matter injury detection in preterm neonates

    Directory of Open Access Journals (Sweden)

    Irene Cheng

    2015-01-01

    Full Text Available Preterm births are rising in Canada and worldwide. As clinicians strive to identify preterm neonates at greatest risk of significant developmental or motor problems, accurate predictive tools are required. Infants at highest risk would then be able to receive early developmental interventions, and clinicians would be able to implement and evaluate new methods to improve outcomes. While severe white matter injury (WMI) is associated with adverse developmental outcome, more subtle injuries are difficult to identify and their association with later impairments remains unknown. Thus, our goal was to develop an automated method for detection and visualization of brain abnormalities in MR images acquired in very preterm born neonates. We have developed a technique to detect WMI in T1-weighted images acquired in 177 very preterm born infants (24–32 weeks gestation). Our approach uses a stochastic process that estimates the likelihood of intensity variations in nearby pixels, with small variations being more likely than large variations. We first detect the boundaries between normal and injured regions of the white matter. Following this, we use a measure of pixel similarity to identify WMI regions. Our algorithm is able to detect WMI in all of the images in the ground-truth dataset, with some false positives in situations where the white matter region is not segmented accurately.
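    A toy version of the stated idea, scoring each pixel by how strongly its intensity deviates from its neighbours, with large local variations treated as unlikely, can be sketched in numpy. The Gaussian-style deviation score, the 4-neighbour window, and the threshold are illustrative and not the paper's actual stochastic model.

```python
import numpy as np

def anomaly_map(img, sigma=10.0, thresh=3.0):
    """Flag pixels whose intensity deviates strongly from their neighbours."""
    img = np.asarray(img, dtype=float)
    pad = np.pad(img, 1, mode="edge")
    # Mean of the 4-connected neighbours of every pixel.
    neigh = (pad[:-2, 1:-1] + pad[2:, 1:-1] +
             pad[1:-1, :-2] + pad[1:-1, 2:]) / 4.0
    z = np.abs(img - neigh) / sigma       # deviation in 'sigma' units
    return z > thresh                     # True = candidate injury pixel

img = np.full((8, 8), 100.0)              # uniform synthetic "tissue"
img[4, 4] = 180.0                         # one anomalously bright pixel
mask = anomaly_map(img)
```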

  2. Digital radar-gram processing for water pipelines leak detection

    Science.gov (United States)

    García-Márquez, Jorge; Flores, Ricardo; Valdivia, Ricardo; Carreón, Dora; Malacara, Zacarías; Camposeco, Arturo

    2006-02-01

    Ground penetrating radars (GPR) are useful underground exploration devices, with applications in archaeology, mine detection and pavement evaluation, among others. Here we use a GPR to detect, in an indirect way, the anomalies caused by the presence of water in the neighborhood of an underground water pipeline. By Fourier transforming a GPR profile map, we interpret the signal in terms of the spatial frequencies, rather than the temporal frequencies, that compose the profile map. This allows differentiating between signals returning from a standard subsoil feature and those coming back from anomalous zones. Facilities in Mexican cities are commonly buried up to 2.5 m deep. Their constituent materials are PVC, concrete or metal, typically steel. GPRs are ultra-wide band devices; leak detection must be an indirect process, since echoes due to the presence of underground zones with high moisture levels are masked by dense reflections (clutter). In radargrams the presence of water is visualized as anomalies in the neighborhood of the facility. Enhancement of these anomalies gives us the information required to detect leaks.
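    The spatial-frequency interpretation can be sketched with a two-dimensional Fourier transform of a toy radargram; all values below are synthetic, and this is an assumption-laden illustration rather than the authors' processing chain:

```python
import numpy as np

# Toy "radargram": depth (rows) by scan position (columns).
rows, cols = 64, 128
y, x = np.mgrid[0:rows, 0:cols]
background = np.sin(2 * np.pi * y / 16.0)                  # layered echoes
anomaly = np.exp(-((x - 80) ** 2 + (y - 30) ** 2) / 20.0)  # moist zone
radargram = background + anomaly

# Fourier transform the profile map: coefficients now index spatial
# frequencies rather than raw time/position samples.
spectrum = np.fft.fft2(radargram)
magnitude = np.abs(np.fft.fftshift(spectrum))

# The periodic layering concentrates energy at one spatial frequency,
# while the localized anomaly spreads energy across many, so band
# energies can help separate standard subsoil returns from anomalous zones.
print(magnitude.shape)  # (64, 128)
```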

  3. Preclosure Criticality Analysis Process Report

    International Nuclear Information System (INIS)

    Thomas, D.A.

    1999-01-01

    The design approach for criticality of the disposal container and waste package will be dictated by existing regulatory requirements. This conclusion is based on the fact that preclosure operations and facilities have significant similarities to existing facilities and operations currently regulated by the NRC. The major difference would be the use of a risk-informed approach with burnup credit, which could reduce licensing delays and costs of the repository. The probability of success for this proposed seamless licensing strategy is increased, since there is regulatory precedent (10 CFR Part 63 and NUREG 1520) and commercial precedent for allowing burnup credit at sites similar to Yucca Mountain during preclosure. While NUREG 1520 is not directly applicable to a facility for handling spent nuclear fuel, the risk-informed approach to criticality analysis in NUREG 1520 is considered indicative of how the NRC will approach risk-informed criticality analysis at spent fuel facilities in the future. The types of design basis events which must be considered during the criticality safety analysis portion of the Integrated Safety Analysis (ISA) are those events which result in unanticipated moderation, loss of neutron absorber, geometric changes in the critical system, or administrative errors in waste form placement (loading) of the disposal container. The specific events to be considered must be based on a review of the system's design, as discussed in Section 3.2. A transition of licensing approach (e.g., deterministic versus risk-informed, performance-based) is not obvious and will require analysis. For commercial spent nuclear fuel, the probability of interspersed moderation may be low enough to allow nearly the same Critical Limit for both preclosure and postclosure, though an administrative margin will be applied to preclosure and possibly not to postclosure.
Similarly the Design Basis Events for the waste package may be incredible and therefore not

  4. Discontinuity Detection in the Shield Metal Arc Welding Process.

    Science.gov (United States)

    Cocota, José Alberto Naves; Garcia, Gabriel Carvalho; da Costa, Adilson Rodrigues; de Lima, Milton Sérgio Fernandes; Rocha, Filipe Augusto Santos; Freitas, Gustavo Medeiros

    2017-05-10

    This work proposes a new methodology for the detection of discontinuities in the weld bead applied in Shielded Metal Arc Welding (SMAW) processes. The detection system is based on two sensors, a microphone and a piezoelectric transducer, that acquire acoustic emissions generated during welding. The feature vectors extracted from the sensor dataset are used to construct classifier models. The approaches based on Artificial Neural Network (ANN) and Support Vector Machine (SVM) classifiers are able to identify with high accuracy the three proposed weld bead classes: desirable weld bead, shrinkage cavity and burn-through discontinuities. Experimental results illustrate the system's high accuracy, greater than 90% for each class. A novel Hierarchical Support Vector Machine (HSVM) structure is proposed to make the use of this system feasible in industrial environments. This approach achieved 96.6% overall accuracy. Given the simplicity of the equipment involved, this system can be applied in the metal transformation industries.
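    Since the paper's ANN/SVM/HSVM models are not reproduced here, a minimal linear SVM trained by hinge-loss subgradient descent can stand in for the classification step; the two "acoustic features" and all values below are invented for the sketch:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """Minimal linear SVM via hinge-loss subgradient descent."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) < 1:      # margin violated: hinge update
                w += lr * (yi * xi - lam * w)
                b += lr * yi
            else:                          # only the regularizer acts
                w -= lr * lam * w
    return w, b

# Invented two-feature acoustic descriptors per weld segment.
X = np.array([[0.2, 0.1], [0.3, 0.2], [0.25, 0.15],   # desirable bead
              [0.9, 0.8], [1.0, 0.9], [0.95, 0.85]])  # discontinuity
y = np.array([-1, -1, -1, 1, 1, 1])

w, b = train_linear_svm(X, y)
predictions = np.sign(X @ w + b)
print(bool((predictions == y).all()))  # separable toy data classified correctly
```

    A hierarchical structure like the paper's HSVM would chain several such binary classifiers, first separating good beads from defects and then distinguishing defect types.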

  5. Elastic recoil detection analysis of ferroelectric films

    Energy Technology Data Exchange (ETDEWEB)

    Stannard, W.B.; Johnston, P.N.; Walker, S.R.; Bubb, I.F. [Royal Melbourne Inst. of Tech., VIC (Australia); Scott, J.F. [New South Wales Univ., Kensington, NSW (Australia); Cohen, D.D.; Dytlewski, N. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1996-12-31

    There has been considerable progress in developing SrBi2Ta2O9 (SBT) and Ba0.7Sr0.3TiO3 (BST) ferroelectric films for use as nonvolatile memory chips and for capacitors in dynamic random access memories (DRAMs). Ferroelectric materials have a very large dielectric constant (≈1000), approximately one hundred times greater than that of silicon dioxide. Devices made from these materials have been known to experience breakdown after repeated voltage pulsing. It has been suggested that this is related to stoichiometric changes within the material. To accurately characterise these materials, Elastic Recoil Detection Analysis (ERDA) is being developed. This technique employs a high energy heavy ion beam to eject nuclei from the target and uses a time of flight and energy dispersive (ToF-E) detector telescope to detect these nuclei. The recoil nuclei carry both energy and mass information, which enables the determination of separate energy spectra for individual elements or for small groups of elements. In this work, ERDA employing 77 MeV ¹²⁷I ions has been used to analyse Strontium Bismuth Tantalate thin films at the heavy ion recoil facility at ANSTO, Lucas Heights. 9 refs., 5 figs.

  6. Automatic Detection and Resolution of Lexical Ambiguity in Process Models

    NARCIS (Netherlands)

    Pittke, F.; Leopold, H.; Mendling, J.

    2015-01-01

    System-related engineering tasks are often conducted using process models. In this context, it is essential that these models do not contain structural or terminological inconsistencies. To this end, several automatic analysis techniques have been proposed to support quality assurance. While formal

  7. Vision based error detection for 3D printing processes

    Directory of Open Access Journals (Sweden)

    Baumann Felix

    2016-01-01

    Full Text Available 3D printers became more popular in the last decade, partly because of the expiration of key patents and the supply of affordable machines. Their origin lies in rapid prototyping. With Additive Manufacturing (AM) it is possible to create physical objects from 3D model data by layer-wise addition of material. Besides professional use for prototyping and low volume manufacturing, they are becoming widespread amongst end users, starting with the so-called Maker Movement. The most prevalent type of consumer grade 3D printer is Fused Deposition Modelling (FDM), also called Fused Filament Fabrication (FFF). This work focuses on FDM machinery because of its widespread occurrence and large number of open problems, such as precision and failure. These 3D printers can fail to print objects at a statistical rate depending on the manufacturer and model of the printer. Failures can occur due to misalignment of the print-bed or the print-head, slippage of the motors, warping of the printed material, lack of adhesion, or other reasons. The goal of this research is to provide an environment in which these failures can be detected automatically. Direct supervision is inhibited by the recommended placement of FDM printers in separate rooms away from the user due to ventilation issues. The inability to oversee the printing process leads to late or omitted detection of failures. Rejects cause wasted material and time, thus lowering the utilization of printing resources. Our approach consists of a camera-based error detection mechanism that provides a web-based interface for remote supervision and early failure detection. Early failure detection can lead to less time spent on broken prints, less material wasted, and in some cases salvaged objects.
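    A camera-based check of this kind can be sketched, under strong simplifying assumptions (synthetic grayscale frames, an invented deviation threshold, no perspective or lighting handling), as a frame-differencing test against the expected print state:

```python
import numpy as np

def frame_deviation(reference, current):
    """Mean absolute grayscale difference between the expected view of
    the print and the current camera frame (float arrays in [0, 1])."""
    return float(np.mean(np.abs(reference - current)))

# Synthetic frames: the object as expected vs. shifted off its position.
expected = np.zeros((48, 48))
expected[10:40, 20:30] = 1.0       # printed object where it should be

detached = np.zeros((48, 48))
detached[10:40, 5:15] = 1.0        # object displaced (e.g. lost adhesion)

FAIL_THRESHOLD = 0.05              # invented tuning parameter
print(frame_deviation(expected, expected) <= FAIL_THRESHOLD)  # True
print(frame_deviation(expected, detached) > FAIL_THRESHOLD)   # True
```

    A real system would compare camera frames against renders of the sliced model at each layer; the thresholding idea, however, is the same.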

  8. Differential thermal analysis microsystem for explosive detection

    Science.gov (United States)

    Olsen, Jesper K.; Greve, Anders; Senesac, L.; Thundat, T.; Boisen, A.

    2011-06-01

    A micro differential thermal analysis (DTA) system is used for the detection of trace explosive particles. The DTA system consists of two silicon micro chips with integrated heaters and temperature sensors; one chip is used as a reference and one for the measurement sample. The sensor is constructed as a small silicon nitride membrane incorporating heater elements and a temperature measurement resistor. In this manuscript the DTA system is described and tested by measuring the calorimetric response of three different kinds of explosives (TNT, RDX and PETN). This project is carried out within the framework of the Xsense project at the Technical University of Denmark (DTU), which combines four independent sensing techniques; these micro DTA sensors will be included in handheld explosives detectors with applications in homeland security and landmine clearance.

  9. High efficiency processing for reduced amplitude zones detection in the HRECG signal

    Science.gov (United States)

    Dugarte, N.; Álvarez, A.; Balacco, J.; Mercado, G.; Gonzalez, A.; Dugarte, E.; Olivares, A.

    2016-04-01

    Summary - This article presents part of a more detailed research effort proposed for the medium to long term, with the intention of establishing a new philosophy of surface electrocardiogram analysis. This research aims to find indicators of cardiovascular disease in its early stages that may go unnoticed with conventional electrocardiography. This paper reports the development of processing software that collects some existing techniques and incorporates novel methods for the detection of reduced amplitude zones (RAZ) in the high resolution electrocardiographic signal (HRECG). The algorithm consists of three stages: an efficient QRS detection process, an averaging filter using correlation techniques, and a step for RAZ detection. Preliminary results show the efficiency of the system and point to the incorporation of new signal analysis techniques involving the 12 leads.
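    The first stage, QRS detection, can be illustrated with a minimal energy-thresholding sketch on a synthetic trace; the sampling rate, spike model and thresholds are assumptions, not the authors' algorithm:

```python
import numpy as np

fs = 500                                  # assumed sampling rate, Hz
rng = np.random.default_rng(2)
signal = 0.02 * rng.normal(size=5 * fs)   # baseline noise
beat_samples = [400, 900, 1400, 1900, 2400]
for b in beat_samples:
    signal[b] += 1.0                      # idealized R peaks

# Threshold the instantaneous energy, then enforce a refractory period
# so each QRS complex yields a single detection.
energy = signal ** 2
threshold = 10 * energy.mean()
above = np.nonzero(energy > threshold)[0]
peaks = [int(above[0])]
for idx in above[1:]:
    if idx - peaks[-1] > 0.2 * fs:        # 200 ms refractory window
        peaks.append(int(idx))

print(peaks)  # [400, 900, 1400, 1900, 2400]
```

    The detected fiducial points would then anchor the correlation-based beat averaging of the second stage.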

  10. Infrared landmine detection and thermal model analysis

    NARCIS (Netherlands)

    Schwering, P.B.W.; Kokonozi, A.; Carter, L.J.; Lensen, H.A.; Franken, E.M.

    2001-01-01

    Infrared imagers are capable of the detection of surface laid mines. Several sensor fused land mine detection systems make use of metal detectors, ground penetrating radar and infrared imagers. Infrared detection systems are sensitive to apparent temperature contrasts and their detection

  11. Process and apparatus for detecting presence of plant substances

    Energy Technology Data Exchange (ETDEWEB)

    Kirby, J.A.

    1991-04-16

    This patent describes an apparatus and process for detecting the presence of plant substances in a particular environment. It comprises: measuring the background K40 gamma ray radiation level in a particular environment with a 1.46 MeV gamma ray counter system; measuring the amount of K40 gamma ray radiation emanating from a package containing a plant substance being passed through an environment with a counter; and generating an alarm signal when the total K40 gamma ray radiation reaches a predetermined level over and above the background level.

  12. DETECTING GLASS SURFACE CORROSION WITH IMAGE PROCESSING TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Rafet AKDENİZ

    2012-12-01

    Full Text Available Glass is a kind of amorphous material that exhibits a transition from a rigid to a viscous state, and finally a liquid state, when heated. For daily usage, it is desirable to have different forms and different transparencies for different purposes. The most widely used is the form with high transparency and a flat surface. One of the detrimental effects that glass undergoes during storage or usage is corrosion. In this work, a method for detecting corrosion on the glass surface by image processing is presented.

  13. Terrain Mapping and Obstacle Detection Using Gaussian Processes

    DEFF Research Database (Denmark)

    Kjærgaard, Morten; Massaro, Alessandro Salvatore; Bayramoglu, Enis

    2011-01-01

    In this paper we consider a probabilistic method for extracting terrain maps from a scene and use the information to detect potential navigation obstacles within it. The method uses Gaussian process regression (GPR) to predict an estimated function and its relative uncertainty. To test the new methods, we have arranged two setups: an artificial flat surface with an object in front of the sensors and an outdoor unstructured terrain. Two sensor types have been used to determine the point cloud fed to the system: a 3D laser scanner and a stereo camera pair. The results from both sensor systems...
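    The core GPR prediction step can be sketched in closed form; this toy uses a 1-D "terrain profile" with an assumed RBF kernel and noise level, whereas the actual system works on point clouds with tuned hyperparameters:

```python
import numpy as np

def rbf(a, b, length=1.0):
    """Squared-exponential kernel between two 1-D input sets."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-2):
    """Closed-form GP regression: predictive mean and pointwise variance."""
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_test, x_train)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    cov = rbf(x_test, x_test) - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

# Toy 1-D "terrain profile": a few noisy height measurements.
x_train = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y_train = np.array([0.0, 0.5, 0.4, 1.2, 1.0])
x_test = np.array([1.5, 10.0])     # one point near data, one far away

mean, var = gp_predict(x_train, y_train, x_test)
# Predictive uncertainty is small near observations and large far from
# them, which is what flags unreliable regions of the terrain map.
print(bool(var[0] < var[1]))  # True
```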

  14. Automatic processing of isotopic dilution curves obtained by precordial detection

    International Nuclear Information System (INIS)

    Verite, J.C.

    1973-01-01

    Dilution curves pose two distinct problems: that of their acquisition and that of their processing. A study devoted to the latter aspect only is presented here. Two important conditions had to be satisfied: the processing procedure, although applied to a single category of curves (isotopic dilution curves obtained by precordial detection), had to be as general as possible; and, to allow dissemination of the method, the equipment used had to be relatively modest and inexpensive. A simple method, treating curve processing as a problem of process identification, was developed and should enable the mean heart cavity volume and certain pulmonary circulation parameters to be determined. Considerable difficulties were encountered, limiting the value of the results obtained though not condemning the method itself. The curve processing question raises the problem of curve acquisition, i.e. the number of these curves and their meaning. A list of the difficulties encountered is followed by a set of possible solutions, a solution being understood to mean a curve processing combination in which the overlap between the two aspects of the problem is accounted for.

  15. Advances in face detection and facial image analysis

    CERN Document Server

    Celebi, M; Smolka, Bogdan

    2016-01-01

    This book presents the state-of-the-art in face detection and analysis. It outlines new research directions, including in particular psychology-based facial dynamics recognition, aimed at various applications such as behavior analysis, deception detection, and diagnosis of various psychological disorders. Topics of interest include face and facial landmark detection, face recognition, facial expression and emotion analysis, facial dynamics analysis, face classification, identification, and clustering, and gaze direction and head pose estimation, as well as applications of face analysis.

  16. Computer-assisted image processing to detect spores from the fungus Pandora neoaphidis.

    Science.gov (United States)

    Korsnes, Reinert; Westrum, Karin; Fløistad, Erling; Klingen, Ingeborg

    2016-01-01

    This contribution demonstrates an example of experimental automatic image analysis to detect spores prepared on microscope slides derived from trapping. The application is to monitor aerial spore counts of the entomopathogenic fungus Pandora neoaphidis, which may serve as a biological control agent for aphids. Automatic detection of such spores can therefore play a role in plant protection. The present approach for such detection is a modification of traditional manual microscopy of prepared slides, where autonomous image recording precedes computerised image analysis. The purpose of the present image analysis is to support human visual inspection of imagery data, not to replace it. The workflow has three components:
    •Preparation of slides for microscopy.
    •Image recording.
    •Computerised image processing, where the initial part is, as usual, segmentation depending on the actual data product, followed by identification of blobs, calculation of the principal axes of the blobs, symmetry operations, and projection onto a three-parameter egg-shape space.
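    The blob-measurement part of such a workflow can be sketched with second-order image moments on a synthetic binary mask; this illustrates principal-axis calculation only, not the full spore classifier:

```python
import numpy as np

def principal_axes(mask):
    """Principal axes of a binary blob from second-order image moments.
    Returns the eigenvalues of the covariance of blob pixel coordinates,
    largest first; their square roots scale the semi-axes, the kind of
    shape descriptor used to screen egg-shaped objects."""
    ys, xs = np.nonzero(mask)
    coords = np.stack([ys - ys.mean(), xs - xs.mean()])
    cov = coords @ coords.T / coords.shape[1]
    return np.sort(np.linalg.eigvalsh(cov))[::-1]

# Synthetic elongated blob (10 rows x 30 columns of "spore" pixels).
mask = np.zeros((40, 60), dtype=bool)
mask[15:25, 10:40] = True

major, minor = principal_axes(mask)
# An elongated blob has a clearly larger major-axis eigenvalue.
print(bool(major / minor > 4))  # True
```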

  17. Nonlinear Process Fault Diagnosis Based on Serial Principal Component Analysis.

    Science.gov (United States)

    Deng, Xiaogang; Tian, Xuemin; Chen, Sheng; Harris, Chris J

    2018-03-01

    Many industrial processes contain both linear and nonlinear parts, and kernel principal component analysis (KPCA), widely used in nonlinear process monitoring, may not offer the most effective means for dealing with these nonlinear processes. This paper proposes a new hybrid linear-nonlinear statistical modeling approach for nonlinear process monitoring by closely integrating linear principal component analysis (PCA) and nonlinear KPCA using a serial model structure, which we refer to as serial PCA (SPCA). Specifically, PCA is first applied to extract PCs as linear features, and to decompose the data into the PC subspace and residual subspace (RS). Then, KPCA is performed in the RS to extract the nonlinear PCs as nonlinear features. Two monitoring statistics are constructed for fault detection, based on both the linear and nonlinear features extracted by the proposed SPCA. To effectively perform fault identification after a fault is detected, an SPCA similarity factor method is built for fault recognition, which fuses both the linear and nonlinear features. Unlike PCA and KPCA, the proposed method takes into account both linear and nonlinear PCs simultaneously, and therefore, it can better exploit the underlying process's structure to enhance fault diagnosis performance. Two case studies involving a simulated nonlinear process and the benchmark Tennessee Eastman process demonstrate that the proposed SPCA approach is more effective than the existing state-of-the-art approach based on KPCA alone, in terms of nonlinear process fault detection and identification.
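    The serial structure, linear PCA followed by kernel PCA on the residual subspace, can be sketched in numpy under simplifying assumptions (toy data, one linear PC, an RBF kernel of assumed width); this illustrates the idea, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy process data: one linear latent factor plus one nonlinear one.
n = 200
t = rng.normal(size=n)
u = rng.uniform(-1, 1, size=n)
X = np.stack([t, 2 * t + 0.1 * rng.normal(size=n), u ** 2], axis=1)
X = X - X.mean(axis=0)

# Step 1: linear PCA -- keep the leading PC, form the residual subspace.
U, S, Vt = np.linalg.svd(X, full_matrices=False)
k = 1
linear_scores = X @ Vt[:k].T             # linear features
residual = X - linear_scores @ Vt[:k]    # what linear PCA cannot explain

# Step 2: kernel PCA (RBF) on the residuals -- nonlinear features.
def rbf_kernel(A, B, gamma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

K = rbf_kernel(residual, residual)
one = np.ones((n, n)) / n
Kc = K - one @ K - K @ one + one @ K @ one    # center in feature space
eigvals, eigvecs = np.linalg.eigh(Kc)
nonlinear_scores = Kc @ eigvecs[:, -2:]       # two leading kernel PCs

# Both feature sets would then feed T^2/SPE-style monitoring statistics.
print(linear_scores.shape, nonlinear_scores.shape)  # (200, 1) (200, 2)
```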

  18. Uncertainty in the process of risk analysis

    International Nuclear Information System (INIS)

    Lovecek, Tomas; Kampova, Katarina

    2009-01-01

    In risk analysis we often try to describe and assess various quantities which express the surrounding risk environment and in this way influence its existence and size. In quantifying these risk factors we come across the problem of uncertainty, which lies in the fact that the values of these factors are unknown at the time of analysis and thus complicate the risk analysis process. In this article we deal with the interconnections of this uncertainty as well as with the possibilities of its interpretation and quantification in the risk analysis process. Key words: Risk, Risk Analysis, Uncertainty, Numerical Methods, Exact Methods

  19. Edge detection - Image-plane versus digital processing

    Science.gov (United States)

    Huck, Friedrich O.; Fales, Carl L.; Park, Stephen K.; Triplett, Judith A.

    1987-01-01

    To optimize edge detection with the familiar Laplacian-of-Gaussian operator, it has become common to implement this operator with a large digital convolution mask followed by some interpolation of the processed data to determine the zero crossings that locate edges. It is generally recognized that this large mask causes substantial blurring of fine detail. It is shown that the spatial detail can be improved by a factor of about four with either the Wiener-Laplacian-of-Gaussian filter or an image-plane processor. The Wiener-Laplacian-of-Gaussian filter minimizes the image-gathering degradations if the scene statistics are at least approximately known and also serves as an interpolator to determine the desired zero crossings directly. The image-plane processor forms the Laplacian-of-Gaussian response by properly combining the optical design of the image-gathering system with a minimal three-by-three lateral-inhibitory processing mask. This approach, which is suggested by Marr's model of early processing in human vision, also reduces data processing by about two orders of magnitude and data transmission by up to an order of magnitude.
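    The zero-crossing localization that both processors target can be illustrated with a generic discrete Laplacian-of-Gaussian on a synthetic step edge; this sketch is neither the Wiener-LoG filter nor the three-by-three image-plane mask discussed above:

```python
import numpy as np

def gaussian_kernel(sigma, radius):
    x = np.arange(-radius, radius + 1, dtype=float)
    g = np.exp(-x ** 2 / (2 * sigma ** 2))
    return g / g.sum()

def smooth(image, sigma=2.0):
    """Separable Gaussian blur (zero-padded 'same' convolution)."""
    g = gaussian_kernel(sigma, radius=int(3 * sigma))
    out = np.apply_along_axis(lambda r: np.convolve(r, g, mode="same"), 1, image)
    return np.apply_along_axis(lambda c: np.convolve(c, g, mode="same"), 0, out)

def laplacian(image):
    """Discrete 5-point Laplacian on the interior (no wrap-around)."""
    lap = np.zeros_like(image)
    lap[1:-1, 1:-1] = (image[:-2, 1:-1] + image[2:, 1:-1]
                       + image[1:-1, :-2] + image[1:-1, 2:]
                       - 4.0 * image[1:-1, 1:-1])
    return lap

# Step-edge test image: dark left half, bright right half.
image = np.zeros((64, 64))
image[:, 32:] = 1.0

log_response = laplacian(smooth(image))

# Zero crossings of the LoG response along a middle row localize the edge.
row = log_response[32, 20:45]
signs = np.sign(row)
crossings = np.nonzero((signs[:-1] > 0) & (signs[1:] < 0))[0] + 20
print(crossings.tolist())  # [31] -- immediately left of the edge at column 32
```

    Finding the crossing between adjacent samples is the interpolation step that the Wiener-LoG formulation folds into the filter itself.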

  20. Effects of image processing on the detective quantum efficiency

    Science.gov (United States)

    Park, Hye-Suk; Kim, Hee-Joung; Cho, Hyo-Min; Lee, Chang-Lae; Lee, Seung-Wan; Choi, Yu-Na

    2010-04-01

    Digital radiography has gained popularity in many areas of clinical practice. This transition brings interest in advancing methodologies for image quality characterization. However, as the methodologies for such characterization have not been standardized, the results of these studies cannot be directly compared. The primary objective of this study was to standardize methodologies for image quality characterization. The secondary objective was to evaluate how the modulation transfer function (MTF), the noise power spectrum (NPS), and the detective quantum efficiency (DQE) are affected by the image processing algorithm. Image performance parameters such as the MTF, NPS, and DQE were evaluated using the international electro-technical commission (IEC 62220-1)-defined RQA5 radiographic techniques. Computed radiography (CR) images of a hand posterior-anterior (PA) view for measuring the signal to noise ratio (SNR), a slit image for measuring the MTF, and a white image for measuring the NPS were obtained, and various Multi-Scale Image Contrast Amplification (MUSICA) parameters were applied to each of the acquired images. In the results, all of the modifications had a considerable influence on the evaluated SNR, MTF, NPS, and DQE. Images modified by the post-processing had a higher DQE than the MUSICA=0 image. This suggests that MUSICA values, as a post-processing step, affect the image when image quality is evaluated. In conclusion, the control parameters of image processing should be accounted for when evaluating and characterizing image quality in the same way. The results of this study could serve as a baseline to evaluate imaging systems and their imaging characteristics by measuring the MTF, NPS, and DQE.

  1. Video Traffic Analysis for Abnormal Event Detection

    Science.gov (United States)

    2010-01-01

    We propose the use of video imaging sensors for the detection and classification of abnormal events to be used primarily for mitigation of traffic congestion. Successful detection of such events will allow for new road guidelines; for rapid deploymen...

  2. Video traffic analysis for abnormal event detection.

    Science.gov (United States)

    2010-01-01

    We propose the use of video imaging sensors for the detection and classification of abnormal events to : be used primarily for mitigation of traffic congestion. Successful detection of such events will allow for : new road guidelines; for rapid deplo...

  3. Applications of TOPS Anomaly Detection Framework to Amazon Drought Analysis

    Science.gov (United States)

    Votava, P.; Nemani, R. R.; Ganguly, S.; Michaelis, A.; Hashimoto, H.

    2011-12-01

    Terrestrial Observation and Prediction System (TOPS) is a flexible modeling software system that integrates ecosystem models with frequent satellite and surface weather observations to produce ecosystem nowcasts (assessments of current conditions) and forecasts useful in natural resources management, public health and disaster management. We have been extending TOPS to include a capability for automated anomaly detection and analysis of both on-line (streaming) and off-line data. While there are a large number of anomaly detection algorithms for multivariate datasets, we are extending this capability beyond anomaly detection itself towards an automated analysis that would discover the possible causes of the anomalies. In order to best capture the knowledge about data hierarchies, Earth science models and implied dependencies between anomalies and occurrences of observable events such as urbanization, deforestation, or fires, we have developed an ontology to serve as a knowledge base. The knowledge is captured using the OWL ontology language, where connections are defined in a schema that is later extended by including specific instances of datasets and models. We have integrated this knowledge base with a framework for deploying an ensemble of anomaly detection algorithms on large volumes of Earth science datasets and applied it to specific scientific applications that support research conducted by our group. In one early application, we were able to process a large number of MODIS, TRMM, and CERES datasets along with ground-based weather and river flow observations to detect the evolution of the 2010 drought in the Amazon, identify the affected area, and publish the results in three weeks. A similar analysis of the 2005 drought using the same datasets took nearly two years, highlighting the potential contribution of our anomaly framework in accelerating scientific discoveries.
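    A much simpler stand-in for such detectors, a trailing-window z-score test on a synthetic drought-like series, illustrates the basic flagging step that an ensemble member might perform; the window length, threshold and data are assumptions, not TOPS algorithms:

```python
import numpy as np

def zscore_anomalies(series, window=12, threshold=3.0):
    """Flag indices deviating more than `threshold` trailing-window
    standard deviations from the trailing-window mean."""
    flags = []
    for i in range(window, len(series)):
        past = series[i - window:i]
        mu, sigma = past.mean(), past.std()
        if sigma > 0 and abs(series[i] - mu) > threshold * sigma:
            flags.append(i)
    return flags

# Synthetic monthly series with one abrupt drought-like deficit.
rng = np.random.default_rng(1)
series = 100 + rng.normal(0, 5, size=60)
series[45] = 40.0

print(45 in zscore_anomalies(series))  # True: the deficit is flagged
```

    In the framework described above, such flags would then be passed to the ontology-driven analysis that searches for plausible causes.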

  4. Improved extraction of prolamins for gluten detection in processed foods

    Directory of Open Access Journals (Sweden)

    P. KANERVA

    2008-12-01

    Full Text Available A problem in gluten analysis has been the inconsistent extractability of prolamins, particularly from processed foods consisting of unknown proportions of prolamins from wheat, barley, and rye. This study aimed at improving the extraction of prolamins for immunological analysis, regardless of the cereal species and the production process. The prolamins were extracted with varying concentrations of ethanol, 1-propanol, and 2-propanol. Sodium dodecyl sulphate-polyacrylamide gel electrophoresis and Western blotting were applied to study the protein composition of the extracts and the antibody recognition of the prolamin subgroups. We characterized the affinities of prolamin-specific antibodies used in gluten analysis against the prolamin groups that were soluble in 40% 1-propanol. The antibody R5 recognized the medium-molecular-weight groups, including polymeric proteins, more strongly, and the high-molecular-weight groups less strongly, than the anti-ω-gliadin antibody. In the present study, the prolamins were most efficiently extracted by 40% 1-propanol with 1% dithiothreitol at 50 °C. The prolamins were extracted from processed bread samples with efficiency similar to that from untreated meal samples.

  5. Dimensionality reduction using Principal Component Analysis for network intrusion detection

    Directory of Open Access Journals (Sweden)

    K. Keerthi Vasan

    2016-09-01

    Full Text Available Intrusion detection is the identification of malicious activities in a given network by analyzing its traffic. Data mining techniques used for this analysis study the traffic traces and identify hostile flows in the traffic. Dimensionality reduction in data mining focuses on representing data with the minimum number of dimensions such that its properties are not lost, hence reducing the underlying complexity in processing the data. Principal Component Analysis (PCA) is one of the prominent dimensionality reduction techniques widely used in network traffic analysis. In this paper, we focus on the efficiency of PCA for intrusion detection and determine its Reduction Ratio (RR), the ideal number of Principal Components needed for intrusion detection, and the impact of noisy data on PCA. We carried out experiments with PCA using various classifier algorithms on two benchmark datasets, namely KDD CUP and UNB ISCX. Experiments show that the first 10 Principal Components are effective for classification. The classification accuracy using 10 Principal Components is about 99.7% and 98.8%, nearly the same as the accuracy obtained using the original 41 features for KDD and 28 features for ISCX, respectively.
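    The dimensionality-reduction step can be sketched via the singular value decomposition; the synthetic low-rank data below stands in for KDD-style records, and the feature counts are illustrative only:

```python
import numpy as np

def pca_reduce(X, k):
    """Project zero-meaned data onto its k leading principal components,
    returning the scores and the fraction of variance they retain."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = (S ** 2) / (S ** 2).sum()
    return Xc @ Vt[:k].T, explained[:k].sum()

# Synthetic "41-feature" traffic records where only a few latent
# directions carry variance (stand-in for KDD-style data).
rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 5))
mixing = rng.normal(size=(5, 41))
X = latent @ mixing + 0.01 * rng.normal(size=(500, 41))

reduced, var_kept = pca_reduce(X, k=10)
print(reduced.shape)          # (500, 10)
print(bool(var_kept > 0.99))  # 10 PCs retain almost all the variance
```

    A classifier trained on `reduced` then operates on 10 columns instead of 41, which is the source of the speedup reported above.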

  6. Particle contamination formation and detection in magnetron sputtering processes

    Energy Technology Data Exchange (ETDEWEB)

    Selwyn, G.S. [Los Alamos National Lab., NM (United States); Weiss, C.A. [Materials Research Corp., Congers, NY (United States). Sputtering Systems Div.; Sequeda, F.; Huang, C. [Seagate Peripherals Disk Div., Milpitas, CA (United States)

    1996-10-01

    Defects caused by particulate contamination are an important concern in the fabrication of thin film products, for which magnetron sputtering processes are often used. Particle contamination can cause electrical shorting, pin holes, problems with photolithography, adhesion failure, as well as visual and cosmetic defects. Particle contamination generated during thin film processing can be detected using laser light scattering, a powerful diagnostic technique that provides real-time, in-situ imaging of particles > 0.3 μm in diameter. Using this technique, the causes, sources and influences on particles in plasma and non-plasma processes may be independently evaluated and corrected. Several studies employing laser light scattering have demonstrated both homogeneous and heterogeneous causes of particle contamination. In this paper, we demonstrate that the mechanisms for particle generation, transport and trapping during magnetron sputter deposition are different from the mechanisms reported in previously studied plasma etch processes. During magnetron sputter deposition, one source of particle contamination is linked to portions of the sputtering target surface exposed to weaker plasma density. In this region, film redeposition is followed by filament or nodule growth and enhanced trapping, which further increases filament growth. Eventually the filaments effectively "short circuit" the sheath, causing high currents to flow through these features. This, in turn, causes heating failure of the filaments, fracturing them and ejecting them into the plasma and onto the substrate. Evidence of this effect has been observed in semiconductor (IC) fabrication and storage disk manufacturing. Discovery of this mechanism in both technologies suggests that it may be universal to many sputtering processes.

  7. Effects of image processing on the detective quantum efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Park, Hye-Suk; Kim, Hee-Joung; Cho, Hyo-Min; Lee, Chang-Lae; Lee, Seung-Wan; Choi, Yu-Na [Yonsei University, Wonju (Korea, Republic of)

    2010-02-15

    The evaluation of image quality is an important part of digital radiography. The modulation transfer function (MTF), the noise power spectrum (NPS), and the detective quantum efficiency (DQE) are widely accepted measurements of digital radiographic system performance. However, as the methodologies for such characterization have not been standardized, it is difficult to directly compare reported MTF, NPS, and DQE results. In this study, we evaluated the effect of an image processing algorithm on the estimated MTF, NPS, and DQE. The image performance parameters were evaluated using the international electro-technical commission (IEC 62220-1)-defined RQA5 radiographic techniques. Computed radiography (CR) posterior-anterior (PA) images of a hand for measuring the signal to noise ratio (SNR), slit images for measuring the MTF, and white images for measuring the NPS were obtained, and various Multi-Scale Image Contrast Amplification (MUSICA) factors were applied to each of the acquired images. All of the modifications of the images obtained by using image processing had a considerable influence on the evaluated image quality. In conclusion, the control parameters of image processing should be accounted for when evaluating and characterizing image quality in the same way. The results of this study should serve as a baseline for evaluating imaging systems and their imaging characteristics by MTF, NPS, and DQE measurements.

  8. Effects of image processing on the detective quantum efficiency

    International Nuclear Information System (INIS)

    Park, Hye-Suk; Kim, Hee-Joung; Cho, Hyo-Min; Lee, Chang-Lae; Lee, Seung-Wan; Choi, Yu-Na

    2010-01-01

    The evaluation of image quality is an important part of digital radiography. The modulation transfer function (MTF), the noise power spectrum (NPS), and the detective quantum efficiency (DQE) are widely accepted measurements of digital radiographic system performance. However, as the methodologies for such characterization have not been standardized, it is difficult to directly compare reported MTF, NPS, and DQE results. In this study, we evaluated the effect of an image processing algorithm on estimates of the MTF, NPS, and DQE. The image performance parameters were evaluated using the international electro-technical commission (IEC 62220-1)-defined RQA5 radiographic techniques. Computed radiography (CR) posterior-anterior (PA) images of a hand for measuring the signal to noise ratio (SNR), slit images for measuring the MTF, and white images for measuring the NPS were obtained, and various multi-scale image contrast amplification (MUSICA) factors were applied to each of the acquired images. All of the modifications of the images obtained by using image processing had a considerable influence on the evaluated image quality. In conclusion, the control parameters of image processing must be taken into account when characterizing image quality in a consistent way. The results of this study should serve as a baseline for evaluating imaging systems and their imaging characteristics by MTF, NPS, and DQE measurements.

  9. Integrated polymer waveguides for absorbance detection in chemical analysis systems

    DEFF Research Database (Denmark)

    Mogensen, Klaus Bo; El-Ali, Jamil; Wolff, Anders

    2003-01-01

    A chemical analysis system for absorbance detection with integrated polymer waveguides is reported for the first time. The fabrication procedure relies on structuring of a single layer of the photoresist SU-8, so both the microfluidic channel network and the optical components, which include planar...... waveguides and fiber-to-waveguide coupler structures, are defined in the same processing step. This results in self-alignment of all components and enables a fabrication and packaging time of only one day. The fabrication scheme has recently been presented elsewhere for fluorescence excitation of beads...

  10. Computerized Analysis and Detection of Missed Cancer in Screening Mammogram

    National Research Council Canada - National Science Library

    Li, Lihua

    2004-01-01

    This project is to explore an innovative CAD strategy for improving early detection of breast cancer in screening mammograms by focusing on computerized analysis and detection of cancers missed by radiologists...

  11. Computer Analysis and Detection of Missed Cancer in Screening Mammogram

    National Research Council Canada - National Science Library

    Li, Lihua

    2006-01-01

    This project is to explore an innovative CAD strategy for improving early detection of breast cancer in screening mammograms by focusing on computerized analysis and detection of cancers missed by radiologists...

  12. Computerized Analysis and Detection of Missed Cancer in Screening Mammogram

    National Research Council Canada - National Science Library

    Li, Lihua

    2007-01-01

    This project is to explore an innovative CAD strategy for improving early detection of breast cancer in screening mammograms by focusing on computerized analysis and detection of cancers missed by radiologists...

  13. Computerized Analysis and Detection of Missed Cancer in Screening Mammogram

    National Research Council Canada - National Science Library

    Li, Lihua

    2005-01-01

    This project is to explore an innovative CAD strategy for improving early detection of breast cancer in screening mammograms by focusing on computerized analysis and detection of cancers missed by radiologists...

  14. A process for detecting an interface by neutron diffusion

    International Nuclear Information System (INIS)

    1976-01-01

    The invention concerns a process and an apparatus for detecting an interface between substances having different hydrogen contents, present in a metal (steel) pipe or receptacle. It can concern the interface of a diphasic flow (hydrocarbons or water in the presence of air or other gases) or the level of an organic solid powdery substance containing hydrogen. A neutron source containing 0.1 to 1 μg of 252Cf and a neutron detector are placed on or near the external face of the steel wall. The neutron detector is located no more than 50 cm from the neutron source; it is a counter tube filled with He-3, more sensitive to neutrons slowed by collision with the hydrogen than to the fast neutrons of the source. In particular, the invention makes it possible to detect liquids (water, hydrocarbons) present in gas conveying facilities, the presence of foam, or the level of a fluidised bed of polymer substances. It can be used for warning or control purposes [fr

  15. Detecting geomorphic processes and change with high resolution topographic data

    Science.gov (United States)

    Mudd, Simon; Hurst, Martin; Grieve, Stuart; Clubb, Fiona; Milodowski, David; Attal, Mikael

    2016-04-01

    The first global topographic dataset was released in 1996, with 1 km grid spacing. It is astonishing that in only 20 years we now have access to tens of thousands of square kilometres of LiDAR data at point densities greater than 5 points per square meter. This data represents a treasure trove of information that our geomorphic predecessors could only dream of. But what are we to do with this data? Here we explore the potential of high resolution topographic data to dig deeper into geomorphic processes across a wider range of landscapes and with much larger spatial coverage than previously possible. We show how this data can be used to constrain sediment flux relationships using relief and hillslope length, and how it can be used to detect landscape transience. We show how the nonlinear sediment flux law, proposed for upland, soil-mantled landscapes by Roering et al. (1999), is consistent with a number of topographic tests. This flux law allows us to predict how landscapes will respond to tectonic forcing, and we show how these predictions can be used to detect erosion rate perturbations across a range of tectonic settings.
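    The nonlinear flux law cited above predicts sediment flux proportional to slope at low gradients but diverging as the gradient approaches a critical slope. A minimal sketch, with illustrative (not site-specific) parameter values:

```python
import numpy as np

def roering_flux(slope, K=0.005, Sc=1.0):
    """Nonlinear hillslope sediment flux law (Roering et al., 1999):
    q_s = K*S / (1 - (S/Sc)^2). Near-linear for S << Sc, diverging as
    S -> Sc. K (m^2/yr) and the critical slope Sc are assumed values."""
    slope = np.asarray(slope, dtype=float)
    return K * slope / (1.0 - (slope / Sc) ** 2)
```

    Doubling the gradient from 0.45 to 0.9 increases the predicted flux roughly eightfold here, illustrating the nonlinearity that distinguishes this law from a simple slope-proportional one.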

  16. Damage Detection in Composite Structures with Wavenumber Array Data Processing

    Science.gov (United States)

    Tian, Zhenhua; Leckey, Cara; Yu, Lingyu

    2013-01-01

    Guided ultrasonic waves (GUW) have the potential to be an efficient and cost-effective method for rapid damage detection and quantification of large structures. Attractive features include sensitivity to a variety of damage types and the capability of traveling relatively long distances. They have proven to be an efficient approach for crack detection and localization in isotropic materials. However, techniques must be pushed beyond isotropic materials in order to be valid for composite aircraft components. This paper presents our study on GUW propagation and interaction with delamination damage in composite structures using wavenumber array data processing, together with advanced wave propagation simulations. Parallel elastodynamic finite integration technique (EFIT) is used for the example simulations. Multi-dimensional Fourier transform is used to convert time-space wavefield data into frequency-wavenumber domain. Wave propagation in the wavenumber-frequency domain shows clear distinction among the guided wave modes that are present. This allows for extracting a guided wave mode through filtering and reconstruction techniques. Presence of delamination causes spectral change accordingly. Results from 3D CFRP guided wave simulations with delamination damage in flat-plate specimens are used for wave interaction with structural defect study.
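    The frequency-wavenumber step described above (a multi-dimensional Fourier transform of time-space wavefield data) can be sketched with NumPy. The sampling steps, tone frequency, and wavenumber below are illustrative assumptions, not values from the study:

```python
import numpy as np

# Hypothetical single-mode wavefield u(t, x) along a line scan
dt, dx = 1e-7, 1e-3              # time and space sampling steps (s, m)
t = np.arange(256) * dt
x = np.arange(128) * dx
f0, k0 = 200e3, 400.0            # 200 kHz tone with wavenumber 400 rad/m
u = np.sin(2 * np.pi * f0 * t[:, None] - k0 * x[None, :])

# Multi-dimensional Fourier transform: time-space -> frequency-wavenumber
U = np.fft.fftshift(np.fft.fft2(u))
freqs = np.fft.fftshift(np.fft.fftfreq(len(t), dt))              # Hz
waven = np.fft.fftshift(np.fft.fftfreq(len(x), dx)) * 2 * np.pi  # rad/m

# The dominant spectral peak recovers the mode's (frequency, wavenumber) pair;
# a single mode could then be isolated by masking a band and inverting the FFT
i, j = np.unravel_index(np.argmax(np.abs(U)), U.shape)
peak_f, peak_k = abs(freqs[i]), abs(waven[j])
```

    In the paper's setting the same transform separates overlapping guided wave modes, since each mode occupies its own region of the frequency-wavenumber plane.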

  17. Three-dimensional model analysis and processing

    CERN Document Server

    Yu, Faxin; Luo, Hao; Wang, Pinghui

    2011-01-01

    This book focuses on five hot research directions in 3D model analysis and processing in computer science:  compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.

  18. Artificial intelligence applied to process signal analysis

    Science.gov (United States)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  19. Process integration analysis of an industrial hydrogen production process

    Energy Technology Data Exchange (ETDEWEB)

    Tock, Laurence; Marechal, Francois; Metzger, Christian [EPFL (CH). Industrial Energy Systems Lab. (LENI); Arpentinier, Philippe [AIR LIQUIDE (France). Research center Claude-Delorme

    2010-07-01

    The energy efficiency of an industrial hydrogen production process using steam methane reforming (SMR) combined with the water gas shift reaction (WGS) is analyzed using process integration techniques based on heat cascade calculation and pinch analysis with the aim of identifying potential measures to enhance the process performance. The challenge is to satisfy the high temperature heat demand of the SMR reaction by minimizing the consumption of natural gas to feed the combustion and to exploit at maximum the heat excess at low temperature by producing valuable steam or electricity or by performing cogeneration. By applying a systematic methodology based on energy-flow models, process integration techniques and a multi-objective optimization procedure, the process performances defined by the specific natural gas consumption and the specific steam or electricity production are optimized and analyzed for different operating conditions (i.e. air preheating, pre-reforming/reforming, WGS temperature) and process modification options like pre-reformer integration. Identified measures are to increase the production of exportable steam by consuming the entire waste heat and optimizing the steam production pressure level, and to reduce the natural gas consumption by adjusting process parameters. By these measures the performance can be varied between 0.53-0.59 kmol natural gas/kmol H{sub 2} for the specific total natural gas consumption and 1.8-3.7 kmol steam/kmol H{sub 2} for the specific steam production. (orig.)

  20. Radar signal analysis and processing using Matlab

    CERN Document Server

    Mahafza, Bassem R

    2008-01-01

    Offering radar-related software for the analysis and design of radar waveform and signal processing, this book provides comprehensive coverage of radar signals and signal processing techniques and algorithms. It contains numerous graphical plots, common radar-related functions, table format outputs, and end-of-chapter problems. The complete set of MATLAB[registered] functions and routines are available for download online.

  1. Normality Analysis for RFI Detection in Microwave Radiometry

    Directory of Open Access Journals (Sweden)

    Adriano Camps

    2009-12-01

    Full Text Available Radio-frequency interference (RFI) present in microwave radiometry measurements leads to erroneous radiometric results. Sources of RFI include spurious signals and harmonics from lower frequency bands, spread-spectrum signals overlapping the “protected” band of operation, or out-of-band emissions not properly rejected by the pre-detection filters due to their finite rejection. The presence of RFI in the radiometric signal modifies the detected power and therefore the estimated antenna temperature from which the geophysical parameters will be retrieved. In recent years, techniques to detect the presence of RFI in radiometric measurements have been developed. They include time- and/or frequency-domain analyses, or time- and/or frequency-domain statistical analysis of the received signal which, in the absence of RFI, must be a zero-mean Gaussian process. Statistical analyses performed to date include the calculation of the Kurtosis, and the Shapiro-Wilk normality test of the received signal. Nevertheless, statistical analysis of the received signal could be more extensive, as reported in the Statistics literature. The objective of this work is the study of the performance of a number of normality tests encountered in the Statistics literature when applied to the detection of the presence of RFI in the radiometric signal, which is Gaussian by nature. A description of the normality tests and the RFI detection results for different kinds of RFI are presented in view of determining an omnibus test that can deal with the blind spots of the currently used methods.
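    A minimal sketch of the statistical tests mentioned above, using SciPy's kurtosis and Shapiro-Wilk implementations on synthetic data. The sinusoidal RFI model and its amplitude are assumptions for illustration only:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 2000
clean = rng.normal(0.0, 1.0, n)                    # RFI-free: zero-mean Gaussian
rfi = clean + 2.0 * np.sin(2 * np.pi * 0.05 * np.arange(n))  # assumed sinusoidal RFI

# Excess kurtosis: ~0 for Gaussian noise; a strong sinusoid drives it negative
k_clean = stats.kurtosis(clean)
k_rfi = stats.kurtosis(rfi)

# Shapiro-Wilk normality test: a small p-value flags a non-Gaussian (RFI) signal
_, p_clean = stats.shapiro(clean)
_, p_rfi = stats.shapiro(rfi)
```

    The paper's point is precisely that different RFI types evade different tests (low-level sinusoids, for instance, barely move the kurtosis), which motivates searching for an omnibus test.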

  2. Detection and Analysis of Threats to the Energy Sector: DATES

    Energy Technology Data Exchange (ETDEWEB)

    Alfonso Valdes

    2010-03-31

    This report summarizes Detection and Analysis of Threats to the Energy Sector (DATES), a project sponsored by the United States Department of Energy and performed by a team led by SRI International, with collaboration from Sandia National Laboratories, ArcSight, Inc., and Invensys Process Systems. DATES sought to advance the state of the practice in intrusion detection and situational awareness with respect to cyber attacks in energy systems. This was achieved through adaptation of detection algorithms for process systems as well as development of novel anomaly detection techniques suited for such systems into a detection suite. These detection components, together with third-party commercial security systems, were interfaced with the commercial Security Information Event Management (SIEM) solution from ArcSight. The efficacy of the integrated solution was demonstrated on two testbeds, one based on a Distributed Control System (DCS) from Invensys, and the other based on the Virtual Control System Environment (VCSE) from Sandia. These achievements advance the DOE Cybersecurity Roadmap [DOE2006] goals in the area of security monitoring. The project ran from October 2007 until March 2010, with the final six months focused on experimentation. In the validation phase, team members from SRI and Sandia coupled the two test environments and carried out a number of distributed and cross-site attacks against various points in one or both testbeds. Alert messages from the distributed, heterogeneous detection components were correlated using the ArcSight SIEM platform, providing within-site and cross-site views of the attacks. In particular, the team demonstrated detection and visualization of network zone traversal and denial-of-service attacks. These capabilities were presented to the DistribuTech Conference and Exhibition in March 2010. 
The project was hampered by an interruption of funding for four months in 2008, due to continuing resolution issues and negotiation of cost share.

  3. Probing the lifetimes of auditory novelty detection processes.

    Science.gov (United States)

    Pegado, Felipe; Bekinschtein, Tristan; Chausson, Nicolas; Dehaene, Stanislas; Cohen, Laurent; Naccache, Lionel

    2010-08-01

    Auditory novelty detection can be fractionated into multiple cognitive processes associated with their respective neurophysiological signatures. In the present study we used high-density scalp event-related potentials (ERPs) during an active version of the auditory oddball paradigm to explore the lifetimes of these processes by varying the stimulus onset asynchrony (SOA). We observed that early MMN (90-160 ms) decreased when the SOA increased, confirming the evanescence of this echoic memory system. Subsequent neural events including late MMN (160-220 ms) and P3a/P3b components of the P3 complex (240-500 ms) did not decay with SOA, but showed a systematic delay effect supporting a two-stage model of accumulation of evidence. On the basis of these observations, we propose a distinction within the MMN complex of two distinct events: (1) an early, pre-attentive and fast-decaying MMN associated with generators located within superior temporal gyri (STG) and frontal cortex, and (2) a late MMN more resistant to SOA, corresponding to the activation of a distributed cortical network including fronto-parietal regions. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  4. Hyperspectral imaging and quantitative analysis for prostate cancer detection

    Science.gov (United States)

    Akbari, Hamed; Halig, Luma V.; Schuster, David M.; Osunkoya, Adeboye; Master, Viraj; Nieh, Peter T.; Chen, Georgia Z.

    2012-01-01

    Hyperspectral imaging (HSI) is an emerging modality for various medical applications. Its spectroscopic data might be used to noninvasively detect cancer. Quantitative analysis is often necessary in order to differentiate healthy from diseased tissue. We propose the use of an advanced image processing and classification method in order to analyze hyperspectral image data for prostate cancer detection. The spectral signatures were extracted and evaluated in both cancerous and normal tissue. Least squares support vector machines were developed and evaluated for classifying hyperspectral data in order to enhance the detection of cancer tissue. This method was used to detect prostate cancer in tumor-bearing mice and on pathology slides. Spatially resolved images were created to highlight the differences of the reflectance properties of cancer versus those of normal tissue. Preliminary results with 11 mice showed that the sensitivity and specificity of the hyperspectral image classification method are 92.8% ± 2.0% and 96.9% ± 1.3%, respectively. Therefore, this imaging method may be able to help physicians to dissect malignant regions with a safe margin and to evaluate the tumor bed after resection. This pilot study may lead to advances in the optical diagnosis of prostate cancer using HSI technology. PMID:22894488
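    The least-squares SVM mentioned above reduces, in its dual form, to a single linear system. A generic sketch on synthetic two-class data, not the authors' implementation; the kernel width and regularisation values are illustrative:

```python
import numpy as np

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Train a least-squares SVM: the dual problem is one linear system
    [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y], labels y in {-1, +1}."""
    n = len(y)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma ** 2))               # RBF kernel matrix
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                           # bias b, dual weights alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    """Classify new points: sign of the kernel expansion plus bias."""
    d2 = ((X_new[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    return np.sign(np.exp(-d2 / (2 * sigma ** 2)) @ alpha + b)
```

    Unlike a standard SVM, every training point receives a (non-sparse) weight, which is why training amounts to solving one linear system rather than a quadratic program.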

  5. An integrated microfluidic analysis microsystems with bacterial capture enrichment and in-situ impedance detection

    Science.gov (United States)

    Liu, Hai-Tao; Wen, Zhi-Yu; Xu, Yi; Shang, Zheng-Guo; Peng, Jin-Lan; Tian, Peng

    2017-09-01

    In this paper, an integrated microfluidic analysis microsystem with bacterial capture enrichment and in-situ impedance detection is proposed, based on the microfluidic chip dielectrophoresis technique and the electrochemical impedance detection principle. The microsystem includes a microfluidic chip, a main control module, a drive and control module, a signal detection and processing module, and a result display unit. The main control module produces the work sequence of the impedance detection system parts and achieves the data communication functions; the drive and control circuit generates an AC signal with adjustable amplitude and frequency, which is applied to the foodborne pathogen impedance analysis microsystem to realize capture enrichment and impedance detection. The signal detection and processing circuit translates the current signal into the impedance of the bacteria and transfers it to a computer, on which the final detection result is displayed. The experimental sample was prepared by adding an Escherichia coli standard sample into a chicken sample solution, and the samples were tested on the dielectrophoresis capture enrichment and in-situ impedance detection microsystem with micro-array electrode microfluidic chips. The experiments show that the Escherichia coli detection limit of the microsystem is 5 × 10^4 CFU/mL and the detection time is within 6 min under the optimized operating conditions of 10 V detection voltage and 500 kHz detection frequency. The integrated microfluidic analysis microsystem lays a solid foundation for rapid real-time in-situ detection of bacteria.

  6. Air Conditioning Compressor Air Leak Detection by Image Processing Techniques for Industrial Applications

    Directory of Open Access Journals (Sweden)

    Pookongchai Kritsada

    2015-01-01

    Full Text Available This paper presents a method to detect air leakage of an air conditioning compressor using image processing techniques. A quality air conditioning compressor must be free of air leakage. To test an air conditioning compressor for leaks, air is pumped into the compressor, which is then submerged in a water tank. If air bubbles occur at the surface of the air conditioning compressor, the leaking compressor must be returned for maintenance. In this work, a new method to detect leakage and locate the leakage point with an accurate, fast, and precise process is proposed. In a preprocessing procedure to detect the air bubbles, threshold and median filter techniques are used. A connected component labeling technique is used to detect the air bubbles, while blob analysis is used to analyze groups of air bubbles in sequential images. The proposed algorithm was tested in experiments to determine the leakage point of an air conditioning compressor. The location of the leakage point is presented as a coordinate point. The results demonstrate that the leakage point can be accurately detected during the process, with an estimated position error of less than 5% compared to the real leakage point.
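    The threshold / median-filter / connected-component pipeline described above can be sketched with SciPy. The threshold value and minimum blob size are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

def find_bubbles(frame, thresh=0.5, min_pixels=5):
    """Detect bright bubble-like blobs in a grayscale frame:
    threshold -> median filter -> connected-component labelling ->
    blob-size gate -> centroids of the surviving blobs."""
    binary = (frame > thresh).astype(np.uint8)        # segment bright pixels
    binary = ndimage.median_filter(binary, size=3)    # suppress isolated noise
    labels, n = ndimage.label(binary)                 # connected components
    centroids = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        if len(ys) >= min_pixels:                     # blob analysis: size gate
            centroids.append((ys.mean(), xs.mean()))
    return centroids
```

    Tracking these centroids across sequential frames, as the paper does, then lets rising bubble trails be traced back to a leak coordinate.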

  7. Detection and reduction of tungsten contamination in ion implantation processes

    International Nuclear Information System (INIS)

    Polignano, M.L.; Galbiati, A.; Grasso, S.; Mica, I.; Barbarossa, F.; Magni, D.

    2016-01-01

    In this paper, we review the results of some studies addressing the problem of tungsten contamination in implantation processes. For some tests, the implanter was contaminated by implantation of wafers with an exposed tungsten layer, resulting in critical contamination conditions. First, DLTS (deep level transient spectroscopy) measurements were calibrated to measure tungsten contamination in ion-implanted samples. DLTS measurements of tungsten-implanted samples showed that the tungsten concentration increases linearly with the dose up to a rather low dose (5 x 10^10 cm^-2). Tungsten deactivation was observed when the dose was further increased. Under these conditions, ToF-SIMS revealed tungsten at the wafer surface, showing that deactivation was due to surface segregation. DLTS calibration could therefore be obtained in the linear dose regime only. This calibration was used to evaluate the tungsten contamination in arsenic implantations. Ordinary operating conditions and critical contamination conditions of the equipment were compared. A moderate tungsten contamination was observed in samples implanted under ordinary operating conditions. This contamination was easily suppressed by a thin screen oxide. On the contrary, implantations in critical conditions of the equipment resulted in a relevant tungsten contamination, which could be reduced but not suppressed even by a relatively thick screen oxide (up to 150 Å). A decontamination process consisting of high dose implantations of dummy wafers was tested for its efficiency to remove tungsten and titanium contamination. This process was found to be much more effective for titanium than for tungsten. Finally, DLTS proved to be much more sensitive than TXRF (total reflection X-ray fluorescence) in detecting tungsten contamination. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  8. Detection and reduction of tungsten contamination in ion implantation processes

    Energy Technology Data Exchange (ETDEWEB)

    Polignano, M.L.; Galbiati, A.; Grasso, S.; Mica, I.; Barbarossa, F.; Magni, D. [STMicroelectronics, Agrate Brianza (Italy)

    2016-12-15

    In this paper, we review the results of some studies addressing the problem of tungsten contamination in implantation processes. For some tests, the implanter was contaminated by implantation of wafers with an exposed tungsten layer, resulting in critical contamination conditions. First, DLTS (deep level transient spectroscopy) measurements were calibrated to measure tungsten contamination in ion-implanted samples. DLTS measurements of tungsten-implanted samples showed that the tungsten concentration increases linearly with the dose up to a rather low dose (5 x 10{sup 10} cm{sup -2}). Tungsten deactivation was observed when the dose was further increased. Under these conditions, ToF-SIMS revealed tungsten at the wafer surface, showing that deactivation was due to surface segregation. DLTS calibration could therefore be obtained in the linear dose regime only. This calibration was used to evaluate the tungsten contamination in arsenic implantations. Ordinary operating conditions and critical contamination conditions of the equipment were compared. A moderate tungsten contamination was observed in samples implanted under ordinary operating conditions. This contamination was easily suppressed by a thin screen oxide. On the contrary, implantations in critical conditions of the equipment resulted in a relevant tungsten contamination, which could be reduced but not suppressed even by a relatively thick screen oxide (up to 150 Å). A decontamination process consisting of high dose implantations of dummy wafers was tested for its efficiency to remove tungsten and titanium contamination. This process was found to be much more effective for titanium than for tungsten. Finally, DLTS proved to be much more sensitive than TXRF (total reflection X-ray fluorescence) in detecting tungsten contamination. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  9. Detection and localization of leak of pipelines of RBMK reactor. Methods of processing of acoustic noise

    International Nuclear Information System (INIS)

    Tcherkaschov, Y.M.; Strelkov, B.P.; Chimanski, S.B.; Lebedev, V.I.; Belyanin, L.A.

    1997-01-01

    To detect leaks in the inlet and outlet pipelines of RBMK reactors, a method based on the detection and monitoring of acoustic leak signals was designed. This report reviews the methods of processing and analysis of acoustic noise that were included in the software of the leak detection system and are used to solve the following problems: leak detection by the sound pressure level method under conditions of powerful background noise and strong signal attenuation; early detection of a small leak by a high-sensitivity correlation method; localization of a sound source under conditions of strong signal reflection, by a correlation method and by the sound pressure method; and evaluation of leak size from the analysis of the sound level and the sound source location. The operation of the techniques considered is illustrated with test results from a fragment of the leak detection system. This test was carried out at the Leningrad NPP, operated at power levels of 460, 700, 890 and 1000 MWe. 16 figs
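    The correlation method for locating a sound source can be sketched as classical time-delay estimation: the cross-correlation peak between two sensor signals gives the arrival-time difference, which maps to a position between the sensors. The wave speed, sampling rate, and geometry below are illustrative assumptions:

```python
import numpy as np

def leak_position(sig_a, sig_b, fs, wave_speed, sensor_gap):
    """Locate a noise source between two sensors (A at x=0, B at x=sensor_gap).
    The cross-correlation peak gives delay = t_A - t_B, so the source sits at
    x = (sensor_gap + wave_speed * delay) / 2."""
    corr = np.correlate(sig_a - sig_a.mean(), sig_b - sig_b.mean(), mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)   # >0 when A receives the signal later
    delay = lag / fs
    return 0.5 * (sensor_gap + wave_speed * delay)
```

    In a reverberant pipe run, as the report notes, reflections add spurious correlation peaks, which is why the correlation result is cross-checked against the sound pressure level method.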

  10. BUSINESS PROCESS MANAGEMENT SYSTEMS TECHNOLOGY COMPONENTS ANALYSIS

    Directory of Open Access Journals (Sweden)

    Andrea Giovanni Spelta

    2007-05-01

    Full Text Available The information technology that supports the implementation of the business process management approach is called a Business Process Management System (BPMS). The main components of the BPMS solution framework are the process definition repository, process instances repository, transaction manager, connectors framework, process engine and middleware. In this paper we define and characterize the role and importance of the components of the BPMS framework. The research method adopted was the case study, through the analysis of the implementation of a BPMS solution in an insurance company called Chubb do Brasil. In the case study, the process "Manage Coinsured Events" is described and characterized, as well as the components of the BPMS solution adopted and implemented by Chubb do Brasil for managing this process.

  11. Processability analysis of candidate waste forms

    International Nuclear Information System (INIS)

    Gould, T.H. Jr.; Dunson, J.B. Jr.; Eisenberg, A.M.; Haight, H.G. Jr.; Mello, V.E.; Schuyler, R.L. III.

    1982-01-01

    A quantitative merit evaluation, or processability analysis, was performed to assess the relative difficulty of remote processing of Savannah River Plant high-level wastes for seven alternative waste form candidates. The reference borosilicate glass process was rated as the simplest, followed by FUETAP concrete, glass marbles in a lead matrix, high-silica glass, crystalline ceramics (SYNROC-D and tailored ceramics), and coated ceramic particles. Cost estimates for the borosilicate glass, high-silica glass, and ceramic waste form processing facilities are also reported

  12. Probabilistic analysis of a thermosetting pultrusion process

    DEFF Research Database (Denmark)

    Baran, Ismet; Tutum, Cem C.; Hattel, Jesper Henri

    2016-01-01

    In the present study, the effects of uncertainties in the material properties of the processing composite material and the resin kinetic parameters, as well as process parameters such as pulling speed and inlet temperature, on product quality (exit degree of cure) are investigated for a pultrusion...... process. A new application for the probabilistic analysis of the pultrusion process is introduced using the response surface method (RSM). The results obtained from the RSM are validated by employing the Monte Carlo simulation (MCS) with Latin hypercube sampling technique. According to the results...
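    Latin hypercube sampling, used above within the Monte Carlo validation, can be sketched generically: stratify each input into equal-probability bins and pair the strata randomly across dimensions. The input ranges and the toy cure response below are illustrative assumptions, not the paper's pultrusion model:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """One sample per equal-probability stratum in each dimension, with the
    strata randomly paired across dimensions; returns points in [0, 1)."""
    strata = rng.permuted(np.tile(np.arange(n_samples), (n_dims, 1)), axis=1).T
    return (strata + rng.random((n_samples, n_dims))) / n_samples

# Hypothetical uncertain inputs: pulling speed and inlet temperature
rng = np.random.default_rng(0)
u = latin_hypercube(500, 2, rng)
speed = 0.2 + 0.2 * u[:, 0]             # assumed U(0.2, 0.4) m/min
temp = 120.0 + 20.0 * u[:, 1]           # assumed U(120, 140) degC

# Toy monotone response standing in for the exit degree of cure
cure = 1.0 - np.exp(-0.005 * temp / speed)
mean_cure, std_cure = cure.mean(), cure.std()
```

    Compared with plain Monte Carlo, the stratification guarantees each input range is covered evenly, so output statistics converge with far fewer model evaluations.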

  13. Image Processing Methods Usable for Object Detection on the Chessboard

    Directory of Open Access Journals (Sweden)

    Beran Ladislav

    2016-01-01

    Full Text Available Image segmentation and object detection is a challenging problem in much research. Although many algorithms for image segmentation have been invented, there is no single simple algorithm for image segmentation and object detection. Our research is based on a combination of several methods for object detection. The first method suitable for image segmentation and object detection is colour detection. This method is very simple, but there are problems with different colours: the colour of the segmented object must be precisely determined before any calculations, and in many cases it must be determined manually. An alternative simple method is based on background removal, i.e. on the difference between a reference image and the detected image. In this paper several methods suitable for object detection are described. This research is focused on coloured object detection on a chessboard. The results of this research, fused with neural networks, will be applied to a user-computer game of checkers.
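    The background-removal idea (difference between a reference image and the detected image) can be sketched in a few lines; the threshold value is an illustrative assumption:

```python
import numpy as np

def detect_by_background(reference, frame, thresh=0.2):
    """Flag pixels whose absolute difference from the reference image exceeds
    a threshold; return the mask and the object's bounding box (or None)."""
    mask = np.abs(frame.astype(float) - reference.astype(float)) > thresh
    if not mask.any():
        return mask, None
    ys, xs = np.nonzero(mask)
    return mask, (ys.min(), ys.max(), xs.min(), xs.max())
```

    For the chessboard application, the reference would be an image of the empty board, so any piece placed on it shows up in the difference mask without needing its colour specified in advance.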

  14. Sequential Analysis: Hypothesis Testing and Changepoint Detection

    Science.gov (United States)

    2014-07-11

    ones are the P-wave and the S-wave. The P-wave is polarized in the source-to-receiver direction, namely from the epicenter of the earthquake to the...a bound on the average frequency of false alarms. The theoretical study of quickest changepoint detection has been initiated in two different direc...detection techniques to run at high speeds and with low delay, combined with the generally low frequency of intrusion attempts, presents an interesting

  15. Systematic Sustainable Process Design and Analysis of Biodiesel Processes

    Directory of Open Access Journals (Sweden)

    Seyed Soheil Mansouri

    2013-09-01

    Full Text Available Biodiesel is a promising alternative to traditional diesel obtained from conventional sources such as fossil fuels. Many flowsheet alternatives exist for the production of biodiesel, and it is therefore necessary to evaluate these alternatives using defined criteria and to consider process intensification opportunities. This work focuses on three main aspects that have been incorporated into a systematic computer-aided framework for sustainable process design. First, the creation of a generic superstructure, which consists of all possible process alternatives based on available technology. Second, the evaluation of this superstructure by systematic screening to obtain an appropriate base-case design. This is done by first reducing the search space using a sustainability analysis, which provides key indicators for process bottlenecks of different flowsheet configurations, and then by further reducing the search space using economic evaluation and life cycle assessment. Third, the determination of a sustainable design with or without process intensification using a phenomena-based synthesis/design method. A detailed step-by-step application of the framework is highlighted through a biodiesel production case study.

  16. Knee joint vibroarthrographic signal processing and analysis

    CERN Document Server

    Wu, Yunfeng

    2015-01-01

    This book presents the cutting-edge technologies of knee joint vibroarthrographic signal analysis for the screening and detection of knee joint injuries. It describes a number of effective computer-aided methods for analysis of the nonlinear and nonstationary biomedical signals generated by complex physiological mechanics. This book also introduces several popular machine learning and pattern recognition algorithms for biomedical signal classifications. The book is well-suited for all researchers looking to better understand knee joint biomechanics and the advanced technology for vibration arthrometry. Dr. Yunfeng Wu is an Associate Professor at the School of Information Science and Technology, Xiamen University, Xiamen, Fujian, China.

  17. Strut analysis for osteoporosis detection model using dental panoramic radiography.

    Science.gov (United States)

    Hwang, Jae Joon; Lee, Jeong-Hee; Han, Sang-Sun; Kim, Young Hyun; Jeong, Ho-Gul; Choi, Yoon Jeong; Park, Wonse

    2017-10-01

    The aim of this study was to identify variables that can be used for osteoporosis detection using strut analysis, fractal dimension (FD) and the gray level co-occurrence matrix (GLCM) using multiple regions of interest, and to develop an osteoporosis detection model based on panoramic radiography. A total of 454 panoramic radiographs from oral examinations in our dental hospital from 2012 to 2015 were randomly selected, equally distributed among osteoporotic and non-osteoporotic patients (n = 227 in each group). The radiographs were classified by bone mineral density (T-score). After 3 marrow regions and the endosteal margin area were selected, strut features, FD and GLCM were analysed using a customized image processing program. Image upsampling was used to obtain the optimal binarization for calculating strut features and FD. The independent-samples t-test was used to assess statistical differences between the 2 groups. A decision tree and support vector machine were used to create and verify an osteoporosis detection model. The endosteal margin area showed statistically significant differences in FD, GLCM and strut variables between the osteoporotic and non-osteoporotic patients, whereas the medullary portions showed few distinguishing features. The sensitivity, specificity, and accuracy of the strut variables in the endosteal margin area were 97.1%, 95.7%, and 96.25% using the decision tree and 97.2%, 97.1%, and 96.9% using the support vector machine; these were the best results obtained among the 3 methods. Strut variables with FD and/or GLCM did not increase the diagnostic accuracy. The analysis of strut features in the endosteal margin area showed potential for the development of an osteoporosis detection model based on panoramic radiography.
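As an illustration of the GLCM texture measure used in this kind of study, the sketch below builds a co-occurrence matrix for a horizontal pixel offset and derives a contrast feature. The toy images, offset, and gray-level count are assumptions for illustration, not data or parameters from the study.

```python
def glcm(image, dx=1, dy=0, levels=4):
    """Gray level co-occurrence counts for the pixel offset (dx, dy)."""
    counts = [[0] * levels for _ in range(levels)]
    h, w = len(image), len(image[0])
    for y in range(h):
        for x in range(w):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                counts[image[y][x]][image[ny][nx]] += 1
    return counts

def contrast(counts):
    """GLCM contrast: weighs co-occurrences by squared gray-level distance."""
    total = sum(sum(row) for row in counts)
    return sum((i - j) ** 2 * c
               for i, row in enumerate(counts)
               for j, c in enumerate(row)) / total

smooth = [[1, 1, 1, 1] for _ in range(4)]   # uniform texture
rough = [[0, 3, 0, 3] for _ in range(4)]    # alternating texture
print(contrast(glcm(smooth)), contrast(glcm(rough)))   # 0.0 9.0
```

Trabecular bone regions with different textures yield different GLCM feature values, which is what makes such features candidate inputs for the classifier.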

  18. Meteor radar signal processing and error analysis

    Science.gov (United States)

    Kang, Chunmei

    Meteor wind radar systems are a powerful tool for the study of the horizontal wind field in the mesosphere and lower thermosphere (MLT). While such systems have been operated for many years, virtually no literature has focused on radar system error analysis. Instrumental error may prevent scientists from drawing correct conclusions on geophysical variability. Radar system instrumental error comes from different sources, including hardware, software, and algorithms. Radar signal processing plays an important role in a radar system, and advanced signal processing algorithms may dramatically reduce radar system errors. In this dissertation, radar system error propagation is analyzed and several advanced signal processing algorithms are proposed to optimize the performance of the radar system without increasing instrument costs. The first part of this dissertation is the development of a time-frequency waveform detector, which is invariant to noise level and stable over a wide range of decay rates. This detector is proposed to discriminate underdense meteor echoes from the background white Gaussian noise. The performance of this detector is examined using Monte Carlo simulations. The resulting probability of detection is shown to outperform the often-used power and energy detectors for the same probability of false alarm. Secondly, estimators to determine the Doppler shift, the decay rate and the direction of arrival (DOA) of meteors are proposed and evaluated. The performance of these estimators is compared with the analytically derived Cramer-Rao bound (CRB). The results show that the fast maximum likelihood (FML) estimator for determination of the Doppler shift and decay rate and the spatial spectral method for determination of the DOAs perform best among the estimators commonly used on other radar systems. For most cases, the mean square error (MSE) of the estimator meets the CRB above a 10 dB SNR. Thus meteor echoes with an estimated SNR below 10 dB are
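The kind of Monte Carlo evaluation described above, estimating probability of detection at a fixed false-alarm rate, can be sketched as follows. The simple energy detector, the decaying-exponential echo model, and all parameter values are illustrative assumptions, not the dissertation's actual time-frequency detector.

```python
import math
import random

def energy_detect(signal, threshold):
    """Declare a detection when the sample energy exceeds the threshold."""
    return sum(s * s for s in signal) > threshold

random.seed(1)
N, decay, amp, trials = 64, 0.05, 1.0, 2000

# Set the threshold empirically for ~1% false alarms on noise-only data.
noise_energies = sorted(sum(random.gauss(0, 1) ** 2 for _ in range(N))
                        for _ in range(trials))
threshold = noise_energies[int(0.99 * trials) - 1]

# Probability of detection for a decaying-exponential echo in noise.
hits = sum(
    energy_detect([amp * math.exp(-decay * n) + random.gauss(0, 1)
                   for n in range(N)], threshold)
    for _ in range(trials))
print("estimated probability of detection:", hits / trials)
```

Repeating this experiment over a range of SNRs and decay rates produces the detector-comparison curves the abstract refers to.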

  19. Pattern Detection and Extreme Value Analysis on Large Climate Data

    Science.gov (United States)

    Prabhat, M.; Byna, S.; Paciorek, C.; Weber, G.; Wu, K.; Yopes, T.; Wehner, M. F.; Ostrouchov, G.; Pugmire, D.; Strelitz, R.; Collins, W.; Bethel, W.

    2011-12-01

    We consider several challenging problems in climate that require quantitative analysis of very large data volumes generated by modern climate simulations. We demonstrate new software capable of addressing these challenges that is designed to exploit petascale platforms using state-of-the-art methods in high performance computing. Atmospheric rivers and hurricanes are important classes of extreme weather phenomena. Developing analysis tools that can automatically detect these events in large climate datasets can provide us with invaluable information about the frequency of these events. Application of these tools to different climate model outputs can provide us with quality metrics that evaluate whether models produce this important class of phenomena and how the statistics of these events will likely vary in the future. In this work, we present an automatic technique for detecting atmospheric rivers. We use techniques from image processing and topological analysis to extract these features. We implement this technique in a massively parallel fashion on modern supercomputing platforms, and apply the resulting software to both observational data and various models from the CMIP-3 archive. We have successfully completed atmospheric river detections on 1 TB of data on 10,000 Hopper cores in 10 seconds. For hurricane tracking, we have adapted code from GFDL to run in parallel on large datasets. We present results from the application of this code to some recent high resolution CAM5 simulations. Our code is capable of processing 1 TB of data in 10 seconds. Extreme value analysis involves statistical techniques for estimating the probability of extreme events and variations in the probabilities over time and space. Because of their rarity, there is a high degree of uncertainty when estimating the behavior of extremes from data at any one location. We are developing a local likelihood approach to borrow strength from multiple locations, with uncertainty estimated using the
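A minimal sketch of the feature-extraction step behind such detectors: threshold a scalar field and label its connected regions, the basic building block for picking out elongated features such as atmospheric rivers. The tiny field, threshold, and 4-connectivity choice are assumptions for illustration.

```python
def connected_components(mask):
    """Label 4-connected regions of 1s using an iterative flood fill."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    current = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x] and not labels[y][x]:
                current += 1
                stack = [(y, x)]
                while stack:
                    cy, cx = stack.pop()
                    if (0 <= cy < h and 0 <= cx < w
                            and mask[cy][cx] and not labels[cy][cx]):
                        labels[cy][cx] = current
                        stack += [(cy + 1, cx), (cy - 1, cx),
                                  (cy, cx + 1), (cy, cx - 1)]
    return labels, current

# Hypothetical vapour field: cells above the threshold form candidate features.
field = [[0, 9, 0, 0],
         [0, 9, 0, 8],
         [0, 9, 0, 8]]
mask = [[1 if v > 5 else 0 for v in row] for row in field]
labels, n = connected_components(mask)
print(n)   # 2 elongated candidate regions
```

At petascale the same labeling must be distributed across domain partitions, which is where the massively parallel implementation described above comes in.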

  20. Systemic analysis of the caulking assembly process

    Directory of Open Access Journals (Sweden)

    Rodean Claudiu

    2017-01-01

    Full Text Available The present paper highlights the importance of the caulking process, which is nowadays studied less than its growing usage in the automotive industry would warrant. Because the caulking operation is used in domains of high importance, such as shock absorbers and brake systems, this paper sets out to detail the parameters that characterize the process, viewed as input and output data, and the requirements placed on the final product. The paper presents the actual measurement methods used to analyse the performance of the caulking assembly. All these parameters lead to a performance analysis algorithm for the caulking process, which is then used in an experimental research study. The study forms a basis for further research aimed at optimizing the process.

  1. [Detection of genetically modified soy (Roundup-Ready) in processed food products].

    Science.gov (United States)

    Hagen, M; Beneke, B

    2000-01-01

    In this study, the application of a qualitative and a quantitative method of analysis to detect genetically modified RR-Soy (Roundup-Ready soy) in processed foods is described. A total of 179 soy-containing products, such as baby food and diet products, soy drinks and desserts, tofu and tofu products, soy-based meat substitutes, soy protein, breads, flour, granules, cereals, noodles, soy bean sprouts, fats and oils, as well as condiments, were investigated following the pattern of the section 35 LMBG method L 23.01.22-1. The DNA was extracted from the samples and analysed using a soybean-specific lectin gene PCR as well as a PCR specific for the genetic modification. Additionally, by means of PCR combined with fluorescence detection (TaqMan 5'-nuclease assay), suspect samples were subjected to a real-time quantification of the percentage of genetically modified RR-Soy. The methods of analysis proved to be extremely sensitive and specific with regard to the food groups checked. The fats and oils, as well as the condiments, were the exceptions in which amplifiable soy DNA could not be detected. The genetic modification of RR-Soy was detected in 34 samples. Eight of these samples contained more than 1% of RR-Soy. It is necessary to determine the percentage of transgenic soy in order to assess whether genetically modified ingredients were deliberately added, or whether they result from technically unavoidable contamination (for example, during transportation and processing).

  2. A cost analysis: processing maple syrup products

    Science.gov (United States)

    Neil K. Huyler; Lawrence D. Garrett

    1979-01-01

    A cost analysis of processing maple sap to syrup for three fuel types, oil-, wood-, and LP gas-fired evaporators, indicates that: (1) fuel, capital, and labor are the major cost components of processing sap to syrup; (2) wood-fired evaporators show a slight cost advantage over oil- and LP gas-fired evaporators; however, as the cost of wood approaches $50 per cord, wood...

  3. Parallel processing of structural integrity analysis codes

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.

    1996-01-01

    Structural integrity analysis plays an important role in assessing and demonstrating the safety of nuclear reactor components. This analysis is performed using analytical tools such as the Finite Element Method (FEM) with the help of digital computers. The complexity of the problems involved in nuclear engineering demands high-speed computation facilities to obtain solutions in a reasonable amount of time. Parallel processing systems such as ANUPAM provide an efficient platform for realising high-speed computation. The development and implementation of software on parallel processing systems is an interesting and challenging task. The data and algorithm structure of the codes plays an important role in exploiting the parallel processing system's capabilities. Structural analysis codes based on FEM can be divided into two categories with respect to their implementation on parallel processing systems. The first category of codes, such as those used for harmonic analysis and mechanistic fuel performance codes, does not require the parallelisation of individual modules. The second category of codes, such as conventional FEM codes, requires parallelisation of individual modules. In this category, parallelisation of the equation solution module poses major difficulties. Different solution schemes such as the domain decomposition method (DDM), parallel active column solvers, and substructuring methods are currently used on parallel processing systems. Two codes, FAIR and TABS, belonging to each of these categories, have been implemented on ANUPAM. The implementation details of these codes and the performance of different equation solvers are highlighted. (author). 5 refs., 12 figs., 1 tab

  4. Real Time Intelligent Target Detection and Analysis with Machine Vision

    Science.gov (United States)

    Howard, Ayanna; Padgett, Curtis; Brown, Kenneth

    2000-01-01

    We present an algorithm for detecting a specified set of targets for an Automatic Target Recognition (ATR) application. ATR involves processing images for detecting, classifying, and tracking targets embedded in a background scene. We address the problem of discriminating between targets and nontarget objects in a scene by evaluating 40x40 image blocks belonging to an image. Each image block is first projected onto a set of templates specifically designed to separate images of targets embedded in a typical background scene from those background images without targets. These filters are found using directed principal component analysis which maximally separates the two groups. The projected images are then clustered into one of n classes based on a minimum distance to a set of n cluster prototypes. These cluster prototypes have previously been identified using a modified clustering algorithm based on prior sensed data. Each projected image pattern is then fed into the associated cluster's trained neural network for classification. A detailed description of our algorithm will be given in this paper. We outline our methodology for designing the templates, describe our modified clustering algorithm, and provide details on the neural network classifiers. Evaluation of the overall algorithm demonstrates that our detection rates approach 96% with a false positive rate of less than 0.03%.
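The projection-and-clustering stage described above can be sketched as follows; the templates and cluster prototypes here are small stand-ins for the directed-PCA filters and learned prototypes of the paper, and the numbers are illustrative only.

```python
def project(block, templates):
    """Project a flattened image block onto each template filter."""
    return [sum(b * t for b, t in zip(block, tmpl)) for tmpl in templates]

def nearest_cluster(vec, prototypes):
    """Index of the prototype with minimum squared Euclidean distance."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(range(len(prototypes)), key=lambda k: dist2(vec, prototypes[k]))

templates = [[1, 0, 0, 0], [0, 0, 0, 1]]   # stand-ins for the learned filters
prototypes = [[1.0, 0.0], [0.0, 1.0]]      # stand-ins for cluster prototypes
block = [0.9, 0.1, 0.0, 0.2]               # a flattened candidate image block
vec = project(block, templates)
print(nearest_cluster(vec, prototypes))    # 0: routed to cluster 0's classifier
```

In the full system each 40x40 block would be projected onto the directed-PCA templates and the selected cluster's neural network would make the final target/non-target decision.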

  5. Trace analysis for 300 MM wafers and processes with TXRF

    International Nuclear Information System (INIS)

    Nutsch, A.; Erdmann, V.; Zielonka, G.; Pfitzner, L.; Ryssel, H.

    2000-01-01

    Efficient fabrication of semiconductor devices goes hand in hand with increasing silicon wafer sizes. The contamination level of processes, media, and equipment has to decrease continuously. A new test laboratory for 300 mm wafers was installed in view of the above-mentioned aspects. Aside from numerous processing tools, this platform comprises electrical test methods, particle detection, vapor phase decomposition (VPD) preparation, and TXRF. The equipment is installed in a cleanroom. It is common to perform process or equipment control, development, evaluation, and qualification with monitor wafers. The evaluation and qualification of 300 mm equipment require direct TXRF on 300 mm wafers, so a new TXRF setup was installed for this wafer size. The 300 mm TXRF is equipped with tungsten and molybdenum anodes. This combination allows sensitive detection of elements with fluorescence energies below 10 keV under tungsten excitation, while molybdenum excitation enables the detection of a wide variety of elements. The detection sensitivity with the tungsten anode is ten times higher than with the molybdenum anode. The system is calibrated with 1 ng Ni. This calibration shows a stability within 5% when monitored to control system stability. Linearly decreasing the amount of Ni results in a linear decrease of the measured Ni signal; this result is verified for a range of elements using multielement samples. New designs demand new processes and materials, e.g. ferroelectric layers and copper. The trace analysis of many of these materials is supported by the higher excitation energy of the molybdenum anode. Reclaim and recycling of 300 mm wafers demand accurate contamination control of the processes to avoid cross contamination. Polishing or etching results in modified surfaces. TXRF as a non-destructive test method allows the simultaneous detection of a variety of elements on differing surfaces in view of contamination control and process

  6. A Universal High-Performance Correlation Analysis Detection Model and Algorithm for Network Intrusion Detection System

    Directory of Open Access Journals (Sweden)

    Hongliang Zhu

    2017-01-01

    Full Text Available In the big data era, single detection techniques no longer meet the demands posed by complex network attacks and advanced persistent threats, and there is no uniform standard enabling different correlation analysis detections to be performed efficiently and accurately. In this paper, we put forward a universal correlation analysis detection model and algorithm by introducing a state transition diagram. Based on an analysis and comparison of current correlation detection modes, we formalize the correlation patterns, propose a framework according to data packet timing and behavior qualities, and then design a new universal algorithm to implement the method. Finally, an experiment that sets up a lightweight intrusion detection system using the KDD1999 dataset shows that the correlation detection model and algorithm can improve performance and guarantee high detection rates.

  7. Applications of random process excursion analysis

    CERN Document Server

    Brainina, Irina S

    2013-01-01

    This book addresses one of the key problems in signal processing, the problem of identifying statistical properties of excursions in a random process in order to simplify the theoretical analysis and make it suitable for engineering applications. Precise and approximate formulas are explained, which are relatively simple and can be used for engineering applications such as the design of devices which can overcome the high initial uncertainty of the self-training period. The information presented in the monograph can be used to implement adaptive signal processing devices capable of d

  8. Space Applications for Ensemble Detection and Analysis

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA makes extensive investments to circumvent the engineering challenges posed by naturally occurring random processes for which conventional statistics do not...

  9. Detection of non-stationary leak signals at NPP primary circuit by cross-correlation analysis

    International Nuclear Information System (INIS)

    Shimanskij, S.B.

    2007-01-01

    A leak-detection system employing high-temperature microphones has been developed for the RBMK and ATR (Japan) reactors. Further improvement of the system focused on using cross-correlation analysis of the spectral components of the signal to detect a small leak at an early stage of development. Since envelope processes are less affected by distortions than are wave processes, they give a higher degree of correlation and can be used to detect leaks with lower signal-to-noise ratios. Many simulation tests performed at nuclear power plants have shown that the proposed methods can be used to detect and locate a small leak [ru
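A sketch of envelope cross-correlation for estimating the delay of a signal between two sensors, which is the basis of acoustic leak location. The rectified moving-average envelope and the synthetic burst below are simplifying assumptions; the actual system works on spectral components and may extract envelopes differently.

```python
import math

def envelope(x, win=8):
    """Crude signal envelope: rectify, then smooth with a moving average."""
    rect = [abs(v) for v in x]
    return [sum(rect[max(0, i - win):i + 1]) / min(i + 1, win + 1)
            for i in range(len(rect))]

def xcorr_lag(a, b, max_lag):
    """Lag of b relative to a that maximises the cross-correlation."""
    def score(lag):
        return sum(a[i] * b[i + lag]
                   for i in range(len(a)) if 0 <= i + lag < len(b))
    return max(range(-max_lag, max_lag + 1), key=score)

# Synthetic acoustic burst observed by two sensors, 15 samples apart.
burst = [math.sin(2.4 * n) * math.exp(-0.1 * n) for n in range(40)]
sensor1 = [0.0] * 20 + burst + [0.0] * 40
sensor2 = [0.0] * 35 + burst + [0.0] * 25
lag = xcorr_lag(envelope(sensor1), envelope(sensor2), max_lag=30)
print(lag)   # 15 -> the burst reaches sensor2 fifteen samples later
```

Given the lag and the sound propagation speed, the leak position between the two microphones can be triangulated.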

  10. Process analysis of fluidized bed granulation.

    Science.gov (United States)

    Rantanen, J; Jørgensen, A; Räsänen, E; Luukkonen, P; Airaksinen, S; Raiman, J; Hänninen, K; Antikainen, O; Yliruusi, J

    2001-10-17

    This study assesses the fluidized bed granulation process for the optimization of a model formulation using in-line near-infrared (NIR) spectroscopy for moisture determination. The granulation process was analyzed using an automated granulator and optimization of the verapamil hydrochloride formulation was performed using a mixture design. The NIR setup with a fixed wavelength detector was applied for moisture measurement. Information from other process measurements, temperature difference between process inlet air and granules (T(diff)), and water content of process air (AH), was also analyzed. The application of in-line NIR provided information related to the amount of water throughout the whole granulation process. This information combined with trend charts of T(diff) and AH enabled the analysis of the different process phases. By this means, we can obtain in-line documentation from all the steps of the processing. The choice of the excipient affected the nature of the solid-water interactions; this resulted in varying process times. NIR moisture measurement combined with temperature and humidity measurements provides a tool for the control of water during fluid bed granulation.

  11. Development of image processing method to detect noise in geostationary imagery

    Science.gov (United States)

    Khlopenkov, Konstantin V.; Doelling, David R.

    2016-10-01

    The Clouds and the Earth's Radiant Energy System (CERES) has incorporated imagery from 16 individual geostationary (GEO) satellites across five contiguous domains since March 2000. In order to derive broadband fluxes uniform across satellite platforms it is important to ensure a good quality of the input raw count data. GEO data obtained by older imagers (such as MTSAT-1, Meteosat-5, Meteosat-7, GMS-5, and GOES-9) are known to frequently contain various types of noise caused by transmission errors, sync errors, stray light contamination, and others. This work presents an image processing methodology designed to detect most kinds of noise and corrupt data in all bands of raw imagery from modern and historic GEO satellites. The algorithm is based on a set of different approaches to detect abnormal image patterns, including inter-line and inter-pixel differences within a scanline, correlation between scanlines, analysis of spatial variance, and also a 2D Fourier analysis of the image spatial frequencies. In spite of computational complexity, the described method is highly optimized for performance to facilitate volume processing of multi-year data and runs in fully automated mode. Reliability of this noise detection technique has been assessed by human supervision for each GEO dataset obtained during selected time periods in 2005 and 2006. This assessment has demonstrated an overall detection accuracy of over 99.5% and a false alarm rate of under 0.3%. The described noise detection routine is currently used in volume processing of historical GEO imagery for subsequent production of global gridded data products and for cross-platform calibration.
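One of the listed checks, correlation between adjacent scanlines, can be sketched as follows; a garbled line correlates poorly with its neighbours. The synthetic scene and the correlation threshold are assumptions for illustration, not values from the CERES pipeline.

```python
def corrupt_scanlines(image, min_corr=0.5):
    """Flag scanlines that correlate poorly with the preceding line."""
    def corr(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        va = sum((x - ma) ** 2 for x in a)
        vb = sum((y - mb) ** 2 for y in b)
        den = (va * vb) ** 0.5
        return cov / den if den else 0.0
    return [i for i in range(1, len(image))
            if corr(image[i - 1], image[i]) < min_corr]

# Smooth synthetic scene with one garbled line (e.g. a transmission error).
scene = [[10, 20, 30, 40, 50, 60] for _ in range(6)]
scene[3] = [60, 5, 55, 0, 62, 8]
print(corrupt_scanlines(scene))   # [3, 4]: the bad line breaks both comparisons
```

Note that one corrupt line triggers two low-correlation pairs (with the line before and the line after), so a real pipeline would combine this with the other listed checks before flagging a line as noise.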

  12. Fault Detection and Diagnosis in Process Data Using Support Vector Machines

    Directory of Open Access Journals (Sweden)

    Fang Wu

    2014-01-01

    Full Text Available For complex industrial processes, it has become increasingly challenging to effectively diagnose complicated faults. In this paper, a combined approach based on the original Support Vector Machine (SVM) and Principal Component Analysis (PCA) is provided to carry out fault classification, and its results are compared with those based on the SVM-RFE (Recursive Feature Elimination) method. RFE is used for feature extraction, and PCA is utilized to project the original data onto a lower-dimensional space. The PCA T2 and SPE statistics and the original SVM are proposed to detect the faults. Some common faults of the Tennessee Eastman Process (TEP) are analyzed in terms of the practical system and reflections in the dataset. PCA-SVM and SVM-RFE can effectively detect and diagnose these common faults. In the RFE algorithm, all variables are ordered by decreasing contribution. The classification accuracy rate is improved by choosing a reasonable number of features.
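The PCA T2 and SPE statistics mentioned above can be sketched in pure Python for a one-component model: T2 measures how extreme a sample is within the model subspace, SPE how far it lies off that subspace. The two-variable training set and the samples below are illustrative assumptions, not TEP data.

```python
def mean(rows):
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def covariance(rows, mu):
    n, d = len(rows), len(mu)
    c = [[0.0] * d for _ in range(d)]
    for r in rows:
        diff = [x - m for x, m in zip(r, mu)]
        for i in range(d):
            for j in range(d):
                c[i][j] += diff[i] * diff[j] / (n - 1)
    return c

def principal_component(matrix, iters=200):
    """Dominant eigenvector/eigenvalue via power iteration."""
    v = [1.0] * len(matrix)
    for _ in range(iters):
        w = [sum(row[j] * v[j] for j in range(len(v))) for row in matrix]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    lam = sum(v[i] * sum(matrix[i][j] * v[j] for j in range(len(v)))
              for i in range(len(v)))
    return v, lam

# Hypothetical 2-variable training data lying near the line x2 = x1.
train = [[i + 0.01 * (-1) ** i, float(i)] for i in range(10)]
mu = mean(train)
p, lam = principal_component(covariance(train, mu))

def t2_spe(sample):
    """Hotelling T2 (within the model) and SPE (residual off the model)."""
    diff = [x - m for x, m in zip(sample, mu)]
    score = sum(d * pi for d, pi in zip(diff, p))
    resid = [d - score * pi for d, pi in zip(diff, p)]
    return score ** 2 / lam, sum(r * r for r in resid)

print(t2_spe([20.0, 20.0]))   # large T2: extreme but on-model sample
print(t2_spe([5.0, -5.0]))    # large SPE: sample off the model subspace
```

Faults that stretch the normal operating range inflate T2, while faults that break the correlation structure between variables inflate SPE, which is why both statistics are monitored.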

  13. Analysis and Detection of Malicious Insiders

    National Research Council Canada - National Science Library

    Maybury, Mark; Chase, Penny; Cheikes, Brant; Brackney, Dick; Matzner, Sara; Hetherington, Tom; Wood, Brad; Sibley, Conner; Marin, Jack; Longstaff, Tom

    2005-01-01

    ...) actions, and associated observables. The paper outlines several prototype techniques developed to provide early warning of insider activity, including novel algorithms for structured analysis and data fusion...

  14. Analysis and Detection of Malicious Insiders

    National Research Council Canada - National Science Library

    Maybury, Mark; Chase, Penny; Cheikes, Brant; Brackney, Dick; Matzner, Sara; Hetherington, Tom; Wood, Brad; Sibley, Conner; Marin, Jack; Longstaff, Tom

    2005-01-01

    This paper summarizes a collaborative, six month ARDA NRRC challenge workshop to characterize and create analysis methods to counter sophisticated malicious insiders in the United States Intelligence Community...

  15. Non-Harmonic Fourier Analysis for bladed wheels damage detection

    Science.gov (United States)

    Neri, P.; Peeters, B.

    2015-11-01

    The interaction between bladed wheels and the fluid distributed by the stator vanes results in cyclic loading of the rotating components. Compressor and turbine wheels are subject to vibration and fatigue issues, especially when resonance conditions are excited. Even if resonance conditions can often be predicted and avoided, high cycle fatigue failures can occur, causing safety issues and economic loss. Rigorous maintenance programs are then needed, forcing expensive system shut-downs. Blade crack detection methods are beneficial for condition-based maintenance. While contact measurement systems are not always usable in operating conditions (e.g. high temperature), non-contact methods can be more suitable. One or more stator-fixed sensors can measure all the blades as they pass by, in order to detect the damaged ones. The main drawback in this situation is the short acquisition time available for each blade, which is shortened further by the high rotational speed of the components. A traditional Discrete Fourier Transform (DFT) analysis would result in a poor frequency resolution. A Non-Harmonic Fourier Analysis (NHFA) can instead be executed with an arbitrary frequency resolution, allowing frequency information to be obtained even from short-time data samples. This paper shows an analytical investigation of the NHFA method. A data processing algorithm is then proposed to obtain frequency shift information from short time samples. The performance of this algorithm is then studied by experimental and numerical tests.
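The core idea of NHFA, evaluating Fourier coefficients on an arbitrary frequency grid instead of only at the DFT bin frequencies, can be sketched as below. The signal length, sampling rate, and grid spacing are illustrative assumptions, not parameters from the paper.

```python
import math

def nhfa_amplitude(x, freq, fs):
    """Amplitude of the correlation with a sinusoid at an arbitrary frequency."""
    n = len(x)
    re = sum(v * math.cos(2 * math.pi * freq * k / fs) for k, v in enumerate(x))
    im = sum(v * math.sin(2 * math.pi * freq * k / fs) for k, v in enumerate(x))
    return 2.0 * math.hypot(re, im) / n

fs, n = 1000.0, 64          # only 64 samples: DFT bins would be ~15.6 Hz apart
x = [math.sin(2 * math.pi * 123.4 * k / fs) for k in range(n)]

# Scan a grid far finer than the DFT bin spacing.
grid = [110.0 + 0.2 * i for i in range(140)]
peak = max(grid, key=lambda f: nhfa_amplitude(x, f, fs))
print(peak)   # close to the true 123.4 Hz despite the short record
```

A plain DFT of the same 64 samples could only place the tone at a multiple of ~15.6 Hz, which is exactly the resolution limitation NHFA is meant to overcome for short blade-passage records.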

  16. Software Process Improvement Using Force Field Analysis ...

    African Journals Online (AJOL)

    An improvement plan is then drawn and implemented. This paper studied the state of Nigerian software development organizations based on selected attributes. Force field analysis is used to partition the factors obtained into driving and restraining forces. An attempt was made to improve the software development process ...

  17. Geospatial Image Stream Processing: Models, techniques, and applications in remote sensing change detection

    Science.gov (United States)

    Rueda-Velasquez, Carlos Alberto

    Detection of changes in environmental phenomena using remotely sensed data is a major requirement in the Earth sciences, especially in natural disaster related scenarios where real-time detection plays a crucial role in the saving of human lives and the preservation of natural resources. Although various approaches formulated to model multidimensional data can in principle be applied to the inherent complexity of remotely sensed geospatial data, there are still challenging peculiarities that demand a precise characterization in the context of change detection, particularly in scenarios of fast changes. In the same vein, geospatial image streams do not fit appropriately in the standard Data Stream Management System (DSMS) approach because these systems mainly deal with tuple-based streams. Recognizing the necessity for a systematic effort to address the above issues, the work presented in this thesis is a concrete step toward the foundation and construction of an integrated Geospatial Image Stream Processing framework, GISP. First, we present a data and metadata model for remotely sensed image streams. We introduce a precise characterization of images and image streams in the context of remotely sensed geospatial data. On this foundation, we define spatially-aware temporal operators with a consistent semantics for change analysis tasks. We address the change detection problem in settings where multiple image stream sources are available, and thus we introduce an architectural design for the processing of geospatial image streams from multiple sources. With the aim of targeting collaborative scientific environments, we construct a realization of our architecture based on Kepler, a robust and widely used scientific workflow management system, as the underlying computational support; and open data and Web interface standards, as a means to facilitate the interoperability of GISP instances with other processing infrastructures and client applications. 
We demonstrate our

  18. System Runs Analysis with Process Mining

    Directory of Open Access Journals (Sweden)

    S. A. Shershakov

    2015-01-01

    Full Text Available Information systems (IS produce numerous traces and logs at runtime. In the context of SOA-based (service-oriented architecture IS, these logs contain details about sequences of process and service calls. Modern application monitoring and error tracking tools provide only rather straightforward log search and filtering functionality. However, “clever” analysis of the logs is highly useful, since it can provide valuable insights into the system architecture, interaction of business domains and services. Here we took runs event logs (trace data of a big booking system and discovered architectural guidelines violations and common anti-patterns. We applied mature process mining techniques for discovery and analysis of these logs. The aims of process mining are to discover, analyze, and improve processes on the basis of IS behavior recorded as event logs. In several specific examples, we show successful applications of process mining to system runtime analysis and motivate further research in this area.The article is published in the authors’ wording.
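A minimal sketch of one process-mining primitive that discovery algorithms build on: counting directly-follows relations between activities in event-log traces. The booking-system traces below are hypothetical, chosen only to show how an anti-pattern surfaces in the counts.

```python
from collections import Counter

def directly_follows(traces):
    """Count directly-follows pairs of activities across event-log traces."""
    pairs = Counter()
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            pairs[(a, b)] += 1
    return pairs

# Hypothetical booking-system traces; the second one skips the booking step.
logs = [
    ["search", "select", "book", "pay"],
    ["search", "select", "pay"],
    ["search", "select", "book", "pay"],
]
dfg = directly_follows(logs)
print(dfg[("select", "book")], dfg[("select", "pay")])   # 2 1
```

A rare edge such as ("select", "pay") in the resulting directly-follows graph is the kind of guideline violation or anti-pattern the analysis above aims to surface.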

  19. Detecting DNS Tunnels Using Character Frequency Analysis

    OpenAIRE

    Born, Kenton; Gustafson, David

    2010-01-01

    High-bandwidth covert channels pose significant risks to sensitive and proprietary information inside company networks. Domain Name System (DNS) tunnels provide a means to covertly infiltrate and exfiltrate large amounts of information across network boundaries. This paper explores the possibility of detecting DNS tunnels by analyzing the unigram, bigram, and trigram character frequencies of domains in DNS queries and responses. It is empirically shown how domains follow Zipf's law in a simil...
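A related single-statistic sketch of character-frequency analysis: encoded tunnel payloads tend to have flatter character distributions (higher entropy) than ordinary domain labels. The domains below are hypothetical, and the paper's actual method compares unigram, bigram, and trigram frequencies against expected distributions rather than using entropy alone.

```python
import math
from collections import Counter

def char_entropy(domain):
    """Shannon entropy (bits) of the first label's character distribution."""
    label = domain.split(".")[0].lower()
    counts = Counter(label)
    total = len(label)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

normal = "mail.example.com"
tunnel = "a9x2kq0zvb17msdjw8e4.example.com"   # hypothetical encoded payload
print(char_entropy(normal), char_entropy(tunnel))
# tunnel labels score markedly higher than natural-language labels
```

A detector would learn the frequency profile of legitimate traffic and flag queries whose labels deviate strongly from it.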

  20. Development of Quantum Devices and Algorithms for Radiation Detection and Radiation Signal Processing

    International Nuclear Information System (INIS)

    El Tokhy, M.E.S.M.E.S.

    2012-01-01

    The main functions of a spectroscopy system are signal detection, filtering and amplification, pileup detection and recovery, dead time correction, amplitude analysis and energy spectrum analysis. Safeguards isotopic measurements require the best spectrometer systems with excellent resolution, stability, efficiency and throughput. However, the resolution and throughput, which depend mainly on the detector, amplifier and the analog-to-digital converter (ADC), can still be improved. These modules have been in continuous development and improvement. For this reason we are interested both in the development of quantum detectors and in efficient algorithms for digital processing of the measurements. Therefore, the main objective of this thesis concentrates on: 1. studying the behavior of quantum dot (QD) devices under gamma radiation; 2. developing efficient algorithms for handling problems of gamma-ray spectroscopy. For gamma radiation detection, a detailed study of nanotechnology QD sources and infrared photodetectors (QDIPs) for gamma radiation detection is introduced. There are two different types of quantum scintillator detectors, which dominate the area of ionizing radiation measurements: QD scintillator detectors and QDIP scintillator detectors. By comparison with traditional systems, quantum systems have less mass, require less volume, and consume less power. These factors increase the need for efficient detectors for gamma-ray applications such as gamma-ray spectroscopy. Consequently, the potential of nanocomposite materials based on semiconductor quantum dots for radiation detection via scintillation has been demonstrated in the literature. Therefore, this thesis presents a theoretical analysis of the characteristics of QD sources and infrared photodetectors (QDIPs). A model of QD sources under incident gamma radiation detection is developed. A novel methodology is introduced to characterize the effect of gamma radiation on QD devices. The rate

  1. Detection and monitoring of neurotransmitters--a spectroscopic analysis.

    Science.gov (United States)

    Manciu, Felicia S; Lee, Kendall H; Durrer, William G; Bennet, Kevin E

    2013-01-01

    We demonstrate that confocal Raman mapping spectroscopy provides rapid, detailed, and accurate neurotransmitter analysis, enabling millisecond time resolution monitoring of biochemical dynamics. As a prototypical demonstration of the power of the method, we present real-time in vitro serotonin, adenosine, and dopamine detection, and dopamine diffusion in an inhomogeneous organic gel, which was used as a substitute for neurologic tissue.  Dopamine, adenosine, and serotonin were used to prepare neurotransmitter solutions in distilled water. The solutions were applied to the surfaces of glass slides, where they interdiffused. Raman mapping was achieved by detecting nonoverlapping spectral signatures characteristic of the neurotransmitters with an alpha 300 WITec confocal Raman system, using 532 nm neodymium-doped yttrium aluminum garnet laser excitation. Every local Raman spectrum was recorded in milliseconds and complete Raman mapping in a few seconds.  Without damage, dyeing, or preferential sample preparation, confocal Raman mapping provided positive detection of each neurotransmitter, allowing association of the high-resolution spectra with specific microscale image regions. Such information is particularly important for complex, heterogeneous samples, where changes in composition can influence neurotransmission processes. We also report an estimated dopamine diffusion coefficient two orders of magnitude smaller than that calculated by the flow-injection method.  Accurate nondestructive characterization for real-time detection of neurotransmitters in inhomogeneous environments without the requirement of sample labeling is a key issue in neuroscience. Our work demonstrates the capabilities of Raman spectroscopy in biological applications, possibly providing a new tool for elucidating the mechanism and kinetics of deep brain stimulation. © 2012 International Neuromodulation Society.

  2. Analysis of the Industrial Biodiesel Production Process

    International Nuclear Information System (INIS)

    Di Nicola, G.; Moglie, M.; Santori, G.

    2009-01-01

    Transesterification is the chemical reaction through which biodiesel is obtained from vegetable oils. The purpose of this work is to model carefully all the stages of various biodiesel production processes on the basis of recent experimental results. These results allow defining the proper thermodynamic models to be used, interpreting the phenomena correctly, and identifying the parameters that affect the process. The modelling was done with ASPENPLUS (R), defining three possible processes used industrially. A subsequent sensitivity analysis was performed for each process, allowing the identification of the optimal configurations. By comparing these solutions it is possible to choose the most efficient one and so reduce the cost of the final product.

  3. Ergonomic analysis of radiopharmaceuticals samples preparation process

    International Nuclear Information System (INIS)

    Gomes, Luciene Betzler C.; Santos, Isaac Luquetti dos; Fonseca, Antonio Carlos C. da; Pellini, Marcos Pinto; Rebelo, Ana Maria

    2005-01-01

    Doses of radioisotopes to be administered to patients for diagnosis or therapy are prepared in the radiopharmacy sector. The preparation process adopts techniques aimed at reducing the exposure time of the professionals and avoiding the absorption of excessive doses by patients. An ergonomic analysis of this process contributes to the prevention of occupational illness and of accident risks during routine work, providing well-being and safety to the users involved and conferring an adequate working standard on the process. In this context, studies are relevant that analyze the factors pointing toward the solution of problems and that establish proposals to minimize risks in carrying out the activities. Through a methodology based on the concepts of ergonomics, improvements in effectiveness and quality are sought, together with a reduction of the difficulties experienced by the workers. The prescribed work, established through codified norms and procedures, is compared with the work as actually carried out, the real work, with focus on the activities. The objective of this work is to discuss an ergonomic analysis of the radioisotope sample preparation process in the Setor de Radiofarmacia do Hospital Universitario Clementino Fraga Filho da Universidade Federal do Rio de Janeiro (UFRJ). (author)

  4. Onboard Detection of Snow, Ice, Clouds, and Other Processes

    Data.gov (United States)

    National Aeronautics and Space Administration — The detection of clouds within a satellite image is essential for retrieving surface geophysical parameters from optical and thermal imagery. Even a small percentage...

  5. An image processing pipeline to detect and segment nuclei in muscle fiber microscopic images.

    Science.gov (United States)

    Guo, Yanen; Xu, Xiaoyin; Wang, Yuanyuan; Wang, Yaming; Xia, Shunren; Yang, Zhong

    2014-08-01

    Muscle fiber images play an important role in the medical diagnosis and treatment of many muscular diseases. The number of nuclei in skeletal muscle fiber images is a key bio-marker in the diagnosis of muscular dystrophy. In nuclei segmentation one primary challenge is to correctly separate the clustered nuclei. In this article, we developed an image processing pipeline to automatically detect, segment, and analyze nuclei in microscopic images of muscle fibers. The pipeline consists of image pre-processing, identification of isolated nuclei, identification and segmentation of clustered nuclei, and quantitative analysis. Nuclei are initially extracted from the background by using a local Otsu's threshold. Based on analysis of morphological features of the isolated nuclei, including their areas, compactness, and major axis lengths, a Bayesian network is trained and applied to distinguish isolated nuclei from clustered nuclei and artifacts in all the images. Then a two-step refined watershed algorithm is applied to segment clustered nuclei. After segmentation, the nuclei can be quantified for statistical analysis. Comparing the segmented results with those of manual analysis and an existing technique, we find that our proposed image processing pipeline achieves good performance with high accuracy and precision. The presented image processing pipeline can therefore help biologists increase their throughput and objectivity in analyzing large numbers of nuclei in muscle fiber images. © 2014 Wiley Periodicals, Inc.
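
    The initial foreground-extraction stage can be sketched with a plain-NumPy Otsu threshold on a synthetic image. The pipeline applies Otsu locally per region; a global version is shown here for brevity, and the image is simulated rather than real microscopy data:

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's method: pick the threshold that maximizes the between-class
    variance over a 256-bin intensity histogram."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    total = img.size
    sum_all = float(np.dot(np.arange(256), hist))
    best_t, best_var = 0, 0.0
    w0, sum0 = 0, 0.0
    for t in range(256):
        w0 += hist[t]
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0 = sum0 / w0                      # mean of class below threshold
        m1 = (sum_all - sum0) / w1          # mean of class above threshold
        var = w0 * w1 * (m0 - m1) ** 2      # between-class variance (unnormalized)
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Synthetic image: dark background (~30) with one bright "nucleus" patch (~200).
rng = np.random.default_rng(0)
img = rng.normal(30, 5, (64, 64))
img[10:20, 10:20] = rng.normal(200, 5, (10, 10))
img = np.clip(img, 0, 255)

t = otsu_threshold(img)
mask = img > t   # binary foreground mask, input to the later pipeline stages
```

    The resulting mask would then feed the morphological-feature and watershed stages described above.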

  6. Integrating human factors into process hazard analysis

    International Nuclear Information System (INIS)

    Kariuki, S.G.; Loewe, K.

    2007-01-01

    A comprehensive process hazard analysis (PHA) needs to address human factors. This paper describes an approach that systematically identifies human error in process design and the human factors that influence its production and propagation. It is deductive in nature and therefore considers human error as a top event. The combinations of different factors that may lead to this top event are analysed. The approach is qualitative in nature and is used in combination with other PHA methods. The method has the advantage that it does not regard operator error as the sole contributor to human failure within a system, but rather as a combination of all underlying factors.

  7. An Advanced Platform for Biomolecular Detection and Analysis Systems

    National Research Council Canada - National Science Library

    Beebe, David J

    2005-01-01

    ...) agent detection has been demonstrated. The foundation of the approach is a new manufacturing process called MicroFluidic Tectonics that combines responsive hydrogel materials with novel liquid phase Microfluidic construction methods...

  8. Detection of a novel, integrative aging process suggests complex physiological integration.

    Science.gov (United States)

    Cohen, Alan A; Milot, Emmanuel; Li, Qing; Bergeron, Patrick; Poirier, Roxane; Dusseault-Bélanger, Francis; Fülöp, Tamàs; Leroux, Maxime; Legault, Véronique; Metter, E Jeffrey; Fried, Linda P; Ferrucci, Luigi

    2015-01-01

    Many studies of aging examine biomarkers one at a time, but complex systems theory and network theory suggest that interpretations of individual markers may be context-dependent. Here, we attempted to detect underlying processes governing the levels of many biomarkers simultaneously by applying principal components analysis to 43 common clinical biomarkers measured longitudinally in 3694 humans from three longitudinal cohort studies on two continents (Women's Health and Aging I & II, InCHIANTI, and the Baltimore Longitudinal Study on Aging). The first axis was associated with anemia, inflammation, and low levels of calcium and albumin. The axis structure was precisely reproduced in all three populations and in all demographic sub-populations (by sex, race, etc.); we call the process represented by the axis "integrated albunemia." Integrated albunemia increases and accelerates with age in all populations, and predicts mortality and frailty--but not chronic disease--even after controlling for age. This suggests a role in the aging process, though causality is not yet clear. Integrated albunemia behaves more stably across populations than its component biomarkers, and thus appears to represent a higher-order physiological process emerging from the structure of underlying regulatory networks. If this is correct, detection of this process has substantial implications for physiological organization more generally.
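
    The core computation, extracting a shared axis from many standardized biomarkers with principal components analysis, can be sketched as below. The data are synthetic, and the marker roles only loosely mimic the anemia/inflammation/albumin cluster described in the abstract:

```python
import numpy as np

def first_axis(X):
    """First principal axis of a standardized biomarker matrix (subjects x markers)."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    # Right singular vectors of the standardized data are the PC loadings.
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    return Vt[0]

# Toy data: 200 "subjects", 4 "markers". Markers 0-2 share a latent factor
# (marker 2 anti-correlated, like albumin in the axis above); marker 3 is noise.
rng = np.random.default_rng(1)
latent = rng.normal(size=200)
X = np.column_stack([
    latent + 0.3 * rng.normal(size=200),
    latent + 0.3 * rng.normal(size=200),
    -latent + 0.3 * rng.normal(size=200),
    rng.normal(size=200),
])

v = first_axis(X)
scores = ((X - X.mean(0)) / X.std(0)) @ v   # per-subject score on the shared axis
```

    In the study, the stability of such loadings across cohorts, not the loadings of any single marker, is what supports interpreting the axis as a physiological process.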

  9. Image corruption detection in diffusion tensor imaging for post-processing and real-time monitoring.

    Science.gov (United States)

    Li, Yue; Shea, Steven M; Lorenz, Christine H; Jiang, Hangyi; Chou, Ming-Chung; Mori, Susumu

    2013-01-01

    Due to the high sensitivity of diffusion tensor imaging (DTI) to physiological motion, clinical DTI scans often suffer a significant amount of artifacts. Tensor-fitting-based, post-processing outlier rejection is often used to reduce the influence of motion artifacts. Although it is an effective approach, when there are multiple corrupted data, this method may no longer correctly identify and reject the corrupted data. In this paper, we introduce a new criterion called "corrected Inter-Slice Intensity Discontinuity" (cISID) to detect motion-induced artifacts. We compared the performance of algorithms using cISID and other existing methods with regard to artifact detection. The experimental results show that the integration of cISID into fitting-based methods significantly improves the retrospective detection performance at post-processing analysis. The performance of the cISID criterion, if used alone, was inferior to the fitting-based methods, but cISID could effectively identify severely corrupted images with a rapid calculation time. In the second part of this paper, an outlier rejection scheme was implemented on a scanner for real-time monitoring of image quality and reacquisition of the corrupted data. The real-time monitoring, based on cISID and followed by post-processing, fitting-based outlier rejection, could provide a robust environment for routine DTI studies.
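
    A simplified stand-in for an inter-slice intensity-discontinuity criterion (not the paper's exact cISID definition, which includes a correction term) might look like this, on a simulated volume:

```python
import numpy as np

def slice_discontinuity(volume):
    """Score each slice by its intensity jump relative to its neighbours.
    Taking the minimum of the jumps to both neighbours isolates the corrupted
    slice itself rather than also flagging the slices next to it."""
    v = volume.astype(float)
    d = np.abs(np.diff(v, axis=0)).mean(axis=(1, 2))  # jump between slice i and i+1
    score = np.zeros(len(v))
    score[0], score[-1] = d[0], d[-1]
    score[1:-1] = np.minimum(d[:-1], d[1:])
    return score

rng = np.random.default_rng(2)
vol = rng.normal(100, 2, (20, 32, 32))
vol[7] *= 0.2                      # simulate motion-induced signal dropout in slice 7
s = slice_discontinuity(vol)
suspect = int(np.argmax(s))
```

    Because the score only needs adjacent-slice differences, it can run in real time during acquisition, which matches the monitoring-and-reacquisition scheme described in the second part of the paper.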

  12. Detection and Analysis of Circular RNAs by RT-PCR.

    Science.gov (United States)

    Panda, Amaresh C; Gorospe, Myriam

    2018-03-20

    Gene expression in eukaryotic cells is tightly regulated at the transcriptional and posttranscriptional levels. Posttranscriptional processes, including pre-mRNA splicing, mRNA export, mRNA turnover, and mRNA translation, are controlled by RNA-binding proteins (RBPs) and noncoding (nc)RNAs. The vast family of ncRNAs comprises diverse regulatory RNAs, such as microRNAs and long noncoding (lnc)RNAs, but also the poorly explored class of circular (circ)RNAs. Although first discovered more than three decades ago by electron microscopy, only the advent of high-throughput RNA-sequencing (RNA-seq) and the development of innovative bioinformatic pipelines have begun to allow the systematic identification of circRNAs (Szabo and Salzman, 2016; Panda et al., 2017b; Panda et al., 2017c). However, the validation of true circRNAs identified by RNA sequencing requires other molecular biology techniques including reverse transcription (RT) followed by conventional or quantitative (q) polymerase chain reaction (PCR), and Northern blot analysis (Jeck and Sharpless, 2014). RT-qPCR analysis of circular RNAs using divergent primers has been widely used for the detection, validation, and sometimes quantification of circRNAs (Abdelmohsen et al., 2015 and 2017; Panda et al., 2017b). As detailed here, divergent primers designed to span the circRNA backsplice junction sequence can specifically amplify the circRNAs and not the counterpart linear RNA. In sum, RT-PCR analysis using divergent primers allows direct detection and quantification of circRNAs.

  13. ANALYSIS AND PROCESSING OF ELECTROMYOGRAM SIGNALS

    Directory of Open Access Journals (Sweden)

    K. A. Zimenko

    2013-01-01

    Full Text Available A method of electromyogram (EMG) signal processing and identification for use in the control of rehabilitation devices is presented. The method is based on filtering out high-frequency components, which improves the signal-to-noise ratio, and on wavelet analysis for signal preprocessing, with the motion type classified by a trained artificial neural network. The obtained accuracy of motion-type classification is 94%.
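
    The wavelet-based feature extraction can be sketched with a hand-rolled Haar decomposition. The record does not specify the wavelet family or the network architecture, so the band-energy features below are only illustrative:

```python
import numpy as np

def haar_band_energies(x, levels=3):
    """Relative energies of Haar detail bands plus the final approximation,
    a compact feature vector that could feed a motion-type classifier."""
    x = np.asarray(x, dtype=float)
    energies = []
    for _ in range(levels):
        if len(x) % 2:
            x = x[:-1]
        approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-frequency half
        detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-frequency half
        energies.append(float(np.sum(detail ** 2)))
        x = approx
    energies.append(float(np.sum(x ** 2)))
    total = sum(energies)
    return [e / total for e in energies]

# A smooth low-frequency "contraction" keeps its energy in the approximation
# band; added high-frequency noise shifts energy into the first detail band.
t = np.linspace(0, 1, 256)
smooth = np.sin(2 * np.pi * 3 * t)
rng = np.random.default_rng(5)
noisy = smooth + 0.5 * rng.normal(size=256)

feat_smooth = haar_band_energies(smooth)
feat_noisy = haar_band_energies(noisy)
```

    Feature vectors of this kind would then be passed to the trained neural network for motion-type classification.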

  14. Early Detection of Breast Cancer on Mammograms Using: Perceptual Feedback, Computer Processed Images and Ultrasound

    National Research Council Canada - National Science Library

    Bloch, Peter

    1996-01-01

    ... (as predicted by the gaze duration), enhanced the detectability or masses. (2) Computer processing of screening mammograms for detection of clusters of microcalcifications and parenchyma patterns associated with developing lesions...

  15. Histology image analysis for carcinoma detection and grading.

    Science.gov (United States)

    He, Lei; Long, L Rodney; Antani, Sameer; Thoma, George R

    2012-09-01

    This paper presents an overview of image analysis techniques in the domain of histopathology, specifically for the objective of automated carcinoma detection and classification. As in other biomedical imaging areas such as radiology, many computer-assisted diagnosis (CAD) systems have been implemented to aid histopathologists and clinicians in cancer diagnosis and research; these systems attempt to significantly reduce the labor and subjectivity of traditional manual interpretation of histology images. The task of automated histology image analysis is usually not simple due to the unique characteristics of histology imaging, including the variability in image preparation techniques, clinical interpretation protocols, and the complex structures and very large size of the images themselves. In this paper we discuss those characteristics, provide relevant background information about slide preparation and interpretation, and review the application of digital image processing techniques to the field of histology image analysis. In particular, emphasis is given to state-of-the-art image segmentation methods for feature extraction and disease classification. Four major carcinomas, of the cervix, prostate, breast, and lung, are selected to illustrate the functions and capabilities of existing CAD systems. Published by Elsevier Ireland Ltd.

  16. Semiclassical analysis for diffusions and stochastic processes

    CERN Document Server

    Kolokoltsov, Vassili N

    2000-01-01

    The monograph is devoted mainly to the analytical study of the differential, pseudo-differential and stochastic evolution equations describing the transition probabilities of various Markov processes. These include (i) diffusions (in particular, degenerate diffusions), (ii) more general jump-diffusions, especially stable jump-diffusions driven by stable Lévy processes, (iii) complex stochastic Schrödinger equations which correspond to models of quantum open systems. The main results of the book concern the existence, two-sided estimates, path integral representation, and small time and semiclassical asymptotics for the Green functions (or fundamental solutions) of these equations, which represent the transition probability densities of the corresponding random processes. The boundary value problem for Hamiltonian systems and some spectral asymptotics are also discussed. Readers should have an elementary knowledge of probability, complex and functional analysis, and calculus.

  17. Outlier Detection with Space Transformation and Spectral Analysis

    DEFF Research Database (Denmark)

    Dang, Xuan-Hong; Micenková, Barbora; Assent, Ira

    2013-01-01

    Detecting a small number of outliers from a set of data observations is always challenging. In this paper, we present an approach that exploits space transformation and uses spectral analysis in the newly transformed space for outlier detection. Unlike most existing techniques in the literature w...

  18. Fault detection and isolation in processes involving induction machines

    Energy Technology Data Exchange (ETDEWEB)

    Zell, K.; Medvedev, A. [Control Engineering Group, Luleaa University of Technology, Luleaa (Sweden)

    1997-12-31

    A model-based technique for fault detection and isolation in electro-mechanical systems comprising induction machines is introduced. Two coupled state observers, one for the induction machine and another for the mechanical load, are used to detect and recognize fault-specific behaviors (fault signatures) from the real-time measurements of the rotor angular velocity and terminal voltages and currents. Practical applicability of the method is verified in full-scale experiments with a conveyor belt drive at SSAB, Luleaa Works. (orig.) 3 refs.

  19. Analysis of Exhaled Breath for Disease Detection

    Science.gov (United States)

    Amann, Anton; Miekisch, Wolfram; Schubert, Jochen; Buszewski, Bogusław; Ligor, Tomasz; Jezierski, Tadeusz; Pleil, Joachim; Risby, Terence

    2014-06-01

    Breath analysis is a young field of research with great clinical potential. As a result of this interest, researchers have developed new analytical techniques that permit real-time analysis of exhaled breath with breath-to-breath resolution in addition to the conventional central laboratory methods using gas chromatography-mass spectrometry. Breath tests are based on endogenously produced volatiles, metabolites of ingested precursors, metabolites produced by bacteria in the gut or the airways, or volatiles appearing after environmental exposure. The composition of exhaled breath may contain valuable information for patients presenting with asthma, renal and liver diseases, lung cancer, chronic obstructive pulmonary disease, inflammatory lung disease, or metabolic disorders. In addition, oxidative stress status may be monitored via volatile products of lipid peroxidation. Measurement of enzyme activity provides phenotypic information important in personalized medicine, whereas breath measurements provide insight into perturbations of the human exposome and can be interpreted as preclinical signals of adverse outcome pathways.

  20. Instrumental Analysis in Environmental Chemistry - Gas Phase Detection Systems

    Science.gov (United States)

    Stedman, Donald H.; Meyers, Philip A.

    1974-01-01

    Discusses advances made in chemical analysis instrumentation used in environmental monitoring. This first of two articles is concerned with analytical instrumentation in which detection and dispersion depend ultimately on the properties of gaseous molecules. (JR)

  1. Attack Pattern Analysis Framework for a Multiagent Intrusion Detection System

    Directory of Open Access Journals (Sweden)

    Krzysztof Juszczyszyn

    2008-08-01

    Full Text Available The paper proposes the use of an attack pattern ontology and a formal framework for network traffic anomaly detection within a distributed multi-agent Intrusion Detection System architecture. Our framework assumes ontology-based attack definition and a distributed processing scheme with exchange of messages between agents. The role of traffic anomaly detection is presented, and it is then discussed how specific values characterizing network communication can be used to detect network anomalies caused by security incidents (worm attacks, virus spreading). Finally, it is defined how to use the proposed techniques in a distributed IDS using the attack pattern ontology.

  2. Design and implementation of network attack analysis and detect system

    International Nuclear Information System (INIS)

    Lu Zhigang; Wu Huan; Liu Baoxu

    2007-01-01

    This paper first analyzes the present state of research on IDS (intrusion detection systems), and classifies and compares existing methods. To address problems in IDS such as false positives, false negatives and poor information visualization, this paper proposes a system named NAADS that supports multiple data sources. Through a series of methods such as clustering analysis, association analysis and visualization, the detection rate and usability of NAADS are increased. (authors)

  3. Fault detection in processes represented by PLS models using an EWMA control scheme

    KAUST Repository

    Harrou, Fouzi

    2016-10-20

    Fault detection is important for effective and safe process operation. Partial least squares (PLS) has been used successfully in fault detection for multivariate processes with highly correlated variables. However, the conventional PLS-based detection metrics, such as Hotelling's T² and the Q statistic, are not well suited to detect small faults because they only use information about the process in the most recent observation. The exponentially weighted moving average (EWMA), however, has been shown to be more sensitive to small shifts in the mean of process variables. In this paper, a PLS-based EWMA fault detection method is proposed for monitoring processes represented by PLS models. The performance of the proposed method is compared with that of the traditional PLS-based fault detection method through a simulated example involving various fault scenarios that could be encountered in real processes. The simulation results clearly show the effectiveness of the proposed method over the conventional PLS method.
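
    The monitoring idea, an EWMA chart applied to a detection residual, can be sketched as follows. The residual stream here is simulated rather than produced by an actual PLS model, and the chart parameters are conventional defaults, not the paper's:

```python
import numpy as np

def ewma_monitor(residuals, lam=0.2, L=3.0):
    """EWMA chart on a nominally zero-mean, unit-variance detection residual.
    Returns the sample indices at which the control limit is exceeded."""
    sigma_z = np.sqrt(lam / (2.0 - lam))   # steady-state std of the EWMA statistic
    z, alarms = 0.0, []
    for i, r in enumerate(residuals):
        z = lam * r + (1.0 - lam) * z      # exponentially weighted running mean
        if abs(z) > L * sigma_z:
            alarms.append(i)
    return alarms

# Simulated residual with a small one-sigma mean shift from sample 120 on,
# the kind of fault that single-observation statistics tend to miss.
rng = np.random.default_rng(3)
res = rng.normal(0.0, 1.0, 200)
res[120:] += 1.0
alarms = ewma_monitor(res)
```

    Because the EWMA statistic accumulates evidence across observations, a sustained small shift eventually pushes it past the control limit even though individual residuals stay within normal range.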

  4. Protecting Student Intellectual Property in Plagiarism Detection Process

    Science.gov (United States)

    Butakov, Sergey; Barber, Craig

    2012-01-01

    The rapid development of the Internet along with increasing computer literacy has made it easy and tempting for digital natives to copy-paste someone's work. Plagiarism is now a burning issue in education, industry and even in the research community. In this study, the authors concentrate on plagiarism detection with particular focus on the…

  5. Across frequency processes involved in auditory detection of coloration

    DEFF Research Database (Denmark)

    Buchholz, Jörg; Kerketsos, P

    2008-01-01

    filterbank was designed to approximate auditory filter-shapes measured by Oxenham and Shera [JARO, 2003, 541-554], derived from forward masking data. The results of the present study demonstrate that a “purely” spectrum-based model approach can successfully describe auditory coloration detection even at high...

  6. Automatic detection of micronuclei by cell microscopic image processing.

    Science.gov (United States)

    Bahreyni Toossi, Mohammad Taghi; Azimian, Hosein; Sarrafzadeh, Omid; Mohebbi, Shokoufeh; Soleymanifard, Shokouhozaman

    2017-12-01

    With the development and application of ionizing radiation in medicine, the effects of radiation on human health receive more and more attention. Ionizing radiation can lead to various forms of cytogenetic damage, including increased frequencies of micronuclei (MNi) and chromosome abnormalities. The cytokinesis-block micronucleus (CBMN) assay is a widely used method for measuring MNi to determine chromosome mutations or genome instability in cultured human lymphocytes. The visual scoring of MNi is time-consuming, and scorer fatigue can lead to inconsistency. In this work, we designed software for the scoring of the in vitro CBMN assay for biomonitoring on Giemsa-stained slides that overcomes many previous limitations. Automatic scoring proceeds in four stages as follows. First, overall segmentation of nuclei is done. Then, binucleated (BN) cells are detected. Next, the entire cell is estimated for each BN cell, as it is assumed that there is no detectable cytoplasm. Finally, MNi are detected within each BN cell. The designed software is even able to detect BN cells with vague cytoplasm and MNi in peripheral blood smears. Our system was tested on a self-provided dataset and achieved high sensitivities of about 98% and 82% in recognizing BN cells and MNi, respectively. Moreover, in our study fewer than 1% false positives were observed, which makes our system reliable for practical MNi scoring. Copyright © 2017 Elsevier B.V. All rights reserved.
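
    One of the shape features typically used to separate single nuclei from fused clusters in such pipelines, compactness, can be sketched as follows. The paper's actual decision stage is a trained classifier on several features; the fixed threshold below is purely illustrative:

```python
import numpy as np

def compactness(area, perimeter):
    """4*pi*A / P**2: equals 1.0 for a perfect disc and decreases as the
    boundary grows irregular or two nuclei fuse into one blob."""
    return 4.0 * np.pi * area / perimeter ** 2

def looks_isolated(area, perimeter, threshold=0.85):
    """Crude stand-in for the paper's trained classifier: treat compact
    objects as single nuclei. The threshold value is an assumption."""
    return compactness(area, perimeter) >= threshold

disc = compactness(np.pi * 10.0 ** 2, 2.0 * np.pi * 10.0)  # ideal round nucleus
blob = compactness(200.0, 60.0)                            # elongated 20x10 blob
```

    Objects failing such a compactness test would be routed to the watershed-style splitting stage rather than counted directly.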

  7. Detection of Epileptic Seizures with Multi-modal Signal Processing

    DEFF Research Database (Denmark)

    Conradsen, Isa

    The main focus of this dissertation lies within the area of epileptic seizure detection. Medically refractory epileptic patients suffer from the unawareness of when the next seizure sets in, and what the consequences will be. A wearable device based on uni- or multi-modalities able to detect and ...... implemented in a wireless sEMG device. A double-blind test on patients in the clinic, showed 100 % reliability for three of four patients, whereas it failed for the last patient, who had atypical GTC seizures....... and alarm whenever a seizure starts is of great importance to these patients and their relatives, in the sense, that the alert of the seizure will make them feel more safe. Thus the objective of the project is to investigate the movements of convulsive epileptic seizures and design seizure detection...... methods have been applied in different studies in order to achieve the goal of reliable seizure detection. In the first study we present a method where the support vector machine classifier is applied on features based on wavelet bands. This was used on multi-modal data from control subjects...

  8. Protecting Students' Intellectual Property in the Web Plagiarism Detection Process

    Science.gov (United States)

    Butakov, Sergey; Dyagilev, Vadim; Tskhay, Alexander

    2012-01-01

    Learning management systems (LMS) play a central role in communications in online and distance education. In the digital era, with all the information now accessible at students' fingertips, plagiarism detection services (PDS) have become a must-have part of LMS. Such integration provides a seamless experience for users, allowing PDS to check…

  9. Auto-regressive processes explained by self-organized maps. Application to the detection of abnormal behavior in industrial processes.

    Science.gov (United States)

    Brighenti, Chiara; Sanz-Bobi, Miguel Á

    2011-12-01

    This paper analyzes the expected time evolution of an auto-regressive (AR) process using self-organized maps (SOM). It investigates how a SOM captures the time information given by the AR input process and how the transitions from one neuron to another one can be understood under a probabilistic perspective. In particular, regions of the map into which the AR process is expected to move are identified. This characterization allows detecting anomalous changes in the AR process structure or parameters. On the basis of the theoretical results, an anomaly detection method is proposed and applied to a real industrial process.
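    The idea of detecting structural change in an AR process with a SOM can be sketched as follows: train a small map on lag vectors from a healthy process, then flag monitored vectors whose quantization error exceeds anything seen during training. This is a minimal pure-Python illustration, not the authors' method; the map size, embedding dimension, and max-error threshold are all assumptions.

```python
import random, math

random.seed(0)

def ar1(n, phi, noise=1.0):
    """Generate an AR(1) series x_t = phi * x_{t-1} + e_t."""
    x, out = 0.0, []
    for _ in range(n):
        x = phi * x + random.gauss(0.0, noise)
        out.append(x)
    return out

def lag_vectors(series, d=2):
    """Embed the series as overlapping lag vectors of dimension d."""
    return [series[i:i + d] for i in range(len(series) - d + 1)]

def train_som(data, n_units=10, epochs=20, lr0=0.5, radius0=3.0):
    """Train a 1-D SOM (chain of units) with decaying rate and radius."""
    dim = len(data[0])
    units = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(n_units)]
    steps, t = epochs * len(data), 0
    for _ in range(epochs):
        for v in data:
            lr = lr0 * (1 - t / steps)
            radius = max(radius0 * (1 - t / steps), 0.5)
            # best-matching unit (smallest squared distance)
            bmu = min(range(n_units),
                      key=lambda i: sum((units[i][k] - v[k]) ** 2 for k in range(dim)))
            for i in range(n_units):
                h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))
                for k in range(dim):
                    units[i][k] += lr * h * (v[k] - units[i][k])
            t += 1
    return units

def quantization_error(units, v):
    """Distance from v to its best-matching unit."""
    return min(math.sqrt(sum((u[k] - v[k]) ** 2 for k in range(len(v))))
               for u in units)

# Train on a healthy AR(1) process (phi = 0.8) ...
healthy = lag_vectors(ar1(2000, phi=0.8))
som = train_som(healthy)
threshold = max(quantization_error(som, v) for v in healthy)

# ... then monitor a process whose structure changed (phi = -0.8, more noise).
anomalous = lag_vectors(ar1(200, phi=-0.8, noise=3.0))
alarms = sum(quantization_error(som, v) > threshold for v in anomalous)
print(f"{alarms} of {len(anomalous)} monitored vectors flagged as anomalous")
```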

  10. Image Science and Analysis Group Spacecraft Damage Detection/Characterization

    Science.gov (United States)

    Wheaton, Ira M., Jr.

    2010-01-01

    This project consisted of several tasks that could be served by an intern to assist the ISAG in detecting damage to spacecraft during missions. First, this project focused on supporting the Micrometeoroid Orbital Debris (MMOD) damage detection and assessment for the Hubble Space Telescope (HST) using imagery from the last two HST Shuttle servicing missions. In this project, we used the coordinates of two windows on the Shuttle aft flight deck from where images were taken and the coordinates of three ID points in order to calculate the distance from each window to the three points. Then, using the specifications of the camera used, we calculated the image scale in pixels per inch for planes parallel to the image plane and for planes offset in the z-direction from the image plane (shown in Table 1). This will help in the future for calculating measurements of objects in the images. Next, tabulation and statistical analysis were conducted for the screening results (shown in Table 2) of imagery with Orion Thermal Protection System (TPS) damage. Using the Microsoft Excel CRITBINOM function and Goal Seek, the probabilities of detection of damage to different Shuttle tiles were calculated, as shown in Table 3. Using developed measuring tools, volume and area measurements will be created from 3D models of Orion TPS damage. Last, mathematical expertise was provided to the Photogrammetry Team. These mathematical tasks consisted of developing elegant image-space error equations for observations along 3D lines, circles, planes, etc., and checking proofs for minimal sets of sufficient multi-linear constraints. Some of the processes and resulting equations are displayed in Figure 1.
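    The image-scale computation described (pixels per inch for a plane parallel to the image plane) follows from the pinhole camera model: the scale is the focal length expressed in pixels divided by the object distance. A small sketch with hypothetical camera numbers; the actual Shuttle camera specifications are not given here.

```python
def image_scale_ppi(focal_length_mm, pixel_pitch_mm, distance_in):
    """Pixels per inch on a plane parallel to the image plane.

    Under the pinhole model, a 1-inch feature at `distance_in` inches
    projects to focal_length / distance of an inch on the sensor;
    dividing by the pixel pitch converts that length to pixels.
    """
    focal_length_px = focal_length_mm / pixel_pitch_mm
    return focal_length_px / distance_in

# Hypothetical numbers: 35 mm lens, 8.5 micron pixels, target 400 in away.
scale = image_scale_ppi(35.0, 0.0085, 400.0)
print(f"{scale:.2f} px/inch")
```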

  11. Automatic Detection of Optic Disc in Retinal Image by Using Keypoint Detection, Texture Analysis, and Visual Dictionary Techniques

    Directory of Open Access Journals (Sweden)

    Kemal Akyol

    2016-01-01

    Full Text Available With the advances in the computer field, methods and techniques in automatic image processing and analysis provide the opportunity to detect automatically the change and degeneration in retinal images. Localization of the optic disc is extremely important in computer-aided eye disease diagnosis systems for determining hard exudate lesions or neovascularization, the latter being a later phase of diabetic retinopathy. Whereas optic disc detection is a fairly easy process in normal retinal images, detecting this region in retinal images affected by diabetic retinopathy may be difficult. Sometimes the optic disc and hard exudates may appear the same in terms of machine learning features. We present a novel approach for efficient and accurate localization of the optic disc in retinal images containing noise and other lesions. This approach comprises five main steps: image processing, keypoint extraction, texture analysis, visual dictionary, and classifier techniques. We tested our proposed technique on 3 public datasets and obtained quantitative results. Experimental results show that an average optic disc detection accuracy of 94.38%, 95.00%, and 90.00% is achieved, respectively, on the following public datasets: DIARETDB1, DRIVE, and ROC.

  12. Reading Ability, Processing Load and the Detection of Intersentence Inconsistencies.

    Science.gov (United States)

    Grabe, Mark

    A study was conducted to determine the relationship between processing load and ability to locate text segments containing intersentence contradictions. It was hypothesized that less able readers fail to exhibit comprehension monitoring skills because most tasks overload their processing capacity. Subjects were 87 fourth and sixth grade students…

  13. Optimizing detection and analysis of slow waves in sleep EEG.

    Science.gov (United States)

    Mensen, Armand; Riedner, Brady; Tononi, Giulio

    2016-12-01

    Analysis of individual slow waves in EEG recordings during sleep provides both greater sensitivity and specificity compared to spectral power measures. However, parameters for detection and analysis have not been widely explored and validated. We present a new, open-source, Matlab-based toolbox for the automatic detection and analysis of slow waves, with adjustable parameter settings, as well as manual correction and exploration of the results using a multi-faceted visualization tool. We explore a large search space of parameter settings for slow wave detection and measure their effects on a selection of outcome parameters. Every choice of parameter setting had some effect on at least one outcome parameter. In general, the largest effect sizes were found when choosing the EEG reference, the type of canonical waveform, and amplitude thresholding. Previously published methods accurately detect large, global waves but are conservative and miss smaller-amplitude, local slow waves. The toolbox has additional benefits in terms of speed, user interface, and visualization options to compare and contrast slow waves. The exploration of parameter settings in the toolbox highlights the importance of careful selection of detection methods. The sensitivity and specificity of the automated detection can be improved by manually adding or deleting entire waves and/or specific channels using the toolbox visualization functions. The toolbox standardizes the detection procedure, sets the stage for reliable results and comparisons, and is easy to use without previous programming experience. Copyright © 2016 Elsevier B.V. All rights reserved.
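    A slow-wave detector of the general kind the toolbox parameterizes can be reduced to finding negative half-waves between zero crossings and applying amplitude and duration thresholds. The sketch below uses illustrative thresholds (75 µV trough, 0.25-1 s duration), not the toolbox's defaults, and a synthetic one-channel signal.

```python
import math

def detect_slow_waves(signal, fs, amp_uv=75.0, dur_range=(0.25, 1.0)):
    """Detect negative half-waves: spans between a downward and the next
    upward zero crossing whose trough exceeds amp_uv and whose duration
    falls inside dur_range (seconds). Returns (start, trough, end) indices."""
    waves, start = [], None
    for i in range(1, len(signal)):
        if signal[i - 1] >= 0 > signal[i]:                          # downward crossing
            start = i
        elif signal[i - 1] < 0 <= signal[i] and start is not None:  # upward crossing
            seg = signal[start:i]
            dur = (i - start) / fs
            trough = min(seg)
            if -trough >= amp_uv and dur_range[0] <= dur <= dur_range[1]:
                waves.append((start, start + seg.index(trough), i))
            start = None
    return waves

# Synthetic EEG at 100 Hz: low-amplitude 7 Hz background around one
# 100 uV negative half-wave lasting 0.5 s.
fs = 100
sig = [10 * math.sin(2 * math.pi * 7 * t / fs) for t in range(fs)]        # background
sig += [-100 * math.sin(math.pi * t / (fs // 2)) for t in range(fs // 2)] # slow wave
sig += [10 * math.sin(2 * math.pi * 7 * t / fs) for t in range(fs)]
found = detect_slow_waves(sig, fs)
print(found)
```

The background is rejected on both criteria (10 µV amplitude, ~0.07 s half-waves), so only the injected slow wave survives.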

  14. SCALABLE TIME SERIES CHANGE DETECTION FOR BIOMASS MONITORING USING GAUSSIAN PROCESS

    Data.gov (United States)

    National Aeronautics and Space Administration — SCALABLE TIME SERIES CHANGE DETECTION FOR BIOMASS MONITORING USING GAUSSIAN PROCESS VARUN CHANDOLA AND RANGA RAJU VATSAVAI Abstract. Biomass monitoring,...

  15. Flame analysis using image processing techniques

    Science.gov (United States)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

    This paper presents image processing techniques with the use of fuzzy logic and a neural network approach to perform flame analysis. Flame diagnostics is important in industry for extracting relevant information from flame images. Experimental tests were carried out in a model industrial burner at different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermo-acoustic oscillations and background noise affect the stability of the flame. Flame velocity is one of the important characteristics that determine the stability of a flame. In this paper, an image processing method is proposed to determine flame velocity. A power spectral density (PSD) graph is a good tool for vibration analysis, from which flame stability can be approximated. However, a more intelligent diagnostic system is needed to automatically determine flame stability. In this paper, flame features at different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.
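    The final step, mapping extracted flame features to a stability judgement, can be illustrated with a tiny Mamdani-style fuzzy inference sketch. The membership functions, rules, and normalized inputs below are invented for illustration and are not taken from the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def flame_stability(psd_peak, luminosity):
    """Two illustrative Mamdani-style rules on normalized [0, 1] features:
       low PSD oscillation peak AND high luminosity -> stable
       high PSD oscillation peak OR low luminosity  -> unstable
    Returns a stability score in [0, 1] via weighted-average defuzzification."""
    low_peak = tri(psd_peak, -0.5, 0.0, 0.5)
    high_peak = tri(psd_peak, 0.3, 1.0, 1.7)
    low_lum = tri(luminosity, -0.5, 0.0, 0.5)
    high_lum = tri(luminosity, 0.3, 1.0, 1.7)
    stable = min(low_peak, high_lum)      # AND -> min
    unstable = max(high_peak, low_lum)    # OR  -> max
    if stable + unstable == 0:
        return 0.5                        # no rule fires: undecided
    return (stable * 1.0 + unstable * 0.0) / (stable + unstable)

print(flame_stability(0.1, 0.9))  # quiet PSD, bright flame -> near 1 (stable)
print(flame_stability(0.9, 0.2))  # strong oscillation, dim flame -> near 0
```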

  16. Data analysis of inertial sensor for train positioning detection system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seong Jin; Park, Sung Soo; Lee, Jae Ho; Kang, Dong Hoon [Korea Railroad Research Institute, Uiwang (Korea, Republic of)

    2015-02-15

    Train positioning detection information is fundamental for high-speed railroad inspection, making it possible to simultaneously determine the status and evaluate the integrity of railroad equipment. This paper presents the results of measurements and an analysis of an inertial measurement unit (IMU) used as a positioning detection sensor. Acceleration and angular rate measurements from the IMU were analyzed in the amplitude and frequency domains, with a discussion of vibration and train motions. Using these results and GPS information, positioning detection of a Korean tilting train express was performed from Naju station to Illo station on the Honam line. The results of a synchronized analysis of sensor measurements and train motion can help in the design of a train location detection system and improve positioning detection performance.
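    Between GPS fixes, positioning from IMU acceleration amounts to double numerical integration. A sketch using the trapezoidal rule on a constant along-track acceleration; the actual processing (bias removal, vibration filtering, GPS fusion) is more involved than this.

```python
def integrate(samples, dt, y0=0.0):
    """Cumulative trapezoidal integration of evenly sampled data."""
    out = [y0]
    for i in range(1, len(samples)):
        out.append(out[-1] + 0.5 * (samples[i - 1] + samples[i]) * dt)
    return out

# Constant 0.5 m/s^2 along-track acceleration for 10 s, sampled at 100 Hz.
dt, n = 0.01, 1001
acc = [0.5] * n
vel = integrate(acc, dt)   # velocity: integral of acceleration
pos = integrate(vel, dt)   # position: integral of velocity
print(vel[-1], pos[-1])    # analytically 5 m/s and 25 m
```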

  17. Signal processing for solar array monitoring, fault detection, and optimization

    CERN Document Server

    Braun, Henry; Spanias, Andreas

    2012-01-01

    Although the solar energy industry has experienced rapid growth recently, high-level management of photovoltaic (PV) arrays has remained an open problem. As sensing and monitoring technology continues to improve, there is an opportunity to deploy sensors in PV arrays in order to improve their management. In this book, we examine the potential role of sensing and monitoring technology in a PV context, focusing on the areas of fault detection, topology optimization, and performance evaluation/data visualization. First, several types of commonly occurring PV array faults are considered and detection algorithms are described. Next, the potential for dynamic optimization of an array's topology is discussed, with a focus on mitigation of fault conditions and optimization of power output under non-fault conditions. Finally, monitoring system design considerations such as type and accuracy of measurements, sampling rate, and communication protocols are considered. It is our hope that the benefits of monitoring presen...

  18. Real-time Microseismic Processing for Induced Seismicity Hazard Detection

    Energy Technology Data Exchange (ETDEWEB)

    Matzel, Eric M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-10-31

    Induced seismicity is inherently associated with underground fluid injections. If fluids are injected in proximity to a pre-existing fault or fracture system, the resulting elevated pressures can trigger dynamic earthquake slip, which could both damage surface structures and create new migration pathways. The goal of this research is to develop a fundamentally better approach to geological site characterization and early hazard detection. We combine innovative techniques for analyzing microseismic data with a physics-based inversion model to forecast microseismic cloud evolution. The key challenge is that faults at risk of slipping are often too small to detect during the site characterization phase. Our objective is to devise fast-running methodologies that will allow field operators to respond quickly to changing subsurface conditions.

  19. Preliminary Hazards Analysis Plasma Hearth Process

    International Nuclear Information System (INIS)

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P.

    1993-11-01

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standard DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. The PSAR then leads to performance of the Final Safety Analysis Report, performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment

  20. Scintillator for radiation detection and process for producing the same

    International Nuclear Information System (INIS)

    Ishii, M.; Akiyama, S.; Ishibashi, H.

    1985-01-01

    A scintillator for radiation detection obtained by coating a light-reflective material, in a thickness of 50 to 150 μm, by a screen printing method on the surface of a solid scintillator material substrate is excellent in uniformity and dimensional accuracy, with high light output. When the light-reflective material layer is covered with a synthetic resin film, the adhesive strength of the light-reflective material layer to the substrate is increased remarkably

  1. Deterring digital plagiarism, how effective is the digital detection process?

    OpenAIRE

    Jayati Chaudhuri

    2008-01-01

    Academic dishonesty or plagiarism is a growing problem in today's digital world. Use of plagiarism detection tools can assist faculty to combat this form of academic dishonesty. In this article, a special emphasis is given to text-matching software called SafeAssignmentTM. The advantages and disadvantages of using automated text-matching software are discussed and analyzed in detail.

  2. Utilizing Biomimetric Image Processing to Rapidly Detect Rollover Threats

    Science.gov (United States)

    2006-11-01

    algorithm based on the neurobiology of insect vision, specifically the vision of a fly. The system consists of a Long-Wavelength Infrared (LWIR...fly eye-based vision, or biomimetric edge vision, is based on the neurobiology of insect vision, since insects rely on edge detection to avoid...GPS, compass, and dead reckoning. Figure 6 shows an example of a potential DVI display for the driver. Figure 6: Conceptual Design for the DVI

  3. POST-PROCESSING ANALYSIS FOR THC SEEPAGE

    Energy Technology Data Exchange (ETDEWEB)

    Y. SUN

    2004-09-29

    This report describes the selection of water compositions for the total system performance assessment (TSPA) model of results from the thermal-hydrological-chemical (THC) seepage model documented in ''Drift-Scale THC Seepage Model'' (BSC 2004 [DIRS 169856]). The selection has been conducted in accordance with ''Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Post-Processing Analysis for THC Seepage) Report Integration'' (BSC 2004 [DIRS 171334]). This technical work plan (TWP) was prepared in accordance with AP-2.27Q, ''Planning for Science Activities''. Section 1.2.3 of the TWP describes planning information pertaining to the technical scope, content, and management of this report. The post-processing analysis for THC seepage (THC-PPA) documented in this report provides a methodology for evaluating the near-field compositions of water and gas around a typical waste emplacement drift as these relate to the chemistry of seepage, if any, into the drift. The THC-PPA inherits the conceptual basis of the THC seepage model, but is an independently developed process. The relationship between the post-processing analysis and other closely related models, together with their main functions in providing seepage chemistry information for the Total System Performance Assessment for the License Application (TSPA-LA), are illustrated in Figure 1-1. The THC-PPA provides a data selection concept and direct input to the physical and chemical environment (P&CE) report that supports the TSPA model. The purpose of the THC-PPA is further discussed in Section 1.2. The data selection methodology of the post-processing analysis (Section 6.2.1) was initially applied to results of the THC seepage model as presented in ''Drift-Scale THC Seepage Model'' (BSC 2004 [DIRS 169856]). Other outputs from the THC seepage model (DTN

  4. Thermoreflectance spectroscopy—Analysis of thermal processes in semiconductor lasers

    Science.gov (United States)

    Pierścińska, D.

    2018-01-01

    This review focuses on theoretical foundations, experimental implementation and an overview of experimental results of the thermoreflectance spectroscopy as a powerful technique for temperature monitoring and analysis of thermal processes in semiconductor lasers. This is an optical, non-contact, high spatial resolution technique providing high temperature resolution and mapping capabilities. Thermoreflectance is a thermometric technique based on measuring of relative change of reflectivity of the surface of laser facet, which provides thermal images useful in hot spot detection and reliability studies. In this paper, principles and experimental implementation of the technique as a thermography tool is discussed. Some exemplary applications of TR to various types of lasers are presented, proving that thermoreflectance technique provides new insight into heat management problems in semiconductor lasers and in particular, that it allows studying thermal degradation processes occurring at laser facets. Additionally, thermal processes and basic mechanisms of degradation of the semiconductor laser are discussed.

  5. Detection and quantification of flow consistency in business process models

    DEFF Research Database (Denmark)

    Burattin, Andrea; Bernstein, Vered; Neurauter, Manuel

    2017-01-01

    Business process models abstract complex business processes by representing them as graphical models. Their layout, as determined by the modeler, may have an effect when these models are used. However, this effect is currently not fully understood. In order to systematically study this effect, a basic set of measurable key visual features is proposed, depicting the layout properties that are meaningful to the human user. The aim of this research is thus twofold: first, to empirically identify key visual features of business process models which are perceived as meaningful to the user and second, to show how such features can be quantified into computational metrics, which are applicable to business process models. We focus on one particular feature, consistency of flow direction, and show the challenges that arise when transforming it into a precise metric. We propose three different metrics...

  6. Warpage analysis in injection moulding process

    Science.gov (United States)

    Hidayah, M. H. N.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.

    2017-09-01

    This study was concentrated on the effects of process parameters in plastic injection moulding process towards warpage problem by using Autodesk Moldflow Insight (AMI) software for the simulation. In this study, plastic dispenser of dental floss has been analysed with thermoplastic material of Polypropylene (PP) used as the moulded material and details properties of 80 Tonne Nessei NEX 1000 injection moulding machine also has been used in this study. The variable parameters of the process are packing pressure, packing time, melt temperature and cooling time. Minimization of warpage obtained from the optimization and analysis data from the Design Expert software. Integration of Response Surface Methodology (RSM), Center Composite Design (CCD) with polynomial models that has been obtained from Design of Experiment (DOE) is the method used in this study. The results show that packing pressure is the main factor that will contribute to the formation of warpage in x-axis and y-axis. While in z-axis, the main factor is melt temperature and packing time is the less significant among the four parameters in x, y and z-axes. From optimal processing parameter, the value of warpage in x, y and z-axis have been optimised by 21.60%, 26.45% and 24.53%, respectively.
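    Fitting a second-order response surface to DOE runs and locating its stationary point, as RSM does, can be sketched for a single factor with an ordinary least-squares quadratic fit. The packing-pressure and warpage numbers below are invented for illustration, not taken from the study.

```python
def solve3(A, b):
    """Gaussian elimination with partial pivoting for a 3x3 linear system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def fit_quadratic(xs, ys):
    """Least-squares fit of y = a + b*x + c*x^2 via the normal equations."""
    S = lambda k: sum(x ** k for x in xs)
    T = lambda k: sum(y * x ** k for x, y in zip(xs, ys))
    A = [[S(0), S(1), S(2)], [S(1), S(2), S(3)], [S(2), S(3), S(4)]]
    return solve3(A, [T(0), T(1), T(2)])

# Hypothetical DOE runs: warpage (mm) vs packing pressure (MPa).
pressure = [60, 70, 80, 90, 100]
warpage = [0.42, 0.31, 0.27, 0.30, 0.41]
a, b, c = fit_quadratic(pressure, warpage)
best_p = -b / (2 * c)  # stationary point of the fitted parabola
print(f"minimum-warpage packing pressure ~ {best_p:.1f} MPa")
```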

  7. Sustainable process design & analysis of hybrid separations

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Befort, Bridgette; Garg, Nipun

    2016-01-01

    Distillation is an energy intensive operation in chemical process industries. There are around 40,000 distillation columns in operation in the US, requiring approximately 40% of the total energy consumption in US chemical process industries. However, analysis of separations by distillation has shown that more than 50% of energy is spent in purifying the last 5-10% of the distillate product. Membrane modules on the other hand can achieve high purity separations at lower energy costs, but if the flux is high, it requires large membrane area. A hybrid scheme where distillation and membrane modules are combined such that each operates at its highest efficiency, has the potential for significant energy reduction without significant increase of capital costs. This paper presents a method for sustainable design of hybrid distillation-membrane schemes with guaranteed reduction of energy...

  8. Mathematical foundations of image processing and analysis

    CERN Document Server

    Pinoli, Jean-Charles

    2014-01-01

    Mathematical Imaging is currently a rapidly growing field in applied mathematics, with an increasing need for theoretical mathematics. This book, the second of two volumes, emphasizes the role of mathematics as a rigorous basis for imaging sciences. It provides a comprehensive and convenient overview of the key mathematical concepts, notions, tools and frameworks involved in the various fields of gray-tone and binary image processing and analysis, by proposing a large, but coherent, set of symbols and notations, a complete list of subjects and a detailed bibliography. It establishes a bridg

  9. ANALYSIS OF COMPUTER AIDED PROCESS PLANNING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Salim A. Saleh

    2013-05-01

    Full Text Available Computer Aided Process Planning (CAPP) has been recognized as playing a key role in Computer Integrated Manufacturing (CIM). It has been used as a bridge to link CAD with CAM systems, in order to offer the possibility of full integration, in agreement with computer engineering, to introduce CIM. The benefits of CAPP in the real industrial environment are still to be achieved. Due to different manufacturing applications, many different CAPP systems have been developed. The development of CAPP techniques calls for a summarized classification and a descriptive analysis. This paper presents the most important and famous techniques of the available CAPP systems, which are based on the variant, generative or semi-generative methods, and a descriptive analysis of their application possibilities.

  10. Auxetic polyurethane foam: Manufacturing and processing analysis

    Science.gov (United States)

    Jahan, Md Deloyer

    experimental design approach to identify significant processing parameters, followed by optimization of those processing parameters in the fabrication of auxetic PU foam. A split-plot factorial design has been selected for screening purposes. Response Surface Methodology (RSM) has been utilized to optimize the processing parameters in the fabrication of auxetic PU foam. Two different designs, named Box-Behnken and I-optimal, have been employed for this analysis. The results obtained by those designs show that the I-optimal design provides more accurate and realistic results than the Box-Behnken design when experiments are performed in a split-plot manner. Finally, a near-stationary ridge system is obtained by the optimization analysis. As a result, a set of operating conditions is obtained that produces a similar minimum Poisson's ratio in auxetic PU foam.

  11. Digital Printing Quality Detection and Analysis Technology Based on CCD

    Science.gov (United States)

    He, Ming; Zheng, Liping

    2017-12-01

    With the help of CCD-based digital printing quality detection and analysis technology, rapid evaluation and objective detection of printing quality can be carried out, and a degree of control over printing quality can be exercised. It can be said that the rational application of CCD digital printing quality testing and analysis technology plays a very positive role in improving the quality of digital printing and of the various printing equipment and materials. In this paper, we present an in-depth study and discussion based on CCD digital print quality testing and analysis technology.

  12. Information Design for “Weak Signal” detection and processing in Economic Intelligence: A case study on Health resources

    Directory of Open Access Journals (Sweden)

    Sahbi Sidhom

    2011-12-01

    Full Text Available The topics of this research cover all phases of “Information Design” applied to detect and profit from weak signals in economic intelligence (EI) or business intelligence (BI). The field of information design (ID) applies to the process of translating complex, unorganized or unstructured data into valuable and meaningful information. ID practice requires an interdisciplinary approach, which combines skills in graphic design (writing, analysis processing and editing), human performance technology and human factors. Applied in the context of an information system, it allows end-users to easily detect implicit topics known as “weak signals” (WS). In our approach to implementing ID, the processes cover the development of a knowledge management (KM) process in the context of EI. A case study concerning the monitoring of health resources information is presented, using ID processes to outline weak signals. Both French and American bibliographic databases were applied to make the connection to multilingual concepts in the health watch process.

  13. Detection of wood failure by image processing method: influence of algorithm, adhesive and wood species

    Science.gov (United States)

    Lanying Lin; Sheng He; Feng Fu; Xiping Wang

    2015-01-01

    Wood failure percentage (WFP) is an important index for evaluating the bond strength of plywood. Currently, the method used for detecting WFP is visual inspection, which lacks efficiency. In order to improve it, image processing methods are applied to wood failure detection. The present study used thresholding and K-means clustering algorithms in wood failure detection...
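    K-means-based wood failure scoring of the kind described can be sketched as 1-D 2-means clustering of grayscale values, with the wood failure percentage taken as the share of pixels on the fibre side of the resulting threshold. The synthetic pixel data and the dark-fibre assumption below are illustrative, not from the study.

```python
def kmeans2_threshold(pixels, iters=20):
    """1-D 2-means clustering of grayscale values; returns the midpoint
    between the two final centroids as a binarization threshold."""
    lo, hi = min(pixels), max(pixels)
    c = [lo + (hi - lo) / 4, hi - (hi - lo) / 4]  # spread initial centroids
    for _ in range(iters):
        groups = ([], [])
        for p in pixels:
            groups[0 if abs(p - c[0]) <= abs(p - c[1]) else 1].append(p)
        c = [sum(g) / len(g) if g else c[i] for i, g in enumerate(groups)]
    return (c[0] + c[1]) / 2

def wood_failure_percentage(pixels, wood_is_dark=True):
    """WFP = share of pixels on the wood-fibre side of the threshold."""
    t = kmeans2_threshold(pixels)
    dark = sum(p < t for p in pixels)
    return 100.0 * (dark if wood_is_dark else len(pixels) - dark) / len(pixels)

# Synthetic sheared-joint image: 60% dark wood fibre (~60) vs glue line (~180).
pixels = [60 + (i % 7) for i in range(600)] + [180 + (i % 9) for i in range(400)]
print(f"WFP = {wood_failure_percentage(pixels):.1f}%")
```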

  14. Pipeline leak detection and location by on-line-correlation with a process computer

    International Nuclear Information System (INIS)

    Siebert, H.; Isermann, R.

    1977-01-01

    A method for leak detection using a correlation technique in pipelines is described. For leak detection and also for leak localisation and estimation of the leak flow recursive estimation algorithms are used. The efficiency of the methods is demonstrated with a process computer and a pipeline model operating on-line. It is shown that very small leaks can be detected. (orig.) [de
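    A closely related standard technique for leak localization is to cross-correlate the noise recorded by two sensors and estimate the differential arrival delay from the correlation peak. The sketch below illustrates that correlation idea only, not the paper's recursive estimation algorithms; the pipe-position formula in the comment is the usual textbook relation, stated here as an illustration.

```python
import random

random.seed(1)

def cross_correlation_lag(x, y, max_lag):
    """Lag (in samples) at which the cross-correlation of x and y peaks."""
    best_lag, best = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        s = sum(x[i] * y[i + lag]
                for i in range(max(0, -lag), min(len(x), len(y) - lag)))
        if s > best:
            best, best_lag = s, lag
    return best_lag

# Simulated leak noise reaching sensor B 15 samples later than sensor A.
noise = [random.gauss(0, 1) for _ in range(3000)]
delay = 15
a = noise[delay:]   # sensor A hears the leak first
b = noise[:-delay]  # sensor B: same signal, delayed
lag = cross_correlation_lag(a, b, max_lag=50)
print(lag)

# With pipe length L, wave speed v and sampling rate fs, the leak then sits
# roughly at x = (L - v * lag / fs) / 2 from sensor A.
```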

  15. People detection in nuclear plants by video processing for safety purpose

    International Nuclear Information System (INIS)

    Jorge, Carlos Alexandre F.; Mol, Antonio Carlos A.; Seixas, Jose M.; Silva, Eduardo Antonio B.; Cota, Raphael E.; Ramos, Bruno L.

    2011-01-01

    This work describes the development of a surveillance system for safety purposes in nuclear plants. The final objective is to track people online in videos, in order to estimate the dose received by personnel, during the execution of working tasks in nuclear plants. The estimation will be based on their tracked positions and on dose rate mapping in a real nuclear plant at Instituto de Engenharia Nuclear, Argonauta nuclear research reactor. Cameras have been installed within Argonauta's room, supplying the data needed. Both video processing and statistical signal processing techniques may be used for detection, segmentation and tracking people in video. This first paper reports people segmentation in video using background subtraction, by two different approaches, namely frame differences, and blind signal separation based on the independent component analysis method. Results are commented, along with perspectives for further work. (author)
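    Background subtraction by frame differencing, the first of the two approaches mentioned, reduces to thresholding the per-pixel absolute difference between the current frame and a background frame. A toy sketch on list-of-list "images"; the threshold value is an assumption.

```python
def frame_difference(frame, background, threshold=25):
    """Binary foreground mask: pixels whose absolute difference from the
    background frame exceeds the threshold."""
    return [[1 if abs(p - b) > threshold else 0
             for p, b in zip(row_f, row_b)]
            for row_f, row_b in zip(frame, background)]

# Toy 4x6 grayscale frames: static background plus a bright 2x2 "person".
background = [[50] * 6 for _ in range(4)]
frame = [row[:] for row in background]
for r in (1, 2):
    for c in (2, 3):
        frame[r][c] = 200

mask = frame_difference(frame, background)
moving_pixels = sum(map(sum, mask))
print(moving_pixels)  # the four changed pixels are segmented as foreground
```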

  16. People detection in nuclear plants by video processing for safety purpose

    Energy Technology Data Exchange (ETDEWEB)

    Jorge, Carlos Alexandre F.; Mol, Antonio Carlos A., E-mail: calexandre@ien.gov.b, E-mail: mol@ien.gov.b [Instituto de Engenharia Nuclear (IEN/CNEN), Rio de Janeiro, RJ (Brazil); Seixas, Jose M.; Silva, Eduardo Antonio B., E-mail: seixas@lps.ufrj.b, E-mail: eduardo@lps.ufrj.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Eletrica; Cota, Raphael E.; Ramos, Bruno L., E-mail: brunolange@poli.ufrj.b [Universidade Federal do Rio de Janeiro (EP/UFRJ), RJ (Brazil). Dept. de Engenharia Eletronica e de Computacao

    2011-07-01

    This work describes the development of a surveillance system for safety purposes in nuclear plants. The final objective is to track people online in videos, in order to estimate the dose received by personnel, during the execution of working tasks in nuclear plants. The estimation will be based on their tracked positions and on dose rate mapping in a real nuclear plant at Instituto de Engenharia Nuclear, Argonauta nuclear research reactor. Cameras have been installed within Argonauta's room, supplying the data needed. Both video processing and statistical signal processing techniques may be used for detection, segmentation and tracking people in video. This first paper reports people segmentation in video using background subtraction, by two different approaches, namely frame differences, and blind signal separation based on the independent component analysis method. Results are commented, along with perspectives for further work. (author)

  17. Quantum Chemical Strain Analysis For Mechanochemical Processes.

    Science.gov (United States)

    Stauch, Tim; Dreuw, Andreas

    2017-04-18

    The use of mechanical force to initiate a chemical reaction is an efficient alternative to the conventional sources of activation energy, i.e., heat, light, and electricity. Applications of mechanochemistry in academic and industrial laboratories are diverse, ranging from chemical syntheses in ball mills and ultrasound baths to direct activation of covalent bonds using an atomic force microscope. The vectorial nature of force is advantageous because specific covalent bonds can be preconditioned for rupture by selective stretching. However, the influence of mechanical force on single molecules is still not understood at a fundamental level, which limits the applicability of mechanochemistry. As a result, many chemists still resort to rules of thumb when it comes to conducting mechanochemical syntheses. In this Account, we show that comprehension of mechanochemistry at the molecular level can be tremendously advanced by quantum chemistry, in particular by using quantum chemical force analysis tools. One such tool is the JEDI (Judgement of Energy DIstribution) analysis, which provides a convenient approach to analyze the distribution of strain energy in a mechanically deformed molecule. Based on the harmonic approximation, the strain energy contribution is calculated for each bond length, bond angle and dihedral angle, thus providing a comprehensive picture of how force affects molecules. This Account examines the theoretical foundations of quantum chemical force analysis and provides a critical overview of the performance of the JEDI analysis in various mechanochemical applications. We explain in detail how this analysis tool is to be used to identify the "force-bearing scaffold" of a distorted molecule, which allows both the rationalization and the optimization of diverse mechanochemical processes. 
More precisely, we show that the inclusion of every bond, bending and torsion of a molecule allows a particularly insightful discussion of the distribution of mechanical
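    The harmonic approximation underlying a JEDI-style analysis can be illustrated with a toy calculation: each internal coordinate (bond, angle, dihedral) with force constant k and displacement dq contributes a strain energy of 0.5*k*dq**2, and the per-coordinate share of the total identifies the force-bearing scaffold. All coordinate names, force constants and displacements below are invented for illustration and are not real quantum-chemical data.

    ```python
    # Illustrative sketch of the harmonic-approximation idea behind a
    # JEDI-style strain analysis. Numbers are made up, not computed data.

    coords = {              # name: (force constant, displacement from equilibrium)
        "C1-C2 bond":      (400.0, 0.050),
        "C2-C3 bond":      (400.0, 0.010),
        "C1-C2-C3 angle":  (60.0,  0.020),
    }

    # Harmonic strain energy per internal coordinate: E_i = 0.5 * k_i * dq_i^2
    energies = {name: 0.5 * k * dq ** 2 for name, (k, dq) in coords.items()}
    total = sum(energies.values())
    shares = {name: 100.0 * e / total for name, e in energies.items()}
    # The most-stretched bond carries the bulk of the strain energy and would
    # be identified as part of the "force-bearing scaffold".
    ```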

  18. Framework for hyperspectral image processing and quantification for cancer detection during animal tumor surgery

    Science.gov (United States)

    Lu, Guolan; Wang, Dongsheng; Qin, Xulei; Halig, Luma; Muller, Susan; Zhang, Hongzheng; Chen, Amy; Pogue, Brian W.; Chen, Zhuo Georgia; Fei, Baowei

    2015-01-01

    Abstract. Hyperspectral imaging (HSI) is an imaging modality that holds strong potential for rapid cancer detection during image-guided surgery. However, HSI data often need to be processed appropriately in order to extract the maximum useful information that differentiates cancer from normal tissue. We proposed a framework for hyperspectral image processing and quantification, which comprises image preprocessing, glare removal, feature extraction, and finally image classification. The framework has been tested on images from mice with head and neck cancer, using spectra from the 450- to 900-nm wavelength range. The image analysis computed Fourier coefficients, normalized reflectance, mean, and spectral derivatives for improved accuracy. The experimental results demonstrated the feasibility of the hyperspectral image processing and quantification framework for cancer detection during animal tumor surgery, in a challenging setting where sensitivity can be low due to the modest number of features present, but where the potential for fast image classification is high. This HSI approach may have potential application in tumor margin assessment during image-guided surgery, where speed of assessment may be the dominant factor. PMID:26720879
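    Two of the spectral features listed above, normalized reflectance and a spectral derivative, can be sketched per pixel as simple band-wise operations. The function names, the band count, the wavelength step and the spectrum values are illustrative assumptions, not the paper's actual implementation.

    ```python
    # Hedged sketch of two per-pixel spectral features: normalized
    # reflectance and a first spectral derivative (finite difference
    # across wavelength bands). Spectrum values are invented.

    def normalize_reflectance(spectrum):
        """Scale a spectrum so its bands sum to 1, removing intensity offsets."""
        s = sum(spectrum)
        return [v / s for v in spectrum]

    def spectral_derivative(spectrum, step_nm=10.0):
        """First-difference approximation of d(reflectance)/d(wavelength)."""
        return [(b - a) / step_nm for a, b in zip(spectrum, spectrum[1:])]

    pixel = [0.20, 0.25, 0.35, 0.20]   # reflectance in 4 hypothetical bands
    norm = normalize_reflectance(pixel)
    deriv = spectral_derivative(pixel)
    ```

    Normalization suppresses illumination differences between pixels, while the derivative emphasizes the shape of the spectrum, which is what distinguishes tissue types.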

  19. Framework for hyperspectral image processing and quantification for cancer detection during animal tumor surgery.

    Science.gov (United States)

    Lu, Guolan; Wang, Dongsheng; Qin, Xulei; Halig, Luma; Muller, Susan; Zhang, Hongzheng; Chen, Amy; Pogue, Brian W; Chen, Zhuo Georgia; Fei, Baowei

    2015-01-01

    Hyperspectral imaging (HSI) is an imaging modality that holds strong potential for rapid cancer detection during image-guided surgery. However, HSI data often need to be processed appropriately in order to extract the maximum useful information that differentiates cancer from normal tissue. We proposed a framework for hyperspectral image processing and quantification, which comprises image preprocessing, glare removal, feature extraction, and finally image classification. The framework has been tested on images from mice with head and neck cancer, using spectra from the 450- to 900-nm wavelength range. The image analysis computed Fourier coefficients, normalized reflectance, mean, and spectral derivatives for improved accuracy. The experimental results demonstrated the feasibility of the hyperspectral image processing and quantification framework for cancer detection during animal tumor surgery, in a challenging setting where sensitivity can be low due to the modest number of features present, but where the potential for fast image classification is high. This HSI approach may have potential application in tumor margin assessment during image-guided surgery, where speed of assessment may be the dominant factor.

  20. Detection of Optimum Maturity of Maize Using Image Processing

    African Journals Online (AJOL)

    Ayuba et al.

    2017-04-13

    Apr 13, 2017 ... A CCD camera was used for image acquisition of the different green colorations of the maize leaves at maturity. Different color features were extracted with an image processing system (MATLAB) and used as inputs to the artificial neural network that classifies the different levels of maturity. Keywords: Maize ...

  1. Investigating the Process of Process Modeling with Eye Movement Analysis

    OpenAIRE

    Pinggera, Jakob; Furtner, Marco; Martini, Markus; Sachse, Pierre; Reiter, Katharina; Zugal, Stefan; Weber, Barbara

    2015-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models by analyzing the modeler's interactions with the modeling environment. In this paper we aim to complement previous insights on the modeler's modeling behavior with data gathered by tracking the modeler's eye movements when engaged in the act of modeling. We present preliminary results and outline directions for future research to triangulate toward a more comprehensive un...

  2. Multisensor network system for wildfire detection using infrared image processing.

    Science.gov (United States)

    Bosch, I; Serrano, A; Vergara, L

    2013-01-01

    This paper presents the next step in the evolution of multi-sensor wireless network systems in the early automatic detection of forest fires. This network allows remote monitoring of each of the locations as well as communication between each of the sensors and with the control stations. The result is an increased coverage area, with quicker and safer responses. To determine the presence of a forest wildfire, the system employs decision fusion in thermal imaging, which can exploit various expected characteristics of a real fire, including short-term persistence and long-term increases over time. Results from testing in the laboratory and in a real environment are presented to authenticate and verify the accuracy of the operation of the proposed system. The system performance is gauged by the number of alarms and the time to the first alarm (corresponding to a real fire), for different probability of false alarm (PFA). The necessity of including decision fusion is thereby demonstrated.

  3. Identifying time measurement tampering in the traversal time and hop count analysis (TTHCA) wormhole detection algorithm.

    Science.gov (United States)

    Karlsson, Jonny; Dooley, Laurence S; Pulkkis, Göran

    2013-05-17

    Traversal time and hop count analysis (TTHCA) is a recent wormhole detection algorithm for mobile ad hoc networks (MANET) which provides enhanced detection performance against all wormhole attack variants and network types. TTHCA involves each node measuring the processing time of routing packets during the route discovery process and then delivering the measurements to the source node. In a participation mode (PM) wormhole where malicious nodes appear in the routing tables as legitimate nodes, the time measurements can potentially be altered so preventing TTHCA from successfully detecting the wormhole. This paper analyses the prevailing conditions for time tampering attacks to succeed for PM wormholes, before introducing an extension to the TTHCA detection algorithm called ∆T Vector which is designed to identify time tampering, while preserving low false positive rates. Simulation results confirm that the ∆T Vector extension is able to effectively detect time tampering attacks, thereby providing an important security enhancement to the TTHCA algorithm.
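    The cross-checking intuition behind time-tampering detection can be illustrated with a toy consistency test. This is not the published ∆T Vector algorithm, only a sketch of the underlying idea: if the per-hop processing times reported along a route do not add up to the end-to-end measurement, the report is suspect. All timings, the function name and the tolerance are invented.

    ```python
    # Toy illustration (not the ∆T Vector algorithm itself) of cross-checking
    # per-hop processing-time reports against an end-to-end measurement.

    def report_is_suspicious(per_hop_ms, end_to_end_ms, tolerance_ms=2.0):
        """Flag a route whose reported per-hop times do not account for the
        measured end-to-end time (e.g. a wormhole hiding a long tunnel)."""
        return abs(sum(per_hop_ms) - end_to_end_ms) > tolerance_ms

    honest   = report_is_suspicious([1.0, 1.5, 1.2], 3.8)   # hops sum to 3.7 ms
    tampered = report_is_suspicious([1.0, 0.1, 1.2], 9.5)   # 7+ ms unaccounted for
    ```

    In practice the tolerance would have to absorb legitimate queuing and propagation delays, which is exactly the tuning problem the paper's false-positive analysis addresses.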

  4. Identifying Time Measurement Tampering in the Traversal Time and Hop Count Analysis (TTHCA Wormhole Detection Algorithm

    Directory of Open Access Journals (Sweden)

    Jonny Karlsson

    2013-05-01

    Full Text Available Traversal time and hop count analysis (TTHCA) is a recent wormhole detection algorithm for mobile ad hoc networks (MANET) which provides enhanced detection performance against all wormhole attack variants and network types. TTHCA involves each node measuring the processing time of routing packets during the route discovery process and then delivering the measurements to the source node. In a participation mode (PM) wormhole where malicious nodes appear in the routing tables as legitimate nodes, the time measurements can potentially be altered so preventing TTHCA from successfully detecting the wormhole. This paper analyses the prevailing conditions for time tampering attacks to succeed for PM wormholes, before introducing an extension to the TTHCA detection algorithm called ∆T Vector which is designed to identify time tampering, while preserving low false positive rates. Simulation results confirm that the ∆T Vector extension is able to effectively detect time tampering attacks, thereby providing an important security enhancement to the TTHCA algorithm.

  5. Differential Leukocyte Counting via Fluorescent Detection and Image Processing on a Centrifugal Microfluidic Platform.

    Science.gov (United States)

    Balter, Max L; Chen, Alvin I; Colinco, C Amara; Gorshkov, Alexander; Bixon, Brian; Martin, Vincent; Fromholtz, Alexander; Maguire, Timothy J; Yarmush, Martin L

    2016-12-21

    Centrifugal microfluidics has received much attention in the last decade for the automation of blood testing at the point-of-care, specifically for the detection of chemistries, proteins, and nucleic acids. However, the detection of common blood cells on-disc, particularly leukocytes, remains a challenge. In this paper, we present two analytical methods for enumerating leukocytes on a centrifugal platform using a custom-built fluorescent microscope, acridine orange nuclear staining, and image processing techniques. In the first method, cell analysis is performed in glass capillary tubes; in the second, acrylic chips are used. A bulk-cell analysis approach is implemented in both cases where the pixel areas of fractionated lymphocyte/monocyte and granulocyte layers are correlated with cell counts. Generating standard curves using porcine blood sample controls, we observed strong linear fits to measured cell counts using both methods. Analyzing the pixel intensities of the fluorescing white cell region, we are able to differentiate lymphocytes from monocytes via pixel clustering, demonstrating the capacity to perform a 3-part differential. Finally, a discussion of pros and cons of the bulk-cell analysis approach concludes the paper.
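    The bulk-cell calibration described above amounts to fitting a linear standard curve from the pixel area of a fractionated cell layer to a reference cell count. Below is a minimal sketch with fabricated calibration pairs; the helper name `fit_line` and all numbers are assumptions, not the paper's data.

    ```python
    # Hedged sketch of a standard-curve calibration: ordinary least squares
    # mapping layer pixel area -> cell count. Calibration data is invented.

    def fit_line(xs, ys):
        """Ordinary least-squares fit of y = a*x + b."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
        return a, my - a * mx

    areas  = [100.0, 200.0, 300.0, 400.0]      # pixel area of the cell layer
    counts = [1000.0, 2000.0, 3000.0, 4000.0]  # reference cell counts
    a, b = fit_line(areas, counts)
    predicted = a * 250.0 + b                  # count estimated for a new sample
    ```

    A strong linear fit on the controls, as the paper reports for porcine blood, is what justifies reading unknown samples off the curve.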

  6. Flow analysis with chemiluminescence detection: Recent advances and applications.

    Science.gov (United States)

    Timofeeva, Irina I; Vakh, Christina S; Bulatov, Andrey V; Worsfold, Paul J

    2018-03-01

    This article highlights the most important developments in flow analysis with chemiluminescence (CL) detection, describing different flow systems that are compatible with CL detection, detector designs, commonly applied CL reactions and approaches to sample treatment. Recent applications of flow analysis with CL detection (focusing on outputs published since 2010) are also presented. Applications are classified by sample matrix, covering foods and beverages, environmental matrices, pharmaceuticals and biological fluids. Comprehensive tables are provided for each area, listing the specific sample matrix, CL reaction used, linear range, limit of detection and sample treatment for each analyte. Finally, recent and emerging trends in the field are also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Semi-Disk Structure

    Science.gov (United States)

    2018-01-01

    ARL-TR-8271 (January 2018), US Army Research Laboratory: An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Semi-Disk Structure, by Kwok F Tom, Sensors and Electron Devices Directorate.

  8. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform

    Science.gov (United States)

    2018-01-01

    ARL-TR-8270 (January 2018), US Army Research Laboratory: An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform, by Kwok F Tom, Sensors and Electron Devices Directorate. Reporting period: 1 October 2016–30 September 2017.

  9. Analysis of microphysical processes in fog

    Science.gov (United States)

    Li, Yunlong; Hoogeboom, Peter; Russchenberg, Herman W. J.; Klein Baltink, H.

    2014-10-01

    The microphysical processes in fog are examined based on an analysis of four fog events captured by the in-situ and remote sensing synergy at the Cabauw Experimental Site for Atmospheric Research (CESAR) in the western part of the Netherlands. A 35 GHz cloud radar at CESAR was used in "fog mode" for the first time in the campaign. In this paper, the microphysical parameterization of fog is first introduced as the basis for analyzing the microphysical processes in the lifecycle of fog. The general microphysical characteristics of the four fog events are studied, and the key microphysical parameters related to fog (droplet number concentration, liquid water content, mean radius, and spectral standard deviation) are found to be lower than those at other sites, owing to the low aerosol concentration at Cabauw. The dominant processes in fog are investigated from the relationships among the key microphysical parameters. The positive correlations between each pair of parameters in the lifecycle stages of a stratus-fog case suggest that the dominant scheme in fog is droplet activation with subsequent hygroscopic growth and/or droplet evaporation, which is also supported by the combined observations of visibility and radar reflectivity. The shape of the fog drop size distribution regularly broadens and then narrows over the whole lifecycle. However, other mechanisms may exist, although they do not dominate. Collision-coalescence is a significant factor for the continued growth of big fog droplets once they have reached certain sizes in the mature stage. In the datasets, the collision-coalescence process could be identified from the unusual negative correlations among the key microphysical parameters in the lifecycle of another stratus-fog case, and the temporal evolutions of droplet number concentration, mean radius, spectral width, visibility and radar reflectivity show evidence of it.
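    The correlation diagnostic used above can be sketched numerically: a positive correlation between droplet number concentration and mean radius is consistent with activation and hygroscopic growth (or evaporation), while a negative correlation points to collision-coalescence consuming small droplets to grow big ones. The time series below are invented for illustration.

    ```python
    # Hedged sketch of the sign-of-correlation diagnostic on invented
    # droplet number concentration (N) and mean radius (r) time series.

    def pearson(xs, ys):
        """Pearson correlation coefficient of two equal-length series."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    # Activation/growth regime: N and r rise together.
    n_act = [50, 80, 120, 160];  r_act = [3.0, 3.5, 4.2, 4.8]
    # Collision-coalescence regime: N falls while r keeps growing.
    n_coal = [160, 120, 80, 50]; r_coal = [4.0, 4.6, 5.3, 6.0]
    ```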

  10. Accelerating Malware Detection via a Graphics Processing Unit

    Science.gov (United States)

    2010-09-01

    Indexed excerpt (acronym list and bibliography): GPU = Graphics Processing Unit; PE = Portable Executable; COFF = Common Object File Format. "The PE format is an updated version of the common object file format (COFF) [Mic06]."

  11. Steam leak detection method in pipeline using histogram analysis

    International Nuclear Information System (INIS)

    Kim, Se Oh; Jeon, Hyeong Seop; Son, Ki Sung; Chae, Gyung Sun; Park, Jong Won

    2015-01-01

    Leak detection in a pipeline usually relies on contact-type acoustic emission sensors. Such sensors are difficult to install and cannot operate in areas of high temperature and radiation. Recently, therefore, many researchers have studied leak detection using cameras. Camera-based leak detection has the advantages of long-distance monitoring and wide-area surveillance. However, the conventional leak detection method based on difference images often mistakes the vibration of a structure for a leak. In this paper, we propose a method for steam leakage detection using the moving average of difference images and histogram analysis. The proposed method can separate leakage from structural vibration. Its working performance is verified by comparison with experimental results.
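    The separation of persistent leakage from structural vibration can be sketched as follows, on tiny invented grayscale frames: vibration produces small alternating differences, while a turbulent steam plume keeps changing the same pixels strongly, so their time-averaged difference stays large. The function names and thresholds are illustrative assumptions, not the paper's implementation.

    ```python
    # Hedged sketch: average difference images over a window, then count
    # persistently changing pixels to flag a leak. Frames are invented.

    def mean_abs_diff(frames):
        """Pixelwise mean of |frame[t] - frame[t-1]| over the window."""
        h, w = len(frames[0]), len(frames[0][0])
        acc = [[0.0] * w for _ in range(h)]
        for prev, cur in zip(frames, frames[1:]):
            for i in range(h):
                for j in range(w):
                    acc[i][j] += abs(cur[i][j] - prev[i][j])
        n = len(frames) - 1
        return [[v / n for v in row] for row in acc]

    def leak_detected(avg, level=20.0, min_pixels=2):
        """Leak if enough pixels show large averaged change."""
        hot = sum(1 for row in avg for v in row if v > level)
        return hot >= min_pixels

    # 2x2 scene: left column vibrates mildly (+/-5), right column is a
    # turbulent "steam plume" whose intensity fluctuates strongly.
    frames = [
        [[10, 10], [10, 10]],
        [[15, 60], [ 5, 70]],
        [[10, 20], [10, 15]],
        [[15, 65], [ 5, 72]],
    ]
    avg = mean_abs_diff(frames)
    ```

    The vibrating pixels average to a small difference (5 gray levels here) and fall below the histogram threshold, while the plume pixels stay far above it.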

  12. Detecting fire in video stream using statistical analysis

    Directory of Open Access Journals (Sweden)

    Koplík Karel

    2017-01-01

    Full Text Available The real-time detection of fire in a video stream is one of the most interesting problems in computer vision. In most cases it would be desirable to have a fire detection algorithm implemented in standard industrial cameras, or to be able to replace standard industrial cameras with ones implementing the fire detection algorithm. In this paper, we present a new algorithm for detecting fire in video. The algorithm is based on tracking suspicious regions in time and statistically analysing their trajectories. False alarms are minimized by combining multiple detection criteria: pixel brightness, the trajectories of suspicious regions (for evaluating the characteristic fire flickering), and the persistence of the alarm state across a sequence of frames. The resulting implementation is fast and can therefore run on a wide range of affordable hardware.

  13. Improvement in Limit of Detection of Enzymatic Biogas Sensor Utilizing Chromatography Paper for Breath Analysis.

    Science.gov (United States)

    Motooka, Masanobu; Uno, Shigeyasu

    2018-02-02

    Breath analysis is considered to be an effective method for point-of-care diagnosis due to its noninvasiveness, quickness and simplicity. Gas sensors for breath analysis require detection of low-concentration substances. In this paper, we propose that reduction of the background current improves the limit of detection of enzymatic biogas sensors utilizing chromatography paper. After clarifying the cause of the background current, we reduced it by improving the fabrication process of the paper-based sensors. Finally, we evaluated the limit of detection of the sensor with a sample vapor of ethanol gas. The experiment showed about a 50% reduction of the limit of detection compared to a previously reported sensor. This result suggests the possibility of the sensor being applied in diagnosis, such as for diabetes, by further lowering the limit of detection.

  14. Improvement in Limit of Detection of Enzymatic Biogas Sensor Utilizing Chromatography Paper for Breath Analysis

    Directory of Open Access Journals (Sweden)

    Masanobu Motooka

    2018-02-01

    Full Text Available Breath analysis is considered to be an effective method for point-of-care diagnosis due to its noninvasiveness, quickness and simplicity. Gas sensors for breath analysis require detection of low-concentration substances. In this paper, we propose that reduction of the background current improves the limit of detection of enzymatic biogas sensors utilizing chromatography paper. After clarifying the cause of the background current, we reduced it by improving the fabrication process of the paper-based sensors. Finally, we evaluated the limit of detection of the sensor with a sample vapor of ethanol gas. The experiment showed about a 50% reduction of the limit of detection compared to a previously reported sensor. This result suggests the possibility of the sensor being applied in diagnosis, such as for diabetes, by further lowering the limit of detection.

  15. URBAN DETECTION, DELIMITATION AND MORPHOLOGY: COMPARATIVE ANALYSIS OF SELECTIVE "MEGACITIES"

    Directory of Open Access Journals (Sweden)

    B. Alhaddad

    2012-08-01

    Full Text Available Over the last 50 years, the world has faced an impressive growth of its urban population. The walled city, closed to the outside, an "island" of economic activity and population density within the rural land, has given way to the spread of urban life and urban networks across almost all the territory. There was, as Margalef (1999) said, "a topological inversion of the landscape". The "urban" has gone from being an island in the vast ocean of rural land to representing the totality of the space into which natural and rural "systems" are inserted. New phenomena, such as the fall of the Fordist model of production, the spread of urbanization known as urban sprawl, and the change of scale of the metropolis, covering increasingly large regions called "megalopolis" (Gottmann, 1961), have characterized the century. However, there are no rigorous databases capable of measuring and evaluating the phenomenon of megacities and, in general, the process of urbanization in the contemporary world. The aim of this paper is to detect, identify and analyze the morphology of megacities through remote sensing instruments as well as various landscape indicators. To understand the structure of these heterogeneous landscapes called megacities, land consumption and spatial complexity need to be quantified accurately. Remote sensing can help evaluate how the different land covers shape urban megaregions. Morphological landscape analysis allows establishing the analogies and differences between patterns of cities and studying the symmetry, growth direction, linearity, complexity and compactness of the urban form. The main objective of this paper is to develop a new methodology to detect the urbanized land of some megacities around the world (Tokyo, Mexico, Chicago, New York, London, Moscow, Sao Paulo and Shanghai) using Landsat 7 images.

  16. Detection and categorization of bacteria habitats using shallow linguistic analysis.

    Science.gov (United States)

    Karadeniz, İlknur; Özgür, Arzucan

    2015-01-01

    Information regarding bacteria biotopes is important for several research areas including health sciences, microbiology, and food processing and preservation. One of the challenges for scientists in these domains is the huge amount of information buried in the text of electronic resources. Developing methods to automatically extract bacteria habitat relations from the text of these electronic resources is crucial for facilitating research in these areas. We introduce a linguistically motivated rule-based approach for recognizing and normalizing names of bacteria habitats in biomedical text by using an ontology. Our approach is based on a shallow syntactic analysis of the text that includes sentence segmentation, part-of-speech (POS) tagging, partial parsing, and lemmatization. In addition, we propose two methods for identifying bacteria habitat localization relations. The underlying assumption of the first method is that discourse changes with a new paragraph; it therefore operates on a paragraph basis. The second method performs a more fine-grained analysis of the text and operates on a sentence basis. We also develop a novel anaphora resolution method for bacteria coreferences and incorporate it into the sentence-based relation extraction approach. We participated in the Bacteria Biotope (BB) Task of the BioNLP Shared Task 2013. Our system (Boun) achieved the second best performance with a 68% Slot Error Rate (SER) in Sub-task 1 (Entity Detection and Categorization), and ranked third with an F-score of 27% in Sub-task 2 (Localization Event Extraction). This paper reports the system implemented for the shared task, including the novel methods developed and the improvements obtained after the official evaluation. The extensions include the expansion of the OntoBiotope ontology using the training set for Sub-task 1, and the novel sentence-based relation extraction method incorporated with anaphora resolution for Sub-task 2. These extensions resulted in

  17. Nonparametric signal processing validation in T-wave alternans detection and estimation.

    Science.gov (United States)

    Goya-Esteban, R; Barquero-Pérez, O; Blanco-Velasco, M; Caamaño-Fernández, A J; García-Alberola, A; Rojo-Álvarez, J L

    2014-04-01

    Although a number of methods have been proposed for T-Wave Alternans (TWA) detection and estimation, their performance strongly depends on their signal processing stages and on the tuning of their free parameters. The dependence of system quality on the main signal processing stages of TWA algorithms has not yet been studied. This study seeks to optimize the final performance of the system by successive comparisons of pairs of TWA analysis systems, with a single processing difference between them. For this purpose, a set of decision statistics is proposed to evaluate the performance, and a nonparametric hypothesis test (based on bootstrap resampling) is used to make systematic decisions. Both the temporal method (TM) and the spectral method (SM) are analyzed in this study. The experiments were carried out on two datasets: first, on semisynthetic signals with artificial alternant waves and added noise; second, on two public Holter databases with different documented risk of sudden cardiac death. For semisynthetic signals (SNR = 15 dB), the optimization procedure reduced the power of the TWA amplitude estimation errors by 34.0% (TM) and 5.2% (SM), and reduced the power of the error probability by 74.7% (SM). For the Holter databases, appropriate tuning of several processing blocks led to a larger intergroup separation between the two populations for TWA amplitude estimation. Our proposal can be used as a systematic procedure for signal processing block optimization in TWA algorithm implementations.
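    The bootstrap comparison of two processing pipelines can be sketched as a generic paired bootstrap test on per-record error differences. This is not the paper's exact decision statistics, only the resampling idea, and the error values below are synthetic.

    ```python
    # Hedged sketch of a paired bootstrap hypothesis test: resample the
    # centered error differences and ask how often the resampled mean is as
    # extreme as the observed one. Error values are synthetic.

    import random

    def bootstrap_pvalue(errors_a, errors_b, n_boot=2000, seed=0):
        """Two-sided bootstrap test on the mean of paired error differences."""
        rng = random.Random(seed)
        diffs = [a - b for a, b in zip(errors_a, errors_b)]
        observed = sum(diffs) / len(diffs)
        centered = [d - observed for d in diffs]   # enforce the null: mean 0
        extreme = 0
        for _ in range(n_boot):
            sample = [rng.choice(centered) for _ in diffs]
            if abs(sum(sample) / len(sample)) >= abs(observed):
                extreme += 1
        return extreme / n_boot

    errors_a = [0.90, 0.85, 0.95, 0.88, 0.92, 0.91, 0.87, 0.93]  # pipeline A
    errors_b = [0.50, 0.55, 0.48, 0.52, 0.47, 0.53, 0.51, 0.49]  # pipeline B
    p = bootstrap_pvalue(errors_a, errors_b)
    ```

    A small p-value suggests that pipeline B's lower errors are not a resampling artifact, which is how one processing block can be preferred over another systematically.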

  18. A fully integrated electrochemical biosensor platform fabrication process for cytokines detection.

    Science.gov (United States)

    Baraket, Abdoullatif; Lee, Michael; Zine, Nadia; Sigaud, Monique; Bausells, Joan; Errachid, Abdelhamid

    2017-07-15

    The biomarkers interleukin-1β (IL-1β) and interleukin-10 (IL-10) are among the many antigens secreted in acute stages of inflammation after left ventricular assist device (LVAD) implantation in patients suffering from heart failure (HF). In the present study, we have developed a fully integrated electrochemical biosensor platform for cytokine detection at minute concentrations. With eight gold working microelectrodes (WEs), the design increases the sensitivity of detection, decreases the measurement time, and allows simultaneous detection of different cytokine biomarkers. The biosensor platform was fabricated on silicon substrates using silicon technology. Monoclonal antibodies (mAb) against human IL-1β and human IL-10 were electroaddressed onto the gold WEs through functionalization with 4-carboxymethyl aryl diazonium (CMA). Cyclic voltammetry (CV) was applied during the WE functionalization process to characterize the gold WE surface properties. Finally, electrochemical impedance spectroscopy (EIS) characterized the modified gold WEs. The biosensor platform was highly sensitive to the corresponding cytokines and no interference with other cytokines was observed. Both cytokines, IL-10 and IL-1β, were detected within the range of 1 pg/mL to 15 pg/mL. The present electrochemical biosensor platform is very promising for the multi-detection of biomolecules, which can dramatically decrease the time of analysis. This can provide clinicians and doctors with data concerning cytokine secretion at minute concentrations and the prediction of the first signs of inflammation after LVAD implantation. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. ECG Signal Analysis and Arrhythmia Detection using Wavelet Transform

    Science.gov (United States)

    Kaur, Inderbir; Rajni, Rajni; Marwaha, Anupma

    2016-12-01

    Electrocardiogram (ECG) is used to record the electrical activity of the heart. The ECG signal is non-stationary in nature, which makes analysis and interpretation of the signal very difficult. Hence, accurate analysis of the ECG signal with a powerful tool like the discrete wavelet transform (DWT) becomes imperative. In this paper, the ECG signal is denoised to remove artifacts and analyzed using the wavelet transform to detect the QRS complex and arrhythmia. This work is implemented in MATLAB for the MIT/BIH Arrhythmia database and yields a sensitivity of 99.85%, a positive predictivity of 99.92% and a detection error rate of 0.221% with the wavelet transform. It is also inferred that the DWT outperforms the principal component analysis technique in the detection of the ECG signal.
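    The role of the wavelet transform in QRS detection can be illustrated with a single-level Haar transform in pure Python: the steep slopes of the QRS complex produce large detail coefficients, which can be thresholded to locate beats. The synthetic "ECG", the threshold and the helper names are assumptions; the paper's own implementation uses MATLAB and the MIT/BIH database.

    ```python
    # Hedged sketch of wavelet-style QRS detection with a single-level Haar
    # transform. The signal is a synthetic spike train, not real ECG data.

    def haar_detail(signal):
        """Single-level Haar detail coefficients (scaled pairwise differences)."""
        return [(signal[i] - signal[i + 1]) / 2.0
                for i in range(0, len(signal) - 1, 2)]

    def detect_beats(signal, threshold=0.3):
        """Indices (in coefficient space) where the detail magnitude is large,
        i.e. where the signal has a steep QRS-like slope."""
        d = haar_detail(signal)
        return [i for i, c in enumerate(d) if abs(c) > threshold]

    # Flat baseline with two sharp "R waves" at samples 4 and 12.
    ecg = [0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0,
           0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0]
    beats = detect_beats(ecg)
    ```

    A practical detector would use several decomposition levels of a smoother wavelet and an adaptive threshold, but the principle is the same: QRS energy concentrates in the detail bands.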

  20. Multisensor Network System for Wildfire Detection Using Infrared Image Processing

    Directory of Open Access Journals (Sweden)

    I. Bosch

    2013-01-01

    Full Text Available This paper presents the next step in the evolution of multi-sensor wireless network systems in the early automatic detection of forest fires. This network allows remote monitoring of each of the locations as well as communication between each of the sensors and with the control stations. The result is an increased coverage area, with quicker and safer responses. To determine the presence of a forest wildfire, the system employs decision fusion in thermal imaging, which can exploit various expected characteristics of a real fire, including short-term persistence and long-term increases over time. Results from testing in the laboratory and in a real environment are presented to authenticate and verify the accuracy of the operation of the proposed system. The system performance is gauged by the number of alarms and the time to the first alarm (corresponding to a real fire), for different probability of false alarm (PFA). The necessity of including decision fusion is thereby demonstrated.

  1. Multisensor Network System for Wildfire Detection Using Infrared Image Processing

    Science.gov (United States)

    Bosch, I.; Serrano, A.; Vergara, L.

    2013-01-01

    This paper presents the next step in the evolution of multi-sensor wireless network systems in the early automatic detection of forest fires. This network allows remote monitoring of each of the locations as well as communication between each of the sensors and with the control stations. The result is an increased coverage area, with quicker and safer responses. To determine the presence of a forest wildfire, the system employs decision fusion in thermal imaging, which can exploit various expected characteristics of a real fire, including short-term persistence and long-term increases over time. Results from testing in the laboratory and in a real environment are presented to authenticate and verify the accuracy of the operation of the proposed system. The system performance is gauged by the number of alarms and the time to the first alarm (corresponding to a real fire), for different probability of false alarm (PFA). The necessity of including decision fusion is thereby demonstrated. PMID:23843734

  2. Traffic analysis and control using image processing

    Science.gov (United States)

    Senthilkumar, K.; Ellappan, Vijayan; Arun, A. R.

    2017-11-01

    This paper presents work to date on traffic analysis and control. It shows an approach to regulating traffic using image processing and MATLAB. The concept compares computed images with reference images of the street in order to determine the traffic level percentage and to set the traffic signal timing accordingly, reducing stoppage at traffic lights. It proposes to solve real-life street scenarios by enriching traffic lights with image receivers such as HD cameras and image processors. The input is imported into MATLAB and used as a method for calculating the traffic on roads. The results are computed in order to adjust the traffic light timings on a particular street, also with respect to other similar proposals, but with the added value of solving a real, large instance.
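    The traffic-level estimation described above can be sketched as a changed-pixel percentage between a reference image of the empty street and the current camera image, which then scales the green-signal duration. All images, thresholds and timing constants below are invented; the original work uses MATLAB.

    ```python
    # Hedged sketch: traffic level as the fraction of pixels that differ from
    # an empty-road reference, mapped to a green-light duration. Invented data.

    def traffic_level(empty_road, current, threshold=30):
        """Percentage of pixels differing noticeably from the empty road."""
        total = changed = 0
        for erow, crow in zip(empty_road, current):
            for e, c in zip(erow, crow):
                total += 1
                changed += abs(c - e) > threshold
        return 100.0 * changed / total

    def green_time(level, base_s=10, max_extra_s=40):
        """Scale green-signal duration with the measured traffic level."""
        return base_s + max_extra_s * level / 100.0

    empty   = [[50, 50, 50, 50]] * 4
    current = [[50, 200, 200, 50],   # two occupied "vehicle" columns
               [50, 200, 200, 50],
               [50, 200, 200, 50],
               [50, 200, 200, 50]]
    level = traffic_level(empty, current)   # 50.0 percent occupied
    ```

    Half the pixels differ from the reference, so the sketch assigns half of the extra green time to this approach.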

  3. THE EQUIPMENT ACQUISITION PROCESS: ANALYSIS FOR NEGOTIATION

    Directory of Open Access Journals (Sweden)

    Irena Peharda

    2011-02-01

    Full Text Available The analysis for negotiation results in a prescriptive negotiation procedure that improves the decision-making process by increasing the understanding of the decision situation and by explaining the rationale for the decision. This prescriptive procedure is used for the evaluation of the stakeholders who negotiate, for guidance when there are tensions between various stakeholders, and for changing the stakeholders' perceptions when considering the creation of additional value and possible agreement(s). The procedure is divided into five specific steps, based on the standard acquisition procedure. The procedure is applied to the case of acquiring Armored Wheeled Vehicles 8x8 (AWV) for the Croatian Armed Forces (CAF), according to the Croatian procurement regulations.

  4. Multicriteria Similarity-Based Anomaly Detection Using Pareto Depth Analysis.

    Science.gov (United States)

    Hsiao, Ko-Jen; Xu, Kevin S; Calder, Jeff; Hero, Alfred O

    2016-06-01

    We consider the problem of identifying patterns in a data set that exhibit anomalous behavior, often referred to as anomaly detection. Similarity-based anomaly detection algorithms detect abnormally large amounts of similarity or dissimilarity, e.g., as measured by the nearest-neighbor Euclidean distances between a test sample and the training samples. In many application domains there may not exist a single dissimilarity measure that captures all possible anomalous patterns. In such cases, multiple dissimilarity measures can be defined, including nonmetric measures, and one can test for anomalies by scalarizing them with a nonnegative linear combination. If the relative importance of the different dissimilarity measures is not known in advance, as in many anomaly detection applications, the anomaly detection algorithm may need to be executed multiple times with different choices of weights in the linear combination. In this paper, we propose a method for similarity-based anomaly detection using a novel multicriteria dissimilarity measure, the Pareto depth. The proposed Pareto depth analysis (PDA) anomaly detection algorithm uses the concept of Pareto optimality to detect anomalies under multiple criteria without having to run the algorithm multiple times with different choices of weights. The proposed PDA approach is provably better than using linear combinations of the criteria, and shows superior performance in experiments with synthetic and real data sets.
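
    As a rough sketch of the Pareto-optimality idea only (not the authors' PDA algorithm, which operates on dyads of pairwise dissimilarities), the Pareto depth of a point under multiple criteria can be computed by peeling off non-dominated fronts one at a time; deeper points sit behind more fronts:

```python
def dominates(q, p):
    """q dominates p if q is no worse in every criterion and strictly
    better in at least one (smaller dissimilarity is better)."""
    return all(a <= b for a, b in zip(q, p)) and any(a < b for a, b in zip(q, p))

def pareto_depth(points):
    """Pareto depth of each point: 1 for the first non-dominated front,
    2 for the front left after removing the first, and so on."""
    remaining = dict(enumerate(points))
    depth = {}
    level = 1
    while remaining:
        front = [i for i, p in remaining.items()
                 if not any(dominates(q, p) for j, q in remaining.items() if j != i)]
        for i in front:
            depth[i] = level
            del remaining[i]
        level += 1
    return [depth[i] for i in range(len(points))]
```

For example, with two dissimilarity criteria, the point (4, 4) is dominated by (2, 2), which in turn is dominated by (1, 1), so the three points land on fronts 3, 2 and 1 respectively.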

  5. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations (COSO), 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of risk quantification to enhance the degree of objectivity, in finance for instance, paralleled its development in the manufacturing industry, the same is not true of Higher Education Institutions (HEI). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper is the first of a two-phased study, which sampled one hundred (100) risk analysts at a university in the greater Eastern Cape Province of South Africa. The likelihood of occurrence of risk was analysed by logistic regression and percentages to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant, with a chi-square (X2) = 8.181; p = 0.300, indicating a good model fit, since the data did not deviate significantly from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat-source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).

  6. Microstructuring of piezoresistive cantilevers for gas detection and analysis

    International Nuclear Information System (INIS)

    Sarov, Y.; Sarova, V.; Bitterlich, Ch.; Richter, O.; Guliyev, E.; Zoellner, J.-P.; Rangelow, I. W.; Andok, R.; Bencurova, A.

    2011-01-01

    In this work we report on the design and fabrication of cantilevers for gas detection and analysis. The cantilevers have an expanded area of interaction with the gas, while signal transduction is realized by an integrated piezoresistive deflection sensor placed at the narrowed cantilever base, where the stress along the cantilever is highest. Moreover, the cantilevers have an integrated bimorph micro-actuator, enabling detection in both static and dynamic modes. The cantilevers are feasible as pressure, temperature and flow sensors and, under chemical functionalization, for gas recognition, tracing and composition analysis. (authors)

  7. Decision analysis applications and the CERCLA process

    Energy Technology Data Exchange (ETDEWEB)

    Purucker, S.T.; Lyon, B.F. [Oak Ridge National Lab., TN (United States). Risk Analysis Section]|[Univ. of Tennessee, Knoxville, TN (United States)

    1994-06-01

    Quantitative decision methods can be developed during environmental restoration projects that incorporate stakeholder input and can complement current efforts undertaken for data collection and alternatives evaluation during the CERCLA process. These decision-making tools can supplement current EPA guidance and focus on problems that arise when attempts are made to make informed decisions regarding remedial alternative selection. In examining the use of such applications, the authors discuss decision analysis tools and their impact on collecting data and making environmental decisions from a risk-based perspective. They look at the construction of objective functions for quantifying different risk-based decision rules that incorporate stakeholder concerns; this represents a quantitative method for implementing the Data Quality Objective (DQO) process. These objective functions can be expressed using a variety of indices to analyze problems that currently arise in the environmental field; examples include cost, magnitude of risk, efficiency, and probability of success or failure. Based on such defined objective functions, a project can evaluate the impact of different risk and decision selection strategies on data worth and alternative selection.

  8. Fast and objective detection and analysis of structures in downhole images

    Science.gov (United States)

    Wedge, Daniel; Holden, Eun-Jung; Dentith, Mike; Spadaccini, Nick

    2017-09-01

    Downhole acoustic and optical televiewer images, and formation microimager (FMI) logs are important datasets for structural and geotechnical analyses for the mineral and petroleum industries. Within these data, dipping planar structures appear as sinusoids, often in incomplete form and in abundance. Their detection is a labour intensive and hence expensive task and as such is a significant bottleneck in data processing as companies may have hundreds of kilometres of logs to process each year. We present an image analysis system that harnesses the power of automated image analysis and provides an interactive user interface to support the analysis of televiewer images by users with different objectives. Our algorithm rapidly produces repeatable, objective results. We have embedded it in an interactive workflow to complement geologists' intuition and experience in interpreting data to improve efficiency and assist, rather than replace the geologist. The main contributions include a new image quality assessment technique for highlighting image areas most suited to automated structure detection and for detecting boundaries of geological zones, and a novel sinusoid detection algorithm for detecting and selecting sinusoids with given confidence levels. Further tools are provided to perform rapid analysis of and further detection of structures e.g. as limited to specific orientations.
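
    The paper's detector is not reproduced here, but the geometric fact it relies on, that a dipping plane intersects the unrolled cylindrical borehole wall as a sinusoid, can be sketched with an ordinary least-squares fit. The function name and the synthetic trace below are illustrative assumptions:

```python
import numpy as np

def fit_sinusoid(azimuth, depth):
    """Fit depth = m + a*sin(az) + b*cos(az), the trace a planar structure
    leaves on an unrolled borehole image. Returns the mean depth, the
    amplitude (related to dip) and the phase (related to dip direction)."""
    A = np.column_stack([np.ones_like(azimuth), np.sin(azimuth), np.cos(azimuth)])
    (m, a, b), *_ = np.linalg.lstsq(A, depth, rcond=None)
    return m, np.hypot(a, b), np.arctan2(b, a)

# Synthetic trace: a plane at mean depth 5 with amplitude 2 and phase 0.3 rad.
az = np.linspace(0.0, 2.0 * np.pi, 90, endpoint=False)
m, amp, phase = fit_sinusoid(az, 5.0 + 2.0 * np.sin(az + 0.3))
```

In a real detector this fit would be wrapped in a robust search over incomplete, noisy traces, which is where the paper's confidence levels come in.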

  9. Delamination detection by Multi-Level Wavelet Processing of Continuous Scanning Laser Doppler Vibrometry data

    Science.gov (United States)

    Chiariotti, P.; Martarelli, M.; Revel, G. M.

    2017-12-01

    A novel non-destructive testing procedure for delamination detection, based on exploiting the simultaneous time and spatial sampling provided by Continuous Scanning Laser Doppler Vibrometry (CSLDV) and the feature-extraction capability of multi-level wavelet-based processing, is presented in this paper. The processing procedure consists of a multi-step approach. Once the optimal mother wavelet is selected, as the one maximizing the Energy to Shannon Entropy Ratio criterion over the mother-wavelet space, a pruning operation is performed to identify the best combination of nodes inside the full binary tree given by Wavelet Packet Decomposition (WPD). The pruning algorithm exploits, in a two-step way, a measure of the randomness of the point-pattern distribution on the damage-map space together with an analysis of the energy concentration of the wavelet coefficients on the nodes provided by the first pruning operation. A combination of the point-pattern distributions provided by each node of the ensemble node set from the pruning algorithm allows a Damage Reliability Index to be associated with the final damage map. The effectiveness of the whole approach is proven on both simulated and real test cases. A sensitivity analysis of the influence of noise on the CSLDV signal provided to the algorithm is also discussed, showing that the processing developed is robust to measurement noise. The method is promising: damages are well identified on different materials and for different damage-structure varieties.
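
    The mother-wavelet selection criterion itself is compact: the wavelet whose decomposition coefficients maximize the ratio of energy to Shannon entropy concentrates the signal's energy in few coefficients. A sketch of the criterion (computing the coefficients, e.g. with a WPD library, is assumed done elsewhere):

```python
import numpy as np

def energy_entropy_ratio(coeffs):
    """Energy-to-Shannon-entropy ratio of a set of wavelet coefficients:
    large when the energy is concentrated in a few coefficients."""
    energy = float(np.sum(coeffs ** 2))
    p = coeffs ** 2 / energy          # normalized energy distribution
    p = p[p > 0]                      # convention: 0 * log(0) = 0
    entropy = float(-np.sum(p * np.log(p)))
    return energy / entropy
```

A uniform coefficient vector gives the lowest ratio for its energy, while a vector dominated by one large coefficient scores higher, which is why the criterion favors wavelets matched to the signal.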

  10. Fault detection of feed water treatment process using PCA-WD with parameter optimization.

    Science.gov (United States)

    Zhang, Shirong; Tang, Qian; Lin, Yu; Tang, Yuling

    2017-05-01

    Feed water treatment process (FWTP) is an essential part of utility boilers, and fault detection is expected for its reliability improvement. Classical principal component analysis (PCA) has been applied to FWTPs in our previous work; however, the noise in the T2 and SPE statistics results in false detections and missed detections. In this paper, wavelet denoising (WD) is combined with PCA to form a new algorithm, PCA-WD, where WD is intentionally employed to deal with the noise. The parameter selection of PCA-WD is further formulated as an optimization problem, and PSO is employed for its solution. A FWTP, sustaining two 1000 MW generation units in a coal-fired power plant, is taken as a study case, and its operation data are collected for the verification study. The results show that the optimized WD is effective in restraining the noise in the T2 and SPE statistics, so as to improve the performance of the PCA-WD algorithm, and that the parameter optimization enables PCA-WD to obtain its optimal parameters automatically rather than from individual experience. The optimized PCA-WD is further compared with classical PCA and sliding-window PCA (SWPCA) in four cases: bias fault, drift fault, broken-line fault and normal condition. The advantages of the optimized PCA-WD over classical PCA and SWPCA are finally confirmed by the results. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
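
    For readers unfamiliar with the two statistics, a bare-bones PCA monitor can be sketched as follows; this omits the paper's wavelet-denoising step and PSO tuning, and the synthetic data and thresholds are illustrative only:

```python
import numpy as np

def fit_pca_monitor(X, n_components):
    """Fit a PCA monitoring model on normal operating data X (samples x variables)."""
    mu, sigma = X.mean(axis=0), X.std(axis=0)
    Z = (X - mu) / sigma                        # standardize
    eigval, eigvec = np.linalg.eigh(np.cov(Z, rowvar=False))
    order = np.argsort(eigval)[::-1]            # descending variance
    P = eigvec[:, order[:n_components]]         # retained loadings
    lam = eigval[order[:n_components]]          # retained variances
    return mu, sigma, P, lam

def t2_spe(x, mu, sigma, P, lam):
    """Hotelling T2 (variation inside the model) and SPE (residual) for a sample."""
    z = (x - mu) / sigma
    t = P.T @ z                                 # scores
    residual = z - P @ t
    return float(np.sum(t ** 2 / lam)), float(residual @ residual)

# Synthetic "normal" data: two strongly correlated process variables.
rng = np.random.default_rng(0)
latent = rng.normal(size=(500, 1))
X = np.hstack([latent, 2.0 * latent]) + 0.01 * rng.normal(size=(500, 2))
mu, sigma, P, lam = fit_pca_monitor(X, n_components=1)
_, spe_normal = t2_spe(np.array([1.0, 2.0]), mu, sigma, P, lam)   # on-model sample
_, spe_fault = t2_spe(np.array([1.0, -2.0]), mu, sigma, P, lam)   # breaks the correlation
```

The faulty sample violates the learned correlation and so produces a large SPE, which is exactly the signal that the paper's WD step tries to separate from measurement noise.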

  11. Processing of Instantaneous Angular Speed Signal for Detection of a Diesel Engine Failure

    Directory of Open Access Journals (Sweden)

    Adam Charchalis

    2013-01-01

    Full Text Available Continuous monitoring of diesel engine performance during operation is critical for predicting malfunction development and subsequently detecting functional failures. Analysis of the instantaneous angular speed (IAS) of the crankshaft is considered one of the non-intrusive and effective methods for detecting deterioration of combustion quality. In this paper, results of experimental verification of fuel-system malfunction detection, using an optical encoder for IAS recording, are presented. The implemented method relies on comparing measurement results recorded under healthy and faulty conditions of the engine. An elaborated dynamic model of angular speed variations enables templates of engine behavior to be built. Cylinder pressure values recorded during the experiment were used to approximate the basic pressure waveform. The main task of data processing is smoothing the raw angular speed signal, whose noise is due to sensor-mount vibrations, signal-emitter machining, engine-body vibrations, and crankshaft torsional vibrations. Smoothing of the measurement data was carried out with a Savitzky-Golay filter, and the smoothed signal was compared with the modeled IAS run.
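
    The Savitzky-Golay filter fits a low-order polynomial over a sliding window and keeps its midpoint value, so genuine low-order trends in the IAS waveform survive while high-frequency noise is averaged out. A pure-NumPy sketch of the idea (the paper gives no code; `scipy.signal.savgol_filter` offers an equivalent ready-made implementation):

```python
import numpy as np

def savgol_coeffs(window, polyorder):
    """Midpoint Savitzky-Golay coefficients: least-squares-fit a polynomial
    over a centered window and evaluate it at the center sample."""
    half = window // 2
    x = np.arange(-half, half + 1)
    V = np.vander(x, polyorder + 1, increasing=True)   # design matrix
    return np.linalg.pinv(V)[0]   # row 0 = value of the LS polynomial at x = 0

def savgol_smooth(signal, window=11, polyorder=3):
    """Smooth a 1-D signal (e.g. raw instantaneous angular speed)."""
    c = savgol_coeffs(window, polyorder)
    pad = window // 2
    padded = np.pad(signal, pad, mode="edge")          # replicate edges
    return np.correlate(padded, c, mode="valid")
```

A Savitzky-Golay filter reproduces polynomials up to `polyorder` exactly, which is why it preserves the shape of the IAS waveform better than a plain moving average.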

  12. Image processing and analysis using neural networks for optometry area

    Science.gov (United States)

    Netto, Antonio V.; Ferreira de Oliveira, Maria C.

    2002-11-01

    In this work we describe the framework of a functional system for processing and analyzing images of the human eye acquired by the Hartmann-Shack (HS) technique, in order to extract information for formulating a diagnosis of eye refractive errors (astigmatism, hypermetropia and myopia). The analysis is to be carried out by an Artificial Intelligence system based on neural nets, fuzzy logic and classifier combination. The major goal is to establish the basis of a new technology to effectively measure ocular refractive errors, based on methods alternative to those adopted in current patented systems. Moreover, analysis of images acquired with the Hartmann-Shack technique may enable the extraction of additional information on the health of the eye under exam from the same image used to detect refraction errors.

  13. Digital image processing: effect on detectability of simulated low-contrast radiographic patterns

    International Nuclear Information System (INIS)

    Ishida, M.; Doi, K.; Loo, L.N.; Metz, C.E.; Lehr, J.L.

    1984-01-01

    Detection studies of simulated low-contrast radiographic patterns were performed with a high-quality digital image processing system. The original images, prepared with conventional screen-film systems, were processed digitally to enhance contrast by a "windowing" technique. The detectability of simulated patterns was quantified in terms of the results of observer performance experiments by using the multiple-alternative forced-choice method. The processed images demonstrated a significant increase in observer detection performance over that for the original images. These results are related to the displayed and perceived signal-to-noise ratios derived from signal detection theory. The improvement in detectability is ascribed to a reduction in the relative magnitude of the human observer's "internal" noise after image processing. The measured dependence of threshold signal contrast on object size and noise level is accounted for by a statistical decision theory model that includes internal noise.

  14. Fast Change Point Detection for Electricity Market Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Berkeley, UC; Gu, William; Choi, Jaesik; Gu, Ming; Simon, Horst; Wu, Kesheng

    2013-08-25

    Electricity is a vital part of our daily life; therefore it is important to avoid irregularities such as the California Electricity Crisis of 2000 and 2001. In this work, we seek to predict anomalies using advanced machine learning algorithms. These algorithms are effective, but computationally expensive, especially if we plan to apply them on hourly electricity market data covering a number of years. To address this challenge, we significantly accelerate the computation of the Gaussian Process (GP) for time series data. In the context of a Change Point Detection (CPD) algorithm, we reduce its computational complexity from O($n^{5}$) to O($n^{2}$). Our efficient algorithm makes it possible to compute the Change Points using the hourly price data from the California Electricity Crisis. By comparing the detected Change Points with known events, we show that the Change Point Detection algorithm is indeed effective in detecting signals preceding major events.
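
    The accelerated GP-based method is beyond a short example, but the basic shape of change-point detection can be illustrated with a mean-shift scan that scores every candidate split by the reduction in squared error; this is a toy stand-in for illustration, not the algorithm from the paper:

```python
import numpy as np

def change_point(series, min_seg=5):
    """Score each split point by the drop in squared error when the series is
    modeled with two segment means instead of one; return the best split."""
    n = len(series)
    total_sse = np.sum((series - series.mean()) ** 2)
    best_t, best_gain = None, -np.inf
    for t in range(min_seg, n - min_seg):
        left, right = series[:t], series[t:]
        sse = (np.sum((left - left.mean()) ** 2)
               + np.sum((right - right.mean()) ** 2))
        gain = total_sse - sse            # error explained by the split
        if gain > best_gain:
            best_t, best_gain = t, gain
    return best_t, best_gain
```

Scanning every split this way is already O(n^2) per statistic; the paper's contribution is bringing the much heavier GP-based score down to comparable cost.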

  15. Signal analysis and anomaly detection for flood early warning systems

    NARCIS (Netherlands)

    Pyayt, A.L.; Kozionov, A.P.; Kusherbaeva, V.T.; Mokhov, I.I.; Krzhizhanovskaya, V.V.; Broekhuijsen, B.J.; Meijer, R.J.; Sloot, P.M.A.

    2014-01-01

    We describe the detection methods and the results of anomalous conditions in dikes (earthen dams/levees) based on a simultaneous processing of several data streams originating from sensors installed in these dikes. Applied methods are especially valuable in cases where lack of information or

  16. Automated spine and vertebrae detection in CT images using object-based image analysis.

    Science.gov (United States)

    Schwier, M; Chitiboi, T; Hülnhagen, T; Hahn, H K

    2013-09-01

    Although computer assistance has become common in medical practice, some of the most challenging tasks that remain unsolved are in the area of automatic detection and recognition. The human visual perception is in general far superior to computer vision algorithms. Object-based image analysis is a relatively new approach that aims to lift image analysis from a pixel-based processing to a semantic region-based processing of images. It allows effective integration of reasoning processes and contextual concepts into the recognition method. In this paper, we present an approach that applies object-based image analysis to the task of detecting the spine in computed tomography images. A spine detection would be of great benefit in several contexts, from the automatic labeling of vertebrae to the assessment of spinal pathologies. We show with our approach how region-based features, contextual information and domain knowledge, especially concerning the typical shape and structure of the spine and its components, can be used effectively in the analysis process. The results of our approach are promising with a detection rate for vertebral bodies of 96% and a precision of 99%. We also gain a good two-dimensional segmentation of the spine along the more central slices and a coarse three-dimensional segmentation. Copyright © 2013 John Wiley & Sons, Ltd.

  17. Visual cerebral microbleed detection on 7T MR imaging: reliability and effects of image processing.

    Science.gov (United States)

    de Bresser, J; Brundel, M; Conijn, M M; van Dillen, J J; Geerlings, M I; Viergever, M A; Luijten, P R; Biessels, G J

    2013-01-01

    MR imaging at 7T has a high sensitivity for cerebral microbleed detection. We identified mIP processing conditions with an optimal balance between the number of visually detected microbleeds and the number of sections on 7T MR imaging. Even with optimal mIP processing, the limited size of some of the microbleeds and the susceptibility effects of other adjacent structures were a challenge for visual detection, which led to a modest inter-rater agreement, mainly due to missed microbleeds. Automated lesion-detection techniques may be required to optimally benefit from the increased spatial resolution offered by 7T MR imaging.

  18. Guided Wave Delamination Detection and Quantification With Wavefield Data Analysis

    Science.gov (United States)

    Tian, Zhenhua; Campbell Leckey, Cara A.; Seebo, Jeffrey P.; Yu, Lingyu

    2014-01-01

    Unexpected damage can occur in aerospace composites due to impact events or material stress during off-nominal loading events. In particular, laminated composites are susceptible to delamination damage due to weak transverse tensile and inter-laminar shear strengths. Developments of reliable and quantitative techniques to detect delamination damage in laminated composites are imperative for safe and functional optimally-designed next-generation composite structures. In this paper, we investigate guided wave interactions with delamination damage and develop quantification algorithms by using wavefield data analysis. The trapped guided waves in the delamination region are observed from the wavefield data and further quantitatively interpreted by using different wavenumber analysis methods. The frequency-wavenumber representation of the wavefield shows that new wavenumbers are present and correlate to trapped waves in the damage region. These new wavenumbers are used to detect and quantify the delamination damage through the wavenumber analysis, which can show how the wavenumber changes as a function of wave propagation distance. The location and spatial duration of the new wavenumbers can be identified, providing a useful means not only for detecting the presence of delamination damage but also allowing for estimation of the delamination size. Our method has been applied to detect and quantify real delamination damage with complex geometry (grown using a quasi-static indentation technique). The detection and quantification results show the location, size, and shape of the delamination damage.

  19. Auditory detection of an increment in the rate of a random process

    International Nuclear Information System (INIS)

    Brown, W.S.; Emmerich, D.S.

    1994-01-01

    Recent experiments have presented listeners with complex tonal stimuli consisting of components with values (i.e., intensities or frequencies) randomly sampled from probability distributions [e.g., R. A. Lutfi, J. Acoust. Soc. Am. 86, 934--944 (1989)]. In the present experiment, brief tones were presented at intervals corresponding to the intensity of a random process. Specifically, the intervals between tones were randomly selected from exponential probability functions. Listeners were asked to decide whether tones presented during a defined observation interval represented a "noise" process alone or the "noise" with a "signal" process added to it. The number of tones occurring in any observation interval is a Poisson variable; receiver operating characteristics (ROCs) arising from Poisson processes have been considered by Egan [Signal Detection Theory and ROC Analysis (Academic, New York, 1975)]. Several sets of noise and signal intensities and observation interval durations were selected which were expected to yield equivalent performance. Rating ROCs were generated based on subjects' responses in a single-interval, yes-no task. The performance levels achieved by listeners and the effects of intensity and duration are compared to those predicted for an ideal observer.
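
    An ideal observer in this task simply counts the tones in the observation interval and responds "signal" when the count reaches a criterion; since the count is Poisson, each criterion yields one (false-alarm, hit) point on the ROC. A small sketch, with rates chosen arbitrarily for illustration:

```python
from math import exp, factorial

def poisson_sf(k, lam, kmax=40):
    """P(N >= k) for a Poisson count with mean lam (tail truncated at kmax)."""
    return sum(exp(-lam) * lam ** j / factorial(j) for j in range(k, kmax + 1))

def poisson_roc(lam_noise, lam_signal, kmax=40):
    """(false-alarm, hit) probability pairs for count criteria k = 0..kmax."""
    return [(poisson_sf(k, lam_noise), poisson_sf(k, lam_signal))
            for k in range(kmax + 1)]

# Illustrative rates: noise interval averages 4 tones, noise+signal averages 8.
roc = poisson_roc(4, 8)
```

Because the signal process only adds tones, the hit rate is at least the false-alarm rate at every criterion, giving the bowed ROC shape analyzed by Egan.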

  20. Analysis of individual brain activation maps using hierarchical description and multiscale detection

    International Nuclear Information System (INIS)

    Poline, J.B.; Mazoyer, B.M.

    1994-01-01

    The authors propose a new method for the analysis of brain activation images that aims at detecting activated volumes rather than pixels. The method is based on Poisson process modeling, hierarchical description, and multiscale detection (MSD). Its performances have been assessed using both Monte Carlo simulated images and experimental PET brain activation data. As compared to other methods, the MSD approach shows enhanced sensitivity with a controlled overall type I error, and has the ability to provide an estimate of the spatial limits of the detected signals. It is applicable to any kind of difference image for which the spatial autocorrelation function can be approximated by a stationary Gaussian function

  1. Detection of charged particles through a photodiode: design and analysis

    International Nuclear Information System (INIS)

    Angoli, A.; Quirino, L.L.; Hernandez, V.M.; Lopez del R, H.; Mireles, F.; Davila, J.I.; Rios, C.; Pinedo, J.L.

    2006-01-01

    This project develops and constructs a charged-particle detector based on a PIN photodiode array. The design and analysis use a silicon PIN photodiode of the kind generally used to detect visible light; its good efficiency, compact size and low cost make it well suited to radiation monitoring and, specifically, alpha-particle detection. Both the design of the detection system and its characterization for alpha particles are presented, with the alpha energy resolution and detection efficiency reported. The equipment used in this work consists of a triple alpha source composed of Am-241, Pu-239 and Cm-244 with a total activity of 5.55 kBq, the Maestro 32 software and a TRUMP multichannel card, both from ORTEC, and a low-activity electroplated uranium sample. (Author)

  2. Elastic recoil detection analysis of hydrogen in polymers

    Energy Technology Data Exchange (ETDEWEB)

    Winzell, T.R.H.; Whitlow, H.J. [Lund Univ. (Sweden); Bubb, I.F.; Short, R.; Johnston, P.N. [Royal Melbourne Inst. of Tech., VIC (Australia)

    1996-12-31

    Elastic recoil detection analysis (ERDA) of hydrogen in thick polymeric films has been performed using 2.5 MeV He2+ ions from the tandem accelerator at the Royal Melbourne Institute of Technology. The technique enables the use of the same equipment as in Rutherford backscattering analysis, but instead of detecting the backscattered incident ion, the lighter recoiled ion is detected at a small forward angle. The purpose of this work is to investigate how selected polymers react when irradiated by helium ions, so that they can be evaluated for their suitability as reference standards for hydrogen depth profiling. The films investigated were Du Pont's Kapton and Mylar, and polystyrene. 11 refs., 3 figs.

  3. cDNA cloning, structural analysis, SNP detection and tissue ...

    Indian Academy of Sciences (India)

    Home; Journals; Journal of Genetics; Volume 96; Issue 2. cDNA cloning, structural analysis, SNP detection and tissue ... Abstract. Insulin-like growth factor 1 (IGF1) plays an important role in growth, reproduction, foetal development and cell proliferation. The present study was conducted to clone and sequence the ...

  4. Trend analysis and change point detection of annual and seasonal ...

    Indian Academy of Sciences (India)

    Keywords. Climate change; temperature; precipitation; trend analysis; change point detection; southwest Iran. J. Earth Syst. Sci. 123, No. 2, March 2014, pp. 281-295.

  5. cDNA cloning, structural analysis, SNP detection and tissue ...

    Indian Academy of Sciences (India)

    THOMAS NAICY

    [Naicy T., Venkatachalapathy T., Aravindakshan T., Raghavan K. C., Mini M. and Shyama K. 2017 cDNA cloning, structural analysis, SNP detection and tissue expression profile of the IGF1 gene in Malabari and Attappady Black goats of India. J. Genet. 96, xx–xx]. Introduction. Insulin-like growth factor 1 (IGF1), an important ...

  6. Gravitational wave detection and data analysis for pulsar timing arrays

    NARCIS (Netherlands)

    Haasteren, Rutger van

    2011-01-01

    Long-term precise timing of Galactic millisecond pulsars holds great promise for measuring long-period (months-to-years) astrophysical gravitational waves. In this work we develop a Bayesian data analysis method for projects called pulsar timing arrays: projects aimed at detecting these gravitational waves.

  7. Trend analysis and change point detection of annual and seasonal ...

    Indian Academy of Sciences (India)

    Temperature and precipitation series have been investigated by many researchers throughout the world (Serra et al. 2001; Turkes and Sumer 2004; Zer Lin et al. 2005; Partal and Kahya 2006). Keywords. Climate change; temperature; precipitation; trend analysis; change point detection; southwest Iran. J. Earth Syst. Sci.

  8. Spiral analysis-improved clinical utility with center detection.

    Science.gov (United States)

    Wang, Hongzhi; Yu, Qiping; Kurtis, Mónica M; Floyd, Alicia G; Smith, Whitney A; Pullman, Seth L

    2008-06-30

    Spiral analysis is a computerized method that measures human motor performance from handwritten Archimedean spirals. It quantifies normal motor activity, and detects early disease as well as dysfunction in patients with movement disorders. The clinical utility of spiral analysis is based on kinematic and dynamic indices derived from the original spiral trace, which must be detected and transformed into mathematical expressions with great precision. Accurately determining the center of the spiral and reducing spurious low frequency noise caused by center selection error is important to the analysis. Handwritten spirals do not all start at the same point, even when marked on paper, and drawing artifacts are not easily filtered without distortion of the spiral data and corruption of the performance indices. In this report, we describe a method for detecting the optimal spiral center and reducing the unwanted drawing artifacts. To demonstrate overall improvement to spiral analysis, we study the impact of the optimal spiral center detection in different frequency domains separately and find that it notably improves the clinical spiral measurement accuracy in low frequency domains.

  9. Trend analysis and change point detection of annual and seasonal ...

    Indian Academy of Sciences (India)

    Home; Journals; Journal of Earth System Science; Volume 123; Issue 2. Trend analysis and change point detection of annual and seasonal precipitation and ... Department of Geography, University of Pune, Pune 411 007, India. Centre for Advanced Training, Indian Institute of Tropical Meteorology, Pune 411 008, India.

  10. Multi-Modal Detection and Mapping of Static and Dynamic Obstacles in Agriculture for Process Evaluation

    Directory of Open Access Journals (Sweden)

    Timo Korthals

    2018-03-01

    Full Text Available Today, agricultural vehicles are available that can automatically perform tasks such as weed detection and spraying, mowing, and sowing while being steered automatically. However, for such systems to be fully autonomous and self-driven, not only their specific agricultural tasks must be automated. An accurate and robust perception system automatically detecting and avoiding all obstacles must also be realized to ensure safety of humans, animals, and other surroundings. In this paper, we present a multi-modal obstacle and environment detection and recognition approach for process evaluation in agricultural fields. The proposed pipeline detects and maps static and dynamic obstacles globally, while providing process-relevant information along the traversed trajectory. Detection algorithms are introduced for a variety of sensor technologies, including range sensors (lidar and radar and cameras (stereo and thermal. Detection information is mapped globally into semantical occupancy grid maps and fused across all sensors with late fusion, resulting in accurate traversability assessment and semantical mapping of process-relevant categories (e.g., crop, ground, and obstacles. Finally, a decoding step uses a Hidden Markov model to extract relevant process-specific parameters along the trajectory of the vehicle, thus informing a potential control system of unexpected structures in the planned path. The method is evaluated on a public dataset for multi-modal obstacle detection in agricultural fields. Results show that a combination of multiple sensor modalities increases detection performance and that different fusion strategies must be applied between algorithms detecting similar and dissimilar classes.
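
    The late-fusion step can be illustrated in its simplest form: if each sensor reports an occupancy probability per grid cell and the sensors are treated as conditionally independent, fusion reduces to summing log-odds. This is a textbook sketch, not the paper's exact semantical fusion scheme:

```python
import numpy as np

def fuse_grids(prob_grids):
    """Fuse per-sensor occupancy-probability grids by summing log-odds;
    assumes sensors are conditionally independent given the cell state."""
    logodds = sum(np.log(p / (1.0 - p)) for p in prob_grids)
    return 1.0 / (1.0 + np.exp(-logodds))

# Two hypothetical 1x2 grids: lidar and radar occupancy estimates per cell.
lidar = np.array([[0.8, 0.5]])     # 0.5 means "no evidence either way"
radar = np.array([[0.8, 0.2]])
fused = fuse_grids([lidar, radar])
```

Two agreeing 0.8 reports reinforce each other to about 0.94, while an uninformative 0.5 report leaves the other sensor's estimate unchanged, which is the behavior one wants from late fusion.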

  11. Quantitative analysis of hindered amine light stabilizers by CZE with UV detection and quadrupole TOF mass spectrometric detection.

    Science.gov (United States)

    Hintersteiner, Ingrid; Schmid, Thomas; Himmelsbach, Markus; Klampfl, Christian W; Buchberger, Wolfgang W

    2014-10-01

    The current work describes the development of a CZE method with quadrupole time-of-flight mass spectrometric (QTOF-MS) detection and UV detection for the quantitation of Cyasorb 3529, a common hindered amine light stabilizer (HALS), in polymer materials. Analysis of real polymer samples revealed that the oligomer composition of Cyasorb 3529 changes during processing, a fact hampering the development of a straightforward quantitation method based on calibration against a Cyasorb 3529 standard. To overcome this obstacle, in-depth investigations of the oligomer composition of this HALS using QTOF-MS and QTOF-MS/MS had to be performed, whereby 22 new oligomer structures were identified in addition to the ten structures already described. Finally, a CZE method for quantitative analysis of this HALS was developed, starting with a comprehensive characterization of a Cyasorb 3529 standard using CZE-QTOF-MS, which subsequently allowed the correct assignment of most Cyasorb 3529 oligomers in an electropherogram with UV detection. Employing the latter detection technique and hexamethylmelamine as internal standard, peak areas obtained for the melamine could be correlated with those from the triazine ring, the UV-absorbing unit present in the HALS. This approach finally allowed proper quantitation of the individual oligomers of Cyasorb 3529, a prerequisite for the quantitative assessment of this HALS in real polymer samples. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Airborne LIDAR Data Processing and Analysis Tools

    Science.gov (United States)

    Zhang, K.

    2007-12-01

Airborne LIDAR technology allows accurate and inexpensive measurements of topography, vegetation canopy heights, and buildings over large areas. In order to provide researchers high quality data, NSF has created the National Center for Airborne Laser Mapping (NCALM) to collect, archive, and distribute the LIDAR data. However, the LIDAR systems collect voluminous irregularly-spaced, three-dimensional point measurements of ground and non-ground objects scanned by the laser beneath the aircraft. To advance the use of the technology and data, NCALM is developing public domain algorithms for ground and non-ground measurement classification and tools for data retrieval and transformation. We present the main functions of the ALDPAT (Airborne LIDAR Data Processing and Analysis Tools) developed by NCALM. While Geographic Information Systems (GIS) provide a useful platform for storing, analyzing, and visualizing most spatial data, the sheer volume of raw LIDAR data makes most commercial GIS packages impractical. Instead, we have developed a suite of applications in ALDPAT which combine self-developed C++ programs with the APIs of commercial remote sensing and GIS software. Tasks performed by these applications include: 1) transforming data into specified horizontal coordinate systems and vertical datums; 2) merging and sorting data into manageable sized tiles, typically 4 square kilometers in size; 3) filtering point data to separate measurements for the ground from those for non-ground objects; 4) interpolating the irregularly spaced elevations onto a regularly spaced grid to allow raster based analysis; and 5) converting the gridded data into standard GIS import formats. ALDPAT 1.0 is available through http://lidar.ihrc.fiu.edu/.
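Step 4 above (interpolating irregularly spaced elevations onto a regular grid) can be illustrated with a minimal binning-based gridder. This is a sketch under assumed behavior, not ALDPAT's actual C++ implementation; the function name and the mean-per-cell rule are assumptions.

```python
import numpy as np

def grid_lidar_points(x, y, z, cell=1.0):
    """Bin irregular (x, y, z) LIDAR returns onto a regular grid,
    averaging elevations that land in the same cell; empty cells are
    left as NaN. (Illustrative only; real tools offer interpolators
    such as IDW or kriging.)"""
    ix = ((x - x.min()) / cell).astype(int)
    iy = ((y - y.min()) / cell).astype(int)
    nx, ny = ix.max() + 1, iy.max() + 1
    total = np.zeros((ny, nx))
    count = np.zeros((ny, nx))
    np.add.at(total, (iy, ix), z)   # unbuffered accumulation per cell
    np.add.at(count, (iy, ix), 1)
    grid = np.full((ny, nx), np.nan)
    hit = count > 0
    grid[hit] = total[hit] / count[hit]
    return grid
```

Averaging per cell keeps the sketch short; a production gridder would also handle the ground/non-ground split from step 3 before interpolating.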

  13. Entrepreneurship Learning Process by using SWOT Analysis

    Directory of Open Access Journals (Sweden)

    Jajat Sudrajat

    2016-03-01

The research objective was to produce a model for learning entrepreneurship by using SWOT analysis, which was currently being run with the concept of large classes and small classes. The study is expected to be useful for the Binus Entrepreneurship Center (BEC) unit in creating a development map for entrepreneurship learning. The influence generated by using SWOT analysis is very wide, as are the benefits of implementing large and small classes for students and faculty. Participants of this study were Binus students of various majors who were taking courses EN001 and EN002. This study used research and development, examining the theoretical components of entrepreneurship education (the teaching and learning dimension), where six survey dimensions form the fundamental elements in determining the framework of entrepreneurship education. The research finds that the factor matrix yields at least eight strategies for improving the entrepreneurship learning process. One of these is a strategy to increase BEC's collaboration with family support. This strategy is supported by survey results from the three majors following EN001 and EN002, where more than 85% of the students are willing to take an aptitude test to determine their strengths and weaknesses for self-development, and more than 54% of the students are not willing to accept their parents' wishes because these do not correspond to their own ambitions. Based on the above results, it is suggested that further research develop this work by analyzing other dimensions.

  14. Integrated Process Design, Control and Analysis of Intensified Chemical Processes

    DEFF Research Database (Denmark)

    Mansouri, Seyed Soheil

... distillation column. Next, these design methods are extended using the element concept to also include ternary as well as multicomponent reactive distillation processes. The element concept is used to translate a ternary system of compounds (A + B ↔ C) to a binary system of elements (WA and WB). When only two elements are needed to represent the reacting system of more than two compounds, a binary element system is identified. In the case of multi-element reactive distillation processes (where more than two elements are encountered) the equivalent element concept is used to translate a multicomponent (multi-element) system of compounds (A + B ↔ C + D) to a binary system of key elements (elements WHK and WLK). For an energy-efficient design, non-reactive driving force (for binary non-reactive distillation), reactive driving force (for binary element systems) and binary-equivalent driving force (for multicomponent ...

  15. Early Detection of Diabetic Retinopathy in Fluorescent Angiography Retinal Images Using Image Processing Methods

    Directory of Open Access Journals (Sweden)

    Meysam Tavakoli

    2010-12-01

Introduction: Diabetic retinopathy (DR) is the single largest cause of sight loss and blindness in the working-age population of Western countries; it is the most common cause of blindness in adults between 20 and 60 years of age. Early diagnosis of DR is critical for preventing vision loss, so early detection of microaneurysms (MAs) as the first signs of DR is important. This paper addresses the automatic detection of MAs in fluorescein angiography fundus images, which plays a key role in computer-assisted diagnosis of DR, a serious and frequent eye disease. Material and Methods: The algorithm can be divided into three main steps. The first step, pre-processing, performed background normalization and contrast enhancement of the image. The second step aimed at detecting landmarks, i.e., all patterns possibly corresponding to vessels and the optic nerve head, which was achieved using a local Radon transform. Then, MA candidates were extracted, which were used in the final step to automatically classify candidates into real MAs and other objects. A database of 120 fluorescein angiography fundus images was used to train and test the algorithm, and the algorithm was compared to manually obtained gradings of those images. Results: Sensitivity of diagnosis for DR was 94%, with specificity of 75%; sensitivity of precise microaneurysm localization was 92%, at an average of 8 false positives per image. Discussion and Conclusion: The sensitivity and specificity of this algorithm make it one of the best methods in this field. Using a local Radon transform eliminates the noise sensitivity of microaneurysm detection in retinal image analysis.
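The pre-processing step (background normalization and contrast enhancement) can be approximated with a percentile-based contrast stretch. This is a hedged sketch of one common enhancement technique, not the authors' exact procedure; the percentile limits are assumptions.

```python
import numpy as np

def normalize_contrast(img, low_pct=1.0, high_pct=99.0):
    """Stretch image intensities so the chosen percentiles map to
    [0, 1], clipping outliers; a simple form of contrast enhancement
    (the paper's own normalization may differ)."""
    lo, hi = np.percentile(img, [low_pct, high_pct])
    return np.clip((img - lo) / (hi - lo), 0.0, 1.0)
```

Clipping at percentiles rather than the raw min/max makes the stretch robust to a few extreme pixels, which matters in angiography images with bright vessel highlights.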

  16. NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.

    Science.gov (United States)

    Zhang, Bo; Dai, Ji; Zhang, Tao

    2017-11-13

In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments would be incredibly useful. Here, we report the development of a Python-based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train and local field potential analysis, and behavioral response (e.g., saccade) detection and extraction, with several options available for data plotting and statistics. In particular, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. With the multitude of general-purpose functions provided ...
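The idea of combining heterogeneous recordings into one standardized trial-by-trial structure can be sketched as follows. The field names are illustrative assumptions, not NeoAnalysis's actual schema or API.

```python
def combine_trials(spike_times, lfp_traces, responses):
    """Zip per-trial spike times, LFP traces, and behavioral responses
    into a single list of uniform trial records, so all data types can
    be analyzed together regardless of their original file formats."""
    return [
        {"trial": i, "spike_times": s, "lfp": l, "response": r}
        for i, (s, l, r) in enumerate(zip(spike_times, lfp_traces, responses))
    ]
```

A flat per-trial record like this is what makes condition sorting and trial-by-trial tables straightforward downstream.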

  17. Citation-based plagiarism detection detecting disguised and cross-language plagiarism using citation pattern analysis

    CERN Document Server

    Gipp, Bela

    2014-01-01

Plagiarism is a problem with far-reaching consequences for the sciences. However, even today's best software-based systems can only reliably identify copy & paste plagiarism. Disguised plagiarism forms, including paraphrased text, cross-language plagiarism, as well as structural and idea plagiarism, often remain undetected. This weakness of current systems results in a large percentage of scientific plagiarism going undetected. Bela Gipp provides an overview of the state of the art in plagiarism detection and an analysis of why these approaches fail to detect disguised plagiarism forms. The aut ...

  18. Natural and professional help: a process analysis.

    Science.gov (United States)

    Tracey, T J; Toro, P A

    1989-08-01

Differences in the helping interactions formed by mental health professionals, divorce lawyers, and mutual help group leaders were examined. Fourteen members of each of these three helper groups (N = 42) met independently with a coached client presenting marital difficulties. Using ratings of ability to ameliorate the personal and emotional problems presented, the 42 helpers were divided (using a median split) into successful and less successful outcome groups. The responses of each of the pairs were coded using the Hill Counselor Verbal Response Category System. The sequence of client-helper responses was examined using log-linear analysis as it varied by type of helper and outcome. Results indicated that successful helpers (regardless of type) tended to use directives (e.g., guidance and approval-reassurance) differently from less successful helpers: successful helpers used directives following client emotional expression and not following factual description. In addition, clear differences in helper responses by helper type and outcome were found. Each helper type had unique patterns of responses that differentiated successful from less successful outcomes. Client responses were found to vary across helper type even when given the same preceding helper response. Results are discussed with respect to the unique goals of each helping relationship and the different shaping processes involved in each.

  19. Image Segmentation and Processing for Efficient Parking Space Analysis

    OpenAIRE

    Tutika, Chetan Sai; Vallapaneni, Charan; R, Karthik; KP, Bharath; Muthu, N Ruban Rajesh Kumar

    2018-01-01

In this paper, we develop a method to detect vacant parking spaces in an environment with unclear segments and contours with the help of MATLAB image processing capabilities. Due to anomalies present in parking spaces, such as uneven illumination, distorted slot lines, and overlapping cars, present-day conventional algorithms have difficulty processing the images for accurate results. The proposed algorithm uses a combination of image pre-processing and false contour detection ...

  20. Image processing based detection of lung cancer on CT scan images

    Science.gov (United States)

    Abdillah, Bariqi; Bustamam, Alhadi; Sarwinda, Devvi

    2017-10-01

In this paper, we implement and analyze image processing methods for the detection of lung cancer. Image processing techniques are widely used in several medical problems for picture enhancement in the detection phase to support early medical treatment. In this research we propose a detection method for lung cancer based on image segmentation, an intermediate level of image processing. Marker-controlled watershed and a region-growing approach are used to segment the CT scan images. The detection phase comprises image enhancement using a Gabor filter, image segmentation, and feature extraction. The experimental results show the effectiveness of our approach: the best approach for main feature detection is watershed with masking, which has high accuracy and is robust.
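The region-growing approach mentioned above can be sketched as a 4-connected flood fill from a seed pixel with an intensity tolerance. The tolerance criterion and function name are assumptions; the paper's exact growing rule may differ.

```python
from collections import deque
import numpy as np

def region_grow(img, seed, tol=10):
    """Grow a region from `seed`, adding 4-connected neighbours whose
    intensity lies within `tol` of the seed value; returns a boolean
    mask of the grown region."""
    h, w = img.shape
    seed_val = int(img[seed])
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if (0 <= rr < h and 0 <= cc < w and not mask[rr, cc]
                    and abs(int(img[rr, cc]) - seed_val) <= tol):
                mask[rr, cc] = True
                queue.append((rr, cc))
    return mask
```

In practice the seed would be placed inside the lung field (e.g., from the marker step of the watershed), and the tolerance tuned to the CT intensity range.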

  1. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    Science.gov (United States)

    West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  2. Layout dependent effects analysis on 28nm process

    Science.gov (United States)

    Li, Helen; Zhang, Mealie; Wong, Waisum; Song, Huiyuan; Xu, Wei; Hurat, Philippe; Ding, Hua; Zhang, Yifan; Cote, Michel; Huang, Jason; Lai, Ya-ch

    2015-03-01

Advanced process nodes introduce new variability effects due to increased density, new materials, new device structures, and so forth. This creates more and stronger Layout Dependent Effects (LDE), especially below 28nm. These effects, such as the Well Proximity Effect (WPE) and the Poly Spacing Effect (PSE), change the carrier mobility and threshold voltage and therefore make device performance figures, such as Vth and Idsat, extremely layout dependent. In traditional flows, the impact of these changes can only be simulated after the block has been fully laid out and the design is LVS- and DRC-clean. That is too late in the design cycle and increases the number of post-layout iterations. We collaborated to develop a method on an advanced process to embed several LDE sources into an LDE kit. We integrated this LDE kit into a custom analog design environment for LDE analysis at an early design stage. These features allow circuit and layout designers to detect the variations caused by LDE and to fix the weak points they cause. In this paper, we present this method and how it accelerates design convergence of advanced-node custom analog designs by detecting LDE hotspots early on partial or fully placed layouts, reporting the contribution of each LDE component to help identify the root cause of LDE variation, and even providing fixing guidelines on how to modify the layout to reduce the LDE impact.

  3. Impact of LANDSAT MSS Sensor Differences on Change Detection Analysis

    Science.gov (United States)

    Likens, W. C.; Wrigley, R. C.

    1984-01-01

Change detection techniques were used to pinpoint differences among the multispectral band scanners on the LANDSAT 2, 3, and 4 satellites. The method of analysis was to co-register 512 by 512 pixel subwindows for all data pairs, followed by scattergram generation and analysis. In all cases, the LANDSAT-4 data were used as the base to which other images were registered. There appear to be no major problems preventing use of the LANDSAT-4 MSS with previous MSS sensors for change detection, provided the interference noise can be removed or minimized. This noise may result in detection of spurious changes, and it also affects other uses of the data, including image classification. Analysis of dark features (water and forests), rather than light ones, will be most affected because the noise forms a higher percentage of the total response at low DN values. Any data normalization for change detection should be based upon the data themselves, rather than solely upon calibration information. While the observed relative radiometric transfer function between LANDSAT 3 and 4 was approximately as predicted, there were still significant deviations. Normalizing based upon data content also has the advantage of allowing simultaneous normalization of the atmosphere as well as the radiometry.
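The recommended data-driven normalization can be sketched as a least-squares linear fit mapping one image's digital numbers onto the reference image before differencing. The function name is an assumption; this is an illustration of the general idea, not the study's exact procedure.

```python
import numpy as np

def normalize_to_reference(src, ref):
    """Fit ref = a*src + b by least squares over co-registered pixels
    and return the normalized image plus the gain/offset, so the two
    dates are radiometrically comparable before differencing."""
    a, b = np.polyfit(src.ravel(), ref.ravel(), 1)
    return a * src + b, (a, b)
```

Because the fit is driven by the imagery itself, it absorbs both sensor calibration differences and atmospheric offsets, which is exactly the advantage the abstract notes over calibration-only normalization.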

  4. Detecting Phonemes and Letters in Text: Interactions Between Different Types and Levels of Processes

    National Research Council Canada - National Science Library

    Schnelder, Vivian

    1998-01-01

... In addition, visual word unitization processes were implicated. Experiments 3 and 4 provided support for the hypothesis that the Gestalt goodness of pattern affected detection errors when subjects searched for letters ...

  5. Data Set for the manuscript entitled, "Sample Processing Approach for Detection of Ricin in Surface Samples."

    Data.gov (United States)

    U.S. Environmental Protection Agency — Figure. This dataset is associated with the following publication: Shah, S., S. Kane, A.M. Erler, and T. Alfaro. Sample Processing Approach for Detection of Ricin in...

  6. HIGH RESOLUTION RESISTIVITY LEAK DETECTION DATA PROCESSING AND EVALUATION METHODS AND REQUIREMENTS

    International Nuclear Information System (INIS)

    SCHOFIELD JS

    2007-01-01

This document has two purposes: (1) describe how data generated by High Resolution Resistivity (HRR) leak detection (LD) systems deployed during single-shell tank (SST) waste retrieval operations are processed and evaluated; and (2) provide the basic review requirements for HRR data when HRR is deployed as a leak detection method during SST waste retrievals.

  7. Infrared processing and sensor fusion for anti-personnel land-mine detection

    NARCIS (Netherlands)

    Schavemaker, J.G.M.; Cremer, F.; Schutte, K.; Breejen, E. den

    2000-01-01

    In this paper we present the results of infrared processing and sensor fusion obtained within the European research project GEODE (Ground Explosive Ordnance DEtection) that strives for the realization of a vehicle-mounted, multi-sensor anti-personnel land-mine detection system for humanitarian

  8. Risk Analysis for Nonthermal process interventions

    Science.gov (United States)

    Over the last few years a number of nonthermal process interventions including ionizing radiation and ultraviolet light, high pressure processing, pulsed-electric and radiofrequency electric fields, microwave and infrared technologies, bacteriophages, etc. have been approved by regulatory agencies, ...

  9. Sparse principal component analysis in hyperspectral change detection

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Larsen, Rasmus; Vestergaard, Jacob Schack

    2011-01-01

This contribution deals with change detection by means of sparse principal component analysis (PCA) of simple differences of calibrated, bi-temporal HyMap data. Results show that if we retain only 15 nonzero loadings (out of 126) in the sparse PCA, the resulting change scores appear visually very similar although the loadings are very different from their usual non-sparse counterparts. The choice of three wavelength regions as most important for change detection demonstrates the feature selection capability of sparse PCA.
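A minimal stand-in for sparse PCA, hard-thresholding the leading eigenvector so only k loadings stay nonzero, illustrates the 15-of-126 idea. True sparse PCA solves a penalized optimization, so this is only a sketch, and the function name is an assumption.

```python
import numpy as np

def sparse_pc1(X, k=15):
    """First principal component of data X (rows = pixels of the
    difference image, columns = bands) with all but the k largest-
    magnitude loadings zeroed and the vector renormalized."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    v = vecs[:, -1]                    # leading eigenvector
    keep = np.argsort(np.abs(v))[-k:]  # k largest-magnitude loadings
    sv = np.zeros_like(v)
    sv[keep] = v[keep]
    sv /= np.linalg.norm(sv)
    return Xc @ sv, sv                 # change scores, sparse loadings
```

The indices of the surviving loadings play the role of the selected wavelength regions: bands with zeroed loadings contribute nothing to the change score.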

  10. Corpus analysis and automatic detection of emotion-inducing keywords

    Science.gov (United States)

    Yuan, Bo; He, Xiangqing; Liu, Ying

    2013-12-01

Emotion words play a vital role in many sentiment analysis tasks. Previous research uses sentiment dictionaries to detect the subjectivity or polarity of words. In this paper, we dive into Emotion-Inducing Keywords (EIK), which refer to words in use that convey emotion. We first analyze an emotion corpus to explore the pragmatic aspects of EIK. Then we design an effective framework for automatically detecting EIK in sentences by utilizing linguistic features and context information. Our system dramatically outperforms traditional dictionary-based methods in precision, recall, and F1-score.

  11. Second order analysis for spatial Hawkes processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Torrisi, Giovanni Luca

We derive summary statistics for stationary Hawkes processes which can be considered as spatial versions of classical Hawkes processes. Particularly, we derive the intensity, the pair correlation function and the Bartlett spectrum. Our results for Gaussian fertility rates and the extension to marked Hawkes processes are discussed.

  12. 40 CFR 68.67 - Process hazard analysis.

    Science.gov (United States)

    2010-07-01

40 CFR 68.67 — Process hazard analysis (40 Protection of Environment 15; Environmental Protection Agency, Air Programs; 2010-07-01). ... affected employees, age of the process, and operating history of the process. The process hazard analysis ...

  13. Criteria for assessing the quality of signal processing techniques for acoustic leak detection

    International Nuclear Information System (INIS)

    Prabhakar, R.; Singh, O.P.

    1990-01-01

    In this paper the criteria used in the first IAEA coordinated research programme to assess the quality of signal processing techniques for sodium boiling noise detection are highlighted. Signal processing techniques, using new features sensitive to boiling and a new approach for achieving higher reliability of detection, which were developed at Indira Gandhi Centre for Atomic Research are also presented. 10 refs, 3 figs, 2 tabs

  14. Statistical analysis of stretch film production process capabilities

    OpenAIRE

    Kovačić, Goran; Kondić, Živko

    2012-01-01

The basic concept of statistical process control is the comparison of data collected from the process with calculated control limits, and drawing conclusions about the process from that comparison. This is recognized as a modern method for analysing process capabilities through various capability indexes. This paper describes the application of the method in monitoring and analysing the capabilities of a stretch film production process.
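The capability indexes referred to include, in their standard form, Cp and Cpk, which compare the specification width with the process spread:

```python
import numpy as np

def capability_indexes(x, lsl, usl):
    """Standard process capability indexes: Cp ignores centering,
    Cpk penalizes a mean shifted toward either specification limit
    (lsl/usl = lower/upper specification limits)."""
    mu = x.mean()
    sigma = x.std(ddof=1)  # sample standard deviation
    cp = (usl - lsl) / (6.0 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)
    return cp, cpk
```

For a perfectly centered process Cp equals Cpk; a Cpk noticeably below Cp signals that the process mean has drifted toward one limit.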

  15. Computer-assisted image processing to detect spores from the fungus Pandora neoaphidis

    OpenAIRE

    Korsnes, Reinert; Westrum, Karin; Fløistad, Erling; Klingen, Ingeborg

    2016-01-01

    This contribution demonstrates an example of experimental automatic image analysis to detect spores prepared on microscope slides derived from trapping. The application is to monitor aerial spore counts of the entomopathogenic fungus Pandora neoaphidis which may serve as a biological control agent for aphids. Automatic detection of such spores can therefore play a role in plant protection. The present approach for such detection is a modification of traditional manual microscopy of prepared s...

  16. Reliability analysis for the quench detection in the LHC machine

    CERN Document Server

    Denz, R; Vergara-Fernández, A

    2002-01-01

The Large Hadron Collider (LHC) will incorporate a large number of superconducting elements that require protection in case of a quench. Key elements in the quench protection system are the electronic quench detectors. Their reliability will have an important impact on the down time as well as on the operational cost of the collider. The expected rates of both false and missed quenches have been computed for several redundant detection schemes. The developed model takes account of the maintainability of the system to optimise the frequency of foreseen checks and evaluate their influence on the performance of different detection topologies. Given the uncertainty in the failure rates of the components, combined with the LHC tunnel environment, the study has been completed with a sensitivity analysis of the results. The chosen detection scheme and the maintainability strategy for each detector family are given.

  17. Analysis of accelerants and fire debris using aroma detection technology

    Energy Technology Data Exchange (ETDEWEB)

    Barshick, S.A.

    1997-01-17

    The purpose of this work was to investigate the utility of electronic aroma detection technologies for the detection and identification of accelerant residues in suspected arson debris. Through the analysis of known accelerant residues, a trained neural network was developed for classifying suspected arson samples. Three unknown fire debris samples were classified using this neural network. The item corresponding to diesel fuel was correctly identified every time. For the other two items, wide variations in sample concentration and excessive water content, producing high sample humidities, were shown to influence the sensor response. Sorbent sampling prior to aroma detection was demonstrated to reduce these problems and to allow proper neural network classification of the remaining items corresponding to kerosene and gasoline.

  18. Early detection of foot ulcers through asymmetry analysis

    Science.gov (United States)

    Kaabouch, Naima; Chen, Yi; Hu, Wen-Chen; Anderson, Julie; Ames, Forrest; Paulson, Rolf

    2009-02-01

    Foot ulcers affect millions of Americans annually. Areas that are likely to ulcerate have been associated with increased local skin temperatures due to inflammation and enzymatic autolysis of tissue. Conventional methods to assess skin, including inspection and palpation, may be valuable approaches, but usually they do not detect changes in skin integrity until an ulcer has already developed. Conversely, infrared imaging is a technology able to assess the integrity of the skin and its many layers, thus having the potential to index the cascade of physiological events in the prevention, assessment, and management of foot ulcers. In this paper, we propose a technique, asymmetry analysis, to automatically analyze the infrared images in order to detect inflammation. Preliminary results show that the proposed technique can be reliable and efficient to detect inflammation and, hence, predict potential ulceration.
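The asymmetry analysis can be sketched as a pixel-wise comparison of one foot's thermogram with the horizontally mirrored contralateral foot. The 2.2 °C cut-off is a commonly cited screening threshold used here as an assumption, not necessarily the authors' value, and the function name is illustrative.

```python
import numpy as np

def asymmetry_flags(left, right, thresh=2.2):
    """Absolute temperature difference between the left-foot image and
    the horizontally mirrored right-foot image; pixels exceeding
    `thresh` (degrees C) are flagged as possible inflammation sites."""
    diff = np.abs(left - right[:, ::-1])
    return diff, diff > thresh
```

A real pipeline would first register the two foot regions so mirrored pixels correspond anatomically; the simple flip here assumes pre-aligned images.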

  19. The application of image processing to the detection of corrosion by radiography

    International Nuclear Information System (INIS)

    Packer, M.E.

    1979-02-01

The computer processing of digitised radiographs has been investigated with a view to improving X-radiography as a method for detecting corrosion. Linearisation of the image-density distribution in a radiograph has been used to enhance information that can be attributed to corrosion, making the detection of corrosion by radiography both easier and more reliable. However, conclusive evidence has yet to be obtained that image processing can result in the detection of corrosion that was not already faintly apparent on an unprocessed radiograph. A potential method has also been discovered for analysing the history of a corrosion site.

  20. Fault Detection and Diagnosis System in Process industry Based on Big Data and WeChat

    Directory of Open Access Journals (Sweden)

    Sun Zengqiang

    2017-01-01

The fault detection and diagnosis information of a process industry plant can be received anytime and anywhere on a mobile phone, based on big data and WeChat, removing the constraint that such information could only be checked on the Distributed Control System (DCS) in the central control room or looked over in the office. Fault detection and diagnosis information sharing can then be provided and, moreover, the fault detection alarm range, codes, and notification times can be personalized. The pressure on managers working in the process industry can be relieved with this mobile information system.

  1. SCHEME ANALYSIS TREE DIMENSIONS AND TOLERANCES PROCESSING

    OpenAIRE

    Constanta RADULESCU; Liviu Marius CÎRŢÎNĂ; Constantin MILITARU

    2011-01-01

This paper presents one of the steps that help us to determine the optimal tolerances depending on the technological capability of the processing equipment. To determine the tolerances in this way, it is necessary to study and represent schematically the operations used in the technological process of making a piece. Also in this phase the tree diagram of the dimensions and machining tolerances will be made, showing the dimensions and tolerances of the design execution. Determination processes, and ...

  2. Bayesian analysis of Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2006-01-01

Recently Møller, Pettitt, Berthelsen and Reeves introduced a new MCMC methodology for drawing samples from a posterior distribution when the likelihood function is only specified up to a normalising constant. We illustrate the method in the setting of Bayesian inference for Markov point processes ... a partially ordered Markov point process as the auxiliary variable. As the method requires simulation from the "unknown" likelihood, perfect simulation algorithms for spatial point processes become useful.

  3. Amorphous silicon batch process cost analysis

    International Nuclear Information System (INIS)

    Whisnant, R.A.; Sherring, C.

    1993-08-01

This report describes the development of baseline manufacturing cost data to assist PVMaT monitoring teams in assessing current and future subcontracts, with an emphasis on commercialization and production. A process for the manufacture of a single-junction, large-area, a-Si module was modeled using an existing Research Triangle Institute (RTI) computer model. The model estimates a required, or breakeven, price for the module based on its production process and the financial structure of the company operating the process. Sufficient detail on cost drivers is presented so that the relationships between process features, business characteristics, and the estimated required price are clear.

  4. Incrementally Detecting Change Types of Spatial Area Object: A Hierarchical Matching Method Considering Change Process

    Directory of Open Access Journals (Sweden)

    Yanhui Wang

    2018-01-01

Detecting and extracting the change types of spatial area objects can track area objects' spatiotemporal change patterns and provide a change-backtracking mechanism for incrementally updating spatial datasets. To address the high complexity of detection methods, the high redundancy rate of detection factors, and the low degree of automation during the incremental update process, we consider the change process of area objects in an integrated way and propose a hierarchical matching method to detect the nine types of changes of area objects, while minimizing the complexity of the algorithm and the redundancy rate of detection factors. We illustrate in detail the identification, extraction, and database entry of change types, and how we achieve a close connection and organic coupling of incremental information extraction and object type-of-change detection so as to characterize the whole change process. The experimental results show that this method can successfully detect incremental information about area objects in practical applications, with overall accuracy above 90%, much higher than the existing weighted matching method, making it quite feasible and applicable. It helps establish the correspondence between new-version and old-version objects, and facilitates linked update processing and quality control of spatial data.

  5. Leak detection in pipelines through spectral analysis of pressure signals

    Directory of Open Access Journals (Sweden)

    Souza A.L.

    2000-01-01

    The development and testing of a technique for leak detection in pipelines is presented. The technique is based on the spectral analysis of pressure signals measured in pipeline sections where the formation of stationary waves is favoured, allowing leak detection during the start/stop of pumps. Experimental tests were performed in a 1250 m long pipeline for various operational conditions (liquid flow rate and leakage configuration). Pressure transients were obtained by four transducers connected to a PC. The results show that the spectral analysis of pressure transients, together with knowledge of the reflection points, provides a simple and efficient way of identifying leaks during the start/stop of pumps in pipelines.
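The record reduces pressure transients to their standing-wave spectrum. As a rough illustration of that first step only (the function and the synthetic signal below are invented for this sketch, not taken from the paper), the dominant resonance frequencies of a recorded transient can be extracted with an FFT:

```python
import numpy as np

def dominant_frequencies(pressure, fs, n_peaks=3):
    """Return the strongest spectral peaks of a zero-meaned pressure transient."""
    x = np.asarray(pressure, dtype=float)
    x -= x.mean()                              # remove the static pressure head
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    order = np.argsort(spec[1:])[::-1] + 1     # rank bins by magnitude, skip DC
    return freqs[order[:n_peaks]]

# Synthetic transient: a 0.8 Hz fundamental plus a weaker 2.4 Hz component
fs = 100.0
t = np.arange(0, 60, 1.0 / fs)
p = 2.0 * np.sin(2 * np.pi * 0.8 * t) + 0.5 * np.sin(2 * np.pi * 2.4 * t)
peaks = dominant_frequencies(p, fs)
```

In the paper's setting, a leak adds a reflection point, shifting or adding peaks in exactly this kind of spectrum.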

  6. Ring Laser Gyro-based Digital Processing Technique for Detecting Rotation Rate over Short Time Intervals

    Directory of Open Access Journals (Sweden)

    V. N. Enin

    2017-01-01

    The article investigates the capability of digital techniques to improve the measurement accuracy of a dithering ring laser gyro (DRLG) in detecting a constant rotation rate over short time intervals. The output of a GL-1 device within an LG triad, measuring the vertical component of the Earth's angular rotation rate in a laboratory setting, is selected as the object of study. The time of a single measurement is 2 minutes, and the target full standard deviation of the measurement error is at most 0.002 "/min. The objective of this study is to develop and substantiate a new, effective technique of LG digital signal processing that provides the accuracy demanded by modern requirements while reducing the measurement time of the constant rate component Ωz. The specific objectives are a comparative analysis of the precision capabilities of the known techniques over limited measurement time intervals; the development and justification of a new, more efficient technique of digital signal processing for a dithering ring LG; and experimental verification and evaluation of the effectiveness of the proposed technique. The article presents a comparative error analysis of practically applied digital techniques, such as the simple averaging method, the Hamming method, and the method of "conditional sample of regression lines", against the proposed technique of "recognition of the output signal image N". To compare the techniques, real digital output data were used, sampled at 400 Hz over 94 two-minute measurement intervals after the device had been switched on. The proposed LG output image recognition technique achieves about three times higher measurement accuracy over a two-minute interval compared to the known techniques.
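Two of the baseline estimators the article compares, simple averaging and a regression-line fit, can be illustrated on simulated data. All numbers below (noise level, stand-in rate) are invented for the sketch; this is not the GL-1 data:

```python
import numpy as np

rng = np.random.default_rng(7)
fs = 400.0                      # readout rate (Hz), matching the experiment
T = 120.0                       # one 2-minute measurement interval
t = np.arange(0, T, 1.0 / fs)
true_rate = 10.7                # stand-in constant rate (arbitrary units/s)

# Accumulated angle output with additive readout noise (illustrative only)
angle = true_rate * t + rng.normal(0.0, 5.0, t.size)

# Method 1: simple averaging -- net angle change over the interval
rate_avg = (angle[-1] - angle[0]) / (t[-1] - t[0])

# Method 2: least-squares regression line through the accumulated angle,
# which uses every sample rather than just the endpoints
slope, intercept = np.polyfit(t, angle, 1)
```

The regression estimate's variance shrinks with the number of samples, which is why regression-line methods outperform endpoint averaging over short intervals.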

  7. Geomorphological change detection of fluvial processes of lower Siret channel using LIDAR data

    Science.gov (United States)

    Niculita, Mihai; Obreja, Florin; Boca, Bogdan

    2015-04-01

    Lague D., Brodu N., Leroux J., 2013. Accurate 3D comparison of complex topography with terrestrial laser scanner: application to the Rangitikei canyon (N-Z). ISPRS Journal of Photogrammetry and Remote Sensing, 80:10-26. James L.A., Hodgson M.E., Ghoshal S., Latiolais M.M., 2012. Geomorphic change detection using historic maps and DEM differencing: the temporal dimension of geospatial analysis. Geomorphology, 137:181-198. Nedelcu G., Borcan M., Branescu E., Petre C., Teleanu B., Preda A., Murafa R., 2011. Exceptional floods from the years 2008 and 2010 in the Siret river basin. Proceedings of the Annual Scientific Conference of the National Romanian Institute of Hydrology and Water Administration, 1-3 November 2011. (in Romanian) Olariu P., Obreja F., Obreja I., 2009. Some aspects regarding the sediment transit from the Trotus catchment and the lower sector of the Siret river during the exceptional floods of 1991 and 2005. Annals of Stefan cel Mare University of Suceava, XVIII:93-104. (in Romanian) Serbu M., Obreja F., Olariu P., 2009. The 2008 floods from the upper Siret catchment: causes, effects, evaluation. Hidrotechnics, 54(12):1-38. (in Romanian) Wheaton J.M., Brasington J., Darby S., Sear D., 2009. Accounting for uncertainty in DEMs from repeat topographic surveys: improved sediment budgets. Earth Surface Processes and Landforms, 35(2):136-156.

  8. Fault Detection via Stability Analysis for the Hybrid Control Unit of HEVs

    OpenAIRE

    Kyogun Chang; Yoon Bok Lee

    2011-01-01

    Fault detection determines fault existence and detection time. This paper discusses a two-layered fault detection method to enhance reliability and safety. The two layers consist of fault detection in the component-level controllers and in the system-level controllers. Component-level controllers detect faults by using limit checking, model-based detection, and data-driven detection, while system-level controllers perform detection by stability analysis w...

  9. Detecting Data Concealment Programs Using Passive File System Analysis

    Science.gov (United States)

    Davis, Mark; Kennedy, Richard; Pyles, Kristina; Strickler, Amanda; Shenoi, Sujeet

    Individuals who wish to avoid leaving evidence on computers and networks often use programs that conceal data from conventional digital forensic tools. This paper discusses the application of passive file system analysis techniques to detect trace evidence left by data concealment programs. In addition, it describes the design and operation of Seraph, a tool that determines whether certain encryption, steganography and erasing programs were used to hide or destroy data.

  10. Detecting errors in micro and trace analysis by using statistics

    DEFF Research Database (Denmark)

    Heydorn, K.

    1993-01-01

    By assigning a standard deviation to each step in an analytical method it is possible to predict the standard deviation of each analytical result obtained by this method. If the actual variability of replicate analytical results agrees with the expected, the analytical method is said...... to results for chlorine in freshwater from BCR certification analyses by highly competent analytical laboratories in the EC. Titration showed systematic errors of several percent, while radiochemical neutron activation analysis produced results without detectable bias....

  11. On statistical analysis of compound point process

    Czech Academy of Sciences Publication Activity Database

    Volf, Petr

    2006-01-01

    Roč. 35, 2-3 (2006), s. 389-396 ISSN 1026-597X R&D Projects: GA ČR(CZ) GA402/04/1294 Institutional research plan: CEZ:AV0Z10750506 Keywords : counting process * compound process * hazard function * Cox -model Subject RIV: BB - Applied Statistics, Operational Research

  12. Introduction to image processing and analysis

    CERN Document Server

    Russ, John C

    2007-01-01

    ADJUSTING PIXEL VALUES: Optimizing Contrast; Color Correction; Correcting Nonuniform Illumination; Geometric Transformations; Image Arithmetic. NEIGHBORHOOD OPERATIONS: Convolution; Other Neighborhood Operations; Statistical Operations. IMAGE PROCESSING IN THE FOURIER DOMAIN: The Fourier Transform; Removing Periodic Noise; Convolution and Correlation; Deconvolution; Other Transform Domains; Compression. BINARY IMAGES: Thresholding; Morphological Processing; Other Morphological Operations; Boolean Operations. MEASUREMENTS: Global Measurements; Feature Measurements; Classification. APPENDIX: SOFTWARE. REFERENCES AND LITERATURE. INDEX.

  13. Shielding analysis of the advanced voloxidation process

    Energy Technology Data Exchange (ETDEWEB)

    Park, Chang Je; Park, J. J.; Lee, J. W.; Shin, J. M.; Park, G. I.; Song, K. C

    2008-09-15

    This report describes how much shielding benefit can be obtained from the advanced voloxidation process. The calculation was performed with the MCNPX code for a simple model of a spent fuel source surrounded by a concrete wall. The source terms were estimated with the ORIGEN-ARP code, and the gamma and neutron spectra were also obtained. The required thickness of the concrete wall was estimated before and after the voloxidation process. The results show that the gamma spectrum after the voloxidation process was reduced by 67% compared with that before the process, due to the removal of several gamma-emitting elements such as cesium and rubidium. The MCNPX calculations showed that the thickness of a general concrete wall could be reduced by 12% after the voloxidation process, while a heavy concrete wall provided a 28% reduction in the shielding of the source term. This is explained by the fact that many gamma-emitting isotopes, such as Pu-241, Y-90, and Sr-90, remain after the advanced voloxidation process because they are unaffected by it.

  14. Profitability Analysis of Groundnuts Processing in Maiduguri ...

    African Journals Online (AJOL)

    This paper examines the profitability of groundnuts processing in Maiduguri Metropolitan Council of Borno State. The specific objectives of the study were to examine the socioeconomic characteristics of groundnut processors, estimate the costs and returns in groundnut processing and determine the return on investment in ...

  15. Establishment of analysis method for methane detection by gas chromatography

    Science.gov (United States)

    Liu, Xinyuan; Yang, Jie; Ye, Tianyi; Han, Zeyu

    2018-02-01

    The study focused on the establishment of an analysis method for methane determination by gas chromatography. Methane was detected with a hydrogen flame ionization detector, and the quantitative relationship was determined by the working curve y = 2041.2x + 2187 with a correlation coefficient of 0.9979. A relative standard deviation of 2.60-6.33% and a recovery rate of 96.36-105.89% were obtained during the parallel determination of standard gas. This method is not well suited for biogas content analysis, because the methane content in biogas would exceed the measurement range of the method.
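In use, the working curve y = 2041.2x + 2187 is inverted to convert a measured peak area back into methane concentration (units as defined by the method; the helper name below is ours, not the paper's):

```python
# Working curve from the abstract: peak area y = 2041.2 * x + 2187
SLOPE, INTERCEPT = 2041.2, 2187.0

def methane_from_area(peak_area):
    """Invert the linear working curve to estimate methane concentration."""
    return (peak_area - INTERCEPT) / SLOPE

# Round trip: a sample at concentration 1.5 should be recovered exactly
area = SLOPE * 1.5 + INTERCEPT
x = methane_from_area(area)
```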

  16. Development of vibrational analysis for detection of antisymmetric shells

    CERN Document Server

    Esmailzadeh-Khadem, S; Rezaee, M

    2002-01-01

    In this paper, the vibrational behavior of bodies of revolution with different types of structural faults is studied. Since the vibrational characteristics of a structure are natural properties of the system, the existence of any structural fault causes measurable changes in these properties. Here, this is demonstrated: the vibrational behavior of a body of revolution with no structural faults is analyzed by two methods, (I) numerical analysis using the SUPER SAP software and (II) experimental modal analysis, and the natural frequencies and mode shapes are obtained. Then, different types of cracks are introduced into the structure, the analysis is repeated, and the results are compared. Based on this study, one may perform crack detection by measuring the natural frequencies and mode shapes of samples and comparing them with reference information obtained from the vibration analysis of the original, fault-free structure.

  17. Development of vibrational analysis for detection of antisymmetric shells

    International Nuclear Information System (INIS)

    Esmailzadeh Khadem, S.; Mahmoodi, M.; Rezaee, M.

    2002-01-01

    In this paper, the vibrational behavior of bodies of revolution with different types of structural faults is studied. Since the vibrational characteristics of a structure are natural properties of the system, the existence of any structural fault causes measurable changes in these properties. Here, this is demonstrated: the vibrational behavior of a body of revolution with no structural faults is analyzed by two methods, (I) numerical analysis using the SUPER SAP software and (II) experimental modal analysis, and the natural frequencies and mode shapes are obtained. Then, different types of cracks are introduced into the structure, the analysis is repeated, and the results are compared. Based on this study, one may perform crack detection by measuring the natural frequencies and mode shapes of samples and comparing them with reference information obtained from the vibration analysis of the original, fault-free structure.

  18. In-Process Detection of Weld Defects Using Laser-Based Ultrasound

    International Nuclear Information System (INIS)

    Bacher, G.D.; Kercel, S.W.; Kisner, R.A.; Klein, M.B.; Pouet, B.

    1999-01-01

    Laser-based ultrasonic (LBU) measurement shows great promise for on-line monitoring of weld quality in tailor-welded blanks. Tailor-welded blanks are steel blanks made from plates of differing thickness and/or properties butt-welded together; they are used in automobile manufacturing to produce body, frame, and closure panels. LBU uses a pulsed laser to generate the ultrasound and a continuous-wave (CW) laser interferometer to detect it at the point of interrogation. Because there is no sensor contact, or even near-contact, with the workpiece, LBU enables in-process measurements. The authors use laser-generated plate (Lamb) waves that propagate from one plate into the weld nugget as a means of detecting defects. This paper reports the results of an investigation of a number of inspection architectures based on processing of signals from selected plate waves, which are either reflected from or transmitted through the weld zone. Bayesian parameter estimation and wavelet analysis (both continuous and discrete) have shown that the LBU time-series signal is readily separable into components that provide distinguishing features describing weld quality. The authors anticipate that, in an on-line industrial application, these measurements can be implemented just downstream from the weld cell. The weld quality data can then be fed back to control critical weld parameters or to alert the operator to a problem requiring maintenance, so that internal weld defects and deviations from the desired surface profile can be corrected before defective parts are produced.

  19. Influence of volume of sample processed on detection of Chlamydia trachomatis in urogenital samples by PCR

    NARCIS (Netherlands)

    Goessens, W H; Kluytmans, J A; den Toom, N; van Rijsoort-Vos, T H; Niesters, B G; Stolz, E; Verbrugh, H A; Quint, W G

    In the present study, it was demonstrated that the sensitivity of the PCR for the detection of Chlamydia trachomatis is influenced by the volume of the clinical sample which is processed in the PCR. An adequate sensitivity for PCR was established by processing at least 4%, i.e., 80 microliters, of

  20. Adaptive Image Processing Methods for Improving Contaminant Detection Accuracy on Poultry Carcasses

    Science.gov (United States)

    A real-time multispectral imaging system has been demonstrated as a science-based tool for fecal and ingesta contaminant detection during poultry processing. In order to implement this imaging system in the commercial poultry processing industry, the false positives must be removed. For doi...

  1. SCHEME ANALYSIS TREE DIMENSIONS AND TOLERANCES PROCESSING

    Directory of Open Access Journals (Sweden)

    Constanta RADULESCU

    2011-07-01

    This paper presents one of the steps that help us determine the optimal tolerances depending on the technological capability of the processing equipment. To determine the tolerances in this way, it is necessary to study and represent schematically the operations used in the technological process of making a piece. In this phase, the tree diagram of the dimensions and machining tolerances is also built, showing the dimensions and tolerances of the design execution. The tree scheme of dimensions and tolerances will be determined for both the internal and external surfaces of a machined piece.

  2. Compositional analysis of polycrystalline hafnium oxide thin films by heavy-ion elastic recoil detection analysis

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, F.L. [Departamento de Electronica y Tecnologia de Computadoras, Universidad Politecnica de Cartagena, Campus Universitario Muralla del Mar, E-30202 Cartagena (Spain)]. E-mail: Felix.Martinez@upct.es; Toledano, M. [Departamento de Fisica Aplicada III, Universidad Complutense de Madrid, E-28025 Madrid (Spain); San Andres, E. [Departamento de Fisica Aplicada III, Universidad Complutense de Madrid, E-28025 Madrid (Spain); Martil, I. [Departamento de Fisica Aplicada III, Universidad Complutense de Madrid, E-28025 Madrid (Spain); Gonzalez-Diaz, G. [Departamento de Fisica Aplicada III, Universidad Complutense de Madrid, E-28025 Madrid (Spain); Bohne, W. [Hahn-Meitner-Institut Berlin, Abteilung SF-4, D-14109 Berlin (Germany); Roehrich, J. [Hahn-Meitner-Institut Berlin, Abteilung SF-4, D-14109 Berlin (Germany); Strub, E. [Hahn-Meitner-Institut Berlin, Abteilung SF-4, D-14109 Berlin (Germany)

    2006-10-25

    The composition of polycrystalline hafnium oxide thin films has been measured by heavy-ion elastic recoil detection analysis (HI-ERDA). The films were deposited by high-pressure reactive sputtering (HPRS) on silicon wafers using an oxygen plasma at pressures between 0.8 and 1.6 mbar and during deposition times between 0.5 and 3.0 h. Hydrogen was found to be the main impurity and its concentration increased with deposition pressure. The composition was always slightly oxygen-rich, which is attributed to the oxygen plasma. Additionally, an interfacial silicon oxide thin layer was detected and taken into account. The thickness of the hafnium oxide film was found to increase linearly with deposition time and to decrease exponentially with deposition pressure, whereas the thickness of the silicon oxide interfacial layer has a minimum as a function of pressure at around 1.2 mbar and increases slightly as a function of time. The measurements confirmed that this interfacial layer is formed mainly during the early stages of the deposition process.

  3. Radiographic caries detection: A systematic review and meta-analysis.

    Science.gov (United States)

    Schwendicke, Falk; Tzschoppe, Markus; Paris, Sebastian

    2015-08-01

    This systematic review aimed at evaluating the accuracy of radiographic caries detection for different lesions at different locations. Studies reporting on the accuracy (sensitivity/specificity) of radiographic detection of natural primary caries lesions under clinical or in vitro conditions were included. Risk of bias was assessed using QUADAS-2. Pooled sensitivity, specificity and diagnostic odds ratios (DORs) were calculated using random-effects meta-analysis. Analyses were performed separately for occlusal and proximal lesions, with further discrimination between any kind of lesion, dentine lesions, and cavitated lesions. Electronic databases (Medline, Embase, Cochrane Central) and grey literature were systematically searched, complemented by cross-referencing from bibliographies. From 947 identified articles, 442 were analyzed in full text. 117 studies (13,375 teeth, 19,108 surfaces) were included, the majority of them reporting on permanent teeth and having a high risk of bias. The detection of any kind of lesion (i.e. including initial lesions) had low sensitivities (pooled values [95% CI]: 0.24 [0.21/0.26] to 0.42 [0.31/0.34]), but moderate to high specificities (0.70 [0.76/0.84] to 0.97 [0.95/0.98]). For dentine lesions, sensitivities were higher (from 0.36 [0.24/0.49] for proximal to 0.56 [0.53/0.59] for occlusal lesions), and specificities ranged between 0.87 [0.85/0.89] and 0.95 [0.94/0.96]. No studies reported on cavitated occlusal lesions, whilst for cavitated proximal lesions, sensitivities increased above 0.60, whilst specificities remained high (above 0.90). Radiographic caries detection is highly accurate for cavitated proximal lesions, and also seems suitable for detecting dentine caries lesions. For detecting initial lesions, more sensitive methods could be considered in populations with high caries risk and prevalence. Radiographic caries detection is especially suitable for detecting more advanced caries lesions, and carries limited risk of false positive diagnoses.
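The diagnostic odds ratio pooled in such meta-analyses can be written directly in terms of sensitivity and specificity: DOR = (sens · spec) / ((1 − sens)(1 − spec)). As a quick worked check with the review's occlusal dentine figures (sensitivity 0.56, specificity 0.95):

```python
def diagnostic_odds_ratio(sensitivity, specificity):
    """DOR = (TP/FN) / (FP/TN), expressed via sensitivity and specificity."""
    return (sensitivity * specificity) / ((1 - sensitivity) * (1 - specificity))

# Occlusal dentine lesions from the review: sensitivity 0.56, specificity 0.95
dor = diagnostic_odds_ratio(0.56, 0.95)  # ≈ 24.2
```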

  4. M-X Basing Area Analysis Process

    National Research Council Canada - National Science Library

    1980-01-01

    ... or unreasonable basing areas for the M-X missile in multiple protective structures (MPS). The process began in January 1977 with criteria involving geotechnical, cultural, safety and other concerns...

  5. Comparative analysis of accelerogram processing methods

    International Nuclear Information System (INIS)

    Goula, X.; Mohammadioun, B.

    1986-01-01

    The work described hereafter is a short account of an ongoing research project concerning high-quality processing of strong-motion earthquake recordings. Several processing procedures have been tested, applied to synthetic signals simulating ground motion designed for this purpose. The correction methods operating in the time domain are seen to be strongly dependent upon the sampling rate. Two methods of low-frequency filtering followed by an integration of accelerations yielded satisfactory results [fr

  6. Power Load Event Detection and Classification Based on Edge Symbol Analysis and Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Lei Jiang

    2012-01-01

    Energy signature analysis of power appliances is the core of nonintrusive load monitoring (NILM), where detailed data on the appliances used in houses are obtained by analyzing changes in voltage and current. This paper focuses on developing automatic power load event detection and appliance classification based on machine learning. For power load event detection, the paper presents a new transient detection algorithm. By analyzing turn-on and turn-off transient waveforms, it can accurately detect the edge point at which a device is switched on or off. The proposed load classification technique can identify different power appliances with improved recognition accuracy and computational speed. The load classification method is composed of two processes: frequency feature analysis and a support vector machine. The experimental results indicate that incorporating the new edge detection and turn-on/turn-off transient signature analysis into NILM reveals more information than traditional NILM methods. The load classification method has achieved a recognition rate of more than ninety percent.
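The edge-detection step can be caricatured as thresholding the sample-to-sample power difference. This toy version (threshold and waveform are invented here) deliberately omits the turn-on/turn-off waveform analysis the paper layers on top:

```python
import numpy as np

def detect_edges(power, threshold=50.0):
    """Flag switching events where consecutive power samples jump by more
    than `threshold` watts; returns (sample index, step size) pairs."""
    p = np.asarray(power, dtype=float)
    steps = np.diff(p)
    idx = np.flatnonzero(np.abs(steps) > threshold)
    return [(int(i) + 1, float(steps[i])) for i in idx]

# Steady 100 W base load; a 1500 W appliance switches on, then off again
signal = [100] * 5 + [1600] * 5 + [100] * 5
events = detect_edges(signal)  # [(5, 1500.0), (10, -1500.0)]
```

A positive step marks a turn-on event and a negative step a turn-off; the step magnitude is the appliance's power signature.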

  7. A model of the gas analysis system operation process

    Science.gov (United States)

    Yakimenko, I. V.; Kanishchev, O. A.; Lyamets, L. L.; Volkova, I. V.

    2017-12-01

    The characteristic features of modeling the gas-analysis measurement system operation process on the basis of the semi-Markov process theory are discussed. The model of the measuring gas analysis system operation process is proposed, which makes it possible to take into account the influence of the replacement interval, the level of reliability and maintainability and to evaluate the product reliability.

  8. Error detection in GPS observations by means of Multi-process models

    DEFF Research Database (Denmark)

    Thomsen, Henrik F.

    2001-01-01

    The main purpose of this article is to present the idea of using Multi-process models as a method of detecting errors in GPS observations. The theory behind Multi-process models and double-differenced phase observations in GPS is presented briefly. It is shown how to model cycle slips in the Multi-process context by means of a simple simulation. The simulation is used to illustrate how the method works, and it is concluded that the method deserves further investigation.

  9. Early Detection of Breast Cancer on Mammograms Using: Perceptual Feedback, Computer Processed Images and Ultrasound

    Science.gov (United States)

    1994-01-01

    Subject terms: breast cancer detection, biofeedback, ultrasound, image processing. Principal Investigator: Peter Bloch, Ph.D. Contracting Organization: University of Pennsylvania. The report concerns, among other topics, amplitude distortions in ultrasound mammography.

  10. [Multi-DSP parallel processing technique of hyperspectral RX anomaly detection].

    Science.gov (United States)

    Guo, Wen-Ji; Zeng, Xiao-Ru; Zhao, Bao-Wei; Ming, Xing; Zhang, Gui-Feng; Lü, Qun-Bo

    2014-05-01

    To satisfy the requirements of high speed, real-time operation, and mass data storage for RX anomaly detection on hyperspectral image data, this paper proposes a multi-DSP parallel processing system for hyperspectral imagery based on the CPCI Express standard bus architecture. The hardware topology combines the tight coupling of four DSPs sharing a data bus and memory unit with the interconnection of link ports. On this hardware platform, by assigning a parallel processing task to each DSP in consideration of the spectral RX anomaly detection algorithm and the 3D structure of hyperspectral data, a four-DSP parallel processing technique is proposed that computes the mean vector and covariance matrix of the whole image by spatially partitioning it. The experimental results show that, with equivalent detection performance, the proposed four-DSP parallel implementation of the RX anomaly detection algorithm runs four times faster than a single DSP. This overcomes the constraint that a single DSP's internal storage capacity places on processing huge image data, while meeting the demand for real-time processing of spectral data.
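The quantities the DSPs compute in parallel, the scene mean vector and covariance matrix, are exactly what the classical global RX detector needs. A serial NumPy sketch of that detector (array shapes and the injected anomaly are illustrative, not from the paper):

```python
import numpy as np

def rx_scores(cube):
    """Global RX detector: Mahalanobis distance of every pixel spectrum
    from the scene mean (cube shape: rows x cols x bands)."""
    h, w, b = cube.shape
    X = cube.reshape(-1, b).astype(float)
    mu = X.mean(axis=0)                       # scene mean vector
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d = X - mu
    # Quadratic form d^T * cov_inv * d for every pixel at once
    scores = np.einsum('ij,jk,ik->i', d, cov_inv, d)
    return scores.reshape(h, w)

# Gaussian background plus one spectrally anomalous pixel
rng = np.random.default_rng(0)
cube = rng.normal(0.0, 1.0, (20, 20, 5))
cube[10, 10] += 8.0                           # inject an anomaly in every band
scores = rx_scores(cube)
```

In the paper's scheme, each DSP accumulates partial sums for `mu` and the covariance over its spatial partition, and the partial results are merged before the per-pixel scoring.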

  11. A New MANET wormhole detection algorithm based on traversal time and hop count analysis.

    Science.gov (United States)

    Karlsson, Jonny; Dooley, Laurence S; Pulkkis, Göran

    2011-01-01

    As demand increases for ubiquitous network facilities, infrastructure-less and self-configuring systems like Mobile Ad hoc Networks (MANET) are gaining popularity. MANET routing security however, is one of the most significant challenges to wide scale adoption, with wormhole attacks being an especially severe MANET routing threat. This is because wormholes are able to disrupt a major component of network traffic, while concomitantly being extremely difficult to detect. This paper introduces a new wormhole detection paradigm based upon Traversal Time and Hop Count Analysis (TTHCA), which in comparison to existing algorithms, consistently affords superior detection performance, allied with low false positive rates for all wormhole variants. Simulation results confirm that the TTHCA model exhibits robust wormhole route detection in various network scenarios, while incurring only a small network overhead. This feature makes TTHCA an attractive choice for MANET environments which generally comprise devices, such as wireless sensors, which possess a limited processing capability.
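TTHCA's core idea, as described above, is that a wormhole tunnel shortens the reported hop count without shortening the physical traversal time, so the time per hop becomes implausibly large. A toy per-hop-time check (the threshold and timings below are invented, not the paper's parameters):

```python
def per_hop_time(rtt_ms, hop_count):
    """Round-trip time divided by hop count: the basic TTHCA statistic."""
    return rtt_ms / hop_count

def wormhole_suspected(rtt_ms, hop_count, max_hop_ms=2.0):
    """A wormhole makes two distant nodes appear adjacent, so the reported
    hop count is too small for the measured traversal time."""
    return per_hop_time(rtt_ms, hop_count) > max_hop_ms

# Honest 4-hop route vs. a tunnelled route reported as only 2 hops
honest = wormhole_suspected(6.0, 4)     # 1.5 ms/hop -> not suspicious
tunnelled = wormhole_suspected(9.0, 2)  # 4.5 ms/hop -> suspicious
```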

  12. A New MANET Wormhole Detection Algorithm Based on Traversal Time and Hop Count Analysis

    Directory of Open Access Journals (Sweden)

    Göran Pulkkis

    2011-11-01

    As demand increases for ubiquitous network facilities, infrastructure-less and self-configuring systems like Mobile Ad hoc Networks (MANET) are gaining popularity. MANET routing security, however, is one of the most significant challenges to wide scale adoption, with wormhole attacks being an especially severe MANET routing threat. This is because wormholes are able to disrupt a major component of network traffic, while concomitantly being extremely difficult to detect. This paper introduces a new wormhole detection paradigm based upon Traversal Time and Hop Count Analysis (TTHCA), which in comparison to existing algorithms, consistently affords superior detection performance, allied with low false positive rates for all wormhole variants. Simulation results confirm that the TTHCA model exhibits robust wormhole route detection in various network scenarios, while incurring only a small network overhead. This feature makes TTHCA an attractive choice for MANET environments, which generally comprise devices, such as wireless sensors, with limited processing capability.

  13. Application of signal processing techniques for islanding detection of distributed generation in distribution network: A review

    International Nuclear Information System (INIS)

    Raza, Safdar; Mokhlis, Hazlie; Arof, Hamzah; Laghari, J.A.; Wang, Li

    2015-01-01

    Highlights: • Pros & cons of conventional islanding detection techniques (IDTs) are discussed. • The ability of signal processing techniques (SPTs) to detect islanding is discussed. • The ability of SPTs to improve the performance of passive techniques is discussed. • Fourier, s-transform, wavelet, HHT & tt-transform based IDTs are reviewed. • The application of intelligent classifiers (ANN, ANFIS, Fuzzy, SVM) in SPTs is discussed. - Abstract: High penetration of distributed generation resources (DGR) in the distribution network provides many benefits in terms of power quality, efficiency, and low carbon emissions in the power system. However, efficient islanding detection and immediate disconnection of DGR are critical in order to avoid equipment damage, grid protection interference, and personnel safety hazards. Islanding detection techniques are mainly classified into remote, passive, active, and hybrid techniques. Of these, passive techniques are more advantageous due to lower power quality degradation, lower cost, and widespread usage by power utilities. However, their main limitations are that they possess large non-detection zones and require threshold settings. Various signal processing techniques and intelligent classifiers have been used to overcome the limitations of passive islanding detection. Signal processing techniques, in particular, are adopted due to their versatility, stability, cost effectiveness, and ease of modification. This paper presents a comprehensive overview of signal processing techniques used to improve common passive islanding detection techniques. A performance comparison of the signal processing based islanding detection techniques with existing techniques is also provided. Finally, this paper outlines the relative advantages and limitations of the signal processing techniques in order to provide basic guidelines for researchers and field engineers in determining the best method for their system.

  14. Stream computing for biomedical signal processing: A QRS complex detection case-study.

    Science.gov (United States)

    Murphy, B M; O'Driscoll, C; Boylan, G B; Lightbody, G; Marnane, W P

    2015-01-01

    Recent developments in "Big Data" have brought significant gains in the ability to process large amounts of data on commodity server hardware. Stream computing is a relatively new paradigm in this area, addressing the need to process data in real time with very low latency. While this approach was developed for dealing with large-scale data from the worlds of business, security and finance, there is a natural overlap with clinical needs for physiological signal processing. In this work we present a case study of stream processing applied to a typical physiological signal processing problem: QRS detection from ECG data.
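The abstract does not detail the detection kernel itself, only the streaming framework around it. As a minimal stand-in for such a kernel (all parameters and the synthetic signal are invented), a naive amplitude-threshold R-peak picker with a refractory period looks like:

```python
import numpy as np

def detect_qrs(ecg, fs, threshold=0.6, refractory_s=0.25):
    """Minimal R-peak picker: local maxima above `threshold` (as a fraction
    of the signal maximum), separated by at least a refractory period."""
    x = np.asarray(ecg, dtype=float)
    thr = threshold * x.max()
    refractory = int(refractory_s * fs)
    peaks, last = [], -refractory
    for i in range(1, len(x) - 1):
        if x[i] >= thr and x[i] >= x[i - 1] and x[i] > x[i + 1] \
                and i - last >= refractory:
            peaks.append(i)
            last = i
    return peaks

# Synthetic ECG: narrow unit spikes at 1 Hz (60 bpm) on a flat baseline
fs = 250
ecg = np.zeros(5 * fs)
ecg[::fs] = 1.0            # R waves at t = 0, 1, 2, 3, 4 s
peaks = detect_qrs(ecg, fs)
```

In a stream-computing deployment, this per-sample logic would run inside an operator fed by the live ECG stream rather than over a stored array.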

  15. RADIA: RNA and DNA integrated analysis for somatic mutation detection.

    Directory of Open Access Journals (Sweden)

    Amie J Radenbaugh

    Full Text Available The detection of somatic single nucleotide variants is a crucial component to the characterization of the cancer genome. Mutation calling algorithms thus far have focused on comparing the normal and tumor genomes from the same individual. In recent years, it has become routine for projects like The Cancer Genome Atlas (TCGA) to also sequence the tumor RNA. Here we present RADIA (RNA and DNA Integrated Analysis), a novel computational method combining the patient-matched normal and tumor DNA with the tumor RNA to detect somatic mutations. The inclusion of the RNA increases the power to detect somatic mutations, especially at low DNA allelic frequencies. By integrating an individual's DNA and RNA, we are able to detect mutations that would otherwise be missed by traditional algorithms that examine only the DNA. We demonstrate high sensitivity (84%) and very high precision (98% and 99%) for RADIA in patient data from endometrial carcinoma and lung adenocarcinoma from TCGA. Mutations with both high DNA and RNA read support have the highest validation rate of over 99%. We also introduce a simulation package that spikes in artificial mutations to patient data, rather than simulating sequencing data from a reference genome. We evaluate sensitivity on the simulation data and demonstrate our ability to rescue back mutations at low DNA allelic frequencies by including the RNA. Finally, we highlight mutations in important cancer genes that were rescued due to the incorporation of the RNA.

  16. Sensor Failure Detection of FASSIP System using Principal Component Analysis

    Science.gov (United States)

    Sudarno; Juarsa, Mulya; Santosa, Kussigit; Deswandri; Sunaryo, Geni Rina

    2018-02-01

    In the nuclear reactor accident at Fukushima Daiichi in Japan, the damage to the core and pressure vessel was caused by the failure of the active cooling system (the diesel generators were inundated by the tsunami). Research on passive cooling systems for Nuclear Power Plants is therefore performed to improve the safety aspects of nuclear reactors. The FASSIP system (Passive System Simulation Facility) is an installation used to study the characteristics of passive cooling systems at nuclear power plants. Accurate sensor measurement in the FASSIP system is essential, because it is the basis for determining the characteristics of a passive cooling system. In this research, a sensor failure detection method for the FASSIP system is developed, so that indications of sensor failure can be detected early. The method used is Principal Component Analysis (PCA) to reduce the dimension of the sensor data, with the Squared Prediction Error (SPE) and Hotelling statistic criteria for detecting sensor failure indications. The results show that the PCA method is capable of detecting the occurrence of a failure at any sensor.
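The SPE monitoring step mentioned in the abstract can be sketched as follows. All specifics below are illustrative assumptions (a synthetic latent-variable data set, a single retained component, a 99th-percentile empirical control limit); the paper derives its SPE and Hotelling limits differently:

```python
import numpy as np

def fit_pca(X, n_components):
    """Fit a principal subspace on healthy (fault-free) sensor data."""
    mean = X.mean(axis=0)
    # principal directions via SVD of the centred data
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:n_components].T  # loading matrix P

def spe(X, mean, P):
    """Squared Prediction Error: squared norm of the residual left after
    projecting each sample onto the principal subspace."""
    Xc = X - mean
    residual = Xc - Xc @ P @ P.T
    return np.sum(residual ** 2, axis=1)

rng = np.random.default_rng(0)
# three correlated "sensors" driven by one latent variable
latent = rng.normal(size=(500, 1))
healthy = latent @ np.array([[1.0, 0.8, -0.5]]) + 0.05 * rng.normal(size=(500, 3))
mean, P = fit_pca(healthy, n_components=1)
limit = np.percentile(spe(healthy, mean, P), 99)  # empirical control limit

faulty = healthy.copy()
faulty[:, 1] += 2.0  # bias fault injected on sensor 2
```

A biased sensor breaks the correlation structure learned from healthy data, so its residual (and hence the SPE) jumps above the control limit even though each individual reading still looks plausible.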

  17. Analysis of the theoretical bias in dark matter direct detection

    International Nuclear Information System (INIS)

    Catena, Riccardo

    2014-01-01

    Fitting the model ''A'' to dark matter direct detection data, when the model that underlies the data is ''B'', introduces a theoretical bias in the fit. We perform a quantitative study of the theoretical bias in dark matter direct detection, with a focus on assumptions regarding the dark matter interactions, and velocity distribution. We address this problem within the effective theory of isoscalar dark matter-nucleon interactions mediated by a heavy spin-1 or spin-0 particle. We analyze 24 benchmark points in the parameter space of the theory, using frequentist and Bayesian statistical methods. First, we simulate the data of future direct detection experiments assuming a momentum/velocity dependent dark matter-nucleon interaction, and an anisotropic dark matter velocity distribution. Then, we fit a constant scattering cross section, and an isotropic Maxwell-Boltzmann velocity distribution to the simulated data, thereby introducing a bias in the analysis. The best fit values of the dark matter particle mass differ from their benchmark values up to 2 standard deviations. The best fit values of the dark matter-nucleon coupling constant differ from their benchmark values up to several standard deviations. We conclude that common assumptions in dark matter direct detection are a source of potentially significant bias

  18. Detection of irradiated chicken by 2-alkylcyclobutanone analysis

    Energy Technology Data Exchange (ETDEWEB)

    Tanabe, Hiroko; Goto, Michiko [Tokyo Metropolitan Industrial Technology Research Institute, Tokyo (Japan); Miyahara, Makoto [National Institute of Health Sciences, Tokyo (Japan)

    2001-09-01

    Chicken meat irradiated at 0.5 kGy or higher doses was identified by a GC/MS method analyzing 2-dodecylcyclobutanone (2-DCB) and 2-tetradecylcyclobutanone (2-TCB), which are formed from palmitic acid and stearic acid respectively, and isolated using Soxhlet extraction and Florisil chromatography. Many fat-containing foods have oleic acid in abundance as a parent fatty acid, and chicken meat contains palmitoleic acid in amounts comparable to stearic acid. In this study, we detected 2-tetradec-5'-enylcyclobutanone (2-TeCB) and 2-dodec-5'-enylcyclobutanone (2-DeCB) in chicken meat, which are formed by irradiation from oleic acid and palmitoleic acid respectively, using the GC/MS method. Sensitivity in detection of both 2-TeCB and 2-DeCB was lower than that of 2-DCB. However, at least 0.57 µg/g fat of 2-TeCB was detected in chicken meat irradiated at 0.5 kGy, so 2-TeCB seems to be a useful marker for the identification of irradiated foods containing fat. In contrast, 2-DeCB was not detected clearly at low doses. This suggests that 2-DeCB may be a useful marker for irradiated fat only in foods containing enough palmitoleic acid for analysis. In addition, 2-tetradecadienylcyclobutanone, which is formed from linoleic acid, was also found in chicken meat. (author)

  19. Phishing Detection: Analysis of Visual Similarity Based Approaches

    Directory of Open Access Journals (Sweden)

    Ankit Kumar Jain

    2017-01-01

    Full Text Available Phishing is one of the major problems faced by the cyber-world and leads to financial losses for both industries and individuals. Detection of phishing attacks with high accuracy has always been a challenging issue. At present, visual similarity based techniques are very useful for detecting phishing websites efficiently. A phishing website looks very similar in appearance to its corresponding legitimate website to deceive users into believing that they are browsing the correct website. Visual similarity based phishing detection techniques utilise the feature set like text content, text format, HTML tags, Cascading Style Sheets (CSS), images, and so forth, to make the decision. These approaches compare the suspicious website with the corresponding legitimate website by using various features and if the similarity is greater than the predefined threshold value then it is declared phishing. This paper presents a comprehensive analysis of phishing attacks, their exploitation, some of the recent visual similarity based approaches for phishing detection, and a comparative study of them. Our survey provides a better understanding of the problem, current solution space, and scope of future research to deal with phishing attacks efficiently using visual similarity based approaches.

  20. Using Order Tracking Analysis Method to Detect the Angle Faults of Blades on Wind Turbine

    DEFF Research Database (Denmark)

    Li, Pengfei; Hu, Weihao; Liu, Juncheng

    2016-01-01

    The angle faults of blades on wind turbines usually include the set angle fault and the pitch angle fault. They account for a high proportion of all wind turbine faults. Compared with traditional fault detection methods, using order tracking analysis to detect angle faults... has many advantages, such as easy implementation and high system reliability. Because Power Spectral Density (PSD) or Fast Fourier Transform (FFT) methods cannot obtain clear fault characteristic frequencies, this kind of fault should be detected by a more effective method. This paper... proposes a novel method of using order tracking analysis to analyze the signal of the input aerodynamic torque received by the hub. After the analysis process, the fault characteristic frequency can be extracted from the analyzed signals and compared with the signals from normal operating conditions...
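The core of order tracking is resampling from uniform time to uniform shaft angle, so that speed-locked components land on fixed "orders" instead of smearing across frequencies as the speed varies. A minimal sketch under invented conditions (a synthetic run-up and a pure 3rd-order torque ripple; not the paper's turbine model):

```python
import numpy as np

def order_track(signal, t, shaft_angle, samples_per_rev=64):
    """Resample a time-domain signal onto a uniform shaft-angle grid so an
    FFT yields orders (multiples of the rotation frequency) even as the
    rotation speed varies."""
    n_revs = shaft_angle[-1] / (2 * np.pi)
    n_out = int(n_revs * samples_per_rev)
    angle_grid = np.linspace(0, shaft_angle[-1], n_out, endpoint=False)
    t_at_angle = np.interp(angle_grid, shaft_angle, t)  # invert angle(t)
    return np.interp(t_at_angle, t, signal)

# shaft running up from 10 Hz to 20 Hz with a torque ripple locked to order 3
fs = 10000.0
t = np.arange(0, 2, 1 / fs)
speed_hz = 10.0 + 5.0 * t
shaft_angle = 2 * np.pi * np.cumsum(speed_hz) / fs
torque = np.sin(3 * shaft_angle)

samples_per_rev = 64
resampled = order_track(torque, t, shaft_angle, samples_per_rev)
spectrum = np.abs(np.fft.rfft(resampled))
orders = np.fft.rfftfreq(len(resampled), d=1.0 / samples_per_rev)
dominant_order = orders[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
```

A plain FFT of `torque` would spread the ripple between 30 Hz and 60 Hz during the run-up; after angle-domain resampling the same component collapses to a single peak at order 3.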

  1. Automated analysis for detecting beams in laser wakefield simulations

    International Nuclear Information System (INIS)

    Ushizima, Daniela M.; Rubel, Oliver; Prabhat, Mr.; Weber, Gunther H.; Bethel, E. Wes; Aragon, Cecilia R.; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Hamann, Bernd; Messmer, Peter; Hagen, Hans

    2008-01-01

    Laser wakefield particle accelerators have shown the potential to generate electric fields thousands of times higher than those of conventional accelerators. The resulting extremely short particle acceleration distance could yield a potential new compact source of energetic electrons and radiation, with wide applications from medicine to physics. Physicists investigate laser-plasma internal dynamics by running particle-in-cell simulations; however, this generates a large dataset that requires time-consuming, manual inspection by experts in order to detect key features such as beam formation. This paper describes a framework to automate the data analysis and classification of simulation data. First, we propose a new method to identify locations with high density of particles in the space-time domain, based on maximum extremum point detection on the particle distribution. We analyze high density electron regions using a lifetime diagram by organizing and pruning the maximum extrema as nodes in a minimum spanning tree. Second, we partition the multivariate data using fuzzy clustering to detect time steps in an experiment that may contain a high quality electron beam. Finally, we combine results from fuzzy clustering and bunch lifetime analysis to estimate spatially confined beams. We demonstrate our algorithms successfully on four different simulation datasets.
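The first step described, locating maximum extrema of the particle distribution, can be illustrated with a toy 1-D version (the histogram, the noise-floor threshold, and the bunch positions below are invented for illustration):

```python
import numpy as np

def local_maxima(density, min_value=0.0):
    """Indices of local maxima (left-strict, to tolerate count ties) in a
    1-D density profile, keeping only peaks above a noise floor."""
    d = np.asarray(density, dtype=float)
    i = np.arange(1, len(d) - 1)
    mask = (d[i] > d[i - 1]) & (d[i] >= d[i + 1]) & (d[i] >= min_value)
    return i[mask]

# synthetic particle distribution with two dense bunches
rng = np.random.default_rng(1)
particles = np.concatenate([rng.normal(-2.0, 0.1, 5000),
                            rng.normal(1.5, 0.1, 3000)])
density, edges = np.histogram(particles, bins=100, range=(-4, 4))
centers = 0.5 * (edges[:-1] + edges[1:])
peaks = local_maxima(density, min_value=50)
```

In the paper's pipeline such extrema become nodes of a minimum spanning tree tracked over time; here the sketch stops at detecting the two dense regions in a single snapshot.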

  2. Laser processing and analysis of materials

    CERN Document Server

    Duley, W W

    1983-01-01

    It has often been said that the laser is a solution searching for a problem. The rapid development of laser technology over the past dozen years has led to the availability of reliable, industrially rated laser sources with a wide variety of output characteristics. This, in turn, has resulted in new laser applications as the laser becomes a familiar processing and analytical tool. The field of materials science, in particular, has become a fertile one for new laser applications. Laser annealing, alloying, cladding, and heat treating were all but unknown 10 years ago. Today, each is a separate, dynamic field of research activity with many of the early laboratory experiments resulting in the development of new industrial processing techniques using laser technology. Ten years ago, chemical processing was in its infancy awaiting, primarily, the development of reliable tunable laser sources. Now, with tunability over the entire spectrum from the vacuum ultraviolet to the far infrared, photochemistry is undergo...

  3. 300 Area process trench sediment analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Zimmerman, M.G.; Kossik, C.D.

    1987-12-01

    This report describes the results of a sampling program for the sediments underlying the Process Trenches serving the 300 Area on the Hanford reservation. These Process Trenches were the subject of a Closure Plan submitted to the Washington State Department of Ecology and to the US Environmental Protection Agency in lieu of a Part B permit application on November 8, 1985. The closure plan described a proposed sampling plan for the underlying sediments and potential remedial actions to be determined by the sample analyses results. The results and proposed remedial action plan are presented and discussed in this report. 50 refs., 6 figs., 8 tabs.

  4. Detection of Wind Turbine Power Performance Abnormalities Using Eigenvalue Analysis

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Sweeney, Christian Walsted; Marhadi, Kun Saptohartyadi

    2014-01-01

    Condition monitoring of wind turbines is a field of continuous research and development as new turbine configurations enter into the market and new failure modes appear. Systems utilising well established techniques from the energy and industry sector, such as vibration analysis..., are commercially available and functioning successfully in fixed speed and variable speed turbines. Power performance analysis is a method specifically applicable to wind turbines for the detection of power generation changes due to external factors, such as icing, internal factors, such as controller... malfunction, or deliberate actions, such as power de-rating. In this paper, power performance analysis is performed by sliding a time-power window and calculating the two eigenvalues corresponding to the two dimensional wind speed - power generation distribution. The power is classified into five bins...
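The sliding-window eigenvalue computation can be sketched as follows (the synthetic power curve, window length, and de-rating cap are illustrative assumptions, not values from the paper):

```python
import numpy as np

def window_eigenvalues(wind_speed, power, window=50):
    """Eigenvalues of the 2-D wind-speed/power covariance in a sliding
    window; a shifted or collapsed eigenvalue can flag de-rating or a
    controller fault."""
    eigs = []
    for start in range(len(power) - window + 1):
        cov = np.cov(np.vstack([wind_speed[start:start + window],
                                power[start:start + window]]))
        eigs.append(np.sort(np.linalg.eigvalsh(cov))[::-1])
    return np.array(eigs)  # shape (n_windows, 2), largest eigenvalue first

rng = np.random.default_rng(2)
speed = rng.uniform(5, 12, 400)
normal_power = 80 * speed + rng.normal(0, 20, 400)   # healthy power curve
derated_power = np.minimum(normal_power, 500.0)      # capped (de-rated) output

eigs_normal = window_eigenvalues(speed, normal_power)
eigs_derated = window_eigenvalues(speed, derated_power)
```

Capping the output flattens the speed-power scatter, so the dominant eigenvalue of the windowed covariance drops relative to normal operation, which is the kind of signature the abstract describes.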

  5. A factor analysis to detect factors influencing building national brand

    Directory of Open Access Journals (Sweden)

    Naser Azad

    Full Text Available Developing a national brand is one of the most important issues in brand building. In this study, we present a factor analysis to detect the most important factors in building a national brand. The proposed study uses factor analysis to extract the most influential factors, and the sample has been chosen from two major automakers in Iran called Iran Khodro and Saipa. The questionnaire was designed on a Likert scale and distributed among 235 experts. Cronbach's alpha is calculated as 84%, which is well above the minimum desirable limit of 0.70. The implementation of factor analysis provides six factors including “cultural image of customers”, “exciting characteristics”, “competitive pricing strategies”, “perception image” and “previous perceptions”.

  6. Effect of image processing version on detection of non-calcification cancers in 2D digital mammography imaging

    Science.gov (United States)

    Warren, L. M.; Cooke, J.; Given-Wilson, R. M.; Wallis, M. G.; Halling-Brown, M.; Mackenzie, A.; Chakraborty, D. P.; Bosmans, H.; Dance, D. R.; Young, K. C.

    2013-03-01

    Image processing (IP) is the last step in the digital mammography imaging chain before interpretation by a radiologist. Each manufacturer has their own IP algorithm(s) and the appearance of an image after IP can vary greatly depending upon the algorithm and version used. It is unclear whether these differences can affect cancer detection. This work investigates the effect of IP on the detection of non-calcification cancers by expert observers. Digital mammography images for 190 patients were collected from two screening sites using Hologic amorphous selenium detectors. Eighty of these cases contained non-calcification cancers. The images were processed using three versions of IP from Hologic - default (full enhancement), low contrast (intermediate enhancement) and pseudo screen-film (no enhancement). Seven experienced observers inspected the images and marked the location of regions suspected to be non-calcification cancers assigning a score for likelihood of malignancy. This data was analysed using JAFROC analysis. The observers also scored the clinical interpretation of the entire case using the BSBR classification scale. This was analysed using ROC analysis. The breast density in the region surrounding each cancer and the number of times each cancer was detected were calculated. IP did not have a significant effect on the radiologists' judgment of the likelihood of malignancy of individual lesions or their clinical interpretation of the entire case. No correlation was found between number of times each cancer was detected and the density of breast tissue surrounding that cancer.

  7. Fingerprint Analysis with Marked Point Processes

    DEFF Research Database (Denmark)

    Forbes, Peter G. M.; Lauritzen, Steffen; Møller, Jesper

    We present a framework for fingerprint matching based on marked point process models. An efficient Monte Carlo algorithm is developed to calculate the marginal likelihood ratio for the hypothesis that two observed prints originate from the same finger against the hypothesis that they originate from...... different fingers. Our model achieves good performance on an NIST-FBI fingerprint database of 258 matched fingerprint pairs....

  8. Simplified Processing Method for Meter Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Colotelo, Alison H. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Downs, Janelle L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ham, Kenneth D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Montgomery, Sadie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Vernon, Christopher R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Parker, Steven A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-11-01

    A simple, quick metered-data processing method that can be used for Army Metered Data Management System (MDMS) and Logistics Innovation Agency data, but may also be useful for other large data sets. It is intended for large data sets when the analyst has little information about the buildings.

  9. Analysis of food quality perception processes

    NARCIS (Netherlands)

    J.G. Termorshuizen (Koos); M.T.G. Meulenberg; B. Wierenga (Berend)

    1986-01-01

    A model of the quality perception process of the consumer with respect to food products has been developed. The model integrates a number of quality-related concepts. An empirical study was carried out to examine the relationships between the concepts. It appears that the various

  10. Population analysis for atomic cascade decay processes

    International Nuclear Information System (INIS)

    Suto, Keiko; Kagawa, Takashi; Futaba, Kaoru

    1998-01-01

    Down-stream cascade decay processes in atomic systems are analyzed by solving a coupled rate equation, for which an analytical solution for the population in each excited state is obtained. Typical numerical examples of populations are also given to interpret the decay paths connected to features of optical or electron spectra observed in various collision experiments. (author)

  11. Detection and analysis of the stored grain insect creeping sound

    Science.gov (United States)

    Geng, Senlin; Zhang, Xiuqin; Zhao, Wei

    2017-09-01

    With the random acoustic source model as the theoretical model of stored grain insect creeping, the sounds of 20 Alphitobius diaperinus Panzer adults in wheat and 20 Tribolium castaneum Herbst adults in corn are detected, respectively. Using Matlab, the original sound signals are reproduced and de-noised signals are obtained, and their power spectrum characteristics are analyzed. It is shown that the random acoustic source model is effective for detecting stored grain insect creeping sounds, and the power spectrums are all discrete: the highest frequency is 1600 Hz with a main frequency of 205 Hz for the former, and the highest frequency is 800 Hz with a main frequency of 350 Hz for the latter, which may be used to distinguish different types of insects in grain.
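A power-spectrum characterisation of this kind reduces to an FFT and a peak search. A small sketch (the 205 Hz tone and the noise level are synthetic stand-ins for a real creeping-sound record):

```python
import numpy as np

def main_frequency(signal, fs):
    """Dominant frequency of a sound record from its FFT power spectrum."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    return freqs[np.argmax(power[1:]) + 1]  # skip the DC bin

# synthetic "creeping" record: a 205 Hz tone buried in broadband noise
fs = 8000
t = np.arange(0, 1, 1 / fs)
rng = np.random.default_rng(3)
record = np.sin(2 * np.pi * 205 * t) + 0.5 * rng.normal(size=len(t))
```

Comparing such dominant frequencies (205 Hz vs. 350 Hz in the study) is what makes species discrimination from the spectrum plausible.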

  12. Copy Move Forgery Detection Using SIFT Features- An Analysis

    Directory of Open Access Journals (Sweden)

    Rupal Amit Kapdi

    2015-08-01

    Full Text Available The need for authentication of image content has increased with the pervasiveness of images and their cognitive effect on the human brain. A common form of malicious image manipulation is Copy Move Forgery (CMF), in which a region is cloned from a source location and pasted onto the same image at a target location, often to hide an object or to increase its presence in the image. The need to detect image originality and establish authentication without any prior details of the image has therefore grown manyfold. In this paper, we present a comparison of image forgery detection methods, mostly pertaining to CMF using the SIFT method. An effort has been made to produce a comprehensive survey by quoting most of the recent practices and providing an in-depth analysis of a range of different techniques for forgery localization.
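The matching stage common to SIFT-based CMF detectors can be sketched in a few lines: each keypoint descriptor is matched against all others in the same image, a Lowe-style ratio test filters weak matches, and a minimum spatial distance rules out trivial neighbours. The descriptors below are toy stand-ins for real SIFT output (which would come from a feature extractor such as OpenCV's SIFT):

```python
import numpy as np

def copy_move_matches(keypoints, descriptors, ratio=0.6, min_dist=10.0):
    """Match each descriptor against all others in the SAME image,
    excluding itself; a best match that passes the ratio test and is
    spatially distant suggests a cloned region."""
    matches = []
    for i, d in enumerate(descriptors):
        dists = np.linalg.norm(descriptors - d, axis=1)
        dists[i] = np.inf                       # exclude the self-match
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:
            if np.linalg.norm(keypoints[i] - keypoints[best]) > min_dist:
                matches.append((i, int(best)))
    return matches

# toy data: keypoints 0 and 3 share a near-identical (cloned) descriptor
keypoints = np.array([[10.0, 10.0], [40.0, 80.0], [90.0, 20.0], [120.0, 110.0]])
descriptors = np.array([[1.0, 0.00, 0.0],
                        [0.0, 1.00, 0.0],
                        [0.0, 0.00, 1.0],
                        [1.0, 0.01, 0.0]])      # near-duplicate of row 0
matches = copy_move_matches(keypoints, descriptors)
```

Full detectors then cluster the matched pairs and estimate an affine transform between source and target regions to localise the forgery.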

  13. Independent component analysis to detect clustered microcalcification breast cancers.

    Science.gov (United States)

    Gallardo-Caballero, R; García-Orellana, C J; García-Manso, A; González-Velasco, H M; Macías-Macías, M

    2012-01-01

    The presence of clustered microcalcifications is one of the earliest signs in breast cancer detection. Although there exist many studies broaching this problem, most of them are nonreproducible due to the use of proprietary image datasets. We use a known subset of the currently largest publicly available mammography database, the Digital Database for Screening Mammography (DDSM), to develop a computer-aided detection system that outperforms the current reproducible studies on the same mammogram set. This proposal is mainly based on the use of extracted image features obtained by independent component analysis, but we also study the inclusion of the patient's age as a nonimage feature which requires no human expertise. Our system achieves an average of 2.55 false positives per image at a sensitivity of 81.8% and 4.45 at a sensitivity of 91.8% in diagnosing the BCRP_CALC_1 subset of DDSM.

  14. An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques

    Science.gov (United States)

    2018-01-09

    ARL-TR-8272 ● JAN 2018 ● US Army Research Laboratory. An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques.

  15. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform

    Science.gov (United States)

    2018-01-01

    The thresholds obtained work well for signal-present or noise-only RF spectrum data files. The red curve is the threshold when added to the morphologically processed... The report applies morphological image techniques to the energy detection scenario of signals in the RF spectrum domain. The algorithm automatically establishes a detection...
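The morphological-baseline idea can be illustrated with a 1-D grey-scale opening (a sliding minimum followed by a sliding maximum): narrow signal peaks are flattened, leaving an estimate of the noise floor to threshold against. The structuring-element width and the 6 dB margin below are invented for illustration, not the report's parameters:

```python
import numpy as np

def grey_open(x, width):
    """Morphological opening of a 1-D spectrum: sliding minimum (erosion)
    then sliding maximum (dilation). Peaks narrower than `width` are
    removed, approximating the noise floor."""
    half = width // 2
    padded = np.pad(x, half, mode='edge')
    eroded = np.array([padded[i:i + width].min() for i in range(len(x))])
    padded = np.pad(eroded, half, mode='edge')
    return np.array([padded[i:i + width].max() for i in range(len(x))])

def detect_energy(spectrum, width=15, margin_db=6.0):
    """Declare signal-present in bins rising more than margin_db above
    the opened baseline."""
    baseline = grey_open(spectrum, width)
    return spectrum - baseline > margin_db

# flat -90 dBm noise floor with a 5-bin signal burst at -60 dBm
spectrum = np.full(200, -90.0)
spectrum[100:105] = -60.0
hits = detect_energy(spectrum)
```

Because the opening adapts to the local floor, the same margin works whether the noise level is flat or slowly varying, which is what lets such a detector set its threshold automatically.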

  16. Applied Neural Cross-Correlation into the Curved Trajectory Detection Process for Braitenberg Vehicles

    OpenAIRE

    Macktoobian, Matin; Jafari, Mohammad; Gh, Erfan Attarzadeh

    2014-01-01

    The Curved Trajectory Detection (CTD) process could be considered among the high-level planned capabilities for cognitive agents, which has been acquired under the aegis of embedded artificial spiking neuronal circuits. In this paper, a hard-wired implementation of cross-correlation, the most common comparison-driven scheme for both natural and artificial bionic constructions, named the Depth Detection Module (DDM), has been taken into account. It is a manifestation of efficient handling of epileptic seiz...
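Cross-correlation as a comparison scheme reduces to locating the peak of the full correlation sequence; a minimal delay-estimation sketch (a software illustration, not the paper's hard-wired neuronal implementation):

```python
import numpy as np

def delay_by_cross_correlation(a, b):
    """Estimate the lag of b relative to a by locating the peak of their
    full cross-correlation sequence."""
    corr = np.correlate(b, a, mode='full')
    return np.argmax(corr) - (len(a) - 1)  # shift index range to lags

# b is a copy of a delayed by 7 samples
rng = np.random.default_rng(4)
a = rng.normal(size=200)
b = np.concatenate([np.zeros(7), a])[:200]
```

In a depth- or trajectory-detection setting, the recovered lag between two sensor channels is the quantity the comparison circuit ultimately encodes.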

  17. Modeling fraud detection and the incorporation of forensic specialists in the audit process

    DEFF Research Database (Denmark)

    Sakalauskaite, Dominyka

    Financial statement audits are still comparatively poor in fraud detection. Forensic specialists can play a significant role in increasing audit quality. In this paper, based on prior academic research, I develop a model of fraud detection and the incorporation of forensic specialists in the audit...... process. The intention of the model is to identify the reasons why the audit is weak in fraud detection and to provide the analytical framework to assess whether the incorporation of forensic specialists can help to improve it. The results show that such specialists can potentially improve the fraud...... detection in the audit, but might also cause some negative implications. Overall, even though fraud detection is one of the main topics in research there are very few studies done on the subject of how auditors co-operate with forensic specialists. Thus, the paper concludes with suggestions for further...

  18. FEM Analysis on Electromagnetic Processing of Thin Metal Sheets

    Directory of Open Access Journals (Sweden)

    PASCA Sorin

    2014-10-01

    Full Text Available Based on finite element analysis, this paper investigates a possible new technology for electromagnetic processing of thin metal sheets, in order to improve the productivity, especially on automated manufacturing lines. This technology consists of induction heating process followed by magnetoforming process, both applied to metal sheet, using the same tool coil for both processes.

  19. DETECTION OF THE SECOND r-PROCESS PEAK ELEMENT TELLURIUM IN METAL-POOR STARS

    Energy Technology Data Exchange (ETDEWEB)

    Roederer, Ian U. [Carnegie Observatories, Pasadena, CA 91101 (United States); Lawler, James E. [Department of Physics, University of Wisconsin, Madison, WI 53706 (United States); Cowan, John J. [Homer L. Dodge Department of Physics and Astronomy, University of Oklahoma, Norman, OK 73019 (United States); Beers, Timothy C. [National Optical Astronomy Observatory, Tucson, AZ 85719 (United States); Frebel, Anna [Massachusetts Institute of Technology, Kavli Institute for Astrophysics and Space Research, Cambridge, MA 02139 (United States); Ivans, Inese I. [Department of Physics and Astronomy, University of Utah, Salt Lake City, UT 84112 (United States); Schatz, Hendrik [Department of Physics and Astronomy, Michigan State University, E. Lansing, MI 48824 (United States); Sobeck, Jennifer S. [Department of Astronomy and Astrophysics, University of Chicago, Chicago, IL 60637 (United States); Sneden, Christopher [Department of Astronomy, University of Texas at Austin, Austin, TX 78712 (United States)

    2012-03-15

    Using near-ultraviolet spectra obtained with the Space Telescope Imaging Spectrograph on board the Hubble Space Telescope, we detect neutral tellurium in three metal-poor stars enriched by products of r-process nucleosynthesis, BD +17 3248, HD 108317, and HD 128279. Tellurium (Te, Z = 52) is found at the second r-process peak (A ≈ 130) associated with the N = 82 neutron shell closure, and it has not been detected previously in Galactic halo stars. The derived tellurium abundances match the scaled solar system r-process distribution within the uncertainties, confirming the predicted second peak r-process residuals. These results suggest that tellurium is predominantly produced in the main component of the r-process, along with the rare earth elements.

  20. Analysis of Public Datasets for Wearable Fall Detection Systems.

    Science.gov (United States)

    Casilari, Eduardo; Santoyo-Ramón, José-Antonio; Cano-García, José-Manuel

    2017-06-27

    Due to the boom of wireless handheld devices such as smartwatches and smartphones, wearable Fall Detection Systems (FDSs) have become a major focus of attention among the research community during the last years. The effectiveness of a wearable FDS must be contrasted against a wide variety of measurements obtained from inertial sensors during the occurrence of falls and Activities of Daily Living (ADLs). In this regard, the access to public databases constitutes the basis for an open and systematic assessment of fall detection techniques. This paper reviews and appraises twelve existing available data repositories containing measurements of ADLs and emulated falls envisaged for the evaluation of fall detection algorithms in wearable FDSs. The analysis of the found datasets is performed in a comprehensive way, taking into account the multiple factors involved in the definition of the testbeds deployed for the generation of the mobility samples. The study of the traces brings to light the lack of a common experimental benchmarking procedure and, consequently, the large heterogeneity of the datasets from a number of perspectives (length and number of samples, typology of the emulated falls and ADLs, characteristics of the test subjects, features and positions of the sensors, etc.). Concerning this, the statistical analysis of the samples reveals the impact of the sensor range on the reliability of the traces. In addition, the study evidences the importance of the selection of the ADLs and the need of categorizing the ADLs depending on the intensity of the movements in order to evaluate the capability of a certain detection algorithm to discriminate falls from ADLs.
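Many of the algorithms evaluated against such datasets start from a threshold on the accelerometer signal magnitude vector, which is also where the sensor-range issue noted above bites: an impact spike clipped at ±2 g never crosses a 2.5 g threshold. A toy sketch with invented values:

```python
import numpy as np

def detect_fall(acc_xyz, threshold_g=2.5):
    """Simplest wearable-FDS baseline: flag a fall when the signal
    magnitude vector (SMV) of the tri-axial accelerometer exceeds a
    threshold, as an impact spike does."""
    smv = np.linalg.norm(acc_xyz, axis=1)
    return bool(np.any(smv > threshold_g))

# quiet standing (about 1 g on the vertical axis) vs. an emulated fall
adl = np.tile([0.0, 0.0, 1.0], (100, 1))
fall = adl.copy()
fall[50] = [2.5, 1.0, 2.5]   # impact sample, |a| ~ 3.7 g
```

Real systems add posture and inactivity checks after the spike precisely because single-threshold SMV detectors confuse energetic ADLs (jumping, sitting down hard) with falls, one of the discrimination problems the dataset analysis highlights.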

  1. Analysis of Public Datasets for Wearable Fall Detection Systems

    Directory of Open Access Journals (Sweden)

    Eduardo Casilari

    2017-06-01

    Full Text Available Due to the boom of wireless handheld devices such as smartwatches and smartphones, wearable Fall Detection Systems (FDSs) have become a major focus of attention among the research community during the last years. The effectiveness of a wearable FDS must be contrasted against a wide variety of measurements obtained from inertial sensors during the occurrence of falls and Activities of Daily Living (ADLs). In this regard, the access to public databases constitutes the basis for an open and systematic assessment of fall detection techniques. This paper reviews and appraises twelve existing available data repositories containing measurements of ADLs and emulated falls envisaged for the evaluation of fall detection algorithms in wearable FDSs. The analysis of the found datasets is performed in a comprehensive way, taking into account the multiple factors involved in the definition of the testbeds deployed for the generation of the mobility samples. The study of the traces brings to light the lack of a common experimental benchmarking procedure and, consequently, the large heterogeneity of the datasets from a number of perspectives (length and number of samples, typology of the emulated falls and ADLs, characteristics of the test subjects, features and positions of the sensors, etc.). Concerning this, the statistical analysis of the samples reveals the impact of the sensor range on the reliability of the traces. In addition, the study evidences the importance of the selection of the ADLs and the need of categorizing the ADLs depending on the intensity of the movements in order to evaluate the capability of a certain detection algorithm to discriminate falls from ADLs.

  2. Processing Cost Analysis for Biomass Feedstocks

    Energy Technology Data Exchange (ETDEWEB)

    Badger, P.C.

    2002-11-20

    The receiving, handling, storing, and processing of woody biomass feedstocks is an overlooked component of biopower systems. The purpose of this study was twofold: (1) to identify and characterize all the receiving, handling, storing, and processing steps required to make woody biomass feedstocks suitable for use in direct combustion and gasification applications, including small modular biopower (SMB) systems, and (2) to estimate the capital and operating costs at each step. Since biopower applications can be varied, a number of conversion systems and feedstocks required evaluation. In addition to limiting this study to woody biomass feedstocks, the boundaries of this study were from the power plant gate to the feedstock entry point into the conversion device. Although some power plants are sited at a source of wood waste fuel, it was assumed for this study that all wood waste would be brought to the power plant site. This study was also confined to the following three feedstocks: (1) forest residues, (2) industrial mill residues, and (3) urban wood residues. Additionally, the study was confined to grate, suspension, and fluidized bed direct combustion systems; gasification systems; and SMB conversion systems. Since scale can play an important role in types of equipment, operational requirements, and capital and operational costs, this study examined these factors for the following direct combustion and gasification system size ranges: 50, 20, 5, and 1 MWe. The scope of the study also included: Specific operational issues associated with specific feedstocks (e.g., bark and problems with bridging); Opportunities for reducing handling, storage, and processing costs; How environmental restrictions can affect handling and processing costs (e.g., noise, commingling of treated wood or non-wood materials, emissions, and runoff); and Feedstock quality issues and/or requirements (e.g., moisture, particle size, presence of non-wood materials).
The study found that over the

  3. [The process of detection and treatment of cases of tuberculosis in a prison].

    Science.gov (United States)

    Valença, Mariana Soares; Cezar-Vaz, Marta Regina; Brum, Clarice Brinck; Silva, Pedro Eduardo Almeida da

    2016-06-01

    This study seeks to analyze the process of detection and treatment of cases of tuberculosis (TB) in a prison in the south of Brazil. An active and passive search for TB was conducted to estimate the scale of TB in a prison with 764 inmates. In conjunction with the detection strategies and clinical follow-up of the 41 TB cases, participant observation and records in field diaries were performed, making it possible to analyze the scope and limitations of detection and treatment of cases of TB in prison. The development of search strategies is discussed, along with the use of questionnaires to detect symptomatic cases, the inadequacy of the clinical follow-up of TB cases, the involvement of different workers, and the coordination between prison and health services. There is clear potential for TB control in using an active search to prompt passive detection and symptom screening, which, even when skewed by inmates' perceptions of TB symptoms, increased detection. The functional dynamics of prison life hamper the inclusion of health routines and can restrict actions to control TB and other diseases. In the process of TB control in prisons, the feasibility of effective detection methods is as important as planning based on disease conditions, network services and the workers involved.

  4. Smartphone Cortex Controlled Real-Time Image Processing and Reprocessing for Concentration Independent LED Induced Fluorescence Detection in Capillary Electrophoresis.

    Science.gov (United States)

    Szarka, Mate; Guttman, Andras

    2017-10-17

    We present the application of a smartphone anatomy based technology in the field of liquid phase bioseparations, particularly in capillary electrophoresis. A simple capillary electrophoresis system was built with LED induced fluorescence detection and a credit card sized minicomputer to prove the concept of real time fluorescent imaging (zone adjustable time-lapse fluorescence image processor) and separation controller. The system was evaluated by analyzing under- and overloaded aminopyrenetrisulfonate (APTS)-labeled oligosaccharide samples. The open source software based image processing tool allowed undistorted signal modulation (reprocessing) if the signal was inappropriate for the actual detection system settings (too low or too high). The novel smart detection tool for fluorescently labeled biomolecules greatly expands dynamic range and enables retrospective correction for injections with unsuitable signal levels without the necessity to repeat the analysis.

  5. A survey of the use of soy in processed Turkish meat products and detection of genetic modification.

    Science.gov (United States)

    Ulca, Pelin; Balta, Handan; Senyuva, Hamide Z

    2014-01-01

    To screen for possible illegal use of soybeans in meat products, the performance characteristics of a commercial polymerase chain reaction (PCR) kit for detection of soybean DNA in raw and cooked meat products were established. Minced chicken and beef products containing soybean at levels from 0.1% to 10.0% were analysed by real-time PCR to amplify the soybean lectin gene. The PCR method could reliably detect the addition of soybean at a level of 0.1%. A survey of 38 Turkish processed meat products found only six samples to be negative for the presence of soybean. Of the 32 (84%) positive samples, 13 (34%) contained soy at levels above 0.1%. Further DNA analysis of the soybean-positive samples was conducted by real-time PCR to detect whether genetically modified (GM) soybean had been used. Of the 32 meat samples containing soybean, two were positive for genetic modification.

  6. An automatized frequency analysis for vine plot detection and delineation in remote sensing

    OpenAIRE

    Delenne , Carole; Rabatel , G.; Deshayes , M.

    2008-01-01

    The availability of an automatic tool for vine plot detection, delineation, and characterization would be very useful for management purposes. An automatic and recursive process using frequency analysis (with Fourier transform and Gabor filters) has been developed to meet this need. This results in the determination of vine plot boundary and accurate estimation of interrow width and row orientation. To foster large-scale applications, tests and validation have been carried out on standard ver...
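The row-periodicity idea behind such frequency analysis can be sketched as follows. This is a minimal illustration, not the authors' Gabor-filter algorithm: it estimates the inter-row spacing from the dominant peak of a one-dimensional Fourier spectrum of a synthetic intensity profile taken across the rows.

```python
import numpy as np

def estimate_interrow_spacing(profile, pixel_size=1.0):
    """Estimate the dominant row spacing from a 1-D intensity profile via FFT."""
    profile = profile - profile.mean()          # remove the DC component
    spectrum = np.abs(np.fft.rfft(profile))
    freqs = np.fft.rfftfreq(len(profile), d=pixel_size)
    k = np.argmax(spectrum[1:]) + 1             # skip the zero-frequency bin
    return 1.0 / freqs[k]                       # spacing = 1 / dominant frequency

# Synthetic "vine plot" profile: rows every 8 pixels along x
x = np.arange(256)
profile = np.cos(2 * np.pi * x / 8.0)
print(estimate_interrow_spacing(profile))  # ~8.0
```

A 2-D version of the same idea (Gabor filters tuned to the detected frequency and orientation) is what allows plot delineation and row-orientation estimation.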

  7. Comparison of the Signal Processing Methodologies for a Leak Detection of the LMR Steam Generator

    International Nuclear Information System (INIS)

    Kim, Tae-Joon; Jeong, Ji-Young; Kim, Byung-Ho

    2006-01-01

    Successful protection of the LMR SG against a water/steam-into-sodium leak at the early phase of a leak depends on the fast response and sensitivity of the leak detection system. The control time for the protection of the LMR SG is several seconds. The subject of this study is to present the detection performance of the acoustic leak detection system, discriminated by a back-propagation neural network according to preprocessing with FFT power spectrum analysis and octave band analysis, and to present the status of the development of acoustic leak detection at KAERI. The method was applied to acoustic signals from argon gas injected into water in experiments at KAERI, acoustic signals from water injected into sodium obtained at IPPE, and the background noise of the PFR superheater

  8. Reachability for Finite-State Process Algebras Using Static Analysis

    DEFF Research Database (Denmark)

    Skrypnyuk, Nataliya; Nielson, Flemming

    2011-01-01

    In this work we present an algorithm for solving the reachability problem in finite systems that are modelled with process algebras. Our method uses Static Analysis, in particular, Data Flow Analysis, of the syntax of a process algebraic system with multi-way synchronisation. The results of the Data Flow Analysis ...

  9. A CCTV system with SMS alert (CMDSA): An implementation of pixel processing algorithm for motion detection

    Science.gov (United States)

    Rahman, Nurul Hidayah Ab; Abdullah, Nurul Azma; Hamid, Isredza Rahmi A.; Wen, Chuah Chai; Jelani, Mohamad Shafiqur Rahman Mohd

    2017-10-01

    Closed-Circuit TV (CCTV) systems are one of the technologies in the surveillance field that address detection and monitoring by providing extra features such as email alerts or motion detection. However, detecting and alerting the admin on a CCTV system can be complicated by the need to integrate the main program with an external Application Programming Interface (API). In this study, a pixel processing algorithm is applied because of its efficiency, and an SMS alert is added as an alternative for users who opt out of the email alert system or have no Internet connection. A CCTV system with SMS alert (CMDSA) was developed using an evolutionary prototyping methodology. The system interface was implemented using Microsoft Visual Studio, while the backend components, namely the database and code, were implemented with an SQLite database and the C# programming language, respectively. The main modules of CMDSA are motion detection, video capture and saving, image processing, and Short Message Service (SMS) alert functions. The system reduces processing time, making the detection process faster; reduces the space and memory used to run the program; and alerts the system admin instantly.
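Pixel-processing motion detection of the kind described is often built on frame differencing. The sketch below is a minimal illustration of that idea, not CMDSA's actual C# implementation; the threshold values are illustrative assumptions:

```python
import numpy as np

def detect_motion(prev_frame, curr_frame, diff_threshold=25, min_changed_fraction=0.01):
    """Flag motion when enough pixels change between consecutive grayscale frames."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = diff > diff_threshold          # per-pixel change mask
    return changed.mean() >= min_changed_fraction

prev = np.zeros((120, 160), dtype=np.uint8)  # empty scene
curr = prev.copy()
curr[40:80, 60:100] = 200                    # a bright object enters the scene
print(detect_motion(prev, curr))             # True
print(detect_motion(prev, prev))             # False
```

In a real system the motion flag would then trigger the capture/save and SMS-alert modules.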

  10. Adaptive Fault Detection for Complex Dynamic Processes Based on JIT Updated Data Set

    Directory of Open Access Journals (Sweden)

    Jinna Li

    2012-01-01

    Full Text Available A novel fault detection technique is proposed to explicitly account for the nonlinear, dynamic, and multimodal problems that exist in practical, complex dynamic processes. The just-in-time (JIT) detection method and the k-nearest neighbor (KNN) rule-based statistical process control (SPC) approach are integrated to construct a flexible and adaptive detection scheme for control processes with nonlinear, dynamic, and multimodal cases. Mahalanobis distance, representing the correlation among samples, is used to simplify and update the raw data set, which is the first merit of this paper. Based on it, the control limit is computed in terms of both the KNN rule and the SPC method, such that we can identify online whether the current data are normal or not. Note that the control limit changes as the database is updated, so an adaptive fault detection technique is obtained that can effectively eliminate the impact of data drift and shift on detection performance, which is the second merit of this paper. The efficiency of the developed method is demonstrated by numerical examples and an industrial case.
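The KNN-rule control limit can be illustrated roughly as follows. This is a simplified sketch, not the paper's algorithm: it uses plain Euclidean distance and a fixed training set, whereas the paper uses Mahalanobis distance and a JIT-updated data set. A sample is flagged as faulty when its average distance to its k nearest in-control neighbors exceeds a quantile-based limit:

```python
import numpy as np

def knn_distance(train, x, k=3):
    """Average distance from x to its k nearest neighbors in the training set."""
    d = np.linalg.norm(train - x, axis=1)
    return np.sort(d)[:k].mean()

def fit_control_limit(train, k=3, alpha=0.99):
    """Control limit: the alpha-quantile of leave-one-out kNN distances."""
    stats = [knn_distance(np.delete(train, i, axis=0), train[i], k)
             for i in range(len(train))]
    return np.quantile(stats, alpha)

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(200, 2))   # in-control operating data
limit = fit_control_limit(normal)
fault = np.array([8.0, 8.0])                   # far outside normal operation
print(knn_distance(normal, fault) > limit)     # True: flagged as a fault
```

Updating `normal` with recent samples (as the JIT scheme does) would make the limit adapt to data drift.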

  11. Speech endpoint detection with non-language speech sounds for generic speech processing applications

    Science.gov (United States)

    McClain, Matthew; Romanowski, Brian

    2009-05-01

    Non-language speech sounds (NLSS) are sounds produced by humans that do not carry linguistic information. Examples of these sounds are coughs, clicks, breaths, and filled pauses such as "uh" and "um" in English. NLSS are prominent in conversational speech, but can be a significant source of errors in speech processing applications. Traditionally, these sounds are ignored by speech endpoint detection algorithms, where speech regions are identified in the audio signal prior to processing. The ability to filter NLSS as a pre-processing step can significantly enhance the performance of many speech processing applications, such as speaker identification, language identification, and automatic speech recognition. In order to be used in all such applications, NLSS detection must be performed without the use of language models that provide knowledge of the phonology and lexical structure of speech. This is especially relevant to situations where the languages used in the audio are not known a priori. We present the results of preliminary experiments using data from American and British English speakers, in which segments of audio are classified as language speech sounds (LSS) or NLSS using a set of acoustic features designed for language-agnostic NLSS detection and a hidden Markov model (HMM) to model speech generation. The results of these experiments indicate that the features and model used are capable of detecting certain types of NLSS, such as breaths and clicks, while detection of other types of NLSS such as filled pauses will require future research.

  12. Advanced Color Image Processing and Analysis

    CERN Document Server

    2013-01-01

    This volume does much more than survey modern advanced color processing. Starting with a historical perspective on ways we have classified color, it sets out the latest numerical techniques for analyzing and processing colors, the leading edge in our search to accurately record and print what we see. The human eye perceives only a fraction of available light wavelengths, yet we live in a multicolor world of myriad shining hues. Colors rich in metaphorical associations make us “purple with rage” or “green with envy” and cause us to “see red.” Defining colors has been the work of centuries, culminating in today’s complex mathematical coding that nonetheless remains a work in progress: only recently have we possessed the computing capacity to process the algebraic matrices that reproduce color more accurately. With chapters on dihedral color and image spectrometers, this book provides technicians and researchers with the knowledge they need to grasp the intricacies of today’s color imaging.

  13. Detecting depression stigma on social media: A linguistic analysis.

    Science.gov (United States)

    Li, Ang; Jiao, Dongdong; Zhu, Tingshao

    2018-05-01

    Efficient detection of depression stigma in mass media is important for designing effective stigma reduction strategies. Using linguistic analysis methods, this paper aims to build computational models for detecting stigma expressions in Chinese social media posts (Sina Weibo). A total of 15,879 Weibo posts with keywords were collected and analyzed. First, a content analysis was conducted on all 15,879 posts to determine whether each of them reflected depression stigma or not. Second, using four algorithms (Simple Logistic Regression, Multilayer Perceptron Neural Networks, Support Vector Machine, and Random Forest), two groups of classification models were built based on selected linguistic features; one for differentiating between posts with and without depression stigma, and one for differentiating among posts with three specific types of depression stigma. First, 967 of 15,879 posts (6.09%) indicated depression stigma. 39.30%, 15.82%, and 14.99% of them endorsed the stigmatizing view that "People with depression are unpredictable", "Depression is a sign of personal weakness", and "Depression is not a real medical illness", respectively. Second, the highest F-Measure value for differentiating between stigma and non-stigma reached 75.2%. The highest F-Measure value for differentiating among three specific types of stigma reached 86.2%. Due to the limited and imbalanced dataset of Chinese Weibo posts, the findings of this study might have limited generalizability. This paper confirms that incorporating linguistic analysis methods into online detection of stigma can be beneficial to improve the performance of stigma reduction programs. Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Detection of neovascularization based on fractal and texture analysis with interaction effects in diabetic retinopathy.

    Directory of Open Access Journals (Sweden)

    Jack Lee

    Full Text Available Diabetic retinopathy is a major cause of blindness. Proliferative diabetic retinopathy is a result of severe vascular complication and is visible as neovascularization of the retina. Automatic detection of such new vessels would be useful for the severity grading of diabetic retinopathy, and it is an important part of the screening process to identify those who may require immediate treatment for their diabetic retinopathy. We propose a novel new-vessel detection method including statistical texture analysis (STA), high order spectrum analysis (HOS), and fractal analysis (FA), and most importantly we have shown that by incorporating their associated interactions the accuracy of new-vessel detection can be greatly improved. To assess its performance, the sensitivity, specificity and accuracy (AUC) are obtained. They are 96.3%, 99.1% and 98.5% (99.3%), respectively. It is found that the proposed method can improve the accuracy of new-vessel detection significantly over previous methods. The algorithm can be automated and is valuable for detecting relatively severe cases of diabetic retinopathy among diabetes patients.

  15. Detection of particle motion using image processing with particular emphasis on rolling motion.

    Science.gov (United States)

    Agudo, J R; Luzi, G; Han, J; Hwang, M; Lee, J; Wierschem, A

    2017-05-01

    Image processing has been used in granular systems for detecting particle positions and motion near optically accessible surfaces, as in sediment flow and bedload transport. We review the image-processing techniques used for single and multiple particles. To enhance reliability in particle recognition, tools like the Canny edge detector and the Hough transform are used intensively. We show by example how they can be applied to detect not only particle positions but also rotatory motion. The different steps are described in detail and the algorithm is applied to different examples, which are discussed in view of the obtained accuracy.
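To illustrate the Hough-transform step mentioned above, here is a minimal circle-center Hough accumulator for a single fixed, known radius. This is a toy version under simplifying assumptions; a real pipeline would first extract edge points with Canny edge detection and search over a range of radii:

```python
import numpy as np

def hough_circle_center(edge_points, radius, shape, n_angles=90):
    """Vote for circle centers at a fixed radius (a minimal Hough transform)."""
    acc = np.zeros(shape, dtype=np.int32)
    thetas = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    for (y, x) in edge_points:
        # each edge point votes for all centers lying `radius` away from it
        cy = np.round(y - radius * np.sin(thetas)).astype(int)
        cx = np.round(x - radius * np.cos(thetas)).astype(int)
        ok = (cy >= 0) & (cy < shape[0]) & (cx >= 0) & (cx < shape[1])
        np.add.at(acc, (cy[ok], cx[ok]), 1)
    return np.unravel_index(acc.argmax(), acc.shape)

# Synthetic edge ring: a particle of radius 10 centred at (50, 60)
angles = np.linspace(0, 2 * np.pi, 120, endpoint=False)
edges = [(50 + 10 * np.sin(a), 60 + 10 * np.cos(a)) for a in angles]
print(hough_circle_center(edges, radius=10, shape=(100, 100)))  # peak near (50, 60)
```

Tracking the detected center across frames, together with surface texture, is what makes rolling (rotatory) motion observable.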

  16. Modal Analysis for Crack Detection in Small Wind Turbine Blades

    DEFF Research Database (Denmark)

    Ulriksen, Martin Dalgaard; Skov, Jonas falk; Dickow, Kristoffer Ahrens

    2013-01-01

    The aim of the present paper is to evaluate structural health monitoring (SHM) techniques based on modal analysis for crack detection in small wind turbine blades. A finite element (FE) model calibrated to measured modal parameters will be introduced to cracks with different sizes along one edge of the blade. Changes in modal parameters from the FE model are compared with data obtained from experimental tests. These comparisons will be used to validate the FE model and subsequently discuss the usability of SHM techniques based on modal parameters for condition monitoring of wind turbine blades.

  17. Detection of Prion Proteins and TSE Infectivity in the Rendering and Biodiesel Manufacture Processes

    Energy Technology Data Exchange (ETDEWEB)

    Brown, R.; Keller, B.; Oleschuk, R. [Queen's University, Kingston, Ontario (Canada)

    2007-03-15

    This paper addresses emerging issues related to monitoring prion proteins and TSE infectivity in the products and waste streams of rendering and biodiesel manufacture processes. Monitoring is critical to addressing the knowledge gaps identified in 'Biodiesel from Specified Risk Material Tallow: An Appraisal of TSE Risks and their Reduction' (IEA's AMF Annex XXX, 2006) that prevent comprehensive risk assessment of TSE infectivity in products and waste. The most important challenge for monitoring TSE risk is the wide variety of sample types, which are generated at different points in the rendering/biodiesel production continuum. Conventional transmissible spongiform encephalopathy (TSE) assays were developed for specified risk material (SRM) and other biological tissues. These, however, are insufficient to address the diverse sample matrices produced in rendering and biodiesel manufacture. This paper examines the sample types expected in rendering and biodiesel manufacture and the implications of applying TSE assay methods to them. The authors then discuss a sample preparation filtration method, which has not yet been applied to these sample types but which has the potential to provide or significantly improve TSE monitoring. The main improvement will come from transfer of the prion proteins from the sample matrix to a matrix compatible with conventional and emerging bioassays. A second improvement will come from preconcentrating the prion proteins, i.e., transferring proteins from a larger sample volume into a smaller volume for analysis to provide greater detection sensitivity. This filtration method may also be useful for monitoring other samples, including wash waters and other waste streams, which may contain SRM, including those from abattoirs and on-farm operations. Finally, there is a discussion of emerging mass spectrometric methods, which Prusiner and others have shown to be suitable for detection and characterisation of prion proteins (Stahl

  18. Analysis of the medication reconciliation process conducted at hospital admission

    Directory of Open Access Journals (Sweden)

    María Beatriz Contreras Rey

    2016-07-01

    Full Text Available Objective: To analyze the outcomes of a medication reconciliation process at admission in the hospital setting, and to assess the role of the Pharmacist in detecting reconciliation errors and preventing any adverse events entailed. Method: A retrospective study was conducted to analyze the medication reconciliation activity during the previous six months. The study included those patients for whom an apparently unjustified discrepancy was detected at admission, after comparing the hospital medication prescribed with the home treatment stated in their clinical hospital records. Those patients for whom the physician ordered the introduction of home medication without any specification were also considered. In order to conduct the reconciliation process, the Pharmacist prepared the best pharmacotherapeutical history possible, reviewing all available information about the medication the patient could be taking before admission, and completing the process with a clinical interview. The discrepancies requiring clarification were reported to the physician. It was considered that the reconciliation proposal had been accepted if the relevant modification was made in the next visit of the physician, or within 24-48 hours at most; this case was then labeled as a reconciliation error. For the descriptive analysis, the SPSS Statistics® program, version 17.0, was used. Outcomes: 494 medications were reconciled in 220 patients, with a mean of 2.25 medications per patient. More than half of the patients (59.5%) had some discrepancy that required clarification; the most frequent was the omission of a medication that the patient was taking before admission (86.2%), followed by an unjustified modification in dosing or way of administration (5.9%). In total, 312 discrepancies required clarification; of these, 93 (29.8%) were accepted and considered as reconciliation errors, 126 (40%) were not accepted, and in 93 cases (29.8%) acceptance was not relevant due to a change in

  19. Automatic ultrasonic image analysis method for defect detection

    International Nuclear Information System (INIS)

    Magnin, I.; Perdrix, M.; Corneloup, G.; Cornu, B.

    1987-01-01

    Ultrasonic examination of austenitic steel weld seams raises well-known problems of interpreting signals perturbed by this type of material. The JUKEBOX ultrasonic imaging system developed at the Cadarache Nuclear Research Center provides a major improvement in the general area of defect localization and characterization, based on processing overall images obtained by (X, Y) scanning. (X, time) images are formed by juxtaposing input signals. A series of parallel images shifted on the Y-axis is also available. The authors present a novel defect detection method based on analysing the timeline positions of the maxima and minima recorded on (X, time) images. This position is statistically stable when a defect is encountered, and is random enough under spurious noise conditions to constitute a discriminating parameter. The investigation involves calculating the trace variance: this parameter is then taken into account for detection purposes. Correlation with parallel images enhances detection reliability. A significant increase in the signal-to-noise ratio during tests on artificial defects is shown
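The trace-variance statistic described can be sketched in a few lines: for each X position, take the time index of the signal maximum and compute the variance of those indices. Low variance suggests an echo arriving at a stable time (a likely defect); high variance suggests noise. The arrays and amplitudes here are synthetic stand-ins, not JUKEBOX data:

```python
import numpy as np

def trace_variance(image):
    """Variance of the per-column arrival time of the maximum on an (X, time) image.

    Rows are X positions, columns are time samples. A defect echo keeps the
    maxima aligned in time (low variance); pure noise scatters them (high variance).
    """
    t_max = image.argmax(axis=1)   # time index of the maximum at each X position
    return t_max.var()

rng = np.random.default_rng(1)
noise = rng.normal(0.0, 1.0, size=(64, 200))   # (X, time) image of pure noise
defect = noise.copy()
defect[:, 120] += 10.0                         # echo at a stable time position
print(trace_variance(defect))                  # near 0 (stable trace)
print(trace_variance(noise) > trace_variance(defect))  # True
```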

  20. Detection and Monitoring of Neurotransmitters - a Spectroscopic Analysis

    Science.gov (United States)

    Manciu, Felicia; Lee, Kendall; Durrer, William; Bennet, Kevin

    2012-10-01

    In this work we demonstrate the capability of confocal Raman mapping spectroscopy for simultaneously and locally detecting important compounds in neuroscience such as dopamine, serotonin, and adenosine. The Raman results show shifting of the characteristic vibrations of the compounds, observations consistent with previous spectroscopic studies. Although some vibrations are common in these neurotransmitters, Raman mapping was achieved by detecting non-overlapping characteristic spectral signatures of the compounds, as follows: for dopamine the vibration attributed to C-O stretching, for serotonin the indole ring stretching vibration, and for adenosine the adenine ring vibrations. Without damage, dyeing, or preferential sample preparation, confocal Raman mapping provided positive detection of each neurotransmitter, allowing association of the high-resolution spectra with specific micro-scale image regions. Such information is particularly important for complex, heterogeneous samples, where modification of the chemical or physical composition can influence the neurotransmission processes. We also report an estimated dopamine diffusion coefficient two orders of magnitude smaller than that calculated by the flow-injection method.

  1. Image Processing and Analysis in Geotechnical Investigation

    Czech Academy of Sciences Publication Activity Database

    Ščučka, Jiří; Martinec, Petr; Šňupárek, Richard; Veselý, V.

    2006-01-01

    Roč. 21, 3-4 (2006), s. 1-6 ISSN 0886-7798. [AITES-ITA 2006 World Tunnel Congres and ITA General Assembly /32./. Seoul, 22.04.2006-27.04.2006] Institutional research plan: CEZ:AV0Z30860518 Keywords : underground working face * digital photography * image analysis Subject RIV: DB - Geology ; Mineralogy Impact factor: 0.278, year: 2006

  2. Analysis and asynchronous detection of gradually unfolding errors during monitoring tasks

    Science.gov (United States)

    Omedes, Jason; Iturrate, Iñaki; Minguez, Javier; Montesano, Luis

    2015-10-01

    Human studies on cognitive control processes rely on tasks involving sudden-onset stimuli, which allow the analysis of these neural imprints to be time-locked and relative to the stimuli onset. Human perceptual decisions, however, comprise continuous processes where evidence accumulates until reaching a boundary. Surpassing the boundary leads to a decision where measured brain responses are associated to an internal, unknown onset. The lack of this onset for gradual stimuli hinders both the analyses of brain activity and the training of detectors. This paper studies electroencephalographic (EEG)-measurable signatures of human processing for sudden and gradual cognitive processes represented as a trajectory mismatch under a monitoring task. Time-locked potentials and brain-source analysis of the EEG of sudden mismatches revealed the typical components of event-related potentials and the involvement of brain structures related to cognitive control processing. For gradual mismatch events, time-locked analyses did not show any discernible EEG scalp pattern, despite related brain areas being, to a lesser extent, activated. However, and thanks to the use of non-linear pattern recognition algorithms, it is possible to train an asynchronous detector on sudden events and use it to detect gradual mismatches, as well as obtaining an estimate of their unknown onset. Post-hoc time-locked scalp and brain-source analyses revealed that the EEG patterns of detected gradual mismatches originated in brain areas related to cognitive control processing. This indicates that gradual events induce latency in the evaluation process but that similar brain mechanisms are present in sudden and gradual mismatch events. Furthermore, the proposed asynchronous detection model widens the scope of applications of brain-machine interfaces to other gradual processes.

  3. Safety analysis of SISL process module

    International Nuclear Information System (INIS)

    1983-05-01

    This report provides an assessment of various postulated accidental occurrences within an experimental process module which is part of a Special Isotope Separation Laboratory (SISL) currently under construction at the Lawrence Livermore National Laboratory (LLNL). The process module will contain large amounts of molten uranium and various water-cooled structures within a vacuum vessel. Special emphasis is therefore given to potential accidental interactions of molten uranium with water leading to explosive and/or rapid steam formation, as well as uranium oxidation and the potential for combustion. Considerations are also given to the potential for vessel melt-through. Evaluations include mechanical and thermal interactions and design implications both in terms of design basis as well as once-in-a-lifetime accident scenarios. These scenarios include both single- and multiple-failure modes leading to various contact modes and locations within the process module for possible thermal interactions. The evaluations show that a vacuum vessel design based upon nominal operating conditions would appear sufficient to meet safety requirements in connection with both design basis as well as once-in-a-lifetime accidents. Controlled venting requirements for removal of steam and hydrogen in order to avoid possible long-term pressurization events are recommended. Depending upon the resulting accident conditions, the vacuum system (i.e., the roughing system) could also serve this purpose. Finally, based upon accident evaluations of this study, immediate shut-off of all coolant water following an incident leak is not recommended, as such action may have adverse effects in terms of cool-down requirements for the melt crucibles etc. These requirements have not been assessed as part of this study

  4. Detecting inpatient falls by using natural language processing of electronic medical records

    Directory of Open Access Journals (Sweden)

    Toyabe Shin-ichi

    2012-12-01

    Full Text Available Abstract Background Incident reporting is the most common method for detecting adverse events in a hospital. However, under-reporting or non-reporting and delay in submission of reports are problems that prevent early detection of serious adverse events. The aim of this study was to determine whether it is possible to promptly detect serious injuries after inpatient falls by using a natural language processing method and to determine which data source is the most suitable for this purpose. Methods We tried to detect adverse events from narrative text data of electronic medical records by using a natural language processing method. We made syntactic category decision rules to detect inpatient falls from text data in electronic medical records. We compared how often the true fall events were recorded in various sources of data including progress notes, discharge summaries, image order entries and incident reports. We applied the rules to these data sources and compared F-measures to detect falls between these data sources with reference to the results of a manual chart review. The lag time between event occurrence and data submission and the degree of injury were compared. Results We made 170 syntactic rules to detect inpatient falls by using a natural language processing method. Information on true fall events was most frequently recorded in progress notes (100%), incident reports (65.0%) and image order entries (12.5%). However, F-measure to detect falls using the rules was poor when using progress notes (0.12) and discharge summaries (0.24) compared with that when using incident reports (1.00) and image order entries (0.91). Since the results suggested that incident reports and image order entries were possible data sources for prompt detection of serious falls, we focused on a comparison of falls found by incident reports and image order entries.
Injury caused by falls found by image order entries was significantly more severe than falls detected by
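The rule-based detection idea can be sketched with a couple of regular expressions over note text. The two patterns and the negation rule below are hypothetical, minimal stand-ins for the study's 170 syntactic rules (which were for Japanese records, not English):

```python
import re

# Hypothetical fall-mention patterns (illustrative only)
FALL_PATTERNS = [
    re.compile(r"\b(found|discovered)\s+(lying|on)\s+(on\s+)?the\s+floor\b", re.I),
    re.compile(r"\b(fell|fall|slipped)\b", re.I),
]
# Crude negation rule: "no/denies/without ... fall" within one sentence
NEGATION = re.compile(r"\b(no|denies|without)\b[^.]*\b(fall|falls|falling)\b", re.I)

def note_mentions_fall(note):
    """Return True when the note asserts (not negates) a fall event."""
    if NEGATION.search(note):
        return False
    return any(p.search(note) for p in FALL_PATTERNS)

print(note_mentions_fall("Patient found lying on the floor beside the bed."))  # True
print(note_mentions_fall("Patient denies any fall during the night."))         # False
```

Running such rules over incident reports or image order entries (the high-F-measure sources above) is what would enable prompt detection in practice.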

  5. Automatic solar feature detection using image processing and pattern recognition techniques

    Science.gov (United States)

    Qu, Ming

    The objective of the research in this dissertation is to develop a software system to automatically detect and characterize solar flares, filaments and Corona Mass Ejections (CMEs), the core of so-called solar activity. These tools will assist us to predict space weather caused by violent solar activity. Image processing and pattern recognition techniques are applied to this system. For automatic flare detection, advanced pattern recognition techniques such as Multi-Layer Perceptron (MLP), Radial Basis Function (RBF), and Support Vector Machine (SVM) are used. By tracking the entire process of flares, the motion properties of two-ribbon flares are derived automatically. In the application of solar filament detection, the Stabilized Inverse Diffusion Equation (SIDE) is used to enhance and sharpen filaments; a new method for automatic threshold selection is proposed to extract filaments from the background; an SVM classifier with nine input features is used to differentiate between sunspots and filaments. Once a filament is identified, morphological thinning, pruning, and adaptive edge linking methods are applied to determine filament properties. Furthermore, a filament matching method is proposed to detect filament disappearance. The automatic detection and characterization of flares and filaments have been successfully applied on Halpha full-disk images that are continuously obtained at Big Bear Solar Observatory (BBSO). For automatically detecting and classifying CMEs, image enhancement, segmentation, and pattern recognition techniques are applied to Large Angle Spectrometric Coronagraph (LASCO) C2 and C3 images. The processed LASCO and BBSO images are saved to a file archive, and the physical properties of detected solar features such as intensity and speed are recorded in our database. Researchers are able to access the solar feature database and analyze the solar data efficiently and effectively.
The detection and characterization system greatly improves

  6. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Baddeley, A.; Turner, R.; Møller, Jesper

process. Residuals are ascribed to locations in the empty background, as well as to data points of the point pattern. We obtain variance formulae, and study standardised residuals. There is also an analogy between our spatial residuals and the usual residuals for (non-spatial) generalised linear models … or covariate effects. Q-Q plots of the residuals are effective in diagnosing interpoint interaction. Some existing ad hoc statistics of point patterns (quadrat counts, scan statistic, kernel smoothed intensity, Berman's diagnostic) are recovered as special cases.

  7. The analysis of thermally stimulated processes

    CERN Document Server

    Chen, R; Pamplin, Brian

    1981-01-01

    Thermally stimulated processes include a number of phenomena - either physical or chemical in nature - in which a certain property of a substance is measured during controlled heating from a 'low' temperature. Workers and graduate students in a wide spectrum of fields require an introduction to methods of extracting information from such measurements. This book gives an interdisciplinary approach to various methods which may be applied to analytical chemistry including radiation dosimetry and determination of archaeological and geological ages. In addition, recent advances are included, such

  8. [Fast Detection of Camellia Sinensis Growth Process and Tea Quality Informations with Spectral Technology: A Review].

    Science.gov (United States)

    Peng, Ji-yu; Song, Xing-lin; Liu, Fei; Bao, Yi-dan; He, Yong

    2016-03-01

The research achievements and trends of spectral technology for fast detection of Camellia sinensis growth process information and tea quality information are reviewed. Spectral technology is a fast, nondestructive and efficient detection technology, which mainly includes infrared spectroscopy, fluorescence spectroscopy, Raman spectroscopy and mass spectrometry. Rapid detection of Camellia sinensis growth process information and tea quality is helpful for realizing the informatization and automation of tea production and for ensuring tea quality and safety. This paper reviews its applications, covering the detection of tea (Camellia sinensis) growing status (nitrogen, chlorophyll, diseases and insect pests), the discrimination of tea varieties, the grade discrimination of tea, the detection of tea internal quality (catechins, total polyphenols, caffeine, amino acids, pesticide residues and so on), the quality evaluation of tea beverages and tea by-products, and the machinery for tea quality determination and discrimination. It also briefly introduces trends in the determination of tea growth process information, sensors and industrial application. In conclusion, spectral technology shows high potential to detect Camellia sinensis growth process information, to predict tea internal quality and to classify tea varieties and grades. Suitable chemometrics and preprocessing methods are helpful to improve the performance of the model and remove redundancy, which makes it possible to develop portable machinery. For future work, developing portable machinery and on-line detection systems is recommended to promote further application. The applications and research achievements of spectral technology concerning tea are outlined in this paper for the first time, covering Camellia sinensis growth, tea production, the quality and safety of tea and by-products and so on, as well as some problems to be solved
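The chemometric preprocessing the review points to can be illustrated with standard normal variate (SNV) correction, a common spectral preprocessing step that centres and scales each spectrum. This is a minimal sketch with toy absorbance values of our own, not data from the review:

```python
def snv(spectrum):
    """Standard normal variate: scale one spectrum to zero mean, unit variance."""
    n = len(spectrum)
    mean = sum(spectrum) / n
    sd = (sum((x - mean) ** 2 for x in spectrum) / n) ** 0.5
    return [(x - mean) / sd for x in spectrum]

raw = [0.52, 0.55, 0.61, 0.70, 0.66, 0.58]   # toy NIR absorbance values
corrected = snv(raw)                          # scatter-corrected spectrum
```

SNV removes multiplicative scatter effects per spectrum, which is one reason it often improves calibration models built on raw spectra.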

  9. Detecting deception in children: A meta-analysis.

    Science.gov (United States)

    Gongola, Jennifer; Scurich, Nicholas; Quas, Jodi A

    2017-02-01

    Although research reveals that children as young as 3 can use deception and will take steps to obscure truth, research concerning how well others detect children's deceptive efforts remains unclear. Yet adults regularly assess whether children are telling the truth in a variety of contexts, including at school, in the home, and in legal settings, particularly in investigations of maltreatment. We conducted a meta-analysis to synthesize extant research concerning adults' ability to detect deceptive statements produced by children. We included 45 experiments involving 7,893 adult judges and 1,858 children. Overall, adults could accurately discriminate truths/lies at an average rate of 54%, which is slightly but significantly above chance levels. The average rate at which true statements were correctly classified as honest was higher (63.8%), whereas the rate at which lies were classified as dishonest was not different from chance (47.5%). A small positive correlation emerged between judgment confidence and judgment accuracy. Professionals (e.g., social workers, police officers, teachers) slightly outperformed laypersons (e.g., college undergraduates). Finally, exploratory analyses revealed that the child's age did not significantly affect the rate at which adults could discriminate truths/lies from chance. Future research aimed toward improving lie detection accuracy might focus more on individual differences in children's lie-telling abilities in order to uncover any reliable indicators of deception. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  10. Bi-Level Semantic Representation Analysis for Multimedia Event Detection.

    Science.gov (United States)

    Chang, Xiaojun; Ma, Zhigang; Yang, Yi; Zeng, Zhiqiang; Hauptmann, Alexander G

    2017-05-01

Multimedia event detection has been one of the major endeavors in video event analysis. A variety of approaches have been proposed recently to tackle this problem. Among others, using semantic representation has been credited for its promising performance and desirable ability to support human-understandable reasoning. To generate semantic representation, we usually utilize several external image/video archives and apply the concept detectors trained on them to the event videos. Due to the intrinsic differences between these archives, the resulting representation presumably has different predicting capabilities for a certain event. Notwithstanding, not much work is available on assessing the efficacy of semantic representation at the source level. On the other hand, it is plausible that some concepts are noisy for detecting a specific event. Motivated by these two shortcomings, we propose a bi-level semantic representation analysis method. At the source level, our method learns weights for the semantic representations attained from different multimedia archives. Meanwhile, it restrains the negative influence of noisy or irrelevant concepts at the concept level. In addition, we particularly focus on efficient multimedia event detection with few positive examples, which is highly desirable in real-world scenarios. We perform extensive experiments on the challenging TRECVID MED 2013 and 2014 datasets with encouraging results that validate the efficacy of our proposed approach.

  11. Identifying Organizational Inefficiencies with Pictorial Process Analysis (PPA)

    Directory of Open Access Journals (Sweden)

    David John Patrishkoff

    2013-11-01

Full Text Available Pictorial Process Analysis (PPA) was created by the author in 2004. PPA is a unique methodology which offers ten layers of additional analysis compared to standard process mapping techniques. The goal of PPA is to identify and eliminate waste, inefficiencies and risk in manufacturing or transactional business processes at five levels in an organization. The highest level assessed is process management, followed by the process work environment, detailed work habits, process performance metrics and general attitudes towards the process. This detailed process assessment and analysis is carried out during process improvement brainstorming efforts and Kaizen events. PPA creates a detailed visual efficiency rating for each step of the process under review. A selection of 54 pictorial Inefficiency Icons (cards) is available to highlight major inefficiencies and risks present in the business process under review. These inefficiency icons were identified during the author's independent research on the topic of why things go wrong in business. This paper will highlight how PPA was developed and show the steps required to conduct Pictorial Process Analysis on a sample manufacturing process. The author has successfully used PPA to dramatically improve business processes in over 55 different industries since 2004.

  12. Automated rice leaf disease detection using color image analysis

    Science.gov (United States)

    Pugoy, Reinald Adrian D. L.; Mariano, Vladimir Y.

    2011-06-01

    In rice-related institutions such as the International Rice Research Institute, assessing the health condition of a rice plant through its leaves, which is usually done as a manual eyeball exercise, is important to come up with good nutrient and disease management strategies. In this paper, an automated system that can detect diseases present in a rice leaf using color image analysis is presented. In the system, the outlier region is first obtained from a rice leaf image to be tested using histogram intersection between the test and healthy rice leaf images. Upon obtaining the outlier, it is then subjected to a threshold-based K-means clustering algorithm to group related regions into clusters. Then, these clusters are subjected to further analysis to finally determine the suspected diseases of the rice leaf.
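The histogram-intersection step this abstract describes can be sketched in a few lines. The toy pixel data and bin count below are our own illustration, not values from the paper; a low intersection score flags the lesion pixels as the "outlier region" that would then go to the clustering stage:

```python
def color_histogram(pixels, bins=4):
    """Normalized histogram over quantized (R, G, B) triples in [0, 256)."""
    counts = {}
    for r, g, b in pixels:
        key = (r * bins // 256, g * bins // 256, b * bins // 256)
        counts[key] = counts.get(key, 0) + 1
    total = len(pixels)
    return {k: v / total for k, v in counts.items()}

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]: sum of bin-wise minima of two normalized histograms."""
    return sum(min(h1.get(k, 0.0), h2.get(k, 0.0)) for k in set(h1) | set(h2))

healthy = [(40, 180, 60)] * 100                        # uniformly green leaf
test = [(40, 180, 60)] * 80 + [(120, 70, 30)] * 20     # green leaf with brown lesions
sim = histogram_intersection(color_histogram(healthy), color_histogram(test))
```

Here `sim` drops below 1.0 exactly by the fraction of lesion pixels, which is what makes the intersection usable as an outlier detector.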

  13. Digital image sequence processing, compression, and analysis

    CERN Document Server

    Reed, Todd R

    2004-01-01

Introduction (Todd R. Reed); Content-Based Image Sequence Representation (Pedro M. Q. Aguiar, Radu S. Jasinschi, José M. F. Moura, and Charnchai Pluempitiwiriyawej); The Computation of Motion (Christoph Stiller, Sören Kammel, Jan Horn, and Thao Dang); Motion Analysis and Displacement Estimation in the Frequency Domain (Luca Lucchese and Guido Maria Cortelazzo); Quality of Service Assessment in New Generation Wireless Video Communications (Gaetano Giunta); Error Concealment in Digital Video (Francesco G.B. De Natale); Image Sequence Restoration: A Wider Perspective (Anil Kokaram); Video Summarization (Cuneyt M. Taskiran and Edward

  14. Weighted symbolic analysis of human behavior for event detection

    Science.gov (United States)

    Rosani, A.; Boato, G.; De Natale, F. G. B.

    2013-03-01

Automatic video analysis and understanding has become a research topic of high interest, with applications to video browsing, content-based video indexing, and visual surveillance. However, the automation of this process is still a challenging task, due to clutter produced by low-level processing operations. This common problem can be solved by embedding significant contextual information into the data, as well as by using simple syntactic approaches to perform the matching between actual sequences and models. In this context we propose a novel framework that employs a symbolic representation of complex activities through sequences of atomic actions based on a weighted Context-Free Grammar.
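The matching of an atomic-action sequence against a weighted grammar can be sketched with a toy weighted CYK parser. The grammar, action names and rule weights below are entirely our own illustration (a generic stand-in, not the grammar from the paper), in Chomsky normal form so CYK applies directly:

```python
# Toy weighted CFG in Chomsky normal form for an "enter, walk, sit" activity.
lexical = {  # terminal action -> list of (nonterminal, weight)
    "enter": [("E", 1.0)],
    "walk":  [("W", 0.9)],
    "sit":   [("S", 1.0)],
}
binary = {  # (left, right) -> list of (parent, weight)
    ("E", "W"): [("EW", 0.8)],
    ("EW", "S"): [("ACT", 0.9)],
}

def best_parse_weight(tokens, start="ACT"):
    """Weighted CYK: highest-weight derivation of `tokens` from `start`, or 0.0."""
    n = len(tokens)
    # chart[i][j]: nonterminal -> best weight for the span tokens[i..j]
    chart = [[{} for _ in range(n)] for _ in range(n)]
    for i, tok in enumerate(tokens):
        for nt, w in lexical.get(tok, []):
            chart[i][i][nt] = max(chart[i][i].get(nt, 0.0), w)
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span - 1
            for k in range(i, j):
                for lnt, lw in chart[i][k].items():
                    for rnt, rw in chart[k + 1][j].items():
                        for parent, w in binary.get((lnt, rnt), []):
                            cand = lw * rw * w
                            if cand > chart[i][j].get(parent, 0.0):
                                chart[i][j][parent] = cand
    return chart[0][n - 1].get(start, 0.0)

score = best_parse_weight(["enter", "walk", "sit"])   # ≈ 0.648; a valid activity
bad = best_parse_weight(["sit", "enter"])             # 0.0; no derivation exists
```

A sequence the grammar cannot derive scores zero, which is how syntactic matching rejects implausible event orderings.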

  15. Analysis of DIRAC's behavior using model checking with process algebra

    Science.gov (United States)

    Remenska, Daniela; Templon, Jeff; Willemse, Tim; Bal, Henri; Verstoep, Kees; Fokkink, Wan; Charpentier, Philippe; Graciani Diaz, Ricardo; Lanciotti, Elisa; Roiser, Stefan; Ciba, Krzysztof

    2012-12-01

DIRAC is the grid solution developed to support LHCb production activities as well as user data analysis. It consists of distributed services and agents delivering the workload to the grid resources. Services maintain database back-ends to store dynamic state information of entities such as jobs, queues, staging requests, etc. Agents use polling to check and possibly react to changes in the system state. Each agent's logic is relatively simple; the main complexity lies in their cooperation. Agents run concurrently, and collaborate using the databases as shared memory. The databases can be accessed directly by the agents if running locally, or through a DIRAC service interface if necessary. This shared-memory model causes entities to occasionally get into inconsistent states. Tracing and fixing such problems becomes formidable due to the inherent parallelism present. We propose more rigorous methods to cope with this. Model checking is one such technique for analysis of an abstract model of a system. Unlike conventional testing, it allows full control over the execution of parallel processes, and supports exhaustive state-space exploration. We used the mCRL2 language and toolset to model the behavior of two related DIRAC subsystems: the workload and storage management systems. Based on process algebra, mCRL2 allows defining custom data types as well as functions over them. This makes it suitable for modeling the data manipulations made by DIRAC's agents. By visualizing the state space and replaying scenarios with the toolkit's simulator, we have detected race conditions and deadlocks in these systems, which, in several cases, were confirmed to occur in reality. Several properties of interest were formulated and verified with the tool. Our future direction is automating the translation from DIRAC to a formal model.
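The kind of shared-memory race condition described here, and the exhaustive state-space exploration that exposes it, can be illustrated with a minimal explicit-state search. This is a toy stand-in for mCRL2's exploration, not DIRAC's actual model: two agents each perform a non-atomic read-increment-write on a shared counter, and we enumerate every interleaving to check the invariant "final counter == 2":

```python
def explore():
    """Enumerate all interleavings of two non-atomic increments on a shared
    counter; collect final counter values that violate the invariant == 2."""
    init = (0, None, 0, None, 0)   # (pc1, tmp1, pc2, tmp2, shared); pc 2 = done
    seen, stack, bad = {init}, [init], set()
    while stack:
        pc1, t1, pc2, t2, s = stack.pop()
        succs = []
        if pc1 == 0:
            succs.append((1, s, pc2, t2, s))           # agent 1: read shared
        elif pc1 == 1:
            succs.append((2, t1, pc2, t2, t1 + 1))     # agent 1: write tmp + 1
        if pc2 == 0:
            succs.append((pc1, t1, 1, s, s))           # agent 2: read shared
        elif pc2 == 1:
            succs.append((pc1, t1, 2, t2, t2 + 1))     # agent 2: write tmp + 1
        if pc1 == 2 and pc2 == 2 and s != 2:
            bad.add(s)                                 # invariant violated
        for st in succs:
            if st not in seen:
                seen.add(st)
                stack.append(st)
    return bad

lost_updates = explore()   # counter values reachable only through a race
```

The search finds the classic lost update: when both agents read before either writes, the counter ends at 1. Conventional testing can miss this schedule; exhaustive exploration cannot.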

  16. Analysis of DIRAC's behavior using model checking with process algebra

    International Nuclear Information System (INIS)

    Remenska, Daniela; Templon, Jeff; Willemse, Tim; Bal, Henri; Verstoep, Kees; Fokkink, Wan; Charpentier, Philippe; Lanciotti, Elisa; Roiser, Stefan; Ciba, Krzysztof; Diaz, Ricardo Graciani

    2012-01-01

DIRAC is the grid solution developed to support LHCb production activities as well as user data analysis. It consists of distributed services and agents delivering the workload to the grid resources. Services maintain database back-ends to store dynamic state information of entities such as jobs, queues, staging requests, etc. Agents use polling to check and possibly react to changes in the system state. Each agent's logic is relatively simple; the main complexity lies in their cooperation. Agents run concurrently, and collaborate using the databases as shared memory. The databases can be accessed directly by the agents if running locally, or through a DIRAC service interface if necessary. This shared-memory model causes entities to occasionally get into inconsistent states. Tracing and fixing such problems becomes formidable due to the inherent parallelism present. We propose more rigorous methods to cope with this. Model checking is one such technique for analysis of an abstract model of a system. Unlike conventional testing, it allows full control over the execution of parallel processes, and supports exhaustive state-space exploration. We used the mCRL2 language and toolset to model the behavior of two related DIRAC subsystems: the workload and storage management systems. Based on process algebra, mCRL2 allows defining custom data types as well as functions over them. This makes it suitable for modeling the data manipulations made by DIRAC's agents. By visualizing the state space and replaying scenarios with the toolkit's simulator, we have detected race conditions and deadlocks in these systems, which, in several cases, were confirmed to occur in reality. Several properties of interest were formulated and verified with the tool. Our future direction is automating the translation from DIRAC to a formal model.

  17. SINGLE TREE DETECTION FROM AIRBORNE LASER SCANNING DATA USING A MARKED POINT PROCESS BASED METHOD

    Directory of Open Access Journals (Sweden)

    J. Zhang

    2013-05-01

Full Text Available Tree detection and reconstruction is of great interest in large-scale city modelling. In this paper, we present a marked point process model to detect single trees from airborne laser scanning (ALS) data. We consider single trees in the ALS-recovered canopy height model (CHM) as a realization of a point process of circles. Unlike a traditional marked point process, we sample the model in a constrained configuration space by making use of image processing techniques. A Gibbs energy is defined on the model, containing a data term that judges the fitness of the model with respect to the data, and a prior term that incorporates prior knowledge of object layouts. We search for the optimal configuration through a steepest gradient descent algorithm. The presented hybrid framework was tested on three forest plots, and experiments show the effectiveness of the proposed method.
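The interplay of a data term and a layout prior can be sketched with a greedy stand-in for the paper's Gibbs-energy optimization: accept candidate tree tops in order of canopy height (the data term) under a hard-core prior that forbids two accepted tops from being too close. The grid, thresholds, and cone-shaped crowns below are entirely synthetic, and the radius mark of each circle is omitted for brevity:

```python
def detect_trees(chm, height_min=5.0, min_sep=3):
    """Greedy tree-top detection on a canopy height model grid: accept the
    highest candidates first, rejecting any within `min_sep` cells of an
    already accepted top (a hard-core layout prior)."""
    rows, cols = len(chm), len(chm[0])
    candidates = sorted(
        ((chm[r][c], r, c) for r in range(rows) for c in range(cols)
         if chm[r][c] >= height_min),
        reverse=True)
    accepted = []
    for h, r, c in candidates:
        if all(max(abs(r - ar), abs(c - ac)) > min_sep for _, ar, ac in accepted):
            accepted.append((h, r, c))
    return [(r, c) for _, r, c in accepted]

# Synthetic CHM with two cone-shaped crowns centred at (3, 3) and (8, 9).
size = 12
chm = [[0.0] * size for _ in range(size)]
for pr, pc, h in [(3, 3, 20.0), (8, 9, 15.0)]:
    for r in range(size):
        for c in range(size):
            d = max(abs(r - pr), abs(c - pc))
            chm[r][c] = max(chm[r][c], h - 4.0 * d)

tops = detect_trees(chm)   # recovers the two planted crown apexes
```

The greedy pass recovers exactly the two planted apexes; the full marked point process instead explores add/remove/move proposals over circles, but the energy trade-off is the same.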

  18. A review of the technology and process on integrated circuits failure analysis applied in communications products

    Science.gov (United States)

    Ming, Zhimao; Ling, Xiaodong; Bai, Xiaoshu; Zong, Bo

    2016-02-01

The failure analysis of integrated circuits plays a very important role in improving the reliability of communications products. This paper mainly introduces the failure analysis technology and process for integrated circuits applied in communication products. Many technologies are used for failure analysis, including optical microscopic analysis, infrared microscopic analysis, acoustic microscopy analysis, liquid crystal hot spot detection, micro analysis, electrical measurement, microprobe technology, chemical etching and ion etching. Integrated circuit failure analysis depends on the accurate confirmation and analysis of the chip failure mode, the search for the root failure cause, the summary of the failure mechanism and the implementation of improvement measures. Through failure analysis, the reliability of integrated circuits and the rate of good products can be improved.

  19. Development of advanced spent fuel management process. System analysis of advanced spent fuel management process

    International Nuclear Information System (INIS)

    Ro, S.G.; Kang, D.S.; Seo, C.S.; Lee, H.H.; Shin, Y.J.; Park, S.W.

    1999-03-01

The system analysis of an advanced spent fuel management process to establish a non-proliferation model for long-term spent fuel management is performed by comparing several dry processes: a salt transport process, a lithium process, the IFR process developed in America, and the DDP developed in Russia. In our system analysis, the non-proliferation concept is focused on the separation factor between uranium and plutonium and the decontamination factors of products in each process, and a non-proliferation model for long-term spent fuel management has finally been introduced. (Author). 29 refs., 17 tabs., 12 figs

  20. Combined optimization of image-gathering and image-processing systems for scene feature detection

    Science.gov (United States)

    Halyo, Nesim; Arduini, Robert F.; Samms, Richard W.

    1987-01-01

The relationship between the image gathering and image processing systems for minimum mean squared error estimation of scene characteristics is investigated. A stochastic optimization problem is formulated where the objective is to determine a spatial characteristic of the scene rather than a feature of the already blurred, sampled and noisy image data. An analytical solution for the optimal characteristic image processor is developed. The Wiener filter for the sampled image case is obtained as a special case, where the desired characteristic is scene restoration. Optimal edge detection is investigated using the Laplacian-of-Gaussian operator ∇²G as the desired characteristic, where G is a two-dimensional Gaussian distribution function. It is shown that the optimal edge detector compensates for the blurring introduced by the image gathering optics, and notably, that it is not circularly symmetric. The lack of circular symmetry is largely due to the geometric effects of the sampling lattice used in image acquisition. The optimal image gathering optical transfer function is also investigated and the results of a sensitivity analysis are shown.
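The Laplacian-of-Gaussian characteristic used above can be illustrated with a minimal discrete sketch (kernel size, σ, and the step-edge test image are our own choices, and the kernel is mean-subtracted so it sums to zero, as a LoG should):

```python
import math

def log_kernel(size=7, sigma=1.0):
    """Discrete Laplacian-of-Gaussian kernel, mean-subtracted to sum to ~0."""
    half = size // 2
    k = [[((x * x + y * y - 2 * sigma ** 2) / sigma ** 4)
          * math.exp(-(x * x + y * y) / (2 * sigma ** 2))
          for x in range(-half, half + 1)]
         for y in range(-half, half + 1)]
    mean = sum(map(sum, k)) / size ** 2
    return [[v - mean for v in row] for row in k]

def response_at(img, kern, r, c):
    """Correlate the kernel with the image patch centred at (r, c)."""
    half = len(kern) // 2
    return sum(kern[i][j] * img[r - half + i][c - half + j]
               for i in range(len(kern)) for j in range(len(kern)))

# Vertical step edge: the operator responds strongly near the edge, not in flat areas.
img = [[0.0] * 8 + [1.0] * 8 for _ in range(16)]
kern = log_kernel()
edge_resp = abs(response_at(img, kern, 8, 8))   # on the bright side of the edge
flat_resp = abs(response_at(img, kern, 8, 3))   # deep inside the flat region
```

The zero-sum property is what makes the operator ignore constant regions; the paper's point is that the *optimal* version of this detector is additionally reshaped by the optics and the sampling lattice.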

  1. An experimental analysis of memory processing.

    Science.gov (United States)

    Wright, Anthony A

    2007-11-01

    Rhesus monkeys were trained and tested in visual and auditory list-memory tasks with sequences of four travel pictures or four natural/environmental sounds followed by single test items. Acquisitions of the visual list-memory task are presented. Visual recency (last item) memory diminished with retention delay, and primacy (first item) memory strengthened. Capuchin monkeys, pigeons, and humans showed similar visual-memory changes. Rhesus learned an auditory memory task and showed octave generalization for some lists of notes--tonal, but not atonal, musical passages. In contrast with visual list memory, auditory primacy memory diminished with delay and auditory recency memory strengthened. Manipulations of interitem intervals, list length, and item presentation frequency revealed proactive and retroactive inhibition among items of individual auditory lists. Repeating visual items from prior lists produced interference (on nonmatching tests) revealing how far back memory extended. The possibility of using the interference function to separate familiarity vs. recollective memory processing is discussed.

  2. Energy analysis in sterilization process of food

    International Nuclear Information System (INIS)

    Lee, Dong Sun; Pyun, Yu Ryang

    1986-01-01

A procedure was developed for predicting the energy consumption of batch-type thermal processing of food. From mass and energy balance equations, various energy usages and losses were estimated for steam sterilization of model food systems in No. 301-7 cans (Φ74.1 x 113.0 mm) at three different temperatures. The selected models were a 5% bentonite solution for conductive food and tap water for convective food. Total steam or energy consumption was higher at 110 deg C than at the two higher temperatures (121 deg C and 130 deg C). The high energy consumption at the low sterilization temperature was mainly due to high bleeding-steam energy and convective and radiative heat losses. Thermal energy efficiency is also discussed. (Author)

  3. ANALYSIS ON TECHNOLOGICAL PROCESSES CLEANING OIL PIPELINES

    Directory of Open Access Journals (Sweden)

    Mariana PǍTRAŞCU

    2015-05-01

Full Text Available In this paper, research on the technological processes for cleaning oil pipelines is presented. Several technologies and materials are known for cleaning deposits of sludge, iron and manganese oxides, dross, scale, etc. from the inner walls of drinking-water or industrial pipes. In the oil industry, methods for removing waste material from pipes and from liquid- and gas-transport networks have long been known to be tedious and expensive operations. The main methods and associated problems can be summarized as follows: 1) blowing with compressed air; 2) manual or mechanical brushing, or sanding (wet or dry); 3) washing with a high-pressure water jet, solvent, or chemical solution to remove scale and hard deposits; 4) combined methods using cleaning machines with water jets, cutters, chains, rotary cutter heads, etc.

  4. Detection of minor amounts of irradiated constituents in a complex, processed food matrix

    International Nuclear Information System (INIS)

    Marchioni, E.

    1999-01-01

Food irradiation is a process applied by the food processing industry to destroy spoilage bacteria in ingredients added to non-irradiated food. The paper describes a method for detecting even very small amounts of irradiated ingredients in food, such as irradiated spices (down to 0.1% m/m), separator chicken meat (2% m/m), or chopped salmon (14% m/m). (orig./CB) [de

  5. Detection and analysis of diamond fingerprinting feature and its application

    International Nuclear Information System (INIS)

    Li Xin; Huang Guoliang; Li Qiang; Chen Shengyi

    2011-01-01

Before becoming jewelry, diamonds need to be artistically cut with special geometric features forming a polyhedral structure. There are subtle differences in this polyhedral structure from one diamond to another. With spatial frequency spectrum analysis of the diamond surface structure, we can obtain diamond fingerprint information which represents a 'Diamond ID' and has good specificity. Based on optical Fourier transform spatial spectrum analysis, the fingerprint identification of the surface structure of diamonds in the spatial frequency domain was studied in this paper. We constructed both a completely coherent diamond fingerprint detection system illuminated by laser and a partially coherent diamond fingerprint detection system illuminated by LED, and analyzed the effect of the coherence of the light source on the diamond fingerprint feature. We studied the rotation invariance and translation invariance of the diamond fingerprint and verified the feasibility of real-time and accurate identification of diamond fingerprints. With the benefit of this work, we can provide customs, jewelers and consumers with a real-time and reliable diamond identification instrument, which will curb diamond smuggling, theft and other crimes, and ensure the healthy development of the diamond industry.
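The translation invariance the paper verifies is a basic property of the Fourier magnitude spectrum: circularly shifting an image changes only the phase, not the magnitudes. A small pure-Python 2-D DFT check on a toy "surface" of our own construction (not diamond data):

```python
import cmath

def dft2_mag(img):
    """Magnitude of the 2-D DFT of a small square grayscale image (O(N^4), toy-sized)."""
    n = len(img)
    mag = [[0.0] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            acc = 0j
            for r in range(n):
                for c in range(n):
                    acc += img[r][c] * cmath.exp(-2j * cmath.pi * (u * r + v * c) / n)
            mag[u][v] = abs(acc)
    return mag

# A toy surface pattern and a circularly shifted copy of it.
img = [[(3 * r + 5 * c + r * c) % 7 for c in range(8)] for r in range(8)]
shifted = [row[3:] + row[:3] for row in (img[2:] + img[:2])]

m1, m2 = dft2_mag(img), dft2_mag(shifted)
diff = max(abs(m1[u][v] - m2[u][v]) for u in range(8) for v in range(8))
```

Since `diff` is numerically zero, the magnitude spectrum can serve as a shift-invariant fingerprint; the optical system in the paper computes the same spectrum with a lens instead of a DFT.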

  6. Network structure detection and analysis of Shanghai stock market

    Directory of Open Access Journals (Sweden)

    Sen Wu

    2015-04-01

Full Text Available Purpose: In order to investigate the community structure of the component stocks of the SSE (Shanghai Stock Exchange) 180 index, a stock correlation network is built to find the intra-community and inter-community relationships. Design/methodology/approach: The stock correlation network is built with stocks as vertices and correlation coefficients of the logarithmic returns of stock prices as edge weights. It is first built as an undirected weighted network. The GN algorithm is selected to detect community structure after converting the network into unweighted versions with different thresholds. Findings: The result of the network community structure analysis shows that the stock market has obvious industrial characteristics. Most of the stocks in the same industry or in the same supply chain are assigned to the same community. The correlation of stock price fluctuations within a community is closer than between communities. The result of community structure detection also reflects correlations among different industries. Originality/value: Based on the analysis of the community structure in the Shanghai stock market, the result reflects some industrial characteristics, which has reference value for the relationships among industries or sub-sectors of listed companies.
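The network construction step can be sketched as follows. The ticker names, prices, and threshold are toy assumptions of ours, not SSE data; community detection (e.g. the GN algorithm) would then run on the resulting unweighted edge set:

```python
import math

def log_returns(prices):
    """Logarithmic returns of a price series."""
    return [math.log(b / a) for a, b in zip(prices, prices[1:])]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / math.sqrt(vx * vy)

def build_edges(price_series, threshold=0.6):
    """Unweighted network edges: stock pairs whose log-return correlation exceeds threshold."""
    names = list(price_series)
    rets = {s: log_returns(p) for s, p in price_series.items()}
    return {(a, b) for i, a in enumerate(names) for b in names[i + 1:]
            if pearson(rets[a], rets[b]) > threshold}

prices = {  # toy closing prices: the two banks co-move, Tech moves against them
    "BankA": [10.0, 10.5, 10.2, 10.8, 11.0, 10.7],
    "BankB": [20.0, 21.1, 20.3, 21.5, 22.1, 21.3],
    "Tech":  [30.0, 29.0, 31.5, 30.2, 29.5, 31.0],
}
edges = build_edges(prices)   # only the strongly correlated pair survives
```

Raising or lowering `threshold` is exactly the thresholding knob the paper varies before running community detection.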

  7. [Detecting cardiac arrhythmias based on phase space analysis].

    Science.gov (United States)

    Sun, Rongrong; Wang, Yuanyuan; Yang, Su; Fang, Zuxiang

    2008-08-01

It is important for cardiac therapy devices such as the automated external defibrillator to discriminate different cardiac disorders based on electrocardiogram analysis. A phase space analysis based algorithm is proposed to detect cardiac arrhythmias effectively. Firstly, the phase space of the signal is reconstructed. Then, from the viewpoint of geometry and information theory, the distribution entropy of the point density in the two-dimensional reconstructed phase space is calculated as the feature for further classification. Finally, the nearest-neighbour method based on Mahalanobis distance is used to classify sinus rhythm (SR), supraventricular tachyarrhythmia (SVTA), atrial flutter (AFL) and atrial fibrillation (AF). To evaluate the sensitivity, specificity and accuracy of the proposed method in cardiac arrhythmia classification, the MIT-BIH arrhythmia database and a canine endocardial database are studied respectively. Experiment results demonstrate that the proposed method can detect SR, SVTA, AFL and AF signals rapidly and accurately with simple computation. It promises to find application in automated devices for cardiac arrhythmia therapy.
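The distribution-entropy feature can be sketched in a few lines: delay-embed the signal into 2-D phase space, grid it, and take the Shannon entropy of the occupancy distribution. The delay, bin count, and test signals below are our own choices (a logistic-map series as an irregular-signal stand-in), not values from the paper:

```python
import math

def phase_space_entropy(signal, delay=1, bins=8):
    """Shannon entropy (bits) of the point density in a 2-D delay embedding."""
    pts = list(zip(signal, signal[delay:]))
    lo = min(min(p) for p in pts)
    hi = max(max(p) for p in pts)
    span = (hi - lo) or 1.0            # guard against a constant signal
    counts = {}
    for x, y in pts:
        i = min(int((x - lo) / span * bins), bins - 1)
        j = min(int((y - lo) / span * bins), bins - 1)
        counts[(i, j)] = counts.get((i, j), 0) + 1
    n = len(pts)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# An irregular (chaotic logistic-map) series vs. a perfectly regular one.
x, irregular = 0.4, []
for _ in range(500):
    x = 3.9 * x * (1 - x)
    irregular.append(x)
flat = [0.5] * 500

h_irregular = phase_space_entropy(irregular)   # points spread over many cells
h_flat = phase_space_entropy(flat)             # all points in one cell -> 0 bits
```

A disorganized rhythm spreads its trajectory over many phase-space cells (high entropy), while a regular one concentrates it, which is what makes the entropy usable as a discriminating feature.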

  8. Integrated situational awareness for cyber attack detection, analysis, and mitigation

    Science.gov (United States)

    Cheng, Yi; Sagduyu, Yalin; Deng, Julia; Li, Jason; Liu, Peng

    2012-06-01

    Real-time cyberspace situational awareness is critical for securing and protecting today's enterprise networks from various cyber threats. When a security incident occurs, network administrators and security analysts need to know what exactly has happened in the network, why it happened, and what actions or countermeasures should be taken to quickly mitigate the potential impacts. In this paper, we propose an integrated cyberspace situational awareness system for efficient cyber attack detection, analysis and mitigation in large-scale enterprise networks. Essentially, a cyberspace common operational picture will be developed, which is a multi-layer graphical model and can efficiently capture and represent the statuses, relationships, and interdependencies of various entities and elements within and among different levels of a network. Once shared among authorized users, this cyberspace common operational picture can provide an integrated view of the logical, physical, and cyber domains, and a unique visualization of disparate data sets to support decision makers. In addition, advanced analyses, such as Bayesian Network analysis, will be explored to address the information uncertainty, dynamic and complex cyber attack detection, and optimal impact mitigation issues. All the developed technologies will be further integrated into an automatic software toolkit to achieve near real-time cyberspace situational awareness and impact mitigation in large-scale computer networks.
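In the simplest single-evidence case, the Bayesian network analysis mentioned above reduces to Bayes' rule for updating belief in a compromise given an alert. The probabilities below are illustrative assumptions of ours, not figures from the paper:

```python
# Posterior probability that a host is compromised, given an IDS alert.
p_attack = 0.01          # prior: base rate of compromised hosts (assumed)
p_alert_attack = 0.95    # detector sensitivity, P(alert | attack) (assumed)
p_alert_benign = 0.05    # false-alarm rate, P(alert | benign) (assumed)

# Total probability of seeing an alert, then Bayes' rule.
p_alert = p_alert_attack * p_attack + p_alert_benign * (1 - p_attack)
posterior = p_alert_attack * p_attack / p_alert   # ≈ 0.16
```

Even with a sensitive detector, the low base rate keeps the posterior modest, which is why situational-awareness systems fuse many evidence sources rather than acting on single alerts.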

  9. Overlapping communities detection based on spectral analysis of line graphs

    Science.gov (United States)

    Gui, Chun; Zhang, Ruisheng; Hu, Rongjing; Huang, Guoming; Wei, Jiaxuan

    2018-05-01

Communities in networks often overlap, with one vertex belonging to several clusters. Meanwhile, many networks show hierarchical structure, such that communities are recursively grouped into a hierarchical organization. In order to obtain overlapping communities from a global hierarchy of vertices, a new algorithm (named SAoLG) is proposed to build the hierarchical organization along with detecting the overlap of community structure. SAoLG applies spectral analysis to line graphs to unify the overlap and hierarchical structure of the communities. In order to avoid the limitations of absolute distances such as the Euclidean distance, SAoLG employs the angular distance to compute the similarity between vertices. Furthermore, we make a small improvement to partition density to evaluate the quality of community structure and use it to obtain more reasonable and sensible community numbers. The proposed SAoLG algorithm achieves a balance between overlap and hierarchy by applying spectral analysis to edge community detection. The experimental results on one standard network and six real-world networks show that the SAoLG algorithm achieves higher modularity and more reasonable community number values than those generated by Ahn's algorithm, the classical CPM and the GN one.
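The line-graph construction at the heart of this approach is simple: each edge of the original graph becomes a node, and two nodes are adjacent iff the original edges share an endpoint. Clustering edges (rather than vertices) is what lets a vertex belong to several communities. A minimal sketch with a toy edge list (the spectral analysis on the result is left out):

```python
def line_graph(edges):
    """Adjacency of the line graph: nodes are the original edges; two nodes
    are adjacent iff the corresponding edges share an endpoint."""
    edges = [tuple(sorted(e)) for e in edges]
    adj = {e: set() for e in edges}
    for i, e in enumerate(edges):
        for f in edges[i + 1:]:
            if set(e) & set(f):      # shared endpoint
                adj[e].add(f)
                adj[f].add(e)
    return adj

# Path a-b-c plus pendant c-d: edges meeting at a vertex become adjacent nodes.
L = line_graph([("a", "b"), ("b", "c"), ("c", "d")])
```

Any vertex-community method applied to `L` yields edge communities of the original graph; a vertex then inherits every community of its incident edges, giving the overlap.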

  10. Automatic Defect Detection for TFT-LCD Array Process Using Quasiconformal Kernel Support Vector Data Description

    Directory of Open Access Journals (Sweden)

    Yi-Hung Liu

    2011-09-01

Full Text Available Defect detection has been considered an efficient way to increase the yield rate of panels in thin film transistor liquid crystal display (TFT-LCD) manufacturing. In this study we focus on the array process since it is the first and key process in TFT-LCD manufacturing. Various defects occur in the array process, and some of them could cause great damage to the LCD panels. Thus, how to design a method that can robustly detect defects from the images captured from the surface of LCD panels has become crucial. Previously, support vector data description (SVDD) has been successfully applied to LCD defect detection. However, its generalization performance is limited. In this paper, we propose a novel one-class machine learning method, called quasiconformal kernel SVDD (QK-SVDD) to address this issue. The QK-SVDD can significantly improve generalization performance of the traditional SVDD by introducing the quasiconformal transformation into a predefined kernel. Experimental results, carried out on real LCD images provided by an LCD manufacturer in Taiwan, indicate that the proposed QK-SVDD not only obtains a high defect detection rate of 96%, but also greatly improves generalization performance of SVDD. The improvement has shown to be over 30%. In addition, results also show that the QK-SVDD defect detector is able to accomplish the task of defect detection on an LCD image within 60 ms.

  11. Automatic defect detection for TFT-LCD array process using quasiconformal kernel support vector data description.

    Science.gov (United States)

    Liu, Yi-Hung; Chen, Yan-Jen

    2011-01-01

    Defect detection has been considered an efficient way to increase the yield rate of panels in thin film transistor liquid crystal display (TFT-LCD) manufacturing. In this study we focus on the array process since it is the first and key process in TFT-LCD manufacturing. Various defects occur in the array process, and some of them can cause great damage to the LCD panels. Thus, designing a method that can robustly detect defects from images captured from the surface of LCD panels has become crucial. Previously, support vector data description (SVDD) has been successfully applied to LCD defect detection. However, its generalization performance is limited. In this paper, we propose a novel one-class machine learning method, called quasiconformal kernel SVDD (QK-SVDD), to address this issue. QK-SVDD can significantly improve the generalization performance of traditional SVDD by introducing a quasiconformal transformation into a predefined kernel. Experimental results, carried out on real LCD images provided by an LCD manufacturer in Taiwan, indicate that the proposed QK-SVDD not only obtains a high defect detection rate of 96%, but also greatly improves the generalization performance of SVDD. The improvement has been shown to be over 30%. In addition, results also show that the QK-SVDD defect detector is able to accomplish the task of defect detection on an LCD image within 60 ms.

  12. Piezoresistive microcantilever aptasensor for ricin detection and kinetic analysis

    Directory of Open Access Journals (Sweden)

    Zhi-Wei Liu

    2015-04-01

    Full Text Available Up to now, there has been no report on target molecule detection by a piezoresistive microcantilever aptasensor. In order to evaluate the test performance and investigate the dynamic response characteristics of a piezoresistive microcantilever aptasensor, a novel method for ricin detection and kinetic analysis based on such a sensor was proposed, where the ricin aptamer was immobilised on the microcantilever surface by a biotin-avidin binding system. Results showed that the detection limit for ricin was 0.04 μg L−1 (S/N ≥ 3). A linear relationship between the response voltage and the concentration of ricin in the range of 0.2 μg L−1 to 40 μg L−1 was obtained, with the linear regression equation ΔUe = 0.904C + 5.852 (n = 5, R = 0.991, p < 0.001). The sensor showed no response to abrin or BSA and could overcome the influence of complex environmental disruptors, indicating high specificity and good selectivity. Recovery and reproducibility in the determination of simulated samples (simulated water, soil, and flour samples) met the analysis requirements, at 90.5∼95.5% and 7.85∼9.39%, respectively. On this basis, a reaction kinetic model based on ligand-receptor binding, and its relationship with the response voltage, was established. The model reflects the dynamic response of the sensor well; the correlation coefficient (R) was greater than or equal to 0.9456 (p < 0.001). The response voltage (ΔUe) and response time (t0) obtained from the fitting equation at different concentrations of ricin fitted well with the measured values.
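
    Given the reported calibration line ΔUe = 0.904C + 5.852, a measured response voltage can be inverted to a concentration estimate. A minimal sketch, assuming the linear fit is only trusted within the stated 0.2-40 μg L−1 range:

```python
def ricin_concentration(delta_U):
    """Invert the reported calibration line dU = 0.904*C + 5.852
    to estimate ricin concentration C (ug/L) from the response
    voltage. Raises outside the reported linear range."""
    slope, intercept = 0.904, 5.852
    c = (delta_U - intercept) / slope
    if not (0.2 <= c <= 40.0):
        raise ValueError(f"estimate {c:.3g} ug/L outside linear range")
    return c
```

    The voltage units of ΔUe are not stated in the abstract, so the function simply works in whatever units the calibration constants carry.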

  13. Dynamic analysis methods for detecting anomalies in asynchronously interacting systems

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Akshat; Solis, John Hector; Matschke, Benjamin

    2014-01-01

    Detecting modifications to digital system designs, whether malicious or benign, is problematic due to the complexity of the systems being analyzed. Moreover, static analysis techniques and tools can only be used during the initial design and implementation phases to verify safety and liveness properties. It is computationally intractable to guarantee that any previously verified properties still hold after a system, or even a single component, has been produced by a third-party manufacturer. In this paper we explore new approaches for creating a robust system design by investigating highly-structured computational models that simplify verification and analysis. Our approach avoids the need to fully reconstruct the implemented system by incorporating a small verification component that dynamically detects deviations from the design specification at run-time. The first approach encodes information extracted from the original system design algebraically into a verification component. During run-time this component randomly queries the implementation for trace information and verifies that no design-level properties have been violated. If any deviation is detected then a pre-specified fail-safe or notification behavior is triggered. Our second approach utilizes a partitioning methodology to view liveness and safety properties as a distributed decision task and the implementation as a proposed protocol that solves this task. Thus the problem of verifying safety and liveness properties is translated to that of verifying that the implementation solves the associated decision task. We build upon results from distributed systems and algebraic topology to construct a learning mechanism for verifying safety and liveness properties from samples of run-time executions.
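
    The first approach, a run-time component that randomly samples traces and checks design-level properties, can be outlined generically. This is a hedged sketch: `get_trace`, `safety_ok`, and the fail-safe callback are illustrative stand-ins, not the paper's algebraically encoded verification component.

```python
import random

def runtime_monitor(get_trace, safety_ok, n_samples=100, on_fail=None, seed=0):
    """Randomly sample run-time traces from the implementation and
    check each against a safety predicate; on the first violation,
    trigger the pre-specified fail-safe callback and report failure."""
    rng = random.Random(seed)
    for _ in range(n_samples):
        trace = get_trace(rng)       # query implementation for a trace
        if not safety_ok(trace):     # design-level property violated?
            if on_fail is not None:
                on_fail(trace)       # fail-safe / notification behavior
            return False
    return True                      # no deviation observed in samples
```

    Sampling only ever demonstrates the presence of a deviation, never its absence, which is why the paper pairs this with the stronger decision-task formulation.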

  14. Non-destructive analysis and detection of internal characteristics of spruce logs through X computerized tomography

    International Nuclear Information System (INIS)

    Longuetaud, F.

    2005-10-01

    Computerized tomography allows direct access to the internal features of scanned logs on the basis of density and moisture content variations. The objective of this work is to assess the feasibility of automatic detection of internal characteristics, with the final aim of conducting scientific analyses. The database consists of CT images of 24 spruces obtained with a medical CT scanner. The studied trees are representative of several social statuses and come from four stands located in North-Eastern France, which are themselves representative of several age, density and fertility classes. The automatic processing methods developed are the following. First, pith detection in logs, dealing with the problems of knot presence and ring eccentricity; the localisation accuracy was better than one mm. Secondly, detection of the sapwood/heartwood limit in logs, dealing with the problem of knot presence (the main source of difficulty); the error on the diameter was 1.8 mm, which corresponds to a relative error of 1.3 per cent. Thirdly, detection of whorl locations and comparison with an optical method. Fourthly, detection of individualized knots. This process allows knots to be counted and located in a log (longitudinal position and azimuth); however, validation of the method and extraction of branch diameter and inclination are still to be developed. An application of this work was a variability analysis of the sapwood content in the trunk: at the within-tree level, the sapwood width was found to be constant under the living crown; at the between-tree level, a strong correlation was found with the amount of living branches. A great number of analyses are possible from our results, among others: architectural analysis with pith tracking and apex death occurrence; analysis of radial variations of the heartwood shape; and analysis of the knot distribution in logs. (author)
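
    As a toy illustration of the sapwood/heartwood step: sapwood carries more moisture and therefore scans denser than heartwood, so a naive segmentation thresholds the CT densities and measures the equivalent diameter of the remaining core. The threshold and pixel size below are assumptions; the thesis' actual method additionally has to cope with knots, which it cites as the main source of difficulty.

```python
import numpy as np

def sapwood_mask(ct_slice, threshold):
    """Naive segmentation: pixels at or above the density threshold
    are labelled sapwood (wetter, hence denser on CT)."""
    return ct_slice >= threshold

def heartwood_diameter(mask, pixel_mm=1.0):
    """Equivalent-circle diameter (mm) of the non-sapwood core,
    from its pixel area: d = 2 * sqrt(A / pi)."""
    core_area = (~mask).sum() * pixel_mm ** 2
    return 2.0 * np.sqrt(core_area / np.pi)
```

    On real slices, knots locally raise density inside the heartwood, so a plain threshold over-segments there; that is exactly the difficulty the abstract flags.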

  15. An image-processing method to detect sub-optical features based on understanding noise in intensity measurements.

    Science.gov (United States)

    Bhatia, Tripta

    2018-02-01

    Accurate quantitative analysis of image data requires that we distinguish between fluorescence intensity (the true signal) and the noise inherent to its measurement to the extent possible. We image multilamellar membrane tubes and beads that grow from defects in the fluid lamellar phase of the lipid 1,2-dioleoyl-sn-glycero-3-phosphocholine dissolved in water and water-glycerol mixtures, using a fluorescence confocal polarizing microscope. We quantify image noise and determine the noise statistics. Understanding the nature of image noise also helps in optimizing image processing to detect sub-optical features, which would otherwise remain hidden. We use an image-processing technique, "optimum smoothening", to improve the signal-to-noise ratio (SNR) of features of interest without smearing their structural details. A high SNR provides the positional accuracy needed to resolve features of interest whose width is below the optical resolution. Using optimum smoothening, the smallest and largest core diameters detected in this paper have widths of [Formula: see text] and [Formula: see text] nm, respectively. The image-processing and analysis techniques and the noise modeling discussed in this paper can be used for detailed morphological analysis of features down to sub-optical length scales obtained by any kind of fluorescence intensity imaging in raster mode.
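
    The idea of trading a little blur for noise suppression can be illustrated with plain Gaussian smoothing; the paper's "optimum smoothening" tunes this trade-off against the measured noise statistics, which this sketch does not attempt:

```python
import numpy as np

def gaussian_kernel1d(sigma, radius):
    """Normalized 1-D Gaussian kernel of given width."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def smooth(signal, sigma):
    """Gaussian smoothing: too large a sigma smears structural
    detail, too small leaves the noise; 'optimum smoothening'
    is about choosing this width against the noise statistics."""
    radius = max(1, int(3 * sigma))
    return np.convolve(signal, gaussian_kernel1d(sigma, radius), mode="same")

def snr_db(signal, noise_free):
    """SNR in dB of a measurement against the known true signal."""
    noise = signal - noise_free
    return 10 * np.log10((noise_free ** 2).mean() / (noise ** 2).mean())
```

    For a slowly varying feature, a moderate sigma raises the SNR by an order of magnitude while attenuating the feature itself negligibly, which is what makes sub-resolution positional estimates possible.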

  16. Efficient detection, analysis and classification of lightning radiation fields

    Science.gov (United States)

    Harger, R. O.; Tretter, S. A.

    1976-01-01

    Modeling the large-scale lightning flash structure is considered. Large-scale flash data have been measured from strip charts of storms of August 5, August 26, and September 12, 1975. The data are being processed by a computer program called SASEV to estimate the large-scale flash statistics. The program, experimental results, and conclusions for the large-scale flash structure are described. The progress made in examining the internal flash structure consists mainly of developing the software required to process the NASA digital tape data. A FORTRAN program has been written for the statistical analysis of series of events. The statistics computed and tests performed are found to be particularly useful in the analysis of lightning data.
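
    A statistical analysis of a series of events, of the kind SASEV performs, typically starts from inter-event intervals. A minimal sketch (the actual SASEV statistics are not documented in this abstract):

```python
def interevent_stats(times):
    """Inter-event intervals and mean event rate from a sorted
    sequence of event times: the basic quantities from which
    series-of-events statistics (clustering, rates) are built."""
    gaps = [b - a for a, b in zip(times, times[1:])]
    mean_gap = sum(gaps) / len(gaps)
    return gaps, 1.0 / mean_gap  # intervals, events per unit time
```

    Comparing the empirical interval distribution against an exponential law is then the standard test for whether flashes occur as a Poisson process or cluster in time.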

  17. Meteo and Hydrodynamic Measurements to Detect Physical Processes in Confined Shallow Seas

    Directory of Open Access Journals (Sweden)

    Francesca De Serio

    2018-01-01

    Full Text Available Coastal sites with typical lagoon features are extremely vulnerable, often suffering from scarce circulation. Especially in the case of shallow basins subjected to strong anthropization and urban discharges, it is fundamental to monitor their hydrodynamics and water quality. The proper detection of events by high-performance sensors and appropriate analysis of sensor signals has proved to be a necessary tool for local authorities and stakeholders, enabling early warning and preventive measures against environmental degradation and related hazards. At the same time, assessed datasets are not only essential to deepen the knowledge of the physical processes in the target basin, but are also necessary to calibrate and validate modelling systems that provide forecasts. The present paper aims to show how long-term, continuous recordings of meteorological and hydrodynamic data, collected in a semi-enclosed sea, can be managed to rapidly provide fundamental insights into its hydrodynamic structure. The acquired signals have been analyzed in the time domain, processed and, finally, correlated. The adopted method is simple, feasible and easily replicable. Even though the results are site-dependent, the procedure is generic and depends only on the availability of good-quality data. To show how it might be employed, a case study is examined: the method has been applied to a coastal system, located in Southern Italy, where two monitoring stations are placed in two interconnected basins. The inferred results show that the system is not wind dominated, and that the annual trends in the wind regime, wave spreading and current circulation are not independent, but rather recur. These deductions are of great interest from a predictive perspective and for numerical modelling.
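
    Correlating two monitored signals at a range of time lags is one way to test whether, for instance, currents follow the wind forcing with a delay. A simple numpy sketch of lagged Pearson correlation, not the paper's specific processing chain:

```python
import numpy as np

def lagged_correlation(x, y, max_lag):
    """Pearson correlation of y against x for each integer lag;
    the lag with the largest r suggests the delay with which y
    responds to x (e.g. current responding to wind forcing)."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[:len(x) - lag], y[lag:]
        else:
            a, b = x[-lag:], y[:len(y) + lag]
        out[lag] = float((a * b).mean())
    return out
```

    On real records, significance of the peak lag should be checked against the autocorrelation of the signals themselves, since smooth signals correlate well at many lags.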

  18. The Rondonia Lightning Detection Network: Network Description, Science Objectives, Data Processing Archival/Methodology, and Results

    Science.gov (United States)

    Blakeslee, R. J.; Bailey, J. C.; Pinto, O.; Athayde, A.; Renno, N.; Weidman, C. D.

    2003-01-01

    A four-station Advanced Lightning Direction Finder (ALDF) network was established in the state of Rondonia in western Brazil in 1999 through a collaboration of U.S. and Brazilian participants from NASA, INPE, INMET, and various universities. The network utilizes ALDF IMPACT (Improved Accuracy from Combined Technology) sensors to provide cloud-to-ground lightning observations (i.e., stroke/flash locations, signal amplitude, and polarity) using both time-of-arrival and magnetic direction finding techniques. The observations are collected, processed and archived at a central site in Brasilia and at the NASA/Marshall Space Flight Center in Huntsville, Alabama. Initial, non-quality-assured quick-look results are made available in near real-time over the Internet. The network, which is still operational, was deployed to provide ground truth data for the Lightning Imaging Sensor (LIS) on the Tropical Rainfall Measuring Mission (TRMM) satellite that was launched in November 1997. The measurements are also being used to investigate the relationship between the electrical, microphysical and kinematic properties of tropical convection. In addition, the long time-series observations produced by this network will help establish a regional lightning climatological database, supplementing other databases in Brazil that already exist or may soon be implemented. Analytic inversion algorithms developed at the NASA/Marshall Space Flight Center have been applied to the Rondonian ALDF lightning observations to obtain site error corrections and improved location retrievals. The data will also be corrected for the network detection efficiency. The processing methodology and the results from the analysis of four years of network operations will be presented.
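
    The time-of-arrival principle behind such retrievals can be illustrated with a toy solver: at the true source, every sensor's arrival time implies the same emission time, so one can search for the location minimizing the spread of implied emission times. The sensor layout, grid, and propagation speed below are illustrative assumptions, not the network's actual analytic inversion algorithm.

```python
import numpy as np

C = 3.0e8  # propagation speed of the lightning sferic, m/s

def toa_residual(src, sensors, times):
    """Spread of implied emission times t_i - |src - s_i| / C;
    zero (up to timing noise) at the true source location."""
    t0 = times - np.linalg.norm(sensors - src, axis=1) / C
    return t0.std()

def locate(sensors, times, grid):
    """Brute-force time-of-arrival fix: pick the candidate point
    whose implied emission times agree best across all sensors."""
    best = min(grid, key=lambda p: toa_residual(np.asarray(p, dtype=float),
                                                sensors, times))
    return np.asarray(best, dtype=float)
```

    A production network refines such a fix with least-squares over the timing residuals and combines it with the magnetic direction-finding bearings, which is what the IMPACT "combined technology" refers to.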

  19. Meteo and Hydrodynamic Measurements to Detect Physical Processes in Confined Shallow Seas.

    Science.gov (United States)

    De Serio, Francesca; Mossa, Michele

    2018-01-18

    Coastal sites with typical lagoon features are extremely vulnerable, often suffering from scarce circulation. Especially in the case of shallow basins subjected to strong anthropization and urban discharges, it is fundamental to monitor their hydrodynamics and water quality. The proper detection of events by high-performance sensors and appropriate analysis of sensor signals has proved to be a necessary tool for local authorities and stakeholders, enabling early warning and preventive measures against environmental degradation and related hazards. At the same time, assessed datasets are not only essential to deepen the knowledge of the physical processes in the target basin, but are also necessary to calibrate and validate modelling systems that provide forecasts. The present paper aims to show how long-term, continuous recordings of meteorological and hydrodynamic data, collected in a semi-enclosed sea, can be managed to rapidly provide fundamental insights into its hydrodynamic structure. The acquired signals have been analyzed in the time domain, processed and, finally, correlated. The adopted method is simple, feasible and easily replicable. Even though the results are site-dependent, the procedure is generic and depends only on the availability of good-quality data. To show how it might be employed, a case study is examined: the method has been applied to a coastal system, located in Southern Italy, where two monitoring stations are placed in two interconnected basins. The inferred results show that the system is not wind dominated, and that the annual trends in the wind regime, wave spreading and current circulation are not independent, but rather recur. These deductions are of great interest from a predictive perspective and for numerical modelling.

  20. Detection and processing of phase modulated optical signals at 40 Gbit/s and beyond

    DEFF Research Database (Denmark)

    Geng, Yan

    This thesis addresses demodulation in direct detection systems and signal processing of high speed phase modulated signals in future all-optical wavelength division multiplexing (WDM) communication systems where differential phase shift keying (DPSK) or differential quadrature phase shift keying...... detection and all-optical signal processing -including optical labeling, wavelength conversion and signal regeneration- that already have been studied intensively for signals using conventional on-off keying (OOK) format, can also be successfully implemented for high-speed phase modulated signals...... (DQPSK) are used to transport information. All-optical network functionalities -such as optical labeling, wavelength conversion and signal regeneration- are experimentally investigated. Direct detection of phase modulated signals requires phase-to-intensity modulation conversion in a demodulator...