WorldWideScience

Sample records for analysis detection processing

  1. Data-driven fault detection for industrial processes: canonical correlation analysis and projection based methods

    CERN Document Server

    Chen, Zhiwen

    2017-01-01

    Zhiwen Chen aims to develop advanced fault detection (FD) methods for the monitoring of industrial processes. With ever-increasing demands on reliability and safety in industrial processes, fault detection has become an important issue. Although model-based fault detection theory has been well studied in past decades, its application to large-scale industrial processes is limited because accurate models are difficult to build. Motivated by the limitations of existing data-driven FD methods, novel canonical correlation analysis (CCA) and projection-based methods are proposed that work directly from process input and output data, require less engineering effort and have a wide application scope. A new index for the performance evaluation of FD methods is also developed. Contents: A New Index for Performance Evaluation of FD Methods; CCA-based FD Method for the Monitoring of Stationary Processes; Projection-based FD Method for the Monitoring of Dynamic Processes; Benchmark Study and Real-Time Implementat...
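
    For readers who want to experiment with the CCA residual idea sketched in this abstract, the following is a minimal numpy sketch, not the book's exact algorithm: fault-free input/output data are whitened, canonical pairs are taken from an SVD of the cross-covariance, and a T² statistic on the residual is compared against an empirical training limit. Function names and the percentile threshold are illustrative.

```python
import numpy as np

def cca_fd_train(U, Y, alpha=0.99):
    """Fit a CCA residual generator on fault-free input/output data.
    U: (N, m) process inputs, Y: (N, p) process outputs."""
    mu_u, mu_y = U.mean(0), Y.mean(0)
    Uc, Yc = U - mu_u, Y - mu_y
    N = len(Uc)
    Lu = np.linalg.cholesky(Uc.T @ Uc / (N - 1))
    Ly = np.linalg.cholesky(Yc.T @ Yc / (N - 1))
    # SVD of the whitened cross-covariance gives the canonical pairs.
    K = np.linalg.solve(Lu, Uc.T @ Yc / (N - 1)) @ np.linalg.inv(Ly).T
    P, s, Qt = np.linalg.svd(K, full_matrices=False)
    Ju = np.linalg.solve(Lu.T, P)        # input canonical weights
    Jy = np.linalg.solve(Ly.T, Qt.T)     # output canonical weights
    # Residual r = Jy'(y - mu_y) - diag(s) Ju'(u - mu_u), cov = I - diag(s)^2
    R = Yc @ Jy - (Uc @ Ju) * s
    Sr_inv = np.diag(1.0 / (1.0 - s**2 + 1e-12))
    t2 = np.einsum('ij,jk,ik->i', R, Sr_inv, R)
    limit = np.percentile(t2, 100 * alpha)   # empirical training limit
    return dict(mu_u=mu_u, mu_y=mu_y, Ju=Ju, Jy=Jy, s=s,
                Sr_inv=Sr_inv, limit=limit)

def cca_fd_t2(model, u, y):
    """T^2 statistic for one new sample; a fault is flagged when it
    exceeds model['limit']."""
    r = (y - model['mu_y']) @ model['Jy'] \
        - ((u - model['mu_u']) @ model['Ju']) * model['s']
    return float(r @ model['Sr_inv'] @ r)
```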

  2. Similarity ratio analysis for early stage fault detection with optical emission spectrometer in plasma etching process.

    Directory of Open Access Journals (Sweden)

    Jie Yang

    Full Text Available A Similarity Ratio Analysis (SRA) method is proposed for early-stage Fault Detection (FD) in plasma etching processes, using real-time Optical Emission Spectrometer (OES) data as input. The SRA method can help to realise a highly precise control system by detecting abnormal etch-rate faults in real time during an etching process. The method processes spectrum scans at successive time points and uses a windowing mechanism over the time series to alleviate timing uncertainties due to process shift from one run to another. An SRA library is first built to capture the features of a healthy etching process. By comparison with this library, a Similarity Ratio (SR) statistic is then calculated for each spectrum scan as the monitored process progresses. A fault detection mechanism, named 3-Warning-1-Alarm (3W1A), takes the SR values as inputs and triggers a system alarm when certain conditions are satisfied. This design reduces the chance of false alarms and provides a reliable fault-reporting service. The SRA method is demonstrated on a real semiconductor manufacturing dataset. The effectiveness of SRA-based fault detection is evaluated using a time-series SR test and a post-process SR test. The time-series SR provides an early-stage fault detection service, so less energy and material are wasted on faulty processing. The post-process SR provides fault detection with higher reliability than the time-series SR, but fault testing is conducted only after each process run completes.
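
    The abstract does not spell out the exact 3W1A conditions, so the sketch below takes one plausible reading: an SR defined as cosine similarity to the healthy-library mean spectrum, and an alarm raised after three consecutive warning scans plus a fourth low-SR scan. Both the SR definition and the 0.98 threshold are assumptions for illustration.

```python
import numpy as np

def similarity_ratio(scan, library_mean):
    """Cosine-style similarity of one OES scan to the healthy-library
    mean spectrum (one plausible choice of SR; the paper's exact
    definition may differ)."""
    num = float(scan @ library_mean)
    return num / (np.linalg.norm(scan) * np.linalg.norm(library_mean) + 1e-12)

def three_w_one_a(sr_values, warn_threshold=0.98):
    """Hypothetical 3W1A rule: three consecutive warning scans followed
    by a fourth low-SR scan raise the alarm."""
    run = 0
    for i, sr in enumerate(sr_values):
        run = run + 1 if sr < warn_threshold else 0
        if run >= 4:          # 3 warnings + 1 alarming scan
            return i          # index of the alarm-triggering scan
    return None               # no alarm raised
```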

  3. Development of safety analysis and constraint detection techniques for process interaction errors

    International Nuclear Information System (INIS)

    Fan, Chin-Feng; Tsai, Shang-Lin; Tseng, Wan-Hui

    2011-01-01

    Among the new failure modes introduced by computers into safety systems, the process interaction error is the most unpredictable and complicated, and may cause disastrous consequences. This paper presents safety analysis and constraint detection techniques for process interaction errors among hardware, software, and human processes. Among interaction errors, the most dreadful are those involving run-time misinterpretation by a logic process; we call these 'semantic interaction errors'. Such abnormal interaction is not adequately emphasized in current research. In our static analysis, we provide a fault tree template focusing on semantic interaction errors and check for conflicting pre-conditions and post-conditions among interacting processes. Thus, far-fetched but highly risky interaction scenarios involving interpretation errors can be identified. For run-time monitoring, a range of constraint types is proposed for checking abnormal signs at run time. We extend current constraints to a broader relational level and a global level, considering process/device dependencies and physical conservation rules in order to detect process interaction errors. The proposed techniques can reduce abnormal interactions; they can also be used to assist in safety-case construction.

  4. Development of safety analysis and constraint detection techniques for process interaction errors

    Energy Technology Data Exchange (ETDEWEB)

    Fan, Chin-Feng, E-mail: csfanc@saturn.yzu.edu.tw [Computer Science and Engineering Dept., Yuan-Ze University, Taiwan (China); Tsai, Shang-Lin; Tseng, Wan-Hui [Computer Science and Engineering Dept., Yuan-Ze University, Taiwan (China)

    2011-02-15

    Among the new failure modes introduced by computers into safety systems, the process interaction error is the most unpredictable and complicated, and may cause disastrous consequences. This paper presents safety analysis and constraint detection techniques for process interaction errors among hardware, software, and human processes. Among interaction errors, the most dreadful are those involving run-time misinterpretation by a logic process; we call these 'semantic interaction errors'. Such abnormal interaction is not adequately emphasized in current research. In our static analysis, we provide a fault tree template focusing on semantic interaction errors and check for conflicting pre-conditions and post-conditions among interacting processes. Thus, far-fetched but highly risky interaction scenarios involving interpretation errors can be identified. For run-time monitoring, a range of constraint types is proposed for checking abnormal signs at run time. We extend current constraints to a broader relational level and a global level, considering process/device dependencies and physical conservation rules in order to detect process interaction errors. The proposed techniques can reduce abnormal interactions; they can also be used to assist in safety-case construction.
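
    A toy illustration of the static check both records describe, detecting conflicting pre-conditions and post-conditions among interacting processes, might look like the following; the condition model (signed literals per process) is hypothetical, not the paper's notation.

```python
# Hypothetical condition model: each process exposes post-conditions it
# establishes and pre-conditions it assumes, as signed literals like
# ("valve_open", True). A conflict is a literal asserted with opposite
# signs by an interacting pair -- one seed for a fault tree template.
from itertools import combinations

def find_semantic_conflicts(processes):
    """processes: dict name -> {"pre": set, "post": set} of (var, value)."""
    conflicts = []
    for a, b in combinations(processes, 2):
        for var, val in processes[a]["post"]:
            if (var, not val) in processes[b]["pre"]:
                conflicts.append((a, b, var))
    return conflicts

demo = {
    "controller": {"pre": set(), "post": {("pump_on", True)}},
    "operator":   {"pre": {("pump_on", False)}, "post": set()},
}
print(find_semantic_conflicts(demo))  # [('controller', 'operator', 'pump_on')]
```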

  5. On-road anomaly detection by multimodal sensor analysis and multimedia processing

    Science.gov (United States)

    Orhan, Fatih; Eren, P. E.

    2014-03-01

    The use of smartphones in Intelligent Transportation Systems is gaining popularity, yet many challenges exist in developing functional applications. Due to the dynamic nature of transportation, vehicular social applications face complexities such as robust sensor management, signal and image processing tasks, and information sharing among users. This study utilizes a framework that enables the joint analysis of multiple sensor modalities. It also provides plugin-based analysis interfaces for developing sensor- and image-processing-based applications, and connects its users via a centralized application as well as to social networks to facilitate communication and socialization. Using this framework, an on-road anomaly detector is being developed and tested. The detector utilizes the sensors of a mobile device and is able to identify anomalies such as hard braking, pothole crossings, and speed bump crossings. Upon detection, the video portion containing the anomaly is automatically extracted to enable further image processing analysis. The detection results are shared on a central portal application for online traffic condition monitoring.
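
    As a rough sketch of the kind of sensor rule such a detector might use for hard braking, consider thresholding the longitudinal accelerometer trace; the sampling rate, -3 m/s² threshold and debounce gap are illustrative values, not the study's parameters.

```python
import numpy as np

def detect_hard_brakes(accel_long, fs=50.0, threshold=-3.0, min_gap=1.0):
    """Flag hard-brake events in a longitudinal-acceleration trace (m/s^2).
    threshold and min_gap (seconds) are illustrative tuning values."""
    idx = np.flatnonzero(accel_long < threshold)
    events, last_t = [], -np.inf
    for i in idx:
        t = i / fs
        if t - last_t >= min_gap:   # debounce repeated samples of one event
            events.append(t)
            last_t = t
    return events                   # timestamps to cut video segments around
```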

  6. Improvement on Exoplanet Detection Methods and Analysis via Gaussian Process Fitting Techniques

    Science.gov (United States)

    Van Ross, Bryce; Teske, Johanna

    2018-01-01

    Planetary signals in radial velocity (RV) data are often accompanied by signals coming solely from stellar photo- or chromospheric variation. Such variation can reduce the precision of planet detection and mass measurements, and cause misidentification of planetary signals. Recently, several authors have demonstrated the utility of Gaussian Process (GP) regression for disentangling planetary signals in RV observations (Aigrain et al. 2012; Angus et al. 2017; Czekala et al. 2017; Faria et al. 2016; Gregory 2015; Haywood et al. 2014; Rajpaul et al. 2015; Foreman-Mackey et al. 2017). A GP models the covariance of multivariate data to make predictions about likely underlying trends, which can be extended to regions where there are no existing observations. GP regression has been used to infer stellar rotation periods; to model and disentangle time-series spectra; and to determine physical properties, populations, and detections of exoplanets, among other astrophysical applications. Here, we apply similar analysis techniques to time series of the Ca II H and K activity indicator measured simultaneously with RVs in a small sample of stars from the large Keck/HIRES RV planet search program. Our goal is to characterize the pattern(s) of non-planetary variation so as to know what is and is not a planetary signal. We investigated ten different GP kernels and their respective hyperparameters to determine the optimal combination (e.g., the lowest Bayesian Information Criterion value) for each stellar data set. To assess the hyperparameters' error, we sampled their posterior distributions using Markov chain Monte Carlo (MCMC) analysis on the optimized kernels. Our results demonstrate how GP analysis of stellar activity indicators alone can contribute to exoplanet detection in RV data, and highlight the challenges in applying GP analysis to relatively small, irregularly sampled time series.
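
    A kernel comparison of the sort described, choosing among GP kernels by Bayesian Information Criterion, can be sketched with scikit-learn rather than whatever toolkit the authors used; the kernels, the synthetic data and the BIC parameter count are illustrative.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import (RBF, ExpSineSquared,
                                              RationalQuadratic, WhiteKernel)

def bic_for_kernel(kernel, t, y):
    """BIC = -2 log L + k ln n for a fitted GP, counting k as the number
    of optimized hyperparameters."""
    gp = GaussianProcessRegressor(kernel=kernel + WhiteKernel(),
                                  normalize_y=True)
    gp.fit(t.reshape(-1, 1), y)
    k, n = gp.kernel_.theta.size, len(y)
    return -2.0 * gp.log_marginal_likelihood(gp.kernel_.theta) + k * np.log(n)

kernels = {"squared-exp": RBF(), "periodic": ExpSineSquared(),
           "rational-quad": RationalQuadratic()}
# Placeholder stand-in for an activity-indicator time series.
t = np.linspace(0, 100, 60)
y = np.sin(0.3 * t) + 0.1 * np.random.randn(60)
scores = {name: bic_for_kernel(kern, t, y) for name, kern in kernels.items()}
print(min(scores, key=scores.get), scores)   # kernel with lowest BIC wins
```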

  7. Formation of hydrocarbons in irradiated Brazilian beans: gas chromatographic analysis to detect radiation processing

    International Nuclear Information System (INIS)

    Villavicencio, A.L.C.H.; Mancini-Filho, J.; Hartmann, M.; Ammon, J.; Delincee, H.

    1997-01-01

    Radiation processing of beans, which are a major source of dietary protein in Brazil, is a valuable alternative to chemical fumigation for combating postharvest losses due to insect infestation. To ensure free consumer choice, irradiated food will be labeled as such, and to enforce labeling, analytical methods that detect the irradiation treatment in the food product itself are desirable. In two varieties of Brazilian beans, Carioca and Macacar, the radiolytic formation of hydrocarbons produced by alpha and beta cleavage with respect to the carbonyl group in triglycerides has been studied. Gas chromatographic analysis of these radiolytic hydrocarbons shows different yields per precursor fatty acid for the two types of beans. However, the typical degradation pattern allows identification of the irradiation treatment in both bean varieties, even after 6 months of storage.

  8. Surface defect detection in tiling industries using digital image processing methods: analysis and evaluation.

    Science.gov (United States)

    Karimi, Mohammad H; Asemani, Davud

    2014-05-01

    Ceramic and tile industries should indispensably include a grading stage to quantify the quality of products. In practice, human inspection is often used for grading, so an automatic grading system is essential to enhance quality control and marketing of the products. Since six different types of defects, with distinct textures and morphologies, generally originate from the various stages of tile manufacturing lines, many image processing techniques have been proposed for defect detection. In this paper, a survey is made of the pattern recognition and image processing algorithms that have been used to detect surface defects. Each method appears to be limited to detecting some subgroup of defects. The detection techniques may be divided into three main groups: statistical pattern recognition, feature vector extraction, and texture/image classification. Methods such as the wavelet transform, filtering, morphology, and the contourlet transform are more effective for pre-processing tasks; others, including statistical methods, neural networks, and model-based algorithms, can be applied to extract the surface defects. While statistical methods are often appropriate for identifying large defects such as spots, techniques such as wavelet processing provide an acceptable response for detecting small defects such as pinholes. A thorough survey is made of the existing algorithms in each subgroup, and the evaluation parameters, both supervised and unsupervised, are discussed. Using various performance parameters, the defect detection algorithms are compared and evaluated. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
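
    As a sketch of the wavelet route the survey recommends for small defects, one can threshold the finest-scale detail coefficients of a grayscale tile image; the wavelet choice, decomposition level and k-sigma threshold are illustrative.

```python
import numpy as np
import pywt

def wavelet_defect_map(gray, wavelet="db2", level=2, k=4.0):
    """Highlight small defects (e.g. pinholes) as outliers in the
    finest-scale detail coefficients of a 2-D wavelet decomposition."""
    coeffs = pywt.wavedec2(gray.astype(float), wavelet, level=level)
    cH, cV, cD = coeffs[-1]                 # finest-scale detail bands
    detail = np.abs(cH) + np.abs(cV) + np.abs(cD)
    thresh = detail.mean() + k * detail.std()
    # Boolean defect map at half the original resolution (level-1 details).
    return detail > thresh
```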

  9. Eye movement analysis and cognitive processing: detecting indicators of conversion to Alzheimer’s disease

    Science.gov (United States)

    Pereira, Marta LG Freitas; Camargo, Marina von Zuben A; Aprahamian, Ivan; Forlenza, Orestes V

    2014-01-01

    A great amount of research has been developed around the early cognitive impairments that best predict the onset of Alzheimer’s disease (AD). Given that mild cognitive impairment (MCI) is no longer considered to be an intermediate state between normal aging and AD, new paths have been traced to acquire further knowledge about this condition and its subtypes, and to determine which of them have a higher risk of conversion to AD. It is now known that other deficits besides episodic and semantic memory impairments may be present in the early stages of AD, such as visuospatial and executive function deficits. Furthermore, recent investigations have proven that the hippocampus and the medial temporal lobe structures are not only involved in memory functioning, but also in visual processes. These early changes in memory, visual, and executive processes may also be detected with the study of eye movement patterns in pathological conditions like MCI and AD. In the present review, we attempt to explore the existing literature concerning these patterns of oculomotor changes and how these changes are related to the early signs of AD. In particular, we argue that deficits in visual short-term memory, specifically in iconic memory, attention processes, and inhibitory control, may be found through the analysis of eye movement patterns, and we discuss how they might help to predict the progression from MCI to AD. We add that the study of eye movement patterns in these conditions, in combination with neuroimaging techniques and appropriate neuropsychological tasks based on rigorous concepts derived from cognitive psychology, may highlight the early presence of cognitive impairments in the course of the disease. PMID:25031536

  10. Eye movement analysis and cognitive processing: detecting indicators of conversion to Alzheimer’s disease

    Directory of Open Access Journals (Sweden)

    Pereira ML

    2014-07-01

    Full Text Available Marta LG Freitas Pereira, Marina von Zuben A Camargo, Ivan Aprahamian, Orestes V Forlenza; Laboratory of Neuroscience (LIM-27), Department and Institute of Psychiatry, Faculty of Medicine, University of São Paulo, São Paulo, SP, Brazil. Abstract: A great amount of research has been developed around the early cognitive impairments that best predict the onset of Alzheimer’s disease (AD). Given that mild cognitive impairment (MCI) is no longer considered to be an intermediate state between normal aging and AD, new paths have been traced to acquire further knowledge about this condition and its subtypes, and to determine which of them have a higher risk of conversion to AD. It is now known that other deficits besides episodic and semantic memory impairments may be present in the early stages of AD, such as visuospatial and executive function deficits. Furthermore, recent investigations have proven that the hippocampus and the medial temporal lobe structures are not only involved in memory functioning, but also in visual processes. These early changes in memory, visual, and executive processes may also be detected with the study of eye movement patterns in pathological conditions like MCI and AD. In the present review, we attempt to explore the existing literature concerning these patterns of oculomotor changes and how these changes are related to the early signs of AD. In particular, we argue that deficits in visual short-term memory, specifically in iconic memory, attention processes, and inhibitory control, may be found through the analysis of eye movement patterns, and we discuss how they might help to predict the progression from MCI to AD. We add that the study of eye movement patterns in these conditions, in combination with neuroimaging techniques and appropriate neuropsychological tasks based on rigorous concepts derived from cognitive psychology, may highlight the early presence of cognitive impairments in the course of the disease.

  11. Evaluation of signal processing for boiling noise detection. Further analysis of BOR-60 reactor noise data

    International Nuclear Information System (INIS)

    Ledwidge, T.J.; Black, J.L.

    1989-01-01

    The present paper deals with investigations of acoustic signals from a boiling experiment performed on the BOR 60 reactor in the USSR. Signals have been analysed in the frequency domain as well as in the time domain. Signal characteristics that successfully detect the boiling process have been found in the time domain. A proposal for in-service boiling monitoring by acoustic means is described. (author). 3 refs, 16 figs

  12. Fault detection in nonlinear chemical processes based on kernel entropy component analysis and angular structure

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Qingchao; Yan, Xuefeng; Lv, Zhaomin; Guo, Meijin [East China University of Science and Technology, Shanghai (China)

    2013-06-15

    Considering that kernel entropy component analysis (KECA) is a promising new method for nonlinear data transformation and dimensionality reduction, a KECA-based method is proposed for nonlinear chemical process monitoring. In this method, an angle-based statistic is designed, because KECA reveals structure related to the Rényi entropy of the input-space data set and produces transformed data sets with a distinct angular structure. Based on the angle difference between normal status and the current sample data, the current status can be monitored effectively. The confidence limit of the angle-based statistic is determined by kernel density estimation on sample data from normal operation. The effectiveness of the proposed method is demonstrated by case studies on both a numerical process and a simulated continuous stirred tank reactor (CSTR) process. The KECA-based method can be an effective method for nonlinear chemical process monitoring.
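
    A compact numpy sketch of the KECA angle statistic described here: eigenpairs of an (uncentered) RBF kernel matrix are ranked by their Rényi-entropy contribution rather than by eigenvalue, and new samples are scored by their angle to the mean normal direction. The paper sets the control limit by kernel density estimation; a percentile of training angles would serve in this sketch. Parameter names are illustrative.

```python
import numpy as np

def rbf_kernel(A, B, sigma):
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * sigma ** 2))

def keca_fit(X, sigma, n_comp):
    """Kernel ECA on normal data: keep the eigenpairs with the largest
    Renyi-entropy contribution lambda_i * (1' e_i)^2, not the largest
    eigenvalues as in kernel PCA."""
    K = rbf_kernel(X, X, sigma)
    lam, E = np.linalg.eigh(K)
    entropy = lam * (E.sum(axis=0) ** 2)
    keep = np.argsort(entropy)[::-1][:n_comp]
    scores = E[:, keep] * np.sqrt(np.abs(lam[keep]))   # training scores
    return {"X": X, "sigma": sigma, "lam": lam[keep], "E": E[:, keep],
            "mean_dir": scores.mean(axis=0)}

def angle_stat(model, x_new):
    """Angle between a projected sample and the mean normal direction."""
    k = rbf_kernel(x_new[None, :], model["X"], model["sigma"])
    t = (k @ model["E"] / np.sqrt(np.abs(model["lam"])))[0]
    m = model["mean_dir"]
    c = t @ m / (np.linalg.norm(t) * np.linalg.norm(m) + 1e-12)
    return float(np.arccos(np.clip(c, -1.0, 1.0)))
```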

  13. Fault detection in nonlinear chemical processes based on kernel entropy component analysis and angular structure

    International Nuclear Information System (INIS)

    Jiang, Qingchao; Yan, Xuefeng; Lv, Zhaomin; Guo, Meijin

    2013-01-01

    Considering that kernel entropy component analysis (KECA) is a promising new method for nonlinear data transformation and dimensionality reduction, a KECA-based method is proposed for nonlinear chemical process monitoring. In this method, an angle-based statistic is designed, because KECA reveals structure related to the Rényi entropy of the input-space data set and produces transformed data sets with a distinct angular structure. Based on the angle difference between normal status and the current sample data, the current status can be monitored effectively. The confidence limit of the angle-based statistic is determined by kernel density estimation on sample data from normal operation. The effectiveness of the proposed method is demonstrated by case studies on both a numerical process and a simulated continuous stirred tank reactor (CSTR) process. The KECA-based method can be an effective method for nonlinear chemical process monitoring.

  14. Eyewitness decisions in simultaneous and sequential lineups: a dual-process signal detection theory analysis.

    Science.gov (United States)

    Meissner, Christian A; Tredoux, Colin G; Parker, Janat F; MacLin, Otto H

    2005-07-01

    Many eyewitness researchers have argued for the application of a sequential alternative to the traditional simultaneous lineup, given its role in decreasing false identifications of innocent suspects (sequential superiority effect). However, Ebbesen and Flowe (2002) have recently noted that sequential lineups may merely bring about a shift in response criterion, having no effect on discrimination accuracy. We explored this claim, using a method that allows signal detection theory measures to be collected from eyewitnesses. In three experiments, lineup type was factorially combined with conditions expected to influence response criterion and/or discrimination accuracy. Results were consistent with signal detection theory predictions, including that of a conservative criterion shift with the sequential presentation of lineups. In a fourth experiment, we explored the phenomenological basis for the criterion shift, using the remember-know-guess procedure. In accord with previous research, the criterion shift in sequential lineups was associated with a reduction in familiarity-based responding. It is proposed that the relative similarity between lineup members may create a context in which fluency-based processing is facilitated to a greater extent when lineup members are presented simultaneously.
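
    The signal detection theory measures at issue, discrimination accuracy d′ and response criterion c, are standard; a minimal computation from hit/false-alarm counts (with the usual correction for extreme rates) looks like this, with illustrative counts.

```python
from scipy.stats import norm

def sdt_measures(hits, misses, fas, crs):
    """d' and criterion c from hit/false-alarm counts, with a log-linear
    correction to avoid infinite z-scores at rates of 0 or 1."""
    h = (hits + 0.5) / (hits + misses + 1.0)
    f = (fas + 0.5) / (fas + crs + 1.0)
    d_prime = norm.ppf(h) - norm.ppf(f)
    c = -0.5 * (norm.ppf(h) + norm.ppf(f))
    return d_prime, c

# A conservative criterion shift (as reported for sequential lineups)
# shows up as a larger c at a similar d'. Counts below are illustrative.
print(sdt_measures(60, 40, 20, 80))
print(sdt_measures(45, 55, 8, 92))
```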

  15. Distributed dendritic processing facilitates object detection: a computational analysis on the visual system of the fly.

    Science.gov (United States)

    Hennig, Patrick; Möller, Ralf; Egelhaaf, Martin

    2008-08-28

    Detecting objects is an important task when moving through a natural environment. Flies, for example, may land on salient objects or may avoid collisions with them. The neuronal ensemble of Figure Detection cells (FD-cells) in the visual system of the fly is likely to be involved in controlling these behaviours, as these cells are more sensitive to objects than to extended background structures. Until now, the computations in the presynaptic neuronal network of FD-cells, and in particular the functional significance of the experimentally established distributed dendritic processing of excitatory and inhibitory inputs, have not been understood. We use model simulations to analyse the neuronal computations responsible for the preference of FD-cells for small objects. We employed a new modelling approach which allowed us to account for the spatial spread of electrical signals in the dendrites while avoiding detailed compartmental modelling. The models are based on available physiological and anatomical data. Three models were tested, each implementing an inhibitory neural circuit but differing in the spatial arrangement of the inhibitory interaction. Parameter optimisation with an evolutionary algorithm revealed that only distributed dendritic processing satisfies the constraints arising from electrophysiological experiments. In contrast to a direct dendro-dendritic inhibition of the FD-cell (Direct Distributed Inhibition model), an inhibition of its presynaptic retinotopic elements (Indirect Distributed Inhibition model) requires smaller changes in input resistance in the inhibited neurons during visual stimulation. Distributed dendritic inhibition of retinotopic elements as implemented in our Indirect Distributed Inhibition model is the most plausible wiring scheme for the neuronal circuit of FD-cells. This microcircuit is computationally similar to lateral inhibition between the retinotopic elements. Hence, distributed inhibition might be an alternative explanation of

  16. Detecting spatial patterns of rivermouth processes using a geostatistical framework for near-real-time analysis

    Science.gov (United States)

    Xu, Wenzhao; Collingsworth, Paris D.; Bailey, Barbara; Carlson Mazur, Martha L.; Schaeffer, Jeff; Minsker, Barbara

    2017-01-01

    This paper proposes a geospatial analysis framework and software to interpret water-quality sampling data from towed undulating vehicles in near-real time. The framework includes data quality assurance and quality control processes, automated kriging interpolation along undulating paths, and local hotspot and cluster analyses. These methods are implemented in an interactive Web application developed using the Shiny package in the R programming environment to support near-real time analysis along with 2- and 3-D visualizations. The approach is demonstrated using historical sampling data from an undulating vehicle deployed at three rivermouth sites in Lake Michigan during 2011. The normalized root-mean-square error (NRMSE) of the interpolation averages approximately 10% in 3-fold cross validation. The results show that the framework can be used to track river plume dynamics and provide insights on mixing, which could be related to wind and seiche events.
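
    The 3-fold NRMSE evaluation reported here can be sketched with a Gaussian-process interpolator (the standard analogue of kriging) in scikit-learn; the kernel, length scale and fold settings are illustrative, not the study's configuration.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.model_selection import KFold

def nrmse_cv(coords, values, n_splits=3):
    """k-fold NRMSE of a GP interpolator along a sampling track.
    coords: (n, 3) x/y/depth positions, values: (n,) water-quality readings."""
    errs = []
    for train, test in KFold(n_splits, shuffle=True, random_state=0).split(coords):
        gp = GaussianProcessRegressor(RBF(length_scale=100.0) + WhiteKernel(),
                                      normalize_y=True)
        gp.fit(coords[train], values[train])
        pred = gp.predict(coords[test])
        errs.append(np.sqrt(np.mean((pred - values[test]) ** 2)))
    # Normalize by the data range; the study reports roughly 0.10.
    return np.mean(errs) / (values.max() - values.min())
```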

  17. The selective processing of emotional visual stimuli while detecting auditory targets: an ERP analysis.

    Science.gov (United States)

    Schupp, Harald T; Stockburger, Jessica; Bublatzky, Florian; Junghöfer, Markus; Weike, Almut I; Hamm, Alfons O

    2008-09-16

    Event-related potential studies revealed an early posterior negativity (EPN) for emotional compared to neutral pictures. Exploring the emotion-attention relationship, a previous study observed that a primary visual discrimination task interfered with the emotional modulation of the EPN component. To specify the locus of interference, the present study assessed the fate of selective visual emotion processing while attention is directed towards the auditory modality. While simply viewing a rapid and continuous stream of pleasant, neutral, and unpleasant pictures in one experimental condition, processing demands of a concurrent auditory target discrimination task were systematically varied in three further experimental conditions. Participants successfully performed the auditory task as revealed by behavioral performance and selected event-related potential components. Replicating previous results, emotional pictures were associated with a larger posterior negativity compared to neutral pictures. Of main interest, increasing demands of the auditory task did not modulate the selective processing of emotional visual stimuli. With regard to the locus of interference, selective emotion processing as indexed by the EPN does not seem to reflect shared processing resources of visual and auditory modality.

  18. The selective processing of emotional visual stimuli while detecting auditory targets: An ERP analysis

    OpenAIRE

    Schupp, Harald Thomas; Stockburger, Jessica; Bublatzky, Florian; Junghöfer, Markus; Weike, Almut I.; Hamm, Alfons O.

    2008-01-01

    Event-related potential studies revealed an early posterior negativity (EPN) for emotional compared to neutral pictures. Exploring the emotion-attention relationship, a previous study observed that a primary visual discrimination task interfered with the emotional modulation of the EPN component. To specify the locus of interference, the present study assessed the fate of selective visual emotion processing while attention is directed towards the auditory modality. While simply viewing a rapi...

  19. The rSPA Processes of River Water-quality Analysis System for Critical Contaminate Detection, Classification Multiple-water-quality-parameter Values and Real-time Notification

    OpenAIRE

    Chalisa VEESOMMAI; Yasushi KIYOKI

    2016-01-01

    Water quality analysis is one of the most important aspects of designing environmental systems. It is necessary to realize detection and classification processes and systems for water quality analysis, and an important direction is to make the results easy for the public to understand and use. This paper presents the river Sensing Processing Actuation (rSPA) processes for determination and classification of multiple water-quality parameters in the Chaophraya river. According to rSPA processes of multip...

  20. Malware detection and analysis

    Science.gov (United States)

    Chiang, Ken; Lloyd, Levi; Crussell, Jonathan; Sanders, Benjamin; Erickson, Jeremy Lee; Fritz, David Jakob

    2016-03-22

    Embodiments of the invention describe systems and methods for malicious software detection and analysis. A binary executable comprising obfuscated malware on a host device may be received, and incident data indicating a time when the binary executable was received and identifying processes operating on the host device may be recorded. The binary executable is analyzed via a scalable plurality of execution environments, including one or more non-virtual execution environments and one or more virtual execution environments, to generate runtime data and deobfuscation data attributable to the binary executable. At least some of the runtime data and deobfuscation data attributable to the binary executable is stored in a shared database, while at least some of the incident data is stored in a private, non-shared database.

  1. Detecting determinism from point processes.

    Science.gov (United States)

    Andrzejak, Ralph G; Mormann, Florian; Kreuz, Thomas

    2014-12-01

    The detection of a nonrandom structure from experimental data can be crucial for the classification, understanding, and interpretation of the generating process. We here introduce a rank-based nonlinear predictability score to detect determinism from point process data. Thanks to its modular nature, this approach can be adapted to whatever signature in the data one considers indicative of deterministic structure. After validating our approach using point process signals from deterministic and stochastic model dynamics, we show an application to neuronal spike trains recorded in the brain of an epilepsy patient. While we illustrate our approach in the context of temporal point processes, it can be readily applied to spatial point processes as well.

  2. Activation analysis. Detection limits

    International Nuclear Information System (INIS)

    Revel, G.

    1999-01-01

    Numerical data and detection limits for the four irradiation modes often used in activation analysis (reactor neutrons, 14 MeV neutrons, gamma photons and charged particles) are presented here. The technical presentation of activation analysis is detailed in paper P 2565 of Techniques de l'Ingenieur. (A.L.B.)

  3. Fault Detection for Industrial Processes

    Directory of Open Access Journals (Sweden)

    Yingwei Zhang

    2012-01-01

    Full Text Available A new fault-relevant KPCA algorithm is proposed, and a fault detection approach is built on it. The proposed method further decomposes both the KPCA principal space and the residual space into two subspaces. In contrast to traditional statistical techniques, the fault subspace is separated based on fault-relevant influence. The method can find fault-relevant principal directions and principal components in both the systematic and residual subspaces for process monitoring. The proposed monitoring approach is applied to the Tennessee Eastman process and a penicillin fermentation process. The simulation results show the effectiveness of the proposed method.
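
    The fault-relevant subspace split itself is the paper's contribution and is not reproduced here, but the baseline it refines, KPCA monitoring with a T² statistic in the principal subspace, can be sketched as follows; the component count and the 99% empirical limit are illustrative.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

class KPCAMonitor:
    """Baseline KPCA monitoring (T^2 in the principal subspace); the
    paper's fault-relevant decomposition refines this baseline."""
    def __init__(self, n_comp=5, gamma=None):
        self.kpca = KernelPCA(n_components=n_comp, kernel="rbf", gamma=gamma)

    def fit(self, X_normal):
        T = self.kpca.fit_transform(X_normal)
        self.var_ = T.var(axis=0)                      # per-component variance
        t2 = ((T ** 2) / self.var_).sum(axis=1)
        self.limit_ = np.percentile(t2, 99)            # empirical 99% limit
        return self

    def t2(self, X):
        T = self.kpca.transform(X)
        return ((T ** 2) / self.var_).sum(axis=1)      # flag where > limit_
```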

  4. IWGFR benchmark test on signal processing for boiling noise detection, stage 2: Analysis of data from BOR-60

    International Nuclear Information System (INIS)

    Rowley, R.; Waites, C.; Macleod, I.D.

    1989-01-01

    Data from boiling experiments in the BOR 60 reactor in the USSR have been supplied by the IAEA to enable analysis techniques to be compared. The signals have been analysed at RNL using two basic techniques, high-frequency RMS analysis and pulse counting analysis, and two more sophisticated methods, pattern recognition and pulse timing analysis. All methods indicated boiling successfully; pulse counting proved more sensitive than RMS for detecting the onset of boiling. Pattern recognition shows promise as a very reliable detector, provided the background can be defined. Data from an ionisation chamber were also supplied, and there was good correlation between the neutronic and acoustic signals. (author). 25 figs, 4 tabs
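
    The two basic techniques named here are easy to sketch: windowed RMS of the high-pass-filtered acoustic signal, and counts of threshold crossings per window. The corner frequency, window length and k-sigma threshold below are illustrative, not RNL's settings.

```python
import numpy as np
from scipy.signal import butter, sosfilt

def hf_rms(x, fs, win_s=0.1, f_hp=100e3):
    """Windowed RMS of the high-pass-filtered acoustic signal
    (requires fs > 2 * f_hp; the 100 kHz corner is illustrative)."""
    sos = butter(4, f_hp, btype="highpass", fs=fs, output="sos")
    xf = sosfilt(sos, x)
    n = int(win_s * fs)
    w = xf[: len(xf) // n * n].reshape(-1, n)
    return np.sqrt((w ** 2).mean(axis=1))

def pulse_count(x, fs, win_s=0.1, k=4.0):
    """Count threshold crossings per window; in practice the k*sigma
    threshold would be set from quiet (non-boiling) background data."""
    n = int(win_s * fs)
    thr = k * x.std()
    w = np.abs(x[: len(x) // n * n]).reshape(-1, n)
    return (w > thr).sum(axis=1)
```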

  5. Theoretical and experimental analysis of electroweak corrections to the inclusive jet process. Development of extreme topologies detection methods

    International Nuclear Information System (INIS)

    Meric, Nicolas

    2013-01-01

    We have studied the behaviour of the inclusive jet, W+jets and Z+jets processes from the phenomenological and experimental point of view in the ATLAS experiment at the LHC, in order to understand the impact of Sudakov logarithms on electroweak corrections in the associated production of weak vector bosons and jets. We have computed the amplitude of the real electroweak corrections to the inclusive jet process due to the real emission of weak vector bosons from jets, using the MCFM and NLOjet++ generators at 7 TeV, 8 TeV and 14 TeV. This study shows that, for the inclusive jet process, a partial cancellation of the virtual weak corrections (due to weak bosons in loops) by the real electroweak corrections occurs, so that Bloch-Nordsieck violation is reduced for this process. We then participated in the measurement of the differential cross-sections for these processes in the ATLAS experiment at 7 TeV. In particular, we were involved in technical aspects of the measurement, such as the study of the QCD background to the W+jets process in the muon channel. We then combined the different measurements in this channel to compare their behaviour. The comparison tends to show that several effects give the electroweak corrections their relative importance, since the relative contribution of weak-boson-plus-jets processes to the inclusive jet process increases with the transverse momentum of the jets when electroweak bosons are explicitly required in the final state. This preliminary study aims to show that such measurements can be useful for investigating the underlying structure of these processes. Finally, we have studied the noise affecting the ATLAS calorimeter. This has allowed the development of a new way to detect problematic events using well-known theorems from statistics. The new method is able to detect bursts of noise and

  6. Crack detection using image processing

    International Nuclear Information System (INIS)

    Moustafa, M.A.A

    2010-01-01

    This thesis contains five main subjects in eight chapters and two appendices. The first subject discusses the Wiener filter for filtering images. In the second subject, we examine different methods, such as the Steepest Descent Algorithm (SDA) and the wavelet transform, for detecting and filling cracks, with applications in areas such as nanotechnology and biotechnology. In the third subject, we attempt to obtain 3-D images from 1-D or 2-D images using texture mapping with OpenGL, programmed in Visual C++. The fourth subject covers image warping methods for finding the depth of 2-D images using the affine transformation, bilinear transformation, projective mapping, mosaic warping and similarity transformation; more details about this subject are discussed below. The fifth subject, Bezier curves and surfaces, is discussed in detail, including methods for creating Bezier curves and surfaces with unknown distribution using only control points. At the end of the discussion we obtain the solid form using so-called NURBS (Non-Uniform Rational B-Splines), a mathematical representation of 3-D geometry that can accurately describe any shape, from a simple 2-D line, circle, arc or curve to the most complex 3-D organic free-form surface or solid. A NURBS model depends on the degree of freedom, control points, knots, and an evaluation rule; here it depends on finding the Bezier curve, creating a family of curves (a surface), and then filling in between to obtain the solid form. Another aspect of this subject concerns building 3-D geometric models from physical objects using image-based techniques, whose advantage is that they require no expensive equipment; we use NURBS, subdivision surfaces and meshes to find the depth of any image from one still view or 2-D image. The quality of filtering depends on the way the data are incorporated into the model. The data should be treated with

  7. Detecting periodicities with Gaussian processes

    Directory of Open Access Journals (Sweden)

    Nicolas Durrande

    2016-04-01

    Full Text Available We consider the problem of detecting and quantifying the periodic component of a function given noise-corrupted observations of a limited number of input/output tuples. Our approach is based on Gaussian process regression, which provides a flexible non-parametric framework for modelling periodic data. We introduce a novel decomposition of the covariance function as the sum of periodic and aperiodic kernels. This decomposition allows for the creation of sub-models which capture the periodic nature of the signal and its complement. To quantify the periodicity of the signal, we derive a periodicity ratio which reflects the uncertainty in the fitted sub-models. Although the method can be applied to many kernels, we give a special emphasis to the Matérn family, from the expression of the reproducing kernel Hilbert space inner product to the implementation of the associated periodic kernels in a Gaussian process toolkit. The proposed method is illustrated by considering the detection of periodically expressed genes in the arabidopsis genome.
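
    The core idea, splitting a sum kernel's posterior mean into periodic and aperiodic sub-model means and forming a ratio, can be sketched as follows with fixed (illustrative) hyperparameters; the paper's periodicity ratio also accounts for sub-model uncertainty, which this sketch omits.

```python
import numpy as np
from sklearn.gaussian_process.kernels import RBF, ExpSineSquared

def periodicity_ratio(t, y, noise=1e-2):
    """Split a GP posterior mean under k = k_per + k_aper into its
    periodic and aperiodic parts; the energy fraction of the periodic
    part is a rough stand-in for the paper's periodicity ratio."""
    X = t.reshape(-1, 1)
    k_per = ExpSineSquared(length_scale=1.0, periodicity=24.0)  # assumed period
    k_aper = RBF(length_scale=50.0)
    K = k_per(X) + k_aper(X) + noise * np.eye(len(t))
    alpha = np.linalg.solve(K, y)
    mu_per = k_per(X) @ alpha     # posterior mean of the periodic sub-model
    mu_aper = k_aper(X) @ alpha   # posterior mean of the aperiodic sub-model
    return float(np.sum(mu_per**2) / (np.sum(mu_per**2) + np.sum(mu_aper**2)))
```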

  8. Metabonomics for detection of nuclear materials processing.

    Energy Technology Data Exchange (ETDEWEB)

    Alam, Todd Michael; Luxon, Bruce A. (University Texas Medical Branch); Neerathilingam, Muniasamy (University Texas Medical Branch); Ansari, S. (University Texas Medical Branch); Volk, David (University Texas Medical Branch); Sarkar, S. (University Texas Medical Branch); Alam, Mary Kathleen

    2010-08-01

    Tracking nuclear materials production and processing, particularly covert operations, is a key national security concern, given that nuclear materials processing can be a signature of nuclear weapons activities by US adversaries. Covert trafficking can also result in homeland security threats, most notably allowing terrorists to assemble devices such as dirty bombs. Existing methods depend on isotope analysis and do not necessarily detect chronic low-level exposure. In this project, indigenous organisms such as plants, small mammals, and bacteria are utilized as living sensors for the presence of chemicals used in nuclear materials processing. Such 'metabolic fingerprinting' (or 'metabonomics') employs nuclear magnetic resonance (NMR) spectroscopy to assess alterations in organismal metabolism provoked by the environmental presence of nuclear materials processing, for example the tributyl phosphate employed in the processing of spent reactor fuel rods to extract and purify uranium and plutonium for weaponization.

  9. Metabonomics for detection of nuclear materials processing

    International Nuclear Information System (INIS)

    Alam, Todd Michael; Luxon, Bruce A.; Neerathilingam, Muniasamy; Ansari, S.; Volk, David; Sarkar, S.; Alam, Mary Kathleen

    2010-01-01

    Tracking nuclear materials production and processing, particularly covert operations, is a key national security concern, given that nuclear materials processing can be a signature of nuclear weapons activities by US adversaries. Covert trafficking can also result in homeland security threats, most notably allowing terrorists to assemble devices such as dirty bombs. Existing methods depend on isotope analysis and do not necessarily detect chronic low-level exposure. In this project, indigenous organisms such as plants, small mammals, and bacteria are utilized as living sensors for the presence of chemicals used in nuclear materials processing. Such 'metabolic fingerprinting' (or 'metabonomics') employs nuclear magnetic resonance (NMR) spectroscopy to assess alterations in organismal metabolism provoked by the environmental presence of nuclear materials processing, for example the tributyl phosphate employed in the processing of spent reactor fuel rods to extract and purify uranium and plutonium for weaponization.

  10. BEAP profiles as rapid test system for status analysis and early detection of process incidents in biogas plants.

    Science.gov (United States)

    Refai, Sarah; Berger, Stefanie; Wassmann, Kati; Hecht, Melanie; Dickhaus, Thomas; Deppenmeier, Uwe

    2017-03-01

    A method was developed to quantify the performance of microorganisms involved in different digestion levels in biogas plants. The test system was based on the addition of butyrate (BCON), ethanol (ECON), acetate (ACON) or propionate (PCON) to biogas sludge samples and the subsequent analysis of CH4 formation in comparison to control samples. The combination of the four values was referred to as a BEAP profile. Determination of BEAP profiles enabled rapid testing of a biogas plant's metabolic state within 24 h and an accurate mapping of all degradation levels in a lab-scale experimental setup. Furthermore, it was possible to distinguish between specific BEAP profiles for standard biogas plants and for biogas reactors with process incidents (beginning of NH4+-N inhibition, start of acidification, insufficient hydrolysis and potential mycotoxin effects). Finally, BEAP profiles also functioned as a warning system for the early prediction of critical NH4+-N concentrations leading to a drop in CH4 formation.

  11. Process Dissociation and Mixture Signal Detection Theory

    Science.gov (United States)

    DeCarlo, Lawrence T.

    2008-01-01

    The process dissociation procedure was developed in an attempt to separate different processes involved in memory tasks. The procedure naturally lends itself to a formulation within a class of mixture signal detection models. The dual process model is shown to be a special case. The mixture signal detection model is applied to data from a widely…

  12. Tornado detection data reduction and analysis

    Science.gov (United States)

    Davisson, L. D.

    1977-01-01

    Data processing and analysis were provided in support of tornado detection by analysis of radio frequency interference in various frequency bands. Sea state determination data from short pulse radar measurements were also processed and analyzed. A backscatter simulation was implemented to predict radar performance as a function of wind velocity. Computer programs were developed for the various data processing and analysis goals of the effort.

  13. Post-factum detection of radiation treatment in processed food by analysis of radiation-induced hydrocarbons. Pt. 1. Applying the method L 06.00-37 defined in Para. 35 LMBG (German Act on Food Irradiation) to processed food

    International Nuclear Information System (INIS)

    Hartmann, M.; Ammon, J.; Berg, H.

    1995-01-01

    The German official method L 06.00-37 (Para. 35, German Act on Food Irradiation) is used for the identification of irradiated fat-containing food by GC analysis of radiation-induced hydrocarbons. Simple modifications in sample preparation allow a distinct improvement in detection possibilities and detection limits. The applicability of the modified method to the detection of irradiated ingredients in model processed foods is shown. Identification of only 3% irradiated ingredient (irradiated fat to total fat ratio; 1.5 kGy) in processed food was possible. Additionally, the kind of irradiated ingredient could be identified by the pattern of radiation-induced hydrocarbons, whose concentrations correspond to the fatty acid composition of the irradiated compound. (orig.)

  14. Badge Office Process Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-12

    The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.

  15. Evaluate the application of modal test and analysis processes to structural fault detection in MSFC-STS project elements

    Science.gov (United States)

    Springer, William T.

    1988-01-01

    The Space Transportation System (STS) is a very complex and expensive flight system intended to carry payloads into low Earth orbit and return. A catastrophic failure of the STS (such as the 51-L incident) results in the loss of both human life and very expensive hardware. One impact of that incident was to reaffirm the need to do everything possible to ensure that the integrity and reliability of the STS are sufficient to produce a safe flight. One means of achieving this goal is to expand the number of inspection technologies available for use on the STS. The purpose of this work was to begin evaluating the use of modal test and analysis processes for assessing the structural integrity of STS components for which Marshall Space Flight Center (MSFC) has responsibility. This entailed reviewing the available literature and defining a low-level experimental program which could be performed by MSFC and would help establish the feasibility of using this technology for structural fault detection.

  16. Process energy analysis

    International Nuclear Information System (INIS)

    Kaiser, V.

    1993-01-01

    In Chapter 2 process energy cost analysis for chemical processing is treated in a general way, independent of the specific form of energy and power production. Especially, energy data collection and data treatment, energy accounting (metering, balance setting), specific energy input, and utility energy costs and prices are discussed. (R.P.) 14 refs., 4 figs., 16 tabs

  17. BURAR: Detection and signal processing capabilities

    International Nuclear Information System (INIS)

    Ghica, Daniela; Radulian, Mircea; Popa, Mihaela

    2004-01-01

    Since July 2002, a new seismic monitoring station, the Bucovina Seismic Array (BURAR), has been installed in the northern part of Romania, in a joint effort of the Air Force Technical Applications Center, USA, and the National Institute for Earth Physics (NIEP), Romania. The array consists of 10 seismic sensors (9 short-period and one broadband) located in boreholes and distributed over a 5 x 5 km area. At present, the seismic data are continuously recorded by BURAR and transmitted in real time to the Romanian National Data Centre (ROM N DC), in Bucharest, and to the National Data Center of the USA, in Florida. The statistical analysis of the seismic information gathered at ROM N DC from BURAR in the August 2002 - December 2003 time interval points out a much better efficiency of the BURAR system in detecting teleseismic events and local events occurring in the N-NE part of Romanian territory, in comparison with the existing Romanian Telemetered Network. Furthermore, the BURAR monitoring system has proven to be an important source of reliable data for NIEP's efforts in elaborating the seismic bulletins. The signal processing capability of the system provides useful information for improving the location of local seismic events, using the array beamforming facility. This method increases significantly the signal-to-noise ratio of the seismic signal by summing up the coherent signals from the array components; in this way, possible source nucleation phases can be detected. At the same time, using the slowness and backazimuth estimations from f-k analysis, locations for seismic events can be determined based only on the information recorded by the BURAR array, acting as a single-station recording system. Additionally, f-k analysis techniques are useful for local site-effect estimation and interpretation of the local geological structure. (authors)
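
    Beamforming of the kind described amounts to delay-and-sum stacking for an assumed slowness and back azimuth; a rough sketch (integer-sample shifts, one common sign convention) follows. Coordinates in km and slowness in s/km are assumptions of this sketch.

```python
import numpy as np

def delay_and_sum(traces, fs, coords, slowness, backazimuth):
    """Align and stack array traces for a plane wave with the given
    horizontal slowness (s/km) and back azimuth (degrees).
    traces: (n_sta, n_samp); coords: (n_sta, 2) east/north offsets in km.
    Sign conventions for the delays vary between implementations."""
    az = np.deg2rad(backazimuth)
    direction = np.array([np.sin(az), np.cos(az)])  # toward the source
    delays = slowness * (coords @ direction)        # seconds per station
    shifts = np.round(delays * fs).astype(int)
    beam = np.zeros(traces.shape[1])
    for tr, s in zip(traces, shifts):
        beam += np.roll(tr, -s)     # crude integer-sample alignment (wraps)
    # Coherent signal grows ~n while incoherent noise grows ~sqrt(n).
    return beam / len(traces)
```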

  18. Chemical analysis of raw and processed Fructus arctii by high-performance liquid chromatography/diode array detection-electrospray ionization-mass spectrometry

    Science.gov (United States)

    Qin, Kunming; Liu, Qidi; Cai, Hao; Cao, Gang; Lu, Tulin; Shen, Baojia; Shu, Yachun; Cai, Baochang

    2014-01-01

    Background: In traditional Chinese medicine (TCM), raw and processed herbs are used to treat different diseases. Fructus Arctii, the dried fruit of Arctium lappa L. (Compositae), is widely used in TCM. Stir-frying is the most common processing method, and it might modify the chemical composition of Fructus Arctii. Materials and Methods: To test this hypothesis, we focused on analysis and identification of the main chemical constituents in raw and processed Fructus Arctii (PFA) by high-performance liquid chromatography/diode array detection-electrospray ionization-mass spectrometry. Results: The results indicated that there was less arctiin in stir-fried materials than in raw materials; however, there were higher levels of arctigenin in stir-fried materials than in raw materials. Conclusion: We suggest that arctiin is reduced significantly through thermal conversion of arctiin to arctigenin. This finding may shed some light on understanding the differences in the therapeutic values of raw versus PFA in TCM. PMID:25422559

  19. BURAR: Detection and signal processing capabilities

    International Nuclear Information System (INIS)

    Ghica, Daniela; Radulian, Mircea; Popa, Mihaela

    2004-01-01

    Since July 2002, a new seismic monitoring station, the Bucovina Seismic Array (BURAR), has been installed in the northern part of Romania, in a joint effort of the Air Force Technical Applications Center, USA, and the National Institute for Earth Physics (NIEP), Romania. The array consists of 10 seismic sensors (9 short-period and one broadband) located in boreholes and distributed over a 5 x 5 km² area. At present, the seismic data are continuously recorded by BURAR and transmitted in real time to the Romanian National Data Centre (ROM N DC), in Bucharest, and to the National Data Center of the USA, in Florida. The statistical analysis of the seismic information gathered at ROM N DC from BURAR in the August 2002 - December 2003 time interval points out a much better efficiency of the BURAR system in detecting teleseismic events and local events occurring in the N-NE part of Romanian territory, in comparison with the existing Romanian Telemetered Network. Furthermore, the BURAR monitoring system has proven to be an important source of reliable data for NIEP's efforts in issuing the seismic bulletins. The signal processing capability of the system provides useful information for improving the location of local seismic events, using the array beamforming procedure. This method increases significantly the signal-to-noise ratio by summing up the coherent signals from the array components; in this way, possible source nucleation phases can be detected. At the same time, using the slowness and back azimuth estimations from f-k analysis, locations for seismic events can be established based only on the information recorded by the BURAR array, acting as a single-station recording system. (authors)

  20. Matched Filter Processing for Asteroid Detection

    Science.gov (United States)

    Gural, Peter S.; Larsen, Jeffrey A.; Gleason, Arianna E.

    2005-10-01

    Matched filter (MF) processing has been shown to provide significant performance gains when processing stellar imagery used for asteroid detection, recovery, and tracking. This includes extending detection ranges to fainter magnitudes at the noise limit of the imagery and operating in dense cluttered star fields as encountered at low Galactic latitudes. The MF software has been shown to detect 40% more asteroids in high-quality Spacewatch imagery relative to the currently implemented approaches, which are based on moving target indicator (MTI) algorithms. In addition, MF detections were made in dense star fields and in situations in which the asteroid was collocated with a star in an image frame, cases in which the MTI algorithms failed. Thus, using legacy sensors and optics, improved detection sensitivity is achievable by simply upgrading the image-processing stream. This in turn permits surveys of the near-Earth asteroid (NEA) population farther from opposition, for smaller sizes, and in directions previously inaccessible to current NEA search programs. A software package has been developed and made available on the NASA data services Web site that can be used for asteroid detection and recovery operations utilizing the enhanced performance capabilities of MF processing.
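
    A minimal matched-filter detection step, correlating a background-subtracted frame against the expected point-spread function and thresholding the correlation surface, might look like this; a full moving-target search would additionally scan velocity hypotheses across frames, which this sketch omits.

```python
import numpy as np
from scipy.signal import fftconvolve

def matched_filter_detect(image, psf, k=5.0):
    """Correlate a background-subtracted frame with the expected PSF and
    flag peaks above k sigma of the correlation surface (k illustrative)."""
    template = (psf - psf.mean()) / np.linalg.norm(psf)
    # Convolution with the flipped template equals cross-correlation.
    corr = fftconvolve(image - np.median(image),
                       template[::-1, ::-1], mode="same")
    thresh = k * corr.std()
    peaks = np.argwhere(corr > thresh)
    return peaks, corr    # candidate (row, col) detections and the surface
```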

  1. Impact of e-alert for detection of acute kidney injury on processes of care and outcomes: protocol for a systematic review and meta-analysis.

    Science.gov (United States)

    Lachance, Philippe; Villeneuve, Pierre-Marc; Wilson, Francis P; Selby, Nicholas M; Featherstone, Robin; Rewa, Oleksa; Bagshaw, Sean M

    2016-05-05

    Acute kidney injury (AKI) is a common complication in hospitalised patients. It imposes significant risk of major morbidity and mortality, and patients suffering an episode of AKI consume considerable health resources. Recently, a number of studies have evaluated the implementation of automated electronic alerts (e-alerts), configured from electronic medical records (EMR) and clinical information systems (CIS), to warn healthcare providers of early or impending AKI in hospitalised patients. The impact of e-alerts on care processes, patient outcomes and health resource use, however, remains uncertain. We will perform a systematic review to describe and appraise e-alerts for AKI, and evaluate their impact on processes of care, clinical outcomes and health services use. In consultation with a research librarian, a search strategy will be developed and electronic databases (ie, MEDLINE, EMBASE, CINAHL, Cochrane Library and Inspec via Engineering Village) searched. Selected grey literature sources will also be searched. Search themes will focus on e-alerts and AKI. Citation screening, selection, quality assessment and data abstraction will be performed in duplicate. The primary analysis will be narrative; however, where feasible, pooled analysis will be performed. Each e-alert will be described according to trigger, type of alert, target recipient and degree of intrusiveness. Pooled effect estimates will be described, where applicable. Our systematic review will synthesise the literature on the value of e-alerts for detecting AKI and their impact on processes, patient-centred outcomes and resource use, and will also identify key knowledge gaps and barriers to implementation. This is a fundamental step in a broader research programme aimed at understanding the ideal structure of e-alerts, target population and methods for implementation, to derive benefit. Research ethics approval is not required for this review. CRD42016033033. Published by the BMJ Publishing Group Limited

  2. Chemical process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

    The Office of Worker Health and Safety (EH-5) under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE) has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  3. Entanglement of identical particles and the detection process

    DEFF Research Database (Denmark)

    Tichy, Malte C.; de Melo, Fernando; Kus, Marek

    2013-01-01

    We introduce detector-level entanglement, a unified entanglement concept for identical particles that takes into account the possible deletion of many-particle which-way information through the detection process. The concept implies a measure for the effective indistinguishability of the particles… statistical behavior depends on their initial entanglement. Our results show that entanglement cannot be attributed to a state of identical particles alone, but that the detection process has to be incorporated in the analysis.

  4. Process monitoring using optical ultrasonic wave detection

    International Nuclear Information System (INIS)

    Telschow, K.L.; Walter, J.B.; Garcia, G.V.; Kunerth, D.C.

    1989-01-01

    Optical ultrasonic wave detection techniques are being developed for process monitoring. An important limitation on optical techniques is that the material surface, in materials processing applications, is usually not a specular reflector and in many cases is totally diffusely reflecting. This severely degrades the light collected by the detection optics, greatly reducing its intensity and randomizing the phase of the reflected light. A confocal Fabry-Perot interferometer, which is sensitive to the Doppler frequency shift resulting from the surface motion and not to the phase of the collected light, is well suited to detecting ultrasonic waves in diffusely reflecting materials. This paper describes the application of this detector to the real-time monitoring of the sintering of ceramic materials. 8 refs., 5 figs

  5. Signal processing for boiling noise detection

    International Nuclear Information System (INIS)

    Ledwidge, T.J.; Black, J.L.

    1989-01-01

    The present paper deals with investigations of acoustic signals from a boiling experiment performed on the KNS I loop at KfK Karlsruhe. Signals have been analysed in the frequency as well as the time domain. Signal characteristics successfully used to detect the boiling process have been found in the time domain. (author). 6 refs, figs

  6. Network Anomaly Detection Based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Ali A. Ghorbani

    2008-11-01

    Signal processing techniques have recently been applied to analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network, where five attack types are successfully detected from over 30 million flows.
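
    As a concrete flavor of the wavelet step described above, the following minimal sketch (assuming the PyWavelets package; the Poisson traffic signal and the 3-sigma threshold are invented placeholders, not the paper's fifteen features) scores each sample by its deviation from a coarse wavelet approximation and flags large residuals as anomalies.

```python
# Minimal sketch of wavelet-based anomaly detection; assumes PyWavelets.
import numpy as np
import pywt

def wavelet_residual_score(signal, wavelet="db4", level=4):
    """Score each sample by its distance from a smooth wavelet approximation."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Keep only the coarse approximation by zeroing all detail coefficients.
    approx_only = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    baseline = pywt.waverec(approx_only, wavelet)[: len(signal)]
    return np.abs(signal - baseline)

rng = np.random.default_rng(0)
traffic = rng.poisson(100, 1024).astype(float)  # placeholder traffic feature
traffic[500:520] += 400                         # injected anomaly, e.g. a flood
score = wavelet_residual_score(traffic)
alarm = score > score.mean() + 3 * score.std()  # illustrative threshold
print("anomalous samples:", np.flatnonzero(alarm))
```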

  7. Network Anomaly Detection Based on Wavelet Analysis

    Science.gov (United States)

    Lu, Wei; Ghorbani, Ali A.

    2008-12-01

    Signal processing techniques have recently been applied to analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network, where five attack types are successfully detected from over 30 million flows.

  8. Linear discriminant analysis for welding fault detection

    International Nuclear Information System (INIS)

    Li, X.; Simpson, S.W.

    2010-01-01

    This work presents a new method for real time welding fault detection in industry based on Linear Discriminant Analysis (LDA). A set of parameters was calculated from one second blocks of electrical data recorded during welding and based on control data from reference welds under good conditions, as well as faulty welds. Optimised linear combinations of the parameters were determined with LDA and tested with independent data. Short arc welds in overlap joints were studied with various power sources, shielding gases, wire diameters, and process geometries. Out-of-position faults were investigated. Application of LDA fault detection to a broad range of welding procedures was investigated using a similarity measure based on Principal Component Analysis. The measure determines which reference data are most similar to a given industrial procedure and the appropriate LDA weights are then employed. Overall, results show that Linear Discriminant Analysis gives an effective and consistent performance in real-time welding fault detection.
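
    To illustrate the classification step, the following is a minimal sketch assuming scikit-learn; the eight-parameter feature blocks and the class separations are invented stand-ins for the electrical-signal parameters described above, not data from the study.

```python
# Minimal sketch of LDA-based weld fault detection; assumes scikit-learn.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
good = rng.normal(0.0, 1.0, (200, 8))    # parameters from reference welds
faulty = rng.normal(0.8, 1.2, (200, 8))  # parameters from faulty welds
X = np.vstack([good, faulty])
y = np.array([0] * 200 + [1] * 200)

# Fit the optimised linear combination of parameters (Fisher discriminant).
lda = LinearDiscriminantAnalysis().fit(X, y)

block = rng.normal(0.8, 1.2, (1, 8))     # one-second block of new welding data
print("fault" if lda.predict(block)[0] else "ok",
      "score:", lda.decision_function(block)[0])
```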

  9. PCB Fault Detection Using Image Processing

    Science.gov (United States)

    Nayak, Jithendra P. R.; Anitha, K.; Parameshachari, B. D., Dr.; Banu, Reshma, Dr.; Rashmi, P.

    2017-08-01

    The importance of the printed circuit board (PCB) inspection process has been magnified by the requirements of the modern manufacturing environment, where delivery of 100% defect-free PCBs is the expectation. To meet such expectations, identifying the various defects and their types becomes the first step. In this PCB inspection system, the inspection algorithm focuses on defect detection using natural images. Many practical issues, such as tilt of the images, bad lighting conditions and the height at which images are taken, must be considered to ensure image quality sufficient for defect detection. PCB fabrication is a multidisciplinary process, and etching is its most critical step: the main objective of etching is to remove the exposed, unwanted copper other than the required circuit pattern. To minimize scrap caused by wrongly etched PCB panels, inspection has to be done at an early stage. However, inspections are commonly done only after the etching process, where any defective PCB found is no longer useful and is simply thrown away. Since the etching process accounts for a substantial share of the cost of PCB fabrication, it is uneconomical to simply discard defective PCBs; the defects should therefore be identified before etching so that the panel can be reprocessed. In this paper, a method to identify defects in natural PCB images is presented and the associated practical issues are addressed using software tools; the major types of single-layer PCB defects considered include pattern cut, pin hole, pattern short and nick. The present approach is expected to improve the efficiency of the system in detecting defects even in low-quality images.

  10. Signal analysis for failure detection

    International Nuclear Information System (INIS)

    Parpaglione, M.C.; Perez, L.V.; Rubio, D.A.; Czibener, D.; D'Attellis, C.E.; Brudny, P.I.; Ruzzante, J.E.

    1994-01-01

    Several methods for the analysis of acoustic emission signals are presented. They are mainly oriented to the detection of changes in noisy signals and the characterization of higher-amplitude discrete pulses or bursts. The aim was to relate such changes and events to failure, cracking or wear in materials, the final goal being to obtain automatic means of detecting these changes and/or events. Performance evaluation was made using both simulated and laboratory test signals. The methods presented are the following: 1. Application of the Hopfield Neural Network (NN) model for classifying faults in pipes and detecting wear of a bearing. 2. Application of the Kohonen and Back Propagation Neural Network models to the same problem. 3. Application of Kalman filtering to determine the time occurrence of bursts. 4. Application of a bank of Kalman filters (KF) for failure detection in pipes. 5. Study of the amplitude distribution of signals for detecting changes in their shape. 6. Application of the entropy distance to measure differences between signals. (author). 10 refs, 11 figs
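
    Method 3 above, Kalman filtering to determine the time occurrence of bursts, can be sketched with a scalar filter that tracks the noise baseline and flags samples whose innovation is improbably large. This is a minimal illustration under assumed noise variances, not the authors' implementation.

```python
# Minimal sketch of burst detection with a scalar Kalman filter.
# All noise variances are illustrative assumptions.
import numpy as np

def kalman_burst_times(z, q=1e-4, r=1.0, k_sigma=5.0):
    x, p = z[0], 1.0           # state estimate (baseline) and its variance
    hits = []
    for t, zt in enumerate(z):
        p += q                 # predict: baseline follows a slow random walk
        innov = zt - x         # innovation of the new sample
        s = p + r              # innovation variance
        if innov**2 > (k_sigma**2) * s:
            hits.append(t)     # burst: flag it, do not let it pull the baseline
            continue
        k = p / s              # Kalman gain
        x += k * innov
        p *= (1 - k)
    return hits

rng = np.random.default_rng(2)
sig = rng.normal(0, 1, 2000)
sig[700] += 12.0
sig[1500] += 9.0               # injected acoustic-emission-like bursts
print(kalman_burst_times(sig))
```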

  11. Detection of radiation processing in onions

    International Nuclear Information System (INIS)

    Duchacek, V.

    1985-01-01

    Two varieties of onions were used for irradiation. Both varieties were divided into two parts: the first was irradiated with a dose of 80 Gy and the second served as a control, with the two parts stored under the same conditions. Conductometry, liquid chromatography and spectrophotometry were used for detecting the radiation processing of the onions. Only the spectrophotometric determination of 2-deoxysaccharides made it possible to reliably distinguish irradiated onions from non-irradiated controls throughout the storage time. (E.S.)

  12. Subband Energy Detection in Passive Array Processing

    National Research Council Canada - National Science Library

    Bono, Michael

    2000-01-01

    ...), which includes both Subband Peak Energy Detection (SPED) and Subband Extrema Energy Detection (SEED). It will be shown that SED has several performance advantages over Conventional Energy Detection...

  13. Maintenance Process Strategic Analysis

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, M.; Stachowiak, A.

    2016-08-01

    The performance and competitiveness of manufacturing companies depend on the availability, reliability and productivity of their production facilities. Low productivity, downtime, and poor machine performance are often linked to inadequate plant maintenance, which in turn can lead to reduced production levels, increasing costs, lost market opportunities, and lower profits. These pressures have given firms worldwide the motivation to explore and embrace proactive maintenance strategies over the traditional reactive firefighting methods. The traditional view of maintenance has shifted into an overall view that encompasses Overall Equipment Efficiency, stakeholder management and life cycle assessment. From a practical point of view, this requires changes in managers' approach to maintenance and in the actions performed within the maintenance area. Managers have to understand that maintenance is not only about repairing and conserving machines and devices, but also about striving for more efficient resource management and caring for the safety and health of employees. The purpose of this work is to present a strategic analysis based on SWOT analysis to identify the opportunities and strengths of the maintenance process, so as to benefit from them as much as possible, as well as its weaknesses and threats, so that they can be eliminated or minimized.

  14. Detection of microparticles in dynamic processes

    International Nuclear Information System (INIS)

    Ten, K A; Pruuel, E R; Kashkarov, A O; Rubtsov, I A; Shechtman, L I; Zhulanov, V V; Tolochko, B P; Rykovanov, G N; Muzyrya, A K; Smirnov, E B; Stolbikov, M Yu; Prosvirnin, K M

    2016-01-01

    When a metal plate is subjected to a strong shock impact, its free surface emits a flow of particles of different sizes (shock-wave “dusting”). Traditionally, the dusting process is investigated by pulsed x-ray methods, piezoelectric sensors or optical techniques. The particle size ranges from a few microns to hundreds of microns; the flow is assumed to also include finer particles, which cannot yet be detected with the existing methods. On the VEPP-3-VEPP-4 accelerator complex at the BINP there are two experimental stations for research on fast processes, including explosive ones. The stations enable measurement of both transmitted radiation (absorption) and small-angle x-ray scattering using synchrotron radiation (SR). Radiation is detected with a precision high-speed detector, DIMEX. The detector has an internal memory of 32 frames, which enables recording of the dynamics of the process (shooting of movies) with intervals of 250 ns to 2 μs. Flows of nano- and microparticles from the free surfaces of various materials (copper and tin) have been examined. Microparticle flows were emitted from grooves of 50-200 μm in size and from joints (gaps) between metal parts. With the soft x-ray spectrum of SR one can explore the dynamics of a single microjet of micron size; the dynamics of the density distribution along microjets were determined. Under a shock wave (∼60 GPa) acting on tin disks, flows of microparticles from a smooth surface were recorded. (paper)

  15. Amalgamation of Anomaly-Detection Indices for Enhanced Process Monitoring

    KAUST Repository

    Harrou, Fouzi

    2016-01-29

    Accurate and effective anomaly detection and diagnosis of modern industrial systems are crucial for ensuring reliability and safety and for maintaining desired product quality. Anomaly detection based on principal component analysis (PCA) has been studied intensively and largely applied to multivariate processes with highly cross-correlated process variables; however, conventional PCA-based methods often fail to detect small or moderate anomalies. In this paper, the proposed approach integrates two popular process-monitoring detection tools: the conventional PCA-based monitoring indices, Hotelling's T2 and Q, and the exponentially weighted moving average (EWMA). We develop two EWMA tools based on the Q and T2 statistics, T2-EWMA and Q-EWMA, to detect anomalies in the process mean. The performances of the proposed methods were compared with that of conventional PCA-based anomaly-detection methods by applying each method to two examples: a synthetic data set and experimental data collected from a flow heating system. The results clearly show the benefits and effectiveness of the proposed methods over conventional PCA-based methods.
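
    A minimal sketch of the combined scheme follows, assuming only NumPy; the number of retained components, the EWMA smoothing constant and the injected mean shift are illustrative assumptions rather than the paper's settings.

```python
# Minimal sketch: PCA-based T2 and Q statistics smoothed with an EWMA.
import numpy as np

def pca_t2_q(X_train, X_new, n_pc=2):
    mu, sd = X_train.mean(0), X_train.std(0)
    Z = (X_train - mu) / sd
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_pc].T                       # retained loadings
    lam = (S[:n_pc] ** 2) / (len(Z) - 1)  # retained eigenvalues
    Znew = (X_new - mu) / sd
    T = Znew @ P                          # scores of new data
    t2 = np.sum(T**2 / lam, axis=1)       # Hotelling's T2
    resid = Znew - T @ P.T
    q = np.sum(resid**2, axis=1)          # Q (squared prediction error)
    return t2, q

def ewma(x, lam=0.2):
    out = np.empty_like(x)
    s = x[0]
    for i, v in enumerate(x):
        s = lam * v + (1 - lam) * s       # exponentially weighted smoothing
        out[i] = s
    return out

rng = np.random.default_rng(3)
train = rng.normal(size=(500, 5))
test = rng.normal(size=(200, 5))
test[100:] += 0.4                         # small mean shift, hard for raw Q/T2
t2, q = pca_t2_q(train, test)
print("Q-EWMA after shift:", ewma(q)[150:].mean(), "before:", ewma(q)[:50].mean())
```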

  16. Early skin tumor detection from microscopic images through image processing

    International Nuclear Information System (INIS)

    Siddiqi, A.A.; Narejo, G.B.; Khan, A.M.

    2017-01-01

    This research provides an appropriate technique for skin tumor detection. The work is done using the image processing toolbox of MATLAB. Skin tumors are unwanted skin growths with different causes and varying extents of malignancy; they are conditions in which skin cells lose the ability to divide and grow normally. Early detection of a tumor is the most important factor affecting patient survival. Studying the pattern of the skin cells is a fundamental problem in medical image analysis, and the study of skin tumors has been of great interest to researchers. Digital image processing (DIP) allows the use of much more complex algorithms and hence can offer both more sophisticated performance at simple tasks and the implementation of methods which would be impossible by analog means. It allows a much wider range of algorithms to be applied to the input data and can avoid problems such as the build-up of noise and signal distortion during processing. The study shows that little work has been done on the cellular scale for images of skin. This research proposes checks for the early detection of skin tumors using microscopic images, after testing and observing various algorithms. Analytical evaluation shows that the proposed checks are time-efficient and appropriate for tumor detection; the algorithm applied provides promising results in less time and with good accuracy. The GUI (graphical user interface) generated for the algorithm makes the system user-friendly. (author)

  17. Acoustic analysis assessment in speech pathology detection

    Directory of Open Access Journals (Sweden)

    Panek Daria

    2015-09-01

    Automatic detection of voice pathologies enables non-invasive, low-cost and objective assessment of the presence of disorders, as well as accelerating and improving the process of diagnosis and the clinical treatment given to patients. In this work, a vector made up of 28 acoustic parameters is evaluated using principal component analysis (PCA), kernel principal component analysis (kPCA) and an auto-associative neural network (NLPCA) in four kinds of pathology detection (hyperfunctional dysphonia, functional dysphonia, laryngitis, vocal cord paralysis) using the a, i and u vowels spoken at a high, low and normal pitch. The results indicate that the kPCA and NLPCA methods can be considered a step towards pathology detection of the vocal folds. The results show that such an approach provides acceptable results for this purpose, with the best efficiency levels of around 100%. The study brings together the most commonly used approaches to speech signal processing and leads to a comparison of the machine learning methods determining the health status of the patient.

  18. Automatic detection of NIL defects using microscopy and image processing

    KAUST Repository

    Pietroy, David

    2013-12-01

    Nanoimprint Lithography (NIL) is a promising technology for low-cost and large-scale nanostructure fabrication. The technique is based on a contact molding-demolding process that can produce a number of defects such as incomplete filling, negative patterns and sticking. In this paper, microscopic imaging combined with a specific processing algorithm is used to numerically detect defects in printed patterns. Results obtained for 1D and 2D imprinted gratings with different microscopic image magnifications are presented. The results are independent of the device which captures the image (optical, confocal or electron microscope). The use of numerical images makes it possible to automate the detection and to compute a statistical analysis of defects. This method provides a fast analysis of printed gratings and could be used to monitor the production of such structures.

  19. Detection of Glaucoma Using Image Processing Techniques: A Critique.

    Science.gov (United States)

    Kumar, B Naveen; Chauhan, R P; Dahiya, Nidhi

    2018-01-01

    The primary objective of this article is to present a summary of the different types of image processing methods employed for the detection of glaucoma, a serious eye disease. Glaucoma affects the optic nerve: retinal ganglion cells die, and this leads to loss of vision. The principal cause is an increase in intraocular pressure, which occurs in open-angle and angle-closure glaucoma, the two major types affecting the optic nerve. In the early stages of glaucoma, no perceptible symptoms appear; as the disease progresses, vision starts to become hazy, leading to blindness. Therefore, early detection of glaucoma is needed for prevention. Manual analysis of ophthalmic images is fairly time-consuming, and accuracy depends on the expertise of the professionals. Automatic analysis of retinal images is therefore an important tool, aiding in the detection, diagnosis, and prevention of risks associated with the disease. Fundus images obtained from a fundus camera have been used for the analysis. Requisite pre-processing techniques have been applied to the images and, depending upon the technique, various classifiers have been used to detect glaucoma. The techniques reviewed here have certain advantages and disadvantages; based on this study, one can determine which technique provides an optimum result.

  20. Trace detection of organic compounds in complex sample matrixes by single photon ionization ion trap mass spectrometry: real-time detection of security-relevant compounds and online analysis of the coffee-roasting process.

    Science.gov (United States)

    Schramm, Elisabeth; Kürten, Andreas; Hölzer, Jasper; Mitschke, Stefan; Mühlberger, Fabian; Sklorz, Martin; Wieser, Jochen; Ulrich, Andreas; Pütz, Michael; Schulte-Ladbeck, Rasmus; Schultze, Rainer; Curtius, Joachim; Borrmann, Stephan; Zimmermann, Ralf

    2009-06-01

    An in-house-built ion trap mass spectrometer combined with a soft ionization source has been set up and tested. As the ionization source, an electron-beam-pumped vacuum-UV (VUV) excimer lamp (EBEL) was used for single-photon ionization. It was shown that soft ionization reduces fragmentation of the target analytes and suppresses most matrix components. Therefore, the combination of photon ionization with the tandem mass spectrometry (MS/MS) capability of an ion trap yields a powerful tool for molecular-ion peak detection and the identification of organic trace compounds in complex matrixes. This setup was successfully tested in two different applications. The first is the detection of security-relevant substances such as explosives, narcotics, and chemical warfare agents: one test substance from each of these groups was chosen and detected successfully with single-photon-ionization ion trap mass spectrometry (SPI-ITMS) MS/MS measurements, and first tests demonstrated that the method is not influenced by matrix compounds. The second field of application is the detection of process gases: here, exhaust gas from coffee roasting was analyzed in real time, and some of its compounds were identified using MS/MS studies.

  1. Automated Windowing Processing for Pupil Detection

    National Research Council Canada - National Science Library

    Ebisawa, Y

    2001-01-01

    .... The pupil center in the video image is a focal point to determine the eye gaze. Recently, to improve the disadvantages of traditional pupil detection methods, a pupil detection technique using two light sources (LEDs...

  2. Fake currency detection using image processing

    Science.gov (United States)

    Agasti, Tushar; Burand, Gajanan; Wade, Pratik; Chitra, P.

    2017-11-01

    The advancement of color printing technology has increased the rate of counterfeit note printing and the duplication of currency notes on a very large scale. A few years ago printing could only be done in a print house, but now anyone can print a currency note with high accuracy using a simple laser printer. As a result, the circulation of fake notes in place of genuine ones has increased greatly. India has unfortunately been cursed with problems like corruption and black money, and the counterfeiting of currency notes is a big problem as well. This motivates the design of a system that detects fake currency notes quickly and efficiently. The proposed system gives an approach to verify Indian currency notes; verification is done using the concepts of image processing. This article describes the extraction of various features of Indian currency notes, with MATLAB software used for the feature extraction. The proposed system has advantages like simplicity and high processing speed. The result predicts whether the currency note is genuine or fake.

  3. Social Media Sentiment Analysis and Topic Detection for Singapore English

    Science.gov (United States)

    2013-09-01

    …have been made possible via social-media applications. Sentiment analysis and topic detection are two growing areas in Natural Language Processing, and there are increasing…

  4. Analysis of processing contaminants in edible oils. Part 1. Liquid chromatography-tandem mass spectrometry method for the direct detection of 3-monochloropropanediol monoesters and glycidyl esters.

    Science.gov (United States)

    MacMahon, Shaun; Mazzola, Eugene; Begley, Timothy H; Diachenko, Gregory W

    2013-05-22

    A new analytical method has been developed and validated for the detection of glycidyl esters (GEs) and 3-monochloropropanediol (3-MCPD) monoesters in edible oils. The target compounds represent two classes of potentially carcinogenic chemical contaminants formed during the processing of edible oils. Target analytes are separated from edible oil matrices using a two-step solid-phase extraction (SPE) procedure. The extracts are then analyzed using liquid chromatography-tandem mass spectrometry (LC-MS/MS) with electrospray ionization (ESI). Chromatographic conditions that separate sn-1 and sn-2 monoesters of 3-MCPD have been developed for the first time. The method has been validated for GEs, sn-1 3-MCPD monoesters of lauric, myristic, linolenic, linoleic, oleic, and stearic acids, and sn-2 3-MCPD monoesters of oleic and palmitic acids in coconut, olive, and palm oils using an external calibration curve. The range of average recoveries and relative standard deviations (RSDs) across the three oil matrices at three spiking concentrations are 84-115% (3-16% RSD) for the GEs, 95-113% (1-10% RSD) for the sn-1 3-MCPD monoesters, and 76.8-103% (5.1-11.2% RSD) for the sn-2 3-MCPD monoesters, with limits of quantitation at or below 30 ng/g for the GEs, 60 ng/g for sn-1 3-MCPD monoesters, and 180 ng/g for sn-2 3-MCPD monoesters.

  5. Analysis of processing contaminants in edible oils. Part 2. Liquid chromatography-tandem mass spectrometry method for the direct detection of 3-monochloropropanediol and 2-monochloropropanediol diesters.

    Science.gov (United States)

    MacMahon, Shaun; Begley, Timothy H; Diachenko, Gregory W

    2013-05-22

    A method was developed and validated for the detection of fatty acid diesters of 2-monochloropropanediol (2-MCPD) and 3-monochloropropanediol (3-MCPD) in edible oils. These analytes are potentially carcinogenic chemical contaminants formed during edible oil processing. After separation from oil matrices using a two-step solid-phase extraction (SPE) procedure, the target compounds are quantitated using liquid chromatography-tandem mass spectrometry (LC-MS/MS) with electrospray ionization (ESI). The first chromatographic conditions have been developed that separate intact diesters of 2-MCPD and 3-MCPD, allowing for their individual quantitation. The method has been validated for 28 3-MCPD diesters of lauric, myristic, palmitic, linolenic, linoleic, oleic, and stearic acids in coconut, olive, and palm oils, as well as 3 2-MCPD diesters, using an external calibration curve. The range of average recoveries and relative standard deviations (RSDs) across the three oil matrices at three spiking concentrations are 88-118% (2-16% RSD) with maximum limits of quantitation of 30 ng/g (ppb).

  6. Chemical detection, identification, and analysis system

    International Nuclear Information System (INIS)

    Morel, R.S.; Gonzales, D.; Mniszewski, S.

    1990-01-01

    The chemical detection, identification, and analysis system (CDIAS) has three major goals. The first is to display safety information regarding chemical environment before personnel entry. The second is to archive personnel exposure to the environment. Third, the system assists users in identifying the stage of a chemical process in progress and suggests safety precautions associated with that process. In addition to these major goals, the system must be sufficiently compact to provide transportability, and it must be extremely simple to use in order to keep user interaction at a minimum. The system created to meet these goals includes several pieces of hardware and the integration of four software packages. The hardware consists of a low-oxygen, carbon monoxide, explosives, and hydrogen sulfide detector; an ion mobility spectrometer for airborne vapor detection; and a COMPAQ 386/20 portable computer. The software modules are a graphics kernel, an expert system shell, a data-base management system, and an interface management system. A supervisory module developed using the interface management system coordinates the interaction of the other software components. The system determines the safety of the environment using conventional data acquisition and analysis techniques. The low-oxygen, carbon monoxide, hydrogen sulfide, explosives, and vapor detectors are monitored for hazardous levels, and warnings are issued accordingly

  7. Computerization of the safeguards analysis decision process

    International Nuclear Information System (INIS)

    Ehinger, M.H.

    1990-01-01

    This paper reports that safeguards regulations are evolving to meet new demands for timeliness and sensitivity in detecting the loss or unauthorized use of sensitive nuclear materials. The opportunities to meet new rules, particularly in bulk processing plants, involve developing techniques which use modern, computerized process control and information systems. Using these computerized systems in the safeguards analysis involves all the challenges of the man-machine interface experienced in the typical process control application and adds new dimensions to accuracy requirements, data analysis, and alarm resolution in the regulatory environment

  8. Calculation of the detection limits for radionuclides identified in gamma-ray spectra based on post-processing peak analysis results.

    Science.gov (United States)

    Korun, M; Vodenik, B; Zorko, B

    2018-03-01

    A new method for calculating the detection limits of gamma-ray spectrometry measurements is presented. The method is applicable to gamma-ray emitters irrespective of the influence of a peaked background, the origin of the background, or overlap with other peaks. For multi-gamma-ray emitters it offers the opportunity to calculate a common detection limit corresponding to several peaks. The detection limit is calculated by approximating the dependence of the uncertainty of the indication on its value with a second-order polynomial. In this approach the relation between the input quantities and the detection limit is described by an explicit expression and can be easily investigated. The detection limit is calculated from the data usually provided in the reports of peak-analysis programs, namely the peak areas and their uncertainties; as a result, the need to use individual channel contents for calculating the detection limit is bypassed.
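
    The general shape of such a calculation can be sketched as follows. The sketch assumes an ISO 11929-style implicit detection-limit equation solved by fixed-point iteration rather than the authors' explicit expression, and the coefficients of the second-order uncertainty model u(y) = c0 + c1·y + c2·y² are made-up placeholders.

```python
# Minimal sketch: detection limit from a quadratic model of the
# uncertainty of the indication, via fixed-point iteration.
import numpy as np

def detection_limit(c0, c1, c2, k=1.645, tol=1e-9):
    u = lambda y: c0 + c1 * y + c2 * y**2  # fitted u(y) model (placeholder)
    y_star = k * u(0.0)                    # decision threshold at null indication
    y = y_star
    for _ in range(100):                   # iterate y# = y* + k * u(y#)
        y_next = y_star + k * u(y)
        if abs(y_next - y) < tol:
            return y_next
        y = y_next
    return y

print(detection_limit(c0=4.0, c1=0.05, c2=1e-4))
```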

  9. Automatic Gap Detection in Friction Stir Welding Processes (Preprint)

    National Research Council Canada - National Science Library

    Yang, Yu; Kalya, Prabhanjana; Landers, Robert G; Krishnamurthy, K

    2006-01-01

    .... This paper develops a monitoring algorithm to detect gaps in Friction Stir Welding (FSW) processes. Experimental studies are conducted to determine how the process parameters and the gap width affect the welding process...

  10. Traffic sign detection and analysis

    DEFF Research Database (Denmark)

    Møgelmose, Andreas; Trivedi, Mohan M.; Moeslund, Thomas B.

    2012-01-01

    Traffic sign recognition (TSR) is a research field that has seen much activity in the recent decade. This paper introduces the problem and presents 4 recent papers on traffic sign detection and 4 recent papers on traffic sign classification. It attempts to extract recent trends in the field...

  11. Evaluation of signal processing for boiling noise detection

    International Nuclear Information System (INIS)

    Black, J.L.; Ledwidge, T.J.

    1989-01-01

    As part of the co-ordinated research programme on the detection of sodium boiling some further analysis has been performed on the data from the test loop in Karlsruhe and some preliminary analysis of the data from the BOR 60 experiment. The work on the Karlsruhe data is concerned with the search for a reliable method by which the quality of signal processing strategies may be compared. The results show that the three novel methods previously reported are all markedly superior to the mean square method which is used as a benchmark. The three novel methods are nth order differentiation in the frequency domain, the mean square prediction based on nth order conditional expectation and the nth order probability density function. A preliminary analysis on the data from the BOR 60 reactor shows that 4th order differentiation is adequate for the detection of signals derived from a pressure transducer and that the map of spurious trip probability (S) and the probability of missing an event (M) is consistent with the theoretical model proposed herein, and the suggested procedures for evaluating the quality of detection strategies. (author). 15 figs, 1 tab

  12. Surface Distresses Detection of Pavement Based on Digital Image Processing

    OpenAIRE

    Ouyang , Aiguo; Luo , Chagen; Zhou , Chao

    2010-01-01

    Pavement cracking is the main form of early pavement distress. The use of digital photography to record pavement images and subsequent crack detection and classification has undergone continuous improvement over the past decade. Digital image processing has been applied to pavement crack detection for its advantages of large information content and automatic detection. The applications of digital image processing in pavement crack detection, distress classificati…

  13. Detection of irradiated spices by thermoluminescence analysis

    International Nuclear Information System (INIS)

    Hammerton, K.M.; Banos, C.

    1996-01-01

    Spices are used extensively in prepared foods. The high levels of contamination of many spices with microorganisms pose a problem for the food industry, and irradiation treatment is the most effective means of reducing the microbial load to safe levels. Although the process is currently subject to a moratorium in Australia, it is used in several countries for the decontamination of spices. Methods for detecting irradiation treatment of spices are necessary to enforce compliance with labelling requirements or with a prohibition on the sale of irradiated foods. Thermoluminescence (TL) analysis of spice samples has been shown to be an applicable method for the detection of all irradiated spices. It was established that the TL response originates from the adhering mineral dust in the sample. Definitive identification of many irradiated spices requires the separation of a mineral extract from the organic fraction of the spice sample; this separation can be achieved by density centrifugation with a heavy liquid, sodium polytungstate. Clear discrimination between untreated and irradiated spice samples has been obtained by re-irradiating the mineral extract after the first TL analysis with an absorbed dose of about 1 kGy (normalisation). The ratio of the first to the second TL response was about one for irradiated samples and well below one for untreated samples. These methods have been investigated with a range of spices to establish the most suitable method for routine control purposes. (author)

  14. Detection and defect correction of operating process

    International Nuclear Information System (INIS)

    Vasendina, Elena; Plotnikova, Inna; Levitskaya, Anastasiya; Kvesko, Svetlana

    2016-01-01

    The article is devoted to the current problem of raising enterprise competitiveness under the harsh, highly competitive terms of today's business environment. The importance of modern equipment for the detection of defects and their correction is explained. Chipboard production is used as the object of research. A short description and the main results of an efficiency estimation of the enterprises' innovative solutions are presented. (paper)

  15. Amalgamation of Anomaly-Detection Indices for Enhanced Process Monitoring

    KAUST Repository

    Harrou, Fouzi; Sun, Ying; Khadraoui, Sofiane

    2016-01-01

    Accurate and effective anomaly detection and diagnosis of modern industrial systems are crucial for ensuring reliability and safety and for maintaining desired product quality. Anomaly detection based on principal component analysis (PCA) has been

  16. Big Data Analysis of Manufacturing Processes

    Science.gov (United States)

    Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert

    2015-11-01

    The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results.

  17. Big Data Analysis of Manufacturing Processes

    International Nuclear Information System (INIS)

    Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert

    2015-01-01

    The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results. (paper)

  18. Semantic multimedia analysis and processing

    CERN Document Server

    Spyrou, Evaggelos; Mylonas, Phivos

    2014-01-01

    Broad in scope, Semantic Multimedia Analysis and Processing provides a complete reference of techniques, algorithms, and solutions for the design and the implementation of contemporary multimedia systems. Offering a balanced, global look at the latest advances in semantic indexing, retrieval, analysis, and processing of multimedia, the book features the contributions of renowned researchers from around the world. Its contents are based on four fundamental thematic pillars: 1) information and content retrieval, 2) semantic knowledge exploitation paradigms, 3) multimedia personalization, and 4)

  19. Crack Length Detection by Digital Image Processing

    DEFF Research Database (Denmark)

    Lyngbye, Janus; Brincker, Rune

    1990-01-01

    It is described how digital image processing is used for measuring the length of fatigue cracks. The system is installed in a Personal Computer equipped with image processing hardware and performs automated measuring on plane metal specimens used in fatigue testing. Normally one cannot achieve… a resolution better than that of the image processing equipment. To overcome this problem an extrapolation technique is used, resulting in a better resolution. The system was tested on a specimen loaded with different loads. The error σa was less than 0.031 mm, which is of the same size as human measuring…

  20. Crack Detection by Digital Image Processing

    DEFF Research Database (Denmark)

    Lyngbye, Janus; Brincker, Rune

    It is described how digital image processing is used for measuring the length of fatigue cracks. The system is installed in a Personal Computer equipped with image processing hardware and performs automated measuring on plane metal specimens used in fatigue testing. Normally one cannot achieve… a resolution better than that of the image processing equipment. To overcome this problem an extrapolation technique is used, resulting in a better resolution. The system was tested on a specimen loaded with different loads. The error σa was less than 0.031 mm, which is of the same size as human measuring…

  1. Employing image processing techniques for cancer detection using microarray images.

    Science.gov (United States)

    Dehghan Khalilabad, Nastaran; Hassanpour, Hamid

    2017-02-01

    Microarray technology is a powerful genomic tool for simultaneously studying and analyzing the behavior of thousands of genes. The analysis of images obtained from this technology plays a critical role in the detection and treatment of diseases. The aim of the current study is to develop an automated system for analyzing data from microarray images in order to detect cancerous cases. The proposed system consists of three main phases, namely image processing, data mining, and detection of the disease. The image processing phase performs operations such as refining image rotation, gridding (locating genes) and extracting raw data from the images; the data mining phase includes normalizing the extracted data and selecting the more effective genes. Finally, using the extracted data, cancerous cells are recognized. To evaluate the performance of the proposed system, a microarray database is employed which includes breast cancer, myeloid leukemia and lymphoma cases from the Stanford Microarray Database. The results indicate that the proposed system is able to identify the type of cancer from the data sets with accuracies of 95.45%, 94.11%, and 100%, respectively.

  2. Image processing and analysis software development

    International Nuclear Information System (INIS)

    Shahnaz, R.

    1999-01-01

    The work presented in this project is aimed at developing software, 'IMAGE GALLERY', to investigate various image processing and analysis techniques. The work was divided into two parts, namely image processing techniques and pattern recognition, the latter comprising character and face recognition. Various image enhancement techniques, including negative imaging, contrast stretching, dynamic-range compression, and neon, diffuse and emboss effects, have been studied. Segmentation techniques, including point detection, line detection and edge detection, have been studied, and some smoothing and sharpening filters have been investigated. All these imaging techniques have been implemented in a window-based computer program written in Visual Basic. Neural network techniques based on the Perceptron model have been applied for face and character recognition. (author)

  3. Sequential Analysis: Hypothesis Testing and Changepoint Detection

    Science.gov (United States)

    2014-07-11

    maintains the flexibility of deciding sooner than the fixed-sample-size procedure at the price of somewhat lower power. The sequential probability… markets, detection of signals with unknown arrival time in seismology, navigation, radar and sonar signal processing, speech segmentation, and the… skimming cruise missile can yield a significant increase in the probability of raid annihilation. Furthermore, usually detection systems are

  4. Advanced Signal Processing for Thermal Flaw Detection; TOPICAL

    International Nuclear Information System (INIS)

    VALLEY, MICHAEL T.; HANSCHE, BRUCE D.; PAEZ, THOMAS L.; URBINA, ANGEL; ASHBAUGH, DENNIS M.

    2001-01-01

    Dynamic thermography is a promising technology for inspecting metallic and composite structures used in high-consequence industries. However, the reliability and inspection sensitivity of this technology have historically been limited by the need for extensive operator experience and by the use of human judgment and visual acuity to detect flaws in the large volume of infrared image data collected. To overcome these limitations, new automated data analysis algorithms and software are needed. The primary objectives of this research effort were to develop a data processing methodology that is tied to the underlying physics, that reduces or removes the data interpretation requirements, and that eliminates the need to look at significant numbers of data frames to determine whether a flaw is present. Considering the strengths and weaknesses of previous research efforts, this research elected to couple the temporal and spatial attributes of the surface temperature. Of the possible algorithms investigated, the best performing was a radiance-weighted root-mean-square Laplacian metric that included a multiplicative surface-effect correction factor and a novel spatio-temporal parametric model for data smoothing. This metric demonstrated the potential for detecting flaws smaller than 0.075 inch in inspection areas on the order of one square foot. Included in this report are the development of a thermal imaging model, a weighted-least-squares thermal data smoothing algorithm, simulation and experimental flaw detection results, and an overview of the ATAC (Automated Thermal Analysis Code) software that was developed to analyze thermal inspection data
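
    A rough sketch of a metric in this spirit is given below, assuming SciPy; the Gaussian smoothing stands in for the report's spatio-temporal parametric model, and the radiance weighting and synthetic hot spot are illustrative choices rather than the ATAC implementation.

```python
# Minimal sketch of a radiance-weighted RMS-Laplacian metric for a
# thermal image sequence; assumes SciPy.
import numpy as np
from scipy.ndimage import gaussian_filter, laplace

def flaw_metric(frames):
    """frames: (t, y, x) surface-temperature images; returns one 2D map."""
    acc = np.zeros(frames.shape[1:])
    for f in frames:
        f_s = gaussian_filter(f, sigma=1.5)  # spatial smoothing stand-in
        lap = laplace(f_s)                   # spatial curvature of temperature
        acc += f_s * lap**2                  # radiance-weighted squared Laplacian
    return np.sqrt(acc / len(frames))        # RMS over the time sequence

rng = np.random.default_rng(4)
seq = rng.normal(300.0, 0.1, (50, 64, 64))   # background frames
seq[:, 30:34, 30:34] += 0.8                  # hot spot over a notional flaw
m = flaw_metric(seq)
print("metric at flaw vs background:", m[31, 31], m[5, 5])
```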

  5. Detecting jaundice by using digital image processing

    Science.gov (United States)

    Castro-Ramos, J.; Toxqui-Quitl, C.; Villa Manriquez, F.; Orozco-Guillen, E.; Padilla-Vivanco, A.; Sánchez-Escobar, JJ.

    2014-03-01

    When strong jaundice is present, babies or adults are usually subjected to clinical exams such as serum bilirubin measurement, which can be traumatic for patients. Jaundice often appears in liver diseases such as hepatitis or liver cancer. In order to avoid additional trauma, we propose to detect jaundice (icterus) in newborns or adults using a painless method. By acquiring digital color images of the palms, soles and forehead, we analyze RGB attributes and diffuse reflectance spectra as the parameters to characterize patients with or without jaundice, and we correlate those parameters with the bilirubin level. By applying a support vector machine we distinguish between healthy and sick patients.
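
    The classification step could look like the following minimal sketch, assuming scikit-learn; the mean-RGB features and the synthetic "yellow shift" of the icteric class are invented placeholders for real skin-image data.

```python
# Minimal sketch of RGB-feature classification with an SVM; assumes scikit-learn.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(5)
healthy = rng.normal([180, 140, 120], 10, (100, 3))  # mean R, G, B per patient
icteric = rng.normal([195, 160, 90], 10, (100, 3))   # yellowish skin: lower blue
X = np.vstack([healthy, icteric])
y = np.array([0] * 100 + [1] * 100)

clf = SVC(kernel="rbf").fit(X, y)                    # train the decision rule
new_patient = np.array([[192, 158, 95]])             # hypothetical measurement
print("jaundice suspected" if clf.predict(new_patient)[0] else "healthy")
```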

  6. Processing of Graphene combining Optical Detection and Scanning Probe Lithography

    Directory of Open Access Journals (Sweden)

    Zimmermann Sören

    2015-01-01

    This paper presents an experimental setup tailored for robotic processing of graphene with in-situ vision-based control. A robust graphene detection approach is presented, applying multiple image processing operations to the visual feedback provided by a high-resolution light microscope. Detected graphene flakes can be modified using a scanning-probe-based lithographic process that is directly linked to the in-situ optical images. The results of this process are discussed with respect to further application scenarios.

  7. Signal Detection Theory-Based Information Processing for the Detection of Breast Cancer at Microwave Frequencies

    National Research Council Canada - National Science Library

    Nolte, Loren

    2002-01-01

    The hypothesis is that one can use signal detection theory to improve the performance in detecting tumors in the breast by using this theory to develop task-oriented information processing techniques...

  8. Recent developments in analytical detection methods for radiation processed foods

    International Nuclear Information System (INIS)

    Wu Jilan

    1993-01-01

    A short summary is given of the FAO/IAEA 'ADMIT' programme and of developments in analytical detection methods for radiation-processed foods. It is suggested that, in order to promote the commercialization of radiation-processed foods and to control their quality, more attention must be paid to the study of analytical detection methods for irradiated foods

  9. Fourier analysis and stochastic processes

    CERN Document Server

    Brémaud, Pierre

    2014-01-01

    This work is unique as it provides a uniform treatment of the Fourier theories of functions (Fourier transforms and series, z-transforms), finite measures (characteristic functions, convergence in distribution), and stochastic processes (including ARMA series and point processes). It emphasises the links between these three themes. The chapter on the Fourier theory of point processes and signals structured by point processes is a novel addition to the literature on Fourier analysis of stochastic processes. It also connects the theory with recent lines of research such as biological spike signals and ultrawide-band communications. Although the treatment is mathematically rigorous, the convivial style makes the book accessible to a large audience. In particular, it will be interesting to anyone working in electrical engineering and communications, biology (point process signals) and econometrics (ARMA models). A careful review of the prerequisites (integration and probability theory in the appendix, Hilbert spa…

  10. Analysis of Variance in Statistical Image Processing

    Science.gov (United States)

    Kurz, Ludwik; Hafed Benteftifa, M.

    1997-04-01

    A key problem in practical image processing is the detection of specific features in a noisy image. Analysis of variance (ANOVA) techniques can be very effective in such situations, and this book gives a detailed account of the use of ANOVA in statistical image processing. The book begins by describing the statistical representation of images in the various ANOVA models. The authors present a number of computationally efficient algorithms and techniques to deal with such problems as line, edge, and object detection, as well as image restoration and enhancement. By describing the basic principles of these techniques, and showing their use in specific situations, the book will facilitate the design of new algorithms for particular applications. It will be of great interest to graduate students and engineers in the field of image processing and pattern recognition.
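
    As a flavor of how ANOVA applies to feature detection, the sketch below, an illustration of the principle rather than an algorithm from the book, computes a one-way F-statistic over the rows of an image window; a window straddling a horizontal step edge yields a much larger F value than a homogeneous one.

```python
# Minimal sketch of ANOVA-based feature detection in image windows;
# assumes SciPy. Rows of a window are treated as the ANOVA groups.
import numpy as np
from scipy.stats import f_oneway

def row_anova_f(window):
    # Large between-row variance relative to within-row variance -> large F.
    return f_oneway(*[row for row in window]).statistic

img = np.zeros((32, 32)) + np.random.default_rng(6).normal(0, 0.2, (32, 32))
img[16:, :] += 2.0                 # horizontal step edge across the image

flat = img[2:10, 2:10]             # homogeneous region
edge = img[12:20, 2:10]            # window straddling the edge
print("F flat:", row_anova_f(flat), "F edge:", row_anova_f(edge))
```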

  11. Radar fall detection using principal component analysis

    Science.gov (United States)

    Jokanovic, Branka; Amin, Moeness; Ahmad, Fauzia; Boashash, Boualem

    2016-05-01

    Falls are a major cause of fatal and nonfatal injuries in people aged 65 years and older. Radar has the potential to become one of the leading technologies for fall detection, thereby enabling the elderly to live independently. Existing techniques for fall detection using radar are based on manual feature extraction and require significant parameter tuning in order to provide successful detections. In this paper, we employ principal component analysis for fall detection, wherein eigenimages of observed motions are employed for classification. Using real data, we demonstrate that the PCA-based technique provides a performance improvement over conventional feature extraction methods.

  12. Detecting causality in policy diffusion processes

    OpenAIRE

    Grabow, Carsten; Macinko, James; Silver, Diana; Porfiri, Maurizio

    2016-01-01

    A universal question in network science entails learning about the topology of interaction from collective dynamics. Here, we address this question by examining diffusion of laws across US states. We propose two complementary techniques to unravel determinants of this diffusion process: information-theoretic union transfer entropy and event synchronization. In order to systematically investigate their performance on law activity data, we establish a new stochastic model to generate synthetic ...

  13. Software quality testing process analysis

    OpenAIRE

    Mera Paz, Julián

    2016-01-01

    Introduction: This article is the result of reading, reviewing and analysing books, journals and articles well known for their scientific and research quality that have addressed the software quality testing process. The author, based on his work experience in software development companies, teaching and other areas, has compiled and selected information to argue for and substantiate the importance of the software quality testing process. Methodology: the existing literature on the software qualit…

  14. Numerical analysis of Eucalyptus grandis × E. urophylla heat-treatment: A dynamically detecting method of mass loss during the process

    Science.gov (United States)

    Zhao, Zijian; Ma, Qing; Mu, Jun; Yi, Songlin; He, Zhengbin

    Eucalyptus particles, lamellas and boards were used to explore a simply implemented method, with heat and mass transfer neglected, for inspecting the mass loss during the heat-treatment course. The results revealed that the mass loss over a certain period is theoretically the definite integral of the loss rate over time in this period, and a monitoring model for the mass loss rate was developed with the particles and validated with the lamellas and boards. In the model, the loss rate is correlated to the temperature and the temperature-evolution speed, and the model is composed of three functions covering different temperature-evolution periods. The sample mass loss was calculated in MATLAB for the lamellas and boards, and the model was validated and adjusted based on the difference between the computed results and the practically measured loss values. The error ranges of the new models were -16.30% to 18.35% for wood lamellas and -9.86% to 6.80% for wood boards. This method makes it possible to acquire the instantaneous loss value by continuously monitoring the wood temperature evolution. The idea could provide a reference for Eucalyptus heat treatment, to track the treating course and control the final material characteristics.
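
    The central relation, mass loss over a period as the definite integral of the loss rate, can be sketched numerically as follows; the temperature schedule and the rate model standing in for the paper's three fitted functions are made-up placeholders.

```python
# Minimal sketch: mass loss as the time integral of a rate that depends on
# temperature and its rate of change. The rate model is a placeholder.
import numpy as np

t = np.linspace(0, 7200, 721)                  # a 2 h treatment, 10 s steps
T = 20 + (200 - 20) * (1 - np.exp(-t / 2000))  # assumed temperature schedule, deg C
dTdt = np.gradient(T, t)                       # temperature-evolution speed

# Placeholder loss-rate model r(T, dT/dt), per second of treatment time.
rate = 1e-8 * np.exp(0.03 * T) * (1 + 50 * np.abs(dTdt))

# Definite integral of the rate over time (trapezoidal rule).
mass_loss = np.sum(0.5 * (rate[1:] + rate[:-1]) * np.diff(t))
print("predicted relative mass loss: %.3f %%" % (100 * mass_loss))
```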

  15. Crack Detection with Lamb Wave Wavenumber Analysis

    Science.gov (United States)

    Tian, Zhenhua; Leckey, Cara; Rogge, Matt; Yu, Lingyu

    2013-01-01

    In this work, we present our study of Lamb wave crack detection using wavenumber analysis. The aim is to demonstrate the application of wavenumber analysis to 3D Lamb wave data to enable damage detection. The 3D wavefields (including the vx, vy and vz components) in the time-space domain contain a wealth of information regarding the propagating waves in a damaged plate. For crack detection, three wavenumber analysis techniques are used: (i) the two-dimensional Fourier transform (2D-FT), which can transform the time-space wavefield into a frequency-wavenumber representation while losing the spatial information; (ii) the short-space 2D-FT, which can obtain the frequency-wavenumber spectra at various spatial locations, resulting in a space-frequency-wavenumber representation; and (iii) local wavenumber analysis, which can provide the distribution of the effective wavenumbers at different locations. All of these concepts are demonstrated through a numerical simulation example of an aluminum plate with a crack. The 3D elastodynamic finite integration technique (EFIT) was used to obtain the 3D wavefields, of which the vz (out-of-plane) wave component is compared with the experimental measurement obtained from a scanning laser Doppler vibrometer (SLDV) for verification purposes. The experimental and simulated results are found to be in close agreement, and the application of wavenumber analysis to the 3D EFIT simulation data shows the effectiveness of the analysis for crack detection. Keywords: Lamb wave, crack detection, wavenumber analysis, EFIT modeling
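
    Technique (i), the 2D Fourier transform of a time-space wavefield into the frequency-wavenumber domain, can be sketched with NumPy as follows; the single-mode synthetic wave stands in for measured or EFIT-simulated data.

```python
# Minimal sketch of the 2D-FT step: v(t, x) -> frequency-wavenumber domain.
import numpy as np

dt, dx = 1e-7, 1e-3                      # sampling in time (s) and space (m)
t = np.arange(1024) * dt
x = np.arange(256) * dx
f0, k0 = 200e3, 400.0                    # 200 kHz wave, wavenumber 400 rad/m
field = np.sin(2 * np.pi * f0 * t[:, None] - k0 * x[None, :])

spec = np.fft.fftshift(np.fft.fft2(field))          # 2D spectrum of the wavefield
freqs = np.fft.fftshift(np.fft.fftfreq(len(t), dt))             # Hz, time axis
waven = np.fft.fftshift(np.fft.fftfreq(len(x), dx)) * 2 * np.pi # rad/m, space axis

# The spectral peak recovers the wave's frequency and wavenumber.
i, j = np.unravel_index(np.argmax(np.abs(spec)), spec.shape)
print("peak at f = %.0f Hz, k = %.0f rad/m" % (abs(freqs[i]), abs(waven[j])))
```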

  16. Detecting causality in policy diffusion processes

    Science.gov (United States)

    Grabow, Carsten; Macinko, James; Silver, Diana; Porfiri, Maurizio

    2016-08-01

    A universal question in network science entails learning about the topology of interaction from collective dynamics. Here, we address this question by examining diffusion of laws across US states. We propose two complementary techniques to unravel determinants of this diffusion process: information-theoretic union transfer entropy and event synchronization. In order to systematically investigate their performance on law activity data, we establish a new stochastic model to generate synthetic law activity data based on plausible networks of interactions. Through extensive parametric studies, we demonstrate the ability of these methods to reconstruct networks, varying in size, link density, and degree heterogeneity. Our results suggest that union transfer entropy should be preferred for slowly varying processes, which may be associated with policies attending to specific local problems that occur only rarely or with policies facing high levels of opposition. In contrast, event synchronization is effective for faster enactment rates, which may be related to policies involving Federal mandates or incentives. This study puts forward a data-driven toolbox to explain the determinants of legal activity applicable to political science, across dynamical systems, information theory, and complex networks.
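
    A minimal sketch of the second technique under simplifying assumptions: a directed event-synchronization score that counts how often enactments in state j closely follow enactments in state i. The window tau and the normalization are illustrative simplifications of the published measure.

      from itertools import product

      def event_sync(times_i, times_j, tau=2.0):
          # Fraction of event pairs where j fires within (0, tau] after i.
          hits = sum(1 for a, b in product(times_i, times_j) if 0 < b - a <= tau)
          norm = (len(times_i) * len(times_j)) ** 0.5
          return hits / norm if norm else 0.0

      adopt_i = [1990, 1995, 2001]   # years state i enacted related laws
      adopt_j = [1991, 1996, 2010]   # years state j enacted related laws
      print(f"directed synchronization i->j: {event_sync(adopt_i, adopt_j):.2f}")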

  17. Social network analysis community detection and evolution

    CERN Document Server

    Missaoui, Rokia

    2015-01-01

    This book is devoted to recent progress in social network analysis with a high focus on community detection and evolution. The eleven chapters cover the identification of cohesive groups, core components and key players either in static or dynamic networks of different kinds and levels of heterogeneity. Other important topics in social network analysis such as influence detection and maximization, information propagation, user behavior analysis, as well as network modeling and visualization are also presented. Many studies are validated through real social networks such as Twitter. This edit

  18. Detection of cracks on concrete surfaces by hyperspectral image processing

    Science.gov (United States)

    Santos, Bruno O.; Valença, Jonatas; Júlio, Eduardo

    2017-06-01

    All large infrastructures worldwide must have a suitable monitoring and maintenance plan, aiming to evaluate their behaviour and predict timely interventions. In the particular case of concrete infrastructures, the detection and characterization of crack patterns is a major indicator of their structural response. In this scope, methods based on image processing have been applied and presented. Usually, these methods focus on image binarization followed by applications of mathematical morphology to identify cracks on the concrete surface. In most cases, publications focus on restricted areas of concrete surfaces and on a single crack. On-site, the methods and algorithms have to deal with several factors that interfere with the results, namely dirt and biological colonization. Thus, the automation of a procedure for on-site characterization of crack patterns is of great interest. This advance may result in an effective tool to support maintenance strategies and intervention planning. This paper presents research based on the analysis and processing of hyperspectral images for detection and classification of cracks on concrete structures. The objective of the study is to evaluate the applicability of several wavelengths of the electromagnetic spectrum for classification of cracks in concrete surfaces. An image survey considering highly discretized wavelengths between 425 nm and 950 nm, with bandwidths of 25 nm, was performed on concrete specimens. The concrete specimens were produced with a crack pattern induced by applying a load under displacement control. The tests were conducted to simulate usual on-site drawbacks; in this context, the surface of the specimen was subjected to biological colonization (leaves and moss). To evaluate the results and enhance crack patterns, a clustering method, namely the k-means algorithm, is applied. The research conducted allows assessing the suitability of using the k-means clustering algorithm combined with hyperspectral images highly
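
    A minimal sketch of the clustering step, assuming a random stand-in cube instead of real data: each pixel's 22-band spectrum (425-950 nm in 25 nm steps) is treated as a feature vector, and k-means partitions the pixels, with one cluster ideally aligning with the crack pattern.

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      H, W, B = 64, 64, 22               # 22 bands: 425-950 nm in 25 nm steps
      cube = rng.random((H, W, B))       # hypothetical hyperspectral cube

      pixels = cube.reshape(-1, B)       # one spectrum per row
      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(pixels)
      label_map = labels.reshape(H, W)   # cluster id per pixel: crack, sound
                                         # concrete, biological colonization
      print(np.bincount(labels))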

  19. Information theoretic analysis of edge detection in visual communication

    Science.gov (United States)

    Jiang, Bo; Rahman, Zia-ur

    2010-08-01

    Generally, the design of digital image processing algorithms and of image gathering devices remains separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the artifacts introduced by the image gathering process. However, experiments show that the image gathering process profoundly impacts the performance of digital image processing and the quality of the resulting images. Huck et al. proposed a definitive theoretical analysis of visual communication channels, in which the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. In this paper, we perform an end-to-end information-theoretic system analysis to assess edge detection methods. We evaluate the performance of the different algorithms as a function of the characteristics of the scene and of the parameters, such as sampling and additive noise, that define the image gathering system. An edge detection algorithm is regarded as having high performance only if the information rate from the scene to the edge approaches the maximum possible; this goal can be achieved only by jointly optimizing all processes. People generally use subjective judgment to compare different edge detection methods; there has been no common tool for evaluating the performance of the different algorithms and guiding the selection of the best algorithm for a given system or scene. Our information-theoretic assessment becomes this new tool, allowing us to compare the different edge detection operators in a common environment.
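
    As a toy stand-in for the full end-to-end analysis, the sketch below scores a detector by the mutual information between a ground-truth edge map and its output; the binary maps are synthetic, and the metric only illustrates the information-rate idea, not the paper's channel model.

      import numpy as np
      from sklearn.metrics import mutual_info_score

      rng = np.random.default_rng(1)
      truth = rng.random((128, 128)) > 0.9                 # hypothetical true edges
      detected = truth ^ (rng.random((128, 128)) > 0.98)   # noisy detector output

      mi_nats = mutual_info_score(truth.ravel(), detected.ravel())
      print(f"edge information preserved: {mi_nats / np.log(2):.4f} bits")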

  20. Train integrity detection risk analysis based on PRISM

    Science.gov (United States)

    Wen, Yuan

    2018-04-01

    GNSS-based Train Integrity Monitoring Systems (TIMS) are an effective and low-cost scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by external environments, such as the uncertainty of wireless communication channels, which may lead to failures of communication and positioning. In order to guarantee the reliability and safety of train operation, a risk analysis method for train integrity detection based on PRISM is proposed in this article. First, we analyze the risk factors (in the GNSS communication process and the on-board communication process) and model them. Then, we evaluate the performance of the model in PRISM based on field data. Finally, we discuss how these risk factors influence the train integrity detection process.
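
    A minimal sketch of the kind of model PRISM checks, written in Python rather than the PRISM language: a discrete-time Markov chain over communication states whose probability of eventually reaching an absorbing "detection failed" state follows from the standard absorbing-chain formula. The states and transition probabilities are illustrative, not the paper's values.

      import numpy as np

      # states: 0 = GNSS link ok, 1 = channel degraded,
      #         2 = detection failure (absorbing), 3 = integrity confirmed (absorbing)
      P = np.array([
          [0.90, 0.08, 0.00, 0.02],
          [0.30, 0.50, 0.15, 0.05],
      ])
      Q = P[:, :2]                              # transient -> transient
      R = P[:, 2:]                              # transient -> absorbing
      Bmat = np.linalg.solve(np.eye(2) - Q, R)  # absorption probabilities
      print(f"P(eventual detection failure | link ok) = {Bmat[0, 0]:.4f}")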

  1. Signal processing techniques for sodium boiling noise detection

    International Nuclear Information System (INIS)

    1989-05-01

    At the Specialists' Meeting on Sodium Boiling Detection organized by the International Working Group on Fast Reactors (IWGFR) of the International Atomic Energy Agency at Chester in the United Kingdom in 1981, various methods of detecting sodium boiling were reported. However, it was not possible to make a comparative assessment of these methods because the signal conditions differed from one experiment to another. That is why participants of this meeting recommended that a benchmark test be carried out in order to evaluate and compare signal processing methods for boiling detection. Organization of the Co-ordinated Research Programme (CRP) on signal processing techniques for sodium boiling noise detection was also recommended at the 16th meeting of the IWGFR. The CRP on Signal Processing Techniques for Sodium Boiling Noise Detection was set up in 1984. Eight laboratories from six countries agreed to participate in this CRP. The overall objective of the programme was the development of reliable on-line signal processing techniques which could be used for the detection of sodium boiling in an LMFBR core. During the first stage of the programme a number of existing processing techniques used by different countries were compared and evaluated. In the course of further work, an algorithm for implementation of this sodium boiling detection system in the nuclear reactor will be developed. It was also considered that the acoustic signal processing techniques developed for boiling detection could well make a useful contribution to other acoustic applications in the reactor. This publication consists of two parts. Part I is the final report of the co-ordinated research programme on signal processing techniques for sodium boiling noise detection. Part II contains two introductory papers and 20 papers presented at four research co-ordination meetings since 1985. A separate abstract was prepared for each of these 22 papers. Refs, figs and tabs

  2. Automatic detection of NIL defects using microscopy and image processing

    KAUST Repository

    Pietroy, David; Gereige, Issam; Gourgon, Cécile

    2013-01-01

    patterns, sticking. In this paper, microscopic imaging combined with a specific processing algorithm is used to numerically detect defects in printed patterns. Results obtained for 1D and 2D imprinted gratings with different microscopic image magnifications

  3. Fault Management: Degradation Signature Detection, Modeling, and Processing, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Fault to Failure Progression (FFP) signature modeling and processing is a new method for applying condition-based signal data to detect degradation, to identify...

  4. Preliminary hazards analysis -- vitrification process

    International Nuclear Information System (INIS)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P.

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment

  5. Preliminary hazards analysis -- vitrification process

    Energy Technology Data Exchange (ETDEWEB)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  6. Risk analysis: opening the process

    International Nuclear Information System (INIS)

    Hubert, Ph.; Mays, C.

    1998-01-01

    This conference on risk analysis took place in Paris, 11-14 October 1999. Over 200 papers were presented in the following seven sessions: perception; environment and health; persuasive risks; objects and products; personal and collective involvement; assessment and valuation; management. A rational approach to risk analysis has been developed over the last three decades. Techniques for risk assessment have been thoroughly enhanced, risk management approaches have been developed, decision-making processes have been clarified, and the social dimensions of risk perception and management have been investigated. Nevertheless, this construction is being challenged by recent events which reveal how deficits in stakeholder involvement, openness and democratic procedures can undermine risk management actions. Indeed, the global process and most components of risk analysis may be radically called into question. Food safety has lately been a prominent issue, but new debates are appearing, and old ones are being revisited, in the domains of public health, consumer product safety, waste management, environmental risks, nuclear installations, automobile safety and pollution. To meet the growing pressures for efficiency, openness, accountability, and multi-partner communication in risk analysis, institutional changes are underway in many European countries. However, the need for stakeholders to develop better insight into the process may lead to an evolution of all the components of risk analysis, even in its most 'technical' steps. For stakeholders of different professional backgrounds, political projects, and responsibilities, risk identification procedures must be rendered understandable, and quantitative risk assessment must be intelligible and accommodated in action proposals, ranging from countermeasures to educational programs to insurance mechanisms. Management formats must be open to local and political input and other types of operational feedback. (authors)

  7. Detection of LiveLock in BPMN Using Process Expression

    Science.gov (United States)

    Tantitharanukul, Nasi; Jumpamule, Watcharee

    Although the Business Process Modeling Notation (BPMN) is a popular tool for modeling business processes at the conceptual level, the resulting diagram may contain structural problems. One such problem is livelock, in which one token proceeds to the end event while another token remains in the process without progressing. In this paper, we introduce an expression-like method to detect livelock in a BPMN diagram. Our approach utilizes the declarative power of expressions to determine all possible process chains and to indicate whether livelock is present. As a result, we have shown that our method can detect livelock when it exists.

  8. Use of Sparse Principal Component Analysis (SPCA) for Fault Detection

    DEFF Research Database (Denmark)

    Gajjar, Shriram; Kulahci, Murat; Palazoglu, Ahmet

    2016-01-01

    Principal component analysis (PCA) has been widely used for data dimension reduction and process fault detection. However, interpreting the principal components and the outcomes of PCA-based monitoring techniques is a challenging task since each principal component is a linear combination of the ...
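
    A minimal sketch of one way to use SPCA for monitoring, with random stand-in data and an illustrative 99th-percentile control limit: fit sparse components on normal operation, then flag samples whose squared prediction error (SPE) against the sparse reconstruction exceeds the limit.

      import numpy as np
      from sklearn.decomposition import SparsePCA

      rng = np.random.default_rng(0)
      X_train = rng.normal(size=(500, 20))      # normal operating data
      X_test = rng.normal(size=(100, 20))
      X_test[50:] += 3.0                        # injected sensor fault

      spca = SparsePCA(n_components=5, alpha=1.0, random_state=0).fit(X_train)

      def spe(model, X):
          # Squared prediction error of the sparse reconstruction.
          scores = model.transform(X)
          resid = X - model.mean_ - scores @ model.components_
          return (resid ** 2).sum(axis=1)

      limit = np.percentile(spe(spca, X_train), 99)
      alarms = spe(spca, X_test) > limit
      print(f"{alarms.sum()} of {len(alarms)} test samples flagged")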

  9. Plagiarism Detection for Indonesian Language using Winnowing with Parallel Processing

    Science.gov (United States)

    Arifin, Y.; Isa, S. M.; Wulandhari, L. A.; Abdurachman, E.

    2018-03-01

    Plagiarism takes many forms, not only copy-paste but also changing passive into active voice or paraphrasing without appropriate acknowledgment. It occurs in all languages, including Indonesian. There is much previous research related to plagiarism detection in Indonesian using different methods, but some parts still leave room for improvement. This research proposes a solution that improves the plagiarism detection technique so that it detects not only the copy-paste form but also more advanced forms. The proposed solution uses Winnowing with additional steps in the pre-processing stage: stemming for Indonesian, and fingerprint generation in parallel, which saves processing time and produces the plagiarism result for the suspected document.
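
    A minimal single-threaded sketch of the winnowing core (the Indonesian stemming step, e.g. via a library such as Sastrawi, and the paper's parallel fingerprinting stage are omitted): hash all k-grams of the normalized text and keep the minimum hash in each sliding window as the document fingerprint.

      import hashlib

      def winnow(text, k=5, window=4):
          # Normalize, hash k-grams, keep the window minima as fingerprints.
          text = "".join(ch for ch in text.lower() if ch.isalnum())
          hashes = [
              int(hashlib.md5(text[i:i + k].encode()).hexdigest(), 16) % 10**8
              for i in range(len(text) - k + 1)
          ]
          return {min(hashes[i:i + window]) for i in range(len(hashes) - window + 1)}

      a = winnow("Plagiarism detection for Indonesian language documents")
      b = winnow("Detection of plagiarism for documents in Indonesian language")
      print(f"fingerprint similarity: {len(a & b) / len(a | b):.2f}")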

  10. Sequential Detection of Fission Processes for Harbor Defense

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J V; Walston, S E; Chambers, D H

    2015-02-12

    With the large increase in terrorist activities throughout the world, the timely and accurate detection of special nuclear material (SNM) has become an extremely high priority for many countries concerned with national security. The detection of radionuclide contraband based on its γ-ray emissions has been attacked vigorously, with some interesting and feasible results; however, the fission process of SNM has not received as much attention, due to its inherent complexity and required predictive nature. In this paper, on-line sequential Bayesian detection and parameter estimation techniques are developed to rapidly and reliably detect unknown fissioning sources with high statistical confidence.
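
    A minimal sketch of sequential detection on count data, using a Wald sequential probability ratio test between background-only and background-plus-source Poisson rates; the paper's Bayesian formulation is richer, and the rates and error targets here are illustrative.

      import numpy as np

      lam0, lam1 = 2.0, 3.5                  # counts/interval: background vs source
      alpha, beta = 1e-3, 1e-3               # false-alarm and miss targets
      A = np.log((1 - beta) / alpha)         # accept "source present" above A
      B = np.log(beta / (1 - alpha))         # accept "background only" below B

      rng = np.random.default_rng(7)
      llr, n = 0.0, 0
      while B < llr < A:
          k = rng.poisson(lam1)              # simulate an interval with a source
          llr += k * np.log(lam1 / lam0) - (lam1 - lam0)   # Poisson log-LR
          n += 1
      print("source detected" if llr >= A else "background only", f"after {n} intervals")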

  11. A dual-process account of auditory change detection.

    Science.gov (United States)

    McAnally, Ken I; Martin, Russell L; Eramudugolla, Ranmalee; Stuart, Geoffrey W; Irvine, Dexter R F; Mattingley, Jason B

    2010-08-01

    Listeners can be "deaf" to a substantial change in a scene comprising multiple auditory objects unless their attention has been directed to the changed object. It is unclear whether auditory change detection relies on identification of the objects in pre- and post-change scenes. We compared the rates at which listeners correctly identify changed objects with those predicted by change-detection models based on signal detection theory (SDT) and high-threshold theory (HTT). Detected changes were not identified as accurately as predicted by models based on either theory, suggesting that some changes are detected by a process that does not support change identification. Undetected changes were identified as accurately as predicted by the HTT model but much less accurately than predicted by the SDT models. The process underlying change detection was investigated further by determining receiver-operating characteristics (ROCs). ROCs did not conform to those predicted by either an SDT or an HTT model but were well modeled by a dual-process model that incorporates HTT and SDT components. The dual-process model also accurately predicted the rates at which detected and undetected changes were correctly identified.
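
    A minimal sketch of the SDT component: estimating sensitivity (d') and criterion from hit and false-alarm rates in a change-detection task. The trial counts are made-up example values, not data from the study.

      from scipy.stats import norm

      hits, misses = 68, 32          # change trials: detected vs missed
      fas, crs = 12, 88              # no-change trials: false alarms vs correct

      hit_rate = hits / (hits + misses)
      fa_rate = fas / (fas + crs)

      d_prime = norm.ppf(hit_rate) - norm.ppf(fa_rate)
      criterion = -0.5 * (norm.ppf(hit_rate) + norm.ppf(fa_rate))
      print(f"d' = {d_prime:.2f}, criterion c = {criterion:.2f}")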

  12. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, Markov random fields (MRF), watershed segmentation and merging techniques is presented for performing image segmentation and edge detection. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map; the DIS value is computed for each pixel to identify all edges (weak or strong) in the image. The DIS map serves as prior knowledge about likely region boundaries for the subsequent MRF step, which yields an image containing all edge and region information. An initial segmentation is obtained using K-means clustering and the minimum-distance rule, and the region process is then modeled by an MRF to obtain an image containing regions of different intensity. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. Gradient values are calculated and the watershed technique is applied; the segmentation results are improved by the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained through a merging process based on averaged intensity mean values. Common edge detectors are then run on the MRF-segmented image and the results are compared. The final segmentation and edge detection result is one closed boundary per actual region in the image.
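
    A minimal scikit-image sketch of two of the stages on a synthetic two-region image: a Sobel gradient map standing in for the DIS map, and watershed segmentation seeded from crude intensity classes; the MRF refinement and merging stages are omitted.

      import numpy as np
      from skimage.filters import sobel
      from skimage.segmentation import watershed

      img = np.zeros((64, 64))
      img[:, 32:] = 1.0                      # two intensity regions
      img += np.random.default_rng(0).normal(0, 0.05, img.shape)

      gradient = sobel(img)                  # edge-strength (DIS-like) map
      markers = np.zeros_like(img, dtype=int)
      markers[img < 0.3] = 1                 # crude intensity-based seeding
      markers[img > 0.7] = 2
      segments = watershed(gradient, markers)
      print(np.unique(segments))             # -> [1 2]: one label per region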

  13. Application of image processing technology in yarn hairiness detection

    Directory of Open Access Journals (Sweden)

    Guohong ZHANG

    2016-02-01

    Full Text Available Digital image processing technology is one of the new methods for yarn detection, which can realize the digital characterization and objective evaluation of yarn appearance. This paper reviews the current status of development and application of digital image processing technology for yarn hairiness evaluation, and analyzes and compares the traditional detection methods and this newly developed method. Compared with the traditional methods, the image-processing-based method is more objective, fast and accurate, and represents the main development trend in yarn appearance evaluation.

  14. Optimizing Urine Processing Protocols for Protein and Metabolite Detection.

    Science.gov (United States)

    Siddiqui, Nazema Y; DuBois, Laura G; St John-Williams, Lisa; Thompson, J. Will; Grenier, Carole; Burke, Emily; Fraser, Matthew O; Amundsen, Cindy L; Murphy, Susan K

    In urine, factors such as the timing of voids and duration at room temperature (RT) may affect the quality of recovered protein and metabolite data. Additives may aid detection, but can add complexity to sample collection or analysis. We aimed to identify the optimal urine processing protocol for clinically obtained urine samples that allows the highest protein and metabolite yields with minimal degradation. Healthy women provided multiple urine samples during the same day. Women collected their first morning (1st AM) void and another "random void". Random voids were aliquoted with: 1) no additive; 2) boric acid (BA); 3) protease inhibitor (PI); or 4) both BA + PI. Of these aliquots, some were immediately stored at 4°C, and some were left at RT for 4 hours. Proteins and individual metabolites were quantified, normalized to creatinine concentrations, and compared across processing conditions. Sample pools corresponding to each processing condition were analyzed using mass spectrometry to assess protein degradation. Ten Caucasian women between 35 and 65 years of age provided paired 1st AM and random void urine samples. Normalized protein concentrations were slightly higher in 1st AM than in random "spot" voids. The addition of BA did not significantly change protein yields, while PI significantly improved normalized protein concentrations, regardless of whether samples were immediately cooled or left at RT for 4 hours. In pooled samples, there were minimal differences in protein degradation under the various conditions we tested. In metabolite analyses, there were significant differences in individual amino acids based on the timing of the void. For comparative translational research using urine, information about void timing should be collected and standardized. For urine samples processed on the same day, BA does not appear to be necessary, while the addition of PI enhances protein yields regardless of 4°C or RT storage temperature.

  15. Analysis and detection of climate change

    International Nuclear Information System (INIS)

    Thejll, P.; Stendel, M.

    2001-01-01

    The authors first discuss the concepts of 'climate' and 'climate change detection', outlining the difficulties of the latter in terms of the properties of the former. In more detail they then discuss the analysis and detection, carried out at the Danish Climate Centre, of anthropogenic climate change and of non-anthropogenic changes. Regarding anthropogenic climate change, the emphasis is on the improvement of global and regional climate models and on the reconstruction of past climates; regarding non-anthropogenic changes, the authors describe two case studies of potential solar influence on climate. (LN)

  16. Simulation of land mine detection processes using nuclear techniques

    International Nuclear Information System (INIS)

    Aziz, M.

    2005-01-01

    Computer models were designed to study the process of land mine detection using nuclear techniques. Parameters that affect detection were analyzed. Mines of different masses at different depths in the soil are considered using two types of sources, ²⁵²Cf and a 14 MeV neutron source. The capability to differentiate between mines and other objects such as concrete, iron, wood, aluminum, water and polyethylene was analyzed and studied

  17. Fault detection of Tennessee Eastman process based on topological features and SVM

    Science.gov (United States)

    Zhao, Huiyang; Hu, Yanzhu; Ai, Xinbo; Hu, Yu; Meng, Zhen

    2018-03-01

    Fault detection in industrial processes is a popular research topic. Although distributed control systems (DCS) have been introduced to monitor the state of industrial processes, they still cannot satisfy all the requirements for fault detection in all industrial systems. In this paper, we propose a novel method based on topological features and a support vector machine (SVM) for fault detection in industrial processes. The proposed method takes global information about the measured variables into account via a complex network model and uses an SVM to predict whether a system has developed a fault. The proposed method can be divided into four steps: network construction, network analysis, model training and model testing. Finally, we apply the model to the Tennessee Eastman process (TEP). The results show that this method works well and can be a useful supplement for fault detection in industrial processes.
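
    A minimal sketch of the idea with random stand-in data (the paper's exact network features differ): build a thresholded correlation network over the variables in each window, extract simple degree statistics as topological features, and classify windows with an SVM.

      import numpy as np
      from sklearn.svm import SVC

      def topo_features(window, thresh=0.6):
          # Degree statistics of the thresholded correlation network.
          C = np.corrcoef(window.T)                     # variable-variable correlations
          A = (np.abs(C) > thresh) & ~np.eye(len(C), dtype=bool)
          deg = A.sum(axis=1)
          return [deg.mean(), deg.std(), deg.max(), A.sum() / 2]

      rng = np.random.default_rng(0)
      normal = [rng.normal(size=(100, 10)) for _ in range(40)]
      faulty = [rng.normal(size=(100, 10)) + 1.5 * rng.normal(size=(100, 1))
                for _ in range(40)]                     # shared disturbance couples variables

      X = np.array([topo_features(w) for w in normal + faulty])
      y = np.array([0] * 40 + [1] * 40)
      clf = SVC(kernel="rbf").fit(X[::2], y[::2])       # train on every other window
      print(f"held-out accuracy: {clf.score(X[1::2], y[1::2]):.2f}")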

  18. Process for detecting leak faults using a helium mass spectrometer

    International Nuclear Information System (INIS)

    Divet, Claude; Morin, Claude.

    1977-01-01

    The description is given of a process for detecting very small leak faults that put the outer and inner sides of a containment wall into communication, one side of the wall being in contact with gaseous helium at a pressure of around one torr, the other side bounding a space pumped down to a residual gas pressure below 10⁻³ torr. This space is in communication with the measuring cell of a helium mass spectrometer. This process may be applied to the detection of faults in the metal claddings of fuel rods used in nuclear reactors [fr

  19. Cascaded image analysis for dynamic crack detection in material testing

    Science.gov (United States)

    Hampel, U.; Maas, H.-G.

    Concrete specimens in civil engineering material testing often show fissures or hairline cracks. These cracks develop dynamically. Starting at a width of a few microns, they usually cannot be detected visually or in an image from a camera imaging the whole specimen. Conventional image analysis techniques will detect fissures only if their width is on the order of one pixel. To be able to detect and measure fissures with a width of a fraction of a pixel at an early stage of their development, a cascaded image analysis approach has been developed, implemented and tested. The basic idea of the approach is to detect discontinuities in dense surface deformation vector fields. These deformation vector fields between consecutive stereo image pairs, which are generated by cross correlation or least squares matching, show a precision on the order of 1/50 pixel. Hairline cracks can be detected and measured by applying edge detection techniques such as a Sobel operator to the results of the image matching process. Cracks show up as linear discontinuities in the deformation vector field and can be vectorized by edge chaining. In practical tests of the method, cracks with a width of 1/20 pixel could be detected, and their width could be determined with a precision of 1/50 pixel.
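
    A minimal sketch of the cascade on synthetic images, with blockwise phase correlation standing in for the subpixel least-squares matching: estimate a coarse displacement field between two frames, then run a Sobel operator over the field so a crack-like discontinuity shows up as an edge.

      import numpy as np
      from skimage.filters import sobel
      from skimage.registration import phase_cross_correlation

      rng = np.random.default_rng(0)
      ref = rng.random((128, 128))
      mov = ref.copy()
      mov[:, 64:] = np.roll(ref, 1, axis=0)[:, 64:]   # right half slips by 1 px

      B = 16                                          # matching block size
      ny, nx = ref.shape[0] // B, ref.shape[1] // B
      uy = np.zeros((ny, nx))                         # vertical displacement field
      for i in range(ny):
          for j in range(nx):
              blk = (slice(i * B, (i + 1) * B), slice(j * B, (j + 1) * B))
              shift, _, _ = phase_cross_correlation(ref[blk], mov[blk],
                                                    upsample_factor=10)
              uy[i, j] = shift[0]

      crack_map = sobel(uy)                           # discontinuity = candidate crack
      print(f"strongest discontinuity near block column {np.argmax(crack_map.max(axis=0))}")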

  20. An effort allocation model considering different budgetary constraint on fault detection process and fault correction process

    Directory of Open Access Journals (Sweden)

    Vijay Kumar

    2016-01-01

    Full Text Available The fault detection process (FDP) and fault correction process (FCP) are important phases of the software development life cycle (SDLC). It is essential for software to undergo a testing phase, during which faults are detected and corrected. The main goal of this article is to allocate testing resources in an optimal manner to minimize the cost during the testing phase using FDP and FCP under a dynamic environment. In this paper, we first assume there is a time lag between fault detection and fault correction; thus, removal of a fault is performed after the fault is detected. In addition, the detection process and correction process are taken to be independent simultaneous activities with different budgetary constraints. A structured optimal policy based on optimal control theory is proposed for software managers to optimize the allocation of the limited resources under reliability criteria. Furthermore, the release policy for the proposed model is also discussed. A numerical example is given in support of the theoretical results.

  1. RISK ANALYSIS IN MILK PROCESSING

    Directory of Open Access Journals (Sweden)

    I. PIRVUTOIU

    2008-05-01

    Full Text Available This paper aimed to evaluate bankruptcy risk using the score method based on Conan and Holder's model. The data were collected from the balance sheet and profit and loss account for the period 2005-2007, recorded by a meat processing plant (Rador Commercial Company). The study highlights the financial situation of the company and the level of the main financial ratios underlying the calculation of the Z-score function value in the three years. The low values of the Z-score function recorded every year reflect that the company is still facing bankruptcy. However, the worst situation was recorded in 2005 and 2006, when bankruptcy risk ranged between 70-80%. In 2007 the bankruptcy risk was lower, ranging between 50-70%, as the Z function recorded a value lower than 4. For meat processing companies such an analysis is compulsory at present, as long as the business environment is very risky in our country.

  2. PRECLOSURE CRITICALITY ANALYSIS PROCESS REPORT

    International Nuclear Information System (INIS)

    Danise, A.E.

    2004-01-01

    This report describes a process for performing preclosure criticality analyses for a repository at Yucca Mountain, Nevada. These analyses will be performed from the time of receipt of fissile material until permanent closure of the repository (preclosure period). The process describes how criticality safety analyses will be performed for various configurations of waste in or out of waste packages that could occur during preclosure as a result of normal operations or event sequences. The criticality safety analysis considers those event sequences resulting in unanticipated moderation, loss of neutron absorber, geometric changes, or administrative errors in waste form placement (loading) of the waste package. The report proposes a criticality analyses process for preclosure to allow a consistent transition from preclosure to postclosure, thereby possibly reducing potential cost increases and delays in licensing of Yucca Mountain. The proposed approach provides the advantage of using a parallel regulatory framework for evaluation of preclosure and postclosure performance and is consistent with the U.S. Nuclear Regulatory Commission's approach of supporting risk-informed, performance-based regulation for fuel cycle facilities, ''Yucca Mountain Review Plan, Final Report'', and 10 CFR Part 63. The criticality-related criteria for ensuring subcriticality are also described as well as which guidance documents will be utilized. Preclosure operations and facilities have significant similarities to existing facilities and operations currently regulated by the U.S. Nuclear Regulatory Commission; therefore, the design approach for preclosure criticality safety will be dictated by existing regulatory requirements while using a risk-informed approach with burnup credit for in-package operations

  3. SNIa detection in the SNLS photometric analysis using Morphological Component Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Möller, A.; Ruhlmann-Kleider, V.; Neveu, J.; Palanque-Delabrouille, N. [Irfu, SPP, CEA Saclay, F-91191 Gif sur Yvette cedex (France); Lanusse, F.; Starck, J.-L., E-mail: anais.moller@cea.fr, E-mail: vanina.ruhlmann-kleider@cea.fr, E-mail: francois.lanusse@cea.fr, E-mail: jeremy.neveu@cea.fr, E-mail: nathalie.palanque-delabrouille@cea.fr, E-mail: jstarck@cea.fr [Laboratoire AIM, UMR CEA-CNRS-Paris 7, Irfu, SAp, CEA Saclay, F-91191 Gif sur Yvette cedex (France)

    2015-04-01

    Detection of supernovae (SNe) and, more generally, of transient events in large surveys can provide numerous false detections. In the case of a deferred processing of survey images, this implies reconstructing complete light curves for all detections, requiring sizable processing time and resources. Optimizing the detection of transient events is thus an important issue for both present and future surveys. We present here the optimization done in the SuperNova Legacy Survey (SNLS) for the 5-year data deferred photometric analysis. In this analysis, detections are derived from stacks of subtracted images with one stack per lunation. The 3-year analysis provided 300,000 detections dominated by signals of bright objects that were not perfectly subtracted. Allowing these artifacts to be detected leads not only to a waste of resources but also to possible signal coordinate contamination. We developed a subtracted image stack treatment to reduce the number of non SN-like events using morphological component analysis. This technique exploits the morphological diversity of objects to be detected to extract the signal of interest. At the level of our subtraction stacks, SN-like events are rather circular objects while most spurious detections exhibit different shapes. A two-step procedure was necessary to have a proper evaluation of the noise in the subtracted image stacks and thus a reliable signal extraction. We also set up a new detection strategy to obtain coordinates with good resolution for the extracted signal. SNIa Monte-Carlo (MC) generated images were used to study detection efficiency and coordinate resolution. When tested on SNLS 3-year data this procedure decreases the number of detections by a factor of two, while losing only 10% of SN-like events, almost all faint ones. MC results show that SNIa detection efficiency is equivalent to that of the original method for bright events, while the coordinate resolution is improved.

  4. Detection of genetically modified organisms in foreign-made processed foods containing corn and potato.

    Science.gov (United States)

    Monma, Kimio; Araki, Rie; Sagi, Naoki; Satoh, Masaki; Ichikawa, Hisatsugu; Satoh, Kazue; Tobe, Takashi; Kamata, Kunihiro; Hino, Akihiro; Saito, Kazuo

    2005-06-01

    Investigations of the validity of labeling regarding genetically modified (GM) products were conducted using polymerase chain reaction (PCR) methods for foreign-made processed foods made from corn and potato purchased in the Tokyo area and in the USA. Several kinds of GM crops were detected in 12 of 32 samples of processed corn samples. More than two GM events for which safety reviews have been completed in Japan were simultaneously detected in 10 samples. GM events MON810 and Bt11 were most frequently detected in the samples by qualitative PCR methods. MON810 was detected in 11 of the 12 samples, and Bt11 was detected in 6 of the 12 samples. In addition, Roundup Ready soy was detected in one of the 12 samples. On the other hand, CBH351, for which the safety assessment was withdrawn in Japan, was not detected in any of the 12 samples. A trial quantitative analysis was performed on six of the GM maize qualitatively positive samples. The estimated amounts of GM maize in these samples ranged from 0.2 to 2.8%, except for one sample, which contained 24.1%. For this sample, the total amount found by event-specific quantitative analysis was 23.8%. Additionally, Roundup Ready soy was detected in one sample of 21 potato-processed foods, although GM potatoes were not detected in any sample.

  5. Digital Image Processing Technique for Breast Cancer Detection

    Science.gov (United States)

    Guzmán-Cabrera, R.; Guzmán-Sepúlveda, J. R.; Torres-Cisneros, M.; May-Arrioja, D. A.; Ruiz-Pinales, J.; Ibarra-Manzano, O. G.; Aviña-Cervantes, G.; Parada, A. González

    2013-09-01

    Breast cancer is the most common cause of death in women and the second leading cause of cancer deaths worldwide. Primary prevention in the early stages of the disease becomes complex as the causes remain almost unknown. However, some typical signatures of this disease, such as masses and microcalcifications appearing on mammograms, can be used to improve early diagnostic techniques, which is critical for women's quality of life. X-ray mammography is the main test used for screening and early diagnosis, and its analysis and processing are the keys to improving breast cancer prognosis. As masses and benign glandular tissue typically appear with low contrast and are often very blurred, several computer-aided diagnosis schemes have been developed to support radiologists and internists in their diagnosis. In this article, an approach is proposed to effectively analyze digital mammograms based on texture segmentation for the detection of early-stage tumors. The proposed algorithm was tested on several images taken from the digital database for screening mammography for cancer research and diagnosis, and it was found suitable for distinguishing masses and microcalcifications from the background tissue using morphological operators, and for extracting them through machine learning techniques and a clustering algorithm for intensity-based segmentation.

  6. Pipeline Processing with an Iterative, Context-Based Detection Model

    Science.gov (United States)

    2016-01-22

    ...wave precursor artifacts. Distortion is reduced with the addition of more channels to the processed data stream (comparing trace 3 to... Limitations of fully automatic hypothesis evaluation are illustrated with a test case of two events in Central Asia: a deep Hindu Kush earthquake and a shallow earthquake in... (AFRL-RV-PS-TR-2016-0080, Pipeline Processing with an Iterative, Context-Based Detection Model, T. Kværna, et al.)

  7. Counterfeit Electronics Detection Using Image Processing and Machine Learning

    Science.gov (United States)

    Asadizanjani, Navid; Tehranipoor, Mark; Forte, Domenic

    2017-01-01

    Counterfeiting is an increasing concern for businesses and governments as greater numbers of counterfeit integrated circuits (ICs) infiltrate the global market. There is an ongoing effort in experimental and national labs inside the United States to detect and prevent such counterfeits as quickly and efficiently as possible. However, there is still a missing piece: automatically detecting and properly keeping records of detected counterfeit ICs. Here, we introduce a web application database that allows users to share previous examples of counterfeits through an online database and to obtain statistics regarding the prevalence of known defects. We also investigate automated techniques based on image processing and machine learning to detect different physical defects and to determine whether or not an IC is counterfeit.

  8. Counterfeit Electronics Detection Using Image Processing and Machine Learning

    International Nuclear Information System (INIS)

    Asadizanjani, Navid; Tehranipoor, Mark; Forte, Domenic

    2017-01-01

    Counterfeiting is an increasing concern for businesses and governments as greater numbers of counterfeit integrated circuits (ICs) infiltrate the global market. There is an ongoing effort in experimental and national labs inside the United States to detect and prevent such counterfeits as quickly and efficiently as possible. However, there is still a missing piece: automatically detecting and properly keeping records of detected counterfeit ICs. Here, we introduce a web application database that allows users to share previous examples of counterfeits through an online database and to obtain statistics regarding the prevalence of known defects. We also investigate automated techniques based on image processing and machine learning to detect different physical defects and to determine whether or not an IC is counterfeit. (paper)

  9. PROBABILISTIC APPROACH TO OBJECT DETECTION AND RECOGNITION FOR VIDEOSTREAM PROCESSING

    Directory of Open Access Journals (Sweden)

    Volodymyr Kharchenko

    2017-07-01

    Full Text Available Purpose: The presented research results aim to improve the theoretical basics of computer vision and artificial intelligence for dynamical systems. The proposed approach to object detection and recognition is based on probabilistic fundamentals to ensure the required level of correct object recognition. Methods: The presented approach is grounded in probabilistic methods, statistical methods of probability density estimation, and computer-based simulation at the verification stage of development. Results: The proposed approach for object detection and recognition for video stream processing has shown several advantages over existing methods due to its simple realization and short processing time. The presented results of experimental verification look plausible for object detection and recognition in video streams. Discussion: The approach can be implemented in dynamical systems within changeable environments, such as remotely piloted aircraft systems, and can be part of the artificial intelligence in navigation and control systems.

  10. System for detecting and processing abnormality in electromagnetic shielding

    International Nuclear Information System (INIS)

    Takahashi, T.; Nakamura, M.; Yabana, Y.; Ishikawa, T.; Nagata, K.

    1991-01-01

    The present invention relates to a system for detecting and processing an abnormality in electromagnetic shielding of an intelligent building which is constructed using an electromagnetic shielding material for the skeleton and openings such as windows and doorways so that the whole of the building is formed into an electromagnetic shielding structure. (author). 4 figs

  11. Processing bronchial sonograms to detect respiratory cycle fragments

    International Nuclear Information System (INIS)

    Bureev, A Sh; Zhdanov, D S; Zemlyakov, I Yu; Svetlik, M V

    2014-01-01

    This article describes the authors' work on developing a method for the automated assessment of the state of the human bronchopulmonary system based on acoustic data. In particular, it covers the method of detecting breath sounds on bronchial sonograms obtained during the auscultation process

  12. Continuous Fraud Detection in Enterprise Systems through Audit Trail Analysis

    Directory of Open Access Journals (Sweden)

    Peter J. Best

    2009-03-01

    Full Text Available Enterprise systems, real-time recording and real-time reporting pose new and significant challenges to the accounting and auditing professions. This includes developing methods and tools for continuous assurance and fraud detection. In this paper we propose a methodology for continuous fraud detection that exploits security audit logs, changes in master records and accounting audit trails in enterprise systems. The steps in this process are: (1) threat monitoring: surveillance of security audit logs for 'red flags'; (2) automated extraction and analysis of data from audit trails; and (3) forensic investigation techniques to determine whether a fraud has actually occurred. We demonstrate how mySAP, an enterprise system, can be used for audit trail analysis in detecting financial frauds; afterwards we use a case study of a suspected fraud to illustrate how to implement the methodology.
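
    A minimal sketch of step (1) only, with a hypothetical log format and rules (not mySAP's actual audit log schema): scan security audit entries for simple red flags such as repeated failed logins or master-record changes outside business hours.

      from collections import Counter

      log = [
          {"user": "jdoe", "event": "LOGIN_FAILED", "hour": 11},
          {"user": "jdoe", "event": "LOGIN_FAILED", "hour": 11},
          {"user": "jdoe", "event": "LOGIN_FAILED", "hour": 11},
          {"user": "acct7", "event": "VENDOR_BANK_CHANGED", "hour": 2},
      ]

      failed = Counter(e["user"] for e in log if e["event"] == "LOGIN_FAILED")
      flags = [f"{u}: {n} failed logins" for u, n in failed.items() if n >= 3]
      flags += [
          f"{e['user']}: master record change at {e['hour']:02d}:00"
          for e in log
          if e["event"] == "VENDOR_BANK_CHANGED" and not 8 <= e["hour"] < 18
      ]
      print("\n".join(flags))   # candidates passed on to audit-trail extraction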

  13. Trends in business process analysis: from verification to process mining

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Cardoso, J.; Cordeiro, J.; Filipe, J.

    2007-01-01

    Business process analysis ranges from model verification at design-time to the monitoring of processes at runtime. Much progress has been achieved in process verification. Today we are able to verify the entire reference model of SAP without any problems. Moreover, more and more processes leave

  14. Image recognition on raw and processed potato detection: a review

    Science.gov (United States)

    Qi, Yan-nan; Lü, Cheng-xu; Zhang, Jun-ning; Li, Ya-shuo; Zeng, Zhen; Mao, Wen-hua; Jiang, Han-lu; Yang, Bing-nan

    2018-02-01

    Objective: China's potato staple food strategy clearly points out the need to improve potato processing, while the bottleneck of this strategy is the technology and equipment for selecting appropriate raw and processed potatoes. The purpose of this paper is to summarize advanced detection methods for raw and processed potatoes. Method: Research literature in the field of image-recognition-based potato quality detection was surveyed, covering shape, weight, mechanical damage, germination, greening, black heart, scab, etc.; the development and direction of this field are summarized in this paper. Result: To obtain whole-potato surface information, hardware was built that synchronizes the image sensor and conveyor belt to capture multi-angle images of a single potato. Research on image recognition of potato shape is popular and mature, covering qualitative discrimination of abnormal versus sound potatoes, and even round versus oval potatoes, with recognition accuracy above 83%. Weight is an important indicator for potato grading, and image-based classification accuracy exceeds 93%. Image recognition of potato mechanical damage focuses on qualitative identification, with damage shape and damage time as the main affecting factors. Image recognition of potato germination usually uses the potato surface image and edge germination points. Both qualitative and quantitative detection of green potatoes have been researched; currently, scab and black heart image recognition must be performed in a stable detection environment or with a specific device. Image recognition of processed potato mainly focuses on potato chips, slices, fries, etc. Conclusion: Image recognition as a rapid food inspection tool has been widely researched for raw and processed potato quality analyses; its techniques and equipment have the potential for commercialization in the short term, to meet the strategic demand of developing potato as

  15. Process and device for detecting tumours of the eyes

    International Nuclear Information System (INIS)

    Safi, Nour; Thoreson, Elisabeth.

    1975-01-01

    This invention refers to a process and system for detecting tumours of the eye likely to take up radioelements. To this end, the invention proposes a detection process whereby a molecule labelled with a radioelement emitting beta radiation, whose energy spectrum extends beyond the Cerenkov threshold in the vitreous humor of the eye, is introduced into the circulatory system of the patient under examination, and whereby the Cerenkov emission is measured through the lens and pupil. In particular, a β-emitting radioelement, notably ³²P, which is avidly taken up by tissue with high metabolic activity, can be employed. The invention also proposes a system implementing the process described above, comprising a dioptric system for transmitting the light produced in the vitreous humor by the Cerenkov effect to a light detector of a type enabling the luminous flux it receives to be integrated [fr

  16. Analysis of Android Device-Based Solutions for Fall Detection

    Directory of Open Access Journals (Sweden)

    Eduardo Casilari

    2015-07-01

    Full Text Available Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation on automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions.

  17. Analysis of Android Device-Based Solutions for Fall Detection.

    Science.gov (United States)

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-07-23

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation on automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions.

  18. Analysis of Android Device-Based Solutions for Fall Detection

    Science.gov (United States)

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-01-01

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation on automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions. PMID:26213928

  19. Detecting Difference between Process Models Based on the Refined Process Structure Tree

    Directory of Open Access Journals (Sweden)

    Jing Fan

    2017-01-01

    Full Text Available The development of mobile workflow management systems (mWfMS) leads to a large number of business process models. In the meantime, the location restrictions embedded in mWfMS may result in different process models for a single business process. In order to help users quickly locate the differences and rebuild the process model, detecting the difference between process models is needed. Existing detection methods either provide a dissimilarity value to represent the difference or use predefined difference templates to generate the result, which cannot reflect the entire composition of the difference. Hence, in this paper, we present a new approach to solve this problem. Firstly, we parse the process models into their corresponding refined process structure trees (PSTs), that is, we decompose each process model into a hierarchy of subprocess models. Then we design a method to convert a PST into its corresponding task-based process structure tree (TPST). As a consequence, the problem of detecting the difference between two process models is transformed into detecting the difference between their corresponding TPSTs. Finally, we obtain the difference between two TPSTs based on a divide and conquer strategy, where the difference is described by an edit script whose cost we keep close to the minimum. An extensive experimental evaluation shows that our method can meet real requirements in terms of precision and efficiency.
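
    A minimal sketch of the output form under strong simplifications: each TPST is represented as a nested tuple (operator, children...) with task names as leaves, and a greedy recursive comparison emits an edit script; the paper's algorithm additionally drives the script's cost toward the minimum.

      def diff(a, b, path="root"):
          # Greedy edit script between two TPSTs given as nested tuples.
          ops = []
          if isinstance(a, str) or isinstance(b, str):
              if a != b:
                  ops.append(f"replace {a!r} with {b!r} at {path}")
              return ops
          if a[0] != b[0]:
              ops.append(f"change operator {a[0]!r} -> {b[0]!r} at {path}")
          for i, (ca, cb) in enumerate(zip(a[1:], b[1:])):
              ops += diff(ca, cb, f"{path}.{i}")
          ops += [f"delete {extra!r} under {path}" for extra in a[len(b):]]
          ops += [f"insert {extra!r} under {path}" for extra in b[len(a):]]
          return ops

      m1 = ("seq", "check stock", ("xor", "ship", "backorder"))
      m2 = ("seq", "check stock", ("xor", "ship", "cancel"), "notify customer")
      print("\n".join(diff(m1, m2)))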

  20. EXOPLANETARY DETECTION BY MULTIFRACTAL SPECTRAL ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Agarwal, Sahil; Wettlaufer, John S. [Program in Applied Mathematics, Yale University, New Haven, CT (United States); Sordo, Fabio Del [Department of Astronomy, Yale University, New Haven, CT (United States)

    2017-01-01

    Owing to technological advances, the number of exoplanets discovered has risen dramatically in the last few years. However, when trying to observe Earth analogs, it is often difficult to test the veracity of detection. We have developed a new approach to the analysis of exoplanetary spectral observations based on temporal multifractality, which identifies timescales that characterize planetary orbital motion around the host star and those that arise from stellar features such as spots. Without fitting stellar models to spectral data, we show how the planetary signal can be robustly detected from noisy data using noise amplitude as a source of information. For observation of transiting planets, combining this method with simple geometry allows us to relate the timescales obtained to the primary and secondary eclipses of the exoplanets. Making use of data obtained with ground-based and space-based observations, we have tested our approach on HD 189733b. Moreover, we have investigated the use of this technique in measuring planetary orbital motion via Doppler shift detection. Finally, we have analyzed synthetic spectra obtained using the SOAP 2.0 tool, which simulates a stellar spectrum and the influence of the presence of a planet or a spot on that spectrum over one orbital period. We have demonstrated that, so long as the signal-to-noise ratio is ≥75, our approach reconstructs the planetary orbital period, as well as the rotation period of a spot on the stellar surface.

  1. Detection and analysis of CRISPRs of Shigella.

    Science.gov (United States)

    Guo, Xiangjiao; Wang, Yingfang; Duan, Guangcai; Xue, Zerun; Wang, Linlin; Wang, Pengfei; Qiu, Shaofu; Xi, Yuanlin; Yang, Haiyan

    2015-01-01

    The recently discovered CRISPRs (Clustered Regularly Interspaced Short Palindromic Repeats) and Cas (CRISPR-associated) proteins are a novel genetic barrier that limits horizontal gene transfer in prokaryotes, and the CRISPR loci provide a historical view of the exposure of prokaryotes to a variety of foreign genetic elements. The aim of this study was to investigate the occurrence and distribution of CRISPRs in Shigella. A collection of 61 Shigella strains was screened for the existence of CRISPRs. Three CRISPR loci were identified among the 61 strains. CRISPR1/cas loci were detected in 49 strains, although IS elements were detected within the cas genes of some strains. In the remaining 12 Shigella flexneri strains, the CRISPR1/cas locus is deleted and only a cas3' pseudogene and a repeat sequence are present. The presence of CRISPR2 is frequently accompanied by the presence of CRISPR1. CRISPR3 loci were present in almost all strains (52/61). The length of the CRISPR arrays varied from 1 to 9 spacers. Sequence analysis of the CRISPR arrays revealed that few spacers had matches in the GenBank databases. However, one spacer in the CRISPR3 loci matches the cognate cas3 genes, and no cas gene was present around the CRISPR3 region. Analysis of the CRISPR sequences shows that the loci change little, which makes CRISPRs poor genotyping markers. The present study is the first attempt to determine and analyze CRISPRs of Shigella isolated from clinical patients.
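
    For illustration, a toy direct-repeat scanner in the spirit of CRISPR array detection is sketched below; real screening tools are far more sophisticated, and all parameter values here are illustrative rather than taken from the study.

```python
# Toy direct-repeat scanner: find a k-mer that recurs with spacer-sized
# gaps, the basic signature of a CRISPR array. Overlapping hits are not
# deduplicated; parameters are illustrative only.

def find_crispr_like(seq, k=23, min_gap=20, max_gap=50, min_repeats=3):
    hits = []
    for start in range(len(seq) - k):
        repeat, positions, pos = seq[start:start + k], [start], start
        while True:
            nxt = seq.find(repeat, pos + k + min_gap, pos + k + max_gap + k)
            if nxt == -1:
                break
            positions.append(nxt)
            pos = nxt
        if len(positions) >= min_repeats:
            hits.append((repeat, positions))
    return hits

unit = "GTTTCCGTCCCCTCTCGGGGTAA"            # 23-nt repeat (synthetic)
array = unit + "A" * 30 + unit + "C" * 33 + unit
print(find_crispr_like(array)[0])           # -> (unit, [0, 53, 109])
```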

  2. Detection, information fusion, and temporal processing for intelligence in recognition

    Energy Technology Data Exchange (ETDEWEB)

    Casasent, D. [Carnegie Mellon Univ., Pittsburgh, PA (United States)]

    1996-12-31

    The use of intelligence in vision recognition draws on many different techniques or tools. This presentation discusses several of these techniques for recognition. The recognition process is generally separated into several steps or stages when implemented in hardware, e.g. detection, segmentation and enhancement, and recognition. Several new distortion-invariant filters, biologically-inspired Gabor wavelet filter techniques, and morphological operations that have been found very useful for detection and clutter rejection are discussed. These are all shift-invariant operations that allow multiple object regions of interest in a scene to be located in parallel. We also discuss new algorithm fusion concepts by which the results from different detection algorithms are combined to reduce detection false alarms; these fusion methods utilize hierarchical processing and fuzzy logic concepts. We have found this to be most necessary, since no single detection algorithm is best for all cases. For the final recognition stage, we describe a new method of representing all distorted versions of different classes of objects and determining the object class and pose that most closely matches that of a given input. Besides being efficient in terms of storage and on-line computations required, it overcomes many of the problems that other classifiers have in terms of the required training set size, poor generalization with many hidden layer neurons, etc. It is also attractive in its ability to reject input regions as clutter (non-objects) and to learn new object descriptions. We also discuss its use in processing a temporal sequence of input images of the contents of each local region of interest. We note how this leads to robust results in which estimation errors in individual frames can be overcome. This is very practical, since in many scenarios a decision need not be made after only one frame of data; subsequent frames of data enter immediately in sequence.

  3. Image Processing for Detection of Oral White Sponge Nevus Lesions

    Directory of Open Access Journals (Sweden)

    Rajdeep Mitra

    2016-12-01

    Full Text Available White Sponge Nevus is a rare hereditary disease in humans that causes incurable white lesions in the oral mucosa. An appropriate history and clinical examination, along with biopsy and cytological studies, are helpful for diagnosis of this disorder. Identification can also be made in an alternative way by applying an image processing technique using watershed segmentation with MATLAB software. The applied techniques are effective and reliable for early, accurate detection of the disease as an alternative to expert clinical examination and time-consuming laboratory investigations.
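
    The paper implements watershed segmentation in MATLAB; a minimal scikit-image analogue of that step might look as follows, with the threshold and size parameters as illustrative assumptions.

```python
from scipy import ndimage as ndi
from skimage import filters, morphology, segmentation

def segment_white_lesions(gray):
    """Watershed segmentation of bright lesion candidates in a 2-D
    grayscale image; threshold and size limits are illustrative."""
    mask = gray > filters.threshold_otsu(gray)        # bright regions
    mask = morphology.remove_small_objects(mask, 64)  # drop speckle
    distance = ndi.distance_transform_edt(mask)
    # One marker per strong distance-map peak region, then flood.
    markers, _ = ndi.label(distance > 0.5 * distance.max())
    return segmentation.watershed(-distance, markers, mask=mask)
```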

  4. Information theoretic analysis of canny edge detection in visual communication

    Science.gov (United States)

    Jiang, Bo; Rahman, Zia-ur

    2011-06-01

    In general edge detection evaluation, the edge detectors are examined, analyzed, and compared either visually or with a metric for a specific application. This analysis is usually independent of the characteristics of the image-gathering, transmission, and display processes that do impact the quality of the acquired image and thus the resulting edge image. We propose a new information theoretic analysis of edge detection that unites the different components of the visual communication channel and assesses edge detection algorithms in an integrated manner based on Shannon's information theory. The edge detection algorithm here is considered to achieve high performance only if the information rate from the scene to the edge approaches the maximum possible. Thus, by holding the initial conditions of the visual communication system constant, different edge detection algorithms can be evaluated. This analysis is normally limited to linear shift-invariant filters, so in order to examine the Canny edge operator in our proposed system, we need to estimate its "power spectral density" (PSD). Since the Canny operator is non-linear and shift variant, we perform the estimation for a set of different system environment conditions using simulations. In this paper we first introduce the PSD of the Canny operator for a range of system parameters. Then, using the estimated PSD, we assess the Canny operator using information theoretic analysis. The information-theoretic metric is also used to compare the performance of the Canny operator with other edge-detection operators. This also provides a simple tool for selecting appropriate edge-detection algorithms based on system parameters, and for adjusting their parameters to maximize information throughput.
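
    A minimal sketch of how such an empirical PSD estimate could be obtained by simulation, assuming white-noise input images and one fixed system setting (the sigma value and image sizes are illustrative):

```python
import numpy as np
from skimage import feature

# The Canny operator is non-linear and shift variant, so its "PSD" is
# estimated empirically: average the output power spectrum over many
# white-noise inputs at one fixed setting.
rng = np.random.default_rng(0)
n, trials = 128, 100
psd = np.zeros((n, n))
for _ in range(trials):
    edges = feature.canny(rng.normal(size=(n, n)), sigma=1.0).astype(float)
    spec = np.fft.fft2(edges - edges.mean())
    psd += np.abs(spec) ** 2 / (n * n)
psd /= trials
```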

  5. Detection of outliers by neural network on the gas centrifuge experimental data of isotopic separation process

    International Nuclear Information System (INIS)

    Andrade, Monica de Carvalho Vasconcelos

    2004-01-01

    This work presents and discusses a neural network technique aimed at detecting outliers in a set of gas centrifuge isotope separation experimental data. In order to evaluate the application of this new technique, the detection result is compared to the result of a statistical analysis combined with cluster analysis. This method for the detection of outliers presents considerable potential in the field of data analysis; it is at the same time easier and faster to use and requires much less knowledge of the physics involved in the process. This work established a procedure for detecting experiments suspected of containing gross errors within a data set where the usual techniques for identifying such errors cannot be applied, or where their use would demand excessively long work. (author)

  6. Terrain Mapping and Obstacle Detection Using Gaussian Processes

    DEFF Research Database (Denmark)

    Kjærgaard, Morten; Massaro, Alessandro Salvatore; Bayramoglu, Enis

    2011-01-01

    In this paper we consider a probabilistic method for extracting terrain maps from a scene and use the information to detect potential navigation obstacles within it. The method uses Gaussian process regression (GPR) to predict an estimate function and its relative uncertainty. To test the new...... show that the estimated maps follow the terrain shape, while protrusions are identified and may be isolated as potential obstacles. Representing the data with a covariance function allows a dramatic reduction of the amount of data to process, while maintaining the statistical properties of the measured...... and interpolated features....
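
    A minimal sketch of the GPR idea under stated assumptions (the kernel choice, noise level, and the 3-sigma protrusion rule are illustrative, not the authors' settings):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def terrain_model(X_train, z_train):
    """Fit a GPR terrain estimate; kernel scales are illustrative."""
    kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-2)
    return GaussianProcessRegressor(kernel, normalize_y=True).fit(X_train, z_train)

def flag_obstacles(gp, X_new, z_new, n_sigma=3.0):
    """Hypothetical rule: a reading well above the predicted ground and
    its uncertainty band is a protrusion (obstacle) candidate."""
    mu, sd = gp.predict(X_new, return_std=True)
    return z_new > mu + n_sigma * sd

rng = np.random.default_rng(1)
X = rng.uniform(0, 10, size=(300, 2))                  # scanned (x, y)
z = 0.1 * np.sin(X[:, 0]) + 0.05 * rng.normal(size=300)
gp = terrain_model(X, z)
print(flag_obstacles(gp, np.array([[5.0, 5.0]]), np.array([0.8])))
```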

  7. Gaussian process regression analysis for functional data

    CERN Document Server

    Shi, Jian Qing

    2011-01-01

    Gaussian Process Regression Analysis for Functional Data presents nonparametric statistical methods for functional regression analysis, specifically the methods based on a Gaussian process prior in a functional space. The authors focus on problems involving functional response variables and mixed covariates of functional and scalar variables.Covering the basics of Gaussian process regression, the first several chapters discuss functional data analysis, theoretical aspects based on the asymptotic properties of Gaussian process regression models, and new methodological developments for high dime

  8. Analysis of multiparty mediation processes

    NARCIS (Netherlands)

    Vuković, Siniša

    2013-01-01

    Crucial challenges for multiparty mediation processes include the achievement of adequate cooperation among the mediators and consequent coordination of their activities in the mediation process. Existing literature goes only as far as to make it clear that successful mediation requires necessary

  9. Experimental analysis of armouring process

    Science.gov (United States)

    Lamberti, Alberto; Paris, Ennio

    Preliminary results from an experimental investigation of armouring processes are presented. In particular, the development and formation of the armour layer under different steady flow conditions has been analyzed in terms of grain size variations and the sediment transport rate associated with each size fraction.

  10. Processing Satellite Imagery To Detect Waste Tire Piles

    Science.gov (United States)

    Skiles, Joseph; Schmidt, Cynthia; Wuinlan, Becky; Huybrechts, Catherine

    2007-01-01

    A methodology for processing commercially available satellite spectral imagery has been developed to enable identification and mapping of waste tire piles in California. The California Integrated Waste Management Board initiated the project and provided funding for the method's development. The methodology combines commercially available image-processing and georeferencing software to implement a model that specifically distinguishes tire piles from other objects. The methodology reduces the time that must be spent to initially survey a region for tire sites, thereby increasing the time inspectors and managers have available for remediation of the sites. Remediation is needed because millions of used tires are discarded every year, waste tire piles pose fire hazards, and mosquitoes often breed in water trapped in tires. It should be possible to adapt the methodology to regions outside California by modifying some of the algorithms implemented in the software to account for geographic differences in spectral characteristics associated with terrain and climate. The task of identifying tire piles in satellite imagery is uniquely challenging because of their low reflectance levels: tires tend to be spectrally confused with shadows and deep water, both of which reflect little light to satellite-borne imaging systems. In this methodology, the challenge is met, in part, by use of software that implements the Tire Identification from Reflectance (TIRe) model. The development of the TIRe model included incorporation of lessons learned in previous research on the detection and mapping of tire piles by use of manual/visual and/or computational analysis of aerial and satellite imagery. The TIRe model is a computational model for identifying tire piles and discriminating between tire piles and other objects. The input to the TIRe model consists of the georeferenced but otherwise raw satellite spectral images of the geographic region to be surveyed.

  11. Automatic detection and analysis of nuclear plant malfunctions

    International Nuclear Information System (INIS)

    Bruschi, R.; Di Porto, P.; Pallottelli, R.

    1985-01-01

    This paper proposes a system that dynamically performs the detection and analysis of malfunctions in a nuclear plant. The proposed method was developed and implemented on a reactor simulator, instead of on a real one, thus allowing a wide range of tests. For each variable under control, a simulation module was identified and implemented on the reactor's on-line computer. In the malfunction identification phase, all modules run separately, processing plant input variables and producing their output variables in real time; continuous comparison of the computed variables with the plant variables allows malfunction detection. At that moment the second phase can occur: when a malfunction is detected, all modules are connected, except the module simulating the faulty variable, and a fast simulation is carried out to analyse the consequences. (author)
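
    A schematic of this residual-based scheme, with all module and threshold details hypothetical:

```python
import numpy as np

def detect_malfunction(measured, simulate, inputs, threshold):
    """Flag samples where a measured variable drifts from the value its
    simulation module predicts; names and threshold are schematic."""
    return np.abs(measured - simulate(inputs)) > threshold

# Trivial linear module with a step fault injected at t = 60.
t = np.arange(100.0)
simulate = lambda u: 0.5 * u                       # expected behaviour
measured = 0.5 * t + np.where(t >= 60, 2.0, 0.0)   # faulty plant signal
alarm = detect_malfunction(measured, simulate, t, threshold=1.0)
print("first alarm at t =", t[alarm][0])           # -> 60.0
```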

  12. Elastic recoil detection analysis of ferroelectric films

    Energy Technology Data Exchange (ETDEWEB)

    Stannard, W.B.; Johnston, P.N.; Walker, S.R.; Bubb, I.F. [Royal Melbourne Inst. of Tech., VIC (Australia)]; Scott, J.F. [New South Wales Univ., Kensington, NSW (Australia)]; Cohen, D.D.; Dytlewski, N. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)]

    1996-12-31

    There has been considerable progress in developing SrBi{sub 2}Ta{sub 2}O{sub 9} (SBT) and Ba{sub 0.7}Sr{sub 0.3}TiO{sub 3} (BST) ferroelectric films for use as nonvolatile memory chips and for capacitors in dynamic random access memories (DRAMs). Ferroelectric materials have a very large dielectric constant ({approx} 1000), approximately one hundred times greater than that of silicon dioxide. Devices made from these materials have been known to experience breakdown after repeated voltage pulsing. It has been suggested that this is related to stoichiometric changes within the material. To accurately characterise these materials, Elastic Recoil Detection Analysis (ERDA) is being developed. This technique employs a high energy heavy ion beam to eject nuclei from the target and uses a time of flight and energy dispersive (ToF-E) detector telescope to detect these nuclei. The recoil nuclei carry both energy and mass information, which enables the determination of separate energy spectra for individual elements or for small groups of elements. In this work, ERDA employing 77 MeV {sup 127}I ions has been used to analyse strontium bismuth tantalate thin films at the heavy ion recoil facility at ANSTO, Lucas Heights. 9 refs., 5 figs.
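
    For reference, in a ToF-E telescope the recoil mass follows from the measured energy and the flight time over the known path length by standard non-relativistic kinematics:

```latex
% Standard non-relativistic ToF-E kinematics: the recoil mass m follows
% from the measured energy E and flight time t over path length L.
E = \tfrac{1}{2}\, m v^{2}, \qquad v = \frac{L}{t}
\quad\Longrightarrow\quad
m = \frac{2 E\, t^{2}}{L^{2}}
```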

  13. Elastic recoil detection analysis of ferroelectric films

    Energy Technology Data Exchange (ETDEWEB)

    Stannard, W B; Johnston, P N; Walker, S R; Bubb, I F [Royal Melbourne Inst. of Tech., VIC (Australia)]; Scott, J F [New South Wales Univ., Kensington, NSW (Australia)]; Cohen, D D; Dytlewski, N [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)]

    1997-12-31

    There has been considerable progress in developing SrBi{sub 2}Ta{sub 2}O{sub 9} (SBT) and Ba{sub 0.7}Sr{sub 0.3}TiO{sub 3} (BST) ferroelectric films for use as nonvolatile memory chips and for capacitors in dynamic random access memories (DRAMs). Ferroelectric materials have a very large dielectric constant ({approx} 1000), approximately one hundred times greater than that of silicon dioxide. Devices made from these materials have been known to experience breakdown after repeated voltage pulsing. It has been suggested that this is related to stoichiometric changes within the material. To accurately characterise these materials, Elastic Recoil Detection Analysis (ERDA) is being developed. This technique employs a high energy heavy ion beam to eject nuclei from the target and uses a time of flight and energy dispersive (ToF-E) detector telescope to detect these nuclei. The recoil nuclei carry both energy and mass information, which enables the determination of separate energy spectra for individual elements or for small groups of elements. In this work, ERDA employing 77 MeV {sup 127}I ions has been used to analyse strontium bismuth tantalate thin films at the heavy ion recoil facility at ANSTO, Lucas Heights. 9 refs., 5 figs.

  14. Hypernasal Speech Detection by Acoustic Analysis of Unvoiced Plosive Consonants

    Directory of Open Access Journals (Sweden)

    Alexander Sepúlveda-Sepúlveda

    2009-12-01

    Full Text Available People with a defective velopharyngeal mechanism speak with abnormal nasal resonance (hypernasal speech). Voice analysis methods for hypernasality detection commonly use vowels and nasalized vowels. However, to obtain a more general assessment of this abnormality it is necessary to analyze stops and fricatives. This study describes a method with high generalization capability for hypernasality detection based on the analysis of unvoiced Spanish stop consonants. The importance of phoneme-by-phoneme analysis is shown, in contrast with whole-word parametrization, which includes segments that are irrelevant from the classification point of view. Parameters that correlate with the imprints of Velopharyngeal Incompetence (VPI) on voiceless stop consonants were used in the feature estimation stage. Classification was carried out using a Support Vector Machine (SVM), including the Rademacher complexity model with the aim of increasing the generalization capability. Performances of 95.2% and 92.7% were obtained in the processing and verification stages for a repeated cross-validation classifier evaluation.

  15. Outlier Detection with Space Transformation and Spectral Analysis

    DEFF Research Database (Denmark)

    Dang, Xuan-Hong; Micenková, Barbora; Assent, Ira

    2013-01-01

    Detecting a small number of outliers from a set of data observations is always challenging. In this paper, we present an approach that exploits space transformation and uses spectral analysis in the newly transformed space for outlier detection. Unlike most existing techniques in the literature, which rely on notions of distances or densities, this approach introduces a novel concept based on local quadratic entropy for evaluating the similarity of a data object with its neighbors. This information theoretic quantity is used to regularize the closeness amongst data instances and subsequently benefits the process of mapping data into a usually lower dimensional space. Outliers are then identified by spectral analysis of the eigenspace spanned by the set of leading eigenvectors derived from the mapping procedure. The proposed technique is purely data-driven and imposes no assumptions regarding...

  16. Formal analysis of design process dynamics

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2010-01-01

    This paper presents a formal analysis of design process dynamics. Such a formal analysis is a prerequisite to come to a formal theory of design and for the development of automated support for the dynamics of design processes. The analysis was geared toward the identification of dynamic design

  18. Improved Anomaly Detection using Integrated Supervised and Unsupervised Processing

    Science.gov (United States)

    Hunt, B.; Sheppard, D. G.; Wetterer, C. J.

    There are two broad technologies of signal processing applicable to space object feature identification using nonresolved imagery: supervised processing analyzes a large set of data for common characteristics that can then be used to identify, transform, and extract information from new data taken of the same given class (e.g. support vector machine); unsupervised processing utilizes detailed physics-based models that generate comparison data that can then be used to estimate parameters presumed to be governed by the same models (e.g. estimation filters). Both processes have been used in non-resolved space object identification and yield similar results, yet arrived at via vastly different processes. The goal of integrating the two is to achieve even greater performance by building on this process diversity. Specifically, both supervised processing and unsupervised processing will jointly operate on the analysis of brightness (radiometric flux intensity) measurements of light reflected by space objects and observed by a ground station to determine whether a particular day conforms to a nominal operating mode (as determined from a training set) or exhibits anomalous behavior where a particular parameter (e.g. attitude, solar panel articulation angle) has changed in some way. It is demonstrated in a variety of different scenarios that the integrated process achieves greater performance than each of the separate processes alone.

  19. Advances in face detection and facial image analysis

    CERN Document Server

    Celebi, M; Smolka, Bogdan

    2016-01-01

    This book presents the state-of-the-art in face detection and analysis. It outlines new research directions, including in particular psychology-based facial dynamics recognition, aimed at various applications such as behavior analysis, deception detection, and diagnosis of various psychological disorders. Topics of interest include face and facial landmark detection, face recognition, facial expression and emotion analysis, facial dynamics analysis, face classification, identification, and clustering, and gaze direction and head pose estimation, as well as applications of face analysis.

  20. Detection and Processing Techniques of FECG Signal for Fetal Monitoring

    Directory of Open Access Journals (Sweden)

    Hasan MA

    2009-03-01

    Full Text Available Abstract The fetal electrocardiogram (FECG) signal contains potentially precise information that could assist clinicians in making more appropriate and timely decisions during labor. The ultimate reason for the interest in FECG signal analysis is its value in clinical diagnosis and biomedical applications. The extraction and detection of the FECG signal from composite abdominal signals with powerful and advanced methodologies are becoming very important requirements in fetal monitoring. The purpose of this review paper is to illustrate the various methodologies and developed algorithms for FECG signal detection and analysis, to provide an efficient and effective way of understanding the FECG signal and its nature for fetal monitoring. A comparative study has been carried out to show the performance and accuracy of various methods of FECG signal analysis for fetal monitoring. Finally, this paper focuses on some of the hardware implementations using electrical signals for monitoring the fetal heart rate. This paper opens up a path for researchers, physicians, and end users to gain an excellent understanding of the FECG signal and its analysis procedures for fetal heart rate monitoring systems.

  1. Algorithms Development in Detection of the Gelatinization Process during Enzymatic ‘Dodol’ Processing

    Directory of Open Access Journals (Sweden)

    Azman Hamzah

    2013-09-01

    Full Text Available Computer vision systems have found wide application in the food processing industry to perform quality evaluation. These systems make it possible to replace human inspectors for the evaluation of a variety of quality attributes. This paper describes the implementation of the Fast Fourier Transform and Kalman filtering algorithms to detect glutinous rice flour slurry (GRFS) gelatinization in an enzymatic 'dodol' processing. The onset of GRFS gelatinization is critical in determining the quality of an enzymatic 'dodol'. Combinations of these two algorithms were able to detect the gelatinization of the GRFS. The results show that the gelatinization of the GRFS occurred in the time range of 11.75 minutes to 14.75 minutes for 24 batches of processing. This paper highlights the capability of computer vision using our proposed algorithms in monitoring and controlling an enzymatic 'dodol' processing via image processing technology.
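
    A rough sketch of how the two algorithms could be combined, assuming a per-frame FFT texture feature smoothed by a scalar random-walk Kalman filter; the feature definition and filter constants are illustrative, not the paper's:

```python
import numpy as np

def hf_energy(frame):
    """FFT texture feature: share of spectral power beyond a radius cutoff
    (the cutoff is an illustrative choice)."""
    f = np.fft.fftshift(np.fft.fft2(frame - frame.mean()))
    p = np.abs(f) ** 2
    h, w = frame.shape
    yy, xx = np.ogrid[:h, :w]
    r = np.hypot(yy - h / 2, xx - w / 2)
    return p[r > min(h, w) / 8].sum() / p.sum()

def kalman_smooth(z, q=1e-4, r=1e-2):
    """Scalar random-walk Kalman filter over the per-frame feature."""
    x, p, out = z[0], 1.0, []
    for zk in z:
        p += q                    # predict
        k = p / (p + r)           # Kalman gain
        x += k * (zk - x)         # update
        p *= 1.0 - k
        out.append(x)
    return np.array(out)

# Gelatinization onset would be the first frame where the smoothed
# feature crosses a calibrated level (process-specific).
```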

  2. Algorithms Development in Detection of the Gelatinization Process during Enzymatic ‘Dodol’ Processing

    Directory of Open Access Journals (Sweden)

    Azman Hamzah

    2007-11-01

    Full Text Available Computer vision systems have found wide application in the food processing industry to perform quality evaluation. These systems make it possible to replace human inspectors for the evaluation of a variety of quality attributes. This paper describes the implementation of the Fast Fourier Transform and Kalman filtering algorithms to detect glutinous rice flour slurry (GRFS) gelatinization in an enzymatic 'dodol' processing. The onset of GRFS gelatinization is critical in determining the quality of an enzymatic 'dodol'. Combinations of these two algorithms were able to detect the gelatinization of the GRFS. The results show that the gelatinization of the GRFS occurred in the time range of 11.75 minutes to 15.33 minutes for 20 batches of processing. This paper highlights the capability of computer vision using our proposed algorithms in monitoring and controlling an enzymatic 'dodol' processing via image processing technology.

  3. Does facial processing prioritize change detection?: change blindness illustrates costs and benefits of holistic processing.

    Science.gov (United States)

    Wilford, Miko M; Wells, Gary L

    2010-11-01

    There is broad consensus among researchers both that faces are processed more holistically than other objects and that this type of processing is beneficial. We predicted that holistic processing of faces also involves a cost, namely, a diminished ability to localize change. This study (N = 150) utilized a modified change-blindness paradigm in which some trials involved a change in one feature of an image (nose, chin, mouth, hair, or eyes for faces; chimney, porch, window, roof, or door for houses), whereas other trials involved no change. People were better able to detect the occurrence of a change for faces than for houses, but were better able to localize which feature had changed for houses than for faces. Half the trials used inverted images, a manipulation that disrupts holistic processing. With inverted images, the critical interaction between image type (faces vs. houses) and task (change detection vs. change localization) disappeared. The results suggest that holistic processing reduces change-localization abilities.

  4. Discontinuity Detection in the Shield Metal Arc Welding Process.

    Science.gov (United States)

    Cocota, José Alberto Naves; Garcia, Gabriel Carvalho; da Costa, Adilson Rodrigues; de Lima, Milton Sérgio Fernandes; Rocha, Filipe Augusto Santos; Freitas, Gustavo Medeiros

    2017-05-10

    This work proposes a new methodology for the detection of discontinuities in the weld bead in Shielded Metal Arc Welding (SMAW) processes. The detection system is based on two sensors, a microphone and a piezoelectric transducer, that acquire acoustic emissions generated during welding. The feature vectors extracted from the sensor dataset are used to construct classifier models. Approaches based on Artificial Neural Network (ANN) and Support Vector Machine (SVM) classifiers are able to identify with high accuracy the three proposed weld bead classes: desirable weld bead, shrinkage cavity, and burn-through discontinuities. Experimental results illustrate the system's high accuracy, greater than 90% for each class. A novel Hierarchical Support Vector Machine (HSVM) structure is proposed to make the use of this system feasible in industrial environments. This approach achieved 96.6% overall accuracy. Given the simplicity of the equipment involved, the system can be applied in the metal transformation industries.
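
    A hedged sketch of one plausible two-stage HSVM arrangement (the abstract does not spell out the exact structure): level one separates defective from desirable beads, level two separates the two defect types.

```python
import numpy as np
from sklearn.svm import SVC

class HierarchicalSVM:
    """Two-stage classifier: defect vs. no defect, then defect type.
    Labels: 0 = desirable bead, 1 = shrinkage cavity, 2 = burn-through.
    Acoustic feature extraction is assumed to happen upstream."""

    def fit(self, X, y):
        self.top = SVC(kernel="rbf").fit(X, (y > 0).astype(int))
        defect = y > 0
        self.sub = SVC(kernel="rbf").fit(X[defect], y[defect])
        return self

    def predict(self, X):
        out = self.top.predict(X)
        d = out == 1
        if d.any():
            out[d] = self.sub.predict(X[d])
        return out
```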

  5. Automatic detection of health changes using statistical process control techniques on measured transfer times of elderly.

    Science.gov (United States)

    Baldewijns, Greet; Luca, Stijn; Nagels, William; Vanrumste, Bart; Croonenborghs, Tom

    2015-01-01

    It has been shown that gait speed and transfer times are good measures of functional ability in the elderly. However, data currently acquired by systems that measure either gait speed or transfer times in the homes of elderly people require manual review by healthcare workers, and this reviewing process is time-consuming. To alleviate this burden, this paper proposes the use of statistical process control (SPC) methods to automatically detect both positive and negative changes in transfer times. Three SPC techniques known for their ability to detect small shifts in the data, tabular CUSUM, standardized CUSUM, and EWMA, are evaluated on simulated transfer times. This analysis shows that EWMA is the best-suited method, with a detection accuracy of 82% and an average detection time of 9.64 days.
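
    For reference, a minimal EWMA control chart over a sequence of daily transfer times, using the textbook statistic and variance-corrected limits (the λ and L values are conventional design choices, not the paper's):

```python
import numpy as np

def ewma_chart(x, mu0, sigma0, lam=0.2, L=3.0):
    """EWMA control chart: returns the statistic and out-of-control flags.
    mu0/sigma0 come from an in-control baseline period."""
    z = np.empty(len(x))
    flags = np.zeros(len(x), dtype=bool)
    zt = mu0
    for t, xt in enumerate(x):
        zt = lam * xt + (1 - lam) * zt
        z[t] = zt
        width = L * sigma0 * np.sqrt(
            lam / (2 - lam) * (1 - (1 - lam) ** (2 * (t + 1))))
        flags[t] = abs(zt - mu0) > width
    return z, flags

# Example: simulated transfer times drift upward from day 40 on.
rng = np.random.default_rng(0)
times = np.r_[rng.normal(8, 1, 40), rng.normal(9.5, 1, 40)]
_, flags = ewma_chart(times, mu0=8.0, sigma0=1.0)
print("first signal on day", int(np.argmax(flags)))
```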

  6. Entropy Measures for Stochastic Processes with Applications in Functional Anomaly Detection

    Directory of Open Access Journals (Sweden)

    Gabriel Martos

    2018-01-01

    Full Text Available We propose a definition of entropy for stochastic processes. We provide a reproducing kernel Hilbert space model to estimate entropy from a random sample of realizations of a stochastic process, namely functional data, and introduce two approaches to estimate minimum entropy sets. These sets are relevant for detecting anomalous or outlier functional data. A numerical experiment illustrates the performance of the proposed method; in addition, we conduct an analysis of mortality rate curves as an interesting application in a real-data context to explore functional anomaly detection.

  7. Vision based error detection for 3D printing processes

    Directory of Open Access Journals (Sweden)

    Baumann Felix

    2016-01-01

    Full Text Available 3D printers became more popular in the last decade, partly because of the expiration of key patents and the supply of affordable machines. The technology originated in rapid prototyping. With Additive Manufacturing (AM) it is possible to create physical objects from 3D model data by layer-wise addition of material. Besides professional use for prototyping and low-volume manufacturing, these machines are becoming widespread amongst end users, starting with the so-called Maker Movement. The most prevalent type of consumer-grade 3D printer is Fused Deposition Modelling (FDM), also called Fused Filament Fabrication (FFF). This work focuses on FDM machinery because of its widespread occurrence and its large number of open problems, such as precision and failure. These 3D printers can fail to print objects at a statistical rate depending on the manufacturer and model of the printer. Failures can occur due to misalignment of the print bed or the print head, slippage of the motors, warping of the printed material, lack of adhesion, or other reasons. The goal of this research is to provide an environment in which these failures can be detected automatically. Direct supervision is inhibited by the recommended placement of FDM printers in separate rooms, away from the user, due to ventilation issues. The inability to oversee the printing process leads to late or omitted detection of failures. Rejects cause material waste and wasted time, thus lowering the utilization of printing resources. Our approach consists of a camera-based error detection mechanism that provides a web-based interface for remote supervision and early failure detection. Early failure detection can lead to reduced time spent on broken prints, less material wasted, and in some cases salvaged objects.

  8. High efficiency processing for reduced amplitude zones detection in the HRECG signal

    Science.gov (United States)

    Dugarte, N.; Álvarez, A.; Balacco, J.; Mercado, G.; Gonzalez, A.; Dugarte, E.; Olivares, A.

    2016-04-01

    Summary - This article presents part of a broader research effort proposed for the medium to long term, with the intention of establishing a new philosophy of surface electrocardiogram analysis. This research aims to find indicators of cardiovascular disease in its early stage that may go unnoticed with conventional electrocardiography. This paper reports the development of processing software that collects some existing techniques and incorporates novel methods for the detection of reduced amplitude zones (RAZ) in the high resolution electrocardiographic (HRECG) signal. The algorithm consists of three stages: efficient processing for QRS detection, an averaging filter using correlation techniques, and a step for RAZ detection. Preliminary results show the efficiency of the system and point to the incorporation of new signal analysis techniques involving all 12 leads.
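
    The first stage, QRS detection, can be sketched with a compact Pan-Tompkins-style chain; this is a generic sketch, not the authors' exact processing:

```python
import numpy as np
from scipy import signal

def detect_qrs(ecg, fs):
    """Pan-Tompkins-style QRS locator: band-pass, differentiate, square,
    integrate, peak-pick. Thresholds and windows are illustrative."""
    b, a = signal.butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
    filt = signal.filtfilt(b, a, ecg)
    energy = np.square(np.diff(filt, prepend=filt[0]))
    win = int(0.15 * fs)                               # ~150 ms window
    mwi = np.convolve(energy, np.ones(win) / win, mode="same")
    peaks, _ = signal.find_peaks(mwi, height=0.3 * mwi.max(),
                                 distance=int(0.25 * fs))
    return peaks
```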

  9. Mesh Processing in Medical Image Analysis

    DEFF Research Database (Denmark)

    The following topics are dealt with: mesh processing; medical image analysis; interactive freeform modeling; statistical shape analysis; clinical CT images; statistical surface recovery; automated segmentation; cerebral aneurysms; and real-time particle-based representation....

  10. Automatic Detection and Resolution of Lexical Ambiguity in Process Models

    NARCIS (Netherlands)

    Pittke, F.; Leopold, H.; Mendling, J.

    2015-01-01

    System-related engineering tasks are often conducted using process models. In this context, it is essential that these models do not contain structural or terminological inconsistencies. To this end, several automatic analysis techniques have been proposed to support quality assurance. While formal

  11. Process and apparatus for detecting presence of plant substances

    International Nuclear Information System (INIS)

    Kirby, J.A.

    1991-01-01

    This patent describes an apparatus and process for detecting the presence of plant substances in a particular environment. It comprises: measuring the background K40 gamma ray radiation level in a particular environment with a 1.46 MeV gamma ray counter system; measuring the amount of K40 gamma ray radiation emanating from a package containing a plant substance being passed through an environment with a counter; and generating an alarm signal when the total K40 gamma ray radiation reaches a predetermined level over and above the background level

  12. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in those process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data from improvement practices. We evaluate the model using industrial data.

  13. On damage detection in wind turbine gearboxes using outlier analysis

    Science.gov (United States)

    Antoniadou, Ifigeneia; Manson, Graeme; Dervilis, Nikolaos; Staszewski, Wieslaw J.; Worden, Keith

    2012-04-01

    The proportion of worldwide installed wind power in power systems increases over the years as a result of the steadily growing interest in renewable energy sources. Still, the advantages offered by the use of wind power are overshadowed by the high operational and maintenance costs, resulting in the low competitiveness of wind power in the energy market. In order to reduce the costs of corrective maintenance, the application of condition monitoring to gearboxes becomes highly important, since gearboxes are among the wind turbine components with the most frequent failure observations. While condition monitoring of gearboxes in general is common practice, with various methods having been developed over the last few decades, wind turbine gearbox condition monitoring faces a major challenge: the detection of faults under the time-varying load conditions prevailing in wind turbine systems. Classical time and frequency domain methods fail to detect faults under variable load conditions, due to the temporary effect that these faults have on vibration signals. This paper uses the statistical discipline of outlier analysis for the damage detection of gearbox tooth faults. A simplified two-degree-of-freedom gearbox model considering nonlinear backlash, time-periodic mesh stiffness, and static transmission error simulates the vibration signals to be analysed. Local stiffness reduction is used for the simulation of tooth faults, and statistical processes determine the existence of intermittencies. The lowest level of fault detection, the threshold value, is considered, and the Mahalanobis squared distance is calculated for the novelty detection problem.
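
    A minimal sketch of the Mahalanobis-based novelty detection step, assuming feature vectors extracted from the vibration signals and a chi-squared threshold (which presumes roughly Gaussian healthy-condition features):

```python
import numpy as np
from scipy.stats import chi2

def fit_novelty_detector(healthy, alpha=0.999):
    """Outlier analysis trained on healthy-condition vibration features.
    The chi-squared threshold assumes roughly Gaussian features."""
    mu = healthy.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(healthy, rowvar=False))
    limit = chi2.ppf(alpha, df=healthy.shape[1])

    def score(x):
        d = np.atleast_2d(x) - mu
        d2 = np.einsum("ni,ij,nj->n", d, cov_inv, d)   # squared distance
        return d2, d2 > limit

    return score

rng = np.random.default_rng(0)
detector = fit_novelty_detector(rng.normal(size=(500, 4)))
print(detector(np.array([0.1, -0.2, 0.0, 6.0])))       # clear outlier
```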

  14. Process Integration Analysis of an Industrial Hydrogen Production Process

    OpenAIRE

    Stolten, Detlef; Grube, Thomas; Tock, Laurence; Maréchal, François; Metzger, Christian; Arpentinier, Philippe

    2010-01-01

    The energy efficiency of an industrial hydrogen production process using steam methane reforming (SMR) combined with the water gas shift reaction (WGS) is analyzed using process integration techniques based on heat cascade calculation and pinch analysis with the aim of identifying potential measures to enhance the process performance. The challenge is to satisfy the high temperature heat demand of the SMR reaction by minimizing the consumption of natural gas to feed the combustion and to expl...

  15. Automatic processing of isotopic dilution curves obtained by precordial detection

    International Nuclear Information System (INIS)

    Verite, J.C.

    1973-01-01

    Dilution curves pose two distinct problems: that of their acquisition and that of their processing. A study devoted only to the latter aspect is presented here. Two important conditions had to be satisfied: the treatment procedure, although applied to a single category of curves (isotopic dilution curves obtained by precordial detection), had to be as general as possible, and, to allow dissemination of the method, the equipment used had to be relatively modest and inexpensive. A simple method, treating the curve processing as a process identification problem, was developed; it should enable the mean heart cavity volume and certain pulmonary circulation parameters to be determined. Considerable difficulties were encountered, limiting the value of the results obtained, though not condemning the method itself. The curve processing question raises the problem of curve acquisition, i.e. the number of these curves and their meaning. A list of the difficulties encountered is followed by a set of possible solutions, a solution being understood to mean a curve processing combination where the overlap between the two aspects of the problem is accounted for.

  16. Preclosure Criticality Analysis Process Report

    International Nuclear Information System (INIS)

    Thomas, D.A.

    1999-01-01

    The design approach for criticality of the disposal container and waste package will be dictated by existing regulatory requirements. This conclusion is based on the fact that preclosure operations and facilities have significant similarities to existing facilities and operations currently regulated by the NRC. The major difference would be the use of a risk-informed approach with burnup credit. This approach could reduce licensing delays and costs of the repository. The probability of success for this proposed seamless licensing strategy is increased, since there is precedence of regulation (10 CFR Part 63 and NUREG 1520) and commercial precedence for allowing burnup credit at sites similar to Yucca Mountain during preclosure. While NUREG 1520 is not directly applicable to a facility for handling spent nuclear fuel, the risk-informed approach to criticality analysis in NUREG 1520 is considered indicative of how the NRC will approach risk-informed criticality analysis at spent fuel facilities in the future. The types of design basis events which must be considered during the criticality safety analysis portion of the Integrated Safety Analysis (ISA) are those events which result in unanticipated moderation, loss of neutron absorber, geometric changes in the critical system, or administrative errors in waste form placement (loading) of the disposal container. The specific events to be considered must be based on the review of the system's design, as discussed in Section 3.2. A transition of licensing approach (e.g., deterministic versus risk-informed, performance-based) is not obvious and will require analysis. For commercial spent nuclear fuel, the probability of interspersed moderation may be low enough to allow nearly the same Critical Limit for both preclosure and postclosure, though an administrative margin will be applied to preclosure and possibly not to postclosure. Similarly the Design Basis Events for the waste package may be incredible and therefore not

  18. Computer-assisted image processing to detect spores from the fungus Pandora neoaphidis.

    Science.gov (United States)

    Korsnes, Reinert; Westrum, Karin; Fløistad, Erling; Klingen, Ingeborg

    2016-01-01

    This contribution demonstrates an example of experimental automatic image analysis to detect spores prepared on microscope slides derived from trapping. The application is to monitor aerial spore counts of the entomopathogenic fungus Pandora neoaphidis, which may serve as a biological control agent for aphids. Automatic detection of such spores can therefore play a role in plant protection. The present approach for such detection is a modification of traditional manual microscopy of prepared slides, where autonomous image recording precedes computerised image analysis. The purpose of the present image analysis is to support human visual inspection of imagery data - not to replace it. The workflow has three components:
    • Preparation of slides for microscopy.
    • Image recording.
    • Computerised image processing, where the initial part is, as usual, segmentation depending on the actual data product. Then comes identification of blobs, calculation of the principal axes of blobs, symmetry operations, and projection onto a three-parameter egg-shape space (sketched below).
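
    The segmentation and blob-measurement steps could be sketched with scikit-image as follows; the projection onto the egg-shape space is paper-specific and omitted, and the thresholding assumptions are illustrative:

```python
from skimage import filters, measure, morphology

def spore_candidates(gray, min_area=30):
    """Segment blobs and measure the principal axes used for shape tests.
    Assumes dark spores on a lighter background (illustrative)."""
    mask = gray < filters.threshold_otsu(gray)
    mask = morphology.remove_small_objects(mask, min_area)
    labels = measure.label(mask)
    return [(r.area, r.major_axis_length, r.minor_axis_length, r.orientation)
            for r in measure.regionprops(labels)]
```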

  19. Effects of image processing on the detective quantum efficiency

    Science.gov (United States)

    Park, Hye-Suk; Kim, Hee-Joung; Cho, Hyo-Min; Lee, Chang-Lae; Lee, Seung-Wan; Choi, Yu-Na

    2010-04-01

    Digital radiography has gained popularity in many areas of clinical practice. This transition brings interest in advancing the methodologies for image quality characterization. However, as the methodologies for such characterizations have not been standardized, the results of these studies cannot be directly compared. The primary objective of this study was to standardize methodologies for image quality characterization. The secondary objective was to evaluate the factors affecting the modulation transfer function (MTF), noise power spectrum (NPS), and detective quantum efficiency (DQE) according to the image processing algorithm. Image performance parameters such as MTF, NPS, and DQE were evaluated using the International Electrotechnical Commission (IEC 62220-1)-defined RQA5 radiographic technique. Computed radiography (CR) images of a hand in the posterior-anterior (PA) projection for measuring the signal-to-noise ratio (SNR), a slit image for measuring the MTF, and a white image for measuring the NPS were obtained, and various Multi-Scale Image Contrast Amplification (MUSICA) parameters were applied to each of the acquired images. The results showed that the modifications considerably influenced the evaluated SNR, MTF, NPS, and DQE. Images modified by the post-processing had a higher DQE than the MUSICA=0 image. This suggests that MUSICA values, as a post-processing step, affect the image when evaluating image quality. In conclusion, the control parameters of image processing should be taken into account when characterizing image quality in the same way. The results of this study can serve as a baseline for evaluating imaging systems and their imaging characteristics by measuring the MTF, NPS, and DQE.
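
    For reference, the three measured quantities are tied together by the usual IEC-style estimator, in which q is the incident photon fluence and d̄ the mean large-area signal:

```latex
% IEC 62220-1 style estimator linking the three measured quantities:
% q is the incident photon fluence, \bar{d} the mean large-area signal.
\mathrm{DQE}(u)
  = \frac{\mathrm{SNR}^{2}_{\mathrm{out}}(u)}{\mathrm{SNR}^{2}_{\mathrm{in}}}
  = \frac{\bar{d}^{\,2}\,\mathrm{MTF}^{2}(u)}{q\,\mathrm{NPS}(u)}
```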

  20. Root cause analysis with enriched process logs

    NARCIS (Netherlands)

    Suriadi, S.; Ouyang, C.; Aalst, van der W.M.P.; Hofstede, ter A.H.M.; La Rosa, M.; Soffer, P.

    2013-01-01

    In the field of process mining, the use of event logs for the purpose of root cause analysis is increasingly studied. In such an analysis, the availability of attributes/features that may explain the root cause of some phenomena is crucial. Currently, the process of obtaining these attributes from

  1. Streak detection and analysis pipeline for optical images

    Science.gov (United States)

    Virtanen, J.; Granvik, M.; Torppa, J.; Muinonen, K.; Poikonen, J.; Lehti, J.; Säntti, T.; Komulainen, T.; Flohrer, T.

    2014-07-01

    We describe a novel data processing and analysis pipeline for optical observations of moving objects, either of natural (asteroids, meteors) or artificial origin (satellites, space debris). The monitoring of the space object populations requires reliable acquisition of observational data to support the development and validation of population models, and to build and maintain catalogues of orbital elements. The orbital catalogues are, in turn, needed for the assessment of close approaches (for asteroids, with the Earth; for satellites, with each other) and for the support of contingency situations or launches. For both types of populations, there is also increasing interest in detecting fainter objects corresponding to the small end of the size distribution. We focus on the low signal-to-noise (SNR) detection of objects with high angular velocities, resulting in long and faint object trails, or streaks, in the optical images. The currently available, mature image processing algorithms for detection and astrometric reduction of optical data cover objects that cross the sensor field-of-view comparatively slowly, and, particularly for satellites, within a rather narrow, predefined range of angular velocities. By applying specific tracking techniques, the objects appear point-like or as short trails in the exposures. However, the general survey scenario is always a 'track-before-detect' problem, resulting in streaks of arbitrary lengths. Although some considerations for low-SNR processing of streak-like features are available in the current image processing and computer vision literature, algorithms are not readily available yet. In the ESA-funded StreakDet (Streak detection and astrometric reduction) project, we develop and evaluate an automated processing pipeline applicable to single images (as compared to consecutive frames of the same field) obtained with any observing scenario, including space-based surveys and both low- and high-altitude populations. The algorithmic
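
    A deliberately simple sketch of streak extraction via thresholding plus a probabilistic Hough transform; production low-SNR pipelines (including StreakDet) use considerably more elaborate matched processing, and the parameter values here are illustrative:

```python
import numpy as np
from skimage.transform import probabilistic_hough_line

def detect_streaks(image, snr=3.0, min_len=25):
    """Threshold the frame against the background and extract line
    segments; returns a list of ((x0, y0), (x1, y1)) endpoints."""
    binary = image > np.median(image) + snr * image.std()
    return probabilistic_hough_line(binary, threshold=10,
                                    line_length=min_len, line_gap=3)
```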

  2. Computerized Analysis and Detection of Missed Cancer in Screening Mammogram

    National Research Council Canada - National Science Library

    Li, Lihua

    2005-01-01

    This project is to explore an innovative CAD strategy for improving early detection of breast cancer in screening mammograms by focusing on computerized analysis and detection of cancers missed by radiologists...

  3. Computerized Analysis and Detection of Missed Cancer in Screening Mammogram

    National Research Council Canada - National Science Library

    Li, Lihua

    2004-01-01

    This project is to explore an innovative CAD strategy for improving early detection of breast cancer in screening mammograms by focusing on computerized analysis and detection of cancers missed by radiologists...

  4. Trojan detection model based on network behavior analysis

    International Nuclear Information System (INIS)

    Liu Junrong; Liu Baoxu; Wang Wenjin

    2012-01-01

    Based on an analysis of existing Trojan detection technology, this paper presents a Trojan detection model based on network behavior analysis. First, we abstract a description of Trojan network behavior; then, according to certain rules, we establish a library of characteristic behaviors; finally, we use a support vector machine algorithm to determine whether a Trojan intrusion has occurred. Intrusion detection experiments show that this model can effectively detect Trojans. (authors)

  5. Detectability of Granger causality for subsampled continuous-time neurophysiological processes.

    Science.gov (United States)

    Barnett, Lionel; Seth, Anil K

    2017-01-01

    Granger causality is well established within the neurosciences for inference of directed functional connectivity from neurophysiological data. These data usually consist of time series which subsample a continuous-time biophysiological process. While it is well known that subsampling can lead to imputation of spurious causal connections where none exist, less is known about the effects of subsampling on the ability to reliably detect causal connections which do exist. We present a theoretical analysis of the effects of subsampling on Granger-causal inference. Neurophysiological processes typically feature signal propagation delays on multiple time scales; accordingly, we base our analysis on a distributed-lag, continuous-time stochastic model, and consider Granger causality in continuous time at finite prediction horizons. Via exact analytical solutions, we identify relationships among sampling frequency, underlying causal time scales and detectability of causalities. We reveal complex interactions between the time scale(s) of neural signal propagation and sampling frequency. We demonstrate that detectability decays exponentially as the sample time interval increases beyond causal delay times, identify detectability "black spots" and "sweet spots", and show that downsampling may potentially improve detectability. We also demonstrate that the invariance of Granger causality under causal, invertible filtering fails at finite prediction horizons, with particular implications for inference of Granger causality from fMRI data. Our analysis emphasises that sampling rates for causal analysis of neurophysiological time series should be informed by domain-specific time scales, and that state-space modelling should be preferred to purely autoregressive modelling. On the basis of a very general model that captures the structure of neurophysiological processes, we are able to help identify confounds, and offer practical insights, for successful detection of causal connectivity
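
    The subsampling effect can be illustrated with a toy bivariate autoregression in which x drives y with a 10-step delay; the coupling constants, lags, and subsampling factors are illustrative:

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

# x drives y with a 10-step delay; constants are illustrative.
rng = np.random.default_rng(0)
n, lag = 20000, 10
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(lag, n):
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - lag] + 0.5 * rng.normal()

for step in (1, 25):                    # native rate vs. heavy subsampling
    data = np.column_stack([y[::step], x[::step]])   # test x -> y
    res = grangercausalitytests(data, maxlag=12, verbose=False)
    p = min(res[m][0]["ssr_ftest"][1] for m in res)  # best lag's p-value
    print(f"subsample x{step}: min p = {p:.3g}")
```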

  6. Nonlinear Process Fault Diagnosis Based on Serial Principal Component Analysis.

    Science.gov (United States)

    Deng, Xiaogang; Tian, Xuemin; Chen, Sheng; Harris, Chris J

    2018-03-01

    Many industrial processes contain both linear and nonlinear parts, and kernel principal component analysis (KPCA), widely used in nonlinear process monitoring, may not offer the most effective means for dealing with these nonlinear processes. This paper proposes a new hybrid linear-nonlinear statistical modeling approach for nonlinear process monitoring by closely integrating linear principal component analysis (PCA) and nonlinear KPCA using a serial model structure, which we refer to as serial PCA (SPCA). Specifically, PCA is first applied to extract PCs as linear features, and to decompose the data into the PC subspace and residual subspace (RS). Then, KPCA is performed in the RS to extract the nonlinear PCs as nonlinear features. Two monitoring statistics are constructed for fault detection, based on both the linear and nonlinear features extracted by the proposed SPCA. To effectively perform fault identification after a fault is detected, an SPCA similarity factor method is built for fault recognition, which fuses both the linear and nonlinear features. Unlike PCA and KPCA, the proposed method takes into account both linear and nonlinear PCs simultaneously, and therefore, it can better exploit the underlying process's structure to enhance fault diagnosis performance. Two case studies involving a simulated nonlinear process and the benchmark Tennessee Eastman process demonstrate that the proposed SPCA approach is more effective than the existing state-of-the-art approach based on KPCA alone, in terms of nonlinear process fault detection and identification.
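
    A minimal sketch of the serial PCA structure with scikit-learn (component counts, kernel width, and the subsequent monitoring statistics are illustrative assumptions, not the authors' settings):

```python
from sklearn.decomposition import PCA, KernelPCA

def fit_spca(X_train, n_lin=5, n_nonlin=5, gamma=0.1):
    """Serial PCA: linear PCA first, then kernel PCA on the residuals."""
    pca = PCA(n_components=n_lin).fit(X_train)
    resid = X_train - pca.inverse_transform(pca.transform(X_train))
    kpca = KernelPCA(n_components=n_nonlin, kernel="rbf", gamma=gamma).fit(resid)
    return pca, kpca

def spca_features(pca, kpca, X):
    """Linear scores plus nonlinear scores of the linear residual; the
    monitoring statistics built on these would be thresholded against
    fault-free training data."""
    t_lin = pca.transform(X)
    resid = X - pca.inverse_transform(t_lin)
    return t_lin, kpca.transform(resid)
```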

  7. A core ontology for business process analysis

    NARCIS (Netherlands)

    Pedrinaci, C.; Domingue, J.; Alves De Medeiros, A.K.; Bechhofer, S.; Hauswirth, M.; Hoffmann, J.; Koubarakis, M.

    2008-01-01

    Business Process Management (BPM) aims at supporting the whole life-cycle necessary to deploy and maintain business processes in organisations. An important step of the BPM life-cycle is the analysis of the processes deployed in companies. However, the degree of automation currently achieved cannot

  8. Normality Analysis for RFI Detection in Microwave Radiometry

    Directory of Open Access Journals (Sweden)

    Adriano Camps

    2009-12-01

    Full Text Available Radio-frequency interference (RFI) present in microwave radiometry measurements leads to erroneous radiometric results. Sources of RFI include spurious signals and harmonics from lower frequency bands, spread-spectrum signals overlapping the “protected” band of operation, or out-of-band emissions not properly rejected by the pre-detection filters due to their finite rejection. The presence of RFI in the radiometric signal modifies the detected power and therefore the estimated antenna temperature from which the geophysical parameters will be retrieved. In recent years, techniques to detect the presence of RFI in radiometric measurements have been developed. They include time- and/or frequency-domain analyses, or statistical analysis in the time and/or frequency domain of the received signal which, in the absence of RFI, must be a zero-mean Gaussian process. Statistical analyses performed to date include the calculation of the kurtosis and the Shapiro-Wilk normality test of the received signal. Nevertheless, statistical analysis of the received signal could be more extensive, as reported in the Statistics literature. The objective of this work is to study the performance of a number of normality tests encountered in the Statistics literature when applied to the detection of the presence of RFI in the radiometric signal, which is Gaussian by nature. A description of the normality tests and the RFI detection results for different kinds of RFI are presented, with a view to determining an omnibus test that can deal with the blind spots of the currently used methods.
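
    For illustration, a compact Python sketch of statistics-based RFI flagging in this spirit, using the kurtosis and the Shapiro-Wilk test mentioned above (thresholds and signal parameters are arbitrary assumptions):

        import numpy as np
        from scipy.stats import kurtosis, shapiro

        def rfi_flag(samples, alpha=0.01, kurt_tol=0.3):
            """Flag RFI if the signal departs from zero-mean Gaussianity."""
            k = kurtosis(samples, fisher=True)  # 0 for a Gaussian signal
            _, p = shapiro(samples)             # small p => normality rejected
            return abs(k) > kurt_tol or p < alpha

        clean = np.random.randn(2000)  # thermal noise only
        rfi = clean + 0.5 * np.sin(2 * np.pi * 0.05 * np.arange(2000))  # sinusoidal RFI
        print(rfi_flag(clean), rfi_flag(rfi))  # expected: False True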

  9. Detection and Analysis of Threats to the Energy Sector: DATES

    Energy Technology Data Exchange (ETDEWEB)

    Alfonso Valdes

    2010-03-31

    This report summarizes Detection and Analysis of Threats to the Energy Sector (DATES), a project sponsored by the United States Department of Energy and performed by a team led by SRI International, with collaboration from Sandia National Laboratories, ArcSight, Inc., and Invensys Process Systems. DATES sought to advance the state of the practice in intrusion detection and situational awareness with respect to cyber attacks in energy systems. This was achieved through adaptation of detection algorithms for process systems, as well as development of novel anomaly detection techniques suited for such systems, into a detection suite. These detection components, together with third-party commercial security systems, were interfaced with the commercial Security Information Event Management (SIEM) solution from ArcSight. The efficacy of the integrated solution was demonstrated on two testbeds, one based on a Distributed Control System (DCS) from Invensys, and the other based on the Virtual Control System Environment (VCSE) from Sandia. These achievements advance the DOE Cybersecurity Roadmap [DOE2006] goals in the area of security monitoring. The project ran from October 2007 until March 2010, with the final six months focused on experimentation. In the validation phase, team members from SRI and Sandia coupled the two test environments and carried out a number of distributed and cross-site attacks against various points in one or both testbeds. Alert messages from the distributed, heterogeneous detection components were correlated using the ArcSight SIEM platform, providing within-site and cross-site views of the attacks. In particular, the team demonstrated detection and visualization of network zone traversal and denial-of-service attacks. These capabilities were presented at the DistribuTech Conference and Exhibition in March 2010. The project was hampered by a four-month interruption of funding in 2008, caused by continuing resolution issues and agreement on cost share.

  10. Effects of image processing on the detective quantum efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Park, Hye-Suk; Kim, Hee-Joung; Cho, Hyo-Min; Lee, Chang-Lae; Lee, Seung-Wan; Choi, Yu-Na [Yonsei University, Wonju (Korea, Republic of)

    2010-02-15

    The evaluation of image quality is an important part of digital radiography. The modulation transfer function (MTF), the noise power spectrum (NPS), and the detective quantum efficiency (DQE) are widely accepted measurements of digital radiographic system performance. However, as the methodologies for such characterization have not been standardized, it is difficult to directly compare reported MTF, NPS, and DQE results. In this study, we evaluated the effect of an image processing algorithm on the estimated MTF, NPS, and DQE. The image performance parameters were evaluated using the International Electrotechnical Commission (IEC 62220-1)-defined RQA5 radiographic techniques. Computed radiography (CR) posterior-anterior (PA) images of a hand for measuring the signal-to-noise ratio (SNR), slit images for measuring the MTF, and white images for measuring the NPS were obtained, and various multi-scale image contrast amplification (MUSICA) factors were applied to each of the acquired images. All of the modifications of the images obtained by image processing had a considerable influence on the evaluated image quality. In conclusion, the control parameters of image processing must be accounted for so that image quality characterization is evaluated in the same way. The results of this study should serve as a baseline for evaluating imaging systems and their imaging characteristics by MTF, NPS, and DQE measurements.

  11. Effects of image processing on the detective quantum efficiency

    International Nuclear Information System (INIS)

    Park, Hye-Suk; Kim, Hee-Joung; Cho, Hyo-Min; Lee, Chang-Lae; Lee, Seung-Wan; Choi, Yu-Na

    2010-01-01

    The evaluation of image quality is an important part of digital radiography. The modulation transfer function (MTF), the noise power spectrum (NPS), and the detective quantum efficiency (DQE) are widely accepted measurements of digital radiographic system performance. However, as the methodologies for such characterization have not been standardized, it is difficult to directly compare reported MTF, NPS, and DQE results. In this study, we evaluated the effect of an image processing algorithm on the estimated MTF, NPS, and DQE. The image performance parameters were evaluated using the International Electrotechnical Commission (IEC 62220-1)-defined RQA5 radiographic techniques. Computed radiography (CR) posterior-anterior (PA) images of a hand for measuring the signal-to-noise ratio (SNR), slit images for measuring the MTF, and white images for measuring the NPS were obtained, and various multi-scale image contrast amplification (MUSICA) factors were applied to each of the acquired images. All of the modifications of the images obtained by image processing had a considerable influence on the evaluated image quality. In conclusion, the control parameters of image processing must be accounted for so that image quality characterization is evaluated in the same way. The results of this study should serve as a baseline for evaluating imaging systems and their imaging characteristics by MTF, NPS, and DQE measurements.

  12. Detecting change in dynamic process systems with immunocomputing

    Energy Technology Data Exchange (ETDEWEB)

    Yang, X.; Aldrich, C.; Maree, C. [University of Stellenbosch, Stellenbosch (South Africa). Dept. of Process Engineering

    2007-02-15

    The natural immune system is an adaptive, distributed pattern recognition system with several functional components designed for recognition, memory acquisition, diversity and self-regulation. In artificial immune systems, some of these characteristics are exploited in order to design computational systems capable of detecting novel patterns or, in some sense, the anomalous behaviour of a system. Despite their obvious promise for fault diagnostic systems in process engineering, their potential remains largely unexplored in this regard. In this paper, the application of real-valued negative selection algorithms to simulated and real-world systems is considered. These algorithms deal with the self-nonself discrimination problem in immunocomputing, where normal process behaviour is coded as the self and any deviation from normal behaviour is encoded as nonself. The case studies have indicated that immunocomputing based on negative selection can provide competitive options for fault diagnosis in nonlinear process systems, but further work is required on large systems characterized by many variables.
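
    A toy sketch of real-valued negative selection as described above (detector count, matching radius, and dimensionality are arbitrary assumptions; practical implementations add coverage estimation and optimisation):

        import numpy as np

        rng = np.random.default_rng(1)
        self_data = rng.normal(0.5, 0.05, size=(200, 2))  # normal process behaviour
        radius = 0.1

        # Censoring phase: keep random detectors that do NOT match any self sample
        detectors = []
        while len(detectors) < 100:
            d = rng.uniform(0, 1, size=2)
            if np.min(np.linalg.norm(self_data - d, axis=1)) > radius:
                detectors.append(d)
        detectors = np.array(detectors)

        def is_nonself(x):
            """A pattern is flagged as anomalous if any detector covers it."""
            return np.min(np.linalg.norm(detectors - x, axis=1)) <= radius

        print(is_nonself(np.array([0.5, 0.5])))  # inside the self region -> likely False
        print(is_nonself(np.array([0.9, 0.1])))  # far from self -> likely True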

  13. An integrated microfluidic analysis microsystems with bacterial capture enrichment and in-situ impedance detection

    Science.gov (United States)

    Liu, Hai-Tao; Wen, Zhi-Yu; Xu, Yi; Shang, Zheng-Guo; Peng, Jin-Lan; Tian, Peng

    2017-09-01

    In this paper, an integrated microfluidic analysis microsystem with bacterial capture enrichment and in-situ impedance detection is proposed, based on the microfluidic-chip dielectrophoresis technique and the electrochemical impedance detection principle. The microsystem includes a microfluidic chip, a main control module, a drive and control module, a signal detection and processing module, and a result display unit. The main control module produces the work sequence of the impedance detection system parts and provides data communication functions; the drive and control circuit generates an AC signal with adjustable amplitude and frequency, which is applied to the foodborne-pathogen impedance analysis microsystem to realize capture enrichment and impedance detection. The signal detection and processing circuit translates the current signal into the impedance of the bacteria and transfers it to a computer, where the final detection result is displayed. The experimental sample was prepared by adding an Escherichia coli standard sample into a chicken sample solution, and the samples were tested on the dielectrophoresis-chip capture enrichment and in-situ impedance detection microsystem with micro-array electrode microfluidic chips. The experiments show that the Escherichia coli detection limit of the microsystem is 5 × 10^4 CFU/mL and the detection time is within 6 min under the optimized operating conditions of 10 V detection voltage and 500 kHz detection frequency. The integrated microfluidic analysis microsystem lays a solid foundation for rapid real-time in-situ detection of bacteria.

  14. Detecting geomorphic processes and change with high resolution topographic data

    Science.gov (United States)

    Mudd, Simon; Hurst, Martin; Grieve, Stuart; Clubb, Fiona; Milodowski, David; Attal, Mikael

    2016-04-01

    The first global topographic dataset was released in 1996, with 1 km grid spacing. It is astonishing that in only 20 years we now have access to tens of thousands of square kilometres of LiDAR data at point densities greater than 5 points per square meter. These data represent a treasure trove of information that our geomorphic predecessors could only dream of. But what are we to do with this data? Here we explore the potential of high resolution topographic data to dig deeper into geomorphic processes across a wider range of landscapes, and using much larger spatial coverage, than previously possible. We show how these data can be used to constrain sediment flux relationships using relief and hillslope length, and how they can be used to detect landscape transience. We show how the nonlinear sediment flux law, proposed for upland, soil-mantled landscapes by Roering et al. (1999), is consistent with a number of topographic tests. This flux law allows us to predict how landscapes will respond to tectonic forcing, and we show how these predictions can be used to detect erosion rate perturbations across a range of tectonic settings.

  15. Infective endocarditis detection through SPECT/CT images digital processing

    Science.gov (United States)

    Moreno, Albino; Valdés, Raquel; Jiménez, Luis; Vallejo, Enrique; Hernández, Salvador; Soto, Gabriel

    2014-03-01

    Infective endocarditis (IE) is a difficult-to-diagnose pathology, since its manifestation in patients is highly variable. In this work, a semiautomatic algorithm based on digital processing of SPECT images was proposed for the detection of IE, using a CT image volume as a spatial reference. The heart/lung rate was calculated using the SPECT image information. There were no statistically significant differences between the heart/lung rate values of a group of patients diagnosed with IE (2.62 ± 0.47) and a group of healthy or control subjects (2.84 ± 0.68). However, it is necessary to increase the study sample of both the individuals diagnosed with IE and the control group subjects, as well as to improve the image quality.

  16. Low-Altitude and Slow-Speed Small Target Detection Based on Spectrum Zoom Processing

    Directory of Open Access Journals (Sweden)

    Xuwang Zhang

    2018-01-01

    Full Text Available This paper proposes a spectrum zoom processing based target detection algorithm for detecting the weak echo of low-altitude and slow-speed small (LSS) targets in heavy ground clutter environments, which can be used to retrofit existing radar systems. Given the existing range-Doppler frequency images, the proposed method first concatenates the data from the same Doppler frequency slot of different images and then applies the spectrum zoom processing. After performing clutter suppression, target detection can finally be implemented. Through theoretical analysis and real data verification, it is shown that the proposed algorithm can obtain a preferable spectrum zoom result and improve the signal-to-clutter ratio (SCR) with a very low computational load.
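
    The spectrum zoom step itself is a classical operation; below is a generic zoom-FFT sketch via complex demodulation, low-pass filtering, and decimation (not necessarily the exact processing chain of the paper; filter length and cutoff are arbitrary assumptions):

        import numpy as np
        from scipy.signal import firwin

        def spectrum_zoom(x, fs, f_center, zoom):
            """Zoom in on a narrow band around f_center: complex demodulation
            shifts the band to DC, a low-pass FIR limits the bandwidth, and
            decimation by `zoom` stretches the band over the whole FFT."""
            n = np.arange(len(x))
            baseband = x * np.exp(-2j * np.pi * f_center * n / fs)
            taps = firwin(101, 0.8 / zoom)  # cutoff relative to Nyquist
            z = np.convolve(baseband, taps, mode="same")[::zoom]
            spec = np.abs(np.fft.fftshift(np.fft.fft(z)))
            freqs = f_center + np.fft.fftshift(np.fft.fftfreq(len(z), zoom / fs))
            return freqs, spec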

  17. Abnormal traffic flow data detection based on wavelet analysis

    Directory of Open Access Journals (Sweden)

    Xiao Qian

    2016-01-01

    Full Text Available In view of the non-stationary nature of traffic flow data, abnormal data detection is difficult. This paper proposes abnormal traffic flow data detection based on wavelet analysis and the least squares method. First, wavelet analysis is used to separate the traffic flow data into high-frequency and low-frequency components; then, the least squares method is applied to find abnormal points in the reconstructed signal data. The simulation results show that detecting abnormal traffic flow data with wavelet analysis and the least squares method effectively reduces the misjudgment rate and the false negative rate of the detection results.
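
    One plausible reading of this scheme, sketched with the PyWavelets package (wavelet family, decomposition level, polynomial degree, and the 3-sigma rule are assumptions of this illustration):

        import numpy as np
        import pywt

        def detect_anomalies(flow, wavelet="db4", level=3, k=3.0):
            """Separate low/high frequency parts with a wavelet decomposition,
            fit a least-squares baseline to the low-frequency reconstruction,
            and flag points whose residual exceeds k standard deviations."""
            coeffs = pywt.wavedec(flow, wavelet, level=level)
            coeffs[1:] = [np.zeros_like(c) for c in coeffs[1:]]  # keep low frequencies
            trend = pywt.waverec(coeffs, wavelet)[: len(flow)]
            t = np.arange(len(flow))
            baseline = np.polyval(np.polyfit(t, trend, 3), t)  # least-squares fit
            resid = flow - baseline
            return np.where(np.abs(resid) > k * resid.std())[0]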

  18. A KPI-based process monitoring and fault detection framework for large-scale processes.

    Science.gov (United States)

    Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang

    2017-05-01

    Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, whose performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches are not developed with a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes, which considers the static and dynamic relationships between process and KPI variables. For the static case, a least squares-based approach is developed that provides an explicit link with least-squares regression, which gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrument variable is used to reduce the dynamic case to the static case. This framework is applied to the TE benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
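
    For the static case, a bare-bones sketch of least-squares-based KPI fault detection (the synthetic data, the linear model, and the 3-sigma threshold are illustrative assumptions, not the paper's full framework):

        import numpy as np

        # Training data: process variables U and KPI measurements Y (normal operation)
        rng = np.random.default_rng(0)
        U = rng.normal(size=(1000, 5))
        Y = U @ np.array([1.0, 0.5, 0.0, -0.3, 0.2]) + 0.1 * rng.normal(size=1000)

        # Least-squares regression from process variables to the KPI
        theta, *_ = np.linalg.lstsq(U, Y, rcond=None)
        resid = Y - U @ theta
        threshold = 3.0 * resid.std()  # simple 3-sigma detection threshold

        def kpi_fault(u_new, y_new):
            """Flag a fault when the KPI prediction residual is abnormally large."""
            return abs(y_new - u_new @ theta) > threshold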

  19. Process mining and security: detecting anomalous process executions and checking process conformance

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Alves De Medeiros, A.K.

    2005-01-01

    One approach to secure systems is through the analysis of audit trails. An audit trail is a record of all events that take place in a system and across a network, i.e., it provides a trace of user/system actions so that security events can be related to the actions of a specific individual or system component.

  20. Moments analysis of concurrent Poisson processes

    International Nuclear Information System (INIS)

    McBeth, G.W.; Cross, P.

    1975-01-01

    A moments analysis of concurrent Poisson processes has been carried out. Equations are given which relate combinations of distribution moments to sums of products involving the number of counts associated with the processes and the mean rate of the processes. Elimination of background is discussed, and equations suitable for processing random radiation, parent-daughter pairs in the presence of background, and triple and double correlations in the presence of background are given. The theory of identification of the four principal radioactive series by moments analysis is discussed. (Auth.)

  1. Application of flaw detection methods for detection of fatigue processes in low-alloyed steel

    Directory of Open Access Journals (Sweden)

    Zbigniew H. ŻUREK

    2007-01-01

    Full Text Available The paper presents the investigations conducted at the Fraunhofer Institute (IZFP Saarbrücken) by use of a BEMI microscope (BEMI = Barkhausenrausch- und Wirbelstrom-Mikroskopie, or Barkhausen Noise and Eddy Current Microscopy). The ability to detect the influences of cyclic and contact fatigue loads has been investigated. The measurement amplitudes obtained with Barkhausen Noise and Eddy Current probes have been analysed. Correlation between measurement results and the material's condition has been observed for the eddy current mode method at frequencies above 2 MHz (for contact-loaded material samples). Detection of the material's fatigue process (at 80 % fatigue life) in the sample subjected to series of high-cyclic loads has been proven to be practically impossible. Application of flaw detection methods in material fatigue tests requires modification of the test methods and use of investigation methods relevant to the physical parameters of the investigated material. The magnetic leakage field method, which has been abandoned by many researchers, may be of significant use in material fatigue assessment and may provide new research prospects.

  2. Neutron activation analysis detection limits using 252Cf sources

    International Nuclear Information System (INIS)

    DiPrete, D.P.; Sigg, R.A.

    2000-01-01

    The Savannah River Technology Center (SRTC) developed a neutron activation analysis (NAA) facility several decades ago using low-flux 252Cf neutron sources. Through this time, the facility has addressed areas of applied interest in managing the Savannah River Site (SRS). Some applications are unique because of the site's operating history and its chemical-processing facilities. Because the sensitivity needs for many applications are not severe, they can be accomplished using an ∼6-mg 252Cf NAA facility. The SRTC 252Cf facility continues to support applied research programs at SRTC as well as other SRS programs for environmental and waste management customers. Samples analyzed by NAA include organic compounds, metal alloys, sediments, site process solutions, and many other materials. Numerous radiochemical analyses also rely on the facility for production of short-lived tracers, yielding by activation of carriers, and small-scale isotope production for separation methods testing. These applications are more fully reviewed in Ref. 1. Although the flux (approximately 2 x 10^7 n/cm^2·s) is low relative to reactor facilities, more than 40 elements can be detected at low and sub-part-per-million levels. Detection limits provided by the facility are adequate for many analytical projects. Other multielement analysis methods, particularly inductively coupled plasma atomic emission and inductively coupled plasma mass spectrometry, can now provide sensitivities on dissolved samples that are often better than those available by NAA using low-flux isotopic sources. Because NAA allows analysis of bulk samples, (a) it is a more cost-effective choice than methods that require digestion when its sensitivity is adequate, and (b) it eliminates uncertainties that can be introduced by digestion processes.

  3. Probing the lifetimes of auditory novelty detection processes.

    Science.gov (United States)

    Pegado, Felipe; Bekinschtein, Tristan; Chausson, Nicolas; Dehaene, Stanislas; Cohen, Laurent; Naccache, Lionel

    2010-08-01

    Auditory novelty detection can be fractionated into multiple cognitive processes associated with their respective neurophysiological signatures. In the present study we used high-density scalp event-related potentials (ERPs) during an active version of the auditory oddball paradigm to explore the lifetimes of these processes by varying the stimulus onset asynchrony (SOA). We observed that early MMN (90-160 ms) decreased when the SOA increased, confirming the evanescence of this echoic memory system. Subsequent neural events including late MMN (160-220 ms) and P3a/P3b components of the P3 complex (240-500 ms) did not decay with SOA, but showed a systematic delay effect supporting a two-stage model of accumulation of evidence. On the basis of these observations, we propose a distinction within the MMN complex of two distinct events: (1) an early, pre-attentive and fast-decaying MMN associated with generators located within superior temporal gyri (STG) and frontal cortex, and (2) a late MMN more resistant to SOA, corresponding to the activation of a distributed cortical network including fronto-parietal regions. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  4. Air Conditioning Compressor Air Leak Detection by Image Processing Techniques for Industrial Applications

    Directory of Open Access Journals (Sweden)

    Pookongchai Kritsada

    2015-01-01

    Full Text Available This paper presents a method to detect air leakage of an air conditioning compressor using image processing techniques. A good-quality air conditioning compressor should not have any air leakage. To test an air conditioning compressor for leaks, air is pumped into the compressor, which is then submerged into a water tank. If air bubbles occur at the surface of the air conditioning compressor, that leaking compressor must be returned for maintenance. In this work, a new method to detect leakage and search for the leakage point with a highly accurate, fast, and precise process was proposed. In a preprocessing procedure to detect the air bubbles, threshold and median filter techniques have been used. A connected component labeling technique is used to detect the air bubbles, while blob analysis is the searching technique used to analyze groups of air bubbles in sequential images. Experiments were conducted with the proposed algorithm to determine the leakage point of an air conditioning compressor. The location of the leakage point was presented as a coordinate point. The results demonstrated that the leakage point could be accurately detected during the process, with the estimated point having an error of less than 5% compared to the real leakage point.
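
    A hedged sketch of such a bubble detection chain (median filtering, thresholding, connected component labeling), using OpenCV; the kernel size and minimum blob area are arbitrary assumptions:

        import cv2
        import numpy as np

        def find_bubbles(frame, min_area=20):
            """Median-filter and threshold a frame, then use connected
            component labeling to locate candidate air bubbles."""
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            smooth = cv2.medianBlur(gray, 5)
            _, binary = cv2.threshold(smooth, 0, 255,
                                      cv2.THRESH_BINARY + cv2.THRESH_OTSU)
            n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
            # skip label 0 (background); keep blobs above a minimum area
            return [tuple(centroids[i]) for i in range(1, n)
                    if stats[i, cv2.CC_STAT_AREA] >= min_area]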

  5. Three-dimensional model analysis and processing

    CERN Document Server

    Yu, Faxin; Luo, Hao; Wang, Pinghui

    2011-01-01

    This book focuses on five hot research directions in 3D model analysis and processing in computer science:  compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.

  6. Detection and localization of leak of pipelines of RBMK reactor. Methods of processing of acoustic noise

    International Nuclear Information System (INIS)

    Tcherkaschov, Y.M.; Strelkov, B.P.; Chimanski, S.B.; Lebedev, V.I.; Belyanin, L.A.

    1997-01-01

    A method based on the detection and monitoring of acoustic leak signals was designed for leak detection in the inlet and outlet pipelines of the RBMK reactor. This report reviews the methods of processing and analysis of acoustic noise. These methods were included in the software of the leak detection system and are used to solve the following problems: leak detection by the sound pressure level method in conditions of powerful background noise and strong signal attenuation; detection of a small leak at an early stage by a high-sensitivity correlation method; determination of the location of a sound source in conditions of strong signal reflection by a correlation method and the sound pressure method; and evaluation of leak size by analysis of the sound level and the location of the sound source. The operation of the considered techniques is illustrated with test results from a fragment of the leak detection system. The test was executed at the Leningrad NPP, operating at power levels of 460, 700, 890 and 1000 MWe. 16 figs
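
    The correlation method for localization reduces to time-delay estimation between two sensors; a generic sketch (sensor geometry and sign convention are assumptions of this illustration):

        import numpy as np

        def locate_leak(sig_a, sig_b, fs, sensor_distance, sound_speed):
            """Estimate the leak position between two acoustic sensors from the
            time delay that maximises the cross-correlation of their signals."""
            corr = np.correlate(sig_a - sig_a.mean(), sig_b - sig_b.mean(), "full")
            lag = np.argmax(corr) - (len(sig_b) - 1)  # delay in samples
            delay = lag / fs
            # distance from sensor A, assuming a positive lag means the sound
            # reached sensor B first (convention depends on correlation orientation)
            return 0.5 * (sensor_distance + sound_speed * delay)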

  7. Spectroscopic analysis technique for arc-welding process control

    Science.gov (United States)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. In particular, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), an early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementations. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real-time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing an automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding, using fiber-optic capture of light and a low-cost CCD-based spectrometer, show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
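
    As a hedged sketch of the spline part of such a pipeline only (the LPO sub-pixel algorithm is not reproduced here), noisy samples around an emission line can be interpolated to refine the peak position and intensity:

        import numpy as np
        from scipy.interpolate import CubicSpline

        def refine_peak(wavelengths, intensities, oversample=20):
            """Interpolate noisy samples around an emission peak with a cubic
            spline and return the refined centre wavelength and peak intensity."""
            cs = CubicSpline(wavelengths, intensities)
            fine = np.linspace(wavelengths[0], wavelengths[-1],
                               oversample * len(wavelengths))
            y = cs(fine)
            i = np.argmax(y)
            return fine[i], y[i]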

  8. Detection and reduction of tungsten contamination in ion implantation processes

    International Nuclear Information System (INIS)

    Polignano, M.L.; Galbiati, A.; Grasso, S.; Mica, I.; Barbarossa, F.; Magni, D.

    2016-01-01

    In this paper, we review the results of some studies addressing the problem of tungsten contamination in implantation processes. For some tests, the implanter was contaminated by implantation of wafers with an exposed tungsten layer, resulting in critical contamination conditions. First, DLTS (deep level transient spectroscopy) measurements were calibrated to measure tungsten contamination in ion-implanted samples. DLTS measurements of tungsten-implanted samples showed that the tungsten concentration increases linearly with the dose up to a rather low dose (5 x 10^10 cm^-2). Tungsten deactivation was observed when the dose was further increased. Under these conditions, ToF-SIMS revealed tungsten at the wafer surface, showing that deactivation was due to surface segregation. DLTS calibration could therefore be obtained in the linear dose regime only. This calibration was used to evaluate the tungsten contamination in arsenic implantations. Ordinary operating conditions and critical contamination conditions of the equipment were compared. A moderate tungsten contamination was observed in samples implanted under ordinary operating conditions. This contamination was easily suppressed by a thin screen oxide. On the contrary, implantations in critical conditions of the equipment resulted in a relevant tungsten contamination, which could be reduced but not suppressed even by a relatively thick screen oxide (up to 150 Å). A decontamination process consisting of high dose implantations of dummy wafers was tested for its efficiency to remove tungsten and titanium contamination. This process was found to be much more effective for titanium than for tungsten. Finally, DLTS proved to be much more sensitive than TXRF (total reflection X-ray fluorescence) in detecting tungsten contamination. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  9. Detection and reduction of tungsten contamination in ion implantation processes

    Energy Technology Data Exchange (ETDEWEB)

    Polignano, M.L.; Galbiati, A.; Grasso, S.; Mica, I.; Barbarossa, F.; Magni, D. [STMicroelectronics, Agrate Brianza (Italy)

    2016-12-15

    In this paper, we review the results of some studies addressing the problem of tungsten contamination in implantation processes. For some tests, the implanter was contaminated by implantation of wafers with an exposed tungsten layer, resulting in critical contamination conditions. First, DLTS (deep level transient spectroscopy) measurements were calibrated to measure tungsten contamination in ion-implanted samples. DLTS measurements of tungsten-implanted samples showed that the tungsten concentration increases linearly with the dose up to a rather low dose (5 x 10^10 cm^-2). Tungsten deactivation was observed when the dose was further increased. Under these conditions, ToF-SIMS revealed tungsten at the wafer surface, showing that deactivation was due to surface segregation. DLTS calibration could therefore be obtained in the linear dose regime only. This calibration was used to evaluate the tungsten contamination in arsenic implantations. Ordinary operating conditions and critical contamination conditions of the equipment were compared. A moderate tungsten contamination was observed in samples implanted under ordinary operating conditions. This contamination was easily suppressed by a thin screen oxide. On the contrary, implantations in critical conditions of the equipment resulted in a relevant tungsten contamination, which could be reduced but not suppressed even by a relatively thick screen oxide (up to 150 Å). A decontamination process consisting of high dose implantations of dummy wafers was tested for its efficiency to remove tungsten and titanium contamination. This process was found to be much more effective for titanium than for tungsten. Finally, DLTS proved to be much more sensitive than TXRF (total reflection X-ray fluorescence) in detecting tungsten contamination. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  10. Artificial intelligence applied to process signal analysis

    Science.gov (United States)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  11. Theoretical analysis of radiographic images by nonstationary Poisson processes

    International Nuclear Information System (INIS)

    Tanaka, Kazuo; Uchida, Suguru; Yamada, Isao.

    1980-01-01

    This paper deals with the noise analysis of radiographic images obtained in the usual fluorescent screen-film system. The theory of nonstationary Poisson processes is applied to the analysis of the radiographic images containing the object information. The ensemble averages, the autocorrelation functions, and the Wiener spectrum densities of the light-energy distribution at the fluorescent screen and of the film optical-density distribution are obtained. The detection characteristics of the system are evaluated theoretically. Numerical examples for a one-dimensional image are shown, and the results are compared with those obtained under the assumption that the object image is related to the background noise by an additive process. (author)

  12. In-process fault detection for textile fabric production: onloom imaging

    Science.gov (United States)

    Neumann, Florian; Holtermann, Timm; Schneider, Dorian; Kulczycki, Ashley; Gries, Thomas; Aach, Til

    2011-05-01

    Constant and traceable high fabric quality is of great importance both for technical and for high-quality conventional fabrics. Usually, quality inspection is carried out by trained personnel, whose detection rate and maximum period of concentration are limited. Low-resolution automated fabric inspection machines using texture analysis were developed. Since 2003, systems for the in-process inspection on weaving machines ("onloom") have been commercially available. With these, defects can be detected, but not measured quantitatively and precisely. Most systems are also prone to inevitable machine vibrations. Feedback loops for fault prevention are not established. Technology has evolved since 2003: camera and computer prices dropped, resolutions were enhanced, recording speeds increased. These are the preconditions for real-time processing of high-resolution images. So far, these new technological achievements are not used in textile fabric production. For efficient use, a measurement system must be integrated into the weaving process, and new algorithms for defect detection and measurement must be developed. The goal of the joint project is the development of a modern machine vision system for nondestructive onloom fabric inspection. The system consists of a vibration-resistant machine integration, a high-resolution machine vision system, and new, reliable, and robust algorithms with a quality database for defect documentation. The system is meant to detect, measure, and classify at least 80 % of economically relevant defects. Concepts for feedback loops into the weaving process will be pointed out.

  13. Summary of process research analysis efforts

    Science.gov (United States)

    Burger, D. R.

    1985-01-01

    A summary of solar-cell process research analysis efforts was presented. Process design and cell design are interactive efforts where technology from integrated circuit processes and other processes is blended. The primary factors that control cell efficiency are: (1) the bulk parameters of the available sheet material, (2) the retention and enhancement of these bulk parameters, and (3) the cell design and the cost to produce versus the finished cell's performance. The process sequences need to be tailored to be compatible with the sheet form, the cell shape, and the processing equipment. New process options that require further evaluation and utilization are lasers, robotics, thermal pulse techniques, and new materials. There are numerous process control techniques that can be adapted and used to improve product uniformity and reduce costs. Two factors that can lead to longer-life modules are the use of solar cell diffusion barriers and improved encapsulation.

  14. Radar signal analysis and processing using Matlab

    CERN Document Server

    Mahafza, Bassem R

    2008-01-01

    Offering radar-related software for the analysis and design of radar waveform and signal processing, this book provides comprehensive coverage of radar signals and signal processing techniques and algorithms. It contains numerous graphical plots, common radar-related functions, table format outputs, and end-of-chapter problems. The complete set of MATLAB® functions and routines is available for download online.

  15. Vygotsky's Analysis of Children's Meaning Making Processes

    Science.gov (United States)

    Mahn, Holbrook

    2012-01-01

    Vygotsky's work is extensive and covers many aspects of the development of children's meaning-making processes in social and cultural contexts. However, his main focus is on the examination of the unification of speaking and thinking processes. His investigation centers on the analysis of the entity created by this unification--an internal…

  16. Image Processing Methods Usable for Object Detection on the Chessboard

    Directory of Open Access Journals (Sweden)

    Beran Ladislav

    2016-01-01

    Full Text Available Image segmentation and object detection are challenging problems in much research. Although many algorithms for image segmentation have been invented, there is no simple algorithm for image segmentation and object detection. Our research is based on a combination of several methods for object detection. The first method suitable for image segmentation and object detection is colour detection. This method is very simple, but there is a problem with different colours: for this method, the colour of the segmented object must be determined precisely before all calculations, and in many cases it has to be determined manually. An alternative simple method is based on background removal, i.e. on the difference between a reference image and the detected image. In this paper, several methods suitable for object detection are described. This research is focused on coloured object detection on a chessboard. The results of this research, fused with neural networks, will be applied to a user-computer game of checkers.
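
    A minimal sketch of the background-removal method described above, assuming OpenCV (the threshold value and the empty-board reference image are illustrative assumptions):

        import cv2

        def objects_by_background_removal(reference, image, thresh=40):
            """Detect objects as regions where the current image differs
            from a reference (empty-board) image."""
            diff = cv2.absdiff(cv2.cvtColor(reference, cv2.COLOR_BGR2GRAY),
                               cv2.cvtColor(image, cv2.COLOR_BGR2GRAY))
            _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            return [cv2.boundingRect(c) for c in contours]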

  17. IRB Process Improvements: A Machine Learning Analysis.

    Science.gov (United States)

    Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A

    2017-06-01

    Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on initial identified predictors, changes to IRB workflow and staffing procedures were instituted and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process including type of IRB review to be conducted, whether a protocol falls under Veteran's Administration purview and specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.

  18. Detection of non-stationary leak signals at NPP primary circuit by cross-correlation analysis

    International Nuclear Information System (INIS)

    Shimanskij, S.B.

    2007-01-01

    A leak-detection system employing high-temperature microphones has been developed for the RBMK and ATR (Japan) reactors. Further improvement of the system focused on using cross-correlation analysis of the spectral components of the signal to detect a small leak at an early stage of development. Since envelope processes are less affected by distortions than are wave processes, they give a higher degree of correlation and can be used to detect leaks with lower signal-to-noise ratios. Many simulation tests performed at nuclear power plants have shown that the proposed methods can be used to detect and find the location of a small leak [ru]
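
    The envelope-correlation idea can be sketched generically with the Hilbert transform (a textbook formulation, not necessarily the exact processing of the deployed system):

        import numpy as np
        from scipy.signal import hilbert

        def envelope_delay(sig_a, sig_b, fs):
            """Cross-correlate signal envelopes (via the Hilbert transform);
            envelopes are less distorted by propagation than the waveforms,
            so they can retain correlation at lower signal-to-noise ratios."""
            env_a = np.abs(hilbert(sig_a))
            env_b = np.abs(hilbert(sig_b))
            corr = np.correlate(env_a - env_a.mean(), env_b - env_b.mean(), "full")
            lag = np.argmax(corr) - (len(env_b) - 1)  # delay in samples
            return lag / fs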

  19. BUSINESS PROCESS MANAGEMENT SYSTEMS TECHNOLOGY COMPONENTS ANALYSIS

    Directory of Open Access Journals (Sweden)

    Andrea Giovanni Spelta

    2007-05-01

    Full Text Available The information technology that supports the implementation of the business process management approach is called a Business Process Management System (BPMS). The main components of the BPMS solution framework are the process definition repository, process instances repository, transaction manager, connectors framework, process engine and middleware. In this paper we define and characterize the role and importance of the components of the BPMS framework. The research method adopted was the case study, through the analysis of the implementation of the BPMS solution in an insurance company called Chubb do Brasil. In the case study, the process "Manage Coinsured Events" is described and characterized, as well as the components of the BPMS solution adopted and implemented by Chubb do Brasil for managing this process.

  20. Probing Interfacial Processes on Graphene Surface by Mass Detection

    Science.gov (United States)

    Kakenov, Nurbek; Kocabas, Coskun

    2013-03-01

    In this work we studied the mass density of graphene, probed interfacial processes on the graphene surface, and examined the formation of graphene oxide by mass detection. The graphene layers were synthesized by the chemical vapor deposition method on copper foils and transfer-printed on a quartz crystal microbalance (QCM). The mass density of single layer graphene was measured by investigating the mechanical resonance of the QCM. Moreover, we extended the developed technique to probe the binding dynamics of proteins on the surface of graphene, and were able to obtain the nonspecific binding constant of BSA protein on the graphene surface in aqueous solution. The time trace of the resonance signal showed that the BSA molecules rapidly saturated the surface by filling the available binding sites on graphene. Furthermore, we monitored the oxidation of the graphene surface under oxygen plasma by tracing the changes of the interfacial mass of the graphene together with the shifts in the Raman spectra. Three regimes were observed during the formation of graphene oxide: the formation of graphene oxide, which increases the interfacial mass; the release of carbon dioxide; and the removal of small graphene/graphene oxide flakes. Scientific and Technological Research Council of Turkey (TUBITAK) grant no. 110T304, 109T209, Marie Curie International Reintegration Grant (IRG) grant no 256458, Turkish Academy of Science (TUBA-Gebip).

  1. Processability analysis of candidate waste forms

    International Nuclear Information System (INIS)

    Gould, T.H. Jr.; Dunson, J.B. Jr.; Eisenberg, A.M.; Haight, H.G. Jr.; Mello, V.E.; Schuyler, R.L. III.

    1982-01-01

    A quantitative merit evaluation, or processability analysis, was performed to assess the relative difficulty of remote processing of Savannah River Plant high-level wastes for seven alternative waste form candidates. The reference borosilicate glass process was rated as the simplest, followed by FUETAP concrete, glass marbles in a lead matrix, high-silica glass, crystalline ceramics (SYNROC-D and tailored ceramics), and coated ceramic particles. Cost estimates for the borosilicate glass, high-silica glass, and ceramic waste form processing facilities are also reported

  2. Non-Harmonic Fourier Analysis for bladed wheels damage detection

    Science.gov (United States)

    Neri, P.; Peeters, B.

    2015-11-01

    The interaction between bladed wheels and the fluid distributed by the stator vanes results in cyclic loading of the rotating components. Compressor and turbine wheels are subject to vibration and fatigue issues, especially when resonance conditions are excited. Even if resonance conditions can often be predicted and avoided, high cycle fatigue failures can occur, causing safety issues and economic loss. Rigorous maintenance programs are then needed, forcing the system into expensive shut-downs. Blade crack detection methods are beneficial for condition-based maintenance. While contact measurement systems are not always usable in operating conditions (e.g. high temperature), non-contact methods can be more suitable. One (or more) stator-fixed sensor can measure all the blades as they pass by, in order to detect the damaged ones. The main drawback in this situation is the short acquisition time available for each blade, which is shortened further by the high rotational speed of the components. A traditional Discrete Fourier Transform (DFT) analysis would result in a poor frequency resolution. A Non-Harmonic Fourier Analysis (NHFA) can instead be executed with an arbitrary frequency resolution, allowing frequency information to be obtained even from short-time data samples. This paper shows an analytical investigation of the NHFA method. A data processing algorithm is then proposed to obtain frequency shift information from short time samples. The performance of this algorithm is then studied by experimental and numerical tests.
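
    The core of NHFA, evaluating Fourier amplitudes at arbitrary rather than harmonic frequencies, can be sketched in a few lines (a direct-projection illustration assuming a uniformly sampled short record; it is not the authors' full algorithm):

        import numpy as np

        def nhfa(x, fs, freqs):
            """Evaluate Fourier amplitudes at arbitrary (non-harmonic)
            frequencies by direct projection, so the frequency resolution is
            not tied to the short record length as it is for the DFT."""
            t = np.arange(len(x)) / fs
            return np.array([2 / len(x) * np.abs(np.sum(x * np.exp(-2j * np.pi * f * t)))
                             for f in freqs])

        # e.g. scan a 10 Hz band around a hypothetical blade resonance in 0.05 Hz steps:
        # amps = nhfa(blade_signal, fs=50_000, freqs=np.arange(995.0, 1005.0, 0.05))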

  3. CPAS Preflight Drop Test Analysis Process

    Science.gov (United States)

    Englert, Megan E.; Bledsoe, Kristin J.; Romero, Leah M.

    2015-01-01

    Throughout the Capsule Parachute Assembly System (CPAS) drop test program, the CPAS Analysis Team has developed a simulation and analysis process to support drop test planning and execution. This process includes multiple phases focused on developing test simulations and communicating results to all groups involved in the drop test. CPAS Engineering Development Unit (EDU) series drop test planning begins with the development of a basic operational concept for each test. Trajectory simulation tools include the Flight Analysis and Simulation Tool (FAST) for single bodies, and the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation for the mated vehicle. Results are communicated to the team at the Test Configuration Review (TCR) and Test Readiness Review (TRR), as well as at Analysis Integrated Product Team (IPT) meetings in earlier and intermediate phases of the pre-test planning. The ability to plan and communicate efficiently with rapidly changing objectives and tight schedule constraints is a necessity for safe and successful drop tests.

  4. Knee joint vibroarthrographic signal processing and analysis

    CERN Document Server

    Wu, Yunfeng

    2015-01-01

    This book presents the cutting-edge technologies of knee joint vibroarthrographic signal analysis for the screening and detection of knee joint injuries. It describes a number of effective computer-aided methods for analysis of the nonlinear and nonstationary biomedical signals generated by complex physiological mechanics. This book also introduces several popular machine learning and pattern recognition algorithms for biomedical signal classifications. The book is well-suited for all researchers looking to better understand knee joint biomechanics and the advanced technology for vibration arthrometry. Dr. Yunfeng Wu is an Associate Professor at the School of Information Science and Technology, Xiamen University, Xiamen, Fujian, China.

  5. PROCESSING BIG REMOTE SENSING DATA FOR FAST FLOOD DETECTION IN A DISTRIBUTED COMPUTING ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    A. Olasz

    2017-07-01

    Full Text Available The Earth observation (EO) missions of the space agencies and space industry (ESA, NASA, national and commercial companies) are evolving as never before. These missions aim to develop and launch next-generation series of satellites and sensors and often provide huge amounts of data, even free of charge, to enable novel monitoring services. The wider geospatial sector is expected to handle new challenges in storing, processing and visualizing these geospatial data, which reach the level of Big Data by their volume, variety and velocity, along with the need for multi-source spatio-temporal geospatial data processing. Handling and analysis of remote sensing data has always been a cumbersome task due to the ever-increasing size and frequency of collected information. This paper presents the achievements of the IQmulus EU FP7 research and development project with respect to processing and analysis of geospatial big data in the context of flood and waterlogging detection.

  6. Detection and monitoring of neurotransmitters--a spectroscopic analysis.

    Science.gov (United States)

    Manciu, Felicia S; Lee, Kendall H; Durrer, William G; Bennet, Kevin E

    2013-01-01

    We demonstrate that confocal Raman mapping spectroscopy provides rapid, detailed, and accurate neurotransmitter analysis, enabling millisecond time resolution monitoring of biochemical dynamics. As a prototypical demonstration of the power of the method, we present real-time in vitro serotonin, adenosine, and dopamine detection, and dopamine diffusion in an inhomogeneous organic gel, which was used as a substitute for neurologic tissue.  Dopamine, adenosine, and serotonin were used to prepare neurotransmitter solutions in distilled water. The solutions were applied to the surfaces of glass slides, where they interdiffused. Raman mapping was achieved by detecting nonoverlapping spectral signatures characteristic of the neurotransmitters with an alpha 300 WITec confocal Raman system, using 532 nm neodymium-doped yttrium aluminum garnet laser excitation. Every local Raman spectrum was recorded in milliseconds and complete Raman mapping in a few seconds.  Without damage, dyeing, or preferential sample preparation, confocal Raman mapping provided positive detection of each neurotransmitter, allowing association of the high-resolution spectra with specific microscale image regions. Such information is particularly important for complex, heterogeneous samples, where changes in composition can influence neurotransmission processes. We also report an estimated dopamine diffusion coefficient two orders of magnitude smaller than that calculated by the flow-injection method.  Accurate nondestructive characterization for real-time detection of neurotransmitters in inhomogeneous environments without the requirement of sample labeling is a key issue in neuroscience. Our work demonstrates the capabilities of Raman spectroscopy in biological applications, possibly providing a new tool for elucidating the mechanism and kinetics of deep brain stimulation. © 2012 International Neuromodulation Society.

  7. Profitability Analysis of Soybean Oil Processes.

    Science.gov (United States)

    Cheng, Ming-Hsun; Rosentrater, Kurt A

    2017-10-07

    Soybean oil production is the basic process for soybean applications. Cash flow analysis is used to estimate the profitability of a manufacturing venture. Besides capital investments, operating costs, and revenues, the interest rate is the factor used to estimate the net present value (NPV), break-even points, and payback time, which are benchmarks for profitability evaluation. A positive NPV and a reasonable payback time represent a profitable process, and provide an acceptable projection for real operation. Additionally, the capacity of the process is another critical factor. The extruding-expelling process and hexane extraction are the two typical approaches used in industry. When the capacities of annual oil production are larger than 12 and 173 million kg, respectively, these two processes are profitable. The solvent-free approach, known as the enzyme assisted aqueous extraction process (EAEP), is profitable when the capacity is larger than 17 million kg of annual oil production.

  8. Profitability Analysis of Soybean Oil Processes

    Directory of Open Access Journals (Sweden)

    Ming-Hsun Cheng

    2017-10-01

    Full Text Available Soybean oil production is the basic process for soybean applications. Cash flow analysis is used to estimate the profitability of a manufacturing venture. Besides capital investments, operating costs, and revenues, the interest rate is the factor used to estimate the net present value (NPV), break-even points, and payback time, which are benchmarks for profitability evaluation. A positive NPV and a reasonable payback time represent a profitable process, and provide an acceptable projection for real operation. Additionally, the capacity of the process is another critical factor. The extruding-expelling process and hexane extraction are the two typical approaches used in industry. When the capacities of annual oil production are larger than 12 and 173 million kg, respectively, these two processes are profitable. The solvent-free approach, known as the enzyme assisted aqueous extraction process (EAEP), is profitable when the capacity is larger than 17 million kg of annual oil production.

  9. Website Detection Using Remote Traffic Analysis

    OpenAIRE

    Gong, Xun; Kiyavash, Negar; Schear, Nabíl; Borisov, Nikita

    2011-01-01

    Recent work in traffic analysis has shown that traffic patterns leaked through side channels can be used to recover important semantic information. For instance, attackers can find out which website, or which page on a website, a user is accessing simply by monitoring the packet size distribution. We show that traffic analysis is even a greater threat to privacy than previously thought by introducing a new attack that can be carried out remotely. In particular, we show that, to perform traffi...

  10. GOTRES: an expert system for fault detection and analysis

    International Nuclear Information System (INIS)

    Chung, D.T.; Modarres, M.

    1989-01-01

    This paper describes a deep-knowledge expert system shell for diagnosing faults in process operations. The expert program shell is called GOTRES (GOal TRee Expert System) and uses a goal tree-success tree deep-knowledge structure to model its knowledge base. To demonstrate GOTRES, we have built an on-line fault diagnosis expert system for an experimental nuclear reactor facility using this shell. The expert system is capable of diagnosing fault conditions using the system goal tree as well as utilizing accumulated operating knowledge to predict plant causal and temporal behaviours. The GOTRES shell has also been used for root-cause detection and analysis in a nuclear plant. (author)

  11. Systemic analysis of the caulking assembly process

    Directory of Open Access Journals (Sweden)

    Rodean Claudiu

    2017-01-01

    Full Text Available The present paper highlights the importance of a caulking process which is nowadays less studied in comparison with the growing of its usage in the automotive industry. Due to the fact that the caulking operation is used in domains with high importance such as shock absorbers and brake systems there comes the demand of this paper to detail the parameters which characterize the process, viewed as input data and output data, and the requirements asked for the final product. The paper presents the actual measurement methods used for analysis the performance of the caulking assembly. All this parameters leads to an analysis algorithm of performance established for the caulking process which it is used later in the paper for an experimental research. The study is a basis from which it will be able to go to further researches in order to optimize the following processing.

  12. Application and Analysis of Wavelet Transform in Image Edge Detection

    Institute of Scientific and Technical Information of China (English)

    Jianfang Gao

    2016-01-01

    In image processing, technicians have long sought a convenient and simple detection method, particularly in research on image edge detection technology. Because much of an image's original information is carried at its edges, edge data allow the real image content to be recovered during data acquisition. Edges are often used with irregular geometric objects, where the image contour is determined in combination with transmitted signal data. At present there are various image edge detection algorithms, but each type has its own disadvantages, making it difficult to detect image changes within a reasonable range. We apply the wavelet transform to image edge detection, making full use of its high-resolution characteristics and combining multiple images, in order to improve the accuracy of image edge detection.
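
    As a concrete illustration of the general idea (not the authors' exact algorithm), the following sketch uses the PyWavelets package: a single-level 2-D wavelet decomposition produces detail coefficients that are large near edges. The wavelet choice and the threshold are assumptions made for the demonstration.

        import numpy as np
        import pywt

        def wavelet_edges(image, wavelet="haar", threshold=0.1):
            # Single-level 2-D DWT: cA is the smooth part, (cH, cV, cD) the detail bands.
            cA, (cH, cV, cD) = pywt.dwt2(image.astype(float), wavelet)
            # Edge strength: magnitude of horizontal/vertical/diagonal detail coefficients.
            strength = np.sqrt(cH**2 + cV**2 + cD**2)
            return strength > threshold * strength.max()

        img = np.zeros((64, 64))
        img[16:48, 16:48] = 1.0      # a bright square; its border is the edge
        edges = wavelet_edges(img)
        print(edges.shape)           # detail bands are half resolution: (32, 32)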

  13. Geospatial Image Stream Processing: Models, techniques, and applications in remote sensing change detection

    Science.gov (United States)

    Rueda-Velasquez, Carlos Alberto

    Detection of changes in environmental phenomena using remotely sensed data is a major requirement in the Earth sciences, especially in natural disaster related scenarios where real-time detection plays a crucial role in the saving of human lives and the preservation of natural resources. Although various approaches formulated to model multidimensional data can in principle be applied to the inherent complexity of remotely sensed geospatial data, there are still challenging peculiarities that demand a precise characterization in the context of change detection, particularly in scenarios of fast changes. In the same vein, geospatial image streams do not fit appropriately in the standard Data Stream Management System (DSMS) approach because these systems mainly deal with tuple-based streams. Recognizing the necessity for a systematic effort to address the above issues, the work presented in this thesis is a concrete step toward the foundation and construction of an integrated Geospatial Image Stream Processing framework, GISP. First, we present a data and metadata model for remotely sensed image streams. We introduce a precise characterization of images and image streams in the context of remotely sensed geospatial data. On this foundation, we define spatially-aware temporal operators with a consistent semantics for change analysis tasks. We address the change detection problem in settings where multiple image stream sources are available, and thus we introduce an architectural design for the processing of geospatial image streams from multiple sources. With the aim of targeting collaborative scientific environments, we construct a realization of our architecture based on Kepler, a robust and widely used scientific workflow management system, as the underlying computational support; and open data and Web interface standards, as a means to facilitate the interoperability of GISP instances with other processing infrastructures and client applications. We demonstrate our

  14. Book: Marine Bioacoustic Signal Processing and Analysis

    Science.gov (United States)

    2011-09-30

    physicists, and mathematicians. However, more and more biologists and psychologists are starting to use advanced signal processing techniques and... chapters than it should be, since the project must be finished by Dec. 31. I have started setting aside 2 hours of uninterrupted time per workday to work.

  15. A cost analysis: processing maple syrup products

    Science.gov (United States)

    Neil K. Huyler; Lawrence D. Garrett

    1979-01-01

    A cost analysis of processing maple sap to syrup for three fuel types, oil-, wood-, and LP gas-fired evaporators, indicates that: (1) fuel, capital, and labor are the major cost components of processing sap to syrup; (2) wood-fired evaporators show a slight cost advantage over oil- and LP gas-fired evaporators; however, as the cost of wood approaches $50 per cord, wood...

  16. Parallel processing of structural integrity analysis codes

    International Nuclear Information System (INIS)

    Swami Prasad, P.; Dutta, B.K.; Kushwaha, H.S.

    1996-01-01

    Structural integrity analysis plays an important role in assessing and demonstrating the safety of nuclear reactor components. This analysis is performed using analytical tools such as the Finite Element Method (FEM) with the help of digital computers. The complexity of the problems involved in nuclear engineering demands high-speed computation facilities to obtain solutions in a reasonable amount of time. Parallel processing systems such as ANUPAM provide an efficient platform for realising such high-speed computation. The development and implementation of software on parallel processing systems is an interesting and challenging task. The data and algorithm structure of the codes plays an important role in exploiting the parallel processing system's capabilities. Structural analysis codes based on FEM can be divided into two categories with respect to their implementation on parallel processing systems. Codes in the first category, such as those used for harmonic analysis and mechanistic fuel performance, do not require parallelisation of individual modules. Codes in the second category, such as conventional FEM codes, do require parallelisation of individual modules; in this category, parallelisation of the equation solution module poses the major difficulty. Different solution schemes, such as the domain decomposition method (DDM), the parallel active column solver, and the substructuring method, are currently used on parallel processing systems. Two codes, FAIR and TABS, belonging to these two categories respectively, have been implemented on ANUPAM. The implementation details of these codes and the performance of different equation solvers are highlighted. (author). 5 refs., 12 figs., 1 tab

  17. Trace analysis for 300 MM wafers and processes with TXRF

    International Nuclear Information System (INIS)

    Nutsch, A.; Erdmann, V.; Zielonka, G.; Pfitzner, L.; Ryssel, H.

    2000-01-01

    Efficient fabrication of semiconductor devices goes hand in hand with an increasing size of silicon wafers, while the contamination level of processes, media, and equipment has to decrease continuously. In view of these aspects, a new test laboratory for 300 mm wafers was installed. Aside from numerous processing tools, this platform comprises electrical test methods, particle detection, vapor phase decomposition (VPD) preparation, and TXRF. The equipment is installed in a cleanroom. It is common to perform process or equipment control, development, evaluation, and qualification with monitor wafers; the evaluation and qualification of 300 mm equipment require direct TXRF on 300 mm wafers. A new TXRF setup was therefore installed for the 300 mm wafer size. The 300 mm TXRF is equipped with tungsten and molybdenum anodes. This combination allows sensitive detection of elements with fluorescence energies below 10 keV under tungsten excitation, while molybdenum excitation enables the detection of a wide variety of elements. The detection sensitivity with tungsten-anode excitation is ten times higher than with the molybdenum anode. The system is calibrated with 1 ng Ni; this calibration is stable within 5% when monitored to control system stability, and decreasing the amount of Ni results in a linear decrease of the measured Ni signal. This result is verified for a range of elements by multielement samples. New designs demand new processes and materials, e.g., ferroelectric layers and copper; the trace analysis of many of these materials is supported by the higher excitation energy of the molybdenum anode. Reclaim and recycling of 300 mm wafers demand accurate contamination control of the processes to avoid cross contamination. Polishing or etching results in modified surfaces. TXRF as a non-destructive test method allows the simultaneous detection of a variety of elements on differing surfaces in view of contamination control and process

  18. Processing of the quench detection signals in W7-X

    International Nuclear Information System (INIS)

    Birus, Dietrich; Schneider, Matthias; Rummel, Thomas; Fricke, Marko; Petry, Klaus; Ebersoldt, Andreas

    2009-01-01

    The Wendelstein 7-X (W7-X) project uses superconducting coils to generate the magnetic field that confines the plasma. One of the important safety systems is the protection against quench events. The quench detection system of W7-X protects the superconducting coils, the superconducting bus bar sections, and the high-temperature superconductors of the current leads against damage from a quench and against the high stress of a fast discharge of the magnet system. The present design of the quench detection system (QDS) therefore uses a two-stage safety concept for discharging the magnet system. This paper describes the present design of the system assembly, from the quench detection unit (QDU), which detects the quench, to the quench detection interface (QDI), which implements the two-stage safety concept.

  19. Development of Quantum Devices and Algorithms for Radiation Detection and Radiation Signal Processing

    International Nuclear Information System (INIS)

    El Tokhy, M.E.S.M.E.S.

    2012-01-01

    The main functions of a spectroscopy system are signal detection, filtering and amplification, pileup detection and recovery, dead time correction, amplitude analysis, and energy spectrum analysis. Safeguards isotopic measurements require the best spectrometer systems, with excellent resolution, stability, efficiency, and throughput. However, the resolution and throughput, which depend mainly on the detector, amplifier, and analog-to-digital converter (ADC), can still be improved, and these modules have been in continuous development. For this reason we are interested in both the development of quantum detectors and efficient algorithms for digital processing of the measurements. The main objective of this thesis is therefore concentrated on (1) studying the behavior of quantum dot (QD) devices under gamma radiation and (2) developing efficient algorithms for handling problems of gamma-ray spectroscopy. For gamma radiation detection, a detailed study of nanotechnology QD sources and infrared photodetectors (QDIPs) for gamma radiation detection is introduced. Two different types of quantum scintillator detectors dominate the area of ionizing radiation measurements: QD scintillator detectors and QDIP scintillator detectors. Compared with traditional systems, quantum systems have less mass, require less volume, and consume less power. These factors increase the need for efficient detectors for gamma-ray applications such as gamma-ray spectroscopy. Nanocomposite materials based on semiconductor quantum dots have been demonstrated in the literature to have potential for radiation detection via scintillation. Therefore, this thesis presents a theoretical analysis of the characteristics of QD sources and infrared photodetectors (QDIPs). A model of QD sources under incident gamma radiation is developed, and a novel methodology is introduced to characterize the effect of gamma radiation on QD devices. The rate

  20. Process based analysis of manually controlled drilling processes for bone

    Science.gov (United States)

    Teicher, Uwe; Achour, Anas Ben; Nestler, Andreas; Brosius, Alexander; Lauer, Günter

    2018-05-01

    The machining operation of drilling is part of the standard repertoire for medical applications. This machining cycle, usually a multi-stage process, generates the geometric element for the subsequent integration of implants, which are screwed into the bone in later process steps. In addition to the form, shape, and position of the generated drill hole, it is also necessary to use a technology that ensures an operation with minimal damage. A surface damaged by excessive mechanical and thermal energy input shows a deteriorated healing capacity for implants and presents a structure prone to inflammatory complications. The resulting loads are influenced by the material properties of the bone, the technology used, and the tool properties. An important aspect of the process analysis is the fact that machining of bone is in most cases a manual process that depends mainly on the skills of the operator. This includes, among other things, the machining time for producing a drill hole, since manual drilling is a force-controlled process. Experimental work was carried out on the bone of a porcine mandible in order to investigate the applied load during drilling. It can be shown that the load application can be subdivided according to the working feed direction: the entire drilling process consists of several time domains, which can be divided into the geometry-generating feed motion and a retraction movement of the tool. It was shown that the removal of the tool from the drill hole has a significant influence on the mechanical load input. This fact is demonstrated in detail by a new evaluation methodology, which also identifies the causes of this characteristic as well as possible ways of reducing the load input.

  1. Non destructive defect detection by spectral density analysis.

    Science.gov (United States)

    Krejcar, Ondrej; Frischer, Robert

    2011-01-01

    The potential for nondestructive diagnostics of solid objects is discussed in this article. The whole process is accomplished by consecutive steps involving software analysis of the vibration power spectrum (or, alternatively, acoustic emissions) created during normal operation of the diagnosed device or under unexpected situations. Another option is to create an artificial pulse, which can help determine the actual state of the diagnosed device. The main idea of the method is to analyze the power spectral density of the received signal and postprocess it in the Matlab environment, followed by sample comparison in the Statistica software environment. The last step, the comparison of samples, is the most important, because it makes it possible to determine the status of the examined object at a given time. Nowadays samples are compared only visually, but this approach does not produce good results. Furthermore, the presented filter can select the relevant data from the very large data set that results from applying the FFT (Fast Fourier Transform). The filtered data can then be subjected to analysis with the assistance of a neural network: if correct and high-quality starting data are provided to the initial network, we are able to analyze other samples and state the condition of a given object. The success rate of this approximation, based on our testing of the solution, is currently 85.7%; with further improvement of the filter, it could be even greater. Finally, it is possible to detect defective conditions or upcoming limiting states of examined objects/materials using only one device containing both hardware and software parts. This kind of detection can provide significant financial savings in certain cases (such as continuous casting of iron, where it could save hundreds of thousands of USD).
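
    The core of such a comparison, estimating the power spectral density and scoring its similarity to a healthy reference, can be sketched in a few lines of Python with SciPy. The sampling rate, threshold, and test signals below are illustrative assumptions, not values from the article.

        import numpy as np
        from scipy.signal import welch

        fs = 10_000  # Hz, assumed sampling rate of the vibration sensor

        def psd(signal):
            f, pxx = welch(signal, fs=fs, nperseg=1024)
            return pxx

        def is_defective(sample, healthy_ref, min_similarity=0.9):
            a, b = psd(sample), psd(healthy_ref)
            # Normalized correlation between the two spectra as a similarity score.
            score = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
            return score < min_similarity

        t = np.arange(fs) / fs
        healthy = np.sin(2 * np.pi * 120 * t) + 0.1 * np.random.randn(fs)
        cracked = np.sin(2 * np.pi * 120 * t) + 0.8 * np.sin(2 * np.pi * 470 * t)
        print(is_defective(cracked, healthy))   # True: an extra resonance shifts the PSD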

  2. Detection and Analysis of Circular RNAs by RT-PCR.

    Science.gov (United States)

    Panda, Amaresh C; Gorospe, Myriam

    2018-03-20

    Gene expression in eukaryotic cells is tightly regulated at the transcriptional and posttranscriptional levels. Posttranscriptional processes, including pre-mRNA splicing, mRNA export, mRNA turnover, and mRNA translation, are controlled by RNA-binding proteins (RBPs) and noncoding (nc)RNAs. The vast family of ncRNAs comprises diverse regulatory RNAs, such as microRNAs and long noncoding (lnc)RNAs, but also the poorly explored class of circular (circ)RNAs. Although first discovered more than three decades ago by electron microscopy, only the advent of high-throughput RNA-sequencing (RNA-seq) and the development of innovative bioinformatic pipelines have begun to allow the systematic identification of circRNAs (Szabo and Salzman, 2016; Panda et al., 2017b; Panda et al., 2017c). However, the validation of true circRNAs identified by RNA sequencing requires other molecular biology techniques, including reverse transcription (RT) followed by conventional or quantitative (q) polymerase chain reaction (PCR), and Northern blot analysis (Jeck and Sharpless, 2014). RT-qPCR analysis of circular RNAs using divergent primers has been widely used for the detection, validation, and sometimes quantification of circRNAs (Abdelmohsen et al., 2015 and 2017; Panda et al., 2017b). As detailed here, divergent primers designed to span the circRNA backsplice junction sequence can specifically amplify the circRNAs and not the counterpart linear RNA. In sum, RT-PCR analysis using divergent primers allows direct detection and quantification of circRNAs.

  3. Software Process Improvement Using Force Field Analysis ...

    African Journals Online (AJOL)

    An improvement plan is then drawn and implemented. This paper studied the state of Nigerian software development organizations based on selected attributes. Force field analysis is used to partition the factors obtained into driving and restraining forces. An attempt was made to improve the software development process ...

  4. Applied Behavior Analysis and Statistical Process Control?

    Science.gov (United States)

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  5. Analysis of Exhaled Breath for Disease Detection

    Science.gov (United States)

    Amann, Anton; Miekisch, Wolfram; Schubert, Jochen; Buszewski, Bogusław; Ligor, Tomasz; Jezierski, Tadeusz; Pleil, Joachim; Risby, Terence

    2014-06-01

    Breath analysis is a young field of research with great clinical potential. As a result of this interest, researchers have developed new analytical techniques that permit real-time analysis of exhaled breath with breath-to-breath resolution in addition to the conventional central laboratory methods using gas chromatography-mass spectrometry. Breath tests are based on endogenously produced volatiles, metabolites of ingested precursors, metabolites produced by bacteria in the gut or the airways, or volatiles appearing after environmental exposure. The composition of exhaled breath may contain valuable information for patients presenting with asthma, renal and liver diseases, lung cancer, chronic obstructive pulmonary disease, inflammatory lung disease, or metabolic disorders. In addition, oxidative stress status may be monitored via volatile products of lipid peroxidation. Measurement of enzyme activity provides phenotypic information important in personalized medicine, whereas breath measurements provide insight into perturbations of the human exposome and can be interpreted as preclinical signals of adverse outcome pathways.

  6. Kernel principal component analysis for change detection

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Morton, J.C.

    2008-01-01

    region acquired at two different time points. If change over time does not dominate the scene, the projection of the original two bands onto the second eigenvector will show change over time. In this paper a kernel version of PCA is used to carry out the analysis. Unlike ordinary PCA, kernel PCA...... with a Gaussian kernel successfully finds the change observations in a case where nonlinearities are introduced artificially....
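
    A minimal sketch of this scheme with scikit-learn is shown below. The synthetic two-band data and the 3-sigma outlier rule are assumptions made for illustration, not details from the paper.

        import numpy as np
        from sklearn.decomposition import KernelPCA

        rng = np.random.default_rng(0)
        t1 = rng.normal(size=2000)                 # band at time 1
        t2 = t1 + 0.05 * rng.normal(size=2000)     # time 2: mostly no change...
        t2[:100] += 3.0                            # ...except a small changed region
        X = np.column_stack([t1, t2])

        kpca = KernelPCA(n_components=2, kernel="rbf", gamma=1.0)
        scores = kpca.fit_transform(X)
        # When no-change dominates, change loads on a leading eigenvector (the
        # second, in the paper's setting); flag pixels extreme on either component.
        z = (scores - scores.mean(axis=0)) / scores.std(axis=0)
        change = (np.abs(z) > 3).any(axis=1)
        print(change[:100].mean(), change[100:].mean())   # changed vs unchanged share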

  7. A statistical analysis on the leak detection performance of ...

    Indian Academy of Sciences (India)

    Chinedu Duru

    2017-11-09

    Nov 9, 2017 ... of underground and overground pipelines with wireless sensor networks through the .... detection performance analysis of pipeline leakage. This study and ..... case and apply to all materials transported through the pipeline.

  8. Attack Pattern Analysis Framework for a Multiagent Intrusion Detection System

    Directory of Open Access Journals (Sweden)

    Krzysztof Juszczyszyn

    2008-08-01

    Full Text Available The paper proposes the use of an attack pattern ontology and a formal framework for network traffic anomaly detection within a distributed multi-agent Intrusion Detection System architecture. Our framework assumes ontology-based attack definition and a distributed processing scheme with exchange of messages between agents. The role of traffic anomaly detection is presented, and it is discussed how specific values characterizing network communication can be used to detect network anomalies caused by security incidents (worm attacks, virus spreading). Finally, it is defined how to use the proposed techniques in a distributed IDS based on the attack pattern ontology.

  9. Automated processing integrated with a microflow cytometer for pathogen detection in clinical matrices.

    Science.gov (United States)

    Golden, J P; Verbarg, J; Howell, P B; Shriver-Lake, L C; Ligler, F S

    2013-02-15

    A spinning magnetic trap (MagTrap) for automated sample processing was integrated with a microflow cytometer capable of simultaneously detecting multiple targets to provide an automated sample-to-answer diagnosis in 40 min. After target capture on fluorescently coded magnetic microspheres, the magnetic trap automatically concentrated the fluorescently coded microspheres, separated the captured target from the sample matrix, and exposed the bound target sequentially to biotinylated tracer molecules and streptavidin-labeled phycoerythrin. The concentrated microspheres were then hydrodynamically focused in a microflow cytometer capable of 4-color analysis (two wavelengths for microsphere identification, one for light scatter to discriminate single microspheres, and one for phycoerythrin bound to the target). A three-fold decrease in sample preparation time and an improved detection limit, independent of target preconcentration, were demonstrated for the detection of Escherichia coli O157:H7 using the MagTrap as compared to manual processing. Simultaneous analysis of positive and negative controls, along with the assay reagents specific for the target, was used to obtain dose-response curves, demonstrating the potential for quantification of pathogen load in buffer and serum. Published by Elsevier B.V.

  10. An image processing pipeline to detect and segment nuclei in muscle fiber microscopic images.

    Science.gov (United States)

    Guo, Yanen; Xu, Xiaoyin; Wang, Yuanyuan; Wang, Yaming; Xia, Shunren; Yang, Zhong

    2014-08-01

    Muscle fiber images play an important role in the medical diagnosis and treatment of many muscular diseases. The number of nuclei in skeletal muscle fiber images is a key biomarker in the diagnosis of muscular dystrophy. A primary challenge in nuclei segmentation is to correctly separate clustered nuclei. In this article, we developed an image processing pipeline to automatically detect, segment, and analyze nuclei in microscopic images of muscle fibers. The pipeline consists of image pre-processing, identification of isolated nuclei, identification and segmentation of clustered nuclei, and quantitative analysis. Nuclei are initially extracted from the background using a local Otsu threshold. Based on analysis of morphological features of the isolated nuclei, including their areas, compactness, and major axis lengths, a Bayesian network is trained and applied to distinguish isolated nuclei from clustered nuclei and artifacts in all the images. A two-step refined watershed algorithm is then applied to segment the clustered nuclei. After segmentation, the nuclei can be quantified for statistical analysis. Comparing the segmentation results with those of manual analysis and an existing technique, we find that our proposed image processing pipeline achieves good performance with high accuracy and precision. The presented pipeline can therefore help biologists increase their throughput and objectivity in analyzing large numbers of nuclei in muscle fiber images. © 2014 Wiley Periodicals, Inc.
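
    A stripped-down version of the early pipeline stages can be sketched with scikit-image and SciPy, as below. The Bayesian-network step is omitted, and the global (rather than local) Otsu threshold and the min_distance value are simplifying assumptions.

        import numpy as np
        from scipy import ndimage as ndi
        from skimage.filters import threshold_otsu
        from skimage.feature import peak_local_max
        from skimage.segmentation import watershed

        def segment_nuclei(gray):
            # Assumes bright nuclei on a dark background.
            mask = gray > threshold_otsu(gray)
            # Peaks of the distance transform approximate nucleus centers.
            distance = ndi.distance_transform_edt(mask)
            peaks = peak_local_max(distance, labels=mask, min_distance=5)
            peak_mask = np.zeros(distance.shape, dtype=bool)
            peak_mask[tuple(peaks.T)] = True
            markers, _ = ndi.label(peak_mask)
            # Watershed from the peaks splits touching (clustered) nuclei.
            return watershed(-distance, markers, mask=mask)

        # Example: labels = segment_nuclei(img); labels.max() is the nucleus count.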

  11. Design and implementation of network attack analysis and detect system

    International Nuclear Information System (INIS)

    Lu Zhigang; Wu Huan; Liu Baoxu

    2007-01-01

    This paper first analyzes the present state of research on IDSs (intrusion detection systems), classifying and comparing existing methods. In view of the problems existing in IDSs, such as false positives, false negatives, and poor information visualization, this paper proposes a system named NAADS which supports multiple data sources. Through a combination of methods including clustering analysis, association analysis, and visualization, the detection rate and usability of NAADS are increased. (authors)

  12. Detection of a novel, integrative aging process suggests complex physiological integration.

    Science.gov (United States)

    Cohen, Alan A; Milot, Emmanuel; Li, Qing; Bergeron, Patrick; Poirier, Roxane; Dusseault-Bélanger, Francis; Fülöp, Tamàs; Leroux, Maxime; Legault, Véronique; Metter, E Jeffrey; Fried, Linda P; Ferrucci, Luigi

    2015-01-01

    Many studies of aging examine biomarkers one at a time, but complex systems theory and network theory suggest that interpretations of individual markers may be context-dependent. Here, we attempted to detect underlying processes governing the levels of many biomarkers simultaneously by applying principal components analysis to 43 common clinical biomarkers measured longitudinally in 3694 humans from three longitudinal cohort studies on two continents (Women's Health and Aging I & II, InCHIANTI, and the Baltimore Longitudinal Study on Aging). The first axis was associated with anemia, inflammation, and low levels of calcium and albumin. The axis structure was precisely reproduced in all three populations and in all demographic sub-populations (by sex, race, etc.); we call the process represented by the axis "integrated albunemia." Integrated albunemia increases and accelerates with age in all populations, and predicts mortality and frailty--but not chronic disease--even after controlling for age. This suggests a role in the aging process, though causality is not yet clear. Integrated albunemia behaves more stably across populations than its component biomarkers, and thus appears to represent a higher-order physiological process emerging from the structure of underlying regulatory networks. If this is correct, detection of this process has substantial implications for physiological organization more generally.

  13. Detection of a novel, integrative aging process suggests complex physiological integration.

    Directory of Open Access Journals (Sweden)

    Alan A Cohen

    Full Text Available Many studies of aging examine biomarkers one at a time, but complex systems theory and network theory suggest that interpretations of individual markers may be context-dependent. Here, we attempted to detect underlying processes governing the levels of many biomarkers simultaneously by applying principal components analysis to 43 common clinical biomarkers measured longitudinally in 3694 humans from three longitudinal cohort studies on two continents (Women's Health and Aging I & II, InCHIANTI, and the Baltimore Longitudinal Study on Aging). The first axis was associated with anemia, inflammation, and low levels of calcium and albumin. The axis structure was precisely reproduced in all three populations and in all demographic sub-populations (by sex, race, etc.); we call the process represented by the axis "integrated albunemia." Integrated albunemia increases and accelerates with age in all populations, and predicts mortality and frailty--but not chronic disease--even after controlling for age. This suggests a role in the aging process, though causality is not yet clear. Integrated albunemia behaves more stably across populations than its component biomarkers, and thus appears to represent a higher-order physiological process emerging from the structure of underlying regulatory networks. If this is correct, detection of this process has substantial implications for physiological organization more generally.
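
    The axis-extraction step can be illustrated with a few lines of scikit-learn. The data below are synthetic stand-ins (only the subject and marker counts match the study), and the comments merely point to the paper's findings.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(1)
        X = rng.normal(size=(3694, 43))          # subjects x biomarkers (synthetic)

        Z = StandardScaler().fit_transform(X)    # each marker to mean 0, variance 1
        pca = PCA(n_components=5).fit(Z)
        axis1 = pca.transform(Z)[:, 0]           # first axis, per-subject score

        # On real data, the loadings show which biomarkers (e.g., hemoglobin,
        # albumin, calcium) drive the first axis.
        print(pca.explained_variance_ratio_[0], pca.components_[0, :5])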

  14. Real time loss detection for SNM in process

    International Nuclear Information System (INIS)

    Candy, J.V.; Dunn, D.R.; Gavel, D.T.

    1980-01-01

    This paper discusses the basis of a design for real time special nuclear material (SNM) loss detectors. The design utilizes process measurements and signal processing techniques to produce a timely estimate of material loss. A state estimator is employed as the primary signal processing algorithm. Material loss is indicated by changes in the states or process innovations (residuals). The design philosophy is discussed in the context of these changes
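
    A toy version of this idea, assuming a scalar inventory and invented noise levels, is sketched below: a Kalman-style state estimator tracks the expected measurement, and a persistent bias in the innovations (residuals) indicates material loss.

        import numpy as np

        def innovations(measurements, q=1e-4, r=0.25):
            x, p = measurements[0], 1.0          # state estimate and its variance
            res = []
            for z in measurements[1:]:
                p += q                           # predict: inventory assumed constant
                k = p / (p + r)                  # Kalman gain
                res.append(z - x)                # innovation before the update
                x += k * (z - x)
                p *= (1 - k)
            return np.array(res)

        rng = np.random.default_rng(2)
        z = 100.0 + 0.5 * rng.standard_normal(200)
        z[120:] -= 0.05 * np.arange(80)          # slow diversion begins at step 120
        res = innovations(z)
        # A one-sided test on the innovation mean flags the loss in near real time.
        print(res[:100].mean(), res[150:].mean())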

  15. Automatic Detection of Optic Disc in Retinal Image by Using Keypoint Detection, Texture Analysis, and Visual Dictionary Techniques

    Directory of Open Access Journals (Sweden)

    Kemal Akyol

    2016-01-01

    Full Text Available With advances in the computer field, methods and techniques in automatic image processing and analysis provide the opportunity to detect automatically the change and degeneration in retinal images. Localization of the optic disc is extremely important for determining hard exudate lesions or neovascularization, the later phase of diabetic retinopathy, in computer-aided eye disease diagnosis systems. Whereas optic disc detection is a fairly easy process in normal retinal images, detecting this region in retinal images affected by diabetic retinopathy may be difficult; sometimes optic disc information and hard exudate information may appear the same in terms of machine learning. We present a novel approach for efficient and accurate localization of the optic disc in retinal images containing noise and other lesions. This approach comprises five main steps: image processing, keypoint extraction, texture analysis, visual dictionary, and classifier techniques. We tested our proposed technique on 3 public datasets and obtained quantitative results. Experimental results show that average optic disc detection accuracies of 94.38%, 95.00%, and 90.00% are achieved, respectively, on the following public datasets: DIARETDB1, DRIVE, and ROC.

  16. Detecting change in processes using comparative trace clustering

    NARCIS (Netherlands)

    Hompes, B.F.A.; Buijs, J.C.A.M.; van der Aalst, W.M.P.; Dixit, P.M.; Buurman, J.

    2015-01-01

    Real-life business processes are complex and show a high degree of variability. Additionally, due to changing conditions and circumstances, these processes continuously evolve over time. For example, in the healthcare domain, advances in medicine trigger changes in diagnoses and treatment processes.

  17. Optimizing detection and analysis of slow waves in sleep EEG.

    Science.gov (United States)

    Mensen, Armand; Riedner, Brady; Tononi, Giulio

    2016-12-01

    Analysis of individual slow waves in EEG recording during sleep provides both greater sensitivity and specificity compared to spectral power measures. However, parameters for detection and analysis have not been widely explored and validated. We present a new, open-source, Matlab based, toolbox for the automatic detection and analysis of slow waves; with adjustable parameter settings, as well as manual correction and exploration of the results using a multi-faceted visualization tool. We explore a large search space of parameter settings for slow wave detection and measure their effects on a selection of outcome parameters. Every choice of parameter setting had some effect on at least one outcome parameter. In general, the largest effect sizes were found when choosing the EEG reference, type of canonical waveform, and amplitude thresholding. Previously published methods accurately detect large, global waves but are conservative and miss the detection of smaller amplitude, local slow waves. The toolbox has additional benefits in terms of speed, user-interface, and visualization options to compare and contrast slow waves. The exploration of parameter settings in the toolbox highlights the importance of careful selection of detection methods. The sensitivity and specificity of the automated detection can be improved by manually adding or deleting entire waves and/or specific channels using the toolbox visualization functions. The toolbox standardizes the detection procedure, sets the stage for reliable results and comparisons and is easy to use without previous programming experience. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Ergonomic analysis of radiopharmaceuticals samples preparation process

    International Nuclear Information System (INIS)

    Gomes, Luciene Betzler C.; Santos, Isaac Luquetti dos; Fonseca, Antonio Carlos C. da; Pellini, Marcos Pinto; Rebelo, Ana Maria

    2005-01-01

    The doses of radioisotopes to be administered to patients for diagnosis or therapy are prepared in the radiopharmacy sector. The preparation process adopts techniques aimed at reducing the exposure time of the professionals and the absorption of excessive doses by patients. The ergonomic analysis of this process contributes to the prevention of occupational illnesses and of accident risks during routine work, providing well-being and security to the users involved and conferring an adequate working standard on the process. In this context, studies that analyze the relevant factors, point toward solutions, and establish proposals to minimize risks in these activities are clearly pertinent. Through a methodology based on the concepts of ergonomics, improvements in effectiveness and quality are sought, together with a reduction of the difficulties experienced by workers. The prescribed work, established through codified norms and procedures, is compared with the work actually carried out, the real work, with a focus on the activities. The objective of this work is to discuss an ergonomic analysis of the radioisotope sample preparation process in the Setor de Radiofarmacia do Hospital Universitario Clementino Fraga Filho da Universidade Federal do Rio de Janeiro (UFRJ). (author)

  19. Gold Nanoparticles-Based Barcode Analysis for Detection of Norepinephrine.

    Science.gov (United States)

    An, Jeung Hee; Lee, Kwon-Jai; Choi, Jeong-Woo

    2016-02-01

    Nanotechnology-based bio-barcode amplification analysis offers an innovative approach for detecting neurotransmitters. We evaluated the efficacy of this method for detecting norepinephrine in normal and oxidative-stress-damaged dopaminergic cells. Our approach uses a combination of DNA barcodes and bead-based immunoassays for detecting neurotransmitters with surface-enhanced Raman spectroscopy (SERS), and provides polymerase chain reaction (PCR)-like sensitivity. The method relies on magnetic Dynabeads containing antibodies and on nanoparticles loaded both with DNA barcodes and with antibodies that can sandwich the target protein captured by the Dynabead-bound antibodies. The aggregate sandwich structures are magnetically separated from the solution and treated to remove the conjugated barcode DNA. The DNA barcodes are then identified by SERS and PCR analysis. The concentration of norepinephrine in dopaminergic cells can be readily detected using the bio-barcode assay, which is a rapid, high-throughput screening tool for detecting neurotransmitters.

  20. Data analysis of inertial sensor for train positioning detection system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seong Jin; Park, Sung Soo; Lee, Jae Ho; Kang, Dong Hoon [Korea Railroad Research Institute, Uiwang (Korea, Republic of)

    2015-02-15

    Train positioning information is fundamental for high-speed railroad inspection, making it possible to simultaneously determine the status and evaluate the integrity of railroad equipment. This paper presents the results of measurements and an analysis of an inertial measurement unit (IMU) used as a positioning detection sensor. Acceleration and angular rate measurements from the IMU were analyzed in the amplitude and frequency domains, with a discussion of vibration and train motions. Using these results and GPS information, positioning detection of a Korean tilting train express was performed from Naju station to Illo station on the Honam line. The results of a synchronized analysis of sensor measurements and train motion can help in the design of a train location detection system and improve positioning detection performance.

  1. Detecting Anthropogenic Disturbance on Weathering and Erosion Processes

    Science.gov (United States)

    Vanacker, V.; Schoonejans, J.; Bellin, N.; Ameijeiras-Mariño, Y.; Opfergelt, S.; Christl, M.

    2014-12-01

    Anthropogenic disturbance of natural vegetation can profoundly alter the physical, chemical, and biological processes within soils. Rapid removal of topsoil during intense farming can result in an imbalance between soil production through chemical weathering and physical erosion, with direct implications for local biogeochemical cycling. However, the feedback mechanisms between soil erosion, chemical weathering, and biogeochemical cycling in response to anthropogenic forcing are not yet fully understood. In this paper, we analyze dynamic soil properties for a rapidly changing anthropogenic landscape in the Spanish Betic Cordillera, focusing on the coupling between physical erosion, soil production, and soil chemical weathering. Modern erosion rates were quantified through analysis of sediment deposition volumes behind check dams, and represent catchment-average erosion rates over the last 10 to 50 years. Soil production rates are derived from in-situ produced 10Be nuclide concentrations, and represent long-term flux rates. In each catchment, soil chemical weathering intensities were calculated for two soil-regolith profiles. Although Southeast Spain is commonly reported as the European region most affected by land degradation, modern erosion rates are low (140 t ha-1 yr-1). About 50% of the catchments are losing soil at a rate of less than 60 t km-2 yr-1. Our data show that modern erosion rates are roughly of the same magnitude as the long-term, cosmogenically derived erosion rates in the Betic Cordillera. Soils developed on weathered metamorphic rocks have no well-developed profile characteristics and are generally thin and stony. Nevertheless, soil chemical weathering intensities are high, questioning the occurrence of past soil truncation.

  2. Across frequency processes involved in auditory detection of coloration

    DEFF Research Database (Denmark)

    Buchholz, Jörg; Kerketsos, P

    2008-01-01

    When an early wall reflection is added to a direct sound, a spectral modulation is introduced to the signal's power spectrum. This spectral modulation typically produces an auditory sensation of coloration or pitch. Throughout this study, auditory spectral-integration effects involved in coloration detection are investigated. Coloration detection thresholds were therefore measured as a function of reflection delay and stimulus bandwidth. In order to investigate the involved auditory mechanisms, an auditory model was employed that was conceptually similar to the peripheral weighting model [Yost, JASA...]. Its filterbank was designed to approximate auditory filter-shapes measured by Oxenham and Shera [JARO, 2003, 541-554], derived from forward masking data. The results of the present study demonstrate that a “purely” spectrum-based model approach can successfully describe auditory coloration detection even at high......

  3. Fault detection and isolation in processes involving induction machines

    Energy Technology Data Exchange (ETDEWEB)

    Zell, K; Medvedev, A [Control Engineering Group, Luleaa University of Technology, Luleaa (Sweden)

    1998-12-31

    A model-based technique for fault detection and isolation in electro-mechanical systems comprising induction machines is introduced. Two coupled state observers, one for the induction machine and another for the mechanical load, are used to detect and recognize fault-specific behaviors (fault signatures) from the real-time measurements of the rotor angular velocity and terminal voltages and currents. Practical applicability of the method is verified in full-scale experiments with a conveyor belt drive at SSAB, Luleaa Works. (orig.) 3 refs.

  4. Fault detection and isolation in processes involving induction machines

    Energy Technology Data Exchange (ETDEWEB)

    Zell, K.; Medvedev, A. [Control Engineering Group, Luleaa University of Technology, Luleaa (Sweden)

    1997-12-31

    A model-based technique for fault detection and isolation in electro-mechanical systems comprising induction machines is introduced. Two coupled state observers, one for the induction machine and another for the mechanical load, are used to detect and recognize fault-specific behaviors (fault signatures) from the real-time measurements of the rotor angular velocity and terminal voltages and currents. Practical applicability of the method is verified in full-scale experiments with a conveyor belt drive at SSAB, Luleaa Works. (orig.) 3 refs.

  5. Integrating human factors into process hazard analysis

    International Nuclear Information System (INIS)

    Kariuki, S.G.; Loewe, K.

    2007-01-01

    A comprehensive process hazard analysis (PHA) needs to address human factors. This paper describes an approach that systematically identifies human error in process design and the human factors that influence its production and propagation. The approach is deductive in nature and therefore considers human error as a top event; the combinations of different factors that may lead to this top event are analysed. It is qualitative in nature and is used in combination with other PHA methods. The method has the advantage that it does not regard operator error as the sole contributor to human failure within a system, but rather considers a combination of all underlying factors.

  6. Detection limit for rate fluctuations in inhomogeneous Poisson processes

    Science.gov (United States)

    Shintani, Toshiaki; Shinomoto, Shigeru

    2012-04-01

    Estimations of an underlying rate from data points are inevitably disturbed by the irregular occurrence of events. Proper estimation methods are designed to avoid overfitting by discounting the irregular occurrence of data, and to determine a constant rate from irregular data derived from a constant probability distribution. However, it can occur that rapid or small fluctuations in the underlying density are undetectable when the data are sparse. For an estimation method, the maximum degree of undetectable rate fluctuations is uniquely determined as a phase transition, when considering an infinitely long series of events drawn from a fluctuating density. In this study, we analytically examine an optimized histogram and a Bayesian rate estimator with respect to their detectability of rate fluctuation, and determine whether their detectable-undetectable phase transition points are given by an identical formula defining a degree of fluctuation in an underlying rate. In addition, we numerically examine the variational Bayes hidden Markov model in its detectability of rate fluctuation, and determine whether the numerically obtained transition point is comparable to those of the other two methods. Such consistency among these three principled methods suggests the presence of a theoretical limit for detecting rate fluctuations.
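
    For intuition, the optimized-histogram branch of this analysis can be sketched in Python following the Shimazaki-Shinomoto bin-width selection rule; the simulated event stream and the candidate widths are assumptions for demonstration, not the paper's setup.

        import numpy as np

        def best_bin_width(event_times, widths):
            # Minimize C(w) = (2*mean - var) / w**2 over the per-bin event counts.
            span = event_times.max() - event_times.min()
            best, best_cost = None, np.inf
            for w in widths:
                bins = max(1, int(np.ceil(span / w)))
                counts, _ = np.histogram(event_times, bins=bins)
                mean, var = counts.mean(), counts.var()   # biased variance, as required
                cost = (2 * mean - var) / (span / bins) ** 2
                if cost < best_cost:
                    best, best_cost = w, cost
            return best

        # Inhomogeneous Poisson events: a 10 Hz stream thinned by a slow sinusoid.
        rng = np.random.default_rng(3)
        t = np.sort(rng.uniform(0, 100, 1000))
        keep = rng.uniform(size=t.size) < 0.5 * (1 + np.sin(0.5 * t))
        # With a detectable rate fluctuation the optimal width is finite; for sparse
        # or constant-rate data the cost keeps favoring ever larger widths.
        print(best_bin_width(t[keep], np.linspace(0.5, 20, 40)))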

  7. Detection limit for rate fluctuations in inhomogeneous Poisson processes.

    Science.gov (United States)

    Shintani, Toshiaki; Shinomoto, Shigeru

    2012-04-01

    Estimations of an underlying rate from data points are inevitably disturbed by the irregular occurrence of events. Proper estimation methods are designed to avoid overfitting by discounting the irregular occurrence of data, and to determine a constant rate from irregular data derived from a constant probability distribution. However, it can occur that rapid or small fluctuations in the underlying density are undetectable when the data are sparse. For an estimation method, the maximum degree of undetectable rate fluctuations is uniquely determined as a phase transition, when considering an infinitely long series of events drawn from a fluctuating density. In this study, we analytically examine an optimized histogram and a Bayesian rate estimator with respect to their detectability of rate fluctuation, and determine whether their detectable-undetectable phase transition points are given by an identical formula defining a degree of fluctuation in an underlying rate. In addition, we numerically examine the variational Bayes hidden Markov model in its detectability of rate fluctuation, and determine whether the numerically obtained transition point is comparable to those of the other two methods. Such consistency among these three principled methods suggests the presence of a theoretical limit for detecting rate fluctuations.

  8. Protecting Student Intellectual Property in Plagiarism Detection Process

    Science.gov (United States)

    Butakov, Sergey; Barber, Craig

    2012-01-01

    The rapid development of the Internet along with increasing computer literacy has made it easy and tempting for digital natives to copy-paste someone's work. Plagiarism is now a burning issue in education, industry and even in the research community. In this study, the authors concentrate on plagiarism detection with particular focus on the…

  9. Fault detection in processes represented by PLS models using an EWMA control scheme

    KAUST Repository

    Harrou, Fouzi

    2016-10-20

    Fault detection is important for effective and safe process operation. Partial least squares (PLS) has been used successfully in fault detection for multivariate processes with highly correlated variables. However, the conventional PLS-based detection metrics, such as Hotelling's T2 and the Q statistic, are not well suited to detect small faults because they only use information about the process in the most recent observation. The exponentially weighted moving average (EWMA), however, has been shown to be more sensitive to small shifts in the mean of process variables. In this paper, a PLS-based EWMA fault detection method is proposed for monitoring processes represented by PLS models. The performance of the proposed method is compared with that of the traditional PLS-based fault detection method through a simulated example involving various fault scenarios that could be encountered in real processes. The simulation results clearly show the effectiveness of the proposed method over the conventional PLS method.
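
    The EWMA recursion at the heart of such a scheme is simple. The sketch below applies it to a generic residual sequence (such as PLS prediction residuals); the smoothing constant, control-limit multiplier, and simulated shift are typical textbook choices, not values from the paper.

        import numpy as np

        def ewma_alarms(x, lam=0.2, L=3.0):
            mu, sigma = x[:50].mean(), x[:50].std()       # in-control estimates
            limit = L * sigma * np.sqrt(lam / (2 - lam))  # steady-state control limit
            z, alarms = mu, []
            for xt in x:
                z = lam * xt + (1 - lam) * z              # EWMA recursion
                alarms.append(abs(z - mu) > limit)
            return np.array(alarms)

        rng = np.random.default_rng(4)
        r = rng.standard_normal(300)
        r[200:] += 1.0                       # a small mean shift from sample 200 on
        alarms = ewma_alarms(r)
        print(alarms[:200].mean(), alarms[200:].mean())   # alarm rate before/after shift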

  10. Control system for technological processes in tritium processing plants with process analysis

    International Nuclear Information System (INIS)

    Retevoi, Carmen Maria; Stefan, Iuliana; Balteanu, Ovidiu; Stefan, Liviu; Bucur, Ciprian

    2005-01-01

    Integration of a large variety of installations and equipment into a unitary system for controlling the technological process in tritium processing nuclear facilities is a rather complex task, particularly when experimental or new technologies are being developed. Ensuring a high degree of versatility, allowing easy modification of configurations and process parameters, is a major requirement imposed on experimental installations. The large amount of data which must be processed, stored, and easily accessed for subsequent analyses imposes the development of a large information network based on a highly integrated system containing the acquisition, control, and technological process analysis data, as well as a database system. On such a basis, integrated computation and control systems able to conduct the technological process can be developed, as well as protection systems for cases of failure or breakdown. The integrated system responds to the control and security requirements, both in emergencies and in the technological processes specific to industries that process radioactive or toxic substances with severe consequences in case of technological failure, as in a tritium processing nuclear plant. In order to lower the risk of technological failure of these processes, an integrated software, database, and process analysis system is developed which, based on an algorithm identifying the parameters important for the protection and security systems, will display the process evolution trend. The system was checked on an existing plant that includes a tritium removal unit, ultimately to be used in a nuclear power plant, by simulating failure events as well as the process itself. The system will also include a complete database monitoring all the parameters and process analysis software for the main modules of the tritium processing plant, namely isotope separation, catalytic purification, and cryogenic distillation.

  11. FIND: difFerential chromatin INteractions Detection using a spatial Poisson process.

    Science.gov (United States)

    Djekidel, Mohamed Nadhir; Chen, Yang; Zhang, Michael Q

    2018-02-12

    Polymer-based simulations and experimental studies indicate the existence of a spatial dependency between the adjacent DNA fibers involved in the formation of chromatin loops. However, the existing strategies for detecting differential chromatin interactions assume that the interacting segments are spatially independent from the other segments nearby. To resolve this issue, we developed a new computational method, FIND, which considers the local spatial dependency between interacting loci. FIND uses a spatial Poisson process to detect differential chromatin interactions that show a significant difference in their interaction frequency and the interaction frequency of their neighbors. Simulation and biological data analysis show that FIND outperforms the widely used count-based methods and has a better signal-to-noise ratio. © 2018 Djekidel et al.; Published by Cold Spring Harbor Laboratory Press.

  12. Exergy analysis of the LFC process

    International Nuclear Information System (INIS)

    Li, Qingsong; Lin, Yuankui

    2016-01-01

    Highlights: • Mengdong lignite was upgraded by liquids from coal (LFC) process at a laboratory-scale. • True boiling point distillation of tar was performed. • Basing on experimental data, the LFC process was simulated in Aspen Plus. • Amounts of exergy destruction and efficiencies of blocks were calculated. • Potential measures for improving the LFC process are suggested. - Abstract: Liquid from coal (LFC) is a pyrolysis technology for upgrading lignite. LFC is close to viability as a large-scale commercial technology and is strongly promoted by the Chinese government. This paper presents an exergy analysis of the LFC process producing semicoke and tar, simulated in Aspen Plus. The simulation included the drying unit, pyrolysis unit, tar recovery unit and combustion unit. To obtain the data required for the simulation, Mengdong lignite was upgraded using a laboratory-scale experimental facility based on LFC technology. True boiling point distillation of tar was performed. Based on thermodynamic data obtained from the simulation, chemical exergy and physical exergy were determined for process streams and exergy destruction was calculated. The exergy budget of the LFC process is presented as a Grassmann flow diagram. The overall exergy efficiency was 76.81%, with the combustion unit causing the highest exergy destruction. The study found that overall exergy efficiency can be increased by reducing moisture in lignite and making full use of physical exergy of pyrolysates. A feasible method for making full use of physical exergy of semicoke was suggested.

  13. Digital Printing Quality Detection and Analysis Technology Based on CCD

    Science.gov (United States)

    He, Ming; Zheng, Liping

    2017-12-01

    CCD-based digital printing quality detection and analysis technology enables rapid evaluation and objective detection of printing quality, and can exert a degree of control over printing quality. The rational application of CCD digital printing quality testing and analysis technology can therefore play a very positive role in improving the quality of digital printing and of the various printing devices and materials involved. In this paper, we present an in-depth study and discussion based on CCD digital print quality testing and analysis technology.

  14. Neyman, Markov processes and survival analysis.

    Science.gov (United States)

    Yang, Grace

    2013-07-01

    J. Neyman used stochastic processes extensively in his applied work. One example is the Fix and Neyman (F-N) competing risks model (1951) that uses finite homogeneous Markov processes to analyse clinical trials with breast cancer patients. We revisit the F-N model, and compare it with the Kaplan-Meier (K-M) formulation for right censored data. The comparison offers a way to generalize the K-M formulation to include risks of recovery and relapses in the calculation of a patient's survival probability. The generalization is to extend the F-N model to a nonhomogeneous Markov process. Closed-form solutions of the survival probability are available in special cases of the nonhomogeneous processes, like the popular multiple decrement model (including the K-M model) and Chiang's staging model, but these models do not consider recovery and relapses while the F-N model does. An analysis of sero-epidemiology current status data with recurrent events is illustrated. Fix and Neyman used Neyman's RBAN (regular best asymptotic normal) estimates for the risks, and provided a numerical example showing the importance of considering both the survival probability and the length of time of a patient living a normal life in the evaluation of clinical trials. The said extension would result in a complicated model and it is unlikely to find analytical closed-form solutions for survival analysis. With ever increasing computing power, numerical methods offer a viable way of investigating the problem.

  15. SCALABLE TIME SERIES CHANGE DETECTION FOR BIOMASS MONITORING USING GAUSSIAN PROCESS

    Data.gov (United States)

    National Aeronautics and Space Administration — SCALABLE TIME SERIES CHANGE DETECTION FOR BIOMASS MONITORING USING GAUSSIAN PROCESS VARUN CHANDOLA AND RANGA RAJU VATSAVAI Abstract. Biomass monitoring,...

  16. Signal processing for solar array monitoring, fault detection, and optimization

    CERN Document Server

    Braun, Henry; Spanias, Andreas

    2012-01-01

    Although the solar energy industry has experienced rapid growth recently, high-level management of photovoltaic (PV) arrays has remained an open problem. As sensing and monitoring technology continues to improve, there is an opportunity to deploy sensors in PV arrays in order to improve their management. In this book, we examine the potential role of sensing and monitoring technology in a PV context, focusing on the areas of fault detection, topology optimization, and performance evaluation/data visualization. First, several types of commonly occurring PV array faults are considered and detection algorithms are described. Next, the potential for dynamic optimization of an array's topology is discussed, with a focus on mitigation of fault conditions and optimization of power output under non-fault conditions. Finally, monitoring system design considerations such as type and accuracy of measurements, sampling rate, and communication protocols are considered. It is our hope that the benefits of monitoring presen...

  17. Real-time Microseismic Processing for Induced Seismicity Hazard Detection

    Energy Technology Data Exchange (ETDEWEB)

    Matzel, Eric M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-10-31

    Induced seismicity is inherently associated with underground fluid injections. If fluids are injected in proximity to a pre-existing fault or fracture system, the resulting elevated pressures can trigger dynamic earthquake slip, which could both damage surface structures and create new migration pathways. The goal of this research is to develop a fundamentally better approach to geological site characterization and early hazard detection. We combine innovative techniques for analyzing microseismic data with a physics-based inversion model to forecast microseismic cloud evolution. The key challenge is that faults at risk of slipping are often too small to detect during the site characterization phase. Our objective is to devise fast-running methodologies that will allow field operators to respond quickly to changing subsurface conditions.

  18. Exceptional maintenance and anticipation. Hazard detection and processing methodology

    International Nuclear Information System (INIS)

    Nicot, P.; Mesnage, J.

    1994-01-01

    In order to minimize the consequences of major problems with nuclear reactor equipment, the basic principle to be applied is anticipation, which concerns the development of methods and tools and the related preventive actions. The example of the reactor vessel cover leakage problem in 1991, and its resolution, has led to a control, detection and replacement-to-survey strategy. Thus, a preventive industrial awareness strategy has been implemented for predictable and unpredictable problems.

  19. Signal processing techniques for damage detection with piezoelectric wafer active sensors and embedded ultrasonic structural radar

    Science.gov (United States)

    Yu, Lingyu; Bao, Jingjing; Giurgiutiu, Victor

    2004-07-01

    An embedded ultrasonic structural radar (EUSR) algorithm is developed that uses a piezoelectric wafer active sensor (PWAS) array to detect defects within a large area of a thin-plate specimen. Signal processing techniques are used to extract the time of flight of the wave packets, and thereby to determine the location of the defects with the EUSR algorithm. In our research, the transient tone-burst wave propagation signals are generated and collected by the embedded PWAS. Then, with signal processing, the frequency contents of the signals and the time of flight of individual frequencies are determined. This paper starts with an introduction to the embedded ultrasonic structural radar algorithm. We then describe the signal processing methods used to extract the time of flight of the wave packets. The signal processing methods used include wavelet denoising, cross-correlation, and the Hilbert transform. Though the hardware can provide an averaging function to eliminate noise arising in the signal collection process, wavelet denoising is included to ensure better signal quality for applications in real, severe environments. For better recognition of the time of flight, the cross-correlation method is used. The Hilbert transform is applied to the signals after cross-correlation in order to extract their envelope. Both the signal processing and EUSR are implemented in a user-friendly graphical interface program developed in LabVIEW. We conclude with a description of our vision for applying EUSR signal analysis to structural health monitoring and embedded nondestructive evaluation. To this end, we envisage an automatic damage detection application utilizing embedded PWAS, EUSR, and advanced signal processing.
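
    As an illustration of the signal chain just described, the following sketch cross-correlates a received trace with the excitation tone burst and then takes the Hilbert-transform envelope to locate the wave packet's time of flight. It runs on synthetic data with an assumed sampling rate and burst frequency; it is not the authors' LabVIEW implementation.

```python
import numpy as np
from scipy.signal import hilbert

fs = 10e6                       # sampling rate [Hz] (assumed)
f0 = 300e3                      # tone-burst centre frequency [Hz] (assumed)
n_cycles = 5

tb = np.arange(0, n_cycles / f0, 1 / fs)
burst = np.sin(2 * np.pi * f0 * tb) * np.hanning(tb.size)   # windowed tone burst

# Simulated received signal: the burst delayed by a known time of flight, plus noise
t = np.arange(0, 200e-6, 1 / fs)
tof_true = 60e-6
received = 0.05 * np.random.randn(t.size)
start = int(tof_true * fs)
received[start:start + burst.size] += 0.5 * burst

# Cross-correlation sharpens the wave packet against noise ...
xcorr = np.correlate(received, burst, mode="full")[burst.size - 1:]
# ... and the Hilbert-transform envelope strips the carrier oscillation
envelope = np.abs(hilbert(xcorr))
tof_est = np.argmax(envelope) / fs
print(f"estimated time of flight: {tof_est * 1e6:.1f} us (true: 60.0 us)")
```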

  20. An overview of the legislation and light microscopy for detection of processed animal proteins in feeds.

    Science.gov (United States)

    Liu, Xian; Han, Lujia; Veys, Pascal; Baeten, Vincent; Jiang, Xunpeng; Dardenne, Pierre

    2011-08-01

    Since the first cases of bovine spongiform encephalopathy (BSE) among cattle in the United Kingdom in 1986, the route of BSE infection has generally been believed to be feeds containing low levels of processed animal proteins (PAPs). Consequently, many feed bans were introduced worldwide and alternative and complementary techniques were developed as BSE safeguards. The feed bans are now expected to develop into a "species to species" ban, which requires corresponding species-specific identification methods. Currently, banned PAPs can be detected by various methods such as light microscopy, polymerase chain reaction, enzyme-linked immunosorbent assay, near infrared spectroscopy, and near infrared microscopy. Light microscopy, as described in the recent Commission Regulation EC/152/2009, is the only official method for the detection and characterization of PAPs in feed in the European Union. It is able to detect the presence of constituents of animal origin in feed at the level of 1 g/kg with hardly any false negatives. Nevertheless, light microscopy has the limitation of a lack of species specificity. This article presents a review of legislation on the use of PAPs in feedstuff and the details of detecting animal proteins by light microscopy, and also presents and discusses the analysis procedure and expected development of the technique. Copyright © 2010 Wiley-Liss, Inc.

  1. Detection and quantification of flow consistency in business process models

    DEFF Research Database (Denmark)

    Burattin, Andrea; Bernstein, Vered; Neurauter, Manuel

    2017-01-01

    Business process models abstract complex business processes by representing them as graphical models. Their layout, as determined by the modeler, may have an effect when these models are used. However, this effect is currently not fully understood. In order to systematically study this effect, a basic set of measurable key visual features is proposed, depicting the layout properties that are meaningful to the human user. The aim of this research is thus twofold: first, to empirically identify key visual features of business process models which are perceived as meaningful to the user and second, to show how such features can be quantified into computational metrics, which are applicable to business process models. We focus on one particular feature, consistency of flow direction, and show the challenges that arise when transforming it into a precise metric. We propose three different metrics...

  2. THE ANALYSIS OF DETECTIVE GENRE IN MEDIA STUDIES IN THE STUDENT AUDIENCE

    Directory of Open Access Journals (Sweden)

    Alexander Fedorov

    2011-11-01

    Full Text Available Developing skills for the critical analysis of media texts is an important task of media education. However, media literacy practice shows that students have problems with the discussion and analysis of entertainment genres in the early stages of media studies, for example, difficulties in understanding and interpreting the author's conception, plot and genre features. This article substantiates the methodological approaches to developing skills for the analysis of the detective/thriller genre in media studies in the student audience.

  3. Detecting fire in video stream using statistical analysis

    Directory of Open Access Journals (Sweden)

    Koplík Karel

    2017-01-01

    Full Text Available The real-time detection of fire in video streams is one of the most interesting problems in computer vision. In fact, in most cases it would be desirable to have a fire detection algorithm implemented in ordinary industrial cameras, or to be able to replace standard industrial cameras with ones implementing such an algorithm. In this paper, we present a new algorithm for detecting fire in video. The algorithm is based on tracking suspicious regions in time and statistical analysis of their trajectories. False alarms are minimized by combining multiple detection criteria: pixel brightness, the trajectories of suspicious regions (for evaluating the characteristic fire flickering) and the persistence of the alarm state in a sequence of frames. The resulting implementation is fast and can therefore run on a wide range of affordable hardware.
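
    The combination of criteria described above can be sketched as follows. This is a minimal illustration, not the authors' algorithm: the region tracker is assumed to exist elsewhere, and the brightness threshold, flicker band and persistence length are invented operating points.

```python
import numpy as np

def fire_alarm(region_frames, brightness_thr=200.0, flicker_band=(5.0, 15.0),
               fps=25, persist_frames=12):
    """Toy decision rule: a tracked region raises an alarm only if it is
    (1) bright, (2) flickers in a characteristic frequency band, and
    (3) persists over several frames. `region_frames` is a (T, H, W) array
    of grayscale crops of one tracked suspicious region; all thresholds
    are illustrative assumptions."""
    mean_intensity = region_frames.mean(axis=(1, 2))      # per-frame brightness

    bright = mean_intensity.mean() > brightness_thr

    # Flicker: dominant frequency of the brightness signal via FFT
    spectrum = np.abs(np.fft.rfft(mean_intensity - mean_intensity.mean()))
    freqs = np.fft.rfftfreq(mean_intensity.size, d=1.0 / fps)
    dominant = freqs[np.argmax(spectrum)]
    flickers = flicker_band[0] <= dominant <= flicker_band[1]

    # Persistence: the region stayed suspicious long enough
    persists = region_frames.shape[0] >= persist_frames

    return bright and flickers and persists
```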

  4. Steam leak detection method in pipeline using histogram analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Se Oh; Jeon, Hyeong Seop; Son, Ki Sung; Chae, Gyung Sun [Saean Engineering Corp, Seoul (Korea, Republic of); Park, Jong Won [Dept. of Information Communications Engineering, Chungnam NationalUnversity, Daejeon (Korea, Republic of)

    2015-10-15

    Leak detection in a pipeline usually involves acoustic emission sensors, such as contact-type sensors. These contact-type sensors are difficult to install and cannot operate in areas with high temperature and radiation. Therefore, many researchers have recently studied leak detection using a camera. Leak detection using a camera has the advantages of long-distance monitoring and wide-area surveillance. However, the conventional leak detection method using difference images often mistakes the vibration of a structure for a leak. In this paper, we propose a method for steam leakage detection using the moving average of difference images and histogram analysis. The proposed method can separate leakage from the vibration of a structure. The performance of the proposed method is verified by comparison with experimental results.
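
    A minimal sketch of the moving-average idea, under the assumption that oscillatory structural motion produces signed frame differences that largely cancel in a temporal average, while a steady plume leaves a persistent residual. Thresholds are illustrative, not the paper's calibrated values.

```python
import numpy as np

def leak_score(frames, window=10, diff_thr=8.0):
    """frames: (T, H, W) grayscale video as a NumPy array. Vibration tends
    to alternate sign frame-to-frame and cancel in the average, whereas a
    steady steam plume keeps the averaged difference image high near the
    leak. Returns the fraction of pixels with persistent change."""
    diffs = np.diff(frames.astype(np.float32), axis=0)   # successive differences
    avg = np.abs(diffs[-window:].mean(axis=0))           # moving-average image
    # Histogram-style decision: area of the frame with persistent change
    return (avg > diff_thr).mean()

# Alarm when the persistent-change area exceeds, say, 0.5% of the frame
# (an assumed operating point): leak_score(frames) > 0.005
```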

  5. Identifying Time Measurement Tampering in the Traversal Time and Hop Count Analysis (TTHCA Wormhole Detection Algorithm

    Directory of Open Access Journals (Sweden)

    Jonny Karlsson

    2013-05-01

    Full Text Available Traversal time and hop count analysis (TTHCA) is a recent wormhole detection algorithm for mobile ad hoc networks (MANET) which provides enhanced detection performance against all wormhole attack variants and network types. TTHCA involves each node measuring the processing time of routing packets during the route discovery process and then delivering the measurements to the source node. In a participation mode (PM) wormhole, where malicious nodes appear in the routing tables as legitimate nodes, the time measurements can potentially be altered, preventing TTHCA from successfully detecting the wormhole. This paper analyses the prevailing conditions for time tampering attacks to succeed for PM wormholes, before introducing an extension to the TTHCA detection algorithm called ∆T Vector, which is designed to identify time tampering while preserving low false positive rates. Simulation results confirm that the ∆T Vector extension is able to effectively detect time tampering attacks, thereby providing an important security enhancement to the TTHCA algorithm.

  6. Human detection and motion analysis at security points

    Science.gov (United States)

    Ozer, I. Burak; Lv, Tiehan; Wolf, Wayne H.

    2003-08-01

    This paper presents a real-time video surveillance system for the recognition of specific human activities. Specifically, the proposed automatic motion analysis is used as an on-line alarm system to detect abnormal situations in a campus environment. A smart multi-camera system developed at Princeton University is extended for use in smart environments in which the camera detects the presence of multiple persons as well as their gestures and their interaction in real-time.

  7. Detection of chromosome aberrations in tumors lineage after irradiation process

    International Nuclear Information System (INIS)

    Silva, Luciana Maria Silva; Campos, Tarcisio

    2002-01-01

    When radioresistant cancerous cells are irradiated at the level of a few Gy, the interactions may not generate visible changes in the morphology of the cells, or effects as intense as death after a few hours. The changes that will be observed depend on the combination of many factors that define the probability of cell survival in response to the physical dose applied. Genetic factors, such as the cell's sensitivity to irradiation, may affect the cell response; here, a cancerous cell line is studied when irradiated with Co-60 gamma rays. Besides the evaluation of the radiosensitivity of these cells when exposed to gamma irradiation, possible chromosomic aberrations and apoptosis were detected. (author)

  8. Detection of Epileptic Seizures with Multi-modal Signal Processing

    DEFF Research Database (Denmark)

    Conradsen, Isa

    convulsive seizures tested. Another study was performed, involving quantitative parameters in the time and frequency domain. The study showed, that there are several differences between tonic seizures and the tonic phase of GTC seizures and furthermore revealed differences of the epileptic (tonic and tonic...... phase of GTC) and simulated seizures. This was valuable information concerning a seizure detection algorithm, and the findings from this research provided evidence for a change in the definition of these seizures by the International League Against Epilepsy (ILAE). Our final study presents a novel...

  9. Improvement in Limit of Detection of Enzymatic Biogas Sensor Utilizing Chromatography Paper for Breath Analysis.

    Science.gov (United States)

    Motooka, Masanobu; Uno, Shigeyasu

    2018-02-02

    Breath analysis is considered to be an effective method for point-of-care diagnosis due to its noninvasiveness, quickness and simplicity. Gas sensors for breath analysis require the detection of low-concentration substances. In this paper, we propose that reduction of the background current improves the limit of detection of enzymatic biogas sensors utilizing chromatography paper. After clarifying the cause of the background current, we reduced it by improving the fabrication process of the paper-based sensors. Finally, we evaluated the limit of detection of the sensor with sample vapor of ethanol gas. The experiment showed about a 50% reduction in the limit of detection compared to the previously reported sensor. This result suggests the possibility of the sensor being applied in diagnosis, such as for diabetes, by further lowering the limit of detection.

  10. Integrated polymer waveguides for absorbance detection in chemical analysis systems

    DEFF Research Database (Denmark)

    Mogensen, Klaus Bo; El-Ali, Jamil; Wolff, Anders

    2003-01-01

    A chemical analysis system for absorbance detection with integrated polymer waveguides is reported for the first time. The fabrication procedure relies on structuring of a single layer of the photoresist SU-8, so both the microfluidic channel network and the optical components, which include planar....... The emphasis of this paper is on the signal-to-noise ratio of the detection and its relation to the sensitivity. Two absorbance cells with an optical path length of 100 μm and 1000 μm were characterized and compared in terms of sensitivity, limit of detection and effective path length for measurements...

  11. Flame analysis using image processing techniques

    Science.gov (United States)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

    This paper presents image processing techniques using fuzzy logic and a neural network approach to perform flame analysis. Flame diagnostics are important in industry for extracting relevant information from flame images. Experimental tests were carried out in a model industrial burner with different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermo-acoustic oscillations and background noise affect the stability of the flame. Flame velocity is one of the important characteristics that determine flame stability. In this paper, an image processing method is proposed to determine flame velocity. The power spectral density (PSD) graph is a good tool for vibration analysis, from which flame stability can be approximated. However, a more intelligent diagnostic system is needed to automatically determine flame stability. In this paper, flame features at different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.
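
    The PSD-based stability check mentioned above can be sketched as follows, here on a synthetic luminosity signal with an assumed frame rate and oscillation frequency; the fuzzy inference and neural network stages are not reproduced.

```python
import numpy as np
from scipy.signal import welch

# Estimate the power spectral density of a flame-luminosity time series
# extracted from successive images, as a simple stability indicator.
fps = 200.0                               # camera frame rate [Hz] (assumed)
t = np.arange(0, 10, 1 / fps)
# Mean luminous intensity per frame: stable base + thermo-acoustic oscillation
luminosity = 1.0 + 0.2 * np.sin(2 * np.pi * 30 * t) + 0.05 * np.random.randn(t.size)

f, psd = welch(luminosity, fs=fps, nperseg=512)
peak_freq = f[np.argmax(psd[1:]) + 1]     # skip the DC bin
print(f"dominant oscillation: {peak_freq:.1f} Hz")
# A pronounced narrow-band peak suggests thermo-acoustic instability;
# a flat spectrum suggests a stable flame.
```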

  12. URBAN DETECTION, DELIMITATION AND MORPHOLOGY: COMPARATIVE ANALYSIS OF SELECTIVE "MEGACITIES"

    Directory of Open Access Journals (Sweden)

    B. Alhaddad

    2012-08-01

    Full Text Available Over the last 50 years, the world has faced an impressive growth of urban population. The walled city (closed to the outside, an "island" of economic activity and population density within the rural land) has given way to the spread of urban life and urban networks across almost all the territory. There has been, as Margalef (1999) said, "a topological inversion of the landscape". The "urban" has gone from being an island in the ocean of rural land vastness to representing the totality of the space in which natural and rural "systems" are inserted. New phenomena such as the fall of the Fordist model of production, the spread of urbanization known as urban sprawl, and the change of scale of the metropolis, covering increasingly large regions called "megalopolis" (Gottmann, 1961), have characterized the century. However, there are no rigorous databases capable of measuring and evaluating the phenomenon of megacities and, in general, the process of urbanization in the contemporary world. The aim of this paper is to detect, identify and analyze the morphology of megacities through remote sensing instruments as well as various landscape indicators. To understand the structure of these heterogeneous landscapes called megacities, land consumption and spatial complexity need to be quantified accurately. Remote sensing might be helpful in evaluating how the different land covers shape urban megaregions. The morphological landscape analysis allows establishing the analogies and the differences between patterns of cities and studying the symmetry, growth direction, linearity, complexity and compactness of the urban form. The main objective of this paper is to develop a new methodology to detect the urbanized land of some megacities around the world (Tokyo, Mexico, Chicago, New York, London, Moscow, Sao Paulo and Shanghai) using Landsat 7 images.

  13. Preliminary Hazards Analysis Plasma Hearth Process

    International Nuclear Information System (INIS)

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P.

    1993-11-01

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standard DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. This PSAR then leads to the Final Safety Analysis Report, performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment.

  14. Pipeline leak detection and location by on-line-correlation with a process computer

    International Nuclear Information System (INIS)

    Siebert, H.; Isermann, R.

    1977-01-01

    A method for leak detection in pipelines using a correlation technique is described. For leak detection, and also for leak localisation and estimation of the leak flow, recursive estimation algorithms are used. The efficiency of the methods is demonstrated with a process computer and a pipeline model operating on-line. It is shown that very small leaks can be detected. (orig.) [de
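
    The correlation idea can be illustrated by locating a leak from the arrival-time difference of the leak-induced pressure disturbance at the two pipe ends. This is a sketch of the general principle only; the recursive estimation algorithms of the paper are not reproduced, and all parameters are placeholders.

```python
import numpy as np

def locate_leak(sig_in, sig_out, fs, wave_speed, pipe_length):
    """Estimate the leak position (measured from the inlet end) from the
    cross-correlation peak of the pressure signals recorded at the two
    pipe ends. A positive lag means the disturbance reached the inlet
    sensor later, so the leak is nearer the outlet. sig_in and sig_out
    are assumed to be equally long, synchronously sampled traces."""
    n = sig_in.size
    xcorr = np.correlate(sig_in - sig_in.mean(), sig_out - sig_out.mean(), "full")
    lag = (np.argmax(xcorr) - (n - 1)) / fs           # delay of inlet signal [s]
    # t_in - t_out = (2x - L) / c  =>  x = (L + c * lag) / 2
    return 0.5 * (pipe_length + wave_speed * lag)
```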

  15. Detection of wood failure by image processing method: influence of algorithm, adhesive and wood species

    Science.gov (United States)

    Lanying Lin; Sheng He; Feng Fu; Xiping Wang

    2015-01-01

    Wood failure percentage (WFP) is an important index for evaluating the bond strength of plywood. Currently, the method used for detecting WFP is visual inspection, which lacks efficiency. In order to improve it, image processing methods are applied to wood failure detection. The present study used thresholding and K-means clustering algorithms in wood failure detection...
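
    A minimal sketch of the K-means variant named above, assuming a two-class segmentation (wood fibre vs. adhesive) and using mean brightness to decide which cluster is wood, an assumption that depends on species and adhesive colour.

```python
import numpy as np
from sklearn.cluster import KMeans

def wood_failure_percentage(image_rgb):
    """Cluster the pixels of a sheared bond-surface image into two classes
    with K-means and report the wood-fibre area fraction in percent.
    Assumes the brighter cluster corresponds to exposed wood fibre."""
    pixels = image_rgb.reshape(-1, 3).astype(np.float32)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pixels)
    mean_brightness = [pixels[labels == k].mean() for k in (0, 1)]
    wood_label = int(np.argmax(mean_brightness))
    return 100.0 * (labels == wood_label).mean()
```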

  16. Thermoreflectance spectroscopy—Analysis of thermal processes in semiconductor lasers

    Science.gov (United States)

    Pierścińska, D.

    2018-01-01

    This review focuses on the theoretical foundations, experimental implementation and an overview of experimental results of thermoreflectance spectroscopy as a powerful technique for temperature monitoring and analysis of thermal processes in semiconductor lasers. This is an optical, non-contact, high spatial resolution technique providing high temperature resolution and mapping capabilities. Thermoreflectance is a thermometric technique based on measuring the relative change of reflectivity of the laser facet surface, which provides thermal images useful in hot-spot detection and reliability studies. In this paper, the principles and experimental implementation of the technique as a thermography tool are discussed. Some exemplary applications of TR to various types of lasers are presented, proving that the thermoreflectance technique provides new insight into heat management problems in semiconductor lasers and, in particular, that it allows studying the thermal degradation processes occurring at laser facets. Additionally, thermal processes and the basic mechanisms of degradation of semiconductor lasers are discussed.

  17. Information Design for “Weak Signal” detection and processing in Economic Intelligence: A case study on Health resources

    Directory of Open Access Journals (Sweden)

    Sahbi Sidhom

    2011-12-01

    Full Text Available The topics of this research cover all phases of "Information Design" applied to detecting and profiting from weak signals in economic intelligence (EI) or business intelligence (BI). The field of information design (ID) concerns the process of translating complex, unorganized or unstructured data into valuable and meaningful information. ID practice requires an interdisciplinary approach, combining skills in graphic design (writing, analysis processing and editing), human performance technology and human factors. Applied in the context of an information system, it allows end-users to easily detect implicit topics known as "weak signals" (WS). In our approach to implementing ID, the process covers the development of a knowledge management (KM) process in the context of EI. A case study concerning the monitoring of health resources information is presented, using ID processes to outline weak signals. Both French and American bibliographic databases were used to make the connection to multilingual concepts in the health watch process.

  18. People detection in nuclear plants by video processing for safety purpose

    International Nuclear Information System (INIS)

    Jorge, Carlos Alexandre F.; Mol, Antonio Carlos A.; Seixas, Jose M.; Silva, Eduardo Antonio B.; Cota, Raphael E.; Ramos, Bruno L.

    2011-01-01

    This work describes the development of a surveillance system for safety purposes in nuclear plants. The final objective is to track people online in videos, in order to estimate the dose received by personnel during the execution of working tasks in nuclear plants. The estimation will be based on their tracked positions and on dose rate mapping in a real nuclear plant at Instituto de Engenharia Nuclear, the Argonauta nuclear research reactor. Cameras have been installed within Argonauta's room, supplying the data needed. Both video processing and statistical signal processing techniques may be used for the detection, segmentation and tracking of people in video. This first paper reports people segmentation in video using background subtraction, by two different approaches, namely frame differences and blind signal separation based on the independent component analysis method. Results are discussed, along with perspectives for further work. (author)
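
    The frame-difference approach named above can be sketched in a few lines; the threshold and background adaptation rate are assumed values, and the ICA-based blind source separation variant is not shown.

```python
import numpy as np

def update_background(background, frame, alpha=0.02):
    """Exponential running average keeps the background model current;
    alpha is an assumed adaptation rate."""
    return (1.0 - alpha) * background + alpha * frame.astype(np.float32)

def segment_people(frame, background, thr=25.0):
    """Mark pixels whose grayscale difference from the background model
    exceeds an (assumed) threshold; returns a boolean foreground mask.
    Morphological clean-up would normally follow."""
    diff = np.abs(frame.astype(np.float32) - background)
    return diff > thr
```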

  19. People detection in nuclear plants by video processing for safety purpose

    Energy Technology Data Exchange (ETDEWEB)

    Jorge, Carlos Alexandre F.; Mol, Antonio Carlos A., E-mail: calexandre@ien.gov.b, E-mail: mol@ien.gov.b [Instituto de Engenharia Nuclear (IEN/CNEN), Rio de Janeiro, RJ (Brazil); Seixas, Jose M.; Silva, Eduardo Antonio B., E-mail: seixas@lps.ufrj.b, E-mail: eduardo@lps.ufrj.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Eletrica; Cota, Raphael E.; Ramos, Bruno L., E-mail: brunolange@poli.ufrj.b [Universidade Federal do Rio de Janeiro (EP/UFRJ), RJ (Brazil). Dept. de Engenharia Eletronica e de Computacao

    2011-07-01

    This work describes the development of a surveillance system for safety purposes in nuclear plants. The final objective is to track people online in videos, in order to estimate the dose received by personnel during the execution of working tasks in nuclear plants. The estimation will be based on their tracked positions and on dose rate mapping in a real nuclear plant at Instituto de Engenharia Nuclear, the Argonauta nuclear research reactor. Cameras have been installed within Argonauta's room, supplying the data needed. Both video processing and statistical signal processing techniques may be used for the detection, segmentation and tracking of people in video. This first paper reports people segmentation in video using background subtraction, by two different approaches, namely frame differences and blind signal separation based on the independent component analysis method. Results are discussed, along with perspectives for further work. (author)

  20. POST-PROCESSING ANALYSIS FOR THC SEEPAGE

    International Nuclear Information System (INIS)

    SUN, Y.

    2004-01-01

    This report describes the selection of water compositions for the total system performance assessment (TSPA) model of results from the thermal-hydrological-chemical (THC) seepage model documented in ''Drift-Scale THC Seepage Model'' (BSC 2004 [DIRS 169856]). The selection has been conducted in accordance with ''Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Post-Processing Analysis for THC Seepage) Report Integration'' (BSC 2004 [DIRS 171334]). This technical work plan (TWP) was prepared in accordance with AP-2.27Q, ''Planning for Science Activities''. Section 1.2.3 of the TWP describes planning information pertaining to the technical scope, content, and management of this report. The post-processing analysis for THC seepage (THC-PPA) documented in this report provides a methodology for evaluating the near-field compositions of water and gas around a typical waste emplacement drift as these relate to the chemistry of seepage, if any, into the drift. The THC-PPA inherits the conceptual basis of the THC seepage model, but is an independently developed process. The relationship between the post-processing analysis and other closely related models, together with their main functions in providing seepage chemistry information for the Total System Performance Assessment for the License Application (TSPA-LA), are illustrated in Figure 1-1. The THC-PPA provides a data selection concept and direct input to the physical and chemical environment (P and CE) report that supports the TSPA model. The purpose of the THC-PPA is further discussed in Section 1.2. The data selection methodology of the post-processing analysis (Section 6.2.1) was initially applied to results of the THC seepage model as presented in ''Drift-Scale THC Seepage Model'' (BSC 2004 [DIRS 169856]). Other outputs from the THC seepage model (DTN: LB0302DSCPTHCS.002 [DIRS 161976]) used in the P and CE (BSC 2004 [DIRS 169860

  1. Multicriteria Similarity-Based Anomaly Detection Using Pareto Depth Analysis.

    Science.gov (United States)

    Hsiao, Ko-Jen; Xu, Kevin S; Calder, Jeff; Hero, Alfred O

    2016-06-01

    We consider the problem of identifying patterns in a data set that exhibit anomalous behavior, often referred to as anomaly detection. Similarity-based anomaly detection algorithms detect abnormally large amounts of similarity or dissimilarity, e.g., as measured by the nearest neighbor Euclidean distances between a test sample and the training samples. In many application domains, there may not exist a single dissimilarity measure that captures all possible anomalous patterns. In such cases, multiple dissimilarity measures can be defined, including nonmetric measures, and one can test for anomalies by scalarizing using a nonnegative linear combination of them. If the relative importance of the different dissimilarity measures is not known in advance, as in many anomaly detection applications, the anomaly detection algorithm may need to be executed multiple times with different choices of weights in the linear combination. In this paper, we propose a method for similarity-based anomaly detection using a novel multicriteria dissimilarity measure, the Pareto depth. The proposed Pareto depth analysis (PDA) anomaly detection algorithm uses the concept of Pareto optimality to detect anomalies under multiple criteria without having to run an algorithm multiple times with different choices of weights. The proposed PDA approach is provably better than using linear combinations of the criteria, and shows superior performance in experiments with synthetic and real data sets.
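
    The core of PDA, assigning depths by peeling Pareto fronts, can be sketched as follows. The scoring demo is a simplified reading of the method (dyads drawn synthetically), not the authors' exact construction.

```python
import numpy as np

def pareto_depth(points):
    """Front index (1 = non-dominated) of each point under minimization of
    all coordinates, computed by repeatedly peeling off Pareto fronts.
    O(n^2) per front -- fine for an illustration."""
    n = points.shape[0]
    depth = np.zeros(n, dtype=int)
    remaining = np.arange(n)
    front = 1
    while remaining.size:
        pts = points[remaining]
        # A point is dominated if some other point is <= in all coordinates
        # and < in at least one
        dominated = np.array([
            np.any(np.all(pts <= p, axis=1) & np.any(pts < p, axis=1))
            for p in pts])
        depth[remaining[~dominated]] = front
        remaining = remaining[dominated]
        front += 1
    return depth

# Toy scoring: dyads are pairs (d1(x, x_i), d2(x, x_i)) of two dissimilarities
# between a sample and its neighbours; a sample whose dyads sit on deep fronts
# relative to the training dyads is flagged as anomalous.
rng = np.random.default_rng(0)
train_dyads = rng.exponential(1.0, size=(500, 2))     # nominal dyads
test_dyad = np.array([[4.0, 3.5]])                    # large under both criteria
depths = pareto_depth(np.vstack([train_dyads, test_dyad]))
print("test dyad depth:", depths[-1], "of max", depths.max())
```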

  2. TB case detection in Tajikistan – analysis of existing obstacles

    Directory of Open Access Journals (Sweden)

    Alexei Korobitsyn

    2013-10-01

    Full Text Available Background: Tajikistan National TB Control Program. Objectives: (1) To identify the main obstacles to increasing TB detection in Tajikistan. (2) To identify interventions that improve TB detection. Methods: Review of the available original research data, the health normative base, health systems performance and national economic data, following the WHO framework for the detection of TB cases, which is based on three scenarios of why incident cases of TB may not be notified. Results: Data analysis revealed that some aspects of TB case detection are more problematic than others and that there are gaps in the knowledge of specific obstacles to TB case detection. The phenomenon of "initial default" in Tajikistan has been documented; however, it needs to be studied further. The laboratory services detect infectious TB cases effectively; however, referrals of appropriate suspects for TB diagnosis may lag behind. Knowledge about TB in the general population has improved. Yet the problem of TB-related stigma persists, remaining an obstacle to effective TB detection. The high economic cost of health services, driven by under-the-table payments, was identified as another barrier to access to health services. Conclusion: Health system strengthening should become a primary intervention to improve case detection in Tajikistan. More research is needed on the reasons contributing to the failure to register TB cases, as well as on the factors underlying stigma.

  3. Sustainable process design & analysis of hybrid separations

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Befort, Bridgette; Garg, Nipun

    2016-01-01

    Distillation is an energy intensive operation in chemical process industries. There are around 40,000 distillation columns in operation in the US, requiring approximately 40% of the total energy consumption in US chemical process industries. However, analysis of separations by distillation has...... shown that more than 50% of energy is spent in purifying the last 5-10% of the distillate product. Membrane modules on the other hand can achieve high purity separations at lower energy costs, but if the flux is high, it requires large membrane area. A hybrid scheme where distillation and membrane...... modules are combined such that each operates at its highest efficiency, has the potential for significant energy reduction without significant increase of capital costs. This paper presents a method for sustainable design of hybrid distillation-membrane schemes with guaranteed reduction of energy...

  4. Warpage analysis in injection moulding process

    Science.gov (United States)

    Hidayah, M. H. N.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.

    2017-09-01

    This study concentrated on the effects of process parameters in the plastic injection moulding process on the warpage problem, using Autodesk Moldflow Insight (AMI) software for the simulation. In this study, a plastic dental-floss dispenser was analysed, with the thermoplastic Polypropylene (PP) used as the moulded material; the detailed properties of an 80-tonne Nessei NEX 1000 injection moulding machine were also used. The variable parameters of the process are packing pressure, packing time, melt temperature and cooling time. Minimization of warpage was obtained from the optimization and analysis of data from the Design Expert software. The method used in this study integrates Response Surface Methodology (RSM) and Central Composite Design (CCD) with polynomial models obtained from the Design of Experiments (DOE). The results show that packing pressure is the main factor contributing to the formation of warpage in the x-axis and y-axis, while in the z-axis the main factor is melt temperature; packing time is the least significant of the four parameters in the x-, y- and z-axes. With the optimal processing parameters, the warpage in the x-, y- and z-axes was reduced by 21.60%, 26.45% and 24.53%, respectively.

  5. Detection and quantification of flow consistency in business process models.

    Science.gov (United States)

    Burattin, Andrea; Bernstein, Vered; Neurauter, Manuel; Soffer, Pnina; Weber, Barbara

    2018-01-01

    Business process models abstract complex business processes by representing them as graphical models. Their layout, as determined by the modeler, may have an effect when these models are used. However, this effect is currently not fully understood. In order to systematically study this effect, a basic set of measurable key visual features is proposed, depicting the layout properties that are meaningful to the human user. The aim of this research is thus twofold: first, to empirically identify key visual features of business process models which are perceived as meaningful to the user and second, to show how such features can be quantified into computational metrics, which are applicable to business process models. We focus on one particular feature, consistency of flow direction, and show the challenges that arise when transforming it into a precise metric. We propose three different metrics addressing these challenges, each following a different view of flow consistency. We then report the results of an empirical evaluation, which indicates which metric is more effective in predicting the human perception of this feature. Moreover, two other automatic evaluations describing the performance and the computational capabilities of our metrics are reported as well.

  6. Detection of optimum maturity of maize using image processing and ...

    African Journals Online (AJOL)

    A CCD camera for image acquisition of the different green colorations of the maize leaves at maturity was used. Different color features were extracted from the image processing system (MATLAB) and used as inputs to the artificial neural network that classify different levels of maturity. Keywords: Maize, Maturity, CCD ...

  7. Auditory Processing Speed and Signal Detection in Schizophrenia

    Science.gov (United States)

    Korboot, P. J.; Damiani, N.

    1976-01-01

    Two differing explanations of schizophrenic processing deficit were examined: Chapman and McGhie's and Yates'. Thirty-two schizophrenics, classified on the acute-chronic and paranoid-nonparanoid dimensions, and eight neurotics were tested on two dichotic listening tasks. (Editor)

  8. Mathematical foundations of image processing and analysis

    CERN Document Server

    Pinoli, Jean-Charles

    2014-01-01

    Mathematical Imaging is currently a rapidly growing field in applied mathematics, with an increasing need for theoretical mathematics. This book, the second of two volumes, emphasizes the role of mathematics as a rigorous basis for imaging sciences. It provides a comprehensive and convenient overview of the key mathematical concepts, notions, tools and frameworks involved in the various fields of gray-tone and binary image processing and analysis, by proposing a large, but coherent, set of symbols and notations, a complete list of subjects and a detailed bibliography. It establishes a bridg

  9. Microstructuring of piezoresistive cantilevers for gas detection and analysis

    International Nuclear Information System (INIS)

    Sarov, Y.; Sarova, V.; Bitterlich, Ch.; Richter, O.; Guliyev, E.; Zoellner, J.-P.; Rangelow, I. W.; Andok, R.; Bencurova, A.

    2011-01-01

    In this work we report on the design and fabrication of cantilevers for gas detection and analysis. The cantilevers have an expanded area of interaction with the gas, while signal transduction is realized by an integrated piezoresistive deflection sensor placed at the narrowed cantilever base, where the stress along the cantilever is highest. Moreover, the cantilevers have integrated bimorph micro-actuators, enabling detection in both static and dynamic modes. The cantilevers are feasible as pressure, temperature and flow sensors and, under chemical functionalization, for gas recognition, tracing and composition analysis. (authors)

  10. Management of vacuum leak-detection processes, calibration, and standards

    International Nuclear Information System (INIS)

    Wilson, N.G.

    1985-01-01

    Vacuum leak detection requires integrated management action to ensure the successful production of apparatus having the required leak tightness. Implementation of properly planned, scheduled, and engineered procedures and test arrangements is an absolute necessity to prevent unexpected, impractical, technically inadequate, or unnecessarily costly incidents in leak-testing operations. The use of standard procedures, leak standards appropriate to the task, and accurate calibration systems or devices is necessary to validate the integrity of any leak-test procedure. In this paper, the need for implementing these practices is discussed using case histories of typical examples of large complex vacuum systems. Aggressive management practices are of primary importance throughout a project's life cycle to ensure the lowest cost; this includes successful leak testing of components. It should be noted that the opinions and conclusions expressed in this paper are those of the author and are not those of the Los Alamos National Laboratory or the Department of Energy.

  11. Application of 241Am EDXRF in detecting and controlling of rare earth separation process by solvent extraction

    International Nuclear Information System (INIS)

    Yan Chunhua; Jia Jiangtao; Liao Chunsheng; Wang Mingwen; Li Biaoguo; Xu Guangxian

    1996-01-01

    This article investigates a fast EDXRF analysis method with radioisotope excitation (241Am), employing a high-purity germanium detector, for the rare earth separation process by solvent extraction. Applying the method, hydrochloride aqueous samples from SeEuGd/Tb/Dy separation processes were analyzed off-line. Comparative results measured by ICP are also given. The results show that the method can be used over a wide rare earth concentration range with low error. Being fast, effective, precise and non-destructive, it can be used for on-line analysis to monitor and control the rare earth separation process by solvent extraction.

  12. Fast and objective detection and analysis of structures in downhole images

    Science.gov (United States)

    Wedge, Daniel; Holden, Eun-Jung; Dentith, Mike; Spadaccini, Nick

    2017-09-01

    Downhole acoustic and optical televiewer images, and formation microimager (FMI) logs are important datasets for structural and geotechnical analyses for the mineral and petroleum industries. Within these data, dipping planar structures appear as sinusoids, often in incomplete form and in abundance. Their detection is a labour intensive and hence expensive task and as such is a significant bottleneck in data processing as companies may have hundreds of kilometres of logs to process each year. We present an image analysis system that harnesses the power of automated image analysis and provides an interactive user interface to support the analysis of televiewer images by users with different objectives. Our algorithm rapidly produces repeatable, objective results. We have embedded it in an interactive workflow to complement geologists' intuition and experience in interpreting data to improve efficiency and assist, rather than replace the geologist. The main contributions include a new image quality assessment technique for highlighting image areas most suited to automated structure detection and for detecting boundaries of geological zones, and a novel sinusoid detection algorithm for detecting and selecting sinusoids with given confidence levels. Further tools are provided to perform rapid analysis of and further detection of structures e.g. as limited to specific orientations.
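
    In an unwrapped borehole image, a planar structure traces one period of a sinusoid, so detection amounts to finding and fitting sinusoids. The following sketch fits synthetic picks by least squares and converts the amplitude to an apparent dip; the paper's detector and confidence-level selection are not reproduced, and the borehole radius is an assumed value.

```python
import numpy as np
from scipy.optimize import curve_fit

def sinusoid(az, d0, amp, phi):
    # One period per borehole circumference: depth(azimuth) of a planar feature
    return d0 + amp * np.sin(az + phi)

# Synthetic picks along the unwrapped image (36 azimuth columns, depth in metres)
azimuth = np.linspace(0, 2 * np.pi, 36, endpoint=False)
picks = sinusoid(azimuth, 120.0, 0.05, 0.8) + 0.005 * np.random.randn(azimuth.size)

(p_d0, p_amp, p_phi), _ = curve_fit(sinusoid, azimuth, picks,
                                    p0=[picks.mean(), 0.01, 0.0])
borehole_radius = 0.1                     # metres (assumed)
# Amplitude = radius * tan(dip), so the dip angle follows directly
dip_deg = np.degrees(np.arctan(abs(p_amp) / borehole_radius))
print(f"fitted amplitude {p_amp:.3f} m -> apparent dip {dip_deg:.1f} deg")
```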

  13. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform

    Science.gov (United States)

    2018-01-01

    ARL-TR-8270, January 2018, US Army Research Laboratory. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform, by Kwok F Tom, Sensors and Electron... Reporting period: 1 October 2016–30 September 2017.

  14. Accelerating Malware Detection via a Graphics Processing Unit

    Science.gov (United States)

    2010-09-01

    ...operating systems for the future [Szo05]. The PE format is an updated version of the common object file format (COFF) [Mic06]. Microsoft released a new...NAs02]. These alerts can be costly in terms of time and resources for individuals and organizations to investigate each misidentified file [YWL07] [Vak10

  15. Sociolinguistically Informed Natural Language Processing: Automating Irony Detection

    Science.gov (United States)

    2017-10-23

    ...interaction feature using the entire training dataset, and repeated this process 100 times to account for variation due to the SGD procedure. Table 6...Levy and Goldberg, 2014). We parsed the ukWaC corpus (Baroni et al., 2009) using the Stanford Dependency Parser v3.5.2 with Stanford Dependencies...bitrary and variable sizes. We pre-trained our own syntactic embeddings following (Levy and Goldberg, 2014).

  16. Fast Change Point Detection for Electricity Market Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Berkeley, UC; Gu, William; Choi, Jaesik; Gu, Ming; Simon, Horst; Wu, Kesheng

    2013-08-25

    Electricity is a vital part of our daily life; therefore it is important to avoid irregularities such as the California Electricity Crisis of 2000 and 2001. In this work, we seek to predict anomalies using advanced machine learning algorithms. These algorithms are effective, but computationally expensive, especially if we plan to apply them on hourly electricity market data covering a number of years. To address this challenge, we significantly accelerate the computation of the Gaussian Process (GP) for time series data. In the context of a Change Point Detection (CPD) algorithm, we reduce its computational complexity from O(n^5) to O(n^2). Our efficient algorithm makes it possible to compute the Change Points using the hourly price data from the California Electricity Crisis. By comparing the detected Change Points with known events, we show that the Change Point Detection algorithm is indeed effective in detecting signals preceding major events.

  17. Guided Wave Delamination Detection and Quantification With Wavefield Data Analysis

    Science.gov (United States)

    Tian, Zhenhua; Campbell Leckey, Cara A.; Seebo, Jeffrey P.; Yu, Lingyu

    2014-01-01

    Unexpected damage can occur in aerospace composites due to impact events or material stress during off-nominal loading events. In particular, laminated composites are susceptible to delamination damage due to weak transverse tensile and inter-laminar shear strengths. Development of reliable and quantitative techniques to detect delamination damage in laminated composites is imperative for safe and functional optimally-designed next-generation composite structures. In this paper, we investigate guided wave interactions with delamination damage and develop quantification algorithms by using wavefield data analysis. The trapped guided waves in the delamination region are observed from the wavefield data and further quantitatively interpreted by using different wavenumber analysis methods. The frequency-wavenumber representation of the wavefield shows that new wavenumbers are present and correlate to trapped waves in the damage region. These new wavenumbers are used to detect and quantify the delamination damage through the wavenumber analysis, which can show how the wavenumber changes as a function of wave propagation distance. The location and spatial duration of the new wavenumbers can be identified, providing a useful means not only for detecting the presence of delamination damage but also for estimating the delamination size. Our method has been applied to detect and quantify real delamination damage with complex geometry (grown using a quasi-static indentation technique). The detection and quantification results show the location, size, and shape of the delamination damage.
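
    The frequency-wavenumber representation mentioned above is, at its core, a two-dimensional Fourier transform of the space-time wavefield. A sketch on a synthetic single-mode wavefield, with all parameters assumed:

```python
import numpy as np

fs = 1e6                      # temporal sampling [Hz]
dx = 1e-3                     # spatial sampling [m]
nt, nx = 1024, 256
t = np.arange(nt) / fs
x = np.arange(nx) * dx

f0, c = 100e3, 3000.0         # excitation frequency [Hz], phase velocity [m/s]
k0 = 2 * np.pi * f0 / c       # wavenumber [rad/m]
u = np.sin(k0 * x[None, :] - 2 * np.pi * f0 * t[:, None])   # (t, x) wavefield

U = np.fft.fftshift(np.abs(np.fft.fft2(u)))                 # (f, k) magnitude
freqs = np.fft.fftshift(np.fft.fftfreq(nt, d=1 / fs))
wavenums = np.fft.fftshift(np.fft.fftfreq(nx, d=dx)) * 2 * np.pi

# The ridge of U sits near (f0, k0); in a delaminated plate, waves trapped in
# the damage region add a second ridge at a new wavenumber, which short-space
# windowed transforms can localize along x.
fi, ki = np.unravel_index(np.argmax(U), U.shape)
print(f"peak at f = {abs(freqs[fi]) / 1e3:.0f} kHz, k = {abs(wavenums[ki]):.0f} rad/m")
```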

  18. Experimental investigation of thermal neutron analysis based landmine detection technology

    International Nuclear Information System (INIS)

    Zeng Jun; Chu Chengsheng; Ding Ge; Xiang Qingpei; Hao Fanhua; Luo Xiaobing

    2013-01-01

    Background: Recently, the prompt gamma-ray neutron activation analysis method has been widely used in coal analysis and explosive detection; however, there have been fewer applications of neutron methods to landmine detection, especially in domestic research. Purpose: To verify the feasibility of the Thermal Neutron Analysis (TNA) method for landmine detection, and to explore the characteristics of this technology. Methods: An experimental TNA landmine detection system was built based on a LaBr3(Ce) fast scintillator detector and a 252Cf isotope neutron source. The system comprises the thermal neutron transition system, the shield system, and the detector system. Results: On the basis of TNA, the wide-energy-area calibration method, especially for the high-energy area, was investigated, and the minimum detection time for a typical mine was defined. In this study, the 72-type anti-tank mine, a 500 g TNT sample and several interfering objects were tested in loess, red soil, magnetic soil and sand, respectively. Conclusions: The experimental results indicate that TNA is a reliable demining method, and it can be used to confirm the existence of Anti-Tank Mines (ATM) and large Anti-Personnel Mines (APM) in complicated conditions. (authors)

  19. Multisensor Network System for Wildfire Detection Using Infrared Image Processing

    Directory of Open Access Journals (Sweden)

    I. Bosch

    2013-01-01

    Full Text Available This paper presents the next step in the evolution of multi-sensor wireless network systems for the early automatic detection of forest fires. This network allows remote monitoring of each of the locations as well as communication between each of the sensors and with the control stations. The result is an increased coverage area, with quicker and safer responses. To determine the presence of a forest wildfire, the system employs decision fusion in thermal imaging, which can exploit various expected characteristics of a real fire, including short-term persistence and long-term increases over time. Results from testing in the laboratory and in a real environment are presented to authenticate and verify the accuracy of the operation of the proposed system. The system performance is gauged by the number of alarms and the time to the first alarm (corresponding to a real fire), for different probabilities of false alarm (PFA). The necessity of including decision fusion is thereby demonstrated.

  20. A practical approach to tramway track condition monitoring: vertical track defects detection and identification using time-frequency processing technique

    Directory of Open Access Journals (Sweden)

    Bocz Péter

    2018-03-01

    Full Text Available This paper presents an automatic method for detecting vertical track irregularities in tramway operation using acceleration measurements on trams. For the monitoring of tramway tracks, an unconventional measurement setup was developed, which records the data of 3-axis wireless accelerometers mounted on the wheel discs. The accelerations are processed to obtain the vertical track irregularities and to determine whether the track needs to be repaired. The automatic detection algorithm is based on time-frequency distribution analysis and determines the defect locations. Admissible limits (thresholds) are given for detecting moderate and severe defects using statistical analysis. The method was validated on busy tram lines in Budapest and accurately detected severe defects with a hit rate of 100% and no false alarms. The methodology is also sensitive to moderate and small rail surface defects at low operational speed.
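
    A sketch of the time-frequency detection idea: flag time bins of a spectrogram whose band energy exceeds a threshold. The frequency band and threshold here are assumptions, not the paper's calibrated limits.

```python
import numpy as np
from scipy.signal import spectrogram

def defect_candidates(accel_z, fs, energy_thr):
    """Compute a spectrogram of the vertical wheel-mounted acceleration and
    return the times whose high-frequency (impact-band) energy exceeds a
    threshold; together with the tram speed these map to track positions."""
    f, t, Sxx = spectrogram(accel_z, fs=fs, nperseg=256, noverlap=128)
    band = (f >= 50) & (f <= 200)              # impact band [Hz] (assumed)
    band_energy = Sxx[band].sum(axis=0)
    return t[band_energy > energy_thr]         # times of suspected defects
```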

  1. Detection of charged particles through a photodiode: design and analysis

    International Nuclear Information System (INIS)

    Angoli, A.; Quirino, L.L.; Hernandez, V.M.; Lopez del R, H.; Mireles, F.; Davila, J.I.; Rios, C.; Pinedo, J.L.

    2006-01-01

    This project develops and constructs a charged-particle detector by means of a PIN photodiode array; the design and analysis use a silicon PIN photodiode that is generally used to detect visible light. Its good efficiency, compact size and reduced cost specifically allow its use in radiation monitoring and alpha-particle detection. Here we present the design of the detection system as well as its characterization for alpha particles, reporting the alpha energy resolution and detection efficiency. The equipment used in this work consists of a triple alpha-particle source composed of Am-241, Pu-239 and Cm-244 with a total activity of 5.55 kBq, the Maestro 32 software made by ORTEC, a Triumph multi-channel card from ORTEC, and one low-activity electroplated uranium sample. (Author)

  2. Live face detection based on the analysis of Fourier spectra

    Science.gov (United States)

    Li, Jiangwei; Wang, Yunhong; Tan, Tieniu; Jain, Anil K.

    2004-08-01

    Biometrics is a rapidly developing technology for identifying a person based on his or her physiological or behavioral characteristics. To ensure the correctness of authentication, the biometric system must be able to detect and reject the use of a copy of a biometric instead of the live biometric. This function is usually termed "liveness detection". This paper describes a new method for live face detection. Using structure and movement information of the live face, an effective live face detection algorithm is presented. Compared to existing approaches, which concentrate on the measurement of 3D depth information, this method is based on the analysis of Fourier spectra of a single face image or face image sequences. Experimental results show that the proposed method has an encouraging performance.
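
    The spectral cue can be sketched as a single descriptor: the fraction of Fourier energy outside a low-frequency disc, which is expected to be lower for printed or displayed copies than for live faces. The radius fraction and any decision threshold are illustrative assumptions.

```python
import numpy as np

def high_frequency_descriptor(face_gray, radius_frac=0.25):
    """Proportion of 2-D Fourier energy outside a centred low-frequency
    disc of a grayscale face crop. Photographs of faces tend to lose fine
    detail, so a live face is expected to score higher; thresholding this
    value (or tracking it over a sequence) gives a simple liveness test."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(face_gray.astype(np.float32))))
    h, w = spec.shape
    yy, xx = np.ogrid[:h, :w]
    r = np.hypot(yy - h / 2, xx - w / 2)
    low = r <= radius_frac * min(h, w) / 2
    total = spec.sum()
    return spec[~low].sum() / total if total > 0 else 0.0
```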

  3. Detection of Abnormal Events via Optical Flow Feature Analysis

    Directory of Open Access Journals (Sweden)

    Tian Wang

    2015-03-01

    Full Text Available In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of the optical flow orientation descriptor and the classification method. The details of the histogram of the optical flow orientation descriptor are illustrated for describing the movement information of the global video frame or the foreground frame. By combining the one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The differing abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm.
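
    A minimal sketch of the descriptor-plus-classifier pipeline, assuming OpenCV's Farneback optical flow and scikit-learn's one-class SVM; the kernel PCA stage is omitted, and `normal_pairs` is a placeholder for grayscale frame pairs from normal training video.

```python
import numpy as np
import cv2
from sklearn.svm import OneClassSVM

def hof_descriptor(prev_gray, gray, bins=8):
    """Histogram of optical-flow orientations for one frame pair, weighted
    by flow magnitude - a minimal version of the descriptor named above."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    hist, _ = np.histogram(ang, bins=bins, range=(0, 2 * np.pi), weights=mag)
    return hist / (hist.sum() + 1e-9)

# Train on descriptors from normal video only, then flag outliers:
# normal = np.stack([hof_descriptor(f0, f1) for f0, f1 in normal_pairs])
# model = OneClassSVM(nu=0.05, kernel="rbf", gamma="scale").fit(normal)
# is_abnormal = model.predict(test_descriptors) == -1
```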

  4. Detection of Abnormal Events via Optical Flow Feature Analysis

    Science.gov (United States)

    Wang, Tian; Snoussi, Hichem

    2015-01-01

    In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of the optical flow orientation descriptor and the classification method. The details of the histogram of the optical flow orientation descriptor are illustrated for describing the movement information of the global video frame or the foreground frame. By combining the one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The differing abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227

  5. Low-power signal processing devices for portable ECG detection.

    Science.gov (United States)

    Lee, Shuenn-Yuh; Cheng, Chih-Jen; Wang, Cheng-Pin; Kao, Wei-Chun

    2008-01-01

    An analog front end for diagnosing and monitoring the behavior of the heart is presented. This sensing front end has two low-power processing devices: a 5th-order Butterworth operational transconductance-C (OTA-C) filter and an 8-bit successive approximation analog-to-digital converter (SAADC). The components, fabricated in a 0.18-μm CMOS technology, feature power consumptions of 453 nW (filter) and 940 nW (ADC) at a supply voltage of 1 V, respectively. The system specifications in terms of output noise and linearity associated with the two integrated circuits are described in this paper.

  6. Using recurrence plot analysis for software execution interpretation and fault detection

    Science.gov (United States)

    Mosdorf, M.

    2015-09-01

    This paper presents a method for software execution interpretation and fault detection using recurrence plot analysis. In the proposed approach, recurrence plot analysis is applied to a software execution trace that contains the executed assembly instructions. Results of this analysis are subject to further processing with the PCA (Principal Component Analysis) method, which reduces the number of coefficients used for software execution classification. This method was used for the analysis of five algorithms: Bubble Sort, Quick Sort, Median Filter, FIR, SHA-1. Results show that some of the collected traces can be easily assigned to particular algorithms (traces from the Bubble Sort and FIR algorithms) while others are more difficult to distinguish.
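
    A recurrence plot reduces to a thresholded pairwise-distance matrix, as sketched below. Encoding each executed instruction as a number is an assumption of this illustration, and only the simplest quantification (recurrence rate) is shown.

```python
import numpy as np

def recurrence_plot(trace, eps):
    """Binary recurrence matrix R[i, j] = 1 when states i and j of the
    execution trace are closer than eps. For an instruction trace, the
    'state' can be a numeric encoding of each executed opcode (an
    assumption made for this sketch)."""
    x = np.asarray(trace, dtype=float)
    dist = np.abs(x[:, None] - x[None, :])
    return (dist <= eps).astype(np.uint8)

def recurrence_rate(rp):
    """Simplest recurrence quantification: density of recurrent points.
    Coefficients like this one are what PCA then compresses for
    classification, as described above."""
    return rp.mean()
```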

  7. Elastic recoil detection analysis of hydrogen in polymers

    Energy Technology Data Exchange (ETDEWEB)

    Winzell, T.R.H.; Whitlow, H.J. [Lund Univ. (Sweden); Bubb, I.F.; Short, R.; Johnston, P.N. [Royal Melbourne Inst. of Tech., VIC (Australia)

    1997-12-31

    Elastic recoil detection analysis (ERDA) of hydrogen in thick polymeric films has been performed using 2.5 MeV He²⁺ ions from the tandem accelerator at the Royal Melbourne Institute of Technology. The technique enables the use of the same equipment as in Rutherford backscattering analysis, but instead of detecting the incident backscattered ion, the lighter recoiled ion is detected at a small forward angle. The purpose of this work is to investigate how selected polymers react when irradiated by helium ions. The polymers are to be evaluated for their suitability as reference standards for hydrogen depth profiling. Films investigated were Du Pont's Kapton and Mylar, and polystyrene. 11 refs., 3 figs.
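
    For orientation, the energy of the recoiled hydrogen follows the standard elastic-recoil kinematic factor E_r = E0 * 4*M1*M2*cos²(φ)/(M1+M2)²; a minimal calculation for the He-on-H geometry above (the 30° recoil angle is illustrative, not taken from the record):

        # Recoil energy in ERDA from the elastic-recoil kinematic factor.
        import math

        def recoil_energy(e0_mev, m_ion, m_recoil, phi_deg):
            k = (4 * m_ion * m_recoil * math.cos(math.radians(phi_deg)) ** 2
                 / (m_ion + m_recoil) ** 2)
            return k * e0_mev

        # 2.5 MeV He (A = 4) recoiling H (A = 1) detected at a 30 degree forward angle.
        print(f"{recoil_energy(2.5, 4, 1, 30):.2f} MeV")   # ~1.20 MeV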

  8. Elastic recoil detection analysis of hydrogen in polymers

    Energy Technology Data Exchange (ETDEWEB)

    Winzell, T.R.H.; Whitlow, H.J. [Lund Univ. (Sweden); Bubb, I.F.; Short, R.; Johnston, P.N. [Royal Melbourne Inst. of Tech., VIC (Australia)

    1996-12-31

    Elastic recoil detection analysis (ERDA) of hydrogen in thick polymeric films has been performed using 2.5 MeV He²⁺ ions from the tandem accelerator at the Royal Melbourne Institute of Technology. The technique enables the use of the same equipment as in Rutherford backscattering analysis, but instead of detecting the incident backscattered ion, the lighter recoiled ion is detected at a small forward angle. The purpose of this work is to investigate how selected polymers react when irradiated by helium ions. The polymers are to be evaluated for their suitability as reference standards for hydrogen depth profiling. Films investigated were Du Pont's Kapton and Mylar, and polystyrene. 11 refs., 3 figs.

  9. Detection of land mines using fast and thermal neutron analysis

    International Nuclear Information System (INIS)

    Bach, P.

    1998-01-01

    The detection of land mines is made possible by using a nuclear sensor based on neutron interrogation. Neutron interrogation makes it possible to detect the characteristic elements (C, H, O, N) of the explosives in land mines or in unexploded shells: the evaluation of the characteristic N/O and C/O ratios in a volume element gives a signature of high explosives. Fast neutron interrogation has been qualified in our laboratories as a powerful close-distance method for identifying the presence of a mine or explosive. This method could be implemented together with a multisensor detection system - for instance IR or microwave - to reduce the false alarm rate by addressing the suspected area. The principle of operation is based on the measurement of gamma rays induced by neutron interaction with irradiated nuclei from the soil and from a possible mine. The specific energies of these gamma rays make it possible to recognise the elements at the origin of the neutron interaction. Several detection methods can be used, depending on the nuclei to be identified. Analysis of physical data, computations by simulation codes, and experiments performed in our laboratory have shown the value of Fast Neutron Analysis (FNA) combined with Thermal Neutron Analysis (TNA) techniques, especially for the detection of nitrogen (¹⁴N), carbon (¹²C) and oxygen (¹⁶O). The FNA technique can be implemented using a 14 MeV sealed neutron tube and a set of detectors. Mine detection has been demonstrated in our investigations using a low-power neutron generator working in the 10⁸ n/s range, which is reasonable when considering safety rules. A fieldable demonstrator would consist of a detection head including the tube and detectors, with remote electronics, power supplies and a computer installed in a vehicle. (author)

  10. Multivariate Analysis for the Processing of Signals

    Directory of Open Access Journals (Sweden)

    Beattie J.R.

    2014-01-01

    Full Text Available Real-world experiments are becoming increasingly complex, needing techniques capable of tracking this complexity. Signal-based measurements are often used to capture this complexity, where a signal is a record of a sample's response to a parameter (e.g. time, displacement, voltage, wavelength) that is varied over a range of values. In signals, the responses at each value of the varied parameter are related to each other, depending on the composition or state of the sample being measured. Since signals contain multiple information points, they have rich information content but are generally complex to comprehend. Multivariate Analysis (MA) has profoundly transformed their analysis by allowing gross simplification of the tangled web of variation. In addition, MA has provided the advantage of being much more robust to the influence of noise than univariate methods of analysis. In recent years, there has been a growing awareness that the nature of multivariate methods allows exploitation of their benefits for purposes other than data analysis, such as pre-processing of signals with the aim of eliminating irrelevant variations prior to analysis of the signal of interest. It has been shown that exploiting multivariate data reduction in an appropriate way can allow high-fidelity denoising (removal of irreproducible non-signals), consistent and reproducible noise-insensitive correction of baseline distortions (removal of reproducible non-signals), accurate elimination of interfering signals (removal of reproducible but unwanted signals) and the standardisation of signal amplitude fluctuations. At present, the field is relatively small but the possibilities for much wider application are considerable. Where signal properties are suitable for MA (such as the signal being stationary along the x-axis), these signal-based corrections have the potential to be highly reproducible and highly adaptable, and are applicable in situations where the data is noisy or
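
    A minimal sketch of one such correction, denoising a family of related signals by truncating an SVD/PCA decomposition to the components carrying reproducible variation; the data and retained rank are synthetic assumptions:

        # Multivariate denoising: reconstruct a set of signals from the first
        # few principal components, discarding irreproducible noise.
        import numpy as np

        rng = np.random.default_rng(1)
        x = np.linspace(0, 1, 500)
        clean = np.sin(2 * np.pi * 5 * x)                      # shared signal shape
        signals = np.array([a * clean for a in rng.uniform(0.5, 2.0, 40)])
        noisy = signals + rng.normal(0, 0.3, signals.shape)

        # Rank-k truncated SVD keeps the components carrying reproducible variation.
        u, s, vt = np.linalg.svd(noisy - noisy.mean(axis=0), full_matrices=False)
        k = 1
        denoised = noisy.mean(axis=0) + (u[:, :k] * s[:k]) @ vt[:k]

        # The denoised error should be clearly below the raw error.
        print(np.abs(denoised - signals).mean(), np.abs(noisy - signals).mean())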

  11. Detecting fast, online reasoning processes in clinical decision making.

    Science.gov (United States)

    Flores, Amanda; Cobos, Pedro L; López, Francisco J; Godoy, Antonio

    2014-06-01

    In an experiment that used the inconsistency paradigm, experienced clinical psychologists and psychology students performed a reading task using clinical reports and a diagnostic judgment task. The clinical reports provided information about the symptoms of hypothetical clients who had been previously diagnosed with a specific mental disorder. Reading times of inconsistent target sentences were slower than those of control sentences, demonstrating an inconsistency effect. The results also showed that experienced clinicians gave different weights to different symptoms according to their relevance when fluently reading the clinical reports provided, despite the fact that all the symptoms were of equal diagnostic value according to the Diagnostic and Statistical Manual of Mental Disorders (4th ed., text rev.; American Psychiatric Association, 2000). The diagnostic judgment task yielded a similar pattern of results. In contrast to previous findings, the results of the reading task may be taken as direct evidence of the intervention of reasoning processes that occur very early, rapidly, and online. We suggest that these processes are based on the representation of mental disorders and that these representations are particularly suited to fast retrieval from memory and to making inferences. They may also be related to the clinicians' causal reasoning. The implications of these results for clinician training are also discussed.

  12. Partial wave analysis using graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    Berger, Niklaus; Liu Beijiang; Wang Jike, E-mail: nberger@ihep.ac.c [Institute of High Energy Physics, Chinese Academy of Sciences, 19B Yuquan Lu, Shijingshan, 100049 Beijing (China)

    2010-04-01

    Partial wave analysis is an important tool for determining resonance properties in hadron spectroscopy. For large data samples, however, the unbinned likelihood fits employed are computationally very expensive. At the Beijing Spectrometer (BES) III experiment, an increase in statistics of up to two orders of magnitude compared to earlier experiments is expected. In order to allow for a timely analysis of these datasets, additional computing power with short turnover times has to be made available. It turns out that graphics processing units (GPUs), originally developed for 3D computer games, have an architecture of massively parallel single-instruction multiple-data floating point units that is almost ideally suited for the algorithms employed in partial wave analysis. We have implemented a framework for tensor manipulation and partial wave fits called GPUPWA. The user writes a program in pure C++ whilst the GPUPWA classes handle computations on the GPU, memory transfers, caching and other technical details. In conjunction with a recent graphics processor, the framework provides a speed-up of the partial wave fit by more than two orders of magnitude compared to legacy FORTRAN code.

  13. Analysis of individual brain activation maps using hierarchical description and multiscale detection

    International Nuclear Information System (INIS)

    Poline, J.B.; Mazoyer, B.M.

    1994-01-01

    The authors propose a new method for the analysis of brain activation images that aims at detecting activated volumes rather than pixels. The method is based on Poisson process modeling, hierarchical description, and multiscale detection (MSD). Its performance has been assessed using both Monte Carlo simulated images and experimental PET brain activation data. Compared to other methods, the MSD approach shows enhanced sensitivity with a controlled overall type I error, and has the ability to provide an estimate of the spatial limits of the detected signals. It is applicable to any kind of difference image for which the spatial autocorrelation function can be approximated by a stationary Gaussian function.

  14. Gravitational wave detection and data analysis for pulsar timing arrays

    NARCIS (Netherlands)

    Haasteren, Rutger van

    2011-01-01

    Long-term precise timing of Galactic millisecond pulsars holds great promise for measuring long-period (months-to-years) astrophysical gravitational waves. In this work we develop a Bayesian data analysis method for projects called pulsar timing arrays; projects aimed to detect these gravitational

  15. Detecting bots using multi-level traffic analysis

    DEFF Research Database (Denmark)

    Stevanovic, Matija; Pedersen, Jens Myrup

    2016-01-01

    introduces a novel multi-level botnet detection approach that performs network traffic analysis of three protocols widely considered as the main carriers of botnet Command and Control (C&C) and attack traffic, i.e. TCP, UDP and DNS. The proposed method relies on supervised machine learning for identifying...

  16. Meter Detection in Symbolic Music Using Inner Metric Analysis

    NARCIS (Netherlands)

    de Haas, W.B.; Volk, A.

    2016-01-01

    In this paper we present PRIMA: a new model tailored to symbolic music that detects the meter and the first downbeat position of a piece. Given onset data, the metrical structure of a piece is interpreted using the Inner Metric Analysis (IMA) model. IMA identifies the strong and weak metrical

  17. The Detection and Analysis of Chromosome Fragile Sites

    DEFF Research Database (Denmark)

    Bjerregaard, Victoria A; Özer, Özgün; Hickson, Ian D

    2018-01-01

    A fragile site is a chromosomal locus that is prone to form a gap or constriction visible within a condensed metaphase chromosome, particularly following exposure of cells to DNA replication stress. Based on their frequency, fragile sites are classified as either common (CFSs; present in all...... for detection and analysis of chromosome fragile sites....

  18. Fault detection of feed water treatment process using PCA-WD with parameter optimization.

    Science.gov (United States)

    Zhang, Shirong; Tang, Qian; Lin, Yu; Tang, Yuling

    2017-05-01

    Feed water treatment process (FWTP) is an essential part of utility boilers, and fault detection is expected for its reliability improvement. Classical principal component analysis (PCA) has been applied to FWTPs in our previous work; however, the noises of the T² and SPE statistics result in false detections and missed detections. In this paper, wavelet denoising (WD) is combined with PCA to form a new algorithm, PCA-WD, where WD is intentionally employed to deal with the noises. The parameter selection of PCA-WD is further formulated as an optimization problem, and particle swarm optimization (PSO) is employed for its solution. An FWTP, sustaining two 1000 MW generation units in a coal-fired power plant, is taken as a study case. Its operation data are collected for the verification study that follows. The results show that the optimized WD is effective in restraining the noises of the T² and SPE statistics, so as to improve the performance of the PCA-WD algorithm. The parameter optimization also enables PCA-WD to obtain its optimal parameters automatically, rather than from individual experience. The optimized PCA-WD is further compared with classical PCA and sliding-window PCA (SWPCA) in four cases: bias fault, drift fault, broken-line fault and normal condition. The advantages of the optimized PCA-WD over classical PCA and SWPCA are finally confirmed by the results. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
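
    For reference, the T² and SPE statistics named here can be sketched as below; the paper's wavelet-denoising (WD) and PSO steps are omitted, the data are synthetic, and empirical percentile limits stand in for the usual F/chi-square control limits:

        # PCA monitoring sketch: Hotelling T^2 and SPE (Q) statistics.
        import numpy as np

        rng = np.random.default_rng(2)
        X = rng.normal(size=(500, 6))
        X[:, 3] = 0.8 * X[:, 0] + 0.2 * rng.normal(size=500)   # correlated variable

        mu, sd = X.mean(0), X.std(0)
        Z = (X - mu) / sd
        eigval, eigvec = np.linalg.eigh(np.cov(Z, rowvar=False))
        idx = np.argsort(eigval)[::-1][:2]                     # retain 2 components
        P, lam = eigvec[:, idx], eigval[idx]

        def t2_spe(z):
            t = z @ P                                          # scores
            t2 = np.sum(t**2 / lam, axis=1)                    # Hotelling T^2
            resid = z - t @ P.T                                # residual subspace
            spe = np.sum(resid**2, axis=1)                     # squared prediction error
            return t2, spe

        t2, spe = t2_spe(Z)
        t2_lim, spe_lim = np.percentile(t2, 99), np.percentile(spe, 99)

        z_new = (rng.normal(size=(1, 6)) + 4 - mu) / sd        # biased sample (fault)
        t2n, spen = t2_spe(z_new)
        print(t2n[0] > t2_lim or spen[0] > spe_lim)            # True -> alarm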

  19. Processing of Instantaneous Angular Speed Signal for Detection of a Diesel Engine Failure

    Directory of Open Access Journals (Sweden)

    Adam Charchalis

    2013-01-01

    Full Text Available Continuous monitoring of diesel engine performance during its operation is critical for predicting malfunction development and, subsequently, detecting functional failure. Analysis of the instantaneous angular speed (IAS) of the crankshaft is considered one of the nonintrusive and effective methods for detecting deterioration of combustion quality. In this paper, results of the experimental verification of fuel-system malfunction detection, using an optical encoder for IAS recording, are presented. The implemented method relies on the comparison of measurement results recorded under healthy and faulty conditions of the engine. An elaborated dynamic model of angular speed variations enables us to build templates of engine behavior. Cylinder pressure values recorded during the experiment were used to approximate the basic pressure waveform. The main task of data processing is smoothing the raw angular speed signal. The noise is due to sensor mount vibrations, signal emitter machining, engine body vibrations, and crankshaft torsional vibrations. Smoothing of the measurement data was carried out with a Savitzky-Golay filter. The smoothed measured signal was then compared with the modeled IAS run.
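
    A minimal sketch of the smoothing step, assuming SciPy's savgol_filter; the synthetic IAS trace, window length and polynomial order are illustrative rather than the paper's values:

        # Smoothing a noisy instantaneous-angular-speed (IAS) trace.
        import numpy as np
        from scipy.signal import savgol_filter

        angle = np.linspace(0, 4 * np.pi, 2000)                 # two crank revolutions
        ias = 100 + 2 * np.sin(2 * angle)                       # firing-order ripple
        raw = ias + np.random.normal(0, 0.5, ias.size)          # encoder/vibration noise

        smooth = savgol_filter(raw, window_length=51, polyorder=3)
        print(np.abs(smooth - ias).mean() < np.abs(raw - ias).mean())  # True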

  20. Mass Detection in Mammographic Images Using Wavelet Processing and Adaptive Threshold Technique.

    Science.gov (United States)

    Vikhe, P S; Thool, V R

    2016-04-01

    Detection of masses in mammograms for early diagnosis of breast cancer is a significant task in reducing the mortality rate. However, in some cases, screening for masses is difficult for radiologists due to variations in contrast, fuzzy edges and noisy mammograms. Masses and micro-calcifications are the distinctive signs for the diagnosis of breast cancer. This paper presents a method for mass enhancement using a piecewise linear operator in combination with wavelet processing of mammographic images. The method includes artifact suppression and pectoral muscle removal based on morphological operations. Finally, mass segmentation for detection using an adaptive threshold technique is carried out to separate the mass from the background. The proposed method has been tested on 130 (45 + 85) images with 90.9 and 91% True Positive Fraction (TPF) at 2.35 and 2.1 average False Positives per Image (FP/I) from two different databases, namely the Mammographic Image Analysis Society (MIAS) and the Digital Database for Screening Mammography (DDSM). The results show that the proposed technique improves diagnosis in early breast cancer detection.

  1. Investigating the Process of Process Modeling with Eye Movement Analysis

    OpenAIRE

    Pinggera, Jakob; Furtner, Marco; Martini, Markus; Sachse, Pierre; Reiter, Katharina; Zugal, Stefan; Weber, Barbara

    2015-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models by analyzing the modeler's interactions with the modeling environment. In this paper we aim to complement previous insights on the modeler's modeling behavior with data gathered by tracking the modeler's eye movements when engaged in the act of modeling. We present preliminary results and outline directions for future research to triangulate toward a more comprehensive un...

  2. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    Energy Technology Data Exchange (ETDEWEB)

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, Francois; Aletti, Pierre [Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France) and DOSIsoft SA, 94230 Cachan (France); Research Laboratory for Innovative Processes (ERPI), Nancy University, EA 3767, 5400 Nancy Cedex (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France); DOSIsoft SA, 94230 Cachan (France); Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy, France and Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France)

    2009-04-15

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (±4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should

  3. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    Science.gov (United States)

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (+/- 4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the
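
    As a reference point for one of the charts discussed in these two records, a minimal EWMA chart over synthetic dose-deviation data, using the standard EWMA recursion and variance-based control limits (the λ and L values are illustrative):

        # EWMA control chart sketch for dose-deviation QC results.
        import numpy as np

        lam, L = 0.2, 3.0                         # smoothing constant and limit width
        x = np.random.normal(0.0, 1.0, 60)        # percent dose deviations
        x[40:] += 1.5                             # simulated process drift

        mu, sigma = x[:30].mean(), x[:30].std()   # baseline estimated in control
        z = np.empty_like(x)
        z[0] = mu
        for i in range(1, x.size):
            z[i] = lam * x[i] + (1 - lam) * z[i - 1]

        i = np.arange(1, x.size + 1)
        half_width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
        out = np.where((z > mu + half_width) | (z < mu - half_width))[0]
        print("first out-of-control point:", out[0] if out.size else None)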

  4. Rapid Detection of Biological and Chemical Threat Agents Using Physical Chemistry, Active Detection, and Computational Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Myung; Dong, Li; Fu, Rong; Liotta, Lance; Narayanan, Aarthi; Petricoin, Emanuel; Ross, Mark; Russo, Paul; Zhou, Weidong; Luchini, Alessandra; Manes, Nathan; Chertow, Jessica; Han, Suhua; Kidd, Jessica; Senina, Svetlana; Groves, Stephanie

    2007-01-01

    Basic technologies have been successfully developed within this project: rapid collection of aerosols and a rapid ultra-sensitive immunoassay technique. Water-soluble, humidity-resistant polyacrylamide nano-filters were shown to (1) capture aerosol particles as small as 20 nm, (2) work in humid air and (3) completely liberate their captured particles in an aqueous solution compatible with the immunoassay technique. The immunoassay technology developed within this project combines electrophoretic capture with magnetic bead detection. It allows detection of as few as 150-600 analyte molecules or viruses in only three minutes, something no other known method can duplicate. The technology can be used in a variety of applications where speed of analysis and/or extremely low detection limits are of great importance: in rapid analysis of donor blood for hepatitis, HIV and other blood-borne infections in emergency blood transfusions, in trace analysis of pollutants, or in search of biomarkers in biological fluids. Combined in a single device, the water-soluble filter and ultra-sensitive immunoassay technique may solve the problem of early warning type detection of aerosolized pathogens. These two technologies are protected with five patent applications and are ready for commercialization.

  5. Citation-based plagiarism detection detecting disguised and cross-language plagiarism using citation pattern analysis

    CERN Document Server

    Gipp, Bela

    2014-01-01

    Plagiarism is a problem with far-reaching consequences for the sciences. However, even today's best software-based systems can only reliably identify copy & paste plagiarism. Disguised plagiarism forms, including paraphrased text, cross-language plagiarism, as well as structural and idea plagiarism often remain undetected. This weakness of current systems results in a large percentage of scientific plagiarism going undetected. Bela Gipp provides an overview of the state-of-the art in plagiarism detection and an analysis of why these approaches fail to detect disguised plagiarism forms. The aut

  6. Information Processing Features Can Detect Behavioral Regimes of Dynamical Systems

    Directory of Open Access Journals (Sweden)

    Rick Quax

    2018-01-01

    Full Text Available In dynamical systems, local interactions between dynamical units generate correlations which are stored and transmitted throughout the system, generating the macroscopic behavior. However, a framework to quantify exactly how these correlations are stored, transmitted, and combined at the microscopic scale is missing. Here we propose to characterize the notion of “information processing” based on all possible Shannon mutual information quantities between a future state and all possible sets of initial states. We apply it to the 256 elementary cellular automata (ECA), which are the simplest possible dynamical systems exhibiting behaviors ranging from simple to complex. Our main finding is that only a few information features are needed for full predictability of the systemic behavior and that the “information synergy” feature is always the most predictive. Finally, we apply the idea to foreign exchange (FX) and interest-rate swap (IRS) time-series data. We find an effective “slowing down” leading indicator in all three markets for the 2008 financial crisis when applied to the information features, as opposed to using the data itself directly. Our work suggests that the proposed characterization of the local information processing of units may be a promising direction for predicting emergent systemic behaviors.
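
    A toy sketch of the underlying measurement: estimating the Shannon mutual information between an initial cell and a future cell of an elementary cellular automaton by counting joint outcomes over random initial states (the rule and cell indices are arbitrary choices):

        # Mutual information between an initial cell and a future cell of an ECA.
        import numpy as np

        def step(state, rule=110):
            # One synchronous update of a 1-D binary CA with periodic boundaries.
            l, c, r = np.roll(state, 1), state, np.roll(state, -1)
            idx = 4 * l + 2 * c + r
            table = (rule >> np.arange(8)) & 1    # output bit per neighborhood
            return table[idx]

        rng = np.random.default_rng(6)
        joint = np.zeros((2, 2))
        for _ in range(5000):
            s = rng.integers(0, 2, 32)
            joint[s[10], step(s)[10]] += 1        # (initial cell, future cell)

        p = joint / joint.sum()
        px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
        mask = p > 0
        mi = np.sum(p[mask] * np.log2(p[mask] / (px @ py)[mask]))
        print(f"I(initial; future) = {mi:.3f} bits")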

  7. Traffic analysis and control using image processing

    Science.gov (United States)

    Senthilkumar, K.; Ellappan, Vijayan; Arun, A. R.

    2017-11-01

    This paper reviews work on traffic analysis and control to date. It presents an approach to regulating traffic using image processing and MATLAB. The concept compares computed reference images with images of the street taken in order to determine the traffic level percentage and to set the timing of the traffic signal accordingly, which is used to reduce stoppage at traffic lights. The concept proposes to solve real-life street scenarios by enriching traffic lights with image receivers such as HD cameras and image processors. The input is then imported into MATLAB to be used as a method for calculating the traffic on the roads. The results are computed in order to adjust the traffic light timings on a particular street, also with respect to other similar proposals, but with the added value of solving a real, large instance.

  8. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    Science.gov (United States)

    West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  9. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations-COSO, 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of risk quantification to enhance the degree of objectivity, in finance for instance, paralleled its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEI). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phase study, which sampled one hundred (100) risk analysts in a university in the greater Eastern Cape Province of South Africa. The analysis of likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant with a chi-square (X² = 8.181; p = 0.300), which indicated a good model fit, since the data did not significantly deviate from the model. The study concluded that to derive an overall likelihood rating that indicates the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) the nature of the vulnerability, and (3) the existence and effectiveness of current controls (methods and process).

  10. Multiple scattering problems in heavy ion elastic recoil detection analysis

    International Nuclear Information System (INIS)

    Johnston, P.N.; El Bouanani, M.; Stannard, W.B.; Bubb, I.F.; Cohen, D.D.; Dytlewski, N.; Siegele, R.

    1998-01-01

    A number of groups use Heavy Ion Elastic Recoil Detection Analysis (HIERDA) to study materials science problems. Nevertheless, there is no standard methodology for the analysis of HIERDA spectra. To overcome this deficiency we have been establishing codes for 2-dimensional data analysis. A major problem involves the effects of multiple and plural scattering, which are very significant even for quite thin (∼100 nm) layers of the very heavy elements. To examine the effects of multiple scattering we have made comparisons between the small-angle model of Sigmund et al. and TRIM calculations. (authors)

  11. Auditory detection of an increment in the rate of a random process

    International Nuclear Information System (INIS)

    Brown, W.S.; Emmerich, D.S.

    1994-01-01

    Recent experiments have presented listeners with complex tonal stimuli consisting of components with values (i.e., intensities or frequencies) randomly sampled from probability distributions [e.g., R. A. Lutfi, J. Acoust. Soc. Am. 86, 934--944 (1989)]. In the present experiment, brief tones were presented at intervals corresponding to the intensity of a random process. Specifically, the intervals between tones were randomly selected from exponential probability functions. Listeners were asked to decide whether tones presented during a defined observation interval represented a "noise" process alone or the "noise" with a "signal" process added to it. The number of tones occurring in any observation interval is a Poisson variable; receiver operating characteristics (ROCs) arising from Poisson processes have been considered by Egan [Signal Detection Theory and ROC Analysis (Academic, New York, 1975)]. Several sets of noise and signal intensities and observation interval durations were selected that were expected to yield equivalent performance. Rating ROCs were generated based on subjects' responses in a single-interval, yes-no task. The performance levels achieved by listeners and the effects of intensity and duration are compared to those predicted for an ideal observer.
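
    A minimal sketch of the corresponding ideal-observer ROC: the count is Poisson with rate μ_N under noise alone and Poisson with rate μ_N + μ_S with the signal process added; one ROC point per counting criterion, with illustrative rates:

        # ROC for detecting an added Poisson "signal" process by counting tones.
        import numpy as np
        from scipy.stats import poisson

        mu_noise, mu_signal = 5.0, 3.0
        criteria = np.arange(0, 20)
        # P(count >= c) under each hypothesis gives one ROC point per criterion c.
        fa = poisson.sf(criteria - 1, mu_noise)
        hit = poisson.sf(criteria - 1, mu_noise + mu_signal)
        for c, f, h in zip(criteria[4:9], fa[4:9], hit[4:9]):
            print(f"criterion {c}: FA = {f:.3f}  Hit = {h:.3f}")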

  12. Confirmation of identity and detection limit in neutron activation analysis

    International Nuclear Information System (INIS)

    Yustina Tri Handayani; Slamet Wiyuniati; Tulisna

    2010-01-01

    Neutron Activation Analysis (NAA) is based on neutron capture by nuclides. Among the various radionuclides that may be produced, the radionuclide and gamma radiation that provide the identity of the element analyzed with the best sensitivity should be determined. Confirmation for elements in sediment samples was done both theoretically and experimentally. The confirmation results show that Al, V, Cr, K, Na, Ca and Zn were analyzed based on the radionuclides Al-28, V-52, Cr-51, K-42, Na-24, Ca-48 and Zn-65. The elements Mg, Mn, Fe and Co were analyzed based on the radionuclides Mg-27, Mn-56, Fe-59 and Co-60, through the peak with the highest combined value of gamma emission probability and detection efficiency. Cu can be analyzed through Cu-64 or Cu-66, but the second is more sensitive. The detection limit is determined under the particular measurement conditions used by a laboratory. In NAA, the detection limit is determined from the Compton continuum area by the Currie method. The detection limits of Al, V, Ca, Mg, Mn, As, K, Na, Mg, Ce, Co, Cr, Fe, La, Sc, and Zn in sediment samples are 240, 27, 4750, 2600, 21, 3.3, 75, 1.4, 1.8, 0.5, 2.7, 29, 1, 0.05, and 37 ppm. In the analysis of Cu in sediment at a concentration of 98.6 ppm, Cu-66 was not detected. Tests using pure standard Cu solutions gave a detection limit of 0.12 µg, or 7.9 ppm in a 15 mg sample. In general, the detection limits obtained were higher than those of the reference, owing to differences in the sample matrix and analytical conditions. (author)
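
    For reference, the Currie decision threshold and detection limit can be computed from a background (Compton continuum) count B with the familiar paired-blank approximations L_C ≈ 2.33√B and L_D ≈ 2.71 + 4.65√B; the background value below is illustrative:

        # Currie decision threshold and detection limit in counts.
        import math

        def currie_limits(background_counts):
            s = math.sqrt(background_counts)
            return 2.33 * s, 2.71 + 4.65 * s

        lc, ld = currie_limits(400.0)   # illustrative continuum area under a peak
        print(f"L_C = {lc:.1f} counts, L_D = {ld:.1f} counts")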

  13. Image processing and analysis using neural networks for optometry area

    Science.gov (United States)

    Netto, Antonio V.; Ferreira de Oliveira, Maria C.

    2002-11-01

    In this work we describe the framework of a functional system for processing and analyzing images of the human eye acquired by the Hartmann-Shack (HS) technique, in order to extract information to formulate a diagnosis of eye refractive errors (astigmatism, hypermetropia and myopia). The analysis is to be carried out using an Artificial Intelligence system based on Neural Nets, Fuzzy Logic and Classifier Combination. The major goal is to establish the basis of a new technology to effectively measure ocular refractive errors, based on methods alternative to those adopted in current patented systems. Moreover, analysis of images acquired with the Hartmann-Shack technique may enable the extraction of additional information on the health of an eye under exam from the same image used to detect refraction errors.

  14. Detection of Organophosphorus Pesticides with Colorimetry and Computer Image Analysis.

    Science.gov (United States)

    Li, Yanjie; Hou, Changjun; Lei, Jincan; Deng, Bo; Huang, Jing; Yang, Mei

    2016-01-01

    Organophosphorus pesticides (OPs) represent a very important class of pesticides that are widely used in agriculture because of their relatively high performance and moderate environmental persistence; hence the sensitive and specific detection of OPs is highly significant. Based on the inhibitory effect of inhibitors, including OPs and carbamates, on acetylcholinesterase (AChE), a colorimetric analysis was used for the detection of OPs with computer image analysis of color density in CMYK (cyan, magenta, yellow and black) color space and non-linear modeling. The results showed that yellow intensity gradually weakened as the concentration of dichlorvos increased. The quantitative analysis of dichlorvos was achieved by Artificial Neural Network (ANN) modeling, and the results showed that the established model had good predictive ability between training and predictive sets. Real cabbage samples containing dichlorvos were detected by colorimetry and gas chromatography (GC), respectively. The results showed that there was no significant difference between colorimetry and GC (P > 0.05). Experiments on accuracy, precision and repeatability revealed good performance for the detection of OPs. AChE can also be inhibited by carbamates; therefore, this method has potential applications for OPs and carbamates in real samples because of its high selectivity and sensitivity.

  15. Fast EEG spike detection via eigenvalue analysis and clustering of spatial amplitude distribution

    Science.gov (United States)

    Fukami, Tadanori; Shimada, Takamasa; Ishikawa, Bunnoshin

    2018-06-01

    Objective. In the current study, we tested a proposed method for fast spike detection in electroencephalography (EEG). Approach. We performed eigenvalue analysis in two-dimensional space spanned by gradients calculated from two neighboring samples to detect high-amplitude negative peaks. We extracted the spike candidates by imposing restrictions on parameters regarding spike shape and eigenvalues reflecting detection characteristics of individual medical doctors. We subsequently performed clustering, classifying detected peaks by considering the amplitude distribution at 19 scalp electrodes. Clusters with a small number of candidates were excluded. We then defined a score for eliminating spike candidates for which the pattern of detected electrodes differed from the overall pattern in a cluster. Spikes were detected by setting the score threshold. Main results. Based on visual inspection by a psychiatrist experienced in EEG, we evaluated the proposed method using two statistical measures of precision and recall with respect to detection performance. We found that precision and recall exhibited a trade-off relationship. The average recall value was 0.708 in eight subjects with the score threshold that maximized the F-measure, with 58.6  ±  36.2 spikes per subject. Under this condition, the average precision was 0.390, corresponding to a false positive rate 2.09 times higher than the true positive rate. Analysis of the required processing time revealed that, using a general-purpose computer, our method could be used to perform spike detection in 12.1% of the recording time. The process of narrowing down spike candidates based on shape occupied most of the processing time. Significance. Although the average recall value was comparable with that of other studies, the proposed method significantly shortened the processing time.

  16. Multi-Modal Detection and Mapping of Static and Dynamic Obstacles in Agriculture for Process Evaluation

    Directory of Open Access Journals (Sweden)

    Timo Korthals

    2018-03-01

    Full Text Available Today, agricultural vehicles are available that can automatically perform tasks such as weed detection and spraying, mowing, and sowing while being steered automatically. However, for such systems to be fully autonomous and self-driven, not only must their specific agricultural tasks be automated. An accurate and robust perception system that automatically detects and avoids all obstacles must also be realized to ensure the safety of humans, animals, and other surroundings. In this paper, we present a multi-modal obstacle and environment detection and recognition approach for process evaluation in agricultural fields. The proposed pipeline detects and maps static and dynamic obstacles globally, while providing process-relevant information along the traversed trajectory. Detection algorithms are introduced for a variety of sensor technologies, including range sensors (lidar and radar) and cameras (stereo and thermal). Detection information is mapped globally into semantic occupancy grid maps and fused across all sensors with late fusion, resulting in accurate traversability assessment and semantic mapping of process-relevant categories (e.g., crop, ground, and obstacles). Finally, a decoding step uses a Hidden Markov model to extract relevant process-specific parameters along the trajectory of the vehicle, thus informing a potential control system of unexpected structures in the planned path. The method is evaluated on a public dataset for multi-modal obstacle detection in agricultural fields. Results show that a combination of multiple sensor modalities increases detection performance and that different fusion strategies must be applied between algorithms detecting similar and dissimilar classes.

  17. Early Detection of Diabetic Retinopathy in Fluorescent Angiography Retinal Images Using Image Processing Methods

    Directory of Open Access Journals (Sweden)

    Meysam Tavakoli

    2010-12-01

    Full Text Available Introduction: Diabetic retinopathy (DR) is the single largest cause of sight loss and blindness in the working-age population of Western countries; it is the most common cause of blindness in adults between 20 and 60 years of age. Early diagnosis of DR is critical for preventing vision loss, so early detection of microaneurysms (MAs) as the first signs of DR is important. This paper addresses the automatic detection of MAs in fluorescein angiography fundus images, which plays a key role in computer-assisted diagnosis of DR, a serious and frequent eye disease. Material and Methods: The algorithm can be divided into three main steps. The first step, pre-processing, performed background normalization and contrast enhancement of the image. The second step aimed at detecting landmarks, i.e., all patterns possibly corresponding to vessels and the optic nerve head, which was achieved using a local radon transform. MA candidates were then extracted and, in the final step, automatically classified into real MAs and other objects. A database of 120 fluorescein angiography fundus images was used to train and test the algorithm. The algorithm was compared to manually obtained gradings of those images. Results: Sensitivity of diagnosis for DR was 94%, with specificity of 75%, and sensitivity of precise microaneurysm localization was 92%, at an average of 8 false positives per image. Discussion and Conclusion: The sensitivity and specificity of this algorithm make it one of the best methods in this field. Using the local radon transform in this algorithm reduces noise sensitivity in microaneurysm detection in retinal image analysis.

  18. Early detection of foot ulcers through asymmetry analysis

    Science.gov (United States)

    Kaabouch, Naima; Chen, Yi; Hu, Wen-Chen; Anderson, Julie; Ames, Forrest; Paulson, Rolf

    2009-02-01

    Foot ulcers affect millions of Americans annually. Areas that are likely to ulcerate have been associated with increased local skin temperatures due to inflammation and enzymatic autolysis of tissue. Conventional methods to assess skin, including inspection and palpation, may be valuable approaches, but they usually do not detect changes in skin integrity until an ulcer has already developed. Conversely, infrared imaging is a technology able to assess the integrity of the skin and its many layers, and thus has the potential to index the cascade of physiological events in the prevention, assessment, and management of foot ulcers. In this paper, we propose a technique, asymmetry analysis, to automatically analyze infrared images in order to detect inflammation. Preliminary results show that the proposed technique can reliably and efficiently detect inflammation and, hence, predict potential ulceration.
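
    A minimal sketch of the asymmetry idea: mirror one foot's thermal image, subtract it from the contralateral image, and flag large temperature differences; the synthetic data and the 2.2 °C threshold are illustrative assumptions:

        # Left-right asymmetry analysis of plantar thermal images.
        import numpy as np

        rng = np.random.default_rng(3)
        left = 30 + rng.normal(0, 0.3, (64, 48))       # degrees Celsius
        right = 30 + rng.normal(0, 0.3, (64, 48))
        right[20:30, 10:20] += 3.0                     # simulated inflamed region

        diff = right - np.fliplr(left)                 # mirror to align anatomy
        hotspots = diff > 2.2                          # asymmetry threshold (deg C)
        print("suspect pixels:", int(hotspots.sum()))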

  19. Analysis of accelerants and fire debris using aroma detection technology

    Energy Technology Data Exchange (ETDEWEB)

    Barshick, S.A.

    1997-01-17

    The purpose of this work was to investigate the utility of electronic aroma detection technologies for the detection and identification of accelerant residues in suspected arson debris. Through the analysis of known accelerant residues, a trained neural network was developed for classifying suspected arson samples. Three unknown fire debris samples were classified using this neural network. The item corresponding to diesel fuel was correctly identified every time. For the other two items, wide variations in sample concentration and excessive water content, producing high sample humidities, were shown to influence the sensor response. Sorbent sampling prior to aroma detection was demonstrated to reduce these problems and to allow proper neural network classification of the remaining items corresponding to kerosene and gasoline.

  20. Reliability analysis for the quench detection in the LHC machine

    CERN Document Server

    Denz, R; Vergara-Fernández, A

    2002-01-01

    The Large Hadron Collider (LHC) will incorporate a large number of superconducting elements that require protection in case of a quench. Key elements in the quench protection system are the electronic quench detectors. Their reliability will have an important impact on the downtime as well as on the operational cost of the collider. The expected rates of both false and missed quenches have been computed for several redundant detection schemes. The developed model takes into account the maintainability of the system to optimise the frequency of foreseen checks, and evaluates their influence on the performance of different detection topologies. Given the uncertainty in the failure rate of the components, combined with the LHC tunnel environment, the study has been completed with a sensitivity analysis of the results. The chosen detection scheme and the maintainability strategy for each detector family are given.

  1. Image processing based detection of lung cancer on CT scan images

    Science.gov (United States)

    Abdillah, Bariqi; Bustamam, Alhadi; Sarwinda, Devvi

    2017-10-01

    In this paper, we implement and analyze an image processing method for the detection of lung cancer. Image processing techniques are widely used in several medical problems for picture enhancement in the detection phase, to support early medical treatment. In this research we propose a lung cancer detection method based on image segmentation, which is an intermediate-level task in image processing. Marker-controlled watershed and region-growing approaches are used to segment the CT scan image. The detection phases comprise image enhancement using a Gabor filter, image segmentation, and feature extraction. The experimental results show the effectiveness of our approach: the best approach for main feature detection is the watershed-with-masking method, which has high accuracy and is robust.
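
    A minimal sketch of marker-controlled watershed segmentation, assuming scikit-image and SciPy, with a synthetic blob standing in for a candidate region on a CT slice; the paper's Gabor enhancement and feature extraction steps are omitted:

        # Marker-controlled watershed segmentation of a synthetic blob.
        import numpy as np
        from scipy import ndimage as ndi
        from skimage.filters import threshold_otsu
        from skimage.segmentation import watershed
        from skimage.feature import peak_local_max

        img = np.zeros((80, 80))
        img[30:50, 30:50] = 1.0                                   # stand-in nodule
        img += np.random.default_rng(4).normal(0, 0.1, img.shape)

        mask = img > threshold_otsu(img)                          # foreground mask
        distance = ndi.distance_transform_edt(mask)
        peaks = peak_local_max(distance, labels=mask, num_peaks=2)
        markers = np.zeros_like(img, dtype=int)
        markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)    # seed markers
        labels = watershed(-distance, markers, mask=mask)
        print("segments:", labels.max())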

  2. Individual differences in event-based prospective memory: Evidence for multiple processes supporting cue detection.

    Science.gov (United States)

    Brewer, Gene A; Knight, Justin B; Marsh, Richard L; Unsworth, Nash

    2010-04-01

    The multiprocess view proposes that different processes can be used to detect event-based prospective memory cues, depending in part on the specificity of the cue. According to this theory, attentional processes are not necessary to detect focal cues, whereas detection of nonfocal cues requires some form of controlled attention. This notion was tested using a design in which we compared performance on a focal and on a nonfocal prospective memory task by participants with high or low working memory capacity. An interaction was found, such that participants with high and low working memory performed equally well on the focal task, whereas the participants with high working memory performed significantly better on the nonfocal task than did their counterparts with low working memory. Thus, controlled attention was only necessary for detecting event-based prospective memory cues in the nonfocal task. These results have implications for theories of prospective memory, the processes necessary for cue detection, and the successful fulfillment of intentions.

  3. Integrated Process Design, Control and Analysis of Intensified Chemical Processes

    DEFF Research Database (Denmark)

    Mansouri, Seyed Soheil

    chemical processes; for example, intensified processes such as reactive distillation. Most importantly, it identifies and eliminates potentially promising design alternatives that may have controllability problems later. To date, a number of methodologies have been proposed and applied on various problems......, that the same principles that apply to a binary non-reactive compound system are valid also for a binary-element or a multi-element system. Therefore, it is advantageous to employ the element based method for multicomponent reaction-separation systems. It is shown that the same design-control principles...

  4. NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.

    Science.gov (United States)

    Zhang, Bo; Dai, Ji; Zhang, Tao

    2017-11-13

    In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train, and local field potentials analysis, behavioral response (e.g. saccade) detection and extraction, with several options available for data plotting and statistics. Particularly, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. With the multitude of general-purpose functions provided

  5. Dual-Process Theory and Signal-Detection Theory of Recognition Memory

    Science.gov (United States)

    Wixted, John T.

    2007-01-01

    Two influential models of recognition memory, the unequal-variance signal-detection model and a dual-process threshold/detection model, accurately describe the receiver operating characteristic, but only the latter model can provide estimates of recollection and familiarity. Such estimates often accord with those provided by the remember-know…

  6. HIGH RESOLUTION RESISTIVITY LEAK DETECTION DATA PROCESSING AND EVALUATION METHODS AND REQUIREMENTS

    International Nuclear Information System (INIS)

    SCHOFIELD JS

    2007-01-01

    This document has two purposes: (1) describe how data generated by High Resolution Resistivity (HRR) leak detection (LD) systems deployed during single-shell tank (SST) waste retrieval operations are processed and evaluated, and (2) provide the basic review requirements for HRR data when HRR is deployed as a leak detection method during SST waste retrievals.

  7. Infrared processing and sensor fusion for anti-personnel land-mine detection

    NARCIS (Netherlands)

    Schavemaker, J.G.M.; Cremer, F.; Schutte, K.; Breejen, E. den

    2000-01-01

    In this paper we present the results of infrared processing and sensor fusion obtained within the European research project GEODE (Ground Explosive Ordnance DEtection) that strives for the realization of a vehicle-mounted, multi-sensor anti-personnel land-mine detection system for humanitarian

  8. Application of factor analysis to the explosive detection

    International Nuclear Information System (INIS)

    Park, Yong Joon; Song, Byung Chul; Im, Hee Jung; Kim, Won Ho; Cho, Jung Hwan

    2005-01-01

    The detection of explosive devices hidden in airline baggage is a significant problem, particularly in view of the development of modern plastic explosives, which can be formed into various innocent-appearing shapes and which are sufficiently powerful that small quantities can destroy an aircraft in flight. Moreover, the biggest difficulty arises from the long detection time required by explosive detection systems based on thermal neutron interrogation, which involves exposing baggage to slow neutrons having energies on the order of 0.025 eV. The elemental compositions of explosives can be determined by Neutron Induced Prompt gamma Spectroscopy (NIPS), which has been installed at the Korea Atomic Energy Research Institute as a tool for the detection of explosives in passenger baggage. In this work, factor analysis has been applied to the NIPS system to increase the signal-to-noise ratio of the prompt gamma spectrum for the detection of explosives hidden in a passenger's baggage, especially for the noisy prompt gamma spectra obtained with short measurement times.

  9. Entrepreneurship Learning Process by using SWOT Analysis

    Directory of Open Access Journals (Sweden)

    Jajat Sudrajat

    2016-03-01

    Full Text Available The research objective was to produce a model of entrepreneurship learning by using SWOT analysis of the program currently being run with the concept of large classes and small classes. This study was expected to be useful for the Binus Entrepreneurship Center (BEC) unit in creating a development map for entrepreneurship learning. The influences generated by using SWOT analysis are wide-ranging, as are the benefits of implementing large and small classes for students and faculty. The participants of this study were Binus students of various majors who were taking the EN001 and EN002 courses. The study used a research-and-development approach examining the theoretical learning components of entrepreneurship education (the teaching and learning dimension), in which six survey dimensions formed a fundamental element in determining the framework of entrepreneurship education. Based on the matrix of factors, the research finds at least eight strategies for improving the entrepreneurship learning process, one of which is to increase BEC's collaboration with family support. This strategy is supported by the survey results from the three majors following EN001 and EN002, in which more than 85% of the students were willing to take an aptitude test to determine their strengths and weaknesses for self-development, and more than 54% of the students were not willing to follow their parents' wishes because these did not correspond to their own ambitions. Based on the above results, it is suggested that further research develop this line of entrepreneurship research by analyzing other dimensions.

  10. Data Set for the manuscript entitled, "Sample Processing Approach for Detection of Ricin in Surface Samples."

    Data.gov (United States)

    U.S. Environmental Protection Agency — Figure. This dataset is associated with the following publication: Shah, S., S. Kane, A.M. Erler, and T. Alfaro. Sample Processing Approach for Detection of Ricin in...

  11. Detecting Phonemes and Letters in Text: Interactions Between Different Types and Levels of Processes

    National Research Council Canada - National Science Library

    Schnelder, Vivian

    1998-01-01

    .... In addition, visual word unitization processes were implicated. Experiments 3 and 4 provided support for the hypothesis that the Gestalt goodness of pattern affected detection errors when subjects searched for letters...

  12. Leak detection in pipelines through spectral analysis of pressure signals

    Directory of Open Access Journals (Sweden)

    Souza A.L.

    2000-01-01

    Full Text Available The development and testing of a technique for leak detection in pipelines is presented. The technique is based on the spectral analysis of pressure signals measured in pipeline sections where the formation of stationary waves is favoured, allowing leakage detection during the start/stop of pumps. Experimental tests were performed in a 1250 m long pipeline for various operational conditions (liquid flow rate and leakage configuration). Pressure transients were obtained by four transducers connected to a PC computer. The results show that the spectral analysis of pressure transients, together with knowledge of the reflection points, provides a simple and efficient way of identifying leaks during the start/stop of pumps in pipelines.
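
    As a sketch of the spectral step, assuming a uniformly sampled pressure record: the transient is windowed, its amplitude spectrum computed, and the strongest lines taken as candidate standing-wave frequencies. A peak near f can then be related to the distance L to a reflection point (roughly f = a/(4L) for wave speed a in a quarter-wave resonance); the paper's exact procedure is not reproduced in this record.

        import numpy as np

        def pressure_spectrum(p, fs):
            """Amplitude spectrum of a pressure transient sampled at fs Hz."""
            p = p - np.mean(p)                          # remove static head
            spec = np.abs(np.fft.rfft(p * np.hanning(len(p))))
            freqs = np.fft.rfftfreq(len(p), d=1.0 / fs)
            return freqs, spec

        def dominant_lines(freqs, spec, n=5):
            """Return the n strongest spectral lines (candidate standing waves)."""
            idx = np.argsort(spec)[::-1][:n]
            return sorted(zip(freqs[idx], spec[idx]))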

  13. Detection, Source Location, and Analysis of Volcano Infrasound

    Science.gov (United States)

    McKee, Kathleen F.

    The study of volcano infrasound focuses on low frequency sound from volcanoes, how volcanic processes produce it, and the path it travels from the source to our receivers. In this dissertation we focus on detecting, locating, and analyzing infrasound from a number of different volcanoes using a variety of analysis techniques. These works will help inform future volcano monitoring using infrasound with respect to infrasonic source location, signal characterization, volatile flux estimation, and back-azimuth to source determination. Source location is an important component of the study of volcano infrasound and in its application to volcano monitoring. Semblance is a forward grid search technique and a common source location method in infrasound studies as well as seismology. We evaluated the effectiveness of semblance in the presence of significant topographic features for explosions of Sakurajima Volcano, Japan, while taking into account temperature and wind variations. We show that topographic obstacles at Sakurajima cause a semblance source location offset of 360-420 m to the northeast of the actual source location. In addition, we found that, despite the consistent offset in source location, semblance can still be a useful tool for determining periods of volcanic activity. Infrasonic signal characterization follows signal detection and source location in volcano monitoring in that it informs us of the type of volcanic activity detected. In large volcanic eruptions the lowermost portion of the eruption column is momentum-driven and termed the volcanic jet or gas-thrust zone. This turbulent fluid-flow perturbs the atmosphere and produces a sound similar to that of jet and rocket engines, known as jet noise. We deployed an array of infrasound sensors near an accessible, less hazardous, fumarolic jet at Aso Volcano, Japan as an analogue to large, violent volcanic eruption jets. We recorded volcanic jet noise at 57.6° from vertical, a recording angle not normally feasible
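
    Semblance, the grid-search statistic evaluated in this dissertation, can be sketched as follows under simplifying assumptions (straight rays and a homogeneous sound speed, so the topographic, temperature, and wind effects the author accounts for are ignored); names and parameters are illustrative.

        import numpy as np

        def semblance(traces, delays, fs):
            """Semblance of N infrasound traces for one candidate source.

            traces : (N, T) array; delays : predicted travel time (s) per sensor.
            Returns 1.0 for perfectly coherent, correctly aligned arrivals.
            """
            shifts = np.round(np.asarray(delays) * fs).astype(int)
            shifts -= shifts.min()
            length = traces.shape[1] - shifts.max()
            aligned = np.stack([tr[s:s + length]
                                for tr, s in zip(traces, shifts)])
            num = np.sum(np.sum(aligned, axis=0) ** 2)
            den = aligned.shape[0] * np.sum(aligned ** 2)
            return num / den

        def grid_search(traces, grid_points, sensor_xy, c, fs):
            """Pick the candidate source position maximizing semblance."""
            return max(grid_points,
                       key=lambda p: semblance(
                           traces,
                           [np.linalg.norm(p - s) / c for s in sensor_xy],
                           fs))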

  14. Image Segmentation and Processing for Efficient Parking Space Analysis

    OpenAIRE

    Tutika, Chetan Sai; Vallapaneni, Charan; R, Karthik; KP, Bharath; Muthu, N Ruban Rajesh Kumar

    2018-01-01

    In this paper, we develop a method to detect vacant parking spaces in an environment with unclear segments and contours with the help of MATLAB image processing capabilities. Due to the anomalies present in parking spaces, such as uneven illumination, distorted slot lines, and overlapping cars, present-day conventional algorithms have difficulty processing the images for accurate results. The proposed algorithm uses a combination of image pre-processing and false contour detection ...

  15. Criteria for assessing the quality of signal processing techniques for acoustic leak detection

    International Nuclear Information System (INIS)

    Prabhakar, R.; Singh, O.P.

    1990-01-01

    In this paper the criteria used in the first IAEA coordinated research programme to assess the quality of signal processing techniques for sodium boiling noise detection are highlighted. Signal processing techniques, using new features sensitive to boiling and a new approach for achieving higher reliability of detection, which were developed at Indira Gandhi Centre for Atomic Research are also presented. 10 refs, 3 figs, 2 tabs

  16. Establishment of analysis method for methane detection by gas chromatography

    Science.gov (United States)

    Liu, Xinyuan; Yang, Jie; Ye, Tianyi; Han, Zeyu

    2018-02-01

    The study focused on the establishment of an analysis method for methane determination by gas chromatography. Methane was detected by a hydrogen flame ionization detector, and the quantitative relationship was determined by the working curve y = 2041.2x + 2187 with a correlation coefficient of 0.9979. A relative standard deviation of 2.60-6.33% and a recovery rate of 96.36%∼105.89% were obtained during the parallel determination of standard gas. This method is not well suited for biogas content analysis because the methane content in biogas would exceed its measurement range.
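
    Applying the reported working curve amounts to inverting y = 2041.2x + 2187 for an unknown sample; units follow the original calibration and are not specified in this record.

        def methane_content(peak_response):
            """Estimate methane content x from a measured detector response y
            using the reported working curve y = 2041.2 x + 2187."""
            return (peak_response - 2187.0) / 2041.2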

  17. Process Simulation Analysis of HF Stripping

    Directory of Open Access Journals (Sweden)

    Thaer A. Abdulla

    2015-02-01

    Full Text Available HYSYS process simulator is used for the analysis of an existing HF stripping column in the LAB plant (Arab Detergent Company, Baiji, Iraq). Simulated column performance and profile curves are constructed. The variables considered are the thermodynamic model option, bottom temperature, feed temperature, and the column profiles for temperature, vapor flow rate, liquid flow rate, and composition. The five thermodynamic model options used (Margules, UNIQUAC, van Laar, Antoine, and Zudkevitch-Joffee) affect the results within 0.1-58% variation in most cases. The simulated results show that about 4% of paraffin (C10 & C11) is present in the top stream, which may cause a problem in the LAB production plant. The major variations were noticed in the total top vapor flow rate with bottom temperature and with feed composition. The column profiles remain fairly constant from tray 5 to tray 18. The study gives evidence of a successful simulation with HYSYS, as the results correspond with the real plant operation data.

  18. Natural and professional help: a process analysis.

    Science.gov (United States)

    Tracey, T J; Toro, P A

    1989-08-01

    Differences in the helping interactions formed by mental health professionals, divorce lawyers, and mutual help group leaders were examined. Fourteen members of each of these three helper groups (N = 42) met independently with a coached client presenting marital difficulties. Using ratings of ability to ameliorate the personal and emotional problems presented, the 42 helpers were divided (using a median split) into successful and less successful outcome groups. The responses of each of the pairs were coded using the Hill Counselor Verbal Response Category System. The sequences of client-helper responses were examined using log-linear analysis as they varied by type of helper and outcome. Results indicated that successful helpers (regardless of type) tended to use directives (e.g., guidance and approval-reassurance) differently from less successful helpers. Successful helpers used directives following client emotional expression and not following factual description. In addition, clear differences in helper responses by helper type and outcome were found. Each helper type had unique patterns of responses that differentiated successful from less successful outcomes. Client responses were found to vary across helper type even when given the same preceding helper response. Results are discussed with respect to the unique goals of each helping relationship and the different shaping processes involved in each.

  19. Nonlinear damage detection in composite structures using bispectral analysis

    Science.gov (United States)

    Ciampa, Francesco; Pickering, Simon; Scarselli, Gennaro; Meo, Michele

    2014-03-01

    The literature offers a substantial number of diagnostic methods that can continuously provide detailed information on material defects and damage in aerospace and civil engineering applications. Indeed, low velocity impact damage can considerably degrade the integrity of structural components and, if not detected, can result in catastrophic failure conditions. This paper presents a nonlinear Structural Health Monitoring (SHM) method, based on ultrasonic guided waves (GW), for the detection of the nonlinear signature in a damaged composite structure. The proposed technique, based on a bispectral analysis of ultrasonic input waveforms, allows for the evaluation of the nonlinear response due to the presence of cracks and delaminations. Such a methodology was used to characterize the nonlinear behaviour of the structure by exploiting the frequency mixing of the original waveform acquired from a sparse array of sensors. The robustness of bispectral analysis was experimentally demonstrated on a damaged carbon fibre reinforced plastic (CFRP) composite panel, and the nonlinear source was retrieved with a high level of accuracy. Unlike other linear and nonlinear ultrasonic methods for damage detection, this methodology requires neither a baseline measurement of the undamaged structure for the evaluation of the nonlinear source, nor a priori knowledge of the mechanical properties of the specimen. Moreover, bispectral analysis can be considered a nonlinear elastic wave spectroscopy (NEWS) technique for materials showing either classical or non-classical nonlinear behaviour.
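
    A direct, segment-averaged bispectrum estimate of the kind underlying such analyses can be sketched as follows; this illustrative NumPy code is far simpler than the authors' guided-wave processing chain. Off-axis peaks in |B(k1, k2)| indicate quadratic frequency mixing, the nonlinear signature sought.

        import numpy as np

        def bispectrum(x, nfft=256):
            """Estimate B(k1, k2) = E[X(k1) X(k2) conj(X(k1 + k2))] by
            averaging over non-overlapping windowed segments of x."""
            segs = [x[i:i + nfft] for i in range(0, len(x) - nfft + 1, nfft)]
            B = np.zeros((nfft // 2, nfft // 2), dtype=complex)
            for seg in segs:
                X = np.fft.fft(seg * np.hanning(nfft))
                for k1 in range(nfft // 2):
                    for k2 in range(nfft // 2 - k1):
                        B[k1, k2] += X[k1] * X[k2] * np.conj(X[k1 + k2])
            return np.abs(B) / len(segs)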

  20. A piezoresistive cantilever for lateral force detection fabricated by a monolithic post-CMOS process

    International Nuclear Information System (INIS)

    Ji Xu; Li Zhihong; Li Juan; Wang Yangyuan; Xi Jianzhong

    2008-01-01

    This paper presents a post-CMOS process to monolithically integrate a piezoresistive cantilever for lateral force detection with signal processing circuitry. The fabrication process includes a standard CMOS process and one additional lithography step to micromachine the cantilever structure in the post-CMOS process. The piezoresistors are doped in the CMOS process but defined in the post-CMOS micromachining process without any extra processing required. A partially split cantilever configuration is developed for lateral force detection. The piezoresistors are self-aligned to the split cantilever, and therefore the width of the beam is limited only by lithography. Consequently, this kind of cantilever potentially has a high resolution. The preliminary experimental results show the expected performance of the fabricated piezoresistors and electronic circuits.

  1. Foreign object detection and removal to improve automated analysis of chest radiographs

    International Nuclear Information System (INIS)

    Hogeweg, Laurens; Sánchez, Clara I.; Melendez, Jaime; Maduskar, Pragnya; Ginneken, Bram van; Story, Alistair; Hayward, Andrew

    2013-01-01

    Purpose: Chest radiographs commonly contain projections of foreign objects, such as buttons, brassiere clips, jewellery, or pacemakers and wires. The presence of these structures can substantially affect the output of computer analysis of these images. An automated method is presented to detect, segment, and remove foreign objects from chest radiographs. Methods: Detection is performed using supervised pixel classification with a kNN classifier, resulting in a probability estimate per pixel of belonging to a projected foreign object. Segmentation is performed by grouping and post-processing pixels with a probability above a certain threshold. Next, the objects are replaced by texture inpainting. Results: The method is evaluated in experiments on 257 chest radiographs. The detection at pixel level is evaluated with receiver operating characteristic analysis on pixels within the unobscured lung fields, and an Az value of 0.949 is achieved. Free-response operating characteristic analysis is performed at the object level, and 95.6% of objects are detected with on average 0.25 false positive detections per image. To investigate the effect of removing the detected objects through inpainting, a texture analysis system for tuberculosis detection is applied to images with and without pathology and with and without foreign object removal. Unprocessed, the texture analysis abnormality score of normal images with foreign objects is comparable to those with pathology. After removing foreign objects, the texture scores of normal images with and without foreign objects are similar, while abnormal images, whether they contain foreign objects or not, achieve on average higher scores. Conclusions: The authors conclude that removal of foreign objects from chest radiographs is feasible and beneficial for automated image analysis.
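
    A schematic of the detection stage, assuming per-pixel feature vectors have already been extracted (the feature choice here is hypothetical, and scikit-learn's kNN stands in for the authors' classifier):

        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier

        def train_pixel_classifier(X_train, y_train, k=15):
            """X_train: (n_pixels, n_features); y_train: 1 = foreign object."""
            return KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train)

        def object_probability_map(clf, features):
            """Per-pixel probability of belonging to a projected foreign
            object; features has shape (height, width, n_features)."""
            h, w, f = features.shape
            proba = clf.predict_proba(features.reshape(-1, f))[:, 1]
            return proba.reshape(h, w)

        def segment(prob_map, threshold=0.5):
            """Supra-threshold mask; grouping, post-processing, and texture
            inpainting of the detected objects are not shown."""
            return prob_map > threshold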

  2. Application of process monitoring to anomaly detection in nuclear material processing systems via system-centric event interpretation of data from multiple sensors of varying reliability

    International Nuclear Information System (INIS)

    Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao; Carlson, Reed B.; Yoo, Tae-Sic

    2017-01-01

    Highlights: • Process monitoring can strengthen nuclear safeguards and material accountancy. • Assessment is conducted at a system-centric level to improve safeguards effectiveness. • Anomaly detection is improved by integrating process and operation relationships. • Decision making is benefited from using sensor and event sequence information. • Formal framework enables optimization of sensor and data processing resources. - Abstract: In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework. This utilizes architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved. This is because decision-making can benefit from not only known time-series relationships among measured signals but also from known event sequence relationships among generated events. This available knowledge at both time series and discrete event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The application of the proposed approach is then implemented on an illustrative monitored system based on pyroprocessing and results are discussed.

  3. Development of vibrational analysis for detection of antisymmetric shells

    International Nuclear Information System (INIS)

    Esmailzadeh Khadem, S.; Mahmoodi, M.; Rezaee, M.

    2002-01-01

    In this paper, the vibrational behavior of bodies of revolution with different types of structural faults is studied. Since the vibrational characteristics of structures are natural properties of the system, the existence of any structural fault causes measurable changes in these properties, as demonstrated here. The vibrational behavior of a body of revolution with no structural faults is analyzed by two methods: I) numerical analysis using the SuperSAP software, and II) experimental modal analysis; natural frequencies and mode shapes are obtained. Then, different types of cracks are introduced in the structure, the analysis is repeated, and the results are compared. Based on this study, one may perform crack detection by measuring the natural frequencies and mode shapes of the samples and comparing them with reference information obtained from the vibration analysis of the original structure with no fault

  4. Limits of detection in instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    Guinn, V.P.

    1990-01-01

    Lower limits of detection (LLODs), frequently referred to simply as limits of detection and abbreviated as LODs, often appear in the literature of analytical chemistry - for numerous different methods of elemental and/or molecular analysis. In this chapter, one particular method of quantitative elemental analysis, that of instrumental neutron activation analysis (INAA), is the subject discussed, with reference to LODs. Particularly in the literature of neutron activation analysis (NAA), many tables of 'interference-free' NAA LODs are available. Not all of these are of much use, because (1) for many the definition used for LOD is not clear, or reasonable, (2) for many, the analysis conditions used are not clearly specified, and (3) for many, the analysis conditions used are specified, but not very practicable for most laboratories. For NAA work, such tables of interference-free LODs are, in any case, only applicable to samples in which, at the time of counting, only one radionuclide is present to any significant extent in the activated sample. It is important to note that tables of INAA LODs, per se, do not exist - since the LOD for a given element, under stated analysis conditions, can vary by orders of magnitude, depending on the elemental composition of the matrix in which it is present. For any given element, its INAA LOD will always be as large as, and usually much larger than, its tabulated 'interference-free' NAA LOD - how much larger depending upon the elemental composition of the matrix in which it is present. As discussed in this chapter, however, an INAA computer program exists that can calculate realistic INAA LODs for any elements of interest, in any kind of specified sample matrix, under any given set of analysis conditions
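
    The chapter's INAA LOD program is not reproduced in this record. For a single counting measurement, a widely used classical formulation is Currie's; the sketch below assumes a paired background measurement and converts the count limit to a mass LOD through an assumed elemental sensitivity.

        import math

        def currie_limits(background_counts):
            """Currie's critical level L_C and detection limit L_D (counts,
            ~95% confidence) for a paired background measurement. The
            chapter's program additionally accounts for the full spectrum
            produced by the activated sample matrix."""
            L_C = 2.33 * math.sqrt(background_counts)
            L_D = 2.71 + 4.65 * math.sqrt(background_counts)
            return L_C, L_D

        def mass_lod(background_counts, sensitivity):
            """Mass LOD given a sensitivity in net peak counts per microgram
            (fixed irradiation, decay, and counting conditions assumed)."""
            _, L_D = currie_limits(background_counts)
            return L_D / sensitivity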

  5. Ultrasonic sensor system to detect solids in a milk pasteurization process

    Science.gov (United States)

    Barroeta Z., Carlos; Sanchez M., Fernando L.; Fernando R., G. Moreno; Montes P., Laura

    2002-11-01

    In the food industry, many products require specific processing. In the milk industry, raw milk passes through several process stages before reaching the end user in a qualitative and healthy state. One of the problems with milk is that it can contain solids in suspension, resulting from contamination of the milk or inherent to the pasteurization process itself. In order to control these solids, a solid detection system is being developed, which will detect the solids by the reflection and refraction of ultrasonic waves. The sensor must be set in the upper part of the milk containers, with a grid array allowing the control system to prevent these solids from entering the pipes of the processing plant. The sensing system may activate an acoustic alarm to indicate that a solid has been detected, and a visual one to indicate the affected part of the process. (To be presented in Spanish.)

  6. The application of image processing to the detection of corrosion by radiography

    International Nuclear Information System (INIS)

    Packer, M.E.

    1979-02-01

    The computer processing of digitised radiographs has been investigated with a view to improving x-radiography as a method for detecting corrosion. Linearisation of the image-density distribution in a radiograph has been used to enhance information which can be attributed to corrosion, making the detection of corrosion by radiography both easier and more reliable. However, conclusive evidence has yet to be obtained that image processing can result in the detection of corrosion which was not already faintly apparent on an unprocessed radiograph. A potential method has also been discovered for analysing the history of a corrosion site
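
    The report's linearisation algorithm is not given in this record; classic histogram equalisation, sketched below, is one standard way to flatten the image-density distribution so that faint, corrosion-related density variations occupy a wider range of display values.

        import numpy as np

        def linearise_density(image, bins=256):
            """Map grey levels through the normalised cumulative histogram,
            flattening the density distribution of the digitised radiograph."""
            hist, edges = np.histogram(image.ravel(), bins=bins)
            cdf = hist.cumsum().astype(float)
            cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())
            return np.interp(image.ravel(), edges[:-1], cdf).reshape(image.shape)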

  7. Fault Detection and Diagnosis System in Process industry Based on Big Data and WeChat

    Directory of Open Access Journals (Sweden)

    Sun Zengqiang

    2017-01-01

    Full Text Available Based on big data and WeChat, fault detection and diagnosis information in the process industry can be received on a mobile phone anytime and anywhere, removing the constraint that such information can only be checked on the Distributed Control System (DCS) in the central control room or looked over in the office. Fault detection and diagnosis information sharing can then be provided and, moreover, the fault detection alarm range, codes, and notification times can be personalized. The pressure on managers working in the process industry can be relieved by this mobile information system.

  8. Statistical methods for anomaly detection in the complex process; Methodes statistiques de detection d'anomalies de fonctionnement dans les processus complexes

    Energy Technology Data Exchange (ETDEWEB)

    Al Mouhamed, Mayez

    1977-09-15

    In a number of complex physical systems the accessible signals are often characterized by random fluctuations about a mean value. The fluctuations (signature) often transmit information about the state of the system that the mean value cannot predict. This study is undertaken to elaborate statistical methods of anomaly detection on the basis of signature analysis of the noise inherent in the process. The algorithm presented first learns the characteristics of normal operation of a complex process. Then it detects small deviations from the normal behavior. The algorithm can be implemented in a medium-sized computer for on-line application. (author)

  9. Incrementally Detecting Change Types of Spatial Area Object: A Hierarchical Matching Method Considering Change Process

    Directory of Open Access Journals (Sweden)

    Yanhui Wang

    2018-01-01

    Full Text Available Detecting and extracting the change types of spatial area objects can track area objects' spatiotemporal change patterns and provide a change backtracking mechanism for incrementally updating spatial datasets. To address the high complexity of detection methods, the high redundancy rate of detection factors, and the low degree of automation in the incremental update process, we consider the change process of area objects in an integrated way and propose a hierarchical matching method to detect the nine types of changes of area objects, while minimizing the complexity of the algorithm and the redundancy rate of detection factors. We illustrate in detail the identification, extraction, and database entry of change types, and how we achieve a close connection and organic coupling of incremental information extraction and object type-of-change detection so as to characterize the whole change process. The experimental results show that this method can successfully detect incremental information about area objects in practical applications, with the overall accuracy reaching above 90%, which is much higher than the existing weighted matching method, making it quite feasible and applicable. It helps establish the corresponding relation between new-version and old-version objects and facilitates linked update processing and quality control of spatial data.

  10. Second order analysis for spatial Hawkes processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Torrisi, Giovanni Luca

    We derive summary statistics for stationary Hawkes processes which can be considered as spatial versions of classical Hawkes processes. Particularly, we derive the intensity, the pair correlation function and the Bartlett spectrum. Our results for Gaussian fertility rates and the extension to marked Hawkes processes are discussed.

  11. Power Load Event Detection and Classification Based on Edge Symbol Analysis and Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Lei Jiang

    2012-01-01

    Full Text Available Energy signature analysis of power appliances is the core of nonintrusive load monitoring (NILM), where detailed data on the appliances used in houses are obtained by analyzing changes in the voltage and current. This paper focuses on developing automatic power load event detection and appliance classification based on machine learning. For power load event detection, the paper presents a new transient detection algorithm. By analyzing turn-on and turn-off transient waveforms, it can accurately detect the edge point at which a device is switched on or off. The proposed load classification technique can identify different power appliances with improved recognition accuracy and computational speed. The load classification method is composed of two processes: frequency feature analysis and a support vector machine. The experimental results indicated that incorporating the new edge detection and turn-on/turn-off transient signature analysis into NILM reveals more information than traditional NILM methods. The load classification method achieved a recognition rate of more than ninety percent.
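
    A minimal sketch of the two stages, with a simple step-change detector standing in for the paper's transient algorithm; thresholds, window lengths, and feature choices are illustrative.

        import numpy as np
        from sklearn.svm import SVC

        def detect_events(power, threshold=30.0, window=5):
            """Flag turn-on/turn-off events as large steps in smoothed
            real power (watts); returns candidate edge-point indices."""
            smooth = np.convolve(power, np.ones(window) / window, mode="same")
            return np.flatnonzero(np.abs(np.diff(smooth)) > threshold)

        def frequency_features(transient, n_bins=16):
            """Coarse frequency-domain signature of a transient window."""
            spec = np.abs(np.fft.rfft(transient - transient.mean()))
            return np.array([b.sum() for b in np.array_split(spec, n_bins)])

        # clf = SVC(kernel="rbf").fit(train_features, appliance_labels)
        # label = clf.predict([frequency_features(new_transient)])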

  12. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Semi-Disk Structure

    Science.gov (United States)

    2018-01-01

    This report is the result of applying morphological image and statistical processing techniques to the energy ...

  13. RADIA: RNA and DNA integrated analysis for somatic mutation detection.

    Directory of Open Access Journals (Sweden)

    Amie J Radenbaugh

    Full Text Available The detection of somatic single nucleotide variants is a crucial component of cancer genome characterization. Mutation calling algorithms thus far have focused on comparing the normal and tumor genomes from the same individual. In recent years, it has become routine for projects like The Cancer Genome Atlas (TCGA) to also sequence the tumor RNA. Here we present RADIA (RNA and DNA Integrated Analysis), a novel computational method combining the patient-matched normal and tumor DNA with the tumor RNA to detect somatic mutations. The inclusion of the RNA increases the power to detect somatic mutations, especially at low DNA allelic frequencies. By integrating an individual's DNA and RNA, we are able to detect mutations that would otherwise be missed by traditional algorithms that examine only the DNA. We demonstrate high sensitivity (84%) and very high precision (98% and 99%) for RADIA in patient data from endometrial carcinoma and lung adenocarcinoma from TCGA. Mutations with both high DNA and RNA read support have the highest validation rate of over 99%. We also introduce a simulation package that spikes artificial mutations into patient data, rather than simulating sequencing data from a reference genome. We evaluate sensitivity on the simulation data and demonstrate our ability to rescue mutations at low DNA allelic frequencies by including the RNA. Finally, we highlight mutations in important cancer genes that were rescued due to the incorporation of the RNA.

  14. Sensor Failure Detection of FASSIP System using Principal Component Analysis

    Science.gov (United States)

    Sudarno; Juarsa, Mulya; Santosa, Kussigit; Deswandri; Sunaryo, Geni Rina

    2018-02-01

    In the Fukushima Daiichi nuclear reactor accident in Japan, the damage to the core and pressure vessel was caused by the failure of the active cooling system (the diesel generators were inundated by the tsunami). Research on passive cooling systems for nuclear power plants is therefore performed to improve the safety aspects of nuclear reactors. The FASSIP system (Passive System Simulation Facility) is an installation used to study the characteristics of passive cooling systems at nuclear power plants. The accuracy of the sensor measurements of the FASSIP system is essential, because they are the basis for determining the characteristics of a passive cooling system. In this research, a sensor failure detection method for the FASSIP system is developed, so that indications of sensor failures can be detected early. The method used is Principal Component Analysis (PCA) to reduce the dimensionality of the sensor data, with the Squared Prediction Error (SPE) and Hotelling's T² statistic as criteria for detecting sensor failure indications. The results show that the PCA method is capable of detecting the occurrence of a failure at any sensor.
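
    The PCA/SPE/T² scheme described here follows a standard pattern, sketched below in NumPy assuming a block of normal-operation sensor data is available for training; the statistical control limits applied to both indices are omitted.

        import numpy as np

        def fit_pca_monitor(X, n_components):
            """Fit a PCA monitoring model on normal data X (samples x sensors)."""
            mu, sd = X.mean(axis=0), X.std(axis=0)
            Z = (X - mu) / sd
            vals, vecs = np.linalg.eigh(np.cov(Z, rowvar=False))
            order = np.argsort(vals)[::-1][:n_components]
            return mu, sd, vecs[:, order], vals[order]

        def monitor(x, mu, sd, P, lam):
            """Hotelling T^2 and SPE (Q) statistics for one new sample;
            values above their control limits suggest a sensor failure."""
            z = (x - mu) / sd
            t = P.T @ z                      # scores in the retained subspace
            T2 = np.sum(t ** 2 / lam)
            residual = z - P @ t
            return T2, residual @ residual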

  15. Analysis of the theoretical bias in dark matter direct detection

    International Nuclear Information System (INIS)

    Catena, Riccardo

    2014-01-01

    Fitting the model "A" to dark matter direct detection data, when the model that underlies the data is "B", introduces a theoretical bias in the fit. We perform a quantitative study of the theoretical bias in dark matter direct detection, with a focus on assumptions regarding the dark matter interactions and velocity distribution. We address this problem within the effective theory of isoscalar dark matter-nucleon interactions mediated by a heavy spin-1 or spin-0 particle. We analyze 24 benchmark points in the parameter space of the theory, using frequentist and Bayesian statistical methods. First, we simulate the data of future direct detection experiments assuming a momentum/velocity dependent dark matter-nucleon interaction and an anisotropic dark matter velocity distribution. Then, we fit a constant scattering cross section and an isotropic Maxwell-Boltzmann velocity distribution to the simulated data, thereby introducing a bias in the analysis. The best fit values of the dark matter particle mass differ from their benchmark values by up to 2 standard deviations. The best fit values of the dark matter-nucleon coupling constant differ from their benchmark values by up to several standard deviations. We conclude that common assumptions in dark matter direct detection are a source of potentially significant bias.

  16. Phishing Detection: Analysis of Visual Similarity Based Approaches

    Directory of Open Access Journals (Sweden)

    Ankit Kumar Jain

    2017-01-01

    Full Text Available Phishing is one of the major problems faced by the cyber-world and leads to financial losses for both industries and individuals. Detecting phishing attacks with high accuracy has always been a challenging issue. At present, visual similarity based techniques are very useful for detecting phishing websites efficiently. A phishing website looks very similar in appearance to its corresponding legitimate website in order to deceive users into believing that they are browsing the correct website. Visual similarity based phishing detection techniques utilise features such as text content, text format, HTML tags, Cascading Style Sheets (CSS), and images to make the decision. These approaches compare the suspicious website with the corresponding legitimate website using various features, and if the similarity is greater than a predefined threshold value the site is declared phishing. This paper presents a comprehensive analysis of phishing attacks and their exploitation, some of the recent visual similarity based approaches for phishing detection, and a comparative study of them. Our survey provides a better understanding of the problem, the current solution space, and the scope of future research on dealing with phishing attacks efficiently using visual similarity based approaches.

  17. Detection of irradiated chicken by 2-alkylcyclobutanone analysis

    International Nuclear Information System (INIS)

    Tanabe, Hiroko; Goto, Michiko; Miyahara, Makoto

    2001-01-01

    Chicken meat irradiated at 0.5 kGy or higher doses was identified by a GC/MS method analyzing 2-dodecylcyclobutanone (2-DCB) and 2-tetradecylcyclobutanone (2-TCB), which are formed from palmitic acid and stearic acid respectively, and isolated using Soxhlet extraction and Florisil chromatography. Many fat-containing foods have oleic acid in abundance as the parent fatty acid, and chicken meat contains palmitoleic acid in an amount comparable to stearic acid. In this study, we detected 2-tetradec-5'-enylcyclobutanone (2-TeCB) and 2-dodec-5'-enylcyclobutanone (2-DeCB) in chicken meat, which are formed by irradiation from oleic acid and palmitoleic acid respectively, using the GC/MS method. The detection sensitivity for both 2-TeCB and 2-DeCB was lower than that for 2-DCB. However, at least 0.57 μg/g fat of 2-TeCB was detected in chicken meat irradiated at 0.5 kGy, so 2-TeCB seems to be a useful marker for the identification of irradiated foods containing fat. In contrast, 2-DeCB was not clearly detected at low doses. This suggests that 2-DeCB may be a useful marker for irradiated fat only in foods containing a sufficient amount of palmitoleic acid for analysis. In addition, 2-tetradecadienylcyclobutanone, which is formed from linoleic acid, was also found in chicken meat. (author)

  18. Detection of hidden explosives by fast neutron activation analysis

    International Nuclear Information System (INIS)

    Li Xinnian; Guo Junpeng; Luo Wenyun; Wang Chuanshan; Fang Xiaoming; Yu Tailiu

    2008-01-01

    The paper describes the method and principle of detecting hidden explosives by fast neutron activation analysis (FNAA). Detection of explosives by FNAA has the specific advantages of simple measurement equipment, high reliability, and low detection cost, and would lend itself to application and popularization in the field of national security. The contents of nitrogen and oxygen in four explosives, more than ten common materials, and TNT samples covered with soil were measured by FNAA. 14 MeV fast neutrons were generated from the (d, t) reaction with a 400 kV Cockcroft-Walton type accelerator. The two-dimensional distributions of the nitrogen and oxygen counting rates per unit mass of the examined materials were obtained, and the characteristic areas of explosives and non-explosives could be defined. By computer-aided pattern recognition, the samples were identified with low false alarm and omission rates. A Monte Carlo simulation indicates that there is no significant radiation at 15 m from the neutron source and that handling is safe 1 h after irradiation. It is suggested that FNAA may have potential in remotely controlled hidden explosive detection systems with large multi-probe arrays. (authors)

  19. A New MANET Wormhole Detection Algorithm Based on Traversal Time and Hop Count Analysis

    Directory of Open Access Journals (Sweden)

    Göran Pulkkis

    2011-11-01

    Full Text Available As demand increases for ubiquitous network facilities, infrastructure-less and self-configuring systems like Mobile Ad hoc Networks (MANET are gaining popularity. MANET routing security however, is one of the most significant challenges to wide scale adoption, with wormhole attacks being an especially severe MANET routing threat. This is because wormholes are able to disrupt a major component of network traffic, while concomitantly being extremely difficult to detect. This paper introduces a new wormhole detection paradigm based upon Traversal Time and Hop Count Analysis (TTHCA, which in comparison to existing algorithms, consistently affords superior detection performance, allied with low false positive rates for all wormhole variants. Simulation results confirm that the TTHCA model exhibits robust wormhole route detection in various network scenarios, while incurring only a small network overhead. This feature makes TTHCA an attractive choice for MANET environments which generally comprise devices, such as wireless sensors, which possess a limited processing capability.
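
    The core idea, that a route's measured traversal time must be consistent with its hop count, can be caricatured in a few lines; the per-hop bound and the treatment of processing delay are illustrative, not the paper's calibrated values.

        # Assumed upper bound on legitimate one-hop traversal time (seconds).
        MAX_PER_HOP_S = 0.004

        def wormhole_suspected(rtt_s, hop_count, processing_s=0.0):
            """Flag a route whose average per-hop time exceeds the physical
            bound, hinting at a hidden link not reflected in the hop count."""
            if hop_count <= 0:
                return False
            per_hop = (rtt_s - processing_s) / (2 * hop_count)
            return per_hop > MAX_PER_HOP_S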

  20. LANDSAT-8 OPERATIONAL LAND IMAGER CHANGE DETECTION ANALYSIS

    Directory of Open Access Journals (Sweden)

    W. Pervez

    2017-05-01

    Full Text Available This paper investigated the potential utility of the Landsat-8 Operational Land Imager (OLI) for change detection analysis and mapping applications, given its superior technical design relative to previous Landsat series. The OLI data were successfully classified with a support vector machine (SVM) with regard to all six test classes (i.e., bare land, built-up land, mixed trees, bushes, dam water, and channel water). OLI SVM-classified data for the four seasons (i.e., spring, autumn, winter, and summer) were used to derive change detection results for six cases: (1) winter to spring, which showed a reduction in mapped dam water and an increase in bushes; (2) winter to summer, which showed a reduction in mapped dam water and an increase in vegetation; (3) winter to autumn, which showed an increase in mapped dam water; (4) spring to summer, which showed a reduction in vegetation and shallow water; (5) spring to autumn, which showed a decrease in vegetation; and (6) summer to autumn, which showed an increase in bushes and vegetation. The OLI SVM-classified data yielded high overall accuracy and kappa coefficient and were thus found suitable for change detection analysis.

  1. Automated analysis for detecting beams in laser wakefield simulations

    International Nuclear Information System (INIS)

    Ushizima, Daniela M.; Rubel, Oliver; Prabhat, Mr.; Weber, Gunther H.; Bethel, E. Wes; Aragon, Cecilia R.; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Hamann, Bernd; Messmer, Peter; Hagen, Hans

    2008-01-01

    Laser wakefield particle accelerators have shown the potential to generate electric fields thousands of times higher than those of conventional accelerators. The resulting extremely short particle acceleration distance could yield a potential new compact source of energetic electrons and radiation, with wide applications from medicine to physics. Physicists investigate laser-plasma internal dynamics by running particle-in-cell simulations; however, this generates large datasets that require time-consuming, manual inspection by experts in order to detect key features such as beam formation. This paper describes a framework to automate the analysis and classification of simulation data. First, we propose a new method to identify locations with a high density of particles in the space-time domain, based on maximum extremum point detection on the particle distribution. We analyze high density electron regions using a lifetime diagram by organizing and pruning the maximum extrema as nodes in a minimum spanning tree. Second, we partition the multivariate data using fuzzy clustering to detect time steps in an experiment that may contain a high quality electron beam. Finally, we combine results from fuzzy clustering and bunch lifetime analysis to estimate spatially confined beams. We demonstrate our algorithms successfully on four different simulation datasets.

  2. Low-level processing for real-time image analysis

    Science.gov (United States)

    Eskenazi, R.; Wilf, J. M.

    1979-01-01

    A system that detects object outlines in television images in real time is described. A high-speed pipeline processor transforms the raw image into an edge map and a microprocessor, which is integrated into the system, clusters the edges, and represents them as chain codes. Image statistics, useful for higher level tasks such as pattern recognition, are computed by the microprocessor. Peak intensity and peak gradient values are extracted within a programmable window and are used for iris and focus control. The algorithms implemented in hardware and the pipeline processor architecture are described. The strategy for partitioning functions in the pipeline was chosen to make the implementation modular. The microprocessor interface allows flexible and adaptive control of the feature extraction process. The software algorithms for clustering edge segments, creating chain codes, and computing image statistics are also discussed. A strategy for real time image analysis that uses this system is given.

  3. In-Process Detection of Weld Defects Using Laser-Based Ultrasound

    International Nuclear Information System (INIS)

    Bacher, G.D.; Kercel, S.W.; Kisner, R.A.; Klein, M.B.; Pouet, B.

    1999-01-01

    Laser-based ultrasonic (LBU) measurement shows great promise for on-line monitoring of weld quality in tailor-welded blanks. Tailor-welded blanks are steel blanks made from plates of differing thickness and/or properties butt-welded together; they are used in automobile manufacturing to produce body, frame, and closure panels. LBU uses a pulsed laser to generate the ultrasound and a continuous wave (CW) laser interferometer to detect the ultrasound at the point of interrogation to perform ultrasonic inspection. LBU enables in-process measurements since there is no sensor contact or near-contact with the workpiece. The authors are using laser-generated plate (Lamb) waves to propagate from one plate into the weld nugget as a means of detecting defects. This paper reports the results of the investigation of a number of inspection architectures based on processing of signals from selected plate waves, which are either reflected from or transmitted through the weld zone. Bayesian parameter estimation and wavelet analysis (both continuous and discrete) have shown that the LBU time-series signal is readily separable into components that provide distinguishing features which describe weld quality. The authors anticipate that, in an on-line industrial application, these measurements can be implemented just downstream from the weld cell. Then the weld quality data can be fed back to control critical weld parameters or alert the operator of a problem requiring maintenance. Internal weld defects and deviations from the desired surface profile can then be corrected before defective parts are produced

  4. A factor analysis to detect factors influencing building national brand

    Directory of Open Access Journals (Sweden)

    Naser Azad

    Full Text Available Developing a national brand is one of the most important issues in the development of a brand. In this study, we present a factor analysis to detect the most important factors in building a national brand. The proposed study uses factor analysis to extract the most influential factors, and the sample has been chosen from two major automakers in Iran, Iran Khodro and Saipa. The questionnaire was designed on a Likert scale and distributed among 235 experts. Cronbach's alpha is calculated as 0.84, which is well above the minimum desirable limit of 0.70. The implementation of factor analysis provides six factors, including "cultural image of customers", "exciting characteristics", "competitive pricing strategies", "perception image", and "previous perceptions".

  5. Error detection in GPS observations by means of Multi-process models

    DEFF Research Database (Denmark)

    Thomsen, Henrik F.

    2001-01-01

    The main purpose of this article is to present the idea of using Multi-process models as a method of detecting errors in GPS observations. The theory behind Multi-process models, and double differenced phase observations in GPS is presented shortly. It is shown how to model cycle slips in the Mul...

  6. Influence of volume of sample processed on detection of Chlamydia trachomatis in urogenital samples by PCR

    NARCIS (Netherlands)

    Goessens, W H; Kluytmans, J A; den Toom, N; van Rijsoort-Vos, T H; Niesters, B G; Stolz, E; Verbrugh, H A; Quint, W G

    In the present study, it was demonstrated that the sensitivity of the PCR for the detection of Chlamydia trachomatis is influenced by the volume of the clinical sample which is processed in the PCR. An adequate sensitivity for PCR was established by processing at least 4%, i.e., 80 microliters, of

  7. Process variant comparison: using event logs to detect differences in behavior and business rules

    NARCIS (Netherlands)

    Bolt, A.; de Leoni, M.; van der Aalst, W.M.P.

    2018-01-01

    This paper addresses the problem of comparing different variants of the same process. We aim to detect relevant differences between processes based on what was recorded in event logs. We use transition systems to model behavior and to highlight differences. Transition systems are annotated with

  8. Delineated Analysis of Robotic Process Automation Tools

    OpenAIRE

    Ruchi Isaac; Riya Muni; Kenali Desai

    2017-01-01

    In this day and age, when celerity is expected of all sectors of the country, the speed of execution of various processes, and hence efficiency, becomes a prominent factor. To meet the speed demands of these diverse platforms, Robotic Process Automation (RPA) is used. Robotic Process Automation can expedite back-office tasks in commercial industries, remote management tasks in IT industries, and the conservation of resources in multiple sectors. To implement RPA, many software ...

  9. SCHEME ANALYSIS TREE DIMENSIONS AND TOLERANCES PROCESSING

    OpenAIRE

    Constanta RADULESCU; Liviu Marius CÎRŢÎNĂ; Constantin MILITARU

    2011-01-01

    This paper presents one of the steps that helps to determine the optimal tolerances depending on the technological capability of the processing equipment. To determine the tolerances in this way, it is necessary to study and represent schematically the operations used in the technological process of making a piece. In this phase, the tree diagram of the dimensions and machining tolerances will also be made, showing the dimensions and tolerances of the design execution. Determination processes, and ...

  10. Bayesian analysis of Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2006-01-01

    Recently Møller, Pettitt, Berthelsen and Reeves introduced a new MCMC methodology for drawing samples from a posterior distribution when the likelihood function is only specified up to a normalising constant. We illustrate the method in the setting of Bayesian inference for Markov point processes, using a partially ordered Markov point process as the auxiliary variable. As the method requires simulation from the "unknown" likelihood, perfect simulation algorithms for spatial point processes become useful.

  11. Amorphous silicon batch process cost analysis

    International Nuclear Information System (INIS)

    Whisnant, R.A.; Sherring, C.

    1993-08-01

    This report describes the development of baseline manufacturing cost data to assist PVMaT monitoring teams in assessing current and future subcontracts, with an emphasis on commercialization and production. A process for the manufacture of a single-junction, large-area, a-Si module was modeled using an existing Research Triangle Institute (RTI) computer model. The model estimates a required, or breakeven, price for the module based on its production process and the financial structure of the company operating the process. Sufficient detail on cost drivers is presented so that the process features and business characteristics can be related to the estimated required price

  12. Analysis of Public Datasets for Wearable Fall Detection Systems.

    Science.gov (United States)

    Casilari, Eduardo; Santoyo-Ramón, José-Antonio; Cano-García, José-Manuel

    2017-06-27

    Due to the boom of wireless handheld devices such as smartwatches and smartphones, wearable Fall Detection Systems (FDSs) have become a major focus of attention among the research community during the last years. The effectiveness of a wearable FDS must be contrasted against a wide variety of measurements obtained from inertial sensors during the occurrence of falls and Activities of Daily Living (ADLs). In this regard, the access to public databases constitutes the basis for an open and systematic assessment of fall detection techniques. This paper reviews and appraises twelve existing available data repositories containing measurements of ADLs and emulated falls envisaged for the evaluation of fall detection algorithms in wearable FDSs. The analysis of the found datasets is performed in a comprehensive way, taking into account the multiple factors involved in the definition of the testbeds deployed for the generation of the mobility samples. The study of the traces brings to light the lack of a common experimental benchmarking procedure and, consequently, the large heterogeneity of the datasets from a number of perspectives (length and number of samples, typology of the emulated falls and ADLs, characteristics of the test subjects, features and positions of the sensors, etc.). Concerning this, the statistical analysis of the samples reveals the impact of the sensor range on the reliability of the traces. In addition, the study evidences the importance of the selection of the ADLs and the need of categorizing the ADLs depending on the intensity of the movements in order to evaluate the capability of a certain detection algorithm to discriminate falls from ADLs.
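
    The repositories reviewed here are typically used to benchmark detection algorithms. A minimal baseline, assuming tri-axial accelerometer traces in units of g sampled at roughly 100 Hz, is the classic threshold test on the acceleration magnitude; the thresholds below are illustrative, and, as the paper notes, the sensor range limits the impact peak that can actually be observed.

        import numpy as np

        def simple_fall_detector(acc, free_fall_g=0.4, impact_g=2.5,
                                 max_gap=100):
            """acc: (n, 3) accelerometer samples in g. Flags a near-free-fall
            dip followed within max_gap samples (~1 s at 100 Hz) by a
            high-g impact."""
            mag = np.linalg.norm(acc, axis=1)
            dips = np.flatnonzero(mag < free_fall_g)
            peaks = np.flatnonzero(mag > impact_g)
            return any(0 < p - d <= max_gap for d in dips for p in peaks)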

  13. Analysis of Public Datasets for Wearable Fall Detection Systems

    Directory of Open Access Journals (Sweden)

    Eduardo Casilari

    2017-06-01

    Full Text Available Due to the boom of wireless handheld devices such as smartwatches and smartphones, wearable Fall Detection Systems (FDSs) have become a major focus of attention among the research community during the last years. The effectiveness of a wearable FDS must be contrasted against a wide variety of measurements obtained from inertial sensors during the occurrence of falls and Activities of Daily Living (ADLs). In this regard, the access to public databases constitutes the basis for an open and systematic assessment of fall detection techniques. This paper reviews and appraises twelve existing available data repositories containing measurements of ADLs and emulated falls envisaged for the evaluation of fall detection algorithms in wearable FDSs. The analysis of the found datasets is performed in a comprehensive way, taking into account the multiple factors involved in the definition of the testbeds deployed for the generation of the mobility samples. The study of the traces brings to light the lack of a common experimental benchmarking procedure and, consequently, the large heterogeneity of the datasets from a number of perspectives (length and number of samples, typology of the emulated falls and ADLs, characteristics of the test subjects, features and positions of the sensors, etc.). Concerning this, the statistical analysis of the samples reveals the impact of the sensor range on the reliability of the traces. In addition, the study evidences the importance of the selection of the ADLs and the need of categorizing the ADLs depending on the intensity of the movements in order to evaluate the capability of a certain detection algorithm to discriminate falls from ADLs.

  14. Robust detection of discordant sites in regional frequency analysis

    NARCIS (Netherlands)

    Neykov, N.M.; Neytchev, P.N.; Van Gelder, P.H.A.J.M.; Todorov, V.K.

    2007-01-01

    The discordancy measure in terms of the sample L-moment ratios (L-CV, L-skewness, L-kurtosis) of the at-site data is widely recommended in the screening process for atypical sites in regional frequency analysis (RFA). The sample mean and the covariance matrix of the L-moment ratios, on which the
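
    For reference, the classical discordancy statistic this record refers to (due to Hosking and Wallis) can be computed as below; the paper proposes robust alternatives precisely because the sample mean and covariance used here are themselves sensitive to outlying sites.

        import numpy as np

        def discordancy(u):
            """Discordancy D_i for each site.

            u : (n_sites, 3) array of sample L-moment ratios
                (L-CV, L-skewness, L-kurtosis); n_sites must exceed 3.
            Sites with large D_i (above ~3 for large regions) are flagged.
            """
            u = np.asarray(u, dtype=float)
            n = u.shape[0]
            dev = u - u.mean(axis=0)
            A_inv = np.linalg.inv(dev.T @ dev)   # 3x3 sums-of-squares matrix
            return np.array([n / 3.0 * d @ A_inv @ d for d in dev])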

  15. Two-stage process analysis using the process-based performance measurement framework and business process simulation

    NARCIS (Netherlands)

    Han, K.H.; Kang, J.G.; Song, M.S.

    2009-01-01

    Many enterprises have recently been pursuing process innovation or improvement to attain their performance goals. To align a business process with enterprise performances, this study proposes a two-stage process analysis for process (re)design that combines the process-based performance measurement

  16. Detection of microbial biofilms on food processing surfaces: hyperspectral fluorescence imaging study

    Science.gov (United States)

    Jun, Won; Kim, Moon S.; Chao, Kaunglin; Lefcourt, Alan M.; Roberts, Michael S.; McNaughton, James L.

    2009-05-01

    We used a portable hyperspectral fluorescence imaging system to evaluate biofilm formations on four types of food processing surface materials including stainless steel, polypropylene used for cutting boards, and household counter top materials such as formica and granite. The objective of this investigation was to determine a minimal number of spectral bands suitable to differentiate microbial biofilm formation from the four background materials typically used during food processing. Ultimately, the resultant spectral information will be used in development of handheld portable imaging devices that can be used as visual aid tools for sanitation and safety inspection (microbial contamination) of the food processing surfaces. Pathogenic E. coli O157:H7 and Salmonella cells were grown in low strength M9 minimal medium on various surfaces at 22 +/- 2 °C for 2 days for biofilm formation. Biofilm autofluorescence under UV excitation (320 to 400 nm) obtained by hyperspectral fluorescence imaging system showed broad emissions in the blue-green regions of the spectrum with emission maxima at approximately 480 nm for both E. coli O157:H7 and Salmonella biofilms. Fluorescence images at 480 nm revealed that for background materials with near-uniform fluorescence responses such as stainless steel and formica cutting board, regardless of the background intensity, biofilm formation can be distinguished. This suggested that a broad spectral band in the blue-green regions can be used for handheld imaging devices for sanitation inspection of stainless, cutting board, and formica surfaces. The non-uniform fluorescence responses of granite make distinctions between biofilm and background difficult. To further investigate potential detection of the biofilm formations on granite surfaces with multispectral approaches, principal component analysis (PCA) was performed using the hyperspectral fluorescence image data. The resultant PCA score images revealed distinct contrast between

  17. Social network analysis in software process improvement

    DEFF Research Database (Denmark)

    Nielsen, Peter Axel; Tjørnehøj, Gitte

    2010-01-01

    Software process improvement in small organisations is often problematic, and communication and knowledge sharing are more informal. To improve software processes, we need to understand how practitioners communicate and share knowledge. In this article we have studied the company SmallSoft through action research...

  18. Shielding analysis of the advanced voloxidation process

    Energy Technology Data Exchange (ETDEWEB)

    Park, Chang Je; Park, J. J.; Lee, J. W.; Shin, J. M.; Park, G. I.; Song, K. C

    2008-09-15

    This report describes how much shielding benefit can be obtained from the advanced voloxidation process. The calculation was performed with the MCNPX code, and a simple problem was modeled with a spent fuel source surrounded by a concrete wall. The source terms were estimated with the ORIGEN-ARP code, and the gamma and neutron spectra were obtained. The required thickness of the concrete wall was estimated before and after the voloxidation process. From the results, the gamma spectrum after the voloxidation process showed a 67% reduction compared with that before the process, due to the removal of several gamma-emitting elements such as cesium and rubidium. The MCNPX calculations showed that the thickness of a general concrete wall could be reduced by 12% after the voloxidation process, while a heavy concrete wall allowed a 28% reduction in shielding of the source term. This is explained by the fact that many gamma-emitting isotopes, such as Pu-241, Y-90, and Sr-90, which are unaffected by the voloxidation process, still remain after it.

  19. Probabilistic analysis of a thermosetting pultrusion process

    NARCIS (Netherlands)

    Baran, Ismet; Tutum, Cem C.; Hattel, Jesper H.

    2016-01-01

    In the present study, the effects of uncertainties in the material properties of the processing composite material and the resin kinetic parameters, as well as process parameters such as pulling speed and inlet temperature, on product quality (exit degree of cure) are investigated for a pultrusion...

  20. Probabilistic analysis of a thermosetting pultrusion process

    DEFF Research Database (Denmark)

    Baran, Ismet; Tutum, Cem C.; Hattel, Jesper Henri

    2016-01-01

    In the present study, the effects of uncertainties in the material properties of the processing composite material and the resin kinetic parameters, as well as process parameters such as pulling speed and inlet temperature, on product quality (exit degree of cure) are investigated for a pultrusio...

  1. Sequential spatial processes for image analysis

    NARCIS (Netherlands)

    M.N.M. van Lieshout (Marie-Colette); V. Capasso

    2009-01-01

    We give a brief introduction to sequential spatial processes. We discuss their definition, formulate a Markov property, and indicate why such processes are natural tools in tackling high level vision problems. We focus on the problem of tracking a variable number of moving objects...

  2. Sequential spatial processes for image analysis

    NARCIS (Netherlands)

    Lieshout, van M.N.M.; Capasso, V.

    2009-01-01

    We give a brief introduction to sequential spatial processes. We discuss their definition, formulate a Markov property, and indicate why such processes are natural tools in tackling high level vision problems. We focus on the problem of tracking a variable number of moving objects through a video

  3. On statistical analysis of compound point process

    Czech Academy of Sciences Publication Activity Database

    Volf, Petr

    2006-01-01

    Roč. 35, 2-3 (2006), s. 389-396 ISSN 1026-597X R&D Projects: GA ČR(CZ) GA402/04/1294 Institutional research plan: CEZ:AV0Z10750506 Keywords: counting process * compound process * hazard function * Cox model Subject RIV: BB - Applied Statistics, Operational Research

  4. Construction Analysis during the Design Process

    NARCIS (Netherlands)

    Vries, de B.; Harink, J.M.J.; Martens, B.; Brown, A.

    2005-01-01

    4D CAD systems are used by contractors for visually checking the construction process. To enable simulation of the construction process, the construction planner links building components from a CAD model with the activities from a project plan. In this paper we describe a method to generate a...

  5. Detecting depression stigma on social media: A linguistic analysis.

    Science.gov (United States)

    Li, Ang; Jiao, Dongdong; Zhu, Tingshao

    2018-05-01

    Efficient detection of depression stigma in mass media is important for designing effective stigma reduction strategies. Using linguistic analysis methods, this paper aims to build computational models for detecting stigma expressions in Chinese social media posts (Sina Weibo). A total of 15,879 Weibo posts with keywords were collected and analyzed. First, a content analysis was conducted on all 15,879 posts to determine whether each of them reflected depression stigma or not. Second, using four algorithms (Simple Logistic Regression, Multilayer Perceptron Neural Networks, Support Vector Machine, and Random Forest), two groups of classification models were built based on selected linguistic features: one for differentiating between posts with and without depression stigma, and one for differentiating among posts with three specific types of depression stigma. As for results, 967 of the 15,879 posts (6.09%) indicated depression stigma; 39.30%, 15.82%, and 14.99% of these endorsed the stigmatizing views "People with depression are unpredictable", "Depression is a sign of personal weakness", and "Depression is not a real medical illness", respectively. The highest F-Measure for differentiating between stigma and non-stigma reached 75.2%, and the highest F-Measure for differentiating among the three specific types of stigma reached 86.2%. Due to the limited and imbalanced dataset of Chinese Weibo posts, the findings of this study might have limited generalizability. This paper confirms that incorporating linguistic analysis methods into online detection of stigma can be beneficial for improving the performance of stigma reduction programs. Copyright © 2018 Elsevier B.V. All rights reserved.
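
    As a rough illustration of the modeling step, the sketch below trains analogues of the four classifier families on labeled posts and reports cross-validated F-measures. It is a minimal sketch, not the authors' pipeline: scikit-learn stand-ins replace the original implementations, TF-IDF features stand in for the paper's selected linguistic features, and the toy posts and labels are hypothetical.

        import numpy as np
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.neural_network import MLPClassifier
        from sklearn.svm import SVC
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        # Hypothetical toy data; the real study used 15,879 annotated Weibo posts.
        posts = ["people with depression are unpredictable", "depression is just weakness",
                 "depression is not a real illness", "snap out of it already",
                 "depression is a treatable medical condition", "support is available",
                 "therapy and medication help people recover", "be kind to those who struggle"]
        labels = np.array([1, 1, 1, 1, 0, 0, 0, 0])  # 1 = stigma, 0 = non-stigma

        X = TfidfVectorizer().fit_transform(posts)   # stand-in for linguistic features
        models = {"Logistic Regression": LogisticRegression(max_iter=1000),
                  "Multilayer Perceptron": MLPClassifier(max_iter=2000),
                  "Support Vector Machine": SVC(),
                  "Random Forest": RandomForestClassifier()}
        for name, model in models.items():
            f1 = cross_val_score(model, X, labels, cv=2, scoring="f1").mean()
            print(f"{name}: F-measure = {f1:.2f}")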

  6. Application of signal processing techniques for islanding detection of distributed generation in distribution network: A review

    International Nuclear Information System (INIS)

    Raza, Safdar; Mokhlis, Hazlie; Arof, Hamzah; Laghari, J.A.; Wang, Li

    2015-01-01

    Highlights: • Pros & cons of conventional islanding detection techniques (IDTs) are discussed. • The ability of signal processing techniques (SPTs) to detect islanding is discussed. • The ability of SPTs to improve the performance of passive techniques is discussed. • Fourier, s-transform, wavelet, HHT & tt-transform based IDTs are reviewed. • Applications of intelligent classifiers (ANN, ANFIS, Fuzzy, SVM) in SPTs are discussed. - Abstract: High penetration of distributed generation resources (DGR) in distribution networks provides many benefits in terms of high power quality, efficiency, and low carbon emissions in the power system. However, efficient islanding detection and immediate disconnection of DGR are critical in order to avoid equipment damage, grid protection interference, and personnel safety hazards. Islanding detection techniques are mainly classified into remote, passive, active, and hybrid techniques. Of these, passive techniques are more advantageous due to lower power quality degradation, lower cost, and widespread usage by power utilities. However, the main limitations of these techniques are that they possess large non-detection zones and require threshold setting. Various signal processing techniques and intelligent classifiers have been used to overcome the limitations of passive islanding detection. Signal processing techniques, in particular, are adopted due to their versatility, stability, cost effectiveness, and ease of modification. This paper presents a comprehensive overview of signal processing techniques used to improve common passive islanding detection techniques. A performance comparison between signal-processing-based islanding detection techniques and existing techniques is also provided. Finally, this paper outlines the relative advantages and limitations of the signal processing techniques in order to provide basic guidelines for researchers and field engineers in determining the best method for their system.
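
    To make the signal processing stage concrete, the sketch below computes a wavelet feature of the kind such schemes feed to a passive detector: the energy of the detail coefficients of the voltage at the point of common coupling, which typically jumps when an islanding transient occurs. It is a hedged illustration, not taken from the paper; the PyWavelets library, the db4 wavelet, the sampling rate, and the synthetic waveform are all assumptions.

        import numpy as np
        import pywt  # PyWavelets, assumed available

        fs = 3200                                  # samples per second (assumed)
        t = np.arange(0, 0.5, 1 / fs)
        v = np.sin(2 * np.pi * 50 * t)             # nominal 50 Hz voltage at the PCC
        # Harmonic distortion standing in for an islanding event at t = 0.25 s.
        v[t > 0.25] += 0.2 * np.sin(2 * np.pi * 350 * t[t > 0.25])

        def detail_energy(window):
            # Energy of the level 1-4 detail coefficients of a db4 decomposition;
            # transients and harmonics spread energy into these bands.
            coeffs = pywt.wavedec(window, "db4", level=4)
            return sum(float(np.sum(c ** 2)) for c in coeffs[1:])

        win = fs // 10                             # 100 ms analysis windows
        energies = [detail_energy(v[i:i + win]) for i in range(0, len(v), win)]
        print(np.round(energies, 2))               # energy rises after the event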

  7. An automatized frequency analysis for vine plot detection and delineation in remote sensing

    OpenAIRE

    Delenne , Carole; Rabatel , G.; Deshayes , M.

    2008-01-01

    The availability of an automatic tool for vine plot detection, delineation, and characterization would be very useful for management purposes. An automatic and recursive process using frequency analysis (with Fourier transform and Gabor filters) has been developed to meet this need. This results in the determination of vine plot boundary and accurate estimation of interrow width and row orientation. To foster large-scale applications, tests and validation have been carried out on standard ver...

  8. Detection of neovascularization based on fractal and texture analysis with interaction effects in diabetic retinopathy.

    Science.gov (United States)

    Lee, Jack; Zee, Benny Chung Ying; Li, Qing

    2013-01-01

    Diabetic retinopathy is a major cause of blindness. Proliferative diabetic retinopathy is a result of severe vascular complication and is visible as neovascularization of the retina. Automatic detection of such new vessels would be useful for the severity grading of diabetic retinopathy, and it is an important part of the screening process to identify those who may require immediate treatment. We propose a novel new-vessel detection method including statistical texture analysis (STA), high order spectrum analysis (HOS), and fractal analysis (FA); most importantly, we have shown that by incorporating their associated interactions the accuracy of new-vessel detection can be greatly improved. To assess its performance, the sensitivity, specificity and accuracy (AUC) were obtained: 96.3%, 99.1% and 98.5% (99.3%), respectively. It is found that the proposed method can improve the accuracy of new-vessel detection significantly over previous methods. The algorithm can be automated and is valuable for detecting relatively severe cases of diabetic retinopathy among diabetes patients.
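
    Of the three feature families, fractal analysis is the easiest to make concrete. The sketch below estimates the box-counting fractal dimension of a binary vessel map; tangled neovascular growth tends to raise this dimension. It is a minimal sketch of the general technique, not the paper's implementation, and the synthetic mask is a stand-in for a segmented retinal image.

        import numpy as np

        def box_counting_dimension(mask, box_sizes=(2, 4, 8, 16, 32)):
            # Count, at each scale s, the number of s-by-s boxes containing vessel
            # pixels; the slope of log N(s) versus log(1/s) estimates the dimension.
            counts = []
            for s in box_sizes:
                h = (mask.shape[0] // s) * s
                w = (mask.shape[1] // s) * s
                tiles = mask[:h, :w].reshape(h // s, s, w // s, s)
                counts.append((tiles.sum(axis=(1, 3)) > 0).sum())
            slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), np.log(counts), 1)
            return slope

        # A crude synthetic "vessel" mask: a diagonal line has dimension close to 1.
        mask = np.eye(128, dtype=bool)
        print(round(box_counting_dimension(mask), 2))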

  9. Detection of neovascularization based on fractal and texture analysis with interaction effects in diabetic retinopathy.

    Directory of Open Access Journals (Sweden)

    Jack Lee

    Full Text Available Diabetic retinopathy is a major cause of blindness. Proliferative diabetic retinopathy is a result of severe vascular complication and is visible as neovascularization of the retina. Automatic detection of such new vessels would be useful for the severity grading of diabetic retinopathy, and it is an important part of the screening process to identify those who may require immediate treatment. We propose a novel new-vessel detection method including statistical texture analysis (STA), high order spectrum analysis (HOS), and fractal analysis (FA); most importantly, we have shown that by incorporating their associated interactions the accuracy of new-vessel detection can be greatly improved. To assess its performance, the sensitivity, specificity and accuracy (AUC) were obtained: 96.3%, 99.1% and 98.5% (99.3%), respectively. It is found that the proposed method can improve the accuracy of new-vessel detection significantly over previous methods. The algorithm can be automated and is valuable for detecting relatively severe cases of diabetic retinopathy among diabetes patients.

  10. Stream computing for biomedical signal processing: A QRS complex detection case-study.

    Science.gov (United States)

    Murphy, B M; O'Driscoll, C; Boylan, G B; Lightbody, G; Marnane, W P

    2015-01-01

    Recent developments in "Big Data" have brought significant gains in the ability to process large amounts of data on commodity server hardware. Stream computing is a relatively new paradigm in this area, addressing the need to process data in real time with very low latency. While this approach has been developed for dealing with large-scale data from the world of business, security and finance, there is a natural overlap with clinical needs for physiological signal processing. In this work we present a case study of stream processing applied to a typical physiological signal processing problem: QRS detection from ECG data.
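
    A QRS detector of the kind such a stream pipeline would host can be sketched compactly. The following is a hedged, Pan-Tompkins-style illustration (band-pass, differentiate, square, integrate, peak-pick), not the system described in the paper; the sampling rate, thresholds, and synthetic signal are assumptions.

        import numpy as np
        from scipy.signal import butter, filtfilt, find_peaks

        def detect_qrs(ecg, fs=250.0):
            # Band-pass 5-15 Hz to emphasize QRS energy and suppress baseline wander.
            b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
            filtered = filtfilt(b, a, ecg)
            # Differentiate, square, then integrate over a 150 ms moving window.
            width = int(0.15 * fs)
            energy = np.convolve(np.diff(filtered) ** 2, np.ones(width) / width, mode="same")
            # Candidate beats: peaks above a crude threshold, at least 300 ms apart.
            peaks, _ = find_peaks(energy, height=0.3 * energy.max(), distance=int(0.3 * fs))
            return peaks

        # Synthetic test: 10 s of noise with beat-like spikes once per second.
        fs, rng = 250, np.random.default_rng(0)
        ecg = 0.05 * rng.normal(size=10 * fs)
        ecg[::fs] += 1.0
        print(len(detect_qrs(ecg, fs)))  # expect roughly 10 detections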

  11. Detection of Moving Targets Based on Doppler Spectrum Analysis Technique for Passive Coherent Radar

    Directory of Open Access Journals (Sweden)

    Zhao Yao-dong

    2013-06-01

    Full Text Available A novel method of moving target detection using Doppler spectrum analysis for Passive Coherent Radar (PCR) is provided. After dividing the received signal into segments treated as a pulse series, the method uses pulse compression and Doppler processing to detect and locate targets. Based on the algorithm for Pulse-Doppler (PD) radar, the equivalence between continuous and pulsed waveforms in matched filtering is proved and details of the method are introduced. To compare it with the traditional method of Cross-Ambiguity Function (CAF) calculation, the relationship between them and their mathematical models are analyzed, with some suggestions on parameter choice. With little influence on target gain, the method can greatly improve processing efficiency. The validity of the proposed method is demonstrated by offline processing of real collected data sets and by simulation results.
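
    The core of the method - segment, pulse-compress against the reference channel, then FFT across segments - can be sketched in a few lines. This is a hedged illustration of generic pulse-Doppler processing under stated assumptions (frequency-domain matched filtering per segment, complex baseband inputs, synthetic data), not the authors' exact algorithm.

        import numpy as np

        def range_doppler_map(ref, surv, n_pulses, pulse_len):
            # Reshape both channels into a slow-time x fast-time matrix of "pulses".
            ref = ref[: n_pulses * pulse_len].reshape(n_pulses, pulse_len)
            surv = surv[: n_pulses * pulse_len].reshape(n_pulses, pulse_len)
            # Pulse compression: frequency-domain matched filter of each
            # surveillance segment against the corresponding reference segment.
            compressed = np.fft.ifft(np.fft.fft(surv, axis=1) * np.conj(np.fft.fft(ref, axis=1)), axis=1)
            # Doppler processing: FFT across pulses for every range (delay) bin.
            return np.fft.fftshift(np.fft.fft(compressed, axis=0), axes=0)

        # Synthetic check: an echo delayed by 5 samples with a Doppler shift.
        rng = np.random.default_rng(0)
        n, m = 64, 128
        ref = rng.normal(size=n * m) + 1j * rng.normal(size=n * m)
        doppler = np.exp(2j * np.pi * 0.1 * np.arange(n * m) / m)
        surv = 0.5 * np.roll(ref, 5) * doppler + 0.1 * rng.normal(size=n * m)
        rd = np.abs(range_doppler_map(ref, surv, n, m))
        print(np.unravel_index(rd.argmax(), rd.shape))  # peak at delay bin 5, nonzero Doppler bin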

  12. A program for activation analysis data processing

    International Nuclear Information System (INIS)

    Janczyszyn, J.; Loska, L.; Taczanowski, S.

    1978-01-01

    An ALGOL program for activation analysis data handling is presented. The program may be used for either single-channel or multichannel spectrometry data. The calculation of instrumental error and of the analysis standard deviation is carried out. Outliers are tested, and the regression line diagram with the related observations is plotted by the program. (author)

  13. Detection and Monitoring of Neurotransmitters - a Spectroscopic Analysis

    Science.gov (United States)

    Manciu, Felicia; Lee, Kendall; Durrer, William; Bennet, Kevin

    2012-10-01

    In this work we demonstrate the capability of confocal Raman mapping spectroscopy for simultaneously and locally detecting important compounds in neuroscience such as dopamine, serotonin, and adenosine. The Raman results show shifting of the characteristic vibrations of the compounds, observations consistent with previous spectroscopic studies. Although some vibrations are common in these neurotransmitters, Raman mapping was achieved by detecting non-overlapping characteristic spectral signatures of the compounds, as follows: for dopamine the vibration attributed to C-O stretching, for serotonin the indole ring stretching vibration, and for adenosine the adenine ring vibrations. Without damage, dyeing, or preferential sample preparation, confocal Raman mapping provided positive detection of each neurotransmitter, allowing association of the high-resolution spectra with specific micro-scale image regions. Such information is particularly important for complex, heterogeneous samples, where modification of the chemical or physical composition can influence the neurotransmission processes. We also report an estimated dopamine diffusion coefficient two orders of magnitude smaller than that calculated by the flow-injection method.

  14. SCHEME ANALYSIS TREE DIMENSIONS AND TOLERANCES PROCESSING

    Directory of Open Access Journals (Sweden)

    Constanta RADULESCU

    2011-07-01

    Full Text Available This paper presents one of the steps that helps us to determine the optimal tolerances depending on the technological capability of the processing equipment. To determine the tolerances in this way, it is necessary to study and represent schematically the operations used in the technological process of making a piece. In this phase, the tree diagram of the dimensions and machining tolerances is also drawn, showing the dimensions and tolerances of the design execution. The dimension and tolerance tree scheme is built for a machined piece for both internal and external machining.

  15. Testing effort dependent software reliability model for imperfect debugging process considering both detection and correction

    International Nuclear Information System (INIS)

    Peng, R.; Li, Y.F.; Zhang, W.J.; Hu, Q.P.

    2014-01-01

    This paper studies the fault detection process (FDP) and fault correction process (FCP) with the incorporation of a testing effort function and imperfect debugging. In order to ensure high reliability, it is essential for software to undergo a testing phase, during which faults can be detected and corrected by debuggers. The testing resource allocation during this phase, which is usually depicted by the testing effort function, considerably influences not only the fault detection rate but also the time to correct a detected fault. In addition, testing is usually far from perfect, such that new faults may be introduced. In this paper, we first show how to incorporate the testing effort function and fault introduction into FDP, and then develop FCP as a delayed FDP with a correction effort. Various specific paired FDP and FCP models are obtained based on different assumptions of fault introduction and correction effort. An illustrative example is presented. The optimal release policy under different criteria is also discussed.

  16. Image Post-Processing and Analysis. Chapter 17

    Energy Technology Data Exchange (ETDEWEB)

    Yushkevich, P. A. [University of Pennsylvania, Philadelphia (United States)

    2014-09-15

    For decades, scientists have used computers to enhance and analyse medical images. At first, they developed simple computer algorithms to enhance the appearance of interesting features in images, helping humans read and interpret them better. Later, they created more advanced algorithms, where the computer would not only enhance images but also participate in facilitating understanding of their content. Segmentation algorithms were developed to detect and extract specific anatomical objects in images, such as malignant lesions in mammograms. Registration algorithms were developed to align images of different modalities and to find corresponding anatomical locations in images from different subjects. These algorithms have made computer aided detection and diagnosis, computer guided surgery and other highly complex medical technologies possible. Nowadays, the field of image processing and analysis is a complex branch of science that lies at the intersection of applied mathematics, computer science, physics, statistics and biomedical sciences. This chapter will give a general overview of the most common problems in this field and the algorithms that address them.

  17. Theoretical detection limit of PIXE analysis using 20 MeV proton beams

    Science.gov (United States)

    Ishii, Keizo; Hitomi, Keitaro

    2018-02-01

    Particle-induced X-ray emission (PIXE) analysis is usually performed using proton beams with energies in the range 2∼3 MeV because at these energies the detection limit is low. The detection limit of PIXE analysis depends on the X-ray production cross-section, the continuous background of the PIXE spectrum, and experimental parameters such as the beam current and the solid angle and efficiency of the X-ray detector. Though the continuous background increases as the projectile energy increases, the X-ray production cross-section increases as well. Therefore, the detection limit of high-energy proton PIXE is not expected to increase significantly. We calculated the cross-sections of continuous X-rays produced in several bremsstrahlung processes and estimated the detection limit of a 20 MeV proton PIXE analysis by modelling the Compton tail of the γ-rays produced in nuclear reactions and the escape effect of the secondary electron bremsstrahlung. We found that the Compton tail does not affect the detection limit when a thin X-ray detector is used, but the secondary electron bremsstrahlung escape effect does have an impact. We also confirmed that the detection limit of the PIXE analysis, when used with 4 μm polyethylene backing film and an integrated beam current of 1 μC, is 0.4∼2.0 ppm for proton energies in the range 10∼30 MeV and elements with Z = 16-90. This result demonstrates the usefulness of cyclotrons with energies of several tens of MeV for performing PIXE analysis. Cyclotrons with these properties are currently installed in positron emission tomography (PET) centers.

  18. STABILITY ANALYSIS OF RADIAL TURNING PROCESS FOR SUPERALLOYS

    Directory of Open Access Journals (Sweden)

    Alberto JIMÉNEZ

    2017-07-01

    Full Text Available Stability detection in machining processes is an essential component of the design of efficient machining processes. Automatic methods are able to determine when instability is happening and prevent possible machine failures. In this work, a variety of methods are proposed for detecting stability anomalies based on the measured forces in the radial turning process of superalloys. Two different methods are proposed to determine instabilities. Each one is tested on real data obtained in the machining of Waspaloy, Haynes 282 and Inconel 718. Experimental data, in both Conventional and High Pressure Coolant (HPC) environments, are set in four different states depending on material grain size and hardness (LGA, LGS, SGA and SGS). Results reveal that the PCA method is useful for visualization of the process and detection of anomalies in online processes.
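
    A PCA-based anomaly monitor of this general kind fits principal components to force windows recorded during stable cutting and flags windows whose reconstruction error (the Q or SPE statistic) exceeds an empirical control limit. The sketch below illustrates that idea under stated assumptions (random stand-in data, five components, a 99th-percentile limit); it is not the paper's implementation.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        stable = rng.normal(size=(200, 32))     # hypothetical windows of cutting-force samples
        pca = PCA(n_components=5).fit(stable)

        def spe(windows):
            # Squared prediction error (Q statistic): what PCA fails to reconstruct.
            recon = pca.inverse_transform(pca.transform(windows))
            return ((windows - recon) ** 2).sum(axis=1)

        limit = np.percentile(spe(stable), 99)  # empirical control limit
        test = rng.normal(size=(10, 32))
        test[3] += 4.0                          # inject a chatter-like anomaly
        print(spe(test) > limit)                # window 3 should be flagged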

  19. Improvement of retinal blood vessel detection using morphological component analysis.

    Science.gov (United States)

    Imani, Elaheh; Javidi, Malihe; Pourreza, Hamid-Reza

    2015-03-01

    Detection and quantitative measurement of variations in the retinal blood vessels can help diagnose several diseases, including diabetic retinopathy. Intrinsic characteristics of abnormal retinal images make blood vessel detection difficult. The major problem with traditional vessel segmentation algorithms is producing false positive vessels in the presence of diabetic retinopathy lesions. To overcome this problem, a novel scheme for extracting retinal blood vessels based on the morphological component analysis (MCA) algorithm is presented in this paper. MCA was developed based on sparse representation of signals. This algorithm assumes that each signal is a linear combination of several morphologically distinct components. In the proposed method, the MCA algorithm with appropriate transforms is adopted to separate vessels and lesions from each other. Afterwards, the Morlet Wavelet Transform is applied to enhance the retinal vessels. The final vessel map is obtained by adaptive thresholding. The performance of the proposed method is measured on the publicly available DRIVE and STARE datasets and compared with several state-of-the-art methods. Accuracies of 0.9523 and 0.9590 have been achieved on the DRIVE and STARE datasets, respectively, which are not only higher than those of most methods but also superior to the second human observer's performance. The results show that the proposed method can achieve improved detection in abnormal retinal images and decrease false positive vessels in pathological regions compared to other methods. Also, the robustness of the method in the presence of noise is shown via experimental results. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  20. Energetic Analysis of Poultry Processing Operations

    OpenAIRE

    Simeon Olatayo JEKAYINFA

    2007-01-01

    Energy audit of three poultry processing plants was conducted in southwestern Nigeria. The plants were grouped into three different categories based on their production capacities. The survey involved all the five easily defined unit operations utilized by the poultry processing industry and the experimental design allowed the energy consumed in each unit operation to be measured. The results of the audit revealed that scalding & defeathering is the most energy intensive unit operation in al...

  1. Comparative analysis of accelerogram processing methods

    International Nuclear Information System (INIS)

    Goula, X.; Mohammadioun, B.

    1986-01-01

    The work described hereinafter is a short development of an on-going research project concerning high-quality processing of strong-motion recordings of earthquakes. Several processing procedures have been tested, applied to synthetic signals simulating ground motion designed for this purpose. The methods of correction operating in the time domain are seen to be strongly dependent upon the sampling rate. Two methods of low-frequency filtering followed by an integration of accelerations yielded satisfactory results. [fr]

  2. Electromagnetic heating processes: analysis and simulations

    OpenAIRE

    Calay, Rajnish Kaur

    1994-01-01

    Electromagnetic heating (EMH) processes are being increasingly used in the industrial and domestic sectors, yet they receive relatively little attention in the thermal engineering domain. Time-temperature characteristics in EMH are qualitatively different from those in conventional heating techniques due to the additional parameters (viz dielectric properties of the material, size and shape of the product and process frequency). From a unified theory perspective, a multi-...

  3. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Baddeley, A.; Turner, R.; Møller, Jesper

    We define residuals for point process models fitted to spatial point pattern data, and propose diagnostic plots based on these residuals. The techniques apply to any Gibbs point process model, which may exhibit spatial heterogeneity, interpoint interaction and dependence on spatial covariates. Ou...... or covariate effects. Q-Q plots of the residuals are effective in diagnosing interpoint interaction. Some existing ad hoc statistics of point patterns (quadrat counts, scan statistic, kernel smoothed intensity, Berman's diagnostic) are recovered as special cases....
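
    For reference, the residual measures at the heart of this methodology can be stated compactly. The LaTeX below summarizes the standard definitions (raw and Pearson residuals for a model with fitted conditional intensity) as they are commonly given for this approach; it is a paraphrase for orientation, not a quotation from the paper.

        % Raw residual measure over a region B, for a model with fitted
        % (Papangelou) conditional intensity \hat\lambda(u, \mathbf{x}):
        R(B) = n(\mathbf{x} \cap B) - \int_B \hat\lambda(u, \mathbf{x}) \, \mathrm{d}u
        % Pearson residuals rescale the increments, by analogy with Poisson regression:
        R_P(B) = \sum_{x_i \in \mathbf{x} \cap B} \hat\lambda(x_i, \mathbf{x})^{-1/2}
                 - \int_B \hat\lambda(u, \mathbf{x})^{1/2} \, \mathrm{d}u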

  4. Modeling fraud detection and the incorporation of forensic specialists in the audit process

    DEFF Research Database (Denmark)

    Sakalauskaite, Dominyka

    Financial statement audits are still comparatively poor at fraud detection. Forensic specialists can play a significant role in increasing audit quality. In this paper, based on prior academic research, I develop a model of fraud detection and the incorporation of forensic specialists in the audit... process. The intention of the model is to identify the reasons why the audit is weak in fraud detection and to provide the analytical framework to assess whether the incorporation of forensic specialists can help to improve it. The results show that such specialists can potentially improve the fraud... detection in the audit, but might also cause some negative implications. Overall, even though fraud detection is one of the main topics in research, there are very few studies done on the subject of how auditors co-operate with forensic specialists. Thus, the paper concludes with suggestions for further...

  5. Automated rice leaf disease detection using color image analysis

    Science.gov (United States)

    Pugoy, Reinald Adrian D. L.; Mariano, Vladimir Y.

    2011-06-01

    In rice-related institutions such as the International Rice Research Institute, assessing the health condition of a rice plant through its leaves, usually done as a manual eyeball exercise, is important for devising good nutrient and disease management strategies. In this paper, an automated system that can detect diseases present in a rice leaf using color image analysis is presented. In the system, the outlier region is first obtained from the rice leaf image to be tested using histogram intersection between the test and healthy rice leaf images. The outlier is then subjected to a threshold-based K-means clustering algorithm to group related regions into clusters. These clusters are subjected to further analysis to finally determine the suspected diseases of the rice leaf.
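
    The first stage - scoring how far a test leaf's color distribution departs from a healthy reference - can be sketched with a joint RGB histogram and the histogram-intersection similarity, followed by clustering of the suspect region's colors. This is a hedged illustration of the general technique; the bin count, the toy images, and the clustering parameters are assumptions, not the paper's values.

        import numpy as np
        from sklearn.cluster import KMeans

        def color_histogram(img, bins=8):
            # Joint RGB histogram of an HxWx3 uint8 image, normalized to sum to 1.
            hist, _ = np.histogramdd(img.reshape(-1, 3), bins=(bins,) * 3,
                                     range=((0, 256),) * 3)
            return hist / hist.sum()

        def histogram_intersection(h1, h2):
            # 1.0 for identical distributions; lower values indicate outlier colors.
            return np.minimum(h1, h2).sum()

        rng = np.random.default_rng(0)
        healthy = rng.integers(60, 120, size=(64, 64, 3), dtype=np.uint8)  # toy "leaf"
        test = healthy.copy()
        test[20:40, 20:40] = rng.integers(150, 200, size=(20, 20, 3), dtype=np.uint8)  # lesion

        print(round(histogram_intersection(color_histogram(healthy), color_histogram(test)), 3))

        # Second stage (sketch): cluster the colors of the suspected outlier region.
        lesion_pixels = test[20:40, 20:40].reshape(-1, 3).astype(float)
        print(KMeans(n_clusters=2, n_init=10).fit(lesion_pixels).cluster_centers_.round(1))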

  6. Characterizing Dynamic Walking Patterns and Detecting Falls with Wearable Sensors Using Gaussian Process Methods

    Directory of Open Access Journals (Sweden)

    Taehwan Kim

    2017-05-01

    Full Text Available By incorporating a growing number of sensors and adopting machine learning technologies, wearable devices have recently become a prominent health care application domain. Among the related research topics in this field, one of the most important issues is detecting falls while walking. Since such falls may lead to serious injuries, automatically and promptly detecting them during daily use of smartphones and/or smart watches is a particular need. In this paper, we investigate the use of Gaussian process (GP) methods for characterizing dynamic walking patterns and detecting falls while walking with built-in wearable sensors in smartphones and/or smartwatches. For the task of characterizing dynamic walking patterns in a low-dimensional latent feature space, we propose a novel approach called the auto-encoded Gaussian process dynamical model, in which we combine a GP-based state space modeling method with a nonlinear dimensionality reduction method in a unique manner. Gaussian process methods are fit for this task because one of their most important strengths is their capability of handling uncertainty in the model parameters. Also, for detecting falls while walking, we propose to recycle the latent samples generated in training the auto-encoded Gaussian process dynamical model for GP-based novelty detection, which can lead to an efficient and seamless solution to the detection task. Experimental results show that the combined use of these GP-based methods can yield promising results for characterizing dynamic walking patterns and detecting falls while walking with wearable sensors.
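
    The novelty-detection step can be illustrated with a plain GP regression: learn one-step-ahead dynamics on latent samples from normal walking, then score new samples by their standardized prediction error. The sketch below is an assumption-laden stand-in (a 1-D sinusoidal latent signal, scikit-learn's GP, a fixed threshold), not the paper's auto-encoded GP dynamical model.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(1)
        # Hypothetical 1-D latent walking trajectory (stand-in for auto-encoded latents).
        t = np.linspace(0, 8 * np.pi, 400)
        z = np.sin(t) + 0.05 * rng.normal(size=t.size)
        z[360:] += 2.0                       # a fall-like discontinuity near the end

        X, y = z[:-1, None], z[1:]           # one-step-ahead dynamics pairs
        gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel()).fit(X[:300], y[:300])

        mu, sd = gp.predict(X[300:], return_std=True)
        score = np.abs(y[300:] - mu) / sd    # standardized one-step prediction error
        print(np.argmax(score > 4) + 300)    # first flagged step, near the injected fall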

  7. Power spectrum weighted edge analysis for straight edge detection in images

    Science.gov (United States)

    Karvir, Hrishikesh V.; Skipper, Julie A.

    2007-04-01

    Most man-made objects provide characteristic straight line edges and, therefore, edge extraction is a commonly used target detection tool. However, noisy images often yield broken edges that lead to missed detections, and extraneous edges that may contribute to false target detections. We present a sliding-block approach for target detection using weighted power spectral analysis. In general, straight line edges appearing at a given frequency are represented as a peak in the Fourier domain at a radius corresponding to that frequency, and a direction corresponding to the orientation of the edges in the spatial domain. Knowing the edge width and spacing between the edges, a band-pass filter is designed to extract the Fourier peaks corresponding to the target edges and suppress image noise. These peaks are then detected by amplitude thresholding. The frequency band width and the subsequent spatial filter mask size are variable parameters to facilitate detection of target objects of different sizes under known imaging geometries. Many military objects, such as trucks, tanks and missile launchers, produce definite signatures with parallel lines and the algorithm proves to be ideal for detecting such objects. Moreover, shadow-casting objects generally provide sharp edges and are readily detected. The block operation procedure offers advantages of significant reduction in noise influence, improved edge detection, faster processing speed and versatility to detect diverse objects of different sizes in the image. With Scud missile launcher replicas as target objects, the method has been successfully tested on terrain board test images under different backgrounds, illumination and imaging geometries with cameras of differing spatial resolution and bit-depth.
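
    The frequency-domain test described here - straight, regularly spaced edges produce a peak at a known radius in the 2-D Fourier plane - can be sketched as follows. It is a hedged toy version under assumptions (a synthetic striped block, a hand-chosen radial band), not the authors' weighted algorithm.

        import numpy as np

        def band_peak(block, r_lo, r_hi):
            # Peak magnitude of the centered 2-D spectrum inside a radial band;
            # parallel straight edges at a known spacing concentrate energy there.
            spec = np.abs(np.fft.fftshift(np.fft.fft2(block)))
            h, w = block.shape
            yy, xx = np.mgrid[-(h // 2): h - h // 2, -(w // 2): w - w // 2]
            band = (np.hypot(yy, xx) >= r_lo) & (np.hypot(yy, xx) <= r_hi)
            return spec[band].max()

        # Striped 64x64 block (period 8 px -> radius 64/8 = 8) versus a noise block.
        n = 64
        stripes = (np.arange(n)[None, :] // 4 % 2).astype(float).repeat(n, axis=0)
        noise = np.random.default_rng(0).normal(size=(n, n))
        print(band_peak(stripes, 7, 9), band_peak(noise, 7, 9))  # stripes score far higher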

  8. DETECTION OF THE SECOND r-PROCESS PEAK ELEMENT TELLURIUM IN METAL-POOR STARS

    International Nuclear Information System (INIS)

    Roederer, Ian U.; Lawler, James E.; Cowan, John J.; Beers, Timothy C.; Frebel, Anna; Ivans, Inese I.; Schatz, Hendrik; Sobeck, Jennifer S.; Sneden, Christopher

    2012-01-01

    Using near-ultraviolet spectra obtained with the Space Telescope Imaging Spectrograph on board the Hubble Space Telescope, we detect neutral tellurium in three metal-poor stars enriched by products of r-process nucleosynthesis, BD +17 3248, HD 108317, and HD 128279. Tellurium (Te, Z = 52) is found at the second r-process peak (A ≈ 130) associated with the N = 82 neutron shell closure, and it has not been detected previously in Galactic halo stars. The derived tellurium abundances match the scaled solar system r-process distribution within the uncertainties, confirming the predicted second peak r-process residuals. These results suggest that tellurium is predominantly produced in the main component of the r-process, along with the rare earth elements.

  9. Smartphone Cortex Controlled Real-Time Image Processing and Reprocessing for Concentration Independent LED Induced Fluorescence Detection in Capillary Electrophoresis.

    Science.gov (United States)

    Szarka, Mate; Guttman, Andras

    2017-10-17

    We present the application of a smartphone-anatomy-based technology in the field of liquid phase bioseparations, particularly in capillary electrophoresis. A simple capillary electrophoresis system was built with LED-induced fluorescence detection and a credit-card-sized minicomputer to prove the concept of real-time fluorescence imaging (a zone-adjustable time-lapse fluorescence image processor) and separation control. The system was evaluated by analyzing under- and overloaded aminopyrenetrisulfonate (APTS)-labeled oligosaccharide samples. The open-source-software-based image processing tool allowed undistorted signal modulation (reprocessing) if the signal was inappropriate for the actual detection system settings (too low or too high). The novel smart detection tool for fluorescently labeled biomolecules greatly expands the dynamic range and enables retrospective correction of injections with unsuitable signal levels without the necessity of repeating the analysis.

  10. Enhancement of crack detection in stud bolts of nuclear reactor by ultrasonic signal processing technique

    International Nuclear Information System (INIS)

    Lee, J.H.; Oh, W.D.; Choi, S.W.; Park, M.H.

    2004-01-01

    'Full-text:' The stud bolt is one of the most critical parts for the safety of reactor vessels in nuclear power plants. However, in the application of ultrasonic techniques for crack detection in stud bolts, a difficulty encountered is distinguishing crack signals from the signals reflected from the threaded part of the bolt. In this study, a shadow effect technique combined with a new signal processing method is investigated to enhance the detectability of small cracks initiated from the root of a thread in a stud bolt. The key idea of the signal processing is based on the fact that the waveforms reflected from the threads are uniform, since the shape of the threads in a bolt is the same. If cracks exist in a thread, the flaw signals differ from the reference signals. It is demonstrated that small flaws are efficiently detected by the novel ultrasonic technique combined with this new signal processing concept. (author)
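
    The underlying idea - echoes from healthy threads repeat, so subtracting an aligned reference echo leaves mainly the crack contribution - can be sketched as below. This is a hedged toy illustration (cross-correlation alignment, a simple residual ratio, synthetic echoes), not the study's actual processing chain.

        import numpy as np

        def flaw_ratio(echo, reference):
            # Align the measured echo to the healthy-thread reference by
            # cross-correlation, subtract, and report the residual peak relative
            # to the reference peak; healthy echoes should nearly cancel.
            lag = np.argmax(np.correlate(echo, reference, mode="full")) - (len(reference) - 1)
            residual = np.roll(echo, -lag) - reference
            return np.max(np.abs(residual)) / np.max(np.abs(reference))

        rng = np.random.default_rng(0)
        reference = np.sin(np.linspace(0, 20 * np.pi, 400)) * np.hanning(400)
        healthy = np.roll(reference, 7) + 0.02 * rng.normal(size=400)
        cracked = healthy.copy()
        cracked[250:270] += 0.5             # extra echo from a crack
        print(round(flaw_ratio(healthy, reference), 2),
              round(flaw_ratio(cracked, reference), 2))  # small vs. clearly larger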

  11. [The process of detection and treatment of cases of tuberculosis in a prison].

    Science.gov (United States)

    Valença, Mariana Soares; Cezar-Vaz, Marta Regina; Brum, Clarice Brinck; Silva, Pedro Eduardo Almeida da

    2016-06-01

    This study seeks to analyze the process of detection and treatment of cases of tuberculosis (TB) in a prison in the south of Brazil. An active and passive search for TB was conducted to estimate the scale of TB in a prison with 764 inmates. In conjunction with the detection strategies and clinical follow-up of the 41 TB cases, participant observation and field-diary records were carried out, making it possible to analyze the scope and limitations of detection and treatment of TB cases in prison. The development of search strategies is discussed, along with the use of questionnaires to detect symptomatic cases, the inadequacy of the clinical follow-up of TB cases, the involvement of different workers, and the coordination between prison and health services. There is clear potential for TB control using an active search to induce passive detection and screening for symptoms, which - even skewed by inmates' perceptions of TB symptoms - enabled an increase in detection. The functional dynamics of prison life hamper the inclusion of health routines and can restrict actions to control TB and other diseases. In the process of TB control in prisons, the feasibility of effective detection methods is as important as planning based on disease conditions, the service network, and the workers involved.

  12. Overlapping communities detection based on spectral analysis of line graphs

    Science.gov (United States)

    Gui, Chun; Zhang, Ruisheng; Hu, Rongjing; Huang, Guoming; Wei, Jiaxuan

    2018-05-01

    Communities in networks often overlap, with one vertex belonging to several clusters. Meanwhile, many networks show hierarchical structure, such that communities are recursively grouped into a hierarchical organization. In order to obtain overlapping communities from a global hierarchy of vertices, a new algorithm (named SAoLG) is proposed to build the hierarchical organization while detecting the overlap of community structure. SAoLG applies spectral analysis to line graphs to unify the overlap and hierarchical structure of the communities. In order to avoid the limitations of an absolute distance such as the Euclidean distance, SAoLG employs angular distance to compute the similarity between vertices. Furthermore, we make a micro-improvement to partition density to evaluate the quality of community structure and use it to obtain more reasonable and sensible community numbers. The proposed SAoLG algorithm achieves a balance between overlap and hierarchy by applying spectral analysis to edge community detection. The experimental results on one standard network and six real-world networks show that the SAoLG algorithm achieves higher modularity and more reasonable community numbers than Ahn's algorithm and the classical CPM and GN ones.
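
    The line-graph trick itself is easy to demonstrate: clustering the nodes of the line graph partitions the original edges, and any vertex whose edges land in different clusters belongs to several communities. The sketch below is a simplified stand-in using plain spectral clustering of the line-graph adjacency - not SAoLG's angular-distance spectral analysis or its partition-density model selection.

        import networkx as nx
        from sklearn.cluster import SpectralClustering

        def overlapping_communities(G, k):
            # Nodes of the line graph are edges of G, so clustering the line graph
            # partitions G's edges; a vertex whose incident edges fall into several
            # edge-clusters is assigned to several (overlapping) communities.
            L = nx.line_graph(G)
            edges = list(L.nodes())
            A = nx.to_numpy_array(L, nodelist=edges)
            labels = SpectralClustering(n_clusters=k, affinity="precomputed",
                                        random_state=0).fit_predict(A)
            communities = {}
            for (u, v), c in zip(edges, labels):
                communities.setdefault(c, set()).update((u, v))
            return list(communities.values())

        for com in overlapping_communities(nx.karate_club_graph(), 3):
            print(sorted(com))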

  13. Network structure detection and analysis of Shanghai stock market

    Directory of Open Access Journals (Sweden)

    Sen Wu

    2015-04-01

    Full Text Available Purpose: In order to investigate the community structure of the component stocks of the SSE (Shanghai Stock Exchange) 180 index, a stock correlation network is built to find the intra-community and inter-community relationships. Design/methodology/approach: The stock correlation network is built by taking stocks as vertices and correlation coefficients of the logarithmic returns of stock prices as edge weights. It is first built as an undirected weighted network. The GN algorithm is selected to detect community structure after converting the network into an unweighted one with different thresholds. Findings: The result of the network community structure analysis shows that the stock market has obvious industrial characteristics. Most of the stocks in the same industry or in the same supply chain are assigned to the same community. The correlation of stock price fluctuations is closer within communities than between them. The result of community structure detection also reflects correlations among different industries. Originality/value: Based on the analysis of the community structure in the Shanghai stock market, the result reflects industrial characteristics, which has reference value for the relationships among industries or sub-sectors of listed companies.
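
    The construction described here - log-return correlations, thresholding to an unweighted graph, then Girvan-Newman - can be sketched end to end. The sketch uses synthetic two-factor returns as a stand-in for SSE-180 price data, and the 0.45 threshold is an arbitrary assumption.

        import numpy as np
        import networkx as nx
        from networkx.algorithms.community import girvan_newman

        rng = np.random.default_rng(0)
        # Synthetic stand-in: 12 stocks driven by two correlated "industry" factors
        # (within-industry correlation ~0.8, across industries ~0.4).
        g = rng.normal(size=(250, 2))
        factors = np.column_stack([g[:, 0], 0.5 * g[:, 0] + np.sqrt(0.75) * g[:, 1]])
        log_ret = factors @ np.repeat(np.eye(2), 6, axis=0).T + 0.5 * rng.normal(size=(250, 12))
        corr = np.corrcoef(log_ret.T)

        # Keep only edges whose correlation exceeds the threshold (unweighted graph).
        G = nx.Graph((i, j) for i in range(12) for j in range(i + 1, 12)
                     if corr[i, j] > 0.45)
        communities = next(girvan_newman(G))      # first split removes weak cross links
        print([sorted(c) for c in communities])   # expected: the two industry groups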

  14. Integrated situational awareness for cyber attack detection, analysis, and mitigation

    Science.gov (United States)

    Cheng, Yi; Sagduyu, Yalin; Deng, Julia; Li, Jason; Liu, Peng

    2012-06-01

    Real-time cyberspace situational awareness is critical for securing and protecting today's enterprise networks from various cyber threats. When a security incident occurs, network administrators and security analysts need to know what exactly has happened in the network, why it happened, and what actions or countermeasures should be taken to quickly mitigate the potential impacts. In this paper, we propose an integrated cyberspace situational awareness system for efficient cyber attack detection, analysis and mitigation in large-scale enterprise networks. Essentially, a cyberspace common operational picture will be developed, which is a multi-layer graphical model and can efficiently capture and represent the statuses, relationships, and interdependencies of various entities and elements within and among different levels of a network. Once shared among authorized users, this cyberspace common operational picture can provide an integrated view of the logical, physical, and cyber domains, and a unique visualization of disparate data sets to support decision makers. In addition, advanced analyses, such as Bayesian Network analysis, will be explored to address the information uncertainty, dynamic and complex cyber attack detection, and optimal impact mitigation issues. All the developed technologies will be further integrated into an automatic software toolkit to achieve near real-time cyberspace situational awareness and impact mitigation in large-scale computer networks.

  15. Detection of Adult Green Sturgeon Using Environmental DNA Analysis.

    Directory of Open Access Journals (Sweden)

    Paul S Bergman

    Full Text Available Environmental DNA (eDNA) is an emerging sampling method that has been used successfully for detection of rare aquatic species. The identification of sampling tools that are less stressful for target organisms has become increasingly important for rare and endangered species. A decline in abundance of the Southern Distinct Population Segment (DPS) of North American Green Sturgeon located in California's Central Valley has led to its listing as Threatened under the Federal Endangered Species Act in 2006. While visual surveys of spawning Green Sturgeon in the Central Valley are effective at monitoring fish densities in concentrated pool habitats, the results do not scale well to the watershed level, providing limited spatial and temporal context. Unlike most traditional survey methods, environmental DNA analysis provides a relatively quick, inexpensive tool that could efficiently monitor the presence and distribution of aquatic species. We positively identified Green Sturgeon DNA at two locations of known presence in the Sacramento River, proving that eDNA can be effective for monitoring the presence of adult sturgeon. While further study is needed to understand uncertainties of the sampling method, our study represents the first documented detection of Green Sturgeon eDNA, indicating that eDNA analysis could provide a new tool for monitoring Green Sturgeon distribution in the Central Valley, complementing traditional ongoing survey methods.

  16. Frontiers in In-Situ Cosmic Dust Detection and Analysis

    International Nuclear Information System (INIS)

    Sternovsky, Zoltan; Auer, Siegfried; Drake, Keith; Gruen, Eberhard; Horanyi, Mihaly; Le, Huy; Xie Jianfeng; Srama, Ralf

    2011-01-01

    In-situ cosmic dust instruments and measurements played a critical role in the emergence of the field of dusty plasmas. The major breakthroughs included the discovery of β-meteoroids, interstellar dust particles within the solar system, Jovian stream particles, and the detection and analysis of Enceladus's plumes. The science goals of cosmic dust research require the measurements of the charge, the spatial, size and velocity distributions, and the chemical and isotopic compositions of individual dust particles. In-situ dust instrument technology has improved significantly in the last decade. Modern dust instruments with high sensitivity can detect submicron-sized particles even at low impact velocities. Innovative ion optics methods deliver high mass resolution, m/dm>100, for chemical and isotopic analysis. The accurate trajectory measurement of cosmic dust is made possible even for submicron-sized grains using the Dust Trajectory Sensor (DTS). This article is a brief review of the current capabilities of modern dust instruments, future challenges and opportunities in cosmic dust research.

  17. Detection and analysis of diamond fingerprinting feature and its application

    Energy Technology Data Exchange (ETDEWEB)

    Li Xin; Huang Guoliang; Li Qiang; Chen Shengyi, E-mail: tshgl@tsinghua.edu.cn [Department of Biomedical Engineering, the School of Medicine, Tsinghua University, Beijing, 100084 (China)

    2011-01-01

    Before becoming jewelry, diamonds need to be carved artistically with special geometric features forming a polyhedral structure. There are subtle differences in the structure of this polyhedron in each diamond. With spatial frequency spectrum analysis of the diamond surface structure, we can obtain the diamond fingerprint information, which represents the 'Diamond ID' and has good specificity. Based on optical Fourier transform spatial spectrum analysis, fingerprint identification of the surface structure of diamonds in the spatial frequency domain was studied in this paper. We constructed both a completely coherent diamond fingerprinting detection system illuminated by laser and a partially coherent diamond fingerprinting detection system illuminated by LED, and analyzed the effect of the coherence of the light source on the diamond fingerprinting feature. We studied the rotation invariance and translation invariance of the diamond fingerprint and verified the feasibility of real-time and accurate identification of diamond fingerprints. With the benefit of this work, we can provide customs, jewelers and consumers with a real-time and reliable diamond identification instrument, which will curb diamond smuggling, theft and other crimes, and ensure the healthy development of the diamond industry.

  18. Piezoresistive microcantilever aptasensor for ricin detection and kinetic analysis

    Directory of Open Access Journals (Sweden)

    Zhi-Wei Liu

    2015-04-01

    Full Text Available Up to now, there has been no report on target molecule detection by a piezoresistive microcantilever aptasensor. In order to evaluate the test performance and investigate the dynamic response characteristics of a piezoresistive microcantilever aptasensor, a novel method for ricin detection and kinetic analysis based on such a sensor was proposed, in which the ricin aptamer was immobilised on the microcantilever surface by a biotin-avidin binding system. Results showed that the detection limit for ricin was 0.04 μg L−1 (S/N ≥ 3). A linear relationship between the response voltage and the concentration of ricin in the range 0.2 μg L−1 to 40 μg L−1 was obtained, with the linear regression equation ΔUe = 0.904C + 5.852 (n = 5, R = 0.991, p < 0.001). The sensor showed no response to abrin or BSA, and could overcome the influence of complex environmental disruptors, indicating high specificity and good selectivity. Recovery and reproducibility in the determination of simulated samples (simulated water, soil, and flour samples) met the analysis requirements, at 90.5∼95.5% and 7.85%∼9.39%, respectively. On this basis, a reaction kinetic model based on ligand-receptor binding and its relationship with the response voltage was established. The model could well reflect the dynamic response of the sensor, with a correlation coefficient R ≥ 0.9456 (p < 0.001). The response voltage (ΔUe) and response time (t0) obtained from the fitting equation for different concentrations of ricin fitted well with the measured values.

  19. Dynamic analysis methods for detecting anomalies in asynchronously interacting systems

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Akshat; Solis, John Hector; Matschke, Benjamin

    2014-01-01

    Detecting modifications to digital system designs, whether malicious or benign, is problematic due to the complexity of the systems being analyzed. Moreover, static analysis techniques and tools can only be used during the initial design and implementation phases to verify safety and liveness properties. It is computationally intractable to guarantee that any previously verified properties still hold after a system, or even a single component, has been produced by a third-party manufacturer. In this paper we explore new approaches for creating a robust system design by investigating highly-structured computational models that simplify verification and analysis. Our approach avoids the need to fully reconstruct the implemented system by incorporating a small verification component that dynamically detects for deviations from the design specification at run-time. The first approach encodes information extracted from the original system design algebraically into a verification component. During run-time this component randomly queries the implementation for trace information and verifies that no design-level properties have been violated. If any deviation is detected then a pre-specified fail-safe or notification behavior is triggered. Our second approach utilizes a partitioning methodology to view liveness and safety properties as a distributed decision task and the implementation as a proposed protocol that solves this task. Thus the problem of verifying safety and liveness properties is translated to that of verifying that the implementation solves the associated decision task. We develop upon results from distributed systems and algebraic topology to construct a learning mechanism for verifying safety and liveness properties from samples of run-time executions.

  20. Adaptive Fault Detection for Complex Dynamic Processes Based on JIT Updated Data Set

    Directory of Open Access Journals (Sweden)

    Jinna Li

    2012-01-01

    Full Text Available A novel fault detection technique is proposed to explicitly account for the nonlinear, dynamic, and multimodal problems existing in practical and complex dynamic processes. A just-in-time (JIT) detection method and a k-nearest neighbor (KNN) rule-based statistical process control (SPC) approach are integrated to construct a flexible and adaptive detection scheme for control processes with nonlinear, dynamic, and multimodal cases. The Mahalanobis distance, representing the correlation among samples, is used to simplify and update the raw data set, which is the first merit of this paper. Based on it, the control limit is computed in terms of both the KNN rule and the SPC method, such that we can identify online whether the current data are normal or not. Note that the control limit changes as the database is updated, so that an adaptive fault detection technique is obtained that can effectively eliminate the impact of data drift and shift on the performance of the detection process, which is the second merit of this paper. The efficiency of the developed method is demonstrated by numerical examples and an industrial case.
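
    The KNN-rule control chart at the core of this scheme can be sketched directly: compute each training sample's summed distance to its k nearest neighbors under the Mahalanobis metric, set the control limit at an empirical quantile, and flag new samples that exceed it. The sketch below illustrates that idea with stand-in data and parameters; the JIT database-updating logic is omitted.

        import numpy as np

        def knn_spc_monitor(train, k=5, alpha=0.99):
            cov_inv = np.linalg.inv(np.cov(train.T))
            def d2(x, B):
                # Squared Mahalanobis distances from x to every row of B.
                diff = B - x
                return np.einsum("ij,jk,ik->i", diff, cov_inv, diff)
            # kNN statistic of each training sample (excluding itself).
            stats = [np.sort(d2(x, np.delete(train, i, axis=0)))[:k].sum()
                     for i, x in enumerate(train)]
            limit = np.quantile(stats, alpha)    # empirical control limit
            return lambda x: np.sort(d2(x, train))[:k].sum() > limit

        rng = np.random.default_rng(0)
        normal_data = rng.normal(size=(300, 4))
        is_faulty = knn_spc_monitor(normal_data)
        print(is_faulty(rng.normal(size=4)), is_faulty(np.full(4, 5.0)))  # expected: False True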

  1. Speech endpoint detection with non-language speech sounds for generic speech processing applications

    Science.gov (United States)

    McClain, Matthew; Romanowski, Brian

    2009-05-01

    Non-language speech sounds (NLSS) are sounds produced by humans that do not carry linguistic information. Examples of these sounds are coughs, clicks, breaths, and filled pauses such as "uh" and "um" in English. NLSS are prominent in conversational speech, but can be a significant source of errors in speech processing applications. Traditionally, these sounds are ignored by speech endpoint detection algorithms, where speech regions are identified in the audio signal prior to processing. The ability to filter NLSS as a pre-processing step can significantly enhance the performance of many speech processing applications, such as speaker identification, language identification, and automatic speech recognition. In order to be used in all such applications, NLSS detection must be performed without the use of language models that provide knowledge of the phonology and lexical structure of speech. This is especially relevant to situations where the languages used in the audio are not known a priori. We present the results of preliminary experiments using data from American and British English speakers, in which segments of audio are classified as language speech sounds (LSS) or NLSS using a set of acoustic features designed for language-agnostic NLSS detection and a hidden Markov model (HMM) to model speech generation. The results of these experiments indicate that the features and model used are capable of detecting certain types of NLSS, such as breaths and clicks, while detection of other types of NLSS, such as filled pauses, will require future research.
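
    A two-model HMM classifier of the kind described can be sketched with the hmmlearn library (an assumption; the paper does not name a toolkit), using random feature vectors as stand-ins for the acoustic features: fit one HMM per class and pick the class whose model gives a segment the higher log-likelihood.

        import numpy as np
        from hmmlearn.hmm import GaussianHMM  # hmmlearn, assumed available

        rng = np.random.default_rng(0)
        # Hypothetical acoustic feature sequences (stand-ins for real features).
        speech_feats = rng.normal(0.0, 1.0, size=(500, 13))
        nlss_feats = rng.normal(1.5, 0.5, size=(500, 13))

        # One HMM per class models how features evolve within a segment.
        hmm_speech = GaussianHMM(n_components=3).fit(speech_feats)
        hmm_nlss = GaussianHMM(n_components=3).fit(nlss_feats)

        def classify(segment):
            # Maximum-likelihood decision between the two generative models.
            return "LSS" if hmm_speech.score(segment) > hmm_nlss.score(segment) else "NLSS"

        print(classify(rng.normal(1.5, 0.5, size=(40, 13))))  # expected: NLSS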

  2. A CCTV system with SMS alert (CMDSA): An implementation of pixel processing algorithm for motion detection

    Science.gov (United States)

    Rahman, Nurul Hidayah Ab; Abdullah, Nurul Azma; Hamid, Isredza Rahmi A.; Wen, Chuah Chai; Jelani, Mohamad Shafiqur Rahman Mohd

    2017-10-01

    Closed-Circuit TV (CCTV) systems are one of the technologies in the surveillance field that address the problem of detection and monitoring by providing extra features such as email alerts or motion detection. However, detecting events and alerting the admin in a CCTV system can be complicated by the complexity of integrating the main program with an external Application Programming Interface (API). In this study, a pixel processing algorithm is applied due to its efficiency, and SMS alerting is added as an alternative solution for users who opt out of the email alert system or have no Internet connection. A CCTV system with SMS alert (CMDSA) was developed using an evolutionary prototyping methodology. The system interface was implemented using Microsoft Visual Studio, while the backend components, the database and the code, were implemented on an SQLite database and in the C# programming language, respectively. The main modules of CMDSA are motion detection, capturing and saving video, image processing, and Short Message Service (SMS) alert functions. The system is able to reduce processing time, making the detection process faster; reduce the space and memory used to run the program; and alert the system admin instantly.
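
    Pixel-processing motion detection of the kind named here usually reduces to frame differencing: compare successive grayscale frames pixel by pixel and flag motion when enough pixels change. The sketch below (NumPy, hand-picked thresholds) is a generic illustration of that technique, not CMDSA's C# implementation.

        import numpy as np

        def motion_detected(prev_frame, frame, pix_thresh=25, frac_thresh=0.01):
            # Flag motion when more than frac_thresh of the pixels change by more
            # than pix_thresh gray levels between consecutive frames.
            diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
            return (diff > pix_thresh).mean() > frac_thresh

        rng = np.random.default_rng(0)
        prev = rng.integers(0, 256, size=(240, 320), dtype=np.uint8)
        still = prev.copy()
        moving = prev.copy()
        moving[100:140, 100:160] = 255      # a bright object enters the scene
        print(motion_detected(prev, still), motion_detected(prev, moving))  # False True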

  3. Energetic Analysis of Poultry Processing Operations

    Directory of Open Access Journals (Sweden)

    Simeon Olatayo JEKAYINFA

    2007-01-01

    Full Text Available Energy audits of three poultry processing plants were conducted in southwestern Nigeria. The plants were grouped into three different categories based on their production capacities. The survey covered all five easily defined unit operations used by the poultry processing industry, and the experimental design allowed the energy consumed in each unit operation to be measured. The results of the audit revealed that scalding & defeathering is the most energy-intensive unit operation in all three plant categories, accounting on average for about 44% of the total energy consumption in the processing plants. The other processing operations consume energy in the following order: eviscerating (17.5%), slaughtering (17%), washing & chilling (16%) and packing (6%). The results of the study clearly indicated that the least mechanized of the plants consumed the most energy (50.36 MJ), followed by the semi-mechanized plant (28.04 MJ) and the most mechanized plant (17.83 MJ). The energy audits have provided the baseline information needed for budgeting, forecasting energy requirements and planning plant expansion in the poultry processing industries in the study area.

  4. Thermal analysis of a glass bending process

    International Nuclear Information System (INIS)

    Buonanno, G.; Dell'Isola, M.; Frattolillo, A.; Giovinco, G.

    2005-01-01

    The paper presents the thermal simulation of naturally ventilated ovens used in the hot forming of glass sheets for windscreen production. The determination of the thermal and flow conditions in the oven and, consequently, of the windscreen temperature distribution is necessary both for optimisation of the production process and to assure beforehand, without any iterative tuning process, the required characteristics of the product. To this purpose, the authors carried out a 3D numerical simulation of the thermal interaction between the glass and the oven's internal surfaces during the whole heating process. In particular, a finite volume method was used to take into account convective, conductive and radiative heat transfer in the oven. The numerical temperature distribution in the glass was validated through comparison with data obtained from an experimental apparatus designed and built for the purpose.

  5. Laser processing and analysis of materials

    CERN Document Server

    Duley, W W

    1983-01-01

    It has often been said that the laser is a solution searching for a problem. The rapid development of laser technology over the past dozen years has led to the availability of reliable, industrially rated laser sources with a wide variety of output characteristics. This, in turn, has resulted in new laser applications as the laser becomes a familiar processing and analytical tool. The field of materials science, in particular, has become a fertile one for new laser applications. Laser annealing, alloying, cladding, and heat treating were all but unknown 10 years ago. Today, each is a separate, dynamic field of research activity with many of the early laboratory experiments resulting in the development of new industrial processing techniques using laser technology. Ten years ago, chemical processing was in its infancy awaiting, primarily, the development of reliable tunable laser sources. Now, with tunability over the entire spectrum from the vacuum ultraviolet to the far infrared, photochemistry is undergo...

  6. Non-destructive analysis and detection of internal characteristics of spruce logs through X computerized tomography

    International Nuclear Information System (INIS)

    Longuetaud, F.

    2005-10-01

    Computerized tomography allows direct access to the internal features of scanned logs on the basis of density and moisture content variations. The objective of this work is to assess the feasibility of automatic detection of internal characteristics, with the final aim of conducting scientific analyses. The database consists of CT images of 24 spruces obtained with a medical CT scanner. The studied trees are representative of several social statuses and come from four stands located in north-eastern France, themselves representative of several age, density and fertility classes. The automatic processing steps developed are the following. First, pith detection in logs, dealing with the problems of knot presence and ring eccentricity; the accuracy of the localisation was less than one mm. Secondly, detection of the sapwood/heartwood limit in logs, dealing with the problem of knot presence (the main source of difficulty); the error on the diameter was 1.8 mm, which corresponds to a relative error of 1.3 per cent. Thirdly, detection of the whorl locations and comparison with an optical method. Fourthly, detection of individualized knots. This process allows knots to be counted and located in a log (longitudinal position and azimuth); however, the validation of the method and the extraction of branch diameter and inclination are still to be developed. An application of this work was a variability analysis of the sapwood content in the trunk: at the within-tree level, the sapwood width was found to be constant under the living crown; at the between-tree level, a strong correlation was found with the amount of living branches. A great number of analyses are possible from our results, among others: architectural analysis with pith tracking and the occurrence of apex death; analysis of radial variations of the heartwood shape; and analysis of the knot distribution in logs. (author)

  7. Detection of Prion Proteins and TSE Infectivity in the Rendering and Biodiesel Manufacture Processes

    Energy Technology Data Exchange (ETDEWEB)

    Brown, R.; Keller, B.; Oleschuk, R. [Queen' s University, Kingston, Ontario (Canada)

    2007-03-15

    This paper addresses emerging issues related to monitoring prion proteins and TSE infectivity in the products and waste streams of the rendering and biodiesel manufacture processes. Monitoring is critical to addressing the knowledge gaps identified in 'Biodiesel from Specified Risk Material Tallow: An Appraisal of TSE Risks and their Reduction' (IEA's AMF Annex XXX, 2006) that prevent comprehensive risk assessment of TSE infectivity in products and waste. The most important challenge for monitoring TSE risk is the wide variety of sample types generated at different points in the rendering/biodiesel production continuum. Conventional transmissible spongiform encephalopathy (TSE) assays were developed for specified risk material (SRM) and other biological tissues. These, however, are insufficient to address the diverse sample matrices produced in rendering and biodiesel manufacture. This paper examines the sample types expected in rendering and biodiesel manufacture and the implications of applying TSE assay methods to them. The authors then discuss a sample-preparation filtration method, which has not yet been applied to these sample types but which has the potential to provide or significantly improve TSE monitoring. The main improvement will come from transferring the prion proteins from the sample matrix to a matrix compatible with conventional and emerging bioassays. A second improvement will come from preconcentrating the prion proteins, i.e. transferring proteins from a larger sample volume into a smaller volume for analysis, to provide greater detection sensitivity. This filtration method may also be useful for monitoring other samples, including wash waters and other waste streams that may contain SRM, including those from abattoirs and on-farm operations. Finally, there is a discussion of emerging mass spectrometric methods, which Prusiner and others have shown to be suitable for detection and characterisation of prion proteins (Stahl ...

  8. Encapsulation Processing and Manufacturing Yield Analysis

    Science.gov (United States)

    Willis, P.

    1985-01-01

    An evaluation of the ethyl vinyl acetate (EVA) encapsulation system is presented. This work is part of the materials baseline needed to demonstrate a 30-year module lifetime capability. Process and compound variables are both being studied, along with various module materials. Results have shown that EVA should be stored rolled up and enclosed in a plastic bag to retard the loss of peroxide curing agents. The TBEC curing agent has a superior shelf life and better processing characteristics than the earlier Lupersol-101 curing agent. Analytical methods were developed to test for peroxide content, and experimental methodologies were formalized.

  9. Streak detection and analysis pipeline for space-debris optical images

    Science.gov (United States)

    Virtanen, Jenni; Poikonen, Jonne; Säntti, Tero; Komulainen, Tuomo; Torppa, Johanna; Granvik, Mikael; Muinonen, Karri; Pentikäinen, Hanna; Martikainen, Julia; Näränen, Jyri; Lehti, Jussi; Flohrer, Tim

    2016-04-01

    We describe a novel data-processing and analysis pipeline for optical observations of moving objects, either of natural (asteroids, meteors) or artificial origin (satellites, space debris). The monitoring of the space object populations requires reliable acquisition of observational data to support the development and validation of population models and to build and maintain catalogues of orbital elements. The orbital catalogues are, in turn, needed for the assessment of close approaches (for asteroids, with the Earth; for satellites, with each other) and for the support of contingency situations or launches. For both types of populations, there is also increasing interest in detecting fainter objects corresponding to the small end of the size distribution. The ESA-funded StreakDet (streak detection and astrometric reduction) activity has aimed at formulating and discussing suitable approaches for the detection and astrometric reduction of object trails, or streaks, in optical observations. Our two main focuses are objects at lower altitudes and space-based observations (i.e., high angular velocities), resulting in long (potentially curved) and faint streaks in the optical images. In particular, we concentrate on single-image (as compared to consecutive frames of the same field) and low-SNR detection of objects. Particular attention has been paid to the process of extracting all necessary information from one image (segmentation) and, subsequently, to efficient reduction of the extracted data (classification). We have developed an automated streak detection and processing pipeline and demonstrated its performance with an extensive database of semisynthetic images simulating streak observations from both ground-based and space-based observing platforms. The average processing time per image is about 13 s for a typical 2k-by-2k image. For long streaks (length >100 pixels), the primary targets of the pipeline, the detection sensitivity (true positives) is about 90% for ...
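
    As a rough illustration of the segmentation/classification split described above, the following sketch flags elongated, above-background features in a single frame. It is not the StreakDet pipeline itself; the threshold values and the Hough-based line fitting are illustrative assumptions.

```python
# Hypothetical single-image streak segmentation sketch (NOT the StreakDet
# algorithm): threshold above the sky background, then keep only elongated
# segments so that point-like stars are rejected.
import cv2
import numpy as np

def detect_streaks(image: np.ndarray, snr_sigma: float = 3.0):
    """Return candidate streak segments [(x1, y1, x2, y2), ...] from a 2-D frame."""
    # Segmentation: keep pixels a few noise sigmas above the background level.
    background, noise = np.median(image), np.std(image)
    mask = ((image - background) > snr_sigma * noise).astype(np.uint8) * 255
    # Classification: fit line segments; the minimum-length requirement drops
    # compact blobs (stars), keeping long trails left by moving objects.
    lines = cv2.HoughLinesP(mask, rho=1, theta=np.pi / 180, threshold=50,
                            minLineLength=100, maxLineGap=5)
    return [] if lines is None else [tuple(l[0]) for l in lines]
```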

  10. "Textural analysis of multiparametric MRI detects transition zone prostate cancer".

    Science.gov (United States)

    Sidhu, Harbir S; Benigno, Salvatore; Ganeshan, Balaji; Dikaios, Nikos; Johnston, Edward W; Allen, Clare; Kirkham, Alex; Groves, Ashley M; Ahmed, Hashim U; Emberton, Mark; Taylor, Stuart A; Halligan, Steve; Punwani, Shonit

    2017-06-01

    To evaluate multiparametric-MRI (mpMRI) derived histogram textural-analysis parameters for detection of transition zone (TZ) prostatic tumour. Sixty-seven consecutive men with suspected prostate cancer underwent 1.5T mpMRI prior to template-mapping biopsy (TPM). Twenty-six men had 'significant' TZ tumour. Two radiologists in consensus matched TPM to the single axial slice best depicting tumour, or the largest TZ diameter for those with benign histology, to define single-slice whole-TZ regions of interest (ROIs). Textural-parameter differences between single-slice whole-TZ ROIs containing significant tumour versus benign/insignificant tumour were analysed using the Mann-Whitney U test. Diagnostic accuracy was assessed by receiver operating characteristic area under the curve (ROC-AUC) analysis, cross-validated with leave-one-out (LOO) analysis. ADC kurtosis was significantly lower (p ...). Textural features of the whole prostate TZ can discriminate significant prostatic cancer through reduced kurtosis of the ADC histogram where significant tumour is included in the TZ-ROI, and through reduced T1 entropy independent of tumour inclusion. • MR textural features of the prostate transition zone may discriminate significant prostatic cancer. • A transition zone (TZ) containing significant tumour demonstrates a less peaked ADC histogram. • A TZ containing significant tumour reveals higher post-contrast T1-weighted homogeneity. • The utility of MR texture analysis in prostate cancer merits further investigation.
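
    The two first-order histogram features the study reports, ADC-histogram kurtosis and T1 entropy, can be computed from an ROI intensity sample along the following lines. This is a hedged sketch; the ROI array and the bin count are assumptions, not the authors' protocol.

```python
# First-order texture features of an ROI intensity sample (illustrative only).
import numpy as np
from scipy.stats import kurtosis

def histogram_features(roi_values: np.ndarray, bins: int = 64):
    k = kurtosis(roi_values, fisher=True)      # peakedness of the histogram
    p, _ = np.histogram(roi_values, bins=bins)
    p = p[p > 0] / p.sum()                     # normalise, drop empty bins
    entropy = -np.sum(p * np.log2(p))          # histogram entropy in bits
    return k, entropy
```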

  11. Detecting inpatient falls by using natural language processing of electronic medical records

    Directory of Open Access Journals (Sweden)

    Toyabe Shin-ichi

    2012-12-01

    Full Text Available Abstract Background Incident reporting is the most common method for detecting adverse events in a hospital. However, under-reporting or non-reporting and delays in the submission of reports are problems that prevent the early detection of serious adverse events. The aim of this study was to determine whether it is possible to promptly detect serious injuries after inpatient falls by using a natural language processing method, and to determine which data source is the most suitable for this purpose. Methods We tried to detect adverse events from the narrative text data of electronic medical records by using a natural language processing method. We made syntactic-category decision rules to detect inpatient falls from text data in electronic medical records. We compared how often true fall events were recorded in various data sources, including progress notes, discharge summaries, image order entries and incident reports. We applied the rules to these data sources and compared the F-measures for detecting falls between these data sources with reference to the results of a manual chart review. The lag time between event occurrence and data submission and the degree of injury were also compared. Results We made 170 syntactic rules to detect inpatient falls by using a natural language processing method. Information on true fall events was most frequently recorded in progress notes (100%), incident reports (65.0%) and image order entries (12.5%). However, the F-measure for detecting falls using the rules was poor when using progress notes (0.12) and discharge summaries (0.24) compared with incident reports (1.00) and image order entries (0.91). Since the results suggested that incident reports and image order entries were possible data sources for the prompt detection of serious falls, we focused on a comparison of falls found in incident reports and image order entries. Injuries caused by falls found via image order entries were significantly more severe than falls detected by
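
    To make the rule-matching idea concrete, here is a minimal hypothetical sketch. The two English patterns below merely stand in for the study's 170 syntactic rules and are not taken from the paper.

```python
# Toy rule-based fall detection in clinical free text (patterns are invented
# stand-ins for the study's syntactic-category decision rules).
import re

FALL_PATTERNS = [
    re.compile(r"\b(patient|pt)\b.*\b(fell|fall(en)?|slipped)\b", re.IGNORECASE),
    re.compile(r"\bfound\b.*\bon the floor\b", re.IGNORECASE),
]
NEGATION = re.compile(r"\b(no|denies|without)\b[^.]*\bfalls?\b", re.IGNORECASE)

def is_fall_event(note: str) -> bool:
    """Flag a note as describing an inpatient fall unless the mention is negated."""
    if NEGATION.search(note):
        return False
    return any(p.search(note) for p in FALL_PATTERNS)

assert is_fall_event("Pt fell while walking to the bathroom.")
assert not is_fall_event("Patient denies any falls this week.")
```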

  12. Radionuclides for process analysis problems and examples

    International Nuclear Information System (INIS)

    Otto, R.; Koennecke, H.G.; Luther, D.; Hecht, P.

    1986-01-01

    Both the practical problems of applying tracer techniques for residence time measurements and the advantages of the methods are discussed. In this paper selected examples of tracer experiments carried out in a drinking water generator, a caprolactam production plant and a cokery are given. In all cases the efficiency of the processes investigated could be improved. (author)

  13. Fingerprint Analysis with Marked Point Processes

    DEFF Research Database (Denmark)

    Forbes, Peter G. M.; Lauritzen, Steffen; Møller, Jesper

    We present a framework for fingerprint matching based on marked point process models. An efficient Monte Carlo algorithm is developed to calculate the marginal likelihood ratio for the hypothesis that two observed prints originate from the same finger against the hypothesis that they originate from different fingers. Our model achieves good performance on an NIST-FBI fingerprint database of 258 matched fingerprint pairs.

  14. Representative process sampling for reliable data analysis

    DEFF Research Database (Denmark)

    Julius, Lars Petersen; Esbensen, Kim

    2005-01-01

    Sampling variances can be reduced greatly, however, and sampling biases can be eliminated completely, by respecting a simple set of rules and guidelines provided by the Theory of Sampling (TOS). A systematic approach to the description of process heterogeneity furnishes in-depth knowledge about the specific variability of any 1-D lot ...

  15. Exergy analysis in industrial food processing

    NARCIS (Netherlands)

    Zisopoulos, F.K.

    2016-01-01

    The sustainable provision of food on a global scale in the near future is a very serious challenge. This thesis focuses on the assessment and design of sustainable industrial food production chains and processes by using the concept of exergy, which is an objective metric based on the first and second law of thermodynamics.

  16. [Fast Detection of Camellia Sinensis Growth Process and Tea Quality Informations with Spectral Technology: A Review].

    Science.gov (United States)

    Peng, Ji-yu; Song, Xing-lin; Liu, Fei; Bao, Yi-dan; He, Yong

    2016-03-01

    The research achievements and trends of spectral technology for the fast detection of Camellia sinensis growth-process information and tea quality information are reviewed. Spectral technology is a fast, nondestructive and efficient detection technology, mainly comprising infrared spectroscopy, fluorescence spectroscopy, Raman spectroscopy and mass spectrometry. The rapid detection of Camellia sinensis growth-process information and tea quality helps to realize the informatization and automation of tea production and to ensure tea quality and safety. This paper reviews its applications, covering the detection of tea (Camellia sinensis) growing status (nitrogen, chlorophyll, diseases and insect pests), the discrimination of tea varieties, the grade discrimination of tea, the detection of tea internal quality (catechins, total polyphenols, caffeine, amino acids, pesticide residues and so on), the quality evaluation of tea beverages and tea by-products, and machinery for tea quality determination and discrimination. The paper also briefly introduces trends in the determination of tea growth-process information, sensors and industrial applications. In conclusion, spectral technology shows high potential for detecting Camellia sinensis growth-process information, predicting tea internal quality and classifying tea varieties and grades. Suitable chemometrics and preprocessing methods help to improve model performance and remove redundancy, which opens the possibility of developing portable instruments. Future work on developing portable instruments and on-line detection systems is recommended to improve practical application. The applications and research achievements of spectral technology concerning tea are outlined in this paper for the first time, covering Camellia sinensis growth, tea production, the quality and safety of tea and tea by-products and so on, as well as some problems to be solved
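
    A typical chemometrics workflow of the kind the review alludes to, spectral preprocessing followed by multivariate regression, might look like the sketch below. The file names, array shapes and component count are illustrative assumptions.

```python
# Sketch: Savitzky-Golay derivative preprocessing of NIR spectra, then PLS
# regression against a reference quality attribute (e.g. total polyphenols).
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

X = np.load("tea_spectra.npy")      # (n_samples, n_wavelengths), hypothetical data
y = np.load("tea_polyphenols.npy")  # (n_samples,), reference laboratory values

X_sg = savgol_filter(X, window_length=15, polyorder=2, deriv=1, axis=1)
pls = PLSRegression(n_components=8)
print(cross_val_score(pls, X_sg, y, cv=5, scoring="r2"))
```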

  17. Entrepreneurship Learning Process by using SWOT Analysis

    OpenAIRE

    Jajat Sudrajat; Muhammad Ali Rahman; Antonius Sianturi; Vendy Vendy

    2016-01-01

    The research objective was to produce a model of learning entrepreneurship by using SWOT analysis, which was currently being run with the concept of large classes and small classes. This study was expected to be useful for the Binus Entrepreneurship Center (BEC) unit in creating a development map for entrepreneurship learning. The influence generated by using SWOT analysis would be very wide, as would the benefits of implementing large classes and small classes for students...

  18. Laplace-Laplace analysis of the fractional Poisson process

    OpenAIRE

    Gorenflo, Rudolf; Mainardi, Francesco

    2013-01-01

    We generate the fractional Poisson process by subordinating the standard Poisson process to the inverse stable subordinator. Our analysis is based on application of the Laplace transform with respect to both arguments of the evolving probability densities.
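
    For orientation, a minimal summary of the standard construction in the notation usually adopted for it (assumed here, not quoted from the paper): with N(t) the rate-λ Poisson process, Y_β(t) the inverse β-stable subordinator, E_β the Mittag-Leffler function and ψ the waiting-time density,

```latex
\begin{align*}
  N_\beta(t) &= N\bigl(Y_\beta(t)\bigr), \qquad 0 < \beta \le 1,\\
  \Pr(T > t) &= E_\beta\!\left(-\lambda t^{\beta}\right)
               \quad \text{(Mittag--Leffler waiting-time survival)},\\
  \widetilde{\psi}(s) &= \int_0^\infty e^{-st}\,\psi(t)\,dt
               = \frac{\lambda}{\lambda + s^{\beta}} .
\end{align*}
```

    Setting β = 1 recovers the exponential waiting-time law λ/(λ + s) of the standard Poisson process.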

  19. Linkage analysis: Inadequate for detecting susceptibility loci in complex disorders?

    Energy Technology Data Exchange (ETDEWEB)

    Field, L.L.; Nagatomi, J. [Univ. of Calgary, Alberta (Canada)

    1994-09-01

    Insulin-dependent diabetes mellitus (IDDM) may provide valuable clues about approaches to detecting susceptibility loci in other oligogenic disorders. Numerous studies have demonstrated significant association between IDDM and a VNTR in the 5′ flanking region of the insulin (INS) gene. Paradoxically, all attempts to demonstrate linkage of IDDM to this VNTR have failed. Lack of linkage has been attributed to insufficient marker locus information, genetic heterogeneity, or a high frequency of the IDDM-predisposing allele in the general population. Tyrosine hydroxylase (TH) is located 2.7 kb from INS on the 5′ side of the VNTR and shows linkage disequilibrium with INS region loci. We typed a highly polymorphic microsatellite within TH in 176 multiplex families, and performed parametric (lod score) linkage analysis using various intermediate reduced-penetrance models for IDDM (including rare and common disease allele frequencies), as well as non-parametric (affected sib pair) linkage analysis. The scores significantly reject linkage for recombination values of .05 or less, excluding the entire 19 kb region containing TH, the 5′ VNTR, the INS gene, and IGF2 on the 3′ side of INS. Non-parametric linkage analysis also provided no significant evidence for linkage (mean TH allele sharing 52.5%, P=.12). These results have important implications for efforts to locate genes predisposing to complex disorders, strongly suggesting that regions which are significantly excluded by linkage methods may nevertheless contain predisposing genes readily detectable by association methods. We advocate that investigators routinely perform association analyses in addition to linkage analyses.

  20. Genome-Wide Detection and Analysis of Multifunctional Genes

    Science.gov (United States)

    Pritykin, Yuri; Ghersi, Dario; Singh, Mona

    2015-01-01

    Many genes can play a role in multiple biological processes or molecular functions. Identifying multifunctional genes at the genome-wide level and studying their properties can shed light upon the complexity of molecular events that underpin cellular functioning, thereby leading to a better understanding of the functional landscape of the cell. However, to date, genome-wide analysis of multifunctional genes (and the proteins they encode) has been limited. Here we introduce a computational approach that uses known functional annotations to extract genes playing a role in at least two distinct biological processes. We leverage functional genomics data sets for three organisms—H. sapiens, D. melanogaster, and S. cerevisiae—and show that, as compared to other annotated genes, genes involved in multiple biological processes possess distinct physicochemical properties, are more broadly expressed, tend to be more central in protein interaction networks, tend to be more evolutionarily conserved, and are more likely to be essential. We also find that multifunctional genes are significantly more likely to be involved in human disorders. These same features also hold when multifunctionality is defined with respect to molecular functions instead of biological processes. Our analysis uncovers key features about multifunctional genes, and is a step towards a better genome-wide understanding of gene multifunctionality. PMID:26436655

  1. Analysis of the Chirplet Transform-Based Algorithm for Radar Detection of Accelerated Targets

    Science.gov (United States)

    Galushko, V. G.; Vavriv, D. M.

    2017-06-01

    Purpose: Efficiency analysis of an optimal algorithm for chirp signal processing based on the chirplet transform, as applied to the detection of radar targets in uniformly accelerated motion. Design/methodology/approach: Standard methods of optimal filtration theory are used to investigate the ambiguity function of chirp signals. Findings: An analytical expression has been derived for the ambiguity function of chirp signals, which is analyzed with respect to the detection of radar targets moving at a constant acceleration. The sidelobe level and characteristic width of the ambiguity function with respect to the coordinates (frequency and its rate of change) have been estimated. The gain in the signal-to-noise ratio provided by the algorithm under consideration has been assessed, as compared with application of the standard Fourier transform to the detection of chirp signals against a “white” noise background. It is shown that already with a comparatively small number of processing channels (elementary filters with respect to the frequency change rate) the gain in the signal-to-noise ratio exceeds 10 dB. A block diagram of the implementation of the algorithm is suggested on the basis of a multichannel weighted Fourier transform. Recommendations for the selection of the detection algorithm parameters have been developed. Conclusions: The obtained results testify to the efficiency of applying the algorithm under consideration to the detection of radar targets moving at a constant acceleration. Nevertheless, it seems expedient to perform computer simulations of its operability, with account taken of noise impact, along with trial measurements in real conditions.
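
    The multichannel structure can be sketched as a bank of dechirp channels, each matched to a candidate frequency change rate; a target in uniformly accelerated motion collapses into a single frequency bin in the channel whose rate matches. This is an illustration of the principle, not the authors' implementation.

```python
# Sketch of a chirp-rate filter bank: dechirp with each candidate rate, take
# the FFT, and report the channel with the strongest spectral peak.
import numpy as np

def chirp_bank_detect(x: np.ndarray, fs: float, chirp_rates: np.ndarray):
    """Return (best_rate, peak) over a bank of dechirp channels (rates in Hz/s)."""
    t = np.arange(len(x)) / fs
    best_rate, best_peak = None, -np.inf
    for k in chirp_rates:                       # elementary filters of the bank
        dechirped = x * np.exp(-1j * np.pi * k * t**2)
        peak = np.abs(np.fft.fft(dechirped)).max()
        if peak > best_peak:
            best_rate, best_peak = k, peak
    return best_rate, best_peak
```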

  2. A Comparative Analysis for Selection of Appropriate Mother Wavelet for Detection of Stationary Disturbances

    Science.gov (United States)

    Kamble, Saurabh Prakash; Thawkar, Shashank; Gaikwad, Vinayak G.; Kothari, D. P.

    2017-12-01

    Detection of disturbances is the first step of mitigation. Power electronics plays a crucial role in the modern power system, making system operation efficient, but it also brings stationary disturbances into the power system and adds impurities to the supply. This happens because of the non-linear loads used in the modern power system, which inject disturbances like harmonics, flicker, sag, etc. into the power grid. These impurities can damage equipment, so it is necessary to mitigate them very quickly. Digital signal processing techniques are therefore incorporated for detection. Techniques like the fast Fourier transform, short-time Fourier transform and wavelet transform are widely used for the detection of disturbances. Among these, the wavelet transform is widely used because of its better detection capabilities. But which mother wavelet to use for detection remains an open question. Depending upon their periodicity, disturbances are classified as stationary or non-stationary. This paper presents the importance of the selection of the mother wavelet for analyzing stationary disturbances using the discrete wavelet transform. Signals with stationary disturbances of various frequencies are generated using MATLAB. These signals are analysed using various mother wavelets, such as Daubechies and biorthogonal wavelets, and the measured root-mean-square value of the stationary disturbance is obtained. The value measured by the discrete wavelet transform is compared with the exact RMS value of the frequency component, and the percentage differences are presented, which helps in selecting the optimum mother wavelet.
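
    A minimal sketch of the comparison described above, assuming a 50 Hz fundamental carrying a 250 Hz stationary disturbance; the sampling rate, the mapping of the disturbance onto a decomposition level, and the candidate wavelets are assumptions.

```python
# Compare mother wavelets by the error between the DWT-measured RMS of a
# 250 Hz disturbance and its exact RMS (PyWavelets).
import numpy as np
import pywt

fs = 6400
t = np.arange(0, 0.2, 1 / fs)
signal = np.sin(2 * np.pi * 50 * t) + 0.1 * np.sin(2 * np.pi * 250 * t)
exact_rms = 0.1 / np.sqrt(2)                      # RMS of the 250 Hz component

for wavelet in ["db4", "db8", "bior3.5"]:
    coeffs = pywt.wavedec(signal, wavelet, level=4)
    d4 = coeffs[1]                                # cD4 covers ~200-400 Hz here
    rms = np.sqrt(np.sum(d4**2) / len(signal))    # Parseval-style RMS estimate
    print(f"{wavelet}: {rms:.4f} ({abs(rms - exact_rms) / exact_rms:.1%} error)")
```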

  3. Processing Cost Analysis for Biomass Feedstocks

    Energy Technology Data Exchange (ETDEWEB)

    Badger, P.C.

    2002-11-20

    The receiving, handling, storing, and processing of woody biomass feedstocks is an overlooked component of biopower systems. The purpose of this study was twofold: (1) to identify and characterize all the receiving, handling, storing, and processing steps required to make woody biomass feedstocks suitable for use in direct combustion and gasification applications, including small modular biopower (SMB) systems, and (2) to estimate the capital and operating costs at each step. Since biopower applications can be varied, a number of conversion systems and feedstocks required evaluation. In addition to limiting this study to woody biomass feedstocks, the boundaries of this study were from the power plant gate to the feedstock entry point into the conversion device. Although some power plants are sited at a source of wood waste fuel, it was assumed for this study that all wood waste would be brought to the power plant site. This study was also confined to the following three feedstocks (1) forest residues, (2) industrial mill residues, and (3) urban wood residues. Additionally, the study was confined to grate, suspension, and fluidized bed direct combustion systems; gasification systems; and SMB conversion systems. Since scale can play an important role in types of equipment, operational requirements, and capital and operational costs, this study examined these factors for the following direct combustion and gasification system size ranges: 50, 20, 5, and 1 MWe. The scope of the study also included: Specific operational issues associated with specific feedstocks (e.g., bark and problems with bridging); Opportunities for reducing handling, storage, and processing costs; How environmental restrictions can affect handling and processing costs (e.g., noise, commingling of treated wood or non-wood materials, emissions, and runoff); and Feedstock quality issues and/or requirements (e.g., moisture, particle size, presence of non-wood materials). The study found that over the

  4. Seeking a fingerprint: analysis of point processes in actigraphy recording

    Science.gov (United States)

    Gudowska-Nowak, Ewa; Ochab, Jeremi K.; Oleś, Katarzyna; Beldzik, Ewa; Chialvo, Dante R.; Domagalik, Aleksandra; Fąfrowicz, Magdalena; Marek, Tadeusz; Nowak, Maciej A.; Ogińska, Halszka; Szwed, Jerzy; Tyburczyk, Jacek

    2016-05-01

    Motor activity of humans displays complex temporal fluctuations which can be characterised by scale-invariant statistics, thus demonstrating that structure and fluctuations of such kinetics remain similar over a broad range of time scales. Previous studies on humans regularly deprived of sleep or suffering from sleep disorders predicted a change in the invariant scale parameters with respect to those for healthy subjects. In this study we investigate the signal patterns from actigraphy recordings by means of characteristic measures of fractional point processes. We analyse spontaneous locomotor activity of healthy individuals recorded during a week of regular sleep and a week of chronic partial sleep deprivation. Behavioural symptoms of lack of sleep can be evaluated by analysing statistics of duration times during active and resting states, and alteration of behavioural organisation can be assessed by analysis of power laws detected in the event count distribution, distribution of waiting times between consecutive movements and detrended fluctuation analysis of recorded time series. We claim that among different measures characterising complexity of the actigraphy recordings and their variations implied by chronic sleep distress, the exponents characterising slopes of survival functions in resting states are the most effective biomarkers distinguishing between healthy and sleep-deprived groups.
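
    One of the measures mentioned above, the slope of the survival function of rest-bout durations, can be estimated along these lines; the input is assumed to be a positive array of bout durations extracted from the actigraphy record.

```python
# Fit the tail exponent of the empirical survival function S(t) on log-log axes.
import numpy as np

def survival_tail_slope(durations: np.ndarray, tail_quantile: float = 0.5) -> float:
    d = np.sort(durations)
    S = 1.0 - np.arange(len(d)) / len(d)         # empirical survival function
    tail = d >= np.quantile(d, tail_quantile)    # fit only the tail
    slope, _ = np.polyfit(np.log(d[tail]), np.log(S[tail]), 1)
    return slope                                 # more negative => faster decay
```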

  5. IMPRINT Analysis of an Unmanned Air System Geospatial Information Process

    National Research Council Canada - National Science Library

    Hunn, Bruce P; Schweitzer, Kristin M; Cahir, John A; Finch, Mary M

    2008-01-01

    ... intelligence, geospatial analysis cell. The Improved Performance Research Integration Tool (IMPRINT) modeling program was used to understand this process and to assess crew workload during several test scenarios...

  6. SINGLE TREE DETECTION FROM AIRBORNE LASER SCANNING DATA USING A MARKED POINT PROCESS BASED METHOD

    Directory of Open Access Journals (Sweden)

    J. Zhang

    2013-05-01

    Full Text Available Tree detection and reconstruction is of great interest in large-scale city modelling. In this paper, we present a marked point process model to detect single trees from airborne laser scanning (ALS) data. We consider single trees in the ALS-recovered canopy height model (CHM) as a realization of a point process of circles. Unlike the traditional marked point process, we sample the model in a constrained configuration space by making use of image-processing techniques. A Gibbs energy is defined on the model, containing a data term, which judges the fitness of the model with respect to the data, and a prior term, which incorporates prior knowledge of object layouts. We search for the optimal configuration through a steepest gradient descent algorithm. The presented hybrid framework was tested on three forest plots, and experiments show the effectiveness of the proposed method.
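
    The Gibbs energy described above has the generic form below; the notation and the pairwise overlap penalty are assumptions made for illustration.

```latex
% Configuration c = {c_1, ..., c_n} of circles on the CHM; D scores each circle
% against the data, V penalises improbable layouts (e.g. strong crown overlap).
U(\mathbf{c}) = \sum_{i=1}^{n} D(c_i) + \gamma \sum_{c_i \sim c_j} V(c_i, c_j),
\qquad
\hat{\mathbf{c}} = \arg\min_{\mathbf{c}} U(\mathbf{c}).
```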

  7. End point detection in ion milling processes by sputter-induced optical emission spectroscopy

    International Nuclear Information System (INIS)

    Lu, C.; Dorian, M.; Tabei, M.; Elsea, A.

    1984-01-01

    The characteristic optical emission from the sputtered material during ion milling processes can provide an unambiguous indication of the presence of the specific etched species. By monitoring the intensity of a representative emission line, the etching process can be precisely terminated at an interface. Enhancement of the etching end point detection is possible by using a dual-channel photodetection system operating in a ratio or difference mode. The installation of the optical detection system on an existing etching chamber has been greatly facilitated by the use of optical fibers. Using a commercial ion milling system, experimental data for a number of etching processes have been obtained. The results demonstrate that sputter-induced optical emission spectroscopy offers many advantages over other techniques in detecting the etching end point of ion milling processes
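
    The ratio-mode logic can be sketched as follows; the threshold and the debounce count are illustrative assumptions.

```python
# Declare the etch endpoint when the line-to-reference intensity ratio stays
# below threshold for several consecutive samples (debouncing noise spikes).
import numpy as np

def endpoint_index(line: np.ndarray, reference: np.ndarray,
                   threshold: float = 0.2, hold: int = 5) -> int:
    ratio = line / np.maximum(reference, 1e-12)  # avoid division by zero
    below = ratio < threshold
    for i in range(len(below) - hold + 1):
        if below[i:i + hold].all():
            return i                             # sample index of the endpoint
    return -1                                    # endpoint not reached
```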

  8. Detection of masses in mammograms by analysis of gradient vector convergence using sector filter

    International Nuclear Information System (INIS)

    Fakhari, Y.; Karimian, A.; Mohammadbeigi, M.

    2012-01-01

    Although mammography is the main diagnostic method for breast cancer, the interpretation of mammograms is a difficult task that depends on the experience and skill of the radiologist. Computer-aided detection (CADe) systems have been proposed to help radiologists in the interpretation of mammograms. In this paper a novel filter, called the sector filter, is proposed to detect masses. This filter works based on the analysis of the convergence of gradient vectors toward the center of the filter. Using this filter, rounded convex regions, which are more likely to pertain to a mass, can be detected in a gray-scale image. After applying this filter to the images at two scales, and to their linear combination, suspicious points were selected by a specific process. After implementation of the proposed method, promising results were achieved. The performance of the proposed method was competitive with, and in some cases better than, that of other methods suggested in the literature. (authors)
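
    The principle of gradient-vector convergence can be sketched as follows; this illustrates the convergence measure only and does not reproduce the authors' sector partitioning or scale combination.

```python
# Score a candidate centre by the fraction of neighbourhood gradient vectors
# pointing toward it; bright rounded masses give gradients converging inward.
import numpy as np

def convergence_score(image: np.ndarray, y0: int, x0: int, radius: int = 15) -> float:
    gy, gx = np.gradient(image.astype(float))
    ys, xs = np.mgrid[y0 - radius:y0 + radius + 1, x0 - radius:x0 + radius + 1]
    dy, dx = y0 - ys, x0 - xs                    # inward direction at each pixel
    norm = np.hypot(dy, dx) + 1e-12
    gmag = np.hypot(gy[ys, xs], gx[ys, xs]) + 1e-12
    cos = (gy[ys, xs] * dy + gx[ys, xs] * dx) / (norm * gmag)
    # Note: assumes the neighbourhood lies fully inside the image.
    return float((cos > 0.5).mean())             # fraction of converging vectors
```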

  9. Acquisition and processing of advanced sensor data for ERW and UXO detection and classification

    Science.gov (United States)

    Schultz, Gregory M.; Keranen, Joe; Miller, Jonathan S.; Shubitidze, Fridon

    2014-06-01

    The remediation of explosive remnants of war (ERW) and associated unexploded ordnance (UXO) has seen improvements through the injection of modern technological advances and streamlined standard operating procedures. However, reliable and cost-effective detection and geophysical mapping of sites contaminated with UXO such as cluster munitions, abandoned ordnance, and improvised explosive devices rely on the ability to discriminate hazardous items from metallic clutter. In addition to anthropogenic clutter, handheld and vehicle-based metal detector systems are plagued by natural geologic and environmental noise in many post conflict areas. We present new and advanced electromagnetic induction (EMI) technologies including man-portable and towed EMI arrays and associated data processing software. While these systems feature vastly different form factors and transmit-receive configurations, they all exhibit several fundamental traits that enable successful classification of EMI anomalies. Specifically, multidirectional sampling of scattered magnetic fields from targets and corresponding high volume of unique data provide rich information for extracting useful classification features for clutter rejection analysis. The quality of classification features depends largely on the extent to which the data resolve unique physics-based parameters. To date, most of the advanced sensors enable high quality inversion by producing data that are extremely rich in spatial content through multi-angle illumination and multi-point reception.

  10. Automated processing of the single-lead electrocardiogram for the detection of obstructive sleep apnoea.

    Science.gov (United States)

    de Chazal, Philip; Heneghan, Conor; Sheridan, Elaine; Reilly, Richard; Nolan, Philip; O'Malley, Mark

    2003-06-01

    A method for the automatic processing of the electrocardiogram (ECG) for the detection of obstructive apnoea is presented. The method screens nighttime single-lead ECG recordings for the presence of major sleep apnoea and provides a minute-by-minute analysis of disordered breathing. A large independently validated database of 70 ECG recordings acquired from normal subjects and subjects with obstructive and mixed sleep apnoea, each of approximately eight hours in duration, was used throughout the study. Thirty-five of these recordings were used for training and 35 retained for independent testing. A wide variety of features based on heartbeat intervals and an ECG-derived respiratory signal were considered. Classifiers based on linear and quadratic discriminants were compared. Feature selection and regularization of classifier parameters were used to optimize classifier performance. Results show that the normal recordings could be separated from the apnoea recordings with a 100% success rate and a minute-by-minute classification accuracy of over 90% is achievable.
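
    The minute-by-minute classification setup might be sketched as follows, with a linear discriminant as in the paper; the per-minute feature files and the recording-level screening rule are assumptions.

```python
# Classify each minute as normal vs. disordered breathing from per-minute
# features (RR-interval and ECG-derived respiration statistics, assumed given).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X_train = np.load("minute_features_train.npy")  # (n_minutes, n_features), hypothetical
y_train = np.load("minute_labels_train.npy")    # 1 = disordered breathing, 0 = normal
X_test = np.load("minute_features_test.npy")

clf = LinearDiscriminantAnalysis().fit(X_train, y_train)
minute_labels = clf.predict(X_test)
# Screen the whole recording as apnoeic if enough minutes are flagged
# (the 10% cut-off below is an assumed rule, not the paper's criterion).
is_apnoea = minute_labels.mean() > 0.1
```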

  11. Analysis of Americium in Transplutonium Process Solutions

    International Nuclear Information System (INIS)

    Ferguson, R.B.

    2001-01-01

    One of the more difficult analyses in the transplutonium field is the determination of americium at trace levels in a complex matrix such as a process dissolver solution. Because of these conditions a highly selective separation must precede the measurement of americium. The separation technique should be mechanically simple to permit remote operation with master-slave manipulators. For subsequent americium measurement by the mass spectroscopic isotopic-dilution technique, plutonium and curium interferences must also have been removed

  12. Analysis and improvement of last warehousing processes

    OpenAIRE

    Kumetytė, Indrė

    2017-01-01

    Efficiency and productivity are among the most significant factors for every manufacturing company seeking to maintain competitiveness and leadership in the market. To keep them, an enterprise has to devote considerable attention and effort to internal logistics and the management of its inventory. The activity and production principles analysed for the selected Lithuanian production company address a major present-day problem: inventory management and time reduction for warehousing processes. Thus, in this...

  13. Advanced Color Image Processing and Analysis

    CERN Document Server

    2013-01-01

    This volume does much more than survey modern advanced color processing. Starting with a historical perspective on ways we have classified color, it sets out the latest numerical techniques for analyzing and processing colors, the leading edge in our search to accurately record and print what we see. The human eye perceives only a fraction of available light wavelengths, yet we live in a multicolor world of myriad shining hues. Colors rich in metaphorical associations make us “purple with rage” or “green with envy” and cause us to “see red.” Defining colors has been the work of centuries, culminating in today’s complex mathematical coding that nonetheless remains a work in progress: only recently have we possessed the computing capacity to process the algebraic matrices that reproduce color more accurately. With chapters on dihedral color and image spectrometers, this book provides technicians and researchers with the knowledge they need to grasp the intricacies of today’s color imaging.

  14. Analysis of the medication reconciliation process conducted at hospital admission

    Directory of Open Access Journals (Sweden)

    María Beatriz Contreras Rey

    2016-07-01

    Full Text Available Objective: To analyze the outcomes of a medication reconciliation process at admission in the hospital setting, and to assess the role of the pharmacist in detecting reconciliation errors and preventing any adverse events they might entail. Method: A retrospective study was conducted to analyze the medication reconciliation activity during the previous six months. The study included those patients for whom an apparently unjustified discrepancy was detected at admission, after comparing the hospital medication prescribed with the home treatment stated in their hospital clinical records. Patients for whom the physician ordered the introduction of home medication without any specification were also considered. In order to conduct the reconciliation process, the pharmacist prepared the best possible pharmacotherapeutic history, reviewing all available information about the medication the patient could be taking before admission, and completing the process with a clinical interview. The discrepancies requiring clarification were reported to the physician. The reconciliation proposal was considered accepted if the relevant modification was made in the physician's next visit, or within 24-48 hours at most; such a case was then labelled a reconciliation error. For the descriptive analysis, the SPSS® Statistics program, version 17.0, was used. Outcomes: 494 medications were reconciled in 220 patients, with a mean of 2.25 medications per patient. More than half of the patients (59.5%) had some discrepancy that required clarification; the most frequent was the omission of a medication that the patient was taking before admission (86.2%), followed by an unjustified modification in dosing or route of administration (5.9%). In total, 312 discrepancies required clarification; of these, 93 (29.8%) were accepted and considered reconciliation errors, 126 (40%) were not accepted, and in 93 cases (29.8%) acceptance was not relevant due to a change in

  15. Biomedical journals lack a consistent method to detect outcome reporting bias: a cross-sectional analysis.

    Science.gov (United States)

    Huan, L N; Tejani, A M; Egan, G

    2014-10-01

    An increasing amount of recently published literature has implicated outcome reporting bias (ORB) as a major contributor to skewed data in both randomized controlled trials and systematic reviews; however, little is known about the methods currently in place to detect ORB. This study aims to gain insight into the detection and management of ORB by biomedical journals. This was a cross-sectional analysis involving standardized questions, via email or telephone, with the top 30 biomedical journals (2012) ranked by impact factor. The Cochrane Database of Systematic Reviews was excluded, leaving 29 journals in the sample. Of the 29 journals, 24 (83%) responded to our initial inquiry, of which 14 (58%) answered our questions and 10 (42%) declined participation. Five (36%) of the responding journals indicated they had a specific method to detect ORB, whereas 9 (64%) did not have a specific method in place. The prevalence of ORB in the review process seemed to differ: 4 (29%) journals indicated ORB was found commonly, whereas 7 (50%) indicated ORB was uncommon or had never been detected by their journal. The majority (n = 10/14, 72%) of journals were unwilling to report discrepancies found in manuscripts or make them available to the public. Although in the minority, some journals (n = 4/14, 29%) described thorough methods to detect ORB. Many journals seemed to lack a method with which to detect ORB, and its estimated prevalence was much lower than that reported in the literature, suggesting inadequate detection. There exists a potential for overestimation of the treatment effects of interventions and unclear risks. Fortunately, there are journals within this sample which appear to use comprehensive methods for the detection of ORB, but overall the data suggest that improvements at the biomedical-journal level for detecting and minimizing the effect of this bias are needed. © 2014 John Wiley & Sons Ltd.

  16. Processes and instruments for detecting bubbles in a medium as a liquid metal

    International Nuclear Information System (INIS)

    1977-01-01

    This invention concerns processes and apparatuses for detecting bubbles in a medium containing them, particularly, although not exclusively, bubbles in a liquid metal used in the cooling system of a fast nuclear reactor. The process consists in producing a relative movement between the bubbles and a receiving device, in emitting a collimated ultrasonic signal beamed at the bubble by means of a transmitter at a frequency equal to or greater than the resonance frequency of the bubble, and in detecting a Doppler signal emitted by the bubble and received by the receiving device so as to detect the bubble. Preferably the diffusion due to the Doppler effect is such that the received diffused Doppler signal has a pulse shape with a peak amplitude proportional to the radius of the bubble and appears as an asymmetrical sideband with respect to the ultrasonic signal. Preferably the diffusion due to the Doppler effect is brought about by the movement of the bubbles. According to another of its characteristics, the invention concerns an apparatus for detecting a bubble in a medium containing it, where a relative movement is produced between the apparatus and the bubble. This apparatus includes a device for emitting an ultrasonic signal beamed at the bubble, a device for receiving in return a Doppler signal diffused by the bubble, and a device for detecting the diffused Doppler signal received by the receiving device so as to detect the bubble [fr
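
    Two standard acoustics relations sit behind such a scheme; they are stated here for orientation and are not quoted from the patent. For a gas bubble of radius R, the Minnaert resonance frequency and the Doppler shift from a bubble moving at speed v are

```latex
f_0 = \frac{1}{2\pi R}\sqrt{\frac{3\gamma p_0}{\rho}},
\qquad
f_d = \frac{2 v f_e}{c}\cos\theta ,
```

    where γ is the polytropic exponent of the gas, p₀ the ambient pressure, ρ the liquid density, f_e the emitted frequency, c the speed of sound in the liquid and θ the angle between the bubble velocity and the beam.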

  17. Growth Curve Analysis and Change-Points Detection in Extremes

    KAUST Repository

    Meng, Rui

    2016-05-15

    The thesis consists of two coherent projects. The first project presents the results of evaluating salinity tolerance in barley using growth curve analysis, where different growth trajectories are observed within barley families. The study of salinity tolerance in plants is crucial to understanding plant growth and productivity. Because fully automated smarthouses with conveyor systems allow non-destructive and high-throughput phenotyping of large numbers of plants, it is now possible to apply advanced statistical tools to analyze daily measurements and to study salinity tolerance. To compare the different growth patterns of barley varieties, we use functional data analysis techniques to analyze the daily projected shoot areas. In particular, we apply the curve registration method to align all the curves from the same barley family in order to summarize family-wise features. We also illustrate how to use statistical modeling to account for spatial variation in microclimate in smarthouses and for temporal variation across runs, which is crucial for identifying traits of the barley varieties. In our analysis, we show that the concentrations of sodium and potassium in leaves are negatively correlated, and their interactions are associated with the degree of salinity tolerance. The second project studies change-point detection methods in extremes when multiple time series are available. Motivated by the scientific question of whether the chances of experiencing extreme weather differ between seasons of a year, we develop a change-point detection model to study changes in extremes, or in the tail of a distribution. Most existing models identify seasons from multiple yearly time series assuming a season or change-point location remains exactly the same across years. In this work, we propose a random-effect model that allows the change-point to vary from year to year, following a given distribution. Both parametric and nonparametric methods are developed

  18. Automatic Defect Detection for TFT-LCD Array Process Using Quasiconformal Kernel Support Vector Data Description

    Directory of Open Access Journals (Sweden)

    Yi-Hung Liu

    2011-09-01

    Full Text Available Defect detection has been considered an efficient way to increase the yield rate of panels in thin film transistor liquid crystal display (TFT-LCD) manufacturing. In this study we focus on the array process, since it is the first and key process in TFT-LCD manufacturing. Various defects occur in the array process, and some of them can cause great damage to the LCD panels. Thus, how to design a method that can robustly detect defects from the images captured from the surface of LCD panels has become crucial. Previously, support vector data description (SVDD) has been successfully applied to LCD defect detection. However, its generalization performance is limited. In this paper, we propose a novel one-class machine learning method, called quasiconformal kernel SVDD (QK-SVDD), to address this issue. The QK-SVDD can significantly improve the generalization performance of the traditional SVDD by introducing the quasiconformal transformation into a predefined kernel. Experimental results, obtained on real LCD images provided by an LCD manufacturer in Taiwan, indicate that the proposed QK-SVDD not only obtains a high defect detection rate of 96%, but also greatly improves the generalization performance of SVDD. The improvement is shown to be over 30%. In addition, results also show that the QK-SVDD defect detector is able to accomplish the task of defect detection on an LCD image within 60 ms.
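
    As a baseline sketch of the one-class setup: SVDD with a Gaussian kernel is equivalent to the One-Class SVM, which is available off the shelf; the paper's quasiconformal kernel transformation is not implemented here, and the feature extraction step is assumed.

```python
# Train a one-class detector on defect-free patch features and flag outliers.
import numpy as np
from sklearn.svm import OneClassSVM

X_normal = np.load("defect_free_patch_features.npy")  # hypothetical training data
X_test = np.load("inspection_patch_features.npy")

detector = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(X_normal)
is_defect = detector.predict(X_test) == -1            # -1 = outside the description
```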

  19. An image-processing method to detect sub-optical features based on understanding noise in intensity measurements.

    Science.gov (United States)

    Bhatia, Tripta

    2018-02-01

    Accurate quantitative analysis of image data requires that we distinguish between fluorescence intensity (the true signal) and the noise inherent to its measurement to the extent possible. We image multilamellar membrane tubes and beads that grow from defects in the fluid lamellar phase of the lipid 1,2-dioleoyl-sn-glycero-3-phosphocholine dissolved in water and water-glycerol mixtures, using a fluorescence confocal polarizing microscope. We quantify the image noise and determine the noise statistics. Understanding the nature of image noise also helps in optimizing image processing to detect sub-optical features, which would otherwise remain hidden. We use an image-processing technique, "optimum smoothening", to improve the signal-to-noise ratio (SNR) of features of interest without smearing their structural details. A high SNR renders the desired positional accuracy, with which it is possible to resolve features of interest with widths below the optical resolution. Using optimum smoothening, the smallest and largest core diameters detected are of width [Formula: see text] and [Formula: see text] nm, respectively, as discussed in this paper. The image-processing and analysis techniques and the noise modeling discussed in this paper can be used for the detailed morphological analysis of features down to sub-optical length scales obtained by any kind of fluorescence intensity imaging in raster mode.

  20. Integrating Pavement Crack Detection and Analysis Using Autonomous Unmanned Aerial Vehicle Imagery

    Science.gov (United States)

    2015-03-27

    INTEGRATING PAVEMENT CRACK DETECTION AND ANALYSIS USING AUTONOMOUS UNMANNED AERIAL VEHICLE IMAGERY. Report no. AFIT-ENV-MS-15-M-195. Approved for public release; distribution unlimited.