WorldWideScience

Sample records for analysis detection processing

  1. Techniques of EMG signal analysis: detection, processing, classification and applications

    Science.gov (United States)

    Hussain, M.S.; Mohd-Yasin, F.

    2006-01-01

    Electromyography (EMG) signals can be used for clinical/biomedical applications, Evolvable Hardware Chip (EHW) development, and modern human-computer interaction. EMG signals acquired from muscles require advanced methods for detection, decomposition, processing, and classification. The purpose of this paper is to illustrate the various methodologies and algorithms for EMG signal analysis, to provide efficient and effective ways of understanding the signal and its nature. We further point out some of the hardware implementations using EMG, focusing on applications related to prosthetic hand control, grasp recognition, and human-computer interaction. A comparative study is also given to show the performance of various EMG signal analysis methods. This paper provides researchers with a good understanding of EMG signals and their analysis procedures. This knowledge will help them develop more powerful, flexible, and efficient applications. PMID:16799694
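
    The survey above covers detection and processing steps in general terms. As a hedged illustration only (not any of the surveyed algorithms), the sketch below shows one of the most common first steps in EMG analysis: full-wave rectification plus moving-average smoothing to form an amplitude envelope, with a simple baseline-derived threshold for activity detection. The sampling rate, window length, and threshold factor are illustrative assumptions.

```python
import numpy as np

def emg_envelope(emg, fs=1000, win_ms=50):
    """Full-wave rectify the EMG signal and smooth it with a moving
    average to obtain an amplitude envelope (a common detection step)."""
    rectified = np.abs(emg - np.mean(emg))   # remove DC offset, rectify
    win = int(fs * win_ms / 1000)            # window length in samples
    kernel = np.ones(win) / win
    return np.convolve(rectified, kernel, mode="same")

def detect_activity(envelope, k=3.0, baseline_frac=0.1):
    """Flag samples where the envelope exceeds the mean plus k standard
    deviations of an assumed-quiet baseline segment at the start."""
    n = int(len(envelope) * baseline_frac)
    thresh = envelope[:n].mean() + k * envelope[:n].std()
    return envelope > thresh

# Example: synthetic signal with a burst of simulated muscle activity
rng = np.random.default_rng(0)
sig = rng.normal(0, 0.05, 2000)
sig[800:1200] += rng.normal(0, 0.5, 400)     # simulated contraction
env = emg_envelope(sig)
print("active samples:", detect_activity(env).sum())
```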

  2. Dynamics analysis of vibration process in Particle Impact Noise Detection

    Institute of Scientific and Technical Information of China (English)

    ZHANG Hui; ZHOU Chang-lei; WANG Shu-juan; ZHAI Guo-fu

    2007-01-01

    The Particle Impact Noise Detection (PIND) test is a reliability screening technique for hermetic devices prescribed by MIL-PRF-39016E. The standard specifies certain test conditions, but not how to obtain them. This paper establishes a dynamics model of the vibration process based on a first-order mass-spring system. The corresponding Simulink model is also established to simulate the vibration process under optional input excitations. The response equations are derived for sinusoidal excitations, and the electromagnetic force waves required to obtain given vibration and shock accelerations are computed. Finally, some simulation results are given.
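
    The abstract describes deriving the response of a mass-spring model to sinusoidal excitation. As a minimal sketch under assumed parameters (none are taken from MIL-PRF-39016E or the paper), the following simulates a forced mass-spring-damper and reports the peak acceleration seen by the device under test.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (assumptions): mass, damping, stiffness
m, c, k = 0.01, 0.5, 4000.0          # kg, N*s/m, N/m
F0, freq = 2.0, 60.0                 # sinusoidal force amplitude (N) and Hz

def rhs(t, y):
    """State y = [displacement, velocity] of the vibrating fixture."""
    x, v = y
    force = F0 * np.sin(2 * np.pi * freq * t)
    return [v, (force - c * v - k * x) / m]

t = np.linspace(0, 0.5, 5000)
sol = solve_ivp(rhs, (t[0], t[-1]), [0.0, 0.0], t_eval=t)
accel = np.gradient(sol.y[1], t)     # acceleration experienced by the device
print("peak acceleration (m/s^2):", np.abs(accel).max())
```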

  3. Lightning Detection Efficiency Analysis Process: Modeling Based on Empirical Data

    Science.gov (United States)

    Rompala, John T.

    2005-01-01

    A ground-based lightning detection system employs a grid of sensors, which record and evaluate the electromagnetic signal produced by a lightning strike. Several detectors gather information on that signal's strength, time of arrival, and behavior over time. By coordinating the information from several detectors, an event solution can be generated. That solution includes the signal's point of origin, strength, and polarity. Determination of the location of the lightning strike uses algorithms based on long-established triangulation techniques. Determination of the event's original signal strength relies on the behavior of the generated magnetic field over distance and time. In general, the signal from the event undergoes geometric dispersion and environmental attenuation as it progresses. Our knowledge of that radial behavior, together with the strength of the signal received by detecting sites, permits an extrapolation and evaluation of the original strength of the lightning strike. It also limits the detection efficiency (DE) of the network. For expansive grids with a sparse density of detectors, the DE varies widely over the area served. This limits the utility of the network in gathering information on regional lightning strike density and applying it to meteorological studies. A network of this type is a grid of four detectors in the Rondonian region of Brazil. The service area extends over a million square kilometers, much of it covered by rain forest, so knowledge of lightning strike characteristics over the expanse is of particular value. I have been developing a process that determines the DE over the region [3]. In turn, this provides a way to produce lightning strike density maps, corrected for DE, over the entire region of interest. This report offers a survey of that development to date and a record of present activity.
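
    The abstract reasons that the original strike strength can be extrapolated from the received signal using geometric dispersion and environmental attenuation. The sketch below illustrates that idea with an assumed 1/r-spreading-plus-exponential-attenuation model; the reference range, attenuation length, and trigger threshold are hypothetical, not the report's values.

```python
import numpy as np

R0 = 100.0       # reference range in km (assumption)
ATTEN = 1000.0   # assumed attenuation e-folding length in km

def received_amplitude(source, r_km):
    """Amplitude seen at range r_km for a strike of given amplitude
    (referenced to R0), with 1/r spreading plus attenuation."""
    return source * (R0 / r_km) * np.exp(-(r_km - R0) / ATTEN)

def source_amplitude(received, r_km):
    """Invert the model: extrapolate back to the reference range."""
    return received * (r_km / R0) * np.exp((r_km - R0) / ATTEN)

def detection_efficiency(source_samples, r_km, threshold):
    """Fraction of strikes a sensor at range r_km would trigger on."""
    return np.mean(received_amplitude(source_samples, r_km) > threshold)

rng = np.random.default_rng(1)
strikes = rng.lognormal(mean=2.0, sigma=0.8, size=10000)  # synthetic strengths
print("DE at 300 km:", detection_efficiency(strikes, 300.0, threshold=1.0))
```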

  4. Data-driven fault detection for industrial processes canonical correlation analysis and projection based methods

    CERN Document Server

    Chen, Zhiwen

    2017-01-01

    Zhiwen Chen aims to develop advanced fault detection (FD) methods for the monitoring of industrial processes. With ever-increasing demands on reliability and safety in industrial processes, fault detection has become an important issue. Although model-based fault detection theory has been well studied in past decades, its application to large-scale industrial processes is limited because accurate models are difficult to build. Furthermore, motivated by the limitations of existing data-driven FD methods, novel canonical correlation analysis (CCA) and projection-based methods are proposed from the perspectives of process input and output data, less engineering effort, and wide application scope. For performance evaluation of FD methods, a new index is also developed. Contents: A New Index for Performance Evaluation of FD Methods; CCA-based FD Method for the Monitoring of Stationary Processes; Projection-based FD Method for the Monitoring of Dynamic Processes; Benchmark Study and Real-Time Implementat...
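
    A minimal sketch of a CCA-style residual monitor in the spirit of the blurb (not the book's actual algorithms): fit CCA between process input and output data from fault-free operation, then flag samples whose paired canonical variates disagree. The threshold quantile and synthetic data are illustrative assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

def fit_cca_monitor(U_train, Y_train, n_comp=2, quantile=0.99):
    """Fit CCA between input data U and output data Y from fault-free
    operation and derive a residual threshold from the training runs."""
    cca = CCA(n_components=n_comp).fit(U_train, Y_train)
    Uc, Yc = cca.transform(U_train, Y_train)
    resid = np.sum((Uc - Yc) ** 2, axis=1)   # canonical-variate mismatch
    return cca, np.quantile(resid, quantile)

def is_faulty(cca, limit, u_new, y_new):
    """Flag samples whose input/output relation deviates from training."""
    uc, yc = cca.transform(u_new, y_new)
    return np.sum((uc - yc) ** 2, axis=1) > limit

# Synthetic demo: y depends linearly on u under normal operation
rng = np.random.default_rng(2)
U = rng.normal(size=(500, 4))
Y = U @ rng.normal(size=(4, 3)) + 0.1 * rng.normal(size=(500, 3))
model, limit = fit_cca_monitor(U, Y)
print(is_faulty(model, limit, U[:5], Y[:5] + 2.0))  # shifted outputs flagged
```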

  5. Microreactors with integrated UV/Vis spectroscopic detection for online process analysis under segmented flow.

    Science.gov (United States)

    Yue, Jun; Falke, Floris H; Schouten, Jaap C; Nijhuis, T Alexander

    2013-12-21

    Combining reaction and detection in multiphase microfluidic flow is becoming increasingly important for accelerating process development in microreactors. We report the coupling of UV/Vis spectroscopy with microreactors for online process analysis under segmented flow conditions. Two integration schemes are presented: one uses a cross-type flow-through cell subsequent to a capillary microreactor for detection in the transmission mode; the other uses embedded waveguides on a microfluidic chip for detection in the evanescent wave field. Model experiments reveal the capabilities of the integrated systems in real-time concentration measurements and segmented flow characterization. The application of such integration for process analysis during gold nanoparticle synthesis is demonstrated, showing its great potential in process monitoring in microreactors operated under segmented flow.

  6. Similarity ratio analysis for early stage fault detection with optical emission spectrometer in plasma etching process.

    Directory of Open Access Journals (Sweden)

    Jie Yang

    A Similarity Ratio Analysis (SRA) method is proposed for early-stage Fault Detection (FD) in plasma etching processes using real-time Optical Emission Spectrometer (OES) data as input. The SRA method can help to realise a highly precise control system by detecting abnormal etch-rate faults in real-time during an etching process. The method processes spectrum scans at successive time points and uses a windowing mechanism over the time series to alleviate problems with timing uncertainties due to process shift from one process run to another. An SRA library is first built to capture features of a healthy etching process. By comparing with the SRA library, a Similarity Ratio (SR) statistic is then calculated for each spectrum scan as the monitored process progresses. A fault detection mechanism, named 3-Warning-1-Alarm (3W1A), takes the SR values as inputs and triggers a system alarm when certain conditions are satisfied. This design reduces the chance of false alarms and provides a reliable fault reporting service. The SRA method is demonstrated on a real semiconductor manufacturing dataset. The effectiveness of SRA-based fault detection is evaluated using a time-series SR test and also using a post-process SR test. The time-series SR provides an early-stage fault detection service, so less energy and materials will be wasted by faulty processing. The post-process SR provides a fault detection service with higher reliability than the time-series SR, but with fault testing conducted only after each process run completes.

  7. Similarity ratio analysis for early stage fault detection with optical emission spectrometer in plasma etching process.

    Science.gov (United States)

    Yang, Jie; McArdle, Conor; Daniels, Stephen

    2014-01-01

    A Similarity Ratio Analysis (SRA) method is proposed for early-stage Fault Detection (FD) in plasma etching processes using real-time Optical Emission Spectrometer (OES) data as input. The SRA method can help to realise a highly precise control system by detecting abnormal etch-rate faults in real-time during an etching process. The method processes spectrum scans at successive time points and uses a windowing mechanism over the time series to alleviate problems with timing uncertainties due to process shift from one process run to another. An SRA library is first built to capture features of a healthy etching process. By comparing with the SRA library, a Similarity Ratio (SR) statistic is then calculated for each spectrum scan as the monitored process progresses. A fault detection mechanism, named 3-Warning-1-Alarm (3W1A), takes the SR values as inputs and triggers a system alarm when certain conditions are satisfied. This design reduces the chance of false alarms and provides a reliable fault reporting service. The SRA method is demonstrated on a real semiconductor manufacturing dataset. The effectiveness of SRA-based fault detection is evaluated using a time-series SR test and also using a post-process SR test. The time-series SR provides an early-stage fault detection service, so less energy and materials will be wasted by faulty processing. The post-process SR provides a fault detection service with higher reliability than the time-series SR, but with fault testing conducted only after each process run completes.
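
    As a hedged reading of the two records above (the papers' exact SR statistic and 3W1A rules are not reproduced here), the sketch below builds a library from healthy runs, scores each new scan with a cosine-similarity ratio against a time window of library references, and raises an alarm after three consecutive warnings. All function names and thresholds are hypothetical.

```python
import numpy as np

def build_library(healthy_runs):
    """Mean OES spectrum at each time point over healthy runs
    (healthy_runs: array of shape [n_runs, n_scans, n_wavelengths])."""
    return healthy_runs.mean(axis=0)

def similarity_ratio(scan, ref):
    """One plausible SR statistic: cosine similarity of a spectrum scan
    to its library reference (1.0 = identical shape)."""
    return float(scan @ ref / (np.linalg.norm(scan) * np.linalg.norm(ref)))

def monitor_run(run, library, window=2, sr_limit=0.995, warnings_to_alarm=3):
    """Windowed comparison tolerates timing shift between runs; a simple
    three-consecutive-warnings counter stands in for the paper's 3W1A."""
    count = 0
    for t, scan in enumerate(run):
        lo, hi = max(0, t - window), min(len(library), t + window + 1)
        best = max(similarity_ratio(scan, library[k]) for k in range(lo, hi))
        count = count + 1 if best < sr_limit else 0
        if count >= warnings_to_alarm:
            return t  # alarm raised at this scan index
    return None
```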

  8. Analysis of active islanding detection methods for grid-connected microinverters for renewable energy processing

    Energy Technology Data Exchange (ETDEWEB)

    Trujillo, C.L. [Grupo de Sistemas Electronicos Industriales del Departamento de Ingenieria Electronica, Universidad Politecnica de Valencia, Camino de Vera S/N, C.P. 46022, Valencia (Spain); Departamento de Ingenieria Electronica, Universidad Distrital Francisco Jose de Caldas, Carrera 7 N 40-53 Piso 5, Bogota (Colombia); Velasco, D.; Figueres, E.; Garcera, G. [Grupo de Sistemas Electronicos Industriales del Departamento de Ingenieria Electronica, Universidad Politecnica de Valencia, Camino de Vera S/N, C.P. 46022, Valencia (Spain)

    2010-11-15

    This paper presents the analysis and comparison of the main active techniques for islanding detection used in grid-connected microinverters for power processing of renewable energy sources. These techniques can be classified into two classes: techniques introducing positive feedback in the control of the inverter and techniques based on harmonics injection. Accurate PSIM™ simulations have been carried out in order to perform a comparative analysis of the techniques under study and to establish their advantages and disadvantages according to IEEE standards. (author)

  9. Technology Gap Analysis for the Detection of Process Signatures Using Less Than Remote Methods

    Energy Technology Data Exchange (ETDEWEB)

    Hartman, John S.; Atkinson, David A.; Lind, Michael A.; Maughan, A. D.; Kelly, James F.

    2005-01-01

    Although remote sensing methods offer advantages for monitoring important illicit process activities, remote and stand-off technologies cannot successfully detect all important processes with the sensitivity and certainty that is desired. The main scope of the program is observables, with a primary focus on chemical signatures. A number of key process signatures elude remote or stand-off detection for a variety of reasons (e.g., heavy particulate emissions that do not propagate far enough for detection at stand-off distances, semi-volatile chemicals that do not tend to vaporize and remain in the environment near the source, etc.). Some of these compounds can provide persistent, process-specific information that is not available through remote techniques; however, the associated measurement technologies have their own set of advantages, disadvantages, and technical challenges that may need to be overcome before additional signature data can be effectively and reliably exploited. The main objective of this report is to describe a process to identify high-impact technology gaps for important less-than-remote detection applications. The subsequent analysis focuses on the technology development needed to enable exploitation of important process signatures. The evaluation process that was developed involves three interrelated and often conflicting requirements generation activities:
    • Identification of target signature chemicals with unique intelligence value and their associated attributes as mitigated by environmentally influenced fate and transport effects (i.e., what can you expect to actually find that has intelligence value, where do you need to look for it, and what sensitivity and selectivity do you need to see it)
    • Identification of end-user deployment scenario possibilities and constraints, with a focus on alternative detection requirements, timing issues, logistical considerations, and training requirements for a successful measurement
    • Identification of

  10. Analysis of Space Shuttle Ground Support System Fault Detection, Isolation, and Recovery Processes and Resources

    Science.gov (United States)

    Gross, Anthony R.; Gerald-Yamasaki, Michael; Trent, Robert P.

    2009-01-01

    As part of the Constellation Program's FDIR (Fault Detection, Isolation, and Recovery) project, a task called the Legacy Benchmarking Task was designed to document, as accurately as possible, the FDIR processes and resources that were used by the Space Shuttle ground support equipment (GSE) during the Shuttle flight program. These results served as a comparison with results obtained from the new FDIR capability. The task team assessed Shuttle and EELV (Evolved Expendable Launch Vehicle) historical data for GSE-related launch delays to identify expected benefits and impact. This analysis included a study of complex fault isolation situations that required a lengthy troubleshooting process. Specifically, four elements of that system were considered: LH2 (liquid hydrogen), LO2 (liquid oxygen), hydraulic test, and ground special power.

  11. Surface defect detection in tiling Industries using digital image processing methods: analysis and evaluation.

    Science.gov (United States)

    Karimi, Mohammad H; Asemani, Davud

    2014-05-01

    Ceramic and tile industries should indispensably include a grading stage to quantify the quality of products. Human inspection is often used for grading purposes, but an automatic grading system is essential to enhance the quality control and marketing of the products. Since there generally exist six different types of defects, originating from various stages of tile manufacturing lines, with distinct textures and morphologies, many image processing techniques have been proposed for defect detection. In this paper, a survey has been made of the pattern recognition and image processing algorithms which have been used to detect surface defects. Each method appears to be limited to detecting some subgroup of defects. The detection techniques may be divided into three main groups: statistical pattern recognition, feature vector extraction, and texture/image classification. Methods such as the wavelet transform, filtering, morphology, and the contourlet transform are more effective for pre-processing tasks. Others, including statistical methods, neural networks, and model-based algorithms, can be applied to extract the surface defects. Statistical methods are often appropriate for identification of large defects such as spots, while techniques such as wavelet processing provide an acceptable response for detection of small defects such as pinholes. A thorough survey is made in this paper of the existing algorithms in each subgroup. Also, the evaluation parameters are discussed, including supervised and unsupervised parameters. Using various performance parameters, different defect detection algorithms are compared and evaluated.
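
    As one concrete instance of the statistical methods the survey mentions (an illustration, not a method from the paper), the sketch below removes the regular tile texture with a median-filtered background and thresholds the residual to flag defect pixels; the filter size and threshold factor are assumptions.

```python
import numpy as np
from scipy.ndimage import median_filter

def detect_defects(tile_img, bg_size=25, k=4.0):
    """Subtract a median-filtered background so the regular tile texture
    cancels out, then threshold the residual to flag defect pixels."""
    img = tile_img.astype(float)
    background = median_filter(img, size=bg_size)
    residual = img - background
    return np.abs(residual) > k * residual.std()

# Synthetic demo: flat tile with a small dark spot defect injected
rng = np.random.default_rng(3)
tile = np.full((200, 200), 128.0) + rng.normal(0, 2, (200, 200))
tile[90:95, 90:95] -= 60            # injected defect
print("defect pixels found:", detect_defects(tile).sum())
```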

  12. Automated detection method for architectural distortion areas on mammograms based on morphological processing and surface analysis

    Science.gov (United States)

    Ichikawa, Tetsuko; Matsubara, Tomoko; Hara, Takeshi; Fujita, Hiroshi; Endo, Tokiko; Iwase, Takuji

    2004-05-01

    As well as masses and microcalcifications, architectural distortion is a very important finding for the early detection of breast cancer on mammograms; such distortions can be classified into three typical types: spiculation, retraction, and distortion. The purpose of this work is to develop an automatic method for detecting areas of architectural distortion with spiculation. The suspect areas are detected by concentration indexes of line structures extracted using mean curvature. After that, discriminant analysis of nine features is employed for the classification of true and false positives. The employed features are the size, the mean pixel value, the mean concentration index, the mean isotropic index, the contrast, and four other features based on the power spectrum. As a result of this work, the accuracy of the classification was 76% and the sensitivity was 80% with 0.9 false positives per image in our database in regard to spiculation. It was concluded that our method was effective in detecting areas of architectural distortion; however, some architectural distortions were not detected accurately because of the size, the density, or the different appearance of the distorted areas.

  13. Experimental analysis of the auditory detection process on avian point counts

    Science.gov (United States)

    Simons, T.R.; Alldredge, M.W.; Pollock, K.H.; Wettroth, J.M.

    2007-01-01

    We have developed a system for simulating the conditions of avian surveys in which birds are identified by sound. The system uses a laptop computer to control a set of amplified MP3 players placed at known locations around a survey point. The system can realistically simulate a known population of songbirds under a range of factors that affect detection probabilities. The goals of our research are to describe the sources and range of variability affecting point-count estimates and to find applications of sampling theory and methodologies that produce practical improvements in the quality of bird-census data. Initial experiments in an open field showed that, on average, observers tend to undercount birds on unlimited-radius counts, though the proportion of birds counted by individual observers ranged from 81% to 132% of the actual total. In contrast to the unlimited-radius counts, when data were truncated at a 50-m radius around the point, observers overestimated the total population by 17% to 122%. Results also illustrate how detection distances decline and identification errors increase with increasing levels of ambient noise. Overall, the proportion of birds heard by observers decreased by 28 ± 4.7% under breezy conditions, 41 ± 5.2% with the presence of additional background birds, and 42 ± 3.4% with the addition of 10 dB of white noise. These findings illustrate some of the inherent difficulties in interpreting avian abundance estimates based on auditory detections, and why estimates that do not account for variations in detection probability will not withstand critical scrutiny. © The American Ornithologists' Union, 2007.

  14. The selective processing of emotional visual stimuli while detecting auditory targets: an ERP analysis.

    Science.gov (United States)

    Schupp, Harald T; Stockburger, Jessica; Bublatzky, Florian; Junghöfer, Markus; Weike, Almut I; Hamm, Alfons O

    2008-09-16

    Event-related potential studies revealed an early posterior negativity (EPN) for emotional compared to neutral pictures. Exploring the emotion-attention relationship, a previous study observed that a primary visual discrimination task interfered with the emotional modulation of the EPN component. To specify the locus of interference, the present study assessed the fate of selective visual emotion processing while attention is directed towards the auditory modality. While simply viewing a rapid and continuous stream of pleasant, neutral, and unpleasant pictures in one experimental condition, processing demands of a concurrent auditory target discrimination task were systematically varied in three further experimental conditions. Participants successfully performed the auditory task as revealed by behavioral performance and selected event-related potential components. Replicating previous results, emotional pictures were associated with a larger posterior negativity compared to neutral pictures. Of main interest, increasing demands of the auditory task did not modulate the selective processing of emotional visual stimuli. With regard to the locus of interference, selective emotion processing as indexed by the EPN does not seem to reflect shared processing resources of visual and auditory modality.

  15. Remote sensing change detection and process analysis of long-term land use change and human impacts.

    Science.gov (United States)

    Zhou, Qiming; Li, Baolin; Chen, Yumin

    2011-11-01

    This study investigates environmental change over a 30-year period and attempts to gain a better understanding of human impacts on an arid environment and their consequences for regional development. Multitemporal remotely sensed imagery was acquired and integrated to establish the basis for change detection and process analysis. Land cover changes were investigated in two categories, namely categorical change using image classification and quantitative change using a vegetation index. The results show that human-induced land cover changes have been minor in this remote area. However, the pace of growth of human-induced change has been accelerating since the early 1990s. The analysis of the multi-temporal vegetation index also shows no overall trend of rangeland deterioration, although local change of vegetation cover caused by human activities was noticeable. The results suggest that the current trend of rapid growth may not be sustainable and that the implementation of effective counter-measures for environmentally sound development is a rather urgent matter.

  16. Multiresolution processing for source detection and localization

    Institute of Scientific and Technical Information of China (English)

    DING Feng; GONG Xianyi

    2003-01-01

    The conventional BF/MFP (beamforming/matched field processing) and subspace-based BF/MFP are traditional array signal processing methods for source detection and localization; they all belong to single-resolution processing. In fact, the array signal has a multiresolution structure, which is worth exploiting to enhance the ability of detection and localization, and especially to improve the robustness of BF/MFP. The time-space multiresolution modeling of the multipath transmitted wave and the corresponding multiresolution focused processing are investigated, and analysis of actual sea-trial data shows that the performance of MFP can be improved.

  17. Does the Butcher-on-the-Bus Phenomenon Require a Dual-Process Explanation? A Signal Detection Analysis.

    Science.gov (United States)

    Tunney, Richard J; Mullett, Timothy L; Moross, Claudia J; Gardner, Anna

    2012-01-01

    The butcher-on-the-bus is a rhetorical device or hypothetical phenomenon that is often used to illustrate how recognition decisions can be based on different memory processes (Mandler, 1980). The phenomenon describes a scenario in which a person is recognized but the recognition is accompanied by a sense of familiarity or knowing characterized by an absence of contextual details such as the person's identity. We report two recognition memory experiments that use signal detection analyses to determine whether this phenomenon is evidence for a recollection plus familiarity model of recognition or is better explained by a univariate signal detection model. We conclude that there is an interaction between confidence estimates and remember-know judgments which is not explained fully by either single-process signal detection or traditional dual-process models.
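
    For readers unfamiliar with the univariate signal detection model invoked above, a minimal worked example of the standard equal-variance sensitivity index d' = z(H) - z(F), computed from hit and false-alarm counts (the small correction constant is a common convention, not taken from the paper):

```python
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Equal-variance signal detection sensitivity: d' = z(H) - z(F).
    A small correction keeps the rates away from 0 and 1."""
    h = (hits + 0.5) / (hits + misses + 1.0)
    f = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return norm.ppf(h) - norm.ppf(f)

# Example: 80 hits / 20 misses on old items, 30 FAs / 70 CRs on new items
print(round(d_prime(80, 20, 30, 70), 3))
```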

  18. Fingerprint detection and process prediction by multivariate analysis of fed-batch monoclonal antibody cell culture data.

    Science.gov (United States)

    Sokolov, Michael; Soos, Miroslav; Neunstoecklin, Benjamin; Morbidelli, Massimo; Butté, Alessandro; Leardi, Riccardo; Solacroup, Thomas; Stettler, Matthieu; Broly, Hervé

    2015-01-01

    This work presents a sequential data analysis path, which was successfully applied to identify important patterns (fingerprints) in mammalian cell culture process data regarding process variables, time evolution, and process response. The data set incorporates 116 fed-batch cultivation experiments for the production of an Fc-fusion protein. After precharacterizing the evolution of the investigated variables and manipulated parameters with univariate analysis, principal component analysis (PCA) and partial least squares regression (PLSR) are used for further investigation. The first major objective is to capture and understand the interaction structure and dynamic behavior of the process variables and the titer (process response) using different models. The second major objective is to evaluate those models regarding their capability to characterize and predict the titer production. Moreover, the effects of data unfolding, imputation of missing data, phase separation, and variable transformation on the performance of the models are evaluated.
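
    A minimal sketch of the two modelling steps named in the abstract, on synthetic stand-in data (the real 116-run data set and its preprocessing are not reproduced): PCA to summarize the interaction structure of the process variables, and PLSR to predict the titer response.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for unfolded fed-batch data: rows = runs,
# columns = process variables over time; y = final titer
rng = np.random.default_rng(4)
X = rng.normal(size=(116, 30))
y = X[:, :5].sum(axis=1) + 0.2 * rng.normal(size=116)

Xs = StandardScaler().fit_transform(X)     # autoscale the variables

# PCA: capture the interaction structure of the process variables
pca = PCA(n_components=3).fit(Xs)
print("explained variance:", pca.explained_variance_ratio_.round(2))

# PLSR: latent variables chosen to predict the titer response
pls = PLSRegression(n_components=3).fit(Xs, y)
print("R^2 on training runs:", round(pls.score(Xs, y), 3))
```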

  19. NASA Hazard Analysis Process

    Science.gov (United States)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews the NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

  20. Determination of diethanolamine or N-methyldiethanolamine in high ammonium concentration matrices by capillary electrophoresis with indirect UV detection: application to the analysis of refinery process waters

    Energy Technology Data Exchange (ETDEWEB)

    Bord, N.; Cretier, G.; Rocca, J.-L. [Universite Claude Bernard Lyon 1 (France). Laboratoire des Sciences Analytiques; Bailly, C. [Centre de Recherches de Gonfreville, Total France, Laboratoires Chromatographie Liquide et Microbiologie, Rogerville (France); Souchez, J.-P. [Centre de Recherches de Solaize, Total France, Chemin du Canal, BP 22, St-Symphorien d' Ozon (France)

    2004-09-01

    Alkanolamines such as diethanolamine (DEA) and N-methyldiethanolamine (MDEA) are used in desulfurization processes in crude oil refineries. These compounds may be found in process waters following accidental contamination. The analysis of alkanolamines in refinery process waters is very difficult due to the high ammonium concentration of the samples. This paper describes a method for the determination of DEA in high ammonium concentration refinery process waters by using capillary electrophoresis (CE) with indirect UV detection. The same method can be used for the determination of MDEA. Best results were achieved with a background electrolyte (BGE) comprising 10 mM histidine adjusted to pH 5.0 with acetic acid. The development of this electrolyte and the analytical performances are discussed. The quantification was performed by internal standardization, in which triethanolamine (TEA) was used as the internal standard. A matrix effect due to the high ammonium content was highlighted, and standard addition was therefore used. The developed method was characterized in terms of repeatability of migration times and corrected peak areas, linearity, and accuracy. Limits of detection (LODs) and quantification (LOQs) obtained were 0.2 and 0.7 ppm, respectively. The CE method was applied to the determination of DEA or MDEA in refinery process waters spiked with known amounts of analytes, and it gave excellent results, since the uncertainties obtained were 8 and 5%, respectively. (orig.)

  1. Integrated Process Capability Analysis

    Institute of Scientific and Technical Information of China (English)

    Chen, H. T.; Huang, M. L.; Hung, Y. H.; Chen, K. S.

    2002-01-01

    Process Capability Analysis (PCA) is a powerful tool to assess the ability of a process for manufacturing product that meets specifications. The larger process capability index implies the higher process yield, and the larger process capability index also indicates the lower process expected loss. Chen et al. (2001) applied the indices Cpu, Cpl, and Cpk for evaluating the process capability for a multi-process product with smaller-the-better, larger-the-better, and nominal-the-best spec...
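
    The classical indices named in the abstract reduce to simple formulas: Cpu = (USL - μ)/(3σ), Cpl = (μ - LSL)/(3σ), and Cpk = min(Cpu, Cpl). A small self-contained computation (specification limits and data are illustrative):

```python
import numpy as np

def capability_indices(x, lsl, usl):
    """Classical process capability indices from sample mean and std."""
    mu, sigma = np.mean(x), np.std(x, ddof=1)
    cpu = (usl - mu) / (3 * sigma)     # smaller-the-better side
    cpl = (mu - lsl) / (3 * sigma)     # larger-the-better side
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(cpu, cpl)                # nominal-the-best summary
    return cp, cpu, cpl, cpk

rng = np.random.default_rng(5)
sample = rng.normal(10.02, 0.05, 200)  # measured quality characteristic
print([round(v, 2) for v in capability_indices(sample, 9.85, 10.15)])
```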

  2. Ongoing Active Deformation Processes at Fernandina Volcano (Galapagos) Detected via Multi-Orbit COSMO-SkyMed SAR Data Analysis

    Science.gov (United States)

    Pepe, Susi; Castaldo, Raffaele; De Luca, Claudio; Casu, Francesco; Tizzani, Pietro; Sansosti, Eugenio

    2014-05-01

    Fernandina Volcano, Galápagos (Ecuador), has experienced several uplift and eruption episodes over the last twenty-two years. The ground deformation between 2002 and 2006 was interpreted as the effect of an inflation phenomenon of two separate magma reservoirs beneath the caldera. Moreover, the uplift deformation that occurred during the 2005 eruption was concentrated near the circumferential eruptive fissures, while being superimposed on a broad subsidence centred on the caldera. Geodetic studies emphasized the presence of two subvolcanic lateral intrusions from the central storage system in December 2006 and August 2007. The latest eruption, in 2009, was characterized by lava flows emitted from the SW radial fissures. We analyze the spatial and temporal ground deformation between March 2012 and July 2013, using data acquired by the COSMO-SkyMed X-band constellation along both ascending and descending orbits and applying advanced InSAR techniques. In particular, we use the SBAS InSAR approach and combine ascending and descending time series to produce the vertical and east-west components of the mean deformation velocity and deformation time series. Our analysis revealed a new uplift phenomenon due to stress concentration inside the shallow magmatic system of the volcano. In particular, the vertical mean velocity map shows that the deformation pattern is concentrated inside the caldera region and is characterized by strong radial symmetry, with a maximum displacement of about 20 cm in uplift; an axial symmetry is also observed in the EW horizontal mean velocity map, showing a maximum displacement of about +12 cm towards the east for the SE flank, and -12 cm towards the west for the NW flank of the volcano. Moreover, the deformation time series show a rather linear uplift trend from March to September 2012, interrupted by a low-deformation-rate interval lasting until January 2013. After this stage, the deformation again shows a linear behaviour with an increased uplift rate.

  3. Badge Office Process Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-12

    The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.

  4. The Detection and Analysis Methods of Photochemical Effect in Laser Processing

    Institute of Scientific and Technical Information of China (English)

    Song Youquan; Wei Xin; Xie Xiaozhu; Liu Junhong

    2011-01-01

    Photothermal and photochemical effects exist in the laser processing of materials. The detection and analysis methods for the photochemical effect in laser processing are reviewed. Photochemical results are mainly detected from changes in the material's surface structure, composition, micromorphology, and physical-chemical characteristics after laser processing, while the detection of the photochemical process adopts spectroscopy or time-resolved mass spectrometry to analyze the mass, energy, and angular distributions of the molecular fragments and charged ions produced during laser processing. The photochemical mechanism is generally studied by analyzing the experimental detection results in combination with theoretical knowledge of the material's chemical structure.

  5. Ultrasound perfusion signal processing for tumor detection

    Science.gov (United States)

    Kim, MinWoo; Abbey, Craig K.; Insana, Michael F.

    2016-04-01

    Enhanced blood perfusion in a tissue mass is an indication of neo-vascularity and a sign of a potential malignancy. Ultrasonic pulsed-Doppler imaging is a preferred modality for noninvasive monitoring of blood flow. However, the weak blood echoes and disorganized slow flow make it difficult to detect perfusion using standard methods without the expense and risk of contrast enhancement. Our research measures the efficiency of conventional power-Doppler (PD) methods at discriminating flow states by comparing measurement performance to that of an ideal discriminator. ROC analysis applied to the experimental results shows that power-Doppler methods are just 30-50% efficient at perfusion flows of less than 1 ml/min, suggesting an opportunity to improve perfusion assessment through signal processing. A new perfusion estimator is proposed by extending the statistical discriminator approach. We show that 2-D perfusion color imaging may be enhanced using this approach.

  6. UV image processing to detect diffuse clouds

    Science.gov (United States)

    Armengot, M.; Gómez de Castro, A. I.; López-Santiago, J.; Sánchez-Doreste, N.

    2015-05-01

    The presence of diffuse clouds along the Galaxy is under consideration insofar as they are related to stellar formation, and their physical properties are not well understood. The signal received from most of these structures in UV images is minimal compared to the point sources. The presence of noise in these images makes the analysis hard because the noise is proportionally much higher in these areas. However, digital processing of the images shows that it is possible to enhance and target these clouds. Typically, this kind of treatment is done on purpose for specific research areas, and the astrophysicist's work depends on the computer tools and their possibilities for enhancing a particular area based on prior knowledge. Automating this step is the goal of our work, to make the study of these structures in UV images easier. In particular, we have used the GALEX survey images with the aim of learning to automatically detect such clouds and to enable unsupervised detection and graphic enhancement to log them. Our experiments show the existence of some evidence in the UV images that allows systematic computing and opens the chance to generalize the algorithm to find these structures in regions of the universe where they have not been recorded yet.

  7. Chemical analysis of raw and processed Fructus arctii by high-performance liquid chromatography/diode array detection-electrospray ionization-mass spectrometry

    Directory of Open Access Journals (Sweden)

    Kunming Qin

    2014-01-01

    Background: In traditional Chinese medicine (TCM), raw and processed herbs are used to treat different diseases. Fructus Arctii, the dried fruit of Arctium lappa L. (Compositae), is widely used in TCM. Stir-frying is the most common processing method, which might modify the chemical compositions of Fructus Arctii. Materials and Methods: To test this hypothesis, we focused on the analysis and identification of the main chemical constituents in raw and processed Fructus Arctii (PFA) by high-performance liquid chromatography/diode array detection-electrospray ionization-mass spectrometry. Results: The results indicated that there was less arctiin in stir-fried materials than in raw materials; however, there were higher levels of arctigenin in stir-fried materials than in raw materials. Conclusion: We suggest that arctiin reduced significantly following the thermal conversion of arctiin to arctigenin. In conclusion, this finding may shed some light on understanding the differences in the therapeutic values of raw versus PFA in TCM.

  8. Intelligent Signal Processing for Detection System Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Fu, C Y; Petrich, L I; Daley, P F; Burnham, A K

    2004-12-05

    A wavelet-neural network signal processing method has demonstrated approximately tenfold improvement over traditional signal-processing methods for the detection limit of various nitrogen and phosphorus compounds from the output of a thermionic detector attached to a gas chromatograph. A blind test was conducted to validate the lower detection limit. All fourteen of the compound spikes were detected when above the estimated threshold, including all three within a factor of two above the threshold. In addition, two of six spikes were detected at levels of 1/2 the concentration of the nominal threshold. Another two of the six would have been detected correctly if we had allowed human intervention to examine the processed data. One apparent false positive in five nulls was traced to a solvent impurity, whose presence was subsequently identified by analyzing a solvent aliquot evaporated to 1% residual volume, while the other four nulls were properly classified. We view this signal processing method as broadly applicable in analytical chemistry, and we advocate that advanced signal processing methods should be applied as directly as possible to the raw detector output so that less discriminating preprocessing and post-processing does not throw away valuable signal.

  9. The Data Analysis in Gravitational Wave Detection

    Science.gov (United States)

    Xiao-ge, Wang; Lebigot, Eric; Zhi-hui, Du; Jun-wei, Cao; Yun-yong, Wang; Fan, Zhang; Yong-zhi, Cai; Mu-zi, Li; Zong-hong, Zhu; Jin, Qian; Cong, Yin; Jian-bo, Wang; Wen, Zhao; Yang, Zhang; Blair, David; Li, Ju; Chun-nong, Zhao; Lin-qing, Wen

    2017-01-01

    Gravitational wave (GW) astronomy based on GW detection is a rising interdisciplinary field and a new window for humanity to observe the universe, following traditional astronomy, which uses electromagnetic waves as the detection means. It has quite important significance for studying the origin and evolution of the universe and for extending the field of astronomical research. The appearance of the laser interferometer GW detector has opened a new era of GW detection, and the data processing and analysis of GWs have developed quickly around the world, providing a sharp weapon for GW astronomy. This paper systematically introduces the software tools commonly used for GW data analysis, and discusses in detail the basic methods used in GW data analysis, such as time-frequency analysis, composite analysis, pulsar timing analysis, matched filtering, templates, the χ2 test, and Monte Carlo simulation.
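
    Among the methods listed, matched filtering is the core detection step for signals of known form. A toy sketch (illustrative only; real GW pipelines use calibrated noise spectra and template banks): correlate the data with a template and normalize so that peaks read as a signal-to-noise ratio.

```python
import numpy as np

def matched_filter_snr(data, template, noise_std):
    """Slide a known waveform template over the data and return the
    normalized output; peaks mark candidate signal arrival times."""
    norm = noise_std * np.sqrt(np.sum(template ** 2))
    return np.correlate(data, template, mode="valid") / norm

# Toy chirp-like waveform buried in white noise
rng = np.random.default_rng(6)
t = np.linspace(0, 1, 400)
template = np.sin(2 * np.pi * (5 + 10 * t) * t) * np.exp(-3 * (1 - t))
data = rng.normal(0, 1.0, 4000)
data[2000:2400] += 0.8 * template          # injected signal

snr = matched_filter_snr(data, template, noise_std=1.0)
print("peak SNR:", round(snr.max(), 2), "at sample", int(snr.argmax()))
```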

  10. Intelligent Signal Processing for Detection System Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Fu, C Y; Petrich, L I; Daley, P F; Burnham, A K

    2004-06-18

    A wavelet-neural network signal processing method has demonstrated approximately tenfold improvement in the detection limit of various nitrogen and phosphorus compounds over traditional signal-processing methods in analyzing the output of a thermionic detector attached to the output of a gas chromatograph. A blind test was conducted to validate the lower detection limit. All fourteen of the compound spikes were detected when above the estimated threshold, including all three within a factor of two above the threshold. In addition, two of six were detected at levels of 1/2 the concentration of the nominal threshold. We would have had another two correct hits if we had allowed human intervention to examine the processed data. One apparent false positive in five nulls was traced to a solvent impurity, whose presence was identified by running a solvent aliquot evaporated to 1% residual volume, while the other four nulls were properly classified. We view this signal processing method as broadly applicable in analytical chemistry, and we advocate that advanced signal processing methods be applied as directly as possible to the raw detector output so that less discriminating preprocessing and post-processing does not throw away valuable signal.

  11. Fast Facial Detection by Depth Map Analysis

    Directory of Open Access Journals (Sweden)

    Ming-Yuan Shieh

    2013-01-01

    In order to obtain correct facial recognition results, one needs to adopt appropriate facial detection techniques. Moreover, the effectiveness of facial detection is usually affected by environmental conditions such as background, illumination, and the complexity of objectives. In this paper, the proposed facial detection scheme, which is based on depth map analysis, aims to improve the effectiveness of facial detection and recognition under different environmental illumination conditions. The proposed procedures consist of scene depth determination, outline analysis, Haar-like classification, and related image processing operations. Since infrared light sources can be used to increase dark visibility, the active infrared visual images captured by a structured-light sensory device such as a Kinect are less influenced by environmental light, which benefits the accuracy of facial detection. The proposed system therefore first detects the target human and face and obtains the relative position by structured-light analysis; the face is then determined by image processing operations. The experimental results demonstrate that the proposed scheme not only improves facial detection under varying light conditions but also benefits facial recognition.
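
    The Haar-like classification stage mentioned above is available off the shelf in OpenCV; the sketch below is a generic illustration using the stock frontal-face cascade, not the paper's trained classifier, and it omits the Kinect depth-segmentation step that precedes classification in the proposed system.

```python
import cv2

# Standard frontal-face Haar cascade shipped with OpenCV (an assumption;
# the paper's exact cascade is not specified here)
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_faces(bgr_image):
    """Run Haar-like classification on a grayscale version of the frame;
    in the paper this would run on depth-segmented candidate regions."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    gray = cv2.equalizeHist(gray)   # reduce illumination sensitivity
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

# Usage: faces = detect_faces(cv2.imread("frame.png"))
```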

  12. Chemical sensing in process analysis.

    Science.gov (United States)

    Hirschfeld, T; Callis, J B; Kowalski, B R

    1984-10-19

    Improvements in process control, which determine production efficiency and product quality, are critically dependent upon on-line process analysis. The technology of the required instrumentation will be substantially expanded by advances in sensing devices. In the future, the hardware will consist of sensor arrays and miniaturized instruments fabricated by microlithography and silicon micromachining. Chemometrics will be extensively used in software to provide error detection, self-calibration, and correction, as well as multivariate data analysis for the determination of anticipated and unanticipated species. A number of examples of monolithically fabricated sensors now exist and more will be forthcoming as the new paradigms and new tools are widely adopted. A trend toward not only on-line but even in-product sensors is becoming discernible.

  13. Network Anomaly Detection Based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Ali A. Ghorbani

    2008-11-01

    Signal processing techniques have recently been applied to analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network, where five attack types are successfully detected from over 30 million flows.
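
    A minimal wavelet-based anomaly flagger in the spirit of the abstract (the paper's fifteen features and system-identification model are not reproduced): treat the coarse wavelet approximation of a traffic feature as its normal behavior and flag samples with unusually large residuals. The wavelet choice, decomposition level, and threshold are assumptions.

```python
import numpy as np
import pywt

def wavelet_residual_anomalies(signal, wavelet="db4", level=4, k=4.0):
    """Model normal behavior with the coarse wavelet approximation and
    flag samples whose detail residual is unusually large (MAD-scaled)."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    coeffs_approx = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    baseline = pywt.waverec(coeffs_approx, wavelet)[: len(signal)]
    residual = signal - baseline
    sigma = np.median(np.abs(residual)) / 0.6745   # robust scale estimate
    return np.abs(residual) > k * sigma

# Toy traffic feature: smooth daily pattern plus a short burst (attack)
t = np.arange(1440)
feature = 100 + 30 * np.sin(2 * np.pi * t / 1440)
feature[700:710] += 80
print("anomalous minutes:",
      np.where(wavelet_residual_anomalies(feature))[0][:5])
```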

  14. Lung Cancer Detection Using Image Processing Techniques

    Directory of Open Access Journals (Sweden)

    Mokhled S. AL-TARAWNEH

    2012-08-01

    Recently, image processing techniques have been widely used in several medical areas for image improvement in earlier detection and treatment stages, where the time factor is very important for discovering abnormalities in target images, especially in various cancer tumours such as lung cancer and breast cancer. Image quality and accuracy are the core factors of this research. Image quality assessment, as well as improvement, depends on the enhancement stage, where low-level pre-processing techniques based on a Gabor filter within Gaussian rules are used. Following the segmentation principles, an enhanced region of the object of interest is obtained and used as the basic foundation for feature extraction. Relying on general features, a normality comparison is made. In this research, the main detected features for accurate image comparison are pixel percentage and mask labelling.
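
    The enhancement stage described above rests on Gabor filtering. As a hedged illustration (kernel parameters are assumptions, not the paper's), the sketch below applies a small orientation bank of Gabor kernels and keeps the maximum response per pixel.

```python
import cv2
import numpy as np

def gabor_enhance(gray, ksize=31, sigma=4.0, lambd=10.0, gamma=0.5):
    """Apply a small bank of Gabor filters (Gaussian-modulated sinusoids)
    at several orientations and keep the maximum response per pixel."""
    out = np.zeros_like(gray, dtype=np.float32)
    for theta in np.arange(0, np.pi, np.pi / 4):
        kernel = cv2.getGaborKernel((ksize, ksize), sigma, theta,
                                    lambd, gamma, 0, ktype=cv2.CV_32F)
        resp = cv2.filter2D(gray.astype(np.float32), cv2.CV_32F, kernel)
        out = np.maximum(out, resp)
    return cv2.normalize(out, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

# Usage: enhanced = gabor_enhance(cv2.imread("scan.png", cv2.IMREAD_GRAYSCALE))
```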

  15. Accurate detection of differential RNA processing

    Science.gov (United States)

    Drewe, Philipp; Stegle, Oliver; Hartmann, Lisa; Kahles, André; Bohnert, Regina; Wachter, Andreas; Borgwardt, Karsten; Rätsch, Gunnar

    2013-01-01

    Deep transcriptome sequencing (RNA-Seq) has become a vital tool for studying the state of cells in the context of varying environments, genotypes and other factors. RNA-Seq profiling data enable identification of novel isoforms, quantification of known isoforms and detection of changes in transcriptional or RNA-processing activity. Existing approaches to detect differential isoform abundance between samples either require a complete isoform annotation or fall short in providing statistically robust and calibrated significance estimates. Here, we propose a suite of statistical tests to address these open needs: a parametric test that uses known isoform annotations to detect changes in relative isoform abundance and a non-parametric test that detects differential read coverages and can be applied when isoform annotations are not available. Both methods account for the discrete nature of read counts and the inherent biological variability. We demonstrate that these tests compare favorably to previous methods, both in terms of accuracy and statistical calibrations. We use these techniques to analyze RNA-Seq libraries from Arabidopsis thaliana and Drosophila melanogaster. The identified differential RNA processing events were consistent with RT–qPCR measurements and previous studies. The proposed toolkit is available from http://bioweb.me/rdiff and enables in-depth analyses of transcriptomes, with or without available isoform annotation. PMID:23585274

  16. Selective visual attention in object detection processes

    Science.gov (United States)

    Paletta, Lucas; Goyal, Anurag; Greindl, Christian

    2003-03-01

    Object detection is an enabling technology that plays a key role in many application areas, such as content-based media retrieval. Attentive cognitive vision systems are here proposed where the focus of attention is directed towards the most relevant target. The most promising information is interpreted in a sequential process that dynamically makes use of knowledge and that enables spatial reasoning on the local object information. The presented work proposes an innovative application of attention mechanisms for object detection which is most general in its understanding of information and action selection. The attentive detection system uses a cascade of increasingly complex classifiers for the stepwise identification of regions of interest (ROIs) and recursively refined object hypotheses. While the coarsest classifiers are used to determine first approximations of a region of interest in the input image, more complex classifiers are used on more refined ROIs to give more confident estimates. Objects are modelled by local appearance-based representations and in terms of posterior distributions of the object samples in eigenspace. The discrimination function to discern between objects is modelled by a radial basis function (RBF) network that has been compared with alternative networks and proved consistent and superior to other artificial neural networks for appearance-based object recognition. The experiments were led for the automatic detection of brand objects in Formula One broadcasts within the European Commission's cognitive vision project DETECT.

  17. Maintenance Process Strategic Analysis

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, M.; Stachowiak, A.

    2016-08-01

    The performance and competitiveness of manufacturing companies depends on the availability, reliability, and productivity of their production facilities. Low productivity, downtime, and poor machine performance are often linked to inadequate plant maintenance, which in turn can lead to reduced production levels, increasing costs, lost market opportunities, and lower profits. These pressures have given firms worldwide the motivation to explore and embrace proactive maintenance strategies over the traditional reactive firefighting methods. The traditional view of maintenance has shifted to an overall view that encompasses Overall Equipment Efficiency, Stakeholder Management, and Life Cycle Assessment. From a practical point of view, this requires changes in the approach to maintenance represented by managers and changes in the actions performed within the maintenance area. Managers have to understand that maintenance is not only about repairs and conservation of machines and devices, but also about actions striving for more efficient resource management and care for the safety and health of employees. The purpose of this work is to present a strategic analysis based on SWOT analysis to identify the opportunities and strengths of the maintenance process, to benefit from them as much as possible, as well as to identify weaknesses and threats, so that they can be eliminated or minimized.

  18. Amalgamation of Anomaly-Detection Indices for Enhanced Process Monitoring

    KAUST Repository

    Harrou, Fouzi

    2016-01-29

    Accurate and effective anomaly detection and diagnosis of modern industrial systems are crucial for ensuring reliability and safety and for maintaining desired product quality. Anomaly detection based on principal component analysis (PCA) has been studied intensively and largely applied to multivariate processes with highly cross-correlated process variables; however, conventional PCA-based methods often fail to detect small or moderate anomalies. In this paper, the proposed approach integrates two popular process-monitoring detection tools, the conventional PCA-based monitoring indices Hotelling's T2 and Q and the exponentially weighted moving average (EWMA). We develop two EWMA tools based on the Q and T2 statistics, T2-EWMA and Q-EWMA, to detect anomalies in the process mean. The performances of the proposed methods were compared with those of conventional PCA-based anomaly-detection methods by applying each method to two examples: a synthetic data set and experimental data collected from a flow heating system. The results clearly show the benefits and effectiveness of the proposed methods over conventional PCA-based methods.
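
    A minimal sketch of the Q-EWMA idea described in the abstract (an illustration consistent with the description, not the authors' code): compute the PCA residual Q statistic, smooth it with an EWMA, and compare against a limit derived from fault-free data. The asymptotic EWMA limit form is a standard choice; λ and L are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

def train_q_ewma(X_normal, n_comp=2, lam=0.2, L=3.0):
    """Fit PCA on fault-free data; the monitored statistic is an EWMA of
    the Q (squared prediction error) statistic, which accumulates small
    shifts that a raw Q chart would miss."""
    mu, sd = X_normal.mean(0), X_normal.std(0)
    Z = (X_normal - mu) / sd
    pca = PCA(n_components=n_comp).fit(Z)
    q = np.sum((Z - pca.inverse_transform(pca.transform(Z))) ** 2, axis=1)
    limit = q.mean() + L * q.std() * np.sqrt(lam / (2 - lam))  # EWMA limit
    return (pca, mu, sd, lam, q.mean(), limit)

def q_ewma_chart(model, X_new):
    pca, mu, sd, lam, q0, limit = model
    Z = (X_new - mu) / sd
    q = np.sum((Z - pca.inverse_transform(pca.transform(Z))) ** 2, axis=1)
    ewma, z = [], q0                 # start the filter at the in-control mean
    for qt in q:
        z = lam * qt + (1 - lam) * z
        ewma.append(z)
    return np.array(ewma), limit     # alarm where ewma > limit
```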

  19. Digital Signal Processing Based Real Time Vehicular Detection System

    Institute of Scientific and Technical Information of China (English)

    YANG Zhaoxuan; LIN Tao; LI Xiangping; LIU Chunyi; GAO Jian

    2005-01-01

    Traffic monitoring is of major importance for enforcing traffic management policies. To accomplish this task, the detection of vehicles can be achieved by exploiting image analysis techniques. In this paper, a solution is presented to obtain various traffic parameters through a vehicular video detection system (VVDS). VVDS exploits an algorithm based on virtual loops to detect moving vehicles in real time. This algorithm uses the background differencing method, and vehicles can be detected through the luminance difference of pixels between the background image and the current image. Furthermore, a novel technology named spatio-temporal image sequence analysis is applied to background differencing to improve detection accuracy. Then a hardware implementation of a digital signal processing (DSP) based board is described in detail; the board can simultaneously process four-channel video from different cameras. The benefit of using a DSP is that images of a roadway can be processed at frame rate due to the DSP's high performance. In the end, VVDS is tested on real-world scenes, and experimental results show that the system is both fast and robust for the surveillance of transportation.
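
    The virtual-loop background-differencing step described above can be illustrated in a few lines (a sketch with assumed thresholds, not the VVDS implementation; the DSP board and spatio-temporal analysis are omitted):

```python
import cv2
import numpy as np

def vehicle_present(background, frame, loop_roi, diff_thresh=30, fill_frac=0.2):
    """Virtual-loop detection: difference the current frame against a
    background image and declare a vehicle when enough pixels inside the
    loop region change. loop_roi = (x, y, w, h) drawn on the lane."""
    x, y, w, h = loop_roi
    bg = cv2.cvtColor(background, cv2.COLOR_BGR2GRAY)
    cur = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(cur, bg)[y:y + h, x:x + w]
    _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
    return np.count_nonzero(mask) > fill_frac * w * h

# Usage: occupied = vehicle_present(bg_img, frame_img, loop_roi=(300, 400, 80, 40))
```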

  20. Chemometric processing of second-order liquid chromatographic data with UV-vis and fluorescence detection. A comparison of multivariate curve resolution and parallel factor analysis 2.

    Science.gov (United States)

    Bortolato, Santiago A; Olivieri, Alejandro C

    2014-09-01

    Second-order liquid chromatographic data with multivariate spectral (UV-vis or fluorescence) detection usually show changes in elution time profiles from sample to sample, causing a loss of trilinearity in the data. In order to analyze them with an appropriate model, the latter should permit a given component to have different time profiles in different samples. Two popular models in this regard are multivariate curve resolution-alternating least-squares (MCR-ALS) and parallel factor analysis 2 (PARAFAC2). The conditions to be fulfilled for successful application of the latter model are discussed on the basis of simple chromatographic concepts. An exhaustive analysis of the multivariate calibration models is carried out, employing both simulated and experimental chromatographic data sets. The latter involve the quantitation of benzimidazolic and carbamate pesticides in fruit and juice samples using liquid chromatography with diode array detection, and of polycyclic aromatic hydrocarbons in water samples, in both cases in the presence of potential interferents using liquid chromatography with fluorescence spectral detection, thereby achieving the second-order advantage. The overall results seem to favor MCR-ALS over PARAFAC2, especially in the presence of potential interferents.

  1. Frequency Jump Detection and Analysis

    Science.gov (United States)

    2008-12-01

    CUMULATIVE SUM JUMP DETECTION. The Cumulative Sum (CUSUM) is a classic change-point analysis technique that uses the cumulative sum of the...sum and y is the average of the data. The CUSUM slope indicates the value of the data with respect to the overall average. A flat cumulative sum...sudden change in the CUSUM slope indicates a jump in the data. The CUSUM plot for a data set having a single jump will have a V or inverted-V shape.
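
    The V-shaped CUSUM behavior described above follows directly from summing deviations about the mean; a minimal sketch (synthetic data, illustrative jump size):

```python
import numpy as np

def cusum_jump(y):
    """Cumulative sum of deviations from the overall mean; the index of
    the extreme value of the V-shaped (or inverted-V) curve estimates
    the location of a single frequency jump."""
    s = np.cumsum(y - y.mean())
    return s, int(np.argmax(np.abs(s)))

# Toy frequency record with a jump at sample 600
rng = np.random.default_rng(7)
y = np.concatenate([rng.normal(0.0, 1.0, 600), rng.normal(1.5, 1.0, 400)])
s, jump_at = cusum_jump(y)
print("estimated jump index:", jump_at)
```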

  2. Detection of microparticles in dynamic processes

    Science.gov (United States)

    Ten, K. A.; Pruuel, E. R.; Kashkarov, A. O.; Rubtsov, I. A.; Shechtman, L. I.; Zhulanov, V. V.; Tolochko, B. P.; Rykovanov, G. N.; Muzyrya, A. K.; Smirnov, E. B.; Stolbikov, M. Yu; Prosvirnin, K. M.

    2016-11-01

    When a metal plate is subjected to a strong shock impact, its free surface emits a flow of particles of different sizes (shock-wave “dusting”). Traditionally, the dusting process is investigated by pulsed x-ray methods, piezoelectric sensors, or optical techniques. The particle size ranges from a few microns to hundreds of microns. The flow is assumed to also include finer particles, which cannot yet be detected with the existing methods. On the accelerator complex VEPP-3-VEPP-4 at the BINP there are two experiment stations for research on fast processes, including explosive ones. The stations enable measurement of both passed radiation (absorption) and small-angle x-ray scattering on synchrotron radiation (SR). Radiation is detected with a precision high-speed detector DIMEX. The detector has an internal memory of 32 frames, which enables recording of the dynamics of the process (shooting of movies) with intervals of 250 ns to 2 μs. Flows of nano- and microparticles from free surfaces of various materials (copper and tin) have been examined. Microparticle flows were emitted from grooves of 50-200 μm in size and from joints (gaps) between metal parts. With the soft x-ray spectrum of SR one can explore the dynamics of a single microjet of micron size. The dynamics of the density distribution along microjets were determined. Under a shock wave (∼60 GPa) acting on tin disks, flows of microparticles from a smooth surface were recorded.

  3. Detection of hazelnut in foods using ELISA: challenges related to the detectability in processed foodstuffs.

    Science.gov (United States)

    Cucu, Tatiana; Devreese, Bart; Trashin, Stanislav; Kerkaert, Barbara; Rogge, Maarten; De Meulenaer, Bruno

    2012-01-01

    Hazelnuts are widely used nowadays, and can pose a serious threat to allergic consumers due to cross-contamination that may occur during processing. This might lead to the presence of hidden hazelnut in foods. Therefore, reliable tests are needed to detect hazelnut, especially in processed foods. A hazelnut-specific indirect competitive ELISA based on polyclonal chicken antibodies was developed. The polyclonal antibodies were raised against modified hazelnut proteins in order to improve the detectability of hazelnut proteins in processed foods. The assay showed a detection limit of 1.36 microg hazelnut protein/mL of 5 mM urea in phosphate-buffered saline buffer (pH 7.4). Limited cross-reactivity with walnut and pecan nut was observed; no cross-reactivity was observed with other food ingredients. Blank cookies spiked before analysis showed recoveries of 73-107%. However, cookies spiked before baking showed that the detectability was severely decreased. Addition of lactose to the cookies, which led to more severe modification through the Maillard reaction, led to an increase in the detectability. These results indicate that using antibodies developed toward allergens modified through food processing-simulating reactions is a better approach for detection.

  4. Automatic detection of NIL defects using microscopy and image processing

    KAUST Repository

    Pietroy, David

    2013-12-01

    Nanoimprint Lithography (NIL) is a promising technology for low-cost and large-scale nanostructure fabrication. This technique is based on a contact molding-demolding process that can produce a number of defects such as incomplete filling, negative patterns, and sticking. In this paper, microscopic imaging combined with a specific processing algorithm is used to numerically detect defects in printed patterns. Results obtained for 1D and 2D imprinted gratings with different microscopic image magnifications are presented. The results are independent of the device which captures the image (optical, confocal or electron microscope). The use of numerical images makes it possible to automate the detection and to compute a statistical analysis of defects. This method provides a fast analysis of printed gratings and could be used to monitor the production of such structures. © 2013 Elsevier B.V. All rights reserved.
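
    A hedged sketch of the kind of detection the record describes: comparing a microscope image against a defect-free reference and labeling connected anomalous regions for defect statistics. The thresholds and the reference-image approach are illustrative assumptions, not the paper's algorithm.

        import numpy as np
        from scipy import ndimage

        def detect_defects(image, reference, thresh=0.15, min_size=20):
            # Normalize, flag pixels deviating from the reference pattern,
            # then keep connected regions large enough to count as defects.
            img = image.astype(float) / image.max()
            ref = reference.astype(float) / reference.max()
            mask = np.abs(img - ref) > thresh
            labels, n = ndimage.label(mask)
            sizes = ndimage.sum(mask, labels, range(1, n + 1))
            defect_ids = [i + 1 for i, s in enumerate(sizes) if s >= min_size]
            return labels, defect_ids                  # defect map + ids for statistics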

  5. Process Analysis Via Accuracy Control

    Science.gov (United States)

    1982-02-01

    Process Analysis Via Accuracy Control. The National Shipbuilding Research Program, U.S. Department of Transportation, Maritime Administration, February 1982. Examples are contained in Appendix C, including examples of how accuracy control ("A/C") process analysis leads to design improvement and how a change in sequence can...

  6. Analysis of Cocoa Proanthocyanidins Using Reversed Phase High-Performance Liquid Chromatography and Electrochemical Detection: Application to Studies on the Effect of Alkaline Processing.

    Science.gov (United States)

    Stanley, Todd H; Smithson, Andrew T; Neilson, Andrew P; Anantheswaran, Ramaswamy C; Lambert, Joshua D

    2015-07-01

    Flavan-3-ols and proanthocyanidins play a key role in the health beneficial effects of cocoa. Here, we developed a new reversed-phase high-performance liquid chromatography-electrochemical detection (HPLC-ECD) method for the analysis of flavan-3-ols and proanthocyanidins of degree of polymerization (DP) 2-7. We used this method to examine the effect of alkalization on the polyphenol composition of cocoa powder. Treatment of cocoa powder with NaOH (final pH 8.0) at 92 °C for up to 1 h increased catechin content by 40%, but reduced epicatechin and proanthocyanidins by 23-66%. Proanthocyanidin loss could be modeled using a two-phase exponential decay model (R(2) > 0.7 for epicatechin and proanthocyanidins of odd DP). Alkalization resulted in a significant color change and 20% loss of total polyphenols. The present work demonstrates the first use of HPLC-ECD for the detection of proanthocyanidins up to DP 7 and provides an initial predictive model for the effect of alkali treatment on cocoa polyphenols.

  7. Cancer detection by quantitative fluorescence image analysis.

    Science.gov (United States)

    Parry, W L; Hemstreet, G P

    1988-02-01

    Quantitative fluorescence image analysis is a rapidly evolving biophysical cytochemical technology with the potential for multiple clinical and basic research applications. We report the application of this technique for bladder cancer detection and discuss its potential usefulness as an adjunct to methods used currently by urologists for the diagnosis and management of bladder cancer. Quantitative fluorescence image analysis is a cytological method that incorporates 2 diagnostic techniques, quantitation of nuclear deoxyribonucleic acid and morphometric analysis, in a single semiautomated system to facilitate the identification of rare events, that is, individual cancer cells. When compared to routine cytopathology for detection of bladder cancer in symptomatic patients, quantitative fluorescence image analysis demonstrated greater sensitivity (76 versus 33 per cent) for the detection of low grade transitional cell carcinoma. The specificity of quantitative fluorescence image analysis in a small control group was 94 per cent and with the manual method for quantitation of absolute nuclear fluorescence intensity in the screening of high risk asymptomatic subjects the specificity was 96.7 per cent. The more familiar flow cytometry is another fluorescence technique for measurement of nuclear deoxyribonucleic acid. However, rather than identifying individual cancer cells, flow cytometry identifies cellular pattern distributions, that is, the ratio of normal to abnormal cells. Numerous studies by others have shown that flow cytometry is a sensitive method to monitor patients with diagnosed urological disease. Based upon results in separate quantitative fluorescence image analysis and flow cytometry studies, it appears that these 2 fluorescence techniques may be complementary tools for urological screening, diagnosis and management, and that they also may be useful separately or in combination to elucidate the oncogenic process and determine the biological potential of tumors.

  8. Acoustic analysis assessment in speech pathology detection

    Directory of Open Access Journals (Sweden)

    Panek Daria

    2015-09-01

    Automatic detection of voice pathologies enables non-invasive, low-cost and objective assessment of the presence of disorders, as well as accelerating and improving the process of diagnosis and the clinical treatment given to patients. In this work, a vector made up of 28 acoustic parameters is evaluated using principal component analysis (PCA), kernel principal component analysis (kPCA) and an auto-associative neural network (NLPCA) in four kinds of pathology detection (hyperfunctional dysphonia, functional dysphonia, laryngitis, vocal cord paralysis) using the a, i and u vowels spoken at a high, low and normal pitch. The results indicate that the kPCA and NLPCA methods can be considered a step towards pathology detection of the vocal folds. The results show that such an approach provides acceptable results for this purpose, with the best efficiency levels of around 100%. The study brings together the most commonly used approaches to speech signal processing and leads to a comparison of the machine learning methods determining the health status of the patient.

  9. Dynamic analysis of process reactors

    Energy Technology Data Exchange (ETDEWEB)

    Shadle, L.J.; Lawson, L.O.; Noel, S.D.

    1995-06-01

    The approach and methodology of conducting a dynamic analysis is presented in this poster session in order to describe how this type of analysis can be used to evaluate the operation and control of process reactors. Dynamic analysis of the PyGas™ gasification process is used to illustrate the utility of this approach. PyGas™ is the gasifier being developed for the Gasification Product Improvement Facility (GPIF) by Jacobs-Siffine Engineering and Riley Stoker. In the first step of the analysis, process models are used to calculate the steady-state conditions and associated sensitivities for the process. For the PyGas™ gasifier, the process models are non-linear mechanistic models of the jetting fluidized-bed pyrolyzer and the fixed-bed gasifier. These process sensitivities are key input, in the form of gain parameters or transfer functions, to the dynamic engineering models.

  10. Statistical method for detecting structural change in the growth process.

    Science.gov (United States)

    Ninomiya, Yoshiyuki; Yoshimoto, Atsushi

    2008-03-01

    Due to competition among individual trees and other exogenous factors that change the growth environment, each tree grows following its own growth trend with some structural changes in growth over time. In the present article, a new method is proposed to detect a structural change in the growth process. We formulate the method as a simple statistical test for signal detection without constructing any specific model for the structural change. To evaluate the p-value of the test, the tube method is developed because the regular distribution theory is insufficient. Using two sets of tree diameter growth data sampled from planted forest stands of Cryptomeria japonica in Japan, we conduct an analysis of identifying the effect of thinning on the growth process as a structural change. Our results demonstrate that the proposed method is useful to identify the structural change caused by thinning. We also provide the properties of the method in terms of the size and power of the test.

  11. Entanglement of identical particles and the detection process

    DEFF Research Database (Denmark)

    Tichy, Malte C.; de Melo, Fernando; Kus, Marek;

    2013-01-01

    We introduce detector-level entanglement, a unified entanglement concept for identical particles that takes into account the possible deletion of many-particle which-way information through the detection process. The concept implies a measure for the effective indistinguishability of the particles, which is controlled by the measurement setup and which quantifies the extent to which the (anti-)symmetrization of the wavefunction impacts on physical observables. Initially indistinguishable particles can gain or lose entanglement on their transition to distinguishability, and their quantum statistical behavior depends on their initial entanglement. Our results show that entanglement cannot be attributed to a state of identical particles alone, but that the detection process has to be incorporated in the analysis.

  12. Nonlinear Statistical Process Monitoring and Fault Detection Using Kernel ICA

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xi; YAN Wei-wu; ZHAO Xu; SHAO Hui-he

    2007-01-01

    A novel nonlinear process monitoring and fault detection method based on kernel independent component analysis (ICA) is proposed. The kernel ICA method is a two-phase algorithm: whitened kernel principal component analysis (KPCA) plus ICA. KPCA spheres the data and makes the data structure as linearly separable as possible by virtue of an implicit nonlinear mapping determined by the kernel. ICA then seeks projection directions in the KPCA-whitened space, making the distribution of the projected data as non-Gaussian as possible. The application to the simulated fluid catalytic cracking unit (FCCU) process indicates that the proposed monitoring method based on kernel ICA can effectively capture the nonlinear relationships in process variables. Its performance significantly outperforms that of monitoring methods based on ICA or KPCA alone.
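
    The two-phase structure (whitened KPCA, then ICA) can be sketched with standard library components; the component count and the RBF kernel are assumptions for illustration, not the paper's settings.

        import numpy as np
        from sklearn.decomposition import KernelPCA, FastICA

        def kernel_ica(X, n_comp=5):
            # Phase 1: KPCA spheres the data via an implicit nonlinear mapping.
            kpca = KernelPCA(n_components=n_comp, kernel='rbf')
            scores = kpca.fit_transform(X)
            scores /= scores.std(axis=0)               # whiten the KPCA scores
            # Phase 2: ICA seeks maximally non-Gaussian directions in that space.
            ica = FastICA(n_components=n_comp, random_state=0)
            return ica.fit_transform(scores)           # independent components

    A monitoring statistic such as the sum of squared independent components can then be thresholded against limits estimated from normal operating data.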

  13. Buffer overflow prone points detection based on inter-process analysis

    Institute of Scientific and Technical Information of China (English)

    邹雪; 王兴起; 方景龙; 王大全

    2016-01-01

    For buffer overflow vulnerabilities caused by looped memory copies, a context-sensitive inter-procedural analysis and detection model is proposed. Through a series of static analyses of the binary code, and using the data-interaction relationships provided by inter-procedural analysis, the model mines buffer-overflow-prone points. The detection model is implemented as a plug-in on the open BinNavi platform; it can accurately screen overflow-prone points and effectively reduce false positives and false negatives.

  14. Multimode Kernel Principal Component Analysis Method of Drilling Process Fault Detection

    Institute of Scientific and Technical Information of China (English)

    王杰; 李璐

    2015-01-01

    A kernel principal component analysis (KPCA) method applicable to fault detection in the oil drilling process is put forward. First, the process data are classified using a threshold classification algorithm, yielding data for each steady-state operating condition of the drilling process. Second, a KPCA model is built for each operating condition from the corresponding data, and these KPCA models are combined into a model bank to carry out fault detection. Multiple tests on experimental data show that the method is suitable for fault detection in the drilling process, improving detection sensitivity and reducing errors.
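
    One plausible reading of the model-bank idea, sketched as a per-mode KPCA with an empirical SPE control limit; the mode assignment, the kernel, and the quantile limit are all assumptions for illustration.

        import numpy as np
        from sklearn.decomposition import KernelPCA

        class KPCABank:
            def fit(self, X_by_mode, n_comp=3, q=0.99):
                # One KPCA model and control limit per steady-state mode.
                self.models, self.limits = {}, {}
                for mode, X in X_by_mode.items():
                    m = KernelPCA(n_components=n_comp, kernel='rbf',
                                  fit_inverse_transform=True).fit(X)
                    spe = np.sum((X - m.inverse_transform(m.transform(X)))**2, axis=1)
                    self.models[mode], self.limits[mode] = m, np.quantile(spe, q)
                return self

            def is_fault(self, x, mode):
                # Check a sample against the model of its assigned mode.
                m = self.models[mode]
                spe = np.sum((x - m.inverse_transform(m.transform(x[None, :])))**2)
                return spe > self.limits[mode]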

  15. Semantic multimedia analysis and processing

    CERN Document Server

    Spyrou, Evaggelos; Mylonas, Phivos

    2014-01-01

    Broad in scope, Semantic Multimedia Analysis and Processing provides a complete reference of techniques, algorithms, and solutions for the design and the implementation of contemporary multimedia systems. Offering a balanced, global look at the latest advances in semantic indexing, retrieval, analysis, and processing of multimedia, the book features the contributions of renowned researchers from around the world. Its contents are based on four fundamental thematic pillars: 1) information and content retrieval, 2) semantic knowledge exploitation paradigms, 3) multimedia personalization, and 4)

  16. Signal processing techniques for atrial fibrillation source detection.

    Science.gov (United States)

    Ambadkar, Minal; Leonelli, Fabio M; Sankar, Ravi

    2014-01-01

    In clinical practice, atrial fibrillation (AF) is the most common and critical cardiac arrhythmia encountered. The treatment that can ensure permanent AF removal is catheter ablation, in which cardiologists destroy the affected cardiac muscle cells with RF or laser energy. In this procedure it is necessary to know exactly from which part of the heart the AF triggers originate. Various signal processing algorithms provide a strong tool to track AF sources. This study proposes signal processing techniques that can be exploited for characterization, analysis and source detection of AF signals. These algorithms are implemented on electrocardiogram (ECG) and intracardiac signals, which contain important information that allows the analysis of anatomic and physiologic aspects of the whole cardiac muscle.

  17. Principal Component Analysis in ECG Signal Processing

    Directory of Open Access Journals (Sweden)

    Andreas Bollmann

    2007-01-01

    This paper reviews the current status of principal component analysis (PCA) in the area of ECG signal processing. The fundamentals of PCA are briefly described and the relationship between PCA and the Karhunen-Loève transform is explained. Aspects of PCA related to data with temporal and spatial correlations are considered, as is adaptive estimation of principal components. Several ECG applications are reviewed where PCA techniques have been successfully employed, including data compression, ST-T segment analysis for the detection of myocardial ischemia and abnormalities in ventricular repolarization, extraction of atrial fibrillatory waves for detailed characterization of atrial fibrillation, and analysis of body surface potential maps.

  18. Analysis and comparison of common liquid level detecting equipment in flotation process

    Institute of Scientific and Technical Information of China (English)

    王英; 张克

    2013-01-01

    In order to solve the problem of selecting liquid level gauges in the flotation control process, the basic structure, working principle and field application of common flotation level measurement devices, including buoyancy-type, laser, static-pressure and ultrasonic level gauges, are analyzed and compared from the perspective of actual use. The analysis concludes that each measurement device has its own advantages and disadvantages owing to differences in sensor performance, stability of auxiliary devices and measured medium. In practical applications, the specific conditions of the on-site flotation production process, maintenance conditions, device performance, and installation and operating costs should be considered comprehensively in order to select a suitable flotation level measurement device.

  19. Method Validation for the Quantitative Analysis of Aflatoxins (B1, B2, G1, and G2) and Ochratoxin A in Processed Cereal-Based Foods by HPLC with Fluorescence Detection.

    Science.gov (United States)

    Gazioğlu, Işil; Kolak, Ufuk

    2015-01-01

    Modified AOAC 991.31 and AOAC 2000.03 methods for the simultaneous determination of total aflatoxins (AFs), aflatoxin B1, and ochratoxin A (OTA) in processed cereal-based foods by RP-HPLC coupled with fluorescence detection were validated. A KOBRA® Cell derivatization system was used to analyze total AFs. One of the modifications was the extraction procedure of mycotoxins. Both AFs and OTA were extracted with methanol-water (75+25, v/v) and purified with an immunoaffinity column before HPLC analysis. The modified methods were validated by measuring the specificity, selectivity, linearity, sensitivity, accuracy, repeatability, reproducibility, recovery, LOD, and LOQ parameters. The validated methods were successfully applied for the simultaneous determination of mycotoxins in 81 processed cereal-based foods purchased in Turkey. These rapid, sensitive, simple, and validated methods are suitable for the simultaneous determination of AFs and OTA in the processed cereal-based foods.

  20. Employing image processing techniques for cancer detection using microarray images.

    Science.gov (United States)

    Dehghan Khalilabad, Nastaran; Hassanpour, Hamid

    2017-02-01

    Microarray technology is a powerful genomic tool for simultaneously studying and analyzing the behavior of thousands of genes. The analysis of images obtained from this technology plays a critical role in the detection and treatment of diseases. The aim of the current study is to develop an automated system for analyzing data from microarray images in order to detect cancerous cases. The proposed system consists of three main phases, namely image processing, data mining, and detection of the disease. The image processing phase performs operations such as refining image rotation, gridding (locating genes), and extracting raw data from images; the data mining phase includes normalizing the extracted data and selecting the more effective genes. Finally, using the extracted data, cancerous cells are recognized. To evaluate the performance of the proposed system, a microarray database is employed which includes breast cancer, myeloid leukemia and lymphoma data from the Stanford Microarray Database. The results indicate that the proposed system is able to identify the type of cancer from the data set with an accuracy of 95.45%, 94.11%, and 100%, respectively.

  1. Processing Ocean Images to Detect Large Drift Nets

    Science.gov (United States)

    Veenstra, Tim

    2009-01-01

    A computer program processes the digitized outputs of a set of downward-looking video cameras aboard an aircraft flying over the ocean. The purpose served by this software is to facilitate the detection of large drift nets that have been lost, abandoned, or jettisoned. The development of this software and of the associated imaging hardware is part of a larger effort to develop means of detecting and removing large drift nets before they cause further environmental damage to the ocean and to shores on which they sometimes impinge. The software is capable of near-real-time processing of as many as three video feeds at a rate of 30 frames per second. After a user sets the parameters of an adjustable algorithm, the software analyzes each video stream, detects any anomaly, issues a command to point a high-resolution camera toward the location of the anomaly, and, once the camera has been so aimed, issues a command to trigger the camera shutter. The resulting high-resolution image is digitized, and the resulting data are automatically uploaded to the operator's computer for analysis.

  2. Fourier analysis and stochastic processes

    CERN Document Server

    Brémaud, Pierre

    2014-01-01

    This work is unique as it provides a uniform treatment of the Fourier theories of functions (Fourier transforms and series, z-transforms), finite measures (characteristic functions, convergence in distribution), and stochastic processes (including arma series and point processes). It emphasises the links between these three themes. The chapter on the Fourier theory of point processes and signals structured by point processes is a novel addition to the literature on Fourier analysis of stochastic processes. It also connects the theory with recent lines of research such as biological spike signals and ultrawide-band communications. Although the treatment is mathematically rigorous, the convivial style makes the book accessible to a large audience. In particular, it will be interesting to anyone working in electrical engineering and communications, biology (point process signals) and econometrics (arma models). A careful review of the prerequisites (integration and probability theory in the appendix, Hilbert spa...

  3. Less is More: Data Processing with SVM for Intrusion Detection

    Institute of Scientific and Technical Information of China (English)

    XIAO Hai-jun; HONG Fan; WANG Ling

    2009-01-01

    To improve the detection rate and lower the false positive rate in intrusion detection systems, dimensionality reduction is widely used. For this purpose, a data processing (DP) approach with support vector machine (SVM) was built. Different from traditional approaches, which identify the redundant data before purging the audit data by expert knowledge or utilize different subsets of the available 41 connection attributes to build a classifier, the proposed strategy first removes the attributes whose correlation with another attribute exceeds a threshold, and then classifies two sequential samples as one class, removing either of the two samples whose similarity exceeds a threshold. Performance experiments showed that the strategy of DP and SVM is superior to other existing data reduction strategies (e.g., audit reduction, rule extraction, and feature selection), and that the detection model based on DP and SVM outperforms those based on data mining, soft computing, and hierarchical principal component analysis neural networks.
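
    The correlation-based attribute pruning described in this record is straightforward to sketch; the threshold value and the subsequent SVM training step are illustrative assumptions.

        import numpy as np
        from sklearn.svm import SVC

        def drop_correlated(X, threshold=0.95):
            # Keep an attribute only if its correlation with every
            # already-kept attribute stays at or below the threshold.
            corr = np.abs(np.corrcoef(X, rowvar=False))
            keep = []
            for j in range(X.shape[1]):
                if all(corr[j, k] <= threshold for k in keep):
                    keep.append(j)
            return X[:, keep], keep

        # X, y: connection records and labels (e.g., the 41 KDD attributes)
        # X_red, kept = drop_correlated(X)
        # clf = SVC(kernel='rbf').fit(X_red, y)   # detector on the reduced data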

  4. Single and few photon avalanche photodiode detection process study

    Science.gov (United States)

    Blazej, Josef; Prochazka, Ivan

    2009-07-01

    We are presenting the results of a study of the Single Photon Avalanche Diode (SPAD) pulse response risetime and its dependence on several key parameters. We were investigating the unique properties of the K14 type SPAD, with its high delay uniformity over a 200 μm active area, and the correlation between the avalanche buildup time and the photon number involved in the avalanche trigger. The detection chip was operated in a passive quenching circuit with active gating. This setup enabled us to monitor the diode reverse current using an electrometer, a fast digitizing oscilloscope, and a custom-designed comparator circuit. The electrometer reading enabled estimation of the photon number per detection event, independently of the avalanche process. The avalanche buildup was recorded on the oscilloscope and processed by a custom-designed waveform analysis package. The correlation of avalanche buildup to the photon number, bias above break, photon absorption location, optical pulse length and photon energy was investigated in detail. The experimental results are presented. Existing solid state photon counting detectors have been dedicated to picosecond resolution and timing stability of single photon events. However, the high timing stability is maintained for individual single photon detections only. If more than one photon is absorbed within the detector time resolution, the detection delay will be significantly affected. This fact restricts the application of solid state photon counters to cases where single photons can be guaranteed. For laser ranging purposes it is highly desirable to have a detector which detects both single photon and multi photon signals with picosecond stability. The SPAD based photon counter works in a purely digital mode: a uniform output signal is generated once the photon is detected. If the input signal consists of several photons, the first absorbed one triggers the avalanche. Obviously, for multiple photon signals, the

  5. Crack Length Detection by Digital Image Processing

    DEFF Research Database (Denmark)

    Lyngbye, Janus; Brincker, Rune

    1990-01-01

    It is described how digital image processing is used for measuring the length of fatigue cracks. The system is installed in a Personal Computer equipped with image processing hardware and performs automated measuring on plane metal specimens used in fatigue testing. Normally one cannot achieve a resolution better than that of the image processing equipment. To overcome this problem an extrapolation technique is used, resulting in a better resolution. The system was tested on a specimen loaded with different loads. The error σa was less than 0.031 mm, which is of the same size as human measuring...

  6. Crack Detection by Digital Image Processing

    DEFF Research Database (Denmark)

    Lyngbye, Janus; Brincker, Rune

    It is described how digital image processing is used for measuring the length of fatigue cracks. The system is installed in a Personal Computer equipped with image processing hardware and performs automated measuring on plane metal specimens used in fatigue testing. Normally one cannot achieve a resolution better than that of the image processing equipment. To overcome this problem an extrapolation technique is used, resulting in a better resolution. The system was tested on a specimen loaded with different loads. The error σa was less than 0.031 mm, which is of the same size as human measuring...

  7. SNIa detection in the SNLS photometric analysis using Morphological Component Analysis

    CERN Document Server

    Möller, A; Lanusse, F; Neveu, J; Palanque-Delabrouille, N

    2015-01-01

    Detection of supernovae (SNe) and, more generally, of transient events in large surveys can provide numerous false detections. In the case of a deferred processing of survey images, this implies reconstructing complete light curves for all detections, requiring sizable processing time and resources. Optimizing the detection of transient events is thus an important issue for both present and future surveys. We present here the optimization done in the SuperNova Legacy Survey (SNLS) for the 5-year deferred photometric analysis. In this analysis, detections are derived from stacks of subtracted images, with one stack per lunation. The 3-year analysis provided 300,000 detections dominated by signals of bright objects that were not perfectly subtracted. We developed a subtracted image stack treatment to reduce the number of non SN-like events using morphological component analysis. This technique exploits the morphological diversity of the objects to be detected to extract the signal of interest. At the level of o...

  8. Leak detection in pipelines using cepstrum analysis

    Science.gov (United States)

    Taghvaei, M.; Beck, S. B. M.; Staszewski, W. J.

    2006-02-01

    The detection and location of leaks in pipeline networks is a major problem and the reduction of these leaks has become a major priority for pipeline authorities around the world. Although the reasons for these leaks are well known, some of the current methods for locating and identifying them are either complicated or imprecise; most of them are time-consuming. The work described here shows that cepstrum analysis is a viable approach to leak detection and location in pipeline networks. The method uses pressure waves caused by quickly opening and closing a solenoid valve. Due to their simplicity and robustness, transient analyses provide a plausible route towards leak detection. For this work, the time domain signals of these pressure transients were obtained using a single pressure transducer. These pressure signals were first filtered using discrete wavelets to remove the dc offset, and the low and high frequencies. They were then analysed using a cepstrum method which identified the time delay between the initial wave and its reflections. There were some features in the processed results which can be ascribed to features in the pipeline network such as junctions and pipe ends. When holes were drilled in the pipe, new peaks occurred which identified the presence of a leak in the pipeline network. When tested with holes of different sizes, the amplitude of the processed peak was seen to increase as the cube root of the leak diameter. Using this method, it is possible to identify leaks that are difficult to find by other methods as they are small in comparison with the flow through the pipe.
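
    The cepstral step of the record above is compact to express: the real cepstrum turns the echoes of the injected pressure transient into peaks at their delay times (quefrencies). The guard interval and peak count below are assumptions for illustration.

        import numpy as np

        def real_cepstrum(x):
            # Inverse FFT of the log magnitude spectrum; reflections
            # appear as peaks at their round-trip delay times.
            return np.fft.irfft(np.log(np.abs(np.fft.rfft(x)) + 1e-12))

        def reflection_delays(x, fs, n_peaks=5):
            c = real_cepstrum(x)
            start = int(0.001 * fs)              # skip the pulse-shape region
            half = c[start:len(c) // 2]
            idx = np.argsort(half)[-n_peaks:]
            return np.sort((idx + start) / fs)   # delays in seconds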

  9. Photoacoustic spectroscopy for process analysis.

    Science.gov (United States)

    Schmid, Thomas

    2006-03-01

    Photoacoustic spectroscopy (PAS) is based on the absorption of electromagnetic radiation by analyte molecules. The absorbed energy is measured by detecting pressure fluctuations in the form of sound waves or shock pulses. In contrast to conventional absorption spectroscopy (such as UV/Vis spectroscopy), PAS allows the determination of absorption coefficients over several orders of magnitude, even in opaque and strongly scattering samples. Small absorption coefficients, such as those encountered during trace gas monitoring, can be detected with cells with relatively short pathlengths. Furthermore, PA techniques allow absorption spectra of solid samples (including powders, chips or large objects) to be determined, and they permit depth profiling of layered systems. These features mean that PAS can be used for on-line monitoring in technical processes without the need for sample preparation and to perform depth-resolved characterization of industrial products. This article gives an overview on PA excitation and detection schemes employed in analytical chemistry, and reviews applications of PAS in process analytical technology and characterization of industrial products.

  10. Certification-Based Process Analysis

    Science.gov (United States)

    Knight, Russell L.

    2013-01-01

    Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Process means any of the standard representations of a production flow; in this case, any individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one ensures that a product works as intended, and contains no flaws.

  11. Advanced Statistical Signal Processing Techniques for Landmine Detection Using GPR

    Science.gov (United States)

    2014-07-12

    Report documentation for "Advanced Statistical Signal Processing Techniques for Landmine Detection Using GPR". The views, opinions and/or findings contained in this report are those of the author(s) and should not... Sponsoring agency: U.S. Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211. Performing organization address: 310 Jesse Hall, Columbia, MO 65211-1230. Keywords: landmine detection, signal processing, GPR.

  12. Signal processing in cryogenic particle detection

    Energy Technology Data Exchange (ETDEWEB)

    Yuryev, Y.N. [Department of Physics and Astronomy, Seoul National University, Seoul (Korea, Republic of); Korea Research Institute of Standards and Science (KRISS), Daejeon (Korea, Republic of); Jang, Y.S. [Korea Research Institute of Standards and Science (KRISS), Daejeon (Korea, Republic of); Kim, S.K. [Department of Physics and Astronomy, Seoul National University, Seoul (Korea, Republic of); Lee, K.B.; Lee, M.K. [Korea Research Institute of Standards and Science (KRISS), Daejeon (Korea, Republic of); Lee, S.J. [Department of Physics and Astronomy, Seoul National University, Seoul (Korea, Republic of); Korea Research Institute of Standards and Science (KRISS), Daejeon (Korea, Republic of); Yoon, W.S. [Korea Research Institute of Standards and Science (KRISS), Daejeon (Korea, Republic of); Kim, Y.H., E-mail: yhkim@kriss.re.k [Korea Research Institute of Standards and Science (KRISS), Daejeon (Korea, Republic of)

    2011-04-11

    We describe a signal-processing program for a data acquisition system for cryogenic particle detectors. The program is based on an optimal-filtering method for high-resolution measurement of calorimetric signals with a significant amount of noise of unknown origin and non-stationary behavior. The program was applied to improve the energy resolution of the alpha particle spectrum of a ²⁴¹Am source.

  13. Detecting causality in policy diffusion processes

    Science.gov (United States)

    Grabow, Carsten; Macinko, James; Silver, Diana; Porfiri, Maurizio

    2016-08-01

    A universal question in network science entails learning about the topology of interaction from collective dynamics. Here, we address this question by examining diffusion of laws across US states. We propose two complementary techniques to unravel determinants of this diffusion process: information-theoretic union transfer entropy and event synchronization. In order to systematically investigate their performance on law activity data, we establish a new stochastic model to generate synthetic law activity data based on plausible networks of interactions. Through extensive parametric studies, we demonstrate the ability of these methods to reconstruct networks, varying in size, link density, and degree heterogeneity. Our results suggest that union transfer entropy should be preferred for slowly varying processes, which may be associated with policies attending to specific local problems that occur only rarely or with policies facing high levels of opposition. In contrast, event synchronization is effective for faster enactment rates, which may be related to policies involving Federal mandates or incentives. This study puts forward a data-driven toolbox to explain the determinants of legal activity applicable to political science, across dynamical systems, information theory, and complex networks.

  14. GNSS Spoofing Detection Based on Signal Power Measurements: Statistical Analysis

    Directory of Open Access Journals (Sweden)

    V. Dehghanian

    2012-01-01

    A threat to GNSS receivers is posed by a spoofing transmitter that emulates authentic signals but with randomized code phase and Doppler values over a small range. Such spoofing signals can result in large navigational solution errors that are passed onto the unsuspecting user with potentially dire consequences. An effective spoofing detection technique is developed in this paper, based on signal power measurements and that can be readily applied to present consumer grade GNSS receivers with minimal firmware changes. An extensive statistical analysis is carried out based on formulating a multihypothesis detection problem. Expressions are developed to devise a set of thresholds required for signal detection and identification. The detection processing methods developed are further manipulated to exploit incidental antenna motion arising from user interaction with a GNSS handheld receiver to further enhance the detection performance of the proposed algorithm. The statistical analysis supports the effectiveness of the proposed spoofing detection technique under various multipath conditions.

  15. Numerical analysis of Eucalyptus grandis × E. urophylla heat-treatment: A dynamically detecting method of mass loss during the process

    Science.gov (United States)

    Zhao, Zijian; Ma, Qing; Mu, Jun; Yi, Songlin; He, Zhengbin

    Eucalyptus particles, lamellas and boards were used to explore a simply implemented method, neglecting heat and mass transfer, for monitoring the mass loss during the heat-treatment course. The results revealed that the mass loss over a given period is theoretically the definite integral of the loss rate over time in that period, and a monitoring model for the mass loss rate was developed with the particles and validated with the lamellas and boards. In the model, the loss rate was correlated to the temperature and the temperature-evolution speed, and the model was composed of three functions covering different temperature-evolution periods. The sample mass loss was calculated in MATLAB for the lamellas and boards, and the model was validated and adjusted based on the difference between the computed results and the practically measured loss values. The error ranges of the new models were -16.30% to 18.35% for wood lamellas and -9.86% to 6.80% for wood boards. This method makes it possible to acquire the instantaneous loss value by continuously monitoring the wood temperature evolution. This idea could provide a reference for Eucalyptus heat treatment to track the treating course and control the final material characteristics.
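
    The integral relationship the record states (mass loss over a period equals the definite integral of the loss rate over that period) can be evaluated numerically from a recorded temperature history; rate_model below is a placeholder for the paper's fitted three-part function, not its actual form.

        import numpy as np

        def mass_loss(times, temps, rate_model):
            # rate_model(T, dT/dt) -> instantaneous loss rate (assumed form);
            # integrating it over time gives the mass lost in the period.
            dTdt = np.gradient(temps, times)
            return np.trapz(rate_model(temps, dTdt), times)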

  16. Sequential Analysis: Hypothesis Testing and Changepoint Detection

    Science.gov (United States)

    2014-07-11

    health monitoring of bridges [24, 25, 43], wind turbines [178, 216], and aircraft [41, 102, 186, 188], detecting multiple sensor faults in an unmanned... describe a couple of signal processing problems, namely segmentation of signals and seismic signal processing. Mechanical systems integrity monitoring is... and useful in image segmentation and boundary tracking problems [96]. 1.3.4.2 Seismic Data Processing: In many situations of seismic data processing

  17. Analysis of weld seam uniformity through temperature distribution by spatially resolved detector elements in the wavelength range of 0.3μm to 5μm for the detection of structural changing heating and cooling processes

    Science.gov (United States)

    Lempe, B.; Maschke, R.; Rudek, F.; Baselt, T.; Hartmann, P.

    2016-03-01

    Online process control systems often detect temperatures only at a local area of the machining point and determine an integrated value. In order to determine the proper welding quality and the absence of defects, such as temperature-induced stress cracks, it is necessary to perform time- and space-resolved measurements before, during and after the production process. The system under development consists of a beam-splitting unit which divides the electromagnetic radiation of the heated component between two different sensor types. For high temperatures, a sensor is used which is sensitive in the visible spectrum and has a dynamic range of 120 dB [1]. Thus, very high intensity differences can be displayed and a direct analysis of the temperature profile of the weld spots is possible [2]. A second sensor operates in the wavelength range from 1 μm to 5 μm and allows the determination of temperatures from approximately 200°C [3]. At the beginning of a welding process, the heat-up phase of the metal is critical to the resulting weld quality. If a defined temperature range is exceeded too quickly, the risk of cracking is significantly increased [4]. During the welding process, the thermal supervision of the central processing location is decisive for a highly secure weld. In the border areas, as well as following the welding process, cooling processes in particular are crucial for the homogeneity of the results. In order to obtain a sufficiently accurate resolution of the dynamic heating and cooling processes, the system can capture up to 500 frames per second.

  18. Command Process Modeling & Risk Analysis

    Science.gov (United States)

    Meshkat, Leila

    2011-01-01

    Commanding Errors may be caused by a variety of root causes. It's important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within it. These models include simulation analysis and probabilistic risk assessment models.

  19. Preliminary hazards analysis -- vitrification process

    Energy Technology Data Exchange (ETDEWEB)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  20. Radar fall detection using principal component analysis

    Science.gov (United States)

    Jokanovic, Branka; Amin, Moeness; Ahmad, Fauzia; Boashash, Boualem

    2016-05-01

    Falls are a major cause of fatal and nonfatal injuries in people aged 65 years and older. Radar has the potential to become one of the leading technologies for fall detection, thereby enabling the elderly to live independently. Existing techniques for fall detection using radar are based on manual feature extraction and require significant parameter tuning in order to provide successful detections. In this paper, we employ principal component analysis (PCA) for fall detection, wherein eigen images of observed motions are employed for classification. Using real data, we demonstrate that the PCA-based technique provides performance improvement over conventional feature extraction methods.
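
    A minimal sketch of the eigen-image idea: learn a PCA subspace from time-frequency motion images and score new observations by their distance to it (per-class subspaces would give the fall/non-fall decision). The image shape and component count are assumptions, not the paper's settings.

        import numpy as np
        from sklearn.decomposition import PCA

        def fit_eigenimages(tf_images, n_comp=10):
            # tf_images: (n_samples, h, w) time-frequency motion signatures
            X = tf_images.reshape(len(tf_images), -1)
            return PCA(n_components=n_comp).fit(X)

        def subspace_distance(pca, image):
            # Reconstruction error = distance to the learned eigen-image subspace
            x = image.reshape(1, -1)
            xr = pca.inverse_transform(pca.transform(x))
            return float(np.linalg.norm(x - xr))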

  1. Crack Detection with Lamb Wave Wavenumber Analysis

    Science.gov (United States)

    Tian, Zhenhua; Leckey, Cara; Rogge, Matt; Yu, Lingyu

    2013-01-01

    In this work, we present our study of Lamb wave crack detection using wavenumber analysis. The aim is to demonstrate the application of wavenumber analysis to 3D Lamb wave data to enable damage detection. The 3D wavefields (including vx, vy and vz components) in time-space domain contain a wealth of information regarding the propagating waves in a damaged plate. For crack detection, three wavenumber analysis techniques are used: (i) two dimensional Fourier transform (2D-FT) which can transform the time-space wavefield into frequency-wavenumber representation while losing the spatial information; (ii) short space 2D-FT which can obtain the frequency-wavenumber spectra at various spatial locations, resulting in a space-frequency-wavenumber representation; (iii) local wavenumber analysis which can provide the distribution of the effective wavenumbers at different locations. All of these concepts are demonstrated through a numerical simulation example of an aluminum plate with a crack. The 3D elastodynamic finite integration technique (EFIT) was used to obtain the 3D wavefields, of which the vz (out-of-plane) wave component is compared with the experimental measurement obtained from a scanning laser Doppler vibrometer (SLDV) for verification purposes. The experimental and simulated results are found to be in close agreement. The application of wavenumber analysis on 3D EFIT simulation data shows the effectiveness of the analysis for crack detection. Keywords: Lamb wave, crack detection, wavenumber analysis, EFIT modeling

  2. SENTIMENT ANALYSIS FOR ONLINE FORUMS HOTSPOT DETECTION

    Directory of Open Access Journals (Sweden)

    K. Nirmala Devi

    2012-01-01

    User-generated content on the web grows rapidly in this emergent information age. Evolutionary changes in technology make use of such information to capture only the user's essence, so that finally the useful information is exposed to information seekers. Most of the existing research on text information processing focuses on the factual domain rather than the opinion domain. Text mining plays a vital role in online forum opinion mining, but opinion mining from online forums is much more difficult than pure text processing due to their semi-structured characteristics. In this paper we detect online hotspot forums by computing sentiment analysis for the text data available in each forum. This approach analyzes the forum text data and computes a value for each piece of text. The proposed approach combines the K-means clustering and support vector machine (SVM) classification algorithms to group the forums into two clusters, forming hotspot forums and non-hotspot forums, within the current time span. The experiment helps to identify that K-means and SVM together achieve highly consistent results. The prediction result of SVM is also compared with other classifiers such as Naïve Bayes and decision trees, and among them SVM performs the best.
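
    The K-means-plus-SVM pairing described here can be sketched with standard components; the TF-IDF features and the toy forum texts are assumptions, since the record does not state the feature representation.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.cluster import KMeans
        from sklearn.svm import SVC

        texts = [                                      # toy per-forum documents
            "this policy is outrageous everyone is furious",
            "great tips thanks for sharing",
            "terrible decision protest now boycott",
            "lovely weather photos this week",
        ]
        X = TfidfVectorizer().fit_transform(texts)
        km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
        svm = SVC(kernel='linear').fit(X, km.labels_)  # learn the cluster split
        print(svm.predict(X))                          # hotspot / non-hotspot labels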

  3. Sign Language Video Processing for Text Detection in Hindi Language

    Directory of Open Access Journals (Sweden)

    Rashmi B Hiremath

    2016-10-01

    Sign language is a way of expressing yourself with your body, where every bit of one's expressions, goals, or sentiments is conveyed by physical behaviors, for example, facial expressions, body posture, gestures, eye movements, touch and the use of space. Non-verbal communication exists in both animals and humans, but this article concentrates on the interpretation of human non-verbal or sign language into Hindi textual expression. The proposed method of implementation utilizes image processing methods and artificial intelligence strategies to achieve the goal of sign video recognition. To carry out the proposed task, the implementation uses image processing methods such as frame-analysis-based tracking, edge detection, wavelet transform, erosion, dilation, blur elimination and noise elimination on training videos. It also uses elliptical Fourier descriptors (SIFT) for shape feature extraction and principal component analysis for feature set optimization and reduction. For result analysis, this paper uses videos from different categories such as signs for weeks, months, relations, etc. The database of extracted outcomes is compared with the signer's video fed to the system as input by a trained fuzzy inference system.

  4. Measurement Bias Detection through Factor Analysis

    Science.gov (United States)

    Barendse, M. T.; Oort, F. J.; Werner, C. S.; Ligtvoet, R.; Schermelleh-Engel, K.

    2012-01-01

    Measurement bias is defined as a violation of measurement invariance, which can be investigated through multigroup factor analysis (MGFA), by testing across-group differences in intercepts (uniform bias) and factor loadings (nonuniform bias). Restricted factor analysis (RFA) can also be used to detect measurement bias. To also enable nonuniform…

  5. Social network analysis community detection and evolution

    CERN Document Server

    Missaoui, Rokia

    2015-01-01

    This book is devoted to recent progress in social network analysis with a high focus on community detection and evolution. The eleven chapters cover the identification of cohesive groups, core components and key players either in static or dynamic networks of different kinds and levels of heterogeneity. Other important topics in social network analysis such as influential detection and maximization, information propagation, user behavior analysis, as well as network modeling and visualization are also presented. Many studies are validated through real social networks such as Twitter. This edit

  6. Fault Management: Degradation Signature Detection, Modeling, and Processing Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Fault to Failure Progression (FFP) signature modeling and processing is a new method for applying condition-based signal data to detect degradation, to identify...

  7. Detecting change-points in multidimensional stochastic processes

    NARCIS (Netherlands)

    de Gooijer, J.G.

    2006-01-01

    A general test statistic for detecting change-points in multidimensional stochastic processes with unknown parameters is proposed. The test statistic is specialized to the case of detecting changes in sequences of covariance matrices. Large-sample distributional results are presented for the test statistic.

  8. Fault detection filter design for an anaerobic digestion process

    Energy Technology Data Exchange (ETDEWEB)

    Aubrun, C.; Garnier, O. [Univ. Henri Poincare - Nancy 1, Vandoeuvre (France); Harmand, J.; Steyer, J.P. [LBE-INRA, Narbonne (France)

    2000-05-01

    In this paper, a Fault Detection and Isolation observer-based method has been applied to a biological wastewater treatment process. The method is designed with a dynamic model and the observer is determined using the eigenstructure assignment approach. The efficiency of the method is demonstrated for both detection and isolation of an actuator and a sensor failure using experimental data from a pilot-scale anaerobic digestion process for the treatment of industrial wine distillery vinasses. (orig.)

  9. A preamplification approach to GMO detection in processed foods.

    Science.gov (United States)

    Del Gaudio, S; Cirillo, A; Di Bernardo, G; Galderisi, U; Cipollaro, M

    2010-03-01

    DNA is widely used as a target for GMO analysis because of its stability and high detectability. Real-time PCR is the method routinely used in most analytical laboratories due to its quantitative performance and great sensitivity. Accurate DNA detection and quantification is dependent on the specificity and sensitivity of the amplification protocol as well as on the quality and quantity of the DNA used in the PCR reaction. In order to enhance the sensitivity of real-time PCR and consequently expand the number of analyzable target genes, we applied a preamplification technique to processed foods where DNA can be present in low amounts and/or in degraded forms thereby affecting the reliability of qualitative and quantitative results. The preamplification procedure utilizes a pool of primers targeting genes of interest and is followed by real-time PCR reactions specific for each gene. An improvement of Ct values was found comparing preamplified vs. non-preamplified DNA. The strategy reported in the present study will be also applicable to other fields requiring quantitative DNA testing by real-time PCR.

  10. OPAD data analysis. [Optical Plumes Anomaly Detection

    Science.gov (United States)

    Buntine, Wray L.; Kraft, Richard; Whitaker, Kevin; Cooper, Anita E.; Powers, W. T.; Wallace, Tim L.

    1993-01-01

    Data obtained in the framework of an Optical Plume Anomaly Detection (OPAD) program intended to create a rocket engine health monitor based on spectrometric detections of anomalous atomic and molecular species in the exhaust plume are analyzed. The major results include techniques for handling data noise, methods for registration of spectra to wavelength, and a simple automatic process for estimating the metallic component of a spectrum.

  11. A signal processing method for the friction-based endpoint detection system of a CMP process

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chi; Guo Dongming; Jin Zhuji; Kang Renke, E-mail: xuchi_dut@163.com [Key Laboratory for Precision and Non-Traditional Machining Technology of Ministry of Education, Dalian University of Technology, Dalian 116024 (China)

    2010-12-15

    A signal processing method for the friction-based endpoint detection system of a chemical mechanical polishing (CMP) process is presented. The signal processing method uses wavelet threshold denoising to reduce the noise contained in the measured original signal, extracts the Kalman filter innovation from the denoised signal as the feature signal, and judges the CMP endpoint from the behavior of the Kalman filter innovation sequence during the CMP process. Applying the signal processing method, endpoint detection experiments for the Cu CMP process were carried out. The results show that the signal processing method can identify the endpoint of the Cu CMP process. (semiconductor technology)
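
    A minimal, self-contained sketch of the two-stage chain described above, namely wavelet threshold denoising followed by Kalman filter innovation extraction, is given below. The synthetic friction trace, wavelet choice and filter parameters are illustrative assumptions, not the study's actual CMP setup.

    ```python
    import numpy as np
    import pywt  # PyWavelets

    def wavelet_denoise(x, wavelet="db4", level=4):
        """Soft-threshold the detail coefficients with the universal threshold."""
        coeffs = pywt.wavedec(x, wavelet, level=level)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745  # noise scale, finest level
        thr = sigma * np.sqrt(2.0 * np.log(len(x)))
        coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
        return pywt.waverec(coeffs, wavelet)[: len(x)]

    def kalman_innovations(z, q=1e-4, r=1e-2):
        """Innovation sequence of a scalar random-walk Kalman filter."""
        x_hat, p = z[0], 1.0
        innov = np.zeros_like(z)
        for k, zk in enumerate(z):
            p += q                   # predict
            nu = zk - x_hat          # innovation: measurement minus prediction
            kg = p / (p + r)         # Kalman gain
            x_hat += kg * nu         # update
            p *= 1.0 - kg
            innov[k] = nu
        return innov

    # Synthetic friction signal with a step change standing in for the endpoint.
    t = np.linspace(0.0, 10.0, 4000)
    raw = 1.0 - 0.3 * (t > 7.0) + 0.05 * np.random.default_rng(0).normal(size=t.size)
    nu = kalman_innovations(wavelet_denoise(raw))
    # A sustained change in the innovation statistics would flag the endpoint.
    ```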

  12. Infrared spectroscopy for process control and fault detection of advanced semiconductor processes

    Science.gov (United States)

    Rosenthal, P.; Aarts, W.; Bonanno, A.; Boning, D.; Charpenay, S.; Gower, A.; Richter, M.; Smith, T.; Solomon, P.; Spartz, M.; Nelson, C.; Waldhauer, A.; Xu, J.; Yakovlev, V.; Zhang, W.; Allen, L.; Cordts, B.; Brandt, M.; Mundt, R.; Perry, A.

    1998-11-01

    Fourier transform infrared (FTIR) spectroscopy has emerged as an attractive sensor for in-situ monitoring and control of semiconductor fabrication processes. New applications are being enabled by advances in FTIR hardware and software that provide compact size, fast measurements with exceptional stability and signal-to-noise ratio, and intelligent model-based algorithms for thin-film and gas analysis. In previously reported work, FTIR instrumentation with automated spectral analysis software was demonstrated as a novel sensor for monitoring layer properties such as thickness, composition and temperature. Recent work has emphasized applications to practical problems in modern semiconductor manufacturing. In this paper we report pioneering results on: 1) run-to-run closed-loop control of a single-wafer epitaxial silicon process using integrated infrared thickness and doping profiling metrology, 2) fault detection during cluster tool plasma etching using real-time infrared exhaust gas analysis, and 3) oxygen implantation process monitoring during the formation of silicon-on-insulator (SOI) wafers using infrared reflectometry.

  13. Influence analysis in quantitative trait loci detection.

    Science.gov (United States)

    Dou, Xiaoling; Kuriki, Satoshi; Maeno, Akiteru; Takada, Toyoyuki; Shiroishi, Toshihiko

    2014-07-01

    This paper presents systematic methods for the detection of influential individuals that affect the log odds (LOD) score curve. We derive general formulas of influence functions for profile likelihoods and introduce them into two standard quantitative trait locus detection methods: the interval mapping method and single marker analysis. Besides influence analysis of specific LOD scores, we also develop influence analysis methods for the shape of the LOD score curves. A simulation-based method is proposed to assess the significance of the influence of individuals. These methods are shown to be useful in the influence analysis of a real dataset of an experimental population from an F2 mouse cross. By receiver operating characteristic analysis, we confirm that the proposed methods show better performance than existing diagnostics.

  14. Sequential Detection of Fission Processes for Harbor Defense

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J V; Walston, S E; Chambers, D H

    2015-02-12

    With the large increase in terrorist activities throughout the world, the timely and accurate detection of special nuclear material (SNM) has become an extremely high priority for many countries concerned with national security. The detection of radionuclide contraband based on its γ-ray emissions has been attacked vigorously, with some interesting and feasible results; however, the fission process of SNM has not received as much attention, due to its inherent complexity and required predictive nature. In this paper, on-line sequential Bayesian detection and parameter estimation techniques are developed to rapidly and reliably detect unknown fissioning sources with high statistical confidence.
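
    As a toy illustration of on-line sequential detection for counting data, the sketch below runs a sequential probability ratio test on simulated Poisson counts. The rates, error targets and data stream are assumptions; the paper's physics-based models are considerably richer.

    ```python
    import numpy as np

    lam0, lam1 = 5.0, 8.0      # background vs background-plus-source counts/interval
    alpha, beta = 1e-3, 1e-3   # target false-alarm and miss probabilities
    upper = np.log((1 - beta) / alpha)
    lower = np.log(beta / (1 - alpha))

    rng = np.random.default_rng(0)
    llr = 0.0
    for k in range(1, 1000):
        n = rng.poisson(lam1)  # simulated detector counts for one interval
        # Poisson log-likelihood ratio contribution of this interval.
        llr += n * np.log(lam1 / lam0) - (lam1 - lam0)
        if llr >= upper:
            print(f"source declared after {k} intervals")
            break
        if llr <= lower:
            print(f"background declared after {k} intervals")
            break
    ```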

  15. Performance evaluation of fault detection methods for wastewater treatment processes.

    Science.gov (United States)

    Corominas, Lluís; Villez, Kris; Aguado, Daniel; Rieger, Leiv; Rosén, Christian; Vanrolleghem, Peter A

    2011-02-01

    Several methods to detect faults have been developed in various fields, mainly in chemical and process engineering. However, minimal practical guidelines exist for their selection and application. This work presents an index for evaluating the monitoring and diagnosis performance of fault detection methods, which takes into account several characteristics, such as false alarms, false acceptance, and undesirable switching from correct detection to non-detection during a fault event. The usefulness of the index for process engineering is demonstrated first by application to a simple example. Then, it is used to compare five univariate fault detection methods (Shewhart, EWMA and residuals of EWMA) applied to the simulated results of the Benchmark Simulation Model No. 1 long-term (BSM1_LT). The BSM1_LT, provided by the IWA Task Group on Benchmarking of Control Strategies, is a simulation platform that allows for creating sensor and actuator faults and process disturbances in a wastewater treatment plant. The results of the method comparison using BSM1_LT show that adaptive methods (residuals of EWMA) perform better at detecting a sensor measurement shift, as does monitoring the actuator signals in a control loop (e.g., airflow). Overall, the proposed index is able to screen fault detection methods.
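
    A sketch of one adaptive scheme of the kind compared above: an EWMA filter whose residuals are screened against a control limit. The smoothing constant, limit and synthetic sensor trace are illustrative assumptions, not BSM1_LT outputs.

    ```python
    import numpy as np

    def ewma_residual_alarms(y, lam=0.2, L=3.0):
        """Boolean alarms from an EWMA-residual control chart."""
        z = np.empty_like(y)
        z[0] = y[0]
        for k in range(1, len(y)):
            z[k] = lam * y[k] + (1 - lam) * z[k - 1]   # adaptive baseline
        resid = y - z
        sigma = np.std(resid[: len(resid) // 2])       # noise level from early data
        return np.abs(resid) > L * sigma

    rng = np.random.default_rng(1)
    y = rng.normal(2.0, 0.1, 500)
    y[300:] += 0.5                                     # sensor measurement shift
    print("first alarm at sample", np.argmax(ewma_residual_alarms(y)))
    ```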

  16. RISK ANALYSIS IN MILK PROCESSING

    Directory of Open Access Journals (Sweden)

    I. PIRVUTOIU

    2013-12-01

    Full Text Available This paper aimed to evaluate bankruptcy risk using the "score method" based on Conan and Holder's model. The data were collected from the balance sheet and profit and loss account for the period 2005-2007, recorded by a meat processing plant (Rador Commercial Company). The study puts in evidence the financial situation of the company and the level of the main financial ratios underpinning the calculation of the Z score function value in the three years. The low values of the Z score function recorded every year reflect that the company is still facing bankruptcy. However, the worst situation was recorded in the years 2005 and 2006, when the bankruptcy risk ranged between 70-80%. In the year 2007, the bankruptcy risk was lower, ranging between 50-70%, as the Z function recorded a value lower than 4. For meat processing companies such an analysis is compulsory at present, as long as the business environment is very risky in our country.

  17. Kernel-based fisher discriminant analysis for hyperspectral target detection

    Institute of Scientific and Technical Information of China (English)

    GU Yan-feng; ZHANG Ye; YOU Di

    2007-01-01

    A new method based on kernel Fisher discriminant analysis (KFDA) is proposed for target detection in hyperspectral images. KFDA combines kernel mapping, derived from support vector machines, with classical linear Fisher discriminant analysis (LFDA), and is well suited to processing nonlinear data such as hyperspectral images. Following the Fisher criterion, which maximizes the ratio of the between-class to the within-class scatter, KFDA is used to obtain a set of optimal discriminant basis vectors in a high-dimensional feature space. All pixels in the hyperspectral images are projected onto the discriminant basis vectors, and target detection is performed according to the projection result. Numerical experiments are performed on hyperspectral data with 126 bands collected by the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS). The experimental results show the effectiveness of the proposed detection method and demonstrate its ability to overcome the small-sample-size and spectral-variability problems in hyperspectral target detection.
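
    A compact two-class kernel Fisher discriminant in the spirit of the method above (the standard kernelized formulation); the RBF width, regularizer and toy data are assumptions, with hyperspectral pixel vectors taking the place of X in practice.

    ```python
    import numpy as np

    def rbf(A, B, gamma=0.5):
        d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d)

    def kfda_fit(X, y, gamma=0.5, reg=1e-3):
        """Expansion coefficients alpha maximizing the kernel Fisher ratio."""
        K = rbf(X, X, gamma)
        n = len(y)
        M0, M1 = K[:, y == 0].mean(1), K[:, y == 1].mean(1)
        N = np.zeros((n, n))                 # within-class scatter via the kernel
        for c in (0, 1):
            Kc = K[:, y == c]
            nc = Kc.shape[1]
            N += Kc @ (np.eye(nc) - np.full((nc, nc), 1.0 / nc)) @ Kc.T
        return np.linalg.solve(N + reg * np.eye(n), M1 - M0)

    def kfda_project(X_train, alpha, X_new, gamma=0.5):
        return rbf(X_new, X_train, gamma) @ alpha

    rng = np.random.default_rng(2)
    X = np.vstack([rng.normal(0.0, 1.0, (40, 5)), rng.normal(1.5, 1.0, (40, 5))])
    y = np.repeat([0, 1], 40)
    scores = kfda_project(X, kfda_fit(X, y), X)  # threshold scores to flag targets
    ```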

  18. Visual verification and analysis of cluster detection for molecular dynamics.

    Science.gov (United States)

    Grottel, Sebastian; Reina, Guido; Vrabec, Jadran; Ertl, Thomas

    2007-01-01

    A current research topic in molecular thermodynamics is the condensation of vapor to liquid and the investigation of this process at the molecular level. Condensation is found in many physical phenomena, e.g. the formation of atmospheric clouds or the processes inside steam turbines, where detailed knowledge of the dynamics of condensation processes will help to optimize energy efficiency and avoid problems with droplets of macroscopic size. The key properties of these processes are the nucleation rate and the critical cluster size. For the calculation of these properties it is essential to use a meaningful definition of molecular clusters, which is currently not a completely resolved issue. In this paper a framework capable of interactively visualizing molecular datasets of such nucleation simulations is presented, with an emphasis on the detected molecular clusters. To check the quality of the results of the cluster detection, our framework introduces the concept of flow groups to highlight potential cluster evolution over time which is not detected by the employed algorithm. To confirm the findings of the visual analysis, we coupled the rendering view with a schematic view of the clusters' evolution. This allows researchers to rapidly assess the quality of the molecular cluster detection algorithm and to identify locations in the simulation data, in space as well as in time, where the cluster detection fails. Thus, thermodynamics researchers can eliminate weaknesses in their cluster detection algorithms. Several examples of the effective and efficient usage of our tool are presented.
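
    For context, a simple distance-criterion (Stillinger-type) cluster detector of the kind such frameworks are built to verify: molecules closer than a cutoff are assigned to the same cluster. The coordinates and cutoff below are assumptions.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def detect_clusters(pos, cutoff=1.2):
        """Label each particle with the root of its distance-connected cluster."""
        parent = np.arange(len(pos))

        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]   # path halving
                i = parent[i]
            return i

        for i, j in cKDTree(pos).query_pairs(cutoff):
            parent[find(i)] = find(j)           # union of the two clusters
        return np.array([find(i) for i in range(len(pos))])

    pos = np.random.default_rng(3).uniform(0.0, 10.0, (500, 3))
    sizes = np.bincount(detect_clusters(pos))
    print("largest cluster size:", sizes.max())
    ```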

  19. PERIODIC SIGNAL DETECTION WITH USING DUFFING SYSTEM POINCARE MAP ANALYSIS

    Directory of Open Access Journals (Sweden)

    Valeriy Martynyuk

    2014-06-01

    Full Text Available In this article, a periodic signal detection method based on the analysis of the chaotic oscillations of a Duffing system is presented. This work develops the chaos-based signal detection technique. Generally, chaos-based signal detection means detecting the chaotic-to-periodic state transition under the influence of an input periodic component. If the input periodic component reaches a certain threshold value, the system transitions from a chaotic to a periodic state. Duffing-type chaotic systems are often used for such signal detection because of their ability to remain in a chaotic state for a long time and their relatively simple realization. The main advantage of chaos-based signal detection methods is that they exploit the sensitivity of chaotic systems to weak signals. But such methods are rarely used in practice because of the difficulty of controlling the chaotic system state. The method presented here does not require exact control of the system state. The Duffing system works continuously in a chaotic state, and the periodic signal detection process is based on the analysis of the fractal structure of the Duffing system's Poincare map. This structure does not depend on noise, and therefore the minimum input signal-to-noise ratio required for periodic signal detection is not limited by the tolerance of chaotic system state control.
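
    A sketch of the core computation under assumed coefficients: integrate a driven Duffing oscillator and sample its Poincare map once per forcing period. The drive amplitude placing the system near the chaotic regime, and the weak-signal term, are illustrative.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    delta, a, b = 0.5, 1.0, 1.0      # damping and double-well stiffness terms
    F, omega = 0.826, 1.0            # reference drive near the chaotic regime
    eps = 0.0                        # amplitude of the weak periodic input

    def duffing(t, s):
        x, v = s
        drive = (F + eps) * np.cos(omega * t)
        return [v, -delta * v + a * x - b * x**3 + drive]

    T = 2.0 * np.pi / omega
    t_eval = np.arange(3000) * T     # one sample per forcing period
    sol = solve_ivp(duffing, (0.0, t_eval[-1]), [0.1, 0.0],
                    t_eval=t_eval, rtol=1e-8, atol=1e-8)
    poincare = sol.y[:, 200:]        # (x, v) points after discarding transients
    # The fractal structure of these points is what the method analyzes.
    ```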

  20. Noise analysis for sensitivity-based structural damage detection

    Institute of Scientific and Technical Information of China (English)

    YIN Tao; ZHU Hong-ping; YU Ling

    2007-01-01

    As vibration-based structural damage detection methods are easily affected by environmental noise, a new statistics-based noise analysis method is proposed, together with the Monte Carlo technique, to investigate the influence of experimental noise in modal data on sensitivity-based damage detection methods. Different from the commonly used random perturbation technique, the proposed technique is deduced directly from the Moore-Penrose generalized inverse of the sensitivity matrix, which not only makes the analysis process more efficient but also allows the influence of noise on both frequencies and mode shapes to be analyzed in a similar way for three commonly used sensitivity-based damage detection methods. A one-story portal frame is adopted to evaluate the efficiency of the proposed noise analysis technique.
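
    A toy Monte Carlo illustration of the idea: noise on modal data propagates into damage-parameter estimates through the Moore-Penrose generalized inverse of the sensitivity matrix. The sensitivity matrix and noise level below are random placeholders, not a structural model.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    S = rng.normal(size=(20, 6))            # sensitivities of 20 modal quantities
    true_damage = np.array([0.0, 0.0, 0.1, 0.0, 0.0, 0.0])
    r_clean = S @ true_damage               # noise-free modal residual vector

    trials, noise = 2000, 0.01
    S_pinv = np.linalg.pinv(S)              # Moore-Penrose generalized inverse
    estimates = np.empty((trials, 6))
    for t in range(trials):
        estimates[t] = S_pinv @ (r_clean + rng.normal(0.0, noise, size=20))

    print("std of damage estimates:", estimates.std(axis=0).round(4))
    ```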

  1. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates edge detection, Markov Random Fields (MRF), watershed segmentation and merging techniques is presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmentation is obtained based on K-means clustering and the minimum distance. The region process is then modeled by an MRF to obtain an image that contains regions of different intensity. Gradient values are calculated and the watershed technique is applied. The DIS value is computed for each pixel to define all the edges (weak or strong) in the image, yielding the DIS map. This serves as prior knowledge about possible region boundaries for the next step (MRF), which produces an image carrying all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors are applied to the MRF-segmented image and the results are compared. The segmentation and edge detection result in one closed boundary per actual region in the image.
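
    A minimal scikit-image sketch of the gradient-plus-watershed portion of such a pipeline (the MRF stage is omitted and the test image is a bundled sample, so the chain is purely illustrative):

    ```python
    import numpy as np
    from skimage import data, filters, segmentation

    image = data.coins()
    edges = filters.sobel(image)             # gradient map, akin to a DIS map
    # Markers from intensity extremes, then watershed on the gradient.
    markers = np.zeros_like(image, dtype=int)
    markers[image < 30] = 1                  # background seeds
    markers[image > 150] = 2                 # object seeds
    labels = segmentation.watershed(edges, markers)
    print("regions found:", len(np.unique(labels)))
    ```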

  2. An effort allocation model considering different budgetary constraint on fault detection process and fault correction process

    Directory of Open Access Journals (Sweden)

    Vijay Kumar

    2016-01-01

    Full Text Available Fault detection process (FDP) and fault correction process (FCP) are important phases of the software development life cycle (SDLC). It is essential for software to undergo a testing phase, during which faults are detected and corrected. The main goal of this article is to allocate the testing resources in an optimal manner to minimize the cost during the testing phase using FDP and FCP under a dynamic environment. In this paper, we first assume there is a time lag between fault detection and fault correction; thus, removal of a fault is performed after the fault is detected. In addition, the detection process and correction process are taken to be independent simultaneous activities with different budgetary constraints. A structured optimal policy based on optimal control theory is proposed to help software managers optimize the allocation of the limited resources under reliability criteria. Furthermore, the release policy for the proposed model is also discussed. A numerical example is given in support of the theoretical results.

  3. Eco-Efficiency Analysis of biotechnological processes.

    Science.gov (United States)

    Saling, Peter

    2005-07-01

    Eco-Efficiency has been variously defined and analytically implemented by several workers. In most cases, Eco-Efficiency is taken to mean the ecological optimization of overall systems without disregarding economic factors. Eco-Efficiency should increase the positive ecological performance of a commercial company in relation to economic value creation, or reduce negative effects. Several companies use Eco-Efficiency Analysis in decision-making processes, and industrial examples of best practice in developing and implementing Eco-Efficiency have been reviewed. They clearly demonstrate the environmental and business benefits of Eco-Efficiency. An instrument for the early recognition and systematic detection of economic and environmental opportunities and risks for production processes in the chemical industry has been in use since 1997, and different new features have since been developed, leading to many examples. This powerful Eco-Efficiency Analysis allows a feasibility evaluation of existing and future business activities and is applied by BASF. In many cases, decision-makers are able to choose among alternative processes for making a product.

  4. Digital Image Processing Technique for Breast Cancer Detection

    Science.gov (United States)

    Guzmán-Cabrera, R.; Guzmán-Sepúlveda, J. R.; Torres-Cisneros, M.; May-Arrioja, D. A.; Ruiz-Pinales, J.; Ibarra-Manzano, O. G.; Aviña-Cervantes, G.; Parada, A. González

    2013-09-01

    Breast cancer is the most common cause of death in women and the second leading cause of cancer deaths worldwide. Primary prevention in the early stages of the disease becomes complex as the causes remain almost unknown. However, some typical signatures of this disease, such as masses and microcalcifications appearing on mammograms, can be used to improve early diagnostic techniques, which is critical for women's quality of life. X-ray mammography is the main test used for screening and early diagnosis, and its analysis and processing are the keys to improving breast cancer prognosis. As masses and benign glandular tissue typically appear with low contrast and often very blurred, several computer-aided diagnosis schemes have been developed to support radiologists and internists in their diagnosis. In this article, an approach is proposed to effectively analyze digital mammograms based on texture segmentation for the detection of early-stage tumors. The proposed algorithm was tested on several images taken from the digital database for screening mammography for cancer research and diagnosis, and it was found to be well suited to distinguishing masses and microcalcifications from the background tissue using morphological operators, extracting them through machine learning techniques and a clustering algorithm for intensity-based segmentation.
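
    As one illustrative morphological step of the kind such schemes rely on, a white top-hat transform highlights small bright structures (microcalcification-like spots) against blurred background tissue. The input below is synthetic, not a mammogram.

    ```python
    import numpy as np
    from skimage.morphology import white_tophat, disk

    rng = np.random.default_rng(9)
    img = rng.normal(0.4, 0.05, (256, 256))      # smooth noisy "tissue"
    img[100:103, 100:103] += 0.5                 # small bright spot
    enhanced = white_tophat(img, footprint=disk(5))
    print("peak response at:", np.unravel_index(enhanced.argmax(), img.shape))
    ```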

  5. Trend detection in social networks using Hawkes processes

    OpenAIRE

    2015-01-01

    We develop in this paper a trend detection algorithm, designed to find trendy topics being disseminated in a social network. We assume that the broadcasts of messages in the social network are governed by a self-exciting point process, namely a Hawkes process, which takes into consideration the real broadcasting times of messages and the interaction between users and topics. We formally define trendiness and derive trend indices for each topic being disseminated in th...
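
    A univariate Hawkes process of the kind referred to here can be simulated with Ogata's thinning algorithm; the baseline, excitation and decay parameters below are assumptions.

    ```python
    import numpy as np

    def simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, horizon=100.0, seed=5):
        """Event times of a Hawkes process via Ogata's thinning algorithm."""
        rng = np.random.default_rng(seed)
        t, events = 0.0, []
        while t < horizon:
            # Intensity just after t bounds the intensity until the next event.
            lam_bar = mu + sum(alpha * np.exp(-beta * (t - s)) for s in events)
            t += rng.exponential(1.0 / lam_bar)
            lam_t = mu + sum(alpha * np.exp(-beta * (t - s)) for s in events)
            if rng.uniform() <= lam_t / lam_bar:   # accept with prob lam(t)/lam_bar
                events.append(t)
        return [s for s in events if s < horizon]

    print(len(simulate_hawkes()), "broadcasts simulated")
    ```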

  6. Unsupervised Approaches for Post-Processing in Computationally Efficient Waveform-Similarity-Based Earthquake Detection

    Science.gov (United States)

    Bergen, K.; Yoon, C. E.; O'Reilly, O. J.; Beroza, G. C.

    2015-12-01

    Recent improvements in computational efficiency for waveform correlation-based detection, achieved by new methods such as Fingerprint and Similarity Thresholding (FAST), promise to allow large-scale blind searches for similar waveforms in long-duration continuous seismic data. Waveform similarity search applied to datasets of months to years of continuous seismic data will identify significantly more events than traditional detection methods. With the anticipated increase in the number of detections and the associated increase in false positives, manual inspection of the detection results will become infeasible. This motivates the need for new approaches to process the output of similarity-based detection. We explore data mining techniques for improved detection post-processing. We approach this by considering the similarity-detector output as a sparse similarity graph with candidate events as vertices and similarities as weighted edges. Image processing techniques are leveraged to define candidate events and to combine results processed individually at multiple stations. Clustering and graph analysis methods are used to identify groups of similar waveforms and to assign a confidence score to candidate detections. Anomaly detection and classification are applied to the waveform data for additional false detection removal. A comparison of the methods will be presented and their performance demonstrated on a suspected induced and a non-induced earthquake sequence.
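
    One post-processing step described above, grouping candidate detections by thresholding a sparse pairwise-similarity graph and taking connected components, might look like the following sketch on synthetic similarities.

    ```python
    import numpy as np
    from scipy.sparse import csr_matrix
    from scipy.sparse.csgraph import connected_components

    rng = np.random.default_rng(6)
    n = 200                                   # candidate detections (vertices)
    rows = rng.integers(0, n, 400)
    cols = rng.integers(0, n, 400)
    sims = rng.uniform(0.0, 1.0, 400)         # waveform similarities (edges)

    keep = sims > 0.8                         # similarity threshold
    graph = csr_matrix((sims[keep], (rows[keep], cols[keep])), shape=(n, n))
    n_groups, labels = connected_components(graph, directed=False)
    print(f"{n_groups} event groups among {n} candidates")
    ```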

  7. In-process discontinuity detection during friction stir welding

    Science.gov (United States)

    Shrivastava, Amber

    The objective of this work is to develop a method for detecting the creation of discontinuities (e.g., voids) during friction stir welding. Friction stir welding is inherently cost-effective; however, the need for significant weld inspection can make the process cost-prohibitive. A new approach to weld inspection is required, one in which an in-situ characterization of weld quality can be obtained, reducing the need for post-process inspection. Friction stir welds with and without discontinuities were created. In this work, discontinuities are generated by reducing the friction stir tool rotation frequency and increasing the tool traverse speed in order to create "colder" welds. Forces are measured during the welds. Discontinuity sizes are measured by computerized tomography. The relationship between the force transients and the discontinuity sizes indicates that force measurement during friction stir welding can be used effectively to detect discontinuities in friction stir welds. The normalized force transient data and normalized discontinuity sizes are correlated to develop a criterion for discontinuity detection. Additional welds were performed to validate the discontinuity detection method. The discontinuity sizes estimated by the force-measurement-based method are in good agreement with the discontinuity sizes measured by computerized tomography. These results show that the force-measurement-based discontinuity detection method can be used effectively to detect discontinuities during friction stir welding.

  8. Operational analysis for the drug detection problem

    Science.gov (United States)

    Hoopengardner, Roger L.; Smith, Michael C.

    1994-10-01

    New techniques and sensors to identify the molecular, chemical, or elemental structures unique to drugs are being developed under several national programs. However, the challenge faced by U.S. drug enforcement and Customs officials goes far beyond the simple technical capability to detect an illegal drug. Entry points into the U.S. include ports, border crossings, and airports where cargo ships, vehicles, and aircraft move huge volumes of freight. Current technology and personnel are able to physically inspect only a small fraction of the entering cargo containers. How best to utilize new technology to aid the detection process without adversely affecting the processing of vehicles and time-sensitive cargo is the challenge faced by these officials. This paper describes an ARPA-sponsored initiative to develop a simple, yet useful, method for examining the operational consequences of utilizing various procedures and technologies in combination to achieve an 'acceptable' level of detection probability. Since Customs entry points into the U.S. vary from huge seaports to a one-lane highway checkpoint between the U.S. and Canada or Mexico, no one system can possibly be right for all points. This approach can examine alternative concepts for using different techniques/systems for different types of entry points. Operational measures reported include the average time to process vehicles and containers, the average and maximum numbers in the system at any time, and the utilization of inspection teams. The method is implemented via a PC-based simulation written in the GPSS-PC language. Input to the simulation model is (1) the individual detection probabilities and false positive rates for each detection technology or procedure, (2) the inspection time for each procedure, (3) the system configuration, and (4) the physical distance between inspection stations. The model offers on-line graphics to examine effects as the model runs.
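
    A back-of-the-envelope analogue of the simulation's outputs for a single inspection station modeled as an M/M/c queue; the arrival and service rates and team count are assumptions.

    ```python
    import math

    def mmc_metrics(lam, mu, c):
        """Per-inspector utilization and mean time in system (Erlang C)."""
        rho = lam / (c * mu)
        a = lam / mu
        p0 = 1.0 / (sum(a**k / math.factorial(k) for k in range(c))
                    + a**c / (math.factorial(c) * (1.0 - rho)))
        lq = p0 * a**c * rho / (math.factorial(c) * (1.0 - rho) ** 2)
        return rho, lq / lam + 1.0 / mu        # W = Wq + mean service time

    rho, w = mmc_metrics(lam=40.0, mu=12.0, c=4)   # vehicles/hour, 4 teams
    print(f"utilization {rho:.0%}, mean time in system {60.0 * w:.1f} min")
    ```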

  9. Counterfeit Electronics Detection Using Image Processing and Machine Learning

    Science.gov (United States)

    Asadizanjani, Navid; Tehranipoor, Mark; Forte, Domenic

    2017-01-01

    Counterfeiting is an increasing concern for businesses and governments as greater numbers of counterfeit integrated circuits (ICs) infiltrate the global market. There is an ongoing effort in experimental and national labs inside the United States to detect and prevent such counterfeits as efficiently as possible. However, a piece is still missing: a way to automatically detect and properly keep record of detected counterfeit ICs. Here, we introduce a web application database that allows users to share previous examples of counterfeits through an online database and to obtain statistics regarding the prevalence of known defects. We also investigate automated techniques based on image processing and machine learning to detect different physical defects and to determine whether or not an IC is counterfeit.

  10. SCADA alarms processing for wind turbine component failure detection

    Science.gov (United States)

    Gonzalez, E.; Reder, M.; Melero, J. J.

    2016-09-01

    Wind turbine failures and downtime can often compromise the profitability of a wind farm owing to their high impact on operation and maintenance (O&M) costs. Early detection of failures can facilitate the changeover from corrective maintenance towards a predictive approach. This paper presents a cost-effective methodology that combines various alarm analysis techniques, using data from the Supervisory Control and Data Acquisition (SCADA) system, in order to detect component failures. The approach categorises the alarms according to a reviewed taxonomy, turning overwhelming data into valuable information for assessing component status. Different alarm analysis techniques are then applied for two purposes: evaluating the capability of the SCADA alarm system to detect failures, and investigating the relation between faults in one component and subsequent failures in others. Various case studies are presented and discussed. The study highlights the relationship between faulty behaviour in different components and between failures and adverse environmental conditions.

  11. Impact of two types of image processing on cancer detection in mammography

    Science.gov (United States)

    Warren, Lucy M.; Halling-Brown, Mark D.; Looney, Padraig T.; Dance, David R.; Wilkinson, Louise; Wallis, Matthew G.; Given-Wilson, Rosalind M.; Cooke, Julie; McAvinchey, Rita; Young, Kenneth C.

    2016-03-01

    The impact of image processing on cancer detection is still a concern to radiologists and physicists. This work aims to evaluate the effect of two types of image processing on cancer detection in mammography. An observer study was performed in which six radiologists inspected 349 cases (a mixture of normal cases, benign lesions and cancers) processed with two types of image processing. The observers marked areas they suspected to be cancers. JAFROC analysis was performed to determine whether there was a significant difference in cancer detection between the two types of image processing. Cancer detection was significantly better with the standard-setting image processing (flavor A) than with one that provides enhanced image contrast (flavor B), p = 0.036. The image processing was applied to images of the CDMAM test object, which were then analysed using CDCOM. The threshold gold thickness measured with the CDMAM test object was thinner using flavor A than flavor B image processing. Since flavor A was found to be superior in both the observer study and the measurements using the CDMAM phantom, this may indicate that measurements using the CDMAM correlate with the change in cancer detection with different types of image processing.

  12. IWTU Process Sample Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Nick Soelberg

    2013-04-01

    CH2M-WG Idaho (CWI) requested that Battelle Energy Alliance (BEA) analyze various samples collected during June–August 2012 at the Integrated Waste Treatment Unit (IWTU). Samples of IWTU process materials were collected from various locations in the process; none of these samples were radioactive. These samples were collected and analyzed to provide a better understanding of the compositions of various materials in the process at the time of the process shutdown that occurred on June 16, 2012, while the IWTU was undergoing nonradioactive startup.

  13. Continuous Fraud Detection in Enterprise Systems through Audit Trail Analysis

    Directory of Open Access Journals (Sweden)

    Peter J. Best

    2009-03-01

    Full Text Available Enterprise systems, real time recording and real time reporting pose new and significant challenges to the accounting and auditing professions. This includes developing methods and tools for continuous assurance and fraud detection. In this paper we propose a methodology for continuous fraud detection that exploits security audit logs, changes in master records and accounting audit trails in enterprise systems. The steps in this process are: (1) threat monitoring, i.e. surveillance of security audit logs for 'red flags'; (2) automated extraction and analysis of data from audit trails; and (3) using forensic investigation techniques to determine whether a fraud has actually occurred. We demonstrate how mySAP, an enterprise system, can be used for audit trail analysis in detecting financial frauds; afterwards we use a case study of a suspected fraud to illustrate how to implement the methodology.

  14. The effect of image processing on the detection of cancers in digital mammography.

    Science.gov (United States)

    Warren, Lucy M; Given-Wilson, Rosalind M; Wallis, Matthew G; Cooke, Julie; Halling-Brown, Mark D; Mackenzie, Alistair; Chakraborty, Dev P; Bosmans, Hilde; Dance, David R; Young, Kenneth C

    2014-08-01

    OBJECTIVE. The objective of our study was to investigate the effect of image processing on the detection of cancers in digital mammography images. MATERIALS AND METHODS. Two hundred seventy pairs of breast images (both breasts, one view) were collected from eight systems using Hologic amorphous selenium detectors: 80 image pairs showed breasts containing subtle malignant masses; 30 image pairs, biopsy-proven benign lesions; 80 image pairs, simulated calcification clusters; and 80 image pairs, no cancer (normal). The 270 image pairs were processed with three types of image processing: standard (full enhancement), low contrast (intermediate enhancement), and pseudo-film-screen (no enhancement). Seven experienced observers inspected the images, locating and rating regions they suspected to be cancer for likelihood of malignancy. The results were analyzed using a jackknife-alternative free-response receiver operating characteristic (JAFROC) analysis. RESULTS. The detection of calcification clusters was significantly affected by the type of image processing: The JAFROC figure of merit (FOM) decreased from 0.65 with standard image processing to 0.63 with low-contrast image processing (p = 0.04) and from 0.65 with standard image processing to 0.61 with film-screen image processing (p = 0.0005). The detection of noncalcification cancers was not significantly different among the image-processing types investigated (p > 0.40). CONCLUSION. These results suggest that image processing has a significant impact on the detection of calcification clusters in digital mammography. For the three image-processing versions and the system investigated, standard image processing was optimal for the detection of calcification clusters. The effect on cancer detection should be considered when selecting the type of image processing in the future.

  15. Detection of subsurface defects in metal materials using infrared thermography; Image processing and finite element modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ranjit, Shrestha; Kim, Won Tae [Dept. of Mechanical Engineering, Kongju National University, Cheonan (Korea, Republic of)

    2014-04-15

    Infrared thermography is an emerging approach to non-contact, non-intrusive, and non-destructive inspection of various solid materials, such as metals, composites, and semiconductors, for industrial and research interests. In this study, data processing was applied to infrared thermography measurements to detect defects in metals that are widely used in industrial fields. When analyzing experimental data from infrared thermographic testing, raw images are often not appropriate. Thus, various data analysis methods were used at the pre-processing and processing levels in data processing programs for quantitative analysis of defect detection and characterization; these increased the infrared non-destructive testing capabilities, since subtle defect signatures became apparent. A 3D finite element simulation was performed to verify and analyze the data obtained from both the experiment and the image processing techniques.

  16. Formal analysis of design process dynamics

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2010-01-01

    This paper presents a formal analysis of design process dynamics. Such a formal analysis is a prerequisite for arriving at a formal theory of design and for the development of automated support for the dynamics of design processes. The analysis was geared toward the identification of dynamic design properties.

  17. Social Media Sentiment Analysis and Topic Detection for Singapore English

    Science.gov (United States)

    2013-09-01

    Social media has become an increasingly important part of our daily lives in the last few years. With the convenience built into smart devices, many new ways of communicating have been made possible via social-media applications. Sentiment analysis and topic detection are two growing areas in Natural Language Processing, and there are increasing trends of using them in social-media analytics. In this thesis, we analyze various standard methods

  18. Analysis of Hospital Processes with Process Mining Techniques.

    Science.gov (United States)

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows for the discovery, monitoring, and improvement of processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The performed analyses allowed for redefining functions in the system and proposing a proper flow of information. The study exposed the need to incorporate process mining techniques into hospital systems to analyze process execution. Moreover, we illustrate its application in making clinical and administrative decisions for the management of hospital activities.
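
    As an illustration (not taken from the paper), event logs like those generated here could be mined with the open-source pm4py library; the XES file name is hypothetical and the calls reflect the pm4py 2.x API as we understand it.

    ```python
    import pm4py

    log = pm4py.read_xes("hospital_event_log.xes")      # hypothetical HIS export
    net, initial, final = pm4py.discover_petri_net_inductive(log)
    pm4py.view_petri_net(net, initial, final)           # inspect the process model
    ```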

  19. Detecting 2LSB steganography using extended pairs of values analysis

    Science.gov (United States)

    Khalind, Omed; Aziz, Benjamin

    2014-05-01

    In this paper, we propose an extended pairs-of-values analysis to detect, and estimate the length of, secret messages embedded with 2LSB replacement in digital images, based on the chi-square attack and the regularity rate in pixel values. The detection process is separated from the estimation of the hidden message length, as detection is the main requirement of any steganalysis method. Hence, the detection process acts as a discrete classifier, which classifies a given set of images into stego and clean classes. The method can accurately detect 2LSB replacement even when the message length is about 10% of the total capacity; it reaches its best performance, with an accuracy higher than 0.96 and a true positive rate of more than 0.997, when the amount of data is 20% to 100% of the total capacity. The method makes no assumptions about either the image or the secret message; it was tested with two sets of 3000 images, compressed and uncompressed, each embedded with a random message. This detection method could also be used as an automated tool to analyse a bulk of images for hidden content, which could be used by digital forensics analysts in their investigations.
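
    A toy version of the chi-square idea extended to quadruples of pixel values that share their upper six bits, in the spirit of the analysis above; the image, grouping and sparsity cutoff are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.stats import chi2

    def chi2_quadruples(pixels):
        """p-value of equalization within 2LSB quadruples (high -> stego-like)."""
        hist = np.bincount(pixels.ravel(), minlength=256).astype(float)
        quads = hist.reshape(64, 4)                # values sharing the top 6 bits
        expected = quads.mean(axis=1, keepdims=True)
        mask = expected[:, 0] > 4                  # skip sparse quadruples
        stat = ((quads[mask] - expected[mask]) ** 2 / expected[mask]).sum()
        return chi2.sf(stat, 3 * mask.sum())

    img = np.random.default_rng(7).integers(0, 256, (512, 512), dtype=np.uint8)
    print("p-value:", chi2_quadruples(img))
    ```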

  20. Analysis of Android Device-Based Solutions for Fall Detection.

    Science.gov (United States)

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-07-23

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation of automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals in the literature, taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of Android devices (with limited battery and computing resources) to fall detection solutions.

  1. Analysis of Android Device-Based Solutions for Fall Detection

    Directory of Open Access Journals (Sweden)

    Eduardo Casilari

    2015-07-01

    Full Text Available Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation of automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals in the literature, taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of Android devices (with limited battery and computing resources) to fall detection solutions.

  2. Analysis of crisis intervention processes.

    Science.gov (United States)

    Tschacher, Wolfgang; Jacobshagen, Nina

    2002-01-01

    The remediation processes in psychosocial crisis intervention were modeled focusing on cognitive orientation. Frequent observations and subsequent process modeling constitute a novel approach to process research and reveal process-outcome associations. A sample of 40 inpatients who were assigned to treatment in a crisis intervention unit was monitored in order to study the process of crisis intervention. The process data consisted of patients' self-ratings of the variables mood, tension, and cognitive orientation, which were assessed three times a day throughout hospitalization (M = 22.6 days). Linear time series models (vector autoregression) of the process data were computed to describe the prototypical dynamic patterns of the sample. Additionally, the outcome of crisis intervention was evaluated by pre-post questionnaires. Linear trends were found pointing to an improvement of mood, a reduction of tension, and an increase of outward cognitive orientation. Time series modeling showed that, on average, outward cognitive orientation preceded improved mood. The time series models partially predicted the treatment effect, notably the outcome domain "reduction of social anxiety," yet did not predict the domain of symptom reduction. In conclusion, crisis intervention should focus on having patients increasingly engage in outward cognitive orientation in order to stabilize mood, reduce anxiety, and activate their resources.

  3. Exoplanetary Detection By Multifractal Spectral Analysis

    CERN Document Server

    Agarwal, Sahil; Wettlaufer, John S

    2016-01-01

    Owing to technological advances, the number of exoplanets discovered has risen dramatically in the last few years. However, when trying to observe Earth analogs, it is often difficult to test the veracity of a detection. We have developed a new approach to the analysis of exoplanetary spectral observations based on temporal multifractality, which identifies time scales that characterize planetary orbital motion around the host star. Without fitting spectral data to stellar models, we show how the planetary signal can be robustly detected from noisy data using the noise amplitude as a source of information. For observations of transiting planets, combining this method with simple geometry allows us to relate the time scales obtained to the primary transit and the secondary eclipse of the exoplanets. Making use of data obtained with ground-based and space-based observations, we have tested our approach on HD 189733b. Moreover, we have investigated the use of this technique in measuring planetary orbital motion via dop...

  4. Bayesian analysis of multiple direct detection experiments

    CERN Document Server

    Arina, Chiara

    2013-01-01

    Bayesian methods offer a coherent and efficient framework for incorporating uncertainties into induction problems. In this article, we review how this approach applies to the analysis of dark matter direct detection experiments. In particular we discuss the exclusion limit of XENON100 and the debated hints of detection under the hypothesis of a WIMP signal. Within parameter inference, marginalizing consistently over uncertainties to extract robust posterior probability distributions, we find that the claimed tension between XENON100 and the other experiments can be partially alleviated in an isospin-violating scenario, while the elastic scattering model appears to be compatible with the classical approach. We then move to model comparison, for which Bayesian methods are particularly well suited. First, we investigate the annual modulation seen in CoGeNT data, finding weak evidence for a modulation. Modulation models due to other physics compare unfavorably with the WIMP models, paying the price for th...

  5. Detecting Damped Lyman-$\\alpha$ Absorbers with Gaussian Processes

    CERN Document Server

    Garnett, Roman; Bird, Simeon; Schneider, Jeff

    2016-01-01

    We develop an automated technique for detecting damped Lyman-$\\alpha$ absorbers (DLAs) along spectroscopic sightlines to quasi-stellar objects (QSOs or quasars). The detection of DLAs in large-scale spectroscopic surveys such as SDSS-III sheds light on galaxy formation at high redshift, showing the nucleation of galaxies from diffuse gas. We use nearly 50 000 QSO spectra to learn a novel tailored Gaussian process model for quasar emission spectra, which we apply to the DLA detection problem via Bayesian model selection. We propose models for identifying an arbitrary number of DLAs along a given line of sight. We demonstrate our method's effectiveness using a large-scale validation experiment, with excellent performance. We also provide a catalog of our results applied to 162 861 spectra from SDSS-III data release 12.
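
    A schematic of the Bayesian model-selection idea using scikit-learn Gaussian processes: compare marginal likelihoods of the spectrum with and without an absorption dip explained away. The spectra are synthetic, and the pipeline's tailored quasar GP model is far more elaborate.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(8)
    wave = np.linspace(0.0, 10.0, 200)[:, None]
    flux = np.sin(wave[:, 0]) + 0.1 * rng.normal(size=200)
    flux[90:110] -= 0.8                       # injected absorption feature

    gp = GaussianProcessRegressor(RBF(1.0) + WhiteKernel(0.01))
    logZ_null = gp.fit(wave, flux).log_marginal_likelihood_value_

    # "Absorber" model: remove a Gaussian dip before fitting the smooth GP.
    dip = 0.8 * np.exp(-0.5 * ((wave[:, 0] - 5.0) / 0.5) ** 2)
    logZ_dla = gp.fit(wave, flux + dip).log_marginal_likelihood_value_
    print("log Bayes factor (dip vs none):", logZ_dla - logZ_null)
    ```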

  6. Detecting Anomalous Process Behaviour using Second Generation Artificial Immune Systems

    CERN Document Server

    Twycross, Jamie; Whitbrook, Amanda

    2010-01-01

    Artificial Immune Systems have been successfully applied to a number of problem domains including fault tolerance and data mining, but have been shown to scale poorly when applied to computer intrusion detection, despite the fact that the biological immune system is a very effective anomaly detector. This may be because AIS algorithms have previously been based on the adaptive immune system and biologically-naive models. This paper focuses on describing and testing a more complex and biologically-authentic AIS model, inspired by the interactions between the innate and adaptive immune systems. Its performance on a realistic process anomaly detection problem is shown to be better than standard AIS methods (negative selection), policy-based anomaly detection methods (systrace), and an alternative innate AIS approach (the DCA). In addition, it is shown that runtime information can be used in combination with system call information to enhance detection capability.

  7. Detection, information fusion, and temporal processing for intelligence in recognition

    Energy Technology Data Exchange (ETDEWEB)

    Casasent, D. [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    1996-12-31

    The use of intelligence in vision recognition draws on many different techniques or tools. This presentation discusses several of these techniques for recognition. The recognition process is generally separated into several steps or stages when implemented in hardware, e.g. detection, segmentation and enhancement, and recognition. Several new distortion-invariant filters, biologically-inspired Gabor wavelet filter techniques, and morphological operations that have been found very useful for detection and clutter rejection are discussed. These are all shift-invariant operations that allow multiple object regions of interest in a scene to be located in parallel. We also discuss new algorithm fusion concepts by which the results from different detection algorithms are combined to reduce detection false alarms; these fusion methods utilize hierarchical processing and fuzzy logic concepts. We have found this to be most necessary, since no single detection algorithm is best for all cases. For the final recognition stage, we describe a new method of representing all distorted versions of different classes of objects and determining the object class and pose that most closely matches that of a given input. Besides being efficient in terms of storage and on-line computations required, it overcomes many of the problems that other classifiers have in terms of the required training set size, poor generalization with many hidden-layer neurons, etc. It is also attractive in its ability to reject input regions as clutter (non-objects) and to learn new object descriptions. We also discuss its use in processing a temporal sequence of input images of the contents of each local region of interest. We note how this leads to robust results in which estimation errors in individual frames can be overcome. This seems very practical, since in many scenarios a decision need not be made after only one frame of data, as subsequent frames of data enter immediately in sequence.

  8. Detection of Blood Culture Bacterial Contamination using Natural Language Processing

    Science.gov (United States)

    Matheny, Michael E.; FitzHenry, Fern; Speroff, Theodore; Hathaway, Jacob; Murff, Harvey J.; Brown, Steven H.; Fielstein, Elliot M.; Dittus, Robert S.; Elkin, Peter L.

    2009-01-01

    Microbiology results are reported in semi-structured formats and have a high content of useful patient information. We developed and validated a hybrid regular expression and natural language processing solution for processing blood culture microbiology reports. Multi-center Veterans Affairs training and testing data sets were randomly extracted and manually reviewed to determine the culture and sensitivity as well as contamination results. The tool was iteratively developed for both outcomes using a training dataset, and then evaluated on the test dataset to determine antibiotic susceptibility data extraction and contamination detection performance. Our algorithm had a sensitivity of 84.8% and a positive predictive value of 96.0% for mapping the antibiotics and bacteria with appropriate sensitivity findings in the test data. The bacterial contamination detection algorithm had a sensitivity of 83.3% and a positive predictive value of 81.8%. PMID:20351890

  9. Detection of blood culture bacterial contamination using natural language processing.

    Science.gov (United States)

    Matheny, Michael E; Fitzhenry, Fern; Speroff, Theodore; Hathaway, Jacob; Murff, Harvey J; Brown, Steven H; Fielstein, Elliot M; Dittus, Robert S; Elkin, Peter L

    2009-11-14

    Microbiology results are reported in semi-structured formats and have a high content of useful patient information. We developed and validated a hybrid regular expression and natural language processing solution for processing blood culture microbiology reports. Multi-center Veterans Affairs training and testing data sets were randomly extracted and manually reviewed to determine the culture and sensitivity as well as contamination results. The tool was iteratively developed for both outcomes using a training dataset, and then evaluated on the test dataset to determine antibiotic susceptibility data extraction and contamination detection performance. Our algorithm had a sensitivity of 84.8% and a positive predictive value of 96.0% for mapping the antibiotics and bacteria with appropriate sensitivity findings in the test data. The bacterial contamination detection algorithm had a sensitivity of 83.3% and a positive predictive value of 81.8%.
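
    A toy regular-expression pass of the kind such hybrid systems combine with NLP; the report snippet and patterns are invented for illustration and are far simpler than real report formats.

    ```python
    import re

    report = """BLOOD CULTURE: POSITIVE
    ORGANISM: STAPHYLOCOCCUS EPIDERMIDIS (1 OF 4 BOTTLES)
    OXACILLIN: RESISTANT  VANCOMYCIN: SUSCEPTIBLE"""

    organism = re.search(r"ORGANISM:\s*([A-Z ]+?)\s*\(", report)
    bottles = re.search(r"\((\d+)\s+OF\s+(\d+)\s+BOTTLES\)", report)
    suscept = re.findall(r"([A-Z]+):\s*(RESISTANT|SUSCEPTIBLE)", report)

    print(organism.group(1), bottles.groups(), suscept)
    # A single skin-flora organism in 1 of 4 bottles would be a contamination cue.
    ```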

  10. Sound Event Detection for Music Signals Using Gaussian Processes

    Directory of Open Access Journals (Sweden)

    Pablo A. Alvarado-Durán

    2013-11-01

    Full Text Available In this paper we present a new methodology for detecting sound events in music signals using Gaussian Processes. Our method first takes a time-frequency representation, i.e. the spectrogram, of the input audio signal. Second, the spectrogram dimension is reduced by translating the linear Hertz frequency scale into the logarithmic Mel frequency scale using a triangular filter bank. Finally, every short-time spectrum, i.e. every Mel spectrogram column, is classified as "Event" or "Not Event" by a Gaussian Process classifier. We compare our method with other widely used event detection techniques. To do so, we use MATLAB® to program each technique and test them on two datasets of music with different levels of complexity. Results show that the new methodology outperforms the standard approaches, with an improvement of about 1.66% on dataset one and 0.45% on dataset two in terms of F-measure.
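
    A minimal sketch of the pipeline described above: Mel-scaled spectrogram frames classified as Event / Not Event by a Gaussian process classifier. The audio is synthetic, the frame labels are toy, and librosa supplies the triangular Mel filter bank.

    ```python
    import numpy as np
    import librosa
    from sklearn.gaussian_process import GaussianProcessClassifier

    sr = 22050
    y = np.concatenate([0.01 * np.random.default_rng(10).normal(size=sr),  # background
                        np.sin(2 * np.pi * 440 * np.arange(sr) / sr)])     # "event"
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=40)
    frames = librosa.power_to_db(mel).T          # one feature row per frame

    labels = (np.arange(len(frames)) >= len(frames) // 2).astype(int)
    clf = GaussianProcessClassifier().fit(frames, labels)
    print("frame-level accuracy:", clf.score(frames, labels))
    ```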

  11. Processing Satellite Imagery To Detect Waste Tire Piles

    Science.gov (United States)

    Skiles, Joseph; Schmidt, Cynthia; Wuinlan, Becky; Huybrechts, Catherine

    2007-01-01

    A methodology for processing commercially available satellite spectral imagery has been developed to enable identification and mapping of waste tire piles in California. The California Integrated Waste Management Board initiated the project and provided funding for the method's development. The methodology uses a combination of commercially available image-processing and georeferencing software to develop a model that specifically distinguishes between tire piles and other objects. The methodology reduces the time that must be spent initially surveying a region for tire sites, thereby increasing the time inspectors and managers have available for remediation of the sites. Remediation is needed because millions of used tires are discarded every year, waste tire piles pose fire hazards, and mosquitoes often breed in water trapped in tires. It should be possible to adapt the methodology to regions outside California by modifying some of the algorithms implemented in the software to account for geographic differences in spectral characteristics associated with terrain and climate. The task of identifying tire piles in satellite imagery is uniquely challenging because of their low reflectance: tires tend to be spectrally confused with shadows and deep water, both of which reflect little light to satellite-borne imaging systems. In this methodology, the challenge is met, in part, by software that implements the Tire Identification from Reflectance (TIRe) model. The development of the TIRe model included incorporation of lessons learned in previous research on the detection and mapping of tire piles by manual/visual and/or computational analysis of aerial and satellite imagery. The TIRe model is a computational model for identifying tire piles and discriminating between tire piles and other objects. The input to the TIRe model is the georeferenced but otherwise raw satellite spectral images of a geographic region to be surveyed

  12. BRVAAF and performance analysis for target detection

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    A bistatic range-velocity-acceleration ambiguity function (BRVAAF) is proposed. A model of radar measurements of an accelerating target involving the time delay, Doppler frequency and Doppler rate is given. The relationships between these measurements and the parameters of the bistatic geometry and the target position, velocity and acceleration are derived. Moreover, the effects of the bistatic geometry factors on these measurements are analyzed. In addition, the relationships of the bistatic integration loss and the bistatic optimum integration time with these factors are established, and their trends are described. This research is helpful for analyzing the influence of the bistatic geometry factors on target detection and signal processing.

  13. Process and apparatus for detecting presence of plant substances

    Energy Technology Data Exchange (ETDEWEB)

    Kirby, J.A.

    1990-01-01

    Disclosed is a process for detecting the presence of plant substances in a particular environment which comprises the steps of: (1) Measuring the background K40 gamma ray radiation level in a particular environment with a 1.46 MeV gamma ray counter system; (2) measuring the amount of K40 gamma ray radiation emanating from a package containing said plant substance being passed through said environment with said counter; and (3) generating an alarm signal when the total K40 gamma ray radiation reaches a predetermined level over and above the background level. Also disclosed is the apparatus and system used to conduct the process.

  15. Mesh Processing in Medical Image Analysis

    DEFF Research Database (Denmark)

    The following topics are dealt with: mesh processing; medical image analysis; interactive freeform modeling; statistical shape analysis; clinical CT images; statistical surface recovery; automated segmentation; cerebral aneurysms; and real-time particle-based representation.

  16. Differential thermal analysis microsystem for explosive detection

    DEFF Research Database (Denmark)

    Olsen, Jesper Kenneth; Greve, Anders; Senesac, L.

    2011-01-01

    A micro differential thermal analysis (DTA) system is used for detection of trace explosive particles. The DTA system consists of two silicon micro chips with integrated heaters and temperature sensors; one chip is used for reference and one for the measurement sample. The sensor is constructed as a small silicon nitride membrane incorporating heater elements and a temperature measurement resistor. In this manuscript the DTA system is described and tested by measuring the calorimetric response of three different kinds of explosives (TNT, RDX and PETN). This project is carried out under the framework…

  17. Automotive Parking Lot and Theft Detection through Image Processing

    Directory of Open Access Journals (Sweden)

    2013-10-01

    Automotive parking lot and theft detection through image processing is a smart parking-lot system that saves the owner time, allows parking in a more organized way, and also helps prevent theft of the car. It is a technology to optimize the checkout process by analysing a database of images of car number plates. The heart of the project is image processing. The images of number plates are detected by Matlab, and a picture of the driver is saved in a similar database. As soon as both images are saved, the garage entrance pole shifts 90 degrees upward using a DC motor and remains in that position for 30 seconds to allow the car to enter; after 30 seconds it returns to its previous position. When the car exits, the earlier steps are repeated and Matlab matches the two images taken on entering and leaving. Meanwhile, the seven-segment display shows that a car has left the parking lot by decrementing the displayed number. The system is controlled by a microcontroller, which is also able to detect and display whether a vacant parking space is available: if there is no vacancy a red LED lights up, whereas a green LED indicates that parking space is available, along with how many spots remain. The system is applicable to supermarket car parking lots and apartment garages.
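
    The Matlab matching step is not specified in the abstract; one simple stand-in is a normalized correlation score between the entry and exit plate images. A sketch under that assumption (both images grayscale and equally sized; the acceptance threshold is arbitrary):

        import numpy as np

        def plates_match(entry_img, exit_img, threshold=0.9):
            """Normalized correlation between two plate images;
            the 0.9 acceptance threshold is an assumption."""
            a = (entry_img - entry_img.mean()) / (entry_img.std() + 1e-9)
            b = (exit_img - exit_img.mean()) / (exit_img.std() + 1e-9)
            return float((a * b).mean()) >= threshold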

  19. Perturbation analysis of Poisson processes

    CERN Document Server

    Last, Günter

    2012-01-01

    We consider a Poisson process $\Phi$ on a general phase space. The expectation of a function of $\Phi$ can be considered as a functional of the intensity measure $\lambda$ of $\Phi$. Extending earlier results of Molchanov and Zuyev (2000) on finite Poisson processes, we study the behaviour of this functional under signed (possibly infinite) perturbations of $\lambda$. In particular we obtain general Margulis--Russo type formulas for the derivative with respect to non-linear transformations of the intensity measure depending on some parameter. As an application we study the behaviour of expectations of functions of multivariate pure jump L\'evy processes under perturbations of the L\'evy measure. A key ingredient of our approach is the explicit Fock space representation obtained in Last and Penrose (2011).
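
    As a rough illustration of the kind of formula meant here, for a family of intensity measures $\lambda_t$ differentiable in a parameter $t$ (the precise regularity and integrability conditions are those of the paper), a Margulis--Russo type formula reads

        $$\frac{d}{dt}\,\mathbb{E}\,f(\Phi_t) \;=\; \int \mathbb{E}\big[f(\Phi_t + \delta_x) - f(\Phi_t)\big]\,\lambda_t'(\mathrm{d}x),$$

    where $\Phi_t$ is a Poisson process with intensity measure $\lambda_t$ and the integrand is the expected add-one cost of a point at $x$.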

  20. Combustion Analysis and Knock Detection in Single Cylinder DI-Diesel Engine Using Vibration Signature Analysis

    OpenAIRE

    Y.V.V.SatyanarayanaMurthy

    2011-01-01

    The purpose of this paper is to detect the “knock” in diesel engines, which adversely deteriorates engine performance. The methodology introduced in the present work suggests a newly developed approach to the vibration analysis of diesel engines. The method is based on the fundamental relationship between the engine vibration pattern and the relative characteristics of the combustion process in each cylinder. Knock in a diesel engine is detected by measuring the vibra...

  1. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses and, based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  2. Exoplanetary Detection by Multifractal Spectral Analysis

    Science.gov (United States)

    Agarwal, Sahil; Del Sordo, Fabio; Wettlaufer, John S.

    2017-01-01

    Owing to technological advances, the number of exoplanets discovered has risen dramatically in the last few years. However, when trying to observe Earth analogs, it is often difficult to test the veracity of detection. We have developed a new approach to the analysis of exoplanetary spectral observations based on temporal multifractality, which identifies timescales that characterize planetary orbital motion around the host star and those that arise from stellar features such as spots. Without fitting stellar models to spectral data, we show how the planetary signal can be robustly detected from noisy data using noise amplitude as a source of information. For observation of transiting planets, combining this method with simple geometry allows us to relate the timescales obtained to the primary and secondary eclipses of the exoplanets. Making use of data obtained with ground-based and space-based observations, we have tested our approach on HD 189733b. Moreover, we have investigated the use of this technique in measuring planetary orbital motion via Doppler shift detection. Finally, we have analyzed synthetic spectra obtained using the SOAP 2.0 tool, which simulates a stellar spectrum and the influence of the presence of a planet or a spot on that spectrum over one orbital period. We have demonstrated that, so long as the signal-to-noise ratio is ≥ 75, our approach reconstructs the planetary orbital period, as well as the rotation period of a spot on the stellar surface.

  3. Bayesian analysis of multiple direct detection experiments

    Science.gov (United States)

    Arina, Chiara

    2014-12-01

    Bayesian methods offer a coherent and efficient framework for implementing uncertainties into induction problems. In this article, we review how this approach applies to the analysis of dark matter direct detection experiments. In particular we discuss the exclusion limit of XENON100 and the debated hints of detection under the hypothesis of a WIMP signal. Within parameter inference, marginalizing consistently over uncertainties to extract robust posterior probability distributions, we find that the claimed tension between XENON100 and the other experiments can be partially alleviated in the isospin-violating scenario, while the elastic scattering model appears to be compatible with the frequentist statistical approach. We then move to model comparison, for which Bayesian methods are particularly well suited. Firstly, we investigate the annual modulation seen in CoGeNT data, finding that there is weak evidence for a modulation. Modulation models due to other physics compare unfavorably with the WIMP models, paying the price for their excessive complexity. Secondly, we confront several coherent scattering models to determine the current best physical scenario compatible with the experimental hints. We find that exothermic and inelastic dark matter are moderately disfavored against the elastic scenario, while the isospin-violating model has similar evidence. Lastly, the Bayes factor gives inconclusive evidence for an incompatibility between the data sets of XENON100 and the hints of detection. The same question assessed with goodness of fit would indicate a 2σ discrepancy. This suggests that more data are needed to settle this question.

  4. Detection and analysis of CRISPRs of Shigella.

    Science.gov (United States)

    Guo, Xiangjiao; Wang, Yingfang; Duan, Guangcai; Xue, Zerun; Wang, Linlin; Wang, Pengfei; Qiu, Shaofu; Xi, Yuanlin; Yang, Haiyan

    2015-01-01

    The recently discovered CRISPRs (clustered regularly interspaced short palindromic repeats) and Cas (CRISPR-associated) proteins are a novel genetic barrier that limits horizontal gene transfer in prokaryotes, and the CRISPR loci provide a historical view of the exposure of prokaryotes to a variety of foreign genetic elements. The aim of this study was to investigate the occurrence and distribution of CRISPRs in Shigella. A collection of 61 Shigella strains was screened for the existence of CRISPRs. Three CRISPR loci were identified among the 61 strains. CRISPR1/cas loci were detected in 49 strains. However, IS elements were detected in the cas genes of some strains. In the remaining 12 Shigella flexneri strains, the CRISPR1/cas locus is deleted and only a cas3' pseudogene and a repeat sequence are present. The presence of CRISPR2 is frequently accompanied by the presence of CRISPR1. CRISPR3 loci were present in almost all strains (52/61). The length of the CRISPR arrays varied from 1 to 9 spacers. Sequence analysis of the CRISPR arrays revealed that few spacers had matches in the GenBank databases. However, one spacer in the CRISPR3 loci matches the cognate cas3 genes, and no cas gene was present around the CRISPR3 region. Analysis of the CRISPR sequences shows that the CRISPRs change little, which makes them poor genotyping markers. The present study is the first attempt to determine and analyze CRISPRs of Shigella isolated from clinical patients.

  5. Algorithms Development in Detection of the Gelatinization Process during Enzymatic ‘Dodol’ Processing

    Directory of Open Access Journals (Sweden)

    Azman Hamzah

    2007-11-01

    Computer vision systems have found wide application in the food processing industry to perform quality evaluation. These systems make it possible to replace human inspectors in the evaluation of a variety of quality attributes. This paper describes the implementation of Fast Fourier Transform and Kalman filtering algorithms to detect glutinous rice flour slurry (GRFS) gelatinization in an enzymatic ‘dodol’ processing. The onset of GRFS gelatinization is critical in determining the quality of an enzymatic ‘dodol’. Combinations of these two algorithms were able to detect the gelatinization of the GRFS. The results show that the gelatinization of the GRFS occurred in the time range of 11.75 minutes to 15.33 minutes for 20 batches of processing. This paper highlights the capability of computer vision using the proposed algorithms in monitoring and controlling an enzymatic ‘dodol’ processing via image processing technology.
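
    The abstract names the two algorithms but not their wiring; a minimal scalar Kalman filter of the kind that could smooth a noisy per-frame image feature (e.g., an FFT-derived texture measure) so that the gelatinization onset stands out is sketched below. The noise settings are assumptions:

        import numpy as np

        def kalman_smooth(measurements, q=1e-4, r=1e-2):
            """Scalar constant-state Kalman filter: smooths a noisy 1-D
            feature series so that its change point (gelatinization onset)
            stands out. q = process noise, r = measurement noise (assumed)."""
            x, p = measurements[0], 1.0
            out = []
            for z in measurements:
                p = p + q                 # predict
                k = p / (p + r)           # Kalman gain
                x = x + k * (z - x)       # update with measurement z
                p = (1.0 - k) * p
                out.append(x)
            return np.array(out)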

  6. Detecting Phishing Attacks In Purchasing Process Through Proactive Approach

    Directory of Open Access Journals (Sweden)

    S.Arun

    2012-06-01

    A monitor is a software system that observes and analyzes the behavior of a target system, determining qualities of interest such as whether the target system's requirements are satisfied. In modern technology, business processes are open and distributed, which may lead to failure; therefore, monitoring is an important task for the services that comprise these processes. We present a framework for multilevel monitoring of these service systems. The main objective of this project is monitoring the customer who purchases items from a merchant. Phishing is an online scam that attempts to defraud people of their personal information such as credit card or bank account information. We detect, locate and remove phishing e-mail. The customer details are stored in a web registry. We demonstrate how online business processes can be implemented with multiple scenarios that include monitoring open service policy commitments.

  7. Weighted Dickey-Fuller Processes for Detecting Stationarity

    CERN Document Server

    Steland, Ansgar

    2010-01-01

    Aiming at monitoring a time series to detect stationarity as soon as possible, we introduce monitoring procedures based on kernel-weighted sequential Dickey-Fuller (DF) processes, and related stopping times, which may be called weighted Dickey-Fuller control charts. Under rather weak assumptions, (functional) central limit theorems are established under the unit root null hypothesis and local-to-unity alternatives. For general dependent and heterogeneous innovation sequences the limit processes depend on a nuisance parameter. In this case of practical interest, one can use estimated control limits obtained from the estimated asymptotic law. Another easy-to-use approach is to transform the DF processes to obtain limit laws which are invariant with respect to the nuisance parameter. We provide asymptotic theory for both approaches and compare their statistical behavior in finite samples by simulation.
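
    The sequential, kernel-weighted construction is specific to the paper, but its building block is the ordinary Dickey-Fuller test applied to a growing sample. A rough sequential-monitoring sketch using the standard ADF test from statsmodels (the expanding-window scheme and the 0.05 level are assumptions, not the paper's procedure):

        from statsmodels.tsa.stattools import adfuller

        def first_stationary_time(series, min_window=50, alpha=0.05):
            """Apply the ADF unit-root test to expanding windows and stop
            at the first time the null of a unit root is rejected."""
            for t in range(min_window, len(series) + 1):
                pvalue = adfuller(series[:t])[1]
                if pvalue < alpha:
                    return t          # stationarity detected at time t
            return None               # never rejected within the sample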

  8. High efficiency processing for reduced amplitude zones detection in the HRECG signal

    Science.gov (United States)

    Dugarte, N.; Álvarez, A.; Balacco, J.; Mercado, G.; Gonzalez, A.; Dugarte, E.; Olivares, A.

    2016-04-01

    Summary: This article presents part of more detailed research proposed for the medium to long term, with the intention of establishing a new philosophy of surface electrocardiogram analysis. The research aims to find indicators of cardiovascular disease in its early stage that may go unnoticed with conventional electrocardiography. This paper reports the development of processing software that collects some existing techniques and incorporates novel methods for the detection of reduced amplitude zones (RAZ) in the high resolution electrocardiographic signal (HRECG). The algorithm consists of three stages: an efficient process for QRS detection, an averaging filter using correlation techniques, and a step for RAZ detection. Preliminary results show the efficiency of the system and point to the incorporation of new signal analysis techniques involving the 12 leads.
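
    As an illustration of the first stage only, QRS complexes can be located with a band-pass filter and a peak search; the sketch below uses scipy with assumed parameter values and is not the authors' algorithm:

        import numpy as np
        from scipy.signal import butter, filtfilt, find_peaks

        def detect_qrs(ecg, fs):
            """Locate QRS complexes: band-pass 5-15 Hz, square, peak-pick.
            The height threshold and 0.3 s refractory period are assumptions."""
            b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
            filtered = filtfilt(b, a, ecg)
            energy = filtered ** 2
            peaks, _ = find_peaks(energy,
                                  height=0.3 * energy.max(),
                                  distance=int(0.3 * fs))
            return peaks  # sample indices of QRS candidates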

  9. Hypernasal Speech Detection by Acoustic Analysis of Unvoiced Plosive Consonants

    Directory of Open Access Journals (Sweden)

    Alexander Sepúlveda-Sepúlveda

    2009-12-01

    People with a defective velopharyngeal mechanism speak with abnormal nasal resonance (hypernasal speech). Voice analysis methods for hypernasality detection commonly use vowels and nasalized vowels. However, to obtain a more general assessment of this abnormality it is necessary to analyze stops and fricatives. This study describes a method with high generalization capability for hypernasality detection that analyzes unvoiced Spanish stop consonants. The importance of phoneme-by-phoneme analysis is shown, in contrast with whole-word parametrization, which includes segments that are irrelevant from the classification point of view. Parameters that correlate with the imprints of velopharyngeal incompetence (VPI) on voiceless stop consonants were used in the feature estimation stage. Classification was carried out using a support vector machine (SVM), including the Rademacher complexity model with the aim of increasing the generalization capability. Performances of 95.2% and 92.7% were obtained in the processing and verification stages for a repeated cross-validation classifier evaluation.
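
    A minimal version of the classification stage with scikit-learn; the feature extraction and the Rademacher complexity control are outside this sketch, and the data names are placeholders:

        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        # X: acoustic features per stop consonant, y: 0 = normal, 1 = hypernasal
        def evaluate_svm(X, y):
            """Cross-validated accuracy of an RBF-kernel SVM classifier."""
            clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
            scores = cross_val_score(clf, X, y, cv=5)
            return scores.mean()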

  10. Timely online chatter detection in end milling process

    Science.gov (United States)

    Fu, Yang; Zhang, Yun; Zhou, Huamin; Li, Dequn; Liu, Hongqi; Qiao, Haiyu; Wang, Xiaoqiang

    2016-06-01

    Chatter is one of the most unexpected and uncontrollable phenomena in milling operations. It is very important to develop an effective monitoring method that identifies chatter as soon as possible, since existing methods still cannot detect it before the workpiece has been damaged. This paper proposes an energy aggregation characteristic-based Hilbert-Huang transform method for online chatter detection. The measured vibration signal is first decomposed into a series of intrinsic mode functions (IMFs) using ensemble empirical mode decomposition. Feature IMFs are then selected according to the majority-energy rule. Subsequently, Hilbert spectral analysis is applied to these feature IMFs to calculate the Hilbert time-frequency spectrum. Two indicators are proposed to quantify the spectrum, and thresholds are automatically calculated using a Gaussian mixture model. Milling experiments prove the proposed method to be effective in protecting the workpiece from severe chatter damage within acceptable time complexity.
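
    The EMD step needs a dedicated library, but the Hilbert spectral analysis applied to each feature IMF reduces to the analytic signal. A sketch of the instantaneous amplitude/frequency extraction with scipy (the IMF input is assumed to come from an EEMD implementation; the paper's two indicators are not reproduced):

        import numpy as np
        from scipy.signal import hilbert

        def hilbert_spectrum(imf, fs):
            """Instantaneous amplitude and frequency of one IMF via the
            analytic signal; these feed the chatter indicators."""
            analytic = hilbert(imf)
            amplitude = np.abs(analytic)
            phase = np.unwrap(np.angle(analytic))
            inst_freq = np.diff(phase) * fs / (2.0 * np.pi)
            return amplitude[:-1], inst_freq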

  11. Vision based error detection for 3D printing processes

    Directory of Open Access Journals (Sweden)

    Baumann Felix

    2016-01-01

    3D printers became more popular in the last decade, partly because of the expiration of key patents and the supply of affordable machines. Their origin lies in rapid prototyping. With Additive Manufacturing (AM) it is possible to create physical objects from 3D model data by layer-wise addition of material. Besides professional use for prototyping and low-volume manufacturing, they are becoming widespread amongst end users, starting with the so-called Maker Movement. The most prevalent type of consumer-grade 3D printer uses Fused Deposition Modelling (FDM), also called Fused Filament Fabrication (FFF). This work focuses on FDM machinery because of its widespread occurrence and large number of open problems such as precision and failure. These 3D printers can fail to print objects at a statistical rate depending on the manufacturer and model of the printer. Failures can occur due to misalignment of the print bed or the print head, slippage of the motors, warping of the printed material, lack of adhesion, or other reasons. The goal of this research is to provide an environment in which these failures can be detected automatically. Direct supervision is inhibited by the recommended placement of FDM printers in separate rooms away from the user due to ventilation issues. The inability to oversee the printing process leads to late or omitted detection of failures. Rejects cause material waste and wasted time, thus lowering the utilization of printing resources. Our approach consists of a camera-based error detection mechanism that provides a web-based interface for remote supervision and early failure detection. Early failure detection can lead to reduced time spent on broken prints, less material wasted, and in some cases salvaged objects.
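
    The paper's detection algorithm is not given in this abstract; a common baseline for camera-based supervision is to compare the current frame against a reference of a known-good state and alarm on excessive change. A sketch with OpenCV (both thresholds are assumptions):

        import cv2

        def print_deviates(reference_path, current_path, pixel_thresh=40,
                           area_frac=0.05):
            """Flag a failed print when too many pixels differ from the
            reference image of a known-good state."""
            ref = cv2.imread(reference_path, cv2.IMREAD_GRAYSCALE)
            cur = cv2.imread(current_path, cv2.IMREAD_GRAYSCALE)
            diff = cv2.absdiff(ref, cur)
            _, mask = cv2.threshold(diff, pixel_thresh, 255, cv2.THRESH_BINARY)
            changed = cv2.countNonZero(mask) / mask.size
            return changed > area_frac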

  12. Advances in face detection and facial image analysis

    CERN Document Server

    Celebi, M; Smolka, Bogdan

    2016-01-01

    This book presents the state-of-the-art in face detection and analysis. It outlines new research directions, including in particular psychology-based facial dynamics recognition, aimed at various applications such as behavior analysis, deception detection, and diagnosis of various psychological disorders. Topics of interest include face and facial landmark detection, face recognition, facial expression and emotion analysis, facial dynamics analysis, face classification, identification, and clustering, and gaze direction and head pose estimation, as well as applications of face analysis.

  13. Risk Analysis for Tea Processing

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    It is obvious, after all the disasters with dioxins, BSE, pathogens, Foot and Mouth disease and others, and now also because of the possibilities of bioterrorism, that Food Safety is almost at the top of the EU agenda for the years to come. The implementation of certain hygiene principles such as HACCP and a transparent hygiene policy applicable to all food and all food operators, from the farm to the table, together with effective instruments to manage Food Safety, will form a substantial part of this agenda. As an example, external quality factors such as certain pathogens in tea will be discussed. Since risk analysis of, e.g., mycotoxins already has a quite long history of development in several international bodies, and tea might bear unwanted contaminants (or ones deliberately added by terrorist action), the need to monitor tea much more regularly than is done today seems to be a "conditio sine qua non". Recently developed Immuno Flow tests may one day help the consumer to find out if he is being poisoned.

  14. Nonlinear Process Fault Diagnosis Based on Serial Principal Component Analysis.

    Science.gov (United States)

    Deng, Xiaogang; Tian, Xuemin; Chen, Sheng; Harris, Chris J

    2016-12-22

    Many industrial processes contain both linear and nonlinear parts, and kernel principal component analysis (KPCA), widely used in nonlinear process monitoring, may not offer the most effective means for dealing with these nonlinear processes. This paper proposes a new hybrid linear-nonlinear statistical modeling approach for nonlinear process monitoring by closely integrating linear principal component analysis (PCA) and nonlinear KPCA using a serial model structure, which we refer to as serial PCA (SPCA). Specifically, PCA is first applied to extract PCs as linear features, and to decompose the data into the PC subspace and residual subspace (RS). Then, KPCA is performed in the RS to extract the nonlinear PCs as nonlinear features. Two monitoring statistics are constructed for fault detection, based on both the linear and nonlinear features extracted by the proposed SPCA. To effectively perform fault identification after a fault is detected, an SPCA similarity factor method is built for fault recognition, which fuses both the linear and nonlinear features. Unlike PCA and KPCA, the proposed method takes into account both linear and nonlinear PCs simultaneously, and therefore, it can better exploit the underlying process's structure to enhance fault diagnosis performance. Two case studies involving a simulated nonlinear process and the benchmark Tennessee Eastman process demonstrate that the proposed SPCA approach is more effective than the existing state-of-the-art approach based on KPCA alone, in terms of nonlinear process fault detection and identification.
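
    A minimal sketch of the serial structure described above, using scikit-learn; the component counts and kernel choice are assumptions, and the paper's monitoring statistics and control limits are not reproduced:

        from sklearn.decomposition import PCA, KernelPCA

        def fit_spca(X, n_linear=5, n_nonlinear=5):
            """Serial PCA: linear PCA first, then kernel PCA on the
            residual subspace left unexplained by the linear part."""
            pca = PCA(n_components=n_linear).fit(X)
            X_hat = pca.inverse_transform(pca.transform(X))   # linear reconstruction
            residual = X - X_hat                              # residual subspace data
            kpca = KernelPCA(n_components=n_nonlinear, kernel="rbf").fit(residual)
            return pca, kpca

        def spca_features(pca, kpca, X):
            """Linear and nonlinear features for monitoring new data."""
            linear = pca.transform(X)
            residual = X - pca.inverse_transform(linear)
            nonlinear = kpca.transform(residual)
            return linear, nonlinear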

  15. Process and apparatus for detecting presence of plant substances

    Energy Technology Data Exchange (ETDEWEB)

    Kirby, J.A.

    1991-04-16

    This patent describes an apparatus and process for detecting the presence of plant substances in a particular environment. It comprises: measuring the background K40 gamma ray radiation level in a particular environment with a 1.46 MeV gamma ray counter system; measuring the amount of K40 gamma ray radiation emanating from a package containing a plant substance being passed through an environment with a counter; and generating an alarm signal when the total K40 gamma ray radiation reaches a predetermined level over and above the background level.

  16. Sensitivity Analysis of Automated Ice Edge Detection

    Science.gov (United States)

    Moen, Mari-Ann N.; Isaksem, Hugo; Debien, Annekatrien

    2016-08-01

    The importance of highly detailed and time-sensitive ice charts has increased with the growing interest in the Arctic for oil and gas, tourism, and shipping. Manual ice charts are prepared by the national ice services of several Arctic countries, and methods are also being developed to automate this task. Kongsberg Satellite Services uses a method that detects ice edges within 15 minutes after image acquisition. This paper describes a sensitivity analysis of the ice edge, assessing which ice concentration class from the manual ice charts it can be compared to. The ice edge is derived using the Ice Tracking from SAR Images (ITSARI) algorithm. RADARSAT-2 images of February 2011 are used, both for the manual ice charts and the automatic ice edges. The results show that the KSAT ice edge lies within ice concentration classes with very low ice concentration or open water.

  17. LEXICAL ANALYSIS TO EFFECTIVELY DETECT USERS’ OPINION

    Directory of Open Access Journals (Sweden)

    Anil Kumar K.M

    2011-11-01

    In this paper we present a lexical approach that identifies the opinion of web users popularly expressed using short words or SMS words. These words are pretty popular with diverse web users and are used for expressing their opinion on the web. The study of opinion from the web arises from the need to know the diverse opinions of web users. The opinions expressed by web users may be on diverse topics such as politics, sports, products, and movies. These opinions are very useful to others, such as leaders of political parties, selection committees of various sports, business analysts and other stakeholders of products, and directors and producers of movies, as well as other concerned web users. We use a semantic-based approach to find users' opinions from short words or SMS words apart from regular opinionated phrases. Our approach efficiently detects opinion from opinionated texts using lexical analysis and is found to be better than other approaches on different data sets.

  18. Elastic recoil detection analysis of ferroelectric films

    Energy Technology Data Exchange (ETDEWEB)

    Stannard, W.B.; Johnston, P.N.; Walker, S.R.; Bubb, I.F. [Royal Melbourne Inst. of Tech., VIC (Australia); Scott, J.F. [New South Wales Univ., Kensington, NSW (Australia); Cohen, D.D.; Dytlewski, N. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1996-12-31

    There has been considerable progress in developing SrBi{sub 2}Ta{sub 2}O{sub 9} (SBT) and Ba{sub 0.7}Sr{sub 0.3}TiO{sub 3} (BST) ferroelectric films for use as nonvolatile memory chips and for capacitors in dynamic random access memories (DRAMs). Ferroelectric materials have a very large dielectric constant ({approx} 1000), approximately one hundred times greater than that of silicon dioxide. Devices made from these materials have been known to experience breakdown after repeated voltage pulsing. It has been suggested that this is related to stoichiometric changes within the material. To accurately characterise these materials, Elastic Recoil Detection Analysis (ERDA) is being developed. This technique employs a high energy heavy ion beam to eject nuclei from the target and uses a time of flight and energy dispersive (ToF-E) detector telescope to detect these nuclei. The recoil nuclei carry both energy and mass information, which enables the determination of separate energy spectra for individual elements or for small groups of elements. In this work ERDA employing 77 MeV {sup 127}I ions has been used to analyse strontium bismuth tantalate thin films at the heavy ion recoil facility at ANSTO, Lucas Heights. 9 refs., 5 figs.

  19. Infrared landmine detection and thermal model analysis

    NARCIS (Netherlands)

    Schwering, P.B.W.; Kokonozi, A.; Carter, L.J.; Lensen, H.A.; Franken, E.M.

    2001-01-01

    Infrared imagers are capable of the detection of surface-laid mines. Several sensor-fused landmine detection systems make use of metal detectors, ground penetrating radar and infrared imagers. Infrared detection systems are sensitive to apparent temperature contrasts and their detection capabilities…

  20. Bisous model-Detecting filamentary patterns in point processes

    Science.gov (United States)

    Tempel, E.; Stoica, R. S.; Kipper, R.; Saar, E.

    2016-07-01

    The cosmic web is a highly complex geometrical pattern, with galaxy clusters at the intersection of filaments and filaments at the intersection of walls. Identifying and describing the filamentary network is not a trivial task due to the overwhelming complexity of the structure, its connectivity and its intrinsic hierarchical nature. To detect and quantify galactic filaments we use the Bisous model, which is a marked point process built to model multi-dimensional patterns. The Bisous filament finder works directly with the galaxy distribution data and the model intrinsically takes into account the connectivity of the filamentary network. The Bisous model generates the visit map (the probability of finding a filament at a given point) together with the filament orientation field. Using these two fields, we can extract filament spines from the data. Together with this paper we publish the computer code for the Bisous model, which is made available on GitHub. The Bisous filament finder has been successfully used in several cosmological applications, and further development of the model will allow the filamentary network to be detected also in photometric redshift surveys, using the full redshift posterior. We also want to encourage the astro-statistical community to use the model and to connect it with all other existing methods for filamentary pattern detection and characterisation.

  1. Coherent detection and digital signal processing for fiber optic communications

    Science.gov (United States)

    Ip, Ezra

    The drive towards higher spectral efficiency in optical fiber systems has generated renewed interest in coherent detection. We review different detection methods, including noncoherent, differentially coherent, and coherent detection, as well as hybrid detection methods. We compare the modulation methods that are enabled and their respective performances in a linear regime. An important system parameter is the number of degrees of freedom (DOF) utilized in transmission. Polarization-multiplexed quadrature-amplitude modulation maximizes spectral efficiency and power efficiency as it uses all four available DOF contained in the two field quadratures in the two polarizations. Dual-polarization homodyne or heterodyne downconversion are linear processes that can fully recover the received signal field in these four DOF. When downconverted signals are sampled at the Nyquist rate, compensation of transmission impairments can be performed using digital signal processing (DSP). Software-based receivers benefit from the robustness of DSP, flexibility in design, and ease of adaptation to time-varying channels. Linear impairments, including chromatic dispersion (CD) and polarization-mode dispersion (PMD), can be compensated quasi-exactly using finite impulse response filters. In practical systems, sampling the received signal at 3/2 times the symbol rate is sufficient to enable an arbitrary amount of CD and PMD to be compensated by a sufficiently long equalizer whose tap length scales linearly with transmission distance. Depending on the transmitted constellation and the target bit error rate, the analog-to-digital converter (ADC) should have around 5 to 6 bits of resolution. Digital coherent receivers are naturally suited for the implementation of feedforward carrier recovery, which has better linewidth tolerance than phase-locked loops and does not suffer from feedback delay constraints. Differential bit encoding can be used to prevent catastrophic receiver failure due…
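
    For reference, the frequency-domain all-pass response commonly used in the digital coherent receiver literature to undo chromatic dispersion accumulated over a fiber of length $z$ is (the sign depends on the Fourier-transform convention; $f$ is the frequency relative to the carrier, $D$ the dispersion parameter, $\lambda$ the carrier wavelength, $c$ the speed of light)

        $$G(z,f) \;=\; \exp\!\left(-j\,\frac{\pi \lambda^{2} D z}{c}\,f^{2}\right).$$

    An FIR equalizer approximating $G$ needs a tap count that grows linearly with $Dz$, consistent with the linear scaling of tap length with transmission distance mentioned above.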

  2. Systematic sustainable process design and analysis of biodiesel processes

    DEFF Research Database (Denmark)

    Mansouri, Seyed Soheil; Ismail, Muhammad Imran; Babi, Deenesh Kavi

    2013-01-01

    … process intensification opportunities. This work focuses on three main aspects that have been incorporated into a systematic computer-aided framework for sustainable process design. First, the creation of a generic superstructure, which consists of all possible process alternatives based on available technology. Second, the evaluation of this superstructure for systematic screening to obtain an appropriate base case design. This is done by first reducing the search space using a sustainability analysis, which provides key indicators for process bottlenecks of different flowsheet configurations, and then by further reducing the search space by using economic evaluation and life cycle assessment. Third, the determination of sustainable design with/without process intensification using a phenomena-based synthesis/design method. A detailed step by step application of the framework is highlighted through…

  3. Effects of image processing on the detective quantum efficiency

    Science.gov (United States)

    Park, Hye-Suk; Kim, Hee-Joung; Cho, Hyo-Min; Lee, Chang-Lae; Lee, Seung-Wan; Choi, Yu-Na

    2010-04-01

    Digital radiography has gained popularity in many areas of clinical practice. This transition brings interest in advancing the methodologies for image quality characterization. However, as the methodologies for such characterizations have not been standardized, the results of these studies cannot be directly compared. The primary objective of this study was to standardize methodologies for image quality characterization. The secondary objective was to evaluate factors affecting the modulation transfer function (MTF), the noise power spectrum (NPS), and the detective quantum efficiency (DQE) according to the image processing algorithm. Image performance parameters such as MTF, NPS, and DQE were evaluated using the international electro-technical commission (IEC 62220-1)-defined RQA5 radiographic techniques. Computed radiography (CR) images of the hand in posterior-anterior (PA) projection for measuring the signal to noise ratio (SNR), slit images for measuring the MTF, and white images for measuring the NPS were obtained, and various Multi-Scale Image Contrast Amplification (MUSICA) parameters were applied to each of the acquired images. The results showed that all of the modifications considerably influenced the evaluated SNR, MTF, NPS, and DQE. Images modified by the post-processing had higher DQE than the MUSICA=0 image. This suggests that MUSICA values, as a post-processing step, have an effect on the image when its quality is being evaluated. In conclusion, the control parameters of image processing should be accounted for when characterizing image quality. The results of this study can serve as a baseline for evaluating imaging systems and their imaging characteristics by measuring the MTF, NPS, and DQE.
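
    For context, the quantity measured here is commonly computed from the measured MTF and the normalized noise power spectrum (NNPS), with $\bar{q}$ the photon fluence incident on the detector for the IEC-defined beam quality; this standard relation, not a result of the study, is

        $$\mathrm{DQE}(f) \;=\; \frac{\mathrm{MTF}^{2}(f)}{\bar{q}\;\mathrm{NNPS}(f)}.$$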

  4. Edge detection - Image-plane versus digital processing

    Science.gov (United States)

    Huck, Friedrich O.; Fales, Carl L.; Park, Stephen K.; Triplett, Judith A.

    1987-01-01

    To optimize edge detection with the familiar Laplacian-of-Gaussian operator, it has become common to implement this operator with a large digital convolution mask followed by some interpolation of the processed data to determine the zero crossings that locate edges. It is generally recognized that this large mask causes substantial blurring of fine detail. It is shown that the spatial detail can be improved by a factor of about four with either the Wiener-Laplacian-of-Gaussian filter or an image-plane processor. The Wiener-Laplacian-of-Gaussian filter minimizes the image-gathering degradations if the scene statistics are at least approximately known and also serves as an interpolator to determine the desired zero crossings directly. The image-plane processor forms the Laplacian-of-Gaussian response by properly combining the optical design of the image-gathering system with a minimal three-by-three lateral-inhibitory processing mask. This approach, which is suggested by Marr's model of early processing in human vision, also reduces data processing by about two orders of magnitude and data transmission by up to an order of magnitude.
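
    The Laplacian-of-Gaussian response itself is easy to reproduce; a short sketch locating zero crossings with scipy (the smoothing scale sigma is an arbitrary choice here, not one of the paper's designs):

        import numpy as np
        from scipy.ndimage import gaussian_laplace

        def log_edges(image, sigma=2.0):
            """Laplacian-of-Gaussian filtering followed by a zero-crossing
            test against the right/lower neighbors; edges are True pixels."""
            response = gaussian_laplace(image.astype(float), sigma)
            sign = response > 0
            zc = np.zeros_like(sign)
            zc[:-1, :] |= sign[:-1, :] != sign[1:, :]   # vertical sign change
            zc[:, :-1] |= sign[:, :-1] != sign[:, 1:]   # horizontal sign change
            return zc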

  5. On damage detection in wind turbine gearboxes using outlier analysis

    Science.gov (United States)

    Antoniadou, Ifigeneia; Manson, Graeme; Dervilis, Nikolaos; Staszewski, Wieslaw J.; Worden, Keith

    2012-04-01

    The proportion of worldwide installed wind power in power systems increases over the years as a result of the steadily growing interest in renewable energy sources. Still, the advantages offered by the use of wind power are overshadowed by the high operational and maintenance costs, resulting in the low competitiveness of wind power in the energy market. In order to reduce the costs of corrective maintenance, the application of condition monitoring to gearboxes becomes highly important, since gearboxes are among the wind turbine components with the most frequent failure observations. While condition monitoring of gearboxes in general is common practice, with various methods having been developed over the last few decades, wind turbine gearbox condition monitoring faces a major challenge: the detection of faults under the time-varying load conditions prevailing in wind turbine systems. Classical time and frequency domain methods fail to detect faults under variable load conditions, due to the temporary effect that these faults have on vibration signals. This paper uses the statistical discipline of outlier analysis for the damage detection of gearbox tooth faults. A simplified two-degree-of-freedom gearbox model considering nonlinear backlash, time-periodic mesh stiffness and static transmission error, simulates the vibration signals to be analysed. Local stiffness reduction is used for the simulation of tooth faults and statistical processes determine the existence of intermittencies. The lowest level of fault detection, the threshold value, is considered and the Mahalanobis squared-distance is calculated for the novelty detection problem.
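
    The novelty-detection step reduces to a Mahalanobis squared-distance of test features from the statistics of the healthy condition; a minimal sketch (in practice the detection threshold is set from the training data):

        import numpy as np

        def mahalanobis_sq(features, mean, cov):
            """Mahalanobis squared-distance of feature vectors from the
            healthy-condition mean/covariance; large values flag damage."""
            inv_cov = np.linalg.inv(cov)
            diff = features - mean
            return np.einsum("ij,jk,ik->i", diff, inv_cov, diff)

        # training: healthy-condition feature matrix X_healthy (n x d)
        # mean, cov = X_healthy.mean(axis=0), np.cov(X_healthy, rowvar=False)
        # outliers = mahalanobis_sq(X_test, mean, cov) > threshold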

  6. Three-dimensional model analysis and processing

    CERN Document Server

    Yu, Faxin; Luo, Hao; Wang, Pinghui

    2011-01-01

    This book focuses on five hot research directions in 3D model analysis and processing in computer science:  compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.

  7. Exergetic analysis of coal gasification processes

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, P.; Conger, W.L.

    1980-12-01

    In this study, the efficiency and economics of the Synthane Gasification process are evaluated and discussed. The efficiency of the Synthane process was determined using the exergy analysis (availability) approach to process evaluation. The exergy analysis utilizes both the first and second laws of thermodynamics to determine the efficiency of a process, and is very useful in determining the causes of inefficiency. In order to accurately apply the exergy analysis, it is essential that the absolute enthalpy and entropy values of each stream be determined. In this study, previously developed methods for predicting the enthalpy and entropy of coal, char, tar, and ash as a function of temperature and material composition were used. A computer simulation of the Synthane process was developed which allowed for the effect of changes in plant operating parameters on both the efficiency and economics. The simulation included a three-section model of the Synthane fluidized bed gasifier.
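
    The stream-by-stream bookkeeping rests on the standard flow (physical) exergy expression; this textbook relation, not a formula specific to the study, is

        $$e \;=\; (h - h_{0}) - T_{0}\,(s - s_{0}),$$

    where $h$ and $s$ are the specific enthalpy and entropy of the stream and the subscript $0$ denotes the environmental reference state (chemical exergy terms are added for reacting streams such as coal).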

  8. Meta-analysis using Dirichlet process.

    Science.gov (United States)

    Muthukumarana, Saman; Tiwari, Ram C

    2016-02-01

    This article develops a Bayesian approach for meta-analysis using the Dirichlet process. The key aspect of the Dirichlet process in meta-analysis is the ability to assess evidence of statistical heterogeneity or variation in the underlying effects across study while relaxing the distributional assumptions. We assume that the study effects are generated from a Dirichlet process. Under a Dirichlet process model, the study effects parameters have support on a discrete space and enable borrowing of information across studies while facilitating clustering among studies. We illustrate the proposed method by applying it to a dataset on the Program for International Student Assessment on 30 countries. Results from the data analysis, simulation studies, and the log pseudo-marginal likelihood model selection procedure indicate that the Dirichlet process model performs better than conventional alternative methods.
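
    For readers unfamiliar with the prior, the Dirichlet process admits the stick-breaking representation (concentration parameter $\alpha$, base distribution $G_0$), which is what produces the discrete support and the clustering of study effects mentioned above:

        $$G = \sum_{k=1}^{\infty} \pi_k\,\delta_{\theta_k}, \qquad \pi_k = \beta_k \prod_{j<k}(1-\beta_j), \qquad \beta_k \sim \mathrm{Beta}(1,\alpha), \qquad \theta_k \sim G_0.$$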

  9. New Approach of Envelope Dynamic Analysis for Milling Process

    CERN Document Server

    Bisu, Claudiu-Florinel; Gérard, Alain; Vijelea, V; Anica, Marin

    2012-01-01

    This paper proposes a vibration analysis method for online monitoring of milling process quality. Adapting envelope analysis to characterize the milling tool materials is an important contribution to the qualitative and quantitative characterization of milling capacity and a step toward modeling the three-dimensional cutting process. An experimental protocol was designed and developed for the acquisition, processing and analysis of the three-dimensional signal. Vibration envelope analysis is proposed to detect the cutting capacity of the tool, with application to the optimization of cutting parameters. The research is focused on optimization of the FFT-based vibration analysis and the vibration envelope to evaluate the dynamic behavior of the machine/tool/workpiece system.
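
    Envelope analysis has a compact standard form: take the analytic signal, keep its magnitude, and inspect the spectrum of that envelope. A numpy/scipy sketch (this is the generic technique, not the authors' exact protocol):

        import numpy as np
        from scipy.signal import hilbert

        def envelope_spectrum(vibration, fs):
            """Amplitude envelope via the analytic signal, then its FFT;
            peaks in this spectrum reveal modulation from the cutting process."""
            envelope = np.abs(hilbert(vibration))
            envelope -= envelope.mean()                 # drop the DC component
            spectrum = np.abs(np.fft.rfft(envelope))
            freqs = np.fft.rfftfreq(len(envelope), d=1.0 / fs)
            return freqs, spectrum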

  10. Particle contamination formation and detection in magnetron sputtering processes

    Energy Technology Data Exchange (ETDEWEB)

    Selwyn, G.S. [Los Alamos National Lab., NM (United States); Weiss, C.A. [Materials Research Corp., Congers, NY (United States). Sputtering Systems Div.; Sequeda, F.; Huang, C. [Seagate Peripherals Disk Div., Milpitas, CA (United States)

    1996-10-01

    Defects caused by particulate contamination are an important concern in the fabrication of thin film products, for which magnetron sputtering processes are often used. Particle contamination can cause electrical shorting, pinholes, problems with photolithography, adhesion failure, as well as visual and cosmetic defects. Particle contamination generated during thin film processing can be detected using laser light scattering, a powerful diagnostic technique that provides real-time, in situ imaging of particles > 0.3 {mu}m in diameter. Using this technique, the causes, sources and influences on particles in plasma and non-plasma processes may be independently evaluated and corrected. Several studies employing laser light scattering have demonstrated both homogeneous and heterogeneous causes of particle contamination. In this paper, we demonstrate that the mechanisms for particle generation, transport and trapping during magnetron sputter deposition are different from the mechanisms reported in previously studied plasma etch processes. During magnetron sputter deposition, one source of particle contamination is linked to portions of the sputtering target surface exposed to weaker plasma density. In this region, film redeposition is followed by filament or nodule growth and enhanced trapping, which increases filament growth. Eventually the filaments effectively "short circuit" the sheath, causing high currents to flow through these features. This, in turn, causes heating failure of the filament, fracturing and ejecting the filaments into the plasma and onto the substrate. Evidence of this effect has been observed in semiconductor (IC) fabrication and storage disk manufacturing. Discovery of this mechanism in both technologies suggests that it may be universal to many sputtering processes.

  11. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus Plenge

    2005-01-01

    Discussion of the paper "Residual analysis for spatial point processes" by A. Baddeley, M. Hazelton, J. Møller and R. Turner. Journal of the Royal Statistical Society, Series B, vol. 67, pages 617-666, 2005.

  12. Effects of image processing on the detective quantum efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Park, Hye-Suk; Kim, Hee-Joung; Cho, Hyo-Min; Lee, Chang-Lae; Lee, Seung-Wan; Choi, Yu-Na [Yonsei University, Wonju (Korea, Republic of)

    2010-02-15

    The evaluation of image quality is an important part of digital radiography. The modulation transfer function (MTF), the noise power spectrum (NPS), and the detective quantum efficiency (DQE) are widely accepted measurements of digital radiographic system performance. However, as the methodologies for such characterization have not been standardized, it is difficult to directly compare reported MTF, NPS, and DQE results. In this study, we evaluated the effect of an image processing algorithm on estimates of the MTF, NPS, and DQE. The image performance parameters were evaluated using the international electro-technical commission (IEC 62220-1)-defined RQA5 radiographic techniques. Computed radiography (CR) posterior-anterior (PA) images of a hand for measuring the signal to noise ratio (SNR), slit images for measuring the MTF, and white images for measuring the NPS were obtained, and various Multi-Scale Image Contrast Amplification (MUSICA) factors were applied to each of the acquired images. All of the modifications of the images obtained by using image processing had a considerable influence on the evaluated image quality. In conclusion, the control parameters of image processing should be accounted for when characterizing image quality. The results of this study should serve as a baseline for evaluating imaging systems and their imaging characteristics by MTF, NPS, and DQE measurements.

  13. Theoretical analysis of radiographic images by nonstationary Poisson processes

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, K.; Uchida, S. (Gifu Univ. (Japan)); Yamada, I.

    1980-12-01

    This paper deals with the noise analysis of radiographic images obtained in the usual fluorescent screen-film system. The theory of nonstationary Poisson processes is applied to the analysis of the radiographic images containing the object information. The ensemble averages, the autocorrelation functions, and the Wiener spectrum densities of the light-energy distribution at the fluorescent screen and of the film optical-density distribution are obtained. The detection characteristics of the system are evaluated theoretically. Numerical examples of the one-dimensional image are shown and the results are compared with those obtained under the assumption that the object image is related to the background noise by the additive process.

  14. Theoretical Analysis of Radiographic Images by Nonstationary Poisson Processes

    Science.gov (United States)

    Tanaka, Kazuo; Yamada, Isao; Uchida, Suguru

    1980-12-01

    This paper deals with the noise analysis of radiographic images obtained in the usual fluorescent screen-film system. The theory of nonstationary Poisson processes is applied to the analysis of the radiographic images containing the object information. The ensemble averages, the autocorrelation functions, and the Wiener spectrum densities of the light-energy distribution at the fluorescent screen and of the film optical-density distribution are obtained. The detection characteristics of the system are evaluated theoretically. Numerical examples of the one-dimensional image are shown and the results are compared with those obtained under the assumption that the object image is related to the background noise by the additive process.

  15. A KPI-based process monitoring and fault detection framework for large-scale processes.

    Science.gov (United States)

    Zhang, Kai; Shardt, Yuri A W; Chen, Zhiwen; Yang, Xu; Ding, Steven X; Peng, Kaixiang

    2017-02-09

    Large-scale processes, consisting of multiple interconnected subprocesses, are commonly encountered in industrial systems, whose performance needs to be determined. A common approach to this problem is to use a key performance indicator (KPI)-based approach. However, the different KPI-based approaches have not been developed within a coherent and consistent framework. Thus, this paper proposes a framework for KPI-based process monitoring and fault detection (PM-FD) for large-scale industrial processes, which considers the static and dynamic relationships between process and KPI variables. For the static case, a least squares-based approach is developed that provides an explicit link with least-squares regression, which gives better performance than partial least squares. For the dynamic case, using the kernel representation of each subprocess, an instrumental variable is used to reduce the dynamic case to the static case. This framework is applied to the TE benchmark process and the hot strip mill rolling process. The results show that the proposed method can detect faults better than previous methods.
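
    For the static case, the least-squares link can be sketched directly: regress the KPI on the process variables using fault-free data, then monitor the prediction residual. A minimal numpy version (the residual threshold would be set from training data; this is not the paper's full statistic):

        import numpy as np

        def fit_kpi_model(X, y):
            """Least-squares map from process variables X to KPI y,
            learned on fault-free training data (intercept included)."""
            A = np.column_stack([X, np.ones(len(X))])
            theta, *_ = np.linalg.lstsq(A, y, rcond=None)
            return theta

        def kpi_residual(theta, X, y):
            """Monitoring statistic: deviation of the observed KPI from
            its least-squares prediction; large values indicate a fault."""
            A = np.column_stack([X, np.ones(len(X))])
            return y - A @ theta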

  16. Artificial intelligence applied to process signal analysis

    Science.gov (United States)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  17. Process analysis. Prozessanalyse. Elementare stochastische Methoden

    Energy Technology Data Exchange (ETDEWEB)

    Dreyer, H.; Sauer, W.

    1982-01-01

    Process analysis deals with the analysis of technological processes, methods and equipment for fabricating products, whether still under development or already applied industrially. In technological processes, physical or chemical relations or effects are preferably used for intended changes of the properties or shape of materials. They are designed in such a way that a large group or class of products can be fabricated. In order to use technological processes at an industrial scale, the corresponding technological equipment must be produced; the term technological equipment includes machines, plant, facilities, aggregates, etc. The technological process serves the direct industrial fabrication of products. It represents the planned interrelation between technological equipment, man and product. It therefore includes a wide range of problems and tasks and thus constitutes the subject of technology in a comprehensive sense.

  18. Vygotsky's Analysis of Children's Meaning Making Processes

    Science.gov (United States)

    Mahn, Holbrook

    2012-01-01

    Vygotsky's work is extensive and covers many aspects of the development of children's meaning-making processes in social and cultural contexts. However, his main focus is on the examination of the unification of speaking and thinking processes. His investigation centers on the analysis of the entity created by this unification--an internal…

  19. Radar signal analysis and processing using Matlab

    CERN Document Server

    Mahafza, Bassem R

    2008-01-01

    Offering radar-related software for the analysis and design of radar waveform and signal processing, this book provides comprehensive coverage of radar signals and signal processing techniques and algorithms. It contains numerous graphical plots, common radar-related functions, table format outputs, and end-of-chapter problems. The complete set of MATLAB® functions and routines is available for download online.

  20. Sustainable process design & analysis of hybrid separations

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Befort, Bridgette; Garg, Nipun

    2016-01-01

    Distillation is an energy intensive operation in chemical process industries. There are around 40,000 distillation columns in operation in the US, requiring approximately 40% of the total energy consumption in US chemical process industries. However, analysis of separations by distillation has…

  1. Infective endocarditis detection through SPECT/CT images digital processing

    Science.gov (United States)

    Moreno, Albino; Valdés, Raquel; Jiménez, Luis; Vallejo, Enrique; Hernández, Salvador; Soto, Gabriel

    2014-03-01

    Infective endocarditis (IE) is a difficult-to-diagnose pathology, since its manifestation in patients is highly variable. In this work, a semiautomatic algorithm based on digital processing of SPECT images was proposed for the detection of IE, using a CT image volume as a spatial reference. The heart/lung ratio was calculated using the SPECT image information. There were no statistically significant differences between the heart/lung ratio values of a group of patients diagnosed with IE (2.62+/-0.47) and a group of healthy or control subjects (2.84+/-0.68). However, it is necessary to increase the study sample of both the individuals diagnosed with IE and the control group subjects, as well as to improve the image quality.
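
    The heart/lung figure is a region-of-interest statistic; a sketch of how it could be computed from a reconstructed SPECT volume given two binary ROI masks (the semiautomatic mask generation against the CT reference is not shown):

        import numpy as np

        def heart_lung_ratio(spect_volume, heart_mask, lung_mask):
            """Mean counts inside the heart ROI divided by mean counts
            inside the lung ROI; masks are boolean arrays of the same
            shape as the volume (assumed derived from the CT reference)."""
            heart_mean = spect_volume[heart_mask].mean()
            lung_mean = spect_volume[lung_mask].mean()
            return heart_mean / lung_mean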

  2. Damage Detection in Composite Structures with Wavenumber Array Data Processing

    Science.gov (United States)

    Tian, Zhenhua; Leckey, Cara; Yu, Lingyu

    2013-01-01

    Guided ultrasonic waves (GUW) have the potential to be an efficient and cost-effective method for rapid damage detection and quantification of large structures. Attractive features include sensitivity to a variety of damage types and the capability of traveling relatively long distances. They have proven to be an efficient approach for crack detection and localization in isotropic materials. However, techniques must be pushed beyond isotropic materials in order to be valid for composite aircraft components. This paper presents our study on GUW propagation and interaction with delamination damage in composite structures using wavenumber array data processing, together with advanced wave propagation simulations. Parallel elastodynamic finite integration technique (EFIT) is used for the example simulations. Multi-dimensional Fourier transform is used to convert time-space wavefield data into frequency-wavenumber domain. Wave propagation in the wavenumber-frequency domain shows clear distinction among the guided wave modes that are present. This allows for extracting a guided wave mode through filtering and reconstruction techniques. Presence of delamination causes spectral change accordingly. Results from 3D CFRP guided wave simulations with delamination damage in flat-plate specimens are used for wave interaction with structural defect study.
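
    The core transform step is standard: a 3-D FFT takes wavefield data sampled in (t, x, y) into the frequency-wavenumber domain, where the guided wave modes separate. A numpy sketch:

        import numpy as np

        def frequency_wavenumber(wavefield, dt, dx, dy):
            """3-D Fourier transform of a time-space wavefield u[t, x, y]
            into the (f, kx, ky) domain used for mode filtering."""
            spectrum = np.fft.fftshift(np.fft.fftn(wavefield))
            f = np.fft.fftshift(np.fft.fftfreq(wavefield.shape[0], d=dt))
            kx = np.fft.fftshift(np.fft.fftfreq(wavefield.shape[1], d=dx))
            ky = np.fft.fftshift(np.fft.fftfreq(wavefield.shape[2], d=dy))
            return spectrum, f, kx, ky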

  3. Research on Defects Detection by Image Processing of Thermographic Images

    Directory of Open Access Journals (Sweden)

    Shrestha Ranjit

    2015-10-01

    This paper presents the results of an experimental investigation of thermal phenomena in a square (180 mm x 180 mm) STS 304 specimen of 10 mm thickness with artificial defects, circular cut-outs of varying depth and diameter, at the back side. The material is tested by means of thermal wave thermography, with lock-in thermography employed for the detection of defects. The temperature field of the front surface of the tested material is observed and analysed. A four-point correlation algorithm is applied to extract the phase angle of the thermal wave's harmonic component. Phase images are analyzed to find qualitative information about the defects. The phase contrast method was used for better identification and analysis of the existing defects of the specimen.
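
    The four-point correlation algorithm samples each pixel four times per modulation period (S1..S4) and recovers phase and amplitude as below; this is the standard lock-in form, not code from the paper:

        import numpy as np

        def four_point_lockin(s1, s2, s3, s4):
            """Phase and amplitude per pixel from four equally spaced
            samples over one modulation period."""
            phase = np.arctan2(s1 - s3, s2 - s4)                 # radians
            amplitude = np.sqrt((s1 - s3) ** 2 + (s2 - s4) ** 2)
            return phase, amplitude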

  4. Air Conditioning Compressor Air Leak Detection by Image Processing Techniques for Industrial Applications

    Directory of Open Access Journals (Sweden)

    Pookongchai Kritsada

    2015-01-01

    Full Text Available This paper presents a method to detect air leakage of an air conditioning compressor using image processing techniques. A quality compressor must be free of air leakage. To test a compressor for leaks, air is pumped into it and the compressor is then submerged in a water tank. If air bubbles appear at the surface of the compressor, that leaking compressor must be returned for maintenance. In this work, a new method was proposed to detect leakage and locate the leakage point with high accuracy, speed, and precision. In the preprocessing procedure to detect the air bubbles, threshold and median filter techniques are used. A connected component labeling technique detects the air bubbles, while blob analysis is the searching technique used to analyze groups of air bubbles in sequential images. Experiments were carried out with the proposed algorithm to determine the leakage point of an air conditioning compressor. The location of the leakage point was presented as a coordinate point. The results demonstrated that the leakage point could be accurately detected during the process, with an estimated position error of less than 5% compared to the real leakage point.
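
    A minimal sketch of the described chain (median filtering, thresholding, connected-component labeling, blob-size filtering) using scipy.ndimage; the threshold and minimum blob area below are assumed placeholder values, not the paper's parameters.

```python
import numpy as np
from scipy import ndimage

def detect_bubbles(frame, thresh=128, min_area=20):
    """frame: 2-D grayscale array. Returns (row, col) centroids of blobs
    larger than min_area; the leak point is near the bubble cluster."""
    smooth = ndimage.median_filter(frame, size=3)   # suppress speckle noise
    binary = smooth > thresh                        # candidate bubble pixels
    labels, n = ndimage.label(binary)               # connected components
    index = range(1, n + 1)
    areas = ndimage.sum(binary, labels, index)      # pixel count per blob
    centers = ndimage.center_of_mass(binary, labels, index)
    return [c for c, a in zip(centers, areas) if a >= min_area]
```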

  5. Normality Analysis for RFI Detection in Microwave Radiometry

    Directory of Open Access Journals (Sweden)

    Adriano Camps

    2009-12-01

    Full Text Available Radio-frequency interference (RFI) present in microwave radiometry measurements leads to erroneous radiometric results. Sources of RFI include spurious signals and harmonics from lower frequency bands, spread-spectrum signals overlapping the “protected” band of operation, or out-of-band emissions not properly rejected by the pre-detection filters due to their finite rejection. The presence of RFI in the radiometric signal modifies the detected power and therefore the estimated antenna temperature from which the geophysical parameters will be retrieved. In recent years, techniques to detect the presence of RFI in radiometric measurements have been developed. They include time- and/or frequency-domain analyses, or time- and/or frequency-domain statistical analyses of the received signal which, in the absence of RFI, must be a zero-mean Gaussian process. Statistical analyses performed to date include the calculation of the kurtosis and the Shapiro-Wilk normality test of the received signal. Nevertheless, statistical analysis of the received signal could be more extensive, as reported in the Statistics literature. The objective of this work is to study the performance of a number of normality tests from the Statistics literature when applied to the detection of RFI in the radiometric signal, which is Gaussian by nature. A description of the normality tests and the RFI detection results for different kinds of RFI are presented, with a view to determining an omnibus test that can deal with the blind spots of the currently used methods.
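
    A small illustration of the statistical idea: in the absence of RFI the samples are zero-mean Gaussian, so their kurtosis should be near 3 and a Shapiro-Wilk test should not reject normality; a sinusoidal interferer shifts both. The signal model and amplitude are assumptions for demonstration only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
clean = rng.normal(size=4096)                # RFI-free radiometric samples
t = np.arange(4096)
rfi = clean + 0.3 * np.sin(2 * np.pi * 0.01 * t)   # sinusoidal RFI added

for name, x in [("clean", clean), ("with RFI", rfi)]:
    kurt = stats.kurtosis(x, fisher=False)   # Gaussian => approximately 3
    W, p = stats.shapiro(x)                  # Shapiro-Wilk normality test
    print(f"{name}: kurtosis={kurt:.2f}, Shapiro-Wilk p={p:.3g}")
```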

  6. LRS data processing methods for detection of lunar subsurface echoes

    Science.gov (United States)

    Oshigami, Shoko; Mochizuki, Kengo; Watanabe, Shiho; Watanabe, Toshiki; Yamaguchi, Yasushi; Yamaji, Atsushi; Ono, Takayuki; Kumamoto, Atsushi; Nakagawa, Hiromu; Kobayashi, Takao; Kasahara, Yoshiya

    Lunar Radar Sounder (LRS) is an instrument for one of the fifteen science missions of SELENE (KAGUYA). LRS is a ground-penetrating, HF-band FM-CW radar system. LRS detects echoes reflected from subsurface discontinuities where the dielectric constants of the rocks change. The range resolution of LRS is 75 m in free space, and the sampling interval in the flight direction is about 75 m when the spacecraft altitude is 100 km. The primary objective of LRS is to investigate lunar subsurface structures. We plan to perform global soundings by LRS to contribute to studying the evolution of the Moon. In this presentation, we introduce the techniques used to process LRS data to produce data products and to detect subsurface echoes. We have two standard LRS data products under consideration. An 'A-scope' is a plot of signal power as a function of range derived from the waveform data; the time series of A-scope data is called a 'B-scan'. Because the LRS instrument changes the timing of data recording (measurement delay time) according to the predicted distance between the KAGUYA spacecraft and the lunar surface, the observation range with respect to the spacecraft varies from pulse to pulse. In addition, the flight altitude of KAGUYA changes over a range of several tens of kilometers. Therefore, a trace of surface nadir echoes in unprocessed B-scan images does not correspond to the actual lunar topography. We corrected the variations of the measurement delay time and flight altitude of KAGUYA to produce a B-scan data product with the original spatial resolution (BScan high) and a reduced spatial resolution product (BScan low), both in the PDS format. The echo signals in A-scope data may be classified into the following categories: (1) a surface nadir echo, (2) surface off-nadir backscattering echoes, and (3) subsurface echoes. The most intense signal usually comes from the nadir point when KAGUYA is flying over a level surface. The A-scope data also include various noises resulting from, for example...

  7. Raw data based image processing algorithm for fast detection of surface breaking cracks

    Science.gov (United States)

    Sruthi Krishna K., P.; Puthiyaveetil, Nithin; Kidangan, Renil; Unnikrishnakurup, Sreedhar; Zeigler, Mathias; Myrach, Philipp; Balasubramaniam, Krishnan; Biju, P.

    2017-02-01

    The aim of this work is to illustrate the contribution of signal processing techniques in the field of Non-Destructive Evaluation. A component's life evaluation is inevitably related to the presence of flaws in it. The detection and characterization of cracks prior to damage is a technologically and economically significant task and is of great importance when it comes to safety-relevant measures. Laser thermography is the most effective and advanced thermography method for Non-Destructive Evaluation: its high capability for detecting surface cracks and characterizing the geometry of artificial surface flaws in metallic samples is particularly encouraging, and it is a non-contact, fast, real-time detection method. The presence of a vertical surface-breaking crack disturbs the thermal footprint. The data processing method plays a vital role in fast detection of surface and sub-surface cracks. Current laser thermographic inspection lacks a suitable data processing algorithm for fast crack detection, and data analysis is done as part of post-processing. In this work, we introduce a raw-data-based image processing algorithm that yields precise and fast crack detection. The algorithm gives good results on both experimental and modeling data. Applying this algorithm, we carried out a detailed investigation of the variation of thermal contrast with crack parameters such as depth and width. The algorithm was applied to surface temperature data from a 2D scanning model, and its credibility was also validated with experimental data.

  8. Analysis of physical processes via imaging vectors

    Science.gov (United States)

    Volovodenko, V.; Efremova, N.; Efremov, V.

    2016-06-01

    Practically all modeled processes are in one way or another random. The foremost theoretical foundation here is the theory of Markov processes, which can be represented in different forms. A Markov process is a random process that undergoes transitions from one state to another on a state space, where the probability distribution of the next state depends only on the current state and not on the sequence of events that preceded it; in a Markov process, the model of the future is unchanged by additional information about earlier times. Modeling physical fields generally involves processes that change in time, i.e. non-stationary processes. In this case, applying the Laplace transformation introduces unjustified complications into the description, whereas a transition to other representations yields an explicit simplification. The method of imaging vectors provides constructive mathematical models and the necessary transitions in the modeling process and the analysis itself. The flexibility of a model built on a polynomial basis allows rapid changes to the mathematical model and accelerates further analysis. It should be noted that the mathematical description also permits an operator representation; conversely, operator representation of the structures, algorithms and data processing procedures significantly improves the flexibility of the modeling process.

  9. Detection and Analysis of Threats to the Energy Sector: DATES

    Energy Technology Data Exchange (ETDEWEB)

    Alfonso Valdes

    2010-03-31

    This report summarizes Detection and Analysis of Threats to the Energy Sector (DATES), a project sponsored by the United States Department of Energy and performed by a team led by SRI International, with collaboration from Sandia National Laboratories, ArcSight, Inc., and Invensys Process Systems. DATES sought to advance the state of the practice in intrusion detection and situational awareness with respect to cyber attacks in energy systems. This was achieved through the adaptation of detection algorithms for process systems, as well as the development of novel anomaly detection techniques suited to such systems, into a detection suite. These detection components, together with third-party commercial security systems, were interfaced with the commercial Security Information Event Management (SIEM) solution from ArcSight. The efficacy of the integrated solution was demonstrated on two testbeds, one based on a Distributed Control System (DCS) from Invensys, and the other based on the Virtual Control System Environment (VCSE) from Sandia. These achievements advance the DOE Cybersecurity Roadmap [DOE2006] goals in the area of security monitoring. The project ran from October 2007 until March 2010, with the final six months focused on experimentation. In the validation phase, team members from SRI and Sandia coupled the two test environments and carried out a number of distributed and cross-site attacks against various points in one or both testbeds. Alert messages from the distributed, heterogeneous detection components were correlated using the ArcSight SIEM platform, providing within-site and cross-site views of the attacks. In particular, the team demonstrated detection and visualization of network zone traversal and denial-of-service attacks. These capabilities were presented at the DistribuTech Conference and Exhibition in March 2010. The project was hampered by a four-month interruption of funding in 2008, due to continuing resolution issues and agreement on cost share.

  10. CPAS Preflight Drop Test Analysis Process

    Science.gov (United States)

    Englert, Megan E.; Bledsoe, Kristin J.; Romero, Leah M.

    2015-01-01

    Throughout the Capsule Parachute Assembly System (CPAS) drop test program, the CPAS Analysis Team has developed a simulation and analysis process to support drop test planning and execution. This process includes multiple phases focused on developing test simulations and communicating results to all groups involved in the drop test. CPAS Engineering Development Unit (EDU) series drop test planning begins with the development of a basic operational concept for each test. Trajectory simulation tools include the Flight Analysis and Simulation Tool (FAST) for single bodies, and the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation for the mated vehicle. Results are communicated to the team at the Test Configuration Review (TCR) and Test Readiness Review (TRR), as well as at Analysis Integrated Product Team (IPT) meetings in earlier and intermediate phases of the pre-test planning. The ability to plan and communicate efficiently with rapidly changing objectives and tight schedule constraints is a necessity for safe and successful drop tests.

  11. Abnormal traffic flow data detection based on wavelet analysis

    Directory of Open Access Journals (Sweden)

    Xiao Qian

    2016-01-01

    Full Text Available Traffic flow data are non-stationary, which makes abnormal data difficult to detect. An abnormal traffic flow data detection method based on wavelet analysis and the least squares method is proposed in this paper. First, wavelet analysis is used to separate the traffic flow data into high-frequency and low-frequency components; then, the least squares method is used to find abnormal points in the reconstructed signal data. Simulation results show that detecting abnormal traffic flow data with wavelet analysis effectively reduces both the misjudgment rate and the false negative rate of the detection results.
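
    A minimal sketch of the described two-step scheme using PyWavelets: a discrete wavelet transform separates the low-frequency trend from the high-frequency part, and a least-squares fit on the reconstructed trend flags points with large residuals. The wavelet name, decomposition level, and k-sigma rule are assumptions, not the paper's settings.

```python
import numpy as np
import pywt

def detect_anomalies(flow, wavelet="db4", level=3, k=3.0):
    """flow: 1-D traffic flow series. Returns indices of suspected
    abnormal points (minimal sketch of the wavelet + least-squares idea)."""
    coeffs = pywt.wavedec(flow, wavelet, level=level)
    coeffs[1:] = [np.zeros_like(c) for c in coeffs[1:]]   # keep approximation
    trend = pywt.waverec(coeffs, wavelet)[: len(flow)]    # low-frequency part
    t = np.arange(len(flow))
    a, b = np.polyfit(t, trend, 1)                        # least-squares line
    resid = flow - (a * t + b)                            # deviation from fit
    return np.where(np.abs(resid) > k * resid.std())[0]
```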

  12. Probabilistic analysis of a thermosetting pultrusion process

    DEFF Research Database (Denmark)

    Baran, Ismet; Tutum, Cem C.; Hattel, Jesper Henri

    2016-01-01

    In the present study, the effects of uncertainties in the material properties of the processing composite material and the resin kinetic parameters, as well as process parameters such as pulling speed and inlet temperature, on product quality (exit degree of cure) are investigated for a pultrusion process. A new application for the probabilistic analysis of the pultrusion process is introduced using the response surface method (RSM). The results obtained from the RSM are validated by employing the Monte Carlo simulation (MCS) with the Latin hypercube sampling technique. According to the results...
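
    The validation step can be sketched with SciPy's quasi-Monte Carlo module: Latin hypercube samples of the uncertain inputs are generated and scaled to their ranges, then fed one row at a time to the process model. The two inputs and their bounds below are illustrative assumptions, not values from the study.

```python
import numpy as np
from scipy.stats import qmc

# Latin hypercube sample of two uncertain inputs (assumed ranges):
# pulling speed [cm/min] and inlet temperature [deg C].
sampler = qmc.LatinHypercube(d=2, seed=1)
unit = sampler.random(n=500)                          # points in [0, 1)^2
samples = qmc.scale(unit, l_bounds=[18, 105], u_bounds=[22, 125])

# Each row would feed one run of the process model; the distribution of
# the output (e.g. exit degree of cure) is then compared against the RSM.
print(samples[:3])
```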

  13. Planar Inlet Design and Analysis Process (PINDAP)

    Science.gov (United States)

    Slater, John W.; Gruber, Christopher R.

    2005-01-01

    The Planar Inlet Design and Analysis Process (PINDAP) is a collection of software tools that allow the efficient aerodynamic design and analysis of planar (two-dimensional and axisymmetric) inlets. The aerodynamic analysis is performed using the Wind-US computational fluid dynamics (CFD) program. A major element in PINDAP is a Fortran 90 code named PINDAP that can establish the parametric design of the inlet and efficiently model the geometry and generate the grid for CFD analysis with design changes to those parameters. The use of PINDAP is demonstrated for subsonic, supersonic, and hypersonic inlets.

  14. Mechanism Analysis of CsI Scintillation Crystal in the Photoelectric Detection Process

    Institute of Scientific and Technical Information of China (English)

    王菊霞

    2015-01-01

    The fundamental principle of photoelectric detection is described, and the mechanism of the CsI crystal in the photoelectric detection process is analyzed from the standpoint of electron emission, energy distribution and the photoelectric effect. It is found that a two-photon photoelectric effect is obtained in the CsI film at an incident light intensity of about 0.1 MW/cm2. The CsI crystal is one of the better scintillators in widespread use at present; it can convert X-rays directly into visible light, from which digital images are obtained, realizing the purpose of photoelectric detection.

  15. Systematic Sustainable Process Design and Analysis of Biodiesel Processes

    Directory of Open Access Journals (Sweden)

    Seyed Soheil Mansouri

    2013-09-01

    Full Text Available Biodiesel is a promising fuel alternative to traditional diesel obtained from conventional sources such as fossil fuels. Many flowsheet alternatives exist for the production of biodiesel, and it is therefore necessary to evaluate these alternatives using defined criteria and with respect to process intensification opportunities. This work focuses on three main aspects that have been incorporated into a systematic computer-aided framework for sustainable process design. First, the creation of a generic superstructure, which consists of all possible process alternatives based on available technology. Second, the evaluation of this superstructure by systematic screening to obtain an appropriate base case design. This is done by first reducing the search space using a sustainability analysis, which provides key indicators for process bottlenecks of different flowsheet configurations, and then by further reducing the search space using economic evaluation and life cycle assessment. Third, the determination of a sustainable design with or without process intensification using a phenomena-based synthesis/design method. A detailed step-by-step application of the framework is highlighted through a biodiesel production case study.

  16. Knee joint vibroarthrographic signal processing and analysis

    CERN Document Server

    Wu, Yunfeng

    2015-01-01

    This book presents the cutting-edge technologies of knee joint vibroarthrographic signal analysis for the screening and detection of knee joint injuries. It describes a number of effective computer-aided methods for analysis of the nonlinear and nonstationary biomedical signals generated by complex physiological mechanics. This book also introduces several popular machine learning and pattern recognition algorithms for biomedical signal classifications. The book is well-suited for all researchers looking to better understand knee joint biomechanics and the advanced technology for vibration arthrometry. Dr. Yunfeng Wu is an Associate Professor at the School of Information Science and Technology, Xiamen University, Xiamen, Fujian, China.

  17. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Baddeley, A.; Turner, R.; Møller, Jesper

    We define residuals for point process models fitted to spatial point pattern data, and propose diagnostic plots based on these residuals. The techniques apply to any Gibbs point process model, which may exhibit spatial heterogeneity, interpoint interaction and dependence on spatial covariates. Our residuals generalise the well-known residuals for point processes in time, used in signal processing and survival analysis. An important difference is that the conditional intensity or hazard rate of the temporal point process must be replaced by the Papangelou conditional intensity $\lambda$ of the spatial process. Residuals are ascribed to locations in the empty background, as well as to data points of the point pattern. We obtain variance formulae, and study standardised residuals. There is also an analogy between our spatial residuals and the usual residuals for (non-spatial) generalised linear models...

  18. Flux Analysis in Process Models via Causality

    CERN Document Server

    Kahramanoğullari, Ozan

    2010-01-01

    We present an approach for flux analysis in process algebra models of biological systems. We perceive flux as the flow of resources in stochastic simulations. We resort to an established correspondence between event structures, a broadly recognised model of concurrency, and state transitions of process models, seen as Petri nets. We show that in this way we can extract the causal resource dependencies between individual state transitions in simulations as partial orders of events. We propose transformations on the partial orders that provide means for further analysis, and introduce a software tool which implements these ideas. By means of an example of a published model of the Rho GTP-binding proteins, we argue that this approach can provide a substitute for flux analysis techniques on ordinary differential equation models within the stochastic setting of process algebras.

  19. A Systematic Analysis of Coal Accumulation Process

    Institute of Scientific and Technical Information of China (English)

    CHENG Aiguo

    2008-01-01

    The formation of coal seams and coal-rich zones is an integrated result of a series of factors in the coal accumulation process. The coal accumulation system is an architectural aggregation of coal accumulation factors. It can be classified into 4 levels: the global coal accumulation super-system, the coal accumulation domain mega-system, the coal accumulation basin system, and the coal seam or coal seam set sub-system. The coal accumulation process is an open, dynamic, and grey system, and is at the same time a system with such properties as aggregation, relevance, entirety, purpose-orientation, hierarchy, and environment adaptability. In this paper, we take the coal accumulation process as a system to study the origin of coal seams and coal-rich zones, and we discuss a methodology for the systematic analysis of the coal accumulation process. As an example, the Ordos coal basin was investigated to elucidate the application of the coal accumulation system analysis method.

  20. Exergetic analysis of human & natural processes

    CERN Document Server

    Banhatti, Dilip G

    2011-01-01

    Using the concept of available work or exergy, each human and natural process can be characterized by its contextual efficiency, that is, efficiency with the environment as a reference. Such an efficiency is termed exergy efficiency. Parts of the process which need to be made more efficient & less wasteful stand out in such an analysis, in contrast to an energy analysis. Any new idea for a process can be similarly characterized. This exercise naturally generates paths to newer ideas in given contexts to maximize exergy efficiency. The contextual efficiency is not just output/input; it also naturally includes environmental impact (to be minimized) and any other relevant parameter(s) to be optimized. Natural life processes in different terrestrial environments are already optimized for their environments, and act as guides, for example, in seeking to evolve sustainable energy practices in different contexts. Energy use at the lowest possible temperature for each situation is a natural result. Variety of renewab...

  1. KNOWLEDGE PROCESS ANALYSIS:FRAMEWORK AND EXPERIENCE

    Institute of Scientific and Technical Information of China (English)

    Kozo SUGIYAMA; Bertolt MEYER

    2008-01-01

    We present a step-by-step approach for constructing a framework for knowledge process analysis (KPA). We intend to apply this framework to the analysis of our own research projects in an exploratory way and to elaborate it through the accumulation of case studies. This study is based on a methodology consisting of knowledge process modeling, primitives synthesis, and reflective verification. We describe details of the methodology and present the results of case studies: a novel methodology, a practical work guide, and a tool for KPA; insights for improving future research projects and education; and the integration of existing knowledge creation theories.

  2. Wavelet Analysis of Fractionally Integrated Processes

    OpenAIRE

    Mark J. Jensen

    1994-01-01

    In this paper we apply wavelet analysis to the class of fractionally integrated processes to show that this class is a member of the $1/f$ family of processes as defined by Wornell (1993) and to produce an alternative method of estimating the fractional differencing parameter. Currently the method by Geweke and Porter-Hudak (1983) is used most often to estimate and test the fractional differencing parameter. The GPH approach, however, has been shown to have poor statistical properties and suf...

  3. Sneak analysis applied to process systems

    Science.gov (United States)

    Whetton, Cris

    Traditional safety analyses, such as HAZOP, FMEA, FTA, and MORT, are less than effective at identifying hazards resulting from incorrect 'flow' - whether this be flow of information, actions, electric current, or even the literal flow of process fluids. Sneak Analysis (SA) has existed since the mid nineteen-seventies as a means of identifying such conditions in electric circuits; in which area, it is usually known as Sneak Circuit Analysis (SCA). This paper extends the ideas of Sneak Circuit Analysis to a general method of Sneak Analysis applied to process plant. The methods of SA attempt to capitalize on previous work in the electrical field by first producing a pseudo-electrical analog of the process and then analyzing the analog by the existing techniques of SCA, supplemented by some additional rules and clues specific to processes. The SA method is not intended to replace any existing method of safety analysis; instead, it is intended to supplement such techniques as HAZOP and FMEA by providing systematic procedures for the identification of a class of potential problems which are not well covered by any other method.

  4. An image processing analysis of skin textures

    CERN Document Server

    Sparavigna, A

    2008-01-01

    Colour and coarseness of skin are visually different. When image processing is involved in skin analysis, it is important to quantitatively evaluate such differences using texture features. In this paper, we discuss texture analysis and measurements based on a statistical approach to pattern recognition. Grain size and anisotropy are evaluated with proper diagrams. The possibility of determining the presence of pattern defects is also discussed.

  5. Probing Interfacial Processes on Graphene Surface by Mass Detection

    Science.gov (United States)

    Kakenov, Nurbek; Kocabas, Coskun

    2013-03-01

    In this work we studied the mass density of graphene, probed interfacial processes on the graphene surface, and examined the formation of graphene oxide by mass detection. The graphene layers were synthesized by the chemical vapor deposition method on copper foils and transfer-printed onto a quartz crystal microbalance (QCM). The mass density of single-layer graphene was measured by investigating the mechanical resonance of the QCM. Moreover, we extended the developed technique to probe the binding dynamics of proteins on the surface of graphene, and were able to obtain the nonspecific binding constant of BSA protein on the graphene surface in aqueous solution. The time trace of the resonance signal showed that the BSA molecules rapidly saturated the available binding sites on the graphene surface. Furthermore, we monitored the oxidation of the graphene surface under oxygen plasma by tracing the changes in the interfacial mass of the graphene together with the shifts in its Raman spectra. Three regimes were observed: the formation of graphene oxide, which increases the interfacial mass; the release of carbon dioxide; and the removal of small graphene/graphene oxide flakes. Scientific and Technological Research Council of Turkey (TUBITAK) grant no. 110T304, 109T209, Marie Curie International Reintegration Grant (IRG) grant no. 256458, Turkish Academy of Science (TUBA-Gebip).

  6. Bisous model - detecting filamentary patterns in point processes

    CERN Document Server

    Tempel, E; Kipper, R; Saar, E

    2016-01-01

    The cosmic web is a highly complex geometrical pattern, with galaxy clusters at the intersections of filaments and filaments at the intersections of walls. Identifying and describing the filamentary network is not a trivial task due to the overwhelming complexity of the structure, its connectivity and its intrinsic hierarchical nature. To detect and quantify galactic filaments we use the Bisous model, which is a marked point process built to model multi-dimensional patterns. The Bisous filament finder works directly with the galaxy distribution data, and the model intrinsically takes into account the connectivity of the filamentary network. The Bisous model generates the visit map (the probability of finding a filament at a given point) together with the filament orientation field. Using these two fields, we can extract filament spines from the data. Together with this paper we publish the computer code for the Bisous model, which is made available on GitHub. The Bisous filament finder has been successfully used in s...

  7. Detection of Causality between Process Variables Based on Industrial Alarm Data Using Transfer Entropy

    Directory of Open Access Journals (Sweden)

    Weijun Yu

    2015-08-01

    Full Text Available In modern industrial processes, it is easier and less expensive to configure alarms by software settings rather than by wiring, which has caused rapid growth in the number of alarms. Moreover, because there exist complex interactions, in particular causal relationships among different parts of the process, a fault may propagate along propagation pathways once an abnormal situation occurs, which makes it difficult for operators to identify the root cause immediately and take proper corrective actions. Therefore, causality detection becomes a very important problem in the context of multivariate alarm analysis and design. In recent years, transfer entropy has become an effective and widely used method to detect causality between different continuous process variables in both linear and nonlinear situations. However, such conventional methods to detect causality based on transfer entropy are computationally costly. Alternatively, using binary alarm series can be more computationally friendly and more direct, because alarm data analysis is straightforward for alarm management in practice. The methodology and implementation issues are discussed in this paper. Illustrated by several case studies, including both numerical cases and simulated industrial cases, the proposed method is demonstrated to be suitable for industrial situations contaminated by noise.
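
    A minimal sketch of transfer entropy for binary alarm series with history length 1, estimated directly from joint frequencies; a real analysis would use longer histories and significance tests. The synthetic series below are illustrative assumptions.

```python
import numpy as np

def transfer_entropy(x, y):
    """Transfer entropy T(x -> y) in bits for binary series x, y with
    history length 1, estimated from joint frequencies (minimal sketch)."""
    x, y = np.asarray(x), np.asarray(y)
    xt, yt, y1 = x[:-1], y[:-1], y[1:]      # x_t, y_t, y_{t+1}
    te = 0.0
    for a in (0, 1):                        # value of y_{t+1}
        for b in (0, 1):                    # value of y_t
            for c in (0, 1):                # value of x_t
                p_abc = np.mean((y1 == a) & (yt == b) & (xt == c))
                p_bc = np.mean((yt == b) & (xt == c))
                p_ab = np.mean((y1 == a) & (yt == b))
                p_b = np.mean(yt == b)
                if min(p_abc, p_bc, p_ab, p_b) > 0:
                    te += p_abc * np.log2((p_abc / p_bc) / (p_ab / p_b))
    return te

# A lagged copy should show clear information transfer x -> y but not back:
rng = np.random.default_rng(0)
x = (rng.random(5000) < 0.2).astype(int)    # synthetic alarm series
y = np.roll(x, 1)                           # y follows x with one-step lag
print(transfer_entropy(x, y), transfer_entropy(y, x))
```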

  8. Experimental exploration to thermal infrared imaging for detecting the transient process of solid impact

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Based on the analysis and comparison of stress pattern analysis by thermal emission (SPATE) and remote sensing rock mechanics (RSRM), the idea of detecting the transient process of solid impact with thermal infrared (TIR) imaging technology is introduced. By means of the TVS-8100MKII TIR imaging system, which has high recording speed, high spatial resolution and high temperature sensitivity, TIR imaging experiments were conducted on a free-falling steel ball impacting marble, granite, concrete, steel, organic-glass and wood plates. It was discovered that: (i) the target's TIR temperature increases remarkably after impact; (ii) when the ball's size is unchanged, the variation amplitude of the target's TIR temperature increases proportionally with the ball's potential energy or falling height; (iii) the variation amplitude of the target's TIR temperature depends on the material type and surface smoothness of the target, the amplitudes ranking in the order: concrete, unpolished marble, steel plate, wood plate, polished granite, polished marble and organic-glass plate; and (iv) the TIR radiation of brittle targets decreases gradually after impact, while there is delayed TIR radiation strengthening for plastic targets. It is deduced that once the relational functions and technical parameters for a given impact body and target material are established through experimental study, remote detection and back analysis of the transient process of solid impact based on TIR imaging will be feasible. Moreover, this has important scientific meaning for the study of precursor mechanics and for satellite TIR detection and prediction of tectonic earthquakes.

  9. A signal processing method for the friction-based endpoint detection system of a CMP process

    Science.gov (United States)

    Chi, Xu; Dongming, Guo; Zhuji, Jin; Renke, Kang

    2010-12-01

    A signal processing method for the friction-based endpoint detection system of a chemical mechanical polishing (CMP) process is presented. The signal processing method uses the wavelet threshold denoising method to reduce the noise contained in the measured original signal, extracts the Kalman filter innovation from the denoised signal as the feature signal, and judges the CMP endpoint based on the features of the Kalman filter innovation sequence during the CMP process. Applying this signal processing method, endpoint detection experiments for the Cu CMP process were carried out. The results show that the method can judge the endpoint of the Cu CMP process.
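
    The two stages described above can be sketched as follows: wavelet threshold denoising with the universal threshold, then the innovation sequence of a scalar random-walk Kalman filter, whose statistics change at the endpoint. The wavelet choice and filter parameters are assumptions, not the authors' settings.

```python
import numpy as np
import pywt

def denoise(sig, wavelet="sym8", level=4):
    """Wavelet threshold denoising (universal threshold, soft shrinkage)."""
    coeffs = pywt.wavedec(sig, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise estimate
    thr = sigma * np.sqrt(2 * np.log(len(sig)))         # universal threshold
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(sig)]

def kalman_innovation(z, q=1e-5, r=1e-2):
    """Innovation sequence of a scalar random-walk Kalman filter; a jump
    in the innovation statistics marks an endpoint candidate."""
    x, p, innov = z[0], 1.0, []
    for zk in z[1:]:
        p += q                       # predict (random-walk state model)
        k = p / (p + r)              # Kalman gain
        innov.append(zk - x)         # innovation: measurement - prediction
        x += k * (zk - x)            # update state estimate
        p *= (1 - k)                 # update error covariance
    return np.array(innov)

# Usage sketch: innov = kalman_innovation(denoise(raw_friction_signal))
```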

  10. Detrended Fluctuation Analysis of Autoregressive Processes

    CERN Document Server

    Morariu, V V; Vamos, C; Soltuz, S

    2007-01-01

    Autoregressive (AR) processes typically have short-range memory. Detrended Fluctuation Analysis (DFA) was originally designed to reveal long-range correlations in non-stationary processes. However, DFA can also be regarded as a suitable method to investigate both long-range and short-range correlations in non-stationary and stationary systems. Applying DFA to AR processes can help in understanding the non-uniform correlation structure of such processes. We systematically investigated a first-order autoregressive model, AR(1), by DFA and established the relationship between the interaction constant of AR(1) and the DFA correlation exponent. The higher the interaction constant, the higher the short-range correlation exponent; the two are exponentially related. The investigation was extended to AR(2) processes. The presence of a distant positive interaction, in addition to a nearby interaction, will increase the correlation exponent and the range of correlation, while the effect of a distant negative interaction will decrease...
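
    A compact illustration of this kind of experiment: generate an AR(1) series with interaction constant phi and estimate its DFA-1 exponent by integrating, detrending in windows, and fitting log F(n) against log n. The scales and the phi value are arbitrary choices for demonstration, not the paper's.

```python
import numpy as np

def dfa_exponent(x, scales=(8, 16, 32, 64, 128)):
    """DFA-1: integrate the series, detrend linearly in windows of size n,
    and return the slope of log F(n) versus log n."""
    y = np.cumsum(x - np.mean(x))            # integrated profile
    F = []
    for n in scales:
        m = len(y) // n
        segs = y[: m * n].reshape(m, n)      # non-overlapping windows
        t = np.arange(n)
        rms = []
        for s in segs:
            a, b = np.polyfit(t, s, 1)       # linear detrend per window
            rms.append(np.mean((s - (a * t + b)) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    alpha, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return alpha

# AR(1) with interaction constant phi (assumed value for illustration)
rng = np.random.default_rng(2)
phi, x = 0.7, np.zeros(10000)
for t in range(1, len(x)):
    x[t] = phi * x[t - 1] + rng.normal()
print(dfa_exponent(x))   # above 0.5 at short scales for positive phi
```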

  11. Method for detecting software anomalies based on recurrence plot analysis

    OpenAIRE

    Michał Mosdorf

    2012-01-01

    The presented paper evaluates a method for detecting software anomalies based on recurrence plot analysis of the trace log generated by software execution. The described method is based on windowed recurrence quantification analysis for selected measures (e.g. recurrence rate, RR, or determinism, DET). Initial results show that the proposed method is useful in detecting silent software anomalies that do not result in typical crashes (e.g. exceptions).

  12. Method for detecting software anomalies based on recurrence plot analysis

    Directory of Open Access Journals (Sweden)

    Michał Mosdorf

    2012-03-01

    Full Text Available The presented paper evaluates a method for detecting software anomalies based on recurrence plot analysis of the trace log generated by software execution. The described method is based on windowed recurrence quantification analysis for selected measures (e.g. recurrence rate, RR, or determinism, DET). Initial results show that the proposed method is useful in detecting silent software anomalies that do not result in typical crashes (e.g. exceptions).
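
    A minimal sketch of the two windowed RQA measures named above, recurrence rate (RR) and determinism (DET), computed for one window of a 1-D trace signal; a sliding window over the log would call this repeatedly. The recurrence threshold and minimum diagonal line length are assumed values.

```python
import numpy as np

def rqa_measures(x, eps=0.1, lmin=2):
    """Recurrence rate and determinism for one window of a 1-D series,
    excluding the main diagonal (line of identity)."""
    x = np.asarray(x, float)
    n = len(x)
    R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)
    recurrent = R.sum() - n                   # exclude the main diagonal
    rr = recurrent / (n * n - n)              # fraction of recurrent points
    diag_pts = 0
    for d in range(1, n):                     # upper-triangle diagonals
        run = 0
        for v in np.append(np.diag(R, k=d), 0):   # sentinel flushes last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    diag_pts += run           # points on lines >= lmin
                run = 0
    det = 2 * diag_pts / recurrent if recurrent else 0.0
    return rr, det
```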

  13. Fault Detection and Diagnosis in Process Data Using Support Vector Machines

    Directory of Open Access Journals (Sweden)

    Fang Wu

    2014-01-01

    Full Text Available For complex industrial processes, it has become increasingly challenging to effectively diagnose complicated faults. In this paper, a combined measure of the original Support Vector Machine (SVM) and Principal Component Analysis (PCA) is provided to carry out fault classification, and its result is compared with that of the SVM-RFE (Recursive Feature Elimination) method. RFE is used for feature extraction, and PCA is utilized to project the original data onto a lower-dimensional space. The PCA T2 and SPE statistics and the original SVM are proposed to detect the faults. Some common faults of the Tennessee Eastman Process (TEP) are analyzed in terms of the practical system and the characteristics of the dataset. PCA-SVM and SVM-RFE can effectively detect and diagnose these common faults. In the RFE algorithm, all variables are ordered in decreasing order of their contributions, and the classification accuracy rate is improved by choosing a reasonable number of features.
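
    The PCA monitoring statistics mentioned above can be sketched in a few lines: fit PCA on normal operating data, then compute Hotelling's T2 in the retained subspace and the squared prediction error (SPE, or Q) in the residual subspace for new samples. Control limits, which would normally come from the statistics' reference distributions, are omitted in this sketch.

```python
import numpy as np

def pca_monitor(X_train, X_test, n_comp=3):
    """X_train: normal operating data (n x m); X_test: new samples.
    Returns per-sample T2 and SPE statistics from a PCA model."""
    mu, sd = X_train.mean(0), X_train.std(0)
    Z = (X_train - mu) / sd                  # standardize on training data
    U, S, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_comp].T                        # loadings (m x n_comp)
    lam = (S[:n_comp] ** 2) / (len(Z) - 1)   # retained score variances
    Zt = (X_test - mu) / sd
    T = Zt @ P                               # scores in the PCA subspace
    t2 = np.sum(T ** 2 / lam, axis=1)        # Hotelling's T2
    E = Zt - T @ P.T                         # residual part
    spe = np.sum(E ** 2, axis=1)             # squared prediction error (Q)
    return t2, spe
```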

  14. Analysis and evaluation of collaborative modeling processes

    NARCIS (Netherlands)

    Ssebuggwawo, D.

    2012-01-01

    Analysis and evaluation of collaborative modeling processes is confronted with many challenges. On the one hand, many systems design and re-engineering projects require collaborative modeling approaches that can enhance their productivity. But, such collaborative efforts, which often consist of the

  15. Application of wavelet analysis to crustal deformation data processing

    Institute of Scientific and Technical Information of China (English)

    张燕; 吴云; 刘永启; 施顺英

    2004-01-01

    The time-frequency analysis and anomaly detection capabilities of the wavelet transform make the method highly advantageous in non-stationary signal processing. In this paper, these two characteristics are analyzed and demonstrated with a synthetic signal. By applying the wavelet transform to deformation data processing, we find that about 4 months before strong earthquakes, several deformation stations near the epicenter simultaneously received an abnormal signal with the same frequency, with periods from several days to more than ten days. The GPS observation stations near the epicenter all received an abnormal signal whose period is from 3 months to half a year. These abnormal signals are possibly earthquake precursors.

  16. System Runs Analysis with Process Mining

    Directory of Open Access Journals (Sweden)

    S. A. Shershakov

    2015-01-01

    Full Text Available Information systems (IS) produce numerous traces and logs at runtime. In the context of SOA-based (service-oriented architecture) IS, these logs contain details about sequences of process and service calls. Modern application monitoring and error tracking tools provide only rather straightforward log search and filtering functionality. However, “clever” analysis of the logs is highly useful, since it can provide valuable insights into the system architecture and the interaction of business domains and services. Here we took runtime event logs (trace data) of a big booking system and discovered architectural guideline violations and common anti-patterns. We applied mature process mining techniques for the discovery and analysis of these logs. The aims of process mining are to discover, analyze, and improve processes on the basis of IS behavior recorded as event logs. In several specific examples, we show successful applications of process mining to system runtime analysis and motivate further research in this area. The article is published in the authors' wording.

  17. Real Time Intelligent Target Detection and Analysis with Machine Vision

    Science.gov (United States)

    Howard, Ayanna; Padgett, Curtis; Brown, Kenneth

    2000-01-01

    We present an algorithm for detecting a specified set of targets for an Automatic Target Recognition (ATR) application. ATR involves processing images for detecting, classifying, and tracking targets embedded in a background scene. We address the problem of discriminating between targets and nontarget objects in a scene by evaluating 40x40 image blocks belonging to an image. Each image block is first projected onto a set of templates specifically designed to separate images of targets embedded in a typical background scene from those background images without targets. These filters are found using directed principal component analysis which maximally separates the two groups. The projected images are then clustered into one of n classes based on a minimum distance to a set of n cluster prototypes. These cluster prototypes have previously been identified using a modified clustering algorithm based on prior sensed data. Each projected image pattern is then fed into the associated cluster's trained neural network for classification. A detailed description of our algorithm will be given in this paper. We outline our methodology for designing the templates, describe our modified clustering algorithm, and provide details on the neural network classifiers. Evaluation of the overall algorithm demonstrates that our detection rates approach 96% with a false positive rate of less than 0.03%.

  18. In-Line Detection and Measurement of Molecular Contamination in Semiconductor Process Solutions

    Science.gov (United States)

    Wang, Jason; West, Michael; Han, Ye; McDonald, Robert C.; Yang, Wenjing; Ormond, Bob; Saini, Harmesh

    2005-09-01

    This paper discusses a fully automated metrology tool for detection and quantitative measurement of contamination, including cationic, anionic, metallic, organic, and molecular species present in semiconductor process solutions. The instrument is based on an electrospray ionization time-of-flight mass spectrometer (ESI-TOF/MS) platform. The tool can be used in diagnostic or analytical modes to understand process problems in addition to enabling routine metrology functions. Metrology functions include in-line contamination measurement with near real-time trend analysis. This paper discusses representative organic and molecular contamination measurement results in production process problem solving efforts. The examples include the analysis and identification of organic compounds in SC-1 pre-gate clean solution; urea, NMP (N-Methyl-2-pyrrolidone) and phosphoric acid contamination in UPW; and plasticizer and an organic sulfur-containing compound found in isopropyl alcohol (IPA). It is expected that these unique analytical and metrology capabilities will improve the understanding of the effect of organic and molecular contamination on device performance and yield. This will permit the development of quantitative correlations between contamination levels and process degradation. It is also expected that the ability to perform routine process chemistry metrology will lead to corresponding improvements in manufacturing process control and yield, the ability to avoid excursions and will improve the overall cost effectiveness of the semiconductor manufacturing process.

  19. Analysis of Spatial Data Structures for Proximity Detection

    Institute of Scientific and Technical Information of China (English)

    Anupreet Walia; Jochen Teizer

    2008-01-01

    Construction is a dangerous business. According to statistics, in each of the past thirteen years more than 1000 workers died in the US construction industry. In order to minimize the overall number of these incidents, the research presented in this paper investigates how to monitor and analyze the trajectories of construction resources, first in a simulated environment and later on the actual job site. Due to the complex nature of the construction environment, three-dimensional (3D) positioning data of workers is rarely collected. Although technology is available that allows tracking construction assets in real time, indoors and outdoors, in 3D, the continuously changing spatial and temporal arrangement of job sites requires any successfully working data processing system to operate in real time. For safety, this research focuses on spatial data structures that offer the capability of realigning themselves and reporting the distance to the closest neighbor in real time. This paper presents results of simulations that allow the processing of real-time location data for collision detection and proximity analysis. The presented data structures and the performance results of the developed algorithms demonstrate that real-time tracking and proximity detection of resources is feasible.
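
    One common realization of such a spatial data structure is a k-d tree rebuilt each update cycle; the sketch below uses SciPy's cKDTree to report each resource's nearest neighbor and flag pairs closer than an assumed alert radius. The radius, units, and data layout are assumptions, not the paper's design.

```python
import numpy as np
from scipy.spatial import cKDTree

def proximity_alerts(positions, alert_radius=5.0):
    """positions: (n, 3) array of resource coordinates for one time step.
    Returns (i, j, distance) for each resource whose nearest neighbor is
    closer than alert_radius (units assumed to be meters). Mutual nearest
    neighbors may appear in both directions."""
    tree = cKDTree(positions)                 # rebuilt every update cycle
    dist, idx = tree.query(positions, k=2)    # k=1 is the point itself
    return [(i, int(j), float(d)) for i, (j, d)
            in enumerate(zip(idx[:, 1], dist[:, 1])) if d < alert_radius]

# Example: 50 random resources in a 100 m x 100 m x 10 m site volume
pts = np.random.default_rng(1).random((50, 3)) * [100, 100, 10]
print(proximity_alerts(pts))
```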

  20. Qualitative Analysis for Maintenance Process Assessment

    Science.gov (United States)

    Brand, Lionel; Kim, Yong-Mi; Melo, Walcelio; Seaman, Carolyn; Basili, Victor

    1996-01-01

    In order to improve software maintenance processes, we first need to be able to characterize and assess them. These tasks must be performed in depth and with objectivity since the problems are complex. One approach is to set up a measurement-based software process improvement program specifically aimed at maintenance. However, establishing a measurement program requires that one understands the problems to be addressed by the measurement program and is able to characterize the maintenance environment and processes in order to collect suitable and cost-effective data. Also, enacting such a program and getting usable data sets takes time. A short term substitute is therefore needed. We propose in this paper a characterization process aimed specifically at maintenance and based on a general qualitative analysis methodology. This process is rigorously defined in order to be repeatable and usable by people who are not acquainted with such analysis procedures. A basic feature of our approach is that actual implemented software changes are analyzed in order to understand the flaws in the maintenance process. Guidelines are provided and a case study is shown that demonstrates the usefulness of the approach.

  1. Detection and Analysis of Solar Eclipse

    OpenAIRE

    2012-01-01

    We propose an algorithm that can be used by amateur astronomers to analyze the images acquired during solar eclipses. The proposed algorithm analyzes the image, detects the eclipse and produces results for parameters like magnitude of eclipse, eclipse obscuration and the approximate distance between the Earth and the Moon.

  2. Processing of polarimetric infrared images for landmine detection

    NARCIS (Netherlands)

    Cremer, F.; Jong, W. de; Schutte, K.

    2003-01-01

    Infrared (IR) cameras are often used in a vehicle-based multi-sensor platform for landmine detection. In addition to thermal contrasts, an IR polarimetric sensor also measures surface properties and therefore has the potential for increased detection performance. We have developed a polarimetric IR se...

  3. An image processing pipeline to detect and segment nuclei in muscle fiber microscopic images.

    Science.gov (United States)

    Guo, Yanen; Xu, Xiaoyin; Wang, Yuanyuan; Wang, Yaming; Xia, Shunren; Yang, Zhong

    2014-08-01

    Muscle fiber images play an important role in the medical diagnosis and treatment of many muscular diseases. The number of nuclei in skeletal muscle fiber images is a key biomarker in the diagnosis of muscular dystrophy. In nuclei segmentation, one primary challenge is to correctly separate clustered nuclei. In this article, we developed an image processing pipeline to automatically detect, segment, and analyze nuclei in microscopic images of muscle fibers. The pipeline consists of image pre-processing, identification of isolated nuclei, identification and segmentation of clustered nuclei, and quantitative analysis. Nuclei are initially extracted from the background by using a local Otsu threshold. Based on analysis of morphological features of the isolated nuclei, including their areas, compactness, and major axis lengths, a Bayesian network is trained and applied to distinguish isolated nuclei from clustered nuclei and artifacts in all the images. Then a two-step refined watershed algorithm is applied to segment the clustered nuclei. After segmentation, the nuclei can be quantified for statistical analysis. Comparing the segmented results with those of manual analysis and an existing technique, we find that our proposed image processing pipeline achieves good performance with high accuracy and precision. The presented image processing pipeline can therefore help biologists increase their throughput and objectivity in analyzing large numbers of nuclei in muscle fiber images.
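
    A rough sketch of the segmentation core under stated assumptions (a 2-D uint8 grayscale image; illustrative structuring-element and peak-distance parameters): local Otsu extraction followed by a distance-transform watershed to split clustered nuclei. The Bayesian-network classification step from the pipeline is omitted here.

```python
import numpy as np
from scipy import ndimage
from skimage.filters import rank
from skimage.morphology import disk
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def segment_nuclei(img, min_distance=7):
    """img: 2-D uint8 grayscale image. Local Otsu extraction, then a
    distance-transform watershed splits touching nuclei (sketch only)."""
    binary = img > rank.otsu(img, disk(15))        # local Otsu threshold
    dist = ndimage.distance_transform_edt(binary)  # peaks ~ nucleus centers
    peaks = peak_local_max(dist, min_distance=min_distance,
                           labels=ndimage.label(binary)[0])
    markers = np.zeros(img.shape, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    return watershed(-dist, markers, mask=binary)  # one label per nucleus
```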

  4. Detecting Inhomogeneity in Daily Climate Series Using Wavelet Analysis

    Institute of Scientific and Technical Information of China (English)

    YAN Zhongwei; Phil D.JONES

    2008-01-01

    A wavelet method was applied to detect inhomogeneities in daily meteorological series, data which are being increasingly applied in studies of climate extremes. The wavelet method has been applied to a few well-established long-term daily temperature series extending back to the 18th century, which had been "homogenized" with conventional approaches. Various types of problems remaining in the series were revealed with the wavelet method, and their influences on analyses of change in climate extremes are discussed. The results are important for understanding issues in conventional climate data processing and for the development of improved methods of homogenization, in order to improve the analysis of climate extremes based on daily data.

  5. Detecting DNS Tunnels Using Character Frequency Analysis

    CERN Document Server

    Born, Kenton

    2010-01-01

    High-bandwidth covert channels pose significant risks to sensitive and proprietary information inside company networks. Domain Name System (DNS) tunnels provide a means to covertly infiltrate and exfiltrate large amounts of information past network boundaries. This paper explores the possibility of detecting DNS tunnels by analyzing the unigram, bigram, and trigram character frequencies of domains in DNS queries and responses. It is empirically shown how domains follow Zipf's law in a similar pattern to natural languages, whereas tunneled traffic has more evenly distributed character frequencies. This approach allows tunnels to be detected across multiple domains, whereas previous methods typically concentrate on monitoring point-to-point systems. Anomalies are quickly discovered when tunneled traffic is compared to the character frequency fingerprint of legitimate domain traffic.
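
    A minimal sketch of the character-frequency idea: build an n-gram frequency fingerprint from legitimate domains, then score a query domain by the mean negative log-probability of its n-grams; tunneled, high-entropy labels score noticeably higher. The smoothing floor is an assumption.

```python
from collections import Counter
import math

def char_ngrams(domain, n=2):
    """Character n-grams of a domain, dots removed, case-folded."""
    s = domain.lower().replace(".", "")
    return [s[i:i + n] for i in range(len(s) - n + 1)]

def fingerprint(domains, n=2):
    """Relative n-gram frequencies over a corpus of legitimate domains."""
    counts = Counter(g for d in domains for g in char_ngrams(d, n))
    total = sum(counts.values())
    return {g: c / total for g, c in counts.items()}

def anomaly_score(domain, fp, n=2, floor=1e-6):
    """Mean negative log-probability of the domain's n-grams under the
    fingerprint; unseen n-grams fall back to an assumed smoothing floor."""
    grams = char_ngrams(domain, n)
    return -sum(math.log(fp.get(g, floor)) for g in grams) / max(len(grams), 1)

fp = fingerprint(["google.com", "wikipedia.org", "example.com"])
print(anomaly_score("mail.example.com", fp))     # low: ordinary label
print(anomaly_score("xq7f9z2k.example.com", fp)) # high: tunnel-like label
```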

  6. Arc burst pattern analysis fault detection system

    Science.gov (United States)

    Russell, B. Don (Inventor); Aucoin, B. Michael (Inventor); Benner, Carl L. (Inventor)

    1997-01-01

    A method and apparatus are provided for detecting an arcing fault on a power line carrying a load current. Parameters indicative of power flow and possible fault events on the line, such as voltage and load current, are monitored and analyzed for an arc burst pattern exhibited by arcing faults in a power system. These arcing faults are detected by identifying bursts of each half-cycle of the fundamental current. Bursts occurring at or near a voltage peak indicate arcing on that phase. Once a faulted phase line is identified, a comparison of the current and voltage reveals whether the fault is located in a downstream direction of power flow toward customers, or upstream toward a generation station. If the fault is located downstream, the line is de-energized, and if located upstream, the line may remain energized to prevent unnecessary power outages.

  7. Performance Analysis of Cone Detection Algorithms

    CERN Document Server

    Mariotti, Letizia

    2015-01-01

    Many algorithms have been proposed to help clinicians evaluate cone density and spacing, as these may be related to the onset of retinal diseases. However, there has been no rigorous comparison of the performance of these algorithms. In addition, the performance of such algorithms is typically determined by comparison with human observers. Here we propose a technique to simulate realistic images of the cone mosaic. We use the simulated images to test the performance of two popular cone detection algorithms and we introduce an algorithm which is used by astronomers to detect stars in astronomical images. We use Free Response Operating Characteristic (FROC) curves to evaluate and compare the performance of the three algorithms. This allows us to optimize the performance of each algorithm. We observe that performance is significantly enhanced by up-sampling the images. We investigate the effect of noise and image quality on cone mosaic parameters estimated using the different algorithms, finding that the estimat...

  8. Analysis of rocket engine injection combustion processes

    Science.gov (United States)

    Salmon, J. W.

    1976-01-01

    A critique is given of the JANNAF sub-critical propellant injection/combustion process analysis computer models and the application of the models to the correlation of well-documented hot-fire engine databases. These programs are the distributed energy release (DER) model for conventional liquid propellant injectors and the coaxial injection combustion model (CICM) for gaseous annulus/liquid core coaxial injectors. The critique identifies model inconsistencies while the computer analyses provide quantitative data on predictive accuracy. The program is comprised of three tasks: (1) computer program review and operations; (2) analysis and data correlations; and (3) documentation.

  9. Detection of a novel, integrative aging process suggests complex physiological integration.

    Directory of Open Access Journals (Sweden)

    Alan A Cohen

    Full Text Available Many studies of aging examine biomarkers one at a time, but complex systems theory and network theory suggest that interpretations of individual markers may be context-dependent. Here, we attempted to detect underlying processes governing the levels of many biomarkers simultaneously by applying principal components analysis to 43 common clinical biomarkers measured longitudinally in 3694 humans from three longitudinal cohort studies on two continents (Women's Health and Aging I & II, InCHIANTI, and the Baltimore Longitudinal Study on Aging). The first axis was associated with anemia, inflammation, and low levels of calcium and albumin. The axis structure was precisely reproduced in all three populations and in all demographic sub-populations (by sex, race, etc.); we call the process represented by the axis "integrated albunemia." Integrated albunemia increases and accelerates with age in all populations, and predicts mortality and frailty--but not chronic disease--even after controlling for age. This suggests a role in the aging process, though causality is not yet clear. Integrated albunemia behaves more stably across populations than its component biomarkers, and thus appears to represent a higher-order physiological process emerging from the structure of underlying regulatory networks. If this is correct, detection of this process has substantial implications for physiological organization more generally.

  10. Stochastic Power Grid Analysis Considering Process Variations

    CERN Document Server

    Ghanta, Praveen; Panda, Rajendran; Wang, Janet

    2011-01-01

    In this paper, we investigate the impact of interconnect and device process variations on voltage fluctuations in power grids. We consider random variations in the power grid's electrical parameters as spatial stochastic processes and propose a new and efficient method to compute the stochastic voltage response of the power grid. Our approach provides an explicit analytical representation of the stochastic voltage response using orthogonal polynomials in a Hilbert space. The approach has been implemented in a prototype software called OPERA (Orthogonal Polynomial Expansions for Response Analysis). Use of OPERA on industrial power grids demonstrated speed-ups of up to two orders of magnitude. The results also show a significant variation of about $\pm$35% in the nominal voltage drops at various nodes of the power grids and demonstrate the need for variation-aware power grid analysis.

  11. Image corruption detection in diffusion tensor imaging for post-processing and real-time monitoring.

    Science.gov (United States)

    Li, Yue; Shea, Steven M; Lorenz, Christine H; Jiang, Hangyi; Chou, Ming-Chung; Mori, Susumu

    2013-01-01

    Due to the high sensitivity of diffusion tensor imaging (DTI) to physiological motion, clinical DTI scans often suffer a significant amount of artifacts. Tensor-fitting-based, post-processing outlier rejection is often used to reduce the influence of motion artifacts. Although it is an effective approach, when there are multiple corrupted data, this method may no longer correctly identify and reject the corrupted data. In this paper, we introduce a new criterion called "corrected Inter-Slice Intensity Discontinuity" (cISID) to detect motion-induced artifacts. We compared the performance of algorithms using cISID and other existing methods with regard to artifact detection. The experimental results show that the integration of cISID into fitting-based methods significantly improves the retrospective detection performance at post-processing analysis. The performance of the cISID criterion, if used alone, was inferior to the fitting-based methods, but cISID could effectively identify severely corrupted images with a rapid calculation time. In the second part of this paper, an outlier rejection scheme was implemented on a scanner for real-time monitoring of image quality and reacquisition of the corrupted data. The real-time monitoring, based on cISID and followed by post-processing, fitting-based outlier rejection, could provide a robust environment for routine DTI studies.

  13. Auto Landing Process for Autonomous Flying Robot by Using Image Processing Based on Edge Detection

    Directory of Open Access Journals (Sweden)

    Bahram Lavi Sefidgari

    2014-01-01

    Full Text Available In today’s technological life, everyone is quite familiar with the importance of security measures, and many attempts have been made by researchers in this regard; one of them is flying robot technology. One well-known usage of the flying robot is its capability in security and care measures, which makes this device extremely practical, not only for its unmanned movement, but also for its unique manoeuvrability during flight over arbitrary areas. In this research, the automatic landing of a flying robot is discussed. The system is based on frequent interrupts sent from the main microcontroller to the camera module in order to take images; these images are analysed by an image processing system based on edge detection, and after analysing an image the system can decide whether or not to land on the ground. Experiments show that this method performs well in terms of precision.
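
    As a rough illustration of the decision step described here (the paper's actual thresholds and camera interface are not given in the record; everything below is an assumed stand-in):

        # Minimal sketch of an edge-based landing check: grab a frame, run Canny
        # edge detection, and declare the area "clear" if the edge density is
        # low. Threshold and decision rule are assumptions, not the paper's.
        import cv2

        def safe_to_land(frame_bgr, edge_density_thresh=0.02):
            gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
            gray = cv2.GaussianBlur(gray, (5, 5), 0)      # suppress sensor noise
            edges = cv2.Canny(gray, 50, 150)
            density = (edges > 0).mean()                  # fraction of edge pixels
            return density < edge_density_thresh

        cap = cv2.VideoCapture(0)                         # camera module stand-in
        ok, frame = cap.read()
        if ok:
            print("land" if safe_to_land(frame) else "keep searching")
        cap.release()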

  14. Semiclassical analysis for diffusions and stochastic processes

    CERN Document Server

    Kolokoltsov, Vassili N

    2000-01-01

    The monograph is devoted mainly to the analytical study of the differential, pseudo-differential and stochastic evolution equations describing the transition probabilities of various Markov processes. These include (i) diffusions (in particular,degenerate diffusions), (ii) more general jump-diffusions, especially stable jump-diffusions driven by stable Lévy processes, (iii) complex stochastic Schrödinger equations which correspond to models of quantum open systems. The main results of the book concern the existence, two-sided estimates, path integral representation, and small time and semiclassical asymptotics for the Green functions (or fundamental solutions) of these equations, which represent the transition probability densities of the corresponding random process. The boundary value problem for Hamiltonian systems and some spectral asymptotics ar also discussed. Readers should have an elementary knowledge of probability, complex and functional analysis, and calculus.

  15. Extending TOPS: Ontology-driven Anomaly Detection and Analysis System

    Science.gov (United States)

    Votava, P.; Nemani, R. R.; Michaelis, A.

    2010-12-01

    Terrestrial Observation and Prediction System (TOPS) is a flexible modeling software system that integrates ecosystem models with frequent satellite and surface weather observations to produce ecosystem nowcasts (assessments of current conditions) and forecasts useful in natural resources management, public health and disaster management. We have been extending the Terrestrial Observation and Prediction System (TOPS) to include a capability for automated anomaly detection and analysis of both on-line (streaming) and off-line data. In order to best capture the knowledge about data hierarchies, Earth science models and implied dependencies between anomalies and occurrences of observable events such as urbanization, deforestation, or fires, we have developed an ontology to serve as a knowledge base. We can query the knowledge base and answer questions about dataset compatibilities, similarities and dependencies so that we can, for example, automatically analyze similar datasets in order to verify a given anomaly occurrence in multiple data sources. We are further extending the system to go beyond anomaly detection towards reasoning about possible causes of anomalies that are also encoded in the knowledge base as either learned or implied knowledge. This enables us to scale up the analysis by eliminating a large number of anomalies early on during the processing by either failure to verify them from other sources, or matching them directly with other observable events without having to perform an extensive and time-consuming exploration and analysis. The knowledge is captured using OWL ontology language, where connections are defined in a schema that is later extended by including specific instances of datasets and models. The information is stored using Sesame server and is accessible through both Java API and web services using SeRQL and SPARQL query languages. Inference is provided using OWLIM component integrated with Sesame.

  16. Application of signal processing techniques to the detection of tip vortex cavitation noise in marine propeller

    Institute of Scientific and Technical Information of China (English)

    LEE Jeung-Hoon; HAN Jae-Moon; PARK Hyung-Gil; SEO Jong-Soo

    2013-01-01

    The tip vortex cavitation and its related noise have been the subject of extensive research up to now. In most experimental approaches, the accurate and objective determination of cavitation inception is of primary importance, and it is the main topic of this paper. Although the conventional power spectrum is normally adopted as a signal processing tool for the analysis of cavitation noise, a faithful exploration cannot be made, especially for cavitation inception. Alternatively, the periodic occurrence of bursting noise induced by tip vortex cavitation gives diagnostic proof that the repetition frequency of the bursting content can be exploited as an indication of inception. This study therefore employed Short-Time Fourier Transform (STFT) analysis and Detection of Envelope Modulation On Noise (DEMON) spectrum analysis, both of which are appropriate for finding such a repetition frequency. Through acoustical measurements in a water tunnel, the two signal processing techniques show satisfactory results in detecting the inception of tip vortex cavitation.
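
    A compact sketch of a DEMON-style analysis, band-pass filtering followed by envelope extraction and an envelope spectrum, is given below; the sampling rate and band edges are assumptions rather than the authors' settings:

        # DEMON-style analysis: band-pass around the broadband bursting noise,
        # take the envelope via the Hilbert transform, and read the repetition
        # frequency off the envelope spectrum. Band and fs are assumptions.
        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        def demon_spectrum(x, fs, band=(10e3, 40e3)):
            b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
            env = np.abs(hilbert(filtfilt(b, a, x)))  # envelope of band-passed signal
            env -= env.mean()                         # remove DC before the FFT
            spec = np.abs(np.fft.rfft(env))
            return np.fft.rfftfreq(len(env), 1 / fs), spec

        # Synthetic check: 25 Hz burst repetition modulating broadband noise.
        fs = 100_000
        t = np.arange(fs) / fs
        x = (1 + np.cos(2 * np.pi * 25 * t)) * np.random.randn(fs)
        freqs, spec = demon_spectrum(x, fs)
        mask = (freqs > 5) & (freqs < 100)
        print(freqs[mask][np.argmax(spec[mask])])     # expect a peak near 25 Hz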

  17. Sound vibration signal processing for detection and identification detonation (knock) to optimize performance Otto engine

    Science.gov (United States)

    Sujono, A.; Santoso, B.; Juwana, W. E.

    2016-03-01

    Detonation (knock) in the Otto engine (petrol engine) remains a completely unresolved problem, especially when trying to improve performance. This research processed engine sound and vibration signals acquired with a microphone sensor for the detection and identification of detonation. The microphone does not need to be attached to the high-temperature cylinder block, so its performance is more stable, durable and inexpensive. However, the analysis method is not easy, because there is a lot of noise (interference). Therefore, new pattern recognition methods were used, based on filtering and a normalized-envelope regression function. The results are quite good, achieving a success rate of about 95%.

  18. A method for detecting damage to rolling bearings in toothed gears of processing lines

    Directory of Open Access Journals (Sweden)

    T. Figlus

    2015-10-01

    Full Text Available This paper presents a method of diagnosing damage to rolling bearings in toothed gears of processing lines. The research has shown the usefulness of vibration signal measurements performed with a laser vibrometer, combined with denoising by means of a discrete wavelet transform, in detecting damage to bearings. Analyzing the characteristic frequencies of changes in the vibration signal amplitude made it possible to draw conclusions about the type of damage to the bearings.
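
    A minimal sketch of the two steps named here, DWT denoising followed by inspection of characteristic frequencies, using PyWavelets with common defaults (the wavelet, level, and threshold rule are assumptions, not the paper's choices):

        # DWT denoising of a vibration signal, then inspection of the dominant
        # (characteristic) frequency in the amplitude spectrum.
        import numpy as np
        import pywt

        def wavelet_denoise(x, wavelet="db4", level=4):
            coeffs = pywt.wavedec(x, wavelet, level=level)
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise estimate
            thr = sigma * np.sqrt(2 * np.log(len(x)))            # universal threshold
            coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
            return pywt.waverec(coeffs, wavelet)[: len(x)]

        fs = 20_000
        t = np.arange(fs) / fs
        # Synthetic bearing-like tone at 157 Hz buried in heavy noise.
        x = np.sin(2 * np.pi * 157 * t) + 2.0 * np.random.randn(len(t))
        xd = wavelet_denoise(x)
        spec = np.abs(np.fft.rfft(xd))
        freqs = np.fft.rfftfreq(len(xd), 1 / fs)
        print(freqs[np.argmax(spec[1:]) + 1])                    # expect ~157 Hz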

  19. Space Applications for Ensemble Detection and Analysis Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Ensemble Detection is both a measurement technique and analysis tool. Like a prism that separates light into spectral bands, an ensemble detector mixes a signal with...

  20. Fine analysis on advanced detection of transient electromagnetic method

    Institute of Scientific and Technical Information of China (English)

    Wang Bo; Liu Shengdong; Yang Zhen; Wang Zhijun; Huang Lanying

    2012-01-01

    Fault fracture zones and water-bearing bodies in front of the driving head are the main sources of disasters in mine laneways, so it is important to detect and predict them in advance in order to provide reliable technical support for the excavation. Based on electromagnetic induction theory, we analyzed the characteristics of the primary and secondary fields with a positive and negative wave form of current, proposed a fine processing of the advanced detection based on the variation rate of apparent resistivity, and introduced in detail the computational formulae and procedures. The results of physical simulation experiments illustrate that the tectonic interface of modules can be judged by the first-order rate of apparent resistivity with a boundary error of 5%, and the position of the water body determined by the fine analysis method agrees well with the result of borehole drilling. This shows that, in terms of distinguishing structural and aqueous anomalies, the first-order rate of apparent resistivity is more sensitive than the second-order rate. However, some remaining problems are left for future work.

  1. Ultrasonic flaw detection using EMD-based signal processing

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    The precise detection of flaw echoes buried in backscattering noise caused by material microstructure is a problem of great importance in ultrasonic non-destructive testing (NDT). In this work, empirical mode decomposition (EMD) is proposed to deal with ultrasonic signals. A time-frequency filtering method based on EMD is designed to suppress noise and enhance flaw signals. Simulated results are presented, showing that the proposed method has an excellent performance even for a very low signal-to-noise ratio (SNR). The improvement in flaw detection was experimentally verified using a stainless steel pipe sample with artificial flaws.
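
    A hedged sketch of EMD-based filtering using the third-party PyEMD package (the IMF selection rule below is a simple heuristic, not the paper's filter design):

        # Decompose a noisy A-scan into intrinsic mode functions (IMFs) and
        # rebuild the signal from the modes assumed to carry the flaw-echo band.
        import numpy as np
        from PyEMD import EMD

        def emd_bandselect(x, keep=(1, 3)):
            """Keep IMFs keep[0]..keep[1] (IMF 0 = highest-frequency mode)."""
            imfs = EMD().emd(x)
            return imfs[keep[0] : keep[1] + 1].sum(axis=0)

        fs = 10_000
        t = np.arange(2000) / fs
        echo = np.exp(-((t - 0.1) ** 2) / 1e-5) * np.sin(2 * np.pi * 800 * t)
        x = echo + 0.5 * np.random.randn(len(t))    # flaw echo buried in noise
        print(emd_bandselect(x).shape)               # filtered trace, (2000,)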

  2. Fault detection and isolation in processes involving induction machines

    Energy Technology Data Exchange (ETDEWEB)

    Zell, K.; Medvedev, A. [Control Engineering Group, Luleaa University of Technology, Luleaa (Sweden)

    1997-12-31

    A model-based technique for fault detection and isolation in electro-mechanical systems comprising induction machines is introduced. Two coupled state observers, one for the induction machine and another for the mechanical load, are used to detect and recognize fault-specific behaviors (fault signatures) from the real-time measurements of the rotor angular velocity and terminal voltages and currents. Practical applicability of the method is verified in full-scale experiments with a conveyor belt drive at SSAB, Luleaa Works. (orig.) 3 refs.

  3. Detection of Rice Leaf Diseases Using Chaos and Fractal Dimension in Image Processing

    Directory of Open Access Journals (Sweden)

    V.Surendrababu

    2014-01-01

    Full Text Available A novel method for detecting rice leaf disease using image processing techniques, namely fractal dimension and chaos theory, is proposed in this paper. The analysis of a diseased leaf is carried out according to its image pattern and fractal dimension; in particular, box-counting calculation and chaos theory are applied to identify the self-similarity of the disease pattern and to recreate the fractal. The infected region of the image exhibits the same self-similarity as a fully infected leaf. This method is proposed as preliminary information for the development of an early detection system, or for developing a knowledge-based expert system or decision support system.
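
    A minimal box-counting sketch on a binary lesion mask (the segmentation that produces the mask and the box sizes are assumptions):

        # Box-counting estimate of the fractal dimension of a binary mask:
        # count occupied boxes at several box sizes and fit a log-log slope.
        import numpy as np

        def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
            counts = []
            for s in sizes:
                # trim so the image tiles exactly into s x s boxes
                h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s
                boxes = mask[:h, :w].reshape(h // s, s, w // s, s)
                counts.append(boxes.any(axis=(1, 3)).sum())   # boxes touching lesion
            slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
            return slope

        rng = np.random.default_rng(1)
        mask = np.zeros((256, 256), bool)
        pts = rng.integers(0, 256, size=(4000, 2))
        mask[pts[:, 0], pts[:, 1]] = True           # stand-in for an infected region
        print(box_counting_dimension(mask))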

  4. Efficient signal processing for time-resolved fluorescence detection of nitrogen-vacancy spins in diamond

    Science.gov (United States)

    Gupta, A.; Hacquebard, L.; Childress, L.

    2016-03-01

    Room-temperature fluorescence detection of the nitrogen-vacancy center electronic spin typically has low signal to noise, requiring long experiments to reveal an averaged signal. Here, we present a simple approach to analysis of time-resolved fluorescence data that permits an improvement in measurement precision through signal processing alone. Applying our technique to experimental data reveals an improvement in signal to noise equivalent to a 14% increase in photon collection efficiency. We further explore the dependence of the signal to noise ratio on excitation power, and analyze our results using a rate equation model. Our results provide a rubric for optimizing fluorescence spin detection, which has direct implications for improving precision of nitrogen-vacancy-based sensors.

  5. [Progress on detection and analysis method of endocrine disrupting compounds].

    Science.gov (United States)

    Du, Hui-Fang; Yan, Hui-Fang

    2005-07-01

    EDCs are a new generation of environmental pollutants of global concern. They may cause adverse effects mainly to the endocrine and nervous systems, among others. To accurately assess the hazard of EDCs to health, we should know the distribution and levels of EDCs in the environment. In this paper, the techniques of pretreatment in different matrices and the methods of detection and analysis of EDCs are reviewed, and future prospects for the study of detection and analysis methods are also discussed.

  6. Automatic basal slice detection for cardiac analysis

    Science.gov (United States)

    Paknezhad, Mahsa; Marchesseau, Stephanie; Brown, Michael S.

    2016-03-01

    Identification of the basal slice in cardiac imaging is a key step to measuring the ejection fraction (EF) of the left ventricle (LV). Despite research on cardiac segmentation, basal slice identification is routinely performed manually. Manual identification, however, has been shown to have high inter-observer variability, with a variation of the EF by up to 8%. Therefore, an automatic way of identifying the basal slice is still required. Prior published methods operate by automatically tracking the mitral valve points from the long-axis view of the LV. These approaches assumed that the basal slice is the first short-axis slice below the mitral valve. However, guidelines published in 2013 by the Society for Cardiovascular Magnetic Resonance indicate that the basal slice is the uppermost short-axis slice with more than 50% myocardium surrounding the blood cavity. Consequently, these existing methods are at times identifying the incorrect short-axis slice. Correct identification of the basal slice under these guidelines is challenging due to the poor image quality and blood movement during image acquisition. This paper proposes an automatic tool that focuses on the two-chamber slice to find the basal slice. To this end, an active shape model is trained to automatically segment the two-chamber view for 51 samples using the leave-one-out strategy. The basal slice was detected using temporal binary profiles created for each short-axis slice from the segmented two-chamber slice. From the 51 successfully tested samples, 92% and 84% of detection results were accurate at the end-systolic and the end-diastolic phases of the cardiac cycle, respectively.

  7. Fault detection in processes represented by PLS models using an EWMA control scheme

    KAUST Repository

    Harrou, Fouzi

    2016-10-20

    Fault detection is important for effective and safe process operation. Partial least squares (PLS) has been used successfully in fault detection for multivariate processes with highly correlated variables. However, the conventional PLS-based detection metrics, such as Hotelling's T and the Q statistics, are not well suited to detect small faults because they only use information about the process in the most recent observation. The exponentially weighted moving average (EWMA), however, has been shown to be more sensitive to small shifts in the mean of process variables. In this paper, a PLS-based EWMA fault detection method is proposed for monitoring processes represented by PLS models. The performance of the proposed method is compared with that of the traditional PLS-based fault detection method through a simulated example involving various fault scenarios that could be encountered in real processes. The simulation results clearly show the effectiveness of the proposed method over the conventional PLS method.
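
    The idea can be sketched with scikit-learn's PLS and a textbook EWMA chart on the output residuals; the smoothing constant and control-limit width are standard defaults, not the paper's tuned values:

        # Fit a PLS model on normal data, then run an EWMA chart on the output
        # residuals to flag small mean shifts.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        X_train = rng.normal(size=(500, 10))
        y_train = X_train[:, :3].sum(axis=1) + 0.1 * rng.normal(size=500)
        pls = PLSRegression(n_components=3).fit(X_train, y_train)

        def ewma_alarms(residuals, lam=0.2, L=3.0):
            sigma = residuals.std()
            limit = L * sigma * np.sqrt(lam / (2 - lam))   # asymptotic EWMA limit
            z, alarms = 0.0, []
            for i, r in enumerate(residuals):
                z = lam * r + (1 - lam) * z
                if abs(z) > limit:
                    alarms.append(i)
            return alarms

        # New data with a small mean shift injected halfway through.
        X_new = rng.normal(size=(200, 10))
        y_new = X_new[:, :3].sum(axis=1) + 0.1 * rng.normal(size=200)
        y_new[100:] += 0.3                                  # small fault
        res = y_new - pls.predict(X_new).ravel()
        print(ewma_alarms(res)[:5])                         # first alarms, index > ~100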

  8. Automated analysis for lifecycle assembly processes

    Energy Technology Data Exchange (ETDEWEB)

    Calton, T.L.; Brown, R.G.; Peters, R.R.

    1998-05-01

    Many manufacturing companies today expend more effort on upgrade and disposal projects than on clean-slate design, and this trend is expected to become more prevalent in coming years. However, commercial CAD tools are better suited to initial product design than to the product's full life cycle. Computer-aided analysis, optimization, and visualization of life cycle assembly processes based on the product CAD data can help ensure accuracy and reduce the effort expended in planning these processes for existing products, as well as provide design-for-lifecycle analysis for new designs. To be effective, computer-aided assembly planning systems must allow users to express the plan selection criteria that apply to their companies and products as well as to the life cycles of their products. Designing products for easy assembly and disassembly during their entire life cycle, for purposes including service, field repair, upgrade, and disposal, is a process that involves many disciplines. In addition, finding the best solution often involves considering the design as a whole and considering its intended life cycle. Different goals and constraints (compared to initial assembly) require one to revisit the significant fundamental assumptions and methods that underlie current assembly planning techniques. Previous work in this area has been limited to either academic studies of issues in assembly planning or applied studies of life cycle assembly processes, which give no attention to automatic planning. It is believed that merging these two areas will result in a much greater ability to design for, optimize, and analyze life cycle assembly processes.

  9. Physical Meaning of the Optimum Measurement Process in Quantum Detection Theory

    Science.gov (United States)

    Osaki, Masao; Kozuka, Haruhisa; Hirota, Osamu

    1996-01-01

    The optimum measurement processes are represented as the optimum detection operators in quantum detection theory. The error probability attained by the optimum detection operators automatically goes beyond the standard quantum limit. However, the optimum detection operators are given only as mathematical descriptions. In order to realize a communication system overcoming the standard quantum limit, we try to give a physical meaning to the optimum detection operators.
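
    For context, the standard benchmark behind this discussion, the Helstrom bound for discriminating two states (a well-known result of quantum detection theory, stated here rather than quoted from the record), reads:

        P_e^{\mathrm{opt}} \;=\; \frac{1}{2}\Bigl(1 - \bigl\lVert\, p_1\rho_1 - p_0\rho_0 \,\bigr\rVert_1\Bigr),
        \qquad \lVert A \rVert_1 = \operatorname{Tr}\sqrt{A^{\dagger}A},

    where rho_0, rho_1 are the two states with prior probabilities p_0, p_1, and the optimum detection operators are the projectors onto the positive and negative eigenspaces of p_1 rho_1 - p_0 rho_0.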

  10. Detection limit for rate fluctuations in inhomogeneous Poisson processes

    Science.gov (United States)

    Shintani, Toshiaki; Shinomoto, Shigeru

    2012-04-01

    Estimations of an underlying rate from data points are inevitably disturbed by the irregular occurrence of events. Proper estimation methods are designed to avoid overfitting by discounting the irregular occurrence of data, and to determine a constant rate from irregular data derived from a constant probability distribution. However, it can occur that rapid or small fluctuations in the underlying density are undetectable when the data are sparse. For an estimation method, the maximum degree of undetectable rate fluctuations is uniquely determined as a phase transition, when considering an infinitely long series of events drawn from a fluctuating density. In this study, we analytically examine an optimized histogram and a Bayesian rate estimator with respect to their detectability of rate fluctuation, and determine whether their detectable-undetectable phase transition points are given by an identical formula defining a degree of fluctuation in an underlying rate. In addition, we numerically examine the variational Bayes hidden Markov model in its detectability of rate fluctuation, and determine whether the numerically obtained transition point is comparable to those of the other two methods. Such consistency among these three principled methods suggests the presence of a theoretical limit for detecting rate fluctuations.

  11. Protecting Students' Intellectual Property in the Web Plagiarism Detection Process

    Science.gov (United States)

    Butakov, Sergey; Dyagilev, Vadim; Tskhay, Alexander

    2012-01-01

    Learning management systems (LMS) play a central role in communications in online and distance education. In the digital era, with all the information now accessible at students' fingertips, plagiarism detection services (PDS) have become a must-have part of LMS. Such integration provides a seamless experience for users, allowing PDS to check…

  12. Protecting Student Intellectual Property in Plagiarism Detection Process

    Science.gov (United States)

    Butakov, Sergey; Barber, Craig

    2012-01-01

    The rapid development of the Internet along with increasing computer literacy has made it easy and tempting for digital natives to copy-paste someone's work. Plagiarism is now a burning issue in education, industry and even in the research community. In this study, the authors concentrate on plagiarism detection with particular focus on the…

  13. Kernel principal component analysis for change detection

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Morton, J.C.

    2008-01-01

    region acquired at two different time points. If change over time does not dominate the scene, the projection of the original two bands onto the second eigenvector will show change over time. In this paper a kernel version of PCA is used to carry out the analysis. Unlike ordinary PCA, kernel PCA … with a Gaussian kernel successfully finds the change observations in a case where nonlinearities are introduced artificially…
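
    A small scikit-learn sketch of the setup the abstract describes, two co-registered acquisitions treated as 2-D observations and analyzed with Gaussian-kernel kernel PCA (the kernel width and the component read out as "change" are assumptions):

        # Kernel PCA change detection on a bi-temporal band pair: stack the two
        # acquisitions as 2-D observations and read change off a higher-order
        # component.
        import numpy as np
        from sklearn.decomposition import KernelPCA

        rng = np.random.default_rng(0)
        t1 = rng.normal(size=1500)                    # band at time 1
        t2 = 0.9 * t1 + 0.1 * rng.normal(size=1500)   # mostly unchanged at time 2
        t2[:100] += 2.0                               # a patch of real change
        X = np.column_stack([t1, t2])

        kpca = KernelPCA(n_components=2, kernel="rbf", gamma=0.5)
        proj = kpca.fit_transform(X)
        change_score = np.abs(proj[:, 1])             # 2nd component ~ change
        print(change_score[:100].mean(), change_score[100:].mean())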

  14. Preliminary Hazards Analysis Plasma Hearth Process

    Energy Technology Data Exchange (ETDEWEB)

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1993-11-01

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standard DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. The PSAR in turn leads to the Final Safety Analysis Report, performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts indicate no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment.

  15. Detection of Porphyromonas gingivalis from Saliva by PCR by Using a Simple Sample-Processing Method

    OpenAIRE

    Mättö, Jaana; Saarela, Maria; Alaluusua, Satu; Oja, Virva; Jousimies-Somer, Hannele; Asikainen, Sirkka

    1998-01-01

    Simple sample-processing methods for PCR detection of Porphyromonas gingivalis, a major pathogen causing adult periodontitis, from saliva were studied. The ability to detect P. gingivalis from 118 salivary samples by PCR after boiling and Chelex 100 processing was compared with bacterial culture. P. gingivalis was detected three times more often by PCR than by culture. Chelex 100 processing of saliva proved to be effective in preventing PCR inhibition and was applied to determine the occurren...

  16. Microbiological Analysis of Rice Cake Processing in Korea.

    Science.gov (United States)

    Wang, Jun; Park, Joong-Hyun; Choi, Na-Jung; Ha, Sang-Do; Oh, Deog-Hwan

    2016-01-01

    This study was conducted to evaluate the microbial contamination in rice cake materials and products during processing, and in the operating environment, in non-hazard analysis and critical control point (non-HACCP) factories. Furthermore, the environmental health of the processing facilities and the bacterial and fungal contamination on the workers' hands were investigated. Pour plate methods were used for enumeration of aerobic plate count (APC), yeast and molds (YM), Bacillus cereus, Staphylococcus aureus, and Clostridium perfringens, whereas Petrifilm count plates were used for enumeration of coliforms and Escherichia coli. The respective microbial levels of APC, coliforms, YM, and B. cereus were in the range of 2.6 to 4.7, 1.0 to 3.8, not detected (ND) to 2.9, and ND to 2.8 log CFU/g in the raw materials, and in the range of 2.3 to 6.2, ND to 3.6, ND to 2.7, and ND to 3.7 log CFU/g during processing of the rice cake products. During the processing of rice cakes, APC, coliforms, YM, and B. cereus increased during soaking and smashing treatments and decreased after steaming treatment. E. coli, S. aureus, and C. perfringens were not detected in any of the raw materials and operating areas or during processing. B. cereus was detected on the operators' hands at contamination levels of 1.9 ± 0.19 to 2.0 ± 0.19 log CFU/g. The results showed that B. cereus in the end product is presumably the main concern for rice cakes. In addition, the high contamination level of B. cereus during manufacturing processes, including soaking, smashing, and molding, and the absence of B. cereus from the air sampling plates indicated that contaminated equipment poses a potential risk of cross-contamination.

  17. Detecting Anomaly Regions in Satellite Image Time Series Based on Seasonal Autocorrelation Analysis

    Science.gov (United States)

    Zhou, Z.-G.; Tang, P.; Zhou, M.

    2016-06-01

    Anomaly regions in satellite images can reflect unexpected changes of land cover caused by flood, fire, landslide, etc. Detecting anomaly regions in satellite image time series is important for studying the dynamic processes of land cover changes as well as for disaster monitoring. Although several methods have been developed to detect land cover changes using satellite image time series, they are generally designed for detecting inter-annual or abrupt land cover changes and do not focus on detecting spatial-temporal changes in continuous images. In order to identify spatial-temporal dynamic processes of unexpected changes of land cover, this study proposes a method for detecting anomaly regions in each image of a satellite image time series based on seasonal autocorrelation analysis. The method was validated with a case study detecting the spatial-temporal process of a severe flood using Terra/MODIS image time series. Experiments demonstrated the advantages of the method: (1) it can effectively detect anomaly regions in each image of a satellite time series, showing the spatial-temporal varying process of the anomaly regions; (2) it is flexible enough to meet requirements (e.g., z-value or significance level) on detection accuracy, with overall accuracy up to 89% and precision above 90%; and (3) it does not need time series smoothing and can detect anomaly regions in noisy satellite images with high reliability.
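
    A minimal per-pixel sketch of the seasonal idea, comparing each observation with the statistics of the same season in other years and thresholding the z-value; the array layout and threshold are assumptions, and the paper's autocorrelation machinery is not reproduced:

        # Per-pixel seasonal anomaly sketch using robust (median/MAD) seasonal
        # statistics so a single anomalous year does not mask itself.
        import numpy as np

        def seasonal_anomalies(cube, z_thresh=3.0):
            """cube: (n_years, n_seasons, ny, nx) stack of images."""
            mu = np.median(cube, axis=0, keepdims=True)          # seasonal expectation
            sd = 1.4826 * np.median(np.abs(cube - mu), axis=0,
                                    keepdims=True) + 1e-12       # robust scale
            return np.abs((cube - mu) / sd) > z_thresh           # anomaly mask

        cube = np.random.default_rng(2).normal(size=(10, 46, 32, 32))
        cube[7, 20, 5:15, 5:15] += 8.0                           # simulated flood patch
        mask = seasonal_anomalies(cube)
        print(mask[7, 20].sum())                                 # ~100 anomalous pixels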

  19. Multiuser detection and independent component analysis-Progress and perspective

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The latest progress in multiuser detection and independent component analysis (ICA) is reviewed systematically. Then two novel classes of multiuser detection methods based on ICA algorithms and feedforward neural networks are proposed. Theoretical analysis and computer simulation show that ICA algorithms are effective for detecting multiuser signals in a code-division multiple-access (CDMA) system. The performances of these methods are not entirely identical across various channels, but all of them are robust, efficient, fast and suitable for real-time implementation.
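
    A toy sketch of the ICA side of this idea using scikit-learn's FastICA on synthetically mixed antipodal symbol streams (the channel model and user count are assumptions; ICA recovers sources only up to order and sign):

        # Mix a few users' BPSK symbol streams with an unknown channel matrix
        # and let FastICA separate them.
        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        S = rng.choice([-1.0, 1.0], size=(2000, 3))      # 3 users' symbols
        A = rng.normal(size=(3, 3))                      # unknown mixing (channel)
        X = S @ A.T + 0.05 * rng.normal(size=(2000, 3))  # received mixture + noise

        ica = FastICA(n_components=3, random_state=0)
        S_hat = ica.fit_transform(X)
        bits = np.sign(S_hat)          # symbol decisions, up to sign/order ambiguity
        print(bits.shape)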

  20. SCALABLE TIME SERIES CHANGE DETECTION FOR BIOMASS MONITORING USING GAUSSIAN PROCESS

    Data.gov (United States)

    National Aeronautics and Space Administration — SCALABLE TIME SERIES CHANGE DETECTION FOR BIOMASS MONITORING USING GAUSSIAN PROCESS VARUN CHANDOLA AND RANGA RAJU VATSAVAI Abstract. Biomass monitoring,...

  1. Real-time Forward Vehicle Detection Method Based on Edge Analysis

    Institute of Scientific and Technical Information of China (English)

    Young-suk JI; Hwan-ik CHUNG; Hern-soo HAHN

    2010-01-01

    This paper proposes a method which uses extended edge analysis to supplement inaccurate edge information for better vehicle detection. The extended edge analysis method detects the two vertical edges forming the borderlines of both sides of the vehicle by extending the horizontal edges, which are obtained inaccurately due to illumination or noise in the image. The proposed method extracts the horizontal edges by merging edges, using the horizontal edge information inside the Region of Interest (ROI) that is set up in the pre-processing step. The bottom line of the vehicle is determined by detecting the vehicle's shadow regions from the extracted horizontal edges. General vehicle-width detection and the extended edge analysis are then carried out side by side on the bottom line to determine the width of the vehicle. Finally, the vehicle is confirmed through a verification step. On road images with complicated backgrounds, the vehicle detection method based on extended edge analysis is more efficient than existing edge-based vehicle detection methods. The effectiveness of the proposed method is confirmed by vehicle detection experiments on complicated road images.

  2. Mathematical Analysis and Optimization of Infiltration Processes

    Science.gov (United States)

    Chang, H.-C.; Gottlieb, D.; Marion, M.; Sheldon, B. W.

    1997-01-01

    A variety of infiltration techniques can be used to fabricate solid materials, particularly composites. In general these processes can be described with at least one time dependent partial differential equation describing the evolution of the solid phase, coupled to one or more partial differential equations describing mass transport through a porous structure. This paper presents a detailed mathematical analysis of a relatively simple set of equations which is used to describe chemical vapor infiltration. The results demonstrate that the process is controlled by only two parameters, alpha and beta. The optimization problem associated with minimizing the infiltration time is also considered. Allowing alpha and beta to vary with time leads to significant reductions in the infiltration time, compared with the conventional case where alpha and beta are treated as constants.

  3. Detection of crossover time scales in multifractal detrended fluctuation analysis

    Science.gov (United States)

    Ge, Erjia; Leung, Yee

    2013-04-01

    Fractal analysis is employed in this paper as a scale-based method for the identification of the scaling behavior of time series. Many spatial and temporal processes exhibiting complex multi(mono)-scaling behaviors are fractals. One of the important concepts in fractals is the crossover time scale(s) that separates distinct regimes having different fractal scaling behaviors. A common method is multifractal detrended fluctuation analysis (MF-DFA). The detection of crossover time scale(s) is, however, relatively subjective, since it has been made without rigorous statistical procedures and has generally been determined by eye-balling or subjective observation. Crossover time scales so determined may be spurious and problematic, and may not reflect the genuine underlying scaling behavior of a time series. The purpose of this paper is to propose a statistical procedure to model complex fractal scaling behaviors and reliably identify the crossover time scales under MF-DFA. The scaling-identification regression model, grounded on a solid statistical foundation, is first proposed to describe the multi-scaling behaviors of fractals. Through the regression analysis and statistical inference, we can (1) identify the crossover time scales that cannot be detected by eye-balling observation, (2) determine the number and locations of the genuine crossover time scales, (3) give confidence intervals for the crossover time scales, and (4) establish the statistically significant regression model depicting the underlying scaling behavior of a time series. To substantiate our argument, the regression model is applied to analyze the multi-scaling behaviors of avian-influenza outbreaks, water consumption, daily mean temperature, and rainfall of Hong Kong. Through the proposed model, we can have a deeper understanding of fractals in general and a statistical approach to identify multi-scaling behavior under MF-DFA in particular.
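
    The statistical idea of replacing eye-balling with regression can be sketched as a two-segment ("broken-stick") fit over candidate breakpoints; the fluctuation values below are synthetic stand-ins for MF-DFA output, and the paper's full inference procedure (confidence intervals, multiple crossovers) is not reproduced:

        # Fit the log-log fluctuation curve with two line segments and pick the
        # breakpoint that minimizes the residual sum of squares.
        import numpy as np

        def find_crossover(log_s, log_F):
            best = (np.inf, None)
            for i in range(3, len(log_s) - 3):          # >= 3 points per segment
                sse = 0.0
                for seg in (slice(None, i), slice(i, None)):
                    coef = np.polyfit(log_s[seg], log_F[seg], 1)
                    sse += np.sum((log_F[seg] - np.polyval(coef, log_s[seg])) ** 2)
                if sse < best[0]:
                    best = (sse, i)
            return log_s[best[1]]                        # crossover scale (log units)

        log_s = np.linspace(1, 4, 30)
        # Synthetic two-regime scaling: slope 1.2 below the crossover, 0.5 above.
        log_F = np.where(log_s < 2.5, 1.2 * log_s, 0.5 * log_s + 0.7 * 2.5)
        log_F += 0.02 * np.random.default_rng(3).normal(size=30)
        print(find_crossover(log_s, log_F))              # expect ~2.5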

  4. Data analysis of inertial sensor for train positioning detection system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seong Jin; Park, Sung Soo; Lee, Jae Ho; Kang, Dong Hoon [Korea Railroad Research Institute, Uiwang (Korea, Republic of)

    2015-02-15

    Train positioning detection information is fundamental for high-speed railroad inspection, making it possible to simultaneously determine the status and evaluate the integrity of railroad equipment. This paper presents the results of measurements and an analysis of an inertial measurement unit (IMU) used as a positioning detection sensor. Acceleration and angular rate measurements from the IMU were analyzed in the amplitude and frequency domains, with a discussion of vibration and train motions. Using these results and GPS information, positioning detection of a Korean tilting train express was performed from Naju station to Illo station on the Honam line. The results of a synchronized analysis of sensor measurements and train motion can help in the design of a train location detection system and improve positioning detection performance.

  5. Real-time Microseismic Processing for Induced Seismicity Hazard Detection

    Energy Technology Data Exchange (ETDEWEB)

    Matzel, Eric M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-10-31

    Induced seismicity is inherently associated with underground fluid injections. If fluids are injected in proximity to a pre-existing fault or fracture system, the resulting elevated pressures can trigger dynamic earthquake slip, which could both damage surface structures and create new migration pathways. The goal of this research is to develop a fundamentally better approach to geological site characterization and early hazard detection. We combine innovative techniques for analyzing microseismic data with a physics-based inversion model to forecast microseismic cloud evolution. The key challenge is that faults at risk of slipping are often too small to detect during the site characterization phase. Our objective is to devise fast-running methodologies that will allow field operators to respond quickly to changing subsurface conditions.

  6. Across frequency processes involved in auditory detection of coloration

    DEFF Research Database (Denmark)

    Buchholz, Jörg; Kerketsos, P

    2008-01-01

    When an early wall reflection is added to a direct sound, a spectral modulation is introduced to the signal's power spectrum. This spectral modulation typically produces an auditory sensation of coloration or pitch. Throughout this study, auditory spectral-integration effects involved in coloration detection are investigated. Coloration detection thresholds were therefore measured as a function of reflection delay and stimulus bandwidth. In order to investigate the involved auditory mechanisms, an auditory model was employed that was conceptually similar to the peripheral weighting model [Yost, JASA, 1982, 416-425]. When a “classical” gammatone filterbank was applied within this spectrum-based model, the model largely underestimated human performance at high signal frequencies. However, this limitation could be resolved by employing an auditory filterbank with narrower filters. This novel…

  7. Sociolinguistically Informed Natural Language Processing: Automating Irony Detection

    Science.gov (United States)

    2015-04-13

    representation for verbal irony detection. Indeed, sociolinguistic theories of verbal irony imply that a… In contrast to most text classification problems, word counts and syntactic features alone do not constitute an adequate representation for verbal irony.

  8. Signal processing of Shiley heart valve data for fracture detection

    Energy Technology Data Exchange (ETDEWEB)

    Mullenhoff, C.

    1993-04-01

    Given digital acoustic data from the heart sounds of the beating heart, measured from laboratory sheep with implanted Bjoerk-Shiley Convexo-Concave heart valves, it is possible to detect and extract the opening and closing beats from the data. Once extracted, spectral or other information can then be obtained from the heartbeats and passed on to feature extraction algorithms, neural networks, or pattern recognizers, so that the valve condition, either fractured or intact, may be determined.

  9. Deterring digital plagiarism, how effective is the digital detection process?

    Directory of Open Access Journals (Sweden)

    Jayati Chaudhuri

    2008-03-01

    Full Text Available Academic dishonesty or plagiarism is a growing problem in today's digital world. The use of plagiarism detection tools can assist faculty in combating this form of academic dishonesty. In this article, special emphasis is given to the text-matching software SafeAssignment™. The advantages and disadvantages of using automated text-matching software are discussed and analyzed in detail.

  10. THE ANALYSIS OF DETECTIVE GENRE IN MEDIA STUDIES IN THE STUDENT AUDIENCE

    Directory of Open Access Journals (Sweden)

    Alexander Fedorov

    2011-11-01

    Full Text Available The development of skills for the critical analysis of media texts is an important task of media education. However, media literacy practice shows that students have problems with the discussion and analysis of entertainment genres in the early stages of media studies, for example, difficulties in understanding and interpreting the author's conception and the plot and genre features. This article substantiates methodological approaches to developing skills for analyzing the detective/thriller genre in media studies with student audiences.

  11. Information Design for “Weak Signal” detection and processing in Economic Intelligence: A case study on Health resources

    Directory of Open Access Journals (Sweden)

    Sahbi Sidhom

    2011-12-01

    Full Text Available The topics of this research cover all phases of “Information Design” applied to detecting and profiting from weak signals in economic intelligence (EI) or business intelligence (BI). The field of information design (ID) applies to the process of translating complex, unorganized or unstructured data into valuable and meaningful information. ID practice requires an interdisciplinary approach, combining skills in graphic design (writing, analysis processing and editing), human performance technology and human factors. Applied in the context of an information system, it allows end-users to easily detect implicit topics known as “weak signals” (WS). In our approach to implementing ID, the processes cover the development of a knowledge management (KM) process in the context of EI. A case study concerning the monitoring of health information resources is presented, using ID processes to outline weak signals. Both French and American bibliographic databases were used to make the connection to multilingual concepts in the health watch process.

  12. Detection and quantification of flow consistency in business process models

    DEFF Research Database (Denmark)

    Burattin, Andrea; Bernstein, Vered; Neurauter, Manuel

    2017-01-01

    Business process models abstract complex business processes by representing them as graphical models. Their layout, as determined by the modeler, may have an effect when these models are used. However, this effect is currently not fully understood. In order to systematically study this effect, a basic set of measurable key visual features is proposed, depicting the layout properties that are meaningful to the human user. The aim of this research is thus twofold: first, to empirically identify key visual features of business process models which are perceived as meaningful to the user and second, to show how such features can be quantified into computational metrics, which are applicable to business process models. We focus on one particular feature, consistency of flow direction, and show the challenges that arise when transforming it into a precise metric. We propose three different metrics…

  13. Integrated Process Design, Control and Analysis of Intensified Chemical Processes

    DEFF Research Database (Denmark)

    Mansouri, Seyed Soheil

    Process design and process control have been considered as independent problems for many years. In this context, a sequential approach is used where the process is designed first, followed by the control design. However, this sequential approach has its limitations related to dynamic constraint violations, for example, infeasible operating points, process overdesign or under-performance. Therefore, by using this approach, a robust performance is not always guaranteed. Furthermore, process design decisions can influence process control and operation. To overcome these limitations, an alternative… chemical processes; for example, intensified processes such as reactive distillation. Most importantly, it identifies and eliminates potentially promising design alternatives that may have controllability problems later. To date, a number of methodologies have been proposed and applied on various problems…

  14. Defect source analysis of directed self-assembly process

    Science.gov (United States)

    Delgadillo, Paulina Rincon; Suri, Mayur; Durant, Stephane; Cross, Andrew; Nagaswami, Venkat R.; Heuvel, Dieter Van Den; Gronheid, Roel; Nealey, Paul

    2013-07-01

    As design rules shrink, it is essential that the capability to detect smaller and smaller defects improves. There is considerable effort in the industry to enhance immersion lithography using directed self-assembly (DSA) for the 14-nm design node and below. While process feasibility has been demonstrated with DSA, material issues as well as process control requirements are not fully characterized. The chemical epitaxy process is currently the most-preferred process option for frequency multiplication, and it involves new materials at extremely small thicknesses. The image contrast of the lamellar line/space pattern at such small layer thicknesses is a new challenge for optical inspection tools. The study focuses on the capability of optical inspection systems to capture DSA-unique defects, such as dislocations and disclination clusters, over the system and wafer noise. The study also investigates wafer-level data at multiple process steps to determine the contribution from each process step and material using a defect source analysis methodology. The added-defect Pareto and the spatial distributions of added defects at each process step are discussed.

  15. Auxetic polyurethane foam: Manufacturing and processing analysis

    Science.gov (United States)

    Jahan, Md Deloyer

    experimental design approach to identify significant processing parameters, followed by optimization of those processing parameters in the fabrication of auxetic PU foam. A split-plot factorial design was selected for screening purposes. Response Surface Methodology (RSM) was utilized to optimize the processing parameters in the fabrication of auxetic PU foam. Two different designs, named Box-Behnken and I-optimal, were employed for this analysis. The results obtained by those designs show that the I-optimal design provides more accurate and realistic results than the Box-Behnken design when experiments are performed in a split-plot manner. Finally, a near-stationary ridge system is obtained by the optimization analysis. As a result, a set of operating conditions is obtained that produces a similar minimum Poisson's ratio in the auxetic PU foam.

  16. People detection in nuclear plants by video processing for safety purpose

    Energy Technology Data Exchange (ETDEWEB)

    Jorge, Carlos Alexandre F.; Mol, Antonio Carlos A., E-mail: calexandre@ien.gov.b, E-mail: mol@ien.gov.b [Instituto de Engenharia Nuclear (IEN/CNEN), Rio de Janeiro, RJ (Brazil); Seixas, Jose M.; Silva, Eduardo Antonio B., E-mail: seixas@lps.ufrj.b, E-mail: eduardo@lps.ufrj.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Eletrica; Cota, Raphael E.; Ramos, Bruno L., E-mail: brunolange@poli.ufrj.b [Universidade Federal do Rio de Janeiro (EP/UFRJ), RJ (Brazil). Dept. de Engenharia Eletronica e de Computacao

    2011-07-01

    This work describes the development of a surveillance system for safety purposes in nuclear plants. The final objective is to track people online in videos, in order to estimate the dose received by personnel during the execution of working tasks in nuclear plants. The estimation will be based on their tracked positions and on dose rate mapping in a real nuclear plant at Instituto de Engenharia Nuclear, the Argonauta nuclear research reactor. Cameras have been installed within Argonauta's room, supplying the data needed. Both video processing and statistical signal processing techniques may be used for detection, segmentation and tracking of people in video. This first paper reports people segmentation in video using background subtraction, by two different approaches, namely frame differences, and blind signal separation based on the independent component analysis method. Results are discussed, along with perspectives for further work. (author)
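
    A minimal OpenCV sketch of the frame-difference branch of the segmentation (the threshold and morphology settings are assumptions):

        # Threshold the absolute difference between the current frame and a
        # reference (background) frame, then clean the mask morphologically.
        import cv2
        import numpy as np

        def segment_people(background_gray, frame_gray, thresh=25):
            diff = cv2.absdiff(frame_gray, background_gray)
            _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
            mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
            return mask                               # nonzero where motion/people

        bg = np.full((240, 320), 120, np.uint8)       # stand-in background frame
        frame = bg.copy()
        frame[80:180, 140:180] = 200                  # a person-like blob
        print(cv2.countNonZero(segment_people(bg, frame)))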

  17. Quantum Chemical Strain Analysis For Mechanochemical Processes.

    Science.gov (United States)

    Stauch, Tim; Dreuw, Andreas

    2017-03-24

    The use of mechanical force to initiate a chemical reaction is an efficient alternative to the conventional sources of activation energy, i.e., heat, light, and electricity. Applications of mechanochemistry in academic and industrial laboratories are diverse, ranging from chemical syntheses in ball mills and ultrasound baths to direct activation of covalent bonds using an atomic force microscope. The vectorial nature of force is advantageous because specific covalent bonds can be preconditioned for rupture by selective stretching. However, the influence of mechanical force on single molecules is still not understood at a fundamental level, which limits the applicability of mechanochemistry. As a result, many chemists still resort to rules of thumb when it comes to conducting mechanochemical syntheses. In this Account, we show that comprehension of mechanochemistry at the molecular level can be tremendously advanced by quantum chemistry, in particular by using quantum chemical force analysis tools. One such tool is the JEDI (Judgement of Energy DIstribution) analysis, which provides a convenient approach to analyze the distribution of strain energy in a mechanically deformed molecule. Based on the harmonic approximation, the strain energy contribution is calculated for each bond length, bond angle and dihedral angle, thus providing a comprehensive picture of how force affects molecules. This Account examines the theoretical foundations of quantum chemical force analysis and provides a critical overview of the performance of the JEDI analysis in various mechanochemical applications. We explain in detail how this analysis tool is to be used to identify the "force-bearing scaffold" of a distorted molecule, which allows both the rationalization and the optimization of diverse mechanochemical processes. More precisely, we show that the inclusion of every bond, bending and torsion of a molecule allows a particularly insightful discussion of the distribution of mechanical

  18. Equipment to detect bad isolator in hard disk drive testing process

    Directory of Open Access Journals (Sweden)

    Winai Tumthong

    2014-09-01

    Full Text Available With the increasing data capacity of hard disk drives (HDD) today, the testing time of a hard disk drive becomes longer, taking approximately 25-30 hours. Vibrations during the HDD testing process significantly contribute to testing errors. Isolators and pocket slots are used to reduce the vibration. The physical properties of isolators change over time, which affects the results of HDD testing. Because of the resulting errors, the testing process needs to be redone and, as a result, the testing time is extended. This research developed a system to detect isolator function using an Arduino Due microcontroller along with a micro-electromechanical system (MEMS) accelerometer sensor. For precise analysis, the statistical method called the t-test was used to group isolators by their functioning condition. It was found that the measured vibration amplitude in any direction can be used as the criterion for identifying the isolator condition. This detection system can identify a bad isolator at an early stage, before excessive vibration affects the testing process.

  19. Framework for hyperspectral image processing and quantification for cancer detection during animal tumor surgery

    Science.gov (United States)

    Lu, Guolan; Wang, Dongsheng; Qin, Xulei; Halig, Luma; Muller, Susan; Zhang, Hongzheng; Chen, Amy; Pogue, Brian W.; Chen, Zhuo Georgia; Fei, Baowei

    2015-12-01

    Hyperspectral imaging (HSI) is an imaging modality that holds strong potential for rapid cancer detection during image-guided surgery. But the data from HSI often need to be processed appropriately in order to extract the maximum useful information that differentiates cancer from normal tissue. We proposed a framework for hyperspectral image processing and quantification, which includes image preprocessing, glare removal, feature extraction, and ultimately image classification. The framework has been tested on images from mice with head and neck cancer, using spectra from 450- to 900-nm wavelength. The image analysis computed Fourier coefficients, normalized reflectance, mean, and spectral derivatives for improved accuracy. The experimental results demonstrated the feasibility of the hyperspectral image processing and quantification framework for cancer detection during animal tumor surgery, in a challenging setting where sensitivity can be low due to a modest number of features present, but the potential for fast image classification can be high. This HSI approach may have potential application in tumor margin assessment during image-guided surgery, where speed of assessment may be the dominant factor.

  1. High-Speed Digital Signal Processing Method for Detection of Repeating Earthquakes Using GPGPU-Acceleration

    Science.gov (United States)

    Kawakami, Taiki; Okubo, Kan; Uchida, Naoki; Takeuchi, Nobunao; Matsuzawa, Toru

    2013-04-01

    Repeating earthquakes occur on the same asperity at the plate boundary. These earthquakes have an important property: the seismic waveforms observed at an identical observation site are very similar regardless of their occurrence time. The slip histories of repeating earthquakes could reveal the existence of asperities: the analysis of repeating earthquakes can detect the characteristics of the asperities and realize temporal and spatial monitoring of slip at the plate boundary. Moreover, we expect medium-term prediction of earthquakes at the plate boundary by means of the analysis of repeating earthquakes. Although previous works mostly clarified the existence of asperities and repeating earthquakes, and the relationship between asperities and quasi-static slip areas, a stable and robust method for automatic detection of repeating earthquakes has not been established yet. Furthermore, in order to process the enormous data volumes involved (so-called big data), speeding up the signal processing is an important issue. Recently, the GPU (Graphics Processing Unit) has been used as an acceleration tool for signal processing in various fields of study. This movement is called GPGPU (General-Purpose computing on GPUs). In the last few years the performance of GPUs has kept improving rapidly; that is, a PC (personal computer) with GPUs might be a personal supercomputer. GPU computing gives us a high-performance computing environment at a lower cost than before. Therefore, the use of GPUs contributes to a significant reduction of the execution time in signal processing of huge seismic data sets. In this study, first, we applied band-limited Fourier phase correlation as a fast method of detecting repeating earthquakes. This method utilizes only band-limited phase information and yields the correlation values between two seismic signals. Secondly, we employ a coherence function using three orthogonal components (East-West, North-South, and Up-Down) of seismic data as a

  2. QRS DETECTION OF ECG - A STATISTICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    I.S. Siva Rao

    2015-03-01

    Full Text Available Electrocardiogram (ECG) is a graphical representation generated by the heart muscle. ECG plays an important role in the diagnosis and monitoring of the heart's condition. A real-time analyzer based on filtering, beat recognition, clustering, and classification of the signal, with a delay of at most a few seconds, can be used to recognize life-threatening arrhythmias. The ECG signal enables examination and study of anatomic and physiologic facets of the entire cardiac muscle. The first task for proficient analysis is the removal of noise, which is attained by the use of wavelet transform analysis. Wavelets yield temporal and spectral information concurrently and offer flexibility through a choice of wavelet functions with different properties. This paper is concerned with the extraction of QRS complexes of ECG signals using Discrete Wavelet Transform based algorithms aided with MATLAB. Denoising of the ECG signal is done by removing inconsistent wavelet transform coefficients. Subsequently, QRS complexes are identified, and each peak can be utilized to discover the peaks of separate waves like P and T, along with their derivatives. Here we put forth a new combinatory algorithm built on the Pan-Tompkins method and the multi-wavelet transform.
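
    The wavelet-denoising step described above can be sketched with the PyWavelets library; the wavelet choice ('db4'), decomposition level, and universal-threshold rule are assumptions for illustration, not parameters reported in the paper.

    ```python
    # Minimal sketch: DWT-based ECG denoising by soft-thresholding the detail
    # coefficients (wavelet, level, and threshold rule are assumed).
    import numpy as np
    import pywt

    def denoise_ecg(signal, wavelet="db4", level=4):
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        # Noise level estimated from the finest detail coefficients.
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745
        thr = sigma * np.sqrt(2 * np.log(len(signal)))
        denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft")
                                  for c in coeffs[1:]]
        return pywt.waverec(denoised, wavelet)[: len(signal)]
    ```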

  3. Detection of Epileptic Seizures with Multi-modal Signal Processing

    DEFF Research Database (Denmark)

    Conradsen, Isa

    and alarm whenever a seizure starts is of great importance to these patients and their relatives, in the sense, that the alert of the seizure will make them feel more safe. Thus the objective of the project is to investigate the movements of convulsive epileptic seizures and design seizure detection...... convulsive seizures tested. Another study was performed, involving quantitative parameters in the time and frequency domain. The study showed, that there are several differences between tonic seizures and the tonic phase of GTC seizures and furthermore revealed differences of the epileptic (tonic and tonic...

  4. Multivariate Analysis for the Processing of Signals

    Directory of Open Access Journals (Sweden)

    Beattie J.R.

    2014-01-01

    Full Text Available Real-world experiments are becoming increasingly complex, needing techniques capable of tracking this complexity. Signal-based measurements are often used to capture this complexity, where a signal is a record of a sample's response to a parameter (e.g., time, displacement, voltage, wavelength) that is varied over a range of values. In signals, the responses at each value of the varied parameter are related to each other, depending on the composition or state of the sample being measured. Since signals contain multiple information points, they have rich information content but are generally complex to comprehend. Multivariate Analysis (MA) has profoundly transformed their analysis by allowing gross simplification of the tangled web of variation. In addition, MA provides the advantage of being much more robust to the influence of noise than univariate methods of analysis. In recent years, there has been a growing awareness that the nature of multivariate methods allows exploitation of their benefits for purposes other than data analysis, such as pre-processing of signals with the aim of eliminating irrelevant variations prior to analysis of the signal of interest. It has been shown that exploiting multivariate data reduction in an appropriate way can allow high-fidelity denoising (removal of irreproducible non-signals), consistent and reproducible noise-insensitive correction of baseline distortions (removal of reproducible non-signals), accurate elimination of interfering signals (removal of reproducible but unwanted signals) and the standardisation of signal amplitude fluctuations. At present, the field is relatively small but the possibilities for much wider application are considerable. Where signal properties are suitable for MA (such as the signal being stationary along the x-axis), these signal-based corrections have the potential to be highly reproducible and highly adaptable, and are applicable in situations where the data is noisy or
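
    As a hedged illustration of multivariate data reduction used as pre-processing, the sketch below denoises a batch of related signals by projecting onto a few principal components and reconstructing; the number of components and the synthetic data are assumptions, not choices from the article.

    ```python
    # Minimal sketch: PCA-based denoising of a batch of related signals.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    t = np.linspace(0, 1, 500)
    clean = np.sin(2 * np.pi * 5 * t)
    signals = clean + 0.3 * rng.standard_normal((100, t.size))  # 100 noisy copies

    pca = PCA(n_components=3)            # assumed: 3 components span the signal
    scores = pca.fit_transform(signals)
    denoised = pca.inverse_transform(scores)  # variation outside the subspace is dropped
    ```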

  5. Advanced signal processing technique for damage detection in steel tubes

    Science.gov (United States)

    Amjad, Umar; Yadav, Susheel Kumar; Dao, Cac Minh; Dao, Kiet; Kundu, Tribikram

    2016-04-01

    In recent years, ultrasonic guided waves have gained attention for reliable testing and characterization of metals and composites. Guided wave modes are excited and detected by PZT (Lead Zirconate Titanate) transducers in either transmission or reflection mode. In this study, guided waves are excited and detected in the transmission mode and the phase change of the propagating wave modes is recorded. In most other studies reported in the literature, the change in the received signal strength (amplitude) is investigated with varying degrees of damage, while in this study the change in phase is correlated with the extent of damage. Feature extraction techniques are used for extracting phase and time-frequency information. The main advantage of this approach is that the bonding condition between the transducer and the specimen does not affect the phase while it can affect the strength of the recorded signal. Therefore, if the specimen is not damaged but the transducer-specimen bond has deteriorated, then the received signal strength is altered but the phase remains the same, and thus false positive predictions of damage can be avoided.
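
    One way to extract the phase of a received wave mode is via the analytic signal; the sketch below uses scipy's Hilbert transform on a synthetic record, with the sampling rate and carrier frequency assumed for illustration.

    ```python
    # Minimal sketch: instantaneous phase of a received guided-wave signal
    # via the analytic signal (Hilbert transform).
    import numpy as np
    from scipy.signal import hilbert

    fs = 10e6                                  # assumed sampling rate, Hz
    t = np.arange(0, 200e-6, 1 / fs)
    received = np.sin(2 * np.pi * 200e3 * t + 0.4)   # hypothetical wave mode

    analytic = hilbert(received)
    phase = np.unwrap(np.angle(analytic))      # compare against a baseline record
    ```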

  6. Identifying time measurement tampering in the traversal time and hop count analysis (TTHCA) wormhole detection algorithm.

    Science.gov (United States)

    Karlsson, Jonny; Dooley, Laurence S; Pulkkis, Göran

    2013-05-17

    Traversal time and hop count analysis (TTHCA) is a recent wormhole detection algorithm for mobile ad hoc networks (MANET) which provides enhanced detection performance against all wormhole attack variants and network types. TTHCA involves each node measuring the processing time of routing packets during the route discovery process and then delivering the measurements to the source node. In a participation mode (PM) wormhole where malicious nodes appear in the routing tables as legitimate nodes, the time measurements can potentially be altered so preventing TTHCA from successfully detecting the wormhole. This paper analyses the prevailing conditions for time tampering attacks to succeed for PM wormholes, before introducing an extension to the TTHCA detection algorithm called ∆T Vector which is designed to identify time tampering, while preserving low false positive rates. Simulation results confirm that the ∆T Vector extension is able to effectively detect time tampering attacks, thereby providing an important security enhancement to the TTHCA algorithm.

  7. Auditory Processing Speed and Signal Detection in Schizophrenia

    Science.gov (United States)

    Korboot, P. J.; Damiani, N.

    1976-01-01

    Two differing explanations of schizophrenic processing deficit were examined: Chapman and McGhie's and Yates'. Thirty-two schizophrenics, classified on the acute-chronic and paranoid-nonparanoid dimensions, and eight neurotics were tested on two dichotic listening tasks. (Editor)

  8. Speckle Tracking Based Strain Analysis Is Sensitive for Early Detection of Pathological Cardiac Hypertrophy

    OpenAIRE

    Xiangbo An; Jingjing Wang; Hao Li; Zhizhen Lu; Yan Bai; Han Xiao; Youyi Zhang; Yao Song

    2016-01-01

    Cardiac hypertrophy is a key pathological process in many cardiac diseases. However, early detection of cardiac hypertrophy is difficult with currently used non-invasive methods, and new approaches are urgently needed for efficient diagnosis of cardiac malfunction. Here we report that speckle tracking-based strain analysis is more sensitive than conventional echocardiography for early detection of pathological cardiac hypertrophy in the isoproterenol (ISO) mouse model. Pathological hypertrophy...

  9. Nonparametric signal processing validation in T-wave alternans detection and estimation.

    Science.gov (United States)

    Goya-Esteban, R; Barquero-Pérez, O; Blanco-Velasco, M; Caamaño-Fernández, A J; García-Alberola, A; Rojo-Álvarez, J L

    2014-04-01

    Although a number of methods have been proposed for T-Wave Alternans (TWA) detection and estimation, their performance strongly depends on their signal processing stages and on the tuning of their free parameters. The dependence of system quality on the main signal processing stages in TWA algorithms has not yet been studied. This study seeks to optimize the final performance of the system by successive comparisons of pairs of TWA analysis systems, with one single processing difference between them. For this purpose, a set of decision statistics are proposed to evaluate the performance, and a nonparametric hypothesis test (from bootstrap resampling) is used to make systematic decisions. Both the temporal method (TM) and the spectral method (SM) are analyzed in this study. The experiments were carried out on two datasets: first, semisynthetic signals with artificial alternant waves and added noise; second, two public Holter databases with different documented risk of sudden cardiac death. For semisynthetic signals (SNR = 15 dB), after the optimization procedure, a reduction of 34.0% (TM) and 5.2% (SM) in the power of TWA amplitude estimation errors was achieved, and the power of the error probability was reduced by 74.7% (SM). For the Holter databases, appropriate tuning of several processing blocks led to a larger intergroup separation between the two populations for TWA amplitude estimation. Our proposal can be used as a systematic procedure for signal processing block optimization in TWA algorithmic implementations.
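
    The bootstrap-based comparison described above can be sketched as follows: resample a performance statistic for two variants of the processing chain and test whether their difference is significant. Data, statistic, and resample count are illustrative assumptions, not the paper's protocol.

    ```python
    # Minimal sketch: bootstrap test on the difference in mean estimation error
    # between two TWA analysis-system variants.
    import numpy as np

    rng = np.random.default_rng(2)
    err_a = rng.normal(1.0, 0.4, 200)   # hypothetical per-record errors, system A
    err_b = rng.normal(0.8, 0.4, 200)   # hypothetical per-record errors, system B

    observed = err_a.mean() - err_b.mean()
    pooled = np.concatenate([err_a, err_b])
    n_boot, extreme = 10_000, 0
    for _ in range(n_boot):
        resample = rng.choice(pooled, size=pooled.size, replace=True)
        diff = resample[:err_a.size].mean() - resample[err_a.size:].mean()
        if abs(diff) >= abs(observed):
            extreme += 1
    p_value = extreme / n_boot   # small p: the processing difference matters
    ```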

  10. A Bubble Detection Algorithm Based on Sparse and Redundant Image Processing

    Directory of Open Access Journals (Sweden)

    Ye Tian

    2013-06-01

    Full Text Available The deinked pulp flotation column has been applied in wastepaper recycling. Bubble size in the deinked pulp flotation column is very important during the flotation process. In this paper, bubble images of a deinked pulp flotation column were first captured by a digital camera, and then the bubbles were detected using a detection algorithm based on sparse and redundant image processing. The results show that the algorithm is practical and effective for bubble detection in deinked pulp flotation columns.

  11. Research of the image processing in dynamic flatness detection based on improved laser triangular method

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    As a commonly used non-contact flatness detection method, the laser triangular detection method is designed with low cost, but it cannot effectively avoid measurement errors caused by strip steel vibration. This paper puts forward a dynamic flatness image processing method based on an improved laser triangular detection method. According to the practical application of strip steel straightening, it completes the image pre-processing, image feature curve extraction and calculation of flatness elongation using digital image processing technology. Finally, it eliminates elongation measurement errors caused by the vibration.

  12. Detection Tuna and Processed Products Based Protein and DNA Barcoding

    Directory of Open Access Journals (Sweden)

    Nuring Wulansari

    2015-11-01

    Full Text Available Tuna is the second largest fishery commodity in Indonesia after shrimp. The high demand for and limited stock of tuna create opportunities for fraud. Authentication is required to reassure consumers regarding the accuracy of labeling and food safety. In this study, the authentication was based on protein analysis and DNA barcoding using the cytochrome-b gene (cyt-b) of the mitochondrial DNA as the target gene. Primers for the cyt-b gene were designed based on the tuna species. This study aimed to verify the authenticity of fresh tuna and its processed products through protein analysis using SDS-PAGE and DNA barcoding techniques. The phases of this research were protein electrophoresis by SDS-PAGE, DNA extraction, PCR amplification, electrophoresis and sequencing. Samples of fresh fish (Tu1, Tu2, Tu3, Tu4, and Tu5) and processed tuna (canned and steak) were successfully extracted. The results showed that SDS-PAGE revealed protein degradation in the processed tuna, so this method is not appropriate for verifying the authenticity of processed products. PCR electrophoresis results showed that the samples of tuna, tuna steak, sushi, meatballs, abon, and canned tuna were successfully amplified in the range of 500-750 bp, except Ka3, in line with the DNA target (620 bp). The resulting sequences of Tu2, Tu3, Tu4 and Tu5 were identified consistently with the morphometric results, namely T. albacares, while Tu1 was identified as T. obesus with a homology level of 99%. Processed tunas (steak and canned) were identified as T. albacares, as stated on their labels.

  13. Accelerating Malware Detection via a Graphics Processing Unit

    Science.gov (United States)

    2010-09-01

    [Record text garbled in extraction; recoverable fragments only.] Acronyms: GPU (Graphics Processing Unit), PE (Portable Executable), COFF (Common Object File Format). The recoverable text notes that the PE format is an updated version of the common object file format (COFF) [Mic06].

  14. Dinitrogen oxide detection for process failure early warning systems.

    Science.gov (United States)

    Burgess, J E; Stuetz, R M; Morton, S; Stephenson, T

    2002-01-01

    A number of experiments were conducted in order to establish whether the concentration of N2O in the off-gas from an activated sludge pilot plant could be used as an indicator for monitoring the nitrification process and as an early indication of ammonia appearing in the plant effluent. A strong correlation was found between ammonia shock loads and the concentration of N2O in the off-gas from the aeration tank, and likewise for dissolved oxygen depletion. When subjecting the experimental setup to doses of a nitrification inhibitor (allylthiourea), a similar pattern was seen, with a correlation between nitrite build-up in the aeration tank and the concentration increase of N2O in the off-gas from the aeration tank. The results from this work suggest that the concentration, and changes in the concentration, of N2O in the exhaust gas from a nitrifying process may be a useful parameter for monitoring nitrifying activated sludge processes.

  15. Image processing and analysis using neural networks for optometry area

    Science.gov (United States)

    Netto, Antonio V.; Ferreira de Oliveira, Maria C.

    2002-11-01

    In this work we describe the framework of a functional system for processing and analyzing images of the human eye acquired by the Hartmann-Shack (HS) technique, in order to extract information used to formulate a diagnosis of eye refractive errors (astigmatism, hypermetropia and myopia). The analysis is to be carried out using an Artificial Intelligence system based on neural nets, fuzzy logic and classifier combination. The major goal is to establish the basis of a new technology to effectively measure ocular refractive errors, based on methods alternative to those adopted in current patented systems. Moreover, analysis of images acquired with the Hartmann-Shack technique may enable the extraction of additional information on the health of an eye under examination from the same image used to detect refraction errors.

  16. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations (COSO), 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of quantifying risk to enhance the degree of objectivity in finance, for instance, was quite parallel to its development in the manufacturing industry, the same is not true of Higher Education Institutions (HEI). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a university in the greater Eastern Cape Province of South Africa. The analysis of the likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant, with a chi-square (χ² = 8.181; p = 0.300), which indicated a good model fit, since the data did not significantly deviate from the model. The study concluded that to derive an overall likelihood rating that indicates the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).

  17. Using recurrence plot analysis for software execution interpretation and fault detection

    Science.gov (United States)

    Mosdorf, M.

    2015-09-01

    This paper shows a method targeted at software execution interpretation and fault detection using recurrence plot analysis. In the proposed approach, recurrence plot analysis is applied to a software execution trace that contains the executed assembly instructions. Results of this analysis are subject to further processing with the PCA (Principal Component Analysis) method, which reduces the number of coefficients used for software execution classification. This method was used for the analysis of five algorithms: Bubble Sort, Quick Sort, Median Filter, FIR, and SHA-1. Results show that some of the collected traces can be easily assigned to particular algorithms (logs from the Bubble Sort and FIR algorithms) while others are more difficult to distinguish.
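
    A hedged sketch of constructing a recurrence plot from a numerically encoded trace is given below; the embedding dimension, delay, and distance threshold are illustrative assumptions, and real traces of assembly instructions would first have to be mapped to numbers.

    ```python
    # Minimal sketch: recurrence matrix of a 1-D series via time-delay embedding.
    import numpy as np

    def recurrence_plot(x, dim=3, delay=2, eps=0.1):
        n = len(x) - (dim - 1) * delay
        # Each row of 'emb' is one point of the reconstructed trajectory.
        emb = np.column_stack([x[i * delay: i * delay + n] for i in range(dim)])
        dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
        return (dists < eps).astype(int)   # 1 = recurrence, 0 = none

    trace = np.sin(np.linspace(0, 20, 400))   # stand-in for an encoded trace
    rp = recurrence_plot(trace)               # feed into PCA or visual inspection
    ```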

  18. ECG Signal Analysis and Arrhythmia Detection using Wavelet Transform

    Science.gov (United States)

    Kaur, Inderbir; Rajni, Rajni; Marwaha, Anupma

    2016-12-01

    Electrocardiogram (ECG) is used to record the electrical activity of the heart. The ECG signal, being non-stationary in nature, makes the analysis and interpretation of the signal very difficult. Hence, accurate analysis of the ECG signal with a powerful tool like the discrete wavelet transform (DWT) becomes imperative. In this paper, the ECG signal is denoised to remove the artifacts and analyzed using the wavelet transform to detect the QRS complex and arrhythmia. This work is implemented in MATLAB software for the MIT/BIH Arrhythmia database and yields a sensitivity of 99.85%, positive predictivity of 99.92% and detection error rate of 0.221% with the wavelet transform. It is also inferred that DWT outperforms the principal component analysis technique in detection of the ECG signal.

  19. Image edge detection based on multi-fractal spectrum analysis

    Institute of Scientific and Technical Information of China (English)

    WANG Shao-yuan; WANG Yao-nan

    2006-01-01

    In this paper, an image edge detection method based on multi-fractal spectrum analysis is presented. The coarse-grain Hölder exponent of the image pixels is first computed; then, its multi-fractal spectrum is estimated by the kernel estimation method. Finally, the image edge detection is done by means of different multi-fractal spectrum values. Simulation results show that this method is efficient and has better locality compared with traditional edge detection methods such as the Sobel method.

  20. Integrated polymer waveguides for absorbance detection in chemical analysis systems

    DEFF Research Database (Denmark)

    Mogensen, Klaus Bo; El-Ali, Jamil; Wolff, Anders

    2003-01-01

    A chemical analysis system for absorbance detection with integrated polymer waveguides is reported for the first time. The fabrication procedure relies on structuring of a single layer of the photoresist SU-8, so both the microfluidic channel network and the optical components, which include planar ... The emphasis of this paper is on the signal-to-noise ratio of the detection and its relation to the sensitivity. Two absorbance cells with an optical path length of 100 μm and 1000 μm were characterized and compared in terms of sensitivity, limit of detection and effective path length for measurements ...

  1. Detection of organic residues on food processing equipment surfaces by spectral imaging method

    Science.gov (United States)

    Qin, Jianwei; Jun, Won; Kim, Moon S.; Chao, Kaunglin

    2010-04-01

    Organic residues on equipment surfaces in poultry processing plants can cause cross-contamination and increase the risk of unsafe food for consumers. This research aimed to investigate the potential of an LED-induced fluorescence imaging technique for rapid inspection of organic residues on poultry processing equipment surfaces. High-power blue LEDs with a spectral output at 410 nm were used as the excitation source for a line-scanning hyperspectral imaging system. Common chicken residue samples, including fat, blood, and feces from ceca, colon, duodenum, and small intestine, were prepared on stainless steel sheets. Fluorescence emission images were acquired from 120 samples (20 for each type of residue) in the wavelength range of 500-700 nm. LED-induced fluorescence characteristics of the tested samples were determined. PCA (principal component analysis) was performed to analyze the fluorescence spectral data. Two SIMCA (soft independent modeling of class analogy) models were developed to differentiate organic residues from stainless steel samples. Classification accuracies using 2-class ('stainless steel' and 'organic residue') and 4-class ('stainless steel', 'fat', 'blood', and 'feces') SIMCA models were 100% and 97.5%, respectively. An optimal single band and a band pair that are promising for rapid residue detection were identified by correlation analysis. The single-band approach using the selected wavelength of 666 nm could generate false negative errors for chicken blood inspection. Two-band ratio images using 503 and 666 nm (F503/F666) have great potential for detecting various chicken residues on stainless steel surfaces. This wavelength pair can be adopted for developing an LED-based hand-held fluorescence imaging device for inspecting poultry processing equipment surfaces.

  2. Decision analysis applications and the CERCLA process

    Energy Technology Data Exchange (ETDEWEB)

    Purucker, S.T.; Lyon, B.F. [Oak Ridge National Lab., TN (United States). Risk Analysis Section]|[Univ. of Tennessee, Knoxville, TN (United States)

    1994-06-01

    Quantitative decision methods can be developed during environmental restoration projects that incorporate stakeholder input and can complement current efforts undertaken for data collection and alternatives evaluation during the CERCLA process. These decision-making tools can supplement current EPA guidance as well as focus on problems that arise as attempts are made to make informed decisions regarding remedial alternative selection. In examining the use of such applications, the authors discuss the use of decision analysis tools and their impact on collecting data and making environmental decisions from a risk-based perspective. They look at the construction of objective functions for quantifying different risk-based decision rules that incorporate stakeholder concerns. This represents a quantitative method for implementing the Data Quality Objective (DQO) process. These objective functions can be expressed using a variety of indices to analyze problems that currently arise in the environmental field. Examples include cost, magnitude of risk, efficiency, and probability of success or failure. Based on such defined objective functions, a project can evaluate the impact of different risk and decision selection strategies on data worth and alternative selection.

  3. Processing of Instantaneous Angular Speed Signal for Detection of a Diesel Engine Failure

    Directory of Open Access Journals (Sweden)

    Adam Charchalis

    2013-01-01

    Full Text Available Continuous monitoring of diesel engine performance under operating conditions is critical for the prediction of malfunction development and subsequent functional failure detection. Analysis of the instantaneous angular speed (IAS) of the crankshaft is considered one of the nonintrusive and effective methods for detecting combustion quality deterioration. In this paper, results of experimental verification of fuel system malfunction detection, using an optical encoder for IAS recording, are presented. The implemented method relies on the comparison of measurement results recorded under healthy and faulty conditions of the engine. The elaborated dynamic model of angular speed variations enables us to build templates of engine behavior. Cylinder pressure values recorded during the experiment were used for the approximation of the basic pressure waveform. The main task of the data processing is smoothing the raw angular speed signal. The noise is due to sensor mount vibrations, signal emitter machining, engine body vibrations, and crankshaft torsional vibrations. Smoothing of the measurement data was carried out by implementation of the Savitzky-Golay filter. The measured signal, after smoothing, was compared with the modeled IAS run.
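
    The smoothing step lends itself to a compact illustration: the sketch below applies scipy's Savitzky-Golay filter to a synthetic noisy angular-speed trace, with the window length and polynomial order chosen arbitrarily rather than taken from the paper.

    ```python
    # Minimal sketch: Savitzky-Golay smoothing of a noisy instantaneous
    # angular speed (IAS) signal.
    import numpy as np
    from scipy.signal import savgol_filter

    rng = np.random.default_rng(3)
    angle = np.linspace(0, 4 * np.pi, 2000)          # two crankshaft revolutions
    ias = 100 + 2 * np.sin(2 * angle) + 0.5 * rng.standard_normal(angle.size)

    smoothed = savgol_filter(ias, window_length=51, polyorder=3)  # assumed params
    ```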

  4. Image Processing and Analysis for DTMRI

    Directory of Open Access Journals (Sweden)

    Kondapalli Srinivasa Vara Prasad

    2012-01-01

    Full Text Available This paper describes image processing techniques for Diffusion Tensor Magnetic Resonance Imaging (DT-MRI). In DT-MRI, a tensor describing local water diffusion is acquired for each voxel. The geometric nature of the diffusion tensors can quantitatively characterize the local structure in tissues such as bone, muscle, and the white matter of the brain. The close relationship between local image structure and apparent diffusion makes this imaging modality very interesting for medical image analysis. We present a decomposition of the diffusion tensor based on its symmetry properties, resulting in useful measures describing the geometry of the diffusion ellipsoid. A simple anisotropy measure follows naturally from this analysis. We describe how the geometry, or shape, of the tensor can be visualized using a coloring scheme based on the derived shape measures. We show how filtering of the tensor data of a human brain can provide a description of macrostructural diffusion, which can be used as a measure of fiber-tract organization. We also describe how tracking of white matter tracts can be implemented using the introduced methods. These methods offer unique tools for the in vivo demonstration of neural connectivity in healthy and diseased brain tissue.
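
    As a hedged illustration of a simple anisotropy measure derived from the tensor's eigenvalues, the sketch below computes the standard fractional anisotropy; this is a common choice, not necessarily the exact measure introduced in the paper.

    ```python
    # Minimal sketch: fractional anisotropy (FA) of a 3x3 diffusion tensor.
    import numpy as np

    def fractional_anisotropy(tensor):
        lam = np.linalg.eigvalsh(tensor)   # eigenvalues of the symmetric tensor
        num = np.sqrt(((lam - lam.mean()) ** 2).sum())
        den = np.sqrt((lam ** 2).sum())
        return np.sqrt(1.5) * num / den    # 0 = isotropic, 1 = fully anisotropic

    D = np.diag([1.7e-3, 0.3e-3, 0.2e-3])  # hypothetical tensor, mm^2/s
    print(fractional_anisotropy(D))
    ```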

  5. Thermodynamic Analysis of Nanoporous Membrane Separation Processes

    Science.gov (United States)

    Rogers, David; Rempe, Susan

    2011-03-01

    We give an analysis of desalination energy requirements in order to quantify the potential for future improvements in desalination membrane technology. Our thermodynamic analysis makes it possible to draw conclusions from the vast array of equilibrium molecular dynamics simulations present in the literature as well as create a standardized comparison for measuring and reporting experimental reverse osmosis material efficiency. Commonly employed methods for estimating minimum desalination energy costs have been revised to include operations at positive input stream recovery ratios using a thermodynamic cycle analogous to the Carnot cycle. Several gaps in the statistical mechanical theory of irreversible processes have also been identified which may in the future lead to improved communication between materials engineering models and statistical mechanical simulation. Simulation results for silica surfaces and nanochannels are also presented. Sandia National Laboratories is a multi-program laboratory operated by Sandia Corporation, a subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  6. Multisensor Network System for Wildfire Detection Using Infrared Image Processing

    Directory of Open Access Journals (Sweden)

    I. Bosch

    2013-01-01

    Full Text Available This paper presents the next step in the evolution of multi-sensor wireless network systems for the early automatic detection of forest fires. This network allows remote monitoring of each of the locations as well as communication between each of the sensors and the control stations. The result is an increased coverage area, with quicker and safer responses. To determine the presence of a forest wildfire, the system employs decision fusion in thermal imaging, which can exploit various expected characteristics of a real fire, including short-term persistence and long-term increases over time. Results from testing in the laboratory and in a real environment are presented to verify the accuracy of the operation of the proposed system. The system performance is gauged by the number of alarms and the time to the first alarm (corresponding to a real fire), for different probabilities of false alarm (PFA). The necessity of including decision fusion is thereby demonstrated.

  7. Detection of tremor bursts by a running second order moment function and analysis using interburst histograms

    NARCIS (Netherlands)

    Journee, Henricus Louis; Postma, Alida Annechien; Sun, Mingui; Staal, Michiel J.

    2008-01-01

    Introduction: Conventional linear signal processing techniques are not always suitable for the detection of tremor bursts in clinical practice due to inevitable noise from electromyographic (EMG) bursts. This study introduces (1) a non-linear analysis technique based on a running second order moment
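
    A hedged sketch of a running second-order moment detector: slide a window over an EMG-like signal, compute the windowed second moment, and threshold it to mark bursts. The window length and threshold rule are illustrative assumptions, not the study's settings.

    ```python
    # Minimal sketch: burst detection via a running second-order moment.
    import numpy as np

    def running_second_moment(x, window=64):
        kernel = np.ones(window) / window
        return np.convolve(x ** 2, kernel, mode="same")  # windowed mean of x^2

    rng = np.random.default_rng(4)
    emg = 0.05 * rng.standard_normal(3000)
    emg[1000:1300] += 0.5 * rng.standard_normal(300)     # hypothetical burst

    moment = running_second_moment(emg)
    bursts = moment > 5 * np.median(moment)              # assumed threshold rule
    ```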

  8. Novel Flood Detection and Analysis Method Using Recurrence Property

    Science.gov (United States)

    Wendi, Dadiyorto; Merz, Bruno; Marwan, Norbert

    2016-04-01

    Temporal changes in flood hazard are known to be difficult to detect and attribute due to multiple drivers that include processes that are non-stationary and highly variable. These drivers, such as human-induced climate change, natural climate variability, implementation of flood defences, river training, or land use change, can act on varying space-time scales and influence or mask each other. Flood time series may show complex behavior that varies over a range of time scales and may cluster in time. This study focuses on the application of recurrence-based data analysis techniques (the recurrence plot) for understanding and quantifying spatio-temporal changes in flood hazard in Germany. The recurrence plot is known as an effective tool to visualize the dynamics of phase space trajectories, i.e., trajectories constructed from a time series by using an embedding dimension and a time delay, and it is known to be effective in analyzing non-stationary and non-linear time series. The emphasis will be on the identification of characteristic recurrence properties that could associate typical dynamic behavior with certain flood situations.

  9. Fault detection and diagnosis in a food pasteurization process with Hidden Markov Models

    OpenAIRE

    Tokatlı, Figen; Cinar, Ali

    2004-01-01

    Hidden Markov Models (HMM) are used to detect abnormal operation of dynamic processes and diagnose sensor and actuator faults. The method is illustrated by monitoring the operation of a pasteurization plant and diagnosing causes of abnormal operation. Process data collected under the influence of faults of different magnitude and duration in sensors and actuators are used to illustrate the use of HMM in the detection and diagnosis of process faults. Case studies with experimental data from a ...
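
    The general idea can be sketched with the hmmlearn library: fit an HMM to data from normal operation and flag windows whose per-sample log-likelihood under the model drops below a margin. Model size, features, and margin are assumptions, not the paper's settings.

    ```python
    # Minimal sketch: HMM-based fault detection by log-likelihood scoring.
    import numpy as np
    from hmmlearn.hmm import GaussianHMM

    rng = np.random.default_rng(5)
    normal = rng.normal(0.0, 1.0, (500, 2))      # hypothetical 2-sensor data
    model = GaussianHMM(n_components=3, covariance_type="full", random_state=0)
    model.fit(normal)

    baseline = model.score(normal) / len(normal)  # per-sample log-likelihood
    window = rng.normal(2.0, 1.5, (50, 2))        # hypothetical faulty window
    if model.score(window) / len(window) < baseline - 3.0:  # assumed margin
        print("abnormal operation detected")
    ```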

  10. Multicriteria Similarity-Based Anomaly Detection Using Pareto Depth Analysis.

    Science.gov (United States)

    Hsiao, Ko-Jen; Xu, Kevin S; Calder, Jeff; Hero, Alfred O

    2016-06-01

    We consider the problem of identifying patterns in a data set that exhibit anomalous behavior, often referred to as anomaly detection. Similarity-based anomaly detection algorithms detect abnormally large amounts of similarity or dissimilarity, e.g., as measured by the nearest-neighbor Euclidean distances between a test sample and the training samples. In many application domains, there may not exist a single dissimilarity measure that captures all possible anomalous patterns. In such cases, multiple dissimilarity measures can be defined, including nonmetric measures, and one can test for anomalies by scalarizing using a nonnegative linear combination of them. If the relative importance of the different dissimilarity measures is not known in advance, as in many anomaly detection applications, the anomaly detection algorithm may need to be executed multiple times with different choices of weights in the linear combination. In this paper, we propose a method for similarity-based anomaly detection using a novel multicriteria dissimilarity measure, the Pareto depth. The proposed Pareto depth analysis (PDA) anomaly detection algorithm uses the concept of Pareto optimality to detect anomalies under multiple criteria without having to run an algorithm multiple times with different choices of weights. The proposed PDA approach is provably better than using linear combinations of the criteria, and shows superior performance in experiments with synthetic and real data sets.
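
    Below is a hedged sketch of the core idea: assign each pair of samples a "Pareto depth" by iteratively peeling nondominated fronts from a set of multicriteria dissimilarities. The data and the simple depth rule are illustrative; the paper's full PDA algorithm involves further steps.

    ```python
    # Minimal sketch: Pareto depth via iterative peeling of nondominated fronts.
    import numpy as np

    def pareto_depths(points):
        """points: (n, k) array of k dissimilarity criteria (smaller = closer)."""
        depths = np.zeros(len(points), dtype=int)
        remaining = np.arange(len(points))
        depth = 1
        while remaining.size:
            pts = points[remaining]
            # A point is dominated if some other point is <= in all criteria
            # and strictly < in at least one.
            dominated = np.array([
                any(np.all(q <= p) and np.any(q < p) for q in pts) for p in pts
            ])
            depths[remaining[~dominated]] = depth   # current Pareto front
            remaining = remaining[dominated]
            depth += 1
        return depths

    dissims = np.random.default_rng(6).random((200, 2))  # two criteria, hypothetical
    depths = pareto_depths(dissims)   # deeper fronts = larger dissimilarities
    ```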

  11. Heart Beat Detection in Noisy ECG Signals Using Statistical Analysis of the Automatically Detected Annotations

    Directory of Open Access Journals (Sweden)

    Andrius Gudiškis

    2015-07-01

    Full Text Available This paper proposes an algorithm to reduce the influence of noise distortion on heartbeat annotation detection in electrocardiogram (ECG) signals. The boundary estimation module is based on an energy detector. Heartbeat detection is usually performed by QRS detectors that are able to find QRS regions in an ECG signal, which are a direct representation of a heartbeat. However, QRS detectors perform as intended only in cases where ECG signals have a high signal-to-noise ratio; when signal distortion is more noticeable, detector accuracy decreases. The proposed algorithm uses additional data, taken from an arterial blood pressure signal recorded in parallel with the ECG signal, to support the QRS detection process in distorted signal areas. The proposed algorithm performs as well as classical QRS detectors in cases where the signal-to-noise ratio is high, compared to the heartbeat annotations provided by experts. In signals with a considerably lower signal-to-noise ratio, the proposed algorithm improved the detection accuracy by up to 6%.

  12. Generic Packing Detection Using Several Complexity Analysis for Accurate Malware Detection

    Directory of Open Access Journals (Sweden)

    Dr. Mafaz Mohsin Khalil Al-Anezi

    2014-01-01

    Full Text Available Attackers do not want their malicious software (malware) to be revealed by anti-virus analyzers. In order to conceal their malware, malware programmers increasingly utilize anti-reverse-engineering and code-changing techniques such as packing, encoding and encryption. Malware writers have learned that signature-based detectors can be easily evaded by "packing" the malicious payload in layers of compression or encryption. State-of-the-art malware detectors have adopted both static and dynamic techniques to recover the payload of packed malware, but unfortunately such techniques are highly ineffective. If the malware is packed or encrypted, it is very difficult to analyze. Therefore, to prevent the harmful effects of malware and to generate signatures for malware detection, packed and encrypted executable code must first be unpacked. The first step of unpacking is to detect packed executable files. The objective is to efficiently and accurately distinguish between packed and non-packed executables, so that only executables detected as packed are sent to a generic unpacker, thus saving a significant amount of processing time. The generic method of this paper achieves very high detection accuracy of packed executables with a low average processing time. In this paper, a packed-file detection technique based on complexity, measured by several algorithms, is presented and tested on a dataset of packed and unpacked files of the .exe type. The preliminary results are very promising: high accuracy was achieved with adequate performance, with about a 96% detection rate on packed files and a 93% detection rate on unpacked files. The experiments also demonstrate that this generic technique can effectively detect unknown, obfuscated malware and cannot be evaded by known evasion techniques.
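
    A common complexity measure for this kind of packed-file screening is Shannon byte entropy, since compressed or encrypted payloads push entropy toward 8 bits/byte. The sketch below computes it for a file; the file name and decision threshold are illustrative assumptions, not values from the paper.

    ```python
    # Minimal sketch: Shannon byte entropy as one complexity measure for
    # flagging possibly packed or encrypted executables.
    import math
    from collections import Counter

    def byte_entropy(data: bytes) -> float:
        counts = Counter(data)
        n = len(data)
        return -sum(c / n * math.log2(c / n) for c in counts.values())

    with open("sample.exe", "rb") as f:     # hypothetical input file
        h = byte_entropy(f.read())
    if h > 7.2:                             # assumed threshold (max is 8 bits/byte)
        print(f"entropy {h:.2f}: likely packed or encrypted")
    ```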

  13. A New Method of Color Edge Detection Based on Local Structure Analysis

    Institute of Scientific and Technical Information of China (English)

    JIANG Shu; ZHOU Yue; ZHU Wei-wei

    2008-01-01

    Human life takes place in a colorful world. Compared to gray images, color images contain more information and have better visual effects. In today's digital image processing, image segmentation is an important step for computers to "understand" images, and edge detection is one of the most important methods in the field of image segmentation. Edges in color images are considered local discontinuities in both the color and spatial domains. Despite intensive study based on the integration of single-channel edge detection results, and on vector space analysis, edge detection in color images remains a challenging issue.

  14. The National Shipbuilding Research Program. Process Analysis Via Accuracy Control

    Science.gov (United States)

    1985-08-01

    Process Analysis Via Accuracy Control. U.S. Department of Transportation, Maritime Administration, in cooperation with Todd Pacific Shipyards (August 1985). [Report front matter garbled in extraction; recoverable fragment:] ... lighting, retraining workers, or other such approaches. This product of A/C is called process or method analysis. Process analysis involves a

  15. Multi-damage detection with embedded ultrasonic structural radar algorithm using piezoelectric wafer active sensors through advanced signal processing

    Science.gov (United States)

    Yu, Lingyu; Giurgiutiu, Victor

    2005-05-01

    The embedded ultrasonic structural radar (EUSR) algorithm was developed using a piezoelectric wafer active sensor (PWAS) array to detect defects within a large area of a thin-plate specimen. EUSR has been verified to be effective for detecting a single crack either at a broadside or an offside position. In this research, advanced signal processing techniques were included to enhance inspection image quality and to detect multiple damage sites. The signal processing methods include the discrete wavelet transform for signal denoising, the short-time Fourier transform and continuous wavelet transform for time-frequency analysis, the continuous wavelet transform for frequency filtering, and the Hilbert transform for envelope extraction. All these signal processing modules were implemented in a user-friendly graphical interface program developed in LabVIEW. The paper starts with an introduction to the embedded ultrasonic structural radar algorithm, followed by the theoretical aspects of the phased-array signal processing method. Then, the mathematical algorithms for advanced signal processing are introduced. Finally, laboratory experimental results are presented to show how efficiently the improved EUSR works. The results are analyzed, and EUSR is concluded to have been improved by the advanced signal processing techniques. The improvements include: 1) EUSR provides a better image of the specimen under monitoring; 2) it is able to detect multiple damage sites such as several cracks; 3) it is able to identify different damage types.
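
    The phased-array step can be illustrated with a basic delay-and-sum beamformer over a linear PWAS array: each pixel sums the records shifted by the round-trip travel time, so defects appear as coherent peaks. Geometry, wave speed, and the placeholder data are assumptions, not the EUSR implementation itself.

    ```python
    # Minimal sketch: delay-and-sum imaging with a linear array of M sensors.
    import numpy as np

    fs, c = 1e6, 5400.0              # assumed sampling rate (Hz), wave speed (m/s)
    M, pitch = 8, 0.008              # assumed array size and element spacing (m)
    xs = (np.arange(M) - (M - 1) / 2) * pitch   # sensor x-positions, y = 0

    signals = np.zeros((M, 2048))    # placeholder pulse-echo records, one per sensor

    def pixel_value(px, py):
        total = 0.0
        for m in range(M):
            d = np.hypot(px - xs[m], py)       # one-way distance sensor -> pixel
            idx = int(round(2 * d / c * fs))   # round-trip delay in samples
            if idx < signals.shape[1]:
                total += signals[m, idx]
        return total

    image = np.array([[pixel_value(px, py)
                       for px in np.linspace(-0.1, 0.1, 64)]
                      for py in np.linspace(0.02, 0.2, 64)])
    ```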

  16. Indonesian Social Media Sentiment Analysis With Sarcasm Detection

    OpenAIRE

    Lunando, Edwin; Purwarianti, Ayu

    2015-01-01

    Sarcasm is considered one of the most difficult problems in sentiment analysis. In our observation on Indonesian social media, for certain topics, people tend to criticize something using sarcasm. Here, we proposed two additional features to detect sarcasm after a common sentiment analysis is conducted. The features are the negativity information and the number of interjection words. We also employed translated SentiWordNet in the sentiment classification. All the classifications were conduc...

  17. Detecting errors in micro and trace analysis by using statistics

    DEFF Research Database (Denmark)

    Heydorn, K.

    1993-01-01

    to be in statistical control. Significant deviations between analytical results from different laboratories reveal the presence of systematic errors, and agreement between different laboratories indicate the absence of systematic errors. This statistical approach, referred to as the analysis of precision, was applied...... to results for chlorine in freshwater from BCR certification analyses by highly competent analytical laboratories in the EC. Titration showed systematic errors of several percent, while radiochemical neutron activation analysis produced results without detectable bias....

  18. Guided Wave Delamination Detection and Quantification With Wavefield Data Analysis

    Science.gov (United States)

    Tian, Zhenhua; Campbell Leckey, Cara A.; Seebo, Jeffrey P.; Yu, Lingyu

    2014-01-01

    Unexpected damage can occur in aerospace composites due to impact events or material stress during off-nominal loading events. In particular, laminated composites are susceptible to delamination damage due to weak transverse tensile and inter-laminar shear strengths. Developments of reliable and quantitative techniques to detect delamination damage in laminated composites are imperative for safe and functional optimally-designed next-generation composite structures. In this paper, we investigate guided wave interactions with delamination damage and develop quantification algorithms by using wavefield data analysis. The trapped guided waves in the delamination region are observed from the wavefield data and further quantitatively interpreted by using different wavenumber analysis methods. The frequency-wavenumber representation of the wavefield shows that new wavenumbers are present and correlate to trapped waves in the damage region. These new wavenumbers are used to detect and quantify the delamination damage through the wavenumber analysis, which can show how the wavenumber changes as a function of wave propagation distance. The location and spatial duration of the new wavenumbers can be identified, providing a useful means not only for detecting the presence of delamination damage but also allowing for estimation of the delamination size. Our method has been applied to detect and quantify real delamination damage with complex geometry (grown using a quasi-static indentation technique). The detection and quantification results show the location, size, and shape of the delamination damage.

  19. Comparing determination methods of detection and quantification limits for aflatoxin analysis in hazelnut

    OpenAIRE

    2016-01-01

    Hazelnut is a type of plant that grows in wet and humid climatic conditions. Adverse climatic conditions result in the formation of aflatoxin in hazelnuts during the harvesting, drying, and storing processes. Aflatoxin is considered an important food contaminant, which makes aflatoxin analysis important in the international produce trade. For this reason, validation is important for the analysis of aflatoxin in hazelnuts. The limit of detection (LOD) and limit of quantification (LOQ) are two ...

  20. Is Side-Channel Analysis really reliable for detecting Hardware Trojans?

    OpenAIRE

    Di Natale, Giorgio; Dupuis, Sophie; Rouzeyre, Bruno

    2012-01-01

    Hardware Trojans are malicious alterations to a circuit, inserted either during the design phase or during the fabrication process. Due to the diversity of Trojans, detecting and/or locating them is a challenging task. Numerous approaches have been proposed to address this problem, whether logic-testing-based or side-channel-analysis-based techniques. In this paper, we focus on side-channel analysis, and try to underline the fact that no published technique until now has...

  1. 40 CFR 68.67 - Process hazard analysis.

    Science.gov (United States)

    2010-07-01

    40 CFR 68.67, Process hazard analysis (Protection of Environment; Chemical Accident Prevention Provisions; Program 3 Prevention Program): (a) The owner or operator shall perform an initial process hazard analysis (hazard evaluation)...

  2. Fast Change Point Detection for Electricity Market Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Berkeley, UC; Gu, William; Choi, Jaesik; Gu, Ming; Simon, Horst; Wu, Kesheng

    2013-08-25

    Electricity is a vital part of our daily life; therefore it is important to avoid irregularities such as the California Electricity Crisis of 2000 and 2001. In this work, we seek to predict anomalies using advanced machine learning algorithms. These algorithms are effective, but computationally expensive, especially if we plan to apply them to hourly electricity market data covering a number of years. To address this challenge, we significantly accelerate the computation of the Gaussian Process (GP) for time series data. In the context of a Change Point Detection (CPD) algorithm, we reduce its computational complexity from O(n^5) to O(n^2). Our efficient algorithm makes it possible to compute the change points using the hourly price data from the California Electricity Crisis. By comparing the detected change points with known events, we show that the Change Point Detection algorithm is indeed effective in detecting signals preceding major events.
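
    Gaussian-process CPD itself is involved, but the flavor of offline change point detection can be shown with the ruptures library (a stand-in assumed here, not the authors' code), applied to a synthetic hourly price series.

    ```python
    # Minimal sketch: offline change point detection on a synthetic price series
    # ('ruptures' used as a stand-in for the paper's GP-based CPD).
    import numpy as np
    import ruptures as rpt

    rng = np.random.default_rng(7)
    prices = np.concatenate([
        rng.normal(30, 2, 500),     # hypothetical normal regime ($/MWh)
        rng.normal(120, 15, 200),   # hypothetical crisis regime
        rng.normal(40, 3, 500),
    ])

    algo = rpt.Pelt(model="rbf").fit(prices)
    change_points = algo.predict(pen=10)   # assumed penalty value
    print(change_points)                   # indices where regimes shift
    ```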

  3. Early Detection of Diabetic Retinopathy in Fluorescent Angiography Retinal Images Using Image Processing Methods

    Directory of Open Access Journals (Sweden)

    Meysam Tavakoli

    2010-12-01

    Full Text Available Introduction: Diabetic retinopathy (DR) is the single largest cause of sight loss and blindness in the working-age population of Western countries; it is the most common cause of blindness in adults between 20 and 60 years of age. Early diagnosis of DR is critical for preventing vision loss, so early detection of microaneurysms (MAs) as the first signs of DR is important. This paper addresses the automatic detection of MAs in fluorescein angiography fundus images, which plays a key role in computer-assisted diagnosis of DR, a serious and frequent eye disease. Material and Methods: The algorithm can be divided into three main steps. The first step, pre-processing, was for background normalization and contrast enhancement of the image. The second step aimed at detecting landmarks, i.e., all patterns possibly corresponding to vessels and the optic nerve head, which was achieved using a local radon transform. Then, MA candidates were extracted, which were used in the final step to automatically classify candidates into real MAs and other objects. A database of 120 fluorescein angiography fundus images was used to train and test the algorithm. The algorithm was compared to manually obtained gradings of those images. Results: Sensitivity of diagnosis for DR was 94%, with specificity of 75%, and sensitivity of precise microaneurysm localization was 92%, at an average of 8 false positives per image. Discussion and Conclusion: The sensitivity and specificity of this algorithm make it one of the best methods in this field. Using the local radon transform in this algorithm eliminates noise sensitivity for microaneurysm detection in retinal image analysis.

  4. Elastic recoil detection analysis of hydrogen in polymers

    Energy Technology Data Exchange (ETDEWEB)

    Winzell, T.R.H.; Whitlow, H.J. [Lund Univ. (Sweden); Bubb, I.F.; Short, R.; Johnston, P.N. [Royal Melbourne Inst. of Tech., VIC (Australia)

    1996-12-31

    Elastic recoil detection analysis (ERDA) of hydrogen in thick polymeric films has been performed using 2.5 MeV He²⁺ ions from the tandem accelerator at the Royal Melbourne Institute of Technology. The technique enables the use of the same equipment as in Rutherford backscattering analysis, but instead of detecting the incident backscattered ion, the lighter recoiled ion is detected at a small forward angle. The purpose of this work is to investigate how selected polymers react when irradiated by helium ions. The polymers are to be evaluated for their suitability as reference standards for hydrogen depth profiling. Films investigated were Du Pont's Kapton and Mylar, and polystyrene. 11 refs., 3 figs.

  5. Apolipoprotein B100 analysis in microchip with electrochemical detection

    Institute of Scientific and Technical Information of China (English)

    Cheng Cheng Liu; Yun Liu; Hui Xiang Wang; Yan Bo Qi; Peng Yuan Yang; Bao Hong Liu

    2011-01-01

    Apolipoprotein B100 (apoB-100) is a major protein of cholesterol-rich low-density lipoprotein (LDL) and reflects the total atherogenic burden on the vascular system better than LDL alone. In this work, a simple and sensitive method has been developed to determine apoB-100 in picoliter-scale volumes using a PMMA microfluidic chip coupled with an electrochemical detection system. The method performs very well, with a linear detection range of 1-800 pg/mL and a detection limit of 1 pg/mL. A real serum sample has also been analyzed by this microchip-based biosensor. The results show that this method is practicable and has potential application in clinical analysis and diagnosis.

  6. Detection of Abnormal Events via Optical Flow Feature Analysis

    Directory of Open Access Journals (Sweden)

    Tian Wang

    2015-03-01

    Full Text Available In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on a histogram of optical flow orientation descriptor and a classification method. The details of the histogram of optical flow orientation descriptor are illustrated for describing the movement information of the global video frame or the foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The differing abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm.
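
    A hedged sketch of the descriptor computation: dense optical flow between two frames (OpenCV's Farnebäck method), followed by an orientation histogram weighted by flow magnitude. The frame files, parameters, and bin count are illustrative assumptions.

    ```python
    # Minimal sketch: histogram of optical-flow orientations between two frames.
    import cv2
    import numpy as np

    prev = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE)   # hypothetical frames
    curr = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)

    flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag = np.hypot(flow[..., 0], flow[..., 1])
    ang = np.arctan2(flow[..., 1], flow[..., 0])            # radians in [-pi, pi]

    n_bins = 8                                              # assumed bin count
    hist, _ = np.histogram(ang, bins=n_bins, range=(-np.pi, np.pi), weights=mag)
    hist = hist / (hist.sum() + 1e-12)                      # normalized descriptor
    ```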

  7. Research of the image processing in dynamic flatness detection based on improved laser triangular method

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    As a commonly used non-contact flatness detection method, laser triangular detection method is designed with low cost, but it cannot avoid measurement errors caused by strip steel vibration effectively. This paper puts forward a dynamic flatness image processing method based on improved laser triangular detection method. According to the practical application of strip steel straightening, it completes the image pre-processing, image feature curve extraction and calculation of flatness elongation using digital image processing technology. Finally it eliminates elongation measurement errors caused by the vibration.

  8. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    Science.gov (United States)

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  9. Spiral analysis-improved clinical utility with center detection.

    Science.gov (United States)

    Wang, Hongzhi; Yu, Qiping; Kurtis, Mónica M; Floyd, Alicia G; Smith, Whitney A; Pullman, Seth L

    2008-06-30

    Spiral analysis is a computerized method that measures human motor performance from handwritten Archimedean spirals. It quantifies normal motor activity, and detects early disease as well as dysfunction in patients with movement disorders. The clinical utility of spiral analysis is based on kinematic and dynamic indices derived from the original spiral trace, which must be detected and transformed into mathematical expressions with great precision. Accurately determining the center of the spiral and reducing spurious low frequency noise caused by center selection error is important to the analysis. Handwritten spirals do not all start at the same point, even when marked on paper, and drawing artifacts are not easily filtered without distortion of the spiral data and corruption of the performance indices. In this report, we describe a method for detecting the optimal spiral center and reducing the unwanted drawing artifacts. To demonstrate overall improvement to spiral analysis, we study the impact of the optimal spiral center detection in different frequency domains separately and find that it notably improves the clinical spiral measurement accuracy in low frequency domains.

  10. Fault detection of a benchmark wind turbine using interval analysis

    DEFF Research Database (Denmark)

    Tabatabaeipour, Seyed Mojtaba; Odgaard, Peter Fogh; Bak, Thomas

    2012-01-01

    is nonlinear. We use an effective wind speed estimator to estimate the effective wind speed and then using interval analysis and monotonicity of the aerodynamic torque with respect to the effective wind speed, we can apply the method to the nonlinear system. The fault detection algorithm checks the consistency...

  11. Sparse principal component analysis in hyperspectral change detection

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Larsen, Rasmus; Vestergaard, Jacob Schack

    2011-01-01

    This contribution deals with change detection by means of sparse principal component analysis (PCA) of simple differences of calibrated, bi-temporal HyMap data. Results show that if we retain only 15 nonzero loadings (out of 126) in the sparse PCA the resulting change scores appear visually very ...
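
    A rough sketch of the idea, assuming two co-registered, calibrated hyperspectral cubes whose simple band-wise difference is analyzed; the component count and sparsity penalty are illustrative settings, not the paper's configuration (which retains 15 nonzero loadings out of 126 bands).

        import numpy as np
        from sklearn.decomposition import SparsePCA

        def sparse_change_scores(cube_t1, cube_t2, n_components=1, alpha=1.0):
            # cube_tX: (H, W, B) bi-temporal hyperspectral images; sparse PCA of
            # the per-pixel difference spectra yields change scores driven by
            # only a few nonzero band loadings.
            h, w, b = cube_t1.shape
            diff = (cube_t2 - cube_t1).reshape(-1, b).astype(float)
            spca = SparsePCA(n_components=n_components, alpha=alpha).fit(diff)
            scores = spca.transform(diff)
            return scores.reshape(h, w, n_components), spca.components_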

  12. Gravitational wave detection and data analysis for pulsar timing arrays

    NARCIS (Netherlands)

    Haasteren, Rutger van

    2011-01-01

    Long-term precise timing of Galactic millisecond pulsars holds great promise for measuring long-period (months-to-years) astrophysical gravitational waves. In this work we develop a Bayesian data analysis method for projects called pulsar timing arrays, which aim to detect these gravitational waves.

  13. Solution-processed hybrid materials for light detection

    Science.gov (United States)

    Adinolfi, Valerio

    Inorganic semiconductors form the foundation of modern electronics and optoelectronics. These materials benefit from excellent optoelectronic properties, but applications are generally limited by the high cost of fabrication. More recently, organic semiconductors have emerged as a low-cost alternative for light emitting devices. Organic materials benefit from facile, low temperature fabrication and offer attractive features such as flexibility and transparency. However, these materials are inherently limited by poor electronic transport. In recent years, new materials have been developed to overcome the dichotomy between performance and cost. Hybrid organic-inorganic semiconductors combine the superior electronic properties of inorganic materials with the facile assembly of organic systems to yield high-performance, low-cost electronics. This dissertation focuses on the development of solution-processed light detectors using hybrid material systems, particularly colloidal quantum dots (CQDs) and hybrid perovskites. First, advanced architectures for colloidal quantum dot light detectors are presented. These devices overcome the responsivity-speed-dark current trade-off that has limited past reports of CQD-based devices. The photo-junction field effect transistors presented in this work decrease the dark current of CQD detectors by two orders of magnitude, ultimately reducing power consumption (100x) and noise current (10x). The detector simultaneously benefits from high gain (~10 electrons/photon) and fast time response (~10 μs). This represents the first CQD-based three-terminal-junction device reported in the literature. Building on this success, hybrid perovskite devices are then presented. This material system has become a focal point of the semiconductor research community due to its relatively unexplored nature and attractive optoelectronic properties. Herein we present the first extensive electronic characterization of single crystal organolead

  14. Hybridization kinetics analysis of an oligonucleotide microarray for microRNA detection

    Institute of Scientific and Technical Information of China (English)

    Botao Zhao; Shuo Ding; Wei Li; Youxin Jin

    2011-01-01

    MicroRNA (miRNA) microarrays have been successfully used for profiling miRNA expression in many physiological processes such as development, differentiation, oncogenesis, and other disease processes. Detecting miRNA by miRNA microarray is actually based on nucleic acid hybridization between target molecules and their corresponding complementary probes. Due to the small size and high degree of similarity among miRNA sequences, the hybridization conditions must be carefully optimized to get specific and reliable signals. Previously, we reported a microarray platform to detect miRNA expression. In this study, we evaluated the sensitivity and specificity of our microarray platform. After systematic analysis, we determined an optimized hybridization condition with high sensitivity and specificity for miRNA detection. Our results should be helpful for other hybridization-based miRNA detection methods, such as northern blot and nuclease protection assays.

  15. Rapid Detection of Biological and Chemical Threat Agents Using Physical Chemistry, Active Detection, and Computational Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Myung; Dong, Li; Fu, Rong; Liotta, Lance; Narayanan, Aarthi; Petricoin, Emanuel; Ross, Mark; Russo, Paul; Zhou, Weidong; Luchini, Alessandra; Manes, Nathan; Chertow, Jessica; Han, Suhua; Kidd, Jessica; Senina, Svetlana; Groves, Stephanie

    2007-01-01

    Basic technologies have been successfully developed within this project: rapid collection of aerosols and a rapid ultra-sensitive immunoassay technique. Water-soluble, humidity-resistant polyacrylamide nano-filters were shown to (1) capture aerosol particles as small as 20 nm, (2) work in humid air and (3) completely liberate their captured particles in an aqueous solution compatible with the immunoassay technique. The immunoassay technology developed within this project combines electrophoretic capture with magnetic bead detection. It allows detection of as few as 150-600 analyte molecules or viruses in only three minutes, something no other known method can duplicate. The technology can be used in a variety of applications where speed of analysis and/or extremely low detection limits are of great importance: in rapid analysis of donor blood for hepatitis, HIV and other blood-borne infections in emergency blood transfusions, in trace analysis of pollutants, or in search of biomarkers in biological fluids. Combined in a single device, the water-soluble filter and ultra-sensitive immunoassay technique may solve the problem of early warning type detection of aerosolized pathogens. These two technologies are protected with five patent applications and are ready for commercialization.

  16. Goals Analysis Procedure Guidelines for Applying the Goals Analysis Process

    Science.gov (United States)

    Motley, Albert E., III

    2000-01-01

    One of the key elements of successful project management is the establishment of the "right set of requirements": requirements that reflect the true customer needs and are consistent with the strategic goals and objectives of the participating organizations. A viable set of requirements implies that each individual requirement is a necessary element in satisfying the stated goals and that the entire set of requirements, taken as a whole, is sufficient to satisfy the stated goals. Unfortunately, it is the author's experience that during project formulation phases many of the Systems Engineering customers do not conduct a rigorous analysis of the goals and objectives that drive the system requirements. As a result, the Systems Engineer is often provided with requirements that are vague, incomplete, and internally inconsistent. To complicate matters, most systems development methodologies assume that the customer provides unambiguous, comprehensive and concise requirements. This paper describes the specific steps of a Goals Analysis process applied by Systems Engineers at the NASA Langley Research Center during the formulation of requirements for research projects. The objective of Goals Analysis is to identify and explore all of the influencing factors that ultimately drive the system's requirements.

  17. Citation-based plagiarism detection: detecting disguised and cross-language plagiarism using citation pattern analysis

    CERN Document Server

    Gipp, Bela

    2014-01-01

    Plagiarism is a problem with far-reaching consequences for the sciences. However, even today's best software-based systems can only reliably identify copy & paste plagiarism. Disguised plagiarism forms, including paraphrased text, cross-language plagiarism, as well as structural and idea plagiarism often remain undetected. This weakness of current systems results in a large percentage of scientific plagiarism going undetected. Bela Gipp provides an overview of the state-of-the art in plagiarism detection and an analysis of why these approaches fail to detect disguised plagiarism forms. The aut

  18. Two stages of parafoveal processing during reading: Evidence from a display change detection task.

    Science.gov (United States)

    Angele, Bernhard; Slattery, Timothy J; Rayner, Keith

    2016-08-01

    We used a display change detection paradigm (Slattery, Angele, & Rayner, 2011, Human Perception and Performance, 37, 1924-1938) to investigate whether display change detection uses orthographic regularity and whether detection is affected by the processing difficulty of the word preceding the boundary that triggers the display change. Subjects were significantly more sensitive to display changes when the change was from a nonwordlike preview than when the change was from a wordlike preview, but the preview benefit effect on the target word was not affected by whether the preview was wordlike or nonwordlike. Additionally, we did not find any influence of preboundary word frequency on display change detection performance. Our results suggest that display change detection and lexical processing do not use the same cognitive mechanisms. We propose that parafoveal processing takes place in two stages: an early, orthography-based, preattentional stage, and a late, attention-dependent lexical access stage.

  19. Analysis, synthesis and design of chemical processes

    Energy Technology Data Exchange (ETDEWEB)

    Turton, R. [West Virginia Univ., Morgantown, WV (United States); Bailie, R.C.; Whiting, W.B.

    1998-12-31

    The book illustrates key concepts through a running example from the real world: the manufacture of benzene; covers design, economic considerations, troubleshooting and health/environmental safety; and includes exclusive software for estimating chemical manufacturing equipment capital costs. This book will help chemical engineers optimize the efficiency of production processes, by providing both a philosophical framework and detailed information about chemical process design. Design is the focal point of the chemical engineering practice. This book helps engineers and senior-level students hone their design skills through process design rather than simply plant design. It introduces all the basics of process simulation. Learn how to size equipment, optimize flowsheets, evaluate the economics of projects, and plan the operation of processes. Learn how to use Process Flow Diagrams; choose the operating conditions for a process; and evaluate the performance of existing processes and equipment. Finally, understand how chemical process design impacts health, safety, the environment and the community.

  20. The Development of Manufacturing Process Analysis: Lesson Learned from Process Mining

    Directory of Open Access Journals (Sweden)

    Bernardo Nugroho Yahya

    2014-01-01

    Full Text Available Process analysis is recognized as a major stage in business process reengineering that has developed over the last two decades. In the field of manufacturing, manufacturing process analysis (MPA) is defined as performance analysis of the production process. Performance analysis distills data and knowledge into useful forms that can be broadly applied in manufacturing sectors. Process mining, an emerging tool focusing on the process and resource perspectives, is a way to analyze a system based on its event log. The objective of this study is to extend the existing process analysis framework by considering the attribute perspective. This study also aims to draw lessons from experiences with process mining in manufacturing industries. The results of this study will help manufacturing organizations utilize the process mining approach for analyzing their respective processes.

  1. Narrow-Band Processing and Fusion Approach for Explosive Hazard Detection in FLGPR

    Science.gov (United States)

    2011-01-01

    This report proposes an effective anomaly detection algorithm for a forward-looking ground-penetrating radar (FLGPR). One challenge for threat detection using FLGPR... Keywords: explosive hazards detection, ground-penetrating radar, narrow-band processing, false alarm rejection, fusion, Gabor filters. Authors: Timothy C. Havens, James M

  2. Can Mismatch Negativity Be Linked to Synaptic Processes? A Glutamatergic Approach to Deviance Detection

    Science.gov (United States)

    Strelnikov, Kuzma

    2007-01-01

    This article aims to provide a theoretical framework to elucidate the neurophysiological underpinnings of deviance detection as reflected by mismatch negativity. A six-step model of the information processing necessary for deviance detection is proposed. In this model, predictive coding of learned regularities is realized by means of long-term…

  3. Infrared processing and sensor fusion for anti-personnel land-mine detection

    NARCIS (Netherlands)

    Schavemaker, J.G.M.; Cremer, F.; Schutte, K.; Breejen, E. den

    2000-01-01

    In this paper we present the results of infrared processing and sensor fusion obtained within the European research project GEODE (Ground Explosive Ordnance DEtection) that strives for the realization of a vehicle-mounted, multi-sensor anti-personnel land-mine detection system for humanitarian demining.

  4. Detection of Organophosphorus Pesticides with Colorimetry and Computer Image Analysis.

    Science.gov (United States)

    Li, Yanjie; Hou, Changjun; Lei, Jincan; Deng, Bo; Huang, Jing; Yang, Mei

    2016-01-01

    Organophosphorus pesticides (OPs) represent a very important class of pesticides that are widely used in agriculture because of their relatively high performance and moderate environmental persistence; hence the sensitive and specific detection of OPs is highly significant. Based on the inhibition of acetylcholinesterase (AChE) by inhibitors, including OPs and carbamates, a colorimetric analysis was used for detection of OPs, with computer image analysis of color density in CMYK (cyan, magenta, yellow and black) color space and non-linear modeling. The results showed that yellow intensity gradually weakened as the concentration of dichlorvos increased. The quantitative analysis of dichlorvos was achieved by Artificial Neural Network (ANN) modeling, and the results showed that the established model had good predictive ability across training and prediction sets. Real cabbage samples containing dichlorvos were analyzed by colorimetry and gas chromatography (GC), respectively. The results showed that there was no significant difference between colorimetry and GC (P > 0.05). Experiments on accuracy, precision and repeatability revealed good performance for detection of OPs. AChE can also be inhibited by carbamates, and therefore this method has potential applications to real samples containing OPs and carbamates because of its high selectivity and sensitivity.
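
    As a rough sketch of the image-analysis step, the code below converts an RGB photograph of the assay to CMYK and extracts the mean yellow density, which would then feed the non-linear (ANN) calibration model; the naive conversion formula and whole-image averaging are illustrative assumptions, not the published procedure.

        import numpy as np
        from PIL import Image

        def yellow_density(path):
            # Naive RGB -> CMYK conversion; real assays may need color calibration
            # and a region of interest around the test spot.
            rgb = np.asarray(Image.open(path).convert("RGB")) / 255.0
            k = 1.0 - rgb.max(axis=2)                     # black channel
            denom = np.clip(1.0 - k, 1e-9, None)
            y = (1.0 - rgb[..., 2] - k) / denom           # yellow channel
            return float(y.mean())                        # mean yellow density

        # Densities of standards with known dichlorvos concentrations would train
        # the ANN; an unknown sample is then read off the fitted model.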

  5. Second order analysis for spatial Hawkes processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Torrisi, Giovanni Luca

    We derive summary statistics for stationary Hawkes processes which can be considered as spatial versions of classical Hawkes processes. In particular, we derive the intensity, the pair correlation function and the Bartlett spectrum. Our results for Gaussian fertility rates and the extension to marked Hawkes processes are discussed.
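
    For orientation, the classical temporal Hawkes process that these spatial versions generalize is defined through its conditional intensity; this is the textbook form, not necessarily the authors' exact parametrization:

        \lambda(t \mid \mathcal{H}_t) \;=\; \mu + \sum_{t_i < t} \gamma(t - t_i),

    where \mu > 0 is the background intensity, \mathcal{H}_t is the history of event times t_i before t, and \gamma \ge 0 is the fertility (offspring) rate; stationarity requires \int_0^\infty \gamma(s)\, ds < 1. A Gaussian fertility rate corresponds to taking \gamma proportional to a Gaussian density.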

  6. Effect of Using Automated Auditing Tools on Detecting Compliance Failures in Unmanaged Processes

    Science.gov (United States)

    Doganata, Yurdaer; Curbera, Francisco

    The effect of using automated auditing tools to detect compliance failures in unmanaged business processes is investigated. In the absence of a process execution engine, compliance of an unmanaged business process is tracked by using an auditing tool developed based on business provenance technology or employing auditors. Since budget constraints limit employing auditors to evaluate all process instances, a methodology is devised to use both expert opinion on a limited set of process instances and the results produced by fallible automated audit machines on all process instances. An improvement factor is defined based on the average number of non-compliant process instances detected and it is shown that the improvement depends on the prevalence of non-compliance in the process as well as the sensitivity and the specificity of the audit machine.

  7. 3-D analysis of grain selection process

    Science.gov (United States)

    Arao, Tomoka; Esaka, Hisao; Shinozuka, Kei

    2012-07-01

    It is known that grain selection plays an important role in the manufacturing process for turbine blades. There are some analytical and numerical models that treat grain selection; however, the detailed mechanism of grain selection in 3-D is still uncertain. Therefore, experimental research using an Al-Cu alloy has been carried out in order to understand grain selection in 3-D. A mold made of Al2O3 was heated to 600 °C (= the liquidus temperature of the alloy) and set on a water-cooled copper chill plate. Molten Al-20 wt%Cu alloy was cast into the mold and a unidirectionally solidified ingot was prepared. The size of the ingot was approximately φ25 × 65 mm (height). To obtain the thermal history, 4 thermocouples were placed in the mold. It was confirmed that the alloy solidified unidirectionally from bottom to top. The solidified structure on a longitudinal cross section was observed, and unidirectional solidification up to 40 mm was ensured. EBSD analysis was performed on horizontal cross sections at intervals of ca. 200 μm. These observations were carried out 5-7 mm from the bottom surface. The crystallographic orientation of the primary Al phase and the size of the solidified grains were characterized. A large solidified grain, the crystallographic orientation of which is approximately along the heat flow direction, is observed near the lowest cross section. The number of grains decreased as solidification proceeded; correspondingly, the area of the surviving grains increased.

  8. Automatic zebrafish heartbeat detection and analysis for zebrafish embryos.

    Science.gov (United States)

    Pylatiuk, Christian; Sanchez, Daniela; Mikut, Ralf; Alshut, Rüdiger; Reischl, Markus; Hirth, Sofia; Rottbauer, Wolfgang; Just, Steffen

    2014-08-01

    A fully automatic detection and analysis method of heartbeats in videos of nonfixed and nonanesthetized zebrafish embryos is presented. This method reduces the manual workload and time needed for preparation and imaging of the zebrafish embryos, as well as for evaluating heartbeat parameters such as frequency, beat-to-beat intervals, and arrhythmicity. The method is validated by a comparison of the results from automatic and manual detection of the heart rates of wild-type zebrafish embryos 36-120 h postfertilization and of embryonic hearts with bradycardia and pauses in the cardiac contraction.
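
    A minimal sketch of how a heart rate can be read out of such a video automatically, assuming the beating heart dominates intensity changes in a cropped region; the frame rate, crop, and frequency band are illustrative assumptions, not the published method.

        import numpy as np

        def heart_rate_bpm(frames, fps, lo=1.0, hi=5.0):
            # frames: (T, H, W) grayscale video cropped around the heart.
            trace = frames.reshape(len(frames), -1).mean(axis=1)
            trace = trace - trace.mean()
            spec = np.abs(np.fft.rfft(trace))
            freqs = np.fft.rfftfreq(len(trace), d=1.0 / fps)
            band = (freqs >= lo) & (freqs <= hi)      # plausible embryonic rates
            f_peak = freqs[band][np.argmax(spec[band])]
            return 60.0 * f_peak                      # beats per minute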

  9. Corpus analysis and automatic detection of emotion-inducing keywords

    Science.gov (United States)

    Yuan, Bo; He, Xiangqing; Liu, Ying

    2013-12-01

    Emotion words play a vital role in many sentiment analysis tasks. Previous research uses sentiment dictionaries to detect the subjectivity or polarity of words. In this paper, we dive into Emotion-Inducing Keywords (EIK), i.e., words that convey emotion in use. We first analyze an emotion corpus to explore the pragmatic aspects of EIK. Then we design an effective framework for automatically detecting EIK in sentences by utilizing linguistic features and context information. Our system dramatically outperforms traditional dictionary-based methods in Precision, Recall and F1-score.

  10. Finite Element Analysis on Nanomechanical Detection of Small Particles: Toward Virus Detection

    Science.gov (United States)

    Imamura, Gaku; Shiba, Kota; Yoshikawa, Genki

    2016-01-01

    Detection of small particles, including viruses and particulate matter (PM), has been attracting much attention in light of the increasing need for environmental monitoring. Owing to their high versatility, nanomechanical sensors are among the most promising sensors that can be adapted to various monitoring systems. In this study, we present an optimization strategy to efficiently detect small particles with nanomechanical sensors. Adsorption of particles on the receptor layer of nanomechanical sensors and the resultant signal are analyzed using finite element analysis (FEA). We investigate the effect of structural parameters (e.g., adsorption position and embedded depth of a particle and thickness of the receptor layer) and elastic properties of the receptor layer (e.g., Young's modulus and Poisson's ratio) on the sensitivity. It is found that membrane-type surface stress sensors (MSS) have the potential for robust detection of small particles. PMID:27148181

  11. Speckle Tracking Based Strain Analysis Is Sensitive for Early Detection of Pathological Cardiac Hypertrophy.

    Science.gov (United States)

    An, Xiangbo; Wang, Jingjing; Li, Hao; Lu, Zhizhen; Bai, Yan; Xiao, Han; Zhang, Youyi; Song, Yao

    2016-01-01

    Cardiac hypertrophy is a key pathological process in many cardiac diseases. However, early detection of cardiac hypertrophy is difficult with currently used non-invasive methods, and new approaches are urgently needed for efficient diagnosis of cardiac malfunction. Here we report that speckle tracking-based strain analysis is more sensitive than conventional echocardiography for early detection of pathological cardiac hypertrophy in the isoproterenol (ISO) mouse model. Pathological hypertrophy was induced by a single subcutaneous injection of ISO. Physiological cardiac hypertrophy was established by daily treadmill exercise for six weeks. Strain analysis, including radial strain (RS), radial strain rate (RSR) and longitudinal strain (LS), showed a marked decrease as early as 3 days after ISO injection. Moreover, unlike the regional changes in cardiac infarction, strain analysis revealed global cardiac dysfunction affecting the entire heart in ISO-induced hypertrophy. In contrast, conventional echocardiography only detected an altered E/E', an index reflecting cardiac diastolic function, at 7 days after ISO injection. No change was detected in fractional shortening (FS), E/A or E'/A' at 3 days or 7 days after ISO injection. Interestingly, strain analysis revealed cardiac dysfunction only in ISO-induced pathological hypertrophy but not in the physiological hypertrophy induced by exercise. Taken together, our study indicates that strain analysis offers a more sensitive approach for early detection of cardiac dysfunction than conventional echocardiography. Moreover, multiple strain readouts distinguish pathological cardiac hypertrophy from physiological hypertrophy.

  12. Detection and quantification of waterborne microorganisms using an image cytometer based on angular spatial frequency processing

    CERN Document Server

    Pérez, Juan Miguel; Martínez, Pedro; Pruneri, Valerio

    2015-01-01

    We introduce a new image cytometer design for detection of very small particulates and demonstrate its capability in water analysis. The device is a compact microscope composed of off-the-shelf components, such as a light emitting diode (LED) source, a complementary metal-oxide-semiconductor (CMOS) image sensor, and a specific combination of optical lenses that allows, through appropriate software, Fourier transform processing of the sample volume. Waterborne microorganisms, such as Escherichia coli (E. coli), Legionella pneumophila (L. pneumophila) and phytoplankton, are detected by interrogating the volume sample either in a fluorescent or label-free mode, i.e. with or without fluorescein isothiocyanate (FITC) molecules attached to the micro-organisms, respectively. We achieve a sensitivity of 50 CFU/ml, which can be further increased to 0.2 CFU/ml by pre-concentrating an initial sample volume of 500 ml with an ad hoc fluidic system. We also prove the capability of the proposed image cytometer of diffe...

  13. Managing Analysis Models in the Design Process

    Science.gov (United States)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  14. Articulating the Resources for Business Process Analysis and Design

    Science.gov (United States)

    Jin, Yulong

    2012-01-01

    Effective process analysis and modeling are important phases of the business process management lifecycle. When many activities and multiple resources are involved, it is very difficult to build a correct business process specification. This dissertation provides a resource perspective of business processes. It aims at a better process analysis…

  15. Seismic detection and analysis of icequakes at Columbia Glacier, Alaska

    Science.gov (United States)

    O'Neel, Shad; Marshall, Hans P.; McNamara, Daniel E.; Pfeffer, William Tad

    2007-01-01

    Contributions to sea level rise from rapidly retreating marine-terminating glaciers are large and increasing. Strong increases in iceberg calving occur during retreat, which allows mass transfer to the ocean at a much higher rate than possible through surface melt alone. To study this process, we deployed an 11-sensor passive seismic network at Columbia Glacier, Alaska, during 2004–2005. We show that calving events generate narrow-band seismic signals, allowing frequency domain detections. Detection parameters were determined using direct observations of calving and validated using three statistical methods and hypocenter locations. The 1–3 Hz detections provide a good measure of the temporal distribution and size of calving events. Possible source mechanisms for the unique waveforms are discussed, and we analyze potential forcings for the observed seismicity.
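
    A rough sketch of the kind of frequency-domain detector described, assuming a 1-3 Hz band and a simple envelope threshold; the sample rate, threshold, and filter order are illustrative assumptions rather than the authors' detection parameters.

        import numpy as np
        from scipy.signal import butter, sosfiltfilt, hilbert

        def calving_detections(trace, fs, threshold=5.0):
            # Band-pass the seismogram to the 1-3 Hz band where calving energy
            # is concentrated, then threshold a robust estimate of the envelope.
            sos = butter(4, [1.0, 3.0], btype="bandpass", fs=fs, output="sos")
            band = sosfiltfilt(sos, trace)
            env = np.abs(hilbert(band))                    # signal envelope
            mad = np.median(np.abs(env - np.median(env)))  # robust noise level
            return np.where(env > threshold * mad)[0]      # samples above threshold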

  16. Foreign object detection and removal to improve automated analysis of chest radiographs

    Energy Technology Data Exchange (ETDEWEB)

    Hogeweg, Laurens; Sanchez, Clara I.; Melendez, Jaime; Maduskar, Pragnya; Ginneken, Bram van [Diagnostic Image Analysis Group, Radboud University Nijmegen Medical Centre, Nijmegen 6525 GA (Netherlands); Story, Alistair; Hayward, Andrew [University College London, Centre for Infectious Disease Epidemiology, London NW3 2PF (United Kingdom)

    2013-07-15

    Purpose: Chest radiographs commonly contain projections of foreign objects, such as buttons, brassiere clips, jewellery, or pacemakers and wires. The presence of these structures can substantially affect the output of computer analysis of these images. An automated method is presented to detect, segment, and remove foreign objects from chest radiographs. Methods: Detection is performed using supervised pixel classification with a kNN classifier, resulting in a probability estimate per pixel of belonging to a projected foreign object. Segmentation is performed by grouping and post-processing pixels with a probability above a certain threshold. Next, the objects are replaced by texture inpainting. Results: The method is evaluated in experiments on 257 chest radiographs. The detection at pixel level is evaluated with receiver operating characteristic analysis on pixels within the unobscured lung fields, and an A_z value of 0.949 is achieved. Free-response operating characteristic analysis is performed at the object level: 95.6% of objects are detected, with on average 0.25 false positive detections per image. To investigate the effect of removing the detected objects through inpainting, a texture analysis system for tuberculosis detection is applied to images with and without pathology and with and without foreign object removal. Unprocessed, the texture analysis abnormality score of normal images with foreign objects is comparable to that of images with pathology. After removing foreign objects, the texture score of normal images with and without foreign objects is similar, while abnormal images, whether they contain foreign objects or not, achieve on average higher scores. Conclusions: The authors conclude that removal of foreign objects from chest radiographs is feasible and beneficial for automated image analysis.
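
    As a rough illustration of the pixel-classification step, the sketch below trains a kNN classifier on per-pixel feature vectors and thresholds the resulting posterior probability; the features, neighbor count, and the hypothetical training arrays X_train and y_train are illustrative assumptions, not the published configuration.

        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier

        # X_train: (n_pixels, n_features) per-pixel features (e.g., intensity and
        # local texture) from annotated radiographs; y_train: 1 = foreign object.
        knn = KNeighborsClassifier(n_neighbors=15)
        knn.fit(X_train, y_train)                     # hypothetical training data

        def object_mask(features, threshold=0.5):
            # Posterior probability per pixel of belonging to a projected object;
            # pixels above the threshold are grouped into candidate objects.
            flat = features.reshape(-1, features.shape[-1])
            prob = knn.predict_proba(flat)[:, 1]
            return (prob > threshold).reshape(features.shape[:-1])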

  17. Analysis of accelerants and fire debris using aroma detection technology

    Energy Technology Data Exchange (ETDEWEB)

    Barshick, S.A.

    1997-01-17

    The purpose of this work was to investigate the utility of electronic aroma detection technologies for the detection and identification of accelerant residues in suspected arson debris. Through the analysis of known accelerant residues, a trained neural network was developed for classifying suspected arson samples. Three unknown fire debris samples were classified using this neural network. The item corresponding to diesel fuel was correctly identified every time. For the other two items, wide variations in sample concentration and excessive water content, producing high sample humidities, were shown to influence the sensor response. Sorbent sampling prior to aroma detection was demonstrated to reduce these problems and to allow proper neural network classification of the remaining items corresponding to kerosene and gasoline.

  18. Reliability analysis for the quench detection in the LHC machine

    CERN Document Server

    Denz, R; Vergara-Fernández, A

    2002-01-01

    The Large Hadron Collider (LHC) will incorporate a large number of superconducting elements that require protection in case of a quench. Key elements in the quench protection system are the electronic quench detectors. Their reliability will have an important impact on the downtime as well as on the operational cost of the collider. The expected rates of both false and missed quenches have been computed for several redundant detection schemes. The developed model takes account of the maintainability of the system to optimise the frequency of foreseen checks, and evaluates their influence on the performance of different detection topologies. Given the uncertainty in the failure rates of the components, combined with the LHC tunnel environment, the study has been completed with a sensitivity analysis of the results. The chosen detection scheme and the maintainability strategy for each detector family are given.

  19. A User Requirements Analysis Approach Based on Business Processes

    Institute of Scientific and Technical Information of China (English)

    ZHENG Yue-bin; HAN Wen-xiu

    2001-01-01

    Requirements analysis is the most important phase of information system development. Existing requirements analysis techniques pay little or no attention to the features of different business processes. This paper presents a user requirements analysis approach which focuses on business processes in the early stage of requirements analysis. It also gives an example of the use of this approach in the analysis of an enterprise information system.

  20. Sentiment Detection of Web Users Using Probabilistic Latent Semantic Analysis

    Directory of Open Access Journals (Sweden)

    Weijian Ren

    2014-10-01

    Full Text Available With the wide application of the Internet in almost all fields, it has become the most important medium for information publication, providing a large number of channels for spreading public opinion. Public opinions, as the responses of Internet users to information such as social events and government policies, reflect the status of both society and the economy, which is highly valuable for the decision-making and public relations of enterprises. At present, analysis methods for Internet public opinion are mainly based on discriminative approaches, such as Support Vector Machines (SVM) and neural networks. However, when these approaches analyze the sentiment of Internet public opinion, they fail to exploit information hidden in the text, e.g. topic. Motivated by this observation, this paper proposes a detection method for public sentiment based on the Probabilistic Latent Semantic Analysis (PLSA) model. PLSA inherits the advantages of LSA, exploiting the semantic topics hidden in the data. The procedure for detecting public sentiment using this algorithm is composed of three main steps: (1) Chinese word segmentation and word refinement, with which each document is represented by a bag of words; (2) modeling the probabilistic distribution of documents using PLSA; (3) using the Z-vector of PLSA as the document features and delivering them to an SVM for sentiment detection. We collected a set of text data from Weibo, blogs, BBS etc. to evaluate our proposed approach. The experimental results show that the proposed method can detect public sentiment with high accuracy, outperforming state-of-the-art approaches such as the word-histogram-based approach. The results also suggest that text semantic analysis using PLSA can significantly boost sentiment detection.
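
    For concreteness, a compact sketch of steps (2) and (3) is given below: PLSA fitted by EM on a document-term count matrix, with the per-document topic mixture P(z|d) (the Z-vector) handed to an SVM; the matrix layout, topic count and iteration budget are illustrative assumptions, not the paper's settings.

        import numpy as np
        from sklearn.svm import SVC

        def plsa(N, K=10, iters=50, seed=0):
            # N: (docs, words) count matrix; returns P(z|d), the Z-vectors.
            rng = np.random.default_rng(seed)
            D, W = N.shape
            p_z_d = rng.random((D, K)); p_z_d /= p_z_d.sum(1, keepdims=True)  # P(z|d)
            p_w_z = rng.random((K, W)); p_w_z /= p_w_z.sum(1, keepdims=True)  # P(w|z)
            for _ in range(iters):
                r = p_z_d[:, :, None] * p_w_z[None, :, :]    # E-step: P(z|d,w)
                r /= r.sum(axis=1, keepdims=True) + 1e-12
                nr = N[:, None, :] * r                       # n(d,w) * P(z|d,w)
                p_w_z = nr.sum(axis=0)                       # M-step
                p_w_z /= p_w_z.sum(axis=1, keepdims=True) + 1e-12
                p_z_d = nr.sum(axis=2)
                p_z_d /= p_z_d.sum(axis=1, keepdims=True) + 1e-12
            return p_z_d

        # z_vectors = plsa(counts); SVC(kernel="rbf").fit(z_vectors, labels)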

  1. A Survey on Malware Propagation, Analysis, and Detection

    Directory of Open Access Journals (Sweden)

    Mohsen Damshenas

    2015-05-01

    Full Text Available Lately, a new kind of war has been taking place between the security community and malicious software developers: security specialists use all possible techniques, methods and strategies to stop and remove threats, while malware developers utilize new types of malware that bypass implemented security features. In this study we look closely into malware in order to understand its definition, types, and propagation, as well as detection and defence mechanisms, so as to contribute to the process of protection and security enhancement.

  2. DROIDSWAN: DETECTING MALICIOUS ANDROID APPLICATIONS BASED ON STATIC FEATURE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Babu Rajesh V

    2015-07-01

    Full Text Available Android, being a widely used mobile platform, has witnessed an increase in the number of malicious samples on its market place. The availability of multiple sources for downloading applications has also contributed to users falling prey to malicious applications. Classification of an Android application as malicious or benign remains a challenge because malicious applications maneuver to pose as benign. This paper presents an approach which extracts various features from Android Application Package files (APKs) using static analysis and subsequently classifies them using machine learning techniques. The contributions of this work include deriving, extracting and analyzing crucial features of Android applications that aid in efficient classification. The analysis is carried out using various machine learning algorithms with both weighted and non-weighted approaches. It was observed that the weighted approach yields higher detection rates using fewer features. The Random Forest algorithm exhibited a high detection rate and the least false positive rate.
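
    A rough sketch of the classification stage, assuming the static features have already been extracted from each APK into a binary matrix X (e.g., requested permissions and API calls parsed from the package) with labels y; the feature set and evaluation protocol are illustrative assumptions, and the paper's feature-weighting variant is omitted.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        # X: (n_apps, n_features) binary static-feature matrix per APK;
        # y: 1 = malicious, 0 = benign (both hypothetical inputs).
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        detection = cross_val_score(clf, X, y, cv=5, scoring="recall")
        print(detection.mean())                        # mean detection rate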

  3. Speciation analysis by gas chromatography with plasma source spectrometric detection

    Science.gov (United States)

    Łobiński, Ryszard; Adams, Freddy C.

    State-of-the-art species-selective analysis by gas chromatography (GC) with plasma source spectrometric detection is discussed for organometal and organometalloid compounds. Various plasmas, inductively coupled plasma, microwave induced plasma, capacitively coupled plasma, direct current plasma and alternating current plasma, are characterized and critically compared as sources of radiation for atomic emission spectrometry and sources of ions for mass spectrometry. Interfaces between gas chromatography (packed, wide-bore, capillary and multicapillary) and plasma source spectrometry are characterized. Particular emphasis is given to applications of GC with plasma source detection to real-world analytical problems, which are comprehensively reviewed. The use of plasmas for the acquisition of auxiliary molecular information such as empirical formulae and structural information is discussed. Recent developments relating to sample preparation and presentation to the hyphenated system are addressed. The most significant trends in speciation analysis are highlighted.

  4. Advanced Polymer Composite Molding Through Intelligent Process Analysis and Control

    Science.gov (United States)

    2004-11-30

    In this project, process analysis of Resin Transfer Molding (RTM) was carried out and adaptive process control models were developed. The report presents the aforementioned work in three separate sections: (1) process analysis and adaptive control modeling, (2) manufacturing of a non-invasive sensor, and (3) a list of publications resulting from this project.

  5. Review of shock wave detection method in CFD post-processing

    Institute of Scientific and Technical Information of China (English)

    Wu Ziniu; Xu Yizhe; Wang Wenbin; Hu Ruifeng

    2013-01-01

    In the present computational fluid dynamics (CFD) community, post-processing is regarded as a procedure to view parameter distributions, detect characteristic structures and reveal physical mechanisms of fluid flow based on computational or experimental results. Field plots by contours, iso-surfaces, streamlines, vectors and others are traditional post-processing techniques. The shock wave, as one important and critical flow structure in many aerodynamic problems, can hardly be detected or distinguished in a direct way using these traditional methods, due to possible confusion with other similar discontinuous flow structures like slip lines, contact discontinuities, etc. Therefore, methods for automatic detection of shock waves in post-processing are of great importance for both academic research and engineering applications. In this paper, the current status of methodologies developed for shock wave detection and their implementations in post-processing platforms is reviewed, together with discussions on the advantages and limitations of the existing methods and proposals for further studies of shock wave detection. We also develop an advanced post-processing software package with improved shock detection.

  6. Probabilistic analysis of a thermosetting pultrusion process

    NARCIS (Netherlands)

    Baran, Ismet; Tutum, Cem C.; Hattel, Jesper H.

    2016-01-01

    In the present study, the effects of uncertainties in the material properties of the composite being processed and in the resin kinetic parameters, as well as in process parameters such as pulling speed and inlet temperature, on product quality (exit degree of cure) are investigated for a pultrusion process.

  7. Clinical Detection and Feature Analysis on Neuro Signals

    Institute of Scientific and Technical Information of China (English)

    张晓文; 杨煜普; 许晓鸣; 胡天培; 高忠华; 张键; 陈中伟; 陈统一

    2004-01-01

    Research on neuro signals is challenging and significant in modern natural science. In clinical experiments, signals from three main nerves (the median nerve, radial nerve and ulnar nerve) were successfully detected and recorded without any infection. Further analyses of their features under different movements, and of their mechanisms and correlations in dominating actions, were also performed. The original discovery and first-hand materials make it possible to develop practical neuro-prostheses.

  8. Food Microbial Pathogen Detection and Analysis Using DNA Microarray Technologies

    OpenAIRE

    Rasooly, Avraham; Herold, Keith E.

    2008-01-01

    Culture-based methods used for microbial detection and identification are simple to use, relatively inexpensive, and sensitive. However, culture-based methods are too time-consuming for high-throughput testing and too tedious for analysis of samples with multiple organisms, and they provide little clinical information regarding the pathogen (e.g., antibiotic resistance genes, virulence factors, or strain subtype). DNA-based methods, such as polymerase chain reaction (PCR), overcome some of these limitations...

  9. Fitness Landscape Analysis for Optimum Multiuser Detection Problem

    Institute of Scientific and Technical Information of China (English)

    WANG Shaowei; ZHU Qiuping

    2007-01-01

    Optimum multiuser detection (OMD) for CDMA systems is an NP-complete combinatorial optimization problem. The fitness landscape has been proven to be very useful for understanding the behavior of combinatorial optimization algorithms and can help in predicting their performance. This paper analyzes the statistical properties of the fitness landscape of the OMD problem by performing autocorrelation analysis, a fitness distance correlation test and an epistasis measure. The analysis results explain why some random search algorithms are effective methods for the OMD problem and give hints on how to design more efficient randomized search heuristics for OMD.

  10. Development of vibrational analysis for detection of antisymmetric shells

    CERN Document Server

    Esmailzadeh-Khadem, S; Rezaee, M

    2002-01-01

    In this paper, the vibrational behavior of bodies of revolution with different types of structural faults is studied. Since the vibrational characteristics of a structure are natural properties of the system, the existence of any structural fault causes measurable changes in these properties. Here, this is demonstrated: the vibrational behavior of a body of revolution with no structural faults is analyzed by two methods, (i) numerical analysis using the SUPER SAP software and (ii) experimental modal analysis, and the natural frequencies and mode shapes are obtained. Then, different types of cracks are introduced into the structure, the analysis is repeated and the results are compared. Based on this study, one may perform crack detection by measuring the natural frequencies and mode shapes of samples and comparing them with reference information obtained from the vibration analysis of the original structure with no fault.

  11. Influence of volume of sample processed on detection of Chlamydia trachomatis in urogenital samples by PCR

    NARCIS (Netherlands)

    Goessens, W H; Kluytmans, J A; den Toom, N; van Rijsoort-Vos, T H; Niesters, B G; Stolz, E; Verbrugh, H A; Quint, W G

    1995-01-01

    In the present study, it was demonstrated that the sensitivity of the PCR for the detection of Chlamydia trachomatis is influenced by the volume of the clinical sample which is processed in the PCR. An adequate sensitivity for PCR was established by processing at least 4%, i.e., 80 microliters, of the sample.

  12. Influence of volume of sample processed on detection of Chlamydia trachomatis in urogenital samples by PCR

    NARCIS (Netherlands)

    W.H.F. Goessens (Wil); J.A.J.W. Kluytmans (Jan); N. den Toom; T.H. van Rijsoort-Vos; E. Stolz (Ernst); H.A. Verbrugh (Henri); W.G.V. Quint (Wim); H.G.M. Niesters (Bert)

    1995-01-01

    In the present study, it was demonstrated that the sensitivity of the PCR for the detection of Chlamydia trachomatis is influenced by the volume of the clinical sample which is processed in the PCR. An adequate sensitivity for PCR was established by processing at least

  13. NOx sensor and process for detecting NOx

    Science.gov (United States)

    Dalla Betta, Ralph A.; Sheridan, David R.; Reed, Daniel L.

    1994-01-01

    This invention is a process for detecting low levels of nitrogen oxides (NOx) in a flowing gas stream (typically an exhaust gas stream) and a catalytic NOx sensor which may be used in that process.

  14. Error detection in GPS observations by means of Multi-process models

    DEFF Research Database (Denmark)

    Thomsen, Henrik F.

    2001-01-01

    The main purpose of this article is to present the idea of using Multi-process models as a method for detecting errors in GPS observations. The theory behind Multi-process models and double differenced phase observations in GPS is presented briefly. It is shown how to model cycle slips in the Multi-process models.

  15. Adaptive Image Processing Methods for Improving Contaminant Detection Accuracy on Poultry Carcasses

    Science.gov (United States)

    A real-time multispectral imaging system has been demonstrated as a science-based tool for fecal and ingesta contaminant detection during poultry processing. In order to implement this imaging system in the commercial poultry processing industry, the false positives must be removed. For doi...

  16. Nonlinear damage detection in composite structures using bispectral analysis

    Science.gov (United States)

    Ciampa, Francesco; Pickering, Simon; Scarselli, Gennaro; Meo, Michele

    2014-03-01

    The literature offers a number of diagnostic methods that can continuously provide detailed information on material defects and damage in aerospace and civil engineering applications. Indeed, low velocity impact damage can considerably degrade the integrity of structural components and, if not detected, can result in catastrophic failure conditions. This paper presents a nonlinear Structural Health Monitoring (SHM) method, based on ultrasonic guided waves (GW), for the detection of the nonlinear signature in a damaged composite structure. The proposed technique, based on a bispectral analysis of ultrasonic input waveforms, allows for the evaluation of the nonlinear response due to the presence of cracks and delaminations. Such a methodology was used to characterize the nonlinear behaviour of the structure by exploiting the frequency mixing of the original waveform acquired from a sparse array of sensors. The robustness of bispectral analysis was experimentally demonstrated on a damaged carbon fibre reinforced plastic (CFRP) composite panel, and the nonlinear source was retrieved with a high level of accuracy. Unlike other linear and nonlinear ultrasonic methods for damage detection, this methodology requires neither a baseline measurement of the undamaged structure for the evaluation of the nonlinear source, nor a priori knowledge of the mechanical properties of the specimen. Moreover, bispectral analysis can be considered a nonlinear elastic wave spectroscopy (NEWS) technique for materials showing either classical or non-classical nonlinear behaviour.
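
    For readers unfamiliar with the tool, the sketch below gives a minimal direct (FFT-based) bispectrum estimate averaged over windowed segments; the segment length and Hann windowing are illustrative choices, not the authors' processing chain. Quadratic frequency mixing introduced by damage shows up as off-axis peaks in the magnitude of B.

        import numpy as np

        def bispectrum(x, nfft=256, noverlap=128):
            # Direct bispectrum estimate, averaged over Hann-windowed,
            # overlapping segments: B(f1, f2) = E[X(f1) X(f2) X*(f1 + f2)].
            step = nfft - noverlap
            nseg = (len(x) - noverlap) // step
            idx = (np.arange(nfft)[:, None] + np.arange(nfft)[None, :]) % nfft
            B = np.zeros((nfft, nfft), dtype=complex)
            for k in range(nseg):
                seg = x[k * step : k * step + nfft] * np.hanning(nfft)
                X = np.fft.fft(seg)
                B += X[:, None] * X[None, :] * np.conj(X[idx])
            return B / max(nseg, 1)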

  17. Application of image processing methodologies for fruit detection and analysis

    OpenAIRE

    Font Calafell, Davinia

    2014-01-01

    This report presents several research works focused on the automation of agricultural operations through the application of various image processing techniques. First, a method developed to detect and count grape clusters by locating intensity peaks on spherical surfaces is presented. Second, an automatic fruit harvesting system is developed by combining a low-cost stereoscopic camera and an arm...

  18. DETECTING ABNORMAL BEHAVIOR IN SOCIAL NETWORK WEBSITES BY USING A PROCESS MINING TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Mahdi Sahlabadi

    2014-01-01

    Full Text Available Detecting abnormal user activity in social network websites could prevent cyber-crime. Previous research focused on data mining, while this research is based on the user behavior process. In this study, the first step is defining a normal user behavioral pattern and the second step is detecting abnormal behavior. These two steps are applied to a case study that includes real and synthetic data sets to obtain more tangible results. The technique chosen to define the pattern is process mining, which requires an affordable, complete and noise-free event log. The proposed model discovers normal behavior with a genetic process mining technique, and abnormal activities are detected by the fitness function, which is based on Petri net rules. Although applying genetic mining is a time-consuming process, it can overcome the risks of noisy data and produces a comprehensive normal model in Petri net representation form.

  19. Detection of microbial biofilms on food processing surfaces: hyperspectral fluorescence imaging study

    Science.gov (United States)

    Jun, Won; Kim, Moon S.; Chao, Kaunglin; Lefcourt, Alan M.; Roberts, Michael S.; McNaughton, James L.

    2009-05-01

    We used a portable hyperspectral fluorescence imaging system to evaluate biofilm formations on four types of food processing surface materials including stainless steel, polypropylene used for cutting boards, and household counter top materials such as formica and granite. The objective of this investigation was to determine a minimal number of spectral bands suitable to differentiate microbial biofilm formation from the four background materials typically used during food processing. Ultimately, the resultant spectral information will be used in development of handheld portable imaging devices that can be used as visual aid tools for sanitation and safety inspection (microbial contamination) of the food processing surfaces. Pathogenic E. coli O157:H7 and Salmonella cells were grown in low strength M9 minimal medium on various surfaces at 22 +/- 2 °C for 2 days for biofilm formation. Biofilm autofluorescence under UV excitation (320 to 400 nm) obtained by hyperspectral fluorescence imaging system showed broad emissions in the blue-green regions of the spectrum with emission maxima at approximately 480 nm for both E. coli O157:H7 and Salmonella biofilms. Fluorescence images at 480 nm revealed that for background materials with near-uniform fluorescence responses such as stainless steel and formica cutting board, regardless of the background intensity, biofilm formation can be distinguished. This suggested that a broad spectral band in the blue-green regions can be used for handheld imaging devices for sanitation inspection of stainless, cutting board, and formica surfaces. The non-uniform fluorescence responses of granite make distinctions between biofilm and background difficult. To further investigate potential detection of the biofilm formations on granite surfaces with multispectral approaches, principal component analysis (PCA) was performed using the hyperspectral fluorescence image data. The resultant PCA score images revealed distinct contrast between
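
    A minimal sketch of the PCA step on hyperspectral fluorescence image data, assuming the data cube is organized as rows x columns x bands; the component count and lack of preprocessing are illustrative assumptions.

        import numpy as np
        from sklearn.decomposition import PCA

        def pca_score_images(cube, n_components=3):
            # cube: (H, W, B) hyperspectral fluorescence image, one spectrum
            # per pixel; returns per-pixel PCA scores reshaped into images.
            h, w, b = cube.shape
            pixels = cube.reshape(-1, b).astype(float)
            scores = PCA(n_components=n_components).fit_transform(pixels)
            return scores.reshape(h, w, n_components)

        # Contrast between biofilm and background is then inspected in the first
        # few score images, e.g. pca_score_images(cube)[:, :, 0].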

  20. Consistent approach to edge detection using multiscale fuzzy modeling analysis in the human retina

    Directory of Open Access Journals (Sweden)

    Mehdi Salimian

    2012-06-01

    Full Text Available Today, many widely used image processing algorithms based on the human visual system have been developed. In this paper, a smart edge detection method based on modeling the performance of simple and complex cells, as well as multi-scale image processing in the primary visual cortex, is presented. A way to adjust the parameters of Gabor filters (mathematical models of simple cells) and a non-linear threshold response are proposed in order to model simple and complex cells. Also, owing to the multi-scale analysis modeled on the human retina, the proposed algorithm detects and localizes all edges of small and large structures with high precision. Comparing the results of the proposed method with conventional methods on a reliable database shows the higher performance (about 4-13%) and reliability of the proposed method in edge detection and localization.
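
    A minimal sketch of a Gabor-based edge map in this spirit, pooling filter energy over orientations and scales as a crude stand-in for simple- and complex-cell responses; the frequencies, orientation count, and threshold are illustrative assumptions, not the paper's tuned parameters.

        import numpy as np
        from skimage.filters import gabor

        def gabor_edge_map(image, frequencies=(0.1, 0.2, 0.4), n_orient=8):
            # Maximum Gabor energy over orientations and scales; the quadrature
            # magnitude mimics complex-cell pooling of simple-cell outputs.
            edge = np.zeros_like(image, dtype=float)
            for f in frequencies:                      # multi-scale analysis
                for k in range(n_orient):
                    real, imag = gabor(image, frequency=f, theta=k * np.pi / n_orient)
                    edge = np.maximum(edge, np.hypot(real, imag))
            return edge > edge.mean() + 2 * edge.std() # naive non-linear threshold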

  1. Sequential decision analysis for nonstationary stochastic processes

    Science.gov (United States)

    Schaefer, B.

    1974-01-01

    A formulation of the problem of making decisions concerning the state of nonstationary stochastic processes is given. An optimal decision rule, for the case in which the stochastic process is independent of the decisions made, is derived. It is shown that this rule is a generalization of the Bayesian likelihood ratio test; and an analog to Wald's sequential likelihood ratio test is given, in which the optimal thresholds may vary with time.

  2. Detection and Analysis of Twitter Trending Topics via Link Anomaly Detection

    Directory of Open Access Journals (Sweden)

    Chandan M G

    2015-04-01

    Full Text Available This paper involves two approaches for finding trending topics in social networks: a keyword-based approach and a link-based approach. Conventional keyword-based approaches to topic detection mainly focus on the frequencies of (textual) words. We propose a link-based approach which focuses on posts as reflected in the mentioning behavior of hundreds of users. Anomaly detection in the Twitter data set is carried out by retrieving trending topics from Twitter in a sequential manner through an API and training on the corresponding users; the computed anomaly scores are then aggregated across users. The aggregated anomaly score is fed into change-point analysis or burst detection in order to pinpoint the emerging topics. We have used real-time Twitter accounts, so the results vary according to the tweet trends. The experiment shows that the proposed link-based approach performs even better than the keyword-based approach.
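
    A minimal sketch of the change-point step on an aggregated anomaly-score series, using a basic two-sided CUSUM; the drift and threshold parameters are illustrative assumptions, not the paper's algorithm.

        import numpy as np

        def cusum_changepoints(scores, drift=0.5, threshold=5.0):
            # Two-sided CUSUM on standardized anomaly scores; an alarm is raised
            # when the cumulative deviation from the mean exceeds the threshold.
            z = (scores - scores.mean()) / (scores.std() + 1e-9)
            hi = lo = 0.0
            alarms = []
            for t, x in enumerate(z):
                hi = max(0.0, hi + x - drift)
                lo = max(0.0, lo - x - drift)
                if hi > threshold or lo > threshold:
                    alarms.append(t)                   # candidate emerging topic
                    hi = lo = 0.0
            return alarms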

  3. Optimal swab processing recovery method for detection of bioterrorism-related Francisella tularensis by real-time PCR.

    Science.gov (United States)

    Walker, Roblena E; Petersen, Jeannine M; Stephens, Kenyatta W; Dauphin, Leslie A

    2010-10-01

    Francisella tularensis, the etiological agent of tularemia, is regarded as a potential bioterrorism agent. The advent of bioterrorism has heightened awareness of the need for validated methods for processing environmental samples. In this study we determined the optimal method for processing environmental swabs for the recovery and subsequent detection of F. tularensis by the use of real-time PCR assays. Four swab processing recovery methods were compared: heat, sonication, vortexing, and the Swab Extraction Tube System (SETS). These methods were evaluated using cotton, foam, polyester and rayon swabs spiked with six pathogenic strains of F. tularensis. Real-time PCR analysis using a multi-target 5'nuclease assay for F. tularensis showed that the use of the SETS method resulted in the best limit of detection when evaluated using multiple strains of F. tularensis. We demonstrated also that the efficiency of F. tularensis recovery from swab specimens was not equivalent for all swab processing methodologies and, thus, that this variable can affect real-time PCR assay sensitivity. The effectiveness of the SETS method was independent of the automated DNA extraction method and real-time PCR platforms used. In conclusion, diagnostic laboratories can now potentially incorporate the SETS method into specimen processing protocols for the rapid and efficient detection of F. tularensis by real-time PCR during laboratory bioterrorism-related investigations.

  4. A New MANET wormhole detection algorithm based on traversal time and hop count analysis.

    Science.gov (United States)

    Karlsson, Jonny; Dooley, Laurence S; Pulkkis, Göran

    2011-01-01

    As demand increases for ubiquitous network facilities, infrastructure-less and self-configuring systems like Mobile Ad hoc Networks (MANET) are gaining popularity. MANET routing security however, is one of the most significant challenges to wide scale adoption, with wormhole attacks being an especially severe MANET routing threat. This is because wormholes are able to disrupt a major component of network traffic, while concomitantly being extremely difficult to detect. This paper introduces a new wormhole detection paradigm based upon Traversal Time and Hop Count Analysis (TTHCA), which in comparison to existing algorithms, consistently affords superior detection performance, allied with low false positive rates for all wormhole variants. Simulation results confirm that the TTHCA model exhibits robust wormhole route detection in various network scenarios, while incurring only a small network overhead. This feature makes TTHCA an attractive choice for MANET environments which generally comprise devices, such as wireless sensors, which possess a limited processing capability.

  5. Real-time Detection of Precursors to Epileptic Seizures: Non-Linear Analysis of System Dynamics.

    Science.gov (United States)

    Nesaei, Sahar; Sharafat, Ahmad R

    2014-04-01

    We propose a novel approach for detecting precursors to epileptic seizures in intracranial electroencephalograms (iEEG), which is based on the analysis of system dynamics. In the proposed scheme, the largest Lyapunov exponent of the discrete wavelet packet transform (DWPT) of the segmented EEG signals is considered as the discriminating feature. Such features are processed by a support vector machine (SVM) classifier to identify whether the corresponding segment of the EEG signal contains a precursor to an epileptic seizure. When consecutive EEG segments contain such precursors, a decision is made that a precursor is in fact detected. The proposed scheme is applied to the Freiburg dataset, and the results show that seizure precursors are detected in a time frame that, unlike with other existing schemes, is very convenient for patients, with a sensitivity of 100% and negligible false positive detection rates.
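
    A rough shape of this pipeline in Python might look as follows, using PyWavelets for the wavelet packet transform and scikit-learn for the SVM. The largest-Lyapunov estimator below is a deliberately crude nearest-neighbour divergence estimate, and all parameter values are assumptions rather than the paper's settings.

      import numpy as np
      import pywt                       # PyWavelets
      from sklearn.svm import SVC

      def largest_lyapunov(x, dim=5, tau=2, horizon=20):
          """Crude largest-Lyapunov estimate: mean log divergence of each
          delay-embedded point from its nearest (non-adjacent) neighbour
          after `horizon` steps."""
          n = len(x) - (dim - 1) * tau
          emb = np.array([x[i:i + (dim - 1) * tau + 1:tau] for i in range(n)])
          rates = []
          for i in range(n - horizon):
              d = np.linalg.norm(emb - emb[i], axis=1)
              d[max(0, i - tau):i + tau + 1] = np.inf   # skip temporal neighbours
              j = int(np.argmin(d[:n - horizon]))
              d0 = d[j]
              d1 = np.linalg.norm(emb[i + horizon] - emb[j + horizon])
              if np.isfinite(d0) and d0 > 0 and d1 > 0:
                  rates.append(np.log(d1 / d0) / horizon)
          return float(np.mean(rates))

      def segment_features(segment, level=3):
          """One Lyapunov feature per wavelet-packet node at `level`."""
          wp = pywt.WaveletPacket(segment, wavelet='db4', maxlevel=level)
          return [largest_lyapunov(node.data)
                  for node in wp.get_level(level, order='freq')]

      # clf = SVC(kernel='rbf').fit(X_train, y_train)   # labelled iEEG segments
      # is_precursor = clf.predict([segment_features(new_segment)])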

  6. A New MANET Wormhole Detection Algorithm Based on Traversal Time and Hop Count Analysis

    Directory of Open Access Journals (Sweden)

    Göran Pulkkis

    2011-11-01

    Full Text Available As demand increases for ubiquitous network facilities, infrastructure-less and self-configuring systems like Mobile Ad hoc Networks (MANET) are gaining popularity. MANET routing security however, is one of the most significant challenges to wide scale adoption, with wormhole attacks being an especially severe MANET routing threat. This is because wormholes are able to disrupt a major component of network traffic, while concomitantly being extremely difficult to detect. This paper introduces a new wormhole detection paradigm based upon Traversal Time and Hop Count Analysis (TTHCA), which in comparison to existing algorithms, consistently affords superior detection performance, allied with low false positive rates for all wormhole variants. Simulation results confirm that the TTHCA model exhibits robust wormhole route detection in various network scenarios, while incurring only a small network overhead. This feature makes TTHCA an attractive choice for MANET environments which generally comprise devices, such as wireless sensors, which possess a limited processing capability.

  7. Morphometric MRI analysis improves detection of focal cortical dysplasia type II.

    Science.gov (United States)

    Wagner, Jan; Weber, Bernd; Urbach, Horst; Elger, Christian E; Huppertz, Hans-Jürgen

    2011-10-01

    Focal cortical dysplasias type II (FCD II) are highly epileptogenic lesions frequently causing pharmacoresistant epilepsy. Detection of these lesions on MRI is still challenging as FCDs may be very subtle in appearance and might escape conventional visual analysis. Morphometric MRI analysis is a voxel-based post-processing method based on algorithms of the statistical parametric mapping software (SPM5). It creates three-dimensional feature maps highlighting brain areas with blurred grey-white matter junction and abnormal gyration, and thereby may help to detect FCD. In this study, we evaluated the potential diagnostic value of morphometric analysis as implemented in a morphometric analysis programme, compared with conventional visual analysis by an experienced neuroradiologist, in 91 patients with histologically proven FCD II operated on at the University Hospital of Bonn between 2000 and 2010 (FCD IIa, n = 17; IIb, n = 74). All preoperative MRI scans were evaluated independently (i) based on conventional visual analysis by an experienced neuroradiologist and (ii) using morphometric analysis. Both evaluators had the same clinical information (electroencephalography and semiology), but were blinded to each other's results. The detection rate of FCD using morphometric analysis was superior to conventional visual analysis in the FCD IIa subgroup (82% versus 65%), while no difference was found in the FCD IIb subgroup (92% versus 91%). However, the combination of conventional visual analysis and morphometric analysis provided complementary information and detected 89 out of all 91 FCDs (98%). The combination was significantly superior to conventional visual analysis alone in both subgroups resulting in a higher diagnostic sensitivity (94% versus 65%, P = 0.031 for FCD IIa; 99% versus 91%, P = 0.016 for FCD IIb). In conclusion, the additional application of morphometric MRI analysis increases the diagnostic sensitivity for FCD II in comparison with conventional visual analysis alone.

  8. [Multi-DSP parallel processing technique of hyperspectral RX anomaly detection].

    Science.gov (United States)

    Guo, Wen-Ji; Zeng, Xiao-Ru; Zhao, Bao-Wei; Ming, Xing; Zhang, Gui-Feng; Lü, Qun-Bo

    2014-05-01

    To satisfy the requirements of high speed, real-time operation, and mass data storage for RX anomaly detection in hyperspectral image data, this paper proposes a multi-DSP parallel processing system for hyperspectral imagery based on the CPCI Express standard bus architecture. The hardware topology of the system combines the tight coupling of four DSPs sharing a data bus and memory unit with the interconnection of Link ports. On this hardware platform, by assigning a parallel processing task to each DSP in accordance with the spectral RX anomaly detection algorithm and the 3D structure of the spectral image data, a 4-DSP parallel processing technique is proposed that computes the mean vector and covariance matrix of the whole image by spatially partitioning it. The experimental results show that, for equivalent detection performance, the proposed 4-DSP parallel implementation of the RX anomaly detection algorithm runs about four times faster than a single DSP, overcoming the constraint that a single DSP's internal storage capacity places on processing such large images while meeting the demands of real-time processing of the spectral data.
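
    The data-parallel core of such a scheme — per-partition partial sums combined into global statistics, followed by per-pixel Mahalanobis scoring — can be sketched in NumPy as below. The four partitions stand in for the four DSPs, and the function name and regularisation constant are hypothetical.

      import numpy as np

      def rx_scores_partitioned(cube, n_parts=4):
          """Global RX anomaly scores computed from per-partition partial
          sums; `cube` has shape (rows, cols, bands)."""
          rows, cols, bands = cube.shape
          pixels = cube.reshape(-1, bands).astype(np.float64)
          parts = np.array_split(pixels, n_parts, axis=0)   # one stripe per "DSP"
          n = pixels.shape[0]
          s1 = sum(p.sum(axis=0) for p in parts)            # partial pixel sums
          s2 = sum(p.T @ p for p in parts)                  # partial outer-product sums
          mu = s1 / n
          cov = s2 / n - np.outer(mu, mu)                   # global covariance
          inv_cov = np.linalg.inv(cov + 1e-9 * np.eye(bands))
          centred = pixels - mu
          # RX score: Mahalanobis distance of each pixel from the global mean.
          scores = np.einsum('ij,jk,ik->i', centred, inv_cov, centred)
          return scores.reshape(rows, cols)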

  9. Chemical Sensing for Buried Landmines - Fundamental Processes Influencing Trace Chemical Detection

    Energy Technology Data Exchange (ETDEWEB)

    PHELAN, JAMES M.

    2002-05-01

    Mine detection dogs have a demonstrated capability to locate hidden objects by trace chemical detection. Because of this capability, demining activities frequently employ mine detection dogs to locate individual buried landmines or for area reduction. The conditions appropriate for use of mine detection dogs are only beginning to emerge through diligent research that combines dog selection/training, the environmental conditions that impact landmine signature chemical vapors, and vapor sensing performance capability and reliability. This report seeks to address the fundamental soil-chemical interactions, driven by local weather history, that influence the availability of chemical for trace chemical detection. The processes evaluated include: landmine chemical emissions to the soil, chemical distribution in soils, chemical degradation in soils, and weather and chemical transport in soils. Simulation modeling is presented as a method to evaluate the complex interdependencies among these various processes and to establish conditions appropriate for trace chemical detection. Results from chemical analyses on soil samples obtained adjacent to landmines are presented and demonstrate the ultra-trace nature of these residues. Lastly, initial measurements of the vapor sensing performance of mine detection dogs demonstrate the extreme sensitivity of dogs in sensing landmine signature chemicals; however, reliability at these ultra-trace vapor concentrations still needs to be determined. Through this compilation, additional work is suggested that will fill in data gaps to improve the utility of trace chemical detection.

  10. Effect of image processing version on detection of non-calcification cancers in 2D digital mammography imaging

    Science.gov (United States)

    Warren, L. M.; Cooke, J.; Given-Wilson, R. M.; Wallis, M. G.; Halling-Brown, M.; Mackenzie, A.; Chakraborty, D. P.; Bosmans, H.; Dance, D. R.; Young, K. C.

    2013-03-01

    Image processing (IP) is the last step in the digital mammography imaging chain before interpretation by a radiologist. Each manufacturer has its own IP algorithm(s) and the appearance of an image after IP can vary greatly depending upon the algorithm and version used. It is unclear whether these differences can affect cancer detection. This work investigates the effect of IP on the detection of non-calcification cancers by expert observers. Digital mammography images for 190 patients were collected from two screening sites using Hologic amorphous selenium detectors. Eighty of these cases contained non-calcification cancers. The images were processed using three versions of IP from Hologic: default (full enhancement), low contrast (intermediate enhancement) and pseudo screen-film (no enhancement). Seven experienced observers inspected the images and marked the location of regions suspected to be non-calcification cancers, assigning a score for likelihood of malignancy. These data were analysed using JAFROC analysis. The observers also scored the clinical interpretation of the entire case using the BSBR classification scale, which was analysed using ROC analysis. The breast density in the region surrounding each cancer and the number of times each cancer was detected were calculated. IP did not have a significant effect on the radiologists' judgment of the likelihood of malignancy of individual lesions or their clinical interpretation of the entire case. No correlation was found between the number of times each cancer was detected and the density of breast tissue surrounding that cancer.

  11. FEM Analysis on Electromagnetic Processing of Thin Metal Sheets

    Directory of Open Access Journals (Sweden)

    PASCA Sorin

    2014-10-01

    Full Text Available Based on finite element analysis, this paper investigates a possible new technology for the electromagnetic processing of thin metal sheets, intended to improve productivity, especially on automated manufacturing lines. The technology consists of an induction heating process followed by a magnetoforming process, both applied to the metal sheet using the same tool coil.

  12. Stream computing for biomedical signal processing: A QRS complex detection case-study.

    Science.gov (United States)

    Murphy, B M; O'Driscoll, C; Boylan, G B; Lightbody, G; Marnane, W P

    2015-01-01

    Recent developments in "Big Data" have brought significant gains in the ability to process large amounts of data on commodity server hardware. Stream computing is a relatively new paradigm in this area, addressing the need to process data in real time with very low latency. While this approach has been developed for dealing with large scale data from the world of business, security and finance, there is a natural overlap with clinical needs for physiological signal processing. In this work we present a case study of streams processing applied to a typical physiological signal processing problem: QRS detection from ECG data.
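
    As a flavour of what the streaming view means in practice, here is a minimal one-sample-at-a-time QRS detector in Python. It is a simplified Pan-Tompkins-style sketch (squared derivative, moving-window integration, adaptive threshold), not the detector used in the paper, and every parameter value is an assumption.

      from collections import deque

      class StreamingQRSDetector:
          """Process one ECG sample at a time, as a stream operator would."""

          def __init__(self, fs=250, window_ms=150, refractory_ms=200):
              self.win = deque(maxlen=int(fs * window_ms / 1000))
              self.refractory = int(fs * refractory_ms / 1000)
              self.prev = 0.0
              self.threshold = 0.0
              self.since_beat = 10**9
              self.n = 0

          def process(self, sample):
              """Feed one sample; return True when a QRS complex is detected."""
              energy = (sample - self.prev) ** 2       # squared derivative
              self.prev = sample
              self.win.append(energy)
              feature = sum(self.win) / len(self.win)  # moving-window integration
              self.since_beat += 1
              self.n += 1
              detected = (feature > self.threshold and
                          self.since_beat > self.refractory and
                          self.n > self.win.maxlen)    # skip warm-up samples
              if detected:
                  self.since_beat = 0
              # Adaptive threshold: slow exponential tracking of signal level.
              self.threshold = 0.98 * self.threshold + 0.02 * 4.0 * feature
              return detected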

  13. Wood Crosscutting Process Analysis for Circular Saws

    Directory of Open Access Journals (Sweden)

    Jozef Krilek

    2016-11-01

    Full Text Available This article deals with the influence of some cutting parameters (geometry of cutting edge, wood species, and circular saw type) and cutting conditions on the wood crosscutting process carried out with circular saws. The establishment of torque values and feeding power for the crosswise wood cutting process has significant implications for designers of crosscutting lines. The conditions of the experiments are similar to the working conditions of real machines, and the results of individual experiments can be compared with the results obtained via similar experimental workstations. Knowledge of the wood crosscutting process, as well as the choice of suitable cutting conditions and tools, could decrease wood production costs and save energy. Changing circular saw type was found to have the biggest influence on cutting power of all factors tested.

  14. Laser processing and analysis of materials

    CERN Document Server

    Duley, W W

    1983-01-01

    It has often been said that the laser is a solution searching for a problem. The rapid development of laser technology over the past dozen years has led to the availability of reliable, industrially rated laser sources with a wide variety of output characteristics. This, in turn, has resulted in new laser applications as the laser becomes a familiar processing and analytical tool. The field of materials science, in particular, has become a fertile one for new laser applications. Laser annealing, alloying, cladding, and heat treating were all but unknown 10 years ago. Today, each is a separate, dynamic field of research activity with many of the early laboratory experiments resulting in the development of new industrial processing techniques using laser technology. Ten years ago, chemical processing was in its infancy awaiting, primarily, the development of reliable tunable laser sources. Now, with tunability over the entire spectrum from the vacuum ultraviolet to the far infrared, photochemistry is undergo...

  15. A factor analysis to detect factors influencing building national brand

    Directory of Open Access Journals (Sweden)

    Naser Azad

    Full Text Available Developing a national brand is one of the most important issues in brand development. In this study, we use factor analysis to detect the most important factors in building a national brand. The sample was drawn from two major automakers in Iran, Iran Khodro and Saipa. The questionnaire was designed on a Likert scale and distributed among 235 experts. Cronbach's alpha is 0.84, well above the minimum desirable limit of 0.70. The factor analysis yields six factors, including “cultural image of customers”, “exciting characteristics”, “competitive pricing strategies”, “perception image” and “previous perceptions”.

  16. 300 Area process trench sediment analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Zimmerman, M.G.; Kossik, C.D.

    1987-12-01

    This report describes the results of a sampling program for the sediments underlying the Process Trenches serving the 300 Area on the Hanford reservation. These Process Trenches were the subject of a Closure Plan submitted to the Washington State Department of Ecology and to the US Environmental Protection Agency in lieu of a Part B permit application on November 8, 1985. The closure plan described a proposed sampling plan for the underlying sediments and potential remedial actions to be determined by the results of the sample analyses. The results and proposed remedial action plan are presented and discussed in this report. 50 refs., 6 figs., 8 tabs.

  17. Representative process sampling for reliable data analysis

    DEFF Research Database (Denmark)

    Julius, Lars Petersen; Esbensen, Kim

    2005-01-01

    (sampling variances) can be reduced greatly however, and sampling biases can be eliminated completely, by respecting a simple set of rules and guidelines provided by TOS. A systematic approach for description of process heterogeneity furnishes in-depth knowledge about the specific variability of any 1-D lot...... of any hidden cycle, eliminating the risk of underestimating process variation. A brief description of selected hardware for extraction of samples from 1-D lots is provided in order to illustrate the key issues to consider when installing new, or optimizing existing sampling devices and procedures...

  18. SWOT Analysis of Software Development Process Models

    Directory of Open Access Journals (Sweden)

    Ashish B. Sasankar

    2011-09-01

    Full Text Available Software worth billions and trillions of dollars has gone to waste in the past due to a lack of proper development techniques, resulting in a software crisis. Historically, the process of software development has played an important role in software engineering. A number of life cycle models have been developed in the last three decades. This paper analyses software process models using the SWOT method. The objective is to identify the strengths, weaknesses, opportunities and threats of the Waterfall, Spiral and Prototype models, among others.

  19. Outlier Detection with Space Transformation and Spectral Analysis

    DEFF Research Database (Denmark)

    Dang, Xuan-Hong; Micenková, Barbora; Assent, Ira

    2013-01-01

    benefits the process of mapping data into a usually lower dimensional space. Outliers are then identified by spectral analysis of the eigenspace spanned by the set of leading eigenvectors derived from the mapping procedure. The proposed technique is purely data-driven and imposes no assumptions regarding...
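
    Since this record's abstract is truncated, the following Python sketch illustrates only the generic technique named in the title: embed the data using the leading eigenvectors of a normalised affinity matrix, then score outliers in that eigenspace. It is not the authors' exact procedure, and the centroid-distance scoring rule is an assumption.

      import numpy as np

      def spectral_outlier_scores(X, k=3, sigma=1.0):
          """Score each row of X by its distance from the centroid of a
          k-dimensional spectral embedding (higher = more outlying)."""
          # Pairwise RBF affinities.
          sq = np.sum(X**2, axis=1)
          d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
          W = np.exp(-d2 / (2 * sigma**2))
          np.fill_diagonal(W, 0.0)
          # Symmetrically normalised affinity, as in spectral clustering.
          d = W.sum(axis=1)
          d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
          S = d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
          # Leading eigenvectors span the low-dimensional map.
          vals, vecs = np.linalg.eigh(S)
          embedding = vecs[:, -k:]              # k largest eigenvalues
          centre = embedding.mean(axis=0)
          return np.linalg.norm(embedding - centre, axis=1)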

  20. Multi-Granular Trend Detection for Time-Series Analysis.

    Science.gov (United States)

    van Goethem, Arthur; Staals, Frank; Löffler, Maarten; Dykes, Jason; Speckmann, Bettina

    2017-01-01

    Time series (such as stock prices) and ensembles (such as model runs for weather forecasts) are two important types of one-dimensional time-varying data. Such data is readily available in large quantities but visual analysis of the raw data quickly becomes infeasible, even for moderately sized data sets. Trend detection is an effective way to simplify time-varying data and to summarize salient information for visual display and interactive analysis. We propose a geometric model for trend-detection in one-dimensional time-varying data, inspired by topological grouping structures for moving objects in two- or higher-dimensional space. Our model gives provable guarantees on the trends detected and uses three natural parameters: granularity, support-size, and duration. These parameters can be changed on-demand. Our system also supports a variety of selection brushes and a time-sweep to facilitate refined searches and interactive visualization of (sub-)trends. We explore different visual styles and interactions through which trends, their persistence, and evolution can be explored.

  1. Piezoresistive microcantilever aptasensor for ricin detection and kinetic analysis

    Science.gov (United States)

    Liu, Zhi-Wei; Tong, Zhao-Yang; Liu, Bing; Hao, Lan-Qun; Mu, Xi-Hui; Zhang, Jin-Ping; Gao, Chuan

    2015-04-01

    Up to now, there has been no report of target-molecule detection by a piezoresistive microcantilever aptasensor. In order to evaluate the test performance and investigate the dynamic response characteristics of a piezoresistive microcantilever aptasensor, a novel method for ricin detection and kinetic analysis based on such a sensor was proposed, in which the ricin aptamer was immobilised on the microcantilever surface via a biotin-avidin binding system. Results showed that the detection limit for ricin was 0.04 μg L-1 (S/N ≥ 3). A linear relationship between the response voltage and the concentration of ricin in the range 0.2-40 μg L-1 was obtained, with the linear regression equation ΔUe = 0.904C + 5.852 (n = 5, R = 0.991). Determination in spiked samples (soil and flour) met the analysis requirements, at 90.5-95.5% and 7.85-9.39%, respectively. On this basis, a reaction kinetic model based on ligand-receptor binding, and its relationship with the response voltage, was established. The model reflects the dynamic response of the sensor well; the correlation coefficient (R) was greater than or equal to 0.9456 (p < 0.001). The response voltage (ΔUe) and response time (t0) obtained from the fitting equation at different concentrations of ricin agreed well with the measured values.
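
    The kinetic fit described can be illustrated with SciPy, assuming (as a guess at the model family, since the record does not give the equation) a first-order ligand-receptor binding response that rises exponentially towards the equilibrium voltage; the data points below are synthetic placeholders, not measurements from the paper.

      import numpy as np
      from scipy.optimize import curve_fit

      def binding_response(t, dU_e, k):
          """Assumed first-order binding response: exponential rise to dU_e."""
          return dU_e * (1.0 - np.exp(-k * t))

      # Synthetic example data: time (s) and response voltage (mV).
      t = np.array([0.0, 30, 60, 120, 240, 480, 960])
      dU = np.array([0.0, 3.1, 5.6, 9.2, 12.8, 14.9, 15.6])

      (dU_e, k), _ = curve_fit(binding_response, t, dU, p0=[dU.max(), 1e-3])
      t0 = 3.0 / k   # time to reach ~95% of the equilibrium response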

  2. RFI detection by automated feature extraction and statistical analysis

    Science.gov (United States)

    Winkel, B.; Kerp, J.; Stanko, S.

    2007-01-01

    In this paper we present an interference detection toolbox consisting of a high dynamic range Digital Fast-Fourier-Transform spectrometer (DFFT, based on FPGA-technology) and data analysis software for automated radio frequency interference (RFI) detection. The DFFT spectrometer allows high speed data storage of spectra on time scales of less than a second. The high dynamic range of the device assures constant calibration even during extremely powerful RFI events. The software uses an algorithm which performs a two-dimensional baseline fit in the time-frequency domain, searching automatically for RFI signals superposed on the spectral data. We demonstrate that the software operates successfully on computer-generated RFI data as well as on real DFFT data recorded at the Effelsberg 100-m telescope. At 21-cm wavelength RFI signals can be identified down to the 4σ_rms level. A statistical analysis of all RFI events detected in our observational data revealed that: (1) mean signal strength is comparable to the astronomical line emission of the Milky Way, (2) interferences are polarised, (3) electronic devices in the neighbourhood of the telescope contribute significantly to the RFI radiation. We also show that the radiometer equation is no longer fulfilled in the presence of RFI signals.
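
    The flagging step — subtract a smooth time-frequency baseline, then threshold residuals at a few times a robust noise estimate — can be sketched as follows in Python. A median filter stands in for the paper's two-dimensional baseline fit, and the window size is an assumption.

      import numpy as np
      from scipy.ndimage import median_filter

      def flag_rfi(spectrogram, size=(9, 9), nsigma=4.0):
          """Return a boolean mask of pixels whose residual above a smooth
          2-D baseline exceeds nsigma times a robust noise estimate."""
          baseline = median_filter(spectrogram, size=size)
          residual = spectrogram - baseline
          mad = np.median(np.abs(residual - np.median(residual)))
          sigma = 1.4826 * mad              # MAD -> Gaussian-equivalent sigma
          return residual > nsigma * sigma  # True where RFI is suspected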

  3. RFI detection by automated feature extraction and statistical analysis

    CERN Document Server

    Winkel, Benjamin; Kerp, Juergen; Stanko, Stephan

    2006-01-01

    In this paper we present an interference detection toolbox consisting of a high dynamic range Digital Fast-Fourier-Transform spectrometer (DFFT, based on FPGA-technology) and data analysis software for automated radio frequency interference (RFI) detection. The DFFT spectrometer allows high speed data storage of spectra on time scales of less than a second. The high dynamic range of the device assures constant calibration even during extremely powerful RFI events. The software uses an algorithm which performs a two-dimensional baseline fit in the time-frequency domain, searching automatically for RFI signals superposed on the spectral data. We demonstrate that the software operates successfully on computer-generated RFI data as well as on real DFFT data recorded at the Effelsberg 100-m telescope. At 21-cm wavelength RFI signals can be identified down to the 4-sigma level. A statistical analysis of all RFI events detected in our observational data revealed that: (1) mean signal strength is comparable to the a...

  4. Automated analysis for detecting beams in laser wakefield simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ushizima, Daniela M.; Rubel, Oliver; Prabhat, Mr.; Weber, Gunther H.; Bethel, E. Wes; Aragon, Cecilia R.; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Hamann, Bernd; Messmer, Peter; Hagen, Hans

    2008-07-03

    Laser wakefield particle accelerators have shown the potential to generate electric fields thousands of times higher than those of conventional accelerators. The resulting extremely short particle acceleration distance could yield a potential new compact source of energetic electrons and radiation, with wide applications from medicine to physics. Physicists investigate laser-plasma internal dynamics by running particle-in-cell simulations; however, this generates a large dataset that requires time-consuming, manual inspection by experts in order to detect key features such as beam formation. This paper describes a framework to automate the data analysis and classification of simulation data. First, we propose a new method to identify locations with high density of particles in the space-time domain, based on maximum extremum point detection on the particle distribution. We analyze high density electron regions using a lifetime diagram by organizing and pruning the maximum extrema as nodes in a minimum spanning tree. Second, we partition the multivariate data using fuzzy clustering to detect time steps in an experiment that may contain a high quality electron beam. Finally, we combine results from fuzzy clustering and bunch lifetime analysis to estimate spatially confined beams. We demonstrate our algorithms successfully on four different simulation datasets.
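
    The first step described — locating high-density regions as maximum extrema of the particle distribution — can be sketched with SciPy as below. The neighbourhood size and density threshold are assumptions, and the later minimum-spanning-tree and fuzzy-clustering stages are omitted.

      import numpy as np
      from scipy.ndimage import maximum_filter

      def density_maxima(density, neighborhood=5, min_value=0.0):
          """Locate local maxima of a 2-D particle-density array (space x
          time): a pixel counts as a maximum if it equals the maximum over
          its neighbourhood and exceeds `min_value`. Returns (row, col)
          index pairs."""
          local_max = (density == maximum_filter(density, size=neighborhood))
          mask = local_max & (density > min_value)
          return np.argwhere(mask)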

  5. Retinal Image Analysis for Abnormality Detection-An Overview

    Directory of Open Access Journals (Sweden)

    R. Karthikeyan

    2012-01-01

    Full Text Available Problem statement: Classification plays a major role in retinal image analysis for detecting various abnormalities in retinal images. Classification refers to one of the mining concepts using supervised or unsupervised learning techniques. Approach: Diabetic retinopathy is one of the common complications of diabetes. Unfortunately, in many cases the patient is not aware of any symptoms until it is too late for effective treatment. Diabetic retinopathy is the leading cause of blindness and results in retinal disorders that include microaneurysms, drusen, hard exudates and intra-retinal micro-vascular abnormalities. Results: Automatic methods to detect the various lesions associated with diabetic retinopathy assist ophthalmologists in accurate diagnosis and treatment planning. Abnormal retinal images fall into four different classes, namely Non-Proliferative Diabetic Retinopathy (NPDR), Central Retinal Vein Occlusion (CRVO), Choroidal Neo-Vascularization Membrane (CNVM) and Central Serous Retinopathy (CSR). Conclusion: In this study, we analyse the various methodologies for automatically detecting abnormalities in retinal images, along with their merits and demerits, and propose a new framework for detection of abnormalities using Cellular Neural Networks (CNN).

  6. H2S Analysis in Biological Samples Using Gas Chromatography with Sulfur Chemiluminescence Detection

    OpenAIRE

    Vitvitsky, Victor; Banerjee, Ruma

    2015-01-01

    Hydrogen sulfide (H2S) is a metabolite and signaling molecule in biological tissues that regulates many physiological processes. Reliable and sensitive methods for H2S analysis are necessary for a better understanding of H2S biology and for the pharmacological modulation of H2S levels in vivo. In this chapter, we describe the use of gas chromatography coupled to sulfur chemiluminescence detection to measure the rates of H2S production and degradation by tissue homogenates at physiologically r...

  7. Robust Feature Detection and Local Classification for Surfaces Based on Moment Analysis

    OpenAIRE

    2004-01-01

    The stable local classification of discrete surfaces with respect to features such as edges and corners or concave and convex regions, respectively, is quite difficult yet indispensable for many surface processing applications. Usually, feature detection is done via a local curvature analysis. When dealing with large triangular and irregular grids, e.g., generated via a marching cubes algorithm, the detectors are tedious to treat and a robust classification is hard to achieve. He...

  8. Seeking a fingerprint: analysis of point processes in actigraphy recording

    Science.gov (United States)

    Gudowska-Nowak, Ewa; Ochab, Jeremi K.; Oleś, Katarzyna; Beldzik, Ewa; Chialvo, Dante R.; Domagalik, Aleksandra; Fąfrowicz, Magdalena; Marek, Tadeusz; Nowak, Maciej A.; Ogińska, Halszka; Szwed, Jerzy; Tyburczyk, Jacek

    2016-05-01

    Motor activity of humans displays complex temporal fluctuations which can be characterised by scale-invariant statistics, thus demonstrating that structure and fluctuations of such kinetics remain similar over a broad range of time scales. Previous studies on humans regularly deprived of sleep or suffering from sleep disorders predicted a change in the invariant scale parameters with respect to those for healthy subjects. In this study we investigate the signal patterns from actigraphy recordings by means of characteristic measures of fractional point processes. We analyse spontaneous locomotor activity of healthy individuals recorded during a week of regular sleep and a week of chronic partial sleep deprivation. Behavioural symptoms of lack of sleep can be evaluated by analysing statistics of duration times during active and resting states, and alteration of behavioural organisation can be assessed by analysis of power laws detected in the event count distribution, distribution of waiting times between consecutive movements and detrended fluctuation analysis of recorded time series. We claim that among different measures characterising complexity of the actigraphy recordings and their variations implied by chronic sleep distress, the exponents characterising slopes of survival functions in resting states are the most effective biomarkers distinguishing between healthy and sleep-deprived groups.
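
    Of the measures listed, detrended fluctuation analysis (DFA) is the most algorithmic. A standard textbook implementation in Python is sketched below; the window sizes are assumptions rather than the study's settings. A scaling exponent alpha near 0.5 indicates uncorrelated noise, while values approaching 1 indicate long-range correlation.

      import numpy as np

      def dfa(x, scales=None):
          """Detrended fluctuation analysis: scaling exponent alpha from a
          log-log fit of fluctuation size F(s) against window size s."""
          x = np.asarray(x, dtype=float)
          y = np.cumsum(x - x.mean())                 # integrated profile
          if scales is None:
              scales = np.unique(np.logspace(np.log10(16),
                                             np.log10(len(x) // 4),
                                             20).astype(int))
          F = []
          for s in scales:
              n_win = len(y) // s
              segs = y[:n_win * s].reshape(n_win, s)
              t = np.arange(s)
              ms_resid = []
              for seg in segs:
                  coef = np.polyfit(t, seg, 1)        # linear detrend per window
                  ms_resid.append(np.mean((seg - np.polyval(coef, t)) ** 2))
              F.append(np.sqrt(np.mean(ms_resid)))    # RMS fluctuation at scale s
          alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
          return alpha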

  9. Fingerprint Analysis with Marked Point Processes

    DEFF Research Database (Denmark)

    Forbes, Peter G. M.; Lauritzen, Steffen; Møller, Jesper

    We present a framework for fingerprint matching based on marked point process models. An efficient Monte Carlo algorithm is developed to calculate the marginal likelihood ratio for the hypothesis that two observed prints originate from the same finger against the hypothesis that they originate from different fingers. Our model achieves good performance on an NIST-FBI fingerprint database of 258 matched fingerprint pairs.

  10. Simplified Processing Method for Meter Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Colotelo, Alison H. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Downs, Janelle L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ham, Kenneth D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Montgomery, Sadie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Vernon, Christopher R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Parker, Steven A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-11-01

    A simple, quick metered-data processing method that can be used for Army Metered Data Management System (MDMS) and Logistics Innovation Agency data, but that may also be useful for other large data sets. It is intended for large data sets when the analyst has little information about the buildings.

  11. Analysis of food quality perception processes

    NARCIS (Netherlands)

    J.G. Termorshuizen (Koos); M.T.G. Meulenberg; B. Wierenga (Berend)

    1986-01-01

    A model of the quality perception process of the consumer with respect to food products has been developed. The model integrates a number of quality-related concepts. An empirical study was carried out to examine the relationships between the concepts. It appears that the various concept

  12. Bayesian analysis of Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2006-01-01

    Recently Møller, Pettitt, Berthelsen and Reeves introduced a new MCMC methodology for drawing samples from a posterior distribution when the likelihood function is only specified up to a normalising constant. We illustrate the method in the setting of Bayesian inference for Markov point processes...

  13. Exergy analysis in industrial food processing

    NARCIS (Netherlands)

    Zisopoulos, F.K.

    2016-01-01

    The sustainable provision of food on a global scale in the near future is a very serious challenge. This thesis focuses on the assessment and design of sustainable industrial food production chains and processes by using the concept of exergy, which is an objective metric based on the first and second laws of thermodynamics.

  14. SPAN - Terminal sterilization process analysis program

    Science.gov (United States)

    1969-01-01

    Computer program, SPAN, measures the dry heat thermal sterilization process applied to a planetary capsule and calculates the time required for heat application, steady state conditions, and cooling. The program is based on the logarithmic survival of micro-organisms. Temperature profiles must be input on tape.

  15. SPAN C - Terminal sterilization process analysis program

    Science.gov (United States)

    1969-01-01

    Computer program, SPAN-C, measures the dry heat thermal sterilization process applied to a planetary capsule and calculates the time required for heat application, steady state conditions, and cooling. The program is based on the logarithmic survival of micro-organisms. Temperature profiles must be input on cards.

  16. Optimal detection of a change-set in a spatial Poisson process

    CERN Document Server

    Ivanoff, B Gail; 10.1214/09-AAP629

    2010-01-01

    We generalize the classic change-point problem to a "change-set" framework: a spatial Poisson process changes its intensity on an unobservable random set. Optimal detection of the set is defined by maximizing the expected value of a gain function. In the case that the unknown change-set is defined by a locally finite set of incomparable points, we present a sufficient condition for optimal detection of the set using multiparameter martingale techniques. Two examples are discussed.

  17. [Analysis of aliphatic carboxylic acids in anaerobic digestion process waters by ion-exclusion chromatography].

    Science.gov (United States)

    Ito, Kazuaki; Sakamoto, Jun; Nagaoka, Kazuya; Takayama, Yohichi; Kanahori, Takashi; Sunahara, Hiroshi; Hayashi, Tsuneo; Sato, Shinji; Hirokawa, Takeshi; Tanaka, Kazuhiko

    2012-04-01

    The analysis of seven aliphatic carboxylic acids (formic, acetic, propionic, iso-butyric, n-butyric, iso-valeric and n-valeric acid) in anaerobic digestion process waters for biogas production was examined by ion-exclusion chromatography with dilute acidic eluents (benzoic acid, perfluorobutyric acid (PFBA) and sulfuric acid) and non-suppressed conductivity/ultraviolet (UV) detection. The columns used were a styrene/divinylbenzene-based strongly acidic cation-exchange resin column (TSKgel SCX) and a polymethacrylate-based weakly acidic cation-exchange resin column (TSKgel Super IC-A/C). Good separation was achieved on the TSKgel SCX in shorter retention times. For the TSKgel Super IC-A/C, the peak shape of the acids was sharp and symmetrical in spite of longer retention times. In addition, the mutual separation of the acids was good except for iso- and n-butyric acids. The best separation and detection were achieved by using the two columns (TSKgel SCX and TSKgel Super IC-A/C connected in series), lower concentrations of PFBA and sulfuric acid as eluents, non-suppressed conductivity detection and UV detection at 210 nm. This analysis was applied to anaerobic digestion process waters. The chromatograms with conductivity detection were relatively simple compared with those of UV detection. The use of two columns with different selectivities for the aliphatic carboxylic acids and the two detection modes was effective for the determination and identification of the analytes in anaerobic digestion process waters containing complex matrices.

  18. Resource conflict detection and removal strategy for nondeterministic emergency response processes using Petri nets

    Science.gov (United States)

    Zeng, Qingtian; Liu, Cong; Duan, Hua

    2016-09-01

    Correctness of an emergency response process specification is critical to emergency mission success. Therefore, errors in the specification should be detected and corrected at build-time. In this paper, we propose a resource conflict detection approach and removal strategy for emergency response processes constrained by resources and time. In this kind of emergency response process, there are two timing functions representing the minimum and maximum execution time for each activity, respectively, and many activities require resources to be executed. Based on the RT_ERP_Net, a timed Petri net model of such processes, the earliest time to start each activity and the ideal execution time of the process can be obtained. To detect and remove the resource conflicts in the process, conflict detection algorithms and a priority-activity-first resolution strategy are given. In this way, the real execution time for each activity is obtained and a conflict-free RT_ERP_Net is constructed by adding virtual activities. Experiments show that the proposed resolution strategy can considerably shorten the execution time of the whole process.

  19. Detection of Harbours from High Resolution Remote Sensing Imagery via Saliency Analysis and Feature Learning

    Science.gov (United States)

    Wang, Yetianjian; Pan, Li; Wang, Dagang; Kang, Yifei

    2016-06-01

    Harbours are important objects in both civil and military applications, and detecting them in high resolution remote sensing imagery is valuable in various fields but also a challenging task. Traditional methods of detecting harbours mainly focus on the segmentation of water and land and on manually selected knowledge; they do not make full use of other features of remote sensing imagery and often fail to describe harbours completely. In order to improve detection, a new method is proposed. First, the image is transformed to the Hue, Saturation, Value (HSV) colour space and saliency analysis is performed via the generation and enhancement of the co-occurrence histogram, to help detect and locate regions of interest (ROIs) that are salient and may be parts of the harbour. Next, SIFT features are extracted and feature learning is performed to represent the ROIs. Then a classifier is trained on labelled harbour features and used to check whether each ROI belongs to the harbour. Finally, if the ROIs belong to the harbour, a minimum bounding rectangle is formed to include all the harbour ROIs, detecting and locating the harbour. Experiments on high resolution remote sensing imagery show that the proposed method outperforms other methods in the precision of classifying ROIs and the accuracy of completely detecting and locating harbours.

  20. Analysis of coaxial laser cladding processing conditions

    NARCIS (Netherlands)

    de Oliveira, U; Ocelik, V; De Hosson, JTM

    2005-01-01

    The formation of thick Ni-based coating on a steel substrate by coaxial laser cladding using the Nd:YAG 2 kW continuous laser was studied both from a theoretical and experimental point of view. The theoretical analysis concentrated on the transfer of laser irradiation and powder particles using a si

  1. Rare variant detection using family-based sequencing analysis.

    Science.gov (United States)

    Peng, Gang; Fan, Yu; Palculict, Timothy B; Shen, Peidong; Ruteshouser, E Cristy; Chi, Aung-Kyaw; Davis, Ronald W; Huff, Vicki; Scharfe, Curt; Wang, Wenyi

    2013-03-05

    Next-generation sequencing is revolutionizing genomic analysis, but this analysis can be compromised by high rates of missing true variants. To develop a robust statistical method capable of identifying variants that would otherwise not be called, we conducted sequence data simulations and both whole-genome and targeted sequencing data analysis of 28 families. Our method (Family-Based Sequencing Program, FamSeq) integrates Mendelian transmission information and raw sequencing reads. Sequence analysis using FamSeq reduced the number of false negative variants by 14-33% as assessed by HapMap sample genotype confirmation. In a large family affected with Wilms tumor, 84% of variants uniquely identified by FamSeq were confirmed by Sanger sequencing. In children with early-onset neurodevelopmental disorders from 26 families, de novo variant calls in disease candidate genes were corrected by FamSeq as mendelian variants, and the number of uniquely identified variants in affected individuals increased proportionally as additional family members were included in the analysis. To gain insight into maximizing variant detection, we studied factors impacting actual improvements of family-based calling, including pedigree structure, allele frequency (common vs. rare variants), prior settings of minor allele frequency, sequence signal-to-noise ratio, and coverage depth (∼20× to >200×). These data will help guide the design, analysis, and interpretation of family-based sequencing studies to improve the ability to identify new disease-associated genes.

  2. INTEGRATION OF POKA YOKE INTO PROCESS FAILURE MODE AND EFFECT ANALYSIS: A CASE STUDY

    Directory of Open Access Journals (Sweden)

    A. P. Puvanasvaran

    2014-01-01

    Full Text Available The Failure Mode and Effect Analysis (FMEA) is one of the requirements imposed by the Automotive Industry Action Group (AIAG) on all automotive suppliers and manufacturers worldwide through the TS16949 quality system. Many discrepancies have been detected in FMEA implementations, directly related to user experience and knowledge, and these discrepancies keep the FMEA from meeting its objectives. Conceptually, Poka Yoke fits well into the Process FMEA. FMEA helps predict and prevent problems through proper control or detection methods, while mistake proofing emphasizes detection and correction of mistakes before they become defects. Poka Yoke helps people and processes work correctly the first time. It refers to techniques that make mistakes impossible to commit. These techniques eliminate defects from products and processes and substantially improve their quality and reliability; Poka Yoke can therefore be considered an extension of FMEA. The use of simple Poka Yoke ideas and methods in product and process design eliminates both human and mechanical errors. Ultimately, both the FMEA and Poka Yoke methodologies result in zero defects and benefit either the end or the next-in-line customer. The first Poka Yoke concept emphasizes eliminating the cause or occurrence of the error that creates the defects by concentrating on the cause of the error in the process; the defect is prevented by stopping the line or the machine when the root cause of the defect is triggered or detected. The second Poka Yoke concept focuses on the effectiveness of the detection system: a foolproof detection system eliminates the defect or detects the error that causes defects, removing the possibility that errors or defects will slip through the process and reach the customer.

  3. Improved electromagnetic induction processing with novel adaptive matched filter and matched subspace detection

    Science.gov (United States)

    Hayes, Charles E.; McClellan, James H.; Scott, Waymond R.; Kerr, Andrew J.

    2016-05-01

    This work introduces two advances in wide-band electromagnetic induction (EMI) processing: a novel adaptive matched filter (AMF) and matched subspace detection methods. Both advances make use of recent work with a subspace SVD approach to separating the signal, soil, and noise subspaces of the frequency measurements. The proposed AMF provides a direct approach to removing the EMI self-response while improving the signal to noise ratio of the data. Unlike previous EMI adaptive downtrack filters, this new filter will not erroneously optimize the EMI soil response instead of the EMI target response, because these two responses are projected into separate frequency subspaces. The EMI detection methods in this work elaborate on how the signal and noise subspaces in the frequency measurements are ideal for creating the matched subspace detection (MSD) and constant false alarm rate matched subspace detection (CFAR) metrics developed by Scharf. The CFAR detection metric has been shown to be the uniformly most powerful invariant detector.
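
    The matched subspace detector statistic at the heart of this family of methods has a compact form: project the measurement onto the estimated signal subspace and compare in-subspace to out-of-subspace energy. A minimal NumPy sketch follows, assuming the signal subspace is taken from the leading left singular vectors of training data (the rank choice is an assumption, and the function names are hypothetical).

      import numpy as np

      def signal_subspace(training, rank):
          """Orthonormal basis for the signal subspace: leading left
          singular vectors of a (measurement x sample) training matrix."""
          U, _, _ = np.linalg.svd(training, full_matrices=False)
          return U[:, :rank]

      def msd_statistic(x, H):
          """Matched subspace detector statistic: energy of x inside the
          subspace spanned by the orthonormal columns of H, relative to
          the residual energy outside it."""
          proj = H @ (H.T @ x)          # orthogonal projection onto range(H)
          resid = x - proj
          return float(proj @ proj) / float(resid @ resid)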

  4. Detection of neovascularization based on fractal and texture analysis with interaction effects in diabetic retinopathy.

    Directory of Open Access Journals (Sweden)

    Jack Lee

    Full Text Available Diabetic retinopathy is a major cause of blindness. Proliferative diabetic retinopathy is a result of severe vascular complications and is visible as neovascularization of the retina. Automatic detection of such new vessels would be useful for the severity grading of diabetic retinopathy, and it is an important part of the screening process to identify those who may require immediate treatment for their diabetic retinopathy. We propose a novel new-vessel detection method combining statistical texture analysis (STA), high order spectrum analysis (HOS) and fractal analysis (FA); most importantly, we show that by incorporating their associated interactions the accuracy of new-vessel detection can be greatly improved. To assess its performance, the sensitivity, specificity and accuracy were obtained: they are 96.3%, 99.1% and 98.5% (AUC 99.3%), respectively. It is found that the proposed method can improve the accuracy of new-vessel detection significantly over previous methods. The algorithm can be automated and is valuable for detecting relatively severe cases of diabetic retinopathy among diabetes patients.

  5. Elastic recoil detection analysis on the ANSTO heavy ion microprobe

    Science.gov (United States)

    Siegele, R.; Orlic, I.; Cohen, David D.

    2002-05-01

    The heavy ion microprobe at the Australian Nuclear Science and Technology Organisation is capable of focussing heavy ions with an ME/q2 of up to 100 amu MeV. This makes the microprobe ideally suited for heavy ion elastic recoil detection analysis (ERDA). However, beam currents on a microprobe are usually very small, which requires a detection system with a large solid angle. We apply microbeam heavy ion ERDA using a large solid angle ΔE-E telescope with a gas ΔE detector to layered structures. We demonstrate the capability to measure oxygen and carbon with a lateral resolution of 20 μm, together with determination of the depth of the contamination in thin deposited layers.

  6. Processing Cost Analysis for Biomass Feedstocks

    Energy Technology Data Exchange (ETDEWEB)

    Badger, P.C.

    2002-11-20

    The receiving, handling, storing, and processing of woody biomass feedstocks is an overlooked component of biopower systems. The purpose of this study was twofold: (1) to identify and characterize all the receiving, handling, storing, and processing steps required to make woody biomass feedstocks suitable for use in direct combustion and gasification applications, including small modular biopower (SMB) systems, and (2) to estimate the capital and operating costs at each step. Since biopower applications can be varied, a number of conversion systems and feedstocks required evaluation. In addition to limiting this study to woody biomass feedstocks, the boundaries of this study were from the power plant gate to the feedstock entry point into the conversion device. Although some power plants are sited at a source of wood waste fuel, it was assumed for this study that all wood waste would be brought to the power plant site. This study was also confined to the following three feedstocks: (1) forest residues, (2) industrial mill residues, and (3) urban wood residues. Additionally, the study was confined to grate, suspension, and fluidized bed direct combustion systems; gasification systems; and SMB conversion systems. Since scale can play an important role in types of equipment, operational requirements, and capital and operational costs, this study examined these factors for the following direct combustion and gasification system size ranges: 50, 20, 5, and 1 MWe. The scope of the study also included: specific operational issues associated with specific feedstocks (e.g., bark and problems with bridging); opportunities for reducing handling, storage, and processing costs; how environmental restrictions can affect handling and processing costs (e.g., noise, commingling of treated wood or non-wood materials, emissions, and runoff); and feedstock quality issues and/or requirements (e.g., moisture, particle size, presence of non-wood materials). The study found that over the

  7. A patterned and un-patterned minefield detection in cluttered environments using Markov marked point process

    Science.gov (United States)

    Trang, Anh; Agarwal, Sanjeev; Regalia, Phillip; Broach, Thomas; Smith, Thomas

    2007-04-01

    A typical minefield detection approach is based on sequential processing employing mine detection and false alarm rejection followed by minefield detection. The current approach does not work robustly under different backgrounds and environmental conditions, because the target signature changes with time and performance degrades in the presence of a high density of false alarms. The aim of this research is to advance the state of the art in the detection of both patterned and unpatterned minefields in high clutter environments. The proposed method combines the false alarm rejection module and the minefield detection module of the current architecture through spatial-spectral clustering and an inference module using a Markov Marked Point Process formulation. The approach simultaneously exploits the feature characteristics of the target signature and the spatial distribution of the targets in the interrogation region. The method is based on the premise that most minefields can be characterized by some type of distinctive spatial distribution of "similar" looking mine targets. The minefield detection problem is formulated as a Markov Marked Point Process (MMPP) where the set of possible mine targets is divided into a possibly overlapping mixture of targets. The likelihood of the minefield depends simultaneously on the feature characteristics of the targets and their spatial distribution. A framework using belief propagation is developed to solve the minefield inference problem based on the MMPP. A preliminary investigation using simulated data shows the efficacy of the approach.

  8. Analysis of the medication reconciliation process conducted at hospital admission

    Directory of Open Access Journals (Sweden)

    María Beatriz Contreras Rey

    2016-07-01

    Full Text Available Objective: To analyze the outcomes of a medication reconciliation process at admission in the hospital setting, and to assess the role of the pharmacist in detecting reconciliation errors and preventing any adverse events they entail. Method: A retrospective study was conducted to analyze the medication reconciliation activity during the previous six months. The study included those patients for whom an apparently unjustified discrepancy was detected at admission, after comparing the hospital medication prescribed with the home treatment stated in their clinical hospital records. Patients for whom the physician ordered the introduction of home medication without any specification were also considered. In order to conduct the reconciliation process, the pharmacist prepared the best possible pharmacotherapeutic history, reviewing all available information about the medication the patient could be taking before admission, and completing the process with a clinical interview. Discrepancies requiring clarification were reported to the physician. The reconciliation proposal was considered accepted if the relevant modification was made in the next visit of the physician, or within 24-48 hours at most; such cases were labeled as reconciliation errors. For the descriptive analysis, the SPSS Statistics® program, version 17.0, was used. Outcomes: 494 medications were reconciled in 220 patients, with a mean of 2.25 medications per patient. More than half of the patients (59.5%) had some discrepancy that required clarification; the most frequent was the omission of a medication that the patient was taking before admission (86.2%), followed by an unjustified modification in dosing or route of administration (5.9%). In total, 312 discrepancies required clarification; of these, 93 (29.8%) were accepted and considered reconciliation errors, 126 (40%) were not accepted, and in 93 cases (29.8%) acceptance was not relevant due to a change in

  9. Modeling fraud detection and the incorporation of forensic specialists in the audit process

    DEFF Research Database (Denmark)

    Sakalauskaite, Dominyka

    Financial statement audits are still comparatively poor in fraud detection. Forensic specialists can play a significant role in increasing audit quality. In this paper, based on prior academic research, I develop a model of fraud detection and the incorporation of forensic specialists in the audit process. The intention of the model is to identify the reasons why the audit is weak in fraud detection and to provide the analytical framework to assess whether the incorporation of forensic specialists can help to improve it. The results show that such specialists can potentially improve the fraud...

  10. Detection of Prion Proteins and TSE Infectivity in the Rendering and Biodiesel Manufacture Processes

    Energy Technology Data Exchange (ETDEWEB)

    Brown, R.; Keller, B.; Oleschuk, R. [Queen' s University, Kingston, Ontario (Canada)

    2007-03-15

    This paper addresses emerging issues related to monitoring prion proteins and TSE infectivity in the products and waste streams of rendering and biodiesel manufacture processes. Monitoring is critical to addressing the knowledge gaps identified in 'Biodiesel from Specified Risk Material Tallow: An Appraisal of TSE Risks and their Reduction' (IEA's AMF Annex XXX, 2006) that prevent comprehensive risk assessment of TSE infectivity in products and waste. The most important challenge for monitoring TSE risk is the wide variety of sample types, which are generated at different points in the rendering/biodiesel production continuum. Conventional transmissible spongiform encephalopathy (TSE) assays were developed for specified risk material (SRM) and other biological tissues. These, however, are insufficient to address the diverse sample matrices produced in rendering and biodiesel manufacture. This paper examines the sample types expected in rendering and biodiesel manufacture and the implications of applying TSE assay methods to them. The authors then discuss a sample preparation filtration, which has not yet been applied to these sample types, but which has the potential to provide or significantly improve TSE monitoring. The main improvement will come from transfer of the prion proteins from the sample matrix to a matrix compatible with conventional and emerging bioassays. A second improvement will come from preconcentrating the prion proteins, which means transferring proteins from a larger sample volume into a smaller volume for analysis to provide greater detection sensitivity. This filtration method may also be useful for monitoring other samples, including wash waters and other waste streams, which may contain SRM, including those from abattoirs and on-farm operations. Finally, there is a discussion of emerging mass spectrometric methods, which Prusiner and others have shown to be suitable for detection and characterisation of prion proteins (Stahl

  11. Identifying Organizational Inefficiencies with Pictorial Process Analysis (PPA

    Directory of Open Access Journals (Sweden)

    David John Patrishkoff

    2013-11-01

    Full Text Available Pictorial Process Analysis (PPA) was created by the author in 2004. PPA is a unique methodology which offers ten layers of additional analysis compared to standard process mapping techniques. The goal of PPA is to identify and eliminate waste, inefficiencies and risk in manufacturing or transactional business processes at five levels in an organization. The highest level assessed is process management, followed by the process work environment, detailed work habits, process performance metrics and general attitudes towards the process. This detailed process assessment and analysis is carried out during process improvement brainstorming efforts and Kaizen events. PPA creates a detailed visual efficiency rating for each step of the process under review. A selection of 54 pictorial Inefficiency Icons (cards) is available to highlight major inefficiencies and risks present in the business process under review. These inefficiency icons were identified during the author's independent research on the topic of why things go wrong in business. This paper highlights how PPA was developed and shows the steps required to conduct Pictorial Process Analysis on a sample manufacturing process. The author has successfully used PPA to dramatically improve business processes in over 55 different industries since 2004.

  12. Trend detection and analysis in Eastern Europe and European Russia

    Science.gov (United States)

    de Beurs, K.; Henebry, G. M.; Owsley, B.

    2014-12-01

    A confluence of computing power, cost of storage, ease of access to data, and ease of product delivery make it possible to harness the power of multiple remote sensing data streams to monitor land surface dynamics. Change detection has always been a fundamental remote sensing task, and there are myriad ways to perceive differences. From a statistical viewpoint, image time series of the vegetated land surface are complicated data to analyze. The time series are often seasonal and have high temporal autocorrelation. These characteristics result in the failure of the data to meet the assumptions of most standard parametric statistical tests. Failure of statistical assumptions is not trivial, and the use of inappropriate statistical methods may lead to the detection of spurious trends, while any actual trends and/or step changes might be overlooked. Methods for the analysis of messy satellite data, which are often influenced by discontinuity, missing observations, non-linearity, and seasonality, are still being developed within the remote sensing community. We have found several examples of research that compares trends from different datasets. However, there is a dearth of information on the comparison of trend detection methods themselves for standardized datasets. Here we describe three different trend detection methods and compare their results for a set of synthetic time series exhibiting monotonic trends as well as step changes. We will vary the length of the time series, the number of observations per year and the number of missing values. We will also vary the seasonality and the strength of the autocorrelation. We will then discuss a case study for Eastern Europe and European Russia where we investigate time series of MODIS Nadir BRDF-adjusted (NBAR) data at 8-day and 500 m resolution between 2001 and 2013. We investigate basic vegetation indices such as NDVI and EVI but also extend the analysis towards a disturbance index which identifies how pixels differ from
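
    The abstract does not name the three trend detection methods compared, so the sketch below illustrates only the general class of tool involved: a plain Mann-Kendall trend test, a nonparametric test often applied to satellite time series because it does not assume normality. This is our own illustration, not the authors' code; seasonal and autocorrelation-corrected variants would be preferred for real NDVI series.

        import numpy as np
        from scipy.stats import norm

        def mann_kendall(y):
            """Plain Mann-Kendall test: S statistic, z-score, two-sided p-value.

            Ties are ignored for brevity; seasonal/prewhitened variants are
            preferred for series with seasonality and autocorrelation."""
            y = np.asarray(y, dtype=float)
            n = len(y)
            s = sum(np.sign(y[j] - y[i])
                    for i in range(n - 1) for j in range(i + 1, n))
            var_s = n * (n - 1) * (2 * n + 5) / 18.0
            z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
            return s, z, 2 * (1 - norm.cdf(abs(z)))

        # Synthetic NDVI-like series: 13 years of 8-day composites, weak trend.
        t = np.arange(13 * 46)
        y = 0.3 + 0.2 * np.sin(2 * np.pi * t / 46) + 1e-4 * t
        print(mann_kendall(y))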

  13. Adaptive Fault Detection for Complex Dynamic Processes Based on JIT Updated Data Set

    Directory of Open Access Journals (Sweden)

    Jinna Li

    2012-01-01

    A novel fault detection technique is proposed to explicitly account for the nonlinear, dynamic, and multimodal problems that exist in practical, complex dynamic processes. The just-in-time (JIT) detection method and the k-nearest neighbor (KNN) rule-based statistical process control (SPC) approach are integrated to construct a flexible and adaptive detection scheme for control processes with nonlinear, dynamic, and multimodal cases. Mahalanobis distance, representing the correlation among samples, is used to simplify and update the raw data set, which is the first merit of this paper. Based on it, the control limit is computed in terms of both the KNN rule and the SPC method, so that we can identify online whether the current data are normal or not. Note that the control limit changes as the database is updated, yielding an adaptive fault detection technique that can effectively eliminate the impact of data drift and shift on detection performance, which is the second merit of this paper. The efficiency of the developed method is demonstrated by numerical examples and an industrial case.
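
    A minimal sketch of the KNN control-limit idea described above, assuming a training set of normal operating data. The paper's JIT database updating and Mahalanobis-based simplification of the data set are omitted, and all parameter values are invented.

        import numpy as np

        def knn_control_limit(train, k=3, alpha=0.99):
            """Mean distance of each training sample to its k nearest neighbours
            (itself excluded); the control limit is the alpha-quantile."""
            d = np.linalg.norm(train[:, None, :] - train[None, :, :], axis=2)
            np.fill_diagonal(d, np.inf)
            knn_d = np.sort(d, axis=1)[:, :k].mean(axis=1)
            return np.quantile(knn_d, alpha)

        def is_faulty(x, train, limit, k=3):
            return np.sort(np.linalg.norm(train - x, axis=1))[:k].mean() > limit

        rng = np.random.default_rng(0)
        normal = rng.normal(0.0, 1.0, size=(500, 4))      # normal operating data
        limit = knn_control_limit(normal)
        print(is_faulty(rng.normal(0.0, 1.0, 4), normal, limit))  # likely False
        print(is_faulty(np.full(4, 6.0), normal, limit))          # likely True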

  14. An analysis of the Logistics Requisition process

    OpenAIRE

    Burson, Dawn A.

    2011-01-01

    Approved for public release; distribution is unlimited. The business of supporting a globally dispersed naval force is fraught with challenges and complexity. Services for warships of differing mission and size must be sourced and provided at ports all over the world. U.S. Navy ships use a formatted report called a Logistics Requisition (LOGREQ) to acquire those necessary services. The unconnected nature of the stakeholders that own specific portions of the process increases complexity as ...

  15. Correlation analysis of transitional processes of chronorhythms

    Science.gov (United States)

    Strinadko, Marina M.; Timochko, Katerina B.; Strinadko, Olena M.; Abramov, Igor V.

    1999-11-01

    The reaction of a biological system to an abrupt phase change of a sinusoidal forcing is investigated. Model studies are carried out for a biosystem unit described by a linear differential equation of the second order. The possibility of time asymmetry in the adaptation and transitional processes of biological units is shown for abrupt phase changes of equal magnitude and opposite sign. The residual adaptation time depends on the state of the biosystem's unit at the moment of perturbation.
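
    A hedged numerical illustration of the model class described: a damped second-order linear unit driven by a sine wave whose phase jumps by plus or minus dphi at t0. All parameter values are arbitrary stand-ins; comparing the two responses after t0 exposes any asymmetry of the transitional process.

        import numpy as np
        from scipy.integrate import solve_ivp

        # x'' + 2*zeta*w0*x' + w0^2 * x = sin(w*t + phi(t)), phase jump at t0.
        w0, zeta, w, t0, dphi = 1.0, 0.1, 1.0, 50.0, np.pi / 2

        def rhs(t, y, sign):
            phi = sign * dphi if t >= t0 else 0.0
            x, v = y
            return [v, np.sin(w * t + phi) - 2 * zeta * w0 * v - w0 ** 2 * x]

        t_eval = np.linspace(0.0, 150.0, 3000)
        up = solve_ivp(rhs, (0, 150), [0, 0], args=(+1,), t_eval=t_eval, max_step=0.05)
        dn = solve_ivp(rhs, (0, 150), [0, 0], args=(-1,), t_eval=t_eval, max_step=0.05)
        print(np.max(np.abs(up.y[0] - dn.y[0])))   # nonzero: the two responses differ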

  16. Advanced Color Image Processing and Analysis

    CERN Document Server

    2013-01-01

    This volume does much more than survey modern advanced color processing. Starting with a historical perspective on ways we have classified color, it sets out the latest numerical techniques for analyzing and processing colors, the leading edge in our search to accurately record and print what we see. The human eye perceives only a fraction of available light wavelengths, yet we live in a multicolor world of myriad shining hues. Colors rich in metaphorical associations make us “purple with rage” or “green with envy” and cause us to “see red.” Defining colors has been the work of centuries, culminating in today’s complex mathematical coding that nonetheless remains a work in progress: only recently have we possessed the computing capacity to process the algebraic matrices that reproduce color more accurately. With chapters on dihedral color and image spectrometers, this book provides technicians and researchers with the knowledge they need to grasp the intricacies of today’s color imaging.

  17. Satellite images analysis for shadow detection and building height estimation

    Science.gov (United States)

    Liasis, Gregoris; Stavrou, Stavros

    2016-09-01

    Satellite images can provide valuable information about the presented urban landscape scenes to remote sensing and telecommunication applications. Obtaining information from satellite images is difficult since the objects and their surroundings appear with complex features. The shadows cast by buildings in urban scenes can be processed and used for estimating building heights. Thus, a robust and accurate building shadow detection process is important. Region-based active contour models can be used for satellite image segmentation. However, spectral heterogeneity that usually exists in satellite images and the feature similarity representing the shadow and several non-shadow regions makes building shadow detection challenging. In this work, a new automated method for delineating building shadows is proposed. Initially, spectral and spatial features of the satellite image are utilized for designing a custom filter to enhance shadows and reduce intensity heterogeneity. An effective iterative procedure using intensity differences is developed for tuning and subsequently selecting the most appropriate filter settings, able to highlight the building shadows. The response of the filter is then used for automatically estimating the radiometric property of the shadows. The customized filter and the radiometric feature are utilized to form an optimized active contour model where the contours are biased to delineate shadow regions. Post-processing morphological operations are also developed and applied for removing misleading artefacts. Finally, building heights are approximated using shadow length and the predefined or estimated solar elevation angle. Qualitative and quantitative measures are used for evaluating the performance of the proposed method for both shadow detection and building height estimation.
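
    The final step described above reduces to simple trigonometry. A minimal sketch, assuming flat terrain, a vertical building edge and a known solar elevation angle (values invented):

        import math

        def building_height(shadow_length_m, solar_elevation_deg):
            """Height from shadow length: h = L * tan(solar elevation)."""
            return shadow_length_m * math.tan(math.radians(solar_elevation_deg))

        print(building_height(12.0, 55.0))   # about 17.1 m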

  18. Process Analysis of the CV Group's Operation

    CERN Document Server

    Wilhelmsson, M

    2000-01-01

    This report will give an explanation of the internal reorganization that has been done because of the necessity to optimize operation in the cooling and ventilation group. The basic structure for the group was defined at the end of 1998. We understood then that change was needed to accommodate the increased workload due to the LHC project. In addition, we face a relatively large turnover of personnel (retirements and some recruitment) with related integration issues to consider. We would also like to implement new approaches in the management of both operations and maintenance. After some running-in problems during the first half of 1999, we realized that much more could be gained by analysing, defining and documenting each single function and generic activity within the group. The authors will explain how this analysis was carried out and give some feedback on the outcome so far.

  19. Effect of consumer reporting on signal detection: using disproportionality analysis.

    Science.gov (United States)

    Hammond, Isaac W; Rich, Donna S; Gibbs, Trevor G

    2007-11-01

    Pharmacovigilance objectives and activities are designed to protect the health of consumers and are generally based on data acquisition from spontaneous adverse event reports (SADRs). SADRs come from different sources, including healthcare professionals, consumers, lawyers, other pharmaceutical companies, regulatory agencies and so on. Pharmacovigilance activities derived from SADRs include signal detection and description of the safety profile of the drug. Consumers are the most frequent source of SADRs, even though the system was originally designed to receive reports from healthcare professionals. Most spontaneous adverse event reports are received from the US. GlaxoSmithKline (GSK) conducts monthly signal detection on all marketed compounds in its global database using disproportionality analysis, an empirical Bayesian algorithm known as the multiple-item gamma-Poisson shrinker (MGPS). There are no systematic survey data or reviews of actual experiences within existing safety surveillance databases of how pharmaceutical companies handle consumer reports. Thus, a study was undertaken to determine the impact of consumer reports on signal detection using MGPS disproportionality analysis. Two data sets were created for four randomly selected GSK marketed compounds; one data set included reports from both consumers and healthcare providers and the second included only reports from healthcare providers. Disproportionality analysis was then used to evaluate the two data sets. A total of 23 signals were identified, with a mean difference in time to signal detection of 1.8 years. The difference ranged from -8 to 10 years. In 52.2% of events (12/23), the signal was identified earlier when consumer reports were included in the data. In 34.8% of events (8/23), the signal was identified in the same year in both data sets and, in 13% of the events (3/23), the signal was identified later when consumer reports were included in the data. It was concluded from this study that
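
    The study itself uses the MGPS empirical Bayesian shrinker, which is beyond a short sketch; as a simpler stand-in from the same disproportionality family, the following computes a proportional reporting ratio (PRR) on an invented 2x2 contingency table of report counts.

        def prr(a, b, c, d):
            """Proportional reporting ratio.
            a: drug of interest + event, b: drug + all other events,
            c: all other drugs + event, d: all other drugs + other events."""
            return (a / (a + b)) / (c / (c + d))

        # Hypothetical counts; a PRR well above 1 suggests a disproportionate signal.
        print(prr(a=20, b=480, c=100, d=49400))   # ~19.8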

  20. Bony change of apical lesion healing process using fractal analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Ji Min; Park, Hyok; Jeong, Ho Gul; Kim, Kee Deog; Park, Chang Seo [Yonsei University College of Medicine, Seoul (Korea, Republic of)

    2005-06-15

    To investigate the change of the bone healing process after endodontic treatment of teeth with an apical lesion by fractal analysis. Radiographic images of 35 teeth from 33 patients taken at first diagnosis, 6 months, and 1 year after endodontic treatment were selected. Radiographic images were taken by the JUPITER computerized dental X-ray system. Fractal dimensions were calculated three times at each area by the Scion Image PC program. Rectangular regions of interest (30 x 30 pixels) were selected at the apical lesion and the normal apex of each image. The fractal dimension at the apical lesion at first diagnosis (L0) was 0.940 ± 0.361 and that of the normal area (N0) was 1.186 ± 0.727 (p<0.05). The fractal dimension at the apical lesion 6 months after endodontic treatment (L1) was 1.076 ± 0.069 and that of the normal area (N1) was 1.192 ± 0.055 (p<0.05). The fractal dimension at the apical lesion 1 year after endodontic treatment (L2) was 1.163 ± 0.074 and that of the normal area (N2) was 1.225 ± 0.079 (p<0.05). After endodontic treatment, the fractal dimensions at the apical lesions showed statistically significant differences over time. There were also statistically significant differences between the normal area and the apical lesion at first diagnosis, 6 months after, and 1 year after, but the differences grew smaller with time. The prognosis after endodontic treatment of an apical lesion was estimated by bone regeneration in the apical region. Fractal analysis was attempted to overcome the limits of subjective reading, and as a result the change of the bone during the healing process could be detected objectively and quantitatively.
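
    The fractal dimension of such a radiographic region of interest is typically estimated by box counting; the study's exact Scion Image settings are not given, so the following is a generic sketch on a synthetic binary ROI (dyadic box sizes, invented threshold).

        import numpy as np

        def box_counting_dimension(binary_roi, sizes=(2, 4, 8)):
            """Slope of log N(s) versus log(1/s), N(s) = occupied s-by-s boxes."""
            counts = []
            for s in sizes:
                h, w = binary_roi.shape[0] // s, binary_roi.shape[1] // s
                boxes = binary_roi[:h * s, :w * s].reshape(h, s, w, s)
                counts.append(boxes.any(axis=(1, 3)).sum())
            slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
            return slope

        rng = np.random.default_rng(1)
        roi = rng.random((32, 32)) > 0.6    # stand-in for a thresholded ~30 x 30 ROI
        print(box_counting_dimension(roi))  # near 2 for a dense random patch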

  1. Analysis of DIRAC's behavior using model checking with process algebra

    Science.gov (United States)

    Remenska, Daniela; Templon, Jeff; Willemse, Tim; Bal, Henri; Verstoep, Kees; Fokkink, Wan; Charpentier, Philippe; Graciani Diaz, Ricardo; Lanciotti, Elisa; Roiser, Stefan; Ciba, Krzysztof

    2012-12-01

    DIRAC is the grid solution developed to support LHCb production activities as well as user data analysis. It consists of distributed services and agents delivering the workload to the grid resources. Services maintain database back-ends to store dynamic state information of entities such as jobs, queues, staging requests, etc. Agents use polling to check and possibly react to changes in the system state. Each agent's logic is relatively simple; the main complexity lies in their cooperation. Agents run concurrently, and collaborate using the databases as shared memory. The databases can be accessed directly by the agents if running locally or through a DIRAC service interface if necessary. This shared-memory model causes entities to occasionally get into inconsistent states. Tracing and fixing such problems becomes formidable due to the inherent parallelism present. We propose more rigorous methods to cope with this. Model checking is one such technique for analysis of an abstract model of a system. Unlike conventional testing, it allows full control over the parallel processes execution, and supports exhaustive state-space exploration. We used the mCRL2 language and toolset to model the behavior of two related DIRAC subsystems: the workload and storage management system. Based on process algebra, mCRL2 allows defining custom data types as well as functions over these. This makes it suitable for modeling the data manipulations made by DIRAC's agents. By visualizing the state space and replaying scenarios with the toolkit's simulator, we have detected race-conditions and deadlocks in these systems, which, in several cases, were confirmed to occur in reality. Several properties of interest were formulated and verified with the tool. Our future direction is automating the translation from DIRAC to a formal model.

  2. Modal Analysis for Crack Detection in Small Wind Turbine Blades

    DEFF Research Database (Denmark)

    Ulriksen, Martin Dalgaard; Skov, Jonas falk; Dickow, Kristoffer Ahrens

    2013-01-01

    The aim of the present paper is to evaluate structural health monitoring (SHM) techniques based on modal analysis for crack detection in small wind turbine blades. A finite element (FE) model calibrated to measured modal parameters will be introduced to cracks with different sizes along one edge...... of the blade. Changes in modal parameters from the FE model are compared with data obtained from experimental tests. These comparisons will be used to validate the FE model and subsequently discuss the usability of SHM techniques based on modal parameters for condition monitoring of wind turbine blades....

  3. Data-Driven Methods for the Detection of Causal Structures in Process Technology

    Directory of Open Access Journals (Sweden)

    Christian Kühnert

    2014-11-01

    In modern industrial plants, process units are strongly cross-linked with each other, and disturbances occurring in one unit potentially become plant-wide. This can lead to a flood of alarms at the supervisory control and data acquisition system, hiding the original fault causing the disturbance. Hence, one major aim in fault diagnosis is to backtrack the propagation path of the disturbance and to localize the root cause of the fault. Since detecting correlation in the data is not sufficient to describe the direction of the propagation path, cause-effect dependencies among process variables need to be detected. Process variables that show a strong causal impact on other variables in the process come into consideration as being the root cause. In this paper, different data-driven methods are proposed, compared and combined that can detect causal relationships in data while relying solely on process data. The information on causal dependencies is used for localization of the root cause of a fault. All proposed methods consist of a statistical part, which determines whether the disturbance traveling from one process variable to a second is significant, and a quantitative part, which calculates the causal information the first process variable has about the second. The methods are tested on simulated data from a chemical stirred-tank reactor and on a laboratory plant.
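
    The abstract does not list the individual methods, so as one representative data-driven causality test the sketch below runs a Granger causality test on synthetic data in which the first variable is driven by the lagged second variable (statsmodels; all parameters invented).

        import numpy as np
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(0)
        n = 500
        x = rng.normal(size=n)
        y = np.zeros(n)
        for t in range(2, n):               # y is driven by x with a lag of 2
            y[t] = 0.6 * y[t - 1] + 0.8 * x[t - 2] + 0.1 * rng.normal()

        # Tests whether the second column helps predict the first.
        res = grangercausalitytests(np.column_stack([y, x]), maxlag=3, verbose=False)
        print(res[2][0]["ssr_ftest"])       # (F, p, df_denom, df_num) at lag 2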

  4. Detecting inpatient falls by using natural language processing of electronic medical records

    Directory of Open Access Journals (Sweden)

    Toyabe Shin-ichi

    2012-12-01

    Background: Incident reporting is the most common method for detecting adverse events in a hospital. However, under-reporting or non-reporting and delay in submission of reports are problems that prevent early detection of serious adverse events. The aim of this study was to determine whether it is possible to promptly detect serious injuries after inpatient falls by using a natural language processing method and to determine which data source is the most suitable for this purpose. Methods: We tried to detect adverse events from narrative text data of electronic medical records by using a natural language processing method. We made syntactic category decision rules to detect inpatient falls from text data in electronic medical records. We compared how often the true fall events were recorded in various sources of data including progress notes, discharge summaries, image order entries and incident reports. We applied the rules to these data sources and compared F-measures to detect falls between these data sources with reference to the results of a manual chart review. The lag time between event occurrence and data submission and the degree of injury were compared. Results: We made 170 syntactic rules to detect inpatient falls by using a natural language processing method. Information on true fall events was most frequently recorded in progress notes (100%), incident reports (65.0%) and image order entries (12.5%). However, the F-measure to detect falls using the rules was poor when using progress notes (0.12) and discharge summaries (0.24) compared with that when using incident reports (1.00) and image order entries (0.91). Since the results suggested that incident reports and image order entries were possible data sources for prompt detection of serious falls, we focused on a comparison of falls found by incident reports and image order entries. Injury caused by falls found by image order entries was significantly more severe than falls detected by
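
    The paper's 170 syntactic rules (for Japanese clinical text) are not reproduced in the abstract; the sketch below only shows the general shape of rule-based detection, with two invented English patterns and a crude negation check.

        import re

        # Invented example patterns standing in for the paper's syntactic rules.
        FALL_RULES = [
            re.compile(r"\b(patient|pt)\b.*\b(fell|fall(en)?|slipped)\b", re.I),
            re.compile(r"\bfound\b.*\bon the floor\b", re.I),
        ]
        NEGATION = re.compile(r"\b(no|denies|without)\b[^.]*\bfall", re.I)

        def mentions_fall(note: str) -> bool:
            """True if any rule fires and no simple negation pattern applies."""
            if NEGATION.search(note):
                return False
            return any(rule.search(note) for rule in FALL_RULES)

        print(mentions_fall("Pt found lying on the floor, right hip pain."))  # True
        print(mentions_fall("Patient denies any fall this week."))            # False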

  5. Acquisition and processing of advanced sensor data for ERW and UXO detection and classification

    Science.gov (United States)

    Schultz, Gregory M.; Keranen, Joe; Miller, Jonathan S.; Shubitidze, Fridon

    2014-06-01

    The remediation of explosive remnants of war (ERW) and associated unexploded ordnance (UXO) has seen improvements through the injection of modern technological advances and streamlined standard operating procedures. However, reliable and cost-effective detection and geophysical mapping of sites contaminated with UXO such as cluster munitions, abandoned ordnance, and improvised explosive devices rely on the ability to discriminate hazardous items from metallic clutter. In addition to anthropogenic clutter, handheld and vehicle-based metal detector systems are plagued by natural geologic and environmental noise in many post-conflict areas. We present new and advanced electromagnetic induction (EMI) technologies including man-portable and towed EMI arrays and associated data processing software. While these systems feature vastly different form factors and transmit-receive configurations, they all exhibit several fundamental traits that enable successful classification of EMI anomalies. Specifically, multidirectional sampling of scattered magnetic fields from targets and the corresponding high volume of unique data provide rich information for extracting useful classification features for clutter rejection analysis. The quality of classification features depends largely on the extent to which the data resolve unique physics-based parameters. To date, most of the advanced sensors enable high quality inversion by producing data that are extremely rich in spatial content through multi-angle illumination and multi-point reception.

  6. Automatic solar feature detection using image processing and pattern recognition techniques

    Science.gov (United States)

    Qu, Ming

    The objective of the research in this dissertation is to develop a software system to automatically detect and characterize solar flares, filaments and Coronal Mass Ejections (CMEs), the core of so-called solar activity. These tools will assist us to predict space weather caused by violent solar activity. Image processing and pattern recognition techniques are applied to this system. For automatic flare detection, advanced pattern recognition techniques such as Multi-Layer Perceptron (MLP), Radial Basis Function (RBF), and Support Vector Machine (SVM) are used. By tracking the entire process of flares, the motion properties of two-ribbon flares are derived automatically. In the applications of solar filament detection, the Stabilized Inverse Diffusion Equation (SIDE) is used to enhance and sharpen filaments; a new method for automatic threshold selection is proposed to extract filaments from the background; an SVM classifier with nine input features is used to differentiate between sunspots and filaments. Once a filament is identified, morphological thinning, pruning, and adaptive edge linking methods are applied to determine filament properties. Furthermore, a filament matching method is proposed to detect filament disappearance. The automatic detection and characterization of flares and filaments have been successfully applied on Halpha full-disk images that are continuously obtained at Big Bear Solar Observatory (BBSO). For automatically detecting and classifying CMEs, image enhancement, segmentation, and pattern recognition techniques are applied to Large Angle Spectrometric Coronagraph (LASCO) C2 and C3 images. The processed LASCO and BBSO images are saved to a file archive, and the physical properties of detected solar features such as intensity and speed are recorded in our database. Researchers are able to access the solar feature database and analyze the solar data efficiently and effectively. The detection and characterization system greatly improves

  7. [Fast Detection of Camellia Sinensis Growth Process and Tea Quality Informations with Spectral Technology: A Review].

    Science.gov (United States)

    Peng, Ji-yu; Song, Xing-lin; Liu, Fei; Bao, Yi-dan; He, Yong

    2016-03-01

    The research achievements and trends of spectral technology for fast detection of Camellia sinensis growth process information and tea quality information are reviewed. Spectral technology is a fast, nondestructive and efficient detection technology, which mainly comprises infrared spectroscopy, fluorescence spectroscopy, Raman spectroscopy and mass spectrometry. The rapid detection of Camellia sinensis growth process information and tea quality is helpful to realize the informatization and automation of tea production and to ensure tea quality and safety. This paper provides a review of its applications, covering the detection of tea (Camellia sinensis) growing status (nitrogen, chlorophyll, diseases and insect pests), the discrimination of tea varieties, the grade discrimination of tea, the detection of tea internal quality (catechins, total polyphenols, caffeine, amino acids, pesticide residues and so on), the quality evaluation of tea beverages and tea by-products, and the machinery of tea quality determination and discrimination. This paper briefly introduces the trends of the technology for determination of tea growth process information, sensors and industrial application. In conclusion, spectral technology shows high potential to detect Camellia sinensis growth process information, to predict tea internal quality and to classify tea varieties and grades. Suitable chemometrics and preprocessing methods are helpful to improve the performance of the model and remove redundancy, which provides the possibility to develop portable machinery. Future work to develop portable machinery and on-line detection systems is recommended to improve further application. The application and research achievements of spectral technology concerning tea are outlined in this paper for the first time, covering Camellia sinensis growth, tea production, the quality and safety of tea and by-products and so on, as well as some problems to be solved

  8. The analysis of thermally stimulated processes

    CERN Document Server

    Chen, R; Pamplin, Brian

    1981-01-01

    Thermally stimulated processes include a number of phenomena - either physical or chemical in nature - in which a certain property of a substance is measured during controlled heating from a 'low' temperature. Workers and graduate students in a wide spectrum of fields require an introduction to methods of extracting information from such measurements. This book gives an interdisciplinary approach to various methods which may be applied to analytical chemistry including radiation dosimetry and determination of archaeological and geological ages. In addition, recent advances are included, such

  9. Digital image sequence processing, compression, and analysis

    CERN Document Server

    Reed, Todd R

    2004-01-01

    Contents: Introduction (Todd R. Reed); Content-Based Image Sequence Representation (Pedro M. Q. Aguiar, Radu S. Jasinschi, José M. F. Moura, and Charnchai Pluempitiwiriyawej); The Computation of Motion (Christoph Stiller, Sören Kammel, Jan Horn, and Thao Dang); Motion Analysis and Displacement Estimation in the Frequency Domain (Luca Lucchese and Guido Maria Cortelazzo); Quality of Service Assessment in New Generation Wireless Video Communications (Gaetano Giunta); Error Concealment in Digital Video (Francesco G.B. De Natale); Image Sequence Restoration: A Wider Perspective (Anil Kokaram); Video Summarization (Cuneyt M. Taskiran and Edward

  10. Analysis of an EBeam melting process

    Science.gov (United States)

    Schunk, P. R.

    Electron-Beam (EBeam) melting furnaces are routinely used to minimize the occurrence of second-phase particles in the processing of segregation-sensitive alloys. As one part of the process, a circulating electron beam impinges on the surface of a crucible melt pool to help control the shape of the solidification front below. By modeling melt pool hydrodynamics, heat transfer, and the shape of solidification boundaries, we plan to optimize the dwell pattern of the beam so that the material solidifies with a composition as spatially homogeneous as possible. Both two- and three-dimensional models are being pursued with FIDAP 5.02, the former serving as a test bed for various degrees of model sophistication. A heat flux distribution is specified on the top of the domain to simulate the EBeam dwell pattern. In two dimensions it is found that an inertially-driven recirculation in the melt pool interacts with a counter-rotating buoyancy-driven recirculation, and that both recirculations heavily influence the shape of the solidification front. In three dimensions the inertial cell decays quickly with distance from the position of the inlet stream. Because the Rayleigh number can exceed 10^7 for materials and operating conditions of interest, stability and the possibility of spontaneous transients are explored.

  11. SINGLE TREE DETECTION FROM AIRBORNE LASER SCANNING DATA USING A MARKED POINT PROCESS BASED METHOD

    Directory of Open Access Journals (Sweden)

    J. Zhang

    2013-05-01

    Tree detection and reconstruction is of great interest in large-scale city modelling. In this paper, we present a marked point process model to detect single trees from airborne laser scanning (ALS) data. We consider single trees in an ALS-recovered canopy height model (CHM) as a realization of a point process of circles. Unlike the traditional marked point process, we sample the model in a constrained configuration space by making use of image processing techniques. A Gibbs energy is defined on the model, containing a data term which judges the fitness of the model with respect to the data, and a prior term which incorporates prior knowledge of object layouts. We search for the optimal configuration through a steepest gradient descent algorithm. The presented hybrid framework was tested on three forest plots, and experiments show the effectiveness of the proposed method.
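
    The authors optimize a Gibbs energy over a marked point process of circles, which is too involved for a short sketch; as a much simpler baseline on the same input, the following finds candidate tree tops as local maxima of a canopy height model (window size and height threshold invented).

        import numpy as np
        from scipy.ndimage import maximum_filter

        def tree_tops(chm, window=5, min_height=2.0):
            """Candidate stems: cells that are the maximum of their window
            and taller than min_height metres (a baseline, not the paper's MPP)."""
            local_max = chm == maximum_filter(chm, size=window)
            rows, cols = np.nonzero(local_max & (chm > min_height))
            return list(zip(rows, cols))

        rng = np.random.default_rng(2)
        chm = rng.random((50, 50))            # stand-in canopy height model (metres)
        chm[10, 10] = chm[30, 40] = 15.0      # two synthetic crowns
        print(tree_tops(chm))                 # [(10, 10), (30, 40)]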

  12. Power spectrum weighted edge analysis for straight edge detection in images

    Science.gov (United States)

    Karvir, Hrishikesh V.; Skipper, Julie A.

    2007-04-01

    Most man-made objects provide characteristic straight line edges and, therefore, edge extraction is a commonly used target detection tool. However, noisy images often yield broken edges that lead to missed detections, and extraneous edges that may contribute to false target detections. We present a sliding-block approach for target detection using weighted power spectral analysis. In general, straight line edges appearing at a given frequency are represented as a peak in the Fourier domain at a radius corresponding to that frequency, and a direction corresponding to the orientation of the edges in the spatial domain. Knowing the edge width and spacing between the edges, a band-pass filter is designed to extract the Fourier peaks corresponding to the target edges and suppress image noise. These peaks are then detected by amplitude thresholding. The frequency band width and the subsequent spatial filter mask size are variable parameters to facilitate detection of target objects of different sizes under known imaging geometries. Many military objects, such as trucks, tanks and missile launchers, produce definite signatures with parallel lines and the algorithm proves to be ideal for detecting such objects. Moreover, shadow-casting objects generally provide sharp edges and are readily detected. The block operation procedure offers advantages of significant reduction in noise influence, improved edge detection, faster processing speed and versatility to detect diverse objects of different sizes in the image. With Scud missile launcher replicas as target objects, the method has been successfully tested on terrain board test images under different backgrounds, illumination and imaging geometries with cameras of differing spatial resolution and bit-depth.
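
    A hedged sketch of the core Fourier-domain idea: parallel edges with spacing d in an N-pixel block produce power-spectrum peaks at radius about N/d, at an angle set by the edge orientation, which an annular band-pass isolates. Block size, band limits and threshold are all invented.

        import numpy as np

        def straight_edge_peaks(block, r_lo, r_hi, thresh):
            """(power, angle-in-degrees) of power-spectrum peaks in an annulus."""
            f = np.fft.fftshift(np.abs(np.fft.fft2(block)) ** 2)
            n = block.shape[0]
            yy, xx = np.mgrid[-n // 2:n - n // 2, -n // 2:n - n // 2]
            r = np.hypot(xx, yy)
            mask = (r >= r_lo) & (r <= r_hi) & (f > thresh)
            return [(f[i, j], np.degrees(np.arctan2(yy[i, j], xx[i, j])))
                    for i, j in zip(*np.nonzero(mask))]

        n = 64
        block = np.tile(np.sin(2 * np.pi * np.arange(n) / 8), (n, 1))  # period-8 edges
        print(straight_edge_peaks(block, r_lo=6, r_hi=10, thresh=1e5)) # peaks at r = 8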

  13. An analysis of network traffic classification for botnet detection

    DEFF Research Database (Denmark)

    Stevanovic, Matija; Pedersen, Jens Myrup

    2015-01-01

    Botnets represent one of the most serious threats to Internet security today. This paper explores how network traffic classification can be used for accurate and efficient identification of botnet network activity at local and enterprise networks. The paper examines the effectiveness...... of detecting botnet network traffic using three methods that target protocols widely considered as the main carriers of botnet Command and Control (C&C) and attack traffic, i.e. TCP, UDP and DNS. We propose three traffic classification methods based on a capable Random Forests classifier. The proposed methods...... to the optimization of traffic analysis and the correlation of findings from the three analysis methods in order to identify compromised hosts within the network.

  14. Shift endpoint trace selection algorithm and wavelet analysis to detect the endpoint using optical emission spectroscopy

    Science.gov (United States)

    Ben Zakour, Sihem; Taleb, Hassen

    2016-06-01

    Endpoint detection (EPD) is a very important undertaking for understanding and determining whether a plasma etching process has completed correctly. It is a crucial part of delivering repeatable results on every wafer. When the film to be etched has been completely removed, the endpoint is reached. In order to ensure the desired device performance of the produced integrated circuit, many sensors are used to detect the endpoint, such as optical, electrical, acoustical/vibrational, thermal, and frictional sensors. Except for the optical sensor, however, these show weaknesses due to environmental conditions which affect the exactness of reaching the endpoint. Unfortunately, the exposed area of the film to be etched is sometimes very low, weakening the signal and showing the incapacity of the traditional endpoint detection method to determine the completion of the etch process. This work provides a means to improve endpoint detection sensitivity by collecting a large number of full spectral data, containing 1201 spectra for each run; a new, simple algorithm is then proposed to select the important endpoint traces, named shift endpoint trace selection (SETS). Then, a sensitivity analysis of the linear methods principal component analysis (PCA) and factor analysis (FA), and of the nonlinear method wavelet analysis (WA) for both approximations and details, is carried out to compare the performance of the methods mentioned above. The signal-to-noise ratio (SNR) is computed based not only on the main etch (ME) period but also on the over etch (OE) period. Moreover, a statistic previously unused for EPD, the coefficient of variation (CV), is proposed to reach the endpoint in the plasma etch process.
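
    As a minimal illustration of the coefficient-of-variation statistic proposed above, the sketch below computes a trailing-window CV on a synthetic optical emission trace with an intensity drop at the endpoint (window length and noise level invented).

        import numpy as np

        def sliding_cv(trace, window=25):
            """Coefficient of variation (std/mean) over a trailing window."""
            cv = np.full(len(trace), np.nan)
            for i in range(window, len(trace)):
                seg = trace[i - window:i]
                cv[i] = seg.std() / seg.mean()
            return cv

        rng = np.random.default_rng(3)
        t = np.arange(600)
        trace = np.where(t < 400, 1.0, 0.7) + 0.01 * rng.normal(size=t.size)
        print(int(np.nanargmax(sliding_cv(trace))))   # spikes near sample 400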

  15. Detecting deception in children: A meta-analysis.

    Science.gov (United States)

    Gongola, Jennifer; Scurich, Nicholas; Quas, Jodi A

    2017-02-01

    Although research reveals that children as young as 3 can use deception and will take steps to obscure truth, research concerning how well others detect children's deceptive efforts remains unclear. Yet adults regularly assess whether children are telling the truth in a variety of contexts, including at school, in the home, and in legal settings, particularly in investigations of maltreatment. We conducted a meta-analysis to synthesize extant research concerning adults' ability to detect deceptive statements produced by children. We included 45 experiments involving 7,893 adult judges and 1,858 children. Overall, adults could accurately discriminate truths/lies at an average rate of 54%, which is slightly but significantly above chance levels. The average rate at which true statements were correctly classified as honest was higher (63.8%), whereas the rate at which lies were classified as dishonest was not different from chance (47.5%). A small positive correlation emerged between judgment confidence and judgment accuracy. Professionals (e.g., social workers, police officers, teachers) slightly outperformed laypersons (e.g., college undergraduates). Finally, exploratory analyses revealed that the child's age did not significantly affect the rate at which adults could discriminate truths/lies from chance. Future research aimed toward improving lie detection accuracy might focus more on individual differences in children's lie-telling abilities in order to uncover any reliable indicators of deception. (PsycINFO Database Record

  16. Damage Detection and Quantification Using Transmissibility Coherence Analysis

    Directory of Open Access Journals (Sweden)

    Yun-Lai Zhou

    2015-01-01

    A new transmissibility-based damage detection and quantification approach is proposed. Based on operational modal analysis, the transmissibility is extracted from system responses, and the transmissibility coherence is defined and analyzed. Afterwards, a damage-sensitive indicator is defined in order to detect and identify the severity of damage, and it is compared with an indicator developed by other authors. The proposed approach is validated on data from a physics-based numerical model as well as experimental data from a three-story aluminum frame structure. For both the numerical simulation and the experiment, the results of the new indicator reveal a better performance than the coherence measure proposed in Rizos et al., 2008, Rizos et al., 2002, and Fassois and Sakellariou, 2007, especially when nonlinearity occurs, which might be further used in real engineering. The main contributions of this study are the construction of the relation between transmissibility coherence and frequency response function coherence and the construction of an effective indicator based on the transmissibility modal assurance criterion for damage (especially for minor nonlinearity) detection as well as quantification.
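
    A hedged sketch of the two raw quantities the indicator builds on: the transmissibility between two response channels, estimated from Welch spectra, and their spectral coherence. The two-channel data are synthetic; the paper's indicator construction itself is not reproduced.

        import numpy as np
        from scipy.signal import coherence, csd, welch

        fs = 1024
        rng = np.random.default_rng(4)
        x = rng.normal(size=8 * fs)                        # response at DOF i
        y = np.convolve(x, np.ones(8) / 8, mode="same") \
            + 0.05 * rng.normal(size=x.size)               # response at DOF j

        f, s_yx = csd(y, x, fs=fs, nperseg=1024)
        _, s_xx = welch(x, fs=fs, nperseg=1024)
        transmissibility = np.abs(s_yx) / s_xx             # |T(f)| estimate
        _, coh = coherence(y, x, fs=fs, nperseg=1024)
        print(transmissibility[1:4], coh[1:4])             # coherence ~1 at low f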

  17. Selected Tools for Risk Analysis in Logistics Processes

    Science.gov (United States)

    Kulińska, Ewa

    2012-03-01

    As each organization aims at managing effective logistics processes, risk factors can and should be controlled through a proper system of risk management. Implementation of a complex approach to risk management allows for the following: - evaluation of significant risk groups associated with logistics processes implementation, - composition of integrated strategies of risk management, - composition of tools for risk analysis in logistics processes.

  18. ANALYSIS ON TECHNOLOGICAL PROCESSES CLEANING OIL PIPELINES

    Directory of Open Access Journals (Sweden)

    Mariana PǍTRAŞCU

    2015-05-01

    In this paper, research concerning the technological processes of cleaning oil pipelines is presented. Several technologies and materials are known for cleaning the sludge deposits, iron and manganese oxides, dross, stone, etc. deposited on the inner walls of drinking water or industrial pipes. For the oil industry, methods for removing waste materials from pipes and liquid and gas transport networks are long-known, tedious and expensive operations. The main methods and associated problems can be summarized as follows: 1) blowing with compressed air; 2) manual or mechanical brushing, or sanding with water or dry; 3) washing with a high-pressure water jet, solvent or chemical solution to remove stone and hard deposits; 4) combined methods using cleaning machines with water jets, cutters, chains, rotary cutter heads, etc.

  19. Automatic Defect Detection for TFT-LCD Array Process Using Quasiconformal Kernel Support Vector Data Description

    Directory of Open Access Journals (Sweden)

    Yi-Hung Liu

    2011-09-01

    Defect detection has been considered an efficient way to increase the yield rate of panels in thin film transistor liquid crystal display (TFT-LCD) manufacturing. In this study we focus on the array process since it is the first and key process in TFT-LCD manufacturing. Various defects occur in the array process, and some of them could cause great damage to the LCD panels. Thus, how to design a method that can robustly detect defects from the images captured from the surface of LCD panels has become crucial. Previously, support vector data description (SVDD) has been successfully applied to LCD defect detection. However, its generalization performance is limited. In this paper, we propose a novel one-class machine learning method, called quasiconformal kernel SVDD (QK-SVDD), to address this issue. The QK-SVDD can significantly improve generalization performance of the traditional SVDD by introducing the quasiconformal transformation into a predefined kernel. Experimental results, carried out on real LCD images provided by an LCD manufacturer in Taiwan, indicate that the proposed QK-SVDD not only obtains a high defect detection rate of 96%, but also greatly improves generalization performance of SVDD. The improvement has shown to be over 30%. In addition, results also show that the QK-SVDD defect detector is able to accomplish the task of defect detection on an LCD image within 60 ms.

  20. Automatic defect detection for TFT-LCD array process using quasiconformal kernel support vector data description.

    Science.gov (United States)

    Liu, Yi-Hung; Chen, Yan-Jen

    2011-01-01

    Defect detection has been considered an efficient way to increase the yield rate of panels in thin film transistor liquid crystal display (TFT-LCD) manufacturing. In this study we focus on the array process since it is the first and key process in TFT-LCD manufacturing. Various defects occur in the array process, and some of them could cause great damage to the LCD panels. Thus, how to design a method that can robustly detect defects from the images captured from the surface of LCD panels has become crucial. Previously, support vector data description (SVDD) has been successfully applied to LCD defect detection. However, its generalization performance is limited. In this paper, we propose a novel one-class machine learning method, called quasiconformal kernel SVDD (QK-SVDD) to address this issue. The QK-SVDD can significantly improve generalization performance of the traditional SVDD by introducing the quasiconformal transformation into a predefined kernel. Experimental results, carried out on real LCD images provided by an LCD manufacturer in Taiwan, indicate that the proposed QK-SVDD not only obtains a high defect detection rate of 96%, but also greatly improves generalization performance of SVDD. The improvement has shown to be over 30%. In addition, results also show that the QK-SVDD defect detector is able to accomplish the task of defect detection on an LCD image within 60 ms.
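
    The quasiconformal kernel itself is the paper's contribution and is not sketched here; since SVDD with a stationary kernel is closely related to the one-class SVM, the following shows the surrounding one-class workflow with scikit-learn's OneClassSVM trained only on defect-free feature vectors (all data synthetic).

        import numpy as np
        from sklearn.svm import OneClassSVM

        rng = np.random.default_rng(5)
        good = rng.normal(0.5, 0.05, size=(300, 8))   # features of defect-free patches
        clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(good)

        test = np.vstack([rng.normal(0.5, 0.05, (3, 8)),    # normal patches
                          rng.normal(0.9, 0.05, (2, 8))])   # defect-like patches
        print(clf.predict(test))   # +1 = accepted as normal, -1 = flagged as defect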

  1. Research on a Defects Detection Method in the Ferrite Phase Shifter Cementing Process Based on a Multi-Sensor Prognostic and Health Management (PHM) System.

    Science.gov (United States)

    Wan, Bo; Fu, Guicui; Li, Yanruoyue; Zhao, Youhu

    2016-08-10

    The cementing manufacturing process of ferrite phase shifters has the defect that cementing strength is insufficient and fractures always appear. A detection method for these defects was studied utilizing multi-sensor Prognostic and Health Management (PHM) theory. Aiming at these process defects, the reasons that lead to them are analyzed in this paper. Meanwhile, the key process parameters were determined and Differential Scanning Calorimetry (DSC) tests during the cure process of resin cementing were carried out. At the same time, in order to obtain data on changing cementing strength, multiple groups of cementing process tests with different key process parameters were designed and conducted. A relational model of cementing strength and cure temperature, time and pressure was established by combining the data of the DSC and process tests, based on the Avrami formula. Through sensitivity analysis of the three process parameters, the on-line detection decision criterion and the process parameters with obvious impact on cementing strength were determined. A PHM system with multiple temperature and pressure sensors was established on this basis, and then on-line detection, diagnosis and control of ferrite phase shifter cementing process defects were realized. It was verified by the subsequent process that the on-line detection system improved the reliability of the ferrite phase shifter cementing process and reduced the incidence of insufficient cementing strength defects.

  2. Research on a Defects Detection Method in the Ferrite Phase Shifter Cementing Process Based on a Multi-Sensor Prognostic and Health Management (PHM) System

    Directory of Open Access Journals (Sweden)

    Bo Wan

    2016-08-01

    The cementing manufacturing process of ferrite phase shifters has the defect that cementing strength is insufficient and fractures always appear. A detection method for these defects was studied utilizing multi-sensor Prognostic and Health Management (PHM) theory. Aiming at these process defects, the reasons that lead to them are analyzed in this paper. Meanwhile, the key process parameters were determined and Differential Scanning Calorimetry (DSC) tests during the cure process of resin cementing were carried out. At the same time, in order to obtain data on changing cementing strength, multiple groups of cementing process tests with different key process parameters were designed and conducted. A relational model of cementing strength and cure temperature, time and pressure was established by combining the data of the DSC and process tests, based on the Avrami formula. Through sensitivity analysis of the three process parameters, the on-line detection decision criterion and the process parameters with obvious impact on cementing strength were determined. A PHM system with multiple temperature and pressure sensors was established on this basis, and then on-line detection, diagnosis and control of ferrite phase shifter cementing process defects were realized. It was verified by the subsequent process that the on-line detection system improved the reliability of the ferrite phase shifter cementing process and reduced the incidence of insufficient cementing strength defects.

  3. Research on a Defects Detection Method in the Ferrite Phase Shifter Cementing Process Based on a Multi-Sensor Prognostic and Health Management (PHM) System

    Science.gov (United States)

    Wan, Bo; Fu, Guicui; Li, Yanruoyue; Zhao, Youhu

    2016-01-01

    The cementing manufacturing process of ferrite phase shifters has the defect that cementing strength is insufficient and fractures always appear. A detection method for these defects was studied utilizing multi-sensor Prognostic and Health Management (PHM) theory. Aiming at these process defects, the reasons that lead to them are analyzed in this paper. Meanwhile, the key process parameters were determined and Differential Scanning Calorimetry (DSC) tests during the cure process of resin cementing were carried out. At the same time, in order to obtain data on changing cementing strength, multiple groups of cementing process tests with different key process parameters were designed and conducted. A relational model of cementing strength and cure temperature, time and pressure was established by combining the data of the DSC and process tests, based on the Avrami formula. Through sensitivity analysis of the three process parameters, the on-line detection decision criterion and the process parameters with obvious impact on cementing strength were determined. A PHM system with multiple temperature and pressure sensors was established on this basis, and then on-line detection, diagnosis and control of ferrite phase shifter cementing process defects were realized. It was verified by the subsequent process that the on-line detection system improved the reliability of the ferrite phase shifter cementing process and reduced the incidence of insufficient cementing strength defects. PMID:27517935
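
    The relational model above is built on the Avrami formula; as a hedged illustration of that ingredient only, the sketch below fits the standard Avrami form alpha(t) = 1 - exp(-k * t^n) to synthetic DSC conversion data (all parameter values invented).

        import numpy as np
        from scipy.optimize import curve_fit

        def avrami(t, k, n):
            """Avrami equation: degree of conversion alpha = 1 - exp(-k * t**n)."""
            return 1.0 - np.exp(-k * t ** n)

        t = np.linspace(0.1, 60.0, 40)                  # cure time, minutes
        rng = np.random.default_rng(6)
        alpha = avrami(t, 0.002, 2.1) + 0.01 * rng.normal(size=t.size)

        (k_fit, n_fit), _ = curve_fit(avrami, t, alpha, p0=(0.01, 1.5))
        print(k_fit, n_fit)                             # roughly 0.002 and 2.1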

  4. Analysis of bilinear stochastic systems. [involving multiplicative noise processes]

    Science.gov (United States)

    Willsky, A. S.; Marcus, S. I.; Martin, D. N.

    1974-01-01

    Analysis of stochastic dynamical systems that involve multiplicative (bilinear) noise processes is considered. After defining the systems of interest, the evolution of the moments of such systems, the question of stochastic stability, and estimation for bilinear stochastic systems are discussed. Both exact and approximate methods of analysis are introduced, and, in particular, the uses of Lie-theoretic concepts and harmonic analysis are discussed.
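
    As a concrete instance of moment evolution for a system with multiplicative noise, the sketch below simulates the scalar bilinear SDE dX = a*X dt + b*X dW by Euler-Maruyama and checks the known mean E[X_t] = X_0 * exp(a*t); mean-square stability requires 2a + b^2 < 0. Parameter values are invented.

        import numpy as np

        a, b, x0, T, n_steps, n_paths = -0.5, 0.4, 1.0, 2.0, 2000, 20000
        dt = T / n_steps
        rng = np.random.default_rng(8)

        x = np.full(n_paths, x0)
        for _ in range(n_steps):
            dw = rng.normal(scale=np.sqrt(dt), size=n_paths)
            x = x + a * x * dt + b * x * dw          # Euler-Maruyama step

        print(x.mean(), x0 * np.exp(a * T))          # sample mean vs. exact mean
        # Here 2a + b^2 = -0.84 < 0, so the second moment decays as well.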

  5. Detection and analysis of diamond fingerprinting feature and its application

    Science.gov (United States)

    Li, Xin; Huang, Guoliang; Li, Qiang; Chen, Shengyi

    2011-01-01

    Before becoming jewelry, diamonds need to be carved artistically with special geometric features forming a polyhedral structure. There are subtle differences in the structure of this polyhedron in each diamond. With spatial frequency spectrum analysis of the diamond surface structure, we can obtain the diamond fingerprint information which represents the “Diamond ID” and has good specificity. Based on optical Fourier transform spatial spectrum analysis, the fingerprint identification of the surface structure of diamonds in the spatial frequency domain was studied in this paper. We constructed both a completely coherent diamond fingerprinting detection system illuminated by laser and a partially coherent diamond fingerprinting detection system illuminated by LED, and analyzed the effect of the coherence of the light source on the diamond fingerprinting feature. We studied the rotation invariance and translation invariance of the diamond fingerprint and verified the feasibility of real-time and accurate identification of diamond fingerprints. With the benefit of this work, we can provide customs, jewelers and consumers with a real-time and reliable diamond identification instrument, which will curb diamond smuggling, theft and other crimes, and ensure the healthy development of the diamond industry.

  6. Detection and analysis of diamond fingerprinting feature and its application

    Energy Technology Data Exchange (ETDEWEB)

    Li Xin; Huang Guoliang; Li Qiang; Chen Shengyi, E-mail: tshgl@tsinghua.edu.cn [Department of Biomedical Engineering, the School of Medicine, Tsinghua University, Beijing, 100084 (China)

    2011-01-01

    Before becoming jewelry, diamonds need to be carved artistically with special geometric features forming a polyhedral structure. There are subtle differences in the structure of this polyhedron in each diamond. With spatial frequency spectrum analysis of the diamond surface structure, we can obtain the diamond fingerprint information which represents the 'Diamond ID' and has good specificity. Based on optical Fourier transform spatial spectrum analysis, the fingerprint identification of the surface structure of diamonds in the spatial frequency domain was studied in this paper. We constructed both a completely coherent diamond fingerprinting detection system illuminated by laser and a partially coherent diamond fingerprinting detection system illuminated by LED, and analyzed the effect of the coherence of the light source on the diamond fingerprinting feature. We studied the rotation invariance and translation invariance of the diamond fingerprint and verified the feasibility of real-time and accurate identification of diamond fingerprints. With the benefit of this work, we can provide customs, jewelers and consumers with a real-time and reliable diamond identification instrument, which will curb diamond smuggling, theft and other crimes, and ensure the healthy development of the diamond industry.

  7. Analysis of digitized cervical images to detect cervical neoplasia

    Science.gov (United States)

    Ferris, Daron G.

    2004-05-01

    Cervical cancer is the second most common malignancy in women worldwide. If diagnosed in the premalignant stage, cure is invariably assured. Although the Papanicolaou (Pap) smear has significantly reduced the incidence of cervical cancer where implemented, the test is only moderately sensitive, highly subjective and skilled-labor intensive. Newer optical screening tests (cervicography, direct visual inspection and speculoscopy), including fluorescent and reflective spectroscopy, are fraught with certain weaknesses. Yet, the integration of optical probes for the detection and discrimination of cervical neoplasia with automated image analysis methods may provide an effective screening tool for early detection of cervical cancer, particularly in resource poor nations. Investigative studies are needed to validate the potential for automated classification and recognition algorithms. By applying image analysis techniques for registration, segmentation, pattern recognition, and classification, cervical neoplasia may be reliably discriminated from normal epithelium. The National Cancer Institute (NCI), in cooperation with the National Library of Medicine (NLM), has embarked on a program to begin this and other similar investigative studies.

  8. Network structure detection and analysis of Shanghai stock market

    Directory of Open Access Journals (Sweden)

    Sen Wu

    2015-04-01

    Purpose: In order to investigate the community structure of the component stocks of the SSE (Shanghai Stock Exchange) 180 index, a stock correlation network is built to find the intra-community and inter-community relationships. Design/methodology/approach: The stock correlation network is built taking the vertices as stocks and the edges as correlation coefficients of the logarithmic returns of the stock prices. It is first built as undirected and weighted. The GN algorithm is selected to detect community structure after converting the network into an unweighted one with different thresholds. Findings: The result of the network community structure analysis shows that the stock market has obvious industrial characteristics. Most of the stocks in the same industry or in the same supply chain are assigned to the same community. The correlation of stock price fluctuations within a community is closer than between different communities. The result of community structure detection also reflects correlations among different industries. Originality/value: Based on the analysis of the community structure in the Shanghai stock market, the result reflects some industrial characteristics, which has reference value for the relationships among industries or sub-sectors of listed companies.
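
    A hedged sketch of the pipeline described: correlate logarithmic returns, threshold to an unweighted graph, then apply the Girvan-Newman (GN) algorithm via networkx. The synthetic prices carry two hidden "industry" factors, and the threshold value is invented.

        import numpy as np
        import networkx as nx
        from networkx.algorithms.community import girvan_newman

        rng = np.random.default_rng(7)
        factors = rng.normal(size=(2, 250))            # two hidden industry factors
        increments = 0.01 * (factors[np.arange(10) % 2] + rng.normal(size=(10, 250)))
        prices = np.exp(np.cumsum(increments, axis=1))

        returns = np.diff(np.log(prices), axis=1)      # logarithmic returns
        corr = np.corrcoef(returns)

        G = nx.Graph()
        G.add_nodes_from(range(10))
        G.add_edges_from((i, j) for i in range(10) for j in range(i + 1, 10)
                         if corr[i, j] > 0.3)          # threshold -> unweighted graph

        if nx.is_connected(G):                         # GN splits a connected graph
            communities = next(girvan_newman(G))
        else:                                          # else components already split
            communities = list(nx.connected_components(G))
        print([sorted(c) for c in communities])        # even vs. odd stocks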

  9. Analysis of glacial and periglacial processes using structure from motion

    Science.gov (United States)

    Piermattei, L.; Carturan, L.; de Blasi, F.; Tarolli, P.; Dalla Fontana, G.; Vettore, A.; Pfeifer, N.

    2015-11-01

    Close-range photo-based surface reconstruction from the ground is rapidly emerging as an alternative to lidar (light detection and ranging), which today represents the main survey technique in many fields of geoscience. The recent evolution of photogrammetry, incorporating computer vision algorithms such as Structure from Motion (SfM) and dense image matching such as Multi-View Stereo (MVS), allows the reconstruction of dense 3-D point clouds for the photographed object from a sequence of overlapping images taken with a digital consumer camera. The objective of our work was to test the accuracy of the ground-based SfM-MVS approach in calculating the geodetic mass balance of a 2.1 km2 glacier in the Ortles-Cevedale Group, Eastern Italian Alps. In addition, we investigated the feasibility of using the image-based approach for the detection of the surface displacement rate of a neighbouring active rock glacier. Airborne laser scanning (ALS) data were used as benchmarks to estimate the accuracy of the photogrammetric DTMs and the reliability of the method in this specific application. The glacial and periglacial analyses were performed using both range and image-based surveying techniques, and the results were then compared. The results were encouraging because the SfM-MVS approach enables the reconstruction of high-quality DTMs which provided estimates of glacial and periglacial processes similar to those achievable by ALS. Different resolutions and accuracies were obtained for the glacier and the rock glacier, given the different survey geometries, surface characteristics and areal extents. The analysis of the SfM-MVS DTM quality allowed us to highlight the limitations of the adopted expeditious method in the studied alpine terrain and the potential of this method in the multitemporal study of glacial and periglacial areas.

  10. Piezoresistive microcantilever aptasensor for ricin detection and kinetic analysis

    Directory of Open Access Journals (Sweden)

    Zhi-Wei Liu

    2015-04-01

    Full Text Available Up to now, there has been no report on target-molecule detection by a piezoresistive microcantilever aptasensor. In order to evaluate the test performance and investigate the dynamic response characteristics of such a sensor, a novel method for ricin detection and kinetic analysis based on a piezoresistive microcantilever aptasensor was proposed, in which a ricin aptamer was immobilised on the microcantilever surface via a biotin-avidin binding system. Results showed that the detection limit for ricin was 0.04 μg L−1 (S/N ≥ 3). A linear relationship between the response voltage and the concentration of ricin in the range of 0.2–40 μg L−1 was obtained, with the linear regression equation ΔUe = 0.904C + 5.852 (n = 5, R = 0.991, p < 0.001). The sensor showed no response to abrin or BSA and could overcome the influence of complex environmental interferents, indicating high specificity and good selectivity. Recovery (90.5–95.5%) and reproducibility (7.85–9.39%) in the determination of simulated samples (simulated water, soil, and flour) met the analysis requirements. On this basis, a reaction kinetic model based on ligand-receptor binding, and its relationship with the response voltage, was established. The model reflected the dynamic response of the sensor well, with a correlation coefficient (R) greater than or equal to 0.9456 (p < 0.001). The response voltage (ΔUe) and response time (t0) obtained from the fitting equation at different concentrations of ricin fitted well with the measured values.
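
    Given the reported calibration line, a concentration can be read back from a measured response voltage by inverting the linear fit. A minimal sketch, assuming the response voltage is expressed in the same units used for the regression:

```python
def ricin_concentration(delta_ue: float) -> float:
    """Invert the reported calibration line dUe = 0.904*C + 5.852 to
    estimate the ricin concentration C (ug/L); only valid inside the
    reported linear range of 0.2-40 ug/L."""
    c = (delta_ue - 5.852) / 0.904
    if not (0.2 <= c <= 40.0):
        raise ValueError("response falls outside the reported linear range")
    return c
```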

  11. Dynamic analysis methods for detecting anomalies in asynchronously interacting systems

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Akshat; Solis, John Hector; Matschke, Benjamin

    2014-01-01

    Detecting modifications to digital system designs, whether malicious or benign, is problematic due to the complexity of the systems being analyzed. Moreover, static analysis techniques and tools can only be used during the initial design and implementation phases to verify safety and liveness properties. It is computationally intractable to guarantee that any previously verified properties still hold after a system, or even a single component, has been produced by a third-party manufacturer. In this paper we explore new approaches for creating a robust system design by investigating highly-structured computational models that simplify verification and analysis. Our approach avoids the need to fully reconstruct the implemented system by incorporating a small verification component that dynamically detects deviations from the design specification at run-time. The first approach encodes information extracted from the original system design algebraically into a verification component. During run-time this component randomly queries the implementation for trace information and verifies that no design-level properties have been violated. If any deviation is detected then a pre-specified fail-safe or notification behavior is triggered. Our second approach utilizes a partitioning methodology to view liveness and safety properties as a distributed decision task and the implementation as a proposed protocol that solves this task. Thus the problem of verifying safety and liveness properties is translated to that of verifying that the implementation solves the associated decision task. We build upon results from distributed systems and algebraic topology to construct a learning mechanism for verifying safety and liveness properties from samples of run-time executions.
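
    The first approach amounts to a run-time monitor that samples traces and checks design-level properties against them. The sketch below is a schematic illustration, not the authors' implementation; `get_trace` and the property functions are hypothetical stand-ins for system-specific hooks.

```python
import random

def runtime_monitor(get_trace, properties, n_samples=100):
    """Randomly query the running implementation for trace fragments and
    verify each design-level property; trigger a pre-specified fail-safe
    (here, an exception) on any detected deviation."""
    for _ in range(n_samples):
        if random.random() < 0.5:        # query at random times
            continue
        trace = get_trace()              # sample trace information
        for prop in properties:          # each prop: trace -> bool
            if not prop(trace):
                raise RuntimeError(f"design property violated: {prop.__name__}")

# Example safety property over the sampled states of a trace.
def no_unsafe_state(trace):
    return all(state != "UNSAFE" for state in trace)
```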

  12. Improved Kernel PLS-based Fault Detection Approach for Nonlinear Chemical Processes

    Institute of Scientific and Technical Information of China (English)

    王丽; 侍洪波

    2014-01-01

    In this paper, an improved nonlinear process fault detection method is proposed based on modified kernel partial least squares (KPLS). By integrating the statistical local approach (SLA) into the KPLS framework, two new statistics are established to monitor changes in the underlying model. The new modeling strategy avoids the Gaussian distribution assumption of KPLS. A further advantage of the proposed method is that the kernel latent variables can be obtained directly through an eigenvalue decomposition instead of an iterative calculation, which improves the computing speed. The new method is applied to fault detection in the simulation benchmark of the Tennessee Eastman process. The simulation results show superior detection sensitivity and accuracy in comparison to standard KPLS monitoring.
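
    The non-iterative extraction of kernel latent variables can be illustrated as follows. This is a rough sketch of one common direct formulation (an eigendecomposition of the centred Gram matrix combined with the output covariance), not necessarily the modified KPLS of the paper; the RBF kernel and its width are assumptions.

```python
import numpy as np

def rbf_kernel(X, gamma=0.1):
    """Gaussian (RBF) Gram matrix of the samples in X."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kernel_latent_variables(X, Y, n_components=2, gamma=0.1):
    """Obtain kernel latent variables from a single eigendecomposition
    instead of a NIPALS-style iteration."""
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    Kc = H @ rbf_kernel(X, gamma) @ H          # centered Gram matrix
    eigvals, eigvecs = np.linalg.eig(Kc @ Y @ Y.T)
    order = np.argsort(-eigvals.real)          # leading directions first
    T = eigvecs[:, order[:n_components]].real
    return T / np.linalg.norm(T, axis=0)       # score vectors (latent variables)
```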

  13. Automatic Detection of Steel Ball's Surface Flaws Based on Image Processing

    Institute of Scientific and Technical Information of China (English)

    YU Zheng-lin; TAN Wei; YANG Dong-lin; CAO Guo-hua

    2007-01-01

    A new method to detect surface flaws on steel balls is presented, based on computer techniques of image processing and pattern recognition. Surface flaws on steel balls are the primary factor causing bearing failure. The presented method enables efficient and precise detection of steel ball surface flaws, including spots, abrasions, burns, scratches and cracks. The design of the main components of the detection system is described in detail, including the automatic feeding mechanism, the mechanism for automatically spreading the steel ball's surface, the optical system of the microscope, the image acquisition system and the image processing system. The whole automatic system is controlled by an industrial control computer, which can carry out the recognition of steel ball surface flaws effectively.
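
    As an illustration of the image-processing stage, a generic flaw detector can threshold dark regions (spots, burns and scratches appear darker than the polished surface) and keep connected components above a minimum area. This is a simplified sketch of the general technique, not the authors' system; the threshold and area values are assumptions.

```python
import numpy as np
from scipy import ndimage

def detect_flaws(gray: np.ndarray, dark_thresh: int = 60, min_area: int = 25):
    """Flag dark connected regions of a grayscale steel-ball image as
    candidate surface flaws, rejecting speckle noise by area."""
    mask = gray < dark_thresh                  # candidate flaw pixels
    labels, n = ndimage.label(mask)            # connected components
    flaws = []
    for i in range(1, n + 1):
        component = labels == i
        area = int(component.sum())
        if area >= min_area:                   # reject tiny speckles
            ys, xs = np.nonzero(component)
            flaws.append({"area": area,
                          "centroid": (float(ys.mean()), float(xs.mean()))})
    return flaws
```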

  14. Document analysis using an aggregative and iterative process.

    Science.gov (United States)

    Rasmussen, Philippa; Muir-Cochrane, Eimear; Henderson, Ann

    2012-06-01

    This paper is a descriptive commentary concerning the use of document analysis in qualitative research aimed at developing an understanding of the role of child and adolescent mental health nursing in an inpatient unit. The document analysis was undertaken using thematic analysis with both an iterative process (Attride-Stirling) and an aggregative process, the Joanna Briggs Institute Thematic Analysis Program (TAP). After the initial iterative process, the data were entered into an online software program, TAP, for aggregation and further analysis. The TAP software consists of a three-step approach to the analysis of data: extraction of illustrations, aggregation into categories, and synthesis of categories into themes. A TAP chart was generated displaying the connections between the illustrations, categories and themes. The advantages and limitations of utilising the TAP software compared with Computer Assisted Qualitative Data Analysis Software are discussed. The program afforded direct involvement by the researcher in the cognitive process of the analysis, rather than just the technical process. A limitation of the program would be the volume of data if the research involved a vast amount of data. The TAP program is a clearly defined three-step software program that was appropriate for the document analysis in this research. The program would have wide application for facilitating the thematic analysis of documents, although it is best suited to smaller amounts of data.

  15. Mine detection using SF-GPR: A signal processing approach for resolution enhancement and clutter reduction

    DEFF Research Database (Denmark)

    Karlsen, Brian; Jakobsen, Kaj Bjarne; Larsen, Jan;

    2001-01-01

    Proper clutter reduction is essential for Ground Penetrating Radar data, since a low signal-to-clutter ratio prevents correct detection of mine objects. A signal processing approach for resolution enhancement and clutter reduction used on Stepped-Frequency Ground Penetrating Radar (SF-GPR) data is pr...

  16. Motion compensated image processing and optimal parameters for egg crack detection using modified pressure

    Science.gov (United States)

    Shell eggs with microcracks are often undetected during egg grading processes. In the past, a modified pressure imaging system was developed to detect eggs with microcracks without adversely affecting the quality of normal intact eggs. The basic idea of the modified pressure imaging system was to ap...

  17. Statistical methods for the detection and analysis of radioactive sources

    Science.gov (United States)

    Klumpp, John

    In the present study, we consider four topics in the statistical analysis of radioactive sources: Bayesian methods for the analysis of count rate data, analysis of energy data, a model for non-constant background count rate distributions, and a zero-inflated model of the sample count rate. The study begins with a review of Bayesian statistics and techniques for analyzing count rate data. Next, we consider a novel system for incorporating energy information into count rate measurements which searches for elevated count rates in multiple energy regions simultaneously. The system analyzes time-interval data in real time to sequentially update a probability distribution for the sample count rate. We then consider a "moving target" model of background radiation in which the instantaneous background count rate is a function of time, rather than being fixed. Unlike the sequential update system, this model assumes a large body of pre-existing data which can be analyzed retrospectively. Finally, we propose a novel Bayesian technique which allows for simultaneous source detection and count rate analysis. This technique is fully compatible with, but independent of, the sequential update system and moving target model.
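
    The sequential update of a probability distribution for the sample count rate has a simple conjugate illustration: with a Gamma prior on a Poisson rate, each measurement interval updates the posterior in closed form. This is a minimal sketch of the idea, not the system described in the study; the prior and the data are illustrative.

```python
from scipy import stats

def update_rate_posterior(alpha, beta, counts, live_time):
    """Gamma-Poisson conjugate update: Gamma(alpha, beta) prior on the
    count rate (beta is the rate parameter, in seconds of live time),
    observing `counts` events in `live_time` seconds."""
    return alpha + counts, beta + live_time

alpha, beta = 1.0, 1.0                                   # weakly informative prior
for counts, t in [(12, 10.0), (9, 10.0), (15, 10.0)]:    # three intervals
    alpha, beta = update_rate_posterior(alpha, beta, counts, t)

posterior = stats.gamma(a=alpha, scale=1.0 / beta)       # rate in counts/s
print(posterior.mean(), posterior.interval(0.95))
```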

  18. Streamlining the analytical workflow for multiplex MS/MS allergen detection in processed foods.

    Science.gov (United States)

    Pilolli, Rosa; De Angelis, Elisabetta; Monaci, Linda

    2017-04-15

    Allergenic ingredients in pre-packaged foods are regulated by EU legislation mandating their inclusion on labels. In order to protect allergic consumers, sensitive analytical methods are required to detect allergen traces in different food products. As a follow-up to our previous investigations, an optimized, sensitive, label-free LC-MS/MS method for multiplex detection of five allergenic ingredients in a processed food matrix is proposed. A cookie base was chosen as a complex food matrix, and home-made cookies incurred with whole egg, skimmed milk, soy flour, ground hazelnut and ground peanut were prepared at laboratory scale. In order to improve the analytical workflow, both protein extraction and purification protocols were optimized, and finally a sensitive, streamlined SRM-based analytical method for allergen detection in incurred cookies was devised. The effect of baking on the detection of the selected markers was also investigated.

  19. Final Scientific Report, Integrated Seismic Event Detection and Location by Advanced Array Processing

    Energy Technology Data Exchange (ETDEWEB)

    Kvaerna, T.; Gibbons. S.J.; Ringdal, F; Harris, D.B.

    2007-01-30

    In the field of nuclear explosion monitoring, it has become a priority to detect, locate, and identify seismic events down to increasingly small magnitudes. The consideration of smaller seismic events has implications for a reliable monitoring regime. Firstly, the number of events to be considered increases greatly; an exponential increase in naturally occurring seismicity is compounded by large numbers of seismic signals generated by human activity. Secondly, the signals from smaller events become more difficult to detect above the background noise and estimates of parameters required for locating the events may be subject to greater errors. Thirdly, events are likely to be observed by a far smaller number of seismic stations, and the reliability of event detection and location using a very limited set of observations needs to be quantified. For many key seismic stations, detection lists may be dominated by signals from routine industrial explosions which should be ascribed, automatically and with a high level of confidence, to known sources. This means that expensive analyst time is not spent locating routine events from repeating seismic sources and that events from unknown sources, which could be of concern in an explosion monitoring context, are more easily identified and can be examined with due care. We have obtained extensive lists of confirmed seismic events from mining and other artificial sources which have provided an excellent opportunity to assess the quality of existing fully-automatic event bulletins and to guide the development of new techniques for online seismic processing. Comparing the times and locations of confirmed events from sources in Fennoscandia and NW Russia with the corresponding time and location estimates reported in existing automatic bulletins has revealed substantial mislocation errors which preclude a confident association of detected signals with known industrial sources. The causes of the errors are well understood and are

  20. Spectral Components Analysis of Diffuse Emission Processes

    Energy Technology Data Exchange (ETDEWEB)

    Malyshev, Dmitry; /KIPAC, Menlo Park

    2012-09-14

    We develop a novel method to separate the components of a diffuse emission process based on an association with the energy spectra. Most of the existing methods use some information about the spatial distribution of components, e.g., closeness to an external template, independence of components, etc., in order to separate them. In this paper we propose a method where one puts conditions on the spectra only. The advantages of our method are: 1) it is internal: the maps of the components are constructed as combinations of data in different energy bins; 2) the components may be correlated among each other; 3) the method is semi-blind: in many cases, it is sufficient to assume a functional form of the spectra and determine the parameters from a maximization of a likelihood function. As an example, we derive the CMB map and the foreground maps for seven years of WMAP data. In an Appendix, we present a generalization of the method, where one can also add a number of external templates.
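
    The "internal" property (component maps built as combinations of the data in different energy bins) can be illustrated with a least-squares version of the idea: fix a spectral shape for each component and solve for the maps pixel by pixel. This is a simplified sketch under assumed spectra, not the paper's likelihood-maximization machinery.

```python
import numpy as np

def separate_components(counts, spectra):
    """`counts`: (n_energy_bins, n_pixels) data; `spectra`:
    (n_energy_bins, n_components) assumed spectrum of each component.
    Least squares on counts ~ spectra @ maps recovers each component map
    as a linear combination of the energy-bin maps."""
    maps, *_ = np.linalg.lstsq(spectra, counts, rcond=None)
    return maps                                      # (n_components, n_pixels)

# Toy usage: a power-law foreground plus a flat (CMB-like) spectrum.
energies = np.logspace(0, 1, 8)
spectra = np.column_stack([energies ** -2.4, np.ones_like(energies)])
true_maps = np.array([[3.0, 1.0], [0.5, 0.5]])       # two fake pixels
print(separate_components(spectra @ true_maps, spectra))
```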

  1. Analysis of the heat setting process

    Science.gov (United States)

    Besler, N.; Gloy, Y. S.; Gries, T.

    2016-07-01

    Heat setting is an expensive and energy-intensive textile process. It is necessary to guarantee size accuracy and dimensional stability for textile materials. Depending on the material, different heat setting methods such as saturated steam or hot air are used for fixation. The research aim is to define the influence of heat setting on mechanical characteristics and to analyse the correlation of heat setting parameters for polyester. With the help of a “one factor at a time” experimental design, the heat setting parameters are varied. Mechanical characteristics and the material quality of heat-set and non-heat-set material are evaluated to analyse the influence of heat setting. In the described experimental design, up to a temperature of 195 °C and a dwell time of 30 seconds, the material shrinkage of polyester increases with increasing temperature and dwell time. Shrinkage in the wale direction is higher than in the course direction. The tensile strength in the course direction stays constant, whereas the tensile strength in the wale direction can be increased by heat setting.

  2. Computer software for process hazards analysis.

    Science.gov (United States)

    Hyatt, N

    2000-10-01

    Computerized software tools are assuming major significance in conducting HAZOPs. This is because they have the potential to offer better online presentations and performance to HAZOP teams, as well as better documentation and downstream tracking. The chances of something being "missed" are greatly reduced. We know, only too well, that HAZOP sessions can be like the industrial equivalent of a trip to the dentist. Sessions can (and usually do) become arduous and painstaking. To make the process easier for all those involved, we need all the help computerized software can provide. In this paper I have outlined the challenges addressed in the production of Windows software for performing HAZOP and other forms of PHA. The object is to produce more "intelligent", more user-friendly software for performing HAZOP where technical interaction between team members is of key significance. HAZOP techniques, having already proven themselves, are extending into the field of computer control and human error. This makes further demands on HAZOP software and emphasizes its importance.

  3. INTERACTIVE CHANGE DETECTION USING HIGH RESOLUTION REMOTE SENSING IMAGES BASED ON ACTIVE LEARNING WITH GAUSSIAN PROCESSES

    Directory of Open Access Journals (Sweden)

    H. Ru

    2016-06-01

    Full Text Available Although there have been many studies of change detection, the effective and efficient use of high resolution remote sensing images is still a problem. Conventional supervised methods need many annotations to classify the land cover categories and detect their changes. Moreover, the training set in supervised methods often contains many redundant samples that carry no essential information. In this study, we present a method for interactive change detection using high resolution remote sensing images with active learning, to overcome the shortcomings of existing remote sensing image change detection techniques. In our method, there is no annotation of the actual land cover category at the beginning. First, we find a certain number of the most representative objects in an unsupervised way. Then, we detect the change areas from multi-temporal high resolution remote sensing images by active learning with Gaussian processes in an interactive way, proceeding gradually until the detection results no longer change notably. The manual labelling can be reduced substantially, and a desirable detection result can be obtained in a few iterations. Experiments on GeoEye-1 and WorldView-2 remote sensing images demonstrate the effectiveness and efficiency of our proposed method.
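
    A minimal sketch of the interactive loop, using uncertainty sampling with scikit-learn's Gaussian process classifier as a stand-in for the paper's method: the analyst (`oracle`) labels only the object the current model is least sure about. The per-object feature extraction, and the assumption that the initial random sample contains both classes, are simplifications.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier

def active_change_detection(features, oracle, n_init=10, n_queries=30):
    """`features`: (n_objects, n_features) array, one row per segmented
    object; `oracle`: callable index -> 0/1 change label supplied
    interactively by the analyst."""
    rng = np.random.default_rng(0)
    labelled = [int(i) for i in rng.choice(len(features), n_init, replace=False)]
    labels = [oracle(i) for i in labelled]
    for _ in range(n_queries):
        gpc = GaussianProcessClassifier().fit(features[labelled], labels)
        proba = gpc.predict_proba(features)[:, 1]
        for i in np.argsort(np.abs(proba - 0.5)):   # most uncertain first
            if int(i) not in labelled:
                labelled.append(int(i))
                labels.append(oracle(int(i)))
                break
    return gpc.predict(features)                    # final change labels
```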

  4. Exploratory functional flood frequency analysis and outlier detection

    Science.gov (United States)

    Chebana, Fateh; Dabo-Niang, Sophie; Ouarda, Taha B. M. J.

    2012-04-01

    The prevention of flood risks and the effective planning and management of water resources require river flows to be continuously measured and analyzed at a number of stations. For a given station, a hydrograph can be obtained as a graphical representation of the temporal variation of flow over a period of time. The information provided by the hydrograph is essential to determine the severity of extreme events and their frequencies. A flood hydrograph is commonly characterized by its peak, volume, and duration. Traditional hydrological frequency analysis (FA) approaches focused separately on each of these features in a univariate context. Recent multivariate approaches considered these features jointly in order to take into account their dependence structure. However, all these approaches are based on the analysis of a number of characteristics and do not make use of the full information content of the hydrograph. The objective of the present work is to propose a new framework for FA using the hydrographs as curves: functional data. In this context, the whole hydrograph is considered as one infinite-dimensional observation. This context allows us to provide more effective and efficient estimates of the risk associated with extreme events. The proposed approach contributes to addressing the problem of lack of data commonly encountered in hydrology by fully employing all the information contained in the hydrographs. A number of functional data analysis tools are introduced and adapted to flood FA with a focus on exploratory analysis as a first stage toward a complete functional flood FA. These methods, including data visualization, location and scale measures, principal component analysis, and outlier detection, are illustrated in a real-world flood analysis case study from the province of Quebec, Canada.
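
    As a sketch of the functional viewpoint, each hydrograph can be treated as one curve sampled on a common time grid; principal component analysis of the centred curves then yields modes of variation and a simple outlier score. This is an illustrative fragment, assuming pre-aligned, equally sampled hydrographs rather than the registered functional data of the study.

```python
import numpy as np

def functional_pca(hydrographs, n_components=2):
    """`hydrographs`: (n_floods, n_times) array, one resampled flood
    hydrograph per row. Returns the mean curve, the leading principal
    modes, per-curve scores, and a crude outlier score."""
    mean_curve = hydrographs.mean(axis=0)
    centred = hydrographs - mean_curve
    U, s, Vt = np.linalg.svd(centred, full_matrices=False)
    scores = U[:, :n_components] * s[:n_components]
    # Standardized distance in score space flags candidate outliers.
    dist = np.sqrt(((scores / scores.std(axis=0)) ** 2).sum(axis=1))
    return mean_curve, Vt[:n_components], scores, dist
```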

  5. Real-time progressive hyperspectral image processing endmember finding and anomaly detection

    CERN Document Server

    Chang, Chein-I

    2016-01-01

    The book covers the most crucial parts of real-time hyperspectral image processing: causality and real-time capability. Two new concepts of real-time hyperspectral image processing have recently been developed: Progressive Hyperspectral Imaging (PHSI) and Recursive Hyperspectral Imaging (RHSI). Both of these can be used to design algorithms and also form an integral part of real-time hyperspectral image processing. This book focuses on the progressive nature of these algorithms and their real-time, causal implementation in two major applications, endmember finding and anomaly detection, both of which are fundamental tasks in hyperspectral imaging but generally not encountered in multispectral imaging. This book is written particularly to address PHSI in real-time processing, while the book Recursive Hyperspectral Sample and Band Processing: Algorithm Architecture and Implementation (Springer 2016) can be considered its companion volume. Includes preliminary background which is essential to those who work in hyperspectral ima...

  6. A new approach of QRS complex detection based on matched filtering and triangle character analysis.

    Science.gov (United States)

    Li, Yanjun; Yan, Hong; Hong, Feng; Song, Jinzhong

    2012-09-01

    QRS complex detection usually provides the fundamentals for automated electrocardiogram (ECG) analysis. In this paper, a new approach to QRS complex detection without a noise-suppression stage was developed and evaluated, based on the combination of two techniques: matched filtering and triangle character analysis. Firstly, a template of the QRS complex was selected automatically by means of the triangle character in the ECG, and then it was time-reversed after removing its direct-current component. Secondly, matched filtering was implemented at low computational cost by a finite impulse response filter, which further enhanced the QRS complex and attenuated non-QRS regions containing the P-wave, T-wave and various noise components. Subsequently, triangle-structure-based threshold decision was applied to detect QRS complexes, and RR intervals and triangle structures were further analyzed to reduce false-positive and false-negative detections. Finally, the performance of the proposed algorithm was tested on all 48 records of the MIT-BIH Arrhythmia Database. The results demonstrated that the detection rate reached 99.62%, the sensitivity was 99.78%, and the positive predictivity was 99.85%. In addition, the proposed method was able to identify QRS complexes reliably even under conditions of poor signal quality.
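
    The core matched-filtering step is straightforward to sketch: remove the template's DC component, time-reverse it, and apply it as an FIR filter. The template selection and the triangle-based thresholding described above are omitted; this fragment only illustrates the filtering stage.

```python
import numpy as np
from scipy.signal import lfilter

def matched_filter_qrs(ecg, template):
    """Enhance QRS-shaped segments by FIR matched filtering: the filter
    is the DC-free, time-reversed QRS template."""
    h = template - template.mean()       # remove DC component
    h = h[::-1]                          # time reversal -> matched filter
    return lfilter(h, [1.0], ecg)        # low-cost FIR implementation

# Candidate QRS locations are then peaks of the output exceeding a
# threshold derived from the triangle-structure analysis (not shown).
```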

  7. Pulsed laser noise analysis and pump-probe signal detection with a data acquisition card.

    Science.gov (United States)

    Werley, Christopher A; Teo, Stephanie M; Nelson, Keith A

    2011-12-01

    A photodiode and data acquisition card whose sampling clock is synchronized to the repetition rate of a laser are used to measure the energy of each laser pulse. Simple analysis of the data yields the noise spectrum from very low frequencies up to half the repetition rate and quantifies the pulse energy distribution. When two photodiodes for balanced detection are used in combination with an optical modulator, the technique is capable of detecting very weak pump-probe signals (ΔI/I0 ~ 10^-5 at 1 kHz), with a sensitivity that is competitive with a lock-in amplifier. Detection with the data acquisition card is versatile and offers many advantages including full quantification of noise during each stage of signal processing, arbitrary digital filtering in silico after data collection is complete, direct readout of percent signal modulation, and easy adaptation for fast scanning of delay between pump and probe.
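
    With one energy sample per pulse, the Nyquist limit is half the repetition rate, as stated above, and the noise spectrum follows from a standard PSD estimate. A minimal sketch with synthetic data standing in for the photodiode/DAQ samples; the repetition rate and noise level are assumptions.

```python
import numpy as np
from scipy.signal import welch

rep_rate = 1000.0                              # assumed repetition rate, Hz
rng = np.random.default_rng(1)
pulse_energy = 1.0 + 0.01 * rng.standard_normal(60_000)  # one sample per pulse

# PSD of the pulse-energy fluctuations from near-DC up to rep_rate / 2.
f, psd = welch(pulse_energy - pulse_energy.mean(), fs=rep_rate, nperseg=4096)

rms_fraction = pulse_energy.std() / pulse_energy.mean()  # fractional noise
```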

  8. Combination of EEG Complexity and Spectral Analysis for Epilepsy Diagnosis and Seizure Detection

    Directory of Open Access Journals (Sweden)

    Liang Sheng-Fu

    2010-01-01

    Full Text Available Approximately 1% of the world's population has epilepsy, and 25% of epilepsy patients cannot be treated sufficiently by any available therapy. If an automatic seizure-detection system were available, it could reduce the time required by a neurologist to perform an off-line diagnosis by reviewing electroencephalogram (EEG) data, and it could produce an on-line warning signal to alert healthcare professionals or to drive a treatment device such as an electrical stimulator to enhance the patient's safety and quality of life. This paper describes a systematic evaluation of current approaches to seizure detection in the literature. This evaluation was then used to suggest a reliable, practical epilepsy detection method. The combination of complexity analysis and spectral analysis of the EEG can provide robust evaluations of the collected data. Principal component analysis (PCA) and genetic algorithms (GAs) were applied to various linear and nonlinear methods. The best linear models resulted from using all of the features without further processing. For the nonlinear models, applying PCA for feature reduction provided better results than applying GAs. The feasibility of executing the proposed methods on a personal computer for on-line processing was also demonstrated.

  9. Analysis of thermal process of pozzolan production

    Directory of Open Access Journals (Sweden)

    Mejía De Gutiérrez, R.

    2004-06-01

    Full Text Available The objective of this study was to evaluate the effect of heat treatment parameters on the pozzolanic activity of natural kaolin clays. The experimental design included three factors: kaolin type, temperature and time. Five types of Colombian kaolin clays were thermally treated from 400 to 1000 °C for 1, 2, and 3 hours. The raw materials and the products obtained were characterized by X-ray diffraction (XRD), Fourier transform infrared spectroscopy (FTIR) and differential thermal/thermogravimetric analysis (DTA/TGA). The pozzolanic activity of the thermally treated samples was investigated by chemical and mechanical tests.

    The objective of this study was to characterise the production variables of a metakaolin with high pozzolanic reactivity. The experimental design used a factorial model that considered three factors: kaolin type (C), temperature and time. Based on knowledge of the kaolin sources and on contact with suppliers and distributors of the product at the national level, five representative samples of kaolinitic clays were selected and subjected to thermal treatment between 400 and 1,000 °C (six temperature levels) for three exposure times of 1, 2 and 3 hours. The source kaolins and the products obtained from each thermal process were evaluated by physical and chemical techniques, X-ray diffraction, FTIR infrared spectroscopy, and differential thermal analysis (DTA/TGA). In addition, the pozzolanic activity, both chemical and mechanical, of the product obtained at the different study temperatures was evaluated.

  10. Human movement analysis with image processing in real time

    Science.gov (United States)

    Fauvet, Eric; Paindavoine, Michel; Cannard, F.

    1991-04-01

    In the field of the human sciences, many applications need to know the kinematic characteristics of human movements. Psychology associates these characteristics with control mechanisms; sport science and biomechanics associate them with the performance of the sportsman or the patient. If the trainer or the doctor knows the motion properties, the subject's gesture can be corrected to obtain a better performance. Roherton's studies show the evolution of children's motion. Several investigation methods are able to measure human movement, but most current studies are based on image processing. Often the systems work at the TV standard (50 frames per second), which permits the study of only very slow gestures. A human operator analysing the digitised film sequence manually makes the operation very expensive, especially long, and imprecise. On these different grounds, many human movement analysis systems have been implemented. They consist of markers, which are fixed to the anatomically interesting points on the subject in motion, and image compression, which is the art of coding picture data. Generally the compression is limited to calculating the centroid coordinates of each marker. These systems differ from one another in image acquisition and marker detection.

  11. Comprehensive NMR analysis of compositional changes of black garlic during thermal processing.

    Science.gov (United States)

    Liang, Tingfu; Wei, Feifei; Lu, Yi; Kodani, Yoshinori; Nakada, Mitsuhiko; Miyakawa, Takuya; Tanokura, Masaru

    2015-01-21

    Black garlic is a processed food product obtained by subjecting whole raw garlic to thermal processing that causes chemical reactions, such as the Maillard reaction, which change the composition of the garlic. In this paper, we report a nuclear magnetic resonance (NMR)-based comprehensive analysis of raw garlic and black garlic extracts to determine the compositional changes resulting from thermal processing. ¹H NMR spectra with a detailed signal assignment showed that 38 components were altered by thermal processing of raw garlic. For example, the contents of 11 L-amino acids increased during the first step of thermal processing over 5 days and then decreased. Multivariate data analysis revealed changes in the contents of fructose, glucose, acetic acid, formic acid, pyroglutamic acid, cycloalliin, and 5-(hydroxymethyl)furfural (5-HMF). Our results provide comprehensive information on changes in NMR-detectable components during thermal processing of whole garlic.

  12. In-situ laser material process monitoring using a cladding power detection technique

    Science.gov (United States)

    Su, Daoning; Norris, Ian; Peters, Chris; Hall, Denis R.; Jones, Julian D. C.

    Progress in laser material processing may require real-time monitoring and process control for consistent quality and productivity. We report a method of in-situ monitoring of laser metal cutting and drilling using cladding power monitoring of an optical fibre beam delivery system—a technique which detects the light reflected or scattered from the workpiece. The light signal carries information about the quality of the process. Experiments involving drilling and cutting of two samples, a thin aluminum foil and a 2-mm thick stainless steel plate, confirmed the effectiveness of this method.

  13. A novel time-domain signal processing algorithm for real time ventricular fibrillation detection

    Science.gov (United States)

    Monte, G. E.; Scarone, N. C.; Liscovsky, P. O.; Rotter S/N, P.

    2011-12-01

    This paper presents an application of a novel algorithm for real-time detection of ECG pathologies, especially ventricular fibrillation. It is based on a segmentation and labeling process applied to an oversampled signal. After this treatment, global signal behaviours are obtained by analyzing the sequence of segments, much as a human observer would. The entire process can be seen as a morphological filtering after a smart data sampling. The algorithm does not require any pre-processing of the digital ECG signal, and the computational cost is low, so it can be embedded into sensors for wearable and permanent applications. The proposed algorithm could supply the input signal description to expert systems or artificial-intelligence software in order to detect other pathologies.

  14. Decision tree learning for detecting turning points in business process orientation: a case of Croatian companies

    Directory of Open Access Journals (Sweden)

    Ljubica Milanović Glavan

    2015-03-01

    Full Text Available Companies worldwide are embracing Business Process Orientation (BPO) in order to improve their overall performance. This paper presents research results on key turning points in BPO maturity implementation efforts. A key turning point is defined as a component of business process maturity that leads to the establishment and expansion of other factors that move the organization to the next maturity level. Over the past few years, different methodologies for analyzing the maturity state of BPO have been developed. The purpose of this paper is to investigate the possibility of using data mining methods to detect key turning points in BPO. Based on survey results obtained in 2013, the data mining technique of classification and regression trees (C&RT) was used to detect key turning points in Croatian companies. These findings present invaluable guidelines for any business that strives to achieve more efficient business processes.
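
    scikit-learn's CART implementation gives a quick way to reproduce the flavour of this analysis: fit a shallow tree to maturity-component scores and read candidate turning points off the splits near the root. The feature names and data below are hypothetical, not the Croatian survey data.

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical survey excerpt: maturity components scored 1-5, and a
# binary label for reaching the next BPO maturity level.
df = pd.DataFrame({
    "process_documentation": [1, 3, 4, 2, 5, 4],
    "process_measurement":   [2, 2, 5, 1, 4, 5],
    "process_ownership":     [1, 2, 4, 1, 5, 3],
    "next_level":            [0, 0, 1, 0, 1, 1],
})

tree = DecisionTreeClassifier(max_depth=2, random_state=0)
tree.fit(df.drop(columns="next_level"), df["next_level"])

# Splits near the root act as candidate "key turning points".
print(export_text(tree, feature_names=list(df.columns[:-1])))
```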

  15. Image processing and analysis with graphs theory and practice

    CERN Document Server

    Lézoray, Olivier

    2012-01-01

    Covering the theoretical aspects of image processing and analysis through the use of graphs in the representation and analysis of objects, Image Processing and Analysis with Graphs: Theory and Practice also demonstrates how these concepts are indispensable for the design of cutting-edge solutions for real-world applications. It explores new applications in computational photography, image and video processing, computer graphics, recognition, and medical and biomedical imaging. With the explosive growth in image production, in everything from digital photographs to medical scans, there has been a drast

  16. [Two Data Inversion Algorithms of Aerosol Horizontal Distribution Detected by MPL and Error Analysis].

    Science.gov (United States)

    Lü, Li-hui; Liu, Wen-qing; Zhang, Tian-shu; Lu, Yi-huai; Dong, Yun-sheng; Chen, Zhen-yi; Fan, Guang-qiang; Qi, Shao-shuai

    2015-07-01

    Atmospheric aerosols have important impacts on human health, the environment and the climate system. Micro Pulse Lidar (MPL) is a new and effective tool for detecting the horizontal distribution of atmospheric aerosol, and extinction coefficient inversion and error analysis are important aspects of its data processing. In order to detect the horizontal distribution of atmospheric aerosol near the ground, the slope and Fernald algorithms were both used to invert horizontal MPL data, and the results were compared. The error analysis showed that the error of the slope algorithm stems mainly from the theoretical model, while that of the Fernald algorithm stems mainly from its assumptions. Although some problems remain in these two horizontal extinction coefficient inversions, both can represent the spatial and temporal distribution of aerosol particles accurately, and both correlate highly (95%) with a forward-scattering visibility sensor.
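
    The slope algorithm has a compact illustration: for a horizontally homogeneous atmosphere the range-corrected signal satisfies ln(P r^2) = const - 2*alpha*r, so a straight-line fit gives the extinction coefficient. A minimal sketch, assuming a pre-selected fitting range and a background-subtracted signal; the Fernald inversion, which needs a boundary condition, is not shown.

```python
import numpy as np

def slope_method_extinction(r, p):
    """Slope method: fit ln(P * r**2) versus range r; the extinction
    coefficient equals -slope / 2 under the homogeneity assumption."""
    s = np.log(p * r**2)                # range-corrected log signal
    slope, _ = np.polyfit(r, s, 1)      # straight-line fit
    return -slope / 2.0                 # extinction coefficient
```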

  17. Planar optical waveguide based sandwich assay sensors and processes for the detection of biological targets including early detection of cancers

    Science.gov (United States)

    Martinez, Jennifer S.; Swanson, Basil I.; Shively, John E.; Li, Lin

    2009-06-02

    An assay element is described that includes recognition ligands adapted for binding to carcinoembryonic antigen (CEA), bound to a film on a single-mode planar optical waveguide, the film being selected from a membrane, a polymerized bilayer membrane, and a self-assembled monolayer containing polyethylene glycol or polypropylene glycol groups. An assay process for detecting the presence of CEA is also described: a possible CEA-containing sample is injected into a sensor cell including the assay element; the sample is maintained within the sensor cell for a time sufficient for binding to occur between any CEA present within the sample and the recognition ligands; a solution including a reporter ligand is injected into the sensor cell; and the sample within the sensor cell is interrogated with excitation light from the waveguide, provided by an evanescent field of the single mode penetrating into the biological-target-containing sample to a distance of less than about 200 nanometers from the waveguide, thereby exciting any bound reporter ligand within that distance and resulting in a detectable signal.

  18. Assessing segmentation processes by click detection: online measure of statistical learning, or simple interference?

    Science.gov (United States)

    Franco, Ana; Gaillard, Vinciane; Cleeremans, Axel; Destrebecqz, Arnaud

    2015-12-01

    Statistical learning can be used to extract the words from continuous speech. Gómez, Bion, and Mehler (Language and Cognitive Processes, 26, 212-223, 2011) proposed an online measure of statistical learning: They superimposed auditory clicks on a continuous artificial speech stream made up of a random succession of trisyllabic nonwords. Participants were instructed to detect these clicks, which could be located either within or between words. The results showed that, over the length of exposure, reaction times (RTs) increased more for within-word than for between-word clicks. This result has been accounted for by means of statistical learning of the between-word boundaries. However, even though statistical learning occurs without an intention to learn, it nevertheless requires attentional resources. Therefore, this process could be affected by a concurrent task such as click detection. In the present study, we evaluated the extent to which the click detection task indeed reflects successful statistical learning. Our results suggest that the emergence of RT differences between within- and between-word click detection is neither systematic nor related to the successful segmentation of the artificial language. Therefore, instead of being an online measure of learning, the click detection task seems to interfere with the extraction of statistical regularities.

  19. Growth Curve Analysis and Change-Points Detection in Extremes

    KAUST Repository

    Meng, Rui

    2016-05-15

    The thesis consists of two coherent projects. The first project presents the results of evaluating salinity tolerance in barley using growth curve analysis, where different growth trajectories are observed within barley families. The study of salinity tolerance in plants is crucial to understanding plant growth and productivity. Because fully-automated smarthouses with conveyor systems allow non-destructive and high-throughput phenotyping of large numbers of plants, it is now possible to apply advanced statistical tools to analyze daily measurements and to study salinity tolerance. To compare different growth patterns of barley varieties, we use functional data analysis techniques to analyze the daily projected shoot areas. In particular, we apply the curve registration method to align all the curves from the same barley family in order to summarize the family-wise features. We also illustrate how to use statistical modeling to account for spatial variation in microclimate in smarthouses and for temporal variation across runs, which is crucial for identifying traits of the barley varieties. In our analysis, we show that the concentrations of sodium and potassium in leaves are negatively correlated, and their interactions are associated with the degree of salinity tolerance. The second project studies change-point detection methods in extremes when multiple time series data are available. Motivated by the scientific question of whether the chances of experiencing extreme weather differ across the seasons of a year, we develop a change-point detection model to study changes in extremes or in the tail of a distribution. Most existing models identify seasons from multiple yearly time series assuming that a season or a change-point location remains exactly the same across years. In this work, we propose a random effect model that allows the change-point to vary from year to year, following a given distribution. Both parametric and nonparametric methods are developed

  20. Reachability for Finite-State Process Algebras Using Static Analysis

    DEFF Research Database (Denmark)

    Skrypnyuk, Nataliya; Nielson, Flemming

    2011-01-01

    In this work we present an algorithm for solving the reachability problem in finite systems that are modelled with process algebras. Our method uses Static Analysis, in particular Data Flow Analysis, of the syntax of a process algebraic system with multi-way synchronisation. The results of the Data Flow Analysis are used in order to “cut off” some of the branches in the reachability analysis that are not important for determining whether or not a state is reachable. In this way, it is possible for our reachability algorithm to avoid building large parts of the system altogether and still