WorldWideScience

Sample records for analysis detection processing

  1. Techniques of EMG signal analysis: detection, processing, classification and applications

    Science.gov (United States)

    Hussain, M.S.; Mohd-Yasin, F.

    2006-01-01

    Electromyography (EMG) signals can be used for clinical/biomedical applications, Evolvable Hardware Chip (EHW) development, and modern human-computer interaction. EMG signals acquired from muscles require advanced methods for detection, decomposition, processing, and classification. The purpose of this paper is to illustrate the various methodologies and algorithms for EMG signal analysis, to provide efficient and effective ways of understanding the signal and its nature. We further point out some of the hardware implementations using EMG, focusing on applications related to prosthetic hand control, grasp recognition, and human-computer interaction. A comparison study is also given to show the performance of various EMG signal analysis methods. This paper provides researchers with a good understanding of the EMG signal and its analysis procedures. This knowledge will help them develop more powerful, flexible, and efficient applications. PMID:16799694
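    The survey covers many detection and classification pipelines without fixing a single feature set. As a concrete illustration, here is a minimal sketch of four classic time-domain EMG features that recur throughout this literature (mean absolute value, waveform length, zero crossings, slope sign changes); the dead_zone noise guard and all values are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def emg_features(x, dead_zone=0.01):
    """Classic time-domain EMG features: mean absolute value (MAV),
    waveform length (WL), zero crossings (ZC), slope sign changes (SSC)."""
    x = np.asarray(x, dtype=float)
    mav = np.mean(np.abs(x))
    wl = np.sum(np.abs(np.diff(x)))
    # Count a crossing / slope change only when the step exceeds a small
    # dead zone, so measurement noise does not inflate the counts.
    zc = np.sum((x[:-1] * x[1:] < 0) & (np.abs(x[:-1] - x[1:]) > dead_zone))
    dx = np.diff(x)
    ssc = np.sum((dx[:-1] * dx[1:] < 0)
                 & ((np.abs(dx[:-1]) > dead_zone) | (np.abs(dx[1:]) > dead_zone)))
    return {"MAV": mav, "WL": wl, "ZC": int(zc), "SSC": int(ssc)}

# Example on a synthetic noise burst
rng = np.random.default_rng(0)
print(emg_features(rng.normal(0, 0.2, 1000) * np.hanning(1000)))
```

    Feature vectors like these are what downstream classifiers for prosthetic control or grasp recognition typically consume.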

  2. Dynamics analysis of vibration process in Particle Impact Noise Detection

    Institute of Scientific and Technical Information of China (English)

    ZHANG Hui; ZHOU Chang-lei; WANG Shu-juan; ZHAI Guo-fu

    2007-01-01

    The Particle Impact Noise Detection (PIND) test is a reliability screening technique for hermetic devices prescribed by MIL-PRF-39016E. The standard specifies several test conditions but not how to obtain them. This paper establishes a dynamics model of the vibration process based on a first-order mass-spring system. A corresponding Simulink model is also established to simulate the vibration process under arbitrary input excitations. The response equations are derived for sinusoidal excitation, and the electromagnetic force waveforms required to obtain given vibration and shock accelerations are computed. Finally, some simulation results are given.
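    The abstract names the model but gives neither parameters nor solver. Below is a minimal sketch, assuming a mass-spring-damper driven by a sinusoidal electromagnetic force, of how the acceleration response described above can be simulated; m, c, k, F0 and f are illustrative placeholders.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (not taken from the paper)
m, c, k = 0.05, 2.0, 8.0e3   # mass [kg], damping [N*s/m], stiffness [N/m]
F0, f = 5.0, 60.0            # force amplitude [N], drive frequency [Hz]

def rhs(t, y):
    """Mass-spring-damper under sinusoidal electromagnetic forcing:
    m*x'' + c*x' + k*x = F0*sin(2*pi*f*t)."""
    x, v = y
    return [v, (F0 * np.sin(2 * np.pi * f * t) - c * v - k * x) / m]

sol = solve_ivp(rhs, (0.0, 0.5), [0.0, 0.0], max_step=1e-4)
accel = np.gradient(sol.y[1], sol.t)   # acceleration of the device under test
print("peak acceleration [m/s^2]:", np.abs(accel).max())
```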

  3. Lightning Detection Efficiency Analysis Process: Modeling Based on Empirical Data

    Science.gov (United States)

    Rompala, John T.

    2005-01-01

    A ground-based lightning detection system employs a grid of sensors which record and evaluate the electromagnetic signal produced by a lightning strike. Several detectors gather information on that signal's strength, time of arrival, and behavior over time. By coordinating the information from several detectors, an event solution can be generated. That solution includes the signal's point of origin, strength, and polarity. Determination of the location of the lightning strike uses algorithms based on long-established triangulation techniques. Determination of the event's original signal strength relies on the behavior of the generated magnetic field over distance and time. In general, the signal from the event undergoes geometric dispersion and environmental attenuation as it progresses. Our knowledge of that radial behavior, together with the strength of the signal received by detecting sites, permits an extrapolation and evaluation of the original strength of the lightning strike. It also limits the detection efficiency (DE) of the network. For expansive grids with a sparse density of detectors, the DE varies widely over the area served. This limits the utility of the network in gathering information on regional lightning strike density and applying it to meteorological studies. One network of this type is a grid of four detectors in the Rondonian region of Brazil. The service area extends over a million square kilometers, much of it covered by rain forest, so knowledge of lightning strike characteristics over the expanse is of particular value. I have been developing a process that determines the DE over the region [3]. In turn, this provides a way to produce lightning strike density maps, corrected for DE, over the entire region of interest. This report offers a survey of that development to date and a record of present activity.
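    The radial behavior is described only qualitatively above. A minimal sketch of one common parameterization, geometric (1/r) spreading combined with exponential attenuation, shows how the original strike strength is extrapolated from a site reading and how a site's trigger threshold bounds the network's DE; the attenuation length and all numbers are assumed placeholders.

```python
import numpy as np

def source_strength(received, r_km, atten_km=1000.0):
    """Extrapolate the source peak field from a site reading by inverting
    geometric (1/r) spreading plus exponential environmental attenuation."""
    return received * r_km * np.exp(r_km / atten_km)

def site_detects(source, r_km, threshold, atten_km=1000.0):
    """A site detects the event when the propagated signal still exceeds
    its trigger threshold; aggregating this over sites and strike locations
    is the basis of a network detection-efficiency (DE) map."""
    return source / r_km * np.exp(-r_km / atten_km) >= threshold

# Example: a strike 250 km away, read back from a received field of 4 units
s0 = source_strength(4.0, 250.0)
print(s0, site_detects(s0, 400.0, threshold=1.5))
```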

  4. Bayesian analysis to detect abrupt changes in extreme hydrological processes

    Science.gov (United States)

    Jo, Seongil; Kim, Gwangsu; Jeon, Jong-June

    2016-07-01

    In this study, we develop a new method for Bayesian change point analysis. The proposed method is easy to implement and can be extended to a wide class of distributions. Using a generalized extreme-value distribution, we investigate the annual maxima of precipitation observed at stations on the South Korean Peninsula, and find significant changes at the considered sites. We evaluate the hydrological risk in predictions using the estimated return levels. In addition, we explain that misspecification of the probability model can lead to a bias in the number of change points and, using a simple example, show that this problem is difficult to avoid by technical data transformations.
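    The abstract does not reproduce the Bayesian machinery. As a simplified stand-in, the sketch below scans for a single change point in a series of annual maxima by profile likelihood over two separate GEV fits; it illustrates the modeling setup, not the authors' posterior-based method, and min_seg is an assumed guard against degenerate fits.

```python
import numpy as np
from scipy.stats import genextreme

def single_changepoint_gev(x, min_seg=10):
    """Profile-likelihood scan for one change point in annual maxima:
    fit separate GEV distributions before/after each candidate split and
    keep the split with the highest total log-likelihood."""
    x = np.asarray(x, float)
    best_tau, best_ll = None, -np.inf
    for tau in range(min_seg, len(x) - min_seg):
        ll = 0.0
        for seg in (x[:tau], x[tau:]):
            c, loc, scale = genextreme.fit(seg)
            ll += genextreme.logpdf(seg, c, loc, scale).sum()
        if ll > best_ll:
            best_tau, best_ll = tau, ll
    return best_tau, best_ll

# Synthetic example: a shift in the GEV location parameter after year 40
rng = np.random.default_rng(1)
x = np.concatenate([genextreme.rvs(-0.1, 50, 10, size=40, random_state=rng),
                    genextreme.rvs(-0.1, 70, 10, size=40, random_state=rng)])
print(single_changepoint_gev(x))
```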

  5. Similarity ratio analysis for early stage fault detection with optical emission spectrometer in plasma etching process.

    Directory of Open Access Journals (Sweden)

    Jie Yang

    A Similarity Ratio Analysis (SRA) method is proposed for early-stage Fault Detection (FD) in plasma etching processes using real-time Optical Emission Spectrometer (OES) data as input. The SRA method can help to realise a highly precise control system by detecting abnormal etch-rate faults in real time during an etching process. The method processes spectrum scans at successive time points and uses a windowing mechanism over the time series to alleviate problems with timing uncertainties due to process shift from one process run to another. An SRA library is first built to capture the features of a healthy etching process. By comparing with the SRA library, a Similarity Ratio (SR) statistic is then calculated for each spectrum scan as the monitored process progresses. A fault detection mechanism, named 3-Warning-1-Alarm (3W1A), takes the SR values as inputs and triggers a system alarm when certain conditions are satisfied. This design reduces the chance of false alarms and provides a reliable fault reporting service. The SRA method is demonstrated on a real semiconductor manufacturing dataset. The effectiveness of SRA-based fault detection is evaluated using a time-series SR test and also using a post-process SR test. The time-series SR provides an early-stage fault detection service, so less energy and material is wasted on faulty processing. The post-process SR provides a fault detection service with higher reliability than the time-series SR, but with fault testing conducted only after each process run completes.
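    The abstract does not define the SR statistic or the exact 3W1A trigger conditions. The sketch below assumes a cosine-similarity SR against the healthy-process library and an alarm on the third consecutive scan below a warning threshold; both choices, and the warn_th value, are assumptions for illustration.

```python
import numpy as np

def similarity_ratio(scan, library):
    """Assumed SR definition: cosine similarity between the current OES
    scan and its best match in the healthy-process library.
    (The paper's exact statistic may differ.)"""
    scan = scan / np.linalg.norm(scan)
    lib = library / np.linalg.norm(library, axis=1, keepdims=True)
    return float(np.max(lib @ scan))

def three_warning_one_alarm(sr_series, warn_th=0.95):
    """Assumed 3W1A logic: raise the alarm on the third consecutive scan
    whose SR falls below the warning threshold."""
    warnings = 0
    for t, sr in enumerate(sr_series):
        warnings = warnings + 1 if sr < warn_th else 0
        if warnings >= 3:
            return t          # index of the alarm scan
    return None               # no alarm raised
```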

  6. Analysis of active islanding detection methods for grid-connected microinverters for renewable energy processing

    Energy Technology Data Exchange (ETDEWEB)

    Trujillo, C.L. [Grupo de Sistemas Electronicos Industriales del Departamento de Ingenieria Electronica, Universidad Politecnica de Valencia, Camino de Vera S/N, C.P. 46022, Valencia (Spain); Departamento de Ingenieria Electronica, Universidad Distrital Francisco Jose de Caldas, Carrera 7 N 40-53 Piso 5, Bogota (Colombia); Velasco, D.; Figueres, E.; Garcera, G. [Grupo de Sistemas Electronicos Industriales del Departamento de Ingenieria Electronica, Universidad Politecnica de Valencia, Camino de Vera S/N, C.P. 46022, Valencia (Spain)

    2010-11-15

    This paper presents the analysis and comparison of the main active techniques for islanding detection used in grid-connected microinverters for power processing of renewable energy sources. These techniques can be classified into two classes: techniques introducing positive feedback in the control of the inverter and techniques based on harmonics injection. Accurate PSIM™ simulations have been carried out in order to perform a comparative analysis of the techniques under study and to establish their advantages and disadvantages according to IEEE standards. (author)

  7. On-road anomaly detection by multimodal sensor analysis and multimedia processing

    Science.gov (United States)

    Orhan, Fatih; Eren, P. E.

    2014-03-01

    The use of smartphones in Intelligent Transportation Systems is gaining popularity, yet many challenges exist in developing functional applications. Due to the dynamic nature of transportation, vehicular social applications face complexities such as developing robust sensor management, performing signal and image processing tasks, and sharing information among users. This study utilizes a multimodal sensor analysis framework which enables sensor data to be analyzed jointly across modalities. It also provides plugin-based analysis interfaces for developing sensor- and image-processing-based applications, and connects its users via a centralized application as well as to social networks to facilitate communication and socialization. Using this framework, an on-road anomaly detector is developed and tested. The detector utilizes the sensors of a mobile device and is able to identify anomalies such as hard braking, pothole crossing, and speed bump crossing. Upon such detection, the video portion containing the anomaly is automatically extracted in order to enable further image processing analysis. The detection results are shared on a central portal application for online traffic condition monitoring.

  8. Technology Gap Analysis for the Detection of Process Signatures Using Less Than Remote Methods

    Energy Technology Data Exchange (ETDEWEB)

    Hartman, John S.; Atkinson, David A.; Lind, Michael A.; Maughan, A. D.; Kelly, James F.

    2005-01-01

    Although remote sensing methods offer advantages for monitoring important illicit process activities, remote and stand-off technologies cannot detect all important processes with the desired sensitivity and certainty. The main scope of the program is observables, with a primary focus on chemical signatures. A number of key process signatures elude remote or stand-off detection for a variety of reasons (e.g., heavy particulate emissions that do not propagate far enough for detection at stand-off distances, semi-volatile chemicals that do not tend to vaporize and remain in the environment near the source, etc.). Some of these compounds can provide persistent, process-specific information that is not available through remote techniques; however, the associated measurement technologies have their own sets of advantages, disadvantages, and technical challenges that may need to be overcome before additional signature data can be effectively and reliably exploited. The main objective of this report is to describe a process to identify high-impact technology gaps for important less-than-remote detection applications. The subsequent analysis focuses on the technology development needed to enable exploitation of important process signatures. The evaluation process that was developed involves three interrelated and often conflicting requirements-generation activities:
    • Identification of target signature chemicals with unique intelligence value and their associated attributes as mitigated by environmentally influenced fate and transport effects (i.e., what can you expect to actually find that has intelligence value, where do you need to look for it, and what sensitivity and selectivity do you need to see it)
    • Identification of end-user deployment scenario possibilities and constraints, with a focus on alternative detection requirements, timing issues, logistical considerations, and training requirements for a successful measurement
    • Identification of

  9. Analysis of Space Shuttle Ground Support System Fault Detection, Isolation, and Recovery Processes and Resources

    Science.gov (United States)

    Gross, Anthony R.; Gerald-Yamasaki, Michael; Trent, Robert P.

    2009-01-01

    As part of the FDIR (Fault Detection, Isolation, and Recovery) Project for the Constellation Program, a task called the Legacy Benchmarking Task was designed to document, as accurately as possible, the FDIR processes and resources used by Space Shuttle ground support equipment (GSE) during the Shuttle flight program. These results served as a comparison with results obtained from the new FDIR capability. The task team assessed Shuttle and EELV (Evolved Expendable Launch Vehicle) historical data for GSE-related launch delays to identify expected benefits and impact. This analysis included a study of complex fault isolation situations that required a lengthy troubleshooting process. Specifically, four elements of that system were considered: LH2 (liquid hydrogen), LO2 (liquid oxygen), hydraulic test, and ground special power.

  10. Experimental analysis of the auditory detection process on avian point counts

    Science.gov (United States)

    Simons, T.R.; Alldredge, M.W.; Pollock, K.H.; Wettroth, J.M.

    2007-01-01

    We have developed a system for simulating the conditions of avian surveys in which birds are identified by sound. The system uses a laptop computer to control a set of amplified MP3 players placed at known locations around a survey point. The system can realistically simulate a known population of songbirds under a range of factors that affect detection probabilities. The goals of our research are to describe the sources and range of variability affecting point-count estimates and to find applications of sampling theory and methodologies that produce practical improvements in the quality of bird-census data. Initial experiments in an open field showed that, on average, observers tend to undercount birds on unlimited-radius counts, though the proportion of birds counted by individual observers ranged from 81% to 132% of the actual total. In contrast to the unlimited-radius counts, when data were truncated at a 50-m radius around the point, observers overestimated the total population by 17% to 122%. Results also illustrate how detection distances decline and identification errors increase with increasing levels of ambient noise. Overall, the proportion of birds heard by observers decreased by 28 ± 4.7% under breezy conditions, 41 ± 5.2% with the presence of additional background birds, and 42 ± 3.4% with the addition of 10 dB of white noise. These findings illustrate some of the inherent difficulties in interpreting avian abundance estimates based on auditory detections, and why estimates that do not account for variations in detection probability will not withstand critical scrutiny. © The American Ornithologists' Union, 2007.

  11. Lie detection: Cognitive Processes

    OpenAIRE

    Street, C. N. H.

    2013-01-01

    How do we make decisions when we are uncertain? In more real-world settings there is often a vast array of information available to guide the decision, from an understanding of the social situation, to prior beliefs and experience, to information available in the current environment. Yet much of the research into uncertain decision-making has typically studied the process by isolating it from this rich source of information that decision-makers usually have available to them. This thesis take...

  12. EXPLODET Project: Methods of Automatic Data Processing and Analysis for the Detection of Hidden Explosives

    Science.gov (United States)

    Lecca, Paola

    2003-12-01

    The research of the INFN Gruppo Collegato di Trento within the EXPLODET project for humanitarian demining is devoted to the development of a software procedure to automate data analysis and decision making about the presence of hidden explosives. Innovative algorithms for calculating the likely background, a neural-network-based system for energy calibration, and simple statistical methods for qualitative consistency checks of the signals are the main parts of the software performing the automatic data processing.

  13. Malware detection and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chiang, Ken; Lloyd, Levi; Crussell, Jonathan; Sanders, Benjamin; Erickson, Jeremy Lee; Fritz, David Jakob

    2016-03-22

    Embodiments of the invention describe systems and methods for malicious software detection and analysis. A binary executable comprising obfuscated malware on a host device may be received, and incident data indicating a time when the binary executable was received and identifying processes operating on the host device may be recorded. The binary executable is analyzed via a scalable plurality of execution environments, including one or more non-virtual execution environments and one or more virtual execution environments, to generate runtime data and deobfuscation data attributable to the binary executable. At least some of the runtime data and deobfuscation data attributable to the binary executable is stored in a shared database, while at least some of the incident data is stored in a private, non-shared database.

  14. A Process Deviation Analysis Framework

    OpenAIRE

    Depaire, Benoit; Swinnen, Jo; Jans, Mieke; Vanhoof, Koen

    2013-01-01

    Process deviation analysis is becoming increasingly important for companies. This paper presents a framework which structures the field of process deviation analysis and identifies new research opportunities. Application of the framework starts from managerial questions which relate to specific deviation categories and methodological steps. Finally, a general outline for detecting high-level process deviations is formulated.

  15. Fingerprint detection and process prediction by multivariate analysis of fed-batch monoclonal antibody cell culture data.

    Science.gov (United States)

    Sokolov, Michael; Soos, Miroslav; Neunstoecklin, Benjamin; Morbidelli, Massimo; Butté, Alessandro; Leardi, Riccardo; Solacroup, Thomas; Stettler, Matthieu; Broly, Hervé

    2015-01-01

    This work presents a sequential data analysis path, which was successfully applied to identify important patterns (fingerprints) in mammalian cell culture process data regarding process variables, time evolution and process response. The data set incorporates 116 fed-batch cultivation experiments for the production of an Fc-fusion protein. Having precharacterized the evolutions of the investigated variables and manipulated parameters with univariate analysis, principal component analysis (PCA) and partial least squares regression (PLSR) are used for further investigation. The first major objective is to capture and understand the interaction structure and dynamic behavior of the process variables and the titer (process response) using different models. The second major objective is to evaluate those models regarding their capability to characterize and predict the titer production. Moreover, the effects of data unfolding, imputation of missing data, phase separation, and variable transformation on the performance of the models are evaluated. PMID:26399784

  16. Theoretical and experimental analysis of electroweak corrections to the inclusive jet process. Development of extreme topologies detection methods

    International Nuclear Information System (INIS)

    We have studied the behaviour of the inclusive jet, W+jets and Z+jets processes from the phenomenological and experimental points of view in the ATLAS experiment at the LHC, in order to understand the impact of Sudakov logarithms on electroweak corrections and on the associated production of weak vector bosons and jets at the LHC. We have computed the amplitude of the real electroweak corrections to the inclusive jet process due to the real emission of weak vector bosons from jets. This computation was done with the MCFM and NLOjet++ generators at 7 TeV, 8 TeV and 14 TeV. It shows that, for the inclusive jet process, the virtual weak corrections (due to weak bosons in loops) are partially cancelled by the real electroweak corrections, so that the Bloch-Nordsieck violation is reduced for this process. We then participated in the measurement of the differential cross-sections for these different processes in the ATLAS experiment at 7 TeV. In particular, we were involved in technical aspects of the measurement, such as the study of the QCD background to the W+jets process in the muon channel. We then combined the different measurements in this channel to compare their behaviour. The comparison suggests that several effects give the electroweak corrections their relative importance: when electroweak bosons are explicitly required in the final state, the relative contribution of the weak-boson-plus-jets processes to the inclusive jet process increases with the transverse momentum of the jets. This is currently only a preliminary study, and aims to show that such a comparison can be useful for investigating the underlying structure of these processes. Finally, we have studied the noise affecting the ATLAS calorimeter. This has allowed the development of a new way to detect problematic events using well-known theorems from statistics. This new method is able to detect bursts of noise and

  17. NASA Hazard Analysis Process

    Science.gov (United States)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

  18. Image Processing Technique for Brain Abnormality Detection

    Directory of Open Access Journals (Sweden)

    Ashraf Anwar

    2013-02-01

    Medical imaging is expensive and highly sophisticated because of proprietary software and the expert personnel it requires. This paper introduces an inexpensive, user-friendly, general-purpose image processing and visualization tool, designed in MATLAB, to detect many brain disorders as early as possible. The application provides clinical and quantitative analysis of medical images. Minute structural differences in the brain gradually result in major disorders such as schizophrenia, epilepsy, inherited speech and language disorders, and Alzheimer's dementia. Here the main focus is on diagnosing disease related to the brain and its psychic nature (Alzheimer's disease).

  19. Determination of diethanolamine or N-methyldiethanolamine in high ammonium concentration matrices by capillary electrophoresis with indirect UV detection: application to the analysis of refinery process waters

    Energy Technology Data Exchange (ETDEWEB)

    Bord, N.; Cretier, G.; Rocca, J.-L. [Universite Claude Bernard Lyon 1 (France). Laboratoire des Sciences Analytiques; Bailly, C. [Centre de Recherches de Gonfreville, Total France, Laboratoires Chromatographie Liquide et Microbiologie, Rogerville (France); Souchez, J.-P. [Centre de Recherches de Solaize, Total France, Chemin du Canal, BP 22, St-Symphorien d' Ozon (France)

    2004-09-01

    Alkanolamines such as diethanolamine (DEA) and N-methyldiethanolamine (MDEA) are used in desulfurization processes in crude oil refineries. These compounds may be found in process waters following accidental contamination. The analysis of alkanolamines in refinery process waters is very difficult due to the high ammonium concentration of the samples. This paper describes a method for the determination of DEA in high-ammonium-concentration refinery process waters using capillary electrophoresis (CE) with indirect UV detection. The same method can be used for the determination of MDEA. Best results were achieved with a background electrolyte (BGE) comprising 10 mM histidine adjusted to pH 5.0 with acetic acid. The development of this electrolyte and the analytical performance are discussed. Quantification was performed using internal standardization, with triethanolamine (TEA) as the internal standard. A matrix effect due to the high ammonium content was identified, and standard addition was therefore used. The developed method was characterized in terms of repeatability of migration times and corrected peak areas, linearity, and accuracy. The limits of detection (LOD) and quantification (LOQ) obtained were 0.2 and 0.7 ppm, respectively. The CE method was applied to the determination of DEA or MDEA in refinery process waters spiked with known amounts of analytes, and it gave excellent results, with uncertainties of 8 and 5%, respectively. (orig.)
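    Standard addition, used above to cope with the ammonium matrix effect, extrapolates the analyte concentration from the responses of spiked aliquots. A minimal sketch with illustrative numbers (not the paper's data):

```python
import numpy as np

# Standard-addition quantitation: spike the sample with known analyte
# amounts, regress response on added concentration, and read the original
# concentration from the magnitude of the x-intercept.
added = np.array([0.0, 1.0, 2.0, 4.0])         # ppm DEA added (illustrative)
response = np.array([0.42, 0.61, 0.83, 1.24])  # corrected peak area ratio
slope, intercept = np.polyfit(added, response, 1)
c_sample = intercept / slope                   # ppm in the original sample
print(f"DEA concentration: {c_sample:.2f} ppm")
```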

  20. Integrated Process Capability Analysis

    Institute of Scientific and Technical Information of China (English)

    Chen H.T.; Huang M.L.; Hung Y.H.; Chen K.S.

    2002-01-01

    Process Capability Analysis (PCA) is a powerful tool to assess the ability of a process to manufacture product that meets specifications. A larger process capability index implies a higher process yield, and a larger process capability index also indicates a lower expected process loss. Chen et al. (2001) applied the indices Cpu, Cpl, and Cpk for evaluating the process capability of a multi-process product with smaller-the-better, larger-the-better, and nominal-the-best spec...
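    The record is cut off before the index definitions; for reference, the standard textbook forms are shown below, where USL and LSL are the upper and lower specification limits and \mu and \sigma are the process mean and standard deviation. A larger C_pk means the process mean sits further from the nearer limit, which is why larger indices imply higher yield and lower expected loss.

```latex
C_{p}  = \frac{USL - LSL}{6\sigma}, \qquad
C_{pu} = \frac{USL - \mu}{3\sigma}, \qquad
C_{pl} = \frac{\mu - LSL}{3\sigma}, \qquad
C_{pk} = \min(C_{pu},\, C_{pl})
```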

  1. Ongoing Active Deformation Processes at Fernandina Volcano (Galapagos) Detected via Multi-Orbit COSMO-SkyMed SAR Data Analysis

    Science.gov (United States)

    Pepe, Susi; Castaldo, Raffaele; De Luca, Claudio; Casu, Francesco; Tizzani, Pietro; Sansosti, Eugenio

    2014-05-01

    Fernandina Volcano, Galápagos (Ecuador), has experienced several uplift and eruption episodes over the last twenty-two years. The ground deformation between 2002 and 2006 was interpreted as the effect of an inflation phenomenon of two separate magma reservoirs beneath the caldera. Moreover, the uplift deformation that occurred during the 2005 eruption was concentrated near the circumferential eruptive fissures, while being superimposed on a broad subsidence centred on the caldera. Geodetic studies emphasized the presence of two subvolcanic lateral intrusions from the central storage system in December 2006 and August 2007. The latest eruption, in 2009, was characterized by lava flows emitted from the SW radial fissures. We analyze the spatial and temporal ground deformation between March 2012 and July 2013, using data acquired by the COSMO-SkyMed X-band constellation along both ascending and descending orbits and applying advanced InSAR techniques. In particular, we use the SBAS InSAR approach and combine ascending and descending time series to produce the vertical and East-West components of the mean deformation velocity and the deformation time series. Our analysis revealed a new uplift phenomenon due to stress concentration inside the shallow magmatic system of the volcano. In particular, the vertical mean velocity map shows that the deformation pattern is concentrated inside the caldera region and is characterized by strong radial symmetry, with a maximum displacement of about 20 cm in uplift; an axial symmetry is also observed in the EW horizontal mean velocity map, showing a maximum displacement of about +12 cm towards East for the SE flank and -12 cm towards West for the NW flank of the volcano. Moreover, the deformation time series show a rather linear uplift trend from March to September 2012, interrupted by a low-deformation-rate interval lasting until January 2013. After this stage, the deformation again shows a linear behaviour with an increased uplift rate.
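    The vertical and east-west components mentioned above are obtained by combining the two line-of-sight (LOS) geometries. A minimal per-pixel sketch under a simplified geometry (N-S motion neglected; sign conventions vary between processors, and the incidence angles here are illustrative, not the COSMO-SkyMed values):

```python
import numpy as np

def los_to_vertical_east(v_asc, v_desc, inc_asc, inc_desc):
    """Solve per pixel for vertical (U) and east-west (E) velocity from
    ascending/descending LOS velocities, assuming the simplified geometry
    d_los = U*cos(theta) -/+ E*sin(theta) and neglecting N-S motion."""
    A = np.array([[np.cos(inc_asc), -np.sin(inc_asc)],    # ascending
                  [np.cos(inc_desc), np.sin(inc_desc)]])  # descending
    U, E = np.linalg.solve(A, np.array([v_asc, v_desc]))
    return U, E

# Example with assumed incidence angles of ~34 degrees
print(los_to_vertical_east(0.18, 0.02, np.radians(34), np.radians(34)))
```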

  2. Badge Office Process Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-12

    The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.

  3. The Detection and Analysis Methods of Photochemical Effects in Laser Processing

    Institute of Scientific and Technical Information of China (English)

    SONG Youquan; WEI Xin; XIE Xiaozhu; LIU Junhong

    2011-01-01

    Photothermal and photochemical effects both occur when materials are processed with lasers. The detection and analysis methods for photochemical effects in laser processing are reviewed. Photochemical results are mainly detected through changes in the material's surface structure, composition, micromorphology, and physical-chemical characteristics after laser processing, while the photochemical process itself is detected using time-resolved spectroscopy or mass spectrometry to analyze the mass, energy and angle distributions of the molecular fragments and charged ions produced during laser processing. The photochemical mechanism is generally studied by analyzing the experimental detection results combined with theoretical knowledge of the material's chemical structure.

  4. Ultrasound perfusion signal processing for tumor detection

    Science.gov (United States)

    Kim, MinWoo; Abbey, Craig K.; Insana, Michael F.

    2016-04-01

    Enhanced blood perfusion in a tissue mass is an indication of neo-vascularity and a sign of potential malignancy. Ultrasonic pulsed-Doppler imaging is a preferred modality for noninvasive monitoring of blood flow. However, the weak blood echoes and disorganized slow flow make it difficult to detect perfusion using standard methods without the expense and risk of contrast enhancement. Our research measures the efficiency of conventional power-Doppler (PD) methods at discriminating flow states by comparing their performance to that of an ideal discriminator. ROC analysis applied to the experimental results shows that power-Doppler methods are just 30-50% efficient at perfusion flows below 1 ml/min, suggesting an opportunity to improve perfusion assessment through signal processing. A new perfusion estimator is proposed by extending the statistical discriminator approach. We show that 2-D perfusion color imaging may be enhanced using this approach.

  5. Post-processing noise removal algorithm for magnetic resonance imaging based on edge detection and wavelet analysis

    Energy Technology Data Exchange (ETDEWEB)

    Placidi, Giuseppe; Alecci, Marcello; Sotgiu, Antonello [INFM, c/o Centro di Risonanza Magnetica and Dipartimento di Scienze e Tecnologie Biomediche, Universita dell' Aquila, Via Vetoio 10, 67010 Coppito, L' Aquila (Italy)

    2003-07-07

    A post-processing noise suppression technique for biomedical MRI images is presented. The described procedure recovers both sharp edges and smooth surfaces from a given noisy MRI image; it does not blur the edges and does not introduce spikes or other artefacts. The fine details of the image are also preserved. The proposed algorithm first extracts the edges from the original image and then performs noise reduction using a wavelet de-noising method. After the application of the wavelet method, the edges are restored to the filtered image. The result is the original image with less noise, fine detail and sharp edges. Edge extraction is performed using an algorithm based on Sobel operators. The wavelet de-noising method is based on the calculation of the correlation factor between wavelet coefficients belonging to different scales. The algorithm was tested on several MRI images and, as an example of its application, we report the results obtained from a spin-echo (multi-echo) MRI image of a human wrist collected with a low-field experimental scanner (the signal-to-noise ratio, SNR, of the experimental image was 12). Other filtering operations were performed after the addition of white noise to both channels of the experimental image, before the magnitude calculation. The results at SNR = 7, SNR = 5 and SNR = 3 are also reported. For SNR values between 5 and 12, the improvement in SNR was substantial and the fine details were preserved, the edges were not blurred, and no spikes or other artefacts were evident, demonstrating the good performance of our method. At very low SNR (SNR = 3) our result is worse than that obtained by a simpler filtering procedure.
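    As a sketch of the scheme described above, the code below extracts a Sobel edge map, wavelet de-noises the image, and restores the original pixels on the edges. For brevity, a universal soft threshold stands in for the paper's inter-scale correlation rule; the wavelet choice, level, k and edge_q parameters are assumptions.

```python
import numpy as np
import pywt
from scipy import ndimage

def denoise_preserving_edges(img, wavelet="db2", level=2, k=3.0, edge_q=95):
    """Edge-preserving wavelet de-noising: extract an edge map (Sobel),
    soft-threshold the wavelet detail coefficients, then restore the
    original pixel values on the detected edges."""
    grad = np.hypot(ndimage.sobel(img, axis=0), ndimage.sobel(img, axis=1))
    edges = grad > np.percentile(grad, edge_q)

    coeffs = pywt.wavedec2(img, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745   # noise estimate
    den = [coeffs[0]] + [tuple(pywt.threshold(d, k * sigma, "soft") for d in lvl)
                         for lvl in coeffs[1:]]
    out = pywt.waverec2(den, wavelet)[:img.shape[0], :img.shape[1]]
    out[edges] = img[edges]        # put the sharp edges back
    return out
```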

  6. Chemical analysis of raw and processed Fructus arctii by high-performance liquid chromatography/diode array detection-electrospray ionization-mass spectrometry

    Directory of Open Access Journals (Sweden)

    Kunming Qin

    2014-01-01

    Background: In traditional Chinese medicine (TCM), raw and processed herbs are used to treat different diseases. Fructus Arctii, the dried fruit of Arctium lappa L. (Compositae), is widely used in TCM. Stir-frying is the most common processing method, which might modify the chemical composition of Fructus Arctii. Materials and Methods: To test this hypothesis, we focused on the analysis and identification of the main chemical constituents in raw and processed Fructus Arctii (PFA) by high-performance liquid chromatography/diode array detection-electrospray ionization-mass spectrometry. Results: There was less arctiin in stir-fried materials than in raw materials; however, there were higher levels of arctigenin in stir-fried materials than in raw materials. Conclusion: We suggest that arctiin is reduced significantly through its thermal conversion to arctigenin. This finding may shed some light on understanding the differences in the therapeutic values of raw versus processed Fructus Arctii in TCM.

  7. Chemical process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

    The Office of Worker Health and Safety (EH-5), under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE), has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  8. Aluminium Process Fault Detection and Diagnosis

    Directory of Open Access Journals (Sweden)

    Nazatul Aini Abd Majid

    2015-01-01

    The challenges in developing a fault detection and diagnosis system for industrial applications are not inconsiderable, particularly for complex materials processing operations such as aluminium smelting. However, organizing the various fault detection and diagnostic systems for the aluminium smelting process into groups can assist in identifying the key elements of an effective monitoring system. This paper reviews aluminium process fault detection and diagnosis systems and proposes a taxonomy that includes four key elements: knowledge, techniques, usage frequency, and results presentation. Each element is explained together with examples of existing systems. A fault detection and diagnosis system developed based on the proposed taxonomy is demonstrated using aluminium smelting data. A potential new strategy for improving fault diagnosis is discussed, based on the ability of augmented reality to augment operators' view of an industrial plant so that it permits situation-oriented action in real working environments.

  9. Intelligent Signal Processing for Detection System Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Fu, C Y; Petrich, L I; Daley, P F; Burnham, A K

    2004-12-05

    A wavelet-neural network signal processing method has demonstrated approximately tenfold improvement over traditional signal-processing methods for the detection limit of various nitrogen and phosphorus compounds from the output of a thermionic detector attached to a gas chromatograph. A blind test was conducted to validate the lower detection limit. All fourteen of the compound spikes were detected when above the estimated threshold, including all three within a factor of two above the threshold. In addition, two of six spikes were detected at levels of 1/2 the concentration of the nominal threshold. Another two of the six would have been detected correctly if we had allowed human intervention to examine the processed data. One apparent false positive in five nulls was traced to a solvent impurity, whose presence was subsequently identified by analyzing a solvent aliquot evaporated to 1% residual volume, while the other four nulls were properly classified. We view this signal processing method as broadly applicable in analytical chemistry, and we advocate that advanced signal processing methods should be applied as directly as possible to the raw detector output so that less discriminating preprocessing and post-processing does not throw away valuable signal.

  10. Parallel Image Processing Technology of Surface Detection System

    Institute of Scientific and Technical Information of China (English)

    LI Chang-le; CHENG Wan-sheng; FAN Ji-zhuang; ZHAO Jie

    2008-01-01

    To improve the image processing speed and detection precision of a surface detection system for strip surfaces, the design of a parallel image processing system and the methods of algorithm implementation have been studied, based on an analysis of the characteristics of the image data and of image processing in strip surface detection systems. Using a field programmable gate array (FPGA) as the hardware implementation platform and considering the characteristics of strip surface detection, a parallel image processing system implemented with multiple IP cores is designed. According to the computing tasks and the load-balancing capability of the parallel processing system, the system can set different numbers of computing nodes to meet the system's demand and save hardware cost.

  11. Fast Facial Detection by Depth Map Analysis

    Directory of Open Access Journals (Sweden)

    Ming-Yuan Shieh

    2013-01-01

    In order to obtain correct facial recognition results, one needs to adopt appropriate facial detection techniques. Moreover, the performance of facial detection is usually affected by environmental conditions such as background, illumination, and the complexity of objects. In this paper, the proposed facial detection scheme, which is based on depth map analysis, aims to improve the effectiveness of facial detection and recognition under different environmental illumination conditions. The proposed procedures consist of scene depth determination, outline analysis, Haar-like classification, and related image processing operations. Since infrared light sources can be used to increase dark visibility, the active infrared visual images captured by a structured-light sensory device such as Kinect are less influenced by environmental light, which benefits detection accuracy. The proposed system therefore first detects the human target and obtains its relative position by structured-light analysis; the face is then determined by image processing operations. The experimental results demonstrate that the proposed scheme not only improves facial detection under varying light conditions but also benefits facial recognition.
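    A minimal sketch of the pipeline described above, assuming a Kinect-style depth map registered to the color image: depth gating selects the person-range pixels, then a stock Haar cascade runs on the masked grayscale image. The near/far range limits are illustrative, not from the paper.

```python
import cv2
import numpy as np

def detect_faces_with_depth(bgr, depth_mm, near=500, far=1500):
    """Keep only pixels in a plausible person range using the depth map,
    then run a Haar-like cascade on the masked grayscale image."""
    mask = ((depth_mm > near) & (depth_mm < far)).astype(np.uint8)
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY) * mask   # zero out far pixels
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
```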

  12. Multimode Process Fault Detection Using Local Neighborhood Similarity Analysis☆

    Institute of Scientific and Technical Information of China (English)

    Xiaogang Deng; Xuemin Tian

    2014-01-01

    Traditional data-driven fault detection methods assume a unimodal distribution of process data, so they often perform poorly in chemical processes with multiple operating modes. In order to monitor multimode chemical processes effectively, this paper presents a novel fault detection method based on local neighborhood similarity analysis (LNSA). In the proposed method, prior process knowledge is not required and only the multimode normal operation data are used to construct a reference dataset. For online monitoring of the process state, LNSA applies a moving window technique to obtain a current snapshot data window. A neighborhood searching technique is then used to acquire the corresponding local neighborhood data window from the reference dataset. Similarity analysis between the snapshot and neighborhood data windows is performed, which includes the calculation of a principal component analysis (PCA) similarity factor and a distance similarity factor. The PCA similarity factor captures changes in data direction, while the distance similarity factor monitors shifts in the data center position. Based on these similarity factors, two monitoring statistics are built for multimode process fault detection. Finally, a simulated continuous stirred tank system is used to demonstrate the effectiveness of the proposed method. The simulation results show that LNSA can detect multimode process changes effectively and performs better than traditional fault detection methods.
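    The abstract does not give formulas for the two factors. The sketch below uses the Krzanowski-style PCA similarity factor between the subspaces of two windows, and a simplified stand-in for the distance factor based on the scaled shift of the window means; k and the scaling are assumptions.

```python
import numpy as np

def pca_similarity_factor(X1, X2, k=3):
    """Krzanowski-style PCA similarity between two data windows: mean
    squared cosine of the angles between their first k principal
    directions (1 = same subspace, 0 = orthogonal)."""
    def loadings(X):
        Xc = X - X.mean(axis=0)
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Vt[:k].T                      # columns = PC directions
    L1, L2 = loadings(X1), loadings(X2)
    return float(np.trace(L1.T @ L2 @ L2.T @ L1) / k)

def distance_similarity_factor(X1, X2):
    """Shift of the data centre, scaled by the reference window's spread
    (a simplified stand-in for the paper's distance factor)."""
    d = (X1.mean(axis=0) - X2.mean(axis=0)) / (X1.std(axis=0) + 1e-12)
    return float(np.exp(-0.5 * np.mean(d ** 2)))   # 1 = same centre
```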

  13. Intelligent Signal Processing for Detection System Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Fu, C Y; Petrich, L I; Daley, P F; Burnham, A K

    2004-06-18

    A wavelet-neural network signal processing method has demonstrated approximately tenfold improvement in the detection limit of various nitrogen and phosphorus compounds over traditional signal-processing methods in analyzing the output of a thermionic detector attached to the output of a gas chromatograph. A blind test was conducted to validate the lower detection limit. All fourteen of the compound spikes were detected when above the estimated threshold, including all three within a factor of two above. In addition, two of six were detected at levels 1/2 the concentration of the nominal threshold. We would have had another two correct hits if we had allowed human intervention to examine the processed data. One apparent false positive in five nulls was traced to a solvent impurity, whose presence was identified by running a solvent aliquot evaporated to 1% residual volume, while the other four nulls were properly classified. We view this signal processing method as broadly applicable in analytical chemistry, and we advocate that advanced signal processing methods be applied as directly as possible to the raw detector output so that less discriminating preprocessing and post-processing does not throw away valuable signal.

  14. Signal processing aspects of windshear detection

    Science.gov (United States)

    Aalfs, David D.; Baxa, Ernest G., Jr.; Bracalente, Emedio M.

    1993-01-01

    Low-altitude windshear (LAWS) has been identified as a major hazard to aircraft, particularly during takeoff and landing. The Federal Aviation Administration (FAA) has been involved with developing technology to detect LAWS. A key element in this technology is high resolution pulse Doppler weather radar equipped with signal and data processing to provide timely information about possible hazardous conditions.

  15. Network Anomaly Detection Based on Wavelet Analysis

    Science.gov (United States)

    Lu, Wei; Ghorbani, Ali A.

    2008-12-01

    Signal processing techniques have recently been applied to analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network, where five attack types are successfully detected from over 30 million flows.

  16. Network Anomaly Detection Based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Ali A. Ghorbani

    2008-11-01

    Signal processing techniques have recently been applied to analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network, where five attack types are successfully detected from over 30 million flows.

  17. Linear discriminant analysis for welding fault detection

    International Nuclear Information System (INIS)

    This work presents a new method for real-time welding fault detection in industry based on Linear Discriminant Analysis (LDA). A set of parameters was calculated from one-second blocks of electrical data recorded during welding, based on control data from reference welds under good conditions as well as faulty welds. Optimised linear combinations of the parameters were determined with LDA and tested with independent data. Short-arc welds in overlap joints were studied with various power sources, shielding gases, wire diameters, and process geometries, and out-of-position faults were investigated. Application of LDA fault detection to a broad range of welding procedures was investigated using a similarity measure based on Principal Component Analysis. The measure determines which reference data are most similar to a given industrial procedure, and the appropriate LDA weights are then employed. Overall, the results show that Linear Discriminant Analysis gives effective and consistent performance in real-time welding fault detection.
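    A minimal sketch of the classification step, assuming feature vectors have already been computed from one-second blocks of welding data; the synthetic numbers stand in for real weld features, and scikit-learn's LDA replaces whatever implementation the authors used.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Illustrative stand-in data: rows = one-second blocks of electrical
# parameters (e.g. mean/std of current and voltage); 0 = good weld, 1 = fault.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0.0, 1.0, (200, 6)),   # reference good welds
               rng.normal(0.8, 1.2, (40, 6))])   # reference faulty welds
y = np.array([0] * 200 + [1] * 40)

lda = LinearDiscriminantAnalysis().fit(X, y)

# Classify a new one-second block in real time
new_block = rng.normal(0.8, 1.2, (1, 6))
print("fault probability:", lda.predict_proba(new_block)[0, 1])
```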

  18. Chemical analysis of raw and processed Fructus arctii by high-performance liquid chromatography/diode array detection-electrospray ionization-mass spectrometry

    OpenAIRE

    Kunming Qin; Qidi Liu; Hao Cai; Gang Cao; Tulin Lu; Baojia Shen; Yachun Shu; Baochang Cai

    2014-01-01

    Background: In traditional Chinese medicine (TCM), raw and processed herbs are used to treat the different diseases. Fructus Arctii, the dried fruits of Arctium lappa l. (Compositae), is widely used in the TCM. Stir-frying is the most common processing method, which might modify the chemical compositions in Fructus Arctii. Materials and Methods: To test this hypothesis, we focused on analysis and identification of the main chemical constituents in raw and processed Fructus Arctii (PFA) by hig...

  19. Lung Cancer Detection Using Image Processing Techniques

    Directory of Open Access Journals (Sweden)

    Mokhled S. AL-TARAWNEH

    2012-08-01

    Recently, image processing techniques have been widely used in several medical areas for image improvement in early detection and treatment stages, where the time factor is very important for discovering abnormalities in target images, especially for various cancer tumours such as lung cancer and breast cancer. Image quality and accuracy are the core factors of this research: image quality assessment and improvement depend on the enhancement stage, where low-level pre-processing techniques based on Gabor filters within Gaussian rules are used. Following segmentation principles, an enhanced region of the object of interest is obtained and used as the basic foundation of feature extraction. Relying on general features, a normality comparison is made. In this research, the main features used for accurate image comparison are pixel percentage and mask labelling.
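    A minimal sketch of the enhancement stage as described, assuming a small Gabor filter bank (Gaussian-windowed sinusoids) averaged over four orientations; all kernel parameters are illustrative.

```python
import cv2
import numpy as np

def gabor_enhance(gray):
    """Accumulate the responses of a four-orientation Gabor filter bank
    and rescale the result to the 8-bit range."""
    acc = np.zeros(gray.shape, dtype=np.float32)
    for theta in np.arange(0, np.pi, np.pi / 4):
        kern = cv2.getGaborKernel(ksize=(21, 21), sigma=4.0, theta=theta,
                                  lambd=10.0, gamma=0.5, psi=0)
        acc += cv2.filter2D(gray.astype(np.float32), cv2.CV_32F, kern)
    return cv2.normalize(acc, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
```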

  20. Faulty Sensor Detection and Reconstruction for a PVC Making Process

    Institute of Scientific and Technical Information of China (English)

    LI Yuan; ZHOU Donghua; XIE Zhi; S. Joe Qin

    2004-01-01

    Based on principal component analysis, this paper presents an application of faulty sensor detection and reconstruction in a batch process, the polyvinyl chloride (PVC) making process. To deal with inconsistency in the process data, it is proposed to first synchronize the historical data using the dynamic time warping technique, then build a consistent multi-way principal component analysis model. Fault detection is carried out based on the squared prediction error (SPE) statistical control plot. By defining the principal component subspace, the residual subspace, and a sensor validity index, a faulty sensor can be reconstructed and identified along the fault direction. Finally, application results are illustrated in detail using real data from an industrial PVC making process.
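    The SPE monitoring step can be sketched as follows: fit PCA on (synchronized) normal batch data, project new samples onto the residual subspace, and flag samples whose SPE exceeds a control limit. The empirical quantile limit here is an assumption; the paper may use an analytical form. Reconstruction along fault directions is omitted for brevity.

```python
import numpy as np

def build_pca_monitor(X_normal, n_pc=3, alpha=0.99):
    """Fit a PCA model on normal operation data and return an SPE
    (squared prediction error) monitor with an empirical control limit."""
    mu, sd = X_normal.mean(axis=0), X_normal.std(axis=0) + 1e-12
    _, _, Vt = np.linalg.svd((X_normal - mu) / sd, full_matrices=False)
    P = Vt[:n_pc].T                      # retained principal loadings

    def spe(X):
        Xs = (X - mu) / sd
        resid = Xs - Xs @ P @ P.T        # projection onto residual subspace
        return np.sum(resid ** 2, axis=1)

    limit = np.quantile(spe(X_normal), alpha)
    return spe, limit

# Usage: flag samples whose SPE exceeds the control limit
rng = np.random.default_rng(3)
spe, limit = build_pca_monitor(rng.normal(size=(500, 8)))
print("alarm rate on normal data:", np.mean(spe(rng.normal(size=(100, 8))) > limit))
```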

  1. Maintenance Process Strategic Analysis

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, M.; Stachowiak, A.

    2016-08-01

    The performance and competitiveness of manufacturing companies depend on the availability, reliability and productivity of their production facilities. Low productivity, downtime, and poor machine performance are often linked to inadequate plant maintenance, which in turn can lead to reduced production levels, increased costs, lost market opportunities, and lower profits. These pressures have motivated firms worldwide to explore and embrace proactive maintenance strategies over traditional reactive firefighting methods. The traditional view of maintenance has shifted to an overall view that encompasses Overall Equipment Efficiency, Stakeholder Management and Life Cycle Assessment. From a practical point of view, this requires changes in managers' approach to maintenance and in the actions performed within the maintenance area. Managers have to understand that maintenance is not only about repairing and conserving machines and devices, but also about actions striving for more efficient resource management and care for the safety and health of employees. The purpose of this work is to present a strategic analysis based on SWOT analysis, identifying the opportunities and strengths of the maintenance process so as to benefit from them as much as possible, as well as its weaknesses and threats, so that they can be eliminated or minimized.

  2. Amalgamation of Anomaly-Detection Indices for Enhanced Process Monitoring

    KAUST Repository

    Harrou, Fouzi

    2016-01-29

    Accurate and effective anomaly detection and diagnosis of modern industrial systems are crucial for ensuring reliability and safety and for maintaining desired product quality. Anomaly detection based on principal component analysis (PCA) has been studied intensively and applied widely to multivariate processes with highly cross-correlated process variables; however, conventional PCA-based methods often fail to detect small or moderate anomalies. In this paper, the proposed approach integrates two popular process-monitoring detection tools: the conventional PCA-based monitoring indices, Hotelling's T2 and Q, and the exponentially weighted moving average (EWMA). We develop two EWMA tools based on the Q and T2 statistics, T2-EWMA and Q-EWMA, to detect anomalies in the process mean. The performance of the proposed methods was compared with that of conventional PCA-based anomaly-detection methods by applying each method to two examples: a synthetic data set and experimental data collected from a flow heating system. The results clearly show the benefits and effectiveness of the proposed methods over conventional PCA-based methods.
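    The Q-EWMA/T2-EWMA idea can be sketched generically: compute a monitoring statistic series, smooth it with an EWMA, and flag excursions outside the control band. The smoothing constant, band width, and the use of the whole series to estimate the in-control mean and sigma are textbook-style assumptions, not the paper's exact design.

```python
import numpy as np

def ewma_monitor(stat, lam=0.2, L=3.0):
    """Apply an EWMA filter to a monitoring statistic series (e.g. the
    PCA Q or T2 values) and flag points outside mu0 +/- L * asymptotic
    EWMA sigma. In practice mu0 and sigma0 come from an in-control
    reference period; the whole series is used here for brevity."""
    stat = np.asarray(stat, float)
    mu0, sigma0 = stat.mean(), stat.std()
    z = np.empty_like(stat)
    z[0] = mu0
    for t in range(1, len(stat)):
        z[t] = lam * stat[t] + (1 - lam) * z[t - 1]
    width = L * sigma0 * np.sqrt(lam / (2 - lam))
    return z, (z < mu0 - width) | (z > mu0 + width)
```

    Because the EWMA accumulates evidence over time, small persistent shifts that a single-sample Q or T2 test misses eventually push the smoothed statistic across the band.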

  3. Chemometric processing of second-order liquid chromatographic data with UV-vis and fluorescence detection. A comparison of multivariate curve resolution and parallel factor analysis 2.

    Science.gov (United States)

    Bortolato, Santiago A; Olivieri, Alejandro C

    2014-09-01

    Second-order liquid chromatographic data with multivariate spectral (UV-vis or fluorescence) detection usually show changes in elution time profiles from sample to sample, causing a loss of trilinearity in the data. In order to analyze them with an appropriate model, the latter should permit a given component to have different time profiles in different samples. Two popular models in this regard are multivariate curve resolution-alternating least-squares (MCR-ALS) and parallel factor analysis 2 (PARAFAC2). The conditions to be fulfilled for successful application of the latter model are discussed on the basis of simple chromatographic concepts. An exhaustive analysis of the multivariate calibration models is carried out, employing both simulated and experimental chromatographic data sets. The latter involve the quantitation of benzimidazolic and carbamate pesticides in fruit and juice samples using liquid chromatography with diode array detection, and of polycyclic aromatic hydrocarbons in water samples, in both cases in the presence of potential interferents using liquid chromatography with fluorescence spectral detection, thereby achieving the second-order advantage. The overall results seem to favor MCR-ALS over PARAFAC2, especially in the presence of potential interferents.

  4. Digital Signal Processing Based Real Time Vehicular Detection System

    Institute of Scientific and Technical Information of China (English)

    YANG Zhaoxuan; LIN Tao; LI Xiangping; LIU Chunyi; GAO Jian

    2005-01-01

    Traffic monitoring is of major importance for enforcing traffic management policies. To accomplish this task, vehicle detection can be achieved by exploiting image analysis techniques. In this paper, a solution is presented to obtain various traffic parameters through a vehicular video detection system (VVDS). VVDS exploits an algorithm based on virtual loops to detect moving vehicles in real time. This algorithm uses the background differencing method: vehicles can be detected through the luminance difference of pixels between the background image and the current image. Furthermore, a novel technique, spatio-temporal image sequence analysis, is applied to background differencing to improve detection accuracy. A hardware implementation on a digital signal processing (DSP) based board is then described in detail; the board can simultaneously process four channels of video from different cameras. The benefit of using a DSP is that roadway images can be processed at frame rate thanks to its high performance. Finally, VVDS is tested on real-world scenes, and the experimental results show that the system is both fast and robust for transportation surveillance.
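    The virtual-loop detection step can be sketched as follows, assuming a maintained background image: a loop region is declared occupied when a sufficient fraction of its pixels departs from the background luminance. Threshold values are illustrative.

```python
import numpy as np

def loop_occupied(frame, background, loop, thresh=25, frac=0.2):
    """Virtual-loop vehicle detection by background differencing: a loop
    (y0, y1, x0, x1) is 'occupied' when enough of its pixels differ from
    the background by more than a luminance threshold."""
    y0, y1, x0, x1 = loop
    diff = np.abs(frame[y0:y1, x0:x1].astype(int) -
                  background[y0:y1, x0:x1].astype(int))
    return bool(np.mean(diff > thresh) > frac)
```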

  5. Automatic detection of NIL defects using microscopy and image processing

    KAUST Repository

    Pietroy, David

    2013-12-01

    Nanoimprint Lithography (NIL) is a promising technology for low-cost and large-scale nanostructure fabrication. The technique is based on a contact molding-demolding process that can produce a number of defects such as incomplete filling, negative patterns, and sticking. In this paper, microscopic imaging combined with a specific processing algorithm is used to numerically detect defects in printed patterns. Results obtained for 1D and 2D imprinted gratings with different microscopic image magnifications are presented. The results are independent of the device that captures the image (optical, confocal or electron microscope). The use of numerical images makes it possible to automate the detection and to compute a statistical analysis of defects. This method provides a fast analysis of printed gratings and could be used to monitor the production of such structures. © 2013 Elsevier B.V. All rights reserved.

  6. Cancer detection by quantitative fluorescence image analysis.

    Science.gov (United States)

    Parry, W L; Hemstreet, G P

    1988-02-01

    Quantitative fluorescence image analysis is a rapidly evolving biophysical cytochemical technology with the potential for multiple clinical and basic research applications. We report the application of this technique for bladder cancer detection and discuss its potential usefulness as an adjunct to methods used currently by urologists for the diagnosis and management of bladder cancer. Quantitative fluorescence image analysis is a cytological method that incorporates 2 diagnostic techniques, quantitation of nuclear deoxyribonucleic acid and morphometric analysis, in a single semiautomated system to facilitate the identification of rare events, that is, individual cancer cells. When compared to routine cytopathology for detection of bladder cancer in symptomatic patients, quantitative fluorescence image analysis demonstrated greater sensitivity (76 versus 33 per cent) for the detection of low grade transitional cell carcinoma. The specificity of quantitative fluorescence image analysis in a small control group was 94 per cent, and with the manual method for quantitation of absolute nuclear fluorescence intensity in the screening of high risk asymptomatic subjects the specificity was 96.7 per cent. The more familiar flow cytometry is another fluorescence technique for measurement of nuclear deoxyribonucleic acid. However, rather than identifying individual cancer cells, flow cytometry identifies cellular pattern distributions, that is, the ratio of normal to abnormal cells. Numerous studies by others have shown that flow cytometry is a sensitive method to monitor patients with diagnosed urological disease. Based upon results in separate quantitative fluorescence image analysis and flow cytometry studies, it appears that these 2 fluorescence techniques may be complementary tools for urological screening, diagnosis and management, and that they also may be useful separately or in combination to elucidate the oncogenic process, determine the biological potential of tumors

  7. Acoustic analysis assessment in speech pathology detection

    Directory of Open Access Journals (Sweden)

    Panek Daria

    2015-09-01

    Automatic detection of voice pathologies enables non-invasive, low-cost and objective assessment of the presence of disorders, as well as accelerating and improving the process of diagnosis and the clinical treatment given to patients. In this work, a vector made up of 28 acoustic parameters is evaluated using principal component analysis (PCA), kernel principal component analysis (kPCA) and an auto-associative neural network (NLPCA) in four kinds of pathology detection (hyperfunctional dysphonia, functional dysphonia, laryngitis, and vocal cord paralysis) using the a, i and u vowels spoken at a high, low and normal pitch. The results indicate that the kPCA and NLPCA methods can be considered a step towards pathology detection of the vocal folds. The results show that such an approach provides acceptable results for this purpose, with the best efficiency levels of around 100%. The study brings the most commonly used approaches to speech signal processing together and leads to a comparison of the machine learning methods determining the health status of the patient.
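
    As a rough illustration of one of the evaluated pipelines, the scikit-learn sketch below applies kernel PCA to a hypothetical 28-parameter acoustic feature matrix and feeds the scores to a support vector classifier, a stand-in for the paper's detectors. The synthetic data, kernel choice, and component count are assumptions for demonstration only.

        import numpy as np
        from sklearn.decomposition import KernelPCA
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 28))      # 28 acoustic parameters per sample
        y = rng.integers(0, 2, size=200)    # healthy vs. pathological (synthetic)

        # Kernel PCA feature extraction followed by a simple classifier
        model = make_pipeline(KernelPCA(n_components=10, kernel="rbf", gamma=0.05),
                              SVC())
        model.fit(X, y)
        print("training accuracy:", model.score(X, y))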

  8. Lameness detection in dairy cattle: single predictor v. multivariate analysis of image-based posture processing and behaviour and performance sensing.

    Science.gov (United States)

    Van Hertem, T; Bahr, C; Schlageter Tello, A; Viazzi, S; Steensels, M; Romanini, C E B; Lokhorst, C; Maltz, E; Halachmi, I; Berckmans, D

    2016-09-01

    The objective of this study was to evaluate whether a multi-sensor system (milk, activity, body posture) was a better classifier for lameness than single-sensor-based detection models. Between September 2013 and August 2014, 3629 cow observations were collected on a commercial dairy farm in Belgium. Human locomotion scoring was used as the reference for model development and evaluation. Cow behaviour and performance were measured with existing sensors that were already present at the farm. A prototype three-dimensional video recording system was used to automatically quantify the back posture of a cow. For the single-predictor comparisons, a receiver operating characteristics curve was made. For the multivariate detection models, logistic regression and generalized linear mixed models (GLMM) were developed. The best lameness classification model was obtained by the multi-sensor analysis (area under the receiver operating characteristics curve (AUC)=0.757±0.029), containing a combination of milk and milking variables, activity, and gait and posture variables from videos. Second, the multivariate video-based system (AUC=0.732±0.011) performed better than the multivariate milk sensors (AUC=0.604±0.026) and the multivariate behaviour sensors (AUC=0.633±0.018). The video-based system also performed better than the combined behaviour and performance-based detection model (AUC=0.669±0.028), indicating that it is worthwhile to consider a video-based lameness detection system regardless of the presence of other existing sensors on the farm. The results suggest that Θ2, the feature variable for the back curvature around the hip joints, with an AUC of 0.719, is the best single predictor variable for lameness detection based on locomotion scoring. In general, this study showed that the video-based back posture monitoring system outperforms the behaviour and performance sensing techniques for locomotion scoring-based lameness detection. A GLMM with seven specific
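
    The single-predictor versus multivariate comparison can be reproduced in outline with scikit-learn, as sketched below on synthetic data: the AUC of one feature used directly versus the AUC of a logistic regression over several features. All variable names and numbers here are hypothetical, not the study's data.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(1)
        theta2 = rng.normal(size=300)          # back-curvature feature (synthetic)
        sensors = rng.normal(size=(300, 5))    # milk/activity features (synthetic)
        lame = (theta2 + 0.5 * sensors[:, 0]
                + rng.normal(size=300) > 1.0).astype(int)

        # Single predictor: use the raw feature as the ranking score
        print("single-predictor AUC:", roc_auc_score(lame, theta2))
        # Multivariate: combine features through a logistic regression
        X = np.column_stack([theta2, sensors])
        p = LogisticRegression().fit(X, lame).predict_proba(X)[:, 1]
        print("multivariate AUC:", roc_auc_score(lame, p))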

  10. Analysis of Cocoa Proanthocyanidins Using Reversed Phase High-Performance Liquid Chromatography and Electrochemical Detection: Application to Studies on the Effect of Alkaline Processing.

    Science.gov (United States)

    Stanley, Todd H; Smithson, Andrew T; Neilson, Andrew P; Anantheswaran, Ramaswamy C; Lambert, Joshua D

    2015-07-01

    Flavan-3-ols and proanthocyanidins play a key role in the health-beneficial effects of cocoa. Here, we developed a new reversed-phase high-performance liquid chromatography-electrochemical detection (HPLC-ECD) method for the analysis of flavan-3-ols and proanthocyanidins of degree of polymerization (DP) 2-7. We used this method to examine the effect of alkalization on the polyphenol composition of cocoa powder. Treatment of cocoa powder with NaOH (final pH 8.0) at 92 °C for up to 1 h increased catechin content by 40%, but reduced epicatechin and proanthocyanidins by 23-66%. Proanthocyanidin loss could be modeled using a two-phase exponential decay model (R² > 0.7 for epicatechin and proanthocyanidins of odd DP). Alkalization resulted in a significant color change and a 20% loss of total polyphenols. The present work demonstrates the first use of HPLC-ECD for the detection of proanthocyanidins up to DP 7 and provides an initial predictive model for the effect of alkali treatment on cocoa polyphenols.
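
    The two-phase exponential decay model mentioned above has the form c(t) = a·exp(-k1·t) + b·exp(-k2·t); a scipy curve-fitting sketch on made-up retention data is shown below. The time points, concentrations, and starting parameters are invented for illustration and are not the paper's data.

        import numpy as np
        from scipy.optimize import curve_fit

        def two_phase_decay(t, a, k1, b, k2):
            # Fast (k1) and slow (k2) first-order loss terms
            return a * np.exp(-k1 * t) + b * np.exp(-k2 * t)

        t = np.array([0, 5, 10, 20, 30, 45, 60], dtype=float)      # min of treatment
        c = np.array([100, 70, 55, 42, 36, 31, 29], dtype=float)   # % remaining
        popt, _ = curve_fit(two_phase_decay, t, c, p0=[60, 0.2, 40, 0.01])
        print(dict(zip(["a", "k1", "b", "k2"], popt.round(3))))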

  11. Dynamic analysis of process reactors

    Energy Technology Data Exchange (ETDEWEB)

    Shadle, L.J.; Lawson, L.O.; Noel, S.D.

    1995-06-01

    The approach and methodology of conducting a dynamic analysis is presented in this poster session in order to describe how this type of analysis can be used to evaluate the operation and control of process reactors. Dynamic analysis of the PyGas™ gasification process is used to illustrate the utility of this approach. PyGas™ is the gasifier being developed for the Gasification Product Improvement Facility (GPIF) by Jacobs-Siffine Engineering and Riley Stoker. In the first step of the analysis, process models are used to calculate the steady-state conditions and associated sensitivities for the process. For the PyGas™ gasifier, the process models are non-linear mechanistic models of the jetting fluidized-bed pyrolyzer and the fixed-bed gasifier. These process sensitivities are key input, in the form of gain parameters or transfer functions, to the dynamic engineering models.

  12. Entanglement of identical particles and the detection process

    DEFF Research Database (Denmark)

    Tichy, Malte C.; de Melo, Fernando; Kus, Marek;

    2013-01-01

    We introduce detector-level entanglement, a unified entanglement concept for identical particles that takes into account the possible deletion of many-particle which-way information through the detection process. The concept implies a measure for the effective indistinguishability of the particles, which is controlled by the measurement setup and which quantifies the extent to which the (anti-)symmetrization of the wavefunction impacts on physical observables. Initially indistinguishable particles can gain or lose entanglement on their transition to distinguishability, and their quantum statistical behavior depends on their initial entanglement. Our results show that entanglement cannot be attributed to a state of identical particles alone, but that the detection process has to be incorporated in the analysis.

  13. Nonlinear Statistical Process Monitoring and Fault Detection Using Kernel ICA

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xi; YAN Wei-wu; ZHAO Xu; SHAO Hui-he

    2007-01-01

    A novel nonlinear process monitoring and fault detection method based on kernel independent component analysis (ICA) is proposed. The kernel ICA method is a two-phase algorithm: whitened kernel principal component analysis (KPCA) plus ICA. KPCA spheres the data and makes the data structure as linearly separable as possible by virtue of an implicit nonlinear mapping determined by the kernel. ICA then seeks projection directions in the KPCA-whitened space that make the distribution of the projected data as non-Gaussian as possible. Application to a simulated fluid catalytic cracking unit (FCCU) process indicates that the proposed monitoring method based on kernel ICA can effectively capture the nonlinear relationships among process variables, and that it significantly outperforms monitoring methods based on ICA or KPCA alone.
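
    A compact two-phase sketch of this kernel-ICA idea using scikit-learn is shown below: kernel PCA scores are computed and scaled to unit variance, then FastICA searches for maximally non-Gaussian directions in that space. The kernel, dimensions, and stand-in data are placeholders, not the authors' settings.

        import numpy as np
        from sklearn.decomposition import FastICA, KernelPCA

        rng = np.random.default_rng(2)
        X = rng.normal(size=(500, 8))    # stand-in process data

        # Phase 1: kernel PCA, then scale scores to unit variance (whitening)
        scores = KernelPCA(n_components=5, kernel="rbf", gamma=0.1).fit_transform(X)
        scores /= scores.std(axis=0)

        # Phase 2: ICA seeks non-Gaussian projection directions in that space
        S = FastICA(n_components=5, random_state=0).fit_transform(scores)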

  14. Chemical detection, identification, and analysis system

    International Nuclear Information System (INIS)

    The chemical detection, identification, and analysis system (CDIAS) has three major goals. The first is to display safety information regarding the chemical environment before personnel entry. The second is to archive personnel exposure to the environment. Third, the system assists users in identifying the stage of a chemical process in progress and suggests safety precautions associated with that process. In addition to these major goals, the system must be sufficiently compact to be transportable, and it must be extremely simple to use in order to keep user interaction to a minimum. The system created to meet these goals includes several pieces of hardware and the integration of four software packages. The hardware consists of a low-oxygen, carbon monoxide, explosives, and hydrogen sulfide detector; an ion mobility spectrometer for airborne vapor detection; and a COMPAQ 386/20 portable computer. The software modules are a graphics kernel, an expert system shell, a data-base management system, and an interface management system. A supervisory module developed using the interface management system coordinates the interaction of the other software components. The system determines the safety of the environment using conventional data acquisition and analysis techniques. The low-oxygen, carbon monoxide, hydrogen sulfide, explosives, and vapor detectors are monitored for hazardous levels, and warnings are issued accordingly.

  15. Signal processing techniques for atrial fibrillation source detection.

    Science.gov (United States)

    Ambadkar, Minal; Leonelli, Fabio M; Sankar, Ravi

    2014-01-01

    In clinical practice, atrial fibrillation (AF) is the most common and critical cardiac arrhythmia encountered. The treatment that can ensure permanent AF removal is catheter ablation, in which cardiologists destroy the affected cardiac muscle cells with RF or laser energy. In this procedure it is necessary to know exactly from which part of the heart the AF triggers originate. Various signal processing algorithms provide a strong tool to track AF sources. This study proposes signal processing techniques that can be exploited for characterization, analysis and source detection of AF signals. These algorithms are implemented on electrocardiogram (ECG) and intracardiac signals, which contain important information that allows the analysis of anatomic and physiologic aspects of the whole cardiac muscle.

  16. Semantic multimedia analysis and processing

    CERN Document Server

    Spyrou, Evaggelos; Mylonas, Phivos

    2014-01-01

    Broad in scope, Semantic Multimedia Analysis and Processing provides a complete reference of techniques, algorithms, and solutions for the design and the implementation of contemporary multimedia systems. Offering a balanced, global look at the latest advances in semantic indexing, retrieval, analysis, and processing of multimedia, the book features the contributions of renowned researchers from around the world. Its contents are based on four fundamental thematic pillars: 1) information and content retrieval, 2) semantic knowledge exploitation paradigms, 3) multimedia personalization, and 4)

  17. Big Data Analysis of Manufacturing Processes

    Science.gov (United States)

    Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert

    2015-11-01

    The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results.

  18. Fast Facial Detection by Depth Map Analysis

    OpenAIRE

    Ming-Yuan Shieh; Tsung-Min Hsieh

    2013-01-01

    In order to obtain correct facial recognition results, one needs to adopt appropriate facial detection techniques. Moreover, the effects of facial detection are usually affected by the environmental conditions such as background, illumination, and complexity of objectives. In this paper, the proposed facial detection scheme, which is based on depth map analysis, aims to improve the effectiveness of facial detection and recognition under different environmental illumination conditions. The pro...

  19. Life Detection System DTIVA for Monitoring Parameter in Fossilization Process

    Science.gov (United States)

    Gómez, F.; Garcia-Descalzo, L.; Cockell, C. S.; Schwendner, P.; Rettberg, P.; Beblo-Vranesevic, K.; Bohmeier, M.; Rabbow, E.; Westall, F.; Gaboyer, F.; Walter, N.; Moissl-Eichinger, M.; Perras, A.; Amils, R.; Malki, M.; Ehrenfreund, P.; Monaghan, E.; Marteinsson, V.; Vannier, P.

    2016-05-01

    Using the Life Detection System (LDS), we followed the physicochemical parameters of a growth culture during an induced fossilization/mineralization process, with the objective of detecting biomarkers. Biomarker studies are crucial for the search for life on Mars.

  20. Image processing and analysis software development

    International Nuclear Information System (INIS)

    The work presented in this project is aimed at developing a software package, 'IMAGE GALLERY', to investigate various image processing and analysis techniques. The work was divided into two parts, namely image processing techniques and pattern recognition, the latter comprising character and face recognition. Various image enhancement techniques, including negative imaging, contrast stretching, compression of dynamic range, neon, diffuse, emboss, etc., have been studied. Segmentation techniques, including point detection, line detection, and edge detection, have been studied, and some of the smoothing and sharpening filters have been investigated. All these imaging techniques have been implemented in a window-based computer program written in Visual Basic. Neural network techniques based on the perceptron model have been applied for face and character recognition. (author)
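
    Two of the enhancement operations named above, negative imaging and contrast stretching, reduce to one-line array transforms; a numpy sketch is given below, assuming 8-bit grayscale input (the original tool is in Visual Basic, so this is only an illustration of the operations, not the tool's code).

        import numpy as np

        def negative(img):
            # Negative imaging: invert the gray levels of an 8-bit image
            return 255 - img

        def contrast_stretch(img):
            # Linearly map the occupied gray-level range onto 0..255
            lo, hi = int(img.min()), int(img.max())
            return ((img - lo) * 255.0 / max(hi - lo, 1)).astype(np.uint8)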

  1. Traffic sign detection and analysis

    DEFF Research Database (Denmark)

    Møgelmose, Andreas; Trivedi, Mohan M.; Moeslund, Thomas B.

    2012-01-01

    Traffic sign recognition (TSR) is a research field that has seen much activity in the recent decade. This paper introduces the problem and presents 4 recent papers on traffic sign detection and 4 recent papers on traffic sign classification. It attempts to extract recent trends in the field...

  2. Method Validation for the Quantitative Analysis of Aflatoxins (B1, B2, G1, and G2) and Ochratoxin A in Processed Cereal-Based Foods by HPLC with Fluorescence Detection.

    Science.gov (United States)

    Gazioğlu, Işil; Kolak, Ufuk

    2015-01-01

    Modified AOAC 991.31 and AOAC 2000.03 methods for the simultaneous determination of total aflatoxins (AFs), aflatoxin B1, and ochratoxin A (OTA) in processed cereal-based foods by RP-HPLC coupled with fluorescence detection were validated. A KOBRA® Cell derivatization system was used to analyze total AFs. One of the modifications was the extraction procedure of mycotoxins. Both AFs and OTA were extracted with methanol-water (75+25, v/v) and purified with an immunoaffinity column before HPLC analysis. The modified methods were validated by measuring the specificity, selectivity, linearity, sensitivity, accuracy, repeatability, reproducibility, recovery, LOD, and LOQ parameters. The validated methods were successfully applied for the simultaneous determination of mycotoxins in 81 processed cereal-based foods purchased in Turkey. These rapid, sensitive, simple, and validated methods are suitable for the simultaneous determination of AFs and OTA in the processed cereal-based foods.

  4. SNIa detection in the SNLS photometric analysis using Morphological Component Analysis

    CERN Document Server

    Möller, A; Lanusse, F; Neveu, J; Palanque-Delabrouille, N

    2015-01-01

    Detection of supernovae (SNe) and, more generally, of transient events in large surveys can produce numerous false detections. In the case of deferred processing of survey images, this implies reconstructing complete light curves for all detections, requiring sizable processing time and resources. Optimizing the detection of transient events is thus an important issue for both present and future surveys. We present here the optimization done in the SuperNova Legacy Survey (SNLS) for the 5-year deferred photometric analysis. In this analysis, detections are derived from stacks of subtracted images, with one stack per lunation. The 3-year analysis provided 300,000 detections dominated by signals of bright objects that were not perfectly subtracted. We developed a subtracted-image stack treatment to reduce the number of non-SN-like events using morphological component analysis. This technique exploits the morphological diversity of the objects to be detected to extract the signal of interest. At the level of o...

  5. Altered fingerprints: analysis and detection.

    Science.gov (United States)

    Yoon, Soweon; Feng, Jianjiang; Jain, Anil K

    2012-03-01

    The widespread deployment of Automated Fingerprint Identification Systems (AFIS) in law enforcement and border control applications has heightened the need for ensuring that these systems are not compromised. While several issues related to fingerprint system security have been investigated, including the use of fake fingerprints for masquerading identity, the problem of fingerprint alteration or obfuscation has received very little attention. Fingerprint obfuscation refers to the deliberate alteration of the fingerprint pattern by an individual for the purpose of masking his identity. Several cases of fingerprint obfuscation have been reported in the press. Fingerprint image quality assessment software (e.g., NFIQ) cannot always detect altered fingerprints since the implicit image quality due to alteration may not change significantly. The main contributions of this paper are: 1) compiling case studies of incidents where individuals were found to have altered their fingerprints for circumventing AFIS, 2) investigating the impact of fingerprint alteration on the accuracy of a commercial fingerprint matcher, 3) classifying the alterations into three major categories and suggesting possible countermeasures, 4) developing a technique to automatically detect altered fingerprints based on analyzing orientation field and minutiae distribution, and 5) evaluating the proposed technique and the NFIQ algorithm on a large database of altered fingerprints provided by a law enforcement agency. Experimental results show the feasibility of the proposed approach in detecting altered fingerprints and highlight the need to further pursue this problem. PMID:21808092

  6. Processing Ocean Images to Detect Large Drift Nets

    Science.gov (United States)

    Veenstra, Tim

    2009-01-01

    A computer program processes the digitized outputs of a set of downward-looking video cameras aboard an aircraft flying over the ocean. The purpose served by this software is to facilitate the detection of large drift nets that have been lost, abandoned, or jettisoned. The development of this software and of the associated imaging hardware is part of a larger effort to develop means of detecting and removing large drift nets before they cause further environmental damage to the ocean and to shores on which they sometimes impinge. The software is capable of near-real-time processing of as many as three video feeds at a rate of 30 frames per second. After a user sets the parameters of an adjustable algorithm, the software analyzes each video stream, detects any anomaly, issues a command to point a high-resolution camera toward the location of the anomaly, and, once the camera has been so aimed, issues a command to trigger the camera shutter. The resulting high-resolution image is digitized, and the resulting data are automatically uploaded to the operator's computer for analysis.

  7. Advanced Signal Processing for Thermal Flaw Detection; TOPICAL

    International Nuclear Information System (INIS)

    Dynamic thermography is a promising technology for inspecting metallic and composite structures used in high-consequence industries. However, the reliability and inspection sensitivity of this technology have historically been limited by the need for extensive operator experience and the use of human judgment and visual acuity to detect flaws in the large volume of infrared image data collected. To overcome these limitations, new automated data analysis algorithms and software are needed. The primary objectives of this research effort were to develop a data processing methodology that is tied to the underlying physics, which reduces or removes the data interpretation requirements, and which eliminates the need to look at significant numbers of data frames to determine if a flaw is present. Considering the strengths and weaknesses of previous research efforts, this research elected to couple both the temporal and spatial attributes of the surface temperature. Of the possible algorithms investigated, the best performing was a radiance-weighted root-mean-square Laplacian metric that included a multiplicative surface-effect correction factor and a novel spatio-temporal parametric model for data smoothing. This metric demonstrated the potential for detecting flaws smaller than 0.075 inch in inspection areas on the order of one square foot. Included in this report is the development of a thermal imaging model, a weighted least squares thermal data smoothing algorithm, simulation and experimental flaw detection results, and an overview of the ATAC (Automated Thermal Analysis Code) software that was developed to analyze thermal inspection data
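
    The flavor of a radiance-weighted RMS Laplacian metric can be sketched in a few lines of scipy, as below: each thermal frame is Laplacian-filtered, weighted by its share of the total radiance, and reduced to a per-pixel RMS map. The specific weighting scheme and the omission of the surface correction and parametric smoothing steps are simplifying assumptions, not the report's algorithm.

        import numpy as np
        from scipy.ndimage import laplace

        def flaw_metric(frames, eps=1e-9):
            F = np.asarray(frames, dtype=float)             # (time, rows, cols)
            lap = np.stack([laplace(f) for f in F])         # Laplacian per frame
            w = F / (F.sum(axis=0, keepdims=True) + eps)    # radiance-based weights
            return np.sqrt(np.sum(w * lap ** 2, axis=0))    # per-pixel RMS map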

  8. Less is More: Data Processing with SVM for Intrusion Detection

    Institute of Scientific and Technical Information of China (English)

    XIAO Hai-jun; HONG Fan; WANG Ling

    2009-01-01

    To improve the detection rate and lower the false-positive rate of intrusion detection systems, dimensionality reduction is widely used. For this purpose, a data processing (DP) strategy with a support vector machine (SVM) was built. Unlike traditional approaches, which identify redundant data before purging the audit data using expert knowledge, or which use different subsets of the available 41 connection attributes to build a classifier, the proposed strategy first removes attributes whose correlation with another attribute exceeds a threshold, and then classifies two sequential samples as one class, removing either of the two samples, when their similarity exceeds a threshold. Performance experiments showed that the DP-and-SVM strategy is superior to other existing data reduction strategies (e.g., audit reduction, rule extraction, and feature selection), and that the detection model based on DP and SVM outperforms those based on data mining, soft computing, and hierarchical principal component analysis neural networks.
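
    The first reduction step, dropping attributes that are too strongly correlated with an attribute already kept, can be sketched as follows; the threshold value and the greedy left-to-right order are assumptions for illustration.

        import numpy as np

        def drop_correlated(X, r_max=0.95):
            # Keep an attribute only if its |correlation| with every
            # previously kept attribute stays at or below the threshold
            R = np.corrcoef(X, rowvar=False)
            keep = []
            for j in range(X.shape[1]):
                if all(abs(R[j, k]) <= r_max for k in keep):
                    keep.append(j)
            return X[:, keep], keep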

  9. Windshear detection radar signal processing studies

    Science.gov (United States)

    Baxa, Ernest G., Jr.

    1993-01-01

    This final report briefly summarizes research work at Clemson in the Radar Systems Laboratory under the NASA Langley Research Grant NAG-1-928 in support of the Antenna and Microwave Branch, Guidance and Control Division, program to develop airborne sensor technology for the detection of low altitude windshear. A bibliography of all publications generated by Clemson personnel is included. An appendix provides abstracts of all publications.

  10. Fourier analysis and stochastic processes

    CERN Document Server

    Brémaud, Pierre

    2014-01-01

    This work is unique as it provides a uniform treatment of the Fourier theories of functions (Fourier transforms and series, z-transforms), finite measures (characteristic functions, convergence in distribution), and stochastic processes (including ARMA series and point processes). It emphasises the links between these three themes. The chapter on the Fourier theory of point processes and signals structured by point processes is a novel addition to the literature on Fourier analysis of stochastic processes. It also connects the theory with recent lines of research such as biological spike signals and ultrawide-band communications. Although the treatment is mathematically rigorous, the convivial style makes the book accessible to a large audience. In particular, it will be interesting to anyone working in electrical engineering and communications, biology (point process signals) and econometrics (ARMA models). A careful review of the prerequisites (integration and probability theory in the appendix, Hilbert spa...

  11. Crack Detection by Digital Image Processing

    DEFF Research Database (Denmark)

    Lyngbye, Janus; Brincker, Rune

    It is described how digital image processing is used for measuring the length of fatigue cracks. The system is installed in a Personal Computer equipped with image processing hardware and performs automated measuring on plane metal specimens used in fatigue testing. Normally one cannot achieve a resolution better than that of the image processing equipment. To overcome this problem an extrapolation technique is used, resulting in a better resolution. The system was tested on a specimen loaded with different loads. The error σa was less than 0.031 mm, which is of the same size as human measuring...

  12. Crack Length Detection by Digital Image Processing

    DEFF Research Database (Denmark)

    Lyngbye, Janus; Brincker, Rune

    1990-01-01

    It is described how digital image processing is used for measuring the length of fatigue cracks. The system is installed in a Personal Computer equipped with image processing hardware and performs automated measuring on plane metal specimens used in fatigue testing. Normally one cannot achieve a resolution better than that of the image processing equipment. To overcome this problem an extrapolation technique is used, resulting in a better resolution. The system was tested on a specimen loaded with different loads. The error σa was less than 0.031 mm, which is of the same size as human measuring...

  13. Automatic Prosodic Break Detection and Feature Analysis

    Institute of Scientific and Technical Information of China (English)

    Chong-Jia Ni; Ai-Ying Zhang; Wen-Ju Liu; Bo Xu

    2012-01-01

    Automatic prosodic break detection and annotation are important for both speech understanding and natural speech synthesis. In this paper, we discuss automatic prosodic break detection and feature analysis. The contributions of the paper are twofold. First, we use a classifier combination method to detect Mandarin and English prosodic breaks using acoustic, lexical and syntactic evidence. Our proposed method achieves better performance on both the Mandarin prosodic annotation corpus (Annotated Speech Corpus of Chinese Discourse) and the English prosodic annotation corpus (Boston University Radio News Corpus) when compared with the baseline system and other researchers' experimental results. Second, we analyze the features used for prosodic break detection. The functions of different features, such as duration, pitch, energy, and intensity, are analyzed and compared for Mandarin and English prosodic break detection. Based on the feature analysis, we also verify some linguistic conclusions.

  14. Processing of Graphene combining Optical Detection and Scanning Probe Lithography

    Directory of Open Access Journals (Sweden)

    Zimmermann Sören

    2015-01-01

    This paper presents an experimental setup tailored for robotic processing of graphene with in-situ vision-based control. A robust graphene detection approach is presented, applying multiple image processing operations to the visual feedback provided by a high-resolution light microscope. Detected graphene flakes can be modified using a scanning-probe-based lithographic process that is directly linked to the in-situ optical images. The results of this process are discussed with respect to further application scenarios.

  15. GNSS Spoofing Detection Based on Signal Power Measurements: Statistical Analysis

    Directory of Open Access Journals (Sweden)

    V. Dehghanian

    2012-01-01

    A threat to GNSS receivers is posed by a spoofing transmitter that emulates authentic signals but with randomized code phase and Doppler values over a small range. Such spoofing signals can result in large navigational solution errors that are passed on to the unsuspecting user with potentially dire consequences. An effective spoofing detection technique is developed in this paper, based on signal power measurements, that can be readily applied to present consumer-grade GNSS receivers with minimal firmware changes. An extensive statistical analysis is carried out based on formulating a multihypothesis detection problem. Expressions are developed to devise a set of thresholds required for signal detection and identification. The detection processing methods developed are further manipulated to exploit incidental antenna motion arising from user interaction with a GNSS handheld receiver to further enhance the detection performance of the proposed algorithm. The statistical analysis supports the effectiveness of the proposed spoofing detection technique under various multipath conditions.
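
    In its simplest form, a power-based check flags received-power measurements that sit implausibly far above the expected authentic level; a toy thresholding sketch is given below. The nominal level, spread, and threshold factor are invented numbers, and the paper's multihypothesis formulation is considerably more elaborate than this.

        import numpy as np

        def spoof_alarm(power_dbm, auth_mean=-130.0, auth_sd=1.5, k=3.0):
            # Alarm when measured power exceeds the authentic mean by k sigmas
            return np.asarray(power_dbm) > auth_mean + k * auth_sd

        print(spoof_alarm([-131.2, -129.8, -121.5]))   # -> [False False  True]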

  16. Corn tassel detection based on image processing

    Science.gov (United States)

    Tang, Wenbing; Zhang, Yane; Zhang, Dongxing; Yang, Wei; Li, Minzan

    2012-01-01

    Machine vision has been widely applied in facility agriculture and plays an important role in obtaining environment information. This paper studies the application of image processing to recognize and locate corn tassels for a corn detasseling machine, providing automated guidance information for actual corn emasculation operations. According to the color characteristics of corn tassels, image processing techniques were applied to identify tassels in images under the HSI color space, image segmentation was applied to extract the tassel regions, and the tassel features were analyzed and extracted. First, a series of preprocessing procedures were performed. Then, an image segmentation algorithm based on the HSI color space was developed to extract corn tassels from the background, and a region growing method was proposed to recognize the tassels. The results show that this method can effectively extract corn tassel parts from the collected pictures and can be used to obtain tassel location information; this could provide a theoretical basis for an intelligent corn detasseling machine.
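
    A toy version of the hue-based segmentation step might look like the sketch below, which keeps pixels whose hue falls in an assumed yellow band, using HSV as a stand-in for the paper's HSI space; the band limits and saturation floor are illustrative and would need calibration on real field images.

        import colorsys
        import numpy as np

        def tassel_mask(rgb, h_lo=0.10, h_hi=0.20, s_min=0.25):
            # Convert an (H, W, 3) RGB image to HSV and threshold on hue
            r, g, b = (rgb[..., i] / 255.0 for i in range(3))
            h, s, v = np.vectorize(colorsys.rgb_to_hsv)(r, g, b)
            return (h >= h_lo) & (h <= h_hi) & (s >= s_min)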

  17. Certification-Based Process Analysis

    Science.gov (United States)

    Knight, Russell L.

    2013-01-01

    Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Process means any of the standard representations of a production flow; in this case, any individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one ensures that a product works as intended, and contains no flaws.

  18. Signal processing in cryogenic particle detection

    Energy Technology Data Exchange (ETDEWEB)

    Yuryev, Y.N. [Department of Physics and Astronomy, Seoul National University, Seoul (Korea, Republic of); Korea Research Institute of Standards and Science (KRISS), Daejeon (Korea, Republic of); Jang, Y.S. [Korea Research Institute of Standards and Science (KRISS), Daejeon (Korea, Republic of); Kim, S.K. [Department of Physics and Astronomy, Seoul National University, Seoul (Korea, Republic of); Lee, K.B.; Lee, M.K. [Korea Research Institute of Standards and Science (KRISS), Daejeon (Korea, Republic of); Lee, S.J. [Department of Physics and Astronomy, Seoul National University, Seoul (Korea, Republic of); Korea Research Institute of Standards and Science (KRISS), Daejeon (Korea, Republic of); Yoon, W.S. [Korea Research Institute of Standards and Science (KRISS), Daejeon (Korea, Republic of); Kim, Y.H., E-mail: yhkim@kriss.re.k [Korea Research Institute of Standards and Science (KRISS), Daejeon (Korea, Republic of)

    2011-04-11

    We describe a signal-processing program for a data acquisition system for cryogenic particle detectors. The program is based on an optimal-filtering method for high-resolution measurement of calorimetric signals with a significant amount of noise of unknown origin and non-stationary behavior. The program was applied to improve the energy resolution of the alpha particle spectrum of an ²⁴¹Am source.
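
    The core of an optimal (matched) filter for pulse amplitude estimation fits in a few lines, as sketched below: frequency bins are weighted by the template over the noise power spectral density. The one-dimensional record layout and the absence of windowing or non-stationarity handling are simplifying assumptions relative to the program described above.

        import numpy as np

        def optimal_amplitude(pulse, template, noise_psd):
            # pulse, template: 1-D records; noise_psd: noise PSD at the rfft bins
            P = np.fft.rfft(pulse)
            T = np.fft.rfft(template)
            w = np.conj(T) / noise_psd                    # per-bin weights
            return float(np.real(np.sum(w * P) / np.sum(w * T)))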

  19. Processing of GPR data from NIITEK landmine detection system

    Science.gov (United States)

    Legarsky, Justin J.; Broach, J. T.; Bishop, Steven S.

    2003-09-01

    In this paper, a signal processing approach for wide-bandwidth ground-penetrating-radar imagery from the Non-Intrusive Inspection Technology, Incorporated (NIITEK) vehicle-mounted landmine detection sensor is investigated. The approach consists of a sequence of processing steps, which include signal filtering, image enhancement and detection. Filtering strategies applied before detection aid image visualization by reducing ground bounce, systematic effects and redundant signals. Post-filter image processing helps by enhancing landmine signatures in the NIITEK radar imagery. Study results from applying this signal processing approach are presented for test minefield lane data, which were collected during 2002 from an Army test site.
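
    One common form of the ground-bounce reduction mentioned above is mean-trace subtraction across a B-scan, sketched below; whether the NIITEK processing chain uses this particular filter is an assumption made for illustration.

        import numpy as np

        def remove_ground_bounce(bscan):
            # bscan: (depth_samples, scan_positions) array; subtracting the
            # mean A-scan cancels reflections common to every position,
            # such as the air-ground interface
            return bscan - bscan.mean(axis=1, keepdims=True)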

  20. Detecting causality in policy diffusion processes

    Science.gov (United States)

    Grabow, Carsten; Macinko, James; Silver, Diana; Porfiri, Maurizio

    2016-08-01

    A universal question in network science entails learning about the topology of interaction from collective dynamics. Here, we address this question by examining diffusion of laws across US states. We propose two complementary techniques to unravel determinants of this diffusion process: information-theoretic union transfer entropy and event synchronization. In order to systematically investigate their performance on law activity data, we establish a new stochastic model to generate synthetic law activity data based on plausible networks of interactions. Through extensive parametric studies, we demonstrate the ability of these methods to reconstruct networks, varying in size, link density, and degree heterogeneity. Our results suggest that union transfer entropy should be preferred for slowly varying processes, which may be associated with policies attending to specific local problems that occur only rarely or with policies facing high levels of opposition. In contrast, event synchronization is effective for faster enactment rates, which may be related to policies involving Federal mandates or incentives. This study puts forward a data-driven toolbox to explain the determinants of legal activity applicable to political science, across dynamical systems, information theory, and complex networks.
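
    Event synchronization, the second technique named above, essentially counts near-coincident events between two activity series; a minimal sketch in that spirit is below, with the fixed coincidence window tau and the symmetric normalization as assumptions rather than the paper's exact definition.

        import numpy as np

        def event_sync(t_a, t_b, tau=1.0):
            # Count events in one series followed within tau by an event
            # in the other, in both directions, then normalize
            t_a, t_b = np.asarray(t_a, float), np.asarray(t_b, float)
            c_ab = sum(np.any((t_b > ta) & (t_b <= ta + tau)) for ta in t_a)
            c_ba = sum(np.any((t_a > tb) & (t_a <= tb + tau)) for tb in t_b)
            return (c_ab + c_ba) / np.sqrt(len(t_a) * len(t_b))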

  1. Measurement Bias Detection through Factor Analysis

    Science.gov (United States)

    Barendse, M. T.; Oort, F. J.; Werner, C. S.; Ligtvoet, R.; Schermelleh-Engel, K.

    2012-01-01

    Measurement bias is defined as a violation of measurement invariance, which can be investigated through multigroup factor analysis (MGFA), by testing across-group differences in intercepts (uniform bias) and factor loadings (nonuniform bias). Restricted factor analysis (RFA) can also be used to detect measurement bias. To also enable nonuniform…

  2. Analysis of weld seam uniformity through temperature distribution by spatially resolved detector elements in the wavelength range of 0.3μm to 5μm for the detection of structural changing heating and cooling processes

    Science.gov (United States)

    Lempe, B.; Maschke, R.; Rudek, F.; Baselt, T.; Hartmann, P.

    2016-03-01

    Online process control systems often detect temperatures only in a local area of the machining point and determine an integrated value. In order to determine the proper welding quality and the absence of defects, such as temperature-induced stress cracks, it is necessary to perform time- and space-resolved measurements before, during and after the production process. The system under development consists of a beam-splitting unit which divides the electromagnetic radiation of the heated component between two different sensor types. For high temperatures, a sensor is used which is sensitive in the visible spectrum and has a dynamic range of 120 dB [1]. Thus, very high intensity differences can be displayed and a direct analysis of the temperature profile of the weld spots is possible [2]. A second sensor operates in the wavelength range from 1 micron to 5 microns and allows the determination of temperatures from approximately 200 °C [3]. At the beginning of a welding process, the heat-up phase of the metal is critical to the resulting weld quality: if a defined temperature range is traversed too quickly, the risk of cracking is significantly increased [4]. During the welding process, the thermal supervision of the central processing location is decisive for a reliably sound weld. In the border areas, as well as following the welding process, the cooling processes in particular are crucial for the homogeneity of the results. In order to resolve the dynamic heating and cooling processes with sufficient accuracy, the system can capture up to 500 frames per second.

  3. Social network analysis community detection and evolution

    CERN Document Server

    Missaoui, Rokia

    2015-01-01

    This book is devoted to recent progress in social network analysis with a high focus on community detection and evolution. The eleven chapters cover the identification of cohesive groups, core components and key players either in static or dynamic networks of different kinds and levels of heterogeneity. Other important topics in social network analysis such as influential detection and maximization, information propagation, user behavior analysis, as well as network modeling and visualization are also presented. Many studies are validated through real social networks such as Twitter. This edit

  4. Command Process Modeling & Risk Analysis

    Science.gov (United States)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes, and it is important to understand the relative significance of each of these causes when making institutional investment decisions. One such cause is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command process. These models include simulation analysis and probabilistic risk assessment models.

  5. Preliminary hazards analysis -- vitrification process

    International Nuclear Information System (INIS)

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment

  6. Preliminary hazards analysis -- vitrification process

    Energy Technology Data Exchange (ETDEWEB)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  7. Risk analysis: opening the process

    International Nuclear Information System (INIS)

    This conference on risk analysis took place in Paris, 11-14 October 1999. Over 200 papers were presented in the following seven sessions: perception; environment and health; persuasive risks; objects and products; personal and collective involvement; assessment and valuation; and management. A rational approach to risk analysis has been developed over the last three decades. Techniques for risk assessment have been thoroughly enhanced, risk management approaches have been developed, decision making processes have been clarified, and the social dimensions of risk perception and management have been investigated. Nevertheless, this construction is being challenged by recent events which reveal how deficits in stakeholder involvement, openness and democratic procedures can undermine risk management actions. Indeed, the global process and most components of risk analysis may be radically called into question. Food safety has lately been a prominent issue, but new debates now appear, or old debates are revisited, in the domains of public health, consumer product safety, waste management, environmental risks, nuclear installations, automobile safety and pollution. To meet the growing pressures for efficiency, openness, accountability, and multi-partner communication in risk analysis, institutional changes are underway in many European countries. However, the need for stakeholders to develop better insight into the process may lead to an evolution of all the components of risk analysis, even in its most 'technical' steps. For stakeholders of different professional backgrounds, political projects, and responsibilities, risk identification procedures must be rendered understandable, quantitative risk assessment must be intelligible and accommodated in action proposals, ranging from countermeasures to educational programs to insurance mechanisms. Management formats must be open to local and political input and other types of operational feedback. (authors)

  8. A visual analysis of the process of process modeling

    OpenAIRE

    Claes, J Jan; Vanderfeesten, ITP Irene; Pinggera, J.; Reijers, HA Hajo; Weber, B.; Poels, G

    2015-01-01

    The construction of business process models has become an important requisite in the analysis and optimization of processes. The success of the analysis and optimization efforts heavily depends on the quality of the models. Therefore, a research domain emerged that studies the process of process modeling. This paper contributes to this research by presenting a way of visualizing the different steps a modeler undertakes to construct a process model, in a so-called process of process modeling C...

  9. Instrumenting the Intelligence Analysis Process

    Energy Technology Data Exchange (ETDEWEB)

    Hampson, Ernest; Cowley, Paula J.

    2005-05-02

    The Advanced Research and Development Activity initiated the Novel Intelligence from Massive Data (NIMD) program to develop advanced analytic technologies and methodologies. In order to support this objective, researchers and developers need to understand what analysts do and how they do it. In the past, this knowledge generally was acquired through subjective feedback from analysts. NIMD established the innovative Glass Box Analysis (GBA) Project to instrument a live intelligence mission and unobtrusively capture and objectively study the analysis process. Instrumenting the analysis process requires tailor-made software hooks that grab data from a myriad of disparate application operations and feed it into a complex relational database and hierarchical file store to collect, store, retrieve, and distribute analytic data in a manner that maximizes researchers' understanding. A key to success is determining the correct data to collect and aggregating low-level data into meaningful analytic events. This paper will examine how the GBA team solved some of these challenges, continues to address others, and supports a growing user community in establishing their own GBA environments and/or studying the data generated by GBA analysts working in the Glass Box.

  10. Signal processing techniques for sodium boiling noise detection

    International Nuclear Information System (INIS)

    At the Specialists' Meeting on Sodium Boiling Detection organized by the International Working Group on Fast Reactors (IWGFR) of the International Atomic Energy Agency at Chester in the United Kingdom in 1981 various methods of detecting sodium boiling were reported. But, it was not possible to make a comparative assessment of these methods because the signal condition in each experiment was different from others. That is why participants of this meeting recommended that a benchmark test should be carried out in order to evaluate and compare signal processing methods for boiling detection. Organization of the Co-ordinated Research Programme (CRP) on signal processing techniques for sodium boiling noise detection was also recommended at the 16th meeting of the IWGFR. The CRP on Signal Processing Techniques for Sodium Boiling Noise Detection was set up in 1984. Eight laboratories from six countries have agreed to participate in this CRP. The overall objective of the programme was the development of reliable on-line signal processing techniques which could be used for the detection of sodium boiling in an LMFBR core. During the first stage of the programme a number of existing processing techniques used by different countries have been compared and evaluated. In the course of further work, an algorithm for implementation of this sodium boiling detection system in the nuclear reactor will be developed. It was also considered that the acoustic signal processing techniques developed for boiling detection could well make a useful contribution to other acoustic applications in the reactor. This publication consists of two parts. Part I is the final report of the co-ordinated research programme on signal processing techniques for sodium boiling noise detection. Part II contains two introductory papers and 20 papers presented at four research co-ordination meetings since 1985. A separate abstract was prepared for each of these 22 papers. Refs, figs and tabs

  11. Fault Management: Degradation Signature Detection, Modeling, and Processing Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Fault to Failure Progression (FFP) signature modeling and processing is a new method for applying condition-based signal data to detect degradation, to identify...

  12. HIGH RESOLUTION RESISTIVITY LEAK DETECTION DATA PROCESSING & EVALUATION METHODS & REQUIREMENTS

    Energy Technology Data Exchange (ETDEWEB)

    SCHOFIELD JS

    2007-10-04

    This document has two purposes: (1) describe how data generated by High Resolution Resistivity (HRR) leak detection (LD) systems deployed during single-shell tank (SST) waste retrieval operations are processed and evaluated, and (2) provide the basic review requirements for HRR data when HRR is deployed as a leak detection method during SST waste retrievals.

  13. A preamplification approach to GMO detection in processed foods.

    Science.gov (United States)

    Del Gaudio, S; Cirillo, A; Di Bernardo, G; Galderisi, U; Cipollaro, M

    2010-03-01

    DNA is widely used as a target for GMO analysis because of its stability and high detectability. Real-time PCR is the method routinely used in most analytical laboratories due to its quantitative performance and great sensitivity. Accurate DNA detection and quantification depend on the specificity and sensitivity of the amplification protocol as well as on the quality and quantity of the DNA used in the PCR reaction. In order to enhance the sensitivity of real-time PCR and consequently expand the number of analyzable target genes, we applied a preamplification technique to processed foods, where DNA can be present in low amounts and/or in degraded forms, thereby affecting the reliability of qualitative and quantitative results. The preamplification procedure utilizes a pool of primers targeting genes of interest and is followed by real-time PCR reactions specific for each gene. An improvement of Ct values was found comparing preamplified vs. non-preamplified DNA. The strategy reported in the present study will also be applicable to other fields requiring quantitative DNA testing by real-time PCR.

  14. Gaussian processes for state space models and change point detection

    OpenAIRE

    Turner, Ryan Darby

    2012-01-01

    This thesis details several applications of Gaussian processes (GPs) for enhanced time series modeling. We first cover different approaches for using Gaussian processes in time series problems. These are extended to the state space approach to time series in two different problems. We also combine Gaussian processes and Bayesian online change point detection (BOCPD) to increase the generality of the Gaussian process time series methods. These methodologies are evaluated on predict...

  15. Visualizing the analysis process: CZSaw's History View

    OpenAIRE

    Kadivar, Nazanin

    2011-01-01

    Capturing and visualizing the data analysis process is a growing research domain. Visual Analytics tool designers try to understand the analysis process in order to provide better tools. Existing research in process visualization suggests that capturing and visualizing the history of the analysis process is an effective form of process visualization. CZSaw is a Visual Analytics tool that provides visual representations of both data and the analysis process. In this thesis, we discuss the desi...

  16. Sequential Detection of Fission Processes for Harbor Defense

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J V; Walston, S E; Chambers, D H

    2015-02-12

    With the large increase in terrorist activities throughout the world, the timely and accurate detection of special nuclear material (SNM) has become an extremely high priority for many countries concerned with national security. The detection of radionuclide contraband based on their γ-ray emissions has been attacked vigorously with some interesting and feasible results; however, the fission process of SNM has not received as much attention due to its inherent complexity and required predictive nature. In this paper, on-line sequential Bayesian detection and parameter estimation techniques are developed to rapidly and reliably detect unknown fissioning sources with high statistical confidence.

  17. A signal processing method for the friction-based endpoint detection system of a CMP process

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chi; Guo Dongming; Jin Zhuji; Kang Renke, E-mail: xuchi_dut@163.com [Key Laboratory for Precision and Non-Traditional Machining Technology of Ministry of Education, Dalian University of Technology, Dalian 116024 (China)

    2010-12-15

    A signal processing method for the friction-based endpoint detection system of a chemical mechanical polishing (CMP) process is presented. The signal processing method uses the wavelet threshold denoising method to reduce the noise contained in the measured original signal, extracts the Kalman filter innovation from the denoised signal as the feature signal, and judges the CMP endpoint based on the behavior of the Kalman filter innovation sequence during the CMP process. Applying the signal processing method, endpoint detection experiments of the Cu CMP process were carried out. The results show that the signal processing method can judge the endpoint of the Cu CMP process. (semiconductor technology)
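
    As a rough illustration of the innovation idea described above (not the authors' implementation, and omitting the wavelet denoising stage), the following sketch runs a scalar random-walk Kalman filter over a synthetic friction-type signal and flags the endpoint when the innovation grows large; the signal shape, noise levels and threshold are invented for the example.

```python
import numpy as np

def kalman_innovations(z, q=1e-4, r=1e-2):
    """Scalar random-walk Kalman filter; returns the innovation sequence."""
    x, p = z[0], 1.0          # state estimate and its variance
    innov = np.zeros_like(z)
    for k, zk in enumerate(z):
        p = p + q             # predict: the state is a random walk
        y = zk - x            # innovation: measurement minus prediction
        g = p / (p + r)       # Kalman gain
        x = x + g * y         # update the state estimate
        p = (1.0 - g) * p     # update the estimate variance
        innov[k] = y
    return innov

# Synthetic friction signal: the level drops when the Cu layer clears.
t = np.arange(2000)
signal = 1.0 - 0.3 * (t > 1200) + 0.05 * np.random.randn(t.size)
innov = kalman_innovations(signal)
endpoint = np.argmax(np.abs(innov) > 0.15)   # first large innovation
print("endpoint detected at sample", endpoint)
```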

  18. Nonlinear processing of radar data for landmine detection

    Science.gov (United States)

    Bartosz, Elizabeth E.; DeJong, Keith; Duvoisin, Herbert A.; Solomon, Geoff Z.; Steinway, William J.; Warren, Albert

    2004-09-01

    Outstanding landmine detection has been achieved by the Handheld Standoff Mine Detection System (HSTAMIDS system) in government-run field tests. The use of anomaly detection using principal component analysis (PCA) on the return of ground penetrating radar (GPR) coupled with metal detection is the key to the success of the HSTAMIDS-like system algorithms. Indications of nonlinearities and asymmetries in Humanitarian Demining (HD) data point to modifications of the current PCA algorithm that might prove beneficial. Asymmetries in the distribution of PCA projections of field data have been quantified in HD data. An initial correction for the observed asymmetries has improved the False Alarm Rate (FAR) on these data.
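
    A minimal sketch of PCA-based anomaly detection in the spirit described above, assuming synthetic clutter-only training scans; the component count, data shapes and target signature are illustrative, not HSTAMIDS parameters.

```python
import numpy as np

def pca_anomaly_scores(X, n_components=5):
    """Project each scan onto the top principal components of the data
    (dominated by clutter) and score it by its residual energy."""
    Xc = X - X.mean(axis=0)                      # center the data
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)             # ascending eigenvalues
    basis = vecs[:, -n_components:]              # top clutter components
    proj = Xc @ basis @ basis.T                  # part explained by clutter
    residual = Xc - proj
    return np.sum(residual**2, axis=1)           # anomaly = residual energy

# Clutter-only GPR-like scans plus one scan with a buried-object response.
rng = np.random.default_rng(0)
clutter = rng.normal(size=(200, 64))
target = rng.normal(size=64) + 3 * np.sin(np.linspace(0, 6, 64))
scores = pca_anomaly_scores(np.vstack([clutter, target]))
print("most anomalous scan index:", int(np.argmax(scores)))  # expect 200
```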

  19. Differential thermal analysis microsystem for explosive detection

    DEFF Research Database (Denmark)

    Olsen, Jesper Kenneth; Greve, Anders; Senesac, L.;

    2011-01-01

    A micro differential thermal analysis (DTA) system is used for detection of trace explosive particles. The DTA system consists of two silicon micro chips with integrated heaters and temperature sensors. One chip is used for reference and one for the measurement sample. The sensor is constructed...

  20. Kernel-based fisher discriminant analysis for hyperspectral target detection

    Institute of Scientific and Technical Information of China (English)

    GU Yan-feng; ZHANG Ye; YOU Di

    2007-01-01

    A new method based on kernel Fisher discriminant analysis (KFDA) is proposed for target detection of hyperspectral images. The KFDA combines kernel mapping derived from support vector machines and the classical linear Fisher discriminant analysis (LFDA), and it possesses a good ability to process nonlinear data such as hyperspectral images. According to the Fisher rule that the ratio of the between-class and within-class scatters is maximized, the KFDA is used to obtain a set of optimal discriminant basis vectors in a high dimensional feature space. All pixels in the hyperspectral images are projected onto the discriminant basis vectors and the target detection is performed according to the projection result. The numerical experiments are performed on hyperspectral data with 126 bands collected by the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS). The experimental results show the effectiveness of the proposed detection method and demonstrate its ability to cope with small sample sizes and spectral variability in hyperspectral target detection.
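
    The following sketch shows a textbook two-class kernel Fisher discriminant of the kind the abstract describes, using an RBF kernel and a regularized closed-form solution on synthetic data; it is not the authors' code, and the kernel choice and parameters (gamma, mu) are assumptions.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :])**2).sum(-1)
    return np.exp(-gamma * d2)

def kfda_train(X, y, gamma=0.5, mu=1e-3):
    """Two-class kernel Fisher discriminant: returns dual coefficients
    alpha so that a sample x scores as sum_i alpha_i k(x_i, x)."""
    K = rbf_kernel(X, X, gamma)
    m1 = K[:, y == 0].mean(axis=1)          # kernel class means
    m2 = K[:, y == 1].mean(axis=1)
    N = np.zeros_like(K)                    # within-class scatter in kernel space
    for c in (0, 1):
        Kc = K[:, y == c]
        n = Kc.shape[1]
        N += Kc @ (np.eye(n) - np.full((n, n), 1.0 / n)) @ Kc.T
    return np.linalg.solve(N + mu * np.eye(len(X)), m1 - m2)

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 4)), rng.normal(2, 1, (50, 4))])
y = np.repeat([0, 1], 50)
alpha = kfda_train(X, y)
scores = rbf_kernel(X, X) @ alpha           # project all samples on the axis
print("class means on discriminant axis:", scores[:50].mean(), scores[50:].mean())
```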

  1. Performance evaluation of fault detection methods for wastewater treatment processes.

    Science.gov (United States)

    Corominas, Lluís; Villez, Kris; Aguado, Daniel; Rieger, Leiv; Rosén, Christian; Vanrolleghem, Peter A

    2011-02-01

    Several methods to detect faults have been developed in various fields, mainly in chemical and process engineering. However, minimal practical guidelines exist for their selection and application. This work presents an index that allows for evaluating monitoring and diagnosis performance of fault detection methods, which takes into account several characteristics, such as false alarms, false acceptance, and undesirable switching from correct detection to non-detection during a fault event. The usefulness of the index to process engineering is demonstrated first by application to a simple example. Then, it is used to compare five univariate fault detection methods (Shewhart, EWMA, and residuals of EWMA) applied to the simulated results of the Benchmark Simulation Model No. 1 long-term (BSM1_LT). The BSM1_LT, provided by the IWA Task Group on Benchmarking of Control Strategies, is a simulation platform that allows for creating sensor and actuator faults and process disturbances in a wastewater treatment plant. The results from the method comparison using BSM1_LT show better performance to detect a sensor measurement shift for adaptive methods (residuals of EWMA) and when monitoring the actuator signals in a control loop (e.g., airflow). Overall, the proposed index is able to screen fault detection methods.
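
    As a hedged illustration of one family of methods compared above (residuals of EWMA), the sketch below raises an alarm when the residual from an EWMA predictor exceeds a multiple of its running standard deviation; the simulated sensor shift and all tuning constants are invented, and the evaluation index proposed in the paper is not reproduced.

```python
import numpy as np

def ewma_residual_alarm(x, lam=0.1, k=3.0, warmup=100):
    """Adaptive EWMA monitoring: flag samples whose residual from the
    EWMA prediction exceeds k running standard deviations."""
    ewma = x[0]
    alarms = np.zeros(x.size, dtype=bool)
    resid_hist = []
    for i, xi in enumerate(x):
        resid = xi - ewma                     # residual of the EWMA predictor
        resid_hist.append(resid)
        if i > warmup:
            sigma = np.std(resid_hist[:i])
            alarms[i] = abs(resid) > k * sigma
        ewma = lam * xi + (1 - lam) * ewma    # update the smoothed estimate
    return alarms

# Simulated airflow sensor with a +0.8 measurement shift at t = 600.
rng = np.random.default_rng(2)
x = rng.normal(0, 0.2, 1000)
x[600:] += 0.8
alarms = ewma_residual_alarm(x)
print("first alarm at sample", int(np.argmax(alarms)))
```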

  2. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, Markov Random Fields (MRF), watershed segmentation and merging techniques was presented for performing image segmentation and edge detection tasks. It first applies edge detection to obtain a Difference In Strength (DIS) map. An initial segmentation is obtained based on a K-means clustering technique and the minimum distance. Then the region process is modeled by MRF to obtain an image that contains different intensity regions. The gradient values are calculated and the watershed technique is applied. The DIS value is computed for each pixel to define all the edges (weak or strong) in the image, yielding the DIS map. This serves as prior knowledge about possible region boundaries for the next step (MRF), which produces an image carrying all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the MRF-segmented image are applied and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.
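
    A minimal sketch of the first stage only: a gradient-magnitude map in the style of the DIS map, computed with Sobel kernels on a toy image. The MRF, watershed and merging stages are not reproduced, and equating DIS with plain gradient magnitude is an assumption made for illustration.

```python
import numpy as np

def dis_map(img):
    """Gradient-magnitude edge-strength map from 3x3 Sobel kernels."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    pad = np.pad(img.astype(float), 1, mode="edge")
    gx = np.zeros(img.shape)
    gy = np.zeros(img.shape)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            patch = pad[i:i+3, j:j+3]
            gx[i, j] = (patch * kx).sum()
            gy[i, j] = (patch * ky).sum()
    return np.hypot(gx, gy)

# Toy image: two constant regions; the map peaks along their border.
img = np.zeros((32, 32))
img[:, 16:] = 200
edges = dis_map(img)
print("strongest edge column:", int(edges.sum(axis=0).argmax()))  # ~15/16
```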

  3. PERIODIC SIGNAL DETECTION WITH USING DUFFING SYSTEM POINCARE MAP ANALYSIS

    Directory of Open Access Journals (Sweden)

    Valeriy Martynyuk

    2014-06-01

    In this article a periodic signal detection method based on the analysis of Duffing system chaotic oscillations is presented. This work is a development of the chaos-based signal detection technique. Generally, chaos-based signal detection is the detection of a chaotic-to-periodic state transition under the influence of an input periodic component. If the input periodic component reaches a certain threshold value, the system transforms from a chaotic state to a periodic state. Duffing-type chaotic systems are often used for such signal detection because of their ability to remain in a chaotic state for a long time and their relatively simple realization. The main advantage of chaos-based signal detection methods is the utilization of chaotic system sensitivity to weak signals. But such methods are not used in practice because of the problems of controlling the chaotic system state. The method presented does not require exact control of the system state. The Duffing system works continuously in a chaotic state and the periodic signal detection process is based on the analysis of the fractal structure of the Duffing system's Poincare map. This structure does not depend on noise, and therefore the minimum input signal-to-noise ratio required for periodic signal detection is not limited by the chaotic system state control tolerance.
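
    The sketch below integrates a Holmes-type forced Duffing equation with RK4 and takes one stroboscopic (Poincare) sample per drive period, which is the raw material the method above analyzes; the damping, forcing amplitude and step counts are assumptions, not the paper's settings, and the fractal-structure analysis itself is not reproduced.

```python
import numpy as np

def duffing_poincare(amp, f=1.0, delta=0.5, cycles=300):
    """Integrate x'' + delta*x' - x + x^3 = amp*cos(2*pi*f*t) with RK4
    and sample (x, x') once per drive period (a Poincare section)."""
    def deriv(t, s):
        x, v = s
        return np.array([v, -delta*v + x - x**3 + amp*np.cos(2*np.pi*f*t)])
    steps = 200                                # integration steps per cycle
    dt = 1.0 / (f * steps)
    s, t, points = np.array([0.1, 0.0]), 0.0, []
    for n in range(cycles * steps):
        k1 = deriv(t, s)
        k2 = deriv(t + dt/2, s + dt/2 * k1)
        k3 = deriv(t + dt/2, s + dt/2 * k2)
        k4 = deriv(t + dt, s + dt * k3)
        s = s + dt/6 * (k1 + 2*k2 + 2*k3 + k4)
        t += dt
        if (n + 1) % steps == 0:               # stroboscopic Poincare sample
            points.append(s.copy())
    return np.array(points[50:])               # drop the transient

section = duffing_poincare(amp=0.42)
print("Poincare section spread (x, x'):", section.std(axis=0).round(3))
```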

  4. Noise analysis for sensitivity-based structural damage detection

    Institute of Scientific and Technical Information of China (English)

    YIN Tao; ZHU Hong-ping; YU Ling

    2007-01-01

    As vibration-based structural damage detection methods are easily affected by environmental noise, a new statistic-based noise analysis method is proposed together with the Monte Carlo technique to investigate the influence of experimental noise in modal data on sensitivity-based damage detection methods. Different from the commonly used random perturbation technique, the proposed technique is deduced directly from the Moore-Penrose generalized inverse of the sensitivity matrix, which not only makes the analysis process more efficient but also makes it possible to analyze the influence of noise on both frequencies and mode shapes for three commonly used sensitivity-based damage detection methods in a similar way. A one-story portal frame is adopted to evaluate the efficiency of the proposed noise analysis technique.
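
    A generic sketch of the Monte Carlo side of such a study: modal-data noise is propagated through the Moore-Penrose pseudo-inverse of a sensitivity matrix (here random and hypothetical) to see how the damage estimates scatter; the closed-form statistics derived in the paper are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical sensitivity matrix S: d(modal data)/d(element stiffness).
S = rng.normal(size=(12, 4))                     # 12 modal quantities, 4 elements
true_damage = np.array([0.0, 0.15, 0.0, 0.0])    # 15% stiffness loss in element 2
clean = S @ true_damage                          # noise-free modal changes

# Monte Carlo: perturb the modal data and propagate through pinv(S).
Sp = np.linalg.pinv(S)                           # Moore-Penrose generalized inverse
trials = np.empty((2000, 4))
for k in range(trials.shape[0]):
    noisy = clean + rng.normal(0, 0.01, clean.shape)   # 1% measurement noise
    trials[k] = Sp @ noisy                       # damage estimate for this trial
print("mean estimate:", trials.mean(axis=0).round(3))
print("std per element:", trials.std(axis=0).round(3))
```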

  5. Detection of clustered microcalcifications using spatial point process modeling

    Energy Technology Data Exchange (ETDEWEB)

    Jing Hao; Yang Yongyi [Department of Electrical and Computer Engineering, Illinois Institute of Technology, 3301 South Dearborn Street, Chicago, IL 60616 (United States); Nishikawa, Robert M, E-mail: yangyo@iit.ed [Department of Radiology, The University of Chicago, 5841S. Maryland Ave., Chicago, IL 60637-1463 (United States)

    2011-01-07

    In this work we propose a spatial point process (SPP) approach to improve the detection accuracy of clustered microcalcifications (MCs) in mammogram images. The conventional approach to MC detection has been to first detect the individual MCs in an image independently, which are subsequently grouped into clusters. Our proposed approach aims to exploit the spatial distribution among the different MCs in a mammogram image (i.e. MCs tend to appear in small clusters) directly during the detection process. We model the MCs by a marked point process (MPP) in which spatially neighboring MCs interact with each other. The MCs are then simultaneously detected through maximum a posteriori (MAP) estimation of the model parameters associated with the MPP process. The proposed approach was evaluated with a dataset of 141 clinical mammograms from 66 cases, and the results show that it could yield improved detection performance compared to a recently proposed support vector machine (SVM) detector. In particular, the proposed approach achieved a sensitivity of about 90% with the FP rate at around 0.5 clusters per image, compared to about 83% for the SVM; the performance of the proposed approach was also demonstrated to be more stable over different compositions of the test images.

  6. Detection of clustered microcalcifications using spatial point process modeling

    Science.gov (United States)

    Jing, Hao; Yang, Yongyi; Nishikawa, Robert M.

    2011-01-01

    In this work we propose a spatial point process (SPP) approach to improve the detection accuracy of clustered microcalcifications (MCs) in mammogram images. The conventional approach to MC detection has been to first detect the individual MCs in an image independently, which are subsequently grouped into clusters. Our proposed approach aims to exploit the spatial distribution among the different MCs in a mammogram image (i.e. MCs tend to appear in small clusters) directly during the detection process. We model the MCs by a marked point process (MPP) in which spatially neighboring MCs interact with each other. The MCs are then simultaneously detected through maximum a posteriori (MAP) estimation of the model parameters associated with the MPP process. The proposed approach was evaluated with a dataset of 141 clinical mammograms from 66 cases, and the results show that it could yield improved detection performance compared to a recently proposed support vector machine (SVM) detector. In particular, the proposed approach achieved a sensitivity of about 90% with the FP rate at around 0.5 clusters per image, compared to about 83% for the SVM; the performance of the proposed approach was also demonstrated to be more stable over different compositions of the test images. This work was supported by NIH/NIBIB grant R01EB009905.

  7. Molecular sieves analysis by elastic recoil detection

    International Nuclear Information System (INIS)

    The possibility of determining water in zeolites via hydrogen detection using elastic recoil detection analysis (ERDA) was investigated. The radiation effect upon the desorption rate of hydrogen in miscellaneous types of zeolites, e.g. Y-Faujasite, ZSM-5, SK, etc., and in a natural clay (an Algerian bentonite) was discussed. Quantitative measurements were carried out in order to determine the amount and distribution shape of hydrogen in each material. Various explanations dealing with hydration and constitution water in such a crystalline framework were proposed. The experimental results are in good agreement with the corresponding theoretical values.

  8. RISK ANALYSIS IN MILK PROCESSING

    Directory of Open Access Journals (Sweden)

    I. PIRVUTOIU

    2013-12-01

    This paper aimed to evaluate bankruptcy risk using the "Score Method" based on Canon and Holder's Model. The data were collected from the Balance Sheet and Profit and Loss Account for the period 2005-2007, recorded by a meat processing plant (Rador Commercial Company). The study highlighted the financial situation of the company and the level of the main financial ratios underlying the calculation of the Z score function value in the three years. The low values of the Z score function recorded every year reflect that the company is still facing bankruptcy risk. However, the worst situation was recorded in the years 2005 and 2006, when the bankruptcy risk ranged between 70-80%. In 2007, the bankruptcy risk was lower, ranging between 50-70%, as the Z function recorded a value lower than 4. For meat processing companies such an analysis is compulsory at present, as the business environment is very risky in our country.

  9. Operational analysis for the drug detection problem

    Science.gov (United States)

    Hoopengardner, Roger L.; Smith, Michael C.

    1994-10-01

    New techniques and sensors to identify the molecular, chemical, or elemental structures unique to drugs are being developed under several national programs. However, the challenge faced by U.S. drug enforcement and Customs officials goes far beyond the simple technical capability to detect an illegal drug. Entry points into the U.S. include ports, border crossings, and airports where cargo ships, vehicles, and aircraft move huge volumes of freight. Current technology and personnel are able to physically inspect only a small fraction of the entering cargo containers. The complexities of how to best utilize new technology to aid the detection process and yet not adversely affect the processing of vehicles and time-sensitive cargo is the challenge faced by these officials. This paper describes an ARPA sponsored initiative to develop a simple, yet useful, method for examining the operational consequences of utilizing various procedures and technologies in combination to achieve an 'acceptable' level of detection probability. Since Customs entry points into the U.S. vary from huge seaports to a one-lane highway checkpoint on the Canadian or Mexican border, no one system can possibly be right for all points. This approach can examine alternative concepts for using different techniques/systems for different types of entry points. Operational measures reported include the average time to process vehicles and containers, the average and maximum numbers in the system at any time, and the utilization of inspection teams. The method is implemented via a PC-based simulation written in the GPSS-PC language. Input to the simulation model is (1) the individual detection probabilities and false positive rates for each detection technology or procedure, (2) the inspection time for each procedure, (3) the system configuration, and (4) the physical distance between inspection stations. The model offers on-line graphics to examine effects as the model runs.
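
    The paper's model is written in GPSS-PC; purely as an illustration of the same kinds of inputs (detection and false-positive probabilities, inspection time, team count), here is a comparable discrete-event sketch in Python with invented parameter values.

```python
import heapq, random

def simulate_checkpoint(n_vehicles=2000, mean_gap=2.0, inspect_time=3.0,
                        teams=2, contraband_rate=0.02,
                        p_detect=0.85, p_false=0.05, seed=4):
    """One inspection station with a limited number of teams: exponential
    arrivals, fixed inspection time, probabilistic detection outcome."""
    random.seed(seed)
    t = 0.0
    free_at = [0.0] * teams                      # next-free time of each team
    heapq.heapify(free_at)
    total_time, alarms, true_hits = 0.0, 0, 0
    for _ in range(n_vehicles):
        t += random.expovariate(1.0 / mean_gap)  # next vehicle arrives
        start = max(t, heapq.heappop(free_at))   # wait for a free team
        heapq.heappush(free_at, start + inspect_time)
        total_time += start + inspect_time - t   # time in system
        carrying = random.random() < contraband_rate
        alarm = random.random() < (p_detect if carrying else p_false)
        alarms += alarm
        true_hits += alarm and carrying
    return total_time / n_vehicles, alarms, true_hits

avg, alarms, hits = simulate_checkpoint()
print(f"avg time in system {avg:.1f} min, alarms {alarms}, true detections {hits}")
```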

  10. Privacy preserving similarity detection for data analysis

    OpenAIRE

    Leontiadis, Iraklis; Önen, Melek; Molva, Refik; Chorley, M. J.; Colombo, G.B.

    2013-01-01

    Current applications tend to use personal sensitive information to achieve better quality with respect to their services. Since third parties are not trusted, the data must be protected so that individual data privacy is not compromised while, at the same time, operations on the data remain possible. A wide range of data analysis operations entails a similarity detection algorithm between user data. For instance clustering on big data groups together objects based on ...

  11. Cascaded image analysis for dynamic crack detection in material testing

    Science.gov (United States)

    Hampel, U.; Maas, H.-G.

    Concrete probes in civil engineering material testing often show fissures or hairline-cracks. These cracks develop dynamically. Starting at a width of a few microns, they usually cannot be detected visually or in an image of a camera imaging the whole probe. Conventional image analysis techniques will detect fissures only if they show a width in the order of one pixel. To be able to detect and measure fissures with a width of a fraction of a pixel at an early stage of their development, a cascaded image analysis approach has been developed, implemented and tested. The basic idea of the approach is to detect discontinuities in dense surface deformation vector fields. These deformation vector fields between consecutive stereo image pairs, which are generated by cross correlation or least squares matching, show a precision in the order of 1/50 pixel. Hairline-cracks can be detected and measured by applying edge detection techniques such as a Sobel operator to the results of the image matching process. Cracks will show up as linear discontinuities in the deformation vector field and can be vectorized by edge chaining. In practical tests of the method, cracks with a width of 1/20 pixel could be detected, and their width could be determined at a precision of 1/50 pixel.
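
    As a sketch of the matching idea (normalized cross-correlation with sub-pixel refinement, here via a simple 3-point parabola fit rather than least squares matching), the following 1-D toy measures per-patch displacements between two profiles and makes a jump, i.e. a crack-like discontinuity, visible; the data, patch sizes and noise levels are invented.

```python
import numpy as np

def patch_shift(a, b, search=3):
    """Shift between two 1-D patches by maximizing normalized
    cross-correlation, refined to sub-pixel accuracy with a parabola fit."""
    scores = []
    for s in range(-search, search + 1):
        scores.append(np.corrcoef(a, np.roll(b, s))[0, 1])
    scores = np.array(scores)
    i = scores.argmax()
    if 0 < i < len(scores) - 1:                  # 3-point parabolic refinement
        y0, y1, y2 = scores[i-1:i+2]
        i = i + 0.5 * (y0 - y2) / (y0 - 2*y1 + y2)
    return i - search

# Deformation across a profile: a jump in the shift marks a hairline crack.
x = np.linspace(0, 20, 400)
before = np.sin(x) + 0.01 * np.random.randn(x.size)
after = np.roll(before, 2)
after[200:] = np.roll(before, 4)[200:]           # extra 2-px jump halfway
shifts = [patch_shift(before[i:i+40], after[i:i+40]) for i in range(0, 360, 40)]
print("per-patch shifts:", np.round(shifts, 2))  # discontinuity near patch 5
```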

  12. Eco-Efficiency Analysis of biotechnological processes.

    Science.gov (United States)

    Saling, Peter

    2005-07-01

    Eco-Efficiency has been variously defined and analytically implemented by several workers. In most cases, Eco-Efficiency is taken to mean the ecological optimization of overall systems while not disregarding economic factors. Eco-Efficiency should increase the positive ecological performance of a commercial company in relation to economic value creation, or reduce negative effects. Several companies use Eco-Efficiency Analysis for decision-making processes; and industrial examples of best practices in developing and implementing Eco-Efficiency have been reviewed. They clearly demonstrate the environmental and business benefits of Eco-Efficiency. An instrument for the early recognition and systematic detection of economic and environmental opportunities and risks for production processes in the chemical industry has been in use since 1997; different new features have since been developed, leading to many examples. This powerful Eco-Efficiency Analysis allows a feasibility evaluation of existing and future business activities and is applied by BASF. In many cases, decision-makers are able to choose among alternative processes for making a product.

  13. Digital Image Processing Technique for Breast Cancer Detection

    Science.gov (United States)

    Guzmán-Cabrera, R.; Guzmán-Sepúlveda, J. R.; Torres-Cisneros, M.; May-Arrioja, D. A.; Ruiz-Pinales, J.; Ibarra-Manzano, O. G.; Aviña-Cervantes, G.; Parada, A. González

    2013-09-01

    Breast cancer is the most common cause of death in women and the second leading cause of cancer deaths worldwide. Primary prevention in the early stages of the disease becomes complex as the causes remain almost unknown. However, some typical signatures of this disease, such as masses and microcalcifications appearing on mammograms, can be used to improve early diagnostic techniques, which is critical for women’s quality of life. X-ray mammography is the main test used for screening and early diagnosis, and its analysis and processing are the keys to improving breast cancer prognosis. As masses and benign glandular tissue typically appear with low contrast and often very blurred, several computer-aided diagnosis schemes have been developed to support radiologists and internists in their diagnosis. In this article, an approach is proposed to effectively analyze digital mammograms based on texture segmentation for the detection of early stage tumors. The proposed algorithm was tested on several images taken from the digital database for screening mammography for cancer research and diagnosis, and it proved well suited to distinguishing masses and microcalcifications from the background tissue using morphological operators, then extracting them through machine learning techniques and a clustering algorithm for intensity-based segmentation.

  14. Integrated Seismic Event Detection and Location by Advanced Array Processing

    Energy Technology Data Exchange (ETDEWEB)

    Kvaerna, T; Gibbons, S J; Ringdal, F; Harris, D B

    2007-02-09

    The principal objective of this two-year study is to develop and test a new advanced, automatic approach to seismic detection/location using array processing. We address a strategy to obtain significantly improved precision in the location of low-magnitude events compared with current fully-automatic approaches, combined with a low false alarm rate. We have developed and evaluated a prototype automatic system which uses as a basis regional array processing with fixed, carefully calibrated, site-specific parameters in conjunction with improved automatic phase onset time estimation. We have in parallel developed tools for Matched Field Processing for optimized detection and source-region identification of seismic signals. This narrow-band procedure aims to mitigate some of the causes of difficulty encountered using the standard array processing system, specifically complicated source-time histories of seismic events and shortcomings in the plane-wave approximation for seismic phase arrivals at regional arrays.

  15. In-process discontinuity detection during friction stir welding

    Science.gov (United States)

    Shrivastava, Amber

    The objective of this work is to develop a method for detecting the creation of discontinuities (e.g., voids) during friction stir welding. Friction stir welding is inherently cost-effective, however, the need for significant weld inspection can make the process cost-prohibitive. A new approach to weld inspection is required -- where an in-situ characterization of weld quality can be obtained, reducing the need for post-process inspection. Friction stir welds with discontinuity and without discontinuity were created. In this work, discontinuities are generated by reducing the friction stir tool rotation frequency and increasing the tool traverse speed in order to create "colder" welds. During the welds, forces are measured. Discontinuity sizes for welds are measured by computerized tomography. The relationship between the force transients and the discontinuity sizes indicates that force measurement during friction stir welding can be effectively used for detecting discontinuities in friction stir welds. The normalized force transient data and normalized discontinuity size are correlated to develop a criterion for discontinuity detection. Additional welds are performed to validate the discontinuity detection method. The discontinuity sizes estimated by the force measurement based method are in good agreement with the discontinuity sizes measured by computerized tomography. These results show that the force measurement based discontinuity detection method can be effectively used to detect discontinuities during friction stir welding.
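
    A hedged sketch of force-transient screening in the spirit described above: weld segments whose normalized force variance exceeds a multiple of the baseline are flagged. The synthetic "void signature" and the variance criterion are assumptions for illustration, not the calibrated force-to-size correlation developed in the work.

```python
import numpy as np

def detect_discontinuity(force, window=200, k=2.5):
    """Flag weld segments whose normalized force variance exceeds k times
    the baseline (median) variance, a proxy for void formation."""
    f = (force - force.mean()) / force.std()        # normalize the transient
    n = f.size // window
    var = np.array([f[i*window:(i+1)*window].var() for i in range(n)])
    baseline = np.median(var)
    return np.where(var > k * baseline)[0]

# Synthetic axial-force trace: extra oscillation where a void forms.
rng = np.random.default_rng(5)
force = 8.0 + 0.1 * rng.normal(size=4000)
force[2200:2600] += 0.5 * np.sin(np.arange(400) * 0.8)   # void signature
print("suspect segments:", detect_discontinuity(force))   # expect 11, 12
```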

  16. Data reconciliation and gross error detection: application in chemical processes

    OpenAIRE

    EGHBAL AHMADİ, Mohammad Hosein

    2015-01-01

    Measured data are normally corrupted by different kinds of errors in many chemical processes. In this work, a brief overview of data reconciliation and gross error detection, regarded as the most efficient techniques for reducing measurement errors and obtaining accurate information about the process, is presented. In addition to defining the basic problem and surveying recent developments in this area, which is categorized in the “Real Time Optimization” field, we will describe a...

  17. Continuous Fraud Detection in Enterprise Systems through Audit Trail Analysis

    Directory of Open Access Journals (Sweden)

    Peter J. Best

    2009-03-01

    Enterprise systems, real time recording and real time reporting pose new and significant challenges to the accounting and auditing professions. This includes developing methods and tools for continuous assurance and fraud detection. In this paper we propose a methodology for continuous fraud detection that exploits security audit logs, changes in master records and accounting audit trails in enterprise systems. The steps in this process are: (1) threat monitoring - surveillance of security audit logs for ‘red flags’, (2) automated extraction and analysis of data from audit trails, and (3) using forensic investigation techniques to determine whether a fraud has actually occurred. We demonstrate how mySAP, an enterprise system, can be used for audit trail analysis in detecting financial frauds; afterwards we use a case study of a suspected fraud to illustrate how to implement the methodology.

  18. Impact of two types of image processing on cancer detection in mammography

    Science.gov (United States)

    Warren, Lucy M.; Halling-Brown, Mark D.; Looney, Padraig T.; Dance, David R.; Wilkinson, Louise; Wallis, Matthew G.; Given-Wilson, Rosalind M.; Cooke, Julie; McAvinchey, Rita; Young, Kenneth C.

    2016-03-01

    The impact of image processing on cancer detection is still a concern to radiologists and physicists. This work aims to evaluate the effect of two types of image processing on cancer detection in mammography. An observer study was performed in which six radiologists inspected 349 cases (a mixture of normal cases, benign lesions and cancers) processed with two types of image processing. The observers marked areas they suspected to be cancers. JAFROC analysis was performed to determine if there was a significant difference in cancer detection between the two types of image processing. Cancer detection was significantly better with the standard setting image processing (flavor A) compared with one that provides enhanced image contrast (flavor B), p = 0.036. The image processing was applied to images of the CDMAM test object, which were then analysed using CDCOM. The threshold gold thickness measured with the CDMAM test object was thinner using flavor A than flavor B image processing. Since Flavor A was found to be superior in both the observer study and the measurements using the CDMAM phantom, this may indicate that measurements using the CDMAM correlate with change in cancer detection with different types of image processing.

  19. Exoplanetary Detection By Multifractal Spectral Analysis

    CERN Document Server

    Agarwal, Sahil; Wettlaufer, John S

    2016-01-01

    Owing to technological advances the number of exoplanets discovered has risen dramatically in the last few years. However, when trying to observe Earth analogs, it is often difficult to test the veracity of detection. We have developed a new approach to the analysis of exoplanetary spectral observations based on temporal multifractality, which identifies time scales that characterize planetary orbital motion around the host star. Without fitting spectral data to stellar models, we show how the planetary signal can be robustly detected from noisy data using noise amplitude as a source of information. For observation of transiting planets, combining this method with simple geometry allows us to relate the time scales obtained to primary transit and secondary exoplanet eclipse of the exoplanets. Making use of data obtained with ground-based and space-based observations we have tested our approach on HD 189733b. Moreover, we have investigated the use of this technique in measuring planetary orbital motion via dop...

  20. Analysis of Android Device-Based Solutions for Fall Detection.

    Science.gov (United States)

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-01-01

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation on automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions. PMID:26213928

  1. Analysis of Android Device-Based Solutions for Fall Detection.

    Science.gov (United States)

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-07-23

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation on automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions.

  2. Analysis of Android Device-Based Solutions for Fall Detection

    Directory of Open Access Journals (Sweden)

    Eduardo Casilari

    2015-07-01

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation on automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions.

  3. Detecting 2LSB steganography using extended pairs of values analysis

    Science.gov (United States)

    Khalind, Omed; Aziz, Benjamin

    2014-05-01

    In this paper, we propose an extended pairs of values analysis to detect and estimate the amount of secret messages embedded with 2LSB replacement in digital images, based on the chi-square attack and the regularity rate in pixel values. The detection process is separated from the estimation of the hidden message length, as it is the main requirement of any steganalysis method. Hence, the detection process acts as a discrete classifier, which classifies a given set of images into stego and clean classes. The method can accurately detect 2LSB replacement even when the message length is about 10% of the total capacity; it reaches its best performance, with an accuracy higher than 0.96 and a true positive rate of more than 0.997, when the amount of data is 20% to 100% of the total capacity. Moreover, the method makes no assumptions about either the image or the secret message, as it was tested with two sets of 3000 images, compressed and uncompressed, embedded with a random message in each case. This method of detection could also be used as an automated tool to analyse a bulk of images for hidden contents, which could be used by digital forensics analysts in their investigation process.
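
    A minimal sketch of the chi-square side of the attack, assuming SciPy is available: full-capacity 2LSB replacement tends to equalize the four histogram bins within each group of values sharing the same upper six bits, so uniformly high p-values are suspicious. The toy "image" histogram and the mean-p-value summary are invented, and the regularity-rate component of the method is not reproduced.

```python
import numpy as np
from scipy.stats import chisquare

def two_lsb_chi2(pixels):
    """Mean chi-square p-value over groups of four histogram bins that
    share the same upper six bits; high values suggest 2LSB embedding."""
    hist = np.bincount(pixels.ravel(), minlength=256)
    p_values = []
    for g in range(0, 256, 4):
        quad = hist[g:g+4]
        if quad.sum() >= 20:                  # need enough mass to test
            p_values.append(chisquare(quad).pvalue)
    return np.mean(p_values)

rng = np.random.default_rng(6)
# Toy "clean image": a decaying (exponential-like) intensity histogram.
clean = np.minimum(rng.exponential(20, 100_000), 255).astype(np.int64)
# Full-capacity 2LSB replacement: overwrite the two low bits with message bits.
stego = (clean & ~3) | rng.integers(0, 4, clean.size)
print("clean:", round(two_lsb_chi2(clean), 3),
      "stego:", round(two_lsb_chi2(stego), 3))   # stego is much higher
```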

  4. IWTU Process Sample Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Nick Soelberg

    2013-04-01

    CH2M-WG Idaho (CWI) requested that Battelle Energy Alliance (BEA) analyze various samples collected during June – August 2012 at the Integrated Waste Treatment Unit (IWTU). Samples of IWTU process materials were collected from various locations in the process. None of these samples were radioactive. These samples were collected and analyzed to provide more understanding of the compositions of various materials in the process during the time of the process shutdown that occurred on June 16, 2012, while the IWTU was in the process of nonradioactive startup.

  5. Gaussian process regression analysis for functional data

    CERN Document Server

    Shi, Jian Qing

    2011-01-01

    Gaussian Process Regression Analysis for Functional Data presents nonparametric statistical methods for functional regression analysis, specifically the methods based on a Gaussian process prior in a functional space. The authors focus on problems involving functional response variables and mixed covariates of functional and scalar variables.Covering the basics of Gaussian process regression, the first several chapters discuss functional data analysis, theoretical aspects based on the asymptotic properties of Gaussian process regression models, and new methodological developments for high dime

  6. BRVAAF and performance analysis for target detection

    Institute of Scientific and Technical Information of China (English)

    ZHANG Nan; TAO Ran; WANG Yue

    2009-01-01

    A bistatic range-velocity-acceleration ambiguity function (BRVAAF) is proposed. The model of radar measurements of an accelerating target involving the time delay, Doppler frequency and Doppler rate is given. The relationships between these measurements and the parameters of the bistatic geometry,target position, velocity and acceleration are derived. Moreover, the effects of the bistatic geometry factors on these measurements are analyzed. Besides, the two relationships of the bistatic integration loss and the bistatic optimum integration time with these factors are built and their change trends are described respectively. This research is helpful to analyze the influences of the bistatic geometry factors on the target detection and signal processing.

  7. BRVAAF and performance analysis for target detection

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    A bistatic range-velocity-acceleration ambiguity function (BRVAAF) is proposed. The model of radar measurements of an accelerating target involving the time delay, Doppler frequency and Doppler rate is given. The relationships between these measurements and the parameters of the bistatic geometry, target position, velocity and acceleration are derived. Moreover, the effects of the bistatic geometry factors on these measurements are analyzed. Besides, the two relationships of the bistatic integration loss and the bistatic optimum integration time with these factors are built and their change trends are described respectively. This research is helpful to analyze the influences of the bistatic geometry factors on the target detection and signal processing.

  8. Method Taking into Account Process Dispersions to Detect Hardware Trojan Horse by Side-Channel

    OpenAIRE

    Ngo, Xuan Thuy; Najm, Zakaria; Bhasin, Shivam; Guilley, Sylvain; Danger, Jean-Luc

    2014-01-01

    Hardware trojan horses inserted in integrated circuits have received special attention from researchers. Most recent research focuses on detecting the presence of hardware trojans through various techniques like reverse engineering, test/verification methods and side-channel analysis (SCA). Previous works using SCA for trojan detection are based on power measurements or even simulations. When using real silicon, the results are strongly biased by the process var...

  9. Formal analysis of design process dynamics

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2010-01-01

    This paper presents a formal analysis of design process dynamics. Such a formal analysis is a prerequisite to come to a formal theory of design and for the development of automated support for the dynamics of design processes. The analysis was geared toward the identification of dynamic design prope

  10. Confusion Analysis and Detection for Workflow Nets

    Directory of Open Access Journals (Sweden)

    Xiao-liang Chen

    2014-01-01

    Option processes often occur in a business procedure with respect to resource competition. In a business procedure modeled with a workflow net (WF-net), all decision behavior and option operations for business tasks are modeled and performed by the conflicts in the corresponding WF-net. Concurrency in WF-nets is applied to keep business procedures operating at high performance. However, the firing of concurrent transitions in a WF-net may lead to the disappearance of conflicts in the WF-net. This phenomenon, usually called confusion, creates difficulties for the resolution of conflicts. This paper investigates confusion detection problems in WF-nets. First, confusions are formalized as a class of marked subnets with special conflicting and concurrent features. Second, a detection approach based on the characteristics of confusion subnets and integer linear programming (ILP) is developed, which does not require computing the reachability graph of a WF-net. Examples of confusion detection in WF-nets are presented. Finally, the impact of confusions on the properties of WF-nets is specified.

  11. Combustion Analysis and Knock Detection in Single Cylinder DI-Diesel Engine Using Vibration Signature Analysis

    OpenAIRE

    Y.V.V.SatyanarayanaMurthy

    2011-01-01

    The purpose of this paper is to detect the "knock" in diesel engines, which adversely deteriorates engine performance. The methodology introduced in the present work suggests a newly developed approach to the vibration analysis of diesel engines. The method is based on the fundamental relationship between the engine vibration pattern and the relative characteristics of the combustion process in each or different cylinders. Knock in a diesel engine is detected by measuring the vibra...

  12. Detecting Damped Lyman-$\\alpha$ Absorbers with Gaussian Processes

    CERN Document Server

    Garnett, Roman; Bird, Simeon; Schneider, Jeff

    2016-01-01

    We develop an automated technique for detecting damped Lyman-$\\alpha$ absorbers (DLAs) along spectroscopic sightlines to quasi-stellar objects (QSOs or quasars). The detection of DLAs in large-scale spectroscopic surveys such as SDSS-III sheds light on galaxy formation at high redshift, showing the nucleation of galaxies from diffuse gas. We use nearly 50 000 QSO spectra to learn a novel tailored Gaussian process model for quasar emission spectra, which we apply to the DLA detection problem via Bayesian model selection. We propose models for identifying an arbitrary number of DLAs along a given line of sight. We demonstrate our method's effectiveness using a large-scale validation experiment, with excellent performance. We also provide a catalog of our results applied to 162 861 spectra from SDSS-III data release 12.

  13. Processing of radar data for landmine detection: nonlinear transformation

    Science.gov (United States)

    Bartosz, E. E.; Duvoisin, H.; Konduri, R.; Solomon, G. Z.

    2005-06-01

    The Handheld Standoff Mine Detection System (HSTAMIDS system) has achieved outstanding performance in government-run field tests due to its use of anomaly detection using principal component analysis (PCA) on the return of ground penetrating radar (GPR) coupled with metal detection. Indications of nonlinearities and asymmetries in Humanitarian Demining (HD) data point to modifications of the current PCA algorithm that might prove beneficial. Asymmetries in the distribution of PCA projections of field data have been quantified in HD data. The data suggest a logarithmic correction. Such a correction has been applied and has improved the FAR on this data set. The increase in performance is comparable to the increase shown using the simpler asymmetric rescaling method.

  14. Analysis of crisis intervention processes.

    Science.gov (United States)

    Tschacher, Wolfgang; Jacobshagen, Nina

    2002-01-01

    The remediation processes in psychosocial crisis intervention were modeled focusing on cognitive orientation. Frequent observations and subsequent process modeling constitute a novel approach to process research and reveal process-outcome associations. A sample of 40 inpatients who were assigned to treatment in a crisis intervention unit was monitored in order to study the process of crisis intervention. The process data consisted of patients' self-ratings of the variables mood, tension, and cognitive orientation, which were assessed three times a day throughout hospitalization (M = 22.6 days). Linear time series models (vector autoregression) of the process data were computed to describe the prototypical dynamic patterns of the sample. Additionally, the outcome of crisis intervention was evaluated by pre-post questionnaires. Linear trends were found pointing to an improvement of mood, a reduction of tension, and an increase of outward cognitive orientation. Time series modeling showed that, on average, outward cognitive orientation preceded improved mood. The time series models partially predicted the treatment effect, notably the outcome domain "reduction of social anxiety," yet did not predict the domain of symptom reduction. In conclusion, crisis intervention should focus on having patients increasingly engage in outward cognitive orientation in order to stabilize mood, reduce anxiety, and activate their resources.

  15. Detection, information fusion, and temporal processing for intelligence in recognition

    Energy Technology Data Exchange (ETDEWEB)

    Casasent, D. [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    1996-12-31

    The use of intelligence in vision recognition uses many different techniques or tools. This presentation discusses several of these techniques for recognition. The recognition process is generally separated into several steps or stages when implemented in hardware, e.g. detection, segmentation and enhancement, and recognition. Several new distortion-invariant filters, biologically-inspired Gabor wavelet filter techniques, and morphological operations that have been found very useful for detection and clutter rejection are discussed. These are all shift-invariant operations that allow multiple object regions of interest in a scene to be located in parallel. We also discuss new algorithm fusion concepts by which the results from different detection algorithms are combined to reduce detection false alarms; these fusion methods utilize hierarchical processing and fuzzy logic concepts. We have found this to be most necessary, since no single detection algorithm is best for all cases. For the final recognition stage, we describe a new method of representing all distorted versions of different classes of objects and determining the object class and pose that most closely matches that of a given input. Besides being efficient in terms of storage and on-line computations required, it overcomes many of the problems that other classifiers have in terms of the required training set size, poor generalization with many hidden layer neurons, etc. It is also attractive in its ability to reject input regions as clutter (non-objects) and to learn new object descriptions. We also discuss its use in processing a temporal sequence of input images of the contents of each local region of interest. We note how this leads to robust results in which estimation errors in individual frames can be overcome. This seems very practical, since in many scenarios a decision need not be made after only one frame of data, because subsequent frames of data arrive immediately in sequence.

  16. Colour model analysis for microscopic image processing

    OpenAIRE

    García-Rojo Marcial; González Jesús; Déniz Oscar; González Roberto; Bueno Gloria

    2008-01-01

    This article presents a comparative study of different colour models (RGB, HSI and CIEL*a*b*) applied to the analysis of very large microscopic images. Such an analysis of different colour models is needed in order to carry out successful detection, and therefore classification, of different regions of interest (ROIs) within the image. This, in turn, allows both distinguishing possible ROIs and retrieving their proper colour for further ROI analysis. This analysis is not commonly done ...

  17. Sound Event Detection for Music Signals Using Gaussian Processes

    Directory of Open Access Journals (Sweden)

    Pablo A. Alvarado-Durán

    2013-11-01

    In this paper we present a new methodology for detecting sound events in music signals using Gaussian Processes. Our method first takes a time-frequency representation, i.e. the spectrogram, of the input audio signal. Secondly, the spectrogram dimension is reduced by translating the linear Hertz frequency scale into the logarithmic Mel frequency scale using a triangular filter bank. Finally every short-time spectrum, i.e. every Mel spectrogram column, is classified as “Event” or “Not Event” by a Gaussian Processes classifier. We compare our method with other widely used event detection techniques. To do so, we use MATLAB® to program each technique and test them using two datasets of music with different levels of complexity. Results show that the new methodology outperforms the standard approaches, achieving an improvement of about 1.66% on dataset one and 0.45% on dataset two in terms of F-measure.
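
    To make the front end concrete, the sketch below builds a triangular Mel filter bank and log-Mel frames from a synthetic tone, then flags frames with a plain energy threshold standing in for the Gaussian Processes classifier; the sample rate, filter count and threshold are assumptions.

```python
import numpy as np

def hz_to_mel(f): return 2595.0 * np.log10(1.0 + f / 700.0)
def mel_to_hz(m): return 700.0 * (10.0**(m / 2595.0) - 1.0)

def mel_filterbank(n_filters, n_fft, sr):
    """Triangular filters mapping a linear-Hz spectrum to the Mel scale."""
    mels = np.linspace(hz_to_mel(0), hz_to_mel(sr / 2), n_filters + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mels) / sr).astype(int)
    fb = np.zeros((n_filters, n_fft // 2 + 1))
    for i in range(1, n_filters + 1):
        l, c, r = bins[i - 1], bins[i], bins[i + 1]
        for b in range(l, c):
            fb[i - 1, b] = (b - l) / max(c - l, 1)   # rising edge
        for b in range(c, r):
            fb[i - 1, b] = (r - b) / max(r - c, 1)   # falling edge
    return fb

sr, n_fft = 16000, 512
rng = np.random.default_rng(7)
audio = 0.01 * rng.normal(size=sr)          # 1 s of near-silence ...
audio[4000:8000] += np.sin(2 * np.pi * 440 * np.arange(4000) / sr)  # ... + a note
frames = audio[: sr // n_fft * n_fft].reshape(-1, n_fft)
spec = np.abs(np.fft.rfft(frames * np.hanning(n_fft), axis=1))
mel = np.log(spec @ mel_filterbank(40, n_fft, sr).T + 1e-9)
events = mel.mean(axis=1) > mel.mean() + mel.std()  # stand-in for the GP classifier
print("frames flagged as Event:", np.where(events)[0])
```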

  18. Detection of Blood Culture Bacterial Contamination using Natural Language Processing

    Science.gov (United States)

    Matheny, Michael E.; FitzHenry, Fern; Speroff, Theodore; Hathaway, Jacob; Murff, Harvey J.; Brown, Steven H.; Fielstein, Elliot M.; Dittus, Robert S.; Elkin, Peter L.

    2009-01-01

    Microbiology results are reported in semi-structured formats and have a high content of useful patient information. We developed and validated a hybrid regular expression and natural language processing solution for processing blood culture microbiology reports. Multi-center Veterans Affairs training and testing data sets were randomly extracted and manually reviewed to determine the culture and sensitivity as well as contamination results. The tool was iteratively developed for both outcomes using a training dataset, and then evaluated on the test dataset to determine antibiotic susceptibility data extraction and contamination detection performance. Our algorithm had a sensitivity of 84.8% and a positive predictive value of 96.0% for mapping the antibiotics and bacteria with appropriate sensitivity findings in the test data. The bacterial contamination detection algorithm had a sensitivity of 83.3% and a positive predictive value of 81.8%. PMID:20351890
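
    A toy illustration of the regular-expression half of such a hybrid pipeline; the report format, organism pattern and contaminant rule here are invented, not the VA data or the authors' algorithm.

```python
import re

# Hypothetical semi-structured blood culture report (format invented).
report = """\
BLOOD CULTURE #1: STAPHYLOCOCCUS EPIDERMIDIS ISOLATED
  OXACILLIN: R   VANCOMYCIN: S
BLOOD CULTURE #2: NO GROWTH AT 5 DAYS
"""

organism_re = re.compile(r"BLOOD CULTURE #(\d+):\s+([A-Z ]+?)\s+ISOLATED")
suscept_re = re.compile(r"([A-Z]+):\s*([SIR])\b")   # drug: S/I/R flags
contaminants = {"STAPHYLOCOCCUS EPIDERMIDIS"}       # common skin-flora contaminant

for num, organism in organism_re.findall(report):
    # A real system would scope susceptibilities to the matching culture.
    flags = suscept_re.findall(report)
    label = "possible contaminant" if organism in contaminants else "pathogen"
    print(f"culture {num}: {organism} ({label}), susceptibilities: {flags}")
```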

  19. Processing Satellite Imagery To Detect Waste Tire Piles

    Science.gov (United States)

    Skiles, Joseph; Schmidt, Cynthia; Wuinlan, Becky; Huybrechts, Catherine

    2007-01-01

    A methodology for processing commercially available satellite spectral imagery has been developed to enable identification and mapping of waste tire piles in California. The California Integrated Waste Management Board initiated the project and provided funding for the method's development. The methodology combines commercially available image-processing and georeferencing software to develop a model that specifically distinguishes between tire piles and other objects. The methodology reduces the time that must be spent initially surveying a region for tire sites, thereby increasing the time inspectors and managers have available for remediation of the sites. Remediation is needed because millions of used tires are discarded every year, waste tire piles pose fire hazards, and mosquitoes often breed in water trapped in tires. It should be possible to adapt the methodology to regions outside California by modifying some of the algorithms implemented in the software to account for geographic differences in spectral characteristics associated with terrain and climate. The task of identifying tire piles in satellite imagery is uniquely challenging because of their low reflectance levels: tires tend to be spectrally confused with shadows and deep water, both of which reflect little light to satellite-borne imaging systems. In this methodology, the challenge is met, in part, by use of software that implements the Tire Identification from Reflectance (TIRe) model. The development of the TIRe model included incorporation of lessons learned in previous research on the detection and mapping of tire piles by use of manual/visual and/or computational analysis of aerial and satellite imagery. The TIRe model is a computational model for identifying tire piles and discriminating between tire piles and other objects. The input to the TIRe model is the georeferenced but otherwise raw satellite spectral images of a geographic region to be surveyed

  20. Process and apparatus for detecting presence of plant substances

    Energy Technology Data Exchange (ETDEWEB)

    Kirby, J.A.

    1990-12-31

    Disclosed is a process for detecting the presence of plant substances in a particular environment which comprises the steps of: (1) Measuring the background K40 gamma ray radiation level in a particular environment with a 1.46 MeV gamma ray counter system; (2) measuring the amount of K40 gamma ray radiation emanating from a package containing said plant substance being passed through said environment with said counter; and (3) generating an alarm signal when the total K40 gamma ray radiation reaches a predetermined level over and above the background level. Also disclosed is the apparatus and system used to conduct the process.
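
    The alarm logic of step (3) reduces to a threshold comparison against the measured background. A minimal sketch, with illustrative count rates and margin (the patent does not specify numeric values):

```python
# Step (3) of the disclosed process: alarm when the K40 (1.46 MeV) count
# rate from the package exceeds background by a predetermined margin.
def k40_alarm(background_cps: float, package_cps: float, margin_cps: float = 5.0) -> bool:
    """Return True when the total K40 count rate exceeds background plus margin."""
    return package_cps > background_cps + margin_cps

background = 12.0  # counts/s with no package present (step 1)
package = 19.5     # counts/s with the package in the counter (step 2)
if k40_alarm(background, package):
    print("ALARM: K40 activity above background threshold")
```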

  1. Process and apparatus for detecting presence of plant substances

    Energy Technology Data Exchange (ETDEWEB)

    Kirby, J.A.

    1990-01-01

    Disclosed is a process for detecting the presence of plant substances in a particular environment which comprises the steps of: (1) Measuring the background K40 gamma ray radiation level in a particular environment with a 1.46 MeV gamma ray counter system; (2) measuring the amount of K40 gamma ray radiation emanating from a package containing said plant substance being passed through said environment with said counter; and (3) generating an alarm signal when the total K40 gamma ray radiation reaches a predetermined level over and above the background level. Also disclosed is the apparatus and system used to conduct the process.

  2. Fourier Array Processing for Buried Victims Detection using Ultra Wide Band Radar with Uncalibrated Sensors

    NARCIS (Netherlands)

    Lidicky, L.

    2008-01-01

    The purpose of this paper is to propose a new way to detect victims buried in or under layers of rubble or debris in case of disasters such as earthquakes, fires or terrorist attacks. The method is based on Fourier processing and Principal Component Analysis (PCA). It utilizes a moving array of sensors.

  3. Automotive Parking Lot and Theft Detection through Image Processing

    Directory of Open Access Journals (Sweden)

    2013-10-01

    Full Text Available Automotive parking lot and theft detection through image processing is a smart parking system that saves the owner time by letting him park his car in a more organized way, and also prevents theft of the car. It is a technology to optimize the checkout process by analysing a database of images of car number plates. The heart of the project is image processing. The images of number plates are detected by Matlab, and a picture of the driver is saved in a similar database. As soon as both images are saved, the garage entrance pole shifts 90 degrees upward using a DC motor and remains in that position for 30 seconds to allow the car to enter. After 30 seconds it returns to its previous position. When the car exits, the earlier steps are repeated and Matlab matches the two images that were taken during entering and leaving. Meanwhile the seven-segment display shows that a car has left the parking lot by decrementing the number on its display. The system is controlled by a microcontroller, which is also able to detect and display whether a vacant parking space is available. If there is no vacancy a red LED lights up, whereas a green LED is used to indicate the presence of parking space along with how many parking spots are available. The system is applicable to supermarket car parking lots and also apartment garages.

  4. Automatic detection and analysis of nuclear plant malfunctions

    International Nuclear Information System (INIS)

    In this paper a system is proposed which dynamically performs the detection and analysis of malfunctions in a nuclear plant. The proposed method was developed and implemented on a reactor simulator, instead of on a real one, thus allowing a wide range of tests. For each variable under control, a simulation module was identified and implemented on the reactor's on-line computer. In the malfunction identification phase all modules run separately, processing plant input variables and producing their output variable in real time; continuous comparison of the computed variables with plant variables allows malfunction detection. At this moment the second phase can occur: when a malfunction is detected, all modules are connected, except the module simulating the faulty variable, and a fast simulation is carried out to analyse the consequences. (author)
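
    The detection phase amounts to comparing each simulated variable against its plant counterpart and declaring a malfunction when the residual persists beyond a tolerance. A minimal sketch under those assumptions (the paper does not give tolerances or persistence windows):

```python
import numpy as np

def detect_malfunction(plant, simulated, tol, persist=5):
    """Flag a malfunction when |plant - simulation| exceeds tol for
    `persist` consecutive samples; return the sample index, or None."""
    residual = np.abs(np.asarray(plant) - np.asarray(simulated))
    run = 0
    for i, exceeded in enumerate(residual > tol):
        run = run + 1 if exceeded else 0
        if run >= persist:
            return i
    return None
```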

  5. Mesh Processing in Medical Image Analysis

    DEFF Research Database (Denmark)

    The following topics are dealt with: mesh processing; medical image analysis; interactive freeform modeling; statistical shape analysis; clinical CT images; statistical surface recovery; automated segmentation; cerebral aneurysms; and real-time particle-based representation.

  6. Mesh Processing in Medical Image Analysis

    DEFF Research Database (Denmark)

    The following topics are dealt with: mesh processing; medical image analysis; interactive freeform modeling; statistical shape analysis; clinical CT images; statistical surface recovery; automated segmentation; cerebral aneurysms; and real-time particle-based representation.

  7. Perturbation analysis of Poisson processes

    CERN Document Server

    Last, Günter

    2012-01-01

    We consider a Poisson process $\Phi$ on a general phase space. The expectation of a function of $\Phi$ can be considered as a functional of the intensity measure $\lambda$ of $\Phi$. Extending earlier results of Molchanov and Zuyev (2000) on finite Poisson processes, we study the behaviour of this functional under signed (possibly infinite) perturbations of $\lambda$. In particular we obtain general Margulis-Russo type formulas for the derivative with respect to non-linear transformations of the intensity measure depending on some parameter. As an application we study the behaviour of expectations of functions of multivariate pure jump Lévy processes under perturbations of the Lévy measure. A key ingredient of our approach is the explicit Fock space representation obtained in Last and Penrose (2011).

  8. Process Correlation Analysis Model for Process Improvement Identification

    Directory of Open Access Journals (Sweden)

    Su-jin Choi

    2014-01-01

    software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort and expertise required. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data from improvement practices. We evaluate the model using industrial data.

  9. Advances in face detection and facial image analysis

    CERN Document Server

    Celebi, M; Smolka, Bogdan

    2016-01-01

    This book presents the state-of-the-art in face detection and analysis. It outlines new research directions, including in particular psychology-based facial dynamics recognition, aimed at various applications such as behavior analysis, deception detection, and diagnosis of various psychological disorders. Topics of interest include face and facial landmark detection, face recognition, facial expression and emotion analysis, facial dynamics analysis, face classification, identification, and clustering, and gaze direction and head pose estimation, as well as applications of face analysis.

  10. "Information Design" for "Weak Signal" detection and processing in Economic Intelligence: case study on Health resources

    OpenAIRE

    Sidhom, Sahbi; Lambert, Philippe

    2011-01-01

    Abstract -- This research covers all phases of "Information Design" applied to detect and profit from weak signals in economic intelligence (EI) (or BI: business intelligence). The field of information design (ID) applies the process of translating complex, unorganized or unstructured data into valuable and meaningful information. ID practice requires an interdisciplinary approach, which combines skills in graphic design - writing, analysis processing and editing ...

  11. Performance Analysis of Intrusion Detection in MANET

    Directory of Open Access Journals (Sweden)

    SAMRIDHI SHARMA

    2011-05-01

    Full Text Available A Mobile Ad hoc Network (MANET) is a collection of autonomous nodes or terminals which communicate with each other by forming a multihop radio network without the aid of any established infrastructure or centralized administration such as a base station. An ad hoc network therefore lacks secure boundaries. At present, security issues in Mobile Ad hoc Networks have become one of the primary concerns. A MANET is more vulnerable to attacks than a wired network due to its distributed nature and lack of infrastructure. Those vulnerabilities are inherent in the MANET structure and cannot be removed easily. As a result, attacks with malicious intent have been and will be devised to exploit those vulnerabilities and to cripple MANET operation. Attack prevention techniques, such as authentication and encryption, can be used as a first line of defense for decreasing the possibility of attacks. However, these techniques have limited effectiveness in practice, since they are designed for a set of known attacks; they are unlikely to prevent newer attacks designed to circumvent the existing security measures. For this purpose, there is a need for a mechanism that detects and responds to these newer attacks, i.e. intrusion detection. Intrusion detection provides audit and monitoring capabilities that offer local security to a node and help to perceive the specific trust level of other nodes. In addition, ontology is a proven tool for this type of analysis. In this paper, a specific ontology has been modeled which aims to explore and classify current techniques of Intrusion Detection System (IDS) aware MANETs. To support these ideas, a discussion regarding attacks, IDS architecture and IDS in MANETs is presented, and then several research achievements are compared based on these parameters.

  12. Elastic recoil detection analysis of ferroelectric films

    Energy Technology Data Exchange (ETDEWEB)

    Stannard, W.B.; Johnston, P.N.; Walker, S.R.; Bubb, I.F. [Royal Melbourne Inst. of Tech., VIC (Australia); Scott, J.F. [New South Wales Univ., Kensington, NSW (Australia); Cohen, D.D.; Dytlewski, N. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1996-12-31

    There has been considerable progress in developing SrBi2Ta2O9 (SBT) and Ba0.7Sr0.3TiO3 (BST) ferroelectric films for use as nonvolatile memory chips and for capacitors in dynamic random access memories (DRAMs). Ferroelectric materials have a very large dielectric constant (approximately 1000), roughly one hundred times greater than that of silicon dioxide. Devices made from these materials have been known to experience breakdown after repeated voltage pulsing. It has been suggested that this is related to stoichiometric changes within the material. To accurately characterise these materials, Elastic Recoil Detection Analysis (ERDA) is being developed. This technique employs a high energy heavy ion beam to eject nuclei from the target and uses a time of flight and energy dispersive (ToF-E) detector telescope to detect these nuclei. The recoil nuclei carry both energy and mass information, which enables the determination of separate energy spectra for individual elements or for small groups of elements. In this work, ERDA employing 77 MeV 127I ions has been used to analyse strontium bismuth tantalate thin films at the heavy ion recoil facility at ANSTO, Lucas Heights. 9 refs., 5 figs.

  13. LEXICAL ANALYSIS TO EFFECTIVELY DETECT USERS’ OPINION

    Directory of Open Access Journals (Sweden)

    Anil Kumar K.M

    2011-11-01

    Full Text Available In this paper we present a lexical approach that will identify the opinion of web users popularly expressed using short words or SMS words. These words are pretty popular with diverse web users and are used for expressing their opinion on the web. The study of opinion from the web arises from the need to know the diverse opinions of web users. The opinions expressed by web users may be on diverse topics such as politics, sports, products, movies etc. These opinions will be very useful to others such as leaders of political parties, selection committees of various sports, business analysts and other stakeholders of products, directors and producers of movies, as well as to other concerned web users. We use a semantic based approach to find users' opinion from short words or SMS words apart from regular opinionated phrases. Our approach efficiently detects opinion from opinionated texts using lexical analysis and is found to be better than the other approaches on different data sets.

  14. High efficiency processing for reduced amplitude zones detection in the HRECG signal

    Science.gov (United States)

    Dugarte, N.; Álvarez, A.; Balacco, J.; Mercado, G.; Gonzalez, A.; Dugarte, E.; Olivares, A.

    2016-04-01

    Summary - This article presents part of a more detailed research effort proposed for the medium to long term, with the intention of establishing a new philosophy of surface electrocardiogram analysis. This research aims to find indicators of cardiovascular disease in its early stage that may go unnoticed with conventional electrocardiography. This paper reports the development of software processing which collects some existing techniques and incorporates novel methods for the detection of reduced amplitude zones (RAZ) in the high resolution electrocardiographic signal (HRECG). The algorithm consists of three stages: an efficient processing step for QRS detection, an averaging filter using correlation techniques, and a step for RAZ detection. Preliminary results show the efficiency of the system and point to the incorporation of new signal analysis techniques involving all 12 leads.

  15. High efficiency processing for reduced amplitude zones detection in the HRECG signal

    Science.gov (United States)

    Dugarte, N.; Álvarez, A.; Balacco, J.; Mercado, G.; Gonzalez, A.; Dugarte, E.; Olivares, A.

    2016-04-01

    Summary – This article presents part of a more detailed research effort proposed for the medium to long term, with the intention of establishing a new philosophy of surface electrocardiogram analysis. This research aims to find indicators of cardiovascular disease in its early stage that may go unnoticed with conventional electrocardiography. This paper reports the development of software processing which collects some existing techniques and incorporates novel methods for the detection of reduced amplitude zones (RAZ) in the high resolution electrocardiographic signal (HRECG). The algorithm consists of three stages: an efficient processing step for QRS detection, an averaging filter using correlation techniques, and a step for RAZ detection. Preliminary results show the efficiency of the system and point to the incorporation of new signal analysis techniques involving all 12 leads.

  16. Detecting Phishing Attacks In Purchasing Process Through Proactive Approach

    Directory of Open Access Journals (Sweden)

    S.Arun

    2012-06-01

    Full Text Available A monitor is a software system that observes and analyzes the behavior of a target system, determining qualities of interest such as the satisfaction of the target system's requirements. In modern technology, business processes are open and distributed, which may lead to failure. Therefore monitoring is an important task for the services that comprise these processes. We present a framework for multilevel monitoring of these service systems. The main objective of this project is monitoring the customer who purchases items from a merchant. Phishing is an online scam that attempts to defraud people of their personal information such as credit card or bank account details. We detect, locate and remove the phishing e-mail. The customer details are stored in a web registry. We demonstrate how the online business processes can be implemented with multiple scenarios that include monitoring open service policy commitments.

  17. Weighted Dickey-Fuller Processes for Detecting Stationarity

    CERN Document Server

    Steland, Ansgar

    2010-01-01

    Aiming at monitoring a time series to detect stationarity as soon as possible, we introduce monitoring procedures based on kernel-weighted sequential Dickey-Fuller (DF) processes, and related stopping times, which may be called weighted Dickey-Fuller control charts. Under rather weak assumptions, (functional) central limit theorems are established under the unit root null hypothesis and local-to-unity alternatives. For general dependent and heterogeneous innovation sequences the limit processes depend on a nuisance parameter. In this case of practical interest, one can use estimated control limits obtained from the estimated asymptotic law. Another easy-to-use approach is to transform the DF processes to obtain limit laws which are invariant with respect to the nuisance parameter. We provide asymptotic theory for both approaches and compare their statistical behavior in finite samples by simulation.

  18. Algorithms Development in Detection of the Gelatinization Process during Enzymatic ‘Dodol’ Processing

    Directory of Open Access Journals (Sweden)

    Azman Hamzah

    2007-11-01

    Full Text Available Computer vision systems have found wide application in the food processing industry to perform quality evaluation. These systems make it possible to replace human inspectors in the evaluation of a variety of quality attributes. This paper describes the implementation of Fast Fourier Transform and Kalman filtering algorithms to detect the gelatinization of glutinous rice flour slurry (GRFS) in an enzymatic ‘dodol’ processing. The onset of GRFS gelatinization is critical in determining the quality of an enzymatic ‘dodol’. Combinations of these two algorithms were able to detect the gelatinization of the GRFS. The results show that gelatinization of the GRFS occurred in the time range of 11.75 minutes to 15.33 minutes for 20 batches of processing. This paper highlights the capability of computer vision, using our proposed algorithms, in monitoring and controlling an enzymatic ‘dodol’ process via image processing technology.

  19. Timely online chatter detection in end milling process

    Science.gov (United States)

    Fu, Yang; Zhang, Yun; Zhou, Huamin; Li, Dequn; Liu, Hongqi; Qiao, Haiyu; Wang, Xiaoqiang

    2016-06-01

    Chatter is one of the most unexpected and uncontrollable phenomena in the milling operation. It is very important to develop an effective monitoring method that identifies chatter as soon as possible, since existing methods still cannot detect it before the workpiece has been damaged. This paper proposes an energy aggregation characteristic-based Hilbert-Huang transform method for online chatter detection. The measured vibration signal is first decomposed into a series of intrinsic mode functions (IMFs) using ensemble empirical mode decomposition. Feature IMFs are then selected according to the majority energy rule. Subsequently, Hilbert spectral analysis is applied to these feature IMFs to calculate the Hilbert time/frequency spectrum. Two indicators are proposed to quantify the spectrum, and thresholds are automatically calculated using a Gaussian mixture model. Milling experiments prove the proposed method to be effective in protecting the workpiece from severe chatter damage within acceptable time complexity.
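
    The decomposition and mode-selection steps can be sketched as below, assuming the PyEMD package for ensemble empirical mode decomposition; the signal source, the exact energy rule, and the downstream indicators are simplified assumptions, not the paper's definitions.

```python
import numpy as np
from PyEMD import EEMD            # pip install EMD-signal
from scipy.signal import hilbert

signal = np.load("milling_vibration.npy")   # hypothetical measured vibration
imfs = EEMD().eemd(signal)                  # ensemble empirical mode decomposition

# "Majority energy" rule, simplified: keep IMFs with above-average energy
energies = np.array([np.sum(imf ** 2) for imf in imfs])
feature_imfs = imfs[energies > energies.mean()]

# Hilbert spectral analysis of the selected feature IMFs
analytic = hilbert(feature_imfs)
amplitude = np.abs(analytic)                # instantaneous amplitude
inst_freq = np.diff(np.unwrap(np.angle(analytic))) / (2 * np.pi)  # cycles/sample
```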

  20. Vision based error detection for 3D printing processes

    Directory of Open Access Journals (Sweden)

    Baumann Felix

    2016-01-01

    Full Text Available 3D printers became more popular in the last decade, partly because of the expiration of key patents and the supply of affordable machines. Their origin lies in rapid prototyping. With Additive Manufacturing (AM) it is possible to create physical objects from 3D model data by layer-wise addition of material. Besides professional use for prototyping and low volume manufacturing, they are becoming widespread amongst end users, starting with the so-called Maker Movement. The most prevalent type of consumer grade 3D printer is Fused Deposition Modelling (FDM; also Fused Filament Fabrication, FFF). This work focuses on FDM machinery because of its widespread occurrence and large number of open problems, such as precision and failure. These 3D printers can fail to print objects at a statistical rate depending on the manufacturer and model of the printer. Failures can occur due to misalignment of the print-bed or the print-head, slippage of the motors, warping of the printed material, lack of adhesion, or other reasons. The goal of this research is to provide an environment in which these failures can be detected automatically. Direct supervision is inhibited by the recommended placement of FDM printers in separate rooms away from the user due to ventilation issues. The inability to oversee the printing process leads to late or omitted detection of failures. Rejects cause material waste and wasted time, thus lowering the utilization of printing resources. Our approach consists of a camera based error detection mechanism that provides a web based interface for remote supervision and early failure detection. Early failure detection can lead to reduced time spent on broken prints, less material wasted, and in some cases salvaged objects.

  1. Risk Analysis for Tea Processing

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    It is obvious that after all the disasters with dioxins, BSE, pathogens, Foot and Mouth disease and others, and now because of the possibilities of bioterrorism, Food Safety is almost at the top of the agenda of the EU for the years to come. The implementation of certain hygiene principles such as HACCP and a transparent hygiene policy applicable to all food and all food operators, from the farm to the table, together with effective instruments to manage Food Safety, will form a substantial part of this agenda. As an example, external quality factors such as certain pathogens in tea will be discussed. Since risk analysis of e.g. mycotoxins already has quite a long history of development in several international bodies, and tea might bear unwanted contaminants (or ones deliberately added by terrorist action), the need to monitor tea much more regularly than is being done today seems to be a "conditio sine qua non". Recently developed Immuno Flow tests may one day help the consumer to find out if he is being poisoned.

  2. Process and apparatus for detecting presence of plant substances

    Energy Technology Data Exchange (ETDEWEB)

    Kirby, J.A.

    1991-04-16

    This patent describes an apparatus and process for detecting the presence of plant substances in a particular environment. It comprises: measuring the background K40 gamma ray radiation level in a particular environment with a 1.46 MeV gamma ray counter system; measuring the amount of K40 gamma ray radiation emanating from a package containing a plant substance being passed through an environment with a counter; and generating an alarm signal when the total K40 gamma ray radiation reaches a predetermined level over and above the background level.

  3. Process and apparatus for detecting presence of plant substances

    International Nuclear Information System (INIS)

    This patent describes an apparatus and process for detecting the presence of plant substances in a particular environment. It comprises: measuring the background K40 gamma ray radiation level in a particular environment with a 1.46 MeV gamma ray counter system; measuring the amount of K40 gamma ray radiation emanating from a package containing a plant substance being passed through an environment with a counter; and generating an alarm signal when the total K40 gamma ray radiation reaches a predetermined level over and above the background level

  4. On damage detection in wind turbine gearboxes using outlier analysis

    Science.gov (United States)

    Antoniadou, Ifigeneia; Manson, Graeme; Dervilis, Nikolaos; Staszewski, Wieslaw J.; Worden, Keith

    2012-04-01

    The proportion of worldwide installed wind power in power systems increases over the years as a result of the steadily growing interest in renewable energy sources. Still, the advantages offered by the use of wind power are overshadowed by the high operational and maintenance costs, resulting in the low competitiveness of wind power in the energy market. In order to reduce the costs of corrective maintenance, the application of condition monitoring to gearboxes becomes highly important, since gearboxes are among the wind turbine components with the most frequent failure observations. While condition monitoring of gearboxes in general is common practice, with various methods having been developed over the last few decades, wind turbine gearbox condition monitoring faces a major challenge: the detection of faults under the time-varying load conditions prevailing in wind turbine systems. Classical time and frequency domain methods fail to detect faults under variable load conditions, due to the temporary effect that these faults have on vibration signals. This paper uses the statistical discipline of outlier analysis for the damage detection of gearbox tooth faults. A simplified two-degree-of-freedom gearbox model considering nonlinear backlash, time-periodic mesh stiffness and static transmission error, simulates the vibration signals to be analysed. Local stiffness reduction is used for the simulation of tooth faults and statistical processes determine the existence of intermittencies. The lowest level of fault detection, the threshold value, is considered and the Mahalanobis squared-distance is calculated for the novelty detection problem.
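
    The novelty-detection step is a Mahalanobis squared-distance of test features from a healthy baseline distribution, compared against a threshold. A minimal sketch with synthetic features (the paper derives its features from the simulated gearbox vibration signals):

```python
import numpy as np

def mahalanobis_sq(X, baseline):
    """Mahalanobis squared-distance of each row of X from the baseline set."""
    mu = baseline.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(baseline, rowvar=False))
    d = X - mu
    return np.einsum("ij,jk,ik->i", d, cov_inv, d)

baseline = np.random.randn(500, 4)                 # healthy-condition features
test = np.vstack([np.random.randn(50, 4),          # healthy test data
                  np.random.randn(5, 4) + 3.0])    # shifted "faulty" data
threshold = np.percentile(mahalanobis_sq(baseline, baseline), 99)
novel = mahalanobis_sq(test, baseline) > threshold  # True -> potential damage
```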

  5. Heart rate analysis by sparse representation for acute pain detection.

    Science.gov (United States)

    Tejman-Yarden, Shai; Levi, Ofer; Beizerov, Alex; Parmet, Yisrael; Nguyen, Tu; Saunders, Michael; Rudich, Zvia; Perry, James C; Baker, Dewleen G; Moeller-Bertram, Tobias

    2016-04-01

    Objective pain assessment methods pose an advantage over the currently used subjective pain rating tools. Advanced signal processing methodologies, including the wavelet transform (WT) and the orthogonal matching pursuit algorithm (OMP), were developed in the past two decades. The aim of this study was to apply and compare these time-specific methods to heart rate samples of healthy subjects for acute pain detection. Fifteen adult volunteers participated in a study conducted in the pain clinic at a single center. Each subject's heart rate was sampled for a 5-min baseline, followed by a cold pressor test (CPT). Analysis was done by the WT and by the OMP algorithm with a Fourier/wavelet dictionary separately. Data from 11 subjects were analyzed. Compared to baseline, both the WT analysis and the OMP algorithm showed a significant increase in coefficient density during the pain incline period, allowing detection of pain events. Statistical analysis proved the OMP to be by far more specific, allowing the Fourier coefficients to represent the signal's basic harmonics and the wavelet coefficients to focus on the time-specific painful event. This is an initial study using OMP for pain detection; further studies need to prove the efficiency of this system in different settings. PMID:26264057

  6. Bisous model-Detecting filamentary patterns in point processes

    Science.gov (United States)

    Tempel, E.; Stoica, R. S.; Kipper, R.; Saar, E.

    2016-07-01

    The cosmic web is a highly complex geometrical pattern, with galaxy clusters at the intersection of filaments and filaments at the intersection of walls. Identifying and describing the filamentary network is not a trivial task due to the overwhelming complexity of the structure, its connectivity and its intrinsic hierarchical nature. To detect and quantify galactic filaments we use the Bisous model, which is a marked point process built to model multi-dimensional patterns. The Bisous filament finder works directly with the galaxy distribution data, and the model intrinsically takes into account the connectivity of the filamentary network. The Bisous model generates the visit map (the probability of finding a filament at a given point) together with the filament orientation field. Using these two fields, we can extract filament spines from the data. Together with this paper we publish the computer code for the Bisous model, which is made available on GitHub. The Bisous filament finder has been successfully used in several cosmological applications, and further development of the model will allow detection of the filamentary network also in photometric redshift surveys, using the full redshift posterior. We also want to encourage the astro-statistical community to use the model and to connect it with all other existing methods for filamentary pattern detection and characterisation.

  7. Coherent detection and digital signal processing for fiber optic communications

    Science.gov (United States)

    Ip, Ezra

    The drive towards higher spectral efficiency in optical fiber systems has generated renewed interest in coherent detection. We review different detection methods, including noncoherent, differentially coherent, and coherent detection, as well as hybrid detection methods. We compare the modulation methods that are enabled and their respective performances in a linear regime. An important system parameter is the number of degrees of freedom (DOF) utilized in transmission. Polarization-multiplexed quadrature-amplitude modulation maximizes spectral efficiency and power efficiency as it uses all four available DOF contained in the two field quadratures in the two polarizations. Dual-polarization homodyne or heterodyne downconversion are linear processes that can fully recover the received signal field in these four DOF. When downconverted signals are sampled at the Nyquist rate, compensation of transmission impairments can be performed using digital signal processing (DSP). Software based receivers benefit from the robustness of DSP, flexibility in design, and ease of adaptation to time-varying channels. Linear impairments, including chromatic dispersion (CD) and polarization-mode dispersion (PMD), can be compensated quasi-exactly using finite impulse response filters. In practical systems, sampling the received signal at 3/2 times the symbol rate is sufficient to enable an arbitrary amount of CD and PMD to be compensated for a sufficiently long equalizer whose tap length scales linearly with transmission distance. Depending on the transmitted constellation and the target bit error rate, the analog-to-digital converter (ADC) should have around 5 to 6 bits of resolution. Digital coherent receivers are naturally suited for the implementation of feedforward carrier recovery, which has superior linewidth tolerance than phase-locked loops, and does not suffer from feedback delay constraints. Differential bit encoding can be used to prevent catastrophic receiver failure due
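
    As an illustration of the linear-impairment compensation described above, chromatic dispersion can be removed with a single all-pass filter, shown here in its frequency-domain form rather than as the FIR filter the text mentions. This is a minimal sketch; the fiber and sampling parameters are illustrative, and the sign convention of the transfer function depends on the definition used.

```python
import numpy as np

def cd_compensate(samples, fs, beta2, length):
    """Undo chromatic dispersion by applying exp(+j*beta2/2*w^2*L) in frequency."""
    w = 2 * np.pi * np.fft.fftfreq(samples.size, d=1.0 / fs)
    h_inv = np.exp(1j * 0.5 * beta2 * w ** 2 * length)
    return np.fft.ifft(np.fft.fft(samples) * h_inv)

fs = 56e9                    # 2 samples/symbol at 28 GBd (assumed)
beta2 = -2.17e-26            # s^2/m, standard single-mode fiber near 1550 nm
rx = np.exp(2j * np.pi * 1e9 * np.arange(4096) / fs)  # toy received field
equalized = cd_compensate(rx, fs, beta2, 1000e3)      # 1000 km link
```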

  8. Signal detection in FDA AERS database using Dirichlet process.

    Science.gov (United States)

    Hu, Na; Huang, Lan; Tiwari, Ram C

    2015-08-30

    In the recent two decades, data mining methods for signal detection have been developed for drug safety surveillance, using large post-market safety data. Several of these methods assume that the number of reports for each drug-adverse event combination is a Poisson random variable with mean proportional to the unknown reporting rate of the drug-adverse event pair. Here, a Bayesian method based on the Poisson-Dirichlet process (DP) model is proposed for signal detection from large databases, such as the Food and Drug Administration's Adverse Event Reporting System (AERS) database. Instead of using a parametric distribution as a common prior for the reporting rates, as is the case with existing Bayesian or empirical Bayesian methods, a nonparametric prior, namely, the DP, is used. The precision parameter and the baseline distribution of the DP, which characterize the process, are modeled hierarchically. The performance of the Poisson-DP model is compared with some other models, through an intensive simulation study using a Bayesian model selection and frequentist performance characteristics such as type-I error, false discovery rate, sensitivity, and power. For illustration, the proposed model and its extension to address a large amount of zero counts are used to analyze statin drugs for signals using the 2006-2011 AERS data. PMID:25924820

  9. Effects of image processing on the detective quantum efficiency

    Science.gov (United States)

    Park, Hye-Suk; Kim, Hee-Joung; Cho, Hyo-Min; Lee, Chang-Lae; Lee, Seung-Wan; Choi, Yu-Na

    2010-04-01

    Digital radiography has gained popularity in many areas of clinical practice. This transition brings interest in advancing the methodologies for image quality characterization. However, as the methodologies for such characterizations have not been standardized, the results of these studies cannot be directly compared. The primary objective of this study was to standardize methodologies for image quality characterization. The secondary objective was to evaluate factors affecting the modulation transfer function (MTF), noise power spectrum (NPS), and detective quantum efficiency (DQE) according to the image processing algorithm. Image performance parameters such as MTF, NPS, and DQE were evaluated using the international electro-technical commission (IEC 62220-1)-defined RQA5 radiographic techniques. Computed radiography (CR) images of a hand posterior-anterior (PA) view for measuring the signal to noise ratio (SNR), a slit image for measuring the MTF, and a white image for measuring the NPS were obtained, and various Multi-Scale Image Contrast Amplification (MUSICA) parameters were applied to each of the acquired images. The results show that all of the modifications considerably influenced the evaluated SNR, MTF, NPS, and DQE. Images modified by post-processing had a higher DQE than the MUSICA=0 image. This suggests that MUSICA values, as a post-processing step, affect the image when it is evaluated for image quality. In conclusion, the control parameters of image processing must be accounted for when characterizing image quality. The results of this study can serve as a baseline for evaluating imaging systems and their imaging characteristics by measuring the MTF, NPS, and DQE.

  10. Three-dimensional model analysis and processing

    CERN Document Server

    Yu, Faxin; Luo, Hao; Wang, Pinghui

    2011-01-01

    This book focuses on five hot research directions in 3D model analysis and processing in computer science:  compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.

  11. CONCEPTUAL ANALYSIS OF FORMING PROCESSES PROJECT TEAM

    OpenAIRE

    Celovalnikova, E.

    2010-01-01

    This article conducts a conceptual analysis of the stages of forming a project team and of the processes of managing its project activity, with the purpose of achieving successful completion of the aims set in a project.

  12. Normality Analysis for RFI Detection in Microwave Radiometry

    Directory of Open Access Journals (Sweden)

    Adriano Camps

    2009-12-01

    Full Text Available Radio-frequency interference (RFI) present in microwave radiometry measurements leads to erroneous radiometric results. Sources of RFI include spurious signals and harmonics from lower frequency bands, spread-spectrum signals overlapping the “protected” band of operation, or out-of-band emissions not properly rejected by the pre-detection filters due to their finite rejection. The presence of RFI in the radiometric signal modifies the detected power and therefore the estimated antenna temperature from which the geophysical parameters will be retrieved. In recent years, techniques to detect the presence of RFI in radiometric measurements have been developed. They include time- and/or frequency-domain analyses, or time- and/or frequency-domain statistical analysis of the received signal which, in the absence of RFI, must be a zero-mean Gaussian process. Statistical analyses performed to date include the calculation of the kurtosis and the Shapiro-Wilk normality test of the received signal. Nevertheless, statistical analysis of the received signal could be more extensive, as reported in the Statistics literature. The objective of this work is to study the performance of a number of normality tests encountered in the Statistics literature when applied to the detection of the presence of RFI in the radiometric signal, which is Gaussian by nature. A description of the normality tests and the RFI detection results for different kinds of RFI are presented in view of determining an omnibus test that can deal with the blind spots of the currently used methods.
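
    The statistical tests named above are readily available in SciPy, so a per-block RFI decision can be sketched directly; the sample data, block length, and significance level here are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
clean = rng.standard_normal(4096)               # RFI-free: zero-mean Gaussian
with_rfi = clean + 0.3 * np.sign(np.sin(0.05 * np.arange(4096)))  # toy RFI

for name, x in [("clean", clean), ("with RFI", with_rfi)]:
    k = stats.kurtosis(x, fisher=False)         # ~3 for a Gaussian process
    _, p_sw = stats.shapiro(x)                  # Shapiro-Wilk normality test
    _, p_dp = stats.normaltest(x)               # D'Agostino-Pearson omnibus test
    flagged = p_sw < 0.01 or p_dp < 0.01        # reject Gaussianity -> flag RFI
    print(f"{name}: kurtosis={k:.2f}, Shapiro p={p_sw:.3g}, RFI flagged={flagged}")
```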

  13. Meta-analysis using Dirichlet process.

    Science.gov (United States)

    Muthukumarana, Saman; Tiwari, Ram C

    2016-02-01

    This article develops a Bayesian approach for meta-analysis using the Dirichlet process. The key aspect of the Dirichlet process in meta-analysis is the ability to assess evidence of statistical heterogeneity or variation in the underlying effects across study while relaxing the distributional assumptions. We assume that the study effects are generated from a Dirichlet process. Under a Dirichlet process model, the study effects parameters have support on a discrete space and enable borrowing of information across studies while facilitating clustering among studies. We illustrate the proposed method by applying it to a dataset on the Program for International Student Assessment on 30 countries. Results from the data analysis, simulation studies, and the log pseudo-marginal likelihood model selection procedure indicate that the Dirichlet process model performs better than conventional alternative methods. PMID:22802045

  14. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus Plenge

    2005-01-01

    Discussion of the paper "Residual analysis for spatial point processes" by A. Baddeley, M. Hazelton, J. Møller and R. Turner. Journal of the Royal Statistical Society, Series B, vol. 67, pages 617-666, 2005.

  15. Effects of image processing on the detective quantum efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Park, Hye-Suk; Kim, Hee-Joung; Cho, Hyo-Min; Lee, Chang-Lae; Lee, Seung-Wan; Choi, Yu-Na [Yonsei University, Wonju (Korea, Republic of)

    2010-02-15

    The evaluation of image quality is an important part of digital radiography. The modulation transfer function (MTF), the noise power spectrum (NPS), and the detective quantum efficiency (DQE) are widely accepted measurements of digital radiographic system performance. However, as the methodologies for such characterization have not been standardized, it is difficult to directly compare reported MTF, NPS, and DQE results. In this study, we evaluated the effect of an image processing algorithm on the estimated MTF, NPS, and DQE. The image performance parameters were evaluated using the international electro-technical commission (IEC 62220-1)-defined RQA5 radiographic techniques. Computed radiography (CR) posterior-anterior (PA) images of a hand for measuring the signal to noise ratio (SNR), slit images for measuring the MTF, and white images for measuring the NPS were obtained, and various Multi-Scale Image Contrast Amplification (MUSICA) factors were applied to each of the acquired images. All of the modifications of the images obtained by using image processing had a considerable influence on the evaluated image quality. In conclusion, the control parameters of image processing must be accounted for when characterizing image quality. The results of this study should serve as a baseline for evaluating imaging systems and their imaging characteristics by MTF, NPS, and DQE measurements.

  16. Theoretical analysis of radiographic images by nonstationary Poisson processes

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, K.; Uchida, S. (Gifu Univ. (Japan)); Yamada, I.

    1980-12-01

    This paper deals with the noise analysis of radiographic images obtained in the usual fluorescent screen-film system. The theory of nonstationary Poisson processes is applied to the analysis of the radiographic images containing the object information. The ensemble averages, the autocorrelation functions, and the Wiener spectrum densities of the light-energy distribution at the fluorescent screen and of the film optical-density distribution are obtained. The detection characteristics of the system are evaluated theoretically. Numerical examples of the one-dimensional image are shown and the results are compared with those obtained under the assumption that the object image is related to the background noise by an additive process.

  17. Theoretical Analysis of Radiographic Images by Nonstationary Poisson Processes

    Science.gov (United States)

    Tanaka, Kazuo; Yamada, Isao; Uchida, Suguru

    1980-12-01

    This paper deals with the noise analysis of radiographic images obtained in the usual fluorescent screen-film system. The theory of nonstationary Poisson processes is applied to the analysis of the radiographic images containing the object information. The ensemble averages, the autocorrelation functions, and the Wiener spectrum densities of the light-energy distribution at the fluorescent screen and of the film optical-density distribution are obtained. The detection characteristics of the system are evaluated theoretically. Numerical examples of the one-dimensional image are shown and the results are compared with those obtained under the assumption that the object image is related to the background noise by the additive process.

  18. Detection and Analysis of Threats to the Energy Sector: DATES

    Energy Technology Data Exchange (ETDEWEB)

    Alfonso Valdes

    2010-03-31

    This report summarizes Detection and Analysis of Threats to the Energy Sector (DATES), a project sponsored by the United States Department of Energy and performed by a team led by SRI International, with collaboration from Sandia National Laboratories, ArcSight, Inc., and Invensys Process Systems. DATES sought to advance the state of the practice in intrusion detection and situational awareness with respect to cyber attacks in energy systems. This was achieved through adaptation of detection algorithms for process systems, as well as the development of novel anomaly detection techniques suited for such systems, integrated into a detection suite. These detection components, together with third-party commercial security systems, were interfaced with the commercial Security Information Event Management (SIEM) solution from ArcSight. The efficacy of the integrated solution was demonstrated on two testbeds, one based on a Distributed Control System (DCS) from Invensys, and the other based on the Virtual Control System Environment (VCSE) from Sandia. These achievements advance the DOE Cybersecurity Roadmap [DOE2006] goals in the area of security monitoring. The project ran from October 2007 until March 2010, with the final six months focused on experimentation. In the validation phase, team members from SRI and Sandia coupled the two test environments and carried out a number of distributed and cross-site attacks against various points in one or both testbeds. Alert messages from the distributed, heterogeneous detection components were correlated using the ArcSight SIEM platform, providing within-site and cross-site views of the attacks. In particular, the team demonstrated detection and visualization of network zone traversal and denial-of-service attacks. These capabilities were presented at the DistribuTech Conference and Exhibition in March 2010. The project was hampered by an interruption of funding, due to continuing resolution issues and cost-share agreement delays, for four months in 2008.

  19. Abnormal traffic flow data detection based on wavelet analysis

    Directory of Open Access Journals (Sweden)

    Xiao Qian

    2016-01-01

    Full Text Available Because traffic flow data are non-stationary, abnormal data detection is difficult. This paper proposes abnormal traffic flow data detection based on wavelet analysis and the least squares method. Wavelet analysis is first used to separate the traffic flow data into high-frequency and low-frequency components, and the least squares method is then applied to find abnormal points in the reconstructed signal data. The simulation results show that detecting abnormal traffic flow data with this combination of wavelet analysis and least squares effectively reduces both the misjudgment rate and the false negative rate of the detection results.
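
    A minimal sketch of the two-step scheme with PyWavelets: the wavelet decomposition separates the smooth trend from the high-frequency detail, and outliers are flagged on the residual (a simple 3-sigma rule stands in here for the paper's least-squares step; data are simulated).

```python
import numpy as np
import pywt

flow = 100 + 20 * np.sin(np.linspace(0, 6 * np.pi, 288))  # daily flow profile
flow[140] += 80                                           # injected abnormal point

# Separate the low-frequency trend from the high-frequency components
coeffs = pywt.wavedec(flow, "db4", level=3)
coeffs[1:] = [np.zeros_like(c) for c in coeffs[1:]]       # drop detail bands
trend = pywt.waverec(coeffs, "db4")[: flow.size]          # reconstructed signal

residual = flow - trend
anomalies = np.where(np.abs(residual) > 3 * residual.std())[0]
print(anomalies)                                          # indices near 140
```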

  20. Infective endocarditis detection through SPECT/CT images digital processing

    Science.gov (United States)

    Moreno, Albino; Valdés, Raquel; Jiménez, Luis; Vallejo, Enrique; Hernández, Salvador; Soto, Gabriel

    2014-03-01

    Infective endocarditis (IE) is a difficult-to-diagnose pathology, since its manifestation in patients is highly variable. In this work, a semiautomatic algorithm based on digital processing of SPECT images was proposed for the detection of IE, using a CT image volume as a spatial reference. The heart/lung rate was calculated using the SPECT image information. There were no statistically significant differences between the heart/lung rate values of a group of patients diagnosed with IE (2.62+/-0.47) and a group of healthy or control subjects (2.84+/-0.68). However, it is necessary to increase the study sample of both the individuals diagnosed with IE and the control group subjects, as well as to improve the image quality.

  1. Damage Detection in Composite Structures with Wavenumber Array Data Processing

    Science.gov (United States)

    Tian, Zhenhua; Leckey, Cara; Yu, Lingyu

    2013-01-01

    Guided ultrasonic waves (GUW) have the potential to be an efficient and cost-effective method for rapid damage detection and quantification of large structures. Attractive features include sensitivity to a variety of damage types and the capability of traveling relatively long distances. They have proven to be an efficient approach for crack detection and localization in isotropic materials. However, techniques must be pushed beyond isotropic materials in order to be valid for composite aircraft components. This paper presents our study on GUW propagation and interaction with delamination damage in composite structures using wavenumber array data processing, together with advanced wave propagation simulations. The parallel elastodynamic finite integration technique (EFIT) is used for the example simulations. A multi-dimensional Fourier transform is used to convert time-space wavefield data into the frequency-wavenumber domain. Wave propagation in the wavenumber-frequency domain shows clear distinction among the guided wave modes that are present, which allows a guided wave mode to be extracted through filtering and reconstruction techniques. The presence of delamination causes corresponding spectral changes. Results from 3D CFRP guided wave simulations with delamination damage in flat-plate specimens are used to study wave interaction with the structural defect.
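
    The frequency-wavenumber conversion is a multi-dimensional FFT of the time-space wavefield, after which a band mask isolates one guided wave mode. A minimal sketch with one spatial dimension; the data file, sampling steps, and mode band are assumptions.

```python
import numpy as np

wavefield = np.load("wavefield_tx.npy")     # hypothetical array, shape (nt, nx)
dt, dx = 1e-7, 1e-3                         # assumed sampling steps (s, m)

# Time-space data -> frequency-wavenumber domain
FK = np.fft.fftshift(np.fft.fft2(wavefield))
f = np.fft.fftshift(np.fft.fftfreq(wavefield.shape[0], dt))   # Hz
k = np.fft.fftshift(np.fft.fftfreq(wavefield.shape[1], dx))   # 1/m

# Keep an assumed wavenumber band for the mode of interest, then reconstruct
mask = (np.abs(k)[None, :] > 50.0) & (np.abs(k)[None, :] < 200.0)
mode = np.fft.ifft2(np.fft.ifftshift(FK * mask)).real
```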

  2. Detecting geomorphic processes and change with high resolution topographic data

    Science.gov (United States)

    Mudd, Simon; Hurst, Martin; Grieve, Stuart; Clubb, Fiona; Milodowski, David; Attal, Mikael

    2016-04-01

    The first global topographic dataset was released in 1996, with 1 km grid spacing. It is astonishing that in only 20 years we now have access to tens of thousands of square kilometres of LiDAR data at point densities greater than 5 points per square meter. This data represents a treasure trove of information that our geomorphic predecessors could only dream of. But what are we to do with this data? Here we explore the potential of high resolution topographic data to dig deeper into geomorphic processes across a wider range of landscapes and using much larger spatial coverage than previously possible. We show how this data can be used to constrain sediment flux relationships using relief and hillslope length, and how this data can be used to detect landscape transience. We show how the nonlinear sediment flux law, proposed for upland, soil mantled landscapes by Roering et al. (1999) is consistent with a number of topographic tests. This flux law allows us to predict how landscapes will respond to tectonic forcing, and we show how these predictions can be used to detect erosion rate perturbations across a range of tectonic settings.

  3. Natural Language Processing for Detecting Forward Reference in a Document

    Directory of Open Access Journals (Sweden)

    Daniel Siahaan

    2012-11-01

    Full Text Available Meyer’s seven sins have been recognized as types of mistakes that requirements specialists often fall into when specifying requirements. Such mistakes play a significant role in plunging a project into failure. Many researchers have focused on the ambiguity and contradiction types of mistakes; other types have been given less attention, yet they often happen in practice and may be equally costly as the first two. This paper introduces an approach to detect forward references. It traverses a requirements document, extracting and processing each statement. During statement extraction, any terms that reside in the statement are also extracted. Based on certain rules which utilize POS patterns, the statement is classified as a term definition or not. For each term definition, a term is added to a list of defined terms. At the same time, every time a new term is found in a statement, it is checked against the list of defined terms. If it is not found, then the requirements statement is classified as a statement with a forward reference. Experimentation on 30 requirements documents from various software project domains shows that the approach has almost perfect agreement with domain experts in detecting forward references, given a kappa index value of 0.83.
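
    The core bookkeeping of the approach is small: statements classified as term definitions grow a set of defined terms, and any statement that uses a term before its definition is flagged. A minimal sketch in which the POS-pattern classifier is replaced by a precomputed flag:

```python
defined, flagged = set(), []

# (statement text, terms found in it, classified-as-definition flag)
statements = [
    ("The system shall log every ADMIN_EVENT.", {"ADMIN_EVENT"}, False),
    ("An ADMIN_EVENT is any change made by an administrator.", {"ADMIN_EVENT"}, True),
]

for i, (text, terms, is_definition) in enumerate(statements):
    for term in terms:
        if not is_definition and term not in defined:
            flagged.append((i, term))   # term used before being defined
    if is_definition:
        defined.update(terms)

print(flagged)   # [(0, 'ADMIN_EVENT')] -> statement 0 has a forward reference
```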

  4. Radar signal analysis and processing using Matlab

    CERN Document Server

    Mahafza, Bassem R

    2008-01-01

    Offering radar-related software for the analysis and design of radar waveform and signal processing, this book provides comprehensive coverage of radar signals and signal processing techniques and algorithms. It contains numerous graphical plots, common radar-related functions, table format outputs, and end-of-chapter problems. The complete set of MATLAB® functions and routines are available for download online.

  5. Research on Defects Detection by Image Processing of Thermographic Images

    Directory of Open Access Journals (Sweden)

    Shrestha Ranjit

    2015-10-01

    Full Text Available This paper presents the results of an experimental investigation of thermal phenomena in a square (180 mm × 180 mm) STS 304 specimen with 10 mm thickness and artificial defects with circular cut-outs of varying depth and diameter on the back side. The material is tested by means of thermal wave thermography. Lock-in thermography is employed for the detection of defects. The temperature field of the front surface of the tested material is observed and analysed. A four-point correlation algorithm is applied to extract the phase angle of the thermal wave's harmonic component. Phase images are analyzed to find qualitative information about the defects. The phase contrast method was used for better identification and analysis of the existing defects in the specimen.
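
    With four frames S1..S4 sampled per modulation period, four-point lock-in correlation recovers the phase image as atan2(S1 - S3, S2 - S4). A minimal sketch with synthetic frames (real data would come from the infrared camera):

```python
import numpy as np

h, w = 64, 64
true_phase = np.linspace(0, np.pi / 2, h * w).reshape(h, w)  # synthetic ground truth
frames = [np.sin(2 * np.pi * t / 4 + true_phase) for t in range(4)]  # S1..S4

S1, S2, S3, S4 = frames
phase = np.arctan2(S1 - S3, S2 - S4)            # phase image, defect-sensitive
amplitude = 0.5 * np.hypot(S1 - S3, S2 - S4)    # amplitude image
```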

  6. A method for detecting damage to rolling bearings in toothed gears of processing lines

    OpenAIRE

    T. Figlus; M. Stańczyk

    2016-01-01

    This paper presents a method of diagnosing damage to rolling bearings in toothed gears of processing lines. The research has shown the usefulness of vibration signal measurements performed with a laser vibrometer and of the method of denoising signals by means of a discrete wavelet transform in detecting damage to bearings. The application of the method of analysis of the characteristic frequencies of changes in the vibration signal amplitude made it possible to draw conclusions about the typ...

  7. Air Conditioning Compressor Air Leak Detection by Image Processing Techniques for Industrial Applications

    Directory of Open Access Journals (Sweden)

    Pookongchai Kritsada

    2015-01-01

    Full Text Available This paper presents a method to detect air leakage of an air conditioning compressor using image processing techniques. A quality air conditioning compressor should not have air leakage. To test an air conditioning compressor for leaks, air is pumped into the compressor, which is then submerged into a water tank. If air bubbles occur at the surface of the air conditioning compressor, the leaking compressor must be returned for maintenance. In this work a new method to detect leakage and locate the leakage point with a fast, accurate, and precise process is proposed. In a preprocessing procedure to detect the air bubbles, threshold and median filter techniques are used. A connected component labeling technique detects the air bubbles, while blob analysis is the search technique used to analyze groups of air bubbles in sequential images. Experiments test the proposed algorithm's ability to determine the leakage point of an air conditioning compressor. The location of the leakage point is presented as a coordinate point. The results demonstrate that the leakage point can be accurately detected during the process, with the estimated point having an error of less than 5% compared to the real leakage point.
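
    The preprocessing and labeling chain maps directly onto OpenCV primitives; a minimal sketch, with the frame source, threshold value, and bubble size limits as assumptions:

```python
import cv2

frame = cv2.imread("tank_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical frame
frame = cv2.medianBlur(frame, 5)                            # suppress ripple noise
_, binary = cv2.threshold(frame, 200, 255, cv2.THRESH_BINARY)

# Connected component labeling, then blob statistics per candidate
n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
for i in range(1, n):                                       # label 0 is background
    area = stats[i, cv2.CC_STAT_AREA]
    if 5 < area < 500:                                      # plausible bubble size
        x, y = centroids[i]
        print(f"bubble candidate at ({x:.0f}, {y:.0f}), area={area}")
```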

  8. Detection limits in x-ray fluorescence analysis

    International Nuclear Information System (INIS)

    X-ray fluorescence spectrometry is a well established analytical technique for elemental analysis of solids, powders, or liquids. This extended abstract briefly discusses the detection limits or sensitivity of x-ray spectrometers in x-ray fluorescence analysis

  9. LRS data processing methods for detection of lunar subsurface echoes

    Science.gov (United States)

    Oshigami, Shoko; Mochizuki, Kengo; Watanabe, Shiho; Watanabe, Toshiki; Yamaguchi, Yasushi; Yamaji, Atsushi; Ono, Takayuki; Kumamoto, Atsushi; Nakagawa, Hiromu; Kobayashi, Takao; Kasahara, Yoshiya

    Lunar Radar Sounder (LRS) is an instrument for one of fifteen science missions of SELENE (KAGUYA). LRS is a ground-penetrating FM-CW radar system of the HF band. LRS detects echoes reflected from subsurface discontinuities where the dielectric constants of the rocks change. The range resolution of LRS is 75 m in free space, whereas the sampling interval in the flight direction is about 75 m when the spacecraft altitude is 100 km. The primary objective of LRS is to investigate lunar subsurface structures. We plan to perform global soundings by LRS to contribute to studying the evolution of the Moon. In this presentation, we introduce the techniques used to process LRS data to produce data products and to detect subsurface echoes. We have two standard data products of LRS under consideration. The time series of ‘A-scope’ data, where an A-scope is a plot of signal power spectrum as a function of range derived from the waveform data, is called a ‘B-scan’. Because the LRS instrument changes the timing of data recording (measurement delay time) according to the predicted distance between the KAGUYA spacecraft and the lunar surface, the observation range with respect to the spacecraft varies from pulse to pulse. In addition, the flight altitude of KAGUYA changes in the range of several tens of kilometers. Therefore a trace of surface nadir echoes in unprocessed B-scan images does not correspond to actual lunar topography. We corrected for variations of the measurement delay time and flight altitude of KAGUYA to produce a B-scan data product with the original spatial resolution (BScan high) and a reduced spatial resolution product (BScan low), both in the PDS format. The echo signals in A-scope data might be classified into the following categories: (1) a surface nadir echo, (2) surface off-nadir backscattering echoes, and (3) subsurface echoes. The most intense signal usually comes from the nadir point when KAGUYA is flying over a level surface. The A-scope data also include various noises resulting from, for example

  10. Does facial processing prioritize change detection?: change blindness illustrates costs and benefits of holistic processing.

    Science.gov (United States)

    Wilford, Miko M; Wells, Gary L

    2010-11-01

    There is broad consensus among researchers both that faces are processed more holistically than other objects and that this type of processing is beneficial. We predicted that holistic processing of faces also involves a cost, namely, a diminished ability to localize change. This study (N = 150) utilized a modified change-blindness paradigm in which some trials involved a change in one feature of an image (nose, chin, mouth, hair, or eyes for faces; chimney, porch, window, roof, or door for houses), whereas other trials involved no change. People were better able to detect the occurrence of a change for faces than for houses, but were better able to localize which feature had changed for houses than for faces. Half the trials used inverted images, a manipulation that disrupts holistic processing. With inverted images, the critical interaction between image type (faces vs. houses) and task (change detection vs. change localization) disappeared. The results suggest that holistic processing reduces change-localization abilities. PMID:20935169

  11. Extending TOPS: Knowledge Management System for Anomaly Detection and Analysis

    Science.gov (United States)

    Votava, P.; Nemani, R. R.; Michaelis, A.

    2009-12-01

    Terrestrial Observation and Prediction System (TOPS) is a flexible modeling software system that integrates ecosystem models with frequent satellite and surface weather observations to produce ecosystem nowcasts (assessments of current conditions) and forecasts useful in natural resources management, public health, and disaster management. We have been extending TOPS to include a capability for automated anomaly detection and analysis of both on-line (streaming) and off-line data. While there are large numbers of anomaly detection algorithms for multivariate datasets, we are extending this capability beyond anomaly detection itself, towards an automated analysis that discovers the possible causes of the anomalies. There are often indirect connections between datasets that manifest themselves during the occurrence of external events; rather than searching exhaustively throughout all the datasets, our goal is to capture this knowledge and provide it to the system during automated analysis. This results in more efficient processing, since we do not need to run the original anomaly detection algorithms, which are often compute-intensive, over all the datasets. We also achieve data reduction, as we do not need to store all the datasets in order to search for possible connections; instead, we can download selected data on demand based on our analysis. For example, an anomaly observed in vegetation Net Primary Production (NPP) can relate to an anomaly in vegetation Leaf Area Index (LAI), which is a fairly direct connection, as LAI is one of the inputs for NPP; however, the change in LAI could in turn be caused by a fire event, which is not directly connected with NPP. Because we are able to capture this knowledge, we can analyze fire datasets, and if there is a match with the NPP anomaly, we can infer that a fire is a likely cause. The knowledge is captured using the OWL ontology language, where connections are defined in a schema
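
    The abstract does not show the captured knowledge itself; as a hypothetical sketch, the OWL-encoded connections can be mocked as a plain adjacency map that is walked outward from the anomalous variable, fetching and testing only connected datasets on demand (the data-reduction point made above). All names below are illustrative.

      from collections import deque

      # "X can be influenced by Y" edges captured from domain knowledge.
      INFLUENCES = {
          "NPP": ["LAI", "temperature", "precipitation"],
          "LAI": ["fire", "insect_damage", "precipitation"],
      }

      def candidate_causes(anomalous_var, matches_anomaly, max_depth=3):
          """Breadth-first walk of the knowledge graph; matches_anomaly(var)
          lazily downloads a dataset and tests for a co-occurring anomaly."""
          causes, seen = [], {anomalous_var}
          queue = deque([(anomalous_var, 0)])
          while queue:
              var, depth = queue.popleft()
              if depth == max_depth:
                  continue
              for upstream in INFLUENCES.get(var, []):
                  if upstream not in seen:
                      seen.add(upstream)
                      if matches_anomaly(upstream):   # on-demand check only
                          causes.append(upstream)
                      queue.append((upstream, depth + 1))
          return causes   # e.g. ["fire"] for an NPP anomaly reached via LAI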

  12. Mechanism Analysis of CsI Scintillation Crystal in the Photoelectric Detection Process

    Institute of Scientific and Technical Information of China (English)

    王菊霞

    2015-01-01

    The fundamental principle of photoelectric detection is outlined, and the mechanism of CsI in the photoelectric detection process is analyzed from the perspectives of electron emission, energy distribution, and the photoelectric effect. The study shows that at an incident light intensity of about 0.1 MW/cm2, a two-photon photoelectric effect arises in CsI. CsI crystal is one of the better scintillators in widespread use at present; it can convert X-rays directly into visible light, yielding digital images and thereby achieving the goal of photoelectric detection.

  13. BUSINESS PROCESS MANAGEMENT SYSTEMS TECHNOLOGY COMPONENTS ANALYSIS

    Directory of Open Access Journals (Sweden)

    Andrea Giovanni Spelta

    2007-05-01

    Full Text Available The information technology that supports the implementation of the business process management approach is called a Business Process Management System (BPMS). The main components of the BPMS solution framework are the process definition repository, process instances repository, transaction manager, connectors framework, process engine, and middleware. In this paper we define and characterize the role and importance of the components of the BPMS framework. The research method adopted was the case study, through the analysis of the implementation of a BPMS solution in an insurance company called Chubb do Brasil. In the case study, the process "Manage Coinsured Events" is described and characterized, as well as the components of the BPMS solution adopted and implemented by Chubb do Brasil for managing this process.

  14. CPAS Preflight Drop Test Analysis Process

    Science.gov (United States)

    Englert, Megan E.; Bledsoe, Kristin J.; Romero, Leah M.

    2015-01-01

    Throughout the Capsule Parachute Assembly System (CPAS) drop test program, the CPAS Analysis Team has developed a simulation and analysis process to support drop test planning and execution. This process includes multiple phases focused on developing test simulations and communicating results to all groups involved in the drop test. CPAS Engineering Development Unit (EDU) series drop test planning begins with the development of a basic operational concept for each test. Trajectory simulation tools include the Flight Analysis and Simulation Tool (FAST) for single bodies, and the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation for the mated vehicle. Results are communicated to the team at the Test Configuration Review (TCR) and Test Readiness Review (TRR), as well as at Analysis Integrated Product Team (IPT) meetings in earlier and intermediate phases of the pre-test planning. The ability to plan and communicate efficiently with rapidly changing objectives and tight schedule constraints is a necessity for safe and successful drop tests.

  15. Planar Inlet Design and Analysis Process (PINDAP)

    Science.gov (United States)

    Slater, John W.; Gruber, Christopher R.

    2005-01-01

    The Planar Inlet Design and Analysis Process (PINDAP) is a collection of software tools that allow the efficient aerodynamic design and analysis of planar (two-dimensional and axisymmetric) inlets. The aerodynamic analysis is performed using the Wind-US computational fluid dynamics (CFD) program. A major element in PINDAP is a Fortran 90 code, also named PINDAP, that establishes the parametric design of the inlet and efficiently models the geometry and generates the grid for CFD analysis as the design parameters change. The use of PINDAP is demonstrated for subsonic, supersonic, and hypersonic inlets.

  16. Obfuscated Malicious Code Detection with Path Condition Analysis

    OpenAIRE

    Wenqing Fan; Xue Lei; Jing An

    2014-01-01

    Code obfuscation is one of the main methods used to hide malicious code. This paper proposes a new dynamic method which can effectively detect obfuscated malicious code. This method uses ISR to conduct dynamic debugging. Constraint solving during the debugging process can detect deeply hidden malicious code by covering different execution paths. Besides, for malicious code that reads external resources, abnormal behaviors can usually only be detected by taking the resources into c...

  17. Method for detecting software anomalies based on recurrence plot analysis

    Directory of Open Access Journals (Sweden)

    Michał Mosdorf

    2012-03-01

    Full Text Available The presented paper evaluates a method for detecting software anomalies based on recurrence plot analysis of the trace log generated by software execution. The described method is based on windowed recurrence quantification analysis for selected measures (e.g. Recurrence rate - RR or Determinism - DET). Initial results show that the proposed method is useful in detecting silent software anomalies that do not result in typical crashes (e.g. exceptions.
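
    A minimal sketch of windowed recurrence quantification analysis with the two measures named above (Recurrence rate and Determinism); the recurrence threshold, minimum diagonal line length, and window sizes are illustrative assumptions, and the main diagonal is kept for simplicity.

      import numpy as np

      def rqa_measures(x, eps=0.1, lmin=2):
          """RR and DET of one series segment via its recurrence matrix."""
          x = np.asarray(x, dtype=float)
          R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)
          n = len(x)
          rr = R.sum() / n ** 2
          diag_points = 0            # recurrence points on diagonals >= lmin
          for k in range(-(n - 1), n):
              run = 0
              for v in list(np.diagonal(R, k)) + [0]:   # sentinel flushes last run
                  if v:
                      run += 1
                  else:
                      if run >= lmin:
                          diag_points += run
                      run = 0
          det = diag_points / R.sum() if R.sum() else 0.0
          return rr, det

      def windowed_rqa(trace, win=200, step=50):
          """Slide a window over a numeric execution-trace metric; silent
          anomalies show up as sudden shifts in RR or DET."""
          return [rqa_measures(trace[i:i + win])
                  for i in range(0, len(trace) - win + 1, step)]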

  18. Knee joint vibroarthrographic signal processing and analysis

    CERN Document Server

    Wu, Yunfeng

    2015-01-01

    This book presents the cutting-edge technologies of knee joint vibroarthrographic signal analysis for the screening and detection of knee joint injuries. It describes a number of effective computer-aided methods for analysis of the nonlinear and nonstationary biomedical signals generated by complex physiological mechanics. This book also introduces several popular machine learning and pattern recognition algorithms for biomedical signal classifications. The book is well-suited for all researchers looking to better understand knee joint biomechanics and the advanced technology for vibration arthrometry. Dr. Yunfeng Wu is an Associate Professor at the School of Information Science and Technology, Xiamen University, Xiamen, Fujian, China.

  19. Scaling detection in time series: diffusion entropy analysis.

    Science.gov (United States)

    Scafetta, Nicola; Grigolini, Paolo

    2002-09-01

    The methods currently used to determine the scaling exponent of a complex dynamic process described by a time series are based on the numerical evaluation of variance. This means that all of them can be safely applied only to the case where ordinary statistical properties hold true even if strange kinetics are involved. We illustrate a method of statistical analysis based on the Shannon entropy of the diffusion process generated by the time series, called diffusion entropy analysis (DEA). We adopt artificial Gauss and Lévy time series, as prototypes of ordinary and anomalous statistics, respectively, and we analyze them with the DEA and four ordinary methods of analysis, some of which are very popular. We show that the DEA determines the correct scaling exponent even when the statistical properties, as well as the dynamic properties, are anomalous. The other four methods produce correct results in the Gauss case but fail to detect the correct scaling in the case of Lévy statistics. PMID:12366207
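
    A minimal sketch of DEA under common assumptions: overlapping windows of length t are summed to form the diffusion variable, its probability density is histogrammed, and the scaling exponent delta is read off as the slope of the Shannon entropy S(t) against ln t (delta = 0.5 for ordinary Gaussian diffusion).

      import numpy as np

      def dea_entropy(xi, t_values, bins=64):
          """S(t): Shannon entropy of the t-step diffusion displacements."""
          csum = np.concatenate(([0.0], np.cumsum(xi)))
          S = []
          for t in t_values:
              x = csum[t:] - csum[:-t]        # all overlapping t-step sums
              p, edges = np.histogram(x, bins=bins, density=True)
              dx = edges[1] - edges[0]
              p = p[p > 0]
              S.append(-np.sum(p * np.log(p)) * dx)
          return np.asarray(S)

      # Usage: for Gaussian white-noise increments the fitted slope
      # approaches the ordinary scaling delta = 0.5.
      rng = np.random.default_rng(0)
      xi = rng.normal(size=20000)
      ts = np.unique(np.logspace(0.5, 3, 20).astype(int))
      delta = np.polyfit(np.log(ts), dea_entropy(xi, ts), 1)[0]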

  20. Real Time Intelligent Target Detection and Analysis with Machine Vision

    Science.gov (United States)

    Howard, Ayanna; Padgett, Curtis; Brown, Kenneth

    2000-01-01

    We present an algorithm for detecting a specified set of targets for an Automatic Target Recognition (ATR) application. ATR involves processing images for detecting, classifying, and tracking targets embedded in a background scene. We address the problem of discriminating between targets and nontarget objects in a scene by evaluating 40x40 image blocks belonging to an image. Each image block is first projected onto a set of templates specifically designed to separate images of targets embedded in a typical background scene from those background images without targets. These filters are found using directed principal component analysis which maximally separates the two groups. The projected images are then clustered into one of n classes based on a minimum distance to a set of n cluster prototypes. These cluster prototypes have previously been identified using a modified clustering algorithm based on prior sensed data. Each projected image pattern is then fed into the associated cluster's trained neural network for classification. A detailed description of our algorithm will be given in this paper. We outline our methodology for designing the templates, describe our modified clustering algorithm, and provide details on the neural network classifiers. Evaluation of the overall algorithm demonstrates that our detection rates approach 96% with a false positive rate of less than 0.03%.

  1. A Systematic Analysis of Coal Accumulation Process

    Institute of Scientific and Technical Information of China (English)

    CHENG Aiguo

    2008-01-01

    Formation of coal seams and coal-rich zones is an integrated result of a series of factors in the coal accumulation process. The coal accumulation system is an architectural aggregation of coal accumulation factors. It can be classified into 4 levels: the global coal accumulation super-system, the coal accumulation domain mega-system, the coal accumulation basin system, and the coal seam or coal seam set sub-system. The coal accumulation process is an open, dynamic, grey system, characterized by aggregation, relevance, wholeness, purpose-orientation, hierarchy, and adaptability to its environment. In this paper, we take the coal accumulation process as a system to study the origin of coal seams and coal-rich zones, and we discuss a methodology for the systematic analysis of the coal accumulation process. As an example, the Ordos coal basin was investigated to elucidate the application of the method of coal accumulation system analysis.

  2. Exergetic analysis of human & natural processes

    CERN Document Server

    Banhatti, Dilip G

    2011-01-01

    Using the concept of available work or exergy, each human and natural process can be characterized by its contextual efficiency, that is, efficiency with the environment as a reference. Such an efficiency is termed exergy efficiency. Parts of the process which need to be made more efficient and less wasteful stand out in such an analysis, in contrast to an energy analysis. Any new idea for a process can be similarly characterized. This exercise naturally generates paths to newer ideas in given contexts to maximize exergy efficiency. The contextual efficiency is not just output/input; it also naturally includes environmental impact (to be minimized) and any other relevant parameter(s) to be optimized. Natural life processes in different terrestrial environments are already optimized for their environments and act as guides, for example, in seeking to evolve sustainable energy practices in different contexts. Energy use at the lowest possible temperature for each situation is a natural result. Variety of renewab...

  3. KNOWLEDGE PROCESS ANALYSIS:FRAMEWORK AND EXPERIENCE

    Institute of Scientific and Technical Information of China (English)

    Kozo SUGIYAMA; Bertolt MEYER

    2008-01-01

    We present a step-by-step approach for constructing a framework for knowledge process analysis (KPA). We intend to apply this framework to the analysis of our own research projects in an exploratory way and to elaborate it through the accumulation of case studies. This study is based on a methodology consisting of knowledge process modeling, primitives synthesis, and reflective verification. We describe details of the methodology and present the results of case studies: a novel methodology, a practical work guide, and a tool for KPA; insights for improving future research projects and education; and the integration of existing knowledge creation theories.

  4. Parallel processing of structural integrity analysis codes

    International Nuclear Information System (INIS)

    Structural integrity analysis plays an important role in assessing and demonstrating the safety of nuclear reactor components. This analysis is performed using analytical tools such as the Finite Element Method (FEM) with the help of digital computers. The complexity of the problems involved in nuclear engineering demands high speed computation facilities to obtain solutions in a reasonable amount of time. Parallel processing systems such as ANUPAM provide an efficient platform for realising such high speed computation. The development and implementation of software on parallel processing systems is an interesting and challenging task. The data and algorithm structure of the codes plays an important role in exploiting the parallel processing system capabilities. Structural analysis codes based on FEM can be divided into two categories with respect to their implementation on parallel processing systems. Codes in the first category, such as those used for harmonic analysis and mechanistic fuel performance codes, do not require parallelisation of individual modules. Codes in the second category, such as conventional FEM codes, require parallelisation of individual modules; in this category, parallelisation of the equation solution module poses major difficulties. Different solution schemes such as the domain decomposition method (DDM), a parallel active column solver, and a substructuring method are currently used on parallel processing systems. Two codes, FAIR and TABS, belonging to each of these categories respectively, have been implemented on ANUPAM. The implementation details of these codes and the performance of different equation solvers are highlighted. (author). 5 refs., 12 figs., 1 tab

  5. Sneak analysis applied to process systems

    Science.gov (United States)

    Whetton, Cris

    Traditional safety analyses, such as HAZOP, FMEA, FTA, and MORT, are less than effective at identifying hazards resulting from incorrect 'flow' - whether this be flow of information, actions, electric current, or even the literal flow of process fluids. Sneak Analysis (SA) has existed since the mid nineteen-seventies as a means of identifying such conditions in electric circuits; in which area, it is usually known as Sneak Circuit Analysis (SCA). This paper extends the ideas of Sneak Circuit Analysis to a general method of Sneak Analysis applied to process plant. The methods of SA attempt to capitalize on previous work in the electrical field by first producing a pseudo-electrical analog of the process and then analyzing the analog by the existing techniques of SCA, supplemented by some additional rules and clues specific to processes. The SA method is not intended to replace any existing method of safety analysis; instead, it is intended to supplement such techniques as HAZOP and FMEA by providing systematic procedures for the identification of a class of potential problems which are not well covered by any other method.

  6. Analysis of Spatial Data Structures for Proximity Detection

    Institute of Scientific and Technical Information of China (English)

    Anupreet Walia; Jochen Teizer

    2008-01-01

    Construction is a dangerous business. According to statistics, in each of the past thirteen years more than 1000 workers died in the USA construction industry. In order to minimize the overall number of these incidents, the research presented in this paper monitors and analyzes the trajectories of construction resources, first in a simulated environment and later on the actual job site. Due to the complex nature of the construction environment, three-dimensional (3D) positioning data of workers is hardly ever collected. Although technology is available that allows tracking construction assets in real-time, indoors and outdoors, in 3D, the continuously changing spatial and temporal arrangement of job sites requires any successfully working data processing system to operate in real-time. The focus of this research paper is safety, and in particular spatial data structures that offer the capability of realigning themselves and reporting the distance to the closest neighbor in real-time. This paper presents results of simulations that allow the processing of real-time location data for collision detection and proximity analysis. The presented data structures and the performance results of the developed algorithms demonstrate that real-time tracking and proximity detection of resources is feasible.
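
    The paper does not commit to one specific structure; as an illustrative stand-in, a k-d tree rebuilt each tick (the "realigning" step) supports the proximity queries described above. The radius and coordinates below are assumptions.

      import numpy as np
      from scipy.spatial import cKDTree

      def proximity_alerts(positions, danger_radius=5.0):
          """positions: (n_resources, 3) current 3D coordinates. Rebuild
          the tree for this tick and report all pairs of resources closer
          than the danger radius."""
          tree = cKDTree(positions)
          return tree.query_pairs(r=danger_radius)

      # Usage with one simulated frame of 100 tracked resources:
      rng = np.random.default_rng(1)
      frame = rng.uniform(0, 50, size=(100, 3))
      for i, j in sorted(proximity_alerts(frame)):
          print(f"resource {i} within 5 m of resource {j}")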

  7. Probing Interfacial Processes on Graphene Surface by Mass Detection

    Science.gov (United States)

    Kakenov, Nurbek; Kocabas, Coskun

    2013-03-01

    In this work we studied the mass density of graphene, probed interfacial processes on the graphene surface, and examined the formation of graphene oxide by mass detection. The graphene layers were synthesized by the chemical vapor deposition method on copper foils and transfer-printed onto a quartz crystal microbalance (QCM). The mass density of single-layer graphene was measured by investigating the mechanical resonance of the QCM. Moreover, we extended the developed technique to probe the binding dynamics of proteins on the surface of graphene and were able to obtain the nonspecific binding constant of BSA protein on the graphene surface in aqueous solution. The time trace of the resonance signal showed that the BSA molecules rapidly saturate the available binding sites on the graphene surface. Furthermore, we monitored oxidation of the graphene surface under oxygen plasma by tracing the changes of the interfacial mass of the graphene, cross-checked against the shifts in Raman spectra. Three regimes were observed: the formation of graphene oxide, which increases the interfacial mass; the release of carbon dioxide; and the removal of small graphene/graphene oxide flakes. Scientific and Technological Research Council of Turkey (TUBITAK) grant no. 110T304, 109T209, Marie Curie International Reintegration Grant (IRG) grant no. 256458, Turkish Academy of Science (TUBA-Gebip).
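
    The abstract does not quote the working equation; the standard Sauerbrey relation, a textbook QCM result rather than anything specific to this study, links the measured resonance shift to the areal mass of a thin rigid adsorbed layer:

      \Delta f \;=\; -\,\frac{2 f_{0}^{2}}{A\,\sqrt{\rho_{q}\,\mu_{q}}}\;\Delta m

    where f0 is the fundamental resonance frequency, A the active electrode area, and rho_q and mu_q the density and shear modulus of quartz. Monitoring Delta f during deposition, protein binding, or plasma oxidation then yields the interfacial mass changes discussed above.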

  8. Bisous model - detecting filamentary patterns in point processes

    CERN Document Server

    Tempel, E; Kipper, R; Saar, E

    2016-01-01

    The cosmic web is a highly complex geometrical pattern, with galaxy clusters at the intersection of filaments and filaments at the intersection of walls. Identifying and describing the filamentary network is not a trivial task due to the overwhelming complexity of the structure, its connectivity and the intrinsic hierarchical nature. To detect and quantify galactic filaments we use the Bisous model, which is a marked point process built to model multi-dimensional patterns. The Bisous filament finder works directly with the galaxy distribution data and the model intrinsically takes into account the connectivity of the filamentary network. The Bisous model generates the visit map (the probability to find a filament at a given point) together with the filament orientation field. Using these two fields, we can extract filament spines from the data. Together with this paper we publish the computer code for the Bisous model that is made available in GitHub. The Bisous filament finder has been successfully used in s...

  9. Non-Harmonic Fourier Analysis for bladed wheels damage detection

    Science.gov (United States)

    Neri, P.; Peeters, B.

    2015-11-01

    The interaction between bladed wheels and the fluid distributed by the stator vanes results in cyclic loading of the rotating components. Compressor and turbine wheels are subject to vibration and fatigue issues, especially when resonance conditions are excited. Even if resonance conditions can often be predicted and avoided, high cycle fatigue failures can occur, causing safety issues and economic loss. Rigorous maintenance programs are then needed, forcing expensive system shut-downs. Blade crack detection methods are therefore beneficial for condition-based maintenance. While contact measurement systems are not always usable in operating conditions (e.g. high temperature), non-contact methods can be more suitable. One (or more) stator-fixed sensor can measure all the blades as they pass by, in order to detect the damaged ones. The main drawback in this situation is the short acquisition time available for each blade, which is further shortened by the high rotational speed of the components. A traditional Discrete Fourier Transform (DFT) analysis would result in a poor frequency resolution. A Non-Harmonic Fourier Analysis (NHFA) can instead be executed with an arbitrary frequency resolution, allowing frequency information to be obtained even from short-time data samples. This paper shows an analytical investigation of the NHFA method. A data processing algorithm is then proposed to obtain frequency shift information from short time samples. The performance of this algorithm is then studied by experimental and numerical tests.
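
    A minimal sketch of the arbitrary-resolution evaluation: the short per-blade record is correlated with complex exponentials on a frequency grid far finer than the DFT bin spacing fs/N. The grid, sampling rate, and test signal below are illustrative assumptions.

      import numpy as np

      def nhfa(x, fs, freqs):
          """Fourier coefficients of a short record at arbitrary
          (non-bin) frequencies."""
          n = np.arange(len(x))
          basis = np.exp(-2j * np.pi * np.outer(freqs, n) / fs)
          return basis @ x / len(x)

      # Usage: resolve a resonance from a 200-sample snapshot whose DFT
      # bin spacing would be fs/200 = 50 Hz.
      fs = 10000.0
      t = np.arange(200) / fs
      x = np.sin(2 * np.pi * 1234.0 * t)
      grid = np.arange(1200.0, 1300.0, 0.5)                # 0.5 Hz steps
      f_peak = grid[np.argmax(np.abs(nhfa(x, fs, grid)))]  # close to 1234 Hz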

  10. Integrated polymer waveguides for absorbance detection in chemical analysis systems

    DEFF Research Database (Denmark)

    Mogensen, Klaus Bo; El-Ali, Jamil; Wolff, Anders;

    2003-01-01

    A chemical analysis system for absorbance detection with integrated polymer waveguides is reported for the first time. The fabrication procedure relies on structuring of a single layer of the photoresist SU-8, so both the microfluidic channel network and the optical components, which include planar waveguides and fiber-to-waveguide coupler structures, are defined in the same processing step. This results in self-alignment of all components and enables a fabrication and packaging time of only one day. The fabrication scheme has recently been presented elsewhere for fluorescence excitation of beads; here the system was characterized by absorbance measurements of the dye Bromothymol Blue. The influence of three different bonding procedures on the spectrally resolved propagation loss of the integrated waveguides between 500 nm and 900 nm was furthermore determined.

  11. Experimental exploration to thermal infrared imaging for detecting the transient process of solid impact

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Based on the analysis and comparison of stress pattern analysis by thermal emission (SPATE) and remote sensing rock mechanics (RSRM), the idea of detecting the transient process of solid impact with thermal infrared (TIR) imaging technology is introduced. By means of the TVS-8100MKII TIR imaging system, which has a high recording speed, high spatial resolution, and high temperature sensitivity, TIR imaging experiments on a free-falling steel ball impacting marble, granite, concrete, steel, organic-glass, and wood plates were conducted. It was discovered that: (i) the target's TIR temperature increases remarkably after impact; (ii) when the ball's size is not changed, the variation amplitude of the target's TIR temperature increases proportionally with the ball's potential energy or falling height; (iii) the variation amplitude of the target's TIR temperature depends on the material type and the surface finish of the target, with the amplitudes ordered as concrete, unpolished marble, steel plate, wood plate, polished granite, polished marble, and organic-glass plate; and (iv) the TIR radiation of brittle targets decreases gradually after impact, while plastic targets show delayed TIR radiation strengthening. It is deduced that once the relational functions and technical parameters, which depend on the particular impact body and target material, are established through experimental study, remote detection and back analysis based on TIR imaging of the transient process of solid impact will be feasible. Besides, this also has important scientific meaning for the study of precursor mechanics and for satellite TIR detection and prediction of tectonic earthquakes.

  12. Detection and Analysis of Solar Eclipse

    CERN Document Server

    Sridhar, Sarrvesh Seethapuram; Jackson, I Kenny; Kannan, P

    2012-01-01

    We propose an algorithm that can be used by amateur astronomers to analyze the images acquired during solar eclipses. The proposed algorithm analyzes the image, detects the eclipse and produces results for parameters like magnitude of eclipse, eclipse obscuration and the approximate distance between the Earth and the Moon.

  13. Trace analysis for 300 MM wafers and processes with TXRF

    International Nuclear Information System (INIS)

    Efficient fabrication of semiconductor devices goes hand in hand with an increasing size of silicon wafers. The contamination level of processes, media, and equipment has to decrease continuously. A new test laboratory for 300 mm wafers was installed in view of the above-mentioned aspects. Aside from numerous processing tools, this platform comprises electrical test methods, particle detection, vapor phase decomposition (VPD) preparation, and TXRF. The equipment is installed in a cleanroom. It is common to perform process or equipment control, development, evaluation, and qualification with monitor wafers. The evaluation and qualification of 300 mm equipment require direct TXRF on 300 mm wafers. A new TXRF setup was installed to handle the 300 mm wafer size. The 300 mm TXRF is equipped with tungsten and molybdenum anodes. This combination allows sensitive detection of elements with fluorescence energies below 10 keV using tungsten excitation, while molybdenum excitation enables the detection of a wide variety of elements. The detection sensitivity for tungsten-anode-excited samples is ten times higher than for samples measured with the molybdenum anode. The system is calibrated with 1 ng Ni. This calibration is stable to within 5% when monitored to control system stability, and decreasing the amount of Ni results in a linear decrease of the measured Ni signal. This result is verified for a range of elements using multielement samples. New designs demand new processes and materials, e.g. ferroelectric layers and copper. The trace analysis of many of these materials is supported by the higher excitation energy of the molybdenum anode. Reclaim and recycling of 300 mm wafers demand accurate contamination control of the processes to avoid cross contamination. Polishing or etching results in modified surfaces. TXRF as a non-destructive test method allows the simultaneous detection of a variety of elements on differing surfaces in view of contamination control and process

  14. Applications of random process excursion analysis

    CERN Document Server

    Brainina, Irina S

    2013-01-01

    This book addresses one of the key problems in signal processing: identifying the statistical properties of excursions in a random process in order to simplify the theoretical analysis and make it suitable for engineering applications. Precise and approximate formulas are explained which are relatively simple and can be used for engineering applications such as the design of devices that can overcome the high initial uncertainty of the self-training period. The information presented in the monograph can be used to implement adaptive signal processing devices capable of d

  15. Fault Detection and Diagnosis in Process Data Using Support Vector Machines

    Directory of Open Access Journals (Sweden)

    Fang Wu

    2014-01-01

    Full Text Available For complex industrial processes, it has become increasingly challenging to effectively diagnose complicated faults. In this paper, a combined approach using the original Support Vector Machine (SVM) and Principal Component Analysis (PCA) is provided to carry out fault classification, and its result is compared with that based on the SVM-RFE (Recursive Feature Elimination) method. RFE is used for feature extraction, and PCA is utilized to project the original data onto a lower-dimensional space. The PCA T2 and SPE statistics and the original SVM are proposed to detect the faults. Some common faults of the Tennessee Eastman Process (TEP) are analyzed in terms of the practical system and their reflections in the dataset. PCA-SVM and SVM-RFE can effectively detect and diagnose these common faults. In the RFE algorithm, all variables are ordered decreasingly according to their contributions. The classification accuracy rate is improved by choosing a reasonable number of features.
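
    A minimal sketch of the two monitoring statistics named above, Hotelling's T2 in the retained principal subspace and SPE (Q) in the residual subspace, trained on normal operating data. Empirical percentile limits stand in for the usual chi-square/F-distribution control limits.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      def fit_monitor(X_normal, n_components=10):
          """Fit scaling and PCA on fault-free training data only."""
          scaler = StandardScaler().fit(X_normal)
          pca = PCA(n_components=n_components).fit(scaler.transform(X_normal))
          return scaler, pca

      def t2_spe(X, scaler, pca):
          """Per-sample T2 (score space) and SPE (residual space)."""
          Z = scaler.transform(X)
          scores = pca.transform(Z)
          t2 = np.sum(scores ** 2 / pca.explained_variance_, axis=1)
          resid = Z - pca.inverse_transform(scores)
          spe = np.sum(resid ** 2, axis=1)
          return t2, spe

      # Usage: flag test samples whose T2 or SPE exceeds the 99th
      # percentile of the normal training data, e.g.
      # scaler, pca = fit_monitor(X_train)
      # t2_lim, spe_lim = (np.percentile(s, 99) for s in t2_spe(X_train, scaler, pca))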

  16. Detrended Fluctuation Analysis of Autoregressive Processes

    CERN Document Server

    Morariu, V V; Vamos, C; Soltuz, S

    2007-01-01

    Autoregressive (AR) processes typically have short-range memory. Detrended Fluctuation Analysis (DFA) was originally designed to reveal long-range correlation in non-stationary processes. However, DFA can also be regarded as a suitable method to investigate both long-range and short-range correlation in non-stationary and stationary systems. Applying DFA to AR processes can help in understanding the non-uniform correlation structure of such processes. We systematically investigated a first-order autoregressive model AR(1) by DFA and established the relationship between the interaction constant of AR(1) and the DFA correlation exponent. The higher the interaction constant, the higher the short-range correlation exponent; they are exponentially related. The investigation was extended to AR(2) processes. The presence of a distant positive interaction, in addition to a nearby interaction, will increase the correlation exponent and the range of correlation, while the effect of a distant negative interaction will decrease...
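
    A minimal DFA-1 sketch applied to an AR(1) series as in the study above: integrate the series, detrend linearly in windows of size s, and read the exponent from the log-log slope of the fluctuation function F(s). The scales and AR coefficient are illustrative.

      import numpy as np

      def dfa(x, scales):
          y = np.cumsum(x - np.mean(x))          # profile (integrated series)
          F = []
          for s in scales:
              segs = len(y) // s
              ms = []
              for i in range(segs):
                  seg = y[i * s:(i + 1) * s]
                  t = np.arange(s)
                  trend = np.polyval(np.polyfit(t, seg, 1), t)   # linear detrend
                  ms.append(np.mean((seg - trend) ** 2))
              F.append(np.sqrt(np.mean(ms)))
          return np.asarray(F)

      # Usage: AR(1) x_t = a x_{t-1} + noise; at scales well beyond the
      # correlation length the exponent approaches the white-noise value 0.5.
      rng = np.random.default_rng(2)
      a, x = 0.7, np.zeros(50000)
      for t in range(1, len(x)):
          x[t] = a * x[t - 1] + rng.normal()
      scales = 2 ** np.arange(4, 12)
      alpha = np.polyfit(np.log(scales), np.log(dfa(x, scales)), 1)[0]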

  17. Identifying time measurement tampering in the traversal time and hop count analysis (TTHCA) wormhole detection algorithm

    OpenAIRE

    Jonny Karlsson; Dooley, Laurence S.; Göran Pulkkis

    2013-01-01

    Traversal time and hop count analysis (TTHCA) is a recent wormhole detection algorithm for mobile ad hoc networks (MANET) which provides enhanced detection performance against all wormhole attack variants and network types. TTHCA involves each node measuring the processing time of routing packets during the route discovery process and then delivering the measurements to the source node. In a participation mode (PM) wormhole where malicious nodes appear in the routing tables as legitimate node...

  18. Application of wavelet analysis to crustal deformation data processing

    Institute of Scientific and Technical Information of China (English)

    张燕; 吴云; 刘永启; 施顺英

    2004-01-01

    The time-frequency analysis and anomaly detection capabilities of the wavelet transformation make the method highly advantageous in non-stationary signal processing. In this paper, these two characteristics are analyzed and demonstrated with a synthetic signal. By applying the wavelet transformation to deformation data processing, we find that about 4 months before strong earthquakes, several deformation stations near the epicenter simultaneously received an abnormal signal with the same frequency, with periods from several days to more than ten days. The GPS observation stations near the epicenter all received an abnormal signal whose period is from 3 months to half a year. These abnormal signals are possibly earthquake precursors.
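
    A minimal sketch of wavelet-based anomaly screening for a deformation series, assuming the PyWavelets package; the decomposition level meant to isolate the several-day to two-week periods, and the threshold, are illustrative assumptions.

      import numpy as np
      import pywt

      def wavelet_anomalies(series, wavelet="db4", level=5, k=4.0):
          """Flag samples whose coarsest detail coefficients exceed k
          robust standard deviations."""
          coeffs = pywt.wavedec(series, wavelet, level=level)
          d = coeffs[1]                   # coarsest detail band (longest periods)
          sigma = np.median(np.abs(d)) / 0.6745     # robust sigma estimate
          hits = np.where(np.abs(d) > k * sigma)[0]
          return hits * 2 ** level        # approximate sample positions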

  19. Analysis and evaluation of collaborative modeling processes

    NARCIS (Netherlands)

    Ssebuggwawo, D.

    2012-01-01

    Analysis and evaluation of collaborative modeling processes is confronted with many challenges. On the one hand, many systems design and re-engineering projects require collaborative modeling approaches that can enhance their productivity. But, such collaborative efforts, which often consist of the

  20. Development of Quantum Devices and Algorithms for Radiation Detection and Radiation Signal Processing

    International Nuclear Information System (INIS)

    The main functions of a spectroscopy system are signal detection, filtering and amplification, pileup detection and recovery, dead time correction, amplitude analysis, and energy spectrum analysis. Safeguards isotopic measurements require the best spectrometer systems, with excellent resolution, stability, efficiency, and throughput. However, the resolution and throughput, which depend mainly on the detector, amplifier, and analog-to-digital converter (ADC), can still be improved. These modules have been under continuous development and improvement. For this reason, we are interested in both the development of quantum detectors and efficient algorithms for digital processing of the measurements. Therefore, the main objective of this thesis concentrates on both (1) studying the behavior of quantum dot (QD) devices under gamma radiation and (2) developing efficient algorithms for handling problems of gamma-ray spectroscopy. For gamma radiation detection, a detailed study of nanotechnology QD sources and infrared photodetectors (QDIP) for gamma radiation detection is introduced. There are two different types of quantum scintillator detectors which dominate the area of ionizing radiation measurements: QD scintillator detectors and QDIP scintillator detectors. Compared with traditional systems, quantum systems have less mass, require less volume, and consume less power. These factors increase the need for efficient detectors for gamma-ray applications such as gamma-ray spectroscopy. The potential of nanocomposite materials based on semiconductor quantum dots for radiation detection via scintillation has been demonstrated in the literature. Therefore, this thesis presents a theoretical analysis of the characteristics of QD sources and infrared photodetectors (QDIPs). A model of QD sources under incident gamma radiation is developed. A novel methodology is introduced to characterize the effect of gamma radiation on QD devices. The rate

  1. Performance Analysis of Cone Detection Algorithms

    CERN Document Server

    Mariotti, Letizia

    2015-01-01

    Many algorithms have been proposed to help clinicians evaluate cone density and spacing, as these may be related to the onset of retinal diseases. However, there has been no rigorous comparison of the performance of these algorithms. In addition, the performance of such algorithms is typically determined by comparison with human observers. Here we propose a technique to simulate realistic images of the cone mosaic. We use the simulated images to test the performance of two popular cone detection algorithms and we introduce an algorithm which is used by astronomers to detect stars in astronomical images. We use Free Response Operating Characteristic (FROC) curves to evaluate and compare the performance of the three algorithms. This allows us to optimize the performance of each algorithm. We observe that performance is significantly enhanced by up-sampling the images. We investigate the effect of noise and image quality on cone mosaic parameters estimated using the different algorithms, finding that the estimat...

  2. Detecting DNS Tunnels Using Character Frequency Analysis

    CERN Document Server

    Born, Kenton

    2010-01-01

    High-bandwidth covert channels pose significant risks to sensitive and proprietary information inside company networks. Domain Name System (DNS) tunnels provide a means to covertly infiltrate and exfiltrate large amounts of information past network boundaries. This paper explores the possibility of detecting DNS tunnels by analyzing the unigram, bigram, and trigram character frequencies of domains in DNS queries and responses. It is empirically shown how domains follow Zipf's law in a similar pattern to natural languages, whereas tunneled traffic has more evenly distributed character frequencies. This approach allows tunnels to be detected across multiple domains, whereas previous methods typically concentrate on monitoring point-to-point systems. Anomalies are quickly discovered when tunneled traffic is compared to the character frequency fingerprint of legitimate domain traffic.
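
    A minimal sketch of the frequency-fingerprint idea (unigrams only, for brevity): build a character distribution from known-legitimate domains and score observed query names by their average negative log-likelihood under it, so near-uniform tunnel payloads score high. The corpus and threshold below are assumptions.

      import math
      from collections import Counter

      ALPHABET = "abcdefghijklmnopqrstuvwxyz0123456789-"

      def fingerprint(domains):
          """Unigram character distribution of legitimate domain names."""
          counts = Counter(c for d in domains for c in d.lower() if c in ALPHABET)
          total = sum(counts.values())
          return {c: counts[c] / total for c in ALPHABET}

      def score(domain, fp, floor=1e-6):
          """High scores indicate text unlike the legitimate fingerprint."""
          chars = [c for c in domain.lower() if c in ALPHABET]
          if not chars:
              return 0.0
          return -sum(math.log(fp.get(c, floor) or floor) for c in chars) / len(chars)

      # Usage: a base64-like tunnel label scores well above a natural name.
      fp = fingerprint(["google", "wikipedia", "weather", "examplebank"])
      print(score("aGVsbG8gd29ybGQx", fp), ">", score("newsupdates", fp))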

  3. PERFORMANCE ANALYSIS OF HARDWARE TROJAN DETECTION METHODS

    OpenAIRE

    Ehsan, Sharifi; Kamal, Mohammadiasl; Mehrdad, Havasi; Amir, Yazdani

    2015-01-01

    Due to the increasing use of information and communication technologies in most aspects of life, the security of information has drawn the attention of governments and industry as well as researchers. In this regard, structural attacks on the functions of a chip are called hardware Trojans, and they are capable of rendering ineffective the security protecting our systems and data. This represents a big challenge for cyber-security, as it is nearly impossible to detect with any currently ...

  4. Detecting Inhomogeneity in Daily Climate Series Using Wavelet Analysis

    Institute of Scientific and Technical Information of China (English)

    YAN Zhongwei; Phil D.JONES

    2008-01-01

    A wavelet method was applied to detect inhomogeneities in daily meteorological series,data which are being increasingly applied in studies of climate extremes.The wavelet method has been applied to a few well-established long-term daily temperature series back to the 18th century,which have been "homogenized" with conventional approaches.Various types of problems remaining in the series were revealed with the wavelet method.Their influences on analyses of change in climate extremes are discussed.The results have importance for understanding issues in conventional climate data processing and for development of improved methods of homogenization in order to improve analysis of climate extremes based on daily data.

  5. Interactive Fringe processing algorithm for interferogram analysis

    Science.gov (United States)

    Parthiban, V.; Sirohi, Rajpal S.

    A highly flexible algorithm for interferogram processing, which enables the operator to interact with the computer at every stage, is presented. This algorithm, developed on a PDP 11/23 microcomputer, uses Fortran-callable subroutines based on Intellect 100 image processing hardware and a CUB R-G-B monitor. It also uses a single frame buffer of 512 x 512 x 8 pixels. This software employs a pseudo-colour mapping technique which helps the operator to select the optimum threshold values. Manual editing of the processed fringe pattern is also possible, to enable removal of unwanted kinks and to connect any discontinuities. A fringe scanning subroutine is used to number the fringes and to store the peak coordinates in a data file for fringe analysis. The algorithm is employed for the analysis of an interferogram obtained from an inverting interferometer, and the results are presented.

  6. In-Line Detection and Measurement of Molecular Contamination in Semiconductor Process Solutions

    Science.gov (United States)

    Wang, Jason; West, Michael; Han, Ye; McDonald, Robert C.; Yang, Wenjing; Ormond, Bob; Saini, Harmesh

    2005-09-01

    This paper discusses a fully automated metrology tool for detection and quantitative measurement of contamination, including cationic, anionic, metallic, organic, and molecular species present in semiconductor process solutions. The instrument is based on an electrospray ionization time-of-flight mass spectrometer (ESI-TOF/MS) platform. The tool can be used in diagnostic or analytical modes to understand process problems in addition to enabling routine metrology functions. Metrology functions include in-line contamination measurement with near real-time trend analysis. This paper discusses representative organic and molecular contamination measurement results in production process problem solving efforts. The examples include the analysis and identification of organic compounds in SC-1 pre-gate clean solution; urea, NMP (N-Methyl-2-pyrrolidone) and phosphoric acid contamination in UPW; and plasticizer and an organic sulfur-containing compound found in isopropyl alcohol (IPA). It is expected that these unique analytical and metrology capabilities will improve the understanding of the effect of organic and molecular contamination on device performance and yield. This will permit the development of quantitative correlations between contamination levels and process degradation. It is also expected that the ability to perform routine process chemistry metrology will lead to corresponding improvements in manufacturing process control and yield, the ability to avoid excursions and will improve the overall cost effectiveness of the semiconductor manufacturing process.

  7. On Multiview Analysis for Fingerprint Liveness Detection

    OpenAIRE

    Bottino, Andrea Giuseppe; Cumani, Sandro; Toosi, Amirhosein

    2015-01-01

    Fingerprint recognition systems, as any other biometric system, can be subject to attacks, which are usually carried out using artificial fingerprints. Several approaches to discriminate between live and fake fingerprint images have been presented to address this issue. These methods usually rely on the analysis of individual features extracted from the fingerprint images. Such features represent different and complementary views of the object in analysis, and their fusion is likely to improv...

  8. Qualitative Analysis for Maintenance Process Assessment

    Science.gov (United States)

    Brand, Lionel; Kim, Yong-Mi; Melo, Walcelio; Seaman, Carolyn; Basili, Victor

    1996-01-01

    In order to improve software maintenance processes, we first need to be able to characterize and assess them. These tasks must be performed in depth and with objectivity since the problems are complex. One approach is to set up a measurement-based software process improvement program specifically aimed at maintenance. However, establishing a measurement program requires that one understands the problems to be addressed by the measurement program and is able to characterize the maintenance environment and processes in order to collect suitable and cost-effective data. Also, enacting such a program and getting usable data sets takes time. A short term substitute is therefore needed. We propose in this paper a characterization process aimed specifically at maintenance and based on a general qualitative analysis methodology. This process is rigorously defined in order to be repeatable and usable by people who are not acquainted with such analysis procedures. A basic feature of our approach is that actual implemented software changes are analyzed in order to understand the flaws in the maintenance process. Guidelines are provided and a case study is shown that demonstrates the usefulness of the approach.

  9. Processing of polarimetric infrared images for landmine detection

    NARCIS (Netherlands)

    Cremer, F.; Jong, W. de; Schutte, K.

    2003-01-01

    Infrared (IR) cameras are often used in vehicle-based multi-sensor platforms for landmine detection. In addition to thermal contrasts, an IR polarimetric sensor also measures surface properties and therefore has the potential for increased detection performance. We have developed a polarimetric IR se

  10. Detecting Gender Bias Through Test Item Analysis

    Science.gov (United States)

    González-Espada, Wilson J.

    2009-03-01

    Many physical science and physics instructors might not be trained in pedagogically appropriate test construction methods. This could lead to test items that do not measure what they are intended to measure. A subgroup of these items might show bias against some groups of students. This paper describes how the author became aware of potentially biased items against females in his examinations, which led to the exploration of fundamental issues related to item validity, gender bias, and differential item functioning, or DIF. A brief discussion of DIF in the context of university courses, as well as practical suggestions to detect possible gender-biased items, follows.

  11. Analysis of the Industrial Biodiesel Production Process

    International Nuclear Information System (INIS)

    The reaction of transesterification is the chemical transformation through which biodiesel is obtained from vegetable oils. The purpose of this work is to carefully plan all the stages of various biodiesel production processes on the basis of recent results obtained in experimental research. These results allow defining the proper thermodynamic models to be used, the right interpretation of the phenomena, and the parameters which affect the process. The modelling was done with ASPENPLUS (R), defining three possible processes used for industrial purposes. A subsequent sensitivity analysis was done for each process, allowing the identification of the optimal configurations. By comparing these solutions it is possible to choose the most efficient one to reduce the costs of the final product.

  12. Probabilistic analysis of a thermosetting pultrusion process

    DEFF Research Database (Denmark)

    Baran, Ismet; Tutum, Cem C.; Hattel, Jesper Henri

    2016-01-01

    In the present study, the effects of uncertainties in the material properties of the processing composite material and the resin kinetic parameters, as well as process parameters such as pulling speed and inlet temperature, on product quality (exit degree of cure) are investigated for a pultrusion process. A new application for the probabilistic analysis of the pultrusion process is introduced using the response surface method (RSM). The results obtained from the RSM are validated by employing the Monte Carlo simulation (MCS) with the Latin hypercube sampling technique. According to the results obtained from both methods, the variations in the activation energy as well as the density of the resin are found to have a relatively stronger influence on the centerline degree of cure at the exit. Moreover, different execution strategies are examined for the MCS to investigate their effects...

  13. Performance analysis of cone detection algorithms.

    Science.gov (United States)

    Mariotti, Letizia; Devaney, Nicholas

    2015-04-01

    Many algorithms have been proposed to help clinicians evaluate cone density and spacing, as these may be related to the onset of retinal diseases. However, there has been no rigorous comparison of the performance of these algorithms. In addition, the performance of such algorithms is typically determined by comparison with human observers. Here we propose a technique to simulate realistic images of the cone mosaic. We use the simulated images to test the performance of three popular cone detection algorithms, and we introduce an algorithm which is used by astronomers to detect stars in astronomical images. We use Free Response Operating Characteristic (FROC) curves to evaluate and compare the performance of the four algorithms. This allows us to optimize the performance of each algorithm. We observe that performance is significantly enhanced by up-sampling the images. We investigate the effect of noise and image quality on cone mosaic parameters estimated using the different algorithms, finding that the estimated regularity is the most sensitive parameter. PMID:26366758

  14. Detection of short transients in colored noise by multiresolution analysis

    OpenAIRE

    Stevens, John Davenport.

    2000-01-01

    Detecting short transients is a signal processing application that has a wide range of military uses. In Undersea Warfare specifically, sensitive signal detection schemes can increase the effective range of active and passive sonar operations. Current research aims to improve the capability of detecting short signals buried within background noise, particularly in littoral waters. Starting with a colored noise model, this thesis will introduce two denoising methods based on multire...

  15. Intelligence Intrusion Detection Prevention Systems using Object Oriented Analysis method

    OpenAIRE

    DR.K.KUPPUSAMY; S. Murugan

    2010-01-01

    This paper is intended to provide a model for "Intelligence Intrusion Detection Prevention Systems using the Object Oriented Analysis method". It describes the state's overall requirements regarding the acquisition and implementation of intrusion prevention and detection systems with intelligence (IIPS/IIDS). This is designed to provide a deeper understanding of intrusion prevention and detection principles for those who may be responsible for acquiring, implementing or monitoring such sy...

  16. Outlier Detection with Space Transformation and Spectral Analysis

    DEFF Research Database (Denmark)

    Dang, Xuan-Hong; Micenková, Barbora; Assent, Ira;

    2013-01-01

    Detecting a small number of outliers from a set of data observations is always challenging. In this paper, we present an approach that exploits space transformation and uses spectral analysis in the newly transformed space for outlier detection. Unlike most existing techniques in the literature...

  17. Space Applications for Ensemble Detection and Analysis Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Ensemble Detection is both a measurement technique and analysis tool. Like a prism that separates light into spectral bands, an ensemble detector mixes a signal...

  18. Detection of ancient Egyptian archaeological sites using satellite remote sensing and digital image processing

    Science.gov (United States)

    Corrie, Robert K.

    2011-11-01

    Satellite remote sensing is playing an increasingly important role in the detection and documentation of archaeological sites. Surveying an area from the ground using traditional methods often presents challenges due to the time and costs involved. In contrast, the multispectral synoptic approach afforded by the satellite sensor makes it possible to cover much larger areas in greater spectral detail and more cost effectively. This is especially the case for larger scale regional surveys, which are helping to contribute to a better understanding of ancient Egyptian settlement patterns. This study presents an overview of satellite remote sensing data products, methodologies, and image processing techniques for detecting lost or undiscovered archaeological sites with reference to Egypt and the Near East. Key regions of the electromagnetic spectrum useful for site detection are discussed, including the visible near-infrared (VNIR), shortwave infrared (SWIR), thermal infrared (TIR), and microwave (radar). The potential of using Google Earth as both a data provider and a visualization tool is also examined. Finally, a case study is presented for detecting tell sites in Egypt using Landsat ETM+, ASTER, and Google Earth imagery. The results indicated that principal components analysis (PCA) was successfully able to detect and differentiate tell sites from modern settlements in Egypt's northwestern Nile Delta region.
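
    A minimal sketch of the principal components step used in the case study above: stack the co-registered bands, project the pixels onto the leading eigenvectors of the band covariance, and inspect the resulting component images for tell signatures. The band handling and counts are assumptions.

      import numpy as np

      def pca_components(band_stack, n_keep=3):
          """band_stack: (bands, rows, cols) co-registered multispectral
          cube; returns (n_keep, rows, cols) component images."""
          b, r, c = band_stack.shape
          X = band_stack.reshape(b, -1).T.astype(float)    # pixels x bands
          X -= X.mean(axis=0)
          vals, vecs = np.linalg.eigh(np.cov(X, rowvar=False))
          top = vecs[:, ::-1][:, :n_keep]                  # leading eigenvectors
          return (X @ top).T.reshape(n_keep, r, c)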

  19. Analysis of rocket engine injection combustion processes

    Science.gov (United States)

    Salmon, J. W.

    1976-01-01

    A critique is given of the JANNAF sub-critical propellant injection/combustion process analysis computer models and application of the models to correlation of well documented hot fire engine data bases. These programs are the distributed energy release (DER) model for conventional liquid propellants injectors and the coaxial injection combustion model (CICM) for gaseous annulus/liquid core coaxial injectors. The critique identifies model inconsistencies while the computer analyses provide quantitative data on predictive accuracy. The program is comprised of three tasks: (1) computer program review and operations; (2) analysis and data correlations; and (3) documentation.

  20. Detection of a novel, integrative aging process suggests complex physiological integration.

    Directory of Open Access Journals (Sweden)

    Alan A Cohen

    Full Text Available Many studies of aging examine biomarkers one at a time, but complex systems theory and network theory suggest that interpretations of individual markers may be context-dependent. Here, we attempted to detect underlying processes governing the levels of many biomarkers simultaneously by applying principal components analysis to 43 common clinical biomarkers measured longitudinally in 3694 humans from three longitudinal cohort studies on two continents (Women's Health and Aging I & II, InCHIANTI, and the Baltimore Longitudinal Study on Aging). The first axis was associated with anemia, inflammation, and low levels of calcium and albumin. The axis structure was precisely reproduced in all three populations and in all demographic sub-populations (by sex, race, etc.); we call the process represented by the axis "integrated albunemia." Integrated albunemia increases and accelerates with age in all populations, and predicts mortality and frailty (but not chronic disease) even after controlling for age. This suggests a role in the aging process, though causality is not yet clear. Integrated albunemia behaves more stably across populations than its component biomarkers, and thus appears to represent a higher-order physiological process emerging from the structure of underlying regulatory networks. If this is correct, detection of this process has substantial implications for physiological organization more generally.

  1. Automatic basal slice detection for cardiac analysis

    Science.gov (United States)

    Paknezhad, Mahsa; Marchesseau, Stephanie; Brown, Michael S.

    2016-03-01

    Identification of the basal slice in cardiac imaging is a key step in measuring the ejection fraction (EF) of the left ventricle (LV). Despite research on cardiac segmentation, basal slice identification is routinely performed manually. Manual identification, however, has been shown to have high inter-observer variability, with variation of the EF by up to 8%. Therefore, an automatic way of identifying the basal slice is still required. Prior published methods operate by automatically tracking the mitral valve points from the long-axis view of the LV. These approaches assumed that the basal slice is the first short-axis slice below the mitral valve. However, guidelines published in 2013 by the Society for Cardiovascular Magnetic Resonance indicate that the basal slice is the uppermost short-axis slice with more than 50% myocardium surrounding the blood cavity. Consequently, these existing methods at times identify the incorrect short-axis slice. Correct identification of the basal slice under these guidelines is challenging due to poor image quality and blood movement during image acquisition. This paper proposes an automatic tool that focuses on the two-chamber view to find the basal slice. To this end, an active shape model is trained to automatically segment the two-chamber view for 51 samples using a leave-one-out strategy. The basal slice is detected using temporal binary profiles created for each short-axis slice from the segmented two-chamber view. Across the 51 tested samples, 92% and 84% of detections were accurate at the end-systolic and end-diastolic phases of the cardiac cycle, respectively.

  2. Signal detection using change point analysis in postmarket surveillance†

    OpenAIRE

    Xu, Zhiheng; Kass-Hout, Taha; Anderson-Smits, Colin; Gray, Gerry

    2015-01-01

    Purpose: Signal detection methods have been used extensively in postmarket surveillance to identify elevated risks of adverse events associated with medical products (drugs, vaccines, and devices). However, current popular disproportionality methods ignore useful information, such as trends, when the data are aggregated over time for signal detection. Methods: In this paper, we applied change point analysis (CPA) to trend analysis of medical products in a spontaneous adverse event reporting system.
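
    The record does not specify which CPA variant was used; as a minimal sketch of one common change point technique, the code below runs a two-sided CUSUM for a mean shift in a count series. The allowance k, threshold h, and simulated counts are illustrative only.

    ```python
    import numpy as np

    def cusum_change_points(x, k=0.5, h=5.0):
        """Two-sided CUSUM for shifts in the mean of a standardized series.

        x: 1-D array of (e.g. monthly) adverse-event counts.
        k: allowance (half the shift, in SDs, considered worth detecting).
        h: decision threshold; an alarm fires when either sum exceeds it.
        """
        z = (x - x.mean()) / x.std()
        s_hi = s_lo = 0.0
        alarms = []
        for i, v in enumerate(z):
            s_hi = max(0.0, s_hi + v - k)
            s_lo = max(0.0, s_lo - v - k)
            if s_hi > h or s_lo > h:
                alarms.append(i)
                s_hi = s_lo = 0.0        # restart accumulation after an alarm
        return alarms

    counts = np.r_[np.random.poisson(5, 50), np.random.poisson(12, 30)]
    print(cusum_change_points(counts.astype(float)))   # alarm soon after index 50
    ```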

  3. Design and implementation of network attack analysis and detect system

    International Nuclear Information System (INIS)

    This paper first analyzes the present state of research on intrusion detection systems (IDS), then classifies and compares existing methods. To address problems in current IDSs, such as false positives, false negatives, and poor information visualization, this paper proposes a system named NAADS which supports multiple data sources. Through a combination of clustering analysis, association analysis, and visualization, the detection rate and usability of NAADS are increased. (authors)

  4. Cross-Disciplinary Detection and Analysis of Network Motifs

    OpenAIRE

    Ngoc Tam L. Tran; Luke DeLuccia; McDonald, Aidan F; Chun-Hsi Huang

    2015-01-01

    The detection of network motifs has recently become an important part of network analysis across all disciplines. In this work, we detected and analyzed network motifs from undirected and directed networks of several different disciplines, including biological network, social network, ecological network, as well as other networks such as airlines, power grid, and co-purchase of political books networks. Our analysis revealed that undirected networks are similar at the basic three and four nod...

  5. Fine analysis on advanced detection of transient electromagnetic method

    Institute of Scientific and Technical Information of China (English)

    Wang Bo; Liu Shengdong; Yang Zhen; Wang Zhijun; Huang Lanying

    2012-01-01

    Fault fracture zones and water-bearing bodies in front of the driving head are the main sources of disasters in mine laneways, so their advanced detection and prediction are important in order to provide reliable technical support for excavation. Based on electromagnetic induction theory, we analyzed the characteristics of the primary and secondary fields with a positive and negative current waveform, proposed a fine processing of the advanced detection using the variation rate of apparent resistivity, and introduced in detail the computational formulae and procedures. The results of physical simulation experiments show that the tectonic interface of modules can be judged by the first-order rate of apparent resistivity with a boundary error of 5%, and that the position of a water body determined by the fine analysis method agrees well with the result of borehole drilling. This shows that, in terms of distinguishing structural and aqueous anomalies, the first-order rate of apparent resistivity is more sensitive than the second-order rate. Some remaining problems are suggested for future solutions.
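
    The first-order rate referred to above is essentially a derivative of the apparent-resistivity profile along the detection direction; a minimal numpy sketch (depths, resistivities, and the function name are invented for illustration) is:

    ```python
    import numpy as np

    def first_order_rate(rho_a, depth):
        """First-order rate of change of apparent resistivity with distance.

        rho_a: apparent resistivity along the advanced-detection profile;
        depth: corresponding distances ahead of the driving head. Peaks in
        the absolute rate mark candidate tectonic interfaces or water zones.
        """
        return np.gradient(rho_a, depth)

    depth = np.linspace(0, 80, 81)                    # metres ahead of the face
    rho = np.where(depth < 40, 120.0, 30.0) + np.random.randn(81)
    rate = first_order_rate(rho, depth)
    print(depth[np.argmax(np.abs(rate))])             # interface near 40 m
    ```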

  6. Onboard Detection of Snow, Ice, Clouds, and Other Processes

    Data.gov (United States)

    National Aeronautics and Space Administration — The detection of clouds within a satellite image is essential for retrieving surface geophysical parameters from optical and thermal imagery. Even a small...

  7. Image corruption detection in diffusion tensor imaging for post-processing and real-time monitoring.

    Science.gov (United States)

    Li, Yue; Shea, Steven M; Lorenz, Christine H; Jiang, Hangyi; Chou, Ming-Chung; Mori, Susumu

    2013-01-01

    Due to the high sensitivity of diffusion tensor imaging (DTI) to physiological motion, clinical DTI scans often suffer from a significant number of artifacts. Tensor-fitting-based, post-processing outlier rejection is often used to reduce the influence of motion artifacts. Although it is an effective approach, when multiple data points are corrupted, this method may no longer correctly identify and reject them. In this paper, we introduce a new criterion called "corrected Inter-Slice Intensity Discontinuity" (cISID) to detect motion-induced artifacts. We compared the performance of algorithms using cISID and other existing methods with regard to artifact detection. The experimental results show that the integration of cISID into fitting-based methods significantly improves retrospective detection performance in post-processing analysis. The performance of the cISID criterion, if used alone, was inferior to the fitting-based methods, but cISID could effectively identify severely corrupted images with a rapid calculation time. In the second part of this paper, an outlier rejection scheme was implemented on a scanner for real-time monitoring of image quality and reacquisition of the corrupted data. The real-time monitoring, based on cISID and followed by post-processing, fitting-based outlier rejection, could provide a robust environment for routine DTI studies.
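
    The exact cISID formula (including its correction) is defined in the paper; the sketch below computes only a simplified, uncorrected inter-slice discontinuity score, the mean-intensity jump of each slice relative to both neighbours, to illustrate the underlying idea. The array shapes and the simulated dropout slice are assumptions.

    ```python
    import numpy as np

    def slice_discontinuity(volume):
        """Crude inter-slice intensity discontinuity score for one DWI volume.

        volume: array (slices, rows, cols). Motion-corrupted slices tend to
        show abrupt jumps in mean intensity relative to both neighbours, so
        the score is large exactly at such slices.
        """
        means = volume.reshape(volume.shape[0], -1).mean(axis=1)
        score = np.zeros_like(means)
        score[1:-1] = np.abs(2 * means[1:-1] - means[:-2] - means[2:]) / 2
        return score

    vol = np.random.rand(30, 64, 64)
    vol[14] *= 0.3                                     # simulate a signal dropout
    print(np.argmax(slice_discontinuity(vol)))         # -> 14
    ```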

  8. Location precision analysis of stereo thermal anti-sniper detection system

    Science.gov (United States)

    He, Yuqing; Lu, Ya; Zhang, Xiaoyan; Jin, Weiqi

    2012-06-01

    Anti-sniper detection devices are an urgent requirement in modern warfare, and the precision of such a detection system is especially important. This paper discusses the location precision of an anti-sniper detection system based on dual thermal imaging. Two main error sources are discussed: the digital quantization effects of the cameras, and the estimation of the bullet-trajectory coordinates from the infrared images during image matching. An error-analysis formula is derived from the stereo vision model and the cameras' quantization effects; from this, the relationship between detection accuracy and the system's parameters is obtained. The analysis in this paper provides the theoretical basis for error-compensation algorithms intended to improve the accuracy of 3D reconstruction of the bullet trajectory in anti-sniper detection devices.
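
    The dominant quantization term in such an analysis follows from the standard stereo triangulation relation Z = fB/d; the sketch below propagates a one-pixel disparity error through that relation. The focal length, baseline, and disparity values are illustrative, not the system's actual parameters.

    ```python
    def depth_and_error(f_px, baseline_m, disparity_px, d_err_px=1.0):
        """Stereo range and its sensitivity to disparity quantization.

        Z = f * B / d, so an error of d_err pixels in disparity gives
        dZ ~= Z**2 * d_err / (f * B): the range error grows quadratically
        with distance, which bounds the location precision of a
        dual-camera (here, dual-thermal) detection system.
        """
        z = f_px * baseline_m / disparity_px
        dz = z ** 2 * d_err_px / (f_px * baseline_m)
        return z, dz

    z, dz = depth_and_error(f_px=1200.0, baseline_m=0.5, disparity_px=4.0)
    print(f"range {z:.0f} m, +/- {dz:.1f} m for a 1-px disparity error")
    ```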

  9. A pathway analysis of global aerosol processes

    Science.gov (United States)

    Schutgens, N. A. J.; Stier, P.

    2014-11-01

    We present a detailed budget of the changes in atmospheric aerosol mass and numbers due to various processes: emission (including instant condensation of soluble biogenic emissions), nucleation, coagulation, H2SO4 condensation and in-cloud production, aging and deposition. The budget is created from monthly averaged tracer tendencies calculated by the global aerosol model ECHAM5.5-HAM2 and allows us to investigate process contributions at various length-scales and timescales. As a result, we show in unprecedented detail what processes drive the evolution of aerosol. In particular, we show that the processes that affect aerosol masses are quite different from those that affect aerosol numbers. Condensation of H2SO4 gas onto pre-existing particles is an important process, dominating the growth of small particles in the nucleation mode to the Aitken mode and the aging of hydrophobic matter. Together with in-cloud production of H2SO4, it significantly contributes to (and often dominates) the mass burden (and hence composition) of the hydrophilic Aitken and accumulation mode particles. Particle growth itself is the leading source of number densities in the hydrophilic Aitken and accumulation modes, with their hydrophobic counterparts contributing (even locally) relatively little. As expected, the coarse mode is dominated by primary emissions and mostly decoupled from the smaller modes. Our analysis also suggests that coagulation serves mainly as a loss process for number densities and that, relative to other processes, it is a rather unimportant contributor to composition changes of aerosol. The analysis is extended with sensitivity studies where the impact of a lower model resolution or pre-industrial emissions is shown to be small. We discuss the use of the current budget for model simplification, prioritization of model improvements, identification of potential structural model errors and model evaluation against observations.

  10. Process Correlation Analysis Model for Process Improvement Identification

    OpenAIRE

    Su-jin Choi; Dae-Kyoo Kim; Sooyong Park

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the software process improvement effort as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in those process areas in terms of goals and practices.

  11. Predictive Analysis for Social Processes II: Predictability and Warning Analysis

    CERN Document Server

    Colbaugh, Richard

    2009-01-01

    This two-part paper presents a new approach to predictive analysis for social processes. Part I identifies a class of social processes, called positive externality processes, which are both important and difficult to predict, and introduces a multi-scale, stochastic hybrid system modeling framework for these systems. In Part II of the paper we develop a systems theory-based, computationally tractable approach to predictive analysis for these systems. Among other capabilities, this analytic methodology enables assessment of process predictability, identification of measurables which have predictive power, discovery of reliable early indicators for events of interest, and robust, scalable prediction. The potential of the proposed approach is illustrated through case studies involving online markets, social movements, and protest behavior.

  12. Strategic analysis of a data processing company

    OpenAIRE

    Chen, George C. M.

    2005-01-01

    This paper contains a strategic analysis of a data processing company that provides outsourced payroll services to employers in Canada. For the purpose of confidentiality, the name of this company is disguised as ABC Canada. This company is in a mature industry, which is characterized by slow growth and narrowing of product differentiation as new entrants are able to penetrate the market at lower cost due to the advancement in information technology. The growth of this industry in the last fo...

  13. Process Simulation Analysis of HF Stripping

    OpenAIRE

    Thaer A. Abdulla

    2013-01-01

    The HYSYS process simulator is used for the analysis of an existing HF stripping column in a LAB plant (Arab Detergent Company, Baiji, Iraq). Simulated column performance and profile curves are constructed. The variables considered are the thermodynamic model option, bottom temperature, feed temperature, and the column profiles for temperature, vapor flow rate, liquid flow rate, and composition. Five thermodynamic model options are used (Margules, UNIQUAC, van Laar, Antoine, and Zudkevitch-Joffee)...

  14. Application of signal processing techniques to the detection of tip vortex cavitation noise in marine propeller

    Institute of Scientific and Technical Information of China (English)

    LEE Jeung-Hoon; HAN Jae-Moon; PARK Hyung-Gil; SEO Jong-Soo

    2013-01-01

    Tip vortex cavitation and its associated noise have been the subject of extensive research. In most experimental approaches, an accurate and objective decision on cavitation inception is primary, and this is the main topic of this paper. Although the conventional power spectrum is normally adopted as a signal processing tool for the analysis of cavitation noise, it cannot provide a faithful indication of cavitation inception. Alternatively, the periodic occurrence of bursting noise induced by tip vortex cavitation suggests that the repetition frequency of the bursting content can be exploited as an indicator of inception. This study therefore employed Short-Time Fourier Transform (STFT) analysis and Detection of Envelope Modulation On Noise (DEMON) spectrum analysis, both of which are appropriate for finding such a repetition frequency. In acoustical measurements in a water tunnel, the two signal processing techniques gave satisfactory results in detecting the inception of tip vortex cavitation.
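
    A DEMON-style analysis can be sketched as: band-pass the hydrophone signal, take its envelope via the Hilbert transform, then inspect the envelope's spectrum for a line at the burst repetition frequency. The band edges, burst model, and sampling rate below are assumptions for illustration, not the study's settings.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def demon_spectrum(x, fs, band=(10e3, 40e3)):
        """Spectrum of the envelope of a band-passed signal (DEMON-style).

        A periodic train of cavitation bursts shows up as a line in this
        spectrum at the burst repetition frequency, even when the raw
        power spectrum gives no clear inception cue.
        """
        b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
        env = np.abs(hilbert(filtfilt(b, a, x)))
        env -= env.mean()                              # drop the DC component
        spec = np.abs(np.fft.rfft(env)) / len(env)
        freqs = np.fft.rfftfreq(len(env), 1 / fs)
        return freqs, spec

    fs = 100_000
    t = np.arange(0, 1.0, 1 / fs)
    bursts = (np.sin(2 * np.pi * 25e3 * t) *
              (np.sin(2 * np.pi * 15 * t) > 0.95))     # bursts repeating at 15 Hz
    freqs, spec = demon_spectrum(bursts + 0.1 * np.random.randn(len(t)), fs)
    print(freqs[np.argmax(spec[1:1000]) + 1])          # close to 15 Hz
    ```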

  15. Detection and Analysis of Composition Proportions in Beer and Optimization of Production Processes

    Institute of Scientific and Technical Information of China (English)

    马慧宁

    2013-01-01

    A static headspace gas chromatography (HS-GC) method was established for determining flavor compounds in beer, including aldehydes, ethanol, higher alcohols, and higher esters. The experimental conditions were optimized and the optimum measurement parameters determined. The method was then applied to beers produced by different production processes, so that the content and proportion of each flavor compound in the final product could be brought to its optimum, helping enterprises identify the best production technique.

  16. Sound vibration signal processing for detection and identification detonation (knock) to optimize performance Otto engine

    Science.gov (United States)

    Sujono, A.; Santoso, B.; Juwana, W. E.

    2016-03-01

    Detonation (knock) in the Otto (petrol) engine is a problem that remains unresolved, especially when attempting to improve performance. In this research, the engine's sound vibration signal was processed, using a microphone sensor, for the detection and identification of detonation. Because the microphone does not have to be attached to the hot cylinder block, its performance is more stable and durable, and the sensor is inexpensive. The analysis, however, is not straightforward, because of the large amount of noise (interference). A new pattern recognition method is therefore used, based on filtering and a normalized-envelope regression function. The results are quite good, achieving a success rate of about 95%.

  17. Size-varying small target detection for infrared image processing

    Science.gov (United States)

    Li, Miao; Zhu, Ran; Long, Yunli; An, Wei; Zhou, Yiyu

    2015-10-01

    IRST (Infrared Search and Track) has been applied to many military and civil fields, such as precision guidance, aerospace, and early warning. As a key technique, small target detection in infrared images plays an important role. However, infrared targets have characteristics, such as size variation, that make detection difficult. In practical applications, the target size may vary for many reasons, such as the optical angle of the sensor, imaging distance, and environment. Conventional methods struggle to detect such size-varying targets, especially against backgrounds with strong clutter. This paper presents a novel method to detect size-varying infrared targets in a cluttered background. The target region is typically salient in infrared images: it is discontinuous with its neighboring regions and concentrated in a relatively small area, which can be considered a homogeneous compact region, whereas the background is consistent with its neighboring regions. Motivated by this saliency feature and the gradient feature, we introduce the minimum target intensity (MTI) to measure the dissimilarity between different scales, and use the mean gradient to restrict the target scale to a reasonable range; these are integrated into a multiscale MTI filter, on which the proposed detection method is built. First, the salient region in which a potential target may exist is obtained by morphological low-pass filtering. Second, candidate target regions are extracted by the multiscale minimum target intensity filter, which effectively gives the optimal target size. Finally, targets are segmented by the signal-to-clutter ratio (SCR), computed at the optimal scale of the candidate targets. Experimental results indicate that the proposed method achieves both high detection precision and robustness in complex backgrounds.
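
    The MTI filter itself is specific to the paper; a generic stand-in for the same idea, a multiscale white top-hat that responds to compact bright regions of varying size, can be sketched with SciPy as follows (scales, threshold, and test image are invented):

    ```python
    import numpy as np
    from scipy import ndimage

    def multiscale_tophat(img, scales=(3, 5, 9)):
        """Multiscale white top-hat response for compact bright targets.

        For each scale, the opening removes structures smaller than the
        window, so the top-hat (img - opening) highlights them; taking the
        per-pixel maximum over scales tolerates target size variation.
        """
        responses = [ndimage.white_tophat(img, size=(s, s)) for s in scales]
        return np.max(responses, axis=0)

    img = np.random.rand(128, 128) * 0.2               # cluttered background
    img[60:63, 60:63] += 1.0                           # 3x3 "target"
    resp = multiscale_tophat(img)
    ys, xs = np.where(resp > resp.mean() + 5 * resp.std())
    print(list(zip(ys, xs)))                           # pixels near (60..62, 60..62)
    ```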

  18. Real time loss detection for SNM in process

    International Nuclear Information System (INIS)

    This paper discusses the basis of a design for real-time special nuclear material (SNM) loss detectors. The design utilizes process measurements and signal processing techniques to produce a timely estimate of material loss. A state estimator is employed as the primary signal processing algorithm. Material loss is indicated by changes in the states or process innovations (residuals). The design philosophy is discussed in the context of these changes.
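
    A minimal sketch of this idea, assuming a random-walk inventory model and a scalar Kalman filter whose windowed innovation mean flags a sustained loss; the dynamics, noise levels, thresholds, and diversion scenario are all invented for illustration, not the paper's design.

    ```python
    import numpy as np

    def innovation_monitor(measurements, q=0.01, r=0.25, window=20, thresh=3.0):
        """Scalar Kalman filter on an inventory measurement stream.

        The state is the (nominally constant) material inventory. A run of
        negative innovations (measured minus predicted) signals a loss; the
        monitor flags when the windowed innovation mean drops below
        -thresh standard errors.
        """
        x, p = measurements[0], 1.0
        innov = []
        for z in measurements[1:]:
            p += q                                    # predict (random-walk model)
            k = p / (p + r)                           # Kalman gain
            innov.append(z - x)                       # innovation before update
            x += k * (z - x)                          # update state estimate
            p *= (1 - k)
        innov = np.array(innov)
        se = np.sqrt(r / window)                      # approx. std error of the mean
        means = np.convolve(innov, np.ones(window) / window, mode="valid")
        return np.where(means < -thresh * se)[0]

    rng = np.random.default_rng(0)
    inv = np.r_[np.full(100, 50.0), 50.0 - 0.1 * np.arange(100)]   # slow diversion
    alarms = innovation_monitor(inv + rng.normal(0, 0.5, 200))
    print(alarms[:5] if alarms.size else "no alarm")  # alarms during the diversion
    ```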

  19. Multiuser detection and independent component analysis-Progress and perspective

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The latest progress in multiuser detection and independent component analysis (ICA) is reviewed systematically. Two novel classes of multiuser detection methods, based on ICA algorithms and feedforward neural networks, are then proposed. Theoretical analysis and computer simulation show that ICA algorithms are effective for detecting multiuser signals in code-division multiple-access (CDMA) systems. The performance of these methods is not entirely identical across various channels, but all of them are robust, efficient, fast, and suitable for real-time implementation.
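
    The CDMA detectors themselves are beyond a short sketch, but the core ICA operation they rely on, blind separation of linearly mixed sources, can be demonstrated with scikit-learn's FastICA. The mixing matrix and "user" waveforms below are invented stand-ins.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(1)
    t = np.linspace(0, 1, 2000)
    # Two "user" signals (stand-ins for spread CDMA waveforms) and their mixtures.
    s = np.c_[np.sign(np.sin(2 * np.pi * 13 * t)),     # user 1: square wave
              np.sin(2 * np.pi * 7 * t)]               # user 2: sine
    a = np.array([[1.0, 0.6], [0.4, 1.0]])             # unknown channel mixing
    x = s @ a.T + 0.05 * rng.standard_normal((2000, 2))

    ica = FastICA(n_components=2, random_state=0)
    s_hat = ica.fit_transform(x)                       # recovered up to scale/order
    print(np.corrcoef(s[:, 0], s_hat[:, 0])[0, 1],
          np.corrcoef(s[:, 0], s_hat[:, 1])[0, 1])     # one of these is near +/-1
    ```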

  20. Semiclassical analysis for diffusions and stochastic processes

    CERN Document Server

    Kolokoltsov, Vassili N

    2000-01-01

    The monograph is devoted mainly to the analytical study of the differential, pseudo-differential and stochastic evolution equations describing the transition probabilities of various Markov processes. These include (i) diffusions (in particular, degenerate diffusions), (ii) more general jump-diffusions, especially stable jump-diffusions driven by stable Lévy processes, (iii) complex stochastic Schrödinger equations which correspond to models of quantum open systems. The main results of the book concern the existence, two-sided estimates, path integral representation, and small time and semiclassical asymptotics for the Green functions (or fundamental solutions) of these equations, which represent the transition probability densities of the corresponding random process. The boundary value problem for Hamiltonian systems and some spectral asymptotics are also discussed. Readers should have an elementary knowledge of probability, complex and functional analysis, and calculus.

  1. Auto Landing Process for Autonomous Flying Robot by Using Image Processing Based on Edge Detection

    Directory of Open Access Journals (Sweden)

    Bahram Lavi Sefidgari

    2014-01-01

    In today's technological life, everyone is familiar with the importance of security measures. Many attempts have been made by researchers in this regard, and one of them is flying robot technology. One well-known use of flying robots is in security and surveillance, which makes these devices extremely practical, not only for their unmanned operation but also for their unique maneuverability when flying over arbitrary areas. In this research, the automatic landing of a flying robot is discussed. The system is based on frequent interrupts sent from the main microcontroller to the camera module in order to capture images; the images are analyzed by an edge-detection-based image processing system, after which the system decides whether or not to land on the ground. Experimentally, this method shows good performance in terms of precision.
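
    The paper's exact pipeline is not given in the excerpt; a generic edge-detection check of this kind could look like the OpenCV sketch below, where the edge-density cutoff and the clear_to_land decision rule are purely illustrative.

    ```python
    import cv2
    import numpy as np

    def clear_to_land(frame_bgr, max_edge_density=0.02):
        """Edge-based check of the area below the flying robot.

        A flat landing surface yields few edges; obstacles and texture
        raise the edge density. The 0.02 cutoff is illustrative only.
        """
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        blurred = cv2.GaussianBlur(gray, (5, 5), 0)
        edges = cv2.Canny(blurred, 50, 150)
        density = np.count_nonzero(edges) / edges.size
        return density <= max_edge_density

    frame = np.full((240, 320, 3), 128, np.uint8)      # stand-in camera frame
    print(clear_to_land(frame))                        # True: featureless surface
    ```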

  2. Detecting Anomaly Regions in Satellite Image Time Series Based on Seasonal Autocorrelation Analysis

    Science.gov (United States)

    Zhou, Z.-G.; Tang, P.; Zhou, M.

    2016-06-01

    Anomaly regions in satellite images can reflect unexpected changes of land cover caused by floods, fires, landslides, etc. Detecting anomaly regions in satellite image time series is important for studying the dynamic processes of land cover change as well as for disaster monitoring. Although several methods have been developed to detect land cover changes using satellite image time series, they are generally designed for detecting inter-annual or abrupt changes, not for spatial-temporal changes in continuous images. In order to identify the spatial-temporal dynamics of unexpected land cover changes, this study proposes a method for detecting anomaly regions in each image of a satellite image time series based on seasonal autocorrelation analysis. The method was validated with a case study detecting the spatial-temporal development of severe flooding using Terra/MODIS image time series. Experiments demonstrated that (1) the method can effectively detect anomaly regions in each image of the time series, showing the spatially and temporally varying extent of the anomaly; (2) it can flexibly meet detection-accuracy requirements (e.g., z-value or significance level), with overall accuracy up to 89% and precision above 90%; and (3) it requires no time-series smoothing and can detect anomaly regions in noisy satellite images with high reliability.
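
    The paper's seasonal autocorrelation statistic is not reproduced here; as a stand-in, the sketch below standardizes each observation against a robust estimate of its seasonal expectation and flags large deviations. The period, threshold, and simulated NDVI series are assumptions.

    ```python
    import numpy as np

    def seasonal_anomalies(series, period=46, z_thresh=5.0):
        """Flag observations that deviate strongly from their seasonal norm.

        series: per-pixel time series (e.g. 46 MODIS composites per year
        over several years). Residuals against the per-bin seasonal median
        are scaled by a robust (MAD-based) standard deviation; |z| above
        z_thresh marks an anomaly (flood, fire, ...).
        """
        x = series.reshape(-1, period)                 # years x seasonal bins
        med = np.median(x, axis=0)                     # seasonal expectation
        resid = (x - med).ravel()
        scale = 1.4826 * np.median(np.abs(resid)) + 1e-9
        return np.abs(resid / scale) > z_thresh

    rng = np.random.default_rng(2)
    ndvi = np.tile(np.sin(np.linspace(0, 2 * np.pi, 46)), 5) \
           + rng.normal(0, 0.05, 230)
    ndvi[120] -= 1.5                                   # sudden drop, e.g. flooding
    print(np.where(seasonal_anomalies(ndvi))[0])       # -> [120]
    ```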

  3. Real-time Forward Vehicle Detection Method Based on Edge Analysis

    Institute of Scientific and Technical Information of China (English)

    Young-suk JI; Hwan-ik CHUNG; Hern-soo HAHN

    2010-01-01

    This paper proposes a method that uses extended edge analysis to supplement inaccurate edge information for better vehicle detection. The extended edge analysis detects the two vertical edges that form the borderlines of both sides of the vehicle by extending the horizontal edges, which are obtained inaccurately due to illumination or noise in the image. The proposed method extracts horizontal edges by merging the edge information inside the Region of Interest (ROI), which is set up in a pre-processing step. The bottom line of the vehicle is determined by detecting the vehicle's shadow region from the extracted horizontal edges. Conventional vehicle-width detection and the extended edge analysis are then carried out side by side on the bottom line to determine the width of the vehicle. Finally, the vehicle is confirmed through a verification step. On road images with complicated backgrounds, the vehicle detection method based on extended edge analysis is more efficient than existing edge-based detection methods; its effectiveness is confirmed by detection experiments on complicated road images.

  4. Data analysis of inertial sensor for train positioning detection system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seong Jin; Park, Sung Soo; Lee, Jae Ho; Kang, Dong Hoon [Korea Railroad Research Institute, Uiwang (Korea, Republic of)

    2015-02-15

    Train positioning detection information is fundamental for high-speed railroad inspection, making it possible to simultaneously determine the status and evaluate the integrity of railroad equipment. This paper presents the results of measurements and an analysis of an inertial measurement unit (IMU) used as a positioning detection sensor. Acceleration and angular rate measurements from the IMU were analyzed in the amplitude and frequency domains, with a discussion of vibration and train motions. Using these results together with GPS information, positioning detection of a Korean tilting train express was performed from Naju station to Illo station on the Honam line. The results of a synchronized analysis of sensor measurements and train motion can help in the design of a train location detection system and improve positioning detection performance.

  5. Gold Nanoparticles-Based Barcode Analysis for Detection of Norepinephrine.

    Science.gov (United States)

    An, Jeung Hee; Lee, Kwon-Jai; Choi, Jeong-Woo

    2016-02-01

    Nanotechnology-based bio-barcode amplification analysis offers an innovative approach for detecting neurotransmitters. We evaluated the efficacy of this method for detecting norepinephrine in normal and oxidative-stress-damaged dopaminergic cells. Our approach uses a combination of DNA barcodes and bead-based immunoassays for detecting neurotransmitters with surface-enhanced Raman spectroscopy (SERS), and provides polymerase chain reaction (PCR)-like sensitivity. This method relies on magnetic Dynabeads containing antibodies and on nanoparticles loaded both with DNA barcodes and with antibodies that can sandwich the target protein captured by the Dynabead-bound antibodies. The aggregate sandwich structures are magnetically separated from the solution and treated to remove the conjugated barcode DNA. The DNA barcodes are then identified by SERS and PCR analysis. The concentration of norepinephrine in dopaminergic cells can be readily detected using the bio-barcode assay, which is a rapid, high-throughput screening tool for detecting neurotransmitters. PMID:27305769

  6. Knowledge Base Approach for 3D Objects Detection in Point Clouds Using 3D Processing and Specialists Knowledge

    OpenAIRE

    Ben Hmida, Helmi; Cruz, Christophe; Boochs, Frank; Nicolle, Christophe

    2013-01-01

    This paper presents a knowledge-based approach to object detection using the OWL ontology language, the Semantic Web Rule Language, and 3D processing built-ins, aiming at combining geometrical analysis of 3D point clouds with specialists' knowledge. Here, we share our experience regarding the creation of a 3D semantic facility model out of unorganized 3D point clouds. Thus, a knowledge-based approach to object detection using the OWL ontology language is presented...

  7. Across frequency processes involved in auditory detection of coloration

    DEFF Research Database (Denmark)

    Buchholz, Jörg; Kerketsos, P

    2008-01-01

    When an early wall reflection is added to a direct sound, a spectral modulation is introduced to the signal's power spectrum. This spectral modulation typically produces an auditory sensation of coloration or pitch. Throughout this study, auditory spectral-integration effects involved in coloration detection are investigated. Coloration detection thresholds were therefore measured as a function of reflection delay and stimulus bandwidth. In order to investigate the involved auditory mechanisms, an auditory model was employed that was conceptually similar to the peripheral weighting model [Yost, JASA ...]. The filterbank was designed to approximate auditory filter shapes measured by Oxenham and Shera [JARO, 2003, 541-554], derived from forward-masking data. The results of the present study demonstrate that a "purely" spectrum-based model approach can successfully describe auditory coloration detection even at high...

  8. Latency and mode of error detection in a process industry

    International Nuclear Information System (INIS)

    Licensee event reports (LERs) from an industry provide important feedback about safety to the industry itself, the regulators, and the public. LERs from four nuclear power reactors were analyzed to determine detection times, modes of detection, and qualitative differences between reports from different reactors. The reliability of the codings was satisfactory, measured as the covariance between the ratings of two independent judges. The results showed differences in detection time across the reactors. On average, about 10% of the errors remained undetected for 100 weeks or more, but the great majority of errors were detected soon after their first appearance in the plant. On average, 40% of the errors were detected in regular tests and 40% through alarms. Operators found about 16% of the errors by noticing something abnormal in the plant; the remaining errors were detected in other ways. There were qualitative differences between the LERs from the different reactors, reflecting the different conditions in the plants. The number of reports differed by a factor of two between plants. However, a greater number of LERs can indicate both higher safety standards (e.g., a greater willingness to report all possible events in order to learn from them) and lower safety standards (e.g., reporting as few events as possible to make a good impression). It was pointed out that LERs are indispensable for maintaining the safety of an industry, and that the differences between plants found in this study indicate how error reports can be used to initiate further investigations for improved safety.

  9. Efficient signal processing for time-resolved fluorescence detection of nitrogen-vacancy spins in diamond

    Science.gov (United States)

    Gupta, A.; Hacquebard, L.; Childress, L.

    2016-03-01

    Room-temperature fluorescence detection of the nitrogen-vacancy center electronic spin typically has low signal to noise, requiring long experiments to reveal an averaged signal. Here, we present a simple approach to analysis of time-resolved fluorescence data that permits an improvement in measurement precision through signal processing alone. Applying our technique to experimental data reveals an improvement in signal to noise equivalent to a 14% increase in photon collection efficiency. We further explore the dependence of the signal to noise ratio on excitation power, and analyze our results using a rate equation model. Our results provide a rubric for optimizing fluorescence spin detection, which has direct implications for improving precision of nitrogen-vacancy-based sensors.

  10. Multivariate Statistical Process Monitoring Using Robust Nonlinear Principal Component Analysis

    Institute of Scientific and Technical Information of China (English)

    ZHAO Shijian; XU Yongmao

    2005-01-01

    The principal component analysis (PCA) algorithm is widely applied in a diverse range of fields for performance assessment, fault detection, and diagnosis. However, in the presence of noise and gross errors, nonlinear PCA (NLPCA) using autoassociative bottleneck neural networks is so sensitive that the obtained model can differ significantly from the underlying system. In this paper, a robust version of NLPCA is introduced by replacing the generally used mean squared error criterion with a mean log squared error. This is followed by a concise analysis of the corresponding training method. A novel multivariate statistical process monitoring (MSPM) scheme incorporating the proposed robust NLPCA technique is then investigated and its efficiency is assessed through application to an industrial fluidized catalytic cracking plant. The results demonstrate that, compared with NLPCA, the proposed approach can effectively reduce the number of false alarms and is, hence, expected to better monitor real-world processes.
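
    The robust NLPCA itself is not reproduced here; the sketch below shows the conventional linear-PCA MSPM baseline it extends, monitoring Hotelling's T^2 and the squared prediction error (SPE) against empirical control limits. The component count, quantile, and simulated fault are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    class PCAMonitor:
        """Linear-PCA process monitor with T^2 and SPE statistics."""

        def fit(self, X, n_components=3, quantile=0.99):
            self.mu, self.sd = X.mean(0), X.std(0)
            Z = (X - self.mu) / self.sd
            self.pca = PCA(n_components).fit(Z)
            t2, spe = self._stats(Z)
            # Empirical control limits from normal-operation training data.
            self.t2_lim = np.quantile(t2, quantile)
            self.spe_lim = np.quantile(spe, quantile)
            return self

        def _stats(self, Z):
            scores = self.pca.transform(Z)
            t2 = np.sum(scores ** 2 / self.pca.explained_variance_, axis=1)
            resid = Z - self.pca.inverse_transform(scores)
            return t2, np.sum(resid ** 2, axis=1)

        def alarms(self, X):
            t2, spe = self._stats((X - self.mu) / self.sd)
            return (t2 > self.t2_lim) | (spe > self.spe_lim)

    rng = np.random.default_rng(3)
    train = rng.normal(size=(500, 8))                  # normal operation
    test = rng.normal(size=(50, 8))
    test[25:] += 4.0                                   # simulated fault
    print(PCAMonitor().fit(train).alarms(test).nonzero()[0])   # alarms from ~25 on
    ```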

  11. Relative Saliency in Change Signals Affects Perceptual Comparison and Decision Processes in Change Detection

    Science.gov (United States)

    Yang, Cheng-Ta

    2011-01-01

    Change detection requires perceptual comparison and decision processes on different features of multiattribute objects. How relative salience between two feature-changes influences the processes has not been addressed. This study used the systems factorial technology to investigate the processes when detecting changes in a Gabor patch with visual…

  12. Detection of Epileptic Seizures with Multi-modal Signal Processing

    DEFF Research Database (Denmark)

    Conradsen, Isa

    Regarding seizure detection, the highest potential clinical relevance is for generalized tonic-clonic (GTC) seizures, as these are associated with an increased risk of sudden unexpected death in epilepsy (SUDEP) in unsupervised patients. Several algorithms for these, based on uni- or multimodalities, were evaluated on clinical (tonic phase of GTC) and simulated seizures. This was valuable information concerning a seizure detection algorithm, and the findings from this research provided evidence for a change in the definition of these seizures by the International League Against Epilepsy (ILAE). Our final study presents a novel...

  13. Fault detection and isolation in processes involving induction machines

    Energy Technology Data Exchange (ETDEWEB)

    Zell, K.; Medvedev, A. [Control Engineering Group, Luleaa University of Technology, Luleaa (Sweden)

    1997-12-31

    A model-based technique for fault detection and isolation in electro-mechanical systems comprising induction machines is introduced. Two coupled state observers, one for the induction machine and another for the mechanical load, are used to detect and recognize fault-specific behaviors (fault signatures) from the real-time measurements of the rotor angular velocity and terminal voltages and currents. Practical applicability of the method is verified in full-scale experiments with a conveyor belt drive at SSAB, Luleaa Works. (orig.) 3 refs.

  14. Physical Meaning of the Optimum Measurement Process in Quantum Detection Theory

    Science.gov (United States)

    Osaki, Masao; Kozuka, Haruhisa; Hirota, Osamu

    1996-01-01

    The optimum measurement processes are represented as optimum detection operators in quantum detection theory. The error probability achieved by the optimum detection operators automatically surpasses the standard quantum limit. However, the optimum detection operators are given only as pure mathematical descriptions. In order to realize a communication system overcoming the standard quantum limit, we try to give the physical meaning of the optimum detection operators.

  15. Protecting Students' Intellectual Property in the Web Plagiarism Detection Process

    Science.gov (United States)

    Butakov, Sergey; Dyagilev, Vadim; Tskhay, Alexander

    2012-01-01

    Learning management systems (LMS) play a central role in communications in online and distance education. In the digital era, with all the information now accessible at students' fingertips, plagiarism detection services (PDS) have become a must-have part of LMS. Such integration provides a seamless experience for users, allowing PDS to check…

  16. Detection limit for rate fluctuations in inhomogeneous Poisson processes

    Science.gov (United States)

    Shintani, Toshiaki; Shinomoto, Shigeru

    2012-04-01

    Estimations of an underlying rate from data points are inevitably disturbed by the irregular occurrence of events. Proper estimation methods are designed to avoid overfitting by discounting the irregular occurrence of data, and to determine a constant rate from irregular data derived from a constant probability distribution. However, it can occur that rapid or small fluctuations in the underlying density are undetectable when the data are sparse. For an estimation method, the maximum degree of undetectable rate fluctuations is uniquely determined as a phase transition, when considering an infinitely long series of events drawn from a fluctuating density. In this study, we analytically examine an optimized histogram and a Bayesian rate estimator with respect to their detectability of rate fluctuation, and determine whether their detectable-undetectable phase transition points are given by an identical formula defining a degree of fluctuation in an underlying rate. In addition, we numerically examine the variational Bayes hidden Markov model in its detectability of rate fluctuation, and determine whether the numerically obtained transition point is comparable to those of the other two methods. Such consistency among these three principled methods suggests the presence of a theoretical limit for detecting rate fluctuations.

  17. Protecting Student Intellectual Property in Plagiarism Detection Process

    Science.gov (United States)

    Butakov, Sergey; Barber, Craig

    2012-01-01

    The rapid development of the Internet along with increasing computer literacy has made it easy and tempting for digital natives to copy-paste someone's work. Plagiarism is now a burning issue in education, industry and even in the research community. In this study, the authors concentrate on plagiarism detection with particular focus on the…

  18. Point processes in forestry : an application to tree crown detection

    OpenAIRE

    Perrin, Guillaume; Descombes, Xavier; Zerubia, Josiane

    2006-01-01

    In this research report, we aim at extracting tree crowns from remotely sensed images using marked point processes of discs and ellipses. Our approach is indeed to consider that the data are some realizations of a marked point process. Once a geometrical object is defined, we sample a marked point process defined by a density with a Reversible Jump Markov Chain Monte Carlo dynamics and simulated annealing to get the maximum a posteriori estimator of the tree crown distribution on the image. I...

  19. QRS DETECTION OF ECG - A STATISTICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    I.S. Siva Rao

    2015-03-01

    Electrocardiogram (ECG) is a graphical representation of the activity of the heart muscle and plays an important role in the diagnosis and monitoring of the heart's condition. A real-time analyzer based on filtering, beat recognition, clustering, and classification of the signal, with a delay of at most a few seconds, can recognize life-threatening arrhythmias. ECG analysis examines anatomic and physiologic facets of the entire cardiac muscle. The first task for proficient analysis is the removal of noise, which is achieved using wavelet transform analysis. Wavelets yield temporal and spectral information concurrently and offer flexibility through the choice of wavelet functions with different properties. This paper is concerned with the extraction of QRS complexes from ECG signals using Discrete Wavelet Transform based algorithms implemented in MATLAB. Denoising is performed by removing inconsistent wavelet transform coefficients. QRS complexes are then identified, and each peak can be used to locate the peaks of the other waves, such as P and T, and their derivatives. We put forward a new combinatory algorithm built on Pan-Tompkins' method and the multi-wavelet transform.
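
    A compact sketch in the same spirit (not the paper's exact pipeline): soft-threshold the DWT detail coefficients to denoise, reconstruct, then pick R peaks at a physiologically plausible spacing. It uses PyWavelets and SciPy rather than MATLAB; the wavelet choice, thresholds, and synthetic ECG are assumptions.

    ```python
    import numpy as np
    import pywt
    from scipy.signal import find_peaks

    def detect_qrs(ecg, fs):
        """Denoise an ECG with a DWT and pick R peaks.

        Detail coefficients are soft-thresholded with the universal
        threshold, the signal is reconstructed, and peaks are required
        to be at least 0.3 s apart.
        """
        coeffs = pywt.wavedec(ecg, "db4", level=4)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise estimate
        thr = sigma * np.sqrt(2 * np.log(len(ecg)))
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, "soft")
                                for c in coeffs[1:]]
        clean = pywt.waverec(coeffs, "db4")[: len(ecg)]
        peaks, _ = find_peaks(clean, height=0.5 * clean.max(),
                              distance=int(0.3 * fs))
        return peaks

    fs = 360
    t = np.arange(0, 10, 1 / fs)
    ecg = sum(np.exp(-((t - c) ** 2) / 2e-4)             # synthetic R waves
              for c in np.arange(0.5, 10, 0.8))
    peaks = detect_qrs(ecg + 0.05 * np.random.randn(len(t)), fs)
    print(len(peaks))                                    # 12 beats in 10 s
    ```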

  20. Preliminary Hazards Analysis Plasma Hearth Process

    International Nuclear Information System (INIS)

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standard DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. The PSAR in turn leads to the Final Safety Analysis Report, performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts, either onsite or offsite, to personnel and the environment.

  1. Preliminary Hazards Analysis Plasma Hearth Process

    Energy Technology Data Exchange (ETDEWEB)

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1993-11-01

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standard DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. The PSAR in turn leads to the Final Safety Analysis Report, performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts, either onsite or offsite, to personnel and the environment.

  2. Bone feature analysis using image processing techniques.

    Science.gov (United States)

    Liu, Z Q; Austin, T; Thomas, C D; Clement, J G

    1996-01-01

    In order to establish the correlation between bone structure and age, and to obtain information about age-related bone changes, it is necessary to study the microstructural features of human bone. Traditionally, in bone biology and forensic science, the analysis of bone cross-sections has been carried out manually. Such a process is known to be slow, inefficient, and prone to human error, and consequently the results obtained so far have been unreliable. In this paper we present a new approach to the quantitative analysis of cross-sections of human bones using digital image processing techniques. We demonstrate that such a system is able to extract various bone features consistently and is capable of providing more reliable data and statistics for bones. Consequently, we will be able to correlate features of bone microstructure with age, and possibly also with age-related bone diseases such as osteoporosis. The development of knowledge-based computer vision systems for automated bone image analysis can now be considered feasible.

  3. Microbiological Analysis of Rice Cake Processing in Korea.

    Science.gov (United States)

    Wang, Jun; Park, Joong-Hyun; Choi, Na-Jung; Ha, Sang-Do; Oh, Deog-Hwan

    2016-01-01

    This study was conducted to evaluate the microbial contamination in rice cake materials and products during processing and in the operation environment in nonhazard analysis [and] critical control point factories. Furthermore, the environmental health of the processing facilities and the bacterial and fungal contamination on the workers' hands were investigated. Pour plate methods were used for enumeration of aerobic plate count (APC), yeast and molds (YM), Bacillus cereus, Staphylococcus aureus, and Clostridium perfringens, whereas Petrifilm count plates were used for enumeration of coliforms and Escherichia coli. The respective microbial levels of APC, coliforms, YM, and B. cereus were in the range of 2.6 to 4.7, 1.0 to 3.8, not detected (ND) to 2.9, and ND to 2.8 log CFU/g in the raw materials and in the range of 2.3 to 6.2, ND to 3.6, ND to 2.7, and ND to 3.7 log CFU/g during processing of the rice cake products. During the processing of rice cakes, APC, coliforms, YM, and B. cereus increased during soaking and smashing treatments and decreased after steaming treatment. E. coli, S. aureus, and C. perfringens were not detected in any of the raw materials and operating areas or during processing. B. cereus was detected on the operators' hands at microbial contamination levels of 1.9 ± 0.19 to 2.0 ± 0.19 log CFU/g. The results showed that B. cereus in the end product is presumably the main concern for rice cakes. In addition, the high contamination level of B. cereus during manufacturing processes, including soaking, smashing, and molding, and the absence of B. cereus from the air sampling plates indicated that the contaminated equipment showed the potential risk to cause cross-contamination.

  4. Microbiological Analysis of Rice Cake Processing in Korea.

    Science.gov (United States)

    Wang, Jun; Park, Joong-Hyun; Choi, Na-Jung; Ha, Sang-Do; Oh, Deog-Hwan

    2016-01-01

    This study was conducted to evaluate the microbial contamination in rice cake materials and products during processing and in the operation environment in nonhazard analysis [and] critical control point factories. Furthermore, the environmental health of the processing facilities and the bacterial and fungal contamination on the workers' hands were investigated. Pour plate methods were used for enumeration of aerobic plate count (APC), yeast and molds (YM), Bacillus cereus, Staphylococcus aureus, and Clostridium perfringens, whereas Petrifilm count plates were used for enumeration of coliforms and Escherichia coli. The respective microbial levels of APC, coliforms, YM, and B. cereus were in the range of 2.6 to 4.7, 1.0 to 3.8, not detected (ND) to 2.9, and ND to 2.8 log CFU/g in the raw materials and in the range of 2.3 to 6.2, ND to 3.6, ND to 2.7, and ND to 3.7 log CFU/g during processing of the rice cake products. During the processing of rice cakes, APC, coliforms, YM, and B. cereus increased during soaking and smashing treatments and decreased after steaming treatment. E. coli, S. aureus, and C. perfringens were not detected in any of the raw materials and operating areas or during processing. B. cereus was detected on the operators' hands at microbial contamination levels of 1.9 ± 0.19 to 2.0 ± 0.19 log CFU/g. The results showed that B. cereus in the end product is presumably the main concern for rice cakes. In addition, the high contamination level of B. cereus during manufacturing processes, including soaking, smashing, and molding, and the absence of B. cereus from the air sampling plates indicated that the contaminated equipment showed the potential risk to cause cross-contamination. PMID:26735044

  5. Scalable Time Series Change Detection for Biomass Monitoring Using Gaussian Process

    Data.gov (United States)

    National Aeronautics and Space Administration — Scalable Time Series Change Detection for Biomass Monitoring Using Gaussian Process. Varun Chandola and Ranga Raju Vatsavai. Abstract: Biomass monitoring,...
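
    The record above is truncated, so the sketch below shows only the generic idea of Gaussian-process-based change detection: fit a GP to a seasonal history and flag new observations falling outside its predictive band. The kernel, data, and 95% rule are assumptions, not the project's method.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(4)
    t_train = np.linspace(0, 4, 120)[:, None]          # 4 "years" of NDVI history
    y_train = np.sin(2 * np.pi * t_train.ravel()) + rng.normal(0, 0.1, 120)

    gp = GaussianProcessRegressor(
        kernel=RBF(length_scale=0.3) + WhiteKernel(0.01),
        normalize_y=True).fit(t_train, y_train)

    # New season: biomass loss pulls observations below the GP's 95%
    # predictive band, which is treated as a change signal.
    t_new = np.linspace(4, 5, 30)[:, None]
    y_new = np.sin(2 * np.pi * t_new.ravel()) - 0.8    # anomalous drop
    mean, std = gp.predict(t_new, return_std=True)
    print(np.where(np.abs(y_new - mean) > 1.96 * std)[0])   # flagged indices
    ```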

  6. Specific capture of the hydrolysate on magnetic beads for sensitive detecting plant vacuolar processing enzyme activity.

    Science.gov (United States)

    Zhou, Jun; Cheng, Meng; Zeng, Lizhang; Liu, Weipeng; Zhang, Tao; Xing, Da

    2016-05-15

    Conventional plant protease detection often suffers from high background interference caused by the complex coloring metabolites in plant cells. In this study, a bio-modified magnetic beads-based strategy was developed for sensitive and quantitative detection of plant vacuolar processing enzyme (VPE) activity. Cleavage of the peptide substrate (ESENCRK-FITC) after the asparagine residue by VPE results in capture of the severed substrate CRK-FITC by 2-cyano-6-amino-benzothiazole (CABT)-functionalized magnetic beads, via a condensation reaction between CABT and cysteine (Cys). The catalytic activity was subsequently quantified by confocal microscopy imaging and flow cytometry analysis. The sensor system integrated the advantages of (i) the highly efficient enrichment and separation capabilities of magnetic beads and (ii) the catalyst-free nature of the CABT-Cys condensation reaction. It exhibited a linear relationship between the fluorescence signal and the concentration of severed substrate in the range of 10-600 pM. The practical results showed that, compared with normal growth conditions, VPE activity increased by 2.7-fold (307.2 ± 25.3 μM min(-1) g(-1)) under cadmium toxicity stress. This platform effectively overcame the background interference caused by coloring metabolites, showing good applicability for the detection of VPE activity in real samples. The strategy offers great sensitivity and may be further extended to the detection of other protease activities. PMID:26797250

  7. POST-PROCESSING ANALYSIS FOR THC SEEPAGE

    International Nuclear Information System (INIS)

    This report describes the selection of water compositions for the total system performance assessment (TSPA) model of results from the thermal-hydrological-chemical (THC) seepage model documented in ''Drift-Scale THC Seepage Model'' (BSC 2004 [DIRS 169856]). The selection has been conducted in accordance with ''Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Post-Processing Analysis for THC Seepage) Report Integration'' (BSC 2004 [DIRS 171334]). This technical work plan (TWP) was prepared in accordance with AP-2.27Q, ''Planning for Science Activities''. Section 1.2.3 of the TWP describes planning information pertaining to the technical scope, content, and management of this report. The post-processing analysis for THC seepage (THC-PPA) documented in this report provides a methodology for evaluating the near-field compositions of water and gas around a typical waste emplacement drift as these relate to the chemistry of seepage, if any, into the drift. The THC-PPA inherits the conceptual basis of the THC seepage model, but is an independently developed process. The relationship between the post-processing analysis and other closely related models, together with their main functions in providing seepage chemistry information for the Total System Performance Assessment for the License Application (TSPA-LA), are illustrated in Figure 1-1. The THC-PPA provides a data selection concept and direct input to the physical and chemical environment (P and CE) report that supports the TSPA model. The purpose of the THC-PPA is further discussed in Section 1.2. The data selection methodology of the post-processing analysis (Section 6.2.1) was initially applied to results of the THC seepage model as presented in ''Drift-Scale THC Seepage Model'' (BSC 2004 [DIRS 169856]). Other outputs from the THC seepage model (DTN: LB0302DSCPTHCS.002 [DIRS 161976]) used in the P and CE (BSC 2004 [DIRS 169860

  8. Mathematical Analysis and Optimization of Infiltration Processes

    Science.gov (United States)

    Chang, H.-C.; Gottlieb, D.; Marion, M.; Sheldon, B. W.

    1997-01-01

    A variety of infiltration techniques can be used to fabricate solid materials, particularly composites. In general these processes can be described with at least one time dependent partial differential equation describing the evolution of the solid phase, coupled to one or more partial differential equations describing mass transport through a porous structure. This paper presents a detailed mathematical analysis of a relatively simple set of equations which is used to describe chemical vapor infiltration. The results demonstrate that the process is controlled by only two parameters, alpha and beta. The optimization problem associated with minimizing the infiltration time is also considered. Allowing alpha and beta to vary with time leads to significant reductions in the infiltration time, compared with the conventional case where alpha and beta are treated as constants.

  9. Signal processing of Shiley heart valve data for fracture detection

    Energy Technology Data Exchange (ETDEWEB)

    Mullenhoff, C.

    1993-09-01

    Given digital acoustic data from the heart sounds of the beating heart, measured in laboratory sheep with implanted Bjoerk-Shiley Convexo-Concave heart valves, it is possible to detect and extract the opening and closing heartbeats from the data. Once extracted, spectral or other information can then be obtained from the heartbeats and passed on to feature extraction algorithms, neural networks, or pattern recognizers so that the valve condition, either fractured or intact, may be determined.

  10. Signal processing of Shiley heart valve data for fracture detection

    Energy Technology Data Exchange (ETDEWEB)

    Mullenhoff, C.

    1993-04-01

    Given digital acoustic data from the heart sounds of the beating heart, measured in laboratory sheep with implanted Bjoerk-Shiley Convexo-Concave heart valves, it is possible to detect and extract the opening and closing heartbeats from the data. Once extracted, spectral or other information can then be obtained from the heartbeats and passed on to feature extraction algorithms, neural networks, or pattern recognizers so that the valve condition, either fractured or intact, may be determined.

  11. Affine-Detection Loophole in Quantum Data Processing

    OpenAIRE

    Vlasov, Alexander Yu.

    2002-01-01

    Considered here is a specific detection loophole that is relevant not only to tests of quantum nonlocality, but also to some other applications of quantum computation and communication. It is described by a simple affine relation between different quantum "data structures", such as pure and mixed states, or separable and inseparable ones. It is also shown that, due to such relations, an imperfect device in a classical model may mimic measurements of quantum correlations made on ideal equipment.

  12. Deterring digital plagiarism, how effective is the digital detection process?

    OpenAIRE

    Jayati Chaudhuri

    2008-01-01

    Academic dishonesty, or plagiarism, is a growing problem in today's digital world. The use of plagiarism detection tools can assist faculty in combating this form of academic dishonesty. In this article, special emphasis is given to the text-matching software SafeAssignmentTM, and the advantages and disadvantages of using automated text-matching software are discussed and analyzed in detail.

  13. Mathematical foundations of image processing and analysis

    CERN Document Server

    Pinoli, Jean-Charles

    2014-01-01

    Mathematical Imaging is currently a rapidly growing field in applied mathematics, with an increasing need for theoretical mathematics. This book, the second of two volumes, emphasizes the role of mathematics as a rigorous basis for imaging sciences. It provides a comprehensive and convenient overview of the key mathematical concepts, notions, tools and frameworks involved in the various fields of gray-tone and binary image processing and analysis, by proposing a large but coherent set of symbols and notations, a complete list of subjects and a detailed bibliography. It establishes a bridge

  14. Fast Enzymatic Processing of Proteins for MS Detection with a Flow-through Microreactor.

    Science.gov (United States)

    Lazar, Iulia M; Deng, Jingren; Smith, Nicole

    2016-01-01

    The vast majority of mass spectrometry (MS)-based protein analysis methods involve an enzymatic digestion step prior to detection, typically with trypsin. This step is necessary for the generation of small molecular weight peptides amenable to MS detection. Common strategies for expediting this step rely on a microreactor with immobilized enzymes or on a range of complementary physical processes (e.g., microwave or high-pressure treatment) that reduce the time necessary for proteolytic digestion to a few minutes. In this work, we describe a simple and cost-effective approach that can be implemented in any laboratory for achieving fast enzymatic digestion of a protein. The protein (or protein mixture) is adsorbed on C18-bonded reversed-phase high performance liquid chromatography (HPLC) silica particles preloaded in a capillary column, and trypsin in aqueous buffer is infused over the particles for a short period of time. To enable on-line MS detection, the tryptic peptides are eluted with a solvent system with increased organic content directly into the MS ion source. This approach avoids the use of high-priced immobilized-enzyme particles and does not necessitate any additional aids for completing the process. Protein digestion and complete sample analysis can be accomplished in less than ~3 min and ~30 min, respectively. PMID:27078683

  15. Identifying time measurement tampering in the traversal time and hop count analysis (TTHCA) wormhole detection algorithm.

    Science.gov (United States)

    Karlsson, Jonny; Dooley, Laurence S; Pulkkis, Göran

    2013-01-01

    Traversal time and hop count analysis (TTHCA) is a recent wormhole detection algorithm for mobile ad hoc networks (MANET) which provides enhanced detection performance against all wormhole attack variants and network types. TTHCA involves each node measuring the processing time of routing packets during the route discovery process and then delivering the measurements to the source node. In a participation mode (PM) wormhole where malicious nodes appear in the routing tables as legitimate nodes, the time measurements can potentially be altered so preventing TTHCA from successfully detecting the wormhole. This paper analyses the prevailing conditions for time tampering attacks to succeed for PM wormholes, before introducing an extension to the TTHCA detection algorithm called ∆T Vector which is designed to identify time tampering, while preserving low false positive rates. Simulation results confirm that the ∆T Vector extension is able to effectively detect time tampering attacks, thereby providing an important security enhancement to the TTHCA algorithm. PMID:23686143
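
    The ∆T Vector computation itself is specified in the paper; the sketch below only illustrates the underlying idea of cross-checking the per-hop processing times reported by nodes against an independently measured end-to-end delay, so that under-reported wormhole delays leave an unexplained residual. The function name, per-hop budget, and slack factor are hypothetical.

```python
def tampering_suspected(rtt_total, hop_times, per_hop_budget, slack=0.1):
    """rtt_total: end-to-end route-discovery RTT measured by the source (s).
    hop_times: per-node packet processing times reported back to the source.
    per_hop_budget: nominal propagation + transmission delay per link (s).
    Flags a route when the reported times cannot account for the measured
    RTT, i.e. an unexplained residual delay that may indicate a wormhole
    whose endpoints tampered with their time reports."""
    n_links = len(hop_times) + 1
    explained = sum(hop_times) + n_links * per_hop_budget
    residual = rtt_total - explained
    return residual > slack * rtt_total
```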

  16. Identifying Time Measurement Tampering in the Traversal Time and Hop Count Analysis (TTHCA) Wormhole Detection Algorithm

    Directory of Open Access Journals (Sweden)

    Jonny Karlsson

    2013-05-01

    Full Text Available Traversal time and hop count analysis (TTHCA) is a recent wormhole detection algorithm for mobile ad hoc networks (MANET) which provides enhanced detection performance against all wormhole attack variants and network types. TTHCA involves each node measuring the processing time of routing packets during the route discovery process and then delivering the measurements to the source node. In a participation mode (PM) wormhole, where malicious nodes appear in the routing tables as legitimate nodes, the time measurements can potentially be altered, so preventing TTHCA from successfully detecting the wormhole. This paper analyses the prevailing conditions for time tampering attacks to succeed for PM wormholes, before introducing an extension to the TTHCA detection algorithm called ∆T Vector which is designed to identify time tampering, while preserving low false positive rates. Simulation results confirm that the ∆T Vector extension is able to effectively detect time tampering attacks, thereby providing an important security enhancement to the TTHCA algorithm.

  17. Speckle Tracking Based Strain Analysis Is Sensitive for Early Detection of Pathological Cardiac Hypertrophy

    OpenAIRE

    Xiangbo An; Jingjing Wang; Hao Li; Zhizhen Lu; Yan Bai; Han Xiao; Youyi Zhang; Yao Song

    2016-01-01

    Cardiac hypertrophy is a key pathological process in many cardiac diseases. However, early detection of cardiac hypertrophy is difficult with currently used non-invasive methods, and new approaches are urgently needed for efficient diagnosis of cardiac malfunction. Here we report that speckle tracking-based strain analysis is more sensitive than conventional echocardiography for early detection of pathological cardiac hypertrophy in the isoproterenol (ISO) mouse model. Pathological hypertrophy...

  18. A Novel Approach to Detect Malware Based on API Call Sequence Analysis

    OpenAIRE

    Youngjoon Ki; Eunjin Kim; Huy Kang Kim

    2015-01-01

    In the era of ubiquitous sensors and smart devices, detecting malware is becoming an endless battle between ever-evolving malware and antivirus programs that need to process ever-increasing security related data. For malware detection, various approaches have been proposed. Among them, dynamic analysis is known to be effective in terms of providing behavioral information. As malware authors increasingly use obfuscation techniques, it becomes more important to monitor how malware behaves for i...
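
    As a deliberately simplified illustration of API-call-sequence analysis, the sketch below builds n-gram profiles from dynamic-analysis traces and compares them; the API names, the trace source, and the Jaccard scoring are illustrative stand-ins for the feature engineering a production detector would use.

```python
from collections import Counter
from itertools import islice

def api_ngrams(calls, n=3):
    """Count n-grams of API call names captured from a sandbox trace."""
    return Counter(zip(*(islice(calls, i, None) for i in range(n))))

def jaccard(profile_a, profile_b):
    """Overlap of two n-gram profiles; low similarity to known-benign
    profiles can flag a sample for closer inspection."""
    a, b = set(profile_a), set(profile_b)
    return len(a & b) / len(a | b)

# Toy traces with hypothetical API names; real traces come from a monitored run.
benign = ["CreateFile", "ReadFile", "CloseHandle"] * 5
sample = ["CreateFile", "WriteFile", "CreateRemoteThread", "WriteFile"] * 5
print(jaccard(api_ngrams(benign), api_ngrams(sample)))
```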

  19. Information Design for “Weak Signal” detection and processing in Economic Intelligence: A case study on Health resources

    Directory of Open Access Journals (Sweden)

    Sahbi Sidhom

    2011-12-01

    Full Text Available The topics of this research cover all phases of “Information Design” applied to detect and profit from weak signals in economic intelligence (EI) or business intelligence (BI). The field of information design (ID) applies to the process of translating complex, unorganized or unstructured data into valuable and meaningful information. ID practice requires an interdisciplinary approach, which combines skills in graphic design (writing, analysis processing and editing), human performance technology and human factors. Applied in the context of an information system, it allows end-users to easily detect implicit topics known as “weak signals” (WS). In our approach to implementing ID, the processes cover the development of a knowledge management (KM) process in the context of EI. A case study concerning the information monitoring of health resources is presented, using ID processes to outline weak signals. Both French and American bibliographic databases were used to make the connection between multilingual concepts in the health watch process.

  20. Defect source analysis of directed self-assembly process

    Science.gov (United States)

    Delgadillo, Paulina Rincon; Suri, Mayur; Durant, Stephane; Cross, Andrew; Nagaswami, Venkat R.; Heuvel, Dieter Van Den; Gronheid, Roel; Nealey, Paul

    2013-07-01

    As design rules shrink, it is essential that the capability to detect smaller and smaller defects improves. There is considerable effort in the industry to enhance immersion lithography using directed self-assembly (DSA) for the 14-nm design node and below. While process feasibility has been demonstrated with DSA, material issues as well as process control requirements are not fully characterized. The chemical epitaxy process is currently the most-preferred process option for frequency multiplication, and it involves new materials at extremely small thicknesses. The image contrast of the lamellar line/space pattern at such small layer thicknesses is a new challenge for optical inspection tools. The study focuses on the capability of optical inspection systems to capture DSA-unique defects, such as dislocations and disclination clusters, over the system and wafer noise. The study is also extended to investigate wafer-level data at multiple process steps and to determine the contribution from each process step and material using defect source analysis methodology. The added-defect Pareto and the spatial distributions of added defects at each process step are discussed.

  1. People detection in nuclear plants by video processing for safety purpose

    Energy Technology Data Exchange (ETDEWEB)

    Jorge, Carlos Alexandre F.; Mol, Antonio Carlos A., E-mail: calexandre@ien.gov.b, E-mail: mol@ien.gov.b [Instituto de Engenharia Nuclear (IEN/CNEN), Rio de Janeiro, RJ (Brazil); Seixas, Jose M.; Silva, Eduardo Antonio B., E-mail: seixas@lps.ufrj.b, E-mail: eduardo@lps.ufrj.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Eletrica; Cota, Raphael E.; Ramos, Bruno L., E-mail: brunolange@poli.ufrj.b [Universidade Federal do Rio de Janeiro (EP/UFRJ), RJ (Brazil). Dept. de Engenharia Eletronica e de Computacao

    2011-07-01

    This work describes the development of a surveillance system for safety purposes in nuclear plants. The final objective is to track people online in videos, in order to estimate the dose received by personnel during the execution of working tasks in nuclear plants. The estimation will be based on their tracked positions and on dose rate mapping in a real nuclear plant at Instituto de Engenharia Nuclear, the Argonauta nuclear research reactor. Cameras have been installed within Argonauta's room, supplying the data needed. Both video processing and statistical signal processing techniques may be used for detecting, segmenting and tracking people in video. This first paper reports people segmentation in video using background subtraction, by two different approaches: frame differencing, and blind signal separation based on independent component analysis. Results are discussed, along with perspectives for further work. (author)
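
    A minimal sketch of the frame-differencing approach named above, assuming grayscale frames as NumPy arrays; the threshold is illustrative.

```python
import numpy as np

def segment_by_frame_difference(frames, thresh=25):
    """Mark as foreground the pixels whose absolute difference from the
    previous frame exceeds a threshold (moving people, in this setting)."""
    masks = []
    prev = frames[0].astype(np.int16)
    for frame in frames[1:]:
        cur = frame.astype(np.int16)
        masks.append((np.abs(cur - prev) > thresh).astype(np.uint8))
        prev = cur
    return masks
```

    In practice, an adaptive background model (e.g. OpenCV's createBackgroundSubtractorMOG2) is usually preferred over plain frame differencing, at the cost of more tuning.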

  2. People detection in nuclear plants by video processing for safety purpose

    International Nuclear Information System (INIS)

    This work describes the development of a surveillance system for safety purposes in nuclear plants. The final objective is to track people online in videos, in order to estimate the dose received by personnel during the execution of working tasks in nuclear plants. The estimation will be based on their tracked positions and on dose rate mapping in a real nuclear plant at Instituto de Engenharia Nuclear, the Argonauta nuclear research reactor. Cameras have been installed within Argonauta's room, supplying the data needed. Both video processing and statistical signal processing techniques may be used for detecting, segmenting and tracking people in video. This first paper reports people segmentation in video using background subtraction, by two different approaches: frame differencing, and blind signal separation based on independent component analysis. Results are discussed, along with perspectives for further work. (author)

  3. Continuous Fraud Detection in Enterprise Systems through Audit Trail Analysis

    OpenAIRE

    Peter J. Best; Pall Rikhardsson; Mark Toleman

    2009-01-01

    Enterprise systems, real time recording and real time reporting pose new and significant challenges to the accounting and auditing professions. This includes developing methods and tools for continuous assurance and fraud detection. In this paper we propose a methodology for continuous fraud detection that exploits security audit logs, changes in master records and accounting audit trails in enterprise systems. The steps in this process are: (1) threat monitoring-surveillance of security audi...

  4. Iterated Function System Models in Data Analysis: Detection and Separation

    CERN Document Server

    Alexander, Zachary; Garland, Joshua; Meiss, James D

    2011-01-01

    We investigate the use of iterated function system (IFS) models for data analysis. An IFS is a collection of dynamical systems that switches between deterministic regimes. An algorithm is developed to detect the regime switches under the assumption of continuity. This method is tested on a simple IFS and applied to an experimental computer performance data set. This methodology has a wide range of potential uses: from change-point detection in time-series data, to the field of digital communications.

  5. RFI detection by automated feature extraction and statistical analysis

    OpenAIRE

    Winkel, Benjamin; Kerp, Juergen; Stanko, Stephan

    2006-01-01

    In this paper we present an interference detection toolbox consisting of a high dynamic range Digital Fast-Fourier-Transform spectrometer (DFFT, based on FPGA-technology) and data analysis software for automated radio frequency interference (RFI) detection. The DFFT spectrometer allows high speed data storage of spectra on time scales of less than a second. The high dynamic range of the device assures constant calibration even during extremely powerful RFI events. The software uses an algorit...
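
    The abstract does not spell out the detection algorithm, so the sketch below shows one standard statistical approach to the same problem: iterative sigma-clipping of dynamic spectra using robust (median/MAD) statistics per frequency channel. Thresholds and the iteration count are illustrative.

```python
import numpy as np

def flag_rfi(spectra, n_sigma=5.0, n_iter=3):
    """Flag RFI in a dynamic spectrum, a 2-D array of shape (time, channel)."""
    flags = np.zeros(spectra.shape, dtype=bool)
    for _ in range(n_iter):
        clean = np.where(flags, np.nan, spectra)
        med = np.nanmedian(clean, axis=0)
        # 1.4826 * MAD approximates the standard deviation for Gaussian noise.
        mad = 1.4826 * np.nanmedian(np.abs(clean - med), axis=0)
        flags |= np.abs(spectra - med) > n_sigma * (mad + 1e-12)
    return flags
```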

  6. Framework for hyperspectral image processing and quantification for cancer detection during animal tumor surgery

    Science.gov (United States)

    Lu, Guolan; Wang, Dongsheng; Qin, Xulei; Halig, Luma; Muller, Susan; Zhang, Hongzheng; Chen, Amy; Pogue, Brian W.; Chen, Zhuo Georgia; Fei, Baowei

    2015-12-01

    Hyperspectral imaging (HSI) is an imaging modality that holds strong potential for rapid cancer detection during image-guided surgery. However, HSI data often need to be processed appropriately in order to extract the maximum useful information that differentiates cancer from normal tissue. We proposed a framework for hyperspectral image processing and quantification, which includes a set of steps: image preprocessing, glare removal, feature extraction, and ultimately image classification. The framework has been tested on images from mice with head and neck cancer, using spectra from 450 to 900 nm. The image analysis computed Fourier coefficients, normalized reflectance, means, and spectral derivatives for improved accuracy. The experimental results demonstrated the feasibility of the hyperspectral image processing and quantification framework for cancer detection during animal tumor surgery, in a challenging setting where sensitivity can be low due to the modest number of features present, but where the potential for fast image classification is high. This HSI approach may have potential application in tumor margin assessment during image-guided surgery, where speed of assessment may be the dominant factor.
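
    A hedged sketch of per-pixel feature extraction of the kind the framework computes (normalized reflectance, Fourier coefficients, spectral derivatives); the exact feature definitions in the paper may differ, and all parameters here are illustrative.

```python
import numpy as np

def spectral_feature_vector(spectrum, n_fourier=10):
    """Build a feature vector from one pixel's reflectance spectrum
    (1-D array sampled over wavelength, e.g. 450-900 nm)."""
    r = spectrum / (np.linalg.norm(spectrum) + 1e-12)   # normalized reflectance
    fourier = np.abs(np.fft.rfft(r))[:n_fourier]        # low-order Fourier magnitudes
    deriv = np.gradient(r)                              # first spectral derivative
    return np.concatenate([[r.mean()], fourier, deriv]) # classifier input
```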

  7. Image processing for femoral endosteal anatomy detection: description and testing of a computed tomography based program

    International Nuclear Information System (INIS)

    A computed tomography (CT)-based image processing computer program was developed for three-dimensional (3D) femoral endosteal cavity shape modelling. For the examinations, 50 cadaver femora were used. In the CT imaging, 30 axial slices were taken above and below the lesser trochanter area from each femur. Different image analysis methods were used for femoral cavity detection depending on the structure of the processed slice. In the femoral shaft area simple thresholding methods succeeded, but in the problem areas of the metaphyseal femur, edge detection operators and local thresholding were required. In contour tracking several criteria were used to check the validity of the border pixels. The results were saved as four output data files: (i) a file for the longest anteroposterior (ap), mediolateral (ml) and oblique diameters computed by a Euclidean method, (ii) and (iii) files for 2D and 3D data respectively, and (iv) a file for the centre points of each slice. Finally, testing of the results and dimensions obtained from the image analysis was carried out manually by sawing the femora into the stipulated horizontal slices. The ap and ml dimensions were measured with a caliper ruler. The CT-based image processing yielded a peak distribution of dimensions with a negative difference to those obtained in manual measurements. The mean difference between the image processing and the manual measurements was 1.1 mm (±0.7 mm, ±1 SD). The difference was highest in the proximal slices of the femora of group I (with the lowest cortical thickness), i.e. 1.3 mm (±0.8 mm), and lowest in the distal slices of the femora from group III (with the highest cortical thickness), i.e. 0.9 mm (±0.6 mm). The results are acceptable for further use of the program to study endosteal anatomy for individual femoral component selection and design. (author)

  8. Stability analysis of a polymer coating process

    Science.gov (United States)

    Kallel, A.; Hachem, E.; Demay, Y.; Agassant, J. F.

    2015-05-01

    A new coating process involving a short stretching distance (1 mm) and a high draw ratio (around 200) is considered. The resulting thin molten polymer film (around 10 micrometers) is set down on a solid primary film and then covered by another solid secondary film. In experimental studies, periodic fluctuations in the thickness of the coated layer may be observed. The processing conditions markedly influence the onset and development of these defects, and modeling will help our understanding of their origins. The membrane approach, which has been commonly used for cast-film modeling, is no longer valid, and two-dimensional time-dependent models (within the thickness) are developed over the whole domain (upstream die and stretching path). A boundary-value problem with a free surface for the Stokes equations is considered, and the stability of the free surface is assessed using two different numerical strategies: a tracking strategy combined with linear stability analysis involving computation of leading eigenvalues, and a Level Set capturing strategy coupled with transient stability analysis.

  9. High-Speed Digital Signal Processing Method for Detection of Repeating Earthquakes Using GPGPU-Acceleration

    Science.gov (United States)

    Kawakami, Taiki; Okubo, Kan; Uchida, Naoki; Takeuchi, Nobunao; Matsuzawa, Toru

    2013-04-01

    Repeating earthquakes occur on the same asperity at a plate boundary. These earthquakes have an important property: the seismic waveforms observed at an identical observation site are very similar regardless of their occurrence time. The slip histories of repeating earthquakes can reveal the existence of asperities: the analysis of repeating earthquakes can detect the characteristics of the asperities and realize temporal and spatial monitoring of slip at the plate boundary. Moreover, analysis of repeating earthquakes holds promise for medium-term prediction of earthquakes at the plate boundary. Although previous work has largely clarified the existence of asperities and repeating earthquakes, and the relationship between asperities and quasi-static slip areas, a stable and robust method for automatic detection of repeating earthquakes has not yet been established. Furthermore, in order to process the enormous data volumes involved (so-called big data), speeding up the signal processing is an important issue. Recently, the GPU (Graphics Processing Unit) has been used as an acceleration tool for signal processing in various fields of study, a movement called GPGPU (General-Purpose computing on GPUs). In the last few years the performance of GPUs has kept improving rapidly, so that a PC (personal computer) with GPUs can act as a personal supercomputer, providing a high-performance computing environment at a lower cost than before. Therefore, the use of GPUs contributes to a significant reduction of the execution time in signal processing of huge seismic data sets. In this study, we first applied band-limited Fourier phase correlation as a fast method of detecting repeating earthquakes. This method utilizes only band-limited phase information and yields the correlation values between two seismic signals. Secondly, we employ a coherence function using three orthogonal components (East-West, North-South, and Up-Down) of seismic data as a
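
    A compact sketch of band-limited Fourier phase correlation for one pair of traces; the passband is an illustrative assumption. On a GPU, the FFT calls are the natural candidates for acceleration (e.g. CuPy offers a drop-in cupy.fft), in line with the GPGPU approach described above.

```python
import numpy as np

def phase_correlation(x, y, fs, band=(1.0, 8.0)):
    """Band-limited phase-only correlation between two seismograms.
    Only the phase inside `band` (Hz) is kept, making the peak value
    insensitive to amplitude differences between events."""
    n = len(x)
    cross = np.fft.rfft(x) * np.conj(np.fft.rfft(y))
    cross /= np.abs(cross) + 1e-12                      # keep phase only
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    cross[(freqs < band[0]) | (freqs > band[1])] = 0.0  # band limitation
    cc = np.fft.irfft(cross, n)
    return cc.max()   # a high peak marks a candidate repeating-earthquake pair
```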

  10. ECG Signal Analysis and Arrhythmia Detection using Wavelet Transform

    Science.gov (United States)

    Kaur, Inderbir; Rajni, Rajni; Marwaha, Anupma

    2016-06-01

    Electrocardiogram (ECG) is used to record the electrical activity of the heart. The ECG signal is non-stationary in nature, which makes its analysis and interpretation very difficult. Hence accurate analysis of the ECG signal with a powerful tool like the discrete wavelet transform (DWT) becomes imperative. In this paper, the ECG signal is denoised to remove artifacts and analyzed using the wavelet transform to detect the QRS complex and arrhythmia. This work is implemented in MATLAB for the MIT/BIH Arrhythmia database and yields a sensitivity of 99.85 %, positive predictivity of 99.92 % and a detection error rate of 0.221 % with the wavelet transform. It is also inferred that the DWT outperforms the principal component analysis technique in detection of the ECG signal.
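
    A minimal sketch of DWT-based QRS detection in the spirit of the paper, using PyWavelets; the wavelet, decomposition level, retained scales, threshold, and refractory period are illustrative choices, not the tuned values behind the reported 99.85 % sensitivity.

```python
import numpy as np
import pywt

def detect_qrs(ecg, fs, wavelet="db4", level=4):
    """Return sample indices of detected beats (first suprathreshold sample)."""
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    coeffs[0] = np.zeros_like(coeffs[0])    # drop approximation (baseline wander)
    coeffs[-1] = np.zeros_like(coeffs[-1])  # drop finest detail (high-freq noise)
    detail = pywt.waverec(coeffs, wavelet)[:len(ecg)]
    env = detail ** 2                       # energy envelope of QRS-band scales
    thresh = 4.0 * env.mean()
    refractory = int(0.25 * fs)             # ignore re-triggers within 250 ms
    beats, last = [], -refractory
    for i in np.flatnonzero(env > thresh):
        if i - last >= refractory:
            beats.append(i)
            last = i
    return np.array(beats)
```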

  11. Urban Detection, Delimitation and Morphology: Comparative Analysis of Selective "MEGACITIES"

    Science.gov (United States)

    Alhaddad, B.; Arellano, B. E.; Roca, J.

    2012-08-01

    Over the last 50 years, the world has faced an impressive growth of urban population. The walled city, closed to the outside, an "island" of economic activity and population density within the rural land, has given way to the spread of urban life and urban networks across almost all the territory. There was, as Margalef (1999) put it, "a topological inversion of the landscape". The "urban" has gone from being an island in the ocean of rural land vastness to representing the totality of the space in which natural and rural "systems" are inserted. New phenomena, such as the fall of the Fordist model of production, the spread of urbanization known as urban sprawl, and the change of scale of the metropolis, covering increasingly large regions called "megalopolis" (Gottmann, 1961), have characterized the century. However, there are no rigorous databases capable of measuring and evaluating the phenomenon of megacities, and in general the process of urbanization, in the contemporary world. The aim of this paper is to detect, identify and analyze the morphology of megacities through remote sensing instruments as well as various landscape indicators. To understand the structure of the heterogeneous landscapes called megacities, land consumption and spatial complexity need to be quantified accurately. Remote sensing might be helpful in evaluating how different land covers shape urban megaregions. The morphological landscape analysis allows establishing the analogies and differences between patterns of cities and studying the symmetry, growth direction, linearity, complexity and compactness of the urban form. The main objective of this paper is to develop a new methodology to detect the urbanized land of some megacities around the world (Tokyo, Mexico, Chicago, New York, London, Moscow, Sao Paulo and Shanghai) using Landsat 7 images.

  12. Image edge detection based on multi-fractal spectrum analysis

    Institute of Scientific and Technical Information of China (English)

    WANG Shao-yuan; WANG Yao-nan

    2006-01-01

    In this paper, an image edge detection method based on multi-fractal spectrum analysis is presented. The coarse-grained Hölder exponent of the image pixels is first computed; then, its multi-fractal spectrum is estimated by the kernel estimation method. Finally, the image edge detection is done by means of different multi-fractal spectrum values. Simulation results show that this method is efficient and has better locality compared with traditional edge detection methods such as the Sobel method.
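
    A sketch of the first stage described above: estimating a coarse-grained Hölder exponent at every pixel as the regression slope of log(windowed intensity measure) against log(window size). The scales are illustrative, and the subsequent kernel estimation of the multifractal spectrum is omitted.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def holder_exponents(img, scales=(1, 2, 3)):
    """Per-pixel coarse-grained Holder exponent of a grayscale image."""
    img = img.astype(np.float64) + 1.0                    # keep the measure positive
    log_eps, log_mu = [], []
    for s in scales:
        size = 2 * s + 1
        mu = uniform_filter(img, size=size) * size ** 2   # windowed intensity sum
        log_eps.append(np.log(size))
        log_mu.append(np.log(mu))
    log_eps = np.array(log_eps)
    log_mu = np.stack(log_mu)                             # (n_scales, H, W)
    le = log_eps[:, None, None]
    # Least-squares slope of log_mu versus log_eps at every pixel.
    alpha = (((le - log_eps.mean()) * (log_mu - log_mu.mean(axis=0))).sum(axis=0)
             / ((log_eps - log_eps.mean()) ** 2).sum())
    return alpha   # edge pixels show atypical alpha values
```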

  13. Information Design for "Weak Signal" detection and processing in Economic Intelligence: A case study on Health resources

    OpenAIRE

    Sahbi Sidhom; Philippe Lambert

    2011-01-01

    The topics of this research cover all phases of "Information Design" applied to detect and profit from weak signals in economic intelligence (EI) or business intelligence (BI). The field of information design (ID) applies to the process of translating complex, unorganized or unstructured data into valuable and meaningful information. ID practice requires an interdisciplinary approach, which combines skills in graphic design (writing, analysis processing and editing), human performance te...

  14. Advanced signal processing technique for damage detection in steel tubes

    Science.gov (United States)

    Amjad, Umar; Yadav, Susheel Kumar; Dao, Cac Minh; Dao, Kiet; Kundu, Tribikram

    2016-04-01

    In recent years, ultrasonic guided waves have gained attention for reliable testing and characterization of metals and composites. Guided wave modes are excited and detected by PZT (Lead Zirconate Titanate) transducers in either transmission or reflection mode. In this study, guided waves are excited and detected in the transmission mode and the phase changes of the propagating wave modes are recorded. In most other studies reported in the literature, the change in the received signal strength (amplitude) is investigated with varying degrees of damage, while in this study the change in phase is correlated with the extent of damage. Feature extraction techniques are used for extracting phase and time-frequency information. The main advantage of this approach is that the bonding condition between the transducer and the specimen does not affect the phase, while it can affect the strength of the recorded signal. Therefore, if the specimen is not damaged but the transducer-specimen bonding has deteriorated, then the received signal strength is altered but the phase remains the same, and thus false positive predictions of damage can be avoided.

  15. Multivariate Analysis for the Processing of Signals

    Directory of Open Access Journals (Sweden)

    Beattie J.R.

    2014-01-01

    Full Text Available Real-world experiments are becoming increasingly complex, needing techniques capable of tracking this complexity. Signal-based measurements are often used to capture this complexity, where a signal is a record of a sample's response to a parameter (e.g. time, displacement, voltage, wavelength) that is varied over a range of values. In signals, the responses at each value of the varied parameter are related to each other, depending on the composition or state of the sample being measured. Since signals contain multiple information points, they have rich information content but are generally complex to comprehend. Multivariate Analysis (MA) has profoundly transformed their analysis by allowing gross simplification of the tangled web of variation. In addition, MA has provided the advantage of being much more robust to the influence of noise than univariate methods of analysis. In recent years, there has been a growing awareness that the nature of multivariate methods allows exploitation of their benefits for purposes other than data analysis, such as pre-processing of signals with the aim of eliminating irrelevant variations prior to analysis of the signal of interest. It has been shown that exploiting multivariate data reduction in an appropriate way can allow high-fidelity denoising (removal of irreproducible non-signals), consistent and reproducible noise-insensitive correction of baseline distortions (removal of reproducible non-signals), accurate elimination of interfering signals (removal of reproducible but unwanted signals) and the standardisation of signal amplitude fluctuations. At present, the field is relatively small, but the possibilities for much wider application are considerable. Where signal properties are suitable for MA (such as the signal being stationary along the x-axis), these signal-based corrections have the potential to be highly reproducible and highly adaptable, and are applicable in situations where the data is noisy or
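
    As a concrete instance of the multivariate pre-processing described, the sketch below uses truncated PCA (via the SVD) to reconstruct a set of signals from their first few principal components, discarding variance that is not reproducible across the set; the number of retained components is an illustrative choice.

```python
import numpy as np

def pca_denoise(signals, n_components=5):
    """signals: 2-D array (n_samples, n_points). Returns the reconstruction
    from the leading principal components (high-fidelity denoising)."""
    mean = signals.mean(axis=0)
    X = signals - mean
    U, s, Vt = np.linalg.svd(X, full_matrices=False)  # rows of Vt: signal shapes
    k = n_components
    return mean + (U[:, :k] * s[:k]) @ Vt[:k]
```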

  16. Nonparametric signal processing validation in T-wave alternans detection and estimation.

    Science.gov (United States)

    Goya-Esteban, R; Barquero-Pérez, O; Blanco-Velasco, M; Caamaño-Fernández, A J; García-Alberola, A; Rojo-Álvarez, J L

    2014-04-01

    Although a number of methods have been proposed for T-Wave Alternans (TWA) detection and estimation, their performance strongly depends on their signal processing stages and on the tuning of their free parameters. The dependence of system quality on the main signal processing stages in TWA algorithms has not yet been studied. This study seeks to optimize the final performance of the system by successive comparisons of pairs of TWA analysis systems with a single processing difference between them. For this purpose, a set of decision statistics is proposed to evaluate the performance, and a nonparametric hypothesis test (from bootstrap resampling) is used to make systematic decisions. Both the temporal method (TM) and the spectral method (SM) are analyzed in this study. The experiments were carried out on two datasets: first, on semisynthetic signals with artificial alternant waves and added noise; second, on two public Holter databases with different documented risk of sudden cardiac death. For semisynthetic signals (SNR = 15 dB), the optimization procedure reduced the power of TWA amplitude estimation errors by 34.0% (TM) and 5.2% (SM), and the power of error probability by 74.7% (SM). For the Holter databases, appropriate tuning of several processing blocks led to a larger intergroup separation between the two populations for TWA amplitude estimation. Our proposal can be used as a systematic procedure for signal processing block optimization in TWA algorithmic implementations.
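
    For readers unfamiliar with the spectral method (SM) tuned in this study, a minimal sketch follows: each intra-T-wave sample is tracked across consecutive beats, and alternans appears as spectral power at 0.5 cycles/beat. The noise band and the k-score criterion are common conventions in the TWA literature, not the specific settings optimized here.

```python
import numpy as np

def twa_spectral_score(t_waves):
    """t_waves: 2-D array (n_beats, n_samples) of aligned T waves.
    Returns the alternans k-score; k > 3 is a common positivity criterion."""
    n_beats = t_waves.shape[0]
    series = t_waves - t_waves.mean(axis=0)           # remove the average beat
    spec = np.abs(np.fft.rfft(series, axis=0)) ** 2   # beat-to-beat spectrum
    spec = spec.mean(axis=1)                          # aggregate over samples
    freqs = np.fft.rfftfreq(n_beats)                  # in cycles/beat
    alt = spec[np.argmin(np.abs(freqs - 0.5))]        # power at 0.5 cycles/beat
    noise = spec[(freqs > 0.33) & (freqs < 0.48)]     # reference noise band
    return (alt - noise.mean()) / (noise.std() + 1e-12)
```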

  17. Multicriteria Similarity-Based Anomaly Detection Using Pareto Depth Analysis.

    Science.gov (United States)

    Hsiao, Ko-Jen; Xu, Kevin S; Calder, Jeff; Hero, Alfred O

    2016-06-01

    We consider the problem of identifying patterns in a data set that exhibits anomalous behavior, often referred to as anomaly detection. Similarity-based anomaly detection algorithms detect abnormally large amounts of similarity or dissimilarity, e.g., as measured by the nearest neighbor Euclidean distances between a test sample and the training samples. In many application domains, there may not exist a single dissimilarity measure that captures all possible anomalous patterns. In such cases, multiple dissimilarity measures can be defined, including nonmetric measures, and one can test for anomalies by scalarizing using a nonnegative linear combination of them. If the relative importance of the different dissimilarity measures are not known in advance, as in many anomaly detection applications, the anomaly detection algorithm may need to be executed multiple times with different choices of weights in the linear combination. In this paper, we propose a method for similarity-based anomaly detection using a novel multicriteria dissimilarity measure, the Pareto depth. The proposed Pareto depth analysis (PDA) anomaly detection algorithm uses the concept of Pareto optimality to detect anomalies under multiple criteria without having to run an algorithm multiple times with different choices of weights. The proposed PDA approach is provably better than using linear combinations of the criteria, and shows superior performance on experiments with synthetic and real data sets.
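
    A reduced sketch of the Pareto-front machinery behind PDA: points (dyads of multicriteria dissimilarities) are peeled into successive non-dominated fronts, and membership in deeper fronts indicates more anomalous behavior. This is a deliberate simplification; the published algorithm builds the fronts over training dyads and scores a test sample through the dyads it forms with its neighbors.

```python
import numpy as np

def pareto_fronts(points):
    """Return the front index (depth) of each point, minimizing all criteria."""
    pts = np.asarray(points, dtype=float)
    depth = np.full(len(pts), -1)
    remaining = np.arange(len(pts))
    front = 0
    while remaining.size:
        sub = pts[remaining]
        keep = [i for i, p in enumerate(sub)
                if not ((sub <= p).all(axis=1) & (sub < p).any(axis=1)).any()]
        depth[remaining[keep]] = front
        remaining = np.delete(remaining, keep)
        front += 1
    return depth

def anomaly_score(train_dyads, test_dyads):
    """Mean front depth of the test dyads when pooled with the training dyads;
    deeper fronts correspond to more anomalous samples."""
    depth = pareto_fronts(np.vstack([train_dyads, test_dyads]))
    return depth[len(train_dyads):].mean()
```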

  18. Multicriteria Similarity-Based Anomaly Detection Using Pareto Depth Analysis.

    Science.gov (United States)

    Hsiao, Ko-Jen; Xu, Kevin S; Calder, Jeff; Hero, Alfred O

    2016-06-01

    We consider the problem of identifying patterns in a data set that exhibits anomalous behavior, often referred to as anomaly detection. Similarity-based anomaly detection algorithms detect abnormally large amounts of similarity or dissimilarity, e.g., as measured by the nearest neighbor Euclidean distances between a test sample and the training samples. In many application domains, there may not exist a single dissimilarity measure that captures all possible anomalous patterns. In such cases, multiple dissimilarity measures can be defined, including nonmetric measures, and one can test for anomalies by scalarizing using a nonnegative linear combination of them. If the relative importance of the different dissimilarity measures are not known in advance, as in many anomaly detection applications, the anomaly detection algorithm may need to be executed multiple times with different choices of weights in the linear combination. In this paper, we propose a method for similarity-based anomaly detection using a novel multicriteria dissimilarity measure, the Pareto depth. The proposed Pareto depth analysis (PDA) anomaly detection algorithm uses the concept of Pareto optimality to detect anomalies under multiple criteria without having to run an algorithm multiple times with different choices of weights. The proposed PDA approach is provably better than using linear combinations of the criteria, and shows superior performance on experiments with synthetic and real data sets. PMID:26336154

  19. Novel Flood Detection and Analysis Method Using Recurrence Property

    Science.gov (United States)

    Wendi, Dadiyorto; Merz, Bruno; Marwan, Norbert

    2016-04-01

    Temporal changes in flood hazard are known to be difficult to detect and attribute due to multiple drivers that include processes that are non-stationary and highly variable. These drivers, such as human-induced climate change, natural climate variability, implementation of flood defences, river training, or land use change, can act on a variety of space-time scales and can influence or mask each other. Flood time series may show complex behavior that varies across a range of time scales and may cluster in time. This study focuses on the application of recurrence-based data analysis techniques (the recurrence plot) for understanding and quantifying spatio-temporal changes in flood hazard in Germany. The recurrence plot is an effective tool for visualizing the dynamics of phase-space trajectories, constructed from a time series by using an embedding dimension and a time delay, and it is known to be effective in analyzing non-stationary and non-linear time series. The emphasis is on the identification of characteristic recurrence properties that could associate typical dynamic behavior with certain flood situations.
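
    A minimal sketch of recurrence plot construction as described: time-delay embedding of the series followed by thresholding of pairwise distances. The embedding dimension, delay, and epsilon rule of thumb are illustrative choices that, in practice, are set by standard embedding criteria.

```python
import numpy as np

def recurrence_plot(x, dim=3, delay=1, eps=None):
    """Binary recurrence matrix of a scalar time series."""
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    if eps is None:
        eps = 0.1 * dists.max()             # rule of thumb: fraction of max distance
    return (dists < eps).astype(np.uint8)   # 1 marks a recurrence
```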

  20. Next Generation Detection Systems for Radioactive Material Analysis

    Science.gov (United States)

    Britton, R.; Regan, P. H.; Burnett, J. L.; Davies, A. V.

    2014-05-01

    Compton suppression techniques have been widely used to reduce the Minimum Detectable Activity of various radionuclides when performing gamma spectroscopy of environmental samples. This is achieved by utilising multiple detectors to reduce the contribution of photons that Compton scatter out of the detector crystal, only partially depositing their energy. Photons that are Compton scattered out of the primary detector are captured by a surrounding detector, and the corresponding events are vetoed from the final dataset using coincidence-based fast-timing electronics. The current work presents the use of a LynxTM data acquisition module from Canberra Industries (USA) to collect data in 'List Mode', where each event is time-stamped for offline analysis. A post-processor developed to analyse such datasets allows optimisation of the coincidence delay, and then identifies and suppresses events within this time window. This is the same process used in conventional systems with fast-timing electronics; in the work presented, however, data can be re-analysed using multiple time and energy windows. All data are also preserved and recorded (in traditional systems, coincident events are lost as they are vetoed in real time), and the results are achieved with a greatly simplified experimental setup. Monte Carlo simulations of Compton suppression systems have been completed to support the optimisation work and are also presented here.
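
    A sketch of the offline veto step described above, assuming sorted timestamp arrays extracted from the list-mode data; the window width is illustrative and, as in the text, can be re-optimized by simply re-running the post-processor with different windows.

```python
import numpy as np

def accept_mask(primary_ts, guard_ts, window_ns=500.0):
    """Return a boolean mask of primary-detector events to keep: an event is
    vetoed if the guard detector fired within the coincidence window."""
    idx = np.searchsorted(guard_ts, primary_ts)
    lo = np.clip(idx - 1, 0, len(guard_ts) - 1)
    hi = np.clip(idx, 0, len(guard_ts) - 1)
    nearest = np.minimum(np.abs(primary_ts - guard_ts[lo]),
                         np.abs(guard_ts[hi] - primary_ts))
    return nearest > window_ns
```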

  1. A Bubble Detection Algorithm Based on Sparse and Redundant Image Processing

    Directory of Open Access Journals (Sweden)

    Ye Tian

    2013-06-01

    Full Text Available Deinked pulp flotation columns have been applied in wastepaper recycling, and the bubble size in a deinked pulp flotation column is very important during the flotation process. In this paper, bubble images of a deinked pulp flotation column were first captured by a digital camera, and the bubbles were then detected by using a detection algorithm based on sparse and redundant image processing. The results show that the algorithm is practical and effective for bubble detection in deinked pulp flotation columns.

  2. Heart Beat Detection in Noisy ECG Signals Using Statistical Analysis of the Automatically Detected Annotations

    Directory of Open Access Journals (Sweden)

    Andrius Gudiškis

    2015-07-01

    Full Text Available This paper proposes an algorithm to reduce the influence of noise distortion on heartbeat annotation detection in electrocardiogram (ECG) signals. The boundary estimation module is based on an energy detector. Heartbeat detection is usually performed by QRS detectors that are able to find QRS regions in an ECG signal, which are a direct representation of a heartbeat. However, QRS detectors perform as intended only in cases where ECG signals have a high signal-to-noise ratio; when signal distortion is more noticeable, detector accuracy decreases. The proposed algorithm uses additional data, taken from an arterial blood pressure signal recorded in parallel with the ECG signal, to support the QRS detection process in distorted signal areas. The proposed algorithm performs as well as classical QRS detectors in cases where the signal-to-noise ratio is high, compared to the heartbeat annotations provided by experts. In signals with a considerably lower signal-to-noise ratio, the proposed algorithm improved the detection accuracy by up to 6%.

  3. Detection Tuna and Processed Products Based Protein and DNA Barcoding

    Directory of Open Access Journals (Sweden)

    Nuring Wulansari

    2015-11-01

    Full Text Available Tuna is the second largest fishery commodity in Indonesia after shrimp. High demand and a limited stock of tuna create opportunities for fraud. Authentication is required to assure consumers of the accuracy of labeling and of food safety. In this study, authentication was based on protein analysis and on DNA barcoding using the cytochrome-b gene (cyt-b) of the mitochondrial DNA as the target gene. Primers for the cyt-b gene were designed based on the tuna species. This study aimed to identify the authenticity of fresh tuna and its processed products through protein analysis using SDS-PAGE and through DNA barcoding techniques. The phases of this research were protein electrophoresis by SDS-PAGE, DNA extraction, PCR amplification, electrophoresis and sequencing. Samples of fresh fish (Tu1, Tu2, Tu3, Tu4, and Tu5) and processed tuna (canned and steak) were successfully extracted. The results showed that SDS-PAGE revealed protein degradation in the processed tuna, so this method is not appropriate for identifying the authenticity of processed tuna. PCR electrophoresis results showed that the samples of tuna, tuna steak, sushi, meat ball, abon, and canned tuna were successfully amplified in the range of 500-750 bp, except Ka3, in line with the target DNA (620 bp). The sequences of Tu2, Tu3, Tu4 and Tu5 were identified, in agreement with the morphometric results, as T. albacares, while Tu1 was identified as T. obesus with a homology level of 99%. The processed tunas (steak and canned) were identified as T. albacares, as stated on their labels.

  4. Optodynamic analysis of pulsed-laser processing with a Nd:YAG laser

    OpenAIRE

    Strgar, Simon; Možina, Janez

    2015-01-01

    Laser drilling and laser marking of metals with a pulsed Nd:YAG laser are discussed. Some characteristics of pulsed-laser processing and the possibilities of process optodynamic analysis are presented for the laser-drilling of aluminium. The optodynamic analysis is based on observation of generated shock waves, which propagate in the material as well as in the surrounding air during laser processing. For the detection of laser-induced shock waves in the air and for measurements of their chara...

  5. Generic Packing Detection Using Several Complexity Analysis for Accurate Malware Detection

    Directory of Open Access Journals (Sweden)

    Dr. Mafaz Mohsin Khalil Al-Anezi

    2014-01-01

    Full Text Available Attackers do not want their malicious software (malware) to be revealed by anti-virus analyzers. In order to conceal their malware, malware programmers use anti-reverse-engineering and code-changing techniques such as packing, encoding and encryption. Malware writers have learned that signature-based detectors can be easily evaded by “packing” the malicious payload in layers of compression or encryption. State-of-the-art malware detectors have adopted both static and dynamic techniques to recover the payload of packed malware, but unfortunately such techniques are highly ineffective: if the malware is packed or encrypted, it is very difficult to analyze. Therefore, to prevent the harmful effects of malware and to generate signatures for malware detection, packed and encrypted executable code must initially be unpacked. The first step of unpacking is to detect packed executable files. The objective is to efficiently and accurately distinguish between packed and non-packed executables, so that only executables detected as packed are sent to a generic unpacker, thus saving a significant amount of processing time. The generic method of this paper achieves very high detection accuracy of packed executables with a low average processing time. In this paper, a packed-file detection technique based on complexity, measured by several algorithms, is presented and tested on a dataset of packed and unpacked .exe files. The preliminary results are very promising: high accuracy was achieved with adequate performance, with about a 96% detection rate on packed files and a 93% detection rate on unpacked files. The experiments also demonstrate that this generic technique can effectively detect unknown, obfuscated malware and cannot be evaded by known evasion techniques.
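
    As a small illustration of one plausible complexity measure among the several the paper combines, the sketch below flags files whose byte entropy approaches the 8 bits/byte of uniformly random data, a signature of compressed or encrypted (packed) sections; the threshold is illustrative.

```python
import math
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte (range 0..8)."""
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

def looks_packed(data: bytes, threshold: float = 7.2) -> bool:
    """Packed/encrypted payloads have near-uniform byte distributions."""
    return byte_entropy(data) >= threshold
```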

  6. Robust and sensitive video motion detection for sleep analysis.

    Science.gov (United States)

    Heinrich, Adrienne; Geng, Di; Znamenskiy, Dmitry; Vink, Jelte Peter; de Haan, Gerard

    2014-05-01

    In this paper, we propose a camera-based system combining video motion detection, motion estimation, and texture analysis with machine learning for sleep analysis. The system is robust to time-varying illumination conditions while using standard camera and infrared illumination hardware. We tested the system for periodic limb movement (PLM) detection during sleep, using EMG signals as a reference. We evaluated the motion detection performance both per frame and with respect to movement event classification relevant for PLM detection. The Matthews correlation coefficient improved by a factor of 2 compared to a state-of-the-art motion detection method, while sensitivity and specificity increased by 45% and 15%, respectively. Movement event classification improved by a factor of 6 and 3 in constant and highly varying lighting conditions, respectively. On 11 PLM patient test sequences, the proposed system achieved a 100% accurate PLM index (PLMI) score with only a slight temporal misalignment of the starting time, indicating that camera-based PLM detection during sleep is feasible and can give an indication of the PLMI score.

  7. A New Method of Color Edge Detection Based on Local Structure Analysis

    Institute of Scientific and Technical Information of China (English)

    JIANG Shu; ZHOU Yue; ZHU Wei-wei

    2008-01-01

    Humans live in a colorful world. Compared to gray images, color images contain more information and have better visual effects. In today's digital image processing, image segmentation is an important step for computers to "understand" images, and edge detection is one of the most important methods in the field of image segmentation. Edges in color images are considered as local discontinuities in both the color and spatial domains. Despite intensive study based on the integration of single-channel edge detection results, and on vector space analysis, edge detection in color images remains a challenging issue.

  8. Guided Wave Delamination Detection and Quantification With Wavefield Data Analysis

    Science.gov (United States)

    Tian, Zhenhua; Campbell Leckey, Cara A.; Seebo, Jeffrey P.; Yu, Lingyu

    2014-01-01

    Unexpected damage can occur in aerospace composites due to impact events or material stress during off-nominal loading events. In particular, laminated composites are susceptible to delamination damage due to weak transverse tensile and inter-laminar shear strengths. Developments of reliable and quantitative techniques to detect delamination damage in laminated composites are imperative for safe and functional optimally-designed next-generation composite structures. In this paper, we investigate guided wave interactions with delamination damage and develop quantification algorithms by using wavefield data analysis. The trapped guided waves in the delamination region are observed from the wavefield data and further quantitatively interpreted by using different wavenumber analysis methods. The frequency-wavenumber representation of the wavefield shows that new wavenumbers are present and correlate to trapped waves in the damage region. These new wavenumbers are used to detect and quantify the delamination damage through the wavenumber analysis, which can show how the wavenumber changes as a function of wave propagation distance. The location and spatial duration of the new wavenumbers can be identified, providing a useful means not only for detecting the presence of delamination damage but also allowing for estimation of the delamination size. Our method has been applied to detect and quantify real delamination damage with complex geometry (grown using a quasi-static indentation technique). The detection and quantification results show the location, size, and shape of the delamination damage.

  9. Experimental investigation of thermal neutron analysis based landmine detection technology

    International Nuclear Information System (INIS)

    Background: The prompt gamma-ray neutron activation analysis method is widely used in coal analysis and explosive detection; however, there have been few applications of neutron methods to landmine detection, especially in domestic research. Purpose: To verify the feasibility of the Thermal Neutron Analysis (TNA) method for landmine detection, and to explore the characteristics of this technology. Methods: An experimental TNA landmine detection system was built based on a LaBr3(Ce) fast scintillator detector and a 252Cf isotope neutron source. The system comprises the thermal neutron transition system, the shield system, and the detector system. Results: On the basis of TNA, the wide-energy-range calibration method, especially for the high-energy region, was investigated, and the minimum detection time for a typical mine was defined. In this study, the 72-type anti-tank mine, a 500 g TNT sample and several interfering objects were tested in loess, red soil, magnetic soil and sand, respectively. Conclusions: The experimental results indicate that TNA is a reliable demining method, and it can be used to confirm the existence of Anti-Tank Mines (ATM) and large Anti-Personnel Mines (APM) in complicated conditions. (authors)

  10. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations (COSO), 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of quantifying risk to enhance the degree of objectivity in finance, for instance, was quite parallel to its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEI). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a university in the greater Eastern Cape Province of South Africa. The analysis of the likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant, with a chi-square (X2 = 8.181; p = 0.300), which indicated a good model fit, since the data did not significantly deviate from the model. The study concluded that to derive an overall likelihood rating that indicates the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat-source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).

  11. Mass Detection in Mammographic Images Using Wavelet Processing and Adaptive Threshold Technique.

    Science.gov (United States)

    Vikhe, P S; Thool, V R

    2016-04-01

    Detection of masses in mammograms for early diagnosis of breast cancer is a significant task in the reduction of the mortality rate. However, in some cases, screening for masses is a difficult task for the radiologist, due to variations in contrast, fuzzy edges and noisy mammograms. Masses and micro-calcifications are the distinctive signs for diagnosis of breast cancer. This paper presents a method for mass enhancement using a piecewise linear operator in combination with wavelet processing of mammographic images. The method includes artifact suppression and pectoral muscle removal based on morphological operations. Finally, mass segmentation for detection using an adaptive threshold technique is carried out to separate the mass from the background. The proposed method has been tested on 130 (45 + 85) images, with 90.9 and 91 % True Positive Fraction (TPF) at 2.35 and 2.1 average False Positives Per Image (FP/I), from two different databases, namely the Mammographic Image Analysis Society (MIAS) and the Digital Database for Screening Mammography (DDSM). The obtained results show that the proposed technique gives improved diagnosis in early breast cancer detection. PMID:26811073

  12. Space-time signal processing for distributed pattern detection in sensor networks

    Science.gov (United States)

    Paffenroth, Randy C.; Du Toit, Philip C.; Scharf, Louis L.; Jayasumana, Anura P.; Banadara, Vidarshana; Nong, Ryan

    2012-05-01

    We present a theory and algorithm for detecting and classifying weak, distributed patterns in network data that provide actionable information with quantifiable measures of uncertainty. Our work demonstrates the effectiveness of space-time inference on graphs, robust matrix completion, and second-order analysis for the detection of distributed patterns that are not discernible at the level of individual nodes. Motivated by the importance of the problem, we are specifically interested in detecting weak patterns in computer networks related to Cyber Situational Awareness. Our focus is on scenarios where the nodes (terminals, routers, servers, etc.) are sensors that provide measurements (of packet rates, user activity, central processing unit usage, etc.) that, when viewed independently, cannot provide a definitive determination of the underlying pattern, but when fused with data from across the network, both spatially and temporally, the relevant patterns emerge. The approach is applicable to many types of sensor networks including computer networks, wireless networks, mobile sensor networks, and social networks, as well as in contexts such as databases and disease outbreaks.

  13. Processing of Instantaneous Angular Speed Signal for Detection of a Diesel Engine Failure

    Directory of Open Access Journals (Sweden)

    Adam Charchalis

    2013-01-01

    Full Text Available Continuous monitoring of diesel engine performance during operation is critical for the prediction of malfunction development and the subsequent detection of functional failures. Analysis of the instantaneous angular speed (IAS) of the crankshaft is considered one of the nonintrusive and effective methods for detecting deterioration of combustion quality. In this paper, results of the experimental verification of fuel system malfunction detection, using an optical encoder for IAS recording, are presented. The implemented method relies on the comparison of measurement results recorded under healthy and faulty conditions of the engine. An elaborated dynamic model of angular speed variations enables us to build templates of engine behavior. Values of cylinder pressure recorded during the experiment were taken for the approximation of the basic pressure waveform. The main task of the data processing is smoothing the raw angular speed signal; the noise is due to sensor mount vibrations, signal emitter machining, engine body vibrations, and crankshaft torsional vibrations. Smoothing of the measurement data was carried out by implementation of the Savitzky-Golay filter. The measured signal, after smoothing, was compared with the model IAS run.
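
    A minimal sketch of the smoothing step, using SciPy's Savitzky-Golay filter as named in the abstract; the window length and polynomial order are illustrative and would be tuned to the encoder resolution and engine speed.

```python
from scipy.signal import savgol_filter

def smooth_ias(raw_ias, window=51, polyorder=3):
    """Polynomial smoothing of the raw instantaneous angular speed signal."""
    return savgol_filter(raw_ias, window_length=window, polyorder=polyorder)

def ias_residual(raw_ias, healthy_template):
    """Residual against the healthy-engine template; systematic deviations
    point to cylinders with deteriorated combustion."""
    return smooth_ias(raw_ias) - healthy_template
```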

  14. Multisensor Network System for Wildfire Detection Using Infrared Image Processing

    Directory of Open Access Journals (Sweden)

    I. Bosch

    2013-01-01

    Full Text Available This paper presents the next step in the evolution of multi-sensor wireless network systems for the early automatic detection of forest fires. This network allows remote monitoring of each of the locations as well as communication between each of the sensors and the control stations. The result is an increased coverage area, with quicker and safer responses. To determine the presence of a forest wildfire, the system employs decision fusion in thermal imaging, which can exploit various expected characteristics of a real fire, including short-term persistence and long-term increases over time. Results from testing in the laboratory and in a real environment are presented to authenticate and verify the accuracy of the operation of the proposed system. The system performance is gauged by the number of alarms and the time to the first alarm (corresponding to a real fire) for different probabilities of false alarm (PFA). The necessity of including decision fusion is thereby demonstrated.
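
    A toy sketch of decision fusion over the two cues named above (short-term persistence and long-term increase) on a stack of thermal frames; all thresholds and window lengths are illustrative, not the system's calibrated values.

```python
import numpy as np

def fire_alarm(frames, hot_thresh, persist_n=10, trend_n=100, min_slope=0.5):
    """frames: 3-D array (time, H, W) of thermal images. Alarm only when a
    pixel stays hot in every recent frame AND scene intensity is rising."""
    hot = frames > hot_thresh
    persistent = hot[-persist_n:].all(axis=0)          # short-term persistence
    history = frames[-trend_n:].mean(axis=(1, 2))      # long-term intensity
    slope = np.polyfit(np.arange(len(history)), history, 1)[0]
    return bool(persistent.any() and slope > min_slope)
```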

  15. Decision analysis applications and the CERCLA process

    Energy Technology Data Exchange (ETDEWEB)

    Purucker, S.T.; Lyon, B.F. [Oak Ridge National Lab., TN (United States). Risk Analysis Section]|[Univ. of Tennessee, Knoxville, TN (United States)

    1994-06-01

    Quantitative decision methods can be developed during environmental restoration projects that incorporate stakeholder input and can complement current efforts that are undertaken for data collection and alternatives evaluation during the CERCLA process. These decision-making tools can supplement current EPA guidance as well as focus on problems that arise as attempts are made to make informed decisions regarding remedial alternative selection. In examining the use of such applications, the authors discuss the use of decision analysis tools and their impact on collecting data and making environmental decisions from a risk-based perspective. They will look at the construction of objective functions for quantifying different risk-based decision rules that incorporate stakeholder concerns. This represents a quantitative method for implementing the Data Quality Objective (DQO) process. These objective functions can be expressed using a variety of indices to analyze problems that currently arise in the environmental field. Examples include cost, magnitude of risk, efficiency, and probability of success or failure. Based on such defined objective functions, a project can evaluate the impact of different risk and decision selection strategies on data worth and alternative selection.

  16. Thermodynamic Analysis of Nanoporous Membrane Separation Processes

    Science.gov (United States)

    Rogers, David; Rempe, Susan

    2011-03-01

    We give an analysis of desalination energy requirements in order to quantify the potential for future improvements in desalination membrane technology. Our thermodynamic analysis makes it possible to draw conclusions from the vast array of equilibrium molecular dynamics simulations present in the literature as well as create a standardized comparison for measuring and reporting experimental reverse osmosis material efficiency. Commonly employed methods for estimating minimum desalination energy costs have been revised to include operations at positive input stream recovery ratios using a thermodynamic cycle analogous to the Carnot cycle. Several gaps in the statistical mechanical theory of irreversible processes have also been identified which may in the future lead to improved communication between materials engineering models and statistical mechanical simulation. Simulation results for silica surfaces and nanochannels are also presented. Sandia National Laboratories is a multi-program laboratory operated by Sandia Corporation, a subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  17. Fast Change Point Detection for Electricity Market Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Berkeley, UC; Gu, William; Choi, Jaesik; Gu, Ming; Simon, Horst; Wu, Kesheng

    2013-08-25

    Electricity is a vital part of our daily life; therefore it is important to avoid irregularities such as the California Electricity Crisis of 2000 and 2001. In this work, we seek to predict anomalies using advanced machine learning algorithms. These algorithms are effective, but computationally expensive, especially if we plan to apply them on hourly electricity market data covering a number of years. To address this challenge, we significantly accelerate the computation of the Gaussian Process (GP) for time series data. In the context of a Change Point Detection (CPD) algorithm, we reduce its computational complexity from O($n^{5}$) to O($n^{2}$). Our efficient algorithm makes it possible to compute the Change Points using the hourly price data from the California Electricity Crisis. By comparing the detected Change Points with known events, we show that the Change Point Detection algorithm is indeed effective in detecting signals preceding major events.
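
    The paper's contribution is algorithmic (reducing the GP-based CPD cost from O(n^5) to O(n^2)); as a much simpler stand-in that conveys what change point detection does on a price series, the sketch below uses a CUSUM-style detector. It is not the GP method described above, and the drift and threshold values are assumptions.

        # Simple CUSUM-style change point detector (illustrative stand-in only).
        import numpy as np

        def cusum_changepoints(x: np.ndarray, drift: float = 0.5,
                               threshold: float = 8.0) -> list:
            """Indices where the standardized series' cumulative drift exceeds
            the threshold; the accumulators reset after each detection."""
            z = (x - x.mean()) / x.std()
            pos = neg = 0.0
            points = []
            for i, v in enumerate(z):
                pos = max(0.0, pos + v - drift)
                neg = min(0.0, neg + v + drift)
                if pos > threshold or neg < -threshold:
                    points.append(i)
                    pos = neg = 0.0
            return points

        prices = np.concatenate([np.random.normal(30, 2, 500),
                                 np.random.normal(120, 20, 500)])  # regime shift
        print(cusum_changepoints(prices))  # expect a detection near index 500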

  18. Elastic recoil detection analysis of hydrogen in polymers

    Energy Technology Data Exchange (ETDEWEB)

    Winzell, T.R.H.; Whitlow, H.J. [Lund Univ. (Sweden); Bubb, I.F.; Short, R.; Johnston, P.N. [Royal Melbourne Inst. of Tech., VIC (Australia)

    1996-12-31

    Elastic recoil detection analysis (ERDA) of hydrogen in thick polymeric films has been performed using 2.5 MeV He{sup 2+} ions from the tandem accelerator at the Royal Melbourne Institute of Technology. The technique enables the use of the same equipment as in Rutherford backscattering analysis, but instead of detecting the incident backscattered ion, the lighter recoiled ion is detected at a small forward angle. The purpose of this work is to investigate how selected polymers react when irradiated by helium ions. The polymers are to be evaluated for their suitability as reference standards for hydrogen depth profiling. Films investigated were Du Pont's Kapton and Mylar, and polystyrene. 11 refs., 3 figs.

  19. Dynamic QCM Biosensor Detection of Human Serum Albumin and Process Analysis

    Institute of Scientific and Technical Information of China (English)

    刘宪华; 赵勇; 张林; 冯梦南; 赵友全; 鲁逸人

    2012-01-01

    The screening of urinary human serum albumin (HSA) is significant for the early diagnosis and intervention of nephropathies. An immunosensor based on a quartz crystal microbalance (QCM) was developed for dynamic, real-time detection of HSA. First, the quartz crystal resonator was modified with lipoic acid, forming a self-assembled monolayer on its surface. Then, the carboxyl groups on the modified surface were activated with 1-(3-(dimethylamino)propyl)-3-ethylcarbodiimide hydrochloride (EDC) and N-hydroxysuccinimide (NHS) in order to couple the anti-HSA antibody. In this experiment, the frequency changes recorded by the QCM biosensor were related to HSA concentrations. The results show that HSA can be quantitatively detected with the developed immunosensor within the mass concentration range of 0.1-6 μg/mL, with a minimum detectable mass concentration of 0.1 μg/mL. The linear equation between frequency response and HSA concentration is y = 11.276x + 10.351 with R² = 0.9994; within the detection range, the linear relationship is satisfactory.

  20. Apolipoprotein B100 analysis in microchip with electrochemical detection

    Institute of Scientific and Technical Information of China (English)

    Cheng Cheng Liu; Yun Liu; Hui Xiang Wang; Yan Bo Qi; Peng Yuan Yang; Bao Hong Liu

    2011-01-01

    Apolipoprotein B100 (apoB-100) is a major protein of the cholesterol-rich low-density lipoprotein (LDL) and reflects the total atherogenic burden to the vascular system better than LDL. In this work, a simple and sensitive method has been developed to determine apoB-100 in picoliter-scale samples using a PMMA microfluidic chip coupled with an electrochemical detection system. The method performs very well, with a linear range of 1-800 pg/mL and a detection limit of 1 pg/mL. A real serum sample was further analyzed with this microchip-based biosensor. The results show that the method is practicable and has potential applications in clinical analysis and diagnosis.

  1. Detection of charged particles through a photodiode: design and analysis

    International Nuclear Information System (INIS)

    This project covers the design, construction and analysis of a charged-particle detector based on a silicon PIN photodiode array. The PIN photodiode, generally used to detect visible light, offers good efficiency, compact size and low cost, which makes it well suited to radiation monitoring and alpha-particle detection. The design of the detection system is presented together with its characterization for alpha particles, reporting the alpha energy resolution and the detection efficiency. The equipment used in this work consists of a triple alpha source composed of Am-241, Pu-239 and Cm-244 with a total activity of 5.55 kBq, the Maestro 32 software made by ORTEC, a Triumph multi-channel card from ORTEC, and a low-activity electroplated uranium sample. (Author)

  2. Detection of Abnormal Events via Optical Flow Feature Analysis

    Directory of Open Access Journals (Sweden)

    Tian Wang

    2015-03-01

    Full Text Available In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of optical flow orientation descriptor and a classification method. The details of the descriptor, which captures the movement information of the global video frame or of the foreground frame, are illustrated. By combining one-class support vector machine and kernel principal component analysis methods, abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The different abnormal-detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm.

  3. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    Science.gov (United States)

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  4. Gravitational wave detection and data analysis for pulsar timing arrays

    NARCIS (Netherlands)

    Haasteren, Rutger van

    2011-01-01

    Long-term precise timing of Galactic millisecond pulsars holds great promise for measuring long-period (months-to-years) astrophysical gravitational waves. In this work we develop a Bayesian data analysis method for projects called pulsar timing arrays, projects aimed at detecting these gravitational waves.

  5. Spiral analysis-improved clinical utility with center detection.

    Science.gov (United States)

    Wang, Hongzhi; Yu, Qiping; Kurtis, Mónica M; Floyd, Alicia G; Smith, Whitney A; Pullman, Seth L

    2008-06-30

    Spiral analysis is a computerized method that measures human motor performance from handwritten Archimedean spirals. It quantifies normal motor activity, and detects early disease as well as dysfunction in patients with movement disorders. The clinical utility of spiral analysis is based on kinematic and dynamic indices derived from the original spiral trace, which must be detected and transformed into mathematical expressions with great precision. Accurately determining the center of the spiral and reducing spurious low frequency noise caused by center selection error is important to the analysis. Handwritten spirals do not all start at the same point, even when marked on paper, and drawing artifacts are not easily filtered without distortion of the spiral data and corruption of the performance indices. In this report, we describe a method for detecting the optimal spiral center and reducing the unwanted drawing artifacts. To demonstrate overall improvement to spiral analysis, we study the impact of the optimal spiral center detection in different frequency domains separately and find that it notably improves the clinical spiral measurement accuracy in low frequency domains.

  6. Sparse principal component analysis in hyperspectral change detection

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Larsen, Rasmus; Vestergaard, Jacob Schack

    2011-01-01

    This contribution deals with change detection by means of sparse principal component analysis (PCA) of simple differences of calibrated, bi-temporal HyMap data. Results show that if we retain only 15 nonzero loadings (out of 126) in the sparse PCA, the resulting change scores appear visually very ...
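
    A hedged sketch of this kind of analysis is given below: sparse PCA applied to band-wise differences of a bi-temporal image cube, retaining only a few nonzero loadings. The array shapes, the sparsity parameter and the synthetic data are assumptions, not the HyMap configuration used in the study.

        # Sparse PCA change scores from simple differences of bi-temporal data.
        import numpy as np
        from sklearn.decomposition import SparsePCA

        rows, cols, bands = 40, 40, 126
        rng = np.random.default_rng(0)
        t1 = rng.normal(size=(rows, cols, bands))
        t2 = t1 + rng.normal(scale=0.1, size=t1.shape)   # mostly unchanged scene

        diff = (t2 - t1).reshape(-1, bands)   # one difference spectrum per pixel

        spca = SparsePCA(n_components=1, alpha=2.0)  # alpha drives loadings to 0
        scores = spca.fit_transform(diff)            # change score per pixel
        print("nonzero loadings:", np.count_nonzero(spca.components_), "of", bands)
        change_map = scores.reshape(rows, cols)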

  7. Detecting fast, online reasoning processes in clinical decision making.

    Science.gov (United States)

    Flores, Amanda; Cobos, Pedro L; López, Francisco J; Godoy, Antonio

    2014-06-01

    In an experiment that used the inconsistency paradigm, experienced clinical psychologists and psychology students performed a reading task using clinical reports and a diagnostic judgment task. The clinical reports provided information about the symptoms of hypothetical clients who had been previously diagnosed with a specific mental disorder. Reading times of inconsistent target sentences were slower than those of control sentences, demonstrating an inconsistency effect. The results also showed that experienced clinicians gave different weights to different symptoms according to their relevance when fluently reading the clinical reports provided, despite the fact that all the symptoms were of equal diagnostic value according to the Diagnostic and Statistical Manual of Mental Disorders (4th ed., text rev.; American Psychiatric Association, 2000). The diagnostic judgment task yielded a similar pattern of results. In contrast to previous findings, the results of the reading task may be taken as direct evidence of the intervention of reasoning processes that occur very early, rapidly, and online. We suggest that these processes are based on the representation of mental disorders and that these representations are particularly suited to fast retrieval from memory and to making inferences. They may also be related to the clinicians' causal reasoning. The implications of these results for clinician training are also discussed. PMID:24274045

  8. Detection of contamination on selected apple cultivars using reflectance hyperspectral and multispectral analysis

    Science.gov (United States)

    Mehl, Patrick M.; Chao, Kevin; Kim, Moon S.; Chen, Yud-Ren

    2001-03-01

    The presence of natural or exogenous contamination on apple cultivars is a food safety and quality concern that touches the general public and strongly affects this commodity market. Accumulations of human pathogens are usually observed on surface lesions of commodities, so detection of either the lesions or the pathogens themselves is essential for assuring the quality and safety of commodities. We present the application of hyperspectral image analysis toward the development of multispectral techniques for the detection of defects on selected apple cultivars, such as Golden Delicious, Red Delicious, and Gala apples. Different apple cultivars possess different spectral characteristics, leading to different approaches for analysis. General preprocessing with morphological treatments is followed by different image treatments and condition analyses to highlight lesions and contamination on the apple cultivars. Good isolation of scab, fungal and soil contamination, and bruises is observed with hyperspectral image processing, either using principal component analysis or utilizing the chlorophyll absorption peak. Application of the hyperspectral results to multispectral detection is limited by the spectral capabilities of our RGB camera using either specific band-pass filters or direct neutral filters. Good separation of defects is obtained for Golden Delicious apples; it is, however, limited for the other cultivars. An extra near-infrared channel would increase the detection level by utilizing the chlorophyll absorption band, as demonstrated by the present hyperspectral imaging analysis.

  9. Rapid Detection of Biological and Chemical Threat Agents Using Physical Chemistry, Active Detection, and Computational Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Myung; Dong, Li; Fu, Rong; Liotta, Lance; Narayanan, Aarthi; Petricoin, Emanuel; Ross, Mark; Russo, Paul; Zhou, Weidong; Luchini, Alessandra; Manes, Nathan; Chertow, Jessica; Han, Suhua; Kidd, Jessica; Senina, Svetlana; Groves, Stephanie

    2007-01-01

    Basic technologies have been successfully developed within this project: rapid collection of aerosols and a rapid ultra-sensitive immunoassay technique. Water-soluble, humidity-resistant polyacrylamide nano-filters were shown to (1) capture aerosol particles as small as 20 nm, (2) work in humid air and (3) completely liberate their captured particles in an aqueous solution compatible with the immunoassay technique. The immunoassay technology developed within this project combines electrophoretic capture with magnetic bead detection. It allows detection of as few as 150-600 analyte molecules or viruses in only three minutes, something no other known method can duplicate. The technology can be used in a variety of applications where speed of analysis and/or extremely low detection limits are of great importance: in rapid analysis of donor blood for hepatitis, HIV and other blood-borne infections in emergency blood transfusions, in trace analysis of pollutants, or in search of biomarkers in biological fluids. Combined in a single device, the water-soluble filter and ultra-sensitive immunoassay technique may solve the problem of early warning type detection of aerosolized pathogens. These two technologies are protected with five patent applications and are ready for commercialization.

  10. Hybridization kinetics analysis of an oligonucleotide microarray for microRNA detection

    Institute of Scientific and Technical Information of China (English)

    Botao Zhao; Shuo Ding; Wei Li; Youxin Jin

    2011-01-01

    MicroRNA (miRNA) microarrays have been successfully used for profiling miRNA expression in many physiological processes such as development, differentiation, oncogenesis, and other disease processes. Detecting miRNA by miRNA microarray is actually based on nucleic acid hybridization between target molecules and their corresponding complementary probes. Due to the small size and high degree of similarity among miRNA sequences, the hybridization condition must be carefully optimized to get specific and reliable signals. Previously, we reported a microarray platform to detect miRNA expression. In this study, we evaluated the sensitivity and specificity of our microarray platform. After systematic analysis, we determined an optimized hybridization condition with high sensitivity and specificity for miRNA detection. Our results would be helpful for other hybridization-based miRNA detection methods, such as northern blot and nuclease protection assay.

  11. Differential functioning of retrieval/comparison processing in detection of the presence and absence of change.

    Directory of Open Access Journals (Sweden)

    Takuma Murakoshi

    Full Text Available BACKGROUND: The phenomenon of change blindness may reflect the failure to detect the presence of change or the absence of change. Although performing the latter is considered more difficult than the former, the differential functioning of retrieval/comparison processing that leads to differences between the detection of the presence and the absence of change has not been clarified. This study aimed to fill this research gap by comparing performance in the detection of the presence and the absence of a change in one item among a set of items. METHODOLOGY/PRINCIPAL FINDINGS: Twenty subjects performed two types of change detection tasks: the first was detection of one changed item among a set of unchanged items (detection of the presence of a change), and the other was detection of one unchanged item among a set of changed items (detection of the absence of a change). The ANOVA results for the percentage of correct responses and the signal detection measurement of A' values regarding change detection, together with the pattern of the results, indicate that the subjects found (1) detection of the presence of change less difficult than detection of the absence of change, (2) rejection of the presence of change less difficult than acceptance of the presence of change, and (3) rejection of the absence of change as difficult as acceptance of the absence of change. CONCLUSIONS/SIGNIFICANCE: Retrieval/comparison processing for the detection of the presence of change differs from that for the absence of change, likely because the retrieval/comparison process appears aimed at determining whether an item has changed but not whether an item appears the same as it had previously. This conclusion suggests the existence of an identification process that recognizes each item as the same as that observed previously, existing apart from the mechanism underlying retrieval/comparison processing.

  12. Citation-based plagiarism detection detecting disguised and cross-language plagiarism using citation pattern analysis

    CERN Document Server

    Gipp, Bela

    2014-01-01

    Plagiarism is a problem with far-reaching consequences for the sciences. However, even today's best software-based systems can only reliably identify copy & paste plagiarism. Disguised plagiarism forms, including paraphrased text, cross-language plagiarism, as well as structural and idea plagiarism, often remain undetected. This weakness of current systems results in a large percentage of scientific plagiarism going undetected. Bela Gipp provides an overview of the state of the art in plagiarism detection and an analysis of why these approaches fail to detect disguised plagiarism forms.

  13. Comparison of performance between rescaled range analysis and rescaled variance analysis in detecting abrupt dynamic change

    Institute of Scientific and Technical Information of China (English)

    何文平; 刘群群; 姜允迪; 卢莹

    2015-01-01

    In the present paper, a comparison of the performance between moving cutting data-rescaled range analysis (MC-R/S) and moving cutting data-rescaled variance analysis (MC-V/S) is made. The results clearly indicate that the operating efficiency of the MC-R/S algorithm is higher than that of the MC-V/S algorithm. In our numerical test, the computer time consumed by MC-V/S is approximately 25 times that by MC-R/S for an identical window size in artificial data. Except for the difference in operating efficiency, there are no significant differences in performance between MC-R/S and MC-V/S for the abrupt dynamic change detection. MC-R/S and MC-V/S both display some degree of anti-noise ability. However, it is important to consider the influences of strong noise on the detection results of MC-R/S and MC-V/S in practical application processes.

  14. Adulteration detection of Brazilian gasoline samples by statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    L.S.M. Wiedemann; L.A. d' Avila; D.A. Azevedo [Universidade Federal do Rio de Janeiro, Rio de Janeiro (Brazil). Departamento de Quimica Organica, Instituto de Quimica

    2005-03-01

    Unfortunately, the addition of organic solvents (heavy aliphatic, light aliphatic and aromatic hydrocarbons) to Brazilian gasoline is very frequent, and this illegal practice compromises gasoline quality. Organic solvent adulteration of gasoline samples has been investigated. For characterization and comparison of these samples, physico-chemical parameters together with gas chromatographic data were proposed as the factors for multivariate analysis. Hierarchical cluster analysis was used to improve the detection of the type of solvent and the relative proportion used in this practice, revealing more detailed information about the samples' compositions. It was found that using physico-chemical properties of gasoline samples together with statistical analysis is a useful method for adulteration detection. 20 refs., 5 figs., 2 tabs.

  15. Data processing of ground-penetrating radar signals for the detection of discontinuities using polarization diversity

    Science.gov (United States)

    Tebchrany, Elias; Sagnard, Florence; Baltazart, Vincent; Tarel, Jean-Phillippe

    2014-05-01

    In civil engineering, ground penetrating radar (GPR) is used to survey pavement thickness at traffic speed and to detect and localize buried objects (pipes, cables, voids, cavities) and zones of cracks and discontinuities in concrete or soils. In this work, a ground-coupled radar made of a pair of transmitting and receiving bowtie-slot antennas is moved linearly on the soil surface to detect the reflected waves induced by discontinuities in the subsurface. The GPR system operates in the frequency domain with a step-frequency continuous wave (SFCW) generated by a Vector Network Analyzer (VNA) in the ultra-wide band [0.3; 4] GHz. The detection of targets is usually focused on time imaging; targets of limited size show up as diffraction hyperbolas on a B-scan image, which is an unfocused depiction of the scatterers. The contrast in permittivity and the ratio between the size of the object and the wavelength are important parameters in the detection process. We have therefore made a first study of the use of polarization diversity to obtain additional information on the contrast between the soil and the target and on the dielectric characteristics of a target. The two main polarization configurations of the radar have been considered in the presence of pipe-shaped objects: TM (Transverse Magnetic) and TE (Transverse Electric). To interpret the diffraction hyperbolas on a B-scan image, pre-processing techniques are necessary to reduce the clutter signal, which can overlap and obscure the target responses, particularly for shallow objects. The clutter, which can be composed of the direct coupling between the antennas and the wave reflected from the soil surface, scattering on heterogeneities due to the granular nature of the subsurface material, and some additive noise, varies with soil dielectric characteristics and/or surface roughness and leads to uncertainty in the measurements.

  16. Detection of Organophosphorus Pesticides with Colorimetry and Computer Image Analysis.

    Science.gov (United States)

    Li, Yanjie; Hou, Changjun; Lei, Jincan; Deng, Bo; Huang, Jing; Yang, Mei

    2016-01-01

    Organophosphorus pesticides (OPs) represent a very important class of pesticides that are widely used in agriculture because of their relatively high performance and moderate environmental persistence; hence, sensitive and specific detection of OPs is highly significant. Based on the inhibitory effect of inhibitors, including OPs and carbamates, on acetylcholinesterase (AChE), a colorimetric analysis was used for detection of OPs with computer image analysis of color density in the CMYK (cyan, magenta, yellow and black) color space and non-linear modeling. The results showed that yellow intensity gradually weakened as the dichlorvos concentration increased. Quantitative analysis of dichlorvos was achieved by Artificial Neural Network (ANN) modeling, and the established model showed good predictive ability between training and predictive sets. Real cabbage samples containing dichlorvos were analyzed by colorimetry and gas chromatography (GC), respectively, and no significant difference was found between the two methods (P > 0.05). Experiments on accuracy, precision and repeatability revealed good performance for detection of OPs. AChE can also be inhibited by carbamates, so this method has potential applications to real samples containing OPs and carbamates because of its high selectivity and sensitivity. PMID:27396650
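
    The CMYK color-density step can be sketched as follows: convert an RGB image of the test zone to CMYK and track the yellow (Y) channel, whose mean density is reported above to weaken as the dichlorvos concentration rises. The conversion is the standard naive RGB-to-CMYK formula; the image here is a random placeholder, not the paper's data.

        # Extract mean yellow (Y) density from an RGB image via CMYK conversion.
        import numpy as np

        def rgb_to_cmyk(rgb: np.ndarray) -> np.ndarray:
            """rgb: floats in [0, 1], shape (..., 3); returns (..., 4) CMYK."""
            k = 1.0 - rgb.max(axis=-1)
            denom = np.where(k < 1.0, 1.0 - k, 1.0)   # avoid division by zero
            c = (1.0 - rgb[..., 0] - k) / denom
            m = (1.0 - rgb[..., 1] - k) / denom
            y = (1.0 - rgb[..., 2] - k) / denom
            return np.stack([c, m, y, k], axis=-1)

        test_zone = np.random.rand(64, 64, 3)         # placeholder image patch
        yellow_density = rgb_to_cmyk(test_zone)[..., 2].mean()
        print(f"mean yellow density: {yellow_density:.3f}")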

  17. Automatic zebrafish heartbeat detection and analysis for zebrafish embryos.

    Science.gov (United States)

    Pylatiuk, Christian; Sanchez, Daniela; Mikut, Ralf; Alshut, Rüdiger; Reischl, Markus; Hirth, Sofia; Rottbauer, Wolfgang; Just, Steffen

    2014-08-01

    A fully automatic detection and analysis method of heartbeats in videos of nonfixed and nonanesthetized zebrafish embryos is presented. This method reduces the manual workload and time needed for preparation and imaging of the zebrafish embryos, as well as for evaluating heartbeat parameters such as frequency, beat-to-beat intervals, and arrhythmicity. The method is validated by a comparison of the results from automatic and manual detection of the heart rates of wild-type zebrafish embryos 36-120 h postfertilization and of embryonic hearts with bradycardia and pauses in the cardiac contraction.
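
    One plausible core of such a method, sketched under stated assumptions, is to sum frame-to-frame intensity changes over the heart region and read the heart rate off the dominant frequency of that motion signal. This is not the authors' validated pipeline, which also copes with nonfixed embryos, beat-to-beat intervals and arrhythmia.

        # Estimate heart rate (bpm) from a grayscale video of the heart region.
        import numpy as np

        def heart_rate_bpm(frames: np.ndarray, fps: float) -> float:
            """frames: array of shape (n_frames, height, width)."""
            motion = np.abs(np.diff(frames.astype(float), axis=0)).sum(axis=(1, 2))
            motion -= motion.mean()                 # remove the DC component
            spectrum = np.abs(np.fft.rfft(motion))
            freqs = np.fft.rfftfreq(motion.size, d=1.0 / fps)
            band = (freqs > 0.5) & (freqs < 5.0)    # plausible 30-300 bpm range
            return 60.0 * freqs[band][np.argmax(spectrum[band])]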

  18. Corpus analysis and automatic detection of emotion-inducing keywords

    Science.gov (United States)

    Yuan, Bo; He, Xiangqing; Liu, Ying

    2013-12-01

    Emotion words play a vital role in many sentiment analysis tasks. Previous research uses sentiment dictionaries to detect the subjectivity or polarity of words. In this paper, we dive into Emotion-Inducing Keywords (EIK), i.e., words in use that convey emotion. We first analyze an emotion corpus to explore the pragmatic aspects of EIK. Then we design an effective framework for automatically detecting EIK in sentences by utilizing linguistic features and context information. Our system dramatically outperforms traditional dictionary-based methods in Precision, Recall and F1-score.

  19. Finite Element Analysis on Nanomechanical Detection of Small Particles: Toward Virus Detection.

    Science.gov (United States)

    Imamura, Gaku; Shiba, Kota; Yoshikawa, Genki

    2016-01-01

    Detection of small particles, including viruses and particulate matter (PM), has been attracting much attention in light of the increasing need for environmental monitoring. Owing to their high versatility, nanomechanical sensors are among the most promising sensors that can be adapted to various monitoring systems. In this study, we present an optimization strategy for efficiently detecting small particles with nanomechanical sensors. Adsorption of particles on the receptor layer of nanomechanical sensors and the resultant signal are analyzed using finite element analysis (FEA). We investigate the effect of structural parameters (e.g., adsorption position and embedded depth of a particle and thickness of the receptor layer) and elastic properties of the receptor layer (e.g., Young's modulus and Poisson's ratio) on the sensitivity. It is found that a membrane-type surface stress sensor (MSS) has the potential for robust detection of small particles.

  20. Speckle Tracking Based Strain Analysis Is Sensitive for Early Detection of Pathological Cardiac Hypertrophy.

    Directory of Open Access Journals (Sweden)

    Xiangbo An

    Full Text Available Cardiac hypertrophy is a key pathological process of many cardiac diseases. However, early detection of cardiac hypertrophy is difficult with the currently used non-invasive methods, and new approaches are in urgent need for efficient diagnosis of cardiac malfunction. Here we report that speckle tracking-based strain analysis is more sensitive than conventional echocardiography for early detection of pathological cardiac hypertrophy in the isoproterenol (ISO) mouse model. Pathological hypertrophy was induced by a single subcutaneous injection of ISO. Physiological cardiac hypertrophy was established by daily treadmill exercise for six weeks. Strain analysis, including radial strain (RS), radial strain rate (RSR) and longitudinal strain (LS), showed a marked decrease as early as 3 days after ISO injection. Moreover, unlike the regional changes in cardiac infarction, strain analysis revealed global cardiac dysfunction that affects the entire heart in ISO-induced hypertrophy. In contrast, conventional echocardiography only detected altered E/E', an index reflecting cardiac diastolic function, at 7 days after ISO injection. No change was detected in fractional shortening (FS), E/A or E'/A' at 3 days or 7 days after ISO injection. Interestingly, strain analysis revealed cardiac dysfunction only in ISO-induced pathological hypertrophy and not in the physiological hypertrophy induced by exercise. Taken together, our study indicates that strain analysis offers a more sensitive approach for early detection of cardiac dysfunction than conventional echocardiography. Moreover, multiple strain readouts distinguish pathological cardiac hypertrophy from physiological hypertrophy.

  1. Speckle Tracking Based Strain Analysis Is Sensitive for Early Detection of Pathological Cardiac Hypertrophy.

    Science.gov (United States)

    An, Xiangbo; Wang, Jingjing; Li, Hao; Lu, Zhizhen; Bai, Yan; Xiao, Han; Zhang, Youyi; Song, Yao

    2016-01-01

    Cardiac hypertrophy is a key pathological process of many cardiac diseases. However, early detection of cardiac hypertrophy is difficult with the currently used non-invasive methods, and new approaches are in urgent need for efficient diagnosis of cardiac malfunction. Here we report that speckle tracking-based strain analysis is more sensitive than conventional echocardiography for early detection of pathological cardiac hypertrophy in the isoproterenol (ISO) mouse model. Pathological hypertrophy was induced by a single subcutaneous injection of ISO. Physiological cardiac hypertrophy was established by daily treadmill exercise for six weeks. Strain analysis, including radial strain (RS), radial strain rate (RSR) and longitudinal strain (LS), showed a marked decrease as early as 3 days after ISO injection. Moreover, unlike the regional changes in cardiac infarction, strain analysis revealed global cardiac dysfunction that affects the entire heart in ISO-induced hypertrophy. In contrast, conventional echocardiography only detected altered E/E', an index reflecting cardiac diastolic function, at 7 days after ISO injection. No change was detected in fractional shortening (FS), E/A or E'/A' at 3 days or 7 days after ISO injection. Interestingly, strain analysis revealed cardiac dysfunction only in ISO-induced pathological hypertrophy and not in the physiological hypertrophy induced by exercise. Taken together, our study indicates that strain analysis offers a more sensitive approach for early detection of cardiac dysfunction than conventional echocardiography. Moreover, multiple strain readouts distinguish pathological cardiac hypertrophy from physiological hypertrophy.

  2. Goals Analysis Procedure Guidelines for Applying the Goals Analysis Process

    Science.gov (United States)

    Motley, Albert E., III

    2000-01-01

    One of the key elements of successful project management is the establishment of the "right set of requirements": requirements that reflect the true customer needs and are consistent with the strategic goals and objectives of the participating organizations. A viable set of requirements implies that each individual requirement is a necessary element in satisfying the stated goals and that the entire set of requirements, taken as a whole, is sufficient to satisfy the stated goals. Unfortunately, it is the author's experience that during project formulation phases, many of the Systems Engineering customers do not conduct a rigorous analysis of the goals and objectives that drive the system requirements. As a result, the Systems Engineer is often provided with requirements that are vague, incomplete, and internally inconsistent. To complicate matters, most systems development methodologies assume that the customer provides unambiguous, comprehensive and concise requirements. This paper describes the specific steps of a Goals Analysis process applied by Systems Engineers at the NASA Langley Research Center during the formulation of requirements for research projects. The objective of Goals Analysis is to identify and explore all of the influencing factors that ultimately drive the system's requirements.

  3. Layout dependent effects analysis on 28nm process

    Science.gov (United States)

    Li, Helen; Zhang, Mealie; Wong, Waisum; Song, Huiyuan; Xu, Wei; Hurat, Philippe; Ding, Hua; Zhang, Yifan; Cote, Michel; Huang, Jason; Lai, Ya-ch

    2015-03-01

    Advanced process nodes introduce new variability effects due to increased density, new materials, new device structures, and so forth. This creates more and stronger Layout Dependent Effects (LDE), especially below 28nm. Effects such as WPE (Well Proximity Effect) and PSE (Poly Spacing Effect) change the carrier mobility and the threshold voltage, and therefore make device performance figures, such as Vth and Idsat, extremely layout dependent. In traditional flows, the impact of these changes can only be simulated after the block has been fully laid out and the design is LVS and DRC clean. That is too late in the design cycle, and it increases the number of post-layout iterations. We collaborated to develop a method on an advanced process that embeds several LDE sources into an LDE kit. We integrated this LDE kit into a custom analog design environment for LDE analysis at an early design stage. These features allow circuit and layout designers to detect the variations caused by LDE and to fix the resulting weak points. In this paper, we present this method and show how it accelerates design convergence of advanced-node custom analog designs by detecting LDE hotspots early on, on partial or fully placed layout, reporting the contribution of each LDE component to help identify the root cause of LDE variation, and even providing fixing guidelines on how to modify the layout to reduce the LDE impact.

  4. Two stages of parafoveal processing during reading: Evidence from a display change detection task.

    Science.gov (United States)

    Angele, Bernhard; Slattery, Timothy J; Rayner, Keith

    2016-08-01

    We used a display change detection paradigm (Slattery, Angele, & Rayner, 2011, Human Perception and Performance, 37, 1924-1938) to investigate whether display change detection uses orthographic regularity and whether detection is affected by the processing difficulty of the word preceding the boundary that triggers the display change. Subjects were significantly more sensitive to display changes when the change was from a nonwordlike preview than when the change was from a wordlike preview, but the preview benefit effect on the target word was not affected by whether the preview was wordlike or nonwordlike. Additionally, we did not find any influence of preboundary word frequency on display change detection performance. Our results suggest that display change detection and lexical processing do not use the same cognitive mechanisms. We propose that parafoveal processing takes place in two stages: an early, orthography-based, preattentional stage, and a late, attention-dependent lexical access stage. PMID:26769246

  5. Automatic Fatigue Detection of Drivers through Yawning Analysis

    Science.gov (United States)

    Azim, Tayyaba; Jaffar, M. Arfan; Ramzan, M.; Mirza, Anwar M.

    This paper presents a non-intrusive fatigue detection system based on video analysis of drivers. The focus of the paper is on how to detect yawning, which is an important cue for determining driver fatigue. Initially, the face is located through the Viola-Jones face detection method in a video frame. Then, a mouth window is extracted from the face region, in which the lips are searched for through spatial fuzzy c-means (s-FCM) clustering. The degree of mouth openness is extracted on the basis of mouth features to determine the driver's yawning state. If the yawning state persists for several consecutive frames, the system concludes that the driver is non-vigilant due to fatigue and warns the driver through an alarm. The system reinitializes when occlusion or misdetection occurs. Experiments were carried out using real data, recorded in day and night lighting conditions, with users of different races and genders.
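
    A skeleton of this pipeline is sketched below using OpenCV's stock Viola-Jones cascade for the face stage. The mouth-openness measure is a crude dark-pixel proxy standing in for the paper's spatial fuzzy c-means lip segmentation, and the thresholds are assumed values.

        # Skeleton yawning detector: Viola-Jones face stage + crude mouth measure.
        import cv2

        face_cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

        YAWN_FRAMES = 15   # assumed: consecutive yawning frames before the alarm
        counter = 0

        def process_frame(frame) -> bool:
            """Return True when a sustained yawn (fatigue cue) is detected."""
            global counter
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            faces = face_cascade.detectMultiScale(gray, 1.3, 5)
            for (x, y, w, h) in faces:
                mouth = gray[y + 2 * h // 3 : y + h, x : x + w]  # lower face third
                openness = (mouth < 60).mean()  # fraction of dark (open) pixels
                counter = counter + 1 if openness > 0.25 else 0
            return counter >= YAWN_FRAMES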

  6. Faulty Sensor Detection and Reconstruction for a PVC Making Process

    Institute of Scientific and Technical Information of China (English)

    李元; 周东华; 谢植; S.Joe.Qin

    2004-01-01

    Based on principal component analysis, this paper presents an application of faulty sensor detection and reconstruction in a batch process, the polyvinyl chloride (PVC) making process. To deal with inconsistency in the process data, the dynamic time warping technique is first used to synchronize the historical data, and a consistent multi-way principal component analysis model is then built. Fault detection is carried out with a squared prediction error (SPE) statistical control plot. By defining the principal component subspace, the residual subspace and a sensor validity index, a faulty sensor can be reconstructed and identified along the fault direction. Finally, application results are illustrated in detail using real data from an industrial PVC making process.
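
    A minimal sketch of the detection step, under assumed data and limits rather than the paper's PVC data, is: fit a PCA model on normal operating data, compute the squared prediction error (SPE) of each new sample against that model, and flag samples whose SPE exceeds a control limit.

        # PCA-based fault detection with the squared prediction error statistic.
        import numpy as np
        from sklearn.decomposition import PCA

        X_train = np.random.randn(500, 10)   # placeholder normal operating data
        pca = PCA(n_components=4).fit(X_train)

        def spe(x: np.ndarray) -> float:
            """Squared prediction error of one sample against the PCA model."""
            x_hat = pca.inverse_transform(pca.transform(x.reshape(1, -1)))
            return float(((x - x_hat.ravel()) ** 2).sum())

        # assumed control limit: 99th percentile of the training-set SPE values
        limit = np.percentile([spe(x) for x in X_train], 99)

        x_new = X_train[0].copy()
        x_new[3] += 8.0                      # inject a fault on sensor index 3
        print("fault detected:", spe(x_new) > limit)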

  7. Sentiment Detection of Web Users Using Probabilistic Latent Semantic Analysis

    Directory of Open Access Journals (Sweden)

    Weijian Ren

    2014-10-01

    Full Text Available With the wide application of the Internet in almost all fields, it has become the most important channel for information publication and for spreading public opinion. Public opinions, as the responses of Internet users to information such as social events and government policies, reflect the status of both society and the economy, which is highly valuable for the decision-making and public relations of enterprises. At present, analysis methods for Internet public opinion are mainly based on discriminative approaches, such as the Support Vector Machine (SVM) and neural networks. However, when these approaches analyze the sentiment of Internet public opinion, they fail to exploit information hidden in the text, e.g. the topic. Motivated by this observation, this paper proposes a detection method for public sentiment based on the Probabilistic Latent Semantic Analysis (PLSA) model. PLSA inherits the advantages of LSA, exploiting the semantic topics hidden in the data. The procedure of detecting public sentiment using this algorithm is composed of three main steps: (1) Chinese word segmentation and word refinement, with which each document is represented by a bag of words; (2) modeling the probabilistic distribution of documents using PLSA; (3) using the Z-vector of PLSA as the document features and delivering them to an SVM for sentiment detection. We collected a set of text data from Weibo, blogs, BBS etc. to evaluate our proposed approach. The experimental results show that the proposed method can detect public sentiment with high accuracy, outperforming the state-of-the-art word-histogram-based approach. The results also suggest that text semantic analysis using PLSA can significantly boost sentiment detection.

  8. Can Mismatch Negativity Be Linked to Synaptic Processes? A Glutamatergic Approach to Deviance Detection

    Science.gov (United States)

    Strelnikov, Kuzma

    2007-01-01

    This article aims to provide a theoretical framework to elucidate the neurophysiological underpinnings of deviance detection as reflected by mismatch negativity. A six-step model of the information processing necessary for deviance detection is proposed. In this model, predictive coding of learned regularities is realized by means of long-term…

  9. Infrared processing and sensor fusion for anti-personnel land-mine detection

    NARCIS (Netherlands)

    Schavemaker, J.G.M.; Cremer, F.; Schutte, K.; Breejen, E. den

    2000-01-01

    In this paper we present the results of infrared processing and sensor fusion obtained within the European research project GEODE (Ground Explosive Ordnance DEtection), which strives for the realization of a vehicle-mounted, multi-sensor anti-personnel land-mine detection system for humanitarian demining.

  10. DROIDSWAN: DETECTING MALICIOUS ANDROID APPLICATIONS BASED ON STATIC FEATURE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Babu Rajesh V

    2015-07-01

    Full Text Available Android, being a widely used mobile platform, has witnessed an increase in the number of malicious samples on its market place. The availability of multiple sources for downloading applications has also contributed to users falling prey to malicious applications. Classification of an Android application as malicious or benign remains a challenge, as malicious applications maneuver to pose as benign. This paper presents an approach that extracts various features from Android Application Package files (APKs) using static analysis and subsequently classifies them using machine learning techniques. The contribution of this work includes deriving, extracting and analyzing crucial features of Android applications that aid in efficient classification. The analysis is carried out using various machine learning algorithms with both weighted and non-weighted approaches. It was observed that the weighted approach yields higher detection rates using fewer features. The Random Forest algorithm exhibited a high detection rate and the lowest false positive rate.
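
    The classification stage can be sketched as below: binary static features (e.g., permission flags) feed a Random Forest, and a weighted variant is emulated by scaling features with the learned importances. The feature set and data are invented placeholders, not the features derived in the paper.

        # Random Forest over binary static features extracted from APK files.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        X = rng.integers(0, 2, size=(1000, 40)).astype(float)  # 40 binary features
        y = rng.integers(0, 2, size=1000)    # 1 = malicious, 0 = benign (fake labels)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                  random_state=1)
        clf = RandomForestClassifier(n_estimators=200, random_state=1)
        clf.fit(X_tr, y_tr)
        print(f"accuracy on held-out data: {clf.score(X_te, y_te):.3f}")

        # a simple weighted variant: rescale features by the learned importances
        X_tr_w = X_tr * clf.feature_importances_
        X_te_w = X_te * clf.feature_importances_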

  11. Detection of bearing damage by statistic vibration analysis

    Science.gov (United States)

    Sikora, E. A.

    2016-04-01

    The condition of bearings, which are essential components in mechanisms, is crucial to safety. Analysis of the bearing vibration signal, which is always contaminated by certain types of noise, is a very important tool for mechanical condition diagnosis of the bearing and of mechanical failure phenomena. In this paper, a method of rolling bearing fault detection by statistical analysis of vibration is proposed, which filters out the Gaussian noise contained in a raw vibration signal. The results of experiments show that the vibration signal can be significantly enhanced by application of the proposed method. The proposed method is also used to analyse real acoustic signals of bearings with inner-race and outer-race faults, respectively; the values of the attributes are determined according to the degree of the fault. The results confirm that the periods between the transients, which represent bearing fault characteristics, can be successfully detected.
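
    The flavor of such statistical attributes is sketched below (the paper's exact attribute set is not reproduced): RMS, peak, crest factor and kurtosis computed from a vibration record. Kurtosis in particular rises when impulsive fault transients ride on Gaussian background noise.

        # Statistical attributes of a bearing vibration signal.
        import numpy as np
        from scipy.stats import kurtosis

        def vibration_attributes(x: np.ndarray) -> dict:
            rms = np.sqrt(np.mean(x ** 2))
            peak = np.max(np.abs(x))
            return {"rms": rms, "peak": peak, "crest_factor": peak / rms,
                    "kurtosis": kurtosis(x)}   # ~0 for pure Gaussian noise

        healthy = np.random.normal(0.0, 1.0, 100_000)
        faulty = healthy.copy()
        faulty[::2000] += 15.0        # periodic impacts from a race defect
        print(vibration_attributes(healthy)["kurtosis"],
              vibration_attributes(faulty)["kurtosis"])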

  12. The Development of Manufacturing Process Analysis: Lesson Learned from Process Mining

    Directory of Open Access Journals (Sweden)

    Bernardo Nugroho Yahya

    2014-01-01

    Full Text Available Process analysis is recognized as a major stage in business process reengineering and has developed over the last two decades. In the field of manufacturing, manufacturing process analysis (MPA) is defined as performance analysis of the production process, in which data and knowledge are distilled into useful forms that can be broadly applied in manufacturing sectors. Process mining, an emerging tool focusing on the process and resource perspectives, is a way to analyze systems based on event logs. The objective of this study is to extend the existing process analysis framework by considering the attribute perspective. This study also aims to draw lessons from experiences of process mining in manufacturing industries. The results of this study will help manufacturing organizations utilize the process mining approach for analyzing their respective processes.

  13. Risk Analysis for Nonthermal process interventions

    Science.gov (United States)

    Over the last few years a number of nonthermal process interventions including ionizing radiation and ultraviolet light, high pressure processing, pulsed-electric and radiofrequency electric fields, microwave and infrared technologies, bacteriophages, etc. have been approved by regulatory agencies, ...

  14. Intersections of behavior analysis with cognitive models of contingency detection

    OpenAIRE

    Cigales, Maricel

    1997-01-01

    Bower and Watson have offered, respectively, a logical hypothesis-testing model and a conditional probability model of contingency detection by young infants. Although each could represent cognitive processes concomitant with operant learning, empirical support for these models is sparse. The limitations of each model are discussed, and suggestions are made for a more parsimonious approach by focusing on the areas of overlap between the two.

  15. A Survey on Malware Propagation, Analysis, and Detection

    Directory of Open Access Journals (Sweden)

    Mohsen Damshenas

    2015-05-01

    Full Text Available Lately, a new kind of war has been taking place between the security community and malicious software developers: security specialists use all possible techniques, methods and strategies to stop and remove threats, while malware developers produce new types of malware that bypass implemented security features. In this study we look closely into malware, to understand its definition, types and propagation, as well as detection and defense mechanisms, in order to contribute to the process of protection and security enhancement.

  16. Big Data and Specific Analysis Methods for Insurance Fraud Detection

    Directory of Open Access Journals (Sweden)

    Ramona BOLOGA

    2014-02-01

    Full Text Available Analytics is the future of big data, because only transforming data into information gives it value and can turn data into competitive business advantage. Large data volumes, their variety and the increasing speed of their growth stretch the boundaries of traditional data warehouses and ETL tools. This paper investigates the benefits of Big Data technology and the main analysis methods that can be applied to the particular case of fraud detection in the public health insurance system in Romania.

  17. Clinical Detection and Feature Analysis on Neuro Signals

    Institute of Scientific and Technical Information of China (English)

    张晓文; 杨煜普; 许晓鸣; 胡天培; 高忠华; 张键; 陈中伟; 陈统一

    2004-01-01

    Research on neuro signals is challenging and significant in modern natural science. In clinical experiments, signals from three main nerves (the median, radial and ulnar nerves) were successfully detected and recorded without any infection. Further analysis of their features under different movements, and of their mechanics and correlations in dominating actions, is also performed. The original discovery and first-hand materials make it possible to develop practical neuro-prostheses.

  18. RADIA: RNA and DNA Integrated Analysis for Somatic Mutation Detection

    OpenAIRE

    Radenbaugh, Amie J.; Ma, Singer; Ewing, Adam; Stuart, Joshua M.; Collisson, Eric A.; Zhu, Jingchun; Haussler, David

    2014-01-01

    The detection of somatic single nucleotide variants is a crucial component of the characterization of the cancer genome. Mutation calling algorithms thus far have focused on comparing the normal and tumor genomes from the same individual. In recent years, it has become routine for projects like The Cancer Genome Atlas (TCGA) to also sequence the tumor RNA. Here we present RADIA (RNA and DNA Integrated Analysis), a method that combines the patient-matched normal and tumor DNA with the tumor RNA to detect somatic mutations.

  19. Effect of Using Automated Auditing Tools on Detecting Compliance Failures in Unmanaged Processes

    Science.gov (United States)

    Doganata, Yurdaer; Curbera, Francisco

    The effect of using automated auditing tools to detect compliance failures in unmanaged business processes is investigated. In the absence of a process execution engine, compliance of an unmanaged business process is tracked by using an auditing tool developed based on business provenance technology or employing auditors. Since budget constraints limit employing auditors to evaluate all process instances, a methodology is devised to use both expert opinion on a limited set of process instances and the results produced by fallible automated audit machines on all process instances. An improvement factor is defined based on the average number of non-compliant process instances detected and it is shown that the improvement depends on the prevalence of non-compliance in the process as well as the sensitivity and the specificity of the audit machine.
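
    The dependence on prevalence, sensitivity and specificity can be made concrete with a small hypothetical computation of the alarms an automated audit machine is expected to raise; the numbers below are illustrative assumptions, not figures from the study.

        # Expected true and false alarms from a fallible automated audit machine.
        def expected_alarms(n_instances: int, prevalence: float,
                            sensitivity: float, specificity: float):
            true_alarms = n_instances * prevalence * sensitivity
            false_alarms = n_instances * (1.0 - prevalence) * (1.0 - specificity)
            return true_alarms, false_alarms

        tp, fp = expected_alarms(10_000, prevalence=0.05,
                                 sensitivity=0.90, specificity=0.95)
        print(f"expected true alarms: {tp:.0f}, expected false alarms: {fp:.0f}")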

  20. Multidimensional data modeling for business process analysis

    OpenAIRE

    Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.

    2007-01-01

    The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering business process modeling in conformity with the multidimensional data model. Since the business process and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward solution to converging these two models.

  1. Fitness Landscape Analysis for Optimum Multiuser Detection Problem

    Institute of Scientific and Technical Information of China (English)

    WANG Shaowei; ZHU Qiuping

    2007-01-01

    Optimum multiuser detection (OMD) for CDMA systems is an NP-complete combinatorial optimization problem. The fitness landscape has been proven to be very useful for understanding the behavior of combinatorial optimization algorithms and can help in predicting their performance. This paper analyzes the statistical properties of the fitness landscape of the OMD problem by performing autocorrelation analysis, a fitness distance correlation test and an epistasis measure. The analysis results explain why some random search algorithms are effective methods for the OMD problem and give hints on how to design more efficient randomized search heuristics for it.

  2. Detection and quantification of waterborne microorganisms using an image cytometer based on angular spatial frequency processing

    CERN Document Server

    Pérez, Juan Miguel; Martínez, Pedro; Pruneri, Valerio

    2015-01-01

    We introduce a new image cytometer design for the detection of very small particulates and demonstrate its capability in water analysis. The device is a compact microscope composed of off-the-shelf components, such as a light emitting diode (LED) source, a complementary metal-oxide-semiconductor (CMOS) image sensor, and a specific combination of optical lenses that allows, through appropriate software, Fourier transform processing of the sample volume. Waterborne microorganisms, such as Escherichia coli (E. coli), Legionella pneumophila (L. pneumophila) and phytoplankton, are detected by interrogating the sample volume either in fluorescent or label-free mode, i.e. with or without fluorescein isothiocyanate (FITC) molecules attached to the microorganisms, respectively. We achieve a sensitivity of 50 CFU/ml, which can be further increased to 0.2 CFU/ml by pre-concentrating an initial sample volume of 500 ml with an ad hoc fluidic system. We also prove the capability of the proposed image cytometer to differentiate ...

  3. Second order analysis for spatial Hawkes processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Torrisi, Giovanni Luca

    We derive summary statistics for stationary Hawkes processes, which can be considered as spatial versions of classical Hawkes processes. In particular, we derive the intensity, the pair correlation function and the Bartlett spectrum. Our results for Gaussian fertility rates and the extension to marked Hawkes processes are discussed.

  4. Managing Analysis Models in the Design Process

    Science.gov (United States)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  5. Detection of short circuit in pulse gas metal arc welding process

    Directory of Open Access Journals (Sweden)

    P.K.D.V. Yarlagadda

    2007-09-01

    Full Text Available Purpose: The paper discusses several methods of detecting the occurrence of short circuits and their severity in the pulse gas metal arc welding process (GMAW-P).Design/methodology/approach: Welding experiments with different values of the pulsing parameters, together with simultaneous recording of high-speed camera pictures and welding signals (such as current and voltage), were used to identify the occurrence of short circuits and their severity in the GMAW-P process. The investigation is based on the measurement of welding signals, specifically the current and voltage signals, and their synchronization with a high-speed camera to investigate the short-circuit phenomenon in the GMAW-P process.Findings: The results reveal that short circuits can be detected using signal processing techniques, and their severity can be predicted by using statistical models and artificial intelligence techniques in the GMAW-P process.Research limitations/implications: Several factors are responsible for short-circuit occurrence in the GMAW-P process. The results show that the voltage and current signals carry rich information about the metal transfer and especially about short-circuit occurrence, hence it is possible to detect short-circuit occurrence in the GMAW-P process. Future work should concentrate on the development of advanced techniques to improve the reliability of the techniques mentioned in this paper for short-circuit detection and prediction in the GMAW-P process.Originality/value: For achieving automation of welding processes, implementation of real-time monitoring of weld quality is essential, specifically for the GMAW-P process, which is widely used for lightweight metals and is gaining popularity in the manufacturing industry. However, in the case of the GMAW-P process, hardly any attempt has been made to analyse techniques to detect and predict the occurrence of short circuits. This paper analyses different techniques that can be employed for real-time monitoring and prediction of short circuits and their severity in the GMAW-P process.
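
    One simple signal-based criterion consistent with the discussion above is that the arc voltage collapses during a short circuit, so sustained low-voltage intervals can be flagged. The sketch below assumes a voltage threshold and a minimum duration; these values are illustrative, not the paper's.

        # Flag short-circuit intervals in a sampled GMAW-P arc-voltage signal.
        import numpy as np

        def detect_short_circuits(voltage: np.ndarray, fs: float,
                                  v_short: float = 10.0, min_ms: float = 0.5):
            """Return (start, end) sample index pairs of short-circuit events."""
            below = np.concatenate(([False], voltage < v_short, [False]))
            edges = np.diff(below.astype(int))
            starts = np.where(edges == 1)[0]    # voltage drops below threshold
            ends = np.where(edges == -1)[0]     # voltage recovers (arc re-ignites)
            min_samples = int(min_ms * 1e-3 * fs)
            return [(s, e) for s, e in zip(starts, ends) if e - s >= min_samples]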

  6. Power Load Event Detection and Classification Based on Edge Symbol Analysis and Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Lei Jiang

    2012-01-01

    Full Text Available Energy signature analysis of power appliances is the core of nonintrusive load monitoring (NILM), where detailed data about the appliances used in houses are obtained by analyzing changes in voltage and current. This paper focuses on developing automatic power load event detection and appliance classification based on machine learning. For power load event detection, the paper presents a new transient detection algorithm. By analyzing turn-on and turn-off transient waveforms, it can accurately detect the edge point when a device is switched on or off. The proposed load classification technique can identify different power appliances with improved recognition accuracy and computational speed. The load classification method is composed of two processes: frequency feature analysis and a support vector machine. The experimental results indicated that incorporating the new edge detection and turn-on/turn-off transient signature analysis into NILM revealed more information than traditional NILM methods. The load classification method achieved a recognition rate of more than ninety percent.
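    To make the two-stage pipeline concrete (step-change edge detection, then SVM appliance classification), a minimal sketch using scikit-learn; the threshold, window, feature choices, appliance labels and the three-row training set are invented placeholders, not the paper's actual features.

```python
import numpy as np
from sklearn.svm import SVC

def detect_events(power, threshold=30.0):
    """Indices where the sample-to-sample power step exceeds threshold watts."""
    return np.where(np.abs(np.diff(power)) > threshold)[0] + 1

def transient_features(power, idx, window=20):
    """Toy event features: step size, transient range, transient spread."""
    lo, hi = max(idx - window, 0), min(idx + window, len(power) - 1)
    seg = power[lo:hi + 1]
    return [power[hi] - power[lo], seg.max() - seg.min(), seg.std()]

# Hypothetical labelled transients (feature rows invented for illustration).
X_train = np.array([[60, 80, 20], [1500, 1600, 400], [150, 180, 50]], float)
y_train = ["lamp", "kettle", "fridge"]
clf = SVC(kernel="rbf", gamma="scale").fit(X_train, y_train)

power = np.r_[np.full(100, 100.0), np.full(100, 1600.0)]  # a large turn-on
for idx in detect_events(power):
    print(idx, clf.predict([transient_features(power, idx)])[0])
```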

  7. Articulating the Resources for Business Process Analysis and Design

    Science.gov (United States)

    Jin, Yulong

    2012-01-01

    Effective process analysis and modeling are important phases of the business process management lifecycle. When many activities and multiple resources are involved, it is very difficult to build a correct business process specification. This dissertation provides a resource perspective of business processes. It aims at a better process analysis…

  8. The automation of analysis of technological process effectiveness

    Directory of Open Access Journals (Sweden)

    B. Krupińska

    2007-10-01

    Full Text Available Purpose: Improvement of technological processes through the use of technological efficiency analysis can create the basis for their optimization. Informatization and computerization of an ever wider scope of activity is one of the most important current development trends of an enterprise. Design/methodology/approach: Appointing indicators makes it possible to evaluate process efficiency, which can constitute an optimization basis for a particular operation. The model of technological efficiency analysis is based on particular efficiency indicators that characterize an operation, taking into account the following criteria: operation – material, operation – machine, operation – human, operation – technological parameters. Findings: From the point of view of quality and correctness of the chosen technology, comprehensive assessment of technological processes makes up the basis of technological efficiency analysis. The results of technological efficiency analysis prove that the chosen model makes it possible to improve the process continuously through technological analysis, and the application of computer assistance makes it possible to automate the efficiency analysis and, finally, the controlled improvement of technological processes. Practical implications: Because of the complexity of technological efficiency analysis, the AEPT computer analysis was created, which yields: operation efficiency indicators, with indicators at minimal acceptable values distinguished; efficiency values of the applied samples; and the value of technological process efficiency. Originality/value: The created computer analysis of technological process efficiency (AEPT) makes it possible to automate the process of analysis and optimization.

  9. Materials loss-detection sensitivities using process-grade measurements at AGNS BNFP

    International Nuclear Information System (INIS)

    Process-quality measurement data from cold runs at AGNS BNFP are used to demonstrate near-real-time accounting by closing hourly materials balances and to evaluate contractor inventory estimation techniques. One-day loss-detection sensitivities of between 4 and 18 kg uranium, at 50% detection probability and 2.5% false-alarm probability, are calculated for selected accounting areas. Pulsed-column inventory estimators are used to calculate an inventory that is generally within 10% of column dump measurements. Loss-detection sensitivity could be improved by incorporating on-line waste stream measurements, improving laboratory measurements for process streams, and refining the pulsed-column inventory estimates
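    A minimal sketch of the hourly-balance test this record implies: material unaccounted for (MUF) as inputs minus outputs minus the inventory change, with the alarm limit and smallest detectable loss derived from an assumed one-sigma balance uncertainty at the stated 2.5% false-alarm and 50% detection probabilities. All numbers are hypothetical, and a Gaussian measurement model is assumed.

```python
import numpy as np
from scipy.stats import norm

def muf_series(inputs, outputs, inventories):
    """Hourly material unaccounted for: in - out - change in inventory."""
    return inputs - outputs - np.diff(inventories)

def alarm_threshold(sigma_muf, false_alarm_prob=0.025):
    """One-sided alarm limit giving the stated false-alarm probability."""
    return norm.ppf(1 - false_alarm_prob) * sigma_muf

def detectable_loss(sigma_muf, false_alarm_prob=0.025, detection_prob=0.5):
    """Smallest loss caught with the given probability (Gaussian model)."""
    return (norm.ppf(1 - false_alarm_prob) + norm.ppf(detection_prob)) * sigma_muf

inputs = np.array([10.0, 10.0, 10.0])                 # kg U fed per hour
outputs = np.array([9.5, 9.8, 10.1])                  # kg U leaving per hour
inventories = np.array([100.0, 100.4, 100.5, 100.3])  # start-of-hour holdup
print(muf_series(inputs, outputs, inventories))       # [0.1 0.1 0.1]
sigma = 2.0  # assumed 1-sigma uncertainty of one hourly balance, kg U
print(alarm_threshold(sigma), detectable_loss(sigma))
```

    At 50% detection probability the detectable loss equals the alarm threshold itself, which is why tightening the balance uncertainty directly tightens the sensitivity figures quoted above.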

  10. The application of image processing to the detection of corrosion by radiography

    International Nuclear Information System (INIS)

    The computer processing of digitised radiographs has been investigated with a view to improving x-radiography as a method for detecting corrosion. Linearisation of the image-density distribution in a radiograph has been used to enhance information which can be attributed to corrosion, making the detection of corrosion by radiography both easier and more reliable. However, conclusive evidence has yet to be obtained that image processing can result in the detection of corrosion which was not already faintly apparent on an unprocessed radiograph. A potential method has also been discovered for analysing the history of a corrosion site

  11. An underwater ship fault detection method based on Sonar image processing

    Science.gov (United States)

    Hong, Shi; Fang-jian, Shan; Bo, Cong; Wei, Qiu

    2016-02-01

    For research on underwater ship fault detection under ocean sailing conditions, especially in poor-visibility muddy seas, a fault detection method assisted by sonar image processing was proposed. First, sonar image denoising was performed using a pulse coupled neural network (PCNN) algorithm; second, edge features were extracted from the denoised image by a morphological wavelet transform; finally, regions of interest were tracked to map the fault areas. The simulation results presented here proved the feasibility and effectiveness of sonar image processing in an underwater fault detection system.
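    A loose stand-in for that pipeline, for intuition only: a median filter replaces the PCNN denoiser and a plain morphological gradient replaces the morphological wavelet transform; the thresholds and sizes are arbitrary.

```python
import numpy as np
from scipy import ndimage as ndi

def sonar_fault_regions(img, edge_thresh=0.2, min_size=50):
    """Denoise, extract edges with a morphological gradient, label ROIs."""
    den = ndi.median_filter(img, size=3)          # stand-in for PCNN denoising
    grad = ndi.grey_dilation(den, size=3) - ndi.grey_erosion(den, size=3)
    mask = grad > edge_thresh * grad.max()        # keep strong edges only
    labels, n = ndi.label(mask)
    sizes = ndi.sum(mask, labels, index=range(1, n + 1))
    keep = [i + 1 for i, s in enumerate(sizes) if s >= min_size]
    return labels, keep  # labelled image and the ROI labels worth tracking

rng = np.random.default_rng(1)
img = rng.normal(0.1, 0.02, (128, 128))
img[40:60, 40:90] += 0.5  # bright patch standing in for a hull-fault echo
labels, rois = sonar_fault_regions(img)
print(len(rois), "candidate fault region(s)")
```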

  12. Compositional analysis of polycrystalline hafnium oxide thin films by heavy-ion elastic recoil detection analysis

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, F.L. [Departamento de Electronica y Tecnologia de Computadoras, Universidad Politecnica de Cartagena, Campus Universitario Muralla del Mar, E-30202 Cartagena (Spain)]. E-mail: Felix.Martinez@upct.es; Toledano, M. [Departamento de Fisica Aplicada III, Universidad Complutense de Madrid, E-28025 Madrid (Spain); San Andres, E. [Departamento de Fisica Aplicada III, Universidad Complutense de Madrid, E-28025 Madrid (Spain); Martil, I. [Departamento de Fisica Aplicada III, Universidad Complutense de Madrid, E-28025 Madrid (Spain); Gonzalez-Diaz, G. [Departamento de Fisica Aplicada III, Universidad Complutense de Madrid, E-28025 Madrid (Spain); Bohne, W. [Hahn-Meitner-Institut Berlin, Abteilung SF-4, D-14109 Berlin (Germany); Roehrich, J. [Hahn-Meitner-Institut Berlin, Abteilung SF-4, D-14109 Berlin (Germany); Strub, E. [Hahn-Meitner-Institut Berlin, Abteilung SF-4, D-14109 Berlin (Germany)

    2006-10-25

    The composition of polycrystalline hafnium oxide thin films has been measured by heavy-ion elastic recoil detection analysis (HI-ERDA). The films were deposited by high-pressure reactive sputtering (HPRS) on silicon wafers using an oxygen plasma at pressures between 0.8 and 1.6 mbar and during deposition times between 0.5 and 3.0 h. Hydrogen was found to be the main impurity and its concentration increased with deposition pressure. The composition was always slightly oxygen-rich, which is attributed to the oxygen plasma. Additionally, an interfacial silicon oxide thin layer was detected and taken into account. The thickness of the hafnium oxide film was found to increase linearly with deposition time and to decrease exponentially with deposition pressure, whereas the thickness of the silicon oxide interfacial layer has a minimum as a function of pressure at around 1.2 mbar and increases slightly as a function of time. The measurements confirmed that this interfacial layer is formed mainly during the early stages of the deposition process.

  13. A User Requirements Analysis Approach Based on Business Processes

    Institute of Scientific and Technical Information of China (English)

    ZHENG Yue-bin; HAN Wen-xiu

    2001-01-01

    Requirements analysis is the most important phase of information system development. Existing requirements analysis techniques pay little or no attention to the features of different business processes. This paper presents a user requirements analysis approach which focuses on business processes in the early stage of requirements analysis. It also gives an example of the use of this approach in the analysis of an enterprise information system.

  14. Using Order Tracking Analysis Method to Detect the Angle Faults of Blades on Wind Turbine

    DEFF Research Database (Denmark)

    Li, Pengfei; Hu, Weihao; Liu, Juncheng;

    2016-01-01

    The angle faults of blades on wind turbines usually comprise set angle faults and pitch angle faults, which account for a high proportion of all wind turbine faults. Compared with traditional fault detection methods, using order tracking analysis to detect angle faults has many advantages, such as easy implementation and high system reliability. Because the Power Spectral Density (PSD) and Fast Fourier Transform (FFT) methods cannot extract clear fault characteristic frequencies, this kind of fault calls for a more effective method. This paper proposes a novel method that uses order tracking analysis to analyze the input aerodynamic torque signal received by the hub. After this analysis, the fault characteristic frequency can be extracted from the analyzed signals and compared with signals from normal operating conditions...
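    For intuition, a minimal order-tracking sketch: resample the torque signal onto a uniform shaft-angle grid (here, an angle assumed to come from an integrated tachometer signal) and FFT it, so a blade-related order stays sharp during a speed ramp where a plain FFT would smear it. The synthetic signal and all parameters are invented.

```python
import numpy as np

def order_spectrum(signal, shaft_angle, n_per_rev=64):
    """Resample a time signal onto a uniform shaft-angle grid and FFT it.

    shaft_angle is the cumulative rotation (radians) at each sample. The FFT
    of the angle-domain signal yields amplitude versus order (multiples of
    the rotation frequency), which is robust to varying shaft speed.
    """
    revs = shaft_angle / (2 * np.pi)
    uniform_revs = np.arange(0, revs[-1], 1 / n_per_rev)
    resampled = np.interp(uniform_revs, revs, signal)
    spec = np.abs(np.fft.rfft(resampled)) / len(resampled)
    orders = np.fft.rfftfreq(len(resampled), d=1 / n_per_rev)
    return orders, spec

# Synthetic run-up: speed ramps 1 -> 2 Hz; torque has a 3rd-order component,
# e.g. a once-per-blade effect on a three-bladed rotor.
t = np.linspace(0, 10, 20_000)
speed_hz = 1 + 0.1 * t
angle = 2 * np.pi * np.cumsum(speed_hz) * (t[1] - t[0])
torque = np.sin(3 * angle)
orders, spec = order_spectrum(torque, angle)
print("dominant order ~", orders[np.argmax(spec[1:]) + 1])
```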

  15. Economic analysis of thermal solvent processes

    International Nuclear Information System (INIS)

    Vapour extraction (VAPEX) uses horizontal well pairs and a gaseous solvent to mobilize the oil. Hybrid solvent processes inject a light hydrocarbon solvent in addition to sufficient amounts of steam to vaporize the solvent. This paper reviewed various laboratory model experiments that evaluated VAPEX and solvent-based processes for the recovery of heavy oil or bitumen. The project compared a VAPEX process, a thermal solvent reflux process and a hybrid-solvent SAGD process using scaled laboratory models. Several experimental models were used. The first high-pressure thermal solvent experiment was conducted with a laboratory model designed to scale a 20 m thick Burnt Lake reservoir. Propane was used as the solvent. The second sequence of experiments scaled a range of processes from VAPEX to hybrid solvents for an Athabasca bitumen reservoir using a sealed can type of model confined by a gaseous overburden with propane as the solvent. The third experiment was a hybrid solvent experiment in which propane and steam were injected simultaneously into the injector well. The final experiment was a propane-steam hybrid experiment at a higher steam injection rate. The aim of the study was to evaluate the processes, build a database of experimental performance and to determine whether any single process had a significant economic advantage. It was concluded that the lowest cost process for Athabasca bitumen was the thermal solvent hybrid process followed by low pressure SAGD. The thermal solvent experiment using hot propane injection recovered heavy oil at costs competitive to SAGD. Many of the experiments suggested a process life longer than 15 years, as the high viscosity of Athabasca bitumen and the resulting low diffusivity resulted in a slower oil recovery process. 5 refs., 3 tabs., 16 figs

  16. Real-time Detection of Precursors to Epileptic Seizures: Non-Linear Analysis of System Dynamics.

    Science.gov (United States)

    Nesaei, Sahar; Sharafat, Ahmad R

    2014-04-01

    We propose a novel approach for detecting precursors to epileptic seizures in intracranial electroencephalograms (iEEG), which is based on the analysis of system dynamics. In the proposed scheme, the largest Lyapunov exponent of the discrete wavelet packet transform (DWPT) of the segmented EEG signals is taken as the discriminating feature. Such features are processed by a support vector machine (SVM) classifier to identify whether the corresponding segment of the EEG signal contains a precursor to an epileptic seizure. When consecutive EEG segments contain such precursors, a decision is made that a precursor is in fact detected. The proposed scheme is applied to the Freiburg dataset, and the results show that seizure precursors are detected in a time frame that, unlike in other existing schemes, is very convenient to patients, with sensitivity of 100% and negligible false positive detection rates.
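    An illustrative sketch of that pipeline (wavelet-packet decomposition, largest-Lyapunov features, SVM) using PyWavelets and scikit-learn. The Lyapunov estimator below is a crude Rosenstein-style stand-in, and the training data are random placeholders rather than the Freiburg dataset; embedding dimension, lag and horizon are arbitrary.

```python
import numpy as np
import pywt
from sklearn.svm import SVC

def largest_lyapunov(x, dim=5, tau=4, horizon=20):
    """Crude Rosenstein-style estimate of the largest Lyapunov exponent."""
    n = len(x) - (dim - 1) * tau
    emb = np.stack([x[i * tau: i * tau + n] for i in range(dim)], axis=1)
    usable = n - horizon
    d = np.linalg.norm(emb[:usable, None] - emb[None, :usable], axis=2)
    np.fill_diagonal(d, np.inf)
    nn = d.argmin(axis=1)                       # nearest neighbour of each point
    div = [np.log(np.linalg.norm(emb[i + horizon] - emb[j + horizon]) + 1e-12)
           - np.log(d[i, j] + 1e-12) for i, j in enumerate(nn)]
    return np.mean(div) / horizon               # mean log divergence rate

def segment_features(segment, wavelet="db4", level=3):
    """One Lyapunov feature per terminal wavelet-packet node."""
    wp = pywt.WaveletPacket(segment, wavelet, maxlevel=level)
    return [largest_lyapunov(node.data) for node in wp.get_level(level, "natural")]

# Random placeholder segments and labels, standing in for iEEG epochs.
rng = np.random.default_rng(0)
segments = rng.standard_normal((20, 512))
labels = rng.integers(0, 2, 20)
X = np.array([segment_features(s) for s in segments])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.predict(X[:3]))
```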

  17. A New MANET wormhole detection algorithm based on traversal time and hop count analysis.

    Science.gov (United States)

    Karlsson, Jonny; Dooley, Laurence S; Pulkkis, Göran

    2011-01-01

    As demand increases for ubiquitous network facilities, infrastructure-less and self-configuring systems like Mobile Ad hoc Networks (MANET) are gaining popularity. MANET routing security however, is one of the most significant challenges to wide scale adoption, with wormhole attacks being an especially severe MANET routing threat. This is because wormholes are able to disrupt a major component of network traffic, while concomitantly being extremely difficult to detect. This paper introduces a new wormhole detection paradigm based upon Traversal Time and Hop Count Analysis (TTHCA), which in comparison to existing algorithms, consistently affords superior detection performance, allied with low false positive rates for all wormhole variants. Simulation results confirm that the TTHCA model exhibits robust wormhole route detection in various network scenarios, while incurring only a small network overhead. This feature makes TTHCA an attractive choice for MANET environments which generally comprise devices, such as wireless sensors, which possess a limited processing capability. PMID:22247657
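    The core TTHCA idea reduces to a per-hop traversal-time check, sketched below with an assumed fixed per-hop time budget; the actual algorithm derives its bound from packet processing and propagation considerations rather than a constant.

```python
def wormhole_suspect(rtt_s, hop_count, per_hop_budget_s=0.006):
    """Flag a route whose per-hop traversal time is anomalously high.

    Simplified TTHCA reasoning: a wormhole tunnel hides real hops, so the
    measured round-trip time divided by the reported hop count exceeds what
    one legitimate hop could plausibly take.
    """
    per_hop = rtt_s / (2 * hop_count)  # one-way traversal time per hop
    return per_hop > per_hop_budget_s

# A 4-hop route answering in 40 ms looks fine; the same RTT over a
# "2-hop" route (a tunnel masquerading as a short path) is suspicious.
print(wormhole_suspect(0.040, 4))  # False
print(wormhole_suspect(0.040, 2))  # True
```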

  18. A New MANET Wormhole Detection Algorithm Based on Traversal Time and Hop Count Analysis

    Directory of Open Access Journals (Sweden)

    Göran Pulkkis

    2011-11-01

    Full Text Available As demand increases for ubiquitous network facilities, infrastructure-less and self-configuring systems like Mobile Ad hoc Networks (MANET) are gaining popularity. MANET routing security however, is one of the most significant challenges to wide scale adoption, with wormhole attacks being an especially severe MANET routing threat. This is because wormholes are able to disrupt a major component of network traffic, while concomitantly being extremely difficult to detect. This paper introduces a new wormhole detection paradigm based upon Traversal Time and Hop Count Analysis (TTHCA), which in comparison to existing algorithms, consistently affords superior detection performance, allied with low false positive rates for all wormhole variants. Simulation results confirm that the TTHCA model exhibits robust wormhole route detection in various network scenarios, while incurring only a small network overhead. This feature makes TTHCA an attractive choice for MANET environments which generally comprise devices, such as wireless sensors, which possess a limited processing capability.

  19. Amorphous silicon batch process cost analysis

    International Nuclear Information System (INIS)

    This report describes the development of baseline manufacturing cost data to assist PVMaT monitoring teams in assessing current and future subcontracts, with an emphasis on commercialization and production. A process for the manufacture of a single-junction, large-area a-Si module was modeled using an existing Research Triangle Institute (RTI) computer model. The model estimates a required, or breakeven, price for the module based on its production process and the financial structure of the company operating the process. Sufficient detail on cost drivers is presented so that process features and business characteristics can be related to the estimated required price

  20. Probabilistic analysis of a thermosetting pultrusion process

    NARCIS (Netherlands)

    Baran, Ismet; Tutum, Cem C.; Hattel, Jesper H.

    2016-01-01

    In the present study, the effects of uncertainties in the material properties of the composite material being processed and the resin kinetic parameters, as well as process parameters such as pulling speed and inlet temperature, on product quality (exit degree of cure) are investigated for a pultrusion process.

  1. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Baddeley, A.; Turner, R.; Møller, Jesper;

    We define residuals for point process models fitted to spatial point pattern data, and propose diagnostic plots based on these residuals. The techniques apply to any Gibbs point process model, which may exhibit spatial heterogeneity, interpoint interaction and dependence on spatial covariates. Ou...

  2. A factor analysis to detect factors influencing building national brand

    Directory of Open Access Journals (Sweden)

    Naser Azad

    Full Text Available Developing a national brand is one of the most important issues in brand development. In this study, we present a factor analysis to detect the most important factors in building a national brand. The proposed study uses factor analysis to extract the most influential factors, and the sample has been chosen from two major auto makers in Iran called Iran Khodro and Saipa. The questionnaire was designed on a Likert scale and distributed among 235 experts. Cronbach's alpha is calculated as 0.84, which is well above the minimum desirable limit of 0.70. The implementation of factor analysis provides six factors including “cultural image of customers”, “exciting characteristics”, “competitive pricing strategies”, “perception image” and “previous perceptions”.

  3. A fractal analysis of pathogen detection by biosensors

    Science.gov (United States)

    Doke, Atul M.; Sadana, Ajit

    2006-05-01

    A fractal analysis is presented for the detection of pathogens such as Francisella tularensis and Yersinia pestis (the bacterium that causes plague) using a CANARY (cellular analysis and notification of antigens risks and yields) biosensor (Rider et al., 2003). In general, the binding and dissociation rate coefficients may be adequately described by either a single- or a dual-fractal analysis. An attempt is made to relate the binding rate coefficient to the degree of heterogeneity (fractal dimension value) present on the biosensor surface. Binding and dissociation rate coefficient values obtained are presented. The kinetic aspects and affinity values presented, along with the rate coefficients for the binding and dissociation phases, should be of significant interest in helping to design better biosensors for an application area that is bound to gain increasing importance in the future.

  4. Inter-laboratory validation study of two immunochemical methods for detection of processed ruminant proteins

    NARCIS (Netherlands)

    Raamsdonk, Van L.W.D.; Margry, R.J.C.F.; Kaathoven, Van R.G.C.; Bremer, M.G.E.G.

    2015-01-01

    In order to facilitate safe re-introduction of non-ruminant processed animal proteins (PAPs) in aqua feed, two immunoassays have been tested in an interlaboratory study for their capability to detect ruminant PAPs processed under European conditions. The sensitivity of the MELISA-TEK assay was im

  5. Error detection in GPS observations by means of Multi-process models

    DEFF Research Database (Denmark)

    Thomsen, Henrik F.

    2001-01-01

    The main purpose of this article is to present the idea of using Multi-process models as a method of detecting errors in GPS observations. The theory behind Multi-process models, and double-differenced phase observations in GPS, is presented briefly. It is shown how to model cycle slips in the Multi...

  6. Adaptive Image Processing Methods for Improving Contaminant Detection Accuracy on Poultry Carcasses

    Science.gov (United States)

    Technical Abstract A real-time multispectral imaging system has been demonstrated as a science-based tool for fecal and ingesta contaminant detection during poultry processing. In order to implement this imaging system in the commercial poultry processing industry, the false positives must be removed. For doi...

  7. Piezoresistive microcantilever aptasensor for ricin detection and kinetic analysis

    Science.gov (United States)

    Liu, Zhi-Wei; Tong, Zhao-Yang; Liu, Bing; Hao, Lan-Qun; Mu, Xi-Hui; Zhang, Jin-Ping; Gao, Chuan

    2015-04-01

    Up to now, there has been no report on the detection of target molecules by a piezoresistive microcantilever aptasensor. In order to evaluate the test performance and investigate the dynamic response characteristics of such a sensor, a novel method for ricin detection and kinetic analysis based on a piezoresistive microcantilever aptasensor was proposed, in which a ricin aptamer was immobilised on the microcantilever surface by a biotin-avidin binding system. Results showed that the detection limit for ricin was 0.04 μg L-1 (S/N ≥ 3). A linear relationship between the response voltage and the concentration of ricin in the range 0.2-40 μg L-1 was obtained, with the linear regression equation ΔUe = 0.904C + 5.852 (n = 5, R = 0.991). The recovery and precision of ricin determination in real samples (soil and flour) met the analysis requirements, at 90.5~95.5% and 7.85~9.39%, respectively. On this basis, a reaction kinetic model based on ligand-receptor binding, and its relationship with the response voltage, was established. The model reflects the dynamic response of the sensor well; the correlation coefficient (R) was greater than or equal to 0.9456 (p < 0.001). The response voltage (ΔUe) and response time (t0) obtained from the fitting equation at different concentrations of ricin fitted well with the measured values.

  8. Automated analysis for detecting beams in laser wakefield simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ushizima, Daniela M.; Rubel, Oliver; Prabhat, Mr.; Weber, Gunther H.; Bethel, E. Wes; Aragon, Cecilia R.; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Hamann, Bernd; Messmer, Peter; Hagen, Hans

    2008-07-03

    Laser wakefield particle accelerators have shown the potential to generate electric fields thousands of times higher than those of conventional accelerators. The resulting extremely short particle acceleration distance could yield a potential new compact source of energetic electrons and radiation, with wide applications from medicine to physics. Physicists investigate laser-plasma internal dynamics by running particle-in-cell simulations; however, this generates a large dataset that requires time-consuming, manual inspection by experts in order to detect key features such as beam formation. This paper describes a framework to automate the data analysis and classification of simulation data. First, we propose a new method to identify locations with high density of particles in the space-time domain, based on maximum extremum point detection on the particle distribution. We analyze high density electron regions using a lifetime diagram by organizing and pruning the maximum extrema as nodes in a minimum spanning tree. Second, we partition the multivariate data using fuzzy clustering to detect time steps in an experiment that may contain a high quality electron beam. Finally, we combine results from fuzzy clustering and bunch lifetime analysis to estimate spatially confined beams. We demonstrate our algorithms successfully on four different simulation datasets.
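    The first step, maximum-extremum detection on the particle density, can be sketched with a local-maximum filter; the footprint and relative threshold below are arbitrary stand-ins for the paper's actual criteria, and the lifetime-diagram pruning is omitted.

```python
import numpy as np
from scipy import ndimage as ndi

def high_density_maxima(density, footprint=5, rel_thresh=0.5):
    """Locate local maxima of a particle-density field (space-time domain).

    A point is kept when it equals the maximum within a local footprint and
    exceeds rel_thresh times the global peak: a simple stand-in for the
    maximum-extremum step that precedes the lifetime-diagram pruning.
    """
    local_max = density == ndi.maximum_filter(density, size=footprint)
    strong = density > rel_thresh * density.max()
    return np.argwhere(local_max & strong)

rng = np.random.default_rng(4)
field = rng.random((200, 200))
field[120, 80] = 3.0   # injected "beam" density spike
print(high_density_maxima(field))
```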

  9. Fissile Material Detection by Differential Die Away Analysis

    Science.gov (United States)

    Shaw, Timothy J.; Strellis, Dan A.; Stevenson, John; Keeley, Doug; Gozani, Tsahi

    2009-03-01

    Detection and interdiction of Special Nuclear Material (SNM) in transportation is one of the most critical security issues facing the United States. Active inspection by inducing fission in fissile nuclear materials, such as 235U and 239Pu, provides several strong and unique signatures that make the detection of concealed nuclear materials technically very feasible. Differential Die-Away Analysis (DDAA) is a very efficient, active neutron-based technique that uses the abundant prompt fission neutrons signature. It benefits from high penetrability of the probing and signature neutrons, high fission cross section, high detection sensitivity, ease of deployment and relatively low cost. DDAA can use any neutron source or energy as long as it can be suitably pulsed. The neutron generator produces pulses of neutrons that are directed into a cargo. As each pulse passes through the cargo, the neutrons are thermalized and absorbed. If SNM is present, the thermalized neutrons create a new source of (fission) neutrons with a distinctive time profile. An efficient laboratory system was designed, fabricated and tested under a US Government DHS DNDO contract. It was shown that a small uranium sample can be detected in a large variety of cargo types and configurations within practical measurement times using commercial compact (d,T) sources. Using stronger sources and wider detector distribution will further cut inspection time. The system can validate or clear alarms from a primary inspection system such as an automated x-ray system.

  10. Detection of irradiated chicken by 2-alkylcyclobutanone analysis

    International Nuclear Information System (INIS)

    Chicken meat irradiated at 0.5 kGy or higher doses was identified by a GC/MS method analyzing 2-dodecylcyclobutanone (2-DCB) and 2-tetradecylcyclobutanone (2-TCB), which are formed from palmitic acid and stearic acid respectively, and isolated using extraction procedures of soxhlet-florisil chromatography. Many fat-containing foods have oleic acid in abundance as the parent fatty acid, and chicken meat contains palmitoleic acid in an amount as large as that of stearic acid. In this study, we detected 2-tetradec-5'-enylcyclobutanone (2-TeCB) and 2-dodec-5'-enylcyclobutanone (2-DeCB) in chicken meat, which are formed from oleic acid and palmitoleic acid by irradiation respectively, using the GC/MS method. Sensitivity in the detection of both 2-TeCB and 2-DeCB was lower than that of 2-DCB. However, at least 0.57 μg/g fat of 2-TeCB was detected in chicken meat irradiated at 0.5 kGy, so 2-TeCB seems to be a useful marker for the identification of irradiated foods containing fat. On the contrary, 2-DeCB was not detected clearly at low doses. This suggests that 2-DeCB may be a useful marker for irradiated fat only in foods having a sufficient amount of palmitoleic acid for analysis. In addition, 2-tetradecadienylcyclobutanone, which is formed from linoleic acid, was also found in chicken meat. (author)

  11. RADIA: RNA and DNA integrated analysis for somatic mutation detection.

    Directory of Open Access Journals (Sweden)

    Amie J Radenbaugh

    Full Text Available The detection of somatic single nucleotide variants is a crucial component to the characterization of the cancer genome. Mutation calling algorithms thus far have focused on comparing the normal and tumor genomes from the same individual. In recent years, it has become routine for projects like The Cancer Genome Atlas (TCGA to also sequence the tumor RNA. Here we present RADIA (RNA and DNA Integrated Analysis, a novel computational method combining the patient-matched normal and tumor DNA with the tumor RNA to detect somatic mutations. The inclusion of the RNA increases the power to detect somatic mutations, especially at low DNA allelic frequencies. By integrating an individual's DNA and RNA, we are able to detect mutations that would otherwise be missed by traditional algorithms that examine only the DNA. We demonstrate high sensitivity (84% and very high precision (98% and 99% for RADIA in patient data from endometrial carcinoma and lung adenocarcinoma from TCGA. Mutations with both high DNA and RNA read support have the highest validation rate of over 99%. We also introduce a simulation package that spikes in artificial mutations to patient data, rather than simulating sequencing data from a reference genome. We evaluate sensitivity on the simulation data and demonstrate our ability to rescue back mutations at low DNA allelic frequencies by including the RNA. Finally, we highlight mutations in important cancer genes that were rescued due to the incorporation of the RNA.
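    To make the DNA+RNA rescue logic concrete, a toy triage function for a single site; the thresholds are invented for illustration and are not RADIA's actual filters.

```python
def call_somatic(normal_alt, normal_depth, tumor_alt, tumor_depth,
                 rna_alt, rna_depth, min_vaf=0.04, min_rna_reads=2):
    """Toy RADIA-style triage of one genomic site (illustrative thresholds).

    A variant absent from the normal DNA but supported by tumor DNA is a
    somatic candidate; when the tumor-DNA allele fraction is marginal,
    corroborating tumor-RNA reads rescue the call.
    """
    normal_vaf = normal_alt / max(normal_depth, 1)
    tumor_vaf = tumor_alt / max(tumor_depth, 1)
    if normal_vaf > 0.01:          # likely germline or artifact
        return "not somatic"
    if tumor_vaf >= min_vaf:
        return "somatic (DNA)"
    if tumor_alt > 0 and rna_depth > 0 and rna_alt >= min_rna_reads:
        return "somatic (rescued by RNA)"
    return "no call"

# Low DNA allelic fraction (2/80), but five supporting RNA reads rescue it.
print(call_somatic(0, 60, 2, 80, 5, 40))
```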

  12. RADIA: RNA and DNA integrated analysis for somatic mutation detection.

    Science.gov (United States)

    Radenbaugh, Amie J; Ma, Singer; Ewing, Adam; Stuart, Joshua M; Collisson, Eric A; Zhu, Jingchun; Haussler, David

    2014-01-01

    The detection of somatic single nucleotide variants is a crucial component to the characterization of the cancer genome. Mutation calling algorithms thus far have focused on comparing the normal and tumor genomes from the same individual. In recent years, it has become routine for projects like The Cancer Genome Atlas (TCGA) to also sequence the tumor RNA. Here we present RADIA (RNA and DNA Integrated Analysis), a novel computational method combining the patient-matched normal and tumor DNA with the tumor RNA to detect somatic mutations. The inclusion of the RNA increases the power to detect somatic mutations, especially at low DNA allelic frequencies. By integrating an individual's DNA and RNA, we are able to detect mutations that would otherwise be missed by traditional algorithms that examine only the DNA. We demonstrate high sensitivity (84%) and very high precision (98% and 99%) for RADIA in patient data from endometrial carcinoma and lung adenocarcinoma from TCGA. Mutations with both high DNA and RNA read support have the highest validation rate of over 99%. We also introduce a simulation package that spikes in artificial mutations to patient data, rather than simulating sequencing data from a reference genome. We evaluate sensitivity on the simulation data and demonstrate our ability to rescue back mutations at low DNA allelic frequencies by including the RNA. Finally, we highlight mutations in important cancer genes that were rescued due to the incorporation of the RNA. PMID:25405470

  13. Analysis of the theoretical bias in dark matter direct detection

    International Nuclear Information System (INIS)

    Fitting a model "A" to dark matter direct detection data, when the model that underlies the data is "B", introduces a theoretical bias in the fit. We perform a quantitative study of the theoretical bias in dark matter direct detection, with a focus on assumptions regarding the dark matter interactions and velocity distribution. We address this problem within the effective theory of isoscalar dark matter-nucleon interactions mediated by a heavy spin-1 or spin-0 particle. We analyze 24 benchmark points in the parameter space of the theory, using frequentist and Bayesian statistical methods. First, we simulate the data of future direct detection experiments assuming a momentum/velocity dependent dark matter-nucleon interaction and an anisotropic dark matter velocity distribution. Then, we fit a constant scattering cross section and an isotropic Maxwell-Boltzmann velocity distribution to the simulated data, thereby introducing a bias in the analysis. The best fit values of the dark matter particle mass differ from their benchmark values by up to 2 standard deviations. The best fit values of the dark matter-nucleon coupling constant differ from their benchmark values by up to several standard deviations. We conclude that common assumptions in dark matter direct detection are a source of potentially significant bias

  14. Image Post-Processing and Analysis. Chapter 17

    International Nuclear Information System (INIS)

    For decades, scientists have used computers to enhance and analyse medical images. At first, they developed simple computer algorithms to enhance the appearance of interesting features in images, helping humans read and interpret them better. Later, they created more advanced algorithms, where the computer would not only enhance images but also participate in facilitating understanding of their content. Segmentation algorithms were developed to detect and extract specific anatomical objects in images, such as malignant lesions in mammograms. Registration algorithms were developed to align images of different modalities and to find corresponding anatomical locations in images from different subjects. These algorithms have made computer aided detection and diagnosis, computer guided surgery and other highly complex medical technologies possible. Nowadays, the field of image processing and analysis is a complex branch of science that lies at the intersection of applied mathematics, computer science, physics, statistics and biomedical sciences. This chapter will give a general overview of the most common problems in this field and the algorithms that address them

  15. Intelligence Intrusion Detection Prevention Systems using Object Oriented Analysis method

    Directory of Open Access Journals (Sweden)

    DR.K.KUPPUSAMY

    2010-12-01

    Full Text Available This paper is intended to provide a model for "Intelligence Intrusion Detection Prevention Systems using the Object Oriented Analysis method". It describes the state's overall requirements regarding the acquisition and implementation of intrusion prevention and detection systems with intelligence (IIPS/IIDS), and is designed to give a deeper understanding of intrusion prevention and detection principles to those who may be responsible for acquiring, implementing or monitoring such systems, helping them understand the technology and strategies available. With the need for evolution, if not revolution, of current network architectures and the Internet, autonomous and spontaneous management will be a key feature of future networks and information systems. In this context, security is an essential property. It must be considered at the early stage of conception of these systems and designed to be also autonomous and spontaneous. Future networks and systems must be able to automatically configure themselves with respect to their security policies. The security policy specification must be dynamic and adapt itself to the changing environment. Those networks and systems should interoperate securely when their respective security policies are heterogeneous and possibly conflicting. They must be able to autonomously evaluate the impact of an intrusion in order to spontaneously select the appropriate and relevant response when a given intrusion is detected. Autonomous and spontaneous security is a major requirement of future networks and systems. Of course, it is crucial to address this issue in the different wireless and mobile technologies available today, such as RFID, Wifi, Wimax and 3G. Other technologies such as ad hoc and sensor networks, which introduce new types of services, also share similar requirements for autonomous and spontaneous management of security. Intelligence Intrusion Prevention Systems (IIPS) are designed to aid in preventing the

  16. DETECTING ABNORMAL BEHAVIOR IN SOCIAL NETWORK WEBSITES BY USING A PROCESS MINING TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Mahdi Sahlabadi

    2014-01-01

    Full Text Available Detecting abnormal user activity in social network websites could prevent cyber-crime. Previous research focused on data mining, while this research is based on the user behavior process. In this study, the first step is defining a normal user behavioral pattern and the second step is detecting abnormal behavior. These two steps are applied to a case study that includes real and synthetic data sets to obtain more tangible results. The technique chosen to define the pattern is process mining over an affordable, complete and noise-free event log. The proposed model discovers normal behavior by a genetic process mining technique, and abnormal activities are detected by the fitness function, which is based on Petri net rules. Although genetic process mining is time-consuming, it can overcome the risks of noisy data and produces a comprehensive normal model in Petri net representation form.

  17. Detection of Harbours from High Resolution Remote Sensing Imagery via Saliency Analysis and Feature Learning

    Science.gov (United States)

    Wang, Yetianjian; Pan, Li; Wang, Dagang; Kang, Yifei

    2016-06-01

    Harbours are very important objects in civil and military fields. Detecting them from high resolution remote sensing imagery matters in various fields and is also a challenging task. Traditional methods of detecting harbours mainly focus on the segmentation of water and land and on manually selected knowledge. They do not make enough use of other features of remote sensing imagery and often fail to describe the harbours completely. In order to improve the detection, a new method is proposed. First, the image is transformed to the Hue, Saturation, Value (HSV) colour space and saliency analysis is performed via the generation and enhancement of the co-occurrence histogram, to help detect and locate the regions of interest (ROIs) that are salient and may be parts of the harbour. Next, SIFT features are extracted and feature learning is performed to help represent the ROIs. Then a classifier is trained on labelled harbour features and used to check whether the ROIs belong to the harbour. Finally, if the ROIs belong to the harbour, a minimum bounding rectangle is formed to include all the harbour ROIs, detecting and locating the harbour. The experiment on high resolution remote sensing imagery shows that the proposed method performs better than other methods in the precision of classifying ROIs and the accuracy of completely detecting and locating harbours.
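    A rough sketch of the ROI-then-features flow using OpenCV (assuming a build where cv2.SIFT_create is available, i.e. opencv-python 4.4 or later). The low-saturation HSV mask is a crude stand-in for the co-occurrence-histogram saliency step, and the trained classifier is left out.

```python
import cv2
import numpy as np

def candidate_rois(bgr_image, sat_thresh=60, min_area=500):
    """Crude saliency stand-in: low-saturation pixels (water, concrete)
    grouped into candidate harbour regions."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    mask = (hsv[:, :, 1] < sat_thresh).astype(np.uint8) * 255
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area]

def roi_descriptors(gray, roi):
    """SIFT descriptors inside one ROI, ready for a learned classifier."""
    x, y, w, h = roi
    sift = cv2.SIFT_create()
    _, desc = sift.detectAndCompute(gray[y:y + h, x:x + w], None)
    return desc  # e.g. feed into a bag-of-words + SVM harbour classifier

img = np.full((256, 256, 3), 120, dtype=np.uint8)  # placeholder imagery
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
rois = candidate_rois(img)
descs = [roi_descriptors(gray, r) for r in rois]
print(len(rois), "candidate ROI(s)")
```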

  18. Detection of neovascularization based on fractal and texture analysis with interaction effects in diabetic retinopathy.

    Directory of Open Access Journals (Sweden)

    Jack Lee

    Full Text Available Diabetic retinopathy is a major cause of blindness. Proliferative diabetic retinopathy is a result of severe vascular complication and is visible as neovascularization of the retina. Automatic detection of such new vessels would be useful for the severity grading of diabetic retinopathy, and it is an important part of the screening process to identify those who may require immediate treatment. We proposed a novel new-vessel detection method including statistical texture analysis (STA), high order spectrum analysis (HOS) and fractal analysis (FA); most importantly, we have shown that by incorporating their associated interactions the accuracy of new-vessel detection can be greatly improved. To assess its performance, the sensitivity, specificity and accuracy (AUC) are obtained: 96.3%, 99.1% and 98.5% (99.3%), respectively. The proposed method improves the accuracy of new-vessel detection significantly over previous methods. The algorithm can be automated and is valuable for detecting relatively severe cases of diabetic retinopathy among diabetes patients.
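    Of the three feature families, fractal analysis is the easiest to sketch: a box-counting dimension estimate on a binary vessel mask (a square array with power-of-two side is assumed for simplicity). Tortuous neovascularization tends to raise the estimated dimension relative to normal vasculature.

```python
import numpy as np

def box_counting_dimension(mask):
    """Estimate the fractal dimension of a binary mask by box counting.

    Counts occupied boxes at dyadic scales and fits log(count) against
    log(1/box size); the slope is the dimension estimate.
    """
    n = mask.shape[0]  # assumes a square mask with power-of-two side
    sizes, counts = [], []
    size = n
    while size >= 2:
        k = n // size
        blocks = mask.reshape(k, size, k, size).any(axis=(1, 3))
        sizes.append(size)
        counts.append(blocks.sum())
        size //= 2
    slope, _ = np.polyfit(np.log(1 / np.asarray(sizes)), np.log(counts), 1)
    return slope

mask = np.zeros((256, 256), dtype=bool)
mask[128, :] = True  # a straight "vessel": dimension should be close to 1
print(round(box_counting_dimension(mask), 2))
```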

  19. Optimal swab processing recovery method for detection of bioterrorism-related Francisella tularensis by real-time PCR.

    Science.gov (United States)

    Walker, Roblena E; Petersen, Jeannine M; Stephens, Kenyatta W; Dauphin, Leslie A

    2010-10-01

    Francisella tularensis, the etiological agent of tularemia, is regarded as a potential bioterrorism agent. The advent of bioterrorism has heightened awareness of the need for validated methods for processing environmental samples. In this study we determined the optimal method for processing environmental swabs for the recovery and subsequent detection of F. tularensis by the use of real-time PCR assays. Four swab processing recovery methods were compared: heat, sonication, vortexing, and the Swab Extraction Tube System (SETS). These methods were evaluated using cotton, foam, polyester and rayon swabs spiked with six pathogenic strains of F. tularensis. Real-time PCR analysis using a multi-target 5'nuclease assay for F. tularensis showed that the use of the SETS method resulted in the best limit of detection when evaluated using multiple strains of F. tularensis. We demonstrated also that the efficiency of F. tularensis recovery from swab specimens was not equivalent for all swab processing methodologies and, thus, that this variable can affect real-time PCR assay sensitivity. The effectiveness of the SETS method was independent of the automated DNA extraction method and real-time PCR platforms used. In conclusion, diagnostic laboratories can now potentially incorporate the SETS method into specimen processing protocols for the rapid and efficient detection of F. tularensis by real-time PCR during laboratory bioterrorism-related investigations.

  20. Trend detection and analysis in Eastern Europe and European Russia

    Science.gov (United States)

    de Beurs, K.; Henebry, G. M.; Owsley, B.

    2014-12-01

    A confluence of computing power, cost of storage, ease of access to data, and ease of product delivery make it possible to harness the power of multiple remote sensing data streams to monitor land surface dynamics. Change detection has always been a fundamental remote sensing task, and there are myriad ways to perceive differences. From a statistical viewpoint, image time series of the vegetated land surface are complicated data to analyze. The time series are often seasonal and have high temporal autocorrelation. These characteristics result in the failure of the data to meet the assumptions of most standard parametric statistical tests. Failure of statistical assumptions is not trivial, and the use of inappropriate statistical methods may lead to the detection of spurious trends, while any actual trends and/or step changes might be overlooked. Methods for the analysis of messy satellite data, which are often influenced by discontinuity, missing observations, non-linearity, and seasonality, are still being developed within the remote sensing community. We have found several examples of research that compares trends from different datasets; however, there is a dearth of information on the comparison of trend detection methods themselves on standardized datasets. Here we describe three different trend detection methods and compare their results for a set of synthetic time series exhibiting monotonic trends as well as step changes. We will vary the length of the time series, the number of observations per year and the number of missing values. We will also vary the seasonality and the strength of the autocorrelation. We will then discuss a case study for Eastern Europe and European Russia where we investigate time series of MODIS Nadir BRDF-adjusted (NBAR) data at 8-day and 500 m resolution between 2001 and 2013. We investigate basic vegetation indices such as NDVI and EVI but also extend the analysis towards a disturbance index which identifies how pixels differ from
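    As one concrete example of a trend detector suited to such messy series, a minimal Mann-Kendall test (no-ties variance, missing values dropped) run on a synthetic seasonal series with a weak monotonic trend; this is an illustration, not necessarily one of the three methods the abstract refers to, and a seasonal variant would be needed before trusting p-values on strongly autocorrelated NDVI data.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test: returns the Z statistic and two-sided p-value.

    Nonparametric, so it tolerates non-normal data and missing values, but
    seasonality/autocorrelation should be removed or modelled first.
    """
    x = np.asarray(x, dtype=float)
    x = x[~np.isnan(x)]                       # tolerate missing observations
    n = len(x)
    s = sum(np.sign(x[j] - x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0  # variance assuming no ties
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return z, 2 * (1 - norm.cdf(abs(z)))

# Synthetic 8-day composite series: 13 years of 46 observations per year,
# with a seasonal cycle, a weak upward trend and noise.
rng = np.random.default_rng(2)
t = np.arange(46 * 13)
x = 0.0005 * t + 0.2 * np.sin(2 * np.pi * t / 46) + rng.normal(0, 0.05, t.size)
print(mann_kendall(x))
```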

  1. Effect of image processing version on detection of non-calcification cancers in 2D digital mammography imaging

    Science.gov (United States)

    Warren, L. M.; Cooke, J.; Given-Wilson, R. M.; Wallis, M. G.; Halling-Brown, M.; Mackenzie, A.; Chakraborty, D. P.; Bosmans, H.; Dance, D. R.; Young, K. C.

    2013-03-01

    Image processing (IP) is the last step in the digital mammography imaging chain before interpretation by a radiologist. Each manufacturer has their own IP algorithm(s) and the appearance of an image after IP can vary greatly depending upon the algorithm and version used. It is unclear whether these differences can affect cancer detection. This work investigates the effect of IP on the detection of non-calcification cancers by expert observers. Digital mammography images for 190 patients were collected from two screening sites using Hologic amorphous selenium detectors. Eighty of these cases contained non-calcification cancers. The images were processed using three versions of IP from Hologic - default (full enhancement), low contrast (intermediate enhancement) and pseudo screen-film (no enhancement). Seven experienced observers inspected the images and marked the location of regions suspected to be non-calcification cancers assigning a score for likelihood of malignancy. This data was analysed using JAFROC analysis. The observers also scored the clinical interpretation of the entire case using the BSBR classification scale. This was analysed using ROC analysis. The breast density in the region surrounding each cancer and the number of times each cancer was detected were calculated. IP did not have a significant effect on the radiologists' judgment of the likelihood of malignancy of individual lesions or their clinical interpretation of the entire case. No correlation was found between number of times each cancer was detected and the density of breast tissue surrounding that cancer.

  2. Chemical Sensing for Buried Landmines - Fundamental Processes Influencing Trace Chemical Detection

    Energy Technology Data Exchange (ETDEWEB)

    PHELAN, JAMES M.

    2002-05-01

    Mine detection dogs have a demonstrated capability to locate hidden objects by trace chemical detection. Because of this capability, demining activities frequently employ mine detection dogs to locate individual buried landmines or for area reduction. The conditions appropriate for use of mine detection dogs are only beginning to emerge through diligent research that combines dog selection/training, the environmental conditions that impact landmine signature chemical vapors, and vapor sensing performance capability and reliability. This report seeks to address the fundamental soil-chemical interactions, driven by local weather history, that influence the availability of chemical for trace chemical detection. The processes evaluated include: landmine chemical emissions to the soil, chemical distribution in soils, chemical degradation in soils, and weather and chemical transport in soils. Simulation modeling is presented as a method to evaluate the complex interdependencies among these various processes and to establish conditions appropriate for trace chemical detection. Results from chemical analyses on soil samples obtained adjacent to landmines are presented and demonstrate the ultra-trace nature of these residues. Lastly, initial measurements of the vapor sensing performance of mine detection dogs demonstrates the extreme sensitivity of dogs in sensing landmine signature chemicals; however, reliability at these ultra-trace vapor concentrations still needs to be determined. Through this compilation, additional work is suggested that will fill in data gaps to improve the utility of trace chemical detection.

  3. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Baddeley, A.; Turner, R.; Møller, Jesper;

    2005-01-01

    We define residuals for point process models fitted to spatial point pattern data, and we propose diagnostic plots based on them. The residuals apply to any point process model that has a conditional intensity; the model may exhibit spatial heterogeneity, interpoint interaction and dependence on spatial covariates. A plot of smoothed residuals against spatial location, or against a spatial covariate, is effective in diagnosing spatial trend or covariate effects. Q-Q plots of the residuals are effective in diagnosing interpoint interaction.

  4. A review of the technology and process on integrated circuits failure analysis applied in communications products

    Science.gov (United States)

    Ming, Zhimao; Ling, Xiaodong; Bai, Xiaoshu; Zong, Bo

    2016-02-01

    The failure analysis of integrated circuits plays a very important role in improving the reliability of communications products. This paper mainly introduces the failure analysis technology and process for integrated circuits applied in communication products. There are many technologies for failure analysis, including optical microscopic analysis, infrared microscopic analysis, acoustic microscopy analysis, liquid crystal hot spot detection, micro analysis, electrical measurement, microprobe technology, chemical etching and ion etching. Integrated circuit failure analysis depends on the accurate confirmation and analysis of the chip failure mode, the search for the root failure cause, the summary of the failure mechanism and the implementation of improvement measures. Through failure analysis, the reliability of integrated circuits and the yield of good products can be improved.

  5. Laplace-Laplace analysis of the fractional Poisson process

    OpenAIRE

    Gorenflo, Rudolf; Mainardi, Francesco

    2013-01-01

    We generate the fractional Poisson process by subordinating the standard Poisson process to the inverse stable subordinator. Our analysis is based on application of the Laplace transform with respect to both arguments of the evolving probability densities.
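    For readers who want to see the subordination construction operationally, a simulation sketch using the equivalent Mittag-Leffler waiting-time (renewal) representation of the fractional Poisson process, sampled with the Kozubowski-Rachev formula; the parameters are arbitrary, and this is a companion illustration rather than part of the authors' Laplace-transform analysis.

```python
import numpy as np

def fractional_poisson_arrivals(beta, rate, t_max, rng):
    """Simulate one path of the fractional Poisson process on [0, t_max].

    Interarrival times are Mittag-Leffler distributed, which is equivalent
    to subordinating the standard Poisson process to the inverse beta-stable
    subordinator. For beta = 1 this reduces to the ordinary Poisson process.
    """
    times, t = [], 0.0
    scale = rate ** (-1.0 / beta)
    while True:
        u, v = rng.uniform(), rng.uniform()   # both in (0, 1) almost surely
        bracket = (np.sin(beta * np.pi) / np.tan(beta * np.pi * v)
                   - np.cos(beta * np.pi))    # equals sin(bp(1-v))/sin(bp v) > 0
        t += -scale * np.log(u) * bracket ** (1.0 / beta)
        if t > t_max:
            return np.array(times)
        times.append(t)

rng = np.random.default_rng(3)
arrivals = fractional_poisson_arrivals(beta=0.8, rate=1.0, t_max=50.0, rng=rng)
print(len(arrivals), "events; largest gap:", np.max(np.diff(arrivals)))
```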

  6. Satellite images analysis for shadow detection and building height estimation

    Science.gov (United States)

    Liasis, Gregoris; Stavrou, Stavros

    2016-09-01

    Satellite images can provide valuable information about the presented urban landscape scenes to remote sensing and telecommunication applications. Obtaining information from satellite images is difficult since all the objects and their surroundings are presented with feature complexity. The shadows cast by buildings in urban scenes can be processed and used for estimating building heights. Thus, a robust and accurate building shadow detection process is important. Region-based active contour models can be used for satellite image segmentation. However, spectral heterogeneity that usually exists in satellite images and the feature similarity representing the shadow and several non-shadow regions makes building shadow detection challenging. In this work, a new automated method for delineating building shadows is proposed. Initially, spectral and spatial features of the satellite image are utilized for designing a custom filter to enhance shadows and reduce intensity heterogeneity. An effective iterative procedure using intensity differences is developed for tuning and subsequently selecting the most appropriate filter settings, able to highlight the building shadows. The response of the filter is then used for automatically estimating the radiometric property of the shadows. The customized filter and the radiometric feature are utilized to form an optimized active contour model where the contours are biased to delineate shadow regions. Post-processing morphological operations are also developed and applied for removing misleading artefacts. Finally, building heights are approximated using shadow length and the predefined or estimated solar elevation angle. Qualitative and quantitative measures are used for evaluating the performance of the proposed method for both shadow detection and building height estimation.
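    The final height-estimation step reduces to simple trigonometry on flat ground; a minimal sketch, ignoring per-pixel ground sampling distance, terrain slope and sensor viewing geometry.

```python
import math

def building_height(shadow_length_m, solar_elevation_deg):
    """Height from shadow length on flat ground: h = L * tan(elevation)."""
    return shadow_length_m * math.tan(math.radians(solar_elevation_deg))

# A 25 m shadow with the sun 40 degrees above the horizon: ~21 m building.
print(round(building_height(25.0, 40.0), 1), "m")
```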

  7. Computer program performs statistical analysis for random processes

    Science.gov (United States)

    Newberry, M. H.

    1966-01-01

    Random Vibration Analysis Program /RAVAN/ performs statistical analysis on a number of phenomena associated with flight and captive tests, but can also be used in analyzing data from many other random processes.

  8. Stream computing for biomedical signal processing: A QRS complex detection case-study.

    Science.gov (United States)

    Murphy, B M; O'Driscoll, C; Boylan, G B; Lightbody, G; Marnane, W P

    2015-01-01

    Recent developments in "Big Data" have brought significant gains in the ability to process large amounts of data on commodity server hardware. Stream computing is a relatively new paradigm in this area, addressing the need to process data in real time with very low latency. While this approach has been developed for dealing with large scale data from the world of business, security and finance, there is a natural overlap with clinical needs for physiological signal processing. In this work we present a case study of streams processing applied to a typical physiological signal processing problem: QRS detection from ECG data.
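    A minimal stream-flavoured sketch of the case study: constant memory, one sample in, one beat decision out. The derivative-square-integrate chain is a crude Pan-Tompkins-style stand-in with invented constants, not the authors' streams implementation.

```python
from collections import deque
import numpy as np

class StreamingQRSDetector:
    """Toy sample-by-sample QRS detector with constant memory and latency."""

    def __init__(self, fs, window_s=0.12, refractory_s=0.25):
        self.fs = fs
        self.win = deque(maxlen=max(1, int(window_s * fs)))
        self.refractory = int(refractory_s * fs)
        self.last_beat = -10**9
        self.threshold = 0.0
        self.prev = 0.0
        self.n = 0

    def push(self, sample):
        """Feed one ECG sample; returns True when a beat is detected."""
        d = sample - self.prev           # crude high-pass (first difference)
        self.prev = sample
        self.win.append(d * d)           # squaring emphasises the QRS slope
        energy = sum(self.win) / len(self.win)
        # Slowly adapting threshold tracks the running energy envelope.
        self.threshold = 0.995 * self.threshold + 0.005 * energy
        beat = (self.n > self.fs                       # 1 s warm-up
                and energy > 4 * self.threshold
                and self.n - self.last_beat > self.refractory)
        if beat:
            self.last_beat = self.n
        self.n += 1
        return beat

fs = 250
n = 10 * fs
ecg = 0.05 * np.sin(2 * np.pi * np.arange(n) / fs)  # baseline wander
ecg[::fs] += 1.0                                    # crude 1 Hz "R peaks"
det = StreamingQRSDetector(fs)
beats = [i for i, s in enumerate(ecg) if det.push(s)]
print(len(beats), "beats detected")
```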

  9. Stream computing for biomedical signal processing: A QRS complex detection case-study.

    Science.gov (United States)

    Murphy, B M; O'Driscoll, C; Boylan, G B; Lightbody, G; Marnane, W P

    2015-08-01

    Recent developments in "Big Data" have brought significant gains in the ability to process large amounts of data on commodity server hardware. Stream computing is a relatively new paradigm in this area, addressing the need to process data in real time with very low latency. While this approach has been developed for dealing with large scale data from the world of business, security and finance, there is a natural overlap with clinical needs for physiological signal processing. In this work we present a case study of streams processing applied to a typical physiological signal processing problem: QRS detection from ECG data. PMID:26737641

  10. Comparative Analysis of Different Fabric Defects Detection Techniques

    Directory of Open Access Journals (Sweden)

    Ali Javed

    2013-01-01

    Full Text Available In the last few years, textile companies have aimed to produce quality fabrics. A major loss for any textile-oriented company occurs due to defective fabrics, so the detection of faulty fabrics plays an important role in the success of any company. Until now, most inspection has been performed by human visual inspection, which is time-consuming, cumbersome and prone to human error. In the past, many advances were made in developing automated and computerized systems to reduce cost and time while increasing the efficiency of the process. This paper aims at comparing some of these techniques on the basis of classification methods and accuracy.

  11. Core power distribution fault detection process for a nuclear pressurized water reactor and device for carrying out the process

    International Nuclear Information System (INIS)

    Power distribution faults in the core of a PWR are detected by measuring at least one parameter representative of the core power, each parameter being measured at a predetermined number of points. For each parameter, one calculates the difference between the two extreme measured values and the ratio of this difference to the smallest measured value, and compares this ratio with a reference value, a fault being detected if the ratio is greater than the reference value. The detectors are placed symmetrically near the core periphery; preferably, a detector is also placed near the core center. The detectors are, according to the realization, neutron flux measuring chambers and/or temperature sensors. The detection process is very sensitive and can localise faults, whatever their cause, at any position in the core.
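    The test described above is easy to state in code; a sketch with an assumed reference ratio and invented detector readings.

```python
def power_tilt_fault(readings, reference_ratio=0.1):
    """Flag a core power-distribution fault from symmetric detector readings.

    Implements the test described above: (max - min) / min is compared with
    a reference value; exceeding it indicates an anomalous power tilt.
    """
    lo, hi = min(readings), max(readings)
    ratio = (hi - lo) / lo
    return ratio, ratio > reference_ratio

# Four peripheral flux readings (arbitrary units); one quadrant runs high.
print(power_tilt_fault([100.0, 101.5, 99.8, 113.0]))
```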

  12. Comparative analysis of model assessment in community detection

    CERN Document Server

    Kawamoto, Tatsuro

    2016-01-01

    Bayesian cluster inference with a flexible generative model allows us to detect various types of structures. However, it has problems stemming from computational complexity and difficulties in model assessment. We consider the stochastic block model with restricted hyperparameter space, which is known to correspond to modularity maximization. We show that it not only reduces computational complexity, but is also beneficial for model assessment. Using various criteria, we conduct a comparative analysis of the model assessments, and analyze whether each criterion tends to overfit or underfit. We also show that the learning of hyperparameters leads to qualitative differences in Bethe free energy and cross-validation errors.

  13. Detection of Wind Turbine Power Performance Abnormalities Using Eigenvalue Analysis

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Sweeney, Christian Walsted; Marhadi, Kun Saptohartyadi;

    2014-01-01

    Condition monitoring of wind turbines is a field of continuous research and development as new turbine configurations enter into the market and new failure modes appear. Systems utilising well established techniques from the energy and industry sector, such as vibration analysis...... in order to achieve better resolution and thus identify the most probable root cause of the power deviation. An important aspect of the proposed technique is its independence of the power curve provided by the turbine manufacturer. It is shown that by detecting any changes of the two eigenvalues trends...

  14. Modal Analysis for Crack Detection in Small Wind Turbine Blades

    DEFF Research Database (Denmark)

    Ulriksen, Martin Dalgaard; Skov, Jonas Falk; Dickow, Kristoffer Ahrens;

    2013-01-01

    The aim of the present paper is to evaluate structural health monitoring (SHM) techniques based on modal analysis for crack detection in small wind turbine blades. A finite element (FE) model calibrated to measured modal parameters will be introduced to cracks with different sizes along one edge...... of the blade. Changes in modal parameters from the FE model are compared with data obtained from experimental tests. These comparisons will be used to validate the FE model and subsequently discuss the usability of SHM techniques based on modal parameters for condition monitoring of wind turbine blades....

  15. Analysis and asynchronous detection of gradually unfolding errors during monitoring tasks.

    Science.gov (United States)

    Omedes, Jason; Iturrate, Iñaki; Minguez, Javier; Montesano, Luis

    2015-10-01

    Human studies on cognitive control processes rely on tasks involving sudden-onset stimuli, which allow the analysis of these neural imprints to be time-locked and relative to the stimuli onset. Human perceptual decisions, however, comprise continuous processes where evidence accumulates until reaching a boundary. Surpassing the boundary leads to a decision where measured brain responses are associated to an internal, unknown onset. The lack of this onset for gradual stimuli hinders both the analyses of brain activity and the training of detectors. This paper studies electroencephalographic (EEG)-measurable signatures of human processing for sudden and gradual cognitive processes represented as a trajectory mismatch under a monitoring task. Time-locked potentials and brain-source analysis of the EEG of sudden mismatches revealed the typical components of event-related potentials and the involvement of brain structures related to cognitive control processing. For gradual mismatch events, time-locked analyses did not show any discernible EEG scalp pattern, despite related brain areas being, to a lesser extent, activated. However, and thanks to the use of non-linear pattern recognition algorithms, it is possible to train an asynchronous detector on sudden events and use it to detect gradual mismatches, as well as obtaining an estimate of their unknown onset. Post-hoc time-locked scalp and brain-source analyses revealed that the EEG patterns of detected gradual mismatches originated in brain areas related to cognitive control processing. This indicates that gradual events induce latency in the evaluation process but that similar brain mechanisms are present in sudden and gradual mismatch events. Furthermore, the proposed asynchronous detection model widens the scope of applications of brain-machine interfaces to other gradual processes. PMID:26193332

  17. Laser processing and analysis of materials

    CERN Document Server

    Duley, W W

    1983-01-01

    It has often been said that the laser is a solution searching for a problem. The rapid development of laser technology over the past dozen years has led to the availability of reliable, industrially rated laser sources with a wide variety of output characteristics. This, in turn, has resulted in new laser applications as the laser becomes a familiar processing and analytical tool. The field of materials science, in particular, has become a fertile one for new laser applications. Laser annealing, alloying, cladding, and heat treating were all but unknown 10 years ago. Today, each is a separate, dynamic field of research activity with many of the early laboratory experiments resulting in the development of new industrial processing techniques using laser technology. Ten years ago, chemical processing was in its infancy awaiting, primarily, the development of reliable tunable laser sources. Now, with tunability over the entire spectrum from the vacuum ultraviolet to the far infrared, photochemistry is undergo...

  18. Kinetic Analysis of Mica Tape Curing Process

    Directory of Open Access Journals (Sweden)

    Radek Polansky

    2008-01-01

    Full Text Available The curing program of thermoset insulating materials, and its proper setting, is of key importance for assuring the high quality and reliability of electrical devices. In practice, the parameters of this program (temperature and time of curing) can be determined in several ways, most of which are based on kinetic analysis. The main aim of the paper is to compare the results of selected methods of kinetic analysis with residual enthalpy measurement. Two insulating tapes were chosen for this study; they correspond in composition (glass fabric, mica and epoxy binder) but differ in curing agent type. Simultaneous thermal analysis (STA) was used during the measurements. The results demonstrate the advantages and disadvantages of the particular methods.

  19. 300 Area process trench sediment analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Zimmerman, M.G.; Kossik, C.D.

    1987-12-01

    This report describes the results of a sampling program for the sediments underlying the Process Trenches serving the 300 Area on the Hanford reservation. These Process Trenches were the subject of a Closure Plan submitted to the Washington State Department of Ecology and to the US Environmental Protection Agency in lieu of a Part B permit application on November 8, 1985. The closure plan described a proposed sampling plan for the underlying sediments and potential remedial actions to be determined by the sample analyses results. The results and proposed remedial action plan are presented and discussed in this report. 50 refs., 6 figs., 8 tabs.

  20. Seeking a fingerprint: analysis of point processes in actigraphy recording

    Science.gov (United States)

    Gudowska-Nowak, Ewa; Ochab, Jeremi K.; Oleś, Katarzyna; Beldzik, Ewa; Chialvo, Dante R.; Domagalik, Aleksandra; Fąfrowicz, Magdalena; Marek, Tadeusz; Nowak, Maciej A.; Ogińska, Halszka; Szwed, Jerzy; Tyburczyk, Jacek

    2016-05-01

    Motor activity of humans displays complex temporal fluctuations which can be characterised by scale-invariant statistics, thus demonstrating that structure and fluctuations of such kinetics remain similar over a broad range of time scales. Previous studies on humans regularly deprived of sleep or suffering from sleep disorders predicted a change in the invariant scale parameters with respect to those for healthy subjects. In this study we investigate the signal patterns from actigraphy recordings by means of characteristic measures of fractional point processes. We analyse spontaneous locomotor activity of healthy individuals recorded during a week of regular sleep and a week of chronic partial sleep deprivation. Behavioural symptoms of lack of sleep can be evaluated by analysing statistics of duration times during active and resting states, and alteration of behavioural organisation can be assessed by analysis of power laws detected in the event count distribution, distribution of waiting times between consecutive movements and detrended fluctuation analysis of recorded time series. We claim that among different measures characterising complexity of the actigraphy recordings and their variations implied by chronic sleep distress, the exponents characterising slopes of survival functions in resting states are the most effective biomarkers distinguishing between healthy and sleep-deprived groups.
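
    The survival-function exponents singled out above can be estimated, in their simplest form, by a log-log fit to the empirical survival function of rest-bout durations. The sketch below uses synthetic Pareto-distributed durations and an ordinary least-squares fit; the study's exact estimator may differ.

        # Estimate alpha in S(t) ~ t**(-alpha) for rest-bout durations.
        import numpy as np

        def survival_exponent(durations):
            t = np.sort(np.asarray(durations, dtype=float))
            surv = 1.0 - np.arange(1, len(t) + 1) / len(t)   # empirical S(t)
            mask = (surv > 0) & (t > 0)
            slope, _ = np.polyfit(np.log(t[mask]), np.log(surv[mask]), 1)
            return -slope

        rest_bouts = np.random.pareto(1.5, size=2000) + 1.0  # synthetic data
        print(survival_exponent(rest_bouts))                 # ~1.5 expected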

  1. ELINT signal processing on reconfigurable computers for detection and classification of LPI Emitters

    OpenAIRE

    Brown, Dane A.

    2006-01-01

    This thesis describes the implementation of an ELINT algorithm for the detection and classification of Low Probability of Intercept (LPI) signals. The algorithm was coded in the C programming language and executed on a Field Programmable Gate Array based reconfigurable computer; the SRC-6 manufactured by SRC Computers, Inc. Specifically, this thesis focuses on the preprocessing stage of an LPI signal processing algorithm. This stage receives a detected signal that has been run through a Q...

  2. Air Conditioning Compressor Air Leak Detection by Image Processing Techniques for Industrial Applications

    OpenAIRE

    Pookongchai Kritsada; Nakornrat Prasit; Sookananta Bongkoj; Buasri Panhathai

    2015-01-01

    This paper presents a method for detecting air leakage in an air conditioning compressor using image processing techniques. A quality compressor should not leak air. To test a compressor for leaks, air is pumped into it and the compressor is then submerged in a water tank. If air bubbles appear at the surface of the compressor, that compressor must be returned for maintenance. In this work a new method to detect leakage and search leakage point...

  3. Optimal detection of a change-set in a spatial Poisson process

    CERN Document Server

    Ivanoff, B Gail; 10.1214/09-AAP629

    2010-01-01

    We generalize the classic change-point problem to a "change-set" framework: a spatial Poisson process changes its intensity on an unobservable random set. Optimal detection of the set is defined by maximizing the expected value of a gain function. In the case that the unknown change-set is defined by a locally finite set of incomparable points, we present a sufficient condition for optimal detection of the set using multiparameter martingale techniques. Two examples are discussed.

  4. Modeling fraud detection and the incorporation of forensic specialists in the audit process

    DEFF Research Database (Denmark)

    Sakalauskaite, Dominyka

    Financial statement audits are still comparatively poor in fraud detection. Forensic specialists can play a significant role in increasing audit quality. In this paper, based on prior academic research, I develop a model of fraud detection and the incorporation of forensic specialists in the audit...... process. The intention of the model is to identify the reasons why the audit is weak in fraud detection and to provide the analytical framework to assess whether the incorporation of forensic specialists can help to improve it. The results show that such specialists can potentially improve the fraud...... detection in the audit, but might also cause some negative implications. Overall, even though fraud detection is one of the main topics in research there are very few studies done on the subject of how auditors co-operate with forensic specialists. Thus, the paper concludes with suggestions for further...

  5. Resource conflict detection and removal strategy for nondeterministic emergency response processes using Petri nets

    Science.gov (United States)

    Zeng, Qingtian; Liu, Cong; Duan, Hua

    2016-09-01

    Correctness of an emergency response process specification is critical to emergency mission success; errors in the specification should therefore be detected and corrected at build-time. In this paper, we propose a resource conflict detection approach and removal strategy for emergency response processes constrained by resources and time. In this kind of emergency response process there are two timing functions, representing the minimum and maximum execution time of each activity, and many activities require resources in order to be executed. Based on the RT_ERP_Net, the earliest start time of each activity and the ideal execution time of the process can be obtained. To detect and remove resource conflicts in the process, conflict detection algorithms and a priority-activity-first resolution strategy are given. In this way, the real execution time of each activity is obtained and a conflict-free RT_ERP_Net is constructed by adding virtual activities. Experiments show that the proposed resolution strategy can considerably shorten the execution time of the whole process.
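
    The core of the conflict check described above reduces to asking whether two activities that need the same resource can overlap in time. A toy sketch under that reading, with illustrative activity tuples rather than the paper's Petri-net machinery:

        # Activities are (name, resource, earliest_start, latest_finish).
        def conflicts(activities):
            found = []
            for i, (n1, r1, s1, e1) in enumerate(activities):
                for n2, r2, s2, e2 in activities[i + 1:]:
                    if r1 == r2 and s1 < e2 and s2 < e1:  # shared + overlap
                        found.append((n1, n2, r1))
            return found

        acts = [("triage", "medic", 0, 4), ("transport", "medic", 3, 7),
                ("report", "radio", 2, 5)]
        print(conflicts(acts))   # [('triage', 'transport', 'medic')]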

  6. An analysis of network traffic classification for botnet detection

    DEFF Research Database (Denmark)

    Stevanovic, Matija; Pedersen, Jens Myrup

    2015-01-01

    Botnets represent one of the most serious threats to Internet security today. This paper explores how network traffic classification can be used for accurate and efficient identification of botnet network activity at local and enterprise networks. The paper examines the effectiveness of detecting botnet network traffic using three methods that target the protocols widely considered the main carriers of botnet Command and Control (C&C) and attack traffic, i.e. TCP, UDP and DNS. We propose three traffic classification methods based on the capable Random Forests classifier. The proposed methods...... to the optimization of traffic analysis and the correlation of findings from the three analysis methods in order to identify compromised hosts within the network.

  7. INTEGRATION OF POKA YOKE INTO PROCESS FAILURE MODE AND EFFECT ANALYSIS: A CASE STUDY

    Directory of Open Access Journals (Sweden)

    A. P. Puvanasvaran

    2014-01-01

    Full Text Available The Failure Mode and Effect Analysis (FMEA) is one of the requirements imposed on all automotive suppliers and manufacturers worldwide by the Automotive Industry Action Group (AIAG) through the TS16949 Quality System. Many discrepancies have been detected in FMEA implementations, directly related to user experience and knowledge, and these discrepancies prevent the FMEA from meeting its objectives. Conceptually, Poka Yoke fits into the Process FMEA. FMEA helps predict and prevent problems through proper control or detection methods, while mistake proofing emphasizes detection and correction of mistakes before they become defects. Poka Yoke helps people and processes work correctly the first time. It refers to techniques that make mistakes impossible to commit; these techniques eliminate defects from products and processes and substantially improve their quality and reliability. Poka Yoke can be considered an extension of FMEA. The use of simple Poka Yoke ideas and methods in product and process design eliminates both human and mechanical errors. Ultimately, both FMEA and Poka Yoke methodologies result in zero defects and benefit either the end or the next-in-line customer. The first concept of Poka Yoke emphasizes elimination of the cause or occurrence of the error that creates the defects, by concentrating on the cause of the error in the process; the defect is prevented by stopping the line or the machine when the root cause of the defect is triggered or detected. The second concept of Poka Yoke focuses on the effectiveness of the detection system: a foolproof detection system eliminates the defect or detects the error that causes defects. The implementation of the Poka Yoke concept in a foolproof detection system eliminates the possibility that errors or defects will slip through the process and reach the customer.

  8. Detection of the Second r-process Peak Element Tellurium in Metal-Poor Stars

    CERN Document Server

    Roederer, Ian U; Cowan, John J; Beers, Timothy C; Frebel, Anna; Ivans, Inese I; Schatz, Hendrik; Sobeck, Jennifer S; Sneden, Christopher

    2012-01-01

    Using near-ultraviolet spectra obtained with the Space Telescope Imaging Spectrograph onboard the Hubble Space Telescope, we detect neutral tellurium in three metal-poor stars enriched by products of r-process nucleosynthesis, BD+17 3248, HD 108317, and HD 128279. Tellurium (Te, Z=52) is found at the second r-process peak (A=130) associated with the N=82 neutron shell closure, and it has not been detected previously in Galactic halo stars. The derived tellurium abundances match the scaled solar system r-process distribution within the uncertainties, confirming the predicted second peak r-process residuals. These results suggest that tellurium is predominantly produced in the main component of the r-process, along with the rare earth elements.

  9. An Automated System for the Detection of Stratified Squamous Epithelial Cancer Cell Using Image Processing Techniques

    Directory of Open Access Journals (Sweden)

    Ram Krishna Kumar

    2013-06-01

    Full Text Available Early detection of cancer is a difficult problem, and if the disease is not detected in its starting phase it can be fatal. Current medical procedures used to diagnose cancer in body parts are time-consuming and require extensive laboratory work. This work is an endeavor toward the possible recognition of cancer cells in a body part. The process consists of taking an image of the affected area and digitally processing the images to obtain a morphological pattern that differentiates normal cells from cancer cells. The technique differs from visual inspection and the biopsy process. Image processing enables the visualization of cellular structure with substantial resolution. The aim of the work is to exploit differences in cellular organization between cancerous and normal tissue using image processing techniques, thus allowing for automated, fast and accurate diagnosis.

  10. Fingerprint Analysis with Marked Point Processes

    DEFF Research Database (Denmark)

    Forbes, Peter G. M.; Lauritzen, Steffen; Møller, Jesper

    We present a framework for fingerprint matching based on marked point process models. An efficient Monte Carlo algorithm is developed to calculate the marginal likelihood ratio for the hypothesis that two observed prints originate from the same finger against the hypothesis that they originate from different fingers.

  11. SPAN C - Terminal sterilization process analysis program

    Science.gov (United States)

    1969-01-01

    Computer program, SPAN-C, measures the dry heat thermal sterilization process applied to a planetary capsule and calculates the time required for heat application, steady state conditions, and cooling. The program is based on the logarithmic survival of micro-organisms. Temperature profiles must be input on cards.

  12. SPAN - Terminal sterilization process analysis program

    Science.gov (United States)

    1969-01-01

    Computer program, SPAN, measures the dry heat thermal sterilization process applied to a planetary capsule and calculates the time required for heat application, steady state conditions, and cooling. The program is based on the logarithmic survival of micro-organisms. Temperature profiles must be input on tape.

  13. Exergy analysis in industrial food processing

    NARCIS (Netherlands)

    Zisopoulos, F.K.

    2016-01-01

    The sustainable provision of food on a global scale in the near future is a very serious challenge. This thesis focuses on the assessment and design of sustainable industrial food production chains and processes by using the concept of exergy, which is an objective metric based on the first and second laws of thermodynamics.

  14. Analysis of food quality perception processes

    NARCIS (Netherlands)

    J.G. Termorshuizen (Koos); M.T.G. Meulenberg; B. Wierenga (Berend)

    1986-01-01

    textabstractA model of the quality perception process of the consumer with respect to food products has been developed. The model integrates a number of quality-related concepts. An empirical study was carried out to examine the relationships between the concepts. It appears that the various concept

  15. Simplified Processing Method for Meter Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Colotelo, Alison H. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Downs, Janelle L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ham, Kenneth D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Montgomery, Sadie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Vernon, Christopher R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Parker, Steven A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-11-01

    A simple, quick metered-data processing method that can be used for Army Metered Data Management System (MDMS) and Logistics Innovation Agency data, but may also be useful for other large data sets. It is intended for large data sets where the analyst has little information about the buildings.

  16. Bayesian analysis of Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2006-01-01

    Recently Møller, Pettitt, Berthelsen and Reeves introduced a new MCMC methodology for drawing samples from a posterior distribution when the likelihood function is only specified up to a normalising constant. We illustrate the method in the setting of Bayesian inference for Markov point processes...

  17. Representative process sampling for reliable data analysis

    DEFF Research Database (Denmark)

    Julius, Lars Petersen; Esbensen, Kim

    2005-01-01

    Process sampling of moving streams of particulate matter, fluids and slurries (over time or space) or stationary one-dimensional (1-D) lots is often carried out according to existing tradition or protocol not taking the theory of sampling (TOS) into account. In many situations, sampling errors...

  18. Power spectrum weighted edge analysis for straight edge detection in images

    Science.gov (United States)

    Karvir, Hrishikesh V.; Skipper, Julie A.

    2007-04-01

    Most man-made objects provide characteristic straight line edges and, therefore, edge extraction is a commonly used target detection tool. However, noisy images often yield broken edges that lead to missed detections, and extraneous edges that may contribute to false target detections. We present a sliding-block approach for target detection using weighted power spectral analysis. In general, straight line edges appearing at a given frequency are represented as a peak in the Fourier domain at a radius corresponding to that frequency, and a direction corresponding to the orientation of the edges in the spatial domain. Knowing the edge width and spacing between the edges, a band-pass filter is designed to extract the Fourier peaks corresponding to the target edges and suppress image noise. These peaks are then detected by amplitude thresholding. The frequency band width and the subsequent spatial filter mask size are variable parameters to facilitate detection of target objects of different sizes under known imaging geometries. Many military objects, such as trucks, tanks and missile launchers, produce definite signatures with parallel lines and the algorithm proves to be ideal for detecting such objects. Moreover, shadow-casting objects generally provide sharp edges and are readily detected. The block operation procedure offers advantages of significant reduction in noise influence, improved edge detection, faster processing speed and versatility to detect diverse objects of different sizes in the image. With Scud missile launcher replicas as target objects, the method has been successfully tested on terrain board test images under different backgrounds, illumination and imaging geometries with cameras of differing spatial resolution and bit-depth.
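
    The Fourier-domain step described above can be sketched compactly: take the block's power spectrum, keep an annular band corresponding to the expected edge frequency, and threshold for peaks. The band limits and threshold factor below are placeholders, not the paper's tuned values.

        import numpy as np

        def fourier_edge_peaks(block, r_lo, r_hi, thresh_factor=5.0):
            """Return (row, col) peaks in a band-passed power spectrum."""
            power = np.abs(np.fft.fftshift(np.fft.fft2(block))) ** 2
            h, w = power.shape
            yy, xx = np.mgrid[0:h, 0:w]
            r = np.hypot(yy - h / 2, xx - w / 2)
            band = (r >= r_lo) & (r <= r_hi)      # annular band-pass mask
            masked = np.where(band, power, 0.0)
            cutoff = thresh_factor * masked[band].mean()
            return np.argwhere(masked > cutoff)   # amplitude thresholding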

  19. Analysis of coaxial laser cladding processing conditions

    NARCIS (Netherlands)

    de Oliveira, U; Ocelik, V; De Hosson, JTM

    2005-01-01

    The formation of thick Ni-based coating on a steel substrate by coaxial laser cladding using the Nd:YAG 2 kW continuous laser was studied both from a theoretical and experimental point of view. The theoretical analysis concentrated on the transfer of laser irradiation and powder particles using a si

  20. Detection and Monitoring of Neurotransmitters - a Spectroscopic Analysis

    Science.gov (United States)

    Manciu, Felicia; Lee, Kendall; Durrer, William; Bennet, Kevin

    2012-10-01

    In this work we demonstrate the capability of confocal Raman mapping spectroscopy for simultaneously and locally detecting important compounds in neuroscience such as dopamine, serotonin, and adenosine. The Raman results show shifting of the characteristic vibrations of the compounds, observations consistent with previous spectroscopic studies. Although some vibrations are common in these neurotransmitters, Raman mapping was achieved by detecting non-overlapping characteristic spectral signatures of the compounds, as follows: for dopamine the vibration attributed to C-O stretching, for serotonin the indole ring stretching vibration, and for adenosine the adenine ring vibrations. Without damage, dyeing, or preferential sample preparation, confocal Raman mapping provided positive detection of each neurotransmitter, allowing association of the high-resolution spectra with specific micro-scale image regions. Such information is particularly important for complex, heterogeneous samples, where modification of the chemical or physical composition can influence the neurotransmission processes. We also report an estimated dopamine diffusion coefficient two orders of magnitude smaller than that calculated by the flow-injection method.

  1. Enhancement of crack detection in stud bolts of nuclear reactor by ultrasonic signal processing technique

    International Nuclear Information System (INIS)

    The stud bolt is one of the most critical parts for the safety of reactor vessels in nuclear power plants. However, when applying ultrasonic techniques for crack detection in stud bolts, one difficulty encountered is distinguishing crack signals from the signals reflected from the threaded part of the bolt. In this study, a shadow effect technique combined with a new signal processing method is investigated to enhance the detectability of small cracks initiated from the thread roots of stud bolts. The key idea of the signal processing is based on the fact that the waveforms from the threads are uniform, since the shape of the threads in a bolt is the same. If cracks exist in the thread, the flaw signals differ from the reference signals. It is demonstrated that small flaws are efficiently detected by the novel ultrasonic technique combined with this new signal processing concept. (author)

  2. Damage Detection and Quantification Using Transmissibility Coherence Analysis

    Directory of Open Access Journals (Sweden)

    Yun-Lai Zhou

    2015-01-01

    Full Text Available A new transmissibility-based damage detection and quantification approach is proposed. Based on operational modal analysis, the transmissibility is extracted from system responses, and the transmissibility coherence is defined and analyzed. Afterwards, a damage-sensitive indicator is defined in order to detect damage and identify its severity, and is compared with an indicator developed by other authors. The proposed approach is validated on data from a physics-based numerical model as well as experimental data from a three-story aluminum frame structure. For both the numerical simulation and the experiment, the new indicator performs better than the coherence measure proposed in Rizos et al., 2008, Rizos et al., 2002, and Fassois and Sakellariou, 2007, especially when nonlinearity occurs, which might be further exploited in real engineering. The main contributions of this study are the construction of the relation between transmissibility coherence and frequency response function coherence, and the construction of an effective indicator based on the transmissibility modal assurance criterion for damage (especially minor nonlinearity) detection as well as quantification.
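
    One common way to extract a transmissibility between two measured responses, consistent with (though not necessarily identical to) the operational-modal setting above, is the ratio of cross- to auto-spectra; a sketch using SciPy, with signals and rates as placeholders:

        import numpy as np
        from scipy.signal import csd, welch

        def transmissibility(x_i, x_j, fs, nperseg=1024):
            """|T_ij(f)| between two measured outputs x_i and x_j."""
            f, s_ij = csd(x_i, x_j, fs=fs, nperseg=nperseg)
            _, s_jj = welch(x_j, fs=fs, nperseg=nperseg)
            return f, np.abs(s_ij / s_jj)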

  3. Error detection process - Model, design, and its impact on computer performance

    Science.gov (United States)

    Shin, K. G.; Lee, Y.-H.

    1984-01-01

    An analytical model is developed for computer error detection processes and applied to estimate their influence on system performance. Faults in the hardware, not in the design, are assumed to be the potential cause of transition to erroneous states during normal operations. The classification properties and associated recovery methods of error detection are discussed. The probability of obtaining an unreliable result is evaluated, along with the resulting computational loss. Error detection during design is considered and a feasible design space is outlined. Extension of the methods to account for the effects of extant multiple faults is indicated.

  4. Detection of Prion Proteins and TSE Infectivity in the Rendering and Biodiesel Manufacture Processes

    Energy Technology Data Exchange (ETDEWEB)

    Brown, R.; Keller, B.; Oleschuk, R. [Queen's University, Kingston, Ontario (Canada)

    2007-03-15

    This paper addresses emerging issues related to monitoring prion proteins and TSE infectivity in the products and waste streams of rendering and biodiesel manufacture processes. Monitoring is critical to addressing the knowledge gaps identified in 'Biodiesel from Specified Risk Material Tallow: An Appraisal of TSE Risks and their Reduction' (IEA's AMF Annex XXX, 2006) that prevent comprehensive risk assessment of TSE infectivity in products and waste. The most important challenge for monitoring TSE risk is the wide variety of sample types, which are generated at different points in the rendering/biodiesel production continuum. Conventional transmissible spongiform encephalopathy (TSE) assays were developed for specified risk material (SRM) and other biological tissues. These, however, are insufficient to address the diverse sample matrices produced in rendering and biodiesel manufacture. This paper examines the sample types expected in rendering and biodiesel manufacture and the implications of applying TSE assay methods to them. The authors then discuss a sample preparation filtration, which has not yet been applied to these sample types, but which has the potential to provide or significantly improve TSE monitoring. The main improvement will come from transfer of the prion proteins from the sample matrix to a matrix compatible with conventional and emerging bioassays. A second improvement will come from preconcentrating the prion proteins, which means transferring proteins from a larger sample volume into a smaller volume for analysis to provide greater detection sensitivity. This filtration method may also be useful for monitoring other samples, including wash waters and other waste streams, which may contain SRM, including those from abattoirs and on-farm operations. Finally, there is a discussion of emerging mass spectrometric methods, which Prusiner and others have shown to be suitable for detection and characterisation of prion proteins (Stahl

  5. Shift endpoint trace selection algorithm and wavelet analysis to detect the endpoint using optical emission spectroscopy

    Science.gov (United States)

    Ben Zakour, Sihem; Taleb, Hassen

    2016-06-01

    Endpoint detection (EPD) is a very important undertaking for understanding and determining whether a plasma etching process has proceeded correctly; it is a crucial part of delivering repeatable results on every wafer. The endpoint is reached when the film to be etched has been completely removed. To ensure the desired device performance of the produced integrated circuit, many sensors are used to detect the endpoint, such as optical, electrical, acoustical/vibrational, thermal, and frictional ones; except for the optical sensor, however, these show weaknesses due to environmental conditions that affect the exactness of reaching the endpoint. Unfortunately, in some cases the exposed area of the film to be etched is very low, giving a weak signal and showing the incapacity of the traditional endpoint detection method to determine the completion of the etch process. This work provides a means to improve endpoint detection sensitivity by collecting a large volume of full spectral data, containing 1201 spectra for each run; a new, straightforward algorithm, named shift endpoint trace selection (SETS), is then proposed to select the important endpoint traces. A sensitivity analysis of the linear methods, principal component analysis (PCA) and factor analysis (FA), and of the nonlinear method of wavelet analysis (WA), for both approximation and detail coefficients, is carried out to compare the performance of the methods mentioned above. The signal-to-noise ratio (SNR) is computed based not only on the main etch (ME) period but also on the over-etch (OE) period. Moreover, a statistic new to EPD, the coefficient of variation (CV), is proposed to reach the endpoint in the plasma etch process.
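
    As a worked detail, the coefficient of variation proposed above is just the ratio of standard deviation to mean; applied over a sliding window of an emission trace it yields a statistic whose change can mark the endpoint. The window length below is an illustrative choice, not the paper's.

        import numpy as np

        def cv_trace(signal, window=50):
            """Sliding-window coefficient of variation of an emission trace."""
            s = np.asarray(signal, dtype=float)
            return np.array([s[i:i + window].std() / s[i:i + window].mean()
                             for i in range(len(s) - window)])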

  6. Processing Cost Analysis for Biomass Feedstocks

    Energy Technology Data Exchange (ETDEWEB)

    Badger, P.C.

    2002-11-20

    The receiving, handling, storing, and processing of woody biomass feedstocks is an overlooked component of biopower systems. The purpose of this study was twofold: (1) to identify and characterize all the receiving, handling, storing, and processing steps required to make woody biomass feedstocks suitable for use in direct combustion and gasification applications, including small modular biopower (SMB) systems, and (2) to estimate the capital and operating costs at each step. Since biopower applications can be varied, a number of conversion systems and feedstocks required evaluation. In addition to limiting this study to woody biomass feedstocks, the boundaries of this study were from the power plant gate to the feedstock entry point into the conversion device. Although some power plants are sited at a source of wood waste fuel, it was assumed for this study that all wood waste would be brought to the power plant site. This study was also confined to the following three feedstocks: (1) forest residues, (2) industrial mill residues, and (3) urban wood residues. Additionally, the study was confined to grate, suspension, and fluidized bed direct combustion systems; gasification systems; and SMB conversion systems. Since scale can play an important role in types of equipment, operational requirements, and capital and operational costs, this study examined these factors for the following direct combustion and gasification system size ranges: 50, 20, 5, and 1 MWe. The scope of the study also included: specific operational issues associated with specific feedstocks (e.g., bark and problems with bridging); opportunities for reducing handling, storage, and processing costs; how environmental restrictions can affect handling and processing costs (e.g., noise, commingling of treated wood or non-wood materials, emissions, and runoff); and feedstock quality issues and/or requirements (e.g., moisture, particle size, presence of non-wood materials). The study found that over the

  7. Adaptive Fault Detection for Complex Dynamic Processes Based on JIT Updated Data Set

    Directory of Open Access Journals (Sweden)

    Jinna Li

    2012-01-01

    Full Text Available A novel fault detection technique is proposed to explicitly account for the nonlinear, dynamic, and multimodal problems that exist in practical and complex dynamic processes. The just-in-time (JIT) detection method and the k-nearest neighbor (KNN) rule-based statistical process control (SPC) approach are integrated to construct a flexible and adaptive detection scheme for control processes with nonlinear, dynamic, and multimodal behavior. The Mahalanobis distance, representing the correlation among samples, is used to simplify and update the raw data set, which is the first merit of this paper. Based on it, the control limit is computed in terms of both the KNN rule and the SPC method, so that we can identify online whether the current data are normal or not. Note that the control limit changes as the database is updated, yielding an adaptive fault detection technique that can effectively eliminate the impact of data drift and shift on the performance of the detection process, which is the second merit of this paper. The efficiency of the developed method is demonstrated by numerical examples and an industrial case.
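
    A rough sketch of the KNN-rule detection step, under plain Euclidean distance (the paper additionally uses the Mahalanobis distance to prune and update the reference set): a query is declared faulty when its mean distance to the k nearest training samples exceeds a control limit calibrated on the training data.

        import numpy as np

        def knn_stat(x, data, k=5):
            d = np.linalg.norm(data - x, axis=1)
            return np.sort(d)[:k].mean()          # mean k-NN distance

        def control_limit(data, k=5, alpha=0.99):
            stats = [knn_stat(data[i], np.delete(data, i, axis=0), k)
                     for i in range(len(data))]   # leave-one-out statistics
            return np.quantile(stats, alpha)

        train = np.random.randn(200, 4)           # synthetic normal data
        limit = control_limit(train)
        print(knn_stat(np.array([5.0, 5, 5, 5]), train) > limit)  # True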

  8. Analysis of digitized cervical images to detect cervical neoplasia

    Science.gov (United States)

    Ferris, Daron G.

    2004-05-01

    Cervical cancer is the second most common malignancy in women worldwide. If diagnosed in the premalignant stage, cure is invariably assured. Although the Papanicolaou (Pap) smear has significantly reduced the incidence of cervical cancer where implemented, the test is only moderately sensitive, highly subjective and skilled-labor intensive. Newer optical screening tests (cervicography, direct visual inspection and speculoscopy), including fluorescent and reflective spectroscopy, are fraught with certain weaknesses. Yet, the integration of optical probes for the detection and discrimination of cervical neoplasia with automated image analysis methods may provide an effective screening tool for early detection of cervical cancer, particularly in resource poor nations. Investigative studies are needed to validate the potential for automated classification and recognition algorithms. By applying image analysis techniques for registration, segmentation, pattern recognition, and classification, cervical neoplasia may be reliably discriminated from normal epithelium. The National Cancer Institute (NCI), in cooperation with the National Library of Medicine (NLM), has embarked on a program to begin this and other similar investigative studies.

  9. Integrated situational awareness for cyber attack detection, analysis, and mitigation

    Science.gov (United States)

    Cheng, Yi; Sagduyu, Yalin; Deng, Julia; Li, Jason; Liu, Peng

    2012-06-01

    Real-time cyberspace situational awareness is critical for securing and protecting today's enterprise networks from various cyber threats. When a security incident occurs, network administrators and security analysts need to know what exactly has happened in the network, why it happened, and what actions or countermeasures should be taken to quickly mitigate the potential impacts. In this paper, we propose an integrated cyberspace situational awareness system for efficient cyber attack detection, analysis and mitigation in large-scale enterprise networks. Essentially, a cyberspace common operational picture will be developed, which is a multi-layer graphical model and can efficiently capture and represent the statuses, relationships, and interdependencies of various entities and elements within and among different levels of a network. Once shared among authorized users, this cyberspace common operational picture can provide an integrated view of the logical, physical, and cyber domains, and a unique visualization of disparate data sets to support decision makers. In addition, advanced analyses, such as Bayesian Network analysis, will be explored to address the information uncertainty, dynamic and complex cyber attack detection, and optimal impact mitigation issues. All the developed technologies will be further integrated into an automatic software toolkit to achieve near real-time cyberspace situational awareness and impact mitigation in large-scale computer networks.

  10. Network structure detection and analysis of Shanghai stock market

    Directory of Open Access Journals (Sweden)

    Sen Wu

    2015-04-01

    Full Text Available Purpose: In order to investigate the community structure of the component stocks of the SSE (Shanghai Stock Exchange) 180 index, a stock correlation network is built to find the intra-community and inter-community relationships. Design/methodology/approach: The stock correlation network is built taking the vertices as stocks and the edges as correlation coefficients of the logarithmic returns of the stock prices. It is first built as an undirected weighted network; the GN algorithm is then selected to detect community structure after converting the network into an unweighted one using different thresholds. Findings: The result of the network community structure analysis shows that the stock market has obvious industrial characteristics. Most of the stocks in the same industry or in the same supply chain are assigned to the same community, and the correlation of stock price fluctuations is closer within a community than between communities. The result of community structure detection also reflects correlations among different industries. Originality/value: Based on the analysis of the community structure in the Shanghai stock market, the result reflects some industrial characteristics, which has reference value for the relationships among industries or sub-sectors of listed companies.
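
    The construction described above is short to express with NetworkX. In this sketch, prices is an assumed (days x stocks) array and the 0.6 threshold is an illustrative choice, not the study's:

        import numpy as np
        import networkx as nx
        from networkx.algorithms.community import girvan_newman

        def stock_network(prices, tickers, threshold=0.6):
            returns = np.diff(np.log(prices), axis=0)  # log returns
            corr = np.corrcoef(returns.T)              # pairwise correlation
            g = nx.Graph()
            g.add_nodes_from(tickers)
            for i in range(len(tickers)):
                for j in range(i + 1, len(tickers)):
                    if corr[i, j] >= threshold:        # keep strong links only
                        g.add_edge(tickers[i], tickers[j])
            return g

        # First GN split into communities (requires a graph with edges):
        # communities = next(girvan_newman(stock_network(prices, tickers)))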

  11. Detection and analysis of diamond fingerprinting feature and its application

    Energy Technology Data Exchange (ETDEWEB)

    Li Xin; Huang Guoliang; Li Qiang; Chen Shengyi, E-mail: tshgl@tsinghua.edu.cn [Department of Biomedical Engineering, the School of Medicine, Tsinghua University, Beijing, 100084 (China)

    2011-01-01

    Before becoming jewelry, diamonds need to be cut artistically with special geometric features forming a polyhedral structure, and there are subtle differences in this polyhedral structure from one diamond to another. Through spatial frequency spectrum analysis of the diamond surface structure, we can obtain diamond fingerprint information which represents the 'Diamond ID' and has good specificity. Based on optical Fourier transform spatial spectrum analysis, the fingerprint identification of the surface structure of diamonds in the spatial frequency domain is studied in this paper. We constructed both a completely coherent diamond fingerprinting detection system illuminated by laser and a partially coherent system illuminated by LED, and analyzed the effect of the coherence of the light source on the diamond fingerprinting feature. We studied the rotation invariance and translation invariance of the diamond fingerprint and verified the feasibility of real-time and accurate identification of diamond fingerprints. With the benefit of this work, we can provide customs, jewelers and consumers with a real-time and reliable diamond identification instrument, which will curb diamond smuggling, theft and other crimes, and ensure the healthy development of the diamond industry.

  12. Analysis of patents on preeclampsia detection and diagnosis: a perspective.

    Science.gov (United States)

    Telang, M A; Bhutkar, S P; Hirwani, R R

    2013-01-01

    Computerized patent databases have made it possible to access a wealth of technological information contained in patent documents. Analysis of patent information can greatly help in monitoring technology trends and evolution as well as in identifying research gaps. Detection and diagnosis of preeclampsia (PE) was chosen as a case study to emphasize the informative potential of patent analysis. PE complicates about 2-8% of pregnancies, affecting a total of about 8.5 million women worldwide. PE can lead to potentially life threatening problems of the liver, kidneys, brain and blood clotting system of the mother. Risks for the baby include poor growth and prematurity. No effective ways of predicting or preventing PE have been found, which highlights the need for further research in this field. Some researchers believe that the incidence of PE is on the rise due to an increased prevalence of predisposing disorders, such as chronic hypertension, diabetes, and obesity. PE thus represents a huge health care burden the world over. Complications of PE might be prevented to a certain extent if it is diagnosed early. Patents on the detection and diagnosis of PE were analyzed to gain a better understanding of the technical approaches followed by various research groups around the world. PMID:23200058

  13. Identifying Organizational Inefficiencies with Pictorial Process Analysis (PPA)

    Directory of Open Access Journals (Sweden)

    David John Patrishkoff

    2013-11-01

    Full Text Available Pictorial Process Analysis (PPA) was created by the author in 2004. PPA is a unique methodology which offers ten layers of additional analysis when compared to standard process mapping techniques. The goal of PPA is to identify and eliminate waste, inefficiencies and risk in manufacturing or transactional business processes at five levels in an organization. The highest level assessed is process management, followed by the process work environment, detailed work habits, process performance metrics and general attitudes towards the process. This detailed process assessment and analysis is carried out during process improvement brainstorming efforts and Kaizen events. PPA creates a detailed visual efficiency rating for each step of the process under review. A selection of 54 pictorial Inefficiency Icons (cards) is available to highlight major inefficiencies and risks present in the business process under review. These inefficiency icons were identified during the author's independent research on the topic of why things go wrong in business. This paper highlights how PPA was developed and shows the steps required to conduct Pictorial Process Analysis on a sample manufacturing process. The author has successfully used PPA to dramatically improve business processes in over 55 different industries since 2004.

  14. Piezoresistive microcantilever aptasensor for ricin detection and kinetic analysis

    Directory of Open Access Journals (Sweden)

    Zhi-Wei Liu

    2015-04-01

    Full Text Available Up to now, there has been no report on the detection of target molecules by a piezoresistive microcantilever aptasensor. In order to evaluate the test performance and investigate the dynamic response characteristics of such a sensor, a novel method for ricin detection and kinetic analysis based on a piezoresistive microcantilever aptasensor is proposed, in which the ricin aptamer is immobilised on the microcantilever surface by a biotin-avidin binding system. Results showed that the detection limit for ricin was 0.04 μg L−1 (S/N ≥ 3). A linear relationship between the response voltage and the concentration of ricin in the range of 0.2 μg L−1 to 40 μg L−1 was obtained, with the linear regression equation ΔUe = 0.904C + 5.852 (n = 5, R = 0.991, p < 0.001). The sensor showed no response to abrin or BSA and could overcome the influence of complex environmental disruptors, indicating high specificity and good selectivity. Recovery and reproducibility in the determination of simulated samples (simulated water, soil, and flour samples) met the analysis requirements, at 90.5-95.5% and 7.85-9.39%, respectively. On this basis, a reaction kinetic model based on ligand-receptor binding, and its relationship with the response voltage, was established. The model reflects the dynamic response of the sensor well; the correlation coefficient (R) was greater than or equal to 0.9456 (p < 0.001). The response voltage (ΔUe) and response time (t0) obtained from the fitting equation at different concentrations of ricin fitted well with the measured values.
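
    Read as a worked example, the reported calibration inverts directly to estimate an unknown concentration from a measured response voltage, valid only inside the stated linear range:

        C = \frac{\Delta U_e - 5.852}{0.904} \ \mu\mathrm{g\,L^{-1}}, \qquad 0.2 \le C \le 40

    so a response of, say, ΔUe = 24 corresponds to roughly 20 μg L−1.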

  15. Dynamic analysis methods for detecting anomalies in asynchronously interacting systems

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Akshat; Solis, John Hector; Matschke, Benjamin

    2014-01-01

    Detecting modifications to digital system designs, whether malicious or benign, is problematic due to the complexity of the systems being analyzed. Moreover, static analysis techniques and tools can only be used during the initial design and implementation phases to verify safety and liveness properties. It is computationally intractable to guarantee that any previously verified properties still hold after a system, or even a single component, has been produced by a third-party manufacturer. In this paper we explore new approaches for creating a robust system design by investigating highly-structured computational models that simplify verification and analysis. Our approach avoids the need to fully reconstruct the implemented system by incorporating a small verification component that dynamically detects for deviations from the design specification at run-time. The first approach encodes information extracted from the original system design algebraically into a verification component. During run-time this component randomly queries the implementation for trace information and verifies that no design-level properties have been violated. If any deviation is detected then a pre-specified fail-safe or notification behavior is triggered. Our second approach utilizes a partitioning methodology to view liveness and safety properties as a distributed decision task and the implementation as a proposed protocol that solves this task. Thus the problem of verifying safety and liveness properties is translated to that of verifying that the implementation solves the associated decision task. We develop upon results from distributed systems and algebraic topology to construct a learning mechanism for verifying safety and liveness properties from samples of run-time executions.

  16. Streak detection and analysis pipeline for space-debris optical images

    Science.gov (United States)

    Virtanen, Jenni; Poikonen, Jonne; Säntti, Tero; Komulainen, Tuomo; Torppa, Johanna; Granvik, Mikael; Muinonen, Karri; Pentikäinen, Hanna; Martikainen, Julia; Näränen, Jyri; Lehti, Jussi; Flohrer, Tim

    2016-04-01

    We describe a novel data-processing and analysis pipeline for optical observations of moving objects, either of natural (asteroids, meteors) or artificial origin (satellites, space debris). The monitoring of the space object populations requires reliable acquisition of observational data, to support the development and validation of population models and to build and maintain catalogues of orbital elements. The orbital catalogues are, in turn, needed for the assessment of close approaches (for asteroids, with the Earth; for satellites, with each other) and for the support of contingency situations or launches. For both types of populations, there is also increasing interest to detect fainter objects corresponding to the small end of the size distribution. The ESA-funded StreakDet (streak detection and astrometric reduction) activity has aimed at formulating and discussing suitable approaches for the detection and astrometric reduction of object trails, or streaks, in optical observations. Our two main focuses are objects in lower altitudes and space-based observations (i.e., high angular velocities), resulting in long (potentially curved) and faint streaks in the optical images. In particular, we concentrate on single-image (as compared to consecutive frames of the same field) and low-SNR detection of objects. Particular attention has been paid to the process of extraction of all necessary information from one image (segmentation), and subsequently, to efficient reduction of the extracted data (classification). We have developed an automated streak detection and processing pipeline and demonstrated its performance with an extensive database of semisynthetic images simulating streak observations both from ground-based and space-based observing platforms. The average processing time per image is about 13 s for a typical 2k-by-2k image. For long streaks (length >100 pixels), primary targets of the pipeline, the detection sensitivity (true positives) is about 90% for

  17. An analysis of the Logistics Requisition process

    OpenAIRE

    Burson, Dawn A.

    2011-01-01

    Approved for public release; distribution is unlimited. The business of supporting a globally dispersed naval force is fraught with challenges and complexity. Services for warships of differing mission and size must be sourced and provided at ports all over the world. U.S. Navy ships use a formatted report called a Logistics Requisition (LOGREQ) to acquire those necessary services. The unconnected nature of the stakeholders that own specific portions of the process increases complexity as ...

  18. Quantile spectral processes: Asymptotic analysis and inference

    OpenAIRE

    Kley, Tobias; Volgushev, Stanislav; Dette, Holger; Hallin, Marc

    2016-01-01

    Quantile- and copula-related spectral concepts recently have been considered by various authors. Those spectra, in their most general form, provide a full characterization of the copulas associated with the pairs $(X_{t},X_{t-k})$ in a process $(X_{t})_{t\in\mathbb{Z}}$, and account for important dynamic features, such as changes in the conditional shape (skewness, kurtosis), time-irreversibility, or dependence in the extremes that their traditional counterparts cannot capture. Despite variou...

  19. Fire detection and prediction with image processing on a UAV platform

    OpenAIRE

    Mimoun Abderrahaman, Fuad

    2014-01-01

    This document proposes a fire detection tool designed for fire departments, based on image processing algorithms, pattern classification and UAV technology.

  20. Thermodynamic analysis on synthesis process of diamond

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Based on a thermodynamic analysis of diamond nucleation, the effects of synthesis temperature and pressure on the nucleation, growth, and output of diamond crystals, as well as on particle size and strength, are discussed. The results show that the excess pressure has an important effect on the critical radius of nucleation and on the thermodynamic barrier to the formation of a critical nucleus. Taking the excess pressure into account, an expression for the diamond nucleation rate was obtained.
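
    For reference, in classical nucleation theory (the standard form; the paper's exact expressions may differ) the critical radius and the thermodynamic barrier are

```latex
r^{*} = \frac{2\gamma}{\Delta g_{v}},
\qquad
\Delta G^{*} = \frac{16\pi\gamma^{3}}{3\,\Delta g_{v}^{2}},
```

    where $\gamma$ is the interfacial energy and $\Delta g_{v}$ the Gibbs free-energy difference per unit volume. A larger excess pressure increases $\Delta g_{v}$ and therefore lowers both $r^{*}$ and $\Delta G^{*}$, favouring nucleation.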

  1. Analysis and processing - introduction to knowledge management

    OpenAIRE

    Caracostea Ionut-Andrei

    2011-01-01

    Essential theoretical knowledge of marketing concepts alone is not enough for a person involved in management and marketing planning. To be competitive in the labor market, a marketing professional must possess specific information-analysis tools in order to capture those aspects of the marketing environment necessary for decision making. Using the analytical capabilities resulting from the technological boom of the last decades, especially those pertaining to infor...

  2. Statistical methods for the detection and analysis of radioactive sources

    Science.gov (United States)

    Klumpp, John

    We consider four topics from the statistical analysis of radioactivity in the present study: Bayesian methods for the analysis of count rate data, analysis of energy data, a model for non-constant background count rate distributions, and a zero-inflated model of the sample count rate. The study begins with a review of Bayesian statistics and techniques for analyzing count rate data. Next, we consider a novel system for incorporating energy information into count rate measurements which searches for elevated count rates in multiple energy regions simultaneously. The system analyzes time-interval data in real time to sequentially update a probability distribution for the sample count rate. We then consider a "moving target" model of background radiation in which the instantaneous background count rate is a function of time, rather than being fixed. Unlike the sequential update system, this model assumes a large body of pre-existing data which can be analyzed retrospectively. Finally, we propose a novel Bayesian technique which allows for simultaneous source detection and count rate analysis. This technique is fully compatible with, but independent of, the sequential update system and moving target model.
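
    A minimal sketch of the sequential-update idea: the study's actual system works on time-interval data across multiple energy regions, while this toy version assumes a single channel, batch counts, and a conjugate gamma prior on the count rate.

```python
import numpy as np

def sequential_update(counts, live_times, alpha=1.0, beta=1e-3):
    """Gamma-Poisson conjugate update of a count-rate posterior (toy sketch)."""
    # Prior: lambda ~ Gamma(alpha, beta) with beta a rate parameter,
    # so the prior mean count rate is alpha / beta (counts per second).
    for n, t in zip(counts, live_times):
        # Observing n counts in live time t updates the posterior to
        # Gamma(alpha + n, beta + t).
        alpha += n
        beta += t
    return alpha / beta, alpha / beta**2   # posterior mean and variance

# Example: three measurement intervals of 10 s each.
print(sequential_update(counts=[12, 9, 15], live_times=[10.0, 10.0, 10.0]))
```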

  3. Advanced Color Image Processing and Analysis

    CERN Document Server

    2013-01-01

    This volume does much more than survey modern advanced color processing. Starting with a historical perspective on ways we have classified color, it sets out the latest numerical techniques for analyzing and processing colors, the leading edge in our search to accurately record and print what we see. The human eye perceives only a fraction of available light wavelengths, yet we live in a multicolor world of myriad shining hues. Colors rich in metaphorical associations make us “purple with rage” or “green with envy” and cause us to “see red.” Defining colors has been the work of centuries, culminating in today’s complex mathematical coding that nonetheless remains a work in progress: only recently have we possessed the computing capacity to process the algebraic matrices that reproduce color more accurately. With chapters on dihedral color and image spectrometers, this book provides technicians and researchers with the knowledge they need to grasp the intricacies of today’s color imaging.

  4. APPLICATION OF WESTERN BLOT ANALYSIS FOR DETECTION OF PROLAMIN PROTEINS IN CEREAL GRAINS AND BREAD

    Directory of Open Access Journals (Sweden)

    Ewa Cieślik

    2011-02-01

    Celiac disease is an inflammatory condition of the small intestine in genetically susceptible individuals, caused by ingestion of wheat gluten and corresponding proteins from barley and rye. Cereal storage proteins (prolamins) are responsible for the immunological response of patients with celiac disease. Prolamins are alcohol-soluble fractions, namely gliadins (wheat), hordeins (barley) and secalins (rye). The main triggering factor is the wheat fraction with low molecular weight (20-30 kDa) called α-gliadins. Immunochemical detection of celiac-active proteins is based on the reactivity of gluten-detecting antibodies with prolamins extracted from cereals. In our study, we used Western blot analysis for detection of the prolamin complex in cereal grains and processed foods (breads). Western blot was carried out with a polyclonal antibody raised against wheat gluten. The reaction was positive for all kinds of cereal grains. The samples of wheat and spelt wheat showed much stronger affinity to the antibody than rye and oat. As with the cereal grains, all samples of bread showed a positive immunological reaction with the antibody used. Western blot analysis with a gluten polyclonal antibody is a suitable method for qualitative detection of the prolamin complex in cereal grains and processed foods. doi: 10.5219/115

  5. Analysis of DIRAC's behavior using model checking with process algebra

    Science.gov (United States)

    Remenska, Daniela; Templon, Jeff; Willemse, Tim; Bal, Henri; Verstoep, Kees; Fokkink, Wan; Charpentier, Philippe; Graciani Diaz, Ricardo; Lanciotti, Elisa; Roiser, Stefan; Ciba, Krzysztof

    2012-12-01

    DIRAC is the grid solution developed to support LHCb production activities as well as user data analysis. It consists of distributed services and agents delivering the workload to the grid resources. Services maintain database back-ends to store dynamic state information of entities such as jobs, queues, staging requests, etc. Agents use polling to check and possibly react to changes in the system state. Each agent's logic is relatively simple; the main complexity lies in their cooperation. Agents run concurrently, and collaborate using the databases as shared memory. The databases can be accessed directly by the agents if running locally or through a DIRAC service interface if necessary. This shared-memory model causes entities to occasionally get into inconsistent states. Tracing and fixing such problems becomes formidable due to the inherent parallelism present. We propose more rigorous methods to cope with this. Model checking is one such technique for analysis of an abstract model of a system. Unlike conventional testing, it allows full control over the parallel processes execution, and supports exhaustive state-space exploration. We used the mCRL2 language and toolset to model the behavior of two related DIRAC subsystems: the workload and storage management system. Based on process algebra, mCRL2 allows defining custom data types as well as functions over these. This makes it suitable for modeling the data manipulations made by DIRAC's agents. By visualizing the state space and replaying scenarios with the toolkit's simulator, we have detected race-conditions and deadlocks in these systems, which, in several cases, were confirmed to occur in reality. Several properties of interest were formulated and verified with the tool. Our future direction is automating the translation from DIRAC to a formal model.
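
    The flavour of what exhaustive state-space exploration buys over testing can be shown with a toy model (plain Python rather than mCRL2, and in no way the DIRAC model itself): two agents perform a non-atomic read-modify-write on a shared counter, standing in for the database-as-shared-memory pattern, and every interleaving is enumerated.

```python
from itertools import permutations

STEPS = ['r1', 'w1', 'r2', 'w2']  # agent i: read ri, then write wi

def interleavings():
    # All orderings that respect each agent's internal read-before-write order.
    for p in set(permutations(STEPS)):
        if p.index('r1') < p.index('w1') and p.index('r2') < p.index('w2'):
            yield p

def run(order):
    shared, local = 0, {}
    for step in order:
        agent = step[1]
        if step[0] == 'r':
            local[agent] = shared          # read shared state
        else:
            shared = local[agent] + 1      # write back incremented copy
    return shared

# With atomic updates the counter would always end at 2; any interleaving
# ending at 1 is a lost-update race that plain testing might never hit.
races = [o for o in interleavings() if run(o) != 2]
print(f"{len(races)} of 6 interleavings lose an update")
```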

  7. Bony change of apical lesion healing process using fractal analysis

    International Nuclear Information System (INIS)

    To investigate the change of the bone healing process after endodontic treatment of teeth with apical lesions by fractal analysis. Radiographic images of 35 teeth from 33 patients, taken at first diagnosis and at 6 months and 1 year after endodontic treatment, were selected. Radiographic images were taken by the JUPITER computerized dental X-ray system. Fractal dimensions were calculated three times at each area with the Scion Image PC program. Rectangular regions of interest (30 x 30) were selected at the apical lesion and the normal apex of each image. The fractal dimension at the apical lesion at first diagnosis (L0) was 0.940 ± 0.361 and that of the normal area (N0) was 1.186 ± 0.727; at 6 months the values were 1.076 ± 0.069 at the lesion (L1) and 1.192 ± 0.055 at the normal area (N1); at 1 year they were 1.163 ± 0.074 (L2) and 1.225 ± 0.079 (N2) (p<0.05). After endodontic treatment, the fractal dimensions at the apical lesions showed statistically significant differences over time. The differences between the normal area and the apical lesion were statistically significant at first diagnosis, at 6 months, and at 1 year, but they grew smaller with time. The prognosis after endodontic treatment of an apical lesion was evaluated through bone regeneration in the apical region. Fractal analysis was attempted to overcome the limits of subjective reading; as a result, the change of the bone during the healing process could be detected objectively and quantitatively.
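
    Fractal dimension estimators vary; one common choice for radiographic textures (the Scion Image implementation used above may differ) is box counting over a binarized region of interest:

```python
import numpy as np

def box_counting_dimension(binary, sizes=(2, 4, 8, 16)):
    """Estimate the fractal dimension of a binary ROI by box counting."""
    counts = []
    for s in sizes:
        n = 0
        for i in range(0, binary.shape[0], s):
            for j in range(0, binary.shape[1], s):
                # Count boxes that contain at least one foreground pixel.
                if binary[i:i + s, j:j + s].any():
                    n += 1
        counts.append(n)
    # The dimension is the slope of log N(s) against log(1/s).
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope
```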

  8. Beta-Negative Binomial Process and Poisson Factor Analysis

    OpenAIRE

    Zhou, Mingyuan; Hannah, Lauren; Dunson, David; Carin, Lawrence

    2011-01-01

    A beta-negative binomial (BNB) process is proposed, leading to a beta-gamma-Poisson process, which may be viewed as a "multi-scoop" generalization of the beta-Bernoulli process. The BNB process is augmented into a beta-gamma-gamma-Poisson hierarchical structure, and applied as a nonparametric Bayesian prior for an infinite Poisson factor analysis model. A finite approximation for the beta process Levy random measure is constructed for convenient implementation. Efficient MCMC computations are...
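
    The hierarchy behind the "beta-gamma-Poisson" phrase can be written compactly (a standard construction, hedged since the paper's notation is not reproduced here): a count $x$ becomes negative-binomial once the gamma rate is integrated out,

```latex
x \mid \lambda \sim \operatorname{Poisson}(\lambda), \qquad
\lambda \sim \operatorname{Gamma}\!\Bigl(r,\ \tfrac{p}{1-p}\Bigr)
\ \text{(shape--scale)}, \qquad
p \sim \operatorname{Beta}(a, b),
```

    so that marginally $x \sim \operatorname{NB}(r, p)$; roughly, replacing the single beta variable with a beta process draw over countable atoms is what yields the infinite factor model.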

  9. Data-Driven Methods for the Detection of Causal Structures in Process Technology

    Directory of Open Access Journals (Sweden)

    Christian Kühnert

    2014-11-01

    In modern industrial plants, process units are strongly cross-linked with each other, and disturbances occurring in one unit potentially become plant-wide. This can lead to a flood of alarms at the supervisory control and data acquisition system, hiding the original fault causing the disturbance. Hence, one major aim in fault diagnosis is to backtrack the propagation path of the disturbance and to localize the root cause of the fault. Since detecting correlation in the data is not sufficient to describe the direction of the propagation path, cause-effect dependencies among process variables need to be detected. Process variables that show a strong causal impact on other variables in the process come into consideration as being the root cause. In this paper, different data-driven methods are proposed, compared and combined that can detect causal relationships while relying solely on process data. The information on causal dependencies is used for localization of the root cause of a fault. All proposed methods consist of a statistical part, which determines whether the disturbance traveling from one process variable to a second is significant, and a quantitative part, which calculates the causal information the first process variable has about the second. The methods are tested on simulated data from a chemical stirred-tank reactor and on a laboratory plant.
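
    Many candidate methods here are variants of the same question: do lagged values of one variable improve prediction of another? A minimal Granger-style score is sketched below (illustrative only; the paper evaluates several statistics, including transfer-entropy-like measures).

```python
import numpy as np

def granger_gain(x, y, lag=2):
    """Log variance ratio: how much lagged x improves an AR model of y."""
    n = len(y) - lag
    Y = y[lag:]
    # Lagged copies of y (own past) and of x (candidate cause).
    own = np.column_stack([y[lag - k:len(y) - k] for k in range(1, lag + 1)])
    full = np.column_stack([own] + [x[lag - k:len(x) - k]
                                    for k in range(1, lag + 1)])
    def rss(A):
        A1 = np.column_stack([np.ones(n), A])
        beta, *_ = np.linalg.lstsq(A1, Y, rcond=None)
        r = Y - A1 @ beta
        return float(r @ r)
    # Larger values suggest that lagged x helps predict y.
    return np.log(rss(own) / rss(full))
```

    Scores are computed for every ordered variable pair; variables with strong outgoing influence and little incoming influence are root-cause candidates.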

  10. A critical evaluation of the principal component analysis detection of polarized signatures using real stellar data

    Science.gov (United States)

    Paletou, F.

    2012-08-01

    The general context of this study is the post-processing of multiline spectropolarimetric observations of stars, and in particular the numerical analysis techniques aiming at detecting and characterizing polarized signatures. Using real observational data, we compare and clarify several points concerning various methods of analysis. We applied and compared the results of simple line addition, least-squares deconvolution, and denoising by principal component analysis to polarized stellar spectra available from the TBLegacy database of the Narval spectropolarimeter. This comparison of various approaches of distinct sophistication levels allows us to make a safe choice for the next implementation of on-line post-processing of our unique database for the stellar physics community.
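
    As a hedged sketch of the PCA-denoising variant compared above (the paper's implementation details differ): project a stack of line profiles onto its leading principal components and discard the rest as noise.

```python
import numpy as np

def pca_denoise(profiles, n_components=5):
    """Truncated-SVD denoising of a (n_lines, n_pixels) stack of profiles."""
    mean = profiles.mean(axis=0)
    X = profiles - mean
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s[n_components:] = 0.0          # keep only the leading components
    return mean + (U * s) @ Vt      # reconstruct the denoised stack
```

    The choice of n_components controls the bias-variance trade-off and is exactly the kind of tuning that a comparison with least-squares deconvolution has to account for.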

  11. Detecting inpatient falls by using natural language processing of electronic medical records

    Directory of Open Access Journals (Sweden)

    Toyabe Shin-ichi

    2012-12-01

    Abstract. Background: Incident reporting is the most common method for detecting adverse events in a hospital. However, under-reporting or non-reporting and delay in submission of reports are problems that prevent early detection of serious adverse events. The aim of this study was to determine whether it is possible to promptly detect serious injuries after inpatient falls by using a natural language processing method and to determine which data source is the most suitable for this purpose. Methods: We tried to detect adverse events from narrative text data of electronic medical records by using a natural language processing method. We made syntactic category decision rules to detect inpatient falls from text data in electronic medical records. We compared how often the true fall events were recorded in various sources of data including progress notes, discharge summaries, image order entries and incident reports. We applied the rules to these data sources and compared F-measures to detect falls between these data sources with reference to the results of a manual chart review. The lag time between event occurrence and data submission and the degree of injury were compared. Results: We made 170 syntactic rules to detect inpatient falls by using a natural language processing method. Information on true fall events was most frequently recorded in progress notes (100%), incident reports (65.0%) and image order entries (12.5%). However, the F-measure to detect falls using the rules was poor when using progress notes (0.12) and discharge summaries (0.24) compared with that when using incident reports (1.00) and image order entries (0.91). Since the results suggested that incident reports and image order entries were possible data sources for prompt detection of serious falls, we focused on a comparison of falls found by incident reports and image order entries. Injury caused by falls found by image order entries was significantly more severe than falls detected by
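
    The F-measure quoted above combines precision and recall against the manual chart review; a small helper (illustrative, with event identifiers standing in for matched chart events) makes the computation explicit.

```python
def f_measure(detected, gold):
    """Harmonic mean of precision and recall over sets of event IDs."""
    tp = len(detected & gold)
    precision = tp / len(detected) if detected else 0.0
    recall = tp / len(gold) if gold else 0.0
    if precision + recall == 0.0:
        return 0.0
    return 2.0 * precision * recall / (precision + recall)

# Example: 20 true falls, 13 detected, no false positives -> F ~ 0.79
print(f_measure(detected=set(range(13)), gold=set(range(20))))
```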

  12. Acquisition and processing of advanced sensor data for ERW and UXO detection and classification

    Science.gov (United States)

    Schultz, Gregory M.; Keranen, Joe; Miller, Jonathan S.; Shubitidze, Fridon

    2014-06-01

    The remediation of explosive remnants of war (ERW) and associated unexploded ordnance (UXO) has seen improvements through the injection of modern technological advances and streamlined standard operating procedures. However, reliable and cost-effective detection and geophysical mapping of sites contaminated with UXO such as cluster munitions, abandoned ordnance, and improvised explosive devices rely on the ability to discriminate hazardous items from metallic clutter. In addition to anthropogenic clutter, handheld and vehicle-based metal detector systems are plagued by natural geologic and environmental noise in many post-conflict areas. We present new and advanced electromagnetic induction (EMI) technologies including man-portable and towed EMI arrays and associated data processing software. While these systems feature vastly different form factors and transmit-receive configurations, they all exhibit several fundamental traits that enable successful classification of EMI anomalies. Specifically, multidirectional sampling of scattered magnetic fields from targets and the corresponding high volume of unique data provide rich information for extracting useful classification features for clutter-rejection analysis. The quality of classification features depends largely on the extent to which the data resolve unique physics-based parameters. To date, most of the advanced sensors enable high-quality inversion by producing data that are extremely rich in spatial content through multi-angle illumination and multi-point reception.

  13. [Fast Detection of Camellia Sinensis Growth Process and Tea Quality Informations with Spectral Technology: A Review].

    Science.gov (United States)

    Peng, Ji-yu; Song, Xing-lin; Liu, Fei; Bao, Yi-dan; He, Yong

    2016-03-01

    The research achievements and trends of spectral technology for fast detection of Camellia sinensis growth-process information and tea quality information are reviewed. Spectral technology is a fast, nondestructive, efficient detection technology, mainly comprising infrared spectroscopy, fluorescence spectroscopy, Raman spectroscopy and mass spectrometry. Rapid detection of Camellia sinensis growth-process information and tea quality helps to realize the informatization and automation of tea production and to ensure tea quality and safety. This paper reviews its applications, including the detection of tea (Camellia sinensis) growing status (nitrogen, chlorophyll, diseases and insect pests), the discrimination of tea varieties, the grade discrimination of tea, the detection of tea internal quality (catechins, total polyphenols, caffeine, amino acids, pesticide residues and so on), the quality evaluation of tea beverages and tea by-products, and machinery for tea quality determination and discrimination. It also briefly introduces trends in the determination of tea growth-process information, sensors and industrial applications. In conclusion, spectral technology shows high potential to detect Camellia sinensis growth-process information, to predict tea internal quality and to classify tea varieties and grades. Suitable chemometrics and preprocessing methods help to improve model performance and remove redundancy, which makes portable machinery feasible. Developing portable machinery and on-line detection systems is recommended to advance further application. The applications and research achievements of spectral technology concerning tea are outlined in this paper for the first time, covering Camellia sinensis growth, tea production, the quality and safety of tea and by-products, and so on, as well as some problems to be solved.
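
    Chemometric calibration of the kind reviewed here is typically partial least squares regression from preprocessed spectra to a reference assay. A hedged sketch with synthetic stand-in data follows; real work would use measured NIR spectra and wet-chemistry polyphenol values.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 500))              # stand-in for 500-band spectra
y = 0.8 * X[:, 100] + 0.2 * X[:, 230] + rng.normal(scale=0.1, size=120)

pls = PLSRegression(n_components=8)          # latent variables to retain
print(cross_val_score(pls, X, y, cv=5, scoring="r2").mean())
```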

  14. Application of differential analysis of VLF signals for seismic-ionospheric precursor detection from multiple receivers

    Science.gov (United States)

    Skeberis, Christos; Zaharis, Zaharias; Xenos, Thomas; Contadakis, Michael; Stratakis, Dimitrios; Tommaso, Maggipinto; Biagi, Pier Francesco

    2015-04-01

    This study investigates the application of differential analysis to VLF signals emitted from a single transmitter and received by multiple stations, in order to filter and detect disturbances that can be attributed to seismic-ionospheric precursor phenomena. The cross-correlation analysis applied to multiple VLF signals provides a way of discerning the nature of a given disturbance and accounts for widespread geomagnetic interferences as opposed to local precursor phenomena. For the purpose of this paper, data acquired in Thessaloniki (40.59N, 22.78E) and in Heraklion (35.31N, 25.10E) from the VLF station in Tavolara, Italy (ICV station, Lat. 40.923, Lon. 9.731) for a period of four months (September 2014 - December 2014) are used. The receivers have been developed by Elettronika Srl and are part of the International Network for Frontier Research on Earthquake Precursors (INFREP). A normalization process and an improved variant of the Hilbert-Huang transform are initially applied to the received VLF signals. The signals derived from the first two Intrinsic Mode Functions (IMF1 and IMF2) undergo a cross-correlation analysis, and in this way time series from the two receivers can be compared. The efficacy of the processing method and the results produced by the proposed process are then discussed. Finally, results are presented along with an evaluation of the discrimination and detection capabilities of the method on disturbances of the received signals. Based on the results, the merits of the processing method are discussed, with the aim of further improving it by using differential analysis to better classify different disturbances and, more importantly, to discriminate points of interest in the provided spectra. This could provide an improved method of detecting disturbances attributed to seismic-ionospheric precursor phenomena and also contribute to a real-time method for correlating seismic activity with the observed disturbances.
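
    The cross-correlation stage can be sketched as follows (a simplified version; the actual pipeline first applies normalization and a Hilbert-Huang decomposition, and would feed in the IMF1/IMF2 series from the two receivers):

```python
import numpy as np

def normalized_xcorr(a, b, max_lag=600):
    """Normalized cross-correlation of two receiver time series over lags."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    n = len(a)
    lags = np.arange(-max_lag, max_lag + 1)
    corr = np.empty(len(lags))
    for idx, k in enumerate(lags):
        if k >= 0:
            corr[idx] = np.dot(a[k:], b[:n - k]) / (n - k)
        else:
            corr[idx] = np.dot(a[:n + k], b[-k:]) / (n + k)
    return lags, corr
```

    A disturbance that correlates strongly across both receivers points to a widespread geomagnetic origin; one seen at only one station is the better candidate for a local, possibly seismic-ionospheric, precursor.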

  15. Safety analysis of SISL process module

    International Nuclear Information System (INIS)

    This report provides an assessment of various postulated accidental occurrences within an experimental process module which is part of a Special Isotope Separation Laboratory (SISL) currently under construction at the Lawrence Livermore National Laboratory (LLNL). The process module will contain large amounts of molten uranium and various water-cooled structures within a vacuum vessel. Special emphasis is therefore given to potential accidental interactions of molten uranium with water leading to explosive and/or rapid steam formation, as well as uranium oxidation and the potential for combustion. Considerations are also given to the potential for vessel melt-through. Evaluations include mechanical and thermal interactions and design implications both in terms of design-basis as well as once-in-a-lifetime accident scenarios. These scenarios include both single- and multiple-failure modes leading to various contact modes and locations within the process module for possible thermal interactions. The evaluations show that a vacuum vessel design based upon nominal operating conditions would appear sufficient to meet safety requirements in connection with both design-basis as well as once-in-a-lifetime accidents. Controlled venting requirements for removal of steam and hydrogen in order to avoid possible long-term pressurization events are recommended. Depending upon the resulting accident conditions, the vacuum system (i.e., the roughing system) could also serve this purpose. Finally, based upon the accident evaluations of this study, immediate shut-off of all coolant water following an incident leak is not recommended, as such action may have adverse effects in terms of cool-down requirements for the melt crucibles, etc. These requirements have not been assessed as part of this study.

  16. The analysis of thermally stimulated processes

    CERN Document Server

    Chen, R; Pamplin, Brian

    1981-01-01

    Thermally stimulated processes include a number of phenomena - either physical or chemical in nature - in which a certain property of a substance is measured during controlled heating from a 'low' temperature. Workers and graduate students in a wide spectrum of fields require an introduction to methods of extracting information from such measurements. This book gives an interdisciplinary approach to various methods which may be applied to analytical chemistry including radiation dosimetry and determination of archaeological and geological ages. In addition, recent advances are included, such

  17. Pulsed laser noise analysis and pump-probe signal detection with a data acquisition card.

    Science.gov (United States)

    Werley, Christopher A; Teo, Stephanie M; Nelson, Keith A

    2011-12-01

    A photodiode and data acquisition card whose sampling clock is synchronized to the repetition rate of a laser are used to measure the energy of each laser pulse. Simple analysis of the data yields the noise spectrum from very low frequencies up to half the repetition rate and quantifies the pulse energy distribution. When two photodiodes for balanced detection are used in combination with an optical modulator, the technique is capable of detecting very weak pump-probe signals (ΔI/I₀ ~ 10⁻⁵ at 1 kHz), with a sensitivity that is competitive with a lock-in amplifier. Detection with the data acquisition card is versatile and offers many advantages including full quantification of noise during each stage of signal processing, arbitrary digital filtering in silico after data collection is complete, direct readout of percent signal modulation, and easy adaptation for fast scanning of delay between pump and probe.
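
    Because there is exactly one energy sample per pulse, the relative-fluctuation spectrum follows from a single FFT. A minimal sketch is below; the paper's analysis also covers balanced detection and modulation, omitted here.

```python
import numpy as np

def noise_spectrum(pulse_energies, rep_rate):
    """Relative-intensity noise spectrum from per-pulse energy samples."""
    x = np.asarray(pulse_energies, dtype=float)
    rel = x / x.mean() - 1.0                    # fractional fluctuation
    spec = np.abs(np.fft.rfft(rel))**2 / len(rel)
    freqs = np.fft.rfftfreq(len(rel), d=1.0 / rep_rate)
    return freqs, spec                          # up to rep_rate / 2 (Nyquist)
```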

  18. Digital image sequence processing, compression, and analysis

    CERN Document Server

    Reed, Todd R

    2004-01-01

    Contents: Introduction (Todd R. Reed); Content-Based Image Sequence Representation (Pedro M. Q. Aguiar, Radu S. Jasinschi, José M. F. Moura, and Charnchai Pluempitiwiriyawej); The Computation of Motion (Christoph Stiller, Sören Kammel, Jan Horn, and Thao Dang); Motion Analysis and Displacement Estimation in the Frequency Domain (Luca Lucchese and Guido Maria Cortelazzo); Quality of Service Assessment in New Generation Wireless Video Communications (Gaetano Giunta); Error Concealment in Digital Video (Francesco G.B. De Natale); Image Sequence Restoration: A Wider Perspective (Anil Kokaram); Video Summarization (Cuneyt M. Taskiran and Edward

  19. Combination of EEG Complexity and Spectral Analysis for Epilepsy Diagnosis and Seizure Detection

    Directory of Open Access Journals (Sweden)

    Wan-Lin Chang

    2010-01-01

    Approximately 1% of the world's population has epilepsy, and 25% of epilepsy patients cannot be treated sufficiently by any available therapy. If an automatic seizure-detection system were available, it could reduce the time required by a neurologist to perform an off-line diagnosis by reviewing electroencephalogram (EEG) data. It could produce an on-line warning signal to alert healthcare professionals or to drive a treatment device such as an electrical stimulator to enhance the patient's safety and quality of life. This paper describes a systematic evaluation of current approaches to seizure detection in the literature. This evaluation was then used to suggest a reliable, practical epilepsy detection method. The combination of complexity analysis and spectrum analysis of an EEG can perform robust evaluations on the collected data. Principal component analysis (PCA) and genetic algorithms (GAs) were applied to various linear and nonlinear methods. The best linear models resulted from using all of the features without other processing. For the nonlinear models, applying PCA for feature reduction provided better results than applying GAs. The feasibility of executing the proposed methods on a personal computer for on-line processing was also demonstrated.
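
    The spectral half of such a feature set is typically power in the classic EEG bands. A hedged one-channel sketch follows; the paper's exact feature list and complexity measures are richer.

```python
import numpy as np

def band_powers(eeg, fs, bands=((1, 4), (4, 8), (8, 13), (13, 30))):
    """Delta/theta/alpha/beta band powers from one EEG channel's periodogram."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg - eeg.mean()))**2 / len(eeg)
    return [float(psd[(freqs >= lo) & (freqs < hi)].sum()) for lo, hi in bands]
```

    Feature vectors built this way, together with complexity measures, are what PCA or a GA would then prune before classification.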

  20. SINGLE TREE DETECTION FROM AIRBORNE LASER SCANNING DATA USING A MARKED POINT PROCESS BASED METHOD

    Directory of Open Access Journals (Sweden)

    J. Zhang

    2013-05-01

    Tree detection and reconstruction is of great interest in large-scale city modelling. In this paper, we present a marked point process model to detect single trees from airborne laser scanning (ALS) data. We consider single trees in an ALS-recovered canopy height model (CHM) as a realization of a point process of circles. Unlike traditional marked point processes, we sample the model in a constrained configuration space by making use of image processing techniques. A Gibbs energy is defined on the model, containing a data term, which judges the fitness of the model with respect to the data, and a prior term, which incorporates prior knowledge of object layouts. We search for the optimal configuration through a steepest gradient descent algorithm. The presented hybrid framework was tested on three forest plots, and the experiments show the effectiveness of the proposed method.
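
    A drastically simplified, greedy stand-in for the energy minimization illustrates the configuration-space idea; the paper optimizes a Gibbs energy with data and prior terms, whereas here the data term is just canopy height and the prior is a hard non-overlap constraint.

```python
import numpy as np

def detect_trees(chm, radius=5, min_height=2.0, n_max=200):
    """Greedy circle placement on a canopy height model (toy sketch)."""
    h, w = chm.shape
    work = chm.astype(float).copy()
    y, x = np.ogrid[:h, :w]
    trees = []
    for _ in range(n_max):
        i, j = np.unravel_index(np.argmax(work), work.shape)
        if work[i, j] < min_height:      # data term: reject low "treetops"
            break
        trees.append((i, j, float(work[i, j])))
        # Prior term as a hard core: forbid overlapping circles by
        # suppressing the accepted tree's neighbourhood.
        work[(y - i) ** 2 + (x - j) ** 2 <= (2 * radius) ** 2] = 0.0
    return trees
```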