WorldWideScience

Sample records for analysis detection processing

  1. Process fault detection and nonlinear time series analysis for anomaly detection in safeguards

    International Nuclear Information System (INIS)

    In this paper we discuss two advanced techniques, process fault detection and nonlinear time series analysis, and apply them to the analysis of vector-valued and single-valued time-series data. We investigate model-based process fault detection methods for analyzing simulated, multivariate, time-series data from a three-tank system. The model predictions are compared with simulated measurements of the same variables to form residual vectors that are tested for the presence of faults (possible diversions in safeguards terminology). We evaluate two methods, testing all individual residuals with a univariate z-score and testing all variables simultaneously with the Mahalanobis distance, for their ability to detect loss of material in two different leak scenarios from the three-tank system: a leak without replacement and a leak with replacement of the lost volume. Nonlinear time-series analysis tools were compared with the linear methods popularized by Box and Jenkins. We compare prediction results using three nonlinear and two linear modeling methods on each of six simulated time series: two nonlinear and four linear. The nonlinear methods performed better at predicting the nonlinear time series and did as well as the linear methods at predicting the linear values.
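    The two residual tests named here are standard; below is a minimal Python sketch, assuming Gaussian residuals and illustrative thresholds (the paper's actual test settings are not given in this record):

```python
# Hypothetical sketch of the two residual tests described above: a univariate
# z-score on each residual and a joint Mahalanobis-distance test. Thresholds,
# data shapes, and the example values are illustrative assumptions.
import numpy as np
from scipy import stats

def z_score_test(residuals, sigma, alpha=0.01):
    """Flag any individual residual whose |z| exceeds the normal quantile."""
    z = residuals / sigma                          # elementwise standardization
    return np.abs(z) > stats.norm.ppf(1 - alpha / 2)

def mahalanobis_test(residuals, cov, alpha=0.01):
    """Test the whole residual vector at once against a chi-square bound."""
    d2 = residuals @ np.linalg.solve(cov, residuals)
    return d2 > stats.chi2.ppf(1 - alpha, df=len(residuals))

# Example: a 3-variable residual vector from a model of the three-tank system
r = np.array([0.4, -1.9, 2.6])
print(z_score_test(r, sigma=np.ones(3)))           # per-variable decisions
print(mahalanobis_test(r, cov=np.eye(3)))          # single joint decision
```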

  2. Process fault detection and non-linear time series analysis for anomaly detection in safeguards

    International Nuclear Information System (INIS)

    The paper discusses process fault detection and non-linear time series analysis, which are applied to the analysis of vector-valued and single-valued time series data. Model based process fault detection methods for analysing simulated, multivariate, time series data from a three-tank system are investigated. The model predictions are compared with simulated measurements of the same variables to form residual vectors that are tested for the presence of faults (possible diversions in safeguards terminology). Two methods are evaluated, testing all individual residuals with a univariate z score and testing all variables simultaneously with the Mahalanobis distance, for their ability to detect loss of material from two different leak scenarios from the three-tank system: a leak without replacement and a leak with replacement of the lost volume. Non-linear time series analysis tools have been compared with the linear methods popularized by Box and Jenkins. The paper compares prediction results using three non-linear and two linear modelling methods on each of six simulated time series: two non-linear and four linear time series. The non-linear methods performed better at predicting the non-linear time series and did as well as the linear methods at predicting the linear values. (author). 10 refs, 5 figs, 1 tab

  3. Dynamics analysis of vibration process in Particle Impact Noise Detection

    Institute of Scientific and Technical Information of China (English)

    ZHANG Hui; ZHOU Chang-lei; WANG Shu-juan; ZHAI Guo-fu

    2007-01-01

    Particle Impact Noise Detection (PIND) test is a reliability screening technique for hermetic devices that is prescribed by MIL-PRF-39016E. Some test conditions are specified, although MIL-PRF-39016E does not specify how to obtain these conditions. This paper establishes a dynamics model of the vibration process based on a first-order mass-spring system. The corresponding Simulink model is also established to simulate the vibration process under arbitrary input excitations. The response equations are derived for sinusoidal excitations, and the electromagnetic force waves required to obtain given vibration and shock accelerations are computed. Finally, some simulation results are given.
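    As an illustration of the kind of model described, here is a driven mass-spring simulation in Python; the parameter values and the damping term are assumptions for the sketch, not the paper's coefficients:

```python
# Minimal sketch of a driven mass-spring model of a PIND shaker, with
# illustrative (assumed) parameters; the paper's actual model is not given here.
import numpy as np
from scipy.integrate import solve_ivp

m, c, k = 0.05, 2.0, 8.0e3          # mass [kg], damping, stiffness (assumed)
F0, f = 20.0, 60.0                  # force amplitude [N], drive frequency [Hz]

def rhs(t, y):
    x, v = y
    force = F0 * np.sin(2 * np.pi * f * t)      # sinusoidal excitation
    return [v, (force - c * v - k * x) / m]

sol = solve_ivp(rhs, (0.0, 0.5), [0.0, 0.0], max_step=1e-4)
accel = (F0 * np.sin(2 * np.pi * f * sol.t)
         - c * sol.y[1] - k * sol.y[0]) / m     # resulting acceleration [m/s^2]
print(f"peak acceleration: {np.abs(accel).max() / 9.81:.1f} g")
```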

  4. Lightning Detection Efficiency Analysis Process: Modeling Based on Empirical Data

    Science.gov (United States)

    Rompala, John T.

    2005-01-01

    A ground based lightning detection system employs a grid of sensors, which record and evaluate the electromagnetic signal produced by a lightning strike. Several detectors gather information on that signal's strength, time of arrival, and behavior over time. By coordinating the information from several detectors, an event solution can be generated. That solution includes the signal's point of origin, strength and polarity. Determination of the location of the lightning strike uses algorithms based on long-used techniques of triangulation. Determination of the event's original signal strength relies on the behavior of the generated magnetic field over distance and time. In general, the signal from the event undergoes geometric dispersion and environmental attenuation as it progresses. Our knowledge of that radial behavior together with the strength of the signal received by detecting sites permits an extrapolation and evaluation of the original strength of the lightning strike. It also limits the detection efficiency (DE) of the network. For expansive grids and with a sparse density of detectors, the DE varies widely over the area served. This limits the utility of the network in gathering information on regional lightning strike density and applying it to meteorological studies. A network of this type is a grid of four detectors in the Rondonian region of Brazil. The service area extends over a million square kilometers. Much of that area is covered by rain forests. Thus knowledge of lightning strike characteristics over the expanse is of particular value. I have been developing a process that determines the DE over the region [3]. In turn, this provides a way to produce lightning strike density maps, corrected for DE, over the entire region of interest. This report offers a survey of that development to date and a record of present activity.
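    The strength extrapolation described can be sketched as inverting a combined geometric-dispersion and attenuation law; the functional form and the attenuation length below are illustrative assumptions, not values from the report:

```python
# Hedged sketch of the extrapolation idea: if a stroke's signal undergoes
# geometric (1/r) dispersion plus exponential attenuation, the original source
# strength can be recovered from a received amplitude at range r. The
# attenuation length is an assumed placeholder value.
import math

def source_strength(received, r_km, atten_length_km=1000.0):
    """Invert E_received = E0 / r * exp(-r / L) for the original strength E0."""
    return received * r_km * math.exp(r_km / atten_length_km)

print(source_strength(received=2.5, r_km=300.0))
```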

  5. Bayesian analysis to detect abrupt changes in extreme hydrological processes

    Science.gov (United States)

    Jo, Seongil; Kim, Gwangsu; Jeon, Jong-June

    2016-07-01

    In this study, we develop a new method for a Bayesian change point analysis. The proposed method is easy to implement and can be extended to a wide class of distributions. Using a generalized extreme-value distribution, we investigate the annual maxima of precipitation observed at stations on the South Korean Peninsula, and find significant changes at the considered sites. We evaluate the hydrological risk in predictions using the estimated return levels. In addition, we explain that misspecification of the probability model can lead to a bias in the number of change points, and using a simple example, we show that this problem is difficult to avoid by technical data transformation.
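    For readers who want the flavor of change point detection under a generalized extreme-value model, here is a likelihood-based sketch (not the authors' Bayesian method) using SciPy:

```python
# Illustrative single-change-point search: fit a GEV distribution to each
# candidate segment and keep the split with the highest total log-likelihood.
# Segment length, the synthetic data, and the GEV parameters are assumptions.
import numpy as np
from scipy.stats import genextreme

def best_change_point(x, min_seg=10):
    n, best = len(x), (None, -np.inf)
    for tau in range(min_seg, n - min_seg):
        ll = 0.0
        for seg in (x[:tau], x[tau:]):
            ll += genextreme.logpdf(seg, *genextreme.fit(seg)).sum()
        if ll > best[1]:
            best = (tau, ll)
    return best  # (index of best split, its total log-likelihood)

rng = np.random.default_rng(0)
x = np.concatenate([
    genextreme.rvs(-0.1, loc=50, scale=10, size=40, random_state=rng),
    genextreme.rvs(-0.1, loc=80, scale=10, size=40, random_state=rng),
])
print(best_change_point(x))
```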

  6. Similarity ratio analysis for early stage fault detection with optical emission spectrometer in plasma etching process.

    Directory of Open Access Journals (Sweden)

    Jie Yang

    A Similarity Ratio Analysis (SRA) method is proposed for early-stage Fault Detection (FD) in plasma etching processes using real-time Optical Emission Spectrometer (OES) data as input. The SRA method can help to realise a highly precise control system by detecting abnormal etch-rate faults in real-time during an etching process. The method processes spectrum scans at successive time points and uses a windowing mechanism over the time series to alleviate problems with timing uncertainties due to process shift from one process run to another. A SRA library is first built to capture features of a healthy etching process. By comparing with the SRA library, a Similarity Ratio (SR) statistic is then calculated for each spectrum scan as the monitored process progresses. A fault detection mechanism, named 3-Warning-1-Alarm (3W1A), takes the SR values as inputs and triggers a system alarm when certain conditions are satisfied. This design reduces the chance of false alarms, and provides a reliable fault reporting service. The SRA method is demonstrated on a real semiconductor manufacturing dataset. The effectiveness of SRA-based fault detection is evaluated using a time-series SR test and also using a post-process SR test. The time-series SR provides an early-stage fault detection service, so less energy and materials will be wasted by faulty processing. The post-process SR provides a fault detection service with higher reliability than the time-series SR, but with fault testing conducted only after each process run completes.
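    A hedged sketch of the windowed SR statistic and the 3W1A rule follows; the similarity measure (correlation against nearby library scans) and the warning condition are assumptions for illustration, not the paper's exact definitions:

```python
# Sketch of a windowed similarity ratio with a 3-Warning-1-Alarm rule.
# The SR here is the best correlation against library scans within a time
# window; the threshold and "three consecutive warnings" logic are assumed.
import numpy as np

def similarity_ratio(scan, library, t, window=2):
    """Best correlation between a scan and library scans near time index t."""
    lo, hi = max(0, t - window), min(len(library), t + window + 1)
    return max(np.corrcoef(scan, ref)[0, 1] for ref in library[lo:hi])

def monitor(scans, library, warn_thresh=0.98):
    warnings = 0
    for t, scan in enumerate(scans):
        if similarity_ratio(scan, library, t) < warn_thresh:
            warnings += 1
            if warnings >= 3:            # three warnings -> one alarm
                return t                 # alarm raised at this scan index
        else:
            warnings = 0                 # healthy scan resets the counter
    return None
```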

  7. On-road anomaly detection by multimodal sensor analysis and multimedia processing

    Science.gov (United States)

    Orhan, Fatih; Eren, P. E.

    2014-03-01

    The use of smartphones in Intelligent Transportation Systems is gaining popularity, yet many challenges exist in developing functional applications. Due to the dynamic nature of transportation, vehicular social applications face complexities such as developing robust sensor management, performing signal and image processing tasks, and sharing information among users. This study utilizes a multimodal sensor analysis framework which enables sensors to be analyzed in a multimodal fashion. It also provides plugin-based analysis interfaces for developing applications based on sensor and image processing, and connects its users via a centralized application as well as to social networks to facilitate communication and socialization. Using this framework, an on-road anomaly detector is being developed and tested. The detector utilizes the sensors of a mobile device and is able to identify anomalies such as hard braking, pothole crossing, and speed bump crossing. Upon such detection, the video portion containing the anomaly is automatically extracted in order to enable further image processing analysis. The detection results are shared on a central portal application for online traffic condition monitoring.

  8. Technology Gap Analysis for the Detection of Process Signatures Using Less Than Remote Methods

    Energy Technology Data Exchange (ETDEWEB)

    Hartman, John S.; Atkinson, David A.; Lind, Michael A.; Maughan, A. D.; Kelly, James F.

    2005-01-01

    Although remote sensing methods offer advantages for monitoring important illicit process activities, remote and stand-off technologies cannot successfully detect all important processes with the sensitivity and certainty that is desired. The main scope of the program is observables, with a primary focus on chemical signatures. A number of key process signatures elude remote or stand-off detection for a variety of reasons (e.g., heavy particulate emissions that do not propagate far enough for detection at stand-off distances, semi-volatile chemicals that do not tend to vaporize and remain in the environment near the source, etc.). Some of these compounds can provide persistent, process-specific information that is not available through remote techniques; however, the associated measurement technologies have their own set of advantages, disadvantages and technical challenges that may need to be overcome before additional signature data can be effectively and reliably exploited. The main objective of this report is to describe a process to identify high-impact technology gaps for important less-than-remote detection applications. The subsequent analysis focuses on the technology development needed to enable exploitation of important process signatures. The evaluation process that was developed involves three interrelated and often conflicting requirements-generation activities:
    • Identification of target signature chemicals with unique intelligence value and their associated attributes as mitigated by environmentally influenced fate and transport effects (i.e., what can you expect to actually find that has intelligence value, where do you need to look for it, and what sensitivity and selectivity do you need to see it)
    • Identification of end-user deployment scenario possibilities and constraints, with a focus on alternative detection requirements, timing issues, logistical considerations, and training requirements for a successful measurement
    • Identification of

  9. Analysis of Space Shuttle Ground Support System Fault Detection, Isolation, and Recovery Processes and Resources

    Science.gov (United States)

    Gross, Anthony R.; Gerald-Yamasaki, Michael; Trent, Robert P.

    2009-01-01

    As part of the FDIR (Fault Detection, Isolation, and Recovery) Project for the Constellation Program, a task called the Legacy Benchmarking Task was designed to document as accurately as possible the FDIR processes and resources that were used by the Space Shuttle ground support equipment (GSE) during the Shuttle flight program. These results served as a comparison with results obtained from the new FDIR capability. The task team assessed Shuttle and EELV (Evolved Expendable Launch Vehicle) historical data for GSE-related launch delays to identify expected benefits and impact. This analysis included a study of complex fault isolation situations that required a lengthy troubleshooting process. Specifically, four elements of that system were considered: LH2 (liquid hydrogen), LO2 (liquid oxygen), hydraulic test, and ground special power.

  10. Statistical Analysis of the Performance of MDL Enumeration for Multiple-Missed Detection in Array Processing

    OpenAIRE

    Fei Du; Yibo Li; Shijiu Jin

    2015-01-01

    An accurate performance analysis on the MDL criterion for source enumeration in array processing is presented in this paper. The enumeration results of MDL can be predicted precisely by the proposed procedure via the statistical analysis of the sample eigenvalues, whose distributive properties are investigated with the consideration of their interactions. A novel approach is also developed for the performance evaluation when the source number is underestimated by a number greater than one, wh...
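    The MDL criterion analyzed here is the standard Wax-Kailath source-number estimator; the following sketch computes MDL(k) from the sample eigenvalues of the array covariance for p sensors and N snapshots:

```python
# Standard MDL source enumeration (Wax & Kailath): for each candidate source
# count k, compare the geometric and arithmetic means of the presumed noise
# eigenvalues and add a model-complexity penalty. Eigenvalues must be positive.
import numpy as np

def mdl_enumerate(eigvals, n_snapshots):
    lam = np.sort(eigvals)[::-1]             # descending sample eigenvalues
    p = len(lam)
    mdl = np.empty(p)
    for k in range(p):
        tail = lam[k:]                       # presumed noise eigenvalues
        geo = np.exp(np.mean(np.log(tail)))  # geometric mean
        ari = np.mean(tail)                  # arithmetic mean
        mdl[k] = (-n_snapshots * (p - k) * np.log(geo / ari)
                  + 0.5 * k * (2 * p - k) * np.log(n_snapshots))
    return int(np.argmin(mdl))               # estimated number of sources
```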

  11. Human Motion Analysis via Statistical Motion Processing and Sequential Change Detection

    Directory of Open Access Journals (Sweden)

    Alexia Briassouli

    2009-01-01

    The widespread use of digital multimedia in applications, such as security, surveillance, and the semantic web, has made the automated characterization of human activity necessary. In this work, a method for the characterization of multiple human activities based on statistical processing of the video data is presented. First the active pixels of the video are detected, resulting in a binary mask called the Activity Area. Sequential change detection is then applied to the data examined in order to detect at which time instants there are changes in the activity taking place. This leads to the separation of the video sequence into segments with different activities. The change times are examined for periodicity or repetitiveness in the human actions. The Activity Areas and their temporal weighted versions, the Activity History Areas, for the extracted subsequences are used for activity recognition. Experiments with a wide range of indoors and outdoors videos of various human motions, including challenging videos with dynamic backgrounds, demonstrate the proposed system's good performance.
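    A compact sketch of the two stages described, with illustrative thresholds: an Activity Area from pixel-wise frame differencing, then a CUSUM-style sequential change detector on the activity level over time:

```python
# Sketch of the two stages: (1) a binary Activity Area mask of pixels that
# change notably between frames; (2) a simple CUSUM detector on a 1-D activity
# signal. Both thresholds are assumptions, not the paper's tuned values.
import numpy as np

def activity_area(frames, diff_thresh=15):
    """Binary mask of pixels that ever change notably between frames."""
    diffs = np.abs(np.diff(frames.astype(float), axis=0))
    return (diffs > diff_thresh).any(axis=0)

def cusum_change_times(signal, drift=0.5, alarm=20.0):
    """Return time indices where the cumulative-sum statistic signals a change."""
    changes, s, ref = [], 0.0, signal[0]
    for t, x in enumerate(signal):
        s = max(0.0, s + abs(x - ref) - drift)
        if s > alarm:
            changes.append(t)
            s, ref = 0.0, x          # restart detection after each change
    return changes
```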

  12. Analysis of In-Situ Vibration Monitoring for End-Point Detection of CMP Planarization Processes

    Energy Technology Data Exchange (ETDEWEB)

    Hetherington, Dale L.; Lauffer, James P.; Shingledecker, David M.; Stein, David J.; Wyckoff, Edward E.

    1999-05-14

    This paper details the analysis of vibration monitoring for end-point control in oxide CMP processes. Two piezoelectric accelerometers were integrated onto the backside of a stainless steel polishing head of an IPEC 472 polisher. One sensor was placed perpendicular to the carrier plate (vertical) and the other parallel to the plate (horizontal). Wafers patterned with metal and coated with oxide material were polished at different speeds and pressures. Our results show that it is possible to sense a change in the vibration signal over time during planarization of oxide material on patterned wafers. The horizontal accelerometer showed more sensitivity to change in vibration amplitude compared to the vertical accelerometer for a given polish condition. At low carrier and platen rotation rates, the change in vibration signal over time at fixed frequencies decreased approximately ½ to 1 order of magnitude (over the 2 to 10 psi polish pressure range). At high rotation speeds, the vibration signal remained essentially constant, indicating that other factors dominated the vibration signal. These results show that while it is possible to sense changes in acceleration during polishing, more robust hardware and signal processing algorithms are required to ensure its use over a wide range of process conditions.

  13. Experimental analysis of the auditory detection process on avian point counts

    Science.gov (United States)

    Simons, T.R.; Alldredge, M.W.; Pollock, K.H.; Wettroth, J.M.

    2007-01-01

    We have developed a system for simulating the conditions of avian surveys in which birds are identified by sound. The system uses a laptop computer to control a set of amplified MP3 players placed at known locations around a survey point. The system can realistically simulate a known population of songbirds under a range of factors that affect detection probabilities. The goals of our research are to describe the sources and range of variability affecting point-count estimates and to find applications of sampling theory and methodologies that produce practical improvements in the quality of bird-census data. Initial experiments in an open field showed that, on average, observers tend to undercount birds on unlimited-radius counts, though the proportion of birds counted by individual observers ranged from 81% to 132% of the actual total. In contrast to the unlimited-radius counts, when data were truncated at a 50-m radius around the point, observers overestimated the total population by 17% to 122%. Results also illustrate how detection distances decline and identification errors increase with increasing levels of ambient noise. Overall, the proportion of birds heard by observers decreased by 28 ± 4.7% under breezy conditions, 41 ± 5.2% with the presence of additional background birds, and 42 ± 3.4% with the addition of 10 dB of white noise. These findings illustrate some of the inherent difficulties in interpreting avian abundance estimates based on auditory detections, and why estimates that do not account for variations in detection probability will not withstand critical scrutiny. © The American Ornithologists' Union, 2007.

  14. Lie detection: Cognitive Processes

    OpenAIRE

    Street, C. N. H.

    2013-01-01

    How do we make decisions when we are uncertain? In more real-world settings there is often a vast array of information available to guide the decision, from an understanding of the social situation, to prior beliefs and experience, to information available in the current environment. Yet much of the research into uncertain decision-making has typically studied the process by isolating it from this rich source of information that decision-makers usually have available to them. This thesis take...

  15. Explodet Project:. Methods of Automatic Data Processing and Analysis for the Detection of Hidden Explosive

    Science.gov (United States)

    Lecca, Paola

    2003-12-01

    The research of the INFN Gruppo Collegato di Trento within the EXPLODET project for humanitarian demining is devoted to the development of a software procedure for automating data analysis and decision making about the presence of hidden explosives. Innovative algorithms for calculating the likely background, a system based on neural networks for energy calibration, and simple statistical methods for the qualitative consistency check of the signals are the main parts of the software performing the automatic data elaboration.

  16. Malware detection and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chiang, Ken; Lloyd, Levi; Crussell, Jonathan; Sanders, Benjamin; Erickson, Jeremy Lee; Fritz, David Jakob

    2016-03-22

    Embodiments of the invention describe systems and methods for malicious software detection and analysis. A binary executable comprising obfuscated malware on a host device may be received, and incident data indicating a time when the binary executable was received and identifying processes operating on the host device may be recorded. The binary executable is analyzed via a scalable plurality of execution environments, including one or more non-virtual execution environments and one or more virtual execution environments, to generate runtime data and deobfuscation data attributable to the binary executable. At least some of the runtime data and deobfuscation data attributable to the binary executable is stored in a shared database, while at least some of the incident data is stored in a private, non-shared database.

  17. A Process Deviation Analysis Framework

    OpenAIRE

    Depaire, Benoit; Swinnen, Jo; Jans, Mieke; Vanhoof, Koen

    2013-01-01

    Process deviation analysis is becoming increasingly important for companies. This paper presents a framework which structures the field of process deviation analysis and identifies new research opportunities. Application of the framework starts from managerial questions which relate to specific deviation categories and methodological steps. Finally, a general outline to detect high-level process deviations is formulated.

  18. Fingerprint detection and process prediction by multivariate analysis of fed-batch monoclonal antibody cell culture data.

    Science.gov (United States)

    Sokolov, Michael; Soos, Miroslav; Neunstoecklin, Benjamin; Morbidelli, Massimo; Butté, Alessandro; Leardi, Riccardo; Solacroup, Thomas; Stettler, Matthieu; Broly, Hervé

    2015-01-01

    This work presents a sequential data analysis path, which was successfully applied to identify important patterns (fingerprints) in mammalian cell culture process data regarding process variables, time evolution and process response. The data set incorporates 116 fed-batch cultivation experiments for the production of a Fc-Fusion protein. Having precharacterized the evolutions of the investigated variables and manipulated parameters with univariate analysis, principal component analysis (PCA) and partial least squares regression (PLSR) are used for further investigation. The first major objective is to capture and understand the interaction structure and dynamic behavior of the process variables and the titer (process response) using different models. The second major objective is to evaluate those models regarding their capability to characterize and predict the titer production. Moreover, the effects of data unfolding, imputation of missing data, phase separation, and variable transformation on the performance of the models are evaluated. PMID:26399784
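    A minimal sketch of the analysis path described (batch-wise unfolding, PCA for structure, PLSR for titer prediction) using scikit-learn on synthetic data; the array shapes, component counts, and synthetic response are assumptions:

```python
# Sketch: unfold batch data to (batches x variables*time), inspect structure
# with PCA, then predict the final titer with PLS regression. All data here
# are synthetic stand-ins for the 116 fed-batch cultivations.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = rng.normal(size=(116, 10 * 14))   # 116 batches, 10 variables x 14 days, unfolded
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=116)  # synthetic "titer"

Xs = StandardScaler().fit_transform(X)
scores = PCA(n_components=3).fit_transform(Xs)   # fingerprints / interaction structure

pls = PLSRegression(n_components=3).fit(Xs, y)
print("R^2 of titer prediction:", pls.score(Xs, y))
```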

  19. Novel image processing approach to detect malaria

    Science.gov (United States)

    Mas, David; Ferrer, Belen; Cojoc, Dan; Finaurini, Sara; Mico, Vicente; Garcia, Javier; Zalevsky, Zeev

    2015-09-01

    In this paper we present a novel image processing algorithm providing good preliminary capabilities for in vitro detection of malaria. The proposed concept is based upon analysis of the temporal variation of each pixel. Changes in dark pixels mean that intracellular activity has happened, indicating the presence of the malaria parasite inside the cell. Preliminary experimental results involving analysis of red blood cells, either healthy or infected with malaria parasites, validated the potential benefit of the proposed numerical approach.
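    The per-pixel temporal analysis can be sketched as thresholding the temporal variation of dark pixels; both cutoffs below are illustrative assumptions:

```python
# Minimal sketch of the described per-pixel temporal analysis: flag pixels that
# are both dark (cell interiors) and temporally varying (candidate parasite
# motion). The dark-level and variation thresholds are assumed placeholders.
import numpy as np

def candidate_infection_mask(frames, dark_level=60, var_thresh=5.0):
    stack = frames.astype(float)                 # shape: (time, rows, cols)
    dark = stack.mean(axis=0) < dark_level       # persistently dark pixels
    active = stack.std(axis=0) > var_thresh      # temporally varying pixels
    return dark & active                         # candidate intracellular motion
```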

  20. Theoretical and experimental analysis of electroweak corrections to the inclusive jet process. Development of extreme topologies detection methods

    International Nuclear Information System (INIS)

    We have studied the behaviour of the inclusive jet, W+jets and Z+jets processes from the phenomenological and experimental point of view in the ATLAS experiment at the LHC, in order to understand the impact of Sudakov logarithms on electroweak corrections and on the associated production of weak vector bosons and jets at the LHC. We have computed the amplitude of the real electroweak corrections to the inclusive jet process due to the real emission of weak vector bosons from jets. This computation was done with the MCFM and NLOjet++ generators at 7 TeV, 8 TeV and 14 TeV. This study shows that, for the inclusive jet process, a partial cancellation of the virtual weak corrections (due to weak bosons in loops) by the real electroweak corrections occurs. This effect shows that the Bloch-Nordsieck violation is reduced for this process. We have then participated in the measurement of the differential cross-sections for these different processes in the ATLAS experiment at 7 TeV. In particular, we have been involved in technical aspects of the measurement such as the study of the QCD background to the W+jets process in the muon channel. We have then combined the different measurements in this channel to compare their behaviour. This tends to show that several effects give the electroweak corrections their relative importance, as we see an increase of the relative contribution of weak-boson-plus-jets processes to the inclusive jet process with the transverse momentum of jets if we explicitly ask for the presence of electroweak bosons in the final state. This is currently only a preliminary study and aims at showing that it can be useful for investigating the underlying structure of these processes. Finally, we have studied the noise affecting the ATLAS calorimeter. This has allowed the development of a new way to detect problematic events using well-known theorems from statistics. This new method is able to detect bursts of noise and

  1. A Frame Study for Post-Processing Analysis on System Behavior: A Case Study of Deadline Miss Detection

    Directory of Open Access Journals (Sweden)

    Junghee Lee

    2010-01-01

    Problem statement: A lot of data can be obtained by system simulation using transaction level models without affecting the performance of the system. Due to huge amount of the raw data, we often need to post-process them to extract valuable information. Profiling capabilities of commercial tools provide predefined functionalities and don’t allow users to add or modify for their own purpose. Approach: This study proposed a general frame study for the automation of post-processing simulation results using Boolean representation. The proposed frame study consists of Boolean expresser, manipulator and analyzer. Results: The frame study was illustrated with a case study of deadline miss detection. Conclusion: The frame study was practical as it provides flexibility, generality and ease of use.

  2. Error Detection Processes in Problem Solving.

    Science.gov (United States)

    Allwood, Carl Martin

    1984-01-01

    Describes a study which analyzed problem solvers' error detection processes by instructing subjects to think aloud when solving statistical problems. Effects of evaluative episodes on error detection, detection of different error types, error detection processes per se, and relationship of error detection behavior to problem-solving proficiency…

  3. Tracing and elimination of process-induced contaminations in sputter-deposited targets by elastic recoil detection analysis

    International Nuclear Information System (INIS)

    Target foils for nuclear physics experiments have been prepared by high vacuum sputter deposition and analyzed for preparation induced contaminations using ERDA (elastic recoil detection analysis) with heavy ions. Sputter deposition has been shown to be a very effective method for the production of thin self-supporting targets, in particular those which cannot be made in certain cases by other techniques. Its low material consumption is of special importance for targets from expensive isotopes. During the development of this sputter deposition technique, contaminants introduced by the preparation method were traced by ERDA and eliminated by appropriate countermeasures. ERDA has proven to be a useful tool for quantitative thin film analysis. Its sensitivity of about 10^-2 at.% is sufficient for nuclear target analysis and roughly equal for all elements. If highly energetic ions, e.g. 170 MeV 127I, are used, elements from H up to the heaviest ones can be detected simultaneously, and therefore the relative content can be determined precisely. Light elements in thin foils have been simply identified by their energy signals in solid state detectors, as for instance H in V foils. For thicker targets or heavier components, an ionization detector with particle identification has been used. In self-supporting test foils of suitable elements the origin of every contaminant could be traced back by systematic analyses and reduced to a tolerable level. Based on this experience, the sputter setup was optimized to allow the preparation of targets of many elements with a minimum of contamination. (orig.)

  4. NASA Hazard Analysis Process

    Science.gov (United States)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

  5. Image Processing Technique for Brain Abnormality Detection

    Directory of Open Access Journals (Sweden)

    Ashraf Anwar

    2013-02-01

    Medical imaging is expensive and very sophisticated because of proprietary software and the need for expert personnel. This paper introduces an inexpensive, user-friendly, general-purpose image processing tool and visualization program, specifically designed in MATLAB, to detect many brain disorders as early as possible. The application provides clinical and quantitative analysis of medical images. Minute structural differences in the brain gradually result in major disorders such as schizophrenia, epilepsy, inherited speech and language disorders, Alzheimer's dementia, etc. Here the main focus is on diagnosing diseases related to the brain and its psychic nature (Alzheimer's disease).

  6. Integrated Process Capability Analysis

    Institute of Scientific and Technical Information of China (English)

    Chen, H. T.; Huang, M. L.; Hung, Y. H.; Chen, K. S.

    2002-01-01

    Process Capability Analysis (PCA) is a powerful tool to assess the ability of a process for manufacturing product that meets specifications. The larger process capability index implies the higher process yield, and the larger process capability index also indicates the lower process expected loss. Chen et al. (2001) has applied indices Cpu, Cpl, and Cpk for evaluating the process capability for a multi-process product with smaller-the-better, larger-the-better, and nominal-the-best spec...
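    The indices named above have textbook definitions; here is a small sketch, assuming normally distributed measurements and known specification limits:

```python
# The three capability indices in their standard forms: Cpu and Cpl compare
# the process mean to each specification limit in units of 3 sigma, and Cpk
# is their minimum. The example data and limits are illustrative.
import numpy as np

def capability_indices(x, lsl, usl):
    mu, sigma = np.mean(x), np.std(x, ddof=1)
    cpu = (usl - mu) / (3 * sigma)       # capability against the upper limit
    cpl = (mu - lsl) / (3 * sigma)       # capability against the lower limit
    cpk = min(cpu, cpl)                  # overall (nominal-the-best) capability
    return cpu, cpl, cpk

x = np.random.default_rng(2).normal(10.02, 0.05, size=200)
print(capability_indices(x, lsl=9.85, usl=10.15))
```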

  7. Aluminium Process Fault Detection and Diagnosis

    OpenAIRE

    Nazatul Aini Abd Majid; Taylor, Mark P; Chen, John J. J.; Brent R. Young

    2015-01-01

    The challenges in developing a fault detection and diagnosis system for industrial applications are not inconsiderable, particularly complex materials processing operations such as aluminium smelting. However, the organizing into groups of the various fault detection and diagnostic systems of the aluminium smelting process can assist in the identification of the key elements of an effective monitoring system. This paper reviews aluminium process fault detection and diagnosis systems and propo...

  8. Study on the fundamental technology for CBRNE detection. Numerical analysis on process of particulates dispersion by explosion

    International Nuclear Information System (INIS)

    The dispersion processes of particles such as CBRN (Chemical, Biological, Radiological and Nuclear) materials blown by an explosion on the ground are simulated using a compressible fluid dynamics simulation code. These simulations provide the initial condition of particle and gas flow properties for an incompressible fluid dynamics simulation to predict the long-term dispersion process of the particles. The effect of side walls on the particle dispersion process is estimated in simulations of an explosion inside a building with simplified walls. (author)

  9. Ongoing Active Deformation Processes at Fernandina Volcano (Galapagos) Detected via Multi-Orbit COSMO-SkyMed SAR Data Analysis

    Science.gov (United States)

    Pepe, Susi; Castaldo, Raffaele; De Luca, Claudio; Casu, Francesco; Tizzani, Pietro; Sansosti, Eugenio

    2014-05-01

    Fernandina Volcano, Galápagos (Ecuador), has experienced several uplift and eruption episodes over the last twenty-two years. The ground deformation between 2002 and 2006 was interpreted as the effect of an inflation phenomenon of two separate magma reservoirs beneath the caldera. Moreover, the uplift deformation that occurred during the 2005 eruption was concentrated near the circumferential eruptive fissures, while being superimposed on a broad subsidence centred on the caldera. Geodetic studies emphasized the presence of two subvolcanic lateral intrusions from the central storage system in December 2006 and August 2007. The latest eruption, in 2009, was characterized by lava flows emitted from the SW radial fissures. We analyze the spatial and temporal ground deformation between March 2012 and July 2013 by using data acquired by the COSMO-SkyMed X-band constellation along both ascending and descending orbits and by applying advanced InSAR techniques. In particular, we use the SBAS InSAR approach and combine ascending and descending time series to produce the vertical and East-West components of the mean deformation velocity and the deformation time series. Our analysis revealed a new uplift phenomenon due to stress concentration inside the shallow magmatic system of the volcano. In particular, the vertical mean velocity map shows that the deformation pattern is concentrated inside the caldera region and is characterized by strong radial symmetry, with a maximum displacement of about 20 cm in uplift; an axial symmetry is also observed in the EW horizontal mean velocity map, showing a maximum displacement of about +12 cm towards East for the SE flank and -12 cm towards West for the NW flank of the volcano. Moreover, the deformation time series show a rather linear uplift trend from March to September 2012, interrupted by a low-deformation-rate interval lasting until January 2013. After this stage, the deformation again shows a linear behaviour with an increased uplift rate

  10. Badge Office Process Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-12

    The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.

  11. Post-processing noise removal algorithm for magnetic resonance imaging based on edge detection and wavelet analysis

    Energy Technology Data Exchange (ETDEWEB)

    Placidi, Giuseppe; Alecci, Marcello; Sotgiu, Antonello [INFM, c/o Centro di Risonanza Magnetica and Dipartimento di Scienze e Tecnologie Biomediche, Universita dell' Aquila, Via Vetoio 10, 67010 Coppito, L' Aquila (Italy)

    2003-07-07

    A post-processing noise suppression technique for biomedical MRI images is presented. The described procedure recovers both sharp edges and smooth surfaces from a given noisy MRI image; it does not blur the edges and does not introduce spikes or other artefacts. The fine details of the image are also preserved. The proposed algorithm first extracts the edges from the original image and then performs noise reduction by using a wavelet de-noise method. After the application of the wavelet method, the edges are restored to the filtered image. The result is the original image with less noise, fine detail and sharp edges. Edge extraction is performed by using an algorithm based on Sobel operators. The wavelet de-noise method is based on the calculation of the correlation factor between wavelet coefficients belonging to different scales. The algorithm was tested on several MRI images and, as an example of its application, we report the results obtained from a spin echo (multi echo) MRI image of a human wrist collected with a low field experimental scanner (the signal-to-noise ratio, SNR, of the experimental image was 12). Other filtering operations have been performed after the addition of white noise on both channels of the experimental image, before the magnitude calculation. The results at SNR = 7, SNR = 5 and SNR = 3 are also reported. For SNR values between 5 and 12, the improvement in SNR was substantial and the fine details were preserved, the edges were not blurred and no spikes or other artefacts were evident, demonstrating the good performances of our method. At very low SNR (SNR = 3) our result is worse than that obtained by a simpler filtering procedure.
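    A sketch of the described pipeline (Sobel edge extraction, wavelet de-noising, edge restoration), assuming PyWavelets and SciPy are available; the wavelet, decomposition level, and threshold rule are illustrative choices rather than the authors' exact settings:

```python
# Sketch of edge-preserving wavelet de-noising: extract a Sobel edge map,
# soft-threshold the wavelet detail coefficients, then restore the original
# edge pixels. Wavelet choice, level, and threshold rule are assumptions.
import numpy as np
import pywt
from scipy import ndimage

def denoise_preserving_edges(img, wavelet="db4", level=2, edge_q=95):
    grad = np.hypot(ndimage.sobel(img, axis=0), ndimage.sobel(img, axis=1))
    edges = grad > np.percentile(grad, edge_q)          # edge-pixel mask

    coeffs = pywt.wavedec2(img, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745  # noise estimate (MAD)
    thr = sigma * np.sqrt(2 * np.log(img.size))         # universal threshold
    den = [coeffs[0]] + [
        tuple(pywt.threshold(c, thr, mode="soft") for c in detail)
        for detail in coeffs[1:]
    ]
    out = pywt.waverec2(den, wavelet)[: img.shape[0], : img.shape[1]]
    out[edges] = img[edges]                             # put sharp edges back
    return out
```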

  12. Chemical analysis of raw and processed Fructus arctii by high-performance liquid chromatography/diode array detection-electrospray ionization-mass spectrometry

    Directory of Open Access Journals (Sweden)

    Kunming Qin

    2014-01-01

    Background: In traditional Chinese medicine (TCM), raw and processed herbs are used to treat different diseases. Fructus Arctii, the dried fruit of Arctium lappa L. (Compositae), is widely used in TCM. Stir-frying is the most common processing method, which might modify the chemical compositions in Fructus Arctii. Materials and Methods: To test this hypothesis, we focused on analysis and identification of the main chemical constituents in raw and processed Fructus Arctii (PFA) by high-performance liquid chromatography/diode array detection-electrospray ionization-mass spectrometry. Results: The results indicated that there was less arctiin in stir-fried materials than in raw materials. However, there were higher levels of arctigenin in stir-fried materials than in raw materials. Conclusion: We suggest that arctiin was reduced significantly following the thermal conversion of arctiin to arctigenin. In conclusion, this finding may shed some light on understanding the differences in the therapeutic values of raw versus PFA in TCM.

  13. Chemical process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

    The Office of Worker Health and Safety (EH-5) under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE) has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  14. Aluminium Process Fault Detection and Diagnosis

    Directory of Open Access Journals (Sweden)

    Nazatul Aini Abd Majid

    2015-01-01

    The challenges in developing a fault detection and diagnosis system for industrial applications are not inconsiderable, particularly complex materials processing operations such as aluminium smelting. However, the organizing into groups of the various fault detection and diagnostic systems of the aluminium smelting process can assist in the identification of the key elements of an effective monitoring system. This paper reviews aluminium process fault detection and diagnosis systems and proposes a taxonomy that includes four key elements: knowledge, techniques, usage frequency, and results presentation. Each element is explained together with examples of existing systems. A fault detection and diagnosis system developed based on the proposed taxonomy is demonstrated using aluminium smelting data. A potential new strategy for improving fault diagnosis is discussed based on the ability of the new technology, augmented reality, to augment operators’ view of an industrial plant, so that it permits a situation-oriented action in real working environments.

  15. Entanglement of identical particles and the detection process

    DEFF Research Database (Denmark)

    Tichy, Malte C.; de Melo, Fernando; Kus, Marek;

    2013-01-01

    We introduce detector-level entanglement, a unified entanglement concept for identical particles that takes into account the possible deletion of many-particle which-way information through the detection process. The concept implies a measure for the effective indistinguishability of the particles... statistical behavior depends on their initial entanglement. Our results show that entanglement cannot be attributed to a state of identical particles alone, but that the detection process has to be incorporated in the analysis.

  16. Intelligent Signal Processing for Detection System Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Fu, C Y; Petrich, L I; Daley, P F; Burnham, A K

    2004-12-05

    A wavelet-neural network signal processing method has demonstrated approximately tenfold improvement over traditional signal-processing methods for the detection limit of various nitrogen and phosphorus compounds from the output of a thermionic detector attached to a gas chromatograph. A blind test was conducted to validate the lower detection limit. All fourteen of the compound spikes were detected when above the estimated threshold, including all three within a factor of two above the threshold. In addition, two of six spikes were detected at levels of 1/2 the concentration of the nominal threshold. Another two of the six would have been detected correctly if we had allowed human intervention to examine the processed data. One apparent false positive in five nulls was traced to a solvent impurity, whose presence was subsequently identified by analyzing a solvent aliquot evaporated to 1% residual volume, while the other four nulls were properly classified. We view this signal processing method as broadly applicable in analytical chemistry, and we advocate that advanced signal processing methods should be applied as directly as possible to the raw detector output so that less discriminating preprocessing and post-processing does not throw away valuable signal.

  17. Parallel Image Processing Technology of Surface Detection System

    Institute of Scientific and Technical Information of China (English)

    LI Chang-le; CHENG Wan-sheng; FAN Ji-zhuang; ZHAO Jie

    2008-01-01

    To improve the image processing speed and detection precision of a strip surface detection system, the design of a parallel image processing system and methods of algorithm implementation have been studied, based on an analysis of the characteristics of image data and image processing in strip surface detection systems. Using a field programmable gate array (FPGA) as the hardware implementation platform and considering the characteristics of strip surface detection, a parallel image processing system implemented with multiple IP cores is designed. According to different computing tasks and the load balancing capability of the parallel processing system, the system can set different numbers of computing nodes to meet the system's demand and save hardware cost.

  18. Multimode Process Fault Detection Using Local Neighborhood Similarity Analysis☆

    Institute of Scientific and Technical Information of China (English)

    Xiaogang Deng; Xuemin Tian

    2014-01-01

    Traditional data-driven fault detection methods assume a unimodal distribution of process data, so they often do not perform well in chemical processes with multiple operating modes. In order to monitor multimode chemical processes effectively, this paper presents a novel fault detection method based on local neighborhood similarity analysis (LNSA). In the proposed method, prior process knowledge is not required and only the multimode normal operation data are used to construct a reference dataset. For online monitoring of the process state, LNSA applies a moving window technique to obtain a current snapshot data window. Then a neighborhood searching technique is used to acquire the corresponding local neighborhood data window from the reference dataset. Similarity analysis between the snapshot and neighborhood data windows is performed, which includes the calculation of a principal component analysis (PCA) similarity factor and a distance similarity factor. The PCA similarity factor captures the change of data direction while the distance similarity factor is used for monitoring the shift of the data center position. Based on these similarity factors, two monitoring statistics are built for multimode process fault detection. Finally, a simulated continuous stirred tank system is used to demonstrate the effectiveness of the proposed method. The simulation results show that LNSA can detect multimode process changes effectively and performs better than traditional fault detection methods.
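    The two similarity factors have common textbook forms (Krzanowski's PCA similarity factor and a distance factor between window centers); a sketch under those assumptions, which may differ in scaling from the paper:

```python
# Sketch of the two similarity factors between a snapshot window X1 and a
# neighborhood window X2 (rows = samples, columns = variables). The PCA factor
# averages squared cosines between the leading k PC subspaces; the distance
# factor compares window means. Both forms are assumed textbook versions.
import numpy as np

def pca_similarity(X1, X2, k=3):
    """Average squared cosine between the leading k PC subspaces (1 = aligned)."""
    U1 = np.linalg.svd(X1 - X1.mean(0), full_matrices=False)[2][:k].T
    U2 = np.linalg.svd(X2 - X2.mean(0), full_matrices=False)[2][:k].T
    return np.trace(U1.T @ U2 @ U2.T @ U1) / k

def distance_similarity(X1, X2):
    """Similarity of data center positions (near 0 far apart, 1 identical)."""
    d = np.linalg.norm(X1.mean(0) - X2.mean(0))
    return float(np.exp(-d))
```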

  19. Intelligent Signal Processing for Detection System Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Fu, C Y; Petrich, L I; Daley, P F; Burnham, A K

    2004-06-18

    A wavelet-neural network signal processing method has demonstrated approximately tenfold improvement in the detection limit of various nitrogen and phosphorus compounds over traditional signal-processing methods in analyzing the output of a thermionic detector attached to the output of a gas chromatograph. A blind test was conducted to validate the lower detection limit. All fourteen of the compound spikes were detected when above the estimated threshold, including all three within a factor of two above. In addition, two of six were detected at levels 1/2 the concentration of the nominal threshold. We would have had another two correct hits if we had allowed human intervention to examine the processed data. One apparent false positive in five nulls was traced to a solvent impurity, whose presence was identified by running a solvent aliquot evaporated to 1% residual volume, while the other four nulls were properly classified. We view this signal processing method as broadly applicable in analytical chemistry, and we advocate that advanced signal processing methods be applied as directly as possible to the raw detector output so that less discriminating preprocessing and post-processing does not throw away valuable signal.

  20. Network Anomaly Detection Based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Ali A. Ghorbani

    2008-11-01

    Signal processing techniques have been applied recently for analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining the wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high-detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network where five attack types are successfully detected from over 30 million flows.

  1. Network Anomaly Detection Based on Wavelet Analysis

    Science.gov (United States)

    Lu, Wei; Ghorbani, Ali A.

    2008-12-01

    Signal processing techniques have been applied recently for analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining the wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high-detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network where five attack types are successfully detected from over 30 million flows.

  2. Linear discriminant analysis for welding fault detection

    International Nuclear Information System (INIS)

    This work presents a new method for real time welding fault detection in industry based on Linear Discriminant Analysis (LDA). A set of parameters was calculated from one second blocks of electrical data recorded during welding and based on control data from reference welds under good conditions, as well as faulty welds. Optimised linear combinations of the parameters were determined with LDA and tested with independent data. Short arc welds in overlap joints were studied with various power sources, shielding gases, wire diameters, and process geometries. Out-of-position faults were investigated. Application of LDA fault detection to a broad range of welding procedures was investigated using a similarity measure based on Principal Component Analysis. The measure determines which reference data are most similar to a given industrial procedure and the appropriate LDA weights are then employed. Overall, results show that Linear Discriminant Analysis gives an effective and consistent performance in real-time welding fault detection.
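    Below is a minimal sketch of the training-and-scoring loop, assuming scikit-learn and synthetic block features; the real system derives its parameters from one-second blocks of welding electrical data:

```python
# Sketch of LDA-based fault detection: learn discriminant weights from
# reference good/faulty weld feature blocks, then classify incoming blocks.
# Feature dimensionality and the synthetic distributions are assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(3)
good = rng.normal(0.0, 1.0, size=(200, 8))     # 8 parameters per 1 s block
faulty = rng.normal(0.8, 1.2, size=(200, 8))
X = np.vstack([good, faulty])
y = np.array([0] * 200 + [1] * 200)            # 0 = good weld, 1 = fault

lda = LinearDiscriminantAnalysis().fit(X, y)   # optimised linear combination
new_block = rng.normal(0.8, 1.2, size=(1, 8))  # incoming block during welding
print("fault" if lda.predict(new_block)[0] else "ok")
```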

  3. Signal processing aspects of windshear detection

    Science.gov (United States)

    Aalfs, David D.; Baxa, Ernest G., Jr.; Bracalente, Emedio M.

    1993-01-01

    Low-altitude windshear (LAWS) has been identified as a major hazard to aircraft, particularly during takeoff and landing. The Federal Aviation Administration (FAA) has been involved with developing technology to detect LAWS. A key element in this technology is high resolution pulse Doppler weather radar equipped with signal and data processing to provide timely information about possible hazardous conditions.

  4. Signal processing for boiling noise detection

    International Nuclear Information System (INIS)

    The present paper deals with investigations of acoustic signals from a boiling experiment performed on the KNS I loop at KfK Karlsruhe. Signals have been analysed in the frequency as well as the time domain. Signal characteristics successfully used to detect the boiling process have been found in the time domain. (author). 6 refs, figs

  5. Chemical analysis of raw and processed Fructus arctii by high-performance liquid chromatography/diode array detection-electrospray ionization-mass spectrometry

    OpenAIRE

    Kunming Qin; Qidi Liu; Hao Cai; Gang Cao; Tulin Lu; Baojia Shen; Yachun Shu; Baochang Cai

    2014-01-01

    Background: In traditional Chinese medicine (TCM), raw and processed herbs are used to treat the different diseases. Fructus Arctii, the dried fruits of Arctium lappa l. (Compositae), is widely used in the TCM. Stir-frying is the most common processing method, which might modify the chemical compositions in Fructus Arctii. Materials and Methods: To test this hypothesis, we focused on analysis and identification of the main chemical constituents in raw and processed Fructus Arctii (PFA) by hig...

  6. Lung Cancer Detection Using Image Processing Techniques

    Directory of Open Access Journals (Sweden)

    Mokhled S. AL-TARAWNEH

    2012-08-01

    Recently, image processing techniques have been widely used in several medical areas for image improvement in earlier detection and treatment stages, where the time factor is very important for discovering abnormality issues in target images, especially in various cancer tumours such as lung cancer, breast cancer, etc. Image quality and accuracy are the core factors of this research; image quality assessment as well as improvement depend on the enhancement stage, where low-level pre-processing techniques based on a Gabor filter within Gaussian rules are used. Following the segmentation principles, an enhanced region of the object of interest that is used as a basic foundation of feature extraction is obtained. Relying on general features, a normality comparison is made. In this research, the main detected features for accurate image comparison are pixel percentage and mask-labelling.
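    A hedged sketch of the enhancement and segmentation steps, assuming scikit-image and a stand-in image; the Gabor frequency and the Otsu thresholding are illustrative choices, not the paper's settings:

```python
# Sketch: Gabor filtering (with its Gaussian envelope) to enhance the image,
# Otsu thresholding to segment a region of interest, and a pixel-percentage
# feature. The test image is a stand-in for a lung scan.
import numpy as np
from skimage import data, filters

img = data.camera() / 255.0                      # stand-in for a lung image
enhanced, _ = filters.gabor(img, frequency=0.2)  # real part of Gabor response
mask = enhanced > filters.threshold_otsu(enhanced)
pixels_percentage = 100.0 * mask.mean()          # feature used for comparison
print(f"region of interest covers {pixels_percentage:.1f}% of the image")
```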

  7. Detection of radiation processing in onions

    International Nuclear Information System (INIS)

    Two varieties of onions were used for irradiation. Both varieties were divided into two parts: the first was irradiated with a dose of 80 Gy and the second served as a control. The two parts were stored under the same conditions. Conductometry, liquid chromatography and spectrophotometry were used for detecting the radiation processing of the onions. Only the spectrophotometric determination of 2-desoxysaccharides made it possible to reliably distinguish irradiated onions from non-irradiated controls throughout the storage time. (E.S.)

  8. Faulty Sensor Detection and Reconstruction for a PVC Making Process

    Institute of Scientific and Technical Information of China (English)

    李元; 周东华; 谢植; S. Joe Qin

    2004-01-01

    Based on principal component analysis, this paper presents an application of faulty sensor detection and reconstruction in a batch process, the polyvinylchloride (PVC) making process. To deal with inconsistency in process data, it is proposed to first use the dynamic time warping technique to synchronize the historical data, then build a consistent multi-way principal component analysis model. Fault detection is carried out based on the squared prediction error statistical control plot. By defining the principal component subspace, residual subspace and sensor validity index, a faulty sensor can be reconstructed and identified along the fault direction. Finally, application results are illustrated in detail using real data from an industrial PVC making process.
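    A sketch of the PCA model pieces described (an SPE alarm, then iterative reconstruction of a suspect sensor from the model) follows; dimensions, component counts, and limits are assumptions:

```python
# Sketch of PCA-based sensor fault detection and reconstruction: SPE measures
# the residual off the principal-component subspace; a suspect sensor is
# reconstructed by iterating its value toward the model's projection.
import numpy as np

def fit_pca(X, k):
    Xc = X - X.mean(0)
    P = np.linalg.svd(Xc, full_matrices=False)[2][:k].T   # loadings (m x k)
    return X.mean(0), P

def spe(x, mean, P):
    r = (x - mean) - P @ (P.T @ (x - mean))    # residual off the PC subspace
    return float(r @ r)                        # squared prediction error

def reconstruct_sensor(x, mean, P, j, iters=20):
    """Replace sensor j by the value most consistent with the PCA model."""
    z = x.copy().astype(float)
    for _ in range(iters):
        z_hat = mean + P @ (P.T @ (z - mean))  # project onto the model
        z[j] = z_hat[j]                        # iterate on the suspect sensor
    return z[j]
```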

  9. Amalgamation of Anomaly-Detection Indices for Enhanced Process Monitoring

    KAUST Repository

    Harrou, Fouzi

    2016-01-29

    Accurate and effective anomaly detection and diagnosis of modern industrial systems are crucial for ensuring reliability and safety and for maintaining desired product quality. Anomaly detection based on principal component analysis (PCA) has been studied intensively and largely applied to multivariate processes with highly cross-correlated process variables; however, conventional PCA-based methods often fail to detect small or moderate anomalies. In this paper, the proposed approach integrates two popular process-monitoring detection tools: the conventional PCA-based monitoring indices, Hotelling's T2 and Q, and the exponentially weighted moving average (EWMA). We develop two EWMA tools based on the Q and T2 statistics, T2-EWMA and Q-EWMA, to detect anomalies in the process mean. The performances of the proposed methods were compared with those of conventional PCA-based anomaly-detection methods by applying each method to two examples: a synthetic data set and experimental data collected from a flow heating system. The results clearly show the benefits and effectiveness of the proposed methods over conventional PCA-based methods.
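
    The core of the Q-EWMA/T2-EWMA idea, smoothing a monitoring statistic with an EWMA and alarming on mean shifts, can be sketched as follows. This is a generic one-sided EWMA chart; the smoothing weight λ, limit width L and in-control estimates are assumptions, not the authors' exact formulation.

    ```python
    import numpy as np

    def ewma_chart(stat, lam=0.2, L=3.0):
        """EWMA control chart over a 1-D monitoring statistic (e.g. Q or T2).

        Returns the EWMA trace and a boolean alarm array; the upper limit uses
        the standard time-varying EWMA variance formula.
        """
        stat = np.asarray(stat, dtype=float)
        mu0, sd0 = stat.mean(), stat.std()   # in-control estimates (assumed)
        z = np.empty_like(stat)
        prev = mu0
        for t in range(len(stat)):
            prev = lam * stat[t] + (1 - lam) * prev
            z[t] = prev
        t_idx = np.arange(1, len(stat) + 1)
        var = sd0**2 * lam / (2 - lam) * (1 - (1 - lam) ** (2 * t_idx))
        upper = mu0 + L * np.sqrt(var)
        return z, z > upper
    ```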

  10. Digital Signal Processing Based Real Time Vehicular Detection System

    Institute of Scientific and Technical Information of China (English)

    YANG Zhaoxuan; LIN Tao; LI Xiangping; LIU Chunyi; GAO Jian

    2005-01-01

    Traffic monitoring is of major importance for enforcing traffic management policies. To accomplish this task, vehicle detection can be achieved by exploiting image analysis techniques. In this paper, a solution is presented to obtain various traffic parameters through a vehicular video detection system (VVDS). VVDS exploits an algorithm based on virtual loops to detect moving vehicles in real time. This algorithm uses the background differencing method, and vehicles can be detected through the luminance difference of pixels between the background image and the current image. Furthermore, a novel technology named spatio-temporal image sequence analysis is applied to background differencing to improve detection accuracy. Then a hardware implementation of a digital signal processing (DSP) based board is described in detail; the board can simultaneously process four channels of video from different cameras. The benefit of using a DSP is that images of a roadway can be processed at frame rate due to the DSP's high performance. In the end, VVDS is tested on real-world scenes, and experiment results show that the system is both fast and robust for transportation surveillance.
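
    The virtual-loop test described above amounts to background differencing on luminance. A minimal NumPy sketch, with the thresholds as illustrative assumptions rather than values from the paper:

    ```python
    import numpy as np

    def detect_in_virtual_loop(background, frame, loop_mask, thresh=30, min_frac=0.2):
        """Background-differencing vehicle test inside one virtual loop.

        `background` and `frame` are grayscale uint8 images, `loop_mask` a
        boolean array marking the virtual-loop pixels. A vehicle is declared
        present when the fraction of loop pixels whose luminance differs from
        the background by more than `thresh` exceeds `min_frac`.
        """
        diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
        changed = (diff > thresh) & loop_mask
        return changed.sum() / loop_mask.sum() > min_frac
    ```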

  11. Adapting safety requirements analysis to intrusion detection

    Science.gov (United States)

    Lutz, R.

    2001-01-01

    Several requirements analysis techniques widely used in safety-critical systems are being adapted to support the analysis of secure systems. Perhaps the most relevant system safety technique for Intrusion Detection Systems is hazard analysis.

  12. Automatic detection of NIL defects using microscopy and image processing

    KAUST Repository

    Pietroy, David

    2013-12-01

    Nanoimprint Lithography (NIL) is a promising technology for low-cost and large-scale nanostructure fabrication. This technique is based on a contact molding-demolding process that can produce a number of defects such as incomplete filling, negative patterns, and sticking. In this paper, microscopic imaging combined with a specific processing algorithm is used to numerically detect defects in printed patterns. Results obtained for 1D and 2D imprinted gratings with different microscopic image magnifications are presented. The results are independent of the device that captures the image (optical, confocal or electron microscope). The use of numerical images makes it possible to automate the detection and to compute a statistical analysis of defects. This method provides a fast analysis of printed gratings and could be used to monitor the production of such structures. © 2013 Elsevier B.V. All rights reserved.

  13. Lameness detection in dairy cattle: single predictor v. multivariate analysis of image-based posture processing and behaviour and performance sensing.

    Science.gov (United States)

    Van Hertem, T; Bahr, C; Schlageter Tello, A; Viazzi, S; Steensels, M; Romanini, C E B; Lokhorst, C; Maltz, E; Halachmi, I; Berckmans, D

    2016-09-01

    The objective of this study was to evaluate whether a multi-sensor system (milk, activity, body posture) was a better classifier for lameness than single-sensor-based detection models. Between September 2013 and August 2014, 3629 cow observations were collected on a commercial dairy farm in Belgium. Human locomotion scoring was used as the reference for model development and evaluation. Cow behaviour and performance were measured with existing sensors that were already present at the farm. A prototype three-dimensional video recording system was used to automatically quantify the back posture of a cow. For the single-predictor comparisons, a receiver operating characteristics curve was made. For the multivariate detection models, logistic regression and generalized linear mixed models (GLMM) were developed. The best lameness classification model was obtained by the multi-sensor analysis (area under the receiver operating characteristics curve (AUC)=0.757±0.029), containing a combination of milk and milking variables, activity, and gait and posture variables from videos. Second, the multivariate video-based system (AUC=0.732±0.011) performed better than the multivariate milk sensors (AUC=0.604±0.026) and the multivariate behaviour sensors (AUC=0.633±0.018). The video-based system performed better than the combined behaviour- and performance-based detection model (AUC=0.669±0.028), indicating that it is worthwhile to consider a video-based lameness detection system, regardless of the presence of other existing sensors on the farm. The results suggest that Θ2, the feature variable for the back curvature around the hip joints, with an AUC of 0.719, is the best single predictor variable for lameness detection based on locomotion scoring. In general, this study showed that the video-based back posture monitoring system outperforms the behaviour and performance sensing techniques for locomotion scoring-based lameness detection. A GLMM with seven specific
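
    The single-predictor versus multivariate comparison above boils down to comparing ROC AUCs. A toy sketch assuming scikit-learn and entirely synthetic data; the features and label model are stand-ins for the study's sensor data, not a reconstruction of it:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    n = 500
    X = rng.normal(size=(n, 3))   # stand-ins for milk/activity/posture features
    w = np.array([0.5, 0.8, 1.2])
    y = (X @ w + rng.normal(size=n) > 0).astype(int)  # synthetic lameness label

    # single predictor: rank animals by one raw feature (e.g. a curvature angle)
    auc_single = roc_auc_score(y, X[:, 2])

    # multivariate: logistic regression over all features, out-of-fold scores
    scores = cross_val_predict(LogisticRegression(), X, y, cv=5,
                               method="predict_proba")[:, 1]
    auc_multi = roc_auc_score(y, scores)
    print(f"single-predictor AUC={auc_single:.3f}, multivariate AUC={auc_multi:.3f}")
    ```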

  14. Process-based Security Detection Approach for Virtual Machines on Private Cloud Platforms

    OpenAIRE

    Xiyin Liu; Lijun Cao; Min Liu; Kun Han

    2013-01-01

    A process-based security detection method, PAMon, is proposed in this paper based on an analysis of current security detection techniques for virtual machines on private cloud platforms. The modules of PAMon, including semantic reconstruction, hidden process detection, resource utilization analysis, comprehensive analysis, and so forth, are thoroughly analyzed and investigated. To validate the feasibility of PAMon, a miniaturized private cloud was configured, aided by Xen and Eucalyptus technology....

  15. Process analysis and optimal control

    International Nuclear Information System (INIS)

    The presented publication should be considered an introduction to some problems of process analysis and optimal control with respect to complex systems. An essential aspect of the work is to present this matter with a view to using microcomputers. The publication contains chapters on some mathematical and theoretical terms, the real-time regime, process analysis, optimal control, and the control of complex systems. (author)

  16. Dynamic analysis of process reactors

    Energy Technology Data Exchange (ETDEWEB)

    Shadle, L.J.; Lawson, L.O.; Noel, S.D.

    1995-06-01

    The approach and methodology of conducting a dynamic analysis is presented in this poster session in order to describe how this type of analysis can be used to evaluate the operation and control of process reactors. Dynamic analysis of the PyGas™ gasification process is used to illustrate the utility of this approach. PyGas™ is the gasifier being developed for the Gasification Product Improvement Facility (GPIF) by Jacobs-Siffine Engineering and Riley Stoker. In the first step of the analysis, process models are used to calculate the steady-state conditions and associated sensitivities for the process. For the PyGas™ gasifier, the process models are non-linear mechanistic models of the jetting fluidized-bed pyrolyzer and the fixed-bed gasifier. These process sensitivities are key input, in the form of gain parameters or transfer functions, to the dynamic engineering models.

  17. Chemical detection, identification, and analysis system

    International Nuclear Information System (INIS)

    The chemical detection, identification, and analysis system (CDIAS) has three major goals. The first is to display safety information regarding chemical environment before personnel entry. The second is to archive personnel exposure to the environment. Third, the system assists users in identifying the stage of a chemical process in progress and suggests safety precautions associated with that process. In addition to these major goals, the system must be sufficiently compact to provide transportability, and it must be extremely simple to use in order to keep user interaction at a minimum. The system created to meet these goals includes several pieces of hardware and the integration of four software packages. The hardware consists of a low-oxygen, carbon monoxide, explosives, and hydrogen sulfide detector; an ion mobility spectrometer for airborne vapor detection; and a COMPAQ 386/20 portable computer. The software modules are a graphics kernel, an expert system shell, a data-base management system, and an interface management system. A supervisory module developed using the interface management system coordinates the interaction of the other software components. The system determines the safety of the environment using conventional data acquisition and analysis techniques. The low-oxygen, carbon monoxide, hydrogen sulfide, explosives, and vapor detectors are monitored for hazardous levels, and warnings are issued accordingly

  18. Fast Facial Detection by Depth Map Analysis

    OpenAIRE

    Ming-Yuan Shieh; Tsung-Min Hsieh

    2013-01-01

    In order to obtain correct facial recognition results, one needs to adopt appropriate facial detection techniques. Moreover, the performance of facial detection is usually affected by environmental conditions such as background, illumination, and the complexity of objectives. In this paper, the proposed facial detection scheme, which is based on depth map analysis, aims to improve the effectiveness of facial detection and recognition under different environmental illumination conditions. The pro...

  19. Semantic multimedia analysis and processing

    CERN Document Server

    Spyrou, Evaggelos; Mylonas, Phivos

    2014-01-01

    Broad in scope, Semantic Multimedia Analysis and Processing provides a complete reference of techniques, algorithms, and solutions for the design and the implementation of contemporary multimedia systems. Offering a balanced, global look at the latest advances in semantic indexing, retrieval, analysis, and processing of multimedia, the book features the contributions of renowned researchers from around the world. Its contents are based on four fundamental thematic pillars: 1) information and content retrieval, 2) semantic knowledge exploitation paradigms, 3) multimedia personalization, and 4)

  20. Big Data Analysis of Manufacturing Processes

    Science.gov (United States)

    Windmann, Stefan; Maier, Alexander; Niggemann, Oliver; Frey, Christian; Bernardi, Ansgar; Gu, Ying; Pfrommer, Holger; Steckel, Thilo; Krüger, Michael; Kraus, Robert

    2015-11-01

    The high complexity of manufacturing processes and the continuously growing amount of data lead to excessive demands on the users with respect to process monitoring, data analysis and fault detection. For these reasons, problems and faults are often detected too late, maintenance intervals are chosen too short and optimization potential for higher output and increased energy efficiency is not sufficiently used. A possibility to cope with these challenges is the development of self-learning assistance systems, which identify relevant relationships by observation of complex manufacturing processes so that failures, anomalies and need for optimization are automatically detected. The assistance system developed in the present work accomplishes data acquisition, process monitoring and anomaly detection in industrial and agricultural processes. The assistance system is evaluated in three application cases: Large distillation columns, agricultural harvesting processes and large-scale sorting plants. In this paper, the developed infrastructures for data acquisition in these application cases are described as well as the developed algorithms and initial evaluation results.

  1. Distributed Anomaly Detection using Minimum Volume Elliptical Principal Component Analysis

    OpenAIRE

    O'Reilly, CE; Gluhak, A.; Imran, A.

    2016-01-01

    Principal component analysis with the residual error is an effective anomaly detection technique. In an environment where anomalies are present in the training set, the derived principal components can be skewed by the anomalies. A further aspect of anomaly detection is that data might be distributed across different nodes in a network, and their communication to a centralized processing unit may be prohibited due to communication cost. Current solutions to distributed anomaly detection rely on a h...

  2. Traffic sign detection and analysis

    DEFF Research Database (Denmark)

    Møgelmose, Andreas; Trivedi, Mohan M.; Moeslund, Thomas B.

    Traffic sign recognition (TSR) is a research field that has seen much activity in the recent decade. This paper introduces the problem and presents 4 recent papers on traffic sign detection and 4 recent papers on traffic sign classification. It attempts to extract recent trends in the field and...

  3. Risk analysis in work process

    International Nuclear Information System (INIS)

    The work organization process integrated with risk analysis at Guangdong Daya Bay Nuclear Power Station (GNPS) is described. This process is based upon worldwide successful practice and experience, with the special characteristics of the plant organization and culture taken into account.

  4. Image processing and analysis software development

    International Nuclear Information System (INIS)

    The work presented in this project is aimed at developing software, 'IMAGE GALLERY', to investigate various image processing and analysis techniques. The work was divided into two parts, namely image processing techniques and pattern recognition, the latter comprising character and face recognition. Various image enhancement techniques including negative imaging, contrast stretching, dynamic range compression, neon, diffuse, emboss, etc. have been studied. Segmentation techniques including point detection, line detection and edge detection have been studied. Some smoothing and sharpening filters have also been investigated. All these imaging techniques have been implemented in a window-based computer program written in Visual Basic. Neural network techniques based on the perceptron model have been applied for face and character recognition. (author)
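
    As an example of the edge detection listed among the segmentation techniques (the original program is in Visual Basic; Python with SciPy is used here purely for illustration):

    ```python
    import numpy as np
    from scipy.ndimage import convolve

    def sobel_edges(image, thresh=0.25):
        """Gradient-magnitude edge detection with Sobel kernels.

        `image` is a 2-D float array scaled to [0, 1]; the threshold is an
        illustrative fraction of the maximum gradient magnitude.
        """
        kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
        gx = convolve(image, kx)         # horizontal gradient
        gy = convolve(image, kx.T)       # vertical gradient
        mag = np.hypot(gx, gy)
        return mag > thresh * mag.max()  # boolean edge map
    ```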

  5. Altered fingerprints: analysis and detection.

    Science.gov (United States)

    Yoon, Soweon; Feng, Jianjiang; Jain, Anil K

    2012-03-01

    The widespread deployment of Automated Fingerprint Identification Systems (AFIS) in law enforcement and border control applications has heightened the need for ensuring that these systems are not compromised. While several issues related to fingerprint system security have been investigated, including the use of fake fingerprints for masquerading identity, the problem of fingerprint alteration or obfuscation has received very little attention. Fingerprint obfuscation refers to the deliberate alteration of the fingerprint pattern by an individual for the purpose of masking his identity. Several cases of fingerprint obfuscation have been reported in the press. Fingerprint image quality assessment software (e.g., NFIQ) cannot always detect altered fingerprints since the implicit image quality due to alteration may not change significantly. The main contributions of this paper are: 1) compiling case studies of incidents where individuals were found to have altered their fingerprints for circumventing AFIS, 2) investigating the impact of fingerprint alteration on the accuracy of a commercial fingerprint matcher, 3) classifying the alterations into three major categories and suggesting possible countermeasures, 4) developing a technique to automatically detect altered fingerprints based on analyzing orientation field and minutiae distribution, and 5) evaluating the proposed technique and the NFIQ algorithm on a large database of altered fingerprints provided by a law enforcement agency. Experimental results show the feasibility of the proposed approach in detecting altered fingerprints and highlight the need to further pursue this problem. PMID:21808092

  6. Method Validation for the Quantitative Analysis of Aflatoxins (B1, B2, G1, and G2) and Ochratoxin A in Processed Cereal-Based Foods by HPLC with Fluorescence Detection.

    Science.gov (United States)

    Gazioğlu, Işil; Kolak, Ufuk

    2015-01-01

    Modified AOAC 991.31 and AOAC 2000.03 methods for the simultaneous determination of total aflatoxins (AFs), aflatoxin B1, and ochratoxin A (OTA) in processed cereal-based foods by RP-HPLC coupled with fluorescence detection were validated. A KOBRA® Cell derivatization system was used to analyze total AFs. One of the modifications was the extraction procedure of mycotoxins. Both AFs and OTA were extracted with methanol-water (75+25, v/v) and purified with an immunoaffinity column before HPLC analysis. The modified methods were validated by measuring the specificity, selectivity, linearity, sensitivity, accuracy, repeatability, reproducibility, recovery, LOD, and LOQ parameters. The validated methods were successfully applied for the simultaneous determination of mycotoxins in 81 processed cereal-based foods purchased in Turkey. These rapid, sensitive, simple, and validated methods are suitable for the simultaneous determination of AFs and OTA in the processed cereal-based foods. PMID:26268976

  7. Olfactory processing: detection of rapid changes.

    Science.gov (United States)

    Croy, Ilona; Krone, Franziska; Walker, Susannah; Hummel, Thomas

    2015-06-01

    Changes in the olfactory environment have a rather poor chance of being detected. The aim of the present study was to determine whether the same (cued) or different (uncued) odors can generally be detected at short inter-stimulus intervals (ISIs) below 2.5 s. Furthermore, we investigated whether inhibition of return, an attentional phenomenon facilitating the detection of new stimuli at longer ISIs, is present in the domain of olfaction. Thirteen normosmic people (3 men, 10 women; age range 19-27 years; mean age 23 years) participated. Stimulation was performed using air-dilution olfactometry with 2 odors: phenylethyl alcohol and hydrogen disulfide. Reaction time to target stimuli was assessed in cued and uncued conditions at ISIs of 1, 1.5, 2, and 2.5 s. There was a significant main effect of ISI, indicating that odors presented only 1 s apart are missed frequently. Uncued presentation facilitated detection at short ISIs, implying that changes in the olfactory environment are detected better than repeated presentation of the same odor. Effects relating to "olfactory inhibition of return", on the other hand, are not supported by our results. This suggests that attention works differently for the olfactory system compared with the visual and auditory systems. PMID:25911421

  8. Fourier analysis and stochastic processes

    CERN Document Server

    Brémaud, Pierre

    2014-01-01

    This work is unique as it provides a uniform treatment of the Fourier theories of functions (Fourier transforms and series, z-transforms), finite measures (characteristic functions, convergence in distribution), and stochastic processes (including ARMA series and point processes). It emphasises the links between these three themes. The chapter on the Fourier theory of point processes and signals structured by point processes is a novel addition to the literature on Fourier analysis of stochastic processes. It also connects the theory with recent lines of research such as biological spike signals and ultrawide-band communications. Although the treatment is mathematically rigorous, the convivial style makes the book accessible to a large audience. In particular, it will be interesting to anyone working in electrical engineering and communications, biology (point process signals) and econometrics (ARMA models). A careful review of the prerequisites (integration and probability theory in the appendix, Hilbert spa...

  9. Advanced Signal Processing for Thermal Flaw Detection; TOPICAL

    International Nuclear Information System (INIS)

    Dynamic thermography is a promising technology for inspecting metallic and composite structures used in high-consequence industries. However, the reliability and inspection sensitivity of this technology have historically been limited by the need for extensive operator experience and the use of human judgment and visual acuity to detect flaws in the large volume of infrared image data collected. To overcome these limitations, new automated data analysis algorithms and software are needed. The primary objectives of this research effort were to develop a data processing methodology that is tied to the underlying physics, which reduces or removes the data interpretation requirements and eliminates the need to look at significant numbers of data frames to determine if a flaw is present. Considering the strengths and weaknesses of previous research efforts, this research elected to couple both the temporal and spatial attributes of the surface temperature. Of the algorithms investigated, the best performing was a radiance-weighted root mean square Laplacian metric that included a multiplicative surface effect correction factor and a novel spatio-temporal parametric model for data smoothing. This metric demonstrated the potential for detecting flaws smaller than 0.075 inch in inspection areas on the order of one square foot. Included in this report are the development of a thermal imaging model, a weighted least squares thermal data smoothing algorithm, simulation and experimental flaw detection results, and an overview of the ATAC (Automated Thermal Analysis Code) software that was developed to analyze thermal inspection data
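
    A simplified reading of the reported metric: for each frame, take a spatial Laplacian, weight it by radiance, and form the RMS over time. The sketch below, assuming NumPy and SciPy, is an interpretation of the metric's structure, not the ATAC implementation (the surface correction factor and parametric smoothing are omitted):

    ```python
    import numpy as np
    from scipy.ndimage import laplace

    def rms_laplacian_map(frames):
        """Simplified flaw-indication map from a thermal image sequence.

        `frames` has shape (time, rows, cols). For each frame the spatial
        Laplacian is weighted by the frame radiance, and the root mean square
        over time is returned; subsurface flaws disturb lateral heat flow and
        show up as persistent Laplacian hot spots.
        """
        frames = np.asarray(frames, dtype=float)
        weighted = np.stack([f * laplace(f) for f in frames])
        return np.sqrt(np.mean(weighted**2, axis=0))   # (rows, cols) map
    ```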

  10. Processing Ocean Images to Detect Large Drift Nets

    Science.gov (United States)

    Veenstra, Tim

    2009-01-01

    A computer program processes the digitized outputs of a set of downward-looking video cameras aboard an aircraft flying over the ocean. The purpose served by this software is to facilitate the detection of large drift nets that have been lost, abandoned, or jettisoned. The development of this software and of the associated imaging hardware is part of a larger effort to develop means of detecting and removing large drift nets before they cause further environmental damage to the ocean and to shores on which they sometimes impinge. The software is capable of near-real-time processing of as many as three video feeds at a rate of 30 frames per second. After a user sets the parameters of an adjustable algorithm, the software analyzes each video stream, detects any anomaly, issues a command to point a high-resolution camera toward the location of the anomaly, and, once the camera has been so aimed, issues a command to trigger the camera shutter. The resulting high-resolution image is digitized, and the resulting data are automatically uploaded to the operator's computer for analysis.

  11. Less is More: Data Processing with SVM for Intrusion Detection

    Institute of Scientific and Technical Information of China (English)

    XIAO Hai-jun; HONG Fan; WANG Ling

    2009-01-01

    To improve the detection rate and lower the false positive rate in intrusion detection systems, dimensionality reduction is widely used. For this purpose, a data processing (DP) scheme with a support vector machine (SVM) was built. Unlike the traditional approaches of identifying redundant data before purging the audit data by expert knowledge, or of utilizing different subsets of the available 41 connection attributes to build a classifier, the proposed strategy first removes the attributes whose correlation with another attribute exceeds a threshold, and then classifies two sequential samples as one class, removing either of the two samples whose similarity exceeds a threshold. The results of performance experiments showed that the strategy of DP and SVM is superior to the other existing data reduction strategies (e.g., audit reduction, rule extraction, and feature selection), and that the detection model based on DP and SVM outperforms those based on data mining, soft computing, and hierarchical principal component analysis neural networks.
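
    A minimal sketch of the correlation-threshold attribute removal followed by SVM training, assuming scikit-learn; the 0.9 threshold and the synthetic data are illustrative, and the similarity-based sample-removal step is omitted:

    ```python
    import numpy as np
    from sklearn.svm import SVC

    def drop_correlated(X, threshold=0.9):
        """Keep only attributes whose absolute correlation with every
        previously kept attribute stays at or below `threshold`."""
        corr = np.abs(np.corrcoef(X, rowvar=False))
        keep = []
        for j in range(X.shape[1]):
            if all(corr[j, k] <= threshold for k in keep):
                keep.append(j)
        return keep

    # usage sketch with a synthetic stand-in for the 41 connection attributes
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 41))
    y = rng.integers(0, 2, 200)
    cols = drop_correlated(X)
    clf = SVC(kernel="rbf").fit(X[:, cols], y)   # train on the reduced set
    ```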

  12. GNSS Spoofing Detection Based on Signal Power Measurements: Statistical Analysis

    Directory of Open Access Journals (Sweden)

    V. Dehghanian

    2012-01-01

    Full Text Available A threat to GNSS receivers is posed by a spoofing transmitter that emulates authentic signals but with randomized code phase and Doppler values over a small range. Such spoofing signals can result in large navigational solution errors that are passed on to the unsuspecting user with potentially dire consequences. An effective spoofing detection technique, based on signal power measurements, is developed in this paper and can be readily applied to present consumer-grade GNSS receivers with minimal firmware changes. An extensive statistical analysis is carried out based on formulating a multi-hypothesis detection problem. Expressions are developed to devise a set of thresholds required for signal detection and identification. The detection processing methods developed are further manipulated to exploit incidental antenna motion, arising from user interaction with a GNSS handheld receiver, to further enhance the detection performance of the proposed algorithm. The statistical analysis supports the effectiveness of the proposed spoofing detection technique under various multipath conditions.

  13. Windshear detection radar signal processing studies

    Science.gov (United States)

    Baxa, Ernest G., Jr.

    1993-01-01

    This final report briefly summarizes research work at Clemson in the Radar Systems Laboratory under the NASA Langley Research Grant NAG-1-928 in support of the Antenna and Microwave Branch, Guidance and Control Division, program to develop airborne sensor technology for the detection of low altitude windshear. A bibliography of all publications generated by Clemson personnel is included. An appendix provides abstracts of all publications.

  14. Crack Length Detection by Digital Image Processing

    DEFF Research Database (Denmark)

    Lyngbye, Janus; Brincker, Rune

    1990-01-01

    It is described how digital image processing is used for measuring the length of fatigue cracks. The system is installed in a personal computer equipped with image processing hardware and performs automated measuring on plane metal specimens used in fatigue testing. Normally one cannot achieve a resolution better than that of the image processing equipment. To overcome this problem an extrapolation technique is used, resulting in a better resolution. The system was tested on a specimen loaded with different loads. The error σa was less than 0.031 mm, which is of the same size as human measuring with...

  15. Crack Detection by Digital Image Processing

    DEFF Research Database (Denmark)

    Lyngbye, Janus; Brincker, Rune

    It is described how digital image processing is used for measuring the length of fatigue cracks. The system is installed in a personal computer equipped with image processing hardware and performs automated measuring on plane metal specimens used in fatigue testing. Normally one cannot achieve a resolution better than that of the image processing equipment. To overcome this problem an extrapolation technique is used, resulting in a better resolution. The system was tested on a specimen loaded with different loads. The error σa was less than 0.031 mm, which is of the same size as human measuring with...

  16. Crack Detection with Lamb Wave Wavenumber Analysis

    Science.gov (United States)

    Tian, Zhenhua; Leckey, Cara; Rogge, Matt; Yu, Lingyu

    2013-01-01

    In this work, we present our study of Lamb wave crack detection using wavenumber analysis. The aim is to demonstrate the application of wavenumber analysis to 3D Lamb wave data to enable damage detection. The 3D wavefields (including vx, vy and vz components) in the time-space domain contain a wealth of information regarding the propagating waves in a damaged plate. For crack detection, three wavenumber analysis techniques are used: (i) the two-dimensional Fourier transform (2D-FT), which can transform the time-space wavefield into a frequency-wavenumber representation while losing the spatial information; (ii) the short-space 2D-FT, which can obtain the frequency-wavenumber spectra at various spatial locations, resulting in a space-frequency-wavenumber representation; (iii) local wavenumber analysis, which can provide the distribution of the effective wavenumbers at different locations. All of these concepts are demonstrated through a numerical simulation example of an aluminum plate with a crack. The 3D elastodynamic finite integration technique (EFIT) was used to obtain the 3D wavefields, of which the vz (out-of-plane) wave component is compared with the experimental measurement obtained from a scanning laser Doppler vibrometer (SLDV) for verification purposes. The experimental and simulated results are found to be in close agreement. The application of wavenumber analysis on 3D EFIT simulation data shows the effectiveness of the analysis for crack detection. Keywords: Lamb wave, crack detection, wavenumber analysis, EFIT modeling
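
    The first of the three techniques, the 2D Fourier transform from time-space to frequency-wavenumber, can be sketched in a few lines of NumPy; the sampling intervals dt and dx are assumed known from the measurement scan:

    ```python
    import numpy as np

    def frequency_wavenumber(wavefield, dt, dx):
        """Two-dimensional Fourier transform of a time-space wavefield.

        `wavefield` has shape (n_t, n_x): out-of-plane velocity sampled at
        interval `dt` in time and `dx` along a spatial line. Returns the
        magnitude spectrum with its frequency and wavenumber axes; a crack
        shows up as energy at wavenumbers deviating from the pristine-plate
        dispersion curves.
        """
        spec = np.fft.fftshift(np.fft.fft2(wavefield))
        freqs = np.fft.fftshift(np.fft.fftfreq(wavefield.shape[0], d=dt))
        knums = np.fft.fftshift(np.fft.fftfreq(wavefield.shape[1], d=dx))
        return np.abs(spec), freqs, knums
    ```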

  17. Real-Time Plasma Process Condition Sensing and Abnormal Process Detection

    Directory of Open Access Journals (Sweden)

    Ryan Yang

    2010-06-01

    Full Text Available The plasma process is often used in the fabrication of semiconductor wafers. However, due to the lack of real-time etching control, the process may yield unacceptable performance, leading to significant waste and lower wafer yield. In order to maximize the product wafer yield, timely and accurate detection of faults or abnormal conditions in a plasma reactor is needed. Optical emission spectroscopy (OES) is one of the most frequently used metrologies for in-situ process monitoring. Even though OES has the advantage of non-invasiveness, it provides a huge amount of information. As a result, the data analysis of OES becomes a big challenge. To accomplish real-time detection, this work employed the sigma matching technique on the time series of the OES full-spectrum intensity. First, the response model of a healthy plasma spectrum was developed. Then, we defined a matching rate as an indicator for comparing the difference between the tested wafer's response and the healthy sigma model. The experimental results showed that the proposed method can detect process faults in real time, even in plasma etching tools.
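
    A plausible minimal reading of the sigma-matching indicator, assuming NumPy: build mean/σ bands from healthy runs and report the fraction of points a test wafer keeps inside the bands. The k=3 band width and the 0.95 alarm limit are assumptions, not the paper's values:

    ```python
    import numpy as np

    def sigma_model(healthy_runs):
        """Mean and standard deviation over healthy runs.

        `healthy_runs` has shape (n_runs, n_time, n_channels): full-spectrum
        OES intensities from known-good wafers.
        """
        return healthy_runs.mean(axis=0), healthy_runs.std(axis=0)

    def matching_rate(test_run, mu, sd, k=3.0):
        """Fraction of (time, channel) points falling inside mu ± k*sd.

        A healthy wafer should score close to 1, while a fault drives the
        rate down.
        """
        inside = np.abs(test_run - mu) <= k * sd
        return inside.mean()

    # usage: flag the wafer when matching_rate(run, mu, sd) < 0.95 (assumed limit)
    ```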

  18. Matrix Characterization in Threat Material Detection Processes

    Science.gov (United States)

    Obhodas, J.; Sudac, D.; Valkovic, V.

    2009-03-01

    Matrix characterization in threat material detection is of utmost importance, since it generates the background against which the threat material signal has to be identified. Threat materials (explosives, chemical warfare agents, …) are usually contained within a small volume inside large volumes of variable matrices. We have studied the influence of matrix materials on the capability of neutron systems to identify hidden threat material. Three specific scenarios are considered in some detail: case 1, contraband material in sea containers; case 2, explosives in soil (landmines); case 3, explosives and chemical warfare agents on the sea bottom. Effects of container cargo material on a tagged neutron system are seen in the increase of gamma background and the decrease of neutron beam intensity. Detection of landmines is more complex because of variable soil properties. We have studied in detail space and time variations of soil elemental compositions, and in particular the hydrogen content (humidity). Of special interest are ammunition and chemical warfare agents on the sea bottom, dumping sites and leftovers from previous conflicts (WW-I, WW-II and local). In this case the sea sediment is the background source and its role is similar to the role of the soil in landmine detection. In addition to the geochemical cycling of chemical elements in a semi-enclosed sea, like the Adriatic Sea, one has to consider also anthropogenic influence, especially when studying small-scale variations in concentration levels. Some preliminary experimental results obtained with a tagged neutron sensor inside an underwater vehicle are presented, as well as data on sediment characterization by X-ray fluorescence.

  19. Matrix Characterization in Threat Material Detection Processes

    International Nuclear Information System (INIS)

    Matrix characterization in threat material detection is of utmost importance, since it generates the background against which the threat material signal has to be identified. Threat materials (explosives, chemical warfare agents, ...) are usually contained within a small volume inside large volumes of variable matrices. We have studied the influence of matrix materials on the capability of neutron systems to identify hidden threat material. Three specific scenarios are considered in some detail: case 1, contraband material in sea containers; case 2, explosives in soil (landmines); case 3, explosives and chemical warfare agents on the sea bottom. Effects of container cargo material on a tagged neutron system are seen in the increase of gamma background and the decrease of neutron beam intensity. Detection of landmines is more complex because of variable soil properties. We have studied in detail space and time variations of soil elemental compositions, and in particular the hydrogen content (humidity). Of special interest are ammunition and chemical warfare agents on the sea bottom, dumping sites and leftovers from previous conflicts (WW-I, WW-II and local). In this case the sea sediment is the background source and its role is similar to the role of the soil in landmine detection. In addition to the geochemical cycling of chemical elements in a semi-enclosed sea, like the Adriatic Sea, one has to consider also anthropogenic influence, especially when studying small-scale variations in concentration levels. Some preliminary experimental results obtained with a tagged neutron sensor inside an underwater vehicle are presented, as well as data on sediment characterization by X-ray fluorescence.

  20. Full motion detection system with post-processing

    OpenAIRE

    Douadi, L.; Khoudour, L.; Chaari, A.; Boonaert, J.

    2010-01-01

    This paper presents a full motion detection system with post-processing applied to video surveillance. Motion detection is performed based on background subtraction (BGS). Our purpose is to show how appropriate post-processing improves the segmentation result provided by a BGS technique from the literature. First, BGS is performed using the codebook algorithm [8]. Post-processing is then applied to the raw segmented images to enhance the segmentation quality. The proposed post-processing proce...

  1. Processing of Graphene combining Optical Detection and Scanning Probe Lithography

    Directory of Open Access Journals (Sweden)

    Zimmermann Sören

    2015-01-01

    Full Text Available This paper presents an experimental setup tailored for robotic processing of graphene with in-situ vision-based control. A robust graphene detection approach is presented, applying multiple image processing operations to the visual feedback provided by a high-resolution light microscope. Detected graphene flakes can be modified using a scanning-probe-based lithographical process that is directly linked to the in-situ optical images. The results of this process are discussed with respect to further application scenarios.

  2. SENTIMENT ANALYSIS FOR ONLINE FORUMS HOTSPOT DETECTION

    Directory of Open Access Journals (Sweden)

    K. Nirmala Devi

    2012-01-01

    Full Text Available User-generated content on the web grows rapidly in this emergent information age. Evolutionary changes in technology make use of such information to capture the user's essence, and finally the useful information is exposed to information seekers. Most of the existing research on text information processing focuses on the factual domain rather than the opinion domain. Text mining plays a vital role in online forum opinion mining, but opinion mining from online forums is much more difficult than pure text processing due to their semi-structured characteristics. In this paper we detect online hotspot forums by applying sentiment analysis to the text data available in each forum. This approach analyzes the forum text data and computes a sentiment value for each piece of text. The proposed approach combines the K-means clustering and support vector machine (SVM) classification algorithms to group the forums into two clusters, hotspot forums and non-hotspot forums, within the current time span. The experiment shows that K-means and SVM together achieve highly consistent results. The prediction result of SVM is also compared with other classifiers such as Naïve Bayes and decision trees, and among them SVM performs best.
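
    A toy sketch of the K-means plus SVM combination, assuming scikit-learn and synthetic per-forum sentiment features; the feature definitions are stand-ins for the paper's sentiment values, not its actual data:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.svm import SVC
    from sklearn.metrics import accuracy_score

    # synthetic per-forum features: [mean sentiment value, post volume]
    rng = np.random.default_rng(2)
    X = np.vstack([rng.normal([-0.6, 50], [0.2, 10], (40, 2)),   # assumed hotspots
                   rng.normal([0.2, 20], [0.2, 10], (60, 2))])   # assumed non-hotspots

    # unsupervised grouping into hotspot / non-hotspot clusters
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

    # supervised model trained on the cluster labels; agreement between the
    # two methods mirrors the paper's K-means vs. SVM consistency check
    svm = SVC(kernel="rbf").fit(X, labels)
    print("K-means vs SVM agreement:", accuracy_score(labels, svm.predict(X)))
    ```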

  3. Measurement Bias Detection through Factor Analysis

    Science.gov (United States)

    Barendse, M. T.; Oort, F. J.; Werner, C. S.; Ligtvoet, R.; Schermelleh-Engel, K.

    2012-01-01

    Measurement bias is defined as a violation of measurement invariance, which can be investigated through multigroup factor analysis (MGFA), by testing across-group differences in intercepts (uniform bias) and factor loadings (nonuniform bias). Restricted factor analysis (RFA) can also be used to detect measurement bias. To also enable nonuniform…

  4. Certification-Based Process Analysis

    Science.gov (United States)

    Knight, Russell L.

    2013-01-01

    Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Process means any of the standard representations of a production flow; in this case, any individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one ensures that a product works as intended, and contains no flaws.

  5. Social network analysis community detection and evolution

    CERN Document Server

    Missaoui, Rokia

    2015-01-01

    This book is devoted to recent progress in social network analysis with a high focus on community detection and evolution. The eleven chapters cover the identification of cohesive groups, core components and key players either in static or dynamic networks of different kinds and levels of heterogeneity. Other important topics in social network analysis such as influential detection and maximization, information propagation, user behavior analysis, as well as network modeling and visualization are also presented. Many studies are validated through real social networks such as Twitter. This edit

  6. Signal processing in cryogenic particle detection

    Energy Technology Data Exchange (ETDEWEB)

    Yuryev, Y.N. [Department of Physics and Astronomy, Seoul National University, Seoul (Korea, Republic of); Korea Research Institute of Standards and Science (KRISS), Daejeon (Korea, Republic of); Jang, Y.S. [Korea Research Institute of Standards and Science (KRISS), Daejeon (Korea, Republic of); Kim, S.K. [Department of Physics and Astronomy, Seoul National University, Seoul (Korea, Republic of); Lee, K.B.; Lee, M.K. [Korea Research Institute of Standards and Science (KRISS), Daejeon (Korea, Republic of); Lee, S.J. [Department of Physics and Astronomy, Seoul National University, Seoul (Korea, Republic of); Korea Research Institute of Standards and Science (KRISS), Daejeon (Korea, Republic of); Yoon, W.S. [Korea Research Institute of Standards and Science (KRISS), Daejeon (Korea, Republic of); Kim, Y.H., E-mail: yhkim@kriss.re.k [Korea Research Institute of Standards and Science (KRISS), Daejeon (Korea, Republic of)

    2011-04-11

    We describe a signal-processing program for a data acquisition system for cryogenic particle detectors. The program is based on an optimal-filtering method for high-resolution measurement of calorimetric signals with a significant amount of noise of unknown origin and non-stationary behavior. The program was applied to improve the energy resolution of the alpha particle spectrum of an 241Am source.

  7. Signal processing in cryogenic particle detection

    Science.gov (United States)

    Yuryev, Y. N.; Jang, Y. S.; Kim, S. K.; Lee, K. B.; Lee, M. K.; Lee, S. J.; Yoon, W. S.; Kim, Y. H.

    2011-04-01

    We describe a signal-processing program for a data acquisition system for cryogenic particle detectors. The program is based on an optimal-filtering method for high-resolution measurement of calorimetric signals with a significant amount of noise of unknown origin and non-stationary behavior. The program was applied to improve the energy resolution of the alpha particle spectrum of an 241Am source.

  8. Signal processing in cryogenic particle detection

    International Nuclear Information System (INIS)

    We describe a signal-processing program for a data acquisition system for cryogenic particle detectors. The program is based on an optimal-filtering method for high-resolution measurement of calorimetric signals with a significant amount of noise of unknown origin and non-stationary behavior. The program was applied to improve the energy resolution of the alpha particle spectrum of an 241Am source.
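
    The optimal-filtering method mentioned in these records can be sketched as a frequency-domain matched filter in which the pulse template is weighted by the inverse noise power spectral density. A minimal NumPy sketch under the assumption of a known template and noise PSD; it is the textbook construction, not the KRISS program itself:

    ```python
    import numpy as np

    def optimal_filter_amplitude(pulse, template, noise_psd):
        """Estimate pulse amplitude with a frequency-domain optimal filter.

        `template` is the normalized expected pulse shape and `noise_psd` the
        noise power spectral density sampled at the rfft bins (so its length
        must be len(pulse)//2 + 1), assumed known from baseline records. The
        template is weighted by 1/PSD so noisy frequency bins contribute less.
        """
        S = np.fft.rfft(template)
        V = np.fft.rfft(pulse)
        weight = np.conj(S) / noise_psd
        return np.real(np.sum(weight * V) / np.sum(weight * S))
    ```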

  9. Analysis of weld seam uniformity through temperature distribution by spatially resolved detector elements in the wavelength range of 0.3μm to 5μm for the detection of structural changing heating and cooling processes

    Science.gov (United States)

    Lempe, B.; Maschke, R.; Rudek, F.; Baselt, T.; Hartmann, P.

    2016-03-01

    Online process control systems often detect temperatures only at a local area of the machining point and determine an integrated value. In order to determine the proper welding quality and the absence of defects, such as temperature-induced stress cracks, it is necessary to perform time- and space-resolved measurements before, during and after the production process. The system under development consists of a beam-splitting unit which divides the electromagnetic radiation of the heated component between two different sensor types. For high temperatures, a sensor is used which is sensitive in the visible spectrum and has a dynamic range of 120 dB [1]. Thus, very high intensity differences can be displayed and a direct analysis of the temperature profile of the weld spots is possible [2]. A second sensor operates in the wavelength range from 1 micron to 5 microns and allows the determination of temperatures from approximately 200°C [3]. At the beginning of a welding process, the heat-up phase of the metal is critical to the resulting weld quality. If a defined temperature range is exceeded too quickly, the risk of cracking is significantly increased [4]. During the welding process, the thermal supervision of the central processing location is decisive for a highly secure weld. In the border areas, as well as immediately after the welding process, cooling processes in particular are crucial for the homogeneity of the results. In order to obtain sufficiently accurate resolution of the dynamic heating and cooling processes, the system can capture up to 500 frames per second.

  10. Processing of GPR data from NIITEK landmine detection system

    Science.gov (United States)

    Legarsky, Justin J.; Broach, J. T.; Bishop, Steven S.

    2003-09-01

    In this paper, a signal processing approach for wide-bandwidth ground-penetrating-radar imagery from Non-Intrusive Inspection Technology, Incorporated (NIITEK) vehicle-mounted landmine detection sensor is investigated. The approach consists of a sequence of processing steps, which include signal filtering, image enhancement and detection. Filtering strategies before detection aid in image visualization by reducing ground bounce, systematic effects and redundant signals. Post-filter image processing helps by enhancing landmine signatures in the NIITEK radar imagery. Study results from applying this signal processing approach are presented for test minefield lane data, which were collected during 2002 from an Army test site.
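
    Ground-bounce reduction, the first filtering step mentioned above, is commonly approximated by subtracting the mean A-scan from every trace. A minimal NumPy sketch of that common baseline strategy, not NIITEK's actual filter chain:

    ```python
    import numpy as np

    def remove_ground_bounce(bscan):
        """Suppress ground bounce in a GPR B-scan by mean-trace subtraction.

        `bscan` has shape (n_scans, n_samples): one A-scan per row. The
        ground reflection is nearly identical across scan positions, so
        subtracting the average trace removes it while leaving localized
        target responses, such as landmine signatures, largely intact.
        """
        return bscan - bscan.mean(axis=0, keepdims=True)
    ```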

  11. Command Process Modeling & Risk Analysis

    Science.gov (United States)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes. It is important to understand the relative significance of each of these causes when making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command process. These models include simulation analysis and probabilistic risk assessment models.

  12. Risk analysis: opening the process

    International Nuclear Information System (INIS)

    This conference on risk analysis took place in Paris, 11-14 October 1999. Over 200 papers were presented in the seven following sessions: perception; environment and health; persuasive risks; objects and products; personal and collective involvement; assessment and valuation; management. A rational approach to risk analysis has been developed in the last three decades. Techniques for risk assessment have been thoroughly enhanced, risk management approaches have been developed, decision making processes have been clarified, and the social dimensions of risk perception and management have been investigated. Nevertheless this construction is being challenged by recent events which reveal how deficits in stakeholder involvement, openness and democratic procedures can undermine risk management actions. Indeed, the global process and most components of risk analysis may be radically called into question. Food safety has lately been a prominent issue, but new debates appear, or old debates are revisited, in the domains of public health, consumer products safety, waste management, environmental risks, nuclear installations, automobile safety and pollution. To meet the growing pressures for efficiency, openness, accountability, and multi-partner communication in risk analysis, institutional changes are underway in many European countries. However, the need for stakeholders to develop better insight into the process may lead to an evolution of all the components of risk analysis, even in its most 'technical' steps. For stakeholders of different professional backgrounds, political projects, and responsibilities, risk identification procedures must be rendered understandable, quantitative risk assessment must be intelligible and verifiable. Public perceptions and behavioural preferences must be identified and accommodated in action proposals, ranging from countermeasures to educational programs to insurance mechanisms. Management formats must be open to local and political input and other types of operational feedback. (authors)

  13. Risk analysis: opening the process

    Energy Technology Data Exchange (ETDEWEB)

    Hubert, Ph. [CEA/Fontenay-aux-Roses, Inst. de Protection et de Surete Nucleaire, IPSN, 92 (France); Mays, C. [Institut Symlog, 94 - Cachan (France)

    1998-07-01

    This conference on risk analysis took place in Paris, 11-14 October 1999. Over 200 papers were presented in the seven following sessions: perception; environment and health; persuasive risks; objects and products; personal and collective involvement; assessment and valuation; management. Most of the papers are analysed in the INIS database and twelve in the ETDE database. A rational approach to risk analysis has been developed in the last three decades. Techniques for risk assessment have been thoroughly enhanced, risk management approaches have been developed, decision making processes have been clarified, and the social dimensions of risk perception and management have been investigated. Nevertheless this construction is being challenged by recent events which reveal how deficits in stakeholder involvement, openness and democratic procedures can undermine risk management actions. Indeed, the global process and most components of risk analysis may be radically called into question. Food safety has lately been a prominent issue, but new debates appear, or old debates are revisited, in the domains of public health, consumer products safety, waste management, environmental risks, nuclear installations, automobile safety and pollution. To meet the growing pressures for efficiency, openness, accountability, and multi-partner communication in risk analysis, institutional changes are underway in many European countries. However, the need for stakeholders to develop better insight into the process may lead to an evolution of all the components of risk analysis, even in its most 'technical' steps. For stakeholders of different professional backgrounds, political projects, and responsibilities, risk identification procedures must be rendered understandable, quantitative risk assessment must be intelligible and verifiable. Public perceptions and behavioural preferences must be identified and accommodated in action proposals, ranging from countermeasures to educational

  14. Preliminary hazards analysis -- vitrification process

    Energy Technology Data Exchange (ETDEWEB)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  15. Preliminary hazards analysis -- vitrification process

    International Nuclear Information System (INIS)

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry, et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment

  16. Image Processing and Data Analysis

    Science.gov (United States)

    Starck, Jean-Luc; Murtagh, Fionn D.; Bijaoui, Albert

    1998-07-01

    Powerful techniques have been developed in recent years for the analysis of digital data, especially the manipulation of images. This book provides an in-depth introduction to a range of these innovative, avant-garde data-processing techniques. It develops the reader's understanding of each technique and then shows with practical examples how they can be applied to improve the skills of graduate students and researchers in astronomy, electrical engineering, physics, geophysics and medical imaging. What sets this book apart from others on the subject is the complementary blend of theory and practical application. Throughout, it is copiously illustrated with real-world examples from astronomy, electrical engineering, remote sensing and medicine. It also shows how many, more traditional, methods can be enhanced by incorporating the new wavelet and multiscale methods into the processing. For graduate students and researchers already experienced in image processing and data analysis, this book provides an indispensable guide to a wide range of exciting and original data-analysis techniques.

  17. Detection of insincere grips: multivariate analysis approach

    OpenAIRE

    Dasari, B.D.; Leung, K.F.

    2004-01-01

    BACKGROUND Smith and Chengalur were successful in using the grip test and the cut-off criterion method to detect fake grips in healthy subjects and subjects with hand injuries. The purpose of this study is to test whether methods other than the cut-off method, i.e. discriminant analysis and logistic regression methods, can be used to provide a more accurate detection of fake grips, with the use of the sustained grip test. METHOD Two groups of subjects were recruited. Group one consisted of 40 healthy subjects ...

  18. A visual analysis of the process of process modeling

    OpenAIRE

    Claes, J Jan; Vanderfeesten, ITP Irene; Pinggera, J.; Reijers, HA Hajo; Weber, B.; Poels, G

    2015-01-01

    The construction of business process models has become an important requisite in the analysis and optimization of processes. The success of the analysis and optimization efforts heavily depends on the quality of the models. Therefore, a research domain emerged that studies the process of process modeling. This paper contributes to this research by presenting a way of visualizing the different steps a modeler undertakes to construct a process model, in a so-called process of process modeling C...

  19. Instrumenting the Intelligence Analysis Process

    Energy Technology Data Exchange (ETDEWEB)

    Hampson, Ernest; Cowley, Paula J.

    2005-05-02

    The Advanced Research and Development Activity initiated the Novel Intelligence from Massive Data (NIMD) program to develop advanced analytic technologies and methodologies. In order to support this objective, researchers and developers need to understand what analysts do and how they do it. In the past, this knowledge generally was acquired through subjective feedback from analysts. NIMD established the innovative Glass Box Analysis (GBA) Project to instrument a live intelligence mission and unobtrusively capture and objectively study the analysis process. Instrumenting the analysis process requires tailor-made software hooks that grab data from a myriad of disparate application operations and feed it into a complex relational database and hierarchical file store to collect, store, retrieve, and distribute analytic data in a manner that maximizes researchers’ understanding. A key to success is determining the correct data to collect and aggregating low-level data into meaningful analytic events. This paper will examine how the GBA team solved some of these challenges, continues to address others, and supports a growing user community in establishing their own GBA environments and/or studying the data generated by GBA analysts working in the Glass Box.

  20. Generating functional analysis of CDMA detection dynamics

    International Nuclear Information System (INIS)

    We investigate the detection dynamics of the parallel interference canceller (PIC) for code-division multiple-access (CDMA) multiuser detection, applied to a randomly spread, fully synchronous base-band uncoded CDMA channel model with additive white Gaussian noise (AWGN) under perfect power control in the large-system limit. It is known that the predictions of the density evolution (DE) can fairly explain the detection dynamics only in the case where the detection dynamics converge. At transients, though, the predictions of DE systematically deviate from computer simulation results. Furthermore, when the detection dynamics fail to converge, the deviation of the predictions of DE from the results of numerical experiments becomes large. As an alternative, generating functional analysis (GFA) can take into account the effect of the Onsager reaction term exactly and does not need the Gaussian assumption of the local field. We present GFA to evaluate the detection dynamics of PIC for CDMA multiuser detection. The predictions of GFA exhibit good consistency with the computer simulation result for any condition, even if the dynamics fail to converge.

  1. Signal processing techniques for sodium boiling noise detection

    International Nuclear Information System (INIS)

    At the Specialists' Meeting on Sodium Boiling Detection organized by the International Working Group on Fast Reactors (IWGFR) of the International Atomic Energy Agency at Chester in the United Kingdom in 1981 various methods of detecting sodium boiling were reported. But, it was not possible to make a comparative assessment of these methods because the signal condition in each experiment was different from others. That is why participants of this meeting recommended that a benchmark test should be carried out in order to evaluate and compare signal processing methods for boiling detection. Organization of the Co-ordinated Research Programme (CRP) on signal processing techniques for sodium boiling noise detection was also recommended at the 16th meeting of the IWGFR. The CRP on Signal Processing Techniques for Sodium Boiling Noise Detection was set up in 1984. Eight laboratories from six countries have agreed to participate in this CRP. The overall objective of the programme was the development of reliable on-line signal processing techniques which could be used for the detection of sodium boiling in an LMFBR core. During the first stage of the programme a number of existing processing techniques used by different countries have been compared and evaluated. In the course of further work, an algorithm for implementation of this sodium boiling detection system in the nuclear reactor will be developed. It was also considered that the acoustic signal processing techniques developed for boiling detection could well make a useful contribution to other acoustic applications in the reactor. This publication consists of two parts. Part I is the final report of the co-ordinated research programme on signal processing techniques for sodium boiling noise detection. Part II contains two introductory papers and 20 papers presented at four research co-ordination meetings since 1985. A separate abstract was prepared for each of these 22 papers. Refs, figs and tabs

  2. HIGH RESOLUTION RESISTIVITY LEAK DETECTION DATA PROCESSING & EVALUATION METHODS & REQUIREMENTS

    Energy Technology Data Exchange (ETDEWEB)

    SCHOFIELD JS

    2007-10-04

    This document has two purposes: (1) describe how data generated by High Resolution Resistivity (HRR) leak detection (LD) systems deployed during single-shell tank (SST) waste retrieval operations are processed and evaluated; and (2) provide the basic review requirements for HRR data when HRR is deployed as a leak detection method during SST waste retrievals.

  3. Fault Management: Degradation Signature Detection, Modeling, and Processing Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Fault to Failure Progression (FFP) signature modeling and processing is a new method for applying condition-based signal data to detect degradation, to identify...

  4. Visualizing the analysis process: CZSaw's History View

    OpenAIRE

    Kadivar, Nazanin

    2011-01-01

    Capturing and visualizing the data analysis process is a growing research domain. Visual Analytics tool designers try to understand the analysis process in order to provide better tools. Existing research in process visualization suggests that capturing and visualizing the history of the analysis process is an effective form of process visualization. CZSaw is a Visual Analytics tool that provides visual representations of both data and the analysis process. In this thesis, we discuss the desi...

  5. Gaussian processes for state space models and change point detection

    OpenAIRE

    Turner, Ryan Darby

    2012-01-01

    This thesis details several applications of Gaussian processes (GPs) for enhanced time series modeling. We first cover different approaches for using Gaussian processes in time series problems. These are extended to the state space approach to time series in two different problems. We also combine Gaussian processes and Bayesian online change point detection (BOCPD) to increase the generality of the Gaussian process time series methods. These methodologies are evaluated on predict...

  6. Processing of radionuclide cystography using factor analysis of dynamic structures

    International Nuclear Information System (INIS)

    Radionuclide cystography allows the detection of vesico-ureteral reflux. Numerical image processing using the region of interest (ROI) method can improve the sensitivity of the detection, but with some disadvantages, particularly a lack of reproducibility. The use of factor analysis of dynamic structures can avoid these defects and enables vesico-ureteral reflux to be analysed automatically.

  7. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, Markov Random Field (MRF) modeling, watershed segmentation and merging techniques was presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmentation result is obtained based on the K-means clustering technique and the minimum distance. The region process is then modeled by an MRF to obtain an image that contains regions of different intensity. The gradient values are calculated and the watershed technique is applied. The DIS value is computed for each pixel to define all the edges (weak or strong) in the image, yielding the DIS map. This map serves as prior knowledge about the likelihood of region boundaries for the next step (MRF), which produces an image containing all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained by a merging process based on averaged intensity mean values. Common edge detectors are then applied to the MRF-segmented image and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.
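
    As a rough illustration of the first two stages described above (the edge-strength map and the K-means initial segmentation), the following Python sketch approximates the DIS map with a Sobel gradient magnitude and clusters gray levels with a tiny K-means; the toy image, cluster count and iteration budget are assumptions for the example, not values from the paper.

```python
import numpy as np

def sobel_dis_map(img):
    """Approximate a Difference-In-Strength (DIS) map with Sobel gradient magnitude."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    pad = np.pad(img.astype(float), 1, mode="edge")
    gx = np.zeros_like(img, dtype=float)
    gy = np.zeros_like(img, dtype=float)
    for i in range(3):
        for j in range(3):
            win = pad[i:i + img.shape[0], j:j + img.shape[1]]
            gx += kx[i, j] * win
            gy += ky[i, j] * win
    return np.hypot(gx, gy)

def kmeans_segment(img, k=3, iters=20):
    """Initial segmentation: cluster gray levels with K-means, assign by minimum distance."""
    levels = img.astype(float).ravel()
    centers = np.linspace(levels.min(), levels.max(), k)  # simple initialisation
    for _ in range(iters):
        labels = np.argmin(np.abs(levels[:, None] - centers[None, :]), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = levels[labels == c].mean()
    return labels.reshape(img.shape), centers

# Toy image: two flat regions plus noise.
rng = np.random.default_rng(0)
img = np.hstack([np.full((64, 32), 50.0), np.full((64, 32), 200.0)])
img += rng.normal(0, 5, img.shape)

dis = sobel_dis_map(img)
labels, centers = kmeans_segment(img, k=2)
print("cluster centers:", centers.round(1))
print("mean DIS at region boundary:", dis[:, 31].mean().round(1))
```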

  8. Nonlinear processing of radar data for landmine detection

    Science.gov (United States)

    Bartosz, Elizabeth E.; DeJong, Keith; Duvoisin, Herbert A.; Solomon, Geoff Z.; Steinway, William J.; Warren, Albert

    2004-09-01

    Outstanding landmine detection has been achieved by the Handheld Standoff Mine Detection System (HSTAMIDS system) in government-run field tests. The use of anomaly detection using principal component analysis (PCA) on the return of ground penetrating radar (GPR) coupled with metal detection is the key to the success of the HSTAMIDS-like system algorithms. Indications of nonlinearities and asymmetries in Humanitarian Demining (HD) data point to modifications to the current PCA algorithm that might prove beneficial. Asymmetries in the distribution of PCA projections of field data have been quantified in Humanitarian Demining (HD) data. An initial correction for the observed asymmetries has improved the False Alarm Rate (FAR) on this data.
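
    The HSTAMIDS algorithms themselves are not spelled out in the abstract, so the following NumPy sketch only illustrates the generic idea of PCA-based anomaly detection: fit a principal subspace to background returns and score each new sample by its energy outside that subspace. The data, subspace size and "mine-like" perturbation are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Background training data: GPR-like feature vectors that live near a
# low-dimensional subspace (clutter modes) plus a little sensor noise.
latent = rng.normal(0, 1, (500, 4))
mixing = rng.normal(0, 1, (4, 16))
background = latent @ mixing + 0.1 * rng.normal(0, 1, (500, 16))

# Fit PCA on the background via SVD of the centered data.
mean = background.mean(axis=0)
_, _, vt = np.linalg.svd(background - mean, full_matrices=False)
basis = vt[:4]                                # dominant background subspace

def anomaly_score(x):
    """Energy of x outside the background PCA subspace."""
    centered = x - mean
    projection = centered @ basis.T @ basis
    return float(np.sum((centered - projection) ** 2))

clean = background[0]
target = clean + 2.0 * rng.normal(0, 1, 16)   # synthetic 'mine-like' anomaly
print("clean score :", round(anomaly_score(clean), 3))
print("target score:", round(anomaly_score(target), 3))
```

    The asymmetries mentioned above would show up as skew in the distribution of the background projections, which is what motivates corrections to the plain PCA score.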

  9. Sequential Detection of Fission Processes for Harbor Defense

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J V; Walston, S E; Chambers, D H

    2015-02-12

    With the large increase in terrorist activities throughout the world, the timely and accurate detection of special nuclear material (SNM) has become an extremely high priority for many countries concerned with national security. The detection of radionuclide contraband based on their γ-ray emissions has been attacked vigorously with some interesting and feasible results; however, the fission process of SNM has not received as much attention due to its inherent complexity and required predictive nature. In this paper, on-line, sequential Bayesian detection and estimation (parameter) techniques to rapidly and reliably detect unknown fissioning sources with high statistical confidence are developed.
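
    The paper's own estimators are Bayesian and physics-based; as a simplified stand-in for the sequential idea, the sketch below implements Wald's sequential probability ratio test (SPRT) on Poisson counts, deciding between a background-only rate and a background-plus-source rate as data arrive. The rates and error levels are illustrative assumptions.

```python
import math
import numpy as np

def sprt_poisson(counts, lam0, lam1, alpha=0.01, beta=0.01):
    """Wald's SPRT for H0: rate lam0 vs H1: rate lam1 on Poisson counts per interval."""
    upper = math.log((1 - beta) / alpha)      # cross upward -> accept H1 (source)
    lower = math.log(beta / (1 - alpha))      # cross downward -> accept H0 (background)
    llr = 0.0
    for n, k in enumerate(counts, start=1):
        # Log-likelihood ratio increment for one Poisson observation.
        llr += k * math.log(lam1 / lam0) - (lam1 - lam0)
        if llr >= upper:
            return "source detected", n
        if llr <= lower:
            return "background only", n
    return "undecided", len(counts)

rng = np.random.default_rng(2)
lam_bg, lam_src = 5.0, 8.0                    # counts per second, assumed values
counts = rng.poisson(lam_src, 200)            # data actually contain a source
print(sprt_poisson(counts, lam_bg, lam_src))
```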

  10. Molecular sieves analysis by elastic recoil detection

    International Nuclear Information System (INIS)

    The possibility of water determination in zeolites via hydrogen detection using elastic recoil detection analysis (ERDA) was investigated. The radiation effect upon the desorption rate of hydrogen in miscellaneous types of zeolites, e.g. Y-Faujasite, ZSM-5, SK, etc., and in a natural clay, e.g. an Algerian bentonite, was discussed. Quantitative measurements were carried out in order to determine the amount and distribution shape of hydrogen in each material. Various explanations dealing with hydration and constitution water in such a crystalline framework were proposed. The experimental results are in good agreement with the corresponding theoretical values.

  11. A signal processing method for the friction-based endpoint detection system of a CMP process

    Energy Technology Data Exchange (ETDEWEB)

    Xu Chi; Guo Dongming; Jin Zhuji; Kang Renke, E-mail: xuchi_dut@163.com [Key Laboratory for Precision and Non-Traditional Machining Technology of Ministry of Education, Dalian University of Technology, Dalian 116024 (China)

    2010-12-15

    A signal processing method for the friction-based endpoint detection system of a chemical mechanical polishing (CMP) process is presented. The signal processing method uses the wavelet threshold denoising method to reduce the noise contained in the measured original signal, extracts the Kalman filter innovation from the denoised signal as the feature signal, and judges the CMP endpoint based on the feature of the Kalman filter innovation sequence during the CMP process. Applying the signal processing method, endpoint detection experiments for the Cu CMP process were carried out. The results show that the signal processing method can judge the endpoint of the Cu CMP process. (semiconductor technology)
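
    A minimal sketch of the two signal-processing stages named above, assuming PyWavelets for the denoising and a scalar random-walk Kalman filter for the innovation sequence; the synthetic friction trace, filter constants and decision threshold are invented for the example rather than taken from the paper.

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=4):
    """Soft-threshold wavelet denoising with the universal threshold."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                            for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[:len(signal)]

def kalman_innovations(z, q=1e-5, r=1.0):
    """Innovation sequence of a scalar random-walk Kalman filter."""
    x, p = z[0], 1.0
    innovations = []
    for zk in z:
        p += q                        # predict
        nu = zk - x                   # innovation: measurement minus prediction
        k = p / (p + r)               # Kalman gain
        x += k * nu                   # update
        p *= 1 - k
        innovations.append(nu)
    return np.array(innovations)

# Synthetic friction trace: a level shift at sample 600 plays the endpoint.
rng = np.random.default_rng(3)
sig = np.concatenate([np.full(600, 1.0), np.full(400, 0.8)])
sig += rng.normal(0, 0.05, sig.size)

nu = kalman_innovations(wavelet_denoise(sig))
hits = np.flatnonzero(np.abs(nu) > 0.1)           # assumed decision threshold
print("endpoint flagged near sample", hits[0] if hits.size else None)
```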

  12. Application of processing in performance analysis of business processes

    OpenAIRE

    Marcelyn, Christy

    2011-01-01

    This thesis discusses the application of process mining concept in performance analysis. A case study is used and the log file is provided in this research. Further, some methodologies, such as the process diagnostic methodology and performance sequence diagram analysis are used to examine the event logs. The ProM framework with plug-ins that supports process performance analysis is used to make this analysis achievable. This thesis will also cover the creation of an event log, which is how i...

  13. Operational analysis for the drug detection problem

    Science.gov (United States)

    Hoopengardner, Roger L.; Smith, Michael C.

    1994-10-01

    New techniques and sensors to identify the molecular, chemical, or elemental structures unique to drugs are being developed under several national programs. However, the challenge faced by U.S. drug enforcement and Customs officials goes far beyond the simple technical capability to detect an illegal drug. Entry points into the U.S. include ports, border crossings, and airports where cargo ships, vehicles, and aircraft move huge volumes of freight. Current technology and personnel are able to physically inspect only a small fraction of the entering cargo containers. The complexity of how to best utilize new technology to aid the detection process and yet not adversely affect the processing of vehicles and time-sensitive cargo is the challenge faced by these officials. This paper describes an ARPA sponsored initiative to develop a simple, yet useful, method for examining the operational consequences of utilizing various procedures and technologies in combination to achieve an 'acceptable' level of detection probability. Since Customs entry points into the U.S. vary from huge seaports to a one lane highway checkpoint between the U.S. and the Canadian or Mexican border, no one system can possibly be right for all points. This approach can examine alternative concepts for using different techniques/systems for different types of entry points. Operational measures reported include the average time to process vehicles and containers, the average and maximum numbers in the system at any time, and the utilization of inspection teams. The method is implemented via a PC-based simulation written in the GPSS-PC language. Input to the simulation model is (1) the individual detection probabilities and false positive rates for each detection technology or procedure, (2) the inspection time for each procedure, (3) the system configuration, and (4) the physical distance between inspection stations. The model offers on-line graphics to examine effects as the model runs.
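
    The GPSS-PC model itself is not reproduced here, but a comparable toy model is easy to express in Python with the SimPy discrete-event library. The sketch below pushes vehicles through two sequential inspection stations and reports the average time in system and the number flagged; every parameter (detection probabilities, false-positive rates, service times, team counts, arrival rate) is an invented stand-in.

```python
import random
import statistics
import simpy

RNG = random.Random(4)
# Assumed per-station parameters: (label, detection prob., false-positive rate,
# mean service minutes, parallel inspection teams).
STATIONS = [("x-ray scan", 0.70, 0.05, 1.5, 2), ("canine team", 0.90, 0.02, 3.0, 4)]
CONTRABAND_RATE = 0.02
times, flagged = [], 0

def vehicle(env, carries_drugs, stations):
    global flagged
    start = env.now
    for resource, p_det, p_fp, mean_t in stations:
        with resource.request() as req:
            yield req                                  # wait for a free team
            yield env.timeout(RNG.expovariate(1.0 / mean_t))
        if RNG.random() < (p_det if carries_drugs else p_fp):
            flagged += 1
            break                                      # sent to secondary search
    times.append(env.now - start)

def arrivals(env, stations):
    while True:
        yield env.timeout(RNG.expovariate(1.0))        # ~1 vehicle per minute
        env.process(vehicle(env, RNG.random() < CONTRABAND_RATE, stations))

env = simpy.Environment()
stations = [(simpy.Resource(env, capacity=c), p_det, p_fp, mean_t)
            for _, p_det, p_fp, mean_t, c in STATIONS]
env.process(arrivals(env, stations))
env.run(until=8 * 60)                                  # one 8-hour shift

print(f"processed {len(times)} vehicles, {flagged} flagged")
print(f"average time in system: {statistics.mean(times):.1f} min")
```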

  14. RISK ANALYSIS IN MILK PROCESSING

    Directory of Open Access Journals (Sweden)

    I. PIRVUTOIU

    2013-12-01

    Full Text Available This paper aimed to evaluate bankruptcy risk using the "score method" based on Canon and Holder's model. The data were collected from the balance sheet and profit and loss account for the period 2005-2007, recorded by a meat processing plant (Rador Commercial Company). The study highlighted the financial situation of the company and the level of the main financial ratios underpinning the calculation of the Z score function value in the three years. The low values of the Z score function recorded every year show that the company is still facing bankruptcy. However, the worst situation was recorded in the years 2005 and 2006, when bankruptcy risk ranged between 70-80%. In the year 2007, the bankruptcy risk was lower, ranging between 50-70%, as the Z function recorded a value lower than 4. For meat processing companies such an analysis is compulsory at present, as long as the business environment is very risky in our country.

  15. Privacy preserving similarity detection for data analysis

    OpenAIRE

    Leontiadis, Iraklis; Önen, Melek; Molva, Refik; Chorley, M. J.; Colombo, G.B.

    2013-01-01

    International audience Current applications tend to use personal sensitive information to achieve better quality with respect to their services. Since third parties are not trusted, the data must be protected so that individual data privacy is not compromised while operations on the data remain possible. A wide range of data analysis operations entails a similarity detection algorithm between user data. For instance, clustering on big data groups together objects based on ...

  16. Differential thermal analysis microsystem for explosive detection

    DEFF Research Database (Denmark)

    Olsen, Jesper Kenneth; Greve, Anders; Senesac, L.;

    2011-01-01

    A micro differential thermal analysis (DTA) system is used for detection of trace explosive particles. The DTA system consists of two silicon micro chips with integrated heaters and temperature sensors. One chip is used for reference and one for the measurement sample. The sensor is constructed a...... of the Xsense project at the Technical University of Denmark (DTU) which combines four independent sensing techniques, these micro DNT sensors will be included in handheld explosives detectors with applications in homeland security and landmine clearance....

  17. Cascaded image analysis for dynamic crack detection in material testing

    Science.gov (United States)

    Hampel, U.; Maas, H.-G.

    Concrete probes in civil engineering material testing often show fissures or hairline-cracks. These cracks develop dynamically. Starting at a width of a few microns, they usually cannot be detected visually or in an image of a camera imaging the whole probe. Conventional image analysis techniques will detect fissures only if they show a width in the order of one pixel. To be able to detect and measure fissures with a width of a fraction of a pixel at an early stage of their development, a cascaded image analysis approach has been developed, implemented and tested. The basic idea of the approach is to detect discontinuities in dense surface deformation vector fields. These deformation vector fields between consecutive stereo image pairs, which are generated by cross correlation or least squares matching, show a precision in the order of 1/50 pixel. Hairline-cracks can be detected and measured by applying edge detection techniques such as a Sobel operator to the results of the image matching process. Cracks will show up as linear discontinuities in the deformation vector field and can be vectorized by edge chaining. In practical tests of the method, cracks with a width of 1/20 pixel could be detected, and their width could be determined at a precision of 1/50 pixel.
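
    To make the idea concrete, the sketch below runs whole-pixel block matching between two synthetic speckle images, then applies a Sobel operator to the resulting displacement field so the discontinuity (the 'crack') stands out. It does not attempt the sub-pixel least-squares matching that gives the paper its 1/50 pixel precision, and the image sizes, search range and injected 2 px offset are assumptions.

```python
import numpy as np
from scipy.ndimage import sobel

def block_displacement(img0, img1, block=8, search=3):
    """Whole-pixel horizontal displacement field via exhaustive block matching (SSD)."""
    h, w = img0.shape
    dy = np.zeros((h // block, w // block))
    for bi in range(h // block):
        for bj in range(w // block):
            y, x = bi * block, bj * block
            ref = img0[y:y + block, x:x + block]
            best, best_d = None, 0
            for d in range(-search, search + 1):
                if 0 <= x + d and x + d + block <= w:
                    cand = img1[y:y + block, x + d:x + d + block]
                    ssd = np.sum((ref - cand) ** 2)
                    if best is None or ssd < best:
                        best, best_d = ssd, d
            dy[bi, bj] = best_d
    return dy

# Synthetic speckle images: the right half shifts by 2 px between load steps.
rng = np.random.default_rng(5)
img0 = rng.normal(0, 1, (64, 64))
img1 = img0.copy()
img1[:, 34:] = np.roll(img0, 2, axis=1)[:, 34:]        # simulated crack opening

field = block_displacement(img0, img1)
edge = np.abs(sobel(field, axis=1))                    # discontinuity -> crack line
print("crack column (block units):", int(edge.sum(axis=0).argmax()))
```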

  18. Simulation of land mine detection processes using nuclear techniques

    International Nuclear Information System (INIS)

    Computer models were designed to study the processes of land mine detection using nuclear techniques. Parameters that affect the detection were analyzed. Mines of different masses at different depths in the soil are considered using two types of sources, 252Cf and a 14 MeV neutron source. The capability to differentiate between mines and other objects such as concrete, iron, wood, aluminum, water and polyethylene was analyzed and studied.

  19. Continuous Fraud Detection in Enterprise Systems through Audit Trail Analysis

    Directory of Open Access Journals (Sweden)

    Peter J. Best

    2009-03-01

    Full Text Available Enterprise systems, real time recording and real time reporting pose new and significant challenges to the accounting and auditing professions. This includes developing methods and tools for continuous assurance and fraud detection. In this paper we propose a methodology for continuous fraud detection that exploits security audit logs, changes in master records and accounting audit trails in enterprise systems. The steps in this process are: (1) threat monitoring-surveillance of security audit logs for ‘red flags’, (2) automated extraction and analysis of data from audit trails, and (3) using forensic investigation techniques to determine whether a fraud has actually occurred. We demonstrate how mySAP, an enterprise system, can be used for audit trail analysis in detecting financial frauds; afterwards we use a case study of a suspected fraud to illustrate how to implement the methodology.
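
    Step (1), red-flag surveillance, amounts to scanning the security audit log for rule violations. The Python sketch below illustrates one classic segregation-of-duties rule (the same user both creating a vendor and approving a payment to it) on an invented log; the record fields and action names are hypothetical, not mySAP's actual log format.

```python
# Invented audit-log records: (timestamp, user, action, object_id).
audit_log = [
    ("2009-01-05T09:12", "jsmith", "CREATE_VENDOR",    "V-1001"),
    ("2009-01-05T09:40", "adoyle", "APPROVE_PAYMENT",  "V-0007"),
    ("2009-01-06T14:02", "jsmith", "APPROVE_PAYMENT",  "V-1001"),
    ("2009-01-07T08:55", "mkhan",  "CHANGE_BANK_ACCT", "V-1001"),
]

def segregation_of_duties_flags(log):
    """Red flag: the same user creates a vendor and later approves a payment to it."""
    creators = {}
    flags = []
    for ts, user, action, obj in sorted(log):
        if action == "CREATE_VENDOR":
            creators[obj] = user
        elif action == "APPROVE_PAYMENT" and creators.get(obj) == user:
            flags.append((ts, user, obj))
    return flags

for ts, user, obj in segregation_of_duties_flags(audit_log):
    print(f"RED FLAG {ts}: {user} approved payment to vendor {obj} they created")
```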

  20. Protein immobilization and detection on laser processed polystyrene surfaces

    International Nuclear Information System (INIS)

    The bovine serum albumin (BSA)-polystyrene (PS) interface layer is laser photo activated at 157 nm for site selective multiple target-protein immobilization. The 5-15 nm photon induced interface layer has different chemical, wetting, and stiffness properties than the PS photon processed surface. The irradiated areas exhibit target-protein binding, followed by localized probe-target protein detection. The photon induced chemical modification of the BSA-PS interface layer is identified by: (1) Morphological, imaging, and analysis of surface parameters with atomic force microscopy, (2) spectroscopic shift (4 cm-1), of the amide I group and formation of new C=N, NH2, C-O, C=O, and O-C=O groups following irradiation, identified with attenuated total reflection Fourier transform infrared (ATR-FTIR) spectroscopy, and (3) the different hydrophilic/hydrophobic and force-distance response of the bare PS and BSA-PS surfaces. Near field edge diffraction (Fresnel) fluorescence imaging specifies the threshold photon energy and the fluence required to optically detect the protein binding on the photon induced BSA-PS interface layer. By approximating the Fresnel integrals with analytical functions, the threshold photon energy and the fluence are expressed as the sum of zero, first, and second order harmonic terms of two characteristic diffracted modes and they are specified to be 8.73x10-9 J and 623 J m-2, respectively. Furthermore, a bioarray of three probe-target proteins is fabricated with 1.5 μm spatial resolution using a 157 nm laser microstepper. The methodology eliminates the use of intermediate polymer layers between the blocking BSA protein and the PS substrate in bioarray fabrication.

  1. Protein immobilization and detection on laser processed polystyrene surfaces

    Science.gov (United States)

    Sarantopoulou, Evangelia; Petrou, Panagiota S.; Kollia, Zoe; Palles, Dimitrios; Spyropoulos-Antonakakis, Nikolaos; Kakabakos, Sotirios; Cefalas, Alkiviadis-Constantinos

    2011-09-01

    The bovine serum albumin (BSA)-polystyrene (PS) interface layer is laser photo activated at 157 nm for site selective multiple target-protein immobilization. The 5-15 nm photon induced interface layer has different chemical, wetting, and stiffness properties than the PS photon processed surface. The irradiated areas exhibit target-protein binding, followed by localized probe-target protein detection. The photon induced chemical modification of the BSA-PS interface layer is identified by: (1) Morphological, imaging, and analysis of surface parameters with atomic force microscopy, (2) spectroscopic shift (4 cm-1), of the amide I group and formation of new C=N, NH2, C-O, C=O, and O-C=O groups following irradiation, identified with attenuated total reflection Fourier transform infrared (ATR-FTIR) spectroscopy, and (3) the different hydrophilic/hydrophobic and force-distance response of the bare PS and BSA-PS surfaces. Near field edge diffraction (Fresnel) fluorescence imaging specifies the threshold photon energy and the fluence required to optically detect the protein binding on the photon induced BSA-PS interface layer. By approximating the Fresnel integrals with analytical functions, the threshold photon energy and the fluence are expressed as the sum of zero, first, and second order harmonic terms of two characteristic diffracted modes and they are specified to be 8.73×10-9 J and 623 J m-2, respectively. Furthermore, a bioarray of three probe-target proteins is fabricated with 1.5 μm spatial resolution using a 157 nm laser microstepper. The methodology eliminates the use of intermediate polymer layers between the blocking BSA protein and the PS substrate in bioarray fabrication.

  2. Wavelet Analysis Applied to the Research on Heroin Detection

    International Nuclear Information System (INIS)

    Wavelet analysis is applied to process the energy spectrum signal in drug detection by X-ray energy scattering. In the paper, we put forward a new adaptive correlation denoising algorithm, which achieves a good filtering effect. Also, a simple and effective peak-seeking method is designed, which can locate both strong and weak peaks. Finally, comparison of the experimental results with XRD and PDF data shows a low relative error. (authors)
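
    The paper's adaptive correlation denoising algorithm is not reproduced here; the sketch below only shows the generic shape of such a pipeline in Python, smoothing a synthetic spectrum and then seeking both strong and weak peaks with a prominence criterion. The spectrum, filter settings and prominence cut are assumptions for the example.

```python
import numpy as np
from scipy.signal import find_peaks, savgol_filter

# Synthetic energy spectrum: one strong and one weak peak on a noisy background.
rng = np.random.default_rng(6)
x = np.arange(1024)
spectrum = (1000 * np.exp(-0.5 * ((x - 300) / 8) ** 2)      # strong peak
            + 80 * np.exp(-0.5 * ((x - 700) / 10) ** 2)     # weak peak
            + rng.normal(50, 8, x.size))                    # background + noise

smoothed = savgol_filter(spectrum, window_length=21, polyorder=3)  # denoising stand-in
peaks, props = find_peaks(smoothed, prominence=30)          # assumed prominence cut
for p in peaks:
    print(f"peak at channel {p}, height {smoothed[p]:.0f}")
```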

  3. Analysis of Android Device-Based Solutions for Fall Detection.

    Science.gov (United States)

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-01-01

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation of automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions. PMID:26213928

  4. Analysis of Android Device-Based Solutions for Fall Detection

    Directory of Open Access Journals (Sweden)

    Eduardo Casilari

    2015-07-01

    Full Text Available Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation of automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions.

  5. Detecting 2LSB steganography using extended pairs of values analysis

    Science.gov (United States)

    Khalind, Omed; Aziz, Benjamin

    2014-05-01

    In this paper, we propose an extended pairs-of-values analysis to detect and estimate the amount of secret messages embedded with 2LSB replacement in digital images, based on the chi-square attack and the regularity rate in pixel values. The detection process is separated from the estimation of the hidden message length, as detection is the main requirement of any steganalysis method. Hence, the detection process acts as a discrete classifier, which classifies a given set of images into stego and clean classes. The method can accurately detect 2LSB replacement even when the message length is about 10% of the total capacity; it also reaches its best performance, with an accuracy higher than 0.96 and a true positive rate of more than 0.997, when the amount of data is 20% to 100% of the total capacity. Moreover, the method makes no assumptions about either the image or the secret message, as it was tested with two sets of 3000 images, compressed and uncompressed, embedded with a random message in each case. This method of detection could also be used as an automated tool to analyse a bulk of images for hidden content, which could be used by digital forensics analysts in their investigation process.
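
    A minimal sketch of the chi-square side of such an attack, assuming the standard observation that under full 2LSB replacement the four pixel values sharing the same upper six bits tend toward equal frequencies. The images here are random stand-ins, not the paper's test sets, and the test below is a simplified version of the published method.

```python
import numpy as np
from scipy.stats import chisquare

def two_lsb_chi2_pvalue(pixels):
    """Chi-square test of uniformity within each quad of pixel values that
    share the same upper six bits; 2LSB replacement pushes quads toward
    equal frequencies, so a tiny p-value argues against embedding."""
    hist = np.bincount(pixels.ravel(), minlength=256).astype(float)
    quads = hist.reshape(64, 4)                     # row k holds values 4k..4k+3
    observed = quads[quads.sum(axis=1) > 0]
    expected = np.repeat(observed.mean(axis=1, keepdims=True), 4, axis=1)
    # One mean is estimated per quad, so reduce the degrees of freedom.
    stat, p = chisquare(observed.ravel(), expected.ravel(),
                        ddof=observed.shape[0] - 1)
    return p

rng = np.random.default_rng(7)
clean = rng.normal(128, 12, (256, 256)).clip(0, 255).astype(np.uint8)

payload = rng.integers(0, 4, clean.shape, dtype=np.uint8)
stego = (clean & 0b11111100) | payload              # full 2LSB replacement

print("clean p-value:", two_lsb_chi2_pvalue(clean))   # ~0: quads are unequal
print("stego p-value:", two_lsb_chi2_pvalue(stego))   # not tiny: quads equalized
```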

  6. Integrated Seismic Event Detection and Location by Advanced Array Processing

    Energy Technology Data Exchange (ETDEWEB)

    Kvaerna, T; Gibbons, S J; Ringdal, F; Harris, D B

    2007-02-09

    The principal objective of this two-year study is to develop and test a new advanced, automatic approach to seismic detection/location using array processing. We address a strategy to obtain significantly improved precision in the location of low-magnitude events compared with current fully-automatic approaches, combined with a low false alarm rate. We have developed and evaluated a prototype automatic system which uses as a basis regional array processing with fixed, carefully calibrated, site-specific parameters in conjunction with improved automatic phase onset time estimation. We have in parallel developed tools for Matched Field Processing for optimized detection and source-region identification of seismic signals. This narrow-band procedure aims to mitigate some of the causes of difficulty encountered using the standard array processing system, specifically complicated source-time histories of seismic events and shortcomings in the plane-wave approximation for seismic phase arrivals at regional arrays.

  7. A scheme for the detection of mixing processes in vacuum

    CERN Document Server

    Fillion-Gourdeau, F; MacLean, S

    2014-01-01

    A scheme for the detection of photons generated by vacuum mixing processes is proposed. The strategy consists in the utilization of a high numerical aperture parabolic mirror which tightly focuses two co-propagating laser beams with different frequencies. This produces a very high intensity region in the vicinity of the focus, where the photon-photon nonlinear interaction can then induce new electromagnetic radiation by wave mixing processes. These processes are investigated theoretically. The field at the focus is obtained from the Stratton-Chu vector diffraction theory, which can accommodate any configuration of an incoming laser beam. The number of photons generated is evaluated for an incident radially polarized beam. It is demonstrated that using this field configuration, vacuum mixing processes could be detected with envisaged laser technologies.

  8. Advanced Information Processing System - Fault detection and error handling

    Science.gov (United States)

    Lala, J. H.

    1985-01-01

    The Advanced Information Processing System (AIPS) is designed to provide a fault tolerant and damage tolerant data processing architecture for a broad range of aerospace vehicles, including tactical and transport aircraft, and manned and autonomous spacecraft. A proof-of-concept (POC) system is now in the detailed design and fabrication phase. This paper gives an overview of a preliminary fault detection and error handling philosophy in AIPS.

  9. Data reconciliation and gross error detection: application in chemical processes

    OpenAIRE

    EGHBAL AHMADİ, Mohammad Hosein

    2015-01-01

    Abstract. Measured data are normally corrupted by different kinds of errors in many chemical processes. In this work, a brief overview of data reconciliation and gross error detection, believed to be the most efficient techniques for reducing measurement errors and obtaining accurate information about the process, is presented. In addition to defining the basic problem and surveying recent developments in this area, which falls within the "Real Time Optimization" field, we will describe a...
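
    For the linear, steady-state case the classical technique has a closed form: with measurements x, covariance V and balance constraints Ax = 0, the reconciled estimate is x̂ = x − V Aᵀ(A V Aᵀ)⁻¹ A x, and standardized constraint residuals flag gross errors. The NumPy sketch below applies this to an invented splitter mass balance; the flows, variances and injected error are assumptions for the example.

```python
import numpy as np

# Process: stream 1 splits into streams 2 and 3, so x1 - x2 - x3 = 0.
A = np.array([[1.0, -1.0, -1.0]])            # balance constraint matrix
true = np.array([100.0, 60.0, 40.0])         # true flows (kg/h)
sigma = np.array([2.0, 1.5, 1.0])            # measurement std devs
V = np.diag(sigma ** 2)

rng = np.random.default_rng(8)
x = true + rng.normal(0, sigma)
x[1] += 10.0                                 # inject a gross error on stream 2

# Reconciliation: x_hat = x - V A^T (A V A^T)^-1 A x  (weighted least squares).
S = A @ V @ A.T
x_hat = x - V @ A.T @ np.linalg.solve(S, A @ x)

# Gross error detection: standardized constraint residual (global test).
z = (A @ x) / np.sqrt(np.diag(S))
print("measured   :", x.round(2))
print("reconciled :", x_hat.round(2), "balance ok:", np.allclose(A @ x_hat, 0))
print("residual z :", z.round(2),
      "-> gross error suspected" if abs(z[0]) > 1.96 else "")
```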

  10. Evaluation of a safe sputum processing method for detecting tuberculosis.

    OpenAIRE

    Rattan, A; K. Kishore; Singh, S; Jaber, M; Xess, I; Kumar, R.

    1994-01-01

    AIMS--To evaluate a safe sputum processing method for detection of tuberculosis in developing countries. METHODS--A sample processing method was developed in which acid fast bacilli were killed with 1% sodium hypochlorite and concentrated by flotation on a layer of xylene before staining by the Ziehl Neelsen or auramine O methods. RESULTS--Best results were obtained by auramine O staining after flotation. Staining by the Ziehl Neelsen method after flotation gave better results than direct Zie...

  11. Flow Detection Based on Traffic Video Image Processing

    Directory of Open Access Journals (Sweden)

    Peng Shen

    2013-10-01

    Full Text Available Because the background image obtained from background modeling with the traditional k-means clustering algorithm shows a lot of noise in traffic video image processing, an improvement of the k-means clustering algorithm is proposed and applied to vehicle flow detection in traffic video images. By analyzing the vehicle detection method and comparing flow detection algorithms, the improved k-means clustering algorithm is experimentally tested and implemented in software. The experiment shows that after background modeling the improved algorithm is superior to the traditional one in time complexity, and that it has better adaptivity and robustness, which improves the effect of vehicle flow detection.
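
    The paper's specific improvement to k-means is not detailed in the abstract, so the sketch below only shows the baseline being improved upon: per-pixel k-means over a stack of frames, taking the largest cluster's center as the background. The synthetic "traffic" frames and thresholds are invented for the example.

```python
import numpy as np

def kmeans_1d(values, k=2, iters=10):
    """Tiny 1-D k-means; returns centers and labels."""
    centers = np.quantile(values, np.linspace(0.1, 0.9, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = values[labels == c].mean()
    return centers, labels

def background_model(frames, k=2):
    """Per-pixel k-means over time; background = center of the largest cluster."""
    t, h, w = frames.shape
    bg = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            centers, labels = kmeans_1d(frames[:, i, j].astype(float), k)
            bg[i, j] = centers[np.bincount(labels, minlength=k).argmax()]
    return bg

# Synthetic traffic clip: static road (gray 90) with bright vehicles passing.
rng = np.random.default_rng(9)
frames = rng.normal(90, 4, (30, 16, 16))
for tstep in range(30):                       # a vehicle occupies row 8 sometimes
    if tstep % 3 == 0:
        frames[tstep, 8, :] = 220

bg = background_model(frames)
mask = np.abs(frames[0] - bg) > 25            # foreground = large deviation
print("estimated road level:", bg.mean().round(1))
print("vehicle pixels in frame 0:", int(mask.sum()))
```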

  12. System for detecting and processing abnormality in electromagnetic shielding

    International Nuclear Information System (INIS)

    The present invention relates to a system for detecting and processing an abnormality in electromagnetic shielding of an intelligent building which is constructed using an electromagnetic shielding material for the skeleton and openings such as windows and doorways so that the whole of the building is formed into an electromagnetic shielding structure. (author). 4 figs

  13. IWTU Process Sample Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Nick Soelberg

    2013-04-01

    CH2M-WG Idaho (CWI) requested that Battelle Energy Alliance (BEA) analyze various samples collected during June – August 2012 at the Integrated Waste Treatment Unit (IWTU). Samples of IWTU process materials were collected from various locations in the process. None of these samples were radioactive. They were collected and analyzed to provide a better understanding of the compositions of various materials in the process at the time of the process shutdown that occurred on June 16, 2012, while the IWTU was undergoing nonradioactive startup.

  14. Gaussian process regression analysis for functional data

    CERN Document Server

    Shi, Jian Qing

    2011-01-01

    Gaussian Process Regression Analysis for Functional Data presents nonparametric statistical methods for functional regression analysis, specifically the methods based on a Gaussian process prior in a functional space. The authors focus on problems involving functional response variables and mixed covariates of functional and scalar variables.Covering the basics of Gaussian process regression, the first several chapters discuss functional data analysis, theoretical aspects based on the asymptotic properties of Gaussian process regression models, and new methodological developments for high dime
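
    For readers who want the mechanics, GP regression with a squared-exponential kernel reduces to a few linear-algebra lines. The sketch below implements the standard posterior mean and covariance formulas on toy one-dimensional data; it is a generic illustration, not an example from the book.

```python
import numpy as np

def rbf(a, b, length=0.5, amp=1.0):
    """Squared-exponential (RBF) covariance between 1-D input vectors a and b."""
    d = a[:, None] - b[None, :]
    return amp * np.exp(-0.5 * (d / length) ** 2)

# Toy functional data: noisy observations of a smooth curve.
rng = np.random.default_rng(10)
x_train = rng.uniform(0, 2 * np.pi, 25)
y_train = np.sin(x_train) + rng.normal(0, 0.1, x_train.size)
x_test = np.linspace(0, 2 * np.pi, 5)

noise = 0.1 ** 2
K = rbf(x_train, x_train) + noise * np.eye(x_train.size)
K_star = rbf(x_test, x_train)

# Standard GP posterior: mean = K* K^-1 y, cov = K** - K* K^-1 K*^T.
mean = K_star @ np.linalg.solve(K, y_train)
cov = rbf(x_test, x_test) - K_star @ np.linalg.solve(K, K_star.T)

for xt, m, s in zip(x_test, mean, np.sqrt(np.diag(cov))):
    print(f"f({xt:4.2f}) = {m:+.2f} +/- {2 * s:.2f}   (truth {np.sin(xt):+.2f})")
```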

  15. Impact of two types of image processing on cancer detection in mammography

    Science.gov (United States)

    Warren, Lucy M.; Halling-Brown, Mark D.; Looney, Padraig T.; Dance, David R.; Wilkinson, Louise; Wallis, Matthew G.; Given-Wilson, Rosalind M.; Cooke, Julie; McAvinchey, Rita; Young, Kenneth C.

    2016-03-01

    The impact of image processing on cancer detection is still a concern to radiologists and physicists. This work aims to evaluate the effect of two types of image processing on cancer detection in mammography. An observer study was performed in which six radiologists inspected 349 cases (a mixture of normal cases, benign lesions and cancers) processed with two types of image processing. The observers marked areas they suspected were cancers. JAFROC analysis was performed to determine if there was a significant difference in cancer detection between the two types of image processing. Cancer detection was significantly better with the standard setting image processing (flavor A) compared with one that provides enhanced image contrast (flavor B), p = 0.036. The image processing was applied to images of the CDMAM test object, which were then analysed using CDCOM. The threshold gold thickness measured with the CDMAM test object was thinner using flavor A than flavor B image processing. Since flavor A was found to be superior in both the observer study and the measurements using the CDMAM phantom, this may indicate that measurements using the CDMAM correlate with changes in cancer detection with different types of image processing.

  16. Confusion Analysis and Detection for Workflow Nets

    Directory of Open Access Journals (Sweden)

    Xiao-liang Chen

    2014-01-01

    Full Text Available Option processes often occur in a business procedure with respect to resource competition. In a business procedure modeled with a workflow net (WF-net), all decision behavior and option operations for business tasks are modeled and performed by the conflicts in the corresponding WF-net. Concurrency in WF-nets is applied to keep business procedures operating at high performance. However, the firing of concurrent transitions in a WF-net may lead to the disappearance of conflicts in the WF-net. This phenomenon, usually called confusion, produces difficulties for the resolution of conflicts. This paper investigates confusion detection problems in WF-nets. First, confusions are formalized as a class of marked subnets with special conflicting and concurrent features. Second, a detection approach based on the characteristics of confusion subnets and integer linear programming (ILP) is developed, which does not require computing the reachability graph of a WF-net. Examples of confusion detection in WF-nets are presented. Finally, the impact of confusions on the properties of WF-nets is specified.

  17. Combustion Analysis and Knock Detection in Single Cylinder DI-Diesel Engine Using Vibration Signature Analysis

    OpenAIRE

    Y.V.V.SatyanarayanaMurthy

    2011-01-01

    The purpose of this paper is to detect the “knock” in Diesel engines, which adversely deteriorates engine performance. The methodology introduced in the present work suggests a newly developed approach to the vibration analysis of diesel engines. The method is based on the fundamental relationship between the engine vibration pattern and the relative characteristics of the combustion process in each or different cylinders. Knock in a diesel engine is detected by measuring the vibra...

  18. Method Taking into Account Process Dispersions to Detect Hardware Trojan Horse by Side-Channel

    OpenAIRE

    Ngo, Xuan Thuy; Najm, Zakaria; Bhasin, Shivam; Guilley, Sylvain; Danger, Jean-Luc

    2014-01-01

    International audience Hardware trojan horses inserted in integrated circuits have received special attention from researchers. Most recent research focuses on detecting the presence of hardware trojans through various techniques like reverse engineering, test/verification methods and side-channel analysis (SCA). Previous works using SCA for trojan detection are based on power measurements or even simulations. When using real silicon, the results are strongly biased by the process var...

  19. Detection of outliers by neural network on the gas centrifuge experimental data of isotopic separation process

    International Nuclear Information System (INIS)

    This work presents and discusses the neural network technique aimed at the detection of outliers in a set of gas centrifuge isotope separation experimental data. In order to evaluate the application of this new technique, the detection result is compared to the result of statistical analysis combined with cluster analysis. This method for the detection of outliers presents considerable potential in the field of data analysis, and it is at the same time easier and faster to use and requires much less knowledge of the physics involved in the process. This work established a procedure for detecting experiments suspected of containing gross errors inside a data set where the usual techniques for identification of these errors cannot be applied, or where their use demands excessively long work. (author)

  20. Automatic detection and analysis of nuclear plant malfunctions

    International Nuclear Information System (INIS)

    In this paper a system is proposed which performs dynamically the detection and analysis of malfunctions in a nuclear plant. The proposed method was developed and implemented on a reactor simulator, instead of on a real one, thus allowing a wide range of tests. For each variable under control, a simulation module was identified and implemented on the reactor on-line computer. In the malfunction identification phase all modules run separately, processing plant input variables and producing their output variables in real time; continuous comparison of the computed variables with plant variables allows malfunction detection. At this moment the second phase can occur: when a malfunction is detected, all modules are connected, except the module simulating the faulty variable, and a fast simulation is carried out to analyse the consequences. (author)

  1. Colour model analysis for microscopic image processing

    OpenAIRE

    García-Rojo Marcial; González Jesús; Déniz Oscar; González Roberto; Bueno Gloria

    2008-01-01

    Abstract This article presents a comparative study between different colour models (RGB, HSI and CIEL*a*b*) applied to the analysis of very large microscopic images. Such analysis of different colour models is needed in order to carry out a successful detection and therefore a classification of different regions of interest (ROIs) within the image. This, in turn, allows both distinguishing possible ROIs and retrieving their proper colour for further ROI analysis. This analysis is not commonly done ...

  2. Advances in face detection and facial image analysis

    CERN Document Server

    Celebi, M; Smolka, Bogdan

    2016-01-01

    This book presents the state-of-the-art in face detection and analysis. It outlines new research directions, including in particular psychology-based facial dynamics recognition, aimed at various applications such as behavior analysis, deception detection, and diagnosis of various psychological disorders. Topics of interest include face and facial landmark detection, face recognition, facial expression and emotion analysis, facial dynamics analysis, face classification, identification, and clustering, and gaze direction and head pose estimation, as well as applications of face analysis.

  3. Processing of radar data for landmine detection: nonlinear transformation

    Science.gov (United States)

    Bartosz, E. E.; Duvoisin, H.; Konduri, R.; Solomon, G. Z.

    2005-06-01

    The Handheld Standoff Mine Detection System (HSTAMIDS system) has achieved outstanding performance in government-run field tests due to its use of anomaly detection using principal component analysis (PCA) on the return of ground penetrating radar (GPR) coupled with metal detection. Indications of nonlinearities and asymmetries in Humanitarian Demining (HD) data point to modifications to the current PCA algorithm that might prove beneficial. Asymmetries in the distribution of PCA projections of field data have been quantified in HD data. These asymmetries suggest a logarithmic correction to the data. Such a correction has been applied and has improved the false alarm rate (FAR) on this data set. The increase in performance is comparable to the increase shown using the simpler asymmetric rescaling method.

  4. Detecting Damped Lyman-α Absorbers with Gaussian Processes

    CERN Document Server

    Garnett, Roman; Bird, Simeon; Schneider, Jeff

    2016-01-01

    We develop an automated technique for detecting damped Lyman-α absorbers (DLAs) along spectroscopic sightlines to quasi-stellar objects (QSOs or quasars). The detection of DLAs in large-scale spectroscopic surveys such as SDSS-III sheds light on galaxy formation at high redshift, showing the nucleation of galaxies from diffuse gas. We use nearly 50 000 QSO spectra to learn a novel tailored Gaussian process model for quasar emission spectra, which we apply to the DLA detection problem via Bayesian model selection. We propose models for identifying an arbitrary number of DLAs along a given line of sight. We demonstrate our method's effectiveness using a large-scale validation experiment, with excellent performance. We also provide a catalog of our results applied to 162 861 spectra from SDSS-III data release 12.

  5. Mesh Processing in Medical Image Analysis

    DEFF Research Database (Denmark)

    The following topics are dealt with: mesh processing; medical image analysis; interactive freeform modeling; statistical shape analysis; clinical CT images; statistical surface recovery; automated segmentation; cerebral aneurysms; and real-time particle-based representation....

  6. Performance Analysis of Intrusion Detection in MANET

    Directory of Open Access Journals (Sweden)

    SAMRIDHI SHARMA

    2011-05-01

    Full Text Available A Mobile Adhoc Network is a collection of autonomous nodes or terminals which communicate with each other by forming a multihop radio network without the aid of any established infrastructure or centralized administration such as a base station. Adhoc networks lack secure boundaries. At present, security issues in Mobile Adhoc Networks have become one of the primary concerns. A MANET is more vulnerable to attacks than wired networks due to its distributed nature and lack of infrastructure. These vulnerabilities are inherent in the MANET structure and cannot be removed easily. As a result, attacks with malicious intent have been and will be devised to exploit these vulnerabilities and to cripple MANET operation. Attack prevention techniques, such as authentication and encryption, can be used as a first line of defense to decrease the possibility of attacks. However, such techniques have limited effect in practice, and they are designed for a set of known attacks. They are unlikely to prevent newer attacks that are designed to circumvent existing security measures. For this purpose, there is a need for a mechanism that detects and responds to these newer attacks, i.e. intrusion detection. Intrusion detection provides audit and monitoring capabilities that offer local security to a node and help to perceive the specific trust level of other nodes. In addition, ontology is a proven tool for this type of analysis. In this paper, a specific ontology has been modeled which aims to explore and classify current techniques of Intrusion Detection System (IDS) aware MANETs. To support these ideas, a discussion regarding attacks, IDS architecture and IDS in MANETs is presented inclusively, and then several research achievements are compared based on these parameters.

  7. Elastic recoil detection analysis of ferroelectric films

    Energy Technology Data Exchange (ETDEWEB)

    Stannard, W.B.; Johnston, P.N.; Walker, S.R.; Bubb, I.F. [Royal Melbourne Inst. of Tech., VIC (Australia); Scott, J.F. [New South Wales Univ., Kensington, NSW (Australia); Cohen, D.D.; Dytlewski, N. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1996-12-31

    There has been considerable progress in developing SrBi2Ta2O9 (SBT) and Ba0.7Sr0.3TiO3 (BST) ferroelectric films for use as nonvolatile memory chips and for capacitors in dynamic random access memories (DRAMs). Ferroelectric materials have a very large dielectric constant (~1000), approximately one hundred times greater than that of silicon dioxide. Devices made from these materials have been known to experience breakdown after repeated voltage pulsing. It has been suggested that this is related to stoichiometric changes within the material. To accurately characterise these materials, Elastic Recoil Detection Analysis (ERDA) is being developed. This technique employs a high energy heavy ion beam to eject nuclei from the target and uses a time of flight and energy dispersive (ToF-E) detector telescope to detect these nuclei. The recoil nuclei carry both energy and mass information which enables the determination of separate energy spectra for individual elements or for small groups of elements. In this work ERDA employing 77 MeV 127I ions has been used to analyse Strontium Bismuth Tantalate thin films at the heavy ion recoil facility at ANSTO, Lucas Heights. 9 refs., 5 figs.

  8. LEXICAL ANALYSIS TO EFFECTIVELY DETECT USERS’ OPINION

    Directory of Open Access Journals (Sweden)

    Anil Kumar K.M

    2011-11-01

    Full Text Available In this paper we present a lexical approach that will identify the opinion of web users popularly expressed using short words or sms words. These words are pretty popular with diverse web users and are used for expressing their opinion on the web. The study of opinion from the web arises from the need to know the diverse opinions of web users. The opinion expressed by web users may be on diverse topics such as politics, sports, products, movies etc. These opinions will be very useful to others such as leaders of political parties, selection committees of various sports, business analysts and other stakeholders of products, directors and producers of movies, as well as to other concerned web users. We use a semantic based approach to find users' opinion from short words or sms words apart from regular opinionated phrases. Our approach efficiently detects opinion from opinionated texts using lexical analysis and is found to be better than the other approaches on different data sets.

  9. Mesh Processing in Medical Image Analysis

    DEFF Research Database (Denmark)

    The following topics are dealt with: mesh processing; medical image analysis; interactive freeform modeling; statistical shape analysis; clinical CT images; statistical surface recovery; automated segmentation; cerebral aneurysms; and real-time particle-based representation.

  10. Sound Event Detection for Music Signals Using Gaussian Processes

    Directory of Open Access Journals (Sweden)

    Pablo A. Alvarado-Durán

    2013-11-01

    Full Text Available In this paper we present a new methodology for detecting sound events in music signals using Gaussian processes. Our method firstly takes a time-frequency representation, i.e. the spectrogram, of the input audio signal. Secondly, the spectrogram dimension is reduced by translating the linear Hertz frequency scale into the logarithmic Mel frequency scale using a triangular filter bank. Finally, every short-time spectrum, i.e. every Mel spectrogram column, is classified as “Event” or “Not Event” by a Gaussian process classifier. We compare our method with other widely used event detection techniques. To do so, we use MATLAB® to program each technique and test them using two datasets of music with different levels of complexity. Results show that the new methodology outperforms the standard approaches, achieving an improvement of about 1.66 % on dataset one and 0.45 % on dataset two in terms of F-measure.
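
    The original implementation is in MATLAB; the sketch below re-creates the pipeline's shape in Python with a hand-rolled triangular Mel filter bank and scikit-learn's GaussianProcessClassifier on a synthetic clip containing one tone burst. The sample rate, frame sizes, filter count and train/test split are all assumptions for the example.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

SR, NFFT, HOP, NMEL = 8000, 256, 128, 20

def mel_filterbank(sr=SR, nfft=NFFT, nmel=NMEL):
    """Triangular filters spaced evenly on the Mel scale."""
    mel = lambda f: 2595 * np.log10(1 + f / 700)
    imel = lambda m: 700 * (10 ** (m / 2595) - 1)
    pts = imel(np.linspace(mel(0), mel(sr / 2), nmel + 2))
    bins = np.floor((nfft + 1) * pts / sr).astype(int)
    fb = np.zeros((nmel, nfft // 2 + 1))
    for i in range(nmel):
        l, c, r = bins[i], bins[i + 1], bins[i + 2]
        fb[i, l:c] = (np.arange(l, c) - l) / max(c - l, 1)
        fb[i, c:r] = (r - np.arange(c, r)) / max(r - c, 1)
    return fb

def mel_spectrogram(x):
    frames = np.lib.stride_tricks.sliding_window_view(x, NFFT)[::HOP]
    spec = np.abs(np.fft.rfft(frames * np.hanning(NFFT), axis=1)) ** 2
    return np.log(spec @ mel_filterbank().T + 1e-8)     # (n_frames, NMEL)

# Synthetic clip: low-level noise with a 440 Hz burst from 0.5 s to 1.0 s.
rng = np.random.default_rng(11)
t = np.arange(2 * SR) / SR
x = 0.02 * rng.normal(size=t.size)
x[SR // 2: SR] += 0.5 * np.sin(2 * np.pi * 440 * t[SR // 2: SR])

M = mel_spectrogram(x)
frame_times = np.arange(M.shape[0]) * HOP / SR
labels = ((frame_times > 0.5) & (frame_times < 1.0)).astype(int)  # ground truth

clf = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0))
clf.fit(M[::2], labels[::2])                   # train on every other frame
acc = (clf.predict(M[1::2]) == labels[1::2]).mean()
print(f"held-out frame accuracy: {acc:.2f}")
```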

  11. Terrain Mapping and Obstacle Detection Using Gaussian Processes

    DEFF Research Database (Denmark)

    Kjærgaard, Morten; Massaro, Alessandro Salvatore; Bayramoglu, Enis;

    2011-01-01

    In this paper we consider a probabilistic method for extracting terrain maps from a scene and use the information to detect potential navigation obstacles within it. The method uses Gaussian process regression (GPR) to predict an estimate function and its relative uncertainty. To test the new...... show that the estimated maps follow the terrain shape, while protrusions are identified and may be isolated as potential obstacles. Representing the data with a covariance function allows a dramatic reduction of the amount of data to process, while maintaining the statistical properties of the measured...

  12. Terrain Mapping and Obstacle Detection using Gaussian Processes

    DEFF Research Database (Denmark)

    Kjærgaard, Morten; Massaro, Alessandro Salvatore; Bayramoglu, Enis;

    2011-01-01

    In this paper we consider a probabilistic method for extracting terrain maps from a scene and use the information to detect potential navigation obstacles within it. The method uses Gaussian process regression (GPR) to predict an estimate function and its relative uncertainty. To test the new...... show that the estimated maps follow the terrain shape, while protrusions are identified and may be isolated as potential obstacles. Representing the data with a covariance function allows a dramatic reduction of the amount of data to process, while maintaining the statistical properties of the measured...

  13. Process and apparatus for detecting presence of plant substances

    Energy Technology Data Exchange (ETDEWEB)

    Kirby, J.A.

    1990-01-01

    Disclosed is a process for detecting the presence of plant substances in a particular environment which comprises the steps of: (1) Measuring the background K40 gamma ray radiation level in a particular environment with a 1.46 MeV gamma ray counter system; (2) measuring the amount of K40 gamma ray radiation emanating from a package containing said plant substance being passed through said environment with said counter; and (3) generating an alarm signal when the total K40 gamma ray radiation reaches a predetermined level over and above the background level. Also disclosed is the apparatus and system used to conduct the process.
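    The alarm logic in steps (1)-(3) amounts to a threshold test on counter readings. A minimal sketch, assuming Poisson counting statistics; the count values, the counting interval, and the 5-sigma alarm margin are illustrative assumptions, not taken from the patent.

        import math

        def k40_alarm(background_counts, package_counts, n_sigma=5.0):
            # Alarm when K40 (1.46 MeV) counts from a package exceed background
            # by a predetermined margin (here: n_sigma Poisson standard deviations).
            threshold = background_counts + n_sigma * math.sqrt(background_counts)
            return package_counts > threshold

        # Example: 400 background counts per interval; the package reads 520.
        if k40_alarm(400, 520):
            print("ALARM: K40 level above background threshold")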


  15. Automotive Parking Lot and Theft Detection through Image Processing

    Directory of Open Access Journals (Sweden)

    2013-10-01

    Full Text Available Automotive parking lot and theft detection through image processing is a smart parking lot system that saves time for the owner by organizing parking and also prevents theft of the car. It is a technology that optimizes the checkout process by analysing a database of images of car number plates. The heart of the project is image processing. The images of number plates are detected by Matlab and a picture of the driver is saved in a similar database. As soon as both images are saved, the garage entrance pole shifts 90 degrees upward using a DC motor and remains in that position for 30 seconds to allow the car to enter. After 30 seconds it returns to its previous position. When the car exits, the earlier steps are repeated and Matlab matches the two images that were taken during entering and leaving. Meanwhile the seven-segment display shows that a car has left the parking lot by decrementing a number from its display. The system is controlled by a microcontroller, which is also able to detect and display whether a vacant parking space is available. If there is no vacancy a red LED lights up, whereas a green LED is used to indicate the presence of parking space along with how many parking spots are available. It is applicable to supermarket car parking lots and also apartment garages.

  16. Applications of parallel processing algorithms for DNA sequence analysis.

    OpenAIRE

    Collins, J. F.; Coulson, A F

    1984-01-01

    Programs have been written to apply parallel processing algorithms to the main methods of DNA sequence analysis. These programs allow the largest of currently interesting problems to be handled on a medium-sized computer system. The abundance of information otherwise not readily available has suggested new methods for the detection of homology and order in sequences.

  17. Bistatic processing - analysis and verification

    OpenAIRE

    Natroshvili, Koba

    2008-01-01

    Interest in Bistatic and Multistatic SAR (Synthetic Aperture Radar) systems has grown in the last decade. They bring additional benefits over conventional monostatic SAR systems, such as flexibility, cost reduction, reduced vulnerability, etc. At the same time, processing complexity for bistatic configurations is much higher than for conventional monostatic processors. Until now, only some numerical and intuitive solutions have been given in this respect; no analytical solution is available. In th...

  18. "Information Design" for "Weak Signal" detection and processing in Economic Intelligence: case study on Health resources

    OpenAIRE

    Sidhom, Sahbi; Lambert, Philippe

    2011-01-01

    Abstract -- The topics of this research cover all phases of "Information Design" applied to detecting and profiting from weak signals in economic intelligence (EI) (or BI: business intelligence). The field of information design (ID) applies the process of translating complex, unorganized or unstructured data into valuable and meaningful information. The practice of ID requires an interdisciplinary approach, which combines skills in graphic design - writing, analysis processing and editing ...

  19. Preclosure Criticality Analysis Process Report

    International Nuclear Information System (INIS)

    The design approach for criticality of the disposal container and waste package will be dictated by existing regulatory requirements. This conclusion is based on the fact that preclosure operations and facilities have significant similarities to existing facilities and operations currently regulated by the NRC. The major difference would be the use of a risk-informed approach with burnup credit. This approach could reduce licensing delays and costs of the repository. The probability of success for this proposed seamless licensing strategy is increased, since there is precedence of regulation (10 CFR Part 63 and NUREG 1520) and commercial precedence for allowing burnup credit at sites similar to Yucca Mountain during preclosure. While NUREG 1520 is not directly applicable to a facility for handling spent nuclear fuel, the risk-informed approach to criticality analysis in NUREG 1520 is considered indicative of how the NRC will approach risk-informed criticality analysis at spent fuel facilities in the future. The types of design basis events which must be considered during the criticality safety analysis portion of the Integrated Safety Analysis (ISA) are those events which result in unanticipated moderation, loss of neutron absorber, geometric changes in the critical system, or administrative errors in waste form placement (loading) of the disposal container. The specific events to be considered must be based on the review of the system's design, as discussed in Section 3.2. A transition of licensing approach (e.g., deterministic versus risk-informed, performance-based) is not obvious and will require analysis. For commercial spent nuclear fuel, the probability of interspersed moderation may be low enough to allow nearly the same Critical Limit for both preclosure and postclosure, though an administrative margin will be applied to preclosure and possibly not to postclosure. Similarly the Design Basis Events for the waste package may be incredible and therefore not


  1. High efficiency processing for reduced amplitude zones detection in the HRECG signal

    Science.gov (United States)

    Dugarte, N.; Álvarez, A.; Balacco, J.; Mercado, G.; Gonzalez, A.; Dugarte, E.; Olivares, A.

    2016-04-01

    Summary – This article presents part of a broader research effort proposed for the medium to long term, with the intention of establishing a new philosophy of surface electrocardiogram analysis. This research aims to find indicators of cardiovascular disease in its early stage that may go unnoticed with conventional electrocardiography. This paper reports the development of processing software that collects some existing techniques and incorporates novel methods for the detection of reduced amplitude zones (RAZ) in the high resolution electrocardiographic signal (HRECG). The algorithm consists of three stages: an efficient processing step for QRS detection, an averaging filter using correlation techniques, and a step for RAZ detection. Preliminary results show the efficiency of the system and point to the incorporation of new signal analysis techniques involving 12 leads.

  2. On damage detection in wind turbine gearboxes using outlier analysis

    Science.gov (United States)

    Antoniadou, Ifigeneia; Manson, Graeme; Dervilis, Nikolaos; Staszewski, Wieslaw J.; Worden, Keith

    2012-04-01

    The proportion of worldwide installed wind power in power systems increases over the years as a result of the steadily growing interest in renewable energy sources. Still, the advantages offered by the use of wind power are overshadowed by the high operational and maintenance costs, resulting in the low competitiveness of wind power in the energy market. In order to reduce the costs of corrective maintenance, the application of condition monitoring to gearboxes becomes highly important, since gearboxes are among the wind turbine components with the most frequent failure observations. While condition monitoring of gearboxes in general is common practice, with various methods having been developed over the last few decades, wind turbine gearbox condition monitoring faces a major challenge: the detection of faults under the time-varying load conditions prevailing in wind turbine systems. Classical time and frequency domain methods fail to detect faults under variable load conditions, due to the temporary effect that these faults have on vibration signals. This paper uses the statistical discipline of outlier analysis for the damage detection of gearbox tooth faults. A simplified two-degree-of-freedom gearbox model considering nonlinear backlash, time-periodic mesh stiffness and static transmission error, simulates the vibration signals to be analysed. Local stiffness reduction is used for the simulation of tooth faults and statistical processes determine the existence of intermittencies. The lowest level of fault detection, the threshold value, is considered and the Mahalanobis squared-distance is calculated for the novelty detection problem.
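    The novelty-detection step described above reduces to computing the Mahalanobis squared-distance of a feature vector against a baseline (healthy-condition) distribution and comparing it with a threshold. A minimal sketch under stated assumptions: the feature matrix, the 99th-percentile chi-squared threshold, and the random data are illustrative, not the paper's gearbox features.

        import numpy as np
        from scipy.stats import chi2

        rng = np.random.default_rng(0)
        baseline = rng.normal(size=(500, 4))      # healthy-condition feature vectors
        test = rng.normal(size=4) + 3.0           # one feature vector to test

        mu = baseline.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(baseline, rowvar=False))

        d2 = (test - mu) @ cov_inv @ (test - mu)  # Mahalanobis squared-distance
        threshold = chi2.ppf(0.99, df=baseline.shape[1])
        print("damage flagged:", d2 > threshold)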

  3. The Independent Technical Analysis Process

    Energy Technology Data Exchange (ETDEWEB)

    Duberstein, Corey A.; Ham, Kenneth D.; Dauble, Dennis D.; Johnson, Gary E.

    2007-04-13

    The Bonneville Power Administration (BPA) contracted with the Pacific Northwest National Laboratory (PNNL) to provide technical analytical support for system-wide fish passage information (BPA Project No. 2006-010-00). The goal of this project was to produce rigorous technical analysis products using independent analysts and anonymous peer reviewers. In the past, regional parties have interacted with a single entity, the Fish Passage Center, to access the data, analyses, and coordination related to fish passage. This project provided an independent technical source for non-routine fish passage analyses while allowing routine support functions to be performed by other well-qualified entities.

  4. Heart rate analysis by sparse representation for acute pain detection.

    Science.gov (United States)

    Tejman-Yarden, Shai; Levi, Ofer; Beizerov, Alex; Parmet, Yisrael; Nguyen, Tu; Saunders, Michael; Rudich, Zvia; Perry, James C; Baker, Dewleen G; Moeller-Bertram, Tobias

    2016-04-01

    Objective pain assessment methods pose an advantage over the currently used subjective pain rating tools. Advanced signal processing methodologies, including the wavelet transform (WT) and the orthogonal matching pursuit algorithm (OMP), were developed in the past two decades. The aim of this study was to apply and compare these time-specific methods on heart rate samples of healthy subjects for acute pain detection. Fifteen adult volunteers participated in a study conducted in the pain clinic at a single center. Each subject's heart rate was sampled for a 5-min baseline, followed by a cold pressor test (CPT). Analysis was done by the WT and by the OMP algorithm with a Fourier/Wavelet dictionary separately. Data from 11 subjects were analyzed. Compared to baseline, the WT analysis showed a significant increase in coefficient density during the pain incline period, as did the OMP algorithm, allowing detection of pain events. Statistical analysis proved the OMP to be by far more specific, allowing the Fourier coefficients to represent the signal's basic harmonics and the wavelet coefficients to focus on the time-specific painful event. This is an initial study using OMP for pain detection; further studies need to prove the efficiency of this system in different settings. PMID:26264057

  5. Detecting Phishing Attacks In Purchasing Process Through Proactive Approach

    Directory of Open Access Journals (Sweden)

    S.Arun

    2012-06-01

    Full Text Available A monitor is a software system that observes and analyzes the behavior of a target system, determining qualities of interest such as the satisfaction of the target system's requirements. In modern technology, business processes are open and distributed, which may lead to failure. Therefore monitoring is an important task for the services that comprise these processes. We present a framework for multilevel monitoring of these service systems. The main objective of this project is monitoring the customer who purchases items from a merchant. Phishing is an online scam that attempts to defraud people of their personal information such as credit card or bank account information. We detect, locate and remove the phishing e-mail. The customer details are stored in a web registry. We demonstrate how online business processes can be implemented with multiple scenarios that include monitoring open service policy commitments.

  6. Weighted Dickey-Fuller Processes for Detecting Stationarity

    CERN Document Server

    Steland, Ansgar

    2010-01-01

    Aiming at monitoring a time series to detect stationarity as soon as possible, we introduce monitoring procedures based on kernel-weighted sequential Dickey-Fuller (DF) processes, and related stopping times, which may be called weighted Dickey-Fuller control charts. Under rather weak assumptions, (functional) central limit theorems are established under the unit root null hypothesis and local-to-unity alternatives. For general dependent and heterogeneous innovation sequences the limit processes depend on a nuisance parameter. In this case of practical interest, one can use estimated control limits obtained from the estimated asymptotic law. Another easy-to-use approach is to transform the DF processes to obtain limit laws which are invariant with respect to the nuisance parameter. We provide asymptotic theory for both approaches and compare their statistical behavior in finite samples by simulation.
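    A rough illustration of the monitoring idea, under stated assumptions: it uses a plain augmented Dickey-Fuller test from statsmodels applied sequentially to a sliding window, not the kernel-weighted DF process or the control-limit theory developed in the paper; the series, window length, and significance level are placeholders.

        import numpy as np
        from statsmodels.tsa.stattools import adfuller

        rng = np.random.default_rng(1)
        # Unit-root segment followed by a stationary segment.
        x = np.concatenate([np.cumsum(rng.normal(size=300)), rng.normal(size=300)])

        window, stop = 200, None
        for t in range(window, len(x), 25):         # monitor as new data arrive
            stat, pvalue = adfuller(x[t - window:t])[:2]   # H0: unit root
            if pvalue < 0.05:                       # evidence of stationarity -> stop
                stop = t
                break
        print("stopping time:", stop)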

  7. Algorithms Development in Detection of the Gelatinization Process during Enzymatic ‘Dodol’ Processing

    Directory of Open Access Journals (Sweden)

    Azman Hamzah

    2007-11-01

    Full Text Available Computer vision systems have found wide application in the food processing industry for quality evaluation. Such systems make it possible to replace human inspectors in the evaluation of a variety of quality attributes. This paper describes the implementation of Fast Fourier Transform and Kalman filtering algorithms to detect glutinous rice flour slurry (GRFS) gelatinization in an enzymatic ‘dodol’ processing. The onset of GRFS gelatinization is critical in determining the quality of an enzymatic ‘dodol’. Combinations of these two algorithms were able to detect the gelatinization of the GRFS. The results show that gelatinization of the GRFS occurred in the time range of 11.75 minutes to 15.33 minutes for 20 batches of processing. This paper highlights the capability of computer vision using the proposed algorithms in monitoring and controlling an enzymatic ‘dodol’ processing via image processing technology.
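    A compact sketch of how the two algorithms named above could be combined, under stated assumptions: a per-frame image feature stands in for the vision front end, its FFT spectrum summarizes periodic content, and a scalar Kalman filter smooths the feature before a threshold marks the gelatinization onset; all signals, gains, and the threshold are illustrative.

        import numpy as np

        rng = np.random.default_rng(2)
        # Hypothetical per-frame feature (e.g., mean image intensity), one value per
        # minute; a shift upward after minute 12 mimics the onset of gelatinization.
        feature = np.concatenate([rng.normal(5, 0.3, 12), rng.normal(7, 0.3, 12)])

        # FFT view of the feature series (the first algorithm named in the abstract).
        spectrum = np.abs(np.fft.rfft(feature - feature.mean()))

        # Scalar Kalman filter (constant-level model) to denoise the feature.
        x_hat, p, q, r = feature[0], 1.0, 1e-2, 0.3 ** 2
        smoothed = []
        for z in feature:
            p += q                                  # predict
            k = p / (p + r)                         # Kalman gain
            x_hat += k * (z - x_hat)                # update with measurement z
            p *= (1 - k)
            smoothed.append(x_hat)

        onset = next((t for t, v in enumerate(smoothed) if v > 6.0), None)
        print("gelatinization onset at minute", onset)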

  8. Risk Analysis for Tea Processing

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    It is obvious that after all the disasters with dioxins, BSE, pathogens, Foot and Mouth disease and others, and now, because of the possibilities of bioterrorism, Food Safety is almost at the top of the agenda of the EU for the years to come. The implementation of certain hygiene principles such as HACCP and a transparent hygiene policy applicable to all food and all food operators, from the farm to the table, together with effective instruments to manage Food Safety, will form a substantial part of this agenda. As an example, external quality factors such as certain pathogens in tea will be discussed. Since risk analysis of e.g. mycotoxins already has quite a long history and development in several international bodies, and tea might bear unwanted (or deliberately added by terrorist action) contaminants, the need to monitor tea much more regularly than is being done today seems to be a "conditio sine qua non". Recently developed Immuno Flow tests may one day help the consumer to find out if he gets poisoned.

  9. Respiratory rate detection algorithms by photoplethysmography signal processing.

    Science.gov (United States)

    Lee, E M; Kim, N H; Trang, N T; Hong, J H; Cha, E J; Lee, T S

    2008-01-01

    Photoplethysmography (PPG) offers clinically meaningful parameters such as heart rate and respiratory rate. In this study, we present three respiratory signal detection algorithms using raw photoplethysmography data generated from a commercial PPG sensor: (1) Min-Max, (2) Peak-to-Peak, and (3) Pulse Shape. As a reference signal, a nasal sensor signal was acquired simultaneously and compared and analyzed. We used two types of moving average filtering techniques to process the three PPG parameters. In a laboratory experiment, 6 subjects' PPG signals were measured while they respired ten, fifteen, and an arbitrary number of times per minute. From the results, the following conclusions were drawn. The Min-Max and Peak-to-Peak algorithms perform better than the Pulse Shape algorithm and can be used to detect respiratory rate; the Pulse Shape algorithm was accurate for subject 4 only. More experimental data are necessary to improve the accuracy and reliability. PMID:19162865
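    A minimal sketch of the Peak-to-Peak idea named above, under stated assumptions: a synthetic PPG-like signal replaces sensor data, scipy's peak finder stands in for the beat detector, and the respiratory rate is read from the dominant oscillation of the beat-amplitude series; the sampling rate and all signal parameters are illustrative.

        import numpy as np
        from scipy.signal import find_peaks

        fs = 100.0                                   # Hz, assumed sampling rate
        t = np.arange(0, 60, 1 / fs)                 # one minute of data
        # Synthetic PPG: 1.2 Hz cardiac pulses, amplitude-modulated at 0.25 Hz
        # (15 breaths/min) to mimic respiratory influence.
        ppg = (1 + 0.3 * np.sin(2 * np.pi * 0.25 * t)) * np.sin(2 * np.pi * 1.2 * t)

        beats, _ = find_peaks(ppg, distance=0.5 * fs)   # one peak per cardiac cycle
        amps = ppg[beats] - ppg[beats].mean()           # beat-amplitude series

        # Dominant frequency of the amplitude series = respiratory rate.
        beat_times = beats / fs
        freqs = np.fft.rfftfreq(len(amps), d=np.median(np.diff(beat_times)))
        resp_hz = freqs[np.argmax(np.abs(np.fft.rfft(amps)))]
        print("estimated respiratory rate: %.1f breaths/min" % (resp_hz * 60))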

  10. Simple data processing for gamma spectroscopy in activation analysis

    International Nuclear Information System (INIS)

    A data processing system was developed for multielement neutron activation analysis. It uses a desk-top programmable calculator that is operated off line. The processing is divided into two main programs. The first one detects the peaks of a series of gamma spectra with a 0.1-keV accuracy. The second program performs the integration of peaks selected by the operator, including those for which he wants a detection limit. Peak intensities corrected for decay and dead time are calculated. This system has many advantages: ease of use, great versatility, extensive possibilities for manual control, and low cost.

  11. Spectral analysis method for detecting an element

    Science.gov (United States)

    Blackwood, Larry G [Idaho Falls, ID; Edwards, Andrew J [Idaho Falls, ID; Jewell, James K [Idaho Falls, ID; Reber, Edward L [Idaho Falls, ID; Seabury, Edward H [Idaho Falls, ID

    2008-02-12

    A method for detecting an element is described and which includes the steps of providing a gamma-ray spectrum which has a region of interest which corresponds with a small amount of an element to be detected; providing nonparametric assumptions about a shape of the gamma-ray spectrum in the region of interest, and which would indicate the presence of the element to be detected; and applying a statistical test to the shape of the gamma-ray spectrum based upon the nonparametric assumptions to detect the small amount of the element to be detected.

  12. Streak detection and analysis pipeline for optical images

    Science.gov (United States)

    Virtanen, J.; Granvik, M.; Torppa, J.; Muinonen, K.; Poikonen, J.; Lehti, J.; Säntti, T.; Komulainen, T.; Flohrer, T.

    2014-07-01

    We describe a novel data processing and analysis pipeline for optical observations of moving objects, either of natural (asteroids, meteors) or artificial origin (satellites, space debris). The monitoring of the space object populations requires reliable acquisition of observational data to support the development and validation of population models, and to build and maintain catalogues of orbital elements. The orbital catalogues are, in turn, needed for the assessment of close approaches (for asteroids, with the Earth; for satellites, with each other) and for the support of contingency situations or launches. For both types of populations, there is also increasing interest to detect fainter objects corresponding to the small end of the size distribution. We focus on the low signal-to-noise (SNR) detection of objects with high angular velocities, resulting in long and faint object trails, or streaks, in the optical images. The currently available, mature image processing algorithms for detection and astrometric reduction of optical data cover objects that cross the sensor field-of-view comparably slowly, and, particularly for satellites, within a rather narrow, predefined range of angular velocities. By applying specific tracking techniques, the objects appear point-like or as short trails in the exposures. However, the general survey scenario is always a 'track-before-detect' problem, resulting in streaks of arbitrary lengths. Although some considerations for low-SNR processing of streak-like features are available in the current image processing and computer vision literature, algorithms are not readily available yet. In the ESA-funded StreakDet (Streak detection and astrometric reduction) project, we develop and evaluate an automated processing pipeline applicable to single images (as compared to consecutive frames of the same field) obtained with any observing scenario, including space-based surveys and both low- and high-altitude populations. The algorithmic

  13. Vision based error detection for 3D printing processes

    Directory of Open Access Journals (Sweden)

    Baumann Felix

    2016-01-01

    Full Text Available 3D printers became more popular in the last decade, partly because of the expiration of key patents and the supply of affordable machines. Their origin lies in rapid prototyping. With Additive Manufacturing (AM) it is possible to create physical objects from 3D model data by layer-wise addition of material. Besides professional use for prototyping and low volume manufacturing, they are becoming widespread amongst end users, starting with the so-called Maker Movement. The most prevalent type of consumer grade 3D printer is Fused Deposition Modelling (FDM; also Fused Filament Fabrication, FFF). This work focuses on FDM machinery because of its widespread occurrence and large number of open problems like precision and failure. These 3D printers can fail to print objects at a statistical rate depending on the manufacturer and model of the printer. Failures can occur due to misalignment of the print-bed or the print-head, slippage of the motors, warping of the printed material, lack of adhesion, or other reasons. The goal of this research is to provide an environment in which these failures can be detected automatically. Direct supervision is inhibited by the recommended placement of FDM printers in separate rooms away from the user due to ventilation issues. The inability to oversee the printing process leads to late or omitted detection of failures. Rejects cause material waste and wasted time, thus lowering the utilization of printing resources. Our approach consists of a camera-based error detection mechanism that provides a web-based interface for remote supervision and early failure detection. Early failure detection can lead to reduced time spent on broken prints, less material wasted, and in some cases salvaged objects.

  14. Process and apparatus for detecting presence of plant substances

    International Nuclear Information System (INIS)

    This patent describes an apparatus and process for detecting the presence of plant substances in a particular environment. It comprises: measuring the background K40 gamma ray radiation level in a particular environment with a 1.46 MeV gamma ray counter system; measuring the amount of K40 gamma ray radiation emanating from a package containing a plant substance being passed through an environment with a counter; and generating an alarm signal when the total K40 gamma ray radiation reaches a predetermined level over and above the background level



  17. Normality Analysis for RFI Detection in Microwave Radiometry

    Directory of Open Access Journals (Sweden)

    Adriano Camps

    2009-12-01

    Full Text Available Radio-frequency interference (RFI present in microwave radiometry measurements leads to erroneous radiometric results. Sources of RFI include spurious signals and harmonics from lower frequency bands, spread-spectrum signals overlapping the “protected” band of operation, or out-of-band emissions not properly rejected by the pre-detection filters due to its finite rejection. The presence of RFI in the radiometric signal modifies the detected power and therefore the estimated antenna temperature from which the geophysical parameters will be retrieved. In recent years, techniques to detect the presence of RFI in radiometric measurements have been developed. They include time- and/or frequency domain analyses, or time and/or frequency domain statistical analysis of the received signal which, in the absence of RFI, must be a zero-mean Gaussian process. Statistical analyses performed to date include the calculation of the Kurtosis, and the Shapiro-Wilk normality test of the received signal. Nevertheless, statistical analysis of the received signal could be more extensive, as reported in the Statistics literature. The objective of this work is the study of the performance of a number of normality tests encountered in the Statistics literature when applied to the detection of the presence of RFI in the radiometric signal, which is Gaussian by nature. A description of the normality tests and the RFI detection results for different kinds of RFI are presented in view of determining an omnibus test that can deal with the blind spots of the currently used methods.
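    The statistical tests mentioned above are available off the shelf, so the detection idea can be sketched in a few lines. Under stated assumptions: a clean Gaussian radiometric record and a sinusoid-contaminated one are simulated, and the Shapiro-Wilk, D'Agostino and Anderson-Darling tests from scipy stand in for the fuller battery of normality tests evaluated in the paper.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        clean = rng.normal(size=2048)            # RFI-free samples: zero-mean Gaussian
        rfi = clean + np.sin(2 * np.pi * 0.05 * np.arange(2048))   # sinusoidal RFI added

        for name, x in [("clean", clean), ("with RFI", rfi)]:
            sw_stat, sw_p = stats.shapiro(x)     # Shapiro-Wilk normality test
            k2_stat, k2_p = stats.normaltest(x)  # D'Agostino K^2 (skewness/kurtosis)
            ad = stats.anderson(x, dist="norm")  # Anderson-Darling (statistic only)
            print(name, "SW p=%.3g, K2 p=%.3g, AD stat=%.2f" % (sw_p, k2_p, ad.statistic))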

  18. Microarrays for Pathogen Detection and Analysis

    OpenAIRE

    McLoughlin, Kevin S.

    2011-01-01

    DNA microarrays have emerged as a viable platform for detection of pathogenic organisms in clinical and environmental samples. These microbial detection arrays occupy a middle ground between low cost, narrowly focused assays such as multiplex PCR and more expensive, broad-spectrum technologies like high-throughput sequencing. While pathogen detection arrays have been used primarily in a research context, several groups are aggressively working to develop arrays for clinical diagnostics, food ...

  19. Bisous model-Detecting filamentary patterns in point processes

    Science.gov (United States)

    Tempel, E.; Stoica, R. S.; Kipper, R.; Saar, E.

    2016-07-01

    The cosmic web is a highly complex geometrical pattern, with galaxy clusters at the intersection of filaments and filaments at the intersection of walls. Identifying and describing the filamentary network is not a trivial task due to the overwhelming complexity of the structure, its connectivity and its intrinsic hierarchical nature. To detect and quantify galactic filaments we use the Bisous model, which is a marked point process built to model multi-dimensional patterns. The Bisous filament finder works directly with the galaxy distribution data and the model intrinsically takes into account the connectivity of the filamentary network. The Bisous model generates the visit map (the probability to find a filament at a given point) together with the filament orientation field. Using these two fields, we can extract filament spines from the data. Together with this paper we publish the computer code for the Bisous model, which is made available on GitHub. The Bisous filament finder has been successfully used in several cosmological applications, and further development of the model will allow detection of the filamentary network also in photometric redshift surveys, using the full redshift posterior. We also want to encourage the astro-statistical community to use the model and to connect it with all other existing methods for filamentary pattern detection and characterisation.

  20. Abnormal traffic flow data detection based on wavelet analysis

    Directory of Open Access Journals (Sweden)

    Xiao Qian

    2016-01-01

    Full Text Available Because traffic flow data are non-stationary, abnormal data are difficult to detect. This paper proposes a method for detecting abnormal traffic flow data based on wavelet analysis and the least squares method. Wavelet analysis is first used to separate the traffic flow data into high-frequency and low-frequency components; the least squares method is then combined with this to find abnormal points in the reconstructed signal data. The simulation results show that detecting abnormal traffic flow data using wavelet analysis effectively reduces both the misjudgment rate and the false negative rate of the detection results.
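    A minimal sketch of that two-step scheme, under stated assumptions: PyWavelets supplies the decomposition, the low-frequency component is reconstructed by zeroing detail coefficients, a least-squares polynomial is fitted to it, and points whose residual exceeds three standard deviations are flagged; the synthetic series, wavelet choice and threshold are illustrative.

        import numpy as np
        import pywt

        rng = np.random.default_rng(4)
        flow = 100 + 20 * np.sin(np.linspace(0, 3 * np.pi, 288)) + rng.normal(0, 2, 288)
        flow[150] += 40                                # inject one abnormal reading

        # Separate high- and low-frequency components with a discrete wavelet transform.
        coeffs = pywt.wavedec(flow, "db4", level=3)
        low = pywt.waverec([coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]],
                           "db4")[:len(flow)]

        # Least-squares fit to the reconstructed low-frequency signal; flag residuals.
        t = np.linspace(-1, 1, len(flow))
        trend = np.polyval(np.polyfit(t, low, deg=8), t)
        resid = flow - trend
        outliers = np.flatnonzero(np.abs(resid) > 3 * resid.std())
        print("abnormal points:", outliers)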

  1. Intelligent signal analysis methodologies for nuclear detection, identification and attribution

    Science.gov (United States)

    Alamaniotis, Miltiadis

    Detection and identification of special nuclear materials can be fully performed with a radiation detector-spectrometer. Due to several physical and computational limitations, development of fast and accurate radioisotope identifier (RIID) algorithms is essential for automated radioactive source detection and characterization. The challenge is to identify individual isotope signatures embedded in an aggregation of spectral signatures. In addition, background and isotope spectra overlap, further complicating the signal analysis. These concerns are addressed, in this thesis, through a set of intelligent methodologies recognizing signature spectra and the background spectrum and, subsequently, identifying radionuclides. Initially, a method for detection and extraction of signature patterns is accomplished by means of fuzzy logic. The fuzzy logic methodology is applied to three types of radiation signal processing applications, where it exhibits high positive detection, a low false alarm rate and very short execution time, while outperforming the maximum likelihood fitting approach. In addition, an innovative Pareto optimal multiobjective fitting of gamma ray spectra using evolutionary computing is presented. The methodology exhibits perfect identification while performing better than single objective fitting. Lastly, an innovative kernel based machine learning methodology was developed for estimating the natural background spectrum in gamma ray spectra. The novelty of the methodology lies in the fact that it implements a data based approach and does not require any explicit physics modeling. Results show that the kernel based method adequately estimates the gamma background, but the algorithm's performance exhibits a strong dependence on the selected kernel.
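    A small sketch of the data-driven background idea described last, under stated assumptions: kernel ridge regression from scikit-learn (one plausible kernel method, not necessarily the thesis's) is fitted to a simulated spectrum so that the smooth component it learns approximates the continuum beneath a narrow photopeak; all spectrum parameters are illustrative.

        import numpy as np
        from sklearn.kernel_ridge import KernelRidge

        rng = np.random.default_rng(5)
        ch = np.arange(1024)
        continuum = 200 * np.exp(-ch / 400)                      # smooth background
        peak = 80 * np.exp(-0.5 * ((ch - 662) / 4) ** 2)         # narrow line at channel 662
        counts = rng.poisson(continuum + peak)

        # A wide RBF kernel follows the continuum but is too stiff to track the peak.
        model = KernelRidge(kernel="rbf", gamma=1.0 / (2 * 80.0 ** 2), alpha=1.0)
        model.fit(ch.reshape(-1, 1), counts)
        est_background = model.predict(ch.reshape(-1, 1))

        net = counts - est_background                            # background-subtracted
        print("net counts near the peak:", net[650:675].sum())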

  2. Detection and Analysis of Threats to the Energy Sector: DATES

    Energy Technology Data Exchange (ETDEWEB)

    Alfonso Valdes

    2010-03-31

    This report summarizes Detection and Analysis of Threats to the Energy Sector (DATES), a project sponsored by the United States Department of Energy and performed by a team led by SRI International, with collaboration from Sandia National Laboratories, ArcSight, Inc., and Invensys Process Systems. DATES sought to advance the state of the practice in intrusion detection and situational awareness with respect to cyber attacks in energy systems. This was achieved through adaptation of detection algorithms for process systems as well as development of novel anomaly detection techniques suited for such systems into a detection suite. These detection components, together with third-party commercial security systems, were interfaced with the commercial Security Information Event Management (SIEM) solution from ArcSight. The efficacy of the integrated solution was demonstrated on two testbeds, one based on a Distributed Control System (DCS) from Invensys, and the other based on the Virtual Control System Environment (VCSE) from Sandia. These achievements advance the DOE Cybersecurity Roadmap [DOE2006] goals in the area of security monitoring. The project ran from October 2007 until March 2010, with the final six months focused on experimentation. In the validation phase, team members from SRI and Sandia coupled the two test environments and carried out a number of distributed and cross-site attacks against various points in one or both testbeds. Alert messages from the distributed, heterogeneous detection components were correlated using the ArcSight SIEM platform, providing within-site and cross-site views of the attacks. In particular, the team demonstrated detection and visualization of network zone traversal and denial-of-service attacks. These capabilities were presented to the DistribuTech Conference and Exhibition in March 2010. The project was hampered by interruption of funding due to continuing resolution issues and agreement on cost share for four months in 2008

  3. Coherent detection and digital signal processing for fiber optic communications

    Science.gov (United States)

    Ip, Ezra

    The drive towards higher spectral efficiency in optical fiber systems has generated renewed interest in coherent detection. We review different detection methods, including noncoherent, differentially coherent, and coherent detection, as well as hybrid detection methods. We compare the modulation methods that are enabled and their respective performances in a linear regime. An important system parameter is the number of degrees of freedom (DOF) utilized in transmission. Polarization-multiplexed quadrature-amplitude modulation maximizes spectral efficiency and power efficiency as it uses all four available DOF contained in the two field quadratures in the two polarizations. Dual-polarization homodyne or heterodyne downconversion are linear processes that can fully recover the received signal field in these four DOF. When downconverted signals are sampled at the Nyquist rate, compensation of transmission impairments can be performed using digital signal processing (DSP). Software based receivers benefit from the robustness of DSP, flexibility in design, and ease of adaptation to time-varying channels. Linear impairments, including chromatic dispersion (CD) and polarization-mode dispersion (PMD), can be compensated quasi-exactly using finite impulse response filters. In practical systems, sampling the received signal at 3/2 times the symbol rate is sufficient to enable an arbitrary amount of CD and PMD to be compensated for a sufficiently long equalizer whose tap length scales linearly with transmission distance. Depending on the transmitted constellation and the target bit error rate, the analog-to-digital converter (ADC) should have around 5 to 6 bits of resolution. Digital coherent receivers are naturally suited for the implementation of feedforward carrier recovery, which has superior linewidth tolerance than phase-locked loops, and does not suffer from feedback delay constraints. Differential bit encoding can be used to prevent catastrophic receiver failure due

  4. Signal detection in FDA AERS database using Dirichlet process.

    Science.gov (United States)

    Hu, Na; Huang, Lan; Tiwari, Ram C

    2015-08-30

    In the recent two decades, data mining methods for signal detection have been developed for drug safety surveillance, using large post-market safety data. Several of these methods assume that the number of reports for each drug-adverse event combination is a Poisson random variable with mean proportional to the unknown reporting rate of the drug-adverse event pair. Here, a Bayesian method based on the Poisson-Dirichlet process (DP) model is proposed for signal detection from large databases, such as the Food and Drug Administration's Adverse Event Reporting System (AERS) database. Instead of using a parametric distribution as a common prior for the reporting rates, as is the case with existing Bayesian or empirical Bayesian methods, a nonparametric prior, namely, the DP, is used. The precision parameter and the baseline distribution of the DP, which characterize the process, are modeled hierarchically. The performance of the Poisson-DP model is compared with some other models, through an intensive simulation study using a Bayesian model selection and frequentist performance characteristics such as type-I error, false discovery rate, sensitivity, and power. For illustration, the proposed model and its extension to address a large amount of zero counts are used to analyze statin drugs for signals using the 2006-2011 AERS data. PMID:25924820

  5. Three-dimensional model analysis and processing

    CERN Document Server

    Yu, Faxin; Luo, Hao; Wang, Pinghui

    2011-01-01

    This book focuses on five hot research directions in 3D model analysis and processing in computer science:  compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.

  6. Meta-analysis using Dirichlet process.

    Science.gov (United States)

    Muthukumarana, Saman; Tiwari, Ram C

    2016-02-01

    This article develops a Bayesian approach for meta-analysis using the Dirichlet process. The key aspect of the Dirichlet process in meta-analysis is the ability to assess evidence of statistical heterogeneity, or variation in the underlying effects across studies, while relaxing the distributional assumptions. We assume that the study effects are generated from a Dirichlet process. Under a Dirichlet process model, the study effect parameters have support on a discrete space and enable borrowing of information across studies while facilitating clustering among studies. We illustrate the proposed method by applying it to a dataset from the Program for International Student Assessment on 30 countries. Results from the data analysis, simulation studies, and the log pseudo-marginal likelihood model selection procedure indicate that the Dirichlet process model performs better than conventional alternative methods. PMID:22802045
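    A rough illustration of the clustering behavior a Dirichlet process prior induces on study effects, under stated assumptions: scikit-learn's truncated variational Dirichlet-process Gaussian mixture stands in for the paper's fully Bayesian model, and the simulated effect estimates are illustrative.

        import numpy as np
        from sklearn.mixture import BayesianGaussianMixture

        rng = np.random.default_rng(6)
        # Simulated study effect estimates drawn from two latent clusters.
        effects = np.concatenate([rng.normal(0.2, 0.05, 15), rng.normal(0.8, 0.05, 15)])

        dpgmm = BayesianGaussianMixture(
            n_components=10,                                  # truncation level
            weight_concentration_prior_type="dirichlet_process",
            max_iter=500,
            random_state=0,
        )
        clusters = dpgmm.fit_predict(effects.reshape(-1, 1))
        print("studies per discovered cluster:", np.bincount(clusters))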

  7. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus Plenge

    2005-01-01

    Discussion of the paper "Residual analysis for spatial point processes" by A. Baddeley, M. Hazelton, J. Møller and R. Turner. Journal of the Royal Statistical Society, Series B, vol. 67, pages 617-666, 2005.

  8. Detection limits in x-ray fluorescence analysis

    International Nuclear Information System (INIS)

    X-ray fluorescence spectrometry is a well established analytical technique for elemental analysis of solids, powders, or liquids. This extended abstract briefly discusses the detection limits or sensitivity of x-ray spectrometers in x-ray fluorescence analysis

  9. Wavelet Analysis on Detecting Pulse-Like Earthquakes

    International Nuclear Information System (INIS)

    A quantitative approach for identifying pulse-like ground motions is proposed herein. It is based on the use of the wavelet transform, which has the peculiarity of detecting sudden jumps in time histories by separating the contributions of different frequency levels. Moreover, it has the advantage of low computational cost. Three different wavelet-based signal processing procedures are considered here in order to detect large pulses in near-fault ground motions. The first one is based on the direct decomposition of velocity time histories into frequency levels and has been exploited elsewhere in the scientific literature. The other two are introduced here and take into account energy and power spectra. It is shown that wavelet analysis of the energy allows one to put in evidence even pulses that can hardly be recognized in the analysis of velocity time histories. The proposed procedure also permits one to distinguish the various energy contributions in different frequency ranges. By analyzing the wavelet coefficients, in fact, it is possible to verify whether the mechanical energy release rate associated with a certain earthquake is due to a few severe events or to a series of 'small' events. It is also possible to evidence the frequency content of a specific pulse (say, the one with the highest amount of energy and corresponding power), isolating its analysis from the rest of the ground motion.

  10. Effects of image processing on the detective quantum efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Park, Hye-Suk; Kim, Hee-Joung; Cho, Hyo-Min; Lee, Chang-Lae; Lee, Seung-Wan; Choi, Yu-Na [Yonsei University, Wonju (Korea, Republic of)

    2010-02-15

    The evaluation of image quality is an important part of digital radiography. The modulation transfer function (MTF), the noise power spectrum (NPS), and the detective quantum efficiency (DQE) are widely accepted measurements of digital radiographic system performance. However, as the methodologies for such characterization have not been standardized, it is difficult to directly compare reported MTF, NPS, and DQE results. In this study, we evaluated the effect of an image processing algorithm on estimates of the MTF, NPS, and DQE. The image performance parameters were evaluated using the international electro-technical commission (IEC 62220-1)-defined RQA5 radiographic techniques. Computed radiography (CR) posterior-anterior (PA) images of a hand for measuring the signal to noise ratio (SNR), slit images for measuring the MTF, and white images for measuring the NPS were obtained, and various multi-scale image contrast amplification (MUSICA) factors were applied to each of the acquired images. All of the modifications of the images obtained by using image processing had a considerable influence on the evaluated image quality. In conclusion, the control parameters of image processing must be accounted for when evaluating the characterization of image quality. The results of this study should serve as a baseline for evaluating imaging systems and their imaging characteristics by MTF, NPS, and DQE measurements.


  12. Analysis on Credit Card Fraud Detection Methods

    OpenAIRE

    Renu; De, Suman

    2014-01-01

    Due to the dramatic increase of fraud, which results in the loss of dollars worldwide each year, several modern techniques for detecting fraud are persistently evolved and applied to many business fields. Fraud detection involves monitoring the activities of populations of users in order to estimate, perceive or avoid undesirable behavior. Undesirable behavior is a broad term including delinquency, fraud, intrusion, and account defaulting. This paper presents a survey of current techniques used i...

  13. Abstract Syntax Tree Analysis for Plagiarism Detection

    OpenAIRE

    Nilsson, Erik

    2012-01-01

    Today, universities rely heavily on systems for detecting plagiarism in students' essays and reports. Code submissions however require specific tools. A number of approaches to finding plagiarisms in code have already been tried, including techniques based on comparing textual transformations of code, token strings, parse trees and graph representations. In this master's thesis, a new system, cojac, is presented which combines textual, tree and graph techniques to detect a broad spectrum of plagiar...

  14. Direct detection of DNA conformation in hybridization processes.

    Science.gov (United States)

    Papadakis, George; Tsortos, Achilleas; Bender, Florian; Ferapontova, Elena E; Gizeli, Electra

    2012-02-21

    DNA hybridization studies at surfaces normally rely on the detection of mass changes as a result of the addition of the complementary strand. In this work we propose a mass-independent sensing principle based on the quantitative monitoring of the conformation of the immobilized single-strand probe and of the final hybridized product. This is demonstrated by using a label-free acoustic technique, the quartz crystal microbalance (QCM-D), and oligonucleotides of specific sequences which, upon hybridization, result in DNAs of various shapes and sizes. Measurements of the acoustic ratio ΔD/ΔF in combination with a "discrete molecule binding" approach are used to confirm the formation of straight hybridized DNA molecules of specific lengths (21, 75, and 110 base pairs); acoustic results are also used to distinguish between single- and double-stranded molecules as well as between same-mass hybridized products with different shapes, i.e., straight or "Y-shaped". Issues such as the effect of mono- and divalent cations to hybridization and the mechanism of the process (nucleation, kinetics) when it happens on a surface are carefully considered. Finally, this new sensing principle is applied to single-nucleotide polymorphism detection: a DNA hairpin probe hybridized to the p53 target gene gave products of distinct geometrical features depending on the presence or absence of the SNP, both readily distinguishable. Our results suggest that DNA conformation probing with acoustic wave sensors is a much more improved detection method over the popular mass-related, on/off techniques offering higher flexibility in the design of solid-phase hybridization assays. PMID:22248021

  15. Pattern Detection and Extreme Value Analysis on Large Climate Data

    Science.gov (United States)

    Prabhat, M.; Byna, S.; Paciorek, C.; Weber, G.; Wu, K.; Yopes, T.; Wehner, M. F.; Ostrouchov, G.; Pugmire, D.; Strelitz, R.; Collins, W.; Bethel, W.

    2011-12-01

    We consider several challenging problems in climate that require quantitative analysis of very large data volumes generated by modern climate simulations. We demonstrate new software capable of addressing these challenges that is designed to exploit petascale platforms using state-of-the-art methods in high performance computing. Atmospheric rivers and Hurricanes are important classes of extreme weather phenomena. Developing analysis tools that can automatically detect these events in large climate datasets can provide us with invaluable information about the frequency of these events. Application of these tools to different climate model outputs can provide us with quality metrics that evaluate whether models produce this important class of phenomena and how the statistics of these events will likely vary in the future. In this work, we present an automatic technique for detecting atmospheric rivers. We use techniques from image processing and topological analysis to extract these features. We implement this technique in a massively parallel fashion on modern supercomputing platforms, and apply the resulting software to both observational data and various models from the CMIP-3 archive. We have successfully completed atmospheric river detections on 1TB of data on 10000 hopper cores in 10 seconds. For hurricane tracking, we have adapted code from GFDL to run in parallel on large datasets. We present results from the application of this code to some recent high resolution CAM5 simulations. Our code is capable of processing 1TB of data in 10 seconds. Extreme value analysis involves statistical techniques for estimating the probability of extreme events and variations in the probabilities over time and space. Because of their rarity, there is a high degree of uncertainty when estimating the behavior of extremes from data at any one location. We are developing a local likelihood approach to borrow strength from multiple locations, with uncertainty estimated using the

  16. Damage Detection in Composite Structures with Wavenumber Array Data Processing

    Science.gov (United States)

    Tian, Zhenhua; Leckey, Cara; Yu, Lingyu

    2013-01-01

    Guided ultrasonic waves (GUW) have the potential to be an efficient and cost-effective method for rapid damage detection and quantification of large structures. Attractive features include sensitivity to a variety of damage types and the capability of traveling relatively long distances. They have proven to be an efficient approach for crack detection and localization in isotropic materials. However, techniques must be pushed beyond isotropic materials in order to be valid for composite aircraft components. This paper presents our study on GUW propagation and interaction with delamination damage in composite structures using wavenumber array data processing, together with advanced wave propagation simulations. Parallel elastodynamic finite integration technique (EFIT) is used for the example simulations. Multi-dimensional Fourier transform is used to convert time-space wavefield data into frequency-wavenumber domain. Wave propagation in the wavenumber-frequency domain shows clear distinction among the guided wave modes that are present. This allows for extracting a guided wave mode through filtering and reconstruction techniques. Presence of delamination causes spectral change accordingly. Results from 3D CFRP guided wave simulations with delamination damage in flat-plate specimens are used for wave interaction with structural defect study.
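    The frequency-wavenumber step described above is essentially a multi-dimensional FFT followed by masking and inversion. A minimal sketch for a time-space wavefield on a 1-D scan line, under stated assumptions: the synthetic two-mode wavefield, the grid spacings, and the rectangular wavenumber mask that isolates one mode are all illustrative.

        import numpy as np

        dt, dx = 1e-6, 1e-3                      # s and m grid spacing (assumed)
        t = np.arange(256) * dt
        x = np.arange(128) * dx
        T, X = np.meshgrid(t, x, indexing="ij")

        # Two guided-wave modes with different phase velocities (2000 and 5000 m/s).
        f0 = 100e3                               # Hz excitation frequency
        field = (np.sin(2 * np.pi * f0 * (T - X / 2000))
                 + np.sin(2 * np.pi * f0 * (T - X / 5000)))

        # Multi-dimensional Fourier transform: time-space -> frequency-wavenumber.
        FK = np.fft.fftshift(np.fft.fft2(field))
        freqs = np.fft.fftshift(np.fft.fftfreq(len(t), dt))   # f axis of the f-k plane
        ks = np.fft.fftshift(np.fft.fftfreq(len(x), dx))      # k axis of the f-k plane

        # Keep only |k| < 35 /m: passes the fast mode (k = f0/c = 20 /m) and
        # rejects the slow one (k = 50 /m), then reconstruct that single mode.
        mask = (np.abs(ks) < 35)[None, :]
        mode = np.fft.ifft2(np.fft.ifftshift(FK * mask)).real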

  17. Detecting geomorphic processes and change with high resolution topographic data

    Science.gov (United States)

    Mudd, Simon; Hurst, Martin; Grieve, Stuart; Clubb, Fiona; Milodowski, David; Attal, Mikael

    2016-04-01

    The first global topographic dataset was released in 1996, with 1 km grid spacing. It is astonishing that in only 20 years we now have access to tens of thousands of square kilometres of LiDAR data at point densities greater than 5 points per square meter. This data represents a treasure trove of information that our geomorphic predecessors could only dream of. But what are we to do with this data? Here we explore the potential of high resolution topographic data to dig deeper into geomorphic processes across a wider range of landscapes and using much larger spatial coverage than previously possible. We show how this data can be used to constrain sediment flux relationships using relief and hillslope length, and how this data can be used to detect landscape transience. We show how the nonlinear sediment flux law, proposed for upland, soil mantled landscapes by Roering et al. (1999) is consistent with a number of topographic tests. This flux law allows us to predict how landscapes will respond to tectonic forcing, and we show how these predictions can be used to detect erosion rate perturbations across a range of tectonic settings.

  18. Natural Language Processing for Detecting Forward Reference in a Document

    Directory of Open Access Journals (Sweden)

    Daniel Siahaan

    2012-11-01

    Full Text Available Meyer’s seven sins have been recognized as types of mistakes that requirements specialists often fall into when specifying requirements. Such mistakes play a significant role in plunging a project into failure. Many researchers have focused on the ambiguity and contradiction types of mistakes; other types have been given less attention, even though they often happen in practice and may be equally costly as the first two. This paper introduces an approach to detect forward references. It traverses a requirements document and extracts and processes each statement. During statement extraction, any terms residing in the statement are also extracted. Based on certain rules which utilize POS patterns, the statement is classified as a term definition or not. For each term definition, a term is added to a list of defined terms. At the same time, every time a new term is found in a statement, it is checked against the list of defined terms. If it is not found, then the requirements statement is classified as a statement with a forward reference. Experimentation on 30 requirements documents from various domains of software projects shows that the approach has almost perfect agreement with domain experts in detecting forward references, given a 0.83 kappa index value.
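    A toy sketch of the bookkeeping described above, under stated assumptions: a single regular expression ("X is/means/refers to ...") stands in for the paper's POS-pattern rules for recognizing term definitions, and capitalized multi-word phrases stand in for extracted terms; both are simplifications for illustration.

        import re

        DEF_PATTERN = re.compile(r"^(?:The\s+)?([A-Z][\w ]+?)\s+(?:is|means|refers to)\b")
        TERM_PATTERN = re.compile(r"\b[A-Z][a-z]+(?: [A-Z][a-z]+)+\b")  # crude term extractor

        def find_forward_references(statements):
            defined, flagged = set(), []
            for i, s in enumerate(statements):
                for term in TERM_PATTERN.findall(s):   # term used before its definition?
                    if term not in defined and not DEF_PATTERN.match(s):
                        flagged.append((i, term))
                m = DEF_PATTERN.match(s)               # record newly defined terms
                if m:
                    defined.add(m.group(1).strip())
            return flagged

        doc = [
            "The system shall log every Access Request.",    # forward reference
            "Access Request means a message asking for entry permission.",
        ]
        print(find_forward_references(doc))              # -> [(0, 'Access Request')]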

  19. Infective endocarditis detection through SPECT/CT images digital processing

    Science.gov (United States)

    Moreno, Albino; Valdés, Raquel; Jiménez, Luis; Vallejo, Enrique; Hernández, Salvador; Soto, Gabriel

    2014-03-01

    Infective endocarditis (IE) is a difficult-to-diagnose pathology, since its manifestation in patients is highly variable. In this work, a semiautomatic algorithm based on digital processing of SPECT images was proposed for the detection of IE, using a CT image volume as a spatial reference. The heart/lung ratio was calculated using the SPECT image information. There were no statistically significant differences between the heart/lung ratio values of a group of patients diagnosed with IE (2.62+/-0.47) and a group of healthy control subjects (2.84+/-0.68). However, it is necessary to increase the study sample of both the individuals diagnosed with IE and the control group subjects, as well as to improve the image quality.

  20. A method for detecting damage to rolling bearings in toothed gears of processing lines

    OpenAIRE

    T. Figlus; M. Stańczyk

    2016-01-01

    This paper presents a method of diagnosing damage to rolling bearings in toothed gears of processing lines. The research has shown the usefulness of vibration signal measurements performed with a laser vibrometer, and of denoising the signals by means of a discrete wavelet transform, in detecting damage to bearings. The application of the method of analysis of the characteristic frequencies of changes in the vibration signal amplitude made it possible to draw conclusions about the typ...
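
    The discrete-wavelet-transform denoising step can be sketched with the PyWavelets library; the wavelet, decomposition level and universal threshold below are generic assumptions, not necessarily the settings used in the paper:

        import numpy as np
        import pywt  # PyWavelets

        rng = np.random.default_rng(0)
        t = np.linspace(0, 1, 4096)
        clean = np.sin(2 * np.pi * 157 * t)                 # assumed bearing tone
        signal = clean + 0.5 * rng.standard_normal(t.size)  # noisy vibrometer signal

        coeffs = pywt.wavedec(signal, "db4", level=4)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # noise estimate (MAD)
        thr = sigma * np.sqrt(2 * np.log(signal.size))      # universal threshold
        coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
        denoised = pywt.waverec(coeffs, "db4")

        print("residual RMS:", np.sqrt(np.mean((denoised - clean) ** 2)))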

  1. The identification of high ability students: Results of a detection process in Navarra (Spain)

    OpenAIRE

    Tourón, J. (Javier); Repáraz, C. (Charo); Peralta, F

    1999-01-01

    This article focuses on the development and analysis of a two-stage early detection process (screening and diagnostic) for high ability students carried out in the region of Navarra (Spain) on a random sample of 1,274 elementary school students. Spanish versions of the Raven Progressive Matrices (SPM), the Renzulli Scales for Rating Behavioural Characteristics of Superior Students (SRBCSS), and participants' academic achievement were the main variables in the initial screening phase. Th...

  2. Method for detecting software anomalies based on recurrence plot analysis

    Directory of Open Access Journals (Sweden)

    Michał Mosdorf

    2012-03-01

    Full Text Available This paper evaluates a method for detecting software anomalies based on recurrence plot analysis of the trace log generated by software execution. The described method is based on windowed recurrence quantification analysis for selected measures (e.g. recurrence rate, RR, or determinism, DET). Initial results show that the proposed method is useful in detecting silent software anomalies that do not result in typical crashes (e.g. exceptions).
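
    The windowed recurrence-rate (RR) computation underlying such an analysis can be sketched as follows; the toy signal, window size and recurrence threshold are assumptions, and the DET measure and phase-space embedding are omitted for brevity:

        import numpy as np

        def recurrence_rate(x, eps):
            d = np.abs(x[:, None] - x[None, :])      # pairwise distance matrix
            return np.mean(d < eps)                  # fraction of recurrent pairs

        def windowed_rr(x, win=200, step=100, eps=0.1):
            return [(i, recurrence_rate(x[i:i + win], eps))
                    for i in range(0, len(x) - win + 1, step)]

        x = np.sin(np.linspace(0, 60, 2000))         # nominal execution trace
        x[1200:1400] += np.random.default_rng(1).normal(0, 1, 200)  # injected anomaly
        for i, rr in windowed_rr(x):
            print(f"window at {i}: RR = {rr:.2f}")   # RR drops in the anomalous region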

  3. Detection and localization of leak of pipelines of RBMK reactor. Methods of processing of acoustic noise

    International Nuclear Information System (INIS)

    For leak detection on the input and output pipelines of the RBMK reactor, a method based on the detection and monitoring of acoustic leak signals was designed. This report presents a review of methods for the processing and analysis of acoustic noise. These methods were included in the software of the leak detection system and are used to solve the following problems: leak detection by the sound pressure level method in conditions of powerful background noise and strong signal attenuation; detection of a small leak at an early stage by a high-sensitivity correlation method; determination of the position of a sound source in conditions of strong signal reflection by a correlation method and the sound pressure method; and evaluation of leak size by analysis of the sound level and the position of the sound source. The performance of the considered techniques is illustrated with test results from a fragment of the leak detection system. This test was executed at the Leningrad NPP, operated at power levels of 460, 700, 890 and 1000 MWe. 16 figs
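
    The correlation method for locating a sound source rests on estimating the arrival-time difference between two sensors from the lag of the cross-correlation peak. The sketch below uses simulated broadband leak noise; the sample rate, sound speed and sensor spacing are assumed values:

        import numpy as np

        fs, c, L = 100_000, 1200.0, 10.0    # sample rate (Hz), sound speed (m/s), spacing (m)
        rng = np.random.default_rng(2)
        leak = rng.standard_normal(4096)    # broadband leak noise
        pos = 3.0                           # true source position from sensor 1 (m)
        d1, d2 = int(pos / c * fs), int((L - pos) / c * fs)
        s1 = np.concatenate([np.zeros(d1), leak])[:4096]
        s2 = np.concatenate([np.zeros(d2), leak])[:4096]

        xcorr = np.correlate(s1 - s1.mean(), s2 - s2.mean(), mode="full")
        lag = np.argmax(xcorr) - (len(s2) - 1)       # samples; s1 delayed by d1 - d2
        dt = lag / fs
        estimated = (L + c * dt) / 2.0               # position from sensor 1
        print(f"estimated source position: {estimated:.2f} m (true {pos} m)")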

  4. Obfuscated Malicious Code Detection with Path Condition Analysis

    OpenAIRE

    Wenqing Fan; Xue Lei; Jing An

    2014-01-01

    Code obfuscation is one of the main methods used to hide malicious code. This paper proposes a new dynamic method which can effectively detect obfuscated malicious code. This method uses ISR to conduct dynamic debugging. Constraint solving during the debugging process can detect deeply hidden malicious code by covering different execution paths. Besides, for malicious code that reads external resources, abnormal behaviors can usually only be detected by taking the resources into c...

  5. Air Conditioning Compressor Air Leak Detection by Image Processing Techniques for Industrial Applications

    Directory of Open Access Journals (Sweden)

    Pookongchai Kritsada

    2015-01-01

    Full Text Available This paper presents a method to detect air leakage of an air conditioning compressor using image processing techniques. A good-quality air conditioning compressor should have no air leakage. To test an air conditioning compressor for leaks, air is pumped into the compressor, which is then submerged into a water tank. If air bubbles appear at the surface of the compressor, the leaking compressor must be returned for maintenance. In this work, a new method to detect leakage and locate the leakage point accurately, quickly, and precisely was proposed. In a preprocessing procedure to detect the air bubbles, threshold and median filter techniques are used. A connected component labeling technique is used to detect the air bubbles, while blob analysis is the searching technique used to analyze groups of air bubbles in sequential images. The experiments test the proposed algorithm's ability to determine the leakage point of an air conditioning compressor. The location of the leakage point is presented as a coordinate point. The results demonstrate that the leakage point can be accurately detected during the process, with an estimated point error of less than 5% compared to the real leakage point.
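
    The preprocessing and labeling pipeline described above can be sketched with OpenCV; the synthetic frame stands in for a camera image of the submerged compressor, and the threshold, kernel and area values are assumptions:

        import cv2
        import numpy as np

        frame = np.zeros((240, 320), np.uint8)
        cv2.circle(frame, (100, 80), 6, 255, -1)     # synthetic air bubbles
        cv2.circle(frame, (104, 60), 4, 255, -1)
        noise = np.random.default_rng(3).integers(0, 40, frame.shape, dtype=np.uint8)
        noisy = cv2.add(frame, noise)

        blurred = cv2.medianBlur(noisy, 5)           # suppress impulsive noise
        _, binary = cv2.threshold(blurred, 128, 255, cv2.THRESH_BINARY)
        n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)

        for i in range(1, n):                        # label 0 is the background
            if stats[i, cv2.CC_STAT_AREA] > 10:      # ignore tiny specks
                cx, cy = centroids[i]
                print(f"bubble at ({cx:.0f}, {cy:.0f}), area {stats[i, cv2.CC_STAT_AREA]}")
        # tracking these centroids across sequential frames (blob analysis)
        # points to the leakage location where bubbles keep originating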

  6. CPAS Preflight Drop Test Analysis Process

    Science.gov (United States)

    Englert, Megan E.; Bledsoe, Kristin J.; Romero, Leah M.

    2015-01-01

    Throughout the Capsule Parachute Assembly System (CPAS) drop test program, the CPAS Analysis Team has developed a simulation and analysis process to support drop test planning and execution. This process includes multiple phases focused on developing test simulations and communicating results to all groups involved in the drop test. CPAS Engineering Development Unit (EDU) series drop test planning begins with the development of a basic operational concept for each test. Trajectory simulation tools include the Flight Analysis and Simulation Tool (FAST) for single bodies, and the Automatic Dynamic Analysis of Mechanical Systems (ADAMS) simulation for the mated vehicle. Results are communicated to the team at the Test Configuration Review (TCR) and Test Readiness Review (TRR), as well as at Analysis Integrated Product Team (IPT) meetings in earlier and intermediate phases of the pre-test planning. The ability to plan and communicate efficiently with rapidly changing objectives and tight schedule constraints is a necessity for safe and successful drop tests.

  7. BUSINESS PROCESS MANAGEMENT SYSTEMS TECHNOLOGY COMPONENTS ANALYSIS

    Directory of Open Access Journals (Sweden)

    Andrea Giovanni Spelta

    2007-05-01

    Full Text Available The information technology that supports the implementation of the business process management approach is called a Business Process Management System (BPMS). The main components of the BPMS solution framework are the process definition repository, process instances repository, transaction manager, connectors framework, process engine and middleware. In this paper we define and characterize the role and importance of the components of the BPMS framework. The research method adopted was the case study, through the analysis of the implementation of a BPMS solution in an insurance company called Chubb do Brasil. In the case study, the process "Manage Coinsured Events" is described and characterized, as well as the components of the BPMS solution adopted and implemented by Chubb do Brasil for managing this process.

  8. Planar Inlet Design and Analysis Process (PINDAP)

    Science.gov (United States)

    Slater, John W.; Gruber, Christopher R.

    2005-01-01

    The Planar Inlet Design and Analysis Process (PINDAP) is a collection of software tools that allow the efficient aerodynamic design and analysis of planar (two-dimensional and axisymmetric) inlets. The aerodynamic analysis is performed using the Wind-US computational fluid dynamics (CFD) program. A major element in PINDAP is a Fortran 90 code named PINDAP that can establish the parametric design of the inlet and efficiently model the geometry and generate the grid for CFD analysis with design changes to those parameters. The use of PINDAP is demonstrated for subsonic, supersonic, and hypersonic inlets.

  9. Probabilistic analysis of a thermosetting pultrusion process

    DEFF Research Database (Denmark)

    Baran, Ismet; Tutum, Cem C.; Hattel, Jesper Henri

    2016-01-01

    In the present study, the effects of uncertainties in the material properties of the processing composite material and the resin kinetic parameters, as well as process parameters such as pulling speed and inlet temperature, on product quality (exit degree of cure) are investigated for a pultrusion process. A new application for the probabilistic analysis of the pultrusion process is introduced using the response surface method (RSM). The results obtained from the RSM are validated by employing the Monte Carlo simulation (MCS) with Latin hypercube sampling technique. According to the results...
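
    Monte Carlo simulation with Latin hypercube sampling over uncertain process parameters can be sketched as follows; the parameter ranges and the quadratic "response surface" for the exit degree of cure are made-up stand-ins, not the model fitted in the paper:

        import numpy as np
        from scipy.stats import qmc

        sampler = qmc.LatinHypercube(d=2, seed=0)
        unit = sampler.random(n=1000)
        # pulling speed (cm/min) and inlet temperature (deg C), assumed ranges
        samples = qmc.scale(unit, l_bounds=[20.0, 120.0], u_bounds=[60.0, 180.0])

        def exit_cure(speed, temp):                  # illustrative response surface
            return (0.95 - 2e-3 * (speed - 30) + 1e-3 * (temp - 150)
                    - 5e-5 * (speed - 30) ** 2)

        cure = exit_cure(samples[:, 0], samples[:, 1])
        print(f"exit degree of cure: mean {cure.mean():.3f}, std {cure.std():.4f}")
        print(f"P(cure < 0.90) = {(cure < 0.90).mean():.3f}")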

  10. LRS data processing methods for detection of lunar subsurface echoes

    Science.gov (United States)

    Oshigami, Shoko; Mochizuki, Kengo; Watanabe, Shiho; Watanabe, Toshiki; Yamaguchi, Yasushi; Yamaji, Atsushi; Ono, Takayuki; Kumamoto, Atsushi; Nakagawa, Hiromu; Kobayashi, Takao; Kasahara, Yoshiya

    Lunar Radar Sounder (LRS) is an instrument for one of fifteen science missions of SELENE (KAGUYA). LRS is a ground-penetrating FM-CW radar system of HF-band. LRS detects echoes reflected from subsurface discontinuities where the dielectric constants of the rocks change. The range resolution of LRS is 75 m in free space, whereas the sampling interval in the flight direction is about 75 m when the spacecraft altitude is 100 km. The primary objective of LRS is to investigate lunar subsurface structures. We plan to perform global soundings by LRS to contribute to studying the evolution of the Moon. In this presentation, we introduce the techniques used to process LRS data to produce data products and to detect subsurface echoes. We have two standard data products of LRS under consideration. The time series of 'A-scope' data, where an A-scope is a plot of signal power spectrum as a function of range derived from the waveform data, is called a 'B-scan'. Because the LRS instrument changes the timing of data recording (measurement delay time) according to the predicted distance between the KAGUYA spacecraft and the lunar surface, the observation range with respect to the spacecraft varies from pulse to pulse. In addition, the flight altitude of KAGUYA changes in the range of several tens of kilometers. Therefore, a trace of surface nadir echoes in unprocessed B-scan images does not correspond to the actual lunar topography. We corrected the variations of the measurement delay time and flight altitude of KAGUYA to produce a B-scan data product with the original spatial resolution (BScan high) and a reduced spatial resolution product (BScan low), both in the PDS format. The echo signals in A-scope data might be classified into the following categories: (1) a surface nadir echo, (2) surface off-nadir backscattering echoes, and (3) subsurface echoes. The most intense signal usually comes from the nadir point when KAGUYA is flying over a level surface. The A-scope data also include various noises resulting from, for example

  11. Application of flaw detection methods for detection of fatigue processes in low-alloyed steel

    Directory of Open Access Journals (Sweden)

    Zbigniew H. ŻUREK

    2007-01-01

    Full Text Available The paper presents the investigations conducted at the Fraunhofer Institute (IZFP Saarbrücken) by use of a BEMI microscope (BEMI = Barkhausenrausch- und Wirbelstrom-Mikroskopie, or Barkhausen Noise and Eddy Current Microscopy). The ability to detect the influence of cyclic and contact fatigue loads has been investigated. The measurement amplitudes obtained with Barkhausen noise and eddy current probes have been analysed. Correlation between the measurement results and the material's condition has been observed for the eddy current mode at frequencies above 2 MHz (for contact-loaded material samples). Detection of the material's fatigue process (at 80% fatigue life) in the sample subjected to series of high-cyclic loads has been proven to be practically impossible. Application of flaw detection methods in material fatigue tests requires modification of the test methods and the use of investigation methods relevant to the physical parameters of the investigated material. The magnetic leakage field method, which has been abandoned by many researchers, may be of significant use in material fatigue assessment and may provide new research prospects.

  12. Does facial processing prioritize change detection?: change blindness illustrates costs and benefits of holistic processing.

    Science.gov (United States)

    Wilford, Miko M; Wells, Gary L

    2010-11-01

    There is broad consensus among researchers both that faces are processed more holistically than other objects and that this type of processing is beneficial. We predicted that holistic processing of faces also involves a cost, namely, a diminished ability to localize change. This study (N = 150) utilized a modified change-blindness paradigm in which some trials involved a change in one feature of an image (nose, chin, mouth, hair, or eyes for faces; chimney, porch, window, roof, or door for houses), whereas other trials involved no change. People were better able to detect the occurrence of a change for faces than for houses, but were better able to localize which feature had changed for houses than for faces. Half the trials used inverted images, a manipulation that disrupts holistic processing. With inverted images, the critical interaction between image type (faces vs. houses) and task (change detection vs. change localization) disappeared. The results suggest that holistic processing reduces change-localization abilities. PMID:20935169

  13. During air cool process aerosol absorption detection with photothermal interferometry

    Science.gov (United States)

    Li, Baosheng; Xu, Limei; Huang, Junling; Ma, Fei; Wang, Yicheng; Li, Zhengqiang

    2014-11-01

    This paper studies the basic principle of the laser photothermal interferometry method for measuring the absorption coefficient of aerosol particles. The photothermal interferometry method, with higher accuracy and lower uncertainty, can directly measure the absorption coefficient of atmospheric aerosols and is not affected by scattered light. Using the Jones matrix formalism, the mathematical expression of a special polarization interferometer is described. This paper uses a folded Jamin interferometer, which overcomes the influence of vibration on the measuring system. Interference is formed from two orthogonally polarized beams that are recombined into a single beam; the refractive index changes induced by aerosol absorption are then obtained from four beams of phase-quadrature light. This arrangement substantially improves the stability and the resolution of the system. Four-channel detection of the interference fringes reduces the effect of light intensity 'zero drift' on the system. In the laboratory, this device measured typical aerosol absorption, and the results agree with the actual values. The cooling of air after laser heating also reveals the aerosol absorption process. This kind of instrument will be used to monitor ambient aerosol absorption and the chemical components of suspended particulate matter. Keywords: Aerosol absorption coefficient; Photothermal interferometry; Suspended particulate matter.

  14. Knee joint vibroarthrographic signal processing and analysis

    CERN Document Server

    Wu, Yunfeng

    2015-01-01

    This book presents the cutting-edge technologies of knee joint vibroarthrographic signal analysis for the screening and detection of knee joint injuries. It describes a number of effective computer-aided methods for analysis of the nonlinear and nonstationary biomedical signals generated by complex physiological mechanics. This book also introduces several popular machine learning and pattern recognition algorithms for biomedical signal classifications. The book is well-suited for all researchers looking to better understand knee joint biomechanics and the advanced technology for vibration arthrometry. Dr. Yunfeng Wu is an Associate Professor at the School of Information Science and Technology, Xiamen University, Xiamen, Fujian, China.

  15. Non-Harmonic Fourier Analysis for bladed wheels damage detection

    Science.gov (United States)

    Neri, P.; Peeters, B.

    2015-11-01

    The interaction between bladed wheels and the fluid distributed by the stator vanes results in cyclic loading of the rotating components. Compressor and turbine wheels are subject to vibration and fatigue issues, especially when resonance conditions are excited. Even if resonance conditions can often be predicted and avoided, high cycle fatigue failures can occur, causing safety issues and economic loss. Rigorous maintenance programs are then needed, forcing expensive system shut-downs. Blade crack detection methods are beneficial for condition-based maintenance. While contact measurement systems are not always usable in operating conditions (e.g. high temperature), non-contact methods can be more suitable. One (or more) stator-fixed sensors can measure all the blades as they pass by, in order to detect the damaged ones. The main drawback in this situation is the short acquisition time available for each blade, which is shortened further by the high rotational speed of the components. A traditional Discrete Fourier Transform (DFT) analysis would result in a poor frequency resolution. A Non-Harmonic Fourier Analysis (NHFA) can instead be executed with an arbitrary frequency resolution, allowing frequency information to be obtained even from short-time data samples. This paper shows an analytical investigation of the NHFA method. A data processing algorithm is then proposed to obtain frequency shift information from short time samples. The performance of this algorithm is then studied by experimental and numerical tests.
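
    The essence of NHFA, evaluating Fourier coefficients on an arbitrarily fine frequency grid instead of at the coarse DFT bin frequencies, can be sketched in a few lines; the signal, sample count and frequency grid are assumed values:

        import numpy as np

        fs, n = 50_000, 256                     # short acquisition: 256 samples
        t = np.arange(n) / fs
        x = np.sin(2 * np.pi * 1234.0 * t)      # blade response at 1234 Hz (assumed)

        dft_bin = fs / n                        # ~195 Hz per DFT bin
        freqs = np.arange(1100.0, 1400.0, 1.0)  # 1 Hz grid, independent of n
        # non-harmonic coefficients: X(f) = sum_k x[k] exp(-2j*pi*f*k/fs)
        coeffs = np.exp(-2j * np.pi * np.outer(freqs, t)) @ x
        peak = freqs[np.argmax(np.abs(coeffs))]
        print(f"DFT bin width {dft_bin:.0f} Hz; NHFA peak at {peak:.0f} Hz")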

  16. Analysis of Spatial Data Structures for Proximity Detection

    Institute of Scientific and Technical Information of China (English)

    Anupreet Walia; Jochen Teizer

    2008-01-01

    Construction is a dangerous business. According to statistics, in each of the past thirteen years more than 1000 workers died in the USA construction industry. In order to minimize the overall number of these incidents, the research presented in this paper monitors and analyzes the trajectories of construction resources, first in a simulated environment and later on the actual job site. Due to the complex nature of the construction environment, three-dimensional (3D) positioning data of workers is hardly ever collected. Although technology is available that allows tracking construction assets in real-time, indoors and outdoors, in 3D, the continuously changing spatial and temporal arrangement of job sites requires any successfully working data processing system to operate in real-time. This research focuses on safety, specifically on spatial data structures that offer the capability of realigning themselves and reporting the distance of the closest neighbor in real-time. This paper presents results of simulations that allow the processing of real-time location data for collision detection and proximity analysis. The presented data structures and the performance results of the developed algorithms demonstrate that real-time tracking and proximity detection of resources is feasible.
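
    A self-realigning spatial index of the kind discussed above can be approximated by rebuilding a k-d tree at every tracking update; this sketch uses SciPy's cKDTree on simulated 3D positions, with an assumed safety radius:

        import numpy as np
        from scipy.spatial import cKDTree

        rng = np.random.default_rng(4)
        positions = rng.uniform(0, 50, size=(200, 3))  # resources on site (m)
        SAFETY_RADIUS = 2.0                            # assumed alert threshold (m)

        tree = cKDTree(positions)                      # rebuilt per update cycle
        for i, j in sorted(tree.query_pairs(r=SAFETY_RADIUS)):
            d = np.linalg.norm(positions[i] - positions[j])
            print(f"proximity alert: resources {i} and {j} at {d:.2f} m")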

  17. Double JPEG Compression Detection Using Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Ehsan Nowroozi

    Full Text Available Nowadays, with the advancement of technology, tampering of digital images using computers and advanced software packages like Photoshop has become a simple task. Many algorithms have been proposed to detect tampered images, and these keep being developed. In thi ...

  18. Detection and Analysis of Solar Eclipse

    CERN Document Server

    Sridhar, Sarrvesh Seethapuram; Jackson, I Kenny; Kannan, P

    2012-01-01

    We propose an algorithm that can be used by amateur astronomers to analyze the images acquired during solar eclipses. The proposed algorithm analyzes the image, detects the eclipse and produces results for parameters like magnitude of eclipse, eclipse obscuration and the approximate distance between the Earth and the Moon.

  19. Detection limits for Gamma-Ray spectral analysis

    International Nuclear Information System (INIS)

    This paper presents a study concerning the formulation of sensitivity limits in radioactivity analysis using a gamma-ray spectrometer. Expressions for the practical evaluation of the detection level (SD) and the detection limit (LD) are given. The detection level is defined as a detection criterion (a posteriori), and the detection limit is defined either a posteriori, for undetected radionuclides, or a priori, for the weakest activity which can be detected under the analysis conditions. The given expressions use only the total background at the position of the gamma-ray photopeak and the Full Width at Half Maximum (FWHM) of this peak. The main parameters occurring in these limits are studied and some recommendations are given. The use of these limits in some expressions of radioactivity analysis results (single measurement, sum, mean) is also indicated
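
    The abstract does not reproduce the expressions themselves; for orientation, a widely used formulation of these two quantities in terms of the total photopeak background B, following Currie (1968) with 5% error rates, is shown below. This standard form is given as context and may differ in detail from the paper's own expressions:

        \[
        S_D \approx 2.33\,\sqrt{B}, \qquad L_D \approx 2.71 + 4.65\,\sqrt{B}
        \]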

  20. Nuclear physics, and nuclear detection and analysis technology

    International Nuclear Information System (INIS)

    We present a brief review of the principle, methods and main techniques of radiation and particle detection, which have developed along with the advances in nuclear physics. Some important and typical applications of nuclear physics, nuclear detection and nuclear analysis technology are reviewed, such as particle activation analysis, Mössbauer spectroscopy, nuclear magnetic resonance, accelerator mass spectroscopy, nuclear medicine imaging, synchrotron radiation technology, neutron scattering analysis, radioactive tracing, and so on. (authors)

  1. A Systematic Analysis of Coal Accumulation Process

    Institute of Scientific and Technical Information of China (English)

    CHENG Aiguo

    2008-01-01

    Formation of coal seams and coal-rich zones is an integrated result of a series of factors in the coal accumulation process. The coal accumulation system is an architectural aggregation of coal accumulation factors. It can be classified into 4 levels: the global coal accumulation super-system, the coal accumulation domain mega-system, the coal accumulation basin system, and the coal seam or coal seam set sub-system. The coal accumulation process is an open, dynamic, and grey system, and is at the same time a system with such properties as aggregation, relevance, wholeness, purposefulness, hierarchy, and environmental adaptability. In this paper, we take the coal accumulation process as a system to study the origin of coal seams and coal-rich zones, and we discuss a methodology for the systematic analysis of the coal accumulation process. As an example, the Ordos coal basin was investigated to elucidate the application of the method of coal accumulation system analysis.

  2. Exergetic analysis of human & natural processes

    CERN Document Server

    Banhatti, Dilip G

    2011-01-01

    Using the concept of available work or exergy, each human and natural process can be characterized by its contextual efficiency, that is, efficiency with the environment as a reference. Such an efficiency is termed exergy efficiency. Parts of the process which need to be made more efficient & less wasteful stand out in such an analysis, in contrast to an energy analysis. Any new idea for a process can be similarly characterized. This exercise naturally generates paths to newer ideas in given contexts to maximize exergy efficiency. The contextual efficiency is not just output/input, it also naturally includes environmental impact (to be minimized) and any other relevant parameter(s) to be optimized. Natural life processes in different terrestrial environments are already optimized for their environments, and act as guides, for example, in seeking to evolve sustainable energy practices in different contexts. Energy use at lowest possible temperature for each situation is a natural result. Variety of renewab...

  3. Bearing defect detection and diagnosis using a time encoded signal processing and pattern recognition method

    International Nuclear Information System (INIS)

    Many new bearing monitoring and diagnosis methods have been explored in the last two decades to provide a technique that is capable of picking up an incipient bearing fault. Vibration analysis is a commonly used condition monitoring technique in world industry and has proved an effective method for rolling bearing monitoring systems. The focus of this paper is to combine two conventional methods, wavelet transform and envelope analysis, with Time Encoded Signal Processing and Recognition (TESPAR) to develop a better technique for the detection of small bearing faults. Results show that TESPAR with these two combinations provides good fault discrimination in terms of location and severity for different bearing conditions.
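
    The envelope-analysis ingredient can be sketched with the Hilbert transform: demodulate the vibration signal and look for the bearing defect rate in the envelope spectrum. The amplitude-modulated test signal and all frequencies below are assumptions:

        import numpy as np
        from scipy.signal import hilbert

        fs = 20_000
        t = np.arange(0, 1.0, 1 / fs)
        bpfo = 87.0                                    # assumed defect rate (Hz)
        # resonance at 3 kHz, amplitude-modulated at the defect rate
        signal = (1 + 0.5 * np.sin(2 * np.pi * bpfo * t)) * np.sin(2 * np.pi * 3000 * t)
        signal += 0.1 * np.random.default_rng(5).standard_normal(t.size)

        envelope = np.abs(hilbert(signal))             # demodulated envelope
        spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
        freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
        print(f"envelope spectrum peak at {freqs[np.argmax(spectrum)]:.1f} Hz"
              f" (defect rate ~{bpfo} Hz)")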

  4. KNOWLEDGE PROCESS ANALYSIS:FRAMEWORK AND EXPERIENCE

    Institute of Scientific and Technical Information of China (English)

    Kozo SUGIYAMA; Bertolt MEYER

    2008-01-01

    We present a step-by-step approach for constructing a framework for knowledge process analysis (KPA). We intend to apply this framework to the analysis of our own research projects in an exploratory way and to elaborate it through the accumulation of case studies. This study is based on a methodology consisting of knowledge process modeling, primitives synthesis, and reflective verification. We describe details of the methodology and present the results of case studies: a novel methodology, a practical work guide, and a tool for KPA; insights for improving future research projects and education; and the integration of existing knowledge creation theories.

  5. Parallel processing of structural integrity analysis codes

    International Nuclear Information System (INIS)

    Structural integrity analysis plays an important role in assessing and demonstrating the safety of nuclear reactor components. This analysis is performed using analytical tools such as the Finite Element Method (FEM) with the help of digital computers. The complexity of the problems involved in nuclear engineering demands high speed computation facilities to obtain solutions in a reasonable amount of time. Parallel processing systems such as ANUPAM provide an efficient platform for realising such high speed computation. The development and implementation of software on parallel processing systems is an interesting and challenging task. The data and algorithm structure of the codes plays an important role in exploiting the parallel processing system capabilities. Structural analysis codes based on FEM can be divided into two categories with respect to their implementation on parallel processing systems. Codes in the first category, such as those used for harmonic analysis and mechanistic fuel performance codes, do not require parallelisation of individual modules. Codes in the second category, such as conventional FEM codes, require parallelisation of individual modules; in this category, parallelisation of the equation solution module poses major difficulties. Different solution schemes such as the domain decomposition method (DDM), parallel active column solver and substructuring method are currently used on parallel processing systems. Two codes, FAIR and TABS, belonging to these two categories respectively, have been implemented on ANUPAM. The implementation details of these codes and the performance of different equation solvers are highlighted. (author). 5 refs., 12 figs., 1 tab

  6. Applied behavior analysis and statistical process control?

    OpenAIRE

    Hopkins, B. L.

    1995-01-01

    This paper examines Pfadt and Wheeler's (1995) suggestions that the methods of statistical process control (SPC) be incorporated into applied behavior analysis. The research strategies of SPC are examined and compared to those of applied behavior analysis. I argue that the statistical methods that are a part of SPC would likely reduce applied behavior analysts' intimate contacts with the problems with which they deal and would, therefore, likely yield poor treatment and research decisions. Ex...

  7. Analysis of Software-Engineering-Processes

    OpenAIRE

    Teichmann, Clemens; Schreiber, Andreas

    2013-01-01

    The German Aerospace Center (DLR) is one of the biggest software development facilities in Germany. Its employees create complex software using various development processes. To assure high software quality, innovative software engineering methods and tools need to be incorporated. A current problem in the field of computer science is to identify the effectiveness of those methods and tools to ensure quality. An analysis of the incorporated processes is needed to determine which parts sup...

  8. Performance Analysis of Cone Detection Algorithms

    CERN Document Server

    Mariotti, Letizia

    2015-01-01

    Many algorithms have been proposed to help clinicians evaluate cone density and spacing, as these may be related to the onset of retinal diseases. However, there has been no rigorous comparison of the performance of these algorithms. In addition, the performance of such algorithms is typically determined by comparison with human observers. Here we propose a technique to simulate realistic images of the cone mosaic. We use the simulated images to test the performance of two popular cone detection algorithms and we introduce an algorithm which is used by astronomers to detect stars in astronomical images. We use Free Response Operating Characteristic (FROC) curves to evaluate and compare the performance of the three algorithms. This allows us to optimize the performance of each algorithm. We observe that performance is significantly enhanced by up-sampling the images. We investigate the effect of noise and image quality on cone mosaic parameters estimated using the different algorithms, finding that the estimat...

  9. PERFORMANCE ANALYSIS OF HARDWARE TROJAN DETECTION METHODS

    OpenAIRE

    Ehsan, Sharifi; Kamal, Mohammadiasl; Mehrdad, Havasi; Amir, Yazdani

    2015-01-01

    Due to the increasing use of information and communication technologies in most aspects of life, the security of information has drawn the attention of governments and industry as well as researchers. In this regard, structural attacks on the functions of a chip are called hardware Trojans, and they are capable of rendering ineffective the security protecting our systems and data. This threat represents a big challenge for cyber-security, as it is nearly impossible to detect with any currently ...

  10. Trace analysis for 300 MM wafers and processes with TXRF

    International Nuclear Information System (INIS)

    Efficient fabrication of semiconductor devices goes together with an increasing size of silicon wafers. The contamination level of processes, media, and equipment has to decrease continuously. A new test laboratory for 300 mm wafers was installed in view of the above mentioned aspects. Apart from numerous processing tools, this platform consists of electrical test methods, particle detection, vapor phase decomposition (VPD) preparation, and TXRF. The equipment is installed in a cleanroom. It is common to perform process or equipment control, development, evaluation and qualification with monitor wafers. The evaluation and qualification of 300 mm equipment require direct TXRF on 300 mm wafers. A new TXRF setup was installed to handle the 300 mm wafer size. The 300 mm TXRF is equipped with tungsten and molybdenum anodes. This combination allows sensitive detection of elements with fluorescence energies below 10 keV using tungsten excitation, while molybdenum excitation enables the detection of a wide variety of elements. The detection sensitivity for samples excited with the tungsten anode is ten times higher than for samples measured with the molybdenum anode. The system is calibrated with 1 ng Ni. This calibration shows a stability within 5% when monitored to control system stability. Linearly decreasing the amount of Ni results in a linear decrease of the measured Ni signal. This result is verified for a range of elements by multielement samples. New designs demand new processes and materials, e.g. ferroelectric layers and copper. The trace analysis of many of these materials is supported by the higher excitation energy of the molybdenum anode. Reclaim and recycling of 300 mm wafers demand accurate contamination control of the processes to avoid cross contamination. Polishing or etching result in modified surfaces. TXRF as a non-destructive test method allows the simultaneous detection of a variety of elements on differing surfaces in view of contamination control and process

  11. Detecting Inhomogeneity in Daily Climate Series Using Wavelet Analysis

    Institute of Scientific and Technical Information of China (English)

    YAN Zhongwei; Phil D.JONES

    2008-01-01

    A wavelet method was applied to detect inhomogeneities in daily meteorological series,data which are being increasingly applied in studies of climate extremes.The wavelet method has been applied to a few well-established long-term daily temperature series back to the 18th century,which have been "homogenized" with conventional approaches.Various types of problems remaining in the series were revealed with the wavelet method.Their influences on analyses of change in climate extremes are discussed.The results have importance for understanding issues in conventional climate data processing and for development of improved methods of homogenization in order to improve analysis of climate extremes based on daily data.

  12. Applications of random process excursion analysis

    CERN Document Server

    Brainina, Irina S

    2013-01-01

    This book addresses one of the key problems in signal processing, the problem of identifying statistical properties of excursions in a random process in order to simplify the theoretical analysis and make it suitable for engineering applications. Precise and approximate formulas are explained, which are relatively simple and can be used for engineering applications such as the design of devices which can overcome the high initial uncertainty of the self-training period. The information presented in the monograph can be used to implement adaptive signal processing devices capable of d

  13. Radionuclides for process analysis in industry

    International Nuclear Information System (INIS)

    The process analysis in industrial plants includes the overall determination of process and operating parameters, such as throughput, material composition, residence time, mixing behaviour, flow rates etc. in the individual steps of a process. General instructions for the performance of industrial tracer experiments are presented, and radionuclide investigations in briquetting and in the large-scale production of caprolactam are discussed. In both cases the results of the tracer experiments indicated the way towards greater operational efficiency and high economic benefits. They had a positive effect on both the output and the quality of the final products. (author)

  14. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Baddeley, A.; Turner, R.; Møller, Jesper;

    These residuals generalise the well-known residuals for point processes in time, used in signal processing and survival analysis. An important difference is that the conditional intensity or hazard rate of the temporal point process must be replaced by the Papangelou conditional intensity $\lambda$ of the spatial point process. The conditional intensity $\lambda$ plays the role of the mean response. Elaborating this analogy gives us recommendations for diagnostic plots and model criticism using these residuals. A plot of smoothed residuals against spatial location, or against a spatial covariate, is effective in diagnosing spatial trend...

  15. Real-time computational processing and implementation for concealed object detection

    Science.gov (United States)

    Lee, Dong-Su; Yeom, Seokwon; Chang, YuShin; Lee, Mun-Kyo; Jung, Sang-Won

    2012-07-01

    Millimeter wave (MMW) readily penetrates fabrics, thus it can be used to detect objects concealed under clothing. A passive MMW imaging system can operate as a stand-off type sensor that scans people both indoors and outdoors. However, because of the diffraction limit and low signal level, the imaging system often suffers from low image quality. Therefore, suitable computational processing would be required for automatic analysis of the images. The authors present statistical and computational algorithms and their implementations for real-time concealed object detection. The histogram of the image is modeled as a Gaussian mixture distribution, and hidden object areas are segmented by a multilevel scheme involving the expectation-maximization algorithm. The complete algorithm has been implemented in both MATLAB and C++. Experimental and simulation results confirm that the implemented system can achieve real-time detection of concealed objects.
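
    The histogram-segmentation step can be sketched with scikit-learn, whose GaussianMixture is fitted by expectation-maximization; the synthetic "MMW image" and the two-component setting are assumptions (the paper describes a multilevel scheme):

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(6)
        image = rng.normal(120, 12, (64, 64))              # warm body region
        image[20:35, 25:40] = rng.normal(70, 8, (15, 15))  # colder concealed object
        pixels = image.reshape(-1, 1)

        gmm = GaussianMixture(n_components=2, random_state=0).fit(pixels)
        labels = gmm.predict(pixels).reshape(image.shape)
        object_label = int(np.argmin(gmm.means_))          # coldest component
        mask = labels == object_label
        print(f"object pixels found: {mask.sum()} (true: {15 * 15})")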

  16. On Multiview Analysis for Fingerprint Liveness Detection

    OpenAIRE

    Bottino, Andrea Giuseppe; Cumani, Sandro; Toosi, Amirhosein

    2015-01-01

    Fingerprint recognition systems, as any other biometric system, can be subject to attacks, which are usually carried out using artificial fingerprints. Several approaches to discriminate between live and fake fingerprint images have been presented to address this issue. These methods usually rely on the analysis of individual features extracted from the fingerprint images. Such features represent different and complementary views of the object in analysis, and their fusion is likely to improv...

  17. Experimental exploration to thermal infrared imaging for detecting the transient process of solid impact

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Based on the analysis and comparison of stress pattern analysis by thermal emission (SPATE) and remote sensing rock mechanics (RSRM), the idea of detecting the transient process of solid impact with thermal infrared (TIR) imaging technology is introduced. By means of the TVS-8100MKII TIR imaging system, which has a high recording speed, high spatial resolution and high temperature sensitivity, TIR imaging experiments on a free-falling steel ball impacting marble, granite, concrete, steel, organic-glass and wood plates are conducted. It was discovered that: (i) the target's TIR temperature increases remarkably after impact; (ii) when the ball's size is not changed, the variation amplitude of the target's TIR temperature increases proportionally with the ball's potential energy or falling height; (iii) the variation amplitude of the target's TIR temperature depends on the material type and the surface smoothness of the target, the amplitudes being ordered as concrete, unpolished marble, steel plate, wood plate, polished granite, polished marble and organic-glass plate; and (iv) the TIR radiation of brittle targets decreases gradually after impact, while there is delayed TIR radiation strengthening for plastic targets. It is deduced that once the relational functions and technical parameters, which depend on the particular impact body and target material, are established through experimental study, remote detection and back analysis based on TIR imaging of the transient process of solid impact will be feasible. Besides, this also has important scientific meaning for the study of precursor mechanics and for satellite TIR detection and prediction of structural earthquakes.

  18. Artificial intelligence applied to process signal analysis

    Energy Technology Data Exchange (ETDEWEB)

    Corsberg, D.

    1986-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge-based approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and the proper system response monitored. 8 refs.

  19. Bisous model - detecting filamentary patterns in point processes

    CERN Document Server

    Tempel, E; Kipper, R; Saar, E

    2016-01-01

    The cosmic web is a highly complex geometrical pattern, with galaxy clusters at the intersection of filaments and filaments at the intersection of walls. Identifying and describing the filamentary network is not a trivial task due to the overwhelming complexity of the structure, its connectivity and its intrinsic hierarchical nature. To detect and quantify galactic filaments we use the Bisous model, which is a marked point process built to model multi-dimensional patterns. The Bisous filament finder works directly with the galaxy distribution data, and the model intrinsically takes into account the connectivity of the filamentary network. The Bisous model generates the visit map (the probability to find a filament at a given point) together with the filament orientation field. Using these two fields, we can extract filament spines from the data. Together with this paper we publish the computer code for the Bisous model, which is made available on GitHub. The Bisous filament finder has been successfully used in s...

  20. Probing Interfacial Processes on Graphene Surface by Mass Detection

    Science.gov (United States)

    Kakenov, Nurbek; Kocabas, Coskun

    2013-03-01

    In this work we studied the mass density of graphene, probed interfacial processes on the graphene surface, and examined the formation of graphene oxide by mass detection. The graphene layers were synthesized by the chemical vapor deposition method on copper foils and transfer-printed onto a quartz crystal microbalance (QCM). The mass density of single layer graphene was measured by investigating the mechanical resonance of the QCM. Moreover, we extended the developed technique to probe the binding dynamics of proteins on the surface of graphene, and were able to obtain the nonspecific binding constant of BSA protein on the graphene surface in aqueous solution. The time trace of the resonance signal showed that the BSA molecules rapidly saturated the available binding sites on the graphene surface. Furthermore, we monitored oxidation of the graphene surface under oxygen plasma by tracing the changes of the interfacial mass of the graphene, correlated with the shifts in the Raman spectra. Three regimes were observed in the formation of graphene oxide: the formation of graphene oxide, which increases the interfacial mass; the release of carbon dioxide; and the removal of small graphene/graphene oxide flakes. Scientific and Technological Research Council of Turkey (TUBITAK) grant no. 110T304, 109T209, Marie Curie International Reintegration Grant (IRG) grant no 256458, Turkish Academy of Science (TUBA-Gebip).
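
    For background, the standard relation between a QCM resonance frequency shift and added areal mass is the Sauerbrey equation, quoted here as the usual starting point for such measurements (the abstract does not state the exact analysis used); f_0 is the fundamental resonance frequency, A the electrode area, and rho_q, mu_q the density and shear modulus of quartz:

        \[
        \Delta f = -\frac{2 f_0^{2}}{A\sqrt{\rho_q\,\mu_q}}\;\Delta m
        \]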

  1. Application of wavelet analysis to crustal deformation data processing

    Institute of Scientific and Technical Information of China (English)

    张燕; 吴云; 刘永启; 施顺英

    2004-01-01

    The time-frequency analysis and anomaly detection capabilities of the wavelet transform make the method irresistibly advantageous in non-stationary signal processing. In the paper, these two characteristics are analyzed and demonstrated with a synthetic signal. By applying the wavelet transform to deformation data processing, we find that about 4 months before strong earthquakes, several deformation stations near the epicenter simultaneously received an abnormal signal with the same frequency, with periods from several days to more than ten days. The GPS observation stations near the epicenter all received an abnormal signal with a period from 3 months to half a year. These abnormal signals are possibly earthquake precursors.

  2. Analysis and evaluation of collaborative modeling processes

    NARCIS (Netherlands)

    Ssebuggwawo, D.

    2012-01-01

    Analysis and evaluation of collaborative modeling processes is confronted with many challenges. On the one hand, many systems design and re-engineering projects require collaborative modeling approaches that can enhance their productivity. But, such collaborative efforts, which often consist of the

  3. Applied Behavior Analysis and Statistical Process Control?

    Science.gov (United States)

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.

  4. Detection and analysis of explosives by nuclear techniques

    International Nuclear Information System (INIS)

    In today's global environment of international terrorism, there is a well recognised need for sensitive and specific techniques for the detection and analysis of explosives. Sensitivity is needed for the detection of small amounts of explosives, hidden or mixed with post-explosion residues. Specificity is needed in order to avoid responses to non-explosive substances. The conventional techniques for detection of explosives based on vapour detection, either by sniffer dogs or by instrumental vapour detectors, have limitations because of their inability to detect plastic bonded explosives, which have vapour pressures significantly lower than pure explosives. The paper reviews the various nuclear techniques, such as thermal, fast and pulsed fast neutron activation analysis, which are used for the detection of pure explosives, mixtures of explosives, and also aluminized and plastic bonded explosives. (author)

  5. Fault Detection and Diagnosis in Process Data Using Support Vector Machines

    Directory of Open Access Journals (Sweden)

    Fang Wu

    2014-01-01

    Full Text Available For complex industrial processes, it has become increasingly challenging to effectively diagnose complicated faults. In this paper, a combined approach using the original Support Vector Machine (SVM) and Principal Component Analysis (PCA) is provided to carry out fault classification, and its results are compared with those based on the SVM-RFE (Recursive Feature Elimination) method. RFE is used for feature extraction, and PCA is utilized to project the original data onto a lower dimensional space. PCA T2 and SPE statistics, together with the original SVM, are proposed to detect the faults. Some common faults of the Tennessee Eastman Process (TEP) are analyzed in terms of the practical system and reflections of the dataset. PCA-SVM and SVM-RFE can effectively detect and diagnose these common faults. In the RFE algorithm, all variables are ordered decreasingly according to their contributions. The classification accuracy rate is improved by choosing a reasonable number of features.
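
    A PCA-plus-SVM fault classifier of the kind described can be sketched with scikit-learn; the synthetic data below merely stand in for Tennessee Eastman measurements (52 variables), and the fault signatures, component count and model settings are all assumptions:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(7)
        X_normal = rng.normal(0, 1, (300, 52))
        X_fault1 = rng.normal(0.8, 1, (300, 52))               # mean-shift fault
        X_fault2 = np.hstack([rng.normal(0, 2.5, (300, 10)),   # variance fault
                              rng.normal(0, 1, (300, 42))])
        X = np.vstack([X_normal, X_fault1, X_fault2])
        y = np.repeat([0, 1, 2], 300)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        model = make_pipeline(StandardScaler(),
                              PCA(n_components=15),            # lower-dim projection
                              SVC(kernel="rbf"))
        model.fit(X_tr, y_tr)
        print(f"fault classification accuracy: {model.score(X_te, y_te):.2f}")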

  6. Performance analysis of cone detection algorithms.

    Science.gov (United States)

    Mariotti, Letizia; Devaney, Nicholas

    2015-04-01

    Many algorithms have been proposed to help clinicians evaluate cone density and spacing, as these may be related to the onset of retinal diseases. However, there has been no rigorous comparison of the performance of these algorithms. In addition, the performance of such algorithms is typically determined by comparison with human observers. Here we propose a technique to simulate realistic images of the cone mosaic. We use the simulated images to test the performance of three popular cone detection algorithms, and we introduce an algorithm which is used by astronomers to detect stars in astronomical images. We use Free Response Operating Characteristic (FROC) curves to evaluate and compare the performance of the four algorithms. This allows us to optimize the performance of each algorithm. We observe that performance is significantly enhanced by up-sampling the images. We investigate the effect of noise and image quality on cone mosaic parameters estimated using the different algorithms, finding that the estimated regularity is the most sensitive parameter. PMID:26366758

  7. Intelligence Intrusion Detection Prevention Systems using Object Oriented Analysis method

    OpenAIRE

    DR.K.KUPPUSAMY; S. Murugan

    2010-01-01

    This paper is intended to provide a model for "Intelligence Intrusion Detection Prevention Systems using Object Oriented Analysis method". It describes the state's overall requirements regarding the acquisition and implementation of intrusion prevention and detection systems with intelligence (IIPS/IIDS). This is designed to provide a deeper understanding of intrusion prevention and detection principles with intelligence that may be responsible for acquiring, implementing or monitoring such sy...

  8. Detection of short transients in colored noise by multiresolution analysis

    OpenAIRE

    Stevens, John Davenport.

    2000-01-01

    Detecting short transients is a signal processing application that has a wide range of military uses. Specifically, in Undersea Warfare, sensitive signal detection schemes can increase the effective range of active and passive sonar operations. Current research is being done to improve the capability of detecting short signals buried within background noise, particularly in littoral waters. Starting with a colored noise model, this thesis introduces two denoising methods based on multire...

  9. Development of Quantum Devices and Algorithms for Radiation Detection and Radiation Signal Processing

    International Nuclear Information System (INIS)

    The main functions of a spectroscopy system are signal detection, filtering and amplification, pileup detection and recovery, dead time correction, amplitude analysis and energy spectrum analysis. Safeguards isotopic measurements require the best spectrometer systems, with excellent resolution, stability, efficiency and throughput. However, the resolution and throughput, which depend mainly on the detector, amplifier and the analog-to-digital converter (ADC), can still be improved. These modules have been under continuous development and improvement. For this reason we are interested in both the development of quantum detectors and efficient algorithms for digital processing of the measurements. Therefore, the main objective of this thesis concentrates on (1) studying the behavior of quantum dot (QD) devices under gamma radiation and (2) developing efficient algorithms for handling problems of gamma-ray spectroscopy. For gamma radiation detection, a detailed study of nanotechnology QD sources and infrared photodetectors (QDIPs) for gamma radiation detection is introduced. There are two different types of quantum scintillator detectors, which dominate the area of ionizing radiation measurements: QD scintillator detectors and QDIP scintillator detectors. By comparison with traditional systems, quantum systems have less mass, require less volume, and consume less power. These factors increase the need for efficient detectors for gamma-ray applications such as gamma-ray spectroscopy. The potential of nanocomposite materials based on semiconductor quantum dots for radiation detection via scintillation has been demonstrated in the literature. Therefore, this thesis presents a theoretical analysis of the characteristics of QD sources and infrared photodetectors (QDIPs). A model of QD sources under incident gamma radiation detection is developed. A novel methodology is introduced to characterize the effect of gamma radiation on QD devices. The rate

  10. Geospatial Image Stream Processing: Models, techniques, and applications in remote sensing change detection

    Science.gov (United States)

    Rueda-Velasquez, Carlos Alberto

    Detection of changes in environmental phenomena using remotely sensed data is a major requirement in the Earth sciences, especially in natural disaster related scenarios where real-time detection plays a crucial role in the saving of human lives and the preservation of natural resources. Although various approaches formulated to model multidimensional data can in principle be applied to the inherent complexity of remotely sensed geospatial data, there are still challenging peculiarities that demand a precise characterization in the context of change detection, particularly in scenarios of fast changes. In the same vein, geospatial image streams do not fit appropriately in the standard Data Stream Management System (DSMS) approach because these systems mainly deal with tuple-based streams. Recognizing the necessity for a systematic effort to address the above issues, the work presented in this thesis is a concrete step toward the foundation and construction of an integrated Geospatial Image Stream Processing framework, GISP. First, we present a data and metadata model for remotely sensed image streams. We introduce a precise characterization of images and image streams in the context of remotely sensed geospatial data. On this foundation, we define spatially-aware temporal operators with a consistent semantics for change analysis tasks. We address the change detection problem in settings where multiple image stream sources are available, and thus we introduce an architectural design for the processing of geospatial image streams from multiple sources. With the aim of targeting collaborative scientific environments, we construct a realization of our architecture based on Kepler, a robust and widely used scientific workflow management system, as the underlying computational support; and open data and Web interface standards, as a means to facilitate the interoperability of GISP instances with other processing infrastructures and client applications. We demonstrate our

  11. Qualitative Analysis for Maintenance Process Assessment

    Science.gov (United States)

    Brand, Lionel; Kim, Yong-Mi; Melo, Walcelio; Seaman, Carolyn; Basili, Victor

    1996-01-01

    In order to improve software maintenance processes, we first need to be able to characterize and assess them. These tasks must be performed in depth and with objectivity since the problems are complex. One approach is to set up a measurement-based software process improvement program specifically aimed at maintenance. However, establishing a measurement program requires that one understands the problems to be addressed by the measurement program and is able to characterize the maintenance environment and processes in order to collect suitable and cost-effective data. Also, enacting such a program and getting usable data sets takes time. A short term substitute is therefore needed. We propose in this paper a characterization process aimed specifically at maintenance and based on a general qualitative analysis methodology. This process is rigorously defined in order to be repeatable and usable by people who are not acquainted with such analysis procedures. A basic feature of our approach is that actual implemented software changes are analyzed in order to understand the flaws in the maintenance process. Guidelines are provided and a case study is shown that demonstrates the usefulness of the approach.

  12. Outlier Detection with Space Transformation and Spectral Analysis

    DEFF Research Database (Denmark)

    Dang, Xuan-Hong; Micenková, Barbora; Assent, Ira;

    2013-01-01

    Detecting a small number of outliers from a set of data observations is always challenging. In this paper, we present an approach that exploits space transformation and uses spectral analysis in the newly transformed space for outlier detection. Unlike most existing techniques in the literature w...

  13. Space Applications for Ensemble Detection and Analysis Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Ensemble Detection is both a measurement technique and analysis tool. Like a prism that separates light into spectral bands, an ensemble detector mixes a signal...

  14. Statistical analysis of interferometer detection systems

    International Nuclear Information System (INIS)

    Interferometers are often used to measure small phase shifts introduced into the paths of recombining beams by some physical influence. An analysis of the associated counting statistics is presented leading to a maximum likelihood prescription for extracting the (small) phase shift. Although presented in the context of neutron interferometry, the results are easily extended to other situations involving particle counting. 4 refs., 3 figs
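    The record does not reproduce the maximum-likelihood prescription itself. As a hedged sketch of the idea (a hypothetical two-port interferometer with Poisson counting; the visibility, count rate, and grid search are invented for illustration), a small phase shift can be extracted by maximizing the Poisson log-likelihood over the counts:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Two detector ports with mean counts N0*(1 +/- V*cos(delta + phi))
    N0, V, true_phi = 1000.0, 0.8, 0.05
    deltas = np.linspace(0, 2 * np.pi, 16, endpoint=False)  # phase-shifter settings

    n_plus = rng.poisson(N0 * (1 + V * np.cos(deltas + true_phi)))
    n_minus = rng.poisson(N0 * (1 - V * np.cos(deltas + true_phi)))

    def log_likelihood(phi):
        mu_p = N0 * (1 + V * np.cos(deltas + phi))
        mu_m = N0 * (1 - V * np.cos(deltas + phi))
        # Poisson log-likelihood up to terms independent of phi
        return np.sum(n_plus * np.log(mu_p) - mu_p) + np.sum(n_minus * np.log(mu_m) - mu_m)

    phis = np.linspace(-0.5, 0.5, 2001)
    phi_hat = phis[np.argmax([log_likelihood(p) for p in phis])]
    print(f"true phi = {true_phi:.4f}, ML estimate = {phi_hat:.4f}")
    ```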

  15. Cross-Disciplinary Detection and Analysis of Network Motifs

    OpenAIRE

    Ngoc Tam L. Tran; Luke DeLuccia; McDonald, Aidan F; Chun-Hsi Huang

    2015-01-01

    The detection of network motifs has recently become an important part of network analysis across all disciplines. In this work, we detected and analyzed network motifs from undirected and directed networks of several different disciplines, including biological, social, and ecological networks, as well as other networks such as airline, power grid, and political-book co-purchase networks. Our analysis revealed that undirected networks are similar at the basic three and four nod...
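    As a minimal illustration of motif detection (not the authors' tooling), the following counts the simplest undirected three-node motif, the triangle, from a plain edge list:

    ```python
    from itertools import combinations

    def count_triangles(edges):
        """Count 3-node motifs (triangles) in an undirected graph
        given as an iterable of (u, v) edges."""
        adj = {}
        for u, v in edges:
            adj.setdefault(u, set()).add(v)
            adj.setdefault(v, set()).add(u)
        triangles = 0
        for node, nbrs in adj.items():
            for u, v in combinations(nbrs, 2):
                if v in adj[u]:
                    triangles += 1
        return triangles // 3  # each triangle is seen once at each of its vertices

    print(count_triangles([(1, 2), (2, 3), (1, 3), (3, 4)]))  # -> 1
    ```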

  16. Design and implementation of network attack analysis and detect system

    International Nuclear Information System (INIS)

    This paper first analyzes the present state of research on IDS (intrusion detection systems), then classifies and compares existing methods. In view of the problems of existing IDS, such as false positives, false negatives, and poor information visualization, this paper proposes a system named NAADS which supports multiple data sources. Through a series of methods such as clustering analysis, association analysis, and visualization, the detection rate and usability of NAADS are increased. (authors)

  17. Signal detection using change point analysis in postmarket surveillance

    OpenAIRE

    Xu, Zhiheng; Kass-Hout, Taha; Anderson-Smits, Colin; Gray, Gerry

    2015-01-01

    Purpose: Signal detection methods have been used extensively in postmarket surveillance to identify elevated risks of adverse events associated with medical products (drugs, vaccines, and devices). However, current popular disproportionality methods ignore useful information such as trends when the data are aggregated over time for signal detection. Methods: In this paper, we applied change point analysis (CPA) to trend analysis of medical products in a spontaneous adverse event reporting syste...
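    The record does not state which CPA variant was applied. As a minimal stand-in, a CUSUM-style sketch for locating a single mean shift in an aggregated adverse-event count series (the toy counts are invented) might look like this:

    ```python
    import numpy as np

    def cusum_changepoint(x):
        """Return the index maximizing the CUSUM statistic for a single
        mean shift in a 1-D series (one classical CPA formulation)."""
        x = np.asarray(x, dtype=float)
        s = np.cumsum(x - x.mean())
        k = int(np.argmax(np.abs(s)))
        return k, abs(s[k])

    # toy monthly report counts with an elevated rate after month 24
    rng = np.random.default_rng(1)
    counts = np.concatenate([rng.poisson(5, 24), rng.poisson(12, 12)])
    k, stat = cusum_changepoint(counts)
    print(f"change point near index {k}, CUSUM statistic {stat:.1f}")
    ```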

  18. Analysis code for gamma spectra detected with Ge detectors

    International Nuclear Information System (INIS)

    The Atomic Energy Center of Miyagi Prefecture introduced a Ge-detector γ-ray spectrometer system for the purpose of examining environmental radiation around Onagawa Nuclear Power Station in March 1981, and also developed an automatic spectrum analyzing code, ARACC (Automatic Radioactivity Calculation Code), for qualitatively and quantitatively determining nuclides from the data obtained. The data processing method and output format were subsequently improved. The program comprises 17 sub-programs under the main program ARACC; in addition, there are programs for preparing tables, calibrating the energy channels, and calculating the efficiency. The Ge system of the Center can obtain normal, anti-Compton, and coincidence spectrum data in a single measurement, but ARACC analyzes only the normal spectra among these. This report describes, as the methods of data analysis, the start of the program, parameter input, printed output, and magnetic tape output, as well as the conversion of the spectrum data format, peak searching, peak area determination, the treatment of interfering peaks, and the limit of detection. The limit of detection is defined as the quantity for which the probability of concluding, as a result of measurement, that actually present radioactivity is absent is below 5%. (Wakatsuki, Y.)

  19. Analysis of the Industrial Biodiesel Production Process

    International Nuclear Information System (INIS)

    Transesterification is the chemical transformation through which biodiesel is obtained from vegetable oils. The purpose of this work is to model carefully all the stages of various biodiesel production processes on the basis of recent experimental results. These results allow the proper thermodynamic models to be defined, the phenomena to be correctly interpreted, and the parameters which affect the process to be identified. The modelling was done with ASPENPLUS (R), defining three possible processes used for industrial purposes. A subsequent sensitivity analysis was done for each process, allowing the identification of the optimal configurations. By comparing these solutions it is possible to choose the most efficient one for reducing the cost of the final product.

  20. Ergonomic analysis of radiopharmaceuticals samples preparation process

    International Nuclear Information System (INIS)

    The doses of radioisotopes to be administered to patients for diagnosis or therapy are prepared in the radiopharmacy sector. The preparation process adopts techniques aimed at reducing the exposure time of the professionals and the absorption of excessive doses by patients. The ergonomic analysis of this process contributes to the prevention of occupational illnesses and of accident risks during routine work, providing well-being and safety to the users involved and conferring an adequate working standard on the process. In this context, studies that analyze the relevant factors, point toward solutions, and establish proposals that minimize risks in these activities are clearly relevant. Through a methodology that applies the concepts of ergonomics, the aim is to improve effectiveness and quality and to reduce the difficulties experienced by the workers. The prescribed work, established through codified norms and procedures, is compared with the work as actually carried out, the real work, in order to form a correct appreciation focused on the activities. The objective of this work is to discuss an ergonomic analysis of the radioisotope sample preparation process in the Setor de Radiofarmacia do Hospital Universitario Clementino Fraga Filho da Universidade Federal do Rio de Janeiro (UFRJ). (author)

  1. Analysis of Exhaled Breath for Disease Detection

    Science.gov (United States)

    Amann, Anton; Miekisch, Wolfram; Schubert, Jochen; Buszewski, Bogusław; Ligor, Tomasz; Jezierski, Tadeusz; Pleil, Joachim; Risby, Terence

    2014-06-01

    Breath analysis is a young field of research with great clinical potential. As a result of this interest, researchers have developed new analytical techniques that permit real-time analysis of exhaled breath with breath-to-breath resolution in addition to the conventional central laboratory methods using gas chromatography-mass spectrometry. Breath tests are based on endogenously produced volatiles, metabolites of ingested precursors, metabolites produced by bacteria in the gut or the airways, or volatiles appearing after environmental exposure. The composition of exhaled breath may contain valuable information for patients presenting with asthma, renal and liver diseases, lung cancer, chronic obstructive pulmonary disease, inflammatory lung disease, or metabolic disorders. In addition, oxidative stress status may be monitored via volatile products of lipid peroxidation. Measurement of enzyme activity provides phenotypic information important in personalized medicine, whereas breath measurements provide insight into perturbations of the human exposome and can be interpreted as preclinical signals of adverse outcome pathways.

  2. Improved Facial-Feature Detection for AVSP via Unsupervised Clustering and Discriminant Analysis

    Directory of Open Access Journals (Sweden)

    Lucey Simon

    2003-01-01

    Full Text Available An integral part of any audio-visual speech processing (AVSP) system is the front-end visual system that detects facial-features (e.g., eyes and mouth) pertinent to the task of visual speech processing. The ability of this front-end system to not only locate, but also give a confidence measure that the facial-feature is present in the image, directly affects the ability of any subsequent post-processing task such as speech or speaker recognition. With these issues in mind, this paper presents a framework for a facial-feature detection system suitable for use in an AVSP system, but whose basic framework is useful for any application requiring frontal facial-feature detection. A novel approach for facial-feature detection is presented, based on an appearance paradigm. This approach, based on intraclass unsupervised clustering and discriminant analysis, displays improved detection performance over conventional techniques.

  3. Improved Facial-Feature Detection for AVSP via Unsupervised Clustering and Discriminant Analysis

    Science.gov (United States)

    Lucey, Simon; Sridharan, Sridha; Chandran, Vinod

    2003-12-01

    An integral part of any audio-visual speech processing (AVSP) system is the front-end visual system that detects facial-features (e.g., eyes and mouth) pertinent to the task of visual speech processing. The ability of this front-end system to not only locate, but also give a confidence measure that the facial-feature is present in the image, directly affects the ability of any subsequent post-processing task such as speech or speaker recognition. With these issues in mind, this paper presents a framework for a facial-feature detection system suitable for use in an AVSP system, but whose basic framework is useful for any application requiring frontal facial-feature detection. A novel approach for facial-feature detection is presented, based on an appearance paradigm. This approach, based on intraclass unsupervised clustering and discriminant analysis, displays improved detection performance over conventional techniques.
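    A rough sketch of the intraclass-clustering-plus-discriminant-analysis idea (scikit-learn; the 16-dimensional appearance vectors, cluster count, and class layout are invented for illustration): split the feature class into appearance sub-classes with k-means, then train a linear discriminant over sub-classes and background, using the class posterior as the confidence measure.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(0)

    # Hypothetical 16-D appearance vectors: two "eye" appearance modes + background
    eyes = np.vstack([rng.normal(0, 1, (50, 16)), rng.normal(3, 1, (50, 16))])
    background = rng.normal(-2, 1, (100, 16))

    # Intraclass unsupervised clustering: split the feature class into
    # sub-classes so a linear discriminant can separate each appearance mode.
    sub_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(eyes)

    X = np.vstack([eyes, background])
    y = np.concatenate([sub_labels, np.full(len(background), 2)])  # 2 = background

    lda = LinearDiscriminantAnalysis().fit(X, y)
    probs = lda.predict_proba(eyes[:3])
    print("P(sub-class 0, sub-class 1, background):")
    print(np.round(probs, 3))  # confidence that a patch is a facial feature
    ```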

  4. Detection of Epileptic Seizures with Multi-modal Signal Processing

    DEFF Research Database (Denmark)

    Conradsen, Isa

    The main focus of this dissertation lies within the area of epileptic seizure detection. Medically refractory epileptic patients suffer from the unawareness of when the next seizure sets in, and what the consequences will be. A wearable device based on uni- or multi-modalities able to detect and ...

  5. Processing of polarimetric infrared images for landmine detection

    NARCIS (Netherlands)

    Cremer, F.; Jong, W. de; Schutte, K.

    2003-01-01

    Infrared (IR) cameras are often used in a vehicle based multi-sensor platform for landmine detection. Additional to thermal contrasts, an IR polarimetric sensor also measures surface properties and therefore has the potential of increased detection performance. We have developed a polarimetric IR se

  6. Analysis of rocket engine injection combustion processes

    Science.gov (United States)

    Salmon, J. W.

    1976-01-01

    A critique is given of the JANNAF sub-critical propellant injection/combustion process analysis computer models and the application of the models to the correlation of well-documented hot-fire engine data bases. These programs are the distributed energy release (DER) model for conventional liquid propellant injectors and the coaxial injection combustion model (CICM) for gaseous annulus/liquid core coaxial injectors. The critique identifies model inconsistencies while the computer analyses provide quantitative data on predictive accuracy. The program is comprised of three tasks: (1) computer program review and operations; (2) analysis and data correlations; and (3) documentation.

  7. Integrating human factors into process hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kariuki, S.G. [Technische Universitaet Berlin, Institute of Process and Plant Technology, Sekr. TK0-1, Strasse des 17. Juni 135, 10623 Berlin (Germany); Loewe, K. [Technische Universitaet Berlin, Institute of Process and Plant Technology, Sekr. TK0-1, Strasse des 17. Juni 135, 10623 Berlin (Germany)]. E-mail: katharina.loewe@tu-berlin.de

    2007-12-15

    A comprehensive process hazard analysis (PHA) needs to address human factors. This paper describes an approach that systematically identifies human error in process design and the human factors that influence its production and propagation. It is deductive in nature and therefore considers human error as a top event. The combinations of different factors that may lead to this top event are analysed. It is qualitative in nature and is used in combination with other PHA methods. The method has an advantage because it does not look at the operator error as the sole contributor to the human failure within a system but a combination of all underlying factors.

  8. Nuclear processes of non-intrusive analysis

    International Nuclear Information System (INIS)

    Non-intrusive nuclear analysis processes use gamma and neutron beams to analyze the content of objects. These processes are used for instance in military applications for the inspection of munition charges, nuclear materials, chemical explosives, for the localization of landmines or for the characterization of the content of a suspect parcel. These techniques are also applied in the industry (petroleum prospecting, cement factories) and in preventive medicine. This paper gives an overview of these techniques with details about their principle: neutron and photon interrogation, excitation sources, identification techniques and examples of application. (J.S.)

  9. Predictive Analysis for Social Processes II: Predictability and Warning Analysis

    CERN Document Server

    Colbaugh, Richard

    2009-01-01

    This two-part paper presents a new approach to predictive analysis for social processes. Part I identifies a class of social processes, called positive externality processes, which are both important and difficult to predict, and introduces a multi-scale, stochastic hybrid system modeling framework for these systems. In Part II of the paper we develop a systems theory-based, computationally tractable approach to predictive analysis for these systems. Among other capabilities, this analytic methodology enables assessment of process predictability, identification of measurables which have predictive power, discovery of reliable early indicators for events of interest, and robust, scalable prediction. The potential of the proposed approach is illustrated through case studies involving online markets, social movements, and protest behavior.

  10. Multiuser detection and independent component analysis-Progress and perspective

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The latest progress in multiuser detection and independent component analysis (ICA) is reviewed systematically. Two novel classes of multiuser detection methods based on ICA algorithms and feedforward neural networks are then proposed. Theoretical analysis and computer simulation show that ICA algorithms are effective for detecting multiuser signals in code-division multiple-access (CDMA) systems. The performance of these methods is not entirely identical across various channels, but all of them are robust, efficient, fast, and suitable for real-time implementation.
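    As a toy illustration of ICA-based multiuser detection (not the authors' algorithms; the synchronous CDMA model, code length, and noise level are invented), FastICA can separate users' antipodal symbol streams from chip-rate observations. Signs and ordering of the recovered streams are ambiguous, as is inherent to ICA:

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)

    # Hypothetical synchronous CDMA downlink: 3 users' antipodal symbols
    # spread by random +/-1 codes of length 8, observed with noise.
    n_sym, code_len, n_users = 500, 8, 3
    symbols = rng.choice([-1.0, 1.0], (n_sym, n_users))
    codes = rng.choice([-1.0, 1.0], (n_users, code_len))
    received = symbols @ codes + 0.1 * rng.normal(size=(n_sym, code_len))

    # ICA recovers the users' symbol streams (up to sign and order)
    ica = FastICA(n_components=n_users, random_state=0)
    detected = np.sign(ica.fit_transform(received))

    # compare one recovered stream against every true stream (sign-invariant)
    match = max(np.mean(detected[:, 0] == s * symbols[:, j])
                for j in range(n_users) for s in (-1, 1))
    print(f"best symbol agreement for component 0: {match:.2%}")
    ```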

  11. Detecting Anomaly Regions in Satellite Image Time Series Based on Seasonal Autocorrelation Analysis

    Science.gov (United States)

    Zhou, Z.-G.; Tang, P.; Zhou, M.

    2016-06-01

    Anomaly regions in satellite images can reflect unexpected changes of land cover caused by flood, fire, landslide, etc. Detecting anomaly regions in satellite image time series is important for studying the dynamic processes of land cover change as well as for disaster monitoring. Although several methods have been developed to detect land cover changes using satellite image time series, they are generally designed for detecting inter-annual or abrupt land cover changes and do not focus on detecting spatial-temporal changes in continuous images. In order to identify the spatial-temporal dynamic processes of unexpected land cover changes, this study proposes a method for detecting anomaly regions in each image of a satellite image time series based on seasonal autocorrelation analysis. The method was validated with a case study detecting the spatial-temporal process of a severe flood using Terra/MODIS image time series. Experiments demonstrated the advantages of the method: (1) it can effectively detect anomaly regions in each image of a satellite image time series, showing the spatially and temporally varying process of the anomaly regions; (2) it is flexible enough to meet requirements on detection accuracy (e.g., z-value or significance level), with overall accuracy up to 89% and precision above 90%; and (3) it does not need time-series smoothing and can detect anomaly regions in noisy satellite images with high reliability.
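    The paper's seasonal autocorrelation analysis is not reproduced in the record. As a simplified per-pixel stand-in only (the robust z-score formulation, threshold, and toy series are invented), the sketch below flags observations that deviate strongly from the robust statistics of the same season across years:

    ```python
    import numpy as np

    def seasonal_anomaly_mask(series, period=12, z_thresh=5.0):
        """Flag values deviating strongly from the robust statistics of the
        same season across years (a crude per-pixel anomaly criterion)."""
        series = np.asarray(series, dtype=float)
        mask = np.zeros(series.shape, dtype=bool)
        for phase in range(period):
            vals = series[phase::period]
            med = np.median(vals)
            mad = 1.4826 * np.median(np.abs(vals - med))  # robust sigma estimate
            if mad > 0:
                mask[phase::period] = np.abs(vals - med) > z_thresh * mad
        return mask

    # toy NDVI-like series: 5 years of a seasonal cycle plus one flood-like drop
    rng = np.random.default_rng(2)
    t = np.arange(60)
    ndvi = 0.5 + 0.3 * np.sin(2 * np.pi * t / 12) + 0.02 * rng.normal(size=60)
    ndvi[40] -= 0.4
    print("anomalous time indices:", np.flatnonzero(seasonal_anomaly_mask(ndvi)))
    ```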

  12. Gold Nanoparticles-Based Barcode Analysis for Detection of Norepinephrine.

    Science.gov (United States)

    An, Jeung Hee; Lee, Kwon-Jai; Choi, Jeong-Woo

    2016-02-01

    Nanotechnology-based bio-barcode amplification analysis offers an innovative approach for detecting neurotransmitters. We evaluated the efficacy of this method for detecting norepinephrine in normal and oxidative-stress-damaged dopaminergic cells. Our approach uses a combination of DNA barcodes and bead-based immunoassays for detecting neurotransmitters with surface-enhanced Raman spectroscopy (SERS), and provides polymerase chain reaction (PCR)-like sensitivity. This method relies on magnetic Dynabeads containing antibodies and nanoparticles that are loaded both with DNA barcodes and with antibodies that can sandwich the target protein captured by the Dynabead-bound antibodies. The aggregate sandwich structures are magnetically separated from the solution and treated to remove the conjugated barcode DNA. The DNA barcodes are then identified by SERS and PCR analysis. The concentration of norepinephrine in dopaminergic cells can be readily detected using the bio-barcode assay, which is a rapid, high-throughput screening tool for detecting neurotransmitters. PMID:27305769

  13. Real-time Forward Vehicle Detection Method Based on Edge Analysis

    Institute of Scientific and Technical Information of China (English)

    Young-suk JI; Hwan-ik CHUNG; Hern-soo HAHN

    2010-01-01

    This paper proposes a method that uses extended edge analysis to supplement inaccurate edge information for better vehicle detection. The extended edge analysis detects two vertical edges, the borderlines of both sides of the vehicle, by extending the horizontal edges that are obtained inaccurately due to illumination or noise in the image. The proposed method extracts horizontal edges by merging edges, using the horizontal edge information inside the Region of Interest (ROI) set up in the pre-processing step. The bottom line of the vehicle is determined by detecting the vehicle's shadow region from the extracted horizontal edges. General vehicle-width detection and the extended edge analysis are then carried out side by side on the bottom line to determine the width of the vehicle. Finally, the vehicle is confirmed through a verification step. On road images with complicated backgrounds, the vehicle detection method based on extended edge analysis is more efficient than existing edge-based vehicle detection methods. The effectiveness of the proposed method is confirmed by vehicle detection experiments on complicated road images.

  14. Detection of ancient Egyptian archaeological sites using satellite remote sensing and digital image processing

    Science.gov (United States)

    Corrie, Robert K.

    2011-11-01

    Satellite remote sensing is playing an increasingly important role in the detection and documentation of archaeological sites. Surveying an area from the ground using traditional methods often presents challenges due to the time and costs involved. In contrast, the multispectral synoptic approach afforded by the satellite sensor makes it possible to cover much larger areas in greater spectral detail and more cost effectively. This is especially the case for larger scale regional surveys, which are helping to contribute to a better understanding of ancient Egyptian settlement patterns. This study presents an overview of satellite remote sensing data products, methodologies, and image processing techniques for detecting lost or undiscovered archaeological sites with reference to Egypt and the Near East. Key regions of the electromagnetic spectrum useful for site detection are discussed, including the visible near-infrared (VNIR), shortwave infrared (SWIR), thermal infrared (TIR), and microwave (radar). The potential of using Google Earth as both a data provider and a visualization tool is also examined. Finally, a case study is presented for detecting tell sites in Egypt using Landsat ETM+, ASTER, and Google Earth imagery. The results indicated that principal components analysis (PCA) was successfully able to detect and differentiate tell sites from modern settlements in Egypt's northwestern Nile Delta region.
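    A minimal sketch of the PCA step (plain NumPy; the toy six-band scene is invented): decorrelate the spectral bands and inspect the leading component images, in which spectrally distinctive features such as tells tend to stand out from modern settlement signatures.

    ```python
    import numpy as np

    def principal_components(image):
        """PCA over the band dimension of an (H, W, B) multispectral
        image; returns component images sorted by explained variance."""
        h, w, b = image.shape
        flat = image.reshape(-1, b).astype(float)
        flat -= flat.mean(axis=0)
        cov = np.cov(flat, rowvar=False)
        eigvals, eigvecs = np.linalg.eigh(cov)
        order = np.argsort(eigvals)[::-1]          # descending variance
        scores = flat @ eigvecs[:, order]
        return scores.reshape(h, w, b), eigvals[order]

    # toy 6-band scene: band correlations broken by a "tell-like" mound
    rng = np.random.default_rng(3)
    scene = rng.normal(0, 1, (32, 32, 6))
    scene[10:14, 10:14, :] += np.linspace(1, 3, 6)  # distinctive spectral signature
    pcs, var = principal_components(scene)
    print("explained variance ratios:", np.round(var / var.sum(), 3))
    ```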

  15. Strategic analysis of a data processing company

    OpenAIRE

    Chen, George C. M.

    2005-01-01

    This paper contains a strategic analysis of a data processing company that provides outsourced payroll services to employers in Canada. For the purpose of confidentiality, the name of this company is disguised as ABC Canada. This company is in a mature industry, which is characterized by slow growth and narrowing of product differentiation as new entrants are able to penetrate the market at lower cost due to the advancement in information technology. The growth of this industry in the last fo...

  16. Process Simulation Analysis of HF Stripping

    OpenAIRE

    Thaer A. Abdulla

    2013-01-01

    HYSYS process simulator is used for the analysis of an existing HF stripping column in a LAB plant (Arab Detergent Company, Baiji, Iraq). Simulated column performance and profile curves are constructed. The variables considered are the thermodynamic model option, bottom temperature, feed temperature, and the column profiles for temperature, vapor flow rate, liquid flow rate, and composition. The five thermodynamic model options used (Margules, UNIQUAC, van Laar, Antoine, and Zudkevitch-Joffee),...

  17. Process Correlation Analysis Model for Process Improvement Identification

    OpenAIRE

    Su-jin Choi; Dae-Kyoo Kim; Sooyong Park

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the software process improvement effort as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in each process area in terms of goals and practice...

  18. Image Corruption Detection in Diffusion Tensor Imaging for Post-Processing and Real-Time Monitoring

    Science.gov (United States)

    Li, Yue; Shea, Steven M.; Lorenz, Christine H.; Jiang, Hangyi; Chou, Ming-Chung; Mori, Susumu

    2013-01-01

    Due to the high sensitivity of diffusion tensor imaging (DTI) to physiological motion, clinical DTI scans often suffer a significant amount of artifacts. Tensor-fitting-based, post-processing outlier rejection is often used to reduce the influence of motion artifacts. Although it is an effective approach, when there are multiple corrupted data, this method may no longer correctly identify and reject the corrupted data. In this paper, we introduce a new criterion called “corrected Inter-Slice Intensity Discontinuity” (cISID) to detect motion-induced artifacts. We compared the performance of algorithms using cISID and other existing methods with regard to artifact detection. The experimental results show that the integration of cISID into fitting-based methods significantly improves the retrospective detection performance at post-processing analysis. The performance of the cISID criterion, if used alone, was inferior to the fitting-based methods, but cISID could effectively identify severely corrupted images with a rapid calculation time. In the second part of this paper, an outlier rejection scheme was implemented on a scanner for real-time monitoring of image quality and reacquisition of the corrupted data. The real-time monitoring, based on cISID and followed by post-processing, fitting-based outlier rejection, could provide a robust environment for routine DTI studies. PMID:24204551
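    The exact cISID formula is defined in the cited paper. The sketch below is only an illustrative inter-slice discontinuity score in the same spirit (the normalization and the toy volume are invented), flagging slices whose intensity jumps to their neighbours are unusually large:

    ```python
    import numpy as np

    def inter_slice_discontinuity(volume):
        """Score each slice of a (Z, H, W) volume by the mean absolute
        intensity jump to its neighbours, normalized by the median score.
        This is a stand-in illustration, not the paper's cISID."""
        vol = np.asarray(volume, dtype=float)
        diffs = np.abs(np.diff(vol, axis=0)).mean(axis=(1, 2))  # Z-1 gap scores
        score = np.zeros(vol.shape[0])
        score[:-1] += diffs
        score[1:] += diffs
        score[1:-1] /= 2.0          # interior slices have two neighbours
        return score / (np.median(score) + 1e-9)

    rng = np.random.default_rng(4)
    dwi = rng.normal(100, 5, (20, 16, 16))
    dwi[7] *= 0.5  # motion-corrupted slice with a signal dropout
    print("most suspicious slice:", int(np.argmax(inter_slice_discontinuity(dwi))))
    ```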

  19. Analysis of Rheumatoid Arthritis through Image Processing

    Directory of Open Access Journals (Sweden)

    Arpita Mittal

    2012-11-01

    Full Text Available Rheumatoid arthritis (RA), the most common inflammatory arthropathy worldwide (though it may be less prevalent in Asian populations), causes pain, swelling, stiffness, and loss of function in joints. A broad spectrum of magnetic resonance imaging findings is encountered in the musculoskeletal system in this disease, but these images are often unproductive due to the noise present in them, which makes analysis troublesome. The role of image processing in rheumatoid arthritis lies not in diagnosis, but in the evaluation of the integrity of structures affected by the disease process. Magnetic resonance imaging is more sensitive to synovial changes than any radiographic technique and may permit quantification of changes in disease activity, as well as evaluation of the effects of drug therapy and of complications of the disease and its treatment. Analyzing magnetic resonance images with the image processing tools of MATLAB therefore provides a straightforward approach to assessing the disease.

  20. Onboard Detection of Snow, Ice, Clouds, and Other Processes

    Data.gov (United States)

    National Aeronautics and Space Administration — The detection of clouds within a satellite image is essential for retrieving surface geophysical parameters from optical and thermal imagery. Even a small...

  1. Sensitivity Enhancement of RF Plasma Etch Endpoint Detection With K-means Cluster Analysis

    Science.gov (United States)

    Lee, Honyoung; Jang, Haegyu; Lee, Hak-Seung; Chae, Heeyeop

    2015-09-01

    Plasma etching is a core process in semiconductor fabrication, and etching endpoint detection is one of the essential FDC (Fault Detection and Classification) functions for yield management and mass production. In general, optical emission spectroscopy (OES) has been used to detect the endpoint because OES is a non-invasive, real-time plasma monitoring tool. In OES, the trend of a few sensitive wavelengths is traced. However, small-open-area etch endpoint detection (e.g., contact etch) is at the boundary of the detection limit because of the weak signal intensities of the reactants and products. Furthermore, the various materials covering the wafer, such as photoresist, dielectric materials, and metals, complicate the analysis of OES signals. In this study, full spectra of optical emission signals were collected and the data were analyzed with a data-mining approach, a modified K-means cluster analysis. The K-means cluster analysis is modified to suit the analysis of a thousand wavelength variables from OES. This technique can improve the sensitivity of EPD for small-area oxide layer etching processes (about 1.0% oxide area) and is expected to be applied to various plasma monitoring applications, including fault detection as well as EPD. Keywords: plasma etch, EPD, K-means cluster analysis.
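    A schematic of the clustering idea (scikit-learn; the synthetic OES run, line positions, and noise levels are invented, and the paper's modifications to K-means are not reproduced): cluster the full spectra over time and read the endpoint off the cluster-label transition.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(5)

    # Hypothetical OES run: 200 time samples x 1024 wavelength channels; a few
    # weak product lines fade after the (small open area) endpoint at sample 150.
    n_t, n_w, t_end = 200, 1024, 150
    base = rng.normal(1.0, 0.01, n_w)
    shift = np.zeros(n_w)
    shift[400:450] = 0.1
    spectra = np.vstack([base + (shift if t >= t_end else 0.0)
                         + rng.normal(0, 0.005, n_w) for t in range(n_t)])

    # Cluster full spectra over time; the label transition marks the endpoint.
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(spectra)
    print("estimated endpoint sample(s):", np.flatnonzero(np.diff(labels)) + 1)
    ```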

  2. Microfluidic system to detect select DNA fragments using agglutination process

    OpenAIRE

    Chhina, Sumanpreet Kaur

    2011-01-01

    This thesis investigates the design, fabrication, and testing of an easy-to-use, disposable and portable microfluidic system for DNA amplification detection; this is suitable for point-of-care testing (POCT) applications. The microfluidic system utilizes biotin-labelled DNA to agglutinate streptavidin-coated microspheres. The microfluidic system is designed to retain aggregates of cross-linked microspheres as opposed to single microspheres, indicating the detection of biotin-labelled DNA. The...

  3. Fault detection in rotating machines by vibration signal processing techniques

    OpenAIRE

    D'Elia, Gianluca

    2008-01-01

    Machines with moving parts give rise to vibrations and consequently noise. The setup and status of each machine yield a distinctive vibration signature. Therefore, a change in the vibration signature, due to a change in the machine state, can be used to detect incipient defects before they become critical. This is the goal of condition monitoring, in which the information obtained from a machine's signature is used in order to detect faults at an early stage. There are...

  4. Application of signal processing techniques to the detection of tip vortex cavitation noise in marine propeller

    Institute of Scientific and Technical Information of China (English)

    LEE Jeung-Hoon; HAN Jae-Moon; PARK Hyung-Gil; SEO Jong-Soo

    2013-01-01

    The tip vortex cavitation and its relevant noise have been the subject of extensive research up to now. In most experimental approaches, an accurate and objective decision on cavitation inception is essential, and this is the main topic of this paper. Although the conventional power spectrum is normally adopted as a signal processing tool for the analysis of cavitation noise, it cannot faithfully capture cavitation inception. Alternatively, the periodic occurrence of bursting noise induced by tip vortex cavitation provides diagnostic proof that the repetition frequency of the bursting content can be exploited as an indication of inception. This study therefore employed Short-Time Fourier Transform (STFT) analysis and Detection of Envelope Modulation On Noise (DEMON) spectrum analysis, both of which are appropriate for finding such a repetition frequency. Through acoustical measurements in a water tunnel, the two signal processing techniques gave satisfactory results in detecting the inception of tip vortex cavitation.
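    A minimal DEMON-style sketch (SciPy; the sample rate, burst model, and repetition frequency are invented): take the envelope of the broadband signal via the Hilbert transform, then look for the repetition frequency in the envelope's spectrum.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    fs = 50_000                      # sample rate [Hz]
    t = np.arange(0, 1.0, 1 / fs)
    rng = np.random.default_rng(6)

    # Hypothetical cavitation noise: broadband bursts repeating at 25 Hz,
    # buried in background noise.
    rep_f = 25.0
    bursts = (np.sin(2 * np.pi * rep_f * t) > 0.99).astype(float)
    signal = bursts * rng.normal(0, 1, t.size) + 0.3 * rng.normal(0, 1, t.size)

    # DEMON-style analysis: envelope via Hilbert transform, then its spectrum.
    env = np.abs(hilbert(signal))
    env -= env.mean()
    spec = np.abs(np.fft.rfft(env))
    freqs = np.fft.rfftfreq(env.size, 1 / fs)
    band = (freqs > 5) & (freqs < 200)
    print(f"strongest modulation at {freqs[band][np.argmax(spec[band])]:.1f} Hz")
    ```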

  5. QRS DETECTION OF ECG - A STATISTICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    I.S. Siva Rao

    2015-03-01

    Full Text Available The electrocardiogram (ECG) is a graphical representation of the activity of the heart muscle. The ECG plays an important role in the diagnosis and monitoring of the heart's condition. A real-time analyzer based on filtering, beat recognition, clustering, and signal classification, with a delay of at most a few seconds, can be used to recognize life-threatening arrhythmias. The ECG signal permits examination and study of anatomic and physiologic facets of the entire cardiac muscle. The first task for proficient analysis is the removal of noise, which is accomplished with wavelet transform analysis. Wavelets yield temporal and spectral information concurrently and offer flexibility through the choice of wavelet functions with different properties. This paper is concerned with the extraction of QRS complexes of ECG signals using Discrete Wavelet Transform based algorithms aided by MATLAB. Denoising is performed by removing inconsistent wavelet transform coefficients. QRS complexes are then identified, and each peak can be utilized to locate the peaks of the other waves, such as P and T, and their derivatives. We put forth a new combinatory algorithm built on Pan-Tompkins' method and the multi-wavelet transform.
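    The paper combines wavelet denoising with Pan-Tompkins' method; the sketch below covers only a Pan-Tompkins-style pipeline (SciPy; the filter order, band edges, window length, and threshold are illustrative, not the paper's tuned values):

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def detect_qrs(ecg, fs=250):
        """Pan-Tompkins-style QRS detection sketch: band-pass, derivative,
        squaring, moving-window integration, simple thresholding."""
        b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, ecg)
        squared = np.gradient(filtered) ** 2
        window = int(0.15 * fs)
        integrated = np.convolve(squared, np.ones(window) / window, mode="same")
        above = integrated > 0.5 * integrated.max()
        # rising edges of the thresholded signal mark QRS onsets
        return np.flatnonzero(above[1:] & ~above[:-1])

    # toy ECG: one narrow "R peak" per second plus baseline noise at 250 Hz
    fs, beats = 250, 8
    ecg = 0.05 * np.random.default_rng(7).normal(size=fs * beats)
    for k in range(beats):
        ecg[k * fs + 100] += 1.0
    print(detect_qrs(ecg, fs))
    ```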

  6. Multivariate Statistical Process Monitoring Using Robust Nonlinear Principal Component Analysis

    Institute of Scientific and Technical Information of China (English)

    ZHAO Shijian; XU Yongmao

    2005-01-01

    The principal component analysis (PCA) algorithm is widely applied in a diverse range of fields for performance assessment, fault detection, and diagnosis. However, in the presence of noise and gross errors, nonlinear PCA (NLPCA) using autoassociative bottleneck neural networks is so sensitive that the obtained model can differ significantly from the underlying system. In this paper, a robust version of NLPCA is introduced by replacing the generally used error criterion, the mean squared error, with a mean log squared error. This is followed by a concise analysis of the corresponding training method. A novel multivariate statistical process monitoring (MSPM) scheme incorporating the proposed robust NLPCA technique is then investigated and its efficiency is assessed through application to an industrial fluidized catalytic cracking plant. The results demonstrate that, compared with NLPCA, the proposed approach can effectively reduce the number of false alarms and is, hence, expected to better monitor real-world processes.
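    The effect of swapping the error criterion is easy to demonstrate numerically. In the sketch below (illustrative only; the residual distribution and contamination level are invented), 1% gross errors inflate the mean squared error by orders of magnitude while barely moving the mean log squared error:

    ```python
    import numpy as np

    def mse(residuals):
        return np.mean(residuals ** 2)

    def mlse(residuals, eps=1e-8):
        """Mean log squared error: log(.) grows slowly, so gross errors
        dominate the objective far less than under MSE."""
        return np.mean(np.log(residuals ** 2 + eps))

    rng = np.random.default_rng(8)
    clean = rng.normal(0, 0.1, 1000)
    contaminated = clean.copy()
    contaminated[:10] += 50.0            # 1% gross errors

    print(f"MSE : clean {mse(clean):9.4f}  contaminated {mse(contaminated):9.4f}")
    print(f"MLSE: clean {mlse(clean):9.4f}  contaminated {mlse(contaminated):9.4f}")
    ```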

  7. Sound vibration signal processing for detection and identification detonation (knock) to optimize performance Otto engine

    Science.gov (United States)

    Sujono, A.; Santoso, B.; Juwana, W. E.

    2016-03-01

    The problem of detonation (knock) in Otto (petrol) engines remains unresolved, especially when performance is to be improved. In this research, the engine's sound vibration signal, acquired with a microphone sensor, was processed for the detection and identification of detonation. A microphone can be mounted without being attached to the cylinder block, which is at high temperature, so its performance is more stable and durable and the sensor is inexpensive. The analysis method, however, is not straightforward, because there is substantial noise (interference). Therefore a new pattern recognition method is used, based on filtering and a normalized-envelope regression function. The result is quite good, achieving a success rate of about 95%.

  8. Size-varying small target detection for infrared image processing

    Science.gov (United States)

    Li, Miao; Zhu, Ran; Long, Yunli; An, Wei; Zhou, Yiyu

    2015-10-01

    IRST (Infrared Search and Track) has been applied to many military and civil fields, such as precision guidance, aerospace, and early warning. As a key technique, small target detection based on infrared images plays an important role. However, infrared targets have characteristics, such as target size variation, that make detection quite difficult. In practical applications, the target size may vary for many reasons, such as the optic angle of the sensors, the imaging distance, and the environment. For conventional detection methods, it is difficult to detect such size-varying targets, especially when the backgrounds have strong clutter. This paper presents a novel method to detect size-varying infrared targets against a cluttered background. The target region is salient in infrared images: it has a signature of discontinuity with its neighboring regions and concentrates in a relatively small area, which can be considered a homogeneous compact region, whereas the background is consistent with its neighboring regions. Motivated by the saliency and gradient features, we introduce the minimum target intensity (MTI) to measure the dissimilarity between different scales, and use the mean gradient to restrict the target scale to a reasonable range. These are integrated into a multiscale MTI filter, around which the proposed detection method is designed. First, the salient region, in which potential targets exist, is obtained by morphological low-pass filtering. Second, candidate target regions are extracted by the multiscale minimum target intensity filter, which can effectively give the optimal target size. Finally, the signal-to-clutter ratio (SCR), computed at the optimal scale of the candidate targets, is used to segment the targets. The experimental results indicate that the proposed method achieves both higher detection precision and robustness in complex backgrounds.

  9. Real time loss detection for SNM in process

    International Nuclear Information System (INIS)

    This paper discusses the basis of a design for real time special nuclear material (SNM) loss detectors. The design utilizes process measurements and signal processing techniques to produce a timely estimate of material loss. A state estimator is employed as the primary signal processing algorithm. Material loss is indicated by changes in the states or process innovations (residuals). The design philosophy is discussed in the context of these changes
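    A toy version of the idea (a scalar Kalman filter for a single hypothetical tank; all noise levels, flows, and the alarm threshold are invented): predict the inventory from the declared flows and watch the normalized innovations, whose persistent bias signals an unrecorded loss.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    n, q, r = 200, 1e-4, 0.05        # steps, process noise var, sensor noise var
    inflow = 0.1
    true_level, est, P = 10.0, 10.0, 1.0
    innovations = []
    for k in range(n):
        leak = 0.02 if k >= 120 else 0.0            # simulated loss from step 120
        true_level += inflow - leak + rng.normal(0, np.sqrt(q))
        z = true_level + rng.normal(0, np.sqrt(r))  # level measurement
        est += inflow                               # predict with declared flows
        P += q
        S = P + r
        nu = z - est                                # innovation (residual)
        innovations.append(nu / np.sqrt(S))         # normalized innovation
        K = P / S
        est += K * nu
        P *= (1 - K)

    # crude alarm: moving mean of normalized innovations drifts from zero
    w = 30
    trend = np.convolve(innovations, np.ones(w) / w, mode="valid")
    print("first alarm near step:", int(np.argmax(np.abs(trend) > 0.5)) + w)
    ```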

  10. Knowledge Base Approach for 3D Objects Detection in Point Clouds Using 3D Processing and Specialists Knowledge

    OpenAIRE

    Ben Hmida, Helmi; Cruz, Christophe; Boochs, Frank; Nicolle, Christophe

    2013-01-01

    This paper presents a knowledge-based object detection approach using the OWL ontology language, the Semantic Web Rule Language, and 3D processing built-ins, aiming to combine geometrical analysis of 3D point clouds with specialists' knowledge. Here, we share our experience regarding the creation of a 3D semantic facility model out of unorganized 3D point clouds. Thus, a knowledge-based detection approach of objects using the OWL ontology language is presented. Thi...

  11. Auto Landing Process for Autonomous Flying Robot by Using Image Processing Based on Edge Detection

    Directory of Open Access Journals (Sweden)

    Bahram Lavi Sefidgari

    2014-01-01

    Full Text Available In today's technological world, everyone is familiar with the importance of security measures, and many research efforts have addressed this area; one of them is flying robot technology. One well-known use of flying robots is in security and surveillance tasks, which makes the device extremely practical, not only for its unmanned movement but also for its unique manoeuvrability over arbitrary areas. In this research, the automatic landing of a flying robot is discussed. The system is based on frequent interrupts sent from the main microcontroller to the camera module in order to capture images; these images are analysed by an image processing system based on edge detection, and after analysing an image the system decides whether or not to land. Experiments show that this method performs well in terms of precision.

  12. Efficient signal processing for time-resolved fluorescence detection of nitrogen-vacancy spins in diamond

    Science.gov (United States)

    Gupta, A.; Hacquebard, L.; Childress, L.

    2016-03-01

    Room-temperature fluorescence detection of the nitrogen-vacancy center electronic spin typically has low signal to noise, requiring long experiments to reveal an averaged signal. Here, we present a simple approach to analysis of time-resolved fluorescence data that permits an improvement in measurement precision through signal processing alone. Applying our technique to experimental data reveals an improvement in signal to noise equivalent to a 14% increase in photon collection efficiency. We further explore the dependence of the signal to noise ratio on excitation power, and analyze our results using a rate equation model. Our results provide a rubric for optimizing fluorescence spin detection, which has direct implications for improving precision of nitrogen-vacancy-based sensors.

  13. Relative Saliency in Change Signals Affects Perceptual Comparison and Decision Processes in Change Detection

    Science.gov (United States)

    Yang, Cheng-Ta

    2011-01-01

    Change detection requires perceptual comparison and decision processes on different features of multiattribute objects. How relative salience between two feature-changes influences the processes has not been addressed. This study used the systems factorial technology to investigate the processes when detecting changes in a Gabor patch with visual…

  14. Across frequency processes involved in auditory detection of coloration

    DEFF Research Database (Denmark)

    Buchholz, Jörg; Kerketsos, P

    2008-01-01

    When an early wall reflection is added to a direct sound, a spectral modulation is introduced to the signal's power spectrum. This spectral modulation typically produces an auditory sensation of coloration or pitch. Throughout this study, auditory spectral-integration effects involved in coloration detection are investigated. Coloration detection thresholds were therefore measured as a function of reflection delay and stimulus bandwidth. In order to investigate the involved auditory mechanisms, an auditory model was employed that was conceptually similar to the peripheral weighting model [Yost, JASA, …]. Its filterbank was designed to approximate auditory filter-shapes measured by Oxenham and Shera [JARO, 2003, 541-554], derived from forward masking data. The results of the present study demonstrate that a “purely” spectrum-based model approach can successfully describe auditory coloration detection even at high...

  15. Microbiological Analysis of Rice Cake Processing in Korea.

    Science.gov (United States)

    Wang, Jun; Park, Joong-Hyun; Choi, Na-Jung; Ha, Sang-Do; Oh, Deog-Hwan

    2016-01-01

    This study was conducted to evaluate the microbial contamination in rice cake materials and products during processing and in the operation environment in nonhazard analysis [and] critical control point factories. Furthermore, the environmental health of the processing facilities and the bacterial and fungal contamination on the workers' hands were investigated. Pour plate methods were used for enumeration of aerobic plate count (APC), yeast and molds (YM), Bacillus cereus, Staphylococcus aureus, and Clostridium perfringens, whereas Petrifilm count plates were used for enumeration of coliforms and Escherichia coli. The respective microbial levels of APC, coliforms, YM, and B. cereus were in the range of 2.6 to 4.7, 1.0 to 3.8, not detected (ND) to 2.9, and ND to 2.8 log CFU/g in the raw materials and in the range of 2.3 to 6.2, ND to 3.6, ND to 2.7, and ND to 3.7 log CFU/g during processing of the rice cake products. During the processing of rice cakes, APC, coliforms, YM, and B. cereus increased during soaking and smashing treatments and decreased after steaming treatment. E. coli, S. aureus, and C. perfringens were not detected in any of the raw materials and operating areas or during processing. B. cereus was detected on the operators' hands at microbial contamination levels of 1.9 ± 0.19 to 2.0 ± 0.19 log CFU/g. The results showed that B. cereus in the end product is presumably the main concern for rice cakes. In addition, the high contamination level of B. cereus during manufacturing processes, including soaking, smashing, and molding, and the absence of B. cereus from the air sampling plates indicated that the contaminated equipment showed the potential risk to cause cross-contamination. PMID:26735044

  16. Preliminary Hazards Analysis Plasma Hearth Process

    Energy Technology Data Exchange (ETDEWEB)

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1993-11-01

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standards DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. The PSAR in turn leads to the Final Safety Analysis Report, performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment.

  17. Preliminary Hazards Analysis Plasma Hearth Process

    International Nuclear Information System (INIS)

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standards DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during, and provides input to, project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. The PSAR in turn leads to the Final Safety Analysis Report, performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment.

  18. Latency and mode of error detection in a process industry

    International Nuclear Information System (INIS)

    Licensee event reports (LERs) from an industry provide important feedback about safety to the industry itself, the regulators, and the public. LERs from four nuclear power reactors were analyzed to find out about detection times, modes of detection, and qualitative differences in reports from different reactors. The reliability of the codings was satisfactory, measured as the covariance between the ratings from two independent judges. The results showed differences in detection time across the reactors. On average, about 10% of the errors remained undetected for 100 weeks or more, but the great majority of errors were detected soon after their first appearance in the plant. On average, 40% of the errors were detected in regular tests and 40% through alarms. Operators found about 16% of the errors by noticing something abnormal in the plant. The remaining errors were detected in other ways. There were qualitative differences between the LERs from the different reactors, reflecting the different conditions in the plants. The number of reports differed by a factor of two between the plants. However, a greater number of LERs can indicate both higher safety standards (e.g., a greater willingness to report all possible events in order to learn from them) and lower safety standards (e.g., reporting as few events as possible to make a good impression). It was pointed out that LERs are indispensable for maintaining the safety of an industry and that the differences between plants found in the analyses of this study indicate how error reports can be used to initiate further investigations for improved safety.

  19. Identifying time measurement tampering in the traversal time and hop count analysis (TTHCA) wormhole detection algorithm.

    Science.gov (United States)

    Karlsson, Jonny; Dooley, Laurence S; Pulkkis, Göran

    2013-01-01

    Traversal time and hop count analysis (TTHCA) is a recent wormhole detection algorithm for mobile ad hoc networks (MANET) which provides enhanced detection performance against all wormhole attack variants and network types. TTHCA involves each node measuring the processing time of routing packets during the route discovery process and then delivering the measurements to the source node. In a participation mode (PM) wormhole where malicious nodes appear in the routing tables as legitimate nodes, the time measurements can potentially be altered so preventing TTHCA from successfully detecting the wormhole. This paper analyses the prevailing conditions for time tampering attacks to succeed for PM wormholes, before introducing an extension to the TTHCA detection algorithm called ∆T Vector which is designed to identify time tampering, while preserving low false positive rates. Simulation results confirm that the ∆T Vector extension is able to effectively detect time tampering attacks, thereby providing an important security enhancement to the TTHCA algorithm. PMID:23686143
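    A schematic of the quantity TTHCA works with (all timing values and the threshold are invented for illustration; the paper's ∆T Vector computation is not reproduced): subtract the processing times reported by intermediate nodes from the round-trip time and normalize by the hop count. An unreported wormhole tunnel inflates the per-hop traversal time.

    ```python
    def per_hop_traversal_time(rtt, processing_times, hop_count):
        """TTHCA-style metric: average per-hop traversal time after removing
        the processing delays reported by intermediate nodes."""
        return (rtt - sum(processing_times)) / hop_count

    # toy RREQ/RREP route: times in milliseconds
    normal = per_hop_traversal_time(rtt=2.4, processing_times=[0.5, 0.5, 0.5],
                                    hop_count=4)
    # a hidden wormhole tunnel adds latency that no node reports
    wormhole = per_hop_traversal_time(rtt=9.0, processing_times=[0.5, 0.5, 0.5],
                                      hop_count=4)

    THRESHOLD = 0.5  # ms per hop, calibrated for the network (assumed here)
    for name, t in [("normal route", normal), ("suspect route", wormhole)]:
        print(f"{name}: {t:.2f} ms/hop -> {'ALARM' if t > THRESHOLD else 'ok'}")
    ```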

  20. Fault detection and isolation in processes involving induction machines

    Energy Technology Data Exchange (ETDEWEB)

    Zell, K.; Medvedev, A. [Control Engineering Group, Luleaa University of Technology, Luleaa (Sweden)

    1997-12-31

    A model-based technique for fault detection and isolation in electro-mechanical systems comprising induction machines is introduced. Two coupled state observers, one for the induction machine and another for the mechanical load, are used to detect and recognize fault-specific behaviors (fault signatures) from the real-time measurements of the rotor angular velocity and terminal voltages and currents. Practical applicability of the method is verified in full-scale experiments with a conveyor belt drive at SSAB, Luleaa Works. (orig.) 3 refs.

  1. Speckle Tracking Based Strain Analysis Is Sensitive for Early Detection of Pathological Cardiac Hypertrophy

    OpenAIRE

    Xiangbo An; Jingjing Wang; Hao Li; Zhizhen Lu; Yan Bai; Han Xiao; Youyi Zhang; Yao Song

    2016-01-01

    Cardiac hypertrophy is a key pathological process in many cardiac diseases. However, early detection of cardiac hypertrophy is difficult with currently used non-invasive methods, and new approaches are urgently needed for efficient diagnosis of cardiac malfunction. Here we report that speckle tracking-based strain analysis is more sensitive than conventional echocardiography for early detection of pathological cardiac hypertrophy in the isoproterenol (ISO) mouse model. Pathological hypertrophy...

  2. A Novel Approach to Detect Malware Based on API Call Sequence Analysis

    OpenAIRE

    Youngjoon Ki; Eunjin Kim; Huy Kang Kim

    2015-01-01

    In the era of ubiquitous sensors and smart devices, detecting malware is becoming an endless battle between ever-evolving malware and antivirus programs that need to process ever-increasing security related data. For malware detection, various approaches have been proposed. Among them, dynamic analysis is known to be effective in terms of providing behavioral information. As malware authors increasingly use obfuscation techniques, it becomes more important to monitor how malware behaves for i...
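    The record's method details are truncated. As a generic illustration of API-call-sequence analysis (the API names, the n-gram length, and the Jaccard scoring are illustrative choices, not the authors' design), the sketch below compares n-gram profiles of dynamic-analysis traces:

    ```python
    from collections import Counter

    def api_ngrams(calls, n=3):
        """Extract n-grams from a dynamic-analysis API call trace."""
        return Counter(tuple(calls[i:i + n]) for i in range(len(calls) - n + 1))

    def jaccard(a, b):
        """Similarity between two n-gram sets (one simple scoring choice)."""
        sa, sb = set(a), set(b)
        return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

    known_malware = ["OpenProcess", "VirtualAllocEx", "WriteProcessMemory",
                     "CreateRemoteThread", "CloseHandle"]
    sample = ["OpenProcess", "VirtualAllocEx", "WriteProcessMemory",
              "CreateRemoteThread", "Sleep", "CloseHandle"]
    benign = ["CreateFile", "ReadFile", "CloseHandle", "ExitProcess"]

    sig = api_ngrams(known_malware)
    print("sample vs malware:", round(jaccard(api_ngrams(sample), sig), 2))
    print("benign vs malware:", round(jaccard(api_ngrams(benign), sig), 2))
    ```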

  3. Physical Meaning of the Optimum Measurement Process in Quantum Detection Theory

    Science.gov (United States)

    Osaki, Masao; Kozuka, Haruhisa; Hirota, Osamu

    1996-01-01

    The optimum measurement processes are represented as the optimum detection operators in quantum detection theory. The error probability achieved by the optimum detection operators automatically goes beyond the standard quantum limit. However, the optimum detection operators are given only as mathematical descriptions. In order to realize a communication system overcoming the standard quantum limit, we try to give physical meaning to the optimum detection operators.

  4. POST-PROCESSING ANALYSIS FOR THC SEEPAGE

    International Nuclear Information System (INIS)

    This report describes the selection of water compositions for the total system performance assessment (TSPA) model of results from the thermal-hydrological-chemical (THC) seepage model documented in ''Drift-Scale THC Seepage Model'' (BSC 2004 [DIRS 169856]). The selection has been conducted in accordance with ''Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Post-Processing Analysis for THC Seepage) Report Integration'' (BSC 2004 [DIRS 171334]). This technical work plan (TWP) was prepared in accordance with AP-2.27Q, ''Planning for Science Activities''. Section 1.2.3 of the TWP describes planning information pertaining to the technical scope, content, and management of this report. The post-processing analysis for THC seepage (THC-PPA) documented in this report provides a methodology for evaluating the near-field compositions of water and gas around a typical waste emplacement drift as these relate to the chemistry of seepage, if any, into the drift. The THC-PPA inherits the conceptual basis of the THC seepage model, but is an independently developed process. The relationship between the post-processing analysis and other closely related models, together with their main functions in providing seepage chemistry information for the Total System Performance Assessment for the License Application (TSPA-LA), are illustrated in Figure 1-1. The THC-PPA provides a data selection concept and direct input to the physical and chemical environment (P and CE) report that supports the TSPA model. The purpose of the THC-PPA is further discussed in Section 1.2. The data selection methodology of the post-processing analysis (Section 6.2.1) was initially applied to results of the THC seepage model as presented in ''Drift-Scale THC Seepage Model'' (BSC 2004 [DIRS 169856]). Other outputs from the THC seepage model (DTN: LB0302DSCPTHCS.002 [DIRS 161976]) used in the P and CE (BSC 2004 [DIRS 169860

  5. Leak detection in pipelines through spectral analysis of pressure signals

    OpenAIRE

    Souza A.L.; Cruz S.L.; Pereira J.F.R.

    2000-01-01

    The development and test of a technique for leak detection in pipelines is presented. The technique is based on the spectral analysis of pressure signals measured in pipeline sections where the formation of stationary waves is favoured, allowing leakage detection during the start/stop of pumps. Experimental tests were performed in a 1250 m long pipeline for various operational conditions of the pipeline (liquid flow rate and leakage configuration). Pressure transients were obtained by four tr...

  6. RFI detection by automated feature extraction and statistical analysis

    OpenAIRE

    Winkel, Benjamin; Kerp, Juergen; Stanko, Stephan

    2006-01-01

    In this paper we present an interference detection toolbox consisting of a high dynamic range Digital Fast-Fourier-Transform spectrometer (DFFT, based on FPGA-technology) and data analysis software for automated radio frequency interference (RFI) detection. The DFFT spectrometer allows high speed data storage of spectra on time scales of less than a second. The high dynamic range of the device assures constant calibration even during extremely powerful RFI events. The software uses an algorit...

  7. A Framework for Occupational Fraud Detection by Social Network Analysis

    OpenAIRE

    Lookman, Sanni; Nurcan, Selmin

    2015-01-01

    This paper explores issues related to occupational fraud detection. We observe over the past years, a broad use of network research across social and physical sciences including but not limited to social sharing and filtering, recommendation systems, marketing and customer intelligence, counter intelligence and law enforcement. However, the rate of social network analysis adoption in organizations by control professionals or even by academics for insider fraud detection purpose is still very ...

  8. Iterated Function System Models in Data Analysis: Detection and Separation

    CERN Document Server

    Alexander, Zachary; Garland, Joshua; Meiss, James D

    2011-01-01

    We investigate the use of iterated function system (IFS) models for data analysis. An IFS is a collection of dynamical systems that switches between deterministic regimes. An algorithm is developed to detect the regime switches under the assumption of continuity. This method is tested on a simple IFS and applied to an experimental computer performance data set. This methodology has a wide range of potential uses: from change-point detection in time-series data, to the field of digital communications.

  9. Continuous Fraud Detection in Enterprise Systems through Audit Trail Analysis

    OpenAIRE

    Peter J. Best; Pall Rikhardsson; Mark Toleman

    2009-01-01

    Enterprise systems, real time recording and real time reporting pose new and significant challenges to the accounting and auditing professions. This includes developing methods and tools for continuous assurance and fraud detection. In this paper we propose a methodology for continuous fraud detection that exploits security audit logs, changes in master records and accounting audit trails in enterprise systems. The steps in this process are: (1) threat monitoring-surveillance of security audi...

  10. Detection of Porphyromonas gingivalis from Saliva by PCR by Using a Simple Sample-Processing Method

    OpenAIRE

    Mättö, Jaana; Saarela, Maria; Alaluusua, Satu; Oja, Virva; Jousimies-Somer, Hannele; Asikainen, Sirkka

    1998-01-01

    Simple sample-processing methods for PCR detection of Porphyromonas gingivalis, a major pathogen causing adult periodontitis, from saliva were studied. The ability to detect P. gingivalis from 118 salivary samples by PCR after boiling and Chelex 100 processing was compared with bacterial culture. P. gingivalis was detected three times more often by PCR than by culture. Chelex 100 processing of saliva proved to be effective in preventing PCR inhibition and was applied to determine the occurren...

  11. Protecting Student Intellectual Property in Plagiarism Detection Process

    Science.gov (United States)

    Butakov, Sergey; Barber, Craig

    2012-01-01

    The rapid development of the Internet along with increasing computer literacy has made it easy and tempting for digital natives to copy-paste someone's work. Plagiarism is now a burning issue in education, industry and even in the research community. In this study, the authors concentrate on plagiarism detection with particular focus on the…

  12. Protecting Students' Intellectual Property in the Web Plagiarism Detection Process

    Science.gov (United States)

    Butakov, Sergey; Dyagilev, Vadim; Tskhay, Alexander

    2012-01-01

    Learning management systems (LMS) play a central role in communications in online and distance education. In the digital era, with all the information now accessible at students' fingertips, plagiarism detection services (PDS) have become a must-have part of LMS. Such integration provides a seamless experience for users, allowing PDS to check…

  13. Mathematical Analysis and Optimization of Infiltration Processes

    Science.gov (United States)

    Chang, H.-C.; Gottlieb, D.; Marion, M.; Sheldon, B. W.

    1997-01-01

    A variety of infiltration techniques can be used to fabricate solid materials, particularly composites. In general these processes can be described with at least one time dependent partial differential equation describing the evolution of the solid phase, coupled to one or more partial differential equations describing mass transport through a porous structure. This paper presents a detailed mathematical analysis of a relatively simple set of equations which is used to describe chemical vapor infiltration. The results demonstrate that the process is controlled by only two parameters, alpha and beta. The optimization problem associated with minimizing the infiltration time is also considered. Allowing alpha and beta to vary with time leads to significant reductions in the infiltration time, compared with the conventional case where alpha and beta are treated as constants.

  14. Point processes in forestry : an application to tree crown detection

    OpenAIRE

    Perrin, Guillaume; Descombes, Xavier; Zerubia, Josiane

    2006-01-01

    In this research report, we aim at extracting tree crowns from remotely sensed images using marked point processes of discs and ellipses. Our approach is indeed to consider that the data are some realizations of a marked point process. Once a geometrical object is defined, we sample a marked point process defined by a density with a Reversible Jump Markov Chain Monte Carlo dynamics and simulated annealing to get the maximum a posteriori estimator of the tree crown distribution on the image. I...

  15. Mathematical foundations of image processing and analysis

    CERN Document Server

    Pinoli, Jean-Charles

    2014-01-01

    Mathematical Imaging is currently a rapidly growing field in applied mathematics, with an increasing need for theoretical mathematics. This book, the second of two volumes, emphasizes the role of mathematics as a rigorous basis for imaging sciences. It provides a comprehensive and convenient overview of the key mathematical concepts, notions, tools and frameworks involved in the various fields of gray-tone and binary image processing and analysis, by proposing a large, but coherent, set of symbols and notations, a complete list of subjects and a detailed bibliography. It establishes a bridg

  16. ECG Signal Analysis and Arrhythmia Detection using Wavelet Transform

    Science.gov (United States)

    Kaur, Inderbir; Rajni, Rajni; Marwaha, Anupma

    2016-06-01

    Electrocardiogram (ECG) is used to record the electrical activity of the heart. The ECG signal, being non-stationary in nature, makes the analysis and interpretation of the signal very difficult. Hence accurate analysis of the ECG signal with a powerful tool like the discrete wavelet transform (DWT) becomes imperative. In this paper, the ECG signal is denoised to remove the artifacts and analyzed using the wavelet transform to detect the QRS complex and arrhythmia. This work is implemented in MATLAB software for the MIT/BIH Arrhythmia database and yields a sensitivity of 99.85%, a positive predictivity of 99.92% and a detection error rate of 0.221% with the wavelet transform. It is also inferred that DWT outperforms the principal component analysis technique in detection of the ECG signal.
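
    As an illustration of DWT-based QRS detection of the kind described above, the following minimal Python sketch decomposes the signal with a Daubechies wavelet, discards the approximation (baseline wander) and the finest detail (noise), and peak-picks the reconstructed QRS band. The wavelet choice ('db4'), decomposition depth, threshold factor and the synthetic test signal are illustrative assumptions, not the authors' exact settings.

        import numpy as np
        import pywt
        from scipy.signal import find_peaks

        def detect_qrs(ecg, fs=360, level=4):
            # Decompose; QRS energy concentrates in the mid-scale detail coefficients.
            coeffs = pywt.wavedec(ecg, 'db4', level=level)
            coeffs[0] = np.zeros_like(coeffs[0])    # drop approximation: baseline wander
            coeffs[-1] = np.zeros_like(coeffs[-1])  # drop finest detail: high-freq noise
            qrs_band = pywt.waverec(coeffs, 'db4')[:len(ecg)]
            energy = qrs_band ** 2
            # Peaks must clear an adaptive threshold and sit at least 200 ms apart.
            peaks, _ = find_peaks(energy, height=4 * energy.mean(), distance=int(0.2 * fs))
            return peaks

        # Two synthetic R-like spikes on a noisy baseline; expect beats near 0.5 s and 1.5 s.
        fs = 360                                    # Hz, as in the MIT/BIH database
        t = np.arange(0, 2, 1 / fs)
        ecg = 0.1 * np.random.randn(len(t))
        for beat in (0.5, 1.5):
            ecg += np.exp(-((t - beat) ** 2) / (2 * 0.01 ** 2))
        print(detect_qrs(ecg, fs) / fs)             # detected beat times in seconds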

  17. Specific capture of the hydrolysate on magnetic beads for sensitive detecting plant vacuolar processing enzyme activity.

    Science.gov (United States)

    Zhou, Jun; Cheng, Meng; Zeng, Lizhang; Liu, Weipeng; Zhang, Tao; Xing, Da

    2016-05-15

    Conventional plant protease detection always suffers from high background interference caused by the complex coloring metabolites in plant cells. In this study, a bio-modified magnetic beads-based strategy was developed for sensitive and quantitative detection of plant vacuolar processing enzyme (VPE) activity. Cleavage of the peptide substrate (ESENCRK-FITC) after the asparagine residue by VPE resulted in capture of the severed substrate CRK-FITC by the 2-cyano-6-amino-benzothiazole (CABT)-functionalized magnetic beads via a condensation reaction between CABT and cysteine (Cys). The catalytic activity was subsequently obtained by confocal microscopy imaging and flow cytometry quantitative analysis. The sensor system integrated the advantages of (i) the highly efficient enrichment and separation capabilities of magnetic beads and (ii) the catalyst-free properties of the CABT-Cys condensation reaction. It exhibited a linear relationship between the fluorescence signal and the concentration of severed substrate in the range of 10-600 pM. The practical results showed that, compared with normal growth conditions, VPE activity was increased by 2.7-fold (307.2 ± 25.3 μM min(-1)g(-1)) upon cadmium toxicity stress. This platform effectively overcame the background interference caused by coloring metabolites, showing fine applicability for the detection of VPE activity in real samples. The strategy offers great sensitivity and may be further extended to the detection of other protease activities. PMID:26797250

  18. SCALABLE TIME SERIES CHANGE DETECTION FOR BIOMASS MONITORING USING GAUSSIAN PROCESS

    Data.gov (United States)

    National Aeronautics and Space Administration — SCALABLE TIME SERIES CHANGE DETECTION FOR BIOMASS MONITORING USING GAUSSIAN PROCESS VARUN CHANDOLA AND RANGA RAJU VATSAVAI Abstract. Biomass monitoring,...

  19. Scalable time series change detection for biomass monitoring using gaussian process

    Energy Technology Data Exchange (ETDEWEB)

    Chandola, Varun [ORNL; Vatsavai, Raju [ORNL

    2010-01-01

    Biomass monitoring, specifically detecting changes in the biomass or vegetation of a geographical region, is vital for studying the carbon cycle of the system and has significant implications in the context of understanding climate change and its impacts. Recently, several time series change detection methods have been proposed to identify land cover changes in temporal profiles (time series) of vegetation collected using remote sensing instruments. In this paper, we adapt Gaussian process regression to detect changes in such time series in an online fashion. While the Gaussian process (GP) has been widely used as a kernel-based learning method for regression and classification, its applicability to massive spatio-temporal data sets, such as remote sensing data, has been limited owing to the high computational costs involved. In this paper we address the scalability aspect of GP-based time series change detection. Specifically, we exploit the special structure of the covariance matrix generated for GP analysis to come up with methods that can efficiently estimate the hyper-parameters associated with the GP as well as identify changes in the time series, while requiring a memory footprint that is linear in the size of the input data, as compared to the traditional method, which involves solving a linear system of equations for the Cholesky decomposition of the quadratic-sized covariance matrix. Experimental results show that our proposed method achieves significant speedups, as high as 1000, when processing long time series, while maintaining a small memory footprint. To further improve the computational complexity of the proposed method, we provide a parallel version which can concurrently process multiple input time series using the same set of hyper-parameters. The parallel version exploits the natural parallelization potential of the serial algorithm and is shown to perform significantly better than the serial version, with speedups as high as 10. Finally, we demonstrate the
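
    The core idea can be conveyed in a minimal, non-scalable Python sketch: a GP regression over a sliding window predicts each new observation, and a large standardized residual signals a change point. The RBF kernel, window length and noise level below are illustrative assumptions; the paper's covariance-structure tricks for linear memory and the parallel version are not reproduced.

        import numpy as np

        def rbf(a, b, length=3.0, var=1.0):
            d = a.reshape(-1, 1) - b.reshape(1, -1)
            return var * np.exp(-0.5 * (d / length) ** 2)

        def gp_change_scores(y, window=24, noise=0.1):
            scores = np.zeros(len(y))
            x = np.arange(window, dtype=float)
            x_star = np.array([float(window)])           # one-step-ahead input
            K = rbf(x, x) + noise * np.eye(window)
            L = np.linalg.cholesky(K)                    # small, fixed-size factorization
            k_star = rbf(x_star, x).ravel()
            v = np.linalg.solve(L, k_star)
            var = rbf(x_star, x_star)[0, 0] - v @ v + noise   # predictive variance
            for t in range(window, len(y)):
                alpha = np.linalg.solve(L.T, np.linalg.solve(L, y[t - window:t]))
                mu = k_star @ alpha                      # predictive mean
                scores[t] = abs(y[t] - mu) / np.sqrt(var)
            return scores

        # A seasonal series with an abrupt level shift at t = 120.
        rng = np.random.default_rng(0)
        t = np.arange(240)
        y = np.sin(2 * np.pi * t / 24) + 0.05 * rng.standard_normal(240)
        y[120:] += 2.0
        print("change detected near t =", np.argmax(gp_change_scores(y)))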

  20. Fast Enzymatic Processing of Proteins for MS Detection with a Flow-through Microreactor.

    Science.gov (United States)

    Lazar, Iulia M; Deng, Jingren; Smith, Nicole

    2016-01-01

    The vast majority of mass spectrometry (MS)-based protein analysis methods involve an enzymatic digestion step prior to detection, typically with trypsin. This step is necessary for the generation of small molecular weight peptides, generally with MW microreactor with immobilized enzymes or of a range of complementary physical processes that reduce the time necessary for proteolytic digestion to a few minutes (e.g., microwave or high-pressure). In this work, we describe a simple and cost-effective approach that can be implemented in any laboratory for achieving fast enzymatic digestion of a protein. The protein (or protein mixture) is adsorbed on C18-bonded reversed-phase high performance liquid chromatography (HPLC) silica particles preloaded in a capillary column, and trypsin in aqueous buffer is infused over the particles for a short period of time. To enable on-line MS detection, the tryptic peptides are eluted with a solvent system with increased organic content directly in the MS ion source. This approach avoids the use of high-priced immobilized enzyme particles and does not necessitate any aid for completing the process. Protein digestion and complete sample analysis can be accomplished in less than ~3 min and ~30 min, respectively. PMID:27078683

  1. Multicriteria Similarity-Based Anomaly Detection Using Pareto Depth Analysis.

    Science.gov (United States)

    Hsiao, Ko-Jen; Xu, Kevin S; Calder, Jeff; Hero, Alfred O

    2016-06-01

    We consider the problem of identifying patterns in a data set that exhibit anomalous behavior, often referred to as anomaly detection. Similarity-based anomaly detection algorithms detect abnormally large amounts of similarity or dissimilarity, e.g., as measured by the nearest neighbor Euclidean distances between a test sample and the training samples. In many application domains, there may not exist a single dissimilarity measure that captures all possible anomalous patterns. In such cases, multiple dissimilarity measures can be defined, including nonmetric measures, and one can test for anomalies by scalarizing using a nonnegative linear combination of them. If the relative importance of the different dissimilarity measures is not known in advance, as in many anomaly detection applications, the anomaly detection algorithm may need to be executed multiple times with different choices of weights in the linear combination. In this paper, we propose a method for similarity-based anomaly detection using a novel multicriteria dissimilarity measure, the Pareto depth. The proposed Pareto depth analysis (PDA) anomaly detection algorithm uses the concept of Pareto optimality to detect anomalies under multiple criteria without having to run an algorithm multiple times with different choices of weights. The proposed PDA approach is provably better than using linear combinations of the criteria, and shows superior performance in experiments with synthetic and real data sets. PMID:26336154
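
    A minimal sketch of the Pareto-front machinery, under strong simplifying assumptions: training dyads (vectors of pairwise dissimilarities under two criteria) are sorted into successive Pareto fronts, and a test sample is scored by the mean front index (depth) of its dyads, approximated here by the nearest training dyads. The two criteria (per-feature distances), the toy data and the scoring shortcut are illustrative, not the authors' exact PDA algorithm.

        import numpy as np

        def pareto_fronts(points):
            """Depth of each point = index of its Pareto front (1 = non-dominated)."""
            pts = np.asarray(points, dtype=float)
            depth = np.zeros(len(pts), dtype=int)
            remaining = np.arange(len(pts))
            level = 0
            while remaining.size:
                level += 1
                sub = pts[remaining]
                # A point is dominated if some other point is <= in every criterion
                # and strictly < in at least one.
                dominated = np.array([np.any(np.all(sub <= p, axis=1) &
                                             np.any(sub < p, axis=1)) for p in sub])
                depth[remaining[~dominated]] = level
                remaining = remaining[dominated]
            return depth

        def pda_score(test, train, train_dyads, depth_of):
            dyads = np.abs(train - test)          # test dyads: per-feature distances
            # Approximate each test dyad's depth by its nearest training dyad's depth.
            idx = np.argmin(((train_dyads[None] - dyads[:, None]) ** 2).sum(-1), axis=1)
            return depth_of[idx].mean()           # higher mean depth = more anomalous

        rng = np.random.default_rng(0)
        train = rng.normal(size=(40, 2))
        i, j = np.triu_indices(len(train), k=1)
        train_dyads = np.abs(train[i] - train[j])  # all training dyads
        depth_of = pareto_fronts(train_dyads)
        print(pda_score(np.array([0.1, 0.0]), train, train_dyads, depth_of))  # nominal: low
        print(pda_score(np.array([4.0, 4.0]), train, train_dyads, depth_of))  # anomaly: high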

  2. Deterring digital plagiarism, how effective is the digital detection process?

    OpenAIRE

    Jayati Chaudhuri

    2008-01-01

    Academic dishonesty or plagiarism is a growing problem in today's digital world. Use of plagiarism detection tools can assist faculty to combat this form of academic dishonesty. In this article, special emphasis is given to the text-matching software called SafeAssignmentTM. The advantages and disadvantages of using automated text-matching software are discussed and analyzed in detail.

  3. Affine-Detection Loophole in Quantum Data Processing

    OpenAIRE

    Vlasov, Alexander Yu.

    2002-01-01

    Here we consider a specific detection loophole that is relevant not only to tests of quantum nonlocality, but also to some other applications of quantum computation and communication. It is described by a simple affine relation between different quantum "data structures" such as pure and mixed states, separable and inseparable ones. It is also shown that, due to such relations, an imperfect device in a classical model may mimic measurements of quantum correlations on ideal equipment.

  4. Signal processing of Shiley heart valve data for fracture detection

    Energy Technology Data Exchange (ETDEWEB)

    Mullenhoff, C.

    1993-09-01

    Given digital acoustic data recorded from the heart sounds of laboratory sheep with implanted Bjoerk-Shiley Convexo-Concave heart valves, it is possible to detect and extract the opening and closing heartbeats from the data. Once extracted, spectral or other information can then be obtained from the heartbeats and passed on to feature extraction algorithms, neural networks, or pattern recognizers, so that the valve condition, either fractured or intact, may be determined.

  5. Information Design for “Weak Signal” detection and processing in Economic Intelligence: A case study on Health resources

    Directory of Open Access Journals (Sweden)

    Sahbi Sidhom

    2011-12-01

    Full Text Available The topics of this research cover all phases of “Information Design” applied to detect and profit from weak signals in economic intelligence (EI) or business intelligence (BI). The field of information design (ID) applies to the process of translating complex, unorganized or unstructured data into valuable and meaningful information. ID practice requires an interdisciplinary approach, which combines skills in graphic design (writing, analysis processing and editing), human performance technology and human factors. Applied in the context of an information system, it allows end-users to easily detect implicit topics known as “weak signals” (WS). In our approach to implementing the ID, the processes cover the development of a knowledge management (KM) process in the context of EI. A case study concerning the information monitoring of health resources is presented, using ID processes to outline weak signals. Both French and American bibliographic databases were used to make the connection to multilingual concepts in the health watch process.

  6. Novel Flood Detection and Analysis Method Using Recurrence Property

    Science.gov (United States)

    Wendi, Dadiyorto; Merz, Bruno; Marwan, Norbert

    2016-04-01

    Temporal changes in flood hazard are known to be difficult to detect and attribute due to multiple drivers that include processes that are non-stationary and highly variable. These drivers, such as human-induced climate change, natural climate variability, implementation of flood defences, river training, or land use change, could act variably on different space-time scales and influence or mask each other. Flood time series may show complex behavior that varies over a range of time scales and may cluster in time. This study focuses on the application of recurrence-based data analysis techniques (recurrence plots) for understanding and quantifying spatio-temporal changes in flood hazard in Germany. The recurrence plot is known as an effective tool to visualize the dynamics of phase space trajectories, i.e. trajectories reconstructed from a time series by using an embedding dimension and a time delay, and it is known to be effective in analyzing non-stationary and non-linear time series. The emphasis will be on the identification of characteristic recurrence properties that could associate typical dynamic behavior with certain flood situations.
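
    A recurrence plot itself is simple to compute: delay-embed the series, threshold the pairwise distances between embedded states, and inspect the resulting binary matrix. The sketch below assumes illustrative values for the embedding dimension, time delay and recurrence threshold rather than values tuned to flood series.

        import numpy as np

        def recurrence_plot(x, dim=3, delay=2, eps=None):
            n = len(x) - (dim - 1) * delay
            # Delay embedding: row t is (x[t], x[t+delay], ..., x[t+(dim-1)*delay]).
            emb = np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])
            dist = np.linalg.norm(emb[:, None] - emb[None, :], axis=-1)
            if eps is None:
                eps = 0.1 * dist.max()        # heuristic: 10% of phase-space diameter
            return (dist <= eps).astype(int)  # R[i, j] = 1 where the trajectory recurs

        # A noisy sine: deterministic dynamics show up as diagonal line structures;
        # the recurrence rate is the simplest summary statistic.
        t = np.linspace(0, 8 * np.pi, 400)
        R = recurrence_plot(np.sin(t) + 0.05 * np.random.randn(400))
        print("recurrence rate:", round(R.mean(), 3))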

  7. Next Generation Detection Systems for Radioactive Material Analysis

    Science.gov (United States)

    Britton, R.; Regan, P. H.; Burnett, J. L.; Davies, A. V.

    2014-05-01

    Compton suppression techniques have been widely used to reduce the Minimum Detectable Activity of various radionuclides when performing gamma spectroscopy of environmental samples. This is achieved by utilising multiple detectors to reduce the contribution of photons that Compton scatter out of the detector crystal, only partially depositing their energy. Photons that are Compton scattered out of the primary detector are captured by a surrounding detector, and the corresponding events are vetoed from the final dataset using coincidence-based fast-timing electronics. The current work presents the use of a LynxTM data acquisition module from Canberra Industries (USA) to collect data in 'List Mode', where each event is time-stamped for offline analysis. A post-processor developed to analyse such datasets allows optimisation of the coincidence delay, and then identifies and suppresses events within this time window. This is the same process used in conventional systems with fast-timing electronics; however, in the work presented, data can be re-analysed using multiple time and energy windows. All data are also preserved and recorded (in traditional systems, coincident events are lost as they are vetoed in real time), and the results are achieved with a greatly simplified experimental setup. Monte Carlo simulations of Compton suppression systems have been completed to support the optimisation work, and are also presented here.
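
    The offline veto logic amounts to discarding primary-detector events that have a guard-detector event within the coincidence window. A minimal Python sketch of that post-processing step follows; the event format, window width and synthetic event streams are illustrative assumptions, not the LynxTM list-mode format.

        import numpy as np

        def suppress(primary_t, primary_e, guard_t, window_ns=500.0):
            """Keep primary events with no guard event within +/- window_ns."""
            guard_t = np.sort(guard_t)
            idx = np.searchsorted(guard_t, primary_t)
            left = np.abs(primary_t - guard_t[np.clip(idx - 1, 0, len(guard_t) - 1)])
            right = np.abs(guard_t[np.clip(idx, 0, len(guard_t) - 1)] - primary_t)
            vetoed = np.minimum(left, right) <= window_ns
            return primary_e[~vetoed]

        # Synthetic list-mode streams: 30% of primary events have a correlated guard hit.
        rng = np.random.default_rng(1)
        pt = np.sort(rng.uniform(0, 1e9, 10000))   # primary timestamps (ns)
        pe = rng.uniform(0, 3000, 10000)           # primary energies (keV)
        scattered = rng.random(10000) < 0.3
        gt = pt[scattered] + rng.normal(0, 50, scattered.sum())
        # Because every event is preserved, the same data can be re-analysed with
        # different coincidence windows to optimise the suppression.
        for w in (100.0, 500.0, 1000.0):
            print(int(w), "ns window keeps", len(suppress(pt, pe, gt, w)), "events")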

  8. People detection in nuclear plants by video processing for safety purpose

    International Nuclear Information System (INIS)

    This work describes the development of a surveillance system for safety purposes in nuclear plants. The final objective is to track people online in videos, in order to estimate the dose received by personnel, during the execution of working tasks in nuclear plants. The estimation will be based on their tracked positions and on dose rate mapping in a real nuclear plant at Instituto de Engenharia Nuclear, Argonauta nuclear research reactor. Cameras have been installed within Argonauta's room, supplying the data needed. Both video processing and statistical signal processing techniques may be used for detection, segmentation and tracking people in video. This first paper reports people segmentation in video using background subtraction, by two different approaches, namely frame differences, and blind signal separation based on the independent component analysis method. Results are commented, along with perspectives for further work. (author)

  9. People detection in nuclear plants by video processing for safety purpose

    Energy Technology Data Exchange (ETDEWEB)

    Jorge, Carlos Alexandre F.; Mol, Antonio Carlos A., E-mail: calexandre@ien.gov.b, E-mail: mol@ien.gov.b [Instituto de Engenharia Nuclear (IEN/CNEN), Rio de Janeiro, RJ (Brazil); Seixas, Jose M.; Silva, Eduardo Antonio B., E-mail: seixas@lps.ufrj.b, E-mail: eduardo@lps.ufrj.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Eletrica; Cota, Raphael E.; Ramos, Bruno L., E-mail: brunolange@poli.ufrj.b [Universidade Federal do Rio de Janeiro (EP/UFRJ), RJ (Brazil). Dept. de Engenharia Eletronica e de Computacao

    2011-07-01

    This work describes the development of a surveillance system for safety purposes in nuclear plants. The final objective is to track people online in videos, in order to estimate the dose received by personnel, during the execution of working tasks in nuclear plants. The estimation will be based on their tracked positions and on dose rate mapping in a real nuclear plant at Instituto de Engenharia Nuclear, Argonauta nuclear research reactor. Cameras have been installed within Argonauta's room, supplying the data needed. Both video processing and statistical signal processing techniques may be used for detection, segmentation and tracking people in video. This first paper reports people segmentation in video using background subtraction, by two different approaches, namely frame differences, and blind signal separation based on the independent component analysis method. Results are commented, along with perspectives for further work. (author)
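
    Of the two segmentation approaches mentioned, frame differencing against a background estimate is the simpler; a minimal Python sketch is given below, using a running-median background and a fixed threshold (both illustrative assumptions; the ICA-based variant is not shown).

        import numpy as np

        def foreground_masks(frames, buffer=10, thresh=25):
            """frames: (T, H, W) grayscale array; returns one boolean mask per frame."""
            frames = frames.astype(np.int16)
            masks = []
            for t in range(buffer, len(frames)):
                background = np.median(frames[t - buffer:t], axis=0)  # robust to movers
                masks.append(np.abs(frames[t] - background) > thresh)
            return masks

        # A static noisy scene with a bright block moving across it.
        rng = np.random.default_rng(0)
        scene = rng.integers(0, 50, (64, 64))
        frames = np.stack([scene] * 30)
        for t in range(30):
            frames[t, 20:40, t:t + 6] = 255           # the moving "person"
        masks = foreground_masks(frames)
        print(int(masks[5].sum()), "foreground pixels in frame 15")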

  10. Multivariate Analysis for the Processing of Signals

    Directory of Open Access Journals (Sweden)

    Beattie J.R.

    2014-01-01

    Full Text Available Real-world experiments are becoming increasingly more complex, needing techniques capable of tracking this complexity. Signal-based measurements are often used to capture this complexity, where a signal is a record of a sample’s response to a parameter (e.g. time, displacement, voltage, wavelength) that is varied over a range of values. In signals the responses at each value of the varied parameter are related to each other, depending on the composition or state of the sample being measured. Since signals contain multiple information points, they have rich information content but are generally complex to comprehend. Multivariate Analysis (MA) has profoundly transformed their analysis by allowing gross simplification of the tangled web of variation. In addition, MA has also provided the advantage of being much more robust to the influence of noise than univariate methods of analysis. In recent years, there has been a growing awareness that the nature of the multivariate methods allows exploitation of their benefits for purposes other than data analysis, such as pre-processing of signals with the aim of eliminating irrelevant variations prior to analysis of the signal of interest. It has been shown that exploiting multivariate data reduction in an appropriate way can allow high-fidelity denoising (removal of irreproducible non-signals), consistent and reproducible noise-insensitive correction of baseline distortions (removal of reproducible non-signals), accurate elimination of interfering signals (removal of reproducible but unwanted signals) and the standardisation of signal amplitude fluctuations. At present, the field is relatively small but the possibilities for much wider application are considerable. Where signal properties are suitable for MA (such as the signal being stationary along the x-axis), these signal-based corrections have the potential to be highly reproducible, and highly adaptable and are applicable in situations where the data is noisy or

  11. Partial wave analysis using graphics processing units

    International Nuclear Information System (INIS)

    Partial wave analysis is an important tool for determining resonance properties in hadron spectroscopy. For large data samples, however, the un-binned likelihood fits employed are computationally very expensive. At the Beijing Spectrometer (BES) III experiment, an increase in statistics compared to earlier experiments of up to two orders of magnitude is expected. In order to allow for a timely analysis of these datasets, additional computing power with short turnover times has to be made available. It turns out that graphics processing units (GPUs), originally developed for 3D computer games, have an architecture of massively parallel single-instruction multiple-data floating point units that is almost ideally suited for the algorithms employed in partial wave analysis. We have implemented a framework for tensor manipulation and partial wave fits called GPUPWA. The user writes a program in pure C++ whilst the GPUPWA classes handle computations on the GPU, memory transfers, caching and other technical details. In conjunction with a recent graphics processor, the framework provides a speed-up of the partial wave fit by more than two orders of magnitude compared to legacy FORTRAN code.

  12. Heart Beat Detection in Noisy ECG Signals Using Statistical Analysis of the Automatically Detected Annotations

    Directory of Open Access Journals (Sweden)

    Andrius Gudiškis

    2015-07-01

    Full Text Available This paper proposes an algorithm to reduce the influence of noise distortion on heartbeat annotation detection in electrocardiogram (ECG) signals. The boundary estimation module is based on an energy detector. Heartbeat detection is usually performed by QRS detectors that are able to find QRS regions in an ECG signal, which are a direct representation of a heartbeat. However, QRS detectors perform as intended only in cases where ECG signals have a high signal-to-noise ratio; when there is more noticeable signal distortion, detector accuracy decreases. The proposed algorithm uses additional data, taken from an arterial blood pressure signal recorded in parallel to the ECG signal, to support the QRS detection process in distorted signal areas. The proposed algorithm performs as well as classical QRS detectors in cases where the signal-to-noise ratio is high, compared to the heartbeat annotations provided by experts. In signals with considerably lower signal-to-noise ratio, the proposed algorithm improved the detection accuracy by up to 6%.

  13. Framework for hyperspectral image processing and quantification for cancer detection during animal tumor surgery

    Science.gov (United States)

    Lu, Guolan; Wang, Dongsheng; Qin, Xulei; Halig, Luma; Muller, Susan; Zhang, Hongzheng; Chen, Amy; Pogue, Brian W.; Chen, Zhuo Georgia; Fei, Baowei

    2015-12-01

    Hyperspectral imaging (HSI) is an imaging modality that holds strong potential for rapid cancer detection during image-guided surgery. However, the data from HSI often need to be processed appropriately in order to extract the maximum useful information that differentiates cancer from normal tissue. We proposed a framework for hyperspectral image processing and quantification, which comprises image preprocessing, glare removal, feature extraction, and ultimately image classification. The framework has been tested on images from mice with head and neck cancer, using spectra from 450- to 900-nm wavelength. The image analysis computed Fourier coefficients, normalized reflectance, mean, and spectral derivatives for improved accuracy. The experimental results demonstrated the feasibility of the hyperspectral image processing and quantification framework for cancer detection during animal tumor surgery, in a challenging setting where sensitivity can be low due to a modest number of features present, but the potential for fast image classification can be high. This HSI approach may have potential application in tumor margin assessment during image-guided surgery, where speed of assessment may be the dominant factor.
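
    Two of the features named above, normalized reflectance and spectral derivatives, can be computed per pixel directly from the hyperspectral cube, as in the minimal sketch below. The band count, normalization choice and random stand-in cube are illustrative assumptions; glare removal and the classifier are omitted.

        import numpy as np

        def spectral_features(cube):
            """cube: (H, W, B) reflectance over B wavelength bands -> per-pixel features."""
            flat = cube.reshape(-1, cube.shape[-1]).astype(float)
            norm = flat / (flat.sum(axis=1, keepdims=True) + 1e-12)  # normalized reflectance
            deriv = np.gradient(norm, axis=1)                        # first spectral derivative
            mean = norm.mean(axis=1, keepdims=True)
            return np.hstack([norm, deriv, mean])

        # A random stand-in cube: 450-900 nm sampled every 5 nm gives 91 bands.
        cube = np.random.rand(32, 32, 91)
        X = spectral_features(cube)              # feeds a standard classifier downstream
        print(X.shape)                           # (1024, 183)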

  14. Equipment to detect bad isolator in hard disk drive testing process

    Directory of Open Access Journals (Sweden)

    Winai Tumthong

    2014-09-01

    Full Text Available With the increasing data capacity of hard disk drives (HDD) today, the testing time of a hard disk drive becomes longer, taking approximately 25-30 hours. Vibrations during the HDD testing process significantly contribute to testing errors. An isolator and pocket slot were used to reduce the vibration. When the physical properties of the isolators change, the results of HDD testing are affected; because of the resulting errors, the testing process needs to be redone and the testing time is extended. This research developed a system to detect isolator function using an “Arduino Due” microcontroller along with a “Micro-electromechanical System (MEMS)” acceleration sensor. For precise analysis, the statistical method called the t test was used to group isolators by their functioning conditions. It was found that the measured vibration amplitude in any direction can be used as the criterion for identifying the isolator condition. This detection system can identify a bad isolator at an early stage, before excessive vibration affects the testing process.
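
    The grouping step can be illustrated with a two-sample t test comparing vibration amplitudes on an isolator under test against a known-good reference group. The sketch below uses simulated amplitudes and a 0.05 significance level as illustrative assumptions, and Welch's variant rather than the study's exact test.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        good = rng.normal(1.0, 0.1, 30)   # vibration amplitude (g) on known-good isolators
        test = rng.normal(1.4, 0.1, 30)   # amplitude on the isolator under test

        t_stat, p_value = stats.ttest_ind(test, good, equal_var=False)  # Welch's t test
        verdict = "bad isolator" if p_value < 0.05 and test.mean() > good.mean() else "ok"
        print(verdict, "(p = %.3g)" % p_value)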

  15. Image processing for femoral endosteal anatomy detection: description and testing of a computed tomography based program

    International Nuclear Information System (INIS)

    A computed tomography (CT)-based image processing computer program was developed for three-dimensional (3D) femoral endosteal cavity shape modelling. For the examinations 50 cadaver femora were used. In the CT imaging 30 axial slices were taken above and below the lesser trochanter area from each femur. Different image analysis methods were used for femoral cavity detection depending on the structure of the processed slice. In the femoral shaft area simple thresholding methods succeeded, but in the problem areas of the metaphyseal femur, edge detection operators and local thresholding were required. In contour tracking several criteria were used to check the validity of the border pixels. The results were saved as four output data files: (i) a file for the longest anteroposterior (ap), mediolateral (ml) and oblique diameters computed by a Euclidian method, (ii) and (iii) files for 2D and 3D data respectively, and (iv) a file for the centre points of each slice. Finally, testing of the results and dimensions obtained from the image analysis was carried out manually by sawing the femora into the stipulated horizontal slices. The ap and ml dimensions were measured with a caliper ruler. The CT-based image processing yielded a peak distribution of dimensions with a negative difference to those obtained in manual measurements. The mean difference between the image processing and the manual measurements was 1.1 mm (±0.7 mm, ±1 SD). The difference was highest in the proximal slices of the femora of group I (with lowest cortical thickness), i.e. 1.3 mm (±0.8 mm), and lowest in the distal slices of the femora from group III (with highest cortical thickness), i.e. 0.9 mm (±0.6 mm). The results are acceptable for further use of the program to study endosteal anatomy for individual femoral component selection and design basis. (author)

  16. High-Speed Digital Signal Processing Method for Detection of Repeating Earthquakes Using GPGPU-Acceleration

    Science.gov (United States)

    Kawakami, Taiki; Okubo, Kan; Uchida, Naoki; Takeuchi, Nobunao; Matsuzawa, Toru

    2013-04-01

    Repeating earthquakes occur on similar asperities at the plate boundary. These earthquakes have an important property: the seismic waveforms observed at the identical observation site are very similar regardless of their occurrence time. The slip histories of repeating earthquakes can reveal the existence of asperities: the analysis of repeating earthquakes can detect the characteristics of the asperities and realize temporal and spatial monitoring of the slip at the plate boundary. Moreover, we expect medium-term predictions of earthquakes at the plate boundary by means of the analysis of repeating earthquakes. Although previous works mostly clarified the existence of asperities and repeating earthquakes, and the relationship between asperities and quasi-static slip areas, a stable and robust method for automatic detection of repeating earthquakes has not been established yet. Furthermore, in order to process the enormous data (so-called big data), speedup of the signal processing is an important issue. Recently, the GPU (Graphics Processing Unit) has been used as an acceleration tool for signal processing in various study fields. This movement is called GPGPU (General-Purpose computing on GPUs). In the last few years the performance of GPUs has kept improving rapidly. That is, a PC (personal computer) with GPUs might be a personal supercomputer. GPU computing gives us a high-performance computing environment at a lower cost than before. Therefore, the use of GPUs contributes to a significant reduction of the execution time in signal processing of the huge seismic data. In this study, first, we applied band-limited Fourier phase correlation as a fast method of detecting repeating earthquakes. This method utilizes only band-limited phase information and yields the correlation values between two seismic signals. Secondly, we employ a coherence function using three orthogonal components (East-West, North-South, and Up-Down) of seismic data as a
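
    A minimal serial Python sketch of band-limited Fourier phase correlation follows: only the spectral phase inside a chosen band contributes, and a sharp peak marks the lag at which the two seismograms match. The band limits and synthetic signals are illustrative assumptions; the GPGPU acceleration is not reproduced.

        import numpy as np

        def phase_correlation(x, y, fs, band=(1.0, 8.0)):
            n = len(x)
            X, Y = np.fft.rfft(x), np.fft.rfft(y)
            cross = np.conj(X) * Y
            cross /= np.abs(cross) + 1e-12             # keep phase information only
            f = np.fft.rfftfreq(n, d=1 / fs)
            cross[(f < band[0]) | (f > band[1])] = 0   # band limitation
            return np.fft.irfft(cross, n)

        # A lag-shifted copy of the same event: the peak marks the true lag.
        fs = 100.0
        t = np.arange(0, 20, 1 / fs)
        event = np.exp(-((t - 5) ** 2)) * np.sin(2 * np.pi * 4 * t)
        rng = np.random.default_rng(4)
        x = event + 0.1 * rng.standard_normal(len(t))
        y = np.roll(event, 150) + 0.1 * rng.standard_normal(len(t))
        print("estimated lag:", np.argmax(phase_correlation(x, y, fs)), "samples")  # ~150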

  17. Generic Packing Detection Using Several Complexity Analysis for Accurate Malware Detection

    Directory of Open Access Journals (Sweden)

    Dr. Mafaz Mohsin Khalil Al-Anezi

    2014-01-01

    Full Text Available Attackers do not want their malicious software (malware) to be revealed by anti-virus analyzers. In order to conceal their malware, malware programmers increasingly utilize anti-reverse-engineering and code-changing techniques such as packing, encoding and encryption. Malware writers have learned that signature-based detectors can be easily evaded by “packing” the malicious payload in layers of compression or encryption. State-of-the-art malware detectors have adopted both static and dynamic techniques to recover the payload of packed malware, but unfortunately such techniques are highly ineffective. If the malware is packed or encrypted, it is very difficult to analyze. Therefore, to prevent the harmful effects of malware and to generate signatures for malware detection, packed and encrypted executable code must initially be unpacked. The first step of unpacking is to detect packed executable files. The objective is to efficiently and accurately distinguish between packed and non-packed executables, so that only executables detected as packed are sent to a generic unpacker, thus saving a significant amount of processing time. The generic method of this paper achieves very high detection accuracy of packed executables with a low average processing time. In this paper, a packed-file detection technique based on complexity measured by several algorithms is presented and tested using a packed and unpacked dataset of .exe files. The preliminary results are very promising: the method achieved about a 96% detection rate on packed files and a 93% detection rate on unpacked files. The experiments also demonstrate that this generic technique can effectively detect unknown, obfuscated malware and cannot be evaded by known evasion techniques.
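
    One complexity measure commonly used for this purpose is Shannon byte entropy: packed or encrypted executables tend toward near-uniform byte distributions (close to 8 bits/byte), while ordinary code and data score lower. The sketch below uses an illustrative 7.0-bit cutoff, not the paper's tuned combination of several measures.

        import math
        import zlib
        from collections import Counter

        def byte_entropy(data: bytes) -> float:
            counts = Counter(data)
            n = len(data)
            return -sum((c / n) * math.log2(c / n) for c in counts.values())

        def looks_packed(data: bytes, cutoff: float = 7.0) -> bool:
            # Entropy near 8 bits/byte suggests compressed or encrypted content.
            return byte_entropy(data) > cutoff

        text = " ".join(str(i) for i in range(2000)).encode()
        print(round(byte_entropy(text), 2), looks_packed(text))       # low entropy, False
        packed = zlib.compress(text)
        print(round(byte_entropy(packed), 2), looks_packed(packed))   # near 8 bits, True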

  18. Guided Wave Delamination Detection and Quantification With Wavefield Data Analysis

    Science.gov (United States)

    Tian, Zhenhua; Campbell Leckey, Cara A.; Seebo, Jeffrey P.; Yu, Lingyu

    2014-01-01

    Unexpected damage can occur in aerospace composites due to impact events or material stress during off-nominal loading events. In particular, laminated composites are susceptible to delamination damage due to weak transverse tensile and inter-laminar shear strengths. Developments of reliable and quantitative techniques to detect delamination damage in laminated composites are imperative for safe and functional optimally-designed next-generation composite structures. In this paper, we investigate guided wave interactions with delamination damage and develop quantification algorithms by using wavefield data analysis. The trapped guided waves in the delamination region are observed from the wavefield data and further quantitatively interpreted by using different wavenumber analysis methods. The frequency-wavenumber representation of the wavefield shows that new wavenumbers are present and correlate to trapped waves in the damage region. These new wavenumbers are used to detect and quantify the delamination damage through the wavenumber analysis, which can show how the wavenumber changes as a function of wave propagation distance. The location and spatial duration of the new wavenumbers can be identified, providing a useful means not only for detecting the presence of delamination damage but also allowing for estimation of the delamination size. Our method has been applied to detect and quantify real delamination damage with complex geometry (grown using a quasi-static indentation technique). The detection and quantification results show the location, size, and shape of the delamination damage.
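
    The frequency-wavenumber representation is obtained from a 2-D FFT over a space-time slice of the wavefield, so energy appearing at a new wavenumber row flags the trapped wave. The sketch below builds a synthetic two-mode wavefield; the grid spacing, frequency and wavenumbers are illustrative assumptions, not measured scan data.

        import numpy as np

        dx, dt = 1e-3, 1e-6                      # 1 mm spatial, 1 us temporal sampling
        x = np.arange(0, 0.2, dx)                # 200-point scan line
        t = np.arange(0, 2e-4, dt)               # 200-sample time record
        X, T = np.meshgrid(x, t, indexing="ij")

        f0 = 100e3                               # 100 kHz excitation
        k_nom, k_trap = 200.0, 400.0             # cycles/m: pristine vs trapped-wave mode
        field = np.sin(2 * np.pi * (f0 * T - k_nom * X))
        mask = x > 0.12                          # delamination region carries an extra mode
        field[mask] += np.sin(2 * np.pi * (f0 * T[mask] - k_trap * X[mask]))

        FK = np.abs(np.fft.fft2(field))          # axis 0 -> wavenumber, axis 1 -> frequency
        k_axis = np.fft.fftfreq(len(x), dx)      # cycles per metre

        def row_energy(k):
            return FK[np.argmin(np.abs(k_axis - k))].max()

        # Energy at the new wavenumber reveals (and, via windowed variants, localizes)
        # the trapped wave in the delamination region.
        print("trapped/nominal energy ratio:", round(row_energy(k_trap) / row_energy(k_nom), 2))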

  19. Experimental investigation of thermal neutron analysis based landmine detection technology

    International Nuclear Information System (INIS)

    Background: Recently, the prompt gamma-ray neutron activation analysis method has been widely used in coal analysis and explosive detection; however, there have been few applications of neutron methods to landmine detection, especially in domestic research. Purpose: To verify the feasibility of the Thermal Neutron Analysis (TNA) method for landmine detection, and to explore the characteristics of this technology. Methods: An experimental TNA landmine detection system was built based on a LaBr3(Ce) fast scintillator detector and a 252Cf isotope neutron source. The system comprises the thermal neutron transition system, the shield system, and the detector system. Results: On the basis of the TNA, the wide-energy-area calibration method, especially for the high energy area, was investigated, and the minimum detection time for a typical mine was defined. In this study, the 72-type anti-tank mine, a 500 g TNT sample and several interfering objects were tested in loess, red soil, magnetic soil and sand respectively. Conclusions: The experimental results indicate that TNA is a reliable demining method, and it can be used to confirm the existence of Anti-Tank Mines (ATM) and large Anti-Personnel Mines (APM) in complicated conditions. (authors)

  20. Information Design for "Weak Signal" detection and processing in Economic Intelligence: A case study on Health resources

    OpenAIRE

    Sahbi Sidhom; Philippe Lambert

    2011-01-01

    The topics of this research cover all phases of "Information Design" applied to detect and profit from weak signals in economic intelligence (EI) or business intelligence (BI). The field of the information design (ID) applies to the process of translating complex, unorganized or unstructured data into valuable and meaningful information. ID practice requires an interdisciplinary approach, which combines skills in graphic design (writing, analysis processing and editing), human performances te...

  1. A New Method of Color Edge Detection Based on Local Structure Analysis

    Institute of Scientific and Technical Information of China (English)

    JIANG Shu; ZHOU Yue; ZHU Wei-wei

    2008-01-01

    Humans live in a colorful world. Compared to gray images, color images contain more information and have better visual effects. In today's digital image processing, image segmentation is an important step for computers to "understand" images, and edge detection is always one of the most important methods in the field of image segmentation. Edges in color images are considered as local discontinuities in both the color and spatial domains. Despite intensive study based on the integration of single-channel edge detection results, and on vector space analysis, edge detection in color images remains a challenging issue.

  2. Analysis of individual brain activation maps using hierarchical description and multiscale detection

    International Nuclear Information System (INIS)

    The authors propose a new method for the analysis of brain activation images that aims at detecting activated volumes rather than pixels. The method is based on Poisson process modeling, hierarchical description, and multiscale detection (MSD). Its performances have been assessed using both Monte Carlo simulated images and experimental PET brain activation data. As compared to other methods, the MSD approach shows enhanced sensitivity with a controlled overall type I error, and has the ability to provide an estimate of the spatial limits of the detected signals. It is applicable to any kind of difference image for which the spatial autocorrelation function can be approximated by a stationary Gaussian function

  3. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    International Nuclear Information System (INIS)

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements with respect to the clinical tolerances used in the cancer center (±4% deviation between the calculated and measured doses) by calculating a control process capability (Cpc) index. The Cpc index was found to be greater than 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the
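
    Of the three charts, the EWMA chart is the most sensitive to slow drifts. A minimal Python sketch follows; the smoothing constant, control-limit multiplier and simulated dose deviations are illustrative assumptions, and in practice the process mean and standard deviation come from an in-control reference period.

        import numpy as np

        def ewma_chart(x, mu, sigma, lam=0.2, L=3.0):
            """Return the EWMA statistic and the indices exceeding the control limits."""
            z_prev, z, alarms = mu, np.zeros(len(x)), []
            for i in range(len(x)):
                z_prev = lam * x[i] + (1 - lam) * z_prev
                z[i] = z_prev
                # Time-varying limit, converging to L * sigma * sqrt(lam / (2 - lam)).
                width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * (i + 1))))
                if abs(z[i] - mu) > width:
                    alarms.append(i)
            return z, alarms

        # A gradual drift of the kind an EWMA chart is designed to catch: it can be
        # flagged while individual deviations are still inside the +/-4% tolerance.
        rng = np.random.default_rng(3)
        dev = rng.normal(0.0, 1.0, 40)            # dose deviations (%) from QC controls
        dev[25:] += np.linspace(0.5, 2.5, 15)     # slow process drift
        _, alarms = ewma_chart(dev, mu=0.0, sigma=1.0)
        print("first out-of-control point:", alarms[0] if alarms else "none")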

  4. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    Science.gov (United States)

    West, Phillip B.; Novascone, Stephen R.; Wright, Jerry P.

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  5. Elastic recoil detection analysis of hydrogen in polymers

    Energy Technology Data Exchange (ETDEWEB)

    Winzell, T.R.H.; Whitlow, H.J. [Lund Univ. (Sweden); Bubb, I.F.; Short, R.; Johnston, P.N. [Royal Melbourne Inst. of Tech., VIC (Australia)

    1996-12-31

    Elastic recoil detection analysis (ERDA) of hydrogen in thick polymeric films has been performed using 2.5 MeV He{sup 2+} ions from the tandem accelerator at the Royal Melbourne Institute of Technology. The technique enables the use of the same equipment as in Rutherford backscattering analysis, but instead of detecting the incident backscattered ion, the lighter recoiled ion is detected at a small forward angle. The purpose of this work is to investigate how selected polymers react when irradiated by helium ions. The polymers are to be evaluated for their suitability as reference standards for hydrogen depth profiling. Films investigated were Du Pont's Kapton and Mylar, and polystyrene. 11 refs., 3 figs.

  6. Detection of charged particles through a photodiode: design and analysis

    International Nuclear Information System (INIS)

    This project develops and constructs a charged-particle detector by means of a PIN photodiode array. The design and analysis use a silicon PIN photodiode that is generally used to detect visible light; its good efficiency, compact size and reduced cost specifically allow its use in radiation monitoring and alpha-particle detection. We present both the design of the detection system and its characterization for alpha particles, reporting the alpha energy resolution and detection efficiency. The equipment used in this work consists of a triple alpha source composed of Am-241, Pu-239 and Cm-244 with a total activity of 5.55 kBq, the Maestro 32 software made by ORTEC, a Triumph multi-channel card from ORTEC, and one low-activity electroplated uranium sample. (Author)

  7. Spiral-Bevel-Gear Damage Detected Using Decision Fusion Analysis

    Science.gov (United States)

    Dempsey, Paula J.; Handschuh, Robert F.

    2003-01-01

    Helicopter transmission integrity is critical to helicopter safety because helicopters depend on the power train for propulsion, lift, and flight maneuvering. To detect impending transmission failures, the ideal diagnostic tools used in the health-monitoring system would provide real-time health monitoring of the transmission, demonstrate a high level of reliable detection to minimize false alarms, and provide end users with clear information on the health of the system without requiring them to interpret large amounts of sensor data. A diagnostic tool for detecting damage to spiral bevel gears was developed. (Spiral bevel gears are used in helicopter transmissions to transfer power between nonparallel intersecting shafts.) Data fusion was used to integrate two different monitoring technologies, oil debris analysis and vibration, into a health-monitoring system for detecting surface fatigue pitting damage on the gears.

  8. Fault detection of a benchmark wind turbine using interval analysis

    DEFF Research Database (Denmark)

    Tabatabaeipour, Seyed Mojtaba; Odgaard, Peter Fogh; Bak, Thomas

    This paper investigates a state estimation set-membership approach for fault detection of a benchmark wind turbine. The main challenges in the benchmark are high noise on the wind speed measurement and the nonlinearities in the aerodynamic torque, such that the overall model of the turbine is nonlinear. We use an effective wind speed estimator to estimate the effective wind speed, and then, using interval analysis and monotonicity of the aerodynamic torque with respect to the effective wind speed, we can apply the method to the nonlinear system. The fault detection algorithm checks the consistency of the measurement with a closed set that is computed based on the past measurements and a model of the system. If the measurement is not consistent with this set, a fault is detected. The result demonstrates the effectiveness of the method for fault detection of the benchmark wind turbine.

  9. Apolipoprotein B100 analysis in microchip with electrochemical detection

    Institute of Scientific and Technical Information of China (English)

    Cheng Cheng Liu; Yun Liu; Hui Xiang Wang; Yan Bo Qi; Peng Yuan Yang; Bao Hong Liu

    2011-01-01

    Apolipoprotein B100 (apoB-100) is a major protein of the cholesterol-rich low-density lipoprotein (LDL) and provides a better assessment of total atherogenic burden to the vascular system than LDL. In this work, a simple and sensitive method has been developed to determine apoB-100 in picoliter-scale samples using a PMMA microfluidic chip coupled with an electrochemical detection system. The method performs very well, with a linear range of 1-800 pg/mL and a detection limit of 1 pg/mL. A real serum sample has further been analyzed by this microchip-based biosensor. The results show that this method is practicable and has potential application in clinical analysis and diagnosis.

  10. Fast Change Point Detection for Electricity Market Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Berkeley, UC; Gu, William; Choi, Jaesik; Gu, Ming; Simon, Horst; Wu, Kesheng

    2013-08-25

    Electricity is a vital part of our daily life; therefore it is important to avoid irregularities such as the California Electricity Crisis of 2000 and 2001. In this work, we seek to predict anomalies using advanced machine learning algorithms. These algorithms are effective, but computationally expensive, especially if we plan to apply them on hourly electricity market data covering a number of years. To address this challenge, we significantly accelerate the computation of the Gaussian Process (GP) for time series data. In the context of a Change Point Detection (CPD) algorithm, we reduce its computational complexity from O($n^{5}$) to O($n^{2}$). Our efficient algorithm makes it possible to compute the Change Points using the hourly price data from the California Electricity Crisis. By comparing the detected Change Points with known events, we show that the Change Point Detection algorithm is indeed effective in detecting signals preceding major events.

  11. Gravitational wave detection and data analysis for pulsar timing arrays

    NARCIS (Netherlands)

    Haasteren, Rutger van

    2011-01-01

    Long-term precise timing of Galactic millisecond pulsars holds great promise for measuring long-period (months-to-years) astrophysical gravitational waves. In this work we develop a Bayesian data analysis method for projects called pulsar timing arrays; projects aimed to detect these gravitational w

  12. Detection of the ionization of food. Analysis and characterization

    International Nuclear Information System (INIS)

    Nowadays, physico-chemical analysis makes it possible to detect ionized food and medicines. Some protocols have been standardized by the CEN (European Committee for Standardization). These qualitative methods are applied in quality control in the agro-food industry or for detecting fraud in labelling. (O.M.)

  13. Shaft crack detection on centrifugal pumps by vibration analysis

    International Nuclear Information System (INIS)

    A prospective method for detecting fatigue cracks in rotating shafts by vibration analysis is proposed as an alternative to presently used methods, which are ineffective. By normalizing the actual vibration levels with respect to one reference signature of the shaft system, the crack evolution may be revealed and a threshold alarm level can be established. (author). 8 refs., 1 fig

  14. Integrated polymer waveguides for absorbance detection in chemical analysis systems

    DEFF Research Database (Denmark)

    Mogensen, Klaus Bo; El-Ali, Jamil; Wolff, Anders;

    2003-01-01

    A chemical analysis system for absorbance detection with integrated polymer waveguides is reported for the first time. The fabrication procedure relies on structuring of a single layer of the photoresist SU-8, so both the microfluidic channel network and the optical components, which include planar...

  15. Optodynamic analysis of pulsed-laser processing with a Nd:YAG laser

    OpenAIRE

    Strgar, Simon; Možina, Janez

    2015-01-01

    Laser drilling and laser marking of metals with a pulsed Nd:YAG laser are discussed. Some characteristics of pulsed-laser processing and the possibilities of process optodynamic analysis are presented for the laser-drilling of aluminium. The optodynamic analysis is based on observation of generated shock waves, which propagate in the material as well as in the surrounding air during laser processing. For the detection of laser-induced shock waves in the air and for measurements of their chara...

  16. Image processing and analysis using neural networks for optometry area

    Science.gov (United States)

    Netto, Antonio V.; Ferreira de Oliveira, Maria C.

    2002-11-01

    In this work we describe the framework of a functional system for processing and analyzing images of the human eye acquired by the Hartmann-Shack (HS) technique, in order to extract information to formulate a diagnosis of eye refractive errors (astigmatism, hypermetropia and myopia). The analysis is to be carried out using an Artificial Intelligence system based on neural nets, fuzzy logic and classifier combination. The major goal is to establish the basis of a new technology to effectively measure ocular refractive errors, based on methods alternative to those adopted in current patented systems. Moreover, analysis of images acquired with the Hartmann-Shack technique may enable the extraction of additional information on the health of an eye under exam from the same image used to detect refraction errors.

  17. Fault Detection and Isolation in Industrial Systems Based on Spectral Analysis Diagnosis

    OpenAIRE

    Attia Daoudi; Mouloud Guemana; Ahmed Hafaifa

    2013-01-01

    Diagnosis in industrial systems represents an important economic objective in the process industrial automation area: it guarantees safety and continuity of production and records useful events as feedback experience for corrective maintenance. We propose in this work to examine and illustrate the applicability of the spectral analysis approach in the area of fault detection and isolation in industrial systems. In this work, we use a combined analysis dia...

  18. A Bubble Detection Algorithm Based on Sparse and Redundant Image Processing

    Directory of Open Access Journals (Sweden)

    Ye Tian

    2013-06-01

    Full Text Available The deinked pulp flotation column has been applied in wastepaper recycling. Bubble size in the deinked pulp flotation column is very important during the flotation process. In this paper, bubble images of the deinked pulp flotation column were first captured by a digital camera, and then the bubbles were detected by using a detection algorithm based on sparse and redundant image processing. The results show that the algorithm is very practical and effective for bubble detection in the deinked pulp flotation column.

  20. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations, COSO, 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that although the introduction of quantified risk to enhance the degree of objectivity in, for instance, finance developed in parallel with its use in the manufacturing industry, the same is not true in Higher Education Institutions (HEIs). In this regard, the objective of the paper was to demonstrate the method and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper is the first of a two-phased study, which sampled one hundred (100) risk analysts at a university in the greater Eastern Cape Province of South Africa. The analysis of likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether or not there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant with a chi-square (X2 = 8.181; p = 0.300), which indicated a good model fit, since the data did not deviate significantly from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).

  1. Application of envelop analysis and wavelet transform for detection of gear failure

    International Nuclear Information System (INIS)

    Vibration analysis is widely used in machinery diagnosis, and the wavelet transform has been implemented in many condition monitoring applications. In contrast to previous applications, this paper examines whether the acoustic signal can be used effectively, along with the vibration signal, to detect various local faults in gearboxes using the wavelet transform. Moreover, envelope analysis is well known as a useful tool for the detection of rolling element bearing faults. In this paper, an Acoustic Emission (AE) sensor is employed to detect gearbox damage by installing it on the bearing housing at the driven-end side. Signal processing is conducted by wavelet transform and enveloping to detect gearbox faults from the AE signal.
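
    The envelope analysis step can be sketched as follows, assuming scipy is available; the resonance band and sampling rate are placeholders rather than values from the paper.

        import numpy as np
        from scipy.signal import butter, filtfilt, hilbert

        def envelope_spectrum(x, fs, band=(2000.0, 8000.0)):
            # Band-pass around the structural resonance excited by the impacts,
            # then demodulate with the Hilbert transform; bearing/gear fault
            # frequencies appear as peaks in the spectrum of the envelope.
            b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
            env = np.abs(hilbert(filtfilt(b, a, x)))
            env -= env.mean()
            freqs = np.fft.rfftfreq(len(env), 1.0 / fs)
            return freqs, np.abs(np.fft.rfft(env))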

  2. Hybridization kinetics analysis of an oligonucleotide microarray for microRNA detection

    Institute of Scientific and Technical Information of China (English)

    Botao Zhao; Shuo Ding; Wei Li; Youxin Jin

    2011-01-01

    MicroRNA (miRNA) microarrays have been successfully used for profiling miRNA expression in many physiological processes such as development, differentiation, oncogenesis, and other disease processes. Detecting miRNA by miRNA microarray is actually based on nucleic acid hybridization between target molecules and their corresponding complementary probes. Due to the small size and high degree of similarity among miRNA sequences, the hybridization condition must be carefully optimized to get specific and reliable signals. Previously, we reported a microarray platform to detect miRNA expression. In this study, we evaluated the sensitivity and specificity of our microarray platform. After systematic analysis, we determined an optimized hybridization condition with high sensitivity and specificity for miRNA detection. Our results would be helpful for other hybridization-based miRNA detection methods, such as northern blot and nuclease protection assay.

  3. Rapid Detection of Biological and Chemical Threat Agents Using Physical Chemistry, Active Detection, and Computational Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Myung; Dong, Li; Fu, Rong; Liotta, Lance; Narayanan, Aarthi; Petricoin, Emanuel; Ross, Mark; Russo, Paul; Zhou, Weidong; Luchini, Alessandra; Manes, Nathan; Chertow, Jessica; Han, Suhua; Kidd, Jessica; Senina, Svetlana; Groves, Stephanie

    2007-01-01

    Basic technologies have been successfully developed within this project: rapid collection of aerosols and a rapid ultra-sensitive immunoassay technique. Water-soluble, humidity-resistant polyacrylamide nano-filters were shown to (1) capture aerosol particles as small as 20 nm, (2) work in humid air and (3) completely liberate their captured particles in an aqueous solution compatible with the immunoassay technique. The immunoassay technology developed within this project combines electrophoretic capture with magnetic bead detection. It allows detection of as few as 150-600 analyte molecules or viruses in only three minutes, something no other known method can duplicate. The technology can be used in a variety of applications where speed of analysis and/or extremely low detection limits are of great importance: in rapid analysis of donor blood for hepatitis, HIV and other blood-borne infections in emergency blood transfusions, in trace analysis of pollutants, or in search of biomarkers in biological fluids. Combined in a single device, the water-soluble filter and ultra-sensitive immunoassay technique may solve the problem of early warning type detection of aerosolized pathogens. These two technologies are protected with five patent applications and are ready for commercialization.

  4. Thermodynamic Analysis of Nanoporous Membrane Separation Processes

    Science.gov (United States)

    Rogers, David; Rempe, Susan

    2011-03-01

    We give an analysis of desalination energy requirements in order to quantify the potential for future improvements in desalination membrane technology. Our thermodynamic analysis makes it possible to draw conclusions from the vast array of equilibrium molecular dynamics simulations present in the literature as well as create a standardized comparison for measuring and reporting experimental reverse osmosis material efficiency. Commonly employed methods for estimating minimum desalination energy costs have been revised to include operations at positive input stream recovery ratios using a thermodynamic cycle analogous to the Carnot cycle. Several gaps in the statistical mechanical theory of irreversible processes have also been identified which may in the future lead to improved communication between materials engineering models and statistical mechanical simulation. Simulation results for silica surfaces and nanochannels are also presented. Sandia National Laboratories is a multi-program laboratory operated by Sandia Corporation, a subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  5. Decision analysis applications and the CERCLA process

    Energy Technology Data Exchange (ETDEWEB)

    Purucker, S.T.; Lyon, B.F. [Oak Ridge National Lab., TN (United States). Risk Analysis Section]|[Univ. of Tennessee, Knoxville, TN (United States)

    1994-06-01

    Quantitative decision methods can be developed during environmental restoration projects that incorporate stakeholder input and can complement current efforts undertaken for data collection and alternatives evaluation during the CERCLA process. These decision-making tools can supplement current EPA guidance and focus on problems that arise as attempts are made to make informed decisions regarding remedial alternative selection. In examining the use of such applications, the authors discuss decision analysis tools and their impact on collecting data and making environmental decisions from a risk-based perspective. They look at the construction of objective functions for quantifying different risk-based decision rules that incorporate stakeholder concerns; this represents a quantitative method for implementing the Data Quality Objective (DQO) process. These objective functions can be expressed using a variety of indices to analyze problems that currently arise in the environmental field. Examples include cost, magnitude of risk, efficiency, and probability of success or failure. Based on such defined objective functions, a project can evaluate the impact of different risk and decision selection strategies on data worth and alternative selection.

  6. Comparison of performance between rescaled range analysis and rescaled variance analysis in detecting abrupt dynamic change

    Institute of Scientific and Technical Information of China (English)

    何文平; 刘群群; 姜允迪; 卢莹

    2015-01-01

    In the present paper, a comparison of the performance between moving cutting data-rescaled range analysis (MC-R/S) and moving cutting data-rescaled variance analysis (MC-V/S) is made. The results clearly indicate that the operating efficiency of the MC-R/S algorithm is higher than that of the MC-V/S algorithm. In our numerical test, the computer time consumed by MC-V/S is approximately 25 times that by MC-R/S for an identical window size in artificial data. Except for the difference in operating efficiency, there are no significant differences in performance between MC-R/S and MC-V/S for the abrupt dynamic change detection. MC-R/S and MC-V/S both display some degree of anti-noise ability. However, it is important to consider the influences of strong noise on the detection results of MC-R/S and MC-V/S in practical application processes.
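
    A minimal sketch of the windowed rescaled range (R/S) statistic that underlies MC-R/S; the window length is arbitrary and the paper's moving-cutting details are not reproduced.

        import numpy as np

        def rescaled_range(x):
            # R/S of one window: range of the cumulative deviations from the
            # mean, rescaled by the standard deviation of the window.
            y = np.cumsum(x - np.mean(x))
            s = np.std(x)
            return (y.max() - y.min()) / s if s > 0 else 0.0

        def moving_rs(x, win=200):
            # An abrupt dynamic change shows up as a jump in this series.
            return np.array([rescaled_range(x[i:i + win])
                             for i in range(len(x) - win)])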

  7. Dynamic QCM Biosensor Detection of Human Serum Albumin and Process Analysis

    Institute of Scientific and Technical Information of China (English)

    刘宪华; 赵勇; 张林; 冯梦南; 赵友全; 鲁逸人

    2012-01-01

    The screening of urine human serum albumin (HSA) is significant for the early diagnosis and intervention of nephropathies. An immunosensor based on a quartz crystal microbalance (QCM) was developed for dynamic, real-time detection of HSA. First, the quartz crystal resonator was modified with lipoic acid and a self-assembled monolayer was formed on the surface. Then, the carboxyl groups on the modified surface were activated with 1-(3-(dimethylamino)propyl)-3-ethylcarbodiimide hydrochloride (EDC) and N-hydroxysuccinimide (NHS) in order to couple the anti-HSA antibody. In this experiment, the frequency changes recorded by the QCM biosensor were related to HSA concentrations. The results show that HSA can be quantitatively detected with the developed immunosensor within the mass concentration range of 0.1-6 μg/mL. The minimum detectable mass concentration of the biosensor is 0.1 μg/mL. The linear equation between frequency response and HSA concentration is y = 11.276x + 10.351 with R2 = 0.9994; within the detection range, the linear relationship is satisfactory.
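
    The reported calibration line can be recovered with an ordinary least-squares fit; the concentration grid below is an assumed set of measurement points used only to illustrate the fit.

        import numpy as np

        conc = np.array([0.1, 0.5, 1.0, 2.0, 4.0, 6.0])  # HSA, ug/mL (assumed points)
        dfreq = 11.276 * conc + 10.351                   # frequency shifts on the paper's line

        slope, intercept = np.polyfit(conc, dfreq, 1)    # recovers y = 11.276x + 10.351
        pred = slope * conc + intercept
        r2 = 1 - np.sum((dfreq - pred)**2) / np.sum((dfreq - dfreq.mean())**2)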

  8. Multiple scattering problems in heavy ion elastic recoil detection analysis

    International Nuclear Information System (INIS)

    A number of groups use Heavy Ion Elastic Recoil Detection Analysis (HIERDA) to study materials science problems. Nevertheless, there is no standard methodology for the analysis of HIERDA spectra. To overcome this deficiency we have been establishing codes for 2-dimensional data analysis. A major problem involves the effects of multiple and plural scattering which are very significant, even for quite thin (∼100 nm) layers of the very heavy elements. To examine the effects of multiple scattering we have made comparisons between the small-angle model of Sigmund et al. and TRIM calculations. (authors)

  9. Citation-based plagiarism detection detecting disguised and cross-language plagiarism using citation pattern analysis

    CERN Document Server

    Gipp, Bela

    2014-01-01

    Plagiarism is a problem with far-reaching consequences for the sciences. However, even today's best software-based systems can only reliably identify copy & paste plagiarism. Disguised plagiarism forms, including paraphrased text, cross-language plagiarism, as well as structural and idea plagiarism often remain undetected. This weakness of current systems results in a large percentage of scientific plagiarism going undetected. Bela Gipp provides an overview of the state-of-the art in plagiarism detection and an analysis of why these approaches fail to detect disguised plagiarism forms. The aut

  10. Adulteration detection of Brazilian gasoline samples by statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    L.S.M. Wiedemann; L.A. d' Avila; D.A. Azevedo [Universidade Federal do Rio de Janeiro, Rio de Janeiro (Brazil). Departamento de Quimica Organica, Instituto de Quimica

    2005-03-01

    The addition of organic solvents (heavy aliphatic, light aliphatic and aromatic hydrocarbons) to Brazilian gasoline is unfortunately very frequent, and this illegal practice compromises gasoline quality. Organic solvent adulteration of gasoline samples has been investigated. For the characterization and comparison of these samples, physico-chemical parameters together with gas chromatographic data were proposed as the factors for multivariate analysis. Hierarchical cluster analysis was used to improve the detection of the type of solvent and the relative proportion used in this practice, and more detailed information on sample composition was revealed. Using the physico-chemical properties of gasoline samples together with statistical analysis was found to be a useful method for adulteration detection. 20 refs., 5 figs., 2 tabs.
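
    A hedged sketch of the hierarchical cluster analysis step, with hypothetical standardized sample data standing in for the physico-chemical and chromatographic variables.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage

        # Rows: gasoline samples; columns: standardized physico-chemical and
        # chromatographic variables (hypothetical values).
        X = np.random.default_rng(0).normal(size=(12, 6))

        Z = linkage(X, method="ward")                     # agglomerative clustering
        labels = fcluster(Z, t=3, criterion="maxclust")   # e.g. three solvent groups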

  11. LANL organic analysis detection capabilities for chemical and biological warfare agents

    Energy Technology Data Exchange (ETDEWEB)

    Ansell, G.B.; Cournoyer, M.E.; Hollis, K.W.; Monagle, M.

    1996-12-31

    Organic analysis is the analytical arm for several Los Alamos National Laboratory (LANL) research programs and nuclear materials processes, including characterization and certification of nuclear and nonnuclear materials used in weapons, radioactive waste treatment and waste certification programs. The Organic Analysis group has an extensive repertoire of analytical techniques, including headspace gas, PCB/pesticide, volatile organic and semivolatile organic analysis. In addition, the group has mobile labs with analytical capabilities that include volatile organics, total petroleum hydrocarbons, PCBs, pesticides, polyaromatic hydrocarbons and high-explosive screening. A natural extension of these capabilities can be applied to the detection of chemical and biological agents.

  12. Detection Tuna and Processed Products Based Protein and DNA Barcoding

    Directory of Open Access Journals (Sweden)

    Nuring Wulansari

    2015-11-01

    Tuna is the second largest fishery commodity in Indonesia after shrimp. High demand and the limited stock of tuna create opportunities for fraud, so authentication is required to reassure consumers regarding the accuracy of labeling and food safety. In this study, authentication was based on protein analysis and DNA barcoding using the cytochrome-b (cyt-b) gene of the mitochondrial DNA as the target gene. Primers for the cyt-b gene were designed based on the tuna species. This study aimed to verify the authenticity of fresh tuna and its processed products through protein analysis using SDS-PAGE and DNA barcoding techniques. The phases of this research were protein electrophoresis by SDS-PAGE, DNA extraction, PCR amplification, electrophoresis and sequencing. Samples of fresh fish (Tu1, Tu2, Tu3, Tu4, and Tu5) and processed tuna (canned and steak) were successfully extracted. SDS-PAGE showed that proteins in the processed tuna were damaged, so this method is not appropriate for verifying the authenticity of processed tuna. PCR electrophoresis results showed that the samples of tuna, tuna steak, sushi, meat ball, abon, and canned tuna were successfully amplified in the range of 500-750 bp, except Ka3, in line with the target DNA (620 bp). The sequences of Tu2, Tu3, Tu4 and Tu5 were identified in agreement with the morphometric results, namely T. albacares, while Tu1 was identified as T. obesus with a homology level of 99%. Processed tunas (steak and canned) were identified as T. albacares, as stated on the labels.

  13. Mass Detection in Mammographic Images Using Wavelet Processing and Adaptive Threshold Technique.

    Science.gov (United States)

    Vikhe, P S; Thool, V R

    2016-04-01

    Detection of masses in mammograms for early diagnosis of breast cancer is a significant task in the reduction of the mortality rate. However, in some cases, screening for masses is difficult for the radiologist due to variation in contrast, fuzzy edges and noisy mammograms. Masses and micro-calcifications are the distinctive signs for diagnosis of breast cancer. This paper presents a method for mass enhancement using a piecewise linear operator in combination with wavelet processing of mammographic images. The method includes artifact suppression and pectoral muscle removal based on morphological operations. Finally, mass segmentation for detection using an adaptive threshold technique is carried out to separate the mass from the background. The proposed method has been tested on 130 (45 + 85) images with 90.9 and 91% True Positive Fraction (TPF) at 2.35 and 2.1 average False Positives Per Image (FP/I) from two different databases, namely the Mammographic Image Analysis Society (MIAS) and the Digital Database for Screening Mammography (DDSM). The obtained results show that the proposed technique gives improved diagnosis in early breast cancer detection. PMID:26811073
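
    The adaptive threshold segmentation step might look like the local-mean sketch below; the window size and offset are illustrative assumptions, not the paper's parameters.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def adaptive_threshold(img, win=51, offset=0.02):
            # A pixel is kept if it exceeds its local mean by `offset`; masses
            # are brighter than their neighbourhood in the enhanced mammogram.
            local_mean = uniform_filter(img.astype(float), size=win)
            return img > local_mean + offset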

  14. Detection of Organophosphorus Pesticides with Colorimetry and Computer Image Analysis.

    Science.gov (United States)

    Li, Yanjie; Hou, Changjun; Lei, Jincan; Deng, Bo; Huang, Jing; Yang, Mei

    2016-01-01

    Organophosphorus pesticides (OPs) represent a very important class of pesticides that are widely used in agriculture because of their relatively high performance and moderate environmental persistence; hence the sensitive and specific detection of OPs is highly significant. Based on the inhibitory effect of inhibitors, including OPs and carbamates, on acetylcholinesterase (AChE), a colorimetric analysis was used for detection of OPs with computer image analysis of color density in CMYK (cyan, magenta, yellow and black) color space and non-linear modeling. The results showed that yellow intensity weakened gradually as the concentration of dichlorvos increased. Quantitative analysis of dichlorvos was achieved by Artificial Neural Network (ANN) modeling, and the established model had good predictive ability between training and predictive sets. Real cabbage samples containing dichlorvos were analyzed by colorimetry and gas chromatography (GC), respectively, and no significant difference was found between colorimetry and GC (P > 0.05). Experiments on accuracy, precision and repeatability revealed good performance for detection of OPs. AChE can also be inhibited by carbamates, and therefore this method has potential applications to real samples containing OPs and carbamates because of its high selectivity and sensitivity. PMID:27396650
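
    A small sketch of the CMYK color-density extraction described above, using the standard RGB-to-CMYK conversion; the function name and averaging over the test area are illustrative.

        import numpy as np

        def yellow_density(rgb):
            # rgb: H x W x 3 array of 8-bit values. The paper reports the
            # yellow channel weakening as dichlorvos concentration rises.
            x = rgb.astype(float) / 255.0
            k = 1.0 - x.max(axis=-1)                 # black channel
            denom = np.where(k < 1.0, 1.0 - k, 1.0)  # avoid division by zero
            y = (1.0 - x[..., 2] - k) / denom        # yellow channel
            return float(y.mean())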

  15. Multisensor Network System for Wildfire Detection Using Infrared Image Processing

    Directory of Open Access Journals (Sweden)

    I. Bosch

    2013-01-01

    This paper presents the next step in the evolution of multi-sensor wireless network systems for the early automatic detection of forest fires. The network allows remote monitoring of each of the locations as well as communication between each of the sensors and with the control stations. The result is an increased coverage area, with quicker and safer responses. To determine the presence of a forest wildfire, the system employs decision fusion in thermal imaging, which can exploit various expected characteristics of a real fire, including short-term persistence and long-term increases over time. Results from testing in the laboratory and in a real environment are presented to authenticate and verify the accuracy of the operation of the proposed system. The system performance is gauged by the number of alarms and the time to the first alarm (corresponding to a real fire), for different probabilities of false alarm (PFA). The necessity of including decision fusion is thereby demonstrated.

  16. Low-power signal processing devices for portable ECG detection.

    Science.gov (United States)

    Lee, Shuenn-Yuh; Cheng, Chih-Jen; Wang, Cheng-Pin; Kao, Wei-Chun

    2008-01-01

    An analog front end for diagnosing and monitoring the behavior of the heart is presented. This sensing front end has two low-power processing devices, a 5th-order Butterworth operational transconductance-C (OTA-C) filter and an 8-bit successive approximation analog-to-digital converter (SAADC). The components, fabricated in a 0.18-μm CMOS technology, feature power consumptions of 453 nW (filter) and 940 nW (ADC) at a supply voltage of 1 V. The system specifications in terms of output noise and linearity associated with the two integrated circuits are described in this paper. PMID:19163002

  18. Detecting fast, online reasoning processes in clinical decision making.

    Science.gov (United States)

    Flores, Amanda; Cobos, Pedro L; López, Francisco J; Godoy, Antonio

    2014-06-01

    In an experiment that used the inconsistency paradigm, experienced clinical psychologists and psychology students performed a reading task using clinical reports and a diagnostic judgment task. The clinical reports provided information about the symptoms of hypothetical clients who had been previously diagnosed with a specific mental disorder. Reading times of inconsistent target sentences were slower than those of control sentences, demonstrating an inconsistency effect. The results also showed that experienced clinicians gave different weights to different symptoms according to their relevance when fluently reading the clinical reports provided, despite the fact that all the symptoms were of equal diagnostic value according to the Diagnostic and Statistical Manual of Mental Disorders (4th ed., text rev.; American Psychiatric Association, 2000). The diagnostic judgment task yielded a similar pattern of results. In contrast to previous findings, the results of the reading task may be taken as direct evidence of the intervention of reasoning processes that occur very early, rapidly, and online. We suggest that these processes are based on the representation of mental disorders and that these representations are particularly suited to fast retrieval from memory and to making inferences. They may also be related to the clinicians' causal reasoning. The implications of these results for clinician training are also discussed. PMID:24274045

  19. Application of flaw detection methods for detection of fatigue processes in low-alloyed steel

    OpenAIRE

    Zbigniew H. Żurek

    2007-01-01

    The paper presents investigations conducted at the Fraunhofer Institute (IZFP Saarbrücken) using a BEMI microscope (BEMI = Barkhausenrausch- und Wirbelstrom-Mikroskopie, or Barkhausen Noise and Eddy Current Microscopy). The ability to detect the influence of cyclic and contact fatigue loads has been investigated. The measurement amplitudes obtained with Barkhausen Noise and Eddy Current probes have been analysed. Correlation of measurement results and the material's condition has been observed in c...

  20. Speckle Tracking Based Strain Analysis Is Sensitive for Early Detection of Pathological Cardiac Hypertrophy.

    Directory of Open Access Journals (Sweden)

    Xiangbo An

    Cardiac hypertrophy is a key pathological process in many cardiac diseases. However, early detection of cardiac hypertrophy is difficult with the currently used non-invasive methods, and new approaches are urgently needed for efficient diagnosis of cardiac malfunction. Here we report that speckle tracking-based strain analysis is more sensitive than conventional echocardiography for early detection of pathological cardiac hypertrophy in the isoproterenol (ISO) mouse model. Pathological hypertrophy was induced by a single subcutaneous injection of ISO. Physiological cardiac hypertrophy was established by daily treadmill exercise for six weeks. Strain analysis, including radial strain (RS), radial strain rate (RSR) and longitudinal strain (LS), showed a marked decrease as early as 3 days after ISO injection. Moreover, unlike the regional changes seen in cardiac infarction, strain analysis revealed global cardiac dysfunction affecting the entire heart in ISO-induced hypertrophy. In contrast, conventional echocardiography detected an altered E/E', an index reflecting cardiac diastolic function, only at 7 days after ISO injection. No change was detected in fractional shortening (FS), E/A or E'/A' at 3 or 7 days after ISO injection. Interestingly, strain analysis revealed cardiac dysfunction only in ISO-induced pathological hypertrophy, not in the physiological hypertrophy induced by exercise. Taken together, our study indicates that strain analysis offers a more sensitive approach for early detection of cardiac dysfunction than conventional echocardiography, and multiple strain readouts distinguish pathological cardiac hypertrophy from physiological hypertrophy.

  1. Differential functioning of retrieval/comparison processing in detection of the presence and absence of change.

    Directory of Open Access Journals (Sweden)

    Takuma Murakoshi

    BACKGROUND: The phenomenon of change blindness may reflect failure to detect the presence of change or the absence of change. Although the latter is considered more difficult than the former, the differential functioning of retrieval/comparison processing that leads to differences between the detection of the presence and the absence of change has not been clarified. This study aimed to fill this research gap by comparing performance in the detection of the presence and the absence of a change in one item among a set of items. METHODOLOGY/PRINCIPAL FINDINGS: Twenty subjects performed two types of change detection task: the first was detection of one changed item among a set of unchanged items (detection of the presence of a change), and the other was detection of one unchanged item among a set of changed items (detection of the absence of a change). The ANOVA results for the percentage of correct responses and the signal detection measure A' indicate that subjects found (1) detection of the presence of change less difficult than detection of the absence of change, (2) rejection of the presence of change less difficult than acceptance of the presence of change, and (3) rejection of the absence of change as difficult as acceptance of the absence of change. CONCLUSIONS/SIGNIFICANCE: Retrieval/comparison processing for the detection of the presence of change differs from that for the absence of change, likely because the retrieval/comparison process appears aimed at determining whether an item has changed, not whether an item appears the same as it did previously. This conclusion suggests the existence of an identification process, apart from the mechanism underlying retrieval/comparison processing, that recognizes each item as the same as that observed previously.

  2. Digital image processing: effect on detectability of simulated low-contrast radiographic patterns

    International Nuclear Information System (INIS)

    Detection studies of simulated low-contrast radiographic patterns were performed with a high-quality digital image processing system. The original images, prepared with conventional screen-film systems, were processed digitally to enhance contrast by a "windowing" technique. The detectability of the simulated patterns was quantified in terms of the results of observer performance experiments using the multiple-alternative forced-choice method. The processed images demonstrated a significant increase in observer detection performance over that for the original images. These results are related to the displayed and perceived signal-to-noise ratios derived from signal detection theory. The improvement in detectability is ascribed to a reduction in the relative magnitude of the human observer's "internal" noise after image processing. The measured dependence of threshold signal contrast on object size and noise level is accounted for by a statistical decision theory model that includes internal noise
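
    The "windowing" contrast enhancement can be sketched as a linear window/level mapping; this is a generic stand-in for the digital processing system used in the study.

        import numpy as np

        def window_level(img, level, width):
            # Grey levels inside [level - width/2, level + width/2] are
            # stretched to the full display range; the rest saturate.
            lo = level - width / 2.0
            out = (img.astype(float) - lo) / width
            return np.clip(out, 0.0, 1.0)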

  3. Comparison between charged aerosol detection and light scattering detection for the analysis of Leishmania membrane phospholipids.

    Science.gov (United States)

    Ramos, R Godoy; Libong, D; Rakotomanga, M; Gaudin, K; Loiseau, P M; Chaminade, P

    2008-10-31

    The performance of charged aerosol detection (CAD) was compared to evaporative light scattering detection (ELSD) for the analysis of Leishmania membrane phospholipid (PL) classes by NP-HPLC. In both methods, a PVA-Sil column was used for the determination of the major Leishmania membrane PLs, phosphatidic acid, phosphatidylglycerol, cardiolipin, phosphatidylinositol, phosphatidylethanolamine, phosphatidylserine, lysophosphatidylethanolamine, phosphatidylcholine, sphingomyelin and lysophosphatidylcholine, in the same analysis. Although the response of both detection methods can be fitted to a power function, the CAD response can also be described by a linear model with determination coefficients (R2) ranging from 0.993 to 0.998 for an injected mass of 30 ng to 20.00 microg. The CAD response appeared to be directly proportional when a restricted range was used, and CAD was found to be more sensitive at the lowest mass range than ELSD. With HPLC-ELSD the limits of detection (LODs) were between 71 and 1195 ng and the limits of quantification (LOQs) were between 215 and 3622 ng. With HPLC-CAD, the LODs were between 15 and 249 ng and the LOQs were between 45 and 707 ng. The accuracy of the methods ranged from 62.8 to 115.8% and from 58.4 to 110.5% for ELSD and CAD, respectively. The HPLC-CAD method is suitable for assessing the influence of miltefosine on the composition of Leishmania membrane phospholipids. PMID:18823632

  4. Cotton Pests and Diseases Detection Based on Image Processing

    Directory of Open Access Journals (Sweden)

    Qinghai He

    2013-06-01

    The damaged region is extracted from the cotton image in order to measure the damage ratio of the cotton leaf caused by diseases or pests. Several algorithms, such as image enhancement and image filtering, suited to cotton leaf processing were explored in this paper. Three different color models for extracting the damaged region from cotton leaf images were implemented, namely the RGB color model, the HSI color model, and the YCbCr color model. The ratio of damage (γ) was chosen as the feature to measure the degree of damage caused by diseases or pests. This paper also compares the results obtained with the different color models; the comparison shows good accuracy in all color models, and the YCbCr color space is considered the best color model for extracting the damaged region.
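
    A minimal sketch of computing the damage ratio γ in the YCbCr color space, using the standard ITU-R BT.601 conversion; the Cb threshold is an assumption for illustration, not a value from the paper.

        import numpy as np

        def damage_ratio(rgb, cb_thresh=120.0):
            # Damaged (yellow-brown) leaf tissue has a low blue-difference
            # chroma Cb compared to healthy green tissue.
            r, g, b = (rgb[..., i].astype(float) for i in range(3))
            cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
            return float((cb < cb_thresh).mean())   # gamma: damaged / total pixels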

  5. Goals Analysis Procedure Guidelines for Applying the Goals Analysis Process

    Science.gov (United States)

    Motley, Albert E., III

    2000-01-01

    One of the key elements of successful project management is the establishment of the "right set of requirements", requirements that reflect the true customer needs and are consistent with the strategic goals and objectives of the participating organizations. A viable set of requirements implies that each individual requirement is a necessary element in satisfying the stated goals and that the entire set of requirements, taken as a whole, is sufficient to satisfy the stated goals. Unfortunately, it is the author's experience that during project formulation phases, many of the Systems Engineering customers do not conduct a rigorous analysis of the goals and objectives that drive the system requirements. As a result, the Systems Engineer is often provided with requirements that are vague, incomplete, and internally inconsistent. To complicate matters, most systems development methodologies assume that the customer provides unambiguous, comprehensive and concise requirements. This paper describes the specific steps of a Goals Analysis process applied by Systems Engineers at the NASA Langley Research Center during the formulation of requirements for research projects. The objective of Goals Analysis is to identify and explore all of the influencing factors that ultimately drive the system's requirements.

  6. Analysis of accelerants and fire debris using aroma detection technology

    Energy Technology Data Exchange (ETDEWEB)

    Barshick, S.A.

    1997-01-17

    The purpose of this work was to investigate the utility of electronic aroma detection technologies for the detection and identification of accelerant residues in suspected arson debris. Through the analysis of known accelerant residues, a trained neural network was developed for classifying suspected arson samples. Three unknown fire debris samples were classified using this neural network. The item corresponding to diesel fuel was correctly identified every time. For the other two items, wide variations in sample concentration and excessive water content, producing high sample humidities, were shown to influence the sensor response. Sorbent sampling prior to aroma detection was demonstrated to reduce these problems and to allow proper neural network classification of the remaining items corresponding to kerosene and gasoline.

  7. Automatic Fatigue Detection of Drivers through Yawning Analysis

    Science.gov (United States)

    Azim, Tayyaba; Jaffar, M. Arfan; Ramzan, M.; Mirza, Anwar M.

    This paper presents a non-intrusive fatigue detection system based on the video analysis of drivers. The focus of the paper is on how to detect yawning which is an important cue for determining driver's fatigue. Initially, the face is located through Viola-Jones face detection method in a video frame. Then, a mouth window is extracted from the face region, in which lips are searched through spatial fuzzy c-means (s-FCM) clustering. The degree of mouth openness is extracted on the basis of mouth features, to determine driver's yawning state. If the yawning state of the driver persists for several consecutive frames, the system concludes that the driver is non-vigilant due to fatigue and is thus warned through an alarm. The system reinitializes when occlusion or misdetection occurs. Experiments were carried out using real data, recorded in day and night lighting conditions, and with users belonging to different race and gender.

  8. Sentiment Detection of Web Users Using Probabilistic Latent Semantic Analysis

    Directory of Open Access Journals (Sweden)

    Weijian Ren

    2014-10-01

    With the wide application of the Internet in almost all fields, it has become the most important channel for information publication and for spreading public opinion. Public opinions, as the response of Internet users to information such as social events and government policies, reflect the status of both society and the economy, which is highly valuable for the decision-making and public relations of enterprises. At present, analysis methods for Internet public opinion are mainly based on discriminative approaches, such as the Support Vector Machine (SVM) and neural networks. However, when these approaches analyze the sentiment of Internet public opinion, they fail to exploit information hidden in the text, e.g. topic. Motivated by this observation, this paper proposes a detection method for public sentiment based on the Probabilistic Latent Semantic Analysis (PLSA) model. PLSA inherits the advantages of LSA, exploiting the semantic topics hidden in the data. The procedure for detecting public sentiment with this algorithm is composed of three main steps: (1) Chinese word segmentation and word refinement, with which each document is represented by a bag of words; (2) modeling the probabilistic distribution of documents using PLSA; and (3) using the Z-vector of PLSA as the document features and delivering it to an SVM for sentiment detection. We collected a set of text data from Weibo, blogs, BBS etc. to evaluate the proposed approach. The experimental results show that the proposed method can detect public sentiment with high accuracy, outperforming state-of-the-art approaches such as the word-histogram-based approach. The results also suggest that text semantic analysis using PLSA can significantly boost sentiment detection.

  9. Sub pixel analysis and processing of sensor data for mobile target intelligence information and verification

    Science.gov (United States)

    Williams, Theresa Allen

    This dissertation introduces a novel process to study and analyze sensor data in order to obtain information pertaining to mobile targets at the sub-pixel level. The process design is modular in nature and utilizes a set of algorithmic tools for change detection, target extraction and analysis, super-pixel processing and target refinement. The scope of this investigation is confined to a staring sensor that records data of sub-pixel vehicles traveling horizontally across the ground. Statistical models of the targets and background are developed with noise and jitter effects. Threshold Change Detection, Duration Change Detection and Fast Adaptive Power Iteration (FAPI) Detection techniques are the three methods used for target detection. The PolyFit and FermiFit are two tools developed and employed for target analysis, which allows for flexible processing. Tunable parameters in the detection methods, along with filters for false alarms, show the adaptability of the procedures. Super-pixel processing tools are designed, and Refinement Through Tracking (RTT) techniques are investigated as post-processing refinement options. The process is tested on simulated datasets, and validated with sensor datasets obtained from RP Flight Systems, Inc.

  10. Data processing of ground-penetrating radar signals for the detection of discontinuities using polarization diversity

    Science.gov (United States)

    Tebchrany, Elias; Sagnard, Florence; Baltazart, Vincent; Tarel, Jean-Phillippe

    2014-05-01

    In civil engineering, ground penetrating radar (GPR) is used to survey pavement thickness at traffic speed and to detect and localize buried objects (pipes, cables, voids, cavities) and zones of cracks and discontinuities in concrete or soils. In this work, a ground-coupled radar made of a pair of transmitting and receiving bowtie-slot antennas is moved linearly on the soil surface to detect the waves reflected by discontinuities in the subsurface. The GPR system operates in the frequency domain as a step-frequency continuous wave (SFCW) radar, using a Vector Network Analyzer (VNA) over the ultra-wide band [0.3; 4] GHz. The detection of targets is usually based on time imaging: targets of limited size appear as diffraction hyperbolas on a B-scan image, which is an unfocused depiction of the scatterers. The contrast in permittivity and the ratio between the size of the object and the wavelength are important parameters in the detection process. We have therefore made a first study of the use of polarization diversity to obtain additional information on the contrast between the soil and the target and on the dielectric characteristics of the target. The two main polarization configurations of the radar have been considered in the presence of pipe-shaped objects: TM (Transverse Magnetic) and TE (Transverse Electric). To interpret the diffraction hyperbolas on a B-scan image, pre-processing techniques are necessary to reduce the clutter, which can overlap and obscure the target responses, particularly for shallow objects. The clutter, which can be composed of the direct coupling between the antennas, the reflection from the soil surface, scattering from heterogeneities due to the granular nature of the subsurface material, and some additive noise, varies with the soil dielectric characteristics and/or surface roughness and leads to uncertainty in the measurements (additive noise). Because of the statistical nature of
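
    One common pre-processing step for suppressing the antenna direct coupling and the flat surface reflection is mean-trace subtraction; this is a generic sketch, not the authors' exact processing chain.

        import numpy as np

        def remove_background(bscan):
            # B-scan: rows = time samples, columns = traces along the profile.
            # Subtracting the mean trace removes features that are constant
            # along the scan, leaving the diffraction hyperbolas of targets.
            return bscan - bscan.mean(axis=1, keepdims=True)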

  11. Improved evaporative light scattering detection for carbohydrate analysis.

    Science.gov (United States)

    Condezo-Hoyos, Luis; Pérez-López, Elena; Rupérez, Pilar

    2015-08-01

    Optimization and validation of an evaporative light scattering detector (ELSD), aided by response surface methodology (RSM), are reported for the liquid chromatography analysis of carbohydrates over a wide molecular weight (MW) range, including polysaccharides and oligosaccharides. Optimal experimental parameters for ELSD detection were: 88.8°C evaporator temperature, 77.9°C nebulizer temperature and 1.1 standard litres per minute nitrogen flow rate. Optimal ELSD detection, used together with high performance size exclusion chromatography (HPSEC) of carbohydrates, gave a linear range from 250 to 1000 mg/L (R2 > 0.998), with limits of detection of 4.83-11.67 mg/L and limits of quantitation of 16.11-38.91 mg/L. Relative standard deviation was lower than 1.8% for intra-day and inter-day repeatability for apple pectin, inulin, verbascose, stachyose and raffinose. Recovery ranged from 103.7% to 118.3% for fructo-oligosaccharides, α-galacto-oligosaccharides and disaccharides. The optimized and validated ELSD detection is proposed for the analysis of high- to low-MW carbohydrates with high sensitivity, precision and accuracy. PMID:25766827

  12. Hyperspectral image compression and target detection using nonlinear principal component analysis

    Science.gov (United States)

    Du, Qian; Wei, Wei; Ma, Ben; Younan, Nicolas H.

    2013-09-01

    The widely used principal component analysis (PCA) is implemented nonlinearly by an auto-associative neural network. Compared to other nonlinear versions, such as kernel PCA, this nonlinear PCA has explicit encoding and decoding processes, and the data can be transformed back to the original space. Its data compression performance is similar to that of PCA, but its data analysis performance, such as target detection, is much better. To expedite the training process, graphics processing unit (GPU)-based parallel computing is applied.
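
    A compact sketch of nonlinear PCA via an auto-associative (bottleneck) network, here approximated with scikit-learn's MLPRegressor rather than the authors' implementation; the layer sizes are illustrative.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        def nonlinear_pca(X, n_components=3):
            # Train X -> X with a narrow bottleneck; the bottleneck activations
            # are the nonlinear principal components (explicit encoder), and
            # the network output is the decoded reconstruction.
            ae = MLPRegressor(hidden_layer_sizes=(32, n_components, 32),
                              activation="tanh", max_iter=2000, random_state=0)
            ae.fit(X, X)
            h = np.tanh(X @ ae.coefs_[0] + ae.intercepts_[0])   # first hidden layer
            z = np.tanh(h @ ae.coefs_[1] + ae.intercepts_[1])   # bottleneck scores
            return z, ae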

  13. Detection of Wind Turbine Power Performance Abnormalities Using Eigenvalue Analysis

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Sweeney, Christian Walsted; Marhadi, Kun Saptohartyadi; Jensen, Bogi Bech; Mijatovic, Nenad; Holbøll, Joachim

    2014-01-01

    ...commercially available and functioning successfully in fixed-speed and variable-speed turbines. Power performance analysis is a method specifically applicable to wind turbines for the detection of power generation changes due to external factors, such as icing, internal factors, such as controller malfunction, or deliberate actions, such as power de-rating. In this paper, power performance analysis is performed by sliding a time-power window and calculating the two eigenvalues corresponding to the two-dimensional wind speed - power generation distribution. The power is classified into five bins in order to achieve better resolution and thus identify the most probable root cause of the power deviation. An important aspect of the proposed technique is its independence from the power curve provided by the turbine manufacturer. It is shown that by detecting any changes in the trends of the two eigenvalues...
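
    The sliding-window eigenvalue computation can be sketched as follows; the window length is an assumption, and the paper's five-bin power classification is omitted.

        import numpy as np

        def eigen_trends(wind_speed, power, win=288):
            # Track the two eigenvalues of the 2-D wind speed / power
            # covariance in a sliding window; a change in either trend flags
            # a power-performance deviation.
            lam = [np.sort(np.linalg.eigvalsh(
                       np.cov(wind_speed[i:i + win], power[i:i + win])))
                   for i in range(len(power) - win)]
            return np.array(lam)   # columns: minor and major eigenvalue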

  14. Detection of bearing damage by statistic vibration analysis

    Science.gov (United States)

    Sikora, E. A.

    2016-04-01

    The condition of bearings, which are essential components in mechanisms, is crucial to safety. Analysis of the bearing vibration signal, which is always contaminated by certain types of noise, is a very important means of mechanical condition diagnosis for bearings and mechanical failure phenomena. In this paper a method of rolling bearing fault detection by statistical analysis of vibration is proposed to filter out the Gaussian noise contained in a raw vibration signal. The results of experiments show that the vibration signal can be significantly enhanced by application of the proposed method. In addition, the proposed method is used to analyse real acoustic signals of bearings with inner race and outer race faults, respectively. The values of the attributes are determined according to the degree of the fault. The results confirm that the periods between the transients, which represent bearing fault characteristics, can be successfully detected.
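
    Typical statistical attributes for such an analysis can be computed as below; the paper's exact attribute set is not spelled out in the abstract, so this is a generic sketch.

        import numpy as np
        from scipy.stats import kurtosis

        def bearing_features(x):
            # Impulsive bearing faults raise kurtosis and crest factor first;
            # kurtosis is about 3.0 for a healthy (Gaussian) signal.
            rms = np.sqrt(np.mean(x**2))
            return {"rms": rms,
                    "kurtosis": kurtosis(x, fisher=False),
                    "crest": np.max(np.abs(x)) / rms}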

  15. DROIDSWAN: DETECTING MALICIOUS ANDROID APPLICATIONS BASED ON STATIC FEATURE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Babu Rajesh V

    2015-07-01

    Android, being a widely used mobile platform, has witnessed an increase in the number of malicious samples on its market place. The availability of multiple sources for downloading applications has also contributed to users falling prey to malicious applications. Classification of an Android application as malicious or benign remains a challenge, as malicious applications maneuver to pose as benign. This paper presents an approach which extracts various features from Android Application Package (APK) files using static analysis and subsequently classifies them using machine learning techniques. The contribution of this work includes deriving, extracting and analyzing crucial features of Android applications that aid in efficient classification. The analysis is carried out using various machine learning algorithms with both weighted and non-weighted approaches. It was observed that the weighted approach yields higher detection rates using fewer features. The Random Forest algorithm exhibited a high detection rate and the least false positive rate.

  16. Clinical Detection and Feature Analysis on Neuro Signals

    Institute of Scientific and Technical Information of China (English)

    张晓文; 杨煜普; 许晓鸣; 胡天培; 高忠华; 张键; 陈中伟; 陈统一

    2004-01-01

    Research on neuro signals is challenging and significant in modern natural science. Through clinical experiments, signals from three main nerves (the median, radial and ulnar nerves) were successfully detected and recorded without any infection. Further analysis of their features under different movements, their mechanics, and their correlations in dominating actions was also performed. The original discovery and first-hand materials make it possible to develop a practical neuro-prosthesis.

  17. Big Data and Specific Analysis Methods for Insurance Fraud Detection

    Directory of Open Access Journals (Sweden)

    Ramona BOLOGA

    2014-02-01

    Analytics is the future of big data, because only transforming data into information gives it value and can turn data into competitive advantage. Large data volumes, their variety and the increasing speed of their growth stretch the boundaries of traditional data warehouses and ETL tools. This paper investigates the benefits of Big Data technology and the main methods of analysis that can be applied to the particular case of fraud detection in the public health insurance system in Romania.

  18. Detecting errors in micro and trace analysis by using statistics

    DEFF Research Database (Denmark)

    Heydorn, K.

    1993-01-01

    By assigning a standard deviation to each step in an analytical method it is possible to predict the standard deviation of each analytical result obtained by this method. If the actual variability of replicate analytical results agrees with the expected, the analytical method is said to be in statistical control. ... Applied to results for chlorine in freshwater from BCR certification analyses by highly competent analytical laboratories in the EC, titration showed systematic errors of several percent, while radiochemical neutron activation analysis produced results without detectable bias.

  19. Detecting Pandemic and Endemic Incidents through Network Telescopes: Security Analysis

    OpenAIRE

    Gagadis, Fotis

    2009-01-01

    Moore et al., from the Cooperative Association for Internet Data Analysis (CAIDA), have in recent years proposed another measurement and monitoring method for networks and the Internet. Network telescopes are used to detect malicious traffic events generated by Denial of Service attacks, worm-infected hosts and misconfiguration. This report focuses on endemic and pandemic incidents (DoS, worms) and how these incidents are observed through different Darknet topologies and sta...

  20. RADIA: RNA and DNA Integrated Analysis for Somatic Mutation Detection

    OpenAIRE

    Radenbaugh, Amie J.; Ma, Singer; Ewing, Adam; Stuart, Joshua M.; Collisson, Eric A.; Zhu, Jingchun; Haussler, David

    2014-01-01

    The detection of somatic single nucleotide variants is a crucial component to the characterization of the cancer genome. Mutation calling algorithms thus far have focused on comparing the normal and tumor genomes from the same individual. In recent years, it has become routine for projects like The Cancer Genome Atlas (TCGA) to also sequence the tumor RNA. Here we present RADIA (RNA and DNA Integrated Analysis), a method that combines the patient-matched normal and tumor DNA with the tumor RN...

  1. A Survey on Malware Propagation, Analysis, and Detection

    Directory of Open Access Journals (Sweden)

    Mohsen Damshenas

    2015-05-01

    Lately, a new kind of war has been taking place between the security community and malicious software developers: security specialists use all possible techniques, methods and strategies to stop and remove threats, while malware developers utilize new types of malware that bypass implemented security features. In this study we look closely into malware in order to understand its definition, types and propagation, as well as detection and defence mechanisms, so as to contribute to the process of protection and security enhancement.

  2. Intersections of behavior analysis with cognitive models of contingency detection

    OpenAIRE

    Cigales, Maricel

    1997-01-01

    Bower and Watson have offered, respectively, a logical hypothesis-testing model and a conditional probability model of contingency detection by young infants. Although each could represent cognitive processes concomitant with operant learning, empirical support for these models is sparse. The limitations of each model are discussed, and suggestions are made for a more parsimonious approach by focusing on the areas of overlap between the two.

  3. Application of factor analysis to the explosive detection

    International Nuclear Information System (INIS)

    The detection of explosive devices hidden in airline baggage is a significant problem, particularly in view of the development of modern plastic explosives, which can be formed into various innocent-appearing shapes and are sufficiently powerful that small quantities can destroy an aircraft in flight. In addition, a major difficulty arises from the long detection time required by explosive detection systems based on thermal neutron interrogation, which involves exposing baggage to slow neutrons with energies of the order of 0.025 eV. The elemental compositions of explosives can be determined by Neutron Induced Prompt gamma Spectroscopy (NIPS), which has been installed at the Korea Atomic Energy Research Institute as a tool for the detection of explosives in passenger baggage. In this work, factor analysis has been applied to the NIPS system to increase the signal-to-noise ratio of the prompt gamma spectrum for the detection of explosives hidden in passenger baggage, especially for noisy prompt gamma spectra obtained with short measurement times.
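
    A hedged sketch of applying factor analysis to noisy spectra, with synthetic Poisson counts standing in for NIPS prompt gamma spectra; the number of factors is arbitrary.

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        # Rows: repeated short-time spectra; columns: energy channels.
        spectra = np.random.default_rng(1).poisson(50, size=(40, 256)).astype(float)

        fa = FactorAnalysis(n_components=5, random_state=0)
        scores = fa.fit_transform(spectra)              # common-signal factors
        denoised = scores @ fa.components_ + fa.mean_   # reconstruction drops unique noise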

  4. Fitness Landscape Analysis for Optimum Multiuser Detection Problem

    Institute of Scientific and Technical Information of China (English)

    WANG Shaowei; ZHU Qiuping

    2007-01-01

    Optimum multiuser detection (OMD) for CDMA systems is an NP-complete combinatorial optimization problem. The fitness landscape has been proven very useful for understanding the behavior of combinatorial optimization algorithms and can help in predicting their performance. This paper analyzes the statistical properties of the fitness landscape of the OMD problem by performing autocorrelation analysis, fitness distance correlation tests and epistasis measurement. The analysis results explain why some random search algorithms are effective methods for the OMD problem and give hints on how to design more efficient randomized search heuristics for it.
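
    The autocorrelation analysis can be sketched as follows, assuming `fitness` holds objective values sampled along a single-bit-flip random walk over candidate detection vectors (a hypothetical input).

        import numpy as np

        def walk_autocorrelation(fitness, max_lag=20):
            # Autocorrelation of fitness along the walk; fast decay with lag
            # indicates a rugged OMD landscape, slow decay a smooth one.
            f = fitness - fitness.mean()
            denom = np.sum(f * f)
            return np.array([np.sum(f[:-k] * f[k:]) / denom
                             for k in range(1, max_lag + 1)])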

  5. Limits of detection in instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    Lower limits of detection (LLODs), frequently referred to simply as limits of detection and abbreviated as LODs, often appear in the literature of analytical chemistry - for numerous different methods of elemental and/or molecular analysis. In this chapter, one particular method of quantitative elemental analysis, that of instrumental neutron activation analysis (INAA), is the subject discussed, with reference to LODs. Particularly in the literature of neutron activation analysis (NAA), many tables of 'interference-free' NAA LODs are available. Not all of these are of much use, because (1) for many the definition used for LOD is not clear, or reasonable, (2) for many, the analysis conditions used are not clearly specified, and (3) for many, the analysis conditions used are specified, but not very practicable for most laboratories. For NAA work, such tables of interference-free LODs are, in any case, only applicable to samples in which, at the time of counting, only one radionuclide is present to any significant extent in the activated sample. It is important to note that tables of INAA LODs, per se, do not exist - since the LOD for a given element, under stated analysis conditions, can vary by orders of magnitude, depending on the elemental composition of the matrix in which it is present. For any given element, its INAA LOD will always be as large as, and usually much larger than, its tabulated 'interference-free' NAA LOD - how much larger depending upon the elemental composition of the matrix in which it is present. As discussed in this chapter, however, an INAA computer program exists that can calculate realistic INAA LODs for any elements of interest, in any kind of specified sample matrix, under any given set of analysis conditions

  6. Power Load Event Detection and Classification Based on Edge Symbol Analysis and Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Lei Jiang

    2012-01-01

    Energy signature analysis of power appliances is the core of nonintrusive load monitoring (NILM), where detailed data on the appliances used in a house are obtained by analyzing changes in the voltage and current. This paper focuses on developing automatic power load event detection and appliance classification based on machine learning. For power load event detection, the paper presents a new transient detection algorithm. By analyzing turn-on and turn-off transient waveforms, it can accurately detect the edge point when a device is switched on or off. The proposed load classification technique can identify different power appliances with improved recognition accuracy and computational speed. The load classification method is composed of two processes: frequency feature analysis and a support vector machine. The experimental results indicate that incorporating the new edge detection and turn-on/turn-off transient signature analysis into NILM reveals more information than traditional NILM methods. The load classification method achieved a recognition rate of more than ninety percent.
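
    To make the classification stage concrete, here is a minimal sketch of the frequency-feature-plus-SVM idea using scikit-learn. The feature matrix is synthetic; real NILM features would be harmonic magnitudes or similar quantities extracted from measured voltage and current waveforms, and the class count, feature dimension and SVM hyperparameters below are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical training set: each row is a frequency-feature vector
# for one switching event; labels identify the appliance.
n_appliances, n_events, n_features = 4, 50, 8
X = np.vstack([rng.normal(loc=k, scale=0.5, size=(n_events, n_features))
               for k in range(n_appliances)])
y = np.repeat(np.arange(n_appliances), n_events)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_tr, y_tr)
print("recognition rate:", clf.score(X_te, y_te))
```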

  7. Two stages of parafoveal processing during reading: Evidence from a display change detection task.

    Science.gov (United States)

    Angele, Bernhard; Slattery, Timothy J; Rayner, Keith

    2016-08-01

    We used a display change detection paradigm (Slattery, Angele, & Rayner, Journal of Experimental Psychology: Human Perception and Performance, 37, 1924-1938, 2011) to investigate whether display change detection uses orthographic regularity and whether detection is affected by the processing difficulty of the word preceding the boundary that triggers the display change. Subjects were significantly more sensitive to display changes when the change was from a nonwordlike preview than when the change was from a wordlike preview, but the preview benefit effect on the target word was not affected by whether the preview was wordlike or nonwordlike. Additionally, we did not find any influence of preboundary word frequency on display change detection performance. Our results suggest that display change detection and lexical processing do not use the same cognitive mechanisms. We propose that parafoveal processing takes place in two stages: an early, orthography-based, preattentional stage, and a late, attention-dependent lexical access stage. PMID:26769246

  8. Faulty Sensor Detection and Reconstruction for a PVC Making Process

    Institute of Scientific and Technical Information of China (English)

    LI Yuan; ZHOU Dong-hua; XIE Zhi; S. Joe Qin

    2004-01-01

    Based on principal component analysis, this paper presents an application of faulty sensor detection and reconstruction in a batch process, the polyvinyl chloride (PVC) making process. To deal with inconsistency in the process data, it is proposed to first synchronize the historical data using the dynamic time warping technique and then build a consistent multi-way principal component analysis model. Fault detection is carried out based on a squared prediction error (SPE) statistical control plot. By defining the principal component subspace, the residual subspace and a sensor validity index, a faulty sensor can be reconstructed and identified along the fault direction. Finally, application results are illustrated in detail using real data from an industrial PVC making process.
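
    The SPE-based detection step can be sketched compactly. The snippet below is a minimal illustration, not the paper's model: the data are synthetic, the control limit is taken as an empirical percentile rather than the analytical SPE limit, and a real application would first unfold and time-warp the multi-way batch data as described above.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Hypothetical steady-state data: rows are observations, columns
# are correlated sensors.
X_normal = rng.normal(size=(200, 6)) @ rng.normal(size=(6, 6))
pca = PCA(n_components=3).fit(X_normal)

def spe(x):
    # Squared prediction error (Q statistic): squared distance of a
    # sample from the principal-component subspace.
    x_hat = pca.inverse_transform(pca.transform(x.reshape(1, -1)))
    return float(np.sum((x - x_hat.ravel()) ** 2))

# Empirical 99th-percentile control limit (illustrative choice).
limit = np.percentile([spe(row) for row in X_normal], 99)

x_faulty = X_normal[0].copy()
x_faulty[2] += 8.0            # simulate a biased sensor reading
print(spe(x_faulty) > limit)  # True -> sample flagged as faulty
```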

  9. Compositional analysis of polycrystalline hafnium oxide thin films by heavy-ion elastic recoil detection analysis

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, F.L. [Departamento de Electronica y Tecnologia de Computadoras, Universidad Politecnica de Cartagena, Campus Universitario Muralla del Mar, E-30202 Cartagena (Spain)]. E-mail: Felix.Martinez@upct.es; Toledano, M. [Departamento de Fisica Aplicada III, Universidad Complutense de Madrid, E-28025 Madrid (Spain); San Andres, E. [Departamento de Fisica Aplicada III, Universidad Complutense de Madrid, E-28025 Madrid (Spain); Martil, I. [Departamento de Fisica Aplicada III, Universidad Complutense de Madrid, E-28025 Madrid (Spain); Gonzalez-Diaz, G. [Departamento de Fisica Aplicada III, Universidad Complutense de Madrid, E-28025 Madrid (Spain); Bohne, W. [Hahn-Meitner-Institut Berlin, Abteilung SF-4, D-14109 Berlin (Germany); Roehrich, J. [Hahn-Meitner-Institut Berlin, Abteilung SF-4, D-14109 Berlin (Germany); Strub, E. [Hahn-Meitner-Institut Berlin, Abteilung SF-4, D-14109 Berlin (Germany)

    2006-10-25

    The composition of polycrystalline hafnium oxide thin films has been measured by heavy-ion elastic recoil detection analysis (HI-ERDA). The films were deposited by high-pressure reactive sputtering (HPRS) on silicon wafers using an oxygen plasma at pressures between 0.8 and 1.6 mbar and during deposition times between 0.5 and 3.0 h. Hydrogen was found to be the main impurity and its concentration increased with deposition pressure. The composition was always slightly oxygen-rich, which is attributed to the oxygen plasma. Additionally, an interfacial silicon oxide thin layer was detected and taken into account. The thickness of the hafnium oxide film was found to increase linearly with deposition time and to decrease exponentially with deposition pressure, whereas the thickness of the silicon oxide interfacial layer has a minimum as a function of pressure at around 1.2 mbar and increases slightly as a function of time. The measurements confirmed that this interfacial layer is formed mainly during the early stages of the deposition process.

  10. [Research on a non-invasive pulse wave detection and analysis system].

    Science.gov (United States)

    Li, Ting; Yu, Gang

    2008-10-01

    A novel non-invasive pulse wave detection and analysis system has been developed, including both software and hardware. Two-channel signals can be acquired, stored and displayed on screen dynamically in real time. Pulse waves can be redisplayed and printed after pulse wave analysis and pulse wave velocity analysis. The system comprises a computer designed for fast data storage, analysis and processing, and a portable data-sampling unit based on a single-chip microcontroller. Experimental results have shown that the system is stable and easy to use, and that the parameters are calculated accurately. PMID:19024446

  11. Risk Analysis for Nonthermal process interventions

    Science.gov (United States)

    Over the last few years a number of nonthermal process interventions including ionizing radiation and ultraviolet light, high pressure processing, pulsed-electric and radiofrequency electric fields, microwave and infrared technologies, bacteriophages, etc. have been approved by regulatory agencies, ...

  12. The early component of middle latency auditory-evoked potentials in the process of deviance detection.

    Science.gov (United States)

    Li, Linfeng; Gong, Qin

    2016-07-01

    The aim of the present study was to investigate both the encoding mechanism and the process of deviance detection when deviant stimuli were presented in various patterns in an environment featuring repetitive sounds. In adults with normal hearing, middle latency responses were recorded within an oddball paradigm containing complex tones or speech sounds, wherein deviant stimuli featured different change patterns. For both complex tones and speech sounds, the Na and Pa components of middle latency responses showed an increase in the mean amplitude and a reduction in latency when comparing rare deviant stimuli with repetitive standard stimuli in a stimulation block. However, deviant stimuli with a rising frequency induced signals with smaller amplitudes than other deviant stimuli. The present findings indicate that deviant stimuli with different change patterns induce differing responses in the primary auditory cortex. In addition, the Pa components of speech sounds typically feature a longer latency and similar mean amplitude compared with complex tones, which suggests that the auditory system requires more complex processing for the analysis of speech sounds before processing in the auditory cortex. PMID:27203294

  13. Multidimensional data modeling for business process analysis

    OpenAIRE

    Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.

    2007-01-01

    The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering the business process modeling in conformity with the multidimensional data model. Since the business process and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward solution to converging these models.

  14. Multidimensional Data Modeling for Business Process Analysis

    Science.gov (United States)

    Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.

    The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering the business process modeling in conformity with the multidimensional data model. Since the business process and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward solution to converging these models.

  15. Second order analysis for spatial Hawkes processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Torrisi, Giovanni Luca

    We derive summary statistics for stationary Hawkes processes which can be considered as spatial versions of classical Hawkes processes. Particularly, we derive the intensity, the pair correlation function and the Bartlett spectrum. Our results for Gaussian fertility rates and the extension to marked Hawkes processes are discussed.
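
    For orientation, the classical (purely temporal) Hawkes process that these spatial versions generalize is usually specified through its conditional intensity; the exponential fertility rate below is one standard choice, written as illustrative notation rather than the paper's own formulation.

```latex
% Conditional intensity of a classical Hawkes process with an
% exponential fertility rate (illustrative choice):
\lambda(t) = \mu + \sum_{t_i < t} \alpha\, e^{-\beta (t - t_i)},
\qquad \mu > 0,\; 0 < \alpha < \beta .
% With branching ratio \alpha/\beta < 1 the process is stationary,
% with mean intensity
\bar{\lambda} = \frac{\mu}{1 - \alpha/\beta}.
```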

  16. The Development of Manufacturing Process Analysis: Lesson Learned from Process Mining

    Directory of Open Access Journals (Sweden)

    Bernardo Nugroho Yahya

    2014-01-01

    Process analysis is recognized as a major stage in business process reengineering and has developed over the last two decades. In the field of manufacturing, manufacturing process analysis (MPA) is defined as performance analysis of the production process. Performance analysis distills data and knowledge into useful forms that can be broadly applied across manufacturing sectors. Process mining, an emerging tool focusing on the process and resource perspectives, is a way to analyze a system based on its event log. The objective of this study is to extend the existing process analysis framework by considering the attribute perspective. This study also aims to draw lessons from experiences with process mining in manufacturing industries. The results of this study will help manufacturing organizations utilize the process mining approach to analyze their respective processes.
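
    As a minimal illustration of what process mining extracts from an event log, the sketch below counts directly-follows relations, the basic building block many discovery algorithms start from. The toy log, case ids and activity names are invented; real logs would come from XES/CSV exports of a manufacturing execution system.

```python
from collections import Counter, defaultdict

# Hypothetical event log: (case id, activity) pairs, already sorted
# by timestamp within each case.
log = [("c1", "cast"), ("c1", "mill"), ("c1", "inspect"),
       ("c2", "cast"), ("c2", "inspect"), ("c2", "mill"),
       ("c3", "cast"), ("c3", "mill"), ("c3", "inspect")]

traces = defaultdict(list)
for case, activity in log:
    traces[case].append(activity)

# Count directly-follows relations across all cases.
dfg = Counter()
for activities in traces.values():
    for a, b in zip(activities, activities[1:]):
        dfg[(a, b)] += 1

for (a, b), n in sorted(dfg.items()):
    print(f"{a} -> {b}: {n}")
```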

  17. HIGH RESOLUTION RESISTIVITY LEAK DETECTION DATA PROCESSING and EVALUATION METHODS and REQUIREMENTS

    International Nuclear Information System (INIS)

    This document has two purposes: (1) describe how data generated by High Resolution Resistivity (HRR) leak detection (LD) systems deployed during single-shell tank (SST) waste retrieval operations are processed and evaluated, and (2) provide the basic review requirements for HRR data when HRR is deployed as a leak detection method during SST waste retrievals.

  18. Infrared processing and sensor fusion for anti-personnel land-mine detection

    NARCIS (Netherlands)

    Schavemaker, J.G.M.; Cremer, F.; Schutte, K.; Breejen, E. den

    2000-01-01

    In this paper we present the results of infrared processing and sensor fusion obtained within the European research project GEODE (Ground Explosive Ordnance DEtection), which strives for the realization of a vehicle-mounted, multi-sensor anti-personnel land-mine detection system for humanitarian demining.

  19. MULTISPECTRAL DETECTION OF ORGANIC RESIDUES ON POULTRY PROCESSING PLANT EQUIPMENT BASED ON HYPERSPECTRAL REFLECTANCE IMAGING TECHNIQUE

    Science.gov (United States)

    Diluted organic residues, such as feces, ingesta and other biological substances on poultry processing plant equipment surfaces, not easily discernible by human eye, are potential contamination sources for poultry carcasses. Development of sensitive detection methods for diluted organic residues is ...

  20. Managing Analysis Models in the Design Process

    Science.gov (United States)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  1. Cross-disciplinary detection and analysis of network motifs.

    Science.gov (United States)

    Tran, Ngoc Tam L; DeLuccia, Luke; McDonald, Aidan F; Huang, Chun-Hsi

    2015-01-01

    The detection of network motifs has recently become an important part of network analysis across all disciplines. In this work, we detected and analyzed network motifs from undirected and directed networks of several different disciplines, including biological, social and ecological networks, as well as other networks such as airline, power grid, and co-purchase of political books networks. Our analysis revealed that undirected networks are similar at the level of basic three- and four-node motifs, while the analysis of directed networks revealed distinctions between networks of different disciplines. The study showed that larger motifs contain the three-node motif as a subgraph. Topological analysis revealed that similar networks have similar small motifs, but as the motif size increases, differences arise. The Pearson correlation coefficient showed a strong positive relationship between some undirected networks but an inverse relationship between some directed networks. The study suggests that the three-node motif is a building block of larger motifs, and that undirected networks share similar low-level structures. Moreover, similar networks share similar small motifs, but larger motifs define the unique structure of individual networks. The Pearson correlation coefficient suggests that the protein structure networks, the dolphin social network, and the co-authorships in network science network belong to a superfamily. In addition, the yeast protein-protein interaction network, the primary school contact network, Zachary's karate club network, and the co-purchase of political books network can be classified into a superfamily. PMID:25983553
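
    A minimal sketch of three-node motif counting on an undirected graph, using networkx's built-in karate club network as a stand-in for the networks analyzed above. Exhaustive enumeration is fine at this size; dedicated motif tools use smarter search for larger graphs and motif sizes.

```python
import networkx as nx
from itertools import combinations

# Small benchmark network standing in for the study's datasets.
G = nx.karate_club_graph()

# Three-node motifs in an undirected graph: connected triples are
# either open paths (2 edges) or triangles (3 edges).
triangles, open_paths = 0, 0
for nodes in combinations(G.nodes, 3):
    e = G.subgraph(nodes).number_of_edges()
    if e == 3:
        triangles += 1
    elif e == 2:
        open_paths += 1

print("triangles:", triangles, "open paths:", open_paths)
```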

  2. A New MANET Wormhole Detection Algorithm Based on Traversal Time and Hop Count Analysis

    Directory of Open Access Journals (Sweden)

    Göran Pulkkis

    2011-11-01

    As demand increases for ubiquitous network facilities, infrastructure-less and self-configuring systems like Mobile Ad hoc Networks (MANET) are gaining popularity. MANET routing security, however, is one of the most significant challenges to wide scale adoption, with wormhole attacks being an especially severe MANET routing threat. This is because wormholes are able to disrupt a major component of network traffic, while concomitantly being extremely difficult to detect. This paper introduces a new wormhole detection paradigm based upon Traversal Time and Hop Count Analysis (TTHCA), which, in comparison to existing algorithms, consistently affords superior detection performance, allied with low false positive rates for all wormhole variants. Simulation results confirm that the TTHCA model exhibits robust wormhole route detection in various network scenarios, while incurring only a small network overhead. This feature makes TTHCA an attractive choice for MANET environments, which generally comprise devices, such as wireless sensors, with limited processing capability.
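
    The core intuition is easy to sketch: a wormhole tunnel hides hops, so the reported hop count implies an implausible per-hop delay. The snippet below is a toy illustration of that idea only; the thresholds are invented, and the actual TTHCA protocol measures packet traversal times per hop rather than a plain round-trip time.

```python
# Illustrative per-hop delay check (all numbers are assumptions).
EXPECTED_PER_HOP_MS = 2.0   # assumed typical one-hop latency
TOLERANCE = 3.0             # allowed multiple of the expected delay

def suspicious_route(rtt_ms: float, hop_count: int) -> bool:
    # One-way per-hop delay inferred from the round-trip time; a
    # wormhole tunnel inflates it well beyond the expected value.
    per_hop = rtt_ms / (2 * hop_count)
    return per_hop > EXPECTED_PER_HOP_MS * TOLERANCE

print(suspicious_route(rtt_ms=16.0, hop_count=4))  # False: ~2 ms/hop
print(suspicious_route(rtt_ms=80.0, hop_count=3))  # True: wormhole-like
```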

  3. A New MANET wormhole detection algorithm based on traversal time and hop count analysis.

    Science.gov (United States)

    Karlsson, Jonny; Dooley, Laurence S; Pulkkis, Göran

    2011-01-01

    As demand increases for ubiquitous network facilities, infrastructure-less and self-configuring systems like Mobile Ad hoc Networks (MANET) are gaining popularity. MANET routing security, however, is one of the most significant challenges to wide scale adoption, with wormhole attacks being an especially severe MANET routing threat. This is because wormholes are able to disrupt a major component of network traffic, while concomitantly being extremely difficult to detect. This paper introduces a new wormhole detection paradigm based upon Traversal Time and Hop Count Analysis (TTHCA), which, in comparison to existing algorithms, consistently affords superior detection performance, allied with low false positive rates for all wormhole variants. Simulation results confirm that the TTHCA model exhibits robust wormhole route detection in various network scenarios, while incurring only a small network overhead. This feature makes TTHCA an attractive choice for MANET environments, which generally comprise devices, such as wireless sensors, with limited processing capability. PMID:22247657

  4. The automation of analysis of technological process effectiveness

    Directory of Open Access Journals (Sweden)

    B. Krupińska

    2007-10-01

    Purpose: Improvement of technological processes through technological efficiency analysis can form the basis of their optimization. Informatization and computerization of an ever wider scope of activity is one of the most important current development trends in enterprises. Design/methodology/approach: Defining indicators makes it possible to evaluate process efficiency, which can form the basis for optimizing particular operations. The model of technological efficiency analysis is based on efficiency indicators that characterize an operation with respect to the following criteria: operation - material, operation - machine, operation - human, operation - technological parameters. Findings: Comprehensive assessment of technological processes, from the point of view of quality and correctness of the chosen technology, forms the basis of technological efficiency analysis. The results of the technological efficiency analysis prove that the chosen model makes it possible to improve the process continuously, and that computer assistance makes it possible to automate the efficiency analysis and thereby achieve controlled improvement of technological processes. Practical implications: Owing to the complexity of technological efficiency analysis, an AEPT computer analysis was created, which yields operation efficiency indicators (with minimal acceptable values distinguished), efficiency values of the applied samples, and the overall technological process efficiency. Originality/value: The created computer analysis of technological process efficiency (AEPT) makes it possible to automate the process of analysis and optimization.

  5. Detection and quantification of waterborne microorganisms using an image cytometer based on angular spatial frequency processing

    CERN Document Server

    Pérez, Juan Miguel; Martínez, Pedro; Pruneri, Valerio

    2015-01-01

    We introduce a new image cytometer design for the detection of very small particulates and demonstrate its capability in water analysis. The device is a compact microscope composed of off-the-shelf components, such as a light emitting diode (LED) source, a complementary metal-oxide-semiconductor (CMOS) image sensor, and a specific combination of optical lenses that allows, through appropriate software, Fourier transform processing of the sample volume. Waterborne microorganisms, such as Escherichia coli (E. coli), Legionella pneumophila (L. pneumophila) and phytoplankton, are detected by interrogating the sample volume either in a fluorescent or label-free mode, i.e. with or without fluorescein isothiocyanate (FITC) molecules attached to the microorganisms, respectively. We achieve a sensitivity of 50 CFU/ml, which can be further increased to 0.2 CFU/ml by pre-concentrating an initial sample volume of 500 ml with an ad hoc fluidic system. We also prove the capability of the proposed image cytometer of diffe...

  6. Articulating the Resources for Business Process Analysis and Design

    Science.gov (United States)

    Jin, Yulong

    2012-01-01

    Effective process analysis and modeling are important phases of the business process management lifecycle. When many activities and multiple resources are involved, it is very difficult to build a correct business process specification. This dissertation provides a resource perspective of business processes. It aims at a better process analysis…

  7. Effect of Using Automated Auditing Tools on Detecting Compliance Failures in Unmanaged Processes

    Science.gov (United States)

    Doganata, Yurdaer; Curbera, Francisco

    The effect of using automated auditing tools to detect compliance failures in unmanaged business processes is investigated. In the absence of a process execution engine, compliance of an unmanaged business process is tracked by using an auditing tool developed based on business provenance technology or employing auditors. Since budget constraints limit employing auditors to evaluate all process instances, a methodology is devised to use both expert opinion on a limited set of process instances and the results produced by fallible automated audit machines on all process instances. An improvement factor is defined based on the average number of non-compliant process instances detected and it is shown that the improvement depends on the prevalence of non-compliance in the process as well as the sensitivity and the specificity of the audit machine.

  8. A factor analysis to detect factors influencing building national brand

    Directory of Open Access Journals (Sweden)

    Naser Azad

    Developing a national brand is one of the most important issues in building a brand. In this study, we apply factor analysis to detect the most important factors in building a national brand. The proposed study uses factor analysis to extract the most influential factors; the sample was drawn from two major auto makers in Iran, Iran Khodro and Saipa. The questionnaire was designed on a Likert scale and distributed among 235 experts. Cronbach's alpha is 0.84, well above the minimum desirable limit of 0.70. The factor analysis yields six factors, including "cultural image of customers", "exciting characteristics", "competitive pricing strategies", "perception image" and "previous perceptions".

  9. Automated analysis for detecting beams in laser wakefield simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ushizima, Daniela M.; Rubel, Oliver; Prabhat, Mr.; Weber, Gunther H.; Bethel, E. Wes; Aragon, Cecilia R.; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Hamann, Bernd; Messmer, Peter; Hagen, Hans

    2008-07-03

    Laser wakefield particle accelerators have shown the potential to generate electric fields thousands of times higher than those of conventional accelerators. The resulting extremely short particle acceleration distance could yield a potential new compact source of energetic electrons and radiation, with wide applications from medicine to physics. Physicists investigate laser-plasma internal dynamics by running particle-in-cell simulations; however, this generates a large dataset that requires time-consuming, manual inspection by experts in order to detect key features such as beam formation. This paper describes a framework to automate the data analysis and classification of simulation data. First, we propose a new method to identify locations with high density of particles in the space-time domain, based on maximum extremum point detection on the particle distribution. We analyze high density electron regions using a lifetime diagram by organizing and pruning the maximum extrema as nodes in a minimum spanning tree. Second, we partition the multivariate data using fuzzy clustering to detect time steps in an experiment that may contain a high quality electron beam. Finally, we combine results from fuzzy clustering and bunch lifetime analysis to estimate spatially confined beams. We demonstrate our algorithms successfully on four different simulation datasets.
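
    The first step above, maximum-extremum detection on a particle density estimate, can be sketched in one dimension with SciPy. The synthetic density profile, the `order` window and the significance threshold below are illustrative assumptions, not the framework's actual parameters.

```python
import numpy as np
from scipy.signal import argrelextrema

rng = np.random.default_rng(2)

# Hypothetical 1D particle-density profile along the acceleration
# axis; real simulations yield multivariate space-time data.
x = np.linspace(0, 10, 500)
density = (np.exp(-((x - 3) ** 2) / 0.05) +
           0.6 * np.exp(-((x - 7) ** 2) / 0.1) +
           0.05 * rng.random(500))

# Local maxima over a +/-20-sample window, then a simple amplitude
# cut to drop noise-level extrema (threshold is illustrative).
peaks = argrelextrema(density, np.greater, order=20)[0]
peaks = peaks[density[peaks] > 0.3]
print(x[peaks])   # candidate high-density (beam) locations, near 3 and 7
```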

  10. Series Arc Fault Detection Algorithm Based on Autoregressive Bispectrum Analysis

    Directory of Open Access Journals (Sweden)

    Kai Yang

    2015-10-01

    Arc fault is one of the most critical causes of electrical fires. Due to the diversity, randomness and concealment of arc faults in low-voltage circuits, it is difficult for general methods to protect all loads from series arc faults. Analysis of many series arc faults shows that a large number of high-frequency signals are generated in the circuit. These signals are easily affected by Gaussian noise, which is difficult to eliminate because of frequency aliasing. Thus, a novel detection algorithm is developed in this paper to accurately detect series arc faults. First, an autoregressive model of the mixed high-frequency signals is built. Then, autoregressive bispectrum analysis is introduced to analyze common series arc fault features. This method preserves the phase information of the arc fault signal and effectively suppresses the influence of Gaussian noise. Afterwards, several features, including characteristic frequency, fluctuation of phase angles, diffused distribution and incremental numbers of bispectrum peaks, are extracted for recognizing arc faults. Finally, a least squares support vector machine is used to accurately identify series arc faults from the load states based on these bispectrum frequency features. The validity of the algorithm is verified experimentally, with an arc fault detection rate above 97%.
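
    For reference, a hedged sketch of the parametric bispectrum such an algorithm rests on, under the assumption of a stationary AR(p) model x_n = sum_k a_k x_{n-k} + e_n driven by non-Gaussian noise with third moment mu_3 = E[e_n^3] (normalization constants vary with the Fourier convention used):

```latex
% AR-model bispectrum (up to a convention-dependent constant);
% H is the AR transfer function. The bispectrum of Gaussian noise
% vanishes identically, which is why this analysis suppresses
% Gaussian interference while preserving phase information.
B(\omega_1, \omega_2) = \mu_3\, H(\omega_1)\, H(\omega_2)\, H^{*}(\omega_1 + \omega_2),
\qquad
H(\omega) = \Bigl(1 - \sum_{k=1}^{p} a_k e^{-jk\omega}\Bigr)^{-1}
```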

  11. Automated analysis for detecting beams in laser wakefield simulations

    International Nuclear Information System (INIS)

    Laser wakefield particle accelerators have shown the potential to generate electric fields thousands of times higher than those of conventional accelerators. The resulting extremely short particle acceleration distance could yield a potential new compact source of energetic electrons and radiation, with wide applications from medicine to physics. Physicists investigate laser-plasma internal dynamics by running particle-in-cell simulations; however, this generates a large dataset that requires time-consuming, manual inspection by experts in order to detect key features such as beam formation. This paper describes a framework to automate the data analysis and classification of simulation data. First, we propose a new method to identify locations with high density of particles in the space-time domain, based on maximum extremum point detection on the particle distribution. We analyze high density electron regions using a lifetime diagram by organizing and pruning the maximum extrema as nodes in a minimum spanning tree. Second, we partition the multivariate data using fuzzy clustering to detect time steps in an experiment that may contain a high quality electron beam. Finally, we combine results from fuzzy clustering and bunch lifetime analysis to estimate spatially confined beams. We demonstrate our algorithms successfully on four different simulation datasets.

  12. RADIA: RNA and DNA integrated analysis for somatic mutation detection.

    Directory of Open Access Journals (Sweden)

    Amie J Radenbaugh

    The detection of somatic single nucleotide variants is a crucial component to the characterization of the cancer genome. Mutation calling algorithms thus far have focused on comparing the normal and tumor genomes from the same individual. In recent years, it has become routine for projects like The Cancer Genome Atlas (TCGA) to also sequence the tumor RNA. Here we present RADIA (RNA and DNA Integrated Analysis), a novel computational method combining the patient-matched normal and tumor DNA with the tumor RNA to detect somatic mutations. The inclusion of the RNA increases the power to detect somatic mutations, especially at low DNA allelic frequencies. By integrating an individual's DNA and RNA, we are able to detect mutations that would otherwise be missed by traditional algorithms that examine only the DNA. We demonstrate high sensitivity (84%) and very high precision (98% and 99%) for RADIA in patient data from endometrial carcinoma and lung adenocarcinoma from TCGA. Mutations with both high DNA and RNA read support have the highest validation rate of over 99%. We also introduce a simulation package that spikes in artificial mutations to patient data, rather than simulating sequencing data from a reference genome. We evaluate sensitivity on the simulation data and demonstrate our ability to rescue back mutations at low DNA allelic frequencies by including the RNA. Finally, we highlight mutations in important cancer genes that were rescued due to the incorporation of the RNA.

  13. RADIA: RNA and DNA integrated analysis for somatic mutation detection.

    Science.gov (United States)

    Radenbaugh, Amie J; Ma, Singer; Ewing, Adam; Stuart, Joshua M; Collisson, Eric A; Zhu, Jingchun; Haussler, David

    2014-01-01

    The detection of somatic single nucleotide variants is a crucial component to the characterization of the cancer genome. Mutation calling algorithms thus far have focused on comparing the normal and tumor genomes from the same individual. In recent years, it has become routine for projects like The Cancer Genome Atlas (TCGA) to also sequence the tumor RNA. Here we present RADIA (RNA and DNA Integrated Analysis), a novel computational method combining the patient-matched normal and tumor DNA with the tumor RNA to detect somatic mutations. The inclusion of the RNA increases the power to detect somatic mutations, especially at low DNA allelic frequencies. By integrating an individual's DNA and RNA, we are able to detect mutations that would otherwise be missed by traditional algorithms that examine only the DNA. We demonstrate high sensitivity (84%) and very high precision (98% and 99%) for RADIA in patient data from endometrial carcinoma and lung adenocarcinoma from TCGA. Mutations with both high DNA and RNA read support have the highest validation rate of over 99%. We also introduce a simulation package that spikes in artificial mutations to patient data, rather than simulating sequencing data from a reference genome. We evaluate sensitivity on the simulation data and demonstrate our ability to rescue back mutations at low DNA allelic frequencies by including the RNA. Finally, we highlight mutations in important cancer genes that were rescued due to the incorporation of the RNA. PMID:25405470

  14. Fissile Material Detection by Differential Die Away Analysis

    Science.gov (United States)

    Shaw, Timothy J.; Strellis, Dan A.; Stevenson, John; Keeley, Doug; Gozani, Tsahi

    2009-03-01

    Detection and interdiction of Special Nuclear Material (SNM) in transportation is one of the most critical security issues facing the United States. Active inspection by inducing fission in fissile nuclear materials, such as 235U and 239Pu, provides several strong and unique signatures that make the detection of concealed nuclear materials technically very feasible. Differential Die-Away Analysis (DDAA) is a very efficient, active neutron-based technique that uses the abundant prompt fission neutrons signature. It benefits from high penetrability of the probing and signature neutrons, high fission cross section, high detection sensitivity, ease of deployment and relatively low cost. DDAA can use any neutron source or energy as long as it can be suitably pulsed. The neutron generator produces pulses of neutrons that are directed into a cargo. As each pulse passes through the cargo, the neutrons are thermalized and absorbed. If SNM is present, the thermalized neutrons create a new source of (fission) neutrons with a distinctive time profile. An efficient laboratory system was designed, fabricated and tested under a US Government DHS DNDO contract. It was shown that a small uranium sample can be detected in a large variety of cargo types and configurations within practical measurement times using commercial compact (d,T) sources. Using stronger sources and wider detector distribution will further cut inspection time. The system can validate or clear alarms from a primary inspection system such as an automated x-ray system.
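
    A hedged way to picture the DDAA signature (illustrative notation only; the actual system calibration is more involved): after each generator pulse, the fast-neutron detector rate can be modelled as a prompt system die-away plus, when fissile material is present, a slower fission-driven component that follows the thermal flux in the cargo:

```latex
% Illustrative two-component die-away model for the detector rate
% after a pulse; \tau_{th} (thermal-flux die-away in the cargo) is
% assumed longer than \tau_{sys} (prompt system response).
S(t) = A\, e^{-t/\tau_{\mathrm{sys}}} + B\, e^{-t/\tau_{\mathrm{th}}},
\qquad \tau_{\mathrm{th}} > \tau_{\mathrm{sys}}
```

    Under this picture, the detection decision reduces to testing for a statistically significant slow component B in a late time window, after the prompt response has died away.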

  15. Analysis of the theoretical bias in dark matter direct detection

    International Nuclear Information System (INIS)

    Fitting the model "A" to dark matter direct detection data, when the model that underlies the data is "B", introduces a theoretical bias in the fit. We perform a quantitative study of the theoretical bias in dark matter direct detection, with a focus on assumptions regarding the dark matter interactions and velocity distribution. We address this problem within the effective theory of isoscalar dark matter-nucleon interactions mediated by a heavy spin-1 or spin-0 particle. We analyze 24 benchmark points in the parameter space of the theory, using frequentist and Bayesian statistical methods. First, we simulate the data of future direct detection experiments assuming a momentum/velocity dependent dark matter-nucleon interaction, and an anisotropic dark matter velocity distribution. Then, we fit a constant scattering cross section, and an isotropic Maxwell-Boltzmann velocity distribution to the simulated data, thereby introducing a bias in the analysis. The best fit values of the dark matter particle mass differ from their benchmark values up to 2 standard deviations. The best fit values of the dark matter-nucleon coupling constant differ from their benchmark values up to several standard deviations. We conclude that common assumptions in dark matter direct detection are a source of potentially significant bias.

  16. Detection of irradiated chicken by 2-alkylcyclobutanone analysis

    International Nuclear Information System (INIS)

    Chicken meat irradiated at 0.5 kGy or higher doses was identified by a GC/MS method analyzing 2-dodecylcyclobutanone (2-DCB) and 2-tetradecylcyclobutanone (2-TCB), which are formed from palmitic acid and stearic acid, respectively, and isolated using Soxhlet extraction followed by Florisil chromatography. Many fat-containing foods have oleic acid in abundance as a parent fatty acid, and chicken meat contains palmitoleic acid in amounts comparable to stearic acid. In this study, we detected 2-tetradec-5'-enylcyclobutanone (2-TeCB) and 2-dodec-5'-enylcyclobutanone (2-DeCB) in chicken meat, which are formed by irradiation from oleic acid and palmitoleic acid, respectively, using the GC/MS method. Sensitivity in the detection of both 2-TeCB and 2-DeCB was lower than that of 2-DCB. However, at least 0.57 μg/g fat of 2-TeCB was detected in chicken meat irradiated at 0.5 kGy, so 2-TeCB seems to be a useful marker for the identification of irradiated fat-containing foods. In contrast, 2-DeCB was not detected clearly at low doses, which suggests that 2-DeCB may be a useful marker for irradiated fat only in foods containing enough palmitoleic acid for analysis. In addition, 2-tetradecadienylcyclobutanone, which is formed from linoleic acid, was also found in chicken meat. (author)

  17. Modeling and anomalous cluster detection for point processes using process convolutions

    OpenAIRE

    Liang, WWJ; Colvin, JB; Sansó, B; Lee, HKH

    2014-01-01

    We present a model using process convolutions, which describes spatial and temporal variations of the intensity of events that occur at random geographical locations. An inhomogeneous Poisson process is used to model the intensity over a spatial region with multiplicative spatial and temporal covariate effects. Temporal variation in the structure of the intensity is obtained by employing a time-varying process for the convolution. Use of a compactly supported kernel in the convolution improves...

  18. In-Process Detection of Weld Defects Using Laser-Based Ultrasonic Lamb Waves

    Energy Technology Data Exchange (ETDEWEB)

    Kercel, S W

    2001-01-04

    Laser-based ultrasonic (LBU) measurement shows great promise for on-line monitoring of weld quality in tailor-welded blanks. Tailor-welded blanks are steel blanks made from plates of differing thickness and/or properties butt-welded together; they are used in automobile manufacturing to produce body, frame, and closure panels. LBU uses a pulsed laser to generate the ultrasound and a continuous wave (CW) laser interferometer to detect the ultrasound at the point of interrogation to perform ultrasonic inspection. LBU enables in-process measurements since there is no sensor contact or near-contact with the workpiece. The authors have used laser-generated plate (Lamb) waves to propagate from one plate into the weld nugget as a means of detecting defects. This report recounts an investigation of a number of inspection architectures based on processing of signals from selected plate waves, which are either reflected from or transmitted through the weld zone. Bayesian parameter estimation and wavelet analysis (both continuous and discrete) have shown that the LBU time-series signal is readily separable into components that provide distinguishing features, which describe weld quality. The authors anticipate that, in an on-line industrial application, these measurements can be implemented just downstream from the weld cell. Then the weld quality data can be fed back to control critical weld parameters or alert the operator of a problem requiring maintenance. Internal weld defects and deviations from the desired surface profile can then be corrected before defective parts are produced. The major conclusions of this study are as follows. Bayesian parameter estimation is able to separate entangled Lamb wave modes. Pattern recognition algorithms applied to Lamb mode features have produced robust features for distinguishing between several types of weld defects. In other words, the information is present in the output of the laser ultrasonic hardware, and it is feasible to
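
    As a flavor of the discrete wavelet analysis mentioned above, here is a minimal sketch that decomposes a synthetic ultrasonic-style trace with PyWavelets and computes per-band detail energies of the kind often fed to classifiers. The signal, the wavelet choice ('db4') and the level count are illustrative assumptions, not the study's actual processing chain.

```python
import numpy as np
import pywt

rng = np.random.default_rng(3)

# Hypothetical LBU time series: a smooth Lamb-wave-like tone plus a
# short transient standing in for a defect echo, plus noise.
t = np.linspace(0, 1, 1024)
signal = np.sin(2 * np.pi * 40 * t)
signal[512:520] += 1.5                      # injected transient
signal += 0.1 * rng.normal(size=t.size)

# Discrete wavelet decomposition; per-level detail energies are
# simple distinguishing features for weld-quality classification.
coeffs = pywt.wavedec(signal, "db4", level=5)
energies = [float(np.sum(c ** 2)) for c in coeffs[1:]]  # detail bands
print(energies)
```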

  19. A User Requirements Analysis Approach Based on Business Processes

    Institute of Scientific and Technical Information of China (English)

    ZHENG Yue-bin; HAN Wen-xiu

    2001-01-01

    Requirements analysis is the most important phase of information system development. Existing requirements analysis techniques pay little or no attention to the features of different business processes. This paper presents a user requirements analysis approach which focuses on business processes in the early stage of requirements analysis. It also gives an example of the use of this approach in the analysis of an enterprise information system.

  20. Economic analysis of thermal solvent processes

    International Nuclear Information System (INIS)

    Vapour extraction (VAPEX) uses horizontal well pairs and a gaseous solvent to mobilize the oil. Hybrid solvent processes inject a light hydrocarbon solvent in addition to sufficient amounts of steam to vaporize the solvent. This paper reviewed various laboratory model experiments that evaluated VAPEX and solvent-based processes for the recovery of heavy oil or bitumen. The project compared a VAPEX process, a thermal solvent reflux process and a hybrid-solvent SAGD process using scaled laboratory models. Several experimental models were used. The first high-pressure thermal solvent experiment was conducted with a laboratory model designed to scale a 20 m thick Burnt Lake reservoir. Propane was used as the solvent. The second sequence of experiments scaled a range of processes from VAPEX to hybrid solvents for an Athabasca bitumen reservoir using a sealed can type of model confined by a gaseous overburden with propane as the solvent. The third experiment was a hybrid solvent experiment in which propane and steam were injected simultaneously into the injector well. The final experiment was a propane-steam hybrid experiment at a higher steam injection rate. The aim of the study was to evaluate the processes, build a database of experimental performance and to determine whether any single process had a significant economic advantage. It was concluded that the lowest cost process for Athabasca bitumen was the thermal solvent hybrid process followed by low pressure SAGD. The thermal solvent experiment using hot propane injection recovered heavy oil at costs competitive to SAGD. Many of the experiments suggested a process life longer than 15 years, as the high viscosity of Athabasca bitumen and the resulting low diffusivity resulted in a slower oil recovery process. 5 refs., 3 tabs., 16 figs

  1. Detection of short circuit in pulse gas metal arc welding process

    Directory of Open Access Journals (Sweden)

    P.K.D.V. Yarlagadda

    2007-09-01

    Purpose: The paper discusses several methods of detecting the occurrence of short circuits and their severity in the pulse gas metal arc welding process (GMAW-P). Design/methodology/approach: Welding experiments with different values of the pulsing parameters, with simultaneous recording of high speed camera pictures and welding signals (current and voltage), were used to identify the occurrence of short circuits and their severity in the GMAW-P process. The investigation is based on the measurement of welding signals, specifically current and voltage, and their synchronization with a high speed camera to investigate the short circuit phenomenon in the GMAW-P process. Findings: The results reveal that short circuits can be detected using signal processing techniques and their severity can be predicted using statistical models and artificial intelligence techniques in the GMAW-P process. Research limitations/implications: Several factors are responsible for short circuit occurrence in the GMAW-P process. The results show that the voltage and current signals carry rich information about metal transfer, and especially about short circuit occurrence, in the GMAW-P process; hence it is possible to detect short circuit occurrence in this process. Future work should concentrate on developing advanced techniques to improve the reliability of the techniques mentioned in this paper for short circuit detection and prediction in the GMAW-P process. Originality/value: For achieving automation of welding processes, implementation of real time monitoring of weld quality is essential, especially for the GMAW-P process, which is widely used for lightweight metals and is gaining popularity in the manufacturing industry. However, in the case of the GMAW-P process hardly any attempt has been made to analyse techniques to detect and predict the occurrence of short circuits. This paper analyses different techniques that can be employed for real time monitoring and prediction of short circuits and their severity in the

  2. Intelligence Intrusion Detection Prevention Systems using Object Oriented Analysis method

    Directory of Open Access Journals (Sweden)

    DR.K.KUPPUSAMY

    2010-12-01

    This paper is intended to provide a model for "Intelligence Intrusion Detection Prevention Systems using the Object Oriented Analysis method". It describes the state's overall requirements regarding the acquisition and implementation of intrusion prevention and detection systems with intelligence (IIPS/IIDS), and is designed to give those who may be responsible for acquiring, implementing or monitoring such systems a deeper understanding of intrusion prevention and detection principles and of the technology and strategies available. With the need for evolution, if not revolution, of current network architectures and the Internet, autonomous and spontaneous management will be a key feature of future networks and information systems. In this context, security is an essential property. It must be considered at the early stages of the conception of these systems and designed to be autonomous and spontaneous as well. Future networks and systems must be able to automatically configure themselves with respect to their security policies. The security policy specification must be dynamic and adapt itself to the changing environment. Those networks and systems should interoperate securely when their respective security policies are heterogeneous and possibly conflicting. They must be able to autonomously evaluate the impact of an intrusion in order to spontaneously select the appropriate and relevant response when a given intrusion is detected. Autonomous and spontaneous security is a major requirement of future networks and systems. It is crucial to address this issue in the different wireless and mobile technologies available today, such as RFID, WiFi, WiMAX, 3G, etc. Other technologies such as ad hoc and sensor networks, which introduce new types of services, share similar requirements for autonomous and spontaneous security management. Intelligence Intrusion Prevention Systems (IIPS) are designed to aid in preventing the

  3. Amorphous silicon batch process cost analysis

    International Nuclear Information System (INIS)

    This report describes the development of baseline manufacturing cost data to assist PVMaT monitoring teams in assessing current and future subcontracts, with an emphasis on commercialization and production. A process for the manufacture of a single-junction, large-area, a-Si module was modeled using an existing Research Triangle Institute (RTI) computer model. The model estimates a required, or breakeven, price for the module based on its production process and the financial structure of the company operating the process. Sufficient detail on cost drivers is presented so that the process features and business characteristics can be related to the estimated required price.

  4. Partial defect detection using a digital Cerenkov viewing device and image processing

    International Nuclear Information System (INIS)

    The digital Cerenkov viewing device (DCVD) is used by the inspectors of the International Atomic Energy Agency to non-intrusively verify long-cooled spent fuel. Collimated Cerenkov light between the fuel rods in spent fuel, detected when the instrument is moved a few centimetres off alignment, indicates the presence of fission products in the fuel assembly. The digital nature of the DCVD opens up the area of image processing. Image enhancement and noise reduction assist in the identification of fuel designs and give the potential to automatically detect missing and substituted fuel rods (partial defects). Image analysis programs such as Image Pro, MATLAB and LabVIEW were used to examine spent LWR fuel to detect characteristics of missing fuel rods, substituted fuel rods and partial-length fuel rods in a number of BWR fuel assemblies. One example is shown below in which MATLAB was used to measure the intensity of selected regions of interest. The example shows that the missing rods (identified as 2 and 3) clearly show higher counts than the immediately adjacent short-rod regions (1 and 4), indicating that more light is coming up from the full length of the assembly than the adjacent water regions where a partial length rod occupies much of the assembly length. Other areas of study included noise reduction by filtering, dark frame characterization, frame averaging/bad frame removal techniques and running average optimizations. Pattern (or object) recognition procedures were used to identify anomalous regions within a fuel assembly to enable highlighting of suspect rods in the assembly image. Graduated background intensities from near neighbours were also subtracted, pixel by pixel, from assemblies of interest. This paper provides results of this study, discusses the degree of success and indicates directions for future development
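
    The region-of-interest intensity comparison described above is straightforward to sketch with NumPy. The synthetic frame, ROI coordinates and decision threshold below are invented for illustration, not taken from the DCVD analysis.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical DCVD frame: brighter where Cerenkov light escapes
# through a water channel, darker where a full-length rod sits.
frame = rng.normal(loc=100.0, scale=5.0, size=(256, 256))
frame[100:120, 100:120] += 40.0   # simulated missing-rod region

def roi_mean(img, row, col, size):
    # Mean intensity of a square region of interest.
    return float(img[row:row + size, col:col + size].mean())

empty_rod = roi_mean(frame, 100, 100, 20)
neighbour = roi_mean(frame, 100, 140, 20)
# Illustrative decision: flag the ROI if it is much brighter than
# its neighbour (threshold chosen arbitrarily here).
print(empty_rod, neighbour, empty_rod - neighbour > 10.0)
```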

  5. The application of image processing to the detection of corrosion by radiography

    International Nuclear Information System (INIS)

    The computer processing of digitised radiographs has been investigated with a view to improving x-radiography as a method for detecting corrosion. Linearisation of the image-density distribution in a radiograph has been used to enhance information which can be attributed to corrosion, making the detection of corrosion by radiography both easier and more reliable. However, conclusive evidence has yet to be obtained that image processing can result in the detection of corrosion which was not already faintly apparent on an unprocessed radiograph. A potential method has also been discovered for analysing the history of a corrosion site

  6. Materials loss-detection sensitivities using process-grade measurements at AGNS BNFP

    International Nuclear Information System (INIS)

    Process quality measurement data from cold runs at AGNS BNFP are used to demonstrate near-real-time accounting by closing hourly materials balances and to evaluate contractor inventory estimation techniques. Loss-detection sensitivities for 1 day of between 4 and 18 kg uranium, at 50% detection probability and 2.5% false-alarm probability, are calculated for selected accounting areas. Pulsed-column inventory estimators are used to calculate an inventory that is generally within 10% of column dump measurements. Loss-detection sensitivity could be improved by incorporating on-line waste stream measurements, improving laboratory measurements for process streams, and refining the pulsed-column inventory estimates

  7. Theoretical Performance Analysis of Eigenvalue-based Detection

    CERN Document Server

    Penna, Federico

    2009-01-01

    In this paper we develop a complete analytical framework based on Random Matrix Theory for the performance evaluation of eigenvalue-based detection. While, up to now, analysis was limited to the false-alarm probability, we have obtained an analytical expression also for the probability of missed detection, by using the theory of spiked population models. A general scenario with multiple signals present at the same time is considered. The theoretical results of this paper make it possible to predict the error probabilities, and to set the decision threshold accordingly, by means of a few mathematical formulae. In this way the design of an eigenvalue-based detector is made conceptually identical to that of a traditional energy detector. As additional results, the paper discusses the conditions of signal identifiability for single and multiple sources. All the analytical results are validated through numerical simulations, also covering convergence, identifiability and practical non-Gaussian modulations.
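
    As a hedged illustration of the detector family involved (not the paper's specific statistic), a common eigenvalue-based test compares the extreme eigenvalues of the sample covariance matrix built from K sensors and N samples:

```latex
% Maximum-minimum eigenvalue detector: under noise only, the sample
% eigenvalues of \hat{R} stay within the Marchenko-Pastur bulk, so
% the ratio below stays near a known constant; a signal inflates
% \lambda_{\max}. The threshold \gamma is set from random matrix
% theory to meet a target false-alarm probability.
T = \frac{\lambda_{\max}(\hat{R})}{\lambda_{\min}(\hat{R})}
\;\gtrless\; \gamma,
\qquad
\hat{R} = \frac{1}{N}\sum_{n=1}^{N} \mathbf{y}(n)\,\mathbf{y}(n)^{H}
```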

  8. Geomorphological change detection of fluvial processes of lower Siret channel using LIDAR data

    Science.gov (United States)

    Niculita, Mihai; Obreja, Florin; Boca, Bogdan

    2015-04-01


  9. The application study of wavelets analysis method in the damage detection of a structure

    International Nuclear Information System (INIS)

    Oscillation, friction, cracking or even rupture may occur somewhere in a structure during the vibration process. When this happens, the vibration signals often exhibit singularities or carry information about the damage. This information reflects the damage state of the structure. The theory of signal singularity detection based on wavelet analysis is presented. By studying these singularities, the damage state of the structure may be discovered.
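
    A minimal sketch of wavelet-based singularity detection with PyWavelets: the continuous wavelet transform modulus at fine scales peaks sharply at a discontinuity. The synthetic vibration record, the 'mexh' wavelet and the scale range are illustrative assumptions.

```python
import numpy as np
import pywt

# Hypothetical vibration record with a step-like singularity at
# sample 600, standing in for a crack-induced discontinuity.
t = np.linspace(0, 1, 1200)
signal = np.sin(2 * np.pi * 25 * t)
signal[600:] += 0.8

# Continuous wavelet transform over a range of scales; the finest
# scale responds most strongly to the sharp singularity.
coefs, _ = pywt.cwt(signal, scales=np.arange(1, 16), wavelet="mexh")
fine_scale = np.abs(coefs[0])
print(int(np.argmax(fine_scale)))   # expected near sample 600
```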

  10. Detection of Harbours from High Resolution Remote Sensing Imagery via Saliency Analysis and Feature Learning

    Science.gov (United States)

    Wang, Yetianjian; Pan, Li; Wang, Dagang; Kang, Yifei

    2016-06-01

    Harbours are very important objects in civil and military fields. Detecting them in high resolution remote sensing imagery is important in many applications and is also a challenging task. Traditional methods of detecting harbours mainly focus on the segmentation of water and land and on manually selected knowledge. They do not make enough use of other features of remote sensing imagery and often fail to describe the harbour completely. In order to improve the detection, a new method is proposed. First, the image is transformed to Hue, Saturation, Value (HSV) colour space, and saliency analysis is performed via the generation and enhancement of a co-occurrence histogram to help detect and locate regions of interest (ROIs) that are salient and may be parts of a harbour. Next, SIFT features are extracted and feature learning is performed to represent the ROIs. Then a classifier is trained on labelled harbour features and used to check whether the ROIs belong to a harbour. Finally, if the ROIs belong to a harbour, a minimum bounding rectangle is formed to include all the harbour ROIs, detecting and locating the harbour. Experiments on high resolution remote sensing imagery show that the proposed method performs better than other methods in the precision of classifying ROIs and the accuracy of completely detecting and locating harbours.
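
    A minimal sketch of the colour-space step, assuming OpenCV is available: it converts a (here synthetic) image to HSV and builds a joint hue-saturation histogram whose rare bins can be weighted as salient. This is a simplified stand-in for the paper's co-occurrence histogram, and the image size, bin counts and weighting are illustrative assumptions.

```python
import cv2
import numpy as np

rng = np.random.default_rng(5)

# Synthetic BGR image standing in for a remote sensing scene.
img = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

# Joint hue-saturation histogram (30 x 32 bins).
hist = cv2.calcHist([hsv], [0, 1], None, [30, 32], [0, 180, 0, 256])

# Rarity-based saliency weighting: uncommon colour combinations
# (candidate man-made structures) receive higher weights.
saliency_weight = 1.0 / (1.0 + hist)
print(saliency_weight.shape)   # (30, 32)
```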

  11. A program for activation analysis data processing

    International Nuclear Information System (INIS)

    An ALGOL program for activation analysis data handling is presented. The program may be used either for single channel spectrometry data or for multichannel spectrometry. The calculation of instrumental error and of analysis standard deviation is carried out. The outliers are tested, and the regression line diagram with the related observations are plotted by the program. (author)

  12. Shielding analysis of the advanced voloxidation process

    Energy Technology Data Exchange (ETDEWEB)

    Park, Chang Je; Park, J. J.; Lee, J. W.; Shin, J. M.; Park, G. I.; Song, K. C

    2008-09-15

    This report describes how much shielding benefit can be obtained from the advanced voloxidation process. The calculation was performed with the MCNPX code on a simple model of a spent fuel source surrounded by a concrete wall. The source terms were estimated with the ORIGEN-ARP code, and the gamma spectrum and the neutron spectrum were also obtained. The required thickness of the concrete wall was estimated before and after the voloxidation process. From the results, the gamma spectrum after the voloxidation process showed a 67% reduction compared with that before the process, due to the removal of several gamma-emitting elements such as cesium and rubidium. The MCNPX calculations showed that the thickness of a general concrete wall could be reduced by 12% after the voloxidation process, while a heavy concrete wall provided a 28% reduction in the shielding required for the source term after the voloxidation process. This is because many gamma-emitting isotopes, such as Pu-241, Y-90 and Sr-90, remain after the advanced voloxidation process and are unaffected by it.

  13. Detection of neovascularization based on fractal and texture analysis with interaction effects in diabetic retinopathy.

    Directory of Open Access Journals (Sweden)

    Jack Lee

    Full Text Available Diabetic retinopathy is a major cause of blindness. Proliferative diabetic retinopathy is a result of severe vascular complication and is visible as neovascularization of the retina. Automatic detection of such new vessels would be useful for the severity grading of diabetic retinopathy, and it is an important part of the screening process to identify those who may require immediate treatment. We propose a novel new-vessel detection method combining statistical texture analysis (STA), high order spectrum analysis (HOS) and fractal analysis (FA), and, most importantly, we show that by incorporating their associated interactions the accuracy of new-vessel detection can be greatly improved. To assess its performance, the sensitivity, specificity and accuracy (AUC) were obtained: 96.3%, 99.1% and 98.5% (99.3%), respectively. The proposed method is found to improve the accuracy of new-vessel detection significantly over previous methods. The algorithm can be automated and is valuable for detecting relatively severe cases of diabetic retinopathy among diabetes patients.
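
    Of the three ingredients, the fractal-analysis component is the easiest to sketch; a standard box-counting dimension estimate over a placeholder binary vessel map (not the authors' exact pipeline) might look like this:

    ```python
    import numpy as np

    def box_counting_dimension(binary_img):
        """Estimate the fractal (box-counting) dimension of a 2-D binary image."""
        n = 2 ** int(np.floor(np.log2(min(binary_img.shape))))
        img = binary_img[:n, :n]
        sizes = 2 ** np.arange(1, int(np.log2(n)))
        counts = []
        for s in sizes:
            # Count boxes of side s that contain at least one vessel pixel
            boxes = img.reshape(n // s, s, n // s, s).any(axis=(1, 3))
            counts.append(boxes.sum())
        # Slope of log(count) vs log(1/size) gives the dimension estimate
        return np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)[0]

    vessels = np.random.rand(512, 512) > 0.995   # placeholder segmented vessel map
    print("box-counting dimension:", box_counting_dimension(vessels))
    ```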

  14. Image Post-Processing and Analysis. Chapter 17

    International Nuclear Information System (INIS)

    For decades, scientists have used computers to enhance and analyse medical images. At first, they developed simple computer algorithms to enhance the appearance of interesting features in images, helping humans read and interpret them better. Later, they created more advanced algorithms, where the computer would not only enhance images but also participate in facilitating understanding of their content. Segmentation algorithms were developed to detect and extract specific anatomical objects in images, such as malignant lesions in mammograms. Registration algorithms were developed to align images of different modalities and to find corresponding anatomical locations in images from different subjects. These algorithms have made computer aided detection and diagnosis, computer guided surgery and other highly complex medical technologies possible. Nowadays, the field of image processing and analysis is a complex branch of science that lies at the intersection of applied mathematics, computer science, physics, statistics and biomedical sciences. This chapter will give a general overview of the most common problems in this field and the algorithms that address them

  15. In-Process Detection of Weld Defects Using Laser-Based Ultrasound

    International Nuclear Information System (INIS)

    Laser-based ultrasonic (LBU) measurement shows great promise for on-line monitoring of weld quality in tailor-welded blanks. Tailor-welded blanks are steel blanks made from plates of differing thickness and/or properties butt-welded together; they are used in automobile manufacturing to produce body, frame, and closure panels. LBU uses a pulsed laser to generate the ultrasound and a continuous wave (CW) laser interferometer to detect the ultrasound at the point of interrogation to perform ultrasonic inspection. LBU enables in-process measurements since there is no sensor contact or near-contact with the workpiece. The authors are using laser-generated plate (Lamb) waves to propagate from one plate into the weld nugget as a means of detecting defects. This paper reports the results of the investigation of a number of inspection architectures based on processing of signals from selected plate waves, which are either reflected from or transmitted through the weld zone. Bayesian parameter estimation and wavelet analysis (both continuous and discrete) have shown that the LBU time-series signal is readily separable into components that provide distinguishing features which describe weld quality. The authors anticipate that, in an on-line industrial application, these measurements can be implemented just downstream from the weld cell. Then the weld quality data can be fed back to control critical weld parameters or alert the operator of a problem requiring maintenance. Internal weld defects and deviations from the desired surface profile can then be corrected before defective parts are produced

  16. Comparative analysis of model assessment in community detection

    CERN Document Server

    Kawamoto, Tatsuro

    2016-01-01

    Bayesian cluster inference with a flexible generative model allows us to detect various types of structures. However, it has problems stemming from computational complexity and difficulties in model assessment. We consider the stochastic block model with restricted hyperparameter space, which is known to correspond to modularity maximization. We show that it not only reduces computational complexity, but is also beneficial for model assessment. Using various criteria, we conduct a comparative analysis of the model assessments, and analyze whether each criterion tends to overfit or underfit. We also show that the learning of hyperparameters leads to qualitative differences in Bethe free energy and cross-validation errors.

  17. Modal Analysis for Crack Detection in Small Wind Turbine Blades

    DEFF Research Database (Denmark)

    Ulriksen, Martin Dalgaard; Skov, Jonas Falk; Dickow, Kristoffer Ahrens;

    2013-01-01

    The aim of the present paper is to evaluate structural health monitoring (SHM) techniques based on modal analysis for crack detection in small wind turbine blades. Cracks of different sizes will be introduced along one edge of the blade in a finite element (FE) model calibrated to measured modal parameters. Changes in modal parameters from the FE model are compared with data obtained from experimental tests. These comparisons will be used to validate the FE model and subsequently discuss the usability of SHM techniques based on modal parameters for condition monitoring of wind turbine blades.

  18. SCHEME ANALYSIS TREE DIMENSIONS AND TOLERANCES PROCESSING

    Directory of Open Access Journals (Sweden)

    Constanta RADULESCU

    2011-07-01

    Full Text Available This paper presents one of the steps that help us to determine the optimal tolerances depending on the technological capability of the processing equipment. To determine the tolerances in this way it is necessary to study and represent schematically the operations used in the technological process of making a piece. In this phase the tree diagram of the dimensions and machining tolerances will also be made, showing the dimensions and tolerances of the design execution. The dimensions and tolerances tree scheme will be determined for both the interior and exterior features of a machined piece.

  19. A review of the technology and process on integrated circuits failure analysis applied in communications products

    Science.gov (United States)

    Ming, Zhimao; Ling, Xiaodong; Bai, Xiaoshu; Zong, Bo

    2016-02-01

    The failure analysis of integrated circuits plays a very important role in improving the reliability of communications products. This paper mainly introduces the failure analysis technology and process for integrated circuits applied in communication products. There are many technologies for failure analysis, including optical microscopic analysis, infrared microscopic analysis, acoustic microscopy, liquid crystal hot-spot detection, micro analysis, electrical measurement, microprobe technology, chemical etching and ion etching. Integrated circuit failure analysis depends on the accurate confirmation and analysis of the chip failure mode, the search for the root failure cause, the summary of the failure mechanism and the implementation of improvement measures. Through failure analysis, the reliability of integrated circuits and the yield of good products can be improved.

  20. Studies of energy transfer processes in triplet states using optically detected magnetic resonance

    Energy Technology Data Exchange (ETDEWEB)

    Lewellyn, M.T.

    1977-06-01

    The techniques of both continuous wave and pulsed-coherent optically detected magnetic resonance spectroscopy are used to study energy transfer processes in the lowest triplet states of two aromatic molecular crystals (1,2,4,5-tetrachlorobenzene and pyrimidine). Of particular interest are the effects of crystal dimensionality and isotopic trap states on the efficiency of the transfer process in these systems.

  1. Error detection in GPS observations by means of Multi-process models

    DEFF Research Database (Denmark)

    Thomsen, Henrik F.

    2001-01-01

    The main purpose of this article is to present the idea of using Multi-process models as a method for detecting errors in GPS observations. The theory behind Multi-process models and double-differenced phase observations in GPS is presented briefly. It is shown how to model cycle slips in the Multi-process models.

  2. Inter-laboratory validation study of two immunochemical methods for detection of processed ruminant proteins

    NARCIS (Netherlands)

    Raamsdonk, Van L.W.D.; Margry, R.J.C.F.; Kaathoven, Van R.G.C.; Bremer, M.G.E.G.

    2015-01-01

    In order to facilitate safe re-introduction of non-ruminant processed animal proteins (PAPs) in aqua feed, two immunoassays have been tested in an interlaboratory study for their capability to detect ruminant PAPs processed under European conditions. The sensitivity of the MELISA-TEK assay was im

  3. Adaptive Image Processing Methods for Improving Contaminant Detection Accuracy on Poultry Carcasses

    Science.gov (United States)

    Technical Abstract A real-time multispectral imaging system has demonstrated a science-based tool for fecal and ingesta contaminant detection during poultry processing. In order to implement this imaging system at commercial poultry processing industry, the false positives must be removed. For doi...

  4. Analysis and asynchronous detection of gradually unfolding errors during monitoring tasks.

    Science.gov (United States)

    Omedes, Jason; Iturrate, Iñaki; Minguez, Javier; Montesano, Luis

    2015-10-01

    Human studies on cognitive control processes rely on tasks involving sudden-onset stimuli, which allow the analysis of these neural imprints to be time-locked and relative to the stimuli onset. Human perceptual decisions, however, comprise continuous processes where evidence accumulates until reaching a boundary. Surpassing the boundary leads to a decision where measured brain responses are associated to an internal, unknown onset. The lack of this onset for gradual stimuli hinders both the analyses of brain activity and the training of detectors. This paper studies electroencephalographic (EEG)-measurable signatures of human processing for sudden and gradual cognitive processes represented as a trajectory mismatch under a monitoring task. Time-locked potentials and brain-source analysis of the EEG of sudden mismatches revealed the typical components of event-related potentials and the involvement of brain structures related to cognitive control processing. For gradual mismatch events, time-locked analyses did not show any discernible EEG scalp pattern, despite related brain areas being, to a lesser extent, activated. However, and thanks to the use of non-linear pattern recognition algorithms, it is possible to train an asynchronous detector on sudden events and use it to detect gradual mismatches, as well as obtaining an estimate of their unknown onset. Post-hoc time-locked scalp and brain-source analyses revealed that the EEG patterns of detected gradual mismatches originated in brain areas related to cognitive control processing. This indicates that gradual events induce latency in the evaluation process but that similar brain mechanisms are present in sudden and gradual mismatch events. Furthermore, the proposed asynchronous detection model widens the scope of applications of brain-machine interfaces to other gradual processes. PMID:26193332

  6. DETECTING ABNORMAL BEHAVIOR IN SOCIAL NETWORK WEBSITES BY USING A PROCESS MINING TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Mahdi Sahlabadi

    2014-01-01

    Full Text Available Detecting abnormal user activity in social network websites could prevent cyber-crime. Previous research focused on data mining, while this research is based on the user behavior process. In this study, the first step is defining a normal user behavioral pattern and the second step is detecting abnormal behavior. These two steps are applied to a case study that includes real and synthetic data sets to obtain more tangible results. The technique chosen to define the pattern is process mining, which requires an affordable, complete and noise-free event log. The proposed model discovers normal behavior by the genetic process mining technique, and abnormal activities are detected by the fitness function, which is based on Petri net rules. Although applying genetic mining is a time-consuming process, it can overcome the risks of noisy data and produces a comprehensive normal model in Petri net representation form.
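
    As a toy illustration of the fitness idea only (not the genetic miner itself), one can replay logged traces against a discovered model and flag sessions that fit poorly; here the "model" is reduced to a hypothetical set of allowed activity transitions rather than a full Petri net:

    ```python
    # Hypothetical normal-behaviour model: allowed transitions between activities,
    # standing in for a Petri net discovered by genetic process mining.
    normal_model = {
        "login":   {"browse", "post"},
        "browse":  {"post", "comment", "logout"},
        "post":    {"browse", "logout"},
        "comment": {"browse", "logout"},
    }

    def fitness(trace, model):
        """Fraction of transitions in a trace that the model allows (token-replay style)."""
        if len(trace) < 2:
            return 1.0
        ok = sum(1 for a, b in zip(trace, trace[1:]) if b in model.get(a, set()))
        return ok / (len(trace) - 1)

    session = ["login", "post", "post", "post", "post", "logout"]  # synthetic trace
    score = fitness(session, normal_model)
    print("fitness = %.2f -> %s" % (score, "abnormal" if score < 0.8 else "normal"))
    ```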

  7. Computer program performs statistical analysis for random processes

    Science.gov (United States)

    Newberry, M. H.

    1966-01-01

    Random Vibration Analysis Program /RAVAN/ performs statistical analysis on a number of phenomena associated with flight and captive tests, but can also be used in analyzing data from many other random processes.

  8. Laplace-Laplace analysis of the fractional Poisson process

    OpenAIRE

    Gorenflo, Rudolf; Mainardi, Francesco

    2013-01-01

    We generate the fractional Poisson process by subordinating the standard Poisson process to the inverse stable subordinator. Our analysis is based on application of the Laplace transform with respect to both arguments of the evolving probability densities.
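
    For readers who want to experiment, sample paths of the fractional Poisson process can be generated by drawing Mittag-Leffler distributed waiting times; the sketch below uses a published inversion formula (due to Kozubowski, and used by Fulger, Scalas and Germano for Monte Carlo work) and should be checked against that literature before serious use:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def ml_waiting_times(beta, gamma, size):
        """Mittag-Leffler waiting times for the fractional Poisson process.
        beta = 1 recovers exponential waiting times (standard Poisson)."""
        u = rng.uniform(size=size)
        v = rng.uniform(size=size)
        return (-gamma * np.log(u)
                * (np.sin(beta * np.pi) / np.tan(beta * np.pi * v)
                   - np.cos(beta * np.pi)) ** (1.0 / beta))

    beta, gamma = 0.9, 1.0
    arrivals = np.cumsum(ml_waiting_times(beta, gamma, 1000))
    t_grid = np.linspace(0, 50, 500)
    N = np.searchsorted(arrivals, t_grid)   # counting process N(t)
    print("N(50) =", N[-1])
    ```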

  9. Damage Detection and Quantification Using Transmissibility Coherence Analysis

    Directory of Open Access Journals (Sweden)

    Yun-Lai Zhou

    2015-01-01

    Full Text Available A new transmissibility-based damage detection and quantification approach is proposed. Based on operational modal analysis, the transmissibility is extracted from system responses, and the transmissibility coherence is defined and analyzed. Afterwards, a damage-sensitive indicator is defined in order to detect and identify the severity of damage, and it is compared with an indicator developed by other authors. The proposed approach is validated on data from a physics-based numerical model as well as experimental data from a three-story aluminum frame structure. For both the numerical simulation and the experiment, the results of the new indicator reveal a better performance than the coherence measure proposed in Rizos et al., 2008, Rizos et al., 2002, and Fassois and Sakellariou, 2007, especially when nonlinearity occurs, which might be further used in real engineering. The main contribution of this study is the construction of the relation between transmissibility coherence and frequency response function coherence, and the construction of an effective indicator based on the transmissibility modal assurance criterion for damage detection (especially for minor nonlinearity) as well as quantification.
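
    By way of illustration, a transmissibility function between two measured responses can be estimated from cross- and auto-spectra with SciPy, with an ordinary coherence computed alongside; this synthetic sketch only hints at the indicator construction described in the abstract:

    ```python
    import numpy as np
    from scipy import signal

    fs = 1024.0
    t = np.arange(0, 8, 1 / fs)
    x = np.random.randn(t.size)                         # response at reference DOF
    y = np.convolve(x, np.ones(8) / 8, mode="same") \
        + 0.1 * np.random.randn(t.size)                 # response at another DOF

    # H1-type transmissibility estimate T(w) = S_yx / S_xx
    f, S_xx = signal.welch(x, fs=fs, nperseg=1024)
    _, S_yx = signal.csd(y, x, fs=fs, nperseg=1024)
    T = S_yx / S_xx

    # Ordinary coherence between the two responses; the paper's transmissibility
    # coherence is built from the same spectral ingredients.
    _, coh = signal.coherence(y, x, fs=fs, nperseg=1024)

    print("|T| at low frequency:", np.abs(T[1]))
    print("mean coherence:", coh.mean())
    ```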

  10. Detection and Monitoring of Neurotransmitters - a Spectroscopic Analysis

    Science.gov (United States)

    Manciu, Felicia; Lee, Kendall; Durrer, William; Bennet, Kevin

    2012-10-01

    In this work we demonstrate the capability of confocal Raman mapping spectroscopy for simultaneously and locally detecting important compounds in neuroscience such as dopamine, serotonin, and adenosine. The Raman results show shifting of the characteristic vibrations of the compounds, observations consistent with previous spectroscopic studies. Although some vibrations are common in these neurotransmitters, Raman mapping was achieved by detecting non-overlapping characteristic spectral signatures of the compounds, as follows: for dopamine the vibration attributed to C-O stretching, for serotonin the indole ring stretching vibration, and for adenosine the adenine ring vibrations. Without damage, dyeing, or preferential sample preparation, confocal Raman mapping provided positive detection of each neurotransmitter, allowing association of the high-resolution spectra with specific micro-scale image regions. Such information is particularly important for complex, heterogeneous samples, where modification of the chemical or physical composition can influence the neurotransmission processes. We also report an estimated dopamine diffusion coefficient two orders of magnitude smaller than that calculated by the flow-injection method.

  11. Automatic ultrasonic image analysis method for defect detection

    International Nuclear Information System (INIS)

    Ultrasonic examination of austenitic steel weld seams raises well known problems of interpreting signals perturbed by this type of material. The JUKEBOX ultrasonic imaging system developed at the Cadarache Nuclear Research Center provides a major improvement in the general area of defect localization and characterization, based on processing overall images obtained by (X, Y) scanning. (X, time) images are formed by juxtaposing input signals. A series of parallel images shifted on the Y-axis is also available. The authors present a novel defect detection method based on analysing the timeline positions of the maxima and minima recorded on (X, time) images. This position is statistically stable when a defect is encountered, and is random enough under spurious noise conditions to constitute a discriminating parameter. The investigation involves calculating the trace variance; this parameter is then taken into account for detection purposes. Correlation with parallel images enhances detection reliability. A significant increase in the signal-to-noise ratio during tests on artificial defects is shown
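
    The detection criterion lends itself to a compact sketch: record the time index of the echo extremum for each lateral position and use the variance of these indices over a sliding window as the discriminating parameter, low variance suggesting a stable defect echo rather than noise. The Python code below is a schematic reconstruction, not the JUKEBOX implementation:

    ```python
    import numpy as np

    def extrema_variance(image, window=9):
        """image: (X, time) array of juxtaposed A-scans. Returns, for each X,
        the variance of the maxima time-positions in a sliding window."""
        tmax = image.argmax(axis=1).astype(float)   # time index of max per trace
        half = window // 2
        var = np.full(tmax.size, np.nan)
        for i in range(half, tmax.size - half):
            var[i] = tmax[i - half:i + half + 1].var()
        return var

    # Synthetic (X, time) image: noise everywhere, stable echo for X in 40..60
    img = np.random.randn(100, 512)
    img[40:60, 198:202] += 5.0
    v = extrema_variance(img)
    print("defect suspected at X positions:", np.where(v < 10.0)[0])  # toy threshold
    ```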

  12. Non-destructive analysis and detection of internal characteristics of spruce logs through X computerized tomography

    International Nuclear Information System (INIS)

    Computerized tomography allows direct access to internal features of scanned logs on the basis of density and moisture content variations. The objective of this work is to assess the feasibility of automatic detection of internal characteristics, with the final aim of conducting scientific analyses. The database consists of CT images of 24 spruces obtained with a medical CT scanner. The studied trees are representative of several social statuses and come from four stands located in north-eastern France, themselves representative of several age, density and fertility classes. The automatic processing steps developed are the following. First, pith detection in logs, dealing with the problems of knot presence and ring eccentricity; the accuracy of the localisation was less than one mm. Secondly, detection of the sapwood/heartwood limit in logs, dealing with the problem of knot presence (the main source of difficulty); the error on the diameter was 1.8 mm, which corresponds to a relative error of 1.3 per cent. Thirdly, detection of the whorl locations and comparison with an optical method. Fourthly, detection of individualized knots; this process allows knots to be counted and located in a log (longitudinal position and azimuth), although the validation of the method and the extraction of branch diameter and inclination are still to be developed. An application of this work was a variability analysis of the sapwood content in the trunk: at the within-tree level, the sapwood width was found to be constant under the living crown; at the between-tree level, a strong correlation was found with the amount of living branches. A great number of analyses are possible from these results, among others: architectural analysis with pith tracking and apex death occurrence; analysis of radial variations of the heartwood shape; and analysis of the knot distribution in logs. (author)

  13. Effect of image processing version on detection of non-calcification cancers in 2D digital mammography imaging

    Science.gov (United States)

    Warren, L. M.; Cooke, J.; Given-Wilson, R. M.; Wallis, M. G.; Halling-Brown, M.; Mackenzie, A.; Chakraborty, D. P.; Bosmans, H.; Dance, D. R.; Young, K. C.

    2013-03-01

    Image processing (IP) is the last step in the digital mammography imaging chain before interpretation by a radiologist. Each manufacturer has their own IP algorithm(s), and the appearance of an image after IP can vary greatly depending upon the algorithm and version used. It is unclear whether these differences can affect cancer detection. This work investigates the effect of IP on the detection of non-calcification cancers by expert observers. Digital mammography images for 190 patients were collected from two screening sites using Hologic amorphous selenium detectors. Eighty of these cases contained non-calcification cancers. The images were processed using three versions of IP from Hologic - default (full enhancement), low contrast (intermediate enhancement) and pseudo screen-film (no enhancement). Seven experienced observers inspected the images and marked the location of regions suspected to be non-calcification cancers, assigning a score for likelihood of malignancy. These data were analysed using JAFROC analysis. The observers also scored the clinical interpretation of the entire case using the BSBR classification scale, which was analysed using ROC analysis. The breast density in the region surrounding each cancer and the number of times each cancer was detected were calculated. IP did not have a significant effect on the radiologists' judgment of the likelihood of malignancy of individual lesions or on their clinical interpretation of the entire case. No correlation was found between the number of times each cancer was detected and the density of the breast tissue surrounding that cancer.

  14. Chemical Sensing for Buried Landmines - Fundamental Processes Influencing Trace Chemical Detection

    Energy Technology Data Exchange (ETDEWEB)

    PHELAN, JAMES M.

    2002-05-01

    Mine detection dogs have a demonstrated capability to locate hidden objects by trace chemical detection. Because of this capability, demining activities frequently employ mine detection dogs to locate individual buried landmines or for area reduction. The conditions appropriate for use of mine detection dogs are only beginning to emerge through diligent research that combines dog selection/training, the environmental conditions that impact landmine signature chemical vapors, and vapor sensing performance capability and reliability. This report seeks to address the fundamental soil-chemical interactions, driven by local weather history, that influence the availability of chemicals for trace chemical detection. The processes evaluated include: landmine chemical emissions to the soil, chemical distribution in soils, chemical degradation in soils, and weather and chemical transport in soils. Simulation modeling is presented as a method to evaluate the complex interdependencies among these various processes and to establish conditions appropriate for trace chemical detection. Results from chemical analyses of soil samples obtained adjacent to landmines are presented and demonstrate the ultra-trace nature of these residues. Lastly, initial measurements of the vapor sensing performance of mine detection dogs demonstrate the extreme sensitivity of dogs in sensing landmine signature chemicals; however, reliability at these ultra-trace vapor concentrations still needs to be determined. Through this compilation, additional work is suggested that will fill in data gaps to improve the utility of trace chemical detection.

  15. Network structure detection and analysis of Shanghai stock market

    Directory of Open Access Journals (Sweden)

    Sen Wu

    2015-04-01

    Full Text Available Purpose: In order to investigate the community structure of the component stocks of the SSE (Shanghai Stock Exchange) 180 index, a stock correlation network is built to find the intra-community and inter-community relationships. Design/methodology/approach: The stock correlation network is built taking the vertices as stocks and the edges as correlation coefficients of the logarithmic returns of the stock prices. It is first built as an undirected weighted network. The GN (Girvan-Newman) algorithm is selected to detect the community structure after converting the network into unweighted form with different thresholds. Findings: The result of the network community structure analysis shows that the stock market has obvious industrial characteristics. Most of the stocks in the same industry or in the same supply chain are assigned to the same community. The correlation of stock price fluctuations within a community is closer than between communities. The result of community structure detection also reflects correlations among different industries. Originality/value: Based on the analysis of the community structure in the Shanghai stock market, the result reflects some industrial characteristics, which has reference value for the relationships among industries or sub-sectors of listed companies.
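
    A condensed version of this pipeline, assuming a closing-price matrix and using the NetworkX implementation of the Girvan-Newman algorithm (synthetic data here, not SSE 180 prices), could read:

    ```python
    import numpy as np
    import networkx as nx
    from networkx.algorithms.community import girvan_newman

    # Synthetic log returns with three "industries" (blocks of correlated stocks)
    rng = np.random.default_rng(1)
    days, n = 250, 30
    sector = np.repeat([0, 1, 2], n // 3)
    factors = 0.02 * rng.normal(size=(days, 3))
    log_ret = factors[:, sector] + 0.01 * rng.normal(size=(days, n))

    corr = np.corrcoef(log_ret.T)                  # correlation of log returns

    # Convert the weighted correlation structure to an unweighted graph by thresholding
    threshold = 0.5
    G = nx.Graph()
    G.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            if abs(corr[i, j]) > threshold:
                G.add_edge(i, j)

    # First split of the Girvan-Newman (edge-betweenness) community hierarchy
    communities = next(girvan_newman(G))
    print([sorted(c) for c in communities])
    ```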

  16. Analysis of patents on preeclampsia detection and diagnosis: a perspective.

    Science.gov (United States)

    Telang, M A; Bhutkar, S P; Hirwani, R R

    2013-01-01

    Computerized patent databases have made it possible to access a wealth of technological information contained in patent documents. Analysis of patent information can greatly help in monitoring technology trends and evolution, as well as in identifying research gaps. Detection and diagnosis of preeclampsia (PE) was chosen as a case study to emphasize the informative potential of patent analysis. PE complicates about 2-8% of pregnancies, affecting a total of about 8.5 million women worldwide. PE can lead to potentially life threatening problems of the liver, kidneys, brain and blood clotting system of the mother. Risks for the baby include poor growth and prematurity. No effective ways of predicting or preventing PE have been found, which highlights the need for further research in this field. Some researchers believe that the incidence of PE is on the rise due to an increased prevalence of predisposing disorders, such as chronic hypertension, diabetes, and obesity. PE thus represents a huge health care burden the world over. Complications of PE might be prevented to a certain extent if it is diagnosed early. Patents on detection and diagnosis of PE were analyzed to gain a better understanding of the technical approaches followed by various research groups around the world. PMID:23200058

  17. Analysis of digitized cervical images to detect cervical neoplasia

    Science.gov (United States)

    Ferris, Daron G.

    2004-05-01

    Cervical cancer is the second most common malignancy in women worldwide. If diagnosed in the premalignant stage, cure is invariably assured. Although the Papanicolaou (Pap) smear has significantly reduced the incidence of cervical cancer where implemented, the test is only moderately sensitive, highly subjective and skilled-labor intensive. Newer optical screening tests (cervicography, direct visual inspection and speculoscopy), including fluorescent and reflective spectroscopy, are fraught with certain weaknesses. Yet, the integration of optical probes for the detection and discrimination of cervical neoplasia with automated image analysis methods may provide an effective screening tool for early detection of cervical cancer, particularly in resource poor nations. Investigative studies are needed to validate the potential for automated classification and recognition algorithms. By applying image analysis techniques for registration, segmentation, pattern recognition, and classification, cervical neoplasia may be reliably discriminated from normal epithelium. The National Cancer Institute (NCI), in cooperation with the National Library of Medicine (NLM), has embarked on a program to begin this and other similar investigative studies.

  18. Detection and analysis of diamond fingerprinting feature and its application

    Energy Technology Data Exchange (ETDEWEB)

    Li Xin; Huang Guoliang; Li Qiang; Chen Shengyi, E-mail: tshgl@tsinghua.edu.cn [Department of Biomedical Engineering, the School of Medicine, Tsinghua University, Beijing, 100084 (China)

    2011-01-01

    Before becoming jewelry, diamonds need to be carved artistically with special geometric features forming a polyhedral structure. There are subtle differences in the structure of this polyhedron in each diamond. With spatial frequency spectrum analysis of the diamond surface structure, we can obtain the diamond fingerprint information, which represents a 'Diamond ID' and has good specificity. Based on optical Fourier transform spatial spectrum analysis, the fingerprint identification of the surface structure of diamonds in the spatial frequency domain was studied in this paper. We constructed both a completely coherent diamond fingerprinting detection system illuminated by laser and a partially coherent system illuminated by LED, and analyzed the effect of the coherence of the light source on the diamond fingerprinting feature. We studied the rotation invariance and translation invariance of the diamond fingerprint and verified the feasibility of real-time and accurate identification. With the benefit of this work, we can provide customs, jewelers and consumers with a real-time and reliable diamond identification instrument, which will curb diamond smuggling, theft and other crimes, and ensure the healthy development of the diamond industry.
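
    In the digital domain the spatial-frequency fingerprint idea can be mimicked with a two-dimensional FFT: the magnitude spectrum is invariant to translation, and a radially averaged spectrum is tolerant to rotation. The sketch below is only an analogy to the optical Fourier-transform system described above, with a random array standing in for a diamond surface image:

    ```python
    import numpy as np

    def spectrum_signature(img, nbins=64):
        """Radially averaged magnitude spectrum: translation invariant,
        and insensitive to rotation of the input image."""
        F = np.abs(np.fft.fftshift(np.fft.fft2(img)))
        y, x = np.indices(F.shape)
        cy, cx = np.array(F.shape) // 2
        r = np.hypot(y - cy, x - cx).astype(int)
        sig = np.bincount(r.ravel(), weights=F.ravel(), minlength=nbins)[:nbins]
        cnt = np.bincount(r.ravel(), minlength=nbins)[:nbins]
        return sig / np.maximum(cnt, 1)

    rng = np.random.default_rng(2)
    surface = rng.normal(size=(256, 256))            # placeholder surface image
    shifted = np.roll(surface, (17, -9), axis=(0, 1))

    a, b = spectrum_signature(surface), spectrum_signature(shifted)
    print("signature correlation (same 'diamond'):", round(np.corrcoef(a, b)[0, 1], 4))
    ```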

  19. Representative process sampling for reliable data analysis

    DEFF Research Database (Denmark)

    Julius, Lars Petersen; Esbensen, Kim

    2005-01-01

    Process sampling of moving streams of particulate matter, fluids and slurries (over time or space) or stationary one-dimensional (1-D) lots is often carried out according to existing tradition or protocol not taking the theory of sampling (TOS) into account. In many situations, sampling errors (sampling variances) can be reduced greatly, however, and sampling biases can be eliminated completely, by respecting a simple set of rules and guidelines provided by TOS. A systematic approach for description of process heterogeneity furnishes in-depth knowledge about the specific variability of any 1-D lot. The variogram and its derived auxiliary functions, together with a set of error generating functions, provide critical information on: process variation over time or space; the number of extracted increments to composite into a final, optimal sample; and the frequency with which to extract increments...

  20. Laser processing and analysis of materials

    CERN Document Server

    Duley, W W

    1983-01-01

    It has often been said that the laser is a solution searching for a problem. The rapid development of laser technology over the past dozen years has led to the availability of reliable, industrially rated laser sources with a wide variety of output characteristics. This, in turn, has resulted in new laser applications as the laser becomes a familiar processing and analytical tool. The field of materials science, in particular, has become a fertile one for new laser applications. Laser annealing, alloying, cladding, and heat treating were all but unknown 10 years ago. Today, each is a separate, dynamic field of research activity with many of the early laboratory experiments resulting in the development of new industrial processing techniques using laser technology. Ten years ago, chemical processing was in its infancy awaiting, primarily, the development of reliable tunable laser sources. Now, with tunability over the entire spectrum from the vacuum ultraviolet to the far infrared, photochemistry is undergo...

  1. Dynamic analysis methods for detecting anomalies in asynchronously interacting systems

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Akshat; Solis, John Hector; Matschke, Benjamin

    2014-01-01

    Detecting modifications to digital system designs, whether malicious or benign, is problematic due to the complexity of the systems being analyzed. Moreover, static analysis techniques and tools can only be used during the initial design and implementation phases to verify safety and liveness properties. It is computationally intractable to guarantee that any previously verified properties still hold after a system, or even a single component, has been produced by a third-party manufacturer. In this paper we explore new approaches for creating a robust system design by investigating highly-structured computational models that simplify verification and analysis. Our approach avoids the need to fully reconstruct the implemented system by incorporating a small verification component that dynamically detects deviations from the design specification at run-time. The first approach encodes information extracted from the original system design algebraically into a verification component. During run-time this component randomly queries the implementation for trace information and verifies that no design-level properties have been violated. If any deviation is detected then a pre-specified fail-safe or notification behavior is triggered. Our second approach utilizes a partitioning methodology to view liveness and safety properties as a distributed decision task and the implementation as a proposed protocol that solves this task. Thus the problem of verifying safety and liveness properties is translated to that of verifying that the implementation solves the associated decision task. We build upon results from distributed systems and algebraic topology to construct a learning mechanism for verifying safety and liveness properties from samples of run-time executions.

  2. Seeking a fingerprint: analysis of point processes in actigraphy recording

    Science.gov (United States)

    Gudowska-Nowak, Ewa; Ochab, Jeremi K.; Oleś, Katarzyna; Beldzik, Ewa; Chialvo, Dante R.; Domagalik, Aleksandra; Fąfrowicz, Magdalena; Marek, Tadeusz; Nowak, Maciej A.; Ogińska, Halszka; Szwed, Jerzy; Tyburczyk, Jacek

    2016-05-01

    Motor activity of humans displays complex temporal fluctuations which can be characterised by scale-invariant statistics, thus demonstrating that structure and fluctuations of such kinetics remain similar over a broad range of time scales. Previous studies on humans regularly deprived of sleep or suffering from sleep disorders predicted a change in the invariant scale parameters with respect to those for healthy subjects. In this study we investigate the signal patterns from actigraphy recordings by means of characteristic measures of fractional point processes. We analyse spontaneous locomotor activity of healthy individuals recorded during a week of regular sleep and a week of chronic partial sleep deprivation. Behavioural symptoms of lack of sleep can be evaluated by analysing statistics of duration times during active and resting states, and alteration of behavioural organisation can be assessed by analysis of power laws detected in the event count distribution, distribution of waiting times between consecutive movements and detrended fluctuation analysis of recorded time series. We claim that among different measures characterising complexity of the actigraphy recordings and their variations implied by chronic sleep distress, the exponents characterising slopes of survival functions in resting states are the most effective biomarkers distinguishing between healthy and sleep-deprived groups.
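
    Of the measures listed, detrended fluctuation analysis (DFA) is the easiest to make concrete; a bare-bones implementation for an activity-count series (synthetic here, not the study's data) follows:

    ```python
    import numpy as np

    def dfa(x, scales):
        """Detrended fluctuation analysis: returns the scaling exponent alpha."""
        y = np.cumsum(x - np.mean(x))               # integrated (profile) series
        F = []
        for s in scales:
            n_seg = len(y) // s
            segs = y[:n_seg * s].reshape(n_seg, s)
            t = np.arange(s)
            rms = []
            for seg in segs:
                coef = np.polyfit(t, seg, 1)        # local linear detrending
                rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
            F.append(np.sqrt(np.mean(rms)))
        return np.polyfit(np.log(scales), np.log(F), 1)[0]

    activity = np.random.randn(2 ** 14)             # placeholder actigraphy counts
    scales = (2 ** np.arange(4, 11)).astype(int)
    print("DFA exponent alpha = %.2f (about 0.5 for white noise)"
          % dfa(activity, scales))
    ```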

  3. 300 Area process trench sediment analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Zimmerman, M.G.; Kossik, C.D.

    1987-12-01

    This report describes the results of a sampling program for the sediments underlying the Process Trenches serving the 300 Area on the Hanford reservation. These Process Trenches were the subject of a Closure Plan submitted to the Washington State Department of Ecology and to the US Environmental Protection Agency in lieu of a Part B permit application on November 8, 1985. The closure plan described a proposed sampling plan for the underlying sediments and potential remedial actions to be determined by the sample analyses results. The results and proposed remedial action plan are presented and discussed in this report. 50 refs., 6 figs., 8 tabs.

  4. Application of signal processing techniques for islanding detection of distributed generation in distribution network: A review

    International Nuclear Information System (INIS)

    Highlights: • Pros & cons of conventional islanding detection techniques (IDTs) are discussed. • The ability of signal processing techniques (SPTs) to detect islanding is discussed. • The ability of SPTs to improve the performance of passive techniques is discussed. • Fourier, s-transform, wavelet, HHT & tt-transform based IDTs are reviewed. • The application of intelligent classifiers (ANN, ANFIS, Fuzzy, SVM) in SPTs is discussed. - Abstract: High penetration of distributed generation resources (DGR) in the distribution network provides many benefits in terms of high power quality, efficiency, and low carbon emissions in the power system. However, efficient islanding detection and immediate disconnection of DGR are critical in order to avoid equipment damage, grid protection interference, and personnel safety hazards. Islanding detection techniques are mainly classified into remote, passive, active, and hybrid techniques. Of these, passive techniques are more advantageous due to lower power quality degradation, lower cost, and widespread usage by power utilities. However, the main limitations of these techniques are that they possess large non-detection zones and require threshold settings. Various signal processing techniques and intelligent classifiers have been used to overcome the limitations of passive islanding detection. Signal processing techniques, in particular, are adopted due to their versatility, stability, cost effectiveness, and ease of modification. This paper presents a comprehensive overview of signal processing techniques used to improve common passive islanding detection techniques. A performance comparison of the signal-processing-based islanding detection techniques with existing techniques is also provided. Finally, this paper outlines the relative advantages and limitations of the signal processing techniques in order to provide basic guidelines for researchers and field engineers in determining the best method for their system

  5. Image analysis, classification, and change detection in remote sensing with algorithms for ENVI/IDL

    CERN Document Server

    Canty, Morton J

    2011-01-01

    Demonstrating the breadth and depth of growth in the field since the publication of the popular first edition, Image Analysis, Classification and Change Detection in Remote Sensing, with Algorithms for ENVI/IDL, Second Edition has been updated and expanded to keep pace with the latest versions of the ENVI software environment. Effectively interweaving theory, algorithms, and computer codes, the text supplies an accessible introduction to the techniques used in the processing of remotely sensed imagery. This significantly expanded edition presents numerous image analysis examples and algorithms

  6. INTEGRATION OF POKA YOKE INTO PROCESS FAILURE MODE AND EFFECT ANALYSIS: A CASE STUDY

    Directory of Open Access Journals (Sweden)

    A. P. Puvanasvaran

    2014-01-01

    Full Text Available Failure Mode and Effect Analysis (FMEA) is one of the requirements imposed by the Automotive Industry Action Group (AIAG) on all automotive suppliers and manufacturers worldwide through the TS16949 quality system. Many discrepancies have been detected in implementing FMEA, directly related to user experience and knowledge, and these discrepancies prevent the FMEA from meeting its objectives. Conceptually, Poka Yoke is able to fit into the process FMEA. FMEA helps predict and prevent problems through proper control or detection methods, while mistake proofing emphasizes detection and correction of mistakes before they become defects. Poka Yoke helps people and processes work correctly the first time. It refers to techniques that make mistakes impossible to commit. These techniques eliminate defects from products and processes and substantially improve their quality and reliability. Poka Yoke can be considered an extension of FMEA. The use of simple Poka Yoke ideas and methods in product and process design eliminates both human and mechanical errors. Ultimately, both FMEA and Poka Yoke methodologies result in zero defects and benefit either the end or the next-in-line customer. The first concept of Poka Yoke emphasizes elimination of the cause or occurrence of the error that creates the defects by concentrating on the cause of the error in the process; the defect is prevented by stopping the line or the machine when the root cause of the defect is triggered or detected. The second concept of Poka Yoke focuses on the effectiveness of the detection system; a foolproof detection system eliminates the defect or detects the error that causes defects, so that errors or defects cannot slip through the process and reach the customer.
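
    One worked detail that is standard in process FMEA, though not spelled out above: each failure mode is scored for severity, occurrence and detection on 1-10 scales, and their product, the risk priority number (RPN), ranks where mistake-proofing effort such as Poka Yoke should go first. The failure modes and scores below are invented for illustration:

    ```python
    # Minimal FMEA risk-priority-number (RPN) ranking; entries are hypothetical.
    failure_modes = [
        # (description, severity, occurrence, detection), each on a 1-10 scale
        ("wrong component inserted", 8, 4, 6),
        ("missing screw",            6, 5, 3),
        ("misaligned label",         3, 7, 2),
    ]

    ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
    for desc, s, o, d in ranked:
        print(f"RPN {s * o * d:4d}  {desc}  (S={s}, O={o}, D={d})")
    ```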

  7. Streak detection and analysis pipeline for space-debris optical images

    Science.gov (United States)

    Virtanen, Jenni; Poikonen, Jonne; Säntti, Tero; Komulainen, Tuomo; Torppa, Johanna; Granvik, Mikael; Muinonen, Karri; Pentikäinen, Hanna; Martikainen, Julia; Näränen, Jyri; Lehti, Jussi; Flohrer, Tim

    2016-04-01

    We describe a novel data-processing and analysis pipeline for optical observations of moving objects, either of natural (asteroids, meteors) or artificial origin (satellites, space debris). The monitoring of the space object populations requires reliable acquisition of observational data, to support the development and validation of population models and to build and maintain catalogues of orbital elements. The orbital catalogues are, in turn, needed for the assessment of close approaches (for asteroids, with the Earth; for satellites, with each other) and for the support of contingency situations or launches. For both types of populations, there is also increasing interest to detect fainter objects corresponding to the small end of the size distribution. The ESA-funded StreakDet (streak detection and astrometric reduction) activity has aimed at formulating and discussing suitable approaches for the detection and astrometric reduction of object trails, or streaks, in optical observations. Our two main focuses are objects in lower altitudes and space-based observations (i.e., high angular velocities), resulting in long (potentially curved) and faint streaks in the optical images. In particular, we concentrate on single-image (as compared to consecutive frames of the same field) and low-SNR detection of objects. Particular attention has been paid to the process of extraction of all necessary information from one image (segmentation), and subsequently, to efficient reduction of the extracted data (classification). We have developed an automated streak detection and processing pipeline and demonstrated its performance with an extensive database of semisynthetic images simulating streak observations both from ground-based and space-based observing platforms. The average processing time per image is about 13 s for a typical 2k-by-2k image. For long streaks (length >100 pixels), primary targets of the pipeline, the detection sensitivity (true positives) is about 90% for

  8. Stream computing for biomedical signal processing: A QRS complex detection case-study.

    Science.gov (United States)

    Murphy, B M; O'Driscoll, C; Boylan, G B; Lightbody, G; Marnane, W P

    2015-08-01

    Recent developments in "Big Data" have brought significant gains in the ability to process large amounts of data on commodity server hardware. Stream computing is a relatively new paradigm in this area, addressing the need to process data in real time with very low latency. While this approach has been developed for dealing with large scale data from the world of business, security and finance, there is a natural overlap with clinical needs for physiological signal processing. In this work we present a case study of streams processing applied to a typical physiological signal processing problem: QRS detection from ECG data. PMID:26737641
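
    To make the case study concrete, a stripped-down QRS detector in the Pan-Tompkins spirit (band-pass, differentiate, square, integrate, threshold) fits in a few lines; this is a generic offline sketch, not the streaming implementation evaluated in the paper:

    ```python
    import numpy as np
    from scipy import signal

    def detect_qrs(ecg, fs):
        """Very small Pan-Tompkins-style QRS detector; returns peak sample indices."""
        # Band-pass around the QRS energy band (roughly 5-15 Hz)
        b, a = signal.butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
        filt = signal.filtfilt(b, a, ecg)
        # Differentiate, square, and integrate over a ~150 ms moving window
        feat = np.convolve(np.diff(filt) ** 2, np.ones(int(0.15 * fs)), mode="same")
        peaks, _ = signal.find_peaks(feat,
                                     height=0.3 * feat.max(),   # crude threshold
                                     distance=int(0.25 * fs))   # 250 ms refractory
        return peaks

    fs = 250.0
    t = np.arange(0, 10, 1 / fs)
    ecg = np.sin(2 * np.pi * 1.2 * t) ** 63 + 0.05 * np.random.randn(t.size)  # toy ECG
    print("detected beats:", len(detect_qrs(ecg, fs)))
    ```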

  9. Simplified Processing Method for Meter Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Colotelo, Alison H. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Downs, Janelle L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ham, Kenneth D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Montgomery, Sadie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Vernon, Christopher R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Parker, Steven A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-11-01

    A simple, quick metered-data processing method that can be used for Army Metered Data Management System (MDMS) and Logistics Innovation Agency data, but may also be useful for other large data sets. It is intended for large data sets where the analyst has little information about the buildings.

  10. Bayesian analysis of Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2006-01-01

    Recently Møller, Pettitt, Berthelsen and Reeves introduced a new MCMC methodology for drawing samples from a posterior distribution when the likelihood function is only specified up to a normalising constant. We illustrate the method in the setting of Bayesian inference for Markov point processes...

  11. SPAN C - Terminal sterilization process analysis program

    Science.gov (United States)

    1969-01-01

    Computer program, SPAN-C, measures the dry heat thermal sterilization process applied to a planetary capsule and calculates the time required for heat application, steady state conditions, and cooling. The program is based on the logarithmic survival of micro-organisms. Temperature profiles must be input on cards.

  12. SPAN - Terminal sterilization process analysis program

    Science.gov (United States)

    1969-01-01

    Computer program, SPAN, measures the dry heat thermal sterilization process applied to a planetary capsule and calculates the time required for heat application, steady state conditions, and cooling. The program is based on the logarithmic survival of micro-organisms. Temperature profiles must be input on tape.

  13. Fingerprint Analysis with Marked Point Processes

    DEFF Research Database (Denmark)

    Forbes, Peter G. M.; Lauritzen, Steffen; Møller, Jesper

    We present a framework for fingerprint matching based on marked point process models. An efficient Monte Carlo algorithm is developed to calculate the marginal likelihood ratio for the hypothesis that two observed prints originate from the same finger against the hypothesis that they originate from different fingers. Our model achieves good performance on an NIST-FBI fingerprint database of 258 matched fingerprint pairs.

  14. Population analysis for atomic cascade decay processes

    International Nuclear Information System (INIS)

    Down-stream cascade decay processes in atomic systems are analyzed by solving a coupled rate equation, for which an analytical solution for the population in each excited state is obtained. Some typical numerical examples of populations are also given to interpret the decay paths connected to features of optical or electron spectra observed in various collision experiments. (author)

  15. Core power distribution fault detection process for a nuclear pressurized water reactor device for carrying out the process

    International Nuclear Information System (INIS)

    Power distribution faults in the core of a PWR are detected by measuring at least one parameter representative of the core power, each parameter being measured at a predetermined number of points. For each parameter, one calculates the difference between the two extreme measured values and the ratio of this difference to the smallest measured value, and compares this ratio with a reference value; a fault is detected if the ratio is greater than the reference value. The detectors are placed symmetrically near the core periphery. Preferentially, a detector is placed near the core center. The detectors are, according to the realization, neutron flux measuring chambers and/or temperature sensors. The detection process is very sensitive and can localise faults, whatever their cause, at any position in the core
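
    The detection criterion itself reduces to a few lines of arithmetic; a schematic version for one measured parameter, with synthetic readings and an arbitrary reference value, is:

    ```python
    # Readings of one core-power parameter at symmetric peripheral points (synthetic)
    readings = [98.2, 101.5, 99.8, 100.4, 97.9, 112.6]   # e.g. flux, arbitrary units
    reference_ratio = 0.10                               # plant-specific reference

    spread = max(readings) - min(readings)
    ratio = spread / min(readings)
    fault_detected = ratio > reference_ratio

    print(f"ratio = {ratio:.3f} -> {'FAULT' if fault_detected else 'normal'}")
    ```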

  16. Testing effort dependent software reliability model for imperfect debugging process considering both detection and correction

    International Nuclear Information System (INIS)

    This paper studies the fault detection process (FDP) and the fault correction process (FCP) with the incorporation of a testing effort function and imperfect debugging. In order to ensure high reliability, it is essential for software to undergo a testing phase, during which faults can be detected and corrected by debuggers. The testing resource allocation during this phase, usually described by the testing effort function, considerably influences not only the fault detection rate but also the time to correct a detected fault. In addition, testing is usually far from perfect, so that new faults may be introduced. In this paper, we first show how to incorporate the testing effort function and fault introduction into the FDP, and then develop the FCP as a delayed FDP with a correction effort. Various specific paired FDP and FCP models are obtained based on different assumptions about fault introduction and correction effort. An illustrative example is presented. The optimal release policy under different criteria is also discussed
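
    One common specification along these lines, written here as a sketch consistent with the abstract rather than the paper's exact model, couples the detection process to the testing effort and treats correction as delayed detection:

    ```latex
    % m_d(t): expected faults detected by time t;  W(t): cumulative testing effort,
    % w(t) = dW/dt;  a(t): total fault content with introduction rate \alpha.
    \[
      \frac{\mathrm{d}m_d(t)}{\mathrm{d}t} = b\,w(t)\left[a(t) - m_d(t)\right],
      \qquad a(t) = a_0 + \alpha\, m_d(t),
    \]
    % Fault correction as a delayed detection process with correction delay \Delta:
    \[
      m_c(t) = m_d(t - \Delta).
    \]
    ```

    With alpha = 0 (perfect debugging) and constant effort w, the detection part reduces to the familiar exponential mean value function m_d(t) = a(1 - e^{-bwt}) of Goel-Okumoto type.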

  17. A critical evaluation of the principal component analysis detection of polarized signatures using real stellar data

    Science.gov (United States)

    Paletou, F.

    2012-08-01

    The general context of this study is the post-processing of multiline spectropolarimetric observations of stars, and in particular the numerical analysis techniques aiming at detecting and characterizing polarized signatures. Using real observational data, we compare and clarify several points concerning various methods of analysis. We applied and compared the results of simple line addition, least-squares deconvolution, and denoising by principal component analysis to polarized stellar spectra available from the TBLegacy database of the Narval spectropolarimeter. This comparison of various approaches of distinct sophistication levels allows us to make a safe choice for the next implementation of on-line post-processing of our unique database for the stellar physics community.
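
    Among the approaches compared, the PCA denoising step is the simplest to illustrate: stack the individual line profiles into a matrix, keep the leading singular components, and reconstruct. The NumPy sketch below runs on synthetic "spectra", not Narval data:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic multiline data: 500 noisy copies of a common line profile
    x = np.linspace(-5, 5, 200)
    profile = np.exp(-x ** 2)                        # shared spectral-line shape
    spectra = profile + 0.5 * rng.normal(size=(500, x.size))

    # PCA denoising via truncated SVD: keep the k leading components
    k = 3
    mean = spectra.mean(axis=0)
    U, s, Vt = np.linalg.svd(spectra - mean, full_matrices=False)
    denoised = (U[:, :k] * s[:k]) @ Vt[:k] + mean

    print(f"residual noise: {np.std(spectra - profile):.3f} -> "
          f"{np.std(denoised - profile):.3f}")
    ```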

  18. Processing Cost Analysis for Biomass Feedstocks

    Energy Technology Data Exchange (ETDEWEB)

    Badger, P.C.

    2002-11-20

    The receiving, handling, storing, and processing of woody biomass feedstocks is an overlooked component of biopower systems. The purpose of this study was twofold: (1) to identify and characterize all the receiving, handling, storing, and processing steps required to make woody biomass feedstocks suitable for use in direct combustion and gasification applications, including small modular biopower (SMB) systems, and (2) to estimate the capital and operating costs at each step. Since biopower applications can be varied, a number of conversion systems and feedstocks required evaluation. In addition to limiting this study to woody biomass feedstocks, the boundaries of this study were from the power plant gate to the feedstock entry point into the conversion device. Although some power plants are sited at a source of wood waste fuel, it was assumed for this study that all wood waste would be brought to the power plant site. This study was also confined to the following three feedstocks (1) forest residues, (2) industrial mill residues, and (3) urban wood residues. Additionally, the study was confined to grate, suspension, and fluidized bed direct combustion systems; gasification systems; and SMB conversion systems. Since scale can play an important role in types of equipment, operational requirements, and capital and operational costs, this study examined these factors for the following direct combustion and gasification system size ranges: 50, 20, 5, and 1 MWe. The scope of the study also included: Specific operational issues associated with specific feedstocks (e.g., bark and problems with bridging); Opportunities for reducing handling, storage, and processing costs; How environmental restrictions can affect handling and processing costs (e.g., noise, commingling of treated wood or non-wood materials, emissions, and runoff); and Feedstock quality issues and/or requirements (e.g., moisture, particle size, presence of non-wood materials). The study found that over the

  19. Laser Particle Heating Process in a Stand-off Photo-thermal Explosive Detection System

    OpenAIRE

    Cassady, Philip

    2011-01-01

    Recent publications have described a method for stand-off optical detection of explosives using resonant infra-red photothermal imaging. This technique uses tuned lasers to selectively heat small particles of explosive lying on a substrate surface. The presence of these heated particles is then detected using thermal infra-red imagery. Although the method has been experimentally demonstrated, no adequate theoretical analysis of the laser heating and subsequent particle cooling has been develo...

  20. ELINT signal processing on reconfigurable computers for detection and classification of LPI Emitters

    OpenAIRE

    Brown, Dane A.

    2006-01-01

    This thesis describes the implementation of an ELINT algorithm for the detection and classification of Low Probability of Intercept (LPI) signals. The algorithm was coded in the C programming language and executed on a Field Programmable Gate Array based reconfigurable computer; the SRC-6 manufactured by SRC Computers, Inc. Specifically, this thesis focuses on the preprocessing stage of an LPI signal processing algorithm. This stage receives a detected signal that has been run through a Q...