WorldWideScience

Sample records for EMD-based event analysis

  1. Estimating the impact of extreme events on crude oil price. An EMD-based event analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xun; Wang, Shouyang [Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190 (China); School of Mathematical Sciences, Graduate University of Chinese Academy of Sciences, Beijing 100190 (China); Yu, Lean [Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190 (China); Lai, Kin Keung [Department of Management Sciences, City University of Hong Kong, Tat Chee Avenue, Kowloon (China)

    2009-09-15

    The impact of extreme events on crude oil markets is of great importance in crude oil price analysis because such events generally exert a strong impact on those markets. To better estimate the impact of events on crude oil price volatility, this study applies an EMD-based event analysis approach. In the proposed method, the time series to be analyzed is first decomposed into several intrinsic modes with different time scales, from fine to coarse, plus an average trend. The decomposed modes respectively capture the fluctuations caused by the extreme event or by other factors during the analyzed period. It is found that the total impact of an extreme event is contained in only one or a few dominant modes, while the secondary modes provide valuable information on subsequent factors. For overlapping events whose influences last for different periods, the impacts are separated and located in different modes. For illustration and verification, two extreme events, the Persian Gulf War in 1991 and the Iraq War in 2003, are analyzed step by step. The empirical results reveal that the EMD-based event analysis method provides a feasible solution for estimating the impact of extreme events on crude oil price variation. (author)
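    A minimal sketch of the core idea in this record: decompose a series with EMD and ask which modes concentrate their energy around a known event window. The PyEMD package (installed as EMD-signal), the synthetic price series, and the energy-share criterion are illustrative assumptions, not the authors' exact procedure.

```python
# Sketch: decompose a price series with EMD, then ask which modes
# concentrate their energy in a known event window (dominant modes).
# Assumes the PyEMD package (pip install EMD-signal); data are synthetic.
import numpy as np
from PyEMD import EMD

rng = np.random.default_rng(0)
t = np.arange(1000, dtype=float)
price = 20 + 0.005 * t + rng.normal(0, 0.3, t.size)          # trend + noise
price[400:520] += 8.0 * np.exp(-(t[400:520] - 400) / 60.0)   # event shock

imfs = EMD().emd(price)        # rows run fine-to-coarse; residue comes last

event = slice(380, 560)
for i, imf in enumerate(imfs):
    share = np.sum(imf[event] ** 2) / np.sum(imf ** 2)
    print(f"mode {i}: energy share in event window = {share:.2f}")
# Modes whose energy concentrates in the window are candidates for the
# dominant mode(s); summing them gives a crude estimate of the event impact.
```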

  2. Estimating the impact of extreme events on crude oil price. An EMD-based event analysis method

    International Nuclear Information System (INIS)

    Zhang, Xun; Wang, Shouyang; Yu, Lean; Lai, Kin Keung

    2009-01-01

    The impact of extreme events on crude oil markets is of great importance in crude oil price analysis because such events generally exert a strong impact on those markets. To better estimate the impact of events on crude oil price volatility, this study applies an EMD-based event analysis approach. In the proposed method, the time series to be analyzed is first decomposed into several intrinsic modes with different time scales, from fine to coarse, plus an average trend. The decomposed modes respectively capture the fluctuations caused by the extreme event or by other factors during the analyzed period. It is found that the total impact of an extreme event is contained in only one or a few dominant modes, while the secondary modes provide valuable information on subsequent factors. For overlapping events whose influences last for different periods, the impacts are separated and located in different modes. For illustration and verification, two extreme events, the Persian Gulf War in 1991 and the Iraq War in 2003, are analyzed step by step. The empirical results reveal that the EMD-based event analysis method provides a feasible solution for estimating the impact of extreme events on crude oil price variation. (author)

  3. EMD-Based Symbolic Dynamic Analysis for the Recognition of Human and Nonhuman Pyroelectric Infrared Signals

    Directory of Open Access Journals (Sweden)

    Jiaduo Zhao

    2016-01-01

    In this paper, we propose an effective human and nonhuman pyroelectric infrared (PIR) signal recognition method to reduce PIR detector false alarms. First, using the mathematical model of the PIR detector, we analyze the physical characteristics of human and nonhuman PIR signals; second, based on the analysis results, we propose an empirical mode decomposition (EMD)-based symbolic dynamic analysis method for the recognition of human and nonhuman PIR signals. In the proposed method, we first extract the detailed features of a PIR signal into five symbol sequences using an EMD-based symbolization method; then, we generate five feature descriptors for each PIR signal by constructing five probabilistic finite state automata from the symbol sequences. Finally, we use a weighted voting classification strategy to classify the PIR signals with their feature descriptors. Comparative experiments show that the proposed method can effectively classify human and nonhuman PIR signals and reduce PIR detector false alarms.

  4. EMD-Based Symbolic Dynamic Analysis for the Recognition of Human and Nonhuman Pyroelectric Infrared Signals.

    Science.gov (United States)

    Zhao, Jiaduo; Gong, Weiguo; Tang, Yuzhen; Li, Weihong

    2016-01-20

    In this paper, we propose an effective human and nonhuman pyroelectric infrared (PIR) signal recognition method to reduce PIR detector false alarms. First, using the mathematical model of the PIR detector, we analyze the physical characteristics of human and nonhuman PIR signals; second, based on the analysis results, we propose an empirical mode decomposition (EMD)-based symbolic dynamic analysis method for the recognition of human and nonhuman PIR signals. In the proposed method, we first extract the detailed features of a PIR signal into five symbol sequences using an EMD-based symbolization method; then, we generate five feature descriptors for each PIR signal by constructing five probabilistic finite state automata from the symbol sequences. Finally, we use a weighted voting classification strategy to classify the PIR signals with their feature descriptors. Comparative experiments show that the proposed method can effectively classify human and nonhuman PIR signals and reduce the PIR detector's false alarms.
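    A rough sketch of the symbolic-dynamics step described above: quantize a signal (e.g., one IMF) into a small alphabet and estimate a Markov transition matrix, a simple probabilistic finite state automaton, whose entries serve as a feature descriptor. The alphabet size and the equal-frequency partitioning are assumptions; the paper's exact symbolization is not reproduced.

```python
# Sketch: turn one IMF into a symbol sequence by amplitude quantization,
# then estimate a Markov transition matrix (a simple PFSA) as a feature.
# Alphabet size and equal-frequency partitioning are illustrative choices.
import numpy as np

def symbolize(x, n_symbols=4):
    """Map samples to symbols 0..n-1 using equal-frequency partitioning."""
    edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(x, edges)

def pfsa_features(symbols, n_symbols=4):
    """Row-stochastic transition matrix, flattened into a descriptor."""
    counts = np.zeros((n_symbols, n_symbols))
    for a, b in zip(symbols[:-1], symbols[1:]):
        counts[a, b] += 1
    probs = counts / np.maximum(counts.sum(axis=1, keepdims=True), 1)
    return probs.ravel()

rng = np.random.default_rng(1)
imf = np.sin(np.linspace(0, 40, 2000)) + 0.2 * rng.normal(size=2000)
descriptor = pfsa_features(symbolize(imf))
print(descriptor.reshape(4, 4))  # one descriptor per IMF; vote over IMFs
```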

  5. Bi-spectrum based-EMD applied to the non-stationary vibration signals for bearing faults diagnosis.

    Science.gov (United States)

    Saidi, Lotfi; Ali, Jaouher Ben; Fnaiech, Farhat

    2014-09-01

    Empirical mode decomposition (EMD) has been widely applied to analyze vibration signal behavior for bearing failure detection. Vibration signals are almost always non-stationary, since bearings operate under inherently dynamic conditions (e.g., speed and load change over time). Using EMD, the complicated non-stationary vibration signal is decomposed into a number of stationary intrinsic mode functions (IMFs) based on the local characteristic time scale of the signal. The bi-spectrum, a third-order statistic, helps to identify phase coupling effects; it is theoretically zero for Gaussian noise and flat for non-Gaussian white noise, so bi-spectrum analysis is insensitive to random noise, which is useful for detecting faults in induction machines. Utilizing the advantages of EMD and the bi-spectrum, this article proposes a joint method for detecting such faults, called bi-spectrum based EMD (BSEMD). First, original vibration signals collected from accelerometers are decomposed by EMD and a set of IMFs is produced. Then, the IMF signals are analyzed via the bi-spectrum to detect outer race bearing defects. The procedure is illustrated with experimental bearing vibration data. The experimental results show that BSEMD techniques can effectively diagnose bearing failures.
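    A sketch of a direct bispectrum estimate such as might be applied to each IMF. The segment-averaging estimator below is a textbook construction; the segment length, window, and test signal with quadratic phase coupling are assumptions, not the paper's settings.

```python
# Sketch: direct bispectrum estimate B(f1, f2) = E[X(f1) X(f2) X*(f1+f2)]
# by averaging FFT triple products over segments of the signal.
import numpy as np

def bispectrum(x, nseg=32, nfft=128):
    segs = x[: nseg * nfft].reshape(nseg, nfft)
    segs = segs - segs.mean(axis=1, keepdims=True)
    X = np.fft.fft(segs * np.hanning(nfft), axis=1)
    k = nfft // 2
    B = np.zeros((k, k), dtype=complex)
    for f1 in range(k):
        for f2 in range(k):
            if f1 + f2 < nfft:  # bounds guard
                B[f1, f2] = np.mean(X[:, f1] * X[:, f2]
                                    * np.conj(X[:, f1 + f2]))
    return np.abs(B)

rng = np.random.default_rng(2)
n = 32 * 128
t = np.arange(n)
# quadratic phase coupling: components at f1, f2 and f1 + f2
x = (np.cos(0.2 * np.pi * t) + np.cos(0.3 * np.pi * t)
     + 0.5 * np.cos(0.5 * np.pi * t) + 0.5 * rng.normal(size=n))
print("peak bispectrum magnitude:", bispectrum(x).max())
```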

  6. Application of EMD-Based SVD and SVM to Coal-Gangue Interface Detection

    Directory of Open Access Journals (Sweden)

    Wei Liu

    2014-01-01

    Coal-gangue interface detection during top-coal caving mining is a challenging problem. This paper proposes a new vibration signal analysis approach for detecting the coal-gangue interface based on singular value decomposition (SVD) techniques and support vector machines (SVMs). Because of the nonstationary characteristics of the vibration signals of the tail boom support of the longwall mining machine in this complicated environment, empirical mode decomposition (EMD) is used to decompose the raw vibration signals into a number of intrinsic mode functions (IMFs), from which the initial feature vector matrices can be formed automatically. By applying the SVD algorithm to the initial feature vector matrices, the singular values of the matrices can be obtained and used as the input feature vectors of an SVM classifier. The analysis results of vibration signals from the tail boom support of a longwall mining machine show that the method based on EMD, SVD, and SVM is effective for coal-gangue interface detection even when the number of samples is small.
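    A hedged sketch of the EMD-SVD-SVM pipeline: decompose each signal, take the singular values of its IMF matrix as a compact feature vector, and train an SVM. PyEMD, scikit-learn, the synthetic "coal/gangue" signals, and the fixed feature length are assumptions for illustration.

```python
# Sketch: EMD -> SVD feature vector -> SVM classifier on synthetic
# "coal-only" vs "gangue" vibration signals. PyEMD + scikit-learn assumed.
import numpy as np
from PyEMD import EMD
from sklearn.svm import SVC

rng = np.random.default_rng(3)

def svd_features(signal, k=5):
    imfs = EMD().emd(signal)                    # initial feature matrix
    sv = np.linalg.svd(imfs, compute_uv=False)  # its singular values
    out = np.zeros(k)                           # pad/trim to fixed length
    out[: min(k, sv.size)] = sv[:k]
    return out

def make_signal(gangue):
    t = np.linspace(0, 1, 1024)
    x = np.sin(2 * np.pi * 50 * t) + 0.3 * rng.normal(size=t.size)
    if gangue:  # pretend gangue impacts add a high-frequency burst
        x += 0.8 * np.sin(2 * np.pi * 220 * t) * (t > 0.5)
    return x

labels = np.array([0, 1] * 20)
X = np.array([svd_features(make_signal(g)) for g in labels])
clf = SVC(kernel="rbf").fit(X[:30], labels[:30])
print("held-out accuracy:", clf.score(X[30:], labels[30:]))
```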

  7. A novel EMD selecting thresholding method based on multiple iteration for denoising LIDAR signal

    Science.gov (United States)

    Li, Meng; Jiang, Li-hui; Xiong, Xing-long

    2015-06-01

    The empirical mode decomposition (EMD) approach is believed to be potentially useful for processing nonlinear and non-stationary LIDAR signals. To shed further light on its performance, we propose the EMD selecting thresholding method based on multiple iteration, which is essentially a development of EMD interval thresholding (EMD-IT): it randomly alters the samples of the noisy parts of all corrupted intrinsic mode functions to achieve a better iteration effect. Simulations on both synthetic signals and real-world LIDAR signals support this method.

  8. Noise reduction in Lidar signal using correlation-based EMD combined with soft thresholding and roughness penalty

    Science.gov (United States)

    Chang, Jianhua; Zhu, Lingyan; Li, Hongxu; Xu, Fan; Liu, Binggang; Yang, Zhenbo

    2018-01-01

    Empirical mode decomposition (EMD) is widely used to analyze non-linear and non-stationary signals for noise reduction. In this study, a novel EMD-based denoising method, referred to as EMD with soft thresholding and roughness penalty (EMD-STRP), is proposed for Lidar signal denoising. In the proposed method, the relevant and irrelevant intrinsic mode functions are first distinguished via a correlation coefficient. Then, the soft thresholding technique is applied to the irrelevant modes, and the roughness penalty technique is applied to the relevant modes, so as to extract as much information as possible. The effectiveness of the proposed method was evaluated using three typical signals contaminated by white Gaussian noise. The denoising performance was then compared to that of other techniques, such as correlation-based EMD partial reconstruction, correlation-based EMD hard thresholding, and the wavelet transform. The use of EMD-STRP on the measured Lidar signal efficiently suppressed the noise, with an improved signal-to-noise ratio of 22.25 dB and an extended detection range of 11 km.
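    A sketch of the mode-partitioning step: IMFs weakly correlated with the raw signal are treated as noise-dominated and soft-thresholded, while the others are kept. The 0.2 cutoff and the universal threshold are illustrative assumptions, and the roughness-penalty smoothing of the relevant modes is omitted here.

```python
# Sketch: correlation-based IMF selection + soft thresholding of the
# noise-dominated modes, then reconstruction. PyEMD assumed; toy signal.
import numpy as np
from PyEMD import EMD

rng = np.random.default_rng(4)
t = np.linspace(0, 1, 2000)
clean = np.exp(-3 * t) * np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.3 * rng.normal(size=t.size)

imfs = EMD().emd(noisy)
denoised = np.zeros_like(noisy)
for imf in imfs:
    r = np.corrcoef(imf, noisy)[0, 1]
    if abs(r) < 0.2:                              # irrelevant mode
        sigma = np.median(np.abs(imf)) / 0.6745   # robust noise scale
        thr = sigma * np.sqrt(2 * np.log(imf.size))
        imf = np.sign(imf) * np.maximum(np.abs(imf) - thr, 0)
    denoised += imf

snr = 10 * np.log10(np.sum(clean**2) / np.sum((denoised - clean)**2))
print(f"output SNR: {snr:.1f} dB")
```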

  9. The Influence of Arginine on the Response of Enamel Matrix Derivative (EMD) Proteins to Thermal Stress: Towards Improving the Stability of EMD-Based Products.

    Directory of Open Access Journals (Sweden)

    Alessandra Apicella

    In a current procedure for periodontal tissue regeneration, enamel matrix derivative (EMD), the active component, is mixed with a propylene glycol alginate (PGA) gel carrier and applied directly to the periodontal defect. Exposure of EMD to physiological conditions then causes it to precipitate. However, environmental changes during manufacture and storage may modify the conformation of the EMD proteins, eventually leading to premature phase separation of the gel and a loss of therapeutic effectiveness. The present work relates to efforts to improve the stability of EMD-based formulations such as Emdogain™ through the incorporation of arginine, a well-known protein stabilizer, but one that to our knowledge has not so far been considered for this purpose. Representative EMD-buffer solutions with and without arginine were analyzed by 3D dynamic light scattering, UV-Vis spectroscopy, transmission electron microscopy and Fourier transform infrared spectroscopy at different acidic pH values and temperatures, T, in order to simulate the effect of pH variations and thermal stress during manufacture and storage. The results provided evidence that arginine may indeed stabilize EMD against irreversible aggregation with respect to variations in pH and T under these conditions. Moreover, stopped-flow transmittance measurements indicated that arginine addition does not suppress precipitation of EMD from either the buffers or the PGA gel carrier when the pH is raised to 7, a fundamental requirement for dental applications.

  10. Cathodic Polarization Coats Titanium Based Implant Materials with Enamel Matrix Derivate (EMD)

    Directory of Open Access Journals (Sweden)

    Matthias J. Frank

    2014-03-01

    The idea of a bioactive surface coating that enhances bone healing and bone growth is a strong focus of ongoing research on bone implant materials. Enamel matrix derivate (EMD) is well documented to support bone regeneration and to activate growth of mesenchymal tissues. Thus, it is a prime candidate for coating existing implant surfaces. The aim of this study was to show that cathodic polarization can be used to coat commercially available implant surfaces with an immobilized but functional and bio-available surface layer of EMD. After coating, XPS revealed EMD-related bindings on the surface, while SIMS showed incorporation of EMD into the surface. The hydride layer of the original surface could be activated for coating in an integrated one-step process that did not require any pre-treatment of the surface. SEM images showed EMD-related nano-spheres and nano-rods on the coated surfaces. Moreover, the surface roughness remained unchanged after coating, as shown by optical profilometry. The mass peaks observed in the matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) analysis confirmed the integrity of EMD after coating. Assessment of the bioavailability suggested that the modified surfaces were active for osteoblast-like MC3T3-E1 cells, showing enhanced Coll-1 gene expression and ALP activity.

  11. Study on multi-fractal fault diagnosis based on EMD fusion in hydraulic engineering

    International Nuclear Information System (INIS)

    Lu, Shibao; Wang, Jianhua; Xue, Yangang

    2016-01-01

    Highlights: • The measured shafting vibration signals of the hydroelectric generating set are processed with EMD. • The vibration signal waveform is identified and purified with EMD to obtain approximation coefficients of the various fault signals. • The multi-fractal spectrum provides distributed geometrical and probabilistic information about the signal. • EMD provides reliable information for subsequent analysis and recognition. - Abstract: Vibration signal analysis of a hydraulic turbine unit aims at extracting the characteristic information of the unit's vibration. Effective signal processing and information extraction are the key to state monitoring and fault diagnosis of the hydraulic turbine unit. In this paper, a vibration fault diagnosis model is established that combines EMD, the multi-fractal spectrum and a modified BP neural network. The vibration signal waveform is identified and purified with EMD to obtain approximation coefficients of the various fault signals; the characteristic vector of the vibration fault is acquired with the multi-fractal spectrum algorithm and is classified and identified as the input vector of the BP neural network. The signal characteristics are extracted from the waveform, and diagnosis and identification are carried out in combination with the multi-fractal spectrum, providing a new method for fault diagnosis of the hydraulic turbine unit. Application tests show that the method can improve the intelligence and humanization of diagnosis, enhance man–machine interaction, and produce satisfactory identification results.

  12. Denoising GPS-Based Structure Monitoring Data Using Hybrid EMD and Wavelet Packet

    Directory of Open Access Journals (Sweden)

    Lu Ke

    2017-01-01

    High-frequency components are often discarded when denoising data with pure wavelet multiscale or empirical mode decomposition (EMD) based approaches; discarding them, however, can cause energy leakage in vibration signals. A hybrid EMD and wavelet packet (EMD-WP) method is proposed to denoise Global Positioning System (GPS) based structure monitoring data. First, field observables are decomposed into a collection of intrinsic mode functions (IMFs) with different characteristics. Second, the high-frequency IMFs are denoised using the wavelet packet; the monitoring data are then reconstructed from the denoised IMFs together with the remaining low-frequency IMFs. Our algorithm is demonstrated on the synthetic displacement response of a 3-story frame excited by the El Centro earthquake, with Gaussian white noise of different levels added. We find that the hybrid method can effectively weaken the low-frequency multipath effect and can potentially extract vibration features. However, false modes may still arise from the noise remaining in the high-frequency IMFs and from noise whose frequency lies in the same band as the effective vibration. Finally, real GPS observables are used to evaluate the efficiency of the EMD-WP method in mitigating low-frequency multipath.
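    A hedged sketch of the hybrid idea: wavelet-denoise only the finest IMFs, then reconstruct with the untouched low-frequency IMFs. Plain wavelet decomposition from PyWavelets stands in for the paper's wavelet packet; the wavelet, level, number of "high-frequency" IMFs, and the toy GPS-like series are assumptions.

```python
# Sketch: EMD-WP style denoising -- threshold the finest IMFs with a
# wavelet transform, keep the coarse IMFs as-is, then reconstruct.
import numpy as np
import pywt
from PyEMD import EMD

def wavelet_denoise(x, wavelet="db4", level=4):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745     # noise scale from cD1
    thr = sigma * np.sqrt(2 * np.log(x.size))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft")
                            for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: x.size]

rng = np.random.default_rng(5)
t = np.linspace(0, 10, 4096)
gps = 0.002 * np.sin(2 * np.pi * 0.05 * t) + 0.001 * np.sin(2 * np.pi * 2 * t)
obs = gps + 0.0005 * rng.normal(size=t.size)

imfs = EMD().emd(obs)
n_high = 3   # treat the first few (finest) IMFs as noise-dominated
cleaned = (sum(wavelet_denoise(imf) for imf in imfs[:n_high])
           + imfs[n_high:].sum(axis=0))
print("residual rms:", np.sqrt(np.mean((cleaned - gps) ** 2)))
```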

  13. Forecasting crude oil price with an EMD-based neural network ensemble learning paradigm

    International Nuclear Information System (INIS)

    Yu, Lean; Wang, Shouyang; Lai, Kin Keung

    2008-01-01

    In this study, an empirical mode decomposition (EMD) based neural network ensemble learning paradigm is proposed for world crude oil spot price forecasting. For this purpose, the original crude oil spot price series is first decomposed into a finite, and often small, number of intrinsic mode functions (IMFs). Then a three-layer feed-forward neural network (FNN) model is used to model each of the extracted IMFs, so that the tendencies of these IMFs can be accurately predicted. Finally, the prediction results of all IMFs are combined with an adaptive linear neural network (ALNN) to formulate an ensemble output for the original crude oil price series. For verification and testing, two main crude oil price series, the West Texas Intermediate (WTI) and Brent crude oil spot prices, are used to test the effectiveness of the proposed EMD-based neural network ensemble learning methodology. The empirical results demonstrate the attractiveness of the proposed paradigm. (author)
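    A sketch of the decompose-predict-recombine paradigm: one small feed-forward network per IMF on lagged inputs, with a linear combiner standing in for the adaptive linear neural network (ALNN). PyEMD, scikit-learn, the lag order, and the synthetic price series are assumptions.

```python
# Sketch: per-IMF neural net forecasts, recombined by a linear model
# (stand-in for the paper's ALNN). Synthetic data for illustration only.
import numpy as np
from PyEMD import EMD
from sklearn.neural_network import MLPRegressor
from sklearn.linear_model import LinearRegression

def lagged(x, p=4):
    """Rows [x(t-p)..x(t-1)] predicting x(t)."""
    X = np.column_stack([x[i : len(x) - p + i] for i in range(p)])
    return X, x[p:]

rng = np.random.default_rng(6)
t = np.arange(600)
price = 30 + 0.02 * t + 5 * np.sin(2 * np.pi * t / 90) + rng.normal(0, 1, t.size)

imfs = EMD().emd(price)
split = 500
imf_preds = []
for imf in imfs:
    X, y = lagged(imf)
    m = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
    m.fit(X[: split - 4], y[: split - 4])
    imf_preds.append(m.predict(X))

P = np.column_stack(imf_preds)           # per-IMF one-step predictions
_, actual = lagged(price)
combiner = LinearRegression().fit(P[: split - 4], actual[: split - 4])
forecast = combiner.predict(P[split - 4 :])
print("test RMSE:", np.sqrt(np.mean((forecast - actual[split - 4 :]) ** 2)))
```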

  14. Online Condition Monitoring of Gripper Cylinder in TBM Based on EMD Method

    Science.gov (United States)

    Li, Lin; Tao, Jian-Feng; Yu, Hai-Dong; Huang, Yi-Xiang; Liu, Cheng-Liang

    2017-11-01

    The gripper cylinder that provides bracing force for a Tunnel Boring Machine (TBM) may fail due to severe vibration when the TBM excavates the tunnel. Early fault diagnosis of the gripper cylinder is important for the safety and efficiency of the whole tunneling project. In this paper, an online condition monitoring system based on the empirical mode decomposition (EMD) method is established for fault diagnosis of the gripper cylinder while the TBM is working. First, a lumped-mass parameter model of the gripper cylinder is established, considering the influence of the variable stiffness at the rock interface and the equivalent stiffness of the oil, the seals, and the copper guide sleeve. The dynamic performance of the gripper cylinder is investigated to provide a basis for evaluating its health condition. Then, the EMD method is applied to identify the characteristic frequencies of the gripper cylinder for fault diagnosis, and a field test is used to verify the accuracy of the EMD method in detecting the characteristic frequencies. Furthermore, the contact stiffness at the interface between the barrel and the rod is calculated with Hertz theory, and the relationship between the natural frequency and the stiffness, varying with the health condition of the cylinder, is simulated based on the dynamic model. The simulation shows that the characteristic frequencies decrease with increasing clearance between the barrel and the rod, so defects can be indicated by monitoring the natural frequency. Finally, a health condition management system for the gripper cylinder based on the vibration signal and the EMD method is established, which can help ensure the safety of the TBM.

  15. Fishery landing forecasting using EMD-based least square support vector machine models

    Science.gov (United States)

    Shabri, Ani

    2015-05-01

    In this paper, a novel hybrid ensemble learning paradigm integrating empirical mode decomposition (EMD) and least squares support vector machines (LSSVM) is proposed to improve the accuracy of fishery landing forecasting. The hybrid is formulated specifically for modeling fishery landings, which form highly nonlinear, non-stationary and seasonal time series that can hardly be modeled properly and forecasted accurately by traditional statistical models. In the hybrid model, EMD is used to decompose the original data into a finite, and often small, number of sub-series. Each sub-series is modeled and forecasted by an LSSVM model. Finally, the forecast of fishery landing is obtained by aggregating the forecasting results of all sub-series. To assess the effectiveness and predictability of EMD-LSSVM, monthly fishery landing records from East Johor, Peninsular Malaysia, are used as a case study. The results show that the proposed model yields better forecasts than Autoregressive Integrated Moving Average (ARIMA), LSSVM and EMD-ARIMA models on several criteria.

  16. EMDS users guide (version 2.0): knowledge-based decision support for ecological assessment.

    Science.gov (United States)

    Keith M. Reynolds

    1999-01-01

    The USDA Forest Service Pacific Northwest Research Station in Corvallis, Oregon, has developed the ecosystem management decision support (EMDS) system. The system integrates the logical formalism of knowledge-based reasoning into a geographic information system (GIS) environment to provide decision support for ecological landscape assessment and evaluation. The...

  17. Multivariate EMD-Based Modeling and Forecasting of Crude Oil Price

    Directory of Open Access Journals (Sweden)

    Kaijian He

    2016-04-01

    Recent empirical studies reveal evidence of the co-existence of heterogeneous data characteristics, distinguishable by time scale, in the movement of crude oil prices. In this paper we propose a new multivariate Empirical Mode Decomposition (EMD) based model to take advantage of these heterogeneous characteristics of the price movement and to model them in the crude oil markets. Empirical studies in benchmark crude oil markets confirm that more diverse heterogeneous data characteristics can be revealed and modeled in the projected time-delayed domain. The proposed model demonstrates superior performance compared to the benchmark models.

  18. EMD-Based Predictive Deep Belief Network for Time Series Prediction: An Application to Drought Forecasting

    Directory of Open Access Journals (Sweden)

    Norbert A. Agana

    2018-02-01

    Drought is a stochastic natural feature that arises from intense and persistent shortage of precipitation. Its impact is mostly manifested as agricultural and hydrological droughts following an initial meteorological phenomenon. Drought prediction is essential because it can aid in preparedness and impact-related management. This study considers the drought forecasting problem by developing a hybrid predictive model using a denoised empirical mode decomposition (EMD) and a deep belief network (DBN). The proposed method first decomposes the data into several intrinsic mode functions (IMFs) using EMD, and a reconstruction of the original data is obtained by considering only the relevant IMFs. Detrended fluctuation analysis (DFA) is applied to each IMF to determine the threshold for robust denoising performance: based on their scaling exponents, irrelevant intrinsic mode functions are identified and suppressed. The proposed method was applied to predict drought indices at different time scales across the Colorado River basin, using a standardized streamflow index (SSI) as the drought index. The results obtained using the proposed method were compared with standard methods such as the multilayer perceptron (MLP) and support vector regression (SVR). The proposed hybrid model showed improved prediction accuracy, especially for multi-step-ahead predictions.

  19. Emotion Recognition from EEG Signals Using Multidimensional Information in EMD Domain.

    Science.gov (United States)

    Zhuang, Ning; Zeng, Ying; Tong, Li; Zhang, Chi; Zhang, Hanming; Yan, Bin

    2017-01-01

    This paper introduces a method for feature extraction and emotion recognition based on empirical mode decomposition (EMD). Using EMD, EEG signals are decomposed into intrinsic mode functions (IMFs) automatically. Multidimensional information from the IMFs is utilized as features: the first difference of the time series, the first difference of the phase, and the normalized energy. The performance of the proposed method is verified on a publicly available emotional database. The results show that the three features are effective for emotion recognition. The role of each IMF is investigated, and we find that the high-frequency component IMF1 has a significant effect on detecting different emotional states. The informative electrodes, identified with the EMD strategy, are analyzed. In addition, the classification accuracy of the proposed method is compared with several classical techniques, including fractal dimension (FD), sample entropy, differential entropy, and the discrete wavelet transform (DWT). Experimental results on the DEAP dataset demonstrate that our method can improve emotion recognition performance.
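    A sketch of the three per-IMF features named above: the mean absolute first difference of the series, the mean absolute first difference of the instantaneous phase (via the Hilbert transform), and the normalized energy. PyEMD, SciPy, and the toy random-walk "EEG" trace are assumptions.

```python
# Sketch: extract the three per-IMF features used for emotion recognition.
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD

rng = np.random.default_rng(7)
eeg = np.cumsum(rng.normal(size=1280))   # toy 10 s "EEG" at 128 Hz

imfs = EMD().emd(eeg)
total_energy = np.sum(imfs ** 2)
for i, imf in enumerate(imfs):
    d1 = np.mean(np.abs(np.diff(imf)))            # first difference of series
    phase = np.unwrap(np.angle(hilbert(imf)))
    dphi = np.mean(np.abs(np.diff(phase)))        # first difference of phase
    energy = np.sum(imf ** 2) / total_energy      # normalized energy
    print(f"IMF{i + 1}: d1={d1:.3f}  dphi={dphi:.3f}  energy={energy:.3f}")
```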

  20. A new approach for crude oil price analysis based on empirical mode decomposition

    International Nuclear Information System (INIS)

    Zhang, Xun; Wang, Shou-Yang; Lai, K.K.

    2008-01-01

    The importance of understanding the underlying characteristics of international crude oil price movements attracts much attention from academic researchers and business practitioners. Due to the intrinsic complexity of the oil market, however, most of them fail to produce consistently good results. Empirical Mode Decomposition (EMD), recently proposed by Huang et al., appears to be a novel data analysis method for nonlinear and non-stationary time series. By decomposing a time series into a small number of independent and concretely implicational intrinsic modes based on scale separation, EMD explains the generation of time series data from a novel perspective. Ensemble EMD (EEMD) is a substantial improvement of EMD which can better separate the scales naturally by adding white noise series to the original time series and then treating the ensemble averages as the true intrinsic modes. In this paper, we extend EEMD to crude oil price analysis. First, three crude oil price series with different time ranges and frequencies are decomposed into several independent intrinsic modes, from high to low frequency. Second, the intrinsic modes are composed into a fluctuating process, a slowly varying part and a trend based on fine-to-coarse reconstruction. The economic meanings of the three components are identified as short-term fluctuations caused by normal supply-demand disequilibrium or some other market activities, the effect of a shock from a significant event, and a long-term trend. Finally, EEMD is shown to be a vital technique for crude oil price analysis. (author)
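    A minimal, hand-rolled EEMD sketch: add many white-noise realizations, decompose each noisy copy with EMD, and average the IMFs. PyEMD's EMD is used per trial; the trial count, noise amplitude, and fixed mode count are assumptions, and since EMD can return a different number of modes per trial, mode-wise averaging here is only approximate.

```python
# Sketch: ensemble EMD (EEMD) by noise-assisted averaging of EMD modes.
import numpy as np
from PyEMD import EMD

def eemd(x, trials=50, noise_std=0.1, n_modes=6, seed=8):
    rng = np.random.default_rng(seed)
    acc = np.zeros((n_modes, x.size))
    for _ in range(trials):
        noisy = x + noise_std * x.std() * rng.normal(size=x.size)
        imfs = EMD().emd(noisy)
        k = min(n_modes, imfs.shape[0])   # mode counts vary per trial
        acc[:k] += imfs[:k]
    return acc / trials

rng = np.random.default_rng(9)
t = np.linspace(0, 1, 1500)
series = np.sin(2 * np.pi * 3 * t) + 5 * t**2 + 0.2 * rng.normal(size=t.size)
modes = eemd(series)
trend = modes[-2:].sum(axis=0)   # coarse modes ~ slowly varying part + trend
print("trend endpoints:", trend[0], trend[-1])
```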

  1. An imbalance fault detection method based on data normalization and EMD for marine current turbines.

    Science.gov (United States)

    Zhang, Milu; Wang, Tianzhen; Tang, Tianhao; Benbouzid, Mohamed; Diallo, Demba

    2017-05-01

    This paper proposes an imbalance fault detection method based on data normalization and empirical mode decomposition (EMD) for variable-speed direct-drive marine current turbine (MCT) systems. The method is based on the MCT stator current under wave and turbulence conditions. Its goal is to extract the blade imbalance fault feature, which is concealed by the supply frequency and environmental noise. First, a Generalized Likelihood Ratio Test (GLRT) detector is developed and the monitoring variable is selected by analyzing the relationships between the variables. Then, the selected monitoring variable is converted into a time series through data normalization, which makes the imbalance fault characteristic frequency a constant. Finally, the monitoring variable is filtered by the EMD method to eliminate the effect of turbulence. Experiments comparing different fault severities and turbulence intensities show that the proposed method is robust against turbulence. Compared with other methods, the experimental results indicate the feasibility and efficacy of the proposed method.

  2. Dopamine agonist activity of EMD 23,448

    Energy Technology Data Exchange (ETDEWEB)

    Martin, G E; Pettibone, D J [Merck Sharp and Dohme Research Laboratories, West Point, Pennsylvania (USA). Dept. of Pharmacology

    1985-01-01

    EMD 23,448 was examined in tests of dopaminergic function and was found to be an atypical dopamine (DA) agonist. EMD 23,448 was a weak or inactive DA agonist when examined in tests of normal postsynaptic DA receptor function: production of stereotypy in the rat (ED₅₀ > 5.0 mg/kg i.p.); production of emesis in beagles (minimum effective dose = 81 μg/kg i.v.); and enhanced locomotor activity of the mouse (no excitation at doses ≤ 50 mg/kg i.p.). Moreover, EMD 23,448 was relatively weak in competing for [³H]-apomorphine binding to rat striatal membranes (Ki, 205 nM). On the other hand, this indolyl-3-butylamine did activate supersensitive postsynaptic DA receptors. Specifically, it elicited contralateral turning in rats with a unilateral 6-hydroxydopamine lesion of the substantia nigra (ED₅₀ = 0.9 mg/kg) and did elicit stereotypy in rats given chronic daily haloperidol treatments. EMD 23,448 also exerted pharmacological effects in tests designed to measure activation of dopamine autoreceptors. It inhibited the γ-butyrolactone-induced increase in striatal dopa levels (ED₅₀ = 1 mg/kg i.p.) and produced a dose-related fall in the locomotor activity of the mouse. The results are discussed and contrasted with data derived for apomorphine and the putatively selective autoreceptor agonist (±)-3-PPP.

  3. A hybrid approach EMD-HW for short-term forecasting of daily stock market time series data

    Science.gov (United States)

    Awajan, Ahmad Mohd; Ismail, Mohd Tahir

    2017-08-01

    Recently, time series forecasting has attracted considerable attention in the field of financial time series analysis, specifically for stock market indices. Stock market forecasting is a challenging area of financial time-series forecasting. In this study, a hybrid methodology combining empirical mode decomposition with the Holt-Winter method (EMD-HW) is used to improve forecasting performance for financial time series. The strength of EMD-HW lies in its ability to forecast non-stationary and non-linear time series without the need for any transformation method. Moreover, EMD-HW has relatively high accuracy and offers a new forecasting method for time series. Daily stock market time series data from 11 countries are used to demonstrate the forecasting performance of the proposed EMD-HW. Based on three forecast accuracy measures, the results indicate that the forecasting performance of EMD-HW is superior to that of the traditional Holt-Winter forecasting method.
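    A sketch of the EMD-HW idea: fit a Holt-Winters (exponential smoothing) model to each EMD component and sum the component forecasts. PyEMD and statsmodels are assumed; the additive-trend configuration, horizon, and synthetic index are illustrative, not the paper's settings.

```python
# Sketch: EMD-HW -- decompose, fit exponential smoothing per component,
# and aggregate the component forecasts.
import numpy as np
from PyEMD import EMD
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(10)
t = np.arange(400)
index = 1000 + 2 * t + 50 * np.sin(2 * np.pi * t / 60) + rng.normal(0, 10, t.size)

train, h = index[:380], 20
imfs = EMD().emd(train)          # IMFs plus residue
forecast = np.zeros(h)
for comp in imfs:
    fit = ExponentialSmoothing(comp, trend="add").fit()
    forecast += fit.forecast(h)  # sum of per-component forecasts

rmse = np.sqrt(np.mean((forecast - index[380:]) ** 2))
print(f"20-step RMSE: {rmse:.1f}")
```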

  4. An adaptive filtering method based on EMD for X-ray pulsar navigation with uncertain measurement noise

    Directory of Open Access Journals (Sweden)

    Li N.

    2017-01-01

    Affected by unstable pulse radiation and pulsar directional errors, the statistical characteristics of the pulsar measurement noise may vary slowly with time and cannot be accurately determined, which causes the filtering accuracy of the extended Kalman filter (EKF) in a pulsar navigation positioning system to decline sharply or even diverge. To solve this problem, an adaptive extended Kalman filtering algorithm based on empirical mode decomposition (EMD) is proposed. In this method, the high-frequency noise is separated from the pulsar measurement information by EMD, and the noise variance can be estimated to update the parameters of the EKF. The simulation results demonstrate that, compared with the conventional EKF, the proposed method can adaptively track changes in the measurement noise and still maintain high estimation accuracy under unknown measurement noise, while the positioning accuracy of the pulsar navigation is improved.
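    A toy sketch of the adaptation step only: estimate the measurement-noise variance R from the finest IMF of the measurement stream and feed it to a (here scalar, linear) Kalman filter. The paper's EKF and orbit-dynamics model are not reproduced; PyEMD and the synthetic data are assumptions.

```python
# Sketch: EMD-based estimate of measurement-noise variance R, used by a
# minimal scalar Kalman filter with identity dynamics.
import numpy as np
from PyEMD import EMD

rng = np.random.default_rng(11)
n = 500
truth = np.cumsum(0.01 * np.ones(n))        # slowly drifting state
meas = truth + 0.2 * rng.normal(size=n)     # noisy measurements

imfs = EMD().emd(meas)
R = np.var(imfs[0])        # finest IMF ~ high-frequency measurement noise

x, P, Q = 0.0, 1.0, 1e-5
est = []
for z in meas:
    P += Q                  # predict (identity dynamics)
    K = P / (P + R)         # gain with EMD-estimated R
    x += K * (z - x)        # update
    P *= 1 - K
    est.append(x)

print("rms error:", np.sqrt(np.mean((np.array(est) - truth) ** 2)))
```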

  5. Removal of muscle artifact from EEG data: comparison between stochastic (ICA and CCA) and deterministic (EMD and wavelet-based) approaches

    Science.gov (United States)

    Safieddine, Doha; Kachenoura, Amar; Albera, Laurent; Birot, Gwénaël; Karfoul, Ahmad; Pasnicu, Anca; Biraben, Arnaud; Wendling, Fabrice; Senhadji, Lotfi; Merlet, Isabelle

    2012-12-01

    Electroencephalographic (EEG) recordings are often contaminated with muscle artifacts. This disturbing myogenic activity not only strongly affects the visual analysis of EEG, but also most surely impairs the results of EEG signal processing tools such as source localization. This article focuses on the particular context of the contamination of epileptic signals (interictal spikes) by muscle artifact, as EEG is a key diagnostic tool for this pathology. In this context, our aim was to compare the ability of two stochastic approaches of blind source separation, namely independent component analysis (ICA) and canonical correlation analysis (CCA), and of two deterministic approaches, namely empirical mode decomposition (EMD) and the wavelet transform (WT), to remove muscle artifacts from EEG signals. To quantitatively compare the performance of these four algorithms, epileptic spike-like EEG signals were simulated from two different source configurations and artificially contaminated with different levels of real EEG-recorded myogenic activity. The efficiency of CCA, ICA, EMD, and WT in correcting the muscular artifact was evaluated both by calculating the normalized mean-squared error between denoised and original signals and by comparing the results of source localization obtained from artifact-free as well as noisy signals, before and after artifact correction. Tests on real data recorded in an epileptic patient are also presented. The results obtained on simulations and real data show that EMD outperformed the three other algorithms for the denoising of data highly contaminated by muscular activity. For less noisy data, and when spikes arose from a single cortical source, the myogenic artifact was best corrected with CCA and ICA. Otherwise, when spikes originated from two distinct sources, either EMD or ICA offered the most reliable denoising result for highly noisy data, while WT offered the better denoising result for less noisy data. These results suggest that

  6. The Fault Feature Extraction of Rolling Bearing Based on EMD and Difference Spectrum of Singular Value

    Directory of Open Access Journals (Sweden)

    Te Han

    2016-01-01

    Nowadays, the fault diagnosis of rolling bearings in aeroengines is based on the vibration signal measured on the casing, instead of on the bearing block. However, the vibration signal of the bearing is often covered by a series of complex components caused by other structures (rotor, gears). Therefore, when a bearing fails, it is still not certain that the fault feature can be extracted from the vibration signal on the casing. To solve this problem, a novel fault feature extraction method for rolling bearings based on empirical mode decomposition (EMD) and the difference spectrum of singular values is proposed in this paper. First, the vibration signal is decomposed by EMD. Next, the difference spectrum of singular values method is applied. The study finds that each peak in the difference spectrum corresponds to a component in the original signal. According to the peaks in the difference spectrum, the component signal of the bearing fault can be reconstructed. To validate the proposed method, bearing fault data collected on the casing are analyzed. The results indicate that the proposed rolling bearing diagnosis method can accurately extract the fault feature that is submerged in other component signals and noise.

  7. An improved EMD method for modal identification and a combined static-dynamic method for damage detection

    Science.gov (United States)

    Yang, Jinping; Li, Peizhen; Yang, Youfa; Xu, Dian

    2018-04-01

    Empirical mode decomposition (EMD) is a highly adaptable signal processing method. However, the EMD approach has certain drawbacks, including distortions from end effects and mode mixing. In the present study, these two problems are addressed using an end-extension method based on the support vector regression machine (SVRM) and a modal decomposition method based on the characteristics of the Hilbert transform. The algorithm includes two steps: using the SVRM, the time series data are extended at both endpoints to reduce end effects; then, a modified EMD method using the characteristics of the Hilbert transform is applied to the resulting signal to reduce mode mixing. A new combined static-dynamic method for identifying structural damage is also presented. This method combines the static and dynamic information in an equilibrium equation that can be solved using the Moore-Penrose generalized matrix inverse. The combined method uses the differences in displacements of the structure with and without damage and variations in the modal force vector. Tests on a four-story steel-frame structure were conducted to obtain the static and dynamic responses of the structure. The modal parameters are identified using data from the dynamic tests and the improved EMD method, which is shown to be more accurate and effective than the traditional EMD method. Tests with a shear-type frame demonstrate the higher performance of the proposed static-dynamic damage detection approach, which can detect both single and multiple damage locations as well as the degree of damage. For structures with multiple damage locations, the combined approach is more effective than either the static or the dynamic method alone. The proposed EMD method and the static-dynamic damage detection method offer improved modal identification and damage detection, respectively, in structures.

  8. Dopamine agonist activity of EMD 23,448

    International Nuclear Information System (INIS)

    Martin, G.E.; Pettibone, D.J.

    1985-01-01

    EMD 23,448 was examined in tests of dopaminergic function and was found to be an atypical dopamine (DA) agonist. EMD 23,448 was a weak or inactive DA agonist when examined in tests of normal postsynaptic DA receptor function: production of stereotypy in the rat (ED₅₀ > 5.0 mg/kg i.p.); production of emesis in beagles (minimum effective dose = 81 μg/kg i.v.); and enhanced locomotor activity of the mouse (no excitation at doses ≤ 50 mg/kg i.p.). Moreover, EMD 23,448 was relatively weak in competing for [³H]-apomorphine binding to rat striatal membranes (Ki, 205 nM). On the other hand, this indolyl-3-butylamine did activate supersensitive postsynaptic DA receptors. Specifically, it elicited contralateral turning in rats with a unilateral 6-hydroxydopamine lesion of the substantia nigra (ED₅₀ = 0.9 mg/kg) and did elicit stereotypy in rats given chronic daily haloperidol treatments. EMD 23,448 also exerted pharmacological effects in tests designed to measure activation of dopamine autoreceptors. It inhibited the γ-butyrolactone-induced increase in striatal dopa levels (ED₅₀ = 1 mg/kg i.p.) and produced a dose-related fall in the locomotor activity of the mouse. The results are discussed and contrasted with data derived for apomorphine and the putatively selective autoreceptor agonist (±)-3-PPP. (Author)

  9. Properties of a Bacteriocin Produced by Bacillus subtilis EMD4 Isolated from Ganjang (Soy Sauce).

    Science.gov (United States)

    Liu, Xiaoming; Lee, Jae Yong; Jeong, Seon-Ju; Cho, Kye Man; Kim, Gyoung Min; Shin, Jung-Hye; Kim, Jong-Sang; Kim, Jeong Hwan

    2015-09-01

    A Bacillus species, EMD4, with strong antibacterial activity was isolated from ganjang (soy sauce) and identified as B. subtilis. B. subtilis EMD4 strongly inhibited the growth of B. cereus ATCC14579 and B. thuringiensis ATCC33679. The antibacterial activity was stable at pH 3-9 but inactive at pH 10 and above. The activity was fully retained after 15 min at 80°C but reduced by 50% after 15 min at 90°C. The activity was completely destroyed by proteinase K and protease treatment, indicating its proteinaceous nature. The bacteriocin (BacEMD4) was partially purified from the culture supernatant by ammonium sulfate precipitation followed by Q-Sepharose and Sephadex G-50 column chromatography. The specific activity increased from 769.2 AU/mg protein to 8,347.8 AU/mg protein, with a final yield of 12.6%. The size of BacEMD4 was determined to be 3.5 kDa by Tricine SDS-PAGE. The N-terminal amino acid sequence was similar to that of Subtilosin A, and nucleotide sequencing of the cloned gene confirmed that BacEMD4 is Subtilosin A. BacEMD4 showed bactericidal activity against B. cereus ATCC14579.

  10. Improving prediction accuracy of cooling load using EMD, PSR and RBFNN

    Science.gov (United States)

    Shen, Limin; Wen, Yuanmei; Li, Xiaohong

    2017-08-01

    To increase the accuracy of cooling load demand prediction, this work presents an EMD (empirical mode decomposition) and PSR (phase space reconstruction) based RBFNN (radial basis function neural network) method. First, the chaotic nature of real cooling load demand is analyzed, and the non-stationary historical cooling load data are transformed into several stationary intrinsic mode functions (IMFs) using EMD. Second, the RBFNN prediction accuracies of the individual IMFs are compared, and an IMF combining scheme is proposed: the lower-frequency components are combined (IMF4-IMF6), while the higher-frequency components (IMF1, IMF2, IMF3) and the residual are kept unchanged. Third, phase space is reconstructed for each combined component separately; the highest-frequency component (IMF1) is processed by a differencing method, and prediction is carried out with RBFNN in the reconstructed phase spaces. Real cooling load data from a centralized ice-storage cooling system in Guangzhou are used for simulation. The results show that the proposed hybrid method outperforms the traditional methods.
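    A sketch of the PSR-plus-RBF step: delay-embed a (combined) component and predict one step ahead in the reconstructed phase space. Kernel ridge regression with an RBF kernel stands in for the RBF neural network; the embedding dimension, delay, and synthetic load series are assumptions.

```python
# Sketch: phase space reconstruction (delay embedding) + RBF-kernel
# regression as a stand-in for an RBF neural network.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

def embed(x, dim=3, tau=2):
    """Rows are [x(t), x(t-tau), ..., x(t-(dim-1)tau)]; target is x(t+1)."""
    n = x.size - (dim - 1) * tau
    X = np.column_stack([x[(dim - 1 - i) * tau : (dim - 1 - i) * tau + n]
                         for i in range(dim)])
    return X[:-1], x[(dim - 1) * tau + 1 :]

rng = np.random.default_rng(14)
t = np.arange(1200)
load = 500 + 80 * np.sin(2 * np.pi * t / 24) + 10 * rng.normal(size=t.size)

X, y = embed(load)
split = 1000
model = KernelRidge(kernel="rbf", alpha=1.0, gamma=1e-4).fit(X[:split], y[:split])
rmse = np.sqrt(np.mean((model.predict(X[split:]) - y[split:]) ** 2))
print(f"one-step RMSE: {rmse:.1f}")
```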

  11. Patient-Specific Seizure Detection in Long-Term EEG Using Signal-Derived Empirical Mode Decomposition (EMD)-based Dictionary Approach.

    Science.gov (United States)

    Kaleem, Muhammad; Gurve, Dharmendra; Guergachi, Aziz; Krishnan, Sridhar

    2018-06-25

    The objective of the work described in this paper is the development of a computationally efficient methodology for patient-specific automatic seizure detection in long-term multi-channel EEG recordings. Approach: A novel patient-specific seizure detection approach based on a signal-derived empirical mode decomposition (EMD) based dictionary is proposed. For this purpose, we use an empirical framework for EMD-based dictionary creation and learning, inspired by traditional dictionary learning methods, in which the EMD-based dictionary is learned from the multi-channel EEG data being analyzed for automatic seizure detection. We present the algorithm for dictionary creation and learning, whose purpose is to learn dictionaries with a small number of atoms. Using training signals belonging to seizure and non-seizure classes, an initial dictionary, termed the raw dictionary, is formed. The atoms of the raw dictionary are the intrinsic mode functions obtained by decomposing the training signals with the EMD algorithm. The raw dictionary is then trained using a learning algorithm, resulting in a substantial decrease in the number of atoms in the trained dictionary. The trained dictionary is then used for automatic seizure detection: the coefficients of orthogonal projections of test signals onto the trained dictionary form the features used to classify test signals into seizure and non-seizure classes. Thus, no hand-engineered features have to be extracted from the data, as in traditional seizure detection approaches. Main results: The performance of the proposed approach is validated using the CHB-MIT benchmark database; averaged accuracy, sensitivity and specificity values of 92.9%, 94.3% and 91.5%, respectively, are obtained using a support vector machine classifier and five-fold cross-validation. These results are compared with other approaches using the same database, and the suitability

  12. An Improved EMD-Based Dissimilarity Metric for Unsupervised Linear Subspace Learning

    Directory of Open Access Journals (Sweden)

    Xiangchun Yu

    2018-01-01

    We investigate a novel way of robust face image feature extraction by adopting methods based on Unsupervised Linear Subspace Learning to extract a small number of good features. First, the face image is divided into blocks of a specified size, and we propose and extract a pooled Histogram of Oriented Gradients (pHOG) over each block. Second, an improved Earth Mover's Distance (EMD) metric is adopted to measure the dissimilarity between blocks of one face image and the corresponding blocks of the other face images. Third, considering the limitations of the original Locality Preserving Projections (LPP), we propose Block Structure LPP (BSLPP), which effectively preserves the structural information of face images. Finally, an adjacency graph is constructed and a small number of good features of a face image are obtained by methods based on Unsupervised Linear Subspace Learning. A series of experiments has been conducted on several well-known face databases to evaluate the effectiveness of the proposed algorithm. In addition, we construct versions of the AR and Extended Yale B face databases corrupted by noise, geometric distortion, slight translation, and slight rotation, and we verify the robustness of the proposed algorithm when faced with a certain degree of these disturbances.
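    A sketch of the block-dissimilarity step: describe each image block by a gradient-orientation histogram (a crude stand-in for the pHOG descriptor) and compare blocks with SciPy's 1-D Wasserstein distance, standing in for the paper's improved EMD metric. Block size and bin count are assumptions.

```python
# Sketch: gradient-orientation histograms per block, compared with the
# Earth Mover's (Wasserstein) distance.
import numpy as np
from scipy.stats import wasserstein_distance

def orientation_hist(block, bins=9):
    gy, gx = np.gradient(block.astype(float))
    ang = np.arctan2(gy, gx) % np.pi             # unsigned orientation
    hist, edges = np.histogram(ang, bins=bins, range=(0, np.pi),
                               weights=np.hypot(gx, gy))
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, hist / max(hist.sum(), 1e-12)

rng = np.random.default_rng(12)
block_a = rng.random((16, 16))
block_b = block_a.T.copy()                       # gradient directions swapped

ca, ha = orientation_hist(block_a)
cb, hb = orientation_hist(block_b)
print("EMD between blocks:", wasserstein_distance(ca, cb, ha, hb))
```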

  13. Assessment of autonomic nervous system by using empirical mode decomposition-based reflection wave analysis during non-stationary conditions

    International Nuclear Information System (INIS)

    Chang, C C; Hsiao, T C; Kao, S C; Hsu, H Y

    2014-01-01

    Arterial blood pressure (ABP) is an important indicator of cardiovascular circulation and exhibits various intrinsic regulations. It has been found that the intrinsic characteristics of blood vessels can be assessed quantitatively by ABP analysis (called reflection wave analysis (RWA)), but conventional RWA is insufficient for assessment under non-stationary conditions, such as the Valsalva maneuver. Recently, a novel adaptive method called empirical mode decomposition (EMD) was proposed for non-stationary data analysis. This study proposes an RWA algorithm based on EMD (EMD-RWA). A total of 51 subjects participated in this study, including 39 healthy subjects and 12 patients with autonomic nervous system (ANS) dysfunction. The results showed that EMD-RWA provided a reliable estimation of reflection time at baseline and during head-up tilt (HUT). Moreover, the estimated reflection time is able to assess ANS function non-invasively, both in normal, healthy subjects and in patients with ANS dysfunction. EMD-RWA provides a new approach to reflection time estimation under non-stationary conditions, and also helps with non-invasive ANS assessment. (paper)

  14. Partial differential equation-based approach for empirical mode decomposition: application on image analysis.

    Science.gov (United States)

    Niang, Oumar; Thioune, Abdoulaye; El Gueirea, Mouhamed Cheikh; Deléchelle, Eric; Lemoine, Jacques

    2012-09-01

    The major problem with the empirical mode decomposition (EMD) algorithm is its lack of a theoretical framework, which makes the approach difficult to characterize and evaluate. In this paper, we propose, in the 2-D case, the use of an alternative implementation of the algorithmic definition of the so-called "sifting process" used in the original Huang EMD method. This approach, based on partial differential equations (PDEs), was presented by Niang in previous works in 2005 and 2007, and relies on a nonlinear diffusion-based filtering process to solve the mean-envelope estimation problem. In the 1-D case, the efficiency of the PDE-based method compared to the original EMD algorithmic version was illustrated in a recent paper. Several 2-D extensions of the EMD method have recently been proposed; despite some effort, these 2-D versions perform poorly and are very time consuming. In this paper, an extension of the PDE-based approach to 2-D space is therefore extensively described. The approach has been applied to both signal and image decomposition, and the obtained results confirm the usefulness of the new PDE-based sifting process for the decomposition of various kinds of data, including images. The effectiveness of the approach encourages its use in a number of signal and image applications such as denoising, detrending, and texture analysis.

  15. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes documents in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs.

  16. SigEMD: A powerful method for differential gene expression analysis in single-cell RNA sequencing data.

    Science.gov (United States)

    Wang, Tianyu; Nabavi, Sheida

    2018-04-24

    Differential gene expression analysis is one of the most significant efforts in single-cell RNA sequencing (scRNAseq) analysis, aiming to discover the specific changes in expression levels of individual cell types. Since scRNAseq data exhibit multimodality, large amounts of zero counts, and sparsity, they differ from traditional bulk RNA sequencing (RNAseq) data. The new challenges of scRNAseq data motivate the development of new methods for identifying differentially expressed (DE) genes. In this study, we propose a new method, SigEMD, that combines a data imputation approach, a logistic regression model, and a nonparametric method based on the Earth Mover's Distance, to precisely and efficiently identify DE genes in scRNAseq data. The regression model and data imputation are used to reduce the impact of large amounts of zero counts, and the nonparametric method is used to improve the sensitivity of detecting DE genes in multimodal scRNAseq data. By additionally employing gene interaction network information to adjust the final states of DE genes, we further reduce the false positives of calling DE genes. We used simulated and real datasets to evaluate the detection accuracy of the proposed method and to compare its performance with that of other differential expression analysis methods. The results indicate that the proposed method performs well overall in terms of detection precision, sensitivity, and specificity.

  17. The analysis of the initiating events in thorium-based molten salt reactor

    International Nuclear Information System (INIS)

    Zuo Jiaxu; Song Wei; Jing Jianping; Zhang Chunming

    2014-01-01

    Initiating event analysis and evaluation is the starting point of nuclear safety analysis and probabilistic safety analysis, and it is a key element of nuclear safety analysis. Currently, initiating event analysis methods and experience are focused on water reactors; there are no methods and theories for the thorium-based molten salt reactor (TMSR). With TMSR research and development underway in China, initiating event analysis and evaluation is increasingly important. The research can start from PWR analysis theories and methods: based on the TMSR design, the theories and methods of its initiating event analysis can be researched and developed. The initiating event lists and analysis methods of generation II and III PWRs, the high-temperature gas-cooled reactor, and the sodium-cooled fast reactor are summarized. Based on the TMSR design, its initiating events are discussed and developed by logical analysis, and the analysis of TMSR initiating events is preliminarily studied and described. This research is important for clarifying the event analysis rules, and useful for TMSR design and nuclear safety analysis. (authors)

  18. Volatility behavior of visibility graph EMD financial time series from Ising interacting system

    Science.gov (United States)

    Zhang, Bo; Wang, Jun; Fang, Wen

    2015-08-01

    A financial market dynamics model is developed and investigated using a stochastic Ising system, the Ising model being the most popular ferromagnetic model in statistical physics. Applying two graph-based analyses and the multiscale entropy method, we investigate and compare the statistical volatility behavior of return time series and of the corresponding IMF series derived by the empirical mode decomposition (EMD) method, and real stock market indices are studied comparatively alongside the simulation data of the proposed model. Further, we find that the degree distribution of the visibility graph for the simulated series has power-law tails, and that the assortative network exhibits the mixing-pattern property. All these features are in agreement with the real market data; the research confirms that the financial model established on the Ising system is reasonable.

  19. Drug and alcohol-impaired driving among electronic music dance event attendees.

    Science.gov (United States)

    Furr-Holden, Debra; Voas, Robert B; Kelley-Baker, Tara; Miller, Brenda

    2006-10-15

    Drug-impaired driving has received increased attention following the development of rapid drug-screening procedures used by police and of state laws establishing per se limits for drug levels in drivers. Venues that host electronic music dance events (EMDEs) provide a unique opportunity to assess drug-impaired driving among a population with a high proportion of young adult drug users. EMDEs are late-night dance parties marked by a substantial number of young adult attendees and elevated drug involvement. No studies to date have examined drug-impaired driving in a natural environment with active drug and alcohol users. Six EMDEs were sampled in San Diego, California, and Baltimore, Maryland. A random sample of approximately 40 attendees per event was administered surveys about alcohol and other drug (AOD) use and driving status, given breath tests for alcohol, and asked to provide oral fluid samples to test for illicit drug use upon entering and exiting the events. Driving status reduced the level of alcohol use (including abstaining), but the impact on drug-taking was not significant. However, 62% of individuals who reported their intention to drive away from the events tested positive for drugs or alcohol upon leaving. This suggests that these events and settings are appropriate for developing interventions to reduce risks for young adults.

  20. EMD-RBFNN Coupling Prediction Model of Complex Regional Groundwater Depth Series: A Case Study of the Jiansanjiang Administration of Heilongjiang Land Reclamation in China

    Directory of Open Access Journals (Sweden)

    Qiang Fu

    2016-08-01

    The accurate and reliable prediction of groundwater depth is the basis of the sustainable utilization of regional groundwater resources. However, the complexity of the prediction has been ignored in previous studies of regional groundwater depth system analysis and prediction, making it difficult to realize scientific management of groundwater resources. To address this defect, taking complexity diagnosis as the research foundation, this paper proposes a new coupling forecast strategy for groundwater depth based on empirical mode decomposition (EMD) and a radial basis function neural network (RBFNN). The data used for complexity analysis and modelling are monthly groundwater depth series from 15 long-term monitoring wells from 1997 to 2007, collected from the Jiansanjiang Administration of Heilongjiang Agricultural Reclamation in China. A comprehensive complexity index for each groundwater depth series is calculated based on wavelet theory, fractal theory, and the approximate entropy method. The monthly groundwater depth sequence of District 8 of Farm Nongjiang, which has the highest complexity among the five farms in the Jiansanjiang Administration midland, was chosen as the modelling sample series. This series was separated into five intrinsic mode function (IMF) sequences and a remainder sequence by the EMD method, which revealed that the local groundwater depth has a significant one-year periodic character and an increasing trend. The RBFNN was then used to forecast each EMD component, and the component forecasts were stacked. The results suggest that the future groundwater depth will remain at approximately 10 m if the past pattern of water use continues, exceeding the ideal groundwater depth. Local departments should therefore take appropriate countermeasures to conserve groundwater resources effectively.
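
    As a rough illustration of the coupling strategy, the sketch below decomposes a synthetic depth series with EMD and forecasts each component with an RBF-kernel ridge regressor, a stand-in for the paper's RBFNN (scikit-learn has no classic RBFNN); the lag order and all data are assumptions:

```python
import numpy as np
from PyEMD import EMD                      # pip install EMD-signal
from sklearn.kernel_ridge import KernelRidge

def lagged(x, p):
    """Design matrix of p lagged values and the aligned targets."""
    X = np.column_stack([x[i:len(x) - p + i] for i in range(p)])
    return X, x[p:]

t = np.arange(132)                         # 11 years of monthly data (synthetic)
depth = 8 + 0.01 * t + 0.5 * np.sin(2 * np.pi * t / 12) \
        + 0.1 * np.random.default_rng(1).standard_normal(t.size)

p, forecast = 12, 0.0
for comp in EMD().emd(depth):              # IMFs, with the trend residue last
    X, y = lagged(comp, p)
    model = KernelRidge(kernel="rbf", gamma=0.1).fit(X, y)
    forecast += model.predict(comp[-p:].reshape(1, -1))[0]   # one step ahead
print(f"stacked one-step-ahead depth forecast: {forecast:.2f} m")
```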

  1. Poisson-event-based analysis of cell proliferation.

    Science.gov (United States)

    Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul

    2015-05-01

    A protocol for the assessment of cell proliferation dynamics is presented. It is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single-cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division, rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed, which maintains single-cell resolution and reports the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines over a period of up to 48 h. Automated processing of the bright field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified temporal and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean intermitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population, and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
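
    The nonhomogeneous Poisson process with exponentially increasing rate reported here can be simulated by Lewis-Shedler thinning; the sketch below uses illustrative parameters, not the paper's fitted values:

```python
import numpy as np

rng = np.random.default_rng(2)

def nhpp_exponential_rate(lam0, tau, T):
    """Event times on [0, T] for rate lam(t) = lam0 * exp(t / tau), by thinning."""
    lam_max = lam0 * np.exp(T / tau)            # dominating homogeneous rate
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / lam_max)     # candidate event time
        if t > T:
            return np.array(events)
        if rng.uniform() < lam0 * np.exp(t / tau) / lam_max:
            events.append(t)                    # accept with prob lam(t)/lam_max

times = nhpp_exponential_rate(lam0=0.5, tau=20.0, T=48.0)   # events/h, 48 h window
print(f"{times.size} events, mean inter-event time {np.diff(times).mean():.2f} h")
```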

  2. 75 FR 17939 - EMD Chemicals, Inc.; Withdrawal of Color Additive Petition

    Science.gov (United States)

    2010-04-08

    [Docket No. ...] (formerly Docket No. 1998C-0790) EMD Chemicals, Inc.; Withdrawal of Color Additive Petition. AGENCY: Food and ... petition (CAP 8C0262) proposing an amendment of the color additive regulations to provide for the safe use ... color food and to provide for the safe use of titanium dioxide to color food at levels higher than the ...

  3. Noise Reduction of Steel Cord Conveyor Belt Defect Electromagnetic Signal by Combined Use of Improved Wavelet and EMD

    Directory of Open Access Journals (Sweden)

    Hong-Wei Ma

    2016-09-01

    In order to reduce the noise of the defect electromagnetic signal of steel cord conveyor belts used in coal mines, a new noise reduction method combining an improved threshold wavelet and Empirical Mode Decomposition (EMD) is proposed. First, improved-threshold wavelet denoising is applied to the defect electromagnetic signal obtained by an electromagnetic testing system. Then, EMD is used to decompose the denoised signal, and the effective Intrinsic Mode Functions (IMFs) are extracted by a dominant eigenvalue strategy. Finally, the signal is reconstructed from the selected IMFs. To verify the proposed noise reduction method, experiments are carried out for two defect cases: a defective joint and a steel wire rope break. The experimental results show that the proposed method achieves a higher Signal to Noise Ratio (SNR) for defect electromagnetic signal noise reduction in steel cord conveyor belts.
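
    A minimal sketch of the two-stage scheme follows, with a universal soft threshold for the wavelet stage and a simple energy rule standing in for the paper's dominant eigenvalue strategy; the signal and all thresholds are assumptions:

```python
import numpy as np
import pywt                                  # PyWavelets
from PyEMD import EMD

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 2048)
signal = np.sin(2 * np.pi * 12 * t) + 0.6 * rng.standard_normal(t.size)

# Stage 1: wavelet denoising with a universal soft threshold
coeffs = pywt.wavedec(signal, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise level, finest scale
thr = sigma * np.sqrt(2 * np.log(signal.size))
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")[: signal.size]

# Stage 2: EMD, keep the energetic IMFs, reconstruct
imfs = EMD().emd(denoised)
energy = (imfs ** 2).sum(axis=1)
keep = energy > 0.05 * energy.sum()                     # crude selection rule
reconstructed = imfs[keep].sum(axis=0)
print("IMFs kept for reconstruction:", np.flatnonzero(keep))
```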

  4. Epileptic Seizure Detection based on Wavelet Transform Statistics Map and EMD Method for Hilbert-Huang Spectral Analyzing in Gamma Frequency Band of EEG Signals

    Directory of Open Access Journals (Sweden)

    Morteza Behnam

    2015-08-01

    Seizure detection by analysis of the brain signal (EEG) is an important clinical method for drug therapy and for decisions before brain surgery. In this paper, after signal conditioning with suitable filtering, the gamma frequency band is extracted, cancelling the other brain rhythms, ambient noise, and other bio-signals. Then, the wavelet transform of the brain signal and its multi-level map are computed. By dividing the color map into epochs, the histogram of each sub-image is obtained and its statistics, based on statistical moments and negentropy values, are calculated. The statistical feature vector is reduced to one dimension using Principal Component Analysis (PCA). By applying the EMD algorithm and the sifting procedure to analyze the data with Intrinsic Mode Functions (IMFs), computing the residues of the brain signal via the spectrum of the Hilbert transform, and forming the Hilbert-Huang spectrum, one spatial feature based on Euclidean distance is obtained for signal classification. With a K-Nearest Neighbor (KNN) classifier and the optimal neighbor parameter, EEG signals are classified into two classes, seizure and non-seizure, with an accuracy of 76.54% and an error variance of 0.3685 across the different tests.
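
    The front end of such a pipeline, gamma-band extraction followed by Hilbert-transform features and a KNN classifier, can be sketched as below; the sampling rate, band edges, features and labels are illustrative assumptions, not the authors' exact design:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from sklearn.neighbors import KNeighborsClassifier

fs = 256.0                                   # assumed sampling rate, Hz
b, a = butter(4, [30 / (fs / 2), 45 / (fs / 2)], btype="bandpass")

def gamma_features(epoch):
    gamma = filtfilt(b, a, epoch)            # isolate the gamma rhythm
    analytic = hilbert(gamma)                # analytic signal via Hilbert transform
    env = np.abs(analytic)                   # instantaneous amplitude
    phase = np.unwrap(np.angle(analytic))
    inst_freq = np.diff(phase) * fs / (2 * np.pi)
    return [env.mean(), env.std(), inst_freq.mean(), inst_freq.std()]

rng = np.random.default_rng(4)
X = np.array([gamma_features(rng.standard_normal(int(4 * fs))) for _ in range(40)])
y = rng.integers(0, 2, size=40)              # dummy seizure / non-seizure labels
clf = KNeighborsClassifier(n_neighbors=5).fit(X, y)
print("training accuracy on dummy data:", clf.score(X, y))
```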

  5. Application and Use of PSA-based Event Analysis in Belgium

    International Nuclear Information System (INIS)

    Hulsmans, M.; De Gelder, P.

    2003-01-01

    The paper describes the experiences of the Belgian nuclear regulatory body AVN with the application and use of the PSAEA (PSA-based Event Analysis) guidelines. In 2000, risk-based precursor analysis increasingly became part of the AVN process of operating experience feedback, and constitutes in fact the first PSA application for the Belgian plants. The PSAEA guidelines were established by a consultant in the framework of an international project. In a first stage, AVN applied the PSAEA guidelines to two test cases in order to explore the feasibility and the interest of this type of probabilistic precursor analysis. These pilot studies demonstrated the applicability of the PSAEA method in general, and its applicability to the computer models of the Belgian state-of-the-art PSAs in particular. They revealed insights regarding the event analysis methodology, the resulting event severity and the PSA model itself. The consideration of relevant what-if questions allowed AVN to identify, and in some cases also to quantify, several potential safety issues for improvement. The internal evaluation of PSAEA was positive and AVN decided to routinely perform several PSAEA studies per year. The objectives of the AVN precursor program have been clearly stated. A first pragmatic set of screening rules for operational events has been drawn up and applied. Six more operational events have been analysed in detail (initiating events as well as condition events), resulting in a wide spectrum of event severities. In addition to the particular conclusions for each event, relevant insights have been gained regarding, for instance, event modelling and the interpretation of results. Particular attention has been devoted to the form of the analysis report. After an initial presentation of some key concepts, the particular context of this program and of AVN's objectives, the

  6. EMD-regression for modelling multi-scale relationships, and application to weather-related cardiovascular mortality

    Science.gov (United States)

    Masselot, Pierre; Chebana, Fateh; Bélanger, Diane; St-Hilaire, André; Abdous, Belkacem; Gosselin, Pierre; Ouarda, Taha B. M. J.

    2018-01-01

    In a number of environmental studies, relationships between natural processes are often assessed through regression analyses using time series data. Such data are often multi-scale and non-stationary, leading to poor accuracy of the resulting regression models and therefore to results of moderate reliability. To deal with this issue, the present paper introduces the EMD-regression methodology, which consists in applying the empirical mode decomposition (EMD) algorithm to the data series and then using the resulting components in regression models. The proposed methodology presents a number of advantages. First, it accounts for the non-stationarity of the data series. Second, the approach acts as a scan of the relationship between a response variable and the predictors at different time scales, providing new insights about this relationship. To illustrate the proposed methodology, it is applied to study the relationship between weather and cardiovascular mortality in Montreal, Canada. The results shed new light on the studied relationship. For instance, they show that humidity can cause excess mortality at the monthly time scale, a scale not visible in classical models. A comparison is also conducted with state-of-the-art methods, namely generalized additive models and distributed lag models, both widely used in weather-related health studies. The comparison shows that EMD-regression achieves better prediction performance and provides more detail than classical models concerning the relationship.
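
    The core of EMD-regression is to regress the response on the IMFs of the predictor, so each coefficient captures one time scale. A minimal sketch with synthetic stand-ins for the weather and mortality series:

```python
import numpy as np
from PyEMD import EMD
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
days = np.arange(730)
temperature = 10 * np.sin(2 * np.pi * days / 365) + rng.standard_normal(days.size)
mortality = 50 - 0.8 * temperature + rng.standard_normal(days.size)

imfs = EMD().emd(temperature)        # one component per time scale, trend last
model = LinearRegression().fit(imfs.T, mortality)
for i, coef in enumerate(model.coef_):
    print(f"component {i}: coefficient {coef:+.3f}")   # scale-by-scale effect
```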

  7. System risk evolution analysis and risk critical event identification based on event sequence diagram

    International Nuclear Information System (INIS)

    Luo, Pengcheng; Hu, Yang

    2013-01-01

    During system operation, the environmental, operational and usage conditions are time-varying, which causes fluctuations of the system state variables (SSVs). These fluctuations change the accidents’ probabilities and thus drive the system risk evolution (SRE). This inherent relation makes it feasible to realize risk control by monitoring the SSVs in real time; hence, quantitative analysis of the SRE is essential. Besides, some events in the process of SRE are critical to system risk, because they act like “demarcative points” between safety and accident, and this characteristic makes each of them a key point of risk control. Therefore, analysis of the SRE and identification of risk critical events (RCEs) are of great value in ensuring that the system operates safely. In this context, an event sequence diagram (ESD) based method of SRE analysis and the related Monte Carlo solution are presented; the RCE and the risk sensitive variable (RSV) are defined, and the corresponding identification methods are also proposed. Finally, the proposed approaches are exemplified with an accident scenario of an aircraft entering the icing region

  8. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches.

  9. Non-destructive testing of full-length bonded rock bolts based on HHT signal analysis

    Science.gov (United States)

    Shi, Z. M.; Liu, L.; Peng, M.; Liu, C. C.; Tao, F. J.; Liu, C. S.

    2018-04-01

    Full-length bonded rock bolts are commonly used in mining, tunneling and slope engineering because of their simple design and resistance to corrosion. However, the length of a rock bolt and the grouting quality often do not meet the required design standards in practice, because of the concealment and complexity of bolt construction. Non-destructive testing is preferred for assessing a rock bolt's quality because of its convenience, low cost and wide detection range. In this paper, a signal analysis method for the non-destructive sound wave testing of full-length bonded rock bolts is presented, based on the Hilbert-Huang transform (HHT). First, we introduce the HHT analysis method to calculate the bolt length and identify defect locations from sound wave reflection test signals; this includes decomposing the test signal via empirical mode decomposition (EMD), selecting the intrinsic mode functions (IMFs) using the Pearson Correlation Index (PCI), and calculating the instantaneous phase and frequency via the Hilbert transform (HT). Second, six model tests are conducted with different grouting defects and bolt protruding lengths to verify the effectiveness of the HHT analysis method. Lastly, the influence of the bolt protruding length on the test signal, the identification of multiple reflections from defects, the bolt end and the protruding end, and mode mixing in the EMD are discussed. The HHT analysis method can identify the bolt length and grouting defect locations from signals that contain noise at multiple reflected interfaces. The reflection from a long protruding end creates an irregular test signal with many frequency peaks in the spectrum. The reflections from defects barely change the original signal because they are of low energy, and cannot be adequately resolved using existing methods. The HHT analysis method can identify reflections from the long protruding end of the bolt and multiple reflections from grouting defects based on mutations in the instantaneous
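
    The three HHT steps named in the abstract (EMD, PCI-based IMF selection, Hilbert instantaneous phase and frequency) can be sketched as follows on a synthetic reflected-wave trace; the selection rule shown keeps the single best-correlated IMF and is an assumption, not the authors' exact criterion:

```python
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD

fs = 50_000.0                                    # assumed sampling rate, Hz
t = np.arange(0, 0.02, 1 / fs)
trace = np.exp(-200 * t) * np.sin(2 * np.pi * 2000 * t) \
        + 0.2 * np.random.default_rng(6).standard_normal(t.size)

imfs = EMD().emd(trace)
# PCI-style selection: keep the IMF best correlated with the raw trace
pci = np.array([np.corrcoef(imf, trace)[0, 1] for imf in imfs])
selected = imfs[np.argmax(np.abs(pci))]

analytic = hilbert(selected)
inst_phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(inst_phase) * fs / (2 * np.pi)   # phase jumps mark reflections
print(f"dominant instantaneous frequency ~ {np.median(inst_freq):.0f} Hz")
```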

  10. Ecosystem Management Decision Support (EMDS) Applied to Watershed Assessment on California's North Coast

    Science.gov (United States)

    Rich Walker; Chris Keithley; Russ Henly; Scott Downie; Steve Cannata

    2007-01-01

    In 2001, the state of California initiated the North Coast Watershed Assessment Program (2003a) to assemble information on the status of coastal watersheds that have historically supported anadromous fish. The five-agency consortium explored the use of Ecosystem Management Decision Support (EMDS) (Reynolds and others 1996) as a means to help assess overall watershed...

  11. Software failure events derivation and analysis by frame-based technique

    International Nuclear Information System (INIS)

    Huang, H.-W.; Shih, C.; Yih, Swu; Chen, M.-H.

    2007-01-01

    A frame-based technique, comprising a physical frame, a logical frame, and a cognitive frame, was adopted to derive and analyze digital I&C failure events for the generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, extended and enhanced on the feedwater system, recirculation system, and steam line system. The logical frame was structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, software failures of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow; hence, in addition to the transient plots, the analysis results can be demonstrated on the power-core flow map. A number of postulated I&C system software failure events were derived for the dynamic analyses. The basis for event derivation includes the published classification of software anomalies, the digital I&C design data for the ABWR, the chapter 15 accident analysis of the generic SAR, and reported NPP I&C software failure events. The case study of this research includes: (1) software CMF analysis for the major digital control systems; and (2) postulated ABWR digital I&C software failure events derived from actual non-ABWR digital I&C software failure events, which were reported to the LER of the USNRC or the IRS of the IAEA. These events were analyzed with PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status are successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally. However, a well

  12. Automatic screening of obstructive sleep apnea from the ECG based on empirical mode decomposition and wavelet analysis

    International Nuclear Information System (INIS)

    Mendez, M O; Cerutti, S; Bianchi, A M; Corthout, J; Van Huffel, S; Matteucci, M; Penzel, T

    2010-01-01

    This study analyses two different methods to detect obstructive sleep apnea (OSA) during sleep, based only on the ECG signal. OSA is a common sleep disorder caused by repetitive occlusions of the upper airways, which produces a characteristic pattern on the ECG. ECG features, such as heart rate variability (HRV) and the QRS peak area, contain information suitable for a fast, non-invasive and simple screening of sleep apnea. Fifty recordings freely available on Physionet were included in this analysis, subdivided into a training and a testing set. We investigated the possibility of using the recently proposed method of empirical mode decomposition (EMD) for this application, comparing the results with those obtained through the well-established wavelet analysis (WA). With these decomposition techniques, several features were extracted from the ECG signal and complemented with a series of standard HRV time-domain measures. The best-performing feature subset, selected through a sequential feature selection (SFS) method, was used as the input of linear and quadratic discriminant classifiers. In this way we were able to classify the signals on a minute-by-minute basis as apneic or non-apneic with different best-subset sizes, obtaining an accuracy of up to 89% with WA and 85% with EMD. Furthermore, 100% correct discrimination of apneic patients from normal subjects was achieved independently of the feature extractor. Finally, the same procedure was repeated by pooling features from the standard HRV time domain, EMD and WA together, in order to investigate whether the two decomposition techniques could provide complementary features. The obtained accuracy was 89%, similar to that achieved using only wavelet analysis as the feature extractor; however, some complementary features from EMD and WA are evident
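
    The selection-plus-classification back end can be sketched with scikit-learn's sequential feature selector and a linear discriminant classifier; the feature matrix below is a random stand-in for the EMD/WA/HRV features, so the numbers mean nothing clinically:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
X = rng.standard_normal((300, 12))        # 12 per-minute features (stand-ins)
y = (X[:, 0] + 0.5 * X[:, 3] + 0.3 * rng.standard_normal(300)) > 0

lda = LinearDiscriminantAnalysis()
sfs = SequentialFeatureSelector(lda, n_features_to_select=4, direction="forward")
sfs.fit(X, y)
best = sfs.get_support(indices=True)      # indices of the best-performing subset
acc = cross_val_score(lda, X[:, best], y, cv=5).mean()
print("selected features:", best, "cross-validated accuracy:", round(acc, 3))
```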

  13. Multifractal features of EUA and CER futures markets by using multifractal detrended fluctuation analysis based on empirical model decomposition

    International Nuclear Information System (INIS)

    Cao, Guangxi; Xu, Wei

    2016-01-01

    Based on daily price data of carbon emission rights in the futures markets of Certified Emission Reduction (CER) and European Union Allowances (EUA), we analyze the multiscale characteristics of the markets using empirical mode decomposition (EMD) and EMD-based multifractal detrended fluctuation analysis (MFDFA). The complexity of the daily returns of the CER and EUA futures markets changes with multiple time scales and shows multilayered features. The two markets also exhibit clear multifractal characteristics and long-range correlation. We employ shuffle and surrogate approaches to analyze the origins of the multifractality. Long-range correlations and fat-tail distributions contribute significantly to the multifractality. Furthermore, we analyze the influence of high returns on the multifractality by a threshold method. The multifractality of the two futures markets is related to the presence of high return values in the price series.
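
    For reference, plain MF-DFA (without the EMD-based detrending refinement used in the paper) fits in a few lines of numpy; the series and q-range are illustrative:

```python
import numpy as np

def mfdfa(x, scales, q_list, order=1):
    """Generalized Hurst exponents h(q) via multifractal DFA."""
    profile = np.cumsum(x - x.mean())
    h = []
    for q in q_list:
        logF = []
        for s in scales:
            segs = len(profile) // s
            F2 = np.empty(segs)
            t = np.arange(s)
            for v in range(segs):                       # detrend each segment
                seg = profile[v * s:(v + 1) * s]
                F2[v] = np.mean((seg - np.polyval(np.polyfit(t, seg, order), t)) ** 2)
            if q == 0:
                logF.append(0.5 * np.mean(np.log(F2)))  # limit q -> 0
            else:
                logF.append(np.log(np.mean(F2 ** (q / 2))) / q)
        h.append(np.polyfit(np.log(scales), logF, 1)[0])
    return np.array(h)

rng = np.random.default_rng(8)
returns = rng.standard_normal(4096)                     # stand-in for CER/EUA returns
scales = np.unique(np.logspace(4, 10, 12, base=2).astype(int))
q_list = [-4, -2, 0, 2, 4]
print(dict(zip(q_list, mfdfa(returns, scales, q_list).round(3))))
# a q-dependent h(q) signals multifractality; shuffled and surrogate series
# separate the long-range-correlation and fat-tail contributions
```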

  14. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    This paper describes SIMMEK, a computer-based tool for the analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job shop type manufacturing, but certain facilities make it suitable for FMS as well as production line manufacturing. This type of simulation is very useful in the analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, the use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP (work in progress), the costs or the net profit can be analysed, before the changes are made and without disturbing the real system. Unlike other tools for the analysis of manufacturing systems, simulation takes into consideration uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects that a job which is late in one machine has on the remaining machines in its route through the layout. It is these effects that prevent any production plan from being fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object-oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user-friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic help in testing the validity of the models. Integration with software packages, with business graphics and statistical functions, is convenient in the result presentation phase.
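
    The event-list mechanics that SIMMEK builds on can be sketched in a few lines; the single-machine model below with assumed arrival and process-time distributions is only a toy stand-in for a full job shop:

```python
import heapq
import random

random.seed(9)
T_END = 480.0                                   # one shift, minutes
events = [(random.expovariate(1 / 10), "arrival")]
queue, busy_until, served, wait_sum = [], 0.0, 0, 0.0

while events:
    t, kind = heapq.heappop(events)
    if t > T_END:
        break
    if kind == "arrival":                       # job arrives, next arrival scheduled
        queue.append(t)
        heapq.heappush(events, (t + random.expovariate(1 / 10), "arrival"))
    if queue and t >= busy_until:               # machine free: start the next job
        wait_sum += t - queue.pop(0)
        busy_until = t + random.uniform(5, 12)  # stochastic process time
        heapq.heappush(events, (busy_until, "service_end"))
        served += 1

print(f"jobs served: {served}, mean wait: {wait_sum / max(served, 1):.1f} min")
```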

  15. Analysis of multi-scale chaotic characteristics of wind power based on Hilbert–Huang transform and Hurst analysis

    International Nuclear Information System (INIS)

    Liang, Zhengtang; Liang, Jun; Zhang, Li; Wang, Chengfu; Yun, Zhihao; Zhang, Xu

    2015-01-01

    Highlights: • A scale division method of wind power based on HHT and Hurst analysis is proposed. • The time–frequency components of wind power show different fractal structures. • These components are superposed and reconstructed into three scale subsequences. • Each subsequence has a chaotic characteristic and shows its own properties. • The EMD-LSSVM + ELM method improves the short-term wind power forecasting accuracy. - Abstract: The causes of uncertainty in wind farm power generation are not yet fully understood. A method for the scale division of wind power based on the Hilbert–Huang transform (HHT) and Hurst analysis is proposed in this paper, which allows the various multi-scale chaotic characteristics of wind power to be investigated and reveals further information about the dynamic behavior of wind power. First, the time–frequency characteristics of wind power are analyzed using the HHT, and then Hurst analysis is applied to analyze the stochastic/persistent characteristics of the different time–frequency components. Second, based on their fractal structures, the components are superposed and reconstructed into three series, defined as the Micro-, Meso- and Macro-scale subsequences. Finally, indices related to the statistical and behavioral characteristics of the subsequences are calculated and used to analyze their nonlinear dynamic behavior. Data collected from a wind farm in Hebei Province, China, are selected for case studies. The simulation results reveal that (1) although the time–frequency components can be decomposed, the different fractal structures of the signal are also derived from the original series; and (2) the three scale subsequences all present chaotic characteristics, and each exhibits its own unique properties. The Micro-scale subsequence shows strong randomness and contributes the least to the overall fluctuations; the Macro-scale subsequence is the steadiest and exhibits the most significant tendency

  16. Size and molecular weight determination of polysaccharides by means of nano electrospray gas-phase electrophoretic mobility molecular analysis (nES GEMMA).

    Science.gov (United States)

    Weiss, Victor U; Golesne, Monika; Friedbacher, Gernot; Alban, Susanne; Szymanski, Wladyslaw W; Marchetti-Deschmann, Martina; Allmaier, Günter

    2018-02-21

    Size, size distribution and molecular weight (MW) determination of nanoparticles, for example large polymers, are of great interest and pose an analytical challenge. In this context, nano electrospray gas-phase electrophoretic mobility molecular analysis (nES GEMMA) is a valuable tool with growing impact. Separation of singly charged analytes according to their electrophoretic mobility diameter (EMD), from single-digit EMDs up to diameters of several hundred nm, is possible. For spherical analytes, the EMD corresponds to the dry nanoparticle size. Additionally, the instrument is capable of number-based, single-particle detection, following the recommendation of the European Commission for nanoparticle characterization (2011/696/EU). If an EMD/MW correlation exists for a particular compound class (based on the availability of well-defined standards), a nanoparticle's MW can be determined from its EMD. In the present study, we focused on nES GEMMA of linear and branched, water-soluble polysaccharides forming nanoparticles, and were able to obtain spectra of singly charged species for both analyte classes. Based on the EMDs of the corresponding analytes, an excellent EMD/MW correlation was obtained for the branched natural polymer (dextran). This enables the determination of dextran MWs from nES GEMMA spectra despite high analyte polydispersity, and in a size/MW range where classical mass spectrometry is limited. EMD/MW correlations based on the linear polymers (pullulans, oat-β-glucans) were significantly different, possibly indicating challenges in the exact MW determination of these compounds by, for example, chromatographic and light scattering means. Despite these observations, nES GEMMA of linear, monosaccharide-based polymers enabled the determination of the size and size distribution of such dry bionanoparticles. © 2018 The Authors. Electrophoresis published by Wiley-VCH Verlag GmbH & Co. KGaA.

  17. Probabilistic analysis of external events with focus on the Fukushima event

    International Nuclear Information System (INIS)

    Kollasko, Heiko; Jockenhoevel-Barttfeld, Mariana; Klapp, Ulrich

    2014-01-01

    External hazards are those natural or man-made hazards to a site and its facilities that originate externally to both the site and its processes, i.e. the duty holder may have very little or no control over the hazard. External hazards can have the potential to cause initiating events at the plant, typically transients such as loss of offsite power. Simultaneously, external events may affect the safety systems required to control the initiating event and, where applicable, also the back-up systems implemented for risk reduction. Plant safety may especially be threatened when loads from external hazards exceed the load assumptions considered in the design of safety-related systems, structures and components. Another potential threat is posed by hazards inducing initiating events not otherwise considered in the safety demonstration. An example is loss of offsite power combined with prolonged plant isolation: offsite support usually credited in the deterministic safety analysis, e.g. the delivery of diesel fuel oil, may not be possible in this case. As the Fukushima events have shown, the biggest threat is likely posed by hazards inducing both effects. Such hazards may well be dominant risk contributors even if their return period is very long. In order to identify the relevant external hazards for a certain Nuclear Power Plant (NPP) location, a site-specific screening analysis is performed, both for single events and for combinations of external events. As a result of the screening analysis, risk-significant (screened-in) single external events and combinations of them are identified for a site. The screened-in events are further considered in a detailed event tree analysis in the frame of the Probabilistic Safety Analysis (PSA) to calculate the core damage/large release frequency resulting from each relevant external event or from each relevant combination. Screening analyses of external events performed at AREVA are based on the approach provided
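
    The quantification step for a screened-in hazard reduces to propagating an initiating event frequency through branch probabilities and summing the core damage sequences; a toy sketch with invented numbers:

```python
# All numbers are invented for illustration; real values come from the
# site-specific hazard, fragility and event tree analyses described above.
ie_freq = 1e-4                # external flooding initiating event, per year
p_loop = 0.5                  # P(loss of offsite power | flood)
p_diesel_fails = 0.05         # P(emergency diesels fail | LOOP)
p_no_support = 0.3            # P(offsite support impossible | prolonged isolation)

cdf = ie_freq * p_loop * p_diesel_fails * p_no_support   # one core damage sequence
print(f"core damage frequency from this hazard: {cdf:.1e} per year")
```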

  18. On practical challenges of decomposition-based hybrid forecasting algorithms for wind speed and solar irradiation

    International Nuclear Information System (INIS)

    Wang, Yamin; Wu, Lei

    2016-01-01

    This paper presents a comprehensive analysis of practical challenges of empirical mode decomposition (EMD) based algorithms for wind speed and solar irradiation forecasting that have been largely neglected in the literature, and proposes an alternative approach to mitigate them. Specifically, the challenges are: (1) decomposed sub-series are very sensitive to the original time series data; that is, the sub-series of a new time series, consisting of the original one plus a small number of new data samples, may differ significantly from those used to train the forecasting models. In turn, forecasting models established on the original sub-series may not suit the newly decomposed sub-series and have to be retrained frequently. (2) Key environmental factors usually play a critical role in non-decomposition-based methods for forecasting wind speed and solar irradiation; however, it is difficult to incorporate such critical environmental factors into the forecasting models of individual decomposed sub-series, because the correlation between the original data and the environmental factors is lost after decomposition. Numerical case studies on wind speed and solar irradiation forecasting show that the performance of existing EMD-based forecasting methods can be worse than that of non-decomposition-based forecasting models, and is not effective in practical cases. Finally, an approximated EMD-based forecasting model is proposed to mitigate the challenges and achieve better forecasting results than existing EMD-based forecasting algorithms and non-decomposition-based forecasting models in practical wind speed and solar irradiation forecasting cases. - Highlights: • Two challenges of existing EMD-based forecasting methods are discussed. • Sub-series change significantly in each step of the rolling forecast procedure. • Difficulties in incorporating environmental factors into sub-series forecasting models. • The approximated forecasting method is proposed to
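
    Challenge (1) is easy to reproduce; the sketch below decomposes a synthetic series, appends a few samples, decomposes again, and measures how much the first IMF changes over the common span (PyEMD is assumed as the EMD implementation):

```python
import numpy as np
from PyEMD import EMD

rng = np.random.default_rng(10)
t = np.arange(1000)
wind = np.sin(2 * np.pi * t / 97) + 0.5 * np.sin(2 * np.pi * t / 13) \
       + 0.3 * rng.standard_normal(t.size)

imf_old = EMD().emd(wind)[0]                          # first IMF, original series
extended = np.concatenate([wind, 0.3 * rng.standard_normal(10)])
imf_new = EMD().emd(extended)[0][:t.size]             # same span after extension

rmse = np.sqrt(np.mean((imf_old - imf_new) ** 2))
print(f"first-IMF RMSE over the common span: {rmse:.4f}")
# a non-negligible RMSE means sub-series models must be retrained as data arrive
```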

  19. MGR External Events Hazards Analysis

    International Nuclear Information System (INIS)

    Booth, L.

    1999-01-01

    The purpose and objective of this analysis is to apply an external events Hazards Analysis (HA) to the License Application Design Selection Enhanced Design Alternative II (LADS EDA II design, Reference 8.32). The output of the HA is called a Hazards List (HL). This analysis supersedes the external hazards portion of Rev. 00 of the PHA (Reference 8.1). The PHA for internal events will also be updated to the LADS EDA II design, but under a separate analysis. Like the PHA methodology, the HA methodology provides a systematic method to identify potential hazards during the 100-year Monitored Geologic Repository (MGR) operating period, updated to reflect the EDA II design. The resulting events on the HL are candidates that may have potential radiological consequences, as determined during Design Basis Event (DBE) analyses. Therefore, the HL that results from this analysis will undergo further screening and analysis based on the criteria that apply during the performance of DBE analyses

  20. Fringe-projection profilometry based on two-dimensional empirical mode decomposition.

    Science.gov (United States)

    Zheng, Suzhen; Cao, Yiping

    2013-11-01

    In 3D shape measurement, deformed fringes often contain low-frequency information degraded by random noise and background intensity information; a new fringe-projection profilometry is therefore proposed based on 2D empirical mode decomposition (2D-EMD). The fringe pattern is first decomposed into a number of intrinsic mode functions by 2D-EMD. Because the method provides partial noise reduction, the background components can be removed to obtain the fundamental components needed for the Hilbert transformation that retrieves the phase information. 2D-EMD can effectively extract the modulation phase of a single-direction fringe and of an inclined fringe pattern because it is a fully 2D analysis method and considers the relationship between adjacent lines of the fringe pattern. In addition, as the method does not add noise repeatedly, as ensemble EMD does, the data processing time is shortened. Computer simulations and experiments prove the feasibility of this method.
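
    The phase-retrieval step can be sketched as below, with a simple row-mean subtraction standing in for the removal of 2D-EMD background modes; the fringe pattern is synthetic and the whole pipeline is an illustration, not the authors' algorithm:

```python
import numpy as np
from scipy.signal import hilbert

x = np.linspace(0, 8 * np.pi, 512)                     # carrier phase across columns
u = np.linspace(-2, 2, 512)
height = np.exp(-(u[:, None] ** 2 + u[None, :] ** 2))  # synthetic object height
fringe = 100 + 80 * np.cos(x[None, :] + 4 * height)    # background + modulated fringe

fundamental = fringe - fringe.mean(axis=1, keepdims=True)  # crude background removal
wrapped = np.angle(hilbert(fundamental, axis=1))           # Hilbert phase per row
phase = np.unwrap(wrapped, axis=1) - x[None, :]            # remove the carrier
print("recovered phase range:", phase.min().round(2), "to", phase.max().round(2))
```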

  1. Fault diagnosis of rolling bearings based on multifractal detrended fluctuation analysis and Mahalanobis distance criterion

    Science.gov (United States)

    Lin, Jinshan; Chen, Qian

    2013-07-01

    Vibration data of faulty rolling bearings are usually nonstationary and nonlinear, and contain fairly weak fault features. As a result, feature extraction from rolling bearing fault data has long been an intractable problem and has attracted considerable attention. This paper introduces multifractal detrended fluctuation analysis (MF-DFA) to analyze bearing vibration data and proposes a novel method for fault diagnosis of rolling bearings based on MF-DFA and the Mahalanobis distance criterion (MDC). MF-DFA, an extension of monofractal DFA, is a powerful tool for uncovering the nonlinear dynamical characteristics buried in nonstationary time series and can capture minor changes in the condition of a complex system. To begin with, the multifractality of bearing fault data was quantified by MF-DFA through the generalized Hurst exponent, the scaling exponent and the multifractal spectrum. Consequently, being controlled by essentially different dynamical mechanisms, the multifractality of four heterogeneous bearing fault data sets is significantly different; by contrast, being controlled by only slightly different dynamical mechanisms, the multifractality of homogeneous bearing fault data with different fault diameters is significantly or slightly different depending on the type of bearing fault. Therefore, the multifractal spectrum, as a set of parameters describing the multifractality of a time series, can be employed to characterize different types and severities of bearing faults. Subsequently, five characteristic parameters sensitive to changes in bearing fault conditions were extracted from the multifractal spectrum and used to construct fault features of the bearing fault data. Moreover, Hilbert transform based envelope analysis, empirical mode decomposition (EMD) and the wavelet transform (WT) were applied to the same bearing fault data. Also, the kurtosis and the peak levels of the EMD or WT component corresponding to the bearing tones in the frequency domain were carefully checked
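
    The MDC step amounts to assigning a test feature vector to the fault class with the smallest Mahalanobis distance; a minimal sketch on synthetic features (not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(11)
training = {                                  # multifractal-spectrum features per class
    "inner race fault": rng.standard_normal((50, 5)) + 1.0,
    "outer race fault": rng.standard_normal((50, 5)) - 1.0,
}

def mahalanobis(x, data):
    """Distance of x from the cloud's mean, scaled by its covariance."""
    mu = data.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(data, rowvar=False))
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

test = rng.standard_normal(5) + 1.0           # unlabeled feature vector
label = min(training, key=lambda c: mahalanobis(test, training[c]))
print("assigned fault class:", label)
```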

  2. Frontotemporal Dysfunction in Amyotrophic Lateral Sclerosis: A Discriminant Function Analysis.

    Science.gov (United States)

    Nidos, Andreas; Kasselimis, Dimitrios S; Simos, Panagiotis G; Rentzos, Michael; Alexakis, Theodoros; Zalonis, Ioannis; Zouvelou, Vassiliki; Potagas, Constantin; Evdokimidis, Ioannis; Woolley, Susan C

    2016-01-01

    There is growing evidence of extramotor dysfunction (EMd) in amyotrophic lateral sclerosis (ALS), with a reported prevalence of up to 52%. In the present study, we explore the clinical utility of a brief neuropsychological battery for the investigation of cognitive, behavioral, and language deficits in patients with ALS. Thirty-four consecutive ALS patients aged 44-89 years were tested with a brief neuropsychological battery including executive, behavioral, and language measures. Patients were initially classified as EMd or non-EMd based on their scores on the Frontal Assessment Battery (FAB). Between-group comparisons revealed significant differences on all measures (p < 0.01). Discriminant analysis yielded a single canonical function, with all tests serving as significant predictors. This function agreed with the FAB in 13 of 17 patients screened as EMd and identified extramotor deficits in 2 additional patients. Overall sensitivity and specificity estimates against the FAB were 88.2%. We stress the importance of discriminant function analysis in clinical neuropsychological assessment and argue that the proposed neuropsychological battery may be of clinical value, especially when the option of extensive and comprehensive neuropsychological testing is limited. The psychometric validity of an ALS-frontotemporal dementia diagnosis using neuropsychological tests is also discussed. © 2015 S. Karger AG, Basel.

  3. Normalization Strategies for Enhancing Spatio-Temporal Analysis of Social Media Responses during Extreme Events: A Case Study based on Analysis of Four Extreme Events using Socio-Environmental Data Explorer (SEDE)

    Directory of Open Access Journals (Sweden)

    J. Ajayakumar

    2017-10-01

    With social media becoming increasingly location-based, there has been a greater push from researchers across various domains, including social science, public health, and disaster management, to tap into the spatial, temporal, and textual data available from these sources to analyze public response during extreme events such as an epidemic outbreak or a natural disaster. Studies based on demographics and other socio-economic factors suggest that social media data can be highly skewed by variations in population density from place to place. To capture the spatio-temporal variations in public response during extreme events we have developed the Socio-Environmental Data Explorer (SEDE). SEDE collects and integrates social media, news and environmental data to support exploration and assessment of public response to extreme events. For this study, using SEDE, we conduct spatio-temporal social media response analysis of four major extreme events in the United States: the “North American storm complex” in December 2015, the “snowstorm Jonas” in January 2016, the “West Virginia floods” in June 2016, and “Hurricane Matthew” in October 2016. The analysis uses geo-tagged social media data from Twitter and warnings from the storm events database provided by the National Centers for Environmental Information (NCEI). Results demonstrate that, to support complex social media analyses, spatial and population-based normalization and filtering are necessary. These results suggest that, when developing software solutions to support the analysis of non-conventional data sources such as social media, it is essential to identify the inherent biases associated with the data sources, and to adapt techniques and enhance capabilities to mitigate the bias. The normalization strategies that we have developed and incorporated into SEDE will be helpful in reducing the population bias associated with

  4. Normalization Strategies for Enhancing Spatio-Temporal Analysis of Social Media Responses during Extreme Events: A Case Study based on Analysis of Four Extreme Events using Socio-Environmental Data Explorer (SEDE)

    Science.gov (United States)

    Ajayakumar, J.; Shook, E.; Turner, V. K.

    2017-10-01

    With social media becoming increasingly location-based, there has been a greater push from researchers across various domains, including social science, public health, and disaster management, to tap into the spatial, temporal, and textual data available from these sources to analyze public response during extreme events such as an epidemic outbreak or a natural disaster. Studies based on demographics and other socio-economic factors suggest that social media data can be highly skewed by variations in population density from place to place. To capture the spatio-temporal variations in public response during extreme events we have developed the Socio-Environmental Data Explorer (SEDE). SEDE collects and integrates social media, news and environmental data to support exploration and assessment of public response to extreme events. For this study, using SEDE, we conduct spatio-temporal social media response analysis of four major extreme events in the United States: the "North American storm complex" in December 2015, the "snowstorm Jonas" in January 2016, the "West Virginia floods" in June 2016, and "Hurricane Matthew" in October 2016. The analysis uses geo-tagged social media data from Twitter and warnings from the storm events database provided by the National Centers for Environmental Information (NCEI). Results demonstrate that, to support complex social media analyses, spatial and population-based normalization and filtering are necessary. These results suggest that, when developing software solutions to support the analysis of non-conventional data sources such as social media, it is essential to identify the inherent biases associated with the data sources, and to adapt techniques and enhance capabilities to mitigate the bias. The normalization strategies that we have developed and incorporated into SEDE will be helpful in reducing the population bias associated with social media data and will be useful

  5. Risk and sensitivity analysis in relation to external events

    International Nuclear Information System (INIS)

    Alzbutas, R.; Urbonas, R.; Augutis, J.

    2001-01-01

    This paper presents a risk and sensitivity analysis of external event impacts on safe operation in general, and on the Ignalina Nuclear Power Plant safety systems in particular. The analysis is based on deterministic and probabilistic assumptions and on assessment of the external hazards. Real statistical data are used, as well as initial external event simulation. Preliminary screening criteria are applied. For the investigated external hazards, the analysis of external event impact on safe NPP operation, assessment of event occurrence, sensitivity analysis, and recommendations for safety improvements are performed. Such events as aircraft crash, extreme rains and winds, forest fire and flying parts of the turbine are analysed. Models are developed and probabilities are calculated. As an example of the sensitivity analysis, the model of aircraft impact is presented. The sensitivity analysis takes into account the uncertainty introduced by the external event and its model. Even when the external events analysis shows rather limited danger, the sensitivity analysis can determine the causes with the highest influence. These possible future variations can be significant for the safety level and for risk-based decisions. Calculations show that external events cannot significantly influence the safety level of Ignalina NPP operation; however, the occurrence and propagation of the events can be quite uncertain. (author)

  6. Electrocardiogram signal denoising based on empirical mode decomposition technique: an overview

    International Nuclear Information System (INIS)

    Han, G.; Lin, B.; Xu, Z.

    2017-01-01

    The electrocardiogram (ECG) signal is a nonlinear, non-stationary, weak signal that reflects whether the heart is functioning normally or abnormally. The ECG signal is susceptible to various kinds of noise, such as high/low frequency noise, powerline interference and baseline wander. Hence, the removal of noise from the ECG signal is a vital link in ECG signal processing and plays a significant role in the detection and diagnosis of heart disease. This review describes recent developments in ECG signal denoising based on the Empirical Mode Decomposition (EMD) technique, including high-frequency noise removal, powerline interference separation, baseline wander correction, the combination of EMD with other methods, and the EEMD technique. EMD is a promising, but not yet perfect, method for processing nonlinear and non-stationary signals such as the ECG. Combining EMD with other algorithms is a good way to improve the performance of noise cancellation. The pros and cons of the EMD technique in ECG signal denoising are discussed in detail. Finally, future work and challenges in EMD-based ECG signal denoising are outlined.

  7. Electrocardiogram signal denoising based on empirical mode decomposition technique: an overview

    Science.gov (United States)

    Han, G.; Lin, B.; Xu, Z.

    2017-03-01

    The electrocardiogram (ECG) signal is a nonlinear, non-stationary, weak signal that reflects whether the heart is functioning normally or abnormally. The ECG signal is susceptible to various kinds of noise, such as high/low frequency noise, powerline interference and baseline wander. Hence, the removal of noise from the ECG signal is a vital link in ECG signal processing and plays a significant role in the detection and diagnosis of heart disease. This review describes recent developments in ECG signal denoising based on the Empirical Mode Decomposition (EMD) technique, including high-frequency noise removal, powerline interference separation, baseline wander correction, the combination of EMD with other methods, and the EEMD technique. EMD is a promising, but not yet perfect, method for processing nonlinear and non-stationary signals such as the ECG. Combining EMD with other algorithms is a good way to improve the performance of noise cancellation. The pros and cons of the EMD technique in ECG signal denoising are discussed in detail. Finally, future work and challenges in EMD-based ECG signal denoising are outlined.

  8. OBEST: The Object-Based Event Scenario Tree Methodology

    International Nuclear Information System (INIS)

    WYSS, GREGORY D.; DURAN, FELICIA A.

    2001-01-01

    Event tree analysis and Monte Carlo-based discrete event simulation have been used in risk assessment studies for many years. This report details how features of these two methods can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology with some of the best features of each. The resulting Object-Based Event Scenario Tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible (especially those that exhibit inconsistent or variable event ordering, which is difficult to represent in an event tree analysis). Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST method uses a recursive algorithm to solve the object model and identify all possible scenarios and their associated probabilities. Since scenario likelihoods are developed directly by the solution algorithm, they need not be computed by statistical inference from Monte Carlo observations (as required by some discrete event simulation methods). Thus, OBEST is not only much more computationally efficient than these simulation methods, but it also discovers scenarios that have extremely low probabilities as a natural analytical result: scenarios that would likely be missed by a Monte Carlo-based method. This report documents the OBEST methodology and the demonstration software that implements it, and provides example OBEST models for several different application domains, including interactions among failing interdependent infrastructure systems, circuit analysis for fire risk evaluation in nuclear power plants, and aviation safety studies
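
    The recursive enumeration idea can be sketched in a few lines: each object contributes probabilistic branches and a depth-first recursion yields every scenario with its likelihood, including very rare ones. The objects and probabilities below are invented for illustration; this is not Sandia's tool:

```python
# object name -> probabilistic branches (all values invented)
objects = [
    ("pump",     [("runs", 0.98), ("fails", 0.02)]),
    ("valve",    [("opens", 0.995), ("sticks", 0.005)]),
    ("operator", [("acts", 0.9), ("omits", 0.1)]),
]

def scenarios(items, path=(), prob=1.0):
    """Depth-first recursion over all branch combinations."""
    if not items:
        yield path, prob
        return
    name, branches = items[0]
    for outcome, p in branches:
        yield from scenarios(items[1:], path + ((name, outcome),), prob * p)

for path, p in sorted(scenarios(objects), key=lambda s: s[1]):
    print(f"{p:.6f}  {dict(path)}")   # rare scenarios appear analytically, not sampled
```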

  9. Event analysis in a primary substation

    Energy Technology Data Exchange (ETDEWEB)

    Jaerventausta, P; Paulasaari, H [Tampere Univ. of Technology (Finland); Partanen, J [Lappeenranta Univ. of Technology (Finland)

    1998-08-01

    The target of the project was to develop applications that observe the functions of a protection system using modern microprocessor-based relays. Microprocessor-based relays have three essential capabilities: communication with SCADA, an internal clock to produce time-stamped event data, and the capability to register certain values during a fault. Using these features, some new functions for event analysis were developed in the project

  10. Bayesian analysis of rare events

    Energy Technology Data Exchange (ETDEWEB)

    Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
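
    The rejection-sampling view behind BUS can be sketched as follows: prior samples are accepted with probability proportional to their likelihood, and the accepted set serves as a posterior sample from which a rare event probability is re-estimated. The one-parameter model and all numbers are toy assumptions; the low acceptance rate and the few posterior exceedances also hint at why FORM, IS and SuS are preferred in practice:

```python
import numpy as np

rng = np.random.default_rng(12)
obs, sigma = 1.2, 0.5                     # one noisy measurement of theta

theta = rng.standard_normal(200_000)      # prior: standard normal
like = np.exp(-0.5 * ((obs - theta) / sigma) ** 2)
c = 1.0                                   # likelihood kernel is bounded by 1
accept = rng.uniform(size=theta.size) < like / c
posterior = theta[accept]                 # posterior sample by rejection

# rare event: theta exceeding a high threshold
p_prior = np.mean(theta > 2.5)
p_post = np.mean(posterior > 2.5)
print(f"P(theta > 2.5): prior {p_prior:.1e} -> posterior {p_post:.1e}")
```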

  11. Host Event Based Network Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Jonathan Chugg

    2013-01-01

    The purpose of INL’s research on this project is to demonstrate the feasibility of a host event based network monitoring tool and its effects on host performance. Current host-based network monitoring tools work by polling, which can miss activity occurring between polls. Instead of polling, a tool could be developed that uses event APIs in the operating system to receive asynchronous notifications of network activity. Analysis and logging of these events will allow the tool to construct the complete real-time and historical network configuration of the host while the tool is running. This research focused on three major operating systems commonly used by SCADA systems: Linux, Windows XP, and Windows 7. Windows 7 offers two paths that have minimal impact on the system and should be seriously considered: first, the new Windows Event Logging API, and second, the ALE API within the Windows Filtering Platform (WFP). Any future work should focus on these methods.

  12. Research on Visual Analysis Methods of Terrorism Events

    Science.gov (United States)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

    With terrorism events occurring more and more frequently throughout the world, improving the response capability to social security incidents has become an important test of a government's governance ability. Visual analysis has become an important method of event analysis because of its intuitiveness and effectiveness. To analyse the spatio-temporal distribution characteristics of events, the correlations among event items, and development trends, the spatio-temporal characteristics of terrorism events are discussed. A suitable event data table structure based on the "5W" theory is designed. Then, six types of visual analysis are proposed, and the use of thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments were carried out using the data provided by the Global Terrorism Database, and the results prove the validity of the methods.

  13. External events analysis of the Ignalina Nuclear Power Plant

    International Nuclear Information System (INIS)

    Liaukonis, Mindaugas; Augutis, Juozas

    1999-01-01

    This paper presents an analysis of the impact of external events on the safe operation of the Ignalina Nuclear Power Plant (INPP) safety systems. The analysis was based on probabilistic estimation and modelling of the external hazards. Screening criteria were applied to a number of external hazards. External events requiring further bounding study, such as aircraft crash onto the INPP, external flooding, fire, and extreme winds, were analysed. Mathematical models were developed and event probabilities were calculated. The external events analysis showed rather limited external event danger to the Ignalina NPP. Results of the analysis were compared to analogous analyses of western NPPs, and no great differences were identified. The calculations performed show that external events cannot significantly influence the safety level of Ignalina NPP operation. (author)

  14. Ontology-Based Vaccine Adverse Event Representation and Analysis.

    Science.gov (United States)

    Xie, Jiangan; He, Yongqun

    2017-01-01

    Vaccines are among the greatest inventions of modern medicine, having contributed most to the relief of human misery and the striking increase in life expectancy. In 1796, an English country physician, Edward Jenner, discovered that inoculating mankind with cowpox can protect them from smallpox (Riedel S, Edward Jenner and the history of smallpox and vaccination. Proceedings (Baylor University. Medical Center) 18(1):21, 2005). Based on worldwide vaccination, smallpox was finally eradicated in 1977 (Henderson, Vaccine 29:D7-D9, 2011). Other disabling and lethal diseases, like poliomyelitis and measles, are targeted for eradication (Bonanni, Vaccine 17:S120-S125, 1999). Although vaccine development and administration are tremendously successful and cost-effective practices for human health, no vaccine is 100% safe for everyone, because each person reacts to vaccination differently given different genetic backgrounds and health conditions. Although all licensed vaccines are generally safe for the majority of people, vaccinees may still suffer adverse events (AEs) in reaction to various vaccines, some of which can be serious or even fatal (Haber et al., Drug Saf 32(4):309-323, 2009). Hence, the double-edged sword of vaccination remains a concern. To support integrative AE data collection and analysis, it is critical to adopt an AE normalization strategy. In the past decades, different controlled terminologies, including the Medical Dictionary for Regulatory Activities (MedDRA) (Brown EG, Wood L, Wood S, et al., Drug Saf 20(2):109-117, 1999), the Common Terminology Criteria for Adverse Events (CTCAE) (NCI, The Common Terminology Criteria for Adverse Events (CTCAE). Available from: http://evs.nci.nih.gov/ftp1/CTCAE/About.html . Access on 7 Oct 2015), and the World Health Organization (WHO) Adverse Reactions Terminology (WHO-ART) (WHO, The WHO Adverse Reaction Terminology - WHO-ART. Available from: https://www.umc-products.com/graphics/28010.pdf

  15. Depth information in natural environments derived from optic flow by insect motion detection system: A model analysis

    Directory of Open Access Journals (Sweden)

    Alexander Schwegmann

    2014-08-01

    Full Text Available Knowing the depth structure of the environment is crucial for moving animals in many behavioral contexts, such as collision avoidance, targeting objects, or spatial navigation. An important source of depth information is motion parallax. This powerful cue is generated on the eyes during translatory self-motion with the retinal images of nearby objects moving faster than those of distant ones. To investigate how the visual motion pathway represents motion-based depth information we analyzed its responses to image sequences recorded in natural cluttered environments with a wide range of depth structures. The analysis was done on the basis of an experimentally validated model of the visual motion pathway of insects, with its core elements being correlation-type elementary motion detectors (EMDs. It is the key result of our analysis that the absolute EMD responses, i.e. the motion energy profile, represent the contrast-weighted nearness of environmental structures during translatory self-motion at a roughly constant velocity. In other words, the output of the EMD array highlights contours of nearby objects. This conclusion is largely independent of the scale over which EMDs are spatially pooled and was corroborated by scrutinizing the motion energy profile after eliminating the depth structure from the natural image sequences. Hence, the well-established dependence of correlation-type EMDs on both velocity and textural properties of motion stimuli appears to be advantageous for representing behaviorally relevant information about the environment in a computationally parsimonious way.
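
    The correlation-type EMD named above is simple enough to sketch directly. Below is a minimal NumPy illustration (not the authors' model, whose filters and parameters differ) of a one-dimensional array of Hassenstein-Reichardt detectors: each unit low-pass filters one photoreceptor signal, correlates it with the undelayed signal of a neighbouring receptor, and subtracts the mirror-symmetric product; the absolute value of the output is the kind of motion energy profile discussed in the abstract. All constants are illustrative.

```python
import numpy as np

def lowpass(x, tau, dt=1.0):
    """First-order low-pass filter (discrete leaky integrator) along time axis 0."""
    a = dt / (tau + dt)
    y = np.zeros_like(x)
    for t in range(1, x.shape[0]):
        y[t] = y[t - 1] + a * (x[t] - y[t - 1])
    return y

def reichardt_emd(receptors, tau=10.0):
    """
    receptors: array (time, n_photoreceptors) of brightness signals.
    Returns the opponent EMD output (time, n_photoreceptors - 1); its absolute
    value is a motion-energy profile that emphasizes nearby, fast-moving contours.
    """
    delayed = lowpass(receptors, tau)            # delayed (filtered) branch
    left, right = receptors[:, :-1], receptors[:, 1:]
    d_left, d_right = delayed[:, :-1], delayed[:, 1:]
    # mirror-symmetric subunits: delayed signal times undelayed neighbour
    return d_left * right - d_right * left

# toy usage: a drifting sine grating produces a signed, direction-dependent response
t = np.arange(500)[:, None]
x = np.arange(64)[None, :]
grating = np.sin(0.3 * x - 0.05 * t)             # pattern drifting to the right
energy = np.abs(reichardt_emd(grating))          # motion energy profile
print(energy.mean())
```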

  16. Balboa: A Framework for Event-Based Process Data Analysis

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1998-01-01

    .... We have built Balboa as a bridge between the data collection and the analysis tools, facilitating the gathering and management of event data, and simplifying the construction of tools to analyze the data...

  17. External event analysis methods for NUREG-1150

    International Nuclear Information System (INIS)

    Bohn, M.P.; Lambright, J.A.

    1989-01-01

    The US Nuclear Regulatory Commission is sponsoring probabilistic risk assessments of six operating commercial nuclear power plants as part of a major update of the understanding of risk as provided by the original WASH-1400 risk assessments. In contrast to the WASH-1400 studies, at least two of the NUREG-1150 risk assessments will include an analysis of risks due to earthquakes, fires, floods, etc., which are collectively known as external events. This paper summarizes the methods to be used in the external event analysis for NUREG-1150 and the results obtained to date. The two plants for which external events are being considered are Surry and Peach Bottom, a PWR and a BWR, respectively. The external event analyses (through core damage frequency calculations) were completed in June 1989, with final documentation available in September. In contrast to most past external event analyses, wherein rudimentary systems models were developed reflecting each external event under consideration, the simplified NUREG-1150 analyses are based on the availability of the full internal event PRA systems models (event trees and fault trees) and make use of extensive computer-aided screening to reduce them to the sequence cut sets important to each external event. This provides two major advantages: consistency and scrutability with respect to the internal event analysis are achieved, and the full gamut of random and test/maintenance unavailabilities is automatically included, while only those that are probabilistically important survive the screening process. Thus, full benefit of the internal event analysis is obtained by performing the internal and external event analyses sequentially.

  18. A Key Event Path Analysis Approach for Integrated Systems

    Directory of Open Access Journals (Sweden)

    Jingjing Liao

    2012-01-01

    Full Text Available By studying the key event paths of probabilistic event structure graphs (PESGs), a key event path analysis approach for integrated system models is proposed. According to translation rules derived from integrated system architecture descriptions, the corresponding PESGs are constructed from colored Petri net (CPN) models. Then the definitions of cycle event paths, sequence event paths, and key event paths are given. Afterwards, based on the statistical results of the CPN model simulations, key event paths are identified through a sensitivity analysis approach. This approach focuses on the logic structures of CPN models, is reliable, and could serve as the basis of structured analysis for discrete event systems. An example of a radar model is given to illustrate the application of this approach, and the results are worthy of trust.

  19. Dynamic Event Tree Analysis Through RAVEN

    Energy Technology Data Exchange (ETDEWEB)

    A. Alfonsi; C. Rabiti; D. Mandelli; J. Cogliati; R. A. Kinoshita; A. Naviglio

    2013-09-01

    Conventional event-tree (ET) based methodologies are extensively used as tools to perform reliability and safety assessment of complex and critical engineering systems. One of the disadvantages of these methods is that the timing/sequencing of events and system dynamics are not explicitly accounted for in the analysis. In order to overcome these limitations several techniques, also known as Dynamic Probabilistic Risk Assessment (D-PRA), have been developed. Monte Carlo (MC) and Dynamic Event Tree (DET) are two of the most widely used D-PRA methodologies for performing safety assessment of Nuclear Power Plants (NPPs). In the past two years, the Idaho National Laboratory (INL) has developed its own tool to perform Dynamic PRA: RAVEN (Reactor Analysis and Virtual control ENvironment). RAVEN has been designed in a highly modular and pluggable way in order to enable easy integration of different programming languages (i.e., C++, Python) and coupling with other applications, including those based on the MOOSE framework, also developed by INL. RAVEN performs two main tasks: 1) control logic driver for the new thermal-hydraulic code RELAP-7 and 2) post-processing tool. In the first task, RAVEN acts as a deterministic controller in which a set of user-defined control logic laws monitors the RELAP-7 simulation and controls the activation of specific systems. Moreover, RAVEN also models stochastic events, such as component failures, and performs uncertainty quantification. Such stochastic modeling is employed by using both MC and DET algorithms. In the second task, RAVEN processes the large amount of data generated by RELAP-7 using data-mining based algorithms. This paper focuses on the first task and shows how it is possible to perform the analysis of dynamic stochastic systems using the newly developed RAVEN DET capability. As an example, the Dynamic PRA analysis, using a Dynamic Event Tree, of a simplified pressurized water reactor for a Station Black-Out scenario is presented.
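
    The branching logic of a dynamic event tree is easy to illustrate outside of RAVEN. The toy Python sketch below (purely hypothetical; it uses none of RAVEN's APIs) advances a scalar plant state in time and, whenever a demand condition is met, splits the current branch into success and failure children that carry the accumulated branch probabilities; leaves record the sequence outcome and timing.

```python
# Toy dynamic event tree: branch the simulation whenever a setpoint is crossed.
# Illustrative only -- not RAVEN code. 'state' is a scalar "temperature".

def simulate(state, t, prob, failed, horizon=10, results=None):
    if results is None:
        results = []
    while t < horizon:
        state += 8.0 if failed else -1.0         # heat up if cooling failed
        t += 1
        if state > 100.0:                        # core-damage surrogate
            results.append((t, prob, "damage"))
            return results
        if not failed and state < 80.0:          # trigger: safety system demanded
            # branch on component success (p = 0.99) vs failure (p = 0.01)
            simulate(state + 10.0, t, prob * 0.99, False, horizon, results)
            simulate(state, t, prob * 0.01, True, horizon, results)
            return results
    results.append((t, prob, "ok"))
    return results

# each leaf is (time, accumulated branch probability, outcome)
for leaf in simulate(state=85.0, t=0, prob=1.0, failed=False):
    print(leaf)
```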

  20. Extreme flood event analysis in Indonesia based on rainfall intensity and recharge capacity

    Science.gov (United States)

    Narulita, Ida; Ningrum, Widya

    2018-02-01

    Indonesia is very vulnerable to flood disasters because it experiences high-rainfall events throughout the year. Flood is categorized as the most important hazard because it causes social, economic and human losses. The purpose of this study is to analyze extreme flood events based on satellite rainfall datasets to understand the rainfall characteristics (rainfall intensity, rainfall pattern, etc.) that occurred before flood disasters in areas with monsoonal, equatorial and local rainfall types. Recharge capacity is analyzed using land cover and soil distribution. The data used in this study are CHIRPS satellite rainfall data at 0.05° spatial resolution and daily temporal resolution, the GSMaP satellite rainfall dataset operated by JAXA at 1-hour temporal resolution and 0.1° spatial resolution, and land use and soil distribution maps for the recharge capacity analysis. The rainfall characteristics before flooding and the recharge capacity analysis are expected to provide important information for flood mitigation in Indonesia.

  1. Time-frequency analysis: mathematical analysis of the empirical mode decomposition.

    Science.gov (United States)

    2009-01-01

    Invented over 10 years ago, empirical mode decomposition (EMD) provides a nonlinear time-frequency analysis with the ability to successfully analyze nonstationary signals. Mathematical Analysis of the Empirical Mode Decomposition is a...
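
    For readers new to the method, the sifting procedure at the core of EMD can be sketched in a few lines. The simplified NumPy/SciPy sketch below uses a fixed number of sifting iterations instead of a rigorous stopping criterion; production analyses would rely on a maintained implementation such as the PyEMD package.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def sift(x, n_sift=10):
    """Extract one IMF via simplified sifting: repeatedly subtract the mean envelope."""
    h, t = x.copy(), np.arange(len(x))
    for _ in range(n_sift):
        maxima = argrelextrema(h, np.greater)[0]
        minima = argrelextrema(h, np.less)[0]
        if len(maxima) < 4 or len(minima) < 4:   # too few extrema to envelope
            break
        upper = CubicSpline(maxima, h[maxima])(t)
        lower = CubicSpline(minima, h[minima])(t)
        h = h - (upper + lower) / 2.0
    return h

def emd(x, n_imfs=4):
    """Decompose x into IMFs (fine to coarse) plus a residual trend, returned last."""
    imfs, residual = [], x.copy()
    for _ in range(n_imfs):
        imf = sift(residual)
        imfs.append(imf)
        residual = residual - imf
    return imfs + [residual]

t = np.linspace(0, 1, 1000)
signal = np.sin(2 * np.pi * 5 * t) + np.sin(2 * np.pi * 40 * t) + t
modes = emd(signal)
print(len(modes), [np.abs(m).max().round(2) for m in modes])
```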

  2. Stochastic Learning of Multi-Instance Dictionary for Earth Mover's Distance based Histogram Comparison

    OpenAIRE

    Fan, Jihong; Liang, Ru-Ze

    2016-01-01

    Dictionary plays an important role in multi-instance data representation. It maps bags of instances to histograms. The earth mover's distance (EMD) is the most effective histogram distance metric for the application of multi-instance retrieval. However, up to now, no existing multi-instance dictionary learning method has been designed for EMD-based histogram comparison. To fill this gap, we develop the first EMD-optimal dictionary learning method using a stochastic optimization method. In the stoc...
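
    For reference, the histogram metric in question is available in SciPy for the one-dimensional case as scipy.stats.wasserstein_distance. The snippet below only illustrates the metric itself on two toy dictionary histograms; it is unrelated to the paper's dictionary learning procedure.

```python
import numpy as np
from scipy.stats import wasserstein_distance

# Two bags of instances mapped to histograms over 8 dictionary "words".
bins = np.arange(8)
hist_a = np.array([0.4, 0.3, 0.1, 0.1, 0.05, 0.05, 0.0, 0.0])
hist_b = np.array([0.0, 0.05, 0.05, 0.1, 0.1, 0.1, 0.3, 0.4])

# 1-D EMD between histograms = Wasserstein distance between weighted bin values.
d = wasserstein_distance(bins, bins, u_weights=hist_a, v_weights=hist_b)
print(f"EMD between histograms: {d:.3f}")
```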

  3. Multiple daytime nucleation events in semi-clean savannah and industrial environments in South Africa: analysis based on observations

    Directory of Open Access Journals (Sweden)

    A. Hirsikko

    2013-06-01

    Full Text Available Recent studies have shown very high frequencies of atmospheric new particle formation in different environments in South Africa. Our aim here was to investigate the causes of two or three consecutive daytime nucleation events, followed by subsequent particle growth during the same day. We analysed 108 and 31 such days observed in a polluted industrial and a moderately polluted rural environment, respectively, in South Africa. The analysis was based on two years of measurements at each site. After rejecting the days having notable changes in the air mass origin or local wind direction, i.e. two major reasons for observed multiple nucleation events, we were able to investigate other factors causing this phenomenon. Clouds were present during, or in between, most of the analysed multiple particle formation events. Therefore, some of these events may have been single events, interrupted somehow by the presence of clouds. From further analysis, we propose that the first nucleation and growth event of the day was often associated with the mixing of a residual air layer rich in SO2 (oxidized to sulphuric acid) into the shallow surface-coupled layer. The second nucleation and growth event of the day usually started before midday and was sometimes associated with renewed SO2 emissions of industrial origin. However, it was also evident that vapours other than sulphuric acid were required for the particle growth during both events. This was especially the case when two simultaneously growing particle modes were observed. Based on our analysis, we conclude that the relative contributions of estimated H2SO4 and other vapours to the first and second nucleation and growth events of the day varied from day to day, depending on anthropogenic and natural emissions, as well as atmospheric conditions.

  4. TEMAC, Top Event Sensitivity Analysis

    International Nuclear Information System (INIS)

    Iman, R.L.; Shortencarier, M.J.

    1988-01-01

    1 - Description of program or function: TEMAC is designed to permit the user to easily estimate risk and to perform sensitivity and uncertainty analyses with a Boolean expression such as produced by the SETS computer program. SETS produces a mathematical representation of a fault tree used to model system unavailability. In the terminology of the TEMAC program, such a mathematical representation is referred to as a top event. The analysis of risk involves the estimation of the magnitude of risk, the sensitivity of risk estimates to base event probabilities and initiating event frequencies, and the quantification of the uncertainty in the risk estimates. 2 - Method of solution: Sensitivity and uncertainty analyses associated with top events involve mathematical operations on the corresponding Boolean expression for the top event, as well as repeated evaluations of the top event in a Monte Carlo fashion. TEMAC employs a general matrix approach which provides a convenient general form for Boolean expressions, is computationally efficient, and allows large problems to be analyzed. 3 - Restrictions on the complexity of the problem - Maxima of: 4000 cut sets, 500 events, 500 values in a Monte Carlo sample, 16 characters in an event name. These restrictions are implemented through the FORTRAN 77 PARAMETER statement
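
    The kind of quantification TEMAC automates can be illustrated with a toy example. The sketch below (hypothetical; it does not reproduce TEMAC's matrix formalism) evaluates a top event from its minimal cut sets with the common min-cut upper bound and then computes a simple risk-increase sensitivity by setting each basic event probability to one.

```python
# Top event from minimal cut sets: P(top) <= 1 - prod(1 - P(cut set)),
# the "min-cut upper bound" commonly used in fault tree quantification.

def top_event_prob(cut_sets, p):
    q = 1.0
    for cs in cut_sets:
        p_cs = 1.0
        for e in cs:
            p_cs *= p[e]        # cut set fails only if all its events fail
        q *= 1.0 - p_cs
    return 1.0 - q

cut_sets = [("pumpA", "pumpB"), ("valve",), ("pumpA", "power")]
p = {"pumpA": 1e-2, "pumpB": 2e-2, "valve": 1e-4, "power": 5e-3}

base = top_event_prob(cut_sets, p)
print(f"P(top) = {base:.3e}")

# Risk-increase sensitivity: re-evaluate with each event failed for certain.
for e in p:
    worst = top_event_prob(cut_sets, {**p, e: 1.0})
    print(f"{e:6s} risk achievement worth = {worst / base:.1f}")
```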

  5. A hybrid filtering method based on a novel empirical mode decomposition for friction signals

    International Nuclear Information System (INIS)

    Li, Chengwei; Zhan, Liwei

    2015-01-01

    During a measurement, the measured signal usually contains noise. To remove the noise and preserve the important features of the signal, we introduce a hybrid filtering method that uses a new intrinsic mode function (NIMF) and a modified Hausdorff distance. The NIMF is defined as the difference between the noisy signal and each intrinsic mode function (IMF), which is obtained by empirical mode decomposition (EMD), ensemble EMD, complementary ensemble EMD, or complete ensemble EMD with adaptive noise (CEEMDAN). The selection of relevant modes is based on the similarity between the first NIMF and the rest of the NIMFs. With this filtering method, EMD and its improved versions are used to filter the simulated and friction signals. The friction signal between an airplane tire and the runway, recorded during a simulated airplane touchdown, features spikes of various amplitudes and noise. The filtering effectiveness of the four hybrid filtering methods is compared and discussed. The results show that the filtering method based on CEEMDAN outperforms the other signal filtering methods. (paper)
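
    The mode-selection step of this hybrid filter is straightforward to prototype. The sketch below assumes the PyEMD package (distributed on PyPI as EMD-signal) and follows the abstract's recipe loosely: each NIMF is the noisy signal minus one IMF, the similarity of each NIMF to the first one is measured (a plain RMS distance stands in for the paper's modified Hausdorff distance), and the signal is rebuilt from the modes judged relevant. The similarity threshold and keep/discard rule are illustrative assumptions.

```python
import numpy as np
from PyEMD import EMD  # pip install EMD-signal

def hybrid_filter(x, sim_threshold=0.5):
    """Sketch of NIMF-based mode selection (RMS distance replaces the
    paper's modified Hausdorff distance)."""
    imfs = EMD().emd(x)                    # rows: IMF_1 (finest) ... residual
    nimfs = x[None, :] - imfs              # NIMF_k = noisy signal - IMF_k
    ref = nimfs[0]                         # first NIMF: signal minus noisiest mode
    dists = np.sqrt(np.mean((nimfs - ref) ** 2, axis=1))
    sims = 1.0 / (1.0 + dists / (np.std(x) + 1e-12))   # 1 = identical to ref
    keep = sims < sim_threshold            # dissimilar NIMF => IMF carries signal
    return imfs[keep].sum(axis=0)

t = np.linspace(0, 1, 2000)
clean = np.sign(np.sin(2 * np.pi * 3 * t))          # spiky "friction-like" signal
noisy = clean + 0.3 * np.random.randn(t.size)
print(np.std(hybrid_filter(noisy) - clean))
```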

  6. Multiple timescale analysis and factor analysis of energy ecological footprint growth in China 1953-2006

    International Nuclear Information System (INIS)

    Chen Chengzhong; Lin Zhenshan

    2008-01-01

    Scientific analysis of energy consumption and its influencing factors is of great importance for energy strategy and policy planning. The energy consumption in China 1953-2006 is estimated by applying the energy ecological footprint (EEF) method, and the fluctuation periods of the annual growth rate of China's per capita EEF (EEF_cpc) are analyzed with the empirical mode decomposition (EMD) method in this paper. EEF intensity is analyzed to depict energy efficiency in China. The main timescales of the 37 factors that affect the annual growth rate of EEF_cpc are also discussed based on EMD and factor analysis methods. Results show three obvious undulation cycles of the annual growth rate of EEF_cpc, i.e., 4.6, 14.4 and 34.2 years, over the last 53 years. The analysis findings from the common synthesized factors of the IMF1, IMF2 and IMF3 timescales of the 37 factors suggest that China's energy policy-makers should attach more importance to stabilizing economic growth, optimizing industrial structure, regulating domestic petroleum exploitation and improving transportation efficiency.

  7. Combination of canonical correlation analysis and empirical mode decomposition applied to denoising the labor electrohysterogram.

    Science.gov (United States)

    Hassan, Mahmoud; Boudaoud, Sofiane; Terrien, Jérémy; Karlsson, Brynjar; Marque, Catherine

    2011-09-01

    The electrohysterogram (EHG) is often corrupted by electronic and electromagnetic noise as well as movement artifacts, skeletal electromyogram, and ECGs from both mother and fetus. The interfering signals are sporadic and/or have spectra overlapping the spectra of the signals of interest, rendering classical filtering ineffective. In the absence of efficient methods for denoising the monopolar EHG signal, bipolar methods are usually used. In this paper, we propose a novel combination of blind source separation using canonical correlation analysis (BSS_CCA) and empirical mode decomposition (EMD) methods to denoise monopolar EHG. We first extract the uterine bursts by using BSS_CCA, then the biggest part of any residual noise is removed from the bursts by EMD. Our algorithm, called CCA_EMD, was compared with wavelet filtering and independent component analysis. We also compared CCA_EMD with the corresponding bipolar signals to demonstrate that the method does not degrade the signals it produces. The proposed method successfully removed artifacts from the signal without altering the underlying uterine activity as observed by bipolar methods. The CCA_EMD algorithm performed considerably better than the comparison methods.

  8. Safety based on organisational learning (SOL) - Conceptual approach and verification of a method for event analysis

    International Nuclear Information System (INIS)

    Miller, R.; Wilpert, B.; Fahlbruch, B.

    1999-01-01

    This paper discusses a method for analysing safety-relevant events in NPPs which is known as 'SOL', safety based on organisational learning. After a discussion of the specific organisational and psychological problems examined in the event analysis, the analytic process using the SOL approach is explained, as well as the required general setting. The SOL approach has been tested both in scientific experiments and from the practical perspective, by operators of NPPs and experts from other branches of industry. (orig./CB) [de

  9. Negated bio-events: analysis and identification

    Science.gov (United States)

    2013-01-01

    Background Negation occurs frequently in scientific literature, especially in biomedical literature. It has previously been reported that around 13% of sentences found in biomedical research articles contain negation. Historically, the main motivation for identifying negated events has been to ensure their exclusion from lists of extracted interactions. However, recently, there has been a growing interest in negative results, which has resulted in negation detection being identified as a key challenge in biomedical relation extraction. In this article, we focus on the problem of identifying negated bio-events, given gold standard event annotations. Results We have conducted a detailed analysis of three open access bio-event corpora containing negation information (i.e., GENIA Event, BioInfer and BioNLP’09 ST), and have identified the main types of negated bio-events. We have analysed the key aspects of a machine learning solution to the problem of detecting negated events, including selection of negation cues, feature engineering and the choice of learning algorithm. Combining the best solutions for each aspect of the problem, we propose a novel framework for the identification of negated bio-events. We have evaluated our system on each of the three open access corpora mentioned above. The performance of the system significantly surpasses the best results previously reported on the BioNLP’09 ST corpus, and achieves even better results on the GENIA Event and BioInfer corpora, both of which contain more varied and complex events. Conclusions Recently, in the field of biomedical text mining, the development and enhancement of event-based systems has received significant interest. The ability to identify negated events is a key performance element for these systems. We have conducted the first detailed study on the analysis and identification of negated bio-events. Our proposed framework can be integrated with state-of-the-art event extraction systems. The

  10. EMD self-adaptive selecting relevant modes algorithm for FBG spectrum signal

    Science.gov (United States)

    Chen, Yong; Wu, Chun-ting; Liu, Huan-lin

    2017-07-01

    Noise may reduce the demodulation accuracy of the fiber Bragg grating (FBG) sensing signal and thus affect the quality of sensing detection. The recovery of a signal from observed noisy data is therefore necessary. In this paper, a precise self-adaptive algorithm for selecting relevant modes is proposed to remove the noise from the signal. Empirical mode decomposition (EMD) is first used to decompose the signal into a set of modes. Pseudo-mode cancellation is introduced to identify and eliminate false modes, and then the mutual information (MI) of partial modes is calculated. MI is used to estimate the critical point between the high and low frequency components. Simulation results show that the proposed algorithm estimates the critical point more accurately than the traditional algorithms for the FBG spectral signal. Compared with similar algorithms, the signal-to-noise ratio can be improved by more than 10 dB after processing by the proposed algorithm and the correlation coefficient can be increased by 0.5, demonstrating a better de-noising effect.

  11. LOSP-initiated event tree analysis for BWR

    International Nuclear Information System (INIS)

    Watanabe, Norio; Kondo, Masaaki; Uno, Kiyotaka; Chigusa, Takeshi; Harami, Taikan

    1989-03-01

    As a preliminary study of the 'Japanese Model Plant PSA', a LOSP (loss of off-site power)-initiated event tree analysis for a typical Japanese BWR was carried out solely on the basis of open documents such as the 'Safety Analysis Report'. The objectives of this analysis are as follows; - to delineate core-melt accident sequences initiated by LOSP, - to evaluate the importance of core-melt accident sequences in terms of occurrence frequency, and - to develop a foundation of plant information and analytical procedures for efficiently performing the further 'Japanese Model Plant PSA'. This report describes the procedure and results of the LOSP-initiated event tree analysis. In this analysis, two types of event trees, a Functional Event Tree and a Systemic Event Tree, were developed to delineate core-melt accident sequences and to quantify their frequencies. A Front-line System Event Tree was prepared as well to provide core-melt sequence delineation for the accident progression analysis of the Level 2 PSA that will follow in the future. Applying U.S. operational experience data such as component failure rates and a LOSP frequency, we obtained the following results; - The total frequency of core-melt accident sequences initiated by LOSP is estimated at 5 x 10^-4 per reactor-year. - The dominant sequences are 'Loss of Decay Heat Removal' and 'Loss of Emergency Electric Power Supply', which account for more than 90% of the total core-melt frequency. In this analysis, a higher value of 0.13/R·Y was used for the LOSP frequency than experienced in Japan, and no recovery actions were considered. In fact, there has been no LOSP event in Japanese nuclear power plants so far, and it is also expected that offsite power and/or the PCS would be recovered before core melt. Considering Japanese operating experience and recovery factors would reduce the total core-melt frequency to less than 10^-6 per reactor-year. (J.P.N.)
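
    The arithmetic behind such an event tree quantification reduces to multiplying the initiating event frequency by the branch probabilities along each sequence and summing the core-melt sequences. The sketch below uses made-up branch probabilities; only the 0.13/R·Y initiating frequency is taken from the abstract.

```python
# Toy LOSP event tree: initiating frequency times branch probabilities.
# EP = emergency power, DHR = decay heat removal. Branch values illustrative.

LOSP_FREQ = 0.13          # initiating event frequency per reactor-year
P_EP_FAIL = 3e-3          # emergency electric power supply fails
P_DHR_FAIL = 4e-3         # decay heat removal fails given power available

sequences = {
    "EP success / DHR success": (1 - P_EP_FAIL) * (1 - P_DHR_FAIL),  # OK
    "loss of decay heat removal": (1 - P_EP_FAIL) * P_DHR_FAIL,      # core melt
    "loss of emergency power": P_EP_FAIL,                            # core melt
}

core_melt = ("loss of decay heat removal", "loss of emergency power")
total = sum(LOSP_FREQ * p for name, p in sequences.items() if name in core_melt)
for name, p in sequences.items():
    print(f"{name:30s} {LOSP_FREQ * p:.2e} /ry")
print(f"total core-melt frequency      {total:.2e} /ry")
```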

  12. Power Load Event Detection and Classification Based on Edge Symbol Analysis and Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Lei Jiang

    2012-01-01

    Full Text Available Energy signature analysis of power appliances is the core of nonintrusive load monitoring (NILM), where detailed data on the appliances used in houses are obtained by analyzing changes in the voltage and current. This paper focuses on developing automatic power load event detection and appliance classification based on machine learning. For power load event detection, the paper presents a new transient detection algorithm. By analyzing turn-on and turn-off transient waveforms, it can accurately detect the edge point when a device is switched on or off. The proposed load classification technique can identify different power appliances with improved recognition accuracy and computational speed. The load classification method is composed of two processes: frequency feature analysis and a support vector machine. The experimental results indicate that incorporating the new edge detection and the turn-on and turn-off transient signature analysis into NILM reveals more information than traditional NILM methods. The load classification method achieved a recognition rate of more than ninety percent.
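
    A compressed version of this pipeline can be written with scikit-learn. The sketch below is a schematic stand-in for the paper's method: switching edges are detected as large jumps in the aggregate power signal, each event is described by a few hypothetical transient-shape features rather than the paper's frequency features, and an SVM does the classification on synthetic data.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def detect_events(power, jump=50.0):
    """Indices where consecutive power samples differ by more than `jump` watts."""
    return np.where(np.abs(np.diff(power)) > jump)[0]

def transient_features(power, idx, w=20):
    """Simple transient descriptors around an edge: step size, swing, spread."""
    seg = power[max(idx - w, 0): idx + w]
    return [power[idx + 1] - power[idx], seg.max() - seg.min(), seg.std()]

rng = np.random.default_rng(0)
# synthetic training set: two appliance classes with different turn-on steps
X, y = [], []
for label, step in [(0, 100.0), (1, 300.0)]:
    for _ in range(50):
        p = rng.normal(0, 2, 200)
        p[100:] += step + rng.normal(0, 10)      # turn-on edge at sample 100
        for idx in detect_events(p):
            X.append(transient_features(p, idx))
            y.append(label)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, y)
print("training accuracy:", clf.score(X, y))
```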

  13. A fuzzy-based reliability approach to evaluate basic events of fault tree analysis for nuclear power plant probabilistic safety assessment

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry

    2014-01-01

    Highlights: • We propose a fuzzy-based reliability approach to evaluate basic event reliabilities. • It implements the concepts of failure possibilities and fuzzy sets. • Experts evaluate basic event failure possibilities using qualitative words. • Triangular fuzzy numbers mathematically represent qualitative failure possibilities. • It is a very good alternative to the conventional reliability approach. - Abstract: Fault tree analysis has been widely utilized as a tool for nuclear power plant probabilistic safety assessment. This analysis can be completed only if all basic events of the system fault tree have their quantitative failure rates or failure probabilities. However, it is difficult to obtain those failure data due to insufficient data, changing environments or new components. This study proposes a fuzzy-based reliability approach to evaluate basic events of system fault trees whose precise failure probability distributions of their lifetimes to failure are not available. It applies the concept of failure possibilities to qualitatively evaluate basic events and the concept of fuzzy sets to quantitatively represent the corresponding failure possibilities. To demonstrate the feasibility and the effectiveness of the proposed approach, the actual basic event failure probabilities collected from the operational experience of the Davis-Besse design of the Babcock and Wilcox reactor protection system fault tree are used to benchmark the failure probabilities generated by the proposed approach. The results confirm that the proposed fuzzy-based reliability approach is a suitable alternative to the conventional probabilistic reliability approach when basic events do not have the corresponding quantitative historical failure data for determining their reliability characteristics. Hence, it overcomes the limitation of conventional fault tree analysis for nuclear power plant probabilistic safety assessment
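
    The mechanics of such a fuzzy-based evaluation can be sketched compactly. In the toy code below, qualitative failure possibilities are mapped to triangular fuzzy numbers on [0, 1], defuzzified by centroid, and converted to failure probabilities with Onisawa's formula, a common choice in the fuzzy reliability literature; the word-to-number scale is an illustrative assumption, not the paper's calibrated membership functions.

```python
# Qualitative failure possibilities as triangular fuzzy numbers (a, m, b)
# on the possibility scale [0, 1]; the mapping values are illustrative.
FUZZY_SCALE = {
    "very low":  (0.0, 0.1, 0.2),
    "low":       (0.1, 0.25, 0.4),
    "medium":    (0.35, 0.5, 0.65),
    "high":      (0.6, 0.75, 0.9),
    "very high": (0.8, 0.9, 1.0),
}

def centroid(tfn):
    """Centroid defuzzification of a triangular fuzzy number."""
    a, m, b = tfn
    return (a + m + b) / 3.0

def failure_probability(possibility):
    """Onisawa-style possibility-to-probability conversion."""
    if possibility <= 0:
        return 0.0
    k = ((1.0 - possibility) / possibility) ** (1.0 / 3.0) * 2.301
    return 1.0 / (10.0 ** k)

for word, tfn in FUZZY_SCALE.items():
    fps = centroid(tfn)
    print(f"{word:9s} possibility={fps:.3f} probability={failure_probability(fps):.2e}")
```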

  14. Tissue artifact removal from respiratory signals based on empirical mode decomposition.

    Science.gov (United States)

    Liu, Shaopeng; Gao, Robert X; John, Dinesh; Staudenmayer, John; Freedson, Patty

    2013-05-01

    On-line measurement of respiration plays an important role in monitoring human physical activities. Such measurement commonly employs sensing belts secured around the rib cage and abdomen of the test subject. Affected by the movement of body tissues, respiratory signals typically have a low signal-to-noise ratio. Removing tissue artifacts is therefore critical to ensuring effective respiration analysis. This paper presents a signal decomposition technique for tissue artifact removal from respiratory signals, based on the empirical mode decomposition (EMD). An algorithm based on mutual information and power criteria was devised to automatically select appropriate intrinsic mode functions for tissue artifact removal and respiratory signal reconstruction. Performance of the EMD algorithm was evaluated through simulations and real-life experiments (N = 105). Comparison with the low-pass filtering that has conventionally been applied confirmed the effectiveness of the technique in tissue artifact removal.

  15. Denoising of chaotic signal using independent component analysis and empirical mode decomposition with circulate translating

    International Nuclear Information System (INIS)

    Wang Wen-Bo; Zhang Xiao-Dong; Chang Yuchan; Wang Xiang-Li; Wang Zhao; Chen Xi; Zheng Lei

    2016-01-01

    In this paper, a new method to reduce noise within chaotic signals based on ICA (independent component analysis) and EMD (empirical mode decomposition) is proposed. The basic idea is, firstly, to decompose the chaotic signals and construct multidimensional input vectors on the basis of EMD and its translation invariance. Secondly, independent component analysis is performed on the input vectors, which means that a self-adaptive denoising is carried out for the intrinsic mode functions (IMFs) of the chaotic signals. Finally, all IMFs compose the new denoised chaotic signal. Experiments were carried out on the Lorenz chaotic signal contaminated with different Gaussian noises and on the monthly observed chaotic sunspot sequence. The results proved that the method proposed in this paper is effective in denoising chaotic signals. Moreover, it can correct the center point in the phase space effectively, which makes it approach the real track of the chaotic attractor. (paper)

  16. Attack Graph Construction for Security Events Analysis

    Directory of Open Access Journals (Sweden)

    Andrey Alexeevich Chechulin

    2014-09-01

    Full Text Available The paper is devoted to investigation of the attack graph construction and analysis task for network security evaluation and real-time security event processing. The main object of this research is the attack modeling process. The paper contains a description of the attack graph building, modification and analysis technique, as well as an overview of an implemented prototype for network security analysis based on the attack graph approach.

  17. g-PRIME: A Free, Windows Based Data Acquisition and Event Analysis Software Package for Physiology in Classrooms and Research Labs.

    Science.gov (United States)

    Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R

    2009-01-01

    We present g-PRIME, a software-based tool for physiology data acquisition, analysis, and stimulus generation in education and research. This software was developed in an undergraduate neurophysiology course and strongly influenced by instructor and student feedback. g-PRIME is a free, stand-alone Windows application coded and "compiled" in Matlab (it does not require a Matlab license). g-PRIME supports many data acquisition interfaces, from the PC sound card to expensive high-throughput calibrated equipment. The program is designed as a software oscilloscope with standard trigger modes, multi-channel visualization controls, and data logging features. Extensive analysis options allow real-time and offline filtering of signals, multi-parameter threshold-and-window based event detection, and two-dimensional display of a variety of parameters including event time, energy density, maximum FFT frequency component, max/min amplitudes, and inter-event rates and intervals. The software also correlates detected events with another simultaneously acquired source (event-triggered average) in real time or offline. g-PRIME supports parameter histogram production and a variety of elegant publication-quality graphics outputs. A major goal of this software is to merge powerful engineering acquisition and analysis tools with a biological approach to studies of nervous system function.
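
    Threshold-and-window event detection of the kind listed above is a standard technique and easy to reproduce. The NumPy sketch below (not g-PRIME code) marks an event when the signal crosses an amplitude window, merges crossings closer than a refractory gap, and reports per-event peaks and inter-event intervals; all parameters are illustrative.

```python
import numpy as np

def detect_spikes(x, fs, thresh, max_amp, min_gap_s=0.002):
    """Threshold-and-window detection: crossings above `thresh` but below
    `max_amp` count as events; onsets closer than `min_gap_s` are merged."""
    above = (x > thresh) & (x < max_amp)
    onsets = np.where(above & ~np.roll(above, 1))[0]
    min_gap = int(min_gap_s * fs)
    events = [onsets[0]] if onsets.size else []
    for i in onsets[1:]:
        if i - events[-1] >= min_gap:
            events.append(i)
    events = np.asarray(events)
    peaks = [x[i:i + min_gap].max() for i in events]
    return events, peaks, np.diff(events) / fs   # onset samples, peaks, intervals

fs = 10_000.0
t = np.arange(0, 1, 1 / fs)
x = 0.05 * np.random.randn(t.size)
x[::1000] += 1.0                                  # ten artificial "spikes"
idx, peaks, isis = detect_spikes(x, fs, thresh=0.5, max_amp=2.0)
print(len(idx), "events; mean interval", isis.mean() if isis.size else None)
```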

  18. ANALYSIS OF EVENT TOURISM IN RUSSIA, ITS FUNCTIONS, WAYS TO IMPROVE THE EFFICIENCY OF EVENT

    Directory of Open Access Journals (Sweden)

    Mikhail Yur'evich Grushin

    2016-01-01

    Full Text Available This article considers one of the important directions of development of the national economy in the area of tourist services: the development of event tourism in the Russian Federation. Today the event management market in Russia is still taking shape; its impact on the socio-economic development of the regions and of Russia as a whole is therefore minimal, and analysis of this influence is not performed. This problem comes to the fore in the regions of Russia specializing in the creation of event-oriented tourist-recreational clusters. The article provides an analysis of the existing event management market and of the functions of event tourism, offers ways to improve the efficiency of event management, and gives recommendations for event organizers in the regions. The article shows the specific role of event tourism in national tourism and provides directions for the development of organizational and methodical recommendations on its formation in the regions of Russia and on the creation of an effective management system at the regional level. The purpose of this article is to analyze the emerging event tourism market in Russia and its specifics. On the basis of these studies, the formation patterns of the new market are considered and its impact on the modern national tourism industry is assessed. Methodology. Comparative and economic-statistical analysis methods are used in this article. Conclusions/significance. The practical importance of this article lies in resolving the contradictions existing in the national tourism industry: on the one hand, a large number of events, including world-class ones, are held annually in all regions of the Russian Federation and people speak of tourist trips to events, but event tourism as such does not yet exist. In all regions there is domestic and inbound tourism, but it has little to do with event tourism. The article's practical conclusions demonstrate the need to adapt the

  19. Human based roots of failures in nuclear events investigations

    Energy Technology Data Exchange (ETDEWEB)

    Ziedelis, Stanislovas; Noel, Marc; Strucic, Miodrag [Commission of the European Communities, Petten (Netherlands). European Clearinghouse on Operational Experience Feedback for Nuclear Power Plants

    2012-10-15

    This paper aims to improve the quality of event investigations in the nuclear industry through analysis of existing practices and by identifying and removing existing Human and Organizational Factors (HOF) and management related barriers. It presents the essential results of several studies performed by the European Clearinghouse on Operational Experience. The outcomes of the studies are based on a survey of the event investigation practices currently typical for the nuclear industry of 12 European countries, as well as on insights from the analysis of numerous event investigation reports. The system of operational experience feedback based on event investigation results is not effective enough to prevent, or even to decrease the frequency of, recurring events, due to existing methodological, HOF-related and/or knowledge management related constraints. Besides that, several latent root causes of unsuccessful event investigation are related to weaknesses in the safety culture of personnel and managers. These weaknesses include focus on costs or schedule, political manipulation, arrogance, ignorance, entitlement and/or autocracy. Upgrades in the safety culture of an organization's personnel, and especially of its senior management, seem to be an effective way to improvement. Increasing the competencies, capabilities and level of independence of event investigation teams, elaborating comprehensive software, and ensuring a positive approach, adequate support and impartiality of management could also facilitate improvement of the quality of event investigations. (orig.)

  20. Human based roots of failures in nuclear events investigations

    International Nuclear Information System (INIS)

    Ziedelis, Stanislovas; Noel, Marc; Strucic, Miodrag

    2012-01-01

    This paper aims to improve the quality of event investigations in the nuclear industry through analysis of existing practices and by identifying and removing existing Human and Organizational Factors (HOF) and management related barriers. It presents the essential results of several studies performed by the European Clearinghouse on Operational Experience. The outcomes of the studies are based on a survey of the event investigation practices currently typical for the nuclear industry of 12 European countries, as well as on insights from the analysis of numerous event investigation reports. The system of operational experience feedback based on event investigation results is not effective enough to prevent, or even to decrease the frequency of, recurring events, due to existing methodological, HOF-related and/or knowledge management related constraints. Besides that, several latent root causes of unsuccessful event investigation are related to weaknesses in the safety culture of personnel and managers. These weaknesses include focus on costs or schedule, political manipulation, arrogance, ignorance, entitlement and/or autocracy. Upgrades in the safety culture of an organization's personnel, and especially of its senior management, seem to be an effective way to improvement. Increasing the competencies, capabilities and level of independence of event investigation teams, elaborating comprehensive software, and ensuring a positive approach, adequate support and impartiality of management could also facilitate improvement of the quality of event investigations. (orig.)

  1. Implementing a nationwide criteria-based emergency medical dispatch system

    DEFF Research Database (Denmark)

    Andersen, Mikkel S; Johnsen, Søren Paaske; Sørensen, Jan Nørtved

    2013-01-01

    A criteria-based nationwide Emergency Medical Dispatch (EMD) system was recently implemented in Denmark. We described the system and studied its ability to triage patients according to the severity of their condition by analysing hospital admission and case-fatality risks.

  2. Analysis of event-mode data with Interactive Data Language

    International Nuclear Information System (INIS)

    De Young, P.A.; Hilldore, B.B.; Kiessel, L.M.; Peaslee, G.F.

    2003-01-01

    We have developed an analysis package for event-mode data based on Interactive Data Language (IDL) from Research Systems Inc. This high-level language is high-speed, array-oriented, object-oriented, and has extensive visual (multi-dimensional plotting) and mathematical functions. We have developed a general framework, written in IDL, for the analysis of a variety of experimental data that does not require significant customization for each analysis. Unlike many traditional analysis packages, spectra and gates are applied after the data are read and are easily changed as the analysis proceeds without rereading the data. The events are not sequentially processed into predetermined arrays subject to predetermined gates

  3. Using variable transformations to perform common event analysis

    International Nuclear Information System (INIS)

    Worrell, R.B.

    1977-01-01

    Any analytical method for studying the effect of common events on the behavior of a system is considered a form of common event analysis. The particular common events that are involved often represent quite different phenomena, and this has led to the development of different kinds of common event analysis. For example, common mode failure analysis, common cause analysis, critical location analysis, etc., are all different kinds of common event analysis in which the common events involved represent different phenomena. However, the problem that must be solved for each of these different kinds of common event analysis is essentially the same: determine the effect of common events on the behavior of a system. Thus, a technique that is useful in achieving one kind of common event analysis is often useful in achieving other kinds as well

  4. Gyroscope-driven mouse pointer with an EMOTIV® EEG headset and data analysis based on Empirical Mode Decomposition.

    Science.gov (United States)

    Rosas-Cholula, Gerardo; Ramirez-Cortes, Juan Manuel; Alarcon-Aquino, Vicente; Gomez-Gil, Pilar; Rangel-Magdaleno, Jose de Jesus; Reyes-Garcia, Carlos

    2013-08-14

    This paper presents a project on the development of a cursor control emulating the typical operations of a computer mouse, using gyroscope and eye-blinking electromyographic signals which are obtained through a commercial 16-electrode wireless headset, recently released by Emotiv. The cursor position is controlled using information from a gyroscope included in the headset. The clicks are generated through the user's blinking with an adequate detection procedure based on the spectral-like technique called Empirical Mode Decomposition (EMD). EMD is proposed as a simple and quick computational tool, yet effective, aimed at artifact reduction from head movements as well as a method to detect blinking signals for mouse control. A Kalman filter is used as a state estimator for mouse position control and jitter removal. The average detection rate obtained was 94.9%. The experimental setup and some obtained results are presented.

  5. The effect of automatic blink correction on auditory evoked potentials.

    Science.gov (United States)

    Korpela, Jussi; Vigário, Ricardo; Huotilainen, Minna

    2012-01-01

    The effects of blink correction on auditory event-related potential (ERP) waveforms are assessed. Two blink correction strategies are compared. ICA-SSP combines independent component analysis (ICA) with signal space projection (SSP), and ICA-EMD uses empirical mode decomposition (EMD) to improve the performance of the standard ICA method. Five voluntary subjects performed an auditory oddball task. The resulting ERPs are used to compare the two blink correction methods to each other and against blink rejection. The results suggest that both methods qualitatively preserve the ERP waveform but that they underestimate some of the peak amplitudes. ICA-EMD performs slightly better than ICA-SSP. In conclusion, the use of blink correction is justified, especially if blink rejection leads to severe data loss.

  6. Design and Analysis of Self-Healing Tree-Based Hybrid Spectral Amplitude Coding OCDMA System

    Directory of Open Access Journals (Sweden)

    Waqas A. Imtiaz

    2017-01-01

    Full Text Available This paper presents an efficient tree-based hybrid spectral amplitude coding optical code division multiple access (SAC-OCDMA) system that is able to provide high-capacity transmission along with fault detection and restoration throughout the passive optical network (PON). An enhanced multidiagonal (EMD) code is adapted to elevate the system's performance, negating multiple access interference and the associated phase-induced intensity noise through an efficient two-matrix structure. Moreover, system connection availability is enhanced through an efficient protection architecture with tree and star-ring topologies at the feeder and distribution levels, respectively. The proposed hybrid architecture aims to provide seamless transmission of information at minimum cost. A mathematical model based on the Gaussian approximation is developed to analyze the performance of the proposed setup, followed by simulation analysis for validation. It is observed that the proposed system supports 64 subscribers operating at data rates of 2.5 Gbps and above. Moreover, survivability and cost analysis in comparison with existing schemes show that the proposed tree-based hybrid SAC-OCDMA system provides the required redundancy at minimum cost of infrastructure and operation.

  7. Bio-inspired motion detection in an FPGA-based smart camera module

    International Nuclear Information System (INIS)

    Koehler, T; Roechter, F; Moeller, R; Lindemann, J P

    2009-01-01

    Flying insects, despite their relatively coarse vision and tiny nervous system, are capable of carrying out elegant and fast aerial manoeuvres. Studies of the fly visual system have shown that this is accomplished by the integration of signals from a large number of elementary motion detectors (EMDs) in just a few global flow detector cells. We developed an FPGA-based smart camera module with more than 10 000 single EMDs, which is closely modelled after insect motion-detection circuits with respect to overall architecture, resolution and inter-receptor spacing. Input to the EMD array is provided by a CMOS camera with a high frame rate. Designed as an adaptable solution for different engineering applications and as a testbed for biological models, the EMD detector type and parameters such as the EMD time constants, the motion-detection directions and the angle between correlated receptors are reconfigurable online. This allows flexible and simultaneous detection of complex motion fields such as translation, rotation and looming, such that various tasks, e.g., obstacle avoidance, height/distance control or speed regulation, can be performed by the same compact device

  8. A novel energy conversion based method for velocity correction in molecular dynamics simulations

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Hanhui [School of Aeronautics and Astronautics, Zhejiang University, Hangzhou 310027 (China); Collaborative Innovation Center of Advanced Aero-Engine, Hangzhou 310027 (China); Liu, Ningning [School of Aeronautics and Astronautics, Zhejiang University, Hangzhou 310027 (China); Ku, Xiaoke, E-mail: xiaokeku@zju.edu.cn [School of Aeronautics and Astronautics, Zhejiang University, Hangzhou 310027 (China); Fan, Jianren [State Key Laboratory of Clean Energy Utilization, Zhejiang University, Hangzhou 310027 (China)

    2017-05-01

    Molecular dynamics (MD) simulation has become an important tool for studying micro- or nano-scale dynamics and the statistical properties of fluids and solids. In MD simulations, there are mainly two approaches: equilibrium and non-equilibrium molecular dynamics (EMD and NEMD). In this paper, a new energy conversion based correction (ECBC) method for MD is developed. Unlike the traditional systematic correction based on macroscopic parameters, the ECBC method is developed strictly on the basis of the physical interaction processes between pairs of molecules or atoms. The developed ECBC method can be applied to EMD and NEMD directly. When MD is used with this method, the difference between EMD and NEMD is eliminated, and no macroscopic parameters such as externally imposed potentials or coefficients are needed. With this method, many limits on the use of MD are lifted. The application scope of MD is greatly extended.

  9. A novel energy conversion based method for velocity correction in molecular dynamics simulations

    International Nuclear Information System (INIS)

    Jin, Hanhui; Liu, Ningning; Ku, Xiaoke; Fan, Jianren

    2017-01-01

    Molecular dynamics (MD) simulation has become an important tool for studying micro- or nano-scale dynamics and the statistical properties of fluids and solids. In MD simulations, there are mainly two approaches: equilibrium and non-equilibrium molecular dynamics (EMD and NEMD). In this paper, a new energy conversion based correction (ECBC) method for MD is developed. Unlike the traditional systematic correction based on macroscopic parameters, the ECBC method is developed strictly on the basis of the physical interaction processes between pairs of molecules or atoms. The developed ECBC method can be applied to EMD and NEMD directly. When MD is used with this method, the difference between EMD and NEMD is eliminated, and no macroscopic parameters such as externally imposed potentials or coefficients are needed. With this method, many limits on the use of MD are lifted. The application scope of MD is greatly extended.

  10. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2009-01-01

    The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for creation of dynamic process models and static...... information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and it is used to illustrate how events can...... be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used...

  11. α-Cut method based importance measure for criticality analysis in fuzzy probability – based fault tree analysis

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry; Sony Tjahyani, D.T.; Widodo, Surip; Tjahjono, Hendro

    2017-01-01

    Highlights: • FPFTA deals with epistemic uncertainty using fuzzy probability. • Criticality analysis is important for reliability improvement. • An α-cut method based importance measure is proposed for criticality analysis in FPFTA. • The α-cut method based importance measure utilises α-cut multiplication, α-cut subtraction, and the area defuzzification technique. • Benchmarking confirms that the proposed method is feasible for criticality analysis in FPFTA. -- Abstract: Fuzzy probability based fault tree analysis (FPFTA) has recently been developed and proposed to deal with the limitations of conventional fault tree analysis. In FPFTA, the reliabilities of basic events, intermediate events and the top event are characterized by fuzzy probabilities. Furthermore, the quantification of the FPFTA is based on the fuzzy multiplication rule and the fuzzy complementation rule to propagate uncertainties from the basic events to the top event. Since the objective of fault tree analysis is to improve the reliability of the system being evaluated, it is necessary to find the weakest path in the system. For this purpose, criticality analysis can be implemented. Various importance measures, which are based on conventional probabilities, have been developed and proposed for criticality analysis in fault tree analysis. However, none of those importance measures can be applied for criticality analysis in FPFTA, which is based on fuzzy probability. To be fully applicable in nuclear power plant probabilistic safety assessment, FPFTA needs to have its corresponding importance measure. The objective of this study is to develop an α-cut method based importance measure to evaluate and rank the importance of basic events for criticality analysis in FPFTA. To demonstrate the applicability of the proposed measure, a case study is performed and its results are then benchmarked against the results generated by four well-known importance measures in conventional fault tree analysis. The results
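
    The α-cut operations this measure builds on reduce to interval arithmetic at each membership level. The sketch below is an illustration of those building blocks (α-cuts of triangular fuzzy numbers, interval multiplication, and an area-style defuzzification), not the paper's exact importance measure.

```python
import numpy as np

def alpha_cut(tfn, alpha):
    """Alpha-cut of a triangular fuzzy number (a, m, b) -> interval [lo, hi]."""
    a, m, b = tfn
    return a + alpha * (m - a), b - alpha * (b - m)

def interval_mul(u, v):
    """Interval multiplication: all endpoint products bound the result."""
    prods = [u[0] * v[0], u[0] * v[1], u[1] * v[0], u[1] * v[1]]
    return min(prods), max(prods)

def area_defuzzify(cut_fn, n=100):
    """Average the alpha-cut bounds over alpha levels (area-style defuzzification)."""
    alphas = np.linspace(0, 1, n)
    cuts = np.array([cut_fn(a) for a in alphas])
    return cuts.mean()

# two basic events through an AND gate: P(top) = p1 * p2, as fuzzy numbers
p1 = (0.01, 0.02, 0.04)
p2 = (0.001, 0.003, 0.006)

def top(alpha):
    return interval_mul(alpha_cut(p1, alpha), alpha_cut(p2, alpha))

print("defuzzified top-event probability:", area_defuzzify(top))
```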

  12. Gyroscope-Driven Mouse Pointer with an EMOTIV® EEG Headset and Data Analysis Based on Empirical Mode Decomposition

    Directory of Open Access Journals (Sweden)

    Carlos Reyes-Garcia

    2013-08-01

    Full Text Available This paper presents a project on the development of a cursor control emulating the typical operations of a computer mouse, using gyroscope and eye-blinking electromyographic signals which are obtained through a commercial 16-electrode wireless headset, recently released by Emotiv. The cursor position is controlled using information from a gyroscope included in the headset. The clicks are generated through the user's blinking with an adequate detection procedure based on the spectral-like technique called Empirical Mode Decomposition (EMD). EMD is proposed as a simple and quick computational tool, yet effective, aimed at artifact reduction from head movements as well as a method to detect blinking signals for mouse control. A Kalman filter is used as a state estimator for mouse position control and jitter removal. The average detection rate obtained was 94.9%. The experimental setup and some obtained results are presented.

  13. An Event-Based Approach to Distributed Diagnosis of Continuous Systems

    Science.gov (United States)

    Daigle, Matthew; Roychoudhurry, Indranil; Biswas, Gautam; Koutsoukos, Xenofon

    2010-01-01

    Distributed fault diagnosis solutions are becoming necessary due to the complexity of modern engineering systems, and the advent of smart sensors and computing elements. This paper presents a novel event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, based on a qualitative abstraction of measurement deviations from the nominal behavior. We systematically derive dynamic fault signatures expressed as event-based fault models. We develop a distributed diagnoser design algorithm that uses these models for designing local event-based diagnosers based on global diagnosability analysis. The local diagnosers each generate globally correct diagnosis results locally, without a centralized coordinator, and by communicating a minimal number of measurements between themselves. The proposed approach is applied to a multi-tank system, and results demonstrate a marked improvement in scalability compared to a centralized approach.

  14. Identification of fire modeling issues based on an analysis of real events from the OECD FIRE database

    Energy Technology Data Exchange (ETDEWEB)

    Hermann, Dominik [Swiss Federal Nuclear Safety Inspectorate ENSI, Brugg (Switzerland)

    2017-03-15

    Precursor analysis is widely used in the nuclear industry to judge the significance of events relevant to safety. However, in the case of events that may damage equipment through effects that are not ordinary functional dependencies, the analysis may not always fully appreciate the potential for further evolution of the event. For fires, which are one class of such events, this paper discusses modelling challenges that need to be overcome when performing a probabilistic precursor analysis. The events analysed are selected from the Organisation for Economic Cooperation and Development (OECD) Fire Incidents Records Exchange (FIRE) Database.

  15. Human reliability analysis using event trees

    International Nuclear Information System (INIS)

    Heslinga, G.

    1983-01-01

    The shut-down procedure of a technologically complex installation such as a nuclear power plant consists of many human actions, some of which have to be performed several times. The procedure is regarded as a chain of modules of specific actions, some of which are analyzed separately. The analysis is carried out by making a Human Reliability Analysis event tree (HRA event tree) of each action, breaking down each action into small elementary steps. The application of event trees in human reliability analysis involves more difficulties than in the case of technical systems, where event trees have mainly been used until now. The most important reason is that the operator is able to recover from a wrong performance; memory influences play a significant role. In this study these difficulties are dealt with theoretically. The following conclusions can be drawn: (1) in principle, event trees may be used in human reliability analysis; (2) although in practice the operator will partly recover from his fault, theoretically this can be described as starting the whole event tree again; (3) compact formulas have been derived by which the probability of reaching a specific failure consequence on passing through the HRA event tree after several recoveries can be calculated. (orig.)

  16. Pattern recognition based on time-frequency analysis and convolutional neural networks for vibrational events in φ-OTDR

    Science.gov (United States)

    Xu, Chengjin; Guan, Junjun; Bao, Ming; Lu, Jiangang; Ye, Wei

    2018-01-01

    Based on vibration signals detected by a phase-sensitive optical time-domain reflectometer distributed optical fiber sensing system, this paper presents an implementation of time-frequency analysis and a convolutional neural network (CNN), used to classify different types of vibrational events. First, spectral subtraction and the short-time Fourier transform are used to enhance the time-frequency features of the vibration signals and to transform the different types of vibration signals into spectrograms, which are input to the CNN for automatic feature extraction and classification. Finally, by replacing the soft-max layer in the CNN with a multiclass support vector machine, the performance of the classifier is enhanced. Experiments show that after using this method to process 4000 vibration signal samples generated by four different vibration events, namely digging, walking, vehicles passing, and damaging, the recognition rates of the vibration events are over 90%. The experimental results prove that this method can automatically make an effective feature selection and greatly improve the classification accuracy of vibrational events in distributed optical fiber sensing systems.
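
    The preprocessing stage described here, spectral subtraction followed by a short-time Fourier transform, can be reproduced with SciPy; the resulting log-spectrogram is the image a CNN would consume. In the sketch below, the noise estimate is taken from a presumed noise-only leading segment, which is an assumption about the setup rather than the paper's procedure.

```python
import numpy as np
from scipy.signal import stft

fs = 8000.0
t = np.arange(0, 2, 1 / fs)
noise = 0.2 * np.random.randn(t.size)
vibration = np.sin(2 * np.pi * 120 * t) * (t > 1.0)   # event in the second half
x = vibration + noise

f, tt, Z = stft(x, fs=fs, nperseg=256, noverlap=192)
mag = np.abs(Z)

# crude spectral subtraction: estimate the noise floor from the first 0.5 s
noise_cols = tt < 0.5
noise_mag = mag[:, noise_cols].mean(axis=1, keepdims=True)
enhanced = np.clip(mag - noise_mag, 0.0, None)

# log-spectrogram "image" that a CNN classifier would take as input
spec_image = np.log1p(enhanced)
print(spec_image.shape)   # (freq_bins, time_frames)
```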

  17. HiggsToFourLeptonsEV in the ATLAS EventView Analysis Framework

    CERN Document Server

    Lagouri, T; Del Peso, J

    2008-01-01

    ATLAS is one of the four experiments at the Large Hadron Collider (LHC) at CERN. This experiment has been designed to study a large range of physics topics, including searches for previously unobserved phenomena such as the Higgs boson and supersymmetry. The physics analysis package HiggsToFourLeptonsEV for the Standard Model (SM) Higgs to four leptons channel with ATLAS is presented. The physics goal is to investigate, with the ATLAS detector, the SM Higgs boson discovery potential through its observation in the four-lepton (electron and muon) final state. HiggsToFourLeptonsEV is based on the official ATLAS software ATHENA and the EventView (EV) analysis framework. EventView is a highly flexible and modular analysis framework in ATHENA and is one of several analysis schemes for ATLAS physics user analysis. At the core of EventView is the representative "view" of an event, which defines the contents of the event data suitable for event-level physics analysis. The HiggsToFourLeptonsEV package, presented in ...

  18. Trackside acoustic diagnosis of axle box bearing based on kurtosis-optimization wavelet denoising

    Science.gov (United States)

    Peng, Chaoyong; Gao, Xiaorong; Peng, Jianping; Wang, Ai

    2018-04-01

    As the axle box bearing is one of the key components of railway vehicles, its operating condition has a significant effect on traffic safety. Acoustic diagnosis is more suitable than vibration diagnosis for trackside monitoring. The acoustic signal generated by the train axle box bearing is an amplitude-modulation and frequency-modulation signal mixed with complex train running noise. Although empirical mode decomposition (EMD) and some improved time-frequency algorithms have proved useful in bearing vibration signal processing, it is hard to extract the bearing fault signal from the severe trackside acoustic background noise using those algorithms. Therefore, a kurtosis-optimization-based wavelet packet (KWP) denoising algorithm is proposed, as kurtosis is the key indicator of a bearing fault signal in the time domain. Firstly, a geometry-based Doppler correction is applied to the signals of each sensor, and by superposing the signals of multiple sensors, random noise and impulse noise, which interfere with the kurtosis indicator, are suppressed. Then, the KWP denoising is conducted. Finally, EMD and the Hilbert transform are applied to extract the fault feature. Experiment results indicate that the proposed method consisting of KWP and EMD is superior to EMD alone.
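
    A plausible reading of the KWP step, decomposing with a wavelet packet, scoring each terminal node by kurtosis, and keeping only the most impulsive nodes, can be sketched with PyWavelets as follows; the selection rule (keep the top few nodes) is an assumption, not necessarily the paper's exact criterion.

        import numpy as np
        import pywt
        from scipy.stats import kurtosis

        def kwp_denoise(x, wavelet="db8", level=3, keep=2):
            """Keep the `keep` wavelet-packet nodes with highest kurtosis."""
            wp = pywt.WaveletPacket(data=x, wavelet=wavelet, maxlevel=level)
            nodes = wp.get_level(level, order="natural")
            scores = [kurtosis(n.data) for n in nodes]
            best = set(np.argsort(scores)[-keep:])
            out = pywt.WaveletPacket(data=None, wavelet=wavelet, maxlevel=level)
            for i, n in enumerate(nodes):
                if i in best:             # copy only the impulsive nodes
                    out[n.path] = n.data
            return out.reconstruct(update=False)[: len(x)]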

  19. Analysis of Paks NPP Personnel Activity during Safety Related Event Sequences

    International Nuclear Information System (INIS)

    Bareith, A.; Hollo, Elod; Karsa, Z.; Nagy, S.

    1998-01-01

    Within the AGNES Project (Advanced Generic and New Evaluation of Safety), the Level-1 PSA model of Paks NPP Unit 3 was developed in the form of a detailed event tree/fault tree structure (53 initiating events, 580 event sequences and 6300 basic events are involved). This model gives a good basis for quantitative evaluation of the potential consequences of safety-related events that have actually occurred, i.e. for precursor event studies. To make these studies possible and efficient, the current qualitative event analysis practice should be reviewed, and a new additional quantitative analysis procedure and system should be developed and applied. The present paper gives an overview of the method outlined for both qualitative and quantitative analyses of operator crew activity during off-normal situations. First, the operator performance experienced during past operational events is discussed. Sources of raw information, the qualitative evaluation process, the follow-up actions, as well as the documentation requirements are described. Second, the general concept of the proposed precursor event analysis is described. The types of modeled interactions and the considered performance influences are presented. The quantification of the potential consequences of the identified precursor events is based on the task-oriented Level-1 PSA model of the plant unit. A precursor analysis system covering the evaluation of operator activities is now under development. Preliminary results gained during a case-study evaluation of a past historical event are presented. (authors)

  20. IMF-Slices for GPR Data Processing Using Variational Mode Decomposition Method

    Directory of Open Access Journals (Sweden)

    Xuebing Zhang

    2018-03-01

    Full Text Available Using traditional time-frequency analysis methods, it is possible to delineate the time-frequency structures of ground-penetrating radar (GPR) data. A series of applications based on time-frequency analysis have been proposed for GPR data processing and imaging. From a signal processing point of view, however, GPR data are typically non-stationary, which limits the applicability of these methods. Empirical mode decomposition (EMD) provides alternative solutions with a fresh perspective. With EMD, GPR data are decomposed into a set of sub-components, i.e., the intrinsic mode functions (IMFs). However, the mode-mixing effect may also introduce some drawbacks. To retain the benefits of the IMFs while avoiding the drawbacks of EMD, we introduce a new decomposition scheme termed variational mode decomposition (VMD) for GPR data processing and imaging. Based on the decomposition results of the VMD, we propose a new method which we refer to as "the IMF-slice". In the proposed method, the IMFs are generated by the VMD trace by trace, and then each IMF is sorted and recorded into different profiles (i.e., the IMF-slices) according to its center frequency. Using IMF-slices, the GPR data can be divided into several IMF-slices, each of which delineates a main vibration mode, and some subsurface layers and geophysical events can be identified more clearly. The effectiveness of the proposed method is tested using synthetic benchmark signals, laboratory data and a field dataset.
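
    A sketch of the IMF-slice construction is given below, assuming the third-party vmdpy package, whose VMD function returns the modes and their centre-frequency history; the parameter values are illustrative defaults rather than those used in the paper.

        import numpy as np
        from vmdpy import VMD  # third-party VMD implementation (assumed API)

        def imf_slices(section, K=3, alpha=2000, tau=0.0, tol=1e-6):
            """Trace-by-trace VMD of a GPR section (samples x traces);
            modes are sorted by centre frequency into K IMF-slices."""
            n_samples, n_traces = section.shape
            slices = np.zeros((K, n_samples, n_traces))
            for j in range(n_traces):
                u, _, omega = VMD(section[:, j], alpha, tau, K, 0, 1, tol)
                order = np.argsort(omega[-1])   # low to high centre frequency
                u = u[order]
                m = min(u.shape[1], n_samples)  # vmdpy may trim odd lengths
                slices[:, :m, j] = u[:, :m]
            return slices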

  1. Stochastic learning of multi-instance dictionary for earth mover’s distance-based histogram comparison

    KAUST Repository

    Fan, Jihong

    2016-09-17

    Dictionary plays an important role in multi-instance data representation. It maps bags of instances to histograms. Earth mover's distance (EMD) is the most effective histogram distance metric for the application of multi-instance retrieval. However, up to now, there are no existing multi-instance dictionary learning methods designed for EMD-based histogram comparison. To fill this gap, we develop the first EMD-optimal dictionary learning method using a stochastic optimization method. In the stochastic learning framework, we have one triplet of bags, including one basic bag, one positive bag, and one negative bag. These bags are mapped to histograms using a multi-instance dictionary. We argue that the EMD between the basic histogram and the positive histogram should be smaller than that between the basic histogram and the negative histogram. Based on this condition, we design a hinge loss. By minimizing this hinge loss and some regularization terms of the dictionary, we update the dictionary instances. The experiments on multi-instance retrieval applications, namely medical image retrieval and natural language relation classification, show its effectiveness when compared to other dictionary learning methods. © 2016 The Natural Computing Applications Forum
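
    The triplet construction can be illustrated with a minimal sketch that computes the hinge loss for one triplet of already-mapped histograms, using SciPy's one-dimensional Wasserstein distance as a stand-in for the general EMD of the paper; minimizing this loss plus the regularization terms over sampled triplets is what would update the dictionary.

        import numpy as np
        from scipy.stats import wasserstein_distance

        def triplet_hinge(h_basic, h_pos, h_neg, bin_centers, margin=1.0):
            """Hinge loss for one (basic, positive, negative) triplet:
            penalize unless EMD(basic, pos) + margin <= EMD(basic, neg)."""
            d_pos = wasserstein_distance(bin_centers, bin_centers,
                                         h_basic, h_pos)
            d_neg = wasserstein_distance(bin_centers, bin_centers,
                                         h_basic, h_neg)
            return max(0.0, margin + d_pos - d_neg)

        # Toy 4-bin histograms: the positive is closer to the basic one
        bins = np.arange(4.0)
        print(triplet_hinge([4, 3, 2, 1], [3, 3, 2, 2], [1, 2, 3, 4], bins))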

  2. Efficient Information Hiding Based on Theory of Numbers

    Directory of Open Access Journals (Sweden)

    Yanjun Liu

    2018-01-01

    Full Text Available Data hiding is an efficient technique that conceals secret data in a digital medium. In 2006, Zhang and Wang proposed a data hiding scheme called exploiting modification direction (EMD), which has become a milestone in the field of data hiding. In recent years, many EMD-type data hiding schemes have been developed, but their embedding capacity remains restricted. In this paper, a novel data hiding scheme based on the combination of the Chinese remainder theorem (CRT) and a new extraction function is proposed. In the proposed scheme, the cover image is divided into non-overlapping pixel groups for embedding to increase the embedding capacity. Experimental results show that the embedding capacity of the proposed scheme is significantly higher (greater than 2.5 bpp) than that of previously proposed schemes while ensuring very good visual quality of the stego image. In addition, a security analysis is given to show that the proposed scheme can resist visual attack.
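
    For context, the original Zhang-Wang EMD scheme that this work builds on can be written in a few lines: a group of n pixels carries one digit in base 2n+1, and embedding changes at most one pixel by plus or minus one. The sketch below shows that baseline scheme, not the paper's CRT-based generalization.

        def emd_extract(group):
            """Extraction function f(g1..gn) = (sum i*gi) mod (2n+1)."""
            n = len(group)
            return sum((i + 1) * g for i, g in enumerate(group)) % (2 * n + 1)

        def emd_embed(group, digit):
            """Embed one (2n+1)-ary digit, changing at most one pixel by 1."""
            n = len(group)
            m = 2 * n + 1
            g = list(group)
            s = (digit - emd_extract(g)) % m
            if s != 0:
                if s <= n:
                    g[s - 1] += 1
                else:
                    g[m - s - 1] -= 1
            return g

        # Embed the 5-ary digit 3 into a pair of pixel values
        stego = emd_embed([100, 101], 3)
        assert emd_extract(stego) == 3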

  3. Benchmarking of a T-wave alternans detection method based on empirical mode decomposition.

    Science.gov (United States)

    Blanco-Velasco, Manuel; Goya-Esteban, Rebeca; Cruz-Roldán, Fernando; García-Alberola, Arcadi; Rojo-Álvarez, José Luis

    2017-07-01

    T-wave alternans (TWA) is a fluctuation of the ST-T complex occurring on an every-other-beat basis in the surface electrocardiogram (ECG). It has been shown to be an informative risk stratifier for sudden cardiac death, though the lack of a gold standard to benchmark detection methods has promoted the use of synthetic signals. This work proposes a novel signal model to study the performance of TWA detection. Additionally, the methodological validation of a denoising technique based on empirical mode decomposition (EMD), which is used here along with the spectral method (SM), is also tackled. The proposed test bed system is based on the following guidelines: (1) use of open-source databases to enable experimental replication; (2) use of real ECG signals and physiological noise; (3) inclusion of randomized TWA episodes. Both sensitivity (Se) and specificity (Sp) are analyzed separately. Also, a nonparametric hypothesis test, based on bootstrap resampling, is used to determine whether the presence of the EMD block actually improves the performance. The results show an outstanding specificity when the EMD block is used, even in very noisy conditions (0.96 compared to 0.72 for SNR = 8 dB), always superior to that of the conventional SM alone. Regarding sensitivity, using the EMD method also outperforms in noisy conditions (0.57 compared to 0.46 for SNR = 8 dB), while it decreases in noiseless conditions. The proposed test setting, designed to analyze the performance, guarantees that the actual physiological variability of the cardiac system is reproduced. The use of the EMD-based block in noisy environments enables the identification of most patients with fatal arrhythmias. Copyright © 2017 Elsevier B.V. All rights reserved.
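
    The EMD denoising block can be sketched with the PyEMD package: decompose the ECG, discard the leading IMF(s) in which broadband noise concentrates, and reconstruct the signal before the spectral method is applied. Dropping exactly one IMF is an assumption; the paper's rule may differ.

        import numpy as np
        from PyEMD import EMD  # pip install EMD-signal

        def emd_denoise(ecg, drop=1):
            """Reconstruct the ECG from all but the first `drop` IMFs."""
            imfs = EMD().emd(np.asarray(ecg, dtype=float))
            return imfs[drop:].sum(axis=0)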

  4. Adverse Event extraction from Structured Product Labels using the Event-based Text-mining of Health Electronic Records (ETHER) system.

    Science.gov (United States)

    Pandey, Abhishek; Kreimeyer, Kory; Foster, Matthew; Botsis, Taxiarchis; Dang, Oanh; Ly, Thomas; Wang, Wei; Forshee, Richard

    2018-01-01

    Structured Product Labels follow an XML-based document markup standard approved by the Health Level Seven organization and adopted by the US Food and Drug Administration as a mechanism for exchanging medical product information. Their current organization makes their secondary use rather challenging. We used the Side Effect Resource database and DailyMed to generate a comparison dataset of 1159 Structured Product Labels. We processed the Adverse Reaction section of these Structured Product Labels with the Event-based Text-mining of Health Electronic Records (ETHER) system and evaluated its ability to extract and encode Adverse Event terms to Medical Dictionary for Regulatory Activities Preferred Terms. A small sample of 100 labels was then selected for further analysis. On these 100 labels, ETHER achieved a precision and recall of 81 percent and 92 percent, respectively. This study demonstrated ETHER's ability to extract and encode Adverse Event terms from Structured Product Labels, which may potentially support multiple pharmacoepidemiological tasks.

  5. Multi-scale analysis of teleconnection indices: climate noise and nonlinear trend analysis

    Directory of Open Access Journals (Sweden)

    C. Franzke

    2009-02-01

    Full Text Available The multi-scale nature and climate noise properties of teleconnection indices are examined by using the Empirical Mode Decomposition (EMD) procedure. The EMD procedure allows for the analysis of non-stationary time series to extract physically meaningful intrinsic mode functions (IMF) and nonlinear trends. The climatologically relevant monthly mean teleconnection indices of the North Atlantic Oscillation (NAO), the North Pacific index (NP) and the Southern Annular Mode (SAM) are analyzed.

    The significance of the IMFs and trends is tested against the null hypothesis of climate noise. The analysis of surrogate monthly mean time series from a red noise process shows that the EMD procedure is effectively a dyadic filter bank and that the IMFs (except the first IMF) are nearly Gaussian distributed. The distribution of the variance contained in the IMFs of an ensemble of AR(1) simulations is nearly χ2 distributed. To test the statistical significance of the IMFs of the teleconnection indices and their nonlinear trends, we utilize an ensemble of corresponding monthly averaged AR(1) processes, which we refer to as climate noise. Our results indicate that most of the interannual and decadal variability of the analysed teleconnection indices cannot be distinguished from climate noise. The NP and SAM indices have significant nonlinear trends, while the NAO has no significant trend when tested against the climate noise hypothesis.
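
    The climate-noise test can be sketched as follows: fit an AR(1) process to the index, generate a surrogate ensemble, decompose each surrogate with EMD, and compare the variance of a given index IMF against the surrogate distribution. This is a simplified version of the procedure with illustrative parameter choices.

        import numpy as np
        from PyEMD import EMD  # pip install EMD-signal

        def ar1_surrogates(x, n_sur=200, seed=0):
            """AR(1) noise matched to the index's lag-1 autocorrelation."""
            rng = np.random.default_rng(seed)
            x = np.asarray(x, dtype=float) - np.mean(x)
            phi = np.corrcoef(x[:-1], x[1:])[0, 1]
            sigma = x.std() * np.sqrt(1.0 - phi ** 2)
            sur = np.empty((n_sur, x.size))
            sur[:, 0] = rng.normal(0.0, x.std(), n_sur)
            for t in range(1, x.size):
                sur[:, t] = phi * sur[:, t - 1] + rng.normal(0.0, sigma, n_sur)
            return sur

        def imf_variance_threshold(x, mode, n_sur=200):
            """95th percentile of IMF-`mode` variance under climate noise."""
            emd, null = EMD(), []
            for s in ar1_surrogates(x, n_sur):
                imfs = emd.emd(s)
                null.append(np.var(imfs[min(mode, len(imfs) - 1)]))
            return np.percentile(null, 95)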

  6. Precursor analyses - The use of deterministic and PSA based methods in the event investigation process at nuclear power plants

    International Nuclear Information System (INIS)

    2004-09-01

    The efficient feedback of operating experience (OE) is a valuable source of information for improving the safety and reliability of nuclear power plants (NPPs). It is therefore essential to collect information on abnormal events from both internal and external sources. Internal operating experience is analysed to obtain a complete understanding of an event and of its safety implications. Corrective or improvement measures may then be developed, prioritized and implemented in the plant if considered appropriate. Information on external events may also be analysed in order to learn lessons from others' experience and prevent similar occurrences at one's own plant. The traditional ways of investigating operational events have been predominantly qualitative. In recent years, a PSA-based method called probabilistic precursor event analysis has been developed and applied on a significant scale for a number of plants. The method enables a quantitative estimation of the safety significance of operational events to be incorporated. The purpose of this report is to outline a synergistic process that makes more effective use of operating experience event information by combining the insights and knowledge gained from both approaches, traditional deterministic event investigation and PSA-based event analysis. The PSA-based view of operational events and PSA-based event analysis can support the process of operational event analysis at the following stages of the operational event investigation: (1) Initial screening stage. (It introduces an element of quantitative analysis into the selection process. Quantitative analysis of the safety significance of nuclear plant events can be a very useful measure when selecting internal and external operating experience information for its relevance.) (2) In-depth analysis. (PSA-based event evaluation provides a quantitative measure for judging the significance of operational events, contributors to

  7. Collecting operational event data for statistical analysis

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-09-01

    This report gives guidance for collecting operational data to be used for statistical analysis, especially analysis of event counts. It discusses how to define the purpose of the study, the unit (system, component, etc.) to be studied, the events to be counted, and the demand or exposure time. Examples are given of classification systems for events in the data sources. A checklist summarizes the essential steps in data collection for statistical analysis.

  8. Event Shape Analysis in ALICE

    CERN Document Server

    AUTHOR|(CDS)2073367; Paic, Guy

    2009-01-01

    Jets are the final-state manifestation of hard parton scattering. Since at LHC energies the production of hard processes in proton-proton collisions will be copious and varied, it is important to develop methods to identify them through the study of their final states. In the present work we describe a method based on the use of shape variables to discriminate events according to their topologies. A very attractive feature of this analysis is the possibility of using the tracking information of the TPC+ITS in order to identify specific events like jets. Through the correlation between the quantities thrust and recoil, calculated in minimum-bias simulations of proton-proton collisions at 10 TeV, we show the sensitivity of the method for selecting specific topologies and high multiplicity. The presented results were obtained both at generator level and after reconstruction. It remains that with any kind of jet reconstruction algorithm one will in general be confronted with overlapping jets. The present meth...
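
    As an illustration of the kind of shape variable used, the transverse thrust can be computed with a simple scan over candidate axis directions; the grid scan below stands in for the exact maximisation, and the recoil variable is omitted.

        import numpy as np

        def transverse_thrust(px, py, n_steps=360):
            """Transverse thrust: close to 1 for pencil-like (jetty) events,
            close to 2/pi for isotropic ones. Grid scan over axis angles."""
            phis = np.linspace(0.0, np.pi, n_steps, endpoint=False)
            nx, ny = np.cos(phis), np.sin(phis)
            num = np.abs(np.outer(px, nx) + np.outer(py, ny)).sum(axis=0)
            return num.max() / np.hypot(px, py).sum()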

  9. EVENT PLANNING USING FUNCTION ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Lori Braase; Jodi Grgich

    2011-06-01

    Event planning is expensive and resource intensive. Function analysis provides a solid foundation for comprehensive event planning (e.g., workshops, conferences, symposiums, or meetings). It has been used at Idaho National Laboratory (INL) to successfully plan events and capture lessons learned, and played a significant role in the development and implementation of the “INL Guide for Hosting an Event.” Using a guide and a functional approach to planning utilizes resources more efficiently and reduces errors that could be distracting or detrimental to an event. This integrated approach to logistics and program planning – with the primary focus on the participant – gives us the edge.

  10. Event-based state estimation a stochastic perspective

    CERN Document Server

    Shi, Dawei; Chen, Tongwen

    2016-01-01

    This book explores event-based estimation problems. It shows how several stochastic approaches are developed to maintain estimation performance when sensors perform their updates at slower rates or only when needed. The self-contained presentation makes this book suitable for readers with no more than a basic knowledge of probability analysis, matrix algebra and linear systems. The introduction and literature review provide background information, while the main content deals with estimation problems from four distinct angles in a stochastic setting, using numerous illustrative examples and comparisons. The text elucidates both theoretical developments and their applications, and is rounded out by a review of open problems. This book is a valuable resource for researchers and students who wish to expand their knowledge and work in the area of event-triggered systems. At the same time, engineers and practitioners in industrial process control will benefit from the event-triggering technique that reduces communication costs ...

  11. Tracking the evolution of stream DOM source during storm events using end member mixing analysis based on DOM quality

    Science.gov (United States)

    Yang, Liyang; Chang, Soon-Woong; Shin, Hyun-Sang; Hur, Jin

    2015-04-01

    The source of river dissolved organic matter (DOM) during storm events has not been well constrained, even though it is critical in determining the quality and reactivity of DOM. This study assessed temporal changes in the contributions of four end members (weeds, leaf litter, soil, and groundwater) present in a small forested watershed (the Ehwa Brook, South Korea) to the stream DOM during two storm events, using end-member mixing analysis (EMMA) based on spectroscopic properties of DOM. The instantaneous export fluxes of dissolved organic carbon (DOC), chromophoric DOM (CDOM), and fluorescent components were all enhanced during peak flows. The DOC concentration increased with the flow rate, while CDOM and humic-like fluorescent components were diluted around the peak flows. Leaf litter was the dominant DOM source in event 2, which had higher rainfall, although there were temporal variations in the contributions of the four end members to the stream DOM in both events. The contribution of leaf litter peaked, while that of deeper soils decreased to minima, at peak flows. Our results demonstrate that EMMA based on DOM properties can be used to trace the DOM source, which is of fundamental importance for understanding the factors responsible for river DOM dynamics during storm events.
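
    At its core, EMMA solves a small constrained least-squares problem for the end-member fractions. A minimal sketch follows, with hypothetical tracer values (the matrix E and the sample vector are invented for illustration) and the sum-to-one constraint imposed as a heavily weighted extra equation.

        import numpy as np
        from scipy.optimize import nnls

        def emma_fractions(E, sample, w=100.0):
            """Non-negative end-member fractions summing to one; columns of
            E are end members, rows are DOM tracer values."""
            A = np.vstack([E, w * np.ones(E.shape[1])])  # sum-to-one row
            b = np.append(sample, w)
            f, _ = nnls(A, b)
            return f / f.sum()

        # Hypothetical 3 tracers x 4 end members
        # (weeds, leaf litter, soil, groundwater)
        E = np.array([[1.2, 2.0, 0.6, 0.2],
                      [0.5, 1.5, 1.0, 0.1],
                      [3.0, 4.0, 2.0, 0.5]])
        print(emma_fractions(E, np.array([1.4, 0.9, 2.8])))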

  12. Quasi-bivariate variational mode decomposition as a tool of scale analysis in wall-bounded turbulence

    Science.gov (United States)

    Wang, Wenkang; Pan, Chong; Wang, Jinjun

    2018-01-01

    The identification and separation of multi-scale coherent structures is a critical task for the study of scale interaction in wall-bounded turbulence. Here, we propose a quasi-bivariate variational mode decomposition (QB-VMD) method to extract structures with various scales from an instantaneous two-dimensional (2D) velocity field which has only one primary dimension. This method is developed from the one-dimensional VMD algorithm proposed by Dragomiretskiy and Zosso (IEEE Trans Signal Process 62:531-544, 2014) to cope with a quasi-2D scenario. It imposes a length-scale bandwidth constraint along the decomposed dimension, together with central frequency re-balancing along the non-decomposed dimension. The feasibility of this method is tested on both a synthetic flow field and a turbulent boundary layer at moderate Reynolds number (Re_τ = 3458) measured by 2D particle image velocimetry (PIV). Some other popular scale separation tools, including pseudo-bi-dimensional empirical mode decomposition (PB-EMD), bi-dimensional EMD (B-EMD) and proper orthogonal decomposition (POD), are also tested for comparison. Among all these methods, QB-VMD shows advantages in both scale characterization and energy recovery. More importantly, the mode-mixing problem, which degrades the performance of EMD-based methods, is avoided or minimized in QB-VMD. Finally, QB-VMD analysis of the wall-parallel plane in the log layer (at y/δ = 0.12) of the studied turbulent boundary layer shows the coexistence of large- or very-large-scale motions (LSMs or VLSMs) and inner-scaled structures, which can be fully decomposed in both physical and spectral domains.

  13. Event Recognition Based on Deep Learning in Chinese Texts.

    Directory of Open Access Journals (Sweden)

    Yajun Zhang

    Full Text Available Event recognition is the most fundamental and critical task in event-based natural language processing systems. Existing event recognition methods based on rules and shallow neural networks have certain limitations. For example, extracting features using methods based on rules is difficult; methods based on shallow neural networks converge too quickly to a local minimum, resulting in low recognition precision. To address these problems, we propose the Chinese emergency event recognition model based on deep learning (CEERM). Firstly, we use a word segmentation system to segment sentences. According to the event elements labeled in the CEC 2.0 corpus, we classify words into five categories: trigger words, participants, objects, time and location. Each word is vectorized according to the following six feature layers: part of speech, dependency grammar, length, location, distance between trigger word and core word, and trigger word frequency. We obtain deep semantic features of words by training a feature vector set using a deep belief network (DBN), then analyze those features in order to identify trigger words by means of a back-propagation neural network. Extensive testing shows that the CEERM achieves excellent recognition performance, with a maximum F-measure value of 85.17%. Moreover, we propose the dynamic-supervised DBN, which adds supervised fine-tuning to a restricted Boltzmann machine layer by monitoring its training performance. Test analysis reveals that the new DBN improves recognition performance and effectively controls the training time: the F-measure increases to 88.11%, while the training time increases by only 25.35%.

  14. Event Recognition Based on Deep Learning in Chinese Texts.

    Science.gov (United States)

    Zhang, Yajun; Liu, Zongtian; Zhou, Wen

    2016-01-01

    Event recognition is the most fundamental and critical task in event-based natural language processing systems. Existing event recognition methods based on rules and shallow neural networks have certain limitations. For example, extracting features using methods based on rules is difficult; methods based on shallow neural networks converge too quickly to a local minimum, resulting in low recognition precision. To address these problems, we propose the Chinese emergency event recognition model based on deep learning (CEERM). Firstly, we use a word segmentation system to segment sentences. According to the event elements labeled in the CEC 2.0 corpus, we classify words into five categories: trigger words, participants, objects, time and location. Each word is vectorized according to the following six feature layers: part of speech, dependency grammar, length, location, distance between trigger word and core word, and trigger word frequency. We obtain deep semantic features of words by training a feature vector set using a deep belief network (DBN), then analyze those features in order to identify trigger words by means of a back-propagation neural network. Extensive testing shows that the CEERM achieves excellent recognition performance, with a maximum F-measure value of 85.17%. Moreover, we propose the dynamic-supervised DBN, which adds supervised fine-tuning to a restricted Boltzmann machine layer by monitoring its training performance. Test analysis reveals that the new DBN improves recognition performance and effectively controls the training time: the F-measure increases to 88.11%, while the training time increases by only 25.35%.

  15. Preliminary safety analysis of unscrammed events for KLFR

    International Nuclear Information System (INIS)

    Kim, S.J.; Ha, G.S.

    2005-01-01

    The report presents the design features of the KLFR, the safety analysis code, steady-state calculation results, and analysis results for unscrammed events. The calculations of the steady state and unscrammed events have been performed for the conceptual design of the KLFR using the SSC-K code. The UTOP event results in no fuel damage and no centre-line melting. The inherent safety features are demonstrated through the analysis of the ULOHS event. Although the analysis of ULOF carries considerable uncertainty in the pump design, the results show the inherent safety characteristics. Natural circulation of about 6% of rated flow is established in the case of ULOF. In the metallic fuel rod, the cladding temperature is somewhat high due to the low heat transfer coefficient of lead. The ULOHS event should be considered in the design of the RVACS for long-term cooling.

  16. Analysis of external events - Nuclear Power Plant Dukovany

    International Nuclear Information System (INIS)

    Hladky, Milan

    2000-01-01

    The Level-1 PSA of external events covers internal events, floods and fires; other external events are not yet included. The shutdown PSA takes into account internal events, floods, fires and heavy load drops; other external events are not yet included. A final safety analysis report was completed after 10 years of operation for all Dukovany operational units. A probabilistic approach was used for the analysis of aircraft crash and external man-induced events. The risk caused by man-induced events was found to be negligible and was accepted by the State Office for Nuclear Safety (SONS).

  17. GPR Signal Denoising and Target Extraction With the CEEMD Method

    KAUST Repository

    Li, Jing

    2015-04-17

    In this letter, we apply a time-frequency analysis method based on the complete ensemble empirical mode decomposition (CEEMD) method to ground-penetrating radar (GPR) signal processing. It decomposes the GPR signal into a sum of oscillatory components, with guaranteed positive and smoothly varying instantaneous frequencies. The key idea of this method relies on averaging the modes obtained by empirical mode decomposition (EMD) applied to several realizations of Gaussian white noise added to the original signal. It can solve the mode-mixing problem of the EMD method and improve the resolution of ensemble EMD (EEMD) when the signal has a low signal-to-noise ratio. First, we analyze the differences between the basic theories of EMD, EEMD, and CEEMD. Then, we compare their time-frequency analyses using the Hilbert-Huang transform to test the results of the different methods. The synthetic and real GPR data demonstrate that CEEMD promises higher spectral-spatial resolution than the other two EMD methods in GPR signal denoising and target extraction. Its decomposition is complete, with a numerically negligible error.
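
    A CEEMD decomposition of a single GPR trace can be sketched with the PyEMD package's CEEMDAN implementation, a close relative of the CEEMD used in the letter; the trial count is a common default rather than the authors' setting.

        import numpy as np
        from PyEMD import CEEMDAN  # pip install EMD-signal

        def ceemd_modes(trace, trials=100):
            """Decompose one GPR trace into oscillatory modes."""
            return CEEMDAN(trials=trials).ceemdan(np.asarray(trace, float))

        # Denoising then amounts to discarding the highest-frequency mode(s):
        # modes = ceemd_modes(trace); denoised = modes[1:].sum(axis=0)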

  18. Optimum IMFs Selection Based Envelope Analysis of Bearing Fault Diagnosis in Plunger Pump

    Directory of Open Access Journals (Sweden)

    Wenliao Du

    2016-01-01

    Full Text Available As the plunger pump always works in a complicated environment and the hydraulic cycle has an intrinsic fluid-structure interaction character, the fault information is submerged in noise and disturbance impact signals. For fault diagnosis of the bearings in a plunger pump, an envelope analysis based on optimum intrinsic mode function (IMF) selection was proposed. Firstly, the Wigner-Ville distribution was calculated for the acquired vibration signals, and the resonance frequency brought on by the fault was obtained. Secondly, empirical mode decomposition (EMD) was employed for the vibration signal, and the optimum IMFs and the filter bandwidth were selected according to the Wigner-Ville distribution. Finally, envelope analysis was applied to the selected IMFs filtered by the band-pass filter, and the fault type was recognized by comparison with the bearing characteristic frequencies. For two fault modes, an inner-race fault and a compound fault in the inner race and roller of a rolling element bearing in a plunger pump, the experiments show that promising results are achieved.
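
    The final step can be sketched in a few lines of Python: take the Hilbert envelope of the selected, band-pass-filtered IMF sum, remove its mean, and inspect the amplitude spectrum for peaks at the bearing characteristic frequencies.

        import numpy as np
        from scipy.signal import hilbert
        from scipy.fft import rfft, rfftfreq

        def envelope_spectrum(x, fs):
            """Amplitude spectrum of the demodulated (envelope) signal."""
            env = np.abs(hilbert(x))          # Hilbert envelope
            env -= env.mean()                 # drop the DC component
            spec = np.abs(rfft(env)) / len(env)
            return rfftfreq(len(env), d=1.0 / fs), spec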

  19. Joint Attributes and Event Analysis for Multimedia Event Detection.

    Science.gov (United States)

    Ma, Zhigang; Chang, Xiaojun; Xu, Zhongwen; Sebe, Nicu; Hauptmann, Alexander G

    2017-06-15

    Semantic attributes have been increasingly used in the past few years for multimedia event detection (MED) with promising results. The motivation is that multimedia events generally consist of lower-level components such as objects, scenes, and actions. By characterizing multimedia event videos with semantic attributes, one could exploit more informative cues for improved detection results. Much existing work obtains semantic attributes from images, which may be suboptimal for video analysis since these image-inferred attributes do not carry the dynamic information that is essential for videos. To address this issue, we propose to learn semantic attributes from external videos using their semantic labels. We name them video attributes in this paper. In contrast with multimedia event videos, these external videos depict lower-level contents such as objects, scenes, and actions. To harness video attributes, we propose an algorithm built on a correlation vector that relates them to a target event. Consequently, we can incorporate video attributes latently as extra information into the event detector learnt from multimedia event videos in a joint framework. To validate our method, we perform experiments on the real-world large-scale TRECVID MED 2013 and 2014 data sets and compare our method with several state-of-the-art algorithms. The experiments show that our method is advantageous for MED.

  20. Extreme events in total ozone over the Northern mid-latitudes: an analysis based on long-term data sets from five European ground-based stations

    Energy Technology Data Exchange (ETDEWEB)

    Rieder, Harald E. (Inst. for Atmospheric and Climate Science, ETH Zurich, Zurich (Switzerland)), e-mail: hr2302@columbia.edu; Jancso, Leonhardt M. (Inst. for Atmospheric and Climate Science, ETH Zurich, Zurich (Switzerland); Inst. for Meteorology and Geophysics, Univ. of Innsbruck, Innsbruck (Austria)); Di Rocco, Stefania (Inst. for Atmospheric and Climate Science, ETH Zurich, Zurich (Switzerland); Dept. of Geography, Univ. of Zurich, Zurich (Switzerland)) (and others)

    2011-11-15

    We apply methods from extreme value theory to identify extreme events in high (termed EHOs) and low (termed ELOs) total ozone and to describe the distribution tails (i.e. very high and very low values) of five long-term European ground-based total ozone time series. The influence of these extreme events on observed mean values, long-term trends and changes is analysed. The results show a decrease in EHOs and an increase in ELOs during the last decades, and establish that the observed downward trend in column ozone during the 1970s-1990s is strongly dominated by changes in the frequency of extreme events. Furthermore, it is shown that clear 'fingerprints' of atmospheric dynamics (NAO, ENSO) and chemistry [ozone-depleting substances (ODSs), polar vortex ozone loss] can be found in the frequency distribution of ozone extremes, even if no attribution is possible from standard metrics (e.g. annual mean values). The analysis complements an earlier analysis of the world's longest total ozone record, at Arosa, Switzerland, confirming and revealing the strong influence of atmospheric dynamics on observed ozone changes. The results provide clear evidence that, in addition to ODSs, volcanic eruptions and strong/moderate ENSO and NAO events had significant influence on column ozone in the European sector.
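
    A typical building block of such an analysis is a peaks-over-threshold fit of the distribution tail with a generalised Pareto distribution; the sketch below uses an illustrative 95% threshold, not necessarily the one adopted in the paper.

        import numpy as np
        from scipy.stats import genpareto

        def fit_tail(ozone, quantile=0.95):
            """GPD fit to exceedances of column ozone over a high threshold."""
            u = np.quantile(ozone, quantile)
            exceed = ozone[ozone > u] - u
            shape, _, scale = genpareto.fit(exceed, floc=0.0)
            return u, shape, scale

        # Example on synthetic data
        rng = np.random.default_rng(1)
        print(fit_tail(300.0 + 30.0 * rng.standard_normal(5000)))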

  1. Fault feature analysis of cracked gear based on LOD and analytical-FE method

    Science.gov (United States)

    Wu, Jiateng; Yang, Yu; Yang, Xingkai; Cheng, Junsheng

    2018-01-01

    At present, there are two main ideas for gear fault diagnosis. One is model-based gear dynamic analysis; the other is signal-based gear vibration diagnosis. In this paper, a method for fault feature analysis of gear cracks is presented, which combines the advantages of dynamic modeling and signal processing. Firstly, a new time-frequency analysis method called local oscillatory-characteristic decomposition (LOD) is proposed, which has the attractive feature of extracting fault characteristics efficiently and accurately. Secondly, an analytical-finite element (analytical-FE) method, called the assist-stress intensity factor (assist-SIF) gear contact model, is put forward to calculate the time-varying mesh stiffness (TVMS) under different crack states. Based on the dynamic model of the gear system with 6 degrees of freedom, the dynamic simulation response was obtained for different tooth crack depths. For the dynamic model, the corresponding relation between the characteristic parameters and the degree of the tooth crack is established under a specific condition. On the basis of the methods mentioned above, a novel gear tooth root crack diagnosis method which combines the LOD with the analytical-FE is proposed. Furthermore, empirical mode decomposition (EMD) and ensemble empirical mode decomposition (EEMD) are contrasted with the LOD using gear crack fault vibration signals. The analysis results indicate that the proposed method is effective and feasible for the tooth crack stiffness calculation and the gear tooth crack fault diagnosis.

  2. Component-Based Data-Driven Predictive Maintenance to Reduce Unscheduled Maintenance Events

    NARCIS (Netherlands)

    Verhagen, W.J.C.; Curran, R.; de Boer, L.W.M.; Chen, C.H.; Trappey, A.C.; Peruzzini, M.; Stjepandić, J.; Wognum, N.

    2017-01-01

    Costs associated with unscheduled and preventive maintenance can contribute significantly to an airline's expenditure. Reliability analysis can help to identify and plan for maintenance events. Reliability analysis in industry is often limited to statistically based

  3. Event-based criteria in GT-STAF information indices: theory, exploratory diversity analysis and QSPR applications.

    Science.gov (United States)

    Barigye, S J; Marrero-Ponce, Y; Martínez López, Y; Martínez Santiago, O; Torrens, F; García Domenech, R; Galvez, J

    2013-01-01

    Versatile event-based approaches for the definition of novel information theory-based indices (IFIs) are presented. An event in this context is the criterion followed in the "discovery" of molecular substructures, which in turn serve as the basis for the construction of the generalized incidence and relations frequency matrices, Q and F, respectively. From the resultant F, Shannon's, mutual, conditional and joint entropy-based IFIs are computed. In previous reports, an event named connected subgraphs was presented. The present study is an extension of this notion, in which we introduce other events, namely: terminal paths, vertex path incidence, quantum subgraphs, walks of length k, Sach's subgraphs, MACCs, E-state and substructure fingerprints and, finally, Ghose and Crippen atom-types for hydrophobicity and refractivity. Moreover, we define magnitude-based IFIs, introducing the use of the magnitude criterion in the definition of mutual, conditional and joint entropy-based IFIs. We also discuss the use of information-theoretic parameters as a measure of the dissimilarity of codified structural information of molecules. Finally, a comparison of the statistics for QSPR models obtained with the proposed IFIs and with DRAGON's molecular descriptors, for the two physicochemical properties log P and log K of 34 derivatives of 2-furylethylenes, demonstrates similar or better predictive ability than the latter.

  4. DEVS representation of dynamical systems - Event-based intelligent control. [Discrete Event System Specification

    Science.gov (United States)

    Zeigler, Bernard P.

    1989-01-01

    It is shown how systems can be advantageously represented as discrete-event models by using DEVS (discrete-event system specification), a set-theoretic formalism. Such DEVS models provide a basis for the design of event-based logic control. In this control paradigm, the controller expects to receive confirming sensor responses to its control commands within definite time windows determined by its DEVS model of the system under control. The event-based control paradigm is applied in advanced robotics and intelligent automation, showing how classical process control can be readily interfaced with rule-based symbolic reasoning systems.

  5. Emergency Load Shedding Strategy Based on Sensitivity Analysis of Relay Operation Margin against Cascading Events

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Sun, Haishun

    2012-01-01

    In order to prevent long-term voltage instability and induced cascading events, a load shedding strategy based on the sensitivity of relay operation margin to load powers is discussed and proposed in this paper. The operation margin of the critical impedance backup relay is defined to identify the runtime emergent states of the related system components. Based on sensitivity analysis between the relay operation margin and power system state variables, an optimal load shedding strategy is applied to adjust the emergent states timely before the unwanted relay operation. Load dynamics is also taken into account to compensate the load shedding amount calculation. The multi-agent technology is applied for the whole strategy implementation. A test system is built in a real-time digital simulator (RTDS) and has demonstrated the effectiveness of the proposed strategy.

  6. External events analysis for experimental fusion facilities

    International Nuclear Information System (INIS)

    Cadwallader, L.C.

    1990-01-01

    External events are those off-normal events that threaten facilities either from outside or from inside the building. These events, such as floods, fires, and earthquakes, are among the leading risk contributors for fission power plants, and the nature of fusion facilities indicates that they may also be leading contributors to fusion risk. This paper gives overviews of analysis methods, references good analysis guidance documents, and gives design tips for mitigating the effects of floods and fires, seismic events, and aircraft impacts. Implications for future fusion facility siting are also discussed. Sites similar to fission plant sites are recommended. 46 refs

  7. Neural correlates of attentional and mnemonic processing in event-based prospective memory.

    Science.gov (United States)

    Knight, Justin B; Ethridge, Lauren E; Marsh, Richard L; Clementz, Brett A

    2010-01-01

    Prospective memory (PM), or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those perhaps associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT), followed by a LDT with an embedded PM component. Event-based cues were constituted by color and lexicality (red words). Behavioral data provided evidence that monitoring, or preparatory attentional processes, were used to detect cues. Analysis of the event-related potentials (ERP) revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest preparatory attention may operate by selectively modulating processing of features related to a previously formed event-based intention, as well as provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  8. Analysis of events occurred at overseas nuclear power plants in 2004

    International Nuclear Information System (INIS)

    Miyazaki, Takamasa; Nishioka, Hiromasa; Sato, Masahiro; Chiba, Gorou; Takagawa, Kenichi; Shimada, Hiroki

    2005-01-01

    The Institute of Nuclear Safety Systems (INSS) investigates information related to events and incidents occurring at overseas nuclear power plants, and proposes recommendations for improving the safety and reliability of domestic PWR plants by evaluating them. Following the 2003 report, this report summarizes the evaluation activity and the tendency analysis based on about 2800 pieces of information obtained in 2004. The tendency analysis was undertaken on about 1700 analyzed events from the mechanical, electrical and operational viewpoints, covering the causes, affected equipment and so on. (author)

  9. Event-based Sensing for Space Situational Awareness

    Science.gov (United States)

    Cohen, G.; Afshar, S.; van Schaik, A.; Wabnitz, A.; Bessell, T.; Rutten, M.; Morreale, B.

    A revolutionary type of imaging device, known as a silicon retina or event-based sensor, has recently been developed and is gaining popularity in the field of artificial vision systems. These devices are inspired by a biological retina and operate in a significantly different way from traditional CCD-based imaging sensors. While a CCD produces frames of pixel intensities, an event-based sensor produces a continuous stream of events, each of which is generated when a pixel detects a change in log light intensity. These pixels operate asynchronously and independently, producing an event-based output with high temporal resolution. There are also no fixed exposure times, allowing these devices to offer a very high dynamic range independently for each pixel. Additionally, these devices offer high-speed, low-power operation and a sparse spatiotemporal output. As a consequence, the data from these sensors must be interpreted in a significantly different way from traditional imaging sensors, and this paper explores the advantages this technology provides for space imaging. The applicability and capabilities of event-based sensors for SSA applications are demonstrated through telescope field trials. Trial results have confirmed that the devices are capable of observing resident space objects (RSOs) from LEO through to GEO orbital regimes. Significantly, observations of RSOs were made during both day-time and night-time (terminator) conditions without modification to the camera or optics. The event-based sensor's ability to image stars and satellites during day-time hours offers a dramatic capability increase for terrestrial optical sensors. This paper shows the field testing and validation of two different architectures of event-based imaging sensors. An event-based sensor's asynchronous output has an intrinsically low data rate. In addition to low-bandwidth communications requirements, the low weight, low power and high speed make them ideally suited to meeting the demanding

  10. Spatiotemporal Features for Asynchronous Event-based Data

    Directory of Open Access Journals (Sweden)

    Xavier eLagorce

    2015-02-01

    Full Text Available Bio-inspired asynchronous event-based vision sensors are currently introducing a paradigm shift in visual information processing. These new sensors rely on a stimulus-driven principle of light acquisition similar to biological retinas. They are event-driven and fully asynchronous, thereby reducing redundancy and encoding exact times of input signal changes, leading to a very precise temporal resolution. Approaches for higher-level computer vision often rely on the reliable detection of features in visual frames, but similar definitions of features for the novel dynamic and event-based visual input representation of silicon retinas have so far been lacking. This article addresses the problem of learning and recognizing features for event-based vision sensors, which capture properties of truly spatiotemporal volumes of sparse visual event information. A novel computational architecture for learning and encoding spatiotemporal features is introduced, based on a set of predictive recurrent reservoir networks competing via winner-take-all selection. Features are learned in an unsupervised manner from real-world input recorded with event-based vision sensors. It is shown that the networks in the architecture learn distinct and task-specific dynamic visual features, and can predict their trajectories over time.

  11. Analysis of extreme events

    CSIR Research Space (South Africa)

    Khuluse, S

    2009-04-01

    Full Text Available ... (ii) determination of the distribution of the damage and (iii) preparation of products that enable prediction of future risk events. The methodology provided by extreme value theory can also be a powerful tool in risk analysis...

  12. Improving the Critic Learning for Event-Based Nonlinear H∞ Control Design.

    Science.gov (United States)

    Wang, Ding; He, Haibo; Liu, Derong

    2017-10-01

    In this paper, we aim at improving the critic learning criterion to cope with event-based nonlinear H∞ state feedback control design. First of all, the H∞ control problem is regarded as a two-player zero-sum game, and the adaptive critic mechanism is used to achieve minimax optimization in an event-based environment. Then, based on an improved updating rule, the event-based optimal control law and the time-based worst-case disturbance law are obtained approximately by training a single critic neural network. The initial stabilizing control is no longer required during the implementation process of the new algorithm. Next, the closed-loop system is formulated as an impulsive model and its stability issue is handled by incorporating the improved learning criterion. The infamous Zeno behavior of the present event-based design is also avoided through theoretical analysis of the lower bound of the minimal intersample time. Finally, applications to aircraft dynamics and a robot arm plant are carried out to verify the efficient performance of the present novel design method.
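
    The event-based sampling idea at the heart of this design can be illustrated with a minimal simulation: a linear plant whose state-feedback control is recomputed only when the state has drifted sufficiently from its last sampled value. The plant matrices, gain, and trigger threshold below are invented for illustration; the paper's adaptive-critic H∞ design is far more elaborate.

        import numpy as np

        A = np.array([[1.0, 0.1], [0.0, 1.0]])   # discrete double integrator
        B = np.array([[0.005], [0.1]])
        K = np.array([[3.0, 2.5]])               # assumed stabilizing gain
        sigma = 0.3                              # trigger threshold

        x = np.array([1.0, 0.0])
        x_k = x.copy()                           # last sampled state
        events = 0
        for t in range(200):
            if np.linalg.norm(x - x_k) > sigma * np.linalg.norm(x):
                x_k = x.copy()                   # event: re-sample the state
                events += 1
            u = -K @ x_k                         # control held between events
            x = A @ x + B @ u                    # result has shape (2,)
        print(f"{events} updates in 200 steps; |x| = {np.linalg.norm(x):.3f}")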

  13. Regression analysis of mixed recurrent-event and panel-count data.

    Science.gov (United States)

    Zhu, Liang; Tong, Xinwei; Sun, Jianguo; Chen, Manhua; Srivastava, Deo Kumar; Leisenring, Wendy; Robison, Leslie L

    2014-07-01

    In event history studies concerning recurrent events, two types of data have been extensively discussed. One is recurrent-event data (Cook and Lawless, 2007. The Analysis of Recurrent Event Data. New York: Springer), and the other is panel-count data (Zhao and others, 2010. Nonparametric inference based on panel-count data. Test 20, 1-42). In the former case, all study subjects are monitored continuously; thus, complete information is available for the underlying recurrent-event processes of interest. In the latter case, study subjects are monitored periodically; thus, only incomplete information is available for the processes of interest. In reality, however, a third type of data could occur in which some study subjects are monitored continuously, but others are monitored periodically. When this occurs, we have mixed recurrent-event and panel-count data. This paper discusses regression analysis of such mixed data and presents two estimation procedures for the problem. One is a maximum likelihood estimation procedure, and the other is an estimating equation procedure. The asymptotic properties of both resulting estimators of regression parameters are established. Also, the methods are applied to a set of mixed recurrent-event and panel-count data that arose from a Childhood Cancer Survivor Study and motivated this investigation. © The Author 2014. Published by Oxford University Press. All rights reserved.

  14. Analysis of adverse events occurred at overseas nuclear power plants in 2003

    International Nuclear Information System (INIS)

    Miyazaki, Takamasa; Sato, Masahiro; Takagawa, Kenichi; Fushimi, Yasuyuki; Shimada, Hiroki; Shimada, Yoshio

    2004-01-01

    Adverse events that have occurred at overseas nuclear power plants can be studied to indicate how to improve the safety and reliability of nuclear power plants in Japan. The Institute of Nuclear Safety Systems (INSS) obtains information related to overseas adverse events and incidents, and by evaluating it proposes improvements to prevent similar occurrences in Japanese PWR plants. In 2003, INSS obtained approximately 2800 pieces of information and, by evaluating them, proposed nine recommendations to Japanese utilities. This report shows a summary of the evaluation activity and of the tendency analysis based on the individual events analyzed in 2003. The tendency analysis was undertaken on about 1600 analyzed events from the viewpoints of mechanics, electrics, instrumentation and control, and operations, covering the causes, countermeasures, affected equipment and possible lessons learnt from overseas events. This report presents the overall tendency of overseas events and incidents to support the improvement of the safety and reliability of domestic PWR plants. (author)

  15. Analysis hierarchical model for discrete event systems

    Science.gov (United States)

    Ciortea, E. M.

    2015-11-01

    This paper presents a hierarchical model based on discrete-event networks for robotic systems. In the hierarchical approach, the Petri net is analysed as a network spanning the highest conceptual level down to the lowest level of local control, and extended Petri nets are used for modelling and control of complex robotic systems. Such a system is structured, controlled and analysed in this paper using the Visual Object Net++ package, which is relatively simple and easy to use, and the results are shown as representations that are easy to interpret. The hierarchical structure of the robotic system is implemented on computers and analysed using specialized programs. Implementation of the hierarchical discrete-event system model as a real-time operating system on a computer network connected via a serial bus is possible, where each computer is dedicated to the local Petri model of one subsystem of the global robotic system. Since Petri models are simple enough to run on general computers, the analysis, modelling and control of complex manufacturing systems can be achieved using Petri nets; discrete-event models are a pragmatic tool for modelling industrial systems. To highlight auxiliary times, the timed Petri model of the transport stream is divided into hierarchical levels and sections that are analysed successively. The proposed robotic system simulation using timed Petri nets offers the opportunity to view the timing of the robotic system. From spot measurements of goods handling and transmission times, graphics are obtained showing the average time for the transport activity, using the parameter sets of finished products individually.

  16. Analysis for Human-related Events during the Overhaul

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ji Tae; Kim, Min Chull; Choi, Dong Won; Lee, Durk Hun [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2011-10-15

    The frequency of events due to human error has been decreasing since 2008 among the 20 operating Nuclear Power Plants (NPPs), excluding the NPP in the commissioning stage (Shin-Kori unit 1). However, events due to human error during an overhaul (O/H) occur annually (see Table I). An analysis of human-related events during the O/H was performed. Similar problems were identified for each event from the analysis, and organizational and safety-culture factors were also identified.

  17. Signal analysis of acoustic and flow-induced vibrations of BWR main steam line

    Energy Technology Data Exchange (ETDEWEB)

    Espinosa-Paredes, G., E-mail: gepe@xanum.uam.mx [División de Ciencias Básicas e Ingeniería, Universidad Autónoma Metropolitana-Iztapalapa, México, D.F. 09340 (Mexico); Prieto-Guerrero, A. [División de Ciencias Básicas e Ingeniería, Universidad Autónoma Metropolitana-Iztapalapa, México, D.F. 09340 (Mexico); Núñez-Carrera, A. [Comisión Nacional de Seguridad Nuclear y Salvaguardias, Doctor Barragán 779, Col. Narvarte, México, D.F. 03020 (Mexico); Vázquez-Rodríguez, A. [División de Ciencias Básicas e Ingeniería, Universidad Autónoma Metropolitana-Iztapalapa, México, D.F. 09340 (Mexico); Centeno-Pérez, J. [Instituto Politécnico Nacional, Escuela Superior de Física y Matemáticas Unidad Profesional “Adolfo López Mateos”, Av. IPN, s/n, México, D.F. 07738 (Mexico); Espinosa-Martínez, E.-G. [Departamento de Sistemas Energéticos, Universidad Nacional Autónoma de México, México, D.F. 04510 (Mexico); and others

    2016-05-15

    Highlights: • Acoustic and flow-induced vibrations of a BWR are analyzed. • BWR performance after an extended power uprate is considered. • The effect of acoustic side branches (ASB) is analyzed. • The ASB represents a reduction in the acoustic loads to the steam dryer. • A methodology is developed for simultaneously analyzing the signals in the MSL. - Abstract: The aim of this work is the signal analysis of acoustic waves due to the phenomenon known as singing in the Safety Relief Valves (SRVs) of the main steam lines (MSL) in a typical BWR5. The acoustic resonance arises in the SRV standpipes, and the fluctuating pressure propagates from the SRVs to the dryer through the MSL. The signals are analyzed with a novel method based on the Multivariate Empirical Mode Decomposition (M-EMD). The M-EMD algorithm has the potential to find common oscillatory modes (IMFs) within multivariate data. Based on this fact, we implement the M-EMD technique to find the oscillatory modes in the BWR, considering the measurements collected by the strain gauges located around the MSL. These IMFs, analyzed simultaneously in time, allow an estimation of the effects of the multiple SRVs on the MSL. Two scenarios are analyzed: the first is the signal obtained before the installation of the acoustic dampers (ASB), and the second, the signal obtained after installation. The results show the effectiveness of the ASB in damping the strong resonances as the steam flow increases, which represents an important reduction in the acoustic loads on the steam dryer.

  18. Signal analysis of acoustic and flow-induced vibrations of BWR main steam line

    International Nuclear Information System (INIS)

    Espinosa-Paredes, G.; Prieto-Guerrero, A.; Núñez-Carrera, A.; Vázquez-Rodríguez, A.; Centeno-Pérez, J.; Espinosa-Martínez, E.-G.

    2016-01-01

    Highlights: • Acoustic and flow-induced vibrations of a BWR are analyzed. • BWR performance after an extended power uprate is considered. • The effect of acoustic side branches (ASB) is analyzed. • The ASB represent a reduction in the acoustic loads to the steam dryer. • A methodology is developed for simultaneously analyzing the signals in the MSL. - Abstract: The aim of this work is the signal analysis of acoustic waves due to the phenomenon known as singing in the Safety Relief Valves (SRV) of the main steam lines (MSL) in a typical BWR5. The acoustic resonance in the SRV standpipes produces fluctuating pressure that propagates from the SRV to the dryer through the MSL. The signals are analyzed with a novel method based on the Multivariate Empirical Mode Decomposition (M-EMD). The M-EMD algorithm has the potential to find common oscillatory modes (IMFs) within multivariate data. Based on this fact, we implement the M-EMD technique to find these oscillatory modes in the BWR, considering the measurements collected by the strain gauges located around the MSL. The IMFs, analyzed simultaneously in time, allow an estimation of the effects of the multiple SRVs on the MSL. Two scenarios are analyzed: the first is the signal obtained before the installation of the acoustic side branches (ASB), and the second is the signal obtained after installation. The results show the effectiveness of the ASB in damping the strong resonances when the steam flow increases, which represents an important reduction in the acoustic loads to the steam dryer.

  19. Screening Analysis of Criticality Features, Events, and Processes for License Application

    International Nuclear Information System (INIS)

    J.A. McClure

    2004-01-01

    This report documents the screening analysis of postclosure criticality features, events, and processes. It addresses the probability of criticality events resulting from degradation processes as well as disruptive events (i.e., seismic, rock fall, and igneous). Probability evaluations are performed utilizing the configuration generator described in ''Configuration Generator Model'', a component of the methodology from ''Disposal Criticality Analysis Methodology Topical Report''. The total probability per package of criticality is compared against the regulatory probability criterion for inclusion of events established in 10 CFR 63.114(d) (consider only events that have at least one chance in 10,000 of occurring over 10,000 years). The total probability of criticality accounts for the evaluation of identified potential critical configurations of all baselined commercial and U.S. Department of Energy spent nuclear fuel waste form and waste package combinations, both internal and external to the waste packages. This criticality screening analysis utilizes available information for the 21-Pressurized Water Reactor Absorber Plate, 12-Pressurized Water Reactor Absorber Plate, 44-Boiling Water Reactor Absorber Plate, 24-Boiling Water Reactor Absorber Plate, and the 5-Defense High-Level Radioactive Waste/U.S. Department of Energy Short waste package types. Where defensible, assumptions have been made for the evaluation of the following waste package types in order to perform a complete criticality screening analysis: 21-Pressurized Water Reactor Control Rod, 5-Defense High-Level Radioactive Waste/U.S. Department of Energy Long, and 2-Multi-Canister Overpack/2-Defense High-Level Radioactive Waste package types. The inputs used to establish probabilities for this analysis report are based on information and data generated for the Total System Performance Assessment for the License Application, where available. This analysis report determines whether criticality is to be

  20. Statistical analysis of solar proton events

    Directory of Open Access Journals (Sweden)

    V. Kurt

    2004-06-01

    Full Text Available A new catalogue of 253 solar proton events (SPEs) with energy >10 MeV and peak intensity >10 protons/(cm²·s·sr) (pfu) at the Earth's orbit for three complete 11-year solar cycles (1970-2002) is given. A statistical analysis of this data set of SPEs and their associated flares that occurred during this time period is presented. Of these proton events, 231 are flare-related and only 22 are not associated with Hα flares. It is also noteworthy that 42 of these events were registered as Ground Level Enhancements (GLEs) in neutron monitors. The longitudinal distribution of the associated flares shows that a great number of these events are connected with western flares. This analysis enables one to understand the long-term dependence of the SPEs and the related flare characteristics on the solar cycle, which is useful for space weather prediction.

  1. Neural correlates of attentional and mnemonic processing in event-based prospective memory

    Directory of Open Access Journals (Sweden)

    Justin B Knight

    2010-02-01

    Full Text Available Prospective memory, or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those perhaps associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT, followed by a LDT with an embedded prospective memory (PM component. Event-based cues were constituted by color and lexicality (red words. Behavioral data provided evidence that monitoring, or preparatory attentional processes, were used to detect cues. Analysis of the event-related potentials (ERP revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest preparatory attention may operate by selectively modulating processing of features related to a previously formed event-based intention, as well as provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  2. Event History Analysis in Quantitative Genetics

    DEFF Research Database (Denmark)

    Maia, Rafael Pimentel

    Event history analysis is a class of statistical methods specially designed to analyze time-to-event characteristics, e.g. the time until death. The aim of the thesis was to present adequate multivariate versions of mixed survival models that properly represent the genetic aspects related to a given...

  3. NPP unusual events: data, analysis and application

    International Nuclear Information System (INIS)

    Tolstykh, V.

    1990-01-01

    The subject of the paper is the IAEA's cooperative pattern of unusual event data treatment and utilization of operating safety experience feedback. The Incident Reporting System (IRS) and the Analysis of Safety Significant Event Team (ASSET) are discussed. The IRS methodology for collecting, handling, assessing and disseminating data on NPP unusual events (deviations, incidents and accidents) occurring during operation, surveillance and maintenance is outlined through the report gathering and issuing practice, the expert assessment procedures and the parameters of the system. After 7 years of existence, the IAEA-IRS contains over 1000 reports and receives 1.5-4% of the total information on unusual events. The author considers the reports only as detailed technical 'records' of events requiring assessment. The ASSET approach, implying an in-depth analysis of occurrences directed towards level-1 PSA utilization, is commented on. The experts evaluated root causes for the reported events, and some trends are presented. Generally, internal events due to unexpected paths of water in the nuclear installations, occurrences related to the integrity of the primary heat transport systems, events associated with the engineered safety systems and events involving the human factor represent the large groups deserving close attention. Personal recommendations on how to use event-related information for NPP safety improvement are given. 2 tabs (R.Ts)

  4. Analysis of external flooding events occurred in foreign nuclear power plant sites

    International Nuclear Information System (INIS)

    Li Dan; Cai Hankun; Xiao Zhi; An Hongzhen; Mao Huan

    2013-01-01

    This paper screens and studies 17 external flooding events that occurred at foreign NPP sites and analyses the characteristics of external flooding events based on the source of the flooding; the impact on buildings, systems and equipment; and the threat to nuclear safety. Furthermore, based on the experiences and lessons learned from the Fukushima nuclear accident relating to external flooding, and on the countermeasures carried out around the world, some suggestions are proposed to improve the external flooding response capacity of Chinese NPPs. (authors)

  5. Identification and analysis of external event combinations for Hanhikivi 1 PRA

    Energy Technology Data Exchange (ETDEWEB)

    Helander, Juho [Fennovoima Oy, Helsinki (Finland)

    2017-03-15

    Fennovoima's nuclear power plant, Hanhikivi 1 (Pyhäjoki, Finland), is currently in the design phase; its construction is scheduled to begin in 2018 and electricity production in 2024. The objective of this paper is to produce a preliminary list of safety-significant external event combinations, including preliminary probability estimates, to be used in the probabilistic risk assessment of the Hanhikivi 1 plant. Starting from the list of relevant single events, the relevant event combinations are identified based on seasonal variation, preconditions related to different events, and dependencies (fundamental and cascade type) between events. Using this method yields 30 relevant event combinations of two events for the Hanhikivi site. The preliminary probability of each combination is evaluated, and event combinations with extremely low probability are excluded from further analysis. Event combinations of three or more events are identified by adding possible events to the remaining combinations of two events. Finally, 10 relevant combinations of two events and three relevant combinations of three events remain. The results shall be considered preliminary and will be updated after evaluating more detailed effects of different events on plant safety.
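
    As a rough illustration of the screening logic described above, the sketch below enumerates pairs of external events and keeps only those combinations whose joint annual probability clears a cutoff. All event names, probabilities, dependence factors and the threshold are invented for illustration; the actual Hanhikivi analysis rests on site-specific data.

```python
# Sketch of external-event combination screening (hypothetical values).
from itertools import combinations

# Annual probabilities of single external events (assumed, illustrative).
single_events = {
    "high_sea_level": 1e-2,
    "high_wind": 5e-2,
    "frazil_ice": 1e-2,
    "lightning": 2e-2,
}

# A multiplier > 1 encodes a fundamental/cascade dependency between two
# events; 1.0 means independence (illustrative values only).
dependence = {("high_sea_level", "high_wind"): 50.0}

CUTOFF = 1e-3  # screening threshold below which combinations are dropped

def joint_probability(a, b):
    factor = dependence.get((a, b)) or dependence.get((b, a)) or 1.0
    return single_events[a] * single_events[b] * factor

relevant = [(a, b, joint_probability(a, b))
            for a, b in combinations(single_events, 2)
            if joint_probability(a, b) >= CUTOFF]

for a, b, p in sorted(relevant, key=lambda t: -t[2]):
    print(f"{a} + {b}: {p:.2e}")
```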

  6. Sentiment analysis on tweets for social events

    DEFF Research Database (Denmark)

    Zhou, Xujuan; Tao, Xiaohui; Yong, Jianming

    2013-01-01

    Sentiment analysis or opinion mining is an important type of text analysis that aims to support decision making by extracting and analyzing opinion oriented text, identifying positive and negative opinions, and measuring how positively or negatively an entity (i.e., people, organization, event, location, product, topic, etc.) is regarded. As more and more users express their political and religious views on Twitter, tweets become valuable sources of people's opinions. Tweets data can be efficiently used to infer people's opinions for marketing or social studies. This paper proposes a Tweets Sentiment Analysis Model (TSAM) that can spot the societal interest and general people's opinions in regard to a social event. In this paper, the Australian federal election 2010 event was taken as an example for sentiment analysis experiments. We are primarily interested in the sentiment of the specific...

  7. Problems in event based engine control

    DEFF Research Database (Denmark)

    Hendricks, Elbert; Jensen, Michael; Chevalier, Alain Marie Roger

    1994-01-01

    Physically, a four-cycle spark ignition engine operates on the basis of four engine processes or events: intake, compression, ignition (or expansion) and exhaust. These events each occupy approximately 180° of crank angle. In conventional engine controllers, it is an accepted practice to sample the engine variables synchronously with these events (or submultiples of them). Such engine controllers are often called event-based systems. Unfortunately, the main system noise (or disturbance) is also synchronous with the engine events: the engine pumping fluctuations. Since many electronic engine ... problems on accurate air/fuel ratio control of a spark ignition (SI) engine.

  8. Gear Fault Detection Based on Teager-Huang Transform

    Directory of Open Access Journals (Sweden)

    Hui Li

    2010-01-01

    Full Text Available Gear fault detection based on the Empirical Mode Decomposition (EMD) and Teager Kaiser Energy Operator (TKEO) technique is presented. This novel method is named the Teager-Huang transform (THT). EMD can adaptively decompose the vibration signal into a series of zero-mean Intrinsic Mode Functions (IMFs). TKEO can track the instantaneous amplitude and instantaneous frequency of the Intrinsic Mode Functions at any instant. The experimental results provide effective evidence that the Teager-Huang transform has better resolution than the Hilbert-Huang transform. The Teager-Huang transform can effectively diagnose gear faults, thus providing a viable processing tool for gearbox defect detection and diagnosis.
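
    A minimal sketch of the Teager-Huang idea follows: decompose the signal with EMD, then apply the discrete Teager-Kaiser energy operator to each IMF. It assumes the PyEMD package is installed, and the vibration signal here is synthetic rather than a real gearbox measurement.

```python
import numpy as np
from PyEMD import EMD

def tkeo(x):
    """Discrete Teager-Kaiser energy operator: psi[n] = x[n]^2 - x[n-1]*x[n+1]."""
    psi = np.zeros_like(x)
    psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
    return psi

t = np.linspace(0, 1, 2000)
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 7 * t)

imfs = EMD().emd(signal, t)             # adaptive decomposition into IMFs
energies = [tkeo(imf) for imf in imfs]  # instantaneous energy tracking per IMF
print(f"{len(imfs)} IMFs extracted; peak energy of IMF1 = {energies[0].max():.3f}")
```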

  9. Interpretation Analysis as a Competitive Event.

    Science.gov (United States)

    Nading, Robert M.

    Interpretation analysis is a new and interesting event on the forensics horizon which appears to be attracting an ever larger number of supporters. This event, developed by Larry Lambert of Ball State University in 1989, requires a student to perform all three disciplines of forensic competition (interpretation, public speaking, and limited…

  10. Event-Based Analysis of Rainfall-Runoff Response to Assess Wetland-Stream Interaction in the Prairie Pothole Region

    Science.gov (United States)

    Haque, M. A.; Ross, C.; Schmall, A.; Bansah, S.; Ali, G.

    2016-12-01

    Process-based understanding of wetland response to precipitation is needed to quantify the extent to which non-floodplain wetlands - such as Prairie potholes - generate flow and transmit that flow to nearby streams. While measuring wetland-stream (W-S) interaction is difficult, it is possible to infer it by examining hysteresis characteristics between wetland and stream stage during individual precipitation events. Hence, to evaluate W-S interaction, 10 intact and 10 altered/lost potholes were selected for study; they are located in Broughton's Creek Watershed (Manitoba, Canada) on both sides of a 5 km creek reach. Stilling wells (i.e., above ground wells) were deployed in the intact and altered wetlands to monitor surface water level fluctuations while water table wells were drilled below drainage ditches to a depth of 1 m to monitor shallow groundwater fluctuations. All stilling wells and water table wells were equipped with capacitance water level loggers to monitor fluctuations in surface water and shallow groundwater every 15 minutes. In 2013 (normal year) and 2014 (wet year), 15+ precipitation events were identified and scatter plots of wetland (x-axis) versus stream (y-axis) stage were built to identify W-S hysteretic dynamics. Initial data analysis reveals that in dry antecedent conditions, intact and altered wetlands show clockwise W-S relations, while drained wetlands show anticlockwise W-S hysteresis. However, in wetter antecedent conditions, all wetland types show anticlockwise hysteresis. Future analysis will target the identification of thresholds in antecedent moisture conditions that determine significant changes in event wetland response characteristics (e.g., the delay between the start of rainfall and stream stage, the maximum water level rise in each wetland during each event, the delay between the start of rainfall and peak wetland stage) as well as hysteresis properties (e.g., gradient and area of the hysteresis loop).

  11. Improved Tensor-Based Singular Spectrum Analysis Based on Single Channel Blind Source Separation Algorithm and Its Application to Fault Diagnosis

    Directory of Open Access Journals (Sweden)

    Dan Yang

    2017-04-01

    Full Text Available To solve the problem of multi-fault blind source separation (BSS) in the case where the observed signals are under-determined, a novel approach for single channel blind source separation (SCBSS) based on an improved tensor-based singular spectrum analysis (TSSA) is proposed. As the most natural representation of high-dimensional data, a tensor can preserve the intrinsic structure of the data to the maximum extent. Thus, the TSSA method can be employed to extract the multi-fault features from the measured single-channel vibration signal. However, SCBSS based on TSSA still has some limitations, mainly the unsatisfactory convergence of TSSA in many cases and the difficulty of accurately estimating the number of source signals. Therefore, an improved TSSA algorithm based on canonical decomposition and parallel factors (CANDECOMP/PARAFAC) weighted optimization, namely CP-WOPT, is proposed in this paper. The CP-WOPT algorithm processes the factor matrix using a first-order optimization approach instead of the original least-squares method in TSSA, so as to improve the convergence of the algorithm. In order to accurately estimate the number of source signals in BSS, the EMD-SVD-BIC (empirical mode decomposition - singular value decomposition - Bayesian information criterion) method, instead of the SVD in the conventional TSSA, is introduced. To validate the proposed method, we applied it to the analysis of a numerical simulation signal and multi-fault rolling bearing signals.
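
    As a rough sketch of the tensor step described above, the snippet below folds a single-channel signal into a third-order tensor and computes a CP (CANDECOMP/PARAFAC) decomposition. tensorly's parafac is used as a stand-in for the CP-WOPT algorithm of the paper, and the folding dimensions and rank are arbitrary illustrative choices.

```python
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

x = np.random.randn(8 * 10 * 12)       # placeholder for the measured signal
X = tl.tensor(x.reshape(8, 10, 12))    # fold into a 3rd-order tensor

# Rank would correspond to the estimated number of sources (e.g. from
# EMD-SVD-BIC); here it is fixed at 3 for illustration.
weights, factors = parafac(X, rank=3)
print([f.shape for f in factors])      # one factor matrix per tensor mode
```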

  12. Advanced event reweighting using multivariate analysis

    International Nuclear Information System (INIS)

    Martschei, D; Feindt, M; Honc, S; Wagner-Kuhr, J

    2012-01-01

    Multivariate analysis (MVA) methods, especially discrimination techniques such as neural networks, are key ingredients in modern data analysis and play an important role in high energy physics. They are usually trained on simulated Monte Carlo (MC) samples to discriminate so-called 'signal' from 'background' events and are then applied to data to select real events of signal type. We here address procedures that improve this workflow. The first is the enhancement of data/MC agreement by reweighting MC samples on a per-event basis. Then training MVAs on real data using the sPlot technique will be discussed. Finally we address the construction of MVAs whose discriminator is independent of a certain control variable, i.e., cuts on this variable will not change the discriminator shape.
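
    A simplified sketch of per-event reweighting follows: derive bin-wise data/MC ratios in one control variable and assign each MC event the ratio of its bin. The control variable and both samples are synthetic stand-ins; the paper's actual procedure is more elaborate.

```python
import numpy as np

data = np.random.normal(0.1, 1.0, 50000)   # stand-in for a data control variable
mc   = np.random.normal(0.0, 1.0, 80000)   # stand-in for the MC sample

bins = np.linspace(-4, 4, 41)
d_counts, _ = np.histogram(data, bins=bins, density=True)
m_counts, _ = np.histogram(mc, bins=bins, density=True)

# Bin-wise data/MC ratio; bins with no MC entries default to weight 1.
ratio = np.divide(d_counts, m_counts, out=np.ones_like(d_counts),
                  where=m_counts > 0)

idx = np.clip(np.digitize(mc, bins) - 1, 0, len(ratio) - 1)
weights = ratio[idx]                       # one weight per MC event
print(f"mean MC weight = {weights.mean():.3f}")
```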

  13. Single Trial Classification of Evoked EEG Signals Due to RGB Colors

    Directory of Open Access Journals (Sweden)

    Eman Alharbi

    2016-03-01

    Full Text Available Recently, the impact of colors on brain signals has become one of the leading research topics in BCI systems. These studies are based on analyzing brain behavior after a color stimulus and finding a way to classify its signals offline, without considering real time. Moving to the next step, we present a real-time (online) classification model for EEG signals evoked by RGB color stimuli, which has not been presented in previous studies. In this research, EEG signals were recorded from 7 subjects through the BCI2000 toolbox. The Empirical Mode Decomposition (EMD) technique was used at the signal analysis stage. Various feature extraction methods were investigated to find the best and most reliable set, including event-related spectral perturbations (ERSP), target mean with Fast Fourier Transform (FFT), Wavelet Packet Decomposition (WPD), the Auto Regressive model (AR) and the EMD residual. A new feature selection method was created based on the peak time of the EEG signal when red and blue color stimuli are presented. The ERP image was used to find the peak time, which was around 300 ms for the red color and around 450 ms for the blue color. The classification was performed using the Support Vector Machine (SVM) classifier, the LIBSVM toolbox being used for that purpose. The EMD residual was found to be the most reliable method, giving the highest classification accuracy with an average of 88.5% and an execution time of only 14 seconds.

  14. Procedure for conducting probabilistic safety assessment: level 1 full power internal event analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dae; Lee, Y. H.; Hwang, M. J. [and others

    2003-07-01

    This report provides guidance on conducting a Level I PSA for internal events in NPPs, based on the method and procedure that were used in the PSA for the design of the Korea Standard Nuclear Plants (KSNPs). The purpose of a Level I PSA is to delineate the accident sequences leading to core damage and to estimate their frequencies. It has been directly used for assessing and modifying system safety and reliability as a key and foundational part of PSA. Level I PSA also provides insights into design weaknesses and into ways of preventing core damage, which in most cases is the precursor to major accidents. Level I PSA has therefore been used as the essential technical basis for risk-informed applications in NPPs. The report consists of six major procedural steps for Level I PSA: plant familiarization, initiating event analysis, event tree analysis, system fault tree analysis, reliability data analysis, and accident sequence quantification. The report is intended to assist technical persons performing Level I PSA for NPPs. A particular aim is to promote a standardized framework, terminology and form of documentation for PSAs. This report would also be useful for managers or regulatory personnel involved in risk-informed regulation, and for conducting PSA in other industries.
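
    To make the final step concrete, here is a tiny worked sketch of accident sequence quantification under the rare-event approximation: the top event probability is approximated by the sum of the minimal cut set probabilities. All event names and numbers are hypothetical, not taken from the report.

```python
# Hypothetical basic events and probabilities (illustrative only).
basic_events = {
    "IE_LOSP": 1e-2,     # initiating event: loss of offsite power
    "DG_A_FAILS": 3e-2,  # diesel generator A fails to start
    "DG_B_FAILS": 3e-2,  # diesel generator B fails to start
    "OP_ERROR": 1e-3,    # operator fails to recover power
}

minimal_cut_sets = [
    ("IE_LOSP", "DG_A_FAILS", "DG_B_FAILS"),
    ("IE_LOSP", "OP_ERROR"),
]

def cut_set_probability(cs):
    p = 1.0
    for ev in cs:
        p *= basic_events[ev]
    return p

# Rare-event approximation: P(top) ~ sum of cut set probabilities.
p_top = sum(cut_set_probability(cs) for cs in minimal_cut_sets)
print(f"core damage frequency estimate: {p_top:.2e}")
```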

  15. Study on Apparent Kinetic Prediction Model of the Smelting Reduction Based on the Time-Series

    Directory of Open Access Journals (Sweden)

    Guo-feng Fan

    2012-01-01

    Full Text Available A series of direct smelting reduction experiments was carried out on high-phosphorus iron ore of different basicities using a thermogravimetric analyzer. The derivative thermogravimetric (DTG) data were obtained from the experiments. The one-step-forward local weighted linear (LWL) method, one of the most suitable chaotic time-series prediction methods focusing on the errors, is used to predict the DTG. Meanwhile, empirical mode decomposition-autoregressive (EMD-AR) modelling, a data mining technique in signal processing, is also used to predict the DTG. The results show that (1) EMD-AR(4) is the most appropriate model and its error is smaller than that of the former; (2) the root mean square error (RMSE) decreased by about two-thirds; (3) the standardized root mean square error (NMSE) decreased by an order of magnitude. Finally, the EMD-AR method is improved by golden-section weighting, making its error smaller still. The improved EMD-AR model is therefore a promising alternative for predicting the apparent reaction rate (DTG). These analytical results are an important reference in the field of industrial control.
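
    A minimal sketch of the EMD-AR idea, assuming the PyEMD and statsmodels packages: decompose the series with EMD, fit an AR(4) model to each component, and sum the one-step-ahead forecasts. The data here are synthetic, and the golden-section weighting refinement of the paper is omitted.

```python
import numpy as np
from PyEMD import EMD
from statsmodels.tsa.ar_model import AutoReg

t = np.arange(500)
series = np.sin(0.05 * t) + 0.3 * np.sin(0.4 * t) + 0.1 * np.random.randn(500)

components = EMD().emd(series)   # IMFs plus residual trend

forecast = 0.0
for comp in components:
    model = AutoReg(comp, lags=4).fit()          # AR(4) per component
    forecast += model.forecast(steps=1)[0]       # one-step-ahead prediction

print(f"one-step EMD-AR(4) forecast: {forecast:.4f}")
```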

  16. RETRIEVAL EVENTS EVALUATION

    International Nuclear Information System (INIS)

    Wilson, T.

    1999-01-01

    The purpose of this analysis is to evaluate impacts to the retrieval concept presented in the Design Analysis "Retrieval Equipment and Strategy" (Reference 6) from abnormal events based on Design Basis Events (DBE) and Beyond Design Basis Events (BDBE) as defined in two recent analyses: (1) DBE/Scenario Analysis for Preclosure Repository Subsurface Facilities (Reference 4); and (2) Preliminary Preclosure Design Basis Event Calculations for the Monitored Geologic Repository (Reference 5). The objective of this task is to determine what impacts the DBEs and BDBEs have on the equipment developed for retrieval. The analysis lists potential impacts and recommends changes to be analyzed in subsequent design analyses for the developed equipment, or recommends where additional equipment may be needed, to allow retrieval to be performed in all DBE or BDBE situations. This analysis supports License Application design and therefore complies with the requirements of the Systems Description Document input criteria comparison as presented in Section 7, Conclusions. In addition, the analysis discusses the impacts associated with not using concrete inverts in the emplacement drifts; the "Retrieval Equipment and Strategy" analysis was based on a concrete invert configuration in the emplacement drift. The scope of the analysis, as presented in "Development Plan for Retrieval Events Evaluation" (Reference 3), includes evaluation of, and criteria for, the following: impacts to retrieval from the emplacement drift based on DBE/BDBEs, and changes to the invert configuration, for the preclosure period; and impacts to retrieval from the main drifts based on DBE/BDBEs for the preclosure period.

  17. Using Web Crawler Technology for Geo-Events Analysis: A Case Study of the Huangyan Island Incident

    Directory of Open Access Journals (Sweden)

    Hao Hu

    2014-04-01

    Full Text Available Social networking and network socialization bring abundant text information and social relationships into our daily lives. Making full use of these data in the big data era is of great significance for better understanding the changing world and the information-based society. Though politics has been integrally involved in hyperlinked world issues since the 1990s, the text analysis and data visualization of geo-events long faced the bottleneck of traditional manual analysis. Though the automatic assembly of different geospatial web services and distributed geospatial information systems utilizing service chaining has been explored and built recently, data mining and information collection are not comprehensive enough because of the sensitivity, complexity, relativity, timeliness, and unexpected characteristics of political events. Based on the Heritrix framework and the analysis of web-based text, the word frequency, sentiment tendency, and dissemination path of the Huangyan Island incident were studied using web crawler technology and text analysis. The results indicate that the tag cloud, frequency map, attitude pie, individual mention ratios, and dissemination flow graph, based on the crawled information and data processing, not only highlight the characteristics of the geo-event itself, but also reveal many interesting phenomena and deep-seated problems behind it, such as related topics, theme vocabularies, subject contents, hot countries, event bodies, opinion leaders, high-frequency vocabularies, information sources, semantic structure, propagation paths, distribution of different attitudes, and regional differences in net citizens' responses to the Huangyan Island incident. Furthermore, the text analysis of network information with the help of a focused web crawler is able to express the time-space relationship of the crawled information and the information characteristics of the semantic network of geo-events. Therefore, it is a useful tool to...

  18. Surface Management System Departure Event Data Analysis

    Science.gov (United States)

    Monroe, Gilena A.

    2010-01-01

    This paper presents a data analysis of the Surface Management System (SMS) performance for departure events, including push-back and runway departure events. The paper focuses on the detection performance, or the ability to detect departure events, as well as the prediction performance of SMS. The results detail a modest overall detection performance for push-back events and a significantly high overall detection performance for runway departure events. The overall detection performance of SMS for push-back events is approximately 55%. The overall detection performance of SMS for runway departure events nears 100%. This paper also presents the overall SMS prediction performance for runway departure events, as well as the timeliness of the Aircraft Situation Display for Industry data source for SMS predictions.

  19. Second-order analysis of semiparametric recurrent event processes.

    Science.gov (United States)

    Guan, Yongtao

    2011-09-01

    A typical recurrent event dataset consists of an often large number of recurrent event processes, each of which contains multiple event times observed from an individual during a follow-up period. Such data have become increasingly available in medical and epidemiological studies. In this article, we introduce novel procedures to conduct second-order analysis for a flexible class of semiparametric recurrent event processes. Such an analysis can provide useful information regarding the dependence structure within each recurrent event process. Specifically, we will use the proposed procedures to test whether the individual recurrent event processes are all Poisson processes and to suggest sensible alternative models for them if they are not. We apply these procedures to a well-known recurrent event dataset on chronic granulomatous disease and an epidemiological dataset on meningococcal disease cases in Merseyside, United Kingdom to illustrate their practical value. © 2011, The International Biometric Society.

  20. Regression analysis of mixed panel count data with dependent terminal events.

    Science.gov (United States)

    Yu, Guanglei; Zhu, Liang; Li, Yang; Sun, Jianguo; Robison, Leslie L

    2017-05-10

    Event history studies are commonly conducted in many fields, and a great deal of literature has been established for the analysis of the two types of data commonly arising from these studies: recurrent event data and panel count data. The former arises if all study subjects are followed continuously, while the latter means that each study subject is observed only at discrete time points. In reality, a third type of data, a mixture of the two types described earlier, may occur; furthermore, as with the first two types of data, there may exist a dependent terminal event, which may preclude the occurrence of the recurrent events of interest. This paper discusses regression analysis of mixed recurrent event and panel count data in the presence of a terminal event, and an estimating equation-based approach is proposed for estimation of the regression parameters of interest. In addition, the asymptotic properties of the proposed estimator are established, and a simulation study conducted to assess the finite-sample performance of the proposed method suggests that it works well in practical situations. Finally, the methodology is applied to the childhood cancer study that motivated this work. Copyright © 2017 John Wiley & Sons, Ltd.

  1. Root cause analysis of relevant events

    International Nuclear Information System (INIS)

    Perez, Silvia S.; Vidal, Patricia G.

    2000-01-01

    During 1998 the research work followed more specific guidelines, which entailed focusing exclusively on the two selected methods (ASSET and HPIP) and incorporating some additional human behaviour elements based on the documents of reference. Once resident inspectors were incorporated in the project (and trained accordingly), events occurring in Argentine nuclear power plants were analysed, all of them from the Atucha I and Embalse nuclear power plants. The analysis concluded that the systematic methodology used also allows the investigation of minor events that were precursors of the selected events. (author)

  2. Fault diagnosis of rotating machinery using an improved HHT based on EEMD and sensitive IMFs

    International Nuclear Information System (INIS)

    Lei, Yaguo; Zuo, Ming J

    2009-01-01

    A Hilbert–Huang transform (HHT) is a time–frequency technique and has been widely applied to analyzing vibration signals in the field of fault diagnosis of rotating machinery. It analyzes the vibration signals using intrinsic mode functions (IMFs) extracted using empirical mode decomposition (EMD). However, EMD sometimes cannot reveal the signal characteristics accurately because of the problem of mode mixing. Ensemble empirical mode decomposition (EEMD) was developed recently to alleviate this problem. The IMFs generated by EEMD have different sensitivity to faults. Some IMFs are sensitive and closely related to the faults but others are irrelevant. To enhance the accuracy of the HHT in fault diagnosis of rotating machinery, an improved HHT based on EEMD and sensitive IMFs is proposed in this paper. Simulated signals demonstrate the effectiveness of the improved HHT in diagnosing the faults of rotating machinery. Finally, the improved HHT is applied to diagnosing an early rub-impact fault of a heavy oil catalytic cracking machine set, and the application results prove that the improved HHT is superior to the HHT based on all IMFs of EMD.
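
    The sketch below illustrates the EEMD-plus-sensitive-IMF idea, assuming the PyEMD package. As the sensitivity screen it uses correlation of each IMF with the raw signal, which is a common selection heuristic; the paper's exact sensitivity measure may differ, and the signal is synthetic.

```python
import numpy as np
from PyEMD import EEMD

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 4000)
signal = np.sin(2 * np.pi * 30 * t) + 0.4 * rng.standard_normal(t.size)

# Ensemble size and added-noise level are illustrative choices.
eemd = EEMD(trials=100, noise_width=0.2)
imfs = eemd.eemd(signal, t)

corr = np.array([abs(np.corrcoef(imf, signal)[0, 1]) for imf in imfs])
sensitive = imfs[corr > corr.mean()]    # keep IMFs above the mean correlation
print(f"{len(imfs)} IMFs, {len(sensitive)} kept as fault-sensitive")
```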

  3. Low-Pass Filtering Approach via Empirical Mode Decomposition Improves Short-Scale Entropy-Based Complexity Estimation of QT Interval Variability in Long QT Syndrome Type 1 Patients

    Directory of Open Access Journals (Sweden)

    Vlasta Bari

    2014-09-01

    Full Text Available Entropy-based complexity of cardiovascular variability at short time scales is largely dependent on the noise and/or action of neural circuits operating at high frequencies. This study proposes a technique for canceling fast variations from cardiovascular variability, thus limiting the effect of these overwhelming influences on entropy-based complexity. The low-pass filtering approach is based on the computation of the fastest intrinsic mode function via empirical mode decomposition (EMD and its subtraction from the original variability. Sample entropy was exploited to estimate complexity. The procedure was applied to heart period (HP and QT (interval from Q-wave onset to T-wave end variability derived from 24-hour Holter recordings in 14 non-mutation carriers (NMCs and 34 mutation carriers (MCs subdivided into 11 asymptomatic MCs (AMCs and 23 symptomatic MCs (SMCs. All individuals belonged to the same family developing long QT syndrome type 1 (LQT1 via KCNQ1-A341V mutation. We found that complexity indexes computed over EMD-filtered QT variability differentiated AMCs from NMCs and detected the effect of beta-blocker therapy, while complexity indexes calculated over EMD-filtered HP variability separated AMCs from SMCs. The EMD-based filtering method enhanced features of the cardiovascular control that otherwise would have remained hidden by the dominant presence of noise and/or fast physiological variations, thus improving classification in LQT1.
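
    A minimal sketch of the filtering idea above, assuming the PyEMD package: subtract the fastest IMF from the beat-to-beat series, then compute sample entropy on the filtered series. The embedding dimension m=2 and tolerance r=0.2*std are conventional choices rather than necessarily those of the paper, and the QT series is a synthetic stand-in.

```python
import numpy as np
from PyEMD import EMD

def sample_entropy(x, m=2, r_frac=0.2):
    """Naive sample entropy: -log of the ratio of (m+1)- to m-length matches."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()
    def matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        d = np.abs(templates[:, None] - templates[None, :]).max(axis=2)
        return (d <= r).sum() - len(templates)   # exclude self-matches
    return -np.log(matches(m + 1) / matches(m))

qt = np.cumsum(np.random.randn(600)) * 0.01 + 0.40   # stand-in QT series (s)
imfs = EMD().emd(qt)
qt_filtered = qt - imfs[0]        # remove the fastest oscillatory mode
print(f"SampEn raw = {sample_entropy(qt):.3f}, "
      f"filtered = {sample_entropy(qt_filtered):.3f}")
```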

  4. An adverse events potential costs analysis based on Drug Programs in Poland. Dermatology focus

    Directory of Open Access Journals (Sweden)

    Szkultecka-Debek Monika

    2014-09-01

    Full Text Available The aim of the project, carried out within the Polish Society for Pharmacoeconomics (PTFE), was to estimate the potential costs of treatment of the side effects which (theoretically) may occur as a result of treatments for the selected diseases. This paper deals solely with dermatology-related events. Herein, several Drug Programs financed by the National Health Fund in Poland in 2012 were analyzed. The adverse events were selected based on the Summary of Product Characteristics of the chosen products. We focused the project on those potential adverse events which were defined in the SPC as frequent and very frequent. The results are presented according to their therapeutic areas, and this paper focuses upon those related to dermatology. The events described as 'very common' had an incidence of ≥ 1/10, and those described as 'common' an incidence of ≥ 1/100 and < 1/10. In order to identify the resources used, we performed a survey with the engagement of clinical experts. In our work, we employed only the total direct costs incurred by the public payer, based on individual cost data valid in February 2014. Moreover, we calculated the total spending from the public payer's perspective as well as the patient's perspective, and the percentage of each component of the total cost in detail. The paper thus informs the reader of the estimated costs of treatment of side effects related to dermatologic symptoms and reactions. Based on our work, we can state that the treatment of skin adverse drug reactions generates a significant cost, one incurred by both the public payer and the patient.

  5. Rule-Based Event Processing and Reaction Rules

    Science.gov (United States)

    Paschke, Adrian; Kozlenkov, Alexander

    Reaction rules and event processing technologies play a key role in making business and IT / Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real-time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In the last decades various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.

  6. Use of PSA for the analysis of operational events in nuclear power plants

    International Nuclear Information System (INIS)

    Hulsmans, M.

    2006-01-01

    An operational event is a safety-relevant incident that occurs in an industrial installation such as a nuclear power plant (NPP). The probabilistic approach to event analysis focuses on the potential consequences of an operational event. Within its scope of application, it provides a quantitative assessment of the risk significance of this event (and similar events): it calculates the risk increase induced by the event. Such analyses may result in a more objective and more accurate event severity measure than those provided by commonly used qualitative methods. Probabilistic event analysis complements the traditional event analysis approaches that are oriented towards understanding the (root) causes of an event. In practice, risk-based precursor analysis consists of mapping an operational event onto a risk model of the installation, such as a probabilistic safety analysis (PSA) model. Precursor analyses result in an objective risk ranking of safety-significant events, called accident precursors. An unexpectedly high (or low) risk increase value is in itself already an important finding. The assessment also yields a lot of information on the structure of the risk, since the underlying dominant factors can easily be determined. Relevant 'what if' studies on similar events and conditions can be identified and performed (which is generally not considered in conventional event analysis), with the potential to yield even broader findings. The findings of such a structured assessment can be used for purposes other than risk ranking. The operational experience feedback process can be improved by identifying design measures and operational practices that prevent recurrence or mitigate future consequences, and even by evaluating their expected effectiveness, contributing to the validation and prioritization of corrective measures. Confirmed and re-occurring precursors with correlated characteristics may point out opportunities
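
    A worked toy example of the risk-increase measure described above: the conditional core damage probability (CCDP) over the event duration, with the degraded equipment assumed failed, minus the nominal probability over the same exposure time. All numbers are invented for illustration.

```python
BASE_CDF = 2e-5            # nominal core damage frequency (per year), assumed
COND_CDF = 8e-4            # CDF recomputed with the degraded component failed
DURATION = 72 / 8760.0     # event exposure time: 72 hours, in years

ccdp = COND_CDF * DURATION
nominal = BASE_CDF * DURATION
risk_increase = ccdp - nominal
print(f"CCDP = {ccdp:.2e}, risk increase = {risk_increase:.2e}")
```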

  7. Top event prevention analysis: A deterministic use of PRA

    International Nuclear Information System (INIS)

    Worrell, R.B.; Blanchard, D.P.

    1996-01-01

    This paper describes the application of Top Event Prevention Analysis. The analysis finds prevention sets, which are combinations of basic events that can prevent the occurrence of a fault tree top event such as core damage. The problem analyzed in this application is that of choosing a subset of Motor-Operated Valves (MOVs) for testing under the Generic Letter 89-10 program such that the desired level of safety is achieved while providing economic relief from the burden of testing all safety-related valves. A brief summary of the method is given, and the process used to produce a core damage expression from Level 1 PRA models for a PWR is described. The analysis provides an alternative to the use of importance measures for finding the important combinations of events in a core damage expression. This application of Top Event Prevention Analysis to the MOV problem was achieved with currently available software.
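
    To illustrate the prevention-set idea: a prevention set intersects ("hits") every minimal cut set, so preventing its events prevents the top event. The sketch below uses a greedy hitting-set heuristic as a stand-in for the exact Boolean method of the paper, with hypothetical MOV failure cut sets.

```python
# Hypothetical minimal cut sets over MOV failure events.
cut_sets = [
    {"MOV_1_FTC", "MOV_2_FTC"},
    {"MOV_2_FTC", "MOV_3_FTC"},
    {"MOV_1_FTC", "MOV_4_FTC"},
]

prevention, remaining = set(), [set(cs) for cs in cut_sets]
while remaining:
    # Greedy step: pick the event appearing in the most uncovered cut sets.
    events = {e for cs in remaining for e in cs}
    best = max(events, key=lambda e: sum(e in cs for cs in remaining))
    prevention.add(best)
    remaining = [cs for cs in remaining if best not in cs]

print(f"prevention set (valves to test): {sorted(prevention)}")
```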

  8. Detection of Abnormal Events via Optical Flow Feature Analysis

    Directory of Open Access Journals (Sweden)

    Tian Wang

    2015-03-01

    Full Text Available In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of the optical flow orientation descriptor and a classification method. The details of the histogram of the optical flow orientation descriptor are illustrated for describing the movement information of the global video frame or the foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The different abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm.

  9. Detection of Abnormal Events via Optical Flow Feature Analysis

    Science.gov (United States)

    Wang, Tian; Snoussi, Hichem

    2015-01-01

    In this paper, a novel algorithm is proposed to detect abnormal events in video streams. The algorithm is based on the histogram of the optical flow orientation descriptor and a classification method. The details of the histogram of the optical flow orientation descriptor are illustrated for describing the movement information of the global video frame or the foreground frame. By combining one-class support vector machine and kernel principal component analysis methods, the abnormal events in the current frame can be detected after a learning period characterizing normal behaviors. The different abnormal detection results are analyzed and explained. The proposed detection method is tested on benchmark datasets, and the experimental results show the effectiveness of the algorithm. PMID:25811227
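
    A simplified sketch of the pipeline described above: an orientation histogram of optical-flow vectors per frame, kernel PCA for dimension reduction, and a one-class SVM trained on normal frames only. The flow fields below are synthetic stand-ins for real optical-flow output, and all parameter values are illustrative.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.svm import OneClassSVM

def flow_orientation_histogram(u, v, n_bins=16):
    angles = np.arctan2(v, u).ravel()            # flow direction per pixel
    mags = np.hypot(u, v).ravel()                # magnitude-weighted votes
    hist, _ = np.histogram(angles, bins=n_bins,
                           range=(-np.pi, np.pi), weights=mags)
    return hist / (hist.sum() + 1e-12)

rng = np.random.default_rng(1)
normal = [flow_orientation_histogram(rng.normal(1, .1, (48, 64)),
                                     rng.normal(0, .1, (48, 64)))
          for _ in range(200)]                   # coherent rightward motion
odd = flow_orientation_histogram(rng.normal(0, 1, (48, 64)),
                                 rng.normal(0, 1, (48, 64)))  # chaotic motion

kpca = KernelPCA(n_components=8, kernel="rbf").fit(normal)
clf = OneClassSVM(nu=0.05, gamma="scale").fit(kpca.transform(normal))
print("abnormal frame flagged:", clf.predict(kpca.transform([odd]))[0] == -1)
```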

  10. Optimal Parameter Selection for Support Vector Machine Based on Artificial Bee Colony Algorithm: A Case Study of Grid-Connected PV System Power Prediction.

    Science.gov (United States)

    Gao, Xiang-Ming; Yang, Shi-Feng; Pan, San-Bo

    2017-01-01

    To predict the output power of a photovoltaic system, which exhibits nonstationarity and randomness, an output power prediction model for grid-connected PV systems is proposed based on empirical mode decomposition (EMD) and a support vector machine (SVM) optimized with an artificial bee colony (ABC) algorithm. First, according to the weather forecast data sets on the prediction date, the time series data of output power on a similar day with 15-minute intervals are built. Second, the time series data of the output power are decomposed into a series of components, including some intrinsic mode components IMFn and a trend component Res, at different scales using EMD. The corresponding SVM prediction model is established for each IMF component and the trend component, and the SVM model parameters are optimized with the artificial bee colony algorithm. Finally, the prediction results of each model are reconstructed, and the predicted values of the output power of the grid-connected PV system can be obtained. The prediction model is tested with actual data, and the results show that the power prediction model based on the EMD and ABC-SVM has a faster calculation speed and higher prediction accuracy than either the single SVM prediction model or the EMD-SVM prediction model without optimization.

  11. Optimal Parameter Selection for Support Vector Machine Based on Artificial Bee Colony Algorithm: A Case Study of Grid-Connected PV System Power Prediction

    Directory of Open Access Journals (Sweden)

    Xiang-ming Gao

    2017-01-01

    Full Text Available To predict the output power of a photovoltaic system, which exhibits nonstationarity and randomness, an output power prediction model for grid-connected PV systems is proposed based on empirical mode decomposition (EMD) and a support vector machine (SVM) optimized with an artificial bee colony (ABC) algorithm. First, according to the weather forecast data sets on the prediction date, the time series data of output power on a similar day with 15-minute intervals are built. Second, the time series data of the output power are decomposed into a series of components, including some intrinsic mode components IMFn and a trend component Res, at different scales using EMD. The corresponding SVM prediction model is established for each IMF component and the trend component, and the SVM model parameters are optimized with the artificial bee colony algorithm. Finally, the prediction results of each model are reconstructed, and the predicted values of the output power of the grid-connected PV system can be obtained. The prediction model is tested with actual data, and the results show that the power prediction model based on the EMD and ABC-SVM has a faster calculation speed and higher prediction accuracy than either the single SVM prediction model or the EMD-SVM prediction model without optimization.
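
    The decompose-predict-reconstruct structure of the model above can be sketched as follows, assuming the PyEMD and scikit-learn packages. A plain grid search stands in for the artificial bee colony optimization of the SVM hyperparameters, and the power series is synthetic.

```python
import numpy as np
from PyEMD import EMD
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVR

N_LAGS = 8  # illustrative number of lagged samples used as features

def lagged(x, n_lags=N_LAGS):
    X = np.array([x[i:i + n_lags] for i in range(len(x) - n_lags)])
    return X, x[n_lags:]

t = np.arange(400)
power = np.clip(np.sin(2 * np.pi * t / 96), 0, None) + 0.05 * np.random.randn(400)

forecast = 0.0
for comp in EMD().emd(power):          # IMF components plus trend
    X, y = lagged(comp)
    grid = GridSearchCV(SVR(), {"C": [1, 10, 100], "gamma": ["scale", 0.1]}, cv=3)
    grid.fit(X, y)                     # grid search stands in for ABC
    forecast += grid.predict(comp[-N_LAGS:].reshape(1, -1))[0]

print(f"predicted next 15-minute output: {forecast:.3f}")
```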

  12. A wavelet-based technique to predict treatment outcome for Major Depressive Disorder

    Science.gov (United States)

    Xia, Likun; Mohd Yasin, Mohd Azhar; Azhar Ali, Syed Saad

    2017-01-01

    Treatment management for Major Depressive Disorder (MDD) has been challenging. However, electroencephalogram (EEG)-based predictions of antidepressant treatment outcome may help during antidepressant selection and ultimately improve the quality of life of MDD patients. In this study, a machine learning (ML) method involving pretreatment EEG data was proposed to perform such predictions for Selective Serotonin Reuptake Inhibitors (SSRIs). For this purpose, the acquisition of experimental data involved 34 MDD patients and 30 healthy controls. A feature matrix was constructed involving time-frequency decomposition of the EEG data based on wavelet transform (WT) analysis, termed the EEG data matrix. However, the resultant EEG data matrix had high dimensionality. Therefore, dimension reduction was performed with a rank-based feature selection method according to a criterion, i.e., the receiver operating characteristic (ROC). As a result, the most significant features were identified and further utilized during the training and testing of a classification model, i.e., the logistic regression (LR) classifier. Finally, the LR model was validated with 100 iterations of 10-fold cross-validation (10-CV). The classification results were compared with short-time Fourier transform (STFT) analysis and empirical mode decomposition (EMD). The wavelet features extracted from frontal and temporal EEG data were found statistically significant. In comparison with other time-frequency approaches such as the STFT and EMD, the WT analysis showed the highest classification accuracy, i.e., accuracy = 87.5%, sensitivity = 95%, and specificity = 80%. In conclusion, significant wavelet coefficients extracted from frontal and temporal pre-treatment EEG data involving the delta and theta frequency bands may predict antidepressant treatment outcome for MDD patients. PMID:28152063

  13. A wavelet-based technique to predict treatment outcome for Major Depressive Disorder.

    Science.gov (United States)

    Mumtaz, Wajid; Xia, Likun; Mohd Yasin, Mohd Azhar; Azhar Ali, Syed Saad; Malik, Aamir Saeed

    2017-01-01

    Treatment management for Major Depressive Disorder (MDD) has been challenging. However, electroencephalogram (EEG)-based predictions of antidepressant treatment outcome may help during antidepressant selection and ultimately improve the quality of life of MDD patients. In this study, a machine learning (ML) method involving pretreatment EEG data was proposed to perform such predictions for Selective Serotonin Reuptake Inhibitors (SSRIs). For this purpose, the acquisition of experimental data involved 34 MDD patients and 30 healthy controls. A feature matrix was constructed involving time-frequency decomposition of the EEG data based on wavelet transform (WT) analysis, termed the EEG data matrix. However, the resultant EEG data matrix had high dimensionality. Therefore, dimension reduction was performed with a rank-based feature selection method according to a criterion, i.e., the receiver operating characteristic (ROC). As a result, the most significant features were identified and further utilized during the training and testing of a classification model, i.e., the logistic regression (LR) classifier. Finally, the LR model was validated with 100 iterations of 10-fold cross-validation (10-CV). The classification results were compared with short-time Fourier transform (STFT) analysis and empirical mode decomposition (EMD). The wavelet features extracted from frontal and temporal EEG data were found statistically significant. In comparison with other time-frequency approaches such as the STFT and EMD, the WT analysis showed the highest classification accuracy, i.e., accuracy = 87.5%, sensitivity = 95%, and specificity = 80%. In conclusion, significant wavelet coefficients extracted from frontal and temporal pre-treatment EEG data involving the delta and theta frequency bands may predict antidepressant treatment outcome for MDD patients.
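
    A simplified sketch of this pipeline, assuming the pywt and scikit-learn packages: wavelet decomposition of each EEG epoch, band-energy features, ranking by single-feature ROC AUC, then a logistic regression validated with 10-fold cross-validation. The EEG epochs and group labels are simulated, and the feature set is far smaller than in the study.

```python
import numpy as np
import pywt
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)

def wavelet_energies(epoch, wavelet="db4", level=5):
    coeffs = pywt.wavedec(epoch, wavelet, level=level)
    return np.array([np.sum(c ** 2) for c in coeffs])   # energy per band

# 64 subjects x 256-sample epochs; "responders" get extra slow-wave power.
y = np.array([0] * 32 + [1] * 32)
epochs = [rng.standard_normal(256)
          + (0.5 * np.sin(np.arange(256) * 0.1) if label else 0)
          for label in y]
X = np.array([wavelet_energies(e) for e in epochs])

# Rank features by how far their single-feature ROC AUC sits from chance.
auc = np.array([roc_auc_score(y, X[:, j]) for j in range(X.shape[1])])
top = np.argsort(np.abs(auc - 0.5))[::-1][:3]

scores = cross_val_score(LogisticRegression(max_iter=1000), X[:, top], y, cv=10)
print(f"10-fold CV accuracy: {scores.mean():.2f}")
```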

  14. Color Multifocus Image Fusion Using Empirical Mode Decomposition

    Directory of Open Access Journals (Sweden)

    S. Savić

    2013-11-01

    Full Text Available In this paper, a recently proposed grayscale multifocus image fusion method based on the first level of Empirical Mode Decomposition (EMD) has been extended to color images. In addition, this paper deals with low-contrast multifocus image fusion. The major advantages of the proposed methods are simplicity, absence of artifacts and control of contrast, which is not the case with other pyramidal multifocus fusion methods. The efficiency of the proposed method is tested subjectively and with a vector-gradient-based objective measure that is proposed in this paper for multifocus color image fusion. Subjective analysis performed on a multifocus image dataset has shown its superiority to the existing EMD- and DWT-based methods. The objective measures of grayscale and color image fusion show significantly better scores for this method than for the classic complex-EMD fusion method.

  15. Equivalence Testing of Complex Particle Size Distribution Profiles Based on Earth Mover's Distance.

    Science.gov (United States)

    Hu, Meng; Jiang, Xiaohui; Absar, Mohammad; Choi, Stephanie; Kozak, Darby; Shen, Meiyu; Weng, Yu-Ting; Zhao, Liang; Lionberger, Robert

    2018-04-12

    Particle size distribution (PSD) is an important property of particulates in drug products. In the evaluation of generic drug products formulated as suspensions, emulsions, and liposomes, PSD comparisons between a test product and the branded product can provide useful information regarding in vitro and in vivo performance. Historically, the FDA has recommended the population bioequivalence (PBE) statistical approach to compare the PSD descriptors D50 and SPAN from test and reference products to support product equivalence. In this study, the earth mover's distance (EMD) is proposed as a new metric for comparing PSDs, particularly when the PSD profile exhibits a complex distribution (e.g., multiple peaks) that is not accurately described by the D50 and SPAN descriptors. EMD is a statistical metric that measures the discrepancy (distance) between size distribution profiles without a prior assumption of the distribution. PBE is then adopted to perform a statistical test to establish equivalence based on the calculated EMD distances. Simulations show that the proposed EMD-based approach is effective in comparing test and reference profiles for equivalence testing and is superior to commonly used distance measures, e.g., Euclidean and Kolmogorov-Smirnov distances. The proposed approach was demonstrated by evaluating the equivalence of cyclosporine ophthalmic emulsion PSDs that were manufactured under different conditions. Our results show that the proposed approach can effectively pass an equivalent product (e.g., reference product against itself) and reject an inequivalent product (e.g., reference product against a negative control), thus suggesting its usefulness in supporting bioequivalence determination of a test product to the reference product when both possess multimodal PSDs.
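
    For one-dimensional size distributions, the earth mover's distance coincides with the 1-D Wasserstein distance, which SciPy provides directly. The sketch below compares two invented bimodal PSD profiles to mimic a test-versus-reference comparison; the distribution shapes and size grid are illustrative only.

```python
import numpy as np
from scipy.stats import wasserstein_distance

sizes = np.linspace(0.1, 10, 200)        # particle size grid (um), illustrative

def bimodal(m1, m2, w=0.5, s=0.4):
    pdf = (w * np.exp(-((sizes - m1) / s) ** 2)
           + (1 - w) * np.exp(-((sizes - m2) / s) ** 2))
    return pdf / pdf.sum()               # normalize to a discrete distribution

ref = bimodal(1.5, 5.0)
test = bimodal(1.6, 5.2)                 # slightly shifted peaks

emd = wasserstein_distance(sizes, sizes, u_weights=ref, v_weights=test)
print(f"EMD between test and reference PSD: {emd:.4f} um")
```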

  16. Analysis of Human's Motions Based on Local Mean Decomposition in Through-wall Radar Detection

    Science.gov (United States)

    Lu, Qi; Liu, Cai; Zeng, Zhaofa; Li, Jing; Zhang, Xuebing

    2016-04-01

    Observation of human motions through a wall is an important issue in security applications and search-and-rescue. Radar has advantages in looking through walls where other sensors give low performance or cannot be used at all. Ultrawideband (UWB) radar has high spatial resolution as a result of employing ultranarrow pulses. It can distinguish closely positioned targets and provide time-lapse information about targets. Moreover, UWB radar shows good performance in wall penetration because the inherently short pulses spread their energy over a broad frequency range. Human motions show periodic features including respiration, swinging arms and legs, and fluctuations of the torso. Detection of human targets is based on the fact that there is always periodic motion due to breathing or other body movements like walking. The radar gathers the reflections from each human body part and adds the reflections at each time sample. The periodic movements cause micro-Doppler modulation in the reflected radar signals. Time-frequency analysis methods are considered effective tools to analyze and extract the micro-Doppler effects caused by periodic movements in the reflected radar signal, such as the short-time Fourier transform (STFT), wavelet transform (WT), and Hilbert-Huang transform (HHT). The local mean decomposition (LMD), initially developed by Smith (2005), decomposes amplitude- and frequency-modulated signals into a small set of product functions (PFs), each of which is the product of an envelope signal and a frequency-modulated signal from which a time-varying instantaneous phase and instantaneous frequency can be derived. By bypassing the Hilbert transform, the LMD has no demodulation error arising from windowing effects and involves no negative frequencies without physical sense. Also, the instantaneous attributes obtained by LMD are more stable and precise than those obtained by the empirical mode decomposition (EMD) because LMD uses smoothed local

  17. Non-Linear Non Stationary Analysis of Two-Dimensional Time-Series Applied to GRACE Data, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed innovative two-dimensional (2D) empirical mode decomposition (EMD) analysis was applied to NASA's Gravity Recovery and Climate Experiment (GRACE)...

  18. Towards a Unified Understanding of Event-Related Changes in the EEG: The Firefly Model of Synchronization through Cross-Frequency Phase Modulation

    Science.gov (United States)

    Burgess, Adrian P.

    2012-01-01

    Although event-related potentials (ERPs) are widely used to study sensory, perceptual and cognitive processes, it remains unknown whether they are phase-locked signals superimposed upon the ongoing electroencephalogram (EEG) or result from phase-alignment of the EEG. Previous attempts to discriminate between these hypotheses have been unsuccessful but here a new test is presented based on the prediction that ERPs generated by phase-alignment will be associated with event-related changes in frequency whereas evoked-ERPs will not. Using empirical mode decomposition (EMD), which allows measurement of narrow-band changes in the EEG without predefining frequency bands, evidence was found for transient frequency slowing in recognition memory ERPs but not in simulated data derived from the evoked model. Furthermore, the timing of phase-alignment was frequency dependent with the earliest alignment occurring at high frequencies. Based on these findings, the Firefly model was developed, which proposes that both evoked and induced power changes derive from frequency-dependent phase-alignment of the ongoing EEG. Simulated data derived from the Firefly model provided a close match with empirical data and the model was able to account for i) the shape and timing of ERPs at different scalp sites, ii) the event-related desynchronization in alpha and synchronization in theta, and iii) changes in the power density spectrum from the pre-stimulus baseline to the post-stimulus period. The Firefly Model, therefore, provides not only a unifying account of event-related changes in the EEG but also a possible mechanism for cross-frequency information processing. PMID:23049827

  19. PSA-based evaluation and rating of operational events

    International Nuclear Information System (INIS)

    Gomez Cobo, A.

    1997-01-01

    The presentation discusses the PSA-based evaluation and rating of operational events, including the following: historical background, procedures for event evaluation using PSA, use of PSA for event rating, current activities

  20. An Oracle-based Event Index for ATLAS

    CERN Document Server

    Gallas, Elizabeth; The ATLAS collaboration; Petrova, Petya Tsvetanova; Baranowski, Zbigniew; Canali, Luca; Formica, Andrea; Dumitru, Andrei

    2016-01-01

    The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies and assess which best serve the various use cases as well as consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS, the services we have built based on this architecture, and our experience with it. We've indexed about 15 billion real data events and about 25 billion simulated events thus far and have designed the system to accommodate future data which has expected rates of 5 and 20 billion events per year for real data and simulation, respectively. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data ...

  1. Pressure Effects Analysis of National Ignition Facility Capacitor Module Events

    International Nuclear Information System (INIS)

    Brereton, S; Ma, C; Newton, M; Pastrnak, J; Price, D; Prokosch, D

    1999-01-01

    Capacitors and power conditioning systems required for the National Ignition Facility (NIF) have experienced several catastrophic failures during prototype demonstration. These events generally resulted in explosion, generating a dramatic fireball and energetic shrapnel, and thus may present a threat to the walls of the capacitor bay that houses the capacitor modules. The purpose of this paper is to evaluate the ability of the capacitor bay walls to withstand the overpressure generated by the aforementioned events. Two calculations are described in this paper. The first one was used to estimate the energy release during a fireball event and the second one was used to estimate the pressure in a capacitor module during a capacitor explosion event. Both results were then used to estimate the subsequent overpressure in the capacitor bay where these events occurred. The analysis showed that the expected capacitor bay overpressure was less than the pressure tolerance of the walls. To understand the risk of the above events in NIF, capacitor module failure probabilities were also calculated. This paper concludes with estimates of the probability of single module failure and multi-module failures based on the number of catastrophic failures in the prototype demonstration facility
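
    The quasi-static overpressure part of such an estimate can be illustrated with the standard constant-volume energy-release relation dP = (gamma - 1) * E / V; the figures in this Python sketch are hypothetical and are not the values used in the paper.

        GAMMA = 1.4   # ratio of specific heats for air

        def fireball_overpressure(energy_j: float, room_volume_m3: float) -> float:
            """Quasi-static pressure rise (Pa) for energy E released adiabatically
            into a sealed room of volume V: dP = (gamma - 1) * E / V."""
            return (GAMMA - 1.0) * energy_j / room_volume_m3

        # e.g., 5 MJ released into a 2000 m^3 capacitor bay (assumed figures)
        print(fireball_overpressure(5e6, 2000.0), "Pa")   # -> 1000 Pa, i.e. ~1 kPa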

  2. An analysis of potential costs of adverse events based on Drug Programs in Poland. Pulmonology focus

    Directory of Open Access Journals (Sweden)

    Szkultecka-Debek Monika

    2014-06-01

    Full Text Available The project was performed within the Polish Society for Pharmacoeconomics (PTFE). The objective was to estimate the potential costs of treatment of side effects, which theoretically may occur as a result of treatment of selected diseases. We analyzed the Drug Programs financed by the National Health Fund in Poland in 2012 and for the first analysis we selected those Programs where the same medicinal products were used. We based the adverse event selection on the Summary of Product Characteristics of the chosen products. We extracted all the potential adverse events defined as frequent and very frequent, grouping them according to therapeutic areas. This paper presents the results for the pulmonology area. The events described as very common had an incidence of ≥ 1/10, and the common ones ≥ 1/100, <1/10. In order to identify the resources used, we performed a survey with the engagement of clinical experts. On the basis of the collected data we allocated direct costs incurred by the public payer. We used the costs valid in December 2013. The paper presents the estimated costs of treatment of side effects related to the pulmonology disease area. Taking into account the costs incurred by the NHF and the patient separately, we calculated the total spending and the percentage of each component cost in detail. The treatment of adverse drug reactions generates a significant cost incurred by both the public payer and the patient.

  3. An Unsupervised Anomalous Event Detection and Interactive Analysis Framework for Large-scale Satellite Data

    Science.gov (United States)

    LIU, Q.; Lv, Q.; Klucik, R.; Chen, C.; Gallaher, D. W.; Grant, G.; Shang, L.

    2016-12-01

    Due to the high volume and complexity of satellite data, computer-aided tools for fast quality assessments and scientific discovery are indispensable for scientists in the era of Big Data. In this work, we have developed a framework for automated anomalous event detection in massive satellite data. The framework consists of a clustering-based anomaly detection algorithm and a cloud-based tool for interactive analysis of detected anomalies. The algorithm is unsupervised and requires no prior knowledge of the data (e.g., expected normal pattern or known anomalies). As such, it works for diverse data sets, and performs well even in the presence of missing and noisy data. The cloud-based tool provides an intuitive mapping interface that allows users to interactively analyze anomalies using multiple features. As a whole, our framework can (1) identify outliers in a spatio-temporal context, (2) recognize and distinguish meaningful anomalous events from individual outliers, (3) rank those events based on "interestingness" (e.g., rareness or total number of outliers) defined by users, and (4) enable interactively query, exploration, and analysis of those anomalous events. In this presentation, we will demonstrate the effectiveness and efficiency of our framework in the application of detecting data quality issues and unusual natural events using two satellite datasets. The techniques and tools developed in this project are applicable for a diverse set of satellite data and will be made publicly available for scientists in early 2017.
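
    A minimal sketch of clustering-based anomaly scoring in this spirit (a generic stand-in, not the authors' actual algorithm, which is not reproduced in the record):

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        normal = rng.normal(0.0, 1.0, size=(5000, 3))    # surrogate pixel features
        rare = rng.normal(6.0, 0.5, size=(20, 3))        # rare, displaced events
        X = np.vstack([normal, rare])

        km = KMeans(n_clusters=8, n_init=10, random_state=0).fit(X)
        dist = np.linalg.norm(X - km.cluster_centers_[km.labels_], axis=1)
        threshold = np.quantile(dist, 0.995)             # unsupervised cut-off
        print(np.flatnonzero(dist > threshold).size, "candidate anomalies flagged")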

  4. Enamel matrix derivative (Emdogain) for periodontal tissue regeneration in intrabony defects. A Cochrane systematic review.

    Science.gov (United States)

    Esposito, Marco; Grusovin, Maria Gabriella; Papanikolaou, Nikolaos; Coulthard, Paul; Worthington, Helen V

    2009-01-01

    Periodontitis is a chronic infective disease of the gums caused by bacteria present in dental plaque. This condition induces the breakdown of the tooth supporting apparatus until teeth are lost. Surgery may be indicated to arrest disease progression and regenerate lost tissues. Several surgical techniques have been developed to regenerate periodontal tissues including guided tissue regeneration (GTR), bone grafting (BG) and the use of enamel matrix derivative (EMD). EMD is an extract of enamel matrix and contains amelogenins of various molecular weights. Amelogenins are involved in the formation of enamel and periodontal attachment formation during tooth development. To test whether EMD is effective, and to compare EMD versus GTR, and various BG procedures for the treatment of intrabony defects. The Cochrane Oral Health Group Trials Register, CENTRAL, MEDLINE and EMBASE were searched. Several dental journals were hand searched. No language restrictions were applied. Authors of randomised controlled trials (RCTs) identified, personal contacts and the manufacturer were contacted to identify unpublished trials. The last electronic search was conducted on 4 February 2009. RCTs on patients affected by periodontitis having intrabony defects of at least 3 mm treated with EMD compared with open flap debridement, GTR and various BG procedures with at least 1 year of follow-up. The outcome measures considered were: tooth loss, changes in probing attachment levels (PAL), pocket depths (PPD), gingival recessions (REC), bone levels from the bottom of the defects on intraoral radiographs, aesthetics and adverse events. The following time points were to be evaluated: 1, 5 and 10 years. Screening of eligible studies, assessment of the methodological quality of the trials and data extraction were conducted in duplicate and independently by at least two authors. Results were expressed as random-effects models using mean differences for continuous outcomes and risk ratios (RR) for
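
    The random-effects pooling of mean differences named above is commonly computed with the DerSimonian-Laird estimator; the following Python sketch illustrates it on invented trial data (not figures from the review).

        import numpy as np

        def random_effects_md(y, v):
            """y: per-trial mean differences (e.g., PAL gain in mm); v: their variances."""
            y, v = np.asarray(y, float), np.asarray(v, float)
            w = 1.0 / v                                   # fixed-effect weights
            y_fixed = np.sum(w * y) / np.sum(w)
            q = np.sum(w * (y - y_fixed) ** 2)            # heterogeneity statistic
            tau2 = max(0.0, (q - (y.size - 1)) /
                       (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
            w_star = 1.0 / (v + tau2)                     # random-effects weights
            pooled = np.sum(w_star * y) / np.sum(w_star)
            se = np.sqrt(1.0 / np.sum(w_star))
            return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

        md, ci = random_effects_md([1.1, 0.6, 1.4], [0.09, 0.16, 0.12])
        print(f"pooled MD = {md:.2f} mm, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")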

  5. Electrophysiological correlates of strategic monitoring in event-based and time-based prospective memory.

    Directory of Open Access Journals (Sweden)

    Giorgia Cona

    Full Text Available Prospective memory (PM) is the ability to remember to accomplish an action when a particular event occurs (i.e., event-based PM) or at a specific time (i.e., time-based PM) while performing an ongoing activity. Strategic Monitoring is one of the basic cognitive functions supporting PM tasks, and involves two mechanisms: a retrieval mode, which consists of keeping the intention active in memory; and target checking, engaged for verifying the presence of the PM cue in the environment. The present study is aimed at providing the first evidence of event-related potentials (ERPs) associated with time-based PM, and at examining differences and commonalities in the ERPs related to Strategic Monitoring mechanisms between event- and time-based PM tasks. The addition of an event-based or a time-based PM task to an ongoing activity led to a similar sustained positive modulation of the ERPs in the ongoing trials, mainly expressed over prefrontal and frontal regions. This modulation might index the retrieval mode mechanism, similarly engaged in the two PM tasks. On the other hand, two further ERP modulations were shown specifically in the event-based PM task. An increased positivity was shown at 400-600 ms post-stimulus over occipital and parietal regions, and might be related to target checking. Moreover, an early modulation at 130-180 ms post-stimulus seems to reflect the recruitment of attentional resources for being ready to respond to the event-based PM cue. This latter modulation suggests the existence of a third mechanism specific to event-based PM; that is, the 'readiness mode'.

  6. Data driven analysis of rain events: feature extraction, clustering, microphysical /macro physical relationship

    Science.gov (United States)

    Djallel Dilmi, Mohamed; Mallet, Cécile; Barthes, Laurent; Chazottes, Aymeric

    2017-04-01

    The study of rain time series records is mainly carried out using rainfall rate or rain accumulation parameters estimated on a fixed integration time (typically 1 min, 1 hour or 1 day). In this study we used the concept of the rain event. In fact, the discrete and intermittent nature of rain processes makes the definition of some features inadequate when they are defined over a fixed duration. Long integration times (hour, day) mix rainy and clear-air periods in the same sample, while short integration times (seconds, minutes) lead to noisy data with great sensitivity to detector characteristics. Analyzing whole rain events instead of individual short samples of fixed duration clarifies the relationships between features, in particular between macrophysical and microphysical ones. This approach suppresses the intra-event variability, partly due to measurement uncertainties, and allows the analysis to focus on physical processes. An algorithm based on a Genetic Algorithm (GA) and Self-Organising Maps (SOM) is developed to obtain a parsimonious characterisation of rain events using a minimal set of variables. The use of the self-organizing map is justified by the fact that it maps a high-dimensional data space onto a two-dimensional space while preserving the initial space topology as far as possible, in an unsupervised way. The obtained SOM reveals the dependencies between variables and consequently allows redundant variables to be removed, leading to a minimal subset of only five features (the event duration, the rain rate peak, the rain event depth, the event rain rate standard deviation and the absolute rain rate variation of order 0.5). To confirm the relevance of the five selected features, the corresponding SOM is analyzed. This analysis clearly shows the existence of relationships between features. It also shows the independence of the inter-event time (IETp) feature and the weak dependence of the dry percentage in event (Dd%e) feature. This confirms
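
    A hedged sketch of the SOM step, assuming the third-party MiniSom package and synthetic stand-ins for the five selected features:

        import numpy as np
        from minisom import MiniSom     # third-party package, assumed installed

        rng = np.random.default_rng(1)
        # columns: duration, rain-rate peak, event depth, rate std, |rate| variation
        events = rng.lognormal(mean=0.0, sigma=1.0, size=(500, 5))
        events = (events - events.mean(0)) / events.std(0)   # standardize features

        som = MiniSom(10, 10, 5, sigma=1.5, learning_rate=0.5, random_seed=1)
        som.train_random(events, num_iteration=5000)
        # Component planes (som.get_weights()[:, :, j]) expose dependencies between
        # features; near-identical planes flag redundant variables for removal.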

  7. Asymptotic Effectiveness of the Event-Based Sampling According to the Integral Criterion

    Directory of Open Access Journals (Sweden)

    Marek Miskowicz

    2007-01-01

    Full Text Available Rapid progress in intelligent sensing technology creates new interest in the analysis and design of non-conventional sampling schemes. An investigation of event-based sampling according to the integral criterion is presented in this paper. The investigated sampling scheme is an extension of the pure linear send-on-delta/level-crossing algorithm utilized for reporting the state of objects monitored by intelligent sensors. The motivation for using event-based integral sampling is outlined, and related work on adaptive sampling is summarized. Analytical closed-form formulas for the evaluation of the mean rate of event-based traffic and the asymptotic integral sampling effectiveness are derived, and simulation results verifying the analytical formulas are reported. The effectiveness of the integral sampling is compared with the related linear send-on-delta/level-crossing scheme. The calculation of the asymptotic effectiveness is exemplified for common signals that model the state evolution of dynamic systems in time.
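
    The integral criterion itself is straightforward to simulate: an event fires once the time integral of the deviation from the last reported sample exceeds a threshold. A minimal Python sketch with assumed parameter values:

        import numpy as np

        def integral_sampling(x, dt, delta):
            events, acc, x_last = [], 0.0, x[0]
            for i in range(1, x.size):
                acc += abs(x[i] - x_last) * dt    # accumulated deviation integral
                if acc >= delta:
                    events.append(i)              # report a new sample
                    x_last, acc = x[i], 0.0
            return np.array(events)

        t = np.linspace(0, 10, 10001)
        x = np.sin(t) + 0.3 * np.sin(5 * t)
        idx = integral_sampling(x, dt=t[1] - t[0], delta=0.05)
        print(f"{idx.size} events; mean rate = {idx.size / t[-1]:.1f} events/s")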

  8. Analysis of Loss-of-Offsite-Power Events 1997-2015

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Nancy Ellen [Idaho National Lab. (INL), Idaho Falls, ID (United States); Schroeder, John Alton [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-07-01

    Loss of offsite power (LOOP) can have a major negative impact on a power plant’s ability to achieve and maintain safe shutdown conditions. LOOP event frequencies and the times required for subsequent restoration of offsite power are important inputs to plant probabilistic risk assessments. This report presents a statistical and engineering analysis of LOOP frequencies and durations at U.S. commercial nuclear power plants. The data used in this study are based on operating experience during calendar years 1997 through 2015. LOOP events during critical operation that do not result in a reactor trip are not included. Frequencies and durations were determined for four event categories: plant-centered, switchyard-centered, grid-related, and weather-related. Emergency diesel generator reliability is also considered (failure to start, failure to load and run, and failure to run for more than 1 hour). There is an adverse trend in LOOP durations. The previously reported adverse trend in LOOP frequency was not statistically significant for 2006-2015. Grid-related LOOPs happen predominantly in the summer. Switchyard-centered LOOPs happen predominantly in winter and spring. Plant-centered and weather-related LOOPs do not show statistically significant seasonality. The engineering analysis of LOOP data shows that human errors have been much less frequent since 1997 than in the 1986-1996 time period.
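
    The basic frequency estimate underlying such studies is a Poisson rate with an exact (chi-square) confidence interval; the counts in this Python sketch are invented, not taken from the report.

        from scipy.stats import chi2

        def poisson_rate_ci(n_events, exposure_years, conf=0.90):
            lo = (chi2.ppf((1 - conf) / 2, 2 * n_events) / (2 * exposure_years)
                  if n_events > 0 else 0.0)
            hi = chi2.ppf(1 - (1 - conf) / 2, 2 * (n_events + 1)) / (2 * exposure_years)
            return n_events / exposure_years, (lo, hi)

        # e.g., 42 LOOP events observed in 1500 reactor-years (assumed figures)
        rate, (lo, hi) = poisson_rate_ci(42, 1500.0)
        print(f"{rate:.3f} per reactor-year (90% CI {lo:.3f}-{hi:.3f})")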

  9. Mining web-based data to assess public response to environmental events

    International Nuclear Information System (INIS)

    Cha, YoonKyung; Stow, Craig A.

    2015-01-01

    We explore how the analysis of web-based data, such as Twitter and Google Trends, can be used to assess the social relevance of an environmental accident. The concept and methods are applied to the shutdown of the drinking water supply in the city of Toledo, Ohio, USA. Toledo's notice, which persisted from August 1 to 4, 2014, is a high-profile event that directly influenced approximately half a million people and received wide recognition. The notice was given when excessive levels of microcystin, a byproduct of cyanobacteria blooms, were discovered at the drinking water treatment plant on Lake Erie. Twitter mining results illustrated an instant response to the Toledo incident, the associated collective knowledge, and public perception. The results from Google Trends, on the other hand, revealed how the Toledo event raised public attention to the associated environmental issue, harmful algal blooms, in a long-term context. Thus, when jointly applied, Twitter and Google Trends analysis results offer complementary perspectives. Web content aggregated through mining approaches provides a social standpoint, such as public perception and interest, and offers context for establishing and evaluating environmental management policies. - The joint application of Twitter and Google Trends analysis to an environmental event offered both short- and long-term patterns of public perception and interest in the event

  10. A Research on Maximum Symbolic Entropy from Intrinsic Mode Function and Its Application in Fault Diagnosis

    Directory of Open Access Journals (Sweden)

    Zhuofei Xu

    2017-01-01

    Full Text Available Empirical mode decomposition (EMD) is a self-adaptive analysis method for nonlinear and nonstationary signals that has been widely applied to machinery fault diagnosis and structural damage detection. A novel feature, the maximum symbolic entropy of intrinsic mode functions based on EMD, is proposed in this paper to enhance the recognition ability of EMD. First, a signal is decomposed into a collection of intrinsic mode functions (IMFs) based on the local characteristic time scale of the signal, and the IMFs are then transformed into a series of symbolic sequences with different parameters. Second, although the entropies of the symbolic IMFs differ considerably, there is always a maximum value for a certain symbolic IMF. Third, this maximum symbolic entropy is taken as a feature describing the IMFs of a signal. Finally, the proposed feature is applied to fault diagnosis of rolling bearings, and the maximum symbolic entropy is compared with standard time-domain features in a contrast experiment. Although the maximum symbolic entropy is only a time-domain feature, it reveals signal characteristics accurately. It can also be used in other fields related to the EMD method.
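
    A minimal sketch of the maximum-symbolic-entropy feature, assuming PyEMD and a simple equal-width quantizer as the symbolization scheme (the paper's exact symbolization parameters are not reproduced in the record):

        import numpy as np
        from PyEMD import EMD           # third-party package, assumed installed

        def symbolic_entropy(x, q=6):
            """Shannon entropy of x quantized into q equal-width symbols."""
            edges = np.linspace(x.min(), x.max(), q + 1)
            symbols = np.digitize(x, edges[1:-1])
            p = np.bincount(symbols, minlength=q) / symbols.size
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        t = np.linspace(0, 1, 2000)
        signal = np.sin(2 * np.pi * 25 * t) + 0.4 * np.random.randn(t.size)
        entropies = [symbolic_entropy(imf) for imf in EMD().emd(signal)]
        print("max symbolic entropy:", max(entropies),
              "at IMF", int(np.argmax(entropies)))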

  11. A Content-Adaptive Analysis and Representation Framework for Audio Event Discovery from "Unscripted" Multimedia

    Science.gov (United States)

    Radhakrishnan, Regunathan; Divakaran, Ajay; Xiong, Ziyou; Otsuka, Isao

    2006-12-01

    We propose a content-adaptive analysis and representation framework to discover events using audio features from "unscripted" multimedia such as sports and surveillance for summarization. The proposed analysis framework performs an inlier/outlier-based temporal segmentation of the content. It is motivated by the observation that "interesting" events in unscripted multimedia occur sparsely in a background of usual or "uninteresting" events. We treat the sequence of low/mid-level features extracted from the audio as a time series and identify subsequences that are outliers. The outlier detection is based on eigenvector analysis of the affinity matrix constructed from statistical models estimated from the subsequences of the time series. We define the confidence measure on each of the detected outliers as the probability that it is an outlier. Then, we establish a relationship between the parameters of the proposed framework and the confidence measure. Furthermore, we use the confidence measure to rank the detected outliers in terms of their departures from the background process. Our experimental results with sequences of low- and mid-level audio features extracted from sports video show that "highlight" events can be extracted effectively as outliers from a background process using the proposed framework. We proceed to show the effectiveness of the proposed framework in bringing out suspicious events from surveillance videos without any a priori knowledge. We show that such temporal segmentation into background and outliers, along with the ranking based on the departure from the background, can be used to generate content summaries of any desired length. Finally, we also show that the proposed framework can be used to systematically select "key audio classes" that are indicative of events of interest in the chosen domain.
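
    The eigenvector analysis of an affinity matrix can be sketched as follows; here subsequences are summarized by simple means rather than by the authors' fitted statistical models, so this is an illustrative stand-in only.

        import numpy as np

        rng = np.random.default_rng(2)
        series = rng.normal(0, 1, (200, 8))       # surrogate audio feature vectors
        series[140:150] += 4.0                    # a sparse "highlight" burst

        windows = series.reshape(40, 5, 8).mean(axis=1)   # subsequence summaries
        d2 = ((windows[:, None, :] - windows[None, :, :]) ** 2).sum(-1)
        A = np.exp(-d2 / d2.mean())               # affinity matrix

        L = np.diag(A.sum(1)) - A                 # graph Laplacian
        _, vecs = np.linalg.eigh(L)
        fiedler = vecs[:, 1]                      # second-smallest eigenvector
        group = fiedler > 0                       # spectral bipartition
        outliers = group if group.sum() < (~group).sum() else ~group
        print("outlier subsequences:", np.flatnonzero(outliers))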

  12. Data analysis of event tape and connection

    International Nuclear Information System (INIS)

    Gong Huili

    1995-01-01

    The data analysis on the VAX-11/780 computer is briefly described; the data come from event tapes recorded by the JUHU data acquisition system on the PDP-11/44 computer. The connection of the recorded event tapes to the XSYS data acquisition system on the VAX computer is also introduced

  13. Advanced Reactor Passive System Reliability Demonstration Analysis for an External Event

    Directory of Open Access Journals (Sweden)

    Matthew Bucknor

    2017-03-01

    Full Text Available Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  14. Advanced reactor passive system reliability demonstration analysis for an external event

    Energy Technology Data Exchange (ETDEWEB)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; Grelle, Austin [Argonne National Laboratory, Argonne (United States)

    2017-03-15

    Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event.

  15. Advanced reactor passive system reliability demonstration analysis for an external event

    International Nuclear Information System (INIS)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia J.; Grelle, Austin

    2017-01-01

    Many advanced reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended because of deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize within a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper provides an overview of a passive system reliability demonstration analysis for an external event. Considering an earthquake with the possibility of site flooding, the analysis focuses on the behavior of the passive Reactor Cavity Cooling System following potential physical damage and system flooding. The assessment approach seeks to combine mechanistic and simulation-based methods to leverage the benefits of the simulation-based approach without the need to substantially deviate from conventional probabilistic risk assessment techniques. Although this study is presented as only an example analysis, the results appear to demonstrate a high level of reliability of the Reactor Cavity Cooling System (and the reactor system in general) for the postulated transient event

  16. Repeated Time-to-event Analysis of Consecutive Analgesic Events in Postoperative Pain

    DEFF Research Database (Denmark)

    Juul, Rasmus Vestergaard; Rasmussen, Sten; Kreilgaard, Mads

    2015-01-01

    BACKGROUND: Reduction in consumption of opioid rescue medication is often used as an endpoint when investigating the analgesic efficacy of drugs given as adjunct treatment, but appropriate methods are needed to analyze analgesic consumption over time. Repeated time-to-event (RTTE) modeling is proposed as a way to describe analgesic consumption by analyzing the timing of consecutive analgesic events. METHODS: Retrospective data were obtained from 63 patients receiving standard analgesic treatment including morphine on request after surgery following hip fracture. Times of analgesic events up to 96 h after surgery were extracted from hospital medical records. Parametric RTTE analysis was performed with exponential, Weibull, or Gompertz distribution of analgesic events using NONMEM®, version 7.2 (ICON Development Solutions, USA). The potential influences of night versus day, sex, and age were investigated
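
    A toy sketch of the parametric idea, fitting a Weibull distribution to simulated gap times between analgesic events; real RTTE analyses (as in NONMEM) also handle censoring at 96 h and covariates, which this sketch omits.

        import numpy as np
        from scipy.stats import weibull_min

        rng = np.random.default_rng(3)
        shape_true, scale_true = 1.3, 9.0     # assumed "true" values (hours)
        # inverse-transform sampling of Weibull gaps for 63 hypothetical patients
        gaps = np.concatenate([
            scale_true * (-np.log(rng.random(rng.integers(3, 9)))) ** (1 / shape_true)
            for _ in range(63)])

        shape_hat, _, scale_hat = weibull_min.fit(gaps, floc=0)
        print(f"fitted Weibull shape = {shape_hat:.2f}, scale = {scale_hat:.1f} h")
        # shape > 1 implies the hazard of the next request rises with elapsed time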

  17. Nest-crowdcontrol: Advanced video-based crowd monitoring for large public events

    OpenAIRE

    Monari, Eduardo; Fischer, Yvonne; Anneken, Mathias

    2015-01-01

    Current video surveillance systems still lack intelligent video and data analysis modules for supporting the situation awareness of decision makers. Especially in mass gatherings such as large public events, the decision maker would benefit from different views of the area, in particular from crowd density estimations. This article describes a multi-camera system called NEST and its application to crowd density analysis. First, the overall system design is presented. Based on this, the crowd densit...

  18. Computer-aided event tree analysis by the impact vector method

    International Nuclear Information System (INIS)

    Lima, J.E.P.

    1984-01-01

    In the development of the Probabilistic Risk Analysis of Angra 1, the 'large event tree/small fault tree' approach was adopted for the analysis of plant behavior in an emergency situation. In this work, the event tree methodology is presented along with the adaptations that had to be made in order to attain a correct description of the safety system performance according to the selected analysis method. The problems appearing in the application of the methodology and their respective solutions are presented and discussed, with special emphasis on the impact vector technique. A description of the ETAP code ('Event Tree Analysis Program'), developed for constructing and quantifying event trees, is also given in this work. A preliminary version of the small-break LOCA analysis for Angra 1 is presented as an example of application of the methodology and of the code. It is shown that the use of the ETAP code significantly reduces the time spent on event tree analyses, making the practical application of the analysis approach referred to above viable. (author) [pt

  19. DISPELLING ILLUSIONS OF REFLECTION: A NEW ANALYSIS OF THE 2007 MAY 19 CORONAL 'WAVE' EVENT

    International Nuclear Information System (INIS)

    Attrill, Gemma D. R.

    2010-01-01

    A new analysis of the 2007 May 19 coronal wave-coronal mass ejection-dimmings event is offered, employing base difference extreme-ultraviolet (EUV) images. Previous work analyzing the coronal wave associated with this event concluded strongly in favor of a purely MHD-wave interpretation of the expanding bright front. This conclusion was based to a significant extent on the identification of multiple reflections of the coronal wave front. The analysis presented here shows that the previously identified 'reflections' are actually optical illusions resulting from a misinterpretation of the running difference EUV data. The results of this new multiwavelength analysis indicate that two coronal wave fronts actually developed during the eruption. This new analysis has implications for our understanding of diffuse coronal waves and questions the validity of the analysis and conclusions reached in previous studies.
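
    The distinction at issue between running-difference and base-difference images is easy to state in code; a minimal numpy sketch on a stand-in image sequence:

        import numpy as np

        def running_difference(frames):
            return frames[1:] - frames[:-1]       # frame-to-frame change only

        def base_difference(frames, base_index=0):
            return frames - frames[base_index]    # change versus pre-event corona

        frames = np.random.rand(10, 64, 64)       # stand-in for EUV images
        rd, bd = running_difference(frames), base_difference(frames)
        # Moving brightenings in rd can mimic "reflections" that vanish in bd.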

  20. Analysis of area events as part of probabilistic safety assessment for Romanian TRIGA SSR 14 MW reactor

    International Nuclear Information System (INIS)

    Mladin, D.; Stefan, I.

    2005-01-01

    International experience has shown that external events can be an important contributor to plant/reactor risk, and such events therefore have to be included in PSA studies. In the context of PSA for nuclear facilities, external events are defined as events originating from outside the plant but with the potential to create an initiating event at the plant. To support plant safety assessment, PSA can be used to identify vulnerable features of the plant and to suggest modifications that mitigate the impact of external events or the production of initiating events. For that purpose, a probabilistic assessment of area events concerning fire and flooding risk and impact is necessary. Owing to its relatively large power level among research reactors, the safety analysis of the Romanian 14 MW TRIGA benefits from an ongoing PSA project, in which the treatment of external events should be considered. The specific tasks proposed for the complete evaluation of area events are: identify the rooms important for facility safety; determine a relative area event risk index for these rooms and a relative area event impact index if the event occurs; evaluate the rooms' specific area event frequency; determine the rooms' contribution to reactor hazard state frequencies; and analyze the power supply and room dependencies of safety components (such as pumps and motor-operated valves). The fire risk analysis methodology is based on Berry's method [1]. This approach provides a systematic procedure for deriving a relative index for the different rooms. The factors affecting fire probability are: personnel presence in the room, the number and type of ignition sources, the type and area of combustibles, the fuel available in the room, the fuel location, and ventilation. The flooding risk analysis is based on the amount of piping in the room. For accurate information regarding the piping, a facility walk-about is necessary. In case of flooding risk

  1. The Multi-Frequency Correlation Between Eua and sCER Futures Prices: Evidence from the Emd Approach

    Science.gov (United States)

    Zhang, Yue-Jun; Huang, Yi-Song

    2015-05-01

    Currently, European Union Allowances (EUA) and secondary Certified Emission Reductions (sCER) have become the two dominant carbon trading assets for investors, and their linkage has attracted much attention from academia and practitioners in recent years. Under this circumstance, we use the empirical mode decomposition (EMD) approach to decompose the two carbon futures contract prices and discuss their correlation from a multi-frequency perspective. The empirical results indicate that, first, the EUA and sCER futures price movements can be divided into those triggered by long-term, medium-term and short-term market impacts. Second, the price movements in the EUA and sCER futures markets are primarily caused by the long-term impact, while the short-term impact explains only a small fraction. Finally, the long-term (short-term) effect on EUA prices is statistically uncorrelated with the short-term (long-term) effect on sCER prices, and there is a medium or strong lead-lag correlation between the EUA and sCER price components at the same time scales. These results may provide important insights for price forecasting and arbitrage activities for carbon futures market investors, analysts and regulators.
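
    A hedged sketch of the multi-frequency comparison, assuming PyEMD and synthetic random-walk prices in place of the actual EUA and sCER series; the grouping of IMFs into short- and long-term components is likewise only indicative.

        import numpy as np
        from PyEMD import EMD           # third-party package, assumed installed
        from scipy.stats import pearsonr

        rng = np.random.default_rng(4)
        common = np.cumsum(rng.normal(0, 1, 1000))          # shared trend
        eua = common + np.cumsum(rng.normal(0, 0.5, 1000))
        scer = common + np.cumsum(rng.normal(0, 0.5, 1000))

        imfs_a, imfs_b = EMD().emd(eua), EMD().emd(scer)
        short_a, short_b = imfs_a[0] + imfs_a[1], imfs_b[0] + imfs_b[1]
        long_a, long_b = imfs_a[-1], imfs_b[-1]             # residue-like trend
        print("short-term r =", pearsonr(short_a, short_b)[0])
        print("long-term  r =", pearsonr(long_a, long_b)[0])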

  2. Trending analysis of precursor events

    International Nuclear Information System (INIS)

    Watanabe, Norio

    1998-01-01

    The Accident Sequence Precursor (ASP) Program of the United States Nuclear Regulatory Commission (U.S. NRC) identifies and categorizes operational events at nuclear power plants in terms of their potential for core damage. The ASP analysis has been performed on a yearly basis and the results have been published in annual reports. This paper describes the trends in initiating events and dominant sequences for 459 precursors identified in the ASP Program during the 1969-94 period and also discusses a comparison with dominant sequences predicted in past Probabilistic Risk Assessment (PRA) studies. These trends were examined for three time periods: 1969-81, 1984-87 and 1988-94. Although different models were used in the ASP analyses for these three periods, the distributions of precursors by dominant sequence show similar trends. For example, sequences involving loss of both main and auxiliary feedwater were identified in many PWR events, and those involving loss of both high- and low-pressure coolant injection were found in many BWR events. These dominant sequences were also found to be comparable to those determined to be dominant in the predictions of past PRAs. A list of the 459 identified precursors is provided in the Appendix, indicating initiating event types, unavailable systems, dominant sequences, conditional core damage probabilities, and so on. (author)

  3. Twitter data analysis: temporal and term frequency analysis with real-time event

    Science.gov (United States)

    Yadav, Garima; Joshi, Mansi; Sasikala, R.

    2017-11-01

    Over the past few years, the World Wide Web (WWW) has become a prominent and huge source of user-generated content and opinionated data. Among the various social media, Twitter gained popularity as it offers a fast and effective way of sharing users' perspectives on critical and other issues across different domains, such as the 'Political', 'Entertainment' and 'Business' domains. As these data are generated at huge scale in the cloud, doors have opened for researchers in the field of data science and analysis. Twitter provides several APIs for developers: 1) the Search API, which focuses on old tweets; 2) the REST API, which focuses on user details and allows collection of user profiles, friends and followers; and 3) the Streaming API, which collects details such as tweets, hashtags and geolocations. In our work we access the Streaming API in order to fetch real-time tweets for a dynamic, ongoing event. We focus on the 'Entertainment' domain, especially 'Sports', as IPL-T20 is currently the trending ongoing event. We collect this large volume of tweets and store them in a MongoDB database, where the tweets are stored in JSON document format. On these documents we perform time-series analysis and term-frequency analysis using techniques such as filtering and information extraction for text mining, which fulfils our objective of finding interesting moments in the event's temporal data and ranking players or teams by popularity, which helps people understand the key influencers on the social media platform.
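
    A minimal sketch of the two analyses on tweets already exported as JSON documents; the file name and field layout below are hypothetical.

        import json
        from collections import Counter

        import pandas as pd

        with open("ipl_tweets.json") as fh:           # hypothetical export file
            tweets = [json.loads(line) for line in fh]

        df = pd.DataFrame({"created_at": [t["created_at"] for t in tweets],
                           "text": [t["text"] for t in tweets]})
        df["created_at"] = pd.to_datetime(df["created_at"])
        per_minute = df.set_index("created_at").resample("1min").size()
        print("most active minute:", per_minute.idxmax())   # "interesting moment"

        mentions = Counter(w.lower() for text in df["text"] for w in text.split()
                           if w.startswith("#") or w.startswith("@"))
        print(mentions.most_common(10))               # crude popularity ranking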

  4. The limiting events transient analysis by RETRAN02 and VIPRE01 for an ABWR

    International Nuclear Information System (INIS)

    Tsai Chiungwen; Shih Chunkuan; Wang Jongrong; Lin Haotzu; Jin Jiunan; Cheng Suchin

    2009-01-01

    This paper describes the transient analysis of the generator load rejection (LR) and One Turbine Control Valve Closure (OTCVC) events for the Lungmen nuclear power plant (LMNPP). According to the Critical Power Ratio (CPR) criterion, the Preliminary Safety Analysis Report (PSAR) concluded that LR and OTCVC are the first and second limiting events, respectively. In addition, the fuel type has been changed from GE12 to GE14, so it is necessary to re-analyze these two events for safety considerations. In this study, to quantify the impact on the reactor, the difference between the initial critical power ratio (ICPR) and the minimum critical power ratio (MCPR), i.e., ΔCPR, is calculated. The ΔCPRs of the LR and OTCVC events are calculated with the combination of the RETRAN02 and VIPRE01 codes. In the RETRAN02 calculation, a thermal-hydraulic model was prepared for the transient analysis. Data including upper plenum pressure, core inlet flow, normalized power, and axial power shapes during the transient are then submitted to VIPRE01 for the ΔCPR calculation. In the VIPRE01 calculation, a hot-channel model was built to simulate the hottest fuel bundle. Based on the thermal-hydraulic data from RETRAN02, the ΔCPRs are calculated by the VIPRE01 hot-channel model. Additionally, different TCV control modes are considered to study the influence of different TCV closure curves on transient behavior. Meanwhile, sensitivity studies covering different initial system pressures and different initial power/flow conditions are also performed. Based on this analysis, the maximum ΔCPRs for LR and OTCVC are 0.162 and 0.191, respectively. According to the CPR criterion, this result shows that the impact of the OTCVC event is larger than that of the LR event. (author)

  5. Regression analysis of mixed recurrent-event and panel-count data with additive rate models.

    Science.gov (United States)

    Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L

    2015-03-01

    Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007, The Statistical Analysis of Recurrent Events. New York: Springer-Verlag; Zhao et al., 2011, Test 20, 1-42). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013, Statistics in Medicine 32, 1954-1963). In this article, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study. © 2014, The International Biometric Society.

  6. Job optimization in ATLAS TAG-based distributed analysis

    Science.gov (United States)

    Mambelli, M.; Cranshaw, J.; Gardner, R.; Maeno, T.; Malon, D.; Novak, M.

    2010-04-01

    The ATLAS experiment is projected to collect over one billion events/year during the first few years of operation. The efficient selection of events for various physics analyses across all appropriate samples presents a significant technical challenge. ATLAS computing infrastructure leverages the Grid to tackle the analysis across large samples by organizing data into a hierarchical structure and exploiting distributed computing to churn through the computations. This includes events at different stages of processing: RAW, ESD (Event Summary Data), AOD (Analysis Object Data), DPD (Derived Physics Data). Event Level Metadata Tags (TAGs) contain information about each event stored using multiple technologies accessible by POOL and various web services. This allows users to apply selection cuts on quantities of interest across the entire sample to compile a subset of events that are appropriate for their analysis. This paper describes new methods for organizing jobs using the TAGs criteria to analyze ATLAS data. It further compares different access patterns to the event data and explores ways to partition the workload for event selection and analysis. Here analysis is defined as a broader set of event processing tasks including event selection and reduction operations ("skimming", "slimming" and "thinning") as well as DPD making. Specifically it compares analysis with direct access to the events (AOD and ESD data) to access mediated by different TAG-based event selections. We then compare different ways of splitting the processing to maximize performance.

  7. CMS DAQ Event Builder Based on Gigabit Ethernet

    CERN Document Server

    Bauer, G; Branson, J; Brett, A; Cano, E; Carboni, A; Ciganek, M; Cittolin, S; Erhan, S; Gigi, D; Glege, F; Gómez-Reino, Robert; Gulmini, M; Gutiérrez-Mlot, E; Gutleber, J; Jacobs, C; Kim, J C; Klute, M; Lipeles, E; Lopez-Perez, Juan Antonio; Maron, G; Meijers, F; Meschi, E; Moser, R; Murray, S; Oh, A; Orsini, L; Paus, C; Petrucci, A; Pieri, M; Pollet, L; Rácz, A; Sakulin, H; Sani, M; Schieferdecker, P; Schwick, C; Sumorok, K; Suzuki, I; Tsirigkas, D; Varela, J

    2007-01-01

    The CMS Data Acquisition System is designed to build and filter events originating from 476 detector data sources at a maximum trigger rate of 100 kHz. Different architectures and switch technologies have been evaluated to accomplish this purpose. Events will be built in two stages: the first stage will be a set of event builders called FED Builders. These will be based on Myrinet technology and will pre-assemble groups of about 8 data sources. The second stage will be a set of event builders called Readout Builders. These will perform the building of full events. A single Readout Builder will build events from 72 sources of 16 kB fragments at a rate of 12.5 kHz. In this paper we present the design of a Readout Builder based on TCP/IP over Gigabit Ethernet and the optimization that was required to achieve the design throughput. This optimization includes the architecture of the Readout Builder, the setup of TCP/IP, and hardware selection.

  8. Univariate and Bivariate Empirical Mode Decomposition for Postural Stability Analysis

    Directory of Open Access Journals (Sweden)

    Jacques Duchêne

    2008-05-01

    Full Text Available The aim of this paper was to compare empirical mode decomposition (EMD) and two new extended methods of EMD named complex empirical mode decomposition (complex-EMD) and bivariate empirical mode decomposition (bivariate-EMD). All methods were used to analyze stabilogram center of pressure (COP) time series. The two new methods are suitable for application to complex time series to extract complex intrinsic mode functions (IMFs) before the Hilbert transform is subsequently applied to the IMFs. The trace of the analytic IMF in the complex plane has a circular form, with each IMF having its own rotation frequency. The area of the circle and the average rotation frequency of IMFs represent efficient indicators of the postural stability status of subjects. Experimental results show the effectiveness of these indicators in identifying differences in standing posture between groups.

  9. Serious adverse events with infliximab: analysis of spontaneously reported adverse events.

    Science.gov (United States)

    Hansen, Richard A; Gartlehner, Gerald; Powell, Gregory E; Sandler, Robert S

    2007-06-01

    Serious adverse events such as bowel obstruction, heart failure, infection, lymphoma, and neuropathy have been reported with infliximab. The aims of this study were to explore adverse event signals with infliximab by using a long period of post-marketing experience, stratifying by indication. The relative reporting of infliximab adverse events to the U.S. Food and Drug Administration (FDA) was assessed with the public release version of the adverse event reporting system (AERS) database from 1968 to third quarter 2005. On the basis of a systematic review of adverse events, Medical Dictionary for Regulatory Activities (MedDRA) terms were mapped to predefined categories of adverse events, including death, heart failure, hepatitis, infection, infusion reaction, lymphoma, myelosuppression, neuropathy, and obstruction. Disproportionality analysis was used to calculate the empiric Bayes geometric mean (EBGM) and corresponding 90% confidence intervals (EB05, EB95) for adverse event categories. Infliximab was identified as the suspect medication in 18,220 reports in the FDA AERS database. We identified a signal for lymphoma (EB05 = 6.9), neuropathy (EB05 = 3.8), infection (EB05 = 2.9), and bowel obstruction (EB05 = 2.8). The signal for granulomatous infections was stronger than the signal for non-granulomatous infections (EB05 = 12.6 and 2.4, respectively). The signals for bowel obstruction and infusion reaction were specific to patients with IBD; this suggests potential confounding by indication, especially for bowel obstruction. In light of this additional evidence of risk of lymphoma, neuropathy, and granulomatous infections, clinicians should stress this risk in the shared decision-making process.

  10. Enamel matrix derivative (Emdogain(R)) for periodontal tissue regeneration in intrabony defects.

    Science.gov (United States)

    Esposito, Marco; Grusovin, Maria Gabriella; Papanikolaou, Nikolaos; Coulthard, Paul; Worthington, Helen V

    2009-10-07

    Periodontitis is a chronic infective disease of the gums caused by bacteria present in dental plaque. This condition induces the breakdown of the tooth supporting apparatus until teeth are lost. Surgery may be indicated to arrest disease progression and regenerate lost tissues. Several surgical techniques have been developed to regenerate periodontal tissues including guided tissue regeneration (GTR), bone grafting (BG) and the use of enamel matrix derivative (EMD). EMD is an extract of enamel matrix and contains amelogenins of various molecular weights. Amelogenins are involved in the formation of enamel and periodontal attachment formation during tooth development. To test whether EMD is effective, and to compare EMD versus GTR, and various BG procedures for the treatment of intrabony defects. We searched the Cochrane Oral Health Group Trials Register, CENTRAL, MEDLINE and EMBASE. Several journals were handsearched. No language restrictions were applied. Authors of randomised controlled trials (RCTs) identified, personal contacts and the manufacturer were contacted to identify unpublished trials. Most recent search: February 2009. RCTs on patients affected by periodontitis having intrabony defects of at least 3 mm treated with EMD compared with open flap debridement, GTR and various BG procedures with at least 1 year follow up. The outcome measures considered were: tooth loss, changes in probing attachment levels (PAL), pocket depths (PPD), gingival recessions (REC), bone levels from the bottom of the defects on intraoral radiographs, aesthetics and adverse events. The following time-points were to be evaluated: 1, 5 and 10 years. Screening of eligible studies, assessment of the methodological quality of the trials and data extraction were conducted in duplicate and independently by two authors. Results were expressed as random-effects models using mean differences for continuous outcomes and risk ratios (RR) for dichotomous outcomes with 95% confidence intervals

  11. Trends and characteristics observed in nuclear events based on international nuclear event scale reports

    International Nuclear Information System (INIS)

    Watanabe, Norio

    2001-01-01

    The International Nuclear Event Scale (INES) is jointly operated by the IAEA and the OECD-NEA as a means of providing prompt, clear and consistent information on nuclear events that occur at nuclear facilities, and of facilitating communication between the nuclear community, the media and the public. Nuclear events are reported to the INES with the 'Scale', a consistent safety-significance indicator, which runs from level 0, for events with no safety significance, to level 7, for a major accident with widespread health and environmental effects. Since the operation of the INES was initiated in 1990, approximately 500 events have been reported and disseminated. The present paper discusses the trends observed in nuclear events, such as overall trends of the reported events and characteristics of safety-significant events of level 2 or higher, based on the INES reports. (author)

  12. An Oracle-based event index for ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00083337; The ATLAS collaboration; Dimitrov, Gancho

    2017-01-01

    The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies and assess which best serve the various use cases as well as consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS (relational database management system), the services we have built based on this architecture, and our experience with it. We’ve indexed about 26 billion real data events thus far and have designed the system to accommodate future data which has expected rates of 5 and 20 billion events per year. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data with other complementary metadata in AT...

  13. Multi-Unit Initiating Event Analysis for a Single-Unit Internal Events Level 1 PSA

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Dong San; Park, Jin Hee; Lim, Ho Gon [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    The Fukushima nuclear accident in 2011 highlighted the importance of considering the risks from multi-unit accidents at a site. The ASME/ANS probabilistic risk assessment (PRA) standard also includes some requirements related to multi-unit aspects, one of which (IE-B5) is as follows: 'For multi-unit sites with shared systems, DO NOT SUBSUME multi-unit initiating events if they impact mitigation capability [1].' However, existing single-unit PSA models do not explicitly consider multi-unit initiating events, and hence systems shared by multiple units (e.g., the alternate AC diesel generator) are fully credited to the single unit, ignoring the need for the shared systems by other units at the same site [2]. This paper describes the results of the multi-unit initiating event (IE) analysis performed as a part of the at-power internal events Level 1 probabilistic safety assessment (PSA) for an OPR1000 single unit ('reference unit'). In this study, a multi-unit initiating event analysis for a single-unit PSA was performed and, using the results, a dual-unit LOOP initiating event was added to the existing PSA model for the reference unit (OPR1000 type). Event trees were developed for dual-unit LOOP and for dual-unit SBO, which can be transferred from dual-unit LOOP. Moreover, CCF basic events for the five diesel generators were modelled. For simultaneous SBO occurrences in both units, this study compared two different assumptions on the availability of the AAC D/G. As a result, when the dual-unit LOOP initiating event was added to the existing single-unit PSA model, the total CDF increased by 1∼2%, depending on the probability that the AAC D/G is available to a specific unit in case of simultaneous SBO in both units.

  14. An Oracle-based event index for ATLAS

    Science.gov (United States)

    Gallas, E. J.; Dimitrov, G.; Vasileva, P.; Baranowski, Z.; Canali, L.; Dumitru, A.; Formica, A.; ATLAS Collaboration

    2017-10-01

    The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies and assess which best serve the various use cases as well as consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS (relational database management system), the services we have built based on this architecture, and our experience with it. We’ve indexed about 26 billion real data events thus far and have designed the system to accommodate future data which has expected rates of 5 and 20 billion events per year. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data with other complementary metadata in ATLAS, the system has been easily extended to perform essential assessments of data integrity and completeness and to identify event duplication, including at what step in processing the duplication occurred.

  15. Probabilistic safety analysis for fire events for the NPP Isar 2

    International Nuclear Information System (INIS)

    Schmaltz, H.; Hristodulidis, A.

    2007-01-01

    The 'Probabilistic Safety Analysis for Fire Events' (Fire-PSA KKI2) for the NPP Isar 2 was performed in addition to the PSA for full power operation and considers all possible events that can be initiated by a fire. The aim of the plant-specific Fire-PSA was to perform a quantitative assessment of fire events during full power operation in line with the state of the art. Based on simplifying assumptions about the fire-induced failures, the influence of system and component failures on the frequency of core damage states was analysed. The Fire-PSA considers, on the one hand, events in which fire-induced equipment failures result in a SCRAM and, on the other hand, events that have no direct operational effects but in which the plant is shut down as a precautionary measure because of the fire-induced failure of safety-related installations. The latter events are considered because they may have a non-negligible influence on the frequency of core damage states in case of failures during the plant shutdown, owing to the reduced redundancy of safety-related systems. (orig.)

  16. Features, Events, and Processes: Disruptive Events

    Energy Technology Data Exchange (ETDEWEB)

    J. King

    2004-03-31

    The primary purpose of this analysis is to evaluate seismic- and igneous-related features, events, and processes (FEPs). These FEPs represent areas of natural system processes that have the potential to produce disruptive events (DE) that could impact repository performance and are related to the geologic processes of tectonism, structural deformation, seismicity, and igneous activity. Collectively, they are referred to as the DE FEPs. This evaluation determines which of the DE FEPs are excluded from modeling used to support the total system performance assessment for license application (TSPA-LA). The evaluation is based on the data and results presented in supporting analysis reports, model reports, technical information, or corroborative documents that are cited in the individual FEP discussions in Section 6.2 of this analysis report.

  17. Features, Events, and Processes: Disruptive Events

    International Nuclear Information System (INIS)

    J. King

    2004-01-01

    The primary purpose of this analysis is to evaluate seismic- and igneous-related features, events, and processes (FEPs). These FEPs represent areas of natural system processes that have the potential to produce disruptive events (DE) that could impact repository performance and are related to the geologic processes of tectonism, structural deformation, seismicity, and igneous activity. Collectively, they are referred to as the DE FEPs. This evaluation determines which of the DE FEPs are excluded from modeling used to support the total system performance assessment for license application (TSPA-LA). The evaluation is based on the data and results presented in supporting analysis reports, model reports, technical information, or corroborative documents that are cited in the individual FEP discussions in Section 6.2 of this analysis report.

  18. Analysis of adverse events of renal impairment related to platinum-based compounds using the Japanese Adverse Drug Event Report database.

    Science.gov (United States)

    Naganuma, Misa; Motooka, Yumi; Sasaoka, Sayaka; Hatahira, Haruna; Hasegawa, Shiori; Fukuda, Akiho; Nakao, Satoshi; Shimada, Kazuyo; Hirade, Koseki; Mori, Takayuki; Yoshimura, Tomoaki; Kato, Takeshi; Nakamura, Mitsuhiro

    2018-01-01

    Platinum compounds cause several adverse events, such as nephrotoxicity, gastrointestinal toxicity, myelosuppression, ototoxicity, and neurotoxicity. We evaluated the incidence of renal impairment as an adverse event related to the administration of platinum compounds using the Japanese Adverse Drug Event Report database. We analyzed adverse events associated with the use of platinum compounds reported from April 2004 to November 2016. The reporting odds ratio with 95% confidence interval was used to detect the signal for each renal impairment incidence. We evaluated the time-to-onset profile of renal impairment, assessed the hazard type using the Weibull shape parameter, and applied the association rule mining technique to discover undetected relationships such as possible risk factors. In total, 430,587 reports in the Japanese Adverse Drug Event Report database were analyzed. The reporting odds ratios (95% confidence intervals) for renal impairment resulting from the use of cisplatin, oxaliplatin, carboplatin, and nedaplatin were 2.7 (2.5-3.0), 0.6 (0.5-0.7), 0.8 (0.7-1.0), and 1.3 (0.8-2.1), respectively. The lower limit of the reporting odds ratio (95% confidence interval) for cisplatin was >1. The median (lower-upper quartile) onset time of renal impairment following the use of platinum-based compounds was 6.0-8.0 days. The Weibull shape parameter β and 95% confidence interval upper limit of oxaliplatin were impairment during cisplatin use in a real-world setting. The present findings demonstrate that the incidence of renal impairment following cisplatin use should be closely monitored when patients are hypertensive or diabetic, or when they are co-administered furosemide, loxoprofen, or pemetrexed. In addition, healthcare professionals should closely assess a patient's background prior to treatment.
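
    The disproportionality statistic used in this record is standard; a minimal sketch of the reporting odds ratio and its 95% confidence interval, with invented 2x2 counts, follows:

    ```python
    import math

    def reporting_odds_ratio(a, b, c, d):
        """ROR with 95% CI from a 2x2 table: a = target drug with the event,
        b = target drug without it, c/d = the same for all other drugs."""
        ror = (a / b) / (c / d)
        se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of ln(ROR)
        lo, hi = (math.exp(math.log(ror) + z * se) for z in (-1.96, 1.96))
        return ror, lo, hi

    # Hypothetical counts; a signal is flagged when the lower bound exceeds 1.
    ror, lo, hi = reporting_odds_ratio(a=120, b=880, c=4000, d=90000)
    print(f"ROR = {ror:.2f} (95% CI {lo:.2f}-{hi:.2f})")
    ```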

  19. An asynchronous data-driven event-building scheme based on ATM switching fabrics

    International Nuclear Information System (INIS)

    Letheren, M.; Christiansen, J.; Mandjavidze, I.; Verhille, H.; De Prycker, M.; Pauwels, B.; Petit, G.; Wright, S.; Lumley, J.

    1994-01-01

    The very high data rates expected in experiments at the next generation of high-luminosity hadron colliders will be handled by pipelined front-end readout electronics and multiple levels (2 or 3) of triggering. A variety of data acquisition architectures have been proposed for use downstream of the first level trigger. Depending on the architecture, the aggregate bandwidths required for event building are expected to be of the order of 10-100 Gbit/s. Here, an Asynchronous Transfer Mode (ATM) packet-switching network technology is proposed as the interconnect for building high-performance, scalable data acquisition architectures. This paper introduces the relevant characteristics of ATM and describes components for the construction of an ATM-based event builder: (1) a multi-path, self-routing, scalable ATM switching fabric, (2) an experimental high-performance workstation ATM interface, and (3) a VMEbus ATM interface. The requirement for traffic shaping in ATM-based event builders is discussed and an analysis of the performance of several such schemes is presented.

  20. Recognizing of stereotypic patterns in epileptic EEG using empirical modes and wavelets

    Science.gov (United States)

    Grubov, V. V.; Sitnikova, E.; Pavlov, A. N.; Koronovskii, A. A.; Hramov, A. E.

    2017-11-01

    Epileptic activity in the form of spike-wave discharges (SWD) appears in the electroencephalogram (EEG) during absence seizures. This paper evaluates two approaches for detecting stereotypic rhythmic activities in EEG: the continuous wavelet transform (CWT) and the empirical mode decomposition (EMD). The CWT is a well-known method of time-frequency analysis of EEG, whereas EMD is a relatively novel approach for extracting a signal's waveforms. A new method for pattern recognition based on the combination of CWT and EMD is proposed. The combined approach achieved a sensitivity of 86.5% and a specificity of 92.9% for sleep spindles, and 97.6% and 93.2%, respectively, for SWD. Considering the strong within- and between-subject variability of sleep spindles, the detection efficiency obtained was high in comparison with other CWT-based methods. It is concluded that the combination of a wavelet-based approach and empirical modes increases the quality of automatic detection of stereotypic patterns in rat EEG.
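
    A minimal sketch of such a hybrid EMD-plus-CWT detector on a synthetic trace, assuming the PyEMD and PyWavelets packages; the choice of IMF, frequency band, and threshold are all ad hoc illustrations rather than the paper's tuned values:

    ```python
    import numpy as np
    import pywt                 # PyWavelets, for the CWT
    from PyEMD import EMD       # EMD-signal package

    fs = 250.0                                   # sampling rate [Hz], assumed
    t = np.arange(0, 10, 1 / fs)
    eeg = np.sin(2 * np.pi * 9 * t) * ((t > 4) & (t < 6)) \
          + 0.5 * np.random.randn(t.size)        # burst of 9 Hz activity + noise

    # Step 1: EMD splits the raw trace into oscillatory modes (IMFs).
    imfs = EMD().emd(eeg)
    mode = imfs[1]                               # a mid-frequency IMF, chosen ad hoc

    # Step 2: the CWT of that IMF concentrates power at the pattern's rhythm.
    scales = np.arange(2, 64)
    coeffs, freqs = pywt.cwt(mode, scales, 'morl', sampling_period=1 / fs)
    band = (freqs > 7) & (freqs < 14)            # spindle-like band, illustrative
    power = np.abs(coeffs[band]).mean(axis=0)

    # Step 3: flag samples whose band power exceeds an empirical threshold.
    detected = power > power.mean() + 2 * power.std()
    print(f"{detected.mean():.1%} of samples flagged as candidate pattern")
    ```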

  1. Discrete event simulation versus conventional system reliability analysis approaches

    DEFF Research Database (Denmark)

    Kozine, Igor

    2010-01-01

    Discrete Event Simulation (DES) environments are rapidly developing and appear to be promising tools for building reliability and risk analysis models of safety-critical systems and human operators. If properly developed, they are an alternative to the conventional human reliability analysis models...... and systems analysis methods such as fault and event trees and Bayesian networks. As one part, the paper describes briefly the author’s experience in applying DES models to the analysis of safety-critical systems in different domains. The other part of the paper is devoted to comparing conventional approaches...

  2. On Mixed Data and Event Driven Design for Adaptive-Critic-Based Nonlinear $H_{\\infty}$ Control.

    Science.gov (United States)

    Wang, Ding; Mu, Chaoxu; Liu, Derong; Ma, Hongwen

    2018-04-01

    In this paper, based on the adaptive critic learning technique, the control of a class of unknown nonlinear dynamic systems is investigated by adopting a mixed data- and event-driven design approach. The nonlinear control problem is formulated as a two-player zero-sum differential game, and the adaptive critic method is employed to cope with the data-based optimization. The novelty lies in combining the data-driven learning identifier with the event-driven design formulation in order to develop the adaptive critic controller, thereby accomplishing the nonlinear control. The event-driven optimal control law and the time-driven worst-case disturbance law are approximated by constructing and tuning a critic neural network. Applying the event-driven feedback control, the closed-loop system is built with stability analysis. Simulation studies are conducted to verify the theoretical results and illustrate the control performance. Significantly, the present research provides a new avenue for integrating data-based control and event-triggering mechanisms into establishing advanced adaptive critic systems.

  3. Many multicenter trials had few events per center, requiring analysis via random-effects models or GEEs.

    Science.gov (United States)

    Kahan, Brennan C; Harhay, Michael O

    2015-12-01

    Adjustment for center in multicenter trials is recommended when there are between-center differences or when randomization has been stratified by center. However, common methods of analysis (such as fixed-effects, Mantel-Haenszel, or stratified Cox models) often require a large number of patients or events per center to perform well. We reviewed 206 multicenter randomized trials published in four general medical journals to assess the average number of patients and events per center and to determine whether appropriate methods of analysis were used in trials with few patients or events per center. The median number of events per center/treatment arm combination for trials using a binary or survival outcome was 3 (interquartile range, 1-10). Sixteen percent of trials had fewer than 1 event per center/treatment combination, 50% fewer than 3, and 63% fewer than 5. Of the trials that adjusted for center using a method of analysis requiring a large number of events per center, 6% had fewer than 1 event per center/treatment combination, 25% fewer than 3, and 50% fewer than 5. Methods of analysis that allow for few events per center, such as random-effects models or generalized estimating equations (GEEs), were rarely used. Many multicenter trials contain few events per center. Adjustment for center using random-effects models or GEE with model-based (non-robust) standard errors may be beneficial in these scenarios.
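
    The remedy the authors recommend is easy to sketch with statsmodels, assuming a GEE with exchangeable within-center correlation on an invented rare-outcome trial; cov_type="naive" requests the model-based (non-robust) standard errors mentioned above:

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Hypothetical multicenter trial: 30 centers, rare binary outcome.
    rng = np.random.default_rng(0)
    n = 900
    df = pd.DataFrame({"center": rng.integers(0, 30, n),
                       "treat": rng.integers(0, 2, n)})
    df["y"] = rng.binomial(1, 0.05 + 0.03 * df["treat"])

    # GEE clustered on center with an exchangeable working correlation.
    model = smf.gee("y ~ treat", groups="center", data=df,
                    family=sm.families.Binomial(),
                    cov_struct=sm.cov_struct.Exchangeable())
    result = model.fit(cov_type="naive")   # model-based standard errors
    print(result.summary())
    ```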

  4. Using Web Crawler Technology for Text Analysis of Geo-Events: A Case Study of the Huangyan Island Incident

    Science.gov (United States)

    Hu, H.; Ge, Y. J.

    2013-11-01

    As social networking and the network socialisation of daily life have brought more text information and social relationships within reach, the question of whether big data can be fully used to study the phenomena and disciplines of the natural sciences has prompted many specialists and scholars to innovate in their research. Although politics has been entangled with hyperlinked-world issues since the 1990s, and the automatic assembly of geospatial webs and distributed geospatial information systems using service chaining has recently been explored and built, the information collection and data visualisation of geo-events have always faced the bottleneck of traditional manual analysis because of the sensitivity, complexity, relativity, timeliness and unexpected character of political events. Based on the Heritrix framework and the analysis of web-based text, the word frequency, sentiment tendency and dissemination path of the Huangyan Island incident are studied here by combining web crawler technology with text analysis methods. The results indicate that the tag cloud, frequency map, attitude pie charts, individual mention ratios and dissemination flow graph based on the collected and processed data not only highlight the subject and theme vocabularies of related topics but also reveal certain issues and problems behind them. Being able to express the time-space relationship of text information and the dissemination of geo-event information, the text analysis of network information based on focused web crawler technology can be a tool for understanding the formation and diffusion of web-based public opinion in political events.

  5. Error Analysis of Satellite Precipitation-Driven Modeling of Flood Events in Complex Alpine Terrain

    Directory of Open Access Journals (Sweden)

    Yiwen Mei

    2016-03-01

    Full Text Available The error in satellite precipitation-driven complex terrain flood simulations is characterized in this study for eight different global satellite products and 128 flood events over the Eastern Italian Alps. The flood events are grouped according to two flood types: rain floods and flash floods. The satellite precipitation products and runoff simulations are evaluated based on systematic and random error metrics applied to the matched event pairs and basin-scale event properties (i.e., rainfall and runoff cumulative depth and time series shape). Overall, the error characteristics exhibit dependency on the flood type. Generally, the timing of the event precipitation mass center and the dispersion of the time series derived from satellite precipitation exhibit good agreement with the reference; the cumulative depth is mostly underestimated. The study shows a dampening effect in both systematic and random error components of the satellite-driven hydrograph relative to the satellite-retrieved hyetograph. The systematic error in the shape of the time series shows a significant dampening effect. The random error dampening effect is less pronounced for the flash flood events and for the rain flood events with a high runoff coefficient. This event-based analysis of satellite precipitation error propagation in flood modeling sheds light on the application of satellite precipitation in mountain flood hydrology.
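
    One simple convention for splitting a matched-pair error into systematic and random components is sketched below on invented event runoff depths; the study's own event-based metrics are more elaborate:

    ```python
    import numpy as np

    def error_components(sim, obs):
        """Mean bias (systematic part) and spread around it (random part)."""
        err = np.asarray(sim) - np.asarray(obs)
        return err.mean(), err.std(ddof=1)

    # Hypothetical matched event pairs (event runoff depths in mm); the
    # simulated values underestimate, echoing the tendency reported above.
    obs = np.array([12.0, 40.0, 7.5, 22.0, 55.0])
    sim = np.array([9.0, 31.0, 6.0, 19.0, 41.0])
    bias, spread = error_components(sim, obs)
    print(f"systematic error = {bias:.1f} mm, random error = {spread:.1f} mm")
    ```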

  6. Event based neutron activation spectroscopy and analysis algorithm using MLE and metaheuristics

    Science.gov (United States)

    Wallace, Barton

    2014-03-01

    Techniques used in neutron activation analysis are often dependent on the experimental setup. In the context of developing a portable and high-efficiency detection array, good energy resolution and half-life discrimination are difficult to obtain with traditional methods [1] given the logistic and financial constraints. An approach different from spectrum addition and standard spectroscopy analysis [2] was needed. The use of multiple detectors prompts the need for flexible storage of acquisition data to enable sophisticated post-processing of the information. Analogously to what is done in heavy-ion physics, gamma detection counts are stored as two-dimensional events. This enables post-selection of energies and time frames without the need to modify the experimental setup. This method of storage also permits the use of more complex analysis tools. Given the nature of the problem at hand, a light and efficient analysis code had to be devised. A thorough understanding of the physical and statistical processes involved [3] was used to create a statistical model. Maximum likelihood estimation was combined with metaheuristics to produce a sophisticated curve-fitting algorithm. Simulated and experimental data were fed into the analysis code, yielding positive results in terms of half-life discrimination, peak identification and noise reduction. The code was also adapted to other fields of research, such as heavy-ion identification of the quasi-target (QT) and quasi-particle (QP). The approach used seems to translate well into other fields of research.
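
    A minimal sketch of the core idea, pairing a Poisson maximum-likelihood objective for a decay curve with a metaheuristic global optimizer (SciPy's differential evolution); the one-isotope model and all numbers are invented:

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution

    rng = np.random.default_rng(1)
    t = np.linspace(0, 100, 50)                   # time bins [s], assumed
    true_rate = 40.0 * np.exp(-np.log(2) / 30.0 * t) + 2.0   # 30 s half-life + background
    counts = rng.poisson(true_rate)               # simulated gamma counts

    def neg_log_likelihood(params):
        """Poisson negative log-likelihood (up to a constant) of the decay model."""
        amp, half_life, bkg = params
        rate = amp * np.exp(-np.log(2) / half_life * t) + bkg
        return np.sum(rate - counts * np.log(rate))

    bounds = [(1, 200), (1, 200), (0.1, 20)]      # search space for the metaheuristic
    result = differential_evolution(neg_log_likelihood, bounds, seed=2)
    print("fitted amplitude, half-life [s], background:", result.x)
    ```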

  7. Energy-Efficiency Policy Opportunities for Electric Motor-Driven Systems

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    This publication is the first global analysis of the energy consumption and energy-efficiency potential of EMDS (electric motor-driven systems). Electric motors and the systems they drive are the largest single electricity end use, accounting for more than 40% of global electricity consumption. A huge untapped energy-efficiency potential was found in EMDS: around 25% of EMDS electricity use could be saved cost-effectively, reducing total global electricity demand by about 10%. However, the energy efficiency of EMDS has been relatively neglected in comparison with other sustainable energy opportunities. It is crucial to scale up the operations and resources committed to realizing the vast savings potential of optimized EMDS. This paper proposes a comprehensive package of policy recommendations to help governments realize the potential for energy savings in EMDS.

  8. Parallel processor for fast event analysis

    International Nuclear Information System (INIS)

    Hensley, D.C.

    1983-01-01

    Current maximum data rates from the Spin Spectrometer of approx. 5000 events/s (up to 1.3 MBytes/s) and minimum analysis requiring at least 3000 operations/event call for a CPU cycle time near 70 ns. In order to achieve an effective cycle time of 70 ns, a parallel processing device is proposed in which up to 4 independent processors are implemented in parallel. The individual processors are designed around the Am2910 microsequencer, the Am29116 microprocessor, and the Am29517 multiplier. Satellite histogramming in a mass memory system will be managed by a commercial 16-bit microprocessor system.

  9. Probabilistic Dynamics for Integrated Analysis of Accident Sequences considering Uncertain Events

    Directory of Open Access Journals (Sweden)

    Robertas Alzbutas

    2015-01-01

    Full Text Available Analytical/deterministic modelling and simulation/probabilistic methods are, as a rule, used separately to analyse physical processes and random or uncertain events. In currently used probabilistic safety assessment, however, this separation is an issue: the lack of treatment of dynamic interactions between the physical processes on the one hand and random events on the other limits the assessment. In general, there are many mathematical modelling theories that can be used separately or in an integrated way to extend the possibilities of modelling and analysis. The Theory of Probabilistic Dynamics (TPD) and its augmented version based on the concept of stimulus and delay are introduced for dynamic reliability modelling and the simulation of accidents in hybrid (continuous-discrete) systems considering uncertain events. An approach of non-Markovian simulation and uncertainty analysis is discussed in order to adapt the Stimulus-Driven TPD for practical applications. The developed approach and related methods are used as a basis for a test-case simulation in view of applying various methods to severe accident scenario simulation and uncertainty analysis. For this, and for a wider analysis of accident sequences, the initial test-case specification is then extended and discussed. Finally, it is concluded that enhancing the modelling of stimulated dynamics with uncertainty and sensitivity analysis allows the detailed simulation of complex system characteristics and representation of their uncertainty. The developed approach of accident modelling and analysis can be efficiently used to estimate the reliability of hybrid systems and at the same time to analyse, and possibly decrease, the uncertainty of this estimate.

  10. Significant aspects of the external event analysis methodology of the Jose Cabrera NPP PSA

    International Nuclear Information System (INIS)

    Barquin Duena, A.; Martin Martinez, A.R.; Boneham, P.S.; Ortega Prieto, P.

    1994-01-01

    This paper describes the following advances in the methodology for the analysis of external events in the PSA of the Jose Cabrera NPP. In the fire analysis, a version of the COMPBRN3 code, modified by Empresarios Agrupados according to the guidelines of Appendix D of NUREG/CR-5088, was used. Generic cases were modelled and general conclusions obtained that are applicable to fire propagation in closed areas. The damage times obtained were appreciably lower than those obtained with the previous version of the code. The flood analysis methodology is based on the construction of event trees to represent flood propagation dependent on the condition of the communication paths between areas, and trees showing propagation stages as a function of affected areas and damaged mitigation equipment. To determine the temporal evolution of the flood level in each area, the CAINZO-EA code has been developed, adapted to specific plant characteristics. In both the fire and flood analyses a quantification methodology has been adopted which consists of analysing the damage caused at each stage of growth or propagation and identifying, in the internal events models, the gates, basic events or headers to which failure due to damage (probability 1) is assigned. (Author)

  11. Climate network analysis of regional precipitation extremes: The true story told by event synchronization

    Science.gov (United States)

    Odenweller, Adrian; Donner, Reik V.

    2017-04-01

    Over the last decade, complex network methods have been frequently used for characterizing spatio-temporal patterns of climate variability from a complex systems perspective, yielding new insights into time-dependent teleconnectivity patterns and couplings between different components of the Earth's climate. Among the foremost results reported, network analyses of the synchronicity of extreme events, as captured by so-called event synchronization, have been proposed as powerful tools for disentangling the spatio-temporal organization of particularly extreme rainfall events and anticipating the timing of monsoon onsets or extreme floods. Rooted in the analysis of spike-train synchrony in the neurosciences, event synchronization has the great advantage of automatically classifying pairs of events arising at two distinct spatial locations as temporally close (and, thus, possibly statistically or even dynamically interrelated) or not, without the necessity of selecting an additional parameter in terms of a maximally tolerable delay between these events. This consideration is conceptually justified in the case of the original application to spike trains in electroencephalogram (EEG) recordings, where the inter-spike intervals show relatively narrow distributions at high temporal sampling rates. However, in climate studies, precipitation extremes defined by daily precipitation sums exceeding a certain empirical percentile of their local distribution exhibit a distinctively different type of distribution of waiting times between subsequent events. This raises conceptual concerns as to whether event synchronization is still appropriate for detecting interlinkages between spatially distributed precipitation extremes. In order to study this problem in more detail, we employ event synchronization together with an alternative similarity measure for event sequences, event coincidence rates, which requires a manual setting of the tolerable maximum delay between two
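
    The alternative measure named at the end of the abstract is easy to state; a minimal sketch of a one-directional event coincidence rate with an explicit delay window, on invented extreme-event day indices, follows:

    ```python
    import numpy as np

    def event_coincidence_rate(times_a, times_b, delta_t):
        """Fraction of events in A followed by at least one event in B
        within the tolerance window delta_t (one directional variant)."""
        times_b = np.sort(np.asarray(times_b))
        hits = sum(np.any((times_b >= ta) & (times_b <= ta + delta_t))
                   for ta in times_a)
        return hits / len(times_a)

    # Hypothetical extreme-rainfall event days at two grid points.
    a = np.array([3, 40, 77, 120, 200])
    b = np.array([5, 41, 90, 119, 260])
    for dt in (1, 2, 5):   # unlike event synchronization, the delay is explicit
        print(f"delta_t = {dt} d: ECR = {event_coincidence_rate(a, b, dt):.2f}")
    ```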

  12. Agent Based Simulation of Group Emotions Evolution and Strategy Intervention in Extreme Events

    Directory of Open Access Journals (Sweden)

    Bo Li

    2014-01-01

    Full Text Available Agent-based simulation has become a prominent approach in the computational modeling and analysis of public emergency management in social science research. Group emotion evolution, information diffusion, and collective behavior selection make the study of extreme incidents a complex-systems problem, which requires new methods for incident management and strategy evaluation. This paper studies group emotion evolution and intervention strategy effectiveness using agent-based simulation. Employing a computational experimentation methodology, we construct group emotion evolution as a complex system and test the effects of three strategies. In addition, an events-chain model is proposed to capture the cumulative influence of temporally successive events. Each strategy is examined through three simulation experiments, including two synthetic scenarios and a real case study. We show how various strategies can impact group emotion evolution in terms of complex emergence and cumulative emotional influence in extreme events. This paper also provides an effective method for using agent-based simulation to study complex collective behavior evolution in extreme incidents, emergency, and security research domains.

  13. Disruptive event uncertainties in a perturbation approach to nuclear waste repository risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Harvey, T.F.

    1980-09-01

    A methodology is developed for incorporating a full range of the principal forecasting uncertainties into a risk analysis of a nuclear waste repository. The result of this methodology is a set of risk curves similar to those used by Rasmussen in WASH-1400. The set of curves is partially derived from a perturbation approach to analyze potential disruptive event sequences. Such a scheme could be useful in truncating the number of disruptive event scenarios and providing guidance to those establishing data-base development priorities.

  14. Denoising of chaotic signal using independent component analysis and empirical mode decomposition with circulate translating

    Science.gov (United States)

    Wen-Bo, Wang; Xiao-Dong, Zhang; Yuchan, Chang; Xiang-Li, Wang; Zhao, Wang; Xi, Chen; Lei, Zheng

    2016-01-01

    In this paper, a new method to reduce noise in chaotic signals based on ICA (independent component analysis) and EMD (empirical mode decomposition) is proposed. The basic idea is first to decompose the chaotic signal and construct multidimensional input vectors, based on EMD and its translation invariance. Second, independent component analysis is performed on the input vectors, which amounts to a self-adapting denoising of the intrinsic mode functions (IMFs) of the chaotic signal. Finally, the denoised IMFs are combined into the new chaotic signal. Experiments were carried out on a Lorenz chaotic signal contaminated with different Gaussian noises and on the monthly observed chaotic sunspot sequence. The results show that the proposed method is effective for denoising chaotic signals. Moreover, it can effectively correct the center point in phase space, bringing the reconstruction closer to the real trajectory of the chaotic attractor. Project supported by the National Science and Technology, China (Grant No. 2012BAJ15B04), the National Natural Science Foundation of China (Grant Nos. 41071270 and 61473213), the Natural Science Foundation of Hubei Province, China (Grant No. 2015CFB424), the State Key Laboratory Foundation of Satellite Ocean Environment Dynamics, China (Grant No. SOED1405), the Hubei Provincial Key Laboratory Foundation of Metallurgical Industry Process System Science, China (Grant No. Z201303), and the Hubei Key Laboratory Foundation of Transportation Internet of Things, Wuhan University of Technology, China (Grant No. 2015III015-B02).
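
    A stripped-down sketch of the ICA-on-IMFs idea (omitting the paper's circulate-translation construction), assuming the PyEMD and scikit-learn packages; the noise criterion below, dropping near-Gaussian (low-kurtosis) components, is an invented stand-in for the paper's self-adapting rule:

    ```python
    import numpy as np
    from PyEMD import EMD
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    t = np.linspace(0, 8 * np.pi, 2000)
    clean = np.sin(t) + 0.5 * np.sin(3.1 * t)     # stand-in for a chaotic series
    noisy = clean + 0.3 * rng.standard_normal(t.size)

    imfs = EMD().emd(noisy)                       # rows: IMFs, fine to coarse
    ica = FastICA(n_components=imfs.shape[0], random_state=0)
    sources = ica.fit_transform(imfs.T)           # unmix the IMFs

    # Crude criterion (assumption): near-Gaussian components carry the noise.
    kurt = ((sources - sources.mean(0)) ** 4).mean(0) / sources.var(0) ** 2 - 3.0
    sources[:, np.abs(kurt) < 0.5] = 0.0          # zero out suspected noise

    denoised = ica.inverse_transform(sources).T.sum(axis=0)
    print("residual RMS:", np.sqrt(np.mean((denoised - clean) ** 2)))
    ```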

  15. Improving the extraction of complex regulatory events from scientific text by using ontology-based inference.

    Science.gov (United States)

    Kim, Jung-Jae; Rebholz-Schuhmann, Dietrich

    2011-10-06

    The extraction of complex events from biomedical text is a challenging task and requires in-depth semantic analysis. Previous approaches associate lexical and syntactic resources with ontologies for the semantic analysis, but fall short of testing the benefits of using domain knowledge. We developed a system that deduces implicit events from explicitly expressed events by using inference rules that encode domain knowledge. We evaluated the system with the inference module on three tasks. First, when tested against a corpus with manually annotated events, the inference module of our system contributes 53.2% of correct extractions, but does not cause any incorrect results. Second, the system overall reproduces 33.1% of the transcription regulatory events contained in RegulonDB (up to 85.0% precision) and the inference module is required for 93.8% of the reproduced events. Third, we applied the system with minimal adaptations to the identification of cell activity regulation events, confirming that the inference improves the performance of the system on this task as well. Our research shows that inference based on domain knowledge plays a significant role in extracting complex events from text. This approach has great potential for recognizing the complex concepts of biomedical ontologies such as the Gene Ontology in the literature.

  16. Improving the extraction of complex regulatory events from scientific text by using ontology-based inference

    Directory of Open Access Journals (Sweden)

    Kim Jung-jae

    2011-10-01

    Full Text Available Abstract Background The extraction of complex events from biomedical text is a challenging task and requires in-depth semantic analysis. Previous approaches associate lexical and syntactic resources with ontologies for the semantic analysis, but fall short of testing the benefits of using domain knowledge. Results We developed a system that deduces implicit events from explicitly expressed events by using inference rules that encode domain knowledge. We evaluated the system with the inference module on three tasks. First, when tested against a corpus with manually annotated events, the inference module of our system contributes 53.2% of correct extractions, but does not cause any incorrect results. Second, the system overall reproduces 33.1% of the transcription regulatory events contained in RegulonDB (up to 85.0% precision) and the inference module is required for 93.8% of the reproduced events. Third, we applied the system with minimal adaptations to the identification of cell activity regulation events, confirming that the inference improves the performance of the system on this task as well. Conclusions Our research shows that inference based on domain knowledge plays a significant role in extracting complex events from text. This approach has great potential for recognizing the complex concepts of biomedical ontologies such as the Gene Ontology in the literature.
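
    A toy illustration of the ontology-based inference this record (and its duplicate above) describes: explicit event tuples are expanded along is-a links until a fixed point. The miniature ontology fragment, event classes, and gene names are all invented:

    ```python
    IS_A = {  # miniature ontology fragment (assumed)
        "positive_regulation_of_transcription": "regulation_of_transcription",
        "regulation_of_transcription": "regulation_of_gene_expression",
    }

    def infer(events):
        """Deduce implicit events by walking is-a links to a fixed point."""
        derived = set(events)
        frontier = list(events)
        while frontier:
            etype, agent, theme = frontier.pop()
            parent = IS_A.get(etype)
            if parent and (parent, agent, theme) not in derived:
                derived.add((parent, agent, theme))
                frontier.append((parent, agent, theme))
        return derived

    explicit = {("positive_regulation_of_transcription", "FNR", "narG")}
    for event in sorted(infer(explicit)):   # the explicit plus two implicit events
        print(event)
    ```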

  17. Internal event analysis for Laguna Verde Unit 1 Nuclear Power Plant. Accident sequence quantification and results

    International Nuclear Information System (INIS)

    Huerta B, A.; Aguilar T, O.; Nunez C, A.; Lopez M, R.

    1994-01-01

    The Level 1 results of the Laguna Verde Nuclear Power Plant PRA are presented in the 'Internal Event Analysis for Laguna Verde Unit 1 Nuclear Power Plant', CNSNS-TR 004, in five volumes. The reports are organized as follows: CNSNS-TR 004 Volume 1: Introduction and Methodology. CNSNS-TR 004 Volume 2: Initiating Event and Accident Sequences. CNSNS-TR 004 Volume 3: System Analysis. CNSNS-TR 004 Volume 4: Accident Sequence Quantification and Results. CNSNS-TR 004 Volume 5: Appendices A, B and C. This volume presents the development of the dependent failure analysis, the treatment of the support system dependencies, the identification of the shared-component dependencies, and the treatment of common cause failure. The identification of the main human actions considered, along with the possible recovery actions included, is also presented. The development of the database and the assumptions and limitations in the database are also described in this volume. The accident sequence quantification process and the resolution of the core vulnerable sequences are presented. The sources and treatment of uncertainties associated with failure rates, component unavailabilities, initiating event frequencies, and human error probabilities are also described. Finally, the main results and conclusions of the Internal Event Analysis for the Laguna Verde Nuclear Power Plant are presented. The total core damage frequency calculated is 9.03x10^-5 per year for internal events. The dominant accident sequences found are transients involving the loss of offsite power, station blackout accidents, and anticipated transients without SCRAM (ATWS). (Author)

  18. Physics analysis of the gang partial rod drive event

    International Nuclear Information System (INIS)

    Boman, C.; Frost, R.L.

    1992-08-01

    During the routine positioning of partial-length control rods in Gang 3 on the afternoon of Monday, July 27, 1992, the partial-length rods continued to drive into the reactor even after the operator released the controlling toggle switch. In response to this occurrence, the Safety Analysis and Engineering Services Group (SAEG) requested that the Applied Physics Group (APG) analyze the gang partial rod drive event. Although similar accident scenarios were considered in the analyses for Chapter 15 of the Safety Analysis Report (SAR), APG and SAEG conferred and agreed that this particular type of gang partial-length rod motion event was not included in the SAR. This report details the analysis.

  19. Modeling the recurrent failure to thrive in less than two-year children: recurrent events survival analysis.

    Science.gov (United States)

    Saki Malehi, Amal; Hajizadeh, Ebrahim; Ahmadi, Kambiz; Kholdi, Nahid

    2014-01-01

    This study aims to evaluate the recurrence of failure to thrive (FTT) events over time. This longitudinal study was conducted from February 2007 to July 2009. The primary outcome was growth failure. The analysis was based on recurrent-event methods applied to 1283 children, some of whom had experienced FTT several times. Fifty-nine percent of the children had experienced FTT at least once and 5.3% of them had experienced it up to four times. The Prentice-Williams-Peterson (PWP) model revealed significant relationships between diarrhea (HR=1.26), respiratory infections (HR=1.25), urinary tract infections (HR=1.51), discontinuation of breast-feeding (HR=1.96), teething (HR=1.18), and initiation age of complementary feeding (HR=1.11) and the hazard rate of the first FTT event. The recurrent nature of FTT is a central issue; taking it into account increases the accuracy of the analysis of the FTT event process and can identify different risk factors for each FTT recurrence.
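
    A minimal sketch of a PWP-type recurrent-event fit, assuming the lifelines package and invented long-format data (one row per child per episode at risk, gap times stratified by event order and clustered on the child):

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(0)
    n = 300                                  # child-by-episode rows, hypothetical
    df = pd.DataFrame({
        "id": rng.integers(0, 100, n),       # child identifier
        "order": rng.integers(1, 4, n),      # FTT event number: the PWP stratum
        "diarrhea": rng.integers(0, 2, n),
    })
    # Invented gap times: diarrhea shortens the time to the next episode.
    df["gap"] = rng.exponential(6.0 / (1.0 + 0.3 * df["diarrhea"]))
    df["event"] = rng.binomial(1, 0.8, n)    # 1 = FTT observed, 0 = censored

    # PWP gap-time model: Cox regression stratified by event order, with
    # robust standard errors clustered on the child.
    cph = CoxPHFitter()
    cph.fit(df, duration_col="gap", event_col="event", formula="diarrhea",
            strata=["order"], cluster_col="id", robust=True)
    cph.print_summary()
    ```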

  20. Multistate event history analysis with frailty

    Directory of Open Access Journals (Sweden)

    Govert Bijwaard

    2014-05-01

    Full Text Available Background: In survival analysis, a large literature on frailty models, or models with unobserved heterogeneity, exists. In the growing literature and modelling of multistate models, this issue is still in its infancy. Ignoring frailty can, however, produce incorrect results. Objective: This paper presents how frailties can be incorporated into multistate models, with an emphasis on semi-Markov multistate models with a mixed proportional hazard structure. Methods: First, the aspects of frailty modeling in univariate (proportional hazard, Cox) and multivariate event history models are addressed. The implications of choosing shared or correlated frailty are highlighted. The relevant differences with recurrent events data are covered next. Multistate models are event history models that can have both multivariate and recurrent events. Incorporating frailty in multistate models therefore brings all the previously addressed issues together. Assuming a discrete frailty distribution allows for a very general correlation structure among the transition hazards in a multistate model. Although some estimation procedures are covered, the emphasis is on conceptual issues. Results: The importance of multistate frailty modeling is illustrated with data on the labour market and migration dynamics of recent immigrants to the Netherlands.

  1. Performance Analysis: Work Control Events Identified January - August 2010

    Energy Technology Data Exchange (ETDEWEB)

    De Grange, C E; Freeman, J W; Kerr, C E; Holman, G; Marsh, K; Beach, R

    2011-01-14

    elements of an institutional work planning and control system. By the end of that year this system was documented and implementation had begun. In 2009, training of the workforce began, and as of the time of this report more than 50% of authorized Integration Work Sheets (IWS) use the activity-based planning process. In 2010, LSO independently reviewed the work planning and control process and confirmed to the Laboratory that the Integrated Safety Management (ISM) System was implemented. LLNL conducted a cross-directorate management self-assessment of work planning and control and is developing actions to respond to the issues identified. Ongoing efforts to strengthen the work planning and control process and to improve the quality of LLNL work packages are in progress: completion of the remaining actions in response to the 2009 DOE Office of Health, Safety, and Security (HSS) evaluation of LLNL's ISM System; scheduling of more than 14 work planning and control self-assessments in FY11; continuing alignment of subcontractor work control with the institutional work planning and control system; and continued maintenance of the electronic IWS application. The 24 events included in this analysis were caused by errors in the first four of the five ISMS functions. The most frequent cause was errors in analyzing the hazards (Function 2). The second most frequent cause was errors in defining the work (Function 1), followed by errors during the performance of work (Function 4). Interestingly, very few errors in developing controls (Function 3) resulted in events. This leads one to conclude that if improvements are made to defining the scope of work and analyzing the potential hazards, LLNL may reduce the frequency or severity of events. Analysis of the 24 events resulted in the identification of ten common causes. Some events had multiple causes, so a total of 39 causes were identified for the 24 events. The most frequent cause was workers, supervisors, or experts

  2. Modeling crowd behavior based on the discrete-event multiagent approach

    OpenAIRE

    Лановой, Алексей Феликсович; Лановой, Артем Алексеевич

    2014-01-01

    The crowd is a temporary, relatively unorganized group of people who are in close physical contact with each other. The behavior of an individual outside the crowd is determined by many factors associated with his intellectual activity, but inside the crowd a person loses his identity and begins to obey simpler laws of behavior. One approach to the construction of a multi-level model of the crowd using a discrete-event multiagent approach was described in the paper. Based on this analysi...

  3. Analysis of syntactic and semantic features for fine-grained event-spatial understanding in outbreak news reports

    Directory of Open Access Journals (Sweden)

    Chanlekha Hutchatai

    2010-03-01

    Full Text Available Abstract Background Previous studies have suggested that epidemiological reasoning needs fine-grained modelling of events, especially their spatial and temporal attributes. While the temporal analysis of events has been intensively studied, far less attention has been paid to their spatial analysis. This article aims at filling the gap concerning automatic event-spatial attribute analysis in order to support health surveillance and epidemiological reasoning. Results In this work, we propose a methodology that provides a detailed analysis of each event reported in news articles to recover the most specific locations where it occurs. Various features for recognizing the spatial attributes of events were studied and incorporated into models trained by several machine learning techniques. The best performance for spatial attribute recognition is very promising: 85.9% F-score (86.75% precision/85.1% recall). Conclusions We extended our work on event-spatial attribute recognition by focusing on machine learning techniques, namely CRF, SVM, and decision trees. Our approach avoided the costly development of an external knowledge base by employing feature sources that can be acquired locally from the analyzed document. The results showed that the CRF model performed best. Our study indicated that the nearest location and the previous event location are the most important features for the CRF and SVM models, while the location extracted from the verb's subject is the most important for the decision tree model.

  4. Probabilistic analysis of extreme wind events

    Energy Technology Data Exchange (ETDEWEB)

    Chaviaropoulos, P.K. [Center for Renewable Energy Sources (CRES), Pikermi Attikis (Greece)

    1997-12-31

    A vital task in wind engineering and meteorology is to understand, measure, analyse and forecast extreme wind conditions, due to their significant effects on human activities and installations like buildings, bridges or wind turbines. The latest version of the IEC standard (1996) pays particular attention to the extreme wind events that have to be taken into account when designing or certifying a wind generator. In fact, the extreme wind events within a 50-year period are those which determine the 'static' design of most of the wind turbine components. The extremes which are important for the safety of wind generators are those associated with the so-called 'survival wind speed', the extreme operating gusts and the extreme wind direction changes. A probabilistic approach for the analysis of these events is proposed in this paper. Emphasis is put on establishing the relation between extreme values and physically meaningful 'site calibration' parameters, like the probability distribution of the annual wind speed, turbulence intensity and power spectral properties. (Author)
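
    A standard probabilistic treatment of such extremes fits an extreme-value distribution to annual maxima and reads off the return level; a minimal sketch with invented wind data follows (the Gumbel/EV-I form is one common choice, not necessarily the paper's):

    ```python
    from scipy.stats import gumbel_r

    # Hypothetical 30-year record of annual maximum wind speeds [m/s].
    annual_max = gumbel_r.rvs(loc=28.0, scale=4.0, size=30, random_state=3)

    loc, scale = gumbel_r.fit(annual_max)     # maximum-likelihood fit
    T = 50.0                                  # return period [years]
    v_T = gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
    print(f"estimated {T:.0f}-year extreme wind speed: {v_T:.1f} m/s")
    ```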

  5. Sun, Moon and Earthquakes

    Science.gov (United States)

    Kolvankar, V. G.

    2013-12-01

    During a study conducted to find the effect of Earth tides on the occurrence of earthquakes, for small areas [typically 1000 km x 1000 km] of high-seismicity regions, it was noticed that the Sun's position in terms of universal time [GMT] shows links to the sum of EMD [longitude of earthquake location minus longitude of the Moon's footprint on Earth] and SEM [Sun-Earth-Moon angle]. This paper provides the details of this relationship after studying earthquake data for over forty high-seismicity regions of the world. It was found that over 98% of the earthquakes for these different regions, examined for the period 1973-2008, show a direct relationship between the Sun's position [GMT] and [EMD+SEM]. As the time changes from 00-24 hours, the factor [EMD+SEM] changes through 360 degrees, and plotting these two variables for earthquakes from different small regions reveals a simple 45-degree straight-line relationship between them. This relationship was tested for all earthquakes and earthquake sequences of magnitude 2.0 and above. This study conclusively proves how the Sun and the Moon govern all earthquakes. Fig. 12 [A+B]: the left-hand figure provides a 24-hour plot for forty consecutive days including the main event (00:58:23 on 26.12.2004, Lat. +3.30, Long. +95.98, Mb 9.0, EQ count 376); the right-hand figure provides an earthquake plot of (EMD+SEM) vs GMT timings for the same data. All 376 events including the main event faithfully follow the straight-line curve.

  6. Analysis and Modelling of Taste and Odour Events in a Shallow Subtropical Reservoir

    Directory of Open Access Journals (Sweden)

    Edoardo Bertone

    2016-08-01

    Full Text Available Understanding and predicting taste and odour events is as difficult as it is critical for drinking water treatment plants. Following a number of events in recent years, a comprehensive statistical analysis of data from Lake Tingalpa (Queensland, Australia) was conducted. Historical manual sampling data, as well as data remotely collected by a vertical profiler, were collected; regression analysis and self-organising maps were then used to determine correlations between taste and odour compounds and potential input variables. Results showed that the predominant taste and odour compound was geosmin. Although one of the main predictors was the occurrence of cyanobacteria blooms, it was noticed that the cyanobacteria species was also critical. Additionally, water temperature, reservoir volume and oxidised nitrogen availability were key inputs determining the occurrence and magnitude of the geosmin peak events. Based on the results of the statistical analysis, a predictive regression model was developed to provide indications of the potential occurrence, and magnitude, of peaks in geosmin concentration. Additionally, it was found that the blue-green algae probe of the lake's vertical profiler has the potential to be used as one of the inputs for an automated geosmin early warning system.

  7. Surrogate marker analysis in cancer clinical trials through time-to-event mediation techniques.

    Science.gov (United States)

    Vandenberghe, Sjouke; Duchateau, Luc; Slaets, Leen; Bogaerts, Jan; Vansteelandt, Stijn

    2017-01-01

    The meta-analytic approach is the gold standard for validation of surrogate markers, but has the drawback of requiring data from several trials. We refine modern mediation analysis techniques for time-to-event endpoints and apply them to investigate whether pathological complete response can be used as a surrogate marker for disease-free survival in the EORTC 10994/BIG 1-00 randomised phase 3 trial, in which locally advanced breast cancer patients were randomised to either taxane- or anthracycline-based neoadjuvant chemotherapy. In the mediation analysis, the treatment effect is decomposed into an indirect effect via pathological complete response and the remaining direct effect. The analysis shows that only 4.2% of the treatment effect on disease-free survival after five years is mediated by the treatment effect on pathological complete response. There is thus no evidence from our analysis that pathological complete response is a valuable surrogate marker for evaluating the effect of taxane- versus anthracycline-based chemotherapies on the progression-free survival of locally advanced breast cancer patients. The proposed analysis strategy is broadly applicable to mediation analyses of time-to-event endpoints, is easy to apply, and outperforms existing strategies in terms of precision as well as robustness against model misspecification.

  8. Event-based Simulation Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    De Raedt, H.; Michielsen, K.; Jaeger, G; Khrennikov, A; Schlosshauer, M; Weihs, G

    2011-01-01

    We present a corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one. The event-based corpuscular model gives a unified

  9. A case-crossover analysis of forest fire haze events and mortality in Malaysia

    Science.gov (United States)

    Sahani, Mazrura; Zainon, Nurul Ashikin; Wan Mahiyuddin, Wan Rozita; Latif, Mohd Talib; Hod, Rozita; Khan, Md Firoz; Tahir, Norhayati Mohd; Chan, Chang-Chuan

    2014-10-01

    The Southeast Asian (SEA) haze events due to forest fires are recurrent and affect Malaysia, particularly the Klang Valley region. The aim of this study is to examine the effect of haze days, caused by biomass burning in Southeast Asia, on daily mortality in the Klang Valley region between 2000 and 2007. We used a case-crossover study design to model the effect of haze, defined from PM10 concentrations, on daily mortality. The time-stratified control sampling approach was used, adjusted for particulate matter (PM10) concentrations, time trends and meteorological influences. Based on time series analysis of PM10 and backward trajectory analysis, haze days were defined as days when the daily PM10 concentration exceeded 100 μg/m3. A total of 88 haze days were identified in the Klang Valley region during the study period. A total of 126,822 deaths were recorded, of which respiratory mortality represented 8.56% (N = 10,854). Haze events were found to be significantly associated with natural and respiratory mortality at various lags. For natural mortality, haze events at lag 2 showed a significant association with children less than 14 years old (odds ratio (OR) = 1.41; 95% confidence interval (CI) = 1.01-1.99). Respiratory mortality was significantly associated with haze events for all ages at lag 0 (OR = 1.19; 95% CI = 1.02-1.40). Age- and gender-specific analysis showed an incremental risk of respiratory mortality among all males and among elderly males above 60 years old at lag 0 (OR = 1.34; 95% CI = 1.09-1.64 and OR = 1.41; 95% CI = 1.09-1.84, respectively). Adult females aged 15-59 years old were found to be at the highest risk of respiratory mortality at lag 5 (OR = 1.66; 95% CI = 1.03-1.99). This study clearly indicates that exposure to haze events had both immediate and delayed effects on mortality.

  10. EVNTRE, Code System for Event Progression Analysis for PRA

    International Nuclear Information System (INIS)

    2002-01-01

    1 - Description of program or function: EVNTRE is a generalized event tree processor that was developed for use in probabilistic risk analysis of severe accident progressions for nuclear power plants. The general nature of EVNTRE makes it applicable to a wide variety of analyses that involve the investigation of a progression of events leading to a large number of sets of conditions or scenarios. EVNTRE efficiently processes large, complex event trees. It can assign probabilities to event tree branch points in several different ways, classify pathways or outcomes into user-specified groupings, and sample input distributions of probabilities and parameters. PSTEVNT, a post-processor program used to sort and reclassify the 'binned' data output from EVNTRE and generate summary tables, is included. 2 - Methods: EVNTRE processes event trees that are cast in the form of questions or events, with multiple-choice answers for each question. Split fractions (probabilities or frequencies that sum to unity) are either supplied or calculated for the branches of each question in a path-dependent manner. EVNTRE traverses the tree, enumerating the leaves of the tree and calculating their probabilities or frequencies based upon the initial probability or frequency and the split fractions of the branches taken along the corresponding path to an individual leaf. The questions in the event tree are usually grouped to address specific phases or time regimes in the progression of the scenario or severe accident. Grouping or binning of each path through the event tree in terms of a small number of characteristics or attributes is allowed. Boolean expressions of the branches taken are used to select the appropriate values of the characteristics of interest for the given path. Typically, the user specifies a cutoff tolerance for the frequency of a pathway to terminate further exploration. Multiple sets of input to an event tree can be processed by using Monte Carlo sampling to generate
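
    A toy version of the traversal just described: enumerate the leaves of a small tree, multiplying split fractions along each path and truncating pathways below a frequency cutoff. The questions and numbers are invented, and the real EVNTRE adds path-dependent split fractions and binning on top of this:

    ```python
    QUESTIONS = [   # (question, [(branch, split fraction), ...]); sums are unity
        ("offsite power recovered?", [("yes", 0.9), ("no", 0.1)]),
        ("diesel generators start?", [("yes", 0.95), ("no", 0.05)]),
        ("AAC D/G available?",       [("yes", 0.8), ("no", 0.2)]),
    ]
    CUTOFF = 1e-3   # user-specified frequency tolerance

    def walk(freq=1.0, depth=0, path=()):
        if freq < CUTOFF:                    # truncate rare pathways
            return
        if depth == len(QUESTIONS):          # a leaf: report its frequency
            print(f"{freq:9.2e}  {' / '.join(path)}")
            return
        question, branches = QUESTIONS[depth]
        for branch, split in branches:
            walk(freq * split, depth + 1, path + (f"{question} {branch}",))

    walk()
    ```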

  11. The Run 2 ATLAS Analysis Event Data Model

    CERN Document Server

    SNYDER, S; The ATLAS collaboration; NOWAK, M; EIFERT, T; BUCKLEY, A; ELSING, M; GILLBERG, D; MOYSE, E; KOENEKE, K; KRASZNAHORKAY, A

    2014-01-01

    During the LHC's first Long Shutdown (LS1), ATLAS set out to establish a new analysis model based on the experience gained during Run 1. A key component of this is a new Event Data Model (EDM), called the xAOD. This format, which is now in production, provides the following features: a separation of the EDM into interface classes that the user code directly interacts with, and data storage classes that hold the payload data. The user sees an Array-of-Structs (AoS) interface, while the data is stored in a Struct-of-Arrays (SoA) format in memory, thus making it possible to efficiently auto-vectorise reconstruction code. A simple way of augmenting and reducing the information saved for different data objects. This makes it possible to easily decorate objects with new properties during data analysis, and to remove properties that the analysis does not need. A persistent file format that can be explored directly with ROOT, either with or without loading any additional libraries. This allows fast interactive naviga...
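
    The AoS-interface-over-SoA-storage idea generalises beyond the xAOD's C++; a minimal Python sketch (invented class names, not the xAOD API) shows per-property arrays backing object-like element access:

    ```python
    import numpy as np

    class ElectronContainer:
        """Payload stored as one contiguous array per property (SoA)."""
        def __init__(self, pt, eta):
            self.pt = np.asarray(pt)
            self.eta = np.asarray(eta)

        def __len__(self):
            return self.pt.size

        def __getitem__(self, i):            # AoS-style view onto the SoA payload
            if not 0 <= i < len(self):
                raise IndexError(i)
            return ElectronProxy(self, i)

    class ElectronProxy:
        """Lightweight per-object handle; reads go back to the columns."""
        def __init__(self, container, index):
            self._c, self._i = container, index

        @property
        def pt(self):
            return self._c.pt[self._i]

    electrons = ElectronContainer(pt=[45.2, 23.1, 61.7], eta=[0.3, -1.2, 2.0])
    print([e.pt for e in electrons if e.pt > 30])   # object-wise interface
    print(electrons.pt * 1.01)                      # vectorisable column access
    ```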

  12. Event-Based Corpuscular Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    Michielsen, K.; Jin, F.; Raedt, H. De

    A corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one is presented. The event-based corpuscular model is shown to give a

  13. Dependent failure analysis of NPP data bases

    International Nuclear Information System (INIS)

    Cooper, S.E.; Lofgren, E.V.; Samanta, P.K.; Wong Seemeng

    1993-01-01

    A technical approach for analyzing plant-specific databases for vulnerabilities to dependent failures has been developed and applied. Since the focus of this work is to aid in the formulation of defenses against dependent failures, rather than to quantify dependent failure probabilities, the approach of this analysis is fundamentally different. For instance, the determination of component failure dependencies has been based upon identical failure mechanisms related to component piece-part failures, rather than failure modes. Also, component failures involving all types of component function loss (e.g., catastrophic, degraded, incipient) are equally important to the predictive purposes of dependent failure defense development. Consequently, dependent component failures are identified in this study with a different dependent failure definition that uses a component failure mechanism categorization scheme. In this context, clusters of component failures which satisfy the revised dependent failure definition are termed common failure mechanism (CFM) events. Motor-operated valves (MOVs) in two nuclear power plant databases have been analyzed with this approach. The analysis results include seven different failure mechanism categories; the identified potential CFM events; an assessment of the risk significance of the potential CFM events using existing probabilistic risk assessments (PRAs); and postulated defenses against the identified potential CFM events. (orig.)

  14. Contrasting safety assessments of a runway incursion scenario: Event sequence analysis versus multi-agent dynamic risk modelling

    International Nuclear Information System (INIS)

    Stroeve, Sybert H.; Blom, Henk A.P.; Bakker, G.J.

    2013-01-01

    In the safety literature it has been argued that in a complex socio-technical system safety cannot be analysed well by event-sequence-based approaches, but requires capturing the complex interactions and performance variability of the socio-technical system. In order to evaluate the quantitative and practical consequences of these arguments, this study compares two approaches to assessing the accident risk of an example safety-critical socio-technical system. It contrasts an event-sequence-based assessment with a multi-agent dynamic risk model (MA-DRM) based assessment, both performed for a particular runway incursion scenario. The event sequence analysis uses the well-known event tree modelling formalism, and the MA-DRM based approach combines agent-based modelling, hybrid Petri nets and rare event Monte Carlo simulation. The comparison addresses qualitative and quantitative differences in the methods, the attained risk levels, and the prime factors influencing the safety of the operation. The assessments show considerable differences in the accident risk implications of the performance of human operators and technical systems in the runway incursion scenario. In contrast with the event-sequence-based results, the MA-DRM based results show that the accident risk is not manifest from the performance of, and relations between, individual human operators and technical systems. Instead, the safety risk emerges from the totality of the performance and interactions in the agent-based model of the safety-critical operation considered, which coincides very well with the argumentation in the safety literature.

  15. Internal event analysis of Laguna Verde Unit 1 Nuclear Power Plant. System Analysis

    International Nuclear Information System (INIS)

    Huerta B, A.; Aguilar T, O.; Nunez C, A.; Lopez M, R.

    1993-01-01

    The Level 1 results of the Laguna Verde Nuclear Power Plant PRA are presented in the "Internal Event Analysis of Laguna Verde Unit 1 Nuclear Power Plant", CNSNS-TR-004, in five volumes. The reports are organized as follows: CNSNS-TR-004 Volume 1: Introduction and Methodology. CNSNS-TR-004 Volume 2: Initiating Event and Accident Sequences. CNSNS-TR-004 Volume 3: System Analysis. CNSNS-TR-004 Volume 4: Accident Sequence Quantification and Results. CNSNS-TR-004 Volume 5: Appendices A, B and C. This volume presents the results of the system analysis for the Laguna Verde Unit 1 Nuclear Power Plant. The system analysis involved the development of logical models for all the systems included in the accident sequence event tree headings, and for all the support systems required to operate the front line systems. For the Internal Event analysis for Laguna Verde, 16 front line systems and 5 support systems were included. Detailed fault trees were developed for most of the important systems. Simplified fault trees focusing on major faults were constructed for those systems that can be adequately represented using this kind of modeling. For those systems where fault tree models were not constructed, actual data were used to represent the dominant failures of the systems. The main failures included in the fault trees are hardware failures, test and maintenance unavailabilities, common cause failures, and human errors. The SETS and TEMAC codes were used to perform the qualitative and quantitative fault tree analyses. (Author)

  16. Exploratory trend and pattern analysis of 1981 through 1983 Licensee Event Report data. Main report. Volume 1

    International Nuclear Information System (INIS)

    Hester, O.V.; Groh, M.R.; Farmer, F.G.

    1986-10-01

    This report presents an overview of the 1981 through 1983 Sequence Coding and Search System (SCSS) data base that contains nuclear power plant operational data derived from Licensee Event Reports (LERs) submitted to the United States Nuclear Regulatory Commission (USNRC). Both overall event reporting and events related to specific components, subsystems, systems, and personnel are discussed. At all of these levels of information, software is used to generate count data for contingency tables. Contingency table analysis is the main tool for the trend and pattern analysis. The tables focus primarily on faults associated with various components and other items of interest across different plants. The abstracts and other SCSS information on the LERs accounting for unusual counts in the tables were examined to gain insights from the events. Trends and patterns in LER reporting and reporting of events for various component groups were examined through log-linear modeling techniques.
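
    The contingency-table step described above can be illustrated with a small sketch. The counts below are hypothetical LER tallies (rows: component groups, columns: plants); a chi-square test of independence flags the table cells whose counts deviate from expectation, which is where a log-linear trend analysis would start digging.

    ```python
    import numpy as np
    from scipy.stats import chi2_contingency

    # Hypothetical LER counts: rows = component groups, columns = plants.
    counts = np.array([
        [12,  7, 19],   # valves
        [ 5, 14,  6],   # pumps
        [ 9,  8, 11],   # diesel generators
    ])

    chi2, p, dof, expected = chi2_contingency(counts)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")

    # Standardized residuals point at the cells driving an unusual count,
    # i.e. the events whose LER abstracts would be examined more closely.
    residuals = (counts - expected) / np.sqrt(expected)
    print(residuals.round(2))
    ```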

  17. Volcano!: An Event-Based Science Module. Student Edition. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  18. Event based neutron activation spectroscopy and analysis algorithm using MLE and meta-heuristics

    International Nuclear Information System (INIS)

    Wallace, B.

    2014-01-01

    Techniques used in neutron activation analysis are often dependent on the experimental setup. In the context of developing a portable and high efficiency detection array, good energy resolution and half-life discrimination are difficult to obtain with traditional methods given the logistic and financial constraints. An approach different from that of spectrum addition and standard spectroscopy analysis was needed. The use of multiple detectors prompts the need for a flexible storage of acquisition data to enable sophisticated post-processing of information. Analogously to what is done in heavy ion physics, gamma detection counts are stored as two-dimensional events. This enables post-selection of energies and time frames without the need to modify the experimental setup. This method of storage also permits the use of more complex analysis tools. Given the nature of the problem at hand, a light and efficient analysis code had to be devised. A thorough understanding of the physical and statistical processes involved was used to create a statistical model. Maximum likelihood estimation was combined with meta-heuristics to produce a sophisticated curve-fitting algorithm. Simulated and experimental data were fed into the analysis code, yielding positive results in terms of half-life discrimination, peak identification and noise reduction. The code was also adapted to other fields of research such as heavy ion identification of the quasi-target (QT) and quasi-particle (QP). The approach used seems to translate well into other fields of research. (author)
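
    As a rough illustration of combining maximum likelihood estimation with a meta-heuristic for decay-curve fitting, the sketch below fits a single-isotope decay plus constant background to simulated event counts. The isotope parameters and the use of differential evolution are illustrative assumptions, not the paper's actual algorithm.

    ```python
    import numpy as np
    from scipy.optimize import differential_evolution
    from scipy.special import gammaln

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 600.0, 61)              # time bins (s), hypothetical
    true_rate = 500.0 * np.exp(-np.log(2) * t / 120.0) + 5.0
    counts = rng.poisson(true_rate)              # simulated activation data

    def neg_log_likelihood(params):
        amplitude, half_life, background = params
        mu = amplitude * np.exp(-np.log(2) * t / half_life) + background
        # Poisson NLL: sum over bins of mu - k*log(mu) + log(k!)
        return np.sum(mu - counts * np.log(mu) + gammaln(counts + 1))

    bounds = [(1.0, 5000.0), (10.0, 1000.0), (0.0, 50.0)]
    result = differential_evolution(neg_log_likelihood, bounds, seed=1)
    print("amplitude, half-life, background =", result.x.round(2))
    ```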

  19. Automated reasoning with dynamic event trees: a real-time, knowledge-based decision aide

    International Nuclear Information System (INIS)

    Touchton, R.A.; Gunter, A.D.; Subramanyan, N.

    1988-01-01

    The models and data contained in a probabilistic risk assessment (PRA) Event Sequence Analysis represent a wealth of information that can be used for dynamic calculation of event sequence likelihood. In this paper we report a new and unique computerization methodology which utilizes these data. This sub-system (referred to as PREDICTOR) has been developed and tested as part of a larger system. PREDICTOR performs a real-time (re)calculation of the estimated likelihood of core-melt as a function of plant status. This methodology uses object-oriented programming techniques from the artificial intelligence discipline that enable one to codify event tree and fault tree logic models and associated probabilities developed in a PRA study. Existence of off-normal conditions is reported to PREDICTOR, which then updates the relevant failure probabilities throughout the event tree and fault tree models by dynamically replacing the off-the-shelf (or prior) probabilities with new probabilities based on the current situation. The new event probabilities are immediately propagated through the models (using 'demons') and an updated core-melt probability is calculated. Along the way, the dominant non-success path of each event tree is determined and highlighted. (author)
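
    The core mechanic, replacing prior branch probabilities with condition-specific values and immediately repropagating them, can be sketched with a toy tree; the headings and numbers below are invented and are not PREDICTOR's actual models.

    ```python
    # Toy dynamic requantification: in this invented two-heading tree,
    # core melt requires failure of both injection systems.
    prior = {"hpi": 1e-3, "lpi": 1e-3}   # prior failure probabilities

    def core_melt_probability(p):
        return p["hpi"] * p["lpi"]

    print("baseline core-melt probability:", core_melt_probability(prior))

    # An off-normal condition is reported: one HPI train out of service.
    # Replace the prior probability and propagate through the model again.
    current = dict(prior, hpi=5e-2)
    print("updated core-melt probability :", core_melt_probability(current))
    ```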

  20. Preliminary Analysis of the Common Cause Failure Events for Domestic Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kang, Daeil; Han, Sanghoon

    2007-01-01

    It is known that common cause failure (CCF) events have a great effect on the safety and probabilistic safety assessment (PSA) results of nuclear power plants (NPPs). However, domestic studies have mainly focused on the analysis method and modeling of CCF events. Thus, an analysis of the CCF events for domestic NPPs was performed to establish a domestic database for CCF events and to deliver them to the operation office of the international common cause failure data exchange (ICDE) project. This paper presents the analysis results of the CCF events for domestic nuclear power plants.

  1. Volcano!: An Event-Based Science Module. Teacher's Guide. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research,…

  2. Analysis of system and of course of events

    International Nuclear Information System (INIS)

    Hoertner, H.; Kersting, E.J.; Puetter, B.M.

    1986-01-01

    The analysis of the system and of the course of events is used to determine the frequency of core melt-out accidents and to describe the safety-related boundary conditions of the corresponding accidents. The lecture is concerned with: the effect on the frequency of core melt-out accidents of system changes in the reference plant and of triggering events that were not assessed, or not assessed in sufficient detail, in phase A of the German Risk Study; the minimum requirements for system functions for controlling triggering events, i.e. for preventing core melt-out accidents; and the reliability data important for reliability investigations and frequency assessments. (orig./DG)

  3. Digital Image Stabilization Method Based on Variational Mode Decomposition and Relative Entropy

    Directory of Open Access Journals (Sweden)

    Duo Hao

    2017-11-01

    Full Text Available Cameras mounted on vehicles frequently suffer from image shake due to the vehicles’ motions. To remove jitter motions and preserve intentional motions, a hybrid digital image stabilization method is proposed that uses variational mode decomposition (VMD) and relative entropy (RE). In this paper, the global motion vector (GMV) is initially decomposed into several narrow-banded modes by VMD. REs, which exhibit the difference in probability distribution between two modes, are then calculated to identify the intentional and jitter motion modes. Finally, the summation of the jitter motion modes constitutes the jitter motions, whereas the subtraction of the resulting sum from the GMV represents the intentional motions. The proposed stabilization method is compared with several known methods, namely, the median filter (MF), Kalman filter (KF), wavelet decomposition (MD) method, empirical mode decomposition (EMD)-based method, and enhanced EMD-based method, to evaluate stabilization performance. Experimental results show that the proposed method outperforms the other stabilization methods.
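
    The relative-entropy step can be sketched on its own, assuming the VMD decomposition has already produced the modes (the two synthetic "modes" below stand in for that output): the KL divergence of each mode's amplitude histogram from a slow reference mode separates intentional from jitter motion.

    ```python
    import numpy as np
    from scipy.stats import entropy

    t = np.linspace(0, 10, 1000)
    intentional = 0.5 * t                        # slow, deliberate camera pan
    jitter = 0.3 * np.sin(2 * np.pi * 15 * t)    # high-frequency shake
    modes = {"mode1": intentional, "mode2": jitter}  # stand-ins for VMD output

    def hist(x, bins):
        h, _ = np.histogram(x, bins=bins, density=True)
        return h + 1e-12                         # avoid zeros in the divergence

    bins = np.linspace(-1, 5.5, 40)
    reference = hist(intentional, bins)          # low-frequency reference mode
    for name, mode in modes.items():
        kl = entropy(hist(mode, bins), reference)   # relative entropy D(p||q)
        print(name, "KL divergence vs reference:", round(float(kl), 3))
    # Modes far from the reference distribution are treated as jitter, summed,
    # and subtracted from the global motion vector.
    ```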

  4. ANALYSIS OF INPATIENT HOSPITAL STAFF MENTAL WORKLOAD BY MEANS OF DISCRETE-EVENT SIMULATION

    Science.gov (United States)

    2016-03-24

    ANALYSIS OF INPATIENT HOSPITAL STAFF MENTAL WORKLOAD BY MEANS OF DISCRETE-EVENT SIMULATION. Report no. AFIT-ENV-MS-16-M-166. Erich W

  5. Automatic detection of esophageal pressure events. Is there an alternative to rule-based criteria?

    DEFF Research Database (Denmark)

    Kruse-Andersen, S; Rütz, K; Kolberg, Jens Godsk

    1995-01-01

    of relevant pressure peaks at the various recording levels. Until now, this selection has been performed entirely by rule-based systems, requiring each pressure deflection to fit within predefined rigid numerical limits in order to be detected. However, due to great variations in the shapes of the pressure curves generated by muscular contractions, rule-based criteria do not always select the pressure events most relevant for further analysis. We have therefore been searching for a new concept for automatic event recognition. The present study describes a new system, based on the method of neurocomputing. ... 0.79-0.99 and accuracies of 0.89-0.98, depending on the recording level within the esophageal lumen. The neural networks often recognized peaks that clearly represented true contractions but that had been rejected by a rule-based system. We conclude that neural networks have potential for automatic detection...

  6. Human factors analysis and design methods for nuclear waste retrieval systems. Volume III. User's guide for the computerized event-tree analysis technique

    International Nuclear Information System (INIS)

    Casey, S.M.; Deretsky, Z.

    1980-08-01

    This document provides detailed instructions for using the Computerized Event-Tree Analysis Technique (CETAT), a program designed to assist a human factors analyst in predicting event probabilities in complex man-machine configurations found in waste retrieval systems. The instructions contained herein describe how to (a) identify the scope of a CETAT analysis, (b) develop operator performance data, (c) enter an event-tree structure, (d) modify a data base, and (e) analyze event paths and man-machine system configurations. Designed to serve as a tool for developing, organizing, and analyzing operator-initiated event probabilities, CETAT simplifies the tasks of the experienced systems analyst by organizing large amounts of data and performing cumbersome and time-consuming arithmetic calculations. The principal uses of CETAT in the waste retrieval development project will be to develop models of system reliability and to evaluate alternative equipment designs and operator tasks. As with any automated technique, however, the value of the output will be a function of the knowledge and skill of the analyst using the program.

  7. Using Real-time Event Tracking Sensitivity Analysis to Overcome Sensor Measurement Uncertainties of Geo-Information Management in Drilling Disasters

    Science.gov (United States)

    Tavakoli, S.; Poslad, S.; Fruhwirth, R.; Winter, M.

    2012-04-01

    This paper introduces an application of the novel EventTracker platform for instantaneous Sensitivity Analysis (SA) of large-scale real-time geo-information. Earth disaster management systems demand high-quality information to aid a quick and timely response to their evolving environments. The idea behind the proposed EventTracker platform is the assumption that modern information management systems are able to capture data in real time and have the technological flexibility to adjust their services to work with specific sources of data/information. However, to assure this adaptation in real time, the online data should be collected, interpreted, and translated into corrective actions in a concise and timely manner. This can hardly be handled by existing sensitivity analysis methods, because they rely on historical data and lazy processing algorithms. In event-driven systems, the effect of system inputs on the system state is of value, as events could cause this state to change. This 'event triggering' situation underpins the logic of the proposed approach. The event tracking sensitivity analysis method describes the system variables and states as a collection of events. The higher the occurrence of an input variable during the triggering of an event, the greater its potential impact on the final analysis of the system state. Experiments were designed to compare the proposed event tracking sensitivity analysis with existing entropy-based sensitivity analysis methods. The results show a 10% improvement in computational efficiency with no compromise in accuracy, and that the computational time to perform the sensitivity analysis is 0.5% of that required by the entropy-based method. The proposed method has been applied to real-world data in the context of preventing emerging crises at drilling rigs. One of the major purposes of such rigs is to drill boreholes to explore oil or gas reservoirs with the final scope of recovering the content

  8. Is the efficacy of antidepressants in panic disorder mediated by adverse events? A mediational analysis.

    Directory of Open Access Journals (Sweden)

    Irene Bighelli

    Full Text Available It has been hypothesised that the perception of adverse events in placebo-controlled antidepressant clinical trials may induce patients to conclude that they have been randomized to the active arm of the trial, leading to the breaking of blind. This may enhance the expectancies for improvement and the therapeutic response. The main objective of this study is to test the hypothesis that the efficacy of antidepressants in panic disorder is mediated by the perception of adverse events. The present analysis is based on a systematic review of published and unpublished randomised trials comparing antidepressants with placebo for panic disorder. The Baron and Kenny approach was applied to investigate the mediational role of adverse events in the relationship between antidepressant treatment and efficacy. Fourteen placebo-controlled antidepressant trials were included in the analysis. We found that: (a) antidepressant treatment was significantly associated with better treatment response (ß = 0.127, 95% CI 0.04 to 0.21, p = 0.003); (b) antidepressant treatment was not associated with adverse events (ß = 0.094, 95% CI -0.05 to 0.24, p = 0.221); (c) adverse events were negatively associated with treatment response (ß = -0.035, 95% CI -0.06 to -0.05, p = 0.022). Finally, after adjustment for adverse events, the relationship between antidepressant treatment and treatment response remained statistically significant (ß = 0.122, 95% CI 0.01 to 0.23, p = 0.039). These findings do not support the hypothesis that the perception of adverse events in placebo-controlled antidepressant clinical trials may lead to the breaking of blind and to an artificial inflation of the efficacy measures. Based on these results, we argue that the moderate therapeutic effect of antidepressants in individuals with panic disorder is not an artefact, therefore reflecting a genuine effect that doctors can expect to replicate under real-world conditions.
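
    The Baron and Kenny steps can be reproduced with ordinary regressions; the sketch below uses synthetic trial-level data and OLS throughout, which is a simplifying assumption rather than the authors' exact specification.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(42)
    n = 500
    treatment = rng.integers(0, 2, n)                # 0 = placebo, 1 = drug
    adverse = 0.1 * treatment + rng.normal(size=n)   # putative mediator
    response = 0.5 * treatment - 0.05 * adverse + rng.normal(size=n)

    def fit(y, predictors):
        X = sm.add_constant(np.column_stack(predictors))
        return sm.OLS(y, X).fit()

    step_a = fit(response, [treatment])              # treatment -> response
    step_b = fit(adverse, [treatment])               # treatment -> mediator
    step_c = fit(response, [treatment, adverse])     # adjusted model

    print("unadjusted treatment effect:", step_a.params[1].round(3))
    print("treatment -> adverse events:", step_b.params[1].round(3))
    print("adjusted treatment effect  :", step_c.params[1].round(3))
    # Mediation would require step (b) to be significant and the adjusted
    # effect in (c) to shrink clearly relative to (a); in the paper it did not.
    ```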

  9. Analysis of catchments response to severe drought event for ...

    African Journals Online (AJOL)

    Nafiisah

    The run sum analysis method was a sound method which indicates in ... intensity and duration of stream flow depletion between nearby catchments. ... threshold level analysis method, and allows drought events to be described in more.

  10. Event-based user classification in Weibo media.

    Science.gov (United States)

    Guo, Liang; Wang, Wendong; Cheng, Shiduan; Que, Xirong

    2014-01-01

    Weibo media, known as the real-time microblogging service, has attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and significantly changes the way people acquire and disseminate information. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media is related to some event. Users who post different content and exhibit different behaviors or attitudes may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events is a promising task. Under this circumstance, in order to effectively organize and manage the huge amounts of users, thereby further managing their contents, we address the task of user classification in a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate the Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately.

  11. A case for multi-model and multi-approach based event attribution: The 2015 European drought

    Science.gov (United States)

    Hauser, Mathias; Gudmundsson, Lukas; Orth, René; Jézéquel, Aglaé; Haustein, Karsten; Seneviratne, Sonia Isabelle

    2017-04-01

    Science on the role of anthropogenic influence on extreme weather events such as heat waves or droughts has evolved rapidly over the past years. The approach of "event attribution" compares the occurrence probability of an event in the present, factual world with the probability of the same event in a hypothetical, counterfactual world without human-induced climate change. Every such analysis necessarily faces multiple methodological choices including, but not limited to: the event definition, climate model configuration, and the design of the counterfactual world. Here, we explore the role of such choices for an attribution analysis of the 2015 European summer drought (Hauser et al., in preparation). While some GCMs suggest that anthropogenic forcing made the 2015 drought more likely, others suggest no impact, or even a decrease in the event probability. These results additionally differ for single GCMs, depending on the reference used for the counterfactual world. Observational results do not suggest a historical tendency towards more drying, but the record may be too short to provide robust assessments because of the large interannual variability of drought occurrence. These results highlight the need for a multi-model and multi-approach framework in event attribution research. This is especially important for events with low signal to noise ratio and high model dependency such as regional droughts. Hauser, M., L. Gudmundsson, R. Orth, A. Jézéquel, K. Haustein, S.I. Seneviratne, in preparation. A case for multi-model and multi-approach based event attribution: The 2015 European drought.
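
    At the heart of any such attribution analysis is the probability ratio between the factual and counterfactual worlds. A minimal sketch with invented ensemble output, counting exceedances of the event threshold in each world:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    # Hypothetical drought-index ensembles (more negative = drier).
    factual = rng.normal(-0.2, 1.0, 10_000)         # with human forcing
    counterfactual = rng.normal(0.0, 1.0, 10_000)   # without

    threshold = -1.5                 # event definition: index below threshold
    p1 = np.mean(factual < threshold)
    p0 = np.mean(counterfactual < threshold)
    print(f"P1 = {p1:.4f}, P0 = {p0:.4f}, probability ratio = {p1 / p0:.2f}")
    # A ratio above 1 means the forcing made the event more likely; the paper's
    # point is that this ratio varies with the model, the event definition and
    # the counterfactual reference chosen.
    ```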

  12. Event history analysis and the cross-section

    DEFF Research Database (Denmark)

    Keiding, Niels

    2006-01-01

    Examples are given of problems in event history analysis, where several time origins (generating calendar time, age, disease duration, time on study, etc.) are considered simultaneously. The focus is on complex sampling patterns generated around a cross-section. A basic tool is the Lexis diagram....

  13. Human performance analysis of industrial radiography radiation exposure events

    International Nuclear Information System (INIS)

    Reece, W.J.; Hill, S.G.

    1995-01-01

    A set of radiation overexposure event reports were reviewed as part of a program to examine human performance in industrial radiography for the US Nuclear Regulatory Commission. Incident records for a seven-year period were retrieved from an event database. Ninety-five exposure events were initially categorized and sorted for further analysis. Descriptive models were applied to a subset of severe overexposure events. Modeling included: (1) operational sequence tables to outline the key human actions and interactions with equipment, (2) human reliability event trees, (3) an application of an information processing failures model, and (4) an extrapolated use of the error influences and effects diagram. Results of the modeling analyses provided insights into the industrial radiography task and suggested areas for further action and study to decrease overexposures.

  14. Event- and interval-based measurement of stuttering: a review.

    Science.gov (United States)

    Valente, Ana Rita S; Jesus, Luis M T; Hall, Andreia; Leahy, Margaret

    2015-01-01

    Event- and interval-based measurements are two different ways of computing the frequency of stuttering. Interval-based methodology emerged as an alternative measure to overcome problems associated with reproducibility in the event-based methodology. No review has been made to study the effect of methodological factors on interval-based absolute reliability data or to compute the agreement between the two methodologies in terms of inter-judge, intra-judge and accuracy (i.e., correspondence between raters' scores and an established criterion). The aims were to provide a review related to the reproducibility of event-based and time-interval measurement, to verify the effect of methodological factors (training, experience, interval duration, sample presentation order and judgment conditions) on the agreement of time-interval measurement, and to determine whether it is possible to quantify the agreement between the two methodologies. The first two authors searched for articles on ERIC, MEDLINE, PubMed, B-on, CENTRAL and Dissertation Abstracts during January-February 2013 and retrieved 495 articles. Forty-eight articles were selected for review. Content tables were constructed with the main findings. Articles related to event-based measurements revealed values of inter- and intra-judge agreement greater than 0.70 and agreement percentages beyond 80%. The articles related to time-interval measures revealed that, in general, judges with more experience with stuttering presented significantly higher levels of intra- and inter-judge agreement. Inter- and intra-judge values exceeded the references for high reproducibility for both methodologies. Accuracy (regarding the closeness of raters' judgements to an established criterion) and intra- and inter-judge agreement were higher for trained groups when compared with non-trained groups. Sample presentation order and audio/video conditions did not result in differences in inter- or intra-judge results. A duration of 5 s for an interval appears to be
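
    Inter-judge agreement of the kind reviewed above is typically summarized with chance-corrected statistics such as Cohen's kappa; a minimal sketch with invented interval judgements (not data from the reviewed studies):

    ```python
    from sklearn.metrics import cohen_kappa_score

    # Two judges scoring the same intervals: 1 = stuttered, 0 = fluent.
    judge_a = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0, 1, 1, 0, 0, 1]
    judge_b = [1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 1, 1, 0, 1, 1]

    kappa = cohen_kappa_score(judge_a, judge_b)
    raw = sum(a == b for a, b in zip(judge_a, judge_b)) / len(judge_a)
    print(f"raw agreement = {raw:.2f}, Cohen's kappa = {kappa:.2f}")
    # Kappa corrects the raw percentage agreement for chance, which matters
    # when comparing trained and untrained judge groups.
    ```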

  15. Event-by-event simulation of quantum phenomena : Application to Einstein-Podolsky-Rosen-Bohm experiments

    NARCIS (Netherlands)

    De Raedt, H.; De Raedt, K.; Michielsen, K.; Keimpema, K.; Miyashita, S.

    We review the data gathering and analysis procedure used in real Einstein-Podolsky-Rosen-Bohm experiments with photons and we illustrate the procedure by analyzing experimental data. Based on this analysis, we construct event-based computer simulation models in which every essential element in the

  16. Statistical analysis of events related to emergency diesel generators failures in the nuclear industry

    Energy Technology Data Exchange (ETDEWEB)

    Kančev, Duško, E-mail: dusko.kancev@ec.europa.eu [European Commission, DG-JRC, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Duchac, Alexander; Zerger, Benoit [European Commission, DG-JRC, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Maqua, Michael [Gesellschaft für Anlagen-und-Reaktorsicherheit (GRS) mbH, Schwetnergasse 1, 50667 Köln (Germany); Wattrelos, Didier [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), BP 17 - 92262 Fontenay-aux-Roses Cedex (France)

    2014-07-01

    Highlights: • Analysis of operating experience related to emergency diesel generators events at NPPs. • Four abundant operating experience databases screened. • Delineating important insights and conclusions based on the operating experience. - Abstract: This paper is aimed at studying the operating experience related to emergency diesel generators (EDGs) events at nuclear power plants collected from the past 20 years. Events related to EDGs failures and/or unavailability as well as all the supporting equipment are in the focus of the analysis. The selected operating experience was analyzed in detail in order to identify the type of failures, attributes that contributed to the failure, failure modes potential or real, discuss risk relevance, summarize important lessons learned, and provide recommendations. The study in this particular paper is tightly related to the performing of statistical analysis of the operating experience. For the purpose of this study EDG failure is defined as EDG failure to function on demand (i.e. fail to start, fail to run) or during testing, or an unavailability of an EDG, except of unavailability due to regular maintenance. The Gesellschaft für Anlagen und Reaktorsicherheit mbH (GRS) and Institut de Radioprotection et de Sûreté Nucléaire (IRSN) databases as well as the operating experience contained in the IAEA/NEA International Reporting System for Operating Experience and the U.S. Licensee Event Reports were screened. The screening methodology applied for each of the four different databases is presented. Further on, analysis aimed at delineating the causes, root causes, contributing factors and consequences are performed. A statistical analysis was performed related to the chronology of events, types of failures, the operational circumstances of detection of the failure and the affected components/subsystems. The conclusions and results of the statistical analysis are discussed. The main findings concerning the testing

  17. Statistical analysis of events related to emergency diesel generators failures in the nuclear industry

    International Nuclear Information System (INIS)

    Kančev, Duško; Duchac, Alexander; Zerger, Benoit; Maqua, Michael; Wattrelos, Didier

    2014-01-01

    Highlights: • Analysis of operating experience related to emergency diesel generators events at NPPs. • Four abundant operating experience databases screened. • Delineating important insights and conclusions based on the operating experience. - Abstract: This paper is aimed at studying the operating experience related to emergency diesel generators (EDGs) events at nuclear power plants collected from the past 20 years. Events related to EDGs failures and/or unavailability as well as all the supporting equipment are in the focus of the analysis. The selected operating experience was analyzed in detail in order to identify the type of failures, attributes that contributed to the failure, failure modes potential or real, discuss risk relevance, summarize important lessons learned, and provide recommendations. The study in this particular paper is tightly related to the performing of statistical analysis of the operating experience. For the purpose of this study EDG failure is defined as EDG failure to function on demand (i.e. fail to start, fail to run) or during testing, or an unavailability of an EDG, except of unavailability due to regular maintenance. The Gesellschaft für Anlagen und Reaktorsicherheit mbH (GRS) and Institut de Radioprotection et de Sûreté Nucléaire (IRSN) databases as well as the operating experience contained in the IAEA/NEA International Reporting System for Operating Experience and the U.S. Licensee Event Reports were screened. The screening methodology applied for each of the four different databases is presented. Further on, analysis aimed at delineating the causes, root causes, contributing factors and consequences are performed. A statistical analysis was performed related to the chronology of events, types of failures, the operational circumstances of detection of the failure and the affected components/subsystems. The conclusions and results of the statistical analysis are discussed. The main findings concerning the testing

  18. External events analysis in PSA studies for Czech NPPs

    International Nuclear Information System (INIS)

    Holy, J.; Hustak, S.; Kolar, L.; Jaros, M.; Hladky, M.; Mlady, O.

    2014-01-01

    The purpose of the paper is to summarize the current status of natural external hazards analysis in the PSA projects maintained in the Czech Republic for both Czech NPPs - Dukovany and Temelin. The focus of the presentation is put upon the basic milestones in the external event analysis effort - identification of external hazards important for the Czech NPP sites, screening out of irrelevant hazards, modeling of the plant response to the initiating events, including the basic activities regarding vulnerability and fragility analysis (supported with on-site analysis), quantification of accident sequences, interpretation of results and development of measures decreasing external event risk. The following external hazards, addressed during the last several years in the PSA projects for the Czech NPPs, are discussed in the paper: 1) seismicity; 2) extremely low temperature; 3) extremely high temperature; 4) extreme wind; 5) extreme precipitation (water, snow); 6) transport of dangerous substances (as an example of a man-made hazard with some differences identified in comparison with natural hazards); 7) other hazards which are not considered very important for the Czech NPPs and were screened out in the initial phase of the analysis, but are known as potential problem areas abroad. The paper is the result of a coordinated effort with participation of experts and staff from the engineering support organization UJV Rez, a.s. and the NPPs located in the Czech Republic - Dukovany and Temelin. (authors)

  19. Analysis of event tree with imprecise inputs by fuzzy set theory

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Chun, Moon Hyun

    1990-01-01

    A fuzzy set theory approach is proposed as a method to analyze event trees with imprecise or linguistic input variables such as 'likely' or 'improbable' instead of numerical probabilities. In this paper, it is shown how fuzzy set theory can be applied to event tree analysis. The result of this study shows that the fuzzy set theory approach can serve as an acceptable and effective tool for the analysis of event trees with fuzzy inputs. Comparisons of the fuzzy theory approach with the probabilistic approach of computing probabilities of the final states of the event tree through subjective weighting factors and the LHS technique show that the two approaches have common factors and give reasonable results.
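
    One way to make the propagation concrete is to represent each linguistic probability as a triangular fuzzy number and push interval-valued alpha-cuts through a sequence of branches; the tree and the membership functions below are invented for illustration.

    ```python
    # Triangular fuzzy number (low, mode, high); an alpha-cut is an interval.
    def alpha_cut(tri, alpha):
        lo, mode, hi = tri
        return (lo + alpha * (mode - lo), hi - alpha * (hi - mode))

    def interval_mul(a, b):          # valid here: probabilities are >= 0
        return (a[0] * b[0], a[1] * b[1])

    likely = (0.60, 0.80, 0.95)      # "likely" as a triangular fuzzy number
    improbable = (0.00, 0.05, 0.15)  # "improbable"

    for alpha in (0.0, 0.5, 1.0):
        lo, hi = interval_mul(alpha_cut(likely, alpha),
                              alpha_cut(improbable, alpha))
        print(f"alpha = {alpha}: sequence probability in [{lo:.3f}, {hi:.3f}]")
    # At alpha = 1 the interval collapses to the product of the modal values,
    # recovering the crisp point estimate as a special case.
    ```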

  20. Event-based historical value-at-risk

    NARCIS (Netherlands)

    Hogenboom, F.P.; Winter, Michael; Hogenboom, A.C.; Jansen, Milan; Frasincar, F.; Kaymak, U.

    2012-01-01

    Value-at-Risk (VaR) is an important tool to assess portfolio risk. When calculating VaR based on historical stock return data, we hypothesize that this historical data is sensitive to outliers caused by news events in the sampled period. In this paper, we research whether the VaR accuracy can be
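
    The computation behind the hypothesis can be sketched directly: compute historical VaR on all days, then again after filtering out returns flagged as news-event outliers. The returns and the simple four-sigma event filter below are illustrative assumptions, not the paper's detection method.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    returns = rng.normal(0.0005, 0.01, 750)              # synthetic daily returns
    returns[rng.choice(750, 5, replace=False)] -= 0.08   # news-event crashes

    def historical_var(r, level=0.95):
        return -np.quantile(r, 1.0 - level)              # loss at the 95% level

    event_days = np.abs(returns - returns.mean()) > 4 * returns.std()
    print("VaR, all days        :", round(historical_var(returns), 4))
    print("VaR, events filtered :", round(historical_var(returns[~event_days]), 4))
    ```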

  1. Esophageal Motor Disorders Are a Strong and Independent Associated Factor of Barrett's Esophagus.

    Science.gov (United States)

    Bazin, Camille; Benezech, Alban; Alessandrini, Marine; Grimaud, Jean-Charles; Vitton, Veronique

    2018-04-30

    Esophageal motor disorder (EMD) has been shown to be associated with gastroesophageal reflux disease (GERD). However, the association of EMD with Barrett's esophagus (BE) is controversial. Our objective was to evaluate whether the presence of EMD was an independent factor associated with BE. A retrospective case-control study was conducted in GERD patients who all had oeso-gastroduodenal endoscopy and high-resolution esophageal manometry. The clinical data collected were known or potential risk factors for BE: male gender, smoking and alcohol consumption, age, body mass index, presence of hiatal hernia, and frequency and age of GERD. EMDs were classified according to the Chicago classification into: ineffective motor syndrome, fragmented peristalsis, absence of peristalsis, and lower esophageal sphincter hypotonia. Two hundred and one patients (101 in the GERD + BE group and 100 in the GERD without BE group) were included. In univariate analysis, male gender, alcohol consumption, presence of hiatal hernia, and EMD appeared to be associated with the presence of BE. In multivariate analysis, 3 independent factors were identified: the presence of EMD (odds ratio [OR], 3.99; 95% confidence interval [CI], 1.71-9.28; P = 0.001), the presence of hiatal hernia (OR, 5.60; 95% CI, 2.45-12.76; P < 0.001), and Helicobacter pylori infection (OR, 0.08; 95% CI, 0.01-0.84; P = 0.035). The presence of EMD (particularly ineffective motor syndrome and lower esophageal sphincter hypotonia) is a strong independent factor associated with BE. Systematically searching for an EMD in patients suffering from GERD could be a new strategy to organize endoscopic follow-up.

  2. Ontology-based Vaccine and Drug Adverse Event Representation and Theory-guided Systematic Causal Network Analysis toward Integrative Pharmacovigilance Research.

    Science.gov (United States)

    He, Yongqun

    2016-06-01

    Compared with controlled terminologies (e.g., MedDRA, CTCAE, and WHO-ART), the community-based Ontology of AEs (OAE) has many advantages in adverse event (AE) classifications. The OAE-derived Ontology of Vaccine AEs (OVAE) and Ontology of Drug Neuropathy AEs (ODNAE) serve as AE knowledge bases and support data integration and analysis. The Immune Response Gene Network Theory explains molecular mechanisms of vaccine-related AEs. The OneNet Theory of Life treats the whole process of a life of an organism as a single complex and dynamic network (i.e., OneNet). A new "OneNet effectiveness" tenet is proposed here to expand the OneNet theory. Derived from the OneNet theory, the author hypothesizes that one human uses one single genotype-rooted mechanism to respond to different vaccinations and drug treatments, and experimentally identified mechanisms are manifestations of the OneNet blueprint mechanism under specific conditions. The theories and ontologies interact together as semantic frameworks to support integrative pharmacovigilance research.

  3. Characterization of a Flood Event through a Sediment Analysis: The Tescio River Case Study

    Directory of Open Access Journals (Sweden)

    Silvia Di Francesco

    2016-07-01

    Full Text Available This paper presents the hydrological analysis and grain size characteristics of fluvial sediments in a river basin and their combination to characterize a flood event. The overall objective of the research is the development of a practical methodology based on experimental surveys to reconstruct the hydraulic history of ungauged river reaches on the basis of the modifications detected on the riverbed during the dry season. The grain size analysis of fluvial deposits usually requires great technical and economical efforts and traditional sieving based on physical sampling is not appropriate to adequately represent the spatial distribution of sediments in a wide area of a riverbed with a reasonable number of samples. The use of photographic sampling techniques, on the other hand, allows for the quick and effective determination of the grain size distribution, through the use of a digital camera and specific graphical algorithms in large river stretches. A photographic sampling is employed to characterize the riverbed in a 3 km ungauged reach of the Tescio River, a tributary of the Chiascio River, located in central Italy, representative of many rivers in the same geographical area. To this end, the particle size distribution is reconstructed through the analysis of digital pictures of the sediments taken on the riverbed in dry conditions. The sampling has been performed after a flood event of known duration, which allows for the identification of the removal of the armor in one section along the river reach under investigation. The volume and composition of the eroded sediments made it possible to calculate the average flow rate associated with the flood event which caused the erosion, by means of the sediment transport laws and the hydrological analysis of the river basin. A hydraulic analysis of the river stretch under investigation was employed to verify the validity of the proposed procedure.

  4. Power quality events recognition using a SVM-based method

    Energy Technology Data Exchange (ETDEWEB)

    Cerqueira, Augusto Santiago; Ferreira, Danton Diego; Ribeiro, Moises Vidal; Duque, Carlos Augusto [Department of Electrical Circuits, Federal University of Juiz de Fora, Campus Universitario, 36036 900, Juiz de Fora MG (Brazil)

    2008-09-15

    In this paper, a novel SVM-based method for power quality event classification is proposed. A simple approach for feature extraction is introduced, based on the subtraction of the fundamental component from the acquired voltage signal. The resulting signal is presented to a support vector machine for event classification. Results from simulation are presented and compared with two other methods, the OTFR and the LCEC. The proposed method showed improved performance at a reasonable computational cost. (author)
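
    A sketch of that pipeline, assuming synthetic 60 Hz windows and an invented residual feature set: estimate the fundamental by least squares, subtract it, and hand residual statistics to an SVM.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    fs, f0, n = 3840, 60, 640                # sampling rate, fundamental, samples
    t = np.arange(n) / fs

    def make_window(kind):
        v = np.sin(2 * np.pi * f0 * t)
        if kind == "sag":
            v[n // 3: 2 * n // 3] *= 0.6     # voltage sag in mid-window
        else:
            v += 0.15 * np.sin(2 * np.pi * 5 * f0 * t)  # 5th harmonic
        return v + 0.01 * rng.normal(size=n)

    def features(v):
        # Least-squares fit of the fundamental, then residual statistics.
        basis = np.column_stack([np.sin(2 * np.pi * f0 * t),
                                 np.cos(2 * np.pi * f0 * t)])
        coef, *_ = np.linalg.lstsq(basis, v, rcond=None)
        r = v - basis @ coef
        return [np.std(r), np.max(np.abs(r)), np.mean(r ** 2)]

    kinds = ["sag", "harmonic"]
    X = [features(make_window(k)) for k in kinds for _ in range(50)]
    y = [k for k in kinds for _ in range(50)]
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                              random_state=0, stratify=y)
    clf = SVC(kernel="rbf").fit(X_tr, y_tr)
    print("holdout accuracy:", clf.score(X_te, y_te))
    ```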

  5. Address-event-based platform for bioinspired spiking systems

    Science.gov (United States)

    Jiménez-Fernández, A.; Luján, C. D.; Linares-Barranco, A.; Gómez-Rodríguez, F.; Rivas, M.; Jiménez, G.; Civit, A.

    2007-05-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity between a huge number of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Also, neurons generate "events" according to their activity levels. More active neurons generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. When building multi-chip multi-layered AER systems, it is absolutely necessary to have a computer interface that allows (a) reading AER interchip traffic into the computer and visualizing it on the screen, and (b) converting a conventional frame-based video stream in the computer into AER and injecting it at some point of the AER structure. This is necessary for testing and debugging of complex AER systems. On the other hand, the use of a commercial personal computer implies dependence on software tools and operating systems that can make the system slower and less robust. This paper addresses the problem of communicating several AER-based chips to compose a powerful processing system. The problem was discussed in the Neuromorphic Engineering Workshop of 2006. The platform is based basically on an embedded computer, a powerful FPGA and serial links, to make the system faster and stand-alone (independent from a PC). A new platform is presented that allows connecting up to eight AER-based chips to a Spartan 3 4000 FPGA. The FPGA is responsible for the network communication based on Address-Event and, at the same time, for mapping and transforming the address space of the traffic to implement pre-processing. An MMU microprocessor (Intel XScale 400MHz Gumstix Connex computer) is also connected to the FPGA

  6. International Sport Events: Improving Marketing

    Directory of Open Access Journals (Sweden)

    Margarita Kerzaitė

    2014-04-01

    Full Text Available The report and the article provide a comprehensive analysis of the needs to improve the marketing of international sport events, highlighting the role of international sport events in contemporary society and the challenges arising in the context of globalization, and comparing the opinions of various authors on aspects of classification and the benefits for the host country. The article and the report reveal the main problem encountered in organizing international sport events and outline perspectives for its solution. They summarize the opportunities of international sport events and ways to modernize marketing tools through corrections to the marketing mix, based on a systematic synthesis of marketing concepts and adaptation/standardization needs, and identify the most important factors in the marketing mix for pursuing the main marketing objectives. The article is based on analysis of the latest scientific literature.

  7. Initiating Event Analysis of a Lithium Fluoride Thorium Reactor

    Science.gov (United States)

    Geraci, Nicholas Charles

    The primary purpose of this study is to perform an Initiating Event Analysis for a Lithium Fluoride Thorium Reactor (LFTR) as the first step of a Probabilistic Safety Assessment (PSA). The major objective of the research is to compile a list of key initiating events capable of resulting in failure of safety systems and release of radioactive material from the LFTR. Due to the complex interactions between engineering design, component reliability and human reliability, probabilistic safety assessments are most useful when the scope is limited to a single reactor plant. Thus, this thesis will study the LFTR design proposed by Flibe Energy. An October 2015 Electric Power Research Institute report on the Flibe Energy LFTR asked "what-if?" questions of subject matter experts and compiled a list of key hazards with the most significant consequences to the safety or integrity of the LFTR. The potential exists for unforeseen hazards to pose additional risk for the LFTR, but the scope of this thesis is limited to evaluation of those key hazards already identified by Flibe Energy. These key hazards are the starting point for the Initiating Event Analysis performed in this thesis. Engineering evaluation and technical study of the plant using a literature review and comparison to reference technology revealed four hazards with high potential to cause reactor core damage. To determine the initiating events resulting in realization of these four hazards, reference was made to previous PSAs and existing NRC and EPRI initiating event lists. Finally, fault tree and event tree analyses were conducted, completing the logical classification of initiating events. Results are qualitative as opposed to quantitative due to the early stages of system design descriptions and lack of operating experience or data for the LFTR. In summary, this thesis analyzes initiating events using previous research and inductive and deductive reasoning through traditional risk management techniques to

  8. Modal analysis of the thermal conductivity of nanowires: examining unique thermal transport features

    Science.gov (United States)

    Samaraweera, Nalaka; Larkin, Jason M.; Chan, Kin L.; Mithraratne, Kumar

    2018-06-01

    In this study, unique thermal transport features of nanowires over bulk materials are investigated using a combined analysis based on lattice dynamics and equilibrium molecular dynamics (EMD). The evaluation of the thermal conductivity (TC) of Lennard–Jones nanowires becomes feasible due to the multi-step normal mode decomposition (NMD) procedure implemented in the study. A convergence issue of the TC of nanowires is addressed by the NMD implementation for two case studies, which employ pristine nanowires (PNW) and superlattice nanowires. Interestingly, mode relaxation times at low frequencies of acoustic branches exhibit signs of approaching constant values, thus indicating the convergence of TC. The TC evaluation procedure is further verified by implementing EMD-based Green–Kubo analysis, which is based on a fundamentally different physical perspective. Having verified the NMD procedure, the non-monotonic trend of the TC of nanowires is addressed. It is shown that the principal cause of the observed trend is the competing effects of long-wavelength phonons and phonon–surface scattering as the nanowire’s cross-sectional width is changed. A computational procedure is developed to decompose the different modal contributions to the TC of shell alloy nanowires (SANWs) using virtual crystal NMD and the Allen–Feldman theory. Several important conclusions can be drawn from the results. A propagons to non-propagons boundary appears, resulting in a cut-off frequency (ω cut); moreover, as the alloy atomic mass is increased, ω cut shifts to lower frequencies. The existence of non-propagons partly causes the low TC of SANWs. It can be seen that modes with low frequencies demonstrate a similar behavior to the corresponding modes of PNWs. Moreover, lower group velocities associated with higher alloy atomic mass resulted in a lower TC of SANWs.
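
    For reference, the Green–Kubo route used in the verification step obtains the TC from the time integral of the heat-current autocorrelation function. A minimal numerical sketch, with a synthetic AR(1) series standing in for an MD heat current and the physical prefactor V/(kB·T²) set to one:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n, dt, phi = 100_000, 0.005, 0.95      # steps, timestep, AR(1) memory
    J = np.zeros(n)
    for i in range(1, n):                  # synthetic heat-current component
        J[i] = phi * J[i - 1] + rng.normal()

    def autocorrelation(x, max_lag):
        x = x - x.mean()
        return np.array([np.mean(x[:len(x) - k] * x[k:]) for k in range(max_lag)])

    acf = autocorrelation(J, 1000)
    # Green-Kubo: kappa ~ (V / (kB * T^2)) * integral <J(0) J(t)> dt;
    # the prefactor is omitted because the series here is synthetic.
    kappa = acf.sum() * dt
    print("integrated heat-current autocorrelation:", round(float(kappa), 3))
    ```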

  9. Event-Based User Classification in Weibo Media

    Directory of Open Access Journals (Sweden)

    Liang Guo

    2014-01-01

    Full Text Available Weibo media, known as the real-time microblogging service, has attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and significantly changes the way people acquire and disseminate information. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media is related to some event. Users who post different content and exhibit different behaviors or attitudes may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events is a promising task. Under this circumstance, in order to effectively organize and manage the huge amounts of users, thereby further managing their contents, we address the task of user classification in a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate the Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately.

  10. Uncertainty analysis of one Main Circulation Pump trip event at the Ignalina NPP

    International Nuclear Information System (INIS)

    Vileiniskis, V.; Kaliatka, A.; Uspuras, E.

    2004-01-01

    A trip of one Main Circulation Pump (MCP) is an anticipated transient with an expected frequency of approximately one event per year. There have been a few events in which one MCP was inadvertently tripped. The throughput of the remaining running pumps in the affected Main Circulation Circuit loop increased; however, the total coolant flow through the affected loop decreased. The main question is whether this coolant flow rate is sufficient for adequate core cooling. This paper presents an investigation of a one-MCP trip event at the Ignalina NPP. According to international practice, the transient analysis should consist of a deterministic analysis employing best-estimate codes and an uncertainty analysis. For that purpose, the plant's RELAP5 model and the GRS (Germany) System for Uncertainty and Sensitivity Analysis package (SUSA) were employed. Uncertainty analysis of the flow energy losses in different parts of the Main Circulation Circuit, the initial conditions and code-selected models was performed. Such analysis allows estimation of the influence of individual parameters on the calculation results and identification of the modelling parameters that have the largest impact on the event studied. On the basis of this analysis, recommendations for further improvement of the model have been developed. (author)
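
    Generically, a SUSA-style study samples the uncertain inputs, runs the model once per sample, and ranks inputs by correlation with the output. The sketch below uses an invented stand-in function for the thermal-hydraulic model, not RELAP5.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n = 100                                   # typical SUSA-style sample size
    # Uncertain inputs: flow energy-loss coefficients (hypothetical ranges).
    k_suction = rng.uniform(0.8, 1.2, n)
    k_discharge = rng.uniform(0.7, 1.3, n)
    k_core = rng.uniform(0.9, 1.1, n)

    def core_flow(ks, kd, kc):
        # Stand-in for the plant model: flow drops as total loss rises.
        return 1.0 / np.sqrt(ks + kd + 2.0 * kc)

    flow = core_flow(k_suction, k_discharge, k_core)
    for name, k in [("suction", k_suction), ("discharge", k_discharge),
                    ("core", k_core)]:
        r = np.corrcoef(k, flow)[0, 1]
        print(f"{name:9s} correlation with core flow: {r:+.2f}")
    # The largest absolute correlations mark the parameters with the largest
    # impact, which is what guides further model refinement.
    ```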

  11. A ROOT based event display software for JUNO

    Science.gov (United States)

    You, Z.; Li, K.; Zhang, Y.; Zhu, J.; Lin, T.; Li, W.

    2018-02-01

    An event display software SERENA has been designed for the Jiangmen Underground Neutrino Observatory (JUNO). The software has been developed in the JUNO offline software system and is based on the ROOT display package EVE. It provides an essential tool to display detector and event data for better understanding of the processes in the detectors. The software has been widely used in JUNO detector optimization, simulation, reconstruction and physics study.

  12. An analysis on boron dilution events during SBLOCA for the KNGR

    International Nuclear Information System (INIS)

    Kim, Young In; Hwang, Young Dong; Park, Jong Kuen; Chung, Young Jong; Sim, Suk Gu

    1999-02-01

    An analysis of boron dilution events during a small break loss of coolant accident (LOCA) for the Korea Next Generation Reactor (KNGR) was performed using the Computational Fluid Dynamics (CFD) computer program FLUENT. The maximum size of the water slug was determined based on the source of the unborated water slug and the possible flow paths. An axisymmetric computational fluid dynamics model is applied for a conservative scoping analysis of unborated water slug mixing with the recirculation water of the reactor system following a small break LOCA, assuming one Reactor Coolant Pump (RCP) restart. The computational grid was determined through a sensitivity study on the grid size, selecting the grid that yields the most conservative results, and the preliminary boron mixing calculation was performed using this grid. (Author). 17 refs., 3 tabs., 26 figs

  13. Gearbox Fault Diagnosis in a Wind Turbine Using Single Sensor Based Blind Source Separation

    Directory of Open Access Journals (Sweden)

    Yuning Qian

    2016-01-01

    Full Text Available This paper presents a single-sensor-based blind source separation approach, namely, the wavelet-assisted stationary subspace analysis (WSSA), for gearbox fault diagnosis in a wind turbine. The continuous wavelet transform (CWT) is used as a preprocessing tool to decompose single-sensor measurement data into a set of wavelet coefficients to meet the multidimensional requirement of stationary subspace analysis (SSA). SSA is a blind source separation technique that can separate multidimensional signals into stationary and nonstationary source components without the need for independence or prior information about the source signals. After that, the separated nonstationary source component with the maximum kurtosis value is analyzed by enveloping spectral analysis to identify potential fault-related characteristic frequencies. Case studies performed on a wind turbine gearbox test system verify the effectiveness of the WSSA approach and indicate that it outperforms independent component analysis (ICA) and empirical mode decomposition (EMD), as well as spectral-kurtosis-based enveloping, for wind turbine gearbox fault diagnosis.
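
    The final selection-and-envelope stage can be sketched on its own, with synthetic components standing in for the SSA output; the 37 Hz fault repetition rate is an invented example value.

    ```python
    import numpy as np
    from scipy.signal import hilbert
    from scipy.stats import kurtosis

    rng = np.random.default_rng(2)
    fs, n = 10_000, 10_000
    t = np.arange(n) / fs

    # Stand-ins for separated nonstationary sources: sparse fault impacts
    # repeating at 37 Hz riding on a 3 kHz carrier, versus plain noise.
    bursts = np.sin(2 * np.pi * 3000 * t) * (np.sin(2 * np.pi * 37 * t) > 0.99)
    sources = [bursts + 0.2 * rng.normal(size=n), rng.normal(size=n)]

    best = max(sources, key=kurtosis)          # most impulsive component

    env = np.abs(hilbert(best))                # envelope via Hilbert transform
    spectrum = np.abs(np.fft.rfft(env - env.mean()))
    freqs = np.fft.rfftfreq(n, 1 / fs)
    print("dominant envelope frequency (Hz):", freqs[np.argmax(spectrum)])
    # A peak at the characteristic fault frequency (37 Hz here) flags the fault.
    ```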

  14. Root Cause Analysis: Learning from Adverse Safety Events.

    Science.gov (United States)

    Brook, Olga R; Kruskal, Jonathan B; Eisenberg, Ronald L; Larson, David B

    2015-10-01

    Serious adverse events continue to occur in clinical practice, despite our best preventive efforts. It is essential that radiologists, both as individuals and as a part of organizations, learn from such events and make appropriate changes to decrease the likelihood that such events will recur. Root cause analysis (RCA) is a process to (a) identify factors that underlie variation in performance or that predispose an event toward undesired outcomes and (b) allow for development of effective strategies to decrease the likelihood of similar adverse events occurring in the future. An RCA process should be performed within the environment of a culture of safety, focusing on underlying system contributors and, in a confidential manner, taking into account the emotional effects on the staff involved. The Joint Commission now requires that a credible RCA be performed within 45 days for all sentinel or major adverse events, emphasizing the need for all radiologists to understand the processes with which an effective RCA can be performed. Several RCA-related tools that have been found to be useful in the radiology setting include the "five whys" approach to determine causation; cause-and-effect, or Ishikawa, diagrams; causal tree mapping; affinity diagrams; and Pareto charts. © RSNA, 2015.
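
    Of the tools listed, the Pareto chart is the most directly computational: rank cause categories by count and track the cumulative share so the "vital few" causes stand out. A minimal text-based sketch with invented event-cause counts:

    ```python
    import numpy as np

    # Hypothetical counts of contributing causes across reviewed events.
    causes = {"communication": 23, "protocol deviation": 14, "equipment": 9,
              "staffing": 5, "environment": 3, "other": 2}

    items = sorted(causes.items(), key=lambda kv: kv[1], reverse=True)
    cumulative = np.cumsum([c for _, c in items]) / sum(causes.values())

    for (name, count), cum in zip(items, cumulative):
        print(f"{name:20s} {count:3d} {'#' * count:25s} cum {cum:6.1%}")
    # The first two or three categories usually cover most events, focusing
    # corrective actions where they matter most.
    ```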

  15. SHAREHOLDERS VALUE AND CATASTROPHE BONDS. AN EVENT STUDY ANALYSIS AT EUROPEAN LEVEL

    OpenAIRE

    Constantin, Laura-Gabriela; Cernat-Gruici, Bogdan; Lupu, Radu; Nadotti Loris, Lino Maria

    2015-01-01

    Considering that E.U.-based (re)insurance companies are increasingly active within the alternative risk transfer market segment, the aim of the present paper is to emphasize the impact of issuing cat bonds on shareholder value, highlighting the competitive advantages of the analysed (re)insurance companies as they pursue the consolidation of their resilience in a turbulent economic environment. Eminently applicative research, the analysis employs an event study methodology w...
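
    The standard event-study computation behind such an analysis fits a market model over an estimation window and cumulates abnormal returns around the announcement; the sketch below uses synthetic returns, not the paper's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    n_est, n_evt = 200, 11                     # estimation and event windows
    market = rng.normal(0.0003, 0.01, n_est + n_evt)
    stock = 0.0001 + 1.1 * market + rng.normal(0, 0.008, n_est + n_evt)
    stock[n_est + 5] += 0.02                   # jump on the issue date (day 0)

    # Market model, estimated pre-event: R_stock = alpha + beta * R_market.
    beta, alpha = np.polyfit(market[:n_est], stock[:n_est], 1)

    # Abnormal returns in the event window and their cumulative sum (CAR).
    ar = stock[n_est:] - (alpha + beta * market[n_est:])
    print("CAR over [-5, +5]:", round(float(ar.sum()), 4))
    ```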

  16. Corrective action program at the Krsko NPP. Trending and analysis of minor events

    International Nuclear Information System (INIS)

    Bach, B.; Kavsek, D.

    2007-01-01

    understand the factors that might be responsible for such a trend and to take corrective actions prior to the escalation to a significant event. Reviewing and analyzing data based on trend codes identifies common problems, potential trends and common contributors, and promotes a good trending program. For an effective trending program, identification of adverse trends and corrective actions that address the identified weaknesses should be specified and implemented through the corrective action program. For that purpose an appropriate coding system incorporated into the Corrective Action and Operating Experience Program has been established at the Krsko NPP. Minor events and near misses are collected and analyzed in order to aggregate detected minor problems. The groups of codes developed include codes for direct causes and causal factors, processes and organizations, consequences, level of significance, etc. For easier trending and further analysis, different code combinations are utilized in the form of graphs, for example: organisation vs. causal factors (allows a particular department to trend human performance in its own organisation), direct cause vs. time (allows trending of equipment degradation), processes vs. organisation (allows trending of process degradation in a particular organisation), any code in question vs. time (for trend confirmation), etc. The purpose of this article is to present the coding system established at the Krsko Nuclear Power Plant and the variety of ways of trending by using the system. The article deals with the codes established, the organization of the code system, trend code combinations and the benefit of early recognition of adverse trends in low-level events. (author)

  17. Analysis of human error and organizational deficiency in events considering risk significance

    International Nuclear Information System (INIS)

    Lee, Yong Suk; Kim, Yoonik; Kim, Say Hyung; Kim, Chansoo; Chung, Chang Hyun; Jung, Won Dea

    2004-01-01

    In this study, we analyzed human and organizational deficiencies in trip events at Korean nuclear power plants. K-HPES items were used in the human error analysis, and the organizational factors of Jacobs and Haber were used for the organizational deficiency analysis. We propose the use of CCDP (conditional core damage probability) as a risk measure for incorporating risk information when prioritizing K-HPES items and organizational factors. Until now, the risk significance of events has not been considered in human error and organizational deficiency analysis. Considering the risk significance of events in the analysis is necessary for effective enhancement of nuclear power plant safety, by focusing on the causes of human error and the organizational deficiencies that are associated with significant risk.

  18. Difference Image Analysis of Galactic Microlensing. II. Microlensing Events

    Energy Technology Data Exchange (ETDEWEB)

    Alcock, C.; Allsman, R. A.; Alves, D.; Axelrod, T. S.; Becker, A. C.; Bennett, D. P.; Cook, K. H.; Drake, A. J.; Freeman, K. C.; Griest, K. (and others)

    1999-09-01

    The MACHO collaboration has been carrying out difference image analysis (DIA) since 1996 with the aim of increasing the sensitivity to the detection of gravitational microlensing. This is a preliminary report on the application of DIA to galactic bulge images in one field. We show how the DIA technique significantly increases the number of detected lensing events, by removing the positional dependence of traditional photometry schemes and lowering the microlensing event detection threshold. This technique, unlike PSF photometry, gives the unblended colors and positions of the microlensing source stars. We present a set of criteria for selecting microlensing events from objects discovered with this technique. The 16 pixel and classical microlensing events discovered with the DIA technique are presented. © 1999 The American Astronomical Society.

  19. Civil protection and Damaging Hydrogeological Events: comparative analysis of the 2000 and 2015 events in Calabria (southern Italy)

    Directory of Open Access Journals (Sweden)

    O. Petrucci

    2017-11-01

    Calabria (southern Italy) is a flood-prone region, owing to both its rough orography and the fast hydrologic response of most watersheds. During the rainy season, intense rain affects the region, triggering floods and mass movements that cause economic damage and fatalities. This work presents a methodological approach to the comparative analysis of two events affecting the same area 15 years apart, collecting all the qualitative and quantitative features useful for describing both rain and damage. The aim is to understand whether similar meteorological events affecting the same area can have different outcomes in terms of damage. The first event, which occurred between 8 and 10 September 2000, damaged 109 of the 409 municipalities of the region and killed 13 people in a campsite due to a flood. The second event, which occurred between 30 October and 1 November 2015, damaged 79 municipalities and killed one man due to a flood. The comparative analysis highlights that, although the exceptionality of the triggering daily rain was higher in the 2015 event, the damage caused by the 2000 event to both infrastructure and property was greater, aggravated by the 13 flood victims. We conclude that, in the 2015 event, the management of the pre-event phases, with the issuing of meteorological alerts, and the emergency management, with the preventive evacuation of people exposed to landslides or floods, contributed to reducing the number of victims.

  20. A study of the recovery from 120 events

    International Nuclear Information System (INIS)

    Baumont, Genevieve; Menage, F.; Bigot, F.

    1998-01-01

    The author reports a study which aimed at providing additional information for improving safety through event analysis. The approach concentrates on the dynamics of error detection and the way errors and shortcomings are managed. The study is based on a systematic analysis of 120 events in nuclear power plants. The authors first outline the differences between the activities described in significant events and those assumed to take place during event and accident situations. They describe the methods used to transpose the human reliability PSA model to event analysis and report the analysis itself (event selection, data studied during event analysis, types of errors). The studies concern events during power operation and plant outages. Results are analyzed in terms of the number of events, the percentage of each error type, and the percentage of cases in which engineered safety features were activated before operators recovered the situation. The authors comment on who recovers the error and how it is recovered, and discuss the case of multiple-error situations in more detail.

  1. Analysis of operational events by ATHEANA framework for human factor modelling

    International Nuclear Information System (INIS)

    Bedreaga, Luminita; Constntinescu, Cristina; Doca, Cezar; Guzun, Basarab

    2007-01-01

    In the area of human reliability assessment, experts recognise that current methods have not correctly represented the role of humans in preventing, initiating and mitigating accidents in nuclear power plants. This deficiency arises because the current methods for modelling the human factor have not taken into account human performance and reliability as observed in operational events. ATHEANA - A Technique for Human Error ANAlysis - is a new methodology for human reliability analysis that incorporates data from operational events as well as psychological models of human behaviour. The method introduces new elements such as the unsafe action and error mechanisms. In this paper we present the application of the ATHEANA framework to the analysis of operational events that occurred in different nuclear power plants during 1979-2002. The analysis of operational events consisted of: identifying the unsafe actions; classifying each unsafe action as an omission or a commission; establishing the type of error corresponding to the unsafe action (slip, lapse, mistake or circumvention); and establishing the influence of performance shaping factors and some corrective actions. (authors)

  2. Event analysis in primary substation

    Energy Technology Data Exchange (ETDEWEB)

    Paulasaari, H. [Tampere Univ. of Technology (Finland)]

    1996-12-31

    The target of the project is to develop a system which observes the functions of a protection system by using modern microprocessor-based relays. Microprocessor-based relays have three essential capabilities: the first is communication with the SRIO and the SCADA system; the second is the internal clock, which is used to produce time-stamped event data; and the third is the capability to register some values during a fault. For example, during a short-circuit fault the relay registers the value of the short-circuit current and information on the number of faulted phases. In the case of an earth fault the relay stores both the neutral current and the neutral voltage.

  3. Event analysis in primary substation

    Energy Technology Data Exchange (ETDEWEB)

    Paulasaari, H. [Tampere Univ. of Technology (Finland)]

    1997-12-31

    The target of the project is to develop a system which observes the functions of a protection system by using modern microprocessor-based relays. Microprocessor-based relays have three essential capabilities: the first is communication with the SRIO and the SCADA system; the second is the internal clock, which is used to produce time-stamped event data; and the third is the capability to register some values during a fault. For example, during a short-circuit fault the relay registers the value of the short-circuit current and information on the number of faulted phases. In the case of an earth fault the relay stores both the neutral current and the neutral voltage.

  4. Single event upset threshold estimation based on local laser irradiation

    International Nuclear Information System (INIS)

    Chumakov, A.I.; Egorov, A.N.; Mavritsky, O.B.; Yanenko, A.V.

    1999-01-01

    An approach for estimating the ion-induced SEU threshold based on local laser irradiation is presented. Comparative experiments and software simulations were performed at various pulse durations and spot sizes. A correlation between the single-event threshold LET and the upset-threshold laser energy under local irradiation was found, and a computer analysis of local laser irradiation of IC structures was developed for estimating the SEU threshold LET. Two estimation techniques are suggested. The first is based on determining the local laser threshold dose, taking into account the ratio of the sensitive area to the locally irradiated area. The second uses the photocurrent peak value instead of this ratio. The agreement between predicted and experimental results demonstrates the applicability of this approach. (authors)

  5. Event-Based control of depth of hypnosis in anesthesia.

    Science.gov (United States)

    Merigo, Luca; Beschi, Manuel; Padula, Fabrizio; Latronico, Nicola; Paltenghi, Massimiliano; Visioli, Antonio

    2017-08-01

    In this paper, we propose the use of an event-based control strategy for the closed-loop control of the depth of hypnosis in anesthesia by using propofol administration and the bispectral index as a controlled variable. A new event generator with high noise-filtering properties is employed in addition to a PIDPlus controller. The tuning of the parameters is performed off-line by using genetic algorithms by considering a given data set of patients. The effectiveness and robustness of the method is verified in simulation by implementing a Monte Carlo method to address the intra-patient and inter-patient variability. A comparison with a standard PID control structure shows that the event-based control system achieves a reduction of the total variation of the manipulated variable of 93% in the induction phase and of 95% in the maintenance phase. The use of event based automatic control in anesthesia yields a fast induction phase with bounded overshoot and an acceptable disturbance rejection. A comparison with a standard PID control structure shows that the technique effectively mimics the behavior of the anesthesiologist by providing a significant decrement of the total variation of the manipulated variable. Copyright © 2017 Elsevier B.V. All rights reserved.
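
    The core idea, updating the controller only when the measurement has changed enough to matter, can be sketched compactly. The following Python fragment is a generic send-on-delta event generator wrapped around a PI update; it is not the PIDPlus controller of the paper, and all gains and the deadband are invented:

        def make_event_based_pi(kp, ki, deadband, dt):
            state = {"integral": 0.0, "last_y": None, "last_u": 0.0}

            def step(setpoint, y):
                # No event: the measurement moved less than the deadband, so
                # hold the previous infusion rate (this is what cuts the total
                # variation of the manipulated variable).
                if state["last_y"] is not None and abs(y - state["last_y"]) < deadband:
                    return state["last_u"]
                state["last_y"] = y            # event: recompute the control law
                error = setpoint - y
                state["integral"] += ki * error * dt
                state["last_u"] = kp * error + state["integral"]
                return state["last_u"]

            return step

        controller = make_event_based_pi(kp=1.2, ki=0.3, deadband=2.0, dt=5.0)
        u = controller(setpoint=50.0, y=62.0)   # e.g. BIS target 50, measured 62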

  6. Identifying the oil price-macroeconomy relationship. An empirical mode decomposition analysis of US data

    International Nuclear Information System (INIS)

    Oladosu, Gbadebo

    2009-01-01

    This paper employs the empirical mode decomposition (EMD) method to filter cyclical components of US quarterly gross domestic product (GDP) and quarterly average oil price (West Texas Intermediate - WTI). The method is adaptive and applicable to non-linear and non-stationary data. A correlation analysis of the resulting components is performed and examined for insights into the relationship between oil and the economy. Several components of this relationship are identified. However, the principal one is that the medium-run component of the oil price has a negative relationship with the main cyclical component of the GDP. In addition, weak correlations suggesting a lagging, demand-driven component and a long-run component of the relationship were also identified. Comparisons of these findings with significant oil supply disruption and recession dates were supportive. The study identifies a number of lessons applicable to recent oil market events, including the eventuality of persistent oil price and economic decline following a long oil price run-up. In addition, it was found that oil market related exogenous events are associated with short- to medium-run price implications regardless of whether they lead to actual supply losses. (author)
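
    The paper's pipeline, decomposing each series into IMFs and then correlating the components pairwise, can be sketched with an off-the-shelf EMD implementation. The fragment below assumes the third-party PyEMD package (not used by the author) and synthetic stand-in series:

        import numpy as np
        from PyEMD import EMD   # assumption: the pip-installable PyEMD package

        t = np.linspace(0, 40, 160)                      # mock quarterly span
        oil = np.sin(0.8 * t) + 0.3 * np.sin(4.0 * t) + 0.01 * t
        gdp = -0.6 * np.sin(0.8 * t + 0.5) + 0.02 * t

        oil_imfs = EMD().emd(oil)                        # fine-to-coarse components
        gdp_imfs = EMD().emd(gdp)

        # Look for the medium-run oil component carrying a negative
        # relationship with the main cyclical GDP component.
        for i, a in enumerate(oil_imfs):
            for j, b in enumerate(gdp_imfs):
                r = np.corrcoef(a, b)[0, 1]
                if abs(r) > 0.5:
                    print(f"oil IMF {i} vs GDP IMF {j}: r = {r:+.2f}")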

  7. Abstracting event-based control models for high autonomy systems

    Science.gov (United States)

    Luh, Cheng-Jye; Zeigler, Bernard P.

    1993-01-01

    A high autonomy system needs many models on which to base control, management, design, and other interventions. These models differ in level of abstraction and in formalism. Concepts and tools are needed to organize the models into a coherent whole. The paper deals with the abstraction processes for systematic derivation of related models for use in event-based control. The multifaceted modeling methodology is briefly reviewed. The morphism concepts needed for application to model abstraction are described. A theory for supporting the construction of DEVS models needed for event-based control is then presented. An implemented morphism on the basis of this theory is also described.

  8. Adaptive Event-Triggered Control Based on Heuristic Dynamic Programming for Nonlinear Discrete-Time Systems.

    Science.gov (United States)

    Dong, Lu; Zhong, Xiangnan; Sun, Changyin; He, Haibo

    2017-07-01

    This paper presents the design of a novel adaptive event-triggered control method based on the heuristic dynamic programming (HDP) technique for nonlinear discrete-time systems with unknown system dynamics. In the proposed method, the control law is only updated when the event-triggered condition is violated. Compared with the periodic updates in the traditional adaptive dynamic programming (ADP) control, the proposed method can reduce the computation and transmission cost. An actor-critic framework is used to learn the optimal event-triggered control law and the value function. Furthermore, a model network is designed to estimate the system state vector. The main contribution of this paper is to design a new trigger threshold for discrete-time systems. A detailed Lyapunov stability analysis shows that our proposed event-triggered controller can asymptotically stabilize the discrete-time systems. Finally, we test our method on two different discrete-time systems, and the simulation results are included.

  9. IBES: A Tool for Creating Instructions Based on Event Segmentation

    Directory of Open Access Journals (Sweden)

    Katharina Mura

    2013-12-01

    Receiving informative, well-structured, and well-designed instructions supports performance and memory in assembly tasks. We describe IBES, a tool with which users can quickly and easily create multimedia, step-by-step instructions by segmenting a video of a task into segments. In a validation study we demonstrate that the step-by-step structure of the visual instructions created by the tool corresponds to the natural event boundaries, which are assessed by event segmentation and are known to play an important role in memory processes. In one part of the study, twenty participants created instructions based on videos of two different scenarios by using the proposed tool. In the other part of the study, ten and twelve participants respectively segmented videos of the same scenarios yielding event boundaries for coarse and fine events. We found that the visual steps chosen by the participants for creating the instruction manual had corresponding events in the event segmentation. The number of instructional steps was a compromise between the number of fine and coarse events. Our interpretation of results is that the tool picks up on natural human event perception processes of segmenting an ongoing activity into events and enables the convenient transfer into meaningful multimedia instructions for assembly tasks. We discuss the practical application of IBES, for example, creating manuals for differing expertise levels, and give suggestions for research on user-oriented instructional design based on this tool.

  10. IBES: a tool for creating instructions based on event segmentation.

    Science.gov (United States)

    Mura, Katharina; Petersen, Nils; Huff, Markus; Ghose, Tandra

    2013-12-26

    Receiving informative, well-structured, and well-designed instructions supports performance and memory in assembly tasks. We describe IBES, a tool with which users can quickly and easily create multimedia, step-by-step instructions by segmenting a video of a task into segments. In a validation study we demonstrate that the step-by-step structure of the visual instructions created by the tool corresponds to the natural event boundaries, which are assessed by event segmentation and are known to play an important role in memory processes. In one part of the study, 20 participants created instructions based on videos of two different scenarios by using the proposed tool. In the other part of the study, 10 and 12 participants respectively segmented videos of the same scenarios yielding event boundaries for coarse and fine events. We found that the visual steps chosen by the participants for creating the instruction manual had corresponding events in the event segmentation. The number of instructional steps was a compromise between the number of fine and coarse events. Our interpretation of results is that the tool picks up on natural human event perception processes of segmenting an ongoing activity into events and enables the convenient transfer into meaningful multimedia instructions for assembly tasks. We discuss the practical application of IBES, for example, creating manuals for differing expertise levels, and give suggestions for research on user-oriented instructional design based on this tool.

  11. Initiating events in the safety probabilistic analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Stasiulevicius, R.

    1989-01-01

    The importance of initiating events in the probabilistic safety analysis of nuclear power plants is discussed, and the basic procedures necessary for preparing reports and for quantifying and grouping the events are described. Examples of initiating events with their mean occurrence frequencies, including those calculated for the Oconee reactor and the Angra-1 reactor, are presented. (E.G.)

  12. The Frasnian-Famennian mass killing event(s), methods of identification and evaluation

    Science.gov (United States)

    Geldsetzer, H. H. J.

    1988-01-01

    The absence of an abnormally high number of earlier Devonian taxa from Famennian sediments has been repeatedly documented and can hardly be questioned. Primary recognition of the event(s) was based on paleontological data, especially common macrofossils. Most paleontologists place the disappearance of these common forms at the gigas/triangularis contact, and this boundary was recently proposed as the Frasnian-Famennian (F-F) boundary. Not unexpectedly, alternative F-F positions have been suggested, caused by temporary Frasnian survivors or sudden post-event radiations of new forms. Secondary supporting evidence for the mass killing event(s) is supplied by trace element and stable isotope geochemistry, but not with the same success as for the K/T boundary, probably owing to an additional 300 Ma of tectonic and diagenetic overprinting. Another tool is microfacies analysis, which is surprisingly rarely used even though it can explain geochemical anomalies or paleontological overlap not detectable by conventional macrofacies analysis. The combination of microfacies analysis and geochemistry was applied at two F-F sections in western Canada and showed how interdependent the two methods are. Additional F-F sections from western Canada, the western United States, France, Germany and Australia were sampled or re-sampled and await geochemical/microfacies evaluation.

  13. An analysis of post-event processing in social anxiety disorder.

    Science.gov (United States)

    Brozovich, Faith; Heimberg, Richard G

    2008-07-01

    Research has demonstrated that self-focused thoughts and negative affect have a reciprocal relationship [Mor, N., & Winquist, J. (2002). Self-focused attention and negative affect: A meta-analysis. Psychological Bulletin, 128, 638-662]. In the anxiety disorder literature, post-event processing has emerged as a specific construct of repetitive self-focused thought pertaining to social anxiety disorder. Post-event processing can be defined as an individual's repeated consideration, and potential reconstruction, of his or her performance following a social situation. Post-event processing can also occur when an individual anticipates a social or performance event and begins to brood about other, past social experiences. The present review examined the post-event processing literature in an attempt to organize and highlight the significant results. The methodologies employed to study post-event processing have included self-report measures, daily diaries, social or performance situations created in the laboratory, and experimental manipulations of post-event processing or of the anticipation of an upcoming event. Directions for future research on post-event processing are discussed.

  14. Esophageal Motor Disorders Are a Strong and Independent Associated Factor of Barrett's Esophagus

    Science.gov (United States)

    Bazin, Camille; Benezech, Alban; Alessandrini, Marine; Grimaud, Jean-Charles; Vitton, Veronique

    2018-01-01

    Background/Aims: Esophageal motor disorders (EMDs) have been shown to be associated with gastroesophageal reflux disease (GERD); however, the association of EMD with Barrett's esophagus (BE) is controversial. Our objective was to evaluate whether the presence of EMD was an independent factor associated with BE. Methods: A retrospective case-control study was conducted in GERD patients who all had oeso-gastroduodenal endoscopy and high-resolution esophageal manometry. The clinical data collected covered known or potential risk factors for BE: male gender, smoking and alcohol consumption, age, body mass index, presence of hiatal hernia, and frequency and age of GERD. EMDs were classified according to the Chicago classification into ineffective motor syndrome, fragmented peristalsis, absence of peristalsis, and lower esophageal sphincter hypotonia. Results: Two hundred and one patients (101 in the GERD + BE group and 100 in the GERD without BE group) were included. In univariate analysis, male gender, alcohol consumption, presence of hiatal hernia, and EMD appeared to be associated with the presence of BE. In multivariate analysis, three independent factors were identified, among them the presence of EMD (odds ratio [OR], 3.99; 95% confidence interval [CI], 1.71–9.28; P = 0.001) and the presence of hiatal hernia (OR, 5.60; 95% CI, 2.45–12.76). EMD (ineffective motor syndrome and lower esophageal sphincter hypotonia) is a strong independent factor associated with BE. Systematically searching for an EMD in patients suffering from GERD could be a new strategy for organizing the endoscopic follow-up. PMID:29605977

  15. An epileptic seizures detection algorithm based on the empirical mode decomposition of EEG.

    Science.gov (United States)

    Orosco, Lorena; Laciar, Eric; Correa, Agustina Garces; Torres, Abel; Graffigna, Juan P

    2009-01-01

    Epilepsy is a neurological disorder that affects around 50 million people worldwide, and seizure detection is an important component of its diagnosis. In this study, the Empirical Mode Decomposition (EMD) method was applied in the development of an automatic epileptic seizure detection algorithm. The algorithm first computes the Intrinsic Mode Functions (IMFs) of the EEG records, then calculates the energy of each IMF and performs the detection based on an energy threshold and a minimum-duration decision. The algorithm was tested on 9 invasive EEG records provided and validated by the Epilepsy Center of the University Hospital of Freiburg. In the 90 segments analyzed (39 with epileptic seizures), the sensitivity and specificity obtained with the method were 56.41% and 75.86%, respectively. It can be concluded that EMD is a promising method for epileptic seizure detection in EEG records.
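
    The detection logic described, IMF energies compared against a threshold plus a minimum-duration requirement, is a few lines of code. A hedged sketch (the PyEMD package is assumed for the decomposition; the threshold and durations are placeholders, not the values validated on the Freiburg records):

        import numpy as np
        from PyEMD import EMD   # assumed package; the paper predates it

        def detect_seizures(eeg, fs, energy_thresh, min_dur_s, win_s=2.0):
            win = int(win_s * fs)
            flags = []
            for start in range(0, len(eeg) - win, win):
                imfs = EMD().emd(eeg[start:start + win])
                energy = sum(np.sum(imf ** 2) for imf in imfs)   # energy of each IMF, summed
                flags.append(energy > energy_thresh)
            # Minimum-duration decision: require consecutive flagged windows.
            need = max(1, int(np.ceil(min_dur_s / win_s)))
            run, onsets = 0, []
            for k, flagged in enumerate(flags):
                run = run + 1 if flagged else 0
                if run == need:
                    onsets.append((k - need + 1) * win_s)        # onset time in seconds
            return onsets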

  16. Wavelet methods in mathematical analysis and engineering

    CERN Document Server

    Damlamian, Alain

    2010-01-01

    This book gives a comprehensive overview of both the fundamentals of wavelet analysis and related tools, and of the most active recent developments towards applications. It offers a state-of-the-art account of several active areas of research where wavelet ideas, or more generally multiresolution ideas, have proved particularly effective. The main applications covered are in the numerical analysis of PDEs, and in signal and image processing. Recently introduced techniques such as Empirical Mode Decomposition (EMD) and new trends in the recovery of missing data, such as compressed sensing, are also presented.

  17. Integrating natural language processing expertise with patient safety event review committees to improve the analysis of medication events.

    Science.gov (United States)

    Fong, Allan; Harriott, Nicole; Walters, Donna M; Foley, Hanan; Morrissey, Richard; Ratwani, Raj R

    2017-08-01

    Many healthcare providers have implemented patient safety event reporting systems to better understand and improve patient safety. Reviewing and analyzing these reports is often time consuming and resource intensive because of both the quantity of reports and length of free-text descriptions in the reports. Natural language processing (NLP) experts collaborated with clinical experts on a patient safety committee to assist in the identification and analysis of medication related patient safety events. Different NLP algorithmic approaches were developed to identify four types of medication related patient safety events and the models were compared. Well performing NLP models were generated to categorize medication related events into pharmacy delivery delays, dispensing errors, Pyxis discrepancies, and prescriber errors with receiver operating characteristic areas under the curve of 0.96, 0.87, 0.96, and 0.81 respectively. We also found that modeling the brief without the resolution text generally improved model performance. These models were integrated into a dashboard visualization to support the patient safety committee review process. We demonstrate the capabilities of various NLP models and the use of two text inclusion strategies at categorizing medication related patient safety events. The NLP models and visualization could be used to improve the efficiency of patient safety event data review and analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
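
    Although the article does not publish its models, the general shape of such a classifier is standard: vectorize the free-text brief and fit one category per medication-event type. A minimal hedged sketch with scikit-learn (the example briefs and labels are invented):

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        briefs = [
            "pharmacy order not delivered to floor by morning",
            "wrong concentration dispensed for infusion",
            "pyxis count did not match removal record",
            "prescriber ordered overlapping anticoagulants",
        ]
        labels = ["delivery_delay", "dispensing_error",
                  "pyxis_discrepancy", "prescriber_error"]

        # Model the brief only, mirroring the finding that omitting the
        # resolution text generally improved performance.
        model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                              LogisticRegression(max_iter=1000))
        model.fit(briefs, labels)
        print(model.predict(["insulin drip dispensed at the wrong strength"]))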

  18. DYNAMIC AUTHORIZATION BASED ON THE HISTORY OF EVENTS

    Directory of Open Access Journals (Sweden)

    Maxim V. Baklanovsky

    2016-11-01

    A new paradigm in the field of access control systems with fuzzy authorization is proposed. Consider a set of objects in a single data-transmission network. The goal is to develop a dynamic authorization protocol based on the correctness of each participant's presentation of events (news) that occurred earlier in the network. We propose a mathematical method that stores the history of events compactly, neglects more distant and less significant events, and composes and verifies authorization data. The history of events is represented as vectors of numbers, and each vector is multiplied by several stochastic vectors. It is known that if the event vectors are sparse, they can be restored with high accuracy by solving an ℓ1-optimization problem. Experiments on vector restoration have shown that the greater the number of stochastic vectors, the better the accuracy of the restored vectors; it was also established that the largest absolute components are restored earliest. An access control system with the proposed dynamic authorization method makes it possible to compute fuzzy confidence coefficients in networks with frequently changing sets of participants, mesh networks, and multi-agent systems.
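
    The recovery step is classical compressed sensing: a sparse vector measured through random projections can be reconstructed by basis pursuit. A self-contained sketch of generic ℓ1 recovery via linear programming (not the authors' implementation; all sizes are invented):

        import numpy as np
        from scipy.optimize import linprog

        rng = np.random.default_rng(0)
        n, m, k = 60, 25, 4                  # history length, measurements, sparsity
        x = np.zeros(n)
        x[rng.choice(n, k, replace=False)] = rng.normal(0, 5, k)  # sparse event history
        A = rng.normal(size=(m, n))          # the "stochastic vectors"
        b = A @ x                            # compact stored summary

        # Basis pursuit: minimise ||x||_1 subject to Ax = b,
        # written as an LP via the split x = u - v with u, v >= 0.
        c = np.ones(2 * n)
        res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=b,
                      bounds=[(0, None)] * (2 * n), method="highs")
        x_hat = res.x[:n] - res.x[n:]
        print("max restoration error:", np.abs(x_hat - x).max())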

  19. Disruptive event analysis: volcanism and igneous intrusion

    International Nuclear Information System (INIS)

    Crowe, B.M.

    1979-01-01

    Three basic topics are addressed in the disruptive event analysis: first, the range of disruptive consequences to a radioactive waste repository from volcanic activity; second, the possible reduction of the risk of disruption by volcanic activity through selective siting of a repository; and third, the quantification of the probability of repository disruption by volcanic activity.

  20. Driving decisions when leaving electronic music dance events: driver, passenger, and group effects.

    Science.gov (United States)

    Johnson, Mark B; Voas, Robert B; Miller, Brenda A

    2012-01-01

    The goal of this article was to identify characteristics of drivers and passengers that predicted peer groups whose drivers exit dance clubs with alcohol levels indicative of impairment (blood alcohol content [BAC] ≥ 0.05 g/dL). We used the portal survey methodology to randomly sample groups of electronic music dance event (EMDE) patrons as they entered and exited a club. From May through November 2010, data were collected at 38 EMDEs hosted by 8 clubs in the San Francisco Bay area. Data included in these analyses are results from breath samples for measuring BAC and self-report data on demographics, recent drinking history, drinking intentions, travel to and from the clubs, and familiarity/experience with other group members. These data were collected from a subset of 175 drivers and 272 passengers. Although drivers drank less than passengers, the driver in 1 group in 5 had a BAC indicative of elevated crash risk (BAC ≥ 0.05 g/dL). Groups of drivers and/or passengers with a recent history of binge drinking were more likely to have drivers with BACs ≥ 0.05 g/dL. One unanticipated finding was that drivers who knew more group members relatively well were more likely to exit the club with a BAC ≥ 0.05 g/dL. Additionally, we found that groups with all female passengers were at greater risk of having a driver whose BAC was ≥ 0.05 g/dL. Some group characteristics predicted drivers who exit clubs with BACs ≥ 0.05 g/dL. One intervention strategy to promote safety might be to encourage group members to reconsider who is sober enough to drive away from the club; for some groups, a change of drivers would be a safer choice, because a passenger may have a relatively safe BAC. Groups of females appear to have a particularly elevated risk of having a driver whose BAC exceeds 0.05 g/dL, and new intervention efforts should be particularly directed to these at-risk groups.

  1. A systemic approach for managing extreme risk events-dynamic financial analysis

    Directory of Open Access Journals (Sweden)

    Ph.D.Student Rodica Ianole

    2011-12-01

    Following the Black Swan logic, it often happens that what we do not know becomes more relevant than what we (believe to) know. The management of extreme risks falls under this paradigm in the sense that it cannot be limited to a static approach based only on objective and easily quantifiable variables. Appealing to the operational tools developed primarily for the insurance industry, the present paper investigates how dynamic financial analysis (DFA) can be used within the framework of extreme risk events.

  2. Organization of pulse-height analysis programs for high event rates

    Energy Technology Data Exchange (ETDEWEB)

    Cohn, C. E. [Argonne National Lab., Ill. (USA)]

    1976-09-01

    The ability of a pulse-height analysis program to handle high event rates can be enhanced by organizing it so as to minimize the time spent in interrupt housekeeping. Specifically, the routine that services the data-ready interrupt from the ADC should test whether another event is ready before performing the interrupt return.
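
    The optimization is simply to drain every event that is already waiting before returning from the interrupt, so the fixed entry/exit overhead is paid once per burst rather than once per event. A sketch of the pattern in Python against a hypothetical ADC interface:

        def on_adc_data_ready(adc, spectrum):
            # Handle the event that raised the interrupt, then keep testing:
            # any conversion that completed while we were busy is serviced
            # now instead of costing a fresh interrupt.
            while adc.data_ready():
                channel = adc.read_channel()
                spectrum[channel] += 1
            # Only now fall through to the interrupt return.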

  3. A robust neural network-based approach for microseismic event detection

    KAUST Repository

    Akram, Jubran

    2017-08-17

    We present an artificial neural network based approach for robust event detection from low S/N waveforms. We use a feed-forward network with a single hidden layer that is tuned on a training dataset and later applied on the entire example dataset for event detection. The input features used include the average of absolute amplitudes, variance, energy-ratio and polarization rectilinearity. These features are calculated in a moving-window of same length for the entire waveform. The output is set as a user-specified relative probability curve, which provides a robust way of distinguishing between weak and strong events. An optimal network is selected by studying the weight-based saliency and effect of number of neurons on the predicted results. Using synthetic data examples, we demonstrate that this approach is effective in detecting weaker events and reduces the number of false positives.
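
    The pipeline is conventional enough to sketch: moving-window attributes feeding a single-hidden-layer network trained against a relative probability curve. A hedged Python illustration (scikit-learn stands in for the authors' network; polarization rectilinearity is omitted because it needs three-component data, and a simple window-to-total energy ratio stands in for their energy-ratio attribute):

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        def window_features(trace, win):
            feats, total = [], np.sum(trace ** 2) + 1e-12
            for s in range(0, len(trace) - win, win // 2):
                w = trace[s:s + win]
                feats.append([np.mean(np.abs(w)),       # average absolute amplitude
                              np.var(w),                # variance
                              np.sum(w ** 2) / total])  # simple energy ratio
            return np.array(feats)

        trace = np.random.default_rng(1).normal(size=4000)   # mock low-S/N record
        X = window_features(trace, win=200)
        target = np.zeros(len(X)); target[15:20] = 1.0       # user-specified probability curve
        net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000).fit(X, target)
        probability = net.predict(X)    # relative event probability per window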

  4. External events analysis for the Savannah River Site K reactor

    International Nuclear Information System (INIS)

    Brandyberry, M.D.; Wingo, H.E.

    1990-01-01

    The probabilistic external events analysis performed for the Savannah River Site K-reactor PRA considered many different events which are generally perceived to be "external" to the reactor and its systems, such as fires, floods, seismic events, and transportation accidents (as well as many others). Events which have been shown to be significant contributors to risk include seismic events, tornados, a crane failure scenario, fires and dam failures. The total contribution to the core melt frequency from external initiators has been found to be 2.2 × 10⁻⁴ per year, of which seismic events are the major contributor (1.2 × 10⁻⁴ per year). Fire-initiated events contribute 1.4 × 10⁻⁷ per year, tornados 5.8 × 10⁻⁷ per year, dam failures 1.5 × 10⁻⁶ per year, and the crane failure scenario less than 10⁻⁴ per year to the core melt frequency. 8 refs., 3 figs., 5 tabs

  5. Empirical Mode Decomposition of the atmospheric wave field

    Directory of Open Access Journals (Sweden)

    A. J. McDonald

    2007-03-01

    This study examines the utility of the Empirical Mode Decomposition (EMD) time-series analysis technique to separate the horizontal wind field observed by the Scott Base MF radar (78° S, 167° E) into its constituent parts made up of the mean wind, gravity waves, tides, planetary waves and instrumental noise. Analysis suggests that EMD effectively separates the wind field into a set of Intrinsic Mode Functions (IMFs) which can be related to atmospheric waves with different temporal scales. The Intrinsic Mode Functions resulting from application of the EMD technique to Monte-Carlo simulations of white- and red-noise processes are compared to those obtained from the measurements and are shown to be significantly different statistically. Thus, application of the EMD technique to the MF radar horizontal wind data can be used to prove that these data contain information on internal gravity waves, tides and planetary wave motions.

    Examination also suggests that the EMD technique has the ability to highlight amplitude and frequency modulations in these signals. Closer examination of one such region of amplitude modulation, associated with dominant periods close to 12 h, suggests a wave-wave interaction between the semi-diurnal tide and a planetary wave. Application of the Hilbert transform to the IMFs forms a Hilbert-Huang spectrum which provides a way of viewing the data in a manner similar to the analysis from a continuous wavelet transform. However, the fact that the basis function of EMD is data-driven and does not need to be selected a priori is a major advantage. In addition, the skeleton diagrams produced from the results of the Hilbert-Huang spectrum provide a method of presentation which allows quantitative information on the instantaneous period and amplitude squared to be displayed as a function of time. Thus, it provides a novel way to view frequency- and amplitude-modulated wave phenomena and potentially non
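
    The Hilbert step mentioned above is straightforward once the IMFs exist: the analytic signal of each IMF yields instantaneous amplitude and frequency, the raw material of the Hilbert-Huang spectrum. A hedged sketch (the PyEMD package is assumed for the decomposition, and the synthetic "wind" series below is invented):

        import numpy as np
        from scipy.signal import hilbert
        from PyEMD import EMD   # assumed decomposition package

        dt = 600.0                                   # mock 10-minute sampling (s)
        t = np.arange(0, 6 * 86400, dt)              # six days of data
        wind = 20 * np.sin(2 * np.pi * t / 43200) * (1 + 0.3 * np.sin(2 * np.pi * t / 86400))

        imf = EMD().emd(wind)[0]                     # take the first (fastest) IMF
        analytic = hilbert(imf)
        amplitude = np.abs(analytic)                 # instantaneous amplitude
        phase = np.unwrap(np.angle(analytic))
        inst_freq = np.diff(phase) / (2 * np.pi * dt)   # instantaneous frequency (Hz)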

  6. Measurement of the underlying event using track-based event shapes in Z→l{sup +}l{sup -} events with ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Schulz, Holger

    2014-09-11

    This thesis describes a measurement of hadron-collider event shapes in proton-proton collisions at a centre-of-momentum energy of 7 TeV at the Large Hadron Collider (LHC) at CERN (Conseil Européen pour la Recherche Nucléaire), located near Geneva, Switzerland. The analysed data (integrated luminosity: 1.1 fb⁻¹) were recorded in 2011 with the ATLAS experiment. Events in which a Z boson was produced in the hard sub-process and subsequently decays into an electron-positron or muon-antimuon pair were selected for this analysis. The observables are calculated using all reconstructed tracks of charged particles within the acceptance of the inner detector of ATLAS, except those of the leptons of the Z decay; this is therefore the first measurement of its kind. The observables were corrected for background processes using data-driven methods. For the correction of so-called "pile-up" (multiple overlapping proton-proton collisions) a novel technique was developed and successfully applied. The data were further unfolded to correct for remaining detector effects. The obtained distributions are especially sensitive to the so-called "Underlying Event" and can be compared with predictions of Monte Carlo event generators directly, i.e. without the need for time-consuming simulations of the ATLAS detector. Finally, an attempt was made to improve the predictions of the event generators Pythia8 and Sherpa by finding an optimised setting of the relevant model parameters in a technique called "tuning". It became apparent, however, that the underlying Sjöstrand-Zijl model is unable to give a good description of the measured event-shape distributions.

  7. Do climate extreme events foster violent civil conflicts? A coincidence analysis

    Science.gov (United States)

    Schleussner, Carl-Friedrich; Donges, Jonathan F.; Donner, Reik V.

    2014-05-01

    Civil conflicts promoted by adverse environmental conditions represent one of the most important potential feedbacks in the global socio-environmental nexus. While the role of climate extremes as a triggering factor is often discussed, no consensus has yet been reached about the cause-and-effect relation in the observed data record. Here we present results of a rigorous statistical coincidence analysis based on the Munich Re extreme events database and the Uppsala conflict data program. We report evidence for statistically significant synchronicity between climate extremes with high economic impact and violent conflicts for various regions, although no coherent global signal emerges from our analysis. Our results indicate the importance of regional vulnerability and may aid in identifying hot-spot regions for potential climate-triggered violent social conflicts.
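
    The essence of such a coincidence analysis can be sketched in a few lines: count conflict onsets preceded by a climate extreme within some lag, then judge significance against shuffled surrogates. An illustrative Python fragment with mock event series (not the Munich Re or Uppsala data):

        import numpy as np

        rng = np.random.default_rng(42)
        months = 600
        climate = rng.random(months) < 0.05     # mock extreme-event indicator
        conflict = rng.random(months) < 0.04    # mock conflict-onset indicator

        def coincidences(a, b, lag=3):
            # Count b-events with an a-event at most `lag` steps before them.
            a_idx = np.flatnonzero(a)
            return sum(np.any((a_idx <= j) & (a_idx >= j - lag))
                       for j in np.flatnonzero(b))

        observed = coincidences(climate, conflict)
        null = [coincidences(rng.permutation(climate), conflict) for _ in range(2000)]
        p = (np.sum(np.array(null) >= observed) + 1) / (len(null) + 1)
        print(f"observed coincidences: {observed}, shuffle-test p ~ {p:.3f}")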

  8. Contingency Analysis of Cascading Line Outage Events

    Energy Technology Data Exchange (ETDEWEB)

    Baldwin, Thomas L.; Tawfik, Magdy S.; McQueen, Miles

    2011-03-01

    As the US power systems continue to increase in size and complexity, including the growth of smart grids, larger blackouts due to cascading outages become more likely. Grid congestion is often associated with a cascading collapse leading to a major blackout. Such a collapse is characterized by a self-sustaining sequence of line outages followed by a topology breakup of the network. This paper addresses the implementation and testing of a process for N-k contingency analysis and sequential cascading outage simulation in order to identify potential cascading modes. A modeling approach described in this paper offers a unique capability to identify initiating events that may lead to cascading outages. It predicts the development of cascading events by identifying and visualizing potential cascading tiers. The proposed approach was implemented using a 328-bus simplified SERC power system network. The results of the study indicate that initiating events and possible cascading chains may be identified, ranked and visualized. This approach may be used to improve the reliability of a transmission grid and reduce its vulnerability to cascading outages.

  9. Content-based analysis and indexing of sports video

    Science.gov (United States)

    Luo, Ming; Bai, Xuesheng; Xu, Guang-you

    2001-12-01

    An explosion of on-line image and video data in digital form is already well underway. With the exponential rise in interactive information exploration and dissemination through the World-Wide Web, the major inhibitors of rapid access to on-line video data are the management of capture and storage, and content-based intelligent search and indexing techniques. This paper proposes an approach for content-based analysis and event-based indexing of sports video. It includes a novel method to organize shots - classifying shots as close shots and far shots, an original idea of blur extent-based event detection, and an innovative local mutation-based algorithm for caption detection and retrieval. Results on extensive real TV programs demonstrate the applicability of our approach.

  10. Microprocessor event analysis in parallel with Camac data acquisition

    International Nuclear Information System (INIS)

    Cords, D.; Eichler, R.; Riege, H.

    1981-01-01

    The Plessey MIPROC-16 microprocessor (16 bits, 250 ns execution time) has been connected to a Camac system (GEC-ELLIOTT System Crate) and shares Camac access with a Nord-10/S computer. Interfaces have been designed and tested for the execution of Camac cycles, communication with the Nord-10/S computer, and DMA transfer from Camac to the MIPROC-16 memory. The system is used in the JADE data-acquisition system at PETRA, where it receives the data from the detector in parallel with the Nord-10/S computer via DMA through the indirect-data-channel mode. The microprocessor performs an on-line analysis of events, and the result of various checks is appended to the event. In the case of spurious triggers or clear beam-gas events, the Nord-10/S buffer is reset and the event omitted from further processing. (orig.)

  11. Event-by-event particle multiplicity fluctuations in Pb-Pb collisions with ALICE

    Energy Technology Data Exchange (ETDEWEB)

    Arslandok, Mesut [Institut fuer Kernphysik, Goethe-Universitaet Frankfurt (Germany); Collaboration: ALICE-Collaboration

    2014-07-01

    The study of event-by-event fluctuations of identified hadrons may reveal the degrees of freedom of the strongly interacting matter created in heavy-ion collisions. Particle identification based on the measurement of the specific ionization energy loss dE/dx works well on a statistical basis but suffers from ambiguities when applied at the event-by-event level. A novel experimental technique called the "Identity Method" was recently proposed to overcome such limitations. The method follows a probabilistic approach using the inclusive dE/dx distributions measured in the ALICE TPC, and determines the moments of the multiplicity distributions by an unfolding procedure. In this contribution, the status of an event-by-event fluctuation analysis applying the Identity Method to Pb-Pb data from ALICE is presented.

  12. Big Data Toolsets to Pharmacometrics: Application of Machine Learning for Time-to-Event Analysis.

    Science.gov (United States)

    Gong, Xiajing; Hu, Meng; Zhao, Liang

    2018-05-01

    Additional value can be potentially created by applying big data tools to address pharmacometric problems. The performances of machine learning (ML) methods and the Cox regression model were evaluated based on simulated time-to-event data synthesized under various preset scenarios, i.e., with linear vs. nonlinear and dependent vs. independent predictors in the proportional hazard function, or with high-dimensional data featured by a large number of predictor variables. Our results showed that ML-based methods outperformed the Cox model in prediction performance as assessed by concordance index and in identifying the preset influential variables for high-dimensional data. The prediction performances of ML-based methods are also less sensitive to data size and censoring rates than the Cox regression model. In conclusion, ML-based methods provide a powerful tool for time-to-event analysis, with a built-in capacity for high-dimensional data and better performance when the predictor variables assume nonlinear relationships in the hazard function. © 2018 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
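
    The experimental setup, simulating survival times whose hazard violates the Cox model's assumptions and scoring predictions by concordance, is easy to reproduce in outline. A hedged sketch using the lifelines package (an assumption; the paper does not name its software), with an interaction term the proportional-hazards model cannot capture:

        import numpy as np
        import pandas as pd
        from lifelines import CoxPHFitter
        from lifelines.utils import concordance_index

        rng = np.random.default_rng(7)
        n = 1000
        x1, x2 = rng.normal(size=n), rng.normal(size=n)
        hazard = np.exp(0.8 * x1 * x2)              # interaction: nonlinear for Cox
        df = pd.DataFrame({"x1": x1, "x2": x2,
                           "T": rng.exponential(1.0 / hazard),
                           "E": (rng.random(n) > 0.2).astype(int)})  # crude censoring flag

        cox = CoxPHFitter().fit(df, duration_col="T", event_col="E")
        c = concordance_index(df["T"], -cox.predict_partial_hazard(df), df["E"])
        print("Cox concordance on interaction data:", round(c, 3))
        # An ML survival model (e.g. a random survival forest) would be
        # scored with the same concordance index for comparison.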

  13. Using discriminant analysis as a nucleation event classification method

    Directory of Open Access Journals (Sweden)

    S. Mikkonen

    2006-01-01

    More than three years of measurements of aerosol size distributions and various gas and meteorological parameters made in the Po Valley, Italy, were analysed for this study to examine which meteorological and trace-gas variables affect the emergence of nucleation events. As the analysis method we used discriminant analysis with a non-parametric Epanechnikov kernel, a form of non-parametric density estimation. The best classification result for our data was reached with the combination of relative humidity, ozone concentration and a third-degree polynomial of radiation. RH appeared to have a suppressing effect on new particle formation, whereas the effects of O3 and radiation were more conducive. The concentrations of SO2 and NO2 also appeared to have a significant effect on the emergence of nucleation events, but because of the large number of missing observations we had to exclude them from the final analysis.
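
    Discriminant analysis with a kernel density estimate amounts to fitting one non-parametric density per class and assigning each observation to the class with the largest prior-weighted density. A hedged sketch with scikit-learn's Epanechnikov kernel (mock data; the bandwidth is arbitrary):

        import numpy as np
        from sklearn.neighbors import KernelDensity

        def kde_classifier(X, y, bandwidth=0.5):
            models = {c: (KernelDensity(kernel="epanechnikov", bandwidth=bandwidth)
                          .fit(X[y == c]), np.mean(y == c))
                      for c in np.unique(y)}

            def predict(Xnew):
                classes = list(models)
                scores = np.column_stack(
                    [np.log(prior) + kde.score_samples(Xnew)   # log p(x|c) + log p(c)
                     for c in classes for kde, prior in [models[c]]])
                return np.array(classes)[np.argmax(scores, axis=1)]
            return predict

        rng = np.random.default_rng(3)
        X = rng.normal(size=(200, 3))               # e.g. [RH, O3, radiation term]
        y = (X[:, 1] + X[:, 2] > 0).astype(int)     # mock event / non-event labels
        predict = kde_classifier(X, y)
        print(predict(X[:5]))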

  14. THE EFFECT OF DEVOTEE-BASED BRAND EQUITY ON RELIGIOUS EVENTS

    Directory of Open Access Journals (Sweden)

    MUHAMMAD JAWAD IQBAL

    2016-04-01

    The objective of this research is to apply the DBBE (devotee-based brand equity) model to discover the constructs that measure a religious event as a business brand on the basis of devotees' perception. SEM was applied to test the hypothesized model, with CFA used to analyze the model and assess model fit. The sample size was 500. Brand loyalty was affected directly by image and quality. This information might be beneficial to event management and sponsors in building the brand and operating visitors' destinations. More importantly, the brands of these religious events in Pakistan can be built into a strong tourism product.

  15. Survival analysis using S analysis of time-to-event data

    CERN Document Server

    Tableman, Mara

    2003-01-01

    Survival Analysis Using S: Analysis of Time-to-Event Data is designed as a text for a one-semester or one-quarter course in survival analysis for upper-level or graduate students in statistics, biostatistics, and epidemiology. Prerequisites are a standard pre-calculus first course in probability and statistics, and a course in applied linear regression models. No prior knowledge of S or R is assumed. A wide choice of exercises is included, some intended for more advanced students with a first course in mathematical statistics. The authors emphasize parametric log-linear models, while also detailing nonparametric procedures along with model building and data diagnostics. Medical and public health researchers will find the discussion of cut point analysis with bootstrap validation, competing risks and the cumulative incidence estimator, and the analysis of left-truncated and right-censored data invaluable. The bootstrap procedure checks robustness of cut point analysis and determines cut point(s). In a chapter ...

  16. Adverse events following 12 and 18 month vaccinations: a population-based, self-controlled case series analysis.

    Directory of Open Access Journals (Sweden)

    Kumanan Wilson

    BACKGROUND: Live vaccines have distinct safety profiles, potentially causing systemic reactions one to two weeks after administration. In the province of Ontario, Canada, live MMR vaccine is currently recommended at age 12 months and 18 months. METHODS: Using the self-controlled case series design we examined 271,495 12-month vaccinations and 184,312 18-month vaccinations to examine the relative incidence of the composite endpoint of emergency room visits or hospital admissions in consecutive one-day intervals following vaccination. These were compared to a control period 20 to 28 days later. In a post-hoc analysis we examined the reasons for emergency room visits and the average acuity score at presentation for children during the at-risk period following the 12-month vaccine. RESULTS: Four to 12 days post 12-month vaccination, children had a 1.33 (95% CI, 1.29-1.38) increased relative incidence of the combined endpoint compared to the control period, or at least one event during the risk interval for every 168 children vaccinated. Ten to 12 days post 18-month vaccination, the relative incidence was 1.25 (95% CI, 1.17-1.33), which represented at least one excess event for every 730 children vaccinated. The primary reason for increased events was statistically significant elevations in emergency room visits following all vaccinations. There were non-significant increases in hospital admissions. There were an additional 20 febrile seizures for every 100,000 vaccinated at 12 months. CONCLUSIONS: There are significantly elevated risks of primarily emergency room visits approximately one to two weeks following 12- and 18-month vaccination. Future studies should examine whether these events could be predicted or prevented.

  17. Event-based plausibility immediately influences on-line language comprehension.

    Science.gov (United States)

    Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L; Scheepers, Christoph; McRae, Ken

    2011-07-01

    In some theories of sentence comprehension, linguistically relevant lexical knowledge, such as selectional restrictions, is privileged in terms of the time course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional restriction violations. Specifically, we investigated whether instruments can combine with actions to influence comprehension of ensuing patients (as in Rayner, Warren, Juhasz, & Liversedge, 2004; Warren & McConnell, 2007). Instrument-verb-patient triplets were created in a norming study designed to tap directly into event knowledge. In self-paced reading (Experiment 1), participants were faster to read patient nouns, such as hair, when they were typical of the instrument-action pair (Donna used the shampoo to wash vs. the hose to wash). Experiment 2 showed that these results were not due to direct instrument-patient relations. Experiment 3 replicated Experiment 1 using eyetracking, with effects of event typicality observed in first fixation and gaze durations on the patient noun. This research demonstrates that conceptual event-based expectations are computed and used rapidly and dynamically during on-line language comprehension. We discuss relationships among plausibility and predictability, as well as their implications. We conclude that selectional restrictions may be best considered as event-based conceptual knowledge rather than lexical-grammatical knowledge.

  18. Lessons Learned from Real-Time, Event-Based Internet Science Communications

    Science.gov (United States)

    Phillips, T.; Myszka, E.; Gallagher, D. L.; Adams, M. L.; Koczor, R. J.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The Directorate's Science Roundtable includes active researchers, NASA public relations, educators, and administrators. The Science@NASA award-winning family of Web sites features science, mathematics, and space news. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing science activities in real time has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases, broadcasts accommodate active feedback and questions from Internet participants. Through these projects a pattern has emerged in the level of interest or popularity with the public. The pattern differentiates projects that include science from those that do not. All real-time, event-based Internet activities have captured public interest at a level not achieved through science stories or educator resource material exclusively; the worst event-based activity attracted more interest than the best-written science story. One truly rewarding lesson learned through these projects is that the public recognizes the importance and excitement of being part of scientific discovery. Flying a camera to 100,000 feet altitude isn't as interesting to the public as searching for viable life-forms at these oxygen-poor altitudes. The details of these real-time, event-based projects and the lessons learned will be discussed.

  19. A scheme for PET data normalization in event-based motion correction

    International Nuclear Information System (INIS)

    Zhou, Victor W; Kyme, Andre Z; Fulton, Roger; Meikle, Steven R

    2009-01-01

    Line of response (LOR) rebinning is an event-based motion-correction technique for positron emission tomography (PET) imaging that has been shown to compensate effectively for rigid motion. It involves the spatial transformation of LORs to compensate for motion during the scan, as measured by a motion tracking system. Each motion-corrected event is then recorded in the sinogram bin corresponding to the transformed LOR. It has been shown previously that the corrected event must be normalized using a normalization factor derived from the original LOR, that is, based on the pair of detectors involved in the original coincidence event. In general, due to data compression strategies (mashing), sinogram bins record events detected on multiple LORs. The number of LORs associated with a sinogram bin determines the relative contribution of each LOR. This paper provides a thorough treatment of event-based normalization during motion correction of PET data using LOR rebinning. We demonstrate theoretically and experimentally that normalization of the corrected event during LOR rebinning should account for the number of LORs contributing to the sinogram bin into which the motion-corrected event is binned. Failure to account for this factor may cause artifactual slice-to-slice count variations in the transverse slices and visible horizontal stripe artifacts in the coronal and sagittal slices of the reconstructed images. The theory and implementation of normalization in conjunction with the LOR rebinning technique is described in detail, and experimental verification of the proposed normalization method in phantom studies is presented.
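
    In outline, the correction per event is: transform the original LOR, find the destination bin, and weight the increment by the original detector pair's normalization factor together with the destination bin's LOR multiplicity. A deliberately loose Python sketch with hypothetical interfaces (whether the per-bin LOR count enters as a divisor or a multiplier depends on the normalization convention in use; a divisor is shown purely for illustration):

        def bin_corrected_event(event, sinogram, norm, lors_per_bin, transform):
            new_lor = transform(event.lor)           # motion-correct the LOR
            target = new_lor.sinogram_bin()          # destination (possibly mashed) bin
            # Normalization factor comes from the ORIGINAL detector pair,
            # scaled by how many LORs contribute to the destination bin.
            sinogram[target] += norm[event.lor] / lors_per_bin[target]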

  20. Central FPGA-based Destination and Load Control in the LHCb MHz Event Readout

    CERN Document Server

    Jacobsson, Richard

    2012-01-01

    The readout strategy of the LHCb experiment [1] is based on complete event readout at 1 MHz [2]. Over 300 sub-detector readout boards transmit event fragments at 1 MHz over a commercial 70 Gigabyte/s switching network to a distributed event building and trigger processing farm with 1470 individual multi-core computer nodes [3]. In the original specifications, the readout was based on a pure push protocol. This paper describes the proposal, implementation, and experience of a powerful non-conventional mixture of a push and a pull protocol, akin to credit-based flow control. A high-speed FPGA-based central master module controls the event fragment packing in the readout boards, the assignment of the farm node destination for each event, and controls the farm load based on an asynchronous pull mechanism from each farm node. This dynamic readout scheme relies on generic event requests and the concept of node credit allowing load balancing and trigger rate regulation as a function of the global farm load. It also ...
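
    The credit mechanism can be pictured as a dispatcher that only assigns events to nodes holding credit and replenishes credit when a node asks for more work. A toy Python model (not the FPGA logic, and the data structures are invented):

        import collections

        class ReadoutMaster:
            def __init__(self, credits):
                self.credits = dict(credits)                    # node -> available credits
                self.ready = collections.deque(n for n, c in self.credits.items() if c > 0)

            def assign(self, event_id):
                while self.ready:
                    node = self.ready.popleft()
                    if self.credits[node] > 0:
                        self.credits[node] -= 1
                        if self.credits[node] > 0:
                            self.ready.append(node)             # keep node in rotation
                        return node                             # destination for this event
                return None    # no credit anywhere: regulate (throttle) the trigger rate

            def grant(self, node, n=1):
                # A farm node pulls more events by granting fresh credit.
                if self.credits[node] == 0:
                    self.ready.append(node)
                self.credits[node] += n

        master = ReadoutMaster({"node01": 2, "node02": 1})
        print([master.assign(e) for e in range(4)])   # final assignment returns None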

  1. Event tree analysis using artificial intelligence techniques

    International Nuclear Information System (INIS)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs

  2. Verification of Large State/Event Systems using Compositionality and Dependency Analysis

    DEFF Research Database (Denmark)

    Lind-Nielsen, Jørn; Andersen, Henrik Reif; Hulgaard, Henrik

    2001-01-01

    A state/event model is a concurrent version of Mealy machines used for describing embedded reactive systems. This paper introduces a technique that uses compositionality and dependency analysis to significantly improve the efficiency of symbolic model checking of state/event models. It makes...

  3. Verification of Large State/Event Systems using Compositionality and Dependency Analysis

    DEFF Research Database (Denmark)

    Lind-Nielsen, Jørn; Andersen, Henrik Reif; Behrmann, Gerd

    1999-01-01

    A state/event model is a concurrent version of Mealy machines used for describing embedded reactive systems. This paper introduces a technique that uses compositionality and dependency analysis to significantly improve the efficiency of symbolic model checking of state/event models...

  4. Event-Based Stabilization over Networks with Transmission Delays

    Directory of Open Access Journals (Sweden)

    Xiangyu Meng

    2012-01-01

    This paper investigates asymptotic stabilization for linear systems over networks based on event-driven communication. A new communication logic is proposed to reduce the feedback effort, which has some advantages over traditional ones with continuous feedback. Considering the effect of time-varying transmission delays, the criteria for the design of both the feedback gain and the event-triggering mechanism are derived to guarantee the stability and performance requirements. Finally, the proposed techniques are illustrated by an inverted pendulum system and a numerical example.
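
    For illustration, a standard relative-threshold event-trigger rule is sketched below in Python; this is a generic textbook criterion, not necessarily the one derived in the paper, and the feedback gain K and threshold sigma are assumed given.

        import numpy as np

        def event_triggered_step(x, x_last_sent, K, sigma=0.2):
            # Transmit a new state sample only when the measurement error
            # exceeds a fraction of the current state norm; between events the
            # controller keeps using the last transmitted sample.
            if np.linalg.norm(x_last_sent - x) > sigma * np.linalg.norm(x):
                x_last_sent = x.copy()      # event: send over the network
            u = K @ x_last_sent             # control from last transmitted state
            return u, x_last_sent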

  5. A novel hybrid ensemble learning paradigm for tourism forecasting

    Science.gov (United States)

    Shabri, Ani

    2015-02-01

    In this paper, a hybrid forecasting model based on Empirical Mode Decomposition (EMD) and Group Method of Data Handling (GMDH) is proposed to forecast tourism demand. This methodology first decomposes the original visitor arrival series into several Intrinsic Mode Function (IMF) components and one residual component by the EMD technique. Then, the IMF components and the residual component are forecasted respectively using the GMDH model, whose input variables are selected by using the Partial Autocorrelation Function (PACF). The final forecasted result for the tourism series is produced by aggregating all the forecasted results. For evaluating the performance of the proposed EMD-GMDH methodology, the monthly data of tourist arrivals from Singapore to Malaysia are used as an illustrative example. Empirical results show that the proposed EMD-GMDH model outperforms the EMD-ARIMA as well as the GMDH and ARIMA (Autoregressive Integrated Moving Average) models without time series decomposition.
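
    A condensed sketch of this decompose-forecast-aggregate scheme follows, assuming the PyEMD package for the decomposition; a plain least-squares autoregression stands in for GMDH, and a fixed lag order replaces the PACF-based input selection, so the snippet is illustrative rather than a reimplementation.

        import numpy as np
        from PyEMD import EMD  # assumed: pip package "EMD-signal"

        def emd_hybrid_forecast(series, lags=12):
            # Decompose into IMFs plus residual (components sum to the signal).
            components = EMD()(np.asarray(series, dtype=float))
            forecast = 0.0
            for comp in components:
                # One-step-ahead AR model fitted by least squares per component.
                X = np.column_stack([comp[i:len(comp) - lags + i]
                                     for i in range(lags)])
                A = np.column_stack([X, np.ones(len(comp) - lags)])
                coef, *_ = np.linalg.lstsq(A, comp[lags:], rcond=None)
                forecast += np.dot(coef[:-1], comp[-lags:]) + coef[-1]
            return forecast  # aggregate of the component forecasts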

  6. Extreme events in total ozone: Spatio-temporal analysis from local to global scale

    Science.gov (United States)

    Rieder, Harald E.; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; di Rocco, Stefania; Jancso, Leonhardt M.; Peter, Thomas; Davison, Anthony C.

    2010-05-01

    Recently tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) have been applied for the first time in the field of stratospheric ozone research, as statistical analysis showed that previously used concepts assuming a Gaussian distribution (e.g. fixed deviations from mean values) of total ozone data do not address the internal data structure concerning extremes adequately (Rieder et al., 2010a,b). A case study of the world's longest total ozone record (Arosa, Switzerland - for details see Staehelin et al., 1998a,b) illustrates that tools based on extreme value theory are appropriate to identify ozone extremes and to describe the tails of the total ozone record. Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors such as ENSO or NAO, and chemical factors, such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (e.g. Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, atmospheric loading in ozone depleting substances led to a continuous modification of column ozone in the northern hemisphere also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis of annual and seasonal mean values. In particular, the extremal analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b). Overall the extremes concept provides new information on time series properties, variability, trends and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values. Findings described above could be proven also for the total ozone records of 5 other long-term series (Belsk, Hohenpeissenberg, Hradec Kralove, Potsdam, Uccle) showing that strong influence of atmospheric
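
    For readers unfamiliar with the peaks-over-threshold machinery behind such analyses, a minimal sketch using scipy is given below; threshold selection and the declustering of dependent exceedances, both important in practice, are deliberately simplified.

        import numpy as np
        from scipy.stats import genpareto

        def fit_tail(total_ozone, quantile=0.95):
            x = np.asarray(total_ozone, dtype=float)
            u = np.quantile(x, quantile)        # high threshold
            excesses = x[x > u] - u             # exceedances over the threshold
            shape, _, scale = genpareto.fit(excesses, floc=0.0)
            p_u = np.mean(x > u)                # empirical exceedance rate

            def tail_prob(z):
                # P(X > z) for a level z above the threshold u
                return p_u * genpareto.sf(z - u, shape, loc=0.0, scale=scale)

            return u, shape, scale, tail_prob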

  7. Ultimate design load analysis of planetary gearbox bearings under extreme events

    DEFF Research Database (Denmark)

    Gallego Calderon, Juan Felipe; Natarajan, Anand; Cutululis, Nicolaos Antonio

    2017-01-01

    This paper investigates the impact of extreme events on the planet bearings of a 5 MW gearbox. The system is simulated using an aeroelastic tool, where the turbine structure is modeled, and MATLAB/Simulink, where the drivetrain (gearbox and generator) is modeled using a lumped-parameter approach. ... Three extreme events are assessed: low-voltage ride-through, emergency stop and normal stop. The analysis is focused on finding which event has the most negative impact on the bearing extreme radial loads. The two latter events are carried out following the guidelines of the International

  8. FIREDATA, Nuclear Power Plant Fire Event Data Base

    International Nuclear Information System (INIS)

    Wheelis, W.T.

    2001-01-01

    1 - Description of program or function: FIREDATA contains raw fire event data from 1965 through June 1985. These data were obtained from a number of reference sources including the American Nuclear Insurers, Licensee Event Reports, Nuclear Power Experience, Electric Power Research Institute Fire Loss Data and then collated into one database developed in the personal computer database management system, dBASE III. FIREDATA is menu-driven and asks interactive questions of the user that allow searching of the database for various aspects of a fire such as: location, mode of plant operation at the time of the fire, means of detection and suppression, dollar loss, etc. Other features include the capability of searching for single or multiple criteria (using Boolean 'and' or 'or' logical operations), user-defined keyword searches of fire event descriptions, summary displays of fire event data by plant name or calendar date, and options for calculating the years of operating experience for all commercial nuclear power plants from any user-specified date and the ability to display general plant information. 2 - Method of solution: The six database files used to store nuclear power plant fire event information, FIRE, DESC, SUM, OPEXPER, OPEXBWR, and EXPERPWR, are accessed by software to display information meeting user-specified criteria or to perform numerical calculations (e.g., to determine the operating experience of a nuclear plant). FIRE contains specific searchable data relating to each of 354 fire events. A keyword concept is used to search each of the 31 separate entries or fields. DESC contains written descriptions of each of the fire events. SUM holds basic plant information for all plants proposed, under construction, in operation, or decommissioned. This includes the initial criticality and commercial operation dates, the physical location of the plant, and its operating capacity. OPEXPER contains date information and data on how various plant locations are

  9. Increasing the Operational Value of Event Messages

    Science.gov (United States)

    Li, Zhenping; Savkli, Cetin; Smith, Dan

    2003-01-01

    Assessing the health of a space mission has traditionally been performed using telemetry analysis tools. Parameter values are compared to known operational limits and are plotted over various time periods. This presentation begins with the notion that there is an incredible amount of untapped information contained within the mission's event message logs. Through creative advancements in message handling tools, the event message logs can be used to better assess spacecraft and ground system status and to highlight and report on conditions not readily apparent when messages are evaluated one-at-a-time during a real-time pass. Work in this area is being funded as part of a larger NASA effort at the Goddard Space Flight Center to create a component-based, middleware-based, standards-based, general-purpose ground system architecture referred to as GMSEC - the GSFC Mission Services Evolution Center. The new capabilities and operational concepts for event display, event data analyses and data mining are being developed by Lockheed Martin and the new subsystem has been named GREAT - the GMSEC Reusable Event Analysis Toolkit. Planned for use on existing and future missions, GREAT has the potential to increase operational efficiency in areas of problem detection and analysis, general status reporting, and real-time situational awareness.

  10. Analysis of loss of offsite power events reported in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Volkanovski, Andrija, E-mail: Andrija.VOLKANOVSKI@ec.europa.eu [European Commission, Joint Research Centre, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Ballesteros Avila, Antonio; Peinador Veira, Miguel [European Commission, Joint Research Centre, Institute for Energy and Transport, P.O. Box 2, NL-1755 ZG Petten (Netherlands); Kančev, Duško [Kernkraftwerk Goesgen-Daeniken AG, CH-4658 Daeniken (Switzerland); Maqua, Michael [Gesellschaft für Anlagen-und-Reaktorsicherheit (GRS) gGmbH, Schwertnergasse 1, 50667 Köln (Germany); Stephan, Jean-Luc [Institut de Radioprotection et de Sûreté Nucléaire (IRSN), BP 17 – 92262 Fontenay-aux-Roses Cedex (France)

    2016-10-15

    Highlights: • Loss of offsite power events were identified in four databases. • Engineering analysis of relevant events was done. • The dominant root cause for LOOP is human failure. • Improved maintenance procedures can decrease the number of LOOP events. - Abstract: This paper presents the results of the analysis of loss of offsite power (LOOP) events in four databases of operational events. The screened databases include: the Gesellschaft für Anlagen und Reaktorsicherheit mbH (GRS) and Institut de Radioprotection et de Sûreté Nucléaire (IRSN) databases, the IAEA International Reporting System for Operating Experience (IRS) and the U.S. Licensee Event Reports (LER). In total, 228 relevant loss of offsite power events were identified in the IRSN database, 190 in the GRS database, 120 in the U.S. LER and 52 in the IRS database. Identified events were classified in predefined categories. The obtained results show that the largest percentage of LOOP events was registered during the On-power operational mode and lasted for two minutes or more. Plant-centered events are the main contributor to the LOOP events identified in the IRSN, GRS and IAEA IRS databases. Switchyard-centered events are the main contributor among events registered in the NRC LER database. The main types of failed equipment are switchyard failures in IRSN and IAEA IRS, main or secondary lines in NRC LER, and busbar failures in the GRS database. The dominant root causes for the LOOP events are human failures during test, inspection and maintenance, followed by human failures due to insufficient or wrong procedures. The largest number of LOOP events resulted in reactor trip followed by EDG start. The actions that can result in a reduction of the number of LOOP events and minimize the consequences on plant safety are identified and presented.

  11. Analysis of unprotected overcooling events in the Integral Fast Reactor

    International Nuclear Information System (INIS)

    Vilim, R.B.

    1989-01-01

    Simple analytic models are developed for predicting the response of a metal-fueled, liquid-metal-cooled reactor to unprotected overcooling events in the balance of plant. All overcooling initiators are shown to fall into two categories. The first category contains those events for which there is no final equilibrium state of constant overcooling, as is the case for a large steam leak. These events are analyzed using a non-flow control mass approach. The second category contains those events which will eventually equilibrate, such as a loss of feedwater heaters. A steady-flow control volume analysis shows that these latter events ultimately affect the plant through the feedwater inlet to the steam generator. The models developed for analyzing these two categories provide upper bounds for the reactor's passive response to overcooling accident initiators. Calculation of these bounds for a prototypic plant indicates that failure limits -- eutectic melting, sodium boiling, fuel pin failure -- are not exceeded in any overcooling event. 2 refs

  12. Analysis of mutual events of Galilean satellites observed from VBO during 2014-2015

    Science.gov (United States)

    Vasundhara, R.; Selvakumar, G.; Anbazhagan, P.

    2017-06-01

    Results of analysis of 23 events of the 2014-2015 mutual event series from the Vainu Bappu Observatory are presented. Our intensity distribution model for the eclipsed/occulted satellite is based on the criterion that it simulates a rotational light curve that matches the ground-based light curve. Dichotomy in the scattering characteristics of the leading and trailing sides explains the basic shape of the rotational light curves of Europa, Ganymede and Callisto. In the case of Io, the albedo map (courtesy United States Geological Survey) along with global values of scattering parameters works well. Mean values of residuals in (O - C) along and perpendicular to the track are found to be -3.3 and -3.4 mas, respectively, compared to 'L2' theory for the seven 2E1/2O1 events. The corresponding rms values are 8.7 and 7.8 mas, respectively. For the five 1E3/1O3 events, the along and perpendicular to the track mean residuals are 5.6 and 3.2 mas, respectively. The corresponding rms residuals are 6.8 and 10.5 mas, respectively. We compare the results using the chosen model (Model 1) with a uniform but limb-darkened disc (Model 2). The residuals with Model 2 of the 2E1/2O1 and 1E3/1O3 events indicate a bias along the satellite track. The extent and direction of bias are consistent with the shift of the light centre from the geometric centre. Results using Model 1, which intrinsically takes into account the intensity distribution, show no such bias.

  13. Flow detection via sparse frame analysis for suspicious event recognition in infrared imagery

    Science.gov (United States)

    Fernandes, Henrique C.; Batista, Marcos A.; Barcelos, Celia A. Z.; Maldague, Xavier P. V.

    2013-05-01

    It is becoming increasingly evident that intelligent systems are very beneficial for society and that the further development of such systems is necessary to continue to improve society's quality of life. One area that has drawn the attention of recent research is the development of automatic surveillance systems. In our work we outline a system capable of monitoring an uncontrolled area (an outside parking lot) using infrared imagery and recognizing suspicious events in this area. The first step is to identify moving objects and segment them from the scene's background. Our approach is based on a dynamic background-subtraction technique which robustly adapts detection to illumination changes. To segment moving objects, only regions where movement is occurring are analyzed, ignoring the influence of pixels from regions where there is no movement. Regions where movement is occurring are identified using flow detection via sparse frame analysis. During the tracking process the objects are classified into two categories, Persons and Vehicles, based on features such as size and velocity. The last step is to recognize suspicious events that may occur in the scene. Since the objects are correctly segmented and classified it is possible to identify those events using features such as velocity and time spent motionless in one spot. In this paper we recognize the suspicious event "suspicion of object(s) theft from inside a parked vehicle at spot X by a person", and results show that the use of flow detection increases the recognition of this suspicious event from 78.57% to 92.85%.
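
    A rough sketch of the moving-object segmentation stage in Python with OpenCV; the adaptive mixture-of-Gaussians subtractor used here is a stand-in for the paper's combination of dynamic background subtraction and sparse-frame flow detection, and all parameter values are hypothetical.

        import cv2  # assumed: opencv-python

        def moving_object_blobs(frames, min_area=50):
            # Background model adapts online, tolerating illumination changes.
            subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)
            for frame in frames:
                mask = subtractor.apply(frame)
                mask = cv2.medianBlur(mask, 5)  # suppress isolated noise pixels
                contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                               cv2.CHAIN_APPROX_SIMPLE)
                # Only sufficiently large blobs go on to the Person/Vehicle
                # classification by size and velocity.
                yield [c for c in contours if cv2.contourArea(c) >= min_area]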

  14. Microprocessor event analysis in parallel with CAMAC data acquisition

    CERN Document Server

    Cords, D; Riege, H

    1981-01-01

    The Plessey MIPROC-16 microprocessor (16 bits, 250 ns execution time) has been connected to a CAMAC system (GEC-ELLIOTT System Crate) and shares the CAMAC access with a Nord-10S computer. Interfaces have been designed and tested for execution of CAMAC cycles, communication with the Nord-10S computer and DMA transfer from CAMAC to the MIPROC-16 memory. The system is used in the JADE data-acquisition system at PETRA, where it receives the data from the detector in parallel with the Nord-10S computer via DMA through the indirect-data-channel mode. The microprocessor performs an on-line analysis of events, and the results of various checks are appended to the event. In the case of spurious triggers or clear beam-gas events, the Nord-10S buffer will be reset and the event omitted from further processing. (5 refs).

  15. A trend analysis of human error events for proactive prevention of accidents. Methodology development and effective utilization

    International Nuclear Information System (INIS)

    Hirotsu, Yuko; Ebisu, Mitsuhiro; Aikawa, Takeshi; Matsubara, Katsuyuki

    2006-01-01

    This paper describes methods for analyzing human error events that have been accumulated at an individual plant and for utilizing the results to prevent accidents proactively. Firstly, a categorization framework for trigger actions and causal factors of human error events was reexamined, and the procedure to analyze human error events was reviewed based on the framework. Secondly, a method for identifying the common characteristics of trigger action data and of causal factor data accumulated by analyzing human error events was clarified. In addition, to utilize the results of trend analysis effectively, methods to develop teaching material for safety education, to develop checkpoints for error prevention and to introduce an error management process for strategic error prevention were proposed. (author)

  16. Fuzzy probability based fault tree analysis to propagate and quantify epistemic uncertainty

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry; Sony Tjahyani, D.T.; Ekariansyah, Andi Sofrany; Tjahjono, Hendro

    2015-01-01

    Highlights: • Fuzzy probability based fault tree analysis evaluates epistemic uncertainty in fuzzy fault tree analysis. • Fuzzy probabilities represent the likelihood of occurrence of all events in a fault tree. • A fuzzy multiplication rule quantifies the epistemic uncertainty of minimal cut sets. • A fuzzy complement rule estimates the epistemic uncertainty of the top event. • The proposed FPFTA has successfully evaluated the U.S. Combustion Engineering RPS. - Abstract: A number of fuzzy fault tree analysis approaches, which integrate fuzzy concepts into the quantitative phase of conventional fault tree analysis, have been proposed to study the reliability of engineering systems. Those new approaches apply expert judgments to overcome the limitation of conventional fault tree analysis when basic events do not have probability distributions. Since expert judgments may come with epistemic uncertainty, it is important to quantify the overall uncertainties of the fuzzy fault tree analysis. Monte Carlo simulation is commonly used to quantify the overall uncertainties of conventional fault tree analysis. However, since Monte Carlo simulation is based on probability distributions, this technique is not appropriate for fuzzy fault tree analysis, which is based on fuzzy probabilities. The objective of this study is to develop a fuzzy probability based fault tree analysis to overcome this limitation of fuzzy fault tree analysis. To demonstrate the applicability of the proposed approach, a case study is performed and its results are then compared to the results analyzed by a conventional fault tree analysis. The results confirm that the proposed fuzzy probability based fault tree analysis is feasible for propagating and quantifying epistemic uncertainties in fault tree analysis
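
    The two fuzzy rules named in the highlights can be illustrated with triangular fuzzy numbers written as (low, mode, high) triples; the event probabilities below are invented for the example.

        import numpy as np

        def fuzzy_and(basic_events):
            # Fuzzy multiplication rule: the fuzzy probability of a minimal cut
            # set is the component-wise product of its basic-event triples.
            out = np.ones(3)
            for tri in basic_events:
                out *= np.asarray(tri, dtype=float)
            return out

        def fuzzy_or(cut_sets):
            # Fuzzy complement rule for the top event: 1 - prod(1 - p_i),
            # applied component-wise (the ordering of the triple is preserved).
            out = np.ones(3)
            for tri in cut_sets:
                out *= 1.0 - np.asarray(tri, dtype=float)
            return 1.0 - out

        # Hypothetical expert-elicited basic events as (low, mode, high):
        mcs1 = fuzzy_and([(1e-3, 2e-3, 4e-3), (5e-2, 1e-1, 2e-1)])
        top = fuzzy_or([mcs1, (1e-4, 3e-4, 9e-4)])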

  17. Sequence Synopsis: Optimize Visual Summary of Temporal Event Data.

    Science.gov (United States)

    Chen, Yuanzhe; Xu, Panpan; Ren, Liu

    2018-01-01

    Event sequence analysis plays an important role in many application domains such as customer behavior analysis, electronic health record analysis and vehicle fault diagnosis. Real-world event sequence data is often noisy and complex with high event cardinality, making it a challenging task to construct concise yet comprehensive overviews for such data. In this paper, we propose a novel visualization technique based on the minimum description length (MDL) principle to construct a coarse-level overview of event sequence data while balancing the information loss in it. The method addresses a fundamental trade-off in visualization design: reducing visual clutter vs. increasing the information content in a visualization. The method enables simultaneous sequence clustering and pattern extraction and is highly tolerant to noise such as missing or additional events in the data. Based on this approach we propose a visual analytics framework with multiple levels-of-detail to facilitate interactive data exploration. We demonstrate the usability and effectiveness of our approach through case studies with two real-world datasets. One dataset showcases a new application domain for event sequence visualization, i.e., fault development path analysis in vehicles for predictive maintenance. We also discuss the strengths and limitations of the proposed method based on user feedback.

  18. Cryogenic dark matter search (CDMS II): Application of neural networks and wavelets to event analysis

    Energy Technology Data Exchange (ETDEWEB)

    Attisha, Michael J. [Brown U.]

    2006-01-01

    The Cryogenic Dark Matter Search (CDMS) experiment is designed to search for dark matter in the form of Weakly Interacting Massive Particles (WIMPs) via their elastic scattering interactions with nuclei. This dissertation presents the CDMS detector technology and the commissioning of two towers of detectors at the deep underground site in Soudan, Minnesota. CDMS detectors comprise crystals of Ge and Si at temperatures of 20 mK which provide ~keV energy resolution and the ability to perform particle identification on an event-by-event basis. Event identification is performed via a two-fold interaction signature: an ionization response and an athermal phonon response. Photons and charged particles result in electron recoils in the crystal, while neutrons and WIMPs result in nuclear recoils. Since the ionization response is quenched by a factor ~ 3(2) in Ge(Si) for nuclear recoils compared to electron recoils, the relative amplitude of the two detector responses allows discrimination between recoil types. The primary source of background events in CDMS arises from electron recoils in the outer 50 µm of the detector surface which have a reduced ionization response. We develop a quantitative model of this ‘dead layer’ effect and successfully apply the model to Monte Carlo simulation of CDMS calibration data. Analysis of data from the two-tower run of March-August 2004 is performed, resulting in the world’s most sensitive limits on the spin-independent WIMP-nucleon cross-section, with a 90% C.L. upper limit of 1.6 × 10^-43 cm^2 on Ge for a 60 GeV WIMP. An approach to performing surface event discrimination using neural networks and wavelets is developed. A Bayesian methodology for classifying surface events using neural networks is found to provide an optimized method based on minimization of the expected dark matter limit. The discrete wavelet analysis of CDMS phonon pulses improves surface event discrimination in conjunction with the neural

  19. A Description of the Revised ATHEANA (A Technique for Human Event Analysis)

    International Nuclear Information System (INIS)

    FORESTER, JOHN A.; BLEY, DENNIS C.; COOPER, SUSAN E.; KOLACZKOWSKI, ALAN M.; THOMPSON, CATHERINE; RAMEY-SMITH, ANN; WREATHALL, JOHN

    2000-01-01

    This paper describes the most recent version of a human reliability analysis (HRA) method called "A Technique for Human Event Analysis" (ATHEANA). The new version is documented in NUREG-1624, Rev. 1 [1] and reflects improvements to the method based on comments received from a peer review that was held in 1998 (see [2] for a detailed discussion of the peer review comments) and on the results of an initial trial application of the method conducted at a nuclear power plant in 1997 (see Appendix A in [3]). A summary of the more important recommendations resulting from the peer review and trial application is provided and critical and unique aspects of the revised method are discussed

  20. Rare event computation in deterministic chaotic systems using genealogical particle analysis

    International Nuclear Information System (INIS)

    Wouters, J; Bouchet, F

    2016-01-01

    In this paper we address the use of rare event computation techniques to estimate small over-threshold probabilities of observables in deterministic dynamical systems. We demonstrate that genealogical particle analysis algorithms can be successfully applied to a toy model of atmospheric dynamics, the Lorenz ’96 model. We furthermore use the Ornstein–Uhlenbeck system to illustrate a number of implementation issues. We also show how a time-dependent objective function based on the fluctuation path to a high threshold can greatly improve the performance of the estimator compared to a fixed-in-time objective function. (paper)

  1. Ontology-based prediction of surgical events in laparoscopic surgery

    Science.gov (United States)

    Katić, Darko; Wekerle, Anna-Laura; Gärtner, Fabian; Kenngott, Hannes; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie

    2013-03-01

    Context-aware technologies have great potential to help surgeons during laparoscopic interventions. Their underlying idea is to create systems which can adapt their assistance functions automatically to the situation in the OR, thus relieving surgeons from the burden of managing computer-assisted surgery devices manually. For this purpose, a certain kind of understanding of the current situation in the OR is essential. Beyond that, anticipatory knowledge of incoming events is beneficial, e.g. for early warnings of imminent risk situations. To achieve the goal of predicting surgical events based on previously observed ones, we developed a language to describe surgeries and surgical events using Description Logics and integrated it with methods from computational linguistics. Using n-Grams to compute probabilities of follow-up events, we are able to make sensible predictions of upcoming events in real-time. The system was evaluated on professionally recorded and labeled surgeries and showed an average prediction rate of 80%.
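
    The n-gram prediction step is easy to sketch for n = 2; here event labels are plain strings, whereas the paper grounds them in a Description Logics ontology, so this is a simplified illustration.

        from collections import Counter, defaultdict

        class BigramEventPredictor:
            def __init__(self):
                self.counts = defaultdict(Counter)

            def train(self, surgeries):
                # surgeries: iterable of event-label sequences, one per surgery
                for events in surgeries:
                    for prev, nxt in zip(events, events[1:]):
                        self.counts[prev][nxt] += 1

            def predict(self, current_event):
                # Most probable follow-up event and its estimated probability.
                followers = self.counts[current_event]
                total = sum(followers.values())
                if total == 0:
                    return None, 0.0
                nxt, n = followers.most_common(1)[0]
                return nxt, n / total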

  2. Temporal associations between weather and headache: analysis by empirical mode decomposition.

    Directory of Open Access Journals (Sweden)

    Albert C Yang

    BACKGROUND: Patients frequently report that weather changes trigger headache or worsen existing headache symptoms. Recently, the method of empirical mode decomposition (EMD) has been used to delineate temporal relationships in certain diseases, and we applied this technique to identify intrinsic weather components associated with headache incidence data derived from a large-scale epidemiological survey of headache in the Greater Taipei area. METHODOLOGY/PRINCIPAL FINDINGS: The study sample consisted of 52 randomly selected headache patients. The weather time-series parameters were detrended by the EMD method into a set of embedded oscillatory components, i.e. intrinsic mode functions (IMFs). Multiple linear regression models with forward stepwise methods were used to analyze the temporal associations between weather and headaches. We found no associations between the raw time series of weather variables and headache incidence. For decomposed intrinsic weather IMFs, temperature, sunshine duration, humidity, pressure, and maximal wind speed were associated with headache incidence during the cold period, whereas only maximal wind speed was associated during the warm period. In analyses examining all significant weather variables, IMFs derived from temperature and sunshine duration data accounted for up to 33.3% of the variance in headache incidence during the cold period. The association of headache incidence and weather IMFs in the cold period coincided with the cold fronts. CONCLUSIONS/SIGNIFICANCE: Using EMD analysis, we found a significant association between headache and intrinsic weather components, which was not detected by direct comparisons of raw weather data. Contributing weather parameters may vary in different geographic regions and different seasons.

  3. Arenal-type pyroclastic flows: A probabilistic event tree risk analysis

    Science.gov (United States)

    Meloy, Anthony F.

    2006-09-01

    A quantitative hazard-specific scenario-modelling risk analysis is performed at Arenal volcano, Costa Rica for the newly recognised Arenal-type pyroclastic flow (ATPF) phenomenon using an event tree framework. These flows are generated by the sudden depressurisation and fragmentation of an active basaltic andesite lava pool as a result of a partial collapse of the crater wall. The deposits of this type of flow include angular blocks and juvenile clasts, which are rarely found in other types of pyroclastic flow. An event tree analysis (ETA) is a useful tool and framework in which to analyse and graphically present the probabilities of the occurrence of many possible events in a complex system. Four event trees are created in the analysis, three of which are extended to investigate the varying individual risk faced by three generic representatives of the surrounding community: a resident, a worker, and a tourist. The raw numerical risk estimates determined by the ETA are converted into a set of linguistic expressions (i.e. VERY HIGH, HIGH, MODERATE etc.) using an established risk classification scale. Three individually tailored semi-quantitative risk maps are then created from a set of risk conversion tables to show how the risk varies for each individual in different areas around the volcano. In some cases, by relocating from the north to the south, the level of risk can be reduced by up to three classes. While the individual risk maps may be broadly applicable, and therefore of interest to the general community, the risk maps and associated probability values generated in the ETA are intended to be used by trained professionals and government agencies to evaluate the risk and effectively manage the long-term development of infrastructure and habitation. With the addition of fresh monitoring data, the combination of both long- and short-term event trees would provide a comprehensive and consistent method of risk analysis (both during and pre-crisis), and as such
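
    The arithmetic at each branch of such a tree is simple: the probability of an end state is the initiating-event probability multiplied by the conditional probabilities down the branch. A toy sketch with invented numbers of the kind an ATPF tree might carry:

        def leaf_probabilities(p_initiating, branches):
            # branches: {end-state label: conditional probability given the
            # initiator}; the conditional probabilities should sum to one.
            return {label: p_initiating * p for label, p in branches.items()}

        # Hypothetical example: P(partial crater-wall collapse) per year = 0.02
        risks = leaf_probabilities(0.02, {
            "flow reaches zone, person present": 0.005,
            "flow reaches zone, person absent": 0.045,
            "flow stops short of zone": 0.95,
        })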

  4. Setting up and functioning of an Emergency Medicine Department: Lessons learned from a preliminary study

    Directory of Open Access Journals (Sweden)

    K Asish

    2016-01-01

    Background and Aims: Tertiary care teaching hospitals remain referral centres for victims of trauma and mass casualty. Often specialists from various disciplines manage these crowded casualty areas. These age-old casualty areas are being replaced throughout the country by Emergency Medicine Departments (EMDs), presumed to be better planned to confront a crisis. We aimed to gather basic data contributing to the setting up of an EMD at a tertiary care teaching hospital from the lessons learned from functioning existent systems. Methods: This is primarily a questionnaire-based descriptive study at tertiary care referral centres across the country, which were purposively selected. The study models included one from a hospital without a designated EMD and the other four from hospitals with established EMDs. Direct observation and focus group meetings with experienced informants at these hospitals contributed to the data. In the absence of a validated hospital preparedness assessment scale, comparison was done with regard to quantitative, qualitative and corroborative parameters using descriptive analysis. Results: The EMDs at best-practice models were headed by specialists in Emergency Medicine assisted by organised staff, had protocols for managing mass casualty incidents (MCIs), separate trauma teams, ergonomic use of infrastructure and public education programmes. In this regard, these hospitals seemed well organised to manage MCIs and disasters. Conclusion: The observations may provide preliminary data useful in setting up an EMD. In the absence of published Indian literature, this may facilitate further research in this direction. Anaesthesiologists, presently an approved faculty in Emergency Medicine training, can provide creative input with regard to its initial organisation and functioning, thus widening our horizons in a country where there is a severe dearth of trained emergency physicians.

  5. Analysis of internal events for the Unit 1 of the Laguna Verde nuclear power station

    International Nuclear Information System (INIS)

    Huerta B, A.; Aguilar T, O.; Nunez C, A.; Lopez M, R.

    1993-01-01

    This volume presents the results of the initiating event analysis and the event tree analysis for Unit 1 of the Laguna Verde nuclear power station. The initiating event analysis includes the identification of all those internal events which disturb the normal operation of the power station and require mitigation; so-called external events are beyond the scope of this study. For the analysis of the Laguna Verde power station, eight transient categories were identified, along with three categories of loss of coolant accidents (LOCA) inside the containment, a LOCA outside the primary containment, and vessel rupture. The event tree analysis involves the development of the possible accident sequences for each category of initiating events. System event trees were constructed for the different types of LOCA and for all the transients. An event tree was constructed for the total loss of alternating current, which represents an extension of the event tree for the loss of external power transient. A system event tree was also developed for anticipated transients without scram (ATWS). The event trees for the accident sequences include the evaluation of sequences with a vulnerable core, that is, sequences in which adequate core cooling is available but the residual heat removal systems have failed. In order to model this adequately, headings were added to the event tree to develop the sequences up to the point where the core state is resolved. This process includes: the determination of the failure pressure of the primary containment, the evaluation of the environment generated in the reactor building as a result of containment failure or cracking, the determination of the location of the components in the reactor building, and the construction of Boolean expressions to estimate the failure of the components exposed to a severe environment. (Author)

  6. Markov chains and semi-Markov models in time-to-event analysis.

    Science.gov (United States)

    Abner, Erin L; Charnigo, Richard J; Kryscio, Richard J

    2013-10-25

    A variety of statistical methods are available to investigators for analysis of time-to-event data, often referred to as survival analysis. Kaplan-Meier estimation and Cox proportional hazards regression are commonly employed tools but are not appropriate for all studies, particularly in the presence of competing risks and when multiple or recurrent outcomes are of interest. Markov chain models can accommodate censored data, competing risks (informative censoring), multiple outcomes, recurrent outcomes, frailty, and non-constant survival probabilities. Markov chain models, though often overlooked by investigators in time-to-event analysis, have long been used in clinical studies and have widespread application in other fields.
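
    A minimal sketch of the discrete-time Markov chain idea for time-to-event data follows; the three-state illness-death transition matrix is invented for illustration.

        import numpy as np

        def state_occupancy(P, start, steps):
            # Propagate the state distribution one transition per time unit.
            dist = np.zeros(P.shape[0])
            dist[start] = 1.0
            history = [dist.copy()]
            for _ in range(steps):
                dist = dist @ P
                history.append(dist.copy())
            return np.array(history)

        # Hypothetical healthy -> ill -> dead model; "dead" is absorbing.
        P = np.array([[0.90, 0.08, 0.02],
                      [0.05, 0.80, 0.15],
                      [0.00, 0.00, 1.00]])
        occ = state_occupancy(P, start=0, steps=10)
        survival = 1.0 - occ[:, 2]   # P(event has not yet occurred) per step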

  7. Human Performance Event Database

    International Nuclear Information System (INIS)

    Trager, E. A.

    1998-01-01

    The purpose of this paper is to describe several aspects of a Human Performance Event Database (HPED) that is being developed by the Nuclear Regulatory Commission. These include the background, the database structure and basis for the structure, the process for coding and entering event records, the results of preliminary analyses of information in the database, and plans for the future. In 1992, the Office for Analysis and Evaluation of Operational Data (AEOD) within the NRC decided to develop a database for information on human performance during operating events. The database was needed to help classify and categorize the information and to help feed back operating experience information to licensees and others. An NRC interoffice working group prepared a list of human performance information that should be reported for events and the list was based on the Human Performance Investigation Process (HPIP) that had been developed by the NRC as an aid in investigating events. The structure of the HPED was based on that list. The HPED currently includes data on events described in augmented inspection team (AIT) and incident investigation team (IIT) reports from 1990 through 1996, AEOD human performance studies from 1990 through 1993, recent NRR special team inspections, and licensee event reports (LERs) that were prepared for the events. (author)

  8. Event-by-event simulation of Einstein-Podolsky-Rosen-Bohm experiments

    NARCIS (Netherlands)

    Zhao, Shuang; De Raedt, Hans; Michielsen, Kristel

    We construct an event-based computer simulation model of the Einstein-Podolsky-Rosen-Bohm experiments with photons. The algorithm is a one-to-one copy of the data gathering and analysis procedures used in real laboratory experiments. We consider two types of experiments, those with a source emitting

  9. Statistical Analysis of Solar Events Associated with SSC over Year of Solar Maximum during Cycle 23: 1. Identification of Related Sun-Earth Events

    Science.gov (United States)

    Grison, B.; Bocchialini, K.; Menvielle, M.; Chambodut, A.; Cornilleau-Wehrlin, N.; Fontaine, D.; Marchaudon, A.; Pick, M.; Pitout, F.; Schmieder, B.; Regnier, S.; Zouganelis, Y.

    2017-12-01

    Taking the 32 sudden storm commencements (SSC) listed by the Observatory de l'Ebre / ISGI over the year 2002 (maximal solar activity) as a starting point, we performed a statistical analysis of the related solar sources, solar wind signatures, and terrestrial responses. For each event, we characterized and identified, as far as possible, (i) the sources on the Sun (Coronal Mass Ejections -CME-), with the help of a series of criteria detailed hereafter (velocities, drag coefficient, radio waves, polarity), as well as (ii) the structure and properties in the interplanetary medium, at L1, of the event associated with the SSC: magnetic clouds -MC-, non-MC interplanetary coronal mass ejections -ICME-, co-rotating/stream interaction regions -SIR/CIR-, shocks only and unclear events that we call "miscellaneous" events. The categorization of the events at L1 is based on published catalogues. For each potential CME/L1 event association we compare the velocity observed at L1 with the one observed at the Sun and the estimated ballistic velocity. Observations of radio emissions (Type II, Type IV detected from the ground and/or by WIND) associated with the CMEs make the solar source more probable. We also compare the polarity of the magnetic clouds with the hemisphere of the solar source. The drag coefficient (estimated with the drag-based model) is calculated for each potential association and compared to the expected range of values. We identified a solar source for 26 SSC-related events. 12 of these 26 associations match all criteria. We finally discuss the difficulty of performing such associations.

  10. A novel GLM-based method for the Automatic IDentification of functional Events (AIDE) in fNIRS data recorded in naturalistic environments.

    Science.gov (United States)

    Pinti, Paola; Merla, Arcangelo; Aichelburg, Clarisse; Lind, Frida; Power, Sarah; Swingler, Elizabeth; Hamilton, Antonia; Gilbert, Sam; Burgess, Paul W; Tachtsidis, Ilias

    2017-07-15

    Recent technological advances have allowed the development of portable functional Near-Infrared Spectroscopy (fNIRS) devices that can be used to perform neuroimaging in the real world. However, as real-world experiments are designed to mimic everyday life situations, the identification of event onsets can be extremely challenging and time-consuming. Here, we present a novel analysis method based on the general linear model (GLM) least square fit analysis for the Automatic IDentification of functional Events (or AIDE) directly from real-world fNIRS neuroimaging data. In order to investigate the accuracy and feasibility of this method, as a proof of principle we applied the algorithm to (i) synthetic fNIRS data simulating block-, event-related and mixed-design experiments and (ii) experimental fNIRS data recorded during a conventional lab-based task (involving maths). AIDE was able to recover functional events from simulated fNIRS data with an accuracy of 89%, 97% and 91% for the simulated block-, event-related and mixed-design experiments respectively. For the lab-based experiment, AIDE recovered more than 66.7% of the functional events from the experimentally measured fNIRS data. To illustrate the strength of this method, we then applied AIDE to fNIRS data recorded by a wearable system on one participant during a complex real-world prospective memory experiment conducted outside the lab. As part of the experiment, there were four and six events (actions where participants had to interact with a target) for the two conditions respectively (condition 1, social: interact with a person; condition 2, non-social: interact with an object). AIDE managed to recover 3/4 events and 3/6 events for conditions 1 and 2 respectively. The identified functional events were then matched to behavioural data from the video recordings of the movements and actions of the participant. Our results suggest that "brain-first" rather than "behaviour-first" analysis is
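
    The core of a GLM least-squares onset search can be sketched in a few lines of NumPy. This greedy scan is a much-simplified stand-in for AIDE (no statistical thresholding, autocorrelation modelling or multi-channel handling), and the haemodynamic response shape hrf is assumed given.

        import numpy as np

        def scan_event_onsets(signal, hrf, max_events=4):
            residual = np.asarray(signal, dtype=float).copy()
            onsets = []
            for _ in range(max_events):
                best = None
                for t in range(len(residual) - len(hrf)):
                    reg = np.zeros_like(residual)
                    reg[t:t + len(hrf)] = hrf       # candidate onset at t
                    beta = (reg @ residual) / (reg @ reg)
                    gain = beta**2 * (reg @ reg)    # variance explained
                    if best is None or gain > best[0]:
                        best = (gain, t, beta * reg)
                onsets.append(best[1])
                residual -= best[2]                 # peel off fitted response
            return sorted(onsets)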

  11. Emergency department based HIV screening: An opportunity for early diagnosis in high prevalent areas

    Directory of Open Access Journals (Sweden)

    Teja V

    2008-01-01

    The Emergency Medicine Department (EMD) is an ideal place for public health interventions and provides ready access to the health care system, offering a great opportunity for HIV testing and counselling. Between 2003 and 2005, a rapid test was requested for 59.39% of 10,752 cases from the EMD, whereas ELISA was requested for 40.61%. Of the 317 HIV-reactive cases, the available medical records of 249 were reviewed for epidemiological and clinical information. Nearly 42% of the total reactive cases detected in our Institute were from the EMD. Three percent (317/10,752) were diagnosed as HIV reactive; 1.52% of the total samples were reactive by rapid test and the other 1.43% by ELISA. Two and a half percent (163/6386) of those who had rapid testing and 3.53% (154/4366) of those who had ELISA testing were identified as HIV reactive. All these cases were diagnosed within a mean EMD stay of 2.5 days. Eighty-five percent of HIV-reactive individuals were unaware of their reactive status. An additional 53 cases of asymptomatic spouses were diagnosed as HIV reactive, thus making it possible to seek early treatment for HIV infection. The study emphasizes the importance of offering HIV testing to all patients who present to the emergency department.

  12. Frame-based safety analysis approach for decision-based errors

    International Nuclear Information System (INIS)

    Fan, Chin-Feng; Yih, Swu

    1997-01-01

    A frame-based approach is proposed to analyze decision-based errors made by automatic controllers or human operators due to erroneous reference frames. An integrated framework, the Two Frame Model (TFM), is first proposed to model the dynamic interaction between the physical process and the decision-making process. Two important issues, consistency and competing processes, are raised. Consistency between the physical and logic frames makes a TFM-based system work properly. Loss of consistency refers to the failure mode in which the logic frame does not accurately reflect the state of the controlled processes. Once such failure occurs, hazards may arise. Among potential hazards, the competing effect between the controller and the controlled process is the most severe one, which may jeopardize a defense-in-depth design. When the logic and physical frames are inconsistent, conventional safety analysis techniques are inadequate. We propose Frame-based Fault Tree Analysis (FFTA) and Frame-based Event Tree Analysis (FETA) under TFM to deduce the context for decision errors and to separately generate the evolution of the logical frame as opposed to that of the physical frame. This multi-dimensional analysis approach, different from the conventional correctness-centred approach, provides a panoramic view in scenario generation. Case studies using the proposed techniques are also given to demonstrate their usage and feasibility

  13. Potential Indoor Worker Exposure From Handling Area Leakage: Example Event Sequence Frequency Analysis

    International Nuclear Information System (INIS)

    Benke, Roland R.; Adams, George R.

    2008-01-01

    The U.S. Department of Energy (DOE) is currently considering design options for the facilities that will handle spent nuclear fuel and high-level radioactive waste at the potential nuclear waste repository at Yucca Mountain, Nevada. The license application must demonstrate compliance with the performance objectives of 10 CFR Part 63, which include occupational dose limits from 10 CFR Part 20. If DOE submits a license application under 10 CFR Part 63, the U.S. Nuclear Regulatory Commission (NRC) will conduct a risk-informed, performance-based review of the DOE license application and its preclosure safety analysis, in which in-depth technical evaluations are focused on technical areas that are significant to preclosure safety and risk. As part of pre-licensing activities, the Center for Nuclear Waste Regulatory Analyses (CNWRA) developed the Preclosure Safety Analysis Tool software to aid in the regulatory review of a DOE license application and support any independent confirmatory assessments that may be needed. Recent DOE information indicates a primarily canister-based handling approach that includes the wet transfer of individual assemblies where Heating, Ventilation, and Air Conditioning (HVAC) systems may be relied on to provide confinement and limit the spread of any airborne radioactive material from handling operations. Workers may be involved in manual and remote operations in handling transportation casks, canisters, waste packages, or bare spent nuclear fuel assemblies inside facility buildings. As part of routine operations within these facilities, radioactive material may potentially become airborne if canisters are opened or bare fuel assemblies are handled. Leakage of contaminated air from the handling area into adjacent occupied areas, therefore, represents a potential radiological exposure pathway for indoor workers. The objective of this paper is to demonstrate modeling capabilities that can be used by the regulator to estimate frequencies of

  14. Population Analysis of Adverse Events in Different Age Groups Using Big Clinical Trials Data.

    Science.gov (United States)

    Luo, Jake; Eldredge, Christina; Cho, Chi C; Cisler, Ron A

    2016-10-17

    Understanding adverse event patterns in clinical studies across populations is important for patient safety and protection in clinical trials as well as for developing appropriate drug therapies, procedures, and treatment plans. The objective of our study was to conduct a data-driven population-based analysis to estimate the incidence, diversity, and association patterns of adverse events by age of the clinical trials patients and participants. Two aspects of adverse event patterns were measured: (1) the adverse event incidence rate in each of the patient age groups and (2) the diversity of adverse events defined as distinct types of adverse events categorized by organ system. Statistical analysis was done on the summarized clinical trial data. The incidence rate and diversity level in each of the age groups were compared with the lowest group (reference group) using t tests. Cohort data were obtained from ClinicalTrials.gov, and 186,339 clinical studies were analyzed; data were extracted from the 17,853 clinical trials that reported clinical outcomes. The total number of clinical trial participants was 6,808,619, and the total number of participants affected by adverse events in these trials was 1,840,432. The trial participants were divided into eight different age groups to support cross-age group comparison. In general, children and older patients are more susceptible to adverse events in clinical trial studies. Using the lowest incidence age group as the reference group (20-29 years), the incidence rate of the 0-9 years-old group was 31.41%, approximately 1.51 times higher (P=.04) than the young adult group (20-29 years) at 20.76%. The second-highest group is the 50-59 years-old group with an incidence rate of 30.09%, significantly higher (Pgroup. The adverse event diversity also increased with increasing patient age. Clinical studies that recruited older patients (older than 40 years) were more likely to observe a diverse range of adverse events (Page group (older

  15. Event-building and PC farm based level-3 trigger at the CDF experiment

    CERN Document Server

    Anikeev, K; Furic, I K; Holmgren, D; Korn, A J; Kravchenko, I V; Mulhearn, M; Ngan, P; Paus, C; Rakitine, A; Rechenmacher, R; Shah, T; Sphicas, Paris; Sumorok, K; Tether, S; Tseng, J

    2000-01-01

    In the technical design report, the event building process at Fermilab's CDF experiment is required to function at an event rate of 300 events/sec. The events are expected to have an average size of 150 kBytes (kB) and are assembled from fragments of 16 readout locations. The fragment size from the different locations varies between 12 kB and 16 kB. Once the events are assembled they are fed into the Level-3 trigger which is based on processors running programs to filter events using the full event information. Processing time on the order of one second on a Pentium II processor is required per event. The architecture design is driven by the cost and is therefore based on commodity components: VME processor modules running VxWorks for the readout, an ATM switch for the event building, and Pentium PCs running Linux as the operating system for the Level-3 event processing. Pentium PCs are also used to receive events from the ATM switch and further distribute them to the processing nodes over multiple 100 Mbps Ether...
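
    The quoted design figures imply the following back-of-the-envelope bandwidths (numbers from the abstract, arithmetic only):

        event_rate = 300                  # assembled events per second
        avg_event_kb = 150                # average event size in kB
        frag_kb = (12, 16)                # fragment size range per readout source

        aggregate_mb_s = event_rate * avg_event_kb / 1000.0   # ~45 MB/s of event data
        per_source_mb_s = [event_rate * f / 1000.0 for f in frag_kb]  # ~3.6-4.8 MB/s
        print(aggregate_mb_s, per_source_mb_s)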

  16. Exploitation of a component event data bank for common cause failure analysis

    International Nuclear Information System (INIS)

    Games, A.M.; Amendola, A.; Martin, P.

    1985-01-01

    Investigations into using the European Reliability Data System Component Event Data Bank for common cause failure analysis have been carried out. Starting from early exercises where data were analyzed without computer aid, different types of linked multiple failures have been identified. A classification system is proposed based on this experience. It defines a multiple failure event space wherein each category defines causal, modal, temporal and structural links between failures. It is shown that a search algorithm which incorporates the specific interrogative procedures of the data bank can be developed in conjunction with this classification system. It is concluded that the classification scheme and the search algorithm are useful organizational tools in the field of common cause failure studies. However, it is also suggested that the use of the term common cause failure should be avoided since it embodies too many different types of linked multiple failures

  17. Multivariate analysis methods to tag b quark events at LEP/SLC

    International Nuclear Information System (INIS)

    Brandl, B.; Falvard, A.; Guicheney, C.; Henrard, P.; Jousset, J.; Proriol, J.

    1992-01-01

    Multivariate analyses are applied to tag Z → bb-bar events at LEP/SLC. They are based on the specific b-event shape caused by the large b-quark mass. Discriminant analyses, classification trees and neural networks are presented and their performances are compared. It is shown that the neural network approach, due to its non-linearity, copes best with the complexity of the problem. As an example of an application of the developed methods, the measurement of Γ(Z → bb-bar) is discussed. The usefulness of methods based on the global event shape is limited by the uncertainties introduced by the necessity of event simulation. As a solution, a double tag method is presented which can be applied to many tasks of LEP/SLC heavy flavour physics. (author) 29 refs.; 6 figs.; 1 tab

  18. In vitro evaluation of demineralized freeze-dried bone allograft in combination with enamel matrix derivative.

    Science.gov (United States)

    Miron, Richard J; Bosshardt, Dieter D; Laugisch, Oliver; Dard, Michel; Gemperli, Anja C; Buser, Daniel; Gruber, Reinhard; Sculean, Anton

    2013-11-01

    Preclinical and clinical studies suggest that a combination of enamel matrix derivative (EMD) with demineralized freeze-dried bone allograft (DFDBA) may improve periodontal wound healing and regeneration. To date, no single study has characterized the effects of this combination on in vitro cell behavior. The aim of this study is to test the ability of EMD to adsorb to the surface of DFDBA particles and determine the effect of EMD coating on downstream cellular pathways such as adhesion, proliferation, and differentiation of primary human osteoblasts and periodontal ligament (PDL) cells. DFDBA particles were precoated with EMD or human blood and analyzed for protein adsorption patterns via scanning electron microscopy. Cell attachment and proliferation were quantified using a commercial assay. Cell differentiation was analyzed using real-time polymerase chain reaction for genes encoding Runx2, alkaline phosphatase, osteocalcin, and collagen 1α1, and mineralization was assessed using alizarin red staining. Analysis of cell attachment revealed no significant differences among control, blood-coated, and EMD-coated DFDBA particles. EMD significantly increased cell proliferation at 3 and 5 days after seeding for both osteoblasts and PDL cells compared to control and blood-coated samples. Moreover, there were significantly higher messenger ribonucleic acid levels of osteogenic differentiation markers, including collagen 1α1, alkaline phosphatase, and osteocalcin, in osteoblasts and PDL cells cultured on EMD-coated DFDBA particles at 3, 7, and 14 days. The results suggest that the addition of EMD to DFDBA particles may influence periodontal regeneration by stimulating PDL cell and osteoblast proliferation and differentiation.

  19. Top event prevention analysis - a deterministic use of PRA

    International Nuclear Information System (INIS)

    Blanchard, D.P.; Worrell, R.B.

    1995-01-01

    Risk importance measures are popular for many applications of probabilistic analysis. Inherent in the derivation of risk importance measures are implicit assumptions of which those using these numerical results should be aware in their decision making. These assumptions and potential limitations include the following: (1) the risk importance measures are derived for a single event at a time and are therefore valid only if all other event probabilities are unchanged at their current values; (2) the results from which risk importance measures are derived may not be complete, for reasons such as truncation.

  20. [Analysis on the adverse events of cupping therapy in the application].

    Science.gov (United States)

    Zhou, Xin; Ruan, Jing-wen; Xing, Bing-feng

    2014-10-01

    A detailed analysis has been carried out on the cases of adverse events and common injuries from cupping therapy encountered in recent years, in terms of manipulation and the patient's constitution. The adverse events of cupping therapy are commonly caused by improper manipulation by medical practitioners who ignore contraindications and the patient's constitution. Clinical practitioners should use cupping therapy cautiously, strictly follow the rules of standard manipulation and the medical core system, pay attention to contraindications, and take strict precautions against the occurrence of adverse events.

  1. Web-based online system for recording and examining events in power plants

    International Nuclear Information System (INIS)

    Seyd Farshi, S.; Dehghani, M.

    2004-01-01

    Occurrence of events in power plants can result in serious drawbacks to power generation, which makes online recording and examination of events highly important. In this paper an online web-based system is introduced which records and examines events in power plants. Throughout the paper, the procedures for the design and implementation of this system, its features, and the results gained are explained. The system provides a predefined level of online access to all event data for all its users in power plants, dispatching centres, regional utilities and top-level management. By implementation over the electric power industry intranet, an expandable modular system to be used in different sectors of the industry is offered. The web-based online event recording and examination system offers the following advantages: - Online recording of events in power plants. - Examination of events in regional utilities. - Access to event data. - Preparation of managerial reports.

  2. Real-time detection of organic contamination events in water distribution systems by principal components analysis of ultraviolet spectral data.

    Science.gov (United States)

    Zhang, Jian; Hou, Dibo; Wang, Ke; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo

    2017-05-01

    The detection of organic contaminants in water distribution systems is essential to protect public health from potentially harmful compounds resulting from accidental spills or intentional releases. Existing methods for detecting organic contaminants are based on quantitative analyses such as chemical testing and gas/liquid chromatography, which are time- and reagent-consuming and involve costly maintenance. This study proposes a novel procedure based on discrete wavelet transform and principal component analysis for detecting organic contamination events from ultraviolet spectral data. First, the spectrum of each observation is transformed using a discrete wavelet transform with a coiflet mother wavelet to capture abrupt changes along the wavelength. Principal component analysis is then employed to approximate the spectra based on the captured and fused features. The value of Hotelling's T² statistic is calculated and used to detect outliers. A contamination-event alarm is triggered by sequential Bayesian analysis when outliers appear continuously in several observations. The effectiveness of the proposed procedure is tested online using a pilot-scale setup and experimental data.
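
    A minimal sketch of this pipeline, assuming PyWavelets and scikit-learn; the data, control limit, and run length below are placeholders, not the study's calibrated values, and a consecutive-outlier rule stands in for the sequential Bayesian step.

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA

def wavelet_features(spectrum, wavelet="coif1", level=3):
    # DWT with a coiflet mother wavelet; concatenating the coefficient
    # arrays keeps the abrupt changes along the wavelength axis.
    coeffs = pywt.wavedec(spectrum, wavelet, level=level)
    return np.concatenate(coeffs)

rng = np.random.default_rng(0)
baseline = rng.normal(size=(200, 256))       # placeholder "clean water" spectra
X = np.array([wavelet_features(s) for s in baseline])
pca = PCA(n_components=5).fit(X)

def hotelling_t2(spectrum):
    # T^2 of one observation in the retained principal-component subspace.
    score = pca.transform(wavelet_features(spectrum)[None, :])[0]
    return np.sum(score ** 2 / pca.explained_variance_)

T2_LIMIT, RUN = 15.0, 3                      # assumed values

def contamination_alarm(stream):
    # Alarm only when several consecutive observations exceed the limit.
    run = 0
    for spectrum in stream:
        run = run + 1 if hotelling_t2(spectrum) > T2_LIMIT else 0
        if run >= RUN:
            return True
    return False
```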

  3. [Incidence rate of adverse reaction/event by Qingkailing injection: a Meta-analysis of single rate].

    Science.gov (United States)

    Ai, Chun-ling; Xie, Yan-ming; Li, Ming-quan; Wang, Lian-xin; Liao, Xing

    2015-12-01

    To systematically review the incidence rate of adverse drug reactions/events induced by Qingkailing injection. Databases including PubMed, EMbase, the Cochrane Library, CNKI, VIP, WanFang Data and CBM were searched by computer from their foundation to July 30, 2015. Two reviewers independently screened the literature according to the inclusion and exclusion criteria, extracted the data and cross-checked it. Meta-analysis was then performed using the R 3.2.0 software, and subgroup sensitivity analysis was performed based on age, mode of medication, observation time and research quality. Sixty-three studies involving 9,793 patients given Qingkailing injection were included, and 367 cases of adverse reactions/events were reported in total. The incidence rate of adverse reactions of the skin and mucosa was 2% [95% CI (0.02; 0.03)]; that of digestive system adverse reactions was 6% [95% CI (0.05; 0.07)]; that of injection-site adverse reactions was 4% [95% CI (0.02; 0.07)]. For digestive system reactions as the main type of adverse reaction/event, the incidence in children and adults was 4.6% [0.0211; 0.0977] and 6.9% [0.0535; 0.0898], respectively. For adverse reactions/events with skin and mucous membrane damage as the main presentation, the incidence for observation times > 7 days and ≤ 7 days was 3% [0.0129; 0.0683] and 1.9% [0.0078; 0.0461], respectively. Subgroup analysis showed that, across the different types of adverse reactions, the incidence of adverse reactions/events under combination therapy was higher than under the single drug, and the difference was statistically significant. Adverse reactions are influenced by factors such as drug combination and age, and these influences vary across populations. Therefore, clinicians should exercise special care and implement individualized medication when prescribing Qingkailing injection, particularly for children and the elderly.

  4. Ground standoff mine detection system (GSTAMIDS) engineering, manufacturing, and development (EMD) Block 0

    Science.gov (United States)

    Pressley, Jackson R.; Pabst, Donald; Sower, Gary D.; Nee, Larry; Green, Brian; Howard, Peter

    2001-10-01

    The United States Army has contracted EG&G Technical Services to build the GSTAMIDS EMD Block 0. This system autonomously detects and marks buried anti-tank land mines from an unmanned vehicle. It consists of a remotely operated host vehicle, standard teleoperation system (STS) control, a mine detection system (MDS) and a control vehicle. Two complete systems are being fabricated, along with a third MDS. The host vehicle for Block 0 is the South African Meerkat, which has overpass capability for anti-tank mines as well as anti-mine blast armor and ballistic protection. It is operated via the STS radio link from within the control vehicle. The Main Computer System (MCS), located in the control vehicle, receives sensor data from the MDS via a high-speed radio link, processes and fuses the data to decide on a mine detection, and sends the information back to the host vehicle so that a mark can be placed on the mine location. The MCS also has the capability to interface with the FBCB2 system via the SINCGARS radio. The GSTAMIDS operator station and the control vehicle communications system also connect to the MCS. The MDS sensors are mounted on the host vehicle and include Ground Penetrating Radar (GPR), a Pulsed Magnetic Induction (PMI) metal detector, and (as an option) long-wave infrared (LWIR). A distributed processing architecture is used so that pre-processing is performed on data at the sensor level before transmission to the MCS, minimizing the required throughput. Nine (9) channels each of GPR and PMI are mounted underneath the Meerkat to provide a three-meter detection swath. Two IR cameras are mounted on the upper sides of the Meerkat, providing a field of view of the required swath with overlap underneath the vehicle. Also included on the host vehicle are an Inertial Navigation System (INS), a Global Positioning System (GPS), and radio communications for remote control and data transmission. The GSTAMIDS Block 0 is designed as a modular, expandable system.

  5. Cognitive load and task condition in event- and time-based prospective memory: an experimental investigation.

    Science.gov (United States)

    Khan, Azizuddin; Sharma, Narendra K; Dixit, Shikha

    2008-09-01

    Prospective memory is memory for the realization of delayed intention. Researchers distinguish 2 kinds of prospective memory: event- and time-based (G. O. Einstein & M. A. McDaniel, 1990). Taking that distinction into account, the present authors explored participants' comparative performance under event- and time-based tasks. In an experimental study of 80 participants, the authors investigated the roles of cognitive load and task condition in prospective memory. Cognitive load (low vs. high) and task condition (event- vs. time-based task) were the independent variables. Accuracy in prospective memory was the dependent variable. Results showed significant differential effects under event- and time-based tasks. However, the effect of cognitive load was more detrimental in time-based prospective memory. Results also revealed that time monitoring is critical in successful performance of time estimation and so in time-based prospective memory. Similarly, participants' better performance on the event-based prospective memory task showed that they acted on the basis of environment cues. Event-based prospective memory was environmentally cued; time-based prospective memory required self-initiation.

  6. A semi-supervised learning framework for biomedical event extraction based on hidden topics.

    Science.gov (United States)

    Zhou, Deyu; Zhong, Dayou

    2015-05-01

    Scientists have devoted decades of effort to understanding the interactions between proteins and RNA production. This information might extend current knowledge on drug reactions or the development of certain diseases. Nevertheless, due to its lack of explicit structure, the literature of the life sciences, one of the most important sources of this information, prevents computer-based systems from accessing it. Therefore, biomedical event extraction, which automatically acquires knowledge of molecular events from research articles, has recently attracted community-wide efforts. Most approaches are based on statistical models and require large-scale annotated corpora to precisely estimate the models' parameters. However, such corpora are usually difficult to obtain in practice. Therefore, employing un-annotated data via semi-supervised learning for biomedical event extraction is a feasible solution that is attracting growing interest. In this paper, a semi-supervised learning framework based on hidden topics for biomedical event extraction is presented. In this framework, sentences in the un-annotated corpus are elaborately and automatically assigned event annotations based on their distances to the sentences in the annotated corpus. More specifically, not only the structures of the sentences but also the hidden topics embedded in the sentences are used for describing the distance. The sentences and newly assigned event annotations, together with the annotated corpus, are employed for training. Experiments were conducted on the multi-level event extraction corpus, a gold standard corpus. Experimental results show that the proposed framework achieves an improvement of more than 2.2% in F-score on biomedical event extraction when compared to the state-of-the-art approach. The results suggest that, by incorporating un-annotated data, the proposed framework indeed improves the performance of the state-of-the-art event extraction system, and that the similarity between sentences can be described more precisely by incorporating hidden topics.

  7. Analysis of factors associated with hiccups based on the Japanese Adverse Drug Event Report database.

    Science.gov (United States)

    Hosoya, Ryuichiro; Uesawa, Yoshihiro; Ishii-Nozawa, Reiko; Kagaya, Hajime

    2017-01-01

    Hiccups are occasionally experienced by most individuals. Although hiccups are not life-threatening, they may lead to a decline in quality of life. Previous studies showed that hiccups may occur as an adverse effect of certain medicines during chemotherapy. Furthermore, a male dominance in hiccups has been reported. However, due to the limited number of studies conducted on this phenomenon, debate still surrounds the few factors known to influence hiccups. The present study aimed to investigate the influence of medicines and patient characteristics on hiccups using a large adverse drug event report database, specifically the Japanese Adverse Drug Event Report (JADER) database. Cases of adverse effects associated with medications were extracted from JADER, and Fisher's exact test was performed to assess the presence or absence of hiccups for each medication. In a multivariate analysis, we conducted a multiple logistic regression analysis using the medication and patient-characteristic variables that showed significance. We also examined the role of dexamethasone in inducing hiccups during chemotherapy. Medicines associated with hiccups included dexamethasone, levofolinate, fluorouracil, oxaliplatin, carboplatin, and irinotecan. Patient characteristics associated with hiccups included male gender and greater height. The combination of an anti-cancer agent with dexamethasone was noted in more than 95% of patients in the dexamethasone-use group, while hiccups also occurred in patients in the anti-cancer agent-use group who did not use dexamethasone. Most of the medications that induce hiccups are used in chemotherapy. The results of the present study suggest that it is possible to predict a high risk of hiccups using patient characteristics. We confirmed that dexamethasone is the drug with the strongest influence on the induction of hiccups, although the influence of anti-cancer agents themselves cannot be denied.

  8. Fault trees based on past accidents. Factorial analysis of events

    International Nuclear Information System (INIS)

    Vaillant, M.

    1977-01-01

    The fault tree method is already useful as a qualitative step before any reliability calculation. The construction of the tree becomes even simpler when we just want to describe how the events happened. Unlike scenarios, which introduce several possibilities by means of the conjunction OR, here one has only the conjunction AND, which need not be written at all. This method is presented by INRS (1) for the study of industrial injuries; it may also be applied to material damage. (orig.) [de]

  9. Discrete dynamic event tree modeling and analysis of nuclear power plant crews for safety assessment

    International Nuclear Information System (INIS)

    Mercurio, D.

    2011-01-01

    Current Probabilistic Risk Assessment (PRA) and Human Reliability Analysis (HRA) methodologies model the evolution of accident sequences in Nuclear Power Plants (NPPs) mainly on the basis of logic trees. The evolution of these sequences is a result of the interactions between the crew and the plant; in current PRA methodologies, simplified models of these complex interactions are used. In this study, the Accident Dynamic Simulator (ADS), a modeling framework based on the Discrete Dynamic Event Tree (DDET), has been used for the simulation of crew-plant interactions during potential accident scenarios in NPPs. In addition, an operator/crew model has been developed to treat the response of the crew to the plant. The 'crew model' comprises three operators whose behavior is guided by a set of rules of behavior (representing the knowledge and training of the operators) coupled with written and mental procedures. In addition, an approach for addressing crew timing variability in DDETs has been developed and implemented based on a set of HRA data from a simulator study. Finally, grouping techniques were developed and applied to the analysis of the scenarios generated by the crew-plant simulation. These techniques support the post-simulation analysis by grouping similar accident sequences, identifying the key contributing events, and quantifying the conditional probability of the groups; they are used to characterize the context of the crew actions in order to obtain insights for HRA. The model has been applied to the analysis of a Small Loss Of Coolant Accident (SLOCA) event for a Pressurized Water Reactor (PWR). The simulation results support an improved characterization of the performance conditions, or context, of operator actions, which can be used in an HRA analysis of the reliability of those actions, by providing information on the evolution of system indications, the dynamics of cues, crew timing in performing procedure steps, and the evolving situation.

  10. Analysis of respiratory events in obstructive sleep apnea syndrome: Inter-relations and association to simple nocturnal features.

    Science.gov (United States)

    Ghandeharioun, H; Rezaeitalab, F; Lotfi, R

    2016-01-01

    This study carefully evaluates the association of different respiration-related events with each other and with simple nocturnal features in obstructive sleep apnea-hypopnea syndrome (OSAS). The events include apneas, hypopneas, respiratory event-related arousals (RERAs) and snores. We conducted a statistical study on 158 adults who underwent polysomnography between July 2012 and May 2014. To assess relevance, linear statistical strategies such as analysis of variance and bootstrapping of the correlation coefficient standard error were complemented by the non-linear method of mutual information, which clarifies ambiguous results of the linear techniques. Based on normalized mutual information weights (NMIW), indices of apnea are 1.3 times more relevant to AHI values than those of hypopnea. NMIW for the number of blood oxygen desaturations below 95% is considerable (0.531). The next most relevant feature is the respiratory arousal index, with an NMIW of 0.501. Snore indices (0.314) and BMI (0.203) take the next places. Based on NMIW values, snoring events are nearly one-third (29.9%) more dependent on hypopneas than on RERAs. 1. The more severe the OSAS, the more frequently the apneic events happen. 2. An association of snoring with hypopnea/RERA is revealed, which is routinely ignored in regression-based OSAS modeling. 3. The statistical dependencies of the oximetry features can potentially lead to home-based screening of OSAS. 4. The poor ESS-AHI relevance in the database under study indicates its inadequacy for OSAS diagnosis compared to oximetry. 5. Given the poor RERA-snore/ESS relevance, a detailed history of the symptoms plus polysomnography is suggested for accurate diagnosis of RERAs. Copyright © 2015 Sociedade Portuguesa de Pneumologia. Published by Elsevier España, S.L.U. All rights reserved.
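
    The feature ranking by normalized mutual information can be approximated along the following lines; the sketch assumes scikit-learn, and normalizing to the strongest feature is an assumption rather than the authors' exact definition of NMIW.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

def nmi_weights(features, ahi, names):
    """Rank nocturnal features by mutual information with AHI.

    features: array of shape (n_patients, n_features); ahi: (n_patients,).
    """
    mi = mutual_info_regression(features, ahi, random_state=0)
    nmi = mi / mi.max()                  # normalize to the strongest feature
    order = np.argsort(nmi)[::-1]
    return [(names[i], round(float(nmi[i]), 3)) for i in order]
```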

  11. Networked Estimation for Event-Based Sampling Systems with Packet Dropouts

    Directory of Open Access Journals (Sweden)

    Young Soo Suh

    2009-04-01

    This paper is concerned with a networked estimation problem in which sensor data are transmitted over the network. In the event-based sampling scheme known as level-crossing or send-on-delta (SOD) sampling, sensor data are transmitted to the estimator node if the difference between the current sensor value and the last transmitted one is greater than a given threshold. Event-based sampling has been shown to be more efficient than time-triggered sampling in some situations, especially in terms of network bandwidth. However, it cannot detect packet dropouts, because data transmission and reception do not use the periodic time-stamp mechanism found in time-triggered sampling systems. Motivated by this issue, we propose a modified event-based sampling scheme, called modified SOD, in which sensor data are sent when either the change in the sensor output exceeds a given threshold or more than a given interval of time has elapsed. Simulation results show that the proposed modified SOD sampling significantly improves estimation performance when packet dropouts happen.
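
    The send-on-delta rule and the proposed modification are simple to state in code; the threshold and timeout values in this sketch are illustrative assumptions, not the paper's settings.

```python
DELTA = 0.5        # send when |x - last_sent| > DELTA
MAX_SILENCE = 10   # modified SOD: also send after this many samples

class ModifiedSodSensor:
    def __init__(self, delta=DELTA, max_silence=MAX_SILENCE):
        self.delta = delta
        self.max_silence = max_silence
        self.last_sent = None
        self.silence = 0

    def step(self, x):
        """Return x if it should be transmitted this sample, else None."""
        self.silence += 1
        trigger = (
            self.last_sent is None
            or abs(x - self.last_sent) > self.delta   # level-crossing rule
            or self.silence >= self.max_silence       # heartbeat rule
        )
        if trigger:
            self.last_sent, self.silence = x, 0
            return x
        return None
```

    The heartbeat rule is what lets the estimator distinguish a quiet sensor from a lost packet: if nothing arrives for longer than the maximum silence, a dropout can be inferred.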

  12. Multidimensional k-nearest neighbor model based on EEMD for financial time series forecasting

    Science.gov (United States)

    Zhang, Ningning; Lin, Aijing; Shang, Pengjian

    2017-07-01

    In this paper, we propose a new two-stage methodology that combines ensemble empirical mode decomposition (EEMD) with a multidimensional k-nearest neighbor model (MKNN) in order to forecast the closing price and high price of stocks simultaneously. Modified k-nearest neighbors (KNN) algorithms are finding increasingly wide application in prediction across many fields. Empirical mode decomposition (EMD) decomposes a nonlinear and non-stationary signal into a series of intrinsic mode functions (IMFs); however, as a result of mode mixing, it cannot reveal the characteristic information of the signal with much accuracy. Ensemble empirical mode decomposition (EEMD), an improved version of EMD, resolves this weakness by adding white noise to the original data. With EEMD, components with true physical meaning can be extracted from the time series. Utilizing the advantages of EEMD and MKNN, the proposed EEMD-MKNN model has high predictive precision for short-term forecasting. Moreover, we extend this methodology to the two-dimensional case to forecast the closing price and high price of four stock indices (NAS, S&P500, DJI and STI) at the same time. The results indicate that the proposed EEMD-MKNN model has a higher forecast precision than EMD-KNN, the KNN method and ARIMA.
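
    A rough sketch of the two-stage idea, assuming the PyEMD package for EEMD and scikit-learn for the k-NN regressor; the one-dimensional lag embedding and the summation over per-IMF forecasts are simplifications, not the authors' exact multidimensional algorithm.

```python
import numpy as np
from PyEMD import EEMD
from sklearn.neighbors import KNeighborsRegressor

def forecast_component(series, lags=5, k=5):
    # Embed the component: predict x[t] from (x[t-lags], ..., x[t-1]).
    X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
    y = series[lags:]
    model = KNeighborsRegressor(n_neighbors=k).fit(X, y)
    return model.predict(series[-lags:][None, :])[0]   # one-step forecast

def eemd_knn_forecast(close, trials=50):
    # Decompose with EEMD, forecast each IMF separately, then recombine.
    imfs = EEMD(trials=trials).eemd(np.asarray(close, dtype=float))
    return sum(forecast_component(imf) for imf in imfs)
```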

  13. Rocchio-based relevance feedback in video event retrieval

    NARCIS (Netherlands)

    Pingen, G.L.J.; de Boer, M.H.T.; Aly, Robin; Amsaleg, Laurent; Guðmundsson, Gylfi Þór; Gurrin, Cathal; Jónsson, Björn Þór; Satoh, Shin’ichi

    This paper investigates methods for user and pseudo relevance feedback in video event retrieval. Existing feedback methods achieve strong performance but adjust the ranking based on few individual examples. We propose a relevance feedback algorithm (ARF) derived from the Rocchio method, a classic relevance feedback approach from text retrieval that adjusts the query representation towards relevant and away from non-relevant examples.

  14. To what extent can global warming events influence scaling properties of climatic fluctuations in glacial periods?

    Science.gov (United States)

    Alberti, Tommaso; Lepreti, Fabio; Vecchio, Antonio; Carbone, Vincenzo

    2017-04-01

    The Earth's climate is an extremely unstable complex system consisting of nonlinear and still rather poorly understood interactions among atmosphere, land surface, ice and oceans. The system is mainly driven by solar irradiance, even if internal components such as volcanic eruptions and human activities affect the atmospheric composition, thus acting as drivers of climate change. Since the extreme climate variability is the result of a set of phenomena operating on timescales from daily to multi-millennial, with different correlation times, a study of the scaling properties of the system can reveal non-trivial persistent structures and internal or external physical processes. Recently, the scaling properties of paleoclimate changes have been analyzed by distinguishing between interglacial and glacial climates [Shao and Ditlevsen, 2016]. The results show that the last glacial record (20-120 kyr BP) presents some elements of multifractality, while the last interglacial period (0-10 kyr BP), the Holocene, seems to be characterized by a mono-fractal structure. This is associated with the absence of Dansgaard-Oeschger (DO) events in the interglacial climate, which could be the cause of the absence of multifractality. This hypothesis is supported by the analysis of the period between 18 and 27 kyr BP, i.e., during the Last Glacial Period, in which a single DO event has been registered. Through the Empirical Mode Decomposition (EMD) we were able to detect a timescale separation within the Last Glacial Period (20-120 kyr BP) into two main components: a high-frequency component, related to the occurrence of DO events, and a low-frequency one, associated with the cooling/warming phase switches [Alberti et al., 2014]. Here, we investigate the scaling properties of the climate fluctuations within the Last Glacial Period, where abrupt climate changes, characterized by fast increases of temperature and usually called Dansgaard-Oeschger (DO) events, have been particularly pronounced. By using this EMD-based timescale separation, we assess to what extent warming events influence the scaling properties of climatic fluctuations in glacial periods.

  15. Evaluation of Fourier integral. Spectral analysis of seismic events

    International Nuclear Information System (INIS)

    Chitaru, Cristian; Enescu, Dumitru

    2003-01-01

    Spectral analysis of seismic events represents a method for great-earthquake prediction. The seismic signal is not a sinusoidal signal; it is therefore necessary to find a method for the best approximation of the real signal by sinusoidal components. The 'Quanterra' broadband station allows access to the data in numerical and/or graphical form. With the numerical form we can easily build a computer program (MS Office Excel) for spectral analysis. (authors)
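
    The same amplitude-spectrum computation is a few lines of NumPy rather than a spreadsheet; the Hann taper here is an assumption about preprocessing, not part of the original report.

```python
import numpy as np

def amplitude_spectrum(signal, fs):
    """Return (frequencies in Hz, amplitudes) of a sampled seismic trace."""
    signal = np.asarray(signal, dtype=float)
    n = len(signal)
    spectrum = np.fft.rfft(signal * np.hanning(n))  # taper reduces leakage
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, 2.0 * np.abs(spectrum) / n
```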

  16. Introduction

    Science.gov (United States)

    Warwick, Peter D.

    2007-01-01

    The inevitable increase in demand and continuing depletion of accessible oil and gas resources during the 21st century will cause greater dependence on energy minerals such as coal, uranium, and unconventional sources of oil and natural gas to satisfy our increasing energy needs. The Energy Minerals Division (EMD) of the American Association of Petroleum Geologists (AAPG) is a membership-based technical interest group with goals to: (1) advance the science of geology, especially as it relates to exploration, discovery, and production of mineral resources and subsurface gas and liquids (other than conventional oil and gas) for energy-related purposes; (2) foster the spirit of scientific research; (3) disseminate information related to the geology of energy minerals and the associated technology of energy mineral resources extraction; and (4) advance the professional wellbeing of its members. This article contains a brief summary of some of the 2006 annual committee reports presented to the EMD Leadership. These reports are available to the EMD Membership at http://emd.aapg.org/members_only. This collection of short reports is presented here by the EMD as a service to the general geologic community and to stimulate interest in the focus technical areas of EMD.

  17. Empirical mode decomposition and Hilbert transforms for analysis of oil-film interferograms

    International Nuclear Information System (INIS)

    Chauhan, Kapil; Ng, Henry C H; Marusic, Ivan

    2010-01-01

    Oil-film interferometry is rapidly becoming the preferred method for direct measurement of wall shear stress in studies of wall-bounded turbulent flows. Although widely accepted as the most accurate technique, it does have inherent measurement uncertainties, one of which is associated with determining the fringe spacing; this is the focus of this paper. Conventional analysis methods involve a certain level of user input and thus some subjectivity. In this paper, we consider empirical mode decomposition (EMD) and the Hilbert transform as an alternative tool for analyzing oil-film interferograms. In contrast to the commonly used Fourier-based techniques, this new method is less subjective and, being based on the Hilbert transform, is superior for treating amplitude- and frequency-modulated data. This makes it particularly robust to wide differences in the quality of interferograms.
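
    A sketch of the EMD-plus-Hilbert route to fringe spacing, assuming PyEMD and SciPy and a 1-D intensity profile extracted from the interferogram; the variance-based choice of the dominant IMF is a stand-in heuristic, not necessarily the authors' selection step.

```python
import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD

def fringe_spacing(profile, dx=1.0):
    """Estimate mean fringe spacing (in units of dx) from an intensity profile."""
    imfs = EMD().emd(np.asarray(profile, dtype=float))
    # Pick the IMF with the largest variance as the dominant fringe mode.
    fringe_mode = imfs[np.argmax([imf.var() for imf in imfs])]
    # Instantaneous phase from the analytic signal; its gradient gives the
    # local fringe frequency in cycles per sample.
    phase = np.unwrap(np.angle(hilbert(fringe_mode)))
    cycles_per_sample = np.gradient(phase) / (2 * np.pi)
    return dx / np.mean(cycles_per_sample)
```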

  18. Replica analysis of overfitting in regression models for time-to-event data

    Science.gov (United States)

    Coolen, A. C. C.; Barrett, J. E.; Paga, P.; Perez-Vicente, C. J.

    2017-09-01

    Overfitting, which happens when the number of parameters in a model is too large compared to the number of data points available for determining these parameters, is a serious and growing problem in survival analysis. While modern medicine presents us with data of unprecedented dimensionality, these data cannot yet be used effectively for clinical outcome prediction. Standard error measures in maximum likelihood regression, such as p-values and z-scores, are blind to overfitting, and even for Cox’s proportional hazards model (the main tool of medical statisticians), one finds in the literature only rules of thumb on the number of samples required to avoid overfitting. In this paper we present a mathematical theory of overfitting in regression models for time-to-event data, which aims to increase our quantitative understanding of the problem and provide practical tools with which to correct regression outcomes for the impact of overfitting. It is based on the replica method, a statistical mechanical technique for the analysis of heterogeneous many-variable systems that has been used successfully for several decades in physics, biology, and computer science, but not yet in medical statistics. We develop the theory initially for arbitrary regression models for time-to-event data, and verify its predictions in detail for the popular Cox model.

  19. A general framework for time series data mining based on event analysis: application to the medical domains of electroencephalography and stabilometry.

    Science.gov (United States)

    Lara, Juan A; Lizcano, David; Pérez, Aurora; Valente, Juan P

    2014-10-01

    There are now many domains where information is recorded over a period of time, leading to sequences of data known as time series. In many domains, such as medicine, time series analysis requires focusing on certain regions of interest, known as events, rather than analyzing the whole time series. In this paper, we propose a framework for knowledge discovery in both one-dimensional and multidimensional time series containing events. We show how our approach can be used to classify medical time series by means of a process that identifies events in time series, generates time series reference models of representative events, and compares two time series by analyzing the events they have in common. We have applied our framework to time series generated in the areas of electroencephalography (EEG) and stabilometry. Framework performance was evaluated in terms of classification accuracy, and the results confirmed that the proposed schema has potential for classifying EEG and stabilometric signals. The proposed framework is useful for discovering knowledge from medical time series containing events, such as stabilometric and electroencephalographic time series. These results would be equally applicable to other medical domains generating iconographic time series, such as electrocardiography (ECG). Copyright © 2014 Elsevier Inc. All rights reserved.

  20. WILBER and PyWEED: Event-based Seismic Data Request Tools

    Science.gov (United States)

    Falco, N.; Clark, A.; Trabant, C. M.

    2017-12-01

    WILBER and PyWEED are two user-friendly tools for requesting event-oriented seismic data. Both tools provide interactive maps and other controls for browsing and filtering event and station catalogs and for downloading data for selected event/station combinations, where the data window for each event/station pair may be defined relative to the arrival time of seismic waves from the event at that particular station. Both tools allow data to be previewed visually and can download data in standard miniSEED, SAC, and other formats, complete with the relevant metadata for performing instrument correction. WILBER is a web application requiring only a modern web browser. Once the user has selected an event, WILBER identifies all data available for that time period and allows the user to select stations based on criteria such as a station's distance and orientation relative to the event. When the user has finalized the request, the data are collected and packaged on the IRIS server, and when they are ready the user is sent a download link. PyWEED is a downloadable, cross-platform (Macintosh / Windows / Linux) application written in Python. PyWEED allows a user to select multiple events and stations, and will download data for each event/station combination selected. PyWEED is built around the ObsPy seismic toolkit and allows direct interaction with and control of the application through a Python interactive console.
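
    Since PyWEED is built on ObsPy, the kind of event-based request it automates can be reproduced directly; the station and channel codes below are illustrative, and for brevity the window is taken relative to the origin time rather than a predicted phase arrival.

```python
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("IRIS")

# Browse the event catalog: M >= 6 events in a one-month window.
events = client.get_events(starttime=UTCDateTime("2017-01-01"),
                           endtime=UTCDateTime("2017-02-01"),
                           minmagnitude=6.0)
origin = events[0].preferred_origin() or events[0].origins[0]

# Window the waveform relative to the event origin time and save as SAC.
st = client.get_waveforms("IU", "ANMO", "00", "BHZ",
                          origin.time, origin.time + 600)
st.write("event_ANMO.sac", format="SAC")
```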

  1. The role of musical training in emergent and event-based timing

    Directory of Open Access Journals (Sweden)

    Lawrence eBaer

    2013-05-01

    Musical performance is thought to rely predominantly on event-based timing, involving a clock-like neural process and an explicit internal representation of the time interval. Some aspects of musical performance may rely on emergent timing, which is established through the optimization of movement kinematics and can be maintained without reference to any explicit representation of the time interval. We predicted that musical training would have its largest effect on event-based timing, supporting the dissociability of these timing processes and the dominance of event-based timing in musical performance. We compared 22 musicians and 17 non-musicians on the prototypical event-based timing task of finger tapping and on the typically emergently timed task of circle drawing. For each task, participants first responded in synchrony with a metronome (Paced) and then responded at the same rate without the metronome (Unpaced). Analyses of the Unpaced phase revealed that non-musicians were more variable in their inter-response intervals for finger tapping compared to circle drawing. Musicians did not differ between the two tasks. Between groups, non-musicians were more variable than musicians for tapping but not for drawing. We were able to show that the differences were due to lower timer variability in musicians on the tapping task. Correlational analyses of movement jerk and inter-response interval variability revealed a negative association for tapping and a positive association for drawing in non-musicians only. These results suggest that musical training affects temporal variability in tapping but not drawing. Additionally, musicians and non-musicians may be employing different movement strategies to maintain accurate timing in the two tasks. These findings add to our understanding of how musical training affects timing and support the dissociability of event-based and emergent timing modes.

  2. Diet Activity Characteristic of Large-scale Sports Events Based on HACCP Management Model

    OpenAIRE

    Xiao-Feng Su; Li Guo; Li-Hua Gao; Chang-Zhuan Shao

    2015-01-01

    The study proposes a dietary management approach for major sports events based on the HACCP management model, reflecting the characteristics of catering activities at such events. Major sports events are no longer merely showcases of high-level competitive sport; they have become comprehensive special events shaped by complex social, political, economic, cultural and other factors, and they are expected to reach increasingly diverse goals and objectives of an economic, political, cultural, technological and other ...

  3. GPS-based PWV for precipitation forecasting and its application to a typhoon event

    Science.gov (United States)

    Zhao, Qingzhi; Yao, Yibin; Yao, Wanqiang

    2018-01-01

    The temporal variability of precipitable water vapour (PWV) derived from Global Navigation Satellite System (GNSS) observations can be used to forecast precipitation events. A number of case studies of precipitation events in Zhejiang Province have been analysed, and a forecasting method for precipitation events is proposed. The PWV time series retrieved from Global Positioning System (GPS) observations was processed using a least-squares fitting method to obtain the linear tendency of PWV ascents and descents. The increment of PWV over a short time (two to six hours) and the PWV slope over a longer time (a few hours to more than ten hours) during the PWV ascending period are considered as predictive factors for forecasting precipitation events. The numerical results show that about 80%-90% of precipitation events and more than 90% of heavy rain events can be forecast two to six hours in advance based on the proposed method. 5-minute PWV data derived from GPS observations based on real-time precise point positioning (RT-PPP) were used for the typhoon event that passed over Zhejiang Province between 10 and 12 July 2015. A good result was acquired using the proposed method, and about 74% of precipitation events were predicted some ten to thirty minutes before their onset, with a false alarm rate of 18%. This study shows that GPS-based PWV is promising for short-term and now-casting precipitation forecasting.
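
    The two predictive factors translate into a short check on the PWV series; the thresholds and the 4-hour window below are placeholders, not the study's calibrated values.

```python
import numpy as np

PWV_INCREMENT_MM = 5.0    # assumed threshold for the short-term increment
PWV_SLOPE_MM_PER_H = 1.0  # assumed threshold for the longer-term slope

def precipitation_alert(pwv, hours):
    """pwv: PWV series (mm); hours: matching time stamps (h)."""
    pwv, hours = np.asarray(pwv), np.asarray(hours)
    slope = np.polyfit(hours, pwv, 1)[0]          # least-squares tendency
    recent = hours >= hours[-1] - 4.0             # roughly the last 4 h
    increment = pwv[recent][-1] - pwv[recent][0]  # short-term PWV rise
    return slope > PWV_SLOPE_MM_PER_H and increment > PWV_INCREMENT_MM
```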

  4. Studies on switch-based event building systems in RD13

    International Nuclear Information System (INIS)

    Bee, C.P.; Eshghi, S.; Jones, R.

    1996-01-01

    One of the goals of the RD13 project at CERN is to investigate the feasibility of a parallel event building system for detectors at the LHC. Studies were performed by building a prototype based on the HiPPI standard and by modeling this prototype and extended architectures in MODSIM II. The prototype used commercially available VME-HiPPI interfaces and a HiPPI switch, together with modular software. The setup was tested successfully as a parallel event building system in different configurations and with different data-flow control schemes. The simulation program was used with realistic parameters from the prototype measurements to simulate large-scale event building systems, including a realistic setup of the ATLAS event building system. The influence of different parameters and the scaling behavior were investigated, and the influence of realistic event-size distributions was checked with data from off-line simulations. Different control schemes for destination assignment and traffic shaping were investigated, as well as a two-stage event building system. (author)

  5. Urbanization and fertility: an event-history analysis of coastal Ghana.

    Science.gov (United States)

    White, Michael J; Muhidin, Salut; Andrzejewski, Catherine; Tagoe, Eva; Knight, Rodney; Reed, Holly

    2008-11-01

    In this article, we undertake an event-history analysis of fertility in Ghana. We exploit detailed life history calendar data to conduct a more refined and definitive analysis of the relationship among personal traits, urban residence, and fertility. Although urbanization is generally associated with lower fertility in developing countries, inferences in most studies have been hampered by a lack of information about the timing of residence in relation to childbearing. We find that the effect of urbanization itself is strong, evident, and complex, and persists after we control for the effects of age, cohort, union status, and education. Our discrete-time event-history analysis shows that urban women exhibit fertility rates that are, on average, 11% lower than those of rural women, but the effects vary by parity. Differences in urban population traits would augment the effects of urban adaptation itself. Extensions of the analysis point to the operation of a selection effect in rural-to-urban mobility but provide limited evidence for disruption effects. The possibility of further selection of urbanward migrants on unmeasured traits remains. The analysis also demonstrates the utility of an annual life history calendar for collecting such data in the field.

  6. NetWeaver for EMDS user guide (version 1.1): a knowledge base development system.

    Science.gov (United States)

    Keith M. Reynolds

    1999-01-01

    The guide describes the use of the NetWeaver knowledge base development system. Knowledge representation in NetWeaver is based on object-oriented fuzzy-logic networks that offer several significant advantages over the more traditional rule-based representation. Compared to rule-based knowledge bases, NetWeaver knowledge bases are easier to build, test, and maintain because...

  7. Enhancing the Effectiveness of Significant Event Analysis: Exploring Personal Impact and Applying Systems Thinking in Primary Care.

    Science.gov (United States)

    Bowie, Paul; McNaughton, Elaine; Bruce, David; Holly, Deirdre; Forrest, Eleanor; Macleod, Marion; Kennedy, Susan; Power, Ailsa; Toppin, Denis; Black, Irene; Pooley, Janet; Taylor, Audrey; Swanson, Vivien; Kelly, Moya; Ferguson, Julie; Stirling, Suzanne; Wakeling, Judy; Inglis, Angela; McKay, John; Sargeant, Joan

    2016-01-01

    Significant event analysis (SEA) is well established in many primary care settings but can be poorly implemented. Reasons include the emotional impact on clinicians and limited knowledge of systems thinking when establishing why events happen and formulating improvements. To enhance SEA effectiveness, we developed and tested "guiding tools" based on human factors principles. Mixed-methods development of the guiding tools (a Personal Booklet, to help with emotional demands and apply a human factors analysis at the individual level; a Desk Pad, to guide a team-based systems analysis; and a written Report Format) was undertaken by a multiprofessional "expert" group, with testing by Scottish primary care practitioners who submitted completed enhanced SEA reports. Evaluation data were collected through a questionnaire, telephone interviews, and thematic analysis of SEA reports. Overall, 149/240 care practitioners tested the guiding tools and submitted completed SEA reports (62.1%). Reported understanding of how to undertake SEA improved postintervention, and most respondents found that the tools helped them consider systems issues (85/123, 69.1%), while most found the Report Format clear (94/123, 76.4%) and would recommend it (88/123, 71.5%). Most SEA reports adopted a systems approach to analyses (125/149, 83.9%), care improvement (74/149, 49.7%), or planned actions (42/149, 28.2%). Applying human factors principles to SEA potentially enables care teams to gain a systems-based understanding of why things go wrong, which may help with the related emotional demands and with more effective learning and improvement.

  8. Central FPGA-based destination and load control in the LHCb MHz event readout

    International Nuclear Information System (INIS)

    Jacobsson, R.

    2012-01-01

    The readout strategy of the LHCb experiment is based on complete event readout at 1 MHz. A set of 320 sub-detector readout boards transmit event fragments at total rate of 24.6 MHz at a bandwidth usage of up to 70 GB/s over a commercial switching network based on Gigabit Ethernet to a distributed event building and high-level trigger processing farm with 1470 individual multi-core computer nodes. In the original specifications, the readout was based on a pure push protocol. This paper describes the proposal, implementation, and experience of a non-conventional mixture of a push and a pull protocol, akin to credit-based flow control. An FPGA-based central master module, partly operating at the LHC bunch clock frequency of 40.08 MHz and partly at a double clock speed, is in charge of the entire trigger and readout control from the front-end electronics up to the high-level trigger farm. One FPGA is dedicated to controlling the event fragment packing in the readout boards, the assignment of the farm node destination for each event, and controls the farm load based on an asynchronous pull mechanism from each farm node. This dynamic readout scheme relies on generic event requests and the concept of node credit allowing load control and trigger rate regulation as a function of the global farm load. It also allows the vital task of fast central monitoring and automatic recovery in-flight of failing nodes while maintaining dead-time and event loss at a minimum. This paper demonstrates the strength and suitability of implementing this real-time task for a very large distributed system in an FPGA where no random delays are introduced, and where extreme reliability and accurate event accounting are fundamental requirements. It was in use during the entire commissioning phase of LHCb and has been in faultless operation during the first two years of physics luminosity data taking.
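
    The credit-based pull mechanism can be illustrated with a toy dispatcher in which farm nodes grant credits and the master assigns events only against outstanding credits; this is a sketch of the concept in Python, not the FPGA logic.

```python
class EventDispatcher:
    """Toy credit-based pull dispatcher (concept only, not the FPGA logic)."""

    def __init__(self):
        self.credits = {}                  # farm node -> outstanding credits

    def request(self, node, n_credits):
        # A farm node pulls work by granting the master n_credits.
        self.credits[node] = self.credits.get(node, 0) + n_credits

    def assign(self, event_id):
        # Assign the event against any outstanding credit; a slow or dead
        # node simply stops being assigned events once its credits run out.
        for node, remaining in self.credits.items():
            if remaining > 0:
                self.credits[node] -= 1
                return node
        return None                        # farm saturated: throttle trigger
```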

  10. OGLE-2016-BLG-0168 Binary Microlensing Event: Prediction and Confirmation of the Microlens Parallax Effect from Space-based Observations

    Energy Technology Data Exchange (ETDEWEB)

    Shin, I.-G.; Yee, J. C.; Jung, Y. K. [Smithsonian Astrophysical Observatory, 60 Garden Street, Cambridge, MA 02138 (United States); Udalski, A.; Skowron, J.; Mróz, P.; Soszyński, I.; Poleski, R.; Szymański, M. K.; Kozłowski, S.; Pietrukowicz, P.; Ulaczyk, K.; Pawlak, M. [Warsaw University Observatory, Al. Ujazdowskie 4,00-478 Warszawa (Poland); Novati, S. Calchi [IPAC, Mail Code 100-22, California Institute of Technology, 1200 E. California Boulevard, Pasadena, CA 91125 (United States); Han, C. [Department of Physics, Chungbuk National University, Cheongju 371-763 (Korea, Republic of); Albrow, M. D. [University of Canterbury, Department of Physics and Astronomy, Private Bag 4800, Christchurch 8020 (New Zealand); Gould, A. [Department of Astronomy, Ohio State University, 140 W. 18th Avenue, Columbus, OH 43210 (United States); Chung, S.-J.; Hwang, K.-H.; Ryu, Y.-H. [Korea Astronomy and Space Science Institute, 776 Daedeokdae-ro, Yuseong-Gu, Daejeon 34055 (Korea, Republic of); Collaboration: OGLE Collaboration; KMTNet Group; Spitzer Team; and others

    2017-11-01

    The microlens parallax is a crucial observable for conclusively identifying the nature of lens systems in microlensing events containing or composed of faint (even dark) astronomical objects such as planets, neutron stars, brown dwarfs, and black holes. With the commencement of a new era of microlensing in collaboration with space-based observations, the microlens parallax can be routinely measured. In addition, space-based observations can provide opportunities to verify the microlens parallax measured from ground-only observations and to find a unique solution to the lensing light-curve analysis. Furthermore, since most space-based observations cannot cover the full light curves of lensing events, it is also necessary to verify the reliability of the information extracted from fragmentary space-based light curves. We conduct a test based on the microlensing event OGLE-2016-BLG-0168, created by a binary lens system consisting of almost equal mass M-dwarf stars, to demonstrate that it is possible to verify the microlens parallax and to resolve degeneracies using the space-based light curve even though the observations are fragmentary. Since space-based observatories will frequently produce fragmentary light curves due to their short observing windows, the methodology of this test will be useful for next-generation microlensing experiments that combine space-based and ground-based collaboration.

  11. Supporting Beacon and Event-Driven Messages in Vehicular Platoons through Token-Based Strategies.

    Science.gov (United States)

    Balador, Ali; Uhlemann, Elisabeth; Calafate, Carlos T; Cano, Juan-Carlos

    2018-03-23

    Timely and reliable inter-vehicle communications is a critical requirement to support traffic safety applications, such as vehicle platooning. Furthermore, low-delay communications allow the platoon to react quickly to unexpected events. In this scope, having a predictable and highly effective medium access control (MAC) method is of utmost importance. However, the currently available IEEE 802.11p technology is unable to adequately address these challenges. In this paper, we propose a MAC method especially adapted to platoons, able to transmit beacons within the required time constraints, but with a higher reliability level than IEEE 802.11p, while concurrently enabling efficient dissemination of event-driven messages. The protocol circulates the token within the platoon not in a round-robin fashion, but based on beacon data age, i.e., the time that has passed since the previous collection of status information, thereby automatically offering repeated beacon transmission opportunities for increased reliability. In addition, we propose three different methods for supporting event-driven messages co-existing with beacons. Analysis and simulation results in single and multi-hop scenarios showed that, by providing non-competitive channel access and frequent retransmission opportunities, our protocol can offer beacon delivery within one beacon generation interval while fulfilling the requirements on low-delay dissemination of event-driven messages for traffic safety applications.
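
    The data-age-based token rule reduces to choosing the member whose last beacon is oldest; a toy version follows, with hypothetical vehicle identifiers and timestamps.

```python
def next_token_holder(last_beacon_time, now):
    """Pass the token to the member whose beacon data is oldest.

    last_beacon_time: dict of vehicle_id -> time of its last beacon (s).
    """
    return max(last_beacon_time, key=lambda v: now - last_beacon_time[v])

# Example: "B" transmitted longest ago, so it receives the token and,
# with it, the next (non-contended) transmission opportunity.
beacons = {"A": 10.0, "B": 9.4, "C": 9.8}
assert next_token_holder(beacons, now=10.2) == "B"
```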

  13. A Geo-Event-Based Geospatial Information Service: A Case Study of Typhoon Hazard

    Directory of Open Access Journals (Sweden)

    Yu Zhang

    2017-03-01

    Social media is valuable for propagating information during disasters thanks to its timeliness and availability, and it assists in decision making when tagged with locations. Considering the ambiguity and inaccuracy of some social data, additional authoritative data are needed for verification. However, current works often fail to leverage both social and authoritative data and, on most occasions, the data are used in disaster analysis only after the fact. Moreover, current works organize the data from the perspective of spatial location rather than from the perspective of the disaster, making it difficult to analyze the disaster dynamically; all of the disaster-related data around the affected locations need to be retrieved. To overcome these limitations, this study develops a geo-event-based geospatial information service (GEGIS) framework and proceeds as follows: (1) a geo-event-related ontology was constructed to provide a uniform semantic basis for the system; (2) geo-events and their attributes were extracted from the web using natural language processing (NLP) and used in the semantic similarity matching of geospatial resources; and (3) a geospatial information service prototype system was designed and implemented for automatically retrieving and organizing geo-event-related geospatial resources. A case study of a typhoon hazard is analyzed within the GEGIS and shows that the system would be effective when typhoons occur.

  14. Discrete event systems diagnosis and diagnosability

    CERN Document Server

    Sayed-Mouchaweh, Moamar

    2014-01-01

    Discrete Event Systems: Diagnosis and Diagnosability addresses the problem of fault diagnosis of Discrete Event Systems (DES). This book provides the basic techniques and approaches necessary for the design of an efficient fault diagnosis system for a wide range of modern engineering applications. The different techniques and approaches are classified according to several criteria, such as: the modeling tools (automata, Petri nets) used to construct the model; the information (qualitative, based on event occurrences and/or state outputs; quantitative, based on signal processing and data analysis) needed to analyze and achieve the diagnosis; and the decision structure (centralized, decentralized) required to achieve the diagnosis. The goal of this classification is to select the most efficient method for achieving fault diagnosis under the application constraints. This book focuses on centralized and decentralized event-based diagnosis approaches using formal languages and automata as modeling tools.

  15. A browser-based event display for the CMS experiment at the LHC

    International Nuclear Information System (INIS)

    Hategan, M; McCauley, T; Nguyen, P

    2012-01-01

    The line between native and web applications is becoming increasingly blurred as modern web browsers become powerful platforms on which applications can be run. Such applications are trivial to install, readily extensible, and easy to use. In an educational setting, web applications provide a way to deploy tools in a highly restrictive computing environment. The I2U2 collaboration has developed a browser-based event display for viewing events in data collected and released to the public by the CMS experiment at the LHC. The application reads a JSON event format and uses the JavaScript 3D rendering engine pre3d. The only requirement is a modern browser supporting the HTML5 canvas. The event display has been used by thousands of high school students in the context of programs organized by I2U2, QuarkNet, and IPPOG. This browser-based approach to event display can have broader usage and impact for experts and public alike.

  16. Time compression of soil erosion by the effect of largest daily event. A regional analysis of USLE database.

    Science.gov (United States)

    Gonzalez-Hidalgo, J. C.; Batalla, R.; Cerda, A.; de Luis, M.

    2009-04-01

    When Thornes and Brunsden wrote in 1977 "How often one hears the researcher (and no less the undergraduate) complain that after weeks of observation "nothing happened" only to learn that, the day after his departure, a flood caused unprecedented erosion and channel changes!" (Thornes and Brunsden, 1977, p. 57), they focused on two different problems in geomorphological research: the effects of extreme events and the temporal compression of geomorphological processes. Time compression is one of the main characteristics of erosion processes: a large share of the total soil eroded is produced in very short temporal intervals, i.e., a few events, mostly related to extreme events. From magnitude-frequency analysis we know that a few events, not necessarily extreme in magnitude, perform a large amount of geomorphological work. Last but not least, extreme isolated events are a classical issue in geomorphology because of their specific effects, and they receive constant attention, heightened at present by scenarios of global change. Notwithstanding, the time compression of geomorphological processes can be approached not only through the analysis of extreme events and the traditional magnitude-frequency approach, but also through a complementary approach based on the effects of the largest events. The classical approach defines an extreme event as a rare event (identified by its magnitude and quantified by some deviation from a central value), while we define the largest events by rank, whatever their magnitude. In previous research on the time compression of soil erosion, using the USLE soil erosion database (Gonzalez-Hidalgo et al., EGU 2007), we described a relationship between the total number of daily erosive events recorded per plot and the percentage contribution to total soil erosion of the n largest aggregated daily events. Now we offer a further refined analysis comparing different agricultural regions in the USA. To do that we have analyzed data from 594 erosion plots from USLE
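
    The rank-based statistic described above, the share of total soil loss contributed by the n largest daily events on a plot, is straightforward to compute; a minimal sketch on hypothetical plot data follows (the lognormal record is purely illustrative):

        import numpy as np

        def largest_event_contribution(daily_soil_loss, n):
            """Percentage of total soil loss contributed by the n largest
            daily erosive events; events are ranked by magnitude, with no
            extreme-value threshold involved."""
            events = np.sort(np.asarray(daily_soil_loss, dtype=float))[::-1]
            return 100.0 * events[:n].sum() / events.sum()

        # Hypothetical record of daily soil-loss events on one plot (t/ha)
        rng = np.random.default_rng(0)
        plot = rng.lognormal(mean=0.0, sigma=1.5, size=200)

        for n in (1, 5, 10):
            print(f"{n:>2} largest events: {largest_event_contribution(plot, n):.1f} % of total")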

  17. Knowledge-base for the new human reliability analysis method, A Technique for Human Error Analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Wreathall, J.; Thompson, C.M.; Drouin, M.; Bley, D.C.

    1996-01-01

    This paper describes the knowledge base for the application of the new human reliability analysis (HRA) method, "A Technique for Human Error Analysis" (ATHEANA). Since application of ATHEANA requires the identification of previously unmodeled human failure events, especially errors of commission, and associated error-forcing contexts (i.e., combinations of plant conditions and performance shaping factors), this knowledge base is an essential aid for the HRA analyst

  18. Study of the peculiarities of multiparticle production via event-by-event analysis in asymmetric nucleus-nucleus interactions

    Science.gov (United States)

    Fedosimova, Anastasiya; Gaitinov, Adigam; Grushevskaya, Ekaterina; Lebedev, Igor

    2017-06-01

    In this work, the peculiarities of multiparticle production in interactions of asymmetric nuclei are studied in a search for unusual features of such interactions. Long-range and short-range multiparticle correlations in the pseudorapidity distribution of secondary particles are investigated on the basis of individual interactions of 197Au nuclei at an energy of 10.7 AGeV with photoemulsion nuclei. Events with long-range multiparticle correlations (LC), short-range multiparticle correlations (SC), and mixed type (MT) in the pseudorapidity distribution of secondary particles are selected by the Hurst method in accordance with the behavior of the Hurst curve. These types have significantly different characteristics. First, they have different fragmentation parameters: events of LC type are processes of full destruction of the projectile nucleus, in which multicharge fragments are absent, whereas in events of mixed type several multicharge fragments of the projectile nucleus are found. Second, the two types have significantly different multiplicity distributions; the mean multiplicity of LC-type events is significantly higher than that of mixed-type events. Analysis of the dependence of multiplicity on the number of target-nucleus fragments for events of various types reveals that the most considerable multiparticle correlations are observed in interactions of the mixed type, which correspond to central collisions of gold nuclei with nuclei of the CNO group, i.e., nuclei with strongly asymmetric volume, nuclear mass, charge, etc. Such events are characterized by full destruction of the target nucleus and the disintegration of the projectile nucleus into several multicharged fragments.
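
    The record does not reproduce the Hurst analysis itself; below is a minimal rescaled-range (R/S) sketch of the kind of Hurst-exponent estimate such a classification relies on, applied to a hypothetical sequence of multiplicity counts in consecutive pseudorapidity bins:

        import numpy as np

        def hurst_rs(x, window_sizes=(8, 16, 32, 64)):
            """Estimate the Hurst exponent by rescaled-range (R/S)
            analysis: log(R/S) grows roughly as H * log(n)."""
            x = np.asarray(x, dtype=float)
            log_n, log_rs = [], []
            for n in window_sizes:
                rs_vals = []
                for start in range(0, len(x) - n + 1, n):
                    seg = x[start:start + n]
                    dev = np.cumsum(seg - seg.mean())   # mean-adjusted walk
                    s = seg.std()
                    if s > 0:
                        rs_vals.append((dev.max() - dev.min()) / s)
                if rs_vals:
                    log_n.append(np.log(n))
                    log_rs.append(np.log(np.mean(rs_vals)))
            slope, _ = np.polyfit(log_n, log_rs, 1)
            return slope

        # Hypothetical multiplicity counts in consecutive pseudorapidity bins
        rng = np.random.default_rng(1)
        counts = rng.poisson(20, size=256)
        print(round(hurst_rs(counts), 2))   # ~0.5 for uncorrelated counts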

  19. Study of the peculiarities of multiparticle production via event-by-event analysis in asymmetric nucleus-nucleus interactions

    Directory of Open Access Journals (Sweden)

    Fedosimova Anastasiya

    2017-01-01

    Full Text Available In this work, the peculiarities of multiparticle production in interactions of asymmetric nuclei are studied in a search for unusual features of such interactions. Long-range and short-range multiparticle correlations in the pseudorapidity distribution of secondary particles are investigated on the basis of individual interactions of 197Au nuclei at an energy of 10.7 AGeV with photoemulsion nuclei. Events with long-range multiparticle correlations (LC), short-range multiparticle correlations (SC), and mixed type (MT) in the pseudorapidity distribution of secondary particles are selected by the Hurst method in accordance with the behavior of the Hurst curve. These types have significantly different characteristics. First, they have different fragmentation parameters: events of LC type are processes of full destruction of the projectile nucleus, in which multicharge fragments are absent, whereas in events of mixed type several multicharge fragments of the projectile nucleus are found. Second, the two types have significantly different multiplicity distributions; the mean multiplicity of LC-type events is significantly higher than that of mixed-type events. Analysis of the dependence of multiplicity on the number of target-nucleus fragments for events of various types reveals that the most considerable multiparticle correlations are observed in interactions of the mixed type, which correspond to central collisions of gold nuclei with nuclei of the CNO group, i.e., nuclei with strongly asymmetric volume, nuclear mass, charge, etc. Such events are characterized by full destruction of the target nucleus and the disintegration of the projectile nucleus into several multicharged fragments.

  20. Fire!: An Event-Based Science Module. Teacher's Guide. Chemistry and Fire Ecology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science or physical science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  1. Idaho National Laboratory Quarterly Event Performance Analysis FY 2013 4th Quarter

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Lisbeth A. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-11-01

    This report is published quarterly by the Idaho National Laboratory (INL) Performance Assurance Organization. The Department of Energy Occurrence Reporting and Processing System (ORPS), as prescribed in DOE Order 232.2, “Occurrence Reporting and Processing of Operations Information,” requires a quarterly analysis of events, both reportable and not reportable, for the previous twelve months. This report is the analysis of occurrence reports and deficiency reports (including not reportable events) identified at INL during the period of October 2012 through September 2013.

  2. Budget Impact Analysis of Switching to Digital Mammography in a Population-Based Breast Cancer Screening Program: A Discrete Event Simulation Model

    Science.gov (United States)

    Comas, Mercè; Arrospide, Arantzazu; Mar, Javier; Sala, Maria; Vilaprinyó, Ester; Hernández, Cristina; Cots, Francesc; Martínez, Juan; Castells, Xavier

    2014-01-01

    Objective To assess the budgetary impact of switching from screen-film mammography to full-field digital mammography in a population-based breast cancer screening program. Methods A discrete-event simulation model was built to reproduce the breast cancer screening process (biennial mammographic screening of women aged 50 to 69 years) combined with the natural history of breast cancer. The simulation started with 100,000 women and, during a 20-year simulation horizon, new women were dynamically entered according to the aging of the Spanish population. Data on screening were obtained from Spanish breast cancer screening programs. Data on the natural history of breast cancer were based on US data adapted to our population. A budget impact analysis comparing digital with screen-film screening mammography was performed in a sample of 2,000 simulation runs. A sensitivity analysis was performed for crucial screening-related parameters. Distinct scenarios for recall and detection rates were compared. Results Statistically significant savings were found for overall costs, treatment costs and the costs of additional tests in the long term. The overall cost saving was 1,115,857€ (95% CI from 932,147 to 1,299,567) in the 10th year and 2,866,124€ (95% CI from 2,492,610 to 3,239,638) in the 20th year, representing 4.5% and 8.1% of the overall cost associated with screen-film mammography. The sensitivity analysis showed net savings in the long term. Conclusions Switching to digital mammography in a population-based breast cancer screening program saves long-term budget expense, in addition to providing technical advantages. Our results were consistent across distinct scenarios representing the different results obtained in European breast cancer screening programs. PMID:24832200
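
    The record describes a discrete-event simulation of biennial screening with dynamic entry but not its implementation; below is a minimal sketch of the event-queue core of such a model, with all parameters (attendance probability, interval, horizon) being hypothetical:

        import heapq
        import random

        def run_screening_des(n_women=1000, horizon_years=20.0,
                              interval=2.0, attend_prob=0.75, seed=42):
            """Minimal discrete-event simulation of biennial screening:
            the event queue holds (time, woman_id) invitation events."""
            random.seed(seed)
            events = [(random.uniform(0.0, interval), i) for i in range(n_women)]
            heapq.heapify(events)
            screens = 0
            while events:
                t, woman = heapq.heappop(events)
                if t > horizon_years:
                    break                        # events are popped in time order
                if random.random() < attend_prob:
                    screens += 1                 # a mammogram is performed (and costed)
                heapq.heappush(events, (t + interval, woman))   # schedule next round
            return screens

        print(run_screening_des())               # number of mammograms over 20 years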

  3. Instantaneous 3D EEG Signal Analysis Based on Empirical Mode Decomposition and the Hilbert–Huang Transform Applied to Depth of Anaesthesia

    Directory of Open Access Journals (Sweden)

    Mu-Tzu Shih

    2015-02-01

    Full Text Available Depth of anaesthesia (DoA) is an important measure for assessing the degree to which the central nervous system of a patient is depressed by a general anaesthetic agent, depending on the potency and concentration with which the anaesthetic is administered during surgery. We can monitor the DoA by observing the patient’s electroencephalography (EEG) signals during the surgical procedure. Typically, high-frequency EEG signals indicate that the patient is conscious, while low-frequency signals indicate a general anaesthetic state. If the anaesthetist is able to observe the instantaneous frequency changes of the patient’s EEG signals during surgery, this can help to better regulate and monitor DoA, reducing surgical and post-operative risks. This paper describes an approach towards the development of a 3D real-time visualization application which can show the instantaneous frequency and instantaneous amplitude of the EEG simultaneously by using empirical mode decomposition (EMD) and the Hilbert–Huang transform (HHT). The HHT uses the EMD method to decompose a signal into so-called intrinsic mode functions (IMFs); Hilbert spectral analysis is then used to obtain instantaneous frequency data. The HHT provides a new method of analyzing non-stationary and nonlinear time series data. We investigate this approach by analyzing EEG data collected from patients undergoing surgical procedures. The results show that the EEG differences between three distinct surgical stages, computed using sample entropy (SampEn), are consistent with the expected differences between these stages based on the bispectral index (BIS), which has been shown to be a quantifiable measure of the effect of anaesthetics on the central nervous system. Also, the proposed filtering approach is more effective than the standard filtering method at removing signal noise, yielding more consistent results than those provided by the BIS. The proposed approach is therefore
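
    A minimal sketch of the EMD-plus-Hilbert pipeline described above, using the third-party PyEMD package (an assumption; installable as EMD-signal) and scipy, with a synthetic stand-in for an EEG trace:

        import numpy as np
        from scipy.signal import hilbert
        from PyEMD import EMD          # third-party: pip install EMD-signal

        fs = 250.0                     # sampling rate (Hz), assumed
        t = np.arange(0, 4, 1 / fs)
        # Synthetic stand-in for an EEG trace: 10 Hz alpha + slow drift + noise
        x = (np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 2 * t)
             + 0.1 * np.random.default_rng(0).normal(size=t.size))

        imfs = EMD().emd(x)            # intrinsic mode functions

        for k, imf in enumerate(imfs):
            analytic = hilbert(imf)                        # analytic signal
            amplitude = np.abs(analytic)                   # instantaneous amplitude
            phase = np.unwrap(np.angle(analytic))
            freq = np.diff(phase) * fs / (2 * np.pi)       # instantaneous frequency (Hz)
            print(f"IMF {k}: mean f = {freq.mean():6.2f} Hz, "
                  f"mean amplitude = {amplitude.mean():.2f}")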

  4. Adverse events following yellow fever immunization: Report and analysis of 67 neurological cases in Brazil.

    Science.gov (United States)

    Martins, Reinaldo de Menezes; Pavão, Ana Luiza Braz; de Oliveira, Patrícia Mouta Nunes; dos Santos, Paulo Roberto Gomes; Carvalho, Sandra Maria D; Mohrdieck, Renate; Fernandes, Alexandre Ribeiro; Sato, Helena Keico; de Figueiredo, Patricia Mandali; von Doellinger, Vanessa Dos Reis; Leal, Maria da Luz Fernandes; Homma, Akira; Maia, Maria de Lourdes S

    2014-11-20

    Neurological adverse events following administration of the 17DD substrain of yellow fever vaccine (YEL-AND) in the Brazilian population are described and analyzed. Based on information obtained from the National Immunization Program through passive or intensified passive surveillance from 2007 to 2012, a descriptive analysis was performed, and national and regional rates of YFV-associated neurotropic and neurological autoimmune disease, together with reporting rate ratios and their respective 95% confidence intervals, were calculated for first-time vaccinees stratified by age and year. Sixty-seven neurological cases were found, with the highest rate of neurological adverse events in the age group from 5 to 9 years (2.66 per 100,000 vaccine doses in Rio Grande do Sul state, and 0.83 per 100,000 doses in the national analysis). Two cases had a combination of neurotropic and autoimmune features. This is the largest sample of YEL-AND analyzed to date. Rates are similar to other recent studies, but in this study the age group from 5 to 9 years had the highest risk. As neurological adverse events have in general a good prognosis, they should not contraindicate the use of yellow fever vaccine in the face of the risk of infection by yellow fever virus. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Developing future precipitation events from historic events: An Amsterdam case study.

    Science.gov (United States)

    Manola, Iris; van den Hurk, Bart; de Moel, Hans; Aerts, Jeroen

    2016-04-01

    Due to climate change, the frequency and intensity of extreme precipitation events are expected to increase. It is therefore of high importance to develop climate change scenarios tailored towards the local and regional needs of policy makers in order to develop efficient adaptation strategies that reduce the risks from extreme weather events. Current approaches to tailoring climate scenarios are often not well adopted in hazard management, since average changes in climate are not a main concern of policy makers, and tailoring climate scenarios to simulate future extremes can be complex. Therefore, a new concept has been introduced recently that uses known historic extreme events as a basis and modifies the observed data for these events so that the outcome shows how the same event would unfold in a warmer climate. This concept is introduced as 'Future Weather', and appeals to the experience of stakeholders and users. This research presents a novel method of projecting a future extreme precipitation event based on a historic event. The selected precipitation event took place over the broader area of Amsterdam, the Netherlands in the summer of 2014, resulting in blocked highways, disrupted air transportation, and flooded buildings and public facilities. An analysis of rain monitoring stations showed that an event of such intensity has a 5- to 15-year return period. The method of projecting a future event follows a non-linear delta transformation that is applied directly to the observed event, assuming a warmer climate, to produce an "up-scaled" future precipitation event. The delta transformation is based on the observed behaviour of precipitation intensity as a function of dew point temperature during summers. The outcome is then compared to a benchmark method using the HARMONIE numerical weather prediction model, where the boundary conditions of the event from the Ensemble Prediction System of ECMWF (ENS) are perturbed to indicate a warmer climate. The two
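
    The record does not give the transformation itself; as an illustration only, the sketch below applies an intensity-dependent Clausius-Clapeyron-like scaling (roughly 7% per kelvin, doubled above a heavy-rain threshold to mimic the non-linearity), with all rates, thresholds, and rainfall values being assumptions:

        import numpy as np

        def delta_transform(precip_mm, dT, base_rate=0.07,
                            heavy_rate=0.14, threshold_mm=5.0):
            """Scale an observed precipitation series to a warmer climate:
            Clausius-Clapeyron-like exponential scaling per kelvin, with a
            stronger (super-CC) rate above a heavy-intensity threshold to
            mimic a non-linear delta transformation."""
            p = np.asarray(precip_mm, dtype=float)
            rate = np.where(p > threshold_mm, heavy_rate, base_rate)
            return p * (1.0 + rate) ** dT

        # Hypothetical 5-minute rainfall depths from a historic event (mm)
        observed = np.array([0.2, 1.5, 6.8, 12.4, 7.1, 2.0, 0.4])
        print(delta_transform(observed, dT=2.0).round(2))   # +2 K scenario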

  6. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. A. Wasiolek

    2003-07-21

    This analysis report, "Disruptive Event Biosphere Dose Conversion Factor Analysis", is one of the technical reports containing documentation of the ERMYN (Environmental Radiation Model for Yucca Mountain Nevada) biosphere model for the geologic repository at Yucca Mountain, its input parameters, and the application of the model to perform the dose assessment for the repository. The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of the two reports that develop biosphere dose conversion factors (BDCFs), which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2003 [DIRS 164186]) describes in detail the conceptual model as well as the mathematical model and lists its input parameters. Model input parameters are developed and described in detail in five analysis reports (BSC 2003 [DIRS 160964], BSC 2003 [DIRS 160965], BSC 2003 [DIRS 160976], BSC 2003 [DIRS 161239], and BSC 2003 [DIRS 161241]). The objective of this analysis was to develop the BDCFs for the volcanic ash exposure scenario and the dose factors (DFs) for calculating inhalation doses during a volcanic eruption (the eruption phase of the volcanic event). The volcanic ash exposure scenario is hereafter referred to as the volcanic ash scenario. For the volcanic ash scenario, the mode of radionuclide release into the biosphere is a volcanic eruption through the repository, with the resulting entrainment of contaminated waste in the tephra and the subsequent atmospheric transport and dispersion of contaminated material in

  7. Disruptive Event Biosphere Dose Conversion Factor Analysis

    International Nuclear Information System (INIS)

    M. A. Wasiolek

    2003-01-01

    This analysis report, "Disruptive Event Biosphere Dose Conversion Factor Analysis", is one of the technical reports containing documentation of the ERMYN (Environmental Radiation Model for Yucca Mountain Nevada) biosphere model for the geologic repository at Yucca Mountain, its input parameters, and the application of the model to perform the dose assessment for the repository. The biosphere model is one of a series of process models supporting the Total System Performance Assessment (TSPA) for the Yucca Mountain repository. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of the two reports that develop biosphere dose conversion factors (BDCFs), which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2003 [DIRS 164186]) describes in detail the conceptual model as well as the mathematical model and lists its input parameters. Model input parameters are developed and described in detail in five analysis reports (BSC 2003 [DIRS 160964], BSC 2003 [DIRS 160965], BSC 2003 [DIRS 160976], BSC 2003 [DIRS 161239], and BSC 2003 [DIRS 161241]). The objective of this analysis was to develop the BDCFs for the volcanic ash exposure scenario and the dose factors (DFs) for calculating inhalation doses during a volcanic eruption (the eruption phase of the volcanic event). The volcanic ash exposure scenario is hereafter referred to as the volcanic ash scenario. For the volcanic ash scenario, the mode of radionuclide release into the biosphere is a volcanic eruption through the repository, with the resulting entrainment of contaminated waste in the tephra and the subsequent atmospheric transport and dispersion of contaminated material in the biosphere. The biosphere process

  8. Event-Based Media Enrichment Using an Adaptive Probabilistic Hypergraph Model.

    Science.gov (United States)

    Liu, Xueliang; Wang, Meng; Yin, Bao-Cai; Huet, Benoit; Li, Xuelong

    2015-11-01

    Nowadays, with the continual development of digital capture technologies and social media services, a vast number of media documents are captured and shared online to help attendees record their experience during events. In this paper, we present a method combining semantic inference and multimodal analysis for automatically finding media content to illustrate events using an adaptive probabilistic hypergraph model. In this model, media items are taken as vertices in the weighted hypergraph and the task of enriching media to illustrate events is formulated as a ranking problem. In our method, each hyperedge is constructed using the K-nearest neighbors of a given media document. We also employ a probabilistic representation, which assigns each vertex to a hyperedge in a probabilistic way, to further exploit the correlation among media data. Furthermore, we optimize the hypergraph weights in a regularization framework, which is solved as a second-order cone problem. The approach is initiated by seed media and then used to rank the media documents using a transductive inference process. The results obtained from validating the approach on an event dataset collected from EventMedia demonstrate the effectiveness of the proposed approach.
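
    The record sketches the model (vertices = media items, hyperedges = K-nearest-neighbour groups, ranking by transductive inference) without formulas; below is a minimal numpy sketch of Zhou-style hypergraph ranking, simplified with binary incidence and fixed equal hyperedge weights, whereas the paper uses probabilistic incidence and learns the weights via a second-order cone program:

        import numpy as np

        def hypergraph_rank(H, w, y, alpha=0.9):
            """Transductive ranking on a hypergraph: H is the |V| x |E|
            incidence matrix, w the hyperedge weights, y the seed vector;
            returns a relevance score for every vertex."""
            Dv = H @ w                              # vertex degrees
            De = H.sum(axis=0)                      # hyperedge degrees
            Dv_isqrt = np.diag(1.0 / np.sqrt(Dv))
            Theta = Dv_isqrt @ H @ np.diag(w / De) @ H.T @ Dv_isqrt
            n = H.shape[0]
            return (1 - alpha) * np.linalg.solve(np.eye(n) - alpha * Theta, y)

        # Toy example: 5 media items, 3 hyperedges from nearest-neighbour groups
        H = np.array([[1, 0, 0],
                      [1, 1, 0],
                      [1, 1, 0],
                      [0, 1, 1],
                      [0, 0, 1]], dtype=float)
        w = np.ones(3)                              # equal weights for brevity
        y = np.array([1.0, 0, 0, 0, 0])             # one seed media item
        print(hypergraph_rank(H, w, y).round(3))    # scores decay with distance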

  9. Trend analysis of explosion events at overseas nuclear power plants

    International Nuclear Information System (INIS)

    Shimada, Hiroki

    2008-01-01

    We surveyed failures caused by disasters (e.g., severe storms, heavy rainfall, earthquakes, explosions and fires) which occurred during the 13 years from 1995 to 2007 at overseas nuclear power plants (NPPs), using the nuclear information database of the Institute of Nuclear Safety System, Incorporated (INSS). The results revealed that explosions were the second most frequent type of failure after fires, and we conducted a trend analysis of these explosion events. The analysis by equipment, cause, and effect on the plant showed that the explosions occurred mainly at electrical facilities, and thus managing the maintenance of electrical facilities is essential for preventing explosions. In addition, it was shown that explosions at transformers and batteries, which have never occurred at Japan's NPPs, accounted for as much as 55% of all explosions. This suggests that the difference is attributable to differences in the maintenance methods for transformers (condition-based maintenance adopted by Japan's NPPs) and in the workforce organization for batteries (inspections performed by utilities' own maintenance workers at Japan's NPPs). (author)

  10. Development of time dependent safety analysis code for plasma anomaly events in fusion reactors

    International Nuclear Information System (INIS)

    Honda, Takuro; Okazaki, Takashi; Bartels, H.W.; Uckan, N.A.; Seki, Yasushi.

    1997-01-01

    A safety analysis code, SAFALY, has been developed to analyze plasma anomaly events in fusion reactors, e.g., a loss of plasma control. The code is a hybrid comprising a zero-dimensional plasma dynamics model and a one-dimensional thermal analysis of in-vessel components, and it evaluates the time evolution of plasma parameters and the temperature distributions of the in-vessel components. As the plasma-safety interface model, we proposed a robust plasma physics model taking into account updated data for safety assessment; for example, physics safety guidelines for the beta limit, density limit, and H-L mode confinement transition threshold power are provided in the model. The in-vessel components model is divided into twenty temperature regions in the poloidal direction, taking into account radiative heat transfer between the surfaces of the regions. The code can also describe coolant behavior under hydraulic accidents, using results from a hydraulics code, and can treat vaporization (sublimation) from plasma-facing components (PFCs). Furthermore, the code includes a model of impurity transport from PFCs using a transport probability and a time delay. Quantitative analysis based on the model is possible for a plasma passive shutdown scenario. We examined the suitability of the code for safety analysis of plasma anomaly events in fusion reactors and expect that it will contribute to the safety analysis of the International Thermonuclear Experimental Reactor (ITER). (author)
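
    The coupling of a zero-dimensional plasma power balance to a lumped thermal model is easy to illustrate; the sketch below is not SAFALY but a toy two-equation analogue, and all physical parameters are invented round numbers:

        import numpy as np
        from scipy.integrate import solve_ivp

        P_heat = 100e6     # plasma heating power (W), assumed constant
        tau_E = 3.0        # energy confinement time (s)
        C_wall = 5e7       # lumped first-wall heat capacity (J/K)
        k_cool = 1e5       # wall-to-coolant conductance (W/K)
        T_cool = 400.0     # coolant temperature (K)

        def rhs(t, y):
            W, T_wall = y                       # plasma energy (J), wall temp (K)
            dW = P_heat - W / tau_E             # 0-D plasma power balance
            dT = (W / tau_E - k_cool * (T_wall - T_cool)) / C_wall
            return [dW, dT]                     # wall heated by plasma losses

        sol = solve_ivp(rhs, (0.0, 120.0), [0.0, T_cool], max_step=0.5)
        print(f"wall temperature after 120 s: {sol.y[1, -1]:.0f} K")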

  11. Short-Period Surface Wave Based Seismic Event Relocation

    Science.gov (United States)

    White-Gaynor, A.; Cleveland, M.; Nyblade, A.; Kintner, J. A.; Homman, K.; Ammon, C. J.

    2017-12-01

    Accurate and precise seismic event locations are essential for a broad range of geophysical investigations. Superior location accuracy generally requires calibration with ground truth information, but superb relative location precision is often achievable independently. In explosion seismology, low-yield explosion monitoring relies on near-source observations, which results in a limited number of observations and challenges our ability to estimate locations. Incorporating more distant observations means relying on data with lower signal-to-noise ratios. For small, shallow events, the short-period (roughly 1/2 to 8 s period) fundamental-mode and higher-mode Rayleigh waves (including Rg) are often the most stable and visible portion of the waveform at local distances. Cleveland and Ammon [2013] have shown that teleseismic surface waves are valuable observations for constructing precise, relative event relocations. We extend the teleseismic surface wave relocation method and apply it to near-source distances using Rg observations from the Bighorn Arch Seismic Experiment (BASE) and the EarthScope USArray Transportable Array (TA) seismic stations. Specifically, we present relocation results using short-period fundamental- and higher-mode Rayleigh waves (Rg) in a double-difference relative event relocation for 45 delay-fired mine blasts and 21 borehole chemical explosions. Our preliminary efforts explore the sensitivity of the short-period surface waves to local geologic structure, source depth, explosion magnitude (yield), and explosion characteristics (single-shot vs. distributed source, etc.). Our results show that Rg and the first few higher-mode Rayleigh wave observations can be used to constrain the relative locations of shallow low-yield events.
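
    The data behind such double-difference relocations are inter-event delay times; a minimal sketch of measuring one Rg delay by cross-correlation follows (synthetic wavelet, invented sampling rate):

        import numpy as np

        def relative_delay(trace_a, trace_b, fs):
            """Delay (s) of trace_a relative to trace_b from the peak of
            their cross-correlation; such delays, measured station by
            station, feed a double-difference relocation."""
            corr = np.correlate(trace_a, trace_b, mode="full")
            lag = np.argmax(corr) - (len(trace_b) - 1)
            return lag / fs

        fs = 50.0                                  # samples per second, assumed
        t = np.arange(0, 10, 1 / fs)
        # Synthetic Rg-like wavelet and a copy arriving 0.5 s later
        wavelet = np.exp(-((t - 3.0) ** 2) / 0.1) * np.sin(2 * np.pi * 1.0 * t)
        later = np.roll(wavelet, int(0.5 * fs))
        print(relative_delay(later, wavelet, fs))  # -> 0.5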

  12. Track-based event recognition in a realistic crowded environment

    Science.gov (United States)

    van Huis, Jasper R.; Bouma, Henri; Baan, Jan; Burghouts, Gertjan J.; Eendebak, Pieter T.; den Hollander, Richard J. M.; Dijk, Judith; van Rest, Jeroen H.

    2014-10-01

    Automatic detection of abnormal behavior in CCTV cameras is important for improving security in crowded environments, such as shopping malls, airports and railway stations. This behavior can be characterized at different time scales, e.g., by small-scale subtle and obvious actions or by large-scale walking patterns and interactions between people. For example, pickpocketing can be recognized by the actual snatch (small scale), when the pickpocket follows the victim, or when he interacts with an accomplice before and after the incident (longer time scale). This paper focuses on event recognition by detecting large-scale track-based patterns. Our event recognition method consists of several steps: pedestrian detection, object tracking, track-based feature computation and rule-based event classification. In the experiment, we focused on single-track actions (walk, run, loiter, stop, turn) and track interactions (pass, meet, merge, split). The experiment includes a controlled setup, where 10 actors perform these actions. The method is also applied to all tracks generated in a crowded shopping mall in a selected time frame. The results show that most of the actions can be detected reliably (on average 90%) at a low false positive rate (1.1%), and that the interactions obtain lower detection rates (70% at 0.3% FP). This method may become one of the components that assist operators in finding threatening behavior and enriching the selection of videos to be observed.

  13. OAE: The Ontology of Adverse Events.

    Science.gov (United States)

    He, Yongqun; Sarntivijai, Sirarat; Lin, Yu; Xiang, Zuoshuang; Guo, Abra; Zhang, Shelley; Jagannathan, Desikan; Toldo, Luca; Tao, Cui; Smith, Barry

    2014-01-01

    A medical intervention is a medical procedure or application intended to relieve or prevent illness or injury. Examples of medical interventions include vaccination and drug administration. After a medical intervention, adverse events (AEs) may occur which lie outside the intended consequences of the intervention. The representation and analysis of AEs are critical to the improvement of public health. The Ontology of Adverse Events (OAE), previously named the Adverse Event Ontology (AEO), is a community-driven ontology developed to standardize and integrate data relating to AEs arising subsequent to medical interventions, as well as to support computer-assisted reasoning. OAE has over 3,000 terms with unique identifiers, including terms imported from existing ontologies and more than 1,800 OAE-specific terms. In OAE, the term 'adverse event' denotes a pathological bodily process in a patient that occurs after a medical intervention. Causal adverse events are defined by OAE as those events that are causal consequences of a medical intervention. OAE represents various adverse events based on patient anatomic regions and clinical outcomes, including symptoms, signs, and abnormal processes. OAE has been used in the analysis of several different sorts of vaccine and drug adverse event data. For example, using data extracted from the Vaccine Adverse Event Reporting System (VAERS), OAE was used to analyse vaccine adverse events associated with the administration of different types of influenza vaccines. OAE has also been used to represent and classify the vaccine adverse events cited in package inserts of FDA-licensed human vaccines in the USA. OAE is a biomedical ontology that logically defines and classifies various adverse events occurring after medical interventions. OAE has successfully been applied in several adverse event studies. The OAE ontological framework provides a platform for systematic representation and analysis of adverse events and of the factors (e

  14. Hazard analysis of typhoon-related external events using extreme value theory

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yo Chan; Jang, Seung Cheol [Integrated Safety Assessment Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Lim, Tae Jin [Dept. of Industrial Information Systems Engineering, Soongsil University, Seoul (Korea, Republic of)

    2015-02-15

    After the Fukushima accident, the importance of hazard analysis for extreme external events was highlighted. To analyze typhoon-induced hazards, which are among the most significant disasters of East Asian countries, a statistical analysis using extreme value theory, a method for estimating the annual exceedance frequency of a rare event, was conducted to estimate occurrence intervals and hazard levels. For four meteorological variables, maximum wind speed, instantaneous wind speed, hourly precipitation, and daily precipitation, the parameters of the predictive extreme value theory models were estimated. The 100-year return levels for each variable were predicted using the developed models and compared with previously reported values. It was also found that there exist significant long-term climatic changes in wind speed and precipitation. A fragility analysis should be conducted to ensure the safety of a nuclear power plant at wind speeds and precipitation levels that exceed the results of previous analyses.
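
    Return levels of the kind reported above come directly out of a fitted extreme value distribution; a minimal sketch with scipy's generalized extreme value (GEV) distribution on synthetic annual maxima (all numbers invented):

        import numpy as np
        from scipy.stats import genextreme

        # Hypothetical annual maxima of typhoon wind speed (m/s) at one site
        annual_max = genextreme.rvs(c=-0.1, loc=30.0, scale=5.0,
                                    size=60, random_state=7)

        # Fit the GEV to the annual maxima and read off the 100-year return
        # level, i.e. the level exceeded with probability 1/100 per year
        c, loc, scale = genextreme.fit(annual_max)
        level_100yr = genextreme.isf(1.0 / 100.0, c, loc, scale)
        print(f"100-year return level: {level_100yr:.1f} m/s")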

  15. Cost analysis of adverse events associated with non-small cell lung cancer management in France

    Directory of Open Access Journals (Sweden)

    Chouaid C

    2017-07-01

    Full Text Available Christos Chouaid,1 Delphine Loirat,2 Emilie Clay,3 Aurélie Millier,3 Chloé Godard,4 Amira Fannan,4 Laurie Lévy-Bachelot,4 Eric Angevin5 1Chest Department, Centre Hospitalier Intercommunal Créteil, Créteil, France; 2Institut Curie, Paris, France; 3Creativ-Ceutical, Paris, France; 4MSD France, Courbevoie, France; 5Institut Gustave Roussy, Villejuif, France Background: Adverse events (AEs) related to medical treatments in non-small cell lung cancer (NSCLC) are frequent and need appropriate costing in health economic models. Nevertheless, data on costs associated with AEs in NSCLC are scarce, particularly since the development of immunotherapy with its specific immune-related AEs. Objective: To estimate the costs of grade 3 and 4 AEs related to NSCLC treatments, including immunotherapy, in France. Methods: Grade 3 and 4 AEs related to treatment and reported in at least 1% of patients in Phase III clinical trials for erlotinib, ramucirumab plus docetaxel, docetaxel, pemetrexed plus carboplatin plus bevacizumab, platinum-based chemotherapies, nivolumab and pembrolizumab were identified. When no cost evaluation was reported in the literature, estimates of standard treatments and medical resource use for each AE were obtained from an expert panel. The total cost per AE was calculated from a French national health insurance perspective and updated to 2017 euros. Hospital stay costs were estimated based on public and private weighted tariffs and data from the French Medical Information System (Programme de Médicalisation des Systèmes d’Information). Costs of tests, consultations and treatments were calculated based on national reimbursement tariffs. Results: Overall, costs of grade 3 and 4 AEs related to treatment ranged from €46 per event to €7,742 per year. Fourteen out of 24 AEs identified had a mean estimated cost over €2,000. The highest mean costs were related to type 1 diabetes (€7,742 per year) followed by pneumonitis (€5,786 per event

  16. The English Monolingual Dictionary: Its Use among Second Year Students of University Technology of Malaysia, International Campus, Kuala Lumpur

    Directory of Open Access Journals (Sweden)

    Amerrudin Abd. Manan

    2011-07-01

    Full Text Available This research was conducted to seek information on English Monolingual Dictionary (EMD) use among 2nd year students of Universiti Teknologi Malaysia, International Campus, Kuala Lumpur (UTMKL). Specifically, the researchers wished to discover, firstly, the students’ habits and attitudes in EMD use; secondly, their knowledge of the language learning resources available in an EMD; thirdly, their skill in using an EMD; and finally, whether there was formal instruction in EMD use when they were studying in their former schools and in tertiary education. One hundred and ninety-six students took part in the survey by answering a questionnaire. The results of the study reveal that the respondents were poor users of the EMD: they rarely consulted it; their knowledge of the language learning resources in the EMD was limited; most perceived their EMD skill as average; and they had received no instruction in EMD use either at tertiary level or previously at school.

  17. Antipsychotics, glycemic disorders, and life-threatening diabetic events: a Bayesian data-mining analysis of the FDA adverse event reporting system (1968-2004).

    Science.gov (United States)

    DuMouchel, William; Fram, David; Yang, Xionghu; Mahmoud, Ramy A; Grogg, Amy L; Engelhart, Luella; Ramaswamy, Krishnan

    2008-01-01

    This analysis compared diabetes-related adverse events associated with use of different antipsychotic agents. A disproportionality analysis of the US Food and Drug Administration (FDA) Adverse Event Reporting System (AERS) was performed. Data from the FDA postmarketing AERS database (1968 through first quarter 2004) were evaluated. Drugs studied included aripiprazole, clozapine, haloperidol, olanzapine, quetiapine, risperidone, and ziprasidone. Fourteen Medical Dictionary for Regulatory Activities (MedDRA) Primary Terms (MPTs) were chosen to identify diabetes-related adverse events; 3 groupings into higher-level descriptive categories were also studied. Three methods of measuring drug-event associations were used: proportional reporting ratio, the empirical Bayes data-mining algorithm known as the Multi-Item Gamma Poisson Shrinker, and logistic regression (LR) analysis. Quantitative measures of association strength, with corresponding confidence intervals, between drugs and specified adverse events were computed and graphed. Some of the LR analyses were repeated separately for reports from patients under and over 45 years of age. Differences in association strength were declared statistically significant if the corresponding 90% confidence intervals did not overlap. Association with various glycemic events differed for different drugs. On average, the rankings of association strength agreed with the following ordering: low association: ziprasidone, aripiprazole, haloperidol, and risperidone; medium association: quetiapine; and strong association: clozapine and olanzapine. The median rank correlation between the above ordering and the 17 sets of LR coefficients (1 set for each glycemic event) was 93%. Many of the disproportionality measures were significantly different across drugs, and ratios of disproportionality factors of 5 or more were frequently observed. There are consistent and substantial differences between atypical antipsychotic drugs in the
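
    Of the three association measures used in the study, the proportional reporting ratio is the simplest to reproduce; a sketch on a hypothetical 2x2 table of spontaneous reports follows (the MGPS shrinkage estimator and the logistic regressions are beyond a few lines):

        def proportional_reporting_ratio(a, b, c, d):
            """PRR from a 2x2 spontaneous-report table:
               a = target event reports for the drug of interest
               b = all other event reports for the drug of interest
               c = target event reports for all other drugs
               d = all other event reports for all other drugs"""
            return (a / (a + b)) / (c / (c + d))

        # Hypothetical counts for one antipsychotic / glycemic-event pair
        prr = proportional_reporting_ratio(a=120, b=9880, c=300, d=89700)
        print(round(prr, 2))   # > 1 suggests disproportionate reporting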

  18. Magnesium and the Risk of Cardiovascular Events: A Meta-Analysis of Prospective Cohort Studies

    Science.gov (United States)

    Hao, Yongqiang; Li, Huiwu; Tang, Tingting; Wang, Hao; Yan, Weili; Dai, Kerong

    2013-01-01

    Background Prospective studies that have examined the association between dietary magnesium intake and serum magnesium concentrations and the risk of cardiovascular disease (CVD) events have reported conflicting findings. We undertook a meta-analysis to evaluate the association between dietary magnesium intake and serum magnesium concentrations and the risk of total CVD events. Methodology/Principal Findings We performed systematic searches on MEDLINE, EMBASE, and OVID up to February 1, 2012 without limits. Categorical, linear, and nonlinear dose-response, heterogeneity, publication bias, subgroup, and meta-regression analyses were performed. The analysis included 532,979 participants from 19 studies (11 studies on dietary magnesium intake, 6 studies on serum magnesium concentrations, and 2 studies on both) with 19,926 CVD events. The pooled relative risks of total CVD events for the highest vs. lowest category of dietary magnesium intake and serum magnesium concentrations were 0.85 (95% confidence interval 0.78 to 0.92) and 0.77 (0.66 to 0.87), respectively. In linear dose-response analysis, only serum magnesium concentrations ranging from 1.44 to 1.8 mEq/L were significantly associated with total CVD event risk (0.91, 0.85 to 0.97, per 0.1 mEq/L; P for nonlinearity = 0.465). However, significant inverse associations emerged in nonlinear models for dietary magnesium intake (P for nonlinearity = 0.024). The greatest risk reduction occurred when intake increased from 150 to 400 mg/d. There was no evidence of publication bias. Conclusions/Significance There is a statistically significant nonlinear inverse association between dietary magnesium intake and total CVD event risk. Serum magnesium concentrations are linearly and inversely associated with the risk of total CVD events. PMID:23520480
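
    The highest-vs-lowest pooling step of such a meta-analysis is a short computation; a minimal fixed-effect inverse-variance sketch on invented study-level relative risks follows (the paper's analysis is more elaborate, e.g. dose-response and heterogeneity models):

        import numpy as np

        def pooled_relative_risk(rr, ci_low, ci_high):
            """Fixed-effect inverse-variance pooling of relative risks;
            standard errors are recovered from the 95% CIs on the log scale."""
            log_rr = np.log(rr)
            se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
            w = 1.0 / se ** 2
            pooled = np.sum(w * log_rr) / np.sum(w)
            pooled_se = 1.0 / np.sqrt(np.sum(w))
            return tuple(np.exp([pooled,
                                 pooled - 1.96 * pooled_se,
                                 pooled + 1.96 * pooled_se]))

        # Hypothetical highest-vs-lowest category RRs from three cohorts
        rr = np.array([0.82, 0.90, 0.78])
        lo = np.array([0.70, 0.75, 0.62])
        hi = np.array([0.96, 1.08, 0.98])
        est, ci_l, ci_u = pooled_relative_risk(rr, lo, hi)
        print(f"pooled RR = {est:.2f} (95% CI {ci_l:.2f} to {ci_u:.2f})")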

  19. Root Cause Analysis Following an Event at a Nuclear Installation: Reference Manual

    International Nuclear Information System (INIS)

    2015-01-01

    Following an event at a nuclear installation, it is important to accurately determine its root causes so that effective corrective actions can be implemented. As stated in IAEA Safety Standards Series No. SF-1, Fundamental Safety Principles: “Processes must be put in place for the feedback and analysis of operating experience”. If this process is completed effectively, the probability of a similar event occurring is significantly reduced. Guidance on how to establish and implement such a process is given in IAEA Safety Standards Series No. NS-G-2.11, A System for the Feedback of Experience from Events in Nuclear Installations. To cater for the diverse nature of operating experience events, several different root cause analysis (RCA) methodologies and techniques have been developed for effective investigation and analysis. An event here is understood as any unanticipated sequence of occurrences that results in, or potentially results in, consequences to plant operation and safety. RCA is not a topic uniquely relevant to event investigators: knowledge of the concepts enhances the learning characteristics of the whole organization. This knowledge also makes a positive contribution to nuclear safety and helps to foster a culture of preventing event occurrence. This publication allows organizations to deepen their knowledge of these methodologies and techniques and also provides new organizations with a broad overview of the RCA process. It is the outcome of a coordinated effort involving the participation of experts from nuclear organizations, the energy industry and research centres in several Member States. This publication also complements IAEA Services Series No. 10, PROSPER Guidelines: Guidelines for Peer Review and for Plant Self-Assessment of Operational Experience Feedback Process, and is intended to form part of a suite of publications developing the principles set forth in these guidelines. In addition to the information and description of RCA

  20. PRELIMINARY SELECTION OF MGR DESIGN BASIS EVENTS

    International Nuclear Information System (INIS)

    Kappes, J.A.

    1999-01-01

    The purpose of this analysis is to identify the preliminary design basis events (DBEs) for consideration in the design of the Monitored Geologic Repository (MGR). For external events and natural phenomena (e.g., earthquake), the objective is to identify those initiating events that the MGR will be designed to withstand. Design criteria will ensure that radiological release scenarios resulting from these initiating events are beyond design basis (i.e., have a scenario frequency less than once per million years). For internal (i.e., human-induced and random equipment failures) events, the objective is to identify credible event sequences that result in bounding radiological releases. These sequences will be used to establish the design basis criteria for MGR structures, systems, and components (SSCs) in order to prevent or mitigate radiological releases. The safety strategy presented in this analysis for preventing or mitigating DBEs is based on the preclosure safety strategy outlined in "Strategy to Mitigate Preclosure Offsite Exposure" (CRWMS M&O 1998f). DBE analysis is necessary to provide feedback and requirements to the design process, and also to demonstrate compliance with proposed 10 CFR 63 (Dyer 1999b) requirements. DBE analysis is also required to identify and classify the SSCs that are important to safety (ITS)

  1. Abnormal Event Detection in Wireless Sensor Networks Based on Multiattribute Correlation

    Directory of Open Access Journals (Sweden)

    Mengdi Wang

    2017-01-01

    Full Text Available Abnormal event detection is one of the vital tasks in wireless sensor networks. However, node faults and poor deployment environments pose great challenges to abnormal event detection. In a typical event detection technique, spatiotemporal correlations are collected to detect an event, which makes the technique susceptible to noise and errors. To improve the quality of detection results, we propose a novel approach for abnormal event detection in wireless sensor networks. This approach considers not only spatiotemporal correlations but also the correlations among observed attributes. A dependency model of the observed attributes is constructed based on a Bayesian network: the dependency structure of the observed attributes is obtained by structure learning, and the conditional probability table of each node is calculated by parameter learning. We propose a new concept named attribute correlation confidence to evaluate how well a sensor reading fits the abnormal event pattern. On the basis of time correlation detection and space correlation detection, the abnormal events are identified. Experimental results show that the proposed algorithm can effectively reduce the impact of interference factors and the false alarm rate, and it can also improve the accuracy of event detection.
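
    The record names the ingredients (a learned dependency structure, conditional probability tables, and a confidence score for a reading) without detail; the toy sketch below fixes a two-attribute structure by hand and scores readings by their joint probability under the learned tables, with all data and the exact scoring rule being assumptions:

        import numpy as np

        # Hypothetical discretized normal readings, columns = (temp, humidity),
        # with the dependency structure humidity <- temp assumed to have come
        # from a structure-learning step that is omitted here
        data = np.array([[0, 0], [0, 0], [0, 1], [1, 1], [1, 1],
                         [1, 0], [0, 0], [1, 1], [0, 0], [1, 1]])

        def learn_cpt(data, child, parent, k=2):
            """Conditional probability table P(child | parent) by
            maximum likelihood with Laplace smoothing."""
            counts = np.ones((k, k))                 # Laplace prior
            for row in data:
                counts[row[parent], row[child]] += 1
            return counts / counts.sum(axis=1, keepdims=True)

        p_temp = np.bincount(data[:, 0], minlength=2) / len(data)
        cpt_humidity = learn_cpt(data, child=1, parent=0)

        def correlation_confidence(reading):
            """Joint probability of a reading under the learned network;
            a low value flags a violation of the learned attribute
            correlations, i.e. a candidate abnormal event."""
            t, h = reading
            return p_temp[t] * cpt_humidity[t, h]

        print(round(correlation_confidence((0, 0)), 3))   # typical reading
        print(round(correlation_confidence((0, 1)), 3))   # unusual combination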

  2. Trend analysis of human error events and assessment of their proactive prevention measure at Rokkasho reprocessing plant

    International Nuclear Information System (INIS)

    Yamazaki, Satoru; Tanaka, Izumi; Wakabayashi, Toshio

    2012-01-01

    A trend analysis of human error events is important for preventing their recurrence. We propose a new method for identifying common characteristics, such as latent organizational weaknesses, from the results of trend analysis, together with a management process for strategic error prevention. In this paper, we describe a trend analysis method for the human error events that have accumulated in the organization, and the utilization of the analysis results to prevent accidents proactively. Although the systematic analysis of human error events, the monitoring of their overall trend, and the utilization of the analyzed results have been examined for plant operation, such information has never been fully utilized. Sharing information on human error events and analyzing their causes leads to the clarification of problems in management and human factors. This new method was applied to the human error events that occurred at the Rokkasho reprocessing plant from October 2010. The results revealed that the output of this method is effective in judging the error prevention plan and that the number of human error events was reduced to about 50% of those observed in 2009 and 2010. (author)

  3. Deep learning based beat event detection in action movie franchises

    Science.gov (United States)

    Ejaz, N.; Khan, U. A.; Martínez-del-Amor, M. A.; Sparenberg, H.

    2018-04-01

    Automatic understanding and interpretation of movies can be used in a variety of ways to semantically manage massive volumes of movie data. The "Action Movie Franchises" dataset is a collection of twenty Hollywood action movies from five famous franchises, with ground truth annotations at the shot and beat level of each movie. In this dataset, annotations are provided for eleven semantic beat categories. In this work, we propose a deep learning based method to classify shots and beat events on this dataset. A training dataset for each of the eleven beat categories is developed and a Convolutional Neural Network is trained. After finding the shot boundaries, key frames are extracted for each shot and three classification labels are assigned to each key frame. The classification labels of the key frames in a particular shot are then used to assign a unique label to each shot. A simple sliding-window based method is then used to group adjacent shots having the same label in order to find a particular beat event. The results of beat event classification are reported in terms of precision, recall, and F-measure. The results are compared with the existing technique and significant improvements are recorded.
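
    The final grouping step, collapsing runs of identically labelled adjacent shots into beat events, can be sketched in a few lines (beat names and labels here are hypothetical):

        from itertools import groupby

        def shots_to_beat_events(shot_labels):
            """Collapse per-shot labels into beat events: each run of
            adjacent shots with the same label becomes one
            (label, first_shot, last_shot) event."""
            events, index = [], 0
            for label, run in groupby(shot_labels):
                length = len(list(run))
                events.append((label, index, index + length - 1))
                index += length
            return events

        # Hypothetical per-shot beat labels produced by the trained CNN
        labels = ["pursuit", "pursuit", "battle", "battle", "battle", "victory"]
        print(shots_to_beat_events(labels))
        # [('pursuit', 0, 1), ('battle', 2, 4), ('victory', 5, 5)]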

  4. The January 2001, El Salvador event: a multi-data analysis

    Science.gov (United States)

    Vallee, M.; Bouchon, M.; Schwartz, S. Y.

    2001-12-01

    On January 13, 2001, a large normal-faulting event (Mw=7.6) occurred 100 kilometers from the Salvadorian coast (Central America) with a centroid depth of about 50 km. The size of this event is surprising given the classical idea that such events have to be much weaker than thrust events in subduction zones. We analysed this earthquake with different types of data. Because teleseismic waves are the only data which offer good azimuthal coverage, we first built a kinematic source model with P and SH waves provided by the IRIS-GEOSCOPE networks. The ambiguity between the 30° plane (plunging toward the Pacific Ocean) and the 60° plane (plunging toward Central America) led us to carry out a parallel analysis of the two possible planes. We used a simple point-source modelling in order to define the main characteristics of the event and then used an extended source to retrieve the kinematic features of the rupture. For the two possible planes, this analysis reveals downdip and northwestward rupture propagation, but the difference in fit remains subtle even when using the extended source. In a second part we confronted our models for the two planes with other seismological data, namely (1) regional data, (2) surface wave data through an empirical Green function given by a similar but much weaker earthquake which occurred in July 1996, and lastly (3) near-field data provided by Universidad Centroamericana (UCA) and Centro de Investigaciones Geotecnicas (CIG). Regional data do not discriminate between the two planes either, but surface waves and especially near-field data confirm that the fault plane is the steeper one, plunging toward Central America. Moreover, the slight directivity toward the north is confirmed by surface waves.

  5. Declarative event based models of concurrency and refinement in psi-calculi

    DEFF Research Database (Denmark)

    Normann, Håkon; Johansen, Christian; Hildebrandt, Thomas

    2015-01-01

    Psi-calculi constitute a parametric framework for nominal process calculi, where constraint based process calculi and process calculi for mobility can be defined as instances. We apply here the framework of psi-calculi to provide a foundation for the exploration of declarative event-based process calculi with support for run-time refinement. We first provide a representation of the model of finite prime event structures as an instance of psi-calculi and prove that the representation respects the semantics up to concurrency diamonds and action refinement. We then proceed to give a psi-calculi representation of Dynamic Condition Response Graphs, which conservatively extends prime event structures to allow finite representations of (omega) regular finite (and infinite) behaviours and have been shown to support run-time adaptation and refinement. We end by outlining the final aim of this research, which...

  6. Preventing Medication Error Based on Knowledge Management Against Adverse Event

    OpenAIRE

    Hastuti, Apriyani Puji; Nursalam, Nursalam; Triharini, Mira

    2017-01-01

    Introduction: Medication error is one of many types of errors that could decrease the quality and safety of healthcare. An increasing number of adverse events (AEs) reflects the number of medication errors. This study aimed to develop a model of medication error prevention based on knowledge management. This model is expected to improve the knowledge and skill of nurses in preventing medication error, which is characterized by a decrease in adverse events (AEs). Methods: This study consisted of two sta...

  7. Underlying Event Studies for LHC Energies

    International Nuclear Information System (INIS)

    Barnafoeldi, Gergely Gabor; Levai, Peter; Agocs, Andras G.

    2011-01-01

    The underlying event was originally defined by the CDF collaboration decades ago. Here we improve the original definition to extend our analysis to events with multiple jets. We introduce a definition of surrounding rings/belts; based on this definition, the jet- and surrounding-belt-excluded areas provide a good underlying event definition. We investigate our definition via the multiplicity in the defined geometry. In parallel, the mean transverse momenta of these areas are also studied in proton-proton collisions at √(s) = 7 TeV LHC energy.

  8. Preterm Versus Term Children: Analysis of Sedation/Anesthesia Adverse Events and Longitudinal Risk.

    Science.gov (United States)

    Havidich, Jeana E; Beach, Michael; Dierdorf, Stephen F; Onega, Tracy; Suresh, Gautham; Cravero, Joseph P

    2016-03-01

    Preterm and former preterm children frequently require sedation/anesthesia for diagnostic and therapeutic procedures. Our objective was to determine the age at which children who are born preterm remain at increased risk for sedation/anesthesia adverse events. Our secondary objective was to describe the nature and incidence of adverse events. This is a prospective observational study of children receiving sedation/anesthesia for diagnostic and/or therapeutic procedures outside of the operating room by the Pediatric Sedation Research Consortium. A total of 57,227 patients 0 to 22 years of age were eligible for this study. All adverse events and descriptive terms were predefined. Logistic regression and locally weighted scatterplot regression were used for analysis. Preterm and former preterm children had higher adverse event rates (14.7% vs 8.5%) compared with children born at term. Our analysis revealed a biphasic pattern for the development of adverse sedation/anesthesia events. Airway and respiratory adverse events were most commonly reported. MRI scans were the most commonly performed procedures in both categories of patients. Patients born preterm are nearly twice as likely to develop sedation/anesthesia adverse events, and this risk continues up to 23 years of age. We recommend obtaining a birth history during the formulation of an anesthetic/sedation plan, with heightened awareness that preterm and former preterm children may be at increased risk. Further prospective studies focusing on the etiology and prevention of adverse events in former preterm patients are warranted. Copyright © 2016 by the American Academy of Pediatrics.

  9. Risk-based ranking of dominant contributors to maritime pollution events

    International Nuclear Information System (INIS)

    Wheeler, T.A.

    1993-01-01

    This report describes a conceptual approach for identifying dominant contributors to risk from maritime shipping of hazardous materials. Maritime transportation accidents are relatively common occurrences compared to more frequently analyzed contributors to public risk. Yet research on maritime safety and pollution incidents has not been guided by a systematic, risk-based approach. Maritime shipping accidents can be analyzed using event trees to group the accidents into 'bins,' or groups, of similar characteristics such as type of cargo, location of accident (e.g., harbor, inland waterway), type of accident (e.g., fire, collision, grounding), and size of release. The importance of specific types of events to each accident bin can be quantified. Then the overall importance of accident events to risk can be estimated by weighting the events' individual bin importance measures by the risk associated with each accident bin. 4 refs., 3 figs., 6 tabs

  10. GIS-based rare events logistic regression for mineral prospectivity mapping

    Science.gov (United States)

    Xiong, Yihui; Zuo, Renguang

    2018-02-01

    Mineralization is a special type of singularity event and can be considered a rare event, because within a specific study area the number of prospective locations (1s) is considerably smaller than the number of non-prospective locations (0s). In this study, GIS-based rare events logistic regression (RELR) was used to map mineral prospectivity in the southwestern Fujian Province, China. An odds ratio was used to measure the relative importance of the evidence variables with respect to mineralization. The results suggest that formations, granites, and skarn alterations, followed by faults and aeromagnetic anomalies, are the most important indicators for the formation of Fe-related mineralization in the study area. The prediction rate and the area under the curve (AUC) values show that areas with higher probability have a strong spatial relationship with the known mineral deposits. Comparing the results with original logistic regression (OLR) demonstrates that the GIS-based RELR performs better than OLR. The prospectivity map obtained in this study benefits the search for skarn Fe-related mineralization in the study area.
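
    The rare-events (King-Zeng style) correction is not available in mainstream Python libraries; as a hedged stand-in, the sketch below fits a class-weighted logistic regression with scikit-learn on synthetic evidence layers and reads odds ratios off the coefficients (all layers, coefficients, and the roughly 1% prevalence are invented):

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(3)
        n = 5000
        # Synthetic evidence layers per cell: formation flag, distance to
        # fault, aeromagnetic anomaly (stand-ins for the paper's layers)
        X = np.column_stack([rng.integers(0, 2, n),
                             rng.exponential(1.0, n),
                             rng.normal(0.0, 1.0, n)])
        # Rare mineralization: roughly 1% prospective cells in this toy data
        logit = -5.5 + 2.0 * X[:, 0] - 1.0 * X[:, 1] + 1.5 * X[:, 2]
        y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

        # class_weight="balanced" counteracts the 1s/0s imbalance; the
        # paper's RELR instead applies a finite-sample bias correction
        model = LogisticRegression(class_weight="balanced").fit(X, y)
        odds_ratios = np.exp(model.coef_[0])    # relative layer importance
        print(odds_ratios.round(2))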

  11. A Combined Methodology to Eliminate Artifacts in Multichannel Electrogastrogram Based on Independent Component Analysis and Ensemble Empirical Mode Decomposition.

    Science.gov (United States)

    Sengottuvel, S; Khan, Pathan Fayaz; Mariyappa, N; Patel, Rajesh; Saipriya, S; Gireesan, K

    2018-06-01

    Cutaneous measurements of electrogastrogram (EGG) signals are heavily contaminated by artifacts due to cardiac activity, breathing, motion artifacts, and electrode drifts whose effective elimination remains an open problem. A combined methodology is proposed by combining independent component analysis (ICA) and ensemble empirical mode decomposition (EEMD) to denoise gastric slow-wave signals in multichannel EGG data. Sixteen electrodes are fixed over the upper abdomen to measure the EGG signals under three gastric conditions, namely, preprandial, postprandial immediately, and postprandial 2 h after food for three healthy subjects and a subject with a gastric disorder. Instantaneous frequencies of intrinsic mode functions that are obtained by applying the EEMD technique are analyzed to individually identify and remove each of the artifacts. A critical investigation on the proposed ICA-EEMD method reveals its ability to provide a higher attenuation of artifacts and lower distortion than those obtained by the ICA-EMD method and conventional techniques, like bandpass and adaptive filtering. Characteristic changes in the slow-wave frequencies across the three gastric conditions could be determined from the denoised signals for all the cases. The results therefore encourage the use of the EEMD-based technique for denoising gastric signals to be used in clinical practice.
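
    The combination can be prototyped with standard libraries. The sketch below unmixes the channels with FastICA, EEMD-decomposes each component, keeps only the modes whose rough mean frequency falls in an assumed gastric slow-wave band, and remixes. It is an illustration of the idea on a synthetic two-source mixture, not the authors' pipeline; the band edges, sampling rate, and mixing are assumptions:

      import numpy as np
      from sklearn.decomposition import FastICA
      from PyEMD import EEMD   # pip install EMD-signal

      fs = 10.0                                   # sampling rate (Hz), assumed
      t = np.arange(0, 60, 1 / fs)
      rng = np.random.default_rng(3)

      slow = np.sin(2 * np.pi * 0.05 * t)         # gastric slow wave (~3 cpm)
      cardiac = np.sin(2 * np.pi * 1.2 * t)       # cardiac artifact stand-in
      mixing = rng.normal(size=(2, 16))
      X = (np.column_stack([slow, cardiac]) @ mixing
           + 0.05 * rng.normal(size=(len(t), 16)))

      ica = FastICA(n_components=4, random_state=0)
      S = ica.fit_transform(X)                    # independent components

      eemd = EEMD(trials=20)
      S_clean = np.zeros_like(S)
      for k in range(S.shape[1]):
          for imf in eemd.eemd(S[:, k], t):
              # Keep modes whose rough mean frequency (via zero crossings)
              # lies in an assumed slow-wave band of 0.03-0.15 Hz.
              zc = np.sum(np.diff(np.sign(imf)) != 0)
              if 0.03 <= zc / (2 * t[-1]) <= 0.15:
                  S_clean[:, k] += imf

      X_clean = ica.inverse_transform(S_clean)    # back to the electrode space
      print(X_clean.shape)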

  12. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    Science.gov (United States)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
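
    At its core, such a simulator rests on a time-ordered event queue that is drained until empty, with each event's effect able to schedule follow-up events after a delay. A minimal Python sketch of that core only; the component, mode, and library machinery of the invention is elided:

      # Minimal discrete-event loop: pop events in time order until the
      # queue is emptied; effects may schedule follow-ups after a delay.
      import heapq
      import itertools

      _seq = itertools.count()     # tie-breaker keeps heap entries orderable

      def schedule(queue, time, name, effect):
          heapq.heappush(queue, (time, next(_seq), name, effect))

      def simulate(queue):
          while queue:             # run until the event queue is emptied
              time, _, name, effect = heapq.heappop(queue)
              print(f"t={time:4.1f}  {name}")
              effect(queue, time)

      # A draining tank, modelled discretely: each level change re-invokes
      # itself with a one-unit time delay (an invented toy process).
      def drain(level):
          def effect(queue, t):
              if level > 0:
                  schedule(queue, t + 1.0, f"level -> {level - 1}",
                           drain(level - 1))
          return effect

      q = []
      schedule(q, 0.0, "level -> 3", drain(3))
      simulate(q)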

  13. Development of transient initiating event frequencies for use in probabilistic risk assessments

    International Nuclear Information System (INIS)

    Mackowiak, D.P.; Gentillon, C.D.; Smith, K.L.

    1985-05-01

    Transient initiating event frequencies are an essential input to the analysis process of a nuclear power plant probabilistic risk assessment. These frequencies describe events causing or requiring scrams. This report documents an effort to validate and update from other sources a computer-based data file developed by the Electric Power Research Institute (EPRI) describing such events at 52 United States commercial nuclear power plants. Operating information from the United States Nuclear Regulatory Commission on 24 additional plants from their date of commercial operation has been combined with the EPRI data, and the entire database has been updated to add 1980 through 1983 events for all 76 plants. The validity of the EPRI data and data analysis methodology and the adequacy of the EPRI transient categories are examined. New transient initiating event frequencies are derived from the expanded database using the EPRI transient categories and data display methods. Upper bounds for these frequencies are also provided. Additional analyses explore changes in the dominant transients, changes in transient outage times and their impact on plant operation, and the effects of power level and scheduled scrams on transient event frequencies. A more rigorous data analysis methodology is developed to encourage further refinement of the transient initiating event frequencies derived herein. Updating the transient event database resulted in approximately 2400 events being added to EPRI's approximately 3000-event data file. The resulting frequency estimates were in most cases lower than those reported by EPRI, but no significant order-of-magnitude changes were noted. The average number of transients per year for the combined database is 8.5 for pressurized water reactors and 7.4 for boiling water reactors.
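
    The basic arithmetic behind such estimates is a Poisson rate with a confidence bound. A small Python sketch with hypothetical counts (chosen only to land near the reported per-year averages), using the standard chi-square upper bound for a Poisson rate:

      # Poisson point estimate and 95% upper bound for an event frequency.
      # The counts below are hypothetical, not data from the report.
      from scipy.stats import chi2

      events = 68            # transients observed in a category (hypothetical)
      reactor_years = 8.0    # corresponding observation period

      rate = events / reactor_years                          # ~8.5 per year
      upper_95 = chi2.ppf(0.95, 2 * events + 2) / (2 * reactor_years)
      print(f"point estimate {rate:.1f}/yr, 95% upper bound {upper_95:.1f}/yr")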

  14. Tracing the Spatial-Temporal Evolution of Events Based on Social Media Data

    Directory of Open Access Journals (Sweden)

    Xiaolu Zhou

    2017-03-01

    Social media data provide a great opportunity to investigate event flow in cities. Despite the advantages of social media data in these investigations, data heterogeneity and big data size pose challenges for researchers seeking to identify useful information about events from the raw data. In addition, few studies have used social media posts to capture how events develop in space and time. This paper demonstrates an efficient approach based on machine learning and geovisualization to identify events and trace their development in real time. We conducted an empirical study to delineate the temporal and spatial evolution of a natural event (heavy precipitation) and a social event (Pope Francis' visit to the US) in the New York City-Washington, DC region. By investigating multiple features of Twitter data (message, author, time, and geographic location information), this paper demonstrates how voluntary local knowledge from tweets can be used to depict city dynamics, discover spatiotemporal characteristics of events, and convey real-time information.

  15. Visitor satisfaction of international cultural events in Belgrade

    Directory of Open Access Journals (Sweden)

    Zečević Bojan

    2016-01-01

    In modern tourism, events are of great importance. The increase in the number of events on a global scale has raised competitive pressure and the need for a marketing approach to managing event development. Consumer (service user) satisfaction is one of the basic elements in managing tourism development generally, and thus it is also important to manage and measure the satisfaction of event visitors. Visitor satisfaction matters bearing in mind its influence on sharing positive experiences, repeat visits, and the tourism affirmation of the areas where an event takes place. The paper analyzes visitor satisfaction at the three most important cultural events in Belgrade: BITEF, the Jazz Festival, and the Belgrade Book Fair. The focus of the analysis is on visitor satisfaction resulting from event participation, from the contents the event offers, and from the accompanying tourism offer of Belgrade as a destination. The analysis is based on empirical research in which 450 event visitors took part.

  16. Event dependent sampling of recurrent events

    DEFF Research Database (Denmark)

    Kvist, Tine Kajsa; Andersen, Per Kragh; Angst, Jules

    2010-01-01

    The effect of event-dependent sampling of processes consisting of recurrent events is investigated when analyzing whether the risk of recurrence increases with event count. We study the situation where processes are selected for study if an event occurs in a certain selection interval. Motivation...... retrospective and prospective disease course histories are used. We examine two methods to correct for the selection depending on which data are used in the analysis. In the first case, the conditional distribution of the process given the pre-selection history is determined. In the second case, an inverse...

  17. Top-down proteomics for the analysis of proteolytic events - Methods, applications and perspectives.

    Science.gov (United States)

    Tholey, Andreas; Becker, Alexander

    2017-11-01

    Mass spectrometry based proteomics is an indispensable tool for almost all research areas relevant for the understanding of proteolytic processing, ranging from the identification of substrates, products and cleavage sites up to the analysis of structural features influencing protease activity. The majority of methods for these studies are based on bottom-up proteomics, which performs analysis at the peptide level. As this approach is characterized by a number of pitfalls, e.g. loss of molecular information, there is an ongoing effort to establish top-down proteomics, which performs both separation and MS analysis at the intact protein level. We briefly introduce major approaches of bottom-up proteomics used in the field of protease research and highlight the shortcomings of these methods. We then discuss the present state-of-the-art of top-down proteomics. Together with the discussion of known challenges we show the potential of this approach and present a number of successful applications of top-down proteomics in protease research. This article is part of a Special Issue entitled: Proteolysis as a Regulatory Event in Pathophysiology edited by Stefan Rose-John. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Visual search of cyclic spatio-temporal events

    Science.gov (United States)

    Gautier, Jacques; Davoine, Paule-Annick; Cunty, Claire

    2018-05-01

    The analysis of spatio-temporal events, and especially of relationships between their different dimensions (space-time-thematic attributes), can be done with geovisualization interfaces. But few geovisualization tools integrate the cyclic dimension of spatio-temporal event series (natural or social events). Time Coil and Time Wave diagrams represent both linear time and cyclic time. By introducing a cyclic temporal scale, these diagrams can highlight the cyclic characteristics of spatio-temporal events. However, the settable cyclic temporal scales are limited to usual durations like days or months. Because of that, these diagrams cannot be used to visualize cyclic events that reappear with an unusual period, and they do not support a visual search for cyclic events. Nor do they make it possible to identify relationships between the cyclic behavior of events and their spatial features, in particular to identify localised cyclic events. The inability to represent cyclic time outside the temporal diagram of multi-view geovisualization interfaces limits the analysis of relationships between the cyclic reappearance of events and their other dimensions. In this paper, we propose a method and a geovisualization tool, based on an extension of Time Coil and Time Wave, to provide a visual search for cyclic events by allowing any duration to be set as the diagram's cyclic temporal scale. We also propose a symbology approach to push the representation of cyclic time into the map, in order to improve the analysis of relationships between space and the cyclic behavior of events.
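
    The key enabling step, freeing the cyclic scale from calendar units, is just a modular projection of timestamps onto an arbitrary period. A minimal Python sketch with an invented period and event times:

      # Project event timestamps onto a cyclic scale of *any* period, so
      # unusual cycles become visible. Data and period are illustrative.
      import numpy as np

      def cyclic_coordinates(timestamps, period):
          # Position within the cycle (angle) and cycle index (turn) on a coil.
          phase = np.mod(timestamps, period) / period
          return 2 * np.pi * phase, np.floor_divide(timestamps, period)

      ts = np.array([0.0, 9000.0, 18200.0, 27100.0, 36050.0])  # event times (s)
      angle, turn = cyclic_coordinates(ts, period=9050.0)      # ~2.5 h cycle
      print(np.round(angle, 2), turn.astype(int))
      # Angles clustering near 0 (mod 2*pi) reveal a reappearance pattern that
      # a day- or month-locked cyclic scale would miss.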

  19. XML-based analysis interface for particle physics data analysis

    International Nuclear Information System (INIS)

    Hu Jifeng; Lu Xiaorui; Zhang Yangheng

    2011-01-01

    The letter presents an XML-based interface and its framework for particle physics data analysis. The interface uses a concise XML syntax to describe the basic tasks of a data analysis: event selection, kinematic fitting, particle identification, etc., and a basic processing logic: the next step goes on if and only if this step succeeds. The framework can perform an analysis without compiling, by loading the XML-interface file, setting parameters at run-time and running dynamically. Coding an analysis in XML instead of C++ is easy to understand and use, effectively reduces the workload, and enables users to carry out their analyses quickly. The framework has been developed on the BESIII offline software system (BOSS) with object-oriented C++ programming. The functions required by the regular tasks and the basic processing logic are implemented either as standard modules or by inheritance from modules in BOSS. The interface and its framework have been tested in physics analyses. (authors)
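
    The processing logic is easy to mimic outside BOSS: read the task list from XML and run each step only while the previous one succeeds. A hedged Python sketch with invented tag names and steps, not the interface's actual schema:

      # Short-circuit task chain driven by an XML description (tags invented).
      import xml.etree.ElementTree as ET

      XML = """
      <analysis>
        <step name="event-selection" min_tracks="4"/>
        <step name="kinematic-fit"   max_chi2="20"/>
        <step name="particle-id"     hypothesis="kaon"/>
      </analysis>
      """

      def run_step(step):
          # A real framework dispatches to compiled modules; here every
          # step trivially succeeds so the chain runs to the end.
          print("running", step.get("name"), dict(step.attrib))
          return True

      for step in ET.fromstring(XML).iter("step"):
          if not run_step(step):   # the next step goes on iff this one succeeds
              print("step failed; analysis chain stops")
              break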

  20. An XML-Based Protocol for Distributed Event Services

    Science.gov (United States)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)

    2001-01-01

    This viewgraph presentation provides information on the application of an XML (extensible mark-up language)-based protocol to the developing field of distributed processing by way of a computational grid which resembles an electric power grid. XML tags would be used to transmit events between the participants of a transaction, namely, the consumer and the producer of the grid scheme.

  1. Multi Agent System Based Wide Area Protection against Cascading Events

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Liu, Leo

    2012-01-01

    In this paper, a multi-agent system based wide area protection scheme is proposed in order to prevent long term voltage instability induced cascading events. The distributed relays and controllers work as a device agent which not only executes the normal function automatically but also can...... the effectiveness of proposed protection strategy. The simulation results indicate that the proposed multi agent control system can effectively coordinate the distributed relays and controllers to prevent the long term voltage instability induced cascading events....

  2. Declarative Event-Based Workflow as Distributed Dynamic Condition Response Graphs

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2010-01-01

    We present Dynamic Condition Response Graphs (DCR Graphs) as a declarative, event-based process model inspired by the workflow language employed by our industrial partner and conservatively generalizing prime event structures. A dynamic condition response graph is a directed graph with nodes repr...... exemplify the use of distributed DCR Graphs on a simple workflow taken from a field study at a Danish hospital, pointing out their flexibility compared to imperative workflow models. Finally we provide a mapping from DCR Graphs to Büchi automata.
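
    The execution semantics of DCR Graphs can be sketched compactly from the standard definition: a marking of executed, included, and pending events, where an event is enabled when it is included and all of its included conditions have executed, and executing it discharges and creates response obligations. The toy workflow below is invented for illustration, not the paper's hospital study:

      # Minimal DCR Graph marking and execution (include/exclude relations
      # omitted for brevity; the two-event workflow is invented).
      conditions = {"give medicine": {"prescribe"}}  # must run after prescribe
      responses = {"prescribe": {"give medicine"}}   # prescribing obliges it

      events = {"prescribe", "give medicine"}
      executed, pending = set(), set()
      included = set(events)

      def enabled(e):
          return e in included and conditions.get(e, set()) & included <= executed

      def execute(e):
          assert enabled(e), f"{e} is not enabled"
          executed.add(e)
          pending.discard(e)                         # obligation discharged
          pending.update(responses.get(e, set()))    # new obligations

      print(enabled("give medicine"))       # False: its condition has not run
      execute("prescribe")
      print(enabled("give medicine"), pending)
      execute("give medicine")
      print("accepting run:", not pending)  # no pending responses remain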

  3. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  4. Simulation of Quantum Computation : A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  5. Prognostic table for predicting major cardiac events based on J-ACCESS investigation

    International Nuclear Information System (INIS)

    Nakajima, Kenichi; Nishimura, Tsunehiko

    2008-01-01

    The event risk of patients with coronary heart disease may be estimated by a large-scale prognostic database in a Japanese population. The aim of this study was to create a heart risk table for predicting the major cardiac event rate. Using the Japanese-assessment of cardiac event and survival study (J-ACCESS) database created by a prognostic investigation involving 117 hospitals and >4000 patients in Japan, multivariate logistic regression analysis was performed. The major event rate over a 3-year period that included cardiac death, non-fatal myocardial infarction, and severe heart failure requiring hospitalization was predicted by the logistic regression equation. The algorithm for calculating the event rate was simplified for creating tables. Two tables were created to calculate cardiac risk by age, perfusion score category, and ejection fraction with and without the presence of diabetes. A relative risk table comparing age-matched control subjects was also made. When the simplified tables were compared with the results from the original logistic regression analysis, both risk values and relative risks agreed well (P<0.0001 for both). The Heart Risk Table was created for patients suspected of having ischemic heart disease and who underwent myocardial perfusion gated single-photon emission computed tomography. The validity of risk assessment using a J-ACCESS database should be validated in a future study. (author)

  6. A Bayesian Model for Event-based Trust

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl; Sassone, Vladimiro

    2007-01-01

    The application scenarios envisioned for ‘global ubiquitous computing’ have unique requirements that are often incompatible with traditional security paradigms. One alternative currently being investigated is to support security decision-making by explicit representation of principals' trusting...... of the systems from the computational trust literature; the comparison is derived formally, rather than obtained via experimental simulation as traditionally done. With this foundation in place, we formalise a general notion of information about past behaviour, based on event structures. This yields a flexible...

  7. The complexity of standing postural control in older adults: a modified detrended fluctuation analysis based upon the empirical mode decomposition algorithm.

    Directory of Open Access Journals (Sweden)

    Junhong Zhou

    Human aging into senescence diminishes the capacity of the postural control system to adapt to the stressors of everyday life. Diminished adaptive capacity may be reflected by a loss of the fractal-like, multiscale complexity within the dynamics of standing postural sway (i.e., center-of-pressure, COP). We therefore studied the relationship between COP complexity and adaptive capacity in 22 older and 22 younger healthy adults. COP magnitude dynamics were assessed from raw data during quiet standing with eyes open and closed, and complexity was quantified with a new technique termed empirical mode decomposition embedded detrended fluctuation analysis (EMD-DFA). Adaptive capacity of the postural control system was assessed with the sharpened Romberg test. As compared to traditional DFA, EMD-DFA more accurately identified trends in COP data with intrinsic scales and produced short- and long-term scaling exponents (i.e., αShort, αLong) with greater reliability. The fractal-like properties of COP fluctuations were time-scale dependent and highly complex (i.e., αShort values were close to one over relatively short time scales). As compared to younger adults, older adults demonstrated lower short-term COP complexity, i.e., greater αShort values, in both visual conditions (p<0.001). Closing the eyes decreased short-term COP complexity, yet this decrease was greater in older compared to younger adults (p<0.001). In older adults, those with higher short-term COP complexity exhibited better adaptive capacity as quantified by Romberg test performance (r² = 0.38, p<0.001). These results indicate that an age-related loss of COP complexity of magnitude series may reflect a clinically important reduction in postural control system functionality and may serve as a new biomarker.
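
    A simplified stand-in for this analysis can be assembled from an EMD package and a hand-rolled DFA: remove the slowest EMD mode as the trend, then estimate short- and long-scale exponents from log-log fluctuation fits. The sketch below (PyEMD plus NumPy on synthetic data) illustrates the idea and is not the authors' full EMD-embedded procedure:

      import numpy as np
      from PyEMD import EMD   # pip install EMD-signal

      def dfa_alpha(x, scales):
          # Standard DFA-1: slope of log fluctuation vs. log window size.
          y = np.cumsum(x - np.mean(x))              # integrated profile
          F = []
          for s in scales:
              n = len(y) // s
              segs = y[:n * s].reshape(n, s)
              tt = np.arange(s)
              res = [np.mean((seg - np.polyval(np.polyfit(tt, seg, 1), tt)) ** 2)
                     for seg in segs]                # linear detrend per window
              F.append(np.sqrt(np.mean(res)))
          return np.polyfit(np.log(scales), np.log(F), 1)[0]

      rng = np.random.default_rng(1)
      cop = np.cumsum(rng.normal(size=3000))   # stand-in COP magnitude series

      imfs = EMD().emd(cop)
      detrended = cop - imfs[-1]               # drop the slowest mode as trend

      alpha_short = dfa_alpha(detrended, np.arange(4, 17))
      alpha_long = dfa_alpha(detrended, np.arange(17, 65))
      print(round(alpha_short, 2), round(alpha_long, 2))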

  8. Increasing Supply-Chain Visibility with Rule-Based RFID Data Analysis

    DEFF Research Database (Denmark)

    Ilic, A.; Andersen, Thomas; Michahelles, F.

    2009-01-01

    RFID technology tracks the flow of physical items and goods in supply chains to help users detect inefficiencies, such as shipment delays, theft, or inventory problems. An inevitable consequence, however, is that it generates huge numbers of events. To exploit these large amounts of data, the Supply Chain Visualizer increases supply-chain visibility by analyzing RFID data, using a mix of automated analysis techniques and human effort. The tool's core concepts include rule-based analysis techniques and a map-based representation interface. With these features, it lets users visualize......

  9. Organizational Learning in Rare Events

    DEFF Research Database (Denmark)

    Andersen, Kristina Vaarst; Tyler, Beverly; Beukel, Karin

    When organizations encounter rare events they often find it challenging to extract learning from the experience. We analyze opportunities for organizational learning in one such rare event, namely Intellectual Property (IP) litigation, i.e., when organizations take disputes regarding their intellectual property...... the organization little discretion to utilize any learning from past litigation success. Thus, learning appears to be most beneficial in infringement cases. Based on statistical analysis of 10,211 litigation court cases in China, we find support for our hypotheses. Our findings suggest that organizations can learn...

  10. Towards Real-Time Detection of Gait Events on Different Terrains Using Time-Frequency Analysis and Peak Heuristics Algorithm.

    Science.gov (United States)

    Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin

    2016-10-01

    Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and are less reliable when walking up or down stairs. In this study, a new algorithm is proposed to detect gait events on three walking terrains in real time; it analyzes acceleration jerk signals with a time-frequency method to obtain gait parameters and then determines the peaks of the jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses.
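
    The signal path is straightforward to prototype: differentiate acceleration to jerk, then apply peak heuristics. A Python sketch with a synthetic ~1 Hz gait-like signal and invented thresholds; the paper's tuned heuristics differ:

      import numpy as np
      from scipy.signal import find_peaks

      fs = 100.0                                # accelerometer rate (Hz), assumed
      t = np.arange(0, 10, 1 / fs)
      rng = np.random.default_rng(5)
      acc = np.sin(2 * np.pi * 1.0 * t) + 0.01 * rng.normal(size=len(t))

      jerk = np.gradient(acc, 1 / fs)           # time derivative of acceleration

      # Peak heuristics (invented): HS = prominent positive jerk peaks,
      # TO = prominent negative peaks, separated by at least 0.4 s.
      gap = int(0.4 * fs)
      hs, _ = find_peaks(jerk, height=0.5 * jerk.max(), distance=gap)
      to, _ = find_peaks(-jerk, height=0.5 * (-jerk).max(), distance=gap)

      print("HS times (s):", np.round(t[hs], 2))
      print("TO times (s):", np.round(t[to], 2))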

  11. Turning a Private Story into a Public Event. Frame Analysis of Scandals in Television Performance

    Directory of Open Access Journals (Sweden)

    Olga Galanova

    2012-07-01

    It does not suffice to treat scandals only as supra-individual discourses on the macro level of social communication. Rather, we have to develop concrete methodical principles for describing the practice of doing scandal in particular media. In this paper we look at these practices from a micro-sociological perspective and analyze how, and through which concrete actions, an event is staged as a scandal. Practices of scandal build a special frame of media communication, which allows television producers to solve certain "communicative problems." Based on a detailed analysis of a video recording of a television show, we exemplify how a private case turns into a public event by means of scandal-framing. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs120398

  12. Event Management for Teacher-Coaches: Risk and Supervision Considerations for School-Based Sports

    Science.gov (United States)

    Paiement, Craig A.; Payment, Matthew P.

    2011-01-01

    A professional sports event requires considerable planning in which years are devoted to the success of that single activity. School-based sports events do not have that luxury, because high schools across the country host athletic events nearly every day. It is not uncommon during the fall sports season for a combination of boys' and girls'…

  13. Stress reaction process-based hierarchical recognition algorithm for continuous intrusion events in optical fiber prewarning system

    Science.gov (United States)

    Qu, Hongquan; Yuan, Shijiao; Wang, Yanping; Yang, Dan

    2018-04-01

    To improve the recognition performance of optical fiber prewarning systems (OFPS), this study proposes a hierarchical recognition algorithm (HRA). Traditional methods increase the recognition rate by combining multiple extracted features with complex classifiers, at the cost of a considerable decrease in recognition speed. HRA instead takes advantage of the continuity of intrusion events to create a staged recognition flow inspired by the stress reaction, and is thus expected to achieve high recognition accuracy with less time consumption. First, this work analyzes the continuity of intrusion events and presents the algorithm based on the stress-reaction mechanism. Then, the time consumption is verified through theoretical analysis and experiments, and the recognition accuracy is obtained through experiments. Experimental results show that HRA processes signals 3.3 times faster than a traditional complicated algorithm while maintaining a similar recognition rate of 98%. The study is of great significance for fast intrusion event recognition in OFPS.
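
    The staged flow can be caricatured in a few lines of Python: a cheap, reflex-like first stage screens every frame, and the more expensive classifier runs only when the first stage fires. Features, labels, and thresholds below are invented placeholders:

      import numpy as np

      def stage1_alert(frame, energy_thresh=0.5):
          # Fast, reflex-like screening: short-term energy only.
          return np.mean(frame ** 2) > energy_thresh

      def stage2_classify(frame):
          # Slower, deliberate stage, run only on frames that pass stage 1.
          # A real system would use several features and a trained classifier.
          zero_crossings = np.sum(np.diff(np.sign(frame)) != 0)
          return ("low-frequency event" if zero_crossings < len(frame) // 8
                  else "broadband event")

      rng = np.random.default_rng(0)
      frames = [rng.normal(scale=s, size=256) for s in (0.2, 0.3, 1.5, 1.6)]

      for i, frame in enumerate(frames):
          if stage1_alert(frame):       # most quiet frames stop at the fast path
              print(f"frame {i}: intrusion ->", stage2_classify(frame))
          else:
              print(f"frame {i}: quiet")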

  14. Sampled-data consensus in switching networks of integrators based on edge events

    Science.gov (United States)

    Xiao, Feng; Meng, Xiangyu; Chen, Tongwen

    2015-02-01

    This paper investigates the event-driven sampled-data consensus in switching networks of multiple integrators and studies both the bidirectional interaction and leader-following passive reaction topologies in a unified framework. In these topologies, each information link is modelled by an edge of the information graph and assigned a sequence of edge events, which activate the mutual data sampling and controller updates of the two linked agents. Two kinds of edge-event-detecting rules are proposed for the general asynchronous data-sampling case and the synchronous periodic event-detecting case. They are implemented in a distributed fashion, and their effectiveness in reducing communication costs and solving consensus problems under a jointly connected topology condition is shown by both theoretical analysis and simulation examples.
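
    A stripped-down version of event-triggered consensus illustrates the communication saving: integrator agents re-sample only when their state has drifted past a threshold since the last sample. The rule below is a simplification of the paper's edge-event rules, and the graph, gains, and thresholds are assumptions:

      import numpy as np

      A = np.array([[0, 1, 0],
                    [1, 0, 1],
                    [0, 1, 0]], dtype=float)  # 3-agent line graph
      L = np.diag(A.sum(axis=1)) - A          # graph Laplacian

      x = np.array([1.0, 5.0, 9.0])           # integrator states
      x_hat = x.copy()                        # last event-sampled states
      dt, delta, events = 0.01, 0.05, 0

      for _ in range(3000):
          drift = np.abs(x - x_hat) > delta   # event rule: drift past threshold
          x_hat[drift] = x[drift]             # re-sample only on events
          events += int(drift.sum())
          x = x - dt * (L @ x_hat)            # u = -L x_hat between events

      print(np.round(x, 3), "events:", events, "of", 3000 * 3, "possible samples")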

  15. Analysis and modeling of a hail event consequences on a building portfolio

    Science.gov (United States)

    Nicolet, Pierrick; Voumard, Jérémie; Choffet, Marc; Demierre, Jonathan; Imhof, Markus; Jaboyedoff, Michel

    2014-05-01

    North-West Switzerland was affected by a severe hail storm in July 2011, which was especially intense in the Canton of Aargau. The damage cost of this event is around EUR 105 million for the Canton of Aargau alone, which corresponds to half of the mean annual consolidated damage cost of the last 20 years for the 19 cantons (out of 26) with public insurance. The aim of this project is to benefit from the collected insurance data to better understand and estimate the risk of such events. In a first step, a simple hail event simulator, which had been developed for a previous hail episode, is modified. The geometric properties of the storm are derived from the maximum-intensity radar image by means of a set of 2D Gaussians instead of 1D Gaussians on profiles, as was the case in the previous version. The tool is then tested on this new event in order to establish its ability to give a fast damage estimation based on the radar image and on building values and locations. In a further step, the geometrical properties are used to generate random outcomes with similar characteristics, which are combined with a vulnerability curve and an event frequency to estimate the risk. The vulnerability curve comes from a 2009 event and is improved with data from this event, whereas the frequency for the Canton is estimated from insurance records. In addition to this regional risk analysis, this contribution aims at studying the relation between building orientation and damage rate. Indeed, the orientation of the roof is expected to influence the aging of the material by controlling the frequency and amplitude of freeze-thaw cycles, thereby changing the vulnerability over time. This part is established by calculating the hours of sunshine, which are used to derive the material temperatures. This information is then compared with insurance claims. A last part proposes a model to study the hail impact on a building, by modeling the different equipment on each facade of the
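
    The 2D-Gaussian footprint idea is simple to reproduce. The Python sketch below builds a hail swath from two rotated anisotropic Gaussians and counts the buildings of a random portfolio above an intensity threshold; all parameters are invented, whereas the paper fits them to the radar image:

      import numpy as np

      def gauss2d(X, Y, x0, y0, sx, sy, theta, amp):
          # Rotated anisotropic 2D Gaussian.
          ct, st = np.cos(theta), np.sin(theta)
          xr = ct * (X - x0) + st * (Y - y0)
          yr = -st * (X - x0) + ct * (Y - y0)
          return amp * np.exp(-0.5 * ((xr / sx) ** 2 + (yr / sy) ** 2))

      X, Y = np.meshgrid(np.linspace(0, 50, 200), np.linspace(0, 30, 120))  # km

      # Storm swath as two overlapping cells (intensities in arbitrary units).
      swath = (gauss2d(X, Y, 15, 12, 8, 2.5, np.deg2rad(20), 60.0)
               + gauss2d(X, Y, 32, 18, 10, 3.0, np.deg2rad(25), 45.0))

      rng = np.random.default_rng(7)
      bx, by = rng.uniform(0, 50, 1000), rng.uniform(0, 30, 1000)  # portfolio
      rows = (by / 30 * (swath.shape[0] - 1)).astype(int)
      cols = (bx / 50 * (swath.shape[1] - 1)).astype(int)
      print("buildings above damage threshold:",
            int((swath[rows, cols] > 30.0).sum()))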

  16. Re-presentations of space in Hollywood movies: an event-indexing analysis.

    Science.gov (United States)

    Cutting, James; Iricinschi, Catalina

    2015-03-01

    Popular movies present chunk-like events (scenes and subscenes) that promote episodic, serial updating of viewers' representations of the ongoing narrative. Event-indexing theory would suggest that the beginnings of new scenes trigger these updates, which in turn require more cognitive processing. Typically, a new movie event is signaled by an establishing shot, one providing more background information and a longer look than the average shot. Our analysis of 24 films reconfirms this. More important, we show that, when returning to a previously shown location, the re-establishing shot reduces both context and duration while remaining greater than the average shot. In general, location shifts dominate character and time shifts in event segmentation of movies. In addition, over the last 70 years re-establishing shots have become more like the noninitial shots of a scene. Establishing shots have also approached noninitial shot scales, but not their durations. Such results suggest that film form is evolving, perhaps to suit more rapid encoding of narrative events. Copyright © 2014 Cognitive Science Society, Inc.

  17. A hydrological analysis of the 4 November 2011 event in Genoa

    Directory of Open Access Journals (Sweden)

    F. Silvestro

    2012-09-01

    On 4 November 2011 a flash flood event hit the area of Genoa with dramatic consequences. Such an event represents, from the meteorological and hydrological perspective, a paradigm of flash floods in the Mediterranean environment.

    The hydro-meteorological probabilistic forecasting system for small and medium size catchments in use at the Civil Protection Centre of the Liguria region exhibited excellent performance for this event, predicting, 24–48 h in advance, the potential level of risk associated with the forecast. It greatly helped the decision makers in issuing a timely and correct alert.

    In this work we present the operational outputs of the system provided during the Liguria events and the post-event hydrological modelling analysis, which also accounts for crowd-sourced information and data. We discuss the benefit of the implemented probabilistic systems for decision-making under uncertainty, highlighting how, in this case, the multi-catchment approach used for predicting floods in small basins has been crucial.

  18. Event-based soil loss models for construction sites

    Science.gov (United States)

    Trenouth, William R.; Gharabaghi, Bahram

    2015-05-01

    The elevated rates of soil erosion stemming from land clearing and grading activities during urban development, can result in excessive amounts of eroded sediments entering waterways and causing harm to the biota living therein. However, construction site event-based soil loss simulations - required for reliable design of erosion and sediment controls - are one of the most uncertain types of hydrologic models. This study presents models with improved degree of accuracy to advance the design of erosion and sediment controls for construction sites. The new models are developed using multiple linear regression (MLR) on event-based permutations of the Universal Soil Loss Equation (USLE) and artificial neural networks (ANN). These models were developed using surface runoff monitoring datasets obtained from three sites - Greensborough, Cookstown, and Alcona - in Ontario and datasets mined from the literature for three additional sites - Treynor, Iowa, Coshocton, Ohio and Cordoba, Spain. The predictive MLR and ANN models can serve as both diagnostic and design tools for the effective sizing of erosion and sediment controls on active construction sites, and can be used for dynamic scenario forecasting when considering rapidly changing land use conditions during various phases of construction.
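
    The MLR variant has a convenient form: taking logarithms of event-based USLE-style factors turns the multiplicative structure into a linear regression. A Python sketch on synthetic data, not the monitored-site records:

      # Log-linear regression on USLE-style event factors (synthetic data).
      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(42)
      n = 120
      R = rng.uniform(5, 500, n)        # event erosivity
      K = rng.uniform(0.1, 0.5, n)      # soil erodibility
      LS = rng.uniform(0.2, 3.0, n)     # slope length-steepness

      # Synthetic multiplicative response with lognormal noise.
      A = 0.8 * R**0.9 * K**1.1 * LS**0.7 * rng.lognormal(0, 0.3, n)

      Xlog = np.log(np.column_stack([R, K, LS]))
      model = LinearRegression().fit(Xlog, np.log(A))
      print("fitted exponents:", np.round(model.coef_, 2))

      # Predict event soil loss for hypothetical factor values.
      new = np.log([[250.0, 0.3, 1.5]])
      print("predicted soil loss:", round(float(np.exp(model.predict(new)[0])), 1))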

  19. Discrimination of Rock Fracture and Blast Events Based on Signal Complexity and Machine Learning

    Directory of Open Access Journals (Sweden)

    Zilong Zhou

    2018-01-01

    The automatic discrimination of rock fracture and blast events is complex and challenging due to their similar waveform characteristics. To solve this problem, a new method based on signal complexity analysis and machine learning is proposed in this paper. First, the permutation entropy values of signals at different scale factors are calculated to reflect the complexity of the signals and are assembled into a feature vector set. Secondly, based on the feature vector set, a back-propagation neural network (BPNN) is applied as a means of machine learning to establish a discriminator for rock fracture and blast events. Then, to evaluate the classification performance of the new method, the classification accuracies of a support vector machine (SVM), a naive Bayes classifier, and the new method are compared, and the receiver operating characteristic (ROC) curves are also analyzed. The results show that the new method obtains the best classification performance. In addition, the influence of different scale factors q and numbers of training samples n on the discrimination results is discussed. It is found that the classification accuracy of the new method reaches the highest value when q = 8–15 or 8–20 and n = 140.
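
    The feature pipeline is reproducible with standard tools: permutation entropy at several scale factors as the feature vector, then a small back-propagation network. The sketch below uses synthetic stand-ins for fracture-like and blast-like signals, not the paper's data:

      import numpy as np
      from math import factorial
      from sklearn.neural_network import MLPClassifier

      def permutation_entropy(x, order=4):
          # Normalized permutation entropy of a 1-D signal.
          n = len(x) - order + 1
          patterns = np.argsort([x[i:i + order] for i in range(n)], axis=1)
          _, counts = np.unique(patterns, axis=0, return_counts=True)
          p = counts / counts.sum()
          return -np.sum(p * np.log(p)) / np.log(factorial(order))

      def coarse_grain(x, scale):
          m = len(x) // scale
          return x[:m * scale].reshape(m, scale).mean(axis=1)

      def features(x, scales=range(1, 9)):     # scale factors q = 1..8
          return [permutation_entropy(coarse_grain(x, q)) for q in scales]

      rng = np.random.default_rng(0)
      t = np.arange(1024)
      sig0 = [np.cumsum(rng.normal(size=1024)) for _ in range(70)]  # noise-like
      sig1 = [np.exp(-t / 300) * np.sin(0.2 * t)
              + 0.1 * rng.normal(size=1024) for _ in range(70)]     # tone-like

      X = np.array([features(s) for s in sig0 + sig1])
      y = np.array([0] * 70 + [1] * 70)

      clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
      clf.fit(X[::2], y[::2])                  # train on half the samples
      print("hold-out accuracy:", clf.score(X[1::2], y[1::2]))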

  20. Event-based rainfall-runoff modelling of the Kelantan River Basin

    Science.gov (United States)

    Basarudin, Z.; Adnan, N. A.; Latif, A. R. A.; Tahir, W.; Syafiqah, N.

    2014-02-01

    Flood is one of the most common natural disasters in Malaysia. According to hydrologists, many causes contribute to flood events; the two most dominant factors are meteorology (i.e., climate change) and change in land use. These two factors have contributed to floods in recent decades, especially in monsoonal catchments such as those in Malaysia. This paper intends to quantify the influence of rainfall during extreme rainfall events on the hydrological response of the Kelantan River catchment. Two dynamic inputs were therefore used in the study: rainfall and river discharge. The extreme flood events of 2008 and 2004 were compared based on rainfall data for both years. The events were modelled via a semi-distributed HEC-HMS hydrological model. Land use change was not incorporated because the study only tries to quantify rainfall differences between the two events when simulating discharge and runoff; the land use data representing the year 2004 were therefore used as inputs in the 2008 runoff model. The study demonstrates that rainfall change has a significant impact on the peak discharge and runoff depth for the study area.
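
    Event-based loss modelling in HEC-HMS is commonly done with the SCS curve-number method, in which the per-event runoff depth follows from two lines of algebra. A Python sketch with illustrative values, not the Kelantan calibration:

      # SCS curve-number event runoff: Q = (P - Ia)^2 / (P - Ia + S), Ia = 0.2 S.
      # CN and rainfall depths below are illustrative assumptions.
      def scs_runoff_depth(P_mm, CN):
          S = 25400.0 / CN - 254.0        # potential retention (mm)
          Ia = 0.2 * S                    # initial abstraction (mm)
          if P_mm <= Ia:
              return 0.0
          return (P_mm - Ia) ** 2 / (P_mm - Ia + S)

      for P in (50.0, 150.0, 300.0):      # monsoon event rainfall depths (mm)
          print(f"P={P:5.1f} mm  ->  Q={scs_runoff_depth(P, CN=75):6.1f} mm")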