WorldWideScience

Sample records for event frequency based

  1. Event group importance measures for top event frequency analyses

    International Nuclear Information System (INIS)

    1995-01-01

    Three traditional importance measures, risk reduction, partial derivative, and variance reduction, have been extended to permit analyses of the relative importance of groups of underlying failure rates to the frequencies of resulting top events. The partial derivative importance measure was extended by assessing the contribution of a group of events to the gradient of the top event frequency. Given the moments of the distributions that characterize the uncertainties in the underlying failure rates, the expectation values of the top event frequency, its variance, and all of the new group importance measures can be quantified exactly for two familiar cases: (1) when all underlying failure rates are presumed independent, and (2) when pairs of failure rates based on common data are treated as being equal (totally correlated). In these cases, the new importance measures, which can also be applied to assess the importance of individual events, obviate the need for Monte Carlo sampling. The event group importance measures are illustrated using a small example problem and demonstrated by applications made as part of a major reactor facility risk assessment. These illustrations and applications indicate both the utility and the versatility of the event group importance measures.

  2. Event group importance measures for top event frequency analyses

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-31

    Three traditional importance measures, risk reduction, partial derivative, and variance reduction, have been extended to permit analyses of the relative importance of groups of underlying failure rates to the frequencies of resulting top events. The partial derivative importance measure was extended by assessing the contribution of a group of events to the gradient of the top event frequency. Given the moments of the distributions that characterize the uncertainties in the underlying failure rates, the expectation values of the top event frequency, its variance, and all of the new group importance measures can be quantified exactly for two familiar cases: (1) when all underlying failure rates are presumed independent, and (2) when pairs of failure rates based on common data are treated as being equal (totally correlated). In these cases, the new importance measures, which can also be applied to assess the importance of individual events, obviate the need for Monte Carlo sampling. The event group importance measures are illustrated using a small example problem and demonstrated by applications made as part of a major reactor facility risk assessment. These illustrations and applications indicate both the utility and the versatility of the event group importance measures.

  3. Initiating events frequency determination

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2004-01-01

    The paper describes work performed for the Nuclear Power Station (NPS), related to the periodic initiating event frequency update for the Probabilistic Safety Assessment (PSA). Data for all relevant NPS initiating events (IE) were reviewed, with the main focus on events occurring during the most recent operating history (i.e., the last four years). The final IE frequencies were estimated by incorporating both NPS experience and nuclear industry experience. Each event was categorized according to the NPS individual plant examination (IPE) initiating-event grouping approach. For the majority of the IE groups, few or no events have occurred at the NPS; for those groups, the final estimate was made by means of a Bayesian update with general nuclear industry values. The exception is rare loss-of-coolant-accident (LOCA) events, where an evaluation of engineering aspects is used to determine the frequency. (author)
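The Bayesian update described above can be sketched with a conjugate gamma-Poisson model, a common choice for initiating-event frequencies. The prior parameters and observation counts below are illustrative assumptions, not values from the paper.

```python
def bayesian_update_frequency(prior_alpha, prior_beta, n_events, exposure_years):
    """Conjugate gamma-Poisson update of an initiating-event frequency.

    A gamma(prior_alpha, prior_beta) prior encodes generic industry
    experience (mean = alpha / beta, in events per reactor-year); the
    plant then observes n_events over exposure_years.
    """
    post_alpha = prior_alpha + n_events
    post_beta = prior_beta + exposure_years
    mean = post_alpha / post_beta
    variance = post_alpha / post_beta ** 2
    return post_alpha, post_beta, mean, variance

# Hypothetical numbers: an industry prior equivalent to 5 events in
# 50 reactor-years, updated with 0 plant events in the last 4 years.
a, b, mean, var = bayesian_update_frequency(5.0, 50.0, 0, 4.0)
# posterior mean ~ 0.093 events/yr, pulled slightly below the prior mean 0.1
```

With sparse plant-specific evidence the posterior mean stays close to the industry prior, which is exactly why the Bayesian update is preferred over a raw plant-specific rate for IE groups with few or no events.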

  4. Development of transient initiating event frequencies for use in probabilistic risk assessments

    International Nuclear Information System (INIS)

    Mackowiak, D.P.; Gentillon, C.D.; Smith, K.L.

    1985-05-01

    Transient initiating event frequencies are an essential input to the analysis process of a nuclear power plant probabilistic risk assessment. These frequencies describe events causing or requiring scrams. This report documents an effort to validate and update from other sources a computer-based data file developed by the Electric Power Research Institute (EPRI) describing such events at 52 United States commercial nuclear power plants. Operating information from the United States Nuclear Regulatory Commission on 24 additional plants from their date of commercial operation has been combined with the EPRI data, and the entire data base has been updated to add 1980 through 1983 events for all 76 plants. The validity of the EPRI data and data analysis methodology and the adequacy of the EPRI transient categories are examined. New transient initiating event frequencies are derived from the expanded data base using the EPRI transient categories and data display methods. Upper bounds for these frequencies are also provided. Additional analyses explore changes in the dominant transients, changes in transient outage times and their impact on plant operation, and the effects of power level and scheduled scrams on transient event frequencies. A more rigorous data analysis methodology is developed to encourage further refinement of the transient initiating event frequencies derived herein. Updating the transient event data base resulted in approximately 2400 events being added to EPRI's approximately 3000-event data file. The resulting frequency estimates were in most cases lower than those reported by EPRI, but no significant order-of-magnitude changes were noted. The average number of transients per year for the combined data base is 8.5 for pressurized water reactors and 7.4 for boiling water reactors.

  5. Development of transient initiating event frequencies for use in probabilistic risk assessments

    Energy Technology Data Exchange (ETDEWEB)

    Mackowiak, D.P.; Gentillon, C.D.; Smith, K.L.

    1985-05-01

    Transient initiating event frequencies are an essential input to the analysis process of a nuclear power plant probabilistic risk assessment. These frequencies describe events causing or requiring scrams. This report documents an effort to validate and update from other sources a computer-based data file developed by the Electric Power Research Institute (EPRI) describing such events at 52 United States commercial nuclear power plants. Operating information from the United States Nuclear Regulatory Commission on 24 additional plants from their date of commercial operation has been combined with the EPRI data, and the entire data base has been updated to add 1980 through 1983 events for all 76 plants. The validity of the EPRI data and data analysis methodology and the adequacy of the EPRI transient categories are examined. New transient initiating event frequencies are derived from the expanded data base using the EPRI transient categories and data display methods. Upper bounds for these frequencies are also provided. Additional analyses explore changes in the dominant transients, changes in transient outage times and their impact on plant operation, and the effects of power level and scheduled scrams on transient event frequencies. A more rigorous data analysis methodology is developed to encourage further refinement of the transient initiating event frequencies derived herein. Updating the transient event data base resulted in approximately 2400 events being added to EPRI's approximately 3000-event data file. The resulting frequency estimates were in most cases lower than those reported by EPRI, but no significant order-of-magnitude changes were noted. The average number of transients per year for the combined data base is 8.5 for pressurized water reactors and 7.4 for boiling water reactors.

  6. Under-Frequency Load Shedding Technique Considering Event-Based for an Islanded Distribution Network

    Directory of Open Access Journals (Sweden)

    Hasmaini Mohamad

    2016-06-01

    One of the biggest challenges for an islanding operation is to sustain frequency stability. A large power imbalance following islanding causes under-frequency, so an appropriate control is required to shed a certain amount of load. The main objective of this research is to develop an adaptive under-frequency load shedding (UFLS) technique for an islanded system. The technique is event-based: it responds both to the moment the system is islanded and to the tripping of any DG unit during islanding operation. The disturbance magnitude is calculated to determine the amount of load to be shed. The technique is modeled using the PSCAD simulation tool, and simulation studies on a distribution network with mini hydro generation are carried out to evaluate the UFLS model under different load conditions: peak and base load. Results show that the load shedding technique successfully sheds the required amount of load and stabilizes the system frequency.
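The disturbance-magnitude step can be sketched from the swing equation. The inertia constant, nominal frequency, feeder names, and shedding order below are hypothetical; the paper's PSCAD model is far more detailed than this sketch.

```python
def power_deficit_mw(rocof_hz_s, inertia_h_s, f_nominal_hz, base_mw):
    """Estimate the power imbalance from the initial rate of change of
    frequency (ROCOF) using the swing equation dP = 2*H*(df/dt)/f_n,
    expressed in per unit and scaled to the MW base."""
    return 2.0 * inertia_h_s * rocof_hz_s / f_nominal_hz * base_mw

def loads_to_shed(deficit_mw, feeders):
    """Shed feeders in priority order until the shed load covers the
    deficit; `feeders` is a list of (name, load_mw) pairs."""
    shed, total = [], 0.0
    for name, load_mw in feeders:
        if total >= deficit_mw:
            break
        shed.append(name)
        total += load_mw
    return shed, total

# Hypothetical islanded network: H = 3 s, 50 Hz nominal, 10 MW base,
# and an initial ROCOF of -1 Hz/s measured right after islanding.
deficit = -power_deficit_mw(-1.0, 3.0, 50.0, 10.0)  # 1.2 MW shortfall
shed, shed_mw = loads_to_shed(deficit, [("F1", 0.5), ("F2", 0.5), ("F3", 0.5)])
```

An adaptive scheme recomputes the deficit at each event (islanding instant or DG trip) rather than shedding fixed stages, which is the distinction the abstract draws.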

  7. Pattern recognition based on time-frequency analysis and convolutional neural networks for vibrational events in φ-OTDR

    Science.gov (United States)

    Xu, Chengjin; Guan, Junjun; Bao, Ming; Lu, Jiangang; Ye, Wei

    2018-01-01

    Based on vibration signals detected by a phase-sensitive optical time-domain reflectometer distributed optical fiber sensing system, this paper presents an implementation of time-frequency analysis and a convolutional neural network (CNN), used to classify different types of vibrational events. First, spectral subtraction and the short-time Fourier transform are used to enhance time-frequency features of vibration signals and transform different types of vibration signals into spectrograms, which are input to the CNN for automatic feature extraction and classification. Finally, by replacing the softmax layer in the CNN with a multiclass support vector machine, the performance of the classifier is enhanced. Experiments show that after using this method to process 4000 vibration signal samples generated by four different vibration events, namely, digging, walking, vehicles passing, and damaging, the recognition rates of vibration events are over 90%. The experimental results prove that this method can automatically make an effective feature selection and greatly improve the classification accuracy of vibrational events in distributed optical fiber sensing systems.
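The spectrogram step can be sketched as a Hann-windowed short-time Fourier transform; the frame length and hop size below are illustrative assumptions, and the paper's pipeline additionally applies spectral subtraction and feeds the spectrograms to a CNN, which is omitted here.

```python
import cmath
import math

def stft_spectrogram(signal, frame_len=64, hop=32):
    """Magnitude spectrogram of a 1-D signal: split into Hann-windowed
    frames, then transform each frame with a naive O(n^2) DFT (a real
    pipeline would use an FFT). Returns a time x frequency list of
    magnitudes for the non-negative frequency bins."""
    window = [0.5 - 0.5 * math.cos(2 * math.pi * n / (frame_len - 1))
              for n in range(frame_len)]
    frames = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = [signal[start + n] * window[n] for n in range(frame_len)]
        spectrum = []
        for k in range(frame_len // 2 + 1):
            s = sum(frame[n] * cmath.exp(-2j * math.pi * k * n / frame_len)
                    for n in range(frame_len))
            spectrum.append(abs(s))
        frames.append(spectrum)
    return frames
```

A pure tone at DFT bin 8 concentrates its energy there, which is the time-frequency localization the classifier relies on to separate digging, walking, and vehicle signatures.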

  8. Grid Frequency Extreme Event Analysis and Modeling: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Florita, Anthony R [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clark, Kara [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gevorgian, Vahan [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Folgueras, Maria [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wenger, Erin [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-11-01

    Sudden losses of generation or load can lead to instantaneous changes in electric grid frequency and voltage. Extreme frequency events pose a major threat to grid stability. As renewable energy sources supply power to grids in increasing proportions, it becomes increasingly important to examine when and why extreme events occur to prevent destabilization of the grid. To better understand frequency events, including extrema, historic data were analyzed to fit probability distribution functions to various frequency metrics. Results showed that a standard Cauchy distribution fit the difference between the frequency nadir and prefault frequency (f_(C-A)) metric well, a standard Cauchy distribution fit the settling frequency (f_B) metric well, and a standard normal distribution fit the difference between the settling frequency and frequency nadir (f_(B-C)) metric very well. Results were inconclusive for the frequency nadir (f_C) metric, meaning it likely has a more complex distribution than those tested. This probabilistic modeling should facilitate more realistic modeling of grid faults.
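Fitting a heavy-tailed Cauchy distribution to a frequency metric can be sketched with robust order statistics, since the Cauchy has no finite moments. The samples below are synthetic; the study's actual frequency metrics and fitting procedure are not reproduced here.

```python
import math
import random
import statistics

def fit_cauchy_robust(samples):
    """Robust location/scale estimates for a Cauchy distribution:
    the sample median estimates the location, and half the
    interquartile range estimates the scale (for a Cauchy,
    IQR = 2 * scale)."""
    s = sorted(samples)
    loc = statistics.median(s)
    q1 = s[len(s) // 4]
    q3 = (3 * len(s)) // 4
    return loc, (s[q3] - q1) / 2.0

# Synthetic check: draw standard Cauchy(0, 1) samples via the inverse CDF.
random.seed(42)
samples = [math.tan(math.pi * (random.random() - 0.5)) for _ in range(20000)]
loc, scale = fit_cauchy_robust(samples)
```

Sample means and standard deviations are meaningless for Cauchy-distributed metrics, which is why quantile-based estimators are the natural choice when a Cauchy fit is indicated, as it is here for the f_(C-A) and f_B metrics.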

  9. Recurrent frequency-size distribution of characteristic events

    Directory of Open Access Journals (Sweden)

    S. G. Abaimov

    2009-04-01

    Statistical frequency-size (frequency-magnitude) properties of earthquake occurrence play an important role in seismic hazard assessments. The behavior of earthquakes is represented by two different statistics: interoccurrent behavior in a region and recurrent behavior at a given point on a fault (or at a given fault). The interoccurrent frequency-size behavior has been investigated by many authors and generally obeys the power-law Gutenberg-Richter distribution to a good approximation. It is expected that the recurrent frequency-size behavior should obey different statistics. However, this problem has received little attention because historic earthquake sequences do not contain enough events to reconstruct the necessary statistics. To overcome this lack of data, this paper investigates the recurrent frequency-size behavior for several problems. First, the sequences of creep events on a creeping section of the San Andreas fault are investigated. The applicability of the Brownian passage-time, lognormal, and Weibull distributions to the recurrent frequency-size statistics of slip events is tested and the Weibull distribution is found to be the best-fit distribution. To verify this result the behaviors of numerical slider-block and sand-pile models are investigated and the Weibull distribution is confirmed as the applicable distribution for these models as well. Exponents β of the best-fit Weibull distributions for the observed creep event sequences and for the slider-block model are found to have similar values ranging from 1.6 to 2.2 with the corresponding aperiodicities CV of the applied distribution ranging from 0.47 to 0.64. We also note similarities between recurrent time-interval statistics and recurrent frequency-size statistics.
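The link the abstract draws between the Weibull exponent β and the aperiodicity CV can be checked directly, because the coefficient of variation of a Weibull distribution depends only on the shape parameter: CV = sqrt(Γ(1 + 2/β) / Γ(1 + 1/β)² − 1).

```python
import math

def weibull_cv(beta):
    """Aperiodicity (coefficient of variation) of a Weibull distribution
    with shape parameter beta; it is independent of the scale parameter."""
    g1 = math.gamma(1.0 + 1.0 / beta)
    g2 = math.gamma(1.0 + 2.0 / beta)
    return math.sqrt(g2 / g1 ** 2 - 1.0)

# The reported shape range maps onto the reported aperiodicity range
# (larger beta means more periodic recurrence, hence smaller CV):
cv_range = [round(weibull_cv(b), 2) for b in (1.6, 2.2)]  # ~[0.64, 0.48]
```

This one-line check reproduces the abstract's pairing of β = 1.6-2.2 with CV = 0.47-0.64.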

  10. Frequency of adverse events after vaccination with different vaccinia strains.

    Directory of Open Access Journals (Sweden)

    Mirjam Kretzschmar

    2006-08-01

    BACKGROUND: Large quantities of smallpox vaccine have been stockpiled to protect entire nations against a possible reintroduction of smallpox. Planning for an appropriate use of these stockpiled vaccines in response to a smallpox outbreak requires a rational assessment of the risks of vaccination-related adverse events, compared to the risk of contracting an infection. Although considerable effort has been made to understand the dynamics of smallpox transmission in modern societies, little attention has been paid to estimating the frequency of adverse events due to smallpox vaccination. Studies exploring the consequences of smallpox vaccination strategies have commonly used a frequency of approximately one death per million vaccinations, which is based on a study of vaccination with the New York City Board of Health (NYCBH) strain of vaccinia virus. However, a multitude of historical studies of smallpox vaccination with other vaccinia strains suggest that there are strain-related differences in the frequency of adverse events after vaccination. Because many countries have stockpiled vaccine based on the Lister strain of vaccinia virus, a quantitative evaluation of the adverse effects of such vaccines is essential for emergency response planning. We conducted a systematic review and statistical analysis of historical data concerning vaccination against smallpox with different strains of vaccinia virus. METHODS AND FINDINGS: We analyzed historical vaccination data extracted from the literature. We extracted data on the frequency of postvaccinal encephalitis and death with respect to vaccinia strain and age of vaccinees. Using a hierarchical Bayesian approach for meta-analysis, we estimated the expected frequencies of postvaccinal encephalitis and death with respect to age at vaccination for smallpox vaccines based on the NYCBH and Lister vaccinia strains.
We found large heterogeneity between findings from different studies and a time-period effect.

  11. Utilization of Satellite Data to Identify and Monitor Changes in Frequency of Meteorological Events

    Science.gov (United States)

    Mast, J. C.; Dessler, A. E.

    2017-12-01

    Increases in temperature and climate variability due to human-induced climate change are increasing the frequency and magnitude of extreme heat events (i.e., heatwaves). This will have a detrimental impact on the health of human populations and the habitability of certain land locations. Here we seek to utilize satellite data records to identify and monitor extreme heat events. We analyze satellite data sets (MODIS and AIRS land surface temperatures (LST) and water vapor profiles (WV)) due to their global coverage and stable calibration. Heat waves are identified based on the frequency of maximum daily temperatures above a threshold, determined as follows. Land surface temperatures are gridded into uniform latitude/longitude bins. Maximum daily temperatures per bin are determined and probability density functions (PDF) of these maxima are constructed monthly and seasonally. For each bin, a threshold is calculated at the 95th percentile of the PDF of maximum temperatures, and an extreme heat event is defined based on the frequency of monthly and seasonal days exceeding the threshold. To account for the decreased ability of the human body to thermoregulate with increasing moisture, and to assess the lethality of the heat events, we determine the wet-bulb temperature at the locations of extreme heat events. Preliminary results will be presented.
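The per-bin threshold rule can be sketched as follows. The nearest-rank percentile definition is an assumption, since the abstract does not specify which percentile estimator is used.

```python
import math

def nearest_rank_percentile(values, pct):
    """Nearest-rank percentile: the smallest sample value such that at
    least pct percent of the sample lies at or below it."""
    s = sorted(values)
    k = max(1, math.ceil(pct * len(s) / 100.0))
    return s[k - 1]

def extreme_heat_days(daily_max, pct=95.0):
    """For one grid bin, compute the percentile threshold from the daily
    maximum temperatures and count the days exceeding it."""
    threshold = nearest_rank_percentile(daily_max, pct)
    return threshold, sum(1 for t in daily_max if t > threshold)

# Toy bin with 100 daily maxima: the 95th-percentile threshold leaves
# roughly 5% of days flagged as extreme.
threshold, days = extreme_heat_days(list(range(1, 101)))
```

Because the threshold is computed per bin, hot and cool climates each get a locally meaningful definition of "extreme", which is the point of the percentile-based approach over a fixed absolute temperature.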

  12. Shallow very-low-frequency earthquakes accompany slow slip events in the Nankai subduction zone.

    Science.gov (United States)

    Nakano, Masaru; Hori, Takane; Araki, Eiichiro; Kodaira, Shuichi; Ide, Satoshi

    2018-03-14

    Recent studies of slow earthquakes along plate boundaries have shown that tectonic tremor, low-frequency earthquakes, very-low-frequency events (VLFEs), and slow-slip events (SSEs) often accompany each other and appear to share common source faults. However, the source processes of slow events occurring in the shallow part of plate boundaries are not well known because seismic observations have been limited to land-based stations, which offer poor resolution beneath offshore plate boundaries. Here we use data obtained from seafloor observation networks in the Nankai trough, southwest of Japan, to investigate shallow VLFEs in detail. Coincident with the VLFE activity, signals indicative of shallow SSEs were detected by geodetic observations at seafloor borehole observatories in the same region. We find that the shallow VLFEs and SSEs share common source regions and almost identical time histories of moment release. We conclude that these slow events arise from the same fault slip and that VLFEs represent relatively high-frequency fluctuations of slip during SSEs.

  13. Mutational jackpot events generate effective frequency-dependent selection in adapting populations

    Science.gov (United States)

    Hallatschek, Oskar

    The site-frequency spectrum is one of the most easily measurable quantities that characterize the genetic diversity of a population. While most neutral models predict that site frequency spectra should decay with increasing frequency, a high-frequency uptick has been reported in many populations. Anomalies in the high-frequency tail are particularly unsettling because the highest frequencies can be measured with greatest accuracy. Here, we show that an uptick in the spectrum of neutral mutations generally arises when mutant frequencies are dominated by rare jackpot events, mutational events with large descendant numbers. This leads to an effective pattern of frequency-dependent selection (or unstable internal equilibrium at one half frequency) that causes an accumulation of high-frequency polymorphic sites. We reproduce the known uptick occurring for recurrent hitchhiking (genetic draft) as well as rapid adaptation, and (in the future) generalize the shape of the high-frequency tail to other scenarios that are dominated by jackpot events, such as frequent range expansions. We also tackle (in the future) the inverse approach to use the high-frequency uptick for learning about the tail of the offspring number distribution. Supported by an NSF Career Award (PoLS), NIH NIGMS R01, Simons Foundation.

  14. Estimation of initiating event frequency for external flood events by extreme value theorem

    International Nuclear Information System (INIS)

    Chowdhury, Sourajyoti; Ganguly, Rimpi; Hari, Vibha

    2017-01-01

    External flood is an important common cause initiating event in nuclear power plants (NPPs). It may potentially lead to severe core damage (SCD) by first causing the failure of the systems required for maintaining the heat sinks and then by contributing to failures of the engineered systems designed to mitigate such failures. The sample NPP taken here is a twin-unit 220 MWe Indian standard pressurized heavy water reactor (PHWR) situated inland. A comprehensive in-house Level-1 internal events PSA for full power had already been performed. The external flood assessment was further conducted in the area of external hazard risk assessment in response to post-Fukushima measures taken in the nuclear industry. The present paper describes the methodology to calculate the initiating event (IE) frequency for external flood events for the sample inland Indian NPP. Generalized extreme value (GEV) theory based on the maximum likelihood method (MLM) and an order statistics approach (OSA) is used to analyse the rainfall data for the site. The thousand-year return level and the necessary return periods for extreme rainfall are evaluated. These results, along with plant-specific topographical calculations, quantitatively establish that external flooding resulting from upstream dam break, river flooding and heavy rainfall (flash flood) would be unlikely for the sample NPP in consideration.
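A simplified sketch of the return-level calculation, using the Gumbel (shape ξ = 0) member of the GEV family fitted by the method of moments rather than the paper's maximum likelihood and order-statistics fits; the annual-maximum rainfall values are invented for illustration.

```python
import math
import statistics

def gumbel_return_level(annual_maxima, return_period_years):
    """Return level (e.g. mm of rainfall) exceeded on average once per
    return_period_years, from a Gumbel fit to annual maxima:
        scale = s * sqrt(6) / pi,  loc = mean - gamma * scale,
        z_T = loc - scale * ln(-ln(1 - 1/T)),
    where gamma is the Euler-Mascheroni constant."""
    mean = statistics.mean(annual_maxima)
    std = statistics.stdev(annual_maxima)
    scale = std * math.sqrt(6.0) / math.pi
    loc = mean - 0.5772156649 * scale
    p = 1.0 - 1.0 / return_period_years  # annual non-exceedance probability
    return loc - scale * math.log(-math.log(p))

# Invented annual-maximum daily rainfall (cm) for a short record:
maxima = [10, 12, 14, 16, 18]
z1000 = gumbel_return_level(maxima, 1000)  # thousand-year return level
```

The return level grows roughly linearly in ln(T) under the Gumbel model, so the thousand-year level sits far above the observed maxima even for a short record; the full GEV (with a fitted shape parameter) can bend this growth up or down.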

  15. Classification of hydromagnetic emissions based on frequency--time spectra

    International Nuclear Information System (INIS)

    Fukunishi, H.; Toya, T.; Koike, K.; Kuwashima, M.; Kawamura, M.

    1981-01-01

    By using 3035 hydromagnetic emission events observed in the frequency range of 0.1-2.0 Hz at Syowa (L ≈ 6), HM emissions have been classified into eight subtypes based on their spectral structures, i.e., HM whistler, periodic HM emission, HM chorus, HM emission burst, IPDP, morning IPDP, Pc 1-2 band, and irregular HM emission. It is seen that each subtype has a preferential magnetic local time interval and also a frequency range for its occurrence. Morning IPDP events and irregular HM emissions occur in the magnetic morning hours, while dispersive periodic HM emissions and HM emission bursts occur around magnetic local noon; HM chorus emissions then occur in the afternoon hours and IPDP events occur in the evening hours. Furthermore, it is noticed that the mid-frequencies of these emissions vary from high frequencies in the morning hours to low frequencies in the afternoon hours. On the basis of these results, the generation mechanisms of each subtype are discussed.

  16. A Method to Quantify Plant Availability and Initiating Event Frequency Using a Large Event Tree, Small Fault Tree Model

    International Nuclear Information System (INIS)

    Kee, Ernest J.; Sun, Alice; Rodgers, Shawn; Popova, Elmira V.; Nelson, Paul; Moiseytseva, Vera; Wang, Eric

    2006-01-01

    South Texas Project uses a large fault tree to produce scenarios (minimal cut sets) used in quantification of plant availability and event frequency predictions. On the other hand, the South Texas Project probabilistic risk assessment model uses a large event tree, small fault tree for quantifying core damage and radioactive release frequency predictions. The South Texas Project is converting its availability and event frequency model to use a large event tree, small fault tree model in an effort to streamline application support and to provide additional detail in results. The availability and event frequency model, as well as the applications it supports (maintenance and operational risk management, system engineering health assessment, preventive maintenance optimization, and RIAM), are briefly described. A methodology to perform availability modeling in a large event tree, small fault tree framework is described in detail, as is how the methodology can be used to support South Texas Project maintenance and operations risk management. Differences from other fault tree methods and other recently proposed methods are discussed in detail. While the methods described are novel to the South Texas Project Risk Management program and to large event tree, small fault tree models, the concepts in the area of application support and availability modeling have wider applicability to the industry. (authors)

  17. A Study on the Frequency of Initiating Event of OPR-1000 during Outage Periods

    Energy Technology Data Exchange (ETDEWEB)

    Hong Jae Beol; Jae, Moo Sung [Hanyang Univ., Seoul (Korea, Republic of)

    2013-10-15

    Existing sources of data did not reflect the latest event data from PWR outages in the frequencies of initiating events. The Electric Power Research Institute (EPRI) in the USA collected data on loss of decay heat removal during outage from 1989 to 2009 and published a technical report. Domestic operating experience for LOOP is gathered in the Operational Performance Information System for Nuclear Power Plants (OPIS). To reduce conservatism and obtain completeness for LPSD PSA, those data should be collected and used to update the frequencies. The frequencies of LOSDC and LOOP are re-evaluated in this paper using the EPRI and OPIS data, and quantification is conducted to recalculate the core damage frequency (CDF), since the rates have changed. The results are discussed below. To make an accurate estimate of the initiating events of LPSD PSA, the event data were collected and the frequencies of initiating events were updated using a Bayesian approach. The CDF was evaluated through quantification; ΔCDF is -40%, and the dominant contributor is the pressurizer PSV stuck-open event. Most of the event data in the EPRI TR were collected from the US nuclear power plant industry, and those data are not enough to evaluate outage risk precisely. Therefore, to reduce conservatism and obtain completeness for LPSD PSA, licensee event reports and domestic data should be collected and reflected in the frequencies of the initiating events during outage.

  18. Procedures for the external event core damage frequency analyses for NUREG-1150

    International Nuclear Information System (INIS)

    Bohn, M.P.; Lambright, J.A.

    1990-11-01

    This report presents methods which can be used to perform the assessment of risk due to external events at nuclear power plants. These methods were used to perform the external events risk assessments for the Surry and Peach Bottom nuclear power plants as part of the NRC-sponsored NUREG-1150 risk assessments. These methods apply to the full range of hazards such as earthquakes, fires, floods, etc., which are collectively known as external events. The methods described in this report have been developed under NRC sponsorship and represent, in many cases, both advancements and simplifications over techniques that have been used in past years. They also include the most up-to-date data bases on equipment seismic fragilities, fire occurrence frequencies and fire damageability thresholds. The methods described here are based on making full utilization of the power plant systems logic models developed in the internal events analyses. By making full use of the internal events models, one obtains an external events analysis that is consistent both in nomenclature and in level of detail with the internal events analyses and, in addition, automatically includes all the appropriate random and test/maintenance unavailabilities. 50 refs., 9 figs., 11 tabs.

  19. Systematic review on the prevalence, frequency and comparative value of adverse events data in social media

    Science.gov (United States)

    Golder, Su; Norman, Gill; Loke, Yoon K

    2015-01-01

    Aim The aim of this review was to summarize the prevalence, frequency and comparative value of information on the adverse events of healthcare interventions from user comments and videos in social media. Methods A systematic review of assessments of the prevalence or type of information on adverse events in social media was undertaken. Sixteen databases and two internet search engines were searched in addition to handsearching, reference checking and contacting experts. The results were sifted independently by two researchers. Data extraction and quality assessment were carried out by one researcher and checked by a second. The quality assessment tool was devised in-house and a narrative synthesis of the results followed. Results From 3064 records, 51 studies met the inclusion criteria. The studies assessed over 174 social media sites with discussion forums (71%) being the most popular. The overall prevalence of adverse events reports in social media varied from 0.2% to 8% of posts. Twenty-nine studies compared the results from searching social media with using other data sources to identify adverse events. There was general agreement that a higher frequency of adverse events was found in social media and that this was particularly true for ‘symptom’ related and ‘mild’ adverse events. Those adverse events that were under-represented in social media were laboratory-based and serious adverse events. Conclusions Reports of adverse events are identifiable within social media. However, there is considerable heterogeneity in the frequency and type of events reported, and the reliability or validity of the data has not been thoroughly evaluated. PMID:26271492

  20. On the Onset Frequency of Metric Type II Radio Bursts and the Longitudinal Extent of the Associated SEP Events

    Science.gov (United States)

    Makela, P. A.; Gopalswamy, N.; Yashiro, S.; Thakur, N.; Akiyama, S.; Xie, H.

    2017-12-01

    In a recent study Gopalswamy et al. (2017, J. Phys. Conf. Ser., Proc. 16th AIAC) found that the ground level enhancements (GLEs), regular solar energetic particle (SEP) events and filament eruption (FE) associated SEP events have distinct average starting frequencies of the associated type II bursts, although the distributions overlap. They also found that the initial acceleration of the coronal mass ejections (CMEs) associated with the three groups were distinct. Based on these earlier results emphasizing a hierarchical relationship of CME kinematics and SEP events, we studied the possible dependence between the longitudinal spread of the SEP events and the onset frequency of metric type II. The studied >25 MeV SEP events are from the list of Richardson et al. (2014, Sol. Phys. 289) covering the first seven years of the STEREO mission. However, our preliminary results show only a weak correlation between the extent of the SEP event and the onset frequency of the metric type II radio burst.

  1. Neural network approach to the prediction of seismic events based on low-frequency signal monitoring of the Kuril-Kamchatka and Japanese regions

    Directory of Open Access Journals (Sweden)

    Irina Popova

    2013-08-01

    Very-low-frequency/low-frequency (VLF/LF) sub-ionospheric radiowave monitoring has been widely used in recent years to analyze earthquake preparatory processes. The connection between earthquakes with M ≥ 5.5 and nighttime disturbances of signal amplitude and phase has been established. Thus, it is possible to use nighttime anomalies of VLF/LF signals as earthquake precursors. Here, we propose a method for estimating the sensitivity of VLF/LF signals to seismic processes using a neural network approach. We apply the error back-propagation technique, based on a three-level perceptron, to predict a seismic event. The back-propagation technique involves two main stages to solve the problem; namely, network training, and recognition (the prediction itself). To train the neural network, we first create a so-called 'training set'. The 'teacher' specifies the correspondence between the chosen input and the output data. In the present case, a representative database includes both the LF data received over three years of monitoring at the station in Petropavlovsk-Kamchatsky (2005-2007), and the seismicity parameters of the Kuril-Kamchatka and Japanese regions. At the first stage, the neural network established the relationship between the characteristic features of the LF signal (the mean and dispersion of phase and amplitude at nighttime for a few days before a seismic event) and the corresponding level of correlation with a seismic event, or the absence of a seismic event. At the second stage, the trained neural network was applied to predict seismic events from the LF data using twelve time intervals in 2004, 2005, 2006 and 2007. The results of the prediction are discussed.
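The two-stage scheme described above (train a small perceptron on labelled signal features, then use it for recognition) can be sketched generically. This is an illustrative one-hidden-layer perceptron trained by error back-propagation, not the authors' implementation; all function names and hyperparameters are assumptions.

```python
import math
import random

def train_mlp(samples, n_hidden=4, lr=0.5, epochs=2000, seed=0):
    # Minimal one-hidden-layer perceptron with sigmoid units, trained by
    # error back-propagation on squared error. `samples` is a list of
    # (feature_tuple, target) pairs, target in [0, 1].
    rng = random.Random(seed)
    n_in = len(samples[0][0])
    # Weights include a bias term (the trailing +1 column).
    w1 = [[rng.uniform(-1, 1) for _ in range(n_in + 1)] for _ in range(n_hidden)]
    w2 = [rng.uniform(-1, 1) for _ in range(n_hidden + 1)]
    sig = lambda x: 1.0 / (1.0 + math.exp(-x))
    for _ in range(epochs):
        for x, target in samples:
            xb = list(x) + [1.0]                      # input + bias
            h = [sig(sum(w * v for w, v in zip(row, xb))) for row in w1]
            hb = h + [1.0]                            # hidden + bias
            y = sig(sum(w * v for w, v in zip(w2, hb)))
            dy = (y - target) * y * (1 - y)           # output delta
            # Back-propagate to hidden weights using the *old* w2.
            for j in range(n_hidden):
                dh = dy * w2[j] * h[j] * (1 - h[j])
                for i in range(n_in + 1):
                    w1[j][i] -= lr * dh * xb[i]
            for j in range(n_hidden + 1):
                w2[j] -= lr * dy * hb[j]
    def predict(x):
        xb = list(x) + [1.0]
        hb = [sig(sum(w * v for w, v in zip(row, xb))) for row in w1] + [1.0]
        return sig(sum(w * v for w, v in zip(w2, hb)))
    return predict
```

In the paper's setting, the inputs would be the nighttime mean and dispersion of LF phase and amplitude over the days preceding a candidate event, and the target the presence or absence of a seismic event.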

  2. Attitudes of Montenegrin Consumers Toward Advertising Through Sport Among the Frequency of Watching Sports Events

    Directory of Open Access Journals (Sweden)

    Bojan Masanovic

    2018-01-01

    It is proposed that potential consumers form attitudes based on advertising through sport, and that these attitudes can influence decisions to purchase a particular advertiser's product. For this reason, it is important to analyse consumers' general attitudes toward advertising through sport, and this investigation was aimed at gaining relevant knowledge about the attitudes of Montenegrin consumers toward advertising through sport with regard to the frequency of watching sports events. The sample included 342 respondents, divided into six subsample groups: consumers who do not watch sports events at all, consumers who watch sports events for 1-30 minutes, 31-60 minutes, 61-90 minutes, or 91-120 minutes, and consumers who watch sports events for more than 120 minutes during a typical day. The sample of variables contained a system of three general attitudes, modelled on a seven-point Likert scale. The results were analysed by multivariate analysis (MANOVA) and univariate analysis (ANOVA) with a Post Hoc test. Based on the statistical analyses, significant differences were found at the multivariate level, as well as for all three variables, at a significance level of p=.00. Hence, it is worth highlighting that significant differences were found between the attitudes of consumers toward advertising through sport with regard to the frequency of watching sports events. These results are important for marketers, mostly because they cannot treat all potential consumers alike regarding the frequency with which they watch sports events. This was also the case in previous investigations, and this observation presents relevant information.

  3. Towards a Unified Understanding of Event-Related Changes in the EEG: The Firefly Model of Synchronization through Cross-Frequency Phase Modulation

    Science.gov (United States)

    Burgess, Adrian P.

    2012-01-01

    Although event-related potentials (ERPs) are widely used to study sensory, perceptual and cognitive processes, it remains unknown whether they are phase-locked signals superimposed upon the ongoing electroencephalogram (EEG) or result from phase-alignment of the EEG. Previous attempts to discriminate between these hypotheses have been unsuccessful but here a new test is presented based on the prediction that ERPs generated by phase-alignment will be associated with event-related changes in frequency whereas evoked-ERPs will not. Using empirical mode decomposition (EMD), which allows measurement of narrow-band changes in the EEG without predefining frequency bands, evidence was found for transient frequency slowing in recognition memory ERPs but not in simulated data derived from the evoked model. Furthermore, the timing of phase-alignment was frequency dependent with the earliest alignment occurring at high frequencies. Based on these findings, the Firefly model was developed, which proposes that both evoked and induced power changes derive from frequency-dependent phase-alignment of the ongoing EEG. Simulated data derived from the Firefly model provided a close match with empirical data and the model was able to account for i) the shape and timing of ERPs at different scalp sites, ii) the event-related desynchronization in alpha and synchronization in theta, and iii) changes in the power density spectrum from the pre-stimulus baseline to the post-stimulus period. The Firefly Model, therefore, provides not only a unifying account of event-related changes in the EEG but also a possible mechanism for cross-frequency information processing. PMID:23049827

  4. Potential Indoor Worker Exposure From Handling Area Leakage: Example Event Sequence Frequency Analysis

    International Nuclear Information System (INIS)

    Benke, Roland R.; Adams, George R.

    2008-01-01

    potential event sequences. A hypothetical case is presented for failure of the HVAC exhaust system to provide confinement for contaminated air from otherwise normal operations. This paper presents an example calculation of frequencies for a potential event sequence involving HVAC system failure during otherwise routine wet transfer operations of spent nuclear fuel assemblies from an open container. For the simplified HVAC exhaust system model, the calculation indicated that the potential event sequence may or may not be a Category 1 event sequence, in light of current uncertainties (e.g., final HVAC system design and duration of facility operations). Categorization of potential event sequences is important because different regulatory requirements and performance objectives are specified based on the categorization of event sequences. A companion paper presents a dose calculation methodology and example calculations of indoor worker consequences for the posed example event sequence. Together, the two companion papers demonstrate capabilities for performing confirmatory calculations of frequency and consequence, which may assist the assessment of worker safety during a risk-informed regulatory review of a potential DOE license application

  5. Solar micro-bursts of 22.2 GHz and their relationship to events observed at lower frequencies

    Energy Technology Data Exchange (ETDEWEB)

    Blakey, J R [Universidade Mackenzie, Sao Paulo (Brazil). Centro de Radio-Astronomia e Astrofisica

    1976-01-01

    Observations of McMath region 10433 at 22 GHz during July 1974, using a telescope with a 4 arc-minute beam, revealed the existence of events, or 'microbursts', with intensities below the sensitivity limit of normal solar patrol instruments. Many of these events were simply the high-frequency counterparts of more intense bursts observed at lower frequencies. This note considers the small number of events which suggest that the gyro-synchrotron mechanism alone is incapable of explaining the observations, and indicates that a thermal mechanism is needed to explain the high-frequency events.

  6. Characterization of the frequency and nature of bleed air contamination events in commercial aircraft.

    Science.gov (United States)

    Shehadi, M; Jones, B; Hosni, M

    2016-06-01

    Contamination of the bleed air used to pressurize and ventilate aircraft cabins is of concern due to the potential health and safety hazards for passengers and crew. Databases from the Federal Aviation Administration, NASA, and other sources were examined in detail to determine the frequency of bleed air contamination incidents. The frequency was examined on an aircraft-model basis with the intent of identifying aircraft makes and models with elevated frequencies of contamination events. The results reported herein may help investigators focus future studies of bleed air contamination incidents on a smaller number of aircraft. Incident frequency was normalized by the number of aircraft, number of flights, and flight hours for each model to account for the large variations in the number of aircraft of different models. The focus of the study was on aircraft models that are currently in service and are used by major airlines in the United States. Incidents examined in this study include those related to smoke, oil odors, fumes, and any symptom that might be related to exposure to such contamination, reported by crew members between 2007 and 2012, for US-based carriers for domestic flights and all international flights that either originated or terminated in the US. In addition to the reported frequency of incidents for different aircraft models, the analysis attempted to identify propulsion engines and auxiliary power units associated with aircraft that had higher frequencies of incidents. While substantial variations were found in the frequency of incidents, it was found that contamination events were widely distributed across nearly all common models of aircraft. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  7. Attitudes of Turkish Consumers toward Advertising through Sport among the Frequency of Watching Sports Events

    Directory of Open Access Journals (Sweden)

    Bojan Masanovic

    2017-08-01

    It is proposed that potential consumers form attitudes based on advertising through sport, and that these attitudes can influence decisions to purchase a particular advertiser's product. For this reason, it is important to analyse consumers' general attitudes toward advertising through sport, and this investigation was aimed at gaining relevant knowledge about the attitudes of Serbian consumers toward advertising through sport with regard to the frequency of watching sports events. The sample included 173 respondents, divided into six subsample groups: consumers who do not watch sports events at all, consumers who watch sports events for 1-30 minutes, 31-60 minutes, 61-90 minutes, or 91-120 minutes, and consumers who watch sports events for more than 120 minutes during a typical day. The sample of variables contained a system of three general attitudes, modelled on a seven-point Likert scale. The results were analysed by multivariate analysis (MANOVA) and univariate analysis (ANOVA) with a Post Hoc test. Based on the statistical analyses, significant differences were found at the multivariate level, as well as for two out of three variables, at a significance level of p=.05. Hence, it is worth highlighting that significant differences were found between the attitudes of consumers toward advertising through sport with regard to the frequency of watching sports events. These results are important for marketers, mostly because they cannot treat all potential consumers alike regarding the frequency with which they watch sports events. This was also the case in previous investigations, and this observation presents relevant information.

  8. An exploration of the relationship among valence, fading affect, rehearsal frequency, and memory vividness for past personal events.

    Science.gov (United States)

    Lindeman, Meghan I H; Zengel, Bettina; Skowronski, John J

    2017-07-01

    The affect associated with negative (or unpleasant) memories typically tends to fade faster than the affect associated with positive (or pleasant) memories, a phenomenon called the fading affect bias (FAB). We conducted a study to explore the mechanisms related to the FAB. A retrospective recall procedure was used to obtain three self-report measures (memory vividness, rehearsal frequency, affective fading) for both positive events and negative events. Affect for positive events faded less than affect for negative events, and positive events were recalled more vividly than negative events. The perceived vividness of an event (memory vividness) and the extent to which an event has been rehearsed (rehearsal frequency) were explored as possible mediators of the relation between event valence and affect fading. Additional models conceived of affect fading and rehearsal frequency as contributors to a memory's vividness. Results suggested that memory vividness was a plausible mediator of the relation between an event's valence and affect fading. Rehearsal frequency was also a plausible mediator of this relation, but only via its effects on memory vividness. Additional modelling results suggested that affect fading and rehearsal frequency were both plausible mediators of the relation between an event's valence and the event's rated memory vividness.

  9. Calculation of noninformative prior of reliability parameter and initiating event frequency with Jeffreys method

    International Nuclear Information System (INIS)

    He Jie; Zhang Binbin

    2013-01-01

    In the probabilistic safety assessment (PSA) of nuclear power plants, there are few historical records on some initiating event frequencies or component failures in industry. In order to determine the noninformative priors of such reliability parameters and initiating event frequencies, the Jeffreys method in Bayesian statistics was employed. The mathematical mechanism of the Jeffreys prior and the simplified constrained noninformative distribution (SCNID) were elaborated in this paper. The Jeffreys noninformative formulas and the credible intervals of the Gamma-Poisson and Beta-Binomial models were introduced. As an example, the small break loss-of-coolant accident (SLOCA) was employed to show the application of the Jeffreys prior in determining an initiating event frequency. The result shows that the Jeffreys method is an effective method for noninformative prior calculation. (authors)
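For the Gamma-Poisson model mentioned above, the Jeffreys prior has a closed form: it is Gamma(1/2, 0), so after observing n events in exposure time T the posterior is Gamma(n + 1/2, rate = T). A minimal sketch of that update (function name is illustrative, not from the paper):

```python
def jeffreys_poisson_posterior(n_events, exposure_time):
    # Jeffreys prior for a Poisson rate is Gamma(1/2, 0). Updating with
    # n events observed over exposure T gives a Gamma(n + 1/2, rate=T)
    # posterior for the initiating event frequency.
    shape = n_events + 0.5
    mean = shape / exposure_time        # posterior mean frequency
    var = shape / exposure_time ** 2    # posterior variance
    return shape, mean, var
```

Credible intervals then come from the quantiles of this Gamma distribution, e.g. the 5th and 95th percentiles for a 90% interval.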

  10. Evaluation of Frequency and Restoration time for Loss of Offsite Power events based on domestic operation experience

    International Nuclear Information System (INIS)

    Park, Jin-Hee; Han, Sang-Hoon; Lee, Ho Joong

    2006-01-01

    It is recognized that the availability of AC power is essential for the safe operation, shutdown, and accident recovery of commercial nuclear power plants (NPPs). Unavailability of AC power can have an important adverse impact on a plant's ability to recover accident sequences and maintain safe shutdown. The probabilistic safety assessments (PSA or PRA) performed for Korean NPPs also indicated that a loss of offsite power (LOOP) event and a station blackout (SBO) event can be important contributors to total risk, contributing from 30% to 70% of the total risk at some NPPs in Korea. Up to now, however, the LOOP frequency and subsequent restoration time, which are important inputs to plant probabilistic risk assessment, have relied upon foreign data. Therefore, in this paper, the actual LOOP events that occurred from 1978 to 2004 at commercial nuclear power plants in Korea are collected. A statistical analysis of LOOP frequency and restoration time is performed to support plant-specific and realistic risk models for NPPs in Korea. Additionally, an engineering analysis is performed to obtain insights about the LOOP events.
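An analysis of this kind typically reduces to estimating a LOOP occurrence frequency from event counts and exposure, and fitting a distribution to the observed restoration times. The sketch below assumes a simple point estimate and a moment-matched lognormal restoration model (a common, but not the only, choice in offsite-power studies); the names are illustrative, not the paper's.

```python
import math
import statistics

def loop_frequency(n_events, reactor_years):
    # Point estimate of LOOP frequency: events per reactor-year.
    return n_events / reactor_years

def lognormal_restoration_fit(durations_hours):
    # Fit a lognormal to restoration times by matching the moments of
    # the log-durations; the median restoration time is exp(mu).
    logs = [math.log(d) for d in durations_hours]
    mu = statistics.fmean(logs)
    sigma = statistics.stdev(logs)
    median = math.exp(mu)
    return mu, sigma, median
```

The fitted curve gives the probability that offsite power is not yet restored after a given time, which feeds directly into station blackout sequence quantification.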

  11. Estimation of average hazardous-event-frequency for allocation of safety-integrity levels

    International Nuclear Information System (INIS)

    Misumi, Y.; Sato, Y.

    1999-01-01

    frequencies are derived based on the fault-trees. Thus, new definitions regarding modes of operation for the allocation of Safety Integrity Levels and shortcut methods for estimation of hazardous-event frequencies are proposed
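Although the abstract is truncated, the "average hazardous-event frequency" it refers to is conventionally estimated, for a safety function operating in low-demand mode, as the demand rate multiplied by the average probability of failure on demand (the standard IEC 61508 low-demand relation). A minimal sketch under that assumption, with illustrative names:

```python
def hazardous_event_frequency(demand_rate, pfd_avg):
    # Low-demand mode: hazardous events occur when a demand arrives
    # while the safety function is in a dangerous failed state, so the
    # average hazardous-event frequency is demand rate x average PFD.
    return demand_rate * pfd_avg
```

For example, one demand every ten years (0.1/yr) against a safety function with an average PFD of 1e-2 yields a hazardous-event frequency of 1e-3 per year.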

  12. Estimation of frequency of occurrence of extreme natural external events of very high intensity on the base of (non)available data

    International Nuclear Information System (INIS)

    Holy, J.; Kolar, L.; Jaros, M.; Hladky, M.; Mlady, O.

    2014-01-01

    The relatively frequent natural external events are usually of minor safety importance, because NPPs are designed and operated, with a significant safety margin, to withstand their effects. Risk analysis is therefore typically devoted to natural events of exceptional intensity, which mostly have not occurred up to now, but which could still happen with low probability and critical consequences. Since 'direct' plant-specific data providing evidence of such events is not available, special data treatment and extrapolation methods have to be employed for frequency estimation. The paper summarizes a possible approach to estimating rare event frequencies by means of extrapolation from available data and points out potential problems and challenges encountered during the analysis. The general framework is commented on in the presentation, regarding the effects of the choice of probabilistic distribution (the Gumbel distribution versus others), methods of working with data records (Which observations should be taken out, and why?) and analysis of the quality of input data sets (Should data sets from different sources be mixed? Should 'old' observations be used?). In the first part of the paper, the approach used in the past to create the NPP Dukovany deterministic design basis for natural external events is summarized. The second, major part of the paper is devoted to incorporating the ideas of probabilistic safety assessment into the safety assessment of external hazards, including such specific topics as addressing the quality of available data records, discussion of possible violations of the common assumptions expected to hold under the rules of statistical data analysis and ways to remedy them, and the choice of the probabilistic distribution modeling data variability. Examples of results achieved for the NPP Dukovany site in the Czech Republic are given in the final section.
This paper represents a coordinated effort with participation of experts and staff
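The extrapolation step described above — fitting an extreme-value distribution to observed annual maxima and reading off the frequency of so-far-unobserved intensities — can be sketched with a method-of-moments Gumbel fit (one of the distribution choices the paper discusses; the function names are illustrative):

```python
import math
import statistics

EULER_GAMMA = 0.57721566490153286

def fit_gumbel(annual_maxima):
    # Method-of-moments fit of the Gumbel distribution, a common first
    # choice for annual-maximum series of natural hazards.
    mean = statistics.fmean(annual_maxima)
    std = statistics.stdev(annual_maxima)
    beta = std * math.sqrt(6.0) / math.pi    # scale parameter
    mu = mean - EULER_GAMMA * beta           # location parameter
    return mu, beta

def return_level(mu, beta, return_period_years):
    # Intensity exceeded on average once every T years: solves
    # F(x) = 1 - 1/T for the Gumbel CDF exp(-exp(-(x - mu)/beta)).
    p = 1.0 - 1.0 / return_period_years
    return mu - beta * math.log(-math.log(p))
```

The return level for, say, T = 10 000 years is exactly the kind of extrapolated design-basis intensity discussed in the paper, and its sensitivity to the choice of distribution and data set is the crux of the analysis.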

  13. Ground motion: frequency of occurrence versus amplitude of disturbing transient events

    International Nuclear Information System (INIS)

    Werner, K.L.

    1983-01-01

    Successful collider operation requires that ground motion not exceed certain tolerances. In this note it is pointed out that on occasion these tolerances are exceeded. The frequency of such events and their amplitudes, measured as a function of time of day, have been measured. An examination of the data leads one to conclude that most events are of cultural (i.e., man-made) origin. 2 references, 20 figures

  14. Low-Frequency Type III Bursts and Solar Energetic Particle Events

    Science.gov (United States)

    Gopalswamy, Nat; Makela, Pertti

    2010-01-01

    We analyzed the coronal mass ejections (CMEs), flares, and type II radio bursts associated with a set of six low-frequency type III bursts of long duration (>15 min), the criterion normally used to define these bursts. All but one of the type III bursts were not associated with a type II burst in the metric or longer wavelength domains. The bursts without type II bursts also lacked solar energetic particle (SEP) events at energies >25 MeV. The 1-MHz duration of the type III burst (28 min) is near the median value of type III durations found for gradual SEP events and ground level enhancement (GLE) events. Yet, there was no sign of SEP events. On the other hand, two other type III bursts from the same active region had similar durations but were accompanied by WAVES type II bursts; these bursts were also accompanied by SEP events detected by SOHO/ERNE. The CMEs were of similar speeds, and the flares were also of similar size and duration. This study suggests that the type III burst duration may not be a good indicator of an SEP event.

  15. Merging expert and empirical data for rare event frequency estimation: Pool homogenisation for empirical Bayes models

    International Nuclear Information System (INIS)

    Quigley, John; Hardman, Gavin; Bedford, Tim; Walls, Lesley

    2011-01-01

    Empirical Bayes provides one approach to estimating the frequency of rare events as a weighted average of the frequencies of an event and a pool of events. The pool will draw upon, for example, events with similar precursors. The higher the degree of homogeneity of the pool, the more accurate the Empirical Bayes estimator will be. We propose and evaluate a new method using homogenisation factors under the assumption that events are generated from a Homogeneous Poisson Process. The homogenisation factors are scaling constants, which can be elicited through structured expert judgement and used to align the frequencies of different events, hence homogenising the pool. The estimation error relative to the homogeneity of the pool is examined theoretically, indicating that reduced error is associated with larger pool homogeneity. The effects of misspecified expert assessments of the homogenisation factors are examined theoretically and through simulation experiments. Our results show that the proposed Empirical Bayes method using homogenisation factors is robust under different degrees of misspecification.
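Schematically, the homogenisation factors rescale the exposure of each pooled event so that its frequency is aligned with the target event before the pool enters the weighted average. A minimal sketch under a Homogeneous Poisson Process assumption (in a full Empirical Bayes analysis the shrinkage weight is derived from the marginal likelihood rather than supplied by hand; all names here are illustrative):

```python
def homogenised_pool_rate(events, factors):
    # events: list of (event_count, exposure) pairs for the pool.
    # factors: elicited scaling constants aligning each pooled event's
    # frequency with the target event (factor 1.0 = already aligned).
    total_count = sum(n for n, _ in events)
    total_exposure = sum(k * t for (n, t), k in zip(events, factors))
    return total_count / total_exposure

def empirical_bayes_estimate(n, t, pool_rate, weight):
    # Weighted average of the event's own estimate (n/t) and the
    # homogenised pool rate; weight in [0, 1] reflects how much the
    # pool is trusted relative to the sparse event-specific data.
    return weight * pool_rate + (1 - weight) * (n / t)
```

The more homogeneous the (rescaled) pool, the more weight the pooled rate can safely carry, which is the relationship the paper examines theoretically.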

  16. Time-Frequency Data Reduction for Event Related Potentials: Combining Principal Component Analysis and Matching Pursuit

    Directory of Open Access Journals (Sweden)

    Selin Aviyente

    2010-01-01

    Joint time-frequency representations offer a rich representation of event-related potentials (ERPs) that cannot be obtained through individual time or frequency domain analysis. This representation, however, comes at the expense of increased data volume and the difficulty of interpreting the resulting representations. Therefore, methods that can reduce the large amount of time-frequency data to experimentally relevant components are essential. In this paper, we present a method that reduces the large volume of ERP time-frequency data to a few significant time-frequency parameters. The proposed method is based on applying the widely used matching pursuit (MP) approach, with a Gabor dictionary, to principal components extracted from the time-frequency domain. The proposed PCA-Gabor decomposition is compared with other time-frequency data reduction methods, such as the time-frequency PCA approach alone and standard matching pursuit methods using a Gabor dictionary, for both simulated and biological data. The results show that the proposed PCA-Gabor approach performs better than either the PCA alone or the standard MP data reduction methods, by using the smallest amount of ERP data variance to produce the strongest statistical separation between experimental conditions.
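The greedy matching-pursuit step with a Gabor dictionary, which underlies the decomposition above, can be sketched in plain Python (operating on raw 1-D signals rather than on the time-frequency principal components the paper actually uses; all names are illustrative):

```python
import math

def gabor_atom(n_samples, center, freq, width):
    # Real Gabor atom: Gaussian-windowed cosine, unit-normalised.
    atom = [math.exp(-0.5 * ((t - center) / width) ** 2)
            * math.cos(2 * math.pi * freq * t / n_samples)
            for t in range(n_samples)]
    norm = math.sqrt(sum(a * a for a in atom))
    return [a / norm for a in atom]

def matching_pursuit(signal, dictionary, n_iters):
    # Greedy MP: at each step pick the atom with the largest inner
    # product with the residual, record its coefficient, and subtract
    # its projection from the residual.
    residual = list(signal)
    picks = []
    for _ in range(n_iters):
        best_i, best_c = max(
            ((i, sum(r * a for r, a in zip(residual, atom)))
             for i, atom in enumerate(dictionary)),
            key=lambda ic: abs(ic[1]))
        picks.append((best_i, best_c))
        residual = [r - best_c * a
                    for r, a in zip(residual, dictionary[best_i])]
    return picks, residual
```

Each pick yields a small set of interpretable parameters (atom position, frequency, width, amplitude), which is exactly the data reduction the abstract describes.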

  17. Analysis of core damage frequency from internal events: Peach Bottom, Unit 2

    International Nuclear Information System (INIS)

    Kolaczkowski, A.M.; Lambright, J.A.; Ferrell, W.L.; Cathey, N.G.; Najafi, B.; Harper, F.T.

    1986-10-01

    This document contains the internal event initiated accident sequence analyses for Peach Bottom, Unit 2; one of the reference plants being examined as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 will document the risk of a selected group of nuclear power plants. As part of that work, this report contains the overall core damage frequency estimate for Peach Bottom, Unit 2, and the accompanying plant damage state frequencies. Sensitivity and uncertainty analyses provided additional insights regarding the dominant contributors to the Peach Bottom core damage frequency estimate. The mean core damage frequency at Peach Bottom was calculated to be 8.2E-6. Station blackout type accidents (loss of all ac power) were found to dominate the overall results. Anticipated Transient Without Scram accidents were also found to be non-negligible contributors. The numerical results are largely driven by common mode failure probability estimates and to some extent, human error. Because of significant data and analysis uncertainties in these two areas (important, for instance, to the most dominant scenario in this study), it is recommended that the results of the uncertainty and sensitivity analyses be considered before any actions are taken based on this analysis

  18. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches.

  19. Aesthetic appreciation: event-related field and time-frequency analyses.

    Science.gov (United States)

    Munar, Enric; Nadal, Marcos; Castellanos, Nazareth P; Flexas, Albert; Maestú, Fernando; Mirasso, Claudio; Cela-Conde, Camilo J

    2011-01-01

    Improvements in neuroimaging methods have afforded significant advances in our knowledge of the cognitive and neural foundations of aesthetic appreciation. We used magnetoencephalography (MEG) to register brain activity while participants decided about the beauty of visual stimuli. The data were analyzed with event-related field (ERF) and Time-Frequency (TF) procedures. ERFs revealed no significant differences between brain activity related with stimuli rated as "beautiful" and "not beautiful." TF analysis showed clear differences between both conditions 400 ms after stimulus onset. Oscillatory power was greater for stimuli rated as "beautiful" than those regarded as "not beautiful" in the four frequency bands (theta, alpha, beta, and gamma). These results are interpreted in the frame of synchronization studies.

  20. Identification of homogeneous regions for rainfall regional frequency analysis considering typhoon event in South Korea

    Science.gov (United States)

    Heo, J. H.; Ahn, H.; Kjeldsen, T. R.

    2017-12-01

    South Korea is prone to large, and often disastrous, rainfall events caused by a mixture of monsoon and typhoon rainfall phenomena. Traditionally, however, regional frequency analysis models did not consider this mixture of phenomena when fitting probability distributions, potentially underestimating the risk posed by the more extreme typhoon events. Using long-term observed records of extreme rainfall from 56 sites, combined with detailed information on the timing and spatial impact of past typhoons from the Korea Meteorological Administration (KMA), this study developed and tested a new mixture model for frequency analysis of two different phenomena: events occurring regularly every year (monsoon) and events only occurring in some years (typhoon). The available annual maximum 24-hour rainfall data were divided into two sub-samples corresponding to years where the annual maximum is from either (1) a typhoon event, or (2) a non-typhoon event. Then, a three-parameter GEV distribution was fitted to each sub-sample, along with a weighting parameter characterizing the proportion of historical events associated with typhoon events. Spatial patterns of the model parameters were analyzed and showed that typhoon events are less commonly associated with annual maximum rainfall in the north-western part of the country (Seoul area), and more prevalent in the southern and eastern parts, leading to the formation of two distinct typhoon regions: (1) North-West; and (2) Southern and Eastern. Using a leave-one-out procedure, the new regional frequency model was tested and compared to a more traditional index flood method. The results showed that the impact of typhoons on design events might previously have been underestimated in the Seoul area. This suggests that the mixture model should be preferred where the typhoon phenomenon is less frequent and can thus have a significant effect on the rainfall-frequency curve. This research was supported by a grant (2017-MPSS31
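The mixture model described above can be sketched directly from the GEV closed form: the annual-maximum CDF is a weighted combination of the typhoon and non-typhoon sub-population CDFs, with the weight equal to the proportion of years whose maximum came from a typhoon. Function names and parameter values below are illustrative, not the study's fitted values.

```python
import math

def gev_cdf(x, mu, sigma, xi):
    # CDF of the three-parameter GEV distribution (xi != 0 case):
    # F(x) = exp(-[1 + xi*(x - mu)/sigma]^(-1/xi)).
    z = 1.0 + xi * (x - mu) / sigma
    if z <= 0:
        # Below the lower endpoint (xi > 0) the CDF is 0;
        # above the upper endpoint (xi < 0) it is 1.
        return 0.0 if xi > 0 else 1.0
    return math.exp(-z ** (-1.0 / xi))

def mixture_annual_max_cdf(x, p_typhoon, typhoon_params, non_typhoon_params):
    # Weighted mixture: in a fraction p of years the annual maximum
    # comes from a typhoon, in the remaining years from the monsoon.
    return (p_typhoon * gev_cdf(x, *typhoon_params)
            + (1 - p_typhoon) * gev_cdf(x, *non_typhoon_params))
```

A T-year design rainfall is then the value of x solving `mixture_annual_max_cdf(x, ...) = 1 - 1/T`, which can differ noticeably from a single-GEV fit when the typhoon component has a heavier tail.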

  1. Human based roots of failures in nuclear events investigations

    Energy Technology Data Exchange (ETDEWEB)

    Ziedelis, Stanislovas; Noel, Marc; Strucic, Miodrag [Commission of the European Communities, Petten (Netherlands). European Clearinghouse on Operational Experience Feedback for Nuclear Power Plants

    2012-10-15

    This paper aims to improve the quality of event investigations in the nuclear industry through analysis of existing practices, identifying and removing existing Human and Organizational Factors (HOF) and management-related barriers. It presents the essential results of several studies performed by the European Clearinghouse on Operational Experience. The outcomes of the studies are based on a survey of currently existing event investigation practices typical of the nuclear industry in 12 European countries, as well as on insights from analysis of numerous event investigation reports. The system of operational experience feedback from information based on event investigation results is not effective enough to prevent, or even to decrease the frequency of, recurring events, due to existing methodological, HOF-related and/or knowledge-management-related constraints. Besides that, several latent root causes of unsuccessful event investigations are related to weaknesses in the safety culture of personnel and managers. These weaknesses include focus on costs or schedule, political manipulation, arrogance, ignorance, entitlement and/or autocracy. Upgrades in the safety culture of an organization's personnel, and especially of its senior management, seem to be an effective way to improvement. Increasing the competencies, capabilities and level of independence of event investigation teams, elaboration of comprehensive software, and ensuring a positive approach, adequate support and impartiality of management could also facilitate improvement of the quality of event investigations. (orig.)

  2. Human based roots of failures in nuclear events investigations

    International Nuclear Information System (INIS)

    Ziedelis, Stanislovas; Noel, Marc; Strucic, Miodrag

    2012-01-01

    This paper aims to improve the quality of event investigations in the nuclear industry through analysis of existing practices, identifying and removing existing Human and Organizational Factors (HOF) and management-related barriers. It presents the essential results of several studies performed by the European Clearinghouse on Operational Experience. The outcomes of the studies are based on a survey of currently existing event investigation practices typical of the nuclear industry in 12 European countries, as well as on insights from analysis of numerous event investigation reports. The system of operational experience feedback from information based on event investigation results is not effective enough to prevent, or even to decrease the frequency of, recurring events, due to existing methodological, HOF-related and/or knowledge-management-related constraints. Besides that, several latent root causes of unsuccessful event investigations are related to weaknesses in the safety culture of personnel and managers. These weaknesses include focus on costs or schedule, political manipulation, arrogance, ignorance, entitlement and/or autocracy. Upgrades in the safety culture of an organization's personnel, and especially of its senior management, seem to be an effective way to improvement. Increasing the competencies, capabilities and level of independence of event investigation teams, elaboration of comprehensive software, and ensuring a positive approach, adequate support and impartiality of management could also facilitate improvement of the quality of event investigations. (orig.)

  3. The SKI-project External events - Phase 2. Estimation of fire frequencies per plant and per building

    International Nuclear Information System (INIS)

    Poern, K.

    1996-08-01

    The Swedish-Finnish handbook of initiating event frequencies, the I-Book, does not contain any fire frequencies. This omission is hard to defend considering the substantial risk contribution from fires. In the PSAs performed hitherto, the initiating fire frequencies have been determined case by case. Because data in these areas are usually very scarce, it is very important to develop unambiguous definitions, to systematically utilize both international and national experience, and to establish an appropriate statistical estimation method. It is also important to present the accumulated experience so that it can be used for different purposes, not only within PSA but also in concrete fire prevention work. During phase 1 of the project External Events, an inventory was made of existing methods for probabilistic fire analysis in general. During phase 2 it was decided to begin work on a complementary handbook, called the X-Book, to cover the frequencies of system-external events, i.e. initiating events caused by occurrences outside the system boundaries. In Version 1 of the X-Book the attention is mainly focussed on the estimation of initiating fire frequencies, per plant and per building. This estimation is founded on reports that the power companies have collected for this specific purpose. This report describes the statistical model and method used in the estimation process. The methodological results achieved may, possibly after some modification, be applicable to other types of system-external events as well
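
    The statistical model itself is only described in the full report, but a common way to combine scarce plant-specific fire data with generic experience is a Poisson-gamma (empirical Bayes) estimator; the sketch below uses an entirely hypothetical prior and event count, not X-Book values:

```python
# Poisson-gamma sketch for estimating an initiating fire frequency per plant.
# The gamma prior (alpha0, beta0) plays the role of generic industry experience;
# all numbers here are hypothetical, not taken from the X-Book.

def fire_frequency_posterior(n_events, plant_years, alpha0=0.5, beta0=10.0):
    """Return posterior gamma parameters and mean frequency (events/plant-year)."""
    alpha = alpha0 + n_events      # prior shape updated with observed events
    beta = beta0 + plant_years     # prior rate updated with exposure time
    return alpha, beta, alpha / beta

# Hypothetical plant with 2 reported fires in 25 reactor-years of observation:
alpha, beta, mean_freq = fire_frequency_posterior(2, 25.0)
print(f"posterior mean fire frequency: {mean_freq:.3f} per plant-year")
```

    The posterior mean shrinks the raw rate (2/25) toward the generic prior mean, which is exactly the behaviour wanted when plant-specific data are sparse.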

  4. ATTITUDES OF SERBIAN CONSUMERS TOWARD ADVERTISING THROUGH SPORT WITH REGARD TO THE FREQUENCY OF WATCHING SPORTS EVENTS

    Directory of Open Access Journals (Sweden)

    Stevo Popović

    2015-05-01

    It has been proposed that the attitudes potential consumers form on the basis of advertising through sport can influence decisions to purchase a particular advertiser's product (Pyun, 2006). For this reason, it is important to analyse their general attitudes toward advertising through sport, and this investigation was aimed at gaining relevant knowledge about the attitudes of Serbian consumers toward advertising through sport with regard to the frequency of watching sports events. Methods: The sample included 127 respondents, divided into six subsample groups: consumers who do not watch sports events at all, and consumers who watch sports events for 1-30 minutes, 31-60 minutes, 61-90 minutes, 91-120 minutes, or more than 120 minutes during a typical day. The sample of variables contained a system of three general attitudes modeled on a seven-point Likert scale. The results were analyzed by multivariate analysis (MANOVA) and univariate analysis (ANOVA) with a Post Hoc test. Results: The statistical analyses found no significant differences at the multivariate level, nor for any of the three variables at a significance level of p=.05. Hence, it is interesting to highlight that no significant differences were found between the attitudes of consumers toward advertising through sport with regard to the frequency of watching sports events. Discussion: These results are important for marketers, mostly because they can treat all potential consumers alike regardless of how often they watch sports events. On the other hand, this was not the case in previous investigations (Bjelica and Popović, 2011), and this observation presents relevant information.
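
    As a reminder of what the univariate step of such an analysis computes, here is a minimal one-way ANOVA F statistic over hypothetical seven-point Likert responses (synthetic data, not the study's 127 respondents):

```python
import numpy as np

def one_way_anova(groups):
    """Return the F statistic for a one-way ANOVA over a list of samples."""
    all_data = np.concatenate([np.asarray(g, dtype=float) for g in groups])
    grand_mean = all_data.mean()
    # Between-group sum of squares (df = k - 1)
    ss_between = sum(len(g) * (np.mean(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares (df = N - k)
    ss_within = sum(((np.asarray(g, dtype=float) - np.mean(g)) ** 2).sum()
                    for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_data) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical Likert responses for three viewing-time groups:
g1 = [4, 5, 4, 6, 5]
g2 = [5, 4, 5, 5, 6]
g3 = [4, 6, 5, 4, 5]
print(f"F = {one_way_anova([g1, g2, g3]):.3f}")
```

    A small F (relative to the critical value at p=.05) corresponds to the study's finding of no significant group differences.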

  5. Event Recognition Based on Deep Learning in Chinese Texts.

    Directory of Open Access Journals (Sweden)

    Yajun Zhang

    Event recognition is the most fundamental and critical task in event-based natural language processing systems. Existing event recognition methods based on rules and shallow neural networks have certain limitations. For example, extracting features using methods based on rules is difficult; methods based on shallow neural networks converge too quickly to a local minimum, resulting in low recognition precision. To address these problems, we propose the Chinese emergency event recognition model based on deep learning (CEERM). Firstly, we use a word segmentation system to segment sentences. According to event elements labeled in the CEC 2.0 corpus, we classify words into five categories: trigger words, participants, objects, time and location. Each word is vectorized according to the following six feature layers: part of speech, dependency grammar, length, location, distance between trigger word and core word and trigger word frequency. We obtain deep semantic features of words by training a feature vector set using a deep belief network (DBN), then analyze those features in order to identify trigger words by means of a back propagation neural network. Extensive testing shows that the CEERM achieves excellent recognition performance, with a maximum F-measure value of 85.17%. Moreover, we propose the dynamic-supervised DBN, which adds supervised fine-tuning to a restricted Boltzmann machine layer by monitoring its training performance. Test analysis reveals that the new DBN improves recognition performance and effectively controls the training time. Although the F-measure increases to 88.11%, the training time increases by only 25.35%.

  6. Event Recognition Based on Deep Learning in Chinese Texts.

    Science.gov (United States)

    Zhang, Yajun; Liu, Zongtian; Zhou, Wen

    2016-01-01

    Event recognition is the most fundamental and critical task in event-based natural language processing systems. Existing event recognition methods based on rules and shallow neural networks have certain limitations. For example, extracting features using methods based on rules is difficult; methods based on shallow neural networks converge too quickly to a local minimum, resulting in low recognition precision. To address these problems, we propose the Chinese emergency event recognition model based on deep learning (CEERM). Firstly, we use a word segmentation system to segment sentences. According to event elements labeled in the CEC 2.0 corpus, we classify words into five categories: trigger words, participants, objects, time and location. Each word is vectorized according to the following six feature layers: part of speech, dependency grammar, length, location, distance between trigger word and core word and trigger word frequency. We obtain deep semantic features of words by training a feature vector set using a deep belief network (DBN), then analyze those features in order to identify trigger words by means of a back propagation neural network. Extensive testing shows that the CEERM achieves excellent recognition performance, with a maximum F-measure value of 85.17%. Moreover, we propose the dynamic-supervised DBN, which adds supervised fine-tuning to a restricted Boltzmann machine layer by monitoring its training performance. Test analysis reveals that the new DBN improves recognition performance and effectively controls the training time. Although the F-measure increases to 88.11%, the training time increases by only 25.35%.
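
    CEERM's full pipeline (segmentation, DBN feature learning, back propagation network) is beyond a short sketch, but its final step — deciding trigger vs. non-trigger from a six-element word feature vector — can be illustrated with a single gradient-trained sigmoid unit over synthetic data (a stand-in for the paper's networks, not a reimplementation):

```python
import numpy as np

# Minimal sketch of CEERM's final stage only: classifying words as trigger /
# non-trigger from six-element feature vectors (part of speech, dependency
# grammar, length, location, distance to core word, trigger-word frequency).
# The data are synthetic and the single sigmoid unit stands in for the DBN +
# back propagation network.

rng = np.random.default_rng(0)
X = rng.random((200, 6))            # hypothetical per-word feature vectors
y = (X[:, 5] > 0.5).astype(float)   # pretend: high trigger frequency => trigger

w, b = np.zeros(6), 0.0
for _ in range(2000):                        # gradient descent on log-loss
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid output
    grad = p - y
    w -= 0.5 * X.T @ grad / len(y)
    b -= 0.5 * grad.mean()

pred = 1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5
accuracy = float(np.mean(pred == (y > 0.5)))
print(f"training accuracy: {accuracy:.2f}")
```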

  7. Adequate engineering for lowering the frequency of initiating events at Siemens/KWU

    International Nuclear Information System (INIS)

    Gremm, O.

    1988-01-01

    The analysis of TMI and Chernobyl events shows weak points and deficits in the field of preventive safety features. This should not be forgotten during the ongoing discussion on severe accidents. Therefore the paper explains special preventive safety features which were the results of the development of Siemens/KWU reactor technology. With respect to the present discussion on new reactor concepts special attention is given to the inherent and passive safety features and the engineering which results in low core melt frequency. Such an analysis leads to knowledge modules which are based on experience during licensing procedures and plant operation and should be the starting points for reactor technology of the future

  8. Quantification of LOCA core damage frequency based on thermal-hydraulics analysis

    International Nuclear Information System (INIS)

    Cho, Jaehyun; Park, Jin Hee; Kim, Dong-San; Lim, Ho-Gon

    2017-01-01

    Highlights: • We quantified the LOCA core damage frequency based on a best-estimate success criteria analysis. • A thermal-hydraulic analysis using the MARS code was applied to the Korea Standard Nuclear Power Plant. • Five new event trees with new break size boundaries and new success criteria were developed. • The core damage frequency is 5.80E−07 (/y), 12% less than that of the conventional PSA event trees. - Abstract: A loss-of-coolant accident (LOCA) has always been considered one of the most important initiating events. However, most probabilistic safety assessment (PSA) models up to now have adopted the same three LOCA groups, and even the exact break size boundaries, used in the WASH-1400 report published in 1975. Aware of the importance of a realistic PSA for risk-informed applications, several studies have tried to determine the realistic thermal-hydraulic behavior of a LOCA and to improve the PSA model. The purpose of this research is to obtain realistic LOCA core damage frequency results based on a success criteria analysis using a best-estimate thermal-hydraulics code. To do so, the Korea Standard Nuclear Power Plant (KSNP) was selected for this study. The MARS code was used for the thermal-hydraulics analysis and the AIMS code for the core damage quantification. One of the major findings of the thermal-hydraulics analysis was that the decay power is well removed by normal secondary cooling alone for LOCAs below 1.4 in and by high-pressure safety injection alone for LOCAs of 0.8–9.4 in. Based on the thermal-hydraulics results, with new break size boundaries and new success criteria, five new event trees (ETs) were developed. The core damage frequency of the new LOCA ETs is 5.80E−07 (/y), which is 12% less than that of the conventional PSA ETs. In this research, we obtained not only thermal-hydraulics characteristics for the entire break size of a LOCA in view of the deterministic safety

  9. Quantification of LOCA core damage frequency based on thermal-hydraulics analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jaehyun, E-mail: chojh@kaeri.re.kr; Park, Jin Hee; Kim, Dong-San; Lim, Ho-Gon

    2017-04-15

    Highlights: • We quantified the LOCA core damage frequency based on a best-estimate success criteria analysis. • A thermal-hydraulic analysis using the MARS code was applied to the Korea Standard Nuclear Power Plant. • Five new event trees with new break size boundaries and new success criteria were developed. • The core damage frequency is 5.80E−07 (/y), 12% less than that of the conventional PSA event trees. - Abstract: A loss-of-coolant accident (LOCA) has always been considered one of the most important initiating events. However, most probabilistic safety assessment (PSA) models up to now have adopted the same three LOCA groups, and even the exact break size boundaries, used in the WASH-1400 report published in 1975. Aware of the importance of a realistic PSA for risk-informed applications, several studies have tried to determine the realistic thermal-hydraulic behavior of a LOCA and to improve the PSA model. The purpose of this research is to obtain realistic LOCA core damage frequency results based on a success criteria analysis using a best-estimate thermal-hydraulics code. To do so, the Korea Standard Nuclear Power Plant (KSNP) was selected for this study. The MARS code was used for the thermal-hydraulics analysis and the AIMS code for the core damage quantification. One of the major findings of the thermal-hydraulics analysis was that the decay power is well removed by normal secondary cooling alone for LOCAs below 1.4 in and by high-pressure safety injection alone for LOCAs of 0.8–9.4 in. Based on the thermal-hydraulics results, with new break size boundaries and new success criteria, five new event trees (ETs) were developed. The core damage frequency of the new LOCA ETs is 5.80E−07 (/y), which is 12% less than that of the conventional PSA ETs. In this research, we obtained not only thermal-hydraulics characteristics for the entire break size of a LOCA in view of the deterministic safety
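
    The quantification step behind such numbers — multiplying an initiating event frequency down each event tree sequence's branch probabilities and summing the core-damage sequences — can be sketched generically; the frequencies and probabilities below are hypothetical, not the KSNP values:

```python
# Generic event tree quantification sketch. Each sequence frequency is the
# initiating event frequency times the probability of each branch outcome on
# its path; the core damage frequency (CDF) is the sum over the core-damage
# sequences. All numbers are hypothetical.

def sequence_frequency(init_freq, branch_probs):
    """Frequency of one event tree sequence (per year)."""
    f = init_freq
    for p in branch_probs:
        f *= p
    return f

init_freq = 1.0e-4         # hypothetical small-LOCA initiating frequency (/y)
core_damage_sequences = [
    [0.01],                # e.g. high-pressure safety injection fails
    [0.99, 0.001],         # HPSI succeeds, long-term cooling fails
]
cdf = sum(sequence_frequency(init_freq, seq) for seq in core_damage_sequences)
print(f"CDF = {cdf:.2e} per year")
```

    Redrawing the trees with different break-size boundaries or success criteria changes the branch structure and probabilities, which is how the paper's revised ETs lower the computed CDF.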

  10. Mean occurrence frequency and temporal risk analysis of solar particle events

    International Nuclear Information System (INIS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.; Wilson, John W.

    2006-01-01

    The protection of astronauts from space radiation is required on future exploration-class and long-duration missions. For accurate projections of radiation doses, a solar cycle statistical model, which quantifies the progression level within the cycle, has been developed. The resultant future cycle projection is then applied to estimate the mean frequency of solar particle events (SPEs) in the near future using a power law function of sunspot number. Detailed temporal behaviors of the recent large event and of two historically large events, the August 1972 SPE and the November 1960 SPE, are analyzed for dose rate and cumulative dose equivalent at sensitive organs. Polyethylene-shielded 'storm shelters' inside spacecraft are studied to limit astronauts' total exposure at a sensitive site to within 10 cSv from a large event, as a potential goal that fulfills the ALARA (as low as reasonably achievable) requirement
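
    The abstract states that mean SPE frequency is modeled as a power law in sunspot number without giving coefficients; a generic log-log least-squares fit of such a relation might look like the following (the sunspot/frequency pairs are entirely hypothetical):

```python
import numpy as np

# Fit f = a * S**b (mean SPE frequency vs. smoothed sunspot number S) by
# linear least squares in log-log space. The (S, f) pairs are hypothetical,
# not the paper's fitted data.
S = np.array([20.0, 50.0, 80.0, 120.0, 160.0])
f = np.array([0.5, 1.6, 2.9, 4.8, 7.0])

b, log_a = np.polyfit(np.log(S), np.log(f), 1)
a = np.exp(log_a)
print(f"f ~ {a:.3f} * S^{b:.2f}")
```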

  11. The differential effects of increasing frequency and magnitude of extreme events on coral populations.

    Science.gov (United States)

    Fabina, Nicholas S; Baskett, Marissa L; Gross, Kevin

    2015-09-01

    Extreme events, which have profound ecological consequences, are changing in both frequency and magnitude with climate change. Because extreme temperatures induce coral bleaching, we can explore the relative impacts of changes in frequency and magnitude of high temperature events on coral reefs. Here, we combined climate projections and a dynamic population model to determine how changing bleaching regimes influence coral persistence. We additionally explored how coral traits and competition with macroalgae mediate changes in bleaching regimes. Our results predict that severe bleaching events reduce coral persistence more than frequent bleaching. Corals with low adult mortality and high growth rates are successful when bleaching is mild, but bleaching resistance is necessary to persist when bleaching is severe, regardless of frequency. The existence of macroalgae-dominated stable states reduces coral persistence and changes the relative importance of coral traits. Building on previous studies, our results predict that management efforts may need to prioritize protection of "weaker" corals with high adult mortality when bleaching is mild, and protection of "stronger" corals with high bleaching resistance when bleaching is severe. In summary, future reef projections and conservation targets depend on both local bleaching regimes and biodiversity.
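
    The paper's dynamic population model is far richer, but its qualitative prediction — that rare, severe bleaching depresses coral persistence more than frequent, mild bleaching — can be illustrated with a toy logistic-regrowth model punctuated by periodic mortality events (all parameters hypothetical):

```python
# Toy model: logistic coral regrowth punctuated by bleaching events that kill
# a fraction `magnitude` of cover once every `interval` years. Parameters are
# illustrative only, far simpler than the paper's model.

def final_cover(interval, magnitude, r=0.3, years=200):
    cover = 0.5
    for year in range(1, years + 1):
        cover += r * cover * (1.0 - cover)   # logistic regrowth
        if year % interval == 0:
            cover *= (1.0 - magnitude)       # bleaching mortality
    return cover

frequent_mild = final_cover(interval=2, magnitude=0.3)
rare_severe = final_cover(interval=10, magnitude=0.9)
print(f"frequent/mild: {frequent_mild:.2f}, rare/severe: {rare_severe:.2f}")
```

    With these illustrative parameters the frequent/mild regime settles at a much higher long-run cover than the rare/severe one, echoing the abstract's conclusion about severity.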

  12. Effects of the major sudden stratospheric warming event of 2009 on the subionospheric very low frequency/low frequency radio signals

    Science.gov (United States)

    Pal, S.; Hobara, Y.; Chakrabarti, S. K.; Schnoor, P. W.

    2017-07-01

    This paper presents effects of the major sudden stratospheric warming (SSW) event of 2009 on the subionospheric very low frequency/low frequency (VLF/LF) radio signals propagating in the Earth-ionosphere waveguide. Signal amplitudes from four transmitters received by VLF/LF radio networks of Germany and Japan corresponding to the major SSW event are investigated for possible anomalies and atmospheric influence on the high- to middle-latitude ionosphere. A significant anomalous increase or decrease of nighttime and daytime amplitudes of VLF/LF signals by ~3-5 dB during the SSW event has been found for all propagation paths, associated with the stratospheric temperature rise at the 10 hPa level. The increase or decrease in VLF/LF amplitudes during daytime and nighttime is due to the modification of the lower ionospheric boundary conditions, in terms of electron density and electron-neutral collision frequency profiles, and the associated modal interference effects between the different propagating waveguide modes during the SSW period. TIMED/SABER mission data are also used to investigate the upper mesospheric conditions over the VLF/LF propagation path during the same time period. We observe a decrease in neutral temperature and an increase in pressure at the height of 75-80 km around the peak time of the event. The VLF/LF anomalies are correlated and in phase with the stratospheric temperature and mesospheric pressure variation, while the minimum of mesospheric cooling shows a 2-3 day delay with respect to the maximum VLF/LF anomalies. Simulations of the VLF/LF diurnal variation are performed using the well-known Long Wave Propagating Capability (LWPC) code within the Earth-ionosphere waveguide to explain the VLF/LF anomalies qualitatively.

  13. In-cylinder pressure-based direct techniques and time frequency analysis for combustion diagnostics in IC engines

    International Nuclear Information System (INIS)

    D’Ambrosio, S.; Ferrari, A.; Galleani, L.

    2015-01-01

    Highlights: • Direct pressure-based techniques have been applied successfully to spark-ignition engines. • The burned mass fraction from pressure-based techniques has been compared with that of 2- and 3-zone combustion models. • Time-frequency analysis has been employed to characterize complex diesel combustion events. - Abstract: In-cylinder pressure measurement and analysis has historically been a key tool for off-line combustion diagnosis in internal combustion engines, but online applications for real-time condition monitoring and combustion management have recently become popular. The present investigation presents and compares different low-computing-cost in-cylinder pressure-based methods for the analysis of the main features of combustion, that is, the start of combustion, the end of combustion and the crankshaft angle that corresponds to half of the overall burned mass. The instantaneous pressure in the combustion chamber has been used as the input datum for the described analytical procedures, measured by means of a standard piezoelectric transducer. Traditional pressure-based techniques have been shown to predict the burned mass fraction time history more accurately in spark-ignition engines than in diesel engines. The most suitable pressure-based techniques for both spark-ignition and compression-ignition engines have been chosen on the basis of the available experimental data. Time-frequency analysis has also been applied to the analysis of diesel combustion, which is richer in events than spark-ignited combustion. Time-frequency algorithms for the calculation of the mean instantaneous frequency are computationally efficient, allow the main events of diesel combustion to be identified and provide the greatest benefits in the presence of multiple injection events. These algorithms can be optimized and applied to onboard diagnostics tools designed for real-time control, but can also be used as an advanced validation tool for
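
    The abstract does not name the specific pressure-based techniques compared; as one classic low-cost example, the Rassweiler-Withrow method estimates the burned mass fraction from the polytropic-corrected pressure rise. A sketch with short, hypothetical pressure/volume samples (not engine data) and an assumed polytropic index:

```python
# Rassweiler-Withrow sketch: burned mass fraction (MFB) from cylinder pressure.
# The combustion-driven pressure rise at each step is the measured rise minus
# the rise expected from a polytropic (motored) process. All samples and the
# polytropic index n are hypothetical.

def burned_mass_fraction(p, V, n=1.32):
    rises = []
    for i in range(len(p) - 1):
        dp_comb = p[i + 1] - p[i] * (V[i] / V[i + 1]) ** n
        rises.append(max(dp_comb, 0.0))      # keep only combustion-driven rise
    total = sum(rises)
    mfb, acc = [], 0.0
    for r in rises:
        acc += r
        mfb.append(acc / total)
    return mfb                                # cumulative fraction, ends at 1.0

p = [20.0, 24.0, 33.0, 41.0, 45.0, 46.0]      # bar, hypothetical
V = [1.00, 0.96, 0.93, 0.92, 0.93, 0.96]      # normalized cylinder volume
print(burned_mass_fraction(p, V))
```

    The crank angle at which the cumulative fraction crosses 0.5 is the "half burned mass" angle the abstract refers to.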

  14. Analysis of core damage frequency due to external events at the DOE [Department of Energy] N-Reactor

    International Nuclear Information System (INIS)

    Lambright, J.A.; Bohn, M.P.; Daniel, S.L.; Baxter, J.T.; Johnson, J.J.; Ravindra, M.K.; Hashimoto, P.O.; Mraz, M.J.; Tong, W.H.; Conoscente, J.P.; Brosseau, D.A.

    1990-11-01

    A complete external events probabilistic risk assessment has been performed for the N-Reactor power plant, making full use of all insights gained during the past ten years' developments in risk assessment methodologies. A detailed screening analysis was performed which showed that all external events had negligible contribution to core damage frequency except fires, seismic events, and external flooding. A limited scope analysis of the external flooding risk indicated that it is not a major risk contributor. Detailed analyses of the fire and seismic risks resulted in total (mean) core damage frequencies of 1.96E-5 and 4.60E-05 per reactor year, respectively. Detailed uncertainty analyses were performed for both fire and seismic risks. These results show that the core damage frequency profile for these events is comparable to that found for existing commercial power plants if proposed fixes are completed as part of the restart program. 108 refs., 85 figs., 80 tabs

  15. Analysis of core damage frequency due to external events at the DOE (Department of Energy) N-Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lambright, J.A.; Bohn, M.P.; Daniel, S.L. (Sandia National Labs., Albuquerque, NM (USA)); Baxter, J.T. (Westinghouse Hanford Co., Richland, WA (USA)); Johnson, J.J.; Ravindra, M.K.; Hashimoto, P.O.; Mraz, M.J.; Tong, W.H.; Conoscente, J.P. (EQE, Inc., San Francisco, CA (USA)); Brosseau, D.A. (ERCE, Inc., Albuquerque, NM (USA))

    1990-11-01

    A complete external events probabilistic risk assessment has been performed for the N-Reactor power plant, making full use of all insights gained during the past ten years' developments in risk assessment methodologies. A detailed screening analysis was performed which showed that all external events had negligible contribution to core damage frequency except fires, seismic events, and external flooding. A limited scope analysis of the external flooding risk indicated that it is not a major risk contributor. Detailed analyses of the fire and seismic risks resulted in total (mean) core damage frequencies of 1.96E-5 and 4.60E-05 per reactor year, respectively. Detailed uncertainty analyses were performed for both fire and seismic risks. These results show that the core damage frequency profile for these events is comparable to that found for existing commercial power plants if proposed fixes are completed as part of the restart program. 108 refs., 85 figs., 80 tabs.

  16. Probe-controlled soliton frequency shift in the regime of optical event horizon

    DEFF Research Database (Denmark)

    Gu, Jie; Guo, Hairun; Wang, Shaofei

    2015-01-01

    In optical analogy of the event horizon, temporal pulse collision and mutual interactions are mainly between an intense solitary wave (soliton) and a dispersive probe wave. In such a regime, here we numerically investigate the probe-controlled soliton frequency shift as well as the soliton self...

  17. EGC: a time-frequency augmented template-based method for gravitational wave burst search in ground-based interferometers

    International Nuclear Information System (INIS)

    Clapson, Andre-Claude; Barsuglia, Matteo; Bizouard, Marie-Anne; Brisson, Violette; Cavalier, Fabien; Davier, Michel; Hello, Patrice; Leroy, Nicolas; Varvella, Monica

    2008-01-01

    The detection of burst-type events in the output of ground-based gravitational wave detectors is particularly challenging. The potential variety of astrophysical waveforms, as suggested by simulations and analytic studies in general relativity, and the discrimination of actual signals from instrumental noise are both critical issues. Robust methods that achieve reasonable detection performance over a wide range of signals are required. We present here a hybrid burst-detection pipeline that is related to time-frequency transforms but based on matched filtering, to provide robustness against noise characteristics. Studies on simulated noise show that the algorithm has a detection efficiency similar to other methods over very different waveforms and particularly good timing even for low-amplitude signals: no bias for most tested waveforms and an average accuracy of 1.1 ms (down to 0.1 ms in the best case). Time-frequency-type parameters, useful for event classification, are also derived for noise spectral densities unfavourable to standard time-frequency algorithms

  18. EGC: a time-frequency augmented template-based method for gravitational wave burst search in ground-based interferometers

    Energy Technology Data Exchange (ETDEWEB)

    Clapson, Andre-Claude; Barsuglia, Matteo; Bizouard, Marie-Anne; Brisson, Violette; Cavalier, Fabien; Davier, Michel; Hello, Patrice; Leroy, Nicolas; Varvella, Monica [LAL, Universite Paris-Sud 11, BP 34, 91898 Orsay (France)

    2008-02-07

    The detection of burst-type events in the output of ground-based gravitational wave detectors is particularly challenging. The potential variety of astrophysical waveforms, as suggested by simulations and analytic studies in general relativity, and the discrimination of actual signals from instrumental noise are both critical issues. Robust methods that achieve reasonable detection performance over a wide range of signals are required. We present here a hybrid burst-detection pipeline that is related to time-frequency transforms but based on matched filtering, to provide robustness against noise characteristics. Studies on simulated noise show that the algorithm has a detection efficiency similar to other methods over very different waveforms and particularly good timing even for low-amplitude signals: no bias for most tested waveforms and an average accuracy of 1.1 ms (down to 0.1 ms in the best case). Time-frequency-type parameters, useful for event classification, are also derived for noise spectral densities unfavourable to standard time-frequency algorithms.

  19. Event-Based Control Strategy for Mobile Robots in Wireless Environments.

    Science.gov (United States)

    Socas, Rafael; Dormido, Sebastián; Dormido, Raquel; Fabregas, Ernesto

    2015-12-02

    In this paper, a new event-based control strategy for mobile robots is presented. It has been designed to work in wireless environments where a centralized controller has to exchange information with the robots over an RF (radio frequency) interface. The event-based architectures have been developed for differential wheeled robots, although they can be applied to other kinds of robots in a simple way. The solution has been checked with classical navigation algorithms, such as wall following and obstacle avoidance, using scenarios with a single robot or multiple robots. A comparison between the proposed architectures and the classical discrete-time strategy is also carried out. The experimental results show that the proposed solution uses communication resources more efficiently than the classical discrete-time strategy while achieving the same accuracy.
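
    The core of such an event-based scheme is a triggering rule: a robot transmits its state over the RF link only when it has deviated from the last transmitted value by more than a threshold, rather than at every discrete-time step. A minimal send-on-delta sketch (threshold and trajectory hypothetical):

```python
# Send-on-delta event trigger: transmit only when the state has moved more
# than `delta` from the last transmitted sample, saving RF bandwidth compared
# with transmitting at every discrete-time step.

def event_triggered_samples(states, delta=0.5):
    sent = [states[0]]                 # always transmit the initial state
    last = states[0]
    for x in states[1:]:
        if abs(x - last) > delta:      # triggering condition
            sent.append(x)
            last = x
    return sent

trajectory = [0.0, 0.1, 0.3, 0.9, 1.0, 1.6, 1.7, 1.65]
print(event_triggered_samples(trajectory))   # far fewer messages than samples
```

    Choosing `delta` trades communication load against control accuracy, which is the comparison the paper carries out against the discrete-time baseline.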

  20. Financial system loss as an example of high consequence, high frequency events

    Energy Technology Data Exchange (ETDEWEB)

    McGovern, D.E.

    1996-07-01

    Much work has been devoted to high consequence events with low frequency of occurrence. Characteristic of these events are bridge failures (such as that of the Tacoma Narrows), building failures (such as the collapse of a walkway at a Kansas City hotel), and compromises of major chemical containment systems (such as at Bhopal, India). Such events, although rare, have an extreme personal, societal, and financial impact. An interesting variation is demonstrated by financial losses due to fraud and abuse in the money management system. The impact can be huge, entailing very high aggregate costs, but these result from the contribution of many small attacks rather than from a single (or a few) massive events. Public awareness is raised through publicized events such as the junk bond fraud perpetrated by Milken or the gross mismanagement in the failure of Barings Bank through unsupervised trading activities by Leeson in Singapore. These events, although seemingly large (financial losses may be on the order of several billion dollars), are but small contributors to the estimated $114 billion lost to all types of financial fraud in 1993. This paper explores the magnitude of financial system losses and identifies new areas for analysis of high consequence events, including the potential effect of malevolent intent.

  1. Lessons Learned from Real-Time, Event-Based Internet Science Communications

    Science.gov (United States)

    Phillips, T.; Myszka, E.; Gallagher, D. L.; Adams, M. L.; Koczor, R. J.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The Directorate's Science Roundtable includes active researchers, NASA public relations, educators, and administrators. The Science@NASA award-winning family of Web sites features science, mathematics, and space news. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing science activities in real-time has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases, broadcasts accommodate active feedback and questions from Internet participants. Through these projects a pattern has emerged in the level of interest or popularity with the public. The pattern differentiates projects that include science from those that do not, All real-time, event-based Internet activities have captured public interest at a level not achieved through science stories or educator resource material exclusively. The worst event-based activity attracted more interest than the best written science story. One truly rewarding lesson learned through these projects is that the public recognizes the importance and excitement of being part of scientific discovery. Flying a camera to 100,000 feet altitude isn't as interesting to the public as searching for viable life-forms at these oxygen-poor altitudes. The details of these real-time, event-based projects and lessons learned will be discussed.

  2. Cardiovascular Events in Cancer Patients Treated with Highly or Moderately Emetogenic Chemotherapy: Results from a Population-Based Study

    International Nuclear Information System (INIS)

    Vo, T. T.; Nelson, J. J.

    2012-01-01

    Studies on cardiovascular safety in cancer patients treated with highly or moderately emetogenic chemotherapy (HEC or MEC), who may have taken the antiemetic, aprepitant, have been limited to clinical trials and postmarketing spontaneous reports. Our study explored background rates of cardiovascular disease (CVD) events among HEC- or MEC-treated cancer patients in a population-based setting to contextualize events seen in a new drug development program and to determine at a high level whether rates differed by aprepitant usage. Medical and pharmacy claims data from the 2005-2007 IMPACT National Benchmark Database were classified into emetogenic chemotherapy categories and CVD outcomes. Among 5827 HEC/MEC-treated patients, frequencies were highest for hypertension (16-21%) and composites of venous (7-12%) and arterial thromboembolic events (4-7%). Aprepitant users generally did not experience higher frequencies of events compared to nonusers. Our study serves as a useful benchmark of background CVD event rates in a population-based setting of cancer patients.

  3. Increased risk of severe hypoglycemic events with increasing frequency of non-severe hypoglycemic events in patients with Type 1 and Type 2 diabetes.

    LENUS (Irish Health Repository)

    Sreenan, Seamus

    2014-07-15

    Severe hypoglycemic events (SHEs) are associated with significant morbidity, mortality and costs. However, the more common non-severe hypoglycemic events (NSHEs) are less well explored. We investigated the association between reported frequency of NSHEs and SHEs among patients with type 1 diabetes mellitus (T1DM) and type 2 diabetes mellitus (T2DM) in the PREDICTIVE study.

  4. Very low frequency earthquakes (VLFEs) detected during episodic tremor and slip (ETS) events in Cascadia using a match filter method indicate repeating events

    Science.gov (United States)

    Hutchison, A. A.; Ghosh, A.

    2016-12-01

    Very low frequency earthquakes (VLFEs) occur in transitional zones of faults, releasing seismic energy in the 0.02-0.05 Hz frequency band over a 90 s duration, and typically have magnitudes within the range of Mw 3.0-4.0. VLFEs can occur down-dip of the seismogenic zone, where they can transfer stress up-dip, potentially bringing the locked zone closer to a critical failure stress. VLFEs also occur up-dip of the seismogenic zone in a region along the plate interface that can rupture coseismically during large megathrust events, such as the 2011 Tohoku-Oki earthquake [Ide et al., 2011]. VLFEs were first detected in Cascadia during the 2011 episodic tremor and slip (ETS) event, occurring coincidentally with tremor [Ghosh et al., 2015]. However, during the 2014 ETS event, VLFEs were spatially and temporally asynchronous with tremor activity [Hutchison and Ghosh, 2016]. Such contrasting behaviors remind us that the mechanics behind such events remain elusive, yet VLFEs are responsible for the largest portion of the moment release during an ETS event. Here, we apply a match filter method using known VLFEs as template events to detect additional VLFEs. Using a grid-search centroid moment tensor inversion method, we invert stacks of the resulting match filter detections to ensure moment tensor solutions are similar to that of the respective template events. Our ability to successfully employ a match filter method to VLFE detection in Cascadia intrinsically indicates that these events can be repeating, implying that the same asperities are likely responsible for generating multiple VLFEs.
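    The match filter approach described above amounts to sliding a known template waveform along continuous data and flagging alignments with high similarity. The pure-Python sketch below illustrates the idea with normalized cross-correlation on synthetic series; it is an illustration of the technique only, not the authors' seismological implementation, which operates on real multi-channel waveform data.

```python
import math

def normalized_xcorr(template, signal):
    """Sliding normalized cross-correlation of a template against a longer signal.

    Returns one correlation coefficient per alignment
    (len(signal) - len(template) + 1 values in [-1, 1]).
    """
    n = len(template)
    t_mean = sum(template) / n
    t_dev = [x - t_mean for x in template]
    t_norm = math.sqrt(sum(d * d for d in t_dev))
    out = []
    for i in range(len(signal) - n + 1):
        win = signal[i:i + n]
        w_mean = sum(win) / n
        w_dev = [x - w_mean for x in win]
        w_norm = math.sqrt(sum(d * d for d in w_dev))
        num = sum(a * b for a, b in zip(t_dev, w_dev))
        out.append(num / (t_norm * w_norm) if t_norm * w_norm > 0 else 0.0)
    return out

def detect(template, signal, threshold=0.9):
    """Indices where the template matches the signal above the threshold."""
    cc = normalized_xcorr(template, signal)
    return [i for i, c in enumerate(cc) if c >= threshold]
```

A template planted at two offsets in an otherwise flat signal is recovered at exactly those offsets, which is the "repeating events" logic of the abstract in miniature.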

  5. Attitudes of Consumers from Podgorica toward Advertising through Sport among the Frequency of Watching Sports Events

    Directory of Open Access Journals (Sweden)

    Nikola Milovic

    2018-04-01

    Full Text Available This investigation was aimed at gaining relevant knowledge about the attitudes of Podgorica consumers toward advertising through sport in relation to the frequency of watching sports events. The sample included 330 students from the Faculty of Economics in Podgorica, divided into six subsample groups: consumers who do not watch sports events at all, then consumers who watch sports events for 1-30 minutes, 31-60 minutes, 61-90 minutes, 91-120 minutes, and consumers who watch sports events for more than 120 minutes during a typical day. The sample of variables contained a system of three general attitudes, modelled on a seven-point Likert scale. The results were analyzed by multivariate analysis (MANOVA) and univariate analysis (ANOVA), followed by post hoc tests. Based on the statistical analyses, significant differences were found at the multivariate level, as well as for all three variables, at a significance level of p=.00. Hence, it is worth highlighting that significant differences emerged between consumers' attitudes toward advertising through sport depending on the frequency of watching sports events. The significant differences were found in two of three variables, while consumers who do not watch sports events had much more negative attitudes toward advertising through sport.
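    This record (and records 16 and 17 below) compares viewing-time groups with MANOVA and ANOVA. As a rough illustration of the univariate step, the sketch below computes a one-way ANOVA F statistic in pure Python; the group ratings are invented, not the study's Likert responses.

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA across several groups of ratings.

    groups: list of lists of numeric responses, one inner list per group.
    """
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: group sizes times squared mean deviations.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: deviations from each group's own mean.
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between, df_within = k - 1, n - k
    return (ss_between / df_between) / (ss_within / df_within)
```

Identical group means give F = 0; well-separated groups give a large F, which is what the reported p-values summarize after comparison with the F distribution.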

  6. Memory for past public events depends on retrieval frequency but not memory age in Alzheimer's disease.

    Science.gov (United States)

    Müller, Stephan; Mychajliw, Christian; Hautzinger, Martin; Fallgatter, Andreas J; Saur, Ralf; Leyhe, Thomas

    2014-01-01

    Alzheimer's disease (AD) is characterized by retrograde memory deficits primarily caused by dysfunction of the hippocampal complex. Unresolved questions exist concerning the time course of hippocampal involvement in conscious recollection of declarative knowledge, as reports of temporal gradients of retrograde amnesia have been inconclusive. The aim of this study was to examine whether the extent and severity of retrograde amnesia is mediated by retrieval frequency or, in contrast, whether it depends on the age of the memory according to the assumptions of the main current theories of memory formation. We compared recall of past public events in patients with AD and healthy control (HC) individuals using the Historic Events Test (HET). The HET assesses knowledge about famous public events of the past 60 years divided into four time segments and consists of subjective memory rating, dating accuracy, and contextual memory tasks. Although memory for public events was impaired in AD patients, there was a strong effect of retrieval frequency across all time segments and both groups. As AD and HC groups derived similar benefits from greater retrieval frequency, cortical structures other than the hippocampal complex may mediate memory retrieval. These findings suggest that more frequently retrieved events and facts become more independent of the hippocampal complex and thus better protected against early damage of AD. This could explain why cognitive activity may delay the onset of memory decline in persons who develop AD.

  7. Variability and trends in dry day frequency and dry event length in the southwestern United States

    Science.gov (United States)

    McCabe, Gregory J.; Legates, David R.; Lins, Harry F.

    2010-01-01

    Daily precipitation from 22 National Weather Service first-order weather stations in the southwestern United States for water years 1951 through 2006 are used to examine variability and trends in the frequency of dry days and dry event length. Dry events lasting at least 10 and at least 20 consecutive days with precipitation below 2.54 mm are analyzed. For water years and cool seasons (October through March), most sites indicate negative trends in dry event length (i.e., dry event durations are becoming shorter). For the warm season (April through September), most sites also indicate negative trends; however, more sites indicate positive trends in dry event length for the warm season than for water years or cool seasons. The larger number of sites indicating positive trends in dry event length during the warm season is due to a series of dry warm seasons near the end of the 20th century and the beginning of the 21st century. Overall, a large portion of the variability in dry event length is attributable to variability of the El Niño–Southern Oscillation, especially for water years and cool seasons. Our results are consistent with analyses of trends in discharge for sites in the southwestern United States, an increased frequency in El Niño events, and positive trends in precipitation in the southwestern United States.
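    The paper's dry-event definition (runs of at least 10 or 20 consecutive days with precipitation below 2.54 mm) maps directly onto a run-length scan. A minimal sketch, assuming only a plain list of daily precipitation totals in millimetres:

```python
def dry_events(daily_precip_mm, threshold=2.54, min_run=10):
    """Lengths of dry events: runs of >= min_run consecutive days below threshold."""
    events, run = [], 0
    for p in daily_precip_mm:
        if p < threshold:
            run += 1
        else:
            if run >= min_run:
                events.append(run)
            run = 0
    if run >= min_run:          # a dry run may extend to the end of the record
        events.append(run)
    return events
```

Trend analysis like the paper's would then track, per season and station, how these event lengths change across water years.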

  8. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2009-01-01

    The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for creation of dynamic process models and static...... information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and it is used to illustrate how events can...... be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used...

  9. Address-event-based platform for bioinspired spiking systems

    Science.gov (United States)

    Jiménez-Fernández, A.; Luján, C. D.; Linares-Barranco, A.; Gómez-Rodríguez, F.; Rivas, M.; Jiménez, G.; Civit, A.

    2007-05-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity between a huge number of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Also, neurons generate "events" according to their activity levels. More active neurons generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. When building multi-chip, multi-layered AER systems, it is absolutely necessary to have a computer interface that allows (a) reading AER interchip traffic into the computer and visualizing it on the screen, and (b) converting a conventional frame-based video stream in the computer into AER and injecting it at some point of the AER structure. This is necessary for test and debugging of complex AER systems. On the other hand, the use of a commercial personal computer implies dependence on software tools and operating systems that can make the system slower and less robust. This paper addresses the problem of communicating several AER-based chips to compose a powerful processing system. The problem was discussed in the Neuromorphic Engineering Workshop of 2006. The platform is based on an embedded computer, a powerful FPGA and serial links, to make the system faster and stand-alone (independent of a PC). A new platform is presented that allows connecting up to eight AER-based chips to a Spartan 3 4000 FPGA. The FPGA is responsible for the network communication based on Address-Event and, at the same time, maps and transforms the address space of the traffic to implement pre-processing. An MMU microprocessor (Intel XScale 400 MHz Gumstix Connex computer) is also connected to the FPGA
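    As a toy illustration of the AER idea described above (more active neurons emit more address events per unit time onto a shared channel), the following sketch merges per-neuron spike trains into a single time-ordered stream of (address, timestamp) pairs. Rates and addresses are invented, and real AER hardware arbitrates events asynchronously rather than sorting them after the fact.

```python
def aer_stream(rates_hz, duration_s):
    """Merge regular spike trains into a single time-multiplexed AER stream.

    rates_hz maps neuron address -> firing rate in Hz; each spike becomes an
    (address, timestamp) pair, and the merged list is ordered by time, as
    events would appear on a shared AER bus.
    """
    events = []
    for addr, rate in rates_hz.items():
        period = 1.0 / rate
        t = period
        while t <= duration_s:
            events.append((addr, round(t, 9)))
            t += period
    events.sort(key=lambda e: (e[1], e[0]))
    return events
```

A neuron firing at 10 Hz contributes five times as many events per second of bus traffic as one firing at 2 Hz, which is exactly the activity-proportional bandwidth sharing the abstract describes.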

  10. Frequency shifting at fiber-optical event horizons: The effect of Raman deceleration

    International Nuclear Information System (INIS)

    Robertson, S.; Leonhardt, U.

    2010-01-01

    Pulses in fibers establish analogs of the event horizon [Philbin et al., Science 319, 1367 (2008)]. At a group-velocity horizon, the frequency of a probe wave is shifted. We present a theoretical model of this frequency shifting, taking into account the deceleration of the pulse caused by the Raman effect. The theory shows that the probe-wave spectrum is sensitive to details of the probe-pulse interaction. Our results indicate an additional loss mechanism in the experiment [Philbin et al., Science 319, 1367 (2008)] that has not been accounted for. Our analysis is also valid for more general cases of the interaction of dispersive waves with decelerated solitons.

  11. Event- and interval-based measurement of stuttering: a review.

    Science.gov (United States)

    Valente, Ana Rita S; Jesus, Luis M T; Hall, Andreia; Leahy, Margaret

    2015-01-01

    Event- and interval-based measurements are two different ways of computing the frequency of stuttering. Interval-based methodology emerged as an alternative measure to overcome problems associated with reproducibility in the event-based methodology. No review has been made to study the effect of methodological factors on interval-based absolute reliability data or to compute the agreement between the two methodologies in terms of inter-judge and intra-judge reliability and accuracy (i.e., correspondence between raters' scores and an established criterion). The aims were to provide a review related to the reproducibility of event-based and time-interval measurement, to verify the effect of methodological factors (training, experience, interval duration, sample presentation order and judgment conditions) on agreement of time-interval measurement, and to determine whether it is possible to quantify the agreement between the two methodologies. The first two authors searched for articles on ERIC, MEDLINE, PubMed, B-on, CENTRAL and Dissertation Abstracts during January-February 2013 and retrieved 495 articles. Forty-eight articles were selected for review. Content tables were constructed with the main findings. Articles related to event-based measurements revealed inter- and intra-judge values greater than 0.70 and agreement percentages beyond 80%. The articles related to time-interval measures revealed that, in general, judges with more experience with stuttering presented significantly higher levels of intra- and inter-judge agreement. Inter- and intra-judge values exceeded the reference values for high reproducibility for both methodologies. Accuracy (regarding the closeness of raters' judgements to an established criterion) and intra- and inter-judge agreement were higher for trained groups when compared with non-trained groups. Sample presentation order and audio/video conditions did not result in differences in inter- or intra-judge results. A duration of 5 s for an interval appears to be
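    Interval-based agreement of the kind reviewed here is typically summarized as percent agreement, optionally chance-corrected. A minimal sketch for two judges' binary interval scores (1 = stuttered, 0 = fluent), using Cohen's kappa as the chance-corrected index; the scores are illustrative, not drawn from the reviewed articles.

```python
def interjudge_agreement(judge_a, judge_b):
    """Percent agreement and Cohen's kappa for two judges' binary interval scores."""
    n = len(judge_a)
    agree = sum(1 for a, b in zip(judge_a, judge_b) if a == b)
    p_o = agree / n                                # observed agreement
    pa1 = sum(judge_a) / n                         # proportion A marked 'stuttered'
    pb1 = sum(judge_b) / n                         # proportion B marked 'stuttered'
    p_e = pa1 * pb1 + (1 - pa1) * (1 - pb1)        # agreement expected by chance
    kappa = (p_o - p_e) / (1 - p_e) if p_e < 1 else 1.0
    return p_o, kappa
```

Judges who agree on every interval score kappa = 1; judges whose agreement is no better than chance score kappa = 0, which is why kappa is preferred over raw percentages when base rates are skewed.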

  12. Radio frequency based water level monitor

    African Journals Online (AJOL)

    eobe

    ABSTRACT. This paper elucidates a radio frequency (RF) based transmission and reception system used to remotely monitor and .... range the wireless can cover but in this prototype, it ... power supply to the system, the sensed water level is.

  13. Analysis of core damage frequency: Surry, Unit 1 internal events

    International Nuclear Information System (INIS)

    Bertucio, R.C.; Julius, J.A.; Cramond, W.R.

    1990-04-01

    This document contains the accident sequence analysis of internally initiated events for the Surry Nuclear Station, Unit 1. This is one of the five plant analyses conducted as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 documents the risk of a selected group of nuclear power plants. The work performed and described here is an extensive reanalysis of that published in November 1986 as NUREG/CR-4450, Volume 3. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved. The context and detail of this report are directed toward PRA practitioners who need to know how the work was performed and the details for use in further studies. The mean core damage frequency at Surry was calculated to be 4.05E-5 per year, with a 95% upper bound of 1.34E-4 and a 5% lower bound of 6.8E-6 per year. Station blackout type accidents (loss of all AC power) were the largest contributors to the core damage frequency, accounting for approximately 68% of the total. The next dominant contributors were Loss of Coolant Accidents (LOCAs). These sequences account for 15% of core damage frequency. No other type of sequence accounts for more than 10% of core damage frequency. 49 refs., 52 figs., 70 tabs
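    The mean-plus-percentile reporting above (a mean with 5% and 95% bounds) is the typical output of a Monte Carlo propagation of sequence-frequency uncertainties, as also discussed in the top-event importance work at the head of this listing. A minimal sketch, assuming (hypothetically) that each accident-sequence frequency is lognormally distributed and that the total is their sum per trial; the parameters are placeholders, not the Surry study's actual sequences.

```python
import random

def cdf_percentiles(sequence_params, n_samples=20000, seed=1):
    """Monte Carlo mean and 5%/95% bounds of a total core damage frequency.

    sequence_params: list of (mu, sigma) pairs for lognormally distributed
    per-sequence frequencies (mu, sigma are the parameters of the underlying
    normal). The total CDF per trial is the sum across sequences.
    """
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.lognormvariate(mu, sigma) for mu, sigma in sequence_params)
        for _ in range(n_samples)
    )
    mean = sum(totals) / n_samples
    lo = totals[int(0.05 * n_samples)]   # 5% lower bound
    hi = totals[int(0.95 * n_samples)]   # 95% upper bound
    return mean, lo, hi
```

With skewed lognormal inputs the mean sits well above the median, which is why the reported mean (4.05E-5) lies much closer to the 95% bound than the arithmetic midpoint of the two bounds.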

  14. Towards Real-Time Detection of Gait Events on Different Terrains Using Time-Frequency Analysis and Peak Heuristics Algorithm.

    Science.gov (United States)

    Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin

    2016-10-01

    Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and reduced reliability on stair ascent and descent terrains. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real-time, based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, followed by determination of the peaks of the jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects when they were walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses.
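    The processing chain described (differentiate acceleration to obtain jerk, then select peaks with heuristics) can be sketched as follows. The threshold and refractory-gap values are placeholders, not the authors' tuned heuristics.

```python
def jerk(acc, fs):
    """Jerk as the first difference of acceleration scaled by the sampling rate."""
    return [(acc[i + 1] - acc[i]) * fs for i in range(len(acc) - 1)]

def find_peaks(signal, threshold, min_gap):
    """Indices of local maxima above threshold, at least min_gap samples apart.

    The minimum gap plays the role of a refractory period: two gait events
    of the same type cannot occur within a few samples of each other.
    """
    peaks = []
    for i in range(1, len(signal) - 1):
        if signal[i] >= threshold and signal[i] > signal[i - 1] and signal[i] >= signal[i + 1]:
            if not peaks or i - peaks[-1] >= min_gap:
                peaks.append(i)
    return peaks
```

In the paper's setting, separate thresholds and gaps would be chosen per terrain and per event type (HS vs. TO) from the time-frequency gait parameters.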

  15. Multivariate hydrological frequency analysis for extreme events using Archimedean copula. Case study: Lower Tunjuelo River basin (Colombia)

    Science.gov (United States)

    Gómez, Wilmar

    2017-04-01

    By analyzing the spatial and temporal variability of extreme precipitation events we can prevent or reduce the threat and risk. Many water resources projects require joint probability distributions of random variables such as precipitation intensity and duration, which are generally not independent of each other. The problem of defining a probability model for observations of several dependent variables is greatly simplified by expressing the joint distribution in terms of its marginals using copulas. This document presents a general framework for bivariate and multivariate frequency analysis using Archimedean copulas for extreme events of a hydroclimatological nature, such as severe storms. This analysis was conducted in the lower Tunjuelo River basin in Colombia for precipitation events. The results obtained show that for a joint study of intensity-duration-frequency, IDF curves can be obtained through copulas, establishing more accurate and reliable information on design storms and associated risks. It also shows how the use of copulas greatly simplifies the study of multivariate distributions and introduces the concept of the joint return period, used to properly represent hydrological design requirements in frequency analysis.
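    For a concrete Archimedean example, the Gumbel-Hougaard copula is a common choice for positively associated intensity-duration pairs (the abstract does not name the specific family used, so this is an assumed stand-in). The sketch below evaluates the copula and the "AND" joint return period, i.e. the mean recurrence interval of both marginals being exceeded simultaneously.

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel-Hougaard (Archimedean) copula C(u, v) with theta >= 1.

    theta = 1 reduces to independence, C(u, v) = u * v; larger theta
    means stronger upper-tail dependence.
    """
    s = (-math.log(u)) ** theta + (-math.log(v)) ** theta
    return math.exp(-(s ** (1.0 / theta)))

def joint_return_period_and(u, v, theta, mu_years=1.0):
    """'AND' joint return period: both marginal levels exceeded in one event.

    u, v are the marginal non-exceedance probabilities of the design levels;
    mu_years is the mean inter-event time.
    """
    p_exceed_both = 1.0 - u - v + gumbel_copula(u, v, theta)
    return mu_years / p_exceed_both
```

Under independence, two 10-year levels (u = v = 0.9) jointly recur every 100 years; positive dependence (theta > 1) makes joint exceedance more likely and shortens that joint return period, which is exactly why dependence matters for design storms.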

  16. Attitudes of Consumers from the Mostar Canton in Bosnia and Herzegovina toward Advertising through Sport among the Frequency of Watching Sports Events

    Directory of Open Access Journals (Sweden)

    Marina Vukotic

    2018-04-01

    Full Text Available It is proposed that the attitudes potential consumers form from advertising through sport can influence decisions to purchase a particular advertiser's product. For this reason, it is important to analyse their general attitudes toward advertising through sport across various questions, and this investigation was aimed at gaining relevant knowledge about the attitudes of Mostar consumers toward advertising through sport in relation to the frequency of watching sports events. The sample included 228 respondents, divided into six subsample groups: consumers who do not watch sports events at all, then consumers who watch sports events for 1-30 minutes, 31-60 minutes, 61-90 minutes, 91-120 minutes, and consumers who watch sports events for more than 120 minutes during a typical day. The sample of variables contained a system of three general attitudes, modelled on a seven-point Likert scale. The results were analysed by multivariate analysis (MANOVA) and univariate analysis (ANOVA), followed by post hoc tests. Based on the statistical analyses, significant differences were found at the multivariate level, as well as for all three variables, at a significance level of p=.006. Hence, it is worth highlighting that significant differences emerged between consumers' attitudes toward advertising through sport depending on the frequency of watching sports events. These results are important for marketers, chiefly because they cannot treat all potential consumers alike regardless of how long they watch sports events, a pattern also observed in previous investigations.

  17. Attitudes of Consumers from the Sarajevo Canton in Bosnia and Herzegovina toward Advertising through Sport among the Frequency of Watching Sports Events

    Directory of Open Access Journals (Sweden)

    Izet Bajramovic

    2018-04-01

    Full Text Available It is proposed that the attitudes potential consumers form from advertising through sport can influence decisions to purchase a particular advertiser's product. For this reason, it is important to analyse their general attitudes toward advertising through sport across various questions, and this investigation was aimed at gaining relevant knowledge about the attitudes of Sarajevo consumers toward advertising through sport in relation to the frequency of watching sports events. The sample included 358 respondents, divided into six subsample groups: consumers who do not watch sports events at all, then consumers who watch sports events for 1-30 minutes, 31-60 minutes, 61-90 minutes, 91-120 minutes, and consumers who watch sports events for more than 120 minutes during a typical day. The sample of variables contained a system of three general attitudes, modelled on a seven-point Likert scale. The results were analysed by multivariate analysis (MANOVA) and univariate analysis (ANOVA), followed by post hoc tests. Based on the statistical analyses, significant differences were found at the multivariate level, as well as for all three variables, at a significance level of p=.00. Hence, it is worth highlighting that significant differences emerged between consumers' attitudes toward advertising through sport depending on the frequency of watching sports events. These results are important for marketers, chiefly because they cannot treat all potential consumers alike regardless of how long they watch sports events, a pattern also observed in previous investigations.

  18. Twitter data analysis: temporal and term frequency analysis with real-time event

    Science.gov (United States)

    Yadav, Garima; Joshi, Mansi; Sasikala, R.

    2017-11-01

    Over the past few years, the World Wide Web (WWW) has become a prominent and huge source of user-generated content and opinionative data. Among various social media, Twitter gained popularity as it offers a fast and effective way of sharing users' perspectives on critical and other issues in different domains, such as the 'Political', 'Entertainment' and 'Business' domains. As the data is hugely generated on the cloud, it has opened doors for researchers in the field of data science and analysis. Twitter provides various APIs for developers: 1) the Search API, which focuses on old tweets; 2) the REST API, which focuses on user details and allows collecting user profiles, friends and followers; and 3) the Streaming API, which collects details like tweets, hashtags and geolocations. In our work we access the Streaming API in order to fetch real-time tweets for a dynamic, ongoing event. For this we focus on the 'Entertainment' domain, especially 'Sports', as IPL-T20 is currently the trending ongoing event. We collect this large number of tweets and store them in a MongoDB database, where the tweets are stored in JSON document format. On these documents we perform time-series analysis and term frequency analysis using different techniques such as filtering and information extraction for text mining, which fulfils our objective of finding interesting moments in the temporal data of the event and ranking players or teams by popularity, helping people understand key influencers on the social media platform.
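    The term-frequency and temporal steps described above can be sketched over tweet-like JSON documents. The field names (text, created_at) follow Twitter's conventions, but the documents, stopword list and timestamp format here are invented for illustration; a real pipeline would read them from the MongoDB collection.

```python
from collections import Counter
from datetime import datetime

def term_frequencies(tweet_docs, stopwords=frozenset({"the", "a", "is", "rt"})):
    """Counter of lowercase terms across tweet documents, minus stopwords."""
    counts = Counter()
    for doc in tweet_docs:
        for token in doc["text"].lower().split():
            token = token.strip("#@.,!?")      # crude filtering of punctuation/tags
            if token and token not in stopwords:
                counts[token] += 1
    return counts

def tweets_per_minute(tweet_docs):
    """Per-minute tweet counts, the basis of the time-series analysis."""
    buckets = Counter()
    for doc in tweet_docs:
        ts = datetime.strptime(doc["created_at"], "%Y-%m-%d %H:%M:%S")
        buckets[ts.strftime("%Y-%m-%d %H:%M")] += 1
    return buckets
```

Spikes in the per-minute counts mark "interesting moments" in the match, and the term counts rank players or teams by mention popularity.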

  19. Effects of Sound Frequency on Audiovisual Integration: An Event-Related Potential Study.

    Science.gov (United States)

    Yang, Weiping; Yang, Jingjing; Gao, Yulin; Tang, Xiaoyu; Ren, Yanna; Takahashi, Satoshi; Wu, Jinglong

    2015-01-01

    A combination of signals across modalities can facilitate sensory perception. The audiovisual facilitative effect strongly depends on the features of the stimulus. Here, we investigated how sound frequency, which is one of the basic features of an auditory signal, modulates audiovisual integration. In this study, the task of the participant was to respond to a visual target stimulus by pressing a key while ignoring auditory stimuli, comprising tones of different frequencies (0.5, 1, 2.5 and 5 kHz). A significant facilitation of reaction times was obtained following audiovisual stimulation, irrespective of whether the task-irrelevant sounds were low or high frequency. Using event-related potentials (ERPs), audiovisual integration was found over the occipital area for 0.5 kHz auditory stimuli from 190-210 ms, for 1 kHz stimuli from 170-200 ms, for 2.5 kHz stimuli from 140-200 ms, and for 5 kHz stimuli from 100-200 ms. These findings suggest that a higher frequency sound signal paired with visual stimuli might be processed or integrated earlier, despite the auditory stimuli being task-irrelevant information. Furthermore, audiovisual integration in late-latency (300-340 ms) ERPs with fronto-central topography was found for auditory stimuli of lower frequencies (0.5, 1 and 2.5 kHz). Our results confirmed that audiovisual integration is affected by the frequency of an auditory stimulus. Taken together, the neurophysiological results provide unique insight into how the brain processes a multisensory visual signal and auditory stimuli of different frequencies.

  20. Issues in Informal Education: Event-Based Science Communication Involving Planetaria and the Internet

    Science.gov (United States)

    Adams, Mitzi L.; Gallagher, D. L.; Whitt, A.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing real-time science related events has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases broadcasts accommodate active feedback and questions from Internet participants. Panel participation will be used to communicate the problems and lessons learned from these activities over the last three years.

  1. GHz band frequency hopping PLL-based frequency synthesizers

    Institute of Scientific and Technical Information of China (English)

    XU Yong; WANG Zhi-gong; GUAN Yu; XU Zhi-jun; QIAO Lu-feng

    2005-01-01

    In this paper we describe a fully integrated circuit containing all building blocks of a complete PLL-based synthesizer except for the low pass filter (LPF). The frequency synthesizer is designed as a local oscillator for a frequency hopping (FH) transceiver operating up to 1.5 GHz. The architecture of the Voltage Controlled Oscillator (VCO) is optimized for better performance; a phase noise of -111.85 dBc/Hz @ 1 MHz and a tuning range of 250 MHz are achieved at a centre frequency of 1.35 GHz. A novel Dual-Modulus Prescaler (DMP) is designed to achieve very low jitter and lower power. The settling time of the PLL is 80 μs with a reference frequency of 400 kHz. This monolithic frequency synthesizer integrates all main building blocks of the PLL except for the low pass filter, with a maximum VCO output frequency of 1.5 GHz, and is fabricated in a 0.18 μm mixed-signal CMOS process. Low power dissipation, low phase noise, a large tuning range and fast settling time are achieved in this design.
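    Assuming an integer-N architecture (the abstract does not state the divider type, so this is an assumption), the output frequency is f_out = N x f_ref, and the quoted 400 kHz reference with a 1.35 GHz centre and 250 MHz tuning range pins down the divide-ratio span. A small arithmetic check:

```python
import math

def divider_range(f_center_hz, tuning_range_hz, f_ref_hz):
    """Integer divide ratios N covering f_center +/- tuning_range/2 at f_out = N * f_ref."""
    n_min = math.ceil((f_center_hz - tuning_range_hz / 2) / f_ref_hz)
    n_max = math.floor((f_center_hz + tuning_range_hz / 2) / f_ref_hz)
    return n_min, n_max

def output_frequency(n, f_ref_hz):
    """Synthesized output frequency for divide ratio n."""
    return n * f_ref_hz
```

With these numbers the channel spacing equals the 400 kHz reference, N = 3375 sits at the 1.35 GHz centre, and the tuning range corresponds to roughly N = 3063 to 3687.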

  2. Determination of the frequency and direct cost of the adverse drug events in Argentina.

    Science.gov (United States)

    Izquierdo, Estela; Rodríguez, Claudio; Pampliega, Eneas; Filinger, Ester

    2009-05-01

    To determine the frequency and the direct costs of adverse drug reactions in an ambulatory population of the City of Buenos Aires, Argentina, and its area of influence. A retrospective study was done over a period of three months on approximately 300,000 residents of the Buenos Aires area, gathering data according to the selected variables by means of the electronic capture of prescriptions dispensed in pharmacies of the area. This method enables the detection and registration of potential conflicts that may arise between a prescribed drug and factors such as the patient's demographic, clinical and drug profile. The analysis unit was defined as the occurrence of a moderate or severe adverse event reported by the system. The selected variables were the incidence of these effects, and the direct cost was calculated as the value of the drugs that induced the adverse event. The events were classified according to the following interactions: a) drug-drug, b) drug-pediatrics, c) drug-gender, d) drug-pregnancy, and abuse of controlled substances. The observed frequency shows great variability and highlights the shortage of available data for ambulatory populations. We found reported events in 6.74% of the total processed items, which generated an additional cost equivalent to 4.58% of total pharmaceutical expenses. This study evaluated only the cost incurred by the use of a drug leading to an adverse reaction. Moderate and severe reactions were included, while important indirect costs (hospitalization costs, tests, physician fees, etc.) were not considered.

  3. Analysis of core damage frequency, Surry, Unit 1 internal events appendices

    International Nuclear Information System (INIS)

    Bertucio, R.C.; Julius, J.A.; Cramond, W.R.

    1990-04-01

    This document contains the appendices for the accident sequence analyses of internally initiated events for the Surry Nuclear Station, Unit 1. This is one of the five plant analyses conducted as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 documents the risk of a selected group of nuclear power plants. The work performed is an extensive reanalysis of that published in November 1986 as NUREG/CR-4450, Volume 3. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved. The context and detail of this report are directed toward PRA practitioners who need to know how the work was performed and the details for use in further studies. The mean core damage frequency at Surry was calculated to be 4.0E-5 per year, with a 95% upper bound of 1.3E-4 and 5% lower bound of 6.8E-6 per year. Station blackout type accidents (loss of all AC power) were the largest contributors to the core damage frequency, accounting for approximately 68% of the total. The next type of dominant contributors were Loss of Coolant Accidents (LOCAs). These sequences account for 15% of core damage frequency. No other type of sequence accounts for more than 10% of core damage frequency

  4. Central FPGA-based destination and load control in the LHCb MHz event readout

    International Nuclear Information System (INIS)

    Jacobsson, R.

    2012-01-01

    The readout strategy of the LHCb experiment is based on complete event readout at 1 MHz. A set of 320 sub-detector readout boards transmit event fragments at a total rate of 24.6 MHz, at a bandwidth usage of up to 70 GB/s, over a commercial switching network based on Gigabit Ethernet to a distributed event building and high-level trigger processing farm with 1470 individual multi-core computer nodes. In the original specifications, the readout was based on a pure push protocol. This paper describes the proposal, implementation, and experience of a non-conventional mixture of a push and a pull protocol, akin to credit-based flow control. An FPGA-based central master module, partly operating at the LHC bunch clock frequency of 40.08 MHz and partly at double that clock speed, is in charge of the entire trigger and readout control from the front-end electronics up to the high-level trigger farm. One FPGA is dedicated to controlling the event fragment packing in the readout boards and the assignment of the farm node destination for each event, and to regulating the farm load based on an asynchronous pull mechanism from each farm node. This dynamic readout scheme relies on generic event requests and the concept of node credit, allowing load control and trigger rate regulation as a function of the global farm load. It also enables the vital task of fast central monitoring and automatic in-flight recovery of failing nodes while keeping dead-time and event loss at a minimum. This paper demonstrates the strength and suitability of implementing this real-time task for a very large distributed system in an FPGA, where no random delays are introduced and where extreme reliability and accurate event accounting are fundamental requirements. The system was in use during the entire commissioning phase of LHCb and has been in faultless operation during the first two years of physics luminosity data taking.

  5. Central FPGA-based destination and load control in the LHCb MHz event readout

    Science.gov (United States)

    Jacobsson, R.

    2012-10-01

    The readout strategy of the LHCb experiment is based on complete event readout at 1 MHz. A set of 320 sub-detector readout boards transmit event fragments at a total rate of 24.6 MHz, at a bandwidth usage of up to 70 GB/s, over a commercial switching network based on Gigabit Ethernet to a distributed event building and high-level trigger processing farm with 1470 individual multi-core computer nodes. In the original specifications, the readout was based on a pure push protocol. This paper describes the proposal, implementation, and experience of a non-conventional mixture of a push and a pull protocol, akin to credit-based flow control. An FPGA-based central master module, partly operating at the LHC bunch clock frequency of 40.08 MHz and partly at double that clock speed, is in charge of the entire trigger and readout control from the front-end electronics up to the high-level trigger farm. One FPGA is dedicated to controlling the event fragment packing in the readout boards and the assignment of the farm node destination for each event, and to regulating the farm load based on an asynchronous pull mechanism from each farm node. This dynamic readout scheme relies on generic event requests and the concept of node credit, allowing load control and trigger rate regulation as a function of the global farm load. It also enables the vital task of fast central monitoring and automatic in-flight recovery of failing nodes while keeping dead-time and event loss at a minimum. This paper demonstrates the strength and suitability of implementing this real-time task for a very large distributed system in an FPGA, where no random delays are introduced and where extreme reliability and accurate event accounting are fundamental requirements. The system was in use during the entire commissioning phase of LHCb and has been in faultless operation during the first two years of physics luminosity data taking.

  6. Dynamic model based novel findings in power systems analysis and frequency measurement verification

    Science.gov (United States)

    Kook, Kyung Soo

    This study selects several new advanced topics in power systems and verifies their usefulness using simulation. In the study of the ratio of the equivalent reactance to the equivalent resistance of bulk power systems, the simulation results give a more accurate value of X/R for the bulk power system, which explains why active power compensation is also important in voltage flicker mitigation. In the application study of the Energy Storage System (ESS) for wind power, a new model implementation of an ESS connected to wind power is proposed, and the control effect of the ESS on the intermittency of the wind power is verified. This study also conducts intensive simulations to clarify the behavior of wide-area power system frequency as well as the possibility of on-line instability detection. In our POWER IT Laboratory, the U.S. national frequency monitoring network (FNET) has been operated continuously since 2003 to monitor wide-area power system frequency in the U.S. Using the measured frequency data, power system events are detected, and their location and scale are estimated. This study also explores the possibility of using simulation technologies to contribute to FNET applications, finds similarity in the order of event detection between the frequency measurements and simulations of the U.S. Eastern power grid, and develops a new methodology for estimating event location based on simulated N-1 contingencies and the frequency measurements. It has been pointed out that simulation results cannot fully represent the actual response of power systems, owing to the inevitable limits of power system modeling and to operating conditions that differ at every second. However, given the need to test such an important infrastructure, which supplies electric energy, without putting it at any risk, software-based simulation is the best solution for verifying new technologies in power systems.

  7. Study on LOOP and SBO Frequency for Multi-Unit PSA

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Kyung Ho; Heo, Gyun Young [Kyunghee University, Yongin (Korea, Republic of)

    2016-05-15

    In conventional single-unit PSA, it was assumed that all accidents or events are independent, and the risk of only one unit was evaluated. In other words, the possibility of simultaneous events on multiple units was excluded because the probability of concurrent events was assumed to be extremely low. After the Fukushima accident, however, it was found that external hazards such as tsunami may affect multiple units, which means that for proper mitigation the risk of accidents involving shared SSCs should be reevaluated. New risk metrics are needed to improve the conventional reactor-year-based CDF (Core Damage Frequency) for performing MUPSA (Multi-Unit Probabilistic Safety Assessment). IAEA suggested SCDF (Site CDF) as a risk metric for MUPSA: the frequency based on reactor-year is converted to a frequency based on site-year. In addition, shared SSCs were modeled in single-unit PSA as if each unit had its own independent shared SSCs; therefore, the risk of shared SSCs should be reevaluated. In this paper, the frequency of LOOP (Loss of Offsite Power), which is typically a multi-unit event, was evaluated, and the frequency of SBO (Station Blackout) was modeled as depending on the LOOP frequency and on the emergency power systems, such as the EDG (Emergency Diesel Generator) and AAC (Alternate AC) source, that can mitigate SBO events. This paper describes how to calculate the LOOP and SBO frequencies using a simple example of a two-unit site with a shared AAC. Events impacting multiple units, which were excluded in conventional PSA, should be considered in MUPSA.
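The two-unit LOOP/SBO logic described above can be sketched numerically. All frequencies and probabilities below are hypothetical placeholders, and the rule for allocating the single shared AAC when both units need it is an assumption, not the paper's model:

```python
# Hedged sketch: SBO frequency for a two-unit site with one shared AAC.
# All numbers are hypothetical placeholders, not values from the paper.
F_LOOP_SITE = 2.0e-2   # LOOP frequency per site-year (multi-unit event)
P_EDG_FAIL = 1.0e-2    # probability all EDGs at one unit fail on demand
P_AAC_FAIL = 5.0e-2    # probability the shared AAC fails on demand

# Assumed allocation rule: if both units lose their EDGs, the single
# shared AAC is assigned to one of the two units at random (factor 0.5).
p_aac_unavailable = P_AAC_FAIL + (1 - P_AAC_FAIL) * P_EDG_FAIL * 0.5

# Single-unit SBO: LOOP, EDG failure at that unit, AAC unavailable to it.
f_sbo_unit = F_LOOP_SITE * P_EDG_FAIL * p_aac_unavailable
# Dual-unit SBO: LOOP, EDG failure at both units, AAC fails outright.
f_sbo_both = F_LOOP_SITE * P_EDG_FAIL ** 2 * P_AAC_FAIL

print(f"single-unit SBO frequency ~ {f_sbo_unit:.2e} per site-year")
print(f"dual-unit SBO frequency   ~ {f_sbo_both:.2e} per site-year")
```

The point the example makes is structural: because the AAC is shared, the single-unit SBO frequency depends on the other unit's EDG state, which single-unit PSA ignores.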

  8. Modelado del transformador para eventos de alta frecuencia; Transformer model for high frequency events

    Directory of Open Access Journals (Sweden)

    Verónica Adriana Galván Sánchez

    2012-07-01

    Full Text Available The transformer's function is to change the voltage level through a magnetic coupling. Due to its physical construction, its representation as a circuit and its mathematical model are very complex. The electromagnetic behavior of the transformer, like that of every element in the power network, depends on the frequency involved, so for high-frequency phenomena its model must be very detailed in order to reproduce the transient behavior. This work analyzes how to move from a very simple model to a very detailed model for simulating high-frequency events. The simulated events are the operation of a breaker due to a fault in the system and the impact of an atmospheric discharge (direct stroke) on the transmission line, 5 km from a power substation.

  9. Disruption of perineuronal nets increases the frequency of sharp wave ripple events.

    Science.gov (United States)

    Sun, Zhi Yong; Bozzelli, P Lorenzo; Caccavano, Adam; Allen, Megan; Balmuth, Jason; Vicini, Stefano; Wu, Jian-Young; Conant, Katherine

    2018-01-01

    Hippocampal sharp wave ripples (SWRs) represent irregularly occurring synchronous neuronal population events that are observed during phases of rest and slow wave sleep. SWR activity that follows learning involves sequential replay of training-associated neuronal assemblies and is critical for systems-level memory consolidation. SWRs are initiated by CA2 or CA3 pyramidal cells (PCs) and require initial excitation of CA1 PCs as well as participation of parvalbumin (PV) expressing fast-spiking (FS) inhibitory interneurons. These interneurons are relatively unique in that they represent the major neuronal cell type known to be surrounded by perineuronal nets (PNNs), lattice-like structures composed of a hyaluronan backbone that surround the cell soma and proximal dendrites. Though the function of the PNN is not completely understood, previous studies suggest it may serve to localize glutamatergic input to synaptic contacts and thus influence the activity of ensheathed cells. Noting that FS PV interneurons impact the activity of PCs thought to initiate SWRs, and that their activity is critical to ripple expression, we examine the effects of PNN integrity on SWR activity in the hippocampus. Extracellular recordings from the stratum radiatum of horizontal murine hippocampal hemisections demonstrate SWRs that occur spontaneously in CA1. As compared with vehicle, pre-treatment (120 min) of paired hemislices with hyaluronidase, which cleaves the hyaluronan backbone of the PNN, decreases PNN integrity and increases SWR frequency. Pre-treatment with chondroitinase, which cleaves PNN side chains, also increases SWR frequency. Together, these data contribute to an emerging appreciation of the extracellular matrix as a regulator of neuronal plasticity and suggest that one function of mature perineuronal nets could be to modulate the frequency of SWR events. © 2017 Wiley Periodicals, Inc.
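A common offline way to quantify SWR frequency in recordings like these is to isolate the ripple band and threshold its envelope. The sketch below illustrates that generic pipeline on synthetic data; the band limits, smoothing window, and threshold are typical choices, not the authors' exact analysis:

```python
# Hedged sketch: generic ripple-band detection, not the paper's pipeline.
import numpy as np

def detect_ripples(lfp, fs, band=(150.0, 250.0), n_sd=3.0):
    """Return sample indices where the ripple-band envelope rises above
    mean + n_sd standard deviations (offline, acausal sketch)."""
    spec = np.fft.rfft(lfp)
    freqs = np.fft.rfftfreq(lfp.size, 1.0 / fs)
    spec[(freqs < band[0]) | (freqs > band[1])] = 0.0   # crude bandpass
    ripple = np.fft.irfft(spec, n=lfp.size)
    win = max(1, int(0.010 * fs))                       # 10 ms smoothing
    envelope = np.convolve(np.abs(ripple), np.ones(win) / win, mode="same")
    thresh = envelope.mean() + n_sd * envelope.std()
    above = envelope > thresh
    return np.flatnonzero(np.diff(above.astype(int)) == 1)  # rising edges

# Synthetic LFP: white noise plus one 200 Hz burst around t = 0.5 s
fs = 1250.0
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(0)
lfp = 0.1 * rng.standard_normal(t.size)
lfp += (np.abs(t - 0.5) < 0.025) * np.sin(2 * np.pi * 200.0 * t)
onsets = detect_ripples(lfp, fs)
print(onsets / fs)   # onset(s) near the burst start at ~0.475 s
```

Counting such onsets per unit time gives the event frequency compared between treatment conditions.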

  10. Brain-computer interface based on intermodulation frequency

    Science.gov (United States)

    Chen, Xiaogang; Chen, Zhikai; Gao, Shangkai; Gao, Xiaorong

    2013-12-01

    Objective. Most recent steady-state visual evoked potential (SSVEP)-based brain-computer interface (BCI) systems have used a single frequency for each target, so that a large number of targets requires a large number of stimulus frequencies and therefore a wider frequency band. However, human beings show good SSVEP responses only in a limited range of frequencies. Furthermore, this issue is especially problematic if the SSVEP-based BCI uses a PC monitor as the stimulator, which is only capable of generating a limited range of frequencies. To mitigate this issue, this study presents an innovative coding method for SSVEP-based BCI by means of intermodulation frequencies. Approach. Simultaneous modulations of stimulus luminance and color at different frequencies were used to induce intermodulation frequencies. Luminance flickered at relatively high frequencies (10, 12, or 15 Hz), while color alternated at low frequencies (0.5 or 1 Hz). An attractive feature of the proposed method is that it substantially increases the number of targets available at a single flickering frequency by altering the color-modulation frequency. Based on this method, the BCI system presented in this study realized eight targets using merely three flickering frequencies. Main results. The online results obtained from 15 subjects (14 healthy and 1 with stroke) revealed that an average classification accuracy of 93.83% and an information transfer rate (ITR) of 33.80 bit/min were achieved using the proposed SSVEP-based BCI system. Specifically, 5 of the 15 subjects exhibited an ITR of 40.00 bit/min with a classification accuracy of 100%. Significance. These results suggest that intermodulation frequencies can be adopted as steady responses in BCI, and that our system could be used as a practical BCI system.
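The coding idea can be illustrated by enumerating the stimulus pairs and the intermodulation components they induce. The pairing and the first-order terms f1 ± f2 below are a simplified sketch; the paper's exact assignment of eight targets to three flicker frequencies may combine conditions differently:

```python
# Hedged sketch: target codes from intermodulation of a luminance flicker
# frequency f1 and a color-alternation frequency f2 (illustrative only).
luminance_freqs = [10.0, 12.0, 15.0]  # Hz, high-rate luminance flicker
color_freqs = [0.5, 1.0]              # Hz, slow color alternation

targets = []
for f1 in luminance_freqs:
    for f2 in color_freqs:
        # A nonlinear visual response to the combined stimulus contains
        # power at the intermodulation frequencies f1 - f2 and f1 + f2.
        targets.append({"f1": f1, "f2": f2, "im": (f1 - f2, f1 + f2)})

for tgt in targets:
    print(tgt)
# Six codes from the 3 x 2 grid; adding, e.g., a luminance-only condition
# per flicker frequency would extend this toward eight targets.
```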

  11. Do the frequencies of adverse events increase, decrease, or stay the same with long-term use of statins?

    Science.gov (United States)

    Huddy, Karlyn; Dhesi, Pavittarpaul; Thompson, Paul D

    2013-02-01

    Statins are widely used for their cholesterol-lowering properties and proven reduction of cardiovascular disease risk. Many patients take statins as long-term treatment for a variety of conditions without a clear-cut understanding of how treatment duration affects the frequency of adverse effects. We aimed to evaluate whether the frequencies of documented adverse events increase, decrease, or remain unchanged with long-term statin use. We reviewed the established literature to define the currently known adverse effects of statin therapy, including myopathy, central nervous system effects, and the appearance of diabetes, and the frequency of these events with long-term medication use. The frequency of adverse effects associated with long-term statin therapy appears to be low. Many patients who develop side effects from statin therapy do so relatively soon after initiation of therapy, so the frequency of side effects from statin therapy when expressed as a percentage of current users decreases over time. Nevertheless, patients may develop side effects such as muscle pain and weakness years after starting statin therapy; however, the absolute number of patients affected by statin myopathy increases with treatment duration. Also, clinical trials of statin therapy rarely exceed 5 years, so it is impossible to determine with certainty the frequency of long-term side effects with these drugs.

  12. Analysis of core damage frequency from internal events: Methodology guidelines: Volume 1

    International Nuclear Information System (INIS)

    Drouin, M.T.; Harper, F.T.; Camp, A.L.

    1987-09-01

    NUREG-1150 examines the risk to the public from a selected group of nuclear power plants. This report describes the methodology used to estimate the internal-event core damage frequencies of four plants in support of NUREG-1150. In principle, this methodology is similar to methods used in past probabilistic risk assessments; however, drawing on past studies and using analysts who are experienced in these techniques, the analyses can be focused on certain areas. In this approach, only the most important systems and failure modes are modeled in detail. Further, the data and human reliability analyses are simplified, with emphasis on the most important components and human actions. Using these methods, an analysis can be completed in six to nine months using two to three full-time systems analysts and part-time personnel in other areas, such as data analysis and human reliability analysis. This is significantly faster and less costly than previous analyses and provides most of the insights obtained by the more costly studies. 82 refs., 35 figs., 27 tabs.

  13. DEVS representation of dynamical systems - Event-based intelligent control. [Discrete Event System Specification

    Science.gov (United States)

    Zeigler, Bernard P.

    1989-01-01

    It is shown how systems can be advantageously represented as discrete-event models by using DEVS (discrete-event system specification), a set-theoretic formalism. Such DEVS models provide a basis for the design of event-based logic control. In this control paradigm, the controller expects to receive confirming sensor responses to its control commands within definite time windows determined by its DEVS model of the system under control. The event-based control paradigm is applied in advanced robotics and intelligent automation, showing how classical process control can be readily interfaced with rule-based symbolic reasoning systems.
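The time-window logic of this control paradigm can be sketched as follows; the function name, return labels, and window values are illustrative, not from the paper:

```python
# Hedged sketch of event-based control: after issuing a command, the
# controller expects a confirming sensor event inside a time window
# [t_min, t_max] predicted by its discrete-event model of the plant.
def check_response(command_time, sensor_events, t_min, t_max):
    """Classify the plant's response to a command issued at command_time.

    sensor_events: timestamps of confirming sensor events. Any response
    outside the model-predicted window is treated as a fault symptom.
    """
    lo, hi = command_time + t_min, command_time + t_max
    if any(lo <= t <= hi for t in sensor_events):
        return "ok"
    if any(command_time <= t < lo for t in sensor_events):
        return "too-early"   # faster than the model allows: flag a fault
    return "timeout"         # no confirmation in the window: diagnose

# Example: command at t = 0, model predicts confirmation within 2-5 s
print(check_response(0.0, [3.1], 2.0, 5.0))   # ok
print(check_response(0.0, [0.4], 2.0, 5.0))   # too-early
print(check_response(0.0, [], 2.0, 5.0))      # timeout
```

A deviation in either direction is informative: both early responses and timeouts indicate that the plant no longer matches the controller's DEVS model.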

  14. A Machine Learning-based Rainfall System for GPM Dual-frequency Radar

    Science.gov (United States)

    Tan, H.; Chandrasekar, V.; Chen, H.

    2017-12-01

    Precipitation measurements produced by the Global Precipitation Measurement (GPM) Dual-frequency Precipitation Radar (DPR) play an important role in studying the water cycle and forecasting extreme weather events. Compared with its predecessor, the Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar (PR), GPM DPR measures precipitation at two frequencies (Ku and Ka band), which can provide detailed information on the microphysical properties of precipitation particles, quantify particle size distributions, and quantitatively measure light rain and falling snow. This paper presents a novel machine learning system for ground-based and spaceborne radar rainfall estimation. The system first trains on ground radar data for rainfall estimation using rainfall measurements from gauges, and subsequently uses the ground-radar-based rainfall estimates to train on GPM DPR data in order to obtain a space-based rainfall product. Data alignment between the spaceborne DPR and the ground radar is conducted using the methodology proposed by Bolen and Chandrasekar (2013), which minimizes the effects of potential geometric distortion of GPM DPR observations. For demonstration purposes, rainfall measurements from three rain gauge networks near Melbourne, Florida, are used for training and validation. These networks, located in the Kennedy Space Center (KSC), the South Florida Water Management District (SFL), and the St. Johns Water Management District (STJ), include 33, 46, and 99 rain gauge stations, respectively. Collocated ground radar observations from the National Weather Service (NWS) Weather Surveillance Radar - 1988 Doppler (WSR-88D) in Melbourne (the KMLB radar) are trained against the gauge measurements. The trained model is then used to derive a KMLB-radar-based rainfall product, which in turn is used to train on GPM DPR data collected during coincident overpass events. The machine-learning-based rainfall product is compared against the GPM standard products.
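The two-stage training chain can be sketched with synthetic data and a stand-in linear regressor; the data, features, and model below are illustrative placeholders for the actual gauge, WSR-88D, and DPR processing, and the geometric alignment step is out of scope:

```python
# Hedged sketch of the cascade: gauges supervise a ground-radar rainfall
# model, whose output then supervises a spaceborne (DPR-like) model.
import random

random.seed(42)

def fit_line(x, y):
    """Ordinary least squares for y = a*x + b (stand-in for any regressor)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

# Stage 1: ground-radar reflectivity (dBZ) -> gauge rain rate (synthetic)
radar_dbz = [random.uniform(20, 50) for _ in range(200)]
gauge_rain = [0.5 * z - 8 + random.gauss(0, 1) for z in radar_dbz]
a1, b1 = fit_line(radar_dbz, gauge_rain)

# Stage 2: DPR Ku-band reflectivity -> stage-1 radar rainfall at matched
# (geometrically aligned) pixels from coincident overpasses.
dpr_ku = [random.uniform(20, 50) for _ in range(200)]
radar_rain_matched = [a1 * z + b1 for z in dpr_ku]
a2, b2 = fit_line(dpr_ku, radar_rain_matched)

dpr_rain = [a2 * z + b2 for z in dpr_ku]
print(round(a1, 3), round(b1, 3))   # slope/intercept near 0.5 and -8
```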

  15. Host Event Based Network Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Jonathan Chugg

    2013-01-01

    The purpose of INL’s research on this project is to demonstrate the feasibility of a host event based network monitoring tool and the effects on host performance. Current host based network monitoring tools work on polling which can miss activity if it occurs between polls. Instead of polling, a tool could be developed that makes use of event APIs in the operating system to receive asynchronous notifications of network activity. Analysis and logging of these events will allow the tool to construct the complete real-time and historical network configuration of the host while the tool is running. This research focused on three major operating systems commonly used by SCADA systems: Linux, WindowsXP, and Windows7. Windows 7 offers two paths that have minimal impact on the system and should be seriously considered. First is the new Windows Event Logging API, and, second, Windows 7 offers the ALE API within WFP. Any future work should focus on these methods.

  16. Event-based criteria in GT-STAF information indices: theory, exploratory diversity analysis and QSPR applications.

    Science.gov (United States)

    Barigye, S J; Marrero-Ponce, Y; Martínez López, Y; Martínez Santiago, O; Torrens, F; García Domenech, R; Galvez, J

    2013-01-01

    Versatile event-based approaches for the definition of novel information theory-based indices (IFIs) are presented. An event in this context is the criterion followed in the "discovery" of molecular substructures, which in turn serve as the basis for the construction of the generalized incidence and relations frequency matrices, Q and F, respectively. From the resultant F, Shannon's, mutual, conditional and joint entropy-based IFIs are computed. In previous reports, an event named connected subgraphs was presented. The present study is an extension of this notion, in which we introduce other events, namely: terminal paths, vertex path incidence, quantum subgraphs, walks of length k, Sach's subgraphs, MACCS, E-state and substructure fingerprints and, finally, Ghose and Crippen atom types for hydrophobicity and refractivity. Moreover, we define magnitude-based IFIs, introducing the use of the magnitude criterion in the definition of mutual, conditional and joint entropy-based IFIs. We also discuss the use of information-theoretic parameters as a measure of the dissimilarity of the codified structural information of molecules. Finally, a comparison of the statistics for QSPR models obtained with the proposed IFIs and DRAGON's molecular descriptors for two physicochemical properties, log P and log K, of 34 derivatives of 2-furylethylenes demonstrates similar or better predictive ability than the latter.
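As a minimal illustration of the entropy computation underlying such indices, the sketch below derives a Shannon entropy from the column sums of a toy relations frequency matrix F; the actual GT-STAF definitions are considerably more elaborate, and the matrix here is invented:

```python
# Hedged sketch: Shannon entropy from a toy relations frequency matrix.
import math

def shannon_entropy(freqs):
    """Shannon entropy (bits) of a list of event frequencies."""
    total = sum(freqs)
    probs = [f / total for f in freqs if f > 0]
    return -sum(p * math.log2(p) for p in probs)

# Toy F: rows = atoms, columns = substructure "events" (illustrative)
F = [
    [2, 1, 0],
    [1, 3, 1],
    [0, 1, 2],
]
# Marginal event frequencies (column sums), then the entropy index
event_freqs = [sum(row[j] for row in F) for j in range(len(F[0]))]
print(event_freqs)                            # [3, 5, 3]
print(round(shannon_entropy(event_freqs), 4))
```

The mutual, conditional, and joint variants mentioned above are built analogously from joint and marginal frequencies of F.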

  17. Comparison between Japan and the United States in the frequency of events in equipment and components at nuclear power plants

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2007-01-01

    The Institute of Nuclear Safety System, Incorporated (INSS) conducted trend analyses until 2005 to compare the frequency of events in certain electrical and instrumentation components at nuclear power plants between Japan and the United States. The results revealed that events have occurred approximately an order of magnitude less often in Japan than in the United States. This paper compares Japan and the United States in more detail in terms of how often events (those reported under the reporting standards of the Nuclear Information Archive (NUCIA) or the Institute of Nuclear Power Operations (INPO)) occurred in electrical, instrumentation, and mechanical components at nuclear power plants. The results were as follows: (1) For electrical and instrumentation components, events occurred in Japan at roughly one-eighth the frequency observed in the United States, suggesting that the previous results were correct. (2) Events occurred more often in mechanical components than in electrical and instrumentation components in both Japan and the United States, and the difference between the two countries in the frequency of events in mechanical components was smaller. (3) Regarding mechanical components, events in the pipes of critical systems and equipment, such as reactor coolant systems, emergency core cooling systems, instrument and control systems, ventilating and air-conditioning systems, and turbine equipment, occurred more often in Japan than in the United States. (4) These observations suggest that there is little scope for reducing the frequency of events in electrical and instrumentation components, but that the frequency of events in mechanical components, such as pipes for main systems like emergency core cooling systems and, in the case of PWRs, turbine equipment, could be reduced by re-examining inspection methods and intervals. (author)

  18. Increased frequency of FBN1 truncating and splicing variants in Marfan syndrome patients with aortic events.

    Science.gov (United States)

    Baudhuin, Linnea M; Kotzer, Katrina E; Lagerstedt, Susan A

    2015-03-01

    Marfan syndrome is a systemic disorder that typically involves FBN1 mutations and cardiovascular manifestations. We investigated FBN1 genotype-phenotype correlations with aortic events (aortic dissection and prophylactic aortic surgery) in patients with Marfan syndrome. Genotype and phenotype information from probands (n = 179) with an FBN1 pathogenic or likely pathogenic variant was assessed. A higher frequency of truncating or splicing FBN1 variants was observed in Ghent criteria-positive patients with an aortic event (n = 34) as compared with all other probands (n = 145) without a reported aortic event (79 vs. 39%), suggesting an increased risk of aortic events in Marfan syndrome patients with FBN1 truncating and splicing variants. Genet Med 17(3), 177-187.

  19. Event-based Sensing for Space Situational Awareness

    Science.gov (United States)

    Cohen, G.; Afshar, S.; van Schaik, A.; Wabnitz, A.; Bessell, T.; Rutten, M.; Morreale, B.

    A revolutionary type of imaging device, known as a silicon retina or event-based sensor, has recently been developed and is gaining popularity in the field of artificial vision systems. These devices are inspired by a biological retina and operate in a significantly different way from traditional CCD-based imaging sensors. While a CCD produces frames of pixel intensities, an event-based sensor produces a continuous stream of events, each of which is generated when a pixel detects a change in log light intensity. These pixels operate asynchronously and independently, producing an event-based output with high temporal resolution. There are also no fixed exposure times, allowing these devices to offer a very high dynamic range independently for each pixel. Additionally, these devices offer high-speed, low-power operation and a sparse spatiotemporal output. As a consequence, the data from these sensors must be interpreted in a significantly different way from traditional imaging sensors, and this paper explores the advantages this technology provides for space imaging. The applicability and capabilities of event-based sensors for SSA applications are demonstrated through telescope field trials. Trial results have confirmed that the devices are capable of observing resident space objects from LEO through to GEO orbital regimes. Significantly, observations of RSOs were made during both day-time and night-time (terminator) conditions without modification to the camera or optics. The event-based sensor's ability to image stars and satellites during day-time hours offers a dramatic capability increase for terrestrial optical sensors. This paper shows the field testing and validation of two different architectures of event-based imaging sensors. An event-based sensor's asynchronous output has an intrinsically low data rate. In addition to low-bandwidth communications requirements, their low weight, low power, and high speed make them ideally suited to meeting the demanding
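The per-pixel event-generation principle described above can be sketched by converting a pair of intensity frames into an event stream; the contrast threshold and (x, y, t, polarity) layout are illustrative conventions, not those of any particular silicon retina:

```python
# Hedged sketch: emulate event generation from frames. A pixel emits an
# event each time its log intensity moves a threshold step away from its
# last reference level; polarity is the sign of the change.
import math

def frames_to_events(frames, timestamps, threshold=0.2):
    """Convert a sequence of intensity frames into (x, y, t, polarity) events."""
    events = []
    # per-pixel reference log intensity taken from the first frame
    ref = [[math.log(v + 1e-6) for v in row] for row in frames[0]]
    for frame, t in zip(frames[1:], timestamps[1:]):
        for y, row in enumerate(frame):
            for x, v in enumerate(row):
                logv = math.log(v + 1e-6)
                while abs(logv - ref[y][x]) >= threshold:
                    polarity = 1 if logv > ref[y][x] else -1
                    events.append((x, y, t, polarity))
                    ref[y][x] += polarity * threshold
    return events

# 2x2 scene: one pixel brightens, the rest stay constant -> sparse output
frames = [
    [[1.0, 1.0], [1.0, 1.0]],
    [[1.0, 1.0], [1.0, 1.5]],
]
print(frames_to_events(frames, [0.0, 0.01]))
```

Only the changing pixel produces output, which is the source of the sparsity and low data rate discussed above.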

  20. Spatiotemporal Features for Asynchronous Event-based Data

    Directory of Open Access Journals (Sweden)

    Xavier eLagorce

    2015-02-01

    Full Text Available Bio-inspired asynchronous event-based vision sensors are currently introducing a paradigm shift in visual information processing. These new sensors rely on a stimulus-driven principle of light acquisition similar to biological retinas. They are event-driven and fully asynchronous, thereby reducing redundancy and encoding exact times of input signal changes, leading to a very precise temporal resolution. Approaches for higher-level computer vision often rely on the reliable detection of features in visual frames, but similar definitions of features for the novel dynamic and event-based visual input representation of silicon retinas have so far been lacking. This article addresses the problem of learning and recognizing features for event-based vision sensors, which capture properties of truly spatiotemporal volumes of sparse visual event information. A novel computational architecture for learning and encoding spatiotemporal features is introduced, based on a set of predictive recurrent reservoir networks competing via winner-take-all selection. Features are learned in an unsupervised manner from real-world input recorded with event-based vision sensors. It is shown that the networks in the architecture learn distinct and task-specific dynamic visual features, and can predict their trajectories over time.

  1. Shallow very-low-frequency earthquakes accompanied with slow slip event along the plate boundary of the Nankai trough

    Science.gov (United States)

    Nakano, M.; Hori, T.; Araki, E.; Kodaira, S.; Ide, S.

    2017-12-01

    Recent improvements of seismic and geodetic observations have revealed the existence of a new family of slow earthquakes occurring along or close to the plate boundary worldwide. From the viewpoint of characteristic time scales, slow earthquakes can be classified into several groups: low-frequency or tectonic tremor (LFT), dominant at frequencies of several hertz; very-low-frequency earthquakes (VLFEs), dominant at periods of 10 to 100 s; and short- and long-term slow-slip events (SSEs), with durations of days to years. In many cases, these slow earthquakes are accompanied by other types of slow events. However, events occurring offshore, especially beneath the toe of the accretionary prism, are poorly understood because their signals are difficult to detect. Utilizing data captured by ocean-floor observation networks, into whose development much effort has recently gone, is necessary to improve our understanding of these events. Here, we performed CMT analysis of shallow VLFEs using data obtained from the DONET ocean-floor observation networks along the Nankai trough, southwest of Japan. We found that the shallow VLFEs have an almost identical history of moment release to that of a synchronous SSE recently found in the same region by Araki et al. (2017). The VLFE sources show updip migration during the activity, coincident with the migration of the SSE source. From these findings we conclude that these slow events share the same fault slip and that the VLFEs represent high-frequency fluctuations of slip during the SSE. This result implies that shallow SSEs along the plate interface may have occurred in the background during the shallow VLFE activities repeatedly observed along the Nankai trough, but went unreported because they are difficult to detect.

  2. How Metastrategic Considerations Influence the Selection of Frequency Estimation Strategies

    Science.gov (United States)

    Brown, Norman R.

    2008-01-01

    Prior research indicates that enumeration-based frequency estimation strategies become increasingly common as memory for relevant event instances improves and that moderate levels of context memory are associated with moderate rates of enumeration [Brown, N. R. (1995). Estimation strategies and the judgment of event frequency. Journal of…

  3. Seasonal variability of stream water quality response to storm events captured using high-frequency and multi-parameter data

    Science.gov (United States)

    Fovet, O.; Humbert, G.; Dupas, R.; Gascuel-Odoux, C.; Gruau, G.; Jaffrezic, A.; Thelusma, G.; Faucheux, M.; Gilliet, N.; Hamon, Y.; Grimaldi, C.

    2018-04-01

    The response of stream chemistry to storm events is of major interest for understanding the export of dissolved and particulate species from catchments. The related challenge is to identify the hydrological flow paths active during these events and the sources of the chemical elements for which these events are hot moments of export. An original four-year data set combining high-frequency records of stream flow, turbidity, nitrate and dissolved organic carbon concentrations, and piezometric levels was used to characterize storm responses in a headwater agricultural catchment, and to test the extent to which shallow groundwater drives the variability of those responses. A total of 177 events were described using a set of quantitative and functional descriptors related to precipitation, to the pre-event status and event dynamics of the stream and groundwater, and to the relative dynamics between water quality parameters and flow via hysteresis indices. This approach identified different response types for each water quality parameter, whose occurrence can be quantified and related to the seasonal functioning of the catchment. The study demonstrates that high-frequency records of water quality are unique in their ability to reveal the variability of catchment storm responses.
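
    The hysteresis indices mentioned above can be illustrated with a simple loop-comparison sketch. This is a minimal, hypothetical index (the authors' exact formulation may differ): it normalizes flow and concentration, then compares concentrations on the rising and falling limbs at matched flow levels, so a positive value indicates a clockwise concentration-discharge loop.

    ```python
    def hysteresis_index(flows, concs):
        """Loop-style hysteresis index for one storm event.

        Normalizes flow and concentration to [0, 1], then averages the
        difference between rising-limb and falling-limb concentrations
        at matched normalized flow levels.  Positive values indicate
        clockwise hysteresis (higher concentration on the rising limb).
        """
        qmin, qmax = min(flows), max(flows)
        cmin, cmax = min(concs), max(concs)
        qn = [(q - qmin) / (qmax - qmin) for q in flows]
        cn = [(c - cmin) / (cmax - cmin) for c in concs]
        peak = qn.index(max(qn))

        def interp(pairs, x):
            # linear interpolation of concentration at normalized flow x
            pairs = sorted(pairs)
            for (x0, c0), (x1, c1) in zip(pairs, pairs[1:]):
                if x0 <= x <= x1:
                    if x1 == x0:
                        return c0
                    return c0 + (c1 - c0) * (x - x0) / (x1 - x0)
            return pairs[-1][1]

        rising = list(zip(qn[:peak + 1], cn[:peak + 1]))
        falling = list(zip(qn[peak:], cn[peak:]))
        levels = [0.25, 0.5, 0.75]
        diffs = [interp(rising, q) - interp(falling, q) for q in levels]
        return sum(diffs) / len(diffs)
    ```

    For a clockwise event such as `hysteresis_index([0, 2, 4, 2, 0], [0, 4, 2, 1, 0])` the index is positive, consistent with a source flushed early in the storm.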

  4. Rates for parallax-shifted microlensing events from ground-based observations of the galactic bulge

    International Nuclear Information System (INIS)

    Buchalter, A.; Kamionkowski, M.

    1997-01-01

    The parallax effect in ground-based microlensing (ML) observations consists of a distortion to the standard ML light curve arising from the Earth's orbital motion. This can be used to partially remove the degeneracy among the system parameters in the event timescale, t_0. In most cases, current ML surveys lack the resolution to observe this effect, but parallax could conceivably be detected with frequent follow-up observations of ML events in progress, provided the photometric errors are small enough. We calculate the expected fraction of ML events in which the shape distortions will be observable by such follow-up observations, adopting Galactic models for the lens and source distributions that are consistent with observed microlensing timescale distributions. We study the dependence of the rates for parallax-shifted events on the frequency of follow-up observations and on the precision of the photometry. For example, we find that for hourly observations with typical photometric errors of 0.01 mag, 6% of events where the lens is in the bulge, and 31% of events where the lens is in the disk (or ∼10% of events overall), will give rise to a measurable parallax shift at the 95% confidence level. These fractions may be increased by improved photometric accuracy and increased sampling frequency. While long-duration events are favored, the surveys would be effective in picking out such distortions in events with timescales as low as t_0 ∼ 20 days. We study the dependence of these fractions on the assumed disk mass function and find that a higher parallax incidence is favored by mass functions with higher mean masses. Parallax measurements yield the reduced transverse speed, v, which gives both the relative transverse speed and the lens mass as a function of distance. We give examples of the accuracies with which v may be measured in typical parallax events. (Abstract Truncated)
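
    The "standard ML light curve" against which the parallax distortion is measured is the point-source, point-lens (Paczyński) magnification; a minimal sketch (parameter values in the example are illustrative, not from this paper):

    ```python
    import math

    def magnification(t, t0_days, u_min, t_peak=0.0):
        """Standard point-source, point-lens microlensing magnification.

        u(t) is the lens-source separation in Einstein radii; t0_days is
        the event timescale t_0 from the abstract.  Parallax distorts
        u(t) away from this simple rectilinear form.
        """
        u = math.hypot(u_min, (t - t_peak) / t0_days)
        return (u * u + 2.0) / (u * math.sqrt(u * u + 4.0))
    ```

    At a separation of one Einstein radius the magnification is the familiar 3/√5 ≈ 1.34; far from the peak it tends to 1, which is why long-duration, well-sampled events are the ones whose shapes can reveal parallax.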

  5. A Bayesian approach to unanticipated events frequency estimation in the decision making context of a nuclear research reactor facility

    International Nuclear Information System (INIS)

    Chatzidakis, S.; Staras, A.

    2013-01-01

    Highlights: • Bayes' theorem is employed to support the decision-making process in a research reactor. • The intention is to calculate parameters related to the unanticipated occurrence of events. • Frequency, posterior distribution, and confidence limits are calculated. • The approach is demonstrated using two real-world numerical examples. • The approach can be used even if no failures have been observed. - Abstract: Research reactors are multi-tasking environments with the multiple roles of commercial, research, and training facilities. Yet reactor managers have to make decisions, frequently with high economic impact, based on little available knowledge. A systematic approach employing Bayes' theorem is proposed to support the decision-making process in a research reactor environment. The approach is characterized by low complexity, appropriate for research reactor facilities. The methodology is demonstrated through the study of two characteristic events that lead to unanticipated system shutdown, namely de-energization of the control rod magnet and opening of the flapper valve. The results demonstrate the suitability of the Bayesian approach in the decision-making context when unanticipated events are considered.
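
    The abstract does not spell out its Bayesian machinery, but the standard conjugate treatment of an event frequency (a gamma prior updated by Poisson evidence) has the properties highlighted above, including a finite posterior when no failures have been observed. A minimal sketch, with illustrative prior parameters and event counts (not the paper's data):

    ```python
    import random

    def update_frequency(alpha, beta, events, exposure_years):
        """Gamma(alpha, beta) prior on an event frequency [1/yr] (beta is
        the rate parameter), updated with `events` occurrences observed
        over `exposure_years` of operation."""
        return alpha + events, beta + exposure_years

    def credible_interval(alpha, beta, level=0.90, n=50_000, seed=1):
        """Monte Carlo equal-tailed credible interval for the frequency
        (stdlib-only substitute for a gamma inverse CDF)."""
        rng = random.Random(seed)
        draws = sorted(rng.gammavariate(alpha, 1.0 / beta) for _ in range(n))
        lo = draws[int(n * (1 - level) / 2)]
        hi = draws[int(n * (1 + level) / 2)]
        return lo, hi

    # Illustrative: Jeffreys-style prior, 2 flapper-valve openings in
    # 10 reactor-years of operation.
    a, b = update_frequency(0.5, 0.0, events=2, exposure_years=10.0)
    mean = a / b   # posterior mean frequency per year
    ```

    With zero observed events the same update (`events=0`) still yields a proper posterior with mean 0.5/10 = 0.05 per year, which is the "no failures observed" case the highlights advertise.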

  6. Characterizing the Frequency and Elevation of Rapid Drainage Events in West Greenland

    Science.gov (United States)

    Cooley, S.; Christoffersen, P.

    2016-12-01

    Rapid drainage of supraglacial lakes on the Greenland Ice Sheet is critical for establishing surface-to-bed hydrologic connections and the subsequent transfer of water from the surface to the bed. Yet estimates of the number and spatial distribution of rapidly draining lakes vary widely, owing to limitations in the temporal frequency of image collection and to obscuration by cloud. So far, no study has assessed the impact of these observation biases. Here we examine the frequency and elevation of rapidly draining lakes in central West Greenland, from 68°N to 72.6°N, and perform a robust statistical analysis to estimate more accurately the likelihood of lakes draining rapidly. Using MODIS imagery and a fully automated lake detection method, we map more than 500 supraglacial lakes per year over a 63,000 km² study area from 2000 to 2015. Testing four definitions of rapid drainage from previously published studies, we find that the fraction of rapidly draining lakes varies from 3% to 38%. Logistic regression between rapid drainage events and image sampling frequency demonstrates that the number of detected rapid drainage events depends strongly on the cloud-free observation percentage. We then develop three new drainage criteria and apply an observation-bias correction that suggests a true rapid drainage probability between 36% and 45%, considerably higher than reported by previous studies without bias assessment. We find that rapidly draining lakes are on average larger and disappear earlier than slowly draining lakes, and we observe no elevation difference between the two groups. We conclude (a) that methodological problems in rapid drainage research caused by observation bias and varying detection methods have obscured large-scale rapid drainage characteristics, and (b) that the lack of evidence for an elevation limit on rapid drainage suggests surface-to-bed hydrologic connections may continue to propagate inland as the climate warms.
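
    The logic of an observation-bias correction of the kind described above can be illustrated with a deliberately simple detection model (the numbers and the independence assumption are hypothetical; the study's actual regression-based correction is more elaborate): if a rapid drainage is only classified as such when enough cloud-free images bracket the short drainage window, the observed rapid-drainage fraction understates the true one, and dividing by the detection probability recovers an estimate of it.

    ```python
    def detection_probability(cloud_free_fraction, images_in_window):
        """Probability that at least one usable image falls within the
        drainage window, assuming independent cloud cover per image."""
        return 1.0 - (1.0 - cloud_free_fraction) ** images_in_window

    def corrected_drainage_fraction(observed_fraction, cloud_free_fraction,
                                    images_in_window=4):
        """Divide out the miss rate to estimate the true rapid-drainage
        probability (a crude inverse of the observation bias)."""
        p_detect = detection_probability(cloud_free_fraction, images_in_window)
        return min(1.0, observed_fraction / p_detect)
    ```

    For example, an observed fraction of 0.25 with 50% cloud-free imagery and four images per window implies a detection probability of 0.9375 and a corrected fraction of about 0.27.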

  7. The development on the methodology of the initiating event frequencies for liquid metal reactor KALIMER

    International Nuclear Information System (INIS)

    Jeong, K. S.; Yang, Z. A.; Ah, Y. B.; Jang, W. P.; Jeong, H. Y.; Ha, K. S.; Han, D. H.

    2002-01-01

    In this paper, the PSA methodologies of PRISM, light water reactors, and pressurized heavy water reactors are analyzed, and a methodology for determining initiating event frequencies for KALIMER is suggested. An assessment of the reliability of the assumptions behind the pipe corrosion frequency is also set up. Finally, the reliability assessment of the passive safety system, one of the main safety systems of KALIMER, is discussed and analyzed.

  8. Problems in event based engine control

    DEFF Research Database (Denmark)

    Hendricks, Elbert; Jensen, Michael; Chevalier, Alain Marie Roger

    1994-01-01

    Physically a four cycle spark ignition engine operates on the basis of four engine processes or events: intake, compression, ignition (or expansion) and exhaust. These events each occupy approximately 180° of crank angle. In conventional engine controllers, it is an accepted practice to sample...... the engine variables synchronously with these events (or submultiples of them). Such engine controllers are often called event-based systems. Unfortunately the main system noise (or disturbance) is also synchronous with the engine events: the engine pumping fluctuations. Since many electronic engine...... problems on accurate air/fuel ratio control of a spark ignition (SI) engine....

  9. Detection of planets in extremely weak central perturbation microlensing events via next-generation ground-based surveys

    International Nuclear Information System (INIS)

    Chung, Sun-Ju; Lee, Chung-Uk; Koo, Jae-Rim

    2014-01-01

    Even though the recently discovered high-magnification event MOA-2010-BLG-311 had complete coverage over its peak, a confident planet detection was not possible because of extremely weak central perturbations (EWCPs; fractional deviations of ≲2%). Confident detection of planets in EWCP events requires both higher-cadence monitoring and better photometric accuracy than current follow-up observation systems provide. The next-generation ground-based observation project, the Korea Microlensing Telescope Network (KMTNet), satisfies these conditions. We estimate the probability of occurrence of EWCP events with fractional deviations of ≤2% in high-magnification events, and the efficiency of detecting planets in such events using KMTNet. We find that EWCP events occur with a frequency of >50% for planets of ≲100 M_E with separations of 0.2 AU ≲ d ≲ 20 AU. For main-sequence and subgiant source stars, planets of ≳1 M_E in EWCP events with deviations ≤2% can be detected with a frequency of >50% within a certain range that changes with the planet mass. However, it is difficult to detect planets in EWCP events of bright stars such as giants, because KMTNet, with its constant exposure time, easily saturates around the peaks of such events. EWCP events are caused by close, intermediate, and wide planetary systems with low-mass planets, and by close and wide planetary systems with massive planets. We therefore expect that a much greater variety of planetary systems than those already detected, which are mostly intermediate planetary systems, will be detected in the near future, regardless of planet mass.

  10. Clinical usefulness and feasibility of time-frequency analysis of chemosensory event-related potentials.

    Science.gov (United States)

    Huart, C; Rombaux, Ph; Hummel, T; Mouraux, A

    2013-09-01

    The clinical usefulness of olfactory event-related brain potentials (OERPs) for assessing olfactory function is limited by the relatively low signal-to-noise ratio of the responses identified using conventional time-domain averaging. Recently, it was shown that time-frequency analysis of the recorded EEG signals can markedly improve the signal-to-noise ratio of OERPs in healthy controls, because it enhances both phase-locked and non-phase-locked EEG responses. The aim of the present study was to investigate the clinical usefulness of this approach and evaluate its feasibility in a clinical setting. We retrospectively analysed EEG recordings obtained from 45 patients (15 anosmic, 15 hyposmic and 15 normosmic). The responses to olfactory stimulation were analysed using conventional time-domain analysis and joint time-frequency analysis. The ability of the two methods to discriminate between anosmic, hyposmic and normosmic patients was assessed using receiver operating characteristic analysis. The discrimination performance of OERPs identified using conventional time-domain averaging was poor. In contrast, the discrimination performance of the EEG response identified in the time-frequency domain was relatively high. Furthermore, we found a significant correlation between the magnitude of this response and the psychophysical olfactory score. Time-frequency analysis of EEG responses to olfactory stimulation could thus serve as an effective and reliable diagnostic tool for the objective clinical evaluation of olfactory function.
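
    The reason time-frequency (power-based) analysis enhances non-phase-locked responses, while time-domain averaging does not, can be shown with a toy numerical experiment (synthetic sinusoidal trials, not the authors' EEG pipeline): averaging waveforms whose phase jitters across trials cancels the oscillation, but averaging per-trial power preserves it.

    ```python
    import math
    import random

    rng = random.Random(0)
    n_trials, n_samples = 200, 64
    trials = []
    for _ in range(n_trials):
        phase = rng.uniform(0.0, 2.0 * math.pi)   # non-phase-locked response
        trials.append([math.sin(2.0 * math.pi * 4 * t / n_samples + phase)
                       for t in range(n_samples)])

    # Conventional time-domain averaging: the jittered oscillation cancels.
    avg = [sum(tr[i] for tr in trials) / n_trials for i in range(n_samples)]
    avg_power = sum(x * x for x in avg) / n_samples

    # Averaging power per trial (what a time-frequency transform preserves):
    power = sum(sum(x * x for x in tr) / n_samples for tr in trials) / n_trials
    ```

    Here `power` stays at the full 0.5 of a unit sinusoid while `avg_power` collapses toward zero, which is the signal-to-noise advantage the abstract attributes to time-frequency analysis.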

  11. An expert elicitation process to project the frequency and magnitude of Florida manatee mortality events caused by red tide (Karenia brevis)

    Science.gov (United States)

    Martin, Julien; Runge, Michael C.; Flewelling, Leanne J.; Deutsch, Charles J.; Landsberg, Jan H.

    2017-11-20

    Red tides (blooms of the harmful alga Karenia brevis) are one of the major sources of mortality for the Florida manatee (Trichechus manatus latirostris), especially in southwest Florida. It has been hypothesized that the frequency and severity of red tides may increase in the future because of global climate change and other factors. To improve our ecological forecast of the effects of red tides on manatee population dynamics and long-term persistence, we conducted a formal expert judgment process to estimate probability distributions for the frequency and relative magnitude of red-tide-related manatee mortality (RTMM) events over a 100-year time horizon in three of the four regions recognized as manatee management units in Florida. This information was used to update a population viability analysis for the Florida manatee (the Core Biological Model). We convened a panel of 12 experts in manatee biology or red-tide ecology; the panel met to frame, conduct, and discuss the elicitation. Each expert provided a best estimate and plausible low and high values (bounding a confidence level of 80 percent) for each parameter in each of three regions (Northwest, Southwest, and Atlantic) of the subspecies' range (excluding the Upper St. Johns River region) for two time periods (0−40 and 41−100 years from present). We fitted probability distributions for each parameter, time period, and expert using these three elicited values, aggregated the parameter estimates across experts, and fitted a parametric distribution to the aggregated results. Across regions, the experts expected the future frequency of RTMM events to be higher than historical levels, which is consistent with the hypothesis that global climate change (among other factors) may increase the frequency of red-tide blooms. The experts articulated considerable uncertainty, however, about the future frequency of RTMM events. The historical frequency of moderate and intense RTMM (combined) in
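
    Fitting a distribution to three elicited values can be sketched as follows. The choices below are assumptions for illustration only: the low/best/high values are treated as the 10th/50th/90th percentiles (the 80-percent interval mentioned in the abstract), and a lognormal form is assumed; the study's actual fitting procedure may differ.

    ```python
    import math

    Z90 = 1.2815515655446004  # standard normal 90th percentile

    def lognormal_from_elicitation(best, high):
        """Fit (mu, sigma) of a lognormal so the median equals `best` and
        the 90th percentile equals `high`; also return the implied 10th
        percentile, so the expert's elicited low value can be checked
        for consistency with the fitted shape."""
        mu = math.log(best)
        sigma = (math.log(high) - mu) / Z90
        implied_low = math.exp(mu - Z90 * sigma)
        return mu, sigma, implied_low
    ```

    For instance, a best estimate of 2 events per decade with a high value of 5 implies a 10th percentile of best²/high = 0.8; if the expert elicited a low value of 1.0, the fitted lognormal is somewhat wider on the low side than the expert intended, which is the kind of discrepancy a facilitated panel can discuss and resolve.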

  12. Statistical Prediction of Solar Particle Event Frequency Based on the Measurements of Recent Solar Cycles for Acute Radiation Risk Analysis

    Science.gov (United States)

    Kim, Myung-Hee Y.; Hu, Shaowen; Cucinotta, Francis A.

    2009-01-01

    Large solar particle events (SPEs) present significant acute radiation risks to crew members during extra-vehicular activities (EVAs) or in lightly shielded space vehicles on missions beyond the protection of the Earth's magnetic field. Acute radiation sickness (ARS) can impair performance and result in failure of the mission. Improved forecasting capability and/or early-warning systems and proper shielding solutions are required to stay within NASA's short-term dose limits. Exactly how to use observations of SPEs to predict occurrence and size is a great challenge, because SPE occurrences are random in nature even though the expected frequency of SPEs is strongly influenced by the position within the solar activity cycle. We therefore developed a probabilistic model approach in which a cumulative expected occurrence curve of SPEs for a typical solar cycle was formed from a non-homogeneous Poisson process model fitted to a database of proton fluence measurements of SPEs that occurred during the past five solar cycles (19−23), together with large SPEs identified from impulsive nitrate enhancements in polar ice. From the fitted model, the expected frequency of SPEs was estimated at any given proton fluence threshold Φ_E with energy E > 30 MeV during a defined space mission period. The corresponding Φ_E (E = 30, 60, and 100 MeV) fluence distributions were simulated with random draws from a gamma distribution and applied to SPE ARS risk analysis for a specific mission period. Deep-seated organ doses were predicted more precisely from the high-energy fluence Φ_100 than from the lower-energy fluences Φ_30 or Φ_60, because of the greater penetration depth of high-energy protons. Estimates of ARS are then described for 90th- and 95th-percentile events for several mission lengths and several likely organ dose rates. The ability to accurately measure high energy protons
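
    A non-homogeneous Poisson simulation of the kind described can be sketched as follows. The cycle-dependent rate and the gamma shape/scale below are made-up placeholders, not the fitted model; the sketch only shows the mechanics of integrating a time-varying rate to a Poisson mean and drawing per-event fluences.

    ```python
    import math
    import random

    def expected_events(rate, t0, t1, steps=10_000):
        """Poisson mean over a mission window: trapezoidal integral of a
        time-varying SPE rate [events/yr] from t0 to t1 [yr]."""
        h = (t1 - t0) / steps
        s = 0.5 * (rate(t0) + rate(t1)) + sum(rate(t0 + i * h)
                                              for i in range(1, steps))
        return s * h

    def poisson_draw(lam, rng):
        """Knuth's method; adequate for the modest means involved here."""
        limit, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                return k
            k += 1

    # Illustrative rate only: peaks mid-way through an 11-year solar cycle.
    rate = lambda t: 8.0 * math.sin(math.pi * (t % 11.0) / 11.0) ** 2

    def simulate_mission(t_start, duration, seed=0):
        """Number of SPEs in the mission window, plus one gamma-distributed
        fluence per event (shape/scale are placeholders, not fitted)."""
        rng = random.Random(seed)
        n = poisson_draw(expected_events(rate, t_start, t_start + duration), rng)
        return [rng.gammavariate(2.0, 1.0e8) for _ in range(n)]  # protons/cm^2
    ```

    Repeating `simulate_mission` many times yields the distribution of mission fluence from which percentile events (e.g. the 90th and 95th mentioned above) can be read off.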

  13. A multispacecraft event study of Pc5 ultralow-frequency waves in the magnetosphere and their external drivers

    International Nuclear Information System (INIS)

    Wang, Chih-Ping; Thorne, Richard; Liu, Terry Z.; Hartinger, Michael D.; Nagai, Tsugunobu

    2017-01-01

    We investigate a quiet time event of magnetospheric Pc5 ultralow-frequency (ULF) waves and their likely external drivers using multiple spacecraft observations. Enhancements of electric and magnetic field perturbations in two narrow frequency bands, 1.5–2 mHz and 3.5–4 mHz, were observed over a large radial distance range from r ~ 5 to 11 RE. During the first half of this event, perturbations were mainly observed in the transverse components and only in the 3.5–4 mHz band. In comparison, enhancements were stronger during the second half in both transverse and compressional components and in both frequency bands. No indication of field line resonances was found for these magnetic field perturbations. Perturbations in these two bands were also observed in the magnetosheath, but not in the solar wind dynamic pressure perturbations. For the first interval, good correlations between the flow perturbations in the magnetosphere and magnetosheath and an indirect signature for Kelvin-Helmholtz (K-H) vortices suggest K-H surface waves as the driver. For the second interval, good correlations are found between the magnetosheath dynamic pressure perturbations, magnetopause deformation, and magnetospheric waves, all in good correspondence to interplanetary magnetic field (IMF) discontinuities. The characteristics of these perturbations can be explained by being driven by foreshock perturbations resulting from these IMF discontinuities. This event shows that even during quiet periods, K-H-unstable magnetopause and ion foreshock perturbations can combine to create a highly dynamic magnetospheric ULF wave environment

  14. Tracking the time course of word-frequency effects in auditory word recognition with event-related potentials.

    Science.gov (United States)

    Dufour, Sophie; Brunellière, Angèle; Frauenfelder, Ulrich H

    2013-04-01

    Although the word-frequency effect is one of the most established findings in spoken-word recognition, the precise processing locus of this effect is still a topic of debate. In this study, we used event-related potentials (ERPs) to track the time course of the word-frequency effect. In addition, the neighborhood density effect, which is known to reflect mechanisms involved in word identification, was also examined. The ERP data showed a clear frequency effect as early as 350 ms from word onset on the P350, followed by a later effect at word offset on the late N400. A neighborhood density effect was also found at an early stage of spoken-word processing on the PMN, and at word offset on the late N400. Overall, our ERP differences for word frequency suggest that frequency affects the core processes of word identification starting from the initial phase of lexical activation and including target word selection. They thus rule out any interpretation of the word frequency effect that is limited to a purely decisional locus after word identification has been completed. Copyright © 2012 Cognitive Science Society, Inc.

  15. Survey on Prognostics Techniques for Updating Initiating Event Frequency in PSA

    International Nuclear Information System (INIS)

    Kim, Hyeonmin; Heo, Gyunyoung

    2015-01-01

    One application of PSA is the risk monitor, a real-time analysis tool that evaluates risk based on the actual state of components and systems. To make such use more effective, methodologies that incorporate data from prognostics have been suggested. Prognostics here comprehensively includes not only prognosis but also monitoring and diagnosis, and prognostic methods require condition monitoring. When PHM is applied to a PSA model, the latest condition of the NPP can be identified more clearly. To reduce conservatism and uncertainty, we previously suggested a concept for updating the initiating event frequency in a PSA model using a Bayesian approach, one of the prognostics techniques, and found that PSA data can thereby be updated more accurately. In reliability theory, the bathtub curve divides component life into three parts (infant failure; constant, random failure; and wearout failure). In this paper, to investigate the applicability of prognostic methods to updating quantitative data in a PSA model, we present the OLM acceptance criteria from NUREG, a concept for using prognostics in PSA, and the enabling prognostic techniques. The motivation is that improved predictive capability using existing monitoring systems, data, and information will enable more accurate equipment risk assessment for improved decision-making.

  16. Survey on Prognostics Techniques for Updating Initiating Event Frequency in PSA

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeonmin; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of)

    2015-05-15

    One application of PSA is the risk monitor, a real-time analysis tool that evaluates risk based on the actual state of components and systems. To make such use more effective, methodologies that incorporate data from prognostics have been suggested. Prognostics here comprehensively includes not only prognosis but also monitoring and diagnosis, and prognostic methods require condition monitoring. When PHM is applied to a PSA model, the latest condition of the NPP can be identified more clearly. To reduce conservatism and uncertainty, we previously suggested a concept for updating the initiating event frequency in a PSA model using a Bayesian approach, one of the prognostics techniques, and found that PSA data can thereby be updated more accurately. In reliability theory, the bathtub curve divides component life into three parts (infant failure; constant, random failure; and wearout failure). In this paper, to investigate the applicability of prognostic methods to updating quantitative data in a PSA model, we present the OLM acceptance criteria from NUREG, a concept for using prognostics in PSA, and the enabling prognostic techniques. The motivation is that improved predictive capability using existing monitoring systems, data, and information will enable more accurate equipment risk assessment for improved decision-making.

  17. Component external leakage and rupture frequency estimates

    International Nuclear Information System (INIS)

    Eide, S.A.; Khericha, S.T.; Calley, M.B.; Johnson, D.A.; Marteeny, M.L.

    1991-11-01

    In order to perform detailed internal flooding risk analyses of nuclear power plants, external leakage and rupture frequencies are needed for various types of components: piping, valves, pumps, flanges, and others. However, there appears to be no up-to-date, comprehensive source for such frequency estimates. This report attempts to fill that void. Based on a comprehensive search of Licensee Event Reports (LERs) contained in Nuclear Power Experience (NPE), together with estimates of component populations and exposure times, component external leakage and rupture frequencies were generated. The remainder of this report covers the specifics of the NPE search for external leakage and rupture events, analysis of the data, a comparison with frequency estimates from other sources, and a discussion of the results.
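
    The frequency estimates described reduce to event counts divided by component-exposure; a minimal sketch with hypothetical numbers (not the report's data), including the standard Poisson upper bound for the zero-event case:

    ```python
    import math

    def leakage_frequency(events, components, years):
        """Point estimate of external leakage frequency per component-year."""
        return events / (components * years)

    def zero_event_upper_bound(components, years, confidence=0.95):
        """Upper confidence bound on the frequency when no events have
        been observed: -ln(1 - confidence) / exposure (about 3/exposure
        at 95%, from the Poisson zero-count likelihood)."""
        return -math.log(1.0 - confidence) / (components * years)

    # Hypothetical: 12 leakage events over 500 valves observed for 20 years.
    f = leakage_frequency(12, 500, 20)       # per valve-year
    ub = zero_event_upper_bound(500, 20)     # bound if none had occurred
    ```

    The zero-event bound matters in practice because rupture events for some component types are rare enough that no occurrence appears in the LER record.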

  18. LOSP-initiated event tree analysis for BWR

    International Nuclear Information System (INIS)

    Watanabe, Norio; Kondo, Masaaki; Uno, Kiyotaka; Chigusa, Takeshi; Harami, Taikan

    1989-03-01

    As a preliminary study for a 'Japanese Model Plant PSA', a LOSP (loss of off-site power)-initiated event tree analysis for a typical Japanese BWR was carried out based solely on open documents such as the Safety Analysis Report. The objectives of this analysis are: to delineate core-melt accident sequences initiated by LOSP; to evaluate the importance of core-melt accident sequences in terms of occurrence frequency; and to develop a foundation of plant information and analytical procedures for efficiently performing the further 'Japanese Model Plant PSA'. This report describes the procedure and results of the LOSP-initiated event tree analysis. Two types of event trees, a functional event tree and a systemic event tree, were developed to delineate core-melt accident sequences and to quantify their frequencies. A front-line system event tree was prepared as well, to provide core-melt sequence delineation for the accident progression analysis of a Level 2 PSA to follow. Applying U.S. operational experience data, such as component failure rates and a LOSP frequency, we obtained the following results. The total frequency of core-melt accident sequences initiated by LOSP is estimated at 5 × 10⁻⁴ per reactor-year. The dominant sequences are 'loss of decay heat removal' and 'loss of emergency electric power supply', which account for more than 90% of the total core-melt frequency. In this analysis, a LOSP frequency of 0.13/R·Y was used, higher than Japanese experience, and no recovery actions were considered. In fact, there has been no LOSP event at a Japanese nuclear power plant so far, and off-site power and/or the PCS would be expected to be recovered before core melt. Considering Japanese operating experience and recovery factors would reduce the total core-melt frequency to less than 10⁻⁶ per reactor-year. (J.P.N.)
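
    The event-tree arithmetic behind these numbers is simply the initiating-event frequency multiplied along each branch and summed over core-melt sequences. A minimal sketch; the two-sequence tree and its branch probabilities are illustrative stand-ins, not the study's quantified values:

    ```python
    def sequence_frequency(initiating_freq, branch_probs):
        """Frequency of one accident sequence: the initiating-event
        frequency times the product of the branch probabilities
        (failure or success) along the sequence."""
        f = initiating_freq
        for p in branch_probs:
            f *= p
        return f

    # Illustrative LOSP tree: either emergency power fails outright, or
    # emergency power succeeds but decay heat removal then fails.
    losp_per_ry = 0.13                 # LOSP frequency used in the study
    sequences = [
        [1e-3],                        # loss of emergency electric power supply
        [0.999, 3e-3],                 # loss of decay heat removal
    ]
    core_melt = sum(sequence_frequency(losp_per_ry, s) for s in sequences)
    ```

    With these placeholder branch probabilities the total lands near 5 × 10⁻⁴ per reactor-year; the same arithmetic with a smaller initiating frequency and non-zero recovery credit is what drives the estimate below 10⁻⁶.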

  19. Framework for Modeling High-Impact, Low-Frequency Power Grid Events to Support Risk-Informed Decisions

    Energy Technology Data Exchange (ETDEWEB)

    Veeramany, Arun; Unwin, Stephen D.; Coles, Garill A.; Dagle, Jeffery E.; Millard, W. David; Yao, Juan; Glantz, Clifford S.; Gourisetti, Sri Nikhil Gup

    2015-12-03

    Natural and man-made hazardous events resulting in loss of grid infrastructure assets challenge the electric power grid’s security and resilience. However, the planning and allocation of appropriate contingency resources for such events requires an understanding of their likelihood and the extent of their potential impact. Where these events are of low likelihood, a risk-informed perspective on planning can be problematic as there exists an insufficient statistical basis to directly estimate the probabilities and consequences of their occurrence. Since risk-informed decisions rely on such knowledge, a basis for modeling the risk associated with high-impact low frequency events (HILFs) is essential. Insights from such a model can inform where resources are most rationally and effectively expended. The present effort is focused on development of a HILF risk assessment framework. Such a framework is intended to provide the conceptual and overarching technical basis for the development of HILF risk models that can inform decision makers across numerous stakeholder sectors. The North American Electric Reliability Corporation (NERC) 2014 Standard TPL-001-4 considers severe events for transmission reliability planning, but does not address events of such severity that they have the potential to fail a substantial fraction of grid assets over a region, such as geomagnetic disturbances (GMD), extreme seismic events, and coordinated cyber-physical attacks. These are beyond current planning guidelines. As noted, the risks associated with such events cannot be statistically estimated based on historic experience; however, there does exist a stable of risk modeling techniques for rare events that have proven of value across a wide range of engineering application domains. There is an active and growing interest in evaluating the value of risk management techniques in the State transmission planning and emergency response communities, some of this interest in the context of

  20. Prediction problem for target events based on the inter-event waiting time

    Science.gov (United States)

    Shapoval, A.

    2010-11-01

    In this paper we address the problem of forecasting the target events of a time series given the distribution ξ of time gaps between target events. Strong earthquakes and stock market crashes are the two types of such events we focus on. In series of earthquakes, as McCann et al. show [W.R. McCann, S.P. Nishenko, L.R. Sykes, J. Krause, Seismic gaps and plate tectonics: seismic potential for major boundaries, Pure and Applied Geophysics 117 (1979) 1082-1147], there are well-defined gaps (called seismic gaps) between strong earthquakes. By contrast, there are usually no regular gaps in series of stock market crashes [M. Raberto, E. Scalas, F. Mainardi, Waiting-times and returns in high-frequency financial data: an empirical study, Physica A 314 (2002) 749-755]. For the case of seismic gaps, we analytically derive an upper bound on prediction efficiency given the coefficient of variation of the distribution ξ. For the case of stock market crashes, we develop an algorithm that predicts the next crash within a certain time interval after the previous one, and show that this algorithm outperforms random prediction. The efficiency of our algorithm establishes a lower bound on the efficiency of effective prediction of stock market crashes.
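
    The seismic-gap intuition can be illustrated numerically: when the waiting-time distribution has a low coefficient of variation, an alarm window around the mean gap catches most target events while being switched on for only a small share of the time. This Monte Carlo sketch uses an illustrative gamma gap distribution and window, not the paper's algorithm:

    ```python
    import random

    def alarm_performance(gap_sampler, window, n=50_000, seed=2):
        """Evaluate an alarm that is on while the time since the last
        event lies in [window[0], window[1]].  Returns (fraction of
        events caught, fraction of total time spent in alarm)."""
        w0, w1 = window
        rng = random.Random(seed)
        caught = alarm_time = total_time = 0.0
        for _ in range(n):
            gap = gap_sampler(rng)
            if w0 <= gap <= w1:
                caught += 1
            # alarm-on time accumulated during this inter-event gap
            alarm_time += max(0.0, min(gap, w1) - w0)
            total_time += gap
        return caught / n, alarm_time / total_time

    # Seismic-gap-like case: coefficient of variation 0.2 (mean gap 1.0).
    hit, coverage = alarm_performance(
        lambda rng: rng.gammavariate(25.0, 0.04),
        window=(0.7, 1.3))
    ```

    Here the alarm catches well over 80% of events while being on for roughly 30% of the time, whereas for an exponential (memoryless) gap distribution, which resembles the crash case, no such window can beat its own time coverage.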

  1. Rule-Based Event Processing and Reaction Rules

    Science.gov (United States)

    Paschke, Adrian; Kozlenkov, Alexander

    Reaction rules and event processing technologies play a key role in making business and IT / Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real-time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In the last decades various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.

  2. Assessing loss event frequencies of smart grid cyber threats: Encoding flexibility into FAIR using Bayesian network approach

    NARCIS (Netherlands)

    Le, Anhtuan; Chen, Yue; Chai, Kok Keong; Vasenev, Alexandr; Montoya, L.

    Assessing loss event frequencies (LEF) of smart grid cyber threats is essential for planning cost-effective countermeasures. Factor Analysis of Information Risk (FAIR) is a well-known framework that can be applied to consider threats in a structured manner by using look-up tables related to a

  3. Assessment of System Frequency Support Effect of PMSG-WTG Using Torque-Limit-Based Inertial Control: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Xiao; Gao, Wenzhong; Wang, Jianhui; Wu, Ziping; Yan, Weihang; Gevorgian, Vahan; Zhang, Yingchen; Muljadi, Eduard; Kang, Moses; Hwang, Min; Kang, Yong Cheol

    2017-05-12

    To release the 'hidden inertia' of variable-speed wind turbines for temporary frequency support, a method of torque-limit-based inertial control is proposed in this paper. This method aims to improve the frequency support capability while respecting the maximum torque restriction of a permanent magnet synchronous generator. The advantages of the proposed method are an improved frequency nadir (FN) in the event of an under-frequency disturbance, and avoidance of over-deceleration and of a second frequency dip during the inertial response. The system frequency response differs for different slope values in the power-speed plane when the inertial response is performed. The proposed method is evaluated in a modified three-machine, nine-bus system. The simulation results show that there is a trade-off between the recovery time and the FN: a gradual slope tends to improve the FN and strongly restrict the rate of change of frequency, while extending the recovery time. These results provide insight into how to properly design such inertial control strategies for practical applications.

  4. Comparison and applicability of landslide susceptibility models based on landslide ratio-based logistic regression, frequency ratio, weight of evidence, and instability index methods in an extreme rainfall event

    Science.gov (United States)

    Wu, Chunhung

    2016-04-01

    Few studies have discussed the applicability of statistical landslide susceptibility (LS) models to extreme rainfall-induced landslide events. This research focuses on the comparison and applicability of LS models based on four methods, namely landslide ratio-based logistic regression (LRBLR), frequency ratio (FR), weight of evidence (WOE), and instability index (II), in an extreme rainfall-induced landslide case. The landslide inventory of the Chishan river watershed, Southwestern Taiwan, after 2009 Typhoon Morakot is the main material in this research. The Chishan river watershed is a tributary of the Kaoping river watershed, a landslide- and erosion-prone watershed with an annual average suspended load of 3.6×10⁷ MT/yr (ranking 11th in the world). Typhoon Morakot struck Southern Taiwan from Aug. 6-10 in 2009 and dumped nearly 2,000 mm of rainfall on the Chishan river watershed; the 24-hour, 48-hour, and 72-hour accumulated rainfall all exceeded the 200-year return period values. A total of 2,389 landslide polygons in the Chishan river watershed were extracted from SPOT 5 images after 2009 Typhoon Morakot. The total landslide area is around 33.5 km², corresponding to a landslide ratio of 4.1%. The main landslide types based on Varnes' (1978) classification are rotational and translational slides. The two characteristics of this extreme rainfall-induced landslide event are the dense landslide distribution and the large share of downslope landslide areas, owing to headward erosion and bank erosion during the flooding process. The downslope landslide area in the Chishan river watershed after 2009 Typhoon Morakot is 3.2 times larger than the upslope landslide area. The prediction accuracy of the LS models based on the LRBLR, FR, WOE, and II methods has been shown to exceed 70%. The model performance and applicability of four models in a landslide-prone watershed with dense distribution of rainfall
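    The frequency ratio (FR) method named above reduces to a simple relative-density computation per predictor class; a minimal sketch with a hypothetical slope class (all numbers invented):

```python
def frequency_ratio(landslide_in_class, landslide_total, class_area, total_area):
    """FR > 1 means the class is over-represented among landslide cells,
    i.e. it contributes positively to susceptibility."""
    return (landslide_in_class / landslide_total) / (class_area / total_area)

# Hypothetical slope class covering 10% of the watershed but 25% of landslide cells.
fr = frequency_ratio(landslide_in_class=250, landslide_total=1000,
                     class_area=10.0, total_area=100.0)
print(fr)  # 2.5 -> class is landslide-prone
```

    Summing the FR values of all classes a cell falls into gives that cell's susceptibility index, which is how FR-based LS maps are typically assembled.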

  5. Evolution in Intensity and Frequency of Extreme Events of Precipitation in Northeast Region and Brazilian Amazon in XXI Century

    Science.gov (United States)

    Fonseca, P. M.; Veiga, J. A.; Correia, F. S.; Brito, A. L.

    2013-05-01

    The aim of this research was to evaluate changes in the frequency and intensity of extreme precipitation events in the Brazilian Amazon and Northeast Region under a doubling of CO2 concentration, in agreement with the IPCC A2 emissions scenario (Nakicenovic et al., 2001). For this evaluation the ETA model (Chou et al., 2011), forced with CCSM3 global model data (Meehl, 2006), was used to run four experiments, covering only January, February, and March: 1980-1990, 2000-2010, 2040-2050, and 2090-2100. Using the first decade (1980-1990) as reference, changes in the following decades were evaluated with a methodology for classifying extreme events adapted from Frich (2002) and Gao (2006): the higher the class, the more intense the event. An increase of 25% was observed in total precipitation in the Brazilian Amazon by the end of the XXI century, with increases of 12% for extreme events of type 1, 9% for type 2, and 10% for type 3. On the other hand, a 17% decrease in precipitation was observed in the Brazilian Northeast, with pronounced decreases of 24% and 15% in the contributions of type 1 and type 2 extreme events to total precipitation, respectively. The number of normal-type events was higher in these three decades than in the reference decade 1980-1990, by 4 to 6 thousand events per decade; these gains came mostly at the expense of Class 1 events, which decreased by at least 3,500 events per decade. This suggests an intensification of extreme events, considering that the amount of precipitation per class increased while the number of events per class decreased. For the Northeast region, a 9% increase in the contribution of type 3 events was observed, as well as in the frequency of this type of event (about 700 more events). The largest decreases in the number of extreme events occurred in 2000-2010 for classes 1 and 3 (7.2% and 5.6%) and, by the end of the century, in class 3 (4.5%). For the three analyzed decades a total decrease of 8,400 events was

  6. Modelado del transformador para eventos de alta frecuencia ;Transformer model for high frequency events

    Directory of Open Access Journals (Sweden)

    Verónica Adriana – Galván Sanchez

    2012-07-01

    Full Text Available The transformer's function is to change the voltage level through a magnetic coupling. Due to its physical construction, its representation as a circuit and its mathematical model are very complex. The electromagnetic behavior of the transformer, like that of all elements of the electric power network, depends on the frequency involved. For high-frequency events its model must therefore be very detailed in order to reproduce the electromagnetic transient behavior. This work analyzes how to pass from a very simple model to a very detailed one for simulating high-frequency events. The simulated events are the operation of a switch due to a fault in the system and the impact of an atmospheric discharge (direct stroke) on the transmission line, 5 km away from a power substation.

  7. PSA-based evaluation and rating of operational events

    International Nuclear Information System (INIS)

    Gomez Cobo, A.

    1997-01-01

    The presentation discusses the PSA-based evaluation and rating of operational events, including the following: historical background, procedures for event evaluation using PSA, use of PSA for event rating, and current activities

  8. Frequency deviations and generation scheduling in the nordic system

    DEFF Research Database (Denmark)

    Li, Zhongwei; Samuelsson, Olaf; Garcia-Valle, Rodrigo

    2011-01-01

    to be considered, the disturbances caused to this control by the hourly dispatch of generation have received less attention and are the focus of this paper. Based on years of recorded PMU data, statistics of frequency events and analysis of frequency quality are made to demonstrate the relation between the frequency...

  9. An Oracle-based Event Index for ATLAS

    CERN Document Server

    Gallas, Elizabeth; The ATLAS collaboration; Petrova, Petya Tsvetanova; Baranowski, Zbigniew; Canali, Luca; Formica, Andrea; Dumitru, Andrei

    2016-01-01

    The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop-based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting these data in one place provides the opportunity to investigate various storage formats and technologies, assess which best serve the various use cases, and consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS, the services we have built based on this architecture, and our experience with it. We have indexed about 15 billion real data events and about 25 billion simulated events thus far, and have designed the system to accommodate future data, with expected rates of 5 and 20 billion events per year for real data and simulation, respectively. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data ...
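    The event-wise lookup service described above can be caricatured with an in-memory index keyed by run and event number; the run/event numbers and file labels below are invented, and a real deployment uses Oracle/Hadoop rather than a Python dict:

```python
# Toy event index: map (run_number, event_number) -> locations holding that event.
event_index = {}

def index_event(run, event, location):
    """Register one copy of an event (an event can live in several datasets)."""
    event_index.setdefault((run, event), []).append(location)

def lookup(run, event):
    """Event-wise service: where is this event stored?"""
    return event_index.get((run, event), [])

index_event(358031, 1442075, "file-A")            # hypothetical raw-data file
index_event(358031, 1442075, "file-B")            # same event in a derived dataset
print(lookup(358031, 1442075))                    # ['file-A', 'file-B']
print(lookup(358031, 999))                        # [] -> event not indexed
```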

  10. Power System Event Ranking Using a New Linear Parameter-Varying Modeling with a Wide Area Measurement System-Based Approach

    Directory of Open Access Journals (Sweden)

    Mohammad Bagher Abolhasani Jabali

    2017-07-01

    Full Text Available Detecting critical power system events for Dynamic Security Assessment (DSA) is required for reliability improvement. The approach proposed in this paper investigates the effects of events on dynamic behavior during the nonlinear system response, whereas common approaches use steady-state conditions after events. This paper presents new and enhanced indices for event ranking based on time-domain simulation and polytopic linear parameter-varying (LPV) modeling of a power system. In the proposed approach, a polytopic LPV representation is generated via linearization about some points of the nonlinear dynamic behavior of the power system using wide-area measurement system (WAMS) concepts, and event ranking is then based on the frequency response of the system models at the vertices. The nonlinear behavior of the system at the time of fault occurrence is thus considered for event ranking. The proposed algorithm is applied to a power system using nonlinear simulation. The comparison of the results, especially under different fault conditions, shows the advantages of the proposed approach and indices.

  11. Frequency Based Real-time Pricing for Residential Prosumers

    Science.gov (United States)

    Hambridge, Sarah Mabel

    This work is the first to explore frequency-based pricing for secondary frequency control as a price-reactive control mechanism for residential prosumers. A frequency-based real-time electricity rate is designed as an autonomous market control mechanism for residential prosumers to provide frequency support as an ancillary service. In addition, prosumers are empowered to participate in dynamic energy transactions, thereby integrating Distributed Energy Resources (DERs) and increasing distributed energy storage on the distribution grid. As the grid transitions towards DERs, a new market-based control system will take the place of the legacy distribution system and possibly the legacy bulk power system. DERs provide many benefits to prosumers, such as energy independence, clean generation, efficiency, and reliability during blackouts. However, the variable nature of renewable energy and the current lack of installed energy storage on the grid will create imbalances in supply and demand as uptake increases, affecting the grid frequency and system operation. Through a frequency-based electricity rate, prosumers will be encouraged to purchase energy storage systems (ESS) to offset their neighbors' distributed generation (DG) such as solar. Chapter 1 explains the deregulation of the power system and the move towards Distributed System Operators (DSOs), as prosumers become owners of microgrids and energy cells connected to the distribution system. Dynamic pricing has been proposed as a benefit to prosumers, giving them the ability to make decisions in the energy market while also providing a way to influence and control their behavior. Frequency-based real-time pricing is a type of dynamic pricing which falls between price-reactive control and transactive control. Prosumer-to-prosumer transactions may take the place of prosumer-to-utility transactions, building The Energy Internet. Frequency-based pricing could be a mechanism for determining prosumer prices and supporting
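    A frequency-based real-time rate can be sketched as a price that deviates from a base tariff in proportion to the grid-frequency error; the nominal frequency, base price, and slope below are illustrative assumptions, not the dissertation's actual rate design:

```python
NOMINAL_HZ = 60.0   # assumed nominal grid frequency
BASE_PRICE = 0.12   # $/kWh, hypothetical base tariff
SLOPE = 0.50        # $/kWh per Hz of deviation, hypothetical sensitivity

def frequency_price(freq_hz):
    """Under-frequency (scarce supply) raises the price, encouraging storage
    discharge; over-frequency (surplus) lowers it, encouraging charging."""
    return BASE_PRICE + SLOPE * (NOMINAL_HZ - freq_hz)

print(round(frequency_price(59.9), 4))  # 0.17 -> scarcity, above base
print(round(frequency_price(60.1), 4))  # 0.07 -> surplus, below base
```

    Because the signal is the locally measured frequency itself, each prosumer can react autonomously, which is what places this scheme between price-reactive and transactive control.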

  12. Extreme events in total ozone over the Northern mid-latitudes: an analysis based on long-term data sets from five European ground-based stations

    Energy Technology Data Exchange (ETDEWEB)

    Rieder, Harald E. (Inst. for Atmospheric and Climate Science, ETH Zurich, Zurich (Switzerland)), e-mail: hr2302@columbia.edu; Jancso, Leonhardt M. (Inst. for Atmospheric and Climate Science, ETH Zurich, Zurich (Switzerland); Inst. for Meteorology and Geophysics, Univ. of Innsbruck, Innsbruck (Austria)); Di Rocco, Stefania (Inst. for Atmospheric and Climate Science, ETH Zurich, Zurich (Switzerland); Dept. of Geography, Univ. of Zurich, Zurich (Switzerland)) (and others)

    2011-11-15

    We apply methods from extreme value theory to identify extreme events in high (termed EHOs) and low (termed ELOs) total ozone and to describe the distribution tails (i.e. very high and very low values) of five long-term European ground-based total ozone time series. The influence of these extreme events on observed mean values, long-term trends and changes is analysed. The results show a decrease in EHOs and an increase in ELOs during the last decades, and establish that the observed downward trend in column ozone during the 1970-1990s is strongly dominated by changes in the frequency of extreme events. Furthermore, it is shown that clear 'fingerprints' of atmospheric dynamics (NAO, ENSO) and chemistry [ozone depleting substances (ODSs), polar vortex ozone loss] can be found in the frequency distribution of ozone extremes, even if no attribution is possible from standard metrics (e.g. annual mean values). The analysis complements earlier analyses for the world's longest total ozone record at Arosa, Switzerland, confirming and revealing the strong influence of atmospheric dynamics on observed ozone changes. The results provide clear evidence that, in addition to ODSs, volcanic eruptions and strong/moderate ENSO and NAO events had significant influence on column ozone in the European sector.
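    Identifying EHO- and ELO-like values amounts to flagging observations in the distribution tails; a crude empirical-quantile sketch (far simpler than the extreme-value-theory fits used in the paper, with illustrative cutoffs) looks like:

```python
def tail_events(series, low_q=0.05, high_q=0.95):
    """Split a series into ELO-like values (below the low empirical quantile)
    and EHO-like values (above the high one). Quantiles are illustrative."""
    s = sorted(series)
    lo = s[int(low_q * (len(s) - 1))]
    hi = s[int(high_q * (len(s) - 1))]
    return [x for x in series if x < lo], [x for x in series if x > hi]

# Illustrative: on the values 1..100 the 5%/95% cutoffs isolate the two tails.
elos, ehos = tail_events(list(range(1, 101)))
print(len(elos), len(ehos))  # 4 5
```

    Counting such tail events per decade, rather than averaging the series, is what lets frequency changes in the extremes be separated from changes in the mean.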

  13. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes documents in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs.

  14. Probe-controlled soliton frequency shift in the regime of optical event horizon.

    Science.gov (United States)

    Gu, Jie; Guo, Hairun; Wang, Shaofei; Zeng, Xianglong

    2015-08-24

    In the optical analogy of the event horizon, temporal pulse collision and mutual interactions occur mainly between an intense solitary wave (soliton) and a dispersive probe wave. In this regime, we numerically investigate the probe-controlled soliton frequency shift as well as soliton self-compression. In particular, in a dispersion landscape with multiple zero-dispersion wavelengths, bi-directional soliton spectral tunneling effects are possible. Moreover, we propose a mid-infrared soliton self-compression for the generation of few-cycle ultrashort pulses in bulk quadratic nonlinear crystals, in contrast to optical fibers or cubic nonlinear media, which could provide the community with a simple and flexible route to experimental implementations.

  15. Electrophysiological correlates of strategic monitoring in event-based and time-based prospective memory.

    Directory of Open Access Journals (Sweden)

    Giorgia Cona

    Full Text Available Prospective memory (PM) is the ability to remember to accomplish an action when a particular event occurs (i.e., event-based PM) or at a specific time (i.e., time-based PM) while performing an ongoing activity. Strategic Monitoring is one of the basic cognitive functions supporting PM tasks, and involves two mechanisms: a retrieval mode, which consists of keeping the intention active in memory; and target checking, engaged in verifying the presence of the PM cue in the environment. The present study aims to provide the first evidence of event-related potentials (ERPs) associated with time-based PM, and to examine differences and commonalities in the ERPs related to Strategic Monitoring mechanisms between event- and time-based PM tasks. The addition of an event-based or a time-based PM task to an ongoing activity led to a similar sustained positive modulation of the ERPs in the ongoing trials, mainly expressed over prefrontal and frontal regions. This modulation might index the retrieval mode mechanism, similarly engaged in the two PM tasks. On the other hand, two further ERP modulations were shown specifically in the event-based PM task. An increased positivity was shown at 400-600 ms post-stimulus over occipital and parietal regions, which might be related to target checking. Moreover, an early modulation at 130-180 ms post-stimulus seems to reflect the recruitment of attentional resources for being ready to respond to the event-based PM cue. This latter modulation suggests the existence of a third mechanism specific to event-based PM: the 'readiness mode'.

  16. Frequency of Extreme Heat Event as a Surrogate Exposure Metric for Examining the Human Health Effects of Climate Change.

    Directory of Open Access Journals (Sweden)

    Crystal Romeo Upperman

    Full Text Available Epidemiological investigation of the impact of climate change on human health, particularly chronic diseases, is hindered by the lack of exposure metrics that can be used as markers of climate change and are compatible with health data. Here, we present a surrogate exposure metric created using a 30-year baseline (1960-1989) that allows users to quantify long-term changes in exposure to the frequency of extreme heat events, with near-unabridged spatial coverage at a scale that is compatible with national/state health outcome data. We evaluate the exposure metric by decade, seasonality, and area of the country, and assess its ability to capture long-term changes in weather (climate), including natural climate modes. Our findings show that this generic exposure metric is potentially useful for monitoring trends in the frequency of extreme heat events across varying regions because it captures long-term changes; is sensitive to natural climate modes (ENSO events); responds well to spatial variability; and is amenable to spatial/temporal aggregation, making it useful for epidemiological studies.
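    A baseline-relative exposure metric of this kind can be sketched as counting, in a later period, the days exceeding a percentile threshold derived from the baseline era; the temperatures and percentile below are illustrative, not the metric's exact definition:

```python
def extreme_heat_frequency(baseline_temps, period_temps, pct=95):
    """Derive a threshold from the baseline-era distribution, then count
    exceedance days in a later period (the 'frequency of extreme heat events')."""
    s = sorted(baseline_temps)
    threshold = s[int(pct / 100 * (len(s) - 1))]
    exceedances = sum(1 for t in period_temps if t > threshold)
    return exceedances, threshold

# Illustrative daily maximum temperatures (degrees F, invented values).
baseline = list(range(60, 110))          # stands in for a 1960-1989-style record
later = [100, 107, 108, 90, 111, 95]     # a handful of later-period days
count, thr = extreme_heat_frequency(baseline, later)
print(count, thr)  # 3 106
```

    Fixing the threshold to the baseline period is what makes decade-to-decade counts comparable, so upward drift in the counts reflects long-term change rather than a moving definition of "extreme".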

  17. Thermal-Diffusivity-Based Frequency References in Standard CMOS

    NARCIS (Netherlands)

    Kashmiri, S.M.

    2012-01-01

    In recent years, a lot of research has been devoted to the realization of accurate integrated frequency references. A thermal-diffusivity-based (TD) frequency reference provides an alternative method of on-chip frequency generation in standard CMOS technology. A frequency-locked loop locks the

  18. Carbon nanotube transistor based high-frequency electronics

    Science.gov (United States)

    Schroter, Michael

    At the nanoscale, carbon nanotubes (CNTs) have higher carrier mobility and carrier velocity than most incumbent semiconductors. Thus CNT-based field-effect transistors (FETs) are being considered as strong candidates for replacing existing MOSFETs in digital applications. In addition, the predicted high intrinsic transit frequency and the more recent finding of ways to achieve highly linear transfer characteristics have inspired investigations of analog high-frequency (HF) applications. High linearity is extremely valuable for an energy-efficient usage of the frequency spectrum, particularly in mobile communications. Compared to digital applications, the much more relaxed constraints on CNT placement and lithography, combined with already achieved operating frequencies of at least 10 GHz for fabricated devices, make an early entry into the low-GHz HF market more feasible than into large-scale digital circuits. Such a market entry would be extremely beneficial for funding the development of a production CNTFET-based process technology. This talk will provide an overview of the present status and feasibility of HF CNTFET technology from an engineering point of view, including device modeling, experimental results, and existing roadblocks.

  19. Aerosol events in the broader Mediterranean basin based on 7-year (2000–2007 MODIS C005 data

    Directory of Open Access Journals (Sweden)

    A. Gkikas

    2009-09-01

    Full Text Available Aerosol events (their frequency and intensity) in the broader Mediterranean basin were studied using 7 years (2000-2007) of aerosol optical depth (AOD) data at 550 nm from the MODerate Resolution Imaging Spectroradiometer (MODIS) Terra. The complete spatial coverage of the data revealed a significant spatial variability of aerosol events, which also depends on their intensity. Strong events occur more often in the western and central Mediterranean basin (up to 14 events/year), whereas extreme events (AOD up to 5.0) are systematically observed in the eastern Mediterranean basin throughout the year. There is also a significant seasonal variability, with strong aerosol events occurring most frequently in the western part of the basin in summer and extreme episodes in the eastern part during spring. The events were also analyzed separately over land and sea, revealing differences that are due to different natural and anthropogenic processes, like dust transport (producing maximum frequencies of extreme episodes in spring over both land and sea) or forest fires (producing maximum frequencies of strong episodes in summer over land). The inter-annual variability shows a gradual decrease in the frequency of all aerosol episodes over land and sea areas of the Mediterranean during the period 2000-2007, associated with an increase in their intensity (increased AOD values). The strong spatiotemporal variability of aerosol events indicates the need to monitor them at the highest possible spatial and temporal coverage and resolution.
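    Counting strong versus extreme aerosol-event days from a daily AOD series can be sketched as threshold classification; the thresholds below are invented for illustration and are not the study's event criteria (which are defined statistically from the AOD record itself):

```python
def classify_aerosol_days(aod_series, strong=0.6, extreme=1.2):
    """Split daily AOD values into strong-event and extreme-event days.
    The two cutoffs are hypothetical, for illustration only."""
    strong_days = sum(1 for a in aod_series if strong <= a < extreme)
    extreme_days = sum(1 for a in aod_series if a >= extreme)
    return strong_days, extreme_days

# Invented daily AOD values for one site.
print(classify_aerosol_days([0.3, 0.7, 1.5, 0.9, 2.0, 0.5]))  # (2, 2)
```

    Applying such a per-pixel count season by season is what yields frequency maps like the events/year figures quoted above.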

  20. Effects of low-frequency repetitive transcranial magnetic stimulation on event-related potential P300

    Science.gov (United States)

    Torii, Tetsuya; Sato, Aya; Iwahashi, Masakuni; Iramina, Keiji

    2012-04-01

    The present study analyzed the effects of repetitive transcranial magnetic stimulation (rTMS) on brain activity. The P300 latency of the event-related potential (ERP) was used to evaluate the effects of low-frequency, short-term rTMS applied to the supramarginal gyrus (SMG), which is considered a region related to the origin of the P300. In addition, the prolonged effects on P300 latency were analyzed after applying rTMS. A figure-eight coil was used to stimulate the left or right SMG, with a magnetic stimulation intensity of 80% of the motor threshold. A total of 100 magnetic pulses were applied per rTMS session, at a stimulus frequency of either 0.5 or 1 Hz. Following rTMS, an odd-ball task was performed and the P300 latency of the ERP was measured. The odd-ball task was performed at 5, 10, and 15 min post-rTMS. The ERP was measured prior to magnetic stimulation as a control. The electroencephalograph (EEG) was measured at Fz, Cz, and Pz, as defined by the international 10-20 electrode system. Results demonstrated different effects on P300 latency between 0.5 and 1 Hz rTMS. With 1 Hz low-frequency magnetic stimulation of the left SMG, P300 latency decreased; compared to the control, the latency difference was approximately 15 ms at Cz, and this decrease continued for approximately 10 min post-rTMS. In contrast, 0.5 Hz rTMS resulted in delayed P300 latency; compared to the control, the latency difference was approximately 20 ms at Fz, and this delayed effect continued for approximately 15 min post-rTMS. These results demonstrate that P300 latency varies according to rTMS frequency, and that the duration of the effect differs with the stimulus frequency of low-frequency rTMS.

  1. Event-related desynchronization and synchronization in MEG: Framework for analysis and illustrative datasets related to discrimination of frequency-modulated tones.

    Science.gov (United States)

    Zygierewicz, J; Sieluzycki, C; König, R; Durka, P J

    2008-02-15

    We introduce a complete framework for the calculation of statistically significant event-related desynchronization and synchronization (ERD/ERS) in the time-frequency plane for magnetoencephalographic (MEG) data, and provide free Internet access to software and illustrative datasets related to a classification task of frequency-modulated (FM) tones. Event-related changes in MEG were analysed on the basis of the normal component of the magnetic field acquired by the 148 magnetometers of the hardware configuration of our whole-head MEG device, and by computing planar gradients in the longitudinal and latitudinal directions. Time-frequency energy density for the magnetometer as well as the two gradient configurations is first approximated using the short-time Fourier transform. Subsequently, detailed information is obtained from high-resolution time-frequency maps for the most interesting sensors by means of the computationally much more demanding matching pursuit parametrization. We argue that the ERD/ERS maps are easier to interpret in the gradient approaches, and discuss the superior resolution of the matching pursuit time-frequency representation compared to the short-time Fourier and wavelet transforms. Experimental results are accompanied by the following resources, available from http://brain.fuw.edu.pl/MEG: (a) 48 high-resolution figures presenting the results of four subjects in all applicable settings, (b) raw datasets, and (c) a complete software environment allowing these figures to be recomputed from the raw datasets.
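    The ERD/ERS quantity itself is a simple relative band-power change; a minimal sketch with invented band-power values (the framework above estimates these powers from short-time Fourier and matching pursuit decompositions):

```python
def erd_percent(event_power, baseline_power):
    """ERD/ERS in percent: negative values indicate desynchronization (a power
    drop after the event), positive values indicate synchronization."""
    return 100.0 * (event_power - baseline_power) / baseline_power

# Hypothetical alpha-band power: 10 units at rest dropping to 6 after the stimulus.
print(erd_percent(6.0, 10.0))  # -40.0 -> 40% desynchronization
```

    Computing this ratio in every time-frequency cell, and masking cells that fail a significance test across trials, produces the ERD/ERS maps discussed in the paper.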

  2. Joint time-frequency analysis of EEG signals based on a phase-space interpretation of the recording process

    Science.gov (United States)

    Testorf, M. E.; Jobst, B. C.; Kleen, J. K.; Titiz, A.; Guillory, S.; Scott, R.; Bujarski, K. A.; Roberts, D. W.; Holmes, G. L.; Lenck-Santini, P.-P.

    2012-10-01

    Time-frequency transforms are used to identify events in clinical EEG data. Data are recorded as part of a study for correlating the performance of human subjects during a memory task with pathological events in the EEG, called spikes. The spectrogram and the scalogram are reviewed as tools for evaluating spike activity. A statistical evaluation of the continuous wavelet transform across trials is used to quantify phase-locking events. For simultaneously improving the time and frequency resolution, and for representing the EEG of several channels or trials in a single time-frequency plane, a multichannel matching pursuit algorithm is used. Fundamental properties of the algorithm are discussed as well as preliminary results, which were obtained with clinical EEG data.
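    One column of the spectrogram mentioned above is just the magnitude spectrum of a windowed signal frame; a naive stdlib DFT sketch (real EEG pipelines would use FFT-based spectrogram or wavelet routines, and the sample frame here is synthetic):

```python
import cmath

def stft_frame(signal, start, win):
    """Magnitude spectrum of one windowed frame via a naive DFT: this is a
    single column of a spectrogram (bins 0..win/2)."""
    seg = signal[start:start + win]
    return [abs(sum(x * cmath.exp(-2j * cmath.pi * k * n / win)
                    for n, x in enumerate(seg))) / win
            for k in range(win // 2 + 1)]

# An 8-sample cosine completing 2 cycles concentrates its energy in bin 2.
frame = [1.0, 0.0, -1.0, 0.0, 1.0, 0.0, -1.0, 0.0]
spectrum = stft_frame(frame, 0, 8)
print([round(m, 3) for m in spectrum])  # [0.0, 0.0, 0.5, 0.0, 0.0]
```

    Sliding the window along the recording and stacking the columns gives the time-frequency plane on which spike-related energy bursts are located.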

  3. Reliability research based experience with systems and events at the Kozloduy NPP units 1-4

    Energy Technology Data Exchange (ETDEWEB)

    Khristova, R; Kaltchev, B; Dimitrov, B [Energoproekt, Sofia (Bulgaria); Nedyalkova, D; Sonev, A [Kombinat Atomna Energetika, Kozloduj (Bulgaria)

    1996-12-31

    An overview of equipment reliability based on operational data of selected safety systems at the Kozloduy NPP is presented. Conclusions are drawn on reliability of the service water system, feed water system, emergency power supply - category 2, emergency high pressure ejection system and spray system. For the units 1-4 all recorded accident protocols in the period 1974-1993 have been processed and the main initiators identified. A list with 39 most frequent initiators of accidents/incidents is compiled. The human-caused errors account for 27% of all events. The reliability characteristics and frequencies have been calculated for all initiating events. It is concluded that there have not been any accidents with consequences for fuel integrity or radioactive release. 14 refs.
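    Turning recorded event counts into initiating-event frequencies is a simple rate estimate; the event count and exposure below are invented for illustration (only the 27% human-error share comes from the record above):

```python
def event_frequency(n_events, exposure_years):
    """Point estimate of initiating-event frequency: events per unit-year."""
    return n_events / exposure_years

# Hypothetical: 120 recorded events over 4 units x 20 years of operation.
freq = event_frequency(120, 4 * 20)
print(freq)                    # 1.5 events per unit-year
print(round(freq * 0.27, 3))   # 0.405 -> share attributable to human-caused errors
```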

  4. Reliability research based experience with systems and events at the Kozloduy NPP units 1-4

    International Nuclear Information System (INIS)

    Khristova, R.; Kaltchev, B.; Dimitrov, B.; Nedyalkova, D.; Sonev, A.

    1995-01-01

    An overview of equipment reliability based on operational data of selected safety systems at the Kozloduy NPP is presented. Conclusions are drawn on reliability of the service water system, feed water system, emergency power supply - category 2, emergency high pressure ejection system and spray system. For the units 1-4 all recorded accident protocols in the period 1974-1993 have been processed and the main initiators identified. A list with 39 most frequent initiators of accidents/incidents is compiled. The human-caused errors account for 27% of all events. The reliability characteristics and frequencies have been calculated for all initiating events. It is concluded that there have not been any accidents with consequences for fuel integrity or radioactive release. 14 refs

  5. FREQUENCY OF SOLAR-LIKE SYSTEMS AND OF ICE AND GAS GIANTS BEYOND THE SNOW LINE FROM HIGH-MAGNIFICATION MICROLENSING EVENTS IN 2005-2008

    International Nuclear Information System (INIS)

    Gould, A.; Dong, Subo; Gaudi, B. S.; Han, C.

    2010-01-01

    We present the first measurement of the planet frequency beyond the 'snow line', for the planet-to-star mass-ratio interval -4.5 < log q < -2. We find d^2 N_pl/(d log q d log s) = (0.36±0.15) dex^-2 at the mean mass ratio q = 5×10^-4, with no discernible deviation from a flat (Oepik's law) distribution in log-projected separation s. The determination is based on a sample of six planets detected from intensive follow-up observations of high-magnification (A>200) microlensing events during 2005-2008. The sampled host stars have a typical mass M_host ∼ 0.5 M_sun, and detection is sensitive to planets over a range of planet-star-projected separations (s_max^-1 R_E, s_max R_E), where R_E ∼ 3.5 AU (M_host/M_sun)^1/2 is the Einstein radius and s_max ∼ (q/10^-4.3)^1/3. This corresponds to deprojected separations roughly three times the 'snow line'. We show that the observations of these events have the properties of a 'controlled experiment', which is what permits measurement of absolute planet frequency. High-magnification events are rare, but the survey-plus-follow-up high-magnification channel is very efficient: half of all high-mag events were successfully monitored and half of these yielded planet detections. The extremely high sensitivity of high-mag events leads to a policy of monitoring them as intensively as possible, independent of whether they show evidence of planets. This is what allows us to construct an unbiased sample. The planet frequency derived from microlensing is a factor 8 larger than the one derived from Doppler studies at factor ∼25 smaller star-planet separations (i.e., periods 2-2000 days). However, this difference is basically consistent with the gradient derived from Doppler studies (when extrapolated well beyond the separations from which it is measured). This suggests a universal separation distribution across 2 dex in planet-star separation, 2 dex in mass ratio, and 0.3 dex in host mass. Finally, if all planetary systems were 'analogs' of the solar
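    Because the measured mass-ratio function is flat, an expected planet yield follows by multiplying the quoted rate by the area of the surveyed box in (log q, log s) space; the separation range used below is illustrative, not the paper's exact sensitivity window:

```python
# Rate from the abstract: d^2 N / (d log q d log s) = 0.36 planets per star per dex^2.
RATE = 0.36

def expected_planets(delta_log_q, delta_log_s, rate=RATE):
    """Expected planets per star over a box in (log q, log s), assuming the
    flat distribution reported above. The box widths are illustrative inputs."""
    return rate * delta_log_q * delta_log_s

# The sampled mass-ratio interval spans 2.5 dex; assume ~0.6 dex in separation.
print(round(expected_planets(2.5, 0.6), 3))
```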

  6. OBEST: The Object-Based Event Scenario Tree Methodology

    International Nuclear Information System (INIS)

    WYSS, GREGORY D.; DURAN, FELICIA A.

    2001-01-01

Event tree analysis and Monte Carlo-based discrete event simulation have been used in risk assessment studies for many years. This report details how features of these two methods can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology with some of the best features of each. The resulting Object-Based Event Scenario Tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios in which an a priori discovery of event ordering is either cumbersome or impossible (especially those that exhibit inconsistent or variable event ordering, which are difficult to represent in an event tree analysis). Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST method uses a recursive algorithm to solve the object model and identify all possible scenarios and their associated probabilities. Since scenario likelihoods are developed directly by the solution algorithm, they need not be computed by statistical inference from Monte Carlo observations (as required by some discrete event simulation methods). Thus, OBEST is not only much more computationally efficient than these simulation methods, but it also discovers scenarios that have extremely low probabilities as a natural analytical result: scenarios that would likely be missed by a Monte Carlo-based method. This report documents the OBEST methodology and the demonstration software that implements it, and provides example OBEST models for several application domains, including interactions among failing interdependent infrastructure systems, circuit analysis for fire risk evaluation in nuclear power plants, and aviation safety studies.
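The recursive scenario-enumeration idea can be illustrated with a minimal sketch. This is not the OBEST demonstration software; the object names, outcomes, and branch probabilities below are invented for illustration:

```python
# Toy object model: each object responds with probabilistic branches
# (names and probabilities are hypothetical, not from OBEST).
objects = [
    ("pump",  [("runs", 0.95), ("fails", 0.05)]),
    ("valve", [("opens", 0.90), ("sticks", 0.10)]),
]

def enumerate_scenarios(remaining, path=(), likelihood=1.0):
    """Recursively walk every branch, yielding (scenario, likelihood)."""
    if not remaining:
        yield path, likelihood
        return
    name, branches = remaining[0]
    for outcome, p in branches:
        yield from enumerate_scenarios(
            remaining[1:], path + ((name, outcome),), likelihood * p)

scenarios = dict(enumerate_scenarios(objects))
# Likelihoods come out of the solution algorithm analytically, so even a
# rare sequence like pump-fails/valve-sticks (p = 0.005) is found exactly
# rather than waiting for Monte Carlo samples to hit it.
```

Because every branch is visited exactly once, the scenario likelihoods sum to one by construction, which is the property that lets rare scenarios surface analytically.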

  7. CMS DAQ Event Builder Based on Gigabit Ethernet

    CERN Document Server

    Bauer, G; Branson, J; Brett, A; Cano, E; Carboni, A; Ciganek, M; Cittolin, S; Erhan, S; Gigi, D; Glege, F; Gómez-Reino, Robert; Gulmini, M; Gutiérrez-Mlot, E; Gutleber, J; Jacobs, C; Kim, J C; Klute, M; Lipeles, E; Lopez-Perez, Juan Antonio; Maron, G; Meijers, F; Meschi, E; Moser, R; Murray, S; Oh, A; Orsini, L; Paus, C; Petrucci, A; Pieri, M; Pollet, L; Rácz, A; Sakulin, H; Sani, M; Schieferdecker, P; Schwick, C; Sumorok, K; Suzuki, I; Tsirigkas, D; Varela, J

    2007-01-01

The CMS Data Acquisition System is designed to build and filter events originating from 476 detector data sources at a maximum trigger rate of 100 kHz. Different architectures and switch technologies have been evaluated to accomplish this purpose. Events will be built in two stages: the first stage is a set of event builders called FED Builders, based on Myrinet technology, which pre-assemble groups of about 8 data sources. The second stage is a set of event builders called Readout Builders, which build full events. A single Readout Builder will build events from 72 sources of 16 kB fragments at a rate of 12.5 kHz. In this paper we present the design of a Readout Builder based on TCP/IP over Gigabit Ethernet and the optimization that was required to achieve the design throughput. This optimization covers the architecture of the Readout Builder, the TCP/IP setup, and the hardware selection.

  8. SUBTLEX-ESP: Spanish Word Frequencies Based on Film Subtitles

    Science.gov (United States)

    Cuetos, Fernando; Glez-Nosti, Maria; Barbon, Analia; Brysbaert, Marc

    2011-01-01

Recent studies have shown that word frequency estimates obtained from film and television subtitles are better at predicting performance in word recognition experiments than the traditional word frequency estimates based on books and newspapers. In this study, we present a subtitle-based word frequency list for Spanish, one of the most widely spoken…
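As a sketch of how a subtitle-based frequency norm is assembled (the corpus below is a made-up fragment, not SUBTLEX-ESP data):

```python
from collections import Counter

# Made-up Spanish subtitle fragment standing in for a film-subtitle corpus.
corpus = "no lo sé pero sé que no es fácil no".split()

counts = Counter(corpus)
total = sum(counts.values())

# Norms of this kind are usually reported as frequency per million words.
freq_per_million = {w: c * 1_000_000 / total for w, c in counts.items()}
```

Scaling raw counts to occurrences per million words is what makes frequencies comparable across corpora of different sizes.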

  9. Trends and characteristics observed in nuclear events based on international nuclear event scale reports

    International Nuclear Information System (INIS)

    Watanabe, Norio

    2001-01-01

The International Nuclear Event Scale (INES) is jointly operated by the IAEA and the OECD-NEA as a means of providing prompt, clear and consistent information on nuclear events that occur at nuclear facilities, and of facilitating communication between the nuclear community, the media and the public. Nuclear events are reported to the INES with a rating on the 'Scale', a consistent safety significance indicator which runs from level 0, for events with no safety significance, to level 7, for a major accident with widespread health and environmental effects. Since the operation of INES was initiated in 1990, approximately 500 events have been reported and disseminated. The present paper discusses the trends observed in nuclear events, such as overall trends of the reported events and characteristics of safety-significant events rated level 2 or higher, based on the INES reports. (author)

  10. An Oracle-based event index for ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00083337; The ATLAS collaboration; Dimitrov, Gancho

    2017-01-01

The ATLAS EventIndex system has amassed a set of key quantities for a large number of ATLAS events into a Hadoop-based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting these data in one place provides the opportunity to investigate various storage formats and technologies, assess which best serve the various use cases, and consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS (relational database management system), the services we have built based on this architecture, and our experience with it. We have indexed about 26 billion real data events thus far and have designed the system to accommodate future data, which has expected rates of 5 and 20 billion events per year. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data with other complementary metadata in AT...

  11. A SAS-based solution to evaluate study design efficiency of phase I pediatric oncology trials via discrete event simulation.

    Science.gov (United States)

    Barrett, Jeffrey S; Jayaraman, Bhuvana; Patel, Dimple; Skolnik, Jeffrey M

    2008-06-01

Previous exploration of oncology study design efficiency has focused on Markov processes alone (probability-based events) without consideration for time dependencies. Barriers to study completion include time delays associated with patient accrual, inevaluability (IE), time to dose-limiting toxicities (DLT), and administrative and review time. Discrete event simulation (DES) can incorporate probability-based assignment of DLT and IE frequency, correlated with cohort in the case of DLT, with time-based events defined by stochastic relationships. A SAS-based solution to examine study efficiency metrics and to evaluate design modifications that would improve efficiency is presented. Virtual patients are simulated with attributes defined from prior distributions of relevant patient characteristics. Study population datasets are read into SAS macros, which select patients and enroll them into a study based on the specific design criteria if the study is open to enrollment. Waiting times, arrival times and times to study events are also sampled from prior distributions; post-processing of study simulations is provided within the decision macros and compared across designs in a separate post-processing algorithm. This solution is examined via comparison of the standard 3+3 decision rule relative to the "rolling 6" design, a newly proposed enrollment strategy for the phase I pediatric oncology setting.
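The time-based side of such a simulation can be sketched in a few lines. This is not the paper's SAS solution; the distributions and parameters below are invented, and only the serial-cohort logic of the classic 3+3 design is mimicked:

```python
import random

random.seed(7)  # fixed seed so the toy run is reproducible

def simulate_study_days(cohort_size=3, n_cohorts=4,
                        mean_interarrival=14.0, obs_window=28.0):
    """Toy discrete-event sketch: patients arrive at exponentially
    distributed intervals; each cohort must finish its DLT observation
    window before the next cohort opens (serial 3+3-style enrollment)."""
    t = 0.0
    for _ in range(n_cohorts):
        for _ in range(cohort_size):
            t += random.expovariate(1.0 / mean_interarrival)  # accrual delay
        t += obs_window  # wait out the cohort's observation window
    return t

# Averaging many simulated studies gives a duration metric that can be
# compared across candidate designs (e.g., 3+3 versus "rolling 6").
mean_days = sum(simulate_study_days() for _ in range(1000)) / 1000
```

With these invented parameters the expected duration is 4 × 28 days of observation plus 12 × 14 days of accrual, about 280 days; changing the enrollment rule changes only the accrual term.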

  12. An Oracle-based event index for ATLAS

    Science.gov (United States)

    Gallas, E. J.; Dimitrov, G.; Vasileva, P.; Baranowski, Z.; Canali, L.; Dumitru, A.; Formica, A.; ATLAS Collaboration

    2017-10-01

The ATLAS EventIndex system has amassed a set of key quantities for a large number of ATLAS events into a Hadoop-based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting these data in one place provides the opportunity to investigate various storage formats and technologies, assess which best serve the various use cases, and consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS (relational database management system), the services we have built based on this architecture, and our experience with it. We have indexed about 26 billion real data events thus far and have designed the system to accommodate future data, which has expected rates of 5 and 20 billion events per year. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data with other complementary metadata in ATLAS, the system has been easily extended to perform essential assessments of data integrity and completeness and to identify event duplication, including at what step in processing the duplication occurred.

  13. Activated Very Low Frequency Earthquakes By the Slow Slip Events in the Ryukyu Subduction Zone

    Science.gov (United States)

    Nakamura, M.; Sunagawa, N.

    2014-12-01

The Ryukyu Trench (RT), where the Philippine Sea plate is subducting, has had no known thrust earthquakes with Mw > 8.0 in the last 300 years. However, the rupture source of the 1771 tsunami has been proposed as an Mw > 8.0 earthquake in the southern RT. Based on the dating of tsunami boulders, it has been estimated that large tsunamis occur at intervals of 150-400 years in the southern Ryukyu arc (RA) (Araoka et al., 2013), although they have not occurred for several thousand years in the central and northern Ryukyu areas (Goto et al., 2014). To address the discrepancy between the recent low moment release by earthquakes and the occurrence of paleo-tsunamis in the RT, we focus on the long-term activity of very low frequency earthquakes (VLFEs), which are good indicators of stress release at the shallow plate interface. VLFEs have been detected along the RT (Ando et al., 2012); they occur on the plate interface or in the accretionary prism. We used broadband data from the F-net of NIED along the RT and from the IRIS network. We applied two filters to all the raw broadband seismograms: a 0.02-0.05 Hz band-pass filter and a 1 Hz high-pass filter. After identification of the low-frequency events from the band-pass-filtered seismograms, the local and teleseismic events were removed. We then picked the arrival time of the maximum amplitude of the surface wave of the VLFEs and determined the epicenters. VLFEs occurred on the RA side within 100 km of the trench axis along the RT. The distribution of the 6670 VLFEs from 2002 to 2013 can be divided into several clusters, with principal large clusters located at 27.1°-29.0°N, 25.5°-26.6°N, and 122.1°-122.4°E (YA). We found that the VLFEs of the YA are modulated by repeating slow slip events (SSEs) which occur beneath the southern RA. The activity of the VLFEs increased to twice its ordinary rate within 15 days after the onset of the SSEs. Activation of the VLFEs could be generated by a low stress change of 0.02-20 kPa increase in

  14. A Kalman-based Fundamental Frequency Estimation Algorithm

    DEFF Research Database (Denmark)

    Shi, Liming; Nielsen, Jesper Kjær; Jensen, Jesper Rindom

    2017-01-01

Fundamental frequency estimation is an important task in speech and audio analysis. Harmonic model-based methods typically have superior estimation accuracy. However, such methods usually assume that the fundamental frequency and amplitudes are stationary over a short time frame. In this pape...
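A drastically simplified illustration of Kalman-based frequency tracking follows: a scalar random-walk filter smoothing noisy per-frame pitch estimates. This is not the harmonic-model method of the paper; all parameters and the test signal are invented:

```python
import random

random.seed(0)  # reproducible toy data

def kalman_track(observations, q=0.01, r=4.0, f0=100.0, p0=10.0):
    """Scalar Kalman filter with a random-walk state: the fundamental
    frequency drifts slowly (process noise q) while each frame supplies
    a noisy pitch measurement (measurement noise variance r)."""
    f, p, track = f0, p0, []
    for z in observations:
        p += q                 # predict: random-walk drift inflates variance
        k = p / (p + r)        # Kalman gain
        f += k * (z - f)       # correct with the noisy measurement
        p *= (1.0 - k)         # posterior variance
        track.append(f)
    return track

true_f = 110.0  # a constant 110 Hz tone (invented)
observations = [true_f + random.gauss(0.0, 2.0) for _ in range(200)]
estimates = kalman_track(observations)
```

The random-walk state is exactly what relaxes the stationarity assumption mentioned above: the filter lets the frequency drift from frame to frame instead of fixing it over a short window.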

15. Hierarchical Control of Thermostatically Controlled Loads for Primary Frequency Control

    DEFF Research Database (Denmark)

    Zhao, Haoran; Wu, Qiuwei; Huang, Shaojun

    2016-01-01

…reserve references. At the middle level, distribution substations estimate the available power of TCLs based on the aggregated bin model, and dispatch control signals to individual TCLs. At the local level, a supplementary frequency control loop is implemented at the local controller, which makes TCLs respond to the frequency event autonomously. Case studies show that the proposed controller can efficiently respond to frequency events and fulfill the requirement specified by the system operator. The users' comfort is not compromised and the short cycling of TCLs is largely reduced. Due to the autonomous control, the communication requirement is minimized.

16. Dust events in Beijing, China (2004–2006): comparison of ground-based measurements with columnar integrated observations

    Directory of Open Access Journals (Sweden)

    Z. J. Wu

    2009-09-01

Ambient particle number size distributions spanning three years were used to characterize the frequency and intensity of atmospheric dust events in the urban areas of Beijing, China, in combination with AERONET sun/sky radiometer data. Dust events were classified into two types based on the differences in particle number and volume size distributions and local weather conditions. This categorization was confirmed by aerosol index images, columnar aerosol optical properties, and vertical potential temperature profiles. During the type-1 events, dust particles dominated the total particle volume concentration (<10 μm), with a relative share over 70%. Anthropogenic particles in the Aitken and accumulation mode played a subordinate role here because of high wind speeds (>4 m s−1). The type-2 events occurred in rather stagnant air masses and were characterized by a lower volume fraction of coarse mode particles (on average, 55%). Columnar optical properties showed that the superposition of dust and anthropogenic aerosols in type-2 events resulted in a much higher AOD (average: 1.51) than for the rather pure dust aerosols in type-1 events (average AOD: 0.36). A discrepancy was found between the ground-based and column-integrated particle volume size distributions, especially for the coarse mode particles. This discrepancy likely originates from both the limited comparability of particle volume size distributions derived from Sun photometer and in situ number size distributions, and the inhomogeneous vertical distribution of particles during dust events.

  17. Do changes in the frequency, magnitude and timing of extreme climatic events threaten the population viability of coastal birds?

    NARCIS (Netherlands)

    van de Pol, Martijn; Ens, Bruno J.; Heg, Dik; Brouwer, Lyanne; Krol, Johan; Maier, Martin; Exo, Klaus-Michael; Oosterbeek, Kees; Lok, Tamar; Eising, Corine M.; Koffijberg, Kees

1. Climate change encompasses changes in both the means and the extremes of climatic variables, but the population consequences of the latter are intrinsically difficult to study. 2. We investigated whether the frequency, magnitude and timing of rare but catastrophic flooding events have changed

  18. Triple-Frequency GPS Precise Point Positioning Ambiguity Resolution Using Dual-Frequency Based IGS Precise Clock Products

    Directory of Open Access Journals (Sweden)

    Fei Liu

    2017-01-01

With the availability of the third civil signal in the Global Positioning System, triple-frequency Precise Point Positioning (PPP) ambiguity resolution methods have drawn increasing attention due to significantly reduced convergence time. However, the corresponding triple-frequency based precise clock products are not widely available or adopted by applications. Currently, most precise products are generated based on the ionosphere-free combination of dual-frequency L1/L2 signals, which are not consistent with triple-frequency ionosphere-free carrier-phase measurements, resulting in inaccurate positioning and unstable float ambiguities. In this study, a GPS triple-frequency PPP ambiguity resolution method is developed using the widely used dual-frequency based clock products. In this method, the inter-frequency clock biases between the triple-frequency and dual-frequency ionosphere-free carrier-phase measurements are first estimated and then applied to the triple-frequency ionosphere-free carrier-phase measurements to obtain stable float ambiguities. After this, the integer properties of the wide-lane L2/L5 and wide-lane L1/L2 ambiguities are recovered by estimating the satellite fractional cycle biases. A test using a sparse network is conducted to verify the effectiveness of the method. The results show that ambiguity resolution can be achieved in minutes, or even tens of seconds, and the positioning accuracy is at the decimeter level.
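The ionosphere-free combination that underlies these clock products can be written out directly. The sketch below shows only the standard dual-frequency L1/L2 relation (textbook GNSS material, not the paper's triple-frequency estimator; the range and delay values are invented):

```python
# First-order ionospheric delay scales as 1/f^2, so the combination
# IF = (f1^2*m1 - f2^2*m2) / (f1^2 - f2^2) cancels it exactly.
F1, F2 = 1575.42e6, 1227.60e6    # GPS L1 and L2 carrier frequencies (Hz)

def ionosphere_free(m1, m2, f1=F1, f2=F2):
    """Standard dual-frequency ionosphere-free combination."""
    return (f1**2 * m1 - f2**2 * m2) / (f1**2 - f2**2)

# Synthetic measurements: geometric range plus an ionospheric delay
# that scales with 1/f^2 between the two carriers (invented values).
rho, iono_l1 = 22_000_000.0, 4.0          # meters
m1 = rho + iono_l1
m2 = rho + iono_l1 * (F1 / F2) ** 2
rho_if = ionosphere_free(m1, m2)          # recovers the geometric range
```

The inconsistency discussed in the abstract arises because the analogous triple-frequency combination uses different coefficients, so clocks estimated with the L1/L2 combination absorb a bias relative to it.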

  19. A data base approach for prediction of deforestation-induced mass wasting events

    Science.gov (United States)

    Logan, T. L.

    1981-01-01

A major topic of concern in timber management is determining the impact of clear-cutting on slope stability. Deforestation treatments on steep mountain slopes have often resulted in a high frequency of major mass wasting events. The Geographic Information System (GIS) is a potentially useful tool for predicting the location of mass wasting sites. With a raster-based GIS, digitally encoded maps of slide hazard parameters can be overlaid and modeled to produce new maps depicting high-probability slide areas. The present investigation examines the raster-based information system as a tool for predicting which clear-cut mountain slopes are most likely to experience shallow soil debris avalanches. A literature overview is conducted, taking into account vegetation, roads, precipitation, soil type, slope angle and aspect, and models predicting mass soil movements. Attention is given to a data base approach and aspects of slide prediction.

  20. Rydberg-atom based radio-frequency electrometry using frequency modulation spectroscopy in room temperature vapor cells.

    Science.gov (United States)

    Kumar, Santosh; Fan, Haoquan; Kübler, Harald; Jahangiri, Akbar J; Shaffer, James P

    2017-04-17

Rydberg atom-based electrometry enables traceable electric field measurements with high sensitivity over a large frequency range, from gigahertz to terahertz. Such measurements are particularly useful for the calibration of radio frequency and terahertz devices, as well as for other applications like near-field imaging of electric fields. We utilize frequency-modulated spectroscopy with active control of residual amplitude modulation to improve the signal-to-noise ratio of the optical readout of Rydberg atom-based radio frequency electrometry. Matched filtering of the signal is also implemented. Although we have reached similarly high sensitivity with other read-out methods, frequency-modulated spectroscopy is advantageous because it is well suited for building a compact, portable sensor. In the current experiment, a sensitivity of ∼3 µV cm^-1 Hz^-1/2 is achieved and is found to be photon shot noise limited.

  1. Event generators for address event representation transmitters

    Science.gov (United States)

    Serrano-Gotarredona, Rafael; Serrano-Gotarredona, Teresa; Linares Barranco, Bernabe

    2005-06-01

Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows for real-time virtual massive connectivity between huge numbers of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Also, neurons generate 'events' according to their activity levels. More active neurons generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. In a typical AER transmitter chip, there is an array of neurons that generate events. They send events to a peripheral circuit (call it the "AER Generator") that transforms those events into neuron coordinates (addresses), which are put sequentially on an interchip high-speed digital bus. This bus includes a parallel multi-bit address word plus Rqst (request) and Ack (acknowledge) handshaking signals for asynchronous data exchange. Two main approaches have been published in the literature for implementing such "AER Generator" circuits. They differ in the way they handle event collisions coming from the array of neurons. One approach is based on detecting and discarding collisions, while the other incorporates arbitration for sequencing colliding events. The first approach is simpler and faster, while the second is able to handle much higher event traffic. In this article we concentrate on the second, arbiter-based approach. Boahen has published several techniques for implementing and improving the arbiter-based approach. Originally, he proposed an arbitration scheme by rows, followed by a column arbitration. In this scheme, while one neuron was selected by the arbiters to transmit its event off the chip, the rest of the neurons in the array were
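The row-then-column arbitration idea can be sketched in software. Real AER arbiters are asynchronous circuits, not sequential code, and the fixed-priority rule below is a stand-in for the hardware arbiter; neuron coordinates are invented:

```python
# Software sketch of row-then-column arbitration: among colliding
# requests, a row arbiter first grants one row, then a column arbiter
# grants one neuron within that row, until all events are serviced.
def arbitrate(requests):
    """requests: set of (row, col) event coordinates.
    Returns the order in which events win the bus."""
    order = []
    pending = set(requests)
    while pending:
        row = min(r for r, _ in pending)               # row arbiter grants a row
        col = min(c for r, c in pending if r == row)   # column arbiter within it
        order.append((row, col))
        pending.remove((row, col))                     # event leaves the chip
    return order

# Four neurons fire simultaneously; arbitration serializes them.
events = arbitrate({(2, 1), (0, 3), (0, 1), (2, 0)})
```

Unlike the collision-discarding approach, every colliding event is eventually transmitted, which is why the arbiter-based scheme sustains much higher traffic.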

  2. Estimating the impact of extreme events on crude oil price. An EMD-based event analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xun; Wang, Shouyang [Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190 (China); School of Mathematical Sciences, Graduate University of Chinese Academy of Sciences, Beijing 100190 (China); Yu, Lean [Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190 (China); Lai, Kin Keung [Department of Management Sciences, City University of Hong Kong, Tat Chee Avenue, Kowloon (China)

    2009-09-15

The impact of extreme events on crude oil markets is of great importance in crude oil price analysis, because such events generally exert a strong impact on crude oil markets. For a better estimate of the impact of events on crude oil price volatility, this study uses an EMD-based event analysis approach. In the proposed method, the time series to be analyzed is first decomposed into several intrinsic modes with different time scales, from fine to coarse, plus an average trend. The decomposed modes respectively capture the fluctuations caused by the extreme event or by other factors during the analyzed period. It is found that the total impact of an extreme event is included in only one or a few dominant modes, while the secondary modes provide valuable information on subsequent factors. For overlapping events whose influences last for different periods, the impacts are separated and located in different modes. For illustration and verification purposes, two extreme events, the Persian Gulf War in 1991 and the Iraq War in 2003, are analyzed step by step. The empirical results reveal that the EMD-based event analysis method provides a feasible solution for estimating the impact of extreme events on crude oil price variation. (author)
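A full EMD implementation is too long to sketch here, but the decompose-then-attribute idea can be imitated with a crude two-scale split, where a moving average stands in for the coarse intrinsic mode and the residual for the fine ones (this is not EMD, and all numbers are invented):

```python
def moving_average(x, w):
    """Centered moving average; windows shrink at the edges."""
    half = w // 2
    out = []
    for i in range(len(x)):
        window = x[max(0, i - half):i + half + 1]
        out.append(sum(window) / len(window))
    return out

# Synthetic 'price' series with a step shock at t = 50 standing in
# for an extreme event, on top of short-term fluctuation.
series = [20.0 + (5.0 if t >= 50 else 0.0) + ((-1) ** t) * 0.3
          for t in range(100)]

coarse = moving_average(series, 9)               # the event's impact lives here
fine = [s - c for s, c in zip(series, coarse)]   # short-term fluctuation
impact = coarse[-1] - coarse[0]                  # ≈ size of the shock
```

As in the abstract's finding, the event's total impact is read off a coarse mode, while the fine residual carries the short-term fluctuations around it.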

  3. Estimating the impact of extreme events on crude oil price. An EMD-based event analysis method

    International Nuclear Information System (INIS)

    Zhang, Xun; Wang, Shouyang; Yu, Lean; Lai, Kin Keung

    2009-01-01

The impact of extreme events on crude oil markets is of great importance in crude oil price analysis, because such events generally exert a strong impact on crude oil markets. For a better estimate of the impact of events on crude oil price volatility, this study uses an EMD-based event analysis approach. In the proposed method, the time series to be analyzed is first decomposed into several intrinsic modes with different time scales, from fine to coarse, plus an average trend. The decomposed modes respectively capture the fluctuations caused by the extreme event or by other factors during the analyzed period. It is found that the total impact of an extreme event is included in only one or a few dominant modes, while the secondary modes provide valuable information on subsequent factors. For overlapping events whose influences last for different periods, the impacts are separated and located in different modes. For illustration and verification purposes, two extreme events, the Persian Gulf War in 1991 and the Iraq War in 2003, are analyzed step by step. The empirical results reveal that the EMD-based event analysis method provides a feasible solution for estimating the impact of extreme events on crude oil price variation. (author)

  4. Application of precursor methodology in initiating frequency estimates

    International Nuclear Information System (INIS)

    Kohut, P.; Fitzpatrick, R.G.

    1991-01-01

The precursor methodology developed in recent years provides a consistent technique to identify important accident sequence precursors. It relies on operational events (extracting information from actual experience) and infers core damage scenarios based on expected safety system responses. The ranking or categorization of each precursor is determined by considering the full spectrum of potential core damage sequences. The methodology estimates the frequency of severe core damage based on the approach suggested by Apostolakis and Mosleh, which may lead to a potential overestimation of the severe-accident sequence frequency due to the inherent dependencies between the safety systems and the initiating events. The methodology is an encompassing attempt to incorporate most of the operating information available from nuclear power plants and is an attractive tool from the point of view of risk management. In this paper, a further extension of this methodology is discussed with regard to the treatment of the initiating frequency of the accident sequences
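The core of a precursor-based frequency estimate reduces to a one-line calculation: sum the conditional core-damage probabilities assessed for the observed precursors and divide by the exposure period. The probabilities and observation period below are invented for illustration, not taken from the paper:

```python
# Conditional core-damage probabilities assessed for each observed
# precursor event (hypothetical values), over a hypothetical
# observation period expressed in reactor-years.
precursor_ccdps = [1e-4, 3e-5, 5e-6, 2e-4]
reactor_years = 25.0

# Summing the conditional probabilities and dividing by the exposure
# gives a crude core-damage frequency estimate per reactor-year.
cdf_estimate = sum(precursor_ccdps) / reactor_years
```

The overestimation concern raised in the abstract enters here: if a precursor's conditional probability already reflects the initiating event that produced it, simply summing such terms can double-count the dependence between initiators and safety-system response.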

  5. Frequencies and trends of significant characteristics of reported events in Germany

    Energy Technology Data Exchange (ETDEWEB)

    Farber, G.; Matthes, H. [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, Koln (Germany)

    2001-07-01

    In the frame of its support to the German Federal Ministry for the Environment, Nature Conservation and Nuclear Safety the GRS continuously performs in-depth technical analyses of reported events at operating nuclear power reactors in Germany which can be used for the determination of plant weaknesses with regard to reactor safety. During the last 18 months, in addition to those activities, the GRS has developed a data bank model for the statistical assessment of events. This model is based on a hierarchically structured, detailed coding system with respect to technical and safety relevant characteristics of the plants and the systematic characterization of plant-specific events. The data bank model is ready for practical application. Results of a first statistical evaluation, taking into account the data sets from the time period 1996 to 1999, are meanwhile available. By increasing the amount of data it will become possible to herewith improve the statements concerning trends of safety aspects. This report describes the coding system, the evaluation model, the data input and the evaluations performed during the period beginning in April 2000. (authors)

  6. Frequencies and trends of significant characteristics of reported events in Germany

    International Nuclear Information System (INIS)

    Farber, G.; Matthes, H.

    2001-01-01

    In the frame of its support to the German Federal Ministry for the Environment, Nature Conservation and Nuclear Safety the GRS continuously performs in-depth technical analyses of reported events at operating nuclear power reactors in Germany which can be used for the determination of plant weaknesses with regard to reactor safety. During the last 18 months, in addition to those activities, the GRS has developed a data bank model for the statistical assessment of events. This model is based on a hierarchically structured, detailed coding system with respect to technical and safety relevant characteristics of the plants and the systematic characterization of plant-specific events. The data bank model is ready for practical application. Results of a first statistical evaluation, taking into account the data sets from the time period 1996 to 1999, are meanwhile available. By increasing the amount of data it will become possible to herewith improve the statements concerning trends of safety aspects. This report describes the coding system, the evaluation model, the data input and the evaluations performed during the period beginning in April 2000. (authors)

  7. Low velocity target detection based on time-frequency image for high frequency ground wave radar

    Institute of Scientific and Technical Information of China (English)

    YAN Songhua; WU Shicai; WEN Biyang

    2007-01-01

The Doppler spectral broadening resulting from the non-stationary movement of targets and from radio-frequency interference decreases the accuracy of target detection by high-frequency ground wave (HFGW) radar. By displaying the change of signal energy on two-dimensional time-frequency images based on time-frequency analysis, a new mathematical morphology method to distinguish targets from nonlinear time-frequency curves is presented. Results from the measured data verify that with this new method the target can be detected correctly from a wide Doppler spectrum.

  8. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events appendices

    International Nuclear Information System (INIS)

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L.

    1989-08-01

This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report are directed toward PRA practitioners who need to know how the work was done and the details for use in further studies. The mean core damage frequency is 4.5E-6 with 5% and 95% uncertainty bounds of 3.5E-7 and 1.3E-5, respectively. Station blackout type accidents (loss of all ac power) contributed about 46% of the core damage frequency, with Anticipated Transient Without Scram (ATWS) accidents contributing another 42%. The numerical results are driven by loss of offsite power, transients with the power conversion system initially available, operator errors, and mechanical failure to scram. 13 refs., 345 figs., 171 tabs.

  9. Methods for tornado frequency calculation of nuclear power plant

    International Nuclear Information System (INIS)

    Liu Haibin; Li Lin

    2012-01-01

To support probabilistic safety assessment of tornado attack events at nuclear power plants, a method for calculating the tornado frequency at a plant is introduced, based on the HAD 101/10 and NUREG/CR-4839 references. The method can take into account the historical tornado frequency of the plant area, the plant's construction dimensions, the variation of tornado intensity along the path and its area distribution, and so on, and calculates the frequency of tornadoes of different scales. (authors)
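The kind of calculation such a method performs can be sketched with a simple point-strike model. This is a generic textbook-style sketch, not the specific HAD 101/10 or NUREG/CR-4839 procedure, and all numbers are invented:

```python
# Crude tornado strike-frequency sketch: the expected strike frequency
# at a point structure is the regional tornado rate scaled by the
# fraction of the region a tornado's damage path (plus the structure's
# own footprint) covers.
tornadoes_per_year = 2.0       # historical count in the region (invented)
region_area_km2 = 10_000.0     # area over which that count was observed
mean_path_area_km2 = 5.0       # mean tornado damage-path area
structure_area_km2 = 0.01      # plant footprint

strike_freq = (tornadoes_per_year
               * (mean_path_area_km2 + structure_area_km2)
               / region_area_km2)   # strikes per year at the plant
```

Refinements of the kind the abstract mentions, such as intensity variation along the path, amount to weighting this area ratio by an intensity distribution so that a frequency is obtained for each tornado scale separately.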

  10. Single event burnout sensitivity of embedded field effect transistors

    International Nuclear Information System (INIS)

    Koga, R.; Crain, S.H.; Crawford, K.B.; Yu, P.; Gordon, M.J.

    1999-01-01

    Observations of single event burnout (SEB) in embedded field effect transistors are reported. Both SEB and other single event effects are presented for several pulse width modulation and high frequency devices. A microscope was employed to locate and investigate the damaged areas, and a model of the damage mechanism based on these observations is described

  11. Single event burnout sensitivity of embedded field effect transistors

    Energy Technology Data Exchange (ETDEWEB)

    Koga, R.; Crain, S.H.; Crawford, K.B.; Yu, P.; Gordon, M.J.

    1999-12-01

    Observations of single event burnout (SEB) in embedded field effect transistors are reported. Both SEB and other single event effects are presented for several pulse width modulation and high frequency devices. A microscope was employed to locate and investigate the damaged areas, and a model of the damage mechanism based on these observations is described.

  12. Neural Network Based Load Frequency Control for Restructuring ...

    African Journals Online (AJOL)

    Neural Network Based Load Frequency Control for Restructuring Power Industry. ... an artificial neural network (ANN) application of load frequency control (LFC) of a Multi-Area power system by using a neural network controller is presented.

  13. Seismology-based early identification of dam-formation landquake events.

    Science.gov (United States)

    Chao, Wei-An; Zhao, Li; Chen, Su-Chin; Wu, Yih-Min; Chen, Chi-Hsuan; Huang, Hsin-Hua

    2016-01-12

    Flooding resulting from the bursting of dams formed by landquake events such as rock avalanches, landslides and debris flows can lead to serious bank erosion and inundation of populated areas near rivers. Seismic waves can be generated by landquake events, which can be described as time-dependent forces (unloading/reloading cycles) acting on the Earth. In this study, we conduct inversions of long-period (LP, period ≥20 s) waveforms for the landquake force histories (LFHs) of ten events, which provide quantitative characterization of the initiation, propagation and termination stages of the slope failures. When the results obtained from LP waveforms are analyzed together with high-frequency (HF, 1-3 Hz) seismic signals, we find a relatively strong late-arriving seismic phase (dubbed the dam-forming phase or D-phase) recorded clearly in the HF waveforms at the closest stations, which potentially marks the time when the collapsed mass slides into the river and perhaps even impacts the topographic barrier on the opposite bank. Consequently, our approach to analyzing the LP and HF waveforms developed in this study has a high potential for identifying five dam-forming landquake events (DFLEs) in near real-time using broadband seismic records, which can provide timely warnings of the impending floods to downstream residents.

  14. Issues in Informal Education: Event-Based Science Communication Involving Planetaria and the Internet

    Science.gov (United States)

    Adams, M.; Gallagher, D. L.; Whitt, A.; Six, N. Frank (Technical Monitor)

    2002-01-01

    For the past four years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of science communication through web resources on the Internet. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases broadcasts accommodate active feedback and questions from Internet participants. We give examples here of events, problems, and lessons learned from these activities.

  15. A new frequency matching technique for FRF-based model updating

    Science.gov (United States)

    Yang, Xiuming; Guo, Xinglin; Ouyang, Huajiang; Li, Dongsheng

    2017-05-01

    Frequency Response Function (FRF) residues have been widely used to update finite element models. They are a form of original measurement information and have the advantages of rich data, no extraction errors, etc. However, like other sensitivity-based methods, an FRF-based identification method must also face the ill-conditioning problem, which is even more serious here since the sensitivity of the FRF in the vicinity of a resonance is much greater than elsewhere. Furthermore, for a given frequency measurement, directly using the theoretical FRF at that frequency may lead to a huge difference between the theoretical FRF and the corresponding experimental FRF, which in turn amplifies the effects of measurement errors and damping. Hence, in the solution process, correct selection of the appropriate frequency for evaluating the theoretical FRF in every iteration of the sensitivity-based approach is an effective way to improve the robustness of an FRF-based algorithm. A primary tool for frequency selection based on the correlation of FRFs is the Frequency Domain Assurance Criterion. This paper presents a new frequency selection method which directly finds the frequency that minimizes the difference in order of magnitude between the theoretical and experimental FRFs. A simulated truss structure is used to compare the performance of different frequency selection methods. For realism, it is assumed that not all degrees of freedom (DoFs) are available for measurement. The minimum number of DoFs required by each approach to correctly update the analytical model is regarded as the identification standard.
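The frequency selection idea described above can be sketched for a single-DOF system: near each measured frequency, pick the model frequency whose theoretical FRF magnitude is closest in order of magnitude to the experimental value. This is our own toy illustration under an assumed 1-DOF receptance; the paper's truss model and algorithm details are not reproduced.

```python
import math

# Toy sketch (assumption, not the paper's code): order-of-magnitude
# frequency matching between a theoretical and an experimental FRF.

def frf(omega, m=1.0, c=0.05, k=100.0):
    """Receptance FRF of a 1-DOF system: H(w) = 1 / (k - m*w^2 + i*c*w)."""
    return 1.0 / complex(k - m * omega**2, c * omega)

def match_frequency(omega_meas, h_exp, search_halfwidth=1.0, n_grid=201):
    """Scan a small band around the measured frequency and return the
    frequency minimizing |log10|H_model(w)| - log10|H_exp||."""
    best_w, best_err = omega_meas, float("inf")
    for i in range(n_grid):
        w = omega_meas - search_halfwidth + 2 * search_halfwidth * i / (n_grid - 1)
        err = abs(math.log10(abs(frf(w))) - math.log10(abs(h_exp)))
        if err < best_err:
            best_w, best_err = w, err
    return best_w

# If the experimental FRF matches the model exactly, the measured
# frequency itself is selected.
print(match_frequency(5.0, frf(5.0)))  # -> 5.0
```

In a real sensitivity-based update this selection would be repeated for every measured point in every iteration, with the model FRF recomputed from the current parameter estimate.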

  16. Towards the Realization of Graphene Based Flexible Radio Frequency Receiver

    Directory of Open Access Journals (Sweden)

    Maruthi N. Yogeesh

    2015-11-01

    We report on our progress and development of high speed flexible graphene field effect transistors (GFETs) with high electron and hole mobilities (~3000 cm2/V·s), and intrinsic transit frequency in the microwave GHz regime. We also describe the design and fabrication of a flexible graphene based radio frequency system. This RF communication system consists of a graphite patch antenna at 2.4 GHz, a graphene based frequency translation block (frequency doubler and AM demodulator) and a graphene speaker. The communication blocks are utilized to demonstrate a graphene based amplitude modulated (AM) radio receiver operating at 2.4 GHz.

  17. [Frequency and Type of Traumatic Events in Children and Adolescents with a Posttraumatic Stress Disorder].

    Science.gov (United States)

    Loos, Sabine; Wolf, Saskia; Tutus, Dunja; Goldbeck, Lutz

    2015-01-01

    The risk for children and adolescents to be exposed to a potentially traumatic event (PTE) is high. The present study examines the frequency of PTEs in children and adolescents with Posttraumatic Stress Disorder (PTSD), the type of index trauma, and its relation to PTSD symptom severity and gender. A clinical sample of 159 children and adolescents between 7-16 years was assessed using the Clinician-Administered PTSD Scale for Children and Adolescents (CAPS-CA). All reported PTEs from the checklist were analyzed according to frequency. The index events were categorized according to the following categories: cause (random vs. intentional), relation to offender (intrafamilial vs. extrafamilial), patient's role (victim, witness or vicarious traumatization), and type of PTE (physical or sexual violence). Relations between categories, PTSD symptom severity, and sex were analyzed with inferential statistics. On average participants reported five PTEs, most frequently physical violence without weapons (57.9%), loss of a loved one through death (45.9%), and sexual abuse/assaults (44%). The most frequent index traumata were intentional (76.7%). Regarding trauma type, there was a significant difference concerning higher symptom severity in children and adolescents who experienced sexual abuse/assault compared to physical violence (t(109)=-1.913, p=0.05). A significantly higher symptom severity was found for girls compared to boys for the trauma categories extrafamilial offender (z=-2.27, p=0.02), victim (z=-2.11, p=0.04), and sexual abuse/assault (z=-2.43, p=0.01). Clinical and diagnostic implications are discussed in relation to the amendments of the PTSD diagnostic criteria in DSM-5.

  18. Study of Updating Initiating Event Frequency using Prognostics

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeonmin; Lee, Sang-Hwan; Park, Jun-seok; Kim, Hyungdae; Chang, Yoon-Suk; Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of)

    2014-10-15

    The Probabilistic Safety Assessment (PSA) model makes it possible to find the relative priority of accident scenarios, weak points in accident prevention or mitigation, and insights for addressing those vulnerabilities. PSA therefore calls for realistic calculation to obtain precise and confident results. However, PSA models still contain 'conservative' aspects introduced during model development. One source of this conservatism is the assumptions of the safety analysis and the estimation of failure frequencies. Recently, Surveillance, Diagnosis, and Prognosis (SDP) has become a growing trend, applied in particular to space and aviation systems, and a study on the applicable areas and state-of-the-art status of SDP in the nuclear industry has been published. SDP, which utilizes massive databases and information technology, is worth highlighting for its capability to alleviate the conservatism in conventional PSA. This paper reviews the concept of integrating PSA and SDP and suggests a methodology for updating Initiating Event (IE) frequencies using prognostics. In particular, we focus on the IE of Steam Generator Tube Rupture (SGTR), considering tube degradation; this extends our previously suggested research. Prognostics algorithms in SDP are applied to IEs and basic events in the Level 1 PSA. As an example, updating of the SGTR IE under tube ageing was considered. Tube ageing was analyzed using PASTA and a Monte Carlo method; the conventional SGTR IE was then updated using a Bayesian approach. The studied method can help address the static nature and conservatism of PSA.
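The Bayesian update step mentioned in the abstract is commonly done with a gamma-Poisson conjugate pair. The sketch below shows that generic step only; the prior parameters and evidence counts are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of a Bayesian initiating-event frequency update
# (gamma prior, Poisson likelihood). Numbers are illustrative.

def update_ie_frequency(alpha_prior, beta_prior, n_events, exposure_years):
    """Gamma(alpha, beta) prior on the IE frequency (events/yr);
    Poisson likelihood for n_events observed over exposure_years.
    Returns the posterior (alpha, beta) and the posterior mean frequency."""
    alpha_post = alpha_prior + n_events
    beta_post = beta_prior + exposure_years
    return alpha_post, beta_post, alpha_post / beta_post

# Generic prior with mean 1e-3/yr, then 0 SGTR events in 500 reactor-years:
a, b, mean = update_ie_frequency(0.5, 500.0, 0, 500.0)
print(f"posterior mean = {mean:.2e}/yr")  # -> posterior mean = 5.00e-04/yr
```

A prognostics-informed variant would replace the raw event count with degradation evidence (e.g. a predicted rupture probability from a tube-ageing model) before forming the likelihood.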

  19. Study of Updating Initiating Event Frequency using Prognostics

    International Nuclear Information System (INIS)

    Kim, Hyeonmin; Lee, Sang-Hwan; Park, Jun-seok; Kim, Hyungdae; Chang, Yoon-Suk; Heo, Gyunyoung

    2014-01-01

    The Probabilistic Safety Assessment (PSA) model makes it possible to find the relative priority of accident scenarios, weak points in accident prevention or mitigation, and insights for addressing those vulnerabilities. PSA therefore calls for realistic calculation to obtain precise and confident results. However, PSA models still contain 'conservative' aspects introduced during model development. One source of this conservatism is the assumptions of the safety analysis and the estimation of failure frequencies. Recently, Surveillance, Diagnosis, and Prognosis (SDP) has become a growing trend, applied in particular to space and aviation systems, and a study on the applicable areas and state-of-the-art status of SDP in the nuclear industry has been published. SDP, which utilizes massive databases and information technology, is worth highlighting for its capability to alleviate the conservatism in conventional PSA. This paper reviews the concept of integrating PSA and SDP and suggests a methodology for updating Initiating Event (IE) frequencies using prognostics. In particular, we focus on the IE of Steam Generator Tube Rupture (SGTR), considering tube degradation; this extends our previously suggested research. Prognostics algorithms in SDP are applied to IEs and basic events in the Level 1 PSA. As an example, updating of the SGTR IE under tube ageing was considered. Tube ageing was analyzed using PASTA and a Monte Carlo method; the conventional SGTR IE was then updated using a Bayesian approach. The studied method can help address the static nature and conservatism of PSA

  20. Short-Term Effects of Changing Precipitation Patterns on Shrub-Steppe Grasslands: Seasonal Watering Is More Important than Frequency of Watering Events.

    Science.gov (United States)

    Densmore-McCulloch, Justine A; Thompson, Donald L; Fraser, Lauchlan H

    2016-01-01

    Climate change is expected to alter precipitation patterns. Droughts may become longer and more frequent, and the timing and intensity of precipitation may change. We tested how shifting precipitation patterns, both seasonally and by frequency of events, affects soil nitrogen availability, plant biomass and diversity in a shrub-steppe temperate grassland along a natural productivity gradient in Lac du Bois Grasslands Protected Area near Kamloops, British Columbia, Canada. We manipulated seasonal watering patterns by either exclusively watering in the spring or the fall. To simulate spring precipitation we restricted precipitation inputs in the fall, then added 50% more water than the long term average in the spring, and vice-versa for the fall precipitation treatment. Overall, the amount of precipitation remained roughly the same. We manipulated the frequency of rainfall events by either applying water weekly (frequent) or monthly (intensive). After 2 years, changes in the seasonality of watering had greater effects on plant biomass and diversity than changes in the frequency of watering. Fall watering reduced biomass and increased species diversity, while spring watering had little effect. The reduction in biomass in fall watered treatments was due to a decline in grasses, but not forbs. Plant available N, measured by Plant Root Simulator (PRS)-probes, increased from spring to summer to fall, and was higher in fall watered treatments compared to spring watered treatments when measured in the fall. The only effect observed due to frequency of watering events was greater extractable soil N in monthly applied treatments compared to weekly watering treatments. Understanding the effects of changing precipitation patterns on grasslands will allow improved grassland conservation and management in the face of global climatic change, and here we show that if precipitation is more abundant in the fall, compared to the spring, grassland primary productivity will likely be

  1. On frequency-weighted coprime factorization based controller reduction

    OpenAIRE

    Varga, Andras

    2003-01-01

    We consider the efficient solution of a class of coprime factorization based controller approximation problems by using frequency-weighted balancing related model reduction approaches. It is shown that for some special stability enforcing frequency-weights, the computation of the frequency-weighted controllability and observability grammians can be done by solving reduced order Lyapunov equations. The new approach can be used in conjunction with accuracy enhancing square-root and balancing-fr...

  2. Calculation of the n-th coincidences frequency

    International Nuclear Information System (INIS)

    Mercier, C.

    1959-01-01

    Events can occur randomly at a given frequency, each lasting a time Θ, during which other events can occur. A coincidence beginning of order n at time t occurs when an event arrives while n other events have already occurred between t-Θ and t. In this work, the frequency of coincidence beginnings of order greater than or equal to n is established
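Under a Poisson assumption (our reading of the setup, not necessarily the paper's derivation), the rate of coincidence beginnings of order ≥ n is the event rate times the probability that at least n events fell in the preceding window Θ:

```python
import math

# Sketch of the Poisson formulation implied above (an assumption on our
# part): events arrive at rate lam; a coincidence beginning of order >= n
# occurs when an event arrives with at least n events in the prior window.

def coincidence_frequency(lam, theta, n):
    """Rate of coincidence beginnings of order >= n:
    lam * P[Poisson(lam*theta) >= n]."""
    mu = lam * theta
    p_lt_n = sum(math.exp(-mu) * mu**k / math.factorial(k) for k in range(n))
    return lam * (1.0 - p_lt_n)

# Example: 100 events/s, a 10 ms window, order >= 2 (mu = 1).
print(coincidence_frequency(100.0, 0.01, 2))
```

For n = 0 the formula reduces to the raw event rate, and it decreases monotonically as the required order grows.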

  3. Event-based Simulation Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    De Raedt, H.; Michielsen, K.; Jaeger, G; Khrennikov, A; Schlosshauer, M; Weihs, G

    2011-01-01

    We present a corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one. The event-based corpuscular model gives a unified

  4. Frequency of Testing for Dyslipidemia: An Evidence-Based Analysis

    Science.gov (United States)

    2014-01-01

    Background Dyslipidemias include high levels of total cholesterol, low-density lipoprotein (LDL) cholesterol, and triglycerides and low levels of high-density lipoprotein (HDL) cholesterol. Dyslipidemia is a risk factor for cardiovascular disease, which is a major contributor to mortality in Canada. Approximately 23% of the 2009/11 Canadian Health Measures Survey (CHMS) participants had a high level of LDL cholesterol, with prevalence increasing with age, and approximately 15% had a total cholesterol to HDL ratio above the threshold. Objectives To evaluate the frequency of lipid testing in adults not diagnosed with dyslipidemia and in adults on treatment for dyslipidemia. Research Methods A systematic review of the literature set out to identify randomized controlled trials (RCTs), systematic reviews, health technology assessments (HTAs), and observational studies published between January 1, 2000, and November 29, 2012, that evaluated the frequency of testing for dyslipidemia in the 2 populations. Results Two observational studies assessed the frequency of lipid testing, 1 in individuals not on lipid-lowering medications and 1 in treated individuals. Both studies were based on previously collected data intended for a different objective and, therefore, no conclusions could be reached about the frequency of testing at intervals other than the ones used in the original studies. Given this limitation and generalizability issues, the quality of evidence was considered very low. No evidence for the frequency of lipid testing was identified in the 2 HTAs included. Canadian and international guidelines recommend testing for dyslipidemia in individuals at an increased risk for cardiovascular disease. The frequency of testing recommended is based on expert consensus. Conclusions Conclusions on the frequency of lipid testing could not be made based on the 2 observational studies. Current guidelines recommend lipid testing in adults with increased cardiovascular risk, with

  5. Event-Based Corpuscular Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    Michielsen, K.; Jin, F.; Raedt, H. De

    A corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one is presented. The event-based corpuscular model is shown to give a

  6. High-resolution mid-IR spectrometer based on frequency upconversion

    DEFF Research Database (Denmark)

    Hu, Qi; Dam, Jeppe Seidelin; Pedersen, Christian

    2012-01-01

    We demonstrate a novel approach for high-resolution spectroscopy based on frequency upconversion and postfiltering by means of a scanning Fabry-Perot interferometer. The system is based on sum-frequency mixing, shifting the spectral content from the mid-infrared to the near-visible region al...-frequency 1064 nm laser. We investigate water vapor emission lines from a butane burner and compare the measured results to model data. We suggest the presented method be used for real-time monitoring of specific gas lines and reference signals.

  7. Developing future precipitation events from historic events: An Amsterdam case study.

    Science.gov (United States)

    Manola, Iris; van den Hurk, Bart; de Moel, Hans; Aerts, Jeroen

    2016-04-01

    Due to climate change, the frequency and intensity of extreme precipitation events is expected to increase. It is therefore of high importance to develop climate change scenarios tailored towards the local and regional needs of policy makers in order to develop efficient adaptation strategies to reduce the risks from extreme weather events. Current approaches to tailoring climate scenarios are often not well adopted in hazard management, since average changes in climate are not a main concern to policy makers, and tailoring climate scenarios to simulate future extremes can be complex. Therefore, a new concept has been introduced recently that uses known historic extreme events as a basis, and modifies the observed data for these events so that the outcome shows how the same event would occur in a warmer climate. This concept is introduced as 'Future Weather', and appeals to the experience of stakeholders and users. This research presents a novel method of projecting a future extreme precipitation event, based on a historic event. The selected precipitation event took place over the broader area of Amsterdam, the Netherlands in the summer of 2014, and resulted in blocked highways, disruption of air transportation, and flooded buildings and public facilities. An analysis of rain monitoring stations showed that an event of such intensity has a 5- to 15-year return period. The method of projecting a future event follows a non-linear delta transformation that is applied directly to the observed event, assuming a warmer climate, to produce an "up-scaled" future precipitation event. The delta transformation is based on the observed behaviour of the precipitation intensity as a function of the dew point temperature during summers. The outcome is then compared to a benchmark method using the HARMONIE numerical weather prediction model, where the boundary conditions of the event from the Ensemble Prediction System of ECMWF (ENS) are perturbed to indicate a warmer climate. The two
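The core of such a delta transformation can be sketched with a (super-)Clausius-Clapeyron scaling of intensity with dew point temperature. This is a minimal sketch under our own assumptions; the 7 %/K and 14 %/K rates are common literature values, not figures from this study.

```python
# Minimal sketch of an "up-scaled" precipitation event, assuming a
# (super-)Clausius-Clapeyron scaling with dew point temperature.
# The scaling rates are generic assumptions, not the study's values.

def scale_precipitation(p_obs_mm_h, delta_td_k, rate_per_k=0.07):
    """Scale an observed intensity to a warmer climate with a dew point
    increase of delta_td_k kelvin, at a fractional rate per kelvin."""
    return p_obs_mm_h * (1.0 + rate_per_k) ** delta_td_k

# Observed 30 mm/h peak, +2 K dew point, super-CC (14 %/K) for showers:
print(round(scale_precipitation(30.0, 2.0, rate_per_k=0.14), 1))  # -> 39.0
```

A full transformation would apply such a factor non-linearly across the observed radar field, with the rate itself depending on intensity percentile, which is where the "non-linear" in the abstract comes in.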

  8. Research on Visual Analysis Methods of Terrorism Events

    Science.gov (United States)

    Guo, Wenyue; Liu, Haiyan; Yu, Anzhu; Li, Jing

    2016-06-01

    With terrorism events occurring more and more frequently throughout the world, improving the response capability to social security incidents has become an important test of governments' ability to govern. Visual analysis has become an important method of event analysis because it is intuitive and effective. To analyse the spatio-temporal distribution characteristics of events, the correlations among event items, and development trends, the spatio-temporal characteristics of terrorism events are discussed. A suitable event data table structure based on the "5W" theory is designed. Then, six types of visual analysis are proposed, and the use of thematic maps and statistical charts to realize visual analysis of terrorism events is studied. Finally, experiments were carried out using data provided by the Global Terrorism Database, and the results prove the feasibility of the methods.

  9. Prestress Force Identification for Externally Prestressed Concrete Beam Based on Frequency Equation and Measured Frequencies

    Directory of Open Access Journals (Sweden)

    Luning Shi

    2014-01-01

    A prestress force identification method for externally prestressed concrete uniform beams, based on the frequency equation and measured frequencies, is developed. To ensure identification accuracy, we first look for an appropriate method to solve the free vibration equation of an externally prestressed concrete beam and then combine the measured frequencies with the frequency equation to identify the prestress force. To obtain the exact solution of the free vibration equation of a multispan externally prestressed concrete beam, an analytical model is set up based on Bernoulli-Euler beam theory, and the functional relation between prestress variation and vibration displacement is built. The multispan beam is treated as multiple single-span beams that must satisfy the bending moment and rotation angle boundary conditions; the free vibration equation is solved using a sublevel simultaneous method, and a semi-analytical solution that considers the influence of prestress on section rigidity and beam length is obtained. Taking a simply supported concrete beam and a two-span concrete beam with external tendons as examples, frequency function curves are obtained by substituting the measured frequencies, and the prestress force is identified from the abscissa of the cross-point of the frequency functions. The identified prestress force is in good agreement with the test results. The method can accurately identify the prestress force of externally prestressed concrete beams and trace the trend of the effective prestress force.

  10. Radar network communication through sensing of frequency hopping

    Science.gov (United States)

    Dowla, Farid; Nekoogar, Faranak

    2013-05-28

    In one embodiment, a radar communication system includes a plurality of radars having a communication range and being capable of operating at a sensing frequency and a reporting frequency, wherein the reporting frequency is different than the sensing frequency, each radar is adapted for operating at the sensing frequency until an event is detected, each radar in the plurality of radars has an identification/location frequency for reporting information different from the sensing frequency, a first radar of the radars which senses the event sends a reporting frequency corresponding to its identification/location frequency when the event is detected, and all other radars in the plurality of radars switch their reporting frequencies to match the reporting frequency of the first radar upon detecting the reporting frequency switch of a radar within the communication range. In another embodiment, a method is presented for communicating information in a radar system.
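The coordination scheme in the claim can be sketched as a toy simulation: every radar senses until an event is detected; the first detector reports on its own identification/location frequency, and all peers within communication range switch their reporting frequency to match. This is our reading of the claim as illustrative code, not the patent's reference implementation; all class and attribute names are our own.

```python
# Toy sketch (assumption, not the patent's implementation) of the
# reporting-frequency coordination described in the claim.

class Radar:
    def __init__(self, radar_id, id_frequency_mhz):
        self.radar_id = radar_id
        self.id_frequency = id_frequency_mhz   # unique reporting frequency
        self.reporting_frequency = None        # None while purely sensing

    def detect_event(self, network):
        """First detector reports on its own ID frequency; all peers in
        range switch their reporting frequency to match it."""
        self.reporting_frequency = self.id_frequency
        for peer in network:
            if peer is not self:
                peer.reporting_frequency = self.id_frequency

net = [Radar("A", 915.0), Radar("B", 920.0), Radar("C", 925.0)]
net[1].detect_event(net)                      # radar B senses the event first
print([r.reporting_frequency for r in net])   # -> [920.0, 920.0, 920.0]
```

Because the reporting frequency doubles as the detector's identifier, listening radars learn which unit detected the event simply by observing which frequency the network converged on.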

  11. Circumvention of noise contributions in fiber laser based frequency combs.

    Science.gov (United States)

    Benkler, Erik; Telle, Harald; Zach, Armin; Tauser, Florian

    2005-07-25

    We investigate the performance of an Er:fiber laser based femtosecond frequency comb for precision metrological applications. Instead of an active stabilization of the comb, the fluctuations of the carrier-envelope offset phase, the repetition phase, and the phase of the beat from a comb line with an optical reference are synchronously detected. We show that these fluctuations can be effectively eliminated by exploiting their known correlation. In our experimental scheme, we utilize two identically constructed frequency combs for the measurement of the fluctuations, rejecting the influence of a shared optical reference. From measuring a white frequency noise level, we demonstrate that a fractional frequency instability better than 1.4 x 10^-14 for 1 s averaging time can be achieved in frequency metrology applications using the Er:fiber based frequency comb.

  12. Volcano!: An Event-Based Science Module. Student Edition. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  13. Gait Event Detection in Real-World Environment for Long-Term Applications: Incorporating Domain Knowledge Into Time-Frequency Analysis.

    Science.gov (United States)

    Khandelwal, Siddhartha; Wickstrom, Nicholas

    2016-12-01

    Detecting gait events is the key to many gait analysis applications that would benefit from continuous monitoring or long-term analysis. Most gait event detection algorithms using wearable sensors that offer a potential for use in daily living have been developed from data collected in controlled indoor experiments. However, for real-world applications, it is essential that the analysis is carried out in humans' natural environment, which involves different gait speeds, changing walking terrains, varying surface inclinations and regular turns, among other factors. Existing domain knowledge in the form of principles or underlying fundamental gait relationships can be utilized to drive and support the data analysis in order to develop robust algorithms that can tackle real-world challenges in gait analysis. This paper presents a novel approach that exhibits how domain knowledge about human gait can be incorporated into time-frequency analysis to detect gait events from long-term accelerometer signals. The accuracy and robustness of the proposed algorithm are validated by experiments done in indoor and outdoor environments with approximately 93 600 gait events in total. The proposed algorithm exhibits consistently high performance scores across all datasets in both indoor and outdoor environments.

  14. Transportation planning for planned special events

    Science.gov (United States)

    2011-05-01

    Unique among planned special event activities are those events that carry the National Special Security Event (NSSE) designation. NSSEs occur with some frequency, with 35 of these events held between September 1998 and February 2010. These events inc...

  15. Flextensional fiber Bragg grating-based accelerometer for low frequency vibration measurement

    Institute of Scientific and Technical Information of China (English)

    Jinghua Zhang; Xueguang Qiao; Manli Hu; Zhongyao Feng; Hong Gao; Yang Yang; Rui Zhou

    2011-01-01

    The intelligent structural health monitoring method, which uses a fiber Bragg grating (FBG) sensor, is a new approach in the field of civil engineering. However, it lacks a reliable FBG-based accelerometer for taking structural low-frequency vibration measurements. In this letter, a flextensional FBG-based accelerometer is proposed and demonstrated. The experimental results indicate that the natural frequency of the developed accelerometer is 16.7 Hz, with a high sensitivity of 410.7 pm/g. In addition, it has a broad and flat response over low frequencies ranging from 1 to 10 Hz. The natural frequency and sensitivity of the accelerometer can be tuned by adding mass to tailor the sensor performance to specific applications. Experimental results are presented to demonstrate the good performance of the proposed FBG-based accelerometer and show that it is satisfactory for low-frequency vibration measurements.
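Within the flat response band (1-10 Hz), the reported sensitivity of 410.7 pm/g converts a measured Bragg wavelength shift directly to acceleration. A back-of-envelope sketch (variable names are ours, not from the letter):

```python
# Back-of-envelope conversion using the sensitivity reported above
# (410.7 pm/g), valid within the accelerometer's flat 1-10 Hz band.

SENSITIVITY_PM_PER_G = 410.7

def acceleration_g(delta_lambda_pm):
    """Convert a Bragg wavelength shift (picometres) to acceleration (g)."""
    return delta_lambda_pm / SENSITIVITY_PM_PER_G

print(round(acceleration_g(205.35), 3))  # a 205.35 pm shift -> 0.5
```

Near the 16.7 Hz natural frequency this linear conversion would no longer hold, which is why the letter restricts the usable range to the flat band.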

  16. Multiple daytime nucleation events in semi-clean savannah and industrial environments in South Africa: analysis based on observations

    Directory of Open Access Journals (Sweden)

    A. Hirsikko

    2013-06-01

    Full Text Available Recent studies have shown very high frequencies of atmospheric new particle formation in different environments in South Africa. Our aim here was to investigate the causes for two or three consecutive daytime nucleation events, followed by subsequent particle growth during the same day. We analysed 108 and 31 such days observed in a polluted industrial environment and a moderately polluted rural environment, respectively, in South Africa. The analysis was based on two years of measurements at each site. After rejecting the days having notable changes in the air mass origin or local wind direction, i.e. the two major reasons for observed multiple nucleation events, we were able to investigate other factors causing this phenomenon. Clouds were present during, or in between, most of the analysed multiple particle formation events. Therefore, some of these events may have been single events, interrupted somehow by the presence of clouds. From further analysis, we propose that the first nucleation and growth event of the day was often associated with the mixing of a residual air layer rich in SO2 (oxidized to sulphuric acid) into the shallow surface-coupled layer. The second nucleation and growth event of the day usually started before midday and was sometimes associated with renewed SO2 emissions of industrial origin. However, it was also evident that vapours other than sulphuric acid were required for the particle growth during both events. This was especially the case when two simultaneously growing particle modes were observed. Based on our analysis, we conclude that the relative contributions of estimated H2SO4 and other vapours to the first and second nucleation and growth events of the day varied from day to day, depending on anthropogenic and natural emissions, as well as atmospheric conditions.

  17. Automatic Detection and Classification of Audio Events for Road Surveillance Applications

    Directory of Open Access Journals (Sweden)

    Noor Almaadeed

    2018-06-01

    Full Text Available This work investigates the problem of detecting hazardous events on roads by designing an audio surveillance system that automatically detects perilous situations such as car crashes and tire skidding. In recent years, research has shown several visual surveillance systems that have been proposed for road monitoring to detect accidents with an aim to improve safety procedures in emergency cases. However, visual information alone cannot detect certain events such as car crashes and tire skidding, especially under adverse and visually cluttered weather conditions such as snowfall, rain, and fog. Consequently, the incorporation of microphones and audio event detectors based on audio processing can significantly enhance the detection accuracy of such surveillance systems. This paper proposes to combine time-domain, frequency-domain, and joint time-frequency features extracted from a class of quadratic time-frequency distributions (QTFDs to detect events on roads through audio analysis and processing. Experiments were carried out using a publicly available dataset. The experimental results confirm the effectiveness of the proposed approach for detecting hazardous events on roads, as demonstrated by a 7% improvement in accuracy compared with methods that use individual temporal and spectral features.
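The paper's joint features come from quadratic time-frequency distributions, but the basic idea of pairing time-domain with frequency-domain descriptors for an audio frame can be illustrated with a much simpler sketch. The feature names and the naive DFT below are illustrative assumptions, not the paper's actual feature set:

```python
import math

def dft_magnitudes(signal):
    """Naive O(N^2) DFT magnitude spectrum (first half of the bins)."""
    n = len(signal)
    mags = []
    for k in range(n // 2):
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mags.append(math.hypot(re, im))
    return mags

def frame_features(signal, fs):
    """Simple time- and frequency-domain features for one audio frame:
    mean energy, zero-crossing rate, and spectral centroid in Hz."""
    energy = sum(s * s for s in signal) / len(signal)
    zcr = sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0) / len(signal)
    mags = dft_magnitudes(signal)
    total = sum(mags) or 1.0
    centroid = sum(k * m for k, m in enumerate(mags)) / total * fs / len(signal)
    return {"energy": energy, "zcr": zcr, "spectral_centroid_hz": centroid}

# A 500 Hz tone sampled at 8 kHz should have its centroid near 500 Hz.
fs = 8000
tone = [math.sin(2 * math.pi * 500 * t / fs) for t in range(256)]
feats = frame_features(tone, fs)
```

A real detector would compute such features per frame, then feed them (alongside QTFD-derived features) to a classifier trained on crash and skid examples.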

  18. AN EMPIRICAL ANALYSIS OF THE INFLUENCE OF RISK FACTORS ON THE FREQUENCY AND IMPACT OF SEVERE EVENTS ON THE SUPPLY CHAIN IN THE CZECH REPUBLIC

    Directory of Open Access Journals (Sweden)

    José María Caridad

    2014-12-01

    Full Text Available Purpose: This paper analyses and evaluates severe events according to their frequency of occurrence and their impact on the performance of manufacturing and distribution supply chains in the Czech Republic. Risk factors are introduced for critical events. Design/methodology: Severe events are identified and classified on the basis of median mapping and mapping of ordinal variability acquired through a questionnaire survey of 82 companies. The 46 risk factors analysed were sorted into 5 groups. We used asymmetric Somers' d statistics to test the dependence of the frequency and impact of a severe event on selected risk sources. A hierarchical cluster analysis was performed to identify relatively homogeneous groups of critical severe events according to their dependency on risk factors and its strength. Findings: Results showed that ‘a lack of contracts’ is considered to be the most critical severe event. The demand-side, supply-side, and external risk factor groups were identified as the most significant sources of risk. The worst cluster encompasses 11% of the examined risk factors, which should be prevented. We concluded that organizations need to adopt appropriate precautions and risk management methods in logistics. Originality: In this paper, a methodology for evaluating severe events in the supply chain is designed. This methodology involves assessing the critical factors which influence the critical events and which should be prevented.
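Asymmetric Somers' d, used above to test whether the impact of a severe event depends on a risk source, counts concordant and discordant pairs and excludes pairs tied on the independent variable from the denominator. A minimal sketch with toy ordinal data (the data values are invented for illustration):

```python
from itertools import combinations

def somers_d(x, y):
    """Asymmetric Somers' d with y treated as the dependent variable:
    (concordant - discordant) / (number of pairs not tied on x)."""
    concordant = discordant = tied_y_only = 0
    for (xi, yi), (xj, yj) in combinations(zip(x, y), 2):
        if xi == xj:
            continue                      # pairs tied on x are excluded
        if yi == yj:
            tied_y_only += 1
        elif (xi - xj) * (yi - yj) > 0:
            concordant += 1
        else:
            discordant += 1
    denom = concordant + discordant + tied_y_only
    return (concordant - discordant) / denom if denom else 0.0

# Toy ordinal ratings: frequency of a severe event (x) vs. its impact (y).
freq   = [1, 1, 2, 2, 3, 3, 4]
impact = [1, 2, 2, 3, 3, 3, 4]
d = somers_d(freq, impact)               # lies between -1 and 1
```

Perfectly concordant rankings give d = 1 and perfectly discordant rankings give d = -1; values near zero indicate that the impact ordering tells little about the frequency ordering.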

  19. Bivariate frequency analysis of rainfall intensity and duration for urban stormwater infrastructure design

    Science.gov (United States)

    Jun, Changhyun; Qin, Xiaosheng; Gan, Thian Yew; Tung, Yeou-Koung; De Michele, Carlo

    2017-10-01

    This study presents a storm-event based bivariate frequency analysis approach to determine design rainfalls, in which the number, intensity and duration of actual rainstorm events are considered. To derive more realistic design storms, the occurrence probability of an individual rainstorm event was determined from the joint distribution of storm intensity and duration through a copula model. Hourly rainfall data were used at three climate stations located in Singapore, South Korea and Canada, respectively. It was found that the proposed approach gives a more realistic description of the rainfall characteristics of rainstorm events and of design rainfalls. As a result, the design rainfall quantities derived from actual rainstorm events at the three studied sites are consistently lower than those obtained from the conventional rainfall depth-duration-frequency (DDF) method, especially for short-duration storms (such as 1-h). This stems from the occurrence probability assigned to each rainstorm event and from the different angle taken on rainfall frequency analysis; the approach could offer an alternative way of describing extreme rainfall properties and potentially help improve the hydrologic design of stormwater management facilities in urban areas.
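The abstract does not name the copula family, so as an illustrative sketch the Gumbel-Hougaard copula (a common choice for positively dependent hydrological extremes) is used below to turn marginal quantiles of storm intensity and duration into a joint "AND" return period, i.e. the return period of a storm exceeding both quantiles. The parameter values and event rate are assumptions:

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel-Hougaard copula C(u, v); theta >= 1 controls dependence
    (theta = 1 reduces to independence, C = u * v)."""
    return math.exp(-(((-math.log(u)) ** theta
                       + (-math.log(v)) ** theta) ** (1.0 / theta)))

def joint_return_period(u, v, theta, events_per_year):
    """Return period (years) of a storm exceeding BOTH the intensity
    quantile u and the duration quantile v (the 'AND' case):
    P(I > i, D > d) = 1 - u - v + C(u, v)."""
    p_exceed = 1.0 - u - v + gumbel_copula(u, v, theta)
    return 1.0 / (events_per_year * p_exceed)

# Example: 0.99 quantiles of intensity and duration, moderate dependence,
# and an assumed ~40 independent rainstorm events per year.
T = joint_return_period(0.99, 0.99, theta=2.0, events_per_year=40)
```

Because positive dependence raises C(u, v) above u·v, the joint exceedance is more likely than under independence, which is exactly why event-based joint analysis yields different design quantities than the univariate DDF method.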

  20. Volcano!: An Event-Based Science Module. Teacher's Guide. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research,…

  1. Hail frequency estimation across Europe based on a combination of overshooting top detections and the ERA-INTERIM reanalysis

    Science.gov (United States)

    Punge, H. J.; Bedka, K. M.; Kunz, M.; Reinbold, A.

    2017-12-01

    This article presents a hail frequency estimation based on the detection of cold overshooting cloud tops (OTs) from the Meteosat Second Generation (MSG) operational weather satellites, in combination with a hail-specific filter derived from the ERA-INTERIM reanalysis. This filter has been designed based on the atmospheric properties in the vicinity of hail reports registered in the European Severe Weather Database (ESWD). These include Convective Available Potential Energy (CAPE), 0-6-km bulk wind shear and freezing level height, evaluated at the nearest time step and interpolated from the reanalysis grid to the location of the hail report. Regions highly exposed to hail events include Northern Italy, followed by South-Eastern Austria and Eastern Spain. Pronounced hail frequency is also found in large parts of Eastern Europe, around the Alps, the Czech Republic, Southern Germany, Southern and Eastern France, and in the Iberian and Apennine mountain ranges.

  2. Spatial-Temporal Event Detection from Geo-Tagged Tweets

    Directory of Open Access Journals (Sweden)

    Yuqian Huang

    2018-04-01

    Full Text Available As one of the most popular social networking services in the world, Twitter allows users to post messages along with their current geographic locations. Such georeferenced or geo-tagged Twitter datasets can benefit location-based services, targeted advertising and geosocial studies. Our study focused on the detection of small-scale spatial-temporal events and their textual content. First, we used Spatial-Temporal Density-Based Spatial Clustering of Applications with Noise (ST-DBSCAN to spatially-temporally cluster the tweets. Then, the word frequencies were summarized for each cluster and the potential topics were modeled by the Latent Dirichlet Allocation (LDA algorithm. Using two years of Twitter data from four college cities in the U.S., we were able to determine the spatial-temporal patterns of two known events, two unknown events and one recurring event, which were then further explored and modeled to identify the semantic content about the events. This paper presents our process and recommendations for both finding event-related tweets and understanding the spatial-temporal behaviors and semantic natures of the detected events.
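ST-DBSCAN extends DBSCAN by requiring neighbors to be close in both space and time. A minimal, dependency-free re-implementation sketch of that core idea (the thresholds and toy points are assumptions, and the full algorithm in the paper has additional refinements):

```python
def st_dbscan(points, eps_space, eps_time, min_pts):
    """Minimal ST-DBSCAN: a point is a neighbor if it lies within BOTH
    the spatial radius eps_space and the temporal window eps_time.
    points: list of (x, y, t). Returns labels; -1 marks noise."""
    def neighbors(i):
        xi, yi, ti = points[i]
        return [j for j, (x, y, t) in enumerate(points)
                if ((x - xi) ** 2 + (y - yi) ** 2) ** 0.5 <= eps_space
                and abs(t - ti) <= eps_time]

    labels = [None] * len(points)
    cluster = -1
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1                 # noise (may be claimed later)
            continue
        cluster += 1
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster        # former noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            more = neighbors(j)
            if len(more) >= min_pts:       # j is a core point: keep expanding
                queue.extend(more)
    return labels

# Two dense spatio-temporal clusters plus one isolated (noise) point.
points = [(0, 0, 0), (0.1, 0, 1), (0, 0.1, 2),
          (10, 10, 0), (10.1, 10, 1), (5, 5, 100)]
labels = st_dbscan(points, eps_space=0.5, eps_time=5, min_pts=2)
```

In the study's pipeline, the tweets in each resulting cluster would then be pooled and passed to LDA for topic modeling.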

  3. g-PRIME: A Free, Windows Based Data Acquisition and Event Analysis Software Package for Physiology in Classrooms and Research Labs.

    Science.gov (United States)

    Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R

    2009-01-01

    We present g-PRIME, a software-based tool for physiology data acquisition, analysis, and stimulus generation in education and research. This software was developed in an undergraduate neurophysiology course and strongly influenced by instructor and student feedback. g-PRIME is a free, stand-alone Windows application coded and "compiled" in Matlab (it does not require a Matlab license). g-PRIME supports many data acquisition interfaces, from the PC sound card to expensive, high-throughput calibrated equipment. The program is designed as a software oscilloscope with standard trigger modes, multi-channel visualization controls, and data logging features. Extensive analysis options allow real-time and offline filtering of signals, multi-parameter threshold-and-window based event detection, and two-dimensional display of a variety of parameters including event time, energy density, maximum FFT frequency component, max/min amplitudes, and inter-event rate and intervals. The software also correlates detected events with another simultaneously acquired source (event-triggered average) in real time or offline. g-PRIME supports parameter histogram production and a variety of elegant publication-quality graphics outputs. A major goal of this software is to merge powerful engineering acquisition and analysis tools with a biological approach to studies of nervous system function.
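The abstract does not specify g-PRIME's detector internals, but a generic threshold-and-window event detector of the kind described (accept a putative event only if its peak falls within an amplitude window, then report inter-event intervals) can be sketched as follows; the function names and acceptance rule are assumptions:

```python
def detect_events(signal, threshold, window):
    """Threshold-and-window event detection: an event starts where the
    signal crosses `threshold` upward and ends where it falls back
    below; it is accepted only if its peak stays at or below `window`
    (an upper amplitude bound). Returns (start_index, peak) pairs.
    An event still open at the end of the trace is discarded."""
    events = []
    inside = False
    for i, s in enumerate(signal):
        if not inside and s >= threshold:
            inside, start, peak = True, i, s
        elif inside:
            peak = max(peak, s)
            if s < threshold:              # event just ended
                if peak <= window:
                    events.append((start, peak))
                inside = False
    return events

def inter_event_intervals(events):
    """Intervals (in samples) between starts of consecutive events."""
    starts = [i for i, _ in events]
    return [b - a for a, b in zip(starts, starts[1:])]

# Two small spikes are accepted; the large artifact (amplitude 5) is
# rejected by the amplitude window.
sig = [0, 0, 1, 0, 0, 0, 2, 0, 0, 5, 0]
evts = detect_events(sig, threshold=0.5, window=3)
```

The same event list can then drive an event-triggered average: slice a second channel around each accepted start index and average the slices.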

  4. Method for Assessing Grid Frequency Deviation Due to Wind Power Fluctuation Based on “Time-Frequency Transformation”

    DEFF Research Database (Denmark)

    Jin, Lin; Yuan-zhang, Sun; Sørensen, Poul Ejnar

    2012-01-01

    published studies are based entirely on deterministic methodology. This paper presents a novel assessment method based on Time-Frequency Transformation to overcome shortcomings of existing methods. The main contribution of the paper is to propose a stochastic process simulation model which is a better alternative to the existing dynamic frequency deviation simulation model. In this way, the method takes the stochastic wind power fluctuation into full account so as to give a quantitative risk assessment of grid frequency deviation to grid operators, even without using any dynamic simulation tool. The case...

  5. An Event-Based Approach to Distributed Diagnosis of Continuous Systems

    Science.gov (United States)

    Daigle, Matthew; Roychoudhurry, Indranil; Biswas, Gautam; Koutsoukos, Xenofon

    2010-01-01

    Distributed fault diagnosis solutions are becoming necessary due to the complexity of modern engineering systems, and the advent of smart sensors and computing elements. This paper presents a novel event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, based on a qualitative abstraction of measurement deviations from the nominal behavior. We systematically derive dynamic fault signatures expressed as event-based fault models. We develop a distributed diagnoser design algorithm that uses these models for designing local event-based diagnosers based on global diagnosability analysis. The local diagnosers each generate globally correct diagnosis results locally, without a centralized coordinator, and by communicating a minimal number of measurements between themselves. The proposed approach is applied to a multi-tank system, and results demonstrate a marked improvement in scalability compared to a centralized approach.

  6. Estimation of core-damage frequency to evolutionary ALWR [advanced light water reactor] due to seismic initiating events: Task 4.3.3

    International Nuclear Information System (INIS)

    Brooks, R.D.; Harrison, D.G.; Summitt, R.L.

    1990-04-01

    The Electric Power Research Institute (EPRI) is presently developing a requirements document for the design of advanced light water reactors (ALWRs). One of the basic goals of the EPRI ALWR Requirements Document is that the core-damage frequency for an ALWR shall be less than 1.0E-5. To aid in this effort, the Department of Energy's Advanced Reactor Severe Accident Program (ARSAP) initiated a functional probabilistic risk assessment (PRA) to determine how effectively the evolutionary plant requirements contained in the existing EPRI Requirements Document assure that this safety goal will be met. This report develops an approximation of the core-damage frequency due to seismic events for both evolutionary plant designs (pressurized-water reactor (PWR) and boiling-water reactor (BWR)) as modeled in the corresponding functional PRAs. Component fragility values were taken directly from information which has been submitted for inclusion in Appendix A to Volume 1 of the EPRI Requirements Document. The results show a seismic core-damage frequency of 5.2E-6 for PWRs and 5.0E-6 for BWRs. Combined with the internal initiators from the functional PRAs, the overall core-damage frequencies are 6.0E-6 for the PWR and BWR, both of which satisfy the 1.0E-5 EPRI goal. In addition, site-specific considerations, such as more rigid components and less conservative fragility data and seismic hazard curves, may further reduce these frequencies. The effects of seismic events on structures are not addressed in this generic evaluation and should be addressed separately on a design-specific basis. 7 refs., 6 figs., 3 tabs

  7. Radio frequency picosecond phototube

    International Nuclear Information System (INIS)

    Margaryan, A.; Carlini, R.; Ent, R.; Grigoryan, N.; Gyunashyan, K.; Hashimoto, O.; Hovater, K.; Ispiryan, M.; Knyazyan, S.; Kross, B.; Majewski, S.; Marikyan, G.; Mkrtchyan, M.; Parlakyan, L.; Popov, V.; Tang, L.; Vardanyan, H.; Yan, C.; Zhamkochyan, S.; Zorn, C.

    2006-01-01

    We propose a photon detector for recording low-level and ultra-fast optical signals, based on radio frequency (RF) analysis of low-energy photoelectrons (PEs). By using a currently developed 500 MHz RF deflector, it is possible to scan circularly and detect single PEs, amplified in multi-channel plates (MCPs). The operation of the tube is investigated by means of a thermionic electron source. It is demonstrated that the signals generated in the MCP can be processed event by event using available nanosecond electronics, and that a time resolution better than 20 ps can be achieved. The timing characteristics of a Cherenkov detector with the RF phototube in a 'head-on' geometry are investigated by means of Monte Carlo simulation

  8. Radio frequency picosecond phototube

    Energy Technology Data Exchange (ETDEWEB)

    Margaryan, A. [Yerevan Physics Institute, 2 Alikhanian Brothers Street, Yerevan 375036 (Armenia)]. E-mail: mat@mail.yerphi.am; Carlini, R. [Thomas Jefferson National Accelerator Facility, Newport News VA 23606 (United States); Ent, R. [Thomas Jefferson National Accelerator Facility, Newport News VA 23606 (United States); Grigoryan, N. [Yerevan Physics Institute, 2 Alikhanian Brothers Street, Yerevan 375036 (Armenia); Gyunashyan, K. [Yerevan State University of Architecture and Construction, Yerevan (Armenia); Hashimoto, O. [Tohoku University, Sendai 98-77 (Japan); Hovater, K. [Thomas Jefferson National Accelerator Facility, Newport News VA 23606 (United States); Ispiryan, M. [University of Houston, 4800 Calhoun Rd, Houston TX 77204 (United States); Knyazyan, S. [Yerevan Physics Institute, 2 Alikhanian Brothers Street, Yerevan 375036 (Armenia); Kross, B. [Thomas Jefferson National Accelerator Facility, Newport News VA 23606 (United States); Majewski, S. [Thomas Jefferson National Accelerator Facility, Newport News VA 23606 (United States); Marikyan, G. [Yerevan Physics Institute, 2 Alikhanian Brothers Street, Yerevan 375036 (Armenia); Mkrtchyan, M. [Yerevan Physics Institute, 2 Alikhanian Brothers Street, Yerevan 375036 (Armenia); Parlakyan, L. [Yerevan Physics Institute, 2 Alikhanian Brothers Street, Yerevan 375036 (Armenia); Popov, V. [Thomas Jefferson National Accelerator Facility, Newport News VA 23606 (United States); Tang, L. [Thomas Jefferson National Accelerator Facility, Newport News VA 23606 (United States); Vardanyan, H. [Yerevan Physics Institute, 2 Alikhanian Brothers Street, Yerevan 375036 (Armenia); Yan, C. [Thomas Jefferson National Accelerator Facility, Newport News VA 23606 (United States); Zhamkochyan, S. [Yerevan Physics Institute, 2 Alikhanian Brothers Street, Yerevan 375036 (Armenia); Zorn, C. [Thomas Jefferson National Accelerator Facility, Newport News VA 23606 (United States)

    2006-10-15

    We propose a photon detector for recording low-level and ultra-fast optical signals, based on radio frequency (RF) analysis of low-energy photoelectrons (PEs). By using a currently developed 500 MHz RF deflector, it is possible to scan circularly and detect single PEs, amplified in multi-channel plates (MCPs). The operation of the tube is investigated by means of a thermionic electron source. It is demonstrated that the signals generated in the MCP can be processed event by event using available nanosecond electronics, and that a time resolution better than 20 ps can be achieved. The timing characteristics of a Cherenkov detector with the RF phototube in a 'head-on' geometry are investigated by means of Monte Carlo simulation.

  9. Providing frequency regulation reserve services using demand response scheduling

    International Nuclear Information System (INIS)

    Motalleb, Mahdi; Thornton, Matsu; Reihani, Ehsan; Ghorbani, Reza

    2016-01-01

    Highlights:
    • Proposing a market model for contingency reserve services using demand response.
    • Considering transient limitations of grid frequency for inverter-based generations.
    • Price-sensitive scheduling of residential batteries and water heaters using dynamic programming.
    • Calculating the profits of both generation companies and demand response aggregators.
    Abstract: During power grid contingencies, frequency regulation is a primary concern. Historically, frequency regulation during contingency events has been the sole responsibility of the power utility. We present a practical method of using distributed demand response scheduling to provide frequency regulation during contingency events. This paper discusses the implementation of a control system model for the use of distributed energy storage systems such as battery banks and electric water heaters as a source of ancillary services. We present an algorithm which handles the optimization of demand response scheduling for normal operation and during contingency events. We use dynamic programming as an optimization tool. A price signal is developed using optimal power flow calculations to determine the locational marginal price of electricity, while sensor data for water usage is also collected. Using these inputs to dynamic programming, the optimal control signals are given as output. We assume a market model in which distributed demand response resources are sold as a commodity on the open market, so that the profits of demand response aggregators, as brokers of distributed demand response resources, can be calculated. In considering control decisions for regulation of transient changes in frequency, we focus on IEEE standard 1547 in order to prevent the safety shut-off of inverter-based generation and further exacerbation of frequency droop. This method is applied to IEEE case 118 as a demonstration of the method in practice.
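The paper's dynamic program also incorporates water-usage sensing and locational marginal prices; as a stripped-down illustration of the DP shape, the sketch below schedules a water heater's on-hours against an hourly price signal. The prices and the required-hours constraint are invented for the example:

```python
def schedule_heater(prices, hours_needed):
    """Dynamic program over (hour, heating hours still needed).
    Returns (minimum cost, on/off schedule). With no coupling between
    hours this reduces to picking the cheapest slots, but the DP form
    generalizes to sequential constraints such as tank temperature."""
    n = len(prices)
    INF = float("inf")
    # cost[t][k]: minimum cost from hour t onward with k hours still needed
    cost = [[INF] * (hours_needed + 1) for _ in range(n + 1)]
    cost[n][0] = 0.0
    for t in range(n - 1, -1, -1):
        for k in range(hours_needed + 1):
            best = cost[t + 1][k]                      # heater off this hour
            if k > 0 and cost[t + 1][k - 1] < INF:     # heater on this hour
                best = min(best, prices[t] + cost[t + 1][k - 1])
            cost[t][k] = best
    # Recover the schedule by replaying the optimal decisions.
    schedule, k = [], hours_needed
    for t in range(n):
        on = k > 0 and cost[t][k] == prices[t] + cost[t + 1][k - 1]
        schedule.append(on)
        if on:
            k -= 1
    return cost[0][hours_needed], schedule

prices = [0.30, 0.12, 0.25, 0.10, 0.28, 0.11]   # assumed $/kWh by hour
total, plan = schedule_heater(prices, hours_needed=3)
```

During a contingency, the same machinery can be re-run with an added reward for shedding load in the affected hours, shifting the schedule toward frequency support.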

  10. Numerical Simulations of Slow Stick Slip Events with PFC, a DEM Based Code

    Science.gov (United States)

    Ye, S. H.; Young, R. P.

    2017-12-01

    Nonvolcanic tremors around subduction zones have become a fascinating subject in seismology in recent years. Previous studies have shown that the nonvolcanic tremor beneath western Shikoku is composed of low frequency seismic waves overlapping each other. This finding provides a direct link between tremor and slow earthquakes. Slow stick slip events are considered to be laboratory-scale slow earthquakes. They are traditionally studied with direct shear or double direct shear experimental setups, in which the sliding velocity can be controlled to model a range of fast and slow stick slips. In this study, a PFC* model based on double direct shear is presented, with a central block clamped by two side blocks. The gouge layers between the central and side blocks are modelled as discrete fracture networks with smooth joint bonds between pairs of discrete elements. In addition, a second model is presented in this study. This model consists of a cylindrical sample subjected to triaxial stress. Similar to the previous model, a weak gouge layer at 45 degrees is added into the sample, on which shear slipping is allowed. Several different simulations are conducted on this sample. While the confining stress is maintained at the same level in different simulations, the axial loading rate (displacement rate) varies. By varying the displacement rate, a range of slipping behaviour, from stick slip to slow stick slip, is observed based on the stress-strain relationship. Currently, the stick slip and slow stick slip events are observed strictly on the basis of the stress-strain relationship. In the future, we hope to monitor the displacement and velocity of the balls surrounding the gouge layer as a function of time, so as to generate a synthetic seismogram. This will allow us to extract seismic waveforms and potentially simulate the tremor-like waves found around subduction zones. *Particle flow code, a discrete element method based numerical simulation code developed by

  11. Event-based user classification in Weibo media.

    Science.gov (United States)

    Guo, Liang; Wang, Wendong; Cheng, Shiduan; Que, Xirong

    2014-01-01

    Weibo media, known as the real-time microblogging services, has attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and changes the way people acquire and disseminate information significantly. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media is related to some events. Users who post different contents, and exert different behavior or attitude, may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events is a promising task. Under this circumstance, in order to effectively organize and manage the huge number of users, and thereby further manage their contents, we address the task of user classification in a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate the Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately.

  12. Linguistic spatial classifications of event domains in narratives of crime

    Directory of Open Access Journals (Sweden)

    Blake Stephen Howald

    2010-07-01

    Full Text Available Structurally, formal definitions of the linguistic narrative minimally require two temporally linked past-time events. The role of space in this definition, based on spatial language indicating where events occur, is considered optional and non-structural. However, based on narratives with a high frequency of spatial language, recent research has questioned this perspective, suggesting that space is more critical than may be readily apparent. Through an analysis of spatially rich serial criminal narratives, it will be demonstrated that spatial information qualitatively varies relative to narrative events. In particular, statistical classifiers in a supervised machine learning task achieve 90% accuracy in predicting Pre-Crime, Crime, and Post-Crime events based on spatial (and temporal) information. Overall, these results suggest a deeper spatial organization of discourse, which not only provides practical event resolution possibilities, but also challenges traditional formal linguistic definitions of narrative.

  13. Memory Processes in Frequency Judgment: The impact of pre-experimental frequencies and co-occurrences on frequency estimates.

    OpenAIRE

    Renkewitz, Frank

    2004-01-01

    Contemporary theories on frequency processing have been developed in different sub-disciplines of psychology and have shown remarkable discrepancies. Thus, in judgment and decision making, frequency estimates on serially encoded events are mostly traced back to the availability heuristic (Tversky & Kahneman, 1973). Evidence for the use of this heuristic comes from several popular demonstrations of biased frequency estimates. In the area of decision making, these demonstrations led to the ...

  14. SUBTLEX-AL: Albanian word frequencies based on film subtitles

    Directory of Open Access Journals (Sweden)

    Dr.Sc. Rrezarta Avdyli

    2013-06-01

    Full Text Available Recently, several studies have shown that word frequency estimates based on subtitle files explain the variance in word recognition performance better than traditional word frequency estimates did. The present study aims to provide such a frequency estimate for Albanian, based on more than 2M words from film subtitles. Our results show a high correlation between the RTs from a lexical decision study (120 stimuli) and SUBTLEX-AL, as well as a high correlation between it and the only existing frequency list of the hundred most frequent Albanian words. These findings suggest that SUBTLEX-AL is a good frequency estimate; furthermore, it is the first frequency database for Albanian larger than 100 words.
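Building a SUBTLEX-style frequency list from subtitle files amounts to stripping cue numbers and timestamps, tokenizing, and normalizing counts to occurrences per million words so corpora of different sizes are comparable. A minimal sketch, assuming SRT-format input (the Albanian example lines are invented):

```python
import re
from collections import Counter

def subtlex_frequencies(subtitle_text):
    """Word counts and per-million frequencies from raw subtitle text.
    SRT cue numbers and timestamp lines are stripped before counting."""
    content = [
        line for line in subtitle_text.splitlines()
        if line.strip() and not line.strip().isdigit() and "-->" not in line
    ]
    words = re.findall(r"\w+", " ".join(content).lower())
    counts = Counter(words)
    total = sum(counts.values())
    per_million = {w: c * 1_000_000 / total for w, c in counts.items()}
    return counts, per_million

# Tiny illustrative SRT fragment (two cues, five word tokens in total).
srt = """1
00:00:01,000 --> 00:00:03,000
Mirë se vini

2
00:00:04,000 --> 00:00:06,000
mirë natën"""
counts, fpm = subtlex_frequencies(srt)
```

In a real pipeline the per-million figures (often log-transformed) are what get correlated with lexical decision reaction times.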

  15. Adverse Event extraction from Structured Product Labels using the Event-based Text-mining of Health Electronic Records (ETHER) system.

    Science.gov (United States)

    Pandey, Abhishek; Kreimeyer, Kory; Foster, Matthew; Botsis, Taxiarchis; Dang, Oanh; Ly, Thomas; Wang, Wei; Forshee, Richard

    2018-01-01

    Structured Product Labels follow an XML-based document markup standard approved by the Health Level Seven organization and adopted by the US Food and Drug Administration as a mechanism for exchanging medical product information. Their current organization makes their secondary use rather challenging. We used the Side Effect Resource database and DailyMed to generate a comparison dataset of 1159 Structured Product Labels. We processed the Adverse Reaction section of these Structured Product Labels with the Event-based Text-mining of Health Electronic Records (ETHER) system and evaluated its ability to extract and encode Adverse Event terms to Medical Dictionary for Regulatory Activities Preferred Terms. A small sample of 100 labels was then selected for further analysis. Of the 100 labels, ETHER achieved a precision and recall of 81 percent and 92 percent, respectively. This study demonstrated the ETHER system's ability to extract and encode Adverse Event terms from Structured Product Labels, which may potentially support multiple pharmacoepidemiological tasks.

  16. A new approach for bioassays based on frequency- and time-domain measurements of magnetic nanoparticles.

    Science.gov (United States)

    Oisjöen, Fredrik; Schneiderman, Justin F; Astalan, Andrea Prieto; Kalabukhov, Alexey; Johansson, Christer; Winkler, Dag

    2010-01-15

    We demonstrate a one-step wash-free bioassay measurement system capable of tracking biochemical binding events. Our approach combines the high resolution of frequency- and high speed of time-domain measurements in a single device in combination with a fast one-step bioassay. The one-step nature of our magnetic nanoparticle (MNP) based assay reduces the time between sample extraction and quantitative results while mitigating the risks of contamination related to washing steps. Our method also enables tracking of binding events, providing the possibility of, for example, investigation of how chemical/biological environments affect the rate of a binding process or study of the action of certain drugs. We detect specific biological binding events occurring on the surfaces of fluid-suspended MNPs that modify their magnetic relaxation behavior. Herein, we extrapolate a modest sensitivity to analyte of 100 ng/ml with the present setup using our rapid one-step bioassay. More importantly, we determine the size-distributions of the MNP systems with theoretical fits to our data obtained from the two complementary measurement modalities and demonstrate quantitative agreement between them.

  17. U.S. Hail Frequency and the Global Wind Oscillation

    Science.gov (United States)

    Gensini, Vittorio A.; Allen, John T.

    2018-02-01

    Changes in Earth-relative atmospheric angular momentum can be described by an index known as the Global Wind Oscillation. This global index accounts for changes in Earth's atmospheric budget of relative angular momentum through interactions of tropical convection anomalies, extratropical dynamics, and engagement of surface torques (e.g., friction and mountain torques). It is shown herein that U.S. hail events are more (less) likely to occur in low (high) atmospheric angular momentum base states when excluding weak Global Wind Oscillation days, with the strongest relationships found in the boreal spring and fall. Severe, significant severe, and giant hail events are more likely to occur during Global Wind Oscillation phases 8, 1, 2, and 3 during the peak of the U.S. severe weather season. Lower frequencies of hail events are generally found in Global Wind Oscillation phases 4-7 but vary based on Global Wind Oscillation amplitude and month. In addition, probabilistic anomalies of atmospheric ingredients supportive of hail-producing supercell thunderstorms closely mimic locations of reported hail frequency, helping to corroborate the report results.

  18. Analysis of Loss-of-Offsite-Power Events 1997-2015

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Nancy Ellen [Idaho National Lab. (INL), Idaho Falls, ID (United States); Schroeder, John Alton [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-07-01

    Loss of offsite power (LOOP) can have a major negative impact on a power plant’s ability to achieve and maintain safe shutdown conditions. LOOP event frequencies and the times required for subsequent restoration of offsite power are important inputs to plant probabilistic risk assessments. This report presents a statistical and engineering analysis of LOOP frequencies and durations at U.S. commercial nuclear power plants. The data used in this study are based on operating experience during calendar years 1997 through 2015. LOOP events during critical operation that do not result in a reactor trip are not included. Frequencies and durations were determined for four event categories: plant-centered, switchyard-centered, grid-related, and weather-related. Emergency diesel generator reliability is also considered (failure to start, failure to load and run, and failure to run for more than 1 hour). There is an adverse trend in LOOP durations. The previously reported adverse trend in LOOP frequency was not statistically significant for 2006-2015. Grid-related LOOPs occur predominantly in the summer. Switchyard-centered LOOPs occur predominantly in winter and spring. Plant-centered and weather-related LOOPs do not show statistically significant seasonality. The engineering analysis of LOOP data shows that human errors have been much less frequent since 1997 than in the 1986-1996 time period.
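Reports of this kind typically estimate LOOP frequencies as Poisson rates with a Bayesian update. The sketch below uses illustrative event counts and exposure (not the report's data) and assumes a Jeffreys prior, a common convention for such analyses:

```python
import numpy as np

def loop_frequency(n_events, reactor_critical_years):
    """Point estimate and 90% interval for a LOOP event frequency
    (events per reactor critical year), using the Jeffreys prior for a
    Poisson rate: the posterior is Gamma(n + 0.5, T)."""
    a = n_events + 0.5
    T = reactor_critical_years
    mean = a / T
    # 5th/95th percentiles of the Gamma posterior via sampling
    rng = np.random.default_rng(0)
    draws = rng.gamma(shape=a, scale=1.0 / T, size=200_000)
    lo, hi = np.percentile(draws, [5, 95])
    return mean, lo, hi

# e.g. 4 grid-related LOOPs observed in 1500 reactor critical years
# (hypothetical numbers, for illustration only)
mean, lo, hi = loop_frequency(4, 1500.0)
```

The Gamma posterior keeps the estimate sensible even when very few events are observed, which is the usual situation for rare LOOP categories.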

  19. Event-based historical value-at-risk

    NARCIS (Netherlands)

    Hogenboom, F.P.; Winter, Michael; Hogenboom, A.C.; Jansen, Milan; Frasincar, F.; Kaymak, U.

    2012-01-01

    Value-at-Risk (VaR) is an important tool to assess portfolio risk. When calculating VaR based on historical stock return data, we hypothesize that this historical data is sensitive to outliers caused by news events in the sampled period. In this paper, we research whether the VaR accuracy can be
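The baseline calculation behind historical VaR is an empirical percentile of past returns; the paper's event-based refinement adjusts the sample for news-driven outliers. A minimal sketch of the baseline with simulated data:

```python
import numpy as np

def historical_var(returns, alpha=0.95):
    """One-day historical VaR: the loss threshold exceeded with
    probability 1 - alpha, read directly off the empirical
    distribution of past returns (no parametric assumptions)."""
    return -np.percentile(returns, 100 * (1 - alpha))

rng = np.random.default_rng(1)
rets = rng.normal(0.0005, 0.01, 1000)  # simulated daily returns
var95 = historical_var(rets, 0.95)      # positive number = loss magnitude
```

A portfolio worth 1M with `var95 = 0.016` would thus have a one-day 95% VaR of about 16,000 in the same currency.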

  20. Power quality events recognition using a SVM-based method

    Energy Technology Data Exchange (ETDEWEB)

    Cerqueira, Augusto Santiago; Ferreira, Danton Diego; Ribeiro, Moises Vidal; Duque, Carlos Augusto [Department of Electrical Circuits, Federal University of Juiz de Fora, Campus Universitario, 36036 900, Juiz de Fora MG (Brazil)

    2008-09-15

    In this paper, a novel SVM-based method for power quality event classification is proposed. A simple approach for feature extraction is introduced, based on the subtraction of the fundamental component from the acquired voltage signal. The resulting signal is presented to a support vector machine for event classification. Results from simulation are presented and compared with two other methods, the OTFR and the LCEC. The proposed method showed improved performance at a reasonable computational cost. (author)
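The feature-extraction step described, subtracting the fundamental component from the acquired voltage signal, can be sketched with a least-squares fit. The 60 Hz system, sampling rate, and sag parameters below are illustrative, and the SVM classification stage is omitted:

```python
import numpy as np

def remove_fundamental(v, fs, f0=60.0):
    """Least-squares fit of the fundamental (f0) sine and cosine,
    then subtract it, leaving only the disturbance content."""
    t = np.arange(len(v)) / fs
    A = np.column_stack([np.sin(2*np.pi*f0*t), np.cos(2*np.pi*f0*t)])
    coef, *_ = np.linalg.lstsq(A, v, rcond=None)
    return v - A @ coef

fs = 3840.0                      # 64 samples per 60 Hz cycle
t = np.arange(int(fs)) / fs      # one second of data
clean = np.sin(2*np.pi*60*t)
sag = clean.copy()
sag[1000:1500] *= 0.6            # a short voltage sag (hypothetical event)
res_clean = remove_fundamental(clean, fs)
res_sag = remove_fundamental(sag, fs)
```

The residual of the clean signal is essentially zero, while the sag leaves substantial residual energy, which is exactly the kind of feature a downstream classifier can work with.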

  1. Frequency scanning-based stability analysis method for grid-connected inverter system

    DEFF Research Database (Denmark)

    Wang, Yanbo; Wang, Xiongfei; Blaabjerg, Frede

    2017-01-01

    This paper proposes a frequency scanning-based impedance analysis for stability assessment of grid-connected inverter systems, which is able to perform stability assessment without using system mathematical models and inherits the superior feature of the impedance-based stability criterion with consideration of the inverter nonlinearities. A small current disturbance is injected into the grid-connected inverter system over a particular frequency range, the impedance is computed from the harmonic-frequency response using Fourier analysis, and the stability is then predicted on the basis of the impedance stability criterion. The stability issues of grid-connected inverters with grid-current feedback and converter-current feedback are addressed using the proposed method. The results obtained from simulation and experiments validate the effectiveness of the method.
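The injection-and-Fourier-analysis step can be illustrated on a known R-L grid model. The component values are hypothetical; a real measurement would use the inverter's sampled terminal waveforms rather than an analytic response:

```python
import numpy as np

def measure_impedance(f_inj, R=0.5, L=1e-3, fs=20_000, n=20_000):
    """Inject a small sinusoidal current into an R-L grid model and
    recover its impedance at f_inj from the single-bin Fourier
    coefficients of voltage and current at that frequency."""
    t = np.arange(n) / fs
    i = 0.1 * np.sin(2*np.pi*f_inj*t)              # injected current
    # analytic steady-state voltage across R + jwL for a sine input
    w = 2*np.pi*f_inj
    v = 0.1 * (R*np.sin(w*t) + w*L*np.cos(w*t))
    ref = np.exp(-2j*np.pi*f_inj*t)                 # single-bin DFT kernel
    Z = (v @ ref) / (i @ ref)
    return Z

Z100 = measure_impedance(100.0)   # expected: R + j*2*pi*100*L = 0.5 + 0.628j
```

Sweeping `f_inj` over the range of interest yields the impedance curve that the stability criterion is then applied to.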

  2. A frequency output ferroelectric phase PNZT capacitor-based temperature sensor

    KAUST Repository

    Khan, Naveed

    2016-09-05

    In this paper, a frequency output temperature sensor based on a 4% Niobium doped 20/80 Zr/Ti Lead Zirconate Titanate (PNZT) capacitor is proposed. The sensor capacitance vs temperature and capacitance vs voltage characteristics are experimentally measured below the Curie temperature of the ferroelectric capacitor. The capacitance of the 20/80 (Zr/Ti) composition PNZT capacitor changes by 29% for a temperature change from 10°C to 100°C, which translates to 0.32%/°C temperature sensitivity. The measured sensor characteristics show less than ∼0.7°C deviation from the ideal linear response. A Wien bridge oscillator based temperature sensor is demonstrated based on the PNZT capacitors. Mathematical analysis for the effect of the op-amp finite unity-gain frequency on the sensor circuit oscillation frequency is provided. The experimentally realized frequency output temperature sensor shows -17.6% relative frequency change for a temperature change from 10°C to 100°C. The proposed capacitive temperature sensor can be used in low-power smart sensor nodes without the need for extensive calibration. © 2015 IEEE.
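For context, the ideal Wien bridge oscillation frequency (equal R and C arms) is f = 1/(2πRC), so a +29% capacitance change alone would shift the frequency by about -22.5%; the measured -17.6% reflects the op-amp's finite unity-gain frequency analyzed in the paper. A quick check with illustrative R and C values:

```python
import math

def wien_bridge_freq(R, C):
    """Ideal Wien bridge oscillation frequency with equal R and C arms."""
    return 1.0 / (2 * math.pi * R * C)

f_cold = wien_bridge_freq(10e3, 1.00e-9)   # capacitance at 10 C (illustrative)
f_hot = wien_bridge_freq(10e3, 1.29e-9)    # +29% capacitance at 100 C
rel_change = (f_hot - f_cold) / f_cold      # ideal: 1/1.29 - 1, about -22.5%
```

Since f scales as 1/C, the relative frequency change depends only on the capacitance ratio, not on the particular R and C chosen.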

  3. How does higher frequency monitoring data affect the calibration of a process-based water quality model?

    Science.gov (United States)

    Jackson-Blake, Leah; Helliwell, Rachel

    2015-04-01

    Process-based catchment water quality models are increasingly used as tools to inform land management. However, for such models to be reliable they need to be well calibrated and shown to reproduce key catchment processes. Calibration can be challenging for process-based models, which tend to be complex and highly parameterised. Calibrating a large number of parameters generally requires a large amount of monitoring data, spanning all hydrochemical conditions. However, regulatory agencies and research organisations generally only sample at a fortnightly or monthly frequency, even in well-studied catchments, often missing peak flow events. The primary aim of this study was therefore to investigate how the quality and uncertainty of model simulations produced by a process-based, semi-distributed catchment model, INCA-P (the INtegrated CAtchment model of Phosphorus dynamics), were improved by calibration to higher frequency water chemistry data. Two model calibrations were carried out for a small rural Scottish catchment: one using 18 months of daily total dissolved phosphorus (TDP) concentration data, another using a fortnightly dataset derived from the daily data. To aid comparability, calibrations were carried out automatically using the Markov Chain Monte Carlo - DiffeRential Evolution Adaptive Metropolis (MCMC-DREAM) algorithm. Calibration to daily data resulted in improved simulation of peak TDP concentrations and improved model performance statistics. Parameter-related uncertainty in simulated TDP was large when fortnightly data was used for calibration, with a 95% credible interval of 26 μg/l. This uncertainty is comparable in size to the difference between Water Framework Directive (WFD) chemical status classes, and would therefore make it difficult to use this calibration to predict shifts in WFD status. The 95% credible interval reduced markedly with the higher frequency monitoring data, to 6 μg/l. The number of parameters that could be reliably auto
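The credible intervals quoted can be read directly off MCMC posterior samples of simulated TDP. A sketch with synthetic posteriors (the means and widths below are illustrative stand-ins, not INCA-P output):

```python
import numpy as np

def credible_interval_width(samples, level=0.95):
    """Width of the central credible interval computed from
    posterior samples (e.g. MCMC draws of simulated TDP, ug/l)."""
    lo, hi = np.percentile(samples, [(1 - level) / 2 * 100,
                                     (1 + level) / 2 * 100])
    return hi - lo

rng = np.random.default_rng(2)
fortnightly = rng.normal(20, 6.6, 5000)  # wide posterior (illustrative)
daily = rng.normal(20, 1.5, 5000)        # narrower posterior after daily calibration
```

Comparing the two widths reproduces the qualitative result of the study: higher-frequency calibration data shrinks the parameter-related uncertainty band.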

  4. Low-level contrast statistics of natural images can modulate the frequency of event-related potentials (ERPs) in humans

    Directory of Open Access Journals (Sweden)

    Masoud Ghodrati

    2016-12-01

    Full Text Available Humans are fast and accurate in categorizing complex natural images. It is, however, unclear what features of visual information are exploited by the brain to perceive the images with such speed and accuracy. It has been shown that low-level contrast statistics of natural scenes can explain the variance of the amplitude of event-related potentials (ERPs) in response to rapidly presented images. In this study, we investigated the effect of these statistics on the frequency content of ERPs. We recorded ERPs from human subjects while they viewed natural images, each presented for 70 ms. Our results showed that Weibull contrast statistics, as a biologically plausible model, explained the variance of ERPs best, compared to the other image statistics that we assessed. Our time-frequency analysis revealed a significant correlation between these statistics and the ERPs’ power within the theta frequency band (~3-7 Hz). This is interesting, as the theta band is believed to be involved in context updating and semantic encoding. This correlation became significant at ~110 ms after stimulus onset and peaked at 138 ms. Our results show that not only the amplitude but also the frequency of neural responses can be modulated by low-level contrast statistics of natural images, highlighting their potential role in scene perception.

  5. RainyDay: An Online, Open-Source Tool for Physically-based Rainfall and Flood Frequency Analysis

    Science.gov (United States)

    Wright, D.; Yu, G.; Holman, K. D.

    2017-12-01

    Flood frequency analysis in ungaged or changing watersheds typically requires rainfall intensity-duration-frequency (IDF) curves combined with hydrologic models. IDF curves only depict point-scale rainfall depth, while true rainstorms exhibit complex spatial and temporal structures. Floods result from these rainfall structures interacting with watershed features such as land cover, soils, and variable antecedent conditions as well as river channel processes. Thus, IDF curves are traditionally combined with a variety of "design storm" assumptions such as area reduction factors and idealized rainfall space-time distributions to translate rainfall depths into inputs that are suitable for flood hydrologic modeling. The impacts of such assumptions are relatively poorly understood. Meanwhile, modern precipitation estimates from gridded weather radar, grid-interpolated rain gages, satellites, and numerical weather models provide more realistic depictions of rainfall space-time structure. Usage of such datasets for rainfall and flood frequency analysis, however, is hindered by relatively short record lengths. We present RainyDay, an open-source stochastic storm transposition (SST) framework for generating large numbers of realistic rainfall "scenarios." SST "lengthens" the rainfall record by temporal resampling and geospatial transposition of observed storms to extract space-time information from regional gridded rainfall data. Relatively short (10-15 year) records of bias-corrected radar rainfall data are sufficient to estimate rainfall and flood events with much longer recurrence intervals including 100-year and 500-year events. We describe the SST methodology as implemented in RainyDay and compare rainfall IDF results from RainyDay to conventional estimates from NOAA Atlas 14. Then, we demonstrate some of the flood frequency analysis properties that are possible when RainyDay is integrated with a distributed hydrologic model, including robust estimation of flood
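A heavily simplified sketch of the SST idea, collapsing the spatial transposition step to resampling of storm depths from a catalog (all numbers are illustrative, and this is not RainyDay's actual algorithm):

```python
import numpy as np

def sst_rainfall_frequency(storm_depths, storms_per_year,
                           n_years=10_000, seed=0):
    """SST-style resampling: build synthetic years by drawing a Poisson
    number of transposed storms from the catalog, then read off
    return-period rainfall from the annual maxima."""
    rng = np.random.default_rng(seed)
    maxima = np.empty(n_years)
    for y in range(n_years):
        k = rng.poisson(storms_per_year)
        maxima[y] = rng.choice(storm_depths, size=max(k, 1)).max()
    return np.percentile(maxima, 99)   # ~100-year event depth

# hypothetical storm catalog (mm) from a short radar record
catalog = np.array([20., 25., 30., 35., 42., 55., 61., 78., 90., 110.])
p100 = sst_rainfall_frequency(catalog, storms_per_year=5)
```

The synthetic record of 10,000 years is what lets a 10-15 year catalog support 100-year and 500-year estimates, which is the core of the SST argument.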

  6. Event-Based User Classification in Weibo Media

    Directory of Open Access Journals (Sweden)

    Liang Guo

    2014-01-01

    Full Text Available Weibo media, known as real-time microblogging services, have attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and significantly changes the way people acquire and disseminate information. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media is related to events. Users who post different contents and exhibit different behaviors or attitudes may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events is a promising task. Under this circumstance, in order to effectively organize and manage the huge number of users, and thereby their contents, we address the task of user classification in a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate the Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately.

  7. A ROOT based event display software for JUNO

    Science.gov (United States)

    You, Z.; Li, K.; Zhang, Y.; Zhu, J.; Lin, T.; Li, W.

    2018-02-01

    An event display software SERENA has been designed for the Jiangmen Underground Neutrino Observatory (JUNO). The software has been developed in the JUNO offline software system and is based on the ROOT display package EVE. It provides an essential tool to display detector and event data for better understanding of the processes in the detectors. The software has been widely used in JUNO detector optimization, simulation, reconstruction and physics study.

  8. TEMAC, Top Event Sensitivity Analysis

    International Nuclear Information System (INIS)

    Iman, R.L.; Shortencarier, M.J.

    1988-01-01

    1 - Description of program or function: TEMAC is designed to permit the user to easily estimate risk and to perform sensitivity and uncertainty analyses with a Boolean expression such as produced by the SETS computer program. SETS produces a mathematical representation of a fault tree used to model system unavailability. In the terminology of the TEMAC program, such a mathematical representation is referred to as a top event. The analysis of risk involves the estimation of the magnitude of risk, the sensitivity of risk estimates to base event probabilities and initiating event frequencies, and the quantification of the uncertainty in the risk estimates. 2 - Method of solution: Sensitivity and uncertainty analyses associated with top events involve mathematical operations on the corresponding Boolean expression for the top event, as well as repeated evaluations of the top event in a Monte Carlo fashion. TEMAC employs a general matrix approach which provides a convenient general form for Boolean expressions, is computationally efficient, and allows large problems to be analyzed. 3 - Restrictions on the complexity of the problem - Maxima of: 4000 cut sets, 500 events, 500 values in a Monte Carlo sample, 16 characters in an event name. These restrictions are implemented through the FORTRAN 77 PARAMETER statement.
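The quantification TEMAC performs on a Boolean top event can be sketched with the familiar min-cut upper bound, which is accurate for rare events. The cut sets and basic event probabilities below are hypothetical:

```python
import numpy as np

def top_event_probability(cut_sets, p):
    """Min-cut upper bound for the top event probability: one minus the
    product over cut sets of (1 - product of basic event probabilities).
    Accurate when basic events are rare and roughly independent."""
    q = 1.0
    for cs in cut_sets:
        prod = float(np.prod([p[e] for e in cs]))
        q *= (1.0 - prod)
    return 1.0 - q

# hypothetical basic events and minimal cut sets
p = {"pump_a": 1e-3, "pump_b": 1e-3, "valve": 5e-4, "power": 2e-4}
cut_sets = [("pump_a", "pump_b"), ("valve",), ("power",)]
p_top = top_event_probability(cut_sets, p)   # roughly 7e-4
```

Sensitivity analysis then amounts to re-evaluating `p_top` while perturbing one basic event probability at a time, and uncertainty analysis to re-evaluating it over Monte Carlo draws of `p`.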

  9. The analysis of the initiating events in thorium-based molten salt reactor

    International Nuclear Information System (INIS)

    Zuo Jiaxu; Song Wei; Jing Jianping; Zhang Chunming

    2014-01-01

    The analysis and evaluation of initiating events is the starting point of nuclear safety analysis and probabilistic safety analysis, and a key element of both. Current initiating event analysis methods and experience are focused on water reactors; no methods or theories exist for the thorium-based molten salt reactor (TMSR). With TMSR research and development under way in China, initiating event analysis and evaluation is increasingly important. Such research can build on PWR analysis theories and methods. Based on the TMSR design, the theories and methods for its initiating event analysis can be researched and developed. The initiating event lists and analysis methods of generation II and III PWRs, the high-temperature gas-cooled reactor, and the sodium-cooled fast reactor are summarized. Based on the TMSR design, its initiating events are then identified and developed by logical analysis. A preliminary study of TMSR initiating events is described. This research is important for clarifying the event analysis rules and useful for TMSR design and nuclear safety analysis. (authors)

  10. EVNTRE, Code System for Event Progression Analysis for PRA

    International Nuclear Information System (INIS)

    2002-01-01

    1 - Description of program or function: EVNTRE is a generalized event tree processor that was developed for use in probabilistic risk analysis of severe accident progressions for nuclear power plants. The general nature of EVNTRE makes it applicable to a wide variety of analyses that involve the investigation of a progression of events which lead to a large number of sets of conditions or scenarios. EVNTRE efficiently processes large, complex event trees. It can assign probabilities to event tree branch points in several different ways, classify pathways or outcomes into user-specified groupings, and sample input distributions of probabilities and parameters. PSTEVNT, a post-processor program used to sort and reclassify the 'binned' data output from EVNTRE and generate summary tables, is included. 2 - Methods: EVNTRE processes event trees that are cast in the form of questions or events, with multiple choice answers for each question. Split fractions (probabilities or frequencies that sum to unity) are either supplied or calculated for the branches of each question in a path-dependent manner. EVNTRE traverses the tree, enumerating the leaves of the tree and calculating their probabilities or frequencies based upon the initial probability or frequency and the split fractions for the branches taken along the corresponding path to an individual leaf. The questions in the event tree are usually grouped to address specific phases of time regimes in the progression of the scenario or severe accident. Grouping or binning of each path through the event tree in terms of a small number of characteristics or attributes is allowed. Boolean expressions of the branches taken are used to select the appropriate values of the characteristics of interest for the given path. Typically, the user specifies a cutoff tolerance for the frequency of a pathway to terminate further exploration. 
Multiple sets of input to an event tree can be processed by using Monte Carlo sampling to generate
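The path enumeration described, multiplying split fractions along each branch and pruning paths whose frequency falls below a cutoff, can be sketched as follows (a toy two-question tree, not an EVNTRE input format):

```python
def traverse(questions, path=(), freq=1.0, cutoff=1e-10, leaves=None):
    """Enumerate event tree paths: each question maps branch names to
    split fractions summing to one; a path's frequency is the product of
    the fractions along it, and low-frequency branches are pruned."""
    if leaves is None:
        leaves = {}
    if not questions:
        leaves[path] = freq
        return leaves
    for name, frac in questions[0].items():
        f = freq * frac
        if f >= cutoff:
            traverse(questions[1:], path + (name,), f, cutoff, leaves)
    return leaves

# hypothetical accident-progression questions
tree = [
    {"sprays_ok": 0.99, "sprays_fail": 0.01},
    {"vessel_intact": 0.95, "vessel_breach": 0.05},
]
leaves = traverse(tree, freq=1e-5)   # initiating event frequency 1e-5 per year
```

The leaf frequencies sum back to the initiating event frequency (no pruning occurred here), and binning the leaves by attributes of interest corresponds to EVNTRE's classification step.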

  11. Event-Based control of depth of hypnosis in anesthesia.

    Science.gov (United States)

    Merigo, Luca; Beschi, Manuel; Padula, Fabrizio; Latronico, Nicola; Paltenghi, Massimiliano; Visioli, Antonio

    2017-08-01

    In this paper, we propose the use of an event-based control strategy for the closed-loop control of the depth of hypnosis in anesthesia by using propofol administration and the bispectral index as a controlled variable. A new event generator with high noise-filtering properties is employed in addition to a PIDPlus controller. The tuning of the parameters is performed off-line by using genetic algorithms on a given data set of patients. The effectiveness and robustness of the method is verified in simulation by implementing a Monte Carlo method to address intra-patient and inter-patient variability. A comparison with a standard PID control structure shows that the event-based control system achieves a reduction of the total variation of the manipulated variable of 93% in the induction phase and of 95% in the maintenance phase. The use of event-based automatic control in anesthesia yields a fast induction phase with bounded overshoot and an acceptable disturbance rejection. A comparison with a standard PID control structure shows that the technique effectively mimics the behavior of the anesthesiologist by providing a significant decrement of the total variation of the manipulated variable. Copyright © 2017 Elsevier B.V. All rights reserved.
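A send-on-delta event generator is one common form of such a noise-filtering trigger (the paper's generator is more elaborate), and it already shows how eventing reduces the total variation seen by the controller:

```python
import numpy as np

def deadband_events(measurement, delta):
    """Send-on-delta event generator: the controller input is updated
    only when the measurement moves more than `delta` from the last
    transmitted value, filtering noise and cutting actuator activity."""
    out = np.empty_like(measurement)
    last = measurement[0]
    for k, m in enumerate(measurement):
        if abs(m - last) > delta:
            last = m
        out[k] = last
    return out

rng = np.random.default_rng(3)
bis = 50 + rng.normal(0, 1.5, 500)   # noisy BIS-like signal around a target of 50
held = deadband_events(bis, delta=4.0)
tv_raw = np.sum(np.abs(np.diff(bis)))   # total variation without eventing
tv_evt = np.sum(np.abs(np.diff(held)))  # total variation after the deadband
```

The deadband width trades responsiveness against actuator wear, which is the same trade-off the paper's genetic-algorithm tuning resolves over a patient data set.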

  12. Abstracting event-based control models for high autonomy systems

    Science.gov (United States)

    Luh, Cheng-Jye; Zeigler, Bernard P.

    1993-01-01

    A high autonomy system needs many models on which to base control, management, design, and other interventions. These models differ in level of abstraction and in formalism. Concepts and tools are needed to organize the models into a coherent whole. The paper deals with the abstraction processes for systematic derivation of related models for use in event-based control. The multifaceted modeling methodology is briefly reviewed. The morphism concepts needed for application to model abstraction are described. A theory for supporting the construction of DEVS models needed for event-based control is then presented. An implemented morphism on the basis of this theory is also described.

  13. IBES: A Tool for Creating Instructions Based on Event Segmentation

    Directory of Open Access Journals (Sweden)

    Katharina Mura

    2013-12-01

    Full Text Available Receiving informative, well-structured, and well-designed instructions supports performance and memory in assembly tasks. We describe IBES, a tool with which users can quickly and easily create multimedia, step-by-step instructions by segmenting a video of a task into segments. In a validation study we demonstrate that the step-by-step structure of the visual instructions created by the tool corresponds to the natural event boundaries, which are assessed by event segmentation and are known to play an important role in memory processes. In one part of the study, twenty participants created instructions based on videos of two different scenarios by using the proposed tool. In the other part of the study, ten and twelve participants respectively segmented videos of the same scenarios yielding event boundaries for coarse and fine events. We found that the visual steps chosen by the participants for creating the instruction manual had corresponding events in the event segmentation. The number of instructional steps was a compromise between the number of fine and coarse events. Our interpretation of results is that the tool picks up on natural human event perception processes of segmenting an ongoing activity into events and enables the convenient transfer into meaningful multimedia instructions for assembly tasks. We discuss the practical application of IBES, for example, creating manuals for differing expertise levels, and give suggestions for research on user-oriented instructional design based on this tool.

  14. IBES: a tool for creating instructions based on event segmentation.

    Science.gov (United States)

    Mura, Katharina; Petersen, Nils; Huff, Markus; Ghose, Tandra

    2013-12-26

    Receiving informative, well-structured, and well-designed instructions supports performance and memory in assembly tasks. We describe IBES, a tool with which users can quickly and easily create multimedia, step-by-step instructions by segmenting a video of a task into segments. In a validation study we demonstrate that the step-by-step structure of the visual instructions created by the tool corresponds to the natural event boundaries, which are assessed by event segmentation and are known to play an important role in memory processes. In one part of the study, 20 participants created instructions based on videos of two different scenarios by using the proposed tool. In the other part of the study, 10 and 12 participants respectively segmented videos of the same scenarios yielding event boundaries for coarse and fine events. We found that the visual steps chosen by the participants for creating the instruction manual had corresponding events in the event segmentation. The number of instructional steps was a compromise between the number of fine and coarse events. Our interpretation of results is that the tool picks up on natural human event perception processes of segmenting an ongoing activity into events and enables the convenient transfer into meaningful multimedia instructions for assembly tasks. We discuss the practical application of IBES, for example, creating manuals for differing expertise levels, and give suggestions for research on user-oriented instructional design based on this tool.

  15. Eastern Frequency Response Study

    Energy Technology Data Exchange (ETDEWEB)

    Miller, N.W.; Shao, M.; Pajic, S.; D' Aquila, R.

    2013-05-01

    This study was specifically designed to investigate the frequency response of the Eastern Interconnection that results from large loss-of-generation events of the type targeted by the North American Electric Reliability Corp. Standard BAL-003 Frequency Response and Frequency Bias Setting (NERC 2012a), under possible future system conditions with high levels of wind generation.

  16. Micro-Doppler Signal Time-Frequency Algorithm Based on STFRFT

    Directory of Open Access Journals (Sweden)

    Cunsuo Pang

    2016-09-01

    Full Text Available This paper proposes a time-frequency algorithm based on short-time fractional order Fourier transformation (STFRFT) for identification of complicated movement targets. This algorithm, consisting of an STFRFT order-changing and quick selection method, is effective in reducing the computation load. A multi-order STFRFT time-frequency algorithm is also developed that makes use of the time-frequency feature of each micro-Doppler component signal. This algorithm improves the estimation accuracy of time-frequency curve fitting through multi-order matching. Finally, experimental data were used to demonstrate STFRFT’s performance in micro-Doppler time-frequency analysis. The results validated the higher estimation accuracy of the proposed algorithm. It may be applied to an LFM (linear frequency modulated) pulse radar, SAR (synthetic aperture radar), or ISAR (inverse synthetic aperture radar) for improving the probability of target recognition.

  17. Micro-Doppler Signal Time-Frequency Algorithm Based on STFRFT.

    Science.gov (United States)

    Pang, Cunsuo; Han, Yan; Hou, Huiling; Liu, Shengheng; Zhang, Nan

    2016-09-24

    This paper proposes a time-frequency algorithm based on short-time fractional order Fourier transformation (STFRFT) for identification of complicated movement targets. This algorithm, consisting of an STFRFT order-changing and quick selection method, is effective in reducing the computation load. A multi-order STFRFT time-frequency algorithm is also developed that makes use of the time-frequency feature of each micro-Doppler component signal. This algorithm improves the estimation accuracy of time-frequency curve fitting through multi-order matching. Finally, experimental data were used to demonstrate STFRFT's performance in micro-Doppler time-frequency analysis. The results validated the higher estimation accuracy of the proposed algorithm. It may be applied to an LFM (linear frequency modulated) pulse radar, SAR (synthetic aperture radar), or ISAR (inverse synthetic aperture radar), for improving the probability of target recognition.
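As a simplified illustration, a plain short-time Fourier transform already reveals a sinusoidal micro-Doppler ridge; the STFRFT replaces the Fourier kernel with a fractional-order one better matched to chirp-like components. The signal parameters below are illustrative:

```python
import numpy as np

def stft_mag(x, fs, win=256, hop=64):
    """Magnitude short-time Fourier transform (a simplification: the
    paper's STFRFT generalizes this kernel to fractional order)."""
    w = np.hanning(win)
    frames = [x[i:i + win] * w for i in range(0, len(x) - win, hop)]
    return np.abs(np.fft.rfft(frames, axis=1))

fs = 2000.0
t = np.arange(int(fs)) / fs
# sinusoidal micro-Doppler: instantaneous frequency 200 + 100*sin(4*pi*t) Hz
phase = 2 * np.pi * (200 * t - (100 / (4 * np.pi)) * np.cos(4 * np.pi * t))
S = stft_mag(np.sin(phase), fs)
peak_bins = S.argmax(axis=1)   # the spectrogram ridge tracks the modulation
```

Fitting a sinusoid to the ridge `peak_bins * fs / win` recovers the micro-motion parameters, which is the curve-fitting step the paper improves through multi-order matching.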

  18. Core damage frequency (reactor design) perspectives based on IPE results

    International Nuclear Information System (INIS)

    Camp, A.L.; Dingman, S.E.; Forester, J.A.

    1996-01-01

    This paper provides perspectives gained from reviewing 75 Individual Plant Examination (IPE) submittals covering 108 nuclear power plant units. Variability both within and among reactor types is examined to provide perspectives regarding plant-specific design and operational features and modeling assumptions that play a significant role in the estimates of core damage frequencies in the IPEs. Human actions found to be important in boiling water reactors (BWRs) and in pressurized water reactors (PWRs) are presented, and the events most frequently found important are discussed

  19. DYNAMIC AUTHORIZATION BASED ON THE HISTORY OF EVENTS

    Directory of Open Access Journals (Sweden)

    Maxim V. Baklanovsky

    2016-11-01

    Full Text Available A new paradigm in the field of access control systems with fuzzy authorization is proposed. Consider a set of objects in a single data transmission network. The goal is to develop a dynamic authorization protocol based on the correctness of the presentation of events (news) that occurred earlier in the network. We propose a mathematical method that stores the history of events compactly, neglects more distant and less significant events, and composes and verifies authorization data. The history of events is represented as vectors of numbers. Each vector is multiplied by several stochastic vectors. It is known that if the vectors of events are sparse, they can be restored with high accuracy by solving an ℓ1-optimization problem. Experiments on vector restoration have shown that the greater the number of stochastic vectors, the better the accuracy of the restored vectors. It was established that the components largest in absolute value are restored first. An access control system with the proposed dynamic authorization method makes it possible to compute fuzzy confidence coefficients in networks with a frequently changing set of participants, mesh networks, and multi-agent systems.
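Recovering a sparse event-history vector from a few stochastic projections can be sketched with orthogonal matching pursuit, a greedy stand-in for the sparse-optimization step the paper relies on. Dimensions, support, and coefficients below are illustrative:

```python
import numpy as np

def omp(Phi, y, k):
    """Orthogonal matching pursuit: greedily recover a k-sparse x from
    y = Phi @ x by repeatedly picking the column most correlated with
    the residual, then re-fitting on the selected support."""
    residual, support = y.astype(float), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x = np.zeros(Phi.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(4)
n, m, k = 100, 60, 3                     # 100-dim history, 60 stochastic vectors
x_true = np.zeros(n)
x_true[[7, 42, 90]] = [5.0, -4.0, 8.0]   # sparse event history (illustrative)
Phi = rng.normal(size=(m, n)) / np.sqrt(m)
y = Phi @ x_true                          # the compact stored representation
x_hat = omp(Phi, y, k)
```

Consistent with the paper's observation, the greedy selection recovers the largest-magnitude components first, since they dominate the correlations with the residual.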

  20. Event-based state estimation a stochastic perspective

    CERN Document Server

    Shi, Dawei; Chen, Tongwen

    2016-01-01

    This book explores event-based estimation problems. It shows how several stochastic approaches are developed to maintain estimation performance when sensors perform their updates at slower rates only when needed. The self-contained presentation makes this book suitable for readers with no more than a basic knowledge of probability analysis, matrix algebra and linear systems. The introduction and literature review provide information, while the main content deals with estimation problems from four distinct angles in a stochastic setting, using numerous illustrative examples and comparisons. The text elucidates both theoretical developments and their applications, and is rounded out by a review of open problems. This book is a valuable resource for researchers and students who wish to expand their knowledge and work in the area of event-triggered systems. At the same time, engineers and practitioners in industrial process control will benefit from the event-triggering technique that reduces communication costs ...

  1. HYPOCENTER DISTRIBUTION OF LOW FREQUENCY EVENT AT PAPANDAYAN VOLCANO

    Directory of Open Access Journals (Sweden)

    Muhammad Mifta Hasan

    2016-10-01

    Full Text Available Papandayan volcano is a stratovolcano with an irregular cone shape and eight craters around the peak. The most active crater in Papandayan is the Mas crater. The distribution of relocated events, calculated using the Geiger Adaptive Damping Algorithm (GAD), shows that the epicenters are centered below the Mas crater with a maximum rms of 0.114, while the hypocenter depths range between 0-2 km and 5-6 km due to the activity of steam and gas.

  2. Common time-frequency analysis of local field potential and pyramidal cell activity in seizure-like events of the rat hippocampus

    Science.gov (United States)

    Cotic, M.; Chiu, A. W. L.; Jahromi, S. S.; Carlen, P. L.; Bardakjian, B. L.

    2011-08-01

    To study cell-field dynamics, physiologists simultaneously record local field potentials and the activity of individual cells from animals performing cognitive tasks, during various brain states or under pathological conditions. However, apart from spike shape and spike timing analyses, few studies have focused on elucidating the common time-frequency structure of local field activity relative to surrounding cells across different periods of phenomena. We have used two algorithms, multi-window time frequency analysis and wavelet phase coherence (WPC), to study common intracellular-extracellular (I-E) spectral features in spontaneous seizure-like events (SLEs) from rat hippocampal slices in a low magnesium epilepsy model. Both algorithms were applied to 'pairs' of simultaneously observed I-E signals from slices in the CA1 hippocampal region. Analyses were performed over a frequency range of 1-100 Hz. I-E spectral commonality varied in frequency and time. Higher commonality was observed from 1 to 15 Hz, and lower commonality was observed in the 15-100 Hz frequency range. WPC was lower in the non-SLE region compared to SLE activity; however, there was no statistical difference in the 30-45 Hz band between SLE and non-SLE modes. This work provides evidence of strong commonality in various frequency bands of I-E SLEs in the rat hippocampus, not only during SLEs but also immediately before and after.

  3. A robust neural network-based approach for microseismic event detection

    KAUST Repository

    Akram, Jubran

    2017-08-17

    We present an artificial neural network-based approach for robust event detection from low S/N waveforms. We use a feed-forward network with a single hidden layer that is tuned on a training dataset and later applied to the entire example dataset for event detection. The input features used include the average of absolute amplitudes, variance, energy ratio and polarization rectilinearity. These features are calculated in a moving window of the same length over the entire waveform. The output is set as a user-specified relative probability curve, which provides a robust way of distinguishing between weak and strong events. An optimal network is selected by studying the weight-based saliency and the effect of the number of neurons on the predicted results. Using synthetic data examples, we demonstrate that this approach is effective in detecting weaker events and reduces the number of false positives.
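
    The moving-window features named in the abstract (average absolute amplitude, variance, energy ratio) can be sketched as follows. This is a toy illustration, not the authors' code: the exact feature definitions, window length, and the whole-trace energy baseline are assumptions.

    ```python
    import numpy as np

    def moving_window_features(x, win):
        """Per-window features of the kind listed in the abstract:
        mean absolute amplitude, variance, and an energy ratio relative
        to the whole trace (baseline choice is an assumption)."""
        n = len(x) - win + 1
        feats = np.empty((n, 3))
        trace_energy = np.mean(x ** 2)  # whole-trace average energy
        for i in range(n):
            w = x[i:i + win]
            feats[i, 0] = np.mean(np.abs(w))              # average absolute amplitude
            feats[i, 1] = np.var(w)                       # variance
            feats[i, 2] = np.mean(w ** 2) / trace_energy  # energy ratio
        return feats

    # Synthetic low-S/N trace: noise with a short burst in the middle
    rng = np.random.default_rng(0)
    x = rng.normal(0, 0.1, 1000)
    x[500:520] += 2.0 * np.sin(np.linspace(0, 20 * np.pi, 20))
    F = moving_window_features(x, win=50)
    print(int(np.argmax(F[:, 2])))  # window index where the energy ratio peaks
    ```

    In a full pipeline these feature vectors would be fed to the trained feed-forward network; here the energy-ratio peak alone already localizes the synthetic event.
    
    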

  4. Gear-box fault detection using time-frequency based methods

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Stoustrup, Jakob

    2015-01-01

    Gear-box fault monitoring and detection is important for the optimization of power generation and the availability of wind turbines. The current industrial approach is to use condition monitoring systems, which run in parallel with the wind turbine control system using expensive additional sensors ... in the gear-box resonance frequency can be detected. Two different time-frequency based approaches are presented in this paper: one is a filter-based approach and the other is based on a Karhunen-Loeve basis. Both detect the gear-box fault with an acceptable detection delay of at most 100 s, which ... is negligible compared with the fault development time ...

  5. Power Load Event Detection and Classification Based on Edge Symbol Analysis and Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Lei Jiang

    2012-01-01

    Full Text Available Energy signature analysis of power appliances is the core of nonintrusive load monitoring (NILM), in which detailed data on the appliances used in a house are obtained by analyzing changes in voltage and current. This paper focuses on developing automatic power load event detection and appliance classification based on machine learning. For power load event detection, the paper presents a new transient detection algorithm: by analyzing turn-on and turn-off transient waveforms, it can accurately detect the edge point at which a device is switched on or off. The proposed load classification technique can identify different power appliances with improved recognition accuracy and computational speed; it is composed of two processes, frequency feature analysis and a support vector machine. The experimental results indicate that incorporating the new edge detection and turn-on/turn-off transient signature analysis into NILM reveals more information than traditional NILM methods, and the load classification method achieves a recognition rate of more than ninety percent.
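
    A minimal edge-detection step of the kind described above might look like this. It is a simplified stand-in for the paper's transient-detection algorithm; the threshold, minimum event gap, and synthetic load profile are all assumptions.

    ```python
    import numpy as np

    def detect_edges(power, threshold=50.0, min_gap=5):
        """Flag turn-on/turn-off events as large step changes in the
        aggregate power signal (simplified stand-in for the paper's
        transient detector; parameter values are assumptions)."""
        diff = np.diff(power)
        events = []
        last = -min_gap
        for i, d in enumerate(diff):
            if abs(d) >= threshold and i - last >= min_gap:
                events.append((i + 1, 'on' if d > 0 else 'off'))
                last = i
        return events

    # Synthetic aggregate load: a 100 W appliance switches on at t=20, off at t=60
    p = np.zeros(100)
    p[20:60] += 100.0
    print(detect_edges(p))  # -> [(20, 'on'), (60, 'off')]
    ```

    In the paper's pipeline, the waveform around each detected edge would then be passed to the frequency-feature/SVM classifier.
    
    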

  6. Probabilistic analysis of external events with focus on the Fukushima event

    International Nuclear Information System (INIS)

    Kollasko, Heiko; Jockenhoevel-Barttfeld, Mariana; Klapp, Ulrich

    2014-01-01

    External hazards are natural or man-made hazards to a site and its facilities that originate externally to both the site and its processes, i.e. the duty holder may have very little or no control over the hazard. External hazards can have the potential to cause initiating events at the plant, typically transients such as loss of offsite power. Simultaneously, external events may affect the safety systems required to control the initiating event and, where applicable, the back-up systems implemented for risk reduction. Plant safety may especially be threatened when loads from external hazards exceed the load assumptions considered in the design of safety-related systems, structures and components. Another potential threat is posed by hazards inducing initiating events not otherwise considered in the safety demonstration. An example is loss of offsite power combined with prolonged plant isolation; offsite support (e.g., delivery of diesel fuel oil), usually credited in the deterministic safety analysis, may not be possible in this case. As the Fukushima events have shown, the biggest threat is likely posed by hazards inducing both effects. Such hazards may well be dominant risk contributors even if their return period is very high. In order to identify the relevant external hazards for a given Nuclear Power Plant (NPP) location, a site-specific screening analysis is performed, both for single events and for combinations of external events. As a result of the screening analysis, risk-significant and therefore relevant (screened-in) single external events and combinations of them are identified for a site. The screened-in events are further considered in a detailed event tree analysis in the frame of the Probabilistic Safety Analysis (PSA) to calculate the core damage/large release frequency resulting from each relevant external event or combination. Screening analyses of external events performed at AREVA are based on the approach provided ...

  7. The effects of high-frequency oscillations in hippocampal electrical activities on the classification of epileptiform events using artificial neural networks

    Science.gov (United States)

    Chiu, Alan W. L.; Jahromi, Shokrollah S.; Khosravani, Houman; Carlen, Peter L.; Bardakjian, Berj L.

    2006-03-01

    The existence of hippocampal high-frequency electrical activities (greater than 100 Hz) during the progression of seizure episodes in both human and animal experimental models of epilepsy has been well documented (Bragin A, Engel J, Wilson C L, Fried I and Buzsáki G 1999 Hippocampus 9 137-42; Khosravani H, Pinnegar C R, Mitchell J R, Bardakjian B L, Federico P and Carlen P L 2005 Epilepsia 46 1-10). However, this information has not been studied between successive seizure episodes or utilized in the application of seizure classification. In this study, we examine the dynamical changes of an in vitro low-Mg2+ rat hippocampal slice model of epilepsy at different frequency bands using wavelet transforms and artificial neural networks. By dividing the time-frequency spectrum of each seizure-like event (SLE) into frequency bins, we can analyze their burst-to-burst variations within individual SLEs as well as between successive SLE episodes. Wavelet energy and wavelet entropy are estimated for intracellular and extracellular electrical recordings using sufficiently high sampling rates (10 kHz). We demonstrate that the activities of high-frequency oscillations in the 100-400 Hz range increase as the slice approaches SLE onsets and in later episodes of SLEs. Utilizing the time-dependent relationship between different frequency bands, we can achieve frequency-dependent state classification. We demonstrate that activities in the frequency range 100-400 Hz are critical for the accurate classification of the different states of electrographic seizure-like episodes (containing interictal, preictal and ictal states) in brain slices undergoing recurrent spontaneous SLEs. While preictal activities can be classified with an average accuracy of 77.4 ± 6.7% utilizing the frequency spectrum in the range 0-400 Hz, we can also achieve a similar level of accuracy by using a nonlinear relationship between the 100-400 Hz and <4 Hz frequency bands only.
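
    The per-band energy and entropy measures used in such studies can be illustrated with a simple FFT-based stand-in for the wavelet computation. The band edges and the 10 kHz sampling rate follow the abstract; everything else (the test signal, the entropy definition over band energies) is an assumption for illustration.

    ```python
    import numpy as np

    def band_energies(x, fs, bands):
        """Relative spectral energy per frequency band (an FFT-based
        stand-in for wavelet energy; band edges in Hz)."""
        spec = np.abs(np.fft.rfft(x)) ** 2
        freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
        e = np.array([spec[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands])
        return e / e.sum()

    def spectral_entropy(p):
        """Shannon entropy of a normalized energy distribution,
        analogous to wavelet entropy."""
        p = p[p > 0]
        return float(-(p * np.log2(p)).sum())

    fs = 10_000                       # 10 kHz sampling, as in the study
    t = np.arange(10_000) / fs        # 1 s of signal
    x = np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 250 * t)
    p = band_energies(x, fs, [(0, 4), (4, 100), (100, 400)])
    print(p.round(3))                 # fraction of energy in each band
    ```

    Tracking how the 100-400 Hz fraction grows from burst to burst is the kind of feature the classifier in the study exploits.
    
    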

  8. Distributed Event-Based Set-Membership Filtering for a Class of Nonlinear Systems With Sensor Saturations Over Sensor Networks.

    Science.gov (United States)

    Ma, Lifeng; Wang, Zidong; Lam, Hak-Keung; Kyriakoulis, Nikos

    2017-11-01

    In this paper, the distributed set-membership filtering problem is investigated for a class of discrete time-varying systems with an event-based communication mechanism over sensor networks. The system under consideration is subject to sector-bounded nonlinearity, unknown but bounded noises and sensor saturations. Each intelligent sensing node transmits data to its neighbors only when a certain triggering condition is violated. By means of a set of recursive matrix inequalities, sufficient conditions are derived for the existence of the desired distributed event-based filter, which is capable of confining the system state to certain ellipsoidal regions centered at the estimates. Within the established theoretical framework, two additional optimization problems are formulated: one is to seek the minimal ellipsoids (in the sense of matrix trace) for the best filtering performance, and the other is to maximize the triggering threshold so as to reduce the triggering frequency while maintaining satisfactory filtering performance. A numerically attractive chaos algorithm is employed to solve the optimization problems. Finally, an illustrative example is presented to demonstrate the effectiveness and applicability of the proposed algorithm.

  9. Integrated analyzing method for the progress event based on subjects and predicates in events

    International Nuclear Information System (INIS)

    Minowa, Hirotsugu; Munesawa, Yoshiomi

    2014-01-01

    It is expected that knowledge extracted by analyzing past mistakes can be used to prevent the recurrence of accidents. Currently, the main analytic style is one in which experts deeply decipher individual accident cases, while cross-case analysis has gone no further than extracting the factors common to those cases. In this study, we propose an integrated analyzing method for progress events that analyzes across accidents. Our method integrates many accident cases by connecting the common keywords, called 'Subject' and 'Predicate', that are extracted from each progress event in accident or near-miss cases. From the integrated accident data, our method can analyze and visualize partial risk identification, the frequency with which factors cause accidents, and the risk assessment. Applying our method to PEC-SAFER accident cases identified 8 hazardous factors that can again arise from tanks, and visualized the most frequent factors (damage of tank, 26%; corrosion, 21%) and the highest risks (damage, 3.3 x 10^-2 [risk rank/year]; destruction, 2.5 x 10^-2 [risk rank/year]). (author)

  10. Science-based risk assessments for rare events in a changing climate

    Science.gov (United States)

    Sobel, A. H.; Tippett, M. K.; Camargo, S. J.; Lee, C. Y.; Allen, J. T.

    2014-12-01

    History shows that substantial investments in protection against any specific type of natural disaster usually occur only after (usually shortly after) that specific type of disaster has happened in a given place. This is true even when it was well known before the event that there was a significant risk that it could occur. Presumably what psychologists Kahneman and Tversky have called "availability bias" is responsible, at least in part, for these failures to act on known but out-of-sample risks. While understandable, this human tendency prepares us poorly for events which are very rare (on the time scales of human lives) and even more poorly for a changing climate, as historical records become a poorer guide. A more forward-thinking and rational approach would require scientific risk assessments that can place meaningful probabilities on events that are rare enough to be absent from the historical record, and that can account for the influences of both anthropogenic climate change and low-frequency natural climate variability. The set of tools available for doing such risk assessments is still quite limited, particularly for some of the most extreme events such as tropical cyclones and tornadoes. We will briefly assess the state of the art for these events in particular, and describe some of our ongoing research to develop new tools for quantitative risk assessment using hybrids of statistical methods and physical understanding of the hazards.

  11. THE EFFECT OF DEVOTEE-BASED BRAND EQUITY ON RELIGIOUS EVENTS

    Directory of Open Access Journals (Sweden)

    MUHAMMAD JAWAD IQBAL

    2016-04-01

    Full Text Available The objective of this research is to apply the DBBE (devotee-based brand equity) model to discover constructs for measuring a religious event as a business brand on the basis of devotees' perceptions. SEM was applied to test the hypothesized model, with CFA used to analyze it, and a theoretical model was built to assess model fit. The sample size was 500. Brand loyalty was affected directly by image and quality. This information might benefit event management and sponsors in building brands and operating visitors' destinations. More importantly, the brands of these religious events in Pakistan can be built into strong tourism products.

  12. Probabilistic safety analysis for fire events for the NPP Isar 2

    International Nuclear Information System (INIS)

    Schmaltz, H.; Hristodulidis, A.

    2007-01-01

    The 'Probabilistic Safety Analysis for Fire Events' (Fire-PSA KKI2) for NPP Isar 2 was performed in addition to the PSA for full power operation and considers all possible events which can be initiated by a fire. The aim of the plant-specific Fire-PSA was to perform a quantitative assessment of fire events during full power operation according to the state of the art. Based on simplified assumptions regarding fire-induced failures, the influence of system and component failures on the frequency of core damage states was analysed. The Fire-PSA considers events in which fire-induced failures of equipment result either in a SCRAM or in situations which have no direct operational effects but in which, because of the fire-induced failure of safety-related installations, the plant is shut down as a precautionary measure. The latter events are considered because they may have a non-negligible influence on the frequency of core damage states in case of failures during plant shutdown, owing to the reduced redundancy of safety-related systems. (orig.)

  13. Using Web Crawler Technology for Geo-Events Analysis: A Case Study of the Huangyan Island Incident

    Directory of Open Access Journals (Sweden)

    Hao Hu

    2014-04-01

    Full Text Available Social networking and network socialization bring abundant text information and social relationships into our daily lives. Making full use of these data in the big data era is of great significance for better understanding the changing world and the information-based society. Although politics has been integrally involved in hyperlinked world issues since the 1990s, the text analysis and data visualization of geo-events long faced the bottleneck of traditional manual analysis. Although automatic assembly of different geospatial web services and distributed geospatial information systems using service chaining has recently been explored and built, data mining and information collection are not comprehensive enough because of the sensitivity, complexity, relativity, timeliness, and unexpected characteristics of political events. Based on the Heritrix framework and the analysis of web-based text, the word frequency, sentiment tendency, and dissemination path of the Huangyan Island incident were studied using web crawler technology and text analysis. The results indicate that the tag cloud, frequency map, attitude pie charts, individual mention ratios, and dissemination flow graph based on the crawled information and data processing not only highlight the characteristics of the geo-event itself, but also reveal many interesting phenomena and deep-seated problems behind it, such as related topics, theme vocabularies, subject contents, hot countries, event bodies, opinion leaders, high-frequency vocabularies, information sources, semantic structure, propagation paths, distribution of different attitudes, and regional differences in net citizens' responses to the Huangyan Island incident. Furthermore, text analysis of network information with the help of a focused web crawler is able to express the time-space relationship of the crawled information and the information characteristics of the semantic network of geo-events. Therefore, it is a useful tool to ...
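
    The word-frequency step behind a tag cloud or frequency map can be sketched in a few lines. This is a toy illustration of the counting stage only, not the study's pipeline; the tokenization rule and sample texts are assumptions.

    ```python
    from collections import Counter
    import re

    def word_frequencies(docs, top_n=5):
        """Count word occurrences across crawled documents
        (lower-cased alphabetic tokens; rule is an assumption)."""
        tokens = []
        for doc in docs:
            tokens += re.findall(r"[a-z]+", doc.lower())
        return Counter(tokens).most_common(top_n)

    crawled = [
        "Huangyan Island incident sparks debate",
        "Debate over the Huangyan Island standoff continues",
    ]
    print(word_frequencies(crawled, top_n=3))
    # -> [('huangyan', 2), ('island', 2), ('debate', 2)]
    ```

    Real crawls would add stop-word removal and, for sentiment tendency, a lexicon or classifier on top of these counts.
    
    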

  14. Event-based plausibility immediately influences on-line language comprehension.

    Science.gov (United States)

    Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L; Scheepers, Christoph; McRae, Ken

    2011-07-01

    In some theories of sentence comprehension, linguistically relevant lexical knowledge, such as selectional restrictions, is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional restriction violations. Specifically, we investigated whether instruments can combine with actions to influence comprehension of ensuing patient nouns (as in Rayner, Warren, Juhasz, & Liversedge, 2004; Warren & McConnell, 2007). Instrument-verb-patient triplets were created in a norming study designed to tap directly into event knowledge. In self-paced reading (Experiment 1), participants were faster to read patient nouns, such as hair, when they were typical of the instrument-action pair (Donna used the shampoo to wash vs. the hose to wash). Experiment 2 showed that these results were not due to direct instrument-patient relations. Experiment 3 replicated Experiment 1 using eyetracking, with effects of event typicality observed in first fixation and gaze durations on the patient noun. This research demonstrates that conceptual event-based expectations are computed and used rapidly and dynamically during on-line language comprehension. We discuss relationships among plausibility and predictability, as well as their implications. We conclude that selectional restrictions may be best considered as event-based conceptual knowledge rather than lexical-grammatical knowledge.

  15. Frequency-Wavenumber (FK)-Based Data Selection in High-Frequency Passive Surface Wave Survey

    Science.gov (United States)

    Cheng, Feng; Xia, Jianghai; Xu, Zongbo; Hu, Yue; Mi, Binbin

    2018-04-01

    Passive surface wave methods have gained much attention from the geophysical and civil engineering communities because of the limited applicability of traditional seismic surveys in highly populated urban areas. Because they can provide high-frequency phase velocity information up to several tens of Hz, the active surface wave survey can be omitted and the amount of field work dramatically reduced. However, the measured dispersion energy image in a passive surface wave survey is usually polluted by a type of "crossed" artifact at high frequencies. This is common in the bidirectional noise distribution case, with a linear receiver array deployed along roads or railways. We review several frequently used passive surface wave methods and derive the underlying physics for the existence of the "crossed" artifacts. We prove that the "crossed" artifacts cross the true surface wave energy at fixed points in the f-v domain and propose an FK-based data selection technique to attenuate the artifacts in order to retrieve the high-frequency information. Numerical tests further demonstrate the existence of the "crossed" artifacts and indicate that the well-known wave field separation method, the FK filter, does not work for the selection of directional noise data. Real-world applications demonstrate the feasibility of the proposed FK-based technique to improve passive surface wave methods by a priori data selection. Finally, we discuss the applicability of our approach.
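
    The f-k (frequency-wavenumber) domain underlying this selection technique can be illustrated with a 2D FFT of a synthetic linear-array record: a plane wave with phase velocity v maps to a line f = v·k, which is what makes directional energy separable. This sketch only shows the transform, not the paper's selection procedure; all geometry and signal parameters are assumptions.

    ```python
    import numpy as np

    fs, dx = 500.0, 1.0          # sample rate (Hz), receiver spacing (m)
    nt, nx = 500, 64             # time samples, receivers
    v, f0 = 100.0, 25.0          # phase velocity (m/s), wave frequency (Hz)

    # Synthetic gather: one plane wave travelling along the array
    t = np.arange(nt) / fs
    gather = np.array([np.sin(2 * np.pi * f0 * (t - i * dx / v)) for i in range(nx)])

    # f-k spectrum via 2D FFT; the peak lies on the line f = v * k
    FK = np.abs(np.fft.fft2(gather))
    kk, ff = np.unravel_index(int(np.argmax(FK)), FK.shape)
    f_hz = np.fft.fftfreq(nt, 1.0 / fs)[ff]
    k_cyc = np.fft.fftfreq(nx, dx)[kk]
    print(abs(f_hz / k_cyc))     # apparent velocity of the plane wave, m/s
    ```

    Energy from the opposite propagation direction would appear in the mirrored f-k quadrant, which is what an FK-based mask can select or reject before dispersion imaging.
    
    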

  17. A scheme for PET data normalization in event-based motion correction

    International Nuclear Information System (INIS)

    Zhou, Victor W; Kyme, Andre Z; Fulton, Roger; Meikle, Steven R

    2009-01-01

    Line of response (LOR) rebinning is an event-based motion-correction technique for positron emission tomography (PET) imaging that has been shown to compensate effectively for rigid motion. It involves the spatial transformation of LORs to compensate for motion during the scan, as measured by a motion tracking system. Each motion-corrected event is then recorded in the sinogram bin corresponding to the transformed LOR. It has been shown previously that the corrected event must be normalized using a normalization factor derived from the original LOR, that is, based on the pair of detectors involved in the original coincidence event. In general, due to data compression strategies (mashing), sinogram bins record events detected on multiple LORs. The number of LORs associated with a sinogram bin determines the relative contribution of each LOR. This paper provides a thorough treatment of event-based normalization during motion correction of PET data using LOR rebinning. We demonstrate theoretically and experimentally that normalization of the corrected event during LOR rebinning should account for the number of LORs contributing to the sinogram bin into which the motion-corrected event is binned. Failure to account for this factor may cause artifactual slice-to-slice count variations in the transverse slices and visible horizontal stripe artifacts in the coronal and sagittal slices of the reconstructed images. The theory and implementation of normalization in conjunction with the LOR rebinning technique are described in detail, and experimental verification of the proposed normalization method in phantom studies is presented.

  18. Central FPGA-based Destination and Load Control in the LHCb MHz Event Readout

    CERN Document Server

    Jacobsson, Richard

    2012-01-01

    The readout strategy of the LHCb experiment [1] is based on complete event readout at 1 MHz [2]. Over 300 sub-detector readout boards transmit event fragments at 1 MHz over a commercial 70 Gigabyte/s switching network to a distributed event building and trigger processing farm with 1470 individual multi-core computer nodes [3]. In the original specifications, the readout was based on a pure push protocol. This paper describes the proposal, implementation, and experience of a powerful non-conventional mixture of a push and a pull protocol, akin to credit-based flow control. A high-speed FPGA-based central master module controls the event fragment packing in the readout boards, the assignment of the farm node destination for each event, and controls the farm load based on an asynchronous pull mechanism from each farm node. This dynamic readout scheme relies on generic event requests and the concept of node credit allowing load balancing and trigger rate regulation as a function of the global farm load. It also ...

  19. Frequency domain based LS channel estimation in OFDM based Power line communications

    OpenAIRE

    Bogdanović, Mario

    2015-01-01

    This paper is focused on low voltage power line communication (PLC) realization, with an emphasis on channel estimation techniques. The Orthogonal Frequency Division Multiplexing (OFDM) scheme is the preferred technology in PLC systems because it effectively combats the frequency-selective fading of the PLC channel. As channel estimation is one of the crucial problems in OFDM-based PLC systems, owing to PLC signal attenuation and interference, the improved LS est ...
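
    The baseline least-squares (LS) estimator that such improved variants build on is the per-subcarrier division H_ls = Y/X on known pilots. The sketch below is the textbook form under assumed parameters (BPSK pilots, a made-up 3-tap channel, small noise); the paper's improved LS estimator adds refinements not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_sc = 64                                    # OFDM subcarriers
    h = np.array([1.0, 0.5, 0.25])               # example channel impulse response
    H_true = np.fft.fft(h, n_sc)                 # channel frequency response
    X = rng.choice([-1.0, 1.0], n_sc)            # known BPSK pilot symbols
    noise = rng.normal(0, 0.01, n_sc) + 1j * rng.normal(0, 0.01, n_sc)
    Y = H_true * X + noise                       # received pilot subcarriers

    H_ls = Y / X                                 # LS estimate per subcarrier
    print(np.max(np.abs(H_ls - H_true)))         # estimation error, small at this SNR
    ```

    At low SNR the raw LS estimate becomes noisy, which is exactly the regime where smoothing or MMSE-style refinements pay off in PLC channels.
    
    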

  20. Frequency Estimator Performance for a Software-Based Beacon Receiver

    Science.gov (United States)

    Zemba, Michael J.; Morse, Jacquelynne Rose; Nessel, James A.; Miranda, Felix

    2014-01-01

    As propagation terminals have evolved, their design has trended toward a software-based approach that facilitates convenient adjustment and customization of the receiver algorithms. One potential improvement is the implementation of a frequency estimation algorithm, through which the primary frequency component of the received signal can be estimated with much greater resolution than with a simple peak search of the FFT spectrum. To select an estimator for use in a QV-band beacon receiver, an analysis of six frequency estimators was conducted to characterize their effectiveness as it relates to beacon receiver design.
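
    One common way to exceed FFT-bin resolution, of the general kind such comparisons cover, is parabolic interpolation of the log-magnitude spectrum around the peak bin. This is an illustrative estimator, not necessarily one of the six evaluated in the study; the window choice and test tone are assumptions.

    ```python
    import numpy as np

    def refined_peak_freq(x, fs):
        """Estimate the dominant frequency beyond FFT-bin resolution by
        fitting a parabola to the log-magnitude spectrum around the peak."""
        X = np.abs(np.fft.rfft(x * np.hanning(len(x))))
        k = int(np.argmax(X))
        a, b, c = np.log(X[k - 1]), np.log(X[k]), np.log(X[k + 1])
        delta = 0.5 * (a - c) / (a - 2 * b + c)   # sub-bin offset in [-0.5, 0.5]
        return (k + delta) * fs / len(x)

    fs = 1000.0
    t = np.arange(0, 1.0, 1.0 / fs)
    x = np.sin(2 * np.pi * 123.4 * t)             # tone between FFT bins (1 Hz spacing)
    print(round(refined_peak_freq(x, fs), 1))     # -> 123.4
    ```

    A plain argmax of the FFT would return 123.0 Hz here; the interpolation recovers the sub-bin offset, which matters when tracking a slowly drifting beacon.
    
    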

  1. Event-Based Stabilization over Networks with Transmission Delays

    Directory of Open Access Journals (Sweden)

    Xiangyu Meng

    2012-01-01

    Full Text Available This paper investigates asymptotic stabilization for linear systems over networks based on event-driven communication. A new communication logic is proposed to reduce the feedback effort, with advantages over traditional schemes based on continuous feedback. Considering the effect of time-varying transmission delays, criteria for the design of both the feedback gain and the event-triggering mechanism are derived to guarantee the stability and performance requirements. Finally, the proposed techniques are illustrated by an inverted pendulum system and a numerical example.

  2. Modelling of extreme rainfall events in Peninsular Malaysia based on annual maximum and partial duration series

    Science.gov (United States)

    Zin, Wan Zawiah Wan; Shinyie, Wendy Ling; Jemain, Abdul Aziz

    2015-02-01

    In this study, two series of data for extreme rainfall events are generated based on the Annual Maximum and Partial Duration methods, derived from 102 rain-gauge stations in Peninsular Malaysia from 1982-2012. To determine the optimal threshold for each station, several requirements must be satisfied, and the Adapted Hill estimator is employed for this purpose. A semi-parametric bootstrap is then used to estimate the mean square error (MSE) of the estimator at each threshold, and the optimal threshold is selected based on the smallest MSE. The mean annual frequency is also checked to ensure that it lies in the range of one to five, and the resulting data are de-clustered to ensure independence. The two data series are then fitted to the Generalized Extreme Value and Generalized Pareto distributions for the annual maximum and partial duration series, respectively. The parameter estimation methods used are Maximum Likelihood and L-moments. Two goodness-of-fit tests are then used to evaluate the best-fitted distribution. The results showed that the Partial Duration series with the Generalized Pareto distribution and Maximum Likelihood parameter estimation provides the best representation of extreme rainfall events in Peninsular Malaysia for the majority of the stations studied. Based on these findings, several return values are also derived and spatial maps are constructed to identify the distribution characteristics of extreme rainfall in Peninsular Malaysia.
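
    The construction of the two series can be sketched as follows: the Annual Maximum (AM) series keeps one value per year, while the Partial Duration Series (PDS) keeps all exceedances of a threshold. This is a toy illustration with synthetic rainfall; the threshold choice here is a simple quantile, not the Adapted Hill/bootstrap procedure of the study, and de-clustering is omitted.

    ```python
    import numpy as np

    def annual_maxima(daily, days_per_year=365):
        """Annual Maximum series: one value per year."""
        n_years = len(daily) // days_per_year
        return daily[:n_years * days_per_year].reshape(n_years, days_per_year).max(axis=1)

    def partial_duration(daily, threshold):
        """Partial Duration Series: all exceedances of a threshold
        (de-clustering, as done in the study, is omitted here)."""
        return daily[daily > threshold]

    rng = np.random.default_rng(2)
    daily = rng.gamma(shape=0.5, scale=10.0, size=365 * 30)  # 30 years of synthetic rainfall
    am = annual_maxima(daily)
    u = np.quantile(daily, 0.99)          # stand-in threshold (assumption)
    pds = partial_duration(daily, u)
    print(len(am), len(pds))              # 30 annual maxima; ~1% of days exceed u
    ```

    The AM series would then be fitted with a GEV distribution and the PDS with a Generalized Pareto, as in the study.
    
    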

  3. Low and high frequency Madden-Julian oscillations in austral summer: interannual variations

    Energy Technology Data Exchange (ETDEWEB)

    Izumo, Takeshi [Research Institute For Global Change (JAMSTEC), Yokohama (Japan); LOCEAN, IRD-CNRS-UPMC, Paris (France); Masson, Sebastien; Vialard, Jerome; Madec, Gurvan [LOCEAN, IRD-CNRS-UPMC, Paris (France); Boyer Montegut, Clement de [IFREMER, Brest (France); Behera, Swadhin K. [Research Institute For Global Change (JAMSTEC), Yokohama (Japan); Takahashi, Keiko [Earth Simulator Center (JAMSTEC), Yokohama (Japan); Yamagata, Toshio [Research Institute For Global Change (JAMSTEC), Yokohama (Japan); University of Tokyo, Tokyo (Japan)

    2010-09-15

    The Madden-Julian oscillation (MJO) is the main component of intraseasonal variability of the tropical convection, with clear climatic impacts at an almost-global scale. Based on satellite observations, it is shown that there are two types of austral-summer MJO events (broadly defined as 30-120 days convective variability with eastward propagation of about 5 m/s). Equatorial MJO events have a period of 30-50 days and tend to be symmetric about the equator, whereas MJO events centered near 8 S tend to have a longer period of 55-100 days. The lower-frequency variability is associated with a strong upper-ocean response, having a clear signature in both sea surface temperature and its diurnal cycle. These two MJO types have different interannual variations, and are modulated by the Indian Ocean Dipole (IOD). Following a negative IOD event, the lower-frequency southern MJO variability increases, while the higher-frequency equatorial MJO strongly diminishes. We propose two possible explanations for this change in properties of the MJO. One possibility is that changes in the background atmospheric circulation after an IOD favour the development of the low-frequency MJO. The other possibility is that the shallower thermocline ridge and mixed layer depth, by enhancing SST intraseasonal variability and thus ocean-atmosphere coupling in the southwest Indian Ocean (the breeding ground of southern MJO onset), favour the lower-frequency southern MJO variability. (orig.)

  4. FIREDATA, Nuclear Power Plant Fire Event Data Base

    International Nuclear Information System (INIS)

    Wheelis, W.T.

    2001-01-01

    1 - Description of program or function: FIREDATA contains raw fire event data from 1965 through June 1985. These data were obtained from a number of reference sources, including American Nuclear Insurers, Licensee Event Reports, Nuclear Power Experience, and Electric Power Research Institute Fire Loss Data, and then collated into one database developed in the personal computer database management system dBASE III. FIREDATA is menu-driven and asks interactive questions of the user that allow searching of the database for various aspects of a fire, such as location, mode of plant operation at the time of the fire, means of detection and suppression, dollar loss, etc. Other features include the capability of searching for single or multiple criteria (using Boolean 'and' or 'or' logical operations), user-defined keyword searches of fire event descriptions, summary displays of fire event data by plant name or calendar date, and options for calculating the years of operating experience for all commercial nuclear power plants from any user-specified date and the ability to display general plant information. 2 - Method of solution: The six database files used to store nuclear power plant fire event information, FIRE, DESC, SUM, OPEXPER, OPEXBWR, and EXPERPWR, are accessed by software to display information meeting user-specified criteria or to perform numerical calculations (e.g., to determine the operating experience of a nuclear plant). FIRE contains specific searchable data relating to each of 354 fire events. A keyword concept is used to search each of the 31 separate entries or fields. DESC contains written descriptions of each of the fire events. SUM holds basic plant information for all plants proposed, under construction, in operation, or decommissioned. This includes the initial criticality and commercial operation dates, the physical location of the plant, and its operating capacity. OPEXPER contains date information and data on how various plant locations are
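
    The single/multiple-criteria search with Boolean 'and'/'or' logic that FIREDATA offers can be sketched over toy records. The record fields and values below are invented for illustration; the real dBASE III files have many more fields.

    ```python
    # Boolean AND/OR search over fire-event records, in the spirit of
    # FIREDATA's menu-driven queries (toy records; fields are assumptions).
    records = [
        {"location": "cable room", "mode": "full power", "detection": "smoke detector"},
        {"location": "turbine building", "mode": "shutdown", "detection": "operator"},
        {"location": "cable room", "mode": "shutdown", "detection": "operator"},
    ]

    def search(records, criteria, logic="and"):
        """Return records matching all criteria ('and') or any ('or')."""
        match = all if logic == "and" else any
        return [r for r in records
                if match(r.get(field) == value for field, value in criteria.items())]

    print(len(search(records, {"location": "cable room", "mode": "shutdown"}, "and")))  # -> 1
    print(len(search(records, {"location": "cable room", "mode": "shutdown"}, "or")))   # -> 3
    ```
    
    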

  5. Time-Frequency Distribution of Music based on Sparse Wavelet Packet Representations

    DEFF Research Database (Denmark)

    Endelt, Line Ørtoft

    We introduce a new method for generating time-frequency distributions, which is particularly useful for the analysis of music signals. The method presented here is based on $\ell_1$ sparse representations of music signals in a redundant wavelet packet dictionary. The representations are found using the minimization methods basis pursuit and best orthogonal basis. Visualizations of the time-frequency distribution are constructed based on a simplified energy distribution in the wavelet packet decomposition. The time-frequency distributions emphasize structured musical content, including non-stationary content...
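The core idea, representing a signal sparsely in a redundant dictionary, can be illustrated compactly. The sketch below uses greedy orthogonal matching pursuit in a small hand-built dictionary instead of the paper's $\ell_1$ basis pursuit in a wavelet packet dictionary, purely to keep the example short; dictionary and signal are invented.

```python
import numpy as np

# Illustrative sketch: a greedy sparse decomposition (orthogonal matching
# pursuit) in a small redundant dictionary. The paper instead uses l1
# minimization (basis pursuit) in a redundant wavelet packet dictionary;
# OMP stands in here to keep the example short.
I = np.eye(4)                                   # "time-domain" atoms
H = np.array([[1, 1, 1, 0],                     # Haar-like "scale" atoms
              [1, 1, -1, 0],
              [1, -1, 0, 1],
              [1, -1, 0, -1]], dtype=float)
H /= np.linalg.norm(H, axis=0)                  # unit-norm atoms
D = np.hstack([I, H])                           # redundant 4 x 8 dictionary

x = 2.0 * H[:, 0] + 1.0 * I[:, 2]               # 2-sparse in the mixed dictionary

def omp(D, x, n_atoms):
    residual, support = x.copy(), []
    for _ in range(n_atoms):
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coef
    return support, coef, residual

support, coef, residual = omp(D, x, 2)
# Two atoms (columns 4 and 2 of D) reconstruct the signal exactly.
```

The energy carried by each selected atom is what a time-frequency visualization of the kind described above would display.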

  6. Neural correlates of attentional and mnemonic processing in event-based prospective memory.

    Science.gov (United States)

    Knight, Justin B; Ethridge, Lauren E; Marsh, Richard L; Clementz, Brett A

    2010-01-01

    Prospective memory (PM), or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those perhaps associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT), followed by a LDT with an embedded PM component. Event-based cues were constituted by color and lexicality (red words). Behavioral data provided evidence that monitoring, or preparatory attentional processes, were used to detect cues. Analysis of the event-related potentials (ERP) revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest preparatory attention may operate by selectively modulating processing of features related to a previously formed event-based intention, as well as provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  7. Evaluation of external hazards to nuclear power plants in the United States: Other external events

    International Nuclear Information System (INIS)

    Kimura, C.Y.; Prassinos, P.G.

    1989-02-01

    In support of implementation of the Nuclear Regulatory Commission's Severe Accident Policy, the Lawrence Livermore National Laboratory (LLNL) has performed a study of the risk of core damage to nuclear power plants in the United States due to "other external events." The broad objective has been to gain an understanding of whether "other external events" (the hazards not covered by previous reports) are among the major potential accident initiators that may pose a threat of severe reactor core damage or of large radioactive release to the environment from the reactor. The "other external events" covered in this report are nearby industrial/military facility accidents, on-site hazardous material storage accidents, severe temperature transients, severe weather storms, lightning strikes, external fires, extraterrestrial activity, volcanic activity, earth movement, and abrasive windstorms. The analysis was based on two figures-of-merit, one based on core damage frequency and the other based on the frequency of large radioactive releases. 37 refs., 8 tabs

  8. Improving the Critic Learning for Event-Based Nonlinear $H_{\\infty }$ Control Design.

    Science.gov (United States)

    Wang, Ding; He, Haibo; Liu, Derong

    2017-10-01

    In this paper, we aim to improve the critic learning criterion to cope with event-based nonlinear H∞ state feedback control design. First, the H∞ control problem is regarded as a two-player zero-sum game, and the adaptive critic mechanism is used to achieve the minimax optimization in an event-based environment. Then, based on an improved updating rule, the event-based optimal control law and the time-based worst-case disturbance law are obtained approximately by training a single critic neural network. An initial stabilizing control is no longer required during the implementation of the new algorithm. Next, the closed-loop system is formulated as an impulsive model and its stability is handled by incorporating the improved learning criterion. The infamous Zeno behavior of the event-based design is avoided through theoretical analysis of the lower bound on the minimal intersample time. Finally, applications to an aircraft dynamics model and a robot arm plant are carried out to verify the performance of the proposed design method.

  9. Ontology-based prediction of surgical events in laparoscopic surgery

    Science.gov (United States)

    Katić, Darko; Wekerle, Anna-Laura; Gärtner, Fabian; Kenngott, Hannes; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie

    2013-03-01

    Context-aware technologies have great potential to help surgeons during laparoscopic interventions. Their underlying idea is to create systems which can adapt their assistance functions automatically to the situation in the OR, thus relieving surgeons of the burden of managing computer-assisted surgery devices manually. For this purpose, a certain understanding of the current situation in the OR is essential. Beyond that, anticipatory knowledge of incoming events is beneficial, e.g. for early warnings of imminent risk situations. To achieve the goal of predicting surgical events based on previously observed ones, we developed a language to describe surgeries and surgical events using Description Logics and integrated it with methods from computational linguistics. Using n-grams to compute probabilities of follow-up events, we are able to make sensible predictions of upcoming events in real time. The system was evaluated on professionally recorded and labeled surgeries and showed an average prediction rate of 80%.
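The n-gram prediction idea can be sketched with bigrams: estimate P(next event | current event) from labeled surgery logs and return the most probable follow-up. This is a minimal illustration with invented event names, not the authors' Description Logics pipeline.

```python
from collections import Counter, defaultdict

# Illustrative bigram sketch (invented event names, not the paper's system):
# estimate P(next event | current event) from labeled surgery sequences and
# predict the most probable follow-up event.
training_surgeries = [
    ["incision", "dissection", "clipping", "cutting", "suturing"],
    ["incision", "dissection", "coagulation", "cutting", "suturing"],
    ["incision", "dissection", "clipping", "cutting", "irrigation"],
]

bigram_counts = defaultdict(Counter)
for surgery in training_surgeries:
    for current, nxt in zip(surgery, surgery[1:]):
        bigram_counts[current][nxt] += 1

def predict_next(event):
    """Most probable follow-up event and its estimated probability."""
    counts = bigram_counts[event]
    best, n = counts.most_common(1)[0]
    return best, n / sum(counts.values())

# After "dissection", "clipping" was observed 2 of 3 times.
event, p = predict_next("dissection")
```

Longer n-grams condition on more history at the cost of sparser counts, the usual trade-off in such predictors.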

  10. Poisson-event-based analysis of cell proliferation.

    Science.gov (United States)

    Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul

    2015-05-01

    A protocol for the assessment of cell proliferation dynamics is presented. This is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single-cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division, rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed, which maintains single-cell resolution and reports on the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines, over a period of up to 48 h. Automated image processing of the bright field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified, temporal, and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean intermitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
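A nonhomogeneous Poisson process with an exponentially increasing rate, as inferred for the mitotic event series, can be simulated by the standard thinning (Lewis-Shedler) algorithm. The sketch below is an illustration with invented parameter values, not the authors' analysis code.

```python
import math
import random

# Illustrative sketch: simulating a nonhomogeneous Poisson process of
# division events by thinning, with rate lam(t) = lam0 * exp(growth * t)
# increasing exponentially in time as reported for the mitotic series.
# Parameter values are invented.
def simulate_nhpp(lam0, growth, t_end, rng):
    """Lewis-Shedler thinning against the maximal rate on [0, t_end]."""
    lam_max = lam0 * math.exp(growth * t_end)   # upper bound on lam(t)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(lam_max)           # candidate from homogeneous process
        if t > t_end:
            return events
        if rng.random() < lam0 * math.exp(growth * t) / lam_max:
            events.append(t)                    # accept with prob lam(t) / lam_max

rng = random.Random(1)
events = simulate_nhpp(lam0=0.5, growth=0.05, t_end=48.0, rng=rng)
interevent_times = [b - a for a, b in zip(events, events[1:])]
# Interevent times shorten, on average, as lam(t) grows over the 48 h window.
```

Comparing observed interevent statistics against such simulated series is one way to test for the synchronisation effect described above.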

  11. Acoustic frequency filter based on anisotropic topological phononic crystals

    KAUST Repository

    Chen, Zeguo

    2017-11-02

    We present a design of acoustic frequency filter based on a two-dimensional anisotropic phononic crystal. The anisotropic band structure exhibits either a directional or a combined (global + directional) bandgap at certain frequency regions, depending on the geometry. When the time-reversal symmetry is broken, it may introduce a topologically nontrivial bandgap. The induced nontrivial bandgap and the original directional bandgap result in various interesting wave propagation behaviors, such as frequency filter. We develop a tight-binding model to characterize the effective Hamiltonian of the system, from which the contribution of anisotropy is explicitly shown. Different from the isotropic cases, the Zeeman-type splitting is not linear and the anisotropic bandgap makes it possible to achieve anisotropic propagation characteristics along different directions and at different frequencies.
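The role of a bandgap in blocking propagation at selected frequencies can be illustrated with a much simpler model than the paper's anisotropic phononic crystal: a 1D two-band SSH-like tight-binding chain, where dimerized hoppings open a gap. This is a generic textbook stand-in, not the authors' effective Hamiltonian.

```python
import numpy as np

# Minimal tight-binding sketch (a generic SSH-like chain, not the paper's
# phononic model): alternating hoppings t1, t2 open a bandgap of width
# 2*|t1 - t2| at the zone edge; equal hoppings close it.
def bands(t1, t2, k):
    h_offdiag = t1 + t2 * np.exp(-1j * k)        # Bloch Hamiltonian off-diagonal
    e = abs(h_offdiag)
    return -e, +e                                # the two band energies

ks = np.linspace(0, np.pi, 201)
lower, upper = np.array([bands(1.0, 0.6, k) for k in ks]).T
gap = upper.min() - lower.max()                  # direct gap across the spectrum
# gap = 2*|t1 - t2| = 0.8 here; waves at frequencies inside it cannot propagate.
```

A frequency filter exploits exactly this: signals whose frequencies fall in the gap are rejected while others pass.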

  12. Acoustic frequency filter based on anisotropic topological phononic crystals

    KAUST Repository

    Chen, Zeguo; Zhao, Jiajun; Mei, Jun; Wu, Ying

    2017-01-01

    We present a design of acoustic frequency filter based on a two-dimensional anisotropic phononic crystal. The anisotropic band structure exhibits either a directional or a combined (global + directional) bandgap at certain frequency regions, depending on the geometry. When the time-reversal symmetry is broken, it may introduce a topologically nontrivial bandgap. The induced nontrivial bandgap and the original directional bandgap result in various interesting wave propagation behaviors, such as frequency filter. We develop a tight-binding model to characterize the effective Hamiltonian of the system, from which the contribution of anisotropy is explicitly shown. Different from the isotropic cases, the Zeeman-type splitting is not linear and the anisotropic bandgap makes it possible to achieve anisotropic propagation characteristics along different directions and at different frequencies.

  13. Estimating parameters of speciation models based on refined summaries of the joint site-frequency spectrum.

    Directory of Open Access Journals (Sweden)

    Aurélien Tellier

    Full Text Available Understanding the processes and conditions under which populations diverge to give rise to distinct species is a central question in evolutionary biology. Since recently diverged populations have high levels of shared polymorphisms, it is challenging to distinguish between recent divergence with no (or very low) inter-population gene flow and older splitting events with subsequent gene flow. Recently published methods to infer speciation parameters under the isolation-migration framework are based on summarizing polymorphism data at multiple loci in two species using the joint site-frequency spectrum (JSFS). We have developed two improvements of these methods based on a more extensive use of the JSFS classes of polymorphisms for species with high intra-locus recombination rates. First, using a likelihood based method, we demonstrate that taking into account low-frequency polymorphisms shared between species significantly improves the joint estimation of the divergence time and gene flow between species. Second, we introduce a local linear regression algorithm that considerably reduces the computational time and allows for the estimation of unequal rates of gene flow between species. We also investigate which summary statistics from the JSFS allow the greatest estimation accuracy for divergence time and migration rates for low (around 10) and high (around 100) numbers of loci. Focusing on cases with low numbers of loci and high intra-locus recombination rates, we show that our methods for the estimation of divergence time and migration rates are more precise than existing approaches.
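The joint site-frequency spectrum itself is simple to tally: entry [i, j] counts the sites where the derived allele appears i times in one species and j times in the other. The sketch below uses invented counts purely to show the bookkeeping.

```python
import numpy as np

# Illustrative sketch: tallying a joint site-frequency spectrum (JSFS) from
# per-site derived-allele counts in two species. Entry [i, j] counts sites
# where the derived allele appears i times in species 1 and j times in
# species 2. The data are invented.
n1, n2 = 4, 4                                  # sampled chromosomes per species
counts = [(0, 1), (1, 0), (1, 1), (2, 0), (0, 1), (4, 4), (1, 1)]

jsfs = np.zeros((n1 + 1, n2 + 1), dtype=int)
for i, j in counts:
    jsfs[i, j] += 1

# Low-frequency shared polymorphisms: the JSFS class the authors show is
# especially informative about divergence time versus gene flow.
shared_low_freq = jsfs[1, 1]
```

Summary statistics for inference are then functions of blocks of this matrix (private, shared, and fixed classes).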

  14. Channel and delay estimation for base-station–based cooperative communications in frequency-selective fading channels

    Directory of Open Access Journals (Sweden)

    Hongjun Xu

    2011-07-01

    Full Text Available A channel and delay estimation algorithm for both positive and negative delay, based on the distributed Alamouti scheme, has been recently discussed for base-station–based asynchronous cooperative systems in frequency-flat fading channels. This paper extends the algorithm, the maximum likelihood estimator, to work in frequency-selective fading channels. The minimum mean square error (MMSE) performance of channel estimation for both packet schemes and normal schemes is discussed in this paper. The symbol error rate (SER) performance of equalisation and detection for both time-reversal space-time block code (STBC) and single-carrier STBC is also discussed. The MMSE simulation results demonstrated the superior performance of the packet scheme over the normal scheme, with an improvement of up to 6 dB when feedback was used in the frequency-selective channel at an MSE of 3 × 10^-2. The SER simulation results showed that, although both the normal and packet schemes achieved similar diversity orders, the packet scheme demonstrated a 1 dB coding gain over the normal scheme at an SER of 10^-5. Finally, the SER simulations showed that the frequency-selective fading system outperformed the frequency-flat fading system.
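The Alamouti scheme underlying both STBC variants is compact enough to show directly. The sketch below illustrates Alamouti encoding and combining over a noise-free flat-fading channel; the paper's frequency-selective, time-reversal and single-carrier extensions are not reproduced here.

```python
import numpy as np

# Illustrative sketch: Alamouti space-time block coding with maximum-ratio
# combining over a noise-free flat-fading channel. (The paper extends such
# schemes to frequency-selective channels; that is not reproduced here.)
rng = np.random.default_rng(0)
s1, s2 = (1 + 1j) / np.sqrt(2), (1 - 1j) / np.sqrt(2)          # two QPSK symbols
h1, h2 = rng.standard_normal(2) + 1j * rng.standard_normal(2)  # channel gains

# Two transmit slots: antennas send (s1, s2), then (-conj(s2), conj(s1)).
r1 = h1 * s1 + h2 * s2
r2 = -h1 * np.conj(s2) + h2 * np.conj(s1)

# Linear combining decouples the symbols up to the channel power gain.
g = abs(h1) ** 2 + abs(h2) ** 2
s1_hat = (np.conj(h1) * r1 + h2 * np.conj(r2)) / g
s2_hat = (np.conj(h2) * r1 - h1 * np.conj(r2)) / g
# Noise-free, the estimates equal the transmitted symbols exactly.
```

With noise, the same combiner delivers the two-branch diversity order that the SER comparisons in the abstract refer to.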

  15. Analysis of system and of course of events

    International Nuclear Information System (INIS)

    Hoertner, H.; Kersting, E.J.; Puetter, B.M.

    1986-01-01

    The analysis of the system and of the course of events is used to determine the frequency of core meltdown accidents and to describe the safety-related boundary conditions of the corresponding accidents. The lecture is concerned with the effect of system changes in the reference plant; the effect, on the frequency of core meltdown accidents, of initiating events not assessed in detail (or not assessed in sufficient detail) in Phase A of the German Risk Study; the minimum requirements on system functions for controlling initiating events, i.e. for preventing core meltdown accidents; and the reliability data important for reliability investigations and frequency assessments. (orig./DG) [de

  16. Shallow repeating seismic events under an alpine glacier at Mount Rainier, Washington, USA

    Science.gov (United States)

    Thelen, Weston A.; Allstadt, Kate E.; De Angelis, Silvio; Malone, Stephen D.; Moran, Seth C.; Vidale, John

    2013-01-01

    We observed several swarms of repeating low-frequency (1–5 Hz) seismic events during a 3 week period in May–June 2010, near the summit of Mount Rainier, Washington, USA, that likely were a result of stick–slip motion at the base of alpine glaciers. The dominant set of repeating events ('multiplets') featured >4000 individual events and did not exhibit daytime variations in recurrence interval or amplitude. Volcanoes and glaciers around the world are known to produce seismic signals with great variability in both frequency content and size. The low-frequency character and periodic recurrence of the Mount Rainier multiplets mimic long-period seismicity often seen at volcanoes, particularly during periods of unrest. However, their near-surface location, lack of common spectral peaks across the recording network, rapid attenuation of amplitudes with distance, and temporal correlation with weather systems all indicate that ice-related source mechanisms are the most likely explanation. We interpret the low-frequency character of these multiplets to be the result of trapping of seismic energy under glacial ice as it propagates through the highly heterogeneous and attenuating volcanic material. The Mount Rainier multiplet sequences underscore the difficulties in differentiating low-frequency signals due to glacial processes from those caused by volcanic processes on glacier-clad volcanoes.

  17. SPREAD: a high-resolution daily gridded precipitation dataset for Spain – an extreme events frequency and intensity overview

    Directory of Open Access Journals (Sweden)

    R. Serrano-Notivoli

    2017-09-01

    Full Text Available A high-resolution daily gridded precipitation dataset was built from raw data of 12 858 observatories covering a period from 1950 to 2012 in peninsular Spain and 1971 to 2012 in the Balearic and Canary Islands. The original data were quality-controlled and gaps were filled on each day and location independently. Using the serially complete dataset, a grid with a 5 × 5 km spatial resolution was constructed by estimating daily precipitation amounts and their corresponding uncertainty at each grid node. Daily precipitation estimations were compared to original observations to assess the quality of the gridded dataset. Four daily precipitation indices were computed to characterise the spatial distribution of daily precipitation, and nine extreme precipitation indices were used to describe the frequency and intensity of extreme precipitation events. The Mediterranean coast and the Central Range showed the highest frequency and intensity of extreme events, while the number of wet days and dry and wet spells followed a north-west to south-east gradient in peninsular Spain, from high to low values in the number of wet days and wet spells and the reverse in dry spells. The use of the total available data in Spain, the independent estimation of precipitation for each day and the high spatial resolution of the grid allowed for a precise spatial and temporal assessment of daily precipitation that is difficult to achieve when using other methods, pre-selected long-term stations or global gridded datasets. The SPREAD dataset is publicly available at https://doi.org/10.20350/digitalCSIC/7393.
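Indices of the kind mentioned above reduce a daily series to a few interpretable numbers. A minimal sketch, using the common 1 mm wet-day threshold and an invented ten-day series (not SPREAD's exact index definitions):

```python
# Illustrative sketch: two indices of the kind used to characterise daily
# precipitation: the count of wet days (>= 1 mm) and the longest dry spell
# (consecutive days < 1 mm). The series and threshold choice are assumptions.
daily_mm = [0.0, 0.2, 5.1, 12.4, 0.0, 0.0, 0.0, 3.3, 0.9, 20.5]

WET_THRESHOLD = 1.0   # mm, a common wet-day definition

wet_days = sum(1 for p in daily_mm if p >= WET_THRESHOLD)

longest_dry_spell, run = 0, 0
for p in daily_mm:
    run = run + 1 if p < WET_THRESHOLD else 0
    longest_dry_spell = max(longest_dry_spell, run)
# For this series: 4 wet days, and a longest dry spell of 3 days.
```

Computing such indices per grid node, year by year, yields the spatial frequency and intensity maps the abstract describes.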

  18. Microseismic Event Relocation and Focal Mechanism Estimation Based on PageRank Linkage

    Science.gov (United States)

    Aguiar, A. C.; Myers, S. C.

    2017-12-01

    Microseismicity associated with enhanced geothermal systems (EGS) is key in understanding how subsurface stimulation can modify stress, fracture rock, and increase permeability. Large numbers of microseismic events are commonly associated with hydroshearing an EGS, making data mining methods useful in their analysis. We focus on PageRank, originally developed as Google's search engine, and subsequently adapted for use in seismology to detect low-frequency earthquakes by linking events directly and indirectly through cross-correlation (Aguiar and Beroza, 2014). We expand on this application by using PageRank to define signal-correlation topology for micro-earthquakes from the Newberry Volcano EGS in Central Oregon, which has been stimulated two times using high-pressure fluid injection. We create PageRank signal families from both data sets and compare these to the spatial and temporal proximity of associated earthquakes. PageRank families are relocated using differential travel times measured by waveform cross-correlation (CC) and the Bayesloc approach (Myers et al., 2007). Prior to relocation, events are loosely clustered, with some events at a distance from the cluster. After relocation, event families are found to be tightly clustered. Indirect linkage of signals using PageRank is a reliable way to increase the number of events confidently determined to be similar, suggesting an efficient and effective grouping of earthquakes with similar physical characteristics (i.e., location, focal mechanism, stress drop). We further explore the possibility of using PageRank families to identify events with similar relative phase polarities and estimate focal mechanisms following the method of Shelly et al. (2016), where CC measurements are used to determine individual polarities within event clusters. Given a positive result, PageRank might be a useful tool in adaptive approaches to enhance production at well-instrumented geothermal sites.
Prepared by LLNL under Contract DE-AC52-07NA27344
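PageRank itself reduces to a power iteration on a transition matrix. A minimal sketch on a tiny invented event-similarity graph, where edge weights stand in for waveform cross-correlation links (not the Newberry data or the authors' code):

```python
import numpy as np

# Illustrative sketch: PageRank by power iteration on a small invented
# event-similarity graph; edge weights stand in for cross-correlation links.
# Highly ranked events are strongly (directly or indirectly) linked to others.
W = np.array([[0, 1, 1, 0],       # symmetric similarity (adjacency) matrix
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

d = 0.85                                       # damping factor
P = W / W.sum(axis=0)                          # column-stochastic transition matrix
rank = np.full(4, 0.25)                        # uniform starting distribution
for _ in range(100):
    rank = (1 - d) / 4 + d * P @ rank

top_event = int(np.argmax(rank))               # event 2 is linked to all others
```

Events that rank highly together form a "family" even when some pairs are only indirectly linked, which is the grouping behavior exploited above.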

  19. Initiating Event Rates at U.S. Nuclear Power Plants. 1988 - 2013

    International Nuclear Information System (INIS)

    Schroeder, John A.; Bower, Gordon R.

    2014-01-01

    Analyzing initiating event rates is important because it indicates performance among plants and also provides inputs to several U.S. Nuclear Regulatory Commission (NRC) risk-informed regulatory activities. This report presents an analysis of initiating event frequencies at U.S. commercial nuclear power plants since each plant's low-power license date. The evaluation is based on the operating experience from fiscal year 1988 through 2013 as reported in licensee event reports. Staff engineers with nuclear power plant experience reviewed each event report since the last update to this report for the presence of valid scrams or reactor trips at power. To be included in the study, an event had to meet all of the following criteria: it includes an unplanned reactor trip (not a scheduled reactor trip on the daily operations schedule), the sequence of events starts when the reactor is critical and at or above the point of adding heat, it occurs at a U.S. commercial nuclear power plant (excluding Fort St. Vrain and LaCrosse), and it is reported by a licensee event report. This report displays occurrence rates (baseline frequencies) for the categories of initiating events that contribute to the NRC's Industry Trends Program. Sixteen initiating event groupings are trended and displayed. Initiators are plotted separately for initiating events with different occurrence rates for boiling water reactors and pressurized water reactors. p-values are given for the possible presence of a trend over the most recent 10 years.
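A baseline occurrence rate of this kind is an event count divided by exposure time. A minimal sketch with invented numbers (not taken from the report), using a simple Poisson normal-approximation uncertainty band rather than the report's statistical methodology:

```python
import math

# Illustrative sketch (invented numbers, not the report's data or methods):
# a baseline occurrence rate for one initiating-event category, as events
# per reactor critical year, with a rough 95% normal-approximation band.
events = 14                   # unplanned reactor trips in the category
reactor_years = 650.0         # accumulated critical years across the fleet

rate = events / reactor_years                     # events per reactor-year
std_err = math.sqrt(events) / reactor_years       # Poisson standard error
lower, upper = rate - 1.96 * std_err, rate + 1.96 * std_err
# rate ~ 0.022 per reactor-year, with a rough 95% band of about (0.010, 0.033)
```

Trend testing, as in the report's 10-year p-values, then asks whether such yearly rates drift rather than scatter around a constant baseline.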

  20. Neural correlates of attentional and mnemonic processing in event-based prospective memory

    Directory of Open Access Journals (Sweden)

    Justin B Knight

    2010-02-01

    Full Text Available Prospective memory, or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those perhaps associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT), followed by a LDT with an embedded prospective memory (PM) component. Event-based cues were constituted by color and lexicality (red words). Behavioral data provided evidence that monitoring, or preparatory attentional processes, were used to detect cues. Analysis of the event-related potentials (ERP) revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest preparatory attention may operate by selectively modulating processing of features related to a previously formed event-based intention, as well as provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  1. Detection of goal events in soccer videos

    Science.gov (United States)

    Kim, Hyoung-Gook; Roeber, Steffen; Samour, Amjad; Sikora, Thomas

    2005-01-01

    In this paper, we present an automatic extraction of goal events in soccer videos by using audio track features alone, without relying on expensive-to-compute video track features. The extracted goal events can be used for high-level indexing and selective browsing of soccer videos. The detection of soccer video highlights using audio contents comprises three steps: 1) extraction of audio features from a video sequence, 2) candidate detection of highlight events based on the information provided by the feature extraction methods and the Hidden Markov Model (HMM), 3) goal event selection to finally determine the video intervals to be included in the summary. For this purpose we compared the performance of the well-known Mel-scale Frequency Cepstral Coefficients (MFCC) feature extraction method vs. the MPEG-7 Audio Spectrum Projection (ASP) feature extraction method based on three different decomposition methods, namely Principal Component Analysis (PCA), Independent Component Analysis (ICA) and Non-Negative Matrix Factorization (NMF). To evaluate our system we collected five soccer game videos from various sources. In total we have seven hours of soccer games consisting of eight gigabytes of data. One of the five soccer games is used as the training data (e.g., announcers' excited speech, audience ambient speech noise, audience clapping, environmental sounds). Our goal event detection results are encouraging.
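The HMM scoring step can be illustrated with the forward algorithm, which computes the likelihood of an observation sequence under a model. The states, probabilities, and quantised observations below are invented for illustration; the paper's models are trained on real audio features.

```python
import numpy as np

# Illustrative sketch: the HMM forward algorithm used to score a sequence of
# quantised audio observations against a highlight model. All probabilities
# and the observation alphabet {0: quiet, 1: loud} are invented.
# States: 0 = ambient crowd noise, 1 = excited speech / cheering.
A = np.array([[0.8, 0.2],     # state transition probabilities
              [0.3, 0.7]])
B = np.array([[0.7, 0.3],     # emission probabilities per state
              [0.2, 0.8]])
pi = np.array([0.9, 0.1])     # initial state distribution

def forward_likelihood(obs):
    """P(observation sequence | model), by the forward recursion."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

# Score candidate segments; a goal-event model would assign runs of loud
# frames a higher likelihood than an ambient-noise model would.
p_loud = forward_likelihood([1, 1, 1, 1])
p_quiet = forward_likelihood([0, 0, 0, 0])
```

Candidate highlight intervals are then the segments whose likelihood under the highlight model exceeds that under competing models.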

  2. Analyzing mobile WiMAX base station deployment under different frequency planning strategies

    Science.gov (United States)

    Salman, M. K.; Ahmad, R. B.; Ali, Ziad G.; Aldhaibani, Jaafar A.; Fayadh, Rashid A.

    2015-05-01

    The frequency spectrum is a precious resource and scarce in the communication markets. Therefore, different techniques are adopted to utilize the available spectrum in deploying WiMAX base stations (BS) in cellular networks. In this paper several types of frequency planning techniques are illustrated, and a comprehensive comparative study between conventional frequency reuse of 1 (FR of 1) and fractional frequency reuse (FFR) is presented. These techniques are widely used in network deployment because they employ a universal frequency (using all the available bandwidth) in base station installation/configuration within the network. This paper presents a network model of 19 base stations in order to compare the aforesaid frequency planning techniques. Users are randomly distributed within base stations, and users' resource mapping and their burst profile selection are based on the measured signal to interference plus noise ratio (SINR). Simulation results reveal that FFR has advantages over conventional FR of 1 in various metrics. 98 % of downlink resources (slots) are exploited when FFR is applied, whilst it is 81 % at FR of 1. The data rate under FFR increased to 10.6 Mbps, while it is 7.98 Mbps at FR of 1. The spectral efficiency is better (1.072 bps/Hz) at FR of 1 than FFR (0.808 bps/Hz), since FR of 1 exploits all the bandwidth. The subcarrier efficiency shows how many data bits can be carried by subcarriers under different frequency planning techniques; the system can carry more data bits under FFR (2.40 bit/subcarrier) than FR of 1 (1.998 bit/subcarrier). This study confirms that FFR can perform better than conventional frequency planning (FR of 1), which makes it a strong candidate for WiMAX BS deployment in cellular networks.
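The SINR-based burst profile selection mentioned above amounts to picking the most efficient modulation and coding scheme whose SINR requirement a user meets. The thresholds and efficiencies below are representative values, not taken from the paper:

```python
# Illustrative sketch: mapping a user's measured SINR to a burst profile
# (modulation and coding) when assigning downlink slots. The thresholds and
# bits-per-subcarrier figures are representative assumptions, not the paper's.
BURST_PROFILES = [          # (minimum SINR in dB, name, bits per data subcarrier)
    (21.0, "64-QAM 3/4", 4.5),
    (16.0, "16-QAM 3/4", 3.0),
    (11.0, "QPSK 3/4",   1.5),
    (6.0,  "QPSK 1/2",   1.0),
]

def select_burst_profile(sinr_db):
    """Pick the most efficient profile whose SINR requirement is met."""
    for threshold, name, bits in BURST_PROFILES:
        if sinr_db >= threshold:
            return name, bits
    return "outage", 0.0

name, bits = select_burst_profile(18.2)   # a mid-cell user gets 16-QAM 3/4
```

Because FFR raises cell-edge SINR, more users qualify for the higher-order profiles, which is why the average bits per subcarrier improve even though the usable bandwidth per sector shrinks.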

  3. Event-building and PC farm based level-3 trigger at the CDF experiment

    CERN Document Server

    Anikeev, K; Furic, I K; Holmgren, D; Korn, A J; Kravchenko, I V; Mulhearn, M; Ngan, P; Paus, C; Rakitine, A; Rechenmacher, R; Shah, T; Sphicas, Paris; Sumorok, K; Tether, S; Tseng, J

    2000-01-01

    In the technical design report the event building process at Fermilab's CDF experiment is required to function at an event rate of 300 events/sec. The events are expected to have an average size of 150 kBytes (kB) and are assembled from fragments of 16 readout locations. The fragment size from the different locations varies between 12 kB and 16 kB. Once the events are assembled they are fed into the Level-3 trigger which is based on processors running programs to filter events using the full event information. Computing power on the order of a second on a Pentium II processor is required per event. The architecture design is driven by the cost and is therefore based on commodity components: VME processor modules running VxWorks for the readout, an ATM switch for the event building, and Pentium PCs running Linux as an operation system for the Level-3 event processing. Pentium PCs are also used to receive events from the ATM switch and further distribute them to the processing nodes over multiple 100 Mbps Ether...
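The aggregate load these requirements imply is a quick back-of-the-envelope calculation, using only the figures quoted in the record:

```python
# Back-of-the-envelope sketch of the event-building load implied by the
# numbers quoted in the record: 300 events/s at an average assembled event
# size of 150 kB, built from fragments of 16 readout locations.
event_rate = 300          # events per second
event_size_kb = 150       # average assembled event size, kB
fragments = 16            # readout locations contributing per event

aggregate_kb_per_s = event_rate * event_size_kb   # 45,000 kB/s, i.e. 45 MB/s
avg_fragment_kb = event_size_kb / fragments       # ~9.4 kB average per fragment
# The ATM switch must therefore sustain roughly 45 MB/s of assembled traffic.
```

That sustained 45 MB/s, plus roughly a CPU-second of Level-3 filtering per event, is what drives the choice of a switched network feeding a farm of commodity PCs.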

  4. Rapid Active Power Control of Photovoltaic Systems for Grid Frequency Support

    Energy Technology Data Exchange (ETDEWEB)

    Hoke, Anderson; Shirazi, Mariko; Chakraborty, Sudipta; Muljadi, Eduard; Maksimovic, Dragan

    2017-01-01

    As deployment of power electronic coupled generation such as photovoltaic (PV) systems increases, grid operators have shown increasing interest in calling on inverter-coupled generation to help mitigate frequency contingency events by rapidly surging active power into the grid. When responding to contingency events, the faster the active power is provided, the more effective it may be for arresting the frequency event. This paper proposes a predictive PV inverter control method for very fast and accurate control of active power. This rapid active power control method will increase the effectiveness of various higher-level controls designed to mitigate grid frequency contingency events, including fast power-frequency droop, inertia emulation, and fast frequency response, without the need for energy storage. The rapid active power control method, coupled with a maximum power point estimation method, is implemented in a prototype PV inverter connected to a PV array. The prototype inverter's response to various frequency events is experimentally confirmed to be fast (beginning within 2 line cycles and completing within 4.5 line cycles of a severe test event) and accurate (below 2% steady-state error).
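A fast power-frequency droop, one of the higher-level controls the rapid active power capability enables, can be sketched as a simple rule: hold the curtailed setpoint inside a deadband, then surge or shed power in proportion to the frequency deviation. All parameter values below are invented, not taken from the prototype:

```python
# Illustrative sketch: a power-frequency droop rule of the kind the rapid
# active power control enables. Inside the deadband the PV plant holds its
# curtailed setpoint; outside it, power moves toward the available maximum
# (under-frequency) or toward zero (over-frequency). Parameters are invented.
F_NOM = 60.0        # Hz, nominal grid frequency
DEADBAND = 0.036    # Hz, no response to small deviations
DROOP = 0.05        # 5% droop: full power swing per 5% frequency deviation

def droop_power(freq_hz, p_curtailed, p_available):
    """Active-power setpoint (per-unit) in response to a frequency deviation."""
    error = F_NOM - freq_hz
    if abs(error) <= DEADBAND:
        return p_curtailed                     # inside the deadband: hold
    surge = (abs(error) - DEADBAND) / (DROOP * F_NOM)
    if error > 0:                              # under-frequency: add power
        return min(p_curtailed + surge, p_available)
    return max(p_curtailed - surge, 0.0)       # over-frequency: shed power

# A 0.3 Hz sag commands (0.3 - 0.036) / 3.0, about 0.088 pu of extra power.
setpoint = droop_power(59.7, p_curtailed=0.5, p_available=0.9)
```

The value of the inverter control in the paper is that such a setpoint change is actually realized within a few line cycles rather than seconds.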

  5. High-performance radio frequency transistors based on diameter-separated semiconducting carbon nanotubes

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Yu; Che, Yuchi; Zhou, Chongwu, E-mail: chongwuz@usc.edu [Department of Electrical Engineering, University of Southern California, Los Angeles, California 90089 (United States); Seo, Jung-Woo T.; Hersam, Mark C. [Department of Materials Science and Engineering and Department of Chemistry, Northwestern University, Evanston, Illinois 60208 (United States); Gui, Hui [Department of Chemical Engineering and Materials Science, University of Southern California, Los Angeles, California 90089 (United States)

    2016-06-06

    In this paper, we report high-performance radio-frequency transistors based on single-walled semiconducting carbon nanotubes with a refined average diameter of ∼1.6 nm. These diameter-separated carbon nanotube transistors show an excellent transconductance of 55 μS/μm and desirable drain current saturation with an output resistance of ∼100 kΩ μm. Exceptional radio-frequency performance is also achieved, with current gain and power gain cut-off frequencies of 23 GHz and 20 GHz (extrinsic) and 65 GHz and 35 GHz (intrinsic), respectively. These radio-frequency metrics are among the highest reported for carbon nanotube thin-film transistors. This study provides a demonstration of radio-frequency transistors based on carbon nanotubes with tailored diameter distributions, which will guide the future application of carbon nanotubes in radio-frequency electronics.

  6. Paleo-event data standards for dendrochronology

    Science.gov (United States)

    Elaine Kennedy Sutherland; P. Brewer; W. Gross

    2017-01-01

    Extreme environmental events, such as storm winds, landslides, insect infestations, and wildfire, cause loss of life, resources, and human infrastructure. Disaster risk-reduction analysis can be improved with information about past frequency, intensity, and spatial patterns of extreme events. Tree-ring analyses can provide such information: tree rings reflect events as...

  7. ELF whistler events with a reduced intensity observed by the DEMETER spacecraft

    Science.gov (United States)

    Zahlava, J.; Nemec, F.; Santolik, O.; Kolmasova, I.; Parrot, M.

    2017-12-01

    A survey of VLF frequency-time spectrograms obtained by the DEMETER spacecraft (2004-2010, altitude about 700 km) revealed that the intensity of fractional-hop whistlers is sometimes significantly reduced at specific frequencies. These frequencies are typically above about 3.4 kHz (the second cutoff frequency of the Earth-ionosphere waveguide), and they vary smoothly in time. The events were explained by wave propagation in the Earth-ionosphere waveguide and the resulting interference of the first few waveguide modes. We analyze events whose frequency-time structure is rather similar but that occur at frequencies below 1 kHz. Altogether, 284 events are identified during periods with the active Burst mode, when high-resolution data are measured by DEMETER. The vast majority of events (93%) occur during the nighttime. All six electromagnetic field components are available, which allows us to perform a detailed wave analysis. An overview of the properties of these events is presented, and their possible origin is discussed.

  8. Microresonator-Based Optical Frequency Combs: A Time Domain Perspective

    Science.gov (United States)

    2016-04-19

    AFRL-AFOSR-VA-TR-2016-0165 (BRI): "Microresonator-Based Optical Frequency Combs: A Time Domain Perspective." Andrew Weiner, Purdue University. Grant FA9550-12-1-0236.

  9. Construction and Quantification of the One Top model of the Fire Events PSA

    International Nuclear Information System (INIS)

    Kang, Dae Il; Lee, Yoon Hwan; Han, Sang Hoon

    2008-01-01

    KAERI constructed the one top model of the fire events PSA for Ulchin Units 3 and 4 by using the 'mapping technique'. The mapping technique was developed for the construction and quantification of external events PSA models with a one top model of an internal events PSA. With 'AIMS', the mapping technique can be implemented through the construction of mapping tables. The mapping tables include fire rooms, fire ignition frequencies, related initiating events, fire transfer events, and the internal PSA basic events affected by a fire. The constructed one top fire PSA model is based on previously conducted fire PSA results for Ulchin Units 3 and 4. In this paper, we introduce the construction procedure and quantification results of the one top model of the fire events PSA using the mapping technique. As the one top model of the fire events PSA developed in this study is based on the previous study, we also introduce the previous fire PSA approach, focusing on its quantification.

  10. Synthesis of High-Frequency Ground Motion Using Information Extracted from Low-Frequency Ground Motion

    Science.gov (United States)

    Iwaki, A.; Fujiwara, H.

    2012-12-01

    Broadband ground motion computations of scenario earthquakes are often based on hybrid methods that combine a deterministic approach in the lower frequency band with a stochastic approach in the higher frequency band. Typical computation methods for low-frequency and high-frequency (LF and HF, respectively) ground motions are numerical simulations, such as finite-difference and finite-element methods based on a three-dimensional velocity structure model, and the stochastic Green's function method, respectively. In such hybrid methods, the LF and HF wave fields are generated through two different methods that are completely independent of each other and are combined at the matching frequency. However, LF and HF wave fields are essentially not independent as long as they originate from the same event. In this study, we focus on the relation among acceleration envelopes at different frequency bands and attempt to synthesize HF ground motion using information extracted from LF ground motion, aiming to propose a new method for broadband strong motion prediction. Our study area is the Kanto area, Japan. We use the K-NET and KiK-net surface acceleration data and compute RMS envelopes in five frequency bands: 0.5-1.0 Hz, 1.0-2.0 Hz, 2.0-4.0 Hz, 4.0-8.0 Hz, and 8.0-16.0 Hz. Taking the ratio of the envelopes of adjacent bands, we find that the envelope ratios have stable shapes at each site. The empirical envelope-ratio characteristics are combined with the low-frequency envelope of the target earthquake to synthesize HF ground motion. We have applied the method to M5-class earthquakes and an M7 target earthquake that occurred in the vicinity of the Kanto area, and successfully reproduced the observed HF ground motion of the target earthquake.
    The method can be applied to broadband ground motion simulation for a scenario earthquake by combining numerically computed low-frequency (~1 Hz) ground motion with the empirical envelope-ratio characteristics to generate broadband ground motion.
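The band-envelope computation described above can be illustrated with a short numpy/scipy sketch (not the authors' code; a synthetic white-noise record stands in for K-NET data, and the window length is an assumption). It computes moving RMS envelopes in the five frequency bands and forms the adjacent-band ratios that the study finds to be stable at each site:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def rms_envelope(x, fs, band, win_sec=1.0):
    """Band-pass filter x, then take a moving RMS envelope."""
    sos = butter(4, band, btype="bandpass", fs=fs, output="sos")
    y = sosfiltfilt(sos, x)
    n = max(1, int(win_sec * fs))
    return np.sqrt(np.convolve(y**2, np.ones(n) / n, mode="same"))

# Synthetic broadband record sampled at 100 Hz (stand-in for an accelerogram).
rng = np.random.default_rng(0)
fs = 100.0
x = rng.standard_normal(60 * int(fs))

bands = [(0.5, 1.0), (1.0, 2.0), (2.0, 4.0), (4.0, 8.0), (8.0, 16.0)]
envs = [rms_envelope(x, fs, b) for b in bands]

# Ratios of adjacent-band envelopes: the quantity the study finds to be
# stable at each site.  Applying an empirically averaged ratio to the
# low-frequency envelope of a target event predicts its HF envelope.
ratios = [hi / lo for lo, hi in zip(envs[:-1], envs[1:])]
```

In practice the ratios would be averaged over many past events at a site before being applied to a new event's low-frequency envelope.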

  11. Web-based online system for recording and examining events in power plants

    International Nuclear Information System (INIS)

    Seyd Farshi, S.; Dehghani, M.

    2004-01-01

    Occurrence of events in power plants can result in serious drawbacks in the generation of power, which suggests a high degree of importance for online recording and examining of events. In this paper an online web-based system is introduced, which records and examines events in power plants. Throughout the paper, the procedures for the design and implementation of this system, its features, and the results gained are explained. This system provides predefined levels of online access to all event data for all of its users in power plants, dispatching centers, regional utilities, and top-level management. By implementation of an electric power industry intranet, an expandable modular system to be used in different sectors of the industry is offered. The web-based online event recording and examining system offers the following advantages: online recording of events in power plants; examination of events in regional utilities; access to event data; and preparation of managerial reports.

  12. High frequency and large deposition of acid fog on high elevation forest.

    Science.gov (United States)

    Igawa, Manabu; Matsumura, Ko; Okochi, Hiroshi

    2002-01-01

    We have collected and analyzed fogwater on the mountainside of Mt. Oyama (1252 m) in the Tanzawa Mountains of Japan and observed the fog event frequency from the base of the mountain with a video camera. The fog event frequency increased with elevation and was observed to be present 46% of the year at the summit. The water deposition via throughfall increased with elevation because of the increase in fogwater interception and was about twice that via rain at the summit, where the air pollutant deposition via throughfall was several times that via rainwater. The dry deposition and the deposition via fogwater were dominant factors in the total ion deposition at high elevation sites. In a fog event, nitric acid, the major acid component on the mountain, is formed during the transport of the air mass from the base of the mountain along the mountainside, where gases including nitric acid deposit and are scavenged by fogwater. Therefore, high acidity caused by nitric acid and relatively low ion strength are observed in the fogwater at high elevation sites.

  13. Calibration of semi-stochastic procedure for simulating high-frequency ground motions

    Science.gov (United States)

    Seyhan, Emel; Stewart, Jonathan P.; Graves, Robert

    2013-01-01

    Broadband ground motion simulation procedures typically utilize physics-based modeling at low frequencies, coupled with semi-stochastic procedures at high frequencies. The high-frequency procedure considered here combines deterministic Fourier amplitude spectra (dependent on source, path, and site models) with random phase. Previous work showed that high-frequency intensity measures from this simulation methodology attenuate faster with distance and have lower intra-event dispersion than in empirical equations. We address these issues by increasing crustal damping (Q) to reduce distance attenuation bias and by introducing random site-to-site variations to Fourier amplitudes using a lognormal standard deviation ranging from 0.45 for Mw  100 km).
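The site-to-site perturbation described above can be sketched in a few lines (a hedged illustration, not the authors' implementation; the spectrum values and frequencies are hypothetical, and the 0.45 standard deviation is the small-magnitude value quoted in the abstract):

```python
import numpy as np

rng = np.random.default_rng(42)

# Deterministic Fourier amplitude spectrum from the source/path/site
# model (hypothetical values, arbitrary units).
freqs = np.array([1.0, 2.0, 5.0, 10.0, 20.0])  # Hz
fas = np.array([0.8, 1.0, 0.9, 0.6, 0.3])

# One lognormal factor per site/realization: a single multiplicative
# term shifts the whole spectrum, adding site-to-site (inter-event-site)
# dispersion without changing the median spectral shape.
sigma_ln = 0.45  # magnitude-dependent in the study; fixed here
site_factor = rng.lognormal(mean=0.0, sigma=sigma_ln)
fas_site = fas * site_factor
```

Because the factor is lognormal with zero log-mean, the median of the perturbed spectra over many realizations equals the deterministic spectrum.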

  14. Cognitive load and task condition in event- and time-based prospective memory: an experimental investigation.

    Science.gov (United States)

    Khan, Azizuddin; Sharma, Narendra K; Dixit, Shikha

    2008-09-01

    Prospective memory is memory for the realization of delayed intention. Researchers distinguish 2 kinds of prospective memory: event- and time-based (G. O. Einstein & M. A. McDaniel, 1990). Taking that distinction into account, the present authors explored participants' comparative performance under event- and time-based tasks. In an experimental study of 80 participants, the authors investigated the roles of cognitive load and task condition in prospective memory. Cognitive load (low vs. high) and task condition (event- vs. time-based task) were the independent variables. Accuracy in prospective memory was the dependent variable. Results showed significant differential effects under event- and time-based tasks. However, the effect of cognitive load was more detrimental in time-based prospective memory. Results also revealed that time monitoring is critical in successful performance of time estimation and so in time-based prospective memory. Similarly, participants' better performance on the event-based prospective memory task showed that they acted on the basis of environment cues. Event-based prospective memory was environmentally cued; time-based prospective memory required self-initiation.

  15. A semi-supervised learning framework for biomedical event extraction based on hidden topics.

    Science.gov (United States)

    Zhou, Deyu; Zhong, Dayou

    2015-05-01

    Scientists have devoted decades of effort to understanding the interactions between proteins or RNA production. This information might enrich current knowledge of drug reactions or the development of certain diseases. Nevertheless, because it lacks explicit structure, the life-science literature, one of the most important sources of this information, prevents computer-based systems from accessing it. Therefore, biomedical event extraction, which automatically acquires knowledge of molecular events from research articles, has recently attracted community-wide efforts. Most approaches are based on statistical models and require large-scale annotated corpora to precisely estimate the models' parameters; however, such corpora are usually difficult to obtain in practice. Therefore, employing un-annotated data through semi-supervised learning for biomedical event extraction is a feasible solution that is attracting growing interest. In this paper, a semi-supervised learning framework based on hidden topics for biomedical event extraction is presented. In this framework, sentences in the un-annotated corpus are elaborately and automatically assigned event annotations based on their distances to the sentences in the annotated corpus. More specifically, not only the structures of the sentences but also the hidden topics embedded in the sentences are used to describe the distance. The sentences and newly assigned event annotations, together with the annotated corpus, are employed for training. Experiments were conducted on the multi-level event extraction corpus, a gold-standard corpus. Experimental results show that the proposed framework achieves an improvement of more than 2.2% in F-score on biomedical event extraction when compared to the state-of-the-art approach. The results suggest that by incorporating un-annotated data, the proposed framework indeed improves the performance of the state-of-the-art event extraction system and the similarity between sentences might be precisely

  16. Frequency-shaped and observer-based discrete-time sliding mode control

    CERN Document Server

    Mehta, Axaykumar

    2015-01-01

    It is well established that the sliding mode control strategy provides an effective and robust method of controlling deterministic systems due to its well-known invariance property with respect to a class of bounded disturbances and parameter variations. Advances in microcomputer technologies have made digital control increasingly popular among researchers worldwide, which has led to the study of discrete-time sliding mode control design and its implementation. This brief presents a method for multi-rate frequency-shaped sliding mode controller design based on switching and non-switching types of reaching law. In this approach, the frequency-dependent compensator dynamics are introduced through a frequency-shaped sliding surface by assigning frequency-dependent weighting matrices in a linear quadratic regulator (LQR) design procedure. In this way, undesired high-frequency dynamics or disturbances at certain frequencies can be eliminated. The states are implicitly obtained by measuring the output at a faster rate than th...

  17. A Fourier analysis of extreme events

    DEFF Research Database (Denmark)

    Mikosch, Thomas Valentin; Zhao, Yuwei

    2014-01-01

    The extremogram is an asymptotic correlogram for extreme events constructed from a regularly varying stationary sequence. In this paper, we define a frequency-domain analog of the correlogram: a periodogram generated from a suitable sequence of indicator functions of rare events. We derive basic properties of the periodogram, such as the asymptotic independence at the Fourier frequencies, and use this property to show that weighted versions of the periodogram are consistent estimators of a spectral density derived from the extremogram.

  18. Event-Based H∞ State Estimation for Time-Varying Stochastic Dynamical Networks With State- and Disturbance-Dependent Noises.

    Science.gov (United States)

    Sheng, Li; Wang, Zidong; Zou, Lei; Alsaadi, Fuad E

    2017-10-01

    In this paper, the event-based finite-horizon H∞ state estimation problem is investigated for a class of discrete time-varying stochastic dynamical networks with state- and disturbance-dependent noises [also called (x,v)-dependent noises]. An event-triggered scheme is proposed to decrease the frequency of data transmission between the sensors and the estimator, whereby the signal is transmitted only when certain conditions are satisfied. The purpose of the problem addressed is to design a time-varying state estimator that estimates the network states through the available output measurements. By employing the completing-the-square technique and the stochastic analysis approach, sufficient conditions are established to ensure that the error dynamics of the state estimation satisfy a prescribed H∞ performance constraint over a finite horizon. The desired estimator parameters can be designed by solving coupled backward recursive Riccati difference equations. Finally, a numerical example is exploited to demonstrate the effectiveness of the developed state estimation scheme.

  19. Networked Estimation for Event-Based Sampling Systems with Packet Dropouts

    Directory of Open Access Journals (Sweden)

    Young Soo Suh

    2009-04-01

    This paper is concerned with a networked estimation problem in which sensor data are transmitted over the network. In the event-based sampling scheme known as level-crossing or send-on-delta (SOD) sampling, sensor data are transmitted to the estimator node only if the difference between the current sensor value and the last transmitted one is greater than a given threshold. Event-based sampling has been shown to be more efficient than time-triggered sampling in some situations, especially in terms of network bandwidth. However, it cannot detect packet dropouts, because data transmission and reception do not use a periodic time-stamp mechanism as found in time-triggered sampling systems. Motivated by this issue, we propose a modified event-based sampling scheme, called modified SOD, in which sensor data are sent when either the change in sensor output exceeds a given threshold or more than a given interval has elapsed since the last transmission. Through simulation results, we show that the proposed modified SOD sampling significantly improves estimation performance when packet dropouts happen.
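The modified SOD rule just described can be sketched in a few lines (an illustrative reimplementation, not the authors' code; the class name and parameter values are invented for this example):

```python
class ModifiedSOD:
    """Send-on-delta sampler with a timeout ('modified SOD'):
    transmit when the change since the last transmitted value exceeds
    `delta`, OR when more than `max_interval` time units have elapsed
    since the last transmission."""

    def __init__(self, delta, max_interval):
        self.delta = delta
        self.max_interval = max_interval
        self.last_value = None
        self.last_time = None

    def step(self, t, value):
        """Return True if the sample (t, value) should be transmitted."""
        send = (
            self.last_value is None                      # first sample
            or abs(value - self.last_value) > self.delta  # SOD condition
            or (t - self.last_time) > self.max_interval   # timeout
        )
        if send:
            self.last_value, self.last_time = value, t
        return send

# A slowly drifting signal: plain SOD would stay silent, but the timeout
# forces periodic "heartbeat" transmissions that expose packet dropouts.
sampler = ModifiedSOD(delta=0.5, max_interval=2.0)
sent = [t for t in range(10) if sampler.step(t, 0.01 * t)]  # → [0, 3, 6, 9]
```

The heartbeat gives the estimator an upper bound on the silence it should ever observe, so a longer gap can be attributed to a dropped packet rather than an unchanging signal.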

  20. Rocchio-based relevance feedback in video event retrieval

    NARCIS (Netherlands)

    Pingen, G.L.J.; de Boer, M.H.T.; Aly, Robin; Amsaleg, Laurent; Guðmundsson, Gylfi Þór; Gurrin, Cathal; Jónsson, Björn Þór; Satoh, Shin’ichi

    This paper investigates methods for user and pseudo relevance feedback in video event retrieval. Existing feedback methods achieve strong performance but adjust the ranking based on few individual examples. We propose a relevance feedback algorithm (ARF) derived from the Rocchio method, which is a

  1. Intense high-frequency gyrotron-based microwave beams for material processing

    Energy Technology Data Exchange (ETDEWEB)

    Hardek, T.W.; Cooke, W.D.; Katz, J.D.; Perry, W.L.; Rees, D.E.

    1997-03-01

    Microwave processing of materials has traditionally utilized frequencies in the 0.915 and 2.45 GHz regions. Microwave power sources are readily available at these frequencies, but the relatively long wavelengths can present challenges in uniformly heating materials. An additional difficulty is the poor coupling of ceramic-based materials to the microwave energy. Los Alamos National Laboratory scientists, working in conjunction with the National Center for Manufacturing Sciences (NCMS), have assembled a high-frequency demonstration processing facility utilizing gyrotron-based RF sources. The facility is primarily intended to demonstrate the unique features available at frequencies as high as 84 GHz. The authors can readily provide quasi-optical 37 GHz beams at continuous-wave (CW) power levels in the 10 kW range. They have also provided 84 GHz beams at 10 kW CW power levels. They are presently preparing a facility to demonstrate the sintering of ceramics at 30 GHz. This paper presents an overview of the present demonstration processing facility and describes some of the features available now and in the near future.

  2. Nonlinear optics of fibre event horizons.

    Science.gov (United States)

    Webb, Karen E; Erkintalo, Miro; Xu, Yiqing; Broderick, Neil G R; Dudley, John M; Genty, Goëry; Murdoch, Stuart G

    2014-09-17

    The nonlinear interaction of light in an optical fibre can mimic the physics at an event horizon. This analogue arises when a weak probe wave is unable to pass through an intense soliton, despite propagating at a different velocity. To date, these dynamics have been described in the time domain in terms of a soliton-induced refractive index barrier that modifies the velocity of the probe. Here we complete the physical description of fibre-optic event horizons by presenting a full frequency-domain description in terms of cascaded four-wave mixing between discrete single-frequency fields, and experimentally demonstrate signature frequency shifts using continuous wave lasers. Our description is confirmed by the remarkable agreement with experiments performed in the continuum limit, reached using ultrafast lasers. We anticipate that clarifying the description of fibre event horizons will significantly impact on the description of horizon dynamics and soliton interactions in photonics and other systems.

  3. Physiologically-based toxicokinetic models help identifying the key factors affecting contaminant uptake during flood events

    International Nuclear Information System (INIS)

    Brinkmann, Markus; Eichbaum, Kathrin; Kammann, Ulrike; Hudjetz, Sebastian; Cofalla, Catrina; Buchinger, Sebastian; Reifferscheid, Georg; Schüttrumpf, Holger; Preuss, Thomas

    2014-01-01

    Highlights: • A PBTK model for trout was coupled with a sediment equilibrium partitioning model. • The influence of physical exercise on pollutant uptake was studied using the model. • Physical exercise during flood events can increase the level of biliary metabolites. • Cardiac output and effective respiratory volume were identified as relevant factors. • These confounding factors need to be considered also for bioconcentration studies. - Abstract: As a consequence of global climate change, we are likely to face an increasing frequency and intensity of flood events. Thus, the ecotoxicological relevance of sediment re-suspension is of growing concern. It is vital to understand contaminant uptake from suspended sediments and relate it to effects in aquatic biota. Here we report on a computational study that utilizes a physiologically based toxicokinetic model to predict uptake, metabolism and excretion of sediment-borne pyrene in rainbow trout (Oncorhynchus mykiss). To this end, data from two experimental studies were compared with the model predictions: (a) batch re-suspension experiments with a constant concentration of suspended particulate matter at two different temperatures (12 and 24 °C), and (b) simulated flood events in an annular flume. The model predicted well both the final concentrations and the kinetics of 1-hydroxypyrene secretion into the gall bladder of exposed rainbow trout. We were able to show that exhaustive exercise during exposure in simulated flood events can lead to increased levels of biliary metabolites, and identified cardiac output and effective respiratory volume as the two most important factors for contaminant uptake. The results of our study clearly demonstrate the relevance and necessity of investigating uptake of contaminants from suspended sediments under realistic exposure scenarios.

  4. Physiologically-based toxicokinetic models help identifying the key factors affecting contaminant uptake during flood events

    Energy Technology Data Exchange (ETDEWEB)

    Brinkmann, Markus; Eichbaum, Kathrin [Department of Ecosystem Analysis, Institute for Environmental Research,ABBt – Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); Kammann, Ulrike [Thünen-Institute of Fisheries Ecology, Palmaille 9, 22767 Hamburg (Germany); Hudjetz, Sebastian [Department of Ecosystem Analysis, Institute for Environmental Research,ABBt – Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Cofalla, Catrina [Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Buchinger, Sebastian; Reifferscheid, Georg [Federal Institute of Hydrology (BFG), Department G3: Biochemistry, Ecotoxicology, Am Mainzer Tor 1, 56068 Koblenz (Germany); Schüttrumpf, Holger [Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Preuss, Thomas [Department of Environmental Biology and Chemodynamics, Institute for Environmental Research,ABBt- Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); and others

    2014-07-01

    Highlights: • A PBTK model for trout was coupled with a sediment equilibrium partitioning model. • The influence of physical exercise on pollutant uptake was studied using the model. • Physical exercise during flood events can increase the level of biliary metabolites. • Cardiac output and effective respiratory volume were identified as relevant factors. • These confounding factors need to be considered also for bioconcentration studies. - Abstract: As a consequence of global climate change, we are likely to face an increasing frequency and intensity of flood events. Thus, the ecotoxicological relevance of sediment re-suspension is of growing concern. It is vital to understand contaminant uptake from suspended sediments and relate it to effects in aquatic biota. Here we report on a computational study that utilizes a physiologically based toxicokinetic model to predict uptake, metabolism and excretion of sediment-borne pyrene in rainbow trout (Oncorhynchus mykiss). To this end, data from two experimental studies were compared with the model predictions: (a) batch re-suspension experiments with a constant concentration of suspended particulate matter at two different temperatures (12 and 24 °C), and (b) simulated flood events in an annular flume. The model predicted well both the final concentrations and the kinetics of 1-hydroxypyrene secretion into the gall bladder of exposed rainbow trout. We were able to show that exhaustive exercise during exposure in simulated flood events can lead to increased levels of biliary metabolites, and identified cardiac output and effective respiratory volume as the two most important factors for contaminant uptake. The results of our study clearly demonstrate the relevance and necessity of investigating uptake of contaminants from suspended sediments under realistic exposure scenarios.

  5. System risk evolution analysis and risk critical event identification based on event sequence diagram

    International Nuclear Information System (INIS)

    Luo, Pengcheng; Hu, Yang

    2013-01-01

    During system operation, the environmental, operational and usage conditions are time-varying, which causes fluctuations of the system state variables (SSVs). These fluctuations change the accidents’ probabilities and thus result in system risk evolution (SRE). This inherent relation makes it feasible to realize risk control by monitoring the SSVs in real time; hence, a quantitative analysis of SRE is essential. Moreover, some events in the process of SRE are critical to system risk, because they act like the “demarcative points” between safety and accident, and this characteristic makes each of them a key point of risk control. Therefore, analysis of SRE and identification of risk critical events (RCEs) are highly meaningful for ensuring that the system operates safely. In this context, an event sequence diagram (ESD) based method of SRE analysis and the related Monte Carlo solution are presented; RCE and risk sensitive variable (RSV) are defined, and the corresponding identification methods are also proposed. Finally, the proposed approaches are exemplified with an accident scenario of an aircraft entering an icing region.

  6. Evaluating Monitoring Strategies to Detect Precipitation-Induced Microbial Contamination Events in Karstic Springs Used for Drinking Water

    Directory of Open Access Journals (Sweden)

    Michael D. Besmer

    2017-11-01

    Monitoring of microbial drinking water quality is a key component for ensuring safety and understanding risk, but conventional monitoring strategies are typically based on low sampling frequencies (e.g., quarterly or monthly). This is of concern because many drinking water sources, such as karstic springs, are often subject to changes in bacterial concentrations on much shorter time scales (e.g., hours to days), for example after precipitation events. Microbial contamination events are crucial from a risk assessment perspective and should therefore be targeted by monitoring strategies to establish both the frequency of their occurrence and the magnitude of bacterial peak concentrations. In this study we used monitoring data from two specific karstic springs. We assessed the performance of conventional monitoring based on historical records and tested a number of alternative strategies based on a high-resolution data set of bacterial concentrations in spring water collected with online flow cytometry (FCM). We quantified the effect of increasing sampling frequency and found that, for the specific case studied, at least bi-weekly sampling would be needed to detect precipitation events with a probability of >90%. We then proposed an optimized monitoring strategy with three targeted samples per event, triggered by precipitation measurements. This approach is more effective and efficient than simply increasing the overall sampling frequency. It would enable the water utility to (1) analyze any relevant event and (2) limit the median underestimation of peak concentrations to approximately 10%. We conclude with a generalized perspective on sampling optimization and argue that the assessment of short-term dynamics causing microbial peak loads initially requires increased sampling/analysis efforts, but can be optimized subsequently to account for limited resources.
This offers water utilities and public health authorities systematic ways to evaluate and optimize their
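The central trade-off above, how detection probability scales with the sampling interval, can be illustrated with a small Monte Carlo sketch (not the authors' analysis; event durations and sampling intervals are hypothetical):

```python
import numpy as np

def detection_probability(event_duration, sampling_interval,
                          n_trials=100_000, seed=1):
    """Monte Carlo estimate of the probability that at least one routine
    sample (taken every `sampling_interval` days, with random phase
    relative to the event) falls inside a contamination event lasting
    `event_duration` days."""
    rng = np.random.default_rng(seed)
    # Event start time, uniform within one sampling cycle (random phase).
    starts = rng.uniform(0.0, sampling_interval, n_trials)
    # Samples occur at 0, d, 2d, ...; the event [s, s+L) is detected
    # iff some multiple of the interval lies inside it.
    hits = (np.floor((starts + event_duration) / sampling_interval)
            > np.floor(starts / sampling_interval))
    return hits.mean()

# Hypothetical 2-day contamination events: twice-weekly vs monthly sampling.
p_biweekly = detection_probability(2.0, 3.5)   # analytically 2/3.5
p_monthly = detection_probability(2.0, 30.0)   # analytically 2/30
```

For an event shorter than the sampling interval the detection probability is simply duration/interval, which makes explicit why low-frequency routine sampling misses most short contamination events and why event-triggered sampling is more efficient.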

  7. LES-based generation of high-frequency fluctuation in wind turbulence obtained by meteorological model

    Science.gov (United States)

    Tamura, Tetsuro; Kawaguchi, Masaharu; Kawai, Hidenori; Tao, Tao

    2017-11-01

    The connection between a meso-scale model and a micro-scale large-eddy simulation (LES) is essential for simulating micro-scale meteorological problems, such as strong convective events due to typhoons or tornadoes, using LES. In these problems the mean velocity profiles and mean wind directions change with time according to the movement of the typhoon or tornado. A fine-grid micro-scale LES, however, cannot be connected directly to a coarse-grid meso-scale WRF model. In LES, when the grid is suddenly refined at the interface of nested grids normal to the mean advection, the resolved shear stresses decrease owing to interpolation errors and the delayed generation of the smaller-scale turbulence that can be resolved on the finer mesh. For estimating wind-gust damage, the peak wind acting on buildings and structures has to be predicted correctly. In meteorological models the velocity fluctuations tend to vary diffusively, lacking high-frequency components, because of numerical filtering effects. In order to predict the peak wind velocity with good accuracy, this paper proposes an LES-based method for generating the higher-frequency components of the velocity and temperature fields obtained by a meteorological model.

  8. Radiative transport-based frequency-domain fluorescence tomography

    International Nuclear Information System (INIS)

    Joshi, Amit; Rasmussen, John C; Sevick-Muraca, Eva M; Wareing, Todd A; McGhee, John

    2008-01-01

    We report the development of radiative transport model-based fluorescence optical tomography from frequency-domain boundary measurements. The coupled radiative transport model describing NIR fluorescence propagation in tissue is solved by novel software based on the established Attila™ particle transport simulation platform. The proposed scheme enables the prediction of fluorescence measurements with non-contact sources and detectors at minimal computational cost. An adjoint transport solution-based fluorescence tomography algorithm is implemented on dual grids to efficiently assemble the measurement-sensitivity Jacobian matrix. Finally, we demonstrate fluorescence tomography on a realistic computational mouse model, locating nM to μM fluorophore concentration distributions in simulated mouse organs.

  9. Technical basis document for external events

    International Nuclear Information System (INIS)

    OBERG, B.D.

    2003-01-01

    This document supports the Tank Farms Documented Safety Analysis and presents the technical basis for the frequencies of externally initiated accidents. The consequences of externally initiated events are discussed in other documents corresponding to the accident caused by the external event. The external events include aircraft crashes, vehicle accidents, range fires, and rail accidents.

  10. Regional Frequency and Uncertainty Analysis of Extreme Precipitation in Bangladesh

    Science.gov (United States)

    Mortuza, M. R.; Demissie, Y.; Li, H. Y.

    2014-12-01

    The increased frequency of extreme precipitation, especially events of multiday duration, is responsible for recent urban floods and the associated significant losses of lives and infrastructure in Bangladesh. Reliable and routinely updated estimates of the frequency of occurrence of such extreme precipitation events are thus important for developing up-to-date hydraulic structures and stormwater drainage systems that can effectively minimize future risk from similar events. In this study, we have updated the intensity-duration-frequency (IDF) curves for Bangladesh using daily precipitation data from 1961 to 2010 and quantified the associated uncertainties. Regional frequency analysis based on L-moments is applied to the 1-day, 2-day and 5-day annual maximum precipitation series because of its advantages over at-site estimation. The regional frequency approach pools information from climatologically similar sites to make reliable quantile estimates, provided that the pooling group is homogeneous and of reasonable size. We used the region-of-influence (ROI) approach along with an L-moment-based homogeneity measure to identify the homogeneous pooling group for each site. Five 3-parameter distributions (Generalized Logistic, Generalized Extreme Value, Generalized Normal, Pearson Type III, and Generalized Pareto) are used for a thorough selection of appropriate models that fit the sample data. Uncertainties related to the selection of the distributions and to the historical data are quantified using the Bayesian Model Averaging and balanced bootstrap approaches, respectively. The results from this study can be used to update the current design and management of hydraulic structures as well as to explore spatio-temporal variations of extreme precipitation and the associated risk.
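
The regional analysis above rests on sample L-moments. As an illustration only (not the authors' code, and with made-up precipitation values), the first two sample L-moments and the L-skewness of an annual-maximum series can be computed from probability-weighted moments:

```python
def sample_l_moments(data):
    """First two sample L-moments and the L-skewness ratio t3, computed
    from the unbiased probability-weighted moments b0, b1, b2."""
    x = sorted(data)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum((i - 1) * x[i - 1] for i in range(1, n + 1)) / (n * (n - 1))
    b2 = sum((i - 1) * (i - 2) * x[i - 1]
             for i in range(1, n + 1)) / (n * (n - 1) * (n - 2))
    l1 = b0                      # L-location (the sample mean)
    l2 = 2 * b1 - b0             # L-scale
    l3 = 6 * b2 - 6 * b1 + b0
    return l1, l2, l3 / l2       # t3 = L-skewness

# Hypothetical annual-maximum 1-day precipitation series (mm)
amax = [95, 120, 88, 140, 110, 102, 168, 91, 130, 99]
l1, l2, t3 = sample_l_moments(amax)
l_cv = l2 / l1                   # L-CV, used in the homogeneity measure
```

In a regional analysis these ratios would be computed per site and compared across the ROI pooling group through the L-moment homogeneity measure.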

  11. Interaction between Gender and Skill on Competitive State Anxiety Using the Time-to-Event Paradigm: What Roles Do Intensity, Direction, and Frequency Dimensions Play?

    Directory of Open Access Journals (Sweden)

    John E. Hagan

    2017-05-01

    Full Text Available Background and purpose: The functional understanding and examination of competitive anxiety responses as temporal events that unfold as time-to-competition moves closer has emerged as a topical research area within sport psychology. However, little is known from an inclusive, interaction-oriented perspective. Using multidimensional anxiety theory as a framework, the present study examined the temporal patterning of competitive anxiety, focusing on the dimensions of intensity, direction, and frequency of intrusions in athletes across gender and skill level. Methods: Elite and semi-elite table tennis athletes from the Ghanaian league (N = 90) completed a modified version of the Competitive State Anxiety Inventory-2 (CSAI-2), with the inclusion of the direction and frequency-of-intrusion scales, at three temporal phases (7 days, 2 days, and 1 h prior to a competitive fixture). Results: Repeated-measures multivariate analyses of variance with follow-up analyses revealed significant interactions for between-subjects factors on all anxiety dimensions (intensity, direction, and frequency). Notably, elite (international) female athletes were less cognitively anxious, showed more facilitative interpretation of somatic anxiety symptoms, and experienced a lower frequency of somatic anxiety symptoms than their male counterparts. However, both elite groups displayed appreciable levels of self-confidence. For time-to-event effects, both cognitive and somatic anxiety intensity fluctuated, whereas self-confidence showed a steady rise as competition neared. Debilitative interpretation of somatic anxiety improved slightly 1 h before competition, whereas cognitive anxiety frequencies increased progressively during the entire preparatory phase. Conclusion: Findings suggest a more dynamic picture of elite athletes' pre-competitive anxiety responses than suggested by former studies, potentially influenced by cultural differences. The use of psychological

  12. Structural Health Monitoring Based on Combined Structural Global and Local Frequencies

    Directory of Open Access Journals (Sweden)

    Jilin Hou

    2014-01-01

    Full Text Available This paper presents a parameter estimation method for structural health monitoring based on combining measured structural global frequencies with structural local frequencies. First, a global test is performed to obtain the low-order modes, which reflect the global information of the structure. Second, mass is added to a member of the structure to enhance its local dynamic characteristics and give the member a local primary frequency, which belongs to the structural local frequencies and is sensitive to local parameters. The parameters of the structure can then be optimized accurately using the combined structural global and local frequencies. The effectiveness and accuracy of the proposed method are verified by an experiment on a space truss.

  13. WILBER and PyWEED: Event-based Seismic Data Request Tools

    Science.gov (United States)

    Falco, N.; Clark, A.; Trabant, C. M.

    2017-12-01

    WILBER and PyWEED are two user-friendly tools for requesting event-oriented seismic data. Both tools provide interactive maps and other controls for browsing and filtering event and station catalogs, and downloading data for selected event/station combinations, where the data window for each event/station pair may be defined relative to the arrival time of seismic waves from the event to that particular station. Both tools allow data to be previewed visually, and can download data in standard miniSEED, SAC, and other formats, complete with relevant metadata for performing instrument correction. WILBER is a web application requiring only a modern web browser. Once the user has selected an event, WILBER identifies all data available for that time period, and allows the user to select stations based on criteria such as the station's distance and orientation relative to the event. When the user has finalized their request, the data is collected and packaged on the IRIS server, and when it is ready the user is sent a link to download. PyWEED is a downloadable, cross-platform (Macintosh / Windows / Linux) application written in Python. PyWEED allows a user to select multiple events and stations, and will download data for each event/station combination selected. PyWEED is built around the ObsPy seismic toolkit, and allows direct interaction and control of the application through a Python interactive console.
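
The arrival-relative windowing that both tools apply per event/station pair can be sketched as follows; the constant-velocity travel-time estimate and all parameter names here are illustrative assumptions, not WILBER or PyWEED internals:

```python
from datetime import datetime, timedelta

def request_window(origin_time, distance_km, phase_velocity_kms=8.0,
                   pre_s=60.0, post_s=600.0):
    """Data window for one event/station pair, defined relative to the
    predicted arrival time (here a crude constant-velocity travel time)."""
    arrival = origin_time + timedelta(seconds=distance_km / phase_velocity_kms)
    return (arrival - timedelta(seconds=pre_s),
            arrival + timedelta(seconds=post_s))

origin = datetime(2017, 9, 8, 4, 49, 19)     # example origin time
start, end = request_window(origin, distance_km=2400.0)
```

In the real tools the arrival time comes from a proper travel-time model rather than a single constant velocity.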

  14. The role of musical training in emergent and event-based timing

    Directory of Open Access Journals (Sweden)

    Lawrence Baer

    2013-05-01

    Full Text Available Musical performance is thought to rely predominantly on event-based timing involving a clock-like neural process and an explicit internal representation of the time interval. Some aspects of musical performance may rely on emergent timing, which is established through the optimization of movement kinematics, and can be maintained without reference to any explicit representation of the time interval. We predicted that musical training would have its largest effect on event-based timing, supporting the dissociability of these timing processes and the dominance of event-based timing in musical performance. We compared 22 musicians and 17 non-musicians on the prototypical event-based timing task of finger tapping and on the typically emergently timed task of circle drawing. For each task, participants first responded in synchrony with a metronome (Paced) and then responded at the same rate without the metronome (Unpaced). Analyses of the Unpaced phase revealed that non-musicians were more variable in their inter-response intervals for finger tapping compared to circle drawing. Musicians did not differ between the two tasks. Between groups, non-musicians were more variable than musicians for tapping but not for drawing. We were able to show that the differences were due to less timer variability in musicians on the tapping task. Correlational analyses of movement jerk and inter-response interval variability revealed a negative association for tapping and a positive association for drawing in non-musicians only. These results suggest that musical training affects temporal variability in tapping but not drawing. Additionally, musicians and non-musicians may be employing different movement strategies to maintain accurate timing in the two tasks. These findings add to our understanding of how musical training affects timing and support the dissociability of event-based and emergent timing modes.
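
The "timer variability" referred to here is conventionally separated from motor variability with the Wing-Kristofferson two-level timing model. The sketch below (illustrative interval data, not from the study) shows the standard decomposition based on the variance and lag-1 autocovariance of the inter-response intervals:

```python
def wing_kristofferson(intervals):
    """Split inter-response-interval variance into clock (timer) and motor
    components using the Wing-Kristofferson two-level timing model:
        var(I) = sigma_clock^2 + 2 * sigma_motor^2
        lag-1 autocovariance gamma(1) = -sigma_motor^2
    With short sequences the estimates are noisy and can go negative."""
    n = len(intervals)
    mean = sum(intervals) / n
    var = sum((i - mean) ** 2 for i in intervals) / n
    gamma1 = sum((intervals[k] - mean) * (intervals[k + 1] - mean)
                 for k in range(n - 1)) / n
    sigma_motor_sq = max(-gamma1, 0.0)   # the model predicts gamma1 <= 0
    sigma_clock_sq = var - 2 * sigma_motor_sq
    return sigma_clock_sq, sigma_motor_sq

# Hypothetical inter-tap intervals in ms
iti = [500, 496, 503, 501, 497, 502, 499, 504]
clock_var, motor_var = wing_kristofferson(iti)  # clock ~1.05, motor ~2.95
```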

  15. Diet Activity Characteristic of Large-scale Sports Events Based on HACCP Management Model

    OpenAIRE

    Xiao-Feng Su; Li Guo; Li-Hua Gao; Chang-Zhuan Shao

    2015-01-01

    The study proposes dietary management for major sports events based on the HACCP management model, in line with the characteristics of catering activities at major sports events. Major sports events are no longer merely showcases of top-level competitive sport; they have become complex, comprehensive special events encompassing social, political, economic, cultural and other factors. Sporting events are conferred ever more diverse goals and objectives of economic, political, cultural, technological and other ...

  16. Statistical analysis of hydrodynamic cavitation events

    Science.gov (United States)

    Gimenez, G.; Sommer, R.

    1980-10-01

    The frequency (number of events per unit time) of pressure pulses produced by hydrodynamic cavitation bubble collapses is investigated using statistical methods. The results indicate that this frequency is distributed according to a normal law, with parameters that are not time-evolving.

  17. An ultra-stable iodine-based frequency reference for space applications

    Science.gov (United States)

    Schuldt, Thilo; Braxmaier, Claus; Doeringshoff, Klaus; Keetman, Anja; Reggentin, Matthias; Kovalchuk, Evgeny; Peters, Achim

    2012-07-01

    Future space missions require ultra-stable optical frequency references. Examples are the gravitational wave detector LISA/eLISA (Laser Interferometer Space Antenna), the SpaceTime Asymmetry Research (STAR) program, the aperture-synthesis telescope Darwin, and the GRACE (Gravity Recovery and Climate Experiment) follow-on mission exploring Earth's gravity. As high long-term frequency stability is required, lasers stabilized to atomic or molecular transitions are preferred, as they also offer an absolute frequency reference. Frequency stabilities in the 10^{-15} domain at longer integration times (up to several hours) have been demonstrated in laboratory experiments using setups based on Doppler-free spectroscopy. Such setups, with a frequency stability comparable to that of the hydrogen maser in the microwave domain, have the potential to be developed space-compatible on a relatively short time scale. Here, we present the development of ultra-stable optical frequency references based on modulation-transfer spectroscopy of molecular iodine. Noise levels of 2·10^{-14} at an integration time of 1 s and below 3·10^{-15} at integration times between 100 s and 1000 s are demonstrated with a laboratory setup using an 80 cm long iodine cell in single-pass configuration in combination with a frequency-doubled Nd:YAG laser and standard optical components and optomechanical mounts. The frequency stability at longer integration times is (amongst other things) limited by the dimensional stability of the optical setup, i.e. by the pointing stability of the two counter-propagating beams overlapped in the iodine cell. With the goal of a future space-compatible setup, a compact frequency standard at EBB (elegant breadboard) level was realized. The spectroscopy unit utilizes a baseplate made of Clearceram-HS, a glass ceramic with an ultra-low coefficient of thermal expansion of 2·10^{-8} K^{-1}. The optical components are joined to the baseplate using adhesive bonding technology.
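
The quoted noise levels at different integration times are stability figures of the kind obtained from an Allan-deviation analysis of fractional frequency data. A minimal sketch of the non-overlapping estimator (with invented sample values, not the authors' data):

```python
def allan_deviation(y, m):
    """Non-overlapping Allan deviation of fractional-frequency samples y
    at averaging factor m (integration time = m times the sample time)."""
    # Average the data in non-overlapping blocks of length m
    n_blocks = len(y) // m
    means = [sum(y[i * m:(i + 1) * m]) / m for i in range(n_blocks)]
    # Allan variance: half the mean squared successive block difference
    diffs = [(means[i + 1] - means[i]) ** 2 for i in range(n_blocks - 1)]
    return (sum(diffs) / (2 * (n_blocks - 1))) ** 0.5

# Hypothetical fractional-frequency samples (dimensionless)
y = [1e-14, -2e-14, 0.5e-14, 1.5e-14, -1e-14, 0.2e-14, -0.8e-14, 1.2e-14]
sigma_tau1 = allan_deviation(y, 1)
sigma_tau2 = allan_deviation(y, 2)   # stability at twice the base time
```

For white frequency noise the deviation falls as the inverse square root of the integration time, which is why stabilities are quoted per integration time as in the abstract.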

  18. GPS-based PWV for precipitation forecasting and its application to a typhoon event

    Science.gov (United States)

    Zhao, Qingzhi; Yao, Yibin; Yao, Wanqiang

    2018-01-01

    The temporal variability of precipitable water vapour (PWV) derived from Global Navigation Satellite System (GNSS) observations can be used to forecast precipitation events. A number of case studies of precipitation events in Zhejiang Province were analysed, and a forecasting method for precipitation events is proposed. The PWV time series retrieved from Global Positioning System (GPS) observations was processed using a least-squares fitting method to obtain the linear tendency of PWV ascents and descents. The increment of PWV over a short time (two to six hours) and the PWV slope over a longer time (a few hours to more than ten hours) during the PWV ascending period are taken as predictive factors with which to forecast the precipitation event. The numerical results show that about 80%-90% of precipitation events, and more than 90% of heavy rain events, can be forecasted two to six hours in advance with the proposed method. 5-minute PWV data derived from GPS observations based on real-time precise point positioning (RT-PPP) were used for the typhoon event that passed over Zhejiang Province between 10 and 12 July 2015. A good result was acquired using the proposed method: about 74% of precipitation events were predicted some ten to thirty minutes before their onset, with a false alarm rate of 18%. This study shows that GPS-based PWV is promising for short-term and now-casting precipitation forecasting.
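
The two predictive factors, the short-time PWV increment and the longer-time least-squares slope, can be combined into a simple decision rule. The threshold values below are placeholders for illustration; the paper's calibrated thresholds are not reproduced here:

```python
def ls_slope(times_h, pwv_mm):
    """Least-squares slope (mm/h) of a PWV time series."""
    n = len(times_h)
    tm = sum(times_h) / n
    pm = sum(pwv_mm) / n
    num = sum((t - tm) * (p - pm) for t, p in zip(times_h, pwv_mm))
    den = sum((t - tm) ** 2 for t in times_h)
    return num / den

def forecast_precipitation(times_h, pwv_mm,
                           increment_thresh_mm=5.0, slope_thresh_mm_h=1.0):
    """Flag a likely precipitation event when both the short-time PWV
    increment and the ascent slope exceed thresholds (placeholder values)."""
    increment = pwv_mm[-1] - pwv_mm[0]
    slope = ls_slope(times_h, pwv_mm)
    return increment >= increment_thresh_mm and slope >= slope_thresh_mm_h

hours = [0, 1, 2, 3, 4, 5]
pwv = [52.0, 53.5, 55.0, 57.0, 59.5, 61.0]   # rising PWV ahead of rain
flagged = forecast_precipitation(hours, pwv)  # True for this series
```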

  19. Studies on switch-based event building systems in RD13

    International Nuclear Information System (INIS)

    Bee, C.P.; Eshghi, S.; Jones, R.

    1996-01-01

    One of the goals of the RD13 project at CERN is to investigate the feasibility of parallel event building systems for detectors at the LHC. Studies were performed by building a prototype based on the HiPPI standard and by modelling this prototype and extended architectures with MODSIM II. The prototype used commercially available VME-HiPPI interfaces and a HiPPI switch together with modular software. The setup was tested successfully as a parallel event building system in different configurations and with different data flow control schemes. The simulation program was used with realistic parameters from the prototype measurements to simulate large-scale event building systems, including a realistic setup of the ATLAS event building system. The influence of different parameters and the scaling behaviour were investigated. The influence of realistic event size distributions was checked with data from off-line simulations. Different control schemes for destination assignment and traffic shaping were investigated, as well as a two-stage event building system. (author)

  20. Dynamic Event Trees applied to full spectrum LOCA sequences: calculating the damage exceedance frequency with the Integrated Safety Analysis methodology

    International Nuclear Information System (INIS)

    Gomez-Magan, J. J.; Fernandez, I.; Gil, J.; Marrao, H.; Queral, C.; Gonzalez-Cadelo, J.; Montero-Mayorga, J.; Rivas, J.; Ibanez-Llano, C.; Izquierdo, J. M.; Sanchez-Perea, M.; Melendez, E.; Hortal, J.

    2013-01-01

    The Integrated Safety Analysis (ISA) methodology, developed by the Spanish Nuclear Safety Council (CSN), has been applied to obtain the Dynamic Event Trees (DETs) for full spectrum Loss of Coolant Accidents (LOCAs) of a Westinghouse 3-loop PWR plant. The purpose of this ISA application is to obtain the Damage Exceedance Frequency (DEF) for the LOCA event tree by taking into account the uncertainties in the break area and in the operator actuation time needed to cool down and depressurize the reactor coolant system by means of the steam generators. Simulations are performed with SCAIS, a software tool that includes dynamic coupling with the MAAP thermal-hydraulic code. The results show the capability of the ISA methodology to obtain the DEF while taking into account the time uncertainty in human actions. (Author)

  1. Damage detection in multi-span beams based on the analysis of frequency changes

    International Nuclear Information System (INIS)

    Gillich, G R; Ntakpe, J L; Praisach, Z I; Mimis, M C; Abdel Wahab, M

    2017-01-01

    Crack identification in multi-span beams is performed to determine whether a structure is healthy or not. Among crack identification methods, those based on measured natural frequency changes offer simplicity and ease of use in practical engineering. To accurately identify crack characteristics in multi-span beam structures, a mathematical model is established that can predict frequency changes for any boundary conditions, the intermediate supports being hinges. This relation is based on the modal strain energy concept. Since frequency changes are relatively small, a signal processing algorithm based on the superposition of numerous spectra is also proposed to obtain natural frequencies with high resolution; it overcomes the limited frequency resolution of the Fast Fourier Transform. Based on the above mathematical model and signal processing algorithm, a method for identifying cracks in multi-span beams is presented. To verify the accuracy of the identification method, experiments are conducted on a two-span structure. The results demonstrate that the proposed method can accurately identify the crack position and depth. (paper)

  2. A Frequency-Based Approach to Intrusion Detection

    Directory of Open Access Journals (Sweden)

    Mian Zhou

    2004-06-01

    Full Text Available Research on network security and intrusion detection strategies presents many challenging issues to both theoreticians and practitioners. Hackers apply an array of intrusion and exploit techniques to cause disruption of normal system operations; on the defense side, firewalls and intrusion detection systems (IDS) are typically only effective in defending against known intrusion types using their signatures, and are far from mature when faced with novel attacks. In this paper, we adapt frequency analysis techniques from signal processing, such as the Discrete Fourier Transform (DFT), to the design of intrusion detection algorithms. We demonstrate the effectiveness of the frequency-based detection strategy by running synthetic network intrusion data through simulated networks using the OPNET software. The simulation results indicate that the proposed intrusion detection strategy is effective in detecting anomalous traffic data that exhibit patterns over time, including several types of DoS and probe attacks. The significance of this new strategy is that it does not depend on prior knowledge of attack signatures, so it has the potential to be a useful supplement to existing signature-based IDS and firewalls.
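
The core of such a frequency-based detector, sketched here with a naive DFT over per-interval packet counts (an illustration, not the paper's OPNET setup), is to look for a dominant non-DC spectral peak that betrays periodic probe or DoS traffic:

```python
import cmath
import math

def dft_magnitudes(x):
    """Naive DFT; returns |X[k]| for k = 0..N-1."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))) for k in range(n)]

def dominant_period(counts):
    """Index and magnitude of the strongest non-DC bin in the first half
    of the spectrum; a pronounced peak suggests periodic traffic rather
    than bursty normal traffic."""
    mags = dft_magnitudes(counts)
    half = mags[1:len(mags) // 2]
    k = max(range(len(half)), key=half.__getitem__) + 1
    return k, mags[k]

# Synthetic per-second packet counts with a strong 8-sample period
counts = [10 + 5 * math.sin(2 * math.pi * t / 8) for t in range(32)]
k, peak = dominant_period(counts)   # peak lands at bin 32/8 = 4
```

A production detector would use an FFT and compare the peak against a baseline spectrum of normal traffic.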

  3. Investigating the frequency and interannual variability in global above-cloud aerosol characteristics with CALIOP and OMI

    Directory of Open Access Journals (Sweden)

    R. Alfaro-Contreras

    2016-01-01

    Full Text Available Seven and a half years (June 2006 to November 2013) of Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) aerosol and cloud layer products are compared with collocated Ozone Monitoring Instrument (OMI) aerosol index (AI) data and Aqua Moderate Resolution Imaging Spectroradiometer (MODIS) cloud products in order to investigate variability in estimates of biannual and monthly above-cloud aerosol (ACA) events globally. The active- (CALIOP) and passive-based (OMI-MODIS) techniques have their advantages and caveats for ACA detection, and thus both are used to derive a thorough and robust comparison of daytime cloudy-sky ACA distribution and climatology. For the first time, baseline above-cloud aerosol optical depth (ACAOD) and AI thresholds are derived and examined (AI = 1.0, ACAOD = 0.015) for each sensor. Both OMI-MODIS and CALIOP-based daytime spatial distributions of ACA events show similar patterns during both study periods (December–May and June–November). Divergence exists in some regions, however, such as Southeast Asia during June through November, where daytime cloudy-sky ACA frequencies of up to 10 % are found from CALIOP yet are non-existent from the OMI-based method. Conversely, annual cloudy-sky ACA frequencies of 20–30 % are reported over northern Africa from the OMI-based method yet are largely undetected by the CALIOP-based method. Using a collocated OMI-MODIS-CALIOP data set, our study suggests that the cloudy-sky ACA frequency differences between the OMI-MODIS- and CALIOP-based methods are mostly due to differences in cloud detection capability between MODIS and CALIOP as well as the QA flags used. An increasing interannual variability of ∼0.3–0.4 % per year (since 2009) in global monthly cloudy-sky ACA daytime frequency of occurrence is found using the OMI-MODIS-based method. Yet, CALIOP-based global daytime ACA frequencies exhibit a near-zero interannual variability. Further analysis suggests

  4. A browser-based event display for the CMS experiment at the LHC

    International Nuclear Information System (INIS)

    Hategan, M; McCauley, T; Nguyen, P

    2012-01-01

    The line between native and web applications is becoming increasingly blurred as modern web browsers become powerful platforms on which applications can be run. Such applications are trivial to install, readily extensible, and easy to use. In an educational setting, web applications offer a way to deploy tools in a highly restrictive computing environment. The I2U2 collaboration has developed a browser-based event display for viewing events in data collected and released to the public by the CMS experiment at the LHC. The application reads a JSON event format and uses the JavaScript 3D rendering engine pre3d. The only requirement is a modern browser supporting HTML5 canvas. The event display has been used by thousands of high school students in the context of programs organized by I2U2, QuarkNet, and IPPOG. This browser-based approach to event display can have broader usage and impact for experts and public alike.

  5. Fire!: An Event-Based Science Module. Teacher's Guide. Chemistry and Fire Ecology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science or physical science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  6. Measurement of a discontinuous object based on a dual-frequency grating

    Institute of Scientific and Technical Information of China (English)

    Qiao Nao-Sheng; Cai Xin-Hua; Yao Chun-Mei

    2009-01-01

    A dual-frequency grating measurement theory is proposed to carry out the measurement of a discontinuous object. First, the reason why frequency spectra are produced by low-frequency and high-frequency gratings in the frequency domain is analysed, and the relationship between the wrapped phase and the unwrapped phase is discussed. Second, a method combining the advantages of the two kinds of grating is proposed: a single fringe is produced across the discontinuous part of the measured object by a suitable low-frequency grating designed in MATLAB, so the phase produced by the low-frequency grating need not be unwrapped. An integer number of fringes is produced by a high-frequency grating, designed in MATLAB, based on the frequency ratio of the two gratings and the high-frequency wrapped phase; the high-frequency unwrapped phase is then obtained. To verify the correctness of the theoretical analysis, a steep discontinuous object of 600×600 pixels and 10.00 mm in height is simulated, and a discontinuous ladder-shaped object 32.00 mm in height is used in experiment. Both the simulation and the experiment recover the height of the discontinuous object accurately using the dual-frequency grating measurement theory.
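
The essential arithmetic of dual-frequency unwrapping, determining the fringe order of the high-frequency wrapped phase from the fringe-order-free low-frequency phase scaled by the frequency ratio, can be sketched as follows (illustrative Python rather than the paper's MATLAB):

```python
import math

def unwrap_dual_frequency(phi_low, phi_high_wrapped, freq_ratio):
    """Recover the absolute high-frequency phase from its wrapped value,
    guided by the low-frequency phase (which needs no unwrapping):
        k = round((r * phi_low - phi_high_wrapped) / (2*pi))
        Phi_high = phi_high_wrapped + 2*pi*k
    where r = f_high / f_low is the grating frequency ratio."""
    k = round((freq_ratio * phi_low - phi_high_wrapped) / (2 * math.pi))
    return phi_high_wrapped + 2 * math.pi * k

# Example: true high-frequency phase 8.0 rad, frequency ratio 8
true_phase = 8.0
wrapped = math.fmod(true_phase, 2 * math.pi)   # what is actually measured
phi_low = true_phase / 8                       # single-fringe low-freq phase
recovered = unwrap_dual_frequency(phi_low, wrapped, 8)  # 8.0 again
```

Because the fringe order comes from a rounding operation, the method tolerates noise in the low-frequency phase of up to about pi/r radians.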

  7. Investigation of frequencies of waves at different traveltimes

    International Nuclear Information System (INIS)

    Babbel, G.; Engelhard, L.; Schimanowski, C.

    1978-03-01

    After completion of the preparatory theoretical work, changes of frequency spectra due to traveltime and interbedded layers have been investigated using seismic field recordings, synthetic models and model-seismic records (three-layer model). The most important investigations concerned the determination of the absorption of seismic waves. Engelhard (Braunschweig) and Babbel (Clausthal) demonstrated that the classical methods for determining absorption (amplitude investigations, division of frequency spectra) cannot solve these problems using real data. Theoretical considerations should give good estimates of the Q-factor for wavelets that are not superimposed by multiple events. The experience obtained may be seen as the basis of further investigations. (orig.) [de
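
The "division of frequency spectra" mentioned above is the spectral-ratio method for Q estimation: under a constant-Q model, the log spectral ratio of a wavelet observed at two traveltimes is linear in frequency with slope -pi*delta_t/Q. A self-contained sketch with synthetic spectra (assumed attenuation model, not the report's data):

```python
import math

def estimate_q(freqs_hz, spec1, spec2, delta_t):
    """Estimate the quality factor Q from two amplitude spectra of the
    same wavelet observed delta_t seconds apart, assuming constant-Q
    attenuation: ln(A2/A1) = const - (pi * f * delta_t) / Q.
    Fits the log spectral ratio against frequency by least squares."""
    y = [math.log(a2 / a1) for a1, a2 in zip(spec1, spec2)]
    n = len(freqs_hz)
    fm = sum(freqs_hz) / n
    ym = sum(y) / n
    slope = (sum((f - fm) * (v - ym) for f, v in zip(freqs_hz, y))
             / sum((f - fm) ** 2 for f in freqs_hz))
    return -math.pi * delta_t / slope

# Synthetic check: spectra consistent with Q = 50 over delta_t = 0.5 s
q_true, dt = 50.0, 0.5
freqs = [10.0, 20.0, 30.0, 40.0, 50.0]
a1 = [1.0] * len(freqs)
a2 = [math.exp(-math.pi * f * dt / q_true) for f in freqs]
q_est = estimate_q(freqs, a1, a2, dt)   # recovers about 50
```

As the report notes, the method breaks down when the wavelet is superimposed by multiples, since the measured spectra then no longer obey the single-path attenuation model.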

  8. The rate of adverse events during IV conscious sedation.

    Science.gov (United States)

    Schwamburger, Nathan T; Hancock, Raymond H; Chong, Chol H; Hartup, Grant R; Vandewalle, Kraig S

    2012-01-01

    Conscious sedation has become an integral part of dentistry; it is often used to reduce anxiety or fear in some patients during oral surgery, periodontal surgery, implant placement, and general dentistry procedures. The purpose of this study was to evaluate the frequency of adverse events during IV conscious sedation provided by credentialed general dentists and periodontists in the United States Air Force (USAF). Sedation clinical records (Air Force Form 1417) from calendar year 2009 were requested from all USAF bases. A total of 1,468 records were reviewed and 19 adverse events were noted in 17 patients. IV complication (infiltration) was the most common adverse event. The overall adverse event rate was 1.3 per 100 patients treated. The results of this study show that moderate sedation provided by general dentists and periodontists in the USAF has a low incidence of adverse events, and conscious sedation remains a viable option for providers for the reduction of anxiety in select patients.

  9. Frequency Based Fault Detection in Wind Turbines

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Stoustrup, Jakob

    2014-01-01

    In order to obtain lower cost of energy for wind turbines, fault detection and accommodation are important. Expensive condition monitoring systems are often used to monitor the condition of rotating and vibrating system parts. One example is the gearbox in a wind turbine. Such a system is operated in parallel to the control system, using different computers and additional, often expensive, sensors. In this paper a simple filter-based algorithm is proposed to detect changes in a resonance frequency in a system, exemplified with faults resulting in changes in the resonance frequency of the wind turbine gearbox. Only the generator speed measurement, which is available in even simple wind turbine control systems, is used as input. Consequently, this proposed scheme does not need additional sensors and computers for monitoring the condition of the wind turbine gearbox. The scheme is evaluated on a wide-spread wind...
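
One cheap way to realize such a filter-based monitor of a single resonance band, shown here as an assumed illustration rather than the paper's exact filter, is the Goertzel algorithm applied to the generator speed signal:

```python
import math

def goertzel_power(samples, sample_rate_hz, target_hz):
    """Signal power at one target frequency via the Goertzel algorithm,
    usable as a cheap online monitor of a single resonance band."""
    n = len(samples)
    k = round(n * target_hz / sample_rate_hz)   # nearest DFT bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

# Generator-speed-like signal with an 11 Hz resonance component
# (frequencies and sample rate are illustrative, not turbine data)
fs = 100.0
sig = [math.sin(2 * math.pi * 11.0 * t / fs) for t in range(200)]
p_res = goertzel_power(sig, fs, 11.0)   # strong: energy at the resonance
p_off = goertzel_power(sig, fs, 23.0)   # weak: off-resonance bin
```

A sustained shift of energy away from the nominal resonance bin would then indicate a change in the gearbox resonance frequency.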

  10. Frequency Support of PMSG-WTG Based on Improved Inertial Control: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Z.; Wang, X.; Gao, W.; Kang, M.; Hwang, M.; Kang, Y.; Gevorgian, Vahan; Muljadi, Eduard

    2016-03-15

    With the increasing integration of large-scale systems based on permanent magnet synchronous generator wind turbine generators (PMSG-WTGs), the overall inertial response of a power system will tend to deteriorate as a result of the decoupling of rotor speed and grid frequency through the power converter, as well as the scheduled retirement of conventional synchronous generators. PMSG-WTGs can nevertheless provide value to an electric grid by contributing to the system's inertial response, utilizing the inherent kinetic energy stored in their rotating masses and their fast power control. In this work, an improved inertial control method based on the maximum power point tracking operation curve is introduced to enhance the overall frequency support capability of PMSG-WTGs in the case of large supply-demand imbalances. Moreover, this method is implemented in the CART2-PMSG integrated model in MATLAB/Simulink to investigate its impact on the wind turbine's structural loads during the inertial response process. Simulation results indicate that the proposed method can effectively reduce the frequency nadir, arrest the rate of change of frequency, and mitigate the secondary frequency drop while imposing no negative impact on the major mechanical components of the wind turbine.

  11. Short-Period Surface Wave Based Seismic Event Relocation

    Science.gov (United States)

    White-Gaynor, A.; Cleveland, M.; Nyblade, A.; Kintner, J. A.; Homman, K.; Ammon, C. J.

    2017-12-01

    Accurate and precise seismic event locations are essential for a broad range of geophysical investigations. Superior location accuracy generally requires calibration with ground truth information, but superb relative location precision is often achievable independently. In explosion seismology, low-yield explosion monitoring relies on near-source observations, which results in a limited number of observations that challenges our ability to estimate any locations. Incorporating more distant observations means relying on data with lower signal-to-noise ratios. For small, shallow events, the short-period (roughly 1/2 to 8 s period) fundamental-mode and higher-mode Rayleigh waves (including Rg) are often the most stable and visible portion of the waveform at local distances. Cleveland and Ammon [2013] have shown that teleseismic surface waves are valuable observations for constructing precise, relative event relocations. We extend the teleseismic surface wave relocation method and apply it at near-source distances using Rg observations from the Bighorn Arch Seismic Experiment (BASE) and EarthScope USArray Transportable Array (TA) seismic stations. Specifically, we present relocation results using short-period fundamental- and higher-mode Rayleigh waves (Rg) in a double-difference relative event relocation for 45 delay-fired mine blasts and 21 borehole chemical explosions. Our preliminary efforts are to explore the sensitivity of the short-period surface waves to local geologic structure, source depth, explosion magnitude (yield), and explosion characteristics (single-shot vs. distributed source, etc.). Our results show that Rg and the first few higher-mode Rayleigh wave observations can be used to constrain the relative locations of shallow low-yield events.

  12. Pulse width modulation based pneumatic frequency tuner of the superconducting resonators at IUAC

    International Nuclear Information System (INIS)

    Pandey, A.; Suman, S.K.; Mathuria, D.S.

    2015-01-01

    The existing phase locking scheme of the quarter wave resonators (QWR) used in the superconducting linear accelerator (LINAC) of IUAC consists of a fast (electronic) and a slow (pneumatic) control. Presently, piezo-based mechanical tuners are used to phase lock the resonators installed in the second and third accelerating modules of the LINAC. However, due to space constraints, the piezo tuner cannot be implemented on the resonators of the first accelerating module. Therefore, helium-gas-operated mechanical tuners are used to phase lock these resonators against the master oscillator (MO) frequency. The present pneumatic frequency tuner has limitations of non-linearity, hysteresis and slow response time. To overcome these problems and to improve the dynamics of the existing tuner, a new pulse width modulation (PWM) based pneumatic frequency tuning system was adopted and successfully tested. After the successful test, the PWM-based pneumatic frequency tuner was installed on four QWRs of the first accelerating module of the LINAC. During the beam run the PWM-based frequency tuner performed well and the cavities could be phase locked at comparatively higher accelerating fields. A comparison of the existing tuning mechanism and the PWM-based tuning system, along with the test results, is presented in the paper. (author)

  13. Track-based event recognition in a realistic crowded environment

    Science.gov (United States)

    van Huis, Jasper R.; Bouma, Henri; Baan, Jan; Burghouts, Gertjan J.; Eendebak, Pieter T.; den Hollander, Richard J. M.; Dijk, Judith; van Rest, Jeroen H.

    2014-10-01

    Automatic detection of abnormal behavior in CCTV cameras is important for improving security in crowded environments, such as shopping malls, airports and railway stations. This behavior can be characterized at different time scales, e.g., by small-scale subtle and obvious actions or by large-scale walking patterns and interactions between people. For example, pickpocketing can be recognized by the actual snatch (small scale), by the perpetrator following the victim, or by his interaction with an accomplice before and after the incident (longer time scale). This paper focuses on event recognition by detecting large-scale track-based patterns. Our event recognition method consists of several steps: pedestrian detection, object tracking, track-based feature computation and rule-based event classification. In the experiment, we focused on single-track actions (walk, run, loiter, stop, turn) and track interactions (pass, meet, merge, split). The experiment includes a controlled setup, where 10 actors perform these actions. The method is also applied to all tracks generated in a crowded shopping mall in a selected time frame. The results show that most of the actions can be detected reliably (on average 90%) at a low false positive rate (1.1%), and that the interactions obtain lower detection rates (70% at 0.3% FP). This method may become one of the components that assist operators in finding threatening behavior and enrich the selection of videos to be observed.
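    The rule-based classification step can be sketched as follows; the features (path length, net displacement, mean speed) and the thresholds are illustrative stand-ins, not the values used in the cited system:

```python
import math

def classify_track(points, dt=1.0, stop_thresh=0.2, run_thresh=2.5, loiter_ratio=0.3):
    """Rule-based label for a single track of (x, y) positions sampled every
    dt seconds. Thresholds (m/s and ratio) are illustrative only."""
    path = sum(math.hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(points, points[1:]))
    net = math.hypot(points[-1][0] - points[0][0], points[-1][1] - points[0][1])
    speed = path / (dt * (len(points) - 1))
    if speed < stop_thresh:
        return "stop"
    if net / path < loiter_ratio:
        return "loiter"                      # moving, but going nowhere
    return "run" if speed > run_thresh else "walk"

tracks = {
    "stop":   [(5.0, 5.0)] * 10,                       # stationary
    "walk":   [(1.3 * i, 0.0) for i in range(10)],     # steady 1.3 m/s
    "run":    [(3.4 * i, 0.0) for i in range(10)],     # steady 3.4 m/s
    "loiter": [((i % 2) * 1.0, 0.0) for i in range(10)],  # pacing back and forth
}
labels = {name: classify_track(pts) for name, pts in tracks.items()}
```

    Interaction rules (pass, meet, merge, split) would extend the same idea to pairs of tracks, e.g., by thresholding their mutual distance over time.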

  14. Analysis of area events as part of probabilistic safety assessment for Romanian TRIGA SSR 14 MW reactor

    International Nuclear Information System (INIS)

    Mladin, D.; Stefan, I.

    2005-01-01

    International experience has shown that external events can be an important contributor to plant/reactor risk. For this reason such events have to be included in PSA studies. In the context of PSA for nuclear facilities, external events are defined as events originating from outside the plant, but with the potential to create an initiating event at the plant. To support plant safety assessment, PSA can be used to identify vulnerable features of the plant and to suggest modifications that mitigate the impact of external events or the production of initiating events. For that purpose, a probabilistic assessment of area events concerning fire and flooding risk and impact is necessary. Due to its relatively large power level among research reactors, the safety analysis of the Romanian 14 MW TRIGA benefits from an ongoing PSA project, in which the treatment of external events should be considered. The specific tasks proposed for the complete area event analysis are: identify the rooms important for facility safety; determine a relative area event risk index for these rooms and a relative area event impact index if the event occurs; evaluate each room's specific area event frequency; determine each room's contribution to reactor hazard state frequencies; and analyze power supply and room dependencies of safety components (such as pumps and motor-operated valves). The fire risk analysis methodology is based on Berry's method [1]. This approach provides a systematic procedure for deriving a relative index for different rooms. The factors affecting the fire probability are: personnel presence in the room, number and type of ignition sources, type and area of combustibles, fuel available in the room, fuel location, and ventilation. The flooding risk analysis is based on the amount of piping in the room. For accuracy of the information regarding piping, a facility walk-down is necessary. In case of flooding risk

  15. A power filter for the detection of burst events based on time-frequency spectrum estimation

    International Nuclear Information System (INIS)

    Guidi, G M; Cuoco, E; Vicere, A

    2004-01-01

    We propose as a statistic for the detection of bursts in a gravitational wave interferometer the 'energy' of the events estimated with a time-dependent calculation of the spectrum. This statistic has an asymptotic Gaussian distribution with known statistical moments, which makes it possible to perform a uniformly most powerful test (McDonough R N and Whalen A D 1995 Detection of Signals in Noise (New York: Academic)) on the energy mean. We estimate the receiver operating characteristic (ROC, from the same book) of this statistic for different levels of the signal-to-noise ratio, for simulated noise having the spectral density expected for Virgo, using test signals taken from a library of possible waveforms emitted during the collapse of the core of type II supernovae.
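    A minimal sketch of such an energy statistic, assuming white noise with known variance in place of the time-dependent Virgo spectrum estimate (signal amplitude, burst placement and threshold are all illustrative):

```python
import math
import random

def event_energy(samples, noise_var):
    """'Energy' statistic: sum of squared samples, normalized by the noise
    variance taken here from a flat (hypothetical) spectrum estimate."""
    return sum(s * s for s in samples) / noise_var

def detect(samples, noise_var, z=4.0):
    """Under noise alone the statistic is chi-square with n degrees of
    freedom, approximately Gaussian with mean n and variance 2n for large n,
    so a one-sided test thresholds at n + z*sqrt(2n)."""
    n = len(samples)
    return event_energy(samples, noise_var) > n + z * math.sqrt(2.0 * n)

random.seed(0)
n = 1024
noise = [random.gauss(0.0, 1.0) for _ in range(n)]
# Inject a short offset 'burst' into 40 of the 1024 samples
burst = [x + (4.0 if 500 <= i < 540 else 0.0) for i, x in enumerate(noise)]
quiet_flag = detect(noise, 1.0)
burst_flag = detect(burst, 1.0)
```

    The choice of z fixes the false-alarm probability of the one-sided Gaussian test, which is the knob swept when tracing a ROC curve.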

  16. MFAM: Multiple Frequency Adaptive Model-Based Indoor Localization Method.

    Science.gov (United States)

    Tuta, Jure; Juric, Matjaz B

    2018-03-24

    This paper presents MFAM (Multiple Frequency Adaptive Model-based localization method), a novel model-based indoor localization method that is capable of using multiple wireless signal frequencies simultaneously. It utilizes an indoor architectural model and the physical properties of wireless signal propagation through objects and space. The motivation for developing a multiple-frequency localization method lies in future Wi-Fi standards (e.g., 802.11ah) and the growing number of wireless signals present in buildings (e.g., Wi-Fi, Bluetooth, ZigBee, etc.). Current indoor localization methods mostly rely on a single wireless signal type and often require many devices to achieve the necessary accuracy. MFAM utilizes multiple wireless signal types and improves the localization accuracy over the use of a single frequency. It continuously monitors signal propagation through space and adapts the model according to changes indoors. Using multiple signal sources lowers the required number of access points for a specific signal type while utilizing signals already present indoors. Due to the unavailability of 802.11ah hardware, we have evaluated the proposed method with similar signals: 2.4 GHz Wi-Fi and 868 MHz HomeMatic home automation signals. We performed the evaluation in a modern two-bedroom apartment and measured a mean localization error of 2.0 to 2.3 m and a median error of 2.0 to 2.2 m. Based on our evaluation results, using two different signals improves the localization accuracy by 18% in comparison to the 2.4 GHz Wi-Fi-only approach. Additional signals would improve the accuracy even further. We have shown that MFAM provides better accuracy than competing methods, while having several advantages for real-world usage.
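    MFAM's combination of frequencies is model-based; as a generic illustration of why a second frequency helps at all, an inverse-variance weighted fusion of two hypothetical per-frequency fixes already tightens the estimate (this is a textbook estimator, not MFAM's actual algorithm):

```python
def fuse(estimates):
    """Inverse-variance weighted fusion of per-frequency position estimates.
    Each entry is ((x, y), variance). Returns the fused fix and its variance."""
    w = [1.0 / v for _, v in estimates]
    wsum = sum(w)
    x = sum(wi * p[0] for wi, (p, _) in zip(w, estimates)) / wsum
    y = sum(wi * p[1] for wi, (p, _) in zip(w, estimates)) / wsum
    return (x, y), 1.0 / wsum

# Hypothetical fixes (metres) from 2.4 GHz Wi-Fi and an 868 MHz signal
(fx, fy), fvar = fuse([((2.0, 3.0), 4.0), ((3.0, 2.0), 1.0)])
```

    The fused variance (1/sum of inverse variances) is always below the best single-frequency variance, mirroring the reported 18% accuracy gain from adding a second signal.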

  17. MFAM: Multiple Frequency Adaptive Model-Based Indoor Localization Method

    Directory of Open Access Journals (Sweden)

    Jure Tuta

    2018-03-01

    Full Text Available This paper presents MFAM (Multiple Frequency Adaptive Model-based localization method), a novel model-based indoor localization method that is capable of using multiple wireless signal frequencies simultaneously. It utilizes an indoor architectural model and the physical properties of wireless signal propagation through objects and space. The motivation for developing a multiple-frequency localization method lies in future Wi-Fi standards (e.g., 802.11ah) and the growing number of wireless signals present in buildings (e.g., Wi-Fi, Bluetooth, ZigBee, etc.). Current indoor localization methods mostly rely on a single wireless signal type and often require many devices to achieve the necessary accuracy. MFAM utilizes multiple wireless signal types and improves the localization accuracy over the use of a single frequency. It continuously monitors signal propagation through space and adapts the model according to changes indoors. Using multiple signal sources lowers the required number of access points for a specific signal type while utilizing signals already present indoors. Due to the unavailability of 802.11ah hardware, we have evaluated the proposed method with similar signals: 2.4 GHz Wi-Fi and 868 MHz HomeMatic home automation signals. We performed the evaluation in a modern two-bedroom apartment and measured a mean localization error of 2.0 to 2.3 m and a median error of 2.0 to 2.2 m. Based on our evaluation results, using two different signals improves the localization accuracy by 18% in comparison to the 2.4 GHz Wi-Fi-only approach. Additional signals would improve the accuracy even further. We have shown that MFAM provides better accuracy than competing methods, while having several advantages for real-world usage.

  18. Power MOSFET-diode-based limiter for high-frequency ultrasound systems.

    Science.gov (United States)

    Choi, Hojong; Kim, Min Gon; Cummins, Thomas M; Hwang, Jae Youn; Shung, K Kirk

    2014-10-01

    The purpose of the limiter circuits used in ultrasound imaging systems is to pass the low-voltage echo signals generated by ultrasonic transducers while preventing the high-voltage short pulses transmitted by pulsers from damaging front-end circuits. Resistor-diode-based limiters (a 50 Ω resistor with a single cross-coupled diode pair) have been widely used in pulse-echo measurement and imaging system applications due to their low cost and simple architecture. However, resistor-diode-based limiters may not be suited for high-frequency ultrasound transducer applications, since they produce large signal conduction losses at higher frequencies. Therefore, we propose a new limiter architecture utilizing power MOSFETs, which we call a power MOSFET-diode-based limiter. The performance of the power MOSFET-diode-based limiter was evaluated with respect to insertion loss (IL), total harmonic distortion (THD), and response time (RT). We compared these results with those of three other conventional limiter designs and showed that the power MOSFET-diode-based limiter offers the lowest IL (-1.33 dB) and fastest RT (0.10 µs) with the lowest suppressed output voltage (3.47 Vp-p) among all the limiters at 70 MHz. A pulse-echo test was performed to determine how the new limiter affected the sensitivity and bandwidth of the transducer. We found that the sensitivity and bandwidth of the transducer were 130% and 129% greater, respectively, when combined with the new power MOSFET-diode-based limiter versus the resistor-diode-based limiter. These results demonstrate that the power MOSFET-diode-based limiter produces lower signal attenuation than the three conventional limiter designs at higher-frequency operation. © The Author(s) 2014.

  19. The low frequency 2D vibration sensor based on flat coil element

    Energy Technology Data Exchange (ETDEWEB)

    Djamal, Mitra; Sanjaya, Edi; Islahudin; Ramli [Department of Physics, Institut Teknologi Bandung, Jl. Ganesa 10 Bandung 40116 (Indonesia); Department of Physics, Institut Teknologi Bandung, Jl. Ganesa 10 Bandung 40116 (Indonesia) and Department of Physics, UIN Syarif Hidayatullah, Jl. Ir.H. Djuanda 95 Ciputat 15412 (Indonesia); MTs NW Nurul Iman Kembang Kerang, Jl. Raya Mataram - Lb.Lombok, NTB (Indonesia); Department of Physics, Institut Teknologi Bandung, Jl. Ganesa 10 Bandung 40116 (Indonesia) and Department of Physics,Universitas Negeri Padang, Jl. Prof. Hamka, Padang 25132 (Indonesia)

    2012-06-20

    Vibration, such as that produced by an earthquake, is a physical phenomenon. The characteristics of these vibrations can be used in an early warning system to reduce the loss or damage caused by earthquakes. In this paper, we introduce a new type of low frequency 2D vibration sensor based on a flat coil element that we have developed. Its working principle is based on the position change of a seismic mass placed in front of a flat coil element. The flat coil is part of an LC oscillator; therefore, a change in the seismic mass position will change its resonance frequency. Measurements with the low frequency vibration sensor along the x and y axes give a frequency range of 0.2 to 1.0 Hz.
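    The working principle follows directly from the LC resonance formula; the component values and the 2% inductance change below are hypothetical, chosen only to show the scale of the frequency shift:

```python
import math

def resonant_freq(L, C):
    """Resonance of the flat-coil LC tank: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

# Hypothetical component values: a 10 uH flat coil with a 1 nF capacitor
f0 = resonant_freq(10e-6, 1e-9)
# Suppose the seismic mass moving toward the coil raises the effective
# inductance by 2%; for small dL the fractional shift is about dL/(2L)
f1 = resonant_freq(10e-6 * 1.02, 1e-9)
shift = (f0 - f1) / f0
```

    Tracking this resonance shift over time converts the mass displacement, and hence the low-frequency ground motion, into a measurable frequency signal.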

  20. Frequency hopping signal detection based on wavelet decomposition and Hilbert-Huang transform

    Science.gov (United States)

    Zheng, Yang; Chen, Xihao; Zhu, Rui

    2017-07-01

    Frequency hopping (FH) signals are widely adopted by military communications as a kind of low probability of interception signal. Therefore, it is very important to study FH signal detection algorithms. Existing detection algorithms for FH signals based on time-frequency analysis cannot satisfy the time and frequency resolution requirements at the same time, due to the influence of the window function. In order to solve this problem, an algorithm based on wavelet decomposition and the Hilbert-Huang transform (HHT) is proposed. The proposed algorithm removes the noise of the received signals by wavelet decomposition and detects the FH signals by the Hilbert-Huang transform. Simulation results show that the proposed algorithm achieves both good time resolution and good frequency resolution; correspondingly, the accuracy of FH signal detection is improved.
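    A toy version of the detection idea, with a zero-crossing frequency estimate standing in for the Hilbert instantaneous frequency and the wavelet-denoising stage omitted (sample rate, hop frequencies and window length are all assumed): a hop appears as a step in the per-window frequency track.

```python
import math

def dominant_freq(window, fs):
    """Crude instantaneous-frequency estimate from zero crossings; a
    stand-in for the Hilbert instantaneous frequency of the HHT stage."""
    crossings = sum(1 for a, b in zip(window, window[1:])
                    if a < 0 <= b or b < 0 <= a)
    return 0.5 * crossings * fs / (len(window) - 1)

fs, n = 8000.0, 4000
# Synthetic FH signal: 500 Hz for the first half, hopping to 1200 Hz
sig = [math.sin(2 * math.pi * (500 if i < n // 2 else 1200) * i / fs)
       for i in range(n)]
win = 400
freqs = [dominant_freq(sig[i:i + win], fs) for i in range(0, n - win + 1, win)]
# The hop time is where the frequency track jumps the most
hop_index = max(range(1, len(freqs)), key=lambda k: abs(freqs[k] - freqs[k - 1]))
```

    The window length sets the time/frequency trade-off the abstract describes; the HHT avoids this fixed trade-off by estimating frequency sample by sample.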

  1. Fast oscillations in cortical-striatal networks switch frequency following rewarding events and stimulant drugs.

    Science.gov (United States)

    Berke, J D

    2009-09-01

    Oscillations may organize communication between components of large-scale brain networks. Although gamma-band oscillations have been repeatedly observed in cortical-basal ganglia circuits, their functional roles are not yet clear. Here I show that, in behaving rats, distinct frequencies of ventral striatal local field potential oscillations show coherence with different cortical inputs. The approximately 50 Hz gamma oscillations that normally predominate in awake ventral striatum are coherent with piriform cortex, whereas approximately 80-100 Hz high-gamma oscillations are coherent with frontal cortex. Within striatum, entrainment to gamma rhythms is selective to fast-spiking interneurons, with distinct fast-spiking interneuron populations entrained to different gamma frequencies. Administration of the psychomotor stimulant amphetamine or the dopamine agonist apomorphine causes a prolonged decrease in approximately 50 Hz power and an increase in approximately 80-100 Hz power. The same frequency switch is observed for shorter epochs spontaneously in awake, undrugged animals and is consistently provoked by reward receipt. Individual striatal neurons can participate in these brief high-gamma bursts with, or without, substantial changes in firing rate. Switching between discrete oscillatory states may allow different modes of information processing during decision-making and reinforcement-based learning, and may also be an important systems-level process by which stimulant drugs affect cognition and behavior.

  2. Effect of lunar phase on frequency of psychogenic nonepileptic events in the EMU.

    Science.gov (United States)

    Bolen, Robert D; Campbell, Zeke; Dennis, William A; Koontz, Elizabeth H; Pritchard, Paul B

    2016-06-01

    Studies of the effect of a full moon on seizures have yielded mixed results, despite a continuing prevailing belief regarding the association of lunar phase with human behavior. The potential effect of a full moon on psychogenic nonepileptic events has not been as well studied, despite what anecdotal accounts from most epilepsy monitoring unit (EMU) staff would suggest. We obtained the dates and times of all events from patients diagnosed with psychogenic nonepileptic events discharged from our EMU over a two-year period. The events were then plotted on a 29.5-day lunar calendar. Events were also broken down into lunar quarters for statistical analysis. We found a statistically significant increase in psychogenic nonepileptic events during the new moon quarter in our EMU during our studied timeframe. Our results are not concordant with the results of a similarly designed past study, raising the possibility that psychogenic nonepileptic events are not influenced by lunar phase. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Study on frequency characteristics of wireless power transmission system based on magnetic coupling resonance

    Science.gov (United States)

    Liang, L. H.; Liu, Z. Z.; Hou, Y. J.; Zeng, H.; Yue, Z. K.; Cui, S.

    2017-11-01

    In order to study the frequency characteristics of a wireless power transmission system based on magnetic coupling resonance, a circuit model of the magnetically coupled resonant wireless power transmission system is established. The influence of the load on the frequency characteristics of the system is analysed, and circuit coupling theory is used to derive the minimum load required to suppress frequency splitting. Simulation and experimental results verify that when the load is below a certain value the system exhibits frequency splitting, and that increasing the load can effectively suppress the frequency splitting phenomenon. A power regulation scheme for the wireless charging system based on magnetic coupling resonance is given. This study provides a theoretical basis for load selection and power regulation in wireless power transmission systems.
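    The splitting behaviour can be sketched with the textbook coupled-resonator formulas; the 6.78 MHz operating frequency, coupling coefficient and Q values are illustrative, and the k*Q > 1 criterion is a generic stand-in for the minimum-load condition the paper derives from circuit theory:

```python
import math

def split_frequencies(f0, k):
    """Mode splitting of two identical coupled resonators:
    f_low = f0/sqrt(1+k), f_high = f0/sqrt(1-k)."""
    return f0 / math.sqrt(1.0 + k), f0 / math.sqrt(1.0 - k)

def splitting_occurs(k, q_loaded):
    """Textbook over-coupling criterion: the two peaks become resolvable
    when the coupling exceeds the loaded bandwidth, i.e. k*Q > 1."""
    return k * q_loaded > 1.0

f_lo, f_hi = split_frequencies(6.78e6, 0.1)  # illustrative 6.78 MHz, k = 0.1
small_load = splitting_occurs(0.1, 50.0)  # small load R -> high loaded Q -> splits
large_load = splitting_occurs(0.1, 5.0)   # larger load R -> low loaded Q -> suppressed
```

    Since the load sets the loaded Q, increasing it broadens the resonance until the two split modes merge back into one peak, which is the suppression effect the experiments confirm.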

  4. Numeracy, frequency, and Bayesian reasoning

    Directory of Open Access Journals (Sweden)

    Gretchen B. Chapman

    2009-02-01

    Full Text Available Previous research has demonstrated that Bayesian reasoning performance is improved if uncertainty information is presented as natural frequencies rather than single-event probabilities. A questionnaire study of 342 college students replicated this effect but also found that the performance-boosting benefits of the natural frequency presentation occurred primarily for participants who scored high in numeracy. This finding suggests that even comprehension and manipulation of natural frequencies requires a certain threshold of numeracy abilities, and that the beneficial effects of natural frequency presentation may not be as general as previously believed.
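    The effect is easy to reproduce on the classic screening problem (the numbers below are illustrative textbook values): the single-event Bayes computation and the natural-frequency computation give the same answer, but the latter needs only counts of cases.

```python
# Single-event probabilities: prevalence, hit rate, false-positive rate
p_d, p_pos_d, p_pos_nd = 0.01, 0.80, 0.096
bayes = p_d * p_pos_d / (p_d * p_pos_d + (1 - p_d) * p_pos_nd)

# The same problem in natural frequencies, out of 1000 people:
diseased, healthy = 10, 990   # 1% prevalence
true_pos = 8                  # 80% of the diseased test positive
false_pos = 95                # roughly 9.6% of the healthy test positive
natural = true_pos / (true_pos + false_pos)   # 8 of the 103 positives are diseased
```

    The natural-frequency version replaces the formula with a single division over counts, which is precisely the simplification whose benefit the study finds to depend on numeracy.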

  5. Dynamics of large-scale cortical interactions at high gamma frequencies during word production: event related causality (ERC) analysis of human electrocorticography (ECoG).

    Science.gov (United States)

    Korzeniewska, Anna; Franaszczuk, Piotr J; Crainiceanu, Ciprian M; Kuś, Rafał; Crone, Nathan E

    2011-06-15

    Intracranial EEG studies in humans have shown that functional brain activation in a variety of functional-anatomic domains of human cortex is associated with an increase in power at a broad range of high gamma (>60 Hz) frequencies. Although these electrophysiological responses are highly specific for the location and timing of cortical processing and in animal recordings are highly correlated with increased population firing rates, there has been little direct empirical evidence for causal interactions between different recording sites at high gamma frequencies. Such causal interactions are hypothesized to occur during cognitive tasks that activate multiple brain regions. To determine whether such causal interactions occur at high gamma frequencies and to investigate their functional significance, we used event-related causality (ERC) analysis to estimate the dynamics, directionality, and magnitude of event-related causal interactions using subdural electrocorticography (ECoG) recorded during two word production tasks: picture naming and auditory word repetition. A clinical subject who had normal hearing but was skilled in American Sign Language (ASL) provided a unique opportunity to test our hypothesis with reference to a predictable pattern of causal interactions, i.e. that language cortex interacts with different areas of sensorimotor cortex during spoken vs. signed responses. Our ERC analyses confirmed this prediction. During word production with spoken responses, perisylvian language sites had prominent causal interactions with mouth/tongue areas of motor cortex, and when responses were gestured in sign language, the most prominent interactions involved hand and arm areas of motor cortex. Furthermore, we found that the sites from which the most numerous and prominent causal interactions originated, i.e. sites with a pattern of ERC "divergence", were also sites where high gamma power increases were most prominent and where electrocortical stimulation mapping

  6. Enhancing Business Process Automation by Integrating RFID Data and Events

    Science.gov (United States)

    Zhao, Xiaohui; Liu, Chengfei; Lin, Tao

    Business process automation is one of the major benefits of utilising Radio Frequency Identification (RFID) technology. Through readers and RFID middleware systems, the identities and movements of tagged objects can be used to trigger business transactions. These features shift the way business applications deal with the physical world from mostly quantity-based to object-based. Aiming to facilitate business process automation, this paper introduces a new method to model and incorporate business logic into RFID edge systems from an object-oriented perspective, with emphasis on RFID's event-driven characteristics. A framework covering business rule modelling, event handling and system operation invocations is presented on the basis of the event calculus. In regard to the identified delayed effects in RFID-enabled applications, a two-block buffering mechanism is proposed to improve RFID query efficiency within the framework. The performance improvements are analysed with related experiments.

  7. Harmonic spectral components in time sequences of Markov correlated events

    Science.gov (United States)

    Mazzetti, Piero; Carbone, Anna

    2017-07-01

    The paper concerns the analysis of the conditions that allow time sequences of Markov correlated events to give rise to a line power spectrum of relevant physical interest. It is found that by specializing the Markov matrix to represent closed loop sequences of events with arbitrary distribution, generated in a steady physical condition, a large set of line spectra, covering all possible frequency values, is obtained. The amplitude of the spectral lines is given by a matrix equation based on a generalized Markov matrix involving the Fourier transform of the distribution functions that represent the time intervals between successive events of the sequence. The paper complements a previous work where a general expression for the continuous power spectrum was given. In that case the Markov matrix was left in a more general form, preventing the possibility of finding line spectra of physical interest. The present extension is also suggested by the interest in explaining the emergence of the broad set of waves found in electro- and magneto-encephalograms, whose frequencies range from 0.5 to about 40 Hz, in terms of the effects produced by chains of firing neurons within the complex neural network of the brain. An original model based on synchronized closed loop sequences of firing neurons is proposed, and a few numerical simulations are reported as an application of the above cited equation.

  8. The neural bases of spatial frequency processing during scene perception

    Science.gov (United States)

    Kauffmann, Louise; Ramanoël, Stephen; Peyrin, Carole

    2014-01-01

    Theories on visual perception agree that scenes are processed in terms of spatial frequencies. Low spatial frequencies (LSF) carry coarse information whereas high spatial frequencies (HSF) carry the fine details of a scene. However, how and where spatial frequencies are processed within the brain remain unresolved questions. The present review addresses these issues and aims to identify the cerebral regions differentially involved in low and high spatial frequency processing, and to clarify their attributes during scene perception. Results from a number of behavioral and neuroimaging studies suggest that spatial frequency processing is lateralized in both hemispheres, with the right and left hemispheres predominantly involved in the categorization of LSF and HSF scenes, respectively. There is also evidence that spatial frequency processing is retinotopically mapped in the visual cortex. HSF scenes (as opposed to LSF) activate occipital areas in relation to foveal representations, while categorization of LSF scenes (as opposed to HSF) activates occipital areas in relation to more peripheral representations. Concomitantly, a number of studies have demonstrated that LSF information may reach high-order areas rapidly, allowing an initial coarse parsing of the visual scene, which could then be sent back through feedback into the occipito-temporal cortex to guide finer HSF-based analysis. Finally, the review addresses spatial frequency processing within scene-selective areas of the occipito-temporal cortex. PMID:24847226

  9. External event analysis methods for NUREG-1150

    International Nuclear Information System (INIS)

    Bohn, M.P.; Lambright, J.A.

    1989-01-01

    The US Nuclear Regulatory Commission is sponsoring probabilistic risk assessments of six operating commercial nuclear power plants as part of a major update of the understanding of risk as provided by the original WASH-1400 risk assessments. In contrast to the WASH-1400 studies, at least two of the NUREG-1150 risk assessments will include an analysis of risks due to earthquakes, fires, floods, etc., which are collectively known as external events. This paper summarizes the methods to be used in the external event analysis for NUREG-1150 and the results obtained to date. The two plants for which external events are being considered are Surry and Peach Bottom, a PWR and a BWR, respectively. The external event analyses (through core damage frequency calculations) were completed in June 1989, with final documentation available in September. In contrast to most past external event analyses, wherein rudimentary systems models were developed for each external event under consideration, the simplified NUREG-1150 analyses are based on the availability of the full internal event PRA systems models (event trees and fault trees) and make use of extensive computer-aided screening to reduce them to the sequence cut sets important to each external event. This provides two major advantages: consistency and scrutability with respect to the internal event analysis are achieved, and the full gamut of random and test/maintenance unavailabilities is automatically included, while only those that are probabilistically important survive the screening process. Thus, full benefit of the internal event analysis is obtained by performing the internal and external event analyses sequentially.

  10. Frequency, probability, and prediction: easy solutions to cognitive illusions?

    Science.gov (United States)

    Griffin, D; Buehler, R

    1999-02-01

    Many errors in probabilistic judgment have been attributed to people's inability to think in statistical terms when faced with information about a single case. Prior theoretical analyses and empirical results imply that the errors associated with case-specific reasoning may be reduced when people make frequentistic predictions about a set of cases. In studies of three previously identified cognitive biases, we find that frequency-based predictions are different from, but no better than, case-specific judgments of probability. First, in studies of the "planning fallacy," we compare the accuracy of aggregate frequency and case-specific probability judgments in predictions of students' real-life projects. When aggregate and single-case predictions are collected from different respondents, there is little difference between the two: both are overly optimistic and show little predictive validity. However, in within-subject comparisons, the aggregate judgments are significantly more conservative than the single-case predictions, though still optimistically biased. Results from studies of overconfidence in general knowledge and base rate neglect in categorical prediction underline a general conclusion: frequentistic predictions made for sets of events are no more statistically sophisticated, nor more accurate, than predictions made for individual events using subjective probability. Copyright 1999 Academic Press.

  11. Abnormal Event Detection in Wireless Sensor Networks Based on Multiattribute Correlation

    Directory of Open Access Journals (Sweden)

    Mengdi Wang

    2017-01-01

    Full Text Available Abnormal event detection is one of the vital tasks in wireless sensor networks. However, node faults and poor deployment environments have brought great challenges to abnormal event detection. In a typical event detection technique, spatiotemporal correlations are collected to detect an event, which is susceptible to noise and errors. To improve the quality of detection results, we propose a novel approach for abnormal event detection in wireless sensor networks. This approach considers not only spatiotemporal correlations but also the correlations among observed attributes. A dependency model of the observed attributes is constructed based on a Bayesian network. In this model, the dependency structure of the observed attributes is obtained by structure learning, and the conditional probability table of each node is calculated by parameter learning. We propose a new concept, named attribute correlation confidence, to evaluate the fitting degree between a sensor reading and an abnormal event pattern. On the basis of time correlation detection and space correlation detection, the abnormal events are identified. Experimental results show that the proposed algorithm can effectively reduce the impact of interference factors and the false alarm rate; it can also improve the accuracy of event detection.
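    A minimal sketch of the attribute-correlation idea, with a hypothetical two-attribute network (smoke implies high temperature) and hand-set conditional probability tables standing in for the learned ones:

```python
# Toy dependency learned for a fire event: smoke -> high temperature.
# P(smoke) and P(temp_high | smoke) form the (here hand-set) CPTs.
p_smoke = {True: 0.9, False: 0.1}
p_temp_given_smoke = {True: {True: 0.95, False: 0.05},
                      False: {True: 0.30, False: 0.70}}

def correlation_confidence(smoke, temp_high):
    """Score a reading as the likelihood of the observed attribute
    combination under the abnormal-event pattern (the chain rule over
    the Bayesian network)."""
    return p_smoke[smoke] * p_temp_given_smoke[smoke][temp_high]

consistent = correlation_confidence(True, True)    # smoke and heat together
faulty = correlation_confidence(True, False)       # smoke but no heat: suspect node
```

    A reading whose attributes fit the event pattern scores high, while a smoke report without the correlated temperature rise scores low and can be discounted as a likely node fault, which is how attribute correlation suppresses false alarms.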

  12. Separation of musical instruments based on amplitude and frequency comodulation

    Science.gov (United States)

    Jacobson, Barry D.; Cauwenberghs, Gert; Quatieri, Thomas F.

    2002-05-01

    In previous work, amplitude comodulation was investigated as a basis for monaural source separation. Amplitude comodulation refers to similarities in amplitude envelopes of individual spectral components emitted by particular types of sources. In many types of musical instruments, amplitudes of all resonant modes rise/fall, and start/stop together during the course of normal playing. We found that under certain well-defined conditions, a mixture of constant frequency, amplitude comodulated sources can unambiguously be decomposed into its constituents on the basis of these similarities. In this work, system performance was improved by relaxing the constant frequency requirement. String instruments, for example, which are normally played with vibrato, are both amplitude and frequency comodulated sources, and could not be properly tracked under the constant frequency assumption upon which our original algorithm was based. Frequency comodulation refers to similarities in frequency variations of individual harmonics emitted by these types of sources. The analytical difficulty is in defining a representation of the source which properly tracks frequency varying components. A simple, fixed filter bank can only track an individual spectral component for the duration in which it is within the passband of one of the filters. Alternatives are therefore explored which are amenable to real-time implementation.
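    The grouping cue can be illustrated by correlating amplitude envelopes (a simplified stand-in for the paper's decomposition): comodulated partials of one instrument correlate near 1, while unrelated components do not.

```python
import math

def envelope_correlation(e1, e2):
    """Pearson correlation of two amplitude envelopes; comodulated partials
    from a single instrument should score near 1."""
    n = len(e1)
    m1, m2 = sum(e1) / n, sum(e2) / n
    cov = sum((a - m1) * (b - m2) for a, b in zip(e1, e2))
    v1 = math.sqrt(sum((a - m1) ** 2 for a in e1))
    v2 = math.sqrt(sum((b - m2) ** 2 for b in e2))
    return cov / (v1 * v2)

# Two partials sharing one rise/fall envelope vs. an unrelated decay
shared = [0.0, 0.2, 0.6, 1.0, 0.8, 0.4, 0.1, 0.0]
partial_a = [2.0 * x for x in shared]   # same instrument, different strength
partial_b = [0.5 * x for x in shared]
other = [1.0, 0.9, 0.7, 0.5, 0.4, 0.3, 0.2, 0.1]
same_src = envelope_correlation(partial_a, partial_b)
diff_src = envelope_correlation(partial_a, other)
```

    Thresholding such pairwise scores groups spectral components by source; handling vibrato then requires the frequency-tracking extension the abstract describes, since a fixed filter bank loses a component once it drifts out of its passband.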

  13. Separate representation of stimulus frequency, intensity, and duration in auditory sensory memory: an event-related potential and dipole-model analysis.

    Science.gov (United States)

    Giard, M H; Lavikainen, J; Reinikainen, K; Perrin, F; Bertrand, O; Pernier, J; Näätänen, R

    1995-01-01

    The present study analyzed the neural correlates of acoustic stimulus representation in echoic sensory memory. The neural traces of auditory sensory memory were indirectly studied by using the mismatch negativity (MMN), an event-related potential component elicited by a change in a repetitive sound. The MMN is assumed to reflect change detection in a comparison process between the sensory input from a deviant stimulus and the neural representation of repetitive stimuli in echoic memory. The scalp topographies of the MMNs elicited by pure tones deviating from standard tones by either frequency, intensity, or duration varied according to the type of stimulus deviance, indicating that the MMNs for different attributes originate, at least in part, from distinct neural populations in the auditory cortex. This result was supported by dipole-model analysis. If the MMN generator process occurs where the stimulus information is stored, these findings strongly suggest that the frequency, intensity, and duration of acoustic stimuli have a separate neural representation in sensory memory.

  14. Deep learning based beat event detection in action movie franchises

    Science.gov (United States)

    Ejaz, N.; Khan, U. A.; Martínez-del-Amor, M. A.; Sparenberg, H.

    2018-04-01

    Automatic understanding and interpretation of movies can be used in a variety of ways to semantically manage the massive volumes of movie data. The "Action Movie Franchises" dataset is a collection of twenty Hollywood action movies from five famous franchises, with ground-truth annotations at the shot and beat level of each movie. In this dataset, annotations are provided for eleven semantic beat categories. In this work, we propose a deep learning based method to classify shots and beat-events on this dataset. A training dataset for each of the eleven beat categories is developed and then a Convolutional Neural Network is trained. After finding the shot boundaries, key frames are extracted for each shot and three classification labels are assigned to each key frame. The classification labels for the key frames in a particular shot are then used to assign a unique label to the shot. A simple sliding-window based method is then used to group adjacent shots having the same label in order to find a particular beat event. The results of beat event classification are reported in terms of precision, recall, and F-measure. The results are compared with the existing technique and significant improvements are recorded.

  15. PRELIMINARY SELECTION OF MGR DESIGN BASIS EVENTS

    International Nuclear Information System (INIS)

    Kappes, J.A.

    1999-01-01

    The purpose of this analysis is to identify the preliminary design basis events (DBEs) for consideration in the design of the Monitored Geologic Repository (MGR). For external events and natural phenomena (e.g., earthquake), the objective is to identify those initiating events that the MGR will be designed to withstand. Design criteria will ensure that radiological release scenarios resulting from these initiating events are beyond design basis (i.e., have a scenario frequency less than once per million years). For internal events (i.e., human-induced and random equipment failures), the objective is to identify credible event sequences that result in bounding radiological releases. These sequences will be used to establish design basis criteria for MGR structures, systems, and components (SSCs) in order to prevent or mitigate radiological releases. The safety strategy presented in this analysis for preventing or mitigating DBEs is based on the preclosure safety strategy outlined in ''Strategy to Mitigate Preclosure Offsite Exposure'' (CRWMS M&O 1998f). DBE analysis is necessary to provide feedback and requirements to the design process, and also to demonstrate compliance with proposed 10 CFR 63 (Dyer 1999b) requirements. DBE analysis is also required to identify and classify the SSCs that are important to safety (ITS).

  16. Declarative event based models of concurrency and refinement in psi-calculi

    DEFF Research Database (Denmark)

    Normann, Håkon; Johansen, Christian; Hildebrandt, Thomas

    2015-01-01

    Psi-calculi constitute a parametric framework for nominal process calculi, where constraint based process calculi and process calculi for mobility can be defined as instances. We apply here the framework of psi-calculi to provide a foundation for the exploration of declarative event-based process calculi with support for run-time refinement. We first provide a representation of the model of finite prime event structures as an instance of psi-calculi and prove that the representation respects the semantics up to concurrency diamonds and action refinement. We then proceed to give a psi-calculi representation of Dynamic Condition Response Graphs, which conservatively extends prime event structures to allow finite representations of (omega) regular finite (and infinite) behaviours and have been shown to support run-time adaptation and refinement. We end by outlining the final aim of this research.

  17. Stochastic generation of hourly rainstorm events in Johor

    International Nuclear Information System (INIS)

    Nojumuddin, Nur Syereena; Yusof, Fadhilah; Yusop, Zulkifli

    2015-01-01

    Engineers and researchers in water-related studies are often faced with the problem of having insufficiently long rainfall records. Practical and effective methods must be developed to generate unavailable data from the limited available data. Therefore, this paper presents a Monte-Carlo based stochastic hourly rainfall generation model to complement the unavailable data. The Monte Carlo simulation used in this study is based on the best fit of storm characteristics. Using Maximum Likelihood Estimation (MLE) and the Anderson-Darling goodness-of-fit test, the lognormal appeared to be the best-fitting rainfall distribution; therefore, Monte Carlo simulation based on the lognormal distribution was used in the study. The proposed model was verified by comparing the statistical moments of rainstorm characteristics from the combination of the observed rainstorm events over 10 years and simulated rainstorm events over 30 years of rainfall records with those from the entire 40 years of observed rainfall data, based on the hourly rainfall data at station J1 in Johor over the period 1972–2011. The absolute percentage errors of the duration-depth, duration-inter-event time and depth-inter-event time relationships were used as the accuracy test. The results showed that the first four product-moments of the observed rainstorm characteristics were close to those of the simulated rainstorm characteristics. The proposed model can be used as a basis to derive rainfall intensity-duration-frequency relationships in Johor.

  18. Stochastic generation of hourly rainstorm events in Johor

    Science.gov (United States)

    Nojumuddin, Nur Syereena; Yusof, Fadhilah; Yusop, Zulkifli

    2015-02-01

    Engineers and researchers in water-related studies are often faced with the problem of having insufficiently long rainfall records. Practical and effective methods must be developed to generate unavailable data from the limited available data. Therefore, this paper presents a Monte-Carlo based stochastic hourly rainfall generation model to complement the unavailable data. The Monte Carlo simulation used in this study is based on the best fit of storm characteristics. Using Maximum Likelihood Estimation (MLE) and the Anderson-Darling goodness-of-fit test, the lognormal appeared to be the best-fitting rainfall distribution; therefore, Monte Carlo simulation based on the lognormal distribution was used in the study. The proposed model was verified by comparing the statistical moments of rainstorm characteristics from the combination of the observed rainstorm events over 10 years and simulated rainstorm events over 30 years of rainfall records with those from the entire 40 years of observed rainfall data, based on the hourly rainfall data at station J1 in Johor over the period 1972-2011. The absolute percentage errors of the duration-depth, duration-inter-event time and depth-inter-event time relationships were used as the accuracy test. The results showed that the first four product-moments of the observed rainstorm characteristics were close to those of the simulated rainstorm characteristics. The proposed model can be used as a basis to derive rainfall intensity-duration-frequency relationships in Johor.

  19. Stochastic generation of hourly rainstorm events in Johor

    Energy Technology Data Exchange (ETDEWEB)

    Nojumuddin, Nur Syereena; Yusof, Fadhilah [Department of Mathematical Sciences, Faculty of Science, Universiti Teknologi Malaysia, 81310 UTM Johor Bahru, Johor (Malaysia); Yusop, Zulkifli [Institute of Environmental and Water Resources Management, Universiti Teknologi Malaysia, 81310 UTM Johor Bahru, Johor (Malaysia)

    2015-02-03

    Engineers and researchers in water-related studies are often faced with the problem of having insufficiently long rainfall records. Practical and effective methods must be developed to generate unavailable data from the limited available data. Therefore, this paper presents a Monte-Carlo based stochastic hourly rainfall generation model to complement the unavailable data. The Monte Carlo simulation used in this study is based on the best fit of storm characteristics. Using Maximum Likelihood Estimation (MLE) and the Anderson-Darling goodness-of-fit test, the lognormal appeared to be the best-fitting rainfall distribution; therefore, Monte Carlo simulation based on the lognormal distribution was used in the study. The proposed model was verified by comparing the statistical moments of rainstorm characteristics from the combination of the observed rainstorm events over 10 years and simulated rainstorm events over 30 years of rainfall records with those from the entire 40 years of observed rainfall data, based on the hourly rainfall data at station J1 in Johor over the period 1972–2011. The absolute percentage errors of the duration-depth, duration-inter-event time and depth-inter-event time relationships were used as the accuracy test. The results showed that the first four product-moments of the observed rainstorm characteristics were close to those of the simulated rainstorm characteristics. The proposed model can be used as a basis to derive rainfall intensity-duration-frequency relationships in Johor.
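
The three records above describe fitting a lognormal distribution to storm characteristics by MLE and then generating synthetic events by Monte Carlo simulation. A minimal sketch of that pipeline, using invented storm depths rather than the Johor station J1 data:

```python
import math
import random
import statistics

def fit_lognormal(depths):
    """MLE for a lognormal: mean and (population) std of the log-transformed data."""
    logs = [math.log(d) for d in depths]
    return statistics.fmean(logs), statistics.pstdev(logs)

def simulate_depths(mu, sigma, n_events, seed=0):
    """Monte Carlo draw of synthetic rainstorm depths (mm)."""
    rng = random.Random(seed)
    return [rng.lognormvariate(mu, sigma) for _ in range(n_events)]

# Hypothetical observed storm depths in mm (illustrative, not the Johor data).
observed = [3.2, 7.5, 12.1, 4.8, 22.0, 9.3, 15.6, 6.1, 2.7, 18.4]
mu, sigma = fit_lognormal(observed)
simulated = simulate_depths(mu, sigma, n_events=5000)

# Accuracy check in the spirit of the paper: compare sample moments of the
# observed and simulated events via an absolute percentage error.
obs_mean = statistics.fmean(observed)
sim_mean = statistics.fmean(simulated)
abs_pct_error = abs(sim_mean - obs_mean) / obs_mean * 100
```

The same comparison would be repeated for higher product-moments and for the other rainstorm characteristics (duration, inter-event time) before accepting the generator.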

  20. The analysis of cable forces based on natural frequency

    Science.gov (United States)

    Suangga, Made; Hidayat, Irpan; Juliastuti; Bontan, Darwin Julius

    2017-12-01

    A cable is a flexible structural member that is effective at resisting tensile forces. Cables are used in a variety of structures that exploit this property to create efficient tension members. The state of the cable forces in a cable-supported structure is an important indicator of whether the structure is in good condition. Several methods have been developed to measure cable forces on site. The vibration technique, which uses the correlation between natural frequency and cable force, is a simple method for determining in-situ cable forces; however, it requires accurate information on the boundary conditions, cable mass, and cable length. The natural frequency of the cable is determined by applying the FFT (Fast Fourier Transform) to the recorded acceleration of the cable. Based on the natural frequency obtained, the cable force can then be determined analytically or with a finite element program. This research focuses on the vibration technique for determining cable forces, on understanding the effect of the physical parameters of the cable, and on modelling techniques relating natural frequency to cable force.
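
For the vibration technique described above, the simplest analytical model is the taut string, in which the n-th natural frequency relates tension T, length L, and mass per unit length m via f_n = (n/2L)·sqrt(T/m). A sketch with hypothetical cable parameters (the paper itself works from FFT-processed acceleration records and finite element models, and a real cable also has bending stiffness and sag, which this model ignores):

```python
import math

def cable_tension(freq_hz, mode_n, length_m, mass_per_m):
    """Taut-string estimate: f_n = (n / 2L) * sqrt(T / m)  =>  T = 4 m L^2 (f_n / n)^2."""
    return 4.0 * mass_per_m * length_m**2 * (freq_hz / mode_n) ** 2

def natural_frequency(tension_n, mode_n, length_m, mass_per_m):
    """Inverse relation, useful for cross-checking a finite element result."""
    return (mode_n / (2.0 * length_m)) * math.sqrt(tension_n / mass_per_m)

# Hypothetical stay cable: 100 m long, 50 kg/m, first mode identified at 1.1 Hz
# from the FFT peak of an acceleration record.
T = cable_tension(freq_hz=1.1, mode_n=1, length_m=100.0, mass_per_m=50.0)   # in newtons
f_check = natural_frequency(T, mode_n=1, length_m=100.0, mass_per_m=50.0)   # should recover 1.1 Hz
```

In practice several mode frequencies are identified and the tension estimates averaged, which also reveals whether the taut-string assumption holds (the f_n/n ratios should be constant).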

  1. Modeling time to recovery and initiating event frequency for loss of off-site power incidents at nuclear power plants

    International Nuclear Information System (INIS)

    Iman, R.L.; Hora, S.C.

    1988-01-01

    Industry data representing the time to recovery of loss of off-site power at nuclear power plants for 63 incidents caused by plant-centered losses, grid losses, or severe weather losses are fit with exponential, lognormal, gamma and Weibull probability models. A Bayesian analysis is used to compare the adequacy of each of these models and to provide uncertainty bounds on each of the fitted models. A composite model that combines the probability models fitted to each of the three sources of data is presented as a method for predicting the time to recovery of loss of off-site power. The composite model is very general and can be made site specific by making adjustments on the models used, such as might occur due to the type of switchyard configuration or type of grid, and by adjusting the weights on the individual models, such as might occur with weather conditions existing at a particular plant. Adjustments in the composite model are shown for different models used for switchyard configuration and for different weights due to weather. Bayesian approaches are also presented for modeling the frequency of initiating events leading to loss of off-site power. One Bayesian model assumes that all plants share a common incidence rate for loss of off-site power, while the other Bayesian approach models the incidence rate for each plant relative to the incidence rates of all other plants. Combining the Bayesian models for the frequency of the initiating events with the composite Bayesian model for recovery provides the necessary vehicle for a complete model that incorporates uncertainty into a probabilistic risk assessment
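
As an illustration of the composite-model idea above, each loss category can be given its own recovery-time distribution fitted by MLE, and the category survival functions mixed with weights reflecting their relative frequency. The sketch below uses the exponential model (the simplest of the four candidates named in the abstract) and invented recovery times, not the industry data:

```python
import math

def mle_rate(durations_h):
    """MLE of the exponential rate: lambda_hat = n / sum(t_i) = 1 / sample mean."""
    return len(durations_h) / sum(durations_h)

def p_not_recovered(t_h, rate):
    """Exponential survival function: probability off-site power is still lost at time t."""
    return math.exp(-rate * t_h)

def composite_survival(t_h, groups):
    """Mixture over loss categories, weighted by relative frequency.
    groups: list of (weight, rate) pairs; weights sum to 1."""
    return sum(w * p_not_recovered(t_h, r) for w, r in groups)

# Hypothetical recovery times (hours) per loss category (illustrative only).
plant = [0.5, 1.0, 0.3, 2.0]      # plant-centered losses
grid = [1.5, 3.0, 2.5]            # grid losses
weather = [6.0, 12.0, 8.0]        # severe-weather losses

# Hypothetical category weights; a site-specific model would adjust these,
# e.g. for the weather conditions at a particular plant.
groups = [(0.5, mle_rate(plant)), (0.3, mle_rate(grid)), (0.2, mle_rate(weather))]
p4 = composite_survival(4.0, groups)   # chance off-site power is still lost after 4 h
```

The site-specific adjustments described in the abstract correspond to swapping the per-category model (e.g. for switchyard configuration) or re-weighting the mixture (e.g. for local weather).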

  2. Preventing Medication Error Based on Knowledge Management Against Adverse Event

    OpenAIRE

    Hastuti, Apriyani Puji; Nursalam, Nursalam; Triharini, Mira

    2017-01-01

    Introduction: Medication error is one of many types of errors that can decrease the quality and safety of healthcare. An increasing number of adverse events (AE) reflects the number of medication errors. This study aimed to develop a model of medication error prevention based on knowledge management. This model is expected to improve the knowledge and skill of nurses to prevent medication error, which is characterized by a decrease in adverse events (AE). Methods: This study consisted of two sta...

  3. Portable atomic frequency standard based on coherent population trapping

    Science.gov (United States)

    Shi, Fan; Yang, Renfu; Nian, Feng; Zhang, Zhenwei; Cui, Yongshun; Zhao, Huan; Wang, Nuanrang; Feng, Keming

    2015-05-01

    In this work, a portable atomic frequency standard based on coherent population trapping is designed and demonstrated. To achieve a portable prototype, a single transverse mode 795 nm VCSEL modulated by a 3.4 GHz RF source is used as a pump laser which generates the coherent light fields. The pump beams pass through a vapor cell containing atom gas and buffer gas. This vapor cell is surrounded by a magnetic shield and placed inside a solenoid which applies a longitudinal magnetic field to lift the degeneracy of the Zeeman energy levels and to separate the resonance signal, which has no first-order magnetic field dependence, from the field-dependent resonances. The electrical control system comprises two control loops: the first locks the laser wavelength to the minimum of the absorption spectrum; the second locks the modulation frequency and the output standard frequency. Furthermore, we designed the micro physical package and successfully realized the locking of a portable coherent population trapping atomic frequency standard prototype. The short-term frequency stability of the whole system is measured to be 6×10⁻¹¹ for an averaging time of 1 s, reaching 5×10⁻¹² at an averaging time of 1000 s.

  4. Risk-based ranking of dominant contributors to maritime pollution events

    International Nuclear Information System (INIS)

    Wheeler, T.A.

    1993-01-01

    This report describes a conceptual approach for identifying dominant contributors to risk from maritime shipping of hazardous materials. Maritime transportation accidents are relatively common occurrences compared to more frequently analyzed contributors to public risk. Yet research on maritime safety and pollution incidents has not been guided by a systematic, risk-based approach. Maritime shipping accidents can be analyzed using event trees to group the accidents into 'bins,' or groups, of similar characteristics such as type of cargo, location of accident (e.g., harbor, inland waterway), type of accident (e.g., fire, collision, grounding), and size of release. The importance of specific types of events to each accident bin can be quantified. Then the overall importance of accident events to risk can be estimated by weighting the events' individual bin importance measures by the risk associated with each accident bin. 4 refs., 3 figs., 6 tabs

  5. Reference Beam Pattern Design for Frequency Invariant Beamforming Based on Fast Fourier Transform

    Directory of Open Access Journals (Sweden)

    Wang Zhang

    2016-09-01

    Full Text Available In the field of fast Fourier transform (FFT)-based frequency invariant beamforming (FIB), there is still an unsolved problem: the selection of the reference beam that makes the designed wideband pattern frequency invariant (FI) over a given frequency range. This problem is studied in this paper. The research shows that, for a given array, the selection of the reference beam pattern is determined by the number of sensors and the ratio of the highest frequency to the lowest frequency of the signal (RHL). The length of the weight vector corresponding to a given reference beam pattern depends on the reference frequency. In addition, the upper bound of the weight length that ensures the FI property over the whole frequency band of interest is also given. When constraints are added to the reference beam, they do not affect the FI property of the designed wideband beam as long as the symmetry of the reference beam is ensured. Based on this conclusion, a scheme for reference beam design is proposed.

  6. GIS-based rare events logistic regression for mineral prospectivity mapping

    Science.gov (United States)

    Xiong, Yihui; Zuo, Renguang

    2018-02-01

    Mineralization is a special type of singularity event, and can be considered as a rare event, because within a specific study area the number of prospective locations (1s) are considerably fewer than the number of non-prospective locations (0s). In this study, GIS-based rare events logistic regression (RELR) was used to map the mineral prospectivity in the southwestern Fujian Province, China. An odds ratio was used to measure the relative importance of the evidence variables with respect to mineralization. The results suggest that formations, granites, and skarn alterations, followed by faults and aeromagnetic anomaly are the most important indicators for the formation of Fe-related mineralization in the study area. The prediction rate and the area under the curve (AUC) values show that areas with higher probability have a strong spatial relationship with the known mineral deposits. Comparing the results with original logistic regression (OLR) demonstrates that the GIS-based RELR performs better than OLR. The prospectivity map obtained in this study benefits the search for skarn Fe-related mineralization in the study area.
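
The odds ratio used above to rank the evidence variables can be computed from a 2x2 cross-tabulation of evidence presence against known mineralization status. A sketch with hypothetical cell counts (not the Fujian data):

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:
        a = prospective locations with the evidence,     b = prospective without it,
        c = non-prospective locations with the evidence, d = non-prospective without it.
    OR > 1 means the evidence variable favors mineralization."""
    return (a * d) / (b * c)

# Hypothetical raster counts: presence of skarn alteration vs. known Fe deposits.
a, b, c, d = 40, 10, 200, 750
or_skarn = odds_ratio(a, b, c, d)
log_or = math.log(or_skarn)  # the symmetric log-odds scale often reported alongside
```

In a rare-events setting the 0s vastly outnumber the 1s, which is exactly why the abstract's RELR correction to ordinary logistic regression matters: without it, the estimated intercept (and hence the predicted probabilities) is biased.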

  7. Research on frequency control strategy of interconnected region based on fuzzy PID

    Science.gov (United States)

    Zhang, Yan; Li, Chunlan

    2018-05-01

    In order to improve the frequency control performance of interconnected power grids and overcome the poor robustness and slow adjustment of traditional regulation, this paper puts forward a frequency control method based on fuzzy PID. The method takes the frequency deviation and tie-line power deviation of each area as the control objectives, takes the regional frequency deviation and its rate of change as inputs, and uses fuzzy mathematics theory to adjust the PID control parameters online. A regional frequency control model of complementary hydro and thermal generation is established in MATLAB, the regional frequency control strategy is given, and three control modes (TBC-FTC, FTC-FTC, FFC-FTC) are simulated and analyzed. The simulation and experimental results show that this method has better control performance than traditional regional frequency regulation.

  8. High Frequency Supercapacitors for Piezo-based Energy Harvesting

    Science.gov (United States)

    Ervin, Matthew; Pereira, Carlos; Miller, John; Outlaw, Ronald; Rastegar, Jay; Murray, Richard

    2013-03-01

    Energy harvesting is being investigated as an alternative to batteries for powering munition guidance and fuzing functions during flight. A piezoelectric system that generates energy from the oscillation of a mass on a spring (set in motion by the launch acceleration) is being developed. Original designs stored this energy in an electrolytic capacitor for use during flight. Here we replace the electrolytic capacitor with a smaller, lighter, and potentially more reliable electrochemical double layer capacitor (aka, supercapacitor). The potential problems with using supercapacitors in this application are that the piezoelectric output greatly exceeds the supercapacitor electrolyte breakdown voltage, and the frequency greatly exceeds the operating frequency of commercial supercapacitors. Here we have investigated the use of ultrafast vertically oriented graphene array-based supercapacitors for storing the energy in this application. We find that the electrolyte breakdown is not a serious limitation as it is either kinetically limited by the relatively high frequency of the piezoelectric output, or it is overcome by the self-healing nature of supercapacitors. We also find that these supercapacitors have sufficient dynamic response to efficiently store the generated energy.

  9. [Application of negative binomial regression and modified Poisson regression in the research of risk factors for injury frequency].

    Science.gov (United States)

    Cao, Qingqing; Wu, Zhenqiang; Sun, Ying; Wang, Tiezhu; Han, Tengwei; Gu, Chaomei; Sun, Yehuan

    2011-11-01

    To explore the application of negative binomial regression and modified Poisson regression analysis in analyzing the influential factors for injury frequency and the risk factors leading to an increase in injury frequency. A total of 2917 primary and secondary school students were selected from Hefei by the cluster random sampling method and surveyed by questionnaire. The count data on event-based injuries were used to fit modified Poisson regression and negative binomial regression models. The risk factors leading to an increase in the frequency of unintentional injury among juvenile students were explored, so as to probe the efficiency of these two models in studying the influential factors for injury frequency. The Poisson model exhibited over-dispersion; both the modified Poisson regression and the negative binomial regression models fitted the data better. Both showed that male gender, younger age, having a father working outside the hometown, a guardian educated above junior high school level, and smoking might be associated with higher injury frequencies. For clustered count data on injury events, both modified Poisson regression analysis and negative binomial regression analysis can be used. However, based on our data, the modified Poisson regression fitted better and this model could give a more accurate interpretation of the relevant factors affecting the frequency of injury.
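
The over-dispersion that rules out the plain Poisson model above can be screened with the variance-to-mean ratio of the raw counts. A sketch with hypothetical injury counts (not the Hefei survey data):

```python
import statistics

def dispersion_index(counts):
    """Variance-to-mean ratio of count data: approximately 1 under a Poisson model,
    markedly > 1 signals over-dispersion, in which case a negative binomial model
    (or a 'modified' Poisson fit with robust standard errors) is the safer choice."""
    m = statistics.fmean(counts)
    return statistics.pvariance(counts) / m

# Hypothetical injuries per student over one year; a few students account for
# most events, a typical pattern in injury-frequency data.
injuries = [0, 0, 1, 0, 2, 0, 0, 5, 1, 0, 0, 3, 0, 1, 0, 7, 0, 0, 2, 0]
d = dispersion_index(injuries)
overdispersed = d > 1.0
```

This check only motivates the model choice; the regressions themselves (with covariates such as gender, age, and guardian education) would be fitted with a statistics package.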

  10. Tracing the Spatial-Temporal Evolution of Events Based on Social Media Data

    Directory of Open Access Journals (Sweden)

    Xiaolu Zhou

    2017-03-01

    Full Text Available Social media data provide a great opportunity to investigate event flow in cities. Despite the advantages of social media data in these investigations, data heterogeneity and big data size pose challenges to researchers seeking to identify useful information about events from the raw data. In addition, few studies have used social media posts to capture how events develop in space and time. This paper demonstrates an efficient approach based on machine learning and geovisualization to identify events and trace their development in real-time. We conducted an empirical study to delineate the temporal and spatial evolution of a natural event (heavy precipitation) and a social event (Pope Francis’ visit to the US) in the New York City-Washington, DC region. By investigating multiple features of Twitter data (message, author, time, and geographic location information), this paper demonstrates how voluntary local knowledge from tweets can be used to depict city dynamics, discover spatiotemporal characteristics of events, and convey real-time information.

  11. System frequency support of permanent magnet synchronous generator-based wind power plant

    Science.gov (United States)

    Wu, Ziping

    With the ever-increasing penetration of wind power into modern electric grids all over the world, a trending replacement of conventional synchronous generators by large wind power plants will likely result in poor overall frequency regulation performance. On the other hand, the permanent magnet synchronous generator wind turbine system (PMSG-WTG) with full power back-to-back converters tends to become one of the most promising wind turbine technologies thanks to various advantages. It possesses a significant amount of kinetic energy stored in the rotating mass of the turbine blades, which can be utilized to enhance the total inertia of the power system. Additionally, deloaded operation and decoupled control of active and reactive power make it possible for a PMSG-WTG to provide fast frequency regulation through the full-power converter. First of all, a comprehensive and in-depth survey is conducted to analyze the motivations for incorporating the inertial response and frequency regulation of VSWTs into system frequency regulation. Besides, control classifications, fundamental control concepts and advanced control schemes implemented for auxiliary frequency support of an individual WT or a wind power plant are elaborated, along with a comparison of the potential frequency regulation capabilities of four major types of WTs. Secondly, a Controls Advanced Research Turbine 2 Permanent Magnet Synchronous Generator wind turbine (CART2-PMSG) integrated model representing the typical configuration and operation characteristics of a PMSG-WT is established in MATLAB/Simulink. Meanwhile, two different rotor-side converter control schemes, including rotor speed-based control and active power-based control, are integrated into this CART2-PMSG integrated model to perform Maximum Power Point Tracking (MPPT) operation over a wide range of wind speeds.
Thirdly, a novel comprehensive frequency regulation (CFR) control scheme is developed and implemented into the CART2-PMSG model based

  12. Application and Use of PSA-based Event Analysis in Belgium

    International Nuclear Information System (INIS)

    Hulsmans, M.; De Gelder, P.

    2003-01-01

    The paper describes the experiences of the Belgian nuclear regulatory body AVN with the application and the use of the PSAEA guidelines (PSA-based Event Analysis). In 2000, risk-based precursor analysis has increasingly become a part of the AVN process of feedback of operating experience, and constitutes in fact the first PSA application for the Belgian plants. The PSAEA guidelines were established by a consultant in the framework of an international project. In a first stage, AVN applied the PSAEA guidelines to two test cases in order to explore the feasibility and the interest of this type of probabilistic precursor analysis. These pilot studies demonstrated the applicability of the PSAEA method in general, and its applicability to the computer models of the Belgian state-of-the-art PSAs in particular. They revealed insights regarding the event analysis methodology, the resulting event severity and the PSA model itself. The consideration of relevant what-if questions allowed to identify - and in some cases also to quantify - several potential safety issues for improvement. The internal evaluation of PSAEA was positive and AVN decided to routinely perform several PSAEA studies per year. During 2000, PSAEA has increasingly become a part of the AVN process of feedback of operating experience. The objectives of the AVN precursor program have been clearly stated. A first pragmatic set of screening rules for operational events has been drawn up and applied. Six more operational events have been analysed in detail (initiating events as well as condition events) and resulted in a wide spectrum of event severity. In addition to the particular conclusions for each event, relevant insights have been gained regarding for instance event modelling and the interpretation of results. Particular attention has been devoted to the form of the analysis report. After an initial presentation of some key concepts, the particular context of this program and of AVN's objectives, the

  13. Solar observations with a low frequency radio telescope

    Science.gov (United States)

    Myserlis, I.; Seiradakis, J.; Dogramatzidis, M.

    2012-01-01

    We have set up a low frequency radio monitoring station for solar bursts at the Observatory of the Aristotle University in Thessaloniki. The station consists of a dual dipole phased array, a radio receiver and a dedicated computer with the necessary software installed. The constructed radio receiver is based on NASA's Radio Jove project. It operates continuously, since July 2010, at 20.1 MHz (close to the long-wavelength ionospheric cut-off of the radio window) with a narrow bandwidth (~5 kHz). The system is properly calibrated, so that the recorded data are expressed in antenna temperature. Despite the high interference level of an urban region like Thessaloniki (strong broadcasting shortwave radio stations, periodic experimental signals, CBs, etc), we have detected several low frequency solar radio bursts and correlated them with solar flares, X-ray events and other low frequency solar observations. The received signal is monitored in ordinary ASCII format and as audio signal, in order to investigate and exclude man-made radio interference. In order to exclude narrow band interference and calculate the spectral indices of the observed events, a second monitoring station, working at 36 MHz, is under construction at the village of Nikiforos near the town of Drama, about 130 km away from Thessaloniki. Finally, we plan to construct a third monitoring station at 58 MHz, in Thessaloniki. This frequency was revealed to be relatively free of interference, after a thorough investigation of the region.
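
The spectral indices mentioned above follow from flux densities measured at two frequencies, assuming a power law S ∝ f^α. A sketch using the two station frequencies from the abstract and invented flux values:

```python
import math

def spectral_index(s1, f1, s2, f2):
    """alpha in S ∝ f^alpha, estimated from flux densities at two frequencies."""
    return math.log(s2 / s1) / math.log(f2 / f1)

# Hypothetical flux densities (arbitrary but consistent units) of a burst seen
# simultaneously at the 20.1 MHz and 36 MHz stations.
alpha = spectral_index(s1=100.0, f1=20.1e6, s2=40.0, f2=36.0e6)
```

A negative alpha, as here, indicates a spectrum falling with frequency; with a third station at 58 MHz the power-law assumption itself could be tested by comparing the two pairwise indices.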

  14. Frequency and seasonality of flash floods in Slovenia

    Directory of Open Access Journals (Sweden)

    Trobec Tajan

    2017-01-01

    Full Text Available The purpose of this paper is to assess and analyse the dynamics of flash flooding events in Slovenia. The paper examines in particular the frequency of flash floods and their seasonal distribution. The methodology is based on the analysis of historical records and modern flood data. The results of a long-term frequency analysis of 138 flash floods that occurred between 1550 and 2015 are presented. Because of the lack of adequate historical flood data prior to 1950 the main analysis is based on data for the period between 1951 and 2015, while the analysis of data for the period between 1550 and 1950 is added as a supplement to the main analysis. Analysis of data for the period after 1950 shows that on average 1.3 flash floods occur each year in Slovenia. The linear trend for the number of flash floods is increasing but is not statistically significant. Despite the fact that the majority of Slovenian rivers have one of the peaks in spring and one of the lows in summer, 90% of flash floods actually occur during meteorological summer or autumn - i.e. between June and November, which shows that discharge regimes and flood regimes are not necessarily related. Because of the lack of flood records from the more distant past as well as the large variability of flash flood events in the last several decades, we cannot provide a definitive answer to the question about possible changes in their frequency and seasonality by relying solely on the detected trends. Nevertheless, considering the results of analysis and future climate change scenarios the frequency of flash floods in Slovenia could increase while the period of flash flood occurrence could be extended.

  15. Biometric identification based on novel frequency domain facial asymmetry measures

    Science.gov (United States)

    Mitra, Sinjini; Savvides, Marios; Vijaya Kumar, B. V. K.

    2005-03-01

    In the modern world, the ever-growing need to ensure a system's security has spurred the growth of the newly emerging technology of biometric identification. The present paper introduces a novel set of facial biometrics based on quantified facial asymmetry measures in the frequency domain. In particular, we show that these biometrics work well for face images showing expression variations and have the potential to do so in the presence of illumination variations as well. A comparison of the recognition rates with those obtained from spatial domain asymmetry measures based on raw intensity values suggests that the frequency domain representation is more robust to intra-personal distortions and is a novel approach for performing biometric identification. In addition, some feature analysis based on statistical methods comparing the asymmetry measures across different individuals and across different expressions is presented.
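    A minimal sketch of one plausible reading of a frequency-domain asymmetry measure (an assumption for illustration, not the authors' exact features): take the difference between an intensity row and its left-right mirror, then use DFT magnitudes of that difference as features.

```python
import cmath

def dft_mag(xs):
    """Naive DFT magnitudes of a real sequence (illustrative, O(n^2))."""
    n = len(xs)
    return [abs(sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                    for i, x in enumerate(xs))) for k in range(n)]

def asymmetry_features(row):
    """Frequency-domain asymmetry of one image row: DFT magnitudes of the
    difference between the row and its left-right mirror. A perfectly
    symmetric row yields all-zero features."""
    diff = [a - b for a, b in zip(row, reversed(row))]
    return dft_mag(diff)

row = [10, 12, 15, 14, 11, 9]  # hypothetical pixel intensity row
print(asymmetry_features(row))
```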

  16. Analysis of core damage frequency from internal events: Surry, Unit 1

    International Nuclear Information System (INIS)

    Harper, F.T.

    1986-11-01

    This document contains the accident sequence analyses for Surry, Unit 1; one of the reference plants being examined as part of the NUREG-1150 effort by the Nuclear Regulatory Commission (NRC). NUREG-1150 will document the risk of a selected group of nuclear power plants. As part of that work, this report contains the overall core damage frequency estimate for Surry, Unit 1, and the accompanying plant damage state frequencies. Sensitivity and uncertainty analyses provide additional insights regarding the dominant contributors to the Surry core damage frequency estimate. The numerical results are driven to some degree by modeling assumptions and data selection for issues such as reactor coolant pump seal LOCAs, common cause failure probabilities, and plant response to station blackout and loss of electrical bus initiators. The sensitivity studies explore the impact of alternate theories and data on these issues.

  17. An XML-Based Protocol for Distributed Event Services

    Science.gov (United States)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)

    2001-01-01

    This viewgraph presentation provides information on the application of an XML (extensible mark-up language)-based protocol to the developing field of distributed processing by way of a computational grid which resembles an electric power grid. XML tags would be used to transmit events between the participants of a transaction, namely, the consumer and the producer of the grid scheme.

  18. Multi Agent System Based Wide Area Protection against Cascading Events

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Liu, Leo

    2012-01-01

    In this paper, a multi-agent system based wide area protection scheme is proposed in order to prevent long term voltage instability induced cascading events. The distributed relays and controllers work as a device agent which not only executes the normal function automatically but also can...... the effectiveness of proposed protection strategy. The simulation results indicate that the proposed multi agent control system can effectively coordinate the distributed relays and controllers to prevent the long term voltage instability induced cascading events....

  19. Declarative Event-Based Workflow as Distributed Dynamic Condition Response Graphs

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2010-01-01

    We present Dynamic Condition Response Graphs (DCR Graphs) as a declarative, event-based process model inspired by the workflow language employed by our industrial partner and conservatively generalizing prime event structures. A dynamic condition response graph is a directed graph with nodes repr...... exemplify the use of distributed DCR Graphs on a simple workflow taken from a field study at a Danish hospital, pointing out their flexibility compared to imperative workflow models. Finally we provide a mapping from DCR Graphs to Buchi-automata....

  20. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  1. Simulation of Quantum Computation : A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  2. Crossed SMPS MOSFET-based protection circuit for high frequency ultrasound transceivers and transducers.

    Science.gov (United States)

    Choi, Hojong; Shung, K Kirk

    2014-06-12

    The ultrasonic transducer is one of the core components of ultrasound systems, and the transducer's sensitivity is significantly related to the loss of electronic components such as the transmitter, receiver, and protection circuit. In an ultrasonic device, protection circuits are commonly used to isolate the electrical noise between an ultrasound transmitter and transducer and to minimize unwanted discharged pulses in order to protect the ultrasound receiver. However, the performance of the protection circuit and transceiver obviously degrades as the operating frequency or voltage increases. We therefore developed a crossed SMPS (Switching Mode Power Supply) MOSFET-based protection circuit in order to maximize the sensitivity of high frequency transducers in ultrasound systems. High frequency pulse signals need to trigger the transducer, and high frequency echo signals must be received from the transducer. We therefore selected the SMPS MOSFET, which is the main component of the protection circuit, to minimize the loss in high frequency operation. The crossed configuration of the protection circuit can drive balanced bipolar high voltage signals from the pulser and transfer the balanced low voltage echo signals from the transducer. The equivalent circuit models of the SMPS MOSFET-based protection circuit are shown in order to select the proper device components. The schematic diagram and operation mechanism of the protection circuit are provided to show how the protection circuit is constructed. A P-Spice circuit simulation was also performed in order to estimate the performance of the crossed MOSFET-based protection circuit. We compared the performance of our crossed SMPS MOSFET-based protection circuit with a commercial diode-based protection circuit. At 60 MHz, our expander and limiter circuits have lower insertion loss than the commercial diode-based circuits. The pulse-echo test is a typical method to evaluate the sensitivity of ultrasonic transducers

  3. Frequency support capability of variable speed wind turbine based on electromagnetic coupler

    DEFF Research Database (Denmark)

    You, Rui; Barahona Garzón, Braulio; Chai, Jianyun

    2015-01-01

    In the variable speed wind turbine based on electromagnetic coupler (WT-EMC), a synchronous generator is directly coupled with the grid. So, like conventional power plants, WT-EMC is able to support grid frequency inherently. But due to the reduced inertia of the synchronous generator, its frequency support...... capability has to be enhanced. In this paper, the frequency support capability of WT-EMC is studied at three typical wind conditions and with two control strategies, droop control and inertial control, to enhance its frequency support capability. The synchronous generator speed, more stable than the grid...
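    The two control strategies mentioned above can be sketched as follows; the 50 Hz base, the gains, and the virtual inertia constant are illustrative assumptions, not WT-EMC parameters:

```python
# Hedged sketch of droop and inertial frequency support.
# Gains and the 50 Hz nominal frequency are illustrative assumptions.

def droop_power(f, f_nom=50.0, k_droop=20.0):
    """Droop control: support power (pu) proportional to the
    frequency deviation from nominal."""
    return -k_droop * (f - f_nom) / f_nom

def inertial_power(dfdt, f_nom=50.0, h_virtual=4.0):
    """Inertial control: support power (pu) proportional to the rate of
    change of frequency, emulating 2H * df/dt of a synchronous machine."""
    return -2.0 * h_virtual * dfdt / f_nom

print(droop_power(49.8))     # under-frequency -> positive support power
print(inertial_power(-0.2))  # falling frequency -> positive support power
```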

  4. Simulation of power fluctuation of wind farms based on frequency domain

    DEFF Research Database (Denmark)

    Lin, Jin; Sun, Yuanzhang; Li, Guojie

    2011-01-01

    , however, is incapable of completely explaining the physical mechanism of randomness of power fluctuation. To remedy such a situation, fluctuation modeling based on the frequency domain is proposed. The frequency domain characteristics of stochastic fluctuation on large wind farms are studied using...... the power spectral density of wind speed, the frequency domain model of a wind power generator and the information on weather and geography of the wind farms. The correctness and effectiveness of the model are verified by comparing the measurement data with simulation results of a certain wind farm. © 2011...
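    A hedged sketch of frequency-domain fluctuation modelling (the spectral-representation method, an assumed simplification of the paper's model): draw random phases and sum cosines whose amplitudes match a one-sided power spectral density.

```python
import math, random

def synthesize_from_psd(psd, df, n_samples, dt, seed=0):
    """Spectral-representation sketch: build a fluctuation time series whose
    power matches a one-sided PSD sampled at spacing df, using random
    phases. The PSD values below are hypothetical, not a measured wind
    power spectrum."""
    rng = random.Random(seed)
    # Each spectral bin becomes one cosine: amplitude sqrt(2*S(f)*df).
    comps = [(math.sqrt(2.0 * s * df), (k + 1) * df, rng.uniform(0, 2 * math.pi))
             for k, s in enumerate(psd)]
    return [sum(a * math.cos(2 * math.pi * f * i * dt + ph)
                for a, f, ph in comps) for i in range(n_samples)]

series = synthesize_from_psd([4.0, 2.0, 1.0], df=0.01, n_samples=8, dt=1.0)
print(series)
```

Averaged over many realisations, the variance of the series approaches the integral of the PSD, which is how simulated fluctuations can be checked against measurements.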

  5. A Bayesian Model for Event-based Trust

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl; Sassone, Vladimiro

    2007-01-01

    The application scenarios envisioned for ‘global ubiquitous computing’ have unique requirements that are often incompatible with traditional security paradigms. One alternative currently being investigated is to support security decision-making by explicit representation of principals' trusting...... of the systems from the computational trust literature; the comparison is derived formally, rather than obtained via experimental simulation as traditionally done. With this foundation in place, we formalise a general notion of information about past behaviour, based on event structures. This yields a flexible...

  6. Cooperative Game Study of Airlines Based on Flight Frequency Optimization

    Directory of Open Access Journals (Sweden)

    Wanming Liu

    2014-01-01

    Full Text Available By applying game theory, the relationship between airline ticket price and optimal flight frequency is analyzed. The paper establishes the payoff matrix of flight frequency in the noncooperation scenario and a flight frequency optimization model in the cooperation scenario. The airline alliance profit distribution is converted into a profit distribution game based on cooperative game theory. The profit distribution game is proved to be convex, and there exists an optimal distribution strategy. The results show that joining the airline alliance can increase the airlines' overall profit, that changes in negotiated prices and costs benefit the profit distribution of large airlines, and that the distribution result is in accordance with aviation development.
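    One standard allocation rule for a convex profit distribution game is the Shapley value; the sketch below uses a hypothetical two-airline characteristic function and is not necessarily the paper's own distribution strategy:

```python
from itertools import permutations

def shapley_values(players, value):
    """Shapley value by averaging marginal contributions over all join
    orders (fine for small alliances; factorial in the number of players)."""
    totals = {p: 0.0 for p in players}
    n_orders = 0
    for order in permutations(players):
        n_orders += 1
        coalition = set()
        for p in order:
            before = value(frozenset(coalition))
            coalition.add(p)
            totals[p] += value(frozenset(coalition)) - before
    return {p: t / n_orders for p, t in totals.items()}

# Hypothetical 2-airline game: alone they earn 3 and 5, together 10.
v = {frozenset(): 0, frozenset({'A'}): 3, frozenset({'B'}): 5,
     frozenset({'A', 'B'}): 10}
print(shapley_values(['A', 'B'], lambda s: v[s]))  # → {'A': 4.0, 'B': 6.0}
```

For convex games the Shapley value lies in the core, so no subgroup of airlines has an incentive to leave the alliance.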

  7. Event Management for Teacher-Coaches: Risk and Supervision Considerations for School-Based Sports

    Science.gov (United States)

    Paiement, Craig A.; Payment, Matthew P.

    2011-01-01

    A professional sports event requires considerable planning in which years are devoted to the success of that single activity. School-based sports events do not have that luxury, because high schools across the country host athletic events nearly every day. It is not uncommon during the fall sports season for a combination of boys' and girls'…

  8. Standardized reporting of adverse events after microvascular decompression of cranial nerves; a population-based single-institution consecutive series

    DEFF Research Database (Denmark)

    Bartek, Jiri; Gulati, Sasha; Unsgård, Geirmund

    2016-01-01

    OBJECTIVE: To investigate frequencies of adverse events occurring within 30 days after microvascular decompression (MVD) surgery using a standardized report form of adverse events. METHODS: We conducted a retrospective review of 98 adult patients (≥16 years) treated with MVD between 1 January 1994 and 1 June 2013. Adverse events occurring within 30 days were classified according to the Landriel Ibanez classification for neurosurgical complications: grade I represents any non-life threatening complication treated without invasive procedures; grade II is complications requiring invasive management; grade III is life-threatening adverse events requiring treatment in an intensive care unit (ICU); grade IV is death as a result of complications. We sought to compare our results with reports from the literature. RESULTS: Patients' median age was 61 years (range 26-83), and 64 (65 %) were females...

  9. History of Fire Events in the U.S. Commercial Nuclear Industry

    International Nuclear Information System (INIS)

    Bijan Najafi; Joglar-Biloch, Francisco; Kassawara, Robert P.; Khalil, Yehia

    2002-01-01

    Over the past decade, interest in performance-based fire protection has increased within the nuclear industry. In support of this growing interest, in 1997 the Electric Power Research Institute (EPRI) developed a long-range plan to develop/improve data and tools needed to support Risk-Informed/Performance-Based fire protection. This plan calls for continued improvement in collection and use of information obtained from fire events at nuclear plants. The data collection process has the objectives of improving the insights gained from such data and reducing the uncertainty in fire risk and fire modeling methods in order to make them a more reliable basis for performance based fire protection programs. In keeping with these objectives, EPRI continues to collect, review and analyze fire events in support of the nuclear industry. EPRI collects these records in cooperation with the Nuclear Electric Insurance Limited (NEIL), by compiling public fire event reports and by direct solicitation of U.S. nuclear facilities. The EPRI fire data collection project is based on the principle that the understanding of history is one of the cornerstones of improving fire protection technology and practice. Therefore, the goal has been to develop and maintain a comprehensive database of fire events with the flexibility to support various aspects of fire protection engineering. With more than 1850 fire records over a period of three decades and 2400 reactor years, this is the most comprehensive database of nuclear power industry fire events in existence today. In general, the frequency of fires in the U.S. commercial nuclear industry remains constant. In a few cases, e.g., transient fires and fires in BWR offgas/recombiner systems, where either increasing or decreasing trends are observed, these trends tend to slow after 1980. The key issues in improving the quality of the data remain the consistency of the recording and reporting of fire events and the difficulty of collecting records. EPRI has

  10. The Effects of Semantic Transparency and Base Frequency on the Recognition of English Complex Words

    Science.gov (United States)

    Xu, Joe; Taft, Marcus

    2015-01-01

    A visual lexical decision task was used to examine the interaction between base frequency (i.e., the cumulative frequencies of morphologically related forms) and semantic transparency for a list of derived words. Linear mixed effects models revealed that high base frequency facilitates the recognition of the complex word (i.e., a "base…

  11. Frequency Response Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-01

    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of the North American Electric Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with the frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by the BAL-003-1 Standard.
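    A simplified sketch of a frequency response measure in MW per 0.1 Hz, computed from pre-event (A) and post-event (B) values; the exact point selection and sign conventions of the BAL-003-1 FRS forms are not reproduced here:

```python
def frequency_response_measure(nia_a, nia_b, f_a, f_b):
    """Sketch of a frequency response measure in MW per 0.1 Hz, from
    pre-event (A) and post-event (B) values of net actual interchange (MW)
    and frequency (Hz). A simplified reading of the BAL-003-1 measure,
    not the exact FRS-form calculation."""
    delta_nia = nia_a - nia_b  # MW delivered in response to the event
    delta_f = f_a - f_b        # frequency drop in Hz
    return delta_nia / (delta_f * 10.0)

# Hypothetical under-frequency event: 60.00 -> 59.95 Hz, interchange
# swings from -100 MW to -250 MW as the area injects support power.
print(frequency_response_measure(-100.0, -250.0, 60.00, 59.95))  # ≈ 300 MW/0.1 Hz
```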

  12. Event-based soil loss models for construction sites

    Science.gov (United States)

    Trenouth, William R.; Gharabaghi, Bahram

    2015-05-01

    The elevated rates of soil erosion stemming from land clearing and grading activities during urban development can result in excessive amounts of eroded sediments entering waterways and causing harm to the biota living therein. However, construction site event-based soil loss simulations - required for reliable design of erosion and sediment controls - are one of the most uncertain types of hydrologic models. This study presents models with improved degree of accuracy to advance the design of erosion and sediment controls for construction sites. The new models are developed using multiple linear regression (MLR) on event-based permutations of the Universal Soil Loss Equation (USLE) and artificial neural networks (ANN). These models were developed using surface runoff monitoring datasets obtained from three sites - Greensborough, Cookstown, and Alcona - in Ontario and datasets mined from the literature for three additional sites - Treynor, Iowa, Coshocton, Ohio and Cordoba, Spain. The predictive MLR and ANN models can serve as both diagnostic and design tools for the effective sizing of erosion and sediment controls on active construction sites, and can be used for dynamic scenario forecasting when considering rapidly changing land use conditions during various phases of construction.
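    The MLR component can be sketched with a small normal-equations least-squares fit; the event predictors below (event erosivity and runoff depth) and all numeric values are hypothetical, not the monitored datasets:

```python
def fit_mlr(rows, ys):
    """Least-squares fit of y = b0 + b1*x1 + ... via the normal equations,
    solved by Gaussian elimination with partial pivoting (pure stdlib)."""
    n_feat = len(rows[0]) + 1
    X = [[1.0] + list(r) for r in rows]
    A = [[sum(xi[a] * xi[b] for xi in X) for b in range(n_feat)]
         for a in range(n_feat)]
    rhs = [sum(xi[a] * y for xi, y in zip(X, ys)) for a in range(n_feat)]
    for col in range(n_feat):
        piv = max(range(col, n_feat), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, n_feat):
            f = A[r][col] / A[col][col]
            for c in range(col, n_feat):
                A[r][c] -= f * A[col][c]
            rhs[r] -= f * rhs[col]
    coef = [0.0] * n_feat
    for r in range(n_feat - 1, -1, -1):
        coef[r] = (rhs[r] - sum(A[r][c] * coef[c]
                                for c in range(r + 1, n_feat))) / A[r][r]
    return coef

# Hypothetical event records: (event erosivity, runoff depth) -> soil loss.
events = [(1.0, 2.0), (2.0, 1.0), (3.0, 4.0), (0.0, 0.0), (5.0, 2.0)]
soil_loss = [6.0, 8.5, 13.0, 2.0, 18.0]
print(fit_mlr(events, soil_loss))  # ≈ [2.0, 3.0, 0.5] (intercept, coefficients)
```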

  13. Event-based rainfall-runoff modelling of the Kelantan River Basin

    Science.gov (United States)

    Basarudin, Z.; Adnan, N. A.; Latif, A. R. A.; Tahir, W.; Syafiqah, N.

    2014-02-01

    Flood is one of the most common natural disasters in Malaysia. According to hydrologists there are many causes that contribute to flood events. The two most dominant factors are the meteorological factor (i.e. climate change) and change in land use. These two factors have contributed to floods in recent decades, especially in monsoonal regions such as Malaysia. This paper intends to quantify the influence of rainfall during extreme rainfall events on the hydrological model in the Kelantan River catchment. Therefore, two dynamic inputs were used in the study: rainfall and river discharge. The extreme flood events in 2008 and 2004 were compared based on rainfall data for both years. The events were modeled via a semi-distributed HEC-HMS hydrological model. Land use change was not incorporated in the study because the study only tries to quantify rainfall changes during these two events to simulate the discharge and runoff value. Therefore, the land use data representing the year 2004 were used as inputs in the 2008 runoff model. The study managed to demonstrate that rainfall change has a significant impact on the peak discharge and runoff depth for the study area.

  14. Event-based rainfall-runoff modelling of the Kelantan River Basin

    International Nuclear Information System (INIS)

    Basarudin, Z; Adnan, N A; Latif, A R A; Syafiqah, N; Tahir, W

    2014-01-01

    Flood is one of the most common natural disasters in Malaysia. According to hydrologists there are many causes that contribute to flood events. The two most dominant factors are the meteorological factor (i.e. climate change) and change in land use. These two factors have contributed to floods in recent decades, especially in monsoonal regions such as Malaysia. This paper intends to quantify the influence of rainfall during extreme rainfall events on the hydrological model in the Kelantan River catchment. Therefore, two dynamic inputs were used in the study: rainfall and river discharge. The extreme flood events in 2008 and 2004 were compared based on rainfall data for both years. The events were modeled via a semi-distributed HEC-HMS hydrological model. Land use change was not incorporated in the study because the study only tries to quantify rainfall changes during these two events to simulate the discharge and runoff value. Therefore, the land use data representing the year 2004 were used as inputs in the 2008 runoff model. The study managed to demonstrate that rainfall change has a significant impact on the peak discharge and runoff depth for the study area.

  15. Event-based cluster synchronization of coupled genetic regulatory networks

    Science.gov (United States)

    Yue, Dandan; Guan, Zhi-Hong; Li, Tao; Liao, Rui-Quan; Liu, Feng; Lai, Qiang

    2017-09-01

    In this paper, the cluster synchronization of coupled genetic regulatory networks with a directed topology is studied by using the event-based strategy and pinning control. An event-triggered condition with a threshold consisting of the neighbors' discrete states at their own event time instants and a state-independent exponential decay function is proposed. The intra-cluster states information and extra-cluster states information are involved in the threshold in different ways. By using the Lyapunov function approach and the theories of matrices and inequalities, we establish the cluster synchronization criterion. It is shown that both the avoidance of continuous transmission of information and the exclusion of the Zeno behavior are ensured under the presented triggering condition. Explicit conditions on the parameters in the threshold are obtained for synchronization. The stability criterion of a single GRN is also given under the reduced triggering condition. Numerical examples are provided to validate the theoretical results.
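    The state-independent part of the triggering rule can be sketched as follows (a simplification for illustration: the paper's threshold also involves the neighbours' states at their own event instants):

```python
import math

def event_times(signal, dt, c0=0.5, decay=0.1):
    """Sketch of an event-triggered rule with a state-independent
    exponentially decaying threshold: a node transmits when the gap
    between its current state and its last broadcast state exceeds
    c0 * exp(-decay * t)."""
    triggers = [0]
    last = signal[0]  # last broadcast state
    for i in range(1, len(signal)):
        t = i * dt
        if abs(signal[i] - last) > c0 * math.exp(-decay * t):
            triggers.append(i)
            last = signal[i]
    return triggers

sig = [math.sin(0.3 * i) for i in range(40)]  # illustrative node state
print(event_times(sig, dt=0.1))
```

Because the threshold is strictly positive between events, transmissions are separated in time, which is the intuition behind excluding Zeno behavior.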

  16. Limits on the Efficiency of Event-Based Algorithms for Monte Carlo Neutron Transport

    Energy Technology Data Exchange (ETDEWEB)

    Romano, Paul K.; Siegel, Andrew R.

    2017-04-16

    The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC was then used in conjunction with the models to calculate the speedup due to vectorization as a function of two parameters: the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than vector size in order to achieve vector efficiency greater than 90%. When the execution times for events are allowed to vary, however, the vector speedup is also limited by differences in execution time for events being carried out in a single event-iteration. For some problems, this implies that vector efficiencies over 50% may not be attainable. While there are many factors impacting performance of an event-based algorithm that are not captured by our model, it nevertheless provides insights into factors that may be limiting in a real implementation.
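    Under the constant-execution-time assumption, a toy model of vector lane utilisation can be sketched as follows (an illustrative model with a fixed per-event particle survival probability, not the paper's exact formulation):

```python
import math

def vector_efficiency(bank_size, vector_width, survival=0.9):
    """Toy model: with constant event execution time, lane utilisation in
    one event-iteration is n / (W * ceil(n / W)). Particles survive to the
    next event with a fixed probability, so the bank drains geometrically.
    The survival probability is an illustrative assumption."""
    n = bank_size
    used = total = 0.0
    while n >= 1:
        passes = math.ceil(n / vector_width)
        used += n                       # lanes doing useful work
        total += passes * vector_width  # lanes issued
        n = int(n * survival)
    return used / total

print(vector_efficiency(20 * 8, vector_width=8))  # bank 20x the vector width
print(vector_efficiency(2 * 8, vector_width=8))   # small bank, lower efficiency
```

Consistent with the abstract, the larger bank sustains full vectors for longer as it drains and so achieves the higher efficiency.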

  17. Asymptotic Effectiveness of the Event-Based Sampling According to the Integral Criterion

    Directory of Open Access Journals (Sweden)

    Marek Miskowicz

    2007-01-01

    Full Text Available A rapid progress in intelligent sensing technology creates new interest in the analysis and design of non-conventional sampling schemes. The investigation of event-based sampling according to the integral criterion is presented in this paper. The investigated sampling scheme is an extension of the pure linear send-on-delta/level-crossing algorithm utilized for reporting the state of objects monitored by intelligent sensors. The motivation for using event-based integral sampling is outlined. The related works in adaptive sampling are summarized. The analytical closed-form formulas for the evaluation of the mean rate of event-based traffic, and the asymptotic integral sampling effectiveness, are derived. The simulation results verifying the analytical formulas are reported. The effectiveness of the integral sampling is compared with the related linear send-on-delta/level-crossing scheme. The calculation of the asymptotic effectiveness for common signals, which model the state evolution of dynamic systems in time, is exemplified.
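    The two sampling criteria compared above can be sketched as follows; the thresholds and the test signal are illustrative:

```python
def send_on_delta(signal, delta):
    """Pure send-on-delta: sample when |x - last_sent| reaches delta."""
    idx, last = [0], signal[0]
    for i, x in enumerate(signal[1:], 1):
        if abs(x - last) >= delta:
            idx.append(i)
            last = x
    return idx

def send_on_integral(signal, dt, threshold):
    """Integral criterion sketch: sample when the accumulated integral of
    |x - last_sent| since the last event reaches the threshold."""
    idx, last, acc = [0], signal[0], 0.0
    for i, x in enumerate(signal[1:], 1):
        acc += abs(x - last) * dt
        if acc >= threshold:
            idx.append(i)
            last = x
            acc = 0.0
    return idx

ramp = [float(i) for i in range(20)]  # illustrative ramp signal
print(send_on_delta(ramp, delta=5.0))                  # → [0, 5, 10, 15]
print(send_on_integral(ramp, dt=1.0, threshold=10.0))  # → [0, 4, 8, 12, 16]
```

On the same ramp the integral criterion fires earlier, because the accumulated error, not just the instantaneous one, drives the trigger.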

  18. High-frequency combination coding-based steady-state visual evoked potential for brain computer interface

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Feng; Zhang, Xin; Xie, Jun; Li, Yeping; Han, Chengcheng; Lili, Li; Wang, Jing [School of Mechanical Engineering, Xi’an Jiaotong University, Xi’an 710049 (China); Xu, Guang-Hua [School of Mechanical Engineering, Xi’an Jiaotong University, Xi’an 710049 (China); State Key Laboratory for Manufacturing Systems Engineering, Xi’an Jiaotong University, Xi’an 710054 (China)

    2015-03-10

    This study presents a new steady-state visual evoked potential (SSVEP) paradigm for brain computer interface (BCI) systems. The goal of this study is to increase the number of targets using fewer high stimulation frequencies, while diminishing the subject's fatigue and reducing the risk of photosensitive epileptic seizures. The new paradigm is High-Frequency Combination Coding-Based High-Frequency Steady-State Visual Evoked Potential (HFCC-SSVEP). Firstly, we studied the high frequency (beyond 25 Hz) response of SSVEP, whose paradigm is presented on the LED. The SNR (Signal to Noise Ratio) of the high frequency (beyond 40 Hz) response is very low, so it cannot be distinguished by the traditional analysis method. Secondly, we investigated the HFCC-SSVEP response (beyond 25 Hz) for 3 frequencies (25 Hz, 33.33 Hz, and 40 Hz); HFCC-SSVEP produces n^n targets with n high stimulation frequencies through frequency combination coding. Further, an improved Hilbert-Huang transform (IHHT)-based variable frequency EEG feature extraction method and a local spectrum extreme target identification algorithm are adopted to extract the time-frequency features of the proposed HFCC-SSVEP response. Linear prediction and fixed sifting (iterating) 10 times are used to overcome the shortcomings of the end effect and the stopping criterion; generalized zero-crossing (GZC) is used to compute the instantaneous frequency of the proposed SSVEP response signals. The improved HHT-based feature extraction method for the proposed SSVEP paradigm increases recognition efficiency, so as to improve the ITR and to increase the stability of the BCI system. What is more, SSVEPs evoked by high-frequency stimuli (beyond 25 Hz) minimally diminish the subject's fatigue and prevent safety hazards linked to photo-induced epileptic seizures, ensuring system efficiency without harm. This study tests three subjects in order to verify the feasibility of the proposed method.
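    The combination-coding idea can be sketched directly: with n stimulation frequencies and codes of n time slots, n^n distinct targets become available.

```python
from itertools import product

def combination_codes(freqs):
    """Sketch of frequency combination coding: enumerate every code of
    len(freqs) time slots drawn from the stimulation frequencies, giving
    n**n distinct targets from n frequencies."""
    return list(product(freqs, repeat=len(freqs)))

codes = combination_codes([25.0, 33.33, 40.0])  # the 3 frequencies above
print(len(codes))  # 3**3 = 27 targets from 3 frequencies
print(codes[0])
```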

  19. High-frequency combination coding-based steady-state visual evoked potential for brain computer interface

    International Nuclear Information System (INIS)

    Zhang, Feng; Zhang, Xin; Xie, Jun; Li, Yeping; Han, Chengcheng; Lili, Li; Wang, Jing; Xu, Guang-Hua

    2015-01-01

    This study presents a new steady-state visual evoked potential (SSVEP) paradigm for brain computer interface (BCI) systems. The goal of this study is to increase the number of targets using fewer high stimulation frequencies, while diminishing the subject's fatigue and reducing the risk of photosensitive epileptic seizures. The new paradigm is High-Frequency Combination Coding-Based High-Frequency Steady-State Visual Evoked Potential (HFCC-SSVEP). Firstly, we studied the high frequency (beyond 25 Hz) response of SSVEP, whose paradigm is presented on the LED. The SNR (Signal to Noise Ratio) of the high frequency (beyond 40 Hz) response is very low, so it cannot be distinguished by the traditional analysis method. Secondly, we investigated the HFCC-SSVEP response (beyond 25 Hz) for 3 frequencies (25 Hz, 33.33 Hz, and 40 Hz); HFCC-SSVEP produces n^n targets with n high stimulation frequencies through frequency combination coding. Further, an improved Hilbert-Huang transform (IHHT)-based variable frequency EEG feature extraction method and a local spectrum extreme target identification algorithm are adopted to extract the time-frequency features of the proposed HFCC-SSVEP response. Linear prediction and fixed sifting (iterating) 10 times are used to overcome the shortcomings of the end effect and the stopping criterion; generalized zero-crossing (GZC) is used to compute the instantaneous frequency of the proposed SSVEP response signals. The improved HHT-based feature extraction method for the proposed SSVEP paradigm increases recognition efficiency, so as to improve the ITR and to increase the stability of the BCI system. What is more, SSVEPs evoked by high-frequency stimuli (beyond 25 Hz) minimally diminish the subject's fatigue and prevent safety hazards linked to photo-induced epileptic seizures, ensuring system efficiency without harm. This study tests three subjects in order to verify the feasibility of the proposed method.

  20. Software failure events derivation and analysis by frame-based technique

    International Nuclear Information System (INIS)

    Huang, H.-W.; Shih, C.; Yih, Swu; Chen, M.-H.

    2007-01-01

    A frame-based technique, including physical frame, logical frame, and cognitive frame, was adopted to perform digital I and C failure events derivation and analysis for generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced on the feedwater system, recirculation system, and steam line system. The logical model is structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, the software failure of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow. Hence, in addition to the transient plots, the analysis results can then be demonstrated on the power-core flow map. A number of postulated I and C system software failure events were derived to achieve the dynamic analyses. The basis for event derivation includes the published classification for software anomalies, the digital I and C design data for ABWR, chapter 15 accident analysis of generic SAR, and the reported NPP I and C software failure events. The case study of this research includes: (1) the software CMF analysis for the major digital control systems; and (2) postulated ABWR digital I and C software failure events derivation from the actual happening of non-ABWR digital I and C software failure events, which were reported to LER of USNRC or IRS of IAEA. These events were analyzed by PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status are successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally. 
However, a well

  1. High frequency electromechanical memory cells based on telescoping carbon nanotubes.

    Science.gov (United States)

    Popov, A M; Lozovik, Y E; Kulish, A S; Bichoutskaia, E

    2010-07-01

    A new method to increase the operational frequency of electromechanical memory cells based on the telescoping motion of multi-walled carbon nanotubes through the selection of the form of the switching voltage pulse is proposed. The relative motion of the walls of carbon nanotubes can be controlled through the shape of the interwall interaction energy surface. This allows the use of the memory cells in nonvolatile or volatile regime, depending on the structure of carbon nanotube. Simulations based on ab initio and semi-empirical calculations of the interwall interaction energies are used to estimate the switching voltage and the operational frequency of volatile cells with the electrodes made of carbon nanotubes. The lifetime of nonvolatile memory cells is also predicted.

  2. Order Tracking Based on Robust Peak Search Instantaneous Frequency Estimation

    International Nuclear Information System (INIS)

    Gao, Y; Guo, Y; Chi, Y L; Qin, S R

    2006-01-01

    Order tracking plays an important role in the non-stationary vibration analysis of rotating machinery, especially during run-up or coast-down. An instantaneous frequency estimation (IFE) based order tracking method for rotating machinery is introduced, in which a peak search algorithm applied to the spectrogram of a time-frequency analysis is employed to obtain the IFE of vibrations. An improvement to the peak search is proposed, which prevents strong non-order components or noise from disturbing the peak search. Compared with traditional methods of order tracking, IFE-based order tracking is simple to apply and depends only on software. Tests verify the validity of the method. This method is an effective supplement to traditional methods, and its application in condition monitoring and diagnosis of rotating machinery is conceivable
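
    The spectrogram peak-search step described in this record can be sketched in a few lines. The snippet below is an illustrative reconstruction, not the authors' implementation: it estimates the instantaneous frequency of a synthetic run-up signal by picking the spectrogram ridge, then repeats the search constrained to a band around the previous estimate as a simple stand-in for the proposed robust peak search (the ±10 Hz band is an assumed parameter).

```python
import numpy as np
from scipy import signal

# Synthetic run-up: a chirp whose frequency rises from 10 Hz to 100 Hz over 5 s.
fs = 1000.0
t = np.arange(0, 5, 1 / fs)
x = signal.chirp(t, f0=10, t1=5, f1=100)

# Spectrogram of the vibration signal.
f, tt, Sxx = signal.spectrogram(x, fs=fs, nperseg=256, noverlap=192)

# Naive peak search: the ridge of maximum energy per frame is the IFE.
ife = f[np.argmax(Sxx, axis=0)]

# Constrained peak search: only look near the previous ridge point, so strong
# non-order components or noise elsewhere in the band cannot capture the ridge.
ife_robust = np.empty_like(tt)
ife_robust[0] = f[np.argmax(Sxx[:, 0])]
for k in range(1, len(tt)):
    band = np.abs(f - ife_robust[k - 1]) <= 10.0
    idx = np.flatnonzero(band)
    ife_robust[k] = f[idx[np.argmax(Sxx[idx, k])]]
```

With the IFE in hand, resampling the signal at constant shaft-angle increments (the usual order-tracking step) becomes a straightforward interpolation against the integrated frequency.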

  3. Life review based on remembering specific positive events in active aging.

    Science.gov (United States)

    Latorre, José M; Serrano, Juan P; Ricarte, Jorge; Bonete, Beatriz; Ros, Laura; Sitges, Esther

    2015-02-01

    The aim of this study is to evaluate the effectiveness of life review (LR) based on specific positive events in non-depressed older adults taking part in an active aging program. Fifty-five older adults were randomly assigned to an experimental group or an active control (AC) group. A six-session individual training of LR based on specific positive events was carried out with the experimental group. The AC group undertook a "media workshop" of six sessions focused on learning journalistic techniques. Pre-test and post-test measures included life satisfaction, depressive symptoms, experiencing the environment as rewarding, and autobiographical memory (AM) scales. LR intervention decreased depressive symptomatology, improved life satisfaction, and increased specific memories. The findings suggest that practice in AM for specific events is an effective component of LR that could be a useful tool in enhancing emotional well-being in active aging programs, thus reducing depressive symptoms. © The Author(s) 2014.

  4. The RFI situation for a space-based low-frequency radio astronomy instrument

    NARCIS (Netherlands)

    Bentum, Marinus Jan; Boonstra, A.J.

    2016-01-01

    Space-based ultra-long wavelength radio astronomy has recently gained a lot of interest. Techniques to open the virtually unexplored frequency band below 30 MHz are now becoming within reach. Due to the ionosphere and the radio frequency interference (RFI) on Earth, exploring this frequency band

  5. A methodology for the quantitative risk assessment of major accidents triggered by seismic events

    International Nuclear Information System (INIS)

    Antonioni, Giacomo; Spadoni, Gigliola; Cozzani, Valerio

    2007-01-01

    A procedure for the quantitative risk assessment of accidents triggered by seismic events in industrial facilities was developed. The starting point of the procedure was the use of available historical data to assess the expected frequencies and the severity of seismic events. Available equipment-dependent failure probability models (vulnerability or fragility curves) were used to assess the damage probability of equipment items due to a seismic event. An analytic procedure was subsequently developed to identify, evaluate the credibility of, and finally assess the expected consequences of all the possible scenarios that may follow seismic events. The procedure was implemented in a GIS-based software tool in order to manage the high number of event sequences that are likely to be generated in large industrial facilities. The developed methodology requires a limited amount of additional data with respect to those used in a conventional QRA, and yields with limited effort a preliminary quantitative assessment of the contribution of the scenarios triggered by earthquakes to the individual and societal risk indexes. The application of the methodology to several case studies showed that the scenarios initiated by seismic events may have a relevant influence on industrial risk, both raising the overall expected frequency of single scenarios and causing specific severe scenarios that simultaneously involve several plant units
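
    The core quantification step — convolving a seismic hazard estimate with equipment fragility curves — can be illustrated as follows. This is a minimal sketch assuming a lognormal fragility model and made-up hazard numbers; the median PGA, beta, and bin frequencies are placeholders, not values from the paper.

```python
from math import log, erf, sqrt

def fragility(pga, median=0.6, beta=0.5):
    """Lognormal fragility curve: probability an equipment item is damaged at a
    given peak ground acceleration (g). median and beta are illustrative only."""
    return 0.5 * (1.0 + erf(log(pga / median) / (beta * sqrt(2.0))))

# Illustrative hazard curve, discretized: (PGA in g, annual occurrence frequency).
bins = [(0.1, 1e-2), (0.3, 2e-3), (0.6, 4e-4)]

# Expected annual frequency of the seismically induced damage scenario:
# sum over intensity bins of (bin frequency x conditional damage probability).
f_damage = sum(freq * fragility(pga) for pga, freq in bins)
```

In a full QRA the damage probability would then feed the event sequences that enumerate the possible release scenarios for each damaged unit.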

  6. Simulation of Greenhouse Climate Monitoring and Control with Wireless Sensor Network and Event-Based Control

    Science.gov (United States)

    Pawlowski, Andrzej; Guzman, Jose Luis; Rodríguez, Francisco; Berenguel, Manuel; Sánchez, José; Dormido, Sebastián

    2009-01-01

    Monitoring and control of the greenhouse environment play a decisive role in greenhouse production processes. Assurance of optimal climate conditions has a direct influence on crop growth performance, but it usually increases the required equipment cost. Traditionally, greenhouse installations have required a great effort to connect and distribute all the sensors and data acquisition systems. These installations need many data and power wires to be distributed along the greenhouses, making the system complex and expensive. For this reason, and for others such as the unavailability of distributed actuators, individual sensors are usually located only at a fixed point that is selected as representative of the overall greenhouse dynamics. On the other hand, the actuation system in greenhouses is usually composed of mechanical devices controlled by relays, and it is desirable to reduce the number of commutations of the control signals from safety and economic points of view. Therefore, in order to face these drawbacks, this paper describes how greenhouse climate control can be represented as an event-based system in combination with wireless sensor networks, where low-frequency dynamic variables have to be controlled and control actions are mainly calculated in response to events produced by external disturbances. The proposed control system saves costs by minimizing wear and prolonging actuator life, while keeping promising performance results. Analysis and conclusions are given by means of simulation results. PMID:22389597

  7. Simulation of Greenhouse Climate Monitoring and Control with Wireless Sensor Network and Event-Based Control

    Directory of Open Access Journals (Sweden)

    Andrzej Pawlowski

    2009-01-01

    Monitoring and control of the greenhouse environment play a decisive role in greenhouse production processes. Assurance of optimal climate conditions has a direct influence on crop growth performance, but it usually increases the required equipment cost. Traditionally, greenhouse installations have required a great effort to connect and distribute all the sensors and data acquisition systems. These installations need many data and power wires to be distributed along the greenhouses, making the system complex and expensive. For this reason, and for others such as the unavailability of distributed actuators, individual sensors are usually located only at a fixed point that is selected as representative of the overall greenhouse dynamics. On the other hand, the actuation system in greenhouses is usually composed of mechanical devices controlled by relays, and it is desirable to reduce the number of commutations of the control signals from safety and economic points of view. Therefore, in order to face these drawbacks, this paper describes how greenhouse climate control can be represented as an event-based system in combination with wireless sensor networks, where low-frequency dynamic variables have to be controlled and control actions are mainly calculated in response to events produced by external disturbances. The proposed control system saves costs by minimizing wear and prolonging actuator life, while keeping promising performance results. Analysis and conclusions are given by means of simulation results.
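
    The commutation-reducing, event-based actuation described in these two records can be illustrated with a minimal send-on-delta on/off controller. This is a generic sketch, not the paper's control law; the setpoint, deadband, and temperature trace are invented.

```python
def event_based_on_off(temps, setpoint=22.0, delta=1.0):
    """Send-on-delta on/off control: the heater toggles only when the measured
    temperature leaves the deadband [setpoint - delta, setpoint + delta],
    which reduces the number of actuator commutations."""
    heater_on = False
    commutations = 0
    states = []
    for T in temps:
        if T < setpoint - delta and not heater_on:
            heater_on = True      # event: temperature fell below the deadband
            commutations += 1
        elif T > setpoint + delta and heater_on:
            heater_on = False     # event: temperature rose above the deadband
            commutations += 1
        states.append(heater_on)
    return states, commutations

# Invented greenhouse temperature trace (deg C), sampled at a fixed rate.
states, commutations = event_based_on_off([20, 21, 22, 23, 24, 23, 22, 20, 24])
```

A purely time-triggered relay acting on every sample crossing the setpoint would switch far more often; here the actuator only reacts to deadband-exit events.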

  8. Frequency-tuned microwave photon counter based on a superconductive quantum interferometer

    Science.gov (United States)

    Shnyrkov, V. I.; Yangcao, Wu; Soroka, A. A.; Turutanov, O. G.; Lyakhno, V. Yu.

    2018-03-01

    Various types of single-photon counters operating in the infrared, ultraviolet, and optical wavelength ranges are successfully used to study electromagnetic fields, analyze radiation sources, and solve problems in quantum informatics. However, their operating principles become ineffective in the millimeter, S-band, and ultra-high frequency wavelength bands due to the decrease in quantum energy by 4-5 orders of magnitude. Josephson circuits with discrete Hamiltonians and qubits are a good foundation for the construction of single-photon counters at these frequencies. This paper presents a frequency-tuned microwave photon counter based on a single-junction superconducting quantum interferometer and a flux qutrit. The control pulse converts the interferometer into a two-level system for resonance absorption of photons. Decay of the photon-induced excited state changes the magnetic flux in the interferometer, which is measured by a SQUID magnetometer. Schemes for recording the magnetic flux using a DC SQUID or an ideal parametric detector, based on a qutrit with high-frequency excitation, are discussed. It is shown that a counter consisting of an interferometer with a Josephson junction and a parametric detector demonstrates high performance and is capable of detecting single photons in the microwave band.

  9. Dependencies in event trees analyzed by Petri nets

    International Nuclear Information System (INIS)

    Nývlt, Ondřej; Rausand, Marvin

    2012-01-01

    This paper discusses how non-marked Petri nets can be used to model and analyze event trees where the pivotal (branching) events are dependent and modeled by fault trees. The dependencies may, for example, be caused by shared utilities, shared components, or general common cause failures that are modeled by beta-factor models. These dependencies are cumbersome to take into account when using standard event-/fault tree modeling techniques, and may lead to significant errors in the calculated end-state probabilities of the event tree if they are not properly analyzed. A new approach is proposed in this paper, where the whole event tree is modeled by a non-marked Petri net and where P-invariants, representing the structural properties of the Petri net, are used to obtain the frequency of each end-state of the event tree with dependencies. The new approach is applied to a real example of an event tree analysis of the Strahov highway tunnel in Prague, Czech Republic, including two types of dependencies (shared Programmable Logic Controllers and Common Cause Failures). - Highlights: ► In this paper, we model and analyze event trees (ET) using Petri nets. ► The pivotal events of the modeled event trees are dependent (e.g., shared PLCs, CCF). ► A new method based on P-invariants to obtain probabilities of end states is proposed. ► The method is demonstrated in a case study of the Strahov tunnel in the Czech Republic.

  10. Location of long-period events below Kilauea Volcano using seismic amplitudes and accurate relative relocation

    Science.gov (United States)

    Battaglia, J.; Got, J.-L.; Okubo, P.

    2003-01-01

    We present methods for improving the location of long-period (LP) events, deep and shallow, recorded below Kilauea Volcano by the permanent seismic network. LP events might be of particular interest to understanding eruptive processes as their source mechanism is assumed to directly involve fluid transport. However, it is usually difficult or impossible to locate their source using traditional arrival time methods because of emergent wave arrivals. At Kilauea, similar LP waveform signatures suggest the existence of LP multiplets. The waveform similarity suggests spatially close sources, while catalog solutions using arrival time estimates are widely scattered beneath Kilauea's summit caldera. In order to improve estimates of absolute LP location, we use the distribution of seismic amplitudes corrected for station site effects. The decay of the amplitude as a function of hypocentral distance is used for inferring LP location. In a second stage, we use the similarity of the events to calculate their relative positions. The analysis of the entire LP seismicity recorded between January 1997 and December 1999 suggests that a very large part of the LP event population, both deep and shallow, is generated by a small number of compact sources. Deep events are systematically composed of a weak high-frequency onset followed by a low-frequency wave train. Aligning the low-frequency wave trains does not lead to aligning the onsets indicating the two parts of the signal are dissociated. This observation favors an interpretation in terms of triggering and resonance of a magmatic conduit. Instead of defining fault planes, the precise relocation of similar LP events, based on the alignment of the high-energy low-frequency wave trains, defines limited size volumes. Copyright 2003 by the American Geophysical Union.
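
    The amplitude-based absolute location step can be illustrated with a small grid search. The sketch below assumes a pure 1/r geometric-spreading decay of site-corrected amplitudes (ignoring attenuation) and uses synthetic station geometry; it recovers the source as the grid point at which log(amplitude × distance) is most nearly constant across stations.

```python
import numpy as np

# Synthetic station coordinates (km) and a hidden source location.
stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 8.0], [6.0, 6.0]])
true_src = np.array([4.0, 3.0])

# Site-corrected amplitudes under the assumed decay law A = A0 / r.
r_true = np.linalg.norm(stations - true_src, axis=1)
amps = 100.0 / r_true

# Grid search: at the correct source, A * r is the same at every station,
# so the variance of log(A * r) vanishes.
xs = np.linspace(0, 10, 101)
ys = np.linspace(0, 8, 81)
best, best_misfit = None, np.inf
for x in xs:
    for y in ys:
        r = np.linalg.norm(stations - np.array([x, y]), axis=1)
        if np.any(r < 1e-6):
            continue  # skip trial points sitting exactly on a station
        misfit = np.var(np.log(amps * r))
        if misfit < best_misfit:
            best_misfit, best = misfit, (x, y)
```

The real problem is 3-D (hypocentral distance, not epicentral) and amplitudes carry noise, but the principle — treating the amplitude decay itself as the location datum — is the same.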

  11. A low-frequency vibration energy harvester based on diamagnetic levitation

    Science.gov (United States)

    Kono, Yuta; Masuda, Arata; Yuan, Fuh-Gwo

    2017-04-01

    This article presents 3-degree-of-freedom theoretical modeling and analysis of a low-frequency vibration energy harvester based on diamagnetic levitation. In recent years, although much attention has been placed on vibration energy harvesting technologies, few harvesters can operate efficiently at extremely low frequencies, in spite of the large potential demand in the fields of structural health monitoring and wearable applications. In one of the earliest works along this line, Liu, Yuan and Palagummi proposed vertical and horizontal diamagnetic levitation systems as vibration energy harvesters with low resonant frequencies. This study aims to pursue further improvement in this direction, in terms of expanding the maximum amplitude and enhancing the flexibility of the operation direction for broader application fields, by introducing a new topology of the levitation system.

  12. Characterization of a subset of large amplitude noise events in VIRGO science run 1 (VSR1)

    International Nuclear Information System (INIS)

    Del Prete, M

    2009-01-01

    We report on a characterization study of a subset of large amplitude noise events present in the main data channel of the VIRGO detector. The main motivation of this study is the identification of auxiliary channels which can be used to define veto procedures. We characterized large amplitude events both in the time and in the frequency domain. We found evidence of coincidences between these events and disturbances detected by magnetometer sensors or inside the main power supply. In some cases the disturbances were produced by events in the VIRGO environment such as lightning strikes, main power supply glitches and airplane traffic. We have found two auxiliary channels that can be used to veto events generated by main power supply glitches or lightning strikes. A procedure to clean the main channel based on them has been successfully tested. We have also identified two auxiliary channels which are useful for the identification of events generated by airplane traffic. These can be used to implement a vetoing procedure both in the time and in the frequency domain.

  13. Characterization of a subset of large amplitude noise events in VIRGO science run 1 (VSR1)

    Energy Technology Data Exchange (ETDEWEB)

    Del Prete, M [Universita di Pisa, Lungarno Pacinotti, 43, 56126 Pisa; Istituto Nazionale di Fisica Nucleare sez. di Pisa, ED C polo Fibonacci, Via F Buonarroti 2, 56127, Pisa (Italy)

    2009-10-21

    We report on a characterization study of a subset of large amplitude noise events present in the main data channel of the VIRGO detector. The main motivation of this study is the identification of auxiliary channels which can be used to define veto procedures. We characterized large amplitude events both in the time and in the frequency domain. We found evidence of coincidences between these events and disturbances detected by magnetometer sensors or inside the main power supply. In some cases the disturbances were produced by events in the VIRGO environment such as lightning strikes, main power supply glitches and airplane traffic. We have found two auxiliary channels that can be used to veto events generated by main power supply glitches or lightning strikes. A procedure to clean the main channel based on them has been successfully tested. We have also identified two auxiliary channels which are useful for the identification of events generated by airplane traffic. These can be used to implement a vetoing procedure both in the time and in the frequency domain.

  14. Supervision in the PC based prototype for the ATLAS event filter

    CERN Document Server

    Bee, C P; Etienne, F; Fede, E; Meessen, C; Nacasch, R; Qian, Z; Touchard, F

    1999-01-01

    A prototype of the ATLAS event filter based on commodity PCs linked by a Fast Ethernet switch has been developed in Marseille. The present contribution focuses on the supervision aspects of the prototype, based on Java and Java mobile agent technology. (5 refs).

  15. Estimative of core damage frequency in IPEN'S IEA-R1 research reactor due to the initiating event of loss of coolant caused by large rupture in the pipe of the primary circuit

    International Nuclear Information System (INIS)

    Hirata, Daniel Massami; Sabundjian, Gaiane; Cabral, Eduardo Lobo Lustosa

    2009-01-01

    The National Commission of Nuclear Energy (CNEN), the Brazilian nuclear regulatory body, imposes safety and licensing standards in order to ensure that nuclear plants operate safely. For licensing a nuclear reactor, one of the demands of CNEN is the simulation of the accidents and thermal-hydraulic transients considered as design basis events, to verify the integrity of the plant when submitted to adverse conditions. The accidents that must be simulated are those that present a large probability of occurrence or those that can cause the most serious consequences. According to the FSAR (Final Safety Analysis Report), the initiating event that can cause the largest damage to the core of the IEA-R1 research reactor at IPEN-CNEN/SP is a LOCA (Loss of Coolant Accident). The objective of this paper is to estimate the frequency of IEA-R1 core damage caused by this initiating event. We analyze the accident evolution and the performance of the systems which should mitigate this event: the Emergency Core Cooling System (ECCS) and the isolated pool system. They are analyzed by means of an event tree. The reliability of these systems is also quantified using fault trees. (author)

  16. Frequency response function (FRF) based updating of a laser spot welded structure

    Science.gov (United States)

    Zin, M. S. Mohd; Rani, M. N. Abdul; Yunus, M. A.; Sani, M. S. M.; Wan Iskandar Mirza, W. I. I.; Mat Isa, A. A.

    2018-04-01

    The objective of this paper is to present frequency response function (FRF) based updating as a method for matching the finite element (FE) model of a laser spot welded structure with a physical test structure. The FE model of the welded structure was developed using CQUAD4 and CWELD element connectors, and NASTRAN was used to calculate the natural frequencies, mode shapes and FRFs. Minimization of the discrepancies between the finite element and experimental FRFs was carried out using the exceptional numerical capability of NASTRAN Sol 200. The experimental work was performed under free-free boundary conditions using LMS SCADAS. A vast improvement in the finite element FRF was achieved using frequency response function (FRF) based updating with the two different objective functions proposed.

  17. Frequency Control in Autonomous Microgrid in the Presence of DFIG based Wind Turbine

    Directory of Open Access Journals (Sweden)

    Ghazanfar Shahgholian

    2015-10-01

    Despite their ever-increasing power injection into the power grid, wind turbines play no role in frequency control; power network frequency is mainly adjusted by conventional power plants. DFIG-based wind turbines not only are able to produce power at various mechanical speeds, but they can also reduce speed instantaneously, which, in turn, leads to the release of mechanical energy. Thus, they can aid conventional units in system frequency control. In this paper, the effect of wind energy conversion systems, especially variable speed DFIG-based wind turbines, on frequency control and tuning is investigated for different penetration coefficients in an isolated microgrid comprising conventional thermal and non-thermal generating units. To do this, optimal tuning of the DFIG's speed controller is performed at different penetration levels using the particle swarm optimization (PSO) technique. In addition, the optimum penetration of the wind energy conversion system is studied considering frequency change parameters in the microgrid.
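
    The PSO-based tuning step can be sketched with a minimal one-dimensional particle swarm optimizer. The cost function below is a toy quadratic standing in for the frequency-deviation objective, with a made-up optimum gain of 3; the inertia and acceleration coefficients are common textbook values, not those of the paper.

```python
import random

def pso(cost, lo, hi, n=20, iters=60, seed=1):
    """Minimal 1-D particle swarm optimizer (illustrative sketch only)."""
    rng = random.Random(seed)
    pos = [rng.uniform(lo, hi) for _ in range(n)]
    vel = [0.0] * n
    pbest = pos[:]
    pbest_c = [cost(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_c[i])
    gbest, gbest_c = pbest[g], pbest_c[g]
    for _ in range(iters):
        for i in range(n):
            r1, r2 = rng.random(), rng.random()
            # Velocity update: inertia + cognitive + social terms.
            vel[i] = 0.7 * vel[i] + 1.5 * r1 * (pbest[i] - pos[i]) \
                     + 1.5 * r2 * (gbest - pos[i])
            pos[i] = min(hi, max(lo, pos[i] + vel[i]))
            c = cost(pos[i])
            if c < pbest_c[i]:
                pbest[i], pbest_c[i] = pos[i], c
                if c < gbest_c:
                    gbest, gbest_c = pos[i], c
    return gbest, gbest_c

# Toy cost: squared frequency deviation as a function of the speed-controller
# gain, with an invented optimum at gain = 3.
best_gain, best_cost = pso(lambda k: (k - 3.0) ** 2, 0.0, 10.0)
```

In the paper's setting, each cost evaluation would be a microgrid simulation returning a frequency-deviation index rather than a closed-form expression.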

  18. Design a Learning-Oriented Fall Event Reporting System Based on Kirkpatrick Model.

    Science.gov (United States)

    Zhou, Sicheng; Kang, Hong; Gong, Yang

    2017-01-01

    Patient falls have been a severe problem in healthcare facilities around the world due to their prevalence and cost. Routine fall prevention training programs are not as effective as expected. Using event reporting systems is the trend for reducing patient safety events such as falls, although the systems have some limitations at the current stage. We summarized these limitations through a literature review and developed an improved web-based fall event reporting system. The Kirkpatrick model, widely used in the business area for training program evaluation, was integrated during the design of our system. Different from traditional event reporting systems that only collect and store reports, our system automatically annotates and analyzes the reported events and provides users with timely knowledge support specific to the reported event. The paper illustrates the design of our system and how its features are intended to reduce patient falls by learning from previous errors.

  19. Assessment of Observational Uncertainty in Extreme Precipitation Events over the Continental United States

    Science.gov (United States)

    Slinskey, E. A.; Loikith, P. C.; Waliser, D. E.; Goodman, A.

    2017-12-01

    Extreme precipitation events are associated with numerous societal and environmental impacts. Furthermore, anthropogenic climate change is projected to alter precipitation intensity across portions of the Continental United States (CONUS). Therefore, a spatial understanding and intuitive means of monitoring extreme precipitation over time is critical. Towards this end, we apply an event-based indicator, developed as a part of NASA's support of the ongoing efforts of the US National Climate Assessment, which assigns categories to extreme precipitation events based on 3-day storm totals as a basis for dataset intercomparison. To assess observational uncertainty across a wide range of historical precipitation measurement approaches, we intercompare in situ station data from the Global Historical Climatology Network (GHCN), satellite-derived precipitation data from NASA's Tropical Rainfall Measuring Mission (TRMM), gridded in situ station data from the Parameter-elevation Regressions on Independent Slopes Model (PRISM), global reanalysis from NASA's Modern Era Retrospective-Analysis version 2 (MERRA 2), and regional reanalysis with gauge data assimilation from NCEP's North American Regional Reanalysis (NARR). Results suggest considerable variability across the five-dataset suite in the frequency, spatial extent, and magnitude of extreme precipitation events. Consistent with expectations, higher resolution datasets were found to resemble station data best and capture a greater frequency of high-end extreme events relative to lower spatial resolution datasets. The degree of dataset agreement varies regionally, however all datasets successfully capture the seasonal cycle of precipitation extremes across the CONUS. These intercomparison results provide additional insight about observational uncertainty and the ability of a range of precipitation measurement and analysis products to capture extreme precipitation event climatology. 
While the event category threshold is fixed
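
    The event-based categorization used as the basis for the intercomparison — assigning categories from 3-day storm totals — can be sketched as follows. The daily series and the category thresholds here are invented for illustration; the indicator's actual thresholds are not given in the abstract.

```python
import numpy as np

# Synthetic daily precipitation series (mm) at one grid point.
daily = np.array([0, 5, 40, 80, 10, 0, 2, 60, 90, 30], dtype=float)

# 3-day running storm totals, one value per window.
totals3 = np.convolve(daily, np.ones(3), mode="valid")

# Hypothetical category thresholds (mm per 3 days): category 0 is below the
# lowest threshold, category 3 above the highest.
thresholds = [50.0, 100.0, 150.0]
categories = np.digitize(totals3, thresholds)
```

Running the same categorization over each dataset in the suite would then allow comparing event frequency, extent, and magnitude on a common footing.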

  20. Solar Type II Radio Bursts and IP Type II Events

    Science.gov (United States)

    Cane, H. V.; Erickson, W. C.

    2005-01-01

    We have examined radio data from the WAVES experiment on the Wind spacecraft in conjunction with ground-based data in order to investigate the relationship between the shocks responsible for metric type II radio bursts and the shocks in front of coronal mass ejections (CMEs). The bow shocks of fast, large CMEs are strong interplanetary (IP) shocks, and the associated radio emissions often consist of single broad bands starting below approx. 4 MHz; such emissions were previously called IP type II events. In contrast, metric type II bursts are usually narrowbanded and display two harmonically related bands. In addition to displaying complete dynamic spectra for a number of events, we also analyze the 135 WAVES 1 - 14 MHz slow-drift time periods in 2001-2003. We find that most of the periods contain multiple phenomena, which we divide into three groups: metric type II extensions, IP type II events, and blobs and bands. About half of the WAVES listings include probable extensions of metric type II radio bursts, but in more than half of these events, there were also other slow-drift features. In the 3 yr study period, there were 31 IP type II events; these were associated with the very fastest CMEs. The most common form of activity in the WAVES events, blobs and bands in the frequency range between 1 and 8 MHz, fall below an envelope consistent with the early signatures of an IP type II event. However, most of this activity lasts only a few tens of minutes, whereas IP type II events last for many hours. In this study we find many examples in the radio data of two shock-like phenomena with different characteristics that occur simultaneously in the metric and decametric/hectometric bands, and no clear example of a metric type II burst that extends continuously down in frequency to become an IP type II event. The simplest interpretation is that metric type II bursts, unlike IP type II events, are not caused by shocks driven in front of CMEs.

  1. Blind Separation of Nonstationary Sources Based on Spatial Time-Frequency Distributions

    Directory of Open Access Journals (Sweden)

    Zhang Yimin

    2006-01-01

    Blind source separation (BSS) based on spatial time-frequency distributions (STFDs) provides improved performance over blind source separation methods based on second-order statistics when dealing with signals that are localized in the time-frequency (t-f) domain. In this paper, we propose the use of STFD matrices for both whitening and recovery of the mixing matrix, which are two stages commonly required in many BSS methods, to provide BSS performance that is robust to noise. In addition, a simple method is proposed to select the auto- and cross-term regions of the time-frequency distribution (TFD). To further improve the BSS performance, t-f grouping techniques are introduced to reduce the number of signals under consideration and to allow the receiver array to separate more sources than the number of array sensors, provided that the sources have disjoint t-f signatures. With the use of one or more of the techniques proposed in this paper, improved performance in the blind separation of nonstationary signals can be achieved.

  2. Has the frequency of bleeding changed over time for patients presenting with an acute coronary syndrome? The global registry of acute coronary events.

    OpenAIRE

    Fox, KA; Carruthers, K; Steg, PG; Avezum, A; Granger, CB; Montalescot, G; Goodman, SG; Gore, JM; Quill, AL; Eagle, KA; GRACE Investigators,

    2010-01-01

    AIMS: To determine whether changes in practice, over time, are associated with altered rates of major bleeding in acute coronary syndromes (ACS). METHODS AND RESULTS: Patients from the Global Registry of Acute Coronary Events were enrolled between 2000 and 2007. The main outcome measures were frequency of major bleeding, including haemorrhagic stroke, over time, after adjustment for patient characteristics, and impact of major b...

  3. Low-frequency repetitive transcranial magnetic stimulation (rTMS) affects event-related potential measures of novelty processing in autism.

    Science.gov (United States)

    Sokhadze, Estate; Baruth, Joshua; Tasman, Allan; Mansoor, Mehreen; Ramaswamy, Rajesh; Sears, Lonnie; Mathai, Grace; El-Baz, Ayman; Casanova, Manuel F

    2010-06-01

    In our previous study on individuals with autism spectrum disorder (ASD) (Sokhadze et al., Appl Psychophysiol Biofeedback 34:37-51, 2009a) we reported abnormalities in the attention-orienting frontal event-related potentials (ERP) and the sustained-attention centro-parietal ERPs in a visual oddball experiment. These results suggest that individuals with autism over-process information needed for the successful differentiation of target and novel stimuli. In the present study we examine the effects of low-frequency, repetitive Transcranial Magnetic Stimulation (rTMS) on novelty processing as well as behavior and social functioning in 13 individuals with ASD. Our hypothesis was that low-frequency rTMS application to dorsolateral prefrontal cortex (DLFPC) would result in an alteration of the cortical excitatory/inhibitory balance through the activation of inhibitory GABAergic double bouquet interneurons. We expected to find post-TMS differences in amplitude and latency of early and late ERP components. The results of our current study validate the use of low-frequency rTMS as a modulatory tool that altered the disrupted ratio of cortical excitation to inhibition in autism. After rTMS the parieto-occipital P50 amplitude decreased to novel distracters but not to targets; also the amplitude and latency to targets increased for the frontal P50 while decreasing to non-target stimuli. Low-frequency rTMS minimized early cortical responses to irrelevant stimuli and increased responses to relevant stimuli. Improved selectivity in early cortical responses lead to better stimulus differentiation at later-stage responses as was made evident by our P3b and P3a component findings. These results indicate a significant change in early, middle-latency and late ERP components at the frontal, centro-parietal, and parieto-occipital regions of interest in response to target and distracter stimuli as a result of rTMS treatment. 
Overall, our preliminary results show that rTMS may prove to

  4. Multi-day activity scheduling reactions to planned activities and future events in a dynamic model of activity-travel behavior

    NARCIS (Netherlands)

    Nijland, L.; Arentze, T.A.; Timmermans, H.J.P.

    2014-01-01

    Modeling multi-day planning has received scarce attention in activity-based transport demand modeling so far. However, new dynamic activity-based approaches are being developed at the current moment. The frequency and inflexibility of planned activities and events in activity schedules of

  5. Multi-Day Activity Scheduling Reactions to Planned Activities and Future Events in a Dynamic Model of Activity-Travel Behavior

    NARCIS (Netherlands)

    Nijland, L.; Arentze, T.; Timmermans, H.

    2014-01-01

    Modeling multi-day planning has received scarce attention in activity-based transport demand modeling so far. However, new dynamic activity-based approaches are being developed at the current moment. The frequency and inflexibility of planned activities and events in activity schedules of

  6. Sensitivity studies on the approaches for addressing multiple initiating events in fire events PSA

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dae Il; Lim, Ho Gon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    A single fire event within a fire compartment or a fire scenario can cause multiple initiating events (IEs). For example, a fire in a turbine building fire area can cause both a loss of main feed-water (LOMF) and a loss of off-site power (LOOP) IE. Previous domestic fire events PSAs considered only the most severe initiating event among multiple initiating events. NUREG/CR-6850 and the ANS/ASME PRA Standard require that multiple IEs be addressed in fire events PSA. In this paper, sensitivity studies on the approaches for addressing multiple IEs in the fire events PSA for Hanul Unit 3 were performed and their results are presented. From the sensitivity analysis results, we find that incorporating multiple IEs into the fire events PSA model increases the core damage frequency (CDF) and may lead to the generation of duplicate cutsets. Multiple IEs can also occur during internal flooding events or other external events such as seismic events. They should be considered in the construction of PSA models in order to realistically estimate the risk due to flooding or seismic events.
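
The CDF effect of crediting all IEs caused by one fire scenario, rather than only the most severe one, can be illustrated with a toy rare-event calculation. All frequencies and conditional core damage probabilities (CCDPs) below are invented for illustration and are not taken from the Hanul Unit 3 study.

```python
# Illustrative (made-up) numbers: one turbine-building fire scenario that
# simultaneously causes LOMF and LOOP initiating events.
f_fire = 1.0e-3          # fire scenario frequency (/yr), assumed
ccdp = {"LOMF": 2.0e-4,  # conditional core damage probability per IE, assumed
        "LOOP": 5.0e-4}

# Single-IE approach: keep only the most severe initiating event.
cdf_single = f_fire * max(ccdp.values())

# Multiple-IE approach (rare-event approximation): sum the contributions.
cdf_multi = f_fire * sum(ccdp.values())

# As the sensitivity study reports, crediting multiple IEs raises the CDF.
```

Here the multiple-IE CDF exceeds the single-IE CDF by the contribution of the less severe initiator, consistent with the increase the paper reports.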

  7. Trust Index Based Fault Tolerant Multiple Event Localization Algorithm for WSNs

    Science.gov (United States)

    Xu, Xianghua; Gao, Xueyong; Wan, Jian; Xiong, Naixue

    2011-01-01

    This paper investigates the use of wireless sensor networks for multiple event source localization using binary information from the sensor nodes. The events could continually emit signals whose strength is attenuated inversely proportional to the distance from the source. In this context, faults occur due to various reasons and are manifested when a node reports a wrong decision. In order to reduce the impact of node faults on the accuracy of multiple event localization, we introduce a trust index model to evaluate the fidelity of information which the nodes report and use in the event detection process, and propose the Trust Index based Subtract on Negative Add on Positive (TISNAP) localization algorithm, which reduces the impact of faulty nodes on the event localization by decreasing their trust index, to improve the accuracy of event localization and performance of fault tolerance for multiple event source localization. The algorithm includes three phases: first, the sink identifies the cluster nodes to determine the number of events that occurred in the entire region by analyzing the binary data reported by all nodes; then, it constructs the likelihood matrix related to the cluster nodes and estimates the location of all events according to the alarmed status and trust index of the nodes around the cluster nodes. Finally, the sink updates the trust index of all nodes according to the fidelity of their information in the previous reporting cycle. The algorithm improves the accuracy of localization and performance of fault tolerance in multiple event source localization. The experimental results show that even when the probability of node fault is close to 50%, the algorithm can still accurately determine the number of events and achieves better localization accuracy than other algorithms. PMID:22163972
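
The "subtract on negative, add on positive" trust update at the heart of TISNAP can be sketched as follows. The node names, the weighted-sum fusion rule, and the step size `delta` are illustrative assumptions of this sketch, not the paper's exact formulation.

```python
# Hypothetical minimal trust-index update in the spirit of TISNAP: a node's
# trust index is decreased when its binary report disagrees with the fused
# decision, and increased when it agrees.

def update_trust(trust, reports, fused_decision, delta=0.1):
    """trust: dict node_id -> trust in [0, 1]; reports: dict node_id -> 0/1."""
    new_trust = {}
    for node, report in reports.items():
        if report == fused_decision:
            new_trust[node] = min(1.0, trust[node] + delta)  # add on positive
        else:
            new_trust[node] = max(0.0, trust[node] - delta)  # subtract on negative
    return new_trust

trust = {"n1": 0.5, "n2": 0.5, "n3": 0.5}
reports = {"n1": 1, "n2": 1, "n3": 0}   # n3 reports a wrong decision here
# Trust-weighted fusion of the binary reports (assumed threshold of 0.5).
fused = 1 if sum(r * trust[n] for n, r in reports.items()) > 0.5 else 0
trust = update_trust(trust, reports, fused)
```

After one cycle the faulty node's trust index drops, so its reports carry less weight in the next fusion step, which is the mechanism the paper credits for fault tolerance.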

  8. Trust Index Based Fault Tolerant Multiple Event Localization Algorithm for WSNs

    Directory of Open Access Journals (Sweden)

    Jian Wan

    2011-06-01

    Full Text Available This paper investigates the use of wireless sensor networks for multiple event source localization using binary information from the sensor nodes. The events could continually emit signals whose strength is attenuated inversely proportional to the distance from the source. In this context, faults occur due to various reasons and are manifested when a node reports a wrong decision. In order to reduce the impact of node faults on the accuracy of multiple event localization, we introduce a trust index model to evaluate the fidelity of information which the nodes report and use in the event detection process, and propose the Trust Index based Subtract on Negative Add on Positive (TISNAP) localization algorithm, which reduces the impact of faulty nodes on the event localization by decreasing their trust index, to improve the accuracy of event localization and performance of fault tolerance for multiple event source localization. The algorithm includes three phases: first, the sink identifies the cluster nodes to determine the number of events that occurred in the entire region by analyzing the binary data reported by all nodes; then, it constructs the likelihood matrix related to the cluster nodes and estimates the location of all events according to the alarmed status and trust index of the nodes around the cluster nodes. Finally, the sink updates the trust index of all nodes according to the fidelity of their information in the previous reporting cycle. The algorithm improves the accuracy of localization and performance of fault tolerance in multiple event source localization. The experimental results show that even when the probability of node fault is close to 50%, the algorithm can still accurately determine the number of events and achieves better localization accuracy than other algorithms.

  9. Frequency Control Strategy for Black Starts via PMSG-Based Wind Power Generation

    Directory of Open Access Journals (Sweden)

    Yi Tang

    2017-03-01

    Full Text Available The use of wind power generation (WPG) as a source for black starts will significantly enhance the resiliency of power systems and shorten their recovery time from blackouts. Given that frequency stability is the most serious issue during the initial recovery period, virtual inertia control can enable wind turbines to provide frequency support to an external system. In this study, a general procedure of WPG participating in black starts is presented, and the key issues are discussed. The adaptability of existing virtual inertia control strategies is analyzed, and improvement work is performed. A new coordinated frequency control strategy is proposed based on the presented improvement work. A local power network with a permanent-magnet synchronous generator (PMSG)-based wind farm is modeled and used to verify the effectiveness of the strategy.

  10. Limits on the efficiency of event-based algorithms for Monte Carlo neutron transport

    Directory of Open Access Journals (Sweden)

    Paul K. Romano

    2017-09-01

    Full Text Available The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC were then used in conjunction with the models to calculate the speedup due to vectorization as a function of the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than the vector width to achieve vector efficiency greater than 90%. When the execution times for events are allowed to vary, the vector speedup is also limited by differences in the execution time for events being carried out in a single event-iteration.
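
The constant-event-time case can be explored with a toy model: give each particle a geometric number of events and count how many vector lanes stay busy as the bank drains. The absorption probability, the geometric event-count model, and the efficiency definition are assumptions of this sketch, not taken from the OpenMC study.

```python
import math
import random

def vector_efficiency(bank_size, vector_width, p_absorb=0.1, seed=1):
    """Toy model of the event-based algorithm: each particle survives a
    geometric number of events (termination probability p_absorb per event);
    each event-iteration issues ceil(alive / width) vector operations.
    Efficiency = useful lane-events / total lane-slots issued."""
    rng = random.Random(seed)
    counts = [1 + int(math.log(1.0 - rng.random()) / math.log(1.0 - p_absorb))
              for _ in range(bank_size)]
    events = vec_ops = 0
    t = 0
    while True:
        alive = sum(1 for c in counts if c > t)   # particles still in flight
        if alive == 0:
            break
        events += alive
        vec_ops += math.ceil(alive / vector_width)
        t += 1
    return events / (vec_ops * vector_width)

e_small = vector_efficiency(32, 32)    # bank equal to the vector width
e_big = vector_efficiency(640, 32)     # bank 20x the vector width
```

In this toy model the larger bank wastes proportionally fewer lanes in the "tail" iterations where few particles remain alive, which mirrors the paper's observation that the bank should be roughly 20 times the vector width.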

  11. Impact of climate change on extreme rainfall events and flood risk

    Indian Academy of Sciences (India)

    The analysis of the frequency of rainy days, rain days and heavy rainfall days as well as one-day extreme rainfall and return period has been carried out in this study to observe the impact of climate change on extreme rainfall events and flood risk in India. The frequency of heavy rainfall events is decreasing in major parts ...

  12. Building a knowledge base of severe adverse drug events based on AERS reporting data using semantic web technologies.

    Science.gov (United States)

    Jiang, Guoqian; Wang, Liwei; Liu, Hongfang; Solbrig, Harold R; Chute, Christopher G

    2013-01-01

    A semantically coded knowledge base of adverse drug events (ADEs) with severity information is critical for clinical decision support systems and translational research applications. However, it remains challenging to measure and identify the severity information of ADEs. The objective of the study is to develop and evaluate a semantic web based approach for building a knowledge base of severe ADEs based on the FDA Adverse Event Reporting System (AERS) reporting data. We utilized a normalized AERS reporting dataset and extracted putative drug-ADE pairs and their associated outcome codes in the domain of cardiac disorders. We validated the drug-ADE associations using ADE datasets from the Side Effect Resource (SIDER) and the UMLS. We leveraged the Common Terminology Criteria for Adverse Events (CTCAE) grading system and classified the ADEs into the CTCAE in the Web Ontology Language (OWL). We identified and validated 2,444 unique drug-ADE pairs in the domain of cardiac disorders, of which 760 pairs are in Grade 5, 775 pairs in Grade 4 and 2,196 pairs in Grade 3.

  13. Automatic Seismic-Event Classification with Convolutional Neural Networks.

    Science.gov (United States)

    Bueno Rodriguez, A.; Titos Luzón, M.; Garcia Martinez, L.; Benitez, C.; Ibáñez, J. M.

    2017-12-01

    Active volcanoes exhibit a wide range of seismic signals, providing vast amounts of unlabelled volcano-seismic data that can be analyzed through the lens of artificial intelligence. However, obtaining high-quality labelled data is time-consuming and expensive. Deep neural networks can process data in their raw form, compute high-level features and provide a better representation of the input data distribution. These systems can be deployed to classify seismic data at scale, enhance current early-warning systems and build extensive seismic catalogs. In this research, we aim to classify spectrograms from seven different seismic events registered at "Volcán de Fuego" (Colima, Mexico) during four eruptive periods. Our approach is based on convolutional neural networks (CNNs), a sub-type of deep neural networks that can exploit the grid structure of the data. Volcano-seismic signals can be mapped into a grid-like structure using the spectrogram: a representation of the temporal evolution in terms of time and frequency. Spectrograms were computed from the data using Hamming windows of 4 s length, 2.5 s overlap and 128-point FFT resolution. Results are compared to deep neural networks, random forest and SVMs. Experiments show that CNNs can exploit temporal and frequency information, attaining a classification accuracy of 93%, similar to deep networks (91%) but outperforming SVM and random forest. These results empirically show that CNNs are powerful models for classifying a wide range of volcano-seismic signals, and achieve good generalization. Furthermore, volcano-seismic spectrograms contain useful discriminative information for the CNN, as higher layers of the network combine high-level features computed for each frequency band, helping to detect simultaneous events in time.
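
The spectrogram computation described (4 s Hamming windows, 2.5 s overlap, 128-point FFT) might be reproduced along these lines. The sampling rate and the synthetic test signal are assumptions of this sketch, since the abstract does not state them.

```python
import numpy as np

def spectrogram(signal, fs, win_s=4.0, overlap_s=2.5, nfft=128):
    """Hamming-windowed power spectrogram with the parameters quoted in the
    abstract: 4 s windows, 2.5 s overlap, 128-point FFT (zero-padded)."""
    win_len = int(win_s * fs)
    hop = int((win_s - overlap_s) * fs)
    window = np.hamming(win_len)
    frames = []
    for start in range(0, len(signal) - win_len + 1, hop):
        seg = signal[start:start + win_len] * window
        frames.append(np.abs(np.fft.rfft(seg, n=nfft)) ** 2)
    return np.array(frames).T        # shape: (nfft // 2 + 1) bins x n_frames

fs = 25.0                            # assumed sampling rate, not stated
t = np.arange(0, 60, 1 / fs)         # 60 s synthetic trace
x = np.sin(2 * np.pi * 5.0 * t)      # pure 5 Hz tone stands in for a signal
S = spectrogram(x, fs)               # energy concentrates near the 5 Hz bin
```

Each column of `S` is one time-frequency frame; stacking such frames gives the grid-like input that the CNN consumes.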
Being at the intersection of deep learning and geophysics, this research enables future studies of how CNNs can be used in volcano monitoring to accurately determine the detection and

  14. Failure frequencies and probabilities applicable to BWR and PWR piping

    International Nuclear Information System (INIS)

    Bush, S.H.; Chockie, A.D.

    1996-03-01

    This report deals with failure probabilities and failure frequencies of nuclear plant piping and the failure frequencies of flanges and bellows. Piping failure probabilities are derived from Piping Reliability Analysis Including Seismic Events (PRAISE) computer code calculations based on fatigue and intergranular stress corrosion as failure mechanisms. Values for both failure probabilities and failure frequencies are cited from several sources to yield a better evaluation of the spread in mean and median values as well as the widths of the uncertainty bands. A general conclusion is that the numbers from WASH-1400 often used in PRAs are unduly conservative. Failure frequencies for both leaks and large breaks tend to be higher than would be calculated using the failure probabilities, primarily because the frequencies are based on a relatively small number of operating years. Also, failure probabilities are substantially lower because of the probability distributions used in PRAISE calculations. A general conclusion is that large LOCA probability values calculated using PRAISE will be quite small, on the order of less than 1E-8 per year (<1E-8/year). The values in this report should be recognized as having inherent limitations and should be considered as estimates and not absolute values. 24 refs

  15. External events analysis for the Savannah River Site K reactor

    International Nuclear Information System (INIS)

    Brandyberry, M.D.; Wingo, H.E.

    1990-01-01

    The probabilistic external events analysis performed for the Savannah River Site K-reactor PRA considered many different events which are generally perceived to be "external" to the reactor and its systems, such as fires, floods, seismic events, and transportation accidents (as well as many others). Events which have been shown to be significant contributors to risk include seismic events, tornados, a crane failure scenario, fires and dam failures. The total contribution to the core melt frequency from external initiators has been found to be 2.2 × 10⁻⁴ per year, of which seismic events are the major contributor (1.2 × 10⁻⁴ per year). Fire-initiated events contribute 1.4 × 10⁻⁷ per year, tornados 5.8 × 10⁻⁷ per year, dam failures 1.5 × 10⁻⁶ per year and the crane failure scenario less than 10⁻⁴ per year to the core melt frequency. 8 refs., 3 figs., 5 tabs

  16. The Tracking Resonance Frequency Method for Photoacoustic Measurements Based on the Phase Response

    Science.gov (United States)

    Suchenek, Mariusz

    2017-04-01

    One of the major issues in the use of a resonant photoacoustic cell is the resonance frequency of the cell. The frequency is not stable, and its changes depend mostly on temperature and gas mixture. This paper presents a new method for tracking the resonance frequency, in which both the amplitude and phase are calculated from the input samples. The stimulating frequency can be adjusted to the resonance frequency of the cell based on the phase. The method was implemented using a digital measurement system with an analog-to-digital converter, a field-programmable gate array (FPGA) and a microcontroller. The resonance frequency was changed by the injection of carbon dioxide into the cell. A theoretical description and experimental results are also presented.
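
A minimal sketch of phase-based tracking: compute amplitude and phase of the response at the drive frequency by lock-in style I/Q demodulation, then nudge the drive frequency until the measured phase hits the resonance phase. The target phase, proportional gain, and `track` update rule are illustrative assumptions, not the paper's implementation.

```python
import math

def iq_phase(samples, f_drive, fs):
    """Estimate amplitude and phase of the component at f_drive by I/Q
    (lock-in style) demodulation of the sampled microphone signal."""
    i = q = 0.0
    for n, x in enumerate(samples):
        w = 2 * math.pi * f_drive * n / fs
        i += x * math.cos(w)
        q += x * math.sin(w)
    amp = 2 * math.hypot(i, q) / len(samples)
    phase = math.atan2(q, i)
    return amp, phase

def track(f_drive, phase, phase_target=-math.pi / 2, gain=5.0):
    """Hypothetical proportional step steering the drive toward the
    frequency where the response phase equals the resonance phase."""
    return f_drive + gain * (phase_target - phase)

fs, f_drive, phi = 48000.0, 1000.0, 0.3
samples = [math.cos(2 * math.pi * f_drive * n / fs + phi) for n in range(4800)]
amp, phase = iq_phase(samples, f_drive, fs)   # recovers amp ~ 1, phase ~ -phi
f_next = track(f_drive, phase)
```

The design choice here mirrors the paper's point: near resonance the phase varies steeply with frequency, so it is a more sensitive steering signal than the amplitude.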

  17. Frequency-based time-series gene expression recomposition using PRIISM

    Directory of Open Access Journals (Sweden)

    Rosa Bruce A

    2012-06-01

    Full Text Available Abstract Background Circadian rhythm pathways influence the expression patterns of as much as 31% of the Arabidopsis genome through complicated interaction pathways, and have been found to be significantly disrupted by biotic and abiotic stress treatments, complicating treatment-response gene discovery methods due to clock pattern mismatches in the fold change-based statistics. The PRIISM (Pattern Recomposition for the Isolation of Independent Signals in Microarray data) algorithm outlined in this paper is designed to separate pattern changes induced by different forces, including treatment-response pathways and circadian clock rhythm disruptions. Results Using the Fourier transform, high-resolution time-series microarray data is projected to the frequency domain. By identifying the clock frequency range from the core circadian clock genes, we separate the frequency spectrum into sections containing treatment-frequency (representing up- or down-regulation by an adaptive treatment response), clock-frequency (representing the circadian clock-disruption response) and noise-frequency components. Then, we project the components' spectra back to the expression domain to reconstruct isolated, independent gene expression patterns representing the effects of the different influences. By applying PRIISM to a high-resolution time-series Arabidopsis microarray dataset under a cold treatment, we systematically evaluated our method using maximum fold change and principal component analyses. The results of this study showed that the ranked treatment-frequency fold change results produce fewer false positives than the original methodology, and the 26-hour timepoint in our dataset was the best statistic for distinguishing the most known cold-response genes. In addition, six novel cold-response genes were discovered. PRIISM also provides gene expression data which represents only circadian clock influences, and may be useful for circadian clock studies.
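
The core PRIISM idea of projecting a time series to the frequency domain, isolating the clock band, and projecting each part back can be sketched with the FFT. The band limits, drift model, and hourly sampling are illustrative assumptions of this sketch, not the published implementation.

```python
import numpy as np

def band_split(series, clock_band, dt=1.0):
    """Split a uniformly sampled expression time series into a clock-band
    component and an out-of-band ("treatment" plus noise) component.
    clock_band = (f_lo, f_hi) in cycles per time unit."""
    spec = np.fft.rfft(series)
    freqs = np.fft.rfftfreq(len(series), d=dt)
    in_band = (freqs >= clock_band[0]) & (freqs <= clock_band[1])
    clock = np.fft.irfft(np.where(in_band, spec, 0), n=len(series))
    rest = np.fft.irfft(np.where(in_band, 0, spec), n=len(series))
    return clock, rest

# ~24 h circadian oscillation plus a slow treatment-like drift,
# sampled hourly for 96 h (all values synthetic).
t = np.arange(96.0)
series = np.sin(2 * np.pi * t / 24) + 0.01 * t
clock, rest = band_split(series, clock_band=(1 / 26, 1 / 22), dt=1.0)
```

Because the split is a partition of the spectrum, the two components sum back exactly to the original series, and the clock component recovers the ~24 h oscillation.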

  18. Using Web Crawler Technology for Text Analysis of Geo-Events: A Case Study of the Huangyan Island Incident

    Science.gov (United States)

    Hu, H.; Ge, Y. J.

    2013-11-01

    As social networking and the socialisation of the web have brought more text information and social relationships into our daily lives, the question of whether big data can be fully used to study the phenomena and disciplines of the natural sciences has prompted many specialists and scholars to innovate their research. Although politics has been entangled with hyperlinked-web issues since the 1990s, and the automatic assembly of geospatial web services and distributed geospatial information systems using service chaining has been explored and built recently, the information collection and data visualisation of geo-events have always faced the bottleneck of traditional manual analysis because of the sensitivity, complexity, relativity, timeliness and unexpected character of political events. Based on the Heritrix framework and the analysis of web-based text, the word frequency, sentiment tendency and dissemination path of the Huangyan Island incident are studied here by combining web crawler technology and text analysis methods. The results indicate that the tag cloud, frequency map, attitude pie charts, individual mention ratios and dissemination flow graph based on the collected and processed data not only highlight the subject and theme vocabularies of related topics but also reveal certain issues and problems behind them. Being able to express the time-space relationship of text information and to trace the dissemination of geo-events, the text analysis of network information based on focused web crawler technology can be a tool for understanding the formation and diffusion of web-based public opinion on political events.
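
The word-frequency step that feeds a tag cloud or frequency map can be sketched as below. The stop-word list and tokenization are assumptions of this sketch, and the real pipeline would first collect the page text with a Heritrix-based crawler.

```python
import re
from collections import Counter

def word_frequencies(texts, stopwords=frozenset({"the", "a", "of", "and", "in"})):
    """Build the word-frequency table behind a tag cloud from crawled page
    texts; a minimal sketch of the text-analysis step only."""
    counts = Counter()
    for text in texts:
        for word in re.findall(r"[a-z']+", text.lower()):
            if word not in stopwords:
                counts[word] += 1
    return counts

# Hypothetical snippets standing in for crawled pages about the incident.
pages = [
    "The Huangyan Island incident dominated the headlines.",
    "Coverage of the Huangyan Island dispute spread quickly.",
]
freq = word_frequencies(pages)
```

Ranking `freq` by count gives the subject and theme vocabularies that the tag cloud visualises; sentiment and dissemination-path analyses would build on the same tokenized text.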

  19. Joint Angle and Frequency Estimation Using Multiple-Delay Output Based on ESPRIT

    Science.gov (United States)

    Xudong, Wang

    2010-12-01

    This paper presents a novel ESPRIT-based algorithm for joint angle and frequency estimation using multiple-delay output (MDJAFE). The algorithm estimates angles and frequencies jointly, and the use of multiple delayed outputs greatly improves the estimation accuracy compared with a conventional algorithm. The behavior of the proposed algorithm is verified by simulations.

  20. A New Quantum Key Distribution Scheme Based on Frequency and Time Coding

    International Nuclear Information System (INIS)

    Chang-Hua, Zhu; Chang-Xing, Pei; Dong-Xiao, Quan; Jing-Liang, Gao; Nan, Chen; Yun-Hui, Yi

    2010-01-01

    A new scheme of quantum key distribution (QKD) using frequency and time coding is proposed, in which the security is based on the frequency-time uncertainty relation. In this scheme, the binary information sequence is encoded randomly on either the central frequency or the time delay of the optical pulse at the sender. The central frequency of the single-photon pulse is set to ω₁ for bit 0 and to ω₂ for bit 1 when frequency coding is selected. However, the single-photon pulse is not delayed for bit 0 and is delayed by τ for bit 1 when time coding is selected. At the receiver, either the frequency or the time delay of the pulse is measured randomly, and the final key is obtained after basis comparison, data reconciliation and privacy amplification. With the proposed method, the effect of noise in the fiber channel and environment on the QKD system can be reduced effectively.

  1. A robust neural network-based approach for microseismic event detection

    KAUST Repository

    Akram, Jubran; Ovcharenko, Oleg; Peter, Daniel

    2017-01-01

    We present an artificial neural network based approach for robust event detection from low S/N waveforms. We use a feed-forward network with a single hidden layer that is tuned on a training dataset and later applied to the entire example dataset.

  2. Do regional methods really help reduce uncertainties in flood frequency analyses?

    Science.gov (United States)

    Cong Nguyen, Chi; Payrastre, Olivier; Gaume, Eric

    2013-04-01

    Flood frequency analyses are often based on continuous measured series at gauge sites. However, the length of the available data sets is usually too short to provide reliable estimates of extreme design floods. To reduce the estimation uncertainties, the analyzed data sets have to be extended either in time, making use of historical and paleoflood data, or in space, merging data sets considered statistically homogeneous to build large regional data samples. Nevertheless, the advantage of regional analyses, the important increase in the size of the studied data sets, may be counterbalanced by possible heterogeneities of the merged sets. The application and comparison of four different flood frequency analysis methods in two regions affected by flash floods in the south of France (Ardèche and Var) illustrates how this balance between the number of records and possible heterogeneities plays out in real-world applications. The four tested methods are: (1) a local statistical analysis based on the existing series of measured discharges, (2) a local analysis valuing the existing information on historical floods, (3) a standard regional flood frequency analysis based on existing measured series at gauged sites, and (4) a modified regional analysis including estimated extreme peak discharges at ungauged sites. Monte Carlo simulations are conducted to simulate a large number of discharge series with characteristics similar to the observed ones (type of statistical distributions, number of sites and records) to evaluate to what extent the results obtained on these case studies can be generalized. These two case studies indicate that even small statistical heterogeneities, which are not detected by the standard homogeneity tests implemented in regional flood frequency studies, may drastically limit the usefulness of such approaches. On the other hand, these results show that the valuation of information on extreme events, either historical flood events at gauged

  3. Climate change impacts on the duration and frequency of combined sewer overflows

    Science.gov (United States)

    Fortier, C.; Mailhot, A.

    2012-12-01

    Combined sewer overflows (CSO) occur when large rainwater inflow from heavy precipitation exceeds the capacity of urban combined sewage systems. Many American and European cities with old sewage systems see their water quality significantly deteriorate during such events. In the long term, changes in the rainfall regime due to climate change may lead to more severe and more frequent CSO episodes and thus compel cities to review their global water management. The overall objective of this study is to investigate how climate change will impact CSO frequency and duration. Data from rain gauges located near 30 overflow outfalls in southern Quebec, Canada, were used to identify rain events leading to overflows, using CSO monitored data from May to October during the period 2007-2009. For each site, occurrence and duration of CSO events were recorded and linked to a rainfall event. Many rain event features can be used to predict CSO events, such as total depth, duration, average intensity and peak intensity. Results based on Pearson product-moment correlation coefficients and multiple regression analysis show that CSO occurrence is best predicted by total rainfall. A methodology is proposed to calculate the CSO probability of occurrence and duration for each site of interest using rainfall series as input data. The Monte Carlo method is then used to estimate CSO frequency. To evaluate the climate change impact on CSO, these relationships are used with simulated data from the Canadian Regional Climate Model to compare the distribution of the annual number of CSO events over the 1960-1990 period and the 2070-2100 period.

  4. Multitask Learning-Based Security Event Forecast Methods for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Hui He

    2016-01-01

    Full Text Available Wireless sensor networks have strong dynamics and uncertainty, including network topological changes, node disappearance or addition, and exposure to various threats. First, to strengthen the detection adaptability of wireless sensor networks to various security attacks, a region-similarity multitask-based security event forecast method for wireless sensor networks is proposed. This method performs topology partitioning on a large-scale sensor network and calculates the similarity degree among regional subnetworks. The trend of unknown network security events can be predicted through multitask learning of the occurrence and transmission characteristics of known network security events. Second, in case of lacking regional data, the quantitative trend of unknown regional network security events can be calculated. This study introduces a sensor network security event forecast method named Prediction Network Security Incomplete Unmarked Data (PNSIUD) to forecast missing attack data in the target region according to the known partial data in similar regions. Experimental results indicate that for an unknown security event forecast, the forecast accuracy and effectiveness of the similarity forecast algorithm are better than those of the single-task learning method. At the same time, the forecast accuracy of the PNSIUD method is better than that of the traditional support vector machine method.

  5. Extreme Precipitation Estimation with Typhoon Morakot Using Frequency and Spatial Analysis

    Directory of Open Access Journals (Sweden)

    Hone-Jay Chu

    2011-01-01

    Full Text Available Typhoon Morakot lashed Taiwan and produced copious amounts of precipitation in 2009. From the point of view of hydrological statistics, the impact of the precipitation from Typhoon Morakot can be analyzed and discussed using frequency analysis. The frequency curve, fitted mathematically to historical observed data, can be used to estimate the probability of exceedance for runoff events of a certain magnitude. This study integrates frequency analysis and spatial analysis to assess the effect of the Typhoon Morakot event on rainfall frequency in the Gaoping River basin of southern Taiwan. First, extreme rainfall data are collected at sixteen stations for durations of 1, 3, 6, 12, and 24 hours, and then an appropriate probability distribution is selected to analyze the impact of the extreme hydrological event. Spatial rainfall patterns for a return period of 200 yr with 24-hr duration, with and without Typhoon Morakot, are estimated. Results show that, for long durations, the rainfall amounts obtained from the frequency analysis differ significantly with and without the event. Furthermore, the spatial analysis shows that extreme rainfall for a return period of 200 yr is highly dependent on topography and is smaller in the southwest than in the east. The results not only demonstrate the distinct effect of Typhoon Morakot on frequency analysis, but could also provide a reference for future planning of hydrological engineering.
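
A return-level estimate of the kind underlying a 200-yr design rainfall can be sketched with a method-of-moments Gumbel fit. The annual-maximum depths below are invented for illustration, and the paper may use a different distribution and fitting method.

```python
import math
import statistics

def gumbel_return_level(annual_maxima, T):
    """Method-of-moments Gumbel fit of annual maxima and the T-year
    return level x_T = mu - beta * ln(-ln(1 - 1/T))."""
    mean = statistics.mean(annual_maxima)
    std = statistics.stdev(annual_maxima)
    beta = std * math.sqrt(6) / math.pi   # scale parameter
    mu = mean - 0.5772 * beta             # location (Euler-Mascheroni constant)
    return mu - beta * math.log(-math.log(1 - 1 / T))

# Hypothetical annual-maximum 24-hr rainfall depths (mm), not station data.
maxima = [310, 420, 280, 510, 390, 450, 330, 600, 370, 480]
x200 = gumbel_return_level(maxima, 200)   # estimated 200-yr, 24-hr depth
```

The 200-yr level necessarily lies above every observed maximum in this short record, which is why including or excluding a single extreme event like Morakot can shift the fitted frequency curve so strongly.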

  6. Population based allele frequencies of disease associated polymorphisms in the Personalized Medicine Research Project.

    Science.gov (United States)

    Cross, Deanna S; Ivacic, Lynn C; Stefanski, Elisha L; McCarty, Catherine A

    2010-06-17

    There is a lack of knowledge regarding the frequency of disease-associated polymorphisms in populations, and the population attributable risk for many populations remains unknown. Factors that could affect the association of the allele with disease, either positively or negatively, such as race, ethnicity, and gender, may not be possible to determine without population-based allele frequencies. Here we used a panel of 51 polymorphisms previously associated with at least one disease and determined the allele frequencies within the entire Personalized Medicine Research Project population-based cohort. We compared these allele frequencies to those in dbSNP and other data sources stratified by race. Differences in allele frequencies between self-reported race, region of origin, and sex were determined. There were 19,544 individuals who self-reported a single racial category; 19,027 (97.4%) self-reported white Caucasian, and 11,205 (57.3%) individuals were female. Of the 11,208 (57%) individuals with an identifiable region of origin, 8,337 (74.4%) were German. 41 polymorphisms were significantly different between self-reported races at the 0.05 level. Stratification of our Caucasian population by self-reported region of origin revealed 19 polymorphisms that were significantly different (p = 0.05) between individuals of different origins. Further stratification of the population by gender revealed few significant differences in allele frequencies between the genders. This represents one of the largest population-based allele frequency studies to date. Stratification by self-reported race and region of origin revealed wide differences in allele frequencies not only by race but also by region of origin within a single racial group. We report allele frequencies for our Asian/Hmong and American Indian populations; these two minority groups are not typically selected for population allele frequency detection. Population-wide allele frequencies are important for the design and

  7. Event-based motion correction for PET transmission measurements with a rotating point source

    International Nuclear Information System (INIS)

    Zhou, Victor W; Kyme, Andre Z; Meikle, Steven R; Fulton, Roger

    2011-01-01

    Accurate attenuation correction is important for quantitative positron emission tomography (PET) studies. When performing transmission measurements using an external rotating radioactive source, object motion during the transmission scan can distort the attenuation correction factors computed as the ratio of the blank to transmission counts, and cause errors and artefacts in reconstructed PET images. In this paper we report a compensation method for rigid body motion during PET transmission measurements, in which list mode transmission data are motion corrected event-by-event, based on known motion, to ensure that all events which traverse the same path through the object are recorded on a common line of response (LOR). As a result, the motion-corrected transmission LOR may record a combination of events originally detected on different LORs. To ensure that the corresponding blank LOR records events from the same combination of contributing LORs, the list mode blank data are spatially transformed event-by-event based on the same motion information. The number of counts recorded on the resulting blank LOR is then equivalent to the number of counts that would have been recorded on the corresponding motion-corrected transmission LOR in the absence of any attenuating object. The proposed method has been verified in phantom studies with both stepwise movements and continuous motion. We found that attenuation maps derived from motion-corrected transmission and blank data agree well with those of the stationary phantom and are significantly better than uncorrected attenuation data.
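    The event-by-event correction described above amounts to mapping both endpoints of each line of response through the known rigid-body motion, for transmission and blank data alike, and then forming attenuation correction factors as the blank-to-transmission count ratio. A minimal sketch (function names and the point geometry are illustrative assumptions, not the authors' code):

```python
import numpy as np

def correct_lor(p1, p2, R, t):
    """Map both endpoints of a line of response (LOR) through the known
    rigid-body motion (rotation R, translation t), so that events that
    traversed the same path through the object land on a common LOR."""
    return R @ p1 + t, R @ p2 + t

def attenuation_correction_factors(blank_counts, trans_counts):
    """ACF per LOR as the ratio of blank to transmission counts."""
    blank = np.asarray(blank_counts, dtype=float)
    trans = np.asarray(trans_counts, dtype=float)
    return np.where(trans > 0, blank / trans, 1.0)

# Example: a 10 degree rotation about z plus an axial shift of 0.5
theta = np.deg2rad(10.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
p1c, p2c = correct_lor(np.array([1.0, 0.0, 0.0]),
                       np.array([-1.0, 0.0, 0.0]), R, np.array([0.0, 0.0, 0.5]))

# ACFs from (hypothetical) motion-corrected blank and transmission counts
acf = attenuation_correction_factors([1500, 900], [600, 450])
```

In practice the transformed continuous endpoints must still be rebinned to the scanner's discrete LOR grid, which is where the "combination of contributing LORs" in the abstract arises.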

  8. Scenario-based stochastic optimal operation of wind, photovoltaic, pump-storage hybrid system in frequency- based pricing

    International Nuclear Information System (INIS)

    Zare Oskouei, Morteza; Sadeghi Yazdankhah, Ahmad

    2015-01-01

    Highlights: • Two-stage objective function is proposed for the optimization problem. • Hourly-based optimal contractual agreement is calculated. • Scenario-based stochastic optimization problem is solved. • Improvement of system frequency by utilizing the PSH unit. - Abstract: This paper proposes an operating strategy for a microgrid-connected wind, photovoltaic, and pump-storage hybrid system. The strategy consists of two stages. In the first stage, the optimal hourly contractual agreement is determined. The second stage maximizes profit by adapting the energy management strategy of the wind and photovoltaic units in coordination with the optimal operating schedule of the storage device, under frequency-based pricing in a day-ahead electricity market. The pump-storage hydro plant is utilized to minimize unscheduled interchange flow and maximize the system benefit by participating in frequency control based on the energy price. Because of uncertainties in the power generation of renewable sources and in market prices, generation scheduling is modeled as a stochastic optimization problem. Uncertain parameters are modeled by scenario generation and scenario reduction methods. The optimization problem is solved using the General Algebraic Modeling System (GAMS) with CPLEX. To verify the efficiency of the method, the algorithm is applied to various scenarios with different wind and photovoltaic power productions in a day-ahead electricity market. The numerical results demonstrate the effectiveness of the proposed approach.
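    The first stage can be illustrated with a toy Monte-Carlo version of the scenario approach: sample scenarios of renewable output and price, reduce them to an equally weighted subset, and pick the hourly contract level that maximizes expected profit. This is only a hedged stand-in for the paper's GAMS/CPLEX formulation; all numbers, distributions, and the reduction rule are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hourly scenarios of combined wind + PV output (MW) and
# day-ahead price ($/MWh); distributions are invented for illustration.
n_scen = 1000
power = np.clip(rng.normal(50.0, 10.0, n_scen), 0.0, None)
price = np.clip(rng.normal(40.0, 5.0, n_scen), 0.0, None)

# Crude scenario reduction: keep an equally weighted, power-sorted subset
k = 100
idx = np.argsort(power)[:: n_scen // k][:k]
prob = np.full(k, 1.0 / k)

def expected_profit(c, penalty=60.0):
    """Expected profit of contracting c MW: revenue minus a penalty on
    the shortfall the storage unit would otherwise have to cover."""
    shortfall = np.clip(c - power[idx], 0.0, None)
    return float(np.sum(prob * (price[idx] * c - penalty * shortfall)))

# First stage: pick the hourly contract level maximizing expected profit
candidates = np.linspace(0.0, 80.0, 81)
best_c = max(candidates, key=expected_profit)
```

Real scenario reduction (e.g. fast-forward selection) preserves the probability distances between scenarios rather than simply subsampling.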

  9. Atmospheric dust events in central Asia: Relationship to wind, soil type, and land use

    Science.gov (United States)

    Pi, Huawei; Sharratt, Brenton; Lei, Jiaqiang

    2017-06-01

    Xinjiang Province in northwest China is one of the most important source regions of atmospheric dust in the world. Spatial-temporal characteristics of dust events in the province were investigated by time series analysis of annual dust event frequency and meteorological data collected at 101 meteorological stations from 1960 to 2007. Blowing dust frequency (BDF) and dust storm frequency (DSF) decreased with time in North, South, and East Xinjiang, whereas floating dust frequency (FDF) decreased with time only in South and East Xinjiang. Dust concentrations were lower in North than in South Xinjiang and decreased with time in East Xinjiang. Wind significantly influenced the temporal trend in FDF, BDF, and DSF in South Xinjiang and DSF in North Xinjiang. The frequency of dust events was smaller by an order of magnitude in North (10.9 d yr^-1) than in South Xinjiang (111.3 d yr^-1), possibly due in part to higher annual precipitation in North Xinjiang. Floating dust was most frequently observed in East and South Xinjiang, while blowing dust was most frequently observed in North Xinjiang. The high frequency of floating dust in East and South Xinjiang is likely due to the enclosed terrain that characterizes these regions. Land use and soil type also influenced dust events. Although climate influences the frequency of dust events, their occurrence may be reduced most effectively by imposing better land management practices in deciduous forests or orchards characterized by saline soils in North and East Xinjiang, respectively, and in meadows characterized by Guanyu soils in South Xinjiang.

  10. Ultra High-Speed Radio Frequency Switch Based on Photonics.

    Science.gov (United States)

    Ge, Jia; Fok, Mable P

    2015-11-26

    Microwave switches, or Radio Frequency (RF) switches, have been used intensively in microwave systems for signal routing. Compared with the fast development of microwave and wireless systems, RF switches have remained underdeveloped, particularly in terms of switching speed and operating bandwidth. In this paper, we propose a photonics-based RF switch capable of switching at tens-of-picoseconds speed, hundreds of times faster than any existing RF switch technology. The high-speed switching property is achieved with a rapidly tunable microwave photonic filter with tens-of-gigahertz frequency tuning speed, where the tuning mechanism is based on the ultra-fast electro-optic Pockels effect. The RF switch has a wide operation bandwidth of 12 GHz and can go up to 40 GHz, depending on the bandwidth of the modulator used in the scheme. The proposed RF switch can work either as an ON/OFF switch or as a two-channel switch; a switching speed of tens of picoseconds is experimentally observed for both types of switch.

  11. Three-frequency BDS precise point positioning ambiguity resolution based on raw observables

    Science.gov (United States)

    Li, Pan; Zhang, Xiaohong; Ge, Maorong; Schuh, Harald

    2018-02-01

    All BeiDou navigation satellite system (BDS) satellites transmit signals on three frequencies, which brings new opportunities and challenges for high-accuracy precise point positioning (PPP) with ambiguity resolution (AR). This paper proposes an effective uncalibrated phase delay (UPD) estimation and AR strategy based on a raw PPP model. First, triple-frequency raw PPP models are developed. The observation model and stochastic model are designed and extended to accommodate the third frequency. Then, the UPD is parameterized in raw frequency form while being estimated with the high-precision, low-noise integer linear combinations of float ambiguities derived by ambiguity decorrelation. Third, with the UPDs corrected, the LAMBDA method is used to resolve the full or partial set of ambiguities that can be fixed. This method can be easily and flexibly extended to dual, triple, or even more frequencies. To verify the effectiveness and performance of triple-frequency PPP AR, tests with real BDS data from 90 stations lasting 21 days were performed in static mode. Data were processed with three strategies: BDS triple-frequency ambiguity-float PPP, and BDS triple-frequency PPP with dual-frequency (B1/B2) AR and with three-frequency AR, respectively. Numerous experimental results showed that, compared with the ambiguity-float solution, the performance in terms of convergence time and positioning biases can be significantly improved by AR. Among the three groups of solutions, the triple-frequency PPP AR achieved the best performance. Compared with dual-frequency AR, the additional third frequency could apparently improve the position estimates during the initialization phase and in constrained environments where dual-frequency PPP AR is limited by the small number of usable satellites.

  12. A Comparison of AOP Classification Based on Difficulty, Importance, and Frequency by Cluster Analysis and Standardized Mean

    International Nuclear Information System (INIS)

    Choi, Sun Yeong; Jung, Wondea

    2014-01-01

    In Korea, there are plants that have more than one hundred kinds of abnormal operation procedures (AOPs). Therefore, operators have started to recognize the importance of classifying the AOPs. They should pay attention to those AOPs that require emergency measures against an abnormal status that has a more serious effect on plant safety and/or occurs more often. We suggested a measure for prioritizing AOPs for training purposes based on difficulty, importance, and frequency. A DIF analysis, based on how difficult a task is, how important it is, and how frequently it occurs, is a well-known method of assessing performance and prioritizing training needs and planning. We used an SDIF-mean (Standardized DIF-mean) to prioritize AOPs in the previous paper. For the SDIF-mean, we standardized each of the three kinds of data. The results of this research will be utilized not only to understand AOP characteristics at a job-analysis level but also to develop an effective AOP training program. The purpose of this paper is to perform a cluster analysis for AOP classification and to compare the results of the cluster analysis with those obtained from the standardized mean based on difficulty, importance, and frequency. In this paper, we categorized AOPs into three groups by a cluster analysis based on D, I, and F. Clustering is the classification of similar objects into groups so that each group shares some common characteristics. In addition, we compared the result of the cluster analysis in this paper with the classification result obtained from the SDIF-mean in the previous paper. From the comparison, we found that a reevaluation may be required to assign a training interval for the AOPs of group C' in the previous paper, which have lower SDIF-means. The reason for this is that some of the AOPs of group C' have quite high D and I values while having the lowest frequencies. From an educational point of view, AOPs in group which have the highest difficulty and importance, but
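    The two classification routes being compared, a standardized DIF mean versus a cluster analysis, can be sketched side by side. The ratings below are invented, and a 1-D k-means on the combined score stands in for the paper's multivariate clustering, purely for illustration:

```python
import numpy as np

def zscore(x):
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()

def sdif_mean(d, i, f):
    """Standardized DIF mean: standardize each dimension, then average."""
    return (zscore(d) + zscore(i) + zscore(f)) / 3.0

def kmeans_1d(values, k=3, iters=50):
    """Tiny k-means on a 1-D score, as an illustrative stand-in for the
    paper's cluster analysis on (D, I, F)."""
    v = np.asarray(values, dtype=float)
    centers = np.quantile(v, np.linspace(0.1, 0.9, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(v[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = v[labels == j].mean()
    return labels

# Hypothetical D, I, F ratings for eight AOPs (1 = low, 5 = high)
d = [4, 5, 2, 1, 3, 5, 2, 1]
i = [5, 4, 2, 1, 3, 5, 1, 2]
f = [1, 1, 4, 5, 3, 2, 5, 4]
score = sdif_mean(d, i, f)
groups = kmeans_1d(score)
```

Note how AOPs with high D and I but low F end up with middling SDIF-means, which is exactly the group C' situation flagged in the comparison above.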

  13. A Comparison of AOP Classification Based on Difficulty, Importance, and Frequency by Cluster Analysis and Standardized Mean

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Sun Yeong; Jung, Wondea [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-05-15

    In Korea, there are plants that have more than one hundred kinds of abnormal operation procedures (AOPs). Therefore, operators have started to recognize the importance of classifying the AOPs. They should pay attention to those AOPs that require emergency measures against an abnormal status that has a more serious effect on plant safety and/or occurs more often. We suggested a measure for prioritizing AOPs for training purposes based on difficulty, importance, and frequency. A DIF analysis, based on how difficult a task is, how important it is, and how frequently it occurs, is a well-known method of assessing performance and prioritizing training needs and planning. We used an SDIF-mean (Standardized DIF-mean) to prioritize AOPs in the previous paper. For the SDIF-mean, we standardized each of the three kinds of data. The results of this research will be utilized not only to understand AOP characteristics at a job-analysis level but also to develop an effective AOP training program. The purpose of this paper is to perform a cluster analysis for AOP classification and to compare the results of the cluster analysis with those obtained from the standardized mean based on difficulty, importance, and frequency. In this paper, we categorized AOPs into three groups by a cluster analysis based on D, I, and F. Clustering is the classification of similar objects into groups so that each group shares some common characteristics. In addition, we compared the result of the cluster analysis in this paper with the classification result obtained from the SDIF-mean in the previous paper. From the comparison, we found that a reevaluation may be required to assign a training interval for the AOPs of group C' in the previous paper, which have lower SDIF-means. The reason for this is that some of the AOPs of group C' have quite high D and I values while having the lowest frequencies. From an educational point of view, AOPs in group which have the highest difficulty and importance, but

  14. Changes in Alpha Frequency and Power of the Electroencephalogram during Volatile-Based General Anesthesia

    Directory of Open Access Journals (Sweden)

    Darren Hight

    2017-05-01

    Full Text Available Oscillations in the electroencephalogram (EEG) at the alpha frequency (8–12 Hz) are thought to be ubiquitous during surgical anesthesia, but the details of how this oscillation responds to ongoing changes in volatile anesthetic concentration have not been well characterized. It is not known how often alpha oscillations are absent in the clinical context, how sensitively alpha frequency and power respond to changes in anesthetic concentration, and what effect increased age has on alpha frequency. Bipolar EEG was recorded frontally from 305 patients undergoing surgery with sevoflurane or desflurane providing general anesthesia. A new method of detecting the presence of alpha oscillations, based on the stability of the rate of change of the peak frequency in the alpha range, was developed. Linear concentration-response curves were fitted to assess the sensitivity of alpha power and frequency measures to changing levels of anesthesia. Alpha oscillations were inexplicably absent in around 4% of patients. Maximal alpha power increased with increasing volatile anesthetic concentrations in half of the patients, and decreased in the remaining patients. Alpha frequency decreased with increasing anesthetic concentrations in nearly 90% of patients. Increasing age was associated with decreased sensitivity to volatile anesthetic concentrations and with decreased alpha frequency, which sometimes transitioned into the theta range (5–7 Hz). While the peak alpha frequency shows a consistent slowing with increasing volatile concentrations, the peak power of the oscillation does not, suggesting that frequency might be more informative of depth of anesthesia than traditional power-based measures during volatile-based anesthesia. The alpha oscillation becomes slower with increasing age, even when the decreased anesthetic needs of older patients are taken into account.
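    Extracting the peak alpha frequency that such concentration-response fits rely on can be sketched with a windowed periodogram. This is a generic approach, not the authors' stability-based detector, and the synthetic signal is invented:

```python
import numpy as np

def peak_alpha_frequency(eeg, fs, band=(8.0, 12.0)):
    """Peak frequency within the alpha band from a windowed periodogram."""
    spec = np.abs(np.fft.rfft(eeg * np.hanning(eeg.size))) ** 2
    freqs = np.fft.rfftfreq(eeg.size, d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(freqs[mask][np.argmax(spec[mask])])

# Synthetic frontal EEG: a 10 Hz alpha rhythm buried in noise
fs = 250.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
eeg = np.sin(2 * np.pi * 10.0 * t) + 0.5 * rng.standard_normal(t.size)
f_peak = peak_alpha_frequency(eeg, fs)
```

Tracking f_peak over successive epochs, and checking how steadily it drifts, is the kind of quantity the abstract's detection method is built on.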

  15. Observer-Based Load Frequency Control for Island Microgrid with Photovoltaic Power

    Directory of Open Access Journals (Sweden)

    Chaoxu Mu

    2017-01-01

    Full Text Available As renewable energy is widely integrated into the power system, the stochastic and intermittent power generation from renewable energy may cause system frequency deviating from the prescribed level, especially for a microgrid. In this paper, the load frequency control (LFC of an island microgrid with photovoltaic (PV power and electric vehicles (EVs is investigated, where the EVs can be treated as distributed energy storages. Considering the disturbances from load change and PV power, an observer-based integral sliding mode (OISM controller is designed to regulate the frequency back to the prescribed value, where the neural network observer is used to online estimate the PV power. Simulation studies on a benchmark microgrid system are presented to illustrate the effectiveness of OISM controller, and comparative results also demonstrate that the proposed method has a superior performance for stabilizing the frequency over the PID control.

  16. Analysis for Human-related Events during the Overhaul

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ji Tae; Kim, Min Chull; Choi, Dong Won; Lee, Durk Hun [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2011-10-15

    Since 2008, the frequency of events due to human error has been decreasing among the 20 operating Nuclear Power Plants (NPPs), excluding the NPP in the commissioning stage (Shin-Kori unit 1). However, events due to human error during an overhaul (O/H) still occur annually (see Table I). An analysis of human-related events during the O/H was performed. From the analysis, similar problems were identified for each event, and organizational and safety-culture factors were identified as well.

  17. Next-Generation Intensity-Duration-Frequency Curves for Hydrologic Design in Snow-Dominated Environments

    Science.gov (United States)

    Yan, Hongxiang; Sun, Ning; Wigmosta, Mark; Skaggs, Richard; Hou, Zhangshuan; Leung, Ruby

    2018-02-01

    There is a renewed focus on the design of infrastructure resilient to extreme hydrometeorological events. While precipitation-based intensity-duration-frequency (IDF) curves are commonly used as part of infrastructure design, a large percentage of peak runoff events in snow-dominated regions are caused by snowmelt, particularly during rain-on-snow (ROS) events. In these regions, precipitation-based IDF curves may lead to substantial overestimation/underestimation of design basis events and subsequent overdesign/underdesign of infrastructure. To overcome this deficiency, we proposed next-generation IDF (NG-IDF) curves, which characterize the actual water reaching the land surface. We compared NG-IDF curves to standard precipitation-based IDF curves for estimates of extreme events at 376 Snowpack Telemetry (SNOTEL) stations across the western United States that each had at least 30 years of high-quality records. We found standard precipitation-based IDF curves at 45% of the stations were subject to underdesign, many with significant underestimation of 100 year extreme events, for which the precipitation-based IDF curves can underestimate water potentially available for runoff by as much as 125% due to snowmelt and ROS events. The regions with the greatest potential for underdesign were in the Pacific Northwest, the Sierra Nevada Mountains, and the Middle and Southern Rockies. We also found the potential for overdesign at 20% of the stations, primarily in the Middle Rockies and Arizona mountains. These results demonstrate the need to consider snow processes in the development of IDF curves, and they suggest use of the more robust NG-IDF curves for hydrologic design in snow-dominated environments.
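    The core of any IDF-style analysis is turning a station's annual maxima into return periods. A minimal sketch using Weibull plotting positions on hypothetical annual maxima of water reaching the land surface (rain plus snowmelt, in the NG-IDF spirit; the data are invented):

```python
import numpy as np

def return_levels(annual_maxima):
    """Empirical return periods via Weibull plotting positions.

    Returns (maxima sorted descending, return period in years), from
    which a design event (e.g. the 100 year value) can be read off or
    extrapolated with a fitted extreme-value distribution.
    """
    x = np.sort(np.asarray(annual_maxima, dtype=float))[::-1]
    n = x.size
    rank = np.arange(1, n + 1)           # 1 = largest event on record
    exceedance_prob = rank / (n + 1.0)   # Weibull plotting position
    return x, 1.0 / exceedance_prob

# Hypothetical 30-year record of annual maximum daily water input (mm)
rng = np.random.default_rng(2)
maxima = 40.0 + 15.0 * rng.gumbel(size=30)
levels, periods = return_levels(maxima)
```

Comparing the curve built from precipitation-only maxima with one built from rain-plus-snowmelt maxima at the same station is what exposes the under/overdesign the abstract quantifies.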

  18. Ancillary Frequency Control of Direct Drive Full-Scale Converter Based Wind Power Plants

    DEFF Research Database (Denmark)

    Hu, Weihao; Su, Chi; Fang, Jiakun

    2013-01-01

    This paper presents a simulation model of a wind power plant based on a MW-level variable speed wind turbine with a full-scale back-to-back power converter, developed in the simulation tool DIgSILENT Power Factory. Three different kinds of ancillary frequency control strategies, namely inertia emulation, primary frequency control and secondary frequency control, are proposed in order to improve the frequency stability of power systems. The modified IEEE 39-bus test system with a large-scale wind power penetration is chosen as the studied power system. Simulation results show that the proposed control strategies are effective means for providing ancillary frequency control of variable speed wind turbines with full-scale back-to-back power converters.

  19. Dissemination of optical-comb-based ultra-broadband frequency reference through a fiber network.

    Science.gov (United States)

    Nagano, Shigeo; Kumagai, Motohiro; Li, Ying; Ido, Tetsuya; Ishii, Shoken; Mizutani, Kohei; Aoki, Makoto; Otsuka, Ryohei; Hanado, Yuko

    2016-08-22

    We disseminated an ultra-broadband optical frequency reference based on a femtosecond (fs)-laser optical comb through a kilometer-scale fiber link. Its spectrum ranged from 1160 nm to 2180 nm without additional fs-laser combs at the end of the link. By employing a fiber-induced phase noise cancellation technique, the linewidth and fractional frequency instability attained for all disseminated comb modes were of the order of 1 Hz and 10^-18 at a 5000 s averaging time, respectively. The ultra-broadband optical frequency reference, whose absolute frequency is traceable to Japan Standard Time, was applied to the frequency stabilization of an injection-seeded Q-switched 2051 nm pulse laser for a coherent light detection and ranging (LIDAR) system.

  20. Intermediate Frequency Digital Receiver Based on Multi-FPGA System

    Directory of Open Access Journals (Sweden)

    Chengchang Zhang

    2016-01-01

    Full Text Available To address the high cost, large size, and inflexibility of traditional analog intermediate frequency receivers in the aerospace telemetry, tracking, and command (TTC) system, we propose a new intermediate frequency (IF) digital receiver based on a Multi-FPGA system in this paper. Digital beam forming (DBF) is realized by the coordinate rotation digital computer (CORDIC) algorithm. An experimental prototype has been developed on a compact Multi-FPGA system with three FPGAs to receive 16 channels of IF digital signals. Our experimental results show that the proposed scheme greatly simplifies the design of an IF digital receiver and offers a valuable reference for real-time, low-power, high-density, and small-size receiver design.
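    CORDIC suits FPGA-based beam forming because it rotates phasors with only shifts and adds. A floating-point sketch of the rotation mode (real implementations use fixed point; this only shows the iteration):

```python
import math

def cordic_rotate(x, y, angle, n_iter=24):
    """Rotate (x, y) by `angle` radians using CORDIC rotation mode.

    Each step adds or subtracts a shrinking micro-rotation atan(2^-i);
    the constant k undoes the accumulated gain. Converges for angles
    within about +/- 99.9 degrees.
    """
    k = 1.0
    for i in range(n_iter):
        k *= 1.0 / math.sqrt(1.0 + 2.0 ** (-2 * i))
    z = angle
    for i in range(n_iter):
        d = 1.0 if z >= 0 else -1.0
        x, y = x - d * y * 2.0 ** (-i), y + d * x * 2.0 ** (-i)
        z -= d * math.atan(2.0 ** (-i))
    return x * k, y * k

# Rotating (1, 0) by 30 degrees should land near (cos 30, sin 30)
cx, cy = cordic_rotate(1.0, 0.0, math.radians(30))
```

In DBF, each channel's samples are rotated by its per-element phase shift before summation; on the FPGA the `2.0 ** (-i)` factors become bit shifts.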

  1. Linear active disturbance rejection-based load frequency control concerning high penetration of wind energy

    International Nuclear Information System (INIS)

    Tang, Yanmei; Bai, Yan; Huang, Congzhi; Du, Bin

    2015-01-01

    Highlights: • A disturbance rejection solution to the load frequency control issue is proposed. • Several power systems with wind energy conversion systems have been tested. • A tuning algorithm for the controller parameters is proposed. • The performance of the proposed approach is better than that of traditional controllers. - Abstract: A new grid load frequency control approach is proposed for doubly fed induction generator based wind power plants. The load frequency control issue in a power system is undergoing fundamental changes due to the rapidly growing amount of wind energy conversion systems, and centers on maintaining the generation-load balance and rejecting disturbances. The prominent feature of the linear active disturbance rejection control approach is that the total disturbance can be estimated and then eliminated in real time, which makes it a feasible solution to the load frequency control issue. In this paper, the application of the linear active disturbance rejection control approach to the load frequency control issue for a complex power system with a doubly fed induction generator based wind energy conversion system is investigated. The load frequency control issue is formulated as a decentralized multi-objective optimization control problem, which is solved by the hybrid particle swarm optimization technique. To show the effectiveness of the proposed control scheme, robust performance testing based on the Monte-Carlo approach is carried out. The performance superiority of the system with the proposed linear active disturbance rejection control approach over that with traditional proportional-integral and fuzzy proportional-integral controllers is validated by the simulation results.
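    The "estimate the total disturbance and cancel it" idea can be sketched for a first-order plant with a linear extended state observer (ESO). The plant, gains, and load step below are invented assumptions; the paper's decentralized, PSO-tuned design is far richer:

```python
# Linear ADRC sketch for a first-order plant y' = f(t, y, d) + b0*u,
# where f lumps the unknown dynamics and the load disturbance.
b0 = 1.0
wo = 20.0                       # observer bandwidth (rad/s)
wc = 5.0                        # controller bandwidth (rad/s)
beta1, beta2 = 2 * wo, wo ** 2  # ESO gains from bandwidth parameterization

dt, T = 1e-3, 5.0
y, z1, z2 = 0.0, 0.0, 0.0       # plant output, ESO estimates of y and f
r = 1.0                         # setpoint (normalized)
log = []
for k in range(int(T / dt)):
    t = k * dt
    disturbance = 0.5 if t > 2.0 else 0.0   # load step at t = 2 s
    u = (wc * (r - z1) - z2) / b0           # cancel f-hat, track setpoint
    # plant: unknown dynamics -y plus external disturbance
    y += dt * (-y + disturbance + b0 * u)
    # linear extended state observer
    e = z1 - y
    z1 += dt * (z2 + b0 * u - beta1 * e)
    z2 += dt * (-beta2 * e)
    log.append(y)
```

After the load step the ESO state z2 re-estimates the total disturbance within a few observer time constants, and the output returns to the setpoint with zero steady-state error, the behavior the abstract credits to active disturbance rejection.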

  2. Optical fiber strain sensor using fiber resonator based on frequency comb Vernier spectroscopy

    DEFF Research Database (Denmark)

    Zhang, Liang; Lu, Ping; Chen, Li

    2012-01-01

    A novel (to the best of our knowledge) optical fiber strain sensor using a fiber ring resonator based on frequency comb Vernier spectroscopy is proposed and demonstrated. A passively mode-locked optical fiber laser is employed to generate a phase-locked frequency comb. Strain applied to the optical fib...

  3. Distinguishing low frequency oscillations within the 1/f spectral behaviour of electromagnetic brain signals.

    Science.gov (United States)

    Demanuele, Charmaine; James, Christopher J; Sonuga-Barke, Edmund Js

    2007-12-10

    It has been acknowledged that the frequency spectrum of measured electromagnetic (EM) brain signals shows a decrease in power with increasing frequency. This spectral behaviour may lead to difficulty in distinguishing event-related peaks from ongoing brain activity in the electro- and magnetoencephalographic (EEG and MEG) signal spectra. This can become an issue especially in the analysis of low frequency oscillations (LFOs) - below 0.5 Hz - which are currently being observed in signal recordings linked with specific pathologies such as epileptic seizures or attention deficit hyperactivity disorder (ADHD), in sleep studies, etc. In this work we propose a simple method that can be used to compensate for this 1/f trend hence achieving spectral normalisation. This method involves filtering the raw measured EM signal through a differentiator prior to further data analysis. Applying the proposed method to various exemplary datasets including very low frequency EEG recordings, epileptic seizure recordings, MEG data and Evoked Response data showed that this compensating procedure provides a flat spectral base onto which event related peaks can be clearly observed. Findings suggest that the proposed filter is a useful tool for the analysis of physiological data especially in revealing very low frequency peaks which may otherwise be obscured by the 1/f spectral activity inherent in EEG/MEG recordings.
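    The proposed compensation is simply a first-difference (differentiator) filter applied before spectral analysis. A small numpy sketch of the flattening effect; a circular difference is used so the result is exact on a synthetic periodic signal, and all names are mine:

```python
import numpy as np

def spectral_slope(x, fs, f_max):
    """Slope of log power vs. log frequency over (0, f_max]."""
    power = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    keep = (freqs > 0) & (freqs <= f_max)
    return float(np.polyfit(np.log(freqs[keep]), np.log(power[keep]), 1)[0])

# Synthetic "1/f" signal: amplitude spectrum proportional to 1/f
rng = np.random.default_rng(3)
n, fs = 2 ** 14, 256.0
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
amp = np.zeros_like(freqs)
amp[1:] = 1.0 / freqs[1:]
x = np.fft.irfft(amp * np.exp(2j * np.pi * rng.random(freqs.size)), n)

# The differentiator (here a circular first difference) whitens the trend
d = x - np.roll(x, 1)
slope_raw = spectral_slope(x, fs, fs / 8)   # steep negative slope
slope_flat = spectral_slope(d, fs, fs / 8)  # near-flat spectral base
```

On the flattened base, a genuine low-frequency oscillation appears as a clear peak instead of riding on the 1/f ramp.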

  4. On Mixed Data and Event Driven Design for Adaptive-Critic-Based Nonlinear $H_{\\infty}$ Control.

    Science.gov (United States)

    Wang, Ding; Mu, Chaoxu; Liu, Derong; Ma, Hongwen

    2018-04-01

    In this paper, based on the adaptive critic learning technique, the control for a class of unknown nonlinear dynamic systems is investigated by adopting a mixed data and event driven design approach. The nonlinear control problem is formulated as a two-player zero-sum differential game and the adaptive critic method is employed to cope with the data-based optimization. The novelty lies in that the data driven learning identifier is combined with the event driven design formulation, in order to develop the adaptive critic controller, thereby accomplishing the nonlinear control. The event driven optimal control law and the time driven worst case disturbance law are approximated by constructing and tuning a critic neural network. Applying the event driven feedback control, the closed-loop system is built with stability analysis. Simulation studies are conducted to verify the theoretical results and illustrate the control performance. It is significant to observe that the present research provides a new avenue of integrating data-based control and event-triggering mechanism into establishing advanced adaptive critic systems.

  5. A time and frequency synchronization method for CO-OFDM based on CMA equalizers

    Science.gov (United States)

    Ren, Kaixuan; Li, Xiang; Huang, Tianye; Cheng, Zhuo; Chen, Bingwei; Wu, Xu; Fu, Songnian; Ping, Perry Shum

    2018-06-01

    In this paper, an efficient time and frequency synchronization method based on a new training symbol structure is proposed for polarization division multiplexing (PDM) coherent optical orthogonal frequency division multiplexing (CO-OFDM) systems. The coarse timing synchronization is achieved by exploiting the correlation property of the first training symbol, and the fine timing synchronization is accomplished by using the time-domain symmetric conjugate of the second training symbol. Furthermore, based on these training symbols, a constant modulus algorithm (CMA) is proposed for carrier frequency offset (CFO) estimation. Theoretical analysis and simulation results indicate that the algorithm is robust to poor optical signal-to-noise ratio (OSNR) and chromatic dispersion (CD). The frequency offset estimation range can reach [-(N_sc/2)·Δf_N, +(N_sc/2)·Δf_N] GHz, with the mean normalized estimation error below 12 × 10^-3 even under the condition of OSNR as low as 10 dB.

  6. A community-based event delivery protocol in publish/subscribe systems for delay tolerant sensor networks.

    Science.gov (United States)

    Liu, Nianbo; Liu, Ming; Zhu, Jinqi; Gong, Haigang

    2009-01-01

    The basic operation of a Delay Tolerant Sensor Network (DTSN) is to finish pervasive data gathering in networks with intermittent connectivity, while the publish/subscribe (Pub/Sub for short) paradigm is used to deliver events from a source to interested clients in an asynchronous way. Recently, the extension of Pub/Sub systems to DTSNs has become a promising research topic. However, due to the unique frequent-partitioning characteristic of DTSNs, extending a Pub/Sub system to a DTSN is a considerably difficult and challenging problem, and there are no good solutions to this problem in published works. To adapt Pub/Sub systems to DTSNs, we propose CED, a community-based event delivery protocol. In our design, event delivery is based on several unchanged communities, which are formed by sensor nodes in the network according to their connectivity. CED consists of two components: event delivery and queue management. In event delivery, events in a community are delivered to mobile subscribers once a subscriber comes into the community, to improve the data delivery ratio. The queue management employs both the event successful delivery time and the event survival time to decide whether an event should be delivered or dropped, to minimize the transmission overhead. The effectiveness of CED is demonstrated through comprehensive simulation studies.
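    The deliver-or-drop decision in the queue management component boils down to comparing an event's remaining lifetime against the time delivery is expected to take. A simplified sketch; the rule and names are my assumption, not the exact CED policy:

```python
def should_deliver(event_age, survival_time, expected_delivery_time):
    """Keep an event only while it can still reach a subscriber before
    it expires; otherwise drop it to save transmission overhead."""
    remaining_lifetime = survival_time - event_age
    return remaining_lifetime > expected_delivery_time

# A community's event queue: ages and survival times in seconds
queue = [
    {"id": 1, "age": 10.0, "survival": 60.0},  # plenty of lifetime left
    {"id": 2, "age": 55.0, "survival": 60.0},  # about to expire
]
kept = [e["id"] for e in queue
        if should_deliver(e["age"], e["survival"], expected_delivery_time=20.0)]
```

In CED the expected delivery time would come from observed inter-community contact statistics rather than a fixed constant.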

  7. Ontology-Based Vaccine Adverse Event Representation and Analysis.

    Science.gov (United States)

    Xie, Jiangan; He, Yongqun

    2017-01-01

    The vaccine is one of the greatest inventions of modern medicine, and has contributed most to the relief of human misery and the exciting increase in life expectancy. In 1796, an English country physician, Edward Jenner, discovered that inoculating mankind with cowpox can protect them from smallpox (Riedel S, Edward Jenner and the history of smallpox and vaccination. Proceedings (Baylor University. Medical Center) 18(1):21, 2005). Based on vaccination worldwide, we finally succeeded in the eradication of smallpox in 1977 (Henderson, Vaccine 29:D7-D9, 2011). Other disabling and lethal diseases, like poliomyelitis and measles, are targeted for eradication (Bonanni, Vaccine 17:S120-S125, 1999). Although vaccine development and administration are tremendously successful and cost-effective practices for human health, no vaccine is 100% safe for everyone, because each person reacts to vaccination differently given different genetic backgrounds and health conditions. Although all licensed vaccines are generally safe for the majority of people, vaccinees may still suffer adverse events (AEs) in reaction to various vaccines, some of which can be serious or even fatal (Haber et al., Drug Saf 32(4):309-323, 2009). Hence, the double-edged sword of vaccination remains a concern. To support integrative AE data collection and analysis, it is critical to adopt an AE normalization strategy. In the past decades, different controlled terminologies have been developed, including the Medical Dictionary for Regulatory Activities (MedDRA) (Brown EG, Wood L, Wood S, et al., Drug Saf 20(2):109-117, 1999), the Common Terminology Criteria for Adverse Events (CTCAE) (NCI, The Common Terminology Criteria for Adverse Events (CTCAE). Available from: http://evs.nci.nih.gov/ftp1/CTCAE/About.html . Access on 7 Oct 2015), and the World Health Organization (WHO) Adverse Reactions Terminology (WHO-ART) (WHO, The WHO Adverse Reaction Terminology - WHO-ART. Available from: https://www.umc-products.com/graphics/28010.pdf

  8. Exposure estimates based on broadband elf magnetic field measurements versus the ICNIRP multiple frequency rule

    International Nuclear Information System (INIS)

    Paniagua, Jesus M.; Rufo, Montana; Jimenez, Antonio; Pachon, Fernando T.; Carrero, Julian

    2015-01-01

    The evaluation of exposure to extremely low-frequency (ELF) magnetic fields using broadband measurement techniques gives satisfactory results when the field has essentially a single frequency. Nevertheless, magnetic fields are in most cases distorted by harmonic components. This work analyses the harmonic components of the ELF magnetic field in an outdoor urban context and compares the evaluation of the exposure based on broadband measurements with that based on spectral analysis. The multiple frequency rule of the International Commission on Non-ionizing Radiation Protection (ICNIRP) regulatory guidelines was applied. With the 1998 ICNIRP guideline, harmonics dominated the exposure with a 55 % contribution. With the 2010 ICNIRP guideline, however, the primary frequency dominated the exposure with a 78 % contribution. Values of the exposure based on spectral analysis were significantly higher than those based on broadband measurements. Hence, it is clearly necessary to determine the harmonic components of the ELF magnetic field to assess exposure in urban contexts. (authors)
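The multiple frequency rule referenced above combines the spectral components of the field into a single exposure index. A minimal sketch with invented field values and illustrative reference levels (not the actual ICNIRP tables):

```python
def icnirp_exposure_index(field_levels, reference_levels):
    """ICNIRP multiple frequency rule: sum the ratio of the measured
    field to the reference level at each spectral component; the
    exposure complies when the resulting index is <= 1."""
    return sum(b / l for b, l in zip(field_levels, reference_levels))

# Hypothetical magnetic flux densities (uT) at 50 Hz and two harmonics,
# against illustrative reference levels (not the real ICNIRP values).
fields = [80.0, 12.0, 6.0]       # 50, 150, 250 Hz components
limits = [200.0, 200.0, 200.0]
index = icnirp_exposure_index(fields, limits)
print(f"exposure index = {index:.2f}, compliant: {index <= 1}")
```

A broadband meter reports only the total field, so it cannot form this per-frequency sum, which is why the spectral evaluation above can yield markedly different exposure values.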

  9. An asynchronous data-driven event-building scheme based on ATM switching fabrics

    International Nuclear Information System (INIS)

    Letheren, M.; Christiansen, J.; Mandjavidze, I.; Verhille, H.; De Prycker, M.; Pauwels, B.; Petit, G.; Wright, S.; Lumley, J.

    1994-01-01

    The very high data rates expected in experiments at the next generation of high-luminosity hadron colliders will be handled by pipelined front-end readout electronics and multiple levels (2 or 3) of triggering. A variety of data acquisition architectures have been proposed for use downstream of the first-level trigger. Depending on the architecture, the aggregate bandwidths required for event building are expected to be of the order of 10-100 Gbit/s. Here, an Asynchronous Transfer Mode (ATM) packet-switching network technology is proposed as the interconnect for building high-performance, scalable data acquisition architectures. This paper introduces the relevant characteristics of ATM and describes components for the construction of an ATM-based event builder: (1) a multi-path, self-routing, scalable ATM switching fabric, (2) an experimental high-performance workstation ATM interface, and (3) a VMEbus ATM interface. The requirement for traffic shaping in ATM-based event builders is discussed and an analysis of the performance of several such schemes is presented.

  10. Cross-media color reproduction using the frequency-based spatial gamut mapping algorithm based on human color vision

    Science.gov (United States)

    Wu, Guangyuan; Niu, Shijun; Li, Xiaozhou; Hu, Guichun

    2018-04-01

    Due to the increasing globalization of the printing industry, remote proofing will become an inevitable development trend. Cross-media color reproduction between different color gamuts occurs when remote proofing technologies are used, which usually leads to the problem of incompatible color gamuts. In this paper, to achieve equivalent color reproduction between a monitor and a printer, a frequency-based spatial gamut mapping algorithm is proposed to decrease the loss of visual color information. The design of the algorithm is based on the contrast sensitivity functions (CSF): a CSF spatial filter is exploited to preserve the luminance of the high spatial frequencies and the chrominance of the low frequencies. First, we present a general framework for applying the CSF spatial filter to retain the relevant visual information. Then we compare the proposed framework with the HPMINDE, CUSP, and Bala algorithms. The psychophysical experimental results indicate the good performance of the proposed algorithm.

  11. Islanding Detection of Synchronous Machine-Based DGs using Average Frequency Based Index

    Directory of Open Access Journals (Sweden)

    M. Bakhshi

    2013-06-01

    Full Text Available Identification of intentional and unintentional islanding situations of dispersed generators (DGs) is one of the most important protection concerns in power systems. Considering the safety and reliability problems of distribution networks, an exact diagnostic index is required to discriminate the loss of the main network from the existing parallel operation. Hence, this paper introduces a new islanding detection method for synchronous machine-based DGs. This method uses the average value of the generator frequency to calculate a new detection index. The proposed method is an effective supplement to the over/under frequency protection (OFP/UFP) system. Analytical equations and simulation results are used to assess the performance of the proposed method under various scenarios such as different types of faults, load changes, and capacitor bank switching. To show its effectiveness, the proposed method is compared with both the ROCOF and ROCOFOP methods.
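The details of the paper's index are not reproduced here, but the idea of an average-frequency-based index can be sketched as comparing a window-averaged generator frequency against the nominal value, so that short zero-mean transients are filtered out while a sustained post-islanding drift trips the relay:

```python
def average_frequency_index(freq_samples, nominal=50.0):
    """Absolute deviation of the window-averaged frequency from nominal (Hz)."""
    return abs(sum(freq_samples) / len(freq_samples) - nominal)

def islanding_detected(freq_samples, nominal=50.0, threshold=0.2):
    # Averaging filters out short zero-mean swings (faults, load or
    # capacitor switching); a sustained drift after loss of mains survives it.
    return average_frequency_index(freq_samples, nominal) > threshold

grid_connected = [50.01, 49.99, 50.02, 49.98]   # small zero-mean swings
islanded = [50.3, 50.5, 50.6, 50.8]             # sustained drift
print(islanding_detected(grid_connected), islanding_detected(islanded))
```

The window length and threshold are hypothetical tuning parameters; in practice they trade detection speed against nuisance tripping.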

  12. Frequency and voltage dependent electrical responses of poly(triarylamine) thin film-based organic Schottky diode

    Science.gov (United States)

    Anuar Mohamad, Khairul; Tak Hoh, Hang; Alias, Afishah; Ghosh, Bablu Kumar; Fukuda, Hisashi

    2017-11-01

    A metal-organic-metal (MOM) type Schottky diode based on poly(triarylamine) (PTAA) thin films has been fabricated using the spin coating method. The frequency-dependent conductance-voltage (G-V-f) and capacitance-voltage (C-V-f) characteristics of the ITO/PTAA/Al MOM-type diode were investigated in the frequency range from 12 Hz to 100 kHz using an LCR meter at room temperature. The frequency- and bias-voltage-dependent electrical response was determined by an admittance-based measurement method in terms of an equivalent circuit model of the parallel combination of resistance and capacitance (RC circuit). The investigation revealed that the conductance is both frequency and bias-voltage dependent, increasing continuously with increasing frequency. Meanwhile, the capacitance is frequency dependent up to about 100 Hz but decreases at higher frequencies (1-10 kHz). The interface state density in the Schottky diode was determined from the G-V and C-V characteristics; it remains almost constant at 2.8 x 10^12 eV^-1 cm^-2, decreasing slightly with increasing frequency. Consequently, both the series resistance and the interface trap density were found to decrease with increasing frequency. The frequency dependence of the electrical responses is attributed to the distribution density of interface states that can follow the alternating current (AC) signal.

  13. Event-based proactive interference in rhesus monkeys.

    Science.gov (United States)

    Devkar, Deepna T; Wright, Anthony A

    2016-10-01

    Three rhesus monkeys (Macaca mulatta) were tested in a same/different memory task for proactive interference (PI) from prior trials. PI occurs when a previous sample stimulus appears as a test stimulus on a later trial, does not match the current sample stimulus, and the wrong response "same" is made. Trial-unique pictures (scenes, objects, animals, etc.) were used on most trials, except on trials where the test stimulus matched a potentially interfering sample stimulus from a prior trial (1, 2, 4, 8, or 16 trials prior). Greater interference occurred when fewer trials separated interference and test. PI functions showed a continuum of interference. Delays between sample and test stimuli and intertrial intervals were manipulated to test how PI might vary as a function of elapsed time. Contrary to a similar study with pigeons, these time manipulations had no discernible effect on the monkeys' PI, as shown by complete overlap of PI functions with no statistical differences or interactions. These results suggest that interference was strictly based upon the number of intervening events (trials with other pictures) without regard to elapsed time. The monkeys' apparent event-based interference was further supported by retesting with a novel set of 1,024 pictures. PI from novel pictures 1 or 2 trials prior was greater than PI from pictures of a familiar set of 1,024 pictures. Moreover, when potentially interfering novel stimuli were 16 trials prior, performance accuracy was actually greater than accuracy on baseline trials (no interference), suggesting that remembering a stimulus from 16 trials prior served as a cue that this stimulus was not the sample stimulus on the current trial, a somewhat surprising conclusion.

  14. The global magnitude-frequency relationship for large explosive volcanic eruptions

    Science.gov (United States)

    Rougier, Jonathan; Sparks, R. Stephen J.; Cashman, Katharine V.; Brown, Sarah K.

    2018-01-01

    For volcanoes, as for other natural hazards, the frequency of large events diminishes with their magnitude, as captured by the magnitude-frequency relationship. Assessing this relationship is valuable both for the insights it provides about volcanism, and for the practical challenge of risk management. We derive a global magnitude-frequency relationship for explosive volcanic eruptions of at least 300 Mt of erupted mass (or M4.5). Our approach is essentially empirical, based on the eruptions recorded in the LaMEVE database. It differs from previous approaches mainly in our conservative treatment of magnitude-rounding and under-recording. Our estimate for the return period of 'super-eruptions' (1000 Gt, or M8) is 17 ka (95% CI: 5.2 ka, 48 ka), which is substantially shorter than previous estimates, indicating that volcanoes pose a larger risk to human civilisation than previously thought.
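The empirical core of such an estimate is simple: the return period for eruptions at or above a magnitude is the record length divided by the number of qualifying events. A sketch with invented counts (not the LaMEVE data, and ignoring the under-recording correction that the paper treats carefully):

```python
def return_period(n_events, record_years):
    """Mean recurrence interval of events at or above a given magnitude."""
    return record_years / n_events

# Hypothetical exceedance counts over a 100 kyr eruption record
record_kyr = 100.0
counts = {4.5: 400, 5.0: 150, 6.0: 30, 7.0: 6}
for magnitude, n in sorted(counts.items()):
    print(f"M>={magnitude}: ~{return_period(n, record_kyr):.1f} kyr")
```

Because older and smaller eruptions are systematically missing from geological records, a naive count like this overestimates return periods; correcting for that bias is what shortens the super-eruption estimate.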

  15. Projections of extreme water level events for atolls in the western Tropical Pacific

    Science.gov (United States)

    Merrifield, M. A.; Becker, J. M.; Ford, M.; Yao, Y.

    2014-12-01

    Conditions that lead to extreme water levels and coastal flooding are examined for atolls in the Republic of the Marshall Islands based on a recent field study of wave transformations over fringing reefs, tide gauge observations, and wave model hindcasts. Wave-driven water level extremes pose the largest threat to atoll shorelines, with coastal levels scaling as approximately one-third of the incident breaking wave height. The wave-driven coastal water level is partitioned into a mean setup, low frequency oscillations associated with cross-reef quasi-standing modes, and wind waves that reach the shore after undergoing high dissipation due to breaking and bottom friction. All three components depend on the water level over the reef; however, the sum of the components is independent of water level due to cancelling effects. Wave hindcasts suggest that wave-driven water level extremes capable of coastal flooding are infrequent events that require a peak wave event to coincide with mid- to high-tide conditions. Interannual and decadal variations in sea level do not change the frequency of these events appreciably. Future sea-level rise scenarios significantly increase the flooding threat associated with wave events, with a nearly exponential increase in flooding days per year as sea level exceeds 0.3 to 1.0 m above current levels.
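The one-third scaling reported above lends itself to a simple screening calculation for flooding days; in this sketch the crest elevation and all numerical values are hypothetical:

```python
def coastal_water_level(breaking_wave_height, tide, sea_level_rise=0.0):
    """Total coastal water level (m): tide plus sea-level rise plus a
    wave-driven term scaling as ~1/3 of the incident breaking wave height."""
    return tide + sea_level_rise + breaking_wave_height / 3.0

def flooding_day(wave_height, tide, crest_elevation=1.0, slr=0.0):
    # Flooding occurs when the total water level tops the island crest.
    return coastal_water_level(wave_height, tide, slr) > crest_elevation

print(flooding_day(3.0, 0.8))            # big waves coinciding with high tide
print(flooding_day(3.0, -0.5))           # the same waves at low tide
print(flooding_day(3.0, -0.5, slr=0.6))  # low tide, but with sea-level rise
```

The three cases mirror the abstract's findings: a wave event floods only when it coincides with high tide, and sea-level rise lets the same event flood under conditions that were previously safe.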

  16. An event-based model for contracts

    Directory of Open Access Journals (Sweden)

    Tiziana Cimoli

    2013-02-01

    Full Text Available We introduce a basic model for contracts. Our model extends event structures with a new relation, which faithfully captures the circular dependencies among contract clauses. We establish whether an agreement exists which respects all the contracts at hand (i.e. all the dependencies can be resolved), and we detect the obligations of each participant. The main technical contribution is a correspondence between our model and a fragment of the contract logic PCL. More precisely, we show that the reachable events are exactly those which correspond to provable atoms in the logic. Despite this strong correspondence, our model improves on previous work on PCL by exhibiting a finer-grained notion of culpability, which takes into account the legitimate orderings of events.

  17. An assessment of envelope-based demodulation in case of proximity of carrier and modulation frequencies

    Science.gov (United States)

    Shahriar, Md Rifat; Borghesani, Pietro; Randall, R. B.; Tan, Andy C. C.

    2017-11-01

    Demodulation is a necessary step in the field of diagnostics to reveal faults whose signatures appear as an amplitude and/or frequency modulation. The Hilbert transform has conventionally been used for the calculation of the analytic signal required in the demodulation process. However, the carrier and modulation frequencies must meet the conditions set by the Bedrosian identity for the Hilbert transform to be applicable for demodulation. This condition, which basically requires the carrier frequency to be sufficiently higher than the frequencies of the modulation harmonics, is usually satisfied in many traditional diagnostic applications (e.g. vibration analysis of gear and bearing faults) due to the order-of-magnitude ratio between the carrier and modulation frequency. However, the diversification of diagnostic approaches and applications presents cases (e.g. electrical signature analysis-based diagnostics) where the carrier frequency is in close proximity to the modulation frequency, thus challenging the applicability of the Bedrosian theorem. This work presents an analytic study to quantify the error introduced by Hilbert transform-based demodulation when the Bedrosian identity is not satisfied and proposes a mitigation strategy to combat the error. An experimental study is also carried out to verify the analytical results. The outcome of the error analysis sets a confidence limit on the estimated modulation (both shape and magnitude) achieved through Hilbert transform-based demodulation when the Bedrosian theorem is violated. The proposed mitigation strategy, however, is found effective in combating the demodulation error arising in this scenario, thus extending the applicability of Hilbert transform-based demodulation.
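A minimal numpy sketch of Hilbert-transform envelope demodulation in the benign case where the carrier sits well above the modulation frequency, so the Bedrosian condition holds (in practice `scipy.signal.hilbert` computes the analytic signal directly):

```python
import numpy as np

def analytic_signal(x):
    """FFT-based analytic signal: keep DC, double positive frequencies,
    zero the negative ones (signal length assumed even)."""
    n = len(x)
    spec = np.fft.fft(x)
    gain = np.zeros(n)
    gain[0] = 1.0
    gain[1:n // 2] = 2.0
    gain[n // 2] = 1.0
    return np.fft.ifft(spec * gain)

fs, n = 1000.0, 1000
t = np.arange(n) / fs
f_c, f_m = 50.0, 2.0          # carrier >> modulation: Bedrosian holds
envelope_true = 1.0 + 0.5 * np.cos(2 * np.pi * f_m * t)
x = envelope_true * np.cos(2 * np.pi * f_c * t)
envelope_est = np.abs(analytic_signal(x))
print(np.max(np.abs(envelope_est - envelope_true)))
```

When f_c approaches f_m the carrier and envelope spectra overlap, the Bedrosian identity no longer applies, and the recovered envelope is distorted; that is the error regime the paper quantifies.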

  18. Frequency analysis of urban runoff quality in an urbanizing catchment of Shenzhen, China

    Science.gov (United States)

    Qin, Huapeng; Tan, Xiaolong; Fu, Guangtao; Zhang, Yingying; Huang, Yuefei

    2013-07-01

    This paper investigates the frequency distribution of urban runoff quality indicators using a long-term continuous simulation approach and evaluates the impacts of proposed runoff control schemes on runoff quality in an urbanizing catchment in Shenzhen, China. Four indicators are considered to provide a comprehensive assessment of the potential impacts: total runoff depth, event pollutant load, Event Mean Concentration, and peak concentration during a rainfall event. The results indicate that urban runoff quantity and quality in the catchment vary significantly between rainfall events, with a very high rate of non-compliance with surface water quality regulations. Three runoff control schemes, with the capacity to intercept an initial runoff depth of 5 mm, 10 mm, and 15 mm respectively, are evaluated, and diminishing marginal benefits in terms of water quality improvement are found with increasing interception levels. The effects of seasonal variation in rainfall events are investigated to provide a better understanding of the performance of the runoff control schemes. The pre-flood season has a higher risk of poor water quality than other seasons after runoff control. This study demonstrates that frequency analysis of urban runoff quantity and quality provides a probabilistic evaluation of pollution control measures, and thus helps frame risk-based decision making for urban runoff quality management in an urbanizing catchment.

  19. Precursor analyses - The use of deterministic and PSA based methods in the event investigation process at nuclear power plants

    International Nuclear Information System (INIS)

    2004-09-01

    The efficient feedback of operating experience (OE) is a valuable source of information for improving the safety and reliability of nuclear power plants (NPPs). It is therefore essential to collect information on abnormal events from both internal and external sources. Internal operating experience is analysed to obtain a complete understanding of an event and of its safety implications. Corrective or improvement measures may then be developed, prioritized and implemented in the plant if considered appropriate. Information from external events may also be analysed in order to learn lessons from others' experience and prevent similar occurrences at our own plant. The traditional ways of investigating operational events have been predominantly qualitative. In recent years, a PSA-based method called probabilistic precursor event analysis has been developed, used and applied on a significant scale in many places for a number of plants. The method enables a quantitative estimation of the safety significance of operational events to be incorporated. The purpose of this report is to outline a synergistic process that makes more effective use of operating experience event information by combining the insights and knowledge gained from both approaches, traditional deterministic event investigation and PSA-based event analysis. The PSA-based view on operational events and PSA-based event analysis can support the process of operational event analysis at the following stages of the operational event investigation: (1) Initial screening stage. (It introduces an element of quantitative analysis into the selection process. Quantitative analysis of the safety significance of nuclear plant events can be a very useful measure when it comes to selecting internal and external operating experience information for its relevance.) (2) In-depth analysis. (PSA based event evaluation provides a quantitative measure for judging the significance of operational events, contributors to

  20. Noether's Theorem and its Inverse of Birkhoffian System in Event Space Based on Herglotz Variational Problem

    Science.gov (United States)

    Tian, X.; Zhang, Y.

    2018-03-01

    The Herglotz variational principle, in which the functional is defined by a differential equation, generalizes the classical one, which defines the functional by an integral. The principle gives a variational description of nonconservative systems even when the Lagrangian is independent of time. This paper focuses on Noether's theorem and its inverse for a Birkhoffian system in event space based on the Herglotz variational problem. Firstly, according to the Herglotz variational principle of a Birkhoffian system, the principle of a Birkhoffian system in event space is established. Secondly, its parametric equations and two basic formulae for the variation of the Pfaff-Herglotz action of a Birkhoffian system in event space are obtained. Furthermore, the definition and criteria of Noether symmetry of the Birkhoffian system in event space based on the Herglotz variational problem are given. Then, according to the relationship between Noether symmetry and conserved quantity, Noether's theorem is derived. Under classical conditions, Noether's theorem of a Birkhoffian system in event space based on the Herglotz variational problem reduces to the classical one. In addition, Noether's inverse theorem of the Birkhoffian system in event space based on the Herglotz variational problem is also obtained. At the end of the paper, an example is given to illustrate the application of the results.

  1. A High-Precision Time-Frequency Entropy Based on Synchrosqueezing Generalized S-Transform Applied in Reservoir Detection

    Directory of Open Access Journals (Sweden)

    Hui Chen

    2018-06-01

    Full Text Available According to the fact that high frequencies are abnormally attenuated when seismic signals travel across reservoirs, a new method, named high-precision time-frequency entropy based on the synchrosqueezing generalized S-transform, is proposed for hydrocarbon reservoir detection in this paper. First, the proposed method obtains the time-frequency spectra by the synchrosqueezing generalized S-transform (SSGST), which are concentrated around the real instantaneous frequency of the signals. Then, considering the characteristics and effects of noise, we give a frequency constraint condition for calculating the entropy based on the time-frequency spectra. A synthetic example verifies that the entropy will be abnormally high when seismic signals undergo abnormal attenuation. Besides, compared with the GST time-frequency entropy and the original SSGST time-frequency entropy on field data, the results of the proposed method show higher precision. Moreover, the proposed method can not only accurately detect and locate hydrocarbon reservoirs, but also effectively suppress the impact of random noise.
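One way to picture a time-frequency entropy is as the Shannon entropy of a normalized, band-constrained power spectrum for each time slice; the sketch below is illustrative only and does not reproduce the SSGST or the paper's exact constraint condition:

```python
import math

def spectral_entropy(power, band=None):
    """Shannon entropy of a normalized (optionally band-limited) power
    spectrum: low when the energy sits in a few frequencies, high when
    it is spread across the band."""
    if band is not None:
        lo, hi = band
        power = power[lo:hi]
    total = sum(power)
    probs = [v / total for v in power if v > 0]
    return -sum(p * math.log(p) for p in probs)

spike = [0.0, 10.0, 0.0, 0.0]   # one dominant frequency -> entropy 0
flat = [1.0, 1.0, 1.0, 1.0]     # uniform spectrum -> entropy ln(4)
print(spectral_entropy(spike), spectral_entropy(flat))
```

The `band` argument plays the role of a frequency constraint, restricting the entropy calculation to the frequencies of interest so that out-of-band noise does not dominate the measure.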

  2. Integral-based event triggering controller design for stochastic LTI systems via convex optimisation

    Science.gov (United States)

    Mousavi, S. H.; Marquez, H. J.

    2016-07-01

    The presence of measurement noise in event-based systems can lower system efficiency in terms of both data exchange rate and performance. In this paper, an integral-based event-triggering control system is proposed for LTI systems with stochastic measurement noise. We show that the new mechanism is robust against noise, effectively reduces the flow of communication between plant and controller, and also improves output performance. Using a Lyapunov approach, stability in the mean-square sense is proved. A simulated example illustrates the properties of our approach.
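The idea of integral-based triggering can be sketched in discrete time: accumulate the squared error and emit an event only when the integral crosses a threshold, which makes isolated noise spikes far less likely to trigger communication (the paper's exact triggering law is not reproduced here):

```python
def integral_event_trigger(errors, dt=0.01, threshold=0.05):
    """Emit an event when the running integral of the squared
    measurement error exceeds a threshold, then reset the integral."""
    events, acc = [], 0.0
    for k, e in enumerate(errors):
        acc += e * e * dt
        if acc > threshold:
            events.append(k)
            acc = 0.0
    return events

# An isolated noise spike barely moves the integral, while a sustained
# error drift accumulates until it fires (and keeps firing periodically).
spike = [0.0] * 10 + [2.0] + [0.0] * 10
drift = [0.6] * 40
print(integral_event_trigger(spike), integral_event_trigger(drift))
```

A purely level-based trigger would fire on the spike; the integral form trades a bounded detection delay for robustness to such noise.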

  3. Assessment of lightning impact frequency for process equipment

    International Nuclear Information System (INIS)

    Necci, Amos; Antonioni, Giacomo; Cozzani, Valerio; Krausmann, Elisabeth; Borghetti, Alberto; Nucci, Carlo Alberto

    2014-01-01

    Fires and explosions triggered by lightning strikes are among the most frequent Natech scenarios affecting the chemical and process industry. Although the lightning hazard is well known, well-accepted quantitative procedures to assess the contribution of accidents caused by lightning to industrial risk are still lacking. In the present study, a quantitative methodology for the assessment of the expected frequency of lightning capture by process equipment is presented. A specific model, based on Monte Carlo simulations, was developed to assess the capture frequency of lightning for equipment with a given geometry. The model allows the assessment of lay-out effects and of the reduction of the capture probability due to the presence of other structures or equipment items. The results of the Monte Carlo simulations were also used to develop a simplified cell method allowing a straightforward assessment of the lightning impact probability in a quantitative risk assessment framework. The developed approach allows an in-depth analysis of the hazard due to lightning impact by identifying the equipment items with the highest expected frequency of lightning impacts in a given lay-out. The model thus supplies useful data to approach the assessment of the quantitative contribution of lightning-triggered accidents to industrial risk. - Highlights: • A specific approach to storage tank lightning impact frequency calculation was developed. • The approach is suitable for the quantitative assessment of industrial risk due to lightning. • The models developed provide lightning capture frequency based on tank geometry. • Lay-out effects due to nearby structures are also accounted for. • Capture frequencies may be as high as 10^-1 events/year for standalone unprotected tanks.
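A Monte Carlo capture-frequency estimate of the kind described can be sketched as follows; the attractive-radius formula, the stroke-current distribution, and all parameter values are illustrative assumptions, not the paper's model:

```python
import math
import random

def capture_frequency(tank_radius, ground_flash_density,
                      n_trials=100_000, half_side=500.0, seed=1):
    """Monte Carlo estimate of lightning captures per year for a
    standalone tank, using a simplified electrogeometric model.

    A downward stroke landing at (x, y) is captured when it falls within
    an attractive radius taken here as the tank radius plus a striking
    distance of ~ 10 * I**0.65 (peak current I in kA).  All modelling
    choices and numbers are illustrative assumptions.
    """
    rng = random.Random(seed)
    captures = 0
    for _ in range(n_trials):
        x = rng.uniform(-half_side, half_side)
        y = rng.uniform(-half_side, half_side)
        current = rng.lognormvariate(math.log(31.0), 0.55)  # peak kA
        if math.hypot(x, y) <= tank_radius + 10.0 * current ** 0.65:
            captures += 1
    area_km2 = (2.0 * half_side / 1000.0) ** 2
    strikes_per_year = ground_flash_density * area_km2  # flashes/km^2/yr
    return strikes_per_year * captures / n_trials

# A 15 m radius tank in a region with 2 flashes/km^2/year
print(f"{capture_frequency(15.0, 2.0):.2e} captures/year")
```

With these illustrative numbers the estimate lands near the 10^-1 events/year order of magnitude cited in the highlights; shielding by nearby taller structures, which the paper models, would lower it.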

  4. Bipolar-power-transistor-based limiter for high frequency ultrasound imaging systems.

    Science.gov (United States)

    Choi, Hojong; Yang, Hao-Chung; Shung, K Kirk

    2014-03-01

    High-performance limiters for applications in high-frequency ultrasound imaging systems are described in this paper. Limiters protect the ultrasound receiver from the high-voltage (HV) spikes produced by the transmitter. We present a new bipolar power transistor (BPT) configuration and compare its design and performance to a diode limiter used in traditional ultrasound research and one commercially available limiter. Limiter performance depends greatly on the insertion loss (IL), total harmonic distortion (THD), and response time (RT), each of which was evaluated for all the limiters. The results indicated that, compared with the commercial limiter, the BPT-based limiter had lower IL (-7.7 dB), THD (-74.6 dB), and RT (43 ns) at 100 MHz. To evaluate the capability of these limiters, they were connected to a 100 MHz single-element transducer and a two-way pulse-echo test was performed. It was found that the -6 dB bandwidth and sensitivity of the transducer using the BPT-based limiter were better than those with the commercial limiter by 22% and 140%, respectively. Compared to the commercial limiter, the BPT-based limiter is shown to be capable of minimizing signal attenuation, RT, and THD at high frequencies and is thus suited for high-frequency ultrasound applications. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Core damage frequency estimation using accident sequence precursor data: 1990-1993

    International Nuclear Information System (INIS)

    Martz, H.F.

    1998-01-01

    The Nuclear Regulatory Commission's (NRC's) ongoing Accident Sequence Precursor (ASP) program uses probabilistic risk assessment (PRA) techniques to assess the potential for severe core damage (henceforth referred to simply as core damage) based on operating events. The types of operating events considered include accident sequence initiators, safety equipment failures, and degradation of plant conditions that could increase the probability that various postulated accident sequences occur. Such operating events potentially reduce the margin of safety available for prevention of core damage and thus can be considered as precursors to core damage. The current process for identifying, analyzing, and documenting ASP events is described in detail in Vanden Heuval et al. The significance of a Licensee Event Report (LER) event (or events) is measured by means of the conditional probability that the event leads to core damage, the so-called conditional core damage probability or, simply, CCDP. The first ASP study, published in 1982, covered the period 1969-1979. In addition to identification and ranking of precursors, the original study attempted to estimate core damage frequency (CDF) based on the precursor events. The purpose of this paper is to compare the average annual CDF estimates calculated using the CCDP sum, Cooke-Goossens, Bier, and Abramson estimators for various reactor classes using the combined ASP data for the four years 1990-1993. An important outcome of this comparison is an answer to the persistent question regarding the degree and effect of the positive bias of the CCDP sum method in practice. Note that this paper only compares the estimators with each other. Because the true average CDF is unknown, the estimation error is also unknown. Therefore, any observations or characterizations of bias are based on purely theoretical considerations.
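The simplest of the estimators compared, the CCDP sum, divides the summed conditional core damage probabilities of the observed precursors by the number of reactor-years observed; a sketch with invented numbers:

```python
def ccdp_sum_cdf(ccdps, reactor_years):
    """Average annual core damage frequency from precursor CCDPs.

    Summing conditional probabilities overstates the union of the
    precursor contributions, which is the positive bias at issue."""
    return sum(ccdps) / reactor_years

# Hypothetical precursor CCDPs over 400 reactor-years of observation
ccdps = [3e-4, 1.2e-4, 5e-5, 2e-5, 8e-6]
print(f"CDF ~ {ccdp_sum_cdf(ccdps, 400.0):.2e} per reactor-year")
```

Because the individual CCDPs are small, the overcounting from the sum is usually slight, which is why quantifying the bias in practice is the interesting question.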

  6. An automatic method to determine cutoff frequency based on image power spectrum

    International Nuclear Information System (INIS)

    Beis, J.S.; Vancouver Hospital and Health Sciences Center, British Columbia; Celler, A.; Barney, J.S.

    1995-01-01

    The authors present an algorithm for automatically choosing filter cutoff frequency (F_c) using the power spectrum of the projections. The method is based on the assumption that the expectation of the image power spectrum is the sum of the expectation of the blurred object power spectrum (dominant at low frequencies) plus a constant value due to Poisson noise. By considering the discrete components of the noise-dominated high-frequency spectrum as a Gaussian distribution N(μ,σ), the Student t-test determines F_c as the highest frequency for which the image frequency components are unlikely to be drawn from N(μ,σ). The method is general and can be applied to any filter. In this work, the authors tested the approach using the Metz restoration filter on simulated, phantom, and patient data with good results. Quantitative performance of the technique was evaluated by plotting recovery coefficient (RC) versus NMSE of reconstructed images.
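The selection of F_c can be sketched as follows: estimate the noise statistics N(μ,σ) from the noise-dominated high-frequency tail, then scan downward for the highest frequency whose power is unlikely to come from that distribution. A fixed t-like threshold stands in for the Student t-test here, and the spectrum is synthetic:

```python
import math

def cutoff_frequency(power, noise_fraction=0.25, t_threshold=3.0):
    """Highest frequency index whose power is unlikely to be drawn from
    the noise distribution N(mu, sigma) estimated from the top tail of
    the spectrum (a fixed t-like threshold replaces the Student t-test)."""
    n = len(power)
    tail = power[int(n * (1.0 - noise_fraction)):]
    mu = sum(tail) / len(tail)
    var = sum((v - mu) ** 2 for v in tail) / (len(tail) - 1)
    sigma = math.sqrt(var) or 1e-12
    for k in range(n - 1, -1, -1):
        if (power[k] - mu) / sigma > t_threshold:
            return k
    return 0

# Synthetic spectrum: decaying object term plus a flat, jittery noise floor
spectrum = [1000.0 * math.exp(-k / 5.0) + 1.0 + 0.05 * ((2 * k) % 5)
            for k in range(64)]
print(cutoff_frequency(spectrum))
```

The `noise_fraction` and threshold are hypothetical tuning choices; the paper's approach instead derives the decision from a proper t-test on the estimated noise distribution.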

  7. Analysis of Power System Low Frequency Oscillation Based on Energy Shift Theory

    Science.gov (United States)

    Zhang, Junfeng; Zhang, Chunwang; Ma, Daqing

    2018-01-01

    In this paper, a new method for analyzing low-frequency oscillation between areas based on an energy coefficient is proposed. The concept of the energy coefficient is introduced by constructing an energy function, and the low-frequency oscillation is analyzed according to the energy coefficient under the current operating conditions; meanwhile, the concept of model energy is proposed to analyze the energy exchange behavior between two generators. Not only does this method provide an explanation of low-frequency oscillation from the energy point of view, but it also helps further reveal the dynamic behavior of complex power systems. Case studies of a four-machine two-area system and the Jilin Power Grid prove the correctness and effectiveness of the proposed method for low-frequency oscillation analysis of power systems.

  8. High frequency energy measurements

    International Nuclear Information System (INIS)

    Stotlar, S.C.

    1981-01-01

    High-frequency (> 100 MHz) energy measurements present special problems to the experimenter. The environment or the available electronics often limit the applicability of a given detector type. The physical properties of many detectors are frequency dependent and, in some cases, the physical effect employed can be frequency dependent. State-of-the-art measurements generally involve a detection scheme in association with high-speed electronics and a method of data recording. Events can be single-shot or repetitive, requiring real-time, sampling, or digitizing data recording. Potential modification of the pulse by the detector and the associated electronics should not be overlooked. This presentation reviews typical applications, methods of choosing a detector, and high-speed detectors. Special considerations and limitations of some applications and devices are described.

  9. Review of the Shoreham Nuclear Power Station Probabilistic Risk Assessment: internal events and core damage frequency

    International Nuclear Information System (INIS)

    Ilberg, D.; Shiu, K.; Hanan, N.; Anavim, E.

    1985-11-01

    A review of the Probabilistic Risk Assessment of the Shoreham Nuclear Power Station was conducted with the broad objective of evaluating its risks in relation to those identified in the Reactor Safety Study (WASH-1400). The scope of the review was limited to the "front end" part, i.e., to the evaluation of the frequencies of states in which core damage may occur. Furthermore, the review considered only internally generated accidents, consistent with the scope of the PRA. The review included an assessment of the assumptions and methods used in the Shoreham study. It also encompassed a reevaluation of the main results within the scope and general methodological framework of the Shoreham PRA, including both qualitative and quantitative analyses of accident initiators, data bases, and accident sequences which result in initiation of core damage. Specific comparisons of core damage frequency are given between the Shoreham study, the results of the present review, and the WASH-1400 BWR. The effect of modeling uncertainties was considered through a limited sensitivity study showing how the results would change if other assumptions were made. This review provides an independently assessed point estimate of core damage frequency and describes the major contributors, by frontline system and by accident sequence. 17 figs., 81 tabs

  10. Assessment of initial soil moisture conditions for event-based rainfall-runoff modelling

    OpenAIRE

    Tramblay, Yves; Bouvier, Christophe; Martin, C.; Didon-Lescot, J. F.; Todorovik, D.; Domergue, J. M.

    2010-01-01

    Flash floods are the most destructive natural hazards that occur in the Mediterranean region. Rainfall-runoff models can be very useful for flash flood forecasting and prediction. Event-based models are very popular for operational purposes, but there is a need to reduce the uncertainties related to the initial moisture conditions estimation prior to a flood event. This paper aims to compare several soil moisture indicators: local Time Domain Reflectometry (TDR) measurements of soil moisture,...

  11. A Community-Based Event Delivery Protocol in Publish/Subscribe Systems for Delay Tolerant Sensor Networks

    Directory of Open Access Journals (Sweden)

    Haigang Gong

    2009-09-01

    Full Text Available The basic operation of a Delay Tolerant Sensor Network (DTSN) is to accomplish pervasive data gathering in networks with intermittent connectivity, while the publish/subscribe (Pub/Sub for short) paradigm is used to deliver events from a source to interested clients in an asynchronous way. Recently, the extension of Pub/Sub systems to DTSNs has become a promising research topic. However, due to the frequent-partitioning characteristic unique to DTSNs, extending a Pub/Sub system to a DTSN is a considerably difficult and challenging problem, and there are no good solutions in published works. To adapt Pub/Sub systems to DTSNs, we propose CED, a community-based event delivery protocol. In our design, event delivery is based on several unchanged communities, which are formed by sensor nodes in the network according to their connectivity. CED consists of two components: event delivery and queue management. In event delivery, events in a community are delivered to mobile subscribers once a subscriber comes into the community, improving the data delivery ratio. Queue management employs both the event successful-delivery time and the event survival time to decide whether an event should be delivered or dropped, minimizing the transmission overhead. The effectiveness of CED is demonstrated through comprehensive simulation studies.
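
The queue-management rule described in the abstract (weighing delivery time against event survival time) can be sketched in a few lines. `should_drop`, the field names, and all numeric values are hypothetical illustrations of the idea, not the paper's actual protocol:

```python
from dataclasses import dataclass

@dataclass
class Event:
    payload: str
    created_at: float      # seconds since simulation start
    survival_time: float   # assumed TTL: how long the event stays useful

def should_drop(event, now, expected_delivery_time):
    # Keep the event only if the expected time to reach a subscriber
    # fits within its remaining survival time; otherwise drop it.
    remaining = (event.created_at + event.survival_time) - now
    return expected_delivery_time > remaining

e = Event("temperature=21C", created_at=0.0, survival_time=10.0)
print(should_drop(e, now=0.0, expected_delivery_time=15.0))  # True: drop, too slow
print(should_drop(e, now=0.0, expected_delivery_time=5.0))   # False: keep
```

A real implementation would estimate `expected_delivery_time` from observed inter-community contact statistics; here it is simply passed in.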

  12. Development of a fast piezo-based frequency tuner for superconducting CH cavities

    International Nuclear Information System (INIS)

    Amberg, Michael

    2015-01-01

    In this thesis, a fast piezo-based frequency tuner for current and prospective superconducting (sc) CH-cavities has been developed. The novel tuning concept differs fundamentally from conventional tuning systems for superconducting cavities. So-called dynamic bellow tuners are welded into the resonator to act against slow and fast frequency variations during operation. Because of their adjustable length it is possible to specifically influence the capacitance and therefore the resonance frequency of the cavity. To change the length of the dynamic bellow tuners, the frequency tuner drive, which consists of a slow tuning device controlled by a stepper motor and a fast piezo-based tuning system, is mounted on the helium vessel of the cavity. To validate the whole tuning concept, a frequency tuner drive prototype was built in the workshop of the Institute for Applied Physics (IAP) of Frankfurt University. First successful room-temperature measurements show that the developed frequency tuning system is an excellent and promising candidate to fulfill the requirements of slow and fast frequency tuning of sc CH-cavities during operation. Furthermore, several coupled structural and electromagnetic simulations of the sc 325 MHz CH-cavity as well as the sc 217 MHz CH-cavity have been performed with the simulation software ANSYS Workbench and CST MicroWave Studio, respectively. With these simulations it was possible to reduce the required frequency range, and thus the mechanical stroke of the dynamic bellow tuners, on the one hand; on the other hand, the mechanical stability of the particular CH-cavity was investigated to avoid plastic deformations due to limiting external effects. To verify the accuracy of the coupled simulations, the structural mechanical behaviour and the resulting frequency variations of the sc CH-cavities dependent on the external influences were measured at room temperature as well as at cryogenic temperatures around 4.2 K. The measurement results of both

  13. Quantifying Changes in Future Intensity-Duration-Frequency Curves Using Multimodel Ensemble Simulations

    Science.gov (United States)

    Ragno, Elisa; AghaKouchak, Amir; Love, Charlotte A.; Cheng, Linyin; Vahedifard, Farshid; Lima, Carlos H. R.

    2018-03-01

    During the last century, we have observed a warming climate with more intense precipitation extremes in some regions, likely due to increases in the atmosphere's water holding capacity. Traditionally, infrastructure design and rainfall-triggered landslide models rely on the notion of stationarity, which assumes that the statistics of extremes do not change significantly over time. However, in a warming climate, infrastructures and natural slopes will likely face more severe climatic conditions, with potential human and socioeconomic consequences. Here we outline a framework for quantifying climate change impacts based on the magnitude and frequency of extreme rainfall events using bias corrected historical and multimodel projected precipitation extremes. The approach evaluates changes in rainfall Intensity-Duration-Frequency (IDF) curves and their uncertainty bounds using a nonstationary model based on Bayesian inference. We show that highly populated areas across the United States may experience extreme precipitation events up to 20% more intense and twice as frequent, relative to historical records, despite the expectation of unchanged annual mean precipitation. Since IDF curves are widely used for infrastructure design and risk assessment, the proposed framework offers an avenue for assessing resilience of infrastructure and landslide hazard in a warming climate.
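
As a rough illustration of the frequency side of an IDF analysis, the sketch below fits a stationary GEV distribution to synthetic annual maxima and evaluates return levels. The paper's actual model is nonstationary and Bayesian, which is omitted here; all data and parameter values are synthetic:

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic annual-maximum rainfall intensities (mm/h); a real analysis
# would use station records or bias-corrected model output.
rng = np.random.default_rng(0)
annual_maxima = genextreme.rvs(c=-0.1, loc=30.0, scale=8.0, size=60,
                               random_state=rng)

# Stationary GEV fit by maximum likelihood (scipy's c = -xi convention).
c, loc, scale = genextreme.fit(annual_maxima)

def return_level(T):
    # T-year return level = the (1 - 1/T) quantile of the fitted GEV.
    return genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)

for T in (10, 50, 100):
    print(f"{T}-year return level: {return_level(T):.1f} mm/h")
```

Repeating the fit on a future (projected) sample and comparing return levels is the essence of quantifying an IDF shift; the nonstationary version instead lets `loc` and `scale` depend on time.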

  14. High accuracy microwave frequency measurement based on single-drive dual-parallel Mach-Zehnder modulator

    DEFF Research Database (Denmark)

    Zhao, Ying; Pang, Xiaodan; Deng, Lei

    2011-01-01

    A novel approach for broadband microwave frequency measurement by employing a single-drive dual-parallel Mach-Zehnder modulator is proposed and experimentally demonstrated. Based on bias manipulations of the modulator, conventional frequency-to-power mapping technique is developed by performing a...... 10⁻³ relative error. This high accuracy frequency measurement technique is a promising candidate for high-speed electronic warfare and defense applications....

  15. Gender classification in children based on speech characteristics: using fundamental and formant frequencies of Malay vowels.

    Science.gov (United States)

    Zourmand, Alireza; Ting, Hua-Nong; Mirhassani, Seyed Mostafa

    2013-03-01

    Speech is one of the prevalent communication media for humans. Identifying the gender of a child speaker based on his/her speech is crucial in telecommunication and speech therapy. This article investigates the use of fundamental and formant frequencies from sustained vowel phonation to distinguish the gender of Malay children aged between 7 and 12 years. The Euclidean minimum distance and multilayer perceptron were used to classify the gender of 360 Malay children based on different combinations of fundamental and formant frequencies (F0, F1, F2, and F3). The Euclidean minimum distance with normalized frequency data achieved a classification accuracy of 79.44%, which was higher than that of the nonnormalized frequency data. Age-dependent modeling was used to improve the accuracy of gender classification. The Euclidean distance method obtained 84.17% based on the optimal classification accuracy for all age groups. The accuracy was further increased to 99.81% using a multilayer perceptron based on mel-frequency cepstral coefficients. Copyright © 2013 The Voice Foundation. Published by Mosby, Inc. All rights reserved.
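
A minimal sketch of the Euclidean minimum-distance classifier on (F0, F1, F2, F3) feature vectors. The class-mean values below are invented placeholders; in the study they would be estimated from the Malay vowel recordings:

```python
import numpy as np

# Hypothetical per-class mean feature vectors (F0, F1, F2, F3 in Hz).
class_means = {
    "boy":  np.array([230.0, 700.0, 1300.0, 2900.0]),
    "girl": np.array([250.0, 750.0, 1400.0, 3100.0]),
}

def classify(features):
    # Assign the class whose mean vector is nearest in Euclidean distance.
    return min(class_means, key=lambda c: np.linalg.norm(features - class_means[c]))

sample = np.array([248.0, 745.0, 1390.0, 3080.0])
print(classify(sample))  # nearest to the "girl" mean
```

In practice the frequencies would first be normalized (as the abstract notes, normalization improved accuracy), and the multilayer-perceptron variant replaces the distance rule with a trained network.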

  16. Frequency and voltage dependent electrical responses of a poly(triarylamine) thin film-based organic Schottky diode

    Directory of Open Access Journals (Sweden)

    Mohamad Khairul Anuar

    2017-01-01

    Full Text Available A metal-organic-metal (MOM) type Schottky diode based on poly(triarylamine) (PTAA) thin films has been fabricated using the spin coating method. Investigation of the frequency-dependent conductance-voltage (G-V-f) and capacitance-voltage (C-V-f) characteristics of the ITO/PTAA/Al MOM-type diode was carried out in the frequency range from 12 Hz to 100 kHz using an LCR meter at room temperature. The frequency- and bias-voltage-dependent electrical responses were determined by an admittance-based measurement method in terms of an equivalent circuit model consisting of a parallel combination of resistance and capacitance (RC circuit). The investigation revealed that the conductance is both frequency and bias voltage dependent, increasing continuously with increasing frequency. Meanwhile, the capacitance depends on frequency up to a certain value (100 Hz) but decreases at high frequencies (1–10 kHz). The interface state density of the Schottky diode was determined from the G-V and C-V characteristics; it remains nearly constant at 2.8 × 10¹² eV⁻¹cm⁻², decreasing slightly with increasing frequency. Consequently, both the series resistance and the interface trap density were found to decrease with increasing frequency. The frequency dependence of the electrical responses is attributed to the distribution density of interface states that can follow the alternating current (AC) signal.
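
The parallel-RC equivalent-circuit model maps directly onto a simple extraction: the measured admittance at angular frequency ω is Y = G + jωC, so G is the real part and C = Im(Y)/ω. A minimal sketch with made-up component values:

```python
import numpy as np

def parallel_rc_from_admittance(Y, freq_hz):
    # Split a complex admittance Y = G + j*omega*C into the parallel
    # conductance G (S) and capacitance C (F) of the RC model.
    omega = 2 * np.pi * freq_hz
    return Y.real, Y.imag / omega

# Made-up values: G = 1 uS and C = 100 pF probed at 10 kHz.
f = 10e3
Y = 1e-6 + 1j * (2 * np.pi * f * 100e-12)
G, C = parallel_rc_from_admittance(Y, f)
print(G, C)
```

An LCR meter reports exactly this kind of complex admittance per frequency point; sweeping `f` and the bias reproduces the G-V-f and C-V-f curves the abstract analyses.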

  17. Gear fault diagnosis based on the structured sparsity time-frequency analysis

    Science.gov (United States)

    Sun, Ruobin; Yang, Zhibo; Chen, Xuefeng; Tian, Shaohua; Xie, Yong

    2018-03-01

    Over the last decade, sparse representation has become a powerful paradigm in mechanical fault diagnosis due to its excellent capability and the high flexibility for complex signal description. The structured sparsity time-frequency analysis (SSTFA) is a novel signal processing method, which utilizes mixed-norm priors on time-frequency coefficients to obtain a fine match for the structure of signals. In order to extract the transient feature from gear vibration signals, a gear fault diagnosis method based on SSTFA is proposed in this work. The steady modulation components and impulsive components of the defective gear vibration signals can be extracted simultaneously by choosing different time-frequency neighborhood and generalized thresholding operators. Besides, the time-frequency distribution with high resolution is obtained by piling different components in the same diagram. The diagnostic conclusion can be made according to the envelope spectrum of the impulsive components or by the periodicity of impulses. The effectiveness of the method is verified by numerical simulations, and the vibration signals registered from a gearbox fault simulator and a wind turbine. To validate the efficiency of the presented methodology, comparisons are made among some state-of-the-art vibration separation methods and the traditional time-frequency analysis methods. The comparisons show that the proposed method possesses advantages in separating feature signals under strong noise and accounting for the inner time-frequency structure of the gear vibration signals.
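
A greatly simplified stand-in for the idea of sparsity-based time-frequency denoising: soft-threshold the STFT magnitudes and reconstruct. SSTFA itself uses structured mixed-norm priors over time-frequency neighborhoods, which this sketch does not implement; the signal and threshold are synthetic:

```python
import numpy as np
from scipy.signal import stft, istft

def soft_threshold_denoise(x, fs, thresh):
    # Coefficient-wise soft thresholding of STFT magnitudes (a plain
    # surrogate for structured-sparsity priors), then inverse STFT.
    _, _, Z = stft(x, fs=fs, nperseg=256)
    shrunk = np.maximum(np.abs(Z) - thresh, 0.0) * np.exp(1j * np.angle(Z))
    _, x_hat = istft(shrunk, fs=fs, nperseg=256)
    return x_hat[: len(x)]

fs = 2048
t = np.arange(2 * fs) / fs
clean = np.sin(2 * np.pi * 48 * t)  # stand-in for a steady gear-mesh tone
noisy = clean + 0.5 * np.random.default_rng(1).standard_normal(len(t))
denoised = soft_threshold_denoise(noisy, fs, thresh=0.1)
print(np.std(noisy - clean), np.std(denoised - clean))  # error before/after
```

The structured method additionally groups coefficients (e.g. along time for steady modulations, along frequency for impulses) before thresholding, which is what lets it separate the two component types the abstract describes.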

  18. A Markovian event-based framework for stochastic spiking neural networks.

    Science.gov (United States)

    Touboul, Jonathan D; Faugeras, Olivier D

    2011-11-01

    In spiking neural networks, the information is conveyed by the spike times, which depend on the intrinsic dynamics of each neuron, the input it receives, and the connections between neurons. In this article we study the Markovian nature of the sequence of spike times in stochastic neural networks, and in particular the ability to deduce from a spike train the next spike time, and therefore to produce a description of the network activity based only on the spike times, regardless of the membrane potential process. To study this question in a rigorous manner, we introduce and study an event-based description of networks of noisy integrate-and-fire neurons, i.e., one based on the computation of the spike times. We show that the firing times of the neurons in the networks constitute a Markov chain, whose transition probability is related to the probability distribution of the interspike interval of the neurons in the network. In the cases where the Markovian model can be developed, the transition probability is explicitly derived for such classical neural network cases as the linear integrate-and-fire neuron models with excitatory and inhibitory interactions, for different types of synapses, possibly featuring noisy synaptic integration, transmission delays, and absolute and relative refractory periods. This covers most of the cases that have been investigated in the event-based description of spiking deterministic neural networks.
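
The event-based idea (computing spike times directly instead of integrating the membrane potential) can be illustrated for uncoupled, noiseless leaky integrate-and-fire neurons with constant drive, where the reset-to-threshold interval has a closed form t = τ ln(I / (I − V_th)). The parameters and the two-neuron example are illustrative only, far simpler than the stochastic networks the paper treats:

```python
import math
import heapq

TAU, V_TH, V_RESET = 20.0, 1.0, 0.0  # time constant (ms), normalized voltages

def next_spike_interval(drive):
    # Closed-form reset-to-threshold time for constant drive I > V_TH:
    # solve V(t) = I*(1 - exp(-t/tau)) = V_TH for t.
    return TAU * math.log((drive - V_RESET) / (drive - V_TH))

def simulate(drives, t_end):
    # Event-based simulation: jump from spike to spike via a priority
    # queue; no time-stepped integration of the membrane potential.
    heap = [(next_spike_interval(d), i) for i, d in enumerate(drives)]
    heapq.heapify(heap)
    spikes = []
    while heap:
        t, i = heapq.heappop(heap)
        if t > t_end:
            break
        spikes.append((t, i))
        heapq.heappush(heap, (t + next_spike_interval(drives[i]), i))
    return spikes

train = simulate(drives=[1.5, 3.0], t_end=100.0)
print(train[:4])  # (time, neuron_id) pairs in spike-time order
```

With noise and coupling, the deterministic interval above becomes the random interspike interval whose distribution defines the Markov chain's transition probability.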

  19. Agent Based Simulation of Group Emotions Evolution and Strategy Intervention in Extreme Events

    Directory of Open Access Journals (Sweden)

    Bo Li

    2014-01-01

    Full Text Available The agent-based simulation method has become a prominent approach in the computational modeling and analysis of public emergency management in social science research. Group emotion evolution, information diffusion, and collective behavior selection make the study of extreme incidents a complex-system problem, which requires new methods for incident management and strategy evaluation. This paper studies group emotion evolution and intervention strategy effectiveness using the agent-based simulation method. By employing a computational experimentation methodology, we construct group emotion evolution as a complex system and test the effects of three strategies. In addition, the events-chain model is proposed to capture the cumulative influence of temporally successive events. Each strategy is examined through three simulation experiments, including two made-up scenarios and a real case study. We show how various strategies can impact group emotion evolution in terms of complex emergence and cumulative emotional influence in extreme events. This paper also provides an effective method for using agent-based simulation to study complex collective behavior evolution problems in the extreme incident, emergency, and security study domains.
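
The emotion-contagion-plus-event-shock dynamic can be caricatured in a few lines: each agent relaxes toward the group mean (contagion) while successive events add shocks, echoing the events-chain accumulation idea. The coupling strength, shock size, and event timing are all invented for illustration and are not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(7)
emotions = rng.uniform(-0.2, 0.2, size=50)   # 50 agents, near-neutral start

def step(emotions, coupling=0.1, event_shock=0.0):
    # Contagion: drift toward the group mean; shocks shift everyone.
    mean = emotions.mean()
    updated = emotions + coupling * (mean - emotions) + event_shock
    return np.clip(updated, -1.0, 1.0)       # emotions bounded in [-1, 1]

for t in range(30):
    shock = 0.15 if t in (5, 6, 7) else 0.0  # a chain of successive events
    emotions = step(emotions, event_shock=shock)

print(round(float(emotions.mean()), 3))      # group mean lifted by the chain
```

An intervention strategy would enter as an extra term in `step` (e.g. a damping force applied after the shocks), and its effectiveness can be read off the trajectory of the group mean.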

  20. Entropy-based derivation of generalized distributions for hydrometeorological frequency analysis

    Science.gov (United States)

    Chen, Lu; Singh, Vijay P.

    2018-02-01

    Frequency analysis of hydrometeorological and hydrological extremes is needed for the design of hydraulic and civil infrastructure facilities as well as for water resources management. A multitude of distributions have been employed for frequency analysis of these extremes. However, no single distribution has been accepted as a global standard. Employing entropy theory, this study derived five generalized distributions for frequency analysis that used different kinds of information encoded as constraints. These distributions were the generalized gamma (GG), the generalized beta distribution of the second kind (GB2), and the Halphen type A (Hal-A), Halphen type B (Hal-B), and Halphen type inverse B (Hal-IB) distributions, among which the GG and GB2 distributions were previously derived by Papalexiou and Koutsoyiannis (2012), while the Halphen family is first derived using entropy theory in this paper. Entropy theory allows the parameters of the distributions to be estimated in terms of the constraints used for their derivation. The distributions were tested using extreme daily and hourly rainfall data. Results show that the root mean square error (RMSE) values were very small, indicating that the five generalized distributions fitted the extreme rainfall data well. Among them, the GB2 and the Halphen family generally gave a better fit according to the Akaike information criterion (AIC) values. Therefore, these generalized distributions are among the best choices for frequency analysis. The entropy-based derivation opens a new way for frequency analysis of hydrometeorological extremes.
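
As a rough companion to the fitting procedure, the sketch below fits SciPy's generalized gamma (one of the five families named above) to synthetic data and scores it with an RMSE between the empirical and fitted CDFs, mirroring the paper's RMSE criterion. The data and all parameter values are synthetic, and maximum likelihood stands in for the paper's entropy-based parameter estimation:

```python
import numpy as np
from scipy.stats import gengamma

# Synthetic "extreme daily rainfall" sample drawn from a known GG.
rng = np.random.default_rng(42)
rainfall = gengamma.rvs(a=2.0, c=1.5, scale=15.0, size=500, random_state=rng)

# Fit the generalized gamma with the location fixed at zero.
a, c, loc, scale = gengamma.fit(rainfall, floc=0)

# RMSE between the empirical CDF and the fitted CDF at the sample points.
x = np.sort(rainfall)
emp_cdf = (np.arange(1, len(x) + 1) - 0.5) / len(x)
fit_cdf = gengamma.cdf(x, a, c, loc=loc, scale=scale)
rmse = np.sqrt(np.mean((fit_cdf - emp_cdf) ** 2))
print(f"RMSE of fitted GG CDF: {rmse:.4f}")
```

The same scoring loop, run over several candidate families (GG, GB2, Halphen, ...), is essentially the model comparison the abstract reports, with AIC additionally penalizing parameter count.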