WorldWideScience

Sample records for event frequency based

  1. Initiating events frequency determination

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2004-01-01

    The paper describes work performed for the Nuclear Power Station (NPS). The work relates to the periodic initiating event frequency update for the Probabilistic Safety Assessment (PSA). Data for all relevant NPS initiating events (IE) were reviewed. The main focus was on events occurring during the most recent operating history (i.e., the last four years). The final IE frequencies were estimated by incorporating both NPS experience and nuclear industry experience. Each event was categorized according to the NPS individual plant examination (IPE) initiating event grouping approach. For the majority of the IE groups, few or no events have occurred at the NPS. For those IE groups with few or no NPS events, the final estimate was made by means of a Bayesian update with general nuclear industry values. The exceptions are rare loss-of-coolant-accident (LOCA) events, for which an evaluation of engineering aspects is used to determine the frequency. (author)
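    The Bayesian update described here is conventionally done with a conjugate gamma prior on the Poisson event rate: industry experience supplies the prior, and the plant-specific event count and exposure time update it. A minimal sketch, with purely hypothetical prior parameters and exposure values (the abstract gives no numbers):

```python
def update_ie_frequency(prior_alpha, prior_beta, n_events, exposure_years):
    """Conjugate gamma-Poisson update of an initiating event frequency.

    Prior: rate ~ Gamma(prior_alpha, prior_beta); likelihood: n_events
    observed in exposure_years. The posterior stays gamma, so the update
    is just parameter addition; the posterior mean is alpha / beta.
    """
    post_alpha = prior_alpha + n_events
    post_beta = prior_beta + exposure_years
    return post_alpha, post_beta, post_alpha / post_beta

# Hypothetical industry prior (mean 0.1/yr) updated with zero plant events
# observed over a four-year review window, as in the groups with no events.
alpha, beta, mean_rate = update_ie_frequency(0.5, 5.0, 0, 4.0)
```

    With no observed events the posterior mean drops below the prior mean, reflecting the event-free plant experience.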

  2. Under-Frequency Load Shedding Technique Considering Event-Based for an Islanded Distribution Network

    Directory of Open Access Journals (Sweden)

    Hasmaini Mohamad

    2016-06-01

    Full Text Available One of the biggest challenges for islanding operation is sustaining frequency stability. A large power imbalance following islanding causes under-frequency, hence an appropriate control is required to shed a certain amount of load. The main objective of this research is to develop an adaptive under-frequency load shedding (UFLS) technique for an islanded system. The technique is designed around events that include the moment the system is islanded and the tripping of any DG unit during islanding operation. A disturbance magnitude is calculated to determine the amount of load to be shed. The technique is modeled using the PSCAD simulation tool. Simulation studies on a distribution network with mini hydro generation are carried out to evaluate the UFLS model under different load conditions: peak and base load. Results show that the load shedding technique successfully sheds the required amount of load and stabilizes the system frequency.
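    The "disturbance magnitude" above is commonly back-calculated from the initial rate of change of frequency through the swing equation; the abstract does not give the formula, so the sketch below is a generic illustration with hypothetical inertia and base-power values, not the paper's PSCAD model:

```python
def load_to_shed_mw(rocof_hz_per_s, inertia_h_s, f_nominal_hz, s_base_mva):
    """Estimate the power imbalance right after islanding from the initial
    rate of change of frequency (ROCOF), via the swing equation:
        dP (pu) = (2 * H / f0) * df/dt
    and convert from per-unit to MW on the system base.
    """
    return abs(2.0 * inertia_h_s / f_nominal_hz * rocof_hz_per_s * s_base_mva)

# Hypothetical islanded system: H = 3 s, 50 Hz, 10 MVA base, ROCOF = -1 Hz/s
shed = load_to_shed_mw(-1.0, 3.0, 50.0, 10.0)
```

    The shed amount would then be distributed over load-shedding stages; the staging logic is scheme-specific and omitted here.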

  3. Pattern recognition based on time-frequency analysis and convolutional neural networks for vibrational events in φ-OTDR

    Science.gov (United States)

    Xu, Chengjin; Guan, Junjun; Bao, Ming; Lu, Jiangang; Ye, Wei

    2018-01-01

    Based on vibration signals detected by a phase-sensitive optical time-domain reflectometer (φ-OTDR) distributed optical fiber sensing system, this paper presents an implementation of time-frequency analysis and a convolutional neural network (CNN) used to classify different types of vibration events. First, spectral subtraction and the short-time Fourier transform are used to enhance the time-frequency features of vibration signals and to transform the different types of vibration signals into spectrograms, which are input to the CNN for automatic feature extraction and classification. Finally, by replacing the soft-max layer in the CNN with a multiclass support vector machine, the performance of the classifier is enhanced. Experiments show that after using this method to process 4000 vibration signal samples generated by four different vibration events, namely digging, walking, vehicles passing, and damaging, the recognition rates of vibration events are over 90%. The experimental results prove that this method can automatically make an effective feature selection and greatly improve the classification accuracy of vibration events in distributed optical fiber sensing systems.
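    The spectrogram front end can be sketched with a plain short-time Fourier transform; this is a generic illustration (the window length, hop size, and test tone are arbitrary choices, and the paper's spectral-subtraction step is omitted):

```python
import numpy as np

def stft_spectrogram(signal, win_len=256, hop=128):
    """Magnitude spectrogram: Hann-windowed frames -> rFFT magnitudes."""
    window = np.hanning(win_len)
    frames = [np.abs(np.fft.rfft(signal[s:s + win_len] * window))
              for s in range(0, len(signal) - win_len + 1, hop)]
    return np.array(frames).T  # shape: (freq_bins, time_frames)

fs = 8000                      # Hz, synthetic sampling rate
t = np.arange(fs) / fs         # 1 s of signal
tone = np.sin(2 * np.pi * 440 * t)
spec = stft_spectrogram(tone)  # 440 Hz falls near bin 440 / (fs / 256) ~ 14
```

    Each column of `spec` is one time frame; stacking such columns as an image is what makes the CNN input possible.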

  4. Event group importance measures for top event frequency analyses

    International Nuclear Information System (INIS)

    1995-01-01

    Three traditional importance measures, risk reduction, partial derivative, and variance reduction, have been extended to permit analyses of the relative importance of groups of underlying failure rates to the frequencies of resulting top events. The partial derivative importance measure was extended by assessing the contribution of a group of events to the gradient of the top event frequency. Given the moments of the distributions that characterize the uncertainties in the underlying failure rates, the expectation values of the top event frequency, its variance, and all of the new group importance measures can be quantified exactly for two familiar cases: (1) when all underlying failure rates are presumed independent, and (2) when pairs of failure rates based on common data are treated as being equal (totally correlated). In these cases, the new importance measures, which can also be applied to assess the importance of individual events, obviate the need for Monte Carlo sampling. The event group importance measures are illustrated using a small example problem and demonstrated by applications made as part of a major reactor facility risk assessment. These illustrations and applications indicate both the utility and the versatility of the event group importance measures.
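    For a top event frequency expressed as a function of the underlying failure rates, the group risk-reduction measure is the drop in top event frequency when the whole group is set to zero. A toy sketch (the two-cut-set top event expression is invented purely for illustration):

```python
def top_event_frequency(rates):
    # Toy top event with two minimal cut sets: {a, b} and {c}
    return rates['a'] * rates['b'] + rates['c']

def group_risk_reduction(rates, group):
    """Risk reduction of a group = F(base) - F(group rates set to zero)."""
    zeroed = {k: (0.0 if k in group else v) for k, v in rates.items()}
    return top_event_frequency(rates) - top_event_frequency(zeroed)

rates = {'a': 1e-2, 'b': 1e-3, 'c': 1e-5}
rr = group_risk_reduction(rates, {'a', 'b'})  # removes the {a, b} cut set
```

    Passing a single-element group recovers the ordinary single-event risk reduction, matching the paper's remark that the group measures also apply to individual events.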

  5. Event group importance measures for top event frequency analyses

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-31

    Three traditional importance measures, risk reduction, partial derivative, and variance reduction, have been extended to permit analyses of the relative importance of groups of underlying failure rates to the frequencies of resulting top events. The partial derivative importance measure was extended by assessing the contribution of a group of events to the gradient of the top event frequency. Given the moments of the distributions that characterize the uncertainties in the underlying failure rates, the expectation values of the top event frequency, its variance, and all of the new group importance measures can be quantified exactly for two familiar cases: (1) when all underlying failure rates are presumed independent, and (2) when pairs of failure rates based on common data are treated as being equal (totally correlated). In these cases, the new importance measures, which can also be applied to assess the importance of individual events, obviate the need for Monte Carlo sampling. The event group importance measures are illustrated using a small example problem and demonstrated by applications made as part of a major reactor facility risk assessment. These illustrations and applications indicate both the utility and the versatility of the event group importance measures.

  6. A power filter for the detection of burst events based on time-frequency spectrum estimation

    International Nuclear Information System (INIS)

    Guidi, G M; Cuoco, E; Vicere, A

    2004-01-01

    We propose, as a statistic for the detection of bursts in a gravitational wave interferometer, the 'energy' of the events estimated with a time-dependent calculation of the spectrum. This statistic has an asymptotic Gaussian distribution with known statistical moments, which makes it possible to perform a uniformly most powerful test (McDonough R N and Whalen A D 1995 Detection of Signals in Noise (New York: Academic)) on the energy mean. We estimate the receiver operating characteristic (ROC; see the same reference) of this statistic for different levels of the signal-to-noise ratio in the specific case of simulated noise having the spectral density expected for Virgo, using test signals taken from a library of possible waveforms emitted during the core collapse of type II supernovae.

  7. Evaluation of Frequency and Restoration time for Loss of Offsite Power events based on domestic operation experience

    International Nuclear Information System (INIS)

    Park, Jin-Hee; Han, Sang-Hoon; Lee, Ho Joong

    2006-01-01

    It is recognized that the availability of AC power to nuclear power plants (NPPs) is essential for safe operation, shutdown, and accident recovery. Unavailability of AC power can have an important adverse impact on a plant's ability to recover from accident sequences and maintain safe shutdown. The probabilistic safety assessments (PSA or PRA) performed for Korean NPPs also indicated that a loss of offsite power (LOOP) event and a station blackout (SBO) event can be important contributors to the total risk at a nuclear power plant, contributing from 30% to 70% of the total risk at some NPPs in Korea. However, up to now, the LOOP frequency and subsequent restoration time, which are important inputs to a plant probabilistic risk assessment, have relied upon foreign data. Therefore, in this paper, the actual LOOP events that occurred from 1978 to 2004 at commercial nuclear power plants in Korea are collected. A statistical analysis of LOOP frequency and restoration time is performed to support plant-specific and realistic risk models for NPPs in Korea. Additionally, an engineering analysis is performed to obtain insights into the LOOP events.
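    At its simplest, the statistical analysis reduces to a Poisson rate estimate for the LOOP frequency plus summary statistics of the restoration times; a sketch with made-up numbers (the Korean event counts and durations are not given in the abstract):

```python
def loop_statistics(restore_hours, reactor_years):
    """Point estimates: LOOP frequency (events per reactor-year) and the
    mean offsite power restoration time (hours)."""
    n = len(restore_hours)
    frequency = n / reactor_years
    mean_restoration = sum(restore_hours) / n
    return frequency, mean_restoration

# Hypothetical: four LOOP events over 200 reactor-years of operation
freq, mean_rt = loop_statistics([0.5, 1.2, 3.0, 0.8], 200.0)
```

    In a full PSA the restoration times would additionally be fitted to a distribution (e.g. lognormal) to obtain non-recovery probabilities versus time, which this sketch does not attempt.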

  8. Statistical Prediction of Solar Particle Event Frequency Based on the Measurements of Recent Solar Cycles for Acute Radiation Risk Analysis

    Science.gov (United States)

    Myung-Hee, Y. Kim; Shaowen, Hu; Cucinotta, Francis A.

    2009-01-01

    Large solar particle events (SPEs) present significant acute radiation risks to crew members during extra-vehicular activities (EVAs) or in lightly shielded space vehicles for space missions beyond the protection of the Earth's magnetic field. Acute radiation sickness (ARS) can impair performance and result in failure of the mission. Improved forecasting capability and/or early-warning systems and proper shielding solutions are required to stay within NASA's short-term dose limits. Exactly how to make use of observations of SPEs for predicting occurrence and size is a great challenge, because SPE occurrences themselves are random in nature even though the expected frequency of SPEs is strongly influenced by the time position within the solar activity cycle. Therefore, we developed a probabilistic model approach, in which a cumulative expected occurrence curve of SPEs for a typical solar cycle was formed from a non-homogeneous Poisson process model fitted to a database of proton fluence measurements of SPEs that occurred during the past 5 solar cycles (19 - 23) and of large SPEs identified from impulsive nitrate enhancements in polar ice. From the fitted model, the expected frequency of SPEs was estimated at any given proton fluence threshold (Phi(sub E)) with energy (E) >30 MeV during a defined space mission period. Corresponding Phi(sub E) (E=30, 60, and 100 MeV) fluence distributions were simulated with a random draw from a gamma distribution, and applied for SPE ARS risk analysis for a specific mission period. It was found that deep-seated organ doses were more precisely predicted at high energies, Phi(sub 100), than at lower energies such as Phi(sub 30) or Phi(sub 60), because of the greater penetration depth of high-energy protons. Estimates of ARS are then described for 90th and 95th percentile events for several mission lengths and for several likely organ dose-rates. The ability to accurately measure high energy protons
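    The core of the probabilistic model is a non-homogeneous Poisson process: an intensity curve over the solar cycle is integrated to obtain the expected SPE count for a mission window, and fluences are then drawn from a fitted gamma distribution. A sketch with an entirely hypothetical intensity curve and gamma parameters (the paper's fitted values are not reproduced):

```python
import math
import random

def expected_event_count(intensity, t0, t1, steps=1000):
    """Midpoint-rule integral of a nonhomogeneous Poisson intensity,
    giving the expected number of events in [t0, t1]."""
    dt = (t1 - t0) / steps
    return sum(intensity(t0 + (i + 0.5) * dt) for i in range(steps)) * dt

# Hypothetical intensity (events/yr) peaking midway through an 11-yr cycle
intensity = lambda t: 6.0 * math.exp(-((t - 5.5) ** 2) / (2.0 * 1.5 ** 2))
mu = expected_event_count(intensity, 0.0, 11.0)  # expected SPEs per cycle

# Hypothetical fluence draw (protons/cm^2) for one simulated event
random.seed(1)
fluence = random.gammavariate(2.0, 1e9)
```

    A mission-period expectation follows from integrating `intensity` over the mission window only, and the event count itself would be a Poisson draw with that mean.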

  9. Neural network approach to the prediction of seismic events based on low-frequency signal monitoring of the Kuril-Kamchatka and Japanese regions

    Directory of Open Access Journals (Sweden)

    Irina Popova

    2013-08-01

    Full Text Available Very-low-frequency/low-frequency (VLF/LF sub-ionospheric radiowave monitoring has been widely used in recent years to analyze earthquake preparatory processes. The connection between earthquakes with M ≥5.5 and nighttime disturbances of signal amplitude and phase has been established. Thus, it is possible to use nighttime anomalies of VLF/LF signals as earthquake precursors. Here, we propose a method for estimating the VLF/LF signal sensitivity to seismic processes using a neural network approach. We apply the error back-propagation technique based on a three-level perceptron to predict a seismic event. The back-propagation technique involves two main stages: network training, and recognition (the prediction itself. To train a neural network, we first create a so-called 'training set'. The 'teacher' specifies the correspondence between the chosen input and the output data. In the present case, a representative database includes both the LF data received over three years of monitoring at the station in Petropavlovsk-Kamchatsky (2005-2007, and the seismicity parameters of the Kuril-Kamchatka and Japanese regions. At the first stage, the neural network established the relationship between the characteristic features of the LF signal (the mean and dispersion of a phase and an amplitude at nighttime for a few days before a seismic event and the corresponding level of correlation with a seismic event, or the absence of a seismic event. For the second stage, the trained neural network was applied to predict seismic events from the LF data using twelve time intervals in 2004, 2005, 2006 and 2007. The results of the prediction are discussed.

  10. Weather Typing-Based Flood Frequency Analysis Verified for Exceptional Historical Events of Past 500 Years Along the Meuse River

    Science.gov (United States)

    De Niel, J.; Demarée, G.; Willems, P.

    2017-10-01

    Governments, policy makers, and water managers are pushed by recent socioeconomic developments, such as population growth and increased urbanization including the occupation of floodplains, to impose very stringent regulations on the design of hydrological structures. These structures need to withstand storms with return periods typically ranging between 1,250 and 10,000 years. Such quantification involves extrapolation of systematically measured instrumental data, possibly complemented by quantitative and/or qualitative historical data and paleoflood data. The accuracy of these extrapolations is, however, highly unclear in practice. In order to evaluate extreme river peak flow extrapolation and its accuracy, we studied historical and instrumental data of the past 500 years along the Meuse River. We moreover propose an alternative method for estimating the extreme value distribution of river peak flows, based on weather types derived from sea level pressure reconstructions. This approach results in a more accurate estimation of the tail of the distribution, where current methods underestimate the design levels related to extreme high return periods. The design flood for a 1,250 year return period is estimated at 4,800 m3 s-1 for the proposed method, compared with 3,450 and 3,900 m3 s-1 for a traditional method and a previous study.

  11. Testing and comparison of three frequency-based magnitude estimating parameters for earthquake early warning based events in the Yunnan region, China in 2014

    Science.gov (United States)

    Zhang, Jianjing; Li, Hongjie

    2018-06-01

    To mitigate potential seismic disasters in the Yunnan region, China, building up suitable magnitude estimation scaling laws for an earthquake early warning system (EEWS) is in high demand. In this paper, the records from the main and after-shocks of the Yingjiang earthquake (M_W 5.9), the Ludian earthquake (M_W 6.2) and the Jinggu earthquake (M_W 6.1), which occurred in Yunnan in 2014, were used to develop three estimators of earthquake magnitude: the maximum of the predominant period (τ_p^max), the characteristic period (τ_c) and the log-average period (τ_log). The correlations between these three frequency-based parameters and catalog magnitudes were developed, compared and evaluated against previous studies. The amplitude and period of seismic waves might be amplified in the Ludian mountain-canyon area by multiple reflections and resonance, leading to excessive values of the calculated parameters, which are consistent with Sichuan's scaling. As a result, τ_log was best correlated with magnitude and τ_c had the highest slope of the regression equation, while τ_p^max performed worst, with large scatter and less sensitivity to changes in magnitude. No evident saturation occurred in the case of M 6.1 and M 6.2 in this study. Even though τ_c and τ_log performed similarly and both reflect the size of the earthquake well, τ_log has slightly smaller prediction errors for small earthquakes (M ≤ 4.5), as also observed in previous research. Our work offers an insight into the feasibility of an EEWS in Yunnan, China, and shows that it is necessary to build up an appropriate scaling law suitable for the warning region.
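    Of the three estimators, the characteristic period τ_c has a standard definition (Kanamori's): τ_c = 2π/√r with r = ∫u̇² dt / ∫u² dt over the initial P-wave window. A sketch verified on a synthetic monochromatic record (the window length and test signal are arbitrary choices, not the paper's data):

```python
import numpy as np

def tau_c(displacement, dt):
    """Characteristic period tau_c = 2*pi / sqrt(int v^2 dt / int u^2 dt),
    with the velocity v obtained by numerical differentiation of the
    displacement record u."""
    velocity = np.gradient(displacement, dt)
    r = np.sum(velocity ** 2) / np.sum(displacement ** 2)
    return 2.0 * np.pi / np.sqrt(r)

# Sanity check: a pure 2 Hz sinusoid should give tau_c ~= 0.5 s
dt = 0.001
t = np.arange(0.0, 3.0, dt)     # a 3 s P-wave window
u = np.sin(2.0 * np.pi * 2.0 * t)
tc = tau_c(u, dt)
```

    In an EEWS the window is the first few seconds after the P-wave pick; the magnitude then comes from a regression of log(τ_c) against catalog magnitude, which is the scaling law the paper calibrates.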

  12. Frequency of adverse events after vaccination with different vaccinia strains.

    Directory of Open Access Journals (Sweden)

    Mirjam Kretzschmar

    2006-08-01

    Full Text Available BACKGROUND: Large quantities of smallpox vaccine have been stockpiled to protect entire nations against a possible reintroduction of smallpox. Planning for an appropriate use of these stockpiled vaccines in response to a smallpox outbreak requires a rational assessment of the risks of vaccination-related adverse events, compared to the risk of contracting an infection. Although considerable effort has been made to understand the dynamics of smallpox transmission in modern societies, little attention has been paid to estimating the frequency of adverse events due to smallpox vaccination. Studies exploring the consequences of smallpox vaccination strategies have commonly used a frequency of approximately one death per million vaccinations, which is based on a study of vaccination with the New York City Board of Health (NYCBH) strain of vaccinia virus. However, a multitude of historical studies of smallpox vaccination with other vaccinia strains suggest that there are strain-related differences in the frequency of adverse events after vaccination. Because many countries have stockpiled vaccine based on the Lister strain of vaccinia virus, a quantitative evaluation of the adverse effects of such vaccines is essential for emergency response planning. We conducted a systematic review and statistical analysis of historical data concerning vaccination against smallpox with different strains of vaccinia virus. METHODS AND FINDINGS: We analyzed historical vaccination data extracted from the literature. We extracted data on the frequency of postvaccinal encephalitis and death with respect to vaccinia strain and age of vaccinees. Using a hierarchical Bayesian approach for meta-analysis, we estimated the expected frequencies of postvaccinal encephalitis and death with respect to age at vaccination for smallpox vaccines based on the NYCBH and Lister vaccinia strains. We found large heterogeneity between findings from different studies and a time-period effect

  13. Grid Frequency Extreme Event Analysis and Modeling: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Florita, Anthony R [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clark, Kara [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gevorgian, Vahan [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Folgueras, Maria [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wenger, Erin [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-11-01

    Sudden losses of generation or load can lead to instantaneous changes in electric grid frequency and voltage. Extreme frequency events pose a major threat to grid stability. As renewable energy sources supply power to grids in increasing proportions, it becomes increasingly important to examine when and why extreme events occur to prevent destabilization of the grid. To better understand frequency events, including extrema, historic data were analyzed to fit probability distribution functions to various frequency metrics. Results showed that a standard Cauchy distribution fit the difference between the frequency nadir and prefault frequency (f_(C-A)) metric well, a standard Cauchy distribution fit the settling frequency (f_B) metric well, and a standard normal distribution fit the difference between the settling frequency and frequency nadir (f_(B-C)) metric very well. Results were inconclusive for the frequency nadir (f_C) metric, meaning it likely has a more complex distribution than those tested. This probabilistic modeling should facilitate more realistic modeling of grid faults.
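    Because the Cauchy distribution has no finite moments, fits like those described here are usually done with quantiles rather than sample means: the location is the median and the scale is half the interquartile range. A self-contained sketch on synthetic Cauchy data (the location/scale values are invented for illustration, not NREL's fitted values):

```python
import math
import random
import statistics

def fit_cauchy(samples):
    """Quantile estimators for a Cauchy distribution: location = median,
    scale = half the interquartile range (Q1 and Q3 sit at loc -/+ scale)."""
    xs = sorted(samples)
    n = len(xs)
    loc = statistics.median(xs)
    scale = (xs[(3 * n) // 4] - xs[n // 4]) / 2.0
    return loc, scale

# Synthetic frequency-metric samples: Cauchy(loc=0.01, scale=0.05) drawn
# via the inverse-CDF transform of uniform variates
random.seed(0)
data = [0.01 + 0.05 * math.tan(math.pi * (random.random() - 0.5))
        for _ in range(10001)]
loc, scale = fit_cauchy(data)
```

    The same quantile approach is robust to the extreme outliers that make a Cauchy sample mean meaningless, which is exactly the situation with heavy-tailed frequency-event metrics.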

  14. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2009-01-01

    The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for creation of dynamic process models and static...... information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and it is used to illustrate how events can...... be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used...

  15. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event......-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms...... of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches....

  16. Estimation of initiating event frequency for external flood events by extreme value theorem

    International Nuclear Information System (INIS)

    Chowdhury, Sourajyoti; Ganguly, Rimpi; Hari, Vibha

    2017-01-01

    External flood is an important common cause initiating event in nuclear power plants (NPPs). It may potentially lead to severe core damage (SCD) by first causing the failure of the systems required for maintaining the heat sinks and then by contributing to failures of the engineered systems designed to mitigate such failures. The sample NPP taken here is a twin-unit 220 MWe Indian standard pressurized heavy water reactor (PHWR) plant situated inland. A comprehensive in-house Level-1 internal event PSA for full power had already been performed. An external flood assessment was further conducted as part of external hazard risk assessment, in response to the post-Fukushima measures taken in the nuclear industry. The present paper describes the methodology to calculate the initiating event (IE) frequency for external flood events for the sample inland Indian NPP. Generalized extreme value (GEV) theory based on the maximum likelihood method (MLM) and the order statistics approach (OSA) is used to analyse the rainfall data for the site. The thousand-year return level and the necessary return periods for extreme rainfall are evaluated. These results, along with plant-specific topographical calculations, quantitatively establish that external flooding resulting from upstream dam break, river flooding and heavy rainfall (flash flood) would be unlikely for the sample NPP in consideration.
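    Given fitted GEV parameters (location μ, scale σ, shape ξ), the T-year return level follows from inverting the GEV CDF at 1 − 1/T. A sketch with hypothetical rainfall parameters (the paper's fitted values are not given in the abstract):

```python
import math

def gev_return_level(mu, sigma, xi, return_period):
    """T-year return level z_T, i.e. the level with F(z_T) = 1 - 1/T for
    the GEV CDF. Falls back to the Gumbel limit when the shape ~ 0."""
    y = -math.log(1.0 - 1.0 / return_period)
    if abs(xi) < 1e-9:
        return mu - sigma * math.log(y)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

# Hypothetical annual-maximum rainfall fit (mm): mu=150, sigma=40, xi=0.1
z1000 = gev_return_level(150.0, 40.0, 0.1, 1000.0)
```

    The thousand-year return level evaluated in the paper is exactly this quantity for the site's fitted parameters; comparing it against plant grade level is what supports the screening argument.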

  17. Development of transient initiating event frequencies for use in probabilistic risk assessments

    International Nuclear Information System (INIS)

    Mackowiak, D.P.; Gentillon, C.D.; Smith, K.L.

    1985-05-01

    Transient initiating event frequencies are an essential input to the analysis process of a nuclear power plant probabilistic risk assessment. These frequencies describe events causing or requiring scrams. This report documents an effort to validate and update from other sources a computer-based data file developed by the Electric Power Research Institute (EPRI) describing such events at 52 United States commercial nuclear power plants. Operating information from the United States Nuclear Regulatory Commission on 24 additional plants from their date of commercial operation has been combined with the EPRI data, and the entire data base has been updated to add 1980 through 1983 events for all 76 plants. The validity of the EPRI data and data analysis methodology and the adequacy of the EPRI transient categories are examined. New transient initiating event frequencies are derived from the expanded data base using the EPRI transient categories and data display methods. Upper bounds for these frequencies are also provided. Additional analyses explore changes in the dominant transients, changes in transient outage times and their impact on plant operation, and the effects of power level and scheduled scrams on transient event frequencies. A more rigorous data analysis methodology is developed to encourage further refinement of the transient initiating event frequencies derived herein. Updating the transient event data base resulted in approximately 2400 events being added to EPRI's approximately 3000-event data file. The resulting frequency estimates were in most cases lower than those reported by EPRI, but no significant order-of-magnitude changes were noted. The average number of transients per year for the combined data base is 8.5 for pressurized water reactors and 7.4 for boiling water reactors.
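    The per-reactor-year frequencies quoted above come down to event counts divided by operating experience, and an upper bound can be attached with a normal approximation to the Poisson count. A sketch using round illustrative numbers, not the report's actual tallies:

```python
import math

def transient_frequency(n_events, plant_years):
    """Point estimate (events per plant-year) and an approximate one-sided
    95% upper bound from a normal approximation to the Poisson count."""
    rate = n_events / plant_years
    upper = rate + 1.645 * math.sqrt(n_events) / plant_years
    return rate, upper

# Illustrative: 85 transients over 10 plant-years -> 8.5/yr, matching the
# PWR average quoted above
rate, upper = transient_frequency(85, 10.0)
```

    For small counts a chi-square-based Poisson bound would be preferable to the normal approximation; with counts in the thousands, as in this data base, the two essentially coincide.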

  18. Development of transient initiating event frequencies for use in probabilistic risk assessments

    Energy Technology Data Exchange (ETDEWEB)

    Mackowiak, D.P.; Gentillon, C.D.; Smith, K.L.

    1985-05-01

    Transient initiating event frequencies are an essential input to the analysis process of a nuclear power plant probabilistic risk assessment. These frequencies describe events causing or requiring scrams. This report documents an effort to validate and update from other sources a computer-based data file developed by the Electric Power Research Institute (EPRI) describing such events at 52 United States commercial nuclear power plants. Operating information from the United States Nuclear Regulatory Commission on 24 additional plants from their date of commercial operation has been combined with the EPRI data, and the entire data base has been updated to add 1980 through 1983 events for all 76 plants. The validity of the EPRI data and data analysis methodology and the adequacy of the EPRI transient categories are examined. New transient initiating event frequencies are derived from the expanded data base using the EPRI transient categories and data display methods. Upper bounds for these frequencies are also provided. Additional analyses explore changes in the dominant transients, changes in transient outage times and their impact on plant operation, and the effects of power level and scheduled scrams on transient event frequencies. A more rigorous data analysis methodology is developed to encourage further refinement of the transient initiating event frequencies derived herein. Updating the transient event data base resulted in approximately 2400 events being added to EPRI's approximately 3000-event data file. The resulting frequency estimates were in most cases lower than those reported by EPRI, but no significant order-of-magnitude changes were noted. The average number of transients per year for the combined data base is 8.5 for pressurized water reactors and 7.4 for boiling water reactors.

  19. Recurrent frequency-size distribution of characteristic events

    Directory of Open Access Journals (Sweden)

    S. G. Abaimov

    2009-04-01

    Full Text Available Statistical frequency-size (frequency-magnitude) properties of earthquake occurrence play an important role in seismic hazard assessments. The behavior of earthquakes is represented by two different statistics: interoccurrent behavior in a region and recurrent behavior at a given point on a fault (or at a given fault). The interoccurrent frequency-size behavior has been investigated by many authors and generally obeys the power-law Gutenberg-Richter distribution to a good approximation. It is expected that the recurrent frequency-size behavior should obey different statistics. However, this problem has received little attention because historic earthquake sequences do not contain enough events to reconstruct the necessary statistics. To overcome this lack of data, this paper investigates the recurrent frequency-size behavior for several problems. First, the sequences of creep events on a creeping section of the San Andreas fault are investigated. The applicability of the Brownian passage-time, lognormal, and Weibull distributions to the recurrent frequency-size statistics of slip events is tested, and the Weibull distribution is found to be the best-fit distribution. To verify this result, the behaviors of numerical slider-block and sand-pile models are investigated and the Weibull distribution is confirmed as the applicable distribution for these models as well. Exponents β of the best-fit Weibull distributions for the observed creep event sequences and for the slider-block model are found to have similar values, ranging from 1.6 to 2.2, with the corresponding aperiodicities CV of the applied distribution ranging from 0.47 to 0.64. We also note similarities between recurrent time-interval statistics and recurrent frequency-size statistics.
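    The aperiodicity CV quoted above is fully determined by the Weibull shape exponent β through the gamma function: CV = √(Γ(1+2/β) − Γ(1+1/β)²) / Γ(1+1/β). A quick check reproduces the paper's range (β from 1.6 to 2.2 giving CV from about 0.64 down to about 0.48):

```python
import math

def weibull_cv(beta):
    """Coefficient of variation of a Weibull distribution with shape beta;
    the scale parameter cancels out of the ratio std/mean."""
    g1 = math.gamma(1.0 + 1.0 / beta)
    g2 = math.gamma(1.0 + 2.0 / beta)
    return math.sqrt(g2 - g1 * g1) / g1

cv_at_16 = weibull_cv(1.6)  # upper end of the paper's aperiodicity range
cv_at_22 = weibull_cv(2.2)  # lower end
```

    This one-to-one link between β and CV is why the paper can report either quantity interchangeably for each fitted sequence.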

  20. Estimation of frequency of rare natural external events of very high intensity on the basis of (non)available data

    International Nuclear Information System (INIS)

    Holy, J.; Kolar, L.; Jaros, M.; Hladky, M.; Mlady, O.

    2014-01-01

The relatively frequent natural external events are usually of minor safety importance, because NPPs are constructed and operated to withstand their effects with a significant safety margin. Risk analysis is therefore typically devoted to natural events of exceptional intensity, which mostly have not occurred up to now but could still happen with some low probability and critical consequences. Since 'direct' plant-specific data providing evidence of such events is not available, special data-treatment and extrapolation methods have to be employed for frequency estimation. The paper summarizes a possible approach to estimating rare event frequency by extrapolation from available data and points out the potential problems and challenges encountered during the analysis. The general framework is discussed with regard to the choice of probabilistic distribution (Gumbel distribution versus the others), methods of working with data records (take out some observations, and why?) and analysis of the quality of input data sets (mix data sets from different sources or not? use 'old' observations?). The first part of the paper summarizes the approach used in the past to create the NPP Dukovany deterministic design basis regarding natural external events. The second, major part of the paper is devoted to bringing the ideas of probabilistic safety assessment into the safety assessment of external hazards, including such specific topics as the quality of available data records, possible violations of the common assumptions of statistical data analysis and ways to fix them, the choice of the probabilistic distribution modeling data variability, etc. Examples of results achieved for the NPP Dukovany site in the Czech Republic are given in the final section.
This paper represents a coordinated effort with participation of experts and staff.
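The extrapolation step described above is often carried out by fitting an extreme-value distribution to annual maxima and reading off the exceedance frequency of a never-observed intensity. A minimal method-of-moments sketch with the Gumbel distribution (the annual-maxima record and the design-basis intensity below are hypothetical):

```python
import math
import statistics

GAMMA = 0.57721566490153286  # Euler-Mascheroni constant

def fit_gumbel(annual_maxima):
    """Method-of-moments fit of a Gumbel distribution to annual maxima."""
    mean = statistics.fmean(annual_maxima)
    std = statistics.stdev(annual_maxima)
    beta = std * math.sqrt(6) / math.pi   # scale parameter
    mu = mean - GAMMA * beta              # location parameter
    return mu, beta

def exceedance_frequency(x, mu, beta):
    """Annual probability that the yearly maximum exceeds intensity x."""
    return 1.0 - math.exp(-math.exp(-(x - mu) / beta))

# Hypothetical record of annual maximum wind gusts (m/s) at a site
maxima = [28.1, 31.4, 26.5, 33.0, 29.7, 27.9, 35.2, 30.3, 28.8, 32.1]
mu, beta = fit_gumbel(maxima)
# Frequency of a design-basis gust never yet observed at the site
f = exceedance_frequency(45.0, mu, beta)
print(f"annual exceedance frequency: {f:.2e}")
```

The choice of distribution and the treatment of outlying observations, which the paper discusses, can shift such extrapolated frequencies by orders of magnitude.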

  1. Radio frequency based water level monitor

    African Journals Online (AJOL)

    eobe

    ABSTRACT. This paper elucidates a radio frequency (RF) based transmission and reception system used to remotely monitor and .... range the wireless can cover but in this prototype, it ... power supply to the system, the sensed water level is.

  2. Comparison and applicability of landslide susceptibility models based on landslide ratio-based logistic regression, frequency ratio, weight of evidence, and instability index methods in an extreme rainfall event

    Science.gov (United States)

    Wu, Chunhung

    2016-04-01

    Few studies have discussed the applicability of statistical landslide susceptibility (LS) models to extreme rainfall-induced landslide events. This research compares the applicability of LS models based on four methods, landslide ratio-based logistic regression (LRBLR), frequency ratio (FR), weight of evidence (WOE), and instability index (II), in an extreme rainfall-induced landslide case. The landslide inventory of the Chishan river watershed, Southwestern Taiwan, after 2009 Typhoon Morakot is the main material in this research. The Chishan river watershed is a tributary of the Kaoping river watershed, a landslide- and erosion-prone watershed with an annual average suspended load of 3.6×10⁷ MT/yr (ranking 11th in the world). Typhoon Morakot struck Southern Taiwan from Aug. 6-10, 2009 and dumped nearly 2,000 mm of rainfall on the Chishan river watershed; the 24-hour, 48-hour, and 72-hour accumulated rainfall all exceeded the 200-year return period values. 2,389 landslide polygons in the Chishan river watershed were extracted from SPOT 5 images after the typhoon. The total landslide area is around 33.5 km², equal to a landslide ratio of 4.1%. The main landslide types, following Varnes' (1978) classification, are rotational and translational slides. Two characteristics of this extreme rainfall-induced landslide event are the dense landslide distribution and the large share of downslope landslide areas caused by headward and bank erosion during flooding: the downslope landslide area in the Chishan river watershed after Typhoon Morakot is 3.2 times the upslope landslide area. The prediction accuracy of the LS models based on the LRBLR, FR, WOE, and II methods exceeds 70%. The model performance and applicability of four models in a landslide-prone watershed with dense distribution of rainfall
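The frequency ratio (FR) method compared above reduces to a per-class statistic: the share of landslide cells falling in a factor class divided by that class's share of all cells. A minimal sketch with hypothetical slope-angle classes (not the Chishan inventory):

```python
def frequency_ratio(class_cells, class_landslide_cells):
    """FR per class = (% of landslide cells in class) / (% of all cells in class).
    FR > 1 marks classes more landslide-prone than the watershed average."""
    total_cells = sum(class_cells.values())
    total_ls = sum(class_landslide_cells.values())
    fr = {}
    for c in class_cells:
        pct_cells = class_cells[c] / total_cells
        pct_ls = class_landslide_cells[c] / total_ls
        fr[c] = pct_ls / pct_cells if pct_cells > 0 else 0.0
    return fr

# Hypothetical slope-angle classes (cell counts from a DEM)
cells = {"0-15deg": 50_000, "15-30deg": 30_000, "30-45deg": 20_000}
ls_cells = {"0-15deg": 500, "15-30deg": 1_200, "30-45deg": 2_400}
fr = frequency_ratio(cells, ls_cells)
print(fr)
```

Summing each cell's FR values over all factor layers yields the susceptibility index that is then thresholded into susceptibility classes.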

  3. Analysis of core damage frequency: Surry, Unit 1 internal events

    International Nuclear Information System (INIS)

    Bertucio, R.C.; Julius, J.A.; Cramond, W.R.

    1990-04-01

    This document contains the accident sequence analysis of internally initiated events for the Surry Nuclear Station, Unit 1. This is one of the five plant analyses conducted as part of the NUREG-1150 effort by the Nuclear Regulatory Commission, which documents the risk of a selected group of nuclear power plants. The work performed and described here is an extension of that published in November 1986 as NUREG/CR-4450, Volume 3. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved. The content and detail of this report are directed toward PRA practitioners who need to know how the work was performed and the details for use in further studies. The mean core damage frequency at Surry was calculated to be 4.05E-5 per year, with a 95% upper bound of 1.34E-4 and a 5% lower bound of 6.8E-6 per year. Station blackout accidents (loss of all AC power) were the largest contributors to the core damage frequency, accounting for approximately 68% of the total. The next dominant contributors were Loss of Coolant Accidents (LOCAs), whose sequences account for 15% of core damage frequency. No other type of sequence accounts for more than 10% of core damage frequency. 49 refs., 52 figs., 70 tabs

  4. Warning and prevention based on estimates with large uncertainties: the case of low-frequency and large-impact events like tsunamis

    Science.gov (United States)

    Tinti, Stefano; Armigliato, Alberto; Pagnoni, Gianluca; Zaniboni, Filippo

    2013-04-01

    Geoscientists often deal with hazardous processes like earthquakes, volcanic eruptions, tsunamis and hurricanes, and their research is aimed not only at a better understanding of the physical processes, but also at assessing the space and time evolution of a given individual event (i.e., providing short-term prediction) and the expected evolution of a group of events (i.e., providing statistical estimates referred to a given return period and a given geographical area). One of the main issues of any scientific method is how to cope with measurement errors, a topic which, in the case of forecasting ongoing or future events, translates into how to deal with forecast uncertainties. In general, the more data are available and processed to make a prediction, the more accurate the prediction is expected to be (if the scientific approach is sound) and the smaller the associated uncertainties. However, there are several important cases where the assessment must be made with insufficient data or insufficient time for processing, which leads to large uncertainties. Two examples can be taken from tsunami science, since tsunamis are rare events that may have destructive power and very large impact. One example is warning for a tsunami generated by a near-coast earthquake, an issue at the focus of the European-funded project NearToWarn. Warning has to be issued before the tsunami hits the coast, that is, within a few minutes of its generation. This may imply that the data collected in such a short time are not yet enough for an accurate evaluation, also because the implemented monitoring system (if any) could be inadequate (for instance, a dense instrumental network could be judged too expensive for rare events). The second case is long-term prevention of tsunami strikes.
Tsunami infrequency may imply that the historical record for a given piece of coast is too short to capture a statistical

  5. Host Event Based Network Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Jonathan Chugg

    2013-01-01

    The purpose of INL’s research on this project is to demonstrate the feasibility of a host event based network monitoring tool and its effects on host performance. Current host-based network monitoring tools rely on polling, which can miss activity that occurs between polls. Instead of polling, a tool could be developed that uses event APIs in the operating system to receive asynchronous notifications of network activity. Analysis and logging of these events allow the tool to construct the complete real-time and historical network configuration of the host while the tool is running. This research focused on three operating systems commonly used by SCADA systems: Linux, Windows XP, and Windows 7. Windows 7 offers two paths that have minimal impact on the system and should be seriously considered: first, the new Windows Event Logging API, and second, the ALE API within WFP. Any future work should focus on these methods.
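The polling-versus-events distinction can be sketched with a minimal observer-style dispatcher (generic Python, not the INL tool; a real implementation would subscribe to OS facilities such as the Windows Event Logging API or ALE callouts in WFP, and the event fields below are hypothetical):

```python
from collections import defaultdict
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class NetworkEvent:
    kind: str        # e.g. "connect", "listen", "close"
    pid: int
    remote: str

class EventMonitor:
    """Receives asynchronous event callbacks instead of polling snapshots,
    so short-lived activity between poll intervals is not missed."""
    def __init__(self):
        self._handlers = defaultdict(list)
        self.log: List[NetworkEvent] = []

    def subscribe(self, kind: str, handler: Callable[[NetworkEvent], None]):
        self._handlers[kind].append(handler)

    def dispatch(self, event: NetworkEvent):
        self.log.append(event)          # historical record of configuration
        for h in self._handlers[event.kind]:
            h(event)                    # real-time notification

monitor = EventMonitor()
alerts = []
monitor.subscribe("connect", lambda e: alerts.append((e.pid, e.remote)))
# The OS event API would invoke dispatch(); here we simulate two events:
monitor.dispatch(NetworkEvent("connect", 1234, "10.0.0.5:502"))
monitor.dispatch(NetworkEvent("close", 1234, "10.0.0.5:502"))
```

A poller sampling between the two dispatches above would see no open connection at all, which is exactly the gap the event-driven design closes.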

  6. Systematic review on the prevalence, frequency and comparative value of adverse events data in social media

    Science.gov (United States)

    Golder, Su; Norman, Gill; Loke, Yoon K

    2015-01-01

    Aim The aim of this review was to summarize the prevalence, frequency and comparative value of information on the adverse events of healthcare interventions from user comments and videos in social media. Methods A systematic review of assessments of the prevalence or type of information on adverse events in social media was undertaken. Sixteen databases and two internet search engines were searched in addition to handsearching, reference checking and contacting experts. The results were sifted independently by two researchers. Data extraction and quality assessment were carried out by one researcher and checked by a second. The quality assessment tool was devised in-house and a narrative synthesis of the results followed. Results From 3064 records, 51 studies met the inclusion criteria. The studies assessed over 174 social media sites with discussion forums (71%) being the most popular. The overall prevalence of adverse events reports in social media varied from 0.2% to 8% of posts. Twenty-nine studies compared the results from searching social media with using other data sources to identify adverse events. There was general agreement that a higher frequency of adverse events was found in social media and that this was particularly true for ‘symptom’ related and ‘mild’ adverse events. Those adverse events that were under-represented in social media were laboratory-based and serious adverse events. Conclusions Reports of adverse events are identifiable within social media. However, there is considerable heterogeneity in the frequency and type of events reported, and the reliability or validity of the data has not been thoroughly evaluated. PMID:26271492

  7. Utilization of Satellite Data to Identify and Monitor Changes in Frequency of Meteorological Events

    Science.gov (United States)

    Mast, J. C.; Dessler, A. E.

    2017-12-01

    Increases in temperature and climate variability due to human-induced climate change are increasing the frequency and magnitude of extreme heat events (i.e., heatwaves). This will have a detrimental impact on the health of human populations and the habitability of certain land locations. Here we seek to utilize satellite data records to identify and monitor extreme heat events. We analyze satellite data sets (MODIS and AIRS land surface temperatures (LST) and water vapor profiles (WV)) for their global coverage and stable calibration. Heat waves are identified from the frequency of maximum daily temperatures above a threshold, determined as follows. Land surface temperatures are gridded into uniform latitude/longitude bins. Maximum daily temperatures per bin are determined and probability density functions (PDFs) of these maxima are constructed monthly and seasonally. For each bin, a threshold is calculated at the 95th percentile of the PDF of maximum temperatures, and an extreme heat event is defined by the frequency of monthly and seasonal days exceeding that threshold. To account for the decreased ability of the human body to thermoregulate with increasing moisture, and to assess the lethality of the heat events, we determine the wet-bulb temperature at the locations of extreme heat events. Preliminary results will be presented.
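The per-bin threshold logic described above (95th percentile of daily maxima, then counting exceedance days) can be sketched with NumPy on synthetic data (hypothetical temperatures, not MODIS/AIRS retrievals, and the two-day event criterion is illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical daily-maximum land surface temperatures (K) for one month,
# on a small 3x4 grid of latitude/longitude bins: shape (days, lat, lon)
t_max = rng.normal(loc=305.0, scale=3.0, size=(30, 3, 4))

# Per-bin threshold: 95th percentile of the distribution of daily maxima
threshold = np.percentile(t_max, 95, axis=0)      # shape (3, 4)

# Count exceedance days per bin and flag bins meeting an event criterion
exceed_days = (t_max > threshold).sum(axis=0)     # shape (3, 4)
heat_event = exceed_days >= 2                     # illustrative criterion
print(exceed_days)
```

In practice the thresholds would be built from a long climatology rather than the same month being tested, so that exceedance counts measure anomalies, not the percentile definition itself.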

  8. Study of Updating Initiating Event Frequency using Prognostics

    International Nuclear Information System (INIS)

    Kim, Hyeonmin; Lee, Sang-Hwan; Park, Jun-seok; Kim, Hyungdae; Chang, Yoon-Suk; Heo, Gyunyoung

    2014-01-01

    The Probabilistic Safety Assessment (PSA) model makes it possible to find the relative priority of accident scenarios, weak points in achieving accident prevention or mitigation, and insights to improve those vulnerabilities. A PSA therefore calls for realistic calculation to obtain precise and confident results. However, PSA models still contain 'conservative' aspects arising from the procedures used to develop them. One source of this conservatism is the assumptions of the safety analysis and the estimation of failure frequency. Recently, Surveillance, Diagnosis, and Prognosis (SDP) has become a growing trend, in space and aviation systems in particular, and a study dealing with the applicable areas and state-of-the-art status of SDP in the nuclear industry was published. SDP, which utilizes massive databases and information technology, is worth highlighting for its capability to alleviate the conservatism in conventional PSA. This paper reviews the concept of integrating PSA and SDP and suggests an updated Initiating Event (IE) methodology using prognostics. In more detail, we focus on the IE of Steam Generator Tube Rupture (SGTR) considering tube degradation; this paper extends our previously suggested research. Prognostics algorithms in SDP are applied to IEs and basic events in the Level 1 PSA. As an example, updating the SGTR IE and its ageing were considered. Tube ageing was analyzed using PASTA and a Monte Carlo method. After analyzing the tube ageing, the conventional SGTR IE frequency was updated using a Bayesian approach. The studied method can help address the static nature and conservatism of PSA.
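The Bayesian update mentioned for the initiating-event frequency is commonly a conjugate gamma-Poisson update (a generic sketch; the prior parameters and plant evidence below are hypothetical, and the paper's PASTA-based degradation model is not reproduced):

```python
# Conjugate update: a Gamma(alpha, beta) prior on an event rate (events/yr)
# combined with Poisson evidence of k events in T reactor-years gives
# a posterior Gamma(alpha + k, beta + T).
def gamma_poisson_update(alpha, beta, k, T):
    return alpha + k, beta + T

# Hypothetical generic prior for SGTR frequency, mean alpha/beta ≈ 3.5e-3/yr
alpha0, beta0 = 0.5, 142.0
k, T = 0, 25.0                          # plant evidence: no SGTR in 25 yr
alpha1, beta1 = gamma_poisson_update(alpha0, beta0, k, T)
posterior_mean = alpha1 / beta1
print(f"updated IE frequency: {posterior_mean:.2e} /yr")
```

Replacing the static prior with a prognostics-informed rate (e.g. one reflecting predicted tube degradation) is where the SDP integration proposed by the paper would enter.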

  9. Study of Updating Initiating Event Frequency using Prognostics

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeonmin; Lee, Sang-Hwan; Park, Jun-seok; Kim, Hyungdae; Chang, Yoon-Suk; Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of)

    2014-10-15

    The Probabilistic Safety Assessment (PSA) model makes it possible to find the relative priority of accident scenarios, weak points in achieving accident prevention or mitigation, and insights to improve those vulnerabilities. A PSA therefore calls for realistic calculation to obtain precise and confident results. However, PSA models still contain 'conservative' aspects arising from the procedures used to develop them. One source of this conservatism is the assumptions of the safety analysis and the estimation of failure frequency. Recently, Surveillance, Diagnosis, and Prognosis (SDP) has become a growing trend, in space and aviation systems in particular, and a study dealing with the applicable areas and state-of-the-art status of SDP in the nuclear industry was published. SDP, which utilizes massive databases and information technology, is worth highlighting for its capability to alleviate the conservatism in conventional PSA. This paper reviews the concept of integrating PSA and SDP and suggests an updated Initiating Event (IE) methodology using prognostics. In more detail, we focus on the IE of Steam Generator Tube Rupture (SGTR) considering tube degradation; this paper extends our previously suggested research. Prognostics algorithms in SDP are applied to IEs and basic events in the Level 1 PSA. As an example, updating the SGTR IE and its ageing were considered. Tube ageing was analyzed using PASTA and a Monte Carlo method. After analyzing the tube ageing, the conventional SGTR IE frequency was updated using a Bayesian approach. The studied method can help address the static nature and conservatism of PSA.

  10. Time-Frequency Data Reduction for Event Related Potentials: Combining Principal Component Analysis and Matching Pursuit

    Directory of Open Access Journals (Sweden)

    Selin Aviyente

    2010-01-01

    Full Text Available Joint time-frequency representations offer a rich representation of event related potentials (ERPs that cannot be obtained through individual time or frequency domain analysis. This representation, however, comes at the expense of increased data volume and the difficulty of interpreting the resulting representations. Therefore, methods that can reduce the large amount of time-frequency data to experimentally relevant components are essential. In this paper, we present a method that reduces the large volume of ERP time-frequency data into a few significant time-frequency parameters. The proposed method is based on applying the widely used matching pursuit (MP approach, with a Gabor dictionary, to principal components extracted from the time-frequency domain. The proposed PCA-Gabor decomposition is compared with other time-frequency data reduction methods such as the time-frequency PCA approach alone and standard matching pursuit methods using a Gabor dictionary for both simulated and biological data. The results show that the proposed PCA-Gabor approach performs better than either the PCA alone or the standard MP data reduction methods, by using the smallest amount of ERP data variance to produce the strongest statistical separation between experimental conditions.
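The matching-pursuit step can be sketched as a greedy search over a small Gabor dictionary; here a synthetic signal stands in for a time-frequency principal component (the dictionary grid and signal are illustrative, not the paper's ERP data):

```python
import numpy as np

def gabor_atom(n, center, width, freq):
    """Unit-norm Gabor atom: a Gaussian-windowed cosine."""
    t = np.arange(n)
    g = np.exp(-0.5 * ((t - center) / width) ** 2) * np.cos(2 * np.pi * freq * t)
    return g / np.linalg.norm(g)

def matching_pursuit(signal, dictionary, n_iter=3):
    """Greedy MP: repeatedly pick and subtract the best-correlated atom."""
    residual = signal.copy()
    picks = []
    for _ in range(n_iter):
        gains = dictionary @ residual          # correlation with each atom
        best = int(np.argmax(np.abs(gains)))
        picks.append((best, float(gains[best])))
        residual = residual - gains[best] * dictionary[best]
    return picks, residual

n = 128
# Small Gabor dictionary over a grid of centers, widths and frequencies
atoms = np.array([gabor_atom(n, c, w, f)
                  for c in (32, 64, 96)
                  for w in (4.0, 8.0)
                  for f in (0.05, 0.1, 0.2)])

# Stand-in for one time-frequency principal component: two Gabor bursts + noise
rng = np.random.default_rng(0)
component = 2.0 * atoms[4] + 0.8 * atoms[13] + 0.02 * rng.standard_normal(n)

picks, residual = matching_pursuit(component, atoms)
print(picks)
```

In the paper's PCA-Gabor scheme the pursuit is run on principal components of the time-frequency surfaces rather than on raw trials, which is what compresses the data into a few interpretable Gabor parameters.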

  11. Procedures for the external event core damage frequency analyses for NUREG-1150

    International Nuclear Information System (INIS)

    Bohn, M.P.; Lambright, J.A.

    1990-11-01

    This report presents methods which can be used to assess the risk due to external events at nuclear power plants. These methods were used to perform the external events risk assessments for the Surry and Peach Bottom nuclear power plants as part of the NRC-sponsored NUREG-1150 risk assessments. They apply to the full range of hazards, such as earthquakes, fires and floods, which are collectively known as external events. The methods described in this report have been developed under NRC sponsorship and represent, in many cases, both advancements and simplifications over techniques that have been used in past years. They also include the most up-to-date databases on equipment seismic fragilities, fire occurrence frequencies and fire damageability thresholds. The methods described here are based on making full use of the power plant systems logic models developed in the internal events analyses. By making full use of the internal events models one obtains an external event analysis that is consistent, both in nomenclature and in level of detail, with the internal events analyses and that automatically includes the appropriate random and test/maintenance unavailabilities. 50 refs., 9 figs., 11 tabs

  12. On the Onset Frequency of Metric Type II Radio Bursts and the Longitudinal Extent of the Associated SEP Events

    Science.gov (United States)

    Makela, P. A.; Gopalswamy, N.; Yashiro, S.; Thakur, N.; Akiyama, S.; Xie, H.

    2017-12-01

    In a recent study Gopalswamy et al. (2017, J. Phys. Conf. Ser., Proc. 16th AIAC) found that the ground level enhancements (GLEs), regular solar energetic particle (SEP) events and filament eruption (FE) associated SEP events have distinct average starting frequencies of the associated type II bursts, although the distributions overlap. They also found that the initial acceleration of the coronal mass ejections (CMEs) associated with the three groups were distinct. Based on these earlier results emphasizing a hierarchical relationship of CME kinematics and SEP events, we studied the possible dependence between the longitudinal spread of the SEP events and the onset frequency of metric type II. The studied >25 MeV SEP events are from the list of Richardson et al. (2014, Sol. Phys. 289) covering the first seven years of the STEREO mission. However, our preliminary results show only a weak correlation between the extent of the SEP event and the onset frequency of the metric type II radio burst.

  13. Shallow very-low-frequency earthquakes accompany slow slip events in the Nankai subduction zone.

    Science.gov (United States)

    Nakano, Masaru; Hori, Takane; Araki, Eiichiro; Kodaira, Shuichi; Ide, Satoshi

    2018-03-14

    Recent studies of slow earthquakes along plate boundaries have shown that tectonic tremor, low-frequency earthquakes, very-low-frequency events (VLFEs), and slow-slip events (SSEs) often accompany each other and appear to share common source faults. However, the source processes of slow events occurring in the shallow part of plate boundaries are not well known because seismic observations have been limited to land-based stations, which offer poor resolution beneath offshore plate boundaries. Here we use data obtained from seafloor observation networks in the Nankai trough, southwest of Japan, to investigate shallow VLFEs in detail. Coincident with the VLFE activity, signals indicative of shallow SSEs were detected by geodetic observations at seafloor borehole observatories in the same region. We find that the shallow VLFEs and SSEs share common source regions and almost identical time histories of moment release. We conclude that these slow events arise from the same fault slip and that VLFEs represent relatively high-frequency fluctuations of slip during SSEs.

  14. HYPOCENTER DISTRIBUTION OF LOW FREQUENCY EVENT AT PAPANDAYAN VOLCANO

    Directory of Open Access Journals (Sweden)

    Muhammad Mifta Hasan

    2016-10-01

    Full Text Available Papandayan volcano is an irregular cone-shaped stratovolcano with eight craters around the peak, of which the Mas crater is the most active. The distribution of relocated events calculated using the Geiger Adaptive Damping Algorithm (GAD) shows that the epicenters are centered below the Mas crater, with a maximum rms of 0.114. Hypocenter depths range between 0-2 km and 5-6 km, attributed to steam and gas activity.
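GAD adds adaptive damping to the classic Geiger relocation scheme, which linearizes travel times around a trial hypocenter and iterates a least-squares correction. An undamped, constant-velocity sketch with a hypothetical station geometry (not the Papandayan network):

```python
import numpy as np

def travel_times(src, stations, v=3.0):
    """Predicted P arrival times; src = [x, y, z, t0] (km, km, km down, s)."""
    d = np.linalg.norm(stations - src[:3], axis=1)
    return src[3] + d / v

def geiger_locate(t_obs, stations, guess, v=3.0, n_iter=10):
    """Geiger's method: iterative linearized least squares for [x, y, z, t0].
    (GAD adds adaptive damping to this basic scheme; none is applied here.)"""
    m = np.asarray(guess, dtype=float)
    for _ in range(n_iter):
        d = np.linalg.norm(stations - m[:3], axis=1)
        r = t_obs - (m[3] + d / v)                    # arrival-time residuals
        # Partials: dt/dx_j = (x_j - s_j) / (v * d), dt/dt0 = 1
        J = np.column_stack([(m[:3] - stations) / (v * d[:, None]),
                             np.ones_like(t_obs)])
        dm, *_ = np.linalg.lstsq(J, r, rcond=None)    # solve J dm = r
        m = m + dm
    rms = float(np.sqrt(np.mean((t_obs - travel_times(m, stations)) ** 2)))
    return m, rms

# Hypothetical summit station geometry (km) and a synthetic event beneath it
stations = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.1], [0.0, 2.0, 0.2],
                     [-2.0, 0.5, 0.0], [1.0, -2.0, 0.1], [-1.5, -1.5, 0.3]])
true_src = np.array([0.3, -0.2, 1.5, 0.0])            # 1.5 km deep
t_obs = travel_times(true_src, stations)
m, rms = geiger_locate(t_obs, stations, guess=[0.0, 0.0, 1.0, -0.1])
print(m.round(3), rms)
```

With noise-free synthetic picks the iteration recovers the source and the rms residual collapses; on real picks the rms (here the 0.114 quoted in the abstract) measures how well the relocated hypocenter fits the arrivals.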

  15. Characterization of the frequency and nature of bleed air contamination events in commercial aircraft.

    Science.gov (United States)

    Shehadi, M; Jones, B; Hosni, M

    2016-06-01

    Contamination of the bleed air used to pressurize and ventilate aircraft cabins is of concern due to the potential health and safety hazards for passengers and crew. Databases from the Federal Aviation Administration, NASA, and other sources were examined in detail to determine the frequency of bleed air contamination incidents. The frequency was examined on an aircraft model basis with the intent of identifying makes and models with elevated frequencies of contamination events. The results reported herein may help investigators focus future studies of bleed air contamination incidents on a smaller number of aircraft. Incident frequency was normalized by the number of aircraft, number of flights, and flight hours for each model to account for the large variations in the number of aircraft of different models. The focus of the study was on aircraft models that are currently in service and are used by major airlines in the United States. Incidents examined include those related to smoke, oil odors, fumes, and any symptom that might be related to exposure to such contamination, reported by crew members between 2007 and 2012 for US-based carriers on domestic flights and on all international flights that either originated or terminated in the US. In addition to the reported frequency of incidents for different aircraft models, the analysis attempted to identify propulsion engines and auxiliary power units associated with aircraft that had higher frequencies of incidents. While substantial variations were found in incident frequency, contamination events were widely distributed across nearly all common aircraft models. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
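The normalization step matters because raw incident counts are dominated by fleet size: a model with more incidents can still have the lower rate. A small sketch with hypothetical fleet numbers (not the study's data):

```python
# Normalize incident counts by three exposure measures so models of very
# different fleet sizes can be compared (all figures hypothetical)
fleet = {
    # model:    (incidents, aircraft, flights, flight_hours)
    "Model A": (42, 310, 520_000, 1_150_000),
    "Model B": (9, 55, 60_000, 140_000),
}

rates = {}
for model, (inc, n_aircraft, n_flights, hours) in fleet.items():
    rates[model] = {
        "per_aircraft": inc / n_aircraft,
        "per_100k_flights": inc / n_flights * 1e5,
        "per_100k_hours": inc / hours * 1e5,
    }

for model, r in rates.items():
    print(model, {k: round(v, 2) for k, v in r.items()})
```

In this illustration Model B reports fewer incidents in absolute terms yet has the higher rate per 100,000 flights, which is exactly the distortion the study's normalization removes.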

  16. A Method to Quantify Plant Availability and Initiating Event Frequency Using a Large Event Tree, Small Fault Tree Model

    International Nuclear Information System (INIS)

    Kee, Ernest J.; Sun, Alice; Rodgers, Shawn; Popova, ElmiraV; Nelson, Paul; Moiseytseva, Vera; Wang, Eric

    2006-01-01

    South Texas Project uses a large fault tree to produce the scenarios (minimal cut sets) used in quantifying plant availability and event frequency predictions. The South Texas Project probabilistic risk assessment model, on the other hand, uses a large event tree, small fault tree model for quantifying core damage and radioactive release frequency predictions. The South Texas Project is converting its availability and event frequency model to a large event tree, small fault tree model in an effort to streamline application support and to provide additional detail in results. The availability and event frequency model, as well as the applications it supports (maintenance and operational risk management, system engineering health assessment, preventive maintenance optimization, and RIAM), are briefly described. A methodology to perform availability modeling in a large event tree, small fault tree framework is described in detail, as is how the methodology can be used to support South Texas Project maintenance and operations risk management. Differences from other fault tree methods and other recently proposed methods are discussed in detail. While the methods described are novel to the South Texas Project Risk Management program and to large event tree, small fault tree models, the concepts in the area of application support and availability modeling have wider applicability to the industry. (authors)
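Quantification from minimal cut sets typically uses the rare-event approximation: each cut set's frequency is its initiating-event frequency times the product of its basic-event probabilities, and the top-event frequency is approximately their sum. A toy sketch with hypothetical events (not the South Texas Project model):

```python
# Rare-event approximation over minimal cut sets (all values hypothetical)
basic_p = {"DG-A": 1e-2, "DG-B": 1e-2, "BATT": 5e-3, "HPI": 2e-3}
ie_freq = {"LOOP": 0.05, "SLOCA": 5e-4}       # initiating events, per year

cut_sets = [
    ("LOOP", ["DG-A", "DG-B", "BATT"]),       # station-blackout-like sequence
    ("SLOCA", ["HPI"]),                        # small LOCA with HPI failure
]

def cutset_freq(ie, events):
    f = ie_freq[ie]
    for e in events:
        f *= basic_p[e]
    return f

freqs = [cutset_freq(ie, ev) for ie, ev in cut_sets]
total = sum(freqs)                             # rare-event approximation
print(f"top-event frequency ≈ {total:.2e} /yr")
```

The large event tree, small fault tree approach keeps sequences like these explicit in the event tree, which is the additional result detail the conversion is after.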

  17. GHz band frequency hopping PLL-based frequency synthesizers

    Institute of Scientific and Technical Information of China (English)

    XU Yong; WANG Zhi-gong; GUAN Yu; XU Zhi-jun; QIAO Lu-feng

    2005-01-01

    In this paper we describe a fully integrated circuit containing all building blocks of a complete PLL-based synthesizer except for the low-pass filter (LPF). The frequency synthesizer is designed as a local oscillator for a frequency hopping (FH) transceiver operating up to 1.5 GHz. The architecture of the voltage-controlled oscillator (VCO) is optimized for better performance; a phase noise of -111.85 dBc/Hz @ 1 MHz and a tuning range of 250 MHz are obtained at a centre frequency of 1.35 GHz. A novel dual-modulus prescaler (DMP) is designed to achieve very low jitter and low power. The settling time of the PLL is 80 μs with a reference frequency of 400 kHz. The monolithic synthesizer integrates all main building blocks of the PLL except for the low-pass filter, has a maximum VCO output frequency of 1.5 GHz, and is fabricated in a 0.18 μm mixed-signal CMOS process. Low power dissipation, low phase noise, a large tuning range and fast settling time are achieved in this design.
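For an integer-N PLL the channel spacing equals the reference frequency, so hopping to a given output frequency is a matter of choosing the division ratio, which a dual-modulus P/P+1 prescaler with pulse-swallow counters realizes as N = P·M + A. A small sketch (the moduli 32/33 below are an assumption, not stated in the abstract):

```python
# Integer-N PLL: f_out = N * f_ref, channel spacing = f_ref = 400 kHz
f_ref = 400e3
P = 32                                   # assumed prescaler moduli P/P+1 = 32/33

def divider_settings(f_out):
    N = round(f_out / f_ref)
    M, A = divmod(N, P)                  # pulse-swallow split: N = P*M + A
    return N, M, A

N, M, A = divider_settings(1.35e9)       # centre frequency 1.35 GHz
print(N, M, A)                           # N = 3375
```

With a 400 kHz reference, covering the 250 MHz tuning range around 1.35 GHz corresponds to stepping N across 625 channels.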

  18. Attitudes of Montenegrin Consumers Toward Advertising Through Sport Among the Frequency of Watching Sports Events

    Directory of Open Access Journals (Sweden)

    Bojan Masanovic

    2018-01-01

    Full Text Available It is proposed that the attitudes potential consumers form toward advertising through sport can influence their decisions to purchase a particular advertiser’s product. For this reason it is important to analyse their general attitudes toward advertising through sport, and this investigation was aimed at gaining relevant knowledge about the attitudes of Montenegrin consumers toward advertising through sport in relation to the frequency of watching sports events. The sample included 342 respondents, divided into six subsample groups: consumers who do not watch sports events at all, and consumers who watch sports events for 1-30 minutes, 31-60 minutes, 61-90 minutes, 91-120 minutes, or more than 120 minutes during a typical day. The sample of variables contained a system of three general attitudes modelled on a seven-point Likert scale. The measurements were analysed by multivariate analysis (MANOVA) and univariate analysis (ANOVA) with a Post Hoc test. Based on the statistical analyses, significant differences were found at the multivariate level, as well as for all three variables at a significance level of p=.00. Hence, significant differences showed up between the attitudes of consumers toward advertising through sport depending on the frequency of watching sports events. These results are important for marketers, mostly because they cannot treat all potential consumers alike regardless of how often they watch sports events. On the other hand, this has been the case in previous investigations, and this observation presents relevant information.

  19. Attitudes of Turkish Consumers toward Advertising through Sport among the Frequency of Watching Sports Events

    Directory of Open Access Journals (Sweden)

    Bojan Masanovic

    2017-08-01

    Full Text Available It is proposed that the attitudes potential consumers form toward advertising through sport can influence their decisions to purchase a particular advertiser’s product. For this reason it is important to analyse their general attitudes toward advertising through sport, and this investigation was aimed at gaining relevant knowledge about the attitudes of Serbian consumers toward advertising through sport in relation to the frequency of watching sports events. The sample included 173 respondents, divided into six subsample groups: consumers who do not watch sports events at all, and consumers who watch sports events for 1-30 minutes, 31-60 minutes, 61-90 minutes, 91-120 minutes, or more than 120 minutes during a typical day. The sample of variables contained a system of three general attitudes modelled on a seven-point Likert scale. The measurements were analysed by multivariate analysis (MANOVA) and univariate analysis (ANOVA) with a Post Hoc test. Based on the statistical analyses, significant differences were found at the multivariate level, as well as between two out of three variables at a significance level of p=.05. Hence, significant differences showed up between the attitudes of consumers toward advertising through sport depending on the frequency of watching sports events. These results are important for marketers, mostly because they cannot treat all potential consumers alike regardless of how often they watch sports events. On the other hand, this has been the case in previous investigations, and this observation presents relevant information.

  20. Problems in event based engine control

    DEFF Research Database (Denmark)

    Hendricks, Elbert; Jensen, Michael; Chevalier, Alain Marie Roger

    1994-01-01

    Physically, a four-cycle spark ignition engine operates on the basis of four engine processes or events: intake, compression, ignition (or expansion) and exhaust. These events each occupy approximately 180° of crank angle. In conventional engine controllers, it is an accepted practice to sample the engine variables synchronously with these events (or submultiples of them). Such engine controllers are often called event-based systems. Unfortunately the main system noise (or disturbance) is also synchronous with the engine events: the engine pumping fluctuations. Since many electronic engine … problems on accurate air/fuel ratio control of a spark ignition (SI) engine.

  1. A Study on the Frequency of Initiating Event of OPR-1000 during Outage Periods

    Energy Technology Data Exchange (ETDEWEB)

    Hong Jae Beol; Jae, Moo Sung [Hanyang Univ., Seoul (Korea, Republic of)

    2013-10-15

    These sources of data did not reflect the latest event data from PWR outages in the initiating event frequencies. The Electric Power Research Institute (EPRI) in the USA collected data on loss of decay heat removal during outage from 1989 to 2009 and published a technical report. Domestic operating experience for LOOP is gathered in the Operational Performance Information System for Nuclear Power Plants (OPIS). To reduce conservatism and obtain completeness for the LPSD PSA, those data should be collected and used to update the frequencies. The frequencies of LOSDC and LOOP are re-evaluated using the EPRI and OPIS data in this paper, and quantification is conducted to recalculate the core damage frequency (CDF), since the rates have changed. The results are discussed below. To make an accurate estimate of the initiating events of the LPSD PSA, the event data were collected and the frequencies of initiating events were updated using a Bayesian approach. The CDF was evaluated through quantification; ΔCDF is -40% and the dominant contributor is the pressurizer PSV stuck-open event. Most of the event data in the EPRI TR were collected from the US nuclear power plant industry, and those data are not enough to evaluate outage risk precisely. Therefore, to reduce conservatism and obtain completeness for the LPSD PSA, licensee event reports and domestic data should be collected and reflected in the frequencies of the initiating events during outage.

  2. Attitudes of Consumers from Podgorica toward Advertising through Sport among the Frequency of Watching Sports Events

    Directory of Open Access Journals (Sweden)

    Nikola Milovic

    2018-04-01

    This investigation was aimed at gaining relevant knowledge about the attitudes of Podgorica consumers toward advertising through sport in relation to the frequency of watching sports events. The sample included 330 students from the Faculty of Economics in Podgorica, divided into six subsample groups: consumers who do not watch sports events at all, then consumers who watch sports events for 1-30 minutes, 31-60 minutes, 61-90 minutes, 91-120 minutes, and consumers who watch sports events for more than 120 minutes during a typical day. The sample of variables contained a system of three general attitudes modelled on a seven-point Likert scale. The measurement results were analyzed by multivariate analysis (MANOVA) and univariate analysis (ANOVA) with a Post Hoc test. The statistical analyses showed that significant differences occur at the multivariate level, as well as for all three variables, at a significance level of p=.00. Hence, it is interesting to highlight that significant differences were found between the attitudes of consumers toward advertising through sport depending on the frequency of watching sports events. The significant differences were found in two of three variables, and the consumers who do not watch sports events had much more negative attitudes toward advertising through sport.

  3. Adequate engineering for lowering the frequency of initiating events at Siemens/KWU

    International Nuclear Information System (INIS)

    Gremm, O.

    1988-01-01

    The analysis of the TMI and Chernobyl events shows weak points and deficits in the field of preventive safety features. This should not be forgotten during the ongoing discussion on severe accidents. The paper therefore explains special preventive safety features that resulted from the development of Siemens/KWU reactor technology. With respect to the present discussion on new reactor concepts, special attention is given to the inherent and passive safety features and to the engineering that results in a low core melt frequency. Such an analysis leads to knowledge modules that are based on experience from licensing procedures and plant operation and that should be the starting points for the reactor technology of the future.

  4. Mutational jackpot events generate effective frequency-dependent selection in adapting populations

    Science.gov (United States)

    Hallatschek, Oskar

    The site-frequency spectrum is one of the most easily measurable quantities that characterize the genetic diversity of a population. While most neutral models predict that site frequency spectra should decay with increasing frequency, a high-frequency uptick has been reported in many populations. Anomalies in the high-frequency tail are particularly unsettling because the highest frequencies can be measured with the greatest accuracy. Here, we show that an uptick in the spectrum of neutral mutations generally arises when mutant frequencies are dominated by rare jackpot events, mutational events with large descendant numbers. This leads to an effective pattern of frequency-dependent selection (or an unstable internal equilibrium at one-half frequency) that causes an accumulation of high-frequency polymorphic sites. We reproduce the known uptick occurring for recurrent hitchhiking (genetic draft) as well as rapid adaptation, and (in the future) generalize the shape of the high-frequency tail to other scenarios that are dominated by jackpot events, such as frequent range expansions. We also tackle (in the future) the inverse approach of using the high-frequency uptick to learn about the tail of the offspring number distribution. Positively selected alleles need to surpass, typically, an u… Supported by an NSF Career Award (PoLS), NIH NIGMS R01, and the Simons Foundation.

  5. Analysis of core damage frequency from internal events: Peach Bottom, Unit 2

    International Nuclear Information System (INIS)

    Kolaczkowski, A.M.; Lambright, J.A.; Ferrell, W.L.; Cathey, N.G.; Najafi, B.; Harper, F.T.

    1986-10-01

    This document contains the internal-event-initiated accident sequence analyses for Peach Bottom, Unit 2, one of the reference plants being examined as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 will document the risk of a selected group of nuclear power plants. As part of that work, this report contains the overall core damage frequency estimate for Peach Bottom, Unit 2, and the accompanying plant damage state frequencies. Sensitivity and uncertainty analyses provided additional insights regarding the dominant contributors to the Peach Bottom core damage frequency estimate. The mean core damage frequency at Peach Bottom was calculated to be 8.2E-6. Station blackout accidents (loss of all AC power) were found to dominate the overall results. Anticipated Transient Without Scram (ATWS) accidents were also found to be non-negligible contributors. The numerical results are largely driven by common-mode failure probability estimates and, to some extent, human error. Because of significant data and analysis uncertainties in these two areas (important, for instance, to the most dominant scenario in this study), it is recommended that the results of the uncertainty and sensitivity analyses be considered before any actions are taken based on this analysis.

  6. Frequency and Variance of Communication Characteristics in Aviation Safety Events

    NARCIS (Netherlands)

    Karanikas, Nektarios; Kaspers, Steffen

    2017-01-01

    In the aviation sector, communication problems have contributed to 70-80% of safety occurrences. However, to date it has not been depicted which communication aspects have affected aviation safety most frequently. Based on the literature, we developed a tool which includes communication characteristics…

  7. Human based roots of failures in nuclear events investigations

    Energy Technology Data Exchange (ETDEWEB)

    Ziedelis, Stanislovas; Noel, Marc; Strucic, Miodrag [Commission of the European Communities, Petten (Netherlands). European Clearinghouse on Operational Experience Feedback for Nuclear Power Plants

    2012-10-15

    This paper aims at improving the quality of event investigations in the nuclear industry through analysis of existing practices, identifying and removing existing Human and Organizational Factors (HOF) and management-related barriers. It presents the essential results of several studies performed by the European Clearinghouse on Operational Experience. The outcomes of the studies are based on a survey of currently existing event investigation practices typical of the nuclear industry in 12 European countries, as well as on insights from analysis of numerous event investigation reports. The system of operational experience feedback from information based on event investigation results is not effective enough to prevent, or even to decrease the frequency of, recurring events, due to existing methodological, HOF-related and/or knowledge-management-related constraints. Besides that, several latent root causes of unsuccessful event investigations are related to weaknesses in the safety culture of personnel and managers. These weaknesses include focus on costs or schedule, political manipulation, arrogance, ignorance, entitlement and/or autocracy. Upgrades in the safety culture of an organization's personnel, and especially of its senior management, seem to be an effective way to improvement. Increasing the competencies, capabilities and level of independence of event investigation teams, elaboration of comprehensive software, and ensuring a positive approach, adequate support and impartiality of management could also improve the quality of event investigations. (orig.)

  8. Human based roots of failures in nuclear events investigations

    International Nuclear Information System (INIS)

    Ziedelis, Stanislovas; Noel, Marc; Strucic, Miodrag

    2012-01-01

    This paper aims at improving the quality of event investigations in the nuclear industry through analysis of existing practices, identifying and removing existing Human and Organizational Factors (HOF) and management-related barriers. It presents the essential results of several studies performed by the European Clearinghouse on Operational Experience. The outcomes of the studies are based on a survey of currently existing event investigation practices typical of the nuclear industry in 12 European countries, as well as on insights from analysis of numerous event investigation reports. The system of operational experience feedback from information based on event investigation results is not effective enough to prevent, or even to decrease the frequency of, recurring events, due to existing methodological, HOF-related and/or knowledge-management-related constraints. Besides that, several latent root causes of unsuccessful event investigations are related to weaknesses in the safety culture of personnel and managers. These weaknesses include focus on costs or schedule, political manipulation, arrogance, ignorance, entitlement and/or autocracy. Upgrades in the safety culture of an organization's personnel, and especially of its senior management, seem to be an effective way to improvement. Increasing the competencies, capabilities and level of independence of event investigation teams, elaboration of comprehensive software, and ensuring a positive approach, adequate support and impartiality of management could also improve the quality of event investigations. (orig.)

  9. Potential Indoor Worker Exposure From Handling Area Leakage: Example Event Sequence Frequency Analysis

    International Nuclear Information System (INIS)

    Benke, Roland R.; Adams, George R.

    2008-01-01

    …potential event sequences. A hypothetical case is presented for failure of the HVAC exhaust system to provide confinement for contaminated air from otherwise normal operations. This paper presents an example calculation of frequencies for a potential event sequence involving HVAC system failure during otherwise routine wet transfer operations of spent nuclear fuel assemblies from an open container. For the simplified HVAC exhaust system model, the calculation indicated that the potential event sequence may or may not be a Category 1 event sequence, in light of current uncertainties (e.g., final HVAC system design and duration of facility operations). Categorization of potential event sequences is important because different regulatory requirements and performance objectives are specified based on the categorization of event sequences. A companion paper presents a dose calculation methodology and example calculations of indoor worker consequences for the posed example event sequence. Together, the two companion papers demonstrate capabilities for performing confirmatory calculations of frequency and consequence, which may assist the assessment of worker safety during a risk-informed regulatory review of a potential DOE license application.

  10. Estimation of average hazardous-event-frequency for allocation of safety-integrity levels

    International Nuclear Information System (INIS)

    Misumi, Y.; Sato, Y.

    1999-01-01

    …frequencies are derived based on the fault trees. Thus, new definitions regarding modes of operation for the allocation of Safety Integrity Levels, and shortcut methods for the estimation of hazardous-event frequencies, are proposed.

  11. Classification of hydromagnetic emissions based on frequency--time spectra

    International Nuclear Information System (INIS)

    Fukunishi, H.; Toya, T.; Koike, K.; Kuwashima, M.; Kawamura, M.

    1981-01-01

    Using 3035 hydromagnetic emission events observed in the frequency range 0.1-2.0 Hz at Syowa (L ≈ 6), HM emissions have been classified into eight subtypes based on their spectral structures, i.e., HM whistler, periodic HM emission, HM chorus, HM emission burst, IPDP, morning IPDP, Pc 1-2 band, and irregular HM emission. Each subtype has a preferential magnetic local time interval and frequency range of occurrence. Morning IPDP events and irregular HM emissions occur in the magnetic morning hours, while dispersive periodic HM emissions and HM emission bursts occur around magnetic local noon; HM chorus emissions then occur in the afternoon hours and IPDP events in the evening hours. Furthermore, the mid-frequencies of these emissions vary from high frequencies in the morning hours to low frequencies in the afternoon hours. On the basis of these results, the generation mechanisms of each subtype are discussed.

  12. Calculation of noninformative prior of reliability parameter and initiating event frequency with Jeffreys method

    International Nuclear Information System (INIS)

    He Jie; Zhang Binbin

    2013-01-01

    In the probabilistic safety assessment (PSA) of nuclear power plants, there are few historical records on some initiating event frequencies or component failures in industry. In order to determine the noninformative priors of such reliability parameters and initiating event frequencies, the Jeffreys method in Bayesian statistics was employed. The mathematical mechanism of the Jeffreys prior and the simplified constrained noninformative distribution (SCNID) were elaborated in this paper. The Jeffreys noninformative formulas and the credible intervals of the Gamma-Poisson and Beta-Binomial models were introduced. As an example, the small break loss-of-coolant accident (SLOCA) was employed to show the application of the Jeffreys prior in determining an initiating event frequency. The result shows that the Jeffreys method is an effective method for noninformative prior calculation. (authors)
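
    The Jeffreys update for a Poisson-distributed event count can be sketched as follows: with no informative history, observing n events in exposure T yields a Gamma(n + 0.5, T) posterior for the rate. The credible interval here is obtained by Monte Carlo sampling with the standard library; the event count and exposure are illustrative, not the paper's SLOCA data.

```python
import random

# Jeffreys noninformative prior for a Poisson rate: the posterior after
# observing n events in exposure T is Gamma(n + 0.5, rate=T).
# 90% credible bounds are estimated by Monte Carlo (stdlib only).

def jeffreys_posterior(n_events, exposure, n_samples=50_000, seed=1):
    rng = random.Random(seed)
    shape = n_events + 0.5
    # random.gammavariate takes (shape, scale); scale = 1/rate = 1/T
    draws = sorted(rng.gammavariate(shape, 1.0 / exposure)
                   for _ in range(n_samples))
    mean = shape / exposure
    lo = draws[int(0.05 * n_samples)]
    hi = draws[int(0.95 * n_samples)]
    return mean, (lo, hi)

# Illustrative: one rare event observed in 500 reactor-years.
mean, (lo, hi) = jeffreys_posterior(1, 500.0)
print(f"mean = {mean:.2e}/yr, 90% interval = ({lo:.2e}, {hi:.2e})")
```

    The posterior mean (n + 0.5)/T matches the Gamma-Poisson formulas referred to in the abstract; a closed-form interval would use gamma quantiles instead of sampling.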

  13. Variability and trends in dry day frequency and dry event length in the southwestern United States

    Science.gov (United States)

    McCabe, Gregory J.; Legates, David R.; Lins, Harry F.

    2010-01-01

    Daily precipitation from 22 National Weather Service first-order weather stations in the southwestern United States for water years 1951 through 2006 are used to examine variability and trends in the frequency of dry days and dry event length. Dry events with minimum thresholds of 10 and 20 consecutive days of precipitation with less than 2.54 mm are analyzed. For water years and cool seasons (October through March), most sites indicate negative trends in dry event length (i.e., dry event durations are becoming shorter). For the warm season (April through September), most sites also indicate negative trends; however, more sites indicate positive trends in dry event length for the warm season than for water years or cool seasons. The larger number of sites indicating positive trends in dry event length during the warm season is due to a series of dry warm seasons near the end of the 20th century and the beginning of the 21st century. Overall, a large portion of the variability in dry event length is attributable to variability of the El Niño–Southern Oscillation, especially for water years and cool seasons. Our results are consistent with analyses of trends in discharge for sites in the southwestern United States, an increased frequency in El Niño events, and positive trends in precipitation in the southwestern United States.
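
    The dry-event definition above (runs of at least 10 or 20 consecutive days with precipitation below 2.54 mm) can be sketched as a simple run-length scan; the daily series below is synthetic, for illustration only.

```python
# Count dry events: runs of at least `min_len` consecutive days with
# daily precipitation below 2.54 mm (0.1 inch).

DRY_THRESHOLD_MM = 2.54

def dry_events(daily_precip_mm, min_len=10):
    """Return (count, lengths) of dry runs of at least min_len days."""
    lengths, run = [], 0
    for p in daily_precip_mm:
        if p < DRY_THRESHOLD_MM:
            run += 1
        else:
            if run >= min_len:
                lengths.append(run)
            run = 0
    if run >= min_len:          # a run extending to the end of record
        lengths.append(run)
    return len(lengths), lengths

# Synthetic record: 12 dry days, a wet day, 9 dry days (too short),
# a wet day, then 15 dry days at the end of the record.
series = [0.0] * 12 + [5.0] + [1.0] * 9 + [4.0] + [0.2] * 15
count, lengths = dry_events(series, min_len=10)
print(count, lengths)  # 2 [12, 15]
```

    Trends in the resulting event lengths per season are then what the study tests for sign at each station.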

  14. Ground motion: frequency of occurrence versus amplitude of disturbing transient events

    International Nuclear Information System (INIS)

    Werner, K.L.

    1983-01-01

    Successful collider operation requires that ground motion not exceed certain tolerances. In this note it is pointed out that on occasion these tolerances are exceeded. The frequency of such events and their amplitudes, measured as a function of time of day, have been measured. An examination of the data leads one to conclude that most events are of cultural (i.e., man-made) origin. 2 references, 20 figures

  15. Merging expert and empirical data for rare event frequency estimation: Pool homogenisation for empirical Bayes models

    International Nuclear Information System (INIS)

    Quigley, John; Hardman, Gavin; Bedford, Tim; Walls, Lesley

    2011-01-01

    Empirical Bayes provides one approach to estimating the frequency of rare events as a weighted average of the frequencies of an event and a pool of events. The pool will draw upon, for example, events with similar precursors. The higher the degree of homogeneity of the pool, the more accurate the Empirical Bayes estimator will be. We propose and evaluate a new method using homogenisation factors under the assumption that events are generated from a Homogeneous Poisson Process. The homogenisation factors are scaling constants, which can be elicited through structured expert judgement and used to align the frequencies of different events, hence homogenising the pool. The estimation error relative to the homogeneity of the pool is examined theoretically, indicating that reduced error is associated with larger pool homogeneity. The effects of misspecified expert assessments of the homogenisation factors are examined theoretically and through simulation experiments. Our results show that the proposed Empirical Bayes method using homogenisation factors is robust under different degrees of misspecification.
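
    A minimal sketch of the idea, not the paper's exact estimator: each pooled event contributes its count and exposure, weighted by an elicited homogenisation factor, and the pooled rate is then combined with the target event's own rate in a shrinkage (weighted) average. The shrinkage weight and all numbers are illustrative assumptions.

```python
# Homogenised pooling sketch.  Each pooled event i has count n_i in
# exposure t_i; an elicited scaling factor h_i weights its contribution
# so the pool better reflects the target event's frequency.

def eb_estimate(n_target, t_target, pool, shrink=0.5):
    """pool: list of (n_i, t_i, h_i); h_i is the elicited factor."""
    pooled_rate = (sum(h * n for n, t, h in pool)
                   / sum(h * t for n, t, h in pool))
    own_rate = n_target / t_target
    # Weighted average of the event's own rate and the homogenised pool.
    return shrink * own_rate + (1.0 - shrink) * pooled_rate

# Target: 0 events in 10 years; two analogous events in the pool,
# the second down-weighted by an elicited factor of 0.5.
print(eb_estimate(0, 10.0, [(2, 100, 1.0), (4, 50, 0.5)]))  # 0.016
```

    In the paper's framework the shrinkage weight itself is derived from the data rather than fixed, so this fixed `shrink` is purely for illustration.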

  16. Twitter data analysis: temporal and term frequency analysis with real-time event

    Science.gov (United States)

    Yadav, Garima; Joshi, Mansi; Sasikala, R.

    2017-11-01

    Over the past few years, the World Wide Web (www) has become a prominent and huge source of user-generated content and opinionative data. Among the various social media, Twitter has gained popularity as it offers a fast and effective way of sharing users' perspectives on critical and other issues in different domains. As this data is generated at scale in the cloud, it has opened doors for researchers in the field of data science and analysis. There are various domains, such as the 'Political', 'Entertainment' and 'Business' domains. Twitter also provides several APIs for developers: (1) the Search API, which focuses on old tweets; (2) the REST API, which focuses on user details and allows collecting user profiles, friends and followers; and (3) the Streaming API, which collects details like tweets, hashtags and geolocations. In our work we access the Streaming API in order to fetch real-time tweets for a dynamically unfolding event. For this we focus on the 'Entertainment' domain, especially 'Sports', as the IPL T20 is currently the trending ongoing event. We collect this large volume of tweets and store it in a MongoDB database, where the tweets are stored in JSON document format. On these documents we perform time-series analysis and term frequency analysis using techniques such as filtering and information extraction for text mining, which fulfils our objective of finding interesting moments in the temporal data of the event and ranking players or teams by popularity, which helps people understand key influencers on the social media platform.
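
    The term-frequency step on JSON tweet documents can be sketched with the standard library alone; the field name `"text"`, the stop-word list and the sample tweets are assumptions for illustration, not the study's data or schema.

```python
import json
from collections import Counter

# Minimal term-frequency pass over tweets stored as JSON documents
# (one object per line), mirroring the filtering step described above.

STOP = {"the", "a", "rt", "to", "in", "of"}

def term_frequencies(json_lines, top=3):
    counts = Counter()
    for line in json_lines:
        text = json.loads(line).get("text", "")
        counts.update(w for w in text.lower().split()
                      if w not in STOP and w.isalpha())
    return counts.most_common(top)

tweets = [
    '{"text": "Kohli finishes the chase in style"}',
    '{"text": "What a chase by Kohli tonight"}',
    '{"text": "IPL chase complete"}',
]
print(term_frequencies(tweets, top=2))  # [('chase', 3), ('kohli', 2)]
```

    Binning the same counts by tweet timestamp would give the time-series view used to spot interesting moments during the event.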

  17. Probe-controlled soliton frequency shift in the regime of optical event horizon

    DEFF Research Database (Denmark)

    Gu, Jie; Guo, Hairun; Wang, Shaofei

    2015-01-01

    In the optical analogy of the event horizon, temporal pulse collisions and mutual interactions occur mainly between an intense solitary wave (soliton) and a dispersive probe wave. In this regime, we numerically investigate the probe-controlled soliton frequency shift as well as the soliton self…

  18. The development on the methodology of the initiating event frequencies for liquid metal reactor KALIMER

    International Nuclear Information System (INIS)

    Jeong, K. S.; Yang, Z. A.; Ah, Y. B.; Jang, W. P.; Jeong, H. Y.; Ha, K. S.; Han, D. H.

    2002-01-01

    In this paper, the PSA methodologies of PRISM, light water reactors and pressurized heavy water reactors are analyzed, and a methodology for the initiating events of KALIMER is suggested. In addition, the reliability assessment of the assumptions for the pipe corrosion frequency is set up. The reliability assessment of the passive safety system, one of the main safety systems of KALIMER, is discussed and analyzed.

  19. Memory for past public events depends on retrieval frequency but not memory age in Alzheimer's disease.

    Science.gov (United States)

    Müller, Stephan; Mychajliw, Christian; Hautzinger, Martin; Fallgatter, Andreas J; Saur, Ralf; Leyhe, Thomas

    2014-01-01

    Alzheimer's disease (AD) is characterized by retrograde memory deficits primarily caused by dysfunction of the hippocampal complex. Unresolved questions exist concerning the time course of hippocampal involvement in conscious recollection of declarative knowledge, as reports of temporal gradients of retrograde amnesia have been inconclusive. The aim of this study was to examine whether the extent and severity of retrograde amnesia is mediated by retrieval frequency or, in contrast, whether it depends on the age of the memory according to the assumptions of the main current theories of memory formation. We compared recall of past public events in patients with AD and healthy control (HC) individuals using the Historic Events Test (HET). The HET assesses knowledge about famous public events of the past 60 years divided into four time segments and consists of subjective memory rating, dating accuracy, and contextual memory tasks. Although memory for public events was impaired in AD patients, there was a strong effect of retrieval frequency across all time segments and both groups. As AD and HC groups derived similar benefits from greater retrieval frequency, cortical structures other than the hippocampal complex may mediate memory retrieval. These findings suggest that more frequently retrieved events and facts become more independent of the hippocampal complex and thus better protected against early damage of AD. This could explain why cognitive activity may delay the onset of memory decline in persons who develop AD.

  20. Identification of homogeneous regions for rainfall regional frequency analysis considering typhoon event in South Korea

    Science.gov (United States)

    Heo, J. H.; Ahn, H.; Kjeldsen, T. R.

    2017-12-01

    South Korea is prone to large, and often disastrous, rainfall events caused by a mixture of monsoon and typhoon rainfall phenomena. Traditionally, however, regional frequency analysis models did not consider this mixture of phenomena when fitting probability distributions, potentially underestimating the risk posed by the more extreme typhoon events. Using long-term observed records of extreme rainfall from 56 sites combined with detailed information on the timing and spatial impact of past typhoons from the Korea Meteorological Administration (KMA), this study developed and tested a new mixture model for frequency analysis of two different phenomena: events occurring regularly every year (monsoon) and events occurring only in some years (typhoon). The available annual maximum 24-hour rainfall data were divided into two sub-samples corresponding to years where the annual maximum is from either (1) a typhoon event, or (2) a non-typhoon event. Then, a three-parameter GEV distribution was fitted to each sub-sample, along with a weighting parameter characterizing the proportion of historical events associated with typhoons. Spatial patterns of the model parameters were analyzed and showed that typhoon events are less commonly associated with annual maximum rainfall in the north-west part of the country (the Seoul area) and more prevalent in the southern and eastern parts, leading to the formation of two distinct typhoon regions: (1) North-West, and (2) Southern and Eastern. Using a leave-one-out procedure, the new regional frequency model was tested and compared to a more traditional index flood method. The results showed that the impact of typhoons on design events might previously have been underestimated in the Seoul area. This suggests that the use of the mixture model should be preferred where the typhoon phenomenon is less frequent, and thus can have a significant effect on the rainfall-frequency curve. This research was supported by a grant (2017-MPSS31…
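
    The two-component model described above can be sketched as a mixture of GEV distributions: the annual maximum comes from a non-typhoon GEV with probability (1 - p) and a typhoon GEV with probability p. All parameter values below are illustrative assumptions, not the study's fitted values.

```python
import math

# GEV CDF with location mu, scale sigma, shape xi (Gumbel limit xi -> 0).
def gev_cdf(x, mu, sigma, xi):
    if abs(xi) < 1e-9:
        return math.exp(-math.exp(-(x - mu) / sigma))
    t = 1.0 + xi * (x - mu) / sigma
    if t <= 0:                       # outside the distribution's support
        return 0.0 if xi > 0 else 1.0
    return math.exp(-t ** (-1.0 / xi))

# Mixture: non-typhoon years with weight (1 - p), typhoon years with p.
def mixture_cdf(x, p_typhoon, nontyphoon, typhoon):
    return ((1.0 - p_typhoon) * gev_cdf(x, *nontyphoon)
            + p_typhoon * gev_cdf(x, *typhoon))

# Non-exceedance probability of 350 mm with assumed parameters
# (mu, sigma, xi) and an assumed typhoon proportion p = 0.4.
F = mixture_cdf(350.0, 0.4, (120.0, 40.0, 0.05), (180.0, 70.0, 0.15))
print(f"non-exceedance probability: {F:.4f}")
```

    Design rainfall for a T-year return period is then the x solving F(x) = 1 - 1/T; the heavier typhoon component lifts the upper tail relative to a single fitted GEV.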

  1. Financial system loss as an example of high consequence, high frequency events

    Energy Technology Data Exchange (ETDEWEB)

    McGovern, D.E.

    1996-07-01

    Much work has been devoted to high consequence events with a low frequency of occurrence. Characteristic of these events are bridge failures (such as that of the Tacoma Narrows), building failures (such as the collapse of a walkway at a Kansas City hotel), or the compromise of a major chemical containment system (such as at Bhopal, India). Such events, although rare, have an extreme personal, societal, and financial impact. An interesting variation is demonstrated by financial losses due to fraud and abuse in the money management system. The impact can be huge, entailing very high aggregate costs, but these are the result of the contributions of many small attacks and not the result of a single (or few) massive events. Public awareness is raised through publicized events such as the junk bond fraud perpetrated by Milken or the gross mismanagement in the failure of Barings Bank through unsupervised trading activities by Leeson in Singapore. These events, although seemingly large (financial losses may be on the order of several billion dollars), are but small contributors to the estimated $114 billion loss to all types of financial fraud in 1993. This paper explores the magnitude of financial system losses and identifies new areas for the analysis of high consequence events, including the potential effect of malevolent intent.

  2. Mean occurrence frequency and temporal risk analysis of solar particle events

    International Nuclear Information System (INIS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.; Wilson, John W.

    2006-01-01

    The protection of astronauts from space radiation is required on future exploratory-class and long-duration missions. For accurate projections of radiation doses, a solar cycle statistical model, which quantifies the progression level within the cycle, has been developed. The resulting projection of future cycles is then applied to estimate the mean frequency of solar particle events (SPEs) in the near future using a power-law function of sunspot number. The detailed temporal behaviors of a recent large event and of two historically large events, the August 1972 SPE and the November 1960 SPE, are analyzed for dose rate and cumulative dose equivalent at sensitive organs. Polyethylene-shielded 'storm shelters' inside spacecraft are studied to limit astronauts' total exposure at a sensitive site to within 10 cSv from a large event, as a potential goal that fulfills the ALARA (as low as reasonably achievable) requirement.
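
    The power-law relation mentioned above can be sketched as f = a·S^b for sunspot number S; summing f over projected cycle years gives an expected event count under a Poisson assumption. The coefficients and sunspot numbers below are assumptions for illustration, not the paper's fitted values.

```python
# Illustrative power-law model of mean SPE frequency vs. sunspot number.

def spe_mean_frequency(sunspot_number, a=0.003, b=1.3):
    """Assumed mean number of large SPEs per year at sunspot number S."""
    return a * sunspot_number ** b

def expected_events(sunspot_numbers_by_year):
    """Expected SPE count over a mission, assuming a Poisson process."""
    return sum(spe_mean_frequency(s) for s in sunspot_numbers_by_year)

# Hypothetical 4-year mission spanning the rise of a solar cycle.
print(round(expected_events([40, 90, 140, 110]), 2))
```

    The same expected count feeds directly into shelter sizing: the larger the projected event frequency, the tighter the per-event dose budget needed to stay under the 10 cSv goal.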

  3. Frequency shifting at fiber-optical event horizons: The effect of Raman deceleration

    International Nuclear Information System (INIS)

    Robertson, S.; Leonhardt, U.

    2010-01-01

    Pulses in fibers establish analogs of the event horizon [Philbin et al., Science 319, 1367 (2008)]. At a group-velocity horizon, the frequency of a probe wave is shifted. We present a theoretical model of this frequency shifting, taking into account the deceleration of the pulse caused by the Raman effect. The theory shows that the probe-wave spectrum is sensitive to details of the probe-pulse interaction. Our results indicate an additional loss mechanism in the experiment [Philbin et al., Science 319, 1367 (2008)] that has not been accounted for. Our analysis is also valid for more general cases of the interaction of dispersive waves with decelerated solitons.

  4. Measurement of $B^0$ Mixing Frequency Using a New Probability Based Self-Tagging Algorithm Applied to Inclusive Lepton Events from $p\bar{p}$ Collisions at $\sqrt{s}$ = 1.8 TeV

    Energy Technology Data Exchange (ETDEWEB)

    Shah, Tushar [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2000-07-01

    We present a measurement of the B_d mixing frequency performed in an inclusive lepton sample, B → l + X. A secondary vertex identifies a B meson decay, and a high-pT lepton determines the flavor at the time of decay.

  5. Event Recognition Based on Deep Learning in Chinese Texts.

    Directory of Open Access Journals (Sweden)

    Yajun Zhang

    Event recognition is the most fundamental and critical task in event-based natural language processing systems. Existing event recognition methods based on rules and shallow neural networks have certain limitations. For example, extracting features using methods based on rules is difficult; methods based on shallow neural networks converge too quickly to a local minimum, resulting in low recognition precision. To address these problems, we propose the Chinese emergency event recognition model based on deep learning (CEERM). Firstly, we use a word segmentation system to segment sentences. According to event elements labeled in the CEC 2.0 corpus, we classify words into five categories: trigger words, participants, objects, time and location. Each word is vectorized according to the following six feature layers: part of speech, dependency grammar, length, location, distance between trigger word and core word and trigger word frequency. We obtain deep semantic features of words by training a feature vector set using a deep belief network (DBN), then analyze those features in order to identify trigger words by means of a back propagation neural network. Extensive testing shows that the CEERM achieves excellent recognition performance, with a maximum F-measure value of 85.17%. Moreover, we propose the dynamic-supervised DBN, which adds supervised fine-tuning to a restricted Boltzmann machine layer by monitoring its training performance. Test analysis reveals that the new DBN improves recognition performance and effectively controls the training time. Although the F-measure increases to 88.11%, the training time increases by only 25.35%.

  6. Event Recognition Based on Deep Learning in Chinese Texts.

    Science.gov (United States)

    Zhang, Yajun; Liu, Zongtian; Zhou, Wen

    2016-01-01

    Event recognition is the most fundamental and critical task in event-based natural language processing systems. Existing event recognition methods based on rules and shallow neural networks have certain limitations. For example, extracting features using methods based on rules is difficult; methods based on shallow neural networks converge too quickly to a local minimum, resulting in low recognition precision. To address these problems, we propose the Chinese emergency event recognition model based on deep learning (CEERM). Firstly, we use a word segmentation system to segment sentences. According to event elements labeled in the CEC 2.0 corpus, we classify words into five categories: trigger words, participants, objects, time and location. Each word is vectorized according to the following six feature layers: part of speech, dependency grammar, length, location, distance between trigger word and core word and trigger word frequency. We obtain deep semantic features of words by training a feature vector set using a deep belief network (DBN), then analyze those features in order to identify trigger words by means of a back propagation neural network. Extensive testing shows that the CEERM achieves excellent recognition performance, with a maximum F-measure value of 85.17%. Moreover, we propose the dynamic-supervised DBN, which adds supervised fine-tuning to a restricted Boltzmann machine layer by monitoring its training performance. Test analysis reveals that the new DBN improves recognition performance and effectively controls the training time. Although the F-measure increases to 88.11%, the training time increases by only 25.35%.

  7. Survey on Prognostics Techniques for Updating Initiating Event Frequency in PSA

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeonmin; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of)]

    2015-05-15

    One of the applications of PSA is the risk monitor, a real-time analysis tool that determines the current risk based on the actual state of components and systems. To make such use more effective, methodologies that manipulate data from prognostics have previously been suggested. Generally, prognostics comprehensively includes not only prediction but also monitoring and diagnostics, and prognostic methods require condition monitoring. When PHM is applied to a PSA model, the latest condition of NPPs can be identified more clearly. To reduce conservatism and uncertainty, we previously suggested a concept that updates the initiating event frequency in a PSA model by using a Bayesian approach, which is one of the prognostics techniques. That research showed the possibility of updating a PSA more accurately by using data. In reliability theory, the bathtub curve divides into three parts (infant failure, constant random failure, and wear-out failure). In this paper, in order to investigate the applicability of prognostic methods to updating quantitative data in a PSA model, we present the OLM acceptance criteria from NUREG, a concept of how to use prognostics in PSA, and the enabling prognostic techniques. The motivation is that improved predictive capabilities using existing monitoring systems, data, and information will enable more accurate equipment risk assessment for improved decision-making.
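The Bayesian update of an initiating event frequency mentioned in the abstract is conventionally done with a gamma-Poisson conjugate pair (events modeled as a Poisson process in operating time). A minimal sketch; the prior parameters and plant data are illustrative, not taken from the survey:

```python
def update_ie_frequency(alpha_prior, beta_prior, n_events, exposure_years):
    """Conjugate Bayesian update of an initiating-event frequency.

    Occurrences are modeled as a Poisson process in operating time, so a
    Gamma(alpha, beta) prior on the rate (events/year) is conjugate and
    the posterior is Gamma(alpha + n_events, beta + exposure_years).
    """
    alpha_post = alpha_prior + n_events
    beta_post = beta_prior + exposure_years
    return alpha_post, beta_post, alpha_post / beta_post

# Illustrative generic-industry prior (mean 0.01/yr) updated with
# 1 plant-specific event observed in 20 reactor-years.
alpha, beta, mean = update_ie_frequency(0.5, 50.0, n_events=1, exposure_years=20.0)
```

The posterior mean (alpha + n)/(beta + T) shrinks the plant-specific estimate n/T toward the generic prior mean, which is the pooling of plant and industry experience the record describes.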

  8. Survey on Prognostics Techniques for Updating Initiating Event Frequency in PSA

    International Nuclear Information System (INIS)

    Kim, Hyeonmin; Heo, Gyunyoung

    2015-01-01

    One of the applications of PSA is the risk monitor, a real-time analysis tool that determines the current risk based on the actual state of components and systems. To make such use more effective, methodologies that manipulate data from prognostics have previously been suggested. Generally, prognostics comprehensively includes not only prediction but also monitoring and diagnostics, and prognostic methods require condition monitoring. When PHM is applied to a PSA model, the latest condition of NPPs can be identified more clearly. To reduce conservatism and uncertainty, we previously suggested a concept that updates the initiating event frequency in a PSA model by using a Bayesian approach, which is one of the prognostics techniques. That research showed the possibility of updating a PSA more accurately by using data. In reliability theory, the bathtub curve divides into three parts (infant failure, constant random failure, and wear-out failure). In this paper, in order to investigate the applicability of prognostic methods to updating quantitative data in a PSA model, we present the OLM acceptance criteria from NUREG, a concept of how to use prognostics in PSA, and the enabling prognostic techniques. The motivation is that improved predictive capabilities using existing monitoring systems, data, and information will enable more accurate equipment risk assessment for improved decision-making.

  9. Analysis of core damage frequency from internal events: Methodology guidelines: Volume 1

    International Nuclear Information System (INIS)

    Drouin, M.T.; Harper, F.T.; Camp, A.L.

    1987-09-01

    NUREG-1150 examines the risk to the public from a selected group of nuclear power plants. This report describes the methodology used to estimate the internal event core damage frequencies of four plants in support of NUREG-1150. In principle, this methodology is similar to methods used in past probabilistic risk assessments; however, based on past studies and using analysts who are experienced in these techniques, the analyses can be focused in certain areas. In this approach, only the most important systems and failure modes are modeled in detail. Further, the data and human reliability analyses are simplified, with emphasis on the most important components and human actions. Using these methods, an analysis can be completed in six to nine months using two to three full-time systems analysts and part-time personnel in other areas, such as data analysis and human reliability analysis. This is significantly faster and less costly than previous analyses and provides most of the insights that are obtained by the more costly studies. 82 refs., 35 figs., 27 tabs.

  10. PSA-based evaluation and rating of operational events

    International Nuclear Information System (INIS)

    Gomez Cobo, A.

    1997-01-01

    The presentation discusses the PSA-based evaluation and rating of operational events, including the following: historical background, procedures for event evaluation using PSA, use of PSA for event rating, current activities

  11. Frequency Based Fault Detection in Wind Turbines

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Stoustrup, Jakob

    2014-01-01

    In order to obtain a lower cost of energy for wind turbines, fault detection and accommodation are important. Expensive condition monitoring systems are often used to monitor the condition of rotating and vibrating system parts; one example is the gearbox in a wind turbine. This system is operated...... in parallel to the control system, using different computers and additional, often expensive, sensors. In this paper a simple filter-based algorithm is proposed to detect changes in a resonance frequency in a system, exemplified with faults resulting in changes in the resonance frequency of the wind turbine...... gearbox. Only the generator speed measurement, which is available in even simple wind turbine control systems, is used as input. Consequently the proposed scheme does not need additional sensors and computers for monitoring the condition of the wind turbine gearbox. The scheme is evaluated on a wide-spread wind...
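The kind of detection the record describes can be sketched as follows: estimate the dominant resonance from the spectrum of the generator speed signal and flag a sustained drift. The band limits, Welch settings, and synthetic fault signature below are assumptions for illustration, not the authors' algorithm:

```python
import numpy as np
from scipy.signal import welch

def resonance_frequency(speed, fs, fmin=1.0, fmax=20.0):
    """Estimate the dominant resonance frequency (Hz) of a speed signal
    from its Welch power spectrum, restricted to a band of interest."""
    f, pxx = welch(speed, fs=fs, nperseg=1024)
    band = (f >= fmin) & (f <= fmax)
    return f[band][np.argmax(pxx[band])]

fs = 100.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)
# Synthetic generator speed ripple: a 6 Hz resonance when healthy, which a
# gearbox fault is assumed to shift down to 5.2 Hz (illustrative numbers).
healthy = np.sin(2 * np.pi * 6.0 * t) + 0.3 * rng.standard_normal(t.size)
faulty = np.sin(2 * np.pi * 5.2 * t) + 0.3 * rng.standard_normal(t.size)

f_h = resonance_frequency(healthy, fs)
f_f = resonance_frequency(faulty, fs)
drift = abs(f_f - f_h)  # a drift beyond a set threshold would raise a fault flag
```

Because only the generator speed enters, this mirrors the paper's point that no extra sensors are needed; a production scheme would track the estimate over time rather than compare two batches.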

  12. The differential effects of increasing frequency and magnitude of extreme events on coral populations.

    Science.gov (United States)

    Fabina, Nicholas S; Baskett, Marissa L; Gross, Kevin

    2015-09-01

    Extreme events, which have profound ecological consequences, are changing in both frequency and magnitude with climate change. Because extreme temperatures induce coral bleaching, we can explore the relative impacts of changes in frequency and magnitude of high temperature events on coral reefs. Here, we combined climate projections and a dynamic population model to determine how changing bleaching regimes influence coral persistence. We additionally explored how coral traits and competition with macroalgae mediate changes in bleaching regimes. Our results predict that severe bleaching events reduce coral persistence more than frequent bleaching. Corals with low adult mortality and high growth rates are successful when bleaching is mild, but bleaching resistance is necessary to persist when bleaching is severe, regardless of frequency. The existence of macroalgae-dominated stable states reduces coral persistence and changes the relative importance of coral traits. Building on previous studies, our results predict that management efforts may need to prioritize protection of "weaker" corals with high adult mortality when bleaching is mild, and protection of "stronger" corals with high bleaching resistance when bleaching is severe. In summary, future reef projections and conservation targets depend on both local bleaching regimes and biodiversity.
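The qualitative claim that rare-but-severe bleaching depresses coral cover more than frequent-but-mild bleaching can be reproduced with a toy discrete-time logistic cover model. This is not the authors' population model; the growth rate, mortalities, and event schedules are all illustrative:

```python
def coral_cover_trajectory(years, bleach_years, mortality, growth=0.1, c0=0.5):
    """Toy logistic coral-cover model with episodic bleaching mortality.

    `bleach_years` lists the years in which a bleaching event occurs;
    `mortality` is the fraction of cover killed per event (magnitude).
    """
    cover = c0
    for year in range(years):
        cover += growth * cover * (1.0 - cover)   # logistic recovery
        if year in bleach_years:
            cover *= (1.0 - mortality)            # bleaching mortality
    return cover

# Frequent-but-mild regime vs rare-but-severe regime over 50 years.
frequent_mild = coral_cover_trajectory(50, set(range(0, 50, 5)), mortality=0.2)
rare_severe = coral_cover_trajectory(50, set(range(0, 50, 25)), mortality=0.9)
```

With slow recovery, the severe regime ends at markedly lower cover even though it has far fewer events, matching the direction of the abstract's prediction.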

  13. Low-Frequency Type III Bursts and Solar Energetic Particle Events

    Science.gov (United States)

    Gopalswamy, Nat; Makela, Pertti

    2010-01-01

    We analyzed the coronal mass ejections (CMEs), flares, and type II radio bursts associated with a set of six low frequency (15 min) type III bursts normally used to define these bursts. All but one of the type III bursts were not associated with a type II burst in the metric or longer wavelength domains. The burst without a type II burst also lacked a solar energetic particle (SEP) event at energies >25 MeV. The 1-MHz duration of the type III burst (28 min) is near the median value of type III durations found for gradual SEP events and ground level enhancement (GLE) events. Yet, there was no sign of SEP events. On the other hand, two other type III bursts from the same active region had similar durations but were accompanied by WAVES type II bursts; these bursts were also accompanied by SEP events detected by SOHO/ERNE. The CMEs were of similar speeds, and the flares were also of similar size and duration. This study suggests that the type III burst duration may not be a good indicator of an SEP event.

  14. Towards Real-Time Detection of Gait Events on Different Terrains Using Time-Frequency Analysis and Peak Heuristics Algorithm.

    Science.gov (United States)

    Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin

    2016-10-01

    Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered as a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and are less reliable in the cases of up stair and down stair terrains. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real-time based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, and then the determination of the peaks of jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects when they were walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses.
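A rough sketch of the jerk-plus-peak-heuristics idea described above: differentiate the acceleration into jerk, threshold it, and enforce a minimum inter-step interval. The threshold, interval, and synthetic stride signal are assumptions for illustration, not the authors' tuned heuristics:

```python
import numpy as np

def detect_gait_peaks(accel, fs, min_interval=0.4):
    """Differentiate acceleration into jerk, then pick salient jerk peaks.

    A relative height threshold and a minimum inter-peak interval stand in
    for the paper's peak heuristics (plausible values, not the authors')."""
    jerk = np.gradient(accel) * fs
    height = 0.5 * jerk.max()
    gap = int(min_interval * fs)
    candidates = [i for i in range(1, len(jerk) - 1)
                  if jerk[i] > height and jerk[i - 1] <= jerk[i] > jerk[i + 1]]
    peaks = []
    for i in candidates:                 # greedy spacing constraint
        if not peaks or i - peaks[-1] >= gap:
            peaks.append(i)
    return peaks

fs = 100.0
t = np.arange(0, 5, 1 / fs)
rng = np.random.default_rng(1)
# Synthetic shank acceleration: one sharp impact per 1-s stride plus noise.
accel = np.exp(-((t % 1.0) - 0.3) ** 2 / 0.005) + 0.01 * rng.standard_normal(t.size)
peaks = detect_gait_peaks(accel, fs)
```

Each impact produces one jerk peak on its rising edge, so the five simulated strides yield five detections; real HS/TO discrimination would add sign and timing rules on top of this.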

  15. Increased frequency of FBN1 truncating and splicing variants in Marfan syndrome patients with aortic events.

    Science.gov (United States)

    Baudhuin, Linnea M; Kotzer, Katrina E; Lagerstedt, Susan A

    2015-03-01

    Marfan syndrome is a systemic disorder that typically involves FBN1 mutations and cardiovascular manifestations. We investigated FBN1 genotype-phenotype correlations with aortic events (aortic dissection and prophylactic aortic surgery) in patients with Marfan syndrome. Genotype and phenotype information from probands (n = 179) with an FBN1 pathogenic or likely pathogenic variant were assessed. A higher frequency of truncating or splicing FBN1 variants was observed in Ghent criteria-positive patients with an aortic event (n = 34) as compared with all other probands (n = 145) without a reported aortic event (79 vs. 39%; P Marfan syndrome patients with FBN1 truncating and splicing variants. Genet Med 17(3), 177-187.

  16. Determination of the frequency and direct cost of the adverse drug events in Argentina.

    Science.gov (United States)

    Izquierdo, Estela; Rodríguez, Claudio; Pampliega, Eneas; Filinger, Ester

    2009-05-01

    To determine the frequency and the direct costs of adverse drug reactions in an ambulatory population of the City of Buenos Aires, Argentina, and its area of influence. A retrospective study was done over a period of three months on approximately 300,000 residents of the Buenos Aires area, gathering data according to the selected variables by means of the electronic capture of prescriptions dispensed in pharmacies of the area. This method enables the detection and registration of potential conflicts that may arise between a prescribed drug and factors such as the patient's demographic, clinical and drug profile. The analysis unit was defined as the occurrence of a moderate or severe adverse event reported by the system. The selected variables were the incidence of these effects, and the direct cost was calculated as the value of the drugs that induced the adverse event. The events were classified according to the following interactions: a) drug-drug, b) drug-pediatrics, c) drug-gender, d) drug-pregnancy and abuse of controlled substances. The observed frequency shows great variability, reflecting the shortage of available data for ambulatory populations. We found reported events for 6.74% of the total of processed items, which generated an additional cost equivalent to 4.58% of the total pharmaceutical expenses. This study evaluated only the cost of the drugs that led to an adverse reaction; moderate and severe reactions were included without accounting for the substantial indirect costs (hospitalization, tests, physician fees, etc.).

  17. An exploration of the relationship among valence, fading affect, rehearsal frequency, and memory vividness for past personal events.

    Science.gov (United States)

    Lindeman, Meghan I H; Zengel, Bettina; Skowronski, John J

    2017-07-01

    The affect associated with negative (or unpleasant) memories typically tends to fade faster than the affect associated with positive (or pleasant) memories, a phenomenon called the fading affect bias (FAB). We conducted a study to explore the mechanisms related to the FAB. A retrospective recall procedure was used to obtain three self-report measures (memory vividness, rehearsal frequency, affective fading) for both positive events and negative events. Affect for positive events faded less than affect for negative events, and positive events were recalled more vividly than negative events. The perceived vividness of an event (memory vividness) and the extent to which an event has been rehearsed (rehearsal frequency) were explored as possible mediators of the relation between event valence and affect fading. Additional models conceived of affect fading and rehearsal frequency as contributors to a memory's vividness. Results suggested that memory vividness was a plausible mediator of the relation between an event's valence and affect fading. Rehearsal frequency was also a plausible mediator of this relation, but only via its effects on memory vividness. Additional modelling results suggested that affect fading and rehearsal frequency were both plausible mediators of the relation between an event's valence and the event's rated memory vividness.

  18. Event- and interval-based measurement of stuttering: a review.

    Science.gov (United States)

    Valente, Ana Rita S; Jesus, Luis M T; Hall, Andreia; Leahy, Margaret

    2015-01-01

    Event- and interval-based measurements are two different ways of computing the frequency of stuttering. Interval-based methodology emerged as an alternative measure to overcome problems associated with reproducibility in the event-based methodology. No review has been made to study the effect of methodological factors on interval-based absolute reliability data or to compute the agreement between the two methodologies in terms of inter-judge, intra-judge and accuracy (i.e., correspondence between raters' scores and an established criterion). The aims were to provide a review related to the reproducibility of event-based and time-interval measurement, and to verify the effect of methodological factors (training, experience, interval duration, sample presentation order and judgment conditions) on agreement of time-interval measurement; in addition, to determine if it is possible to quantify the agreement between the two methodologies. The first two authors searched for articles on ERIC, MEDLINE, PubMed, B-on, CENTRAL and Dissertation Abstracts during January-February 2013 and retrieved 495 articles. Forty-eight articles were selected for review. Content tables were constructed with the main findings. Articles related to event-based measurements revealed inter- and intra-judge values greater than 0.70 and agreement percentages beyond 80%. The articles related to time-interval measures revealed that, in general, judges with more experience with stuttering presented significantly higher levels of intra- and inter-judge agreement. Inter- and intra-judge values were beyond the references for high reproducibility values for both methodologies. Accuracy (regarding the closeness of raters' judgements to an established criterion) and intra- and inter-judge agreement were higher for trained groups when compared with non-trained groups. Sample presentation order and audio/video conditions did not result in differences in inter- or intra-judge results. A duration of 5 s for an interval appears to be
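Interval-based measurement marks each fixed-length interval (e.g., the 5-s intervals the review discusses) as stuttered or fluent, so inter-judge agreement reduces to agreement on binary labels, for which Cohen's kappa is a standard chance-corrected statistic. The judge ratings below are hypothetical:

```python
from collections import Counter

def cohens_kappa(judge_a, judge_b):
    """Cohen's kappa for two judges' binary interval ratings
    (1 = interval contains stuttering, 0 = fluent)."""
    assert len(judge_a) == len(judge_b)
    n = len(judge_a)
    observed = sum(a == b for a, b in zip(judge_a, judge_b)) / n
    pa, pb = Counter(judge_a), Counter(judge_b)
    expected = sum((pa[k] / n) * (pb[k] / n) for k in (0, 1))
    return (observed - expected) / (1 - expected)

# Hypothetical ratings of twelve 5-s intervals by two judges.
a = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0, 1, 1]
b = [1, 0, 0, 1, 0, 0, 0, 0, 1, 0, 1, 1]
kappa = cohens_kappa(a, b)
```

Here the judges disagree on one of twelve intervals, giving raw agreement 11/12 but kappa 5/6 once chance agreement is removed; this gap is why interval studies report kappa-like statistics rather than raw percentages alone.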

  19. Activated Very Low Frequency Earthquakes By the Slow Slip Events in the Ryukyu Subduction Zone

    Science.gov (United States)

    Nakamura, M.; Sunagawa, N.

    2014-12-01

    The Ryukyu Trench (RT), where the Philippine Sea plate is subducting, has had no known thrust earthquakes with Mw > 8.0 in the last 300 years. However, the rupture source of the 1771 tsunami has been proposed as an Mw > 8.0 earthquake in the southern RT. Based on the dating of tsunami boulders, it has been estimated that large tsunamis occur at intervals of 150-400 years in the southern Ryukyu arc (RA) (Araoka et al., 2013), although they have not occurred for several thousand years in the central and northern Ryukyu areas (Goto et al., 2014). To address the discrepancy between the recent low moment release by earthquakes and the occurrence of paleo-tsunamis in the RT, we focus on the long-term activity of very low frequency earthquakes (VLFEs), which are good indicators of stress release at the shallow plate interface. VLFEs have been detected along the RT (Ando et al., 2012); they occur on the plate interface or in the accretionary prism. We used broadband data from the F-net of NIED along the RT and from the IRIS network. We applied two filters to all the raw broadband seismograms: a 0.02-0.05 Hz band-pass filter and a 1 Hz high-pass filter. After identification of the low-frequency events from the band-pass-filtered seismograms, local and teleseismic events were removed. We then picked the arrival time of the maximum amplitude of the surface wave of the VLFEs and determined the epicenters. VLFEs occurred on the RA side within 100 km of the trench axis along the RT. The distribution of the 6670 VLFEs from 2002 to 2013 could be divided into several clusters. The principal large clusters were located at 27.1°-29.0°N, 25.5°-26.6°N, and 122.1°-122.4°E (YA). We found that the VLFEs of the YA are modulated by repeating slow slip events (SSEs) which occur beneath the southern RA. The activity of the VLFEs increased to twice its ordinary rate within 15 days after the onset of the SSEs. Activation of the VLFEs could be generated by a low stress change of 0.02-20 kPa increase in
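The two-filter screening step described in the abstract (a 0.02-0.05 Hz band-pass to bring out VLFE energy and a 1 Hz high-pass to reveal ordinary local and teleseismic events) can be sketched directly; the sampling rate, filter order, and test trace are assumptions for illustration:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def vlfe_filters(trace, fs):
    """Apply the two screening filters: a 0.02-0.05 Hz band-pass for very
    low frequency earthquake energy, and a 1 Hz high-pass used to spot
    ordinary local/teleseismic events to be removed from the catalog."""
    sos_bp = butter(4, [0.02, 0.05], btype="bandpass", fs=fs, output="sos")
    sos_hp = butter(4, 1.0, btype="highpass", fs=fs, output="sos")
    return sosfiltfilt(sos_bp, trace), sosfiltfilt(sos_hp, trace)

fs = 20.0  # assumed broadband sampling rate
t = np.arange(0, 600, 1 / fs)
# Test trace: a 0.03 Hz very-low-frequency wave plus a 2 Hz ordinary signal.
trace = np.sin(2 * np.pi * 0.03 * t) + 0.5 * np.sin(2 * np.pi * 2.0 * t)
low, high = vlfe_filters(trace, fs)
```

Zero-phase filtering (`sosfiltfilt`) is used so that picked arrival times of the surface-wave maximum are not biased by filter delay.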

  20. DD4Hep based event reconstruction

    CERN Document Server

    AUTHOR|(SzGeCERN)683529; Frank, Markus; Gaede, Frank-Dieter; Hynds, Daniel; Lu, Shaojun; Nikiforou, Nikiforos; Petric, Marko; Simoniello, Rosa; Voutsinas, Georgios Gerasimos

    The DD4HEP detector description toolkit offers a flexible and easy-to-use solution for the consistent and complete description of particle physics detectors in a single system. The sub-component DDREC provides a dedicated interface to the detector geometry as needed for event reconstruction. With DDREC there is no need to define an additional, separate reconstruction geometry as is often done in HEP, but one can transparently extend the existing detailed simulation model to be also used for the reconstruction. Based on the extension mechanism of DD4HEP, DDREC allows one to attach user defined data structures to detector elements at all levels of the geometry hierarchy. These data structures define a high level view onto the detectors describing their physical properties, such as measurement layers, point resolutions, and cell sizes. For the purpose of charged particle track reconstruction, dedicated surface objects can be attached to every volume in the detector geometry. These surfaces provide the measuremen...

  1. Solar micro-bursts of 22.2 GHz and their relationship to events observed at lower frequencies

    Energy Technology Data Exchange (ETDEWEB)

    Blakey, J R [Universidade Mackenzie, Sao Paulo (Brazil). Centro de Radio-Astronomia e Astrofisica

    1976-01-01

    Observations of McMath region 10433 at 22 GHz using a telescope with a 4-arcminute beam during July 1974 revealed the existence of events or 'microbursts' with intensities below the sensitivity limit of normal solar patrol instruments. Many of these events were simply the high-frequency counterparts of more intense bursts observed at lower frequencies. This note considers the small number of events which suggest that the gyro-synchrotron mechanism alone is incapable of explaining the observations, indicating that a thermal mechanism is needed to explain the high-frequency events.

  2. ATTITUDES OF SERBIAN CONSUMERS TOWARD ADVERTISING THROUGH SPORT WITH REGARD TO THE FREQUENCY OF WATCHING SPORTS EVENTS

    Directory of Open Access Journals (Sweden)

    Stevo Popović

    2015-05-01

    Full Text Available It has been proposed that the attitudes potential consumers form toward advertising through sport can influence decisions to purchase a particular advertiser's product (Pyun, 2006). For this reason, it is important to analyse their general attitudes toward advertising through sport, and this investigation was aimed at gaining relevant knowledge about the attitudes of Serbian consumers toward advertising through sport. Methods: The sample included 127 respondents, divided into six subsample groups: consumers who do not watch sports events at all, then consumers who watch sports events for 1-30 minutes, 31-60 minutes, 61-90 minutes, 91-120 minutes, and consumers who watch sports events for more than 120 minutes during a typical day. The sample of variables contained a system of three general attitudes modeled on a seven-point Likert scale. The measurements were analyzed by multivariate analysis (MANOVA) and univariate analysis (ANOVA) with a Post Hoc test. Results: The statistical analyses found no significant differences at the multivariate level, nor for any of the three variables at a significance level of p=.05. Hence, it is interesting to highlight that no significant differences emerged between the attitudes of consumers toward advertising through sport across the frequency of watching sports events. Discussion: These results are important for marketers, mostly because they can treat all potential consumers alike regardless of how long they watch sports events. On the other hand, this was not the case in previous investigations (Bjelica and Popović, 2011), and this observation presents relevant information.

  3. Effects of Sound Frequency on Audiovisual Integration: An Event-Related Potential Study.

    Science.gov (United States)

    Yang, Weiping; Yang, Jingjing; Gao, Yulin; Tang, Xiaoyu; Ren, Yanna; Takahashi, Satoshi; Wu, Jinglong

    2015-01-01

    A combination of signals across modalities can facilitate sensory perception. The audiovisual facilitative effect strongly depends on the features of the stimulus. Here, we investigated how sound frequency, one of the basic features of an auditory signal, modulates audiovisual integration. In this study, the task of the participant was to respond to a visual target stimulus by pressing a key while ignoring auditory stimuli comprising tones of different frequencies (0.5, 1, 2.5 and 5 kHz). A significant facilitation of reaction times was obtained following audiovisual stimulation, irrespective of whether the task-irrelevant sounds were low or high frequency. Using event-related potentials (ERPs), audiovisual integration was found over the occipital area for 0.5 kHz auditory stimuli from 190-210 ms, for 1 kHz stimuli from 170-200 ms, for 2.5 kHz stimuli from 140-200 ms, and for 5 kHz stimuli from 100-200 ms. These findings suggest that a higher frequency sound signal paired with visual stimuli might be processed or integrated earlier despite the auditory stimuli being task-irrelevant information. Furthermore, audiovisual integration in late latency (300-340 ms) ERPs with fronto-central topography was found for auditory stimuli of lower frequencies (0.5, 1 and 2.5 kHz). Our results confirmed that audiovisual integration is affected by the frequency of an auditory stimulus. Taken together, the neurophysiological results provide unique insight into how the brain processes a multisensory visual signal and auditory stimuli of different frequencies.

  4. Increased risk of severe hypoglycemic events with increasing frequency of non-severe hypoglycemic events in patients with Type 1 and Type 2 diabetes.

    LENUS (Irish Health Repository)

    Sreenan, Seamus

    2014-07-15

    Severe hypoglycemic events (SHEs) are associated with significant morbidity, mortality and costs. However, the more common non-severe hypoglycemic events (NSHEs) are less well explored. We investigated the association between reported frequency of NSHEs and SHEs among patients with type 1 diabetes mellitus (T1DM) and type 2 diabetes mellitus (T2DM) in the PREDICTIVE study.

  5. Modelado del transformador para eventos de alta frecuencia; Transformer model for high frequency events

    Directory of Open Access Journals (Sweden)

    Verónica Adriana Galván Sánchez

    2012-07-01

    Full Text Available The transformer's function is to change the voltage level through a magnetic coupling. Due to its physical construction, its representation as a circuit and its mathematical model are very complex. The electromagnetic behavior of the transformer, like that of all elements of the power network, depends on the frequency involved, so for high-frequency phenomena its model must be very detailed in order to reproduce the transient behavior. This work analyzes how to pass from a very simple model to a very detailed model for simulating high-frequency events. The simulated events are the operation of a switch due to a fault in the system and the impact of an atmospheric discharge (direct stroke) on the transmission line, 5 km away from a power substation.

  6. Aesthetic appreciation: event-related field and time-frequency analyses.

    Science.gov (United States)

    Munar, Enric; Nadal, Marcos; Castellanos, Nazareth P; Flexas, Albert; Maestú, Fernando; Mirasso, Claudio; Cela-Conde, Camilo J

    2011-01-01

    Improvements in neuroimaging methods have afforded significant advances in our knowledge of the cognitive and neural foundations of aesthetic appreciation. We used magnetoencephalography (MEG) to register brain activity while participants decided about the beauty of visual stimuli. The data were analyzed with event-related field (ERF) and Time-Frequency (TF) procedures. ERFs revealed no significant differences between brain activity related with stimuli rated as "beautiful" and "not beautiful." TF analysis showed clear differences between both conditions 400 ms after stimulus onset. Oscillatory power was greater for stimuli rated as "beautiful" than those regarded as "not beautiful" in the four frequency bands (theta, alpha, beta, and gamma). These results are interpreted in the frame of synchronization studies.

  7. Effects of low-frequency repetitive transcranial magnetic stimulation on event-related potential P300

    Science.gov (United States)

    Torii, Tetsuya; Sato, Aya; Iwahashi, Masakuni; Iramina, Keiji

    2012-04-01

    The present study analyzed the effects of repetitive transcranial magnetic stimulation (rTMS) on brain activity. The P300 latency of the event-related potential (ERP) was used to evaluate the effects of low-frequency, short-term rTMS applied over the supramarginal gyrus (SMG), which is considered to be related to the origin of the P300. In addition, the prolonged effects of stimulation on P300 latency were analyzed after applying rTMS. A figure-eight coil was used to stimulate the left and right SMG, and the intensity of magnetic stimulation was 80% of the motor threshold. A total of 100 magnetic pulses were applied for rTMS, and the effects of stimulus frequencies of 0.5 and 1 Hz were determined. Following rTMS, an odd-ball task was performed and the P300 latency of the ERP was measured. The odd-ball task was performed at 5, 10, and 15 min post-rTMS, and ERP was measured prior to magnetic stimulation as a control. The electroencephalograph (EEG) was measured at Fz, Cz, and Pz as defined by the international 10-20 electrode system. Results demonstrated that 0.5 and 1 Hz rTMS had different effects on P300 latency. With 1 Hz low-frequency magnetic stimulation of the left SMG, P300 latency decreased; compared to the control, the latency difference was approximately 15 ms at Cz, and this decrease continued for approximately 10 min post-rTMS. In contrast, 0.5 Hz rTMS resulted in delayed P300 latency; compared to the control, the latency difference was approximately 20 ms at Fz, and this delay continued for approximately 15 min post-rTMS. These results demonstrate that P300 latency varied according to rTMS frequency and that the duration of the effect differed with the stimulus frequency of the low-frequency rTMS.

  8. Clinical usefulness and feasibility of time-frequency analysis of chemosensory event-related potentials.

    Science.gov (United States)

    Huart, C; Rombaux, Ph; Hummel, T; Mouraux, A

    2013-09-01

    The clinical usefulness of olfactory event-related brain potentials (OERPs) to assess olfactory function is limited by the relatively low signal-to-noise ratio of the responses identified using conventional time-domain averaging. Recently, it was shown that time-frequency analysis of the obtained EEG signals can markedly improve the signal-to-noise ratio of OERPs in healthy controls, because it enhances both phase-locked and non phase-locked EEG responses. The aim of the present study was to investigate the clinical usefulness of this approach and evaluate its feasibility in a clinical setting. We retrospectively analysed EEG recordings obtained from 45 patients (15 anosmic, 15 hyposmic and 15 normosmic). The responses to olfactory stimulation were analysed using conventional time-domain analysis and joint time-frequency analysis. The ability of the two methods to discriminate between anosmic, hyposmic and normosmic patients was assessed using a Receiver Operating Characteristic analysis. The discrimination performance of OERPs identified using conventional time-domain averaging was poor. In contrast, the discrimination performance of the EEG response identified in the time-frequency domain was relatively high. Furthermore, we found a significant correlation between the magnitude of this response and the psychophysical olfactory score. Time-frequency analysis of the EEG responses to olfactory stimulation could be used as an effective and reliable diagnostic tool for the objective clinical evaluation of olfactory function in patients.
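The advantage the abstract relies on, namely that averaging time-frequency power per trial retains non-phase-locked activity which cancels in a plain time-domain average, can be demonstrated with a Morlet wavelet transform. The epoch layout, burst frequency, and trial count below are illustrative, not the study's recording parameters:

```python
import numpy as np

def morlet_power(epochs, fs, f0, n_cycles=6.0):
    """Across-trial mean Morlet wavelet power at frequency f0.

    Averaging |x * w|^2 per trial (rather than averaging the trials first)
    preserves non-phase-locked ("induced") activity that cancels out in a
    conventional time-domain ERP average."""
    n_times = epochs.shape[1]
    sigma_t = n_cycles / (2 * np.pi * f0)          # Gaussian envelope width
    tw = (np.arange(n_times) - n_times // 2) / fs  # wavelet time axis
    wavelet = np.exp(2j * np.pi * f0 * tw - tw ** 2 / (2 * sigma_t ** 2))
    power = np.array([np.abs(np.convolve(ep, wavelet, mode="same")) ** 2
                      for ep in epochs])
    return power.mean(axis=0)

fs = 250.0
t = np.arange(-0.2, 0.8, 1 / fs)
rng = np.random.default_rng(2)
# 40 trials of a 10 Hz burst with a random phase on each trial (induced,
# not evoked, activity): the time-domain average nearly cancels.
epochs = np.array([np.where((t > 0.2) & (t < 0.6),
                            np.sin(2 * np.pi * 10 * t + rng.uniform(0, 2 * np.pi)),
                            0.0)
                   for _ in range(40)])
erp = epochs.mean(axis=0)                 # conventional time-domain average
power = morlet_power(epochs, fs, f0=10.0)
```

The burst is essentially invisible in `erp` but stands out clearly in the trial-averaged power, which is the signal-to-noise gain the time-frequency approach exploits.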

  9. Analysis of core damage frequency, Surry, Unit 1 internal events appendices

    International Nuclear Information System (INIS)

    Bertucio, R.C.; Julius, J.A.; Cramond, W.R.

    1990-04-01

    This document contains the appendices for the accident sequence analyses of internally initiated events for the Surry Nuclear Station, Unit 1. This is one of the five plant analyses conducted as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 documents the risk of a selected group of nuclear power plants. The work performed is an extensive reanalysis of that published in November 1986 as NUREG/CR-4450, Volume 3. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved. The content and detail of this report are directed toward PRA practitioners who need to know how the work was performed and the details for use in further studies. The mean core damage frequency at Surry was calculated to be 4.0E-5 per year, with a 95% upper bound of 1.3E-4 and a 5% lower bound of 6.8E-6 per year. Station blackout type accidents (loss of all AC power) were the largest contributors to the core damage frequency, accounting for approximately 68% of the total. The next dominant contributors were Loss of Coolant Accidents (LOCAs), which account for 15% of the core damage frequency. No other type of sequence accounts for more than 10% of the core damage frequency.
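
    The reported mean and 5%/95% bounds are mutually consistent under a lognormal uncertainty distribution, a common (here assumed) shape in PRA uncertainty analysis. A minimal sketch that fits a lognormal to the two reported bounds and checks the implied mean:

```python
import math

# Reported Surry values (from the abstract): 5% and 95% bounds per year
lo, hi, reported_mean = 6.8e-6, 1.3e-4, 4.0e-5
z95 = 1.6449  # standard normal 95th percentile

# Fit a lognormal to the two bounds: median from the geometric mean,
# sigma from the ratio of the upper bound to the median
median = math.sqrt(lo * hi)
sigma = math.log(hi / median) / z95
implied_mean = median * math.exp(sigma ** 2 / 2)

# Lands close to the reported 4.0E-5 per year
print(f"implied mean = {implied_mean:.2e} per year")
```

This is only a consistency check; the actual NUREG-1150 uncertainty analysis propagates parameter distributions through the full sequence model rather than fitting two quantiles.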

  10. Optimal depth-based regional frequency analysis

    Directory of Open Access Journals (Sweden)

    H. Wazneh

    2013-06-01

    Classical methods of regional frequency analysis (RFA) of hydrological variables face two drawbacks: (1) the restriction to a particular region, which can lead to a loss of some information, and (2) the definition of a region that generates a border effect. To reduce the impact of these drawbacks on regional modeling performance, an iterative method was proposed recently, based on the statistical notion of the depth function and a weight function φ. This depth-based RFA (DBRFA) approach was shown to be superior to traditional approaches in terms of flexibility, generality and performance. The main difficulty of the DBRFA approach is the optimal choice of the weight function φ (e.g., φ minimizing estimation errors). In order to avoid a subjective choice and naïve selection procedures of φ, the aim of the present paper is to propose an algorithm-based procedure to optimize the DBRFA and automate the choice of φ according to objective performance criteria. This procedure is applied to estimate flood quantiles in three different regions in North America. One of the findings from the application is that the optimal weight function depends on the considered region and can also quantify the region's homogeneity. By comparing the DBRFA to the canonical correlation analysis (CCA) method, results show that the DBRFA approach leads to better performances both in terms of relative bias and mean square error.

  11. Optimal depth-based regional frequency analysis

    Science.gov (United States)

    Wazneh, H.; Chebana, F.; Ouarda, T. B. M. J.

    2013-06-01

    Classical methods of regional frequency analysis (RFA) of hydrological variables face two drawbacks: (1) the restriction to a particular region which can lead to a loss of some information and (2) the definition of a region that generates a border effect. To reduce the impact of these drawbacks on regional modeling performance, an iterative method was proposed recently, based on the statistical notion of the depth function and a weight function φ. This depth-based RFA (DBRFA) approach was shown to be superior to traditional approaches in terms of flexibility, generality and performance. The main difficulty of the DBRFA approach is the optimal choice of the weight function φ (e.g., φ minimizing estimation errors). In order to avoid a subjective choice and naïve selection procedures of φ, the aim of the present paper is to propose an algorithm-based procedure to optimize the DBRFA and automate the choice of φ according to objective performance criteria. This procedure is applied to estimate flood quantiles in three different regions in North America. One of the findings from the application is that the optimal weight function depends on the considered region and can also quantify the region's homogeneity. By comparing the DBRFA to the canonical correlation analysis (CCA) method, results show that the DBRFA approach leads to better performances both in terms of relative bias and mean square error.
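
    The depth-weighting idea can be illustrated with a small numpy sketch: compute a Mahalanobis-type depth of each site relative to a target site and pass it through a simple weight function. The depth definition, the threshold-type φ, and all site data below are invented for illustration and are not the paper's optimized procedure.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical standardized site characteristics (e.g. basin area, mean rainfall)
sites = rng.standard_normal((50, 2))
target = np.zeros(2)                  # target site placed at the centre

# Mahalanobis depth: large for sites close to the target in the covariance metric
cov_inv = np.linalg.inv(np.cov(sites, rowvar=False))
d2 = np.einsum("ij,jk,ik->i", sites - target, cov_inv, sites - target)
depth = 1.0 / (1.0 + d2)

# A simple threshold-type weight function phi (an assumed choice): full weight
# above a depth cutoff, linear decay below, so distant sites fade out smoothly
cutoff = np.quantile(depth, 0.5)
phi = np.clip(depth / cutoff, 0.0, 1.0)

# Depth-weighted regional estimate of, e.g., a flood quantile at the target
quantiles = 100 + 10 * sites[:, 0] + rng.standard_normal(50)
regional_estimate = np.sum(phi * quantiles) / np.sum(phi)
print(round(float(regional_estimate), 1))
```

The paper's contribution is precisely the automated, criterion-driven choice of φ that this fixed threshold sidesteps.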

  12. Address-event-based platform for bioinspired spiking systems

    Science.gov (United States)

    Jiménez-Fernández, A.; Luján, C. D.; Linares-Barranco, A.; Gómez-Rodríguez, F.; Rivas, M.; Jiménez, G.; Civit, A.

    2007-05-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity between a huge number of neurons located on different chips. By exploiting high-speed digital communication circuits (with nanosecond timings), synaptic neural connections can be time-multiplexed, while neural activity signals (with millisecond timings) are sampled at low frequencies. Neurons generate "events" according to their activity levels: more active neurons generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. When building multi-chip, multi-layered AER systems, it is absolutely necessary to have a computer interface that allows (a) reading AER interchip traffic into the computer and visualizing it on the screen, and (b) converting a conventional frame-based video stream in the computer into AER and injecting it at some point of the AER structure. This is necessary for testing and debugging complex AER systems. On the other hand, the use of a commercial personal computer implies depending on software tools and operating systems that can make the system slower and less robust. This paper addresses the problem of communicating several AER-based chips to compose a powerful processing system. The problem was discussed at the Neuromorphic Engineering Workshop of 2006. The platform is based on an embedded computer, a powerful FPGA and serial links, making the system faster and stand-alone (independent of a PC). A new platform is presented that allows connecting up to eight AER-based chips to a Spartan 3 4000 FPGA. The FPGA is responsible for the network communication based on Address-Event and, at the same time, for mapping and transforming the address space of the traffic to implement a pre-processing stage. An MMU microprocessor (Intel XScale 400 MHz Gumstix Connex computer) is also connected to the FPGA.
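
    The time-multiplexing idea can be sketched in a few lines: each spike becomes a (timestamp, address) pair on a shared bus, so more active neurons simply emit more events. The addresses and timings below are invented.

```python
# Minimal sketch of Address-Event Representation (AER): spikes from many
# neurons are serialized onto one bus as (timestamp, address) pairs.
def aer_encode(spike_trains):
    """spike_trains: {neuron_address: [spike_times_us]} -> merged event stream."""
    events = [(t, addr) for addr, times in spike_trains.items() for t in times]
    return sorted(events)              # the bus serializes events by time

def aer_decode(events):
    """Rebuild per-neuron spike trains from the shared event stream."""
    trains = {}
    for t, addr in events:
        trains.setdefault(addr, []).append(t)
    return trains

# More events per unit time = higher activity for that address
trains = {0x12: [10, 250, 900], 0x07: [40, 800]}
stream = aer_encode(trains)
assert aer_decode(stream) == trains    # round trip is lossless
print(stream[:3])                      # [(10, 18), (40, 7), (250, 18)]
```

Real AER buses add handshaking and arbitration for simultaneous events; this sketch only shows the address/timestamp multiplexing itself.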

  13. Rule-Based Event Processing and Reaction Rules

    Science.gov (United States)

    Paschke, Adrian; Kozlenkov, Alexander

    Reaction rules and event processing technologies play a key role in making business and IT / Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real-time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In the last decades various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.
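
    A minimal sketch of the event-condition-action (ECA) pattern that underlies reaction rules; the rule structure and event fields below are invented, not taken from any surveyed system.

```python
# Tiny ECA reaction-rule engine: a rule fires its action when an event of the
# matching type arrives and its condition holds.
class RuleEngine:
    def __init__(self):
        self.rules = []                # list of (event_type, condition, action)

    def rule(self, event_type, condition, action):
        self.rules.append((event_type, condition, action))

    def dispatch(self, event):
        fired = []
        for etype, cond, act in self.rules:
            if event["type"] == etype and cond(event):
                act(event)             # invoke the action in response to the event
                fired.append(etype)
        return fired

log = []
engine = RuleEngine()
# "On a temperature reading above threshold, raise an alarm"
engine.rule("temperature",
            condition=lambda e: e["value"] > 30,
            action=lambda e: log.append(("alarm", e["value"])))

engine.dispatch({"type": "temperature", "value": 25})   # condition false
engine.dispatch({"type": "temperature", "value": 35})   # rule fires
print(log)  # [('alarm', 35)]
```

Production systems add complex-event detection over streams (windows, sequences, negation); this sketch shows only the reaction side the paper contrasts with it.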

  14. Trends and characteristics observed in nuclear events based on international nuclear event scale reports

    International Nuclear Information System (INIS)

    Watanabe, Norio

    2001-01-01

    The International Nuclear Event Scale (INES) is jointly operated by the IAEA and the OECD-NEA as a means of providing prompt, clear and consistent information on nuclear events that occur at nuclear facilities, and of facilitating communication between the nuclear community, the media and the public. Nuclear events are reported to the INES with the 'Scale', a consistent safety significance indicator, which runs from level 0, for events with no safety significance, to level 7, for a major accident with widespread health and environmental effects. Since the operation of INES was initiated in 1990, approximately 500 events have been reported and disseminated. The present paper discusses the trends observed in nuclear events, such as overall trends of the reported events and characteristics of safety-significant events at level 2 or higher, based on the INES reports. (author)

  15. DEVS representation of dynamical systems - Event-based intelligent control. [Discrete Event System Specification

    Science.gov (United States)

    Zeigler, Bernard P.

    1989-01-01

    It is shown how systems can be advantageously represented as discrete-event models by using DEVS (discrete-event system specification), a set-theoretic formalism. Such DEVS models provide a basis for the design of event-based logic control. In this control paradigm, the controller expects to receive confirming sensor responses to its control commands within definite time windows determined by its DEVS model of the system under control. The event-based control paradigm is applied to advanced robotics and intelligent automation, showing how classical process control can readily be interfaced with rule-based symbolic reasoning systems.
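
    The control paradigm can be sketched as a DEVS-style atomic model whose internal transition fires when a sensor confirmation fails to arrive within the expected time window. The state names and window value below are invented, and the demo drives the transitions by hand rather than through a full DEVS simulator.

```python
# Minimal DEVS-style atomic model: a controller issues a command and expects
# a confirming sensor response within a definite time window.
class CommandMonitor:
    """States: 'idle' -> 'waiting' (window open) -> 'idle' or 'fault'."""
    def __init__(self, window=5.0):
        self.state, self.window = "idle", window

    def ta(self):
        # Time advance: how long the model stays in the current state before
        # its internal transition; infinity means "wait for external input"
        return self.window if self.state == "waiting" else float("inf")

    def delta_ext(self, event):
        # External transition: react to inputs (commands, sensor confirmations)
        if self.state == "idle" and event == "command":
            self.state = "waiting"
        elif self.state == "waiting" and event == "confirm":
            self.state = "idle"

    def delta_int(self):
        # Internal transition: the time window expired with no confirmation
        if self.state == "waiting":
            self.state = "fault"

m = CommandMonitor(window=5.0)
m.delta_ext("command")
m.delta_ext("confirm")                 # confirmation arrives in time
print(m.state)                         # idle

m.delta_ext("command")
m.delta_int()                          # window elapses, no confirmation
print(m.state)                         # fault
```

A DEVS simulator would schedule `delta_int` automatically after `ta()` units of simulated time; calling it directly here stands in for that scheduler.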

  16. Modelado del transformador para eventos de alta frecuencia; Transformer model for high frequency events

    Directory of Open Access Journals (Sweden)

    Verónica Adriana – Galván Sanchez

    2012-07-01

    The transformer's function is to change the voltage level through a magnetic coupling. Because of its physical construction, its representation as a circuit and its mathematical model are very complex. The electromagnetic behavior of the transformer, like that of every element in the power network, depends on the frequency involved, so for high-frequency phenomena its model must be very detailed in order to reproduce the electromagnetic transient behavior. This work analyzes how to pass from a very simple model to a very detailed model for simulating high-frequency events. The simulated events are the operation of a switch due to a fault in the system and the impact of an atmospheric discharge (direct stroke) on the transmission line, 5 km from a power substation.

  17. Characterizing the Frequency and Elevation of Rapid Drainage Events in West Greenland

    Science.gov (United States)

    Cooley, S.; Christoffersen, P.

    2016-12-01

    Rapid drainage of supraglacial lakes on the Greenland Ice Sheet is critical for establishing surface-to-bed hydrologic connections and the subsequent transfer of surface meltwater to the bed. Yet estimates of the number and spatial distribution of rapidly draining lakes vary widely, due to limitations in the temporal frequency of image collection and obscuration by cloud cover. So far, no study has assessed the impact of these observation biases. In this study, we examine the frequency and elevation of rapidly draining lakes in central West Greenland, from 68°N to 72.6°N, and we perform a robust statistical analysis to estimate more accurately the likelihood of rapid lake drainage. Using MODIS imagery and a fully automated lake detection method, we map more than 500 supraglacial lakes per year over a 63,000 km2 study area from 2000-2015. Testing four different definitions of rapidly draining lakes from previously published studies, we find that the fraction of rapidly draining lakes varies from 3% to 38%. Logistic regression between rapid drainage events and image sampling frequency demonstrates that the number of rapid drainage events is strongly dependent on the cloud-free observation percentage. We then develop three new drainage criteria and apply an observation bias correction that suggests a true rapid drainage probability between 36% and 45%, considerably higher than previous studies without bias assessment have reported. We find that rapidly draining lakes are on average larger and disappear earlier than slow-draining lakes, and we observe no elevation differences for the lakes detected as rapidly draining.
We conclude a) that methodological problems in rapid drainage research caused by observation bias and varying detection methods have obscured large-scale rapid drainage characteristics and b) that the lack of evidence for an elevation limit on rapid drainage suggests surface-to-bed hydrologic connections may continue to propagate inland as climate warms.
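
    The bias-correction logic can be illustrated with a simple inverse-probability sketch (all numbers invented; this is not the study's exact procedure): a rapid drainage is only classified as such when cloud-free images happen to bracket the event, so the raw detected fraction understates the true probability.

```python
# Toy observation-bias correction for rapid-drainage detection
n_lakes = 500
detected_rapid = 90                    # lakes classified as rapid from imagery
p_detect = 0.50                        # chance a true rapid event is observable,
                                       # estimated from cloud-free image frequency

p_raw = detected_rapid / n_lakes       # biased-low estimate
p_corrected = p_raw / p_detect         # inverse-probability correction

print(f"raw {p_raw:.2f}, bias-corrected {p_corrected:.2f}")
# raw 0.18, bias-corrected 0.36  (illustratively within the 36-45% range reported)
```

The study's correction is built on a logistic-regression fit rather than a single constant `p_detect`, but the direction of the adjustment is the same.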

  18. Disruption of perineuronal nets increases the frequency of sharp wave ripple events.

    Science.gov (United States)

    Sun, Zhi Yong; Bozzelli, P Lorenzo; Caccavano, Adam; Allen, Megan; Balmuth, Jason; Vicini, Stefano; Wu, Jian-Young; Conant, Katherine

    2018-01-01

    Hippocampal sharp wave ripples (SWRs) represent irregularly occurring synchronous neuronal population events that are observed during phases of rest and slow wave sleep. SWR activity that follows learning involves sequential replay of training-associated neuronal assemblies and is critical for systems-level memory consolidation. SWRs are initiated by CA2 or CA3 pyramidal cells (PCs) and require initial excitation of CA1 PCs as well as participation of parvalbumin (PV) expressing fast-spiking (FS) inhibitory interneurons. These interneurons are relatively unique in that they represent the major neuronal cell type known to be surrounded by perineuronal nets (PNNs), lattice-like structures composed of a hyaluronan backbone that surround the cell soma and proximal dendrites. Though the function of the PNN is not completely understood, previous studies suggest it may serve to localize glutamatergic input to synaptic contacts and thus influence the activity of ensheathed cells. Noting that FS PV interneurons impact the activity of PCs thought to initiate SWRs, and that their activity is critical to ripple expression, we examine the effects of PNN integrity on SWR activity in the hippocampus. Extracellular recordings from the stratum radiatum of horizontal murine hippocampal hemisections demonstrate SWRs that occur spontaneously in CA1. As compared with vehicle, pre-treatment (120 min) of paired hemislices with hyaluronidase, which cleaves the hyaluronan backbone of the PNN, decreases PNN integrity and increases SWR frequency. Pre-treatment with chondroitinase, which cleaves PNN side chains, also increases SWR frequency. Together, these data contribute to an emerging appreciation of the extracellular matrix as a regulator of neuronal plasticity and suggest that one function of mature perineuronal nets could be to modulate the frequency of SWR events. © 2017 Wiley Periodicals, Inc.

  19. Thermal-Diffusivity-Based Frequency References in Standard CMOS

    NARCIS (Netherlands)

    Kashmiri, S.M.

    2012-01-01

    In recent years, a lot of research has been devoted to the realization of accurate integrated frequency references. A thermal-diffusivity-based (TD) frequency reference provides an alternative method of on-chip frequency generation in standard CMOS technology. A frequency-locked loop locks the

  20. An event-based model for contracts

    Directory of Open Access Journals (Sweden)

    Tiziana Cimoli

    2013-02-01

    We introduce a basic model for contracts. Our model extends event structures with a new relation, which faithfully captures the circular dependencies among contract clauses. We establish whether an agreement exists which respects all the contracts at hand (i.e., all the dependencies can be resolved), and we detect the obligations of each participant. The main technical contribution is a correspondence between our model and a fragment of the contract logic PCL. More precisely, we show that the reachable events are exactly those which correspond to provable atoms in the logic. Despite this strong correspondence, our model improves previous work on PCL by exhibiting a finer-grained notion of culpability, which takes into account the legitimate orderings of events.

  1. SUBTLEX-ESP: Spanish Word Frequencies Based on Film Subtitles

    Science.gov (United States)

    Cuetos, Fernando; Glez-Nosti, Maria; Barbon, Analia; Brysbaert, Marc

    2011-01-01

    Recent studies have shown that word frequency estimates obtained from films and television subtitles are better to predict performance in word recognition experiments than the traditional word frequency estimates based on books and newspapers. In this study, we present a subtitle-based word frequency list for Spanish, one of the most widely spoken…
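
    A subtitle-based frequency list boils down to tokenizing the subtitle corpus, counting tokens, and normalizing per million words so lists of different sizes are comparable. A minimal sketch with an invented toy corpus:

```python
from collections import Counter
import re

# Toy "subtitle corpus" (invented); real lists are built from millions of words
corpus = "el gato duerme . el perro ladra y el gato corre"
tokens = re.findall(r"\w+", corpus.lower())

counts = Counter(tokens)
total = len(tokens)
# Standard normalization: occurrences per million words
freq_per_million = {w: c / total * 1_000_000 for w, c in counts.items()}

print(counts["el"], round(freq_per_million["el"]))  # 3 300000
```

SUBTLEX-style lists also report contextual diversity (the number of films a word appears in), which this sketch omits.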

  2. Probe-controlled soliton frequency shift in the regime of optical event horizon.

    Science.gov (United States)

    Gu, Jie; Guo, Hairun; Wang, Shaofei; Zeng, Xianglong

    2015-08-24

    In the optical analogy of the event horizon, temporal pulse collision and mutual interactions occur mainly between an intense solitary wave (soliton) and a dispersive probe wave. In this regime, we numerically investigate the probe-controlled soliton frequency shift as well as soliton self-compression. In particular, in a dispersion landscape with multiple zero-dispersion wavelengths, bi-directional soliton spectral tunneling effects are possible. Moreover, we propose mid-infrared soliton self-compression for the generation of few-cycle ultrashort pulses in bulk quadratic nonlinear crystals, in contrast to optical fibers or cubic nonlinear media, which could provide the community with a simple and flexible route to experimental implementation.

  3. Neural Network Based Load Frequency Control for Restructuring ...

    African Journals Online (AJOL)

    Neural Network Based Load Frequency Control for Restructuring Power Industry. ... an artificial neural network (ANN) application of load frequency control (LFC) of a Multi-Area power system by using a neural network controller is presented.

  4. [Frequency and Type of Traumatic Events in Children and Adolescents with a Posttraumatic Stress Disorder].

    Science.gov (United States)

    Loos, Sabine; Wolf, Saskia; Tutus, Dunja; Goldbeck, Lutz

    2015-01-01

    The risk for children and adolescents of being exposed to a potentially traumatic event (PTE) is high. The present study examines the frequency of PTEs in children and adolescents with Posttraumatic Stress Disorder (PTSD), the type of index trauma, and its relation to PTSD symptom severity and gender. A clinical sample of 159 children and adolescents aged 7-16 years was assessed using the Clinician-Administered PTSD Scale for Children and Adolescents (CAPS-CA). All PTEs reported on the checklist were analyzed according to frequency. The index events were categorized by cause (accidental vs. intentional), relation to the offender (intrafamilial vs. extrafamilial), the patient's role (victim, witness or vicarious traumatization), and type of PTE (physical or sexual violence). Relations between these categories, PTSD symptom severity, and sex were analyzed with inferential statistics. On average, participants reported five PTEs, most frequently physical violence without weapons (57.9%), loss of a loved person through death (45.9%), and sexual abuse/assault (44%). Most index traumata were intentional (76.7%). Regarding trauma type, symptom severity was significantly higher in children and adolescents who had experienced sexual abuse/assault compared with physical violence (t(109)=-1.913, p=0.05). Significantly higher symptom severity was found for girls compared with boys for the trauma categories extrafamilial offender (z=-2.27, p=0.02), victim (z=-2.11, p=0.04), and sexual abuse/assault (z=-2.43, p=0.01). Clinical and diagnostic implications are discussed in relation to the amendments of the PTSD diagnostic criteria in DSM-5.

  5. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events appendices

    International Nuclear Information System (INIS)

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L.

    1989-08-01

    This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report are directed toward PRA practitioners who need to know how the work was done and the details for use in further studies. The mean core damage frequency is 4.5E-6 with 5% and 95% uncertainty bounds of 3.5E-7 and 1.3E-5, respectively. Station blackout type accidents (loss of all AC power) contributed about 46% of the core damage frequency, with Anticipated Transient Without Scram (ATWS) accidents contributing another 42%. The numerical results are driven by loss of offsite power, transients with the power conversion system initially available, operator errors, and mechanical failure to scram. 13 refs., 345 figs., 171 tabs.

  6. Low velocity target detection based on time-frequency image for high frequency ground wave radar

    Institute of Scientific and Technical Information of China (English)

    YAN Songhua; WU Shicai; WEN Biyang

    2007-01-01

    Doppler spectral broadening resulting from non-stationary target motion and radio-frequency interference degrades the accuracy of target detection by high-frequency ground wave (HFGW) radar. By displaying the change of signal energy on two-dimensional time-frequency images based on time-frequency analysis, a new mathematical morphology method for distinguishing targets from nonlinear time-frequency curves is presented. Results from the measured data verify that with this new method the target can be detected correctly from a wide Doppler spectrum.
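
    The morphological idea can be sketched with plain numpy: a slow target forms a persistent ridge along the time axis of the time-frequency image, so an opening with a horizontal structuring element keeps it while removing brief interference bursts. The image contents and element size below are invented.

```python
import numpy as np

# Invented time-frequency image: rows = Doppler bins, columns = time samples
tf = np.zeros((32, 64))
tf[10, :] = 1.0                        # slow target: stable ridge along time
tf[20, 30:33] = 1.0                    # short radio-frequency interference burst

def erode(img, width):
    """Grayscale erosion with a horizontal line element (circular boundary)."""
    out = img.copy()
    for s in range(1, width):
        out = np.minimum(out, np.roll(img, s, axis=1))
    return out

def dilate(img, width):
    """Grayscale dilation with the same horizontal line element."""
    out = img.copy()
    for s in range(1, width):
        out = np.maximum(out, np.roll(img, s, axis=1))
    return out

# Morphological opening: only features at least 9 samples long survive
opened = dilate(erode(tf, 9), 9)
print(opened[10].max(), opened[20].max())  # 1.0 0.0
```

Here the 3-sample burst vanishes while the persistent target ridge is preserved, which is the separation principle the paper exploits on real HFGW data.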

  7. The SKI-project External events - Phase 2. Estimation of fire frequencies per plant and per building

    International Nuclear Information System (INIS)

    Poern, K.

    1996-08-01

    The Swedish-Finnish handbook for initiating event frequencies, the I-Book, does not contain any fire frequencies. This is not defensible considering the substantial risk contribution caused by fires. In the PSAs performed so far, the initiating fire frequencies have been determined case by case. Because data in these areas are usually very scarce, it is very important to develop consistent definitions, to systematically utilize both international and national experience, and to establish an appropriate statistical estimation method. It is also important to present the accumulated experience such that it can be used for different purposes, not only within PSA but also in concrete fire-prevention work. During phase 1 of the project External Events, an inventory was made of existing methods for probabilistic fire analysis in general. During phase 2 it was decided to initiate work on a complementary handbook, called the X-Book, covering the frequencies of system-external events, i.e., initiating events caused by occurrences outside the system boundaries. In Version 1 of the X-Book, attention is mainly focused on the estimation of initiating fire frequencies, per plant and per building. This estimation is founded on reports that the power companies have collected for this specific purpose. This report describes the statistical model and method used in the estimation process. The methodological results achieved may, possibly after some modification, be applicable to other types of system-external events as well.

  8. A Bayesian approach to unanticipated events frequency estimation in the decision making context of a nuclear research reactor facility

    International Nuclear Information System (INIS)

    Chatzidakis, S.; Staras, A.

    2013-01-01

    Highlights: • Bayes' theorem is employed to support the decision making process in a research reactor. • The intention is to calculate parameters related to the unanticipated occurrence of events. • Frequency, posterior distribution and confidence limits are calculated. • The approach is demonstrated using two real-world numerical examples. • The approach can be used even if no failures have been observed. - Abstract: Research reactors are considered multi-tasking environments having the multiple roles of commercial, research and training facilities. Yet reactor managers have to make decisions, frequently with high economic impact, based on little available knowledge. A systematic approach employing Bayes' theorem is proposed to support the decision making process in a research reactor environment. This approach is characterized by low complexity, appropriate for research reactor facilities. The methodology is demonstrated through the study of two characteristic events that lead to unanticipated system shutdown, namely the de-energization of the control rod magnet and the opening of the flapper valve. The results obtained demonstrate the suitability of the Bayesian approach in the decision making context when unanticipated events are considered.
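
    The kind of update described here can be sketched with the conjugate gamma-Poisson model (the prior parameters and observation time below are assumed, not taken from the paper). Note that it yields a usable rate estimate even when no events have been observed, which is the highlighted zero-failure case.

```python
# Conjugate gamma-Poisson Bayesian update for an event frequency:
# with a Gamma(a, b) prior on the rate and k events observed in T units of
# operating time, the posterior is Gamma(a + k, b + T).
def posterior(a, b, k, T):
    return a + k, b + T

# Jeffreys-type prior a=0.5, b=0 (an assumed, weakly informative choice);
# zero observed events over 2000 operating hours (invented numbers)
a_post, b_post = posterior(0.5, 0.0, k=0, T=2000.0)
mean_rate = a_post / b_post            # posterior mean, events per hour

print(f"posterior mean rate = {mean_rate:.2e} per hour")
```

Posterior credible limits follow from the same Gamma(a_post, b_post) distribution, which is how frequency, posterior, and confidence limits can all come out of one update.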

  9. Analysis of core damage frequency due to external events at the DOE [Department of Energy] N-Reactor

    International Nuclear Information System (INIS)

    Lambright, J.A.; Bohn, M.P.; Daniel, S.L.; Baxter, J.T.; Johnson, J.J.; Ravindra, M.K.; Hashimoto, P.O.; Mraz, M.J.; Tong, W.H.; Conoscente, J.P.; Brosseau, D.A.

    1990-11-01

    A complete external events probabilistic risk assessment has been performed for the N-Reactor power plant, making full use of all insights gained during the past ten years' developments in risk assessment methodologies. A detailed screening analysis was performed which showed that all external events had negligible contribution to core damage frequency except fires, seismic events, and external flooding. A limited scope analysis of the external flooding risk indicated that it is not a major risk contributor. Detailed analyses of the fire and seismic risks resulted in total (mean) core damage frequencies of 1.96E-5 and 4.60E-5 per reactor year, respectively. Detailed uncertainty analyses were performed for both fire and seismic risks. These results show that the core damage frequency profile for these events is comparable to that found for existing commercial power plants if proposed fixes are completed as part of the restart program. 108 refs., 85 figs., 80 tabs.

  10. Analysis of core damage frequency due to external events at the DOE (Department of Energy) N-Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lambright, J.A.; Bohn, M.P.; Daniel, S.L. (Sandia National Labs., Albuquerque, NM (USA)); Baxter, J.T. (Westinghouse Hanford Co., Richland, WA (USA)); Johnson, J.J.; Ravindra, M.K.; Hashimoto, P.O.; Mraz, M.J.; Tong, W.H.; Conoscente, J.P. (EQE, Inc., San Francisco, CA (USA)); Brosseau, D.A. (ERCE, Inc., Albuquerque, NM (USA))

    1990-11-01

    A complete external events probabilistic risk assessment has been performed for the N-Reactor power plant, making full use of all insights gained during the past ten years' developments in risk assessment methodologies. A detailed screening analysis was performed which showed that all external events had negligible contribution to core damage frequency except fires, seismic events, and external flooding. A limited scope analysis of the external flooding risk indicated that it is not a major risk contributor. Detailed analyses of the fire and seismic risks resulted in total (mean) core damage frequencies of 1.96E-5 and 4.60E-5 per reactor year, respectively. Detailed uncertainty analyses were performed for both fire and seismic risks. These results show that the core damage frequency profile for these events is comparable to that found for existing commercial power plants if proposed fixes are completed as part of the restart program. 108 refs., 85 figs., 80 tabs.

  11. Towards a Unified Understanding of Event-Related Changes in the EEG: The Firefly Model of Synchronization through Cross-Frequency Phase Modulation

    Science.gov (United States)

    Burgess, Adrian P.

    2012-01-01

    Although event-related potentials (ERPs) are widely used to study sensory, perceptual and cognitive processes, it remains unknown whether they are phase-locked signals superimposed upon the ongoing electroencephalogram (EEG) or result from phase-alignment of the EEG. Previous attempts to discriminate between these hypotheses have been unsuccessful but here a new test is presented based on the prediction that ERPs generated by phase-alignment will be associated with event-related changes in frequency whereas evoked-ERPs will not. Using empirical mode decomposition (EMD), which allows measurement of narrow-band changes in the EEG without predefining frequency bands, evidence was found for transient frequency slowing in recognition memory ERPs but not in simulated data derived from the evoked model. Furthermore, the timing of phase-alignment was frequency dependent with the earliest alignment occurring at high frequencies. Based on these findings, the Firefly model was developed, which proposes that both evoked and induced power changes derive from frequency-dependent phase-alignment of the ongoing EEG. Simulated data derived from the Firefly model provided a close match with empirical data and the model was able to account for i) the shape and timing of ERPs at different scalp sites, ii) the event-related desynchronization in alpha and synchronization in theta, and iii) changes in the power density spectrum from the pre-stimulus baseline to the post-stimulus period. The Firefly Model, therefore, provides not only a unifying account of event-related changes in the EEG but also a possible mechanism for cross-frequency information processing. PMID:23049827
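
    The evoked-versus-phase-reset distinction can be illustrated with a small simulation (all parameters invented): both models produce an ERP in the average, but only phase alignment leaves the ongoing rhythm visible in the average after an additive transient would have decayed.

```python
import numpy as np

rng = np.random.default_rng(2)
fs, n, trials = 250, 250, 200          # sample rate, samples, trial count
t = np.arange(n) / fs
f0 = 10.0                              # ongoing alpha-band rhythm (assumed)

def ongoing(reset):
    """One trial of the ongoing oscillation; reset aligns its phase at onset."""
    phase = 0.0 if reset else rng.uniform(0, 2 * np.pi)
    return np.cos(2 * np.pi * f0 * t + phase)

# Evoked model: a fixed transient added onto a non-aligned ongoing rhythm
evoked_avg = np.mean([ongoing(False) + np.exp(-t / 0.05) for _ in range(trials)],
                     axis=0)
# Phase-reset model: no added transient, only phase alignment of the rhythm
reset_avg = np.mean([ongoing(True) for _ in range(trials)], axis=0)

# After the transient has decayed (t > 0.2 s), the averaged reset signal still
# oscillates at full amplitude, while the non-aligned rhythm cancels out
late = slice(int(0.2 * fs), None)
print(np.abs(reset_avg[late]).max() > 3 * np.abs(evoked_avg[late]).max())  # True
```

The paper's EMD-based test looks for the frequency signature of alignment rather than this amplitude contrast, but the two competing generative models are the ones sketched here.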

  12. Frequencies and trends of significant characteristics of reported events in Germany

    International Nuclear Information System (INIS)

    Farber, G.; Matthes, H.

    2001-01-01

    Within the framework of its support to the German Federal Ministry for the Environment, Nature Conservation and Nuclear Safety, the GRS continuously performs in-depth technical analyses of reported events at operating nuclear power reactors in Germany, which can be used to determine plant weaknesses with regard to reactor safety. During the last 18 months, in addition to those activities, the GRS has developed a data bank model for the statistical assessment of events. This model is based on a hierarchically structured, detailed coding system with respect to technical and safety-relevant characteristics of the plants and the systematic characterization of plant-specific events. The data bank model is ready for practical application. Results of a first statistical evaluation, taking into account the data sets from the period 1996 to 1999, are meanwhile available. As the amount of data increases, it will become possible to improve the statements concerning trends of safety aspects. This report describes the coding system, the evaluation model, the data input and the evaluations performed during the period beginning in April 2000. (authors)

  13. Frequencies and trends of significant characteristics of reported events in Germany

    Energy Technology Data Exchange (ETDEWEB)

    Farber, G.; Matthes, H. [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, Koln (Germany)

    2001-07-01

As part of its support to the German Federal Ministry for the Environment, Nature Conservation and Nuclear Safety, the GRS continuously performs in-depth technical analyses of reported events at operating nuclear power reactors in Germany, which can be used to identify plant weaknesses with regard to reactor safety. During the last 18 months, in addition to those activities, the GRS has developed a data bank model for the statistical assessment of events. This model is based on a hierarchically structured, detailed coding system covering technical and safety-relevant characteristics of the plants and the systematic characterization of plant-specific events. The data bank model is ready for practical application. Results of a first statistical evaluation, taking into account the data sets from the period 1996 to 1999, are meanwhile available. As the amount of data increases, it will become possible to improve the statements concerning trends in safety aspects. This report describes the coding system, the evaluation model, the data input and the evaluations performed during the period beginning in April 2000. (authors)

  14. Fast oscillations in cortical-striatal networks switch frequency following rewarding events and stimulant drugs.

    Science.gov (United States)

    Berke, J D

    2009-09-01

    Oscillations may organize communication between components of large-scale brain networks. Although gamma-band oscillations have been repeatedly observed in cortical-basal ganglia circuits, their functional roles are not yet clear. Here I show that, in behaving rats, distinct frequencies of ventral striatal local field potential oscillations show coherence with different cortical inputs. The approximately 50 Hz gamma oscillations that normally predominate in awake ventral striatum are coherent with piriform cortex, whereas approximately 80-100 Hz high-gamma oscillations are coherent with frontal cortex. Within striatum, entrainment to gamma rhythms is selective to fast-spiking interneurons, with distinct fast-spiking interneuron populations entrained to different gamma frequencies. Administration of the psychomotor stimulant amphetamine or the dopamine agonist apomorphine causes a prolonged decrease in approximately 50 Hz power and increase in approximately 80-100 Hz power. The same frequency switch is observed for shorter epochs spontaneously in awake, undrugged animals and is consistently provoked for reward receipt. Individual striatal neurons can participate in these brief high-gamma bursts with, or without, substantial changes in firing rate. Switching between discrete oscillatory states may allow different modes of information processing during decision-making and reinforcement-based learning, and may also be an important systems-level process by which stimulant drugs affect cognition and behavior.

  15. Review of the Shoreham Nuclear Power Station Probabilistic Risk Assessment: internal events and core damage frequency

    International Nuclear Information System (INIS)

    Ilberg, D.; Shiu, K.; Hanan, N.; Anavim, E.

    1985-11-01

A review of the Probabilistic Risk Assessment of the Shoreham Nuclear Power Station was conducted with the broad objective of evaluating its risks in relation to those identified in the Reactor Safety Study (WASH-1400). The scope of the review was limited to the "front end" part, i.e., to the evaluation of the frequencies of states in which core damage may occur. Furthermore, the review considered only internally generated accidents, consistent with the scope of the PRA. The review included an assessment of the assumptions and methods used in the Shoreham study. It also encompassed a reevaluation of the main results within the scope and general methodological framework of the Shoreham PRA, including both qualitative and quantitative analyses of accident initiators, data bases, and accident sequences which result in initiation of core damage. Specific comparisons are given between the Shoreham study, the results of the present review, and the WASH-1400 BWR, for the core damage frequency. The effect of modeling uncertainties was considered by a limited sensitivity study so as to show how the results would change if other assumptions were made. This review provides an independently assessed point value estimate of core damage frequency and describes the major contributors, by frontline systems and by accident sequences. 17 figs., 81 tabs

  16. An event-based account of conformity.

    Science.gov (United States)

    Kim, Diana; Hommel, Bernhard

    2015-04-01

    People often change their behavior and beliefs when confronted with deviating behavior and beliefs of others, but the mechanisms underlying such phenomena of conformity are not well understood. Here we suggest that people cognitively represent their own actions and others' actions in comparable ways (theory of event coding), so that they may fail to distinguish these two categories of actions. If so, other people's actions that have no social meaning should induce conformity effects, especially if those actions are similar to one's own actions. We found that female participants adjusted their manual judgments of the beauty of female faces in the direction consistent with distracting information without any social meaning (numbers falling within the range of the judgment scale) and that this effect was enhanced when the distracting information was presented in movies showing the actual manual decision-making acts. These results confirm that similarity between an observed action and one's own action matters. We also found that the magnitude of the standard conformity effect was statistically equivalent to the movie-induced effect. © The Author(s) 2015.

  17. Evolution in Intensity and Frequency of Extreme Events of Precipitation in Northeast Region and Brazilian Amazon in XXI Century

    Science.gov (United States)

    Fonseca, P. M.; Veiga, J. A.; Correia, F. S.; Brito, A. L.

    2013-05-01

The aim of this research was to evaluate changes in the frequency and intensity of extreme precipitation events in the Brazilian Amazon and the Northeast Region under a doubled CO2 concentration, following the IPCC A2 emissions scenario (Nakicenovic et al., 2001). The ETA model (Chou et al., 2011), forced with CCSM3 global model data (Meehl, 2006), was used to run four experiments covering January, February and March of the decades 1980-1990, 2000-2010, 2040-2050 and 2090-2100. Using the first decade (1980-1990) as a reference, changes in the following decades were evaluated with a methodology for classifying extreme events adapted from Frich (2002) and Gao (2006): the higher the class, the more intense the event. By the end of the XXI century, total precipitation in the Brazilian Amazon increased by 25%, with contributions from extreme events rising by 12% for type 1, 9% for type 2 and 10% for type 3. In contrast, precipitation in the Brazilian Northeast decreased by 17%, with pronounced declines of 24% and 15% in the contributions of type 1 and type 2 extreme events to total precipitation, respectively. Compared with the reference decade, the number of normal-type events increased by 4 to 6 thousand per decade in the three later decades; the corresponding decrease came mostly from Class 1 events, which fell by at least 3,500 events in each decade. This suggests an intensification of extreme events, since the amount of precipitation per class increased while the number of events per class decreased. For the Northeast region, the contribution of type 3 events increased by 9%, as did their frequency (about 700 more events). The largest decreases in the number of extreme events by class occurred in 2000-2010 for classes 1 and 3 (7.2% and 5.6%) and, by the end of the century, in class 3 (4.5%). For the three analyzed decades a total decrease of 8,400 events was

  18. Brain-computer interface based on intermodulation frequency

    Science.gov (United States)

    Chen, Xiaogang; Chen, Zhikai; Gao, Shangkai; Gao, Xiaorong

    2013-12-01

Objective. Most recent steady-state visual evoked potential (SSVEP)-based brain-computer interface (BCI) systems have used a single frequency for each target, so that a large number of targets requires a large number of stimulus frequencies and therefore a wider frequency band. However, human beings show good SSVEP responses only in a limited range of frequencies. Furthermore, this issue is especially problematic if the SSVEP-based BCI uses a PC monitor as a stimulator, which is only capable of generating a limited range of frequencies. To mitigate this issue, this study presents an innovative coding method for SSVEP-based BCI by means of intermodulation frequencies. Approach. Simultaneous modulation of stimulus luminance and color at different frequencies was utilized to induce intermodulation frequencies. Luminance flickered at relatively high frequencies (10, 12, 15 Hz), while color alternated at low frequencies (0.5, 1 Hz). An attractive feature of the proposed method is that it substantially increases the number of targets available at a single flickering frequency by altering the color-modulation frequency. Based on this method, the BCI system presented in this study realized eight targets using only three flickering frequencies. Main results. The online results obtained from 15 subjects (14 healthy and 1 with stroke) revealed that an average classification accuracy of 93.83% and an information transfer rate (ITR) of 33.80 bit min-1 were achieved using the proposed SSVEP-based BCI system. Specifically, 5 of the 15 subjects exhibited an ITR of 40.00 bit min-1 with a classification accuracy of 100%. Significance. These results suggest that intermodulation frequencies can be adopted as steady-state responses in BCI, and that the proposed system could serve as a practical BCI system.
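The intermodulation principle can be illustrated numerically: if the visual response contains any multiplicative interaction between the luminance flicker and the colour alternation, sidebands appear at the sum and difference frequencies. The toy nonlinearity below is an assumption for illustration only, not the authors' stimulation or analysis code:

```python
import numpy as np

fs = 600.0
t = np.arange(0, 10.0, 1 / fs)
f_lum, f_col = 12.0, 1.0  # luminance-flicker and color-alternation frequencies

# A simple multiplicative interaction between the two stimulus modulations --
# a hypothetical stand-in for the visual system's nonlinearity -- produces
# intermodulation components at f_lum +/- f_col.
response = np.sin(2 * np.pi * f_lum * t) * (1.0 + 0.5 * np.sin(2 * np.pi * f_col * t))

spectrum = np.abs(np.fft.rfft(response))
freqs = np.fft.rfftfreq(len(t), 1 / fs)

def amp(f):
    # Spectral amplitude at the FFT bin closest to frequency f
    return spectrum[np.argmin(np.abs(freqs - f))]

# Sidebands at 11 Hz and 13 Hz rise far above a neighbouring non-IM frequency.
print(amp(11.0) > 10 * amp(9.5), amp(13.0) > 10 * amp(9.5))
```

Each target in such a BCI is then tagged by a distinct sideband pair rather than by a distinct flicker frequency.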

  19. Very low frequency earthquakes (VLFEs) detected during episodic tremor and slip (ETS) events in Cascadia using a match filter method indicate repeating events

    Science.gov (United States)

    Hutchison, A. A.; Ghosh, A.

    2016-12-01

Very low frequency earthquakes (VLFEs) occur in transitional zones of faults, releasing seismic energy in the 0.02-0.05 Hz frequency band over a 90 s duration, and typically have magnitudes in the range Mw 3.0-4.0. VLFEs can occur down-dip of the seismogenic zone, where they can transfer stress up-dip, potentially bringing the locked zone closer to a critical failure stress. VLFEs also occur up-dip of the seismogenic zone, in a region along the plate interface that can rupture coseismically during large megathrust events, such as the 2011 Tohoku-Oki earthquake [Ide et al., 2011]. VLFEs were first detected in Cascadia during the 2011 episodic tremor and slip (ETS) event, occurring coincidentally with tremor [Ghosh et al., 2015]. However, during the 2014 ETS event, VLFEs were spatially and temporally asynchronous with tremor activity [Hutchison and Ghosh, 2016]. Such contrasting behaviors remind us that the mechanics behind these events remain elusive, yet they are responsible for the largest portion of the moment release during an ETS event. Here, we apply a match filter method using known VLFEs as template events to detect additional VLFEs. Using a grid-search centroid moment tensor inversion method, we invert stacks of the resulting match filter detections to ensure that the moment tensor solutions are similar to those of the respective template events. Our ability to successfully employ a match filter method for VLFE detection in Cascadia intrinsically indicates that these events can be repeating, implying that the same asperities are likely responsible for generating multiple VLFEs.
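A minimal sketch of the match-filter idea — sliding normalized cross-correlation of a template against continuous data, with a MAD-based detection threshold — is given below. The template shape, noise level, and threshold factor are illustrative assumptions, not the study's actual parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 1.0  # 1 Hz sampling is ample for the 0.02-0.05 Hz VLFE band

# Hypothetical 90 s VLFE template: a 0.03 Hz wavelet with a smooth envelope.
t = np.arange(0, 90, 1 / fs)
template = np.sin(2 * np.pi * 0.03 * t) * np.hanning(t.size)

# Continuous record: noise with two repeating copies of the template buried in it.
data = 0.3 * rng.standard_normal(3600)
for onset in (500, 2200):
    data[onset:onset + template.size] += template

def norm_xcorr(data, tmpl):
    # Sliding normalized cross-correlation coefficient in [-1, 1]
    n = tmpl.size
    tmpl = (tmpl - tmpl.mean()) / tmpl.std()
    out = np.empty(data.size - n + 1)
    for i in range(out.size):
        win = data[i:i + n]
        out[i] = np.dot(tmpl, (win - win.mean()) / win.std()) / n
    return out

cc = norm_xcorr(data, template)
# Detections where CC exceeds a threshold of 8 median absolute deviations.
thresh = np.median(cc) + 8 * np.median(np.abs(cc - np.median(cc)))
detections = np.flatnonzero(cc > thresh)
print(detections.min(), detections.max())  # clusters near the two onsets
```

In practice the correlation is summed over stations and components before thresholding, which is what makes weak repeating events detectable.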

  20. Towards the Realization of Graphene Based Flexible Radio Frequency Receiver

    Directory of Open Access Journals (Sweden)

    Maruthi N. Yogeesh

    2015-11-01

We report on our progress in the development of high-speed flexible graphene field effect transistors (GFETs) with high electron and hole mobilities (~3000 cm2/V·s) and intrinsic transit frequencies in the microwave GHz regime. We also describe the design and fabrication of a flexible graphene-based radio frequency system. This RF communication system consists of a graphite patch antenna at 2.4 GHz, a graphene-based frequency translation block (frequency doubler and AM demodulator) and a graphene speaker. These communication blocks are used to demonstrate a graphene-based amplitude-modulated (AM) radio receiver operating at 2.4 GHz.

  1. Framework for Modeling High-Impact, Low-Frequency Power Grid Events to Support Risk-Informed Decisions

    Energy Technology Data Exchange (ETDEWEB)

    Veeramany, Arun; Unwin, Stephen D.; Coles, Garill A.; Dagle, Jeffery E.; Millard, W. David; Yao, Juan; Glantz, Clifford S.; Gourisetti, Sri Nikhil Gup

    2015-12-03

    Natural and man-made hazardous events resulting in loss of grid infrastructure assets challenge the electric power grid’s security and resilience. However, the planning and allocation of appropriate contingency resources for such events requires an understanding of their likelihood and the extent of their potential impact. Where these events are of low likelihood, a risk-informed perspective on planning can be problematic as there exists an insufficient statistical basis to directly estimate the probabilities and consequences of their occurrence. Since risk-informed decisions rely on such knowledge, a basis for modeling the risk associated with high-impact low frequency events (HILFs) is essential. Insights from such a model can inform where resources are most rationally and effectively expended. The present effort is focused on development of a HILF risk assessment framework. Such a framework is intended to provide the conceptual and overarching technical basis for the development of HILF risk models that can inform decision makers across numerous stakeholder sectors. The North American Electric Reliability Corporation (NERC) 2014 Standard TPL-001-4 considers severe events for transmission reliability planning, but does not address events of such severity that they have the potential to fail a substantial fraction of grid assets over a region, such as geomagnetic disturbances (GMD), extreme seismic events, and coordinated cyber-physical attacks. These are beyond current planning guidelines. As noted, the risks associated with such events cannot be statistically estimated based on historic experience; however, there does exist a stable of risk modeling techniques for rare events that have proven of value across a wide range of engineering application domains. There is an active and growing interest in evaluating the value of risk management techniques in the State transmission planning and emergency response communities, some of this interest in the context of

  2. Water based fluidic radio frequency metamaterials

    Science.gov (United States)

    Cai, Xiaobing; Zhao, Shaolin; Hu, Mingjun; Xiao, Junfeng; Zhang, Naibo; Yang, Jun

    2017-11-01

Electromagnetic metamaterials offer great flexibility for wave manipulation and enable exceptional functionality design, ranging from negative refraction, anomalous reflection, super-resolution imaging and transformation optics to cloaking. However, demonstration of metamaterials with unprecedented functionalities is still challenging and costly due to structural complexity or special material properties. Here, we demonstrate for the first time versatile fluidic radio frequency metamaterials with negative refraction using a water-embedded and metal-coated 3D architecture. Effective medium analysis confirms that the metallic frames create an evanescent environment while the water cylinders simultaneously produce negative permeability under Mie resonance. The water-metal coupled 3D architectures and the accessory devices for measurement are fabricated by 3D printing with subsequent electroless deposition. Our study also reveals the great potential of fluidic metamaterials and the versatility of the 3D printing process in rapid prototyping of customized metamaterials.

  3. Comparison between Japan and the United States in the frequency of events in equipment and components at nuclear power plants

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2007-01-01

The Institute of Nuclear Safety System, Incorporated (INSS) conducted trend analyses until 2005 to compare the frequency of events in certain electrical components and instrumentation components at nuclear power plants between Japan and the United States. The results revealed that events have occurred approximately an order of magnitude less often in Japan than in the United States. This paper compares Japan and the United States in more detail in terms of how often events - events reported under the reporting standards of the Nuclear Information Archive (NUCIA) or the Institute of Nuclear Power Operations (INPO) - occurred in electrical components, instrumentation components and mechanical components at nuclear power plants. The results were as follows: (1) In regard to electrical components and instrumentation components, events have occurred about one-eighth as often in Japan as in the United States, consistent with the previous results. (2) Events have occurred more often in mechanical components than in electrical and instrumentation components in both Japan and the United States, and the difference between the two countries in the frequency of events in mechanical components was smaller. (3) Regarding mechanical components, events in the pipes of critical systems and equipment, such as reactor coolant systems, emergency core cooling systems, instrument and control systems, ventilating and air-conditioning systems, and turbine equipment, have occurred more often in Japan than in the United States. (4) The above observations suggest that there is little scope for reducing the frequency of events in electrical components and instrumentation components, but that the frequency of events in mechanical components, such as pipes for main systems like emergency core cooling systems and turbine equipment in the case of PWRs, could be reduced by re-examining inspection methods and intervals. (author)

  4. Event-based Simulation Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    De Raedt, H.; Michielsen, K.; Jaeger, G; Khrennikov, A; Schlosshauer, M; Weihs, G

    2011-01-01

    We present a corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one. The event-based corpuscular model gives a unified

  5. Event-Based Corpuscular Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    Michielsen, K.; Jin, F.; Raedt, H. De

    A corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one is presented. The event-based corpuscular model is shown to give a

  6. IBES: A Tool for Creating Instructions Based on Event Segmentation

    Directory of Open Access Journals (Sweden)

    Katharina eMura

    2013-12-01

Receiving informative, well-structured, and well-designed instructions supports performance and memory in assembly tasks. We describe IBES, a tool with which users can quickly and easily create multimedia, step-by-step instructions by segmenting a video of a task into segments. In a validation study we demonstrate that the step-by-step structure of the visual instructions created by the tool corresponds to the natural event boundaries, which are assessed by event segmentation and are known to play an important role in memory processes. In one part of the study, twenty participants created instructions based on videos of two different scenarios by using the proposed tool. In the other part of the study, ten and twelve participants respectively segmented videos of the same scenarios yielding event boundaries for coarse and fine events. We found that the visual steps chosen by the participants for creating the instruction manual had corresponding events in the event segmentation. The number of instructional steps was a compromise between the number of fine and coarse events. Our interpretation of results is that the tool picks up on natural human event perception processes of segmenting an ongoing activity into events and enables the convenient transfer into meaningful multimedia instructions for assembly tasks. We discuss the practical application of IBES, for example, creating manuals for differing expertise levels, and give suggestions for research on user-oriented instructional design based on this tool.

  7. IBES: a tool for creating instructions based on event segmentation.

    Science.gov (United States)

    Mura, Katharina; Petersen, Nils; Huff, Markus; Ghose, Tandra

    2013-12-26

    Receiving informative, well-structured, and well-designed instructions supports performance and memory in assembly tasks. We describe IBES, a tool with which users can quickly and easily create multimedia, step-by-step instructions by segmenting a video of a task into segments. In a validation study we demonstrate that the step-by-step structure of the visual instructions created by the tool corresponds to the natural event boundaries, which are assessed by event segmentation and are known to play an important role in memory processes. In one part of the study, 20 participants created instructions based on videos of two different scenarios by using the proposed tool. In the other part of the study, 10 and 12 participants respectively segmented videos of the same scenarios yielding event boundaries for coarse and fine events. We found that the visual steps chosen by the participants for creating the instruction manual had corresponding events in the event segmentation. The number of instructional steps was a compromise between the number of fine and coarse events. Our interpretation of results is that the tool picks up on natural human event perception processes of segmenting an ongoing activity into events and enables the convenient transfer into meaningful multimedia instructions for assembly tasks. We discuss the practical application of IBES, for example, creating manuals for differing expertise levels, and give suggestions for research on user-oriented instructional design based on this tool.

  8. Circumvention of noise contributions in fiber laser based frequency combs.

    Science.gov (United States)

    Benkler, Erik; Telle, Harald; Zach, Armin; Tauser, Florian

    2005-07-25

    We investigate the performance of an Er:fiber laser based femtosecond frequency comb for precision metrological applications. Instead of an active stabilization of the comb, the fluctuations of the carrier-envelope offset phase, the repetition phase, and the phase of the beat from a comb line with an optical reference are synchronously detected. We show that these fluctuations can be effectively eliminated by exploiting their known correlation. In our experimental scheme, we utilize two identically constructed frequency combs for the measurement of the fluctuations, rejecting the influence of a shared optical reference. From measuring a white frequency noise level, we demonstrate that a fractional frequency instability better than 1.4 x 10(-14) for 1 s averaging time can be achieved in frequency metrology applications using the Er:fiber based frequency comb.
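The correlation exploited by this scheme follows directly from the comb equation. With repetition rate \(f_{\mathrm{rep}}\) and carrier-envelope offset \(f_{\mathrm{CEO}}\), comb line \(n\) sits at a known combination of the two measured quantities, so its fluctuations can be predicted and subtracted:

```latex
f_n = n\, f_{\mathrm{rep}} + f_{\mathrm{CEO}},
\qquad
f_{\mathrm{beat}} = f_{\mathrm{opt}} - f_n,
\qquad
\delta f_{\mathrm{beat}} = \delta f_{\mathrm{opt}}
  - n\, \delta f_{\mathrm{rep}} - \delta f_{\mathrm{CEO}} .
```

Synchronously recorded fluctuations of \(f_{\mathrm{rep}}\) and \(f_{\mathrm{CEO}}\) can therefore be removed from the beat record in post-processing instead of being servoed to zero by active stabilization.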

9. The determination of random event-rate based on counter live-time measurement

    Energy Technology Data Exchange (ETDEWEB)

    Radeka, V [Institut Rudjer Boskovic, Zagreb, Yugoslavia (Croatia)

    1962-04-15

The method of determining the true rate of events generated by a random process based on a counting device and live-time measurement is analysed. The determined rate is basically independent of the counter-resolving time. It is shown that the error caused by the resolving time of an event-to-pulse converter at the input of the system is substantially lower than the actual reduction of the rate by the converter itself. Live-time measurement error is discussed with respect to the application limit of the method. The analysis given may be applied to pulse-height analysers and counters using live-time measurement. The method can simply be realized in pulse-height analysers and counters with electronic timers. (author)
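The arithmetic of the live-time method can be shown with hypothetical numbers. Note that the live-time estimate needs no model of the counter's resolving time, whereas the classical non-paralyzable dead-time correction does; τ below is an assumed value chosen so the two agree:

```python
# Live-time method: divide registered counts by measured live time.
counts = 90_000        # registered events (hypothetical)
real_time = 100.0      # elapsed time, s
live_time = 91.0       # measured live time, s (real time minus dead time)

rate_live = counts / live_time     # no resolving-time model needed
print(round(rate_live, 1))         # -> 989.0

# Classical non-paralyzable correction needs the resolving time tau:
tau = 1.0e-4                       # s, assumed counter resolving time
m = counts / real_time             # observed (apparent) rate
rate_tau = m / (1 - m * tau)       # corrected rate
print(round(rate_tau, 1))          # -> 989.0
```

The agreement is exact only because the numbers were chosen consistently; in practice the live-time estimate sidesteps the need to know τ at all, which is the method's point.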

  10. Assessing loss event frequencies of smart grid cyber threats: Encoding flexibility into FAIR using Bayesian network approach

    NARCIS (Netherlands)

    Le, Anhtuan; Chen, Yue; Chai, Kok Keong; Vasenev, Alexandr; Montoya, L.

    Assessing loss event frequencies (LEF) of smart grid cyber threats is essential for planning cost-effective countermeasures. Factor Analysis of Information Risk (FAIR) is a well-known framework that can be applied to consider threats in a structured manner by using look-up tables related to a
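In FAIR, loss event frequency is the product of threat event frequency and vulnerability. A Monte Carlo sketch with hypothetical triangular expert estimates (not values from the paper, which uses look-up tables and a Bayesian network) illustrates how an LEF distribution is obtained:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# FAIR decomposes loss event frequency as LEF = TEF x Vulnerability, where
# TEF is threat event frequency (events/year) and Vulnerability is the
# probability that a threat event becomes a loss event. The (min, mode, max)
# triples below are hypothetical expert estimates for one smart-grid threat.
tef = rng.triangular(0.5, 2.0, 6.0, n)      # threat events per year
vuln = rng.triangular(0.05, 0.2, 0.5, n)    # P(threat event -> loss event)

lef = tef * vuln
print(round(float(np.mean(lef)), 2), round(float(np.quantile(lef, 0.95)), 2))
```

Replacing the independent draws with samples conditioned on parent nodes is what the Bayesian-network extension of FAIR adds.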

  11. Do changes in the frequency, magnitude and timing of extreme climatic events threaten the population viability of coastal birds?

    NARCIS (Netherlands)

    van de Pol, Martijn; Ens, Bruno J.; Heg, Dik; Brouwer, Lyanne; Krol, Johan; Maier, Martin; Exo, Klaus-Michael; Oosterbeek, Kees; Lok, Tamar; Eising, Corine M.; Koffijberg, Kees

1. Climate change encompasses changes in both the means and the extremes of climatic variables, but the population consequences of the latter are intrinsically difficult to study. 2. We investigated whether the frequency, magnitude and timing of rare but catastrophic flooding events have changed

  12. On frequency-weighted coprime factorization based controller reduction

    OpenAIRE

    Varga, Andras

    2003-01-01

    We consider the efficient solution of a class of coprime factorization based controller approximation problems by using frequency-weighted balancing related model reduction approaches. It is shown that for some special stability enforcing frequency-weights, the computation of the frequency-weighted controllability and observability grammians can be done by solving reduced order Lyapunov equations. The new approach can be used in conjunction with accuracy enhancing square-root and balancing-fr...
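The grammian computation at the heart of such balancing methods reduces to Lyapunov equations. A minimal numerical check, with an arbitrary stable example system rather than one from the paper, is:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# The controllability grammian P of a stable LTI system (A, B) solves
#   A P + P A^T + B B^T = 0.
# Frequency-weighted balancing reduces to such (possibly reduced-order)
# Lyapunov equations for the special stability-enforcing weights.
A = np.array([[-1.0, 0.0],
              [ 1.0, -3.0]])
B = np.array([[1.0],
              [0.0]])

P = solve_continuous_lyapunov(A, -B @ B.T)

# The Lyapunov residual should vanish and P should be positive definite.
print(np.allclose(A @ P + P @ A.T + B @ B.T, 0.0))
```

Square-root balancing then works with Cholesky factors of P (and of the observability grammian) for better numerical accuracy, as the abstract's "accuracy enhancing" variants do.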

  13. Power quality events recognition using a SVM-based method

    Energy Technology Data Exchange (ETDEWEB)

    Cerqueira, Augusto Santiago; Ferreira, Danton Diego; Ribeiro, Moises Vidal; Duque, Carlos Augusto [Department of Electrical Circuits, Federal University of Juiz de Fora, Campus Universitario, 36036 900, Juiz de Fora MG (Brazil)

    2008-09-15

In this paper, a novel SVM-based method for power quality event classification is proposed. A simple approach to feature extraction is introduced, based on subtracting the fundamental component from the acquired voltage signal. The resulting signal is presented to a support vector machine for event classification. Simulation results are presented and compared with two other methods, the OTFR and the LCEC. The proposed method showed improved performance at a reasonable computational cost. (author)
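A sketch of the proposed pipeline — least-squares fit and subtraction of the fundamental, simple residual features, then an SVM — using synthetic signals and assumed parameters (sampling rate, harmonic level, feature choice) rather than the paper's setup:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(2)
fs, f0, n = 3840, 60.0, 640   # hypothetical setup: ~10 cycles of 60 Hz
t = np.arange(n) / fs

def remove_fundamental(x):
    # Least-squares fit of the fundamental (arbitrary phase) and subtraction,
    # as in the abstract's feature-extraction idea; details are assumptions.
    M = np.column_stack([np.sin(2 * np.pi * f0 * t), np.cos(2 * np.pi * f0 * t)])
    coef, *_ = np.linalg.lstsq(M, x, rcond=None)
    return x - M @ coef

def make_signal(harmonic):
    x = np.sin(2 * np.pi * f0 * t + rng.uniform(0, 2 * np.pi))
    if harmonic:  # inject a 5th-harmonic distortion event
        x += 0.2 * np.sin(2 * np.pi * 5 * f0 * t)
    return x + 0.01 * rng.standard_normal(n)

def features(x):
    r = remove_fundamental(x)
    return [np.sqrt(np.mean(r ** 2)), np.max(np.abs(r))]  # residual RMS, peak

X = [features(make_signal(k)) for k in (0, 1) for _ in range(100)]
y = [k for k in (0, 1) for _ in range(100)]
clf = SVC(kernel="rbf").fit(X[::2], y[::2])     # train on half the examples
print(clf.score(X[1::2], y[1::2]))              # held-out accuracy
```

Subtracting the fundamental concentrates the disturbance energy in the residual, which is why even two scalar features separate the classes here.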

  14. Prestress Force Identification for Externally Prestressed Concrete Beam Based on Frequency Equation and Measured Frequencies

    Directory of Open Access Journals (Sweden)

    Luning Shi

    2014-01-01

A prestress force identification method for an externally prestressed concrete uniform beam, based on the frequency equation and measured frequencies, is developed. To ensure identification accuracy, we first look for an appropriate method to solve the free vibration equation of an externally prestressed concrete beam and then combine the measured frequencies with the frequency equation to identify the prestress force. To obtain the exact solution of the free vibration equation of a multispan externally prestressed concrete beam, an analytical model is set up based on Bernoulli-Euler beam theory, and the functional relation between prestress variation and vibration displacement is established. The multispan beam is treated as multiple single-span beams that must satisfy the bending moment and rotation angle boundary conditions; the free vibration equation is solved using the sublevel simultaneous method, and a semi-analytical solution that accounts for the influence of prestress on section rigidity and beam length is obtained. Taking a simply supported concrete beam and a two-span concrete beam with external tendons as examples, frequency function curves are obtained by substituting the measured frequencies, and the prestress force is identified from the abscissa of the crosspoint of the frequency functions. The identified prestress force is in good agreement with the test results. The method can accurately identify the prestress force of externally prestressed concrete beams and trace the trend of the effective prestress force.
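The abstract does not reproduce its frequency equation. For orientation, the textbook Bernoulli-Euler relation for a single simply supported span of length \(L\) under axial force \(N\) shows how measured frequencies map to the axial (prestress) force; the paper's multispan equation is the analogous relation assembled from the boundary conditions described above:

```latex
f_n = \frac{n^2 \pi}{2 L^2}\,\sqrt{\frac{EI}{\rho A}}\,
      \sqrt{1 - \frac{N L^2}{n^2 \pi^2 EI}}
\quad\Longrightarrow\quad
N = \frac{n^2 \pi^2 EI}{L^2}
    \left[ 1 - \left( \frac{f_n}{f_{n,0}} \right)^{2} \right],
```

where \(EI\) is the flexural rigidity, \(\rho A\) the mass per unit length, and \(f_{n,0}\) the \(n\)th natural frequency at zero axial force. Inverting the measured \(f_n\) through such a relation is the identification step the abstract describes graphically as finding the crosspoint of frequency functions.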

  15. Silicon-Chip-Based Optical Frequency Combs

    Science.gov (United States)

    2015-10-26

... fiber-based polarization controllers and a polarization beam splitter, and the output power is monitored with a sensitive photodiode. ... a single CW laser beam coupled to a microresonator can produce stabilized, octave-spanning combs through highly cascaded four-wave mixing (FWM). ... resonator designs, the resonator and the coupling waveguide are monolithically integrated. Thus, the entire on-chip configuration of CMOS-compatible

  16. A Kalman-based Fundamental Frequency Estimation Algorithm

    DEFF Research Database (Denmark)

    Shi, Liming; Nielsen, Jesper Kjær; Jensen, Jesper Rindom

    2017-01-01

    Fundamental frequency estimation is an important task in speech and audio analysis. Harmonic model-based methods typically have superior estimation accuracy. However, such methods usually assume that the fundamental frequency and amplitudes are stationary over a short time frame. In this pape...
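The harmonic model underlying such estimators can be illustrated with a crude grid search (this is not the paper's Kalman filter, just the stationary harmonic model it builds on): pick the candidate fundamental whose harmonics capture the most signal power.

```python
import cmath
import math

def harmonic_power(x, f0, fs, n_harm=3):
    """Power of x projected onto the first n_harm harmonics of f0."""
    total = 0.0
    for h in range(1, n_harm + 1):
        w = -2j * math.pi * h * f0 / fs
        total += abs(sum(s * cmath.exp(w * t) for t, s in enumerate(x))) ** 2
    return total / len(x)

def estimate_f0(x, fs, candidates):
    """Grid-search fundamental frequency estimate over a candidate set."""
    return max(candidates, key=lambda f0: harmonic_power(x, f0, fs))

# 0.2 s of a 100 Hz tone plus its second harmonic, sampled at 1 kHz
fs = 1000.0
x = [math.sin(2 * math.pi * 100 * t / fs)
     + 0.5 * math.sin(2 * math.pi * 200 * t / fs) for t in range(200)]
f0_hat = estimate_f0(x, fs, [50.0, 80.0, 100.0, 120.0, 150.0])
```

Note that including several harmonics is what rejects the 50 Hz subharmonic, which would also match some of the signal energy.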

  17. Do the frequencies of adverse events increase, decrease, or stay the same with long-term use of statins?

    Science.gov (United States)

    Huddy, Karlyn; Dhesi, Pavittarpaul; Thompson, Paul D

    2013-02-01

    Statins are widely used for their cholesterol-lowering properties and proven reduction of cardiovascular disease risk. Many patients take statins as long-term treatment for a variety of conditions without a clear-cut understanding of how treatment duration affects the frequency of adverse effects. We aimed to evaluate whether the frequencies of documented adverse events increase, decrease, or remain unchanged with long-term statin use. We reviewed the established literature to define the currently known adverse effects of statin therapy, including myopathy, central nervous system effects, and the appearance of diabetes, and the frequency of these events with long-term medication use. The frequency of adverse effects associated with long-term statin therapy appears to be low. Many patients who develop side effects from statin therapy do so relatively soon after initiation of therapy, so the frequency of side effects from statin therapy when expressed as a percentage of current users decreases over time. Nevertheless, patients may develop side effects such as muscle pain and weakness years after starting statin therapy; however, the absolute number of patients affected by statin myopathy increases with treatment duration. Also, clinical trials of statin therapy rarely exceed 5 years, so it is impossible to determine with certainty the frequency of long-term side effects with these drugs.

  18. Spatiotemporal Features for Asynchronous Event-based Data

    Directory of Open Access Journals (Sweden)

    Xavier eLagorce

    2015-02-01

    Full Text Available Bio-inspired asynchronous event-based vision sensors are currently introducing a paradigm shift in visual information processing. These new sensors rely on a stimulus-driven principle of light acquisition similar to biological retinas. They are event-driven and fully asynchronous, thereby reducing redundancy and encoding exact times of input signal changes, leading to a very precise temporal resolution. Approaches for higher-level computer vision often rely on the reliable detection of features in visual frames, but similar definitions of features for the novel dynamic and event-based visual input representation of silicon retinas have so far been lacking. This article addresses the problem of learning and recognizing features for event-based vision sensors, which capture properties of truly spatiotemporal volumes of sparse visual event information. A novel computational architecture for learning and encoding spatiotemporal features is introduced based on a set of predictive recurrent reservoir networks, competing via winner-take-all selection. Features are learned in an unsupervised manner from real-world input recorded with event-based vision sensors. It is shown that the networks in the architecture learn distinct and task-specific dynamic visual features, and can predict their trajectories over time.

  19. SUBTLEX- AL: Albanian word frequencies based on film subtitles

    Directory of Open Access Journals (Sweden)

    Dr.Sc. Rrezarta Avdyli

    2013-06-01

    Full Text Available Recently, several studies have shown that word frequency estimates based on subtitle files explain the variance in word recognition performance better than traditional word frequency estimates. The present study presents such a frequency estimate for Albanian, built from more than 2M words of film subtitles. Our results show a high correlation between the RTs from a lexical decision study (120 stimuli) and SUBTLEX-AL, as well as a high correlation between SUBTLEX-AL and the only existing frequency list of the hundred most frequent Albanian words. These findings suggest that SUBTLEX-AL is a good frequency estimate; furthermore, it is the first Albanian frequency database larger than 100 words.
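A subtitle-based frequency norm of this kind boils down to tokenizing subtitle files and counting. A minimal sketch (the letters-only tokenizer is a naive assumption, not the SUBTLEX-AL pipeline):

```python
import re
from collections import Counter

def subtitle_frequencies(files):
    """Word counts over a corpus of subtitle files (each a plain string);
    SRT cue numbers and timestamps contain no letters, so a letters-only
    tokenizer skips them automatically."""
    counts = Counter()
    for text in files:
        counts.update(re.findall(r"[a-zëç]+", text.lower()))
    return counts

# Two tiny invented SRT snippets
corpus = [
    "1\n00:00:01,000 --> 00:00:03,000\nMirë dita, mirë",
    "2\n00:00:04,000 --> 00:00:06,000\nDita e re",
]
freq = subtitle_frequencies(corpus)  # e.g. freq["dita"] == 2
```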

  20. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes documents in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses...... for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs....

  1. Ontology-based prediction of surgical events in laparoscopic surgery

    Science.gov (United States)

    Katić, Darko; Wekerle, Anna-Laura; Gärtner, Fabian; Kenngott, Hannes; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie

    2013-03-01

    Context-aware technologies have great potential to help surgeons during laparoscopic interventions. Their underlying idea is to create systems which can adapt their assistance functions automatically to the situation in the OR, thus relieving surgeons from the burden of managing computer assisted surgery devices manually. For this purpose, a certain kind of understanding of the current situation in the OR is essential. Beyond that, anticipatory knowledge of incoming events is beneficial, e.g. for early warnings of imminent risk situations. To achieve the goal of predicting surgical events based on previously observed ones, we developed a language to describe surgeries and surgical events using Description Logics and integrated it with methods from computational linguistics. Using n-Grams to compute probabilities of follow-up events, we are able to make sensible predictions of upcoming events in real time. The system was evaluated on professionally recorded and labeled surgeries and showed an average prediction rate of 80%.
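The n-gram step can be sketched independently of the ontology (the event labels and surgery logs below are invented for illustration): bigram counts over labeled surgeries give P(next event | current event).

```python
from collections import Counter, defaultdict

def train_bigrams(surgeries):
    """Count event-to-event transitions; each surgery is a list of labels."""
    counts = defaultdict(Counter)
    for events in surgeries:
        for cur, nxt in zip(events, events[1:]):
            counts[cur][nxt] += 1
    return counts

def predict_next(counts, cur):
    """Most probable follow-up event and its estimated probability."""
    total = sum(counts[cur].values())
    event, n = counts[cur].most_common(1)[0]
    return event, n / total

# Hypothetical labeled surgery recordings
logs = [
    ["incision", "dissection", "clipping", "cutting"],
    ["incision", "dissection", "coagulation", "cutting"],
    ["incision", "dissection", "clipping", "cutting"],
]
model = train_bigrams(logs)
```

After "dissection", the model predicts "clipping" with probability 2/3; higher-order n-grams follow the same pattern with longer context tuples as keys.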

  2. Prediction problem for target events based on the inter-event waiting time

    Science.gov (United States)

    Shapoval, A.

    2010-11-01

    In this paper we address the problem of forecasting the target events of a time series given the distribution ξ of time gaps between target events. Strong earthquakes and stock market crashes are the two types of such events that we focus on. In the series of earthquakes, as McCann et al. show [W.R. McCann, S.P. Nishenko, L.R. Sykes, J. Krause, Seismic gaps and plate tectonics: seismic potential for major boundaries, Pure and Applied Geophysics 117 (1979) 1082-1147], there are well-defined gaps (called seismic gaps) between strong earthquakes. On the other hand, there are usually no regular gaps in the series of stock market crashes [M. Raberto, E. Scalas, F. Mainardi, Waiting-times and returns in high-frequency financial data: an empirical study, Physica A 314 (2002) 749-755]. For the case of seismic gaps, we analytically derive an upper bound on prediction efficiency given the coefficient of variation of the distribution ξ. For the case of stock market crashes, we develop an algorithm that predicts the next crash within a certain time interval after the previous one. We show that this algorithm outperforms random prediction. The efficiency of our algorithm sets a lower bound on the efficiency of effective prediction of stock market crashes.
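A waiting-time predictor of this family can be sketched as follows (a simplification, not the paper's algorithm): after each target event, declare an alarm during a central quantile band of the empirical gap distribution, and score the fraction of subsequent events falling inside it.

```python
def alarm_window(gaps, lo_q=0.25, hi_q=0.75):
    """Alarm interval taken from empirical quantiles of the waiting-time
    distribution: after an event, predict the next one inside this band."""
    ordered = sorted(gaps)
    n = len(ordered)
    return ordered[int(lo_q * (n - 1))], ordered[int(hi_q * (n - 1))]

def hit_rate(gaps, window):
    """Fraction of actual waiting times falling inside the alarm window."""
    lo, hi = window
    return sum(lo <= g <= hi for g in gaps) / len(gaps)

# Hypothetical inter-event gaps (days); one heavy outlier mimics the
# irregular gaps typical of crash series
gaps = [30, 32, 35, 40, 28, 33, 90, 31]
window = alarm_window(gaps)      # (30, 35)
score = hit_rate(gaps, window)   # 5 of the 8 gaps are caught
```

A narrow band costs little alarm time but misses events; prediction is easier the smaller the coefficient of variation of the gaps, which is the quantity the paper's analytical bound is expressed in.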

  3. Effects of the major sudden stratospheric warming event of 2009 on the subionospheric very low frequency/low frequency radio signals

    Science.gov (United States)

    Pal, S.; Hobara, Y.; Chakrabarti, S. K.; Schnoor, P. W.

    2017-07-01

    This paper presents effects of the major sudden stratospheric warming (SSW) event of 2009 on the subionospheric very low frequency/low frequency (VLF/LF) radio signals propagating in the Earth-ionosphere waveguide. Signal amplitudes from four transmitters received by VLF/LF radio networks of Germany and Japan corresponding to the major SSW event are investigated for possible anomalies and atmospheric influence on the high- to middle-latitude ionosphere. Significant anomalous increases or decreases of nighttime and daytime amplitudes of VLF/LF signals by ~3-5 dB during the SSW event have been found for all propagation paths, associated with the stratospheric temperature rise at the 10 hPa level. The increase or decrease in VLF/LF amplitudes during daytime and nighttime is due to the modification of the lower ionospheric boundary conditions in terms of electron density and electron-neutral collision frequency profiles and associated modal interference effects between the different propagating waveguide modes during the SSW period. TIMED/SABER mission data are also used to investigate the upper mesospheric conditions over the VLF/LF propagation path during the same time period. We observe a decrease in neutral temperature and an increase in pressure at the height of 75-80 km around the peak time of the event. VLF/LF anomalies are correlated and in phase with the stratospheric temperature and mesospheric pressure variations, while the minimum of mesospheric cooling shows a 2-3 day delay relative to the maximum VLF/LF anomalies. Simulations of VLF/LF diurnal variation are performed using the well-known Long Wavelength Propagation Capability (LWPC) code within the Earth-ionosphere waveguide to explain the VLF/LF anomalies qualitatively.

  4. Rates for parallax-shifted microlensing events from ground-based observations of the galactic bulge

    International Nuclear Information System (INIS)

    Buchalter, A.; Kamionkowski, M.

    1997-01-01

    The parallax effect in ground-based microlensing (ML) observations consists of a distortion to the standard ML light curve arising from the Earth's orbital motion. This can be used to partially remove the degeneracy among the system parameters in the event timescale, t0. In most cases, the resolution in current ML surveys is not accurate enough to observe this effect, but parallax could conceivably be detected with frequent follow-up observations of ML events in progress, provided the photometric errors are small enough. We calculate the expected fraction of ML events where the shape distortions will be observable by such follow-up observations, adopting Galactic models for the lens and source distributions that are consistent with observed microlensing timescale distributions. We study the dependence of the rates for parallax-shifted events on the frequency of follow-up observations and on the precision of the photometry. For example, we find that for hourly observations with typical photometric errors of 0.01 mag, 6% of events where the lens is in the bulge, and 31% of events where the lens is in the disk (or ∼10% of events overall), will give rise to a measurable parallax shift at the 95% confidence level. These fractions may be increased by improved photometric accuracy and increased sampling frequency. While long-duration events are favored, the surveys would be effective in picking out such distortions in events with timescales as low as t0 ∼ 20 days. We study the dependence of these fractions on the assumed disk mass function and find that a higher parallax incidence is favored by mass functions with higher mean masses. Parallax measurements yield the reduced transverse speed, v, which gives both the relative transverse speed and lens mass as a function of distance. We give examples of the accuracies with which v may be measured in typical parallax events. (Abstract Truncated)

  5. Multi Agent System Based Wide Area Protection against Cascading Events

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Liu, Leo

    2012-01-01

    In this paper, a multi-agent system based wide area protection scheme is proposed in order to prevent long term voltage instability induced cascading events. The distributed relays and controllers work as a device agent which not only executes the normal function automatically but also can...... the effectiveness of the proposed protection strategy. The simulation results indicate that the proposed multi-agent control system can effectively coordinate the distributed relays and controllers to prevent the long term voltage instability induced cascading events....

  6. Preventing Medication Error Based on Knowledge Management Against Adverse Event

    OpenAIRE

    Hastuti, Apriyani Puji; Nursalam, Nursalam; Triharini, Mira

    2017-01-01

    Introduction: Medication error is one of many types of errors that can decrease the quality and safety of healthcare. An increasing number of adverse events (AE) reflects the number of medication errors. This study aimed to develop a model of medication error prevention based on knowledge management. This model is expected to improve the knowledge and skill of nurses to prevent medication error, which is characterized by a decrease of adverse events (AE). Methods: This study consisted of two sta...

  7. A ROOT based event display software for JUNO

    Science.gov (United States)

    You, Z.; Li, K.; Zhang, Y.; Zhu, J.; Lin, T.; Li, W.

    2018-02-01

    An event display software SERENA has been designed for the Jiangmen Underground Neutrino Observatory (JUNO). The software has been developed in the JUNO offline software system and is based on the ROOT display package EVE. It provides an essential tool to display detector and event data for better understanding of the processes in the detectors. The software has been widely used in JUNO detector optimization, simulation, reconstruction and physics study.

  8. Abstracting event-based control models for high autonomy systems

    Science.gov (United States)

    Luh, Cheng-Jye; Zeigler, Bernard P.

    1993-01-01

    A high autonomy system needs many models on which to base control, management, design, and other interventions. These models differ in level of abstraction and in formalism. Concepts and tools are needed to organize the models into a coherent whole. The paper deals with the abstraction processes for systematic derivation of related models for use in event-based control. The multifaceted modeling methodology is briefly reviewed. The morphism concepts needed for application to model abstraction are described. A theory for supporting the construction of DEVS models needed for event-based control is then presented. An implemented morphism on the basis of this theory is also described.

  9. Seasonal variability of stream water quality response to storm events captured using high-frequency and multi-parameter data

    Science.gov (United States)

    Fovet, O.; Humbert, G.; Dupas, R.; Gascuel-Odoux, C.; Gruau, G.; Jaffrezic, A.; Thelusma, G.; Faucheux, M.; Gilliet, N.; Hamon, Y.; Grimaldi, C.

    2018-04-01

    The response of stream chemistry to storms is of major interest for understanding the export of dissolved and particulate species from catchments. The related challenge is the identification of the hydrological flow paths active during these events and of the sources of the chemical elements for which these events are hot moments of export. An original four-year data set that combines high-frequency records of stream flow, turbidity, nitrate and dissolved organic carbon concentrations, and piezometric levels was used to characterize storm responses in a headwater agricultural catchment. The data set was used to test to what extent the shallow groundwater influenced the variability of storm responses. A total of 177 events were described using a set of quantitative and functional descriptors related to precipitation, stream and groundwater pre-event status and event dynamics, and to the relative dynamics between water quality parameters and flow via hysteresis indices. This approach identified different types of response for each water quality parameter, whose occurrence can be quantified and related to the seasonal functioning of the catchment. This study demonstrates that high-frequency records of water quality are unique in their ability to reveal the variability of catchment storm responses.
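A hysteresis index of the kind used to classify such responses can be sketched as follows (one simple variant among several in the literature; the storm data are invented): compare normalized concentration on the rising vs falling limb at mid-range flow, with positive values indicating a clockwise loop.

```python
def hysteresis_index(flow, conc):
    """Normalized concentration difference (rising minus falling limb) at
    mid-range flow; positive -> clockwise loop (concentration peaks early)."""
    def norm(xs):
        lo, hi = min(xs), max(xs)
        return [(x - lo) / (hi - lo) for x in xs]

    def interp(q_target, qs, cs):
        # linear interpolation of concentration at q_target along one limb
        pairs = sorted(zip(qs, cs))
        for (q0, c0), (q1, c1) in zip(pairs, pairs[1:]):
            if q0 <= q_target <= q1:
                w = 0.0 if q1 == q0 else (q_target - q0) / (q1 - q0)
                return c0 + w * (c1 - c0)
        return pairs[-1][1]

    peak = flow.index(max(flow))
    qn, cn = norm(flow), norm(conc)
    c_rise = interp(0.5, qn[:peak + 1], cn[:peak + 1])
    c_fall = interp(0.5, qn[peak:], cn[peak:])
    return c_rise - c_fall

# One synthetic storm: flow rises then recedes; concentration flushes early
flow = [1, 2, 3, 4, 5, 4, 3, 2, 1]
conc = [0.2, 0.6, 0.8, 0.9, 1.0, 0.7, 0.5, 0.3, 0.1]
h_index = hysteresis_index(flow, conc)  # > 0: clockwise hysteresis
```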

  10. Effect of lunar phase on frequency of psychogenic nonepileptic events in the EMU.

    Science.gov (United States)

    Bolen, Robert D; Campbell, Zeke; Dennis, William A; Koontz, Elizabeth H; Pritchard, Paul B

    2016-06-01

    Studies of the effect of a full moon on seizures have yielded mixed results, despite a continuing prevailing belief regarding the association of lunar phase with human behavior. The potential effect of a full moon on psychogenic nonepileptic events has not been as well studied, despite what anecdotal accounts from most epilepsy monitoring unit (EMU) staff would suggest. We obtained the dates and times of all events from patients diagnosed with psychogenic nonepileptic events discharged from our EMU over a two-year period. The events were then plotted on a 29.5-day lunar calendar. Events were also broken down into lunar quarters for statistical analysis. We found a statistically significant increase in psychogenic nonepileptic events during the new moon quarter in our EMU during our studied timeframe. Our results are not concordant with the results of a similarly designed past study, raising the possibility that psychogenic nonepileptic events are not influenced by lunar phase. Copyright © 2016 Elsevier Inc. All rights reserved.
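The binning behind such an analysis is easy to sketch. The snippet below assigns event timestamps to lunar quarters from a reference new moon (the 2014-01-01 11:14 UTC new moon and the mean synodic month are approximations) and computes a chi-square statistic against a uniform spread; the critical-value lookup is left out.

```python
from datetime import datetime, timedelta

SYNODIC_DAYS = 29.53059                      # mean synodic month [days]
REF_NEW_MOON = datetime(2014, 1, 1, 11, 14)  # a known new moon (approximate)

def lunar_quarter(ts):
    """0 = new-moon quarter, 1 = first quarter, 2 = full, 3 = last quarter;
    quarter 0 is centred on the new moon itself."""
    age = ((ts - REF_NEW_MOON).total_seconds() / 86400.0) % SYNODIC_DAYS
    return int(((age + SYNODIC_DAYS / 8) % SYNODIC_DAYS) // (SYNODIC_DAYS / 4))

def chi_square_uniform(counts):
    """Chi-square statistic of per-quarter event counts vs a uniform expectation."""
    expected = sum(counts) / len(counts)
    return sum((c - expected) ** 2 / expected for c in counts)

q_new = lunar_quarter(REF_NEW_MOON)                           # 0 (new moon)
q_full = lunar_quarter(REF_NEW_MOON + timedelta(days=14.77))  # 2 (full moon)
stat = chi_square_uniform([20, 10, 10, 10])                   # excess in quarter 0
```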

  11. Frequency Based Real-time Pricing for Residential Prosumers

    Science.gov (United States)

    Hambridge, Sarah Mabel

    This work is the first to explore frequency-based pricing for secondary frequency control as a price-reactive control mechanism for residential prosumers. A frequency-based real-time electricity rate is designed as an autonomous market control mechanism for residential prosumers to provide frequency support as an ancillary service. In addition, prosumers are empowered to participate in dynamic energy transactions, thereby integrating Distributed Energy Resources (DERs) and increasing distributed energy storage on the distribution grid. As the grid transitions towards DERs, a new market-based control system will take the place of the legacy distribution system and possibly the legacy bulk power system. DERs provide many benefits such as energy independence, clean generation, efficiency, and reliability to prosumers during blackouts. However, the variable nature of renewable energy and the current lack of installed energy storage on the grid will create imbalances in supply and demand as uptake increases, affecting the grid frequency and system operation. Through a frequency-based electricity rate, prosumers will be encouraged to purchase energy storage systems (ESS) to offset their neighbors' distributed generation (DG) such as solar. Chapter 1 explains the deregulation of the power system and the move towards Distributed System Operators (DSOs), as prosumers become owners of microgrids and energy cells connected to the distribution system. Dynamic pricing has been proposed as a benefit to prosumers, giving them the ability to make decisions in the energy market, while also providing a way to influence and control their behavior. Frequency-based real-time pricing is a type of dynamic pricing which falls between price-reactive control and transactive control. Prosumer-to-prosumer transactions may take the place of prosumer-to-utility transactions, building The Energy Internet. Frequency-based pricing could be a mechanism for determining prosumer prices and supporting
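A minimal sketch of such a rate (the numbers and the linear form are illustrative assumptions, not the dissertation's design): the price rises as frequency sags below nominal, signalling storage to discharge, and falls when frequency swells, signalling it to charge.

```python
def frequency_price(f_hz, base_price=0.12, slope=0.5, f_nom=60.0):
    """Hypothetical frequency-based real-time rate [$/kWh]: linear in the
    deviation of grid frequency from nominal (parameters are invented)."""
    return base_price + slope * (f_nom - f_hz)

# Under-frequency (supply shortfall) raises the price; over-frequency lowers it
p_low = frequency_price(59.9)   # ~0.17 $/kWh -> prosumers discharge storage
p_high = frequency_price(60.1)  # ~0.07 $/kWh -> prosumers charge storage
```

Because grid frequency is measurable locally at every meter, a rate like this needs no central dispatch signal, which is what makes it an autonomous, price-reactive mechanism.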

  12. Event-based Sensing for Space Situational Awareness

    Science.gov (United States)

    Cohen, G.; Afshar, S.; van Schaik, A.; Wabnitz, A.; Bessell, T.; Rutten, M.; Morreale, B.

    A revolutionary type of imaging device, known as a silicon retina or event-based sensor, has recently been developed and is gaining in popularity in the field of artificial vision systems. These devices are inspired by a biological retina and operate in a significantly different way to traditional CCD-based imaging sensors. While a CCD produces frames of pixel intensities, an event-based sensor produces a continuous stream of events, each of which is generated when a pixel detects a change in log light intensity. These pixels operate asynchronously and independently, producing an event-based output with high temporal resolution. There are also no fixed exposure times, allowing these devices to offer a very high dynamic range independently for each pixel. Additionally, these devices offer high-speed, low-power operation and a sparse spatiotemporal output. As a consequence, the data from these sensors must be interpreted in a significantly different way to traditional imaging sensors and this paper explores the advantages this technology provides for space imaging. The applicability and capabilities of event-based sensors for SSA applications are demonstrated through telescope field trials. Trial results have confirmed that the devices are capable of observing resident space objects from LEO through to GEO orbital regimes. Significantly, observations of RSOs were made during both daytime and nighttime (terminator) conditions without modification to the camera or optics. The event-based sensor's ability to image stars and satellites during daytime hours offers a dramatic capability increase for terrestrial optical sensors. This paper shows the field testing and validation of two different architectures of event-based imaging sensors. An event-based sensor's asynchronous output has an intrinsically low data rate. In addition to low-bandwidth communications requirements, the low weight, low power and high speed make them ideally suited to meeting the demanding
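The per-pixel behaviour described above can be modelled in a few lines (a single-pixel simulation with an assumed contrast threshold): an event fires each time log intensity drifts one threshold away from the level recorded at the last event.

```python
import math

def events_from_intensity(samples, threshold=0.2):
    """Single-pixel event-camera model: `samples` is [(t, intensity), ...];
    emit (t, +1) or (t, -1) each time log intensity moves by `threshold`
    from the reference level set at the previous event."""
    events = []
    ref = math.log(samples[0][1])
    for t, intensity in samples[1:]:
        logi = math.log(intensity)
        while logi - ref >= threshold:   # brightness increase -> ON events
            ref += threshold
            events.append((t, +1))
        while logi - ref <= -threshold:  # brightness decrease -> OFF events
            ref -= threshold
            events.append((t, -1))
    return events

# A brightness step up then back down yields two ON then two OFF events;
# a constant input produces no events at all (hence the sparse output)
samples = [(0, 1.0), (1, math.exp(0.5)), (2, 1.0)]
evts = events_from_intensity(samples)
```

The log-domain threshold is also why the dynamic range is so high: the same relative contrast change triggers an event whether the pixel is dim or bright.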

  13. An Oracle-based Event Index for ATLAS

    CERN Document Server

    Gallas, Elizabeth; The ATLAS collaboration; Petrova, Petya Tsvetanova; Baranowski, Zbigniew; Canali, Luca; Formica, Andrea; Dumitru, Andrei

    2016-01-01

    The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies and assess which best serve the various use cases as well as consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS, the services we have built based on this architecture, and our experience with it. We've indexed about 15 billion real data events and about 25 billion simulated events thus far and have designed the system to accommodate future data which has expected rates of 5 and 20 billion events per year for real data and simulation, respectively. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data ...

  14. CMS DAQ Event Builder Based on Gigabit Ethernet

    CERN Document Server

    Bauer, G; Branson, J; Brett, A; Cano, E; Carboni, A; Ciganek, M; Cittolin, S; Erhan, S; Gigi, D; Glege, F; Gómez-Reino, Robert; Gulmini, M; Gutiérrez-Mlot, E; Gutleber, J; Jacobs, C; Kim, J C; Klute, M; Lipeles, E; Lopez-Perez, Juan Antonio; Maron, G; Meijers, F; Meschi, E; Moser, R; Murray, S; Oh, A; Orsini, L; Paus, C; Petrucci, A; Pieri, M; Pollet, L; Rácz, A; Sakulin, H; Sani, M; Schieferdecker, P; Schwick, C; Sumorok, K; Suzuki, I; Tsirigkas, D; Varela, J

    2007-01-01

    The CMS Data Acquisition System is designed to build and filter events originating from 476 detector data sources at a maximum trigger rate of 100 kHz. Different architectures and switch technologies have been evaluated to accomplish this purpose. Events will be built in two stages: the first stage will be a set of event builders called FED Builders. These will be based on Myrinet technology and will pre-assemble groups of about 8 data sources. The second stage will be a set of event builders called Readout Builders. These will perform the building of full events. A single Readout Builder will build events from 72 sources of 16 kB fragments at a rate of 12.5 kHz. In this paper we present the design of a Readout Builder based on TCP/IP over Gigabit Ethernet and the optimization that was required to achieve the design throughput. This optimization includes the architecture of the Readout Builder, the setup of TCP/IP, and hardware selection.
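The quoted Readout Builder figures imply the following back-of-the-envelope aggregate throughput (taking a fragment as 16 × 1024 bytes, an assumption about the unit):

```python
# One Readout Builder: 72 sources of 16 kB fragments at a 12.5 kHz event rate
sources = 72
fragment_bytes = 16 * 1024
event_rate_hz = 12.5e3

event_size = sources * fragment_bytes      # ~1.2 MB per fully built event
throughput = event_size * event_rate_hz    # ~14.7 GB/s into one Readout Builder
```

This aggregate is far beyond a single Gigabit Ethernet link, which is why the Readout Builder is a switched fabric spreading events over many receiving nodes rather than a single endpoint.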

  15. OBEST: The Object-Based Event Scenario Tree Methodology

    International Nuclear Information System (INIS)

    WYSS, GREGORY D.; DURAN, FELICIA A.

    2001-01-01

    Event tree analysis and Monte Carlo-based discrete event simulation have been used in risk assessment studies for many years. This report details how features of these two methods can be combined with concepts from object-oriented analysis to develop a new risk assessment methodology with some of the best features of each. The resultant Object-Based Event Scenario Tree (OBEST) methodology enables an analyst to rapidly construct realistic models for scenarios for which an a priori discovery of event ordering is either cumbersome or impossible (especially those that exhibit inconsistent or variable event ordering, which are difficult to represent in an event tree analysis). Each scenario produced by OBEST is automatically associated with a likelihood estimate because probabilistic branching is integral to the object model definition. The OBEST method uses a recursive algorithm to solve the object model and identify all possible scenarios and their associated probabilities. Since scenario likelihoods are developed directly by the solution algorithm, they need not be computed by statistical inference based on Monte Carlo observations (as required by some discrete event simulation methods). Thus, OBEST is not only much more computationally efficient than these simulation methods, but it also discovers scenarios that have extremely low probabilities as a natural analytical result--scenarios that would likely be missed by a Monte Carlo-based method. This report documents the OBEST methodology, the demonstration software that implements it, and provides example OBEST models for several different application domains, including interactions among failing interdependent infrastructure systems, circuit analysis for fire risk evaluation in nuclear power plants, and aviation safety studies.
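The recursive solution idea can be sketched as follows (the object model here is a plain dict, a stand-in for OBEST's richer object definitions; the fire scenario is invented): each branch carries a probability, and a depth-first walk enumerates every scenario together with its exact likelihood, so no Monte Carlo sampling is needed and even very low-probability scenarios appear in the output.

```python
def enumerate_scenarios(model, state, prob=1.0, path=()):
    """Recursively walk a probabilistic-branching model: `model` maps a
    state to [(next_state, branch_probability), ...]; a state with no
    branches ends a scenario. Yields (scenario_path, likelihood)."""
    branches = model.get(state, [])
    if not branches:
        yield path + (state,), prob
        return
    for nxt, p in branches:
        yield from enumerate_scenarios(model, nxt, prob * p, path + (state,))

# Invented example loosely echoing the fire-risk application domain
model = {
    "fire_starts": [("detected", 0.9), ("undetected", 0.1)],
    "detected": [("suppressed", 0.95), ("spreads", 0.05)],
    "undetected": [("spreads", 1.0)],
}
scenarios = dict(enumerate_scenarios(model, "fire_starts"))
```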

  16. The neural bases of spatial frequency processing during scene perception

    Science.gov (United States)

    Kauffmann, Louise; Ramanoël, Stephen; Peyrin, Carole

    2014-01-01

    Theories on visual perception agree that scenes are processed in terms of spatial frequencies. Low spatial frequencies (LSF) carry coarse information whereas high spatial frequencies (HSF) carry fine details of the scene. However, how and where spatial frequencies are processed within the brain remain unresolved questions. The present review addresses these issues and aims to identify the cerebral regions differentially involved in low and high spatial frequency processing, and to clarify their attributes during scene perception. Results from a number of behavioral and neuroimaging studies suggest that spatial frequency processing is lateralized in both hemispheres, with the right and left hemispheres predominantly involved in the categorization of LSF and HSF scenes, respectively. There is also evidence that spatial frequency processing is retinotopically mapped in the visual cortex. HSF scenes (as opposed to LSF) activate occipital areas in relation to foveal representations, while categorization of LSF scenes (as opposed to HSF) activates occipital areas in relation to more peripheral representations. Concomitantly, a number of studies have demonstrated that LSF information may reach high-order areas rapidly, allowing an initial coarse parsing of the visual scene, which could then be sent back through feedback into the occipito-temporal cortex to guide finer HSF-based analysis. Finally, the review addresses spatial frequency processing within scene-selective areas of the occipito-temporal cortex. PMID:24847226
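The LSF/HSF decomposition itself is simple to illustrate on a 1-D luminance profile (a moving-average filter stands in for the Gaussian or Fourier filtering typically used in these studies): the low-pass output keeps the coarse structure, and the residual is the fine detail.

```python
def low_pass(signal, k=3):
    """Moving-average low-pass filter: keeps the coarse (LSF) structure of
    a 1-D luminance profile; edges use a shrunken window."""
    half = k // 2
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - half): i + half + 1]
        out.append(sum(window) / len(window))
    return out

# A profile with sharp edges: the LSF part smooths them, the HSF residual
# concentrates at the edges
profile = [0, 0, 10, 10, 0, 0, 10, 10]
lsf = low_pass(profile)
hsf = [x - l for x, l in zip(profile, lsf)]
```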

  17. Integrated analyzing method for the progress event based on subjects and predicates in events

    International Nuclear Information System (INIS)

    Minowa, Hirotsugu; Munesawa, Yoshiomi

    2014-01-01

    It is expected that the knowledge extracted by analyzing past mistakes can be used to prevent the recurrence of accidents. Currently, the main analytic style is one in which experts deeply decipher individual accident cases, while cross-analysis has stopped at extracting the factors common to accident cases. In this study we propose an integrated analyzing method for progress events to analyze across accidents. Our method integrates many accident cases by connecting the common keywords, called 'Subject' and 'Predicate', that are extracted from each progress event in accident cases or near-miss cases. From the integrated accident-case data, our method can analyze and visualize partial risk identification, the frequency with which factors cause accidents, and the risk assessment. Applying our method to the PEC-SAFER accident cases identified 8 hazardous factors that can again arise from tanks, and visualized the most frequent factors, damage of tank (26%) and corrosion (21%), and the highest risks, damage at 3.3 × 10⁻² [risk rank/year] and destruction at 2.5 × 10⁻² [risk rank/year]. (author)

  18. Shallow very-low-frequency earthquakes accompanied with slow slip event along the plate boundary of the Nankai trough

    Science.gov (United States)

    Nakano, M.; Hori, T.; Araki, E.; Kodaira, S.; Ide, S.

    2017-12-01

    Recent improvements of seismic and geodetic observations have revealed the existence of a new family of slow earthquakes occurring along or close to the plate boundary worldwide. From the viewpoint of characteristic time scales, slow earthquakes can be classified into several groups: low-frequency tremor or tectonic tremor (LFT), dominated at several hertz; very-low-frequency earthquakes (VLFEs), dominated at periods of 10 to 100 s; and short- and long-term slow-slip events (SSEs), with durations of days to years. In many cases, these slow earthquakes are accompanied by other types of slow events. However, events occurring offshore, especially beneath the toe of the accretionary prism, are poorly understood because of the difficulty of detecting their signals. Utilizing data captured by the ocean-floor observation networks that much recent effort has gone into developing is necessary to improve our understanding of these events. Here, we carried out CMT analysis of shallow VLFEs using data obtained from the DONET ocean-floor observation networks along the Nankai trough, southwest of Japan. We found that the shallow VLFEs have an almost identical history of moment release to that of a synchronous SSE which occurred in the same region, recently found by Araki et al. (2017). VLFE sources show updip migration during the activity, coincident with the migration of the SSE source. From these findings we conclude that these slow events share the same fault slip, and that VLFEs represent high-frequency fluctuations of slip during the SSE. This result implies that shallow SSEs along the plate interface may have occurred in the background during the shallow VLFE activities repeatedly observed along the Nankai trough, but gone unreported because such SSEs are difficult to detect.

  19. Central FPGA-based destination and load control in the LHCb MHz event readout

    Science.gov (United States)

    Jacobsson, R.

    2012-10-01

The readout strategy of the LHCb experiment is based on complete event readout at 1 MHz. A set of 320 sub-detector readout boards transmit event fragments at a total rate of 24.6 MHz, at a bandwidth usage of up to 70 GB/s, over a commercial switching network based on Gigabit Ethernet to a distributed event building and high-level trigger processing farm with 1470 individual multi-core computer nodes. In the original specifications, the readout was based on a pure push protocol. This paper describes the proposal, implementation, and experience of a non-conventional mixture of a push and a pull protocol, akin to credit-based flow control. An FPGA-based central master module, partly operating at the LHC bunch clock frequency of 40.08 MHz and partly at double that clock speed, is in charge of the entire trigger and readout control from the front-end electronics up to the high-level trigger farm. One FPGA is dedicated to controlling the event fragment packing in the readout boards, assigning the farm node destination for each event, and regulating the farm load based on an asynchronous pull mechanism from each farm node. This dynamic readout scheme relies on generic event requests and the concept of node credit, allowing load control and trigger rate regulation as a function of the global farm load. It also allows the vital task of fast central monitoring and automatic in-flight recovery of failing nodes while keeping dead-time and event loss at a minimum. This paper demonstrates the strength and suitability of implementing this real-time task for a very large distributed system in an FPGA, where no random delays are introduced and where extreme reliability and accurate event accounting are fundamental requirements. It was in use during the entire commissioning phase of LHCb and has been in faultless operation during the first two years of physics luminosity data taking.
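The credit-based pull mechanism described in this abstract can be caricatured in a few lines: each farm node asynchronously posts event requests (its "credit"), and the master consumes one credit per trigger to choose the destination, throttling the trigger rate when no credit remains. A minimal sketch with hypothetical names (the real system is implemented in an FPGA, not software):

```python
from collections import deque

class CreditDispatcher:
    """Toy model of credit-based event destination assignment
    (illustrative only; not the LHCb FPGA implementation)."""
    def __init__(self):
        self.credits = deque()          # FIFO of [node, remaining credits]

    def grant(self, node, n_events):
        """A farm node asynchronously requests n_events (its 'credit')."""
        self.credits.append([node, n_events])

    def assign(self):
        """Pick the destination for the next event, or None if no credit
        is available (the master must then throttle the trigger rate)."""
        while self.credits:
            node, left = self.credits[0]
            if left > 0:
                self.credits[0][1] -= 1
                if self.credits[0][1] == 0:
                    self.credits.popleft()
                return node
            self.credits.popleft()
        return None

d = CreditDispatcher()
d.grant("node-A", 2)
d.grant("node-B", 1)
print([d.assign() for _ in range(4)])   # node-A, node-A, node-B, then None
```

A failing node simply stops granting credit, so it naturally drops out of the destination rotation, which is the recovery behavior the abstract highlights.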

  20. Central FPGA-based destination and load control in the LHCb MHz event readout

    International Nuclear Information System (INIS)

    Jacobsson, R.

    2012-01-01

The readout strategy of the LHCb experiment is based on complete event readout at 1 MHz. A set of 320 sub-detector readout boards transmit event fragments at a total rate of 24.6 MHz, at a bandwidth usage of up to 70 GB/s, over a commercial switching network based on Gigabit Ethernet to a distributed event building and high-level trigger processing farm with 1470 individual multi-core computer nodes. In the original specifications, the readout was based on a pure push protocol. This paper describes the proposal, implementation, and experience of a non-conventional mixture of a push and a pull protocol, akin to credit-based flow control. An FPGA-based central master module, partly operating at the LHC bunch clock frequency of 40.08 MHz and partly at double that clock speed, is in charge of the entire trigger and readout control from the front-end electronics up to the high-level trigger farm. One FPGA is dedicated to controlling the event fragment packing in the readout boards, assigning the farm node destination for each event, and regulating the farm load based on an asynchronous pull mechanism from each farm node. This dynamic readout scheme relies on generic event requests and the concept of node credit, allowing load control and trigger rate regulation as a function of the global farm load. It also allows the vital task of fast central monitoring and automatic in-flight recovery of failing nodes while keeping dead-time and event loss at a minimum. This paper demonstrates the strength and suitability of implementing this real-time task for a very large distributed system in an FPGA, where no random delays are introduced and where extreme reliability and accurate event accounting are fundamental requirements. It was in use during the entire commissioning phase of LHCb and has been in faultless operation during the first two years of physics luminosity data taking.

  1. Multivariate hydrological frequency analysis for extreme events using Archimedean copula. Case study: Lower Tunjuelo River basin (Colombia)

    Science.gov (United States)

    Gómez, Wilmar

    2017-04-01

By analyzing the spatial and temporal variability of extreme precipitation events, we can prevent or reduce threat and risk. Many water resources projects require joint probability distributions of random variables such as precipitation intensity and duration, which cannot be assumed independent of each other. The problem of defining a probability model for observations of several dependent variables is greatly simplified by expressing the joint distribution in terms of its marginals through copulas. This paper presents a general framework for bivariate and multivariate frequency analysis of extreme hydroclimatological events, such as severe storms, using Archimedean copulas. The analysis was conducted for precipitation events in the lower Tunjuelo River basin in Colombia. The results show that, for a joint study of intensity-duration-frequency, IDF curves can be obtained through copulas, yielding more accurate and reliable design storms and associated risks. They also show how copulas greatly simplify the study of multivariate distributions by introducing the concept of the joint return period, which properly represents the needs of hydrological design in frequency analysis.
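The copula machinery behind such an analysis can be sketched briefly. The example below assumes a Gumbel copula (a common Archimedean family) and the "AND" definition of the joint return period, where both intensity and duration exceed their marginal quantiles; the parameter values are illustrative, not taken from the study:

```python
import math

def gumbel_copula(u, v, theta):
    """Archimedean (Gumbel) copula C(u, v); theta >= 1 controls dependence
    (theta = 1 reduces to independence, C = u * v)."""
    return math.exp(-((-math.log(u))**theta + (-math.log(v))**theta)**(1.0/theta))

def joint_return_period_and(u, v, theta, mu=1.0):
    """'AND' joint return period: both intensity and duration exceed their
    marginal quantiles u, v. mu = mean inter-arrival time of events (years)."""
    p_exceed_both = 1.0 - u - v + gumbel_copula(u, v, theta)
    return mu / p_exceed_both

# Example: 10-year marginal quantiles (u = v = 0.9), moderate dependence
T = joint_return_period_and(0.9, 0.9, theta=2.0)
```

The point the abstract makes is visible here: the joint return period T differs from the marginal 10-year value, and the difference is governed by the dependence parameter theta.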

  2. Cooperative Game Study of Airlines Based on Flight Frequency Optimization

    Directory of Open Access Journals (Sweden)

    Wanming Liu

    2014-01-01

By applying game theory, the relationship between airline ticket price and optimal flight frequency is analyzed. The paper establishes the payoff matrix of flight frequencies in the noncooperation scenario and a flight frequency optimization model in the cooperation scenario. The distribution of airline alliance profit is formulated as a profit distribution game based on cooperative game theory. The profit distribution game is proved to be convex, so an optimal distribution strategy exists. The results show that joining the airline alliance can increase an airline's overall profit, that changes in negotiated prices and costs benefit the profit distribution of large airlines, and that the distribution result is consistent with aviation development.
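A profit distribution game of this kind can be illustrated with the Shapley value, a standard allocation rule that for convex games lies in the core. The characteristic-function numbers below are invented for illustration and are not taken from the paper:

```python
from itertools import permutations

def shapley(players, v):
    """Shapley value of a cooperative game: average marginal contribution
    of each player over all join orders. v maps frozensets of players to
    coalition profit."""
    phi = {p: 0.0 for p in players}
    perms = list(permutations(players))
    for order in perms:
        coalition = frozenset()
        for p in order:
            phi[p] += v[coalition | {p}] - v[coalition]   # marginal gain
            coalition = coalition | {p}
    return {p: phi[p] / len(perms) for p in players}

# Hypothetical alliance profits for three airlines A, B, C
v = {frozenset(): 0, frozenset('A'): 6, frozenset('B'): 4, frozenset('C'): 2,
     frozenset('AB'): 12, frozenset('AC'): 9, frozenset('BC'): 7,
     frozenset('ABC'): 18}
print(shapley('ABC', v))
```

By construction the allocations sum to the grand-coalition profit v({A,B,C}), which mirrors the paper's point that the whole alliance profit can be distributed without residue.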

  3. Frequency Estimator Performance for a Software-Based Beacon Receiver

    Science.gov (United States)

    Zemba, Michael J.; Morse, Jacquelynne Rose; Nessel, James A.; Miranda, Felix

    2014-01-01

    As propagation terminals have evolved, their design has trended more toward a software-based approach that facilitates convenient adjustment and customization of the receiver algorithms. One potential improvement is the implementation of a frequency estimation algorithm, through which the primary frequency component of the received signal can be estimated with a much greater resolution than with a simple peak search of the FFT spectrum. To select an estimator for usage in a QV-band beacon receiver, analysis of six frequency estimators was conducted to characterize their effectiveness as they relate to beacon receiver design.
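One of the simplest estimators of this kind refines the FFT peak search by fitting a parabola through the log-magnitude of the peak bin and its two neighbours. The sketch below shows the generic technique under that assumption; it is not necessarily one of the six estimators compared in the study:

```python
import numpy as np

def estimate_frequency(x, fs):
    """Estimate the dominant tone frequency beyond FFT-bin resolution by
    parabolic interpolation on the log-magnitude spectrum."""
    X = np.abs(np.fft.rfft(x * np.hanning(len(x))))
    k = int(np.argmax(X[1:-1])) + 1          # exclude DC and Nyquist bins
    a, b, c = np.log(X[k-1]), np.log(X[k]), np.log(X[k+1])
    delta = 0.5 * (a - c) / (a - 2*b + c)    # fractional-bin correction
    return (k + delta) * fs / len(x)

fs = 1000.0
t = np.arange(2048) / fs
f_hat = estimate_frequency(np.sin(2 * np.pi * 123.4 * t), fs)
```

With a 2048-point FFT the raw bin spacing is about 0.49 Hz, while the interpolated estimate recovers the off-bin tone at 123.4 Hz to well within a fraction of a bin, which is the kind of resolution gain the abstract refers to.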

  4. Tracking the time course of word-frequency effects in auditory word recognition with event-related potentials.

    Science.gov (United States)

    Dufour, Sophie; Brunellière, Angèle; Frauenfelder, Ulrich H

    2013-04-01

    Although the word-frequency effect is one of the most established findings in spoken-word recognition, the precise processing locus of this effect is still a topic of debate. In this study, we used event-related potentials (ERPs) to track the time course of the word-frequency effect. In addition, the neighborhood density effect, which is known to reflect mechanisms involved in word identification, was also examined. The ERP data showed a clear frequency effect as early as 350 ms from word onset on the P350, followed by a later effect at word offset on the late N400. A neighborhood density effect was also found at an early stage of spoken-word processing on the PMN, and at word offset on the late N400. Overall, our ERP differences for word frequency suggest that frequency affects the core processes of word identification starting from the initial phase of lexical activation and including target word selection. They thus rule out any interpretation of the word frequency effect that is limited to a purely decisional locus after word identification has been completed. Copyright © 2012 Cognitive Science Society, Inc.

  5. Microresonator-Based Optical Frequency Combs: A Time Domain Perspective

    Science.gov (United States)

    2016-04-19

Final report AFRL-AFOSR-VA-TR-2016-0165, (BRI) Microresonator-Based Optical Frequency Combs: A Time Domain Perspective; Andrew Weiner, Purdue University; grant number FA9550-12-1-0236.

  6. Frequency of Testing for Dyslipidemia: An Evidence-Based Analysis

    Science.gov (United States)

    2014-01-01

Background Dyslipidemias include high levels of total cholesterol, low-density lipoprotein (LDL) cholesterol, and triglycerides and low levels of high-density lipoprotein (HDL) cholesterol. Dyslipidemia is a risk factor for cardiovascular disease, which is a major contributor to mortality in Canada. Approximately 23% of the 2009/11 Canadian Health Measures Survey (CHMS) participants had a high level of LDL cholesterol, with prevalence increasing with age, and approximately 15% had a total cholesterol to HDL ratio above the threshold. Objectives To evaluate the frequency of lipid testing in adults not diagnosed with dyslipidemia and in adults on treatment for dyslipidemia. Research Methods A systematic review of the literature set out to identify randomized controlled trials (RCTs), systematic reviews, health technology assessments (HTAs), and observational studies published between January 1, 2000, and November 29, 2012, that evaluated the frequency of testing for dyslipidemia in the 2 populations. Results Two observational studies assessed the frequency of lipid testing, 1 in individuals not on lipid-lowering medications and 1 in treated individuals. Both studies were based on previously collected data intended for a different objective and, therefore, no conclusions could be reached about the frequency of testing at intervals other than the ones used in the original studies. Given this limitation and generalizability issues, the quality of evidence was considered very low. No evidence for the frequency of lipid testing was identified in the 2 HTAs included. Canadian and international guidelines recommend testing for dyslipidemia in individuals at an increased risk for cardiovascular disease. The frequency of testing recommended is based on expert consensus. Conclusions Conclusions on the frequency of lipid testing could not be made based on the 2 observational studies. Current guidelines recommend lipid testing in adults with increased cardiovascular risk, with the recommended testing frequency based on expert consensus.

  7. Analysis of core damage frequency from internal events: Surry, Unit 1

    International Nuclear Information System (INIS)

    Harper, F.T.

    1986-11-01

This document contains the accident sequence analyses for Surry, Unit 1; one of the reference plants being examined as part of the NUREG-1150 effort by the Nuclear Regulatory Commission (NRC). NUREG-1150 will document the risk of a selected group of nuclear power plants. As part of that work, this report contains the overall core damage frequency estimate for Surry, Unit 1, and the accompanying plant damage state frequencies. Sensitivity and uncertainty analyses provide additional insights regarding the dominant contributors to the Surry core damage frequency estimate. The numerical results are driven to some degree by modeling assumptions and data selection for issues such as reactor coolant pump seal LOCAs, common cause failure probabilities, and plant response to station blackout and loss of electrical bus initiators. The sensitivity studies explore the impact of alternate theories and data on these issues

  8. Frequency of adverse events in plateletpheresis donors in regional transfusion centre in North India.

    Science.gov (United States)

    Patidar, Gopal Kumar; Sharma, Ratti Ram; Marwaha, Neelam

    2013-10-01

Although automated cell separators have undergone many technical refinements, attention has been focused more on the quality of platelet concentrates than on donor safety. We planned this prospective study to look into the donor safety aspect by studying adverse events in normal healthy plateletpheresis donors. The study included 500 healthy, first-time (n=301) and repeat (n=199) plateletpheresis donors after informed consent. The plateletpheresis procedures were performed on Trima Accel (5.1 version, GAMBRO BCT) and Amicus (3.2 version, FENWAL) cell separators. The adverse events during the procedure were recorded and classified according to their nature. The pre- and post-procedure hematological and biochemical profiles of these donors were also assessed with the help of an automated cell counter and analyser, respectively. Adverse events were recorded in 18% (n=90) of the 500 plateletpheresis donors; 9% were hypocalcaemia in nature, followed by hematoma (7.4%), vasovagal reaction (0.8%) and kit-related adverse events (0.8%). There was a significant post-procedure drop in the Hb, Hct and platelet count of the donors, and adverse events were recorded with both the Trima Accel (5.1 version, GAMBRO BCT) and Amicus (3.2 version, FENWAL) cell separators. Donor reactions can adversely affect voluntary donor recruitment strategies aimed at increasing public awareness of the constant need for blood and blood products. The commonly observed adverse events in plateletpheresis donors were hypocalcemia, hematoma formation and vasovagal reactions, which can be prevented by pre-donation education of the donors and changes of machine configuration. Nevertheless, more prospective studies on this aspect are required in order to establish guidelines for donor safety in apheresis and also to help in assessing donor suitability, especially given the present trend of double product apheresis collections. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. An Oracle-based event index for ATLAS

    Science.gov (United States)

    Gallas, E. J.; Dimitrov, G.; Vasileva, P.; Baranowski, Z.; Canali, L.; Dumitru, A.; Formica, A.; ATLAS Collaboration

    2017-10-01

The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop-based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies, assess which best serve the various use cases, and consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS (relational database management system), the services we have built based on this architecture, and our experience with it. We have indexed about 26 billion real data events thus far and have designed the system to accommodate future data, which is expected at rates of 5 and 20 billion events per year. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data with other complementary metadata in ATLAS, the system has been easily extended to perform essential assessments of data integrity and completeness and to identify event duplication, including at what step in processing the duplication occurred.

  10. Separation of musical instruments based on amplitude and frequency comodulation

    Science.gov (United States)

    Jacobson, Barry D.; Cauwenberghs, Gert; Quatieri, Thomas F.

    2002-05-01

    In previous work, amplitude comodulation was investigated as a basis for monaural source separation. Amplitude comodulation refers to similarities in amplitude envelopes of individual spectral components emitted by particular types of sources. In many types of musical instruments, amplitudes of all resonant modes rise/fall, and start/stop together during the course of normal playing. We found that under certain well-defined conditions, a mixture of constant frequency, amplitude comodulated sources can unambiguously be decomposed into its constituents on the basis of these similarities. In this work, system performance was improved by relaxing the constant frequency requirement. String instruments, for example, which are normally played with vibrato, are both amplitude and frequency comodulated sources, and could not be properly tracked under the constant frequency assumption upon which our original algorithm was based. Frequency comodulation refers to similarities in frequency variations of individual harmonics emitted by these types of sources. The analytical difficulty is in defining a representation of the source which properly tracks frequency varying components. A simple, fixed filter bank can only track an individual spectral component for the duration in which it is within the passband of one of the filters. Alternatives are therefore explored which are amenable to real-time implementation.

  11. Rocchio-based relevance feedback in video event retrieval

    NARCIS (Netherlands)

    Pingen, G.L.J.; de Boer, M.H.T.; Aly, Robin; Amsaleg, Laurent; Guðmundsson, Gylfi Þór; Gurrin, Cathal; Jónsson, Björn Þór; Satoh, Shin’ichi

    This paper investigates methods for user and pseudo relevance feedback in video event retrieval. Existing feedback methods achieve strong performance but adjust the ranking based on few individual examples. We propose a relevance feedback algorithm (ARF) derived from the Rocchio method, which is a
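The classic Rocchio update that this line of work builds on can be sketched as follows. This is the generic textbook formulation with the usual alpha, beta, gamma weights; the paper's ARF variant adapts this idea rather than using it verbatim:

```python
import numpy as np

def rocchio(query, relevant, non_relevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Classic Rocchio update: move the query vector toward the centroid of
    relevant feedback examples and away from non-relevant ones."""
    q = alpha * np.asarray(query, dtype=float)
    if len(relevant):
        q += beta * np.mean(relevant, axis=0)
    if len(non_relevant):
        q -= gamma * np.mean(non_relevant, axis=0)
    return q

# Toy 2-D feature vectors: two relevant examples, one non-relevant
q_new = rocchio([1.0, 0.0],
                relevant=[[0.0, 1.0], [0.0, 3.0]],
                non_relevant=[[2.0, 0.0]])
```

Because the update aggregates over centroids rather than individual examples, it adjusts the ranking using the whole feedback set, which is the contrast the abstract draws with methods that rely on a few individual examples.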

  12. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  13. Simulation of Quantum Computation : A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  14. An XML-Based Protocol for Distributed Event Services

    Science.gov (United States)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)

    2001-01-01

    This viewgraph presentation provides information on the application of an XML (extensible mark-up language)-based protocol to the developing field of distributed processing by way of a computational grid which resembles an electric power grid. XML tags would be used to transmit events between the participants of a transaction, namely, the consumer and the producer of the grid scheme.

  15. Event-based historical value-at-risk

    NARCIS (Netherlands)

    Hogenboom, F.P.; Winter, Michael; Hogenboom, A.C.; Jansen, Milan; Frasincar, F.; Kaymak, U.

    2012-01-01

    Value-at-Risk (VaR) is an important tool to assess portfolio risk. When calculating VaR based on historical stock return data, we hypothesize that this historical data is sensitive to outliers caused by news events in the sampled period. In this paper, we research whether the VaR accuracy can be
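Plain historical VaR, the baseline such a study starts from, is just an empirical quantile of past returns; the sketch below uses made-up numbers for illustration:

```python
import numpy as np

def historical_var(returns, alpha=0.95):
    """Historical VaR: the loss threshold exceeded in a fraction (1 - alpha)
    of the sampled periods, read directly off the empirical distribution."""
    return -np.percentile(returns, 100 * (1 - alpha))

# Ten hypothetical daily returns
returns = np.array([0.01, -0.02, 0.004, -0.05, 0.03, -0.01, 0.02, -0.03,
                    0.015, -0.005])
var95 = historical_var(returns, 0.95)
```

The hypothesis in the abstract amounts to saying that a few news-driven outliers in `returns` (such as the -0.05 above) can dominate this quantile, so identifying and treating event-driven observations separately may change the VaR estimate materially.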

  16. A spatial and nonstationary model for the frequency of extreme rainfall events

    DEFF Research Database (Denmark)

    Gregersen, Ida Bülow; Madsen, Henrik; Rosbjerg, Dan

    2013-01-01

    of extreme rainfall events, a statistical model is tested for this purpose. The model is built on the theory of generalized linear models and uses Poisson regression solved by generalized estimation equations. Spatial and temporal explanatory variables can be included simultaneously, and their relative...
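A bare-bones version of the modeling idea, Poisson regression of extreme-event counts on an explanatory variable, can be sketched with Newton-Raphson fitting in place of the paper's generalized estimating equations; the data below are synthetic and purely illustrative:

```python
import numpy as np

def fit_poisson_glm(x, y, n_iter=25):
    """Poisson regression with log link, fitted by Newton-Raphson
    (a minimal stand-in for the GEE-based estimation in the paper)."""
    X = np.column_stack([np.ones(len(y)), x])   # add intercept column
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)                   # modeled event rate
        H = X.T @ (mu[:, None] * X)             # Fisher information (W = mu)
        g = X.T @ (y - mu)                      # score vector
        beta = beta + np.linalg.solve(H, g)
    return beta

# Synthetic yearly counts of extreme events vs. one explanatory variable
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 500)
y = rng.poisson(np.exp(0.5 + 1.2 * x))          # true intercept 0.5, slope 1.2
beta = fit_poisson_glm(x, y)
```

In the paper's setting the columns of `X` would hold spatial and temporal explanatory variables simultaneously, with GEE handling the correlation between gauges.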

  17. Core damage frequency (reactor design) perspectives based on IPE results

    International Nuclear Information System (INIS)

    Camp, A.L.; Dingman, S.E.; Forester, J.A.

    1996-01-01

This paper provides perspectives gained from reviewing 75 Individual Plant Examination (IPE) submittals covering 108 nuclear power plant units. Variability both within and among reactor types is examined to provide perspectives on the plant-specific design and operational features and modeling assumptions that play a significant role in the estimates of core damage frequency in the IPEs. Human actions found to be important in boiling water reactors (BWRs) and in pressurized water reactors (PWRs) are presented, and the events most frequently found important are discussed

  18. Acoustic frequency filter based on anisotropic topological phononic crystals

    KAUST Repository

    Chen, Zeguo

    2017-11-02

We present a design of an acoustic frequency filter based on a two-dimensional anisotropic phononic crystal. The anisotropic band structure exhibits either a directional or a combined (global + directional) bandgap in certain frequency regions, depending on the geometry. When the time-reversal symmetry is broken, a topologically nontrivial bandgap may be introduced. The induced nontrivial bandgap and the original directional bandgap result in various interesting wave propagation behaviors, such as frequency filtering. We develop a tight-binding model to characterize the effective Hamiltonian of the system, from which the contribution of anisotropy is explicitly shown. Different from the isotropic cases, the Zeeman-type splitting is not linear, and the anisotropic bandgap makes it possible to achieve anisotropic propagation characteristics along different directions and at different frequencies.

  19. Acoustic frequency filter based on anisotropic topological phononic crystals

    KAUST Repository

    Chen, Zeguo; Zhao, Jiajun; Mei, Jun; Wu, Ying

    2017-01-01

We present a design of an acoustic frequency filter based on a two-dimensional anisotropic phononic crystal. The anisotropic band structure exhibits either a directional or a combined (global + directional) bandgap in certain frequency regions, depending on the geometry. When the time-reversal symmetry is broken, a topologically nontrivial bandgap may be introduced. The induced nontrivial bandgap and the original directional bandgap result in various interesting wave propagation behaviors, such as frequency filtering. We develop a tight-binding model to characterize the effective Hamiltonian of the system, from which the contribution of anisotropy is explicitly shown. Different from the isotropic cases, the Zeeman-type splitting is not linear, and the anisotropic bandgap makes it possible to achieve anisotropic propagation characteristics along different directions and at different frequencies.

  20. Events

    Directory of Open Access Journals (Sweden)

    Igor V. Karyakin

    2016-02-01

The 9th ARRCN Symposium 2015 was held during 21st–25th October 2015 at the Novotel Hotel, Chumphon, Thailand, one of the most favored travel destinations in Asia. The 10th ARRCN Symposium 2017 will be held in October 2017 in Davao, Philippines. The International Symposium on the Montagu's Harrier (Circus pygargus), «The Montagu's Harrier in Europe. Status. Threats. Protection», organized by the environmental organization «Landesbund für Vogelschutz in Bayern e.V.» (LBV), was held on November 20-22, 2015 in Germany, in the city of Würzburg in Bavaria.

  1. Event-Based Stabilization over Networks with Transmission Delays

    Directory of Open Access Journals (Sweden)

    Xiangyu Meng

    2012-01-01

This paper investigates asymptotic stabilization of linear systems over networks based on event-driven communication. A new communication logic is proposed to reduce the feedback effort, which has advantages over traditional approaches with continuous feedback. Considering the effect of time-varying transmission delays, criteria for the design of both the feedback gain and the event-triggering mechanism are derived to guarantee the stability and performance requirements. Finally, the proposed techniques are illustrated by an inverted pendulum system and a numerical example.
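The idea of event-driven communication can be illustrated on a toy plant: transmit the state only when the error since the last transmission exceeds a fixed fraction of the current state norm. This is the common relative-threshold triggering rule; the paper's delay-aware design is more involved, and the plant and gain below are illustrative choices:

```python
import numpy as np

# Double integrator x' = Ax + Bu with event-triggered state feedback u = -K x_e,
# where x_e is the state sampled at the most recent event.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
K = np.array([[2.0, 3.0]])        # places closed-loop eigenvalues at -1, -2
sigma, dt = 0.2, 0.001            # trigger threshold and Euler step

x = np.array([1.0, 0.0])
x_event = x.copy()                # state transmitted at the last event
events = 0
for _ in range(20000):            # simulate 20 s
    if np.linalg.norm(x - x_event) > sigma * np.linalg.norm(x):
        x_event = x.copy()        # trigger: transmit a fresh measurement
        events += 1
    u = -(K @ x_event)            # control held constant between events
    x = x + dt * (A @ x + B @ u).ravel()
```

The state still converges to the origin, but the number of transmissions (`events`) is far below the 20,000 simulation steps, which is the feedback-effort reduction such communication logics aim for.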

  2. Event-Based control of depth of hypnosis in anesthesia.

    Science.gov (United States)

    Merigo, Luca; Beschi, Manuel; Padula, Fabrizio; Latronico, Nicola; Paltenghi, Massimiliano; Visioli, Antonio

    2017-08-01

In this paper, we propose the use of an event-based control strategy for the closed-loop control of the depth of hypnosis in anesthesia, using propofol administration and the bispectral index as the controlled variable. A new event generator with high noise-filtering properties is employed in addition to a PIDPlus controller. The tuning of the parameters is performed off-line by using genetic algorithms on a given data set of patients. The effectiveness and robustness of the method are verified in simulation by implementing a Monte Carlo method to address intra-patient and inter-patient variability. A comparison with a standard PID control structure shows that the event-based control system achieves a reduction of the total variation of the manipulated variable of 93% in the induction phase and of 95% in the maintenance phase. The use of event-based automatic control in anesthesia yields a fast induction phase with bounded overshoot and acceptable disturbance rejection. A comparison with a standard PID control structure shows that the technique effectively mimics the behavior of the anesthesiologist while providing a significant decrement of the total variation of the manipulated variable. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Attitudes of Consumers from the Sarajevo Canton in Bosnia and Herzegovina toward Advertising through Sport among the Frequency of Watching Sports Events

    Directory of Open Access Journals (Sweden)

    Izet Bajramovic

    2018-04-01

It is proposed that the attitudes potential consumers form on the basis of advertising through sport can influence decisions to purchase a particular advertiser's product. For this reason, it is important to analyse consumers' general attitudes toward advertising through sport, and this investigation aimed to gain relevant knowledge about the attitudes of Sarajevo consumers toward advertising through sport in relation to the frequency of watching sports events. The sample included 358 respondents, divided into six subsample groups: consumers who do not watch sports events at all, and consumers who watch sports events for 1-30 minutes, 31-60 minutes, 61-90 minutes, 91-120 minutes, or more than 120 minutes during a typical day. The sample of variables contained a system of three general attitudes modelled on a seven-point Likert scale. The results were analysed by multivariate analysis (MANOVA) and univariate analysis (ANOVA) with a Post Hoc test. The statistical analyses showed significant differences at the multivariate level, as well as for all three variables, at a significance level of p=.00. Hence, significant differences emerged between the attitudes of consumers toward advertising through sport depending on the frequency of watching sports events. These results are important for marketers, mainly because they cannot treat all potential consumers alike regardless of how often they watch sports events, as has been done in previous investigations; this observation presents relevant information.

  4. Attitudes of Consumers from the Mostar Canton in Bosnia and Herzegovina toward Advertising through Sport among the Frequency of Watching Sports Events

    Directory of Open Access Journals (Sweden)

    Marina Vukotic

    2018-04-01

It is proposed that the attitudes potential consumers form on the basis of advertising through sport can influence decisions to purchase a particular advertiser's product. For this reason, it is important to analyse consumers' general attitudes toward advertising through sport, and this investigation aimed to gain relevant knowledge about the attitudes of Mostar consumers toward advertising through sport in relation to the frequency of watching sports events. The sample included 228 respondents, divided into six subsample groups: consumers who do not watch sports events at all, and consumers who watch sports events for 1-30 minutes, 31-60 minutes, 61-90 minutes, 91-120 minutes, or more than 120 minutes during a typical day. The sample of variables contained a system of three general attitudes modelled on a seven-point Likert scale. The results were analysed by multivariate analysis (MANOVA) and univariate analysis (ANOVA) with a Post Hoc test. The statistical analyses showed significant differences at the multivariate level, as well as for all three variables, at a significance level of p=.006. Hence, significant differences emerged between the attitudes of consumers toward advertising through sport depending on the frequency of watching sports events. These results are important for marketers, mainly because they cannot treat all potential consumers alike regardless of how often they watch sports events, as has been done in previous investigations; this observation presents relevant information.

  5. Carbon nanotube transistor based high-frequency electronics

    Science.gov (United States)

    Schroter, Michael

At the nanoscale, carbon nanotubes (CNTs) have higher carrier mobility and carrier velocity than most incumbent semiconductors. Thus, CNT-based field-effect transistors (FETs) are being considered as strong candidates for replacing existing MOSFETs in digital applications. In addition, the predicted high intrinsic transit frequency and the more recent finding of ways to achieve highly linear transfer characteristics have inspired investigations of analog high-frequency (HF) applications. High linearity is extremely valuable for energy-efficient usage of the frequency spectrum, particularly in mobile communications. Compared to digital applications, the much more relaxed constraints on CNT placement and lithography, combined with already achieved operating frequencies of at least 10 GHz for fabricated devices, make an early entry into the low-GHz HF market more feasible than into large-scale digital circuits. Such a market entry would be extremely beneficial for funding the development of a production CNTFET-based process technology. This talk will provide an overview of the present status and feasibility of HF CNTFET technology from an engineering point of view, including device modeling, experimental results, and existing roadblocks.

  6. Event-based state estimation: a stochastic perspective

    CERN Document Server

    Shi, Dawei; Chen, Tongwen

    2016-01-01

    This book explores event-based estimation problems. It shows how several stochastic approaches are developed to maintain estimation performance when sensors perform their updates at slower rates only when needed. The self-contained presentation makes this book suitable for readers with no more than a basic knowledge of probability analysis, matrix algebra and linear systems. The introduction and literature review provide information, while the main content deals with estimation problems from four distinct angles in a stochastic setting, using numerous illustrative examples and comparisons. The text elucidates both theoretical developments and their applications, and is rounded out by a review of open problems. This book is a valuable resource for researchers and students who wish to expand their knowledge and work in the area of event-triggered systems. At the same time, engineers and practitioners in industrial process control will benefit from the event-triggering technique that reduces communication costs ...

  7. Event-based cluster synchronization of coupled genetic regulatory networks

    Science.gov (United States)

    Yue, Dandan; Guan, Zhi-Hong; Li, Tao; Liao, Rui-Quan; Liu, Feng; Lai, Qiang

    2017-09-01

    In this paper, the cluster synchronization of coupled genetic regulatory networks with a directed topology is studied by using the event-based strategy and pinning control. An event-triggered condition with a threshold consisting of the neighbors' discrete states at their own event time instants and a state-independent exponential decay function is proposed. The intra-cluster states information and extra-cluster states information are involved in the threshold in different ways. By using the Lyapunov function approach and the theories of matrices and inequalities, we establish the cluster synchronization criterion. It is shown that both the avoidance of continuous transmission of information and the exclusion of the Zeno behavior are ensured under the presented triggering condition. Explicit conditions on the parameters in the threshold are obtained for synchronization. The stability criterion of a single GRN is also given under the reduced triggering condition. Numerical examples are provided to validate the theoretical results.
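
    The flavor of such an event-triggered condition can be sketched as follows. The gains, the decay rate, and the threshold form are illustrative stand-ins, not the paper's actual condition, which combines intra- and extra-cluster neighbor information in a more refined way:

```python
import math

def should_trigger(x_now, x_last_event, neighbor_event_states, t,
                   c0=0.5, beta=0.05, decay=0.1):
    """Fire an event when the local measurement error since the last
    broadcast exceeds a threshold built from neighbors' last-broadcast
    (event-time) states plus a state-independent exponential decay term.
    The decay term keeps the threshold strictly positive at every finite
    time, which is what rules out Zeno behavior. All parameters here are
    invented for illustration."""
    error = abs(x_now - x_last_event)
    neighbor_term = c0 * sum(abs(x_last_event - xj)
                             for xj in neighbor_event_states)
    threshold = neighbor_term + beta * math.exp(-decay * t)
    return error > threshold
```

    Between triggering instants no information is transmitted; a node re-broadcasts its state only when `should_trigger` returns `True`, which is how continuous transmission is avoided.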

  8. System risk evolution analysis and risk critical event identification based on event sequence diagram

    International Nuclear Information System (INIS)

    Luo, Pengcheng; Hu, Yang

    2013-01-01

    During system operation, the environmental, operational and usage conditions are time-varying, which causes fluctuations of the system state variables (SSVs). These fluctuations change accident probabilities and thus drive system risk evolution (SRE). This inherent relation makes it feasible to realize risk control by monitoring the SSVs in real time, for which quantitative analysis of SRE is essential. Moreover, some events in the process of SRE are critical to system risk, because they act like “demarcative points” between safety and accident, and this characteristic makes each of them a key point of risk control. Therefore, analysis of SRE and identification of risk-critical events (RCEs) are essential to ensuring that the system operates safely. In this context, an event sequence diagram (ESD) based method of SRE analysis and the related Monte Carlo solution are presented; RCE and risk-sensitive variable (RSV) are defined, and the corresponding identification methods are also proposed. Finally, the proposed approaches are exemplified with an accident scenario of an aircraft entering an icing region.
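
    A minimal Monte Carlo sketch of the idea, using an invented two-pivotal-event sequence whose branch probabilities depend on a single state variable (the paper's ESD for the aircraft-icing scenario is far more detailed):

```python
import random

def accident_probability(icing_severity, n_trials=10_000, seed=1):
    """Hedged sketch of an ESD Monte Carlo solution. Each trial walks a
    notional two-branch event sequence (detection, then recovery) whose
    branch probabilities depend on one system state variable, an icing
    severity in [0, 1]. All probabilities are invented for illustration."""
    rng = random.Random(seed)
    accidents = 0
    for _ in range(n_trials):
        p_detect = 0.95 - 0.3 * icing_severity   # detection degrades with severity
        p_recover = 0.90 - 0.5 * icing_severity  # recovery degrades likewise
        detected = rng.random() < p_detect
        recovered = detected and rng.random() < p_recover
        if not recovered:
            accidents += 1
    return accidents / n_trials

# Risk evolution: re-evaluate as the state variable drifts upward in time.
risks = [accident_probability(s / 10) for s in range(0, 11, 5)]
```

    Re-running the estimate as the monitored SSV changes traces out the risk-evolution curve; a sharp rise in that curve marks the neighborhood of a risk-critical event.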

  9. Event-Based User Classification in Weibo Media

    Directory of Open Access Journals (Sweden)

    Liang Guo

    2014-01-01

    Full Text Available Weibo media, known as a real-time microblogging service, has attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and significantly changes the way people acquire and disseminate information. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media is related to events. Users who post different content, or exhibit different behavior or attitudes, may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events is a promising task. Under this circumstance, in order to effectively organize and manage the huge number of users, and thereby further manage their content, we address the task of user classification in a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate the Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately.

  10. Event-based user classification in Weibo media.

    Science.gov (United States)

    Guo, Liang; Wang, Wendong; Cheng, Shiduan; Que, Xirong

    2014-01-01

    Weibo media, known as a real-time microblogging service, has attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and significantly changes the way people acquire and disseminate information. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media is related to events. Users who post different content, or exhibit different behavior or attitudes, may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events is a promising task. Under this circumstance, in order to effectively organize and manage the huge number of users, and thereby further manage their content, we address the task of user classification in a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate the Weibo properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately.

  11. DYNAMIC AUTHORIZATION BASED ON THE HISTORY OF EVENTS

    Directory of Open Access Journals (Sweden)

    Maxim V. Baklanovsky

    2016-11-01

    Full Text Available A new paradigm in the field of access control systems with fuzzy authorization is proposed. Suppose there is a set of objects in a single data transmission network. The goal is to develop a dynamic authorization protocol based on the correctness of the presentation of events (news) that occurred earlier in the network. We propose a mathematical method that compactly stores the history of events, neglects more distant and less significant events, and composes and verifies authorization data. The history of events is represented as vectors of numbers, and each vector is multiplied by several stochastic vectors. It is known that if the event vectors are sparse, they can be restored with high accuracy by solving an ℓ1-optimization problem. Experiments on vector restoration have shown that the greater the number of stochastic vectors, the better the accuracy of the restored vectors. It has also been established that the largest absolute components are restored earliest. An access control system with the proposed dynamic authorization method makes it possible to compute fuzzy confidence coefficients in networks with a frequently changing set of participants, mesh networks, and multi-agent systems.
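
    The vector mechanics described above can be sketched in miniature. The dimensions, event values, and the crude matched-filter decoder below are illustrative only; the paper restores the sparse vectors by ℓ1-optimization, which is far more accurate:

```python
import random

random.seed(7)

N = 50   # length of the event-history vector
M = 12   # number of stochastic measurement vectors

# Sparse event history: a few significant events, the rest zero.
history = [0.0] * N
history[3], history[17], history[42] = 5.0, -2.0, 1.0

# Random +/-1 stochastic vectors compress the history into M numbers.
phi = [[random.choice((-1.0, 1.0)) for _ in range(N)] for _ in range(M)]
measurements = [sum(p[i] * history[i] for i in range(N)) for p in phi]

# Crude matched-filter decode: correlate the measurements back through
# the same stochastic vectors. The largest-magnitude component (the
# most significant event) stands out first, consistent with the
# behaviour reported in the abstract.
estimate = [sum(phi[m][i] * measurements[m] for m in range(M)) / M
            for i in range(N)]
dominant = max(range(N), key=lambda i: abs(estimate[i]))
```
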

  12. Frequency of Extreme Heat Event as a Surrogate Exposure Metric for Examining the Human Health Effects of Climate Change.

    Directory of Open Access Journals (Sweden)

    Crystal Romeo Upperman

    Full Text Available Epidemiological investigation of the impact of climate change on human health, particularly chronic diseases, is hindered by the lack of exposure metrics that can be used as markers of climate change and that are compatible with health data. Here, we present a surrogate exposure metric created using a 30-year baseline (1960-1989) that allows users to quantify long-term changes in exposure to the frequency of extreme heat events, with nearly unabridged spatial coverage, at a scale compatible with national/state health outcome data. We evaluate the exposure metric by decade, seasonality, area of the country, and its ability to capture long-term changes in weather (climate), including natural climate modes. Our findings show that this generic exposure metric is potentially useful for monitoring trends in the frequency of extreme heat events across varying regions because it captures long-term changes; is sensitive to natural climate modes (ENSO events); responds well to spatial variability; and is amenable to spatial/temporal aggregation, making it useful for epidemiological studies.
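
    The metric's core operation, counting events that exceed a baseline-derived threshold, can be sketched as follows. The function, the nearest-rank percentile choice, and the numbers are illustrative; the paper's precise definition of an extreme heat event may differ:

```python
def extreme_heat_days(baseline_temps, target_temps, pct=95):
    """Hedged sketch of a baseline-relative extreme-heat metric: count
    days in a target period whose temperature exceeds the pct-th
    percentile (nearest-rank) of a fixed 1960-1989-style baseline."""
    ordered = sorted(baseline_temps)
    # nearest-rank percentile index, clamped to the valid range
    idx = min(len(ordered) - 1, max(0, round(pct / 100 * len(ordered)) - 1))
    threshold = ordered[idx]
    return sum(1 for t in target_temps if t > threshold)
```

    Because the baseline is frozen, the same threshold applies to every decade, so rising counts over time reflect long-term change rather than a drifting reference.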

  13. An Oracle-based event index for ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00083337; The ATLAS collaboration; Dimitrov, Gancho

    2017-01-01

    The ATLAS EventIndex system has amassed a set of key quantities for a large number of ATLAS events into a Hadoop-based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting these data in one place provides the opportunity to investigate various storage formats and technologies, assess which best serve the various use cases, and consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS (relational database management system), the services we have built based on this architecture, and our experience with it. We have indexed about 26 billion real data events thus far and have designed the system to accommodate future data, which is expected at rates of 5 and 20 billion events per year. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data with other complementary metadata in AT...
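
    The essence of an event-wise index in a relational database can be sketched with an in-memory table. This is an illustrative schema only, with sqlite3 standing in for the Oracle RDBMS; the real EventIndex schema, run numbers, and file identifiers are different and much richer:

```python
import sqlite3

# Toy event index: events keyed by (run, event) with a pointer to the
# file and offset holding the full record, enabling direct event picking.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE event_index (
    run_number   INTEGER,
    event_number INTEGER,
    guid         TEXT,     -- file identifier (invented values below)
    offset       INTEGER,  -- location of the event within the file
    PRIMARY KEY (run_number, event_number)
)""")
rows = [(358031, 17, "FILE-A", 4096),
        (358031, 18, "FILE-A", 8192),
        (358096, 5,  "FILE-B", 0)]
conn.executemany("INSERT INTO event_index VALUES (?, ?, ?, ?)", rows)

# Event picking: locate a single event directly by its key.
guid, off = conn.execute(
    "SELECT guid, offset FROM event_index "
    "WHERE run_number=? AND event_number=?",
    (358031, 18)).fetchone()
print(guid, off)  # FILE-A 8192
```

    The composite primary key is what makes single-event lookup a fundamental, index-backed use case rather than a scan over billions of rows.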

  14. Frequency-Wavenumber (FK)-Based Data Selection in High-Frequency Passive Surface Wave Survey

    Science.gov (United States)

    Cheng, Feng; Xia, Jianghai; Xu, Zongbo; Hu, Yue; Mi, Binbin

    2018-04-01

    Passive surface wave methods have gained much attention from geophysical and civil engineering communities because of the limited application of traditional seismic surveys in highly populated urban areas. Considering that they can provide high-frequency phase velocity information up to several tens of Hz, the active surface wave survey would be omitted and the amount of field work could be dramatically reduced. However, the measured dispersion energy image in the passive surface wave survey would usually be polluted by a type of "crossed" artifacts at high frequencies. It is common in the bidirectional noise distribution case with a linear receiver array deployed along roads or railways. We review several frequently used passive surface wave methods and derive the underlying physics for the existence of the "crossed" artifacts. We prove that the "crossed" artifacts would cross the true surface wave energy at fixed points in the f-v domain and propose an FK-based data selection technique to attenuate the artifacts in order to retrieve the high-frequency information. Numerical tests further demonstrate the existence of the "crossed" artifacts and indicate that the well-known wave field separation method, the FK filter, does not work for the selection of directional noise data. Real-world applications confirm the feasibility of the proposed FK-based technique to improve passive surface wave methods by a priori data selection. Finally, we discuss the applicability of our approach.
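
    The directional information that FK-based selection exploits can be illustrated with a toy frequency-wavenumber transform: a plane wave travelling one way along the array loads one quadrant pair of the f-k plane, and a wave travelling the other way loads the complementary pair. The grid sizes and the wave below are illustrative only; the paper's selection scheme operates on real noise records:

```python
import cmath
import math

def dft2(grid):
    """Naive 2-D DFT over a small (time x space) grid; axis 0 maps to
    frequency f, axis 1 to wavenumber k."""
    nt, nx = len(grid), len(grid[0])
    return [[sum(grid[t][x] * cmath.exp(-2j * cmath.pi *
                                        (f * t / nt + k * x / nx))
                 for t in range(nt) for x in range(nx))
             for k in range(nx)]
            for f in range(nt)]

nt = nx = 8
# One cycle of a plane wave travelling in the +x direction.
wave = [[math.cos(2 * math.pi * (t / nt - x / nx)) for x in range(nx)]
        for t in range(nt)]
spec = dft2(wave)
# Energy lands in the (f=1, k=nx-1)/(f=nt-1, k=1) quadrant pair; a wave
# travelling in -x would load (1, 1)/(nt-1, nx-1) instead, so a quadrant
# mask in the f-k plane separates the two propagation directions.
```
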

  16. Issues in Informal Education: Event-Based Science Communication Involving Planetaria and the Internet

    Science.gov (United States)

    Adams, Mitzi L.; Gallagher, D. L.; Whitt, A.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing real-time science-related events has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases broadcasts accommodate active feedback and questions from Internet participants. Panel participation will be used to communicate the problems and lessons learned from these activities over the last three years.

  17. A multispacecraft event study of Pc5 ultralow-frequency waves in the magnetosphere and their external drivers

    International Nuclear Information System (INIS)

    Wang, Chih-Ping; Thorne, Richard; Liu, Terry Z.; Hartinger, Michael D.; Nagai, Tsugunobu

    2017-01-01

    We investigate a quiet time event of magnetospheric Pc5 ultralow-frequency (ULF) waves and their likely external drivers using multiple spacecraft observations. Enhancements of electric and magnetic field perturbations in two narrow frequency bands, 1.5–2 mHz and 3.5–4 mHz, were observed over a large radial distance range from r ~ 5 to 11 R_E. During the first half of this event, perturbations were mainly observed in the transverse components and only in the 3.5–4 mHz band. In comparison, enhancements were stronger during the second half in both transverse and compressional components and in both frequency bands. No indication of field line resonances was found for these magnetic field perturbations. Perturbations in these two bands were also observed in the magnetosheath, but not in the solar wind dynamic pressure perturbations. For the first interval, good correlations between the flow perturbations in the magnetosphere and magnetosheath and an indirect signature for Kelvin-Helmholtz (K-H) vortices suggest K-H surface waves as the driver. For the second interval, good correlations are found between the magnetosheath dynamic pressure perturbations, magnetopause deformation, and magnetospheric waves, all in good correspondence to interplanetary magnetic field (IMF) discontinuities. The characteristics of these perturbations can be explained by being driven by foreshock perturbations resulting from these IMF discontinuities. This event shows that even during quiet periods, K-H-unstable magnetopause and ion foreshock perturbations can combine to create a highly dynamic magnetospheric ULF wave environment.

  18. Poisson-event-based analysis of cell proliferation.

    Science.gov (United States)

    Summers, Huw D; Wills, John W; Brown, M Rowan; Rees, Paul

    2015-05-01

    A protocol for the assessment of cell proliferation dynamics is presented. This is based on the measurement of cell division events and their subsequent analysis using Poisson probability statistics. Detailed analysis of proliferation dynamics in heterogeneous populations requires single-cell resolution within a time series analysis and so is technically demanding to implement. Here, we show that by focusing on the events during which cells undergo division, rather than directly on the cells themselves, a simplified image acquisition and analysis protocol can be followed, which maintains single-cell resolution and reports on the key metrics of cell proliferation. The technique is demonstrated using a microscope with 1.3 μm spatial resolution to track mitotic events within A549 and BEAS-2B cell lines over a period of up to 48 h. Automated image processing of the bright-field images using standard algorithms within the ImageJ software toolkit yielded 87% accurate recording of the manually identified temporal and spatial positions of the mitotic event series. Analysis of the statistics of the interevent times (i.e., times between observed mitoses in a field of view) showed that cell division conformed to a nonhomogeneous Poisson process in which the rate of occurrence of mitotic events, λ, increased exponentially over time, and provided values of the mean intermitotic time of 21.1 ± 1.2 h for the A549 cells and 25.0 ± 1.1 h for the BEAS-2B cells. Comparison of the mitotic event series for the BEAS-2B cell line to that predicted by random Poisson statistics indicated that temporal synchronisation of the cell division process was occurring within 70% of the population and that this could be increased to 85% through serum starvation of the cell culture. © 2015 International Society for Advancement of Cytometry.
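
    A nonhomogeneous Poisson process with an exponentially increasing rate, as fitted above, can be simulated by Lewis-Shedler thinning. The parameters below are illustrative, not the study's fitted values:

```python
import math
import random

def simulate_nhpp(lam0, tau, horizon, seed=0):
    """Simulate event times on [0, horizon] from a nonhomogeneous
    Poisson process with rate lam(t) = lam0 * exp(t / tau), via
    Lewis-Shedler thinning: draw candidates from a homogeneous process
    at the bounding rate, then accept each with probability
    lam(t) / lam_max."""
    rng = random.Random(seed)
    lam_max = lam0 * math.exp(horizon / tau)   # rate bound on [0, horizon]
    t, events = 0.0, []
    while True:
        t += rng.expovariate(lam_max)          # next candidate arrival
        if t > horizon:
            return events
        if rng.random() < lam0 * math.exp(t / tau) / lam_max:
            events.append(t)                   # accept with prob lam(t)/lam_max
```

    Comparing the simulated interevent-time distribution with the observed one is exactly the kind of check the protocol uses to detect departures from homogeneous Poisson behavior, such as partial synchronisation.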

  19. Low-level contrast statistics of natural images can modulate the frequency of event-related potentials (ERP) in humans

    Directory of Open Access Journals (Sweden)

    Masoud Ghodrati

    2016-12-01

    Full Text Available Humans are fast and accurate in categorizing complex natural images. It is, however, unclear which features of visual information are exploited by the brain to perceive images with such speed and accuracy. It has been shown that low-level contrast statistics of natural scenes can explain the variance of the amplitude of event-related potentials (ERPs) in response to rapidly presented images. In this study, we investigated the effect of these statistics on the frequency content of ERPs. We recorded ERPs from human subjects while they viewed natural images, each presented for 70 ms. Our results showed that Weibull contrast statistics, as a biologically plausible model, explained the variance of ERPs best, compared to the other image statistics that we assessed. Our time-frequency analysis revealed a significant correlation between these statistics and ERP power within the theta frequency band (~3-7 Hz). This is interesting, as the theta band is believed to be involved in context updating and semantic encoding. This correlation became significant at ~110 ms after stimulus onset and peaked at 138 ms. Our results show that not only the amplitude but also the frequency of neural responses can be modulated by low-level contrast statistics of natural images, and highlight their potential role in scene perception.

  20. Intelligent Transportation Control based on Proactive Complex Event Processing

    OpenAIRE

    Wang Yongheng; Geng Shaofeng; Li Qian

    2016-01-01

    Complex Event Processing (CEP) has become the key part of Internet of Things (IoT). Proactive CEP can predict future system states and execute some actions to avoid unwanted states which brings new hope to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytic technology, a networked distributed Markov decision processes model with predicting states is p...
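
    A complex-event rule of the kind CEP engines evaluate can be sketched as a stateful pattern over an event stream. The congestion rule, event fields, and thresholds below are invented for illustration; the paper's contribution layers prediction (a Markov decision process) on top of such detection:

```python
from collections import defaultdict, deque

def make_congestion_detector(window=60.0, count=3, speed_limit=15.0):
    """Complex event 'congestion': at least `count` slow-speed readings
    on the same road segment within a sliding time `window` (seconds)."""
    recent = defaultdict(deque)   # segment id -> timestamps of slow readings

    def on_event(segment, timestamp, speed):
        q = recent[segment]
        if speed < speed_limit:
            q.append(timestamp)
        while q and timestamp - q[0] > window:
            q.popleft()           # expire readings outside the window
        return len(q) >= count    # True when the complex event fires

    return on_event

detect = make_congestion_detector()
```

    A proactive engine would act on the forecast trend of such detections, e.g. retiming signals, before the unwanted state is actually reached.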

  1. Deep learning based beat event detection in action movie franchises

    Science.gov (United States)

    Ejaz, N.; Khan, U. A.; Martínez-del-Amor, M. A.; Sparenberg, H.

    2018-04-01

    Automatic understanding and interpretation of movies can be used in a variety of ways to semantically manage the massive volumes of movie data. The "Action Movie Franchises" dataset is a collection of twenty Hollywood action movies from five famous franchises, with ground truth annotations at the shot and beat level for each movie. In this dataset, the annotations are provided for eleven semantic beat categories. In this work, we propose a deep learning based method to classify shots and beat-events on this dataset. A training dataset for each of the eleven beat categories is developed and then a Convolutional Neural Network is trained. After finding the shot boundaries, key frames are extracted for each shot and then three classification labels are assigned to each key frame. The classification labels for the key frames in a particular shot are then used to assign a unique label to each shot. A simple sliding window based method is then used to group adjacent shots having the same label in order to find a particular beat event. The results of beat event classification are presented in terms of precision, recall, and F-measure. The results are compared with the existing technique and significant improvements are recorded.
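
    The final grouping step, merging runs of adjacent shots that share a label into beat events, can be sketched as follows (the function name and beat labels are invented for illustration):

```python
def group_shots_into_beats(shot_labels):
    """Merge runs of adjacent shots carrying the same label into beat
    events, returned as (label, first_shot, last_shot) tuples."""
    beats = []
    for i, label in enumerate(shot_labels):
        if beats and beats[-1][0] == label and beats[-1][2] == i - 1:
            beats[-1] = (label, beats[-1][1], i)   # extend the current beat
        else:
            beats.append((label, i, i))            # start a new beat
    return beats

# group_shots_into_beats(["chase", "chase", "fight", "fight", "chase"])
# -> [("chase", 0, 1), ("fight", 2, 3), ("chase", 4, 4)]
```
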

  2. Track-based event recognition in a realistic crowded environment

    Science.gov (United States)

    van Huis, Jasper R.; Bouma, Henri; Baan, Jan; Burghouts, Gertjan J.; Eendebak, Pieter T.; den Hollander, Richard J. M.; Dijk, Judith; van Rest, Jeroen H.

    2014-10-01

    Automatic detection of abnormal behavior in CCTV cameras is important to improve the security in crowded environments, such as shopping malls, airports and railway stations. This behavior can be characterized at different time scales, e.g., by small-scale subtle and obvious actions or by large-scale walking patterns and interactions between people. For example, pickpocketing can be recognized by the actual snatch (small scale), when the pickpocket follows the victim, or when he interacts with an accomplice before and after the incident (longer time scale). This paper focuses on event recognition by detecting large-scale track-based patterns. Our event recognition method consists of several steps: pedestrian detection, object tracking, track-based feature computation and rule-based event classification. In the experiment, we focused on single track actions (walk, run, loiter, stop, turn) and track interactions (pass, meet, merge, split). The experiment includes a controlled setup, where 10 actors perform these actions. The method is also applied to all tracks that are generated in a crowded shopping mall in a selected time frame. The results show that most of the actions can be detected reliably (on average 90%) at a low false positive rate (1.1%), and that the interactions obtain lower detection rates (70% at 0.3% FP). This method may become one of the components that assists operators to find threatening behavior and enrich the selection of videos that are to be observed.
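
    The rule-based classification of single-track actions can be sketched with invented thresholds on one simple track feature, mean speed (the paper computes richer track-based features and more classes):

```python
def classify_track(positions, dt=1.0, run_speed=3.0, stop_speed=0.3):
    """Toy rule-based classifier over a 1-D track: derive per-step
    speeds from consecutive positions sampled every dt seconds, then
    threshold the mean speed. Thresholds (m/s) are illustrative."""
    speeds = [abs(b - a) / dt for a, b in zip(positions, positions[1:])]
    mean_speed = sum(speeds) / len(speeds)
    if mean_speed < stop_speed:
        return "stop"
    if mean_speed > run_speed:
        return "run"
    return "walk"
```

    Interactions such as "meet" or "split" would add rules over pairs of tracks (relative distance and its trend) rather than a single track's speed profile.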

  3. FIREDATA, Nuclear Power Plant Fire Event Data Base

    International Nuclear Information System (INIS)

    Wheelis, W.T.

    2001-01-01

    1 - Description of program or function: FIREDATA contains raw fire event data from 1965 through June 1985. These data were obtained from a number of reference sources including the American Nuclear Insurers, Licensee Event Reports, Nuclear Power Experience, and Electric Power Research Institute Fire Loss Data, and then collated into one database developed in the personal computer database management system dBASE III. FIREDATA is menu-driven and asks interactive questions of the user that allow searching of the database for various aspects of a fire such as: location, mode of plant operation at the time of the fire, means of detection and suppression, dollar loss, etc. Other features include the capability of searching for single or multiple criteria (using Boolean 'and' or 'or' logical operations), user-defined keyword searches of fire event descriptions, summary displays of fire event data by plant name or calendar date, and options for calculating the years of operating experience for all commercial nuclear power plants from any user-specified date and the ability to display general plant information. 2 - Method of solution: The six database files used to store nuclear power plant fire event information, FIRE, DESC, SUM, OPEXPER, OPEXBWR, and EXPERPWR, are accessed by software to display information meeting user-specified criteria or to perform numerical calculations (e.g., to determine the operating experience of a nuclear plant). FIRE contains specific searchable data relating to each of 354 fire events. A keyword concept is used to search each of the 31 separate entries or fields. DESC contains written descriptions of each of the fire events. SUM holds basic plant information for all plants proposed, under construction, in operation, or decommissioned. This includes the initial criticality and commercial operation dates, the physical location of the plant, and its operating capacity. OPEXPER contains date information and data on how various plant locations are

  4. Single-Event Effects in High-Frequency Linear Amplifiers: Experiment and Analysis

    Science.gov (United States)

    Zeinolabedinzadeh, Saeed; Ying, Hanbin; Fleetwood, Zachary E.; Roche, Nicolas J.-H.; Khachatrian, Ani; McMorrow, Dale; Buchner, Stephen P.; Warner, Jeffrey H.; Paki-Amouzou, Pauline; Cressler, John D.

    2017-01-01

    The single-event transient (SET) response of two different silicon-germanium (SiGe) X-band (8-12 GHz) low noise amplifier (LNA) topologies is fully investigated in this paper. The two LNAs were designed and implemented in a 130 nm SiGe HBT BiCMOS process technology. Two-photon absorption (TPA) laser pulses were utilized to induce transients within various devices in these LNAs. Impulse response theory is identified as a useful tool for predicting the settling behavior of the LNAs subjected to heavy-ion strikes. Comprehensive device and circuit level modeling and simulations were performed to accurately simulate the behavior of the circuits under ion strikes. The simulations agree well with the TPA measurements. The simulation, modeling and analysis presented in this paper can be applied to other circuit topologies for SET modeling and prediction.

  5. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events

    International Nuclear Information System (INIS)

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L.

    1989-08-01

    This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report is directed toward PRA practitioners who need to know how the work was done and the details for use in further studies. 58 refs., 58 figs., 52 tabs

  6. The high cost of low-frequency events: the anatomy and economics of surgical mishaps.

    Science.gov (United States)

    Couch, N P; Tilney, N L; Rayner, A A; Moore, F D

    1981-03-12

    We conducted a one-year prospective survey to identify adverse outcomes due to error during care in the field of general surgery. We identified 36 such cases among 5612 surgical admissions to the Peter Bent Brigham Hospital, but in 23 cases the initiating mishap had occurred in another hospital before transfer. In two thirds of the cases the mishap was due to an error of commission: an unnecessary, defective or inappropriate operative procedure. Twenty of these patients died in the hospital, and in 11 death was directly attributable to the error. Five of the 16 survivors left the hospital with serious physical impairment. A satisfactory outcome was achieved in only 11 cases (31%). The average hospital stay was 42 days, with the duration ranging from one to 325 days; the total cost for the 36 patients was $1,732,432. We suggest that all hospitals develop comprehensive methods to identify and prevent these costly and unnecessary events.

  7. Event-related potentials reflecting the frequency of unattended spoken words

    DEFF Research Database (Denmark)

    Shtyrov, Yury; Kimppa, Lilli; Pulvermüller, Friedemann

    2011-01-01

    How are words represented in the human brain, and can these representations be qualitatively assessed with respect to their structure and properties? Recent research demonstrates that neurophysiological signatures of individual words can be measured when subjects do not focus their attention, in passive non-attend conditions, with acoustically matched high- and low-frequency words along with pseudo-words. Using factorial and correlation analyses, we found that already at ~120 ms after the spoken stimulus information was available, the amplitude of brain responses was modulated by the words' lexical ... for the most frequent word stimuli; later on (~270 ms), a more global lexicality effect with bilateral perisylvian sources was found for all stimuli, suggesting faster access to more frequent lexical entries. Our results support the account of word memory traces as interconnected neuronal circuits, and suggest ...

  8. Portable atomic frequency standard based on coherent population trapping

    Science.gov (United States)

    Shi, Fan; Yang, Renfu; Nian, Feng; Zhang, Zhenwei; Cui, Yongshun; Zhao, Huan; Wang, Nuanrang; Feng, Keming

    2015-05-01

    In this work, a portable atomic frequency standard based on coherent population trapping is designed and demonstrated. To achieve a portable prototype, a single-transverse-mode 795 nm VCSEL modulated by a 3.4 GHz RF source is used as a pump laser, which generates the coherent light fields. The pump beams pass through a vapor cell containing atom gas and buffer gas. This vapor cell is surrounded by a magnetic shield and placed inside a solenoid, which applies a longitudinal magnetic field to lift the degeneracy of the Zeeman energy levels and to separate the resonance signal, which has no first-order magnetic field dependence, from the field-dependent resonances. The electrical control system comprises two control loops: the first locks the laser wavelength to the minimum of the absorption spectrum; the second locks the modulation frequency and the output standard frequency. Furthermore, we designed the micro physics package and successfully realized the locking of a portable coherent population trapping atomic frequency standard prototype. The short-term frequency stability of the whole system is measured to be 6×10^-11 at an averaging time of 1 s, reaching 5×10^-12 at an averaging time of 1000 s.
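
    Stability figures of this kind are Allan deviations. A standard non-overlapping estimator from fractional-frequency samples can be sketched as follows; this is the textbook estimator, not the authors' measurement setup:

```python
import math

def allan_deviation(freq_samples, m=1):
    """Non-overlapping Allan deviation sigma_y(tau) at tau = m * tau0,
    where tau0 is the sampling interval of the fractional-frequency
    samples. Needs at least 2*m samples."""
    n = len(freq_samples) // m
    # average the data in consecutive blocks of m samples
    ybar = [sum(freq_samples[i * m:(i + 1) * m]) / m for i in range(n)]
    diffs = [(ybar[k + 1] - ybar[k]) ** 2 for k in range(n - 1)]
    return math.sqrt(sum(diffs) / (2 * (n - 1)))
```

    Sweeping `m` maps out sigma_y versus averaging time, which is how one reads off values such as 6×10^-11 at 1 s and 5×10^-12 at 1000 s.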

  9. FREQUENCY OF SOLAR-LIKE SYSTEMS AND OF ICE AND GAS GIANTS BEYOND THE SNOW LINE FROM HIGH-MAGNIFICATION MICROLENSING EVENTS IN 2005-2008

    International Nuclear Information System (INIS)

    Gould, A.; Dong, Subo; Gaudi, B. S.; Han, C.

    2010-01-01

    We present the first measurement of the planet frequency beyond the 'snow line', for the planet-to-star mass-ratio interval −4.5 < log q < −2: d²N_pl/(d log q d log s) = (0.36 ± 0.15) dex⁻² at the mean mass ratio q = 5×10⁻⁴, with no discernible deviation from a flat (Oepik's law) distribution in log projected separation s. The determination is based on a sample of six planets detected from intensive follow-up observations of high-magnification (A > 200) microlensing events during 2005-2008. The sampled host stars have a typical mass M_host ∼ 0.5 M_sun, and detection is sensitive to planets over a range of planet-star projected separations (s_max⁻¹ R_E, s_max R_E), where R_E ∼ 3.5 AU (M_host/M_sun)^(1/2) is the Einstein radius and s_max ∼ (q/10^(−4.3))^(1/3). This corresponds to deprojected separations roughly three times the 'snow line'. We show that the observations of these events have the properties of a 'controlled experiment', which is what permits measurement of the absolute planet frequency. High-magnification events are rare, but the survey-plus-follow-up high-magnification channel is very efficient: half of all high-mag events were successfully monitored and half of these yielded planet detections. The extremely high sensitivity of high-mag events leads to a policy of monitoring them as intensively as possible, independent of whether they show evidence of planets. This is what allows us to construct an unbiased sample. The planet frequency derived from microlensing is a factor of 8 larger than the one derived from Doppler studies at factor ∼25 smaller star-planet separations (i.e., periods 2-2000 days). However, this difference is basically consistent with the gradient derived from Doppler studies (when extrapolated well beyond the separations from which it is measured). This suggests a universal separation distribution across 2 dex in planet-star separation, 2 dex in mass ratio, and 0.3 dex in host mass. Finally, if all planetary systems were 'analogs' of the solar

  10. Biometric identification based on novel frequency domain facial asymmetry measures

    Science.gov (United States)

    Mitra, Sinjini; Savvides, Marios; Vijaya Kumar, B. V. K.

    2005-03-01

    In the modern world, the ever-growing need to ensure system security has spurred the growth of the newly emerging technology of biometric identification. The present paper introduces a novel set of facial biometrics based on quantified facial asymmetry measures in the frequency domain. In particular, we show that these biometrics work well for face images showing expression variations and have the potential to do so in the presence of illumination variations as well. A comparison of the recognition rates with those obtained from spatial-domain asymmetry measures based on raw intensity values suggests that the frequency-domain representation is more robust to intra-personal distortions and is a novel approach for performing biometric identification. In addition, some feature analysis based on statistical methods, comparing the asymmetry measures across different individuals and across different expressions, is presented.

  11. Short-Period Surface Wave Based Seismic Event Relocation

    Science.gov (United States)

    White-Gaynor, A.; Cleveland, M.; Nyblade, A.; Kintner, J. A.; Homman, K.; Ammon, C. J.

    2017-12-01

    Accurate and precise seismic event locations are essential for a broad range of geophysical investigations. Superior location accuracy generally requires calibration with ground truth information, but superb relative location precision is often achievable independently. In explosion seismology, low-yield explosion monitoring relies on near-source observations, which results in a limited number of observations and challenges our ability to estimate locations at all. Incorporating more distant observations means relying on data with lower signal-to-noise ratios. For small, shallow events, the short-period (roughly 1/2 to 8 s period) fundamental-mode and higher-mode Rayleigh waves (including Rg) are often the most stable and visible portion of the waveform at local distances. Cleveland and Ammon [2013] have shown that teleseismic surface waves are valuable observations for constructing precise, relative event relocations. We extend the teleseismic surface wave relocation method and apply it at near-source distances using Rg observations from the Bighorn Arch Seismic Experiment (BASE) and EarthScope USArray Transportable Array (TA) seismic stations. Specifically, we present relocation results using short-period fundamental- and higher-mode Rayleigh waves (Rg) in a double-difference relative event relocation for 45 delay-fired mine blasts and 21 borehole chemical explosions. Our preliminary efforts explore the sensitivity of the short-period surface waves to local geologic structure, source depth, explosion magnitude (yield), and explosion characteristics (single-shot vs. distributed source, etc.). Our results show that Rg and the first few higher-mode Rayleigh wave observations can be used to constrain the relative locations of shallow low-yield events.
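
    The relative-relocation idea can be sketched for a single event pair under a plane-wave, fixed-slowness approximation that is far simpler than the study's method; the station geometry, slowness, and noise level below are all illustrative assumptions:

```python
import numpy as np

# Plane-wave sketch of relative (double-difference) relocation: the
# differential arrival time at station k is approximately
#   dt_k = s * u_k . dx,
# with dx the epicentral offset of event j relative to event i, u_k the unit
# vector pointing toward station k, and s a fixed phase slowness.
s = 1.0 / 3.0                                    # slowness, s/km (~3 km/s Rg)
az = np.deg2rad([10.0, 80.0, 150.0, 220.0, 300.0])
U = np.stack([np.sin(az), np.cos(az)], axis=1)   # station direction vectors

dx_true = np.array([0.8, -0.5])                  # true relative offset (km)
rng = np.random.default_rng(3)
dt = s * U @ dx_true + 0.001 * rng.standard_normal(az.size)  # + 1 ms noise

# least-squares inversion of the differential times for the relative offset
dx_est, *_ = np.linalg.lstsq(s * U, dt, rcond=None)
```

    With millisecond-level differential-time measurements, the relative offset is recovered to a few tens of meters even though each absolute location may be far more uncertain.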

  12. Temporal and Location Based RFID Event Data Management and Processing

    Science.gov (United States)

    Wang, Fusheng; Liu, Peiya

    Advances in sensor and RFID technology provide significant new power for humans to sense, understand and manage the world. RFID provides fast data collection with precise identification of objects with unique IDs without line of sight, so it can be used for identifying, locating, tracking and monitoring physical objects. Despite these benefits, RFID poses many challenges for data processing and management. RFID data are temporal and history oriented, multi-dimensional, and carry implicit semantics. Moreover, RFID applications are heterogeneous. RFID data management and data warehouse systems need to support generic and expressive data modeling for tracking and monitoring physical objects, and provide automated data interpretation and processing. We develop a powerful temporal and location oriented data model for modeling and querying RFID data, and a declarative event- and rule-based framework for automated complex RFID event processing. The approach is general and can be easily adapted for different RFID-enabled applications, thus significantly reducing the cost of RFID data integration.

  13. Event-Based Control Strategy for Mobile Robots in Wireless Environments.

    Science.gov (United States)

    Socas, Rafael; Dormido, Sebastián; Dormido, Raquel; Fabregas, Ernesto

    2015-12-02

    In this paper, a new event-based control strategy for mobile robots is presented. It has been designed to work in wireless environments where a centralized controller has to exchange information with the robots over an RF (radio frequency) interface. The event-based architectures have been developed for differential wheeled robots, although they can be applied to other kinds of robots in a simple way. The solution has been checked with classical navigation algorithms, like wall following and obstacle avoidance, using scenarios with a single robot or multiple robots. A comparison between the proposed architectures and the classical discrete-time strategy is also carried out. The experimental results show that the proposed solution makes more efficient use of communication resources than the classical discrete-time strategy with the same accuracy.
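
    The communication saving comes from transmitting only when the state has changed enough to matter. A minimal sketch of such an event-triggering rule (a send-on-delta policy; the signal and threshold below are assumed, not taken from the paper):

```python
import numpy as np

def send_on_delta(signal, delta):
    """Event-based sampling: transmit a sample only when it deviates from
    the last transmitted value by more than delta."""
    sent = [0]                        # always transmit the first sample
    last = signal[0]
    for i, x in enumerate(signal[1:], start=1):
        if abs(x - last) > delta:
            sent.append(i)
            last = x
    return sent

t = np.linspace(0, 10, 1001)
x = np.sin(0.5 * t)                   # slowly varying robot state
events = send_on_delta(x, delta=0.05)
```

    For this slowly varying signal the event-based scheme transmits a small fraction of the samples a periodic scheme would, while the reconstruction error between transmissions stays bounded by roughly delta.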

  14. High frequency electromechanical memory cells based on telescoping carbon nanotubes.

    Science.gov (United States)

    Popov, A M; Lozovik, Y E; Kulish, A S; Bichoutskaia, E

    2010-07-01

    A new method to increase the operational frequency of electromechanical memory cells based on the telescoping motion of multi-walled carbon nanotubes, through the selection of the form of the switching voltage pulse, is proposed. The relative motion of the walls of carbon nanotubes can be controlled through the shape of the interwall interaction energy surface. This allows the memory cells to be used in a nonvolatile or volatile regime, depending on the structure of the carbon nanotube. Simulations based on ab initio and semi-empirical calculations of the interwall interaction energies are used to estimate the switching voltage and the operational frequency of volatile cells with electrodes made of carbon nanotubes. The lifetime of nonvolatile memory cells is also predicted.

  15. Order Tracking Based on Robust Peak Search Instantaneous Frequency Estimation

    International Nuclear Information System (INIS)

    Gao, Y; Guo, Y; Chi, Y L; Qin, S R

    2006-01-01

    Order tracking plays an important role in non-stationary vibration analysis of rotating machinery, especially during run-up or coast-down. An instantaneous frequency estimation (IFE) based order tracking method for rotating machinery is introduced, in which a peak-search algorithm applied to the spectrogram of a time-frequency analysis is employed to obtain the IFE of the vibrations. An improvement to the peak search is proposed that prevents strong non-order components or noise from disturbing the search. Compared with traditional methods of order tracking, IFE-based order tracking is simpler in application and purely software-based. Tests verify the validity of the method. The method is an effective supplement to traditional methods, and its application in condition monitoring and diagnosis of rotating machinery is readily imaginable.
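
    A minimal sketch of spectrogram peak-search IFE on a synthetic run-up signal, including a crude band-restricted search in the spirit of the proposed improvement (all parameters here are assumed, not from the paper):

```python
import numpy as np
from scipy.signal import spectrogram

fs = 2000.0
t = np.arange(0, 4, 1 / fs)
f_inst = 20 + 15 * t                       # linear run-up: 20 -> 80 Hz
phase = 2 * np.pi * np.cumsum(f_inst) / fs
x = np.sin(phase) + 0.1 * np.random.default_rng(1).standard_normal(t.size)

f, tt, Sxx = spectrogram(x, fs=fs, nperseg=512, noverlap=448)

# naive peak search: frequency bin with maximum power in each time slice
ife_naive = f[np.argmax(Sxx, axis=0)]

# restricted search: only look within a band around the previous estimate,
# so strong non-order components elsewhere cannot capture the peak
ife = [ife_naive[0]]
band = 10.0                                # Hz, assumed maximum slew per slice
for k in range(1, Sxx.shape[1]):
    mask = np.abs(f - ife[-1]) <= band
    ife.append(f[np.argmax(np.where(mask, Sxx[:, k], 0.0))])
ife = np.array(ife)
```

    The estimate tracks the true run-up to within the spectrogram's frequency resolution (fs / nperseg, here about 3.9 Hz).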

  16. The analysis of cable forces based on natural frequency

    Science.gov (United States)

    Suangga, Made; Hidayat, Irpan; Juliastuti; Bontan, Darwin Julius

    2017-12-01

    A cable is a flexible structural member that is effective at resisting tensile forces. Cables are used in a variety of structures that exploit their unique characteristics to create efficient tension members. The state of the cable forces in a cable-supported structure is an important indication of whether the structure is in good condition. Several methods have been developed to measure cable forces on site. The vibration technique, using the correlation between natural frequency and cable force, is a simple method to determine in situ cable forces; however, it needs accurate information on the boundary conditions, cable mass, and cable length. The natural frequency of the cable is determined by applying the FFT (Fast Fourier Transform) to the acceleration record of the cable. Based on the natural frequency obtained, the cable forces can then be determined analytically or by a finite element program. This research focuses on the vibration technique for determining cable forces, on understanding the effect of the cable's physical parameters, and on modelling techniques relating the natural frequency to the cable forces.
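
    Under the simplest taut-string model (no sag or bending stiffness), the n-th natural frequency obeys f_n = (n / 2L) sqrt(T / m), so the tension follows as T = 4 m L² (f_n / n)². The sketch below applies the FFT-based frequency identification to a synthetic acceleration record with assumed cable properties:

```python
import numpy as np

def cable_force(f_n, n, length, mass_per_m):
    """Taut-string estimate of cable tension from the n-th natural
    frequency: f_n = (n / 2L) sqrt(T / m)  =>  T = 4 m L^2 (f_n / n)^2.
    Ignores bending stiffness and sag (the simplest possible model)."""
    return 4.0 * mass_per_m * length ** 2 * (f_n / n) ** 2

# synthetic acceleration record of a cable with f1 = 2.5 Hz
fs, f1 = 200.0, 2.5
t = np.arange(0, 60, 1 / fs)
acc = np.sin(2 * np.pi * f1 * t) + 0.3 * np.sin(2 * np.pi * 2 * f1 * t)

# natural frequency via the FFT peak, as in the vibration technique
spec = np.abs(np.fft.rfft(acc))
freqs = np.fft.rfftfreq(acc.size, 1 / fs)
f1_est = freqs[np.argmax(spec)]

# assumed cable: 100 m long, 50 kg/m
T = cable_force(f1_est, n=1, length=100.0, mass_per_m=50.0)
```

    Because the force scales with the square of the identified frequency, the frequency resolution of the FFT (1 / record length) directly limits the force accuracy.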

  17. Estimating the impact of extreme events on crude oil price. An EMD-based event analysis method

    International Nuclear Information System (INIS)

    Zhang, Xun; Wang, Shouyang; Yu, Lean; Lai, Kin Keung

    2009-01-01

    The impact of extreme events on crude oil markets is of great importance in crude oil price analysis due to the fact that those events generally exert strong impact on crude oil markets. For better estimation of the impact of events on crude oil price volatility, this study attempts to use an EMD-based event analysis approach for this task. In the proposed method, the time series to be analyzed is first decomposed into several intrinsic modes with different time scales from fine-to-coarse and an average trend. The decomposed modes respectively capture the fluctuations caused by the extreme event or other factors during the analyzed period. It is found that the total impact of an extreme event is included in only one or several dominant modes, but the secondary modes provide valuable information on subsequent factors. For overlapping events with influences lasting for different periods, their impacts are separated and located in different modes. For illustration and verification purposes, two extreme events, the Persian Gulf War in 1991 and the Iraq War in 2003, are analyzed step by step. The empirical results reveal that the EMD-based event analysis method provides a feasible solution to estimating the impact of extreme events on crude oil prices variation. (author)
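
    A heavily simplified sketch of the EMD idea used above: each intrinsic mode is obtained by sifting (subtracting the mean of spline envelopes through the extrema), and the fine-to-coarse modes plus residue reconstruct the signal exactly. Real EMD uses adaptive stopping criteria; the fixed iteration count and test signal here are illustrative only:

```python
import numpy as np
from scipy.interpolate import CubicSpline

def sift(x, n_iter=10):
    """Extract one intrinsic mode by repeated envelope-mean subtraction
    (fixed number of sifting iterations; a crude stand-in for EMD proper)."""
    h = x.copy()
    idx = np.arange(h.size)
    for _ in range(n_iter):
        maxima = [i for i in range(1, h.size - 1) if h[i - 1] < h[i] > h[i + 1]]
        minima = [i for i in range(1, h.size - 1) if h[i - 1] > h[i] < h[i + 1]]
        if len(maxima) < 4 or len(minima) < 4:
            break                        # too few extrema to build envelopes
        upper = CubicSpline(maxima, h[maxima])(idx)
        lower = CubicSpline(minima, h[minima])(idx)
        h = h - (upper + lower) / 2.0    # remove the local envelope mean
    return h

def emd(x, n_imfs=2):
    """Fine-to-coarse decomposition; the residue makes reconstruction exact."""
    imfs, resid = [], x.copy()
    for _ in range(n_imfs):
        imf = sift(resid)
        imfs.append(imf)
        resid = resid - imf
    return np.array(imfs), resid

t = np.linspace(0.0, 1.0, 1000)
x = np.sin(2 * np.pi * 30 * t) + np.sin(2 * np.pi * 4 * t)  # fine + coarse
imfs, resid = emd(x, n_imfs=2)
```

    In the event-analysis setting, the coarse modes and residue carry the slow trend attributable to a major event, while the fine modes capture short-lived fluctuations.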

  18. Estimating the impact of extreme events on crude oil price. An EMD-based event analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xun; Wang, Shouyang [Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190 (China); School of Mathematical Sciences, Graduate University of Chinese Academy of Sciences, Beijing 100190 (China); Yu, Lean [Institute of Systems Science, Academy of Mathematics and Systems Science, Chinese Academy of Sciences, Beijing 100190 (China); Lai, Kin Keung [Department of Management Sciences, City University of Hong Kong, Tat Chee Avenue, Kowloon (China)

    2009-09-15

    The impact of extreme events on crude oil markets is of great importance in crude oil price analysis due to the fact that those events generally exert strong impact on crude oil markets. For better estimation of the impact of events on crude oil price volatility, this study attempts to use an EMD-based event analysis approach for this task. In the proposed method, the time series to be analyzed is first decomposed into several intrinsic modes with different time scales from fine-to-coarse and an average trend. The decomposed modes respectively capture the fluctuations caused by the extreme event or other factors during the analyzed period. It is found that the total impact of an extreme event is included in only one or several dominant modes, but the secondary modes provide valuable information on subsequent factors. For overlapping events with influences lasting for different periods, their impacts are separated and located in different modes. For illustration and verification purposes, two extreme events, the Persian Gulf War in 1991 and the Iraq War in 2003, are analyzed step by step. The empirical results reveal that the EMD-based event analysis method provides a feasible solution to estimating the impact of extreme events on crude oil prices variation. (author)

  19. MFAM: Multiple Frequency Adaptive Model-Based Indoor Localization Method.

    Science.gov (United States)

    Tuta, Jure; Juric, Matjaz B

    2018-03-24

    This paper presents MFAM (Multiple Frequency Adaptive Model-based localization method), a novel model-based indoor localization method that is capable of using multiple wireless signal frequencies simultaneously. It utilizes an indoor architectural model and the physical properties of wireless signal propagation through objects and space. The motivation for developing a multiple frequency localization method lies in the future Wi-Fi standards (e.g., 802.11ah) and the growing number of various wireless signals present in buildings (e.g., Wi-Fi, Bluetooth, ZigBee, etc.). Current indoor localization methods mostly rely on a single wireless signal type and often require many devices to achieve the necessary accuracy. MFAM utilizes multiple wireless signal types and improves the localization accuracy over the usage of a single frequency. It continuously monitors signal propagation through space and adapts the model according to the changes indoors. Using multiple signal sources lowers the required number of access points for a specific signal type while utilizing signals already present indoors. Due to the unavailability of 802.11ah hardware, we have evaluated the proposed method with similar signals: 2.4 GHz Wi-Fi and 868 MHz HomeMatic home automation signals. We have performed the evaluation in a modern two-bedroom apartment and measured a mean localization error of 2.0 to 2.3 m and a median error of 2.0 to 2.2 m. Based on our evaluation results, using two different signals improves the localization accuracy by 18% in comparison to a 2.4 GHz Wi-Fi-only approach. Additional signals would improve the accuracy even further. We have shown that MFAM provides better accuracy than competing methods, while having several advantages for real-world usage.
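
    A toy sketch of the core idea of pooling two frequencies under a propagation model: each signal type gets its own log-distance path-loss parameters, and the position that best explains all observations jointly is selected. The access-point positions and parameters below are assumptions for illustration; MFAM's actual architectural model is far richer:

```python
import numpy as np

def rss(p, ap, p0, n_exp):
    """Log-distance path-loss model: received signal strength (dBm) at
    point p from an access point at ap (p0 = power at 1 m reference)."""
    d = max(np.linalg.norm(p - ap), 0.1)
    return p0 - 10.0 * n_exp * np.log10(d)

# two signal types with different (assumed) propagation parameters
aps = {
    "wifi_2g4": (np.array([[0.0, 0.0], [10.0, 0.0]]), -40.0, 2.2),
    "hm_868":   (np.array([[0.0, 10.0], [10.0, 10.0]]), -30.0, 1.9),
}
truth = np.array([4.0, 6.0])
obs = {k: np.array([rss(truth, ap, p0, n) for ap in a])
       for k, (a, p0, n) in aps.items()}

def sse(p):
    """Pool model-vs-observation residuals over both frequencies."""
    total = 0.0
    for k, (a, p0, n) in aps.items():
        pred = np.array([rss(p, ap, p0, n) for ap in a])
        total += float(np.sum((pred - obs[k]) ** 2))
    return total

# brute-force grid search over the floor plan
xs = np.linspace(0.0, 10.0, 101)
best = min(((x, y) for x in xs for y in xs), key=lambda q: sse(np.array(q)))
```

    With two access points per frequency, neither signal type alone constrains the position well, but the pooled residual has a clear minimum at the true location.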

  20. MFAM: Multiple Frequency Adaptive Model-Based Indoor Localization Method

    Directory of Open Access Journals (Sweden)

    Jure Tuta

    2018-03-01

    Full Text Available This paper presents MFAM (Multiple Frequency Adaptive Model-based localization method), a novel model-based indoor localization method that is capable of using multiple wireless signal frequencies simultaneously. It utilizes an indoor architectural model and the physical properties of wireless signal propagation through objects and space. The motivation for developing a multiple frequency localization method lies in the future Wi-Fi standards (e.g., 802.11ah) and the growing number of various wireless signals present in buildings (e.g., Wi-Fi, Bluetooth, ZigBee, etc.). Current indoor localization methods mostly rely on a single wireless signal type and often require many devices to achieve the necessary accuracy. MFAM utilizes multiple wireless signal types and improves the localization accuracy over the usage of a single frequency. It continuously monitors signal propagation through space and adapts the model according to the changes indoors. Using multiple signal sources lowers the required number of access points for a specific signal type while utilizing signals already present indoors. Due to the unavailability of 802.11ah hardware, we have evaluated the proposed method with similar signals: 2.4 GHz Wi-Fi and 868 MHz HomeMatic home automation signals. We have performed the evaluation in a modern two-bedroom apartment and measured a mean localization error of 2.0 to 2.3 m and a median error of 2.0 to 2.2 m. Based on our evaluation results, using two different signals improves the localization accuracy by 18% in comparison to a 2.4 GHz Wi-Fi-only approach. Additional signals would improve the accuracy even further. We have shown that MFAM provides better accuracy than competing methods, while having several advantages for real-world usage.

  1. A Bayesian Model for Event-based Trust

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl; Sassone, Vladimiro

    2007-01-01

    The application scenarios envisioned for ‘global ubiquitous computing’ have unique requirements that are often incompatible with traditional security paradigms. One alternative currently being investigated is to support security decision-making by explicit representation of principals' trusting...... of the systems from the computational trust literature; the comparison is derived formally, rather than obtained via experimental simulation as traditionally done. With this foundation in place, we formalise a general notion of information about past behaviour, based on event structures. This yields a flexible...

  2. MAS Based Event-Triggered Hybrid Control for Smart Microgrids

    DEFF Research Database (Denmark)

    Dou, Chunxia; Liu, Bin; Guerrero, Josep M.

    2013-01-01

    This paper is focused on advanced control for autonomous microgrids. In order to improve performance regarding security and stability, a hierarchical decentralized coordinated control scheme is proposed based on a multi-agent structure. Moreover, corresponding to the multi-mode and hybrid characteristics of microgrids, an event-triggered hybrid control, including three kinds of switching controls, is designed to intelligently reconstruct the operation mode when the security stability assessment indexes or the constraint conditions are violated. The validity of the proposed control scheme is demonstrated......

  3. Intelligent Transportation Control based on Proactive Complex Event Processing

    Directory of Open Access Journals (Sweden)

    Wang Yongheng

    2016-01-01

    Full Text Available Complex Event Processing (CEP) has become a key part of the Internet of Things (IoT). Proactive CEP can predict future system states and execute actions to avoid unwanted states, which brings new hope to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytics, a networked distributed Markov decision process model with state prediction is proposed as the sequential decision model. A Q-learning method is proposed for this model. The experimental evaluations show that this method works well when used to control congestion in intelligent transportation systems.
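
    A minimal tabular Q-learning sketch on a toy deterministic MDP, illustrating the learning rule behind such a sequential decision model (the environment below is invented for illustration, not the paper's traffic model):

```python
import numpy as np

# Toy deterministic MDP: states 0..3 on a line, actions {0: left, 1: right};
# reaching state 3 yields reward 1 and ends the episode.
N_S, N_A, GAMMA, ALPHA = 4, 2, 0.9, 0.5
rng = np.random.default_rng(0)
Q = np.zeros((N_S, N_A))

def step(s, a):
    s2 = min(s + 1, N_S - 1) if a == 1 else max(s - 1, 0)
    reward = 1.0 if s2 == N_S - 1 else 0.0
    return s2, reward, s2 == N_S - 1

for _ in range(1000):                      # episodes
    s = 0
    for _ in range(60):                    # epsilon-greedy rollout
        a = int(rng.integers(N_A)) if rng.random() < 0.3 else int(np.argmax(Q[s]))
        s2, r, done = step(s, a)
        # Q-learning update: bootstrap from the best next-state value
        target = r + (0.0 if done else GAMMA * Q[s2].max())
        Q[s, a] += ALPHA * (target - Q[s, a])
        s = s2
        if done:
            break

policy = Q.argmax(axis=1)                  # greedy policy after learning
```

    The learned greedy policy moves right from every non-terminal state, and the Q-values approach the discounted returns 0.81, 0.9, 1.0.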

  4. Helium gas purity monitor based on low frequency acoustic resonance

    Science.gov (United States)

    Kasthurirengan, S.; Jacob, S.; Karunanithi, R.; Karthikeyan, A.

    1996-05-01

    Monitoring gas purity is an important aspect of gas recovery stations, where air is usually one of the major impurities. Purity monitors of the katharometric type are commercially available for this purpose. Alternatively, we discuss here a helium gas purity monitor based on the acoustic resonance of a cavity at audio frequencies. It measures purity by monitoring the resonant frequency of a cylindrical cavity filled with the gas under test and excited by conventional telephone transducers fixed at the ends; the use of the latter simplifies the design considerably. The paper discusses the details of the resonant cavity and the electronic circuit, along with temperature compensation. The unit has been calibrated with helium gas of known purities, has a response time of the order of 10 minutes, and measures the gas purity to an accuracy of 0.02%. It has been installed in our helium recovery system and is found to perform satisfactorily.
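
    The physics behind the monitor: the fundamental longitudinal resonance f = c / 2L shifts with the speed of sound c = sqrt(γRT/M), which depends strongly on the mixture's mean molar mass. A sketch with an assumed cavity length and simple ideal-gas mixing rules (not the paper's design values):

```python
import numpy as np

R, T, L = 8.314, 293.15, 0.20          # gas constant, temperature (K), cavity length (m)
# molar masses (kg/mol) and molar heat capacities Cv (J/mol/K)
M_HE, M_AIR = 4.003e-3, 28.97e-3
CV_HE, CV_AIR = 1.5 * R, 2.5 * R       # monatomic vs (mostly) diatomic

def resonant_frequency(x_he):
    """Fundamental longitudinal resonance f = c / 2L of a cavity filled with
    a helium/air mixture, helium mole fraction x_he (ideal-gas mixing)."""
    m = x_he * M_HE + (1 - x_he) * M_AIR
    cv = x_he * CV_HE + (1 - x_he) * CV_AIR
    gamma = (cv + R) / cv              # Cp = Cv + R for ideal gases
    c = np.sqrt(gamma * R * T / m)
    return c / (2 * L)

f_pure = resonant_frequency(1.0)       # ~2.5 kHz for a 20 cm cavity
f_cont = resonant_frequency(0.99)      # 1% air contamination
```

    Because air is seven times heavier than helium, even 1% contamination lowers the resonance by roughly 3%, which is why a frequency measurement can resolve purity to 0.02%.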

  5. Intermediate Frequency Digital Receiver Based on Multi-FPGA System

    Directory of Open Access Journals (Sweden)

    Chengchang Zhang

    2016-01-01

    Full Text Available Aiming at the high cost, large size, and inflexibility of traditional analog intermediate-frequency receivers in aerospace telemetry, tracking, and command (TTC) systems, we propose a new intermediate frequency (IF) digital receiver based on a multi-FPGA system. Digital beam forming (DBF) is realized by the coordinate rotation digital computer (CORDIC) algorithm. An experimental prototype has been developed on a compact multi-FPGA system with three FPGAs to receive 16 channels of IF digital signals. Our experimental results show that the proposed scheme greatly simplifies the design of an IF digital receiver, offering a valuable reference for real-time, low-power, high-density, and small-size receiver design.
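
    A scalar floating-point sketch of the CORDIC rotation that such a DBF phase rotator relies on; an FPGA implementation would use fixed-point shifts and adds, but the iteration is the same:

```python
import math

def cordic_sincos(theta, n_iter=32):
    """Rotation-mode CORDIC: compute (cos, sin) of theta (|theta| <= pi/2)
    using only shift-and-add style micro-rotations."""
    # precomputed micro-rotation angles atan(2^-i) and the fixed gain K
    angles = [math.atan(2.0 ** -i) for i in range(n_iter)]
    K = 1.0
    for i in range(n_iter):
        K /= math.sqrt(1.0 + 2.0 ** (-2 * i))
    x, y, z = 1.0, 0.0, theta
    for i in range(n_iter):
        d = 1.0 if z >= 0 else -1.0       # rotate toward zero residual angle
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * angles[i]
    return x * K, y * K                   # undo the accumulated gain

c, s = cordic_sincos(0.7)
```

    In hardware, the multiplications by 2^-i become bit shifts and K is folded into the initial value, so each beamforming phase rotation costs only adders.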

  6. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job-shop type manufacturing, but certain facilities make it suitable for FMS as well as production-line manufacturing. This type of simulation is very useful in the analysis of any changes that occur in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, the use of late shifts, etc. The effects these changes have on, for instance, throughput, the amount of WIP (work in progress), costs or net profit can be analysed, and this can be done before the changes are made and without disturbing the real system. Simulation, unlike other tools for the analysis of manufacturing systems, takes into consideration uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects that a job which is late in one machine has on the remaining machines in its route through the layout; it is these effects that cause production plans not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object-oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user-friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages with business graphics and statistical functions is convenient in the result presentation phase.
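
    The event-list mechanism at the heart of discrete-event simulation can be sketched with a priority queue; the single-machine FIFO model below is invented for illustration and is far simpler than SIMMEK's models:

```python
import heapq

def simulate(arrivals, service_time):
    """Minimal discrete-event loop for one machine: jobs are served in
    arrival order; returns {job index: completion time}."""
    events = [(t, "arrival", j) for j, t in enumerate(arrivals)]
    heapq.heapify(events)                   # the event list, ordered by time
    busy_until, done = 0.0, {}
    while events:
        t, kind, j = heapq.heappop(events)  # advance the clock to next event
        if kind == "arrival":
            start = max(t, busy_until)      # wait if the machine is busy
            busy_until = start + service_time
            heapq.heappush(events, (busy_until, "departure", j))
        else:
            done[j] = t
    return done

completions = simulate([0.0, 1.0, 2.0], service_time=2.0)
```

    Jobs arriving faster than the 2-time-unit service rate queue up, so completion times spread out to 2, 4 and 6, exactly the interaction effect between late jobs and downstream capacity described above.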

  7. Quantification of LOCA core damage frequency based on thermal-hydraulics analysis

    International Nuclear Information System (INIS)

    Cho, Jaehyun; Park, Jin Hee; Kim, Dong-San; Lim, Ho-Gon

    2017-01-01

    Highlights: • We quantified the LOCA core damage frequency based on a best-estimate success criteria analysis. • Thermal-hydraulic analysis using the MARS code has been applied to the Korea Standard Nuclear Power Plant. • Five new event trees with new break-size boundaries and new success criteria were developed. • The core damage frequency is 5.80E−07 (/y), which is 12% less than with the conventional PSA event trees. - Abstract: A loss-of-coolant accident (LOCA) has always been considered one of the most important initiating events. However, most probabilistic safety assessment models up to now have adopted the same three groups of LOCA, and even the exact break-size boundaries, used in the WASH-1400 report published in 1975. With an awareness of the importance of a realistic PSA for risk-informed applications, several studies have tried to find the realistic thermal-hydraulic behavior of a LOCA and improve the PSA model. The purpose of this research is to obtain realistic results for the LOCA core damage frequency based on a success criteria analysis using a best-estimate thermal-hydraulics code. To do so, the Korea Standard Nuclear Power Plant (KSNP) was selected for this study. The MARS code was used for the thermal-hydraulics analysis and the AIMS code was used for the core damage quantification. One of the major findings of the thermal-hydraulics analysis was that the decay power is well removed by normal secondary cooling alone in LOCAs below 1.4 in and by high-pressure safety injection alone in LOCAs of 0.8–9.4 in. Based on the thermal-hydraulics results regarding new break-size boundaries and new success criteria, five new event trees (ETs) were developed. The core damage frequency of the new LOCA ETs is 5.80E−07 (/y), which is 12% less than that of the conventional PSA ETs. In this research, we obtained not only thermal-hydraulics characteristics for the entire break-size range of a LOCA in view of the deterministic safety
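
    The quantification step can be illustrated with a toy event tree: each accident sequence contributes the initiating-event frequency times the product of the branch probabilities along it, and the core damage frequency is the sum over core-damage sequences. All numbers below are invented for illustration, not the study's values:

```python
import math

# Hypothetical, simplified event-tree quantification for one LOCA group
IE_FREQ = 1.0e-3                       # /y, assumed initiating-event frequency

# each sequence: (branch probabilities along the path, leads to core damage?)
sequences = [
    ([0.999], False),                  # HPSI succeeds -> safe state
    ([1.0e-3, 0.99], False),           # HPSI fails, secondary cooling works
    ([1.0e-3, 1.0e-2], True),          # both fail -> core damage
]

# CDF = sum over core-damage sequences of IE frequency x branch probabilities
cdf = sum(IE_FREQ * math.prod(probs)
          for probs, core_damage in sequences if core_damage)
```

    Redrawing the break-size boundaries, as the study does, changes both the initiating-event frequencies and which mitigation branches appear in each tree, which is how the overall CDF shifts.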

  8. Quantification of LOCA core damage frequency based on thermal-hydraulics analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jaehyun, E-mail: chojh@kaeri.re.kr; Park, Jin Hee; Kim, Dong-San; Lim, Ho-Gon

    2017-04-15

    Highlights: • We quantified the LOCA core damage frequency based on a best-estimate success criteria analysis. • Thermal-hydraulic analysis using the MARS code has been applied to the Korea Standard Nuclear Power Plant. • Five new event trees with new break-size boundaries and new success criteria were developed. • The core damage frequency is 5.80E−07 (/y), which is 12% less than with the conventional PSA event trees. - Abstract: A loss-of-coolant accident (LOCA) has always been considered one of the most important initiating events. However, most probabilistic safety assessment models up to now have adopted the same three groups of LOCA, and even the exact break-size boundaries, used in the WASH-1400 report published in 1975. With an awareness of the importance of a realistic PSA for risk-informed applications, several studies have tried to find the realistic thermal-hydraulic behavior of a LOCA and improve the PSA model. The purpose of this research is to obtain realistic results for the LOCA core damage frequency based on a success criteria analysis using a best-estimate thermal-hydraulics code. To do so, the Korea Standard Nuclear Power Plant (KSNP) was selected for this study. The MARS code was used for the thermal-hydraulics analysis and the AIMS code was used for the core damage quantification. One of the major findings of the thermal-hydraulics analysis was that the decay power is well removed by normal secondary cooling alone in LOCAs below 1.4 in and by high-pressure safety injection alone in LOCAs of 0.8–9.4 in. Based on the thermal-hydraulics results regarding new break-size boundaries and new success criteria, five new event trees (ETs) were developed. The core damage frequency of the new LOCA ETs is 5.80E−07 (/y), which is 12% less than that of the conventional PSA ETs. In this research, we obtained not only thermal-hydraulics characteristics for the entire break-size range of a LOCA in view of the deterministic safety

  9. Event-based soil loss models for construction sites

    Science.gov (United States)

    Trenouth, William R.; Gharabaghi, Bahram

    2015-05-01

    The elevated rates of soil erosion stemming from land clearing and grading activities during urban development can result in excessive amounts of eroded sediment entering waterways and harming the biota living therein. However, construction site event-based soil loss simulations - required for reliable design of erosion and sediment controls - are among the most uncertain types of hydrologic models. This study presents models with an improved degree of accuracy to advance the design of erosion and sediment controls for construction sites. The new models are developed using multiple linear regression (MLR) on event-based permutations of the Universal Soil Loss Equation (USLE) and artificial neural networks (ANN). These models were developed using surface runoff monitoring datasets obtained from three sites - Greensborough, Cookstown, and Alcona - in Ontario and datasets mined from the literature for three additional sites - Treynor, Iowa; Coshocton, Ohio; and Cordoba, Spain. The predictive MLR and ANN models can serve as both diagnostic and design tools for the effective sizing of erosion and sediment controls on active construction sites, and can be used for dynamic scenario forecasting when considering rapidly changing land use conditions during various phases of construction.
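
    For reference, an event-based USLE evaluation is simply a product of factors, A = R·K·LS·C·P, with the event erosivity R taken from the storm's EI30 term; the factor values below are hypothetical, not the study's fitted coefficients:

```python
def event_soil_loss(ei30, k, ls, c, p):
    """Event-wise USLE: soil loss A = R * K * LS * C * P, with the storm's
    EI30 used as the event erosivity R (illustrative metric units)."""
    return ei30 * k * ls * c * p

# hypothetical storm on a bare, graded construction slope
a = event_soil_loss(ei30=150.0,  # MJ·mm/(ha·h), storm erosivity
                    k=0.04,      # t·ha·h/(ha·MJ·mm), soil erodibility
                    ls=1.5,      # slope length-steepness factor
                    c=1.0,       # cover factor (bare soil)
                    p=1.0)       # support-practice factor
# a is the per-storm soil loss in t/ha
```

    The MLR and ANN models in the study refine this kind of multiplicative baseline with site-monitored runoff data, precisely because the raw factor product is so uncertain event to event.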

  10. Modeling time to recovery and initiating event frequency for loss of off-site power incidents at nuclear power plants

    International Nuclear Information System (INIS)

    Iman, R.L.; Hora, S.C.

    1988-01-01

    Industry data representing the time to recovery of loss of off-site power at nuclear power plants for 63 incidents caused by plant-centered losses, grid losses, or severe weather losses are fit with exponential, lognormal, gamma and Weibull probability models. A Bayesian analysis is used to compare the adequacy of each of these models and to provide uncertainty bounds on each of the fitted models. A composite model that combines the probability models fitted to each of the three sources of data is presented as a method for predicting the time to recovery of loss of off-site power. The composite model is very general and can be made site specific by making adjustments on the models used, such as might occur due to the type of switchyard configuration or type of grid, and by adjusting the weights on the individual models, such as might occur with weather conditions existing at a particular plant. Adjustments in the composite model are shown for different models used for switchyard configuration and for different weights due to weather. Bayesian approaches are also presented for modeling the frequency of initiating events leading to loss of off-site power. One Bayesian model assumes that all plants share a common incidence rate for loss of off-site power, while the other Bayesian approach models the incidence rate for each plant relative to the incidence rates of all other plants. Combining the Bayesian models for the frequency of the initiating events with the composite Bayesian model for recovery provides the necessary vehicle for a complete model that incorporates uncertainty into a probabilistic risk assessment
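
    The composite model can be sketched as a probability mixture over the three loss causes, with weights that can be shifted for site-specific conditions (e.g., toward severe weather at a storm-prone plant). The component distributions and parameters below are illustrative stand-ins, not the fitted industry values:

```python
import numpy as np

rng = np.random.default_rng(42)

# per-cause recovery-time models (hours) and mixture weights (assumed)
components = {
    "plant-centered": (0.6, lambda n: rng.lognormal(-0.5, 1.0, size=n)),
    "grid-related":   (0.3, lambda n: rng.gamma(2.0, 0.5, size=n)),
    "severe-weather": (0.1, lambda n: 4.0 * rng.weibull(1.5, size=n)),
}

def sample_recovery(n):
    """Draw recovery times from the weighted composite (mixture) model;
    reweighting the components makes the model site-specific."""
    names = list(components)
    w = np.array([components[k][0] for k in names])
    counts = rng.multinomial(n, w / w.sum())    # how many draws per cause
    draws = np.concatenate([components[k][1](c) for k, c in zip(names, counts)])
    rng.shuffle(draws)
    return draws

times = sample_recovery(10_000)
p_over_4h = float(np.mean(times > 4.0))  # chance power is still out at 4 h
```

    Monte Carlo samples from the mixture feed directly into the PRA: the tail probability of recovery times longer than the battery or coping time is what drives station-blackout risk.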

  11. SPREAD: a high-resolution daily gridded precipitation dataset for Spain – an extreme events frequency and intensity overview

    Directory of Open Access Journals (Sweden)

    R. Serrano-Notivoli

    2017-09-01

    Full Text Available A high-resolution daily gridded precipitation dataset was built from raw data of 12 858 observatories covering the period from 1950 to 2012 in peninsular Spain and 1971 to 2012 in the Balearic and Canary islands. The original data were quality-controlled and gaps were filled on each day and location independently. Using the serially complete dataset, a grid with a 5 × 5 km spatial resolution was constructed by estimating daily precipitation amounts and their corresponding uncertainty at each grid node. Daily precipitation estimations were compared to original observations to assess the quality of the gridded dataset. Four daily precipitation indices were computed to characterise the spatial distribution of daily precipitation, and nine extreme precipitation indices were used to describe the frequency and intensity of extreme precipitation events. The Mediterranean coast and the Central Range showed the highest frequency and intensity of extreme events, while the number of wet days and dry and wet spells followed a north-west to south-east gradient in peninsular Spain, from high to low values in the number of wet days and wet spells and the reverse in dry spells. The use of the total available data in Spain, the independent estimation of precipitation for each day and the high spatial resolution of the grid allowed for a precise spatial and temporal assessment of daily precipitation that is difficult to achieve when using other methods, pre-selected long-term stations or global gridded datasets. The SPREAD dataset is publicly available at https://doi.org/10.20350/digitalCSIC/7393.
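
    Indices of the kind used here, such as dry-spell length and wet-day counts, are simple to compute from a serially complete daily series. A minimal sketch with an invented series and the common 1 mm wet-day threshold (the paper's exact index definitions may differ):

```python
def consecutive_dry_days(precip, wet_threshold=1.0):
    """Longest run of days with precipitation below the wet-day threshold (mm)."""
    longest = run = 0
    for p in precip:
        run = run + 1 if p < wet_threshold else 0
        longest = max(longest, run)
    return longest

def wet_days(precip, wet_threshold=1.0):
    """Number of days at or above the wet-day threshold (mm)."""
    return sum(1 for p in precip if p >= wet_threshold)

daily = [0.0, 0.2, 5.1, 0.0, 0.0, 0.0, 12.4, 0.0, 3.3, 0.0]  # invented mm values
print(consecutive_dry_days(daily), wet_days(daily))  # -> 3 3
```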

  12. Single event upset threshold estimation based on local laser irradiation

    International Nuclear Information System (INIS)

    Chumakov, A.I.; Egorov, A.N.; Mavritsky, O.B.; Yanenko, A.V.

    1999-01-01

    An approach for the estimation of the ion-induced SEU threshold based on local laser irradiation is presented. Comparative experiments and software simulations were performed at various pulse durations and spot sizes. A correlation between the single event threshold LET and the upset threshold laser energy under local irradiation was found, and a computer analysis of local laser irradiation of IC structures was developed for SEU threshold LET estimation. Two estimation techniques are suggested. The first is based on the determination of the local laser threshold dose, taking into account the ratio of the sensitive area to the locally irradiated area. The second technique uses the photocurrent peak value instead of this ratio. The agreement between the predicted and experimental results demonstrates the applicability of this approach. (authors)

  13. Lessons Learned from Real-Time, Event-Based Internet Science Communications

    Science.gov (United States)

    Phillips, T.; Myszka, E.; Gallagher, D. L.; Adams, M. L.; Koczor, R. J.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The Directorate's Science Roundtable includes active researchers, NASA public relations, educators, and administrators. The Science@NASA award-winning family of Web sites features science, mathematics, and space news. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing science activities in real-time has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases, broadcasts accommodate active feedback and questions from Internet participants. Through these projects a pattern has emerged in the level of interest or popularity with the public. The pattern differentiates projects that include science from those that do not. All real-time, event-based Internet activities have captured public interest at a level not achieved through science stories or educator resource material exclusively. The worst event-based activity attracted more interest than the best written science story. One truly rewarding lesson learned through these projects is that the public recognizes the importance and excitement of being part of scientific discovery. Flying a camera to 100,000 feet altitude isn't as interesting to the public as searching for viable life-forms at these oxygen-poor altitudes. The details of these real-time, event-based projects and lessons learned will be discussed.

  14. Reliability research based experience with systems and events at the Kozloduy NPP units 1-4

    Energy Technology Data Exchange (ETDEWEB)

    Khristova, R; Kaltchev, B; Dimitrov, B [Energoproekt, Sofia (Bulgaria); Nedyalkova, D; Sonev, A [Kombinat Atomna Energetika, Kozloduj (Bulgaria)

    1996-12-31

    An overview of equipment reliability based on operational data of selected safety systems at the Kozloduy NPP is presented. Conclusions are drawn on reliability of the service water system, feed water system, emergency power supply - category 2, emergency high pressure ejection system and spray system. For the units 1-4 all recorded accident protocols in the period 1974-1993 have been processed and the main initiators identified. A list with 39 most frequent initiators of accidents/incidents is compiled. The human-caused errors account for 27% of all events. The reliability characteristics and frequencies have been calculated for all initiating events. It is concluded that there have not been any accidents with consequences for fuel integrity or radioactive release. 14 refs.

  15. Reliability research based experience with systems and events at the Kozloduy NPP units 1-4

    International Nuclear Information System (INIS)

    Khristova, R.; Kaltchev, B.; Dimitrov, B.; Nedyalkova, D.; Sonev, A.

    1995-01-01

    An overview of equipment reliability based on operational data of selected safety systems at the Kozloduy NPP is presented. Conclusions are drawn on reliability of the service water system, feed water system, emergency power supply - category 2, emergency high pressure ejection system and spray system. For the units 1-4 all recorded accident protocols in the period 1974-1993 have been processed and the main initiators identified. A list with 39 most frequent initiators of accidents/incidents is compiled. The human-caused errors account for 27% of all events. The reliability characteristics and frequencies have been calculated for all initiating events. It is concluded that there have not been any accidents with consequences for fuel integrity or radioactive release. 14 refs

  16. Input signal shaping based on harmonic frequency response function for suppressing nonlinear optical frequency in frequency-scanning interferometry

    Science.gov (United States)

    Zhu, Yu; Liu, Zhigang; Deng, Wen; Deng, Zhongwen

    2018-05-01

    Frequency-scanning interferometry (FSI) using an external cavity diode laser (ECDL) is essential for many applications of absolute distance measurement. However, owing to the hysteresis and creep of the piezoelectric actuator inherent in the ECDL, the optical frequency scan exhibits a nonlinearity that seriously affects the phase extraction accuracy of the interference signal and results in reduced measurement accuracy. To suppress the optical frequency nonlinearity, a harmonic frequency synthesis method is presented for shaping the desired input signal, which replaces the original triangular wave. The effectiveness of the presented shaping method is demonstrated through comparison of experimental results. Compared with an incremental Renishaw interferometer, the standard deviation of the displacement measurement of the FSI system is less than 2.4 μm when driven by the shaped signal.
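
    The idea of harmonic synthesis can be sketched by building the drive signal as a tunable sum of harmonics; a plain triangular wave is then just one particular choice of harmonic weights. The function and weights below are illustrative, not the paper's actual shaping law:

```python
import math

def shaped_drive(t, f0, harmonics):
    """Sum of harmonic components A_k * sin(2*pi*k*f0*t + phi_k).

    'harmonics' maps harmonic number k -> (A_k, phi_k). Tuning these weights
    and phases is the degree of freedom a harmonic-synthesis shaping method
    adjusts; the values used below are illustrative only.
    """
    return sum(A * math.sin(2 * math.pi * k * f0 * t + phi)
               for k, (A, phi) in harmonics.items())

# An (unnormalized) triangular wave is the special case with odd harmonics
# weighted A_k = (-1)^((k-1)/2) / k^2 and zero phase.
tri = {k: (((-1) ** ((k - 1) // 2)) / k**2, 0.0) for k in (1, 3, 5, 7)}
samples = [shaped_drive(i / 1000, f0=10.0, harmonics=tri) for i in range(100)]
print(round(max(samples), 4))
```

    A shaping method would then perturb the amplitudes and phases away from the triangular-wave values to pre-compensate the actuator's nonlinear response.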

  17. Only low frequency event-related EEG activity is compromised in multiple sclerosis: insights from an independent component clustering analysis.

    Directory of Open Access Journals (Sweden)

    Hanni Kiiski

    Full Text Available Cognitive impairment (CI), often examined with neuropsychological tests such as the Paced Auditory Serial Addition Test (PASAT), affects approximately 65% of multiple sclerosis (MS) patients. The P3b event-related potential (ERP), evoked when an infrequent target stimulus is presented, indexes cognitive function and is typically compared across subjects' scalp electroencephalography (EEG) data. However, the clustering of independent components (ICs) is superior to scalp-based EEG methods because it can accommodate the spatiotemporal overlap inherent in scalp EEG data. Event-related spectral perturbations (ERSPs; event-related mean power spectral changes) and inter-trial coherence (ITCs; event-related consistency of spectral phase) reveal a more comprehensive overview of EEG activity. Ninety-five subjects (56 MS patients, 39 controls) completed visual and auditory two-stimulus P3b event-related potential tasks and the PASAT. MS patients were also divided into CI and non-CI groups (n = 18 in each) based on PASAT scores. Data were recorded from 128 scalp EEG channels, and 4 IC clusters in the visual, and 5 IC clusters in the auditory, modality were identified. In general, MS patients had significantly reduced ERSP theta power versus controls, and a similar pattern was observed for CI vs. non-CI MS patients. The ITC measures were also significantly different in the theta band for some clusters. The finding that MS patients had reduced P3b task-related theta power in both modalities is a reflection of compromised connectivity, likely due to demyelination, that may have disrupted early processes essential to P3b generation, such as orientating and signal detection. However, for posterior sources, MS patients had a greater decrease in alpha power, normally associated with enhanced cognitive function, which may reflect a compensatory mechanism in response to the compromised early cognitive processing.

  18. High Frequency Supercapacitors for Piezo-based Energy Harvesting

    Science.gov (United States)

    Ervin, Matthew; Pereira, Carlos; Miller, John; Outlaw, Ronald; Rastegar, Jay; Murray, Richard

    2013-03-01

    Energy harvesting is being investigated as an alternative to batteries for powering munition guidance and fuzing functions during flight. A piezoelectric system that generates energy from the oscillation of a mass on a spring (set in motion by the launch acceleration) is being developed. Original designs stored this energy in an electrolytic capacitor for use during flight. Here we replace the electrolytic capacitor with a smaller, lighter, and potentially more reliable electrochemical double layer capacitor (aka, supercapacitor). The potential problems with using supercapacitors in this application are that the piezoelectric output greatly exceeds the supercapacitor electrolyte breakdown voltage, and the frequency greatly exceeds the operating frequency of commercial supercapacitors. Here we have investigated the use of ultrafast vertically oriented graphene array-based supercapacitors for storing the energy in this application. We find that the electrolyte breakdown is not a serious limitation as it is either kinetically limited by the relatively high frequency of the piezoelectric output, or it is overcome by the self-healing nature of supercapacitors. We also find that these supercapacitors have sufficient dynamic response to efficiently store the generated energy.

  19. Single Frequency Network Based Distributed Passive Radar Technology

    Directory of Open Access Journals (Sweden)

    Wan Xian-rong

    2015-01-01

    Full Text Available The research and application of passive radar are heading from single transmitter-receiver pairs to multiple transmitter-receiver pairs. As an important class of illuminators of opportunity, most modern digital broadcasting and television systems work on Single Frequency Networks (SFN), which intrinsically determines that passive radar based on such illuminators must be distributed and networked. In consideration of the remarkable working and processing mode of passive radar under the SFN configuration, this paper proposes the concept of SFN-based Distributed Passive Radar (SDPR). The main characteristics and key problems of SDPR are first described. Then several potential solutions are discussed for some of the key technologies. The feasibility of SDPR is demonstrated by preliminary experimental results. Finally, the concept of four-network convergence, which includes the broadcast-based passive radar network, is conceived, and its application prospects are discussed.

  20. Radiative transport-based frequency-domain fluorescence tomography

    International Nuclear Information System (INIS)

    Joshi, Amit; Rasmussen, John C; Sevick-Muraca, Eva M; Wareing, Todd A; McGhee, John

    2008-01-01

    We report the development of radiative transport model-based fluorescence optical tomography from frequency-domain boundary measurements. The coupled radiative transport model for describing NIR fluorescence propagation in tissue is solved by novel software based on the established Attila(TM) particle transport simulation platform. The proposed scheme enables the prediction of fluorescence measurements with non-contact sources and detectors at a minimal computational cost. An adjoint transport solution-based fluorescence tomography algorithm is implemented on dual grids to efficiently assemble the measurement sensitivity Jacobian matrix. Finally, we demonstrate fluorescence tomography on a realistic computational mouse model to locate nM to μM fluorophore concentration distributions in simulated mouse organs.

  1. Frequency selective surfaces based high performance microstrip antenna

    CERN Document Server

    Narayan, Shiv; Jha, Rakesh Mohan

    2016-01-01

    This book focuses on performance enhancement of printed antennas using frequency selective surface (FSS) technology. The growing demand for stealth technology in strategic areas requires high-performance low-RCS (radar cross section) antennas. Such requirements may be accomplished by incorporating FSS into the antenna structure, either in its ground plane or as the superstrate, due to the filter characteristics of the FSS structure. In view of this, a novel approach based on FSS technology is presented in this book to enhance the performance of printed antennas, including out-of-band structural RCS reduction. In this endeavor, the EM design of microstrip patch antennas (MPA) loaded with FSS-based (i) high impedance surface (HIS) ground planes, and (ii) superstrates is discussed in detail. The EM analysis of the proposed FSS-based antenna structures has been carried out using transmission line analogy, in combination with the reciprocity theorem. Further, various types of novel FSS structures are considered in desi...

  2. Electrophysiological correlates of strategic monitoring in event-based and time-based prospective memory.

    Directory of Open Access Journals (Sweden)

    Giorgia Cona

    Full Text Available Prospective memory (PM) is the ability to remember to accomplish an action when a particular event occurs (i.e., event-based PM), or at a specific time (i.e., time-based PM), while performing an ongoing activity. Strategic Monitoring is one of the basic cognitive functions supporting PM tasks, and involves two mechanisms: a retrieval mode, which consists of maintaining the intention active in memory; and target checking, engaged in verifying the presence of the PM cue in the environment. The present study is aimed at providing the first evidence of event-related potentials (ERPs) associated with time-based PM, and at examining differences and commonalities in the ERPs related to Strategic Monitoring mechanisms between event- and time-based PM tasks. The addition of an event-based or a time-based PM task to an ongoing activity led to a similar sustained positive modulation of the ERPs in the ongoing trials, mainly expressed over prefrontal and frontal regions. This modulation might index the retrieval mode mechanism, similarly engaged in the two PM tasks. On the other hand, two further ERP modulations were shown specifically in the event-based PM task. An increased positivity was shown at 400-600 ms post-stimulus over occipital and parietal regions, and might be related to target checking. Moreover, an early modulation at 130-180 ms post-stimulus seems to reflect the recruitment of attentional resources for being ready to respond to the event-based PM cue. This latter modulation suggests the existence of a third mechanism specific to event-based PM; that is, the "readiness mode".

  3. Triple-Frequency GPS Precise Point Positioning Ambiguity Resolution Using Dual-Frequency Based IGS Precise Clock Products

    Directory of Open Access Journals (Sweden)

    Fei Liu

    2017-01-01

    Full Text Available With the availability of the third civil signal in the Global Positioning System, triple-frequency Precise Point Positioning (PPP) ambiguity resolution methods have drawn increasing attention due to significantly reduced convergence time. However, the corresponding triple-frequency based precise clock products are not widely available or adopted by applications. Currently, most precise products are generated based on the ionosphere-free combination of dual-frequency L1/L2 signals, which are not consistent with triple-frequency ionosphere-free carrier-phase measurements, resulting in inaccurate positioning and unstable float ambiguities. In this study, a GPS triple-frequency PPP ambiguity resolution method is developed using the widely used dual-frequency based clock products. In this method, the inter-frequency clock biases between the triple-frequency and dual-frequency ionosphere-free carrier-phase measurements are first estimated and then applied to the triple-frequency ionosphere-free carrier-phase measurements to obtain stable float ambiguities. After this, the integer property of the wide-lane L2/L5 and wide-lane L1/L2 ambiguities is recovered by estimating the satellite fractional cycle biases. A test using a sparse network is conducted to verify the effectiveness of the method. The results show that ambiguity resolution can be achieved in minutes, or even tens of seconds, and the positioning accuracy is at the decimeter level.
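
    The inconsistency described above can be made concrete by computing the standard first-order ionosphere-free combination coefficients for the two frequency pairs: the L1/L2 and L1/L5 combinations weight the carrier phases (and hence the satellite clock) differently, which is what gives rise to the inter-frequency clock bias. The frequencies are the published GPS values; the sketch is illustrative:

```python
# GPS carrier frequencies (Hz)
F1, F2, F5 = 1575.42e6, 1227.60e6, 1176.45e6

def iono_free_coeffs(fa, fb):
    """Coefficients (alpha, beta) such that alpha*La + beta*Lb cancels the
    first-order ionospheric delay: alpha = fa^2/(fa^2 - fb^2),
    beta = -fb^2/(fa^2 - fb^2); note alpha + beta = 1."""
    d = fa**2 - fb**2
    return fa**2 / d, -fb**2 / d

a12, b12 = iono_free_coeffs(F1, F2)   # dual-frequency L1/L2 combination
a15, b15 = iono_free_coeffs(F1, F5)   # combination involving the L5 signal

# Because (a12, b12) != (a15, b15), clock products estimated from the L1/L2
# combination are not directly consistent with L5-based combinations.
print(round(a12, 4), round(b12, 4))   # -> 2.5457 -1.5457
```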

  4. VLSI-based video event triggering for image data compression

    Science.gov (United States)

    Williams, Glenn L.

    1994-02-01

    Long-duration, on-orbit microgravity experiments require a combination of high resolution and high frame rate video data acquisition. The digitized high-rate video stream presents a difficult data storage problem. Data produced at rates of several hundred million bytes per second may require a total mission video data storage requirement exceeding one terabyte. A NASA-designed, VLSI-based, highly parallel digital state machine generates a digital trigger signal at the onset of a video event. High capacity random access memory storage coupled with newly available fuzzy logic devices permits the monitoring of a video image stream for long term (DC-like) or short term (AC-like) changes caused by spatial translation, dilation, appearance, disappearance, or color change in a video object. Pre-trigger and post-trigger storage techniques are then adaptable to archiving only the significant video images.

  5. Long distance measurement with a femtosecond laser based frequency comb

    Science.gov (United States)

    Bhattacharya, N.; Cui, M.; Zeitouny, M. G.; Urbach, H. P.; van den Berg, S. A.

    2017-11-01

    Recent advances in the field of ultra-short pulse lasers have led to the development of reliable sources of carrier envelope phase stabilized femtosecond pulses. The pulse train generated by such a source has a frequency spectrum that consists of discrete, regularly spaced lines known as a frequency comb. In this case both the repetition frequency and the carrier-envelope-offset frequency are referenced to a frequency standard, such as an atomic clock. As a result the accuracy of the frequency standard is transferred to the optical domain, with the frequency comb as transfer oscillator. These unique properties allow the frequency comb to be applied as a versatile tool, not only for time and frequency metrology, but also in fundamental physics, high-precision spectroscopy, and laser noise characterization. The pulse-to-pulse phase relationship of the light emitted by the frequency comb has opened up new directions for long range, highly accurate distance measurement.

  6. Event-based proactive interference in rhesus monkeys.

    Science.gov (United States)

    Devkar, Deepna T; Wright, Anthony A

    2016-10-01

    Three rhesus monkeys (Macaca mulatta) were tested in a same/different memory task for proactive interference (PI) from prior trials. PI occurs when a previous sample stimulus appears as a test stimulus on a later trial, does not match the current sample stimulus, and the wrong response "same" is made. Trial-unique pictures (scenes, objects, animals, etc.) were used on most trials, except on trials where the test stimulus matched a potentially interfering sample stimulus from a prior trial (1, 2, 4, 8, or 16 trials prior). Greater interference occurred when fewer trials separated interference and test. PI functions showed a continuum of interference. Delays between sample and test stimuli and intertrial intervals were manipulated to test how PI might vary as a function of elapsed time. Contrary to a similar study with pigeons, these time manipulations had no discernible effect on the monkeys' PI, as shown by complete overlap of PI functions with no statistical differences or interactions. These results suggested that interference was strictly based upon the number of intervening events (trials with other pictures) without regard to elapsed time. The monkeys' apparent event-based interference was further supported by retesting with a novel set of 1,024 pictures. PI from novel pictures 1 or 2 trials prior was greater than from familiar pictures (a familiar set of 1,024 pictures). Moreover, when potentially interfering novel stimuli were 16 trials prior, performance accuracy was actually greater than accuracy on baseline trials (no interference), suggesting that remembering a stimulus from 16 trials prior served as a cue that it was not the sample stimulus on the current trial - a somewhat surprising conclusion.

  7. LPI Radar Waveform Recognition Based on Time-Frequency Distribution

    Directory of Open Access Journals (Sweden)

    Ming Zhang

    2016-10-01

    Full Text Available In this paper, an automatic radar waveform recognition system for a high noise environment is proposed. Signal waveform recognition techniques are widely applied in the fields of cognitive radio, spectrum management and radar applications, etc. We devise a system to classify the modulating signals widely used in low probability of intercept (LPI) radar detection systems. The radar signals are divided into eight classes, including linear frequency modulation (LFM), BPSK (Barker code modulation), Costas codes and polyphase codes (comprising Frank, P1, P2, P3 and P4). The classifier is an Elman neural network (ENN), performing supervised classification based on features extracted by the system. Through techniques of image filtering, image opening operation, skeleton extraction, principal component analysis (PCA), an image binarization algorithm and Pseudo-Zernike moments, etc., the features are extracted from the Choi–Williams time-frequency distribution (CWD) image of the received data. In order to reduce redundant features and simplify calculation, a feature selection algorithm based on mutual information between classes and feature vectors is applied. The superiority of the proposed classification system is demonstrated by simulations and analysis. Simulation results show that the overall ratio of successful recognition (RSR) is 94.7% at a signal-to-noise ratio (SNR) of −2 dB.

  8. A Frequency-Based Approach to Intrusion Detection

    Directory of Open Access Journals (Sweden)

    Mian Zhou

    2004-06-01

    Full Text Available Research on network security and intrusion detection strategies presents many challenging issues to both theoreticians and practitioners. Hackers apply an array of intrusion and exploit techniques to cause disruption of normal system operations, but on the defense, firewalls and intrusion detection systems (IDS) are typically only effective in defending against known intrusion types using their signatures, and are far less mature when faced with novel attacks. In this paper, we adapt frequency analysis techniques such as the Discrete Fourier Transform (DFT), used in signal processing, to the design of intrusion detection algorithms. We demonstrate the effectiveness of the frequency-based detection strategy by running synthetic network intrusion data in simulated networks using the OPNET software. The simulation results indicate that the proposed intrusion detection strategy is effective in detecting anomalous traffic data that exhibit patterns over time, which include several types of DOS and probe attacks. The significance of this new strategy is that it does not depend on prior knowledge of attack signatures, and thus has the potential to be a useful supplement to existing signature-based IDS and firewalls.
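
    The core of a frequency-based detector of this kind can be sketched as a DFT over an event-count time series: periodic probe traffic concentrates energy in one frequency bin regardless of its payload signature. The traffic series below is synthetic, and the detector is a simplification of the paper's strategy:

```python
import cmath

def dominant_period(counts):
    """Period (in samples) of the strongest non-DC component of an
    event-count series, computed with a direct (O(n^2)) DFT."""
    n = len(counts)
    mean = sum(counts) / n
    centered = [c - mean for c in counts]      # remove the DC component
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2 + 1):
        coeff = sum(c * cmath.exp(-2j * cmath.pi * k * i / n)
                    for i, c in enumerate(centered))
        if abs(coeff) > best_mag:
            best_k, best_mag = k, abs(coeff)
    return n / best_k

# Synthetic per-second connection counts: a probe modulates the count with
# period 4 on top of a constant background level.
traffic = [8, 5, 2, 5] * 16       # 64 samples, period-4 oscillation
print(dominant_period(traffic))   # -> 4.0
```

    A production detector would use an FFT over sliding windows and flag windows whose spectral peaks deviate from a learned baseline.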

  9. High Frequency Vibration Based Fatigue Testing of Developmental Alloys

    Science.gov (United States)

    Holycross, Casey M.; Srinivasan, Raghavan; George, Tommy J.; Tamirisakandala, Seshacharyulu; Russ, Stephan M.

    Many fatigue test methods have previously been developed to rapidly evaluate fatigue behavior. This increased test speed can come at some expense, since these methods may require non-standard specimen geometry or increased facility and equipment capability. One such method, developed by George et al., involves a base-excited plate specimen driven into a high frequency bending resonant mode. This resonant mode is of sufficient frequency (typically 1200 to 1700 Hertz) to accumulate 10^7 cycles in a few hours. One of the main limitations of this test method is that fatigue cracking is almost certainly guaranteed to be surface initiated at regions of high stress. This brings into question the validity of the fatigue test results, as compared to more traditional uniaxial, smooth-bar testing, since high stresses are subjecting only a small volume to fatigue damage. This limitation also brings into question the suitability of this method for screening developmental alloys, should their initiation life be governed by subsurface flaws. However, if applicable, the rapid generation of fatigue data using this method would facilitate faster design iterations, identifying material and manufacturing process deficiencies more quickly. The developmental alloy used in this study was a powder metallurgy boron-modified Ti-6Al-4V, a new alloy currently being considered for gas turbine engine fan blades. Plate specimens were subjected to fully reversed bending fatigue. Results are compared with existing data from commercially available Ti-6Al-4V using both vibration based and more traditional fatigue test methods.

  10. Ultra High-Speed Radio Frequency Switch Based on Photonics.

    Science.gov (United States)

    Ge, Jia; Fok, Mable P

    2015-11-26

    Microwave switches, or Radio Frequency (RF) switches, have been intensively used in microwave systems for signal routing. Compared with the fast development of microwave and wireless systems, RF switches have been underdeveloped, particularly in terms of switching speed and operating bandwidth. In this paper, we propose a photonics based RF switch that is capable of switching at tens of picoseconds speed, which is hundreds of times faster than any existing RF switch technologies. The high-speed switching property is achieved with the use of a rapidly tunable microwave photonic filter with tens of gigahertz frequency tuning speed, where the tuning mechanism is based on the ultra-fast electro-optic Pockels effect. The RF switch has a wide operation bandwidth of 12 GHz and can go up to 40 GHz, depending on the bandwidth of the modulator used in the scheme. The proposed RF switch can work either as an ON/OFF switch or as a two-channel switch; tens of picoseconds switching speed is experimentally observed for both types of switches.

  11. Floods of the Lower Tisza from the late 17th century onwards: frequency, magnitude, seasonality and great flood events

    Science.gov (United States)

    Kiss, Andrea

    2016-04-01

    The present paper is based on a recently developed database including contemporary original administrative, legal and private source materials (published and archival) as well as media reports related to floods that occurred on the lower sections of the Tisza river in Hungary, with special emphasis on the area of the town of Szeged. The study area is well represented by contemporary source evidence from the late 17th century onwards, when the town and its broader area were reoccupied from the Ottoman Empire. Concerning the applied source materials, the main bases of the investigation are administrative (archival) sources such as the town council protocols of Szeged and the county meeting protocols of Csanád and Csongrád Counties. In these (legal-)administrative documents damaging events (natural/environmental hazards) were systematically recorded. Moreover, other source types such as taxation-related damage accounts as well as private and official reports, letters and correspondence (published and unpublished) were also included. Concerning published evidence, a most important source is flood reports in contemporary newspapers as well as town chronicles and other contemporary narratives. In the presentation the main focus is on the analysis of flood-rich and flood-poor periods of the last ca. 330 years; moreover, the seasonal distribution as well as the magnitude of Tisza flood events are also discussed. Another important aim of the poster is to provide a short overview, in the form of case studies, of the greatest flood events (e.g. duration, magnitude, damages, multi-annual consequences) and their further impacts on urban and countryside development as well as on (changes in) flood defence strategies. In this respect, two flood events are presented in more detail: the great (1815-)1816 flood and the catastrophic 1879 flood (briefly, with causes and consequences), the latter of which practically erased Szeged from the ground.

  12. Fast LCMV-based Methods for Fundamental Frequency Estimation

    DEFF Research Database (Denmark)

    Jensen, Jesper Rindom; Glentis, George-Othon; Christensen, Mads Græsbøll

    2013-01-01

    peaks and require matrix inversions for each point in the search grid. In this paper, we therefore consider fast implementations of LCMV-based fundamental frequency estimators, exploiting the estimators' inherently low displacement rank of the used Toeplitz-like data covariance matrices, using as such either the classic time domain averaging covariance matrix estimator, or, if aiming for an increased spectral resolution, the covariance matrix resulting from the application of the recent iterative adaptive approach (IAA). The proposed exact implementations reduce the required computational complexity with several orders of magnitude, but, as we show, further computational savings can be obtained by the adoption of an approximative IAA-based data covariance matrix estimator, reminiscent of the recently proposed Quasi-Newton IAA technique. Furthermore, it is shown how the considered pitch estimators can...

  13. Tunable antenna radome based on graphene frequency selective surface

    Science.gov (United States)

    Qu, Meijun; Rao, Menglou; Li, Shufang; Deng, Li

    2017-09-01

    In this paper, a graphene-based frequency selective surface (FSS) is proposed. The proposed FSS exhibits a tunable bandpass filtering characteristic due to the alterable conductivity of the graphene strips, which is controlled by the chemical potential. Based on the reconfigurable bandpass property of the proposed FSS, a cylindrical antenna radome is designed using the FSS unit cells. A conventional omnidirectional dipole can realize a two-beam directional pattern when it is placed into the proposed antenna radome. Forward and backward endfire radiation of the dipole loaded with the radome is realized by properly adjusting the chemical potential. The proposed antenna radome is extremely promising for beam-scanning in terahertz and mid-infrared plasmonic devices and systems when the gain of a conventional antenna needs to be enhanced.

  14. A data base approach for prediction of deforestation-induced mass wasting events

    Science.gov (United States)

    Logan, T. L.

    1981-01-01

    A major topic of concern in timber management is determining the impact of clear-cutting on slope stability. Deforestation treatments on steep mountain slopes have often resulted in a high frequency of major mass wasting events. The Geographic Information System (GIS) is a potentially useful tool for predicting the location of mass wasting sites. With a raster-based GIS, digitally encoded maps of slide hazard parameters can be overlaid and modeled to produce new maps depicting high-probability slide areas. The present investigation examines the raster-based information system as a tool for predicting the locations of clear-cut mountain slopes that are most likely to experience shallow soil debris avalanches. A literature overview is conducted, taking into account vegetation, roads, precipitation, soil type, slope angle and aspect, and models predicting mass soil movements. Attention is given to a data base approach and aspects of slide prediction.
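
    The overlay-and-model step described above can be sketched with plain raster arrays. This is a minimal illustration: the layer names, normalisation and weights below are assumptions for the sketch, not values from the study.

```python
import numpy as np

def slide_hazard_map(slope_deg, clearcut, soil_weakness, weights=(0.5, 0.3, 0.2)):
    """Combine co-registered raster layers into a relative slide-hazard score.

    Each layer is normalised to [0, 1], then the layers are overlaid as a
    weighted sum -- a minimal stand-in for the GIS overlay modelling
    described in the abstract.
    """
    def norm(layer):
        layer = np.asarray(layer, dtype=float)
        span = layer.max() - layer.min()
        return (layer - layer.min()) / span if span > 0 else np.zeros_like(layer)

    w_slope, w_cut, w_soil = weights
    return w_slope * norm(slope_deg) + w_cut * norm(clearcut) + w_soil * norm(soil_weakness)

# Toy 3x3 rasters: steep, clear-cut, weak-soil cells should score highest.
slope = np.array([[10, 20, 35], [15, 30, 40], [5, 25, 45]])
cut = np.array([[0, 0, 1], [0, 1, 1], [0, 0, 1]])
soil = np.array([[1, 2, 3], [1, 3, 3], [1, 2, 3]])

hazard = slide_hazard_map(slope, cut, soil)
print(hazard.round(2))
```

    Thresholding the resulting score map (e.g. cells above 0.8) would yield the kind of high-probability slide-area map the abstract describes.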

  15. Ontology-Based Vaccine Adverse Event Representation and Analysis.

    Science.gov (United States)

    Xie, Jiangan; He, Yongqun

    2017-01-01

    Vaccines are among the greatest inventions of modern medicine, having contributed enormously to the relief of human misery and the dramatic increase in life expectancy. In 1796, an English country physician, Edward Jenner, discovered that inoculating people with cowpox can protect them from smallpox (Riedel S, Edward Jenner and the history of smallpox and vaccination. Proceedings (Baylor University. Medical Center) 18(1):21, 2005). Through worldwide vaccination, smallpox was finally eradicated in 1977 (Henderson, Vaccine 29:D7-D9, 2011). Other disabling and lethal diseases, like poliomyelitis and measles, are targeted for eradication (Bonanni, Vaccine 17:S120-S125, 1999). Although vaccine development and administration are tremendously successful and cost-effective practices for human health, no vaccine is 100% safe for everyone, because each person reacts to vaccination differently given differing genetic backgrounds and health conditions. Although all licensed vaccines are generally safe for the majority of people, vaccinees may still suffer adverse events (AEs) in reaction to various vaccines, some of which can be serious or even fatal (Haber et al., Drug Saf 32(4):309-323, 2009). Hence, the double-edged sword of vaccination remains a concern. To support integrative AE data collection and analysis, it is critical to adopt an AE normalization strategy. In past decades, different controlled terminologies have been developed, including the Medical Dictionary for Regulatory Activities (MedDRA) (Brown EG, Wood L, Wood S, et al., Drug Saf 20(2):109-117, 1999), the Common Terminology Criteria for Adverse Events (CTCAE) (NCI, The Common Terminology Criteria for Adverse Events (CTCAE). Available from: http://evs.nci.nih.gov/ftp1/CTCAE/About.html . Access on 7 Oct 2015), and the World Health Organization (WHO) Adverse Reactions Terminology (WHO-ART) (WHO, The WHO Adverse Reaction Terminology - WHO-ART. Available from: https://www.umc-products.com/graphics/28010.pdf

  16. Rydberg-atom based radio-frequency electrometry using frequency modulation spectroscopy in room temperature vapor cells.

    Science.gov (United States)

    Kumar, Santosh; Fan, Haoquan; Kübler, Harald; Jahangiri, Akbar J; Shaffer, James P

    2017-04-17

    Rydberg atom-based electrometry enables traceable electric field measurements with high sensitivity over a large frequency range, from gigahertz to terahertz. Such measurements are particularly useful for the calibration of radio frequency and terahertz devices, as well as other applications like near-field imaging of electric fields. We utilize frequency-modulated spectroscopy with active control of residual amplitude modulation to improve the signal-to-noise ratio of the optical readout of Rydberg atom-based radio frequency electrometry. Matched filtering of the signal is also implemented. Although we have reached similarly high sensitivity with other read-out methods, frequency-modulated spectroscopy is advantageous because it is well suited for building a compact, portable sensor. In the current experiment, a sensitivity of ∼3 µV cm⁻¹ Hz⁻¹/² is achieved and is found to be photon shot-noise limited.

  17. EGC: a time-frequency augmented template-based method for gravitational wave burst search in ground-based interferometers

    International Nuclear Information System (INIS)

    Clapson, Andre-Claude; Barsuglia, Matteo; Bizouard, Marie-Anne; Brisson, Violette; Cavalier, Fabien; Davier, Michel; Hello, Patrice; Leroy, Nicolas; Varvella, Monica

    2008-01-01

    The detection of burst-type events in the output of ground-based gravitational wave detectors is particularly challenging. The potential variety of astrophysical waveforms proposed by simulations and analytic studies in general relativity, and the discrimination of actual signals from instrumental noise, are both critical issues. Robust methods that achieve reasonable detection performance over a wide range of signals are required. We present here a hybrid burst-detection pipeline, related to time-frequency transforms but based on matched filtering to provide robustness against noise characteristics. Studies on simulated noise show that the algorithm has a detection efficiency similar to other methods over very different waveforms, and particularly good timing even for low-amplitude signals: no bias for most tested waveforms and an average accuracy of 1.1 ms (down to 0.1 ms in the best case). Time-frequency-type parameters, useful for event classification, are also derived for noise spectral densities unfavourable to standard time-frequency algorithms

  18. EGC: a time-frequency augmented template-based method for gravitational wave burst search in ground-based interferometers

    Energy Technology Data Exchange (ETDEWEB)

    Clapson, Andre-Claude; Barsuglia, Matteo; Bizouard, Marie-Anne; Brisson, Violette; Cavalier, Fabien; Davier, Michel; Hello, Patrice; Leroy, Nicolas; Varvella, Monica [LAL, Universite Paris-Sud 11, BP 34, 91898 Orsay (France)

    2008-02-07

    The detection of burst-type events in the output of ground-based gravitational wave detectors is particularly challenging. The potential variety of astrophysical waveforms proposed by simulations and analytic studies in general relativity, and the discrimination of actual signals from instrumental noise, are both critical issues. Robust methods that achieve reasonable detection performance over a wide range of signals are required. We present here a hybrid burst-detection pipeline, related to time-frequency transforms but based on matched filtering to provide robustness against noise characteristics. Studies on simulated noise show that the algorithm has a detection efficiency similar to other methods over very different waveforms, and particularly good timing even for low-amplitude signals: no bias for most tested waveforms and an average accuracy of 1.1 ms (down to 0.1 ms in the best case). Time-frequency-type parameters, useful for event classification, are also derived for noise spectral densities unfavourable to standard time-frequency algorithms.

  19. Frequency modulation reveals the phasing of orbital eccentricity during Cretaceous Oceanic Anoxic Event II and the Eocene hyperthermals

    Science.gov (United States)

    Laurin, Jiří; Meyers, Stephen R.; Galeotti, Simone; Lanci, Luca

    2016-05-01

    Major advances in our understanding of paleoclimate change derive from a precise reconstruction of the periods, amplitudes and phases of the 'Milankovitch cycles' of precession, obliquity and eccentricity. While numerous quantitative approaches exist for the identification of these astronomical cycles in stratigraphic data, limitations in radioisotopic dating, and instability of the theoretical astronomical solutions beyond ∼50 Myr ago, can challenge identification of the phase relationships needed to constrain climate response and anchor floating astrochronologies. Here we demonstrate that interference patterns accompanying frequency modulation (FM) of short eccentricity provide a robust basis for identifying the phase of long eccentricity forcing in stratigraphic data. One- and two-dimensional models of sedimentary distortion of the astronomical signal are used to evaluate the veracity of the FM method, and indicate that pristine eccentricity FM can be readily distinguished in paleo-records. Apart from paleoclimatic implications, the FM approach provides a quantitative technique for testing and calibrating theoretical astronomical solutions, and for refining chronologies for the deep past. We present two case studies that use the FM approach to evaluate major carbon-cycle perturbations of the Eocene and Late Cretaceous. Interference patterns in the short-eccentricity band reveal that Eocene hyperthermals ETM2 ('Elmo'), H2, I1 and ETM3 (X; ∼52-54 Myr ago) were associated with maxima in the 405-kyr cycle of orbital eccentricity. The same eccentricity configuration favored regional anoxic episodes in the Mediterranean during the Middle and Late Cenomanian (∼94.5-97 Myr ago). The initial phase of the global Oceanic Anoxic Event II (OAE II; ∼93.9-94.5 Myr ago) coincides with maximum and falling 405-kyr eccentricity, and the recovery phase occurs during minimum and rising 405-kyr eccentricity. On a Myr scale, the event overlaps with a node in eccentricity

  20. What a Shame: Increased Rates of OMS Resident Burnout May Be Related to the Frequency of Shamed Events During Training.

    Science.gov (United States)

    Shapiro, Michael C; Rao, Sowmya R; Dean, Jason; Salama, Andrew R

    2017-03-01

    Shame is an ineffective tool in residency education that often results in depression, isolation, and worse patient care. This study aimed to assess burnout, depersonalization, and personal achievement levels in current oral and maxillofacial surgery (OMS) residents, to assess the prevalence of the use of shame in OMS residency training, and to determine whether there is a relation between shame exposure and resident burnout, depersonalization, and personal achievement levels. An anonymous 20-question cross-sectional survey was developed incorporating the Maslach Burnout Index and a previously validated shame questionnaire and sent to all OMS program directors affiliated with the American Association of Oral and Maxillofacial Surgeons for distribution among their respective residents in 2016. Univariate analyses were used to determine the distribution of the predictor (shame) and outcome (burnout) by gender and by frequency of shaming events. Multivariable logistic regression analysis was used to assess the relation of shame to burnout. A 2-sided P value less than .05 was considered statistically significant. Two hundred seventeen responses were received; 82% of respondents were men (n = 178), 95% were 25 to 34 years old (n = 206), and 58% (n = 126) were enrolled in a 4-year program. Frequently shamed residents were more likely to have depression (58% vs 22%; P < .0001), isolation (55% vs 22%; P < .0001), and poor job performance (50% vs 30%; P < .0001). Residents who were frequently shamed were more likely to experience moderate to severe burnout (odds ratio = 4.6; 95% confidence interval, 2.1-10.0; P < .001) and severe depersonalization (odds ratio = 5.1; 95% confidence interval, 2.1-12.0; P < .0001) than residents who had never or infrequently been shamed. There is a clear relation between the number of shame events and burnout and depersonalization levels. It is important to understand the negative impact that the experience of shame has on residents

  1. Method for Assessing Grid Frequency Deviation Due to Wind Power Fluctuation Based on “Time-Frequency Transformation”

    DEFF Research Database (Denmark)

    Jin, Lin; Yuan-zhang, Sun; Sørensen, Poul Ejnar

    2012-01-01

    published studies are based entirely on deterministic methodology. This paper presents a novel assessment method based on Time-Frequency Transformation to overcome the shortcomings of existing methods. The main contribution of the paper is to propose a stochastic process simulation model which is a better alternative to the existing dynamic frequency deviation simulation model. In this way, the method takes stochastic wind power fluctuation fully into account so as to give grid operators a quantitative risk assessment of grid frequency deviation, even without using any dynamic simulation tool. The case…

  2. Event Completion: Event Based Inferences Distort Memory in a Matter of Seconds

    Science.gov (United States)

    Strickland, Brent; Keil, Frank

    2011-01-01

    We present novel evidence that implicit causal inferences distort memory for events only seconds after viewing. Adults watched videos of someone launching (or throwing) an object. However, the videos omitted the moment of contact (or release). Subjects falsely reported seeing the moment of contact when it was implied by subsequent footage but did…

  3. Detection of planets in extremely weak central perturbation microlensing events via next-generation ground-based surveys

    International Nuclear Information System (INIS)

    Chung, Sun-Ju; Lee, Chung-Uk; Koo, Jae-Rim

    2014-01-01

    Even though the recently discovered high-magnification event MOA-2010-BLG-311 had complete coverage over its peak, confident planet detection did not happen due to extremely weak central perturbations (EWCPs, fractional deviations of ≲ 2%). For confident detection of planets in EWCP events, it is necessary to have both high cadence monitoring and high photometric accuracy better than those of current follow-up observation systems. The next-generation ground-based observation project, Korea Microlensing Telescope Network (KMTNet), satisfies these conditions. We estimate the probability of occurrence of EWCP events with fractional deviations of ≤2% in high-magnification events and the efficiency of detecting planets in the EWCP events using the KMTNet. From this study, we find that the EWCP events occur with a frequency of >50% in the case of ≲ 100 M_E planets with separations of 0.2 AU ≲ d ≲ 20 AU. We find that for main-sequence and sub-giant source stars, ≳ 1 M_E planets in EWCP events with deviations ≤2% can be detected with frequency >50% in a certain range that changes with the planet mass. However, it is difficult to detect planets in EWCP events of bright stars like giant stars because it is easy for KMTNet to be saturated around the peak of the events because of its constant exposure time. EWCP events are caused by close, intermediate, and wide planetary systems with low-mass planets and close and wide planetary systems with massive planets. Therefore, we expect that a much greater variety of planetary systems than those already detected, which are mostly intermediate planetary systems, regardless of the planet mass, will be significantly detected in the near future.

  4. Frequency of damage by external hazards based on geographical information

    Energy Technology Data Exchange (ETDEWEB)

    Becker, G. [RISA Sicherheitsanalysen GmbH, Berlin (Germany); Camarinopoulos, A.; Karali, T. [ERRA, Athens (Greece); Camarinopoulos, L. [Piraeus Univ. (Greece); Schubert, B. [VENE, Hamburg (Germany)

    2013-07-01

    External explosions can contribute significantly to the risk of damage for industrial plants. They may originate from neighboring plants that store and handle explosive substances, or from the transport of such substances by road, rail or water. In all cases, an accident is a necessary condition for a hazard; another probabilistic element is the probability of ignition. If transport causes the explosion, the location of the accident influences the consequences. If deflagration is involved, ignition will not necessarily occur at the place of the accident: a cloud of combustible gas-air mixture may develop and ignite at some distance, depending on wind velocity. To avoid unnecessarily pessimistic approaches, geographical information can be used in addition to local weather statistics. Geographical information systems provide computerized map material for sites, roads, railways and rivers, and this information can be used to derive damage frequencies based on numerical integration or Monte Carlo simulation. A probabilistic model has been developed, based on:
    - a joint probability density function for wind direction and wind speed, estimated from local weather statistics,
    - frequencies of hazards for neighboring plants and various types of traffic,
    - statistics on the amounts and types of explosive materials.
    The model has been implemented using one numerical integration method and two variants of the Monte Carlo method. Data has been collected and applied to a nuclear power plant in Northern Germany as an example. The method, however, can be used for any type of plant subject to external explosion hazards. In its present form, it makes use of design criteria specific to nuclear power plants, but these could be replaced by different criteria. (orig.)
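
    The Monte Carlo variant of such a damage-frequency model can be sketched as follows. Every distribution, rate and distance here is a hypothetical placeholder for the site-specific data the abstract describes (the function name and all parameter values are invented for illustration).

```python
import math
import random

def damage_frequency_mc(n_samples=100_000, accident_rate_per_year=1e-3,
                        ignition_prob=0.1, plant_distance_m=500.0, seed=7):
    """Monte Carlo estimate of the yearly frequency of damaging explosions.

    Toy model: a transport accident releases a gas cloud that drifts
    downwind before igniting; damage occurs if the ignition point lies
    within a critical radius of the plant.
    """
    rng = random.Random(seed)
    critical_radius_m = 200.0
    hits = 0
    for _ in range(n_samples):
        # Wind direction uniform; wind speed from a Weibull draw, a common
        # stand-in for the joint pdf estimated from local weather statistics.
        direction = rng.uniform(0.0, 2.0 * math.pi)
        speed = rng.weibullvariate(5.0, 2.0)   # m/s
        drift = speed * 60.0                   # cloud drift before ignition, metres
        # Ignition point relative to the plant (accident at the origin,
        # plant on the positive x-axis).
        x = plant_distance_m - drift * math.cos(direction)
        y = -drift * math.sin(direction)
        if rng.random() < ignition_prob and math.hypot(x, y) <= critical_radius_m:
            hits += 1
    # Conditional damage probability times the accident frequency.
    return accident_rate_per_year * hits / n_samples

freq = damage_frequency_mc()
print(f"estimated damage frequency: {freq:.2e} per year")
```

    The numerical-integration variant mentioned in the abstract would integrate the same hit condition over the wind pdf instead of sampling it.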

  5. Nano-resonator frequency response based on strain gradient theory

    International Nuclear Information System (INIS)

    Miandoab, Ehsan Maani; Yousefi-Koma, Aghil; Pishkenari, Hossein Nejat; Fathi, Mohammad

    2014-01-01

    This paper aims to explore the dynamic behaviour of a nano-resonator under ac and dc excitation using strain gradient theory. To achieve this goal, the partial differential equation of nano-beam vibration is first converted to an ordinary differential equation by the Galerkin projection method and the lumped model is derived. Lumped parameters of the nano-resonator, such as linear and nonlinear springs and damper coefficients, are compared with those of classical theory and it is demonstrated that beams with smaller thickness display greater deviation from classical parameters. Stable and unstable equilibrium points based on classic and non-classical theories are also compared. The results show that, regarding the applied dc voltage, the dynamic behaviours expected by classical and non-classical theories are significantly different, such that one theory predicts the un-deformed shape as the stable condition, while the other theory predicts that the beam will experience bi-stability. To obtain the frequency response of the nano-resonator, a general equation including cubic and quadratic nonlinearities in addition to parametric electrostatic excitation terms is derived, and the analytical solution is determined using a second-order multiple scales method. Based on frequency response analysis, the softening and hardening effects given by two theories are investigated and compared, and it is observed that neglecting the size effect can lead to two completely different predictions in the dynamic behaviour of the resonators. The findings of this article can be helpful in the design and characterization of the size-dependent dynamic behaviour of resonators on small scales. (paper)

  6. Frequency of damage by external hazards based on geographical information

    International Nuclear Information System (INIS)

    Becker, G.; Camarinopoulos, A.; Karali, T.; Camarinopoulos, L.; Schubert, B.

    2013-01-01

    External explosions can contribute significantly to the risk of damage for industrial plants. They may originate from neighboring plants that store and handle explosive substances, or from the transport of such substances by road, rail or water. In all cases, an accident is a necessary condition for a hazard; another probabilistic element is the probability of ignition. If transport causes the explosion, the location of the accident influences the consequences. If deflagration is involved, ignition will not necessarily occur at the place of the accident: a cloud of combustible gas-air mixture may develop and ignite at some distance, depending on wind velocity. To avoid unnecessarily pessimistic approaches, geographical information can be used in addition to local weather statistics. Geographical information systems provide computerized map material for sites, roads, railways and rivers, and this information can be used to derive damage frequencies based on numerical integration or Monte Carlo simulation. A probabilistic model has been developed, based on:
    - a joint probability density function for wind direction and wind speed, estimated from local weather statistics,
    - frequencies of hazards for neighboring plants and various types of traffic,
    - statistics on the amounts and types of explosive materials.
    The model has been implemented using one numerical integration method and two variants of the Monte Carlo method. Data has been collected and applied to a nuclear power plant in Northern Germany as an example. The method, however, can be used for any type of plant subject to external explosion hazards. In its present form, it makes use of design criteria specific to nuclear power plants, but these could be replaced by different criteria. (orig.)

  7. Does acute radio-frequency electromagnetic field exposure affect visual event-related potentials in healthy adults?

    Science.gov (United States)

    Dalecki, Anna; Loughran, Sarah P; Verrender, Adam; Burdon, Catriona A; Taylor, Nigel A S; Croft, Rodney J

    2018-05-01

    To use improved methods to address the question of whether acute exposure to radio-frequency (RF) electromagnetic fields (RF-EMF) affects early (80-200 ms) sensory and later (180-600 ms) cognitive processes as indexed by event-related potentials (ERPs). Thirty-six healthy subjects completed a visual discrimination task during concurrent exposure to a Global System for Mobile Communications (GSM)-like, 920 MHz signal with peak-spatial specific absorption rate for 10 g of tissue of 0 W/kg of body mass (Sham), 1 W/kg (Low RF) and 2 W/kg (High RF). A fully randomised, counterbalanced, double-blind design was used. P1 amplitude was reduced (p = .02) and anterior N1 latency was increased (p = .04) during Exposure compared to Sham. There were no effects on any other ERP latencies or amplitudes. RF-EMF exposure may affect early perceptual (P1) and preparatory motor (anterior N1) processes. However, only two ERP indices, out of 56 comparisons, were observed to differ between RF-EMF exposure and Sham, suggesting that these observations may be due to chance. These observations are consistent with previous findings that RF-EMF exposure has no reliable impact on cognition (e.g., accuracy and response speed). Copyright © 2018 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.

  8. Population genetical investigation of the level of mutagenesis and teratological events frequency in ecologically different regions of Kazakhstan

    International Nuclear Information System (INIS)

    Kashaganova, Zh.A.; Zhapbasov, R.; Kadyrova, N.Zh.; Karimbaeva, K.S.; Mamyrbaeva, A.N.; Altaeva, N.Z.

    2008-01-01

    Full text: The territory of Kazakhstan is unique in including regions with radioactive pollution from the Semipalatinsk nuclear test site, storage of radioactive waste from uranium mining and metallurgy enterprises, and regions around the drying Aral Sea. These technogenic factors may cause certain types of chromosome aberrations and developmental anomalies in mammals. The level of mutagenesis was estimated based on chromosome aberration and genomic mutation frequencies in bone marrow cells of natural rodent populations (Allactaga major Kern, Allactaga saltator Eversman, Cytellus eritrogenus Br.) and domestic animals (sheep, cattle, horses) inhabiting these regions. Sheep populations bred in regions with different climatic conditions were used for the teratological investigations. Different generations are present in the populations of mouse-family rodents caught in the wild, so by studying animals of different ages separately we can estimate the frequency of mutations in animals of different ages inhabiting the same radiation-polluted regions. The frequency of chromosome aberrations in mouse-family rodents from such territories was twice as high as in those from clean territories. In some animals, chromosome aberration types characteristic of radiation mutagenesis (dicentrics, double acentric fragments) were found. The high level of cytogenetic instability in somatic cells of agricultural animals bred for several generations on pastures within the former nuclear test territory may be caused by chronic radiation in low doses. The analysis of the spectrum of recorded chromosome aberrations in somatic cells and their dynamics in different animal species inhabiting these territories for several generations under chronic irradiation allows us to investigate the direction of genetic evolution of the mammalian gene pool structure induced by ecological factors. Comparative analysis of the frequencies of spontaneous abortuses, deadborn and newborn animals with innate

  9. Adaptive Window Zero-Crossing-Based Instantaneous Frequency Estimation

    Directory of Open Access Journals (Sweden)

    Sekhar S Chandra

    2004-01-01

    We address the problem of estimating the instantaneous frequency (IF) of a real-valued, constant-amplitude, time-varying sinusoid. Estimation of polynomial IF is formulated using the zero-crossings of the signal. We propose an algorithm to estimate nonpolynomial IF by local approximation with a low-order polynomial over a short segment of the signal. This involves choosing the window length to minimize the mean square error (MSE). The optimal window length found by directly minimizing the MSE is a function of higher-order derivatives of the IF, which are not available a priori. However, an optimum solution is formulated using an adaptive window technique based on the concept of intersection of confidence intervals. The adaptive algorithm enables minimum-MSE IF (MMSE-IF) estimation without requiring a priori information about the IF. Simulation results show that the adaptive-window zero-crossing-based IF estimation method is superior to fixed-window methods and also outperforms adaptive spectrogram and adaptive Wigner-Ville distribution (WVD)-based IF estimators for different signal-to-noise ratios (SNR).
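
    The core idea of zero-crossing-based frequency estimation over a fixed window can be illustrated with a minimal sketch; the paper's adaptive window selection and polynomial IF fitting are not reproduced here, and the function name is invented for illustration.

```python
import math

def zero_crossing_frequency(signal, fs):
    """Estimate the average frequency of a real sinusoid from its
    zero-crossings: consecutive crossings are half a period apart."""
    crossings = [n for n in range(1, len(signal))
                 if (signal[n - 1] < 0.0 <= signal[n])
                 or (signal[n - 1] >= 0.0 > signal[n])]
    if len(crossings) < 2:
        raise ValueError("too few zero-crossings in window")
    # Average half-period in samples between the first and last crossing.
    half_period = (crossings[-1] - crossings[0]) / (len(crossings) - 1)
    return fs / (2.0 * half_period)

fs = 8000.0   # sampling rate, Hz
f0 = 440.0    # true tone frequency, Hz
x = [math.sin(2.0 * math.pi * f0 * n / fs) for n in range(800)]
est = zero_crossing_frequency(x, fs)
print(round(est, 1))
```

    For a time-varying IF, the paper applies this kind of estimate locally, with the window length adapted via the intersection-of-confidence-intervals rule.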

  10. CSI Frequency Domain Fingerprint-Based Passive Indoor Human Detection

    Directory of Open Access Journals (Sweden)

    Chong Han

    2018-04-01

    Passive indoor personnel detection is now a hot topic. Existing methods are strongly influenced by environmental changes, and there are problems with the accuracy and robustness of detection. Passive personnel detection based on Wi-Fi not only solves these problems but is also low cost and easy to implement, and can be applied to elderly care and safety monitoring. In this paper, we propose a passive indoor personnel detection method based on Wi-Fi, which we call FDF-PIHD (Frequency Domain Fingerprint-based Passive Indoor Human Detection). With this method, fine-grained physical-layer Channel State Information (CSI) is extracted to generate feature fingerprints, and the state of the scene is determined by matching online fingerprints with offline fingerprints. To improve accuracy, we combine the detection results of three receiving antennas to obtain the final result. The experimental results show that the detection rates of our proposed scheme all reach above 90%, whether the scene is empty or contains a stationary or moving person. In addition, the method not only detects whether there is a target indoors, but also determines the current state of the target.
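
    The offline/online fingerprint-matching step can be illustrated with a minimal nearest-neighbour sketch; the fingerprint values, state labels and function name below are invented for illustration and are not taken from the paper.

```python
import math

def match_state(online_fp, offline_db):
    """Return the label of the offline fingerprint closest to the online
    one, using Euclidean distance over per-subcarrier CSI amplitudes --
    a minimal stand-in for the fingerprint matching step of FDF-PIHD."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(offline_db, key=lambda label: dist(online_fp, offline_db[label]))

# Hypothetical 4-subcarrier amplitude fingerprints, one per scene state.
offline_db = {
    "empty":      [1.0, 1.1, 0.9, 1.0],
    "stationary": [1.4, 1.3, 1.2, 1.5],
    "moving":     [2.1, 0.4, 1.8, 0.6],
}
observed = [2.0, 0.5, 1.7, 0.7]
state = match_state(observed, offline_db)
print(state)
```

    In the paper's scheme, a decision like this would be made per receiving antenna and the three results combined (e.g. by majority vote) into the final detection.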

  11. A process-oriented event-based programming language

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Zanitti, Francesco

    2012-01-01

    We present the first version of PEPL, a declarative Process-oriented, Event-based Programming Language based on the recently introduced Dynamic Condition Response (DCR) Graphs model. DCR Graphs allow specification, distributed execution and verification of pervasive event

  12. SPEED : a semantics-based pipeline for economic event detection

    NARCIS (Netherlands)

    Hogenboom, F.P.; Hogenboom, A.C.; Frasincar, F.; Kaymak, U.; Meer, van der O.; Schouten, K.; Vandic, D.; Parsons, J.; Motoshi, S.; Shoval, P.; Woo, C.; Wand, Y.

    2010-01-01

    Nowadays, emerging news on economic events such as acquisitions has a substantial impact on the financial markets. Therefore, it is important to be able to automatically and accurately identify events in news items in a timely manner. For this, one has to be able to process a large amount of

  13. Semantics-based information extraction for detecting economic events

    NARCIS (Netherlands)

    A.C. Hogenboom (Alexander); F. Frasincar (Flavius); K. Schouten (Kim); O. van der Meer

    2013-01-01

    As today's financial markets are sensitive to breaking news on economic events, accurate and timely automatic identification of events in news items is crucial. Unstructured news items originating from many heterogeneous sources have to be mined in order to extract knowledge useful for

  14. Logical Discrete Event Systems in a trace theory based setting

    NARCIS (Netherlands)

    Smedinga, R.

    1993-01-01

    Discrete event systems can be modelled using a triple consisting of some alphabet (representing the events that might occur), and two trace sets (sets of possible strings) denoting the possible behaviour and the completed tasks of the system. Using this definition we are able to formulate and solve

  15. AN EMPIRICAL ANALYSIS OF THE INFLUENCE OF RISK FACTORS ON THE FREQUENCY AND IMPACT OF SEVERE EVENTS ON THE SUPPLY CHAIN IN THE CZECH REPUBLIC

    Directory of Open Access Journals (Sweden)

    José María Caridad

    2014-12-01

    Purpose: This paper analyses and evaluates severe events according to their frequency of occurrence and their impact on the performance of manufacturing and distribution supply chains in the Czech Republic. Risk factors are introduced for critical events.

    Design/methodology: Severe events are identified and classified on the basis of median mapping and mapping of ordinal variability acquired through a questionnaire survey of 82 companies. Forty-six risk factors were analysed, sorted into five groups. Asymmetric Somers' d statistics were used to test the dependence of the frequency and impact of a severe event on selected risk sources. A hierarchical cluster analysis was performed to identify relatively homogeneous groups of critical severe events according to their dependency on risk factors and its strength.

    Findings: Results showed that 'a lack of contracts' is considered the most critical severe event. The demand-side, supply-side and external risk factor groups were identified as the most significant sources of risk factors. The worst cluster encompasses 11% of the examined risk factors, which should be prevented. We concluded that organizations need to adopt appropriate precautions and risk management methods in logistics.

    Originality: In this paper, a methodology for evaluating severe events in the supply chain is designed. The methodology involves assessing the critical factors which influence critical events and which should be prevented.

  16. A model-based approach to operational event groups ranking

    Energy Technology Data Exchange (ETDEWEB)

    Simic, Zdenko [European Commission Joint Research Centre, Petten (Netherlands). Inst. for Energy and Transport; Maqua, Michael [Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (GRS), Koeln (Germany); Wattrelos, Didier [Institut de Radioprotection et de Surete Nucleaire (IRSN), Fontenay-aux-Roses (France)

    2014-04-15

    The operational experience (OE) feedback provides improvements in all industrial activities. Identification of the most important and valuable groups of events within accumulated experience is important in order to focus detailed investigation of events. The paper describes the new ranking method and compares it with three others. The methods are described and applied to twenty years of OE events from nuclear power plants in France and Germany. The results show that the different ranking methods only roughly agree on which of the event groups are the most important ones. In the new ranking method the analytic hierarchy process is applied in order to assure consistent and comprehensive weighting determination for the ranking indexes. The proposed method allows a transparent and flexible ranking of event groups and identification of the most important OE for further, more detailed investigation in order to complete the feedback. (orig.)
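    The weighting step that the new ranking method takes from the analytic hierarchy process (AHP) derives index weights as the principal eigenvector of a pairwise-comparison matrix. A hedged sketch approximating that eigenvector by power iteration (the comparison values below are an invented example, not the paper's):

```python
# Hypothetical AHP weighting sketch: principal eigenvector via power iteration.

def ahp_weights(M, iters=100):
    """Approximate the principal eigenvector of M, normalised to sum to 1."""
    n = len(M)
    w = [1.0 / n] * n
    for _ in range(iters):
        # one power-iteration step: v = M w, then renormalise
        v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    return w

# Index 1 judged 3x as important as index 2 and 5x as important as index 3.
M = [[1, 3, 5],
     [1 / 3, 1, 2],
     [1 / 5, 1 / 2, 1]]
print([round(w, 3) for w in ahp_weights(M)])
```

For a perfectly consistent comparison matrix the iteration recovers the exact importance ratios; for a mildly inconsistent one, as here, it returns the eigenvector compromise.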

  17. Power system frequency estimation based on an orthogonal decomposition method

    Science.gov (United States)

    Lee, Chih-Hung; Tsai, Men-Shen

    2018-06-01

    In recent years, several frequency estimation techniques have been proposed by which to estimate the frequency variations in power systems. In order to properly identify power quality issues under asynchronously-sampled signals that are contaminated with noise, flicker, and harmonic and inter-harmonic components, a good frequency estimator that is able to estimate the frequency as well as the rate of frequency changes precisely is needed. However, accurately estimating the fundamental frequency becomes a very difficult task without a priori information about the sampling frequency. In this paper, a better frequency evaluation scheme for power systems is proposed. This method employs a reconstruction technique in combination with orthogonal filters, which may maintain the required frequency characteristics of the orthogonal filters and improve the overall efficiency of power system monitoring through two-stage sliding discrete Fourier transforms. The results showed that this method can accurately estimate the power system frequency under different conditions, including asynchronously sampled signals contaminated by noise, flicker, and harmonic and inter-harmonic components. The proposed approach also provides high computational efficiency.
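    The sliding discrete Fourier transform mentioned above updates each DFT bin in O(1) when the analysis window advances by one sample, instead of recomputing the whole transform. An illustrative sketch (window length, bin index and the test signal are arbitrary choices, not the paper's two-stage design):

```python
import cmath

# Sliding DFT of bin k: when the window moves one sample, the bin value is
# updated as S <- (S + x[new] - x[old]) * exp(+j*2*pi*k/N).

def sliding_dft(x, N, k):
    """Yield DFT bin k of every length-N window of x, updated recursively."""
    w = cmath.exp(2j * cmath.pi * k / N)
    # Direct DFT of the first window seeds the recursion.
    S = sum(x[m] * cmath.exp(-2j * cmath.pi * k * m / N) for m in range(N))
    yield S
    for n in range(N, len(x)):
        S = (S + x[n] - x[n - N]) * w   # slide the window by one sample
        yield S

x = [0.3, 1.0, -0.7, 2.0, 0.5, -1.2, 0.8, 0.1]
bins = list(sliding_dft(x, N=4, k=1))
print(len(bins))  # one value per window: len(x) - N + 1 = 5
```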

  18. A Machine Learning-based Rainfall System for GPM Dual-frequency Radar

    Science.gov (United States)

    Tan, H.; Chandrasekar, V.; Chen, H.

    2017-12-01

    Precipitation measurements produced by the Global Precipitation Measurement (GPM) Dual-frequency Precipitation Radar (DPR) play an important role in researching the water cycle and forecasting extreme weather events. Compared with its predecessor, the Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar (PR), GPM DPR measures precipitation at two different frequencies (i.e., Ku and Ka band), which can provide detailed information on the microphysical properties of precipitation particles, quantify particle size distribution and quantitatively measure light rain and falling snow. This paper presents a novel machine learning system for ground-based and spaceborne radar rainfall estimation. The system first trains on ground radar data for rainfall estimation using rainfall measurements from gauges and subsequently uses the ground-radar-based rainfall estimates to train on GPM DPR data in order to obtain a space-based rainfall product. Therein, data alignment between the spaceborne DPR and ground radar is conducted using the methodology proposed by Bolen and Chandrasekar (2013), which can minimize the effects of potential geometric distortion of GPM DPR observations. For demonstration purposes, rainfall measurements from three rain gauge networks near Melbourne, Florida, are used for training and validation. These three gauge networks, which are located in Kennedy Space Center (KSC), South Florida Water Management District (SFL), and St. Johns Water Management District (STJ), include 33, 46, and 99 rain gauge stations, respectively. Collocated ground radar observations from the National Weather Service (NWS) Weather Surveillance Radar - 1988 Doppler (WSR-88D) in Melbourne (i.e., the KMLB radar) are trained with the gauge measurements. The trained model is then used to derive a KMLB-radar-based rainfall product, which in turn is used to train on GPM DPR data collected from coincident overpass events. The machine-learning-based rainfall product is compared against the GPM standard products.

  19. Cardiovascular Events in Cancer Patients Treated with Highly or Moderately Emetogenic Chemotherapy: Results from a Population-Based Study

    International Nuclear Information System (INIS)

    Vo, T. T.; Nelson, J. J.

    2012-01-01

    Studies on cardiovascular safety in cancer patients treated with highly or moderately emetogenic chemotherapy (HEC or MEC), who may have taken the antiemetic, aprepitant, have been limited to clinical trials and postmarketing spontaneous reports. Our study explored background rates of cardiovascular disease (CVD) events among HEC- or MEC-treated cancer patients in a population-based setting to contextualize events seen in a new drug development program and to determine at a high level whether rates differed by aprepitant usage. Medical and pharmacy claims data from the 2005-2007 IMPACT National Benchmark Database were classified into emetogenic chemotherapy categories and CVD outcomes. Among 5827 HEC/MEC-treated patients, frequencies were highest for hypertension (16-21%) and composites of venous (7-12%) and arterial thromboembolic events (4-7%). Aprepitant users generally did not experience higher frequencies of events compared to nonusers. Our study serves as a useful benchmark of background CVD event rates in a population-based setting of cancer patients.

  20. Modelling of extreme rainfall events in Peninsular Malaysia based on annual maximum and partial duration series

    Science.gov (United States)

    Zin, Wan Zawiah Wan; Shinyie, Wendy Ling; Jemain, Abdul Aziz

    2015-02-01

    In this study, two series of data for extreme rainfall events are generated based on Annual Maximum and Partial Duration Methods, derived from 102 rain-gauge stations in Peninsular Malaysia from 1982-2012. To determine the optimal threshold for each station, several requirements must be satisfied and the Adapted Hill estimator is employed for this purpose. A semi-parametric bootstrap is then used to estimate the mean square error (MSE) of the estimator at each threshold, and the optimal threshold is selected based on the smallest MSE. The mean annual frequency is also checked to ensure that it lies in the range of one to five, and the resulting data are de-clustered to ensure independence. The two data series are then fitted to the Generalized Extreme Value and Generalized Pareto distributions for the annual maximum and partial duration series, respectively. The parameter estimation methods used are the Maximum Likelihood and the L-moment methods. Two goodness-of-fit tests are then used to evaluate the best-fitted distribution. The results showed that the Partial Duration series with the Generalized Pareto distribution and Maximum Likelihood parameter estimation provides the best representation of extreme rainfall events in Peninsular Malaysia for the majority of the stations studied. Based on these findings, several return values are also derived and spatial maps are constructed to identify the distribution characteristics of extreme rainfall in Peninsular Malaysia.
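    Fitting a Generalized Pareto distribution (GPD) to threshold exceedances, as in the partial duration series above, can be illustrated with the simple method-of-moments estimators (note: a deliberately simpler stand-in for the paper's Maximum Likelihood and L-moment fits; the rainfall excesses below are invented):

```python
# Method-of-moments GPD fit: with sample mean m and variance v of the
# exceedances, shape xi = (1 - m^2/v)/2 and scale sigma = m*(1 + m^2/v)/2.

def gpd_mom(exceedances):
    """Method-of-moments estimates (shape xi, scale sigma) for the GPD."""
    n = len(exceedances)
    m = sum(exceedances) / n
    v = sum((x - m) ** 2 for x in exceedances) / (n - 1)
    r = m * m / v
    xi = 0.5 * (1.0 - r)         # shape
    sigma = 0.5 * m * (1.0 + r)  # scale
    return xi, sigma

# Daily rainfall amounts (mm) above a 50 mm threshold (hypothetical data).
excess = [3.1, 7.4, 1.2, 15.8, 4.4, 22.3, 2.7, 9.9, 5.0, 31.6]
xi, sigma = gpd_mom(excess)
print(round(xi, 3), round(sigma, 3))
```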

  1. Dust events in Beijing, China (2004–2006): comparison of ground-based measurements with columnar integrated observations

    Directory of Open Access Journals (Sweden)

    Z. J. Wu

    2009-09-01

    Full Text Available Ambient particle number size distributions spanning three years were used to characterize the frequency and intensity of atmospheric dust events in the urban areas of Beijing, China, in combination with AERONET sun/sky radiometer data. Dust events were classified into two types based on the differences in particle number and volume size distributions and local weather conditions. This categorization was confirmed by aerosol index images, columnar aerosol optical properties, and vertical potential temperature profiles. During the type-1 events, dust particles dominated the total particle volume concentration (<10 μm), with a relative share over 70%. Anthropogenic particles in the Aitken and accumulation mode played a subordinate role here because of high wind speeds (>4 m s−1). The type-2 events occurred in rather stagnant air masses and were characterized by a lower volume fraction of coarse mode particles (on average, 55%). Columnar optical properties showed that the superposition of dust and anthropogenic aerosols in type-2 events resulted in a much higher AOD (average: 1.51) than for the rather pure dust aerosols in type-1 events (average AOD: 0.36). A discrepancy was found between the ground-based and column-integrated particle volume size distributions, especially for the coarse mode particles. This discrepancy likely originates from both the limited comparability of particle volume size distributions derived from Sun photometer and in situ number size distributions, and the inhomogeneous vertical distribution of particles during dust events.

  2. Frequency domain based LS channel estimation in OFDM based Power line communications

    OpenAIRE

    Bogdanović, Mario

    2015-01-01

    This paper focuses on low-voltage power line communication (PLC), with an emphasis on channel estimation techniques. The Orthogonal Frequency Division Multiplexing (OFDM) scheme is the preferred technology in PLC systems because it effectively combats the frequency-selective fading of the PLC channel. Since channel estimation is one of the crucial problems in OFDM-based PLC systems, owing to PLC signal attenuation and interference, the improved LS est...
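    The least-squares (LS) estimator that the abstract builds on divides the received pilot symbol by the transmitted one at each pilot subcarrier, H_LS[k] = Y[k] / X[k], and interpolates between pilots for the data subcarriers. A minimal noiseless sketch (subcarrier count, pilot positions and symbol values are illustrative assumptions):

```python
# LS channel estimation at pilots with linear interpolation in between.

def ls_channel_estimate(Y, X_pilots, pilot_idx, n_sub):
    """Y: received symbols (all subcarriers); X_pilots: sent pilot symbols."""
    H_p = [Y[k] / X_pilots[i] for i, k in enumerate(pilot_idx)]
    H = [0j] * n_sub
    for a in range(len(pilot_idx) - 1):
        ka, kb = pilot_idx[a], pilot_idx[a + 1]
        for k in range(ka, kb + 1):
            t = (k - ka) / (kb - ka)
            H[k] = (1 - t) * H_p[a] + t * H_p[a + 1]
    return H

# Noiseless demo: a channel varying linearly across 8 subcarriers is
# recovered exactly from pilots at subcarriers 0, 4 and 7.
H_true = [(1 + 0.5j) + k * (0.1 - 0.02j) for k in range(8)]
pilots = [0, 4, 7]
X = [1 + 0j] * len(pilots)            # all-ones pilot symbols
Y = [H_true[k] * (1 + 0j) for k in range(8)]
H_est = ls_channel_estimate(Y, X, pilots, 8)
print(max(abs(H_est[k] - H_true[k]) for k in range(8)))
```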

  3. Numerical Simulations of Slow Stick Slip Events with PFC, a DEM Based Code

    Science.gov (United States)

    Ye, S. H.; Young, R. P.

    2017-12-01

    Nonvolcanic tremors around subduction zones have become a fascinating subject in seismology in recent years. Previous studies have shown that the nonvolcanic tremor beneath western Shikoku is composed of low frequency seismic waves overlapping each other. This finding provides a direct link between tremor and slow earthquakes. Slow stick slip events are considered to be laboratory-scale slow earthquakes. Slow stick slip events are traditionally studied with a direct shear or double direct shear experimental setup, in which the sliding velocity can be controlled to model a range of fast and slow stick slips. In this study, a PFC* model based on double direct shear is presented, with a central block clamped by two side blocks. The gouge layers between the central and side blocks are modelled as discrete fracture networks with smooth joint bonds between pairs of discrete elements. In addition, a second model is presented in this study. This model consists of a cylindrical sample subjected to triaxial stress. Similar to the previous model, a weak gouge layer at 45 degrees is added into the sample, on which shear slipping is allowed. Several different simulations are conducted on this sample. While the confining stress is maintained at the same level in different simulations, the axial loading rate (displacement rate) varies. By varying the displacement rate, a range of slipping behaviour, from stick slip to slow stick slip, is observed based on the stress-strain relationship. Currently, the stick slip and slow stick slip events are strictly observed based on the stress-strain relationship. In the future, we hope to monitor the displacement and velocity of the balls surrounding the gouge layer as a function of time, so as to generate a synthetic seismogram. This will allow us to extract seismic waveforms and potentially simulate the tremor-like waves found around subduction zones. *Particle flow code, a discrete element method based numerical simulation code developed by

  4. Diet Activity Characteristic of Large-scale Sports Events Based on HACCP Management Model

    OpenAIRE

    Xiao-Feng Su; Li Guo; Li-Hua Gao; Chang-Zhuan Shao

    2015-01-01

    The study proposed a dietary management approach for major sports events based on the "HACCP" management model, according to the characteristics of major sports events catering activities. Major sports events are no longer merely showcases of high-level competitive sport; they have become comprehensive special events involving complex social, political, economic, cultural and other factors. Sporting events are conferred increasingly diverse goals and objectives of an economic, political, cultural, technological and other ...

  5. Autocorrel I: A Neural Network Based Network Event Correlation Approach

    National Research Council Canada - National Science Library

    Japkowicz, Nathalie; Smith, Reuben

    2005-01-01

    .... We use the autoassociator to build prototype software to cluster network alerts generated by a Snort intrusion detection system, and discuss how the results are significant, and how they can be applied to other types of network events.

  6. Balboa: A Framework for Event-Based Process Data Analysis

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1998-01-01

    .... We have built Balboa as a bridge between the data collection and the analysis tools, facilitating the gathering and management of event data, and simplifying the construction of tools to analyze the data...

  7. A MEMS-based high frequency x-ray chopper

    Energy Technology Data Exchange (ETDEWEB)

    Siria, A; Schwartz, W; Chevrier, J [Institut Neel, CNRS-Universite Joseph Fourier Grenoble, BP 166, F-38042 Grenoble Cedex 9 (France); Dhez, O; Comin, F [ESRF, 6 rue Jules Horowitz, F-38043 Grenoble Cedex 9 (France); Torricelli, G [Department of Physics and Astronomy, University of Leicester, University Road, Leicester LE1 7RH (United Kingdom)

    2009-04-29

    Time-resolved x-ray experiments require intensity modulation at high frequencies (advanced rotating choppers have nowadays reached the kHz range). We here demonstrate that a silicon microlever oscillating at 13 kHz with nanometric amplitude can be used as a high frequency x-ray chopper. We claim that using micro- and nanoelectromechanical systems (MEMS and NEMS), it will be possible to achieve higher frequencies in excess of hundreds of megahertz. Working at such frequencies can open a wealth of possibilities in time-resolved experiments in chemistry, biology and physics.

  8. Noise-based frequency offset modulation in wideband frequency-selective fading channels

    NARCIS (Netherlands)

    Meijerink, Arjan; Cotton, S.L.; Bentum, Marinus Jan; Scanlon, W.G.

    2009-01-01

    A frequency offset modulation scheme using wideband noise carriers is considered. The main advantage of such a scheme is that it enables fast receiver synchronization without channel adaptation, while providing robustness to multipath fading and in-band interference. This is important for low-power

  9. Dynamic model based novel findings in power systems analysis and frequency measurement verification

    Science.gov (United States)

    Kook, Kyung Soo

    This study selects several new advanced topics in power systems and verifies their usefulness using simulation. In the study on the ratio of the equivalent reactance and resistance of bulk power systems, the simulation results give us a more accurate value of X/R of the bulk power system, which can explain why active power compensation is also important in voltage flicker mitigation. In the application study of the Energy Storage System (ESS) for wind power, a new model implementation of the ESS connected to the wind power is proposed, and the control effect of the ESS on the intermittency of the wind power is verified. This study also conducts intensive simulations to clarify the behavior of the wide-area power system frequency as well as the possibility of on-line instability detection. In our POWER IT Laboratory, since 2003, the U.S. national frequency monitoring network (FNET) has been continuously operated to monitor the wide-area power system frequency in the U.S. Using the measured frequency data, power system events are triggered, and their location and scale are estimated. This study also explores the possibility of using simulation technologies to contribute to the applications of FNET, finds similarities in the event detection order between the frequency measurements and the simulations in the U.S. Eastern power grid, and develops a new methodology for estimating the event location based on simulated N-1 contingencies using the frequency measurement. It has been pointed out that simulation results cannot represent the actual response of the power systems due to the inevitable limits of modeling power systems and the different operating conditions of the systems at every second. However, given that we need to test such an important infrastructure supplying electric energy without taking any risk with it, software-based simulation is the best solution to verify the new technologies in

  10. Demosaicking Based on Optimization and Projection in Different Frequency Bands

    Directory of Open Access Journals (Sweden)

    Omer OsamaA

    2008-01-01

    Full Text Available A fast and effective iterative demosaicking algorithm is described for reconstructing a full-color image from single-color filter array data. The missing color values are interpolated on the basis of optimization and projection in different frequency bands. A filter bank is used to decompose an initially interpolated image into low-frequency and high-frequency bands. In the low-frequency band, a quadratic cost function is minimized in accordance with the observation that the low-frequency components of chrominance slowly vary within an object region. In the high-frequency bands, the high-frequency components of the unknown values are projected onto the high-frequency components of the known values. Comparison of the proposed algorithm with seven state-of-the-art demosaicking algorithms showed that it outperforms all of them for 20 images on average in terms of objective quality and that it is competitive with them from the subjective quality and complexity points of view.

  11. Multi-frequency exciting and spectrogram-based ECT method

    CERN Document Server

    Chady, T

    2000-01-01

    The purpose of this paper is to experimentally demonstrate advantages of a multi-frequency ECT system. In this system, a precise crack imaging was achieved by using spectrograms obtained from an eddy-current probe multi-frequency response. A complex signal containing selected sinusoidal components was used as an excitation. The results of measurements for various test specimens are presented.

  12. Compact blue laser devices based on nonlinear frequency upconversion

    International Nuclear Information System (INIS)

    Risk, W.P.

    1989-01-01

    This paper reports how miniature sources of coherent blue radiation can be produced by using nonlinear optical materials for frequency upconversion of the infrared radiation emitted by laser diodes. Direct upconversion of laser diode radiation is possible, but there are several advantages to using the diode laser to pump a solid-state laser which is then upconverted. In either case, the challenge is to find combinations of nonlinear materials and lasers for efficient frequency upconversion. Several examples have been demonstrated. These include intracavity frequency doubling of a diode-pumped 946-nm Nd:YAG laser, intracavity frequency mixing of an 809-nm GaAlAs laser diode with a diode-pumped 1064-nm Nd:YAG laser, and direct frequency doubling of a 994-nm strained-layer InGaAs laser diode.

  13. A new approach for bioassays based on frequency- and time-domain measurements of magnetic nanoparticles.

    Science.gov (United States)

    Oisjöen, Fredrik; Schneiderman, Justin F; Astalan, Andrea Prieto; Kalabukhov, Alexey; Johansson, Christer; Winkler, Dag

    2010-01-15

    We demonstrate a one-step wash-free bioassay measurement system capable of tracking biochemical binding events. Our approach combines the high resolution of frequency- and high speed of time-domain measurements in a single device in combination with a fast one-step bioassay. The one-step nature of our magnetic nanoparticle (MNP) based assay reduces the time between sample extraction and quantitative results while mitigating the risks of contamination related to washing steps. Our method also enables tracking of binding events, providing the possibility of, for example, investigation of how chemical/biological environments affect the rate of a binding process or study of the action of certain drugs. We detect specific biological binding events occurring on the surfaces of fluid-suspended MNPs that modify their magnetic relaxation behavior. Herein, we extrapolate a modest sensitivity to analyte of 100 ng/ml with the present setup using our rapid one-step bioassay. More importantly, we determine the size-distributions of the MNP systems with theoretical fits to our data obtained from the two complementary measurement modalities and demonstrate quantitative agreement between them. Copyright 2009 Elsevier B.V. All rights reserved.

  14. Incident Light Frequency-Based Image Defogging Algorithm

    Directory of Open Access Journals (Sweden)

    Wenbo Zhang

    2017-01-01

    Full Text Available To solve the color distortion problem produced by the dark channel prior algorithm, an improved method for calculating the transmittance of each channel separately is proposed in this paper. Based on the Beer-Lambert Law, the relationship between the frequency of the incident light and the transmittance was analyzed, and the ratios between each channel’s transmittance were derived. Then, in order to increase efficiency, the input image was resized to a smaller size before acquiring the refined transmittance, which was then resized back to the size of the original image. Finally, all the transmittances were obtained with the help of the proportions between the color channels, and they were used to restore the defogged image. Experiments suggest that the improved algorithm produces a much more natural result image in comparison with the original algorithm, meaning the problem of high color saturation was eliminated. What is more, the improved algorithm is four to nine times faster than the original algorithm.
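    For context, the baseline dark-channel-prior transmission estimate that the improved algorithm refines per channel is t(x) = 1 − w · min over a local window and over R/G/B of I_c / A_c. A hedged pure-Python sketch of that baseline only (the 3×3 image and airlight values are invented; this is not the paper's per-channel refinement):

```python
# Baseline dark channel prior transmission map, brute force over windows.

def transmission_map(img, A, radius=1, w=0.95):
    """img: 2-D grid of (r, g, b) values in [0, 1]; A: per-channel airlight."""
    h, wid = len(img), len(img[0])
    t = [[0.0] * wid for _ in range(h)]
    for y in range(h):
        for x in range(wid):
            # darkest normalised channel value in the local window
            dark = min(
                img[j][i][c] / A[c]
                for j in range(max(0, y - radius), min(h, y + radius + 1))
                for i in range(max(0, x - radius), min(wid, x + radius + 1))
                for c in range(3)
            )
            t[y][x] = 1.0 - w * dark
    return t

hazy = [[(0.2, 0.3, 0.4)] * 3 for _ in range(3)]  # uniform hazy patch
t = transmission_map(hazy, A=(1.0, 1.0, 1.0))
print(round(t[1][1], 3))  # darkest channel is 0.2, so t = 1 - 0.95*0.2 = 0.81
```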

  15. Event-based text mining for biology and functional genomics

    Science.gov (United States)

    Thompson, Paul; Nawaz, Raheel; McNaught, John; Kell, Douglas B.

    2015-01-01

    The assessment of genome function requires a mapping between genome-derived entities and biochemical reactions, and the biomedical literature represents a rich source of information about reactions between biological components. However, the increasingly rapid growth in the volume of literature provides both a challenge and an opportunity for researchers to isolate information about reactions of interest in a timely and efficient manner. In response, recent text mining research in the biology domain has been largely focused on the identification and extraction of ‘events’, i.e. categorised, structured representations of relationships between biochemical entities, from the literature. Functional genomics analyses necessarily encompass events as so defined. Automatic event extraction systems facilitate the development of sophisticated semantic search applications, allowing researchers to formulate structured queries over extracted events, so as to specify the exact types of reactions to be retrieved. This article provides an overview of recent research into event extraction. We cover annotated corpora on which systems are trained, systems that achieve state-of-the-art performance and details of the community shared tasks that have been instrumental in increasing the quality, coverage and scalability of recent systems. Finally, several concrete applications of event extraction are covered, together with emerging directions of research. PMID:24907365

  16. Physiologically-based toxicokinetic models help identifying the key factors affecting contaminant uptake during flood events

    Energy Technology Data Exchange (ETDEWEB)

    Brinkmann, Markus; Eichbaum, Kathrin [Department of Ecosystem Analysis, Institute for Environmental Research,ABBt – Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); Kammann, Ulrike [Thünen-Institute of Fisheries Ecology, Palmaille 9, 22767 Hamburg (Germany); Hudjetz, Sebastian [Department of Ecosystem Analysis, Institute for Environmental Research,ABBt – Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Cofalla, Catrina [Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Buchinger, Sebastian; Reifferscheid, Georg [Federal Institute of Hydrology (BFG), Department G3: Biochemistry, Ecotoxicology, Am Mainzer Tor 1, 56068 Koblenz (Germany); Schüttrumpf, Holger [Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Preuss, Thomas [Department of Environmental Biology and Chemodynamics, Institute for Environmental Research,ABBt- Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); and others

    2014-07-01

    Highlights: • A PBTK model for trout was coupled with a sediment equilibrium partitioning model. • The influence of physical exercise on pollutant uptake was studied using the model. • Physical exercise during flood events can increase the level of biliary metabolites. • Cardiac output and effective respiratory volume were identified as relevant factors. • These confounding factors need to be considered also for bioconcentration studies. - Abstract: As a consequence of global climate change, we will likely be facing an increasing frequency and intensity of flood events. Thus, the ecotoxicological relevance of sediment re-suspension is of growing concern. It is vital to understand contaminant uptake from suspended sediments and relate it to effects in aquatic biota. Here we report on a computational study that utilizes a physiologically based toxicokinetic model to predict uptake, metabolism and excretion of sediment-borne pyrene in rainbow trout (Oncorhynchus mykiss). To this end, data from two experimental studies were compared with the model predictions: (a) batch re-suspension experiments with constant concentration of suspended particulate matter at two different temperatures (12 and 24 °C), and (b) simulated flood events in an annular flume. The model predicted both the final concentrations and the kinetics of 1-hydroxypyrene secretion into the gall bladder of exposed rainbow trout well. We were able to show that exhaustive exercise during exposure in simulated flood events can lead to increased levels of biliary metabolites and identified cardiac output and effective respiratory volume as the two most important factors for contaminant uptake. The results of our study clearly demonstrate the relevance and the necessity to investigate uptake of contaminants from suspended sediments under realistic exposure scenarios.

  17. Physiologically-based toxicokinetic models help identifying the key factors affecting contaminant uptake during flood events

    International Nuclear Information System (INIS)

    Brinkmann, Markus; Eichbaum, Kathrin; Kammann, Ulrike; Hudjetz, Sebastian; Cofalla, Catrina; Buchinger, Sebastian; Reifferscheid, Georg; Schüttrumpf, Holger; Preuss, Thomas

    2014-01-01

    Highlights: • A PBTK model for trout was coupled with a sediment equilibrium partitioning model. • The influence of physical exercise on pollutant uptake was studied using the model. • Physical exercise during flood events can increase the level of biliary metabolites. • Cardiac output and effective respiratory volume were identified as relevant factors. • These confounding factors need to be considered also for bioconcentration studies. - Abstract: As a consequence of global climate change, we will likely be facing an increasing frequency and intensity of flood events. Thus, the ecotoxicological relevance of sediment re-suspension is of growing concern. It is vital to understand contaminant uptake from suspended sediments and relate it to effects in aquatic biota. Here we report on a computational study that utilizes a physiologically based toxicokinetic model to predict uptake, metabolism and excretion of sediment-borne pyrene in rainbow trout (Oncorhynchus mykiss). To this end, data from two experimental studies were compared with the model predictions: (a) batch re-suspension experiments with constant concentration of suspended particulate matter at two different temperatures (12 and 24 °C), and (b) simulated flood events in an annular flume. The model predicted both the final concentrations and the kinetics of 1-hydroxypyrene secretion into the gall bladder of exposed rainbow trout well. We were able to show that exhaustive exercise during exposure in simulated flood events can lead to increased levels of biliary metabolites and identified cardiac output and effective respiratory volume as the two most important factors for contaminant uptake. The results of our study clearly demonstrate the relevance and the necessity to investigate uptake of contaminants from suspended sediments under realistic exposure scenarios.

  18. Pulse frequency in pulsed brachytherapy based on tissue repair kinetics

    International Nuclear Information System (INIS)

    Sminia, Peter; Schneider, Christoph J.; Koedooder, Kees; Tienhoven, Geertjan van; Blank, Leo E.C.M.; Gonzalez Gonzalez, Dionisio

    1998-01-01

    Purpose: Investigation of normal tissue sparing in pulsed brachytherapy (PB) relative to continuous low-dose rate irradiation (CLDR) by adjusting pulse frequency based on tissue repair characteristics. Method: Using the linear quadratic model, the relative effectiveness (RE) of a 20 Gy boost was calculated for tissue with an α/β ratio ranging from 2 to 10 Gy and a half-time of sublethal damage repair between 0.1 and 3 h. The boost dose was considered to be delivered either in a number of pulses varying from 2 to 25, or continuously at a dose rate of 0.50, 0.80, or 1.20 Gy/h. Results: The RE of 20 Gy was found to be identical for PB in 25 pulses of 0.80 Gy each h and CLDR delivered at 0.80 Gy/h for any α/β value and for a repair half-time > 0.75 h. When normal tissue repair half-times are assumed to be longer than tumor repair half-times, normal tissue sparing can be obtained, within the restriction of a fixed overall treatment time, with higher dose per pulse and longer period time (time elapsed between start of pulse n and start of pulse n + 1). An optimum relative normal tissue sparing larger than 10% was found with 4 pulses of 5 Gy every 8 h. Hence, a therapeutic gain might be obtained when changing from CLDR to PB by adjusting the physical dose in such a way that the biological dose on the tumor is maintained. The normal tissue-sparing phenomenon can be explained by an increase in RE with longer period time for tissue with a high α/β ratio and fast or intermediate repair half-time, while the RE for tissue with a low α/β ratio and long repair half-time remains almost constant. Conclusion: Within the framework of the LQ model, an advantage in normal tissue sparing is expected when matching the pulse frequency to the repair kinetics of the normal tissue exposed. A period time longer than 1 h may lead to a reduction of late normal tissue complications. This theoretical advantage emphasizes the need for better knowledge of human tissue-repair kinetics
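    As background for the RE calculations above (this is the standard LQ incomplete-repair result for continuous irradiation, not necessarily the authors' exact formulation): for dose rate R, exposure duration T and mono-exponential repair with rate μ = ln 2 / t½,

```latex
% Relative effectiveness of continuous irradiation in the LQ model with
% mono-exponential sublethal-damage repair; for long exposures the bracket
% tends to 1 and RE -> 1 + 2R / (mu (alpha/beta)).
RE = 1 + \frac{2R}{\mu\,(\alpha/\beta)}
         \left[ 1 - \frac{1 - e^{-\mu T}}{\mu T} \right]
```

The pulsed-brachytherapy RE follows the same logic applied pulse by pulse, with incomplete repair accumulating between pulses.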

  19. Estimating parameters of speciation models based on refined summaries of the joint site-frequency spectrum.

    Directory of Open Access Journals (Sweden)

    Aurélien Tellier

    Full Text Available Understanding the processes and conditions under which populations diverge to give rise to distinct species is a central question in evolutionary biology. Since recently diverged populations have high levels of shared polymorphisms, it is challenging to distinguish between recent divergence with no (or very low) inter-population gene flow and older splitting events with subsequent gene flow. Recently published methods to infer speciation parameters under the isolation-migration framework are based on summarizing polymorphism data at multiple loci in two species using the joint site-frequency spectrum (JSFS). We have developed two improvements of these methods based on a more extensive use of the JSFS classes of polymorphisms for species with high intra-locus recombination rates. First, using a likelihood-based method, we demonstrate that taking into account low-frequency polymorphisms shared between species significantly improves the joint estimation of the divergence time and gene flow between species. Second, we introduce a local linear regression algorithm that considerably reduces the computational time and allows for the estimation of unequal rates of gene flow between species. We also investigate which summary statistics from the JSFS allow the greatest estimation accuracy for divergence time and migration rates for low (around 10) and high (around 100) numbers of loci. Focusing on cases with low numbers of loci and high intra-locus recombination rates, we show that our methods for the estimation of divergence time and migration rates are more precise than existing approaches.
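The JSFS summary at the heart of these methods is easy to state concretely (a minimal sketch; real pipelines must additionally handle ancestral-state misidentification, missing data, and recombination):

```python
import numpy as np

def joint_sfs(derived_counts_a, derived_counts_b, n_a, n_b):
    """Joint site-frequency spectrum: entry [i, j] counts SNPs whose
    derived allele appears i times among the n_a sampled chromosomes
    of species A and j times among the n_b chromosomes of species B."""
    jsfs = np.zeros((n_a + 1, n_b + 1), dtype=int)
    for i, j in zip(derived_counts_a, derived_counts_b):
        jsfs[i, j] += 1
    return jsfs

# Toy data: 5 SNPs genotyped in 4 chromosomes per species.
# Low-frequency shared polymorphisms (e.g. class [1, 1]) are the
# entries the abstract argues should not be discarded.
a = [1, 0, 2, 4, 1]
b = [1, 3, 2, 0, 1]
jsfs = joint_sfs(a, b, 4, 4)
```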

  20. Capacitance-Based Frequency Adjustment of Micro Piezoelectric Vibration Generator

    Directory of Open Access Journals (Sweden)

    Xinhua Mao

    2014-01-01

    Full Text Available Micro piezoelectric vibration generators have wide application in the field of microelectronics, but their natural frequency is fixed after manufacture. When the natural frequency of a piezoelectric generator does not match the frequency of the vibration source, resonance cannot occur: the output voltage of the generator declines sharply and it cannot supply power normally to electronic devices. In order to bring the natural frequency of the generator close to the frequency of the vibration source, capacitance-based frequency adjustment (capacitance FM) is adopted in this paper. Different capacitance FM schemes are designed for different locations of the adjustment layer, and the corresponding capacitance FM models have been established. The characteristics and effect of the capacitance FM have been simulated with the FM model. Experimental results show that the natural frequency of the generator varies from 46.5 Hz to 42.4 Hz as the bypass capacitance increases from 0 nF to 30 nF. The natural frequency of a piezoelectric vibration generator can thus be continuously adjusted by this method.

  1. Capacitance-based frequency adjustment of micro piezoelectric vibration generator.

    Science.gov (United States)

    Mao, Xinhua; He, Qing; Li, Hong; Chu, Dongliang

    2014-01-01

    Micro piezoelectric vibration generators have wide application in the field of microelectronics, but their natural frequency is fixed after manufacture. When the natural frequency of a piezoelectric generator does not match the frequency of the vibration source, resonance cannot occur: the output voltage of the generator declines sharply and it cannot supply power normally to electronic devices. In order to bring the natural frequency of the generator close to the frequency of the vibration source, capacitance-based frequency adjustment (capacitance FM) is adopted in this paper. Different capacitance FM schemes are designed for different locations of the adjustment layer, and the corresponding capacitance FM models have been established. The characteristics and effect of the capacitance FM have been simulated with the FM model. Experimental results show that the natural frequency of the generator varies from 46.5 Hz to 42.4 Hz as the bypass capacitance increases from 0 nF to 30 nF. The natural frequency of a piezoelectric vibration generator can thus be continuously adjusted by this method.
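Using only the two reported endpoints, and assuming an approximately linear frequency-capacitance trend (an illustrative assumption; the true curve depends on the electromechanical coupling of the device), the bypass capacitance needed for a target natural frequency can be read off by interpolation:

```python
import numpy as np

# Reported tuning range: 46.5 Hz at 0 nF of bypass capacitance,
# down to 42.4 Hz at 30 nF.
cap_nF = np.array([0.0, 30.0])
freq_hz = np.array([46.5, 42.4])

def cap_for_frequency(target_hz):
    """Bypass capacitance (nF) that tunes the generator to target_hz,
    under the linear-trend assumption above."""
    # np.interp needs ascending x values, so reverse both arrays.
    return np.interp(target_hz, freq_hz[::-1], cap_nF[::-1])
```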

  2. The Effects of Semantic Transparency and Base Frequency on the Recognition of English Complex Words

    Science.gov (United States)

    Xu, Joe; Taft, Marcus

    2015-01-01

    A visual lexical decision task was used to examine the interaction between base frequency (i.e., the cumulative frequencies of morphologically related forms) and semantic transparency for a list of derived words. Linear mixed effects models revealed that high base frequency facilitates the recognition of the complex word (i.e., a "base…

  3. A fit-based frequency programme for the PS

    CERN Document Server

    Hancock, S

    2007-01-01

    Since the probes in the PS reference magnet that generate the so-called B-train are fairly short, they cannot register any change in magnetic length due to saturation. Hence the idea to derive the effective dipole magnetic field seen by the beam from measurements of revolution frequency and mean radial position over an entire cycle, to fit a saturation law, and to use the result to make a new frequency programme. Although far from new, the idea has never been implemented due to the tacit assumption that any imperfections in the existing frequency programme are taken care of by the action of the servo loops of the various beam controls. More recently, the delivery of ions at low energy from LEIR has called into question the accuracy of the raw frequency programme, and the idea has been revisited in a brief parasitic MD.
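The first step, recovering the effective dipole field from beam-based measurements, can be sketched as follows (the constants are approximate textbook values for protons in the PS and are assumptions for illustration only; the actual programme then fits a saturation law to the result):

```python
import numpy as np

C0 = 628.318          # reference orbit circumference, m (approximate)
RHO = 70.08           # dipole bending radius, m (approximate)
M0C2 = 938.272e6      # proton rest energy, eV
C_LIGHT = 299792458.0

def b_eff(f_rev_hz, mean_radius_offset_m=0.0):
    """Effective dipole field (T) inferred from revolution frequency
    and mean radial position: beta from f_rev and orbit length,
    momentum from beta*gamma, then B = p / (q * rho)."""
    orbit = C0 + 2 * np.pi * mean_radius_offset_m
    beta = f_rev_hz * orbit / C_LIGHT
    gamma = 1.0 / np.sqrt(1.0 - beta ** 2)
    p_ev_per_c = gamma * beta * M0C2          # momentum in eV/c
    return p_ev_per_c / (RHO * C_LIGHT)       # Tesla, for charge e

# A saturation law can then be fitted to (programmed field, b_eff)
# pairs, e.g. with np.polyfit, and inverted into a new programme.
```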

  4. Time-Frequency Based Instantaneous Frequency Estimation of Sparse Signals from an Incomplete Set of Samples

    Science.gov (United States)

    2014-06-17

    [Figure residue: panels showing the Wigner distribution, the L-Wigner distribution, and their auto-correlation functions.] Although bilinear or higher-order autocorrelation functions will increase the number of missing samples, the analysis shows that accurate instantaneous frequency estimation can be achieved even if we deal with only a few samples, as long as the auto-correlation function is properly chosen to coincide with

  5. Characterising Event-Based DOM Inputs to an Urban Watershed

    Science.gov (United States)

    Croghan, D.; Bradley, C.; Hannah, D. M.; Van Loon, A.; Sadler, J. P.

    2017-12-01

    Dissolved Organic Matter (DOM) composition in urban streams is dominated by terrestrial inputs after rainfall events. Urban streams have particularly strong terrestrial-riverine connections due to direct input from terrestrial drainage systems, and event-driven DOM inputs can have substantial adverse effects on water quality. Despite this, DOM from important catchment sources such as road drains and Combined Sewage Overflows (CSOs) remains poorly characterised within urban watersheds. We studied DOM sources within an urbanised headwater watershed in Birmingham, UK. Samples from terrestrial sources (roads, roofs, and a CSO) were collected manually after the onset of rainfall events of varying magnitude, and again within 24 h of the event ending. Terrestrial samples were analysed for fluorescence, absorbance, and Dissolved Organic Carbon (DOC) concentration. Fluorescence and absorbance indices were calculated, and Parallel Factor Analysis (PARAFAC) was undertaken to aid sample characterisation. Substantial differences in fluorescence, absorbance, and DOC were observed between source types. PARAFAC-derived components linked to organic pollutants were generally highest within road-derived samples, whilst humic-like components tended to be highest within roof samples. Samples taken from the CSO generally showed low fluorescence; however, this likely reflects a dilution effect. Variation within source groups was particularly high, and local land use seemed to be the driving factor for road and roof drain DOM character and DOC quantity. Furthermore, high variation in fluorescence, absorbance, and DOC was apparent between all sources depending on event type. Drier antecedent conditions in particular were linked to a greater presence of terrestrially derived components and higher DOC content. Our study indicates that high variations in DOM character occur between source types, and over small spatial scales. Road drains located on main roads appear to contain the poorest

  6. Multiple daytime nucleation events in semi-clean savannah and industrial environments in South Africa: analysis based on observations

    Directory of Open Access Journals (Sweden)

    A. Hirsikko

    2013-06-01

    Full Text Available Recent studies have shown very high frequencies of atmospheric new particle formation in different environments in South Africa. Our aim here was to investigate the causes of two or three consecutive daytime nucleation events, followed by subsequent particle growth, during the same day. We analysed 108 and 31 such days observed in a polluted industrial environment and a moderately polluted rural environment, respectively, in South Africa. The analysis was based on two years of measurements at each site. After rejecting the days having notable changes in the air mass origin or local wind direction, i.e. the two major reasons for observed multiple nucleation events, we were able to investigate other factors causing this phenomenon. Clouds were present during, or in between, most of the analysed multiple particle formation events. Therefore, some of these events may have been single events, interrupted somehow by the presence of clouds. From further analysis, we propose that the first nucleation and growth event of the day was often associated with the mixing of a residual air layer rich in SO2 (oxidized to sulphuric acid) into the shallow surface-coupled layer. The second nucleation and growth event of the day usually started before midday and was sometimes associated with renewed SO2 emissions of industrial origin. However, it was also evident that vapours other than sulphuric acid were required for the particle growth during both events. This was especially the case when two simultaneously growing particle modes were observed. Based on our analysis, we conclude that the relative contributions of estimated H2SO4 and other vapours to the first and second nucleation and growth events of the day varied from day to day, depending on anthropogenic and natural emissions, as well as atmospheric conditions.

  7. Frequency and predictors of stroke after acute myocardial infarction: specific aspects of in-hospital and postdischarge events.

    Science.gov (United States)

    Hachet, Olivier; Guenancia, Charles; Stamboul, Karim; Daubail, Benoit; Richard, Carole; Béjot, Yannick; Yameogo, Valentin; Gudjoncik, Aurélie; Cottin, Yves; Giroud, Maurice; Lorgis, Luc

    2014-12-01

    Stroke is a serious complication after acute myocardial infarction (AMI) and is closely associated with decreased survival. This study aimed to investigate the frequency, characteristics, and factors associated with in-hospital and postdischarge stroke in patients with AMI. Eight thousand four hundred eighty-five consecutive patients were admitted to a cardiology intensive care unit for AMI between January 2001 and July 2010. Strokes/transient ischemic attacks were recorded during 1-year follow-up. One hundred twenty-three in-hospital strokes were recorded: 65 (52.8%) occurred on the first day after admission for AMI, and 108 (87%) within the first 5 days. One hundred six patients (86.2%; incidence rate 1.25%) experienced in-hospital ischemic stroke, and 14 patients (11.4%; incidence rate 0.16%) were diagnosed with an in-hospital hemorrhagic stroke. Classification of in-hospital ischemic stroke subtypes according to the Trial of Org 10172 in Acute Stroke Treatment (TOAST) criteria showed that only 2 types of stroke were identified frequently: the leading subtype was cardioembolic stroke (n=64, 60%), and the second was stroke of undetermined pathogenesis (n=38, 36%). After multivariable backward regression analysis, female sex, previous transient ischemic attack (TIA)/stroke, new-onset atrial fibrillation, left ventricular ejection fraction (odds ratio per point of left ventricular ejection fraction), and C-reactive protein were independently associated with in-hospital ischemic stroke. When antiplatelet and anticoagulation therapy within the first 48 hours was introduced into the multivariable model, we found that implementing these treatments (≥1) was an independent protective factor against in-hospital stroke. The risk of in-hospital hemorrhagic stroke was dramatically increased (5-fold) when thrombolysis was prescribed as the reperfusion treatment. However, the different parenteral anticoagulants were not predictors of risk in univariable analysis

  8. Frequency Response Function Based Damage Identification for Aerospace Structures

    Science.gov (United States)

    Oliver, Joseph Acton

    Structural health monitoring technologies continue to be pursued for aerospace structures in the interests of increased safety and, when combined with health prognosis, efficiency in life-cycle management. The current dissertation develops and validates damage identification technology as a critical component for structural health monitoring of aerospace structures and, in particular, composite unmanned aerial vehicles. The primary innovation is a statistical least-squares damage identification algorithm based in concepts of parameter estimation and model update. The algorithm uses frequency response function based residual force vectors derived from distributed vibration measurements to update a structural finite element model through statistically weighted least-squares minimization producing location and quantification of the damage, estimation uncertainty, and an updated model. Advantages compared to other approaches include robust applicability to systems which are heavily damped, large, and noisy, with a relatively low number of distributed measurement points compared to the number of analytical degrees-of-freedom of an associated analytical structural model (e.g., modal finite element model). Motivation, research objectives, and a dissertation summary are discussed in Chapter 1 followed by a literature review in Chapter 2. Chapter 3 gives background theory and the damage identification algorithm derivation followed by a study of fundamental algorithm behavior on a two degree-of-freedom mass-spring system with generalized damping. Chapter 4 investigates the impact of noise then successfully proves the algorithm against competing methods using an analytical eight degree-of-freedom mass-spring system with non-proportional structural damping. Chapter 5 extends use of the algorithm to finite element models, including solutions for numerical issues, approaches for modeling damping approximately in reduced coordinates, and analytical validation using a composite
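The residual-force idea can be illustrated on a toy two-degree-of-freedom system (a sketch under simplifying assumptions: undamped, noise-free, fully measured; not the dissertation's statistically weighted algorithm). The residual force obtained by pushing the measured response through the healthy model is linear in the element stiffness losses, so damage location and severity follow from least squares:

```python
import numpy as np

m = np.eye(2)                                # mass matrix
k1, k2 = 100.0, 80.0                         # healthy spring stiffnesses
K1 = np.array([[1.0, 0.0], [0.0, 0.0]])      # element matrix, spring 1 (ground-m1)
K2 = np.array([[1.0, -1.0], [-1.0, 1.0]])    # element matrix, spring 2 (m1-m2)
K_healthy = k1 * K1 + k2 * K2
K_damaged = k1 * K1 + 0.8 * k2 * K2          # 20% stiffness loss in spring 2
f = np.array([1.0, 0.0])                     # harmonic force amplitude

rows, rhs = [], []
for w in (2.0, 5.0, 9.0):                    # excitation frequencies, rad/s
    x = np.linalg.solve(K_damaged - w**2 * m, f)   # "measured" FRF response
    r = (K_healthy - w**2 * m) @ x - f             # residual force vector
    # r = theta1 * (k1*K1 @ x) + theta2 * (k2*K2 @ x), theta = loss fractions
    rows.append(np.column_stack((k1 * K1 @ x, k2 * K2 @ x)))
    rhs.append(r)
theta, *_ = np.linalg.lstsq(np.vstack(rows), np.concatenate(rhs), rcond=None)
# theta ≈ [0.0, 0.2]: damage located in spring 2, quantified at 20%.
```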

  9. Event Management for Teacher-Coaches: Risk and Supervision Considerations for School-Based Sports

    Science.gov (United States)

    Paiement, Craig A.; Payment, Matthew P.

    2011-01-01

    A professional sports event requires considerable planning in which years are devoted to the success of that single activity. School-based sports events do not have that luxury, because high schools across the country host athletic events nearly every day. It is not uncommon during the fall sports season for a combination of boys' and girls'…

  10. Web-based online system for recording and examining events in power plants

    International Nuclear Information System (INIS)

    Seyd Farshi, S.; Dehghani, M.

    2004-01-01

    The occurrence of events in power plants can result in serious drawbacks to power generation, which implies a high degree of importance for online recording and examining of events. In this paper an online web-based system is introduced, which records and examines events in power plants. Throughout the paper, the procedures for the design and implementation of this system, its features, and the results gained are explained. The system provides predefined levels of online access to all event data for all its users in power plants, dispatching, regional utilities, and top-level management. By implementation over the electric power industry intranet, an expandable modular system to be used in different sectors of the industry is offered. The web-based online event recording and examining system offers the following advantages: - Online recording of events in power plants. - Examining of events in regional utilities. - Access to events' data. - Preparing managerial reports.

  11. Model Based Verification of Cyber Range Event Environments

    Science.gov (United States)

    2015-11-13

    that may include users, applications, operating systems, servers, hosts, routers, switches, control planes, and instrumentation planes, many of which lack models for their configuration. Our main contributions in this paper are the following. First, we have developed a configuration ontology...configuration errors in environment designs for several cyber range events. The rest of the paper is organized as follows. Section 2 provides an overview of

  12. Assessment of System Frequency Support Effect of PMSG-WTG Using Torque-Limit-Based Inertial Control: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Xiao; Gao, Wenzhong; Wang, Jianhui; Wu, Ziping; Yan, Weihang; Gevorgian, Vahan; Zhang, Yingchen; Muljadi, Eduard; Kang, Moses; Hwang, Min; Kang, Yong Cheol

    2017-05-12

    To release the 'hidden inertia' of variable-speed wind turbines for temporary frequency support, a method of torque-limit-based inertial control is proposed in this paper. This method aims to improve the frequency support capability while respecting the maximum torque restriction of a permanent magnet synchronous generator. The advantages of the proposed method are an improved frequency nadir (FN) in the event of an under-frequency disturbance and avoidance of over-deceleration and a second frequency dip during the inertial response. The system frequency response differs for the different slope values in the power-speed plane along which the inertial response is performed. The proposed method is evaluated in a modified three-machine, nine-bus system. The simulation results show that there is a trade-off between the recovery time and the FN: a gradual slope tends to improve the FN and restrict the rate of change of frequency aggressively, while causing an extension of the recovery time. These results provide insight into how to properly design such inertial control strategies for practical applications.
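The effect of temporary inertial support on the frequency nadir can be illustrated with a toy single-bus swing-equation model (all parameters are illustrative assumptions, not the paper's nine-bus study; the gradual ramp-down of the wind power mimics the paper's point that an abrupt withdrawal would cause a second frequency dip):

```python
def nadir(boost_pu, t_hold, t_ramp, H=4.0, droop=0.05, t_gov=5.0,
          dp=0.10, d_load=1.0, f0=60.0, dt=0.01, t_end=60.0):
    """Minimum frequency after a dp (pu) generation loss at t=0.
    Wind support injects boost_pu for t_hold seconds, then ramps to
    zero over t_ramp seconds. Euler integration of the swing equation
    2H*d(df)/dt = p_gov + p_wind - dp - D*df, with a first-order
    governor lag toward the droop response -df/droop."""
    df, p_gov, f_min, t = 0.0, 0.0, f0, 0.0
    while t < t_end:
        if t < t_hold:
            p_wind = boost_pu
        elif t < t_hold + t_ramp:
            p_wind = boost_pu * (1.0 - (t - t_hold) / t_ramp)
        else:
            p_wind = 0.0
        ddf = (p_gov + p_wind - dp - d_load * df) / (2.0 * H)
        dp_gov = (-df / droop - p_gov) / t_gov
        df += ddf * dt
        p_gov += dp_gov * dt
        f_min = min(f_min, f0 * (1.0 + df))
        t += dt
    return f_min

nadir_base = nadir(0.0, 0.0, 1.0)          # no inertial support
nadir_inertial = nadir(0.05, 5.0, 20.0)    # 0.05 pu held 5 s, ramped out
```

With these numbers the supported case shows a visibly higher nadir than the unsupported one; replacing the ramp with an abrupt cut reproduces the second-dip problem the abstract describes.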

  13. Simulation of Greenhouse Climate Monitoring and Control with Wireless Sensor Network and Event-Based Control

    Directory of Open Access Journals (Sweden)

    Andrzej Pawlowski

    2009-01-01

    Full Text Available Monitoring and control of the greenhouse environment play a decisive role in greenhouse production processes. Assurance of optimal climate conditions has a direct influence on crop growth performance, but it usually increases the required equipment cost. Traditionally, greenhouse installations have required a great effort to connect and distribute all the sensors and data acquisition systems. These installations need many data and power wires to be distributed along the greenhouses, making the system complex and expensive. For this reason, and others such as unavailability of distributed actuators, only individual sensors are usually located in a fixed point that is selected as representative of the overall greenhouse dynamics. On the other hand, the actuation system in greenhouses is usually composed by mechanical devices controlled by relays, being desirable to reduce the number of commutations of the control signals from security and economical point of views. Therefore, and in order to face these drawbacks, this paper describes how the greenhouse climate control can be represented as an event-based system in combination with wireless sensor networks, where low-frequency dynamics variables have to be controlled and control actions are mainly calculated against events produced by external disturbances. The proposed control system allows saving costs related with wear minimization and prolonging the actuator life, but keeping promising performance results. Analysis and conclusions are given by means of simulation results.
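The event-based idea can be illustrated with send-on-delta sampling, one common event-triggering rule: a measurement is transmitted (and the controller acts) only when it has drifted beyond a deadband since the last transmission. This is a sketch with made-up numbers, not the authors' controller:

```python
import math

def send_on_delta(samples, delta):
    """Emit a sample only when the measurement has moved more than
    `delta` from the last transmitted value."""
    events, last = [], None
    for t, y in samples:
        if last is None or abs(y - last) > delta:
            events.append((t, y))
            last = y
    return events

# Slowly varying greenhouse temperature over 24 h, sampled each minute.
temps = [(t, 20.0 + 5.0 * math.sin(2 * math.pi * t / 1440))
         for t in range(1440)]
events = send_on_delta(temps, delta=0.5)
# Far fewer transmissions (and hence actuator commutations) than the
# 1440 periodic samples, at the cost of a bounded 0.5-unit error.
```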

  14. Power Load Event Detection and Classification Based on Edge Symbol Analysis and Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Lei Jiang

    2012-01-01

    Full Text Available Energy signature analysis of power appliances is the core of nonintrusive load monitoring (NILM), where detailed data on the appliances used in houses are obtained by analyzing changes in the voltage and current. This paper focuses on developing automatic power load event detection and appliance classification based on machine learning. For power load event detection, the paper presents a new transient detection algorithm. By analyzing turn-on and turn-off transient waveforms, it can accurately detect the edge point at which a device is switched on or off. The proposed load classification technique can identify different power appliances with improved recognition accuracy and computational speed. The load classification method is composed of two processes: frequency feature analysis and a support vector machine. The experimental results indicated that incorporating the new edge detection and turn-on/turn-off transient signature analysis into NILM revealed more information than traditional NILM methods. The load classification method achieved a recognition rate of more than ninety percent.
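The edge-detection step can be sketched as a threshold on successive power differences (a simplified stand-in for the paper's transient-waveform analysis; the signal and threshold below are synthetic):

```python
import numpy as np

def detect_edges(power, threshold):
    """Flag switching events where the aggregate power steps by more
    than `threshold` watts between consecutive samples; positive
    deltas are turn-on events, negative deltas turn-off events."""
    diff = np.diff(power)
    idx = np.where(np.abs(diff) > threshold)[0] + 1
    return [(int(i), 'on' if diff[i - 1] > 0 else 'off') for i in idx]

# Synthetic aggregate signal: a 60 W lamp switches on at sample 20
# and off at sample 60, on top of a 100 W base load with small noise.
rng = np.random.default_rng(0)
power = 100.0 + rng.normal(0.0, 1.0, 100)
power[20:60] += 60.0
events = detect_edges(power, threshold=30.0)
# → [(20, 'on'), (60, 'off')]
```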

  15. Simulation of Greenhouse Climate Monitoring and Control with Wireless Sensor Network and Event-Based Control

    Science.gov (United States)

    Pawlowski, Andrzej; Guzman, Jose Luis; Rodríguez, Francisco; Berenguel, Manuel; Sánchez, José; Dormido, Sebastián

    2009-01-01

    Monitoring and control of the greenhouse environment play a decisive role in greenhouse production processes. Assurance of optimal climate conditions has a direct influence on crop growth performance, but it usually increases the required equipment cost. Traditionally, greenhouse installations have required a great effort to connect and distribute all the sensors and data acquisition systems. These installations need many data and power wires to be distributed along the greenhouses, making the system complex and expensive. For this reason, and others such as unavailability of distributed actuators, only individual sensors are usually located in a fixed point that is selected as representative of the overall greenhouse dynamics. On the other hand, the actuation system in greenhouses is usually composed by mechanical devices controlled by relays, being desirable to reduce the number of commutations of the control signals from security and economical point of views. Therefore, and in order to face these drawbacks, this paper describes how the greenhouse climate control can be represented as an event-based system in combination with wireless sensor networks, where low-frequency dynamics variables have to be controlled and control actions are mainly calculated against events produced by external disturbances. The proposed control system allows saving costs related with wear minimization and prolonging the actuator life, but keeping promising performance results. Analysis and conclusions are given by means of simulation results. PMID:22389597

  16. Simulation Analysis of SPWM Variable Frequency Speed Based on Simulink

    Directory of Open Access Journals (Sweden)

    Min-Yan DI

    2014-01-01

    Full Text Available This article studies sinusoidal pulse width modulation (SPWM) variable-frequency speed control, currently a very active field of research, with emphasis on the simulation model of the speed control system built with the MATLAB/Simulink/Power System simulation tools, so that the best simulation approach can be found. We apply the model to an actual conveyor belt driven by a variable-frequency motor; when the simulation results are compared with the measured data, the method proves practical and effective. The results of our research offer guidance to engineering and technical personnel in the CAD design of SPWM variable-voltage variable-frequency (VVVF) drives for asynchronous motors.
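The modulation scheme itself can be sketched in a few lines: SPWM gate pulses are obtained by comparing a sinusoidal reference with a high-frequency triangular carrier. The sketch below uses assumed values (50 Hz reference, 1.5 kHz carrier, modulation index 0.8) and Python rather than Simulink, purely for illustration:

```python
import numpy as np

t = np.linspace(0.0, 0.02, 20000, endpoint=False)    # one 50 Hz period
reference = 0.8 * np.sin(2 * np.pi * 50 * t)         # modulating wave
# Triangular carrier in [-1, 1] at 1.5 kHz:
carrier = 2.0 / np.pi * np.arcsin(np.sin(2 * np.pi * 1500 * t))
gate = (reference > carrier).astype(int)             # PWM switching signal
duty = gate.mean()   # averages to ~0.5 over a full fundamental period
```

The low-frequency content of `gate` (e.g. its moving average) tracks the sinusoidal reference, which is what drives the motor at the commanded frequency.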

  17. Event-related desynchronization and synchronization in MEG: Framework for analysis and illustrative datasets related to discrimination of frequency-modulated tones.

    Science.gov (United States)

    Zygierewicz, J; Sieluzycki, C; König, R; Durka, P J

    2008-02-15

    We introduce a complete framework for the calculation of statistically significant event-related desynchronization and synchronization (ERD/ERS) in the time-frequency plane for magnetoencephalographic (MEG) data, and provide free Internet access to software and illustrative datasets related to a classification task of frequency-modulated (FM) tones. Event-related changes in MEG were analysed on the basis of the normal component of the magnetic field acquired by the 148 magnetometers of the hardware configuration of our whole-head MEG device, and by computing planar gradients in the longitudinal and latitudinal directions. Time-frequency energy density for the magnetometer as well as the two gradient configurations is first approximated using the short-time Fourier transform. Subsequently, detailed information is obtained from high-resolution time-frequency maps for the most interesting sensors by means of the computationally much more demanding matching pursuit parametrization. We argue that the ERD/ERS maps are easier to interpret in the gradient approaches and discuss the superior resolution of the matching pursuit time-frequency representation compared to short-time Fourier and wavelet transforms. Experimental results are accompanied by the following resources, available from http://brain.fuw.edu.pl/MEG: (a) 48 high-resolution figures presenting the results of four subjects in all applicable settings, (b) raw datasets, and (c) a complete software environment, allowing one to recompute these figures from the raw datasets.
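The ERD/ERS quantification itself reduces to band power expressed relative to a reference interval. A minimal single-channel sketch on simulated data (the paper works with 148-channel MEG, short-time Fourier and matching-pursuit estimates, and significance testing on top of this):

```python
import numpy as np

fs = 250
t = np.arange(0, 2.0, 1.0 / fs)            # 1 s pre-event, 1 s post-event
amp = np.where(t < 1.0, 1.0, 0.5)          # 10 Hz amplitude halves at t = 1 s
x = amp * np.sin(2 * np.pi * 10 * t)       # simulated desynchronization

def band_power(seg, fs, f_lo, f_hi):
    """Summed FFT power in [f_lo, f_hi] Hz for one window."""
    spec = np.abs(np.fft.rfft(seg)) ** 2
    freqs = np.fft.rfftfreq(len(seg), 1.0 / fs)
    return spec[(freqs >= f_lo) & (freqs <= f_hi)].sum()

ref = band_power(x[:fs], fs, 8, 12)        # pre-event reference interval
post = band_power(x[fs:], fs, 8, 12)       # post-event window
erd_percent = 100.0 * (post - ref) / ref   # negative value = ERD
```

Halving the amplitude quarters the band power, so `erd_percent` comes out near -75 here.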

  18. The effects of high-frequency oscillations in hippocampal electrical activities on the classification of epileptiform events using artificial neural networks

    Science.gov (United States)

    Chiu, Alan W. L.; Jahromi, Shokrollah S.; Khosravani, Houman; Carlen, Peter L.; Bardakjian, Berj L.

    2006-03-01

    The existence of hippocampal high-frequency electrical activities (greater than 100 Hz) during the progression of seizure episodes in both human and animal experimental models of epilepsy has been well documented (Bragin A, Engel J, Wilson C L, Fried I and Buzsáki G 1999 Hippocampus 9 137-42; Khosravani H, Pinnegar C R, Mitchell J R, Bardakjian B L, Federico P and Carlen P L 2005 Epilepsia 46 1-10). However, this information has not been studied between successive seizure episodes or utilized in the application of seizure classification. In this study, we examine the dynamical changes of an in vitro low Mg2+ rat hippocampal slice model of epilepsy at different frequency bands using wavelet transforms and artificial neural networks. By dividing the time-frequency spectrum of each seizure-like event (SLE) into frequency bins, we can analyze their burst-to-burst variations within individual SLEs as well as between successive SLE episodes. Wavelet energy and wavelet entropy are estimated for intracellular and extracellular electrical recordings using sufficiently high sampling rates (10 kHz). We demonstrate that the activities of high-frequency oscillations in the 100-400 Hz range increase as the slice approaches SLE onsets and in later episodes of SLEs. Utilizing the time-dependent relationship between different frequency bands, we can achieve frequency-dependent state classification. We demonstrate that activities in the frequency range 100-400 Hz are critical for the accurate classification of the different states of electrographic seizure-like episodes (containing interictal, preictal and ictal states) in brain slices undergoing recurrent spontaneous SLEs. While preictal activities can be classified with an average accuracy of 77.4 ± 6.7% utilizing the frequency spectrum in the range 0-400 Hz, we can also achieve a similar level of accuracy by using a nonlinear relationship between the 100-400 Hz and <4 Hz frequency bands only.

  19. A SAS-based solution to evaluate study design efficiency of phase I pediatric oncology trials via discrete event simulation.

    Science.gov (United States)

    Barrett, Jeffrey S; Jayaraman, Bhuvana; Patel, Dimple; Skolnik, Jeffrey M

    2008-06-01

    Previous exploration of oncology study design efficiency has focused on Markov processes alone (probability-based events) without consideration for time dependencies. Barriers to study completion include time delays associated with patient accrual, inevaluability (IE), time to dose limiting toxicities (DLT) and administrative and review time. Discrete event simulation (DES) can incorporate probability-based assignment of DLT and IE frequency, correlated with cohort in the case of DLT, with time-based events defined by stochastic relationships. A SAS-based solution to examine study efficiency metrics and evaluate design modifications that would improve study efficiency is presented. Virtual patients are simulated with attributes defined from prior distributions of relevant patient characteristics. Study population datasets are read into SAS macros which select patients and enroll them into a study based on the specific design criteria if the study is open to enrollment. Waiting times, arrival times and time to study events are also sampled from prior distributions; post-processing of study simulations is provided within the decision macros and compared across designs in a separate post-processing algorithm. This solution is examined via comparison of the standard 3+3 decision rule relative to the "rolling 6" design, a newly proposed enrollment strategy for the phase I pediatric oncology setting.
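A minimal sketch of how time-based events enter such a simulation (this is not the authors' SAS implementation; the accrual rate, DLT window, and sequential 3+3-style cohort gating below are illustrative assumptions):

```python
import random

def simulate_3plus3(n_cohorts=4, accrual_rate=0.1, dlt_window=28.0, seed=1):
    """Toy discrete event simulation of phase I enrollment: patients
    arrive with exponential inter-arrival times (accrual_rate per day),
    each cohort of 3 must complete a fixed DLT observation window
    before the next cohort opens. Returns total study duration in days."""
    rng = random.Random(seed)
    t = 0.0
    for _ in range(n_cohorts):
        for _ in range(3):                     # accrue 3 evaluable patients
            t += rng.expovariate(accrual_rate)
        t += dlt_window                        # cohort evaluation period
    return t

duration = simulate_3plus3()
```

Comparing this sequential rule against a variant that lets enrollment continue during the observation window (as the "rolling 6" design does) is exactly the kind of design-efficiency question the abstract describes.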

  20. Event-based criteria in GT-STAF information indices: theory, exploratory diversity analysis and QSPR applications.

    Science.gov (United States)

    Barigye, S J; Marrero-Ponce, Y; Martínez López, Y; Martínez Santiago, O; Torrens, F; García Domenech, R; Galvez, J

    2013-01-01

    Versatile event-based approaches for the definition of novel information theory-based indices (IFIs) are presented. An event in this context is the criterion followed in the "discovery" of molecular substructures, which in turn serve as the basis for the construction of the generalized incidence and relations frequency matrices, Q and F, respectively. From the resultant F, Shannon's, mutual, conditional and joint entropy-based IFIs are computed. In previous reports, an event named connected subgraphs was presented. The present study is an extension of this notion, in which we introduce other events, namely: terminal paths, vertex path incidence, quantum subgraphs, walks of length k, Sach's subgraphs, MACCs, E-state and substructure fingerprints and, finally, Ghose and Crippen atom-types for hydrophobicity and refractivity. Moreover, we define magnitude-based IFIs, introducing the use of the magnitude criterion in the definition of mutual, conditional and joint entropy-based IFIs. We also discuss the use of information-theoretic parameters as a measure of the dissimilarity of codified structural information of molecules. Finally, a comparison of the statistics for QSPR models obtained with the proposed IFIs and DRAGON's molecular descriptors for two physicochemical properties, log P and log K, of 34 derivatives of 2-furylethylenes demonstrates similar or better predictive ability compared with the latter.
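The entropy computations underlying such IFIs can be sketched from a relations frequency matrix F, treating its normalized counts as a joint probability distribution (a minimal sketch; the paper's events and matrices are far richer):

```python
import numpy as np

def shannon_entropies(F):
    """Shannon, joint, conditional and mutual entropies (bits) from a
    nonnegative frequency matrix F."""
    p = F / F.sum()                        # joint probabilities
    px, py = p.sum(axis=1), p.sum(axis=0)  # marginals
    def h(q):
        q = q[q > 0]                       # ignore empty cells
        return float(-(q * np.log2(q)).sum())
    h_joint = h(p.ravel())
    h_x, h_y = h(px), h(py)
    return {"H(X)": h_x, "H(Y)": h_y, "H(X,Y)": h_joint,
            "H(Y|X)": h_joint - h_x,       # conditional entropy
            "I(X;Y)": h_x + h_y - h_joint} # mutual information

# Independent uniform example: mutual information is zero.
ent = shannon_entropies(np.ones((4, 4)))
```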

  1. Issues in Informal Education: Event-Based Science Communication Involving Planetaria and the Internet

    Science.gov (United States)

    Adams, M.; Gallagher, D. L.; Whitt, A.; Six, N. Frank (Technical Monitor)

    2002-01-01

    For the past four years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of science communication through web resources on the Internet. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases broadcasts accommodate active feedback and questions from Internet participants. We give here examples of events, problems, and lessons learned from these activities.

  2. Fault trees based on past accidents. Factorial analysis of events

    International Nuclear Information System (INIS)

    Vaillant, M.

    1977-01-01

    The method of the fault tree is already useful as a qualitative step before any reliability calculation. Construction of the tree becomes even simpler when we only want to describe how the events happened. Unlike scenarios that introduce several possibilities by means of the conjunction OR, here only the conjunction AND occurs, and it is therefore left implicit. This method is presented by INRS (1) for the study of industrial injuries; it may also be applied to material damage. (orig.)

  3. Ionospheric correction for spaceborne single-frequency GPS based ...

    Indian Academy of Sciences (India)

    A modified ionospheric correction method and the corresponding approximate algorithm for spaceborne single-frequency Global Positioning System (GPS) users are proposed in this study. Single Layer Model (SLM) mapping function for spaceborne GPS was analyzed. SLM mapping functions at different altitudes were ...

  4. Ionospheric correction for spaceborne single-frequency GPS based ...

    Indian Academy of Sciences (India)

    The Klobuchar model was used to compute ionospheric delays for the dlft station, and .... dual-frequency GPS receivers; therefore, the iono- ... The mapping function is defined as the ratio of .... eter in the processing of an extended set of single.

  5. Dynamic phasor based frequency scanning for grid-connected ...

    Indian Academy of Sciences (India)

    M K Das

    2017-10-11

    Oct 11, 2017 ... situations, it is not so for systems with low-order harmonics, individual-phase schemes, unbalanced or single- ..... rents at x − 2xo and x, respectively, due to the fundamental ... c = xot, where xo is the operating frequency.

  6. Fast Grid Frequency Support from Distributed Inverter-Based Resources

    Energy Technology Data Exchange (ETDEWEB)

    Hoke, Anderson F [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-05-04

    This presentation summarizes power hardware-in-the-loop testing performed to evaluate the ability of distributed inverter-coupled generation to support grid frequency on the fastest time scales. The research found that distributed PV inverters and other DERs can effectively support the grid on sub-second time scales.

  7. Femtosecond frequency comb based distance measurement in air

    NARCIS (Netherlands)

    Balling, P.; Kren, P.; Masika, P.; van den Berg, S.A.

    2009-01-01

    Interferometric measurement of distance using a femtosecond frequency comb is demonstrated and compared with a counting interferometer displacement measurement. A numerical model of pulse propagation in air is developed and the results are compared with experimental data for short distances. The

  8. High resolution mid-infrared spectroscopy based on frequency upconversion

    DEFF Research Database (Denmark)

    Dam, Jeppe Seidelin; Hu, Qi; Tidemand-Lichtenberg, Peter

    2013-01-01

    signals can be analyzed. The obtainable frequency resolution is usually in the nm range where sub nm resolution is preferred in many applications, like gas spectroscopy. In this work we demonstrate how to obtain sub nm resolution when using upconversion. In the presented realization one object point...... high resolution spectral performance by observing emission from hot water vapor in a butane gas burner....

  9. Wavelet Based Characterization of Low Radio Frequency Solar Emissions

    Science.gov (United States)

    Suresh, A.; Sharma, R.; Das, S. B.; Oberoi, D.; Pankratius, V.; Lonsdale, C.

    2016-12-01

    Low-frequency solar radio observations with the Murchison Widefield Array (MWA) have revealed the presence of numerous short-lived, narrow-band weak radio features, even during quiet solar conditions. In their appearance in the frequency-time plane, they come closest to the solar type III bursts, but with much shorter spectral spans and lower flux densities, so much so that they are not detectable with the usual swept-frequency radio spectrographs. These features occur at rates of many thousand features per hour in the 30.72 MHz MWA bandwidth, and hence necessarily require an automated approach to determine robust statistical estimates of their properties, e.g., distributions of spectral widths, temporal spans, flux densities, slopes in the time-frequency plane and distribution over frequency. To achieve this, a wavelet decomposition approach has been developed for feature recognition and subsequent parameter extraction from the MWA dynamic spectrum. This work builds on earlier work by members of this team to achieve a reliable flux calibration in a computationally efficient manner. Preliminary results show that the distribution of spectral span of these features peaks around 3 MHz, most of them last for less than two seconds and are characterized by flux densities of about 60% of the background solar emission. In analogy with the solar type III bursts, this non-thermal emission is envisaged to arise via coherent emission processes. There is also an exciting possibility that these features might correspond to radio signatures of nanoflares, hypothesized (Gold, 1964; Parker, 1972) to explain coronal heating.

  10. Detailed Analysis of Torque Ripple in High Frequency Signal Injection based Sensor less PMSM Drives

    Directory of Open Access Journals (Sweden)

    Ravikumar Setty A.

    2017-01-01

    Full Text Available High-frequency signal injection based techniques are robust and well proven for estimating the rotor position from standstill to low speed. However, the injected high-frequency signal introduces high-frequency harmonics in the motor phase currents and results in significant output torque ripple. No detailed analysis exists in the literature of the effect of the injected signal frequency on torque ripple. The objective of this work is to study the torque ripple resulting from high-frequency signal injection in PMSM motor drives. Detailed MATLAB/Simulink simulations are carried out to quantify the torque ripple at different signal frequencies.

  11. LES-based generation of high-frequency fluctuation in wind turbulence obtained by meteorological model

    Science.gov (United States)

    Tamura, Tetsuro; Kawaguchi, Masaharu; Kawai, Hidenori; Tao, Tao

    2017-11-01

    The connection between a meso-scale model and a micro-scale large eddy simulation (LES) is essential for simulating micro-scale meteorological problems, such as strong convective events due to typhoons or tornadoes, using LES. In these problems the mean velocity profiles and mean wind directions change with time according to the movement of the typhoon or tornado. However, a fine-grid micro-scale LES cannot be connected directly to a coarse-grid meso-scale WRF model. In LES, when the grid is suddenly refined at an interface of nested grids normal to the mean advection, the resolved shear stresses decrease because of interpolation errors and the delayed generation of the smaller-scale turbulence that can be resolved on the finer mesh. For the estimation of wind gust damage, the peak wind acting on buildings and structures has to be predicted correctly. In meteorological models the velocity fluctuations tend to vary diffusively, lacking the high-frequency components, because of numerical filtering effects. In order to predict the peak wind velocity with good accuracy, this paper proposes an LES-based method for generating the higher-frequency components of the velocity and temperature fields obtained by a meteorological model.

  12. Tsunami detection by high-frequency radar in British Columbia: performance assessment of the time-correlation algorithm for synthetic and real events

    Science.gov (United States)

    Guérin, Charles-Antoine; Grilli, Stéphan T.; Moran, Patrick; Grilli, Annette R.; Insua, Tania L.

    2018-02-01

    The authors recently proposed a new method for detecting tsunamis using high-frequency (HF) radar observations, referred to as "time-correlation algorithm" (TCA; Grilli et al. Pure Appl Geophys 173(12):3895-3934, 2016a, 174(1): 3003-3028, 2017). Unlike standard algorithms that detect surface current patterns, the TCA is based on analyzing space-time correlations of radar signal time series in pairs of radar cells, which does not require inverting radial surface currents. This was done by calculating a contrast function, which quantifies the change in pattern of the mean correlation between pairs of neighboring cells upon tsunami arrival, with respect to a reference correlation computed in the recent past. In earlier work, the TCA was successfully validated based on realistic numerical simulations of both the radar signal and tsunami wave trains. Here, this algorithm is adapted to apply to actual data from a HF radar installed in Tofino, BC, for three test cases: (1) a simulated far-field tsunami generated in the Semidi Subduction Zone in the Aleutian Arc; (2) a simulated near-field tsunami from a submarine mass failure on the continental slope off of Tofino; and (3) an event believed to be a meteotsunami, which occurred on October 14th, 2016, off of the Pacific West Coast and was measured by the radar. In the first two cases, the synthetic tsunami signal is superimposed onto the radar signal by way of a current memory term; in the third case, the tsunami signature is present within the radar data. In light of these test cases, we develop a detection methodology based on the TCA, using a correlation contrast function, and show that in all three cases the algorithm is able to trigger a timely early warning.
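
    The contrast function at the heart of the TCA can be illustrated generically: compare the mean correlation over all pairs of radar cells in a current window against the same quantity in a reference window from the recent past, and trigger when the change is large. The sketch below is a simplification of the cited algorithm; the windowing and cell selection are assumptions, not the published implementation.

    ```python
    import numpy as np

    def correlation_contrast(signals, ref_slice, cur_slice):
        """TCA-style contrast sketch: change in mean pairwise correlation.

        signals   : array (n_cells, n_samples) of radar signal time series
        ref_slice : slice selecting the reference window (recent past)
        cur_slice : slice selecting the current window
        """
        def mean_pair_corr(X):
            C = np.corrcoef(X)                  # cell-by-cell correlation matrix
            iu = np.triu_indices_from(C, k=1)   # upper triangle = distinct pairs
            return float(C[iu].mean())

        return mean_pair_corr(signals[:, cur_slice]) - mean_pair_corr(signals[:, ref_slice])
    ```

    A coherent wave train arriving across the cells raises the pairwise correlations in the current window relative to the reference, producing a positive contrast that can be thresholded for early warning.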

  13. Tsunami detection by high-frequency radar in British Columbia: performance assessment of the time-correlation algorithm for synthetic and real events

    Science.gov (United States)

    Guérin, Charles-Antoine; Grilli, Stéphan T.; Moran, Patrick; Grilli, Annette R.; Insua, Tania L.

    2018-05-01

    The authors recently proposed a new method for detecting tsunamis using high-frequency (HF) radar observations, referred to as "time-correlation algorithm" (TCA; Grilli et al. Pure Appl Geophys 173(12):3895-3934, 2016a, 174(1): 3003-3028, 2017). Unlike standard algorithms that detect surface current patterns, the TCA is based on analyzing space-time correlations of radar signal time series in pairs of radar cells, which does not require inverting radial surface currents. This was done by calculating a contrast function, which quantifies the change in pattern of the mean correlation between pairs of neighboring cells upon tsunami arrival, with respect to a reference correlation computed in the recent past. In earlier work, the TCA was successfully validated based on realistic numerical simulations of both the radar signal and tsunami wave trains. Here, this algorithm is adapted to apply to actual data from a HF radar installed in Tofino, BC, for three test cases: (1) a simulated far-field tsunami generated in the Semidi Subduction Zone in the Aleutian Arc; (2) a simulated near-field tsunami from a submarine mass failure on the continental slope off of Tofino; and (3) an event believed to be a meteotsunami, which occurred on October 14th, 2016, off of the Pacific West Coast and was measured by the radar. In the first two cases, the synthetic tsunami signal is superimposed onto the radar signal by way of a current memory term; in the third case, the tsunami signature is present within the radar data. In light of these test cases, we develop a detection methodology based on the TCA, using a correlation contrast function, and show that in all three cases the algorithm is able to trigger a timely early warning.

  14. Has the frequency of bleeding changed over time for patients presenting with an acute coronary syndrome? The global registry of acute coronary events.

    OpenAIRE

    Fox, KA; Carruthers, K; Steg, PG; Avezum, A; Granger, CB; Montalescot, G; Goodman, SG; Gore, JM; Quill, AL; Eagle, KA; GRACE Investigators,

    2010-01-01

    AIMS: To determine whether changes in practice, over time, are associated with altered rates of major bleeding in acute coronary syndromes (ACS). METHODS AND RESULTS: Patients from the Global Registry of Acute Coronary Events were enrolled between 2000 and 2007. The main outcome measures were frequency of major bleeding, including haemorrhagic stroke, over time, after adjustment for patient characteristics, and impact of major b...

  15. In-cylinder pressure-based direct techniques and time frequency analysis for combustion diagnostics in IC engines

    International Nuclear Information System (INIS)

    D’Ambrosio, S.; Ferrari, A.; Galleani, L.

    2015-01-01

    Highlights: • Direct pressure-based techniques have been applied successfully to spark-ignition engines. • The burned mass fraction of pressure-based techniques has been compared with that of 2- and 3-zone combustion models. • Time-frequency analysis has been employed to simulate complex diesel combustion events. - Abstract: In-cylinder pressure measurement and analysis has historically been a key tool for off-line combustion diagnosis in internal combustion engines, but online applications for real-time condition monitoring and combustion management have recently become popular. The present investigation presents and compares different low computing-cost in-cylinder pressure-based methods for the analysis of the main features of combustion, that is, the start of combustion, the end of combustion and the crankshaft angle that corresponds to half of the overall burned mass. The instantaneous pressure in the combustion chamber has been used as an input datum for the described analytical procedures, measured by means of a standard piezoelectric transducer. Traditional pressure-based techniques have been shown to predict the burned mass fraction time history more accurately in spark-ignition engines than in diesel engines. The most suitable pressure-based techniques for both spark-ignition and compression-ignition engines have been chosen on the basis of the available experimental data. Time-frequency analysis has also been applied to the analysis of diesel combustion, which is richer in events than spark-ignited combustion. Time-frequency algorithms for the calculation of the mean instantaneous frequency are computationally efficient, allow the main events of diesel combustion to be identified and provide the greatest benefits in the presence of multiple injection events. These algorithms can be optimized and applied to onboard diagnostics tools designed for real control, but can also be used as an advanced validation tool for
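
    A common starting point for such pressure-based diagnostics is the textbook single-zone apparent heat-release calculation (not necessarily the paper's specific algorithm): the cumulative heat release, normalized, approximates the burned mass fraction, from which markers such as the 50% burn angle (CA50) follow. A numpy sketch, with a constant ratio of specific heats as an assumed simplification:

    ```python
    import numpy as np

    def burned_mass_fraction(theta, p, V, gamma=1.35):
        """Single-zone apparent heat-release sketch.

        theta : crank angle [deg], p : cylinder pressure [Pa], V : volume [m^3]
        gamma : assumed constant ratio of specific heats
        Returns (xb, ca50): normalized burned mass fraction and the crank
        angle at which half of the total heat has been released.
        """
        # dQ/dtheta = gamma/(gamma-1) * p * dV/dtheta + 1/(gamma-1) * V * dp/dtheta
        dQ = (gamma / (gamma - 1.0)) * p * np.gradient(V, theta) \
             + (1.0 / (gamma - 1.0)) * V * np.gradient(p, theta)
        dtheta = float(np.gradient(theta).mean())          # uniform grid assumed
        Q = np.cumsum(np.clip(dQ, 0.0, None)) * dtheta     # ignore heat losses
        xb = Q / Q[-1]
        ca50 = float(np.interp(0.5, xb, theta))
        return xb, ca50
    ```

    CA10 and CA90 follow the same way by interpolating at 0.1 and 0.9; the clipping of negative heat-release rates is a crude stand-in for the heat-transfer corrections a production implementation would apply.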

  16. Nano-Scale Devices for Frequency-Based Magnetic Biosensing

    Science.gov (United States)

    2017-01-31

    show the basic measurement setup (the field is applied perpendicular to the disk plane). A radiofrequency signal is injected across the disk (disks...shown in Fig. 7(a). A spectrum analyser (S.A.) (or a high frequency oscilloscope) is used to measure the radiofrequency STO output signal with Fig...crystals and, via electrical measurements, in magnetic-vortex-containing, isolated micro- and nano-devices. Via micromagnetic simulations, we have largely

  17. Base-Level Guide for Electromagnetic Frequency Radiation

    Science.gov (United States)

    2012-12-01

    only through exposure to very powerful magnetic fields (> 4 Tesla). For the limits, the force influences on metallic objects are relevant...impractical. Nonferrous shielding materials have little or no effect upon a magnetic field. Magnetic shielding that is effective at low frequencies... detectors or area monitors in Air Force operational environments. This is noted in AFOSH 48-9. In cases where the environment may exceed ten (10

  18. Sparse time-frequency decomposition based on dictionary adaptation.

    Science.gov (United States)

    Hou, Thomas Y; Shi, Zuoqiang

    2016-04-13

    In this paper, we propose a time-frequency analysis method to obtain instantaneous frequencies and the corresponding decomposition by solving an optimization problem. In this optimization problem, the basis that is used to decompose the signal is not known a priori. Instead, it is adapted to the signal and is determined as part of the optimization problem. In this sense, this optimization problem can be seen as a dictionary adaptation problem, in which the dictionary is adaptive to one signal rather than a training set in dictionary learning. This dictionary adaptation problem is solved by using the augmented Lagrangian multiplier (ALM) method iteratively. We further accelerate the ALM method in each iteration by using the fast wavelet transform. We apply our method to decompose several signals, including signals with poor scale separation, signals with outliers and polluted by noise and a real signal. The results show that this method can give accurate recovery of both the instantaneous frequencies and the intrinsic mode functions. © 2016 The Author(s).

  19. Common time-frequency analysis of local field potential and pyramidal cell activity in seizure-like events of the rat hippocampus

    Science.gov (United States)

    Cotic, M.; Chiu, A. W. L.; Jahromi, S. S.; Carlen, P. L.; Bardakjian, B. L.

    2011-08-01

    To study cell-field dynamics, physiologists simultaneously record local field potentials and the activity of individual cells from animals performing cognitive tasks, during various brain states or under pathological conditions. However, apart from spike shape and spike timing analyses, few studies have focused on elucidating the common time-frequency structure of local field activity relative to surrounding cells across different periods of phenomena. We have used two algorithms, multi-window time frequency analysis and wavelet phase coherence (WPC), to study common intracellular-extracellular (I-E) spectral features in spontaneous seizure-like events (SLEs) from rat hippocampal slices in a low magnesium epilepsy model. Both algorithms were applied to 'pairs' of simultaneously observed I-E signals from slices in the CA1 hippocampal region. Analyses were performed over a frequency range of 1-100 Hz. I-E spectral commonality varied in frequency and time. Higher commonality was observed from 1 to 15 Hz, and lower commonality was observed in the 15-100 Hz frequency range. WPC was lower in the non-SLE region compared to SLE activity; however, there was no statistical difference in the 30-45 Hz band between SLE and non-SLE modes. This work provides evidence of strong commonality in various frequency bands of I-E SLEs in the rat hippocampus, not only during SLEs but also immediately before and after.
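
    A simplified relative of the wavelet phase coherence used here is the phase-locking value computed from the analytic signal. The numpy-only sketch below (with an FFT-based Hilbert transform) illustrates the idea of quantifying common phase structure between an intracellular and an extracellular trace, without the per-frequency wavelet decomposition of the paper; band-pass filtering into the bands of interest would precede this step in practice.

    ```python
    import numpy as np

    def phase_coherence(x, y):
        """Phase-locking value between two equal-length signals: 1 = phase
        locked, ~0 = unrelated phases. A stand-in for full wavelet phase
        coherence, which resolves this quantity per frequency and time."""
        def phase(s):
            n = len(s)
            S = np.fft.fft(s)
            h = np.zeros(n)                 # Hilbert-transform filter weights
            h[0] = 1.0
            if n % 2 == 0:
                h[n // 2] = 1.0
                h[1:n // 2] = 2.0
            else:
                h[1:(n + 1) // 2] = 2.0
            return np.angle(np.fft.ifft(S * h))   # instantaneous phase

        dphi = phase(x) - phase(y)
        return float(abs(np.exp(1j * dphi).mean()))
    ```

    Two sinusoids at the same frequency give a value near 1 regardless of their fixed phase offset, while a sinusoid against noise gives a value near 0.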

  20. Fire!: An Event-Based Science Module. Teacher's Guide. Chemistry and Fire Ecology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science or physical science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  1. Volcano!: An Event-Based Science Module. Student Edition. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  2. Volcano!: An Event-Based Science Module. Teacher's Guide. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research,…

  3. Maximal potential patent foramen diameter does not correlate with the type or frequency of the neurologic event prior to closure.

    Science.gov (United States)

    Kutty, Shelby; Brown, Kimberly; Qureshi, Athar M; Latson, Larry A

    2009-01-01

    We analyzed our data on patients undergoing transcatheter patent foramen ovale (PFO) closure to determine if the maximal potential PFO diameter (MPPD) by balloon sizing correlates with important clinical characteristics in this population. We defined stroke as a focal neurologic deficit lasting >24 h, or focal deficit of shorter duration associated with permanent MRI/CT changes consistent with a focal infarction. Parameters analyzed included age, gender, anticoagulation, hypertension, smoking, MRI/CT findings and MPPD at catheterization. We specifically analyzed the type of neurologic event (stroke/transient ischemic attack, TIA), and number of recorded preceding clinical neurologic events. In 216 consecutive patients, 167 suffered a stroke. MRI/CT changes consistent with one or more embolic events were seen in 156 patients; 49 had a clinical TIA. There was no significant difference in MPPD between stroke (11.0 +/- 3.6 mm) and TIA groups (10.9 +/- 3.9 mm; 95% confidence interval for difference: -1.33 to 1.00). MPPD did not differ between MRI/CT-positive vs. -negative strokes, and had no correlation with the number of identified pre-closure clinical neurologic events. Continued investigation is needed to determine whether other PFO characteristics, or other anatomic/physiologic parameters, may be useful to identify patients at high risk for cryptogenic stroke/TIA, even before they have their first neurologic event. Copyright 2008 S. Karger AG, Basel.

  4. Event-building and PC farm based level-3 trigger at the CDF experiment

    CERN Document Server

    Anikeev, K; Furic, I K; Holmgren, D; Korn, A J; Kravchenko, I V; Mulhearn, M; Ngan, P; Paus, C; Rakitine, A; Rechenmacher, R; Shah, T; Sphicas, Paris; Sumorok, K; Tether, S; Tseng, J

    2000-01-01

    In the technical design report the event building process at Fermilab's CDF experiment is required to function at an event rate of 300 events/sec. The events are expected to have an average size of 150 kBytes (kB) and are assembled from fragments of 16 readout locations. The fragment size from the different locations varies between 12 kB and 16 kB. Once the events are assembled they are fed into the Level-3 trigger which is based on processors running programs to filter events using the full event information. Computing power on the order of a second on a Pentium II processor is required per event. The architecture design is driven by the cost and is therefore based on commodity components: VME processor modules running VxWorks for the readout, an ATM switch for the event building, and Pentium PCs running Linux as an operation system for the Level-3 event processing. Pentium PCs are also used to receive events from the ATM switch and further distribute them to the processing nodes over multiple 100 Mbps Ether...

  5. Caplacizumab reduces the frequency of major thromboembolic events, exacerbations and death in patients with acquired thrombotic thrombocytopenic purpura.

    Science.gov (United States)

    Peyvandi, F; Scully, M; Kremer Hovinga, J A; Knöbl, P; Cataland, S; De Beuf, K; Callewaert, F; De Winter, H; Zeldin, R K

    2017-07-01

    Essentials Acquired thrombotic thrombocytopenic purpura (aTTP) is linked with significant morbidity/mortality. Caplacizumab's effect on major thromboembolic (TE) events, exacerbations and death was studied. Fewer caplacizumab-treated patients had a major TE event, an exacerbation, or died versus placebo. Caplacizumab has the potential to reduce the acute morbidity and mortality associated with aTTP. Background Acquired thrombotic thrombocytopenic purpura (aTTP) is a life-threatening autoimmune thrombotic microangiopathy. In spite of treatment with plasma exchange and immunosuppression, patients remain at risk for thrombotic complications, exacerbations, and death. In the phase II TITAN study, treatment with caplacizumab, an anti-von Willebrand factor Nanobody®, was shown to reduce the time to confirmed platelet count normalization and exacerbations during treatment. Objective The clinical benefit of caplacizumab was further investigated in a post hoc analysis of the incidence of major thromboembolic events and exacerbations during the study drug treatment period and thrombotic thrombocytopenic purpura-related death during the study. Methods The Standardized Medical Dictionary for Regulatory Activities (MedDRA) Query (SMQ) for 'embolic and thrombotic events' was run to investigate the occurrence of major thromboembolic events and exacerbations in the safety population of the TITAN study, which consisted of 72 patients, of whom 35 received caplacizumab and 37 received placebo. Results Four events (one pulmonary embolism and three aTTP exacerbations) were reported in four patients in the caplacizumab group, and 20 such events were reported in 14 patients in the placebo group (two acute myocardial infarctions, one ischemic stroke, one hemorrhagic stroke, one pulmonary embolism, one deep vein thrombosis, one venous thrombosis, and 13 aTTP exacerbations). Two of the placebo-treated patients died from aTTP during the study. Conclusion In total, 11.4% of caplacizumab

  6. Core damage frequency perspectives based on IPE results

    International Nuclear Information System (INIS)

    Dingman, S.E.; Camp, A.L.; LaChance, J.L.; Drouin, M.T.

    1996-01-01

    In November 1988, the US Nuclear Regulatory Commission (NRC) issued Generic Letter 88-20 requesting that all licensees perform an individual Plant Examination (IPE) to identify any plant-specific vulnerability to severe accidents and report the results to the Commission. This paper provides perspectives gained from reviewing 75 Individual Plant Examination (IPE) submittals covering 108 nuclear power plant units. Variability both within and among reactor types is examined to provide perspectives regarding plant-specific design and operational features, and modeling assumptions that play a significant role in the estimates of core damage frequencies in the IPEs

  7. Wireless Chalcogenide Nanoionic-Based Radio-Frequency Switch

    Science.gov (United States)

    Nessel, James; Miranda, Felix

    2013-01-01

    A new nonvolatile nanoionic switch is powered and controlled through wireless radio-frequency (RF) transmission. A thin layer of chalcogenide glass doped with a metal ion, such as silver, comprises the operational portion of the switch. For the switch to function, an oxidizable electrode is made positive (anode) with respect to an opposing electrode (cathode) when sufficient bias, typically on the order of a few tenths of a volt or more, is applied. This action causes the metal ions to flow toward the cathode through a coordinated hopping mechanism. At the cathode, a reduction reaction occurs to form a metal deposit. This metal deposit creates a conductive path that bridges the gap between electrodes to turn the switch on. Once this conductive path is formed, no further power is required to maintain it. To reverse this process, the metal deposit is made positive with respect to the original oxidizable electrode, causing the dissolution of the metal bridge thereby turning the switch off. Once the metal deposit has been completely dissolved, the process self-terminates. This switching process features the following attributes. It requires very little power to change states (i.e., on and off). Furthermore, no power is required to maintain the states; hence, the state of the switch is nonvolatile. Because of these attributes the integration of a rectenna to provide the necessary power and control is unique to this embodiment. A rectenna, or rectifying antenna, generates DC power from an incident RF signal. The low voltages and power required for the nanoionic switch control are easily generated from this system and provide the switch with a novel capability to be operated and powered from an external wireless device. In one realization, an RF signal of a specific frequency can be used to set the switch into an off state, while another frequency can be used to set the switch to an on state. The wireless, miniaturized, and no-moving-part features of this switch make it

  8. Short-Term Effects of Changing Precipitation Patterns on Shrub-Steppe Grasslands: Seasonal Watering Is More Important than Frequency of Watering Events.

    Science.gov (United States)

    Densmore-McCulloch, Justine A; Thompson, Donald L; Fraser, Lauchlan H

    2016-01-01

    Climate change is expected to alter precipitation patterns. Droughts may become longer and more frequent, and the timing and intensity of precipitation may change. We tested how shifting precipitation patterns, both seasonally and by frequency of events, affects soil nitrogen availability, plant biomass and diversity in a shrub-steppe temperate grassland along a natural productivity gradient in Lac du Bois Grasslands Protected Area near Kamloops, British Columbia, Canada. We manipulated seasonal watering patterns by either exclusively watering in the spring or the fall. To simulate spring precipitation we restricted precipitation inputs in the fall, then added 50% more water than the long term average in the spring, and vice-versa for the fall precipitation treatment. Overall, the amount of precipitation remained roughly the same. We manipulated the frequency of rainfall events by either applying water weekly (frequent) or monthly (intensive). After 2 years, changes in the seasonality of watering had greater effects on plant biomass and diversity than changes in the frequency of watering. Fall watering reduced biomass and increased species diversity, while spring watering had little effect. The reduction in biomass in fall watered treatments was due to a decline in grasses, but not forbs. Plant available N, measured by Plant Root Simulator (PRS)-probes, increased from spring to summer to fall, and was higher in fall watered treatments compared to spring watered treatments when measured in the fall. The only effect observed due to frequency of watering events was greater extractable soil N in monthly applied treatments compared to weekly watering treatments. Understanding the effects of changing precipitation patterns on grasslands will allow improved grassland conservation and management in the face of global climatic change, and here we show that if precipitation is more abundant in the fall, compared to the spring, grassland primary productivity will likely be

  9. The power grid AGC frequency bias coefficient online identification method based on wide area information

    Science.gov (United States)

    Wang, Zian; Li, Shiguang; Yu, Ting

    2015-12-01

    This paper proposes an online identification method for the regional frequency deviation coefficient, based on an analysis of the AGC adjustment response mechanism of interconnected grids and on the real-time operating state of generators as measured by PMUs. The optimization of the regional frequency deviation coefficient under the actual operating state of the power system is analyzed, achieving more accurate and efficient automatic generation control. The validity of the online identification method is verified by establishing a long-term frequency control simulation model of a two-area interconnected power system.
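
    The tie-line bias mechanism that such a frequency deviation (bias) coefficient feeds into can be sketched numerically: each control area forms an area control error (ACE) from its tie-line power deviation plus the bias coefficient times the frequency deviation. The sketch below is illustrative only (sign conventions vary between operators, and the function name and numbers are assumptions, not taken from the paper):

    ```python
    # Tie-line bias control sketch: ACE_i = dP_tie_i + B_i * df.
    # A well-identified bias coefficient makes ACE vanish when the area's own
    # natural frequency response fully explains its tie-line deviation.

    def area_control_error(dp_tie_mw: float, bias_mw_per_hz: float, df_hz: float) -> float:
        """Area control error: tie-line power deviation plus frequency-bias term."""
        return dp_tie_mw + bias_mw_per_hz * df_hz

    # Example: 30 MW of extra export while the system runs 0.05 Hz low; with an
    # assumed B = 600 MW/Hz the bias term exactly cancels the tie-line deviation.
    ace = area_control_error(30.0, 600.0, -0.05)  # -> 0.0
    ```

    An identification error in B shows up directly as a residual ACE, which is why online refinement of the coefficient improves AGC accuracy.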

  10. Knowledge based query expansion in complex multimedia event detection

    NARCIS (Netherlands)

    Boer, M. de; Schutte, K.; Kraaij, W.

    2016-01-01

    A common approach in content based video information retrieval is to perform automatic shot annotation with semantic labels using pre-trained classifiers. The visual vocabulary of state-of-the-art automatic annotation systems is limited to a few thousand concepts, which creates a semantic gap

  11. Knowledge based query expansion in complex multimedia event detection

    NARCIS (Netherlands)

    Boer, M.H.T. de; Schutte, K.; Kraaij, W.

    2015-01-01

    A common approach in content based video information retrieval is to perform automatic shot annotation with semantic labels using pre-trained classifiers. The visual vocabulary of state-of-the-art automatic annotation systems is limited to a few thousand concepts, which creates a semantic gap

  12. Islanding Detection of Synchronous Machine-Based DGs using Average Frequency Based Index

    Directory of Open Access Journals (Sweden)

    M. Bakhshi

    2013-06-01

    Full Text Available Identification of intentional and unintentional islanding situations of dispersed generators (DGs) is one of the most important protection concerns in power systems. Considering the safety and reliability problems of distribution networks, an exact diagnosis index is required to discriminate loss of the main network from ordinary parallel operation. Hence, this paper introduces a new islanding detection method for synchronous machine-based DGs. The method uses the average value of the generator frequency to calculate a new detection index. The proposed method is an effective supplement to the over/under frequency protection (OFP/UFP) system. Analytical equations and simulation results are used to assess the performance of the proposed method under various scenarios such as different types of faults, load changes and capacitor bank switching. To show its effectiveness, the proposed method is compared with both the ROCOF and ROCOFOP methods.
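
    The core of an average-frequency-based index can be sketched in a few lines: average the measured generator frequency over a short window and compare the mean deviation from nominal against a threshold. The window length, threshold, and synthetic signals below are illustrative assumptions, not the paper's tuned settings:

    ```python
    # Average-frequency islanding index: mean deviation of recent frequency
    # samples from nominal. All parameter values are illustrative.

    def islanding_index(freq_samples, nominal_hz=50.0, window=10):
        """Absolute mean deviation of the last `window` frequency samples (Hz)."""
        recent = freq_samples[-window:]
        return abs(sum(recent) / len(recent) - nominal_hz)

    def is_islanded(freq_samples, threshold_hz=0.2, **kwargs):
        """Flag islanding when the windowed average deviation exceeds a threshold,
        complementing conventional over/under frequency protection."""
        return islanding_index(freq_samples, **kwargs) > threshold_hz

    # Grid-tied: small symmetric noise around 50 Hz -> index stays near zero.
    grid_tied = [50.0 + 0.01 * (-1) ** i for i in range(20)]
    # After loss of mains: frequency drifts as the DG absorbs the load imbalance.
    islanded = [50.0 - 0.03 * i for i in range(20)]
    ```

    Averaging over a window is what distinguishes this index from instantaneous OFP/UFP relays: short symmetric disturbances cancel out, while a sustained drift accumulates.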

  13. Mercury Atomic Frequency Standards for Space Based Navigation and Timekeeping

    Science.gov (United States)

    Tjoelker, R. L.; Burt, E. A.; Chung, S.; Hamell, R. L.; Prestage, J. D.; Tucker, B.; Cash, P.; Lutwak, R.

    2012-01-01

    A low-power Mercury Atomic Frequency Standard (MAFS) has been developed and demonstrated on the path towards future space clock applications. A self-contained mercury ion breadboard clock, emulating flight clock interfaces, steering a USO local oscillator, and consuming approximately 40 W, has been operating at JPL for more than a year. This complete, modular ion clock instrument demonstrates that key GNSS size, weight, and power (SWaP) requirements can be achieved while still maintaining the short- and long-term performance demonstrated by previous ground ion clocks. The MAFS breadboard serves as a flexible platform for optimizing further space clock development and guides engineering model design trades towards fabrication of an ion clock for space flight.

  14. Nanoparticle array based optical frequency selective surfaces: theory and design.

    Science.gov (United States)

    Saeidi, Chiya; van der Weide, Daniel

    2013-07-01

    We demonstrate a synthesis procedure for designing a bandstop optical frequency selective surface (FSS) composed of nanoparticle (NP) elements. The proposed FSS uses two-dimensional (2-D) periodic arrays of NPs with subwavelength unit-cell dimensions. We derive an equivalent circuit for a nanoparticle array (NPA) using the closed-form solution for a 2-D NPA excited by a plane wave in the limit of the dipole approximation, which includes contributions from both individual and collective plasmon modes. Using the extracted equivalent circuit, we demonstrate the synthesis of an optical FSS using cascaded NPA layers as coupled resonators, which we validate with both circuit-model and full-wave simulation for a third-order Butterworth bandstop prototype.

  15. High frequency modulation circuits based on photoconductive wide bandgap switches

    Science.gov (United States)

    Sampayan, Stephen

    2018-02-13

    Methods, systems, and devices for high voltage and/or high frequency modulation. In one aspect, an optoelectronic modulation system includes an array of two or more photoconductive switch units, each including a wide bandgap photoconductive (WBGP) material coupled between a first electrode and a second electrode; a light source optically coupled to the WBGP material of each photoconductive switch unit via a light path, in which the light path splits into multiple light paths to optically interface with each WBGP material, such that a time delay of emitted light exists along each subsequent split light path, and in which the WBGP material conducts an electrical signal when a light signal is transmitted to it; and an output to transmit the electrical signal conducted by each photoconductive switch unit. The time delay of the photons emitted through the light path is substantially equivalent to the time delay of the electrical signal.

  16. Femtosecond frequency comb based distance measurement in air.

    Science.gov (United States)

    Balling, Petr; Kren, Petr; Masika, Pavel; van den Berg, S A

    2009-05-25

    Interferometric measurement of distance using a femtosecond frequency comb is demonstrated and compared with a counting interferometer displacement measurement. A numerical model of pulse propagation in air is developed and the results are compared with experimental data for short distances. The relative agreement for distance measurement in known laboratory conditions is better than 10⁻⁷. According to the model, similar precision seems feasible even for long-distance measurement in air if conditions are sufficiently known. It is demonstrated that the relative width of the interferogram envelope even decreases with the measured length, and a fringe contrast higher than 90% could be obtained for kilometer distances in air, if the optimal spectral width for that length and wavelength is used. The possibility of comb radiation delivery to the interferometer by an optical fiber is shown by model and experiment, which is important from a practical point of view.
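
    One basic relation in comb-based ranging is worth making concrete: pulses repeat every 1/f_rep, so pulse cross-correlation measurements repeat with a pulse-to-pulse non-ambiguity range L_pp = c / (2·n_g·f_rep), where the factor 2 accounts for the round trip and n_g is the group index of air. The repetition rate and group index below are assumed example values, not those of this experiment:

    ```python
    # Pulse-to-pulse non-ambiguity range of frequency-comb ranging:
    #   L_pp = c / (2 * n_g * f_rep)
    # f_rep and n_g are illustrative assumptions.

    c = 299_792_458.0        # speed of light in vacuum, m/s (exact by definition)
    f_rep = 100e6            # comb pulse repetition rate, Hz (assumed)
    n_g = 1.000268           # approximate group index of air (assumed)

    L_pp = c / (2 * n_g * f_rep)   # about 1.5 m for a 100 MHz comb
    ```

    Distances longer than L_pp are resolved by combining the fractional interferometric measurement with a coarse estimate of the integer number of pulse intervals.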

  17. Hail frequency estimation across Europe based on a combination of overshooting top detections and the ERA-INTERIM reanalysis

    Science.gov (United States)

    Punge, H. J.; Bedka, K. M.; Kunz, M.; Reinbold, A.

    2017-12-01

    This article presents a hail frequency estimation based on the detection of cold overshooting cloud tops (OTs) from the Meteosat Second Generation (MSG) operational weather satellites, in combination with a hail-specific filter derived from the ERA-INTERIM reanalysis. This filter has been designed based on the atmospheric properties in the vicinity of hail reports registered in the European Severe Weather Database (ESWD). These include Convective Available Potential Energy (CAPE), 0-6-km bulk wind shear and freezing level height, evaluated at the nearest time step and interpolated from the reanalysis grid to the location of the hail report. Regions highly exposed to hail events include Northern Italy, followed by South-Eastern Austria and Eastern Spain. Pronounced hail frequency is also found in large parts of Eastern Europe, around the Alps, the Czech Republic, Southern Germany, Southern and Eastern France, and in the Iberian and Apennine mountain ranges.

  18. Interaction between Gender and Skill on Competitive State Anxiety Using the Time-to-Event Paradigm: What Roles Do Intensity, Direction, and Frequency Dimensions Play?

    Directory of Open Access Journals (Sweden)

    John E. Hagan

    2017-05-01

    Full Text Available Background and purpose: The functional understanding and examination of competitive anxiety responses as temporal events that unfold as time-to-competition moves closer has emerged as a topical research area within the domains of sport psychology. However, little is known from an inclusive and interaction-oriented perspective. Using multidimensional anxiety theory as a framework, the present study examined the temporal patterning of competitive anxiety, focusing on the dimensions of intensity, direction, and frequency of intrusions in athletes across gender and skill level. Methods: Elite and semi-elite table tennis athletes from the Ghanaian league (N = 90) completed a modified version of the Competitive State Anxiety Inventory-2 (CSAI-2), with the inclusion of the directional and frequency-of-intrusion scales, at three temporal phases (7 days, 2 days, and 1 h) prior to a competitive fixture. Results: Repeated-measures multivariate analyses of variance with follow-up analyses revealed significant interactions for between-subjects factors on all anxiety dimensions (intensity, direction, and frequency). Notably, elite (international) female athletes were less cognitively anxious, showed more facilitative interpretation toward somatic anxiety symptoms and experienced less frequent somatic anxiety symptoms than their male counterparts. However, both elite groups displayed an appreciable level of self-confidence. For time-to-event effects, both cognitive and somatic anxiety intensity fluctuated, whereas self-confidence showed a steady rise as competition neared. Somatic anxiety debilitative interpretation slightly improved 1 h before competition, whereas cognitive anxiety frequencies increased progressively during the entire preparatory phase. Conclusion: Findings suggest a more dynamic image of elite athletes’ pre-competitive anxiety responses than suggested by former studies, potentially influenced by cultural differences. The use of psychological

  19. Estimation of core-damage frequency to evolutionary ALWR [advanced light water reactor] due to seismic initiating events: Task 4.3.3

    International Nuclear Information System (INIS)

    Brooks, R.D.; Harrison, D.G.; Summitt, R.L.

    1990-04-01

    The Electric Power Research Institute (EPRI) is presently developing a requirements document for the design of advanced light water reactors (ALWRs). One of the basic goals of the EPRI ALWR Requirements Document is that the core-damage frequency for an ALWR shall be less than 1.0E-5. To aid in this effort, the Department of Energy's Advanced Reactor Severe Accident Program (ARSAP) initiated a functional probabilistic risk assessment (PRA) to determine how effectively the evolutionary plant requirements contained in the existing EPRI Requirements Document assure that this safety goal will be met. This report develops an approximation of the core-damage frequency due to seismic events for both evolutionary plant designs (pressurized-water reactor (PWR) and boiling-water reactor (BWR)) as modeled in the corresponding functional PRAs. Component fragility values were taken directly from information which has been submitted for inclusion in Appendix A to Volume 1 of the EPRI Requirements Document. The results show a seismic core-damage frequency of 5.2E-6 for PWRs and 5.0E-6 for BWRs. Combined with the internal initiators from the functional PRAs, the overall core-damage frequencies are 6.0E-6 for the PWR and BWR, both of which satisfy the 1.0E-5 EPRI goal. In addition, site-specific considerations, such as more rigid components and less conservative fragility data and seismic hazard curves, may further reduce these frequencies. The effect of seismic events on structures is not addressed in this generic evaluation and should be addressed separately on a design-specific basis. 7 refs., 6 figs., 3 tabs

  20. Tag and Neighbor based Recommender systems for Medical events

    DEFF Research Database (Denmark)

    Bayyapu, Karunakar Reddy; Dolog, Peter

    2010-01-01

    This paper presents an extension of a multifactor recommendation approach based on user tagging with term neighbours. Neighbours of words in tag vectors and documents allow hitting a larger set of documents, not only those matching direct tag vectors or document content. Tag...... in the situations where the quality of tags is lower. We discuss the approach on examples from the existing Medworm system to indicate its usefulness....

  1. GPS-based PWV for precipitation forecasting and its application to a typhoon event

    Science.gov (United States)

    Zhao, Qingzhi; Yao, Yibin; Yao, Wanqiang

    2018-01-01

    The temporal variability of precipitable water vapour (PWV) derived from Global Navigation Satellite System (GNSS) observations can be used to forecast precipitation events. A number of case studies of precipitation events have been analysed in Zhejiang Province, and a forecasting method for precipitation events is proposed. The PWV time series retrieved from Global Positioning System (GPS) observations was processed using a least-squares fitting method to obtain the linear tendency of PWV ascents and descents. The PWV increment over a short period (two to six hours) and the PWV slope over a longer period (a few hours to more than ten hours) during the PWV ascending period are used as predictive factors with which to forecast precipitation events. The numerical results show that about 80%-90% of precipitation events and more than 90% of heavy rain events can be forecasted two to six hours in advance with the proposed method. 5-minute PWV data derived from GPS observations based on real-time precise point positioning (RT-PPP) were used for the typhoon event that passed over Zhejiang Province between 10 and 12 July 2015. A good result was acquired with the proposed method: about 74% of precipitation events were predicted some ten to thirty minutes before their onset, with a false alarm rate of 18%. This study shows that GPS-based PWV is promising for short-term and now-casting precipitation forecasting.
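
    The two predictive factors described, a short-term PWV increment and a longer-term least-squares slope, can be sketched as follows. The window lengths and thresholds are illustrative assumptions, not the calibrated values from the study:

    ```python
    # Two predictive factors for a precipitation alert from a PWV time series:
    # (1) the short-term increment, (2) the longer-term least-squares slope.
    # Thresholds and window lengths below are illustrative assumptions.

    def ls_slope(y, dt_hours=1.0):
        """Least-squares slope of evenly spaced samples, in units per hour."""
        n = len(y)
        xs = [i * dt_hours for i in range(n)]
        mx, my = sum(xs) / n, sum(y) / n
        num = sum((x - mx) * (v - my) for x, v in zip(xs, y))
        den = sum((x - mx) ** 2 for x in xs)
        return num / den

    def forecast_precip(pwv_mm, incr_thresh_mm=5.0, slope_thresh=0.5):
        """Alert when both the ~3 h increment and the window slope are large."""
        increment = pwv_mm[-1] - pwv_mm[-4]   # change over the last three samples
        slope = ls_slope(pwv_mm)              # trend over the whole window
        return increment > incr_thresh_mm and slope > slope_thresh

    # Hourly PWV rising steadily by 2 mm/h over an 8 h window triggers an alert.
    rising = [30.0 + 2.0 * i for i in range(8)]
    ```

    Requiring both factors suppresses alerts from brief PWV spikes that lack a sustained ascending trend.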

  2. Future frequencies of extreme weather events in the National Wildlife Refuges of the conterminous U.S.

    Science.gov (United States)

    Martinuzzi, Sebastian; Allstadt, Andrew J.; Bateman, Brooke L.; Heglund, Patricia J.; Pidgeon, Anna M.; Thogmartin, Wayne E.; Vavrus, Stephen J.; Radeloff, Volker C.

    2016-01-01

    Climate change is a major challenge for managers of protected areas world-wide, and managers need information about future climate conditions within protected areas. Prior studies of climate change effects in protected areas have largely focused on average climatic conditions. However, extreme weather may have stronger effects on wildlife populations and habitats than changes in averages. Our goal was to quantify future changes in the frequency of extreme heat, drought, and false springs, during the avian breeding season, in 415 National Wildlife Refuges in the conterminous United States. We analyzed spatially detailed data on extreme weather frequencies during the historical period (1950–2005) and under different scenarios of future climate change by mid- and late-21st century. We found that all wildlife refuges will likely experience substantial changes in the frequencies of extreme weather, but the types of projected changes differed among refuges. Extreme heat is projected to increase dramatically in all wildlife refuges, whereas changes in droughts and false springs are projected to increase or decrease on a regional basis. Half of all wildlife refuges are projected to see increases in frequency (> 20% higher than the current rate) in at least two types of weather extremes by mid-century. Wildlife refuges in the Southwest and Pacific Southwest are projected to exhibit the fastest rates of change, and may deserve extra attention. Climate change adaptation strategies in protected areas, such as the U.S. wildlife refuges, may need to seriously consider future changes in extreme weather, including the considerable spatial variation of these changes.

  3. Measurement of a discontinuous object based on a dual-frequency grating

    Institute of Scientific and Technical Information of China (English)

    Qiao Nao-Sheng; Cai Xin-Hua; Yao Chun-Mei

    2009-01-01

    Dual-frequency grating measurement theory is proposed to enable the measurement of a discontinuous object. Firstly, the reason why frequency spectra are produced by low-frequency and high-frequency gratings in the frequency domain is analysed, and the relationship between the wrapped phase and the unwrapped phase is discussed. Secondly, a method combining the advantages of the two kinds of gratings is proposed: a single stripe is produced across the abrupt part of the measured object by a suitable low-frequency grating designed in MATLAB, so the phase produced by the low-frequency grating need not be unwrapped. An integer series of stripes is produced by a high-frequency grating designed in MATLAB; from the frequency ratio of the two gratings and the high-frequency wrapped phase, the high-frequency unwrapped phase is then obtained. To verify the correctness of the theoretical analysis, a steep discontinuous object of 600×600 pixels and 10.00 mm in height is simulated, and a discontinuous ladder-shaped object 32.00 mm in height is used in experiment. Both the simulation and the experiment restore the discontinuous object height accurately using the dual-frequency grating measurement theory.
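
    The key step, recovering the high-frequency phase from the fringe-free low-frequency phase and the frequency ratio, can be sketched as temporal phase unwrapping; the ratio and test value below are illustrative assumptions:

    ```python
    import math

    def unwrap_high(phi_low, phi_high_wrapped, ratio):
        """Recover the absolute high-frequency phase: the low-frequency phase,
        scaled by the frequency ratio, selects the integer fringe order k."""
        k = round((ratio * phi_low - phi_high_wrapped) / (2 * math.pi))
        return phi_high_wrapped + 2 * math.pi * k

    # Assume a true high-frequency phase of 25.0 rad and ratio f_high/f_low = 8.
    true_high = 25.0
    phi_low = true_high / 8                                   # fringe-free
    wrapped = math.atan2(math.sin(true_high), math.cos(true_high))  # wrap to (-pi, pi]
    recovered = unwrap_high(phi_low, wrapped, 8)              # -> 25.0
    ```

    Because the low-frequency phase spans the whole object in a single fringe, it fixes the fringe order even across a height discontinuity, which is exactly where spatial unwrapping of the high-frequency phase would fail.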

  4. Discrete Event System Based Pyroprocessing Modeling and Simulation: Oxide Reduction

    International Nuclear Information System (INIS)

    Lee, H. J.; Ko, W. I.; Choi, S. Y.; Kim, S. K.; Hur, J. M.; Choi, E. Y.; Im, H. S.; Park, K. I.; Kim, I. T.

    2014-01-01

    Dynamic changes arising from batch operation cannot be predicted by an equilibrium material flow. This study therefore began building a dynamic material balance model based on the previously developed pyroprocessing flowsheet. As mid- and long-term research, an integrated pyroprocessing simulator is being developed at the Korea Atomic Energy Research Institute (KAERI) to support reviews of technical feasibility, safeguards assessment, conceptual facility design, and economic feasibility evaluation. The most fundamental step in such a simulator development is to establish the dynamic material flow framework. This study focused on modeling pyroprocessing operations to implement a dynamic material flow. As a case study, oxide reduction was investigated in terms of dynamic material flow. DES-based modeling was applied to build a pyroprocessing operation model. A dynamic material flow, as the basic framework for integrated pyroprocessing, was successfully implemented through ExtendSim's internal database and item blocks. Complex operation logic behavior was verified, for example an oxide reduction process, in terms of dynamic material flow. Compared to an equilibrium material flow, a model-based dynamic material flow provides such detailed information that a careful analysis of every batch is necessary to confirm the dynamic material balance results. With the default oxide reduction scenario, the batch mass balance was verified against a one-year equilibrium mass balance. This study is still in progress towards a mid- and long-term goal: the development of a multi-purpose pyroprocessing simulator able to cope with safeguards assessment, economic feasibility, technical evaluation, conceptual design, and licensing support for a future pyroprocessing facility.

  5. Extreme events in total ozone over the Northern mid-latitudes: an analysis based on long-term data sets from five European ground-based stations

    Energy Technology Data Exchange (ETDEWEB)

    Rieder, Harald E. (Inst. for Atmospheric and Climate Science, ETH Zurich, Zurich (Switzerland)), e-mail: hr2302@columbia.edu; Jancso, Leonhardt M. (Inst. for Atmospheric and Climate Science, ETH Zurich, Zurich (Switzerland); Inst. for Meteorology and Geophysics, Univ. of Innsbruck, Innsbruck (Austria)); Di Rocco, Stefania (Inst. for Atmospheric and Climate Science, ETH Zurich, Zurich (Switzerland); Dept. of Geography, Univ. of Zurich, Zurich (Switzerland)) (and others)

    2011-11-15

    We apply methods from extreme value theory to identify extreme events in high (termed EHOs) and low (termed ELOs) total ozone and to describe the distribution tails (i.e. very high and very low values) of five long-term European ground-based total ozone time series. The influence of these extreme events on observed mean values, long-term trends and changes is analysed. The results show a decrease in EHOs and an increase in ELOs during the last decades, and establish that the observed downward trend in column ozone during the 1970s-1990s is strongly dominated by changes in the frequency of extreme events. Furthermore, it is shown that clear 'fingerprints' of atmospheric dynamics (NAO, ENSO) and chemistry [ozone depleting substances (ODSs), polar vortex ozone loss] can be found in the frequency distribution of ozone extremes, even if no attribution is possible from standard metrics (e.g. annual mean values). The analysis complements earlier analyses of the world's longest total ozone record at Arosa, Switzerland, confirming and revealing the strong influence of atmospheric dynamics on observed ozone changes. The results provide clear evidence that, in addition to ODSs, volcanic eruptions and strong/moderate ENSO and NAO events had a significant influence on column ozone in the European sector.

  6. Cognitive load and task condition in event- and time-based prospective memory: an experimental investigation.

    Science.gov (United States)

    Khan, Azizuddin; Sharma, Narendra K; Dixit, Shikha

    2008-09-01

    Prospective memory is memory for the realization of delayed intention. Researchers distinguish 2 kinds of prospective memory: event- and time-based (G. O. Einstein & M. A. McDaniel, 1990). Taking that distinction into account, the present authors explored participants' comparative performance under event- and time-based tasks. In an experimental study of 80 participants, the authors investigated the roles of cognitive load and task condition in prospective memory. Cognitive load (low vs. high) and task condition (event- vs. time-based task) were the independent variables. Accuracy in prospective memory was the dependent variable. Results showed significant differential effects under event- and time-based tasks. However, the effect of cognitive load was more detrimental in time-based prospective memory. Results also revealed that time monitoring is critical in successful performance of time estimation and so in time-based prospective memory. Similarly, participants' better performance on the event-based prospective memory task showed that they acted on the basis of environment cues. Event-based prospective memory was environmentally cued; time-based prospective memory required self-initiation.

  7. Research on frequency control strategy of interconnected region based on fuzzy PID

    Science.gov (United States)

    Zhang, Yan; Li, Chunlan

    2018-05-01

    In order to improve the frequency control performance of interconnected power grids and overcome the poor robustness and slow adjustment of traditional regulation, this paper puts forward a frequency control method based on fuzzy PID. The method takes the frequency deviation and tie-line power deviation of each area as the control objective, takes the regional frequency deviation and its rate of change as inputs, and uses fuzzy mathematics theory to adjust the PID control parameters online. By establishing a regional frequency control model of complementary hydro and thermal generation in MATLAB, the regional frequency control strategy is given, and three control modes (TBC-FTC, FTC-FTC, FFC-FTC) are simulated and analyzed. The simulation and experimental results show that this method achieves better control performance than traditional regional frequency regulation.
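
    The idea of adjusting PID parameters online from the frequency deviation and its rate of change can be caricatured with coarse fuzzy-style rules. A full implementation uses membership functions and a rule table; the breakpoints, scaling factors, and base gains below are illustrative assumptions, not the paper's design:

    ```python
    # Fuzzy-style online gain scheduling for a frequency-control PID.
    # big_e / big_de play the role of membership degrees for "error is large"
    # and "error is changing fast"; all numbers are illustrative assumptions.

    def fuzzy_gains(e, de, kp0=2.0, ki0=0.5, kd0=0.1):
        big_e = min(abs(e) / 0.5, 1.0)    # degree to which |e| is "large" (Hz scale)
        big_de = min(abs(de) / 0.2, 1.0)  # degree to which |de| is "large"
        kp = kp0 * (1.0 + 0.5 * big_e)    # large error -> raise proportional gain
        ki = ki0 * (1.0 - 0.5 * big_e)    # large error -> soften integral action
        kd = kd0 * (1.0 + 0.5 * big_de)   # fast change -> add damping
        return kp, ki, kd

    # A large 0.5 Hz deviation boosts kp and cuts ki relative to the base gains.
    kp, ki, kd = fuzzy_gains(e=0.5, de=0.0)
    ```

    Softening the integral gain while the error is large is a common anti-windup heuristic; the fuzzy rule base generalizes this kind of trade-off to a smooth interpolation between operating regimes.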

  8. Very low frequency radio events with a reduced intensity observed by the low-altitude DEMETER spacecraft

    Czech Academy of Sciences Publication Activity Database

    Záhlava, J.; Němec, F.; Santolík, Ondřej; Kolmašová, Ivana; Parrot, M.; Rodger, C. J.

    2015-01-01

    Roč. 120, č. 11 (2015), s. 9781-9794 ISSN 2169-9380 R&D Projects: GA ČR(CZ) GA14-31899S Grant - others:Rada Programu interní podpory projektů mezinárodní spolupráce AV ČR(CZ) M100421206 Institutional support: RVO:68378289 Keywords : magnetosphere * DEMETER * VLF radio events Subject RIV: BL - Plasma and Gas Discharge Physics Impact factor: 3.318, year: 2015 http://onlinelibrary.wiley.com/doi/10.1002/2015JA021607/full

  9. Assessing uncertainty in extreme events: Applications to risk-based decision making in interdependent infrastructure sectors

    International Nuclear Information System (INIS)

    Barker, Kash; Haimes, Yacov Y.

    2009-01-01

    Risk-based decision making often relies upon expert probability assessments, particularly for the consequences of disruptive events and when such events are extreme or catastrophic in nature. Naturally, such expert-elicited probability distributions can be fraught with errors, as they describe events which occur very infrequently and for which only sparse data exist. This paper presents a quantitative framework, the extreme event uncertainty sensitivity impact method (EE-USIM), for measuring the sensitivity of extreme event consequences to uncertainties in the parameters of the underlying probability distribution. The EE-USIM is demonstrated with the inoperability input-output model (IIM), a model with which to evaluate the propagation of inoperability throughout an interdependent set of economic and infrastructure sectors. The EE-USIM also makes use of a two-sided power distribution function generated by expert elicitation of extreme event consequences.

  10. Design a Learning-Oriented Fall Event Reporting System Based on Kirkpatrick Model.

    Science.gov (United States)

    Zhou, Sicheng; Kang, Hong; Gong, Yang

    2017-01-01

    Patient falls have been a severe problem in healthcare facilities around the world due to their prevalence and cost. Routine fall prevention training programs are not as effective as expected. Using event reporting systems is the trend for reducing patient safety events such as falls, although the systems have some limitations at the current stage. We summarized these limitations through a literature review, and developed an improved web-based fall event reporting system. The Kirkpatrick model, widely used in the business area for training program evaluation, has been integrated into the design of our system. Different from traditional event reporting systems that only collect and store reports, our system automatically annotates and analyzes the reported events, and provides users with timely knowledge support specific to the reported event. The paper illustrates the design of our system and how its features are intended to reduce patient falls by learning from previous errors.

  11. High-performance radio frequency transistors based on diameter-separated semiconducting carbon nanotubes

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Yu; Che, Yuchi; Zhou, Chongwu, E-mail: chongwuz@usc.edu [Department of Electrical Engineering, University of Southern California, Los Angeles, California 90089 (United States); Seo, Jung-Woo T.; Hersam, Mark C. [Department of Materials Science and Engineering and Department of Chemistry, Northwestern University, Evanston, Illinois 60208 (United States); Gui, Hui [Department of Chemical Engineering and Materials Science, University of Southern California, Los Angeles, California 90089 (United States)

    2016-06-06

    In this paper, we report high-performance radio-frequency transistors based on single-walled semiconducting carbon nanotubes with a refined average diameter of ∼1.6 nm. These diameter-separated carbon nanotube transistors show an excellent transconductance of 55 μS/μm and desirable drain current saturation with an output resistance of ∼100 kΩ·μm. Exceptional radio-frequency performance is also achieved, with current gain and power gain cut-off frequencies of 23 GHz and 20 GHz (extrinsic) and 65 GHz and 35 GHz (intrinsic), respectively. These radio-frequency metrics are among the highest reported for carbon nanotube thin-film transistors. This study provides a demonstration of radio-frequency transistors based on carbon nanotubes with tailored diameter distributions, which will guide the future application of carbon nanotubes in radio-frequency electronics.

  12. Dynamic Event Trees applied to full spectrum LOCA sequences: calculating the damage exceedance frequency with the Integrated Safety Analysis methodology

    International Nuclear Information System (INIS)

    Gomez-Magan, J. J.; Fernandez, I.; Gil, J.; Marrao, H.; Queral, C.; Gonzalez-Cadelo, J.; Montero-Mayorga, J.; Rivas, J.; Ibane-Llano, C.; Izquierdo, J. M.; Sanchez-Perea, M.; Melendez, E.; Hortal, J.

    2013-01-01

    The Integrated Safety Analysis (ISA) methodology, developed by the Spanish Nuclear Safety Council (CSN), has been applied to obtain the Dynamic Event Trees (DETs) for full spectrum Loss of Coolant Accidents (LOCAs) of a Westinghouse 3-loop PWR plant. The purpose of this ISA application is to obtain the Damage Exceedance Frequency (DEF) for the LOCA event tree by taking into account the uncertainties in the break area and the operator actuation time needed to cool down and depressurize the reactor coolant system by means of the steam generators. Simulations are performed with SCAIS, a software tool which includes a dynamic coupling with the MAAP thermal-hydraulic code. The results show the capability of the ISA methodology to obtain the DEF taking into account the time uncertainty in human actions. (Author)

  13. Changes in precipitation frequency and intensity in the vicinity of Taiwan: typhoon versus non-typhoon events

    International Nuclear Information System (INIS)

    Tu, Jien-Yi; Chou Chia

    2013-01-01

    The hourly rainfall at 21 ground stations in Taiwan is used to investigate changes in the frequency, intensity, and duration of rainfall, which can be divided into typhoon and non-typhoon rainfall, in the period of 1970–2010. As a whole, the frequency of rainfall shows a decreasing trend for lighter rain and an increasing trend for heavier rain. Also, the typhoon rainfall shows a significant increase for all intensities, while the non-typhoon rainfall exhibits a general trend of decreasing, particularly for lighter rain. In rainfall intensity, both typhoon and non-typhoon rainfall extremes become more intense, with an increased rate much greater than the Clausius–Clapeyron thermal scaling. Moreover, rainfall extremes associated with typhoons have tended to affect Taiwan rainfall for longer in recent decades. The more frequent, intense and long-lasting typhoon rainfall is mainly induced by the slower translation speed of the typhoons over the neighborhood of Taiwan, which could be associated with a weakening of steering flow in the western North Pacific and the northern South China Sea. (letter)

  14. The analysis of the initiating events in thorium-based molten salt reactor

    International Nuclear Information System (INIS)

    Zuo Jiaxu; Song Wei; Jing Jianping; Zhang Chunming

    2014-01-01

    The analysis and evaluation of initiating events is the starting point of nuclear safety analysis and probabilistic safety analysis, and a key element of nuclear safety analysis. Existing initiating-event analysis methods and experience focus on water reactors; there are as yet no methods or theories for the thorium-based molten salt reactor (TMSR). With TMSR research and development under way in China, initiating-event analysis and evaluation is increasingly important. Such research can build on PWR analysis theories and methods: based on the TMSR design, the theories and methods for its initiating-event analysis can be developed. The initiating-event lists and analysis methods of Generation II and III PWRs, the high-temperature gas-cooled reactor and the sodium-cooled fast reactor are summarized. Based on the TMSR design, its initiating events are then discussed and developed by logical analysis, and the analysis of TMSR initiating events is preliminarily studied and described. This research is important for clarifying the rules of event analysis and useful for TMSR design and nuclear safety analysis. (authors)

  15. A scheme for PET data normalization in event-based motion correction

    International Nuclear Information System (INIS)

    Zhou, Victor W; Kyme, Andre Z; Fulton, Roger; Meikle, Steven R

    2009-01-01

    Line of response (LOR) rebinning is an event-based motion-correction technique for positron emission tomography (PET) imaging that has been shown to compensate effectively for rigid motion. It involves the spatial transformation of LORs to compensate for motion during the scan, as measured by a motion tracking system. Each motion-corrected event is then recorded in the sinogram bin corresponding to the transformed LOR. It has been shown previously that the corrected event must be normalized using a normalization factor derived from the original LOR, that is, based on the pair of detectors involved in the original coincidence event. In general, due to data compression strategies (mashing), sinogram bins record events detected on multiple LORs. The number of LORs associated with a sinogram bin determines the relative contribution of each LOR. This paper provides a thorough treatment of event-based normalization during motion correction of PET data using LOR rebinning. We demonstrate theoretically and experimentally that normalization of the corrected event during LOR rebinning should account for the number of LORs contributing to the sinogram bin into which the motion-corrected event is binned. Failure to account for this factor may cause artifactual slice-to-slice count variations in the transverse slices and visible horizontal stripe artifacts in the coronal and sagittal slices of the reconstructed images. The theory and implementation of normalization in conjunction with the LOR rebinning technique is described in detail, and experimental verification of the proposed normalization method in phantom studies is presented.
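
    The bin-multiplicity correction argued for above can be sketched in a few lines. The data layout and names below are hypothetical, not the authors' implementation; the point is only that each corrected event is normalized with its *original* LOR and divided by the LOR multiplicity of its *destination* bin:

```python
import numpy as np

def rebin_events(events, norm_factor, lors_per_bin, n_bins):
    """Accumulate motion-corrected PET events into a normalized sinogram.

    events       -- (original_lor, corrected_bin) index pairs
    norm_factor  -- per-LOR normalization factors, indexed by the ORIGINAL
                    LOR (the detector pair that actually saw the event)
    lors_per_bin -- number of LORs mashed into each sinogram bin; omitting
                    this factor produces the slice-to-slice count artifacts
                    described in the paper
    """
    sino = np.zeros(n_bins)
    for orig_lor, corr_bin in events:
        sino[corr_bin] += norm_factor[orig_lor] / lors_per_bin[corr_bin]
    return sino
```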

  16. Gait Event Detection in Real-World Environment for Long-Term Applications: Incorporating Domain Knowledge Into Time-Frequency Analysis.

    Science.gov (United States)

    Khandelwal, Siddhartha; Wickstrom, Nicholas

    2016-12-01

    Detecting gait events is the key to many gait analysis applications that would benefit from continuous monitoring or long-term analysis. Most gait event detection algorithms using wearable sensors that offer potential for use in daily living have been developed from data collected in controlled indoor experiments. For real-world applications, however, it is essential that the analysis is carried out in humans' natural environment, which involves different gait speeds, changing walking terrains, varying surface inclinations and regular turns, among other factors. Existing domain knowledge, in the form of principles or underlying fundamental gait relationships, can be utilized to drive and support the data analysis in order to develop robust algorithms that can tackle real-world challenges in gait analysis. This paper presents a novel approach that shows how domain knowledge about human gait can be incorporated into time-frequency analysis to detect gait events from long-term accelerometer signals. The accuracy and robustness of the proposed algorithm are validated by experiments in indoor and outdoor environments with approximately 93 600 gait events in total. The proposed algorithm exhibits consistently high performance across all datasets, in both indoor and outdoor environments.

  17. An expert elicitation process to project the frequency and magnitude of Florida manatee mortality events caused by red tide (Karenia brevis)

    Science.gov (United States)

    Martin, Julien; Runge, Michael C.; Flewelling, Leanne J.; Deutsch, Charles J.; Landsberg, Jan H.

    2017-11-20

    Red tides (blooms of the harmful alga Karenia brevis) are one of the major sources of mortality for the Florida manatee (Trichechus manatus latirostris), especially in southwest Florida. It has been hypothesized that the frequency and severity of red tides may increase in the future because of global climate change and other factors. To improve our ecological forecast for the effects of red tides on manatee population dynamics and long-term persistence, we conducted a formal expert judgment process to estimate probability distributions for the frequency and relative magnitude of red-tide-related manatee mortality (RTMM) events over a 100-year time horizon in three of the four regions recognized as manatee management units in Florida. This information was used to update a population viability analysis for the Florida manatee (the Core Biological Model). We convened a panel of 12 experts in manatee biology or red-tide ecology; the panel met to frame, conduct, and discuss the elicitation. Each expert provided a best estimate and plausible low and high values (bounding a confidence level of 80 percent) for each parameter in each of three regions (Northwest, Southwest, and Atlantic) of the subspecies’ range (excluding the Upper St. Johns River region) for two time periods (0−40 and 41−100 years from present). We fitted probability distributions for each parameter, time period, and expert by using these three elicited values. We aggregated the parameter estimates elicited from individual experts and fitted a parametric distribution to the aggregated results.Across regions, the experts expected the future frequency of RTMM events to be higher than historical levels, which is consistent with the hypothesis that global climate change (among other factors) may increase the frequency of red-tide blooms. The experts articulated considerable uncertainty, however, about the future frequency of RTMM events. The historical frequency of moderate and intense RTMM (combined) in
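
    One standard way to turn three elicited values into a probability distribution is quantile matching: treat the best estimate as the median and the plausible low/high values as the 10th and 90th percentiles bounding the 80 percent interval. A minimal sketch using a normal distribution — an illustrative choice; the paper's actual fitting procedure and distribution families may differ:

```python
def normal_from_80pct(best, low, high):
    """Fit a normal distribution whose median equals the expert's best
    estimate and whose 10th/90th percentiles match the elicited low/high
    values (illustrative quantile matching, not the study's exact method)."""
    z90 = 1.2816                      # standard-normal 90th percentile
    mu = best                         # median of a normal is its mean
    sigma = (high - low) / (2.0 * z90)
    return mu, sigma

mu, sigma = normal_from_80pct(best=2.0, low=1.0, high=3.0)
```

Per-expert distributions fitted this way can then be aggregated, e.g. as an equally weighted mixture, before fitting a single parametric distribution to the pooled result.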

  18. Reinnervation following catheter-based radio-frequency renal denervation.

    Science.gov (United States)

    Booth, Lindsea C; Nishi, Erika E; Yao, Song T; Ramchandra, Rohit; Lambert, Gavin W; Schlaich, Markus P; May, Clive N

    2015-04-20

    What is the topic of this review? Does catheter-based renal denervation effectively denervate the afferent and efferent renal nerves and does reinnervation occur? What advances does it highlight? Following catheter-based renal denervation, the afferent and efferent responses to electrical stimulation were abolished, renal sympathetic nerve activity was absent, and levels of renal noradrenaline and immunohistochemistry for tyrosine hydroxylase and calcitonin gene-related peptide were significantly reduced. By 11 months after renal denervation, both the functional responses and anatomical markers of afferent and efferent renal nerves had returned to normal, indicating reinnervation. Renal denervation reduces blood pressure in animals with experimental hypertension and, recently, catheter-based renal denervation was shown to cause a prolonged decrease in blood pressure in patients with resistant hypertension. The randomized, sham-controlled Symplicity HTN-3 trial failed to meet its primary efficacy end-point, but there is evidence that renal denervation was incomplete in many patients. Currently, there is little information regarding the effectiveness of catheter-based renal denervation and the extent of reinnervation. We assessed the effectiveness of renal nerve denervation with the Symplicity Flex catheter and the functional and anatomical reinnervation at 5.5 and 11 months postdenervation. In anaesthetized, non-denervated sheep, there was a high level of renal sympathetic nerve activity, and electrical stimulation of the renal nerve increased blood pressure and reduced heart rate (afferent response) and caused renal vasoconstriction and reduced renal blood flow (efferent response). Immediately after renal denervation, renal sympathetic nerve activity and the responses to electrical stimulation were absent, indicating effective denervation. By 11 months after denervation, renal sympathetic nerve activity was present and the responses to electrical stimulation

  19. Study on frequency characteristics of wireless power transmission system based on magnetic coupling resonance

    Science.gov (United States)

    Liang, L. H.; Liu, Z. Z.; Hou, Y. J.; Zeng, H.; Yue, Z. K.; Cui, S.

    2017-11-01

    In order to study the frequency characteristics of a wireless power transmission system based on magnetic coupling resonance, a circuit model of such a system is established and the influence of the load on the frequency characteristics is analysed. Circuit coupling theory is used to derive the minimum load required to suppress frequency splitting. Simulation and experimental results verify that when the load falls below a certain value, frequency splitting occurs, and that increasing the load can effectively suppress the frequency splitting phenomenon. A power regulation scheme for the magnetically coupled resonant wireless charging system is also given. This study provides a theoretical basis for load selection and power regulation in wireless power transmission systems.
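
    The frequency-splitting behavior can be reproduced with a standard two-coil circuit model: two magnetically coupled series RLC resonators solved over a frequency sweep. The component values below are illustrative, not taken from the paper; with a small load the delivered-power curve shows two peaks, and a sufficiently large load restores a single resonance:

```python
import numpy as np

L, R_coil, k, f0 = 20e-6, 0.5, 0.2, 1e6          # illustrative values
C = 1.0 / ((2 * np.pi * f0) ** 2 * L)             # resonate both coils at f0
M = k * L                                         # mutual inductance

def load_power(f, RL, V=1.0):
    # KVL for the coupled pair:  Z1*I1 - jwM*I2 = V ;  -jwM*I1 + Z2*I2 = 0
    w = 2 * np.pi * f
    X = w * L - 1.0 / (w * C)                     # identical coil reactance
    I2 = 1j * w * M * V / ((R_coil + 1j * X) * (R_coil + RL + 1j * X)
                           + (w * M) ** 2)
    return abs(I2) ** 2 * RL

def count_peaks(RL):
    f = np.linspace(0.7e6, 1.3e6, 4001)
    p = np.array([load_power(fi, RL) for fi in f])
    return int(np.sum((p[1:-1] > p[:-2]) & (p[1:-1] > p[2:])))
```

With these values `count_peaks(5)` exhibits the split (two maxima around f0/sqrt(1±k)), while `count_peaks(10000)` gives a single resonance, consistent with the observation that increasing the load suppresses splitting.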

  20. THE EFFECT OF DEVOTEE-BASED BRAND EQUITY ON RELIGIOUS EVENTS

    Directory of Open Access Journals (Sweden)

    MUHAMMAD JAWAD IQBAL

    2016-04-01

    Full Text Available The objective of this research is to apply the DBBE model to discover constructs for measuring a religious event as a business brand on the basis of devotees' perception. Structural equation modeling (SEM) was applied to test the hypothesized model, with confirmatory factor analysis (CFA) used to analyze the measurement model and assess model fit. The sample size was 500. Brand loyalty was directly affected by image and quality. This information may be beneficial to event management and sponsors in building the brand and operating visitors' destinations. More importantly, the brands of these religious events in Pakistan can be built into strong tourism products.

  1. Three-frequency BDS precise point positioning ambiguity resolution based on raw observables

    Science.gov (United States)

    Li, Pan; Zhang, Xiaohong; Ge, Maorong; Schuh, Harald

    2018-02-01

    All BeiDou navigation satellite system (BDS) satellites transmit signals on three frequencies, which brings new opportunities and challenges for high-accuracy precise point positioning (PPP) with ambiguity resolution (AR). This paper proposes an effective uncalibrated phase delay (UPD) estimation and AR strategy based on a raw PPP model. First, triple-frequency raw PPP models are developed; the observation model and stochastic model are designed and extended to accommodate the third frequency. Then, the UPD is parameterized in raw-frequency form while being estimated with the high-precision, low-noise integer linear combinations of float ambiguities derived by ambiguity decorrelation. Third, with UPDs corrected, the LAMBDA method is used to resolve the full or partial set of ambiguities that can be fixed. This method can be easily and flexibly extended to dual-, triple- or even more frequencies. To verify the effectiveness and performance of triple-frequency PPP AR, tests with real BDS data from 90 stations over 21 days were performed in static mode. Data were processed with three strategies: BDS triple-frequency ambiguity-float PPP, BDS triple-frequency PPP with dual-frequency (B1/B2) AR, and BDS triple-frequency PPP with three-frequency AR. The results show that, compared with the ambiguity-float solution, performance in terms of convergence time and positioning biases can be significantly improved by AR. Among the three groups of solutions, triple-frequency PPP AR achieved the best performance. Compared with dual-frequency AR, the additional third frequency noticeably improved the position estimates during the initialization phase and in constrained environments where dual-frequency PPP AR is limited by the small number of usable satellites.

  2. Prediction Equations of Energy Expenditure in Chinese Youth Based on Step Frequency during Walking and Running

    Science.gov (United States)

    Sun, Bo; Liu, Yu; Li, Jing Xian; Li, Haipeng; Chen, Peijie

    2013-01-01

    Purpose: This study set out to examine the relationship between step frequency and velocity to develop a step frequency-based equation to predict Chinese youth's energy expenditure (EE) during walking and running. Method: A total of 173 boys and girls aged 11 to 18 years old participated in this study. The participants walked and ran on a…

  3. Joint Angle and Frequency Estimation Using Multiple-Delay Output Based on ESPRIT

    Science.gov (United States)

    Xudong, Wang

    2010-12-01

    This paper presents a novel ESPRIT-based algorithm for joint angle and frequency estimation using multiple-delay output (MDJAFE). The algorithm estimates angles and frequencies jointly, and the use of multiple delayed outputs greatly improves estimation accuracy compared with a conventional algorithm. The useful behavior of the proposed algorithm is verified by simulations.
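
    The shift-invariance idea at the heart of ESPRIT can be illustrated for frequency estimation alone. This is a generic single-channel sketch, not the paper's MDJAFE algorithm; the window length m and signal parameters are illustrative:

```python
import numpy as np

def esprit_frequencies(x, n_sources, m=20):
    """Estimate n_sources complex-exponential frequencies from samples x."""
    N = len(x)
    # Data matrix of m-sample overlapping windows (Hankel structure).
    X = np.column_stack([x[i:i + m] for i in range(N - m + 1)])
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    Us = U[:, :n_sources]                  # signal subspace
    # Shift invariance: Us[1:] ~= Us[:-1] @ Psi, eig(Psi) = exp(j*2*pi*f)
    Psi = np.linalg.lstsq(Us[:-1], Us[1:], rcond=None)[0]
    return np.sort(np.angle(np.linalg.eigvals(Psi)) / (2 * np.pi))

rng = np.random.default_rng(0)
n = np.arange(200)
x = (np.exp(2j * np.pi * 0.1 * n) + np.exp(2j * np.pi * 0.25 * n)
     + 0.01 * (rng.standard_normal(200) + 1j * rng.standard_normal(200)))
f_est = esprit_frequencies(x, 2)
```

With the two tones at normalized frequencies 0.1 and 0.25, the sorted eigenvalue angles recover both to within about 1e-3; the MDJAFE idea of exploiting several delayed outputs tightens exactly this subspace estimate.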

  4. The RFI situation for a space-based low-frequency radio astronomy instrument

    NARCIS (Netherlands)

    Bentum, Marinus Jan; Boonstra, A.J.

    2016-01-01

    Space-based ultra-long-wavelength radio astronomy has recently gained a lot of interest. Techniques to open the virtually unexplored frequency band below 30 MHz are now within reach. Due to the ionosphere and the radio frequency interference (RFI) on Earth, exploring this frequency band

  5. Model-based Estimation of High Frequency Jump Diffusions with Microstructure Noise and Stochastic Volatility

    NARCIS (Netherlands)

    Bos, Charles S.

    2008-01-01

    When analysing the volatility related to high frequency financial data, mostly non-parametric approaches based on realised or bipower variation are applied. This article instead starts from a continuous time diffusion model and derives a parametric analog at high frequency for it, allowing

  6. Optical fiber strain sensor using fiber resonator based on frequency comb Vernier spectroscopy

    DEFF Research Database (Denmark)

    Zhang, Liang; Lu, Ping; Chen, Li

    2012-01-01

    A novel (to the best of our knowledge) optical fiber strain sensor using a fiber ring resonator based on frequency comb Vernier spectroscopy is proposed and demonstrated. A passively mode-locked optical fiber laser is employed to generate a phase-locked frequency comb. Strain applied to the optical fib...

  7. Flextensional fiber Bragg grating-based accelerometer for low frequency vibration measurement

    Institute of Scientific and Technical Information of China (English)

    Jinghua Zhang; Xueguang Qiao; Manli Hu; Zhongyao Feng; Hong Gao; Yang Yang; Rui Zhou

    2011-01-01

    The intelligent structural health monitoring method, which uses a fiber Bragg grating (FBG) sensor, is a new approach in the field of civil engineering. However, it lacks a reliable FBG-based accelerometer for taking structural low frequency vibration measurements. In this letter, a flextensional FBG-based accelerometer is proposed and demonstrated. The experimental results indicate that the natural frequency of the developed accelerometer is 16.7 Hz, with a high sensitivity of 410.7 pm/g. In addition, it has a broad and flat response over low frequencies ranging from 1 to 10 Hz. The natural frequency and sensitivity of the accelerometer can be tuned by adding mass to tailor the sensor performance to specific applications. Experimental results are presented to demonstrate the good performance of the proposed FBG-based accelerometer. These results show that the proposed accelerometer is satisfactory for low frequency vibration measurements.
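
    The mass-tuning remark follows from the textbook spring-mass relation f = (1/2π)√(k/m): added mass lowers the natural frequency as the inverse square root. A small sketch — the 10 g effective mass and the 12 Hz target are hypothetical values, not figures from the letter:

```python
def added_mass_for_target(f0, m_eff, f_target):
    """Mass to add so a spring-mass resonator at f0 (Hz) with effective
    mass m_eff (kg) shifts to f_target.  From f = (1/2*pi)*sqrt(k/m),
    with k fixed:  m_new = m_eff * (f0 / f_target)**2."""
    return m_eff * (f0 / f_target) ** 2 - m_eff

# Hypothetical 10 g effective mass; shift the 16.7 Hz resonance down to 12 Hz.
dm = added_mass_for_target(16.7, 0.010, 12.0)
```

The same relation shows why sensitivity rises as the natural frequency is tuned down: a softer ratio k/m means larger displacement, and hence larger grating strain, per unit acceleration.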

  8. WILBER and PyWEED: Event-based Seismic Data Request Tools

    Science.gov (United States)

    Falco, N.; Clark, A.; Trabant, C. M.

    2017-12-01

    WILBER and PyWEED are two user-friendly tools for requesting event-oriented seismic data. Both tools provide interactive maps and other controls for browsing and filtering event and station catalogs, and downloading data for selected event/station combinations, where the data window for each event/station pair may be defined relative to the arrival time of seismic waves from the event to that particular station. Both tools allow data to be previewed visually, and can download data in standard miniSEED, SAC, and other formats, complete with relevant metadata for performing instrument correction. WILBER is a web application requiring only a modern web browser. Once the user has selected an event, WILBER identifies all data available for that time period, and allows the user to select stations based on criteria such as the station's distance and orientation relative to the event. When the user has finalized their request, the data is collected and packaged on the IRIS server, and when it is ready the user is sent a link to download. PyWEED is a downloadable, cross-platform (Macintosh / Windows / Linux) application written in Python. PyWEED allows a user to select multiple events and stations, and will download data for each event/station combination selected. PyWEED is built around the ObsPy seismic toolkit, and allows direct interaction and control of the application through a Python interactive console.

  9. A semi-supervised learning framework for biomedical event extraction based on hidden topics.

    Science.gov (United States)

    Zhou, Deyu; Zhong, Dayou

    2015-05-01

    Scientists have devoted decades of effort to understanding the interactions between proteins or RNA production. This information might empower current knowledge on drug reactions or the development of certain diseases. Nevertheless, the lack of explicit structure in life-science literature, one of the most important sources of this information, prevents computer-based systems from accessing it. Therefore, biomedical event extraction, which automatically acquires knowledge of molecular events from research articles, has recently attracted community-wide efforts. Most approaches are based on statistical models and require large-scale annotated corpora to estimate model parameters precisely; however, such corpora are usually difficult to obtain in practice. Employing un-annotated data through semi-supervised learning is therefore a feasible solution for biomedical event extraction and is attracting growing interest. In this paper, a semi-supervised learning framework based on hidden topics for biomedical event extraction is presented. In this framework, sentences in the un-annotated corpus are elaborately and automatically assigned event annotations based on their distances to sentences in the annotated corpus. More specifically, not only the structures of the sentences but also the hidden topics embedded in the sentences are used to describe the distance. The sentences and newly assigned event annotations, together with the annotated corpus, are employed for training. Experiments were conducted on the multi-level event extraction corpus, a gold-standard corpus. Experimental results show that the proposed framework achieves an improvement of more than 2.2% in F-score on biomedical event extraction when compared to the state-of-the-art approach. The results suggest that by incorporating un-annotated data, the proposed framework indeed improves the performance of the state-of-the-art event extraction system, and the similarity between sentences might be precisely
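
    The distance-based annotation transfer can be sketched as follows. The equal 0.5/0.5 weighting between structural features and topic distributions, and all feature values, are illustrative assumptions rather than the paper's exact formulation:

```python
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def assign_annotation(sent_struct, sent_topics,
                      ann_structs, ann_topics, ann_labels, w=0.5):
    """Give an un-annotated sentence the event label of its nearest
    annotated sentence, where nearness blends structural similarity with
    similarity of hidden-topic distributions (illustrative weighting)."""
    scores = [w * cosine(sent_struct, s) + (1 - w) * cosine(sent_topics, t)
              for s, t in zip(ann_structs, ann_topics)]
    return ann_labels[int(np.argmax(scores))]
```

The newly labelled sentences would then be pooled with the annotated corpus for training, as the framework describes.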

  10. Estimating High-Frequency Based (Co-) Variances: A Unified Approach

    DEFF Research Database (Denmark)

    Voev, Valeri; Nolte, Ingmar

    We propose a unified framework for estimating integrated variances and covariances based on simple OLS regressions, allowing for a general market microstructure noise specification. We show that our estimators can outperform, in terms of the root mean squared error criterion, the most recent and commonly applied estimators, such as the realized kernels of Barndorff-Nielsen, Hansen, Lunde & Shephard (2006), the two-scales realized variance of Zhang, Mykland & Aït-Sahalia (2005), the Hayashi & Yoshida (2005) covariance estimator, and the realized variance and covariance with the optimal sampling...
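
    The bias these estimators compete against is easy to reproduce: naive realized variance over noisy high-frequency prices is dominated by the microstructure noise, while a subsampling correction such as the two-scales estimator of Zhang, Mykland & Aït-Sahalia (2005) recovers the integrated variance. The simulation below is an illustrative sketch, not the authors' OLS-based estimator:

```python
import numpy as np

rng = np.random.default_rng(1)
n, iv, eta = 23400, 1e-4, 5e-4      # ticks/day, integrated variance, noise sd
x = np.cumsum(rng.standard_normal(n) * np.sqrt(iv / n))  # efficient log-price
y = x + eta * rng.standard_normal(n)                     # observed: + noise

def rv_naive(p):
    r = np.diff(p)
    return float(r @ r)             # biased upward by roughly 2*n*eta**2

def tsrv(p, K=100):
    """Two-scales realized variance: average sparse-grid RV minus a
    bias correction based on the full-grid RV."""
    n_ret = len(p) - 1
    rK = p[K:] - p[:-K]
    rv_avg = float(rK @ rK) / K     # average RV over K sparse subgrids
    nbar = (n_ret - K + 1) / K
    return rv_avg - (nbar / n_ret) * rv_naive(p)
```

With these values `rv_naive(y)` sits two orders of magnitude above the true integrated variance of 1e-4, while `tsrv(y)` lands close to it.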

  11. Frequency support capability of variable speed wind turbine based on electromagnetic coupler

    DEFF Research Database (Denmark)

    You, Rui; Barahona Garzón, Braulio; Chai, Jianyun

    2015-01-01

    In the variable speed wind turbine based on an electromagnetic coupler (WT-EMC), a synchronous generator is directly coupled with the grid, so, like conventional power plants, a WT-EMC is able to support grid frequency inherently. But because of the reduced inertia of its synchronous generator, its frequency support capability has to be enhanced. In this paper, the frequency support capability of WT-EMC is studied at three typical wind conditions and with two control strategies - droop control and inertial control - used to enhance its frequency support capability. The synchronous generator speed, more stable than the grid
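
    The two strategies can be compared in a one-bus swing-equation sketch, where the turbine adds power proportional to the frequency deviation (droop) or to its derivative (inertial control). All per-unit parameters are illustrative, not taken from the paper:

```python
H, D = 4.0, 1.0            # system inertia constant (s), load damping (pu)
dt, T = 0.01, 20.0         # Euler step and horizon (s)
dP_load = 0.1              # sudden load increase (pu)

def frequency_nadir(k_droop=0.0, k_inertia=0.0):
    """Deepest frequency deviation after the load step, with the turbine
    contributing -k_droop*df - k_inertia*d(df)/dt of support power."""
    f, dfdt, nadir = 0.0, 0.0, 0.0
    for _ in range(int(T / dt)):
        p_support = -k_droop * f - k_inertia * dfdt
        dfdt = (-dP_load + p_support - D * f) / (2 * H)   # swing equation
        f += dfdt * dt
        nadir = min(nadir, f)
    return nadir

nadir_no = frequency_nadir()
nadir_droop = frequency_nadir(k_droop=20.0)
nadir_inertia = frequency_nadir(k_inertia=8.0)
```

Droop control mainly raises the quasi-steady-state frequency, while inertial control slows the initial decline; both leave a shallower nadir than the no-support case.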

  12. A general theory on frequency and time-frequency analysis of irregularly sampled time series based on projection methods - Part 1: Frequency analysis

    Science.gov (United States)

    Lenoir, Guillaume; Crucifix, Michel

    2018-03-01

    We develop a general framework for the frequency analysis of irregularly sampled time series. It is based on the Lomb-Scargle periodogram, but extended to algebraic operators accounting for the presence of a polynomial trend in the model for the data, in addition to a periodic component and a background noise. Special care is devoted to the correlation between the trend and the periodic component. This new periodogram is then cast into the Welch overlapping segment averaging (WOSA) method in order to reduce its variance. We also design a test of significance for the WOSA periodogram, against the background noise. The model for the background noise is a stationary Gaussian continuous autoregressive-moving-average (CARMA) process, more general than the classical Gaussian white or red noise processes. CARMA parameters are estimated following a Bayesian framework. We provide algorithms that compute the confidence levels for the WOSA periodogram and fully take into account the uncertainty in the CARMA noise parameters. Alternatively, a theory using point estimates of CARMA parameters provides analytical confidence levels for the WOSA periodogram, which are more accurate than Markov chain Monte Carlo (MCMC) confidence levels and, below some threshold for the number of data points, less costly in computing time. We then estimate the amplitude of the periodic component with least-squares methods, and derive an approximate proportionality between the squared amplitude and the periodogram. This proportionality leads to a new extension for the periodogram: the weighted WOSA periodogram, which we recommend for most frequency analyses with irregularly sampled data. The estimated signal amplitude also permits filtering in a frequency band. Our results generalise and unify methods developed in the fields of geosciences, engineering, astronomy and astrophysics. They also constitute the starting point for an extension to the continuous wavelet transform developed in a companion
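
    The classical Lomb-Scargle periodogram that this framework extends can be written compactly. The sketch below omits the paper's polynomial-trend terms, WOSA segment averaging, and CARMA-based significance test:

```python
import numpy as np

def lomb_scargle(t, y, freqs):
    """Classical Lomb-Scargle periodogram for irregular sampling times t."""
    y = y - y.mean()
    power = np.empty(len(freqs))
    for i, f in enumerate(freqs):
        w = 2 * np.pi * f
        # Time offset tau makes the sine/cosine terms orthogonal.
        tau = np.arctan2(np.sum(np.sin(2 * w * t)),
                         np.sum(np.cos(2 * w * t))) / (2 * w)
        c, s = np.cos(w * (t - tau)), np.sin(w * (t - tau))
        power[i] = 0.5 * ((y @ c) ** 2 / (c @ c) + (y @ s) ** 2 / (s @ s))
    return power

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0.0, 50.0, 300))        # irregular sampling times
y = np.sin(2 * np.pi * 0.7 * t) + 0.3 * rng.standard_normal(300)
freqs = np.linspace(0.05, 2.0, 400)
f_peak = freqs[np.argmax(lomb_scargle(t, y, freqs))]
```

On 300 irregularly spaced, noisy samples of a 0.7-cycle-per-unit sine, the periodogram peaks at the true frequency; the paper's contribution is to make the trend handling, variance reduction, and noise test rigorous around this core estimator.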

  13. Frequency band adjustment match filtering based on variable frequency GPR antennas pairing scheme for shallow subsurface investigations

    Science.gov (United States)

    Shaikh, Shahid Ali; Tian, Gang; Shi, Zhanjie; Zhao, Wenke; Junejo, S. A.

    2018-02-01

    Ground penetrating radar (GPR) is an efficient tool for subsurface geophysical investigations, particularly at shallow depths; its non-destructiveness, cost efficiency, and data reliability make it an ideal tool for shallow subsurface work. In the present study, variations in the central frequency of the transmitting and receiving GPR antennas (Tx-Rx) were analyzed, and frequency band adjustment match filters were fabricated and tested accordingly. Normally both antennas share the same central frequency, whereas in this study the Tx and Rx frequencies were varied experimentally and the response deduced. Instead of the normally adopted three pairs, a total of nine Tx-Rx pairs were formed from 50 MHz, 100 MHz, and 200 MHz antennas. The experimental data were acquired at the designated near-surface geophysics test site of Zhejiang University, Hangzhou, China. Impulse response analysis of the data acquired with the conventional as well as the varied Tx-Rx pairs revealed different swap effects. The frequency band and exploration depth are governed by the transmitting frequency rather than the receiving frequency, while the receiving frequency affects resolution; more noise was observed when a high-frequency transmitter was combined with a low-frequency receiver. On the basis of these variable results, two frequency band adjustment match filters were fabricated: the constant frequency transmitting (CFT) and the variable frequency transmitting (VFT) frequency band adjustment match filters. In principle, the lower- and higher-frequency components are matched and then incorporated with the intermediate one. This study therefore indicates that a Tx-Rx combination of low-frequency transmitting with high-frequency receiving is the better choice. 
Moreover, both filters provide better radargrams than the raw data; the result of the VFT frequency band adjustment filter is

  14. Microseismic Event Relocation and Focal Mechanism Estimation Based on PageRank Linkage

    Science.gov (United States)

    Aguiar, A. C.; Myers, S. C.

    2017-12-01

    Microseismicity associated with enhanced geothermal systems (EGS) is key to understanding how subsurface stimulation can modify stress, fracture rock, and increase permeability. Large numbers of microseismic events are commonly associated with hydroshearing an EGS, making data mining methods useful in their analysis. We focus on PageRank, originally developed as Google's search engine, and subsequently adapted for use in seismology to detect low-frequency earthquakes by linking events directly and indirectly through cross-correlation (Aguiar and Beroza, 2014). We expand on this application by using PageRank to define signal-correlation topology for micro-earthquakes from the Newberry Volcano EGS in Central Oregon, which has been stimulated twice using high-pressure fluid injection. We create PageRank signal families from both data sets and compare these to the spatial and temporal proximity of the associated earthquakes. PageRank families are relocated using differential travel times measured by waveform cross-correlation (CC) and the Bayesloc approach (Myers et al., 2007). Prior to relocation, events are loosely clustered, with some events at a distance from the cluster; after relocation, event families are tightly clustered. Indirect linkage of signals using PageRank is a reliable way to increase the number of events confidently determined to be similar, suggesting an efficient and effective grouping of earthquakes with similar physical characteristics (i.e., location, focal mechanism, stress drop). We further explore the possibility of using PageRank families to identify events with similar relative phase polarities and to estimate focal mechanisms following the method of Shelly et al. (2016), where CC measurements are used to determine individual polarities within event clusters. Given a positive result, PageRank might be a useful tool in adaptive approaches to enhance production at well-instrumented geothermal sites. 
Prepared by LLNL under Contract DE-AC52-07NA27344
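
    The indirect-linkage idea can be sketched with PageRank over a toy cross-correlation network: events whose waveform correlation exceeds a threshold are linked, and the stationary rank surfaces well-connected (repeating) events. The correlation matrix and the 0.6 threshold below are made up for illustration:

```python
import numpy as np

def pagerank(adj, d=0.85, tol=1e-10):
    """Power iteration for PageRank on an adjacency matrix; rows with no
    out-links are treated as linking uniformly to all nodes."""
    n = adj.shape[0]
    out = adj.sum(axis=1, keepdims=True)
    P = np.where(out > 0, adj / np.maximum(out, 1), 1.0 / n)  # row-stochastic
    r = np.full(n, 1.0 / n)
    while True:
        r_new = (1 - d) / n + d * (r @ P)
        if np.abs(r_new - r).sum() < tol:
            return r_new
        r = r_new

cc = np.array([[1.0, 0.9, 0.8, 0.1],      # toy waveform cross-correlations
               [0.9, 1.0, 0.7, 0.2],
               [0.8, 0.7, 1.0, 0.1],
               [0.1, 0.2, 0.1, 1.0]])
adj = ((cc >= 0.6) & ~np.eye(4, dtype=bool)).astype(float)
rank = pagerank(adj)
```

Events 0-2, which correlate strongly with one another, receive high ranks; the weakly correlated event 4th event ranks lowest. In the real application, thresholded CC links play this role across thousands of events, and high-rank families seed the relocation step.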

  15. Structural Health Monitoring Based on Combined Structural Global and Local Frequencies

    Directory of Open Access Journals (Sweden)

    Jilin Hou

    2014-01-01

    Full Text Available This paper presents a parameter estimation method for structural health monitoring based on combined measured structural global frequencies and structural local frequencies. First, a global test is performed to obtain the low-order modes, which reflect global information about the structure. Secondly, mass is added to a structural member to enhance its local dynamic characteristics and give the member a local primary frequency, which belongs to the structural local frequencies and is sensitive to local parameters. The parameters of the structure can then be optimized accurately using the combined structural global and local frequencies. The effectiveness and accuracy of the proposed method are verified by an experiment on a space truss.

  16. A New Quantum Key Distribution Scheme Based on Frequency and Time Coding

    International Nuclear Information System (INIS)

    Chang-Hua, Zhu; Chang-Xing, Pei; Dong-Xiao, Quan; Jing-Liang, Gao; Nan, Chen; Yun-Hui, Yi

    2010-01-01

    A new scheme of quantum key distribution (QKD) using frequency and time coding is proposed, in which the security is based on the frequency-time uncertainty relation. In this scheme, the binary information sequence is encoded randomly on either the central frequency or the time delay of the optical pulse at the sender. When frequency coding is selected, the central frequency of the single-photon pulse is set to ω1 for bit 0 and to ω2 for bit 1; when time coding is selected, the single-photon pulse is not delayed for bit 0 and is delayed by τ for bit 1. At the receiver, either the frequency or the time delay of the pulse is measured randomly, and the final key is obtained after basis comparison, data reconciliation and privacy amplification. With the proposed method, the effect of noise in the fiber channel and environment on the QKD system can be reduced effectively
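
    The basis-comparison step of the scheme behaves like BB84 sifting, with {frequency, time} playing the role of the two bases. A toy simulation under an ideal, noiseless channel with no eavesdropper (an illustrative assumption; the quantum measurement itself is not modeled):

```python
import random

random.seed(3)

def run_sifting(n_pulses):
    """Sender encodes each bit in a random basis (Frequency or Time); the
    receiver measures a random observable; only matching-basis pulses
    survive basis comparison."""
    key_a, key_b = [], []
    for _ in range(n_pulses):
        bit = random.randint(0, 1)
        send_basis = random.choice("FT")   # F = frequency coding, T = time
        recv_basis = random.choice("FT")   # observable actually measured
        if send_basis == recv_basis:       # kept after basis comparison
            key_a.append(bit)
            key_b.append(bit)              # ideal channel: outcome = bit
    return key_a, key_b

key_a, key_b = run_sifting(1000)
```

About half the pulses survive sifting, and in the ideal case the two sifted keys agree exactly; data reconciliation and privacy amplification then operate on this sifted key as the abstract describes.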

  17. Micro-Doppler Signal Time-Frequency Algorithm Based on STFRFT

    Directory of Open Access Journals (Sweden)

    Cunsuo Pang

    2016-09-01

    Full Text Available This paper proposes a time-frequency algorithm based on the short-time fractional-order Fourier transform (STFRFT) for identifying targets with complicated motion. The algorithm, which combines STFRFT order changing with a quick selection method, is effective in reducing the computational load. A multi-order STFRFT time-frequency algorithm is also developed that exploits the time-frequency features of each micro-Doppler component signal; it improves the accuracy of time-frequency curve fitting through multi-order matching. Finally, experimental data were used to demonstrate the STFRFT's performance in micro-Doppler time-frequency analysis. The results validate the higher estimation accuracy of the proposed algorithm. It may be applied to an LFM (linear frequency modulated) pulse radar, SAR (synthetic aperture radar), or ISAR (inverse synthetic aperture radar) to improve the probability of target recognition.

  18. Micro-Doppler Signal Time-Frequency Algorithm Based on STFRFT.

    Science.gov (United States)

    Pang, Cunsuo; Han, Yan; Hou, Huiling; Liu, Shengheng; Zhang, Nan

    2016-09-24

    This paper proposes a time-frequency algorithm based on the short-time fractional-order Fourier transform (STFRFT) for identifying targets with complicated motion. The algorithm, which combines STFRFT order changing with a quick selection method, is effective in reducing the computational load. A multi-order STFRFT time-frequency algorithm is also developed that exploits the time-frequency features of each micro-Doppler component signal; it improves the accuracy of time-frequency curve fitting through multi-order matching. Finally, experimental data were used to demonstrate the STFRFT's performance in micro-Doppler time-frequency analysis. The results validate the higher estimation accuracy of the proposed algorithm. It may be applied to an LFM (linear frequency modulated) pulse radar, SAR (synthetic aperture radar), or ISAR (inverse synthetic aperture radar) to improve the probability of target recognition.

  19. Frequency-shaped and observer-based discrete-time sliding mode control

    CERN Document Server

    Mehta, Axaykumar

    2015-01-01

    It is well established that the sliding mode control strategy provides an effective and robust method of controlling a deterministic system, owing to its well-known invariance to a class of bounded disturbances and parameter variations. Advances in microcomputer technology have made digital control increasingly popular among researchers worldwide, which has led to the study of discrete-time sliding mode control design and its implementation. This brief presents a method for multi-rate frequency-shaped sliding mode controller design based on switching and non-switching types of reaching law. In this approach, frequency-dependent compensator dynamics are introduced through a frequency-shaped sliding surface by assigning frequency-dependent weighting matrices in a linear quadratic regulator (LQR) design procedure. In this way, undesired high-frequency dynamics or disturbances at certain frequencies can be eliminated. The states are implicitly obtained by measuring the output at a faster rate than th...

  20. The Tracking Resonance Frequency Method for Photoacoustic Measurements Based on the Phase Response

    Science.gov (United States)

    Suchenek, Mariusz

    2017-04-01

    One of the major issues in the use of a resonant photoacoustic cell is the resonance frequency of the cell. The frequency is not stable, and its changes depend mostly on temperature and gas mixture. This paper presents a new method for tracking the resonance frequency, in which both the amplitude and the phase are calculated from the input samples, and the stimulating frequency is adjusted to the resonance frequency of the cell based on the phase. The method was implemented using a digital measurement system with an analog-to-digital converter, a field-programmable gate array (FPGA) and a microcontroller. The resonance frequency was changed by injecting carbon dioxide into the cell. A theoretical description and experimental results are also presented.
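    The phase-based tracking idea rests on a standard property of a driven second-order resonator: the phase of the response relative to the drive passes through a fixed value exactly at resonance, so the sign of the phase error tells the controller which way to move the stimulating frequency. The sketch below is a minimal simulation of that feedback loop, not the authors' FPGA implementation; the resonance frequency, quality factor, and loop gain are invented for illustration:

```python
import math

F0, Q = 4150.0, 20.0     # hypothetical cell resonance (Hz) and quality factor

def phase_error(f):
    """Phase of a driven second-order resonator relative to its value at
    resonance: positive below F0, zero at F0, negative above it."""
    return math.atan2(F0**2 - f**2, f * F0 / Q)

def track_resonance(f, gain=100.0, steps=100):
    """Steer the stimulating frequency using only the measured phase:
    a positive phase error means we are below resonance, so step up."""
    for _ in range(steps):
        f += gain * phase_error(f)
    return f

print(round(track_resonance(4000.0), 1))   # settles at 4150.0
```

    Because the loop acts only on the phase, it keeps tracking even when CO2 injection or temperature drift moves F0, which is the point of the method.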

  1. Passive high-frequency devices based on superlattice ferromagnetic nanowires

    International Nuclear Information System (INIS)

    Ye, B.; Li, F.; Cimpoesu, D.; Wiley, J.B.; Jung, J.-S.; Stancu, A.; Spinu, L.

    2007-01-01

    In this paper we propose to tailor the bandwidth of a microwave filter by exploiting the shape anisotropy of nanowires. To achieve this control of shape anisotropy, we considered superlattice wires containing varying-sized ferromagnetic regions separated by nonferromagnetic regions. Superlattice wires of Ni and Au with a nominal diameter of 200 nm were grown using standard electrodeposition techniques. The microwave properties were probed using X-band (9.8 GHz) ferromagnetic resonance (FMR) experiments performed at room temperature. To investigate the effectiveness of the shape anisotropy in the superlattice-nanowire-based filter, the FMR spectrum of the superlattice structure is compared to the FMR spectra of nanowire samples with constant length.

  2. Metasurface-based anti-reflection coatings at optical frequencies

    Science.gov (United States)

    Monti, Alessio; Alù, Andrea; Toscano, Alessandro; Bilotti, Filiberto

    2018-05-01

    In this manuscript, we propose a metasurface approach for the reduction of electromagnetic reflection from an arbitrary air-dielectric interface. The proposed technique exploits the exotic optical response of plasmonic nanoparticles to achieve complete cancellation of the field reflected by a dielectric substrate by means of destructive interference. Unlike earlier anti-reflection approaches based on nanoparticles, our design scheme is supported by a simple transmission-line formulation that allows a closed-form characterization of the anti-reflection performance of a nanoparticle array. Furthermore, since the working principle of the proposed devices relies on an average effect that does not critically depend on the array geometry, our approach enables low-cost production and easy scalability to large sizes. Our theoretical considerations are supported by full-wave simulations confirming the effectiveness of this design principle.

  3. Central FPGA-based Destination and Load Control in the LHCb MHz Event Readout

    CERN Document Server

    Jacobsson, Richard

    2012-01-01

    The readout strategy of the LHCb experiment [1] is based on complete event readout at 1 MHz [2]. Over 300 sub-detector readout boards transmit event fragments at 1 MHz over a commercial 70 Gigabyte/s switching network to a distributed event-building and trigger-processing farm with 1470 individual multi-core computer nodes [3]. In the original specification, the readout was based on a pure push protocol. This paper describes the proposal, implementation, and experience of a powerful non-conventional mixture of a push and a pull protocol, akin to credit-based flow control. A high-speed FPGA-based central master module controls the event-fragment packing in the readout boards and the assignment of the farm-node destination for each event, and regulates the farm load based on an asynchronous pull mechanism from each farm node. This dynamic readout scheme relies on generic event requests and the concept of node credit, allowing load balancing and trigger-rate regulation as a function of the global farm load. It also ...
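    The credit mechanism described above can be reduced to a simple idea: each farm node asynchronously grants the master a number of event credits, and the master assigns the next event only to a node that still holds credit, throttling when no credit is available. The toy model below illustrates just that bookkeeping (class and node names are invented; the real system is an FPGA module, not Python):

```python
from collections import deque

class CreditDispatcher:
    """Toy model of pull-based destination assignment: farm nodes grant
    event credits; the master assigns each event to a node with credit."""

    def __init__(self):
        self.credits = deque()          # FIFO of granted destination slots

    def grant(self, node, n):           # asynchronous event request from a node
        self.credits.extend([node] * n)

    def assign(self):                   # master picks destination for next event
        return self.credits.popleft() if self.credits else None  # None = throttle

d = CreditDispatcher()
d.grant("node-a", 2)
d.grant("node-b", 1)
print([d.assign() for _ in range(4)])   # ['node-a', 'node-a', 'node-b', None]
```

    Because overloaded nodes simply stop granting credit, load balancing and trigger-rate regulation fall out of the same mechanism.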

  4. Population based allele frequencies of disease associated polymorphisms in the Personalized Medicine Research Project.

    Science.gov (United States)

    Cross, Deanna S; Ivacic, Lynn C; Stefanski, Elisha L; McCarty, Catherine A

    2010-06-17

    There is a lack of knowledge regarding the frequency of disease-associated polymorphisms in populations, and the population attributable risk for many populations remains unknown. Factors that could affect the association of an allele with disease, either positively or negatively, such as race, ethnicity, and gender, may not be possible to determine without population-based allele frequencies. Here we used a panel of 51 polymorphisms previously associated with at least one disease and determined the allele frequencies within the entire Personalized Medicine Research Project population-based cohort. We compared these allele frequencies to those in dbSNP and other data sources stratified by race. Differences in allele frequencies between self-reported race, region of origin, and sex were determined. There were 19,544 individuals who self-reported a single racial category; 19,027 (97.4%) self-reported white Caucasian, and 11,205 (57.3%) individuals were female. Of the 11,208 (57%) individuals with an identifiable region of origin, 8,337 (74.4%) were German. 41 polymorphisms were significantly different between self-reported races at the 0.05 level. Stratification of our Caucasian population by self-reported region of origin revealed 19 polymorphisms that were significantly different at the 0.05 level between individuals of different origins. Further stratification of the population by gender revealed few significant differences in allele frequencies between the genders. This represents one of the largest population-based allele frequency studies to date. Stratification by self-reported race and region of origin revealed wide differences in allele frequencies not only by race but also by region of origin within a single racial group. We report allele frequencies for our Asian/Hmong and American Indian populations; these two minority groups are not typically selected for population allele frequency detection. Population-wide allele frequencies are important for the design and
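    The two computations at the heart of such a study are (a) turning genotype counts into an allele frequency and (b) testing whether allele frequencies differ between two groups. The sketch below shows both with a standard Pearson chi-square on a 2x2 table of allele counts; all counts are invented and do not come from the study's data:

```python
def allele_freq(n_AA, n_Aa, n_aa):
    """Frequency of allele A from genotype counts (2 alleles per person)."""
    n = n_AA + n_Aa + n_aa
    return (2 * n_AA + n_Aa) / (2 * n)

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table of allele counts
    (rows: populations; columns: allele A, allele a)."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# hypothetical genotype counts for one SNP in two self-reported groups
f1 = allele_freq(360, 480, 160)          # group 1: freq(A) = 0.60
f2 = allele_freq(250, 500, 250)          # group 2: freq(A) = 0.50
print(round(f1, 2), round(f2, 2))

# the same data as allele counts: group 1 has 1200 A / 800 a, group 2 1000 / 1000
stat = chi2_2x2(1200, 800, 1000, 1000)
print(round(stat, 2))   # compare against the 1-df chi-square critical value 3.84
```

    A statistic above 3.84 corresponds to p < 0.05 on one degree of freedom, the significance threshold used in the abstract.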

  5. Frequency Control in an Autonomous Microgrid in the Presence of DFIG-based Wind Turbines

    Directory of Open Access Journals (Sweden)

    Ghazanfar Shahgholian

    2015-10-01

    Full Text Available Despite their ever-increasing power injection into the power grid, wind turbines play no role in frequency control; power network frequency is mainly adjusted by conventional power plants. DFIG-based wind turbines are not only able to produce power at various mechanical speeds, they can also reduce speed instantaneously, which in turn releases mechanical energy. Thus, they can aid conventional units in system frequency control. In this paper, the effect of wind energy conversion systems, especially variable-speed DFIG-based wind turbines, on frequency control and tuning is investigated for different penetration coefficients in an isolated microgrid comprising conventional thermal and non-thermal generating units. To do this, optimal tuning of the DFIG's speed controller is performed at different penetration levels using the particle swarm optimization (PSO) technique. In addition, the optimum penetration of the wind energy conversion system is studied considering frequency change parameters in the microgrid.

  6. Estimate of the core damage frequency in IPEN's IEA-R1 research reactor due to the initiating event of loss of coolant caused by a large rupture in the primary circuit piping

    International Nuclear Information System (INIS)

    Hirata, Daniel Massami; Sabundjian, Gaiane; Cabral, Eduardo Lobo Lustosa

    2009-01-01

    The National Commission of Nuclear Energy (CNEN), the Brazilian nuclear regulatory body, imposes safety and licensing standards to ensure that nuclear power plants operate safely. For licensing a nuclear reactor, one of CNEN's demands is the simulation of design-basis accidents and thermal-hydraulic transients to verify the integrity of the plant under adverse conditions. The accidents that must be simulated are those with the highest probability of occurrence or the most serious consequences. According to the FSAR (Final Safety Analysis Report), the initiating event that can cause the largest damage to the core of the IEA-R1 research reactor at IPEN-CNEN/SP is a LOCA (Loss of Coolant Accident). The objective of this paper is to estimate the frequency of IEA-R1 core damage caused by this initiating event. We analyze the accident evolution and the performance of the systems that should mitigate this event, the Emergency Core Cooling System (ECCS) and the isolated pool system, by means of an event tree. The reliability of these systems is also quantified using fault trees. (author)
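    The event-tree/fault-tree combination described above boils down to: quantify the failure probability of each mitigating system from its fault tree (AND/OR combinations of basic events), then multiply the initiating-event frequency by the probability that mitigation fails. A minimal sketch, with all failure probabilities and the system layout invented for illustration (not the IEA-R1 data):

```python
def or_gate(*p):
    """System fails if ANY independent input fails: 1 - prod(1 - p_i)."""
    q = 1.0
    for pi in p:
        q *= 1.0 - pi
    return 1.0 - q

def and_gate(*p):
    """System fails only if ALL independent inputs fail: prod(p_i)."""
    q = 1.0
    for pi in p:
        q *= pi
    return q

# hypothetical basic-event failure probabilities per demand
pump_a, pump_b, valve, power = 1e-2, 1e-2, 5e-3, 1e-4

# redundant pumps (AND) feeding through a single valve and power supply (OR)
mitigation_fails = or_gate(and_gate(pump_a, pump_b), valve, power)
print(f"{mitigation_fails:.2e}")

# core damage frequency = initiating-event frequency x mitigation failure prob.
loca_per_year = 1e-4
print(f"{loca_per_year * mitigation_fails:.2e}")
```

    Note the single valve dominates: redundancy in the pumps buys little while an unduplicated component sits in series, which is exactly the kind of insight fault-tree quantification is meant to surface.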

  7. Exposure estimates based on broadband elf magnetic field measurements versus the ICNIRP multiple frequency rule

    International Nuclear Information System (INIS)

    Paniagua, Jesus M.; Rufo, Montana; Jimenez, Antonio; Pachon, Fernando T.; Carrero, Julian

    2015-01-01

    The evaluation of exposure to extremely low-frequency (ELF) magnetic fields using broadband measurement techniques gives satisfactory results when the field has essentially a single frequency. In most cases, however, magnetic fields are distorted by harmonic components. This work analyses the harmonic components of the ELF magnetic field in an outdoor urban context and compares the evaluation of exposure based on broadband measurements with that based on spectral analysis. The multiple-frequency rule of the International Commission on Non-Ionizing Radiation Protection (ICNIRP) guidelines was applied. With the 1998 ICNIRP guideline, harmonics dominated the exposure with a 55% contribution; with the 2010 ICNIRP guideline, the primary frequency dominated with a 78% contribution. Exposure values based on spectral analysis were significantly higher than those based on broadband measurements. Hence, it is clearly necessary to determine the harmonic components of the ELF magnetic field to assess exposure in urban contexts. (authors)
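    The ICNIRP multiple-frequency rule sums, over all spectral components, the ratio of the measured field to the reference level at that component's frequency; compliance requires the sum not to exceed 1. The sketch below shows that bookkeeping; the reference-level function and the measured spectrum are stand-ins for illustration, not the actual ICNIRP tables or the study's data:

```python
def icnirp_exposure_quotient(harmonics, reference_level):
    """Multiple-frequency rule: sum of measured field over guideline limit
    at each spectral component; compliance requires the sum <= 1.
    `reference_level(f)` must return the guideline limit at frequency f."""
    return sum(b / reference_level(f) for f, b in harmonics)

# hypothetical reference-level shape (uT), loosely 1/f like the ELF guidelines
ref = lambda f: 5000.0 / f

# measured magnetic flux density of the fundamental and odd harmonics (uT)
spectrum = [(50.0, 20.0), (150.0, 6.0), (250.0, 2.5)]
q = icnirp_exposure_quotient(spectrum, ref)
print(round(q, 3))   # a value above 1 would mean the guideline is exceeded
```

    The example makes the paper's point concrete: a broadband meter reporting only the 50 Hz fundamental would miss the harmonic terms, which here contribute well over half of the quotient relative to the fundamental's share.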

  8. GaN-based High Power High Frequency Wide Range LLC Resonant Converter, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — SET Group will design, build and demonstrate a Gallium Nitride (GaN) based High Power High Frequency Wide Range LLC Resonant Converter capable of handling high power...

  9. An Event-Based Approach to Distributed Diagnosis of Continuous Systems

    Science.gov (United States)

    Daigle, Matthew; Roychoudhurry, Indranil; Biswas, Gautam; Koutsoukos, Xenofon

    2010-01-01

    Distributed fault diagnosis solutions are becoming necessary due to the complexity of modern engineering systems, and the advent of smart sensors and computing elements. This paper presents a novel event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, based on a qualitative abstraction of measurement deviations from the nominal behavior. We systematically derive dynamic fault signatures expressed as event-based fault models. We develop a distributed diagnoser design algorithm that uses these models for designing local event-based diagnosers based on global diagnosability analysis. The local diagnosers each generate globally correct diagnosis results locally, without a centralized coordinator, and by communicating a minimal number of measurements between themselves. The proposed approach is applied to a multi-tank system, and results demonstrate a marked improvement in scalability compared to a centralized approach.

  10. Trust Index Based Fault Tolerant Multiple Event Localization Algorithm for WSNs

    Science.gov (United States)

    Xu, Xianghua; Gao, Xueyong; Wan, Jian; Xiong, Naixue

    2011-01-01

    This paper investigates the use of wireless sensor networks for multiple-event-source localization using binary information from the sensor nodes. The events continually emit signals whose strength is attenuated inversely proportional to the distance from the source. In this context, faults occur for various reasons and are manifested when a node reports a wrong decision. To reduce the impact of node faults on the accuracy of multiple event localization, we introduce a trust index model to evaluate the fidelity of the information that nodes report and use in the event detection process, and propose the Trust Index based Subtract on Negative Add on Positive (TISNAP) localization algorithm, which reduces the impact of faulty nodes on event localization by decreasing their trust index, thereby improving the accuracy of event localization and the fault tolerance of multiple-event-source localization. The algorithm includes three phases: first, the sink identifies the cluster nodes to determine the number of events that occurred in the entire region by analyzing the binary data reported by all nodes; then, it constructs the likelihood matrix related to the cluster nodes and estimates the location of all events according to the alarm status and trust index of the nodes around the cluster nodes. Finally, the sink updates the trust index of all nodes according to the fidelity of their information in the previous reporting cycle. The experimental results show that even when the probability of node fault is close to 50%, the algorithm can still accurately determine the number of events and achieves better localization accuracy than other algorithms. PMID:22163972
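    The third phase above, the per-cycle trust update, can be sketched in a few lines: nodes whose binary report matched the final event decision gain trust, the rest lose it, and the resulting weights then discount faulty nodes in the next localization round. This is a generic illustration of that idea, not the TISNAP update rule itself (node names, step size, and clipping are invented):

```python
def update_trust(trust, reports, decision, step=0.1):
    """Reward nodes whose binary report matched the sink's final decision
    for their position, penalize the rest; trust stays clipped to [0, 1]."""
    new = {}
    for node, said in reports.items():
        delta = step if said == decision[node] else -step
        new[node] = min(1.0, max(0.0, trust[node] + delta))
    return new

trust = {"n1": 0.5, "n2": 0.5, "n3": 0.5}
reports = {"n1": 1, "n2": 1, "n3": 0}     # n3 contradicts the consensus
decision = {"n1": 1, "n2": 1, "n3": 1}    # the event actually covered all three
for _ in range(3):                         # three reporting cycles, same behavior
    trust = update_trust(trust, reports, decision)
print(trust)   # n3's weight in later localizations has dropped markedly
```

    Repeatedly wrong nodes thus fade out of the likelihood computation without ever being hard-excluded, which is what gives the scheme its tolerance to fault rates approaching 50%.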

  11. Trust Index Based Fault Tolerant Multiple Event Localization Algorithm for WSNs

    Directory of Open Access Journals (Sweden)

    Jian Wan

    2011-06-01

    Full Text Available This paper investigates the use of wireless sensor networks for multiple-event-source localization using binary information from the sensor nodes. The events continually emit signals whose strength is attenuated inversely proportional to the distance from the source. In this context, faults occur for various reasons and are manifested when a node reports a wrong decision. To reduce the impact of node faults on the accuracy of multiple event localization, we introduce a trust index model to evaluate the fidelity of the information that nodes report and use in the event detection process, and propose the Trust Index based Subtract on Negative Add on Positive (TISNAP) localization algorithm, which reduces the impact of faulty nodes on event localization by decreasing their trust index, thereby improving the accuracy of event localization and the fault tolerance of multiple-event-source localization. The algorithm includes three phases: first, the sink identifies the cluster nodes to determine the number of events that occurred in the entire region by analyzing the binary data reported by all nodes; then, it constructs the likelihood matrix related to the cluster nodes and estimates the location of all events according to the alarm status and trust index of the nodes around the cluster nodes. Finally, the sink updates the trust index of all nodes according to the fidelity of their information in the previous reporting cycle. The experimental results show that even when the probability of node fault is close to 50%, the algorithm can still accurately determine the number of events and achieves better localization accuracy than other algorithms.

  12. Science-based risk assessments for rare events in a changing climate

    Science.gov (United States)

    Sobel, A. H.; Tippett, M. K.; Camargo, S. J.; Lee, C. Y.; Allen, J. T.

    2014-12-01

    History shows that substantial investments in protection against a specific type of natural disaster usually occur only after (and usually shortly after) that type of disaster has struck a given place. This is true even when it was well known before the event that there was a significant risk it could occur. Presumably what the psychologists Kahneman and Tversky called "availability bias" is responsible, at least in part, for these failures to act on known but out-of-sample risks. While understandable, this human tendency prepares us poorly for events that are very rare on the time scale of human lives, and even more poorly for a changing climate, as historical records become a poorer guide. A more forward-thinking and rational approach would require scientific risk assessments that can place meaningful probabilities on events rare enough to be absent from the historical record, and that can account for the influences of both anthropogenic climate change and low-frequency natural climate variability. The set of tools available for such risk assessments is still quite limited, particularly for some of the most extreme events such as tropical cyclones and tornadoes. We will briefly assess the state of the art for these events in particular, and describe some of our ongoing research to develop new tools for quantitative risk assessment using hybrids of statistical methods and physical understanding of the hazards.

  13. Types, frequencies, and burden of nonspecific adverse events of drugs: analysis of randomized placebo-controlled clinical trials.

    Science.gov (United States)

    Mahr, Alfred; Golmard, Clara; Pham, Emilie; Iordache, Laura; Deville, Laure; Faure, Pierre

    2017-07-01

    The few studies analyzing adverse event (AE) data from randomized placebo-controlled clinical trials (RPCCTs) of selected illnesses suggest that a substantial proportion of collected AEs are unrelated to the drug taken. This study analyzed the nonspecific AEs occurring with active-drug exposure in RPCCTs for a large range of medical conditions. Randomized placebo-controlled clinical trials published in five prominent medical journals during 2006-2012 were searched. Only trials that evaluated orally or parenterally administered active drugs against placebo in a head-to-head setting were selected. For AEs reported in ≥10 RPCCTs, Pearson's correlation coefficients (r) were calculated to determine the relationship between AE rates in placebo and active-drug recipients. Random-effects meta-analyses were used to compute the proportions of nonspecific AEs, truncated at a maximum of 100%, in active-drug recipients. We included 231 trials addressing various medical domains or healthy participants. For the 88 analyzed AE variables, AE rates for placebo and active-drug recipients were in general strongly correlated (r > 0.50) or very strongly correlated (r > 0.80). The pooled proportions of nonspecific AEs for active-drug recipients were 96.8% (95% CI: 95.5-98.1) for any AEs, 100% (97.9-100) for serious AEs, and 77.7% (72.7-83.2) for drug-related AEs. Results were similar for individual medical domains and healthy participants. The pooled proportion of nonspecificity for 82 system-organ-class and individual AE types ranged from 38% to 100%. The large proportion of nonspecific AEs reported in active-drug recipients of RPCCTs, including serious and drug-related AEs, highlights the limitations of clinical trial data for determining the tolerability of drugs. Copyright © 2017 John Wiley & Sons, Ltd.
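    The core statistic in the analysis above is a Pearson correlation between per-trial AE rates in the placebo and active arms: a coefficient near 1 means most of the reported AE signal is drug-independent. A self-contained sketch with invented per-trial rates (the real analysis spans 231 trials and 88 AE variables):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# hypothetical per-trial rates (%) of one AE type in placebo vs active arms
placebo = [4.1, 8.0, 2.5, 11.2, 6.3, 9.8]
active  = [4.9, 8.6, 3.1, 12.0, 6.0, 10.9]
r = pearson_r(placebo, active)
print(round(r, 3))   # close to 1: the AE rate tracks the trial, not the drug
```

    In the paper's terms, the placebo arm provides the expected "nonspecific" rate for each trial, so a high r across trials is what licenses attributing most active-arm AEs to background rather than to the drug.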

  14. Noether's Theorem and its Inverse of Birkhoffian System in Event Space Based on Herglotz Variational Problem

    Science.gov (United States)

    Tian, X.; Zhang, Y.

    2018-03-01

    The Herglotz variational principle, in which the functional is defined by a differential equation, generalizes the classical principle, which defines the functional by an integral. The principle gives a variational description of nonconservative systems even when the Lagrangian is independent of time. This paper studies Noether's theorem and its inverse for a Birkhoffian system in event space based on the Herglotz variational problem. First, according to the Herglotz variational principle for a Birkhoffian system, the principle for a Birkhoffian system in event space is established. Second, its parametric equations and two basic formulae for the variation of the Pfaff-Herglotz action of a Birkhoffian system in event space are obtained. Furthermore, the definition and criteria of Noether symmetry of a Birkhoffian system in event space based on the Herglotz variational problem are given. Then, according to the relationship between Noether symmetry and conserved quantities, Noether's theorem is derived. Under classical conditions, Noether's theorem for a Birkhoffian system in event space based on the Herglotz variational problem reduces to the classical one. In addition, Noether's inverse theorem for the Birkhoffian system in event space based on the Herglotz variational problem is also obtained. At the end of the paper, an example is given to illustrate the application of the results.

  15. Rainfall and runoff Intensity-Duration-Frequency Curves for Washington State considering the change and uncertainty of observed and anticipated extreme rainfall and snow events

    Science.gov (United States)

    Demissie, Y. K.; Mortuza, M. R.; Li, H. Y.

    2015-12-01

    The observed and anticipated increasing trends in extreme storm magnitude and frequency, and in the associated flooding risk, in the Pacific Northwest have highlighted the need to revise and update the local intensity-duration-frequency (IDF) curves commonly used for designing critical water infrastructure. In Washington State, much of the drainage system installed in the last several decades uses IDF curves that are outdated by as much as half a century, leaving the system inadequate and vulnerable to flooding, as seen more frequently in recent years. In this study, we developed new, forward-looking rainfall and runoff IDF curves for each county in Washington State using recently observed and projected precipitation data. Regional frequency analysis coupled with Bayesian uncertainty quantification and model averaging was used to develop and update the rainfall IDF curves, which were then fed into watershed and snow models to develop runoff IDF curves that explicitly account for the effects of snow and drainage characteristics. The resulting rainfall and runoff IDF curves provide more reliable, forward-looking, and spatially resolved characteristics of storm events that can help local decision makers and engineers review and update current design standards for urban and rural stormwater management infrastructure, in order to reduce the potential ramifications of increasingly severe storms and the resulting floods on existing and planned storm drainage and flood management systems in the state.
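    A point on an IDF curve is a return level: the rainfall depth for a given duration expected to be exceeded on average once every T years. The simplest classical route to it, shown below, fits a Gumbel distribution to annual maxima by the method of moments (the study itself uses regional frequency analysis with Bayesian model averaging, which this sketch does not attempt; the annual maxima are invented):

```python
import math
import statistics

def gumbel_return_level(annual_maxima, T):
    """Method-of-moments Gumbel fit to annual maximum rainfall depths,
    then the depth expected to be exceeded once every T years on average."""
    mean = statistics.mean(annual_maxima)
    sd = statistics.stdev(annual_maxima)
    beta = sd * math.sqrt(6) / math.pi     # scale parameter
    mu = mean - 0.5772 * beta              # location (Euler-Mascheroni constant)
    p = 1.0 - 1.0 / T                      # non-exceedance probability
    return mu - beta * math.log(-math.log(p))

# hypothetical 1-hour annual maxima (mm) for one station
maxima = [18, 22, 25, 19, 31, 27, 24, 21, 35, 23, 26, 20]
for T in (2, 10, 100):
    print(T, round(gumbel_return_level(maxima, T), 1))
```

    Repeating the fit for several durations (15 min, 1 h, 6 h, 24 h) and plotting intensity against duration for each T is what produces a family of IDF curves.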

  16. Lower Hybrid Frequency Range Waves Generated by Ion Polarization Drift Due to Electromagnetic Ion Cyclotron Waves: Analysis of an Event Observed by the Van Allen Probe B

    Science.gov (United States)

    Khazanov, G. V.; Boardsen, S.; Krivorutsky, E. N.; Engebretson, M. J.; Sibeck, D.; Chen, S.; Breneman, A.

    2017-01-01

    We analyze a wave event that occurred near noon between 07:03 and 07:08 UT on 23 February 2014, detected by the Van Allen Probe B spacecraft, in which waves in the lower hybrid frequency range (LHFR) and electromagnetic ion cyclotron (EMIC) waves are observed to be highly correlated, with a Pearson correlation coefficient of approximately 0.86. We assume that the correlation results from LHFR wave generation by the ion polarization drift in the electric field of the EMIC waves. To check this assumption, the drift velocities of electrons and H+, He+, and O+ ions in the measured EMIC wave electric field were modeled. The LHFR wave linear instantaneous growth rates were then calculated for plasma with these changing drift velocities and different plasma compositions. The time distribution of these growth rates, their frequency distribution, and the frequency dependence of the ratio of the LHFR wave power spectral density (PSD) parallel and perpendicular to the ambient magnetic field to the total PSD were found. These characteristics of the growth rates were compared with the corresponding characteristics of the observed LHFR activity. Reasonable agreement between these features, and the strong correlation between EMIC and LHFR energy densities, support the assumption that LHFR wave generation can be caused by the ion polarization drift in the electric field of an EMIC wave.
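    The driving term in the mechanism above is the standard polarization drift, v_p = (m / qB^2) dE_perp/dt, which scales with ion mass, so heavy ions (O+) drift fastest in the slowly varying EMIC wave field. A back-of-the-envelope sketch (field strength, wave amplitude, and wave frequency are illustrative values, not the event's measured parameters):

```python
import math

Q_E = 1.602e-19    # elementary charge (C)
M_P = 1.673e-27    # proton mass (kg)

def polarization_drift(mass_amu, dE_dt, B):
    """Magnitude of the polarization drift v = (m / qB^2) dE_perp/dt
    for a singly charged ion of the given mass (in amu)."""
    return mass_amu * M_P * dE_dt / (Q_E * B**2)

B = 200e-9                          # ambient magnetic field (T)
dE_dt = 2 * math.pi * 0.5 * 1e-3    # peak dE/dt of a 1 mV/m, 0.5 Hz EMIC wave
for name, amu in (("H+", 1), ("He+", 4), ("O+", 16)):
    print(name, round(polarization_drift(amu, dE_dt, B)), "m/s")
```

    The mass proportionality is the key point: the relative drift between ion species (and between ions and the essentially non-drifting electrons) is what can drive the LHFR instability.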

  17. Tracing the Spatial-Temporal Evolution of Events Based on Social Media Data

    Directory of Open Access Journals (Sweden)

    Xiaolu Zhou

    2017-03-01

    Full Text Available Social media data provide a great opportunity to investigate event flow in cities. Despite their advantages, the heterogeneity and size of social media data pose challenges for researchers seeking to identify useful information about events in the raw data. In addition, few studies have used social media posts to capture how events develop in space and time. This paper demonstrates an efficient approach based on machine learning and geovisualization to identify events and trace their development in real time. We conducted an empirical study to delineate the temporal and spatial evolution of a natural event (heavy precipitation) and a social event (Pope Francis' visit to the US) in the New York City-Washington, DC region. By investigating multiple features of Twitter data (message, author, time, and geographic location), this paper demonstrates how voluntary local knowledge from tweets can be used to depict city dynamics, discover the spatiotemporal characteristics of events, and convey real-time information.

  18. Abnormal Event Detection in Wireless Sensor Networks Based on Multiattribute Correlation

    Directory of Open Access Journals (Sweden)

    Mengdi Wang

    2017-01-01

    Full Text Available Abnormal event detection is one of the vital tasks in wireless sensor networks. However, node faults and poor deployment environments pose great challenges to abnormal event detection. A typical event detection technique relies on spatiotemporal correlations to detect an event, which makes it susceptible to noise and errors. To improve the quality of the detection results, we propose a novel approach for abnormal event detection in wireless sensor networks that considers not only spatiotemporal correlations but also the correlations among observed attributes. A dependency model of the observed attributes is constructed based on a Bayesian network: the dependency structure of the observed attributes is obtained by structure learning, and the conditional probability table of each node is calculated by parameter learning. We propose a new measure, the attribute correlation confidence, to evaluate how well a sensor reading fits the abnormal event pattern. On the basis of temporal and spatial correlation detection, abnormal events are identified. Experimental results show that the proposed algorithm effectively reduces the impact of interference factors and the false alarm rate, and also improves the accuracy of event detection.
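    Once the Bayesian network's structure and conditional probability tables are learned, scoring a reading amounts to a product of conditional probabilities along the dependency structure. The sketch below illustrates that scoring with a two-attribute toy model; the attributes, CPT values, and function name are invented for illustration and are not the paper's learned model or its exact confidence formula:

```python
# Tiny two-attribute dependency model: humidity depends on temperature.
# In the paper these tables come from parameter learning; the numbers
# here are made up.
p_temp = {"high": 0.3, "normal": 0.7}
p_hum_given_temp = {
    ("high", "low"): 0.6, ("high", "normal"): 0.4,
    ("normal", "low"): 0.1, ("normal", "normal"): 0.9,
}

def reading_likelihood(temp, hum):
    """Joint likelihood of a reading under the learned dependency model,
    usable as a fitting score between the reading and the usual pattern."""
    return p_temp[temp] * p_hum_given_temp[(temp, hum)]

typical = reading_likelihood("normal", "normal")
odd = reading_likelihood("normal", "low")
print(round(typical, 2), round(odd, 2))
# low-likelihood readings are candidates for faults or abnormal events
```

    Combining this attribute-level score with the temporal and spatial checks is what lets the approach tell a genuinely abnormal event apart from a single node reporting nonsense.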

  19. Improving the extraction of complex regulatory events from scientific text by using ontology-based inference.

    Science.gov (United States)

    Kim, Jung-Jae; Rebholz-Schuhmann, Dietrich

    2011-10-06

    The extraction of complex events from biomedical text is a challenging task that requires in-depth semantic analysis. Previous approaches associate lexical and syntactic resources with ontologies for the semantic analysis, but fall short of testing the benefits of using domain knowledge. We developed a system that deduces implicit events from explicitly expressed events by using inference rules that encode domain knowledge. We evaluated the system with the inference module on three tasks. First, when tested against a corpus with manually annotated events, the inference module contributes 53.2% of the correct extractions without causing any incorrect results. Second, the system reproduces 33.1% of the transcription regulatory events contained in RegulonDB (up to 85.0% precision), and the inference module is required for 93.8% of the reproduced events. Third, we applied the system with minimal adaptation to the identification of cell-activity regulation events, confirming that the inference also improves performance on this task. Our research shows that inference based on domain knowledge plays a significant role in extracting complex events from text. This approach has great potential for recognizing the complex concepts of biomedical ontologies such as the Gene Ontology in the literature.

  20. Evaluation of extreme temperature events in northern Spain based on process control charts

    Science.gov (United States)

    Villeta, M.; Valencia, J. L.; Saá, A.; Tarquis, A. M.

    2018-02-01

    Extreme climate events have recently attracted the attention of a growing number of researchers because these events impose a large cost on agriculture and associated insurance planning. This study focuses on extreme temperature events and proposes a new method for their evaluation based on statistical process control tools, which are unusual in climate studies. A series of minimum and maximum daily temperatures for 12 geographical areas of a Spanish region between 1931 and 2009 were evaluated by applying statistical process control charts to statistically test whether evidence existed for an increase or a decrease of extreme temperature events. Specification limits were determined for each geographical area and used to define four types of extreme anomalies: lower and upper extremes for the minimum and maximum anomalies. A new binomial Markov extended process that considers the autocorrelation between extreme temperature events was generated for each geographical area and extreme anomaly type to establish the attribute control charts for the annual fraction of extreme days and to monitor the occurrence of annual extreme days. This method was used to assess the significance of changes and trends of extreme temperature events in the analysed region. The results demonstrate the effectiveness of an attribute control chart for evaluating extreme temperature events. For example, the evaluation of extreme maximum temperature events using the proposed statistical process control charts was consistent with the evidence of an increase in maximum temperatures during the last decades of the last century.
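The attribute control chart for the annual fraction of extreme days can be illustrated with a standard p-chart. This is a simplified sketch with synthetic counts: the paper's binomial Markov extension, which accounts for autocorrelation between extreme days, is not reproduced here.

```python
import math

# Minimal p-chart sketch for the annual fraction of extreme days.
# Yearly counts are hypothetical; limits are the classical 3-sigma
# binomial limits (no Markov/autocorrelation correction).
n = 365  # days per year
extreme_days = [8, 11, 9, 14, 10, 12, 25, 9]  # extreme days per year

p_bar = sum(extreme_days) / (n * len(extreme_days))   # overall fraction
sigma = math.sqrt(p_bar * (1 - p_bar) / n)
ucl = p_bar + 3 * sigma                               # upper control limit
lcl = max(0.0, p_bar - 3 * sigma)                     # lower control limit

# Years whose fraction of extreme days falls outside the control limits
out_of_control = [i for i, c in enumerate(extreme_days)
                  if not lcl <= c / n <= ucl]
print(out_of_control)
```

A year flagged above the UCL would be evidence of an unusually high frequency of extreme days relative to the monitored baseline.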

  1. Improving the extraction of complex regulatory events from scientific text by using ontology-based inference

    Directory of Open Access Journals (Sweden)

    Kim Jung-jae

    2011-10-01

    Full Text Available Abstract Background The extraction of complex events from biomedical text is a challenging task and requires in-depth semantic analysis. Previous approaches associate lexical and syntactic resources with ontologies for the semantic analysis, but fall short in testing the benefits from the use of domain knowledge. Results We developed a system that deduces implicit events from explicitly expressed events by using inference rules that encode domain knowledge. We evaluated the system with the inference module on three tasks: First, when tested against a corpus with manually annotated events, the inference module of our system contributes 53.2% of correct extractions, but does not cause any incorrect results. Second, the system overall reproduces 33.1% of the transcription regulatory events contained in RegulonDB (up to 85.0% precision) and the inference module is required for 93.8% of the reproduced events. Third, we applied the system with minimum adaptations to the identification of cell activity regulation events, confirming that the inference improves the performance of the system also on this task. Conclusions Our research shows that the inference based on domain knowledge plays a significant role in extracting complex events from text. This approach has great potential in recognizing the complex concepts of such biomedical ontologies as Gene Ontology in the literature.

  2. Time-Frequency Distribution of Music based on Sparse Wavelet Packet Representations

    DEFF Research Database (Denmark)

    Endelt, Line Ørtoft

    We introduce a new method for generating time-frequency distributions, which is particularly useful for the analysis of music signals. The method presented here is based on $\ell_1$ sparse representations of music signals in a redundant wavelet packet dictionary. The representations are found using the minimization methods basis pursuit and best orthogonal basis. Visualizations of the time-frequency distribution are constructed based on a simplified energy distribution in the wavelet packet decomposition. The time-frequency distributions emphasize structured musical content, including non-stationary content...

  3. High-resolution mid-IR spectrometer based on frequency upconversion

    DEFF Research Database (Denmark)

    Hu, Qi; Dam, Jeppe Seidelin; Pedersen, Christian

    2012-01-01

    We demonstrate a novel approach for high-resolution spectroscopy based on frequency upconversion and postfiltering by means of a scanning Fabry–Perot interferometer. The system is based on sum-frequency mixing, shifting the spectral content from the mid-infrared to the near-visible region al......-frequency 1064 nm laser. We investigate water vapor emission lines from a butane burner and compare the measured results to model data. We suggest that the presented method be used for real-time monitoring of specific gas lines and reference signals....

  4. Fluence-based and microdosimetric event-based methods for radiation protection in space

    International Nuclear Information System (INIS)

    Curtis, S.B.

    2002-01-01

    The National Council on Radiation Protection and Measurements (NCRP) has recently published a report (Report No. 137) that discusses various aspects of the concepts used in radiation protection and the difficulties in measuring the radiation environment in spacecraft for the estimation of radiation risk to space travelers. Two novel dosimetric methodologies, fluence-based and microdosimetric event-based methods, are discussed and evaluated, along with the more conventional quality factor/linear energy transfer (LET) method. It was concluded that, for the present, there is no compelling reason to switch to a new methodology, but that because of certain drawbacks in the presently used conventional method, these alternative methodologies should be kept in mind. As new data become available and dosimetric techniques become more refined, the question should be revisited, and significant improvement might be realized in the future. In addition, such concepts as equivalent dose and organ dose equivalent are discussed and various problems regarding the measurement and estimation of these quantities are presented. (author)

  5. Event-based motion correction for PET transmission measurements with a rotating point source

    International Nuclear Information System (INIS)

    Zhou, Victor W; Kyme, Andre Z; Meikle, Steven R; Fulton, Roger

    2011-01-01

    Accurate attenuation correction is important for quantitative positron emission tomography (PET) studies. When performing transmission measurements using an external rotating radioactive source, object motion during the transmission scan can distort the attenuation correction factors computed as the ratio of the blank to transmission counts, and cause errors and artefacts in reconstructed PET images. In this paper we report a compensation method for rigid body motion during PET transmission measurements, in which list mode transmission data are motion corrected event-by-event, based on known motion, to ensure that all events which traverse the same path through the object are recorded on a common line of response (LOR). As a result, the motion-corrected transmission LOR may record a combination of events originally detected on different LORs. To ensure that the corresponding blank LOR records events from the same combination of contributing LORs, the list mode blank data are spatially transformed event-by-event based on the same motion information. The number of counts recorded on the resulting blank LOR is then equivalent to the number of counts that would have been recorded on the corresponding motion-corrected transmission LOR in the absence of any attenuating object. The proposed method has been verified in phantom studies with both stepwise movements and continuous motion. We found that attenuation maps derived from motion-corrected transmission and blank data agree well with those of the stationary phantom and are significantly better than uncorrected attenuation data.

  6. A robust neural network-based approach for microseismic event detection

    KAUST Repository

    Akram, Jubran

    2017-08-17

    We present an artificial neural network based approach for robust event detection from low S/N waveforms. We use a feed-forward network with a single hidden layer that is tuned on a training dataset and later applied to the entire example dataset for event detection. The input features include the average of absolute amplitudes, variance, energy ratio and polarization rectilinearity. These features are calculated in a moving window of the same length over the entire waveform. The output is set as a user-specified relative probability curve, which provides a robust way of distinguishing between weak and strong events. An optimal network is selected by studying the weight-based saliency and the effect of the number of neurons on the predicted results. Using synthetic data examples, we demonstrate that this approach is effective in detecting weaker events and reduces the number of false positives.
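The moving-window input features can be sketched as follows. The window length, the energy-ratio definition, and the synthetic trace are assumptions for illustration; the network itself and polarization rectilinearity (which needs three-component data) are omitted.

```python
import numpy as np

# Sketch of per-window features for event detection: average absolute
# amplitude, variance, and energy of the window relative to the whole trace.
def window_features(x, win=50):
    feats = []
    for i in range(0, len(x) - win + 1, win):
        w = x[i:i + win]
        avg_abs = np.mean(np.abs(w))
        var = np.var(w)
        energy_ratio = np.sum(w ** 2) / np.sum(x ** 2)
        feats.append((avg_abs, var, energy_ratio))
    return np.array(feats)

rng = np.random.default_rng(0)
trace = rng.normal(0.0, 0.1, 500)                          # background noise
trace[200:250] += np.sin(np.linspace(0, 20 * np.pi, 50))   # buried "event"
F = window_features(trace)
print(F.shape, int(F[:, 2].argmax()))  # the event window dominates the energy
```

In the paper these feature vectors form the network input, and the target is a relative probability curve rather than a hard detection label.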

  7. A new frequency matching technique for FRF-based model updating

    Science.gov (United States)

    Yang, Xiuming; Guo, Xinglin; Ouyang, Huajiang; Li, Dongsheng

    2017-05-01

    Frequency Response Function (FRF) residues have been widely used to update finite element models. They are a form of original measurement information and have the advantages of rich data, no extraction errors, etc. However, like other sensitivity-based methods, an FRF-based identification method must also confront the ill-conditioning problem, which is even more serious here since the sensitivity of the FRF in the vicinity of a resonance is much greater than elsewhere. Furthermore, for a given measured frequency, directly using the theoretical FRF at that frequency may lead to a huge difference between the theoretical FRF and the corresponding experimental FRF, which amplifies the effects of measurement errors and damping. Hence, in the solution process, correctly selecting the appropriate frequency at which to evaluate the theoretical FRF in every iteration of the sensitivity-based approach is an effective way to improve the robustness of an FRF-based algorithm. A primary tool for frequency selection based on the correlation of FRFs is the Frequency Domain Assurance Criterion. This paper presents a new frequency selection method which directly finds the frequency that minimizes the difference in order of magnitude between the theoretical and experimental FRFs. A simulated truss structure is used to compare the performance of different frequency selection methods. To be realistic, it is assumed that not all degrees of freedom (DoFs) are available for measurement. The minimum number of DoFs required by each approach to correctly update the analytical model is regarded as the standard for correct identification.
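The frequency-matching idea can be sketched on a single-degree-of-freedom receptance: for a measured FRF magnitude, search near the measured frequency for the analytical frequency whose FRF magnitude is closest on a log scale. The SDOF model, the search band, and the 5% amplitude error are illustrative assumptions, not the paper's truss example.

```python
import numpy as np

def receptance(omega, k=1e4, m=1.0, zeta=0.02):
    # SDOF receptance with viscous damping c = 2*zeta*sqrt(k*m)
    return 1.0 / (k - m * omega**2 + 2j * zeta * np.sqrt(k * m) * omega)

omegas = np.linspace(10.0, 200.0, 2000)     # analytical frequency grid, rad/s
H_theo = np.abs(receptance(omegas))

def match_frequency(H_exp_mag, w_meas, band=0.1):
    """Within +/- band of the measured frequency, pick the analytical
    frequency whose FRF magnitude is closest on a log10 scale."""
    idx = np.where((omegas >= (1 - band) * w_meas) &
                   (omegas <= (1 + band) * w_meas))[0]
    best = idx[np.argmin(np.abs(np.log10(H_theo[idx]) - np.log10(H_exp_mag)))]
    return omegas[best]

# A measurement at 50 rad/s whose magnitude is 5% too high is matched to a
# nearby, slightly higher frequency rather than evaluated at 50 rad/s itself.
w_sel = match_frequency(1.05 * np.abs(receptance(50.0)), 50.0)
print(w_sel)
```

Restricting the search to a band around the measured frequency avoids matching the mirror branch on the far side of a resonance, where the same magnitude recurs.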

  8. Aerosol events in the broader Mediterranean basin based on 7-year (2000–2007 MODIS C005 data

    Directory of Open Access Journals (Sweden)

    A. Gkikas

    2009-09-01

    Full Text Available Aerosol events (their frequency and intensity) in the broader Mediterranean basin were studied using 7-year (2000–2007) aerosol data of optical depth (AOD) at 550 nm from the MODerate Resolution Imaging Spectroradiometer (MODIS) Terra. The complete spatial coverage of the data revealed a significant spatial variability of aerosol events which also depends on their intensity. Strong events occur more often in the western and central Mediterranean basin (up to 14 events/year) whereas extreme events (AOD up to 5.0) are systematically observed in the eastern Mediterranean basin throughout the year. There is also a significant seasonal variability, with strong aerosol events occurring most frequently in the western part of the basin in summer and extreme episodes in the eastern part during spring. The events were also analyzed separately over land and sea, revealing differences that are due to different natural and anthropogenic processes, like dust transport (producing maximum frequencies of extreme episodes in spring over both land and sea) or forest fires (producing maximum frequencies of strong episodes in summer over land). The inter-annual variability shows a gradual decrease in the frequency of all aerosol episodes over land and sea areas of the Mediterranean during the period 2000–2007, associated with an increase in their intensity (increased AOD values). The strong spatiotemporal variability of aerosol events indicates the need to monitor them at the highest spatial and temporal coverage and resolution.

  9. Assessment of initial soil moisture conditions for event-based rainfall-runoff modelling

    OpenAIRE

    Tramblay, Yves; Bouvier, Christophe; Martin, C.; Didon-Lescot, J. F.; Todorovik, D.; Domergue, J. M.

    2010-01-01

    Flash floods are the most destructive natural hazards that occur in the Mediterranean region. Rainfall-runoff models can be very useful for flash flood forecasting and prediction. Event-based models are very popular for operational purposes, but there is a need to reduce the uncertainties related to the initial moisture conditions estimation prior to a flood event. This paper aims to compare several soil moisture indicators: local Time Domain Reflectometry (TDR) measurements of soil moisture,...

  10. Safety based on organisational learning (SOL) - Conceptual approach and verification of a method for event analysis

    International Nuclear Information System (INIS)

    Miller, R.; Wilpert, B.; Fahlbruch, B.

    1999-01-01

    This paper discusses a method for analysing safety-relevant events in NPP which is known as 'SOL', safety based on organisational learning. After discussion of the specific organisational and psychological problems examined in the event analysis, the analytic process using the SOL approach is explained as well as the required general setting. The SOL approach has been tested both with scientific experiments and from the practical perspective, by operators of NPPs and experts from other branches of industry. (orig./CB) [de]

  11. Regional frequency analysis of extreme rainfall in Belgium based on radar estimates

    Directory of Open Access Journals (Sweden)

    E. Goudenhoofdt

    2017-10-01

    Full Text Available In Belgium, only rain gauge time series have been used so far to study extreme rainfall at a given location. In this paper, the potential of a 12-year quantitative precipitation estimation (QPE) from a single weather radar is evaluated. For the period 2005–2016, 1 and 24 h rainfall extremes from automatic rain gauges and collocated radar estimates are compared. The peak intensities are fitted to the exponential distribution using regression in Q-Q plots with a threshold rank which minimises the mean squared error. A basic radar product used as reference exhibits unrealistic high extremes and is not suitable for extreme value analysis. For 24 h rainfall extremes, which occur partly in winter, the radar-based QPE needs a bias correction. A few missing events are caused by the wind drift associated with convective cells and strong radar signal attenuation. Differences between radar and gauge rainfall values are caused by spatial and temporal sampling, gauge underestimations and radar errors. Nonetheless the fit to the QPE data is within the confidence interval of the gauge fit, which remains large due to the short study period. A regional frequency analysis for 1 h duration is performed at the locations of four gauges with 1965–2008 records using the spatially independent QPE data in a circle of 20 km. The confidence interval of the radar fit, which is small due to the sample size, contains the gauge fit for the two closest stations from the radar. In Brussels, the radar extremes are significantly higher than the gauge rainfall extremes, but similar to those observed by an automatic gauge during the same period. The extreme statistics exhibit slight variations related to topography. The radar-based extreme value analysis can be extended to other durations.
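The exponential Q-Q fit for peaks over a threshold can be sketched as follows. The data are synthetic (not Belgian radar QPE), and the MSE-optimised threshold rank is simplified to a fixed rank; the slope of the regression line in the Q-Q plot estimates the exponential scale parameter.

```python
import numpy as np

# Sketch: fit the largest k rainfall peaks to an exponential distribution
# by linear regression in a Q-Q plot (fixed rank k instead of MSE-optimised).
rng = np.random.default_rng(42)
peaks = np.sort(rng.exponential(scale=12.0, size=200))[::-1]  # mm/h, descending

k = 50                                  # number of largest peaks retained
excesses = np.sort(peaks[:k])           # top-k values, ascending
# Exponential plotting positions: q_i = -ln(1 - i/(k+1))
q = -np.log(1.0 - np.arange(1, k + 1) / (k + 1.0))
slope, intercept = np.polyfit(q, excesses, 1)
print(round(slope, 1), round(intercept, 1))  # scale estimate, threshold level
```

By the memoryless property of the exponential distribution, the slope should recover the true scale (12 mm/h here) and the intercept approximates the threshold, which is why the Q-Q regression is a natural fitting device for peaks-over-threshold data.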

  12. Assessing distractors and teamwork during surgery: developing an event-based method for direct observation.

    Science.gov (United States)

    Seelandt, Julia C; Tschan, Franziska; Keller, Sandra; Beldi, Guido; Jenni, Nadja; Kurmann, Anita; Candinas, Daniel; Semmer, Norbert K

    2014-11-01

    To develop a behavioural observation method to simultaneously assess distractors and communication/teamwork during surgical procedures through direct, on-site observations; to establish the reliability of the method for long (>3 h) procedures. Observational categories for an event-based coding system were developed based on expert interviews, observations and a literature review. Using Cohen's κ and the intraclass correlation coefficient, interobserver agreement was assessed for 29 procedures. Agreement was calculated for the entire surgery, and for the 1st hour. In addition, interobserver agreement was assessed between two tired observers and between a tired and a non-tired observer after 3 h of surgery. The observational system has five codes for distractors (door openings, noise distractors, technical distractors, side conversations and interruptions), eight codes for communication/teamwork (case-relevant communication, teaching, leadership, problem solving, case-irrelevant communication, laughter, tension and communication with external visitors) and five contextual codes (incision, last stitch, personnel changes in the sterile team, location changes around the table and incidents). Based on 5-min intervals, Cohen's κ was good to excellent for distractors (0.74-0.98) and for communication/teamwork (0.70-1). Based on frequency counts, intraclass correlation coefficient was excellent for distractors (0.86-0.99) and good to excellent for communication/teamwork (0.45-0.99). After 3 h of surgery, Cohen's κ was 0.78-0.93 for distractors, and 0.79-1 for communication/teamwork. The observational method developed allows a single observer to simultaneously assess distractors and communication/teamwork. Even for long procedures, high interobserver agreement can be achieved. Data collected with this method allow for investigating separate or combined effects of distractions and communication/teamwork on surgical performance and patient outcomes.

  13. Full-waveform detection of non-impulsive seismic events based on time-reversal methods

    Science.gov (United States)

    Solano, Ericka Alinne; Hjörleifsdóttir, Vala; Liu, Qinya

    2017-12-01

    We present a full-waveform detection method for non-impulsive seismic events, based on time-reversal principles. We use the strain Green's tensor as a matched filter, correlating it with continuous observed seismograms, to detect non-impulsive seismic events. We show that this is mathematically equivalent to an adjoint method for detecting earthquakes. We define the detection function, a scalar-valued function, which depends on the stacked correlations for a group of stations. Event detections are given by the times at which the amplitude of the detection function exceeds a given value relative to the noise level. The method can make use of the whole seismic waveform or any combination of time windows with different filters. It is expected to have an advantage over traditional detection methods for events that do not produce energetic and impulsive P waves, for example glacial events, landslides, volcanic events and transform-fault earthquakes, provided the velocity structure along the path is relatively well known. Furthermore, the method has advantages over empirical Green's function template-matching methods, as it does not depend on records from previously detected events, and is therefore not limited to events occurring in similar regions and with similar focal mechanisms as those events. The method is not specific to any particular way of calculating the synthetic seismograms, and therefore complicated structural models can be used. This is particularly beneficial for intermediate-size events that are registered on regional networks, for which the effect of lateral structure on the waveforms can be significant. To demonstrate the feasibility of the method, we apply it to two different areas located along the mid-oceanic ridge system west of Mexico where non-impulsive events have been reported. The first study area is between Clipperton and Siqueiros transform faults (9°N), during the time of two earthquake swarms, occurring in March 2012 and May

  14. A general theory on frequency and time-frequency analysis of irregularly sampled time series based on projection methods - Part 2: Extension to time-frequency analysis

    Science.gov (United States)

    Lenoir, Guillaume; Crucifix, Michel

    2018-03-01

    Geophysical time series are sometimes sampled irregularly along the time axis. The situation is particularly frequent in palaeoclimatology. Yet, there is so far no general framework for handling the continuous wavelet transform when the time sampling is irregular. Here we provide such a framework. To this end, we define the scalogram as the continuous-wavelet-transform equivalent of the extended Lomb-Scargle periodogram defined in Part 1 of this study (Lenoir and Crucifix, 2018). The signal being analysed is modelled as the sum of a locally periodic component in the time-frequency plane, a polynomial trend, and a background noise. The mother wavelet adopted here is the Morlet wavelet classically used in geophysical applications. The background noise model is a stationary Gaussian continuous autoregressive-moving-average (CARMA) process, which is more general than the traditional Gaussian white and red noise processes. The scalogram is smoothed by averaging over neighbouring times in order to reduce its variance. The Shannon-Nyquist exclusion zone is however defined as the area corrupted by local aliasing issues. The local amplitude in the time-frequency plane is then estimated with least-squares methods. We also derive an approximate formula linking the squared amplitude and the scalogram. Based on this property, we define a new analysis tool: the weighted smoothed scalogram, which we recommend for most analyses. The estimated signal amplitude also gives access to band and ridge filtering. Finally, we design a test of significance for the weighted smoothed scalogram against the stationary Gaussian CARMA background noise, and provide algorithms for computing confidence levels, either analytically or with Monte Carlo Markov chain methods. All the analysis tools presented in this article are available to the reader in the Python package WAVEPAL.

  15. Managing wildfire events: risk-based decision making among a group of federal fire managers

    Science.gov (United States)

    Robyn S. Wilson; Patricia L. Winter; Lynn A. Maguire; Timothy. Ascher

    2011-01-01

    Managing wildfire events to achieve multiple management objectives involves a high degree of decision complexity and uncertainty, increasing the likelihood that decisions will be informed by experience-based heuristics triggered by available cues at the time of the decision. The research reported here tests the prevalence of three risk-based biases among 206...

  16. Supervision in the PC based prototype for the ATLAS event filter

    CERN Document Server

    Bee, C P; Etienne, F; Fede, E; Meessen, C; Nacasch, R; Qian, Z; Touchard, F

    1999-01-01

    A prototype of the ATLAS event filter based on commodity PCs linked by a Fast Ethernet switch has been developed in Marseille. The present contribution focuses on the supervision aspects of the prototype, based on Java and Java mobile agent technology. (5 refs).

  17. Dynamics of large-scale cortical interactions at high gamma frequencies during word production: event related causality (ERC) analysis of human electrocorticography (ECoG).

    Science.gov (United States)

    Korzeniewska, Anna; Franaszczuk, Piotr J; Crainiceanu, Ciprian M; Kuś, Rafał; Crone, Nathan E

    2011-06-15

    Intracranial EEG studies in humans have shown that functional brain activation in a variety of functional-anatomic domains of human cortex is associated with an increase in power at a broad range of high gamma (>60 Hz) frequencies. Although these electrophysiological responses are highly specific for the location and timing of cortical processing and in animal recordings are highly correlated with increased population firing rates, there has been little direct empirical evidence for causal interactions between different recording sites at high gamma frequencies. Such causal interactions are hypothesized to occur during cognitive tasks that activate multiple brain regions. To determine whether such causal interactions occur at high gamma frequencies and to investigate their functional significance, we used event-related causality (ERC) analysis to estimate the dynamics, directionality, and magnitude of event-related causal interactions using subdural electrocorticography (ECoG) recorded during two word production tasks: picture naming and auditory word repetition. A clinical subject who had normal hearing but was skilled in American Signed Language (ASL) provided a unique opportunity to test our hypothesis with reference to a predictable pattern of causal interactions, i.e. that language cortex interacts with different areas of sensorimotor cortex during spoken vs. signed responses. Our ERC analyses confirmed this prediction. During word production with spoken responses, perisylvian language sites had prominent causal interactions with mouth/tongue areas of motor cortex, and when responses were gestured in sign language, the most prominent interactions involved hand and arm areas of motor cortex. Furthermore, we found that the sites from which the most numerous and prominent causal interactions originated, i.e. sites with a pattern of ERC "divergence", were also sites where high gamma power increases were most prominent and where electrocortical stimulation mapping

  18. Ancillary Frequency Control of Direct Drive Full-Scale Converter Based Wind Power Plants

    DEFF Research Database (Denmark)

    Hu, Weihao; Su, Chi; Fang, Jiakun

    2013-01-01

    This paper presents a simulation model of a wind power plant based on a MW-level variable speed wind turbine with a full-scale back-to-back power converter, developed in the simulation tool DIgSILENT PowerFactory. Three different kinds of ancillary frequency control strategies, namely inertia emulation, primary frequency control and secondary frequency control, are proposed in order to improve the frequency stability of power systems. The modified IEEE 39-bus test system with a large-scale wind power penetration is chosen as the studied power system. Simulation results show that the proposed control strategies are effective means for providing ancillary frequency control of variable speed wind turbines with full-scale back-to-back power converters.

  19. EEMD-MUSIC-Based Analysis for Natural Frequencies Identification of Structures Using Artificial and Natural Excitations

    Directory of Open Access Journals (Sweden)

    David Camarena-Martinez

    2014-01-01

    Full Text Available This paper presents a new EEMD-MUSIC (ensemble empirical mode decomposition–multiple signal classification) based methodology to identify modal frequencies in structures from free and ambient vibration signals produced by artificial and natural excitations, also considering several factors such as nonstationary effects, close modal frequencies, and noisy environments, which are common situations where several techniques reported in the literature fail. The EEMD and MUSIC methods are used to decompose the vibration signal into a set of IMFs (intrinsic mode functions) and to identify the natural frequencies of a structure, respectively. The effectiveness of the proposed methodology has been validated and tested with synthetic signals and under real operating conditions. The experiments focus on extracting the natural frequencies of a truss-type scaled structure and of a bridge used for both highway traffic and pedestrians. Results show the proposed methodology to be a suitable solution for natural frequency identification of structures from free and ambient vibration signals.

  20. Frequency domain indirect identification of AMB rotor systems based on fictitious proportional feedback gain

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Hyeong Joon [Dept. of Mechanical Engineering, Soongsil University, Seoul (Korea, Republic of); Kim, Chan Jung [Dept. of Mechanical Design Engineering, Pukyong National University, Busan (Korea, Republic of)

    2016-12-15

    It is very difficult to directly identify an unstable system with uncertain dynamics from frequency domain input-output data. In such cases, closed-loop frequency responses calculated using a fictitious feedback can be more identifiable than open-loop data. This paper presents a frequency domain indirect identification of active magnetic bearing (AMB) rotor systems based on a fictitious proportional feedback gain (FPFG). The closed-loop effect of the FPFG can enhance the detectability of the system by moving the system poles and significantly weighting the target mode in the frequency domain. The effectiveness of the proposed identification method was verified through the frequency domain identification of AMB rotor systems.

  1. Dissemination of optical-comb-based ultra-broadband frequency reference through a fiber network.

    Science.gov (United States)

    Nagano, Shigeo; Kumagai, Motohiro; Li, Ying; Ido, Tetsuya; Ishii, Shoken; Mizutani, Kohei; Aoki, Makoto; Otsuka, Ryohei; Hanado, Yuko

    2016-08-22

    We disseminated an ultra-broadband optical frequency reference based on a femtosecond (fs) laser optical comb through a kilometer-scale fiber link. Its spectrum ranged from 1160 nm to 2180 nm without additional fs-laser combs at the end of the link. By employing a fiber-induced phase noise cancellation technique, the linewidth and fractional frequency instability attained for all disseminated comb modes were of order 1 Hz and 10^-18 at a 5000 s averaging time. The ultra-broad optical frequency reference, whose absolute frequency is traceable to Japan Standard Time, was applied to the frequency stabilization of an injection-seeded Q-switched 2051 nm pulse laser for a coherent light detection and ranging (LIDAR) system.

  2. An automatic method to determine cutoff frequency based on image power spectrum

    International Nuclear Information System (INIS)

    Beis, J.S.; Vancouver Hospital and Health Sciences Center, British Columbia; Celler, A.; Barney, J.S.

    1995-01-01

    The authors present an algorithm for automatically choosing the filter cutoff frequency (F_c) using the power spectrum of the projections. The method is based on the assumption that the expectation of the image power spectrum is the sum of the expectation of the blurred object power spectrum (dominant at low frequencies) plus a constant value due to Poisson noise. By considering the discrete components of the noise-dominated high-frequency spectrum as a Gaussian distribution N(μ,σ), the Student t-test determines F_c as the highest frequency for which the image frequency components are unlikely to be drawn from N(μ,σ). The method is general and can be applied to any filter. In this work, the authors tested the approach using the Metz restoration filter on simulated, phantom, and patient data with good results. Quantitative performance of the technique was evaluated by plotting the recovery coefficient (RC) versus the NMSE of reconstructed images
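The cutoff-selection idea can be sketched numerically: model the noise-dominated high-frequency tail as N(μ,σ) and take F_c as the highest frequency whose spectral component is unlikely to come from that distribution. The spectrum model, tail length, and significance level below are illustrative assumptions, and the test statistic is a simplified standardised comparison against the t critical value, not the authors' exact procedure.

```python
import numpy as np
from scipy import stats

# Synthetic 1-D image power spectrum: a decaying object spectrum
# plus a flat Poisson-noise plateau.
rng = np.random.default_rng(1)
freqs = np.linspace(0.0, 0.5, 256)                 # cycles/pixel
signal = 1e4 * np.exp(-freqs / 0.05)               # blurred-object spectrum
spectrum = signal + rng.normal(50.0, 5.0, 256)     # add noise floor

tail = spectrum[-64:]                              # assumed noise-dominated
mu, sd = tail.mean(), tail.std(ddof=1)

alpha = 0.01
crit = stats.t.ppf(1 - alpha, df=len(tail) - 1)    # one-sided critical value
above = np.where((spectrum - mu) / sd > crit)[0]   # "signal-like" components
f_c = freqs[above.max()]                           # highest such frequency
print(round(f_c, 3))
```

Everything below F_c is treated as object information worth preserving by the restoration filter; everything above it is consistent with the noise plateau.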

  3. EEMD-MUSIC-Based Analysis for Natural Frequencies Identification of Structures Using Artificial and Natural Excitations

    Science.gov (United States)

    Amezquita-Sanchez, Juan P.; Romero-Troncoso, Rene J.; Osornio-Rios, Roque A.; Garcia-Perez, Arturo

    2014-01-01

    This paper presents a new EEMD-MUSIC- (ensemble empirical mode decomposition-multiple signal classification-) based methodology to identify modal frequencies in structures from free and ambient vibration signals produced by artificial and natural excitations, while also accommodating factors such as nonstationary effects, close modal frequencies, and noisy environments, common situations in which several techniques reported in the literature fail. The EEMD and MUSIC methods are used, respectively, to decompose the vibration signal into a set of IMFs (intrinsic mode functions) and to identify the natural frequencies of a structure. The effectiveness of the proposed methodology has been validated and tested with synthetic signals and under real operating conditions. The experiments focus on extracting the natural frequencies of a truss-type scaled structure and of a bridge used by both highway traffic and pedestrians. Results show the proposed methodology to be a suitable solution for identifying the natural frequencies of structures from free and ambient vibration signals. PMID:24683346
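    The MUSIC stage alone can be sketched as follows (the EEMD decomposition step is omitted, and the snapshot dimension `m` and grid density are illustrative choices, not values from the paper):

```python
import numpy as np

def music_frequencies(x, fs, n_sources, m=32, n_grid=2000):
    """Estimate the frequencies of real sinusoids in `x` via the MUSIC
    pseudospectrum (frequency-identification stage only)."""
    x = np.asarray(x, dtype=float)
    n_snap = len(x) - m + 1
    X = np.stack([x[i:i + m] for i in range(n_snap)], axis=1)  # (m, n_snap)
    R = X @ X.T / n_snap                      # sample covariance matrix
    _, V = np.linalg.eigh(R)                  # eigenvalues in ascending order
    En = V[:, :m - 2 * n_sources]             # noise subspace (2 dims per real tone)
    grid = np.linspace(0.0, fs / 2, n_grid)
    k = np.arange(m)
    steer = np.exp(2j * np.pi * np.outer(grid, k) / fs)        # steering vectors
    pseudo = 1.0 / (np.linalg.norm(steer @ En, axis=1) ** 2 + 1e-15)
    peaks = []
    for idx in np.argsort(pseudo)[::-1]:      # greedily keep separated maxima
        if all(abs(grid[idx] - p) > fs / 100 for p in peaks):
            peaks.append(grid[idx])
        if len(peaks) == n_sources:
            break
    return sorted(peaks)
```

In the paper's pipeline each IMF produced by EEMD would be fed to a stage like this, which is what allows close modes to be separated.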

  4. A stabilized optical frequency comb based on an Er-doped fiber femtosecond laser

    Science.gov (United States)

    Xia, Chuanqing; Wu, Tengfei; Zhao, Chunbo; Xing, Shuai

    2018-03-01

    An optical frequency comb based on a 250 MHz home-made Er-doped fiber femtosecond laser is presented in this paper. The Er-doped fiber laser has a ring cavity and is mode-locked in the femtosecond regime by the technique of nonlinear polarization rotation. The pulse duration is 118 fs and the spectral width is 30 nm. Part of the femtosecond laser output is amplified in an Er-doped fiber amplifier before propagating through a piece of highly nonlinear fiber that broadens the spectrum. The carrier-envelope offset frequency of the comb, which has a signal-to-noise ratio of more than 35 dB, is extracted by means of f-2f beating. We demonstrate that both the carrier-envelope offset frequency and the repetition frequency remain phase-locked to a rubidium atomic clock simultaneously for 2 hours. Such frequency-stabilized fiber combs will be increasingly applied in optical metrology, attosecond pulse generation, and absolute distance measurement.

  5. Dual-wavelength green laser with a 4.5 THz frequency difference based on self-frequency-doubling in Nd3+-doped aperiodically poled lithium niobate.

    Science.gov (United States)

    Maestre, H; Torregrosa, A J; Fernández-Pousa, C R; Rico, M L; Capmany, J

    2008-05-01

    We report a dual-wavelength continuous-wave laser at 542.4 and 546.8 nm based on an Nd(3+)-doped aperiodically poled lithium niobate crystal. Two fundamental infrared (IR) wavelengths at 1084.8 and 1093.6 nm oscillate simultaneously and are self-frequency-doubled to the green. The aperiodic domain distribution patterned in the crystal allows quasi-phase-matched self-frequency-doubling of both IR fundamentals while avoiding their sum-frequency mixing.

  6. Analyzing mobile WiMAX base station deployment under different frequency planning strategies

    Science.gov (United States)

    Salman, M. K.; Ahmad, R. B.; Ali, Ziad G.; Aldhaibani, Jaafar A.; Fayadh, Rashid A.

    2015-05-01

    The frequency spectrum is a precious and scarce resource in the communication markets. Therefore, different techniques are adopted to utilize the available spectrum when deploying WiMAX base stations (BS) in cellular networks. In this paper several frequency planning techniques are illustrated, and a comprehensive comparative study between conventional frequency reuse of 1 (FR of 1) and fractional frequency reuse (FFR) is presented. These techniques are widely used in network deployment because they employ a universal frequency plan (using all the available bandwidth) in base station installation and configuration. This paper presents a network model of 19 base stations for comparing the aforesaid frequency planning techniques. Users are randomly distributed within base stations, and users' resource mapping and burst profile selection are based on the measured signal to interference plus noise ratio (SINR). Simulation results reveal that FFR has advantages over conventional FR of 1 on various metrics. 98% of downlink resources (slots) are exploited when FFR is applied, compared with 81% under FR of 1. The FFR data rate reaches 10.6 Mbps, versus 7.98 Mbps under FR of 1. Spectral efficiency, however, is higher under FR of 1 (1.072 bps/Hz) than under FFR (0.808 bps/Hz), since FR of 1 exploits all the bandwidth. Subcarrier efficiency shows how many data bits each subcarrier can carry under the different frequency planning techniques; the system carries more data bits under FFR (2.40 bit/subcarrier) than under FR of 1 (1.998 bit/subcarrier). This study confirms that FFR can perform better than conventional frequency planning (FR of 1), making it a strong candidate for WiMAX BS deployment in cellular networks.

  7. Simulation of stress-modulated magnetization precession frequency in Heusler-based spin torque oscillator

    International Nuclear Information System (INIS)

    Huang, Houbing; Zhao, Congpeng; Ma, Xingqiao

    2017-01-01

    We investigated stress-modulated magnetization precession frequency in a Heusler-based spin transfer torque oscillator by combining micromagnetic simulations with phase field microelasticity theory, encapsulating the magnetic tunnel junction into a multilayer structure. We proposed a novel method of using external stress, instead of an external magnetic field, to control the magnetization precession in a spin torque oscillator. The magnetization precession frequency can be linearly modulated by externally applied uniaxial in-plane stress, with a tunable range of 4.4–7.0 GHz under a stress of 10 MPa. By comparison, out-of-plane stress has a negligible influence on the precession frequency due to the large out-of-plane demagnetization field. The results offer new inspiration for the design of spin torque oscillator devices that simultaneously provide high frequency and a narrow output band and are tunable over a wide range of frequencies via external stress. - Highlights: • We proposed stress-modulated magnetization precession in spin torque oscillators. • The magnetization precession frequency can be linearly modulated by in-plane stress. • The stress can tune the precession frequency over the range 4.4–7.0 GHz. • Stress-modulated oscillation frequency can simplify STO devices.

  8. A digital frequency stabilization system of external cavity diode laser based on LabVIEW FPGA

    Science.gov (United States)

    Liu, Zhuohuan; Hu, Zhaohui; Qi, Lu; Wang, Tao

    2015-10-01

    Frequency stabilization of external cavity diode lasers plays an important role in physics research, and many laser frequency locking solutions have been proposed. Traditionally, the locking process has been accomplished by analog systems, which have fast feedback control response but are susceptible to environmental effects. In order to improve the automation level and reliability of the frequency stabilization system, we take a grating-feedback external cavity diode laser as the laser source and set up a digital frequency stabilization system based on a National Instruments FPGA (NI FPGA) board. The system consists of a saturated-absorption frequency stabilization beam path, a differential photoelectric detector, an NI FPGA board, and a host computer. Functions such as piezoelectric transducer (PZT) sweeping, atomic saturated-absorption signal acquisition, signal peak identification, error signal generation, and laser PZT voltage feedback control are implemented entirely in the LabVIEW FPGA program. Compared with an analog system built from logic gate circuits, the digital system performs stably and reliably, and the LabVIEW user interface is friendly. Moreover, owing to its reconfigurability, the LabVIEW program is easily ported to other NI FPGA boards. Most importantly, the system periodically checks the error signal; once an abnormal error signal is detected, the FPGA restarts the frequency stabilization process without manual intervention. From the fluctuation of the error signal of the atomic saturated-absorption line in the frequency-locked state, we infer that the laser frequency stability reaches 1 MHz.

  9. Simulation of stress-modulated magnetization precession frequency in Heusler-based spin torque oscillator

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Houbing, E-mail: hbhuang@ustb.edu.cn; Zhao, Congpeng; Ma, Xingqiao, E-mail: xqma@sas.ustb.edu.cn

    2017-03-15

    We investigated stress-modulated magnetization precession frequency in Heusler-based spin transfer torque oscillator by combining micromagnetic simulations with phase field microelasticity theory, by encapsulating the magnetic tunnel junction into multilayers structures. We proposed a novel method of using an external stress to control the magnetization precession in spin torque oscillator instead of an external magnetic field. The stress-modulated magnetization precession frequency can be linearly modulated by externally applied uniaxial in-plane stress, with a tunable range 4.4–7.0 GHz under the stress of 10 MPa. By comparison, the out-of-plane stress imposes negligible influence on the precession frequency due to the large out-of-plane demagnetization field. The results offer new inspiration to the design of spin torque oscillator devices that simultaneously process high frequency, narrow output band, and tunable over a wide range of frequencies via external stress. - Highlights: • We proposed stress-modulated magnetization precession in spin torque oscillator. • The magnetization precession frequency can be linearly modulated by in-plane stress. • The stress also can widen the magnetization frequency range 4.4–7.0 GHz. • The stress-modulated oscillation frequency can simplify STO devices.

  10. Neural correlates of attentional and mnemonic processing in event-based prospective memory

    Directory of Open Access Journals (Sweden)

    Justin B Knight

    2010-02-01

    Full Text Available Prospective memory, or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those perhaps associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT), followed by an LDT with an embedded prospective memory (PM) component. Event-based cues were constituted by color and lexicality (red words). Behavioral data provided evidence that monitoring, or preparatory attentional processes, were used to detect cues. Analysis of the event-related potentials (ERP) revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest preparatory attention may operate by selectively modulating processing of features related to a previously formed event-based intention, as well as provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  11. Neural correlates of attentional and mnemonic processing in event-based prospective memory.

    Science.gov (United States)

    Knight, Justin B; Ethridge, Lauren E; Marsh, Richard L; Clementz, Brett A

    2010-01-01

    Prospective memory (PM), or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those perhaps associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT), followed by an LDT with an embedded PM component. Event-based cues were constituted by color and lexicality (red words). Behavioral data provided evidence that monitoring, or preparatory attentional processes, were used to detect cues. Analysis of the event-related potentials (ERP) revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest preparatory attention may operate by selectively modulating processing of features related to a previously formed event-based intention, as well as provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  12. Changes in Alpha Frequency and Power of the Electroencephalogram during Volatile-Based General Anesthesia

    Directory of Open Access Journals (Sweden)

    Darren Hight

    2017-05-01

    Full Text Available Oscillations in the electroencephalogram (EEG) at the alpha frequency (8–12 Hz) are thought to be ubiquitous during surgical anesthesia, but the details of how this oscillation responds to ongoing changes in volatile anesthetic concentration have not been well characterized. It is not known how often alpha oscillations are absent in the clinical context, how sensitively alpha frequency and power respond to changes in anesthetic concentration, and what effect increased age has on alpha frequency. Bipolar EEG was recorded frontally from 305 patients undergoing surgery with sevoflurane or desflurane providing general anesthesia. A new method of detecting the presence of alpha oscillations, based on the stability of the rate of change of the peak frequency in the alpha range, was developed. Linear concentration-response curves were fitted to assess the sensitivity of alpha power and frequency measures to changing levels of anesthesia. Alpha oscillations were inexplicably absent in around 4% of patients. Maximal alpha power increased with increasing volatile anesthetic concentrations in half of the patients and decreased in the remaining patients. Alpha frequency decreased with increasing anesthetic concentrations in nearly 90% of patients. Increasing age was associated with decreased sensitivity to volatile anesthetic concentrations and with decreased alpha frequency, which sometimes transitioned into the theta range (5–7 Hz). While peak alpha frequency shows a consistent slowing with increasing volatile concentrations, the peak power of the oscillation does not, suggesting that frequency might be more informative of depth of anesthesia than traditional power-based measures during volatile-based anesthesia. The alpha oscillation becomes slower with increasing age, even when the decreased anesthetic needs of older patients are taken into account.

  13. Event-based scenario manager for multibody dynamics simulation of heavy load lifting operations in shipyards

    Directory of Open Access Journals (Sweden)

    Sol Ha

    2016-01-01

    Full Text Available This paper suggests an event-based scenario manager capable of creating and editing scenarios for shipbuilding process simulation based on multibody dynamics. To configure the various situations found in shipyards and connect easily with multibody dynamics, the proposed method rests on two main concepts: an Actor and an Action List. The Actor represents the atomic unit of action in the multibody dynamics and can be connected to a specific component of the dynamics kernel such as a body or joint. The user makes up a scenario by combining actors. The Action List contains the information for arranging and executing the actors. Since the shipbuilding process is an event-based sequence, all simulation models were configured using the Discrete EVent System Specification (DEVS) formalism. The proposed method was applied to simulations of various operations in shipyards, such as lifting and erection of a block and heavy load lifting operations using multiple cranes.

  14. Seismology-based early identification of dam-formation landquake events.

    Science.gov (United States)

    Chao, Wei-An; Zhao, Li; Chen, Su-Chin; Wu, Yih-Min; Chen, Chi-Hsuan; Huang, Hsin-Hua

    2016-01-12

    Flooding resulting from the bursting of dams formed by landquake events such as rock avalanches, landslides and debris flows can lead to serious bank erosion and inundation of populated areas near rivers. Seismic waves can be generated by landquake events, which can be described as time-dependent forces (unloading/reloading cycles) acting on the Earth. In this study, we conduct inversions of long-period (LP, period ≥20 s) waveforms for the landquake force histories (LFHs) of ten events, which provide quantitative characterization of the initiation, propagation and termination stages of the slope failures. When the results obtained from LP waveforms are analyzed together with high-frequency (HF, 1-3 Hz) seismic signals, we find a relatively strong late-arriving seismic phase (dubbed the dam-forming phase or D-phase) recorded clearly in the HF waveforms at the closest stations, which potentially marks the time when the collapsed mass slides into the river and perhaps even impacts the topographic barrier on the opposite bank. Consequently, the approach to analyzing the LP and HF waveforms developed in this study has a high potential for identifying the five dam-forming landquake events (DFLEs) in near real time using broadband seismic records, which can provide timely warnings of impending floods to downstream residents.

  15. Limits on the efficiency of event-based algorithms for Monte Carlo neutron transport

    Directory of Open Access Journals (Sweden)

    Paul K. Romano

    2017-09-01

    Full Text Available The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism on modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC were then used in conjunction with the models to calculate the speedup due to vectorization as a function of the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than the vector width to achieve a vector efficiency greater than 90%. When the execution times for events are allowed to vary, the vector speedup is also limited by differences in the execution times of events being carried out in a single event-iteration.
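    A toy simulation in the spirit of this efficiency model can illustrate why lane utilization improves with bank size. The geometric distribution of events per particle and all parameter values below are simplifications invented for illustration, not taken from the paper:

```python
import random

def vector_efficiency(bank_size, vector_width, mean_events=10.0, seed=1):
    """Toy estimate of lane utilization in an event-based particle loop.

    Each particle needs a random number of events; each sweep fills up to
    `vector_width` lanes from the bank of live particles, and lanes left
    empty near the end of the bank's life still cost a slot.  Returns the
    fraction of lane-slots doing useful work.
    """
    rng = random.Random(seed)
    p = 1.0 / mean_events
    remaining = [1 + int(rng.expovariate(p)) for _ in range(bank_size)]
    useful = total = 0
    while remaining:
        lanes = min(vector_width, len(remaining))
        useful += lanes
        total += vector_width            # idle lanes still consume a slot
        for i in range(lanes):
            remaining[i] -= 1            # one event per occupied lane
        remaining = [r for r in remaining if r > 0]
    return useful / total
```

Running this with a bank 20 times the vector width versus a bank only twice the vector width reproduces the qualitative trend reported in the paper: larger banks keep the vector lanes full longer.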

  16. Visualization of frequency-modulated electric field based on photonic frequency tracking in asynchronous electro-optic measurement system

    Science.gov (United States)

    Hisatake, Shintaro; Yamaguchi, Koki; Uchida, Hirohisa; Tojyo, Makoto; Oikawa, Yoichi; Miyaji, Kunio; Nagatsuma, Tadao

    2018-04-01

    We propose a new asynchronous measurement system to visualize the amplitude and phase distribution of a frequency-modulated electromagnetic wave. The system consists of three parts: a nonpolarimetric electro-optic frequency down-conversion part, a phase-noise-canceling part, and a frequency-tracking part. The photonic local oscillator signal generated by electro-optic phase modulation is controlled to track the frequency of the radio frequency (RF) signal to significantly enhance the measurable RF bandwidth. We demonstrate amplitude and phase measurement of a quasi-millimeter-wave frequency-modulated continuous-wave signal (24 GHz ± 80 MHz with a 2.5 ms period) as a proof-of-concept experiment.

  17. Reference Beam Pattern Design for Frequency Invariant Beamforming Based on Fast Fourier Transform

    Directory of Open Access Journals (Sweden)

    Wang Zhang

    2016-09-01

    Full Text Available In the field of fast Fourier transform (FFT)-based frequency invariant beamforming (FIB), one problem remains unsolved: the selection of the reference beam that makes the designed wideband pattern frequency invariant (FI) over a given frequency range. This problem is studied in this paper. The research shows that, for a given array, the selection of the reference beam pattern is determined by the number of sensors and the ratio of the highest frequency to the lowest frequency of the signal (RHL). The length of the weight vector corresponding to a given reference beam pattern depends on the reference frequency. In addition, the upper bound of the weight length that ensures the FI property over the whole frequency band of interest is given. Adding constraints to the reference beam does not affect the FI property of the designed wideband beam as long as the symmetry of the reference beam is preserved. Based on this conclusion, a scheme for reference beam design is proposed.

  18. Risk-based ranking of dominant contributors to maritime pollution events

    International Nuclear Information System (INIS)

    Wheeler, T.A.

    1993-01-01

    This report describes a conceptual approach for identifying dominant contributors to risk from maritime shipping of hazardous materials. Maritime transportation accidents are relatively common occurrences compared to more frequently analyzed contributors to public risk. Yet research on maritime safety and pollution incidents has not been guided by a systematic, risk-based approach. Maritime shipping accidents can be analyzed using event trees to group the accidents into 'bins,' or groups, of similar characteristics such as type of cargo, location of accident (e.g., harbor, inland waterway), type of accident (e.g., fire, collision, grounding), and size of release. The importance of specific types of events to each accident bin can be quantified. Then the overall importance of accident events to risk can be estimated by weighting the events' individual bin importance measures by the risk associated with each accident bin. 4 refs., 3 figs., 6 tabs

  19. Adaptive Event-Triggered Control Based on Heuristic Dynamic Programming for Nonlinear Discrete-Time Systems.

    Science.gov (United States)

    Dong, Lu; Zhong, Xiangnan; Sun, Changyin; He, Haibo

    2017-07-01

    This paper presents the design of a novel adaptive event-triggered control method based on the heuristic dynamic programming (HDP) technique for nonlinear discrete-time systems with unknown system dynamics. In the proposed method, the control law is updated only when the event-triggered condition is violated. Compared with the periodic updates in traditional adaptive dynamic programming (ADP) control, the proposed method can reduce computation and transmission cost. An actor-critic framework is used to learn the optimal event-triggered control law and the value function. Furthermore, a model network is designed to estimate the system state vector. The main contribution of this paper is the design of a new trigger threshold for discrete-time systems. A detailed Lyapunov stability analysis shows that the proposed event-triggered controller can asymptotically stabilize the discrete-time systems. Finally, we test our method on two different discrete-time systems, and the simulation results are included.
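    The event-triggered update rule itself can be illustrated on a scalar linear system. The plant, gain, and threshold below are invented purely for illustration and have nothing to do with the paper's HDP learning machinery; the point is only that the control is recomputed when the gap between the current state and the last-sampled state exceeds a threshold:

```python
def simulate(threshold, steps=200):
    """Event-triggered state feedback on a toy scalar system x+ = a*x + b*u.

    The control is recomputed only when the gap between the current state
    and the state used at the last update exceeds `threshold`."""
    a, b, k = 1.1, 1.0, 0.8          # open-loop unstable; a - b*k = 0.3 is stable
    x = x_event = 2.0
    u = -k * x_event
    updates = 0
    for _ in range(steps):
        if abs(x - x_event) > threshold:   # event-triggered condition violated
            x_event = x                    # sample the state
            u = -k * x_event               # recompute the control law
            updates += 1
        x = a * x + b * u
    return x, updates
```

With a positive threshold the state still converges to a small neighborhood of the origin, while the number of control updates (and hence transmissions) drops well below one per step, which is the cost saving the paper targets.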

  20. The low frequency 2D vibration sensor based on flat coil element

    Energy Technology Data Exchange (ETDEWEB)

    Djamal, Mitra [Department of Physics, Institut Teknologi Bandung, Jl. Ganesa 10 Bandung 40116 (Indonesia)]; Sanjaya, Edi [Department of Physics, Institut Teknologi Bandung, Jl. Ganesa 10 Bandung 40116 (Indonesia) and Department of Physics, UIN Syarif Hidayatullah, Jl. Ir.H. Djuanda 95 Ciputat 15412 (Indonesia)]; Islahudin [MTs NW Nurul Iman Kembang Kerang, Jl. Raya Mataram - Lb.Lombok, NTB (Indonesia)]; Ramli [Department of Physics, Institut Teknologi Bandung, Jl. Ganesa 10 Bandung 40116 (Indonesia) and Department of Physics, Universitas Negeri Padang, Jl. Prof. Hamka, Padang 25132 (Indonesia)]

    2012-06-20

    Vibration, such as that produced by an earthquake, is a physical phenomenon. The characteristics of these vibrations can be used in an early warning system to reduce the loss or damage caused by earthquakes. In this paper, we introduce a new type of low frequency 2D vibration sensor based on the flat coil element that we have developed. Its working principle is based on the position change of a seismic mass placed in front of a flat coil element. The flat coil is part of an LC oscillator; therefore, a change in the seismic mass position changes its resonance frequency. Measurements with the low frequency vibration sensor along the x and y axes give a frequency range of 0.2 to 1.0 Hz.

  1. A time and frequency synchronization method for CO-OFDM based on CMA equalizers

    Science.gov (United States)

    Ren, Kaixuan; Li, Xiang; Huang, Tianye; Cheng, Zhuo; Chen, Bingwei; Wu, Xu; Fu, Songnian; Ping, Perry Shum

    2018-06-01

    In this paper, an efficient time and frequency synchronization method based on a new training symbol structure is proposed for polarization division multiplexing (PDM) coherent optical orthogonal frequency division multiplexing (CO-OFDM) systems. Coarse timing synchronization is achieved by exploiting the correlation property of the first training symbol, and fine timing synchronization is accomplished by using the time-domain symmetric conjugate of the second training symbol. Furthermore, based on these training symbols, a constant modulus algorithm (CMA) is proposed for carrier frequency offset (CFO) estimation. Theoretical analysis and simulation results indicate that the algorithm is robust to poor optical signal-to-noise ratio (OSNR) and chromatic dispersion (CD). The frequency offset estimation range reaches [−(Nsc/2)ΔfN, +(Nsc/2)ΔfN] GHz, with the mean normalized estimation error below 12 × 10⁻³ even at an OSNR as low as 10 dB.

  2. Frequency hopping signal detection based on wavelet decomposition and Hilbert-Huang transform

    Science.gov (United States)

    Zheng, Yang; Chen, Xihao; Zhu, Rui

    2017-07-01

    Frequency hopping (FH) signals are widely adopted in military communications as a kind of low probability of interception signal. It is therefore important to study FH signal detection algorithms. Existing detection algorithms for FH signals based on time-frequency analysis cannot satisfy the time and frequency resolution requirements simultaneously, owing to the influence of the window function. In order to solve this problem, an algorithm based on wavelet decomposition and the Hilbert-Huang transform (HHT) is proposed. The proposed algorithm removes the noise from the received signals by wavelet decomposition and detects the FH signals by the Hilbert-Huang transform. Simulation results show that the proposed algorithm achieves both good time resolution and good frequency resolution, and correspondingly the accuracy of FH signal detection is improved.
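    The Hilbert step of the HHT, extracting instantaneous frequency from the analytic signal, can be sketched with an FFT-based Hilbert transform. The wavelet-denoising and empirical-mode-decomposition stages of the paper's algorithm are omitted here:

```python
import numpy as np

def instantaneous_frequency(x, fs):
    """Instantaneous frequency of `x` via the analytic signal, computed
    with an FFT-based Hilbert transform (assumes even-length input)."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1.0
    h[1:N // 2] = 2.0        # double positive frequencies
    h[N // 2] = 1.0          # keep the Nyquist bin
    z = np.fft.ifft(X * h)   # analytic signal
    phase = np.unwrap(np.angle(z))
    return np.diff(phase) * fs / (2 * np.pi)
```

Applied to an FH signal, the instantaneous-frequency track is piecewise constant, and its jumps reveal the hop instants with the fine time resolution that motivates the HHT approach.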

  3. Application of energies of optimal frequency bands for fault diagnosis based on modified distance function

    Energy Technology Data Exchange (ETDEWEB)

    Zamanian, Amir Hosein [Southern Methodist University, Dallas (United States); Ohadi, Abdolreza [Amirkabir University of Technology (Tehran Polytechnic), Tehran (Iran, Islamic Republic of)

    2017-06-15

    Low-dimensional, relevant feature sets are ideal for avoiding extra data mining in classification. The current work investigates the feasibility of utilizing the energies of vibration signals in optimal frequency bands as features for machine fault diagnosis. Energies in different frequency bands were derived based on Parseval's theorem. Optimal feature sets were extracted by optimizing the related frequency bands using a genetic algorithm and a modified distance function (MDF). The frequency bands and the number of bands were optimized based on the MDF, which is designed to a) maximize the distance between centers of classes, b) minimize the dispersion of features within each class, and c) minimize the dimension of the extracted feature sets. Experimental signals from two different gearboxes were used to demonstrate the efficiency of the presented technique. The results show the effectiveness of the presented technique for gear fault diagnosis.
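    The band-energy feature extraction via Parseval's theorem can be sketched as follows. The band edges in the usage example are arbitrary; in the paper they are chosen by genetic-algorithm optimization of the MDF:

```python
import numpy as np

def band_energies(x, fs, bands):
    """Energy of `x` inside each (f_lo, f_hi) band, using Parseval's
    theorem for the DFT: sum(x**2) == sum(|X|**2) / N."""
    N = len(x)
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(N, d=1.0 / fs)
    spec = np.abs(X) ** 2 / N
    # double the non-DC bins so the one-sided sum matches the total energy
    spec[1:] *= 2
    if N % 2 == 0:
        spec[-1] /= 2        # Nyquist bin is not doubled for even N
    return [spec[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]
```

Summing the returned values over a partition of [0, fs/2] recovers the total time-domain energy, which is exactly the Parseval property the paper relies on.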

  4. Research of hydroelectric generating set low-frequency vibration monitoring system based on optical fiber sensing

    Science.gov (United States)

    Min, Li; Zhang, Xiaolei; Zhang, Faxiang; Sun, Zhihui; Li, ShuJuan; Wang, Meng; Wang, Chang

    2017-10-01

    In order to satisfy hydroelectric generating set low-frequency vibration monitoring, the design of Passive low-frequency vibration monitoring system based on Optical fiber sensing in this paper. The hardware of the system adopts the passive optical fiber grating sensor and unbalanced-Michelson interferometer. The software system is used to programming by Labview software and finishing the control of system. The experiment show that this system has good performance on the standard vibration testing-platform and it meets system requirements. The frequency of the monitoring system can be as low as 0.2Hz and the resolution is 0.01Hz.

  5. Econometric analysis of realized covariation: high frequency based covariance, regression, and correlation in financial economics

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Shephard, N.

    2004-01-01

    This paper analyses multivariate high frequency financial data using realized covariation. We provide a new asymptotic distribution theory for standard methods such as regression, correlation analysis, and covariance. It will be based on a fixed interval of time (e.g., a day or week), allowing the number of high frequency returns during this period to go to infinity. Our analysis allows us to study how high frequency correlations, regressions, and covariances change through time. In particular we provide confidence intervals for each of these quantities.
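    The core estimator, realized covariation as the sum of outer products of high-frequency returns, is easy to state in code. This is a minimal sketch of the point estimator only, without the paper's asymptotic distribution theory or confidence intervals:

```python
import numpy as np

def realized_covariance(prices):
    """Realized covariation over one fixed interval (e.g., a day).

    `prices` is an (n_obs, n_assets) array of intraday log-prices; the
    estimator is the sum of outer products of the intraday returns."""
    returns = np.diff(prices, axis=0)      # high-frequency log-returns
    return returns.T @ returns             # (n_assets, n_assets)

def realized_correlation(prices):
    """Realized correlation matrix derived from the realized covariance."""
    rc = realized_covariance(prices)
    d = np.sqrt(np.diag(rc))
    return rc / np.outer(d, d)
```

As the sampling frequency grows (more returns per day), these quantities converge to the underlying quadratic covariation, which is what makes day-by-day tracking of correlations and regressions possible.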

  6. Simulation of power fluctuation of wind farms based on frequency domain

    DEFF Research Database (Denmark)

    Lin, Jin; Sun, Yuanzhang; Li, Guojie

    2011-01-01

    Time-domain modeling, however, is incapable of completely explaining the physical mechanism of the randomness of power fluctuation. To remedy this situation, fluctuation modeling based on the frequency domain is proposed. The frequency domain characteristics of stochastic fluctuation in large wind farms are studied using the power spectral density of wind speed, the frequency domain model of a wind power generator, and information on the weather and geography of the wind farms. The correctness and effectiveness of the model are verified by comparing measurement data with simulation results for a certain wind farm.
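    A standard frequency-domain simulation recipe consistent with this approach is to draw random phases against a target one-sided power spectral density and inverse-FFT. This is a generic sketch of that recipe, not the paper's wind-farm model, and the flat PSD in the usage example is purely illustrative:

```python
import numpy as np

def simulate_from_psd(psd, fs, seed=0):
    """Synthesize one realization of a real-valued process whose one-sided
    power spectral density is `psd` (length N//2 + 1), by assigning random
    phases to the target amplitudes and inverse-FFTing."""
    psd = np.asarray(psd, dtype=float)
    N = 2 * (len(psd) - 1)
    df = fs / N
    rng = np.random.default_rng(seed)
    amp = N * np.sqrt(psd * df / 2.0)    # |X_k| so that var(x) ~ sum(psd)*df
    X = amp * np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, len(psd)))
    X[0] = 0.0                           # drop the DC component
    X[-1] = X[-1].real                   # Nyquist bin must be real
    return np.fft.irfft(X, n=N)
```

In the paper's setting, `psd` would be the wind-speed spectrum shaped by the generator's frequency-domain model, so the synthesized series reproduces the fluctuation statistics of the measured farm output.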

  7. Clock-Frequency Switching Technique for Energy Saving of Microcontroller Unit (MCU-Based Sensor Node

    Directory of Open Access Journals (Sweden)

    Pumin Duangmanee

    2018-05-01

    Full Text Available In this paper, a technique is proposed for reducing the energy consumption of microcontroller-based sensor nodes by switching the operating clock between low and high frequencies. The proposed concept is motivated by the fact that if the application code of the microcontroller unit (MCU) consists of no-wait-state instruction sets, the MCU consumes less energy when it operates at a higher frequency. When the application code of the MCU consists of wait instruction sets, e.g., waiting for an acknowledge signal, it switches to the low clock frequency. The experimental results confirm that the proposed technique can reduce the MCU energy consumption by up to 66.9%.
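The trade-off can be sketched with a toy energy model (the current-versus-frequency law and every number below are invented for illustration; the paper's measurements are not reproduced here): busy computation finishes faster at a high clock, while waiting burns energy in proportion to elapsed time, so dropping to a low clock during waits saves energy overall.

```python
# Toy MCU energy model: supply current grows roughly linearly with clock
# frequency, plus a fixed leakage term. All coefficients are hypothetical.
def energy_mj(freq_mhz, busy_cycles, wait_s):
    current_ma = 0.2 * freq_mhz + 0.5          # hypothetical active current [mA]
    busy_s = busy_cycles / (freq_mhz * 1e6)    # busy time shrinks at high clock
    # While waiting, elapsed time is fixed, so a high clock only burns energy.
    return 3.3 * current_ma * (busy_s + wait_s)  # V * mA * s = mJ

# Strategy 1: compute at 48 MHz, then switch to 1 MHz for a 0.5 s wait.
fast_then_slow = energy_mj(48, 1_000_000, 0) + energy_mj(1, 0, 0.5)
# Strategy 2: stay at 48 MHz throughout.
always_fast = energy_mj(48, 1_000_000, 0.5)
```

Under this toy model the clock-switching strategy consumes less total energy, mirroring the qualitative claim of the abstract.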

  8. Influence of laser frequency noise on scanning Fabry-Perot interferometer based laser Doppler velocimetry

    DEFF Research Database (Denmark)

    Rodrigo, Peter John; Pedersen, Christian

    2014-01-01

    In this work, we study the performance of a scanning Fabry-Perot interferometer based laser Doppler velocimeter (sFPI-LDV) and compare two candidate 1.5 μm single-frequency laser sources for the system – a fiber laser (FL) and a semiconductor laser (SL). We describe a straightforward calibration procedure for the sFPI-LDV and investigate the effect of the different degrees of laser frequency noise of the FL and the SL on the velocimeter's performance.

  9. Improving the Critic Learning for Event-Based Nonlinear $H_{\\infty }$ Control Design.

    Science.gov (United States)

    Wang, Ding; He, Haibo; Liu, Derong

    2017-10-01

    In this paper, we aim at improving the critic learning criterion to cope with the event-based nonlinear H ∞ state feedback control design. First, the H ∞ control problem is regarded as a two-player zero-sum game, and the adaptive critic mechanism is used to achieve the minimax optimization in an event-based environment. Then, based on an improved updating rule, the event-based optimal control law and the time-based worst-case disturbance law are obtained approximately by training a single critic neural network. The initial stabilizing control is no longer required during the implementation of the new algorithm. Next, the closed-loop system is formulated as an impulsive model and its stability is handled by incorporating the improved learning criterion. The infamous Zeno behavior of the event-based design is avoided through theoretical analysis of the lower bound on the minimal intersample time. Finally, applications to an aircraft dynamics model and a robot arm plant are carried out to verify the efficient performance of the present design method.

  10. Integral-based event triggering controller design for stochastic LTI systems via convex optimisation

    Science.gov (United States)

    Mousavi, S. H.; Marquez, H. J.

    2016-07-01

    The presence of measurement noise in the event-based systems can lower system efficiency both in terms of data exchange rate and performance. In this paper, an integral-based event triggering control system is proposed for LTI systems with stochastic measurement noise. We show that the new mechanism is robust against noise and effectively reduces the flow of communication between plant and controller, and also improves output performance. Using a Lyapunov approach, stability in the mean square sense is proved. A simulated example illustrates the properties of our approach.
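The general idea can be sketched with a toy integral-based event-triggered loop (an illustration of the mechanism only; the paper's triggering rule, noise model, and plant are not reproduced): the sensor transmits the state to the controller only when the integral of the squared measurement error since the last transmission exceeds a threshold, so brief noise spikes do not immediately fire events.

```python
# Toy integral-based event trigger on a scalar plant x' = -x + u with
# u = -k * x_held. All gains and thresholds are invented for illustration.
dt, steps, k, threshold = 0.01, 2000, 2.0, 0.005
x, x_held, integral, events = 1.0, 1.0, 0.0, 0

for _ in range(steps):
    err = x - x_held
    integral += err * err * dt          # accumulate squared error since last event
    if integral > threshold:            # event: update the controller's held copy
        x_held = x
        integral = 0.0
        events += 1
    u = -k * x_held                     # controller acts on the held state only
    x += (-x + u) * dt                  # Euler step of the plant
```

The number of events is far smaller than the number of simulation steps, illustrating how event triggering reduces the flow of communication between plant and controller while keeping the state bounded.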

  11. Gear-box fault detection using time-frequency based methods

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Stoustrup, Jakob

    2015-01-01

    Gear-box fault monitoring and detection is important for optimization of power generation and availability of wind turbines. The current industrial approach is to use condition monitoring systems, which run in parallel with the wind turbine control system, using expensive additional sensors… in the gear-box resonance frequency can be detected. Two different time–frequency based approaches are presented in this paper. One is a filter based approach and the other is based on a Karhunen–Loeve basis. Both of them detect the gear-box fault with an acceptable detection delay of maximum 100 s, which is negligible compared with the fault development time.
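As a rough sketch of the filter-based idea (synthetic signal, invented resonance frequency, window length, and threshold; not the turbine data or the detectors of the paper): monitor the spectral energy in a narrow band around the gear-box resonance and flag a fault when the band energy rises well above its baseline.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic vibration trace: noise everywhere, plus a resonance-band
# component appearing at t = 120 s to mimic a developing fault.
fs = 100.0
t = np.arange(0, 200, 1 / fs)
resonance = 10.0                                   # assumed resonance [Hz]
vib = 0.1 * rng.standard_normal(t.size)
vib[t >= 120] += 0.5 * np.sin(2 * np.pi * resonance * t[t >= 120])

# Sliding-window energy in the resonance band via the short-time spectrum.
win = int(10 * fs)                                 # 10 s analysis windows
energies, times = [], []
for start in range(0, t.size - win, win):
    seg = vib[start:start + win]
    spec = np.abs(np.fft.rfft(seg)) ** 2
    freqs = np.fft.rfftfreq(win, 1 / fs)
    band = (freqs > 9) & (freqs < 11)
    energies.append(spec[band].sum())
    times.append(t[start + win - 1])

baseline = np.median(energies[:5])                 # healthy-condition baseline
detect_time = next(tt for tt, e in zip(times, energies) if e > 10 * baseline)
delay = detect_time - 120.0                        # detection delay [s]
```

With these toy settings the band-energy monitor flags the fault within one analysis window, consistent with the order of magnitude of delay quoted in the abstract.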

  12. Analysis of core damage frequency from internal events: Expert judgment elicitation. Part 1: Expert panel results. Part 2: Project staff results

    Energy Technology Data Exchange (ETDEWEB)

    Wheeler, T A; Cramond, W R [Sandia National Laboratories, Albuquerque, NM (United States); Hora, S C [University of Hawaii at Hilo (United States); Unwin, S D [Brookhaven National Laboratory (United States)

    1989-04-01

    Quantitative modeling techniques have limitations as to the resolution of important issues in probabilistic risk assessment (PRA). Not all issues can be resolved via the existing set of methods such as fault trees, event trees, statistical analyses, data collection, and computer simulation. Therefore, an expert judgment process was developed to address issues perceived as important to risk in the NUREG-1150 analysis but which could not be resolved with existing techniques. This process was applied to several issues that could significantly affect the internal event core damage frequencies of the PRAs performed on six light water reactors. Detailed descriptions of these issues and the results of the expert judgment elicitation are reported here, as well as an explanation of the methodology used and the procedure followed in performing the overall elicitation task. The process is time-consuming and expensive. However, the results are very useful and represent an improvement over the draft NUREG-1150 analysis in the areas of expert selection, elicitation training, issue selection and presentation, elicitation of judgment, and aggregation of results. The results are presented in two parts. Part 1 documents the expert panel elicitations, where the most important issues were presented to a panel of experts convened from throughout the nuclear power risk assessment community. Part 2 documents the process by which the project staff performed expert judgment on other important issues, using the project staff as panel members. (author)

  13. Separate representation of stimulus frequency, intensity, and duration in auditory sensory memory: an event-related potential and dipole-model analysis.

    Science.gov (United States)

    Giard, M H; Lavikahen, J; Reinikainen, K; Perrin, F; Bertrand, O; Pernier, J; Näätänen, R

    1995-01-01

    The present study analyzed the neural correlates of acoustic stimulus representation in echoic sensory memory. The neural traces of auditory sensory memory were indirectly studied by using the mismatch negativity (MMN), an event-related potential component elicited by a change in a repetitive sound. The MMN is assumed to reflect change detection in a comparison process between the sensory input from a deviant stimulus and the neural representation of repetitive stimuli in echoic memory. The scalp topographies of the MMNs elicited by pure tones deviating from standard tones in either frequency, intensity, or duration varied according to the type of stimulus deviance, indicating that the MMNs for different attributes originate, at least in part, from distinct neural populations in the auditory cortex. This result was supported by dipole-model analysis. If the MMN generator process occurs where the stimulus information is stored, these findings strongly suggest that the frequency, intensity, and duration of acoustic stimuli have separate neural representations in sensory memory.

  14. Limits on the Efficiency of Event-Based Algorithms for Monte Carlo Neutron Transport

    Energy Technology Data Exchange (ETDEWEB)

    Romano, Paul K.; Siegel, Andrew R.

    2017-04-16

    The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC were then used in conjunction with the models to calculate the speedup due to vectorization as a function of two parameters: the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than the vector size in order to achieve a vector efficiency greater than 90%. When the execution times for events are allowed to vary, however, the vector speedup is also limited by differences in execution time for events being carried out in a single event-iteration. For some problems, this implies that vector efficiencies over 50% may not be attainable. While there are many factors impacting the performance of an event-based algorithm that are not captured by our model, it nevertheless provides insight into factors that may be limiting in a real implementation.
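A toy simulation in the same spirit (geometric history lengths and all parameters invented; this is not the paper's analytical model) shows how vector-lane utilization improves with bank size for a fixed vector width:

```python
import math
import random

random.seed(2)

# Toy event-based transport model: N particle histories, each lasting a
# geometric number of events; every event-iteration processes all alive
# particles in SIMD passes of width W, and partially filled passes waste lanes.
def vector_efficiency(bank_size, width, p_absorb=0.1):
    # Sample each history's remaining event count from a geometric distribution.
    remaining = [1 + int(math.log(1.0 - random.random()) / math.log(1 - p_absorb))
                 for _ in range(bank_size)]
    useful = wasted = 0
    alive = bank_size
    while alive:
        passes = math.ceil(alive / width)
        useful += alive                    # lanes doing real work this iteration
        wasted += passes * width - alive   # idle lanes in partial passes
        remaining = [r - 1 for r in remaining if r > 1]
        alive = len(remaining)
    return useful / (useful + wasted)

eff_small = vector_efficiency(64, 8)
eff_large = vector_efficiency(4096, 8)
```

With these toy inputs the large bank wastes proportionally fewer lanes in partially filled passes, echoing the paper's observation that the particle bank must be much larger than the vector width to sustain high vector efficiency.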

  15. Pulse width modulation based pneumatic frequency tuner of the superconducting resonators at IUAC

    International Nuclear Information System (INIS)

    Pandey, A.; Suman, S.K.; Mathuria, D.S.

    2015-01-01

    The existing phase locking scheme of the quarter wave resonators (QWRs) used in the superconducting linear accelerator (LINAC) of IUAC consists of a fast (electronic) and a slow (pneumatic) time control. Presently, piezo based mechanical tuners are being used to phase lock the resonators installed in the second and third accelerating modules of LINAC. However, due to space constraints, the piezo tuner cannot be implemented on the resonators of the first accelerating module. Therefore, helium gas operated mechanical tuners are being used to phase lock the resonators against the master oscillator (MO) frequency. The present pneumatic frequency tuner has the limitations of non-linearity, hysteresis and slow response time. To overcome these problems and to improve the dynamics of the existing tuner, a new pulse width modulation (PWM) based pneumatic frequency tuning system was adopted and successfully tested. After the successful test, the PWM based pneumatic frequency tuner was installed on four QWRs of the first accelerating module of LINAC. During beam runs the PWM based frequency tuner performed well and the cavities could be phase locked at comparatively higher accelerating fields. A comparison of the existing tuning mechanism and the PWM based tuning system along with the test results is presented in the paper. (author)

  16. Lessons derived from two high-frequency sea level events in the Atlantic: implications for coastal risk analysis and tsunami detection

    Directory of Open Access Journals (Sweden)

    Begoña Pérez-Gómez

    2016-11-01

    Full Text Available The upgrade and enhancement of sea level networks worldwide for integration in sea level hazard warning systems have significantly increased the possibilities for measuring and analyzing high frequency sea level oscillations, with typical periods ranging from a few minutes to a few hours. Many tide gauges now afford 1 min or more frequent sampling and have shown such events to be a common occurrence. Their origins and spatial distribution are diverse and must be well understood in order to correctly design and interpret, for example, the automatic detection algorithms used by tsunami warning centers. Two events recorded recently in European Atlantic waters are analyzed here: possible wave-induced seiches that occurred along the north coast of Spain during the storms of January and February 2014, and oscillations detected after an earthquake in the mid-Atlantic on 13 February 2015. The former caused significant flooding in towns and villages and a huge increase in wave-induced coastal damage that was reported in the media for weeks. The latter was a smaller signal present in several tide gauges along the Atlantic coast that coincided with the occurrence of this earthquake, leading to a debate on the potential detection of a very small tsunami and how it might yield significant information for tsunami wave modelers and for the development of tsunami detection software. These kinds of events inform us about the limitations of automatic algorithms for tsunami warning and help to improve the information provided to tsunami warning centers, whilst also emphasizing the importance of other forcings in generating extreme sea levels and their associated potential for causing damage to infrastructure.

  17. Focal mechanisms and inter-event times of low-frequency earthquakes reveal quasi-continuous deformation and triggered slow slip on the deep Alpine Fault

    Science.gov (United States)

    Baratin, Laura-May; Chamberlain, Calum J.; Townend, John; Savage, Martha K.

    2018-02-01

    Characterising the seismicity associated with slow deformation in the vicinity of the Alpine Fault may provide constraints on the stresses acting on a major transpressive margin prior to an anticipated great (≥M8) earthquake. Here, we use recently detected tremor and low-frequency earthquakes (LFEs) to examine how slow tectonic deformation is loading the Alpine Fault late in its typical ∼300-yr seismic cycle. We analyse a continuous seismic dataset recorded between 2009 and 2016 using a network of 10-13 short-period seismometers, the Southern Alps Microearthquake Borehole Array. Fourteen primary LFE templates are used in an iterative matched-filter and stacking routine, allowing the detection of similar signals corresponding to LFE families sharing common locations. This yields an 8-yr catalogue containing 10,000 LFEs that are combined for each of the 14 LFE families using phase-weighted stacking to produce signals with the highest possible signal-to-noise ratios. We show that LFEs occur almost continuously during the 8-yr study period and highlight two types of LFE distributions: (1) discrete behaviour with an inter-event time exceeding 2 min; (2) burst-like behaviour with an inter-event time below 2 min. We interpret the discrete events as small-scale frequent deformation on the deep extent of the Alpine Fault and LFE bursts (corresponding in most cases to known episodes of tremor or large regional earthquakes) as brief periods of increased slip activity indicative of slow slip. We compute improved non-linear earthquake locations using a 3-D velocity model. LFEs occur below the seismogenic zone at depths of 17-42 km, on or near the hypothesised deep extent of the Alpine Fault. The first estimates of LFE focal mechanisms associated with continental faulting, in conjunction with recurrence intervals, are consistent with quasi-continuous shear faulting on the deep extent of the Alpine Fault.
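The matched-filter step of such a study can be sketched on synthetic data (single trace, invented wavelet, noise level, and threshold; the study's actual routine iterates over many stations, channels, and templates): slide a waveform template over a continuous trace and declare detections where the normalized cross-correlation exceeds a threshold.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic continuous trace with two hidden copies of a low-frequency wavelet.
template = np.sin(2 * np.pi * 2.0 * np.linspace(0, 1, 100))   # 2 Hz wavelet
trace = 0.3 * rng.standard_normal(5000)
trace[1200:1300] += template                                  # hidden "LFE"
trace[3600:3700] += template

def normalized_xcorr(trace, template):
    """Sample correlation of the template with every same-length window."""
    n = template.size
    t = (template - template.mean()) / template.std()
    out = np.empty(trace.size - n + 1)
    for i in range(out.size):
        w = trace[i:i + n]
        out[i] = (t * (w - w.mean())).sum() / (n * w.std())
    return out

cc = normalized_xcorr(trace, template)
detections = np.flatnonzero(cc > 0.6)        # threshold chosen for illustration
```

In practice detection thresholds are set relative to the noise statistics of the correlation trace (e.g., a multiple of its median absolute deviation) rather than a fixed value.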

  18. How does higher frequency monitoring data affect the calibration of a process-based water quality model?

    Science.gov (United States)

    Jackson-Blake, Leah; Helliwell, Rachel

    2015-04-01

    Process-based catchment water quality models are increasingly used as tools to inform land management. However, for such models to be reliable they need to be well calibrated and shown to reproduce key catchment processes. Calibration can be challenging for process-based models, which tend to be complex and highly parameterised. Calibrating a large number of parameters generally requires a large amount of monitoring data, spanning all hydrochemical conditions. However, regulatory agencies and research organisations generally only sample at a fortnightly or monthly frequency, even in well-studied catchments, often missing peak flow events. The primary aim of this study was therefore to investigate how the quality and uncertainty of model simulations produced by a process-based, semi-distributed catchment model, INCA-P (the INtegrated CAtchment model of Phosphorus dynamics), were improved by calibration to higher frequency water chemistry data. Two model calibrations were carried out for a small rural Scottish catchment: one using 18 months of daily total dissolved phosphorus (TDP) concentration data, another using a fortnightly dataset derived from the daily data. To aid comparability, calibrations were carried out automatically using the Markov Chain Monte Carlo - DiffeRential Evolution Adaptive Metropolis (MCMC-DREAM) algorithm. Calibration to daily data resulted in improved simulation of peak TDP concentrations and improved model performance statistics. Parameter-related uncertainty in simulated TDP was large when fortnightly data was used for calibration, with a 95% credible interval of 26 μg/l. This uncertainty is comparable in size to the difference between Water Framework Directive (WFD) chemical status classes, and would therefore make it difficult to use this calibration to predict shifts in WFD status. The 95% credible interval reduced markedly with the higher frequency monitoring data, to 6 μg/l. The number of parameters that could be reliably auto

  19. Measurement of the underlying event using track-based event shapes in Z→l{sup +}l{sup -} events with ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Schulz, Holger

    2014-09-11

    This thesis describes a measurement of hadron-collider event shapes in proton-proton collisions at a centre-of-mass energy of 7 TeV at the Large Hadron Collider (LHC) at CERN (Conseil Européen pour la Recherche Nucléaire) located near Geneva, Switzerland. The analysed data (integrated luminosity: 1.1 fb{sup -1}) were recorded in 2011 with the ATLAS experiment. Events in which a Z boson was produced in the hard sub-process and subsequently decays into an electron-positron or muon-antimuon pair were selected for this analysis. The observables are calculated using all reconstructed tracks of charged particles within the acceptance of the inner detector of ATLAS, except those of the leptons from the Z decay. Thus, this is the first measurement of its kind. The observables were corrected for background processes using data-driven methods. For the correction of so-called "pile-up" (multiple overlapping proton-proton collisions) a novel technique was developed and successfully applied. The data were further unfolded to correct for remaining detector effects. The obtained distributions are especially sensitive to the so-called "Underlying Event" and can be compared with predictions of Monte Carlo event generators directly, i.e. without the necessity of running time-consuming simulations of the ATLAS detector. Finally, an attempt was made to improve the predictions of the event generators Pythia8 and Sherpa by finding an optimised setting of relevant model parameters in a technique called "tuning". It became apparent, however, that the underlying Sjöstrand-Zijl model is unable to give a good description of the measured event-shape distributions.

  20. Life review based on remembering specific positive events in active aging.

    Science.gov (United States)

    Latorre, José M; Serrano, Juan P; Ricarte, Jorge; Bonete, Beatriz; Ros, Laura; Sitges, Esther

    2015-02-01

    The aim of this study is to evaluate the effectiveness of life review (LR) based on specific positive events in non-depressed older adults taking part in an active aging program. Fifty-five older adults were randomly assigned to an experimental group or an active control (AC) group. A six-session individual training of LR based on specific positive events was carried out with the experimental group. The AC group undertook a "media workshop" of six sessions focused on learning journalistic techniques. Pre-test and post-test measures included life satisfaction, depressive symptoms, experiencing the environment as rewarding, and autobiographical memory (AM) scales. LR intervention decreased depressive symptomatology, improved life satisfaction, and increased specific memories. The findings suggest that practice in AM for specific events is an effective component of LR that could be a useful tool in enhancing emotional well-being in active aging programs, thus reducing depressive symptoms. © The Author(s) 2014.

  1. Declarative event based models of concurrency and refinement in psi-calculi

    DEFF Research Database (Denmark)

    Normann, Håkon; Johansen, Christian; Hildebrandt, Thomas

    2015-01-01

    Psi-calculi constitute a parametric framework for nominal process calculi, where constraint based process calculi and process calculi for mobility can be defined as instances. We apply here the framework of psi-calculi to provide a foundation for the exploration of declarative event-based process calculi with support for run-time refinement. We first provide a representation of the model of finite prime event structures as an instance of psi-calculi and prove that the representation respects the semantics up to concurrency diamonds and action refinement. We then proceed to give a psi-calculi representation of Dynamic Condition Response Graphs, which conservatively extend prime event structures to allow finite representations of (omega) regular finite (and infinite) behaviours and have been shown to support run-time adaptation and refinement. We end by outlining the final aim of this research, which…

  2. Multitask Learning-Based Security Event Forecast Methods for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Hui He

    2016-01-01

    Full Text Available Wireless sensor networks have strong dynamics and uncertainty, including network topological changes, node disappearance or addition, and exposure to various threats. First, to strengthen the detection adaptability of wireless sensor networks to various security attacks, a region similarity multitask-based security event forecast method for wireless sensor networks is proposed. This method performs topology partitioning on a large-scale sensor network and calculates the similarity degree among regional subnetworks. The trend of unknown network security events can be predicted through multitask learning of the occurrence and transmission characteristics of known network security events. Second, in case of lacking regional data, the quantitative trend of unknown regional network security events can be calculated. This study introduces a sensor network security event forecast method named Prediction Network Security Incomplete Unmarked Data (PNSIUD) to forecast missing attack data in the target region according to the known partial data in similar regions. Experimental results indicate that for unknown security event forecasts the accuracy and effects of the similarity forecast algorithm are better than those of the single-task learning method. At the same time, the forecast accuracy of the PNSIUD method is better than that of the traditional support vector machine method.

  3. Automated reasoning with dynamic event trees: a real-time, knowledge-based decision aide

    International Nuclear Information System (INIS)

    Touchton, R.A.; Gunter, A.D.; Subramanyan, N.

    1988-01-01

    The models and data contained in a probabilistic risk assessment (PRA) Event Sequence Analysis represent a wealth of information that can be used for dynamic calculation of event sequence likelihood. In this paper we report a new and unique computerization methodology which utilizes these data. This sub-system (referred to as PREDICTOR) has been developed and tested as part of a larger system. PREDICTOR performs a real-time (re)calculation of the estimated likelihood of core-melt as a function of plant status. This methodology uses object-oriented programming techniques from the artificial intelligence discipline that enable one to codify event tree and fault tree logic models and associated probabilities developed in a PRA study. Existence of off-normal conditions is reported to PREDICTOR, which then updates the relevant failure probabilities throughout the event tree and fault tree models by dynamically replacing the off-the-shelf (or prior) probabilities with new probabilities based on the current situation. The new event probabilities are immediately propagated through the models (using 'demons') and an updated core-melt probability is calculated. Along the way, the dominant non-success path of each event tree is determined and highlighted. (author)
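The recalculation idea can be sketched with a miniature fault tree (the gate structure and probabilities are invented for illustration; PREDICTOR's actual models come from the plant's PRA): when an off-normal condition is reported, a basic-event probability is replaced by its conditional value and the change propagates through the gates to an updated top-event estimate.

```python
# Toy fault-tree propagation in the spirit of PREDICTOR (hypothetical
# structure and numbers, assuming independent basic events).
def or_gate(*p):   # probability of the union of independent events
    q = 1.0
    for x in p:
        q *= 1.0 - x
    return 1.0 - q

def and_gate(*p):  # probability of the intersection of independent events
    q = 1.0
    for x in p:
        q *= x
    return q

def core_damage_probability(basic):
    loss_of_cooling = or_gate(basic["pump_a_fails"], basic["pump_b_fails"])
    return and_gate(loss_of_cooling, basic["operator_error"])

prior = {"pump_a_fails": 1e-3, "pump_b_fails": 1e-3, "operator_error": 1e-2}
baseline = core_damage_probability(prior)

# Off-normal condition reported: pump A is known to be unavailable.
prior["pump_a_fails"] = 1.0
updated = core_damage_probability(prior)
```

Replacing the single basic-event probability raises the top-event estimate by orders of magnitude, which is exactly the kind of situation-dependent re-evaluation the abstract describes.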

  4. Studies on switch-based event building systems in RD13

    International Nuclear Information System (INIS)

    Bee, C.P.; Eshghi, S.; Jones, R.

    1996-01-01

    One of the goals of the RD13 project at CERN is to investigate the feasibility of parallel event building systems for detectors at the LHC. Studies were performed by building a prototype based on the HiPPI standard and by modeling this prototype and extended architectures with MODSIM II. The prototype used commercially available VME-HiPPI interfaces and a HiPPI switch together with modular software. The setup was tested successfully as a parallel event building system in different configurations and with different data flow control schemes. The simulation program was used with realistic parameters from the prototype measurements to simulate large-scale event building systems, including a realistic setup of the ATLAS event building system. The influence of different parameters and the scaling behavior were investigated. The influence of realistic event size distributions was checked with data from off-line simulations. Different control schemes for destination assignment and traffic shaping were investigated, as well as a two-stage event building system. (author)

  5. Gender classification in children based on speech characteristics: using fundamental and formant frequencies of Malay vowels.

    Science.gov (United States)

    Zourmand, Alireza; Ting, Hua-Nong; Mirhassani, Seyed Mostafa

    2013-03-01

    Speech is one of the prevalent communication mediums for humans. Identifying the gender of a child speaker based on his/her speech is crucial in telecommunication and speech therapy. This article investigates the use of fundamental and formant frequencies from sustained vowel phonation to distinguish the gender of Malay children aged between 7 and 12 years. The Euclidean minimum distance and multilayer perceptron were used to classify the gender of 360 Malay children based on different combinations of fundamental and formant frequencies (F0, F1, F2, and F3). The Euclidean minimum distance with normalized frequency data achieved a classification accuracy of 79.44%, which was higher than that of the nonnormalized frequency data. Age-dependent modeling was used to improve the accuracy of gender classification. The Euclidean distance method obtained 84.17% based on the optimal classification accuracy for all age groups. The accuracy was further increased to 99.81% using multilayer perceptron based on mel-frequency cepstral coefficients. Copyright © 2013 The Voice Foundation. Published by Mosby, Inc. All rights reserved.
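The Euclidean-minimum-distance variant can be sketched on synthetic (F0, F1, F2, F3) feature vectors (the class means and spreads below are invented, not the Malay-vowel measurements; the paper's MLP and MFCC stages are omitted):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic fundamental and formant frequencies [Hz] for two classes.
boys  = rng.normal([260, 750, 1800, 3200], [25, 60, 120, 200], size=(50, 4))
girls = rng.normal([290, 820, 1950, 3400], [25, 60, 120, 200], size=(50, 4))

# Normalize each feature, then represent each class by its centroid.
data = np.vstack([boys, girls])
mean, std = data.mean(axis=0), data.std(axis=0)
centroids = {"boy": ((boys - mean) / std).mean(axis=0),
             "girl": ((girls - mean) / std).mean(axis=0)}

def classify(sample):
    # Assign the sample to the nearest class centroid in normalized space.
    z = (sample - mean) / std
    return min(centroids, key=lambda c: np.linalg.norm(z - centroids[c]))

accuracy = np.mean([classify(s) == "boy" for s in boys] +
                   [classify(s) == "girl" for s in girls])
```

Normalizing the features first matters here, mirroring the paper's finding that normalized frequency data classified better than raw frequencies.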

  6. Damage detection in multi-span beams based on the analysis of frequency changes

    International Nuclear Information System (INIS)

    Gillich, G R; Ntakpe, J L; Praisach, Z I; Mimis, M C; Abdel Wahab, M

    2017-01-01

    Crack identification in multi-span beams is performed to determine whether a structure is healthy or not. Among all crack identification methods, those based on measured natural frequency changes offer the advantages of simplicity and ease of use in practical engineering. To accurately identify crack characteristics for multi-span beam structures, a mathematical model is established which can predict frequency changes for any boundary conditions, the intermediate supports being hinges. This relation is based on the modal strain energy concept. Since frequency changes are relatively small, a signal processing algorithm based on superposing numerous spectra is also proposed to obtain natural frequencies with high resolution, which overcomes the disadvantage of the Fast Fourier Transform in the aspect of frequency resolution. Based on the above-mentioned mathematical model and signal processing algorithm, a method for identifying cracks in multi-span beams is presented. To verify the accuracy of this identification method, experimental examples are conducted on a two-span structure. The results demonstrate that the method proposed in this paper can accurately identify the crack position and depth. (paper)

  7. Frequency scanning-based stability analysis method for grid-connected inverter system

    DEFF Research Database (Denmark)

    Wang, Yanbo; Wang, Xiongfei; Blaabjerg, Frede

    2017-01-01

    This paper proposes a frequency scanning-based impedance analysis for stability assessment of grid-connected inverter systems, which is able to perform stability assessment without using system mathematical models and inherits the superior feature of the impedance-based stability criterion with consideration of the inverter nonlinearities. A small current disturbance is injected into the grid-connected inverter system over a particular frequency range, the impedance is computed from the harmonic-frequency response using Fourier analysis, and the stability is then predicted on the basis of the impedance stability criterion. The stability issues of grid-connected inverters with grid-current feedback and converter-current feedback are addressed using the proposed method. The results obtained from simulation and experiments validate the effectiveness of the method.
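The scanning step can be illustrated on a known R-L impedance (circuit values, injection amplitude, and record length are invented; a real grid-connected inverter would be perturbed and measured rather than simulated): inject a small sinusoidal current, record the voltage, and recover Z(f) from the Fourier components at the injection frequency.

```python
import numpy as np

# Hypothetical series R-L branch standing in for the measured system.
R, L = 1.0, 1e-3                    # ohm, henry
fs, T = 20_000, 1.0                 # sample rate [Hz], record length [s]
t = np.arange(0, T, 1 / fs)

def measured_impedance(f_inj):
    i = 0.1 * np.sin(2 * np.pi * f_inj * t)            # injected current
    v = R * i + L * np.gradient(i, 1 / fs)             # v = R i + L di/dt
    k = int(round(f_inj * T))                          # FFT bin of f_inj
    I, V = np.fft.rfft(i)[k], np.fft.rfft(v)[k]
    return V / I                                       # complex impedance Z(f)

z50 = measured_impedance(50.0)     # expect roughly R + j*2*pi*50*L
z500 = measured_impedance(500.0)   # inductive part grows with frequency
```

Repeating this over a grid of injection frequencies yields the impedance curve that the stability criterion is then applied to.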

  8. Modeling of Temperature Effect on Modal Frequency of Concrete Beam Based on Field Monitoring Data

    Directory of Open Access Journals (Sweden)

    Wenchen Shan

    2018-01-01

    Full Text Available Temperature variation has been widely demonstrated to produce significant effects on modal frequencies that can even exceed the effect of actual damage. In order to eliminate the temperature effect on modal frequency, an effective approach is to construct quantitative models which accurately predict the modal frequency corresponding to temperature variation. In this paper, principal component analysis (PCA) is conducted on the temperatures taken from all embedded thermocouples to extract input parameters for the regression models. Three regression-based numerical models using multiple linear regression (MLR), back-propagation neural network (BPNN), and support vector regression (SVR) techniques are constructed to capture the relationships between modal frequencies and temperature distributions from measurements of a concrete beam during a forty-day monitoring period. A comparison of the performance of the various optimally configured regression models has been performed on the measurement data. Results indicate that SVR exhibits a better reproduction and prediction capability than the BPNN and MLR models for predicting modal frequencies with respect to nonuniformly distributed temperatures. It is concluded that temperature effects on modal frequencies can be effectively eliminated based on the optimally formulated SVR model.
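The PCA-plus-regression pipeline can be sketched with synthetic data and the MLR variant (all temperatures, frequencies, and coefficients below are invented, and ordinary least squares stands in for the paper's preferred SVR to keep the sketch dependency-free): reduce the correlated thermocouple readings to a few principal components, then regress the modal frequency on the component scores.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic monitoring data: 12 correlated thermocouples driven by one
# ambient temperature, and a modal frequency that drifts with it.
n, sensors = 200, 12
season = rng.uniform(-10, 30, n)                       # ambient temperature [C]
temps = season[:, None] + rng.normal(0, 0.5, (n, sensors))
freq = 12.0 - 0.01 * season + rng.normal(0, 0.005, n)  # modal frequency [Hz]

# PCA by SVD of the centered temperature matrix; keep two components.
tc = temps - temps.mean(axis=0)
_, _, vt = np.linalg.svd(tc, full_matrices=False)
scores = tc @ vt[:2].T

# Multiple linear regression of frequency on the PC scores.
X = np.column_stack([np.ones(n), scores])
coef, *_ = np.linalg.lstsq(X, freq, rcond=None)
predicted = X @ coef
residual_std = (freq - predicted).std()
```

Subtracting the predicted temperature-driven component from measured frequencies leaves a residual whose scatter is what damage indicators are then compared against.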

  9. Increased frequency of single base substitutions in a population of transcripts expressed in cancer cells

    Directory of Open Access Journals (Sweden)

    Bianchetti Laurent

    2012-11-01

    Full Text Available Abstract Background Single Base Substitutions (SBS that alter transcripts expressed in cancer originate from somatic mutations. However, recent studies report SBS in transcripts that are not supported by the genomic DNA of tumor cells. Methods We used sequence-based whole genome expression profiling, namely Long-SAGE (L-SAGE and Tag-seq (a combination of L-SAGE and deep sequencing, and computational methods to identify transcripts with greater SBS frequencies in cancer. Millions of tags produced by 40 healthy and 47 cancer L-SAGE experiments were compared to 1,959 Reference Tags (RT, i.e. tags matching the human genome exactly once. Similarly, tens of millions of tags produced by 7 healthy and 8 cancer Tag-seq experiments were compared to 8,572 RT. For each transcript, SBS frequencies in healthy and cancer cells were statistically tested for equality. Results In the L-SAGE and Tag-seq experiments, 372 and 4,289 transcripts, respectively, showed greater SBS frequencies in cancer. Increased SBS frequencies could not be attributed to known Single Nucleotide Polymorphisms (SNP, catalogued somatic mutations or RNA-editing enzymes. Hypothesizing that Single Tags (ST, i.e. tags sequenced only once, were indicators of SBS, we observed that ST proportions were heterogeneously distributed across Embryonic Stem Cells (ESC, healthy differentiated and cancer cells. ESC had the lowest ST proportions, whereas cancer cells had the greatest. Finally, in a series of experiments carried out on a single patient at 1 healthy and 3 consecutive tumor stages, we could show that SBS frequencies increased during cancer progression. Conclusion If the mechanisms generating the base substitutions were known, increased SBS frequency in transcripts could serve as a useful new biomarker of cancer. With the reduction of sequencing costs, sequence-based whole genome expression profiling could be used to characterize increased SBS frequency in a patient’s tumor and aid diagnosis.
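The per-transcript equality test of SBS frequencies in healthy versus cancer cells can be illustrated with a standard two-proportion z-test. This is a generic sketch, not necessarily the exact statistic the study used, and the tag counts below are invented for the example.

```python
from math import erfc, sqrt

def sbs_frequency_test(k_healthy, n_healthy, k_cancer, n_cancer):
    """Two-sided z-test for equality of substitution rates: k substituted
    tags out of n total tags in each condition. Returns (z, p_value);
    positive z means the cancer rate is higher."""
    p_h = k_healthy / n_healthy
    p_c = k_cancer / n_cancer
    pooled = (k_healthy + k_cancer) / (n_healthy + n_cancer)  # rate under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n_healthy + 1 / n_cancer))
    z = (p_c - p_h) / se
    p_value = erfc(abs(z) / sqrt(2))                          # two-sided
    return z, p_value
```

In practice each transcript's test would be followed by a multiple-testing correction across the thousands of transcripts compared, before declaring an increased SBS frequency in cancer.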

  10. Increased frequency of single base substitutions in a population of transcripts expressed in cancer cells

    International Nuclear Information System (INIS)

    Bianchetti, Laurent; Kieffer, David; Féderkeil, Rémi; Poch, Olivier

    2012-01-01

    Single Base Substitutions (SBS) that alter transcripts expressed in cancer originate from somatic mutations. However, recent studies report SBS in transcripts that are not supported by the genomic DNA of tumor cells. We used sequence-based whole genome expression profiling, namely Long-SAGE (L-SAGE) and Tag-seq (a combination of L-SAGE and deep sequencing), and computational methods to identify transcripts with greater SBS frequencies in cancer. Millions of tags produced by 40 healthy and 47 cancer L-SAGE experiments were compared to 1,959 Reference Tags (RT), i.e. tags matching the human genome exactly once. Similarly, tens of millions of tags produced by 7 healthy and 8 cancer Tag-seq experiments were compared to 8,572 RT. For each transcript, SBS frequencies in healthy and cancer cells were statistically tested for equality. In the L-SAGE and Tag-seq experiments, 372 and 4,289 transcripts, respectively, showed greater SBS frequencies in cancer. Increased SBS frequencies could not be attributed to known Single Nucleotide Polymorphisms (SNP), catalogued somatic mutations or RNA-editing enzymes. Hypothesizing that Single Tags (ST), i.e. tags sequenced only once, were indicators of SBS, we observed that ST proportions were heterogeneously distributed across Embryonic Stem Cells (ESC), healthy differentiated and cancer cells. ESC had the lowest ST proportions, whereas cancer cells had the greatest. Finally, in a series of experiments carried out on a single patient at 1 healthy and 3 consecutive tumor stages, we could show that SBS frequencies increased during cancer progression. If the mechanisms generating the base substitutions were known, increased SBS frequency in transcripts could serve as a useful new biomarker of cancer. With the reduction of sequencing costs, sequence-based whole genome expression profiling could be used to characterize increased SBS frequency in a patient’s tumor and aid diagnosis.

  11. An assessment of envelope-based demodulation in case of proximity of carrier and modulation frequencies

    Science.gov (United States)

    Shahriar, Md Rifat; Borghesani, Pietro; Randall, R. B.; Tan, Andy C. C.

    2017-11-01

    Demodulation is a necessary step in the field of diagnostics to reveal faults whose signatures appear as an amplitude and/or frequency modulation. The Hilbert transform has conventionally been used for the calculation of the analytic signal required in the demodulation process. However, the carrier and modulation frequencies must meet the conditions set