WorldWideScience

Sample records for event frequency based

  1. Projection of changes in the frequency of heavy rain events over Hawaii based on leading Pacific climate modes

    Science.gov (United States)

    Elison Timm, O.; Diaz, H. F.; Giambelluca, T. W.; Takahashi, M.

    2011-02-01

    This study investigates the frequency of heavy rainfall events in Hawaii during the wet season (October-April) 1958-2005 and their conditional dependence on the Pacific-North American (PNA) pattern and El Niño-Southern Oscillation (ENSO). Heavy rain events are defined by the 95% quantile in the rainfall distribution of the wet seasons. Twelve stations with daily reports of rainfall amounts were used to count the number of heavy rain days during wet seasons. Multiple linear regression (MLR) indicated that the PNA index (PNAI) and the Southern Oscillation Index (SOI) can explain a significant amount of the interannual to interdecadal variability for 9 out of 12 stations. Cross validation showed that PNAI and SOI together explain about 18-44% of the variability in the number of heavy rain events. Furthermore, the MLR model reproduces the trend toward fewer heavy rain events in the years after the Pacific climate shift in the mid-1970s. The MLR model was applied to the projected PNAI and SOI indices that were obtained from six IPCC AR4 climate models. The current suite of AR4 simulations based on the A1B and A2 emissions scenarios projects small and equivocal changes in the mean state of the SOI and PNAI during the 21st century. The covariance between PNAI and SOI in these simulations appears to be stable. To the extent that variations in the frequency and magnitude of ENSO and the PNA mode are responsible for modulating extreme rainfall occurrence in Hawaii, our results indicate small changes in the projected number of heavy rainfall days with large uncertainties resulting from disparities among the climate models.
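The multiple-linear-regression setup described above can be sketched in a few lines; the index series, counts, and coefficients below are synthetic stand-ins for the station data, not values from the study:

```python
import numpy as np

# Hypothetical illustration of the paper's MLR setup: regress the yearly
# count of heavy-rain days on two climate indices (PNAI, SOI).
rng = np.random.default_rng(0)
n_years = 48  # wet seasons 1958-2005
pnai = rng.standard_normal(n_years)
soi = rng.standard_normal(n_years)
# Synthetic counts: fewer heavy-rain days when PNAI is high, more when SOI is high.
counts = 10 - 2.0 * pnai + 1.5 * soi + rng.normal(0, 1.5, n_years)

# Design matrix with intercept; ordinary least squares fit.
X = np.column_stack([np.ones(n_years), pnai, soi])
beta, *_ = np.linalg.lstsq(X, counts, rcond=None)

# Fraction of interannual variance explained (R^2), analogous to the
# 18-44% reported for the cross-validated station models.
fitted = X @ beta
r2 = 1 - np.sum((counts - fitted) ** 2) / np.sum((counts - counts.mean()) ** 2)
```

The fitted model could then be driven with projected PNAI and SOI series, as the study does with the AR4 model output.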

  2. Statistical Prediction of Solar Particle Event Frequency Based on the Measurements of Recent Solar Cycles for Acute Radiation Risk Analysis

    Science.gov (United States)

    Kim, Myung-Hee Y.; Hu, Shaowen; Cucinotta, Francis A.

    2009-01-01

    Large solar particle events (SPEs) present significant acute radiation risks to crew members during extra-vehicular activities (EVAs) or in lightly shielded space vehicles on missions beyond the protection of the Earth's magnetic field. Acute radiation sickness (ARS) can impair performance and result in failure of the mission. Improved forecasting capability and/or early-warning systems and proper shielding solutions are required to stay within NASA's short-term dose limits. Exactly how to make use of observations of SPEs for predicting occurrence and size is a great challenge, because SPE occurrences themselves are random in nature, even though the expected frequency of SPEs is strongly influenced by the time position within the solar activity cycle. Therefore, we developed a probabilistic model approach, in which a cumulative expected occurrence curve of SPEs for a typical solar cycle was formed from a non-homogeneous Poisson process model fitted to a database of proton fluence measurements of SPEs that occurred during the past 5 solar cycles (19-23) and of large SPEs identified from impulsive nitrate enhancements in polar ice. From the fitted model, the expected frequency of SPEs was estimated at any given proton fluence threshold (Φ_E) with energy (E) >30 MeV during a defined space mission period. Corresponding Φ_E (E = 30, 60, and 100 MeV) fluence distributions were simulated with a random draw from a gamma distribution and applied to SPE ARS risk analysis for a specific mission period. Deep-seated organ doses were predicted more precisely at high energies, Φ_100, than at lower energies such as Φ_30 or Φ_60, because of the greater penetration depth of high-energy protons. Estimates of ARS are then described for 90th and 95th percentile events for several mission lengths and for several likely organ dose-rates. The ability to accurately measure high energy protons
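A minimal sketch of this kind of probabilistic model, assuming an illustrative cosine-shaped cycle intensity and arbitrary gamma parameters rather than the paper's fitted values:

```python
import numpy as np

# Sketch of a non-homogeneous Poisson SPE model. The rate function and the
# gamma fluence parameters below are invented for illustration only.
rng = np.random.default_rng(42)

# Intensity over an 11-year solar cycle: SPEs cluster near solar maximum
# (modeled here with a raised cosine, in events per year).
def spe_rate(t_years):
    return 5.0 * (1 - np.cos(2 * np.pi * t_years / 11.0)) / 2

# Expected number of events in a mission window [t0, t0 + duration],
# obtained by midpoint-rule integration of the intensity.
def expected_events(t0, duration, n=1000):
    dt = duration / n
    t = t0 + dt * (np.arange(n) + 0.5)
    return spe_rate(t).sum() * dt

# Simulate one mission: Poisson count of SPEs, then a gamma draw for each
# event's >30 MeV proton fluence (shape/scale are illustrative).
mu = expected_events(t0=4.0, duration=1.0)  # 1-year mission near solar max
n_events = rng.poisson(mu)
fluences = rng.gamma(shape=0.5, scale=2e8, size=n_events)  # protons/cm^2
```

Repeating the draw many times yields the fluence percentiles (90th, 95th) that feed the ARS risk analysis.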

  3. Machine-based classification of ADHD and nonADHD participants using time/frequency features of event-related neuroelectric activity.

    Science.gov (United States)

    Öztoprak, Hüseyin; Toycan, Mehmet; Alp, Yaşar Kemal; Arıkan, Orhan; Doğutepe, Elvin; Karakaş, Sirel

    2017-12-01

    Attention-deficit/hyperactivity disorder (ADHD) is the most frequent diagnosis among children who are referred to psychiatry departments. Although ADHD was first described at the beginning of the 20th century, its diagnosis is still confronted with many problems. A novel classification approach is presented that discriminates ADHD and nonADHD groups using time-frequency domain features of event-related potential (ERP) recordings taken during the Stroop task. The Time-Frequency Hermite-Atomizer (TFHA) technique is used to extract high-resolution features that are highly localized in the time-frequency domain. Based on an extensive investigation, Support Vector Machine-Recursive Feature Elimination (SVM-RFE) was used to obtain the best discriminating features. When the best three features were used, the classification accuracy for the training dataset reached 98%, and the use of five features further improved the accuracy to 99.5%. The accuracy was 100% for the testing dataset. Based on extensive experiments, the delta band emerged as the most contributing frequency band and statistical parameters emerged as the most contributing feature group. The classification performance of this study suggests that TFHA can be employed as an auxiliary component of the diagnostic and prognostic procedures for ADHD. The features obtained in this study can potentially contribute to the neuroelectrical understanding and clinical diagnosis of ADHD. Copyright © 2017 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.
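The recursive-feature-elimination idea behind SVM-RFE can be illustrated with a small sketch; here a least-squares linear model stands in for the linear SVM, and the synthetic columns stand in for TFHA features (only three of twenty carry signal):

```python
import numpy as np

# Minimal sketch of recursive feature elimination (RFE): repeatedly fit a
# linear model and drop the feature with the smallest absolute weight.
rng = np.random.default_rng(1)
n, p = 200, 20
X = rng.standard_normal((n, p))
# Labels depend only on features 0, 3, and 7 (ADHD vs. nonADHD stand-in).
y = np.sign(X[:, 0] + 0.8 * X[:, 3] - 0.7 * X[:, 7])

def rfe(X, y, n_keep):
    remaining = list(range(X.shape[1]))
    while len(remaining) > n_keep:
        w, *_ = np.linalg.lstsq(X[:, remaining], y, rcond=None)
        remaining.pop(int(np.argmin(np.abs(w))))  # drop the weakest feature
    return remaining

# Keep the best three features, mirroring the paper's "best three" setting.
chosen = rfe(X, y, n_keep=3)
```

With clean synthetic data the informative columns are recovered exactly; real SVM-RFE uses the SVM weight vector in place of the least-squares weights.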

  4. The Influence of the Annual Number of Storms on the Derivation of the Flood Frequency Curve through Event-Based Simulation

    Directory of Open Access Journals (Sweden)

    Alvaro Sordo-Ward

    2016-08-01

    Full Text Available This study addresses the question of how to select the minimum set of storms that should be simulated each year in order to estimate an accurate flood frequency curve for return periods ranging between 1 and 1000 years. The Manzanares basin (Spain) was used as a case study. A continuous 100,000-year hourly rainfall series was generated using the stochastic spatial–temporal model RanSimV3. Individual storms were extracted from the series by applying the exponential method. For each year, the extracted storms were transformed into hydrographs by applying an hourly time-step semi-distributed event-based rainfall–runoff model, and the maximum peak flow per year was determined to generate the reference flood frequency curve. Then, different flood frequency curves were obtained considering the N storms with maximum rainfall depth per year, with 1 ≤ N ≤ total number of storms. The main results show that: (a) the degree of alignment between the calculated flood frequency curves and the reference flood frequency curve depends on the return period considered, with accuracy increasing for higher return periods; (b) for the analyzed case studies, the flood frequency curve for medium and high return periods (50 ≤ return period ≤ 1000 years) can be estimated with a difference lower than 3% (compared to the reference flood frequency curve) by considering the three storms with the maximum total rainfall depth each year; (c) when considering only the greatest storm of the year, for return periods higher than 10 years, the difference in the estimated flood frequency curve is lower than 10%; and (d) when considering the three greatest storms each year, for return periods higher than 100 years, the probability of simultaneously achieving a hydrograph with the annual maximum peak flow and the maximum volume is 94%.
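The storm-selection experiment can be mimicked in miniature; the depth-to-peak transformation below is an arbitrary monotone stand-in for the event-based rainfall-runoff model, and all numbers are synthetic:

```python
import numpy as np

# Toy version of the experiment: per year, keep only the N storms with the
# greatest rainfall depth, "route" them to peak flows, and compare the
# resulting annual-maximum series with the full-series reference.
rng = np.random.default_rng(7)
n_years, storms_per_year = 10_000, 25

# Synthetic storm depths (mm) and a monotone depth -> peak-flow stand-in.
depths = rng.gamma(shape=2.0, scale=15.0, size=(n_years, storms_per_year))
peaks = 0.8 * depths ** 1.3 + rng.normal(0, 1.0, depths.shape)

def annual_max_from_top_n(depths, peaks, n):
    # Indices of the N deepest storms in each year, then the max peak among them.
    idx = np.argsort(depths, axis=1)[:, -n:]
    return np.take_along_axis(peaks, idx, axis=1).max(axis=1)

reference = peaks.max(axis=1)                      # all storms simulated
top3 = annual_max_from_top_n(depths, peaks, 3)     # three deepest storms/year
top1 = annual_max_from_top_n(depths, peaks, 1)     # single deepest storm/year

# Quantile corresponding to the 100-year return period (exceedance 1/100).
q100_ref = np.quantile(reference, 1 - 1 / 100)
q100_top3 = np.quantile(top3, 1 - 1 / 100)
```

As in the study, the top-N curves can only underestimate the reference, and the discrepancy shrinks at high return periods, where the deepest storms dominate.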

  5. The Human Brain Encodes Event Frequencies While Forming Subjective Beliefs

    Science.gov (United States)

    d’Acremont, Mathieu; Schultz, Wolfram; Bossaerts, Peter

    2015-01-01

    To make adaptive choices, humans need to estimate the probability of future events. Based on a Bayesian approach, it is assumed that probabilities are inferred by combining a priori, potentially subjective, knowledge with factual observations, but the precise neurobiological mechanism remains unknown. Here, we study whether neural encoding centers on subjective posterior probabilities, and data merely lead to updates of posteriors, or whether objective data are encoded separately alongside subjective knowledge. During fMRI, young adults acquired prior knowledge regarding uncertain events, repeatedly observed evidence in the form of stimuli, and estimated event probabilities. Participants combined prior knowledge with factual evidence using Bayesian principles. Expected reward inferred from prior knowledge was encoded in striatum. BOLD response in specific nodes of the default mode network (angular gyri, posterior cingulate, and medial prefrontal cortex) encoded the actual frequency of stimuli, unaffected by prior knowledge. In this network, activity increased with frequencies and thus reflected the accumulation of evidence. In contrast, Bayesian posterior probabilities, computed from prior knowledge and stimulus frequencies, were encoded in bilateral inferior frontal gyrus. Here activity increased for improbable events and thus signaled the violation of Bayesian predictions. Thus, subjective beliefs and stimulus frequencies were encoded in separate cortical regions. The advantage of such a separation is that objective evidence can be recombined with newly acquired knowledge when a reinterpretation of the evidence is called for. Overall this study reveals the coexistence in the brain of an experience-based system of inference and a knowledge-based system of inference. PMID:23804108
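The separation the study describes, raw evidence accumulation versus prior-weighted belief, can be made concrete with a minimal Beta-Bernoulli sketch (the prior and observation sequence are invented for illustration):

```python
# Minimal Beta-Bernoulli illustration: the raw stimulus count (frequency)
# accumulates monotonically, while the Bayesian posterior probability
# combines that count with prior knowledge.
from fractions import Fraction

# Prior knowledge: Beta(a, b); here a prior that the event is fairly rare.
a, b = Fraction(2), Fraction(8)          # prior mean 0.2
observations = [1, 0, 1, 1, 0, 1, 1, 1]  # 1 = event stimulus observed

frequency = 0        # raw evidence count, analogous to the frequency signal
posterior_means = []
for x in observations:
    frequency += x
    a, b = a + x, b + (1 - x)            # conjugate Bayesian update
    posterior_means.append(a / (a + b))  # posterior probability of the event
```

The two tracked quantities diverge exactly as in the paper's framing: the count ignores the prior, while the posterior is pulled toward it until the evidence dominates.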

  6. Overestimations in zero frequency DQE of x-ray imaging converters assessed by Monte Carlo techniques based on the study of energy impartation events

    Energy Technology Data Exchange (ETDEWEB)

    Liaparinos, P. F.; Kandarakis, I. S. [Department of Medical Instruments Technology, Technological Educational Institute, 122 10 Athens (Greece)

    2011-07-15

    Purpose: The performance of various x-ray converters, employed in medical imaging systems, has been widely examined by several methodologies (experimental, analytical, and Monte Carlo techniques). The x-ray converters most frequently employed in energy-integrating digital radiology detectors are the Gd₂O₂S:Tb granular phosphor, the CsI:Tl structured phosphor, and the a-Se photoconductor. The imaging characteristics of an x-ray converter are affected by its x-ray detection properties. However, various definitions of x-ray detection have been used in the literature, leading to different results for the quantum detection efficiency (QDE) for the same type of x-ray converter. For this reason, there is a need for accurate determination of the x-ray detection and, in particular, its relation to detector response. Methods: The present article reports on the performance of the three aforementioned x-ray converters in terms of the QDE and the x-ray statistical factor I_x and examines the effect of the x-ray detection, directly related to converter output signal, on the zero-frequency DQE. For the purposes of this study, Monte Carlo simulation was used to model the x-ray interactions within the x-ray converter. Simulations were carried out in the energy range from 10 keV up to 80 keV and considering two layers of different coating weights (50 and 100 mg/cm²). The prediction and comparison of zero-frequency DQE were based on two different approaches for x-ray detection, i.e., (a) fraction of interacting photons and (b) fraction of photons leading to energy deposition. In addition, the effect of energy deposition through Compton scattering events on the DQE values was estimated. Results: Our results showed discrepancies between Monte Carlo techniques (based on energy deposition events) and analytical calculations (based on x-ray attenuation) on QDE. Discrepancies were found to range up to 10% for Gd₂O₂S:Tb (100 mg/cm²), 7.7% for Cs

  7. Reconstructing high-magnitude/low-frequency landslide events based on soil redistribution modelling and a Late-Holocene sediment record from New Zealand

    NARCIS (Netherlands)

    Claessens, L.F.G.; Lowe, D.J.; Hayward, B.W.; Schaap, B.F.; Schoorl, J.M.; Veldkamp, A.

    2006-01-01

    A sediment record is used, in combination with shallow landslide soil redistribution and sediment-yield modelling, to reconstruct the incidence of high-magnitude/low-frequency landslide events in the upper part of a catchment and the history of a wetland in the lower part. Eleven sediment cores were

  8. Insights from in-situ, UV-based, high-frequency sensor for characterizing storm-event particulate organic carbon in stream runoff

    Science.gov (United States)

    Inamdar, S. P.; Rowland, R. D.; Del Percio, S.; Johnson, E. R.

    2016-12-01

    While dissolved forms of organic carbon (e.g., DOC) make up a large portion of the runoff load during baseflow and small storms, large storms can erode and mobilize significant amounts of particulate organic carbon (POC). Large storms yield sudden and rapid changes in POC that occur over minutes to hours, typically early in the storm event. Capturing these "hot moments" of POC is critical for understanding watershed processes, developing accurate budgets of solute flux, assessing the impacts on receiving aquatic ecosystems, and developing sustainable mitigation strategies. The recent availability of in-situ, high-frequency, electronic sensors has shown considerable promise for characterizing dissolved forms of solutes (e.g., DOC, nitrate-nitrogen), but their ability to measure POC has yet to be rigorously evaluated. We evaluated the accuracy of a UV-based sensor to measure POC concentrations using a combination of field- and laboratory-based studies. Stream water POC concentrations were studied for multiple storms over a 2-year period (2015-2016) in a 79 ha forested watershed (second-order stream) in the Piedmont region of Maryland. Storm sampling was performed using ISCO samplers and POC (% OC content) was determined for suspended sediments (SS) retained on a 0.7 micron filter. POC values measured by the in-situ stream sensor are being evaluated against those determined for suspended sediments from stream runoff. Sensor versus lab-determined POC concentrations will be evaluated for: magnitude, intensity, and seasonal timing of the storms; values on the rising versus falling limb of the hydrograph; and potential sources of POC. Simultaneously, a laboratory experiment was performed in which sensor versus lab-determined POC were examined for varying POC concentrations; a variety of POC sources including stream banks, stream bed, forest floor, and upland A horizon; and four particle size classes (2000-1000 µm; 1000-250 µm; 250-63 µm and climate change predictions that
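The sensor-versus-lab evaluation step amounts to building a calibration line and checking its residual error; the sketch below uses synthetic stand-ins for the field data:

```python
import numpy as np

# Sketch of a sensor-evaluation step: regress lab-determined POC on the
# sensor reading to build a linear calibration, then check residual error.
# All values are synthetic, not the study's measurements.
rng = np.random.default_rng(2)
sensor = rng.uniform(1, 30, size=80)               # sensor POC reading, mg/L
lab = 0.9 * sensor + 1.5 + rng.normal(0, 1.0, 80)  # lab-determined POC, mg/L

# Least-squares calibration line and root-mean-square error of the fit.
slope, intercept = np.polyfit(sensor, lab, 1)
predicted = slope * sensor + intercept
rmse = float(np.sqrt(np.mean((lab - predicted) ** 2)))
```

In practice the study stratifies this comparison by storm magnitude, hydrograph limb, POC source, and particle size class rather than pooling all samples.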

  9. Event-Based Activity Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2004-01-01

    We present and discuss a modeling approach that supports event-based modeling of information and activity in information systems. Interacting human actors and IT-actors may carry out such activity. We use events to create meaningful relations between information structures and the related...... activities inside and outside an IT-system. We use event-activity diagrams to model activity. Such diagrams support the modeling of activity flow, object flow, shared events, triggering events, and interrupting events....

  10. Sexual frequency and planning among at-risk men who have sex with men in the United States: implications for event-based intermittent pre-exposure prophylaxis.

    Science.gov (United States)

    Volk, Jonathan E; Liu, Albert; Vittinghoff, Eric; Irvin, Risha; Kroboth, Elizabeth; Krakower, Douglas; Mimiaga, Matthew J; Mayer, Kenneth H; Sullivan, Patrick S; Buchbinder, Susan P

    2012-09-01

    Intermittent dosing of pre-exposure prophylaxis (iPrEP) has potential to decrease costs, improve adherence, and minimize toxicity. Practical event-based dosing of iPrEP requires men who have sex with men (MSM) to be sexually active on fewer than 3 days each week and plan for sexual activity. MSM who may be most suitable for event-based dosing were older, more educated, more frequently used sexual networking websites, and more often reported that their last sexual encounter was not with a committed partner. A substantial proportion of these MSM endorse high-risk sexual activity, and event-based iPrEP may best target this population.

  11. Frequency of adverse events after vaccination with different vaccinia strains.

    Directory of Open Access Journals (Sweden)

    Mirjam Kretzschmar

    2006-08-01

    Full Text Available BACKGROUND: Large quantities of smallpox vaccine have been stockpiled to protect entire nations against a possible reintroduction of smallpox. Planning for an appropriate use of these stockpiled vaccines in response to a smallpox outbreak requires a rational assessment of the risks of vaccination-related adverse events, compared to the risk of contracting an infection. Although considerable effort has been made to understand the dynamics of smallpox transmission in modern societies, little attention has been paid to estimating the frequency of adverse events due to smallpox vaccination. Studies exploring the consequences of smallpox vaccination strategies have commonly used a frequency of approximately one death per million vaccinations, which is based on a study of vaccination with the New York City Board of Health (NYCBH) strain of vaccinia virus. However, a multitude of historical studies of smallpox vaccination with other vaccinia strains suggest that there are strain-related differences in the frequency of adverse events after vaccination. Because many countries have stockpiled vaccine based on the Lister strain of vaccinia virus, a quantitative evaluation of the adverse effects of such vaccines is essential for emergency response planning. We conducted a systematic review and statistical analysis of historical data concerning vaccination against smallpox with different strains of vaccinia virus. METHODS AND FINDINGS: We analyzed historical vaccination data extracted from the literature. We extracted data on the frequency of postvaccinal encephalitis and death with respect to vaccinia strain and age of vaccinees. Using a hierarchical Bayesian approach for meta-analysis, we estimated the expected frequencies of postvaccinal encephalitis and death with respect to age at vaccination for smallpox vaccines based on the NYCBH and Lister vaccinia strains. We found large heterogeneity between findings from different studies and a time-period effect
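A much-simplified (non-hierarchical) version of the Bayesian pooling step can be sketched as follows; the study counts below are invented for illustration and are not the historical data:

```python
# Illustrative Bayesian pooling of adverse-event counts per vaccinia strain.
# A full hierarchical model would let each study have its own rate; here we
# simply pool counts within a strain under one Beta-binomial posterior.

# (deaths, vaccinations) for hypothetical studies of two strains.
studies = {
    "NYCBH": [(1, 1_000_000), (2, 1_500_000)],
    "Lister": [(6, 1_200_000), (9, 1_800_000)],
}

def posterior_rate(counts, alpha=0.5, beta=0.5):
    # Jeffreys Beta prior + binomial likelihood -> posterior mean death rate.
    deaths = sum(d for d, _ in counts)
    n = sum(v for _, v in counts)
    return (deaths + alpha) / (n + alpha + beta)

rates = {strain: posterior_rate(c) for strain, c in studies.items()}
```

The hierarchical model in the paper additionally shares information across studies and stratifies by age at vaccination, which simple pooling cannot capture.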

  12. Grid Frequency Extreme Event Analysis and Modeling: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Florita, Anthony R [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clark, Kara [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gevorgian, Vahan [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Folgueras, Maria [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wenger, Erin [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-11-01

    Sudden losses of generation or load can lead to instantaneous changes in electric grid frequency and voltage. Extreme frequency events pose a major threat to grid stability. As renewable energy sources supply power to grids in increasing proportions, it becomes increasingly important to examine when and why extreme events occur to prevent destabilization of the grid. To better understand frequency events, including extrema, historic data were analyzed to fit probability distribution functions to various frequency metrics. Results showed that a standard Cauchy distribution fit the difference between the frequency nadir and prefault frequency (f_(C-A)) metric well, a standard Cauchy distribution fit the settling frequency (f_B) metric well, and a standard normal distribution fit the difference between the settling frequency and frequency nadir (f_(B-C)) metric very well. Results were inconclusive for the frequency nadir (f_C) metric, meaning it likely has a more complex distribution than those tested. This probabilistic modeling should facilitate more realistic modeling of grid faults.
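The distribution-fitting step can be sketched with synthetic data standing in for the historic frequency metrics; simple closed-form estimators replace full maximum likelihood here:

```python
import numpy as np

# Sketch: fit Cauchy and normal candidates to a frequency metric such as
# f_(C-A). Synthetic Cauchy samples stand in for the historic event data.
rng = np.random.default_rng(3)
u = rng.uniform(size=5000)
metric = -0.05 + 0.01 * np.tan(np.pi * (u - 0.5))  # Cauchy(loc=-0.05, scale=0.01)

# Robust Cauchy estimators: median for location, half the IQR for scale.
loc = np.median(metric)
scale = 0.5 * (np.quantile(metric, 0.75) - np.quantile(metric, 0.25))

# Gaussian moment fit for comparison (badly inflated by the heavy tails).
mu, sigma = metric.mean(), metric.std()

def cauchy_logpdf(x, loc, scale):
    return -np.log(np.pi * scale * (1 + ((x - loc) / scale) ** 2))

def normal_logpdf(x, mu, sigma):
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu) ** 2 / (2 * sigma**2)

# Total log-likelihood comparison: the heavy-tailed Cauchy wins on this data.
ll_cauchy = cauchy_logpdf(metric, loc, scale).sum()
ll_normal = normal_logpdf(metric, mu, sigma).sum()
```

The same comparison, run per metric, is how one would conclude that f_(C-A) and f_B are Cauchy-like while f_(B-C) is closer to normal.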

  13. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    2009-01-01

    The purpose of the paper is to obtain insight into and provide practical advice for event-based conceptual modeling. We analyze a set of event concepts and use the results to formulate a conceptual event model that is used to identify guidelines for creation of dynamic process models and static...... information models. We characterize events as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms of information structures. The conceptual event model is used to characterize a variety of event concepts and it is used to illustrate how events can...... be used to integrate dynamic modeling of processes and static modeling of information structures. The results are unique in the sense that no other general event concept has been used to unify a similar broad variety of seemingly incompatible event concepts. The general event concept can be used...

  14. Event-Based Conceptual Modeling

    DEFF Research Database (Denmark)

    Bækgaard, Lars

    The paper demonstrates that a wide variety of event-based modeling approaches are based on special cases of the same general event concept, and that the general event concept can be used to unify the otherwise unrelated fields of information modeling and process modeling. A set of event......-based modeling approaches are analyzed and the results are used to formulate a general event concept that can be used for unifying the seemingly unrelated event concepts. Events are characterized as short-duration processes that have participants, consequences, and properties, and that may be modeled in terms...... of information structures. The general event concept can be used to guide systems analysis and design and to improve modeling approaches....

  15. Research on the Relationship of ENSO and the Frequency of Extreme Precipitation Events in China

    OpenAIRE

    Li, Wei; Zhai, Panmao; Cai, Jinhui

    2017-01-01

    Based on a daily precipitation observation dataset of 743 stations in China from 1951–2004, the Γ distribution function is used to calculate the probability distribution of daily precipitation and to define extreme precipitation events. Based on this, the relationship of ENSO and the frequency of extreme precipitation events is studied. Results reveal that ENSO events have impact on extreme precipitation events, with different magnitudes at different regions and seasons. In general, during wi...
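The threshold-definition step, fitting a gamma (Γ) distribution to wet-day precipitation and taking a high percentile as the extreme-event cutoff, can be sketched as follows; the data are synthetic and a method-of-moments fit replaces the full distribution fit:

```python
import numpy as np

# Sketch: fit a gamma distribution to wet-day precipitation and take a high
# percentile of the fitted distribution as the extreme-event threshold.
rng = np.random.default_rng(11)
wet_day_precip = rng.gamma(shape=0.7, scale=12.0, size=4000)  # mm/day

# Method-of-moments gamma fit: shape k = mean^2/var, scale theta = var/mean.
m, v = wet_day_precip.mean(), wet_day_precip.var()
k_hat, theta_hat = m * m / v, v / m

# 95th percentile of the fitted gamma, approximated by Monte Carlo sampling
# (avoids needing the inverse incomplete gamma function).
threshold = np.quantile(rng.gamma(k_hat, theta_hat, size=200_000), 0.95)

# Count of extreme precipitation events in the record.
n_extreme = int((wet_day_precip > threshold).sum())
```

Counting such exceedances per station and season, and compositing by ENSO phase, gives the frequency relationship the paper analyzes.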

  16. Alternative source models of very low frequency events

    Science.gov (United States)

    Gomberg, Joan S.; Agnew, D.C.; Schwartz, S.Y.

    2016-01-01

    We present alternative source models for very low frequency (VLF) events, previously inferred to be radiation from individual slow earthquakes that partly fill the period range between slow slip events lasting thousands of seconds and low-frequency earthquakes (LFE) with durations of tenths of a second. We show that VLF events may emerge from bandpass filtering a sum of clustered, shorter duration, LFE signals, believed to be the components of tectonic tremor. Most published studies show VLF events occurring concurrently with tremor bursts and LFE signals. Our analysis of continuous data from Costa Rica detected VLF events only when tremor was also occurring, which was only 7% of the total time examined. Using analytic and synthetic models, we show that a cluster of LFE signals produces the distinguishing characteristics of VLF events, which may be determined by the cluster envelope. The envelope may be diagnostic of a single, dynamic, slowly slipping event that propagates coherently over kilometers or represents a narrowly band-passed version of nearly simultaneous arrivals of radiation from slip on multiple higher stress drop and/or faster propagating slip patches with dimensions of tens of meters (i.e., LFE sources). Temporally clustered LFE sources may be triggered by single or multiple distinct aseismic slip events or represent the nearly simultaneous chance occurrence of background LFEs. Given the nonuniqueness in possible source durations, we suggest it is premature to draw conclusions about VLF event sources or how they scale.
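The paper's central point, that a temporal cluster of short LFE-like signals can masquerade as a single VLF event after band-pass filtering, can be illustrated with a toy synthetic; the pulse shape and band edges are assumptions, and using one-sided pulses assumes each LFE signal carries some net low-frequency content:

```python
import numpy as np

# Toy illustration: a cluster of short pulses, band-pass filtered at VLF
# periods, merges into one smooth long-period pulse shaped by the cluster
# envelope.
fs = 100.0                        # sampling rate, Hz
t = np.arange(0, 200.0, 1 / fs)   # 200 s record
rng = np.random.default_rng(5)

# 40 LFE stand-ins: ~0.2 s pulses clustered around t = 100 s.
onsets = rng.normal(100.0, 5.0, size=40)
signal = np.zeros_like(t)
for t0 in onsets:
    signal += np.exp(-((t - t0) ** 2) / (2 * 0.2**2))

# Band-pass 0.02-0.05 Hz (20-50 s periods) by masking the FFT spectrum.
spec = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
vlf = np.fft.irfft(spec * ((freqs >= 0.02) & (freqs <= 0.05)), n=t.size)

peak_time = t[np.argmax(np.abs(vlf))]  # sits near the cluster center
```

The filtered trace has an apparent duration of tens of seconds even though no individual source lasted longer than a fraction of a second, which is exactly the ambiguity the authors raise.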

  17. A new approach for annual flood frequency estimation: Hybrid Causative Event Method: Conference Presentation

    OpenAIRE

    Thyer, Mark; Li, Jing; Lambert, Martin; Kuczera, George; Metcalfe, Andrew

    2016-01-01

    Conference presentation from Engineers Australia, Hydrology and Water Resources Symposium, 2012, Sydney, Australia. Summarises the Hybrid Causative Event-based Method for Annual Flood Frequency Estimation; see Li et al. (2014, 2015) (links provided below)

  18. Derived flood frequency distributions considering individual event hydrograph shapes

    Science.gov (United States)

    Hassini, Sonia; Guo, Yiping

    2017-04-01

    Derived in this paper is the frequency distribution of the peak discharge rate of a random runoff event from a small urban catchment. The derivation follows the derived probability distribution procedure and incorporates a catchment rainfall-runoff model with approximating shapes for individual runoff event hydrographs. In the past, only simple triangular runoff event hydrograph shapes were used; in this study, approximating hydrograph shapes that better represent the full range of possibilities are considered. The resulting closed-form mathematical equations are converted to the flood frequency distributions commonly required in urban stormwater management studies. The analytically determined peak discharge rates of different return periods for a wide range of hypothetical catchment conditions were compared to those determined from design storm modeling. The newly derived equations generate results that are closer to those from design storm modeling and provide a better alternative for use in urban stormwater management studies.

  19. Flood Frequency Analysis for the Annual Peak Flows Simulated by an Event-Based Rainfall-Runoff Model in an Urban Drainage Basin

    Directory of Open Access Journals (Sweden)

    Jeonghwan Ahn

    2014-12-01

    Full Text Available The proper assessment of the design flood is a major concern for many hydrological applications in small urban watersheds. A number of approaches can be used, including the statistical approach and the continuous simulation and design storm methods. However, each method has its own limitations and assumptions when applied to the real world. The design storm method has been widely used for a long time because of its simplicity, but it rests on three critical assumptions: the equality of the return periods of the rainfall and the corresponding flood quantiles, and the selections of the rainfall hyetograph and of the antecedent soil moisture conditions. Continuous simulation cannot be applied to small urban catchments with quick responses of runoff to rainfall. In this paper, a new flood frequency analysis for the simulated annual peak flows (FASAP) is proposed. This method employs candidate rainfall events selected by considering a five-minute time step and a sliding duration, without any of the assumptions of the conventional design storm method in an urban watershed. In addition, the proposed methodology was verified by comparing its results with the conventional method in a real urban watershed.
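The flood-frequency step shared by approaches like this one, fitting an extreme-value distribution to an annual-peak series and reading off return-period quantiles, can be sketched with a Gumbel (EV1) method-of-moments fit on synthetic peaks:

```python
import numpy as np

# Generic flood-frequency analysis: fit a Gumbel (EV1) distribution to an
# annual-peak-flow series by the method of moments and compute quantiles.
# The peak series is synthetic, not the paper's simulated peaks.
rng = np.random.default_rng(9)
annual_peaks = 50 + 20 * rng.gumbel(size=60)  # m^3/s, 60 years of record

mean, std = annual_peaks.mean(), annual_peaks.std(ddof=1)
beta = std * np.sqrt(6) / np.pi   # Gumbel scale parameter
mu = mean - 0.5772 * beta         # Gumbel location (Euler-Mascheroni constant)

def gumbel_quantile(T):
    # Flow exceeded on average once every T years.
    return mu - beta * np.log(-np.log(1 - 1 / T))

q10, q100 = gumbel_quantile(10), gumbel_quantile(100)
```

FASAP differs in how the annual peaks are generated (simulated from selected candidate storms), not in this final frequency-analysis step.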

  20. Development of transient initiating event frequencies for use in probabilistic risk assessments

    Energy Technology Data Exchange (ETDEWEB)

    Mackowiak, D.P.; Gentillon, C.D.; Smith, K.L.

    1985-05-01

    Transient initiating event frequencies are an essential input to the analysis process of a nuclear power plant probabilistic risk assessment. These frequencies describe events causing or requiring scrams. This report documents an effort to validate and update from other sources a computer-based data file developed by the Electric Power Research Institute (EPRI) describing such events at 52 United States commercial nuclear power plants. Operating information from the United States Nuclear Regulatory Commission on 24 additional plants from their date of commercial operation has been combined with the EPRI data, and the entire data base has been updated to add 1980 through 1983 events for all 76 plants. The validity of the EPRI data and data analysis methodology and the adequacy of the EPRI transient categories are examined. New transient initiating event frequencies are derived from the expanded data base using the EPRI transient categories and data display methods. Upper bounds for these frequencies are also provided. Additional analyses explore changes in the dominant transients, changes in transient outage times and their impact on plant operation, and the effects of power level and scheduled scrams on transient event frequencies. A more rigorous data analysis methodology is developed to encourage further refinement of the transient initiating event frequencies derived herein. Updating the transient event data base resulted in approximately 2400 events being added to EPRI's approximately 3000-event data file. The resulting frequency estimates were in most cases lower than those reported by EPRI, but no significant order-of-magnitude changes were noted. The average number of transients per year for the combined data base is 8.5 for pressurized water reactors and 7.4 for boiling water reactors.
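The basic estimate and upper bound described above can be sketched in a few lines; the counts are hypothetical, and a normal approximation to the Poisson rate stands in for the report's more rigorous bound:

```python
# Sketch of the frequency-estimation step: a transient-category frequency is
# events per reactor-year, with an illustrative 95% upper bound from a
# normal approximation to the Poisson rate (valid for large counts).
events = 340          # hypothetical transients observed in a category
reactor_years = 40.0  # hypothetical combined operating experience

rate = events / reactor_years  # events per reactor-year
upper_95 = rate + 1.645 * events ** 0.5 / reactor_years
```

For small counts (rare transient categories) the normal approximation is poor, which is one reason PRA practice favors chi-square or Bayesian bounds instead.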

  1. A graphene based frequency quadrupler

    Science.gov (United States)

    Cheng, Chuantong; Huang, Beiju; Mao, Xurui; Zhang, Zanyun; Zhang, Zan; Geng, Zhaoxin; Xue, Ping; Chen, Hongda

    2017-04-01

    Benefiting from its exceptional electrical transport properties, graphene receives worldwide attention, especially in the domain of high-frequency electronics. Because the absence of an effective bandgap prevents the device from being switched off, graphene is far better suited to analog circuits than to digital applications. With its unique ambipolar behavior, graphene can be exploited to achieve high performance in frequency multipliers. Here, dual-gated graphene field-effect transistors have been used for the first time to achieve frequency quadrupling. Two Dirac points can be observed in the transfer curves of the designed GFETs by tuning the top-gate voltage, which is essential for generating the fourth harmonic. When a 200 kHz sinusoidal input is applied, around 50% of the radio-frequency power of the output signal is concentrated at the desired frequency of 800 kHz. Additionally, in suitable operating regions, the devices can work as high-performance frequency doublers and frequency triplers. Considering both the simple device structure and the potentially superhigh carrier mobility of graphene, graphene-based frequency quadruplers may offer many advantages for ultrahigh-frequency electronic applications in the near future. Moreover, the versatility of the carbon material system is far-reaching for the realization of complementary metal-oxide-semiconductor compatible electrically active devices.

  2. Host Event Based Network Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Jonathan Chugg

    2013-01-01

    The purpose of INL’s research on this project is to demonstrate the feasibility of a host event based network monitoring tool and its effects on host performance. Current host based network monitoring tools work by polling, which can miss activity that occurs between polls. Instead of polling, a tool could be developed that uses event APIs in the operating system to receive asynchronous notifications of network activity. Analysis and logging of these events allow the tool to construct the complete real-time and historical network configuration of the host while the tool is running. This research focused on three operating systems commonly used by SCADA systems: Linux, Windows XP, and Windows 7. Windows 7 offers two paths that have minimal impact on the system and should be seriously considered: first, the new Windows Event Logging API, and second, the ALE API within WFP. Any future work should focus on these methods.

  3. State-based Event Detection Optimization for Complex Event Processing

    Directory of Open Access Journals (Sweden)

    Shanglian PENG

    2014-02-01

    Detection of patterns in high-speed, large-volume event streams has been an important paradigm in many application areas of Complex Event Processing (CEP), including security monitoring, financial markets analysis and health-care monitoring. To assure real-time responsive complex pattern detection over high-volume, high-speed event streams, efficient event detection techniques have to be designed. Unfortunately, evaluation of the Nondeterministic Finite Automaton (NFA) based event detection model has mainly considered single event queries and their optimization. In this paper, we propose the evaluation of multiple event queries on event streams. In particular, we consider a scalable multiple event detection model that shares NFA transfer states across different event queries. Each event query is parsed into an NFA, and the states of the NFA are partitioned into different units. With this partition, the same individual NFA state is run on different processing nodes, providing state sharing and reducing partial match maintenance. We compare our state-based approach with Stream-based And Shared Event processing (SASE). Our experiments demonstrate that the state-based approach outperforms SASE in both CPU time and memory consumption.
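
The NFA-based detection model can be sketched as follows: a query compiles to a chain of states, and partial matches ("runs") advance as events arrive. This toy matcher for a SEQ(A, B, C) pattern illustrates the general idea only, not the paper's partitioned, distributed implementation.

```python
from dataclasses import dataclass, field

# Toy NFA-style matcher for a SEQ(A, B, C) pattern over an event stream,
# with skip-till-next-match semantics: partial matches ("runs") are kept
# alive across non-matching events. Event tuples are (type, payload).
@dataclass
class Run:
    state: int = 0                        # index of next pattern symbol
    matched: list = field(default_factory=list)

def detect(stream, pattern):
    runs, complete = [], []
    for ev in stream:
        new_runs = []
        for r in runs + [Run()]:          # a fresh run may start anywhere
            if r.state < len(pattern) and ev[0] == pattern[r.state]:
                nr = Run(r.state + 1, r.matched + [ev])
                (complete if nr.state == len(pattern) else new_runs).append(nr)
            if r.state > 0:
                new_runs.append(r)        # keep the partial match alive
        runs = new_runs
    return [r.matched for r in complete]

stream = [("A", 1), ("B", 2), ("A", 3), ("C", 4)]
print(detect(stream, ["A", "B", "C"]))    # → [[('A', 1), ('B', 2), ('C', 4)]]
```

Maintaining the set of live runs is exactly the partial-match state that the paper's partitioning scheme distributes across processing nodes.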

  4. Warning and prevention based on estimates with large uncertainties: the case of low-frequency and large-impact events like tsunamis

    Science.gov (United States)

    Tinti, Stefano; Armigliato, Alberto; Pagnoni, Gianluca; Zaniboni, Filippo

    2013-04-01

    Geoscientists often deal with hazardous processes like earthquakes, volcanic eruptions, tsunamis and hurricanes, and their research is aimed not only at a better understanding of the physical processes, but also at assessing the spatial and temporal evolution of a given individual event (i.e. providing short-term prediction) and the expected evolution of a group of events (i.e. providing statistical estimates referred to a given return period and a given geographical area). One of the main issues of any scientific method is how to cope with measurement errors, a topic which, in the case of forecasts of ongoing or future events, translates into how to deal with forecast uncertainties. In general, the more data are available and processed to make a prediction, the more accurate the prediction is expected to be (if the scientific approach is sound) and the smaller the associated uncertainties. However, there are several important cases where assessment must be made with insufficient data or insufficient time for processing, which leads to large uncertainties. Two examples can be taken from tsunami science, since tsunamis are rare events that may have destructive power and very large impact. One example is warning for a tsunami generated by a near-coast earthquake, an issue at the focus of the European funded project NearToWarn. Warning has to be issued before the tsunami hits the coast, that is, within a few minutes after its generation. This may imply that the data collected in such a short time are not yet enough for an accurate evaluation, also because the implemented monitoring system (if any) could be inadequate (e.g. because a dense instrumental network could be judged too expensive for rare events). The second case is long-term prevention from tsunami strikes. Tsunami infrequency may imply that the historical record for a given piece of coast is too short to capture a statistical

  5. Control charts for monitoring accumulating adverse event count frequencies from single and multiple blinded trials.

    Science.gov (United States)

    Gould, A Lawrence

    2016-12-30

    Conventional practice monitors accumulating information about drug safety in terms of the numbers of adverse events reported from trials in a drug development program. Estimates of between-treatment adverse event risk differences can be obtained readily from unblinded trials with adjustment for differences among trials using conventional statistical methods. Recent regulatory guidelines require monitoring the cumulative frequency of adverse event reports to identify possible between-treatment adverse event risk differences without unblinding ongoing trials. Conventional statistical methods for assessing between-treatment adverse event risks cannot be applied when the trials are blinded. However, CUSUM charts can be used to monitor the accumulation of adverse event occurrences. CUSUM charts for monitoring adverse event occurrence in a Bayesian paradigm are based on assumptions about the process generating the adverse event counts in a trial as expressed by informative prior distributions. This article describes the construction of control charts for monitoring adverse event occurrence based on statistical models for the processes, characterizes their statistical properties, and describes how to construct useful prior distributions. Application of the approach to two adverse events of interest in a real trial gave nearly identical results for binomial and Poisson observed event count likelihoods. Copyright © 2016 John Wiley & Sons, Ltd.
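
A minimal sketch of a one-sided Poisson CUSUM of the kind used to monitor accumulating event counts: each period's count contributes a log-likelihood-ratio increment, and the chart signals when the cumulative statistic crosses a decision limit. The rates lam0/lam1, the limit h, and the counts are invented for illustration, and this frequentist form omits the paper's Bayesian prior construction.

```python
import math

# Minimal one-sided Poisson CUSUM for accumulating event counts: each
# period's count adds a log-likelihood-ratio increment for an in-control
# rate lam0 vs an elevated rate lam1; the chart signals when the
# statistic exceeds h. All numbers are invented for illustration.
def poisson_cusum(counts, lam0=2.0, lam1=4.0, h=5.0):
    k = math.log(lam1 / lam0)              # per-event LLR weight
    s, path = 0.0, []
    for c in counts:
        s = max(0.0, s + c * k - (lam1 - lam0))
        path.append(s)
    first_alarm = next((i for i, v in enumerate(path) if v > h), None)
    return path, first_alarm

counts = [2, 1, 3, 2, 6, 7, 5, 8]          # e.g. weekly adverse-event counts
path, alarm = poisson_cusum(counts)
print("signal at period:", alarm)          # → signal at period: 5
```

The `max(0, ...)` reset is what keeps the chart sensitive to a sustained rate increase while ignoring quiet stretches.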

  6. Frequency Dependence of Single-Event Upset in Highly Advanced PowerPC Microprocessors

    Science.gov (United States)

    Irom, Farokh; Farmanesh, Farhad; White, Mark; Kouba, Coy K.

    2006-01-01

    Single-event upset effects from heavy ions were measured for a Motorola silicon-on-insulator (SOI) microprocessor with 90 nm feature size at three frequencies: 500, 1066 and 1600 MHz. The frequency dependence of single-event upsets is discussed. The results of our studies suggest that single-event upsets in registers and the D-Cache tend to increase with frequency. This might have important implications for the overall single-event upset trend as technology moves toward higher frequencies.

  7. Systematic review on the prevalence, frequency and comparative value of adverse events data in social media

    Science.gov (United States)

    Golder, Su; Norman, Gill; Loke, Yoon K

    2015-01-01

    Aim: The aim of this review was to summarize the prevalence, frequency and comparative value of information on the adverse events of healthcare interventions from user comments and videos in social media. Methods: A systematic review of assessments of the prevalence or type of information on adverse events in social media was undertaken. Sixteen databases and two internet search engines were searched in addition to handsearching, reference checking and contacting experts. The results were sifted independently by two researchers. Data extraction and quality assessment were carried out by one researcher and checked by a second. The quality assessment tool was devised in-house and a narrative synthesis of the results followed. Results: From 3064 records, 51 studies met the inclusion criteria. The studies assessed over 174 social media sites with discussion forums (71%) being the most popular. The overall prevalence of adverse events reports in social media varied from 0.2% to 8% of posts. Twenty-nine studies compared the results from searching social media with using other data sources to identify adverse events. There was general agreement that a higher frequency of adverse events was found in social media and that this was particularly true for ‘symptom’ related and ‘mild’ adverse events. Those adverse events that were under-represented in social media were laboratory-based and serious adverse events. Conclusions: Reports of adverse events are identifiable within social media. However, there is considerable heterogeneity in the frequency and type of events reported, and the reliability or validity of the data has not been thoroughly evaluated. PMID: 26271492

  8. Systematic review on the prevalence, frequency and comparative value of adverse events data in social media.

    Science.gov (United States)

    Golder, Su; Norman, Gill; Loke, Yoon K

    2015-10-01

    The aim of this review was to summarize the prevalence, frequency and comparative value of information on the adverse events of healthcare interventions from user comments and videos in social media. A systematic review of assessments of the prevalence or type of information on adverse events in social media was undertaken. Sixteen databases and two internet search engines were searched in addition to handsearching, reference checking and contacting experts. The results were sifted independently by two researchers. Data extraction and quality assessment were carried out by one researcher and checked by a second. The quality assessment tool was devised in-house and a narrative synthesis of the results followed. From 3064 records, 51 studies met the inclusion criteria. The studies assessed over 174 social media sites with discussion forums (71%) being the most popular. The overall prevalence of adverse events reports in social media varied from 0.2% to 8% of posts. Twenty-nine studies compared the results from searching social media with using other data sources to identify adverse events. There was general agreement that a higher frequency of adverse events was found in social media and that this was particularly true for 'symptom' related and 'mild' adverse events. Those adverse events that were under-represented in social media were laboratory-based and serious adverse events. Reports of adverse events are identifiable within social media. However, there is considerable heterogeneity in the frequency and type of events reported, and the reliability or validity of the data has not been thoroughly evaluated. © 2015 The British Pharmacological Society.

  9. Planetary Hypothesis, sub-Milankovitch frequencies and Holocene cold events

    Science.gov (United States)

    Compagnucci, R. H.; Cionco, R. G.; Agosta, E.; Wanner, H.

    2013-05-01

    The Planetary Hypothesis of solar cycles proposes that the movement of the Sun around the solar system barycenter modulates the solar cycles at several time scales. Using a 3-D model of the solar system (Cionco and Compagnucci, 2012) we derived the solar barycentric motion and various dynamical parameters, such as the angular momentum (L = Lx, Ly, Lz), for the Holocene. Angular momentum inversions are sporadic and important events in the dynamics of the solar barycentric motion: Lz becomes negative and the giant planets are nearly aligned. These episodes are related to some grand solar minima, such as the Maunder and Dalton minima, and also to the recent deep minimum of 2007-2010, which was preceded by an Lz inversion in 1990. During the Holocene, several negative Lz episodes occur that are grouped into periods lasting approximately centuries to millennia. The groups are separated by ~2000 years during which the Lz values remain positive, together generating a cycle of between 1500 and 2500 years. Spectral analysis shows significant peaks at sub-Milankovitch frequencies. Furthermore, analysis of the spatiotemporal variability of temperature has defined six specific cold events (8200, 6300, 4700, 2700, 1550 and 550 years BP) during the Holocene (Wanner et al., 2011). During and/or before these major climate coolings, groups of negative Lz episodes were present. Conversely, the warm periods featured a lack of angular momentum inversions together with extremes of positive Lz. Therefore, the origin of Holocene cold events seems to be linked to the gravitational influence of the planets, that is to say, to the planetary torque, which has a non-negligible effect on the causes of the solar magnetic cycle. Acknowledgements: The support of the Grants PID-UTN1351, UBACYT N 20020100101049, CONICET PIP 114-201001-00250 and MINCYT-MEYS ARC/11/09. References: Cionco, R.G.; Compagnucci, R.H. (2012) Dynamical characterization of the last prolonged solar minima, Advances in Space Research 50(10), 1434

  10. Fault detection based on microseismic events

    Science.gov (United States)

    Yin, Chen

    2017-09-01

    In unconventional reservoirs, small faults allow the flow of oil and gas but also act as obstacles to exploration, since fracturing along them facilitates (1) fluid migration, (2) reservoir flooding, and (3) the triggering of small earthquakes. These small faults are generally not detected because of low seismic resolution. However, such small faults are very active and release sufficient energy to initiate a large number of microseismic events (MEs) during hydraulic fracturing. In this study, we distinguished microfractures (MFs) induced by hydraulic fracturing from natural small faults based on microseismicity characteristics, such as the time-space distribution, source mechanism, magnitude, amplitude, and frequency. First, we identified the mechanisms of small faults and MFs by reservoir stress analysis and calibrated the MEs based on microseismic magnitude. The dynamic characteristics (frequency and amplitude) of MEs triggered by natural faults and MFs were then analyzed, and the geometry and activity types of natural faults and MFs were grouped according to source mechanism. Finally, the differences in time-space distribution, magnitude, source mechanism, amplitude, and frequency were used to differentiate natural faults from man-made fractures.

  11. Time-Frequency Data Reduction for Event Related Potentials: Combining Principal Component Analysis and Matching Pursuit

    Directory of Open Access Journals (Sweden)

    Selin Aviyente

    2010-01-01

    Joint time-frequency representations offer a rich representation of event related potentials (ERPs) that cannot be obtained through individual time or frequency domain analysis. This representation, however, comes at the expense of increased data volume and the difficulty of interpreting the resulting representations. Therefore, methods that can reduce the large amount of time-frequency data to experimentally relevant components are essential. In this paper, we present a method that reduces the large volume of ERP time-frequency data into a few significant time-frequency parameters. The proposed method is based on applying the widely used matching pursuit (MP) approach, with a Gabor dictionary, to principal components extracted from the time-frequency domain. The proposed PCA-Gabor decomposition is compared with other time-frequency data reduction methods, such as the time-frequency PCA approach alone and standard matching pursuit methods using a Gabor dictionary, for both simulated and biological data. The results show that the proposed PCA-Gabor approach performs better than either the PCA alone or the standard MP data reduction methods, by using the smallest amount of ERP data variance to produce the strongest statistical separation between experimental conditions.
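
A rough sketch of the PCA-then-matching-pursuit idea: take the leading principal direction of a set of trials, then greedily fit a few Gabor atoms to it. The sizes, dictionary grid and data below are invented for illustration, not the paper's parameters.

```python
import numpy as np

# Synthetic "ERP" trials: a Gabor-like burst plus noise (invented data)
rng = np.random.default_rng(0)
n_trials, n_samples = 20, 256
t = np.arange(n_samples)
signal = np.exp(-((t - 128) / 20.0) ** 2) * np.cos(2 * np.pi * 0.05 * t)
trials = signal + 0.5 * rng.standard_normal((n_trials, n_samples))

# PCA step: leading right singular vector of the (uncentered) trial matrix
_, _, vt = np.linalg.svd(trials, full_matrices=False)
pc1 = vt[0]

def gabor(center, width, freq):
    g = np.exp(-((t - center) / width) ** 2) * np.cos(2 * np.pi * freq * t)
    return g / np.linalg.norm(g)

# Small, coarse Gabor dictionary (illustrative grid)
dictionary = [gabor(c, w, f)
              for c in range(0, n_samples, 16)
              for w in (10.0, 20.0, 40.0)
              for f in (0.02, 0.05, 0.1)]

# Greedy matching pursuit: repeatedly subtract the best-correlated atom
residual, atoms = pc1.copy(), []
for _ in range(3):
    best = max(dictionary, key=lambda d: abs(d @ residual))
    coef = best @ residual
    residual = residual - coef * best
    atoms.append((best, coef))

explained = 1 - np.linalg.norm(residual) ** 2 / np.linalg.norm(pc1) ** 2
print(f"fraction of PC1 energy captured by 3 atoms: {explained:.2f}")
```

The three (center, width, frequency, coefficient) atoms are the "few significant time-frequency parameters" that replace the full representation.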

  12. Graphene-based frequency tripler.

    Science.gov (United States)

    Chen, Hong-Yan; Appenzeller, Joerg

    2012-04-11

    Graphene has captured the imagination of researchers worldwide as an ideal two-dimensional material with exceptional electrical transport properties. The high electron and hole mobility quickly inspired scientists to search for electronic applications that require high-performance channel materials. However, the absence of a bandgap in graphene immediately revealed itself in terms of ambipolar device characteristics and the nonexistence of a device off-state. The question is: How can the superior electronic properties of graphene be harvested while dealing appropriately with its unique characteristics rather than enforcing conventional device concepts? Here, we report a novel device idea, a graphene-based frequency tripler, an application that employs an innovative electrostatic doping approach and exploits the unique ambipolar behavior of graphene. © 2012 American Chemical Society

  13. Radio frequency based water level monitor

    African Journals Online (AJOL)

    eobe

    Key words: radio frequency, PIC microcontroller, encoder, decoder, water pump, residential. 1. ... The sensors emit high frequency (20kHz to 200 kHz) acoustic waves that are reflected back to and detected by the emitting transducer [2-4]. In addition, optical interface ... is another method; in this method optical sensors are.

  14. Linear Frequency Estimation Technique for Reducing Frequency Based Signals.

    Science.gov (United States)

    Woodbridge, Jonathan; Bui, Alex; Sarrafzadeh, Majid

    2010-06-01

    This paper presents a linear frequency estimation (LFE) technique for data reduction of frequency-based signals. LFE converts a signal to the frequency domain by utilizing the Fourier transform and estimates both the real and imaginary parts with a series of vectors much smaller than the original signal size. The estimation is accomplished by selecting optimal points from the frequency domain and interpolating data between these points with a first order approximation. The difficulty of such a problem lies in determining which points are most significant. LFE is unique in the fact that it is generic to a wide variety of frequency-based signals such as electromyography (EMG), voice, and electrocardiography (ECG). The only requirement is that spectral coefficients are spatially correlated. This paper presents the algorithm and results from both EMG and voice data. We complete the paper with a description of how this method can be applied to pattern recognition, signal indexing, and compression.
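
The core transform-then-interpolate step can be sketched as follows. The uniform point-selection rule here is a simplification (the paper selects optimal points), and the test signal is invented.

```python
import numpy as np

# Sketch of the LFE idea: keep a subset of spectral samples and
# reconstruct the rest of the spectrum by first-order (linear)
# interpolation of the real and imaginary parts separately.
def lfe_compress(x, keep_every=8):
    spec = np.fft.rfft(x)
    idx = np.arange(0, len(spec), keep_every)
    if idx[-1] != len(spec) - 1:
        idx = np.append(idx, len(spec) - 1)   # always keep the endpoint
    return idx, spec[idx]

def lfe_reconstruct(idx, samples, n_time):
    grid = np.arange(n_time // 2 + 1)
    real = np.interp(grid, idx, samples.real)
    imag = np.interp(grid, idx, samples.imag)
    return np.fft.irfft(real + 1j * imag, n=n_time)

fs, n = 1000, 1024
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 50 * t) + 0.3 * np.sin(2 * np.pi * 120 * t)  # toy signal
idx, samples = lfe_compress(x)
x_hat = lfe_reconstruct(idx, samples, n)
print(f"kept {len(samples)} of {n // 2 + 1} spectral points")
```

With uniform spacing the reconstruction smears narrow spectral peaks; choosing the kept points near spectral maxima, as the paper does, is what makes the first-order approximation effective.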

  15. Increasing frequency and duration of Arctic winter warming events

    Science.gov (United States)

    Graham, Robert M.; Cohen, Lana; Petty, Alek A.; Boisvert, Linette N.; Rinke, Annette; Hudson, Stephen R.; Nicolaus, Marcel; Granskog, Mats A.

    2017-07-01

    Near-surface air temperatures close to 0°C were observed in situ over sea ice in the central Arctic during the last three winter seasons. Here we use in situ winter (December-March) temperature observations, such as those from Soviet North Pole drifting stations and ocean buoys, to determine how common Arctic winter warming events are. Observations of winter warming events exist over most of the Arctic Basin. Temperatures exceeding -5°C were observed during >30% of winters from 1954 to 2010 by North Pole drifting stations or ocean buoys. Using the ERA-Interim record (1979-2016), we show that the North Pole (NP) region typically experiences 10 warming events (T2m > -10°C) per winter, compared with only five in the Pacific Central Arctic (PCA). There is a positive trend in the overall duration of winter warming events for both the NP region (4.25 days/decade) and the PCA (1.16 days/decade), due to an increased number of events of longer duration. Plain Language Summary: During the last three winter seasons, extreme warming events were observed over sea ice in the central Arctic Ocean. Each of these warming events was associated with temperatures close to or above 0°C, which lasted for between 1 and 3 days. Typically, temperatures in the Arctic at this time of year are below -30°C. Here we study past temperature observations in the Arctic to investigate how common winter warming events are. We use temperature observations from expeditions such as Fram (1893-1896) and manned Soviet North Pole drifting ice stations from 1937 to 1991. These historic temperature records show that winter warming events have been observed over most of the Arctic Ocean. Despite a thin network of observation sites, wintertime temperatures above -5°C were directly observed approximately once every 3 years in the central Arctic Ocean between 1954 and 2010. Winter warming events are associated with storm systems originating in either the Atlantic or Pacific
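
The event definition used above (runs of days with T2m above a threshold) reduces to simple run-length counting; a toy illustration with synthetic daily temperatures, not the ERA-Interim data:

```python
# Count winter "warming events" as consecutive runs of days with T2m
# above a threshold, and report their durations. Data are invented.
def warming_events(temps, threshold=-10.0):
    events, run = [], 0
    for temp in temps:
        if temp > threshold:
            run += 1
        elif run:
            events.append(run)   # a run just ended
            run = 0
    if run:
        events.append(run)       # run extending to the end of the record
    return events

winter = [-25, -12, -8, -6, -20, -9, -30, -4, -2, -1, -15]  # synthetic days
ev = warming_events(winter)
print(len(ev), "events, total duration", sum(ev), "days")  # → 3 events, total duration 6 days
```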

  16. Filtered Sampling from Populations with Heterogeneous Event Frequencies

    OpenAIRE

    Alfred Blumstein; José A. Canela-Cacho; Jacqueline Cohen

    1993-01-01

    A hierarchical model is developed to account for selection biases that result from processes in which events have a fixed probability of being sampled, but individuals in the population generate events at varying rates. It is shown that inferences about the population parameters from such unrepresentative samples are not only possible but can be statistically powerful, provided the selection biases are adequately controlled for and the specification of the model is appropriate. The model assu...

  17. HYPOCENTER DISTRIBUTION OF LOW FREQUENCY EVENT AT PAPANDAYAN VOLCANO

    Directory of Open Access Journals (Sweden)

    Muhammad Mifta Hasan

    2016-10-01

    Papandayan volcano is a stratovolcano with an irregular cone shape and eight craters around the peak. The most active crater at Papandayan is the Mas crater. The distribution of relocated events calculated using the Geiger Adaptive Damping Algorithm (GAD) shows that the epicenters are centered below the Mas crater, with a maximum rms of 0.114, while the hypocenter depths range between 0-2 km and 5-6 km, due to the activity of steam and gas.

  18. Radio frequency based water level monitor

    African Journals Online (AJOL)

    eobe

    high frequency (20kHz to 200 kHz) acoustic waves that are reflected back to and detected by the emitting transducer [2-4]. In addition, optical interface method is another ..... The brain of the controlling section for this work is the. 40 pin PIC16F877A microcontroller. It processes the data received from the Receiver Section.

  19. ERPWAVELAB A toolbox for multi-channel analysis of time-frequency transformed event related potentials

    DEFF Research Database (Denmark)

    Mørup, Morten; Hansen, Lars Kai; Arnfred, Sidse M.

    2006-01-01

    The toolbox 'ERPWAVELAB' is developed for multi-channel time-frequency analysis of event related activity of EEG and MEG data. The toolbox provides tools for data analysis and visualization of the most commonly used measures of time-frequency transformed event related data as well as data...

  20. The Event-Related Low-Frequency Activity of Highly and Average Intelligent Children

    Science.gov (United States)

    Liu, Tongran; Shi, Jiannong; Zhao, Daheng; Yang, Jie

    2008-01-01

    Using time-frequency analysis techniques to investigate the event-related low-frequency (delta: 0.5-4 Hz; theta: 4-8 Hz) activity of auditory event-related potential (ERP) data from highly and average intelligent children, 18 intellectually gifted children and 18 intellectually average children participated in the present study. Present findings…

  1. Problems in event based engine control

    DEFF Research Database (Denmark)

    Hendricks, Elbert; Jensen, Michael; Chevalier, Alain Marie Roger

    1994-01-01

    Physically, a four cycle spark ignition engine operates on the basis of four engine processes or events: intake, compression, ignition (or expansion) and exhaust. These events each occupy approximately 180° of crank angle. In conventional engine controllers, it is an accepted practice to sample the engine variables synchronously with these events (or submultiples of them). Such engine controllers are often called event-based systems. Unfortunately, the main system noise (or disturbance) is also synchronous with the engine events: the engine pumping fluctuations. Since many electronic engine controllers are event-based, this creates problems for accurate air/fuel ratio control of a spark ignition (SI) engine.

  2. Continuous robust sound event classification using time-frequency features and deep learning.

    Science.gov (United States)

    McLoughlin, Ian; Zhang, Haomin; Xie, Zhipeng; Song, Yan; Xiao, Wei; Phan, Huy

    2017-01-01

    The automatic detection and recognition of sound events by computers is a requirement for a number of emerging sensing and human computer interaction technologies. Recent advances in this field have been achieved by machine learning classifiers working in conjunction with time-frequency feature representations. This combination has achieved excellent accuracy for classification of discrete sounds. The ability to recognise sounds under real-world noisy conditions, called robust sound event classification, is an especially challenging task that has attracted recent research attention. Another aspect of real-world conditions is the classification of continuous, occluded or overlapping sounds, rather than classification of short isolated sound recordings. This paper addresses the classification of noise-corrupted, occluded, overlapped, continuous sound recordings. It first proposes a standard evaluation task for such sounds based upon a common existing method for evaluating isolated sound classification. It then benchmarks several high-performing isolated sound classifiers to operate with continuous sound data by incorporating an energy-based event detection front end. Results are reported for each tested system using the new task, to provide the first analysis of their performance for continuous sound event detection. In addition it proposes and evaluates a novel Bayesian-inspired front end for the segmentation and detection of continuous sound recordings prior to classification.
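
A minimal sketch of an energy-based event-detection front end of the general kind described above: frame the signal, compute per-frame energy, and flag frames above a robust threshold for the downstream classifier. All parameters and data below are illustrative, not the paper's configuration.

```python
import numpy as np

# Energy-based front end: per-frame energy compared against a robust
# median + k*MAD threshold; flagged frames would be passed on to the
# classifier. Frame length, hop, k and the test signal are invented.
def detect_frames(x, frame_len=256, hop=128, k=8.0):
    starts = range(0, len(x) - frame_len, hop)
    energy = np.array([np.sum(x[i:i + frame_len] ** 2) for i in starts])
    med = np.median(energy)
    mad = np.median(np.abs(energy - med))
    return energy > med + k * mad          # boolean mask of "event" frames

rng = np.random.default_rng(1)
x = 0.01 * rng.standard_normal(8000)                       # background noise
x[3000:3500] += np.sin(2 * np.pi * 0.05 * np.arange(500))  # embedded event
mask = detect_frames(x)
print("event frames detected:", int(mask.sum()))
```

The median/MAD threshold adapts to the noise floor, which matters for the noise-corrupted recordings the paper targets.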

  3. Low-frequency ionospheric sounding with Narrow Bipolar Event lightning radio emissions: energy-reflectivity spectrum

    Directory of Open Access Journals (Sweden)

    A. R. Jacobson

    2008-06-01

    We analyze data on radio reflection from the D-region of the lower ionosphere, retrieving the energy-reflection coefficient in the frequency range ~5–95 kHz. The data are the same as those developed for a recent study of ionospheric-reflection height, and are based on recordings of powerful (multi-Gigawatt) radio emissions from a type of narrow (~10 μs) lightning discharge known as "Narrow Bipolar Events". The sequential appearance of first the groundwave signal, and then the ionospheric single-hop reflection signal, permits us to construct the energy-reflection ratio. We infer the energy reflection's statistical variation with solar zenith angle, angle of incidence, frequency, and propagation azimuth. There is also a marginally significant response of the energy reflectivity to solar X-ray flux density. Finally, we review the relationship of our results to previously published reports.

  5. Frequency Dependence of Single-Event Upset in Advanced Commercial PowerPC Microprocessors

    Science.gov (United States)

    Irom, Farokh; Farmanesh, Farhad F.; Swift, Gary M.; Johnston, Allen H.

    2004-01-01

    This paper examines single-event upsets in advanced commercial SOI microprocessors in a dynamic mode, studying the SEU sensitivity of General Purpose Registers (GPRs) as a function of clock frequency. Results are presented for SOI processors with feature sizes of 0.18 microns and two different core voltages. Single-event upset from heavy ions is measured for advanced commercial microprocessors in a dynamic mode with clock frequencies up to 1 GHz. The frequency and core voltage dependence of single-event upsets in registers is discussed.

  6. Analysis of core damage frequency from internal events: Peach Bottom, Unit 2

    Energy Technology Data Exchange (ETDEWEB)

    Kolaczkowski, A.M.; Lambright, J.A.; Ferrell, W.L.; Cathey, N.G.; Najafi, B.; Harper, F.T.

    1986-10-01

    This document contains the internal event initiated accident sequence analyses for Peach Bottom, Unit 2; one of the reference plants being examined as part of the NUREG-1150 effort by the Nuclear Regulatory Commission. NUREG-1150 will document the risk of a selected group of nuclear power plants. As part of that work, this report contains the overall core damage frequency estimate for Peach Bottom, Unit 2, and the accompanying plant damage state frequencies. Sensitivity and uncertainty analyses provided additional insights regarding the dominant contributors to the Peach Bottom core damage frequency estimate. The mean core damage frequency at Peach Bottom was calculated to be 8.2E-6. Station blackout type accidents (loss of all ac power) were found to dominate the overall results. Anticipated Transient Without Scram accidents were also found to be non-negligible contributors. The numerical results are largely driven by common mode failure probability estimates and to some extent, human error. Because of significant data and analysis uncertainties in these two areas (important, for instance, to the most dominant scenario in this study), it is recommended that the results of the uncertainty and sensitivity analyses be considered before any actions are taken based on this analysis.
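
The uncertainty analysis mentioned above can be illustrated with a toy Monte Carlo propagation: sample each accident-sequence frequency from a lognormal distribution and sum the draws into a core-damage-frequency distribution. The medians and error factors below are invented and are not the Peach Bottom numbers, nor is this the NUREG-1150 method in detail.

```python
import numpy as np

# Toy uncertainty propagation: lognormal draws for three hypothetical
# accident-sequence frequencies, summed into a CDF distribution.
rng = np.random.default_rng(42)
sequences = [(5e-6, 10.0), (2e-6, 5.0), (5e-7, 3.0)]  # (median /yr, error factor)

draws = np.zeros(100_000)
for median, ef in sequences:
    sigma = np.log(ef) / 1.645        # EF defined as 95th percentile / median
    draws += rng.lognormal(np.log(median), sigma, draws.size)

print(f"mean CDF:        {draws.mean():.2e} per year")
print(f"95th percentile: {np.quantile(draws, 0.95):.2e} per year")
```

Note how the mean sits well above the sum of the medians: skewed lognormal uncertainties pull the mean upward, which is one reason PRA results are usually reported with full uncertainty intervals rather than point values alone.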

  7. The Irish National Adverse Events Study (INAES): the frequency and nature of adverse events in Irish hospitals-a retrospective record review study.

    Science.gov (United States)

    Rafter, Natasha; Hickey, Anne; Conroy, Ronan M; Condell, Sarah; O'Connor, Paul; Vaughan, David; Walsh, Gillian; Williams, David J

    2017-02-01

    Irish healthcare has undergone extensive change recently with spending cuts and a focus on quality initiatives; however, little is known about adverse event occurrence. To assess the frequency and nature of adverse events in Irish hospitals. 1574 (53% women, mean age 54 years) randomly selected adult inpatient admissions from a sample of eight hospitals, stratified by region and size, across the Republic of Ireland in 2009 were reviewed using two-stage (nurse review of patient charts, followed by physician review of triggered charts) retrospective chart review with electronic data capture. Results were weighted to reflect the sampling strategy. The impact on adverse event rate of differing application of international adverse event criteria was also examined. 45% of charts were triggered. The prevalence of adverse events in admissions was 12.2% (95% CI 9.5% to 15.5%), with an incidence of 10.3 events per 100 admissions (95% CI 7.5 to 13.1). Over 70% of events were considered preventable. Two-thirds were rated as having a mild-to-moderate impact on the patient, 9.9% causing permanent impairment and 6.7% contributing to death. A mean of 6.1 added bed days was attributed to events, representing an expenditure of €5550 per event. The adverse event rate varied substantially (8.6%-17.0%) when applying different published adverse event eligibility criteria. This first study of adverse events in Ireland reports similar rates to other countries. In a time of austerity, adverse events in adult inpatients were estimated to cost over €194 million. These results provide important baseline data on the adverse event burden and, alongside web-based chart review, provide an incentive and methodology to monitor future patient-safety initiatives. Published by the BMJ Publishing Group Limited.

  8. A regional frequency analysis of extreme rainfall events over Southern France

    Science.gov (United States)

    Najib, K.; Neppel, L.; Tramblay, Y.

    2010-09-01

    Reliable estimates of extreme rainfall events are required for several hydrological purposes. However, the reliability of statistical inference tools based on Extreme Value Theory is poor when applied to short time series. These well-established statistical procedures are relevant only when applied to relatively long data records. Therefore, regional estimation methods that "trade space for time" by including several at-site data records in the frequency analysis are efficient tools to improve the reliability of extreme quantile estimates. Regional frequency analysis methods also allow for the estimation of extreme rainfall quantiles at sites with no data. However, all regionalization procedures require an extra step: the construction of homogeneous regions. For this critical step, an original neighbourhood-type approach which provides a specific statistically homogeneous region for each site of interest is proposed in the present study. Both the Hosking and Wallis heterogeneity measure, based on L-moment ratios, and the non-parametric Anderson and Darling homogeneity test, are applied therein. A pooling scheme is also proposed to avoid the effects of intersite correlation. This regionalization method, based on an index-value type procedure, is applied to extreme rainfall events of Southern France. This study uses 1219 daily rainfall stations belonging to the French Weather Forecast rain gauge network: 601 stations have more than 20 years of daily data and 222 stations more than 50 years (from 1950 to 2008). A calibration-validation procedure was performed to evaluate the descriptive and predictive accuracy and the robustness of this regionalization method. Finally, this study provides a comparison between local and regional estimation methods for mapping Southern France extreme rainfall events.
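The L-moment ratios at the heart of the Hosking and Wallis heterogeneity measure are straightforward to compute. The sketch below (illustrative, not the authors' code) derives the first two sample L-moments from probability-weighted moments, plus the L-CV ratio t = l2/l1 used in index-value regionalization:

```python
def sample_l_moments(data):
    """First two sample L-moments and the L-CV ratio t = l2/l1.

    Uses the standard unbiased probability-weighted-moment estimators.
    Illustrative only -- a full regional analysis would also use t3/t4
    and the simulation-based heterogeneity measure H.
    """
    x = sorted(data)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum((i / (n - 1)) * x[i] for i in range(n)) / n  # i = 0..n-1
    l1 = b0                  # L-location (mean)
    l2 = 2 * b1 - b0         # L-scale
    return l1, l2, l2 / l1   # l1, l2, L-CV


l1, l2, t = sample_l_moments([1, 2, 3, 4, 5])
print(l1, l2, round(t, 3))  # 3.0 1.0 0.333
```

Comparing the L-CV values of candidate sites against the region's weighted average is the basic ingredient of the homogeneity screening step.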

  9. Frequency and Variance of Communication Characteristics in Aviation Safety Events

    NARCIS (Netherlands)

    Karanikas, Nektarios; Kaspers, Steffen

    2017-01-01

    In the aviation sector, communication problems have contributed to 70-80% of safety occurrences. However, to date it has not been established which communication aspects have affected aviation safety most frequently. Based on the literature, we developed a tool which includes communication characteristics

  10. The flood event explorer - a web based framework for rapid flood event analysis

    Science.gov (United States)

    Schröter, Kai; Lüdtke, Stefan; Kreibich, Heidi; Merz, Bruno

    2015-04-01

    Flood disaster management, recovery and reconstruction planning benefit from rapid evaluations of flood events and expected impacts. The near real-time in-depth analysis of flood causes and key drivers for flood impacts requires close monitoring and documentation of hydro-meteorological and socio-economic factors. Within CEDIM's Rapid Flood Event Analysis project, a flood event analysis system is being developed which enables the near real-time evaluation of large-scale floods in Germany. The analysis system includes functionalities to compile event-related hydro-meteorological data, to evaluate the current flood situation, to assess hazard intensity and to estimate flood damage to residential buildings. A German flood event database is under development, which contains various hydro-meteorological information - in the future also impact information - for all large-scale floods since 1950. This database comprises data on historic flood events which allows the classification of ongoing floods in terms of triggering processes and pre-conditions, critical controls and drivers for flood losses. The flood event analysis system has been implemented in a database system which automatically retrieves and stores data from more than 100 online discharge gauges on a daily basis. The current discharge observations are evaluated in a long-term context in terms of flood frequency analysis. The web-based frontend visualizes the current flood situation in comparison to any past flood from the flood catalogue. The regional flood database for Germany contains hydro-meteorological data and aggregated severity indices for a set of 76 historic large-scale flood events in Germany. This database has been used to evaluate the key drivers for the flood in June 2013.
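The long-term-context evaluation of current discharge observations can be illustrated with a minimal flood frequency analysis. This sketch is not the project's code: it fits a Gumbel distribution to synthetic annual maximum discharges by the method of moments and converts a discharge into a return period:

```python
import math

def fit_gumbel(annual_maxima):
    """Method-of-moments Gumbel fit: location mu and scale beta."""
    n = len(annual_maxima)
    m = sum(annual_maxima) / n
    s = (sum((x - m) ** 2 for x in annual_maxima) / (n - 1)) ** 0.5
    beta = s * math.sqrt(6) / math.pi
    mu = m - 0.5772 * beta          # Euler-Mascheroni constant
    return mu, beta

def return_period(q, mu, beta):
    """T = 1 / (1 - F(q)), with F the Gumbel CDF."""
    f = math.exp(-math.exp(-(q - mu) / beta))
    return 1.0 / (1.0 - f)

am = [120, 150, 135, 160, 180, 140, 155, 170, 145, 165]  # m^3/s, synthetic
mu, beta = fit_gumbel(am)
print(round(return_period(sum(am) / len(am), mu, beta), 2))  # 2.33
```

The classical result that the mean annual flood has a return period of about 2.33 years under a Gumbel fit provides a quick sanity check on the implementation.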

  11. Human based roots of failures in nuclear events investigations

    Energy Technology Data Exchange (ETDEWEB)

    Ziedelis, Stanislovas; Noel, Marc; Strucic, Miodrag [Commission of the European Communities, Petten (Netherlands). European Clearinghouse on Operational Experience Feedback for Nuclear Power Plants

    2012-10-15

    This paper aims to improve the quality of event investigations in the nuclear industry through analysis of existing practices, identifying and removing existing Human and Organizational Factors (HOF) and management-related barriers. It presents the essential results of several studies performed by the European Clearinghouse on Operational Experience. The outcomes of the studies are based on a survey of current event investigation practices typical of the nuclear industry in 12 European countries, as well as on insights from analysis of numerous event investigation reports. The system of operational experience feedback based on event investigation results is not effective enough to prevent, or even to decrease the frequency of, recurring events, due to existing methodological, HOF-related and/or knowledge-management-related constraints. In addition, several latent root causes of unsuccessful event investigations are related to weaknesses in the safety culture of personnel and managers. These weaknesses include focus on costs or schedule, political manipulation, arrogance, ignorance, entitlement and/or autocracy. Upgrades in the safety culture of an organization's personnel, and especially of its senior management, appear to be an effective route to improvement. Increasing the competencies, capabilities and independence of event investigation teams, developing comprehensive software, and ensuring a positive approach, adequate support and impartiality from management could also improve the quality of event investigations. (orig.)

  12. Emitter frequency refinement based on maximum likelihood

    Science.gov (United States)

    Xu, Xin; Wang, Huijuan

    2015-07-01

    Frequency estimation via signal sorting is widely recognized as one of the most practical technologies in signal processing. However, the frequencies estimated via signal sorting may be inaccurate and biased due to signal fluctuation under different emitter working modes, transmitter circuit problems, environmental noise or unknown interference sources. It has therefore become important to further analyze and refine signal frequencies after signal sorting. To address this problem, we propose an iterative frequency refinement method based on maximum likelihood, in which the initial frequency estimates are refined iteratively. Experimental results indicate that the refined signal frequencies are more informative than the initial ones. As an additional advantage of our method, noise and interference sources can be filtered out simultaneously. Its efficiency and flexibility allow the method to be applied in a wide range of areas, e.g., communication, electronic reconnaissance and radar intelligence analysis.
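The paper's exact likelihood model is not given in the abstract, so the following sketch illustrates the general idea with an iteratively re-weighted estimator: starting from the plain mean, measurements with large residuals (noise spikes, interference) are progressively downweighted, which both refines the frequency estimate and effectively filters interference. The data values are hypothetical:

```python
def refine_frequency(measurements, iters=20, scale=1.0):
    """Iteratively re-weighted estimate of an emitter frequency.

    Starts from the plain mean and repeatedly downweights measurements
    with large residuals (Cauchy weights), so that noise spikes and
    interference contribute less on each pass. A sketch of the iterative
    refinement idea, not the paper's exact maximum-likelihood model.
    """
    f = sum(measurements) / len(measurements)  # initial estimate
    for _ in range(iters):
        w = [1.0 / (1.0 + ((m - f) / scale) ** 2) for m in measurements]
        f = sum(wi * mi for wi, mi in zip(w, measurements)) / sum(w)
    return f

freqs = [100.1, 99.9, 100.0, 100.2, 99.8, 250.0]   # MHz; last value is an interferer
print(round(refine_frequency(freqs), 2))
```

The plain mean of the list is pulled to 125 MHz by the interferer, while the refined estimate settles near the 100 MHz cluster.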

  13. Collision frequency locality-sensitive hashing for prediction of critical events.

    Science.gov (United States)

    Kim, Y Bryce; Hemberg, Erik; O'Reilly, Una-May

    2017-07-01

    We present a fast, efficient method to predict future critical events for a patient. The prediction method is based on retrieving and leveraging similar waveform trajectories from a large medical database. Locality-sensitive hashing (LSH), our theoretical foundation, is a model-free, sub-linear time, approximate search method enabling a fast retrieval of a nearest neighbor set for a given query. We propose a new variant of LSH, namely Collision Frequency LSH (CFLSH), to further improve the prediction accuracy without sacrificing any speed. The key idea is that the more frequently an element and a query collide across multiple LSH hash tables, the more similar they are. Unlike the standard LSH which only utilizes the linear distance calculation, in CFLSH, the short-listing step from a pool of pre-selected candidates filtered by hash functions to the final nearest neighbor set relies upon the frequency of collision along with distance information. We evaluate CFLSH versus the standard LSH using the L1 and cosine distances, for predicting acute hypotensive episodes on arterial blood pressure time series data extracted from the MIMIC II database. Our results show that CFLSH for the L1 distance has a higher prediction accuracy and further accelerates the sub-linear querying time obtained by the standard LSH.
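A minimal sketch of the idea, assuming random-hyperplane (cosine) LSH and toy 4-dimensional data: candidates are ranked first by how many hash tables they collide with the query in, and only then by distance, as CFLSH prescribes. This is an illustration, not the authors' implementation:

```python
import random

def build_tables(data, n_tables=8, n_planes=6, dim=4, seed=7):
    """Random-hyperplane (cosine) LSH: one signature per table."""
    rng = random.Random(seed)
    tables = []
    for _ in range(n_tables):
        planes = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n_planes)]
        table = {}
        for idx, v in enumerate(data):
            sig = tuple(sum(p[i] * v[i] for i in range(dim)) >= 0 for p in planes)
            table.setdefault(sig, []).append(idx)
        tables.append((planes, table))
    return tables

def cflsh_query(q, data, tables, k=1):
    """Rank candidates by collision frequency first, then by L1 distance."""
    hits = {}
    for planes, table in tables:
        sig = tuple(sum(p[i] * q[i] for i in range(len(q))) >= 0 for p in planes)
        for idx in table.get(sig, []):
            hits[idx] = hits.get(idx, 0) + 1
    def key(idx):
        d = sum(abs(a - b) for a, b in zip(q, data[idx]))
        return (-hits[idx], d)
    return sorted(hits, key=key)[:k]

data = [[1.0, 2.0, 3.0, 4.0], [4.0, 3.0, 2.0, 1.0],
        [1.0, 2.0, 3.0, 4.1], [10.0, 0.0, 0.0, 0.0]]
tables = build_tables(data)
print(cflsh_query([1.0, 2.0, 3.0, 4.0], data, tables, k=2))
```

A vector identical to the query collides in every table, so collision frequency alone already places it at the top of the candidate list before any distance computation.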

  14. Twitter data analysis: temporal and term frequency analysis with real-time event

    Science.gov (United States)

    Yadav, Garima; Joshi, Mansi; Sasikala, R.

    2017-11-01

    Over the past few years, the World Wide Web (www) has become a prominent and huge source of user-generated content and opinionated data. Among the various social media platforms, Twitter has gained popularity as it offers a fast and effective way of sharing users' perspectives on critical and other issues in different domains, such as the 'Political', 'Entertainment' and 'Business' domains. As the data are generated at huge scale in the cloud, this has opened doors for researchers in the field of data science and analysis. Twitter provides several APIs for developers: 1) the Search API, which focuses on older tweets; 2) the REST API, which focuses on user details and allows collection of user profiles, friends and followers; 3) the Streaming API, which collects details such as tweets, hashtags and geolocations. In our work we access the Streaming API in order to fetch real-time tweets for a dynamic ongoing event. For this we focus on the 'Entertainment' domain, especially 'Sports', as IPL-T20 was the trending ongoing event. We collect this large volume of tweets and store them in a MongoDB database, where the tweets are stored in JSON document format. On these documents we perform time-series analysis and term frequency analysis using techniques such as filtering and information extraction for text mining, which fulfils our objective of finding interesting moments in the temporal data of the event and ranking players or teams by popularity, helping people understand key influencers on the social media platform.
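Streaming ingestion and MongoDB storage aside, the two analyses can be sketched on a handful of already-fetched tweets; the tweet texts, timestamps and hashtag below are invented:

```python
from collections import Counter
from datetime import datetime

STOP = {"the", "a", "is", "in", "of", "to", "and", "for"}

def term_frequencies(tweets):
    """Term-frequency count over tweet texts, with stopwords removed."""
    counts = Counter()
    for _, text in tweets:
        counts.update(w for w in text.lower().split() if w not in STOP)
    return counts

def tweets_per_hour(tweets):
    """Temporal analysis: bucket tweet volume by hour."""
    return Counter(ts.strftime("%Y-%m-%d %H:00") for ts, _ in tweets)

tweets = [
    (datetime(2017, 5, 1, 19, 5), "What a catch! #IPL"),
    (datetime(2017, 5, 1, 19, 40), "Huge six in the #IPL tonight"),
    (datetime(2017, 5, 1, 20, 2), "#IPL final over drama"),
]
print(term_frequencies(tweets)["#ipl"])             # 3
print(tweets_per_hour(tweets)["2017-05-01 19:00"])  # 2
```

Spikes in the hourly volume flag "interesting moments" in the event, while the term counts give a crude popularity ranking of players or teams.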

  15. Low frequency events at Mt. Etna: some problems and open questions

    Directory of Open Access Journals (Sweden)

    S. Gresta

    1996-06-01

    A short-period seismic array installed at Mt. Etna, symmetrically with respect to the fracture of the 1991-1993 eruption, allowed an analysis of the low-frequency events which occurred in the first phase of that eruption. We recorded about 50 events, 19 of which belong to a single family. They show very low amplitude values and spectral peaks ranging from 0.5 to 4.5 Hz. The existence of this family of events shows that the process driving the dynamics of the fluid in the volcano is often the same.

  16. Heavy rainfall events in the Languedoc region (France): relationships with synoptic patterns and frequency analysis

    Science.gov (United States)

    Tramblay, Y.; Neppel, L.; Najib, K.

    2010-09-01

    High intensity rainfall events often occur in the south of the Cevennes mountainous region (France), leading to catastrophic flash floods which are the main destructive natural hazard in this region. A good knowledge of these extreme events is necessary to better predict their occurrence, in particular for forecasting and to produce future scenarios. The goal of this study is to analyze the synoptic circulation patterns associated with heavy rainfall events, in order to describe their magnitude and frequency. 45 meteorological stations with 50 years of daily records (1958-2008) constituted the database for this project. A regional sample with 24-hour rainfall events exceeding a threshold of 80 mm was built, including a total of 455 events. A regional sampling was chosen in order to avoid the spatial and temporal correlations between the records. Most of the heavy rainfall events (75%) were observed during the months of September to December, associated with the highest rainfall intensity. The relationships between synoptic weather patterns and extreme rainfall events were analyzed using a classification of daily atmospheric circulation. The Western Mediterranean Oscillation Index (WeMOi), the Mediterranean Oscillation Index (MOI) and the sea-surface temperatures (SST) were also considered. Results indicated a positive trend in the magnitude of rainfall events in fall and winter seasons during the period 1958-2008 while a similar upward trend is observed for the coastal SST in the north-west of the Mediterranean Sea. The number of threshold exceedances per year is related to the annual occurrence of a south-eastern circulation pattern. There is also a clear association of the heavy rainfall events with negative values of the WeMOI and MOI indices. Finally, a non-stationary model using a Generalized Pareto distribution with climatic co-variables is proposed to model the frequency of occurrence and the magnitude of extreme rainfall events in the fall season.
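The stationary building block of such a model, a Generalized Pareto fit to threshold excesses, can be sketched with simple method-of-moments estimators. The paper's non-stationary model additionally ties the parameters to climatic covariates; this sketch omits that:

```python
import statistics

def fit_gpd_mom(excesses):
    """Method-of-moments fit of a Generalized Pareto distribution
    to threshold excesses, returning shape xi and scale sigma.

    For a GPD, mean = sigma/(1-xi) and var = sigma^2/((1-xi)^2 (1-2xi)),
    which inverts to the estimators below. A sketch of the stationary
    case only (valid for xi < 1/2, where the moments exist).
    """
    m = statistics.mean(excesses)
    v = statistics.variance(excesses)
    xi = 0.5 * (1.0 - m * m / v)
    sigma = 0.5 * m * (m * m / v + 1.0)
    return xi, sigma

xi, sigma = fit_gpd_mom([1, 2, 3, 4, 5])  # toy excesses over an 80 mm threshold
print(round(xi, 3), round(sigma, 3))      # -1.3 6.9
```

As a consistency check, exponential-like excesses (sample mean equal to sample standard deviation) give xi near zero, recovering the exponential tail as a special case.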

  17. Frequency of extreme weather events and increased risk of motor vehicle collision in Maryland.

    Science.gov (United States)

    Liu, Ann; Soneja, Sutyajeet I; Jiang, Chengsheng; Huang, Chanjuan; Kerns, Timothy; Beck, Kenneth; Mitchell, Clifford; Sapkota, Amir

    2017-02-15

    Previous studies have shown increased precipitation to be associated with higher frequency of traffic collisions. However, data regarding how extreme weather events, projected to grow in frequency, intensity, and duration in response to a changing climate, might affect the risk of motor vehicle collisions is particularly limited. We investigated the association between frequency of extreme heat and precipitation events and risk of motor vehicle collision in Maryland between 2000 and 2012. Motor vehicle collision data was obtained from the Maryland Automated Accident Reporting System. Each observation in the data set corresponded to a unique collision event. This data was linked to extreme heat and precipitation events that were calculated using location and calendar day specific thresholds. A time-stratified case-crossover analysis was utilized to assess the association between exposure to extreme heat and precipitation events and risk of motor vehicle collision. Additional stratified analyses examined risk by road condition, season, and collisions involving only one vehicle. Overall, there were over 1.28 million motor vehicle collisions recorded in Maryland between 2000 and 2012, of which 461,009 involved injuries or death. There was a 23% increase in risk of collision for every 1-day increase in extreme precipitation event (Odds Ratios (OR) 1.23, 95% Confidence Interval (CI): 1.22, 1.27). This risk was considerably higher for collisions on roads with a defect or obstruction (OR: 1.46, 95% CI: 1.40, 1.52) and those involving a single vehicle (OR: 1.41, 95% CI: 1.39, 1.43). Change in risk associated with extreme heat events was marginal at best. Extreme precipitation events are associated with an increased risk of motor vehicle collisions in Maryland. Copyright © 2016 Elsevier B.V. All rights reserved.
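The study used a time-stratified case-crossover design with conditional logistic regression; the simplest matched analogue, a 1:1 matched-pair odds ratio with a Wald interval, can be sketched as follows. The discordant-pair counts are invented purely to reproduce an OR of 1.23:

```python
import math

def matched_pair_or(n10, n01, z=1.96):
    """Matched-pair odds ratio for a 1:1 case-crossover analysis.

    n10: pairs exposed on the case day only; n01: pairs exposed on the
    control day only (concordant pairs drop out of the estimate).
    Returns the OR with a Wald confidence interval on the log scale.
    A simplified stand-in for the study's time-stratified design with
    multiple control days and conditional logistic regression.
    """
    or_ = n10 / n01
    se = math.sqrt(1.0 / n10 + 1.0 / n01)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

print(matched_pair_or(246, 200))  # hypothetical counts -> OR = 1.23
```

Only discordant pairs (exposure differs between case and control day) carry information in this design, which is why the concordant counts never appear in the formula.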

  18. Use of hazardous event frequency to evaluate safety integrity level of subsea blowout preventer

    Directory of Open Access Journals (Sweden)

    Soyeon Chung

    2016-05-01

    Generally, the Safety Integrity Level (SIL) of a subsea Blowout Preventer (BOP) is evaluated by determining the Probability of Failure on Demand (PFD), a low-demand-mode evaluation indicator. However, some SIL results lie above the PFD's effective area even though the subsea BOP's demand rate is within the PFD's effective range. Determining a Hazardous Event Frequency (HEF) that can cover all demand rates could therefore be useful when establishing the effective BOP SIL. This study focuses on subsea BOP functions that follow guideline 070 of Norwegian Oil and Gas. Events that control subsea well kicks are defined. The HEF of each BOP function is analyzed and compared with the PFD by investigating the frequency of each event and the demand rate for the components. In addition, risk control options related to PFD and HEF improvements are compared, and the effectiveness of HEF for SIL verification of subsea BOPs is assessed.
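For a single low-demand channel, the relationship between PFD, demand rate and HEF can be sketched as below. This is the textbook IEC 61508-style simplification, not the guideline 070 methodology, which also accounts for voting architecture and common-cause failures:

```python
def pfd_avg(lambda_du, proof_interval_h):
    """Average probability of failure on demand for a single channel
    with dangerous undetected failure rate lambda_du (per hour)."""
    return lambda_du * proof_interval_h / 2.0

def hazardous_event_frequency(demand_rate_per_yr, pfd):
    """HEF = demand frequency x PFD (low-demand approximation)."""
    return demand_rate_per_yr * pfd

def sil_from_pfd(pfd):
    """IEC 61508 low-demand SIL bands for PFDavg."""
    for sil, lo, hi in [(4, 1e-5, 1e-4), (3, 1e-4, 1e-3),
                        (2, 1e-3, 1e-2), (1, 1e-2, 1e-1)]:
        if lo <= pfd < hi:
            return sil
    return 0

pfd = pfd_avg(1e-6, 8760)  # hypothetical failure rate, annual proof test
print(f"PFD={pfd:.2e} -> SIL {sil_from_pfd(pfd)}")  # PFD=4.38e-03 -> SIL 2
```

The appeal of HEF noted in the abstract is visible here: PFD says nothing about how often the BOP is actually demanded, whereas HEF folds the demand rate into a single frequency of hazardous events.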

  19. Component-based event composition modeling for CPS

    Science.gov (United States)

    Yin, Zhonghai; Chu, Yanan

    2017-06-01

    In order to combine an event-driven model with component-based architecture design, this paper proposes a component-based event composition model to realize CPS event processing. Firstly, formal representations of components and attribute-oriented events are defined. Every component consists of subcomponents and the corresponding event sets. The attribute "type" is added to the attribute-oriented event definition so as to describe the responsiveness to the component. Secondly, the component-based event composition model is constructed. A concept-lattice-based event algebra system is built to describe the relations between events, and the rules for drawing the Hasse diagram are discussed. Thirdly, as there are redundancies among composite events, two simplification methods are proposed. Finally, a communication-based train control system is simulated to verify the event composition model. Results show that the event composition model we have constructed can express composite events correctly and effectively.
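The paper's concept-lattice event algebra is not reproduced here, but generic composition operators over an event stream can be sketched; the event names in this train-control-flavored example are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Event:
    etype: str      # the "type" attribute: responsiveness to the component
    source: str     # emitting (sub)component
    time: float

def seq(stream, a, b):
    """Composite event SEQ(a, b): an a occurs, then a b occurs later."""
    ta = [e.time for e in stream if e.etype == a]
    return any(e.etype == b and e.time > min(ta) for e in stream) if ta else False

def conj(stream, a, b):
    """Composite event AND(a, b): both occur, in any order."""
    types = {e.etype for e in stream}
    return a in types and b in types

stream = [Event("brake_cmd", "onboard", 1.0),
          Event("brake_ack", "trackside", 1.4)]
print(seq(stream, "brake_cmd", "brake_ack"),
      conj(stream, "brake_ack", "brake_cmd"))  # True True
```

Sequence and conjunction are only two of the usual composition operators; the paper's lattice additionally captures ordering and redundancy relations between such composites.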

  20. Financial system loss as an example of high consequence, high frequency events

    Energy Technology Data Exchange (ETDEWEB)

    McGovern, D.E.

    1996-07-01

    Much work has been devoted to high consequence events with low frequency of occurrence. Characteristic of these events are bridge failure (such as that of the Tacoma Narrows), building failure (such as the collapse of a walkway at a Kansas City hotel), or compromise of a major chemical containment system (such as at Bhopal, India). Such events, although rare, have an extreme personal, societal, and financial impact. An interesting variation is demonstrated by financial losses due to fraud and abuse in the money management system. The impact can be huge, entailing very high aggregate costs, but these are a result of the contribution of many small attacks and not the result of a single (or few) massive events. Public awareness is raised through publicized events such as the junk bond fraud perpetrated by Milken or gross mismanagement in the failure of the Barings Bank through unsupervised trading activities by Leeson in Singapore. These events, although seemingly large (financial losses may be on the order of several billion dollars), are but small contributors to the estimated $114 billion loss to all types of financial fraud in 1993. This paper explores the magnitude of financial system losses and identifies new areas for analysis of high consequence events including the potential effect of malevolent intent.

  1. Event Recognition Based on Deep Learning in Chinese Texts.

    Directory of Open Access Journals (Sweden)

    Yajun Zhang

    Event recognition is the most fundamental and critical task in event-based natural language processing systems. Existing event recognition methods based on rules and shallow neural networks have certain limitations. For example, extracting features using rule-based methods is difficult, and methods based on shallow neural networks converge too quickly to a local minimum, resulting in low recognition precision. To address these problems, we propose the Chinese emergency event recognition model based on deep learning (CEERM). Firstly, we use a word segmentation system to segment sentences. According to event elements labeled in the CEC 2.0 corpus, we classify words into five categories: trigger words, participants, objects, time and location. Each word is vectorized according to the following six feature layers: part of speech, dependency grammar, length, location, distance between trigger word and core word, and trigger word frequency. We obtain deep semantic features of words by training a feature vector set using a deep belief network (DBN), then analyze those features in order to identify trigger words by means of a back-propagation neural network. Extensive testing shows that the CEERM achieves excellent recognition performance, with a maximum F-measure value of 85.17%. Moreover, we propose the dynamic-supervised DBN, which adds supervised fine-tuning to a restricted Boltzmann machine layer by monitoring its training performance. Test analysis reveals that the new DBN improves recognition performance and effectively controls the training time. Although the F-measure increases to 88.11%, the training time increases by only 25.35%.
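As a stand-in for the six feature layers and the DBN (which require a full NLP pipeline), here is a sketch of only the shallow part of the idea: turning each word into a small feature vector. Only the length, position and frequency layers are approximated; the POS, dependency and trigger-distance layers are omitted, and the example sentence is invented:

```python
from collections import Counter

def word_features(words):
    """Per-word feature vectors: (length, normalized position, relative
    corpus frequency). A simplified stand-in for three of the paper's
    six feature layers; the remaining layers need an NLP pipeline."""
    freq = Counter(words)
    n = len(words)
    return [(len(w), i / (n - 1), freq[w] / n) for i, w in enumerate(words)]

words = ["earthquake", "hit", "the", "city", "the", "night"]
for w, f in zip(words, word_features(words)):
    print(w, [round(x, 2) for x in f])
```

In the full model, vectors like these are stacked per sentence and fed to the DBN, whose deep features then drive the trigger-word classifier.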

  2. PTSD symptoms among police officers: associations with frequency, recency, and types of traumatic events.

    Science.gov (United States)

    Hartley, Tara A; Violanti, John M; Sarkisian, Khachatur; Andrew, Michael E; Burchfiel, Cecil M

    2013-01-01

    Policing necessitates exposure to traumatic, violent and horrific events, which can lead to an increased risk for developing post-traumatic stress disorder (PTSD). The purpose of this study was to determine whether the frequency, recency, and type of police-specific traumatic events were associated with PTSD symptoms. Participants were 359 police officers from the Buffalo Cardio-Metabolic Occupational Police Stress (BCOPS) Study (2004-2009). Traumatic police events were measured using the Police Incident Survey (PIS); PTSD was measured using the PTSD Checklist-Civilian Version (PCL-C). Associations between PIS and PTSD symptoms were evaluated using ANCOVA. Contrast statements were used to test for linear trends. Increased frequency of specific types of events was associated with an increase in the PCL-C score in women, particularly women with no history of prior trauma and those who reported having a high workload. The frequency of traumatic events was associated with higher PTSD scores in women, while the recency of seeing victims of assault was associated with higher PTSD scores in men. These results may be helpful in developing intervention strategies to reduce the psychological effects following exposure, and these strategies may be different for men and women.

  3. Measurement of $B^0$ Mixing Frequency Using a New Probability Based Self-Tagging Algorithm Applied to Inclusive Lepton Events from $p\\bar{p}$ Collisions at $\\sqrt{s}$ = 1.8-TeV

    Energy Technology Data Exchange (ETDEWEB)

    Shah, Tushar [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)

    2000-07-01

    We present a measurement of the Bd mixing frequency performed in an inclusive lepton sample, B → l+X. A secondary vertex identifies a B meson decay, and a high-pT lepton determines the flavor at the time of decay.

  4. Measurement of the B0-B̄0 oscillation frequency with inclusive dilepton events.

    Science.gov (United States)

    Aubert, B; Boutigny, D; Gaillard, J-M; Hicheur, A; Karyotakis, Y; Lees, J P; Robbe, P; Tisserand, V; Palano, A; Pompili, A; Chen, G P; Chen, J C; Qi, N D; Rong, G; Wang, P; Zhu, Y S; Eigen, G; Stugu, B; Abrams, G S; Borgland, A W; Breon, A B; Brown, D N; Button-Shafer, J; Cahn, R N; Clark, A R; Gill, M S; Gritsan, A V; Groysman, Y; Jacobsen, R G; Kadel, R W; Kadyk, J; Kerth, L T; Kolomensky, Yu G; Kral, J F; LeClerc, C; Levi, M E; Lynch, G; Oddone, P J; Pripstein, M; Roe, N A; Romosan, A; Ronan, M T; Shelkov, V G; Telnov, A V; Wenzel, W A; Bright-Thomas, P G; Harrison, T J; Hawkes, C M; Knowles, D J; O'Neale, S W; Penny, R C; Watson, A T; Watson, N K; Deppermann, T; Goetzen, K; Koch, H; Kunze, M; Lewandowski, B; Peters, K; Schmuecker, H; Steinke, M; Barlow, N R; Bhimji, W; Chevalier, N; Clark, P J; Cottingham, W N; Foster, B; Mackay, C; Wilson, F F; Abe, K; Hearty, C; Mattison, T S; McKenna, J A; Thiessen, D; Jolly, S; McKemey, A K; Blinov, V E; Bukin, A D; Bukin, D A; Buzykaev, A R; Golubev, V B; Ivanchenko, V N; Korol, A A; Kravchenko, E A; Onuchin, A P; Serednyakov, S I; Skovpen, Yu I; Telnov, V I; Yushkov, A N; Best, D; Chao, M; Kirkby, D; Lankford, A J; Mandelkern, M; McMahon, S; Stoker, D P; Arisaka, K; Buchanan, C; Chun, S; MacFarlane, D B; Prell, S; Rahatlou, Sh; Raven, G; Sharma, V; Campagnari, C; Dahmes, B; Hart, P A; Kuznetsova, N; Levy, S L; Long, O; Lu, A; Richman, J D; Verkerke, W; Beringer, J; Eisner, A M; Grothe, M; Heusch, C A; Lockman, W S; Pulliam, T; Schalk, T; Schmitz, R E; Schumm, B A; Seiden, A; Turri, M; Walkowiak, W; Williams, D C; Wilson, M G; Chen, E; Dubois-Felsmann, G P; Dvoretskii, A; Hitlin, D G; Metzler, S; Oyang, J; Porter, F C; Ryd, A; Samuel, A; Weaver, M; Yang, S; Zhu, R Y; Devmal, S; Geld, T L; Jayatilleke, S; Mancinelli, G; Meadows, B T; Sokoloff, M D; Barillari, T; Bloom, P; Dima, M O; Ford, W T; Nauenberg, U; Olivas, A; Rankin, P; Roy, J; Smith, J G; Van Hoek, W C; Blouw, J; Harton, J L; Krishnamurthy, M; Soffer, A; Toki, 
W H; Wilson, R J; Zhang, J; Brandt, T; Brose, J; Colberg, T; Dickopp, M; Dubitzky, R S; Hauke, A; Maly, E; Müller-Pfefferkorn, R; Otto, S; Schubert, K R; Schwierz, R; Spaan, B; Wilden, L; Bernard, D; Bonneaud, G R; Brochard, F; Cohen-Tanugi, J; Ferrag, S; T'Jampens, S; Thiebaux, Ch; Vasileiadis, G; Verderi, M; Anjomshoaa, A; Bernet, R; Khan, A; Lavin, D; Muheim, F; Playfer, S; Swain, J E; Tinslay, J; Falbo, M; Borean, C; Bozzi, C; Dittongo, S; Piemontese, L; Treadwell, E; Anulli, F; Baldini-Ferroli, R; Calcaterra, A; De Sangro, R; Falciai, D; Finocchiaro, G; Patteri, P; Peruzzi, I M; Piccolo, M; Xie, Y; Zallo, A; Bagnasco, S; Buzzo, A; Contri, R; Crosetti, G; Lo Vetere, M; Macri, M; Monge, M R; Passaggio, S; Pastore, F C; Patrignani, C; Pia, M G; Robutti, E; Santroni, A; Tosi, S; Morii, M; Bartoldus, R; Hamilton, R; Mallik, U; Cochran, J; Crawley, H B; Fischer, P-A; Lamsa, J; Meyer, W T; Rosenberg, E I; Grosdidier, G; Hast, C; Höcker, A; Lacker, H M; Laplace, S; Lepeltier, V; Lutz, A M; Plaszczynski, S; Schune, M H; Trincaz-Duvoid, S; Wormser, G; Bionta, R M; Brigljević, V; Lange, D J; Mugge, M; Van Bibber, K; Wright, D M; Bevan, A J; Fry, J R; Gabathuler, E; Gamet, R; George, M; Kay, M; Payne, D J; Sloane, R J; Touramanis, C; Aspinwall, M L; Bowerman, D A; Dauncey, P D; Egede, U; Eschrich, I; Gunawardane, N J W; Nash, J A; Sanders, P; Smith, D; Azzopardi, D E; Back, J J; Bellodi, G; Dixon, P; Harrison, P F; Potter, R J L; Shorthouse, H W; Strother, P; Vidal, P B; Cowan, G; George, S; Green, M G; Kurup, A; Marker, C E; McGrath, P; McMahon, T R; Ricciardi, S; Salvatore, F; Vaitsas, G; Brown, D; Davis, C L; Allison, J; Barlow, R J; Boyd, J T; Forti, A C; Fullwood, J; Jackson, F; Lafferty, G D; Savvas, N; Weatherall, J H; Williams, J C; Farbin, A; Jawahery, A; Lillard, V; Olsen, J; Roberts, D A; Schieck, J R; Blaylock, G; Dallapiccola, C; Flood, K T; Hertzbach, S S; Kofler, R; Koptchev, V B; Moore, T B; Staengle, H; Willocq, S; Brau, B; Cowan, R; Sciolla, G; Taylor, 
F; Yamamoto, R K; Milek, M; Patel, P M; Palombo, F; Bauer, J M; Cremaldi, L; Eschenburg, V; Kroeger, R; Reidy, J; Sanders, D A; Summers, D J; Nief, J Y; Taras, P; Nicholson, H; Cartaro, C; Cavallo, N; De Nardo, G; Fabozzi, F; Gatto, C; Lista, L; Paolucci, P; Piccolo, D; Sciacca, C; LoSecco, J M; Alsmiller, J R G; Gabriel, T A; Brau, J; Frey, R; Grauges, E; Iwasaki, M; Sinev, N B; Strom, D; Colecchia, F; Dal Corso, F; Dorigo, A; Galeazzi, F; Margoni, M; Michelon, G; Morandin, M; Posocco, M; Rotondo, M; Simonetto, F; Stroili, R; Torassa, E; Voci, C; Benayoun, M; Briand, H; Chauveau, J; David, P; De Vaissière, Ch; Del Buono, L; Hamon, O; Le Diberder, F; Leruste, Ph; Ocariz, J; Roos, L; Stark, J; Manfredi, P F; Re, V; Speziali, V; Frank, E D; Gladney, L; Guo, Q H; Panetta, J; Angelini, C; Batignani, G; Bettarini, S; Bondioli, M; Bucci, F; Campagna, E; Carpinelli, M; Forti, F; Giorgi, M A; Lusiani, A; Marchiori, G; Martinez-Vidal, F; Morganti, M; Neri, N; Paoloni, E; Rama, M; Rizzo, G; Sandrelli, F; Simi, G; Triggiani, G; Walsh, J; Haire, M; Judd, D; Paick, K; Turnbull, L; Wagoner, D E; Albert, J; Elmer, P; Lu, C; Miftakov, V; Schaffner, S F; Smith, A J S; Tumanov, A; Varnes, E W; Cavoto, G; Del Re, D; Faccini, R; Ferrarotto, F; Ferroni, F; Lamanna, E; Mazzoni, M A; Morganti, S; Piredda, G; Safai Tehrani, F; Serra, M; Voena, C; Christ, S; Waldi, R; Adye, T; De Groot, N; Franek, B; Geddes, N I; Gopal, G P; Xella, S M; Aleksan, R; Emery, S; Gaidot, A; Ganzhur, S F; Giraud, P-F; Hamel Monchenault, G; Kozanecki, W; Langer, M; London, G W; Mayer, B; Serfass, B; Vasseur, G; Yèche, Ch; Zito, M; Purohit, M V; Singh, H; Weidemann, A W; Yumiceva, F X; Adam, I; Aston, D; Berger, N; Boyarski, A M; Calderini, G; Convery, M R; Coupal, D P; Dong, D; Dorfan, J; Dunwoodie, W; Field, R C; Glanzman, T; Gowdy, S J; Haas, T; Himel, T; Hryn'ova, T; Huffer, M E; Innes, W R; Jessop, C P; Kelsey, M H; Kim, P; Kocian, M L; Langenegger, U; Leith, D W G S; Luitz, S; Luth, V; Lynch, H L; Marsiske, 
H; Menke, S; Messner, R; Muller, D R; O'Grady, C P; Ozcan, V E; Perazzo, A; Perl, M; Petrak, S; Quinn, H; Ratcliff, B N; Robertson, S H; Roodman, A; Salnikov, A A; Schietinger, T; Schindler, R H; Schwiening, J; Snyder, A; Soha, A; Spanier, S M; Stelzer, J; Su, D; Sullivan, M K; Tanaka, H A; Va'vra, J; Wagner, S R; Weinstein, A J R; Wisniewski, W J; Wright, D H; Young, C C; Burchat, P R; Cheng, C H; Meyer, T I; Roat, C; Henderson, R; Bugg, W; Cohn, H; Izen, J M; Kitayama, I; Lou, X C; Bianchi, F; Bona, M; Gamba, D; Bosisio, L; Della Ricca, G; Lanceri, L; Poropat, P; Vuagnin, G; Panvini, R S; Brown, C M; Jackson, P D; Kowalewski, R; Roney, J M; Band, H R; Charles, E; Dasu, S; Eichenbaum, A M; Hu, H; Johnson, J R; Liu, R; Di Lodovico, F; Pan, Y; Prepost, R; Scott, I J; Sekula, S J; Von Wimmersperg-Toeller, J H; Wu, S L; Yu, Z; Kordich, T M B; Neal, H

    2002-06-03

    The B0-B̄0 oscillation frequency has been measured with a sample of 23 × 10⁶ BB̄ pairs collected with the BABAR detector at the PEP-II asymmetric B Factory at SLAC. In this sample, we select events in which both B mesons decay semileptonically and use the charge of the leptons to identify the flavor of each B meson. A simultaneous fit to the decay time difference distributions for opposite- and same-sign dilepton events gives Δmd = 0.493 ± 0.012 (stat) ± 0.009 (syst) ps⁻¹.

  5. High-frequency monitoring of catchment nutrient exports reveals highly variable storm event responses and dynamic source zone activation

    Science.gov (United States)

    Blaen, Phillip J.; Khamis, Kieran; Lloyd, Charlotte; Comer-Warner, Sophie; Ciocca, Francesco; Thomas, Rick M.; MacKenzie, A. Rob; Krause, Stefan

    2017-09-01

    Storm events can drive highly variable behavior in catchment nutrient and water fluxes, yet short-term event dynamics are frequently missed by low-resolution sampling regimes. In addition, nutrient source zone contributions can vary significantly within and between storm events. Our inability to identify and characterize time-dynamic source zone contributions severely hampers the adequate design of land use management practices in order to control nutrient exports from agricultural landscapes. Here we utilize an 8 month high-frequency (hourly) time series of streamflow, nitrate (NO3-N), dissolved organic carbon (DOC), and hydroclimatic variables for a headwater agricultural catchment. We identified 29 distinct storm events across the monitoring period. These events represented 31% of the time series and contributed disproportionately to nutrient loads (42% of NO3-N and 43% of DOC) relative to their duration. Regression analysis identified a small subset of hydroclimatological variables (notably precipitation intensity and antecedent conditions) as key drivers of nutrient dynamics during storm events. Hysteresis analysis of nutrient concentration-discharge relationships highlighted the dynamic activation of discrete NO3-N and DOC source zones, which varied on an event-specific basis. Our results highlight the benefits of high-frequency in situ monitoring for characterizing short-term nutrient fluxes and unraveling connections between hydroclimatological variability and river nutrient export and source zone activation under extreme flow conditions. These new process-based insights, which we summarize in a conceptual model, are fundamental to underpinning targeted management measures to reduce nutrient loading of surface waters.
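The hysteresis analysis of concentration-discharge relationships can be sketched with a simplified Lloyd-style index: compare rising- and falling-limb concentrations at a normalized discharge level. The storm series below is synthetic; a positive value indicates a clockwise loop (rising limb richer), pointing to a rapidly mobilized source zone:

```python
def hysteresis_index(q, c, level=0.5):
    """Rising- vs falling-limb concentration at a normalized discharge
    level, scaled by the peak concentration (a simplified hysteresis
    index; published variants average over several levels)."""
    peak = q.index(max(q))
    qn = [(x - min(q)) / (max(q) - min(q)) for x in q]

    def interp(idx_range):
        pts = sorted((qn[i], c[i]) for i in idx_range)
        # linear interpolation of c at the requested normalized q
        for (q0, c0), (q1, c1) in zip(pts, pts[1:]):
            if q0 <= level <= q1:
                return c0 + (c1 - c0) * (level - q0) / (q1 - q0) if q1 > q0 else c0
        return pts[-1][1]

    c_rise = interp(range(peak + 1))
    c_fall = interp(range(peak, len(q)))
    return (c_rise - c_fall) / max(c)

q = [1, 3, 5, 7, 9, 7, 5, 3, 1]   # synthetic storm hydrograph (m^3/s)
c = [2, 6, 8, 9, 9, 5, 4, 3, 2]   # clockwise loop: rising limb richer (mg/l)
print(round(hysteresis_index(q, c), 3))  # 0.444
```

Computed per storm for NO3-N and DOC separately, event-to-event changes in the sign and size of this index are one way to diagnose the dynamic source-zone activation described above.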

  6. High frequency image-based flow detection

    Energy Technology Data Exchange (ETDEWEB)

    Chung, R [National Heart and Lung Institute, Royal Brompton Hospital, London SW3 6NP (United Kingdom); Prager, R W [Dept. of Engineering, University of Cambridge, Cambridge CB2 1PZ (United Kingdom); Gee, A H [Dept. of Engineering, University of Cambridge, Cambridge CB2 1PZ (United Kingdom); Treece, G M [Dept. of Engineering, University of Cambridge, Cambridge CB2 1PZ (United Kingdom)

    2004-01-01

    Tumour angiogenesis refers to neovascular development on a microvascular scale and is an early indicator of cancer. Prototype high frequency pulsed Doppler systems using 50 MHz transducers have been reported to detect microvascular flow in vessels 0.02 mm to 0.5 mm in diameter at superficial depths of 0.5 mm. Detecting flow in microvasculature at deeper depths requires lower frequency transducers with a resulting tradeoff in spatial resolution. Using a 22 MHz transducer, we demonstrate a speckle decorrelation technique to detect in vitro flow in soft tubing of 0.5 mm diameter at a depth of 2 cm. This image-based decorrelation technique is capable of detecting flow in significantly narrower diameters down to 0.125 mm by decreasing the region of interest.
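
    The image-based decorrelation technique rests on a simple observable: moving scatterers change the speckle pattern between successive frames, so the normalized correlation of a region of interest drops where there is flow. A minimal sketch of that measure (illustrative only; the actual system operates on ultrasound image frames):

```python
import numpy as np

def decorrelation(frame_a, frame_b):
    """1 minus the normalized cross-correlation of two image regions;
    larger values mean the speckle pattern changed more between frames,
    e.g. because scatterers flowed through the region of interest."""
    a = frame_a.astype(float).ravel()
    b = frame_b.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    rho = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return 1.0 - rho

rng = np.random.default_rng(0)
static = rng.normal(size=(32, 32))       # stationary speckle region
moved = np.roll(static, 5, axis=1)       # pattern displaced by "flow"
print(decorrelation(static, static) < decorrelation(static, moved))  # True
```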

  7. The differential effects of increasing frequency and magnitude of extreme events on coral populations.

    Science.gov (United States)

    Fabina, Nicholas S; Baskett, Marissa L; Gross, Kevin

    2015-09-01

    Extreme events, which have profound ecological consequences, are changing in both frequency and magnitude with climate change. Because extreme temperatures induce coral bleaching, we can explore the relative impacts of changes in frequency and magnitude of high temperature events on coral reefs. Here, we combined climate projections and a dynamic population model to determine how changing bleaching regimes influence coral persistence. We additionally explored how coral traits and competition with macroalgae mediate changes in bleaching regimes. Our results predict that severe bleaching events reduce coral persistence more than frequent bleaching. Corals with low adult mortality and high growth rates are successful when bleaching is mild, but bleaching resistance is necessary to persist when bleaching is severe, regardless of frequency. The existence of macroalgae-dominated stable states reduces coral persistence and changes the relative importance of coral traits. Building on previous studies, our results predict that management efforts may need to prioritize protection of "weaker" corals with high adult mortality when bleaching is mild, and protection of "stronger" corals with high bleaching resistance when bleaching is severe. In summary, future reef projections and conservation targets depend on both local bleaching regimes and biodiversity.

  8. Towards Real-Time Detection of Gait Events on Different Terrains Using Time-Frequency Analysis and Peak Heuristics Algorithm.

    Science.gov (United States)

    Zhou, Hui; Ji, Ning; Samuel, Oluwarotimi Williams; Cao, Yafei; Zhao, Zheyi; Chen, Shixiong; Li, Guanglin

    2016-10-01

    Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered as a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and are less reliable in the cases of up stair and down stair terrains. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real-time based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, and then the determination of the peaks of jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects when they were walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses.
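
    The core of the detection step, differentiating acceleration into jerk and keeping prominent peaks subject to timing heuristics, can be sketched as follows (a simplified illustration; the amplitude gate, minimum interval, and the time-frequency parameter stage of the published algorithm are not reproduced):

```python
import numpy as np
from scipy.signal import find_peaks

def detect_gait_peaks(accel, fs, min_step_s=0.4):
    """Differentiate acceleration into jerk, then keep prominent jerk
    peaks separated by a minimum interval. The amplitude threshold and
    interval here are illustrative heuristics, not the cited study's."""
    jerk = np.gradient(accel) * fs                 # d(accel)/dt
    height = 2.0 * np.std(jerk)                    # amplitude heuristic
    peaks, _ = find_peaks(jerk, height=height,
                          distance=int(min_step_s * fs))
    return peaks

# synthetic accelerometer trace: four impulsive events on a slow sway
fs = 100
t = np.arange(0.0, 5.0, 1.0 / fs)
accel = 0.1 * np.sin(2 * np.pi * 0.5 * t)
for t0 in (1.0, 2.0, 3.0, 4.0):                    # four "heel strikes"
    accel = accel + np.exp(-((t - t0) ** 2) / (2 * 0.01 ** 2))
print(len(detect_gait_peaks(accel, fs)))           # 4 events recovered
```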

  9. Towards Real-Time Detection of Gait Events on Different Terrains Using Time-Frequency Analysis and Peak Heuristics Algorithm

    Directory of Open Access Journals (Sweden)

    Hui Zhou

    2016-10-01

    Full Text Available Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered as a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO and heel strike (HS gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and are less reliable in the cases of up stair and down stair terrains. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real-time based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, and then the determination of the peaks of jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects when they were walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses.
    Full Text Available Real-time detection of gait events can be applied as a reliable input to control drop foot correction devices and lower-limb prostheses. Among the different sensors used to acquire the signals associated with walking for gait event detection, the accelerometer is considered as a preferable sensor due to its convenience of use, small size, low cost, reliability, and low power consumption. Based on the acceleration signals, different algorithms have been proposed to detect toe off (TO) and heel strike (HS) gait events in previous studies. While these algorithms could achieve a relatively reasonable performance in gait event detection, they suffer from limitations such as poor real-time performance and are less reliable in the cases of up stair and down stair terrains. In this study, a new algorithm is proposed to detect the gait events on three walking terrains in real-time based on the analysis of acceleration jerk signals with a time-frequency method to obtain gait parameters, and then the determination of the peaks of jerk signals using peak heuristics. The performance of the newly proposed algorithm was evaluated with eight healthy subjects when they were walking on level ground, up stairs, and down stairs. Our experimental results showed that the mean F1 scores of the proposed algorithm were above 0.98 for HS event detection and 0.95 for TO event detection on the three terrains. This indicates that the current algorithm would be robust and accurate for gait event detection on different terrains. Findings from the current study suggest that the proposed method may be a preferable option in some applications such as drop foot correction devices and leg prostheses.

  10. Frequency and distribution of winter melt events from passive microwave satellite data in the pan-Arctic, 1988-2013

    Science.gov (United States)

    Wang, Libo; Toose, Peter; Brown, Ross; Derksen, Chris

    2016-11-01

    This study presents an algorithm for detecting winter melt events in seasonal snow cover based on temporal variations in the brightness temperature difference between 19 and 37 GHz from satellite passive microwave measurements. An advantage of the passive microwave approach is that it is based on the physical presence of liquid water in the snowpack, which may not be the case with melt events inferred from surface air temperature data. The algorithm is validated using in situ observations from weather stations, snow pit measurements, and a surface-based passive microwave radiometer. The validation results indicate the algorithm has a high success rate for melt durations lasting multiple hours/days and where the melt event is preceded by warm air temperatures. The algorithm does not reliably identify short-duration events or events that occur immediately after or before periods with extremely cold air temperatures due to the thermal inertia of the snowpack and/or overpass and resolution limitations of the satellite data. The results of running the algorithm over the pan-Arctic region (north of 50° N) for the 1988-2013 period show that winter melt events are relatively rare, totaling less than 1 week per winter over most areas, with higher numbers of melt days (around two weeks per winter) occurring in more temperate regions of the Arctic (e.g., central Québec and Labrador, southern Alaska and Scandinavia). The observed spatial pattern is similar to winter melt events inferred with surface air temperatures from the ERA-Interim (ERA-I) and Modern Era-Retrospective Analysis for Research and Applications (MERRA) reanalysis datasets. There was little evidence of trends in winter melt event frequency over 1988-2013 with the exception of negative trends over northern Europe attributed to a shortening of the duration of the winter period. The frequency of winter melt events is shown to be strongly correlated to the duration of winter period. This must be taken into
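
    The detection principle, liquid water sharply raising 37 GHz emission and thus collapsing the 19-37 GHz brightness-temperature difference, can be sketched as a threshold test against the dry-snow background (the threshold and the background statistic below are illustrative choices, not the paper's calibrated values):

```python
import numpy as np

def detect_melt(tb19, tb37, drop_k=10.0):
    """Flag winter melt where the 19 - 37 GHz brightness-temperature
    difference drops sharply below its dry-snow background (here the
    series median). The drop_k threshold in kelvin is illustrative,
    not the paper's calibrated value."""
    diff = np.asarray(tb19, float) - np.asarray(tb37, float)
    background = np.median(diff)          # dry-snow reference level
    return diff < background - drop_k

# dry snow keeps a large 19-37 GHz difference; liquid water collapses it
tb19 = np.array([240.0, 241.0, 240.0, 242.0, 241.0, 240.0])
tb37 = np.array([215.0, 216.0, 214.0, 240.0, 215.0, 216.0])
print(detect_melt(tb19, tb37).tolist())   # only the melt day is True
```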

  11. DD4Hep based event reconstruction

    CERN Document Server

    AUTHOR|(SzGeCERN)683529; Frank, Markus; Gaede, Frank-Dieter; Hynds, Daniel; Lu, Shaojun; Nikiforou, Nikiforos; Petric, Marko; Simoniello, Rosa; Voutsinas, Georgios Gerasimos

    The DD4HEP detector description toolkit offers a flexible and easy-to-use solution for the consistent and complete description of particle physics detectors in a single system. The sub-component DDREC provides a dedicated interface to the detector geometry as needed for event reconstruction. With DDREC there is no need to define an additional, separate reconstruction geometry as is often done in HEP, but one can transparently extend the existing detailed simulation model to be also used for the reconstruction. Based on the extension mechanism of DD4HEP, DDREC allows one to attach user defined data structures to detector elements at all levels of the geometry hierarchy. These data structures define a high level view onto the detectors describing their physical properties, such as measurement layers, point resolutions, and cell sizes. For the purpose of charged particle track reconstruction, dedicated surface objects can be attached to every volume in the detector geometry. These surfaces provide the measuremen...

  12. Building Partnerships through Classroom-Based Events

    Science.gov (United States)

    Zacarian, Debbie; Silverstone, Michael

    2017-01-01

    Building partnerships with families can be a challenge, especially in ethnically diverse classrooms. In this article, the authors describe how to create such partnerships with three kinds of classroom events: community-building events that deepen social relationships and make families feel welcome; curriculum showcase events that give families a…

  13. A study of event related potential frequency domain coherency using multichannel electroencephalogram subspace analysis.

    Science.gov (United States)

    Razavipour, Fatemeh; Sameni, Reza

    2015-07-15

    Event related potentials (ERP) are time-locked electrical activities of the brain in direct response to a specific sensory, cognitive, or motor stimulus. ERP components, such as the P300 wave, which are involved in the process of decision-making, help scientists diagnose specific cognitive disabilities. In this study, we utilize the angles between multichannel electroencephalogram (EEG) subspaces in different frequency bands as a similarity factor for studying the spatial coherency between ERP frequency responses. A matched filter is used to enhance the ERP from background EEG. While previous research has focused on frequencies below 10 Hz as the major frequency band of ERP, it is shown that by using the proposed method, significant ERP-related information can also be found in the 25-40 Hz band. These frequency bands are selected by calculating the correlation coefficient between P300 response segments and synthetic EEG, and ERP segments without P300 waves, and by rejecting the bands having the most association with background EEG and non-P300 components. The significance of the results is assessed by real EEG acquired in brain computer interface experiments versus synthetic EEG produced by existing methods in the literature, to ensure that the results are not systematic side effects of the proposed framework. The overall results show that the equivalent dipoles corresponding to narrow-band events in the brain are spatially coherent within different (not necessarily adjacent) frequency bands. The results of this study can lead to novel perspectives in ERP studies. Copyright © 2015 Elsevier B.V. All rights reserved.
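
    The central quantity, the principal angles between the spatial subspaces spanned by multichannel EEG in a given frequency band, can be sketched with an SVD-based computation (a minimal illustration; the filter settings, the matched filtering, and the synthetic-EEG controls of the study are not reproduced, and all names below are ours):

```python
import numpy as np
from scipy.linalg import subspace_angles
from scipy.signal import butter, filtfilt

def band_subspace(eeg, fs, band, rank=2):
    """Dominant spatial subspace (channels x rank) of band-limited
    multichannel EEG (channels x samples). Filter order is illustrative."""
    b, a = butter(4, band, btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, eeg, axis=1)
    u, _, _ = np.linalg.svd(filtered, full_matrices=False)
    return u[:, :rank]

rng = np.random.default_rng(1)
fs, n = 250, 2500
t = np.arange(n) / fs
mix = rng.normal(size=(8, 2))      # two fixed scalp patterns, 8 channels
src = np.vstack([np.sin(2 * np.pi * 10 * t),
                 np.sin(2 * np.pi * 9 * t + 1.0)])
eeg = mix @ src + 0.05 * rng.normal(size=(8, n))

# same sources drive both halves -> the 8-12 Hz subspaces nearly coincide
s1 = band_subspace(eeg[:, : n // 2], fs, (8.0, 12.0))
s2 = band_subspace(eeg[:, n // 2 :], fs, (8.0, 12.0))
angles = np.degrees(subspace_angles(s1, s2))
print(angles.max() < 5.0)          # True: spatially coherent band
```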

  14. ATTITUDES OF SERBIAN CONSUMERS TOWARD ADVERTISING THROUGH SPORT WITH REGARD TO THE FREQUENCY OF WATCHING SPORTS EVENTS

    Directory of Open Access Journals (Sweden)

    Stevo Popović

    2015-05-01

    Full Text Available It has been proposed that the attitudes potential consumers form toward advertising through sport can influence their decisions to purchase a particular advertiser’s product (Pyun, 2006). For this reason, it is important to analyse their general attitudes toward advertising through sport, and this investigation was aimed at gaining relevant knowledge about the attitudes of Serbian consumers toward advertising through sport. Methods: The sample included 127 respondents, divided into six subsample groups: consumers who do not watch sports events at all, and consumers who watch sports events for 1-30 minutes, 31-60 minutes, 61-90 minutes, 91-120 minutes, or more than 120 minutes during a typical day. The sample of variables contained a system of three general attitudes modeled on a seven-point Likert scale. The measurements were analyzed by multivariate analysis (MANOVA) and univariate analysis (ANOVA and Post Hoc tests). Results: The statistical analyses found no significant differences at the multivariate level, nor for any of the three variables at a significance level of p=.05. Hence, it is interesting to highlight that no significant differences emerged between the attitudes of consumers toward advertising through sport with regard to the frequency of watching sports events. Discussion: These results are important for marketers, mostly because they can treat all potential consumers alike regardless of how frequently they watch sports events. On the other hand, this was not the case in previous investigations (Bjelica and Popović, 2011), and this observation presents relevant information.

  15. Decay ratio estimation based on time-frequency representations

    Energy Technology Data Exchange (ETDEWEB)

    Torres-Fernandez, Jose E.; Prieto-Guerrero, Alfonso [Division de Ciencias Basicas e Ingenieria, Universidad Autonoma Metropolitana-Iztapalapa, Av. San Rafael Atlixco 186, Col. Vicentina, Mexico D.F. 09340 (Mexico); Espinosa-Paredes, Gilberto, E-mail: gepe@xanum.uam.m [Division de Ciencias Basicas e Ingenieria, Universidad Autonoma Metropolitana-Iztapalapa, Av. San Rafael Atlixco 186, Col. Vicentina, Mexico D.F. 09340 (Mexico)

    2010-02-15

    A novel method based on bilinear time-frequency representations (TFRs) is proposed to determine the time evolution of the linear stability parameters of a boiling water reactor (BWR) using neutronic noise signals. TFRs allow us to track the instantaneous frequencies contained in a signal to estimate an instantaneous decay ratio (IDR) that closely follows the signal envelope changes in time, making the IDR a measure of local stability. In order to account for long term changes in BWR stability, the ACDR measure is introduced as the accumulated product of the local IDRs. As shown in this paper, the ACDR measure clearly reflects major long term changes in BWR stability. Finally, to validate the method, synthetic and real neutronic signals were used: the methodology was tested on signals from Laguna Verde Unit 1 and on two events reported in the Forsmark stability benchmark.
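
    The ACDR construction itself is simple once per-window decay ratios are available: it is the running product of the local IDRs. The sketch below estimates each window's decay ratio from the amplitude ratio of consecutive oscillation peaks, a deliberate simplification of the paper's TFR-based instantaneous estimate:

```python
import numpy as np

def idr_from_window(signal):
    """Decay ratio of one window: amplitude ratio of two consecutive
    oscillation peaks (a simplification of the TFR-based IDR)."""
    peaks = [i for i in range(1, len(signal) - 1)
             if signal[i - 1] < signal[i] > signal[i + 1]]
    return signal[peaks[1]] / signal[peaks[0]]

# synthetic "neutronic" signal: damped 0.5 Hz oscillation, fs = 20 Hz
fs, lam = 20.0, 0.1
t = np.arange(0.0, 20.0, 1.0 / fs)
y = np.exp(-lam * t) * np.cos(np.pi * t)        # period T = 2 s

# IDR on sliding 4.5 s windows, ACDR as the accumulated product
idrs = [idr_from_window(y[40 * i : 40 * i + 90]) for i in range(4)]
acdr = np.cumprod(idrs)
print(np.round(idrs, 3).tolist())               # each ~ exp(-lam*T) = 0.819
```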

  16. Event-related potentials to conjunctions of spatial frequency and orientation as a function of stimulus parameters and response requirements.

    NARCIS (Netherlands)

    Kenemans, J.L.; Kok, A.; Smulders, F.T.

    1993-01-01

    Event-related potentials (ERPs) were recorded from 7 male graduate students who were required to push a button in response to a given conjunction of spatial frequency and orientation (target) and to ignore conjunctions sharing with the target only frequency (frequency-relevant), only orientation

  17. Considering historical flood events in flood frequency analysis: Is it worth the effort?

    Science.gov (United States)

    Schendel, Thomas; Thongwichian, Rossukon

    2017-07-01

    Information about historical floods can be useful in reducing uncertainties in flood frequency estimation. Since the start of the historical record is often defined by the first known flood, the length of the true historical period M remains unknown. We have expanded a previously published method of estimating M to the case of several known floods within the historical period. We performed a systematic evaluation of the usefulness of including historical flood events in flood frequency analysis for a wide range of return periods and studied bias as well as relative root mean square error (RRMSE). Since we used the generalized extreme value distribution (GEV) as parent distribution, we were able to investigate the impact of varying the skewness on RRMSE. We confirmed the usefulness of historical flood data regarding the reduction of RRMSE; however, we found that this reduction is less pronounced the more positively skewed the parent distribution was. Including historical flood information had an ambiguous effect on bias: depending on the length and number of known floods of the historical period, bias was reduced for large return periods, but increased for smaller ones. Finally, we customized the test inversion bootstrap for estimating confidence intervals to the case where historical flood events are taken into account in flood frequency analysis.
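
    For reference, the systematic-record baseline that the historical information augments is a GEV fit and a return-level computation. A sketch using scipy (illustrative only; the paper's estimation of the historical-period length M and its test-inversion bootstrap are not reproduced, and the synthetic record is ours):

```python
import numpy as np
from scipy.stats import genextreme

def return_level(annual_maxima, T):
    """T-year return level from a GEV fit to an annual-maximum series
    (systematic record only)."""
    shape, loc, scale = genextreme.fit(annual_maxima)
    return genextreme.ppf(1.0 - 1.0 / T, shape, loc, scale)

# 60 years of synthetic annual maxima from a known GEV
record = genextreme.rvs(c=-0.1, loc=100.0, scale=20.0,
                        size=60, random_state=7)
q100 = return_level(record, 100)
print(q100 > np.quantile(record, 0.9))   # True: 100-yr level beyond q90
```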

  18. Increased risk of severe hypoglycemic events with increasing frequency of non-severe hypoglycemic events in patients with Type 1 and Type 2 diabetes.

    LENUS (Irish Health Repository)

    Sreenan, Seamus

    2014-07-15

    Severe hypoglycemic events (SHEs) are associated with significant morbidity, mortality and costs. However, the more common non-severe hypoglycemic events (NSHEs) are less well explored. We investigated the association between reported frequency of NSHEs and SHEs among patients with type 1 diabetes mellitus (T1DM) and type 2 diabetes mellitus (T2DM) in the PREDICTIVE study.

  19. Network-Based and Binless Frequency Analyses.

    Directory of Open Access Journals (Sweden)

    Sybil Derrible

    Full Text Available We introduce and develop a new network-based and binless methodology to perform frequency analyses and produce histograms. In contrast with traditional frequency analysis techniques that use fixed intervals to bin values, we place a range ±ζ around each individual value in a data set and count the number of values within that range, which allows us to compare every single value of a data set with every other. In essence, the methodology is identical to the construction of a network, where two values are connected if they lie within a given range (±ζ). The value with the highest degree (i.e., most connections) is therefore assimilated to the mode of the distribution. To select an optimal range, we look at the stability of the proportion of nodes in the largest cluster. The methodology is validated by sampling 12 typical distributions, and it is applied to a number of real-world data sets with both spatial and temporal components. The methodology can be applied to any data set and provides a robust means to uncover meaningful patterns and trends. A free python script and a tutorial are also made available to facilitate the application of the method.
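
    The counting step is easy to reproduce. A sketch of the degree computation and mode selection (the optimal-range selection via largest-cluster stability described in the abstract is not implemented here):

```python
import numpy as np

def binless_mode(values, zeta):
    """Binless 'histogram': each value's degree is the number of other
    values within +/- zeta of it; the highest-degree value is taken as
    the mode of the distribution."""
    v = np.asarray(values, float)
    # pairwise comparison of every value with every other value
    degree = (np.abs(v[:, None] - v[None, :]) <= zeta).sum(axis=1) - 1
    return v[np.argmax(degree)], degree

values = [1.0, 1.1, 1.2, 5.0, 5.1, 5.2, 5.3, 9.0]
mode, degree = binless_mode(values, zeta=0.35)
print(mode)             # 5.0: a member of the densest cluster
print(degree.tolist())  # [2, 2, 2, 3, 3, 3, 3, 0]
```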

  20. Modelado del transformador para eventos de alta frecuencia; Transformer model for high frequency events

    Directory of Open Access Journals (Sweden)

    Verónica Adriana Galván Sánchez

    2012-07-01

    Full Text Available The transformer’s function is to change the voltage level through a magnetic coupling. Due to its physical construction, its representation as a circuit and its mathematical model are very complex. The electromagnetic behavior of the transformer, like that of every element of the power network, depends on the frequency involved. For this reason, when high-frequency phenomena are present, its model must be very detailed to reproduce the transient behavior. This work analyzes how to move from a very simple model to a very detailed one in order to simulate high-frequency events. The simulated events are the operation of a switch due to a fault in the system and the impact of an atmospheric discharge (direct stroke) on the transmission line, 5 km away from a power substation.

  1. A spatial and nonstationary model for the frequency of extreme rainfall events

    DEFF Research Database (Denmark)

    Gregersen, Ida Bülow; Madsen, Henrik; Rosbjerg, Dan

    2013-01-01

    Changes in the properties of extreme rainfall events have been observed worldwide. In relation to the discussion of ongoing climatic changes, it is of high importance to attribute these changes to known sources of climate variability. Focusing on spatial and temporal changes in the frequency...... of extreme rainfall events, a statistical model is tested for this purpose. The model is built on the theory of generalized linear models and uses Poisson regression solved by generalized estimation equations. Spatial and temporal explanatory variables can be included simultaneously, and their relative......, and the average summer temperature. The two latter showed a high relative importance. The established link will be beneficial when predicting future occurrences of precipitation extremes. © 2013. American Geophysical Union. All Rights Reserved....
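
    The modelling core described, Poisson regression of extreme-event counts on explanatory variables, can be sketched with a plain iteratively reweighted least squares fit (the paper solves the regression by generalized estimating equations; this independence-model GLM, and the synthetic data, are a simplification):

```python
import numpy as np

def poisson_irls(X, y, n_iter=50):
    """Poisson regression (log link) fitted by iteratively reweighted
    least squares; an independence-model GLM, not the paper's GEE."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)                 # expected event counts
        z = X @ beta + (y - mu) / mu          # working response
        beta = np.linalg.solve(X.T @ (mu[:, None] * X),
                               X.T @ (mu * z))
    return beta

rng = np.random.default_rng(3)
n = 2000
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + covariate
true_beta = np.array([0.5, 0.3])
y = rng.poisson(np.exp(X @ true_beta)).astype(float)
beta_hat = poisson_irls(X, y)
print(np.round(beta_hat, 1).tolist())   # close to [0.5, 0.3]
```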

  2. Effects of low-frequency repetitive transcranial magnetic stimulation on event-related potential P300

    Science.gov (United States)

    Torii, Tetsuya; Sato, Aya; Iwahashi, Masakuni; Iramina, Keiji

    2012-04-01

    The present study analyzed the effects of repetitive transcranial magnetic stimulation (rTMS) on brain activity. P300 latency of the event-related potential (ERP) was used to evaluate the effects of low-frequency, short-term rTMS applied to the supramarginal gyrus (SMG), which is considered a region related to the origin of the P300. In addition, the prolonged effects of stimulation on P300 latency were analyzed after applying rTMS. A figure-eight coil was used to stimulate the left or right SMG, and the intensity of magnetic stimulation was 80% of the motor threshold. A total of 100 magnetic pulses were applied for rTMS. The effects of stimulus frequency at 0.5 or 1 Hz were determined. Following rTMS, an odd-ball task was performed and the P300 latency of the ERP was measured. The odd-ball task was performed at 5, 10, and 15 min post-rTMS. ERP was measured prior to magnetic stimulation as a control. The electroencephalogram (EEG) was measured at Fz, Cz, and Pz, as defined by the international 10-20 electrode system. Results demonstrated that 0.5 and 1 Hz rTMS had different effects on P300 latency. With 1 Hz low-frequency magnetic stimulation to the left SMG, P300 latency decreased; compared to the control, the latency difference was approximately 15 ms at Cz, and this decrease continued for approximately 10 min post-rTMS. In contrast, 0.5 Hz rTMS resulted in delayed P300 latency; compared to the control, the latency difference was approximately 20 ms at Fz, and this delayed effect continued for approximately 15 min post-rTMS. These results demonstrate that P300 latency varied according to rTMS frequency and that the duration of the effect also depended on the stimulus frequency of the low-frequency rTMS.

  3. Address-event-based platform for bioinspired spiking systems

    Science.gov (United States)

    Jiménez-Fernández, A.; Luján, C. D.; Linares-Barranco, A.; Gómez-Rodríguez, F.; Rivas, M.; Jiménez, G.; Civit, A.

    2007-05-01

    Address Event Representation (AER) is an emergent neuromorphic interchip communication protocol that allows real-time virtual massive connectivity between a huge number of neurons located on different chips. By exploiting high speed digital communication circuits (with nanosecond timing), synaptic neural connections can be time multiplexed, while neural activity signals (with millisecond timing) are sampled at low frequencies. Also, neurons generate "events" according to their activity levels. More active neurons generate more events per unit time and access the interchip communication channel more frequently, while neurons with low activity consume less communication bandwidth. When building multi-chip multi-layered AER systems, it is absolutely necessary to have a computer interface that allows (a) reading AER interchip traffic into the computer and visualizing it on the screen, and (b) converting a conventional frame-based video stream in the computer into AER and injecting it at some point of the AER structure. This is necessary for testing and debugging of complex AER systems. On the other hand, the use of a commercial personal computer implies depending on software tools and operating systems that can make the system slower and less robust. This paper addresses the problem of communicating several AER based chips to compose a powerful processing system. The problem was discussed in the Neuromorphic Engineering Workshop of 2006. The platform is based on an embedded computer, a powerful FPGA and serial links, making the system faster and stand-alone (independent from a PC). A new platform is presented that allows connecting up to eight AER based chips to a Spartan 3 4000 FPGA. The FPGA is responsible for the Address-Event network communication and, at the same time, maps and transforms the address space of the traffic to implement a pre-processing. An MMU microprocessor (Intel XScale 400 MHz Gumstix Connex computer) is also connected to the FPGA

  4. High Frequency Attenuation Modeling and Event Amplitude Estimation in the Southern Nevada Region

    Science.gov (United States)

    Pyle, M. L.; Walter, W. R.; Pasyanos, M.

    2016-12-01

    Measurement of seismic amplitudes plays a critical role in underground explosion monitoring and the discrimination between earthquakes and explosions, which is crucial for global security. In order to improve amplitude estimation at small event-to-station distances, an accurate 2D model of attenuation is important. As part of the Source Physics Experiment (SPE), we develop a detailed attenuation model for the region around southern Nevada and test the model's usefulness in predicting amplitudes of local events. The SPE consists of a series of chemical explosions at the Nevada National Security Site (NNSS) designed to improve our understanding of explosion physics and enable better modeling of explosion sources. A high-resolution attenuation model will aid in the waveform modeling efforts of these experiments, and enable us to take a more detailed look at local event discrimination. To improve our understanding of the propagation of energy from sources in the area to local and regional stations in the western U.S., we invert regional phases to examine the crust and upper mantle attenuation structure of southern Nevada and the surrounding region. We consider observed amplitudes as the frequency-domain product of a source term, a site term, a geometrical spreading term, and an attenuation (Q) term (e.g. Walter and Taylor, 2001). Initially we take a staged approach to first determine the best 1D Q values; next we calculate source terms using the 1D model, and finally we solve for the best 2D Q parameters and site terms considering all frequencies simultaneously. Preliminary results show that our attenuation model correlates quite well with the regional geology, and a small number of comparisons of predicted and observed amplitudes from past SPE shots show reasonable agreement. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
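
    The amplitude model described, observed amplitude as the frequency-domain product of source, site, geometrical spreading, and attenuation terms, is convenient to evaluate in log space. A sketch with illustrative parameter values (the Q(f) form, the 1/r spreading, and the velocity below are assumptions for the example, not the study's inverted values):

```python
import numpy as np

def predicted_log_amplitude(f, r, log_source, log_site=0.0,
                            q0=200.0, eta=0.5, beta=3.5, r0=1.0):
    """log10 amplitude as source + site + geometrical spreading +
    attenuation: the product model of the abstract written in log
    space. Q(f) = q0*f**eta, 1/r spreading, and the shear velocity
    beta are illustrative assumptions."""
    spreading = -np.log10(r / r0)                 # ~1/r body-wave decay
    q = q0 * f ** eta                             # frequency-dependent Q
    attenuation = -(np.pi * f * r) / (q * beta) * np.log10(np.e)
    return log_source + log_site + spreading + attenuation

f = np.array([1.0, 2.0, 4.0, 8.0])                # frequencies (Hz)
logA = predicted_log_amplitude(f, r=200.0, log_source=2.0)
print(np.all(np.diff(logA) < 0))                  # True: high f attenuates more
```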

  6. Thermal-Diffusivity-Based Frequency References in Standard CMOS

    NARCIS (Netherlands)

    Kashmiri, S.M.

    2012-01-01

    In recent years, a lot of research has been devoted to the realization of accurate integrated frequency references. A thermal-diffusivity-based (TD) frequency reference provides an alternative method of on-chip frequency generation in standard CMOS technology. A frequency-locked loop locks the

  7. Prestress Force Identification for Externally Prestressed Concrete Beam Based on Frequency Equation and Measured Frequencies

    OpenAIRE

    Luning Shi; Haoxiang He; Weiming Yan

    2014-01-01

    A prestress force identification method for externally prestressed concrete uniform beam based on the frequency equation and the measured frequencies is developed. For the purpose of the prestress force identification accuracy, we first look for the appropriate method to solve the free vibration equation of externally prestressed concrete beam and then combine the measured frequencies with frequency equation to identify the prestress force. To obtain the exact solution of the free vibration e...

  8. Those were the days: memory bias for the frequency of positive events, depression, and self-enhancement.

    Science.gov (United States)

    Lotterman, Jenny H; Bonanno, George A

    2014-01-01

    Past research has associated depression with memory biases pertaining to the frequency, duration, and specificity of past events. Associations have been proposed between both negative and positive memory biases and depression symptoms. However, research has not examined the occurrence of actual events over time in the study of memory bias. To address these limitations and investigate whether a negative or positive memory bias is associated with symptoms of depression, we collected weekly data on specific types of life events over a 4-year period from a sample of college students, and asked students to recall event frequency at the end of that period. Exaggerated recall of frequency for positive events but not other types of events was associated with depression symptoms, using both continuous and categorical measures. Moderator analyses indicated that these effects were evidenced primarily for memories involving the self and among individuals low in trait self-enhancement. The current study indicates that positive memory-frequency bias is an important type of memory bias associated with symptoms of depression. Results support the idea that the link between memory bias for positive event frequency and depressed mood arises out of a current-self vs past-self comparison.

  9. SUBTLEX-ESP: Spanish Word Frequencies Based on Film Subtitles

    Science.gov (United States)

    Cuetos, Fernando; Glez-Nosti, Maria; Barbon, Analia; Brysbaert, Marc

    2011-01-01

    Recent studies have shown that word frequency estimates obtained from films and television subtitles are better to predict performance in word recognition experiments than the traditional word frequency estimates based on books and newspapers. In this study, we present a subtitle-based word frequency list for Spanish, one of the most widely spoken…

  10. Seismic random noise attenuation by time-frequency peak filtering based on joint time-frequency distribution

    Science.gov (United States)

    Zhang, Chao; Lin, Hong-bo; Li, Yue; Yang, Bao-jun

    2013-09-01

    Time-Frequency Peak Filtering (TFPF) is an effective method for eliminating pervasive random noise in the analysis of seismic signals. In conventional TFPF, the pseudo Wigner-Ville distribution (PWVD) is used for estimating instantaneous frequency (IF), but it is sensitive to noise interferences that mask the borderline between signal and noise and degrade the energy concentration along the IF curve. The peaks of the PWVD then deviate from the instantaneous frequency, producing undesirable lateral oscillations and amplitude attenuation in highly varying seismic signals, and ultimately a biased signal estimate. To overcome these drawbacks and increase the signal-to-noise ratio, we propose a TFPF refinement based on the joint time-frequency distribution (JTFD), obtained by combining the PWVD and the smoothed PWVD (SPWVD). First, the SPWVD is used to generate a broad time-frequency area of the signal. This area is then filtered with a step function to remove divergent time-frequency points. Finally, the JTFD is obtained from the PWVD weighted by this filtered distribution. These operations reduce the effects of the interferences and enhance the energy concentration around the IF of the signal in the time-frequency domain. Experiments with synthetic and real seismic data demonstrate that TFPF based on the joint time-frequency distribution can effectively suppress strong random noise and preserve events of interest.
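The encode–locate-peak–decode loop at the heart of TFPF can be sketched compactly. The toy below substitutes a windowed FFT spectrum for the pseudo Wigner-Ville distribution used in the paper, and the test signal, modulation index `mu`, and window length are invented for illustration:

```python
import numpy as np

def tfpf_sketch(noisy, mu=0.3, nwin=64):
    """Toy time-frequency peak filtering: encode the noisy signal as the
    instantaneous frequency (IF) of a unit-amplitude FM signal, locate the
    spectral peak in short windows (a stand-in for the PWVD of the paper),
    and decode the peak track back into a signal estimate."""
    # FM encoding: z[n] = exp(j * 2*pi * mu * cumsum(x)), so IF ~ mu * x[n]
    z = np.exp(2j * np.pi * mu * np.cumsum(noisy))
    n, half = len(z), nwin // 2
    freqs = np.fft.fftfreq(nwin)          # normalised frequency bins
    est = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half)
        seg = np.zeros(nwin, dtype=complex)
        seg[:hi - lo] = z[lo:hi] * np.hanning(hi - lo)
        # Peak of the local spectrum approximates the IF, i.e. mu * x[i]
        est[i] = freqs[np.argmax(np.abs(np.fft.fft(seg)))] / mu
    return est

# Example: recover a slow sine from moderate random noise
rng = np.random.default_rng(0)
t = np.arange(2048) / 512.0
clean = 0.5 * np.sin(2 * np.pi * 2.0 * t)
noisy = clean + 0.2 * rng.standard_normal(t.size)
rec = tfpf_sketch(noisy)
```

The estimate is quantised by the window's frequency resolution, which is why the paper's refinements concentrate the time-frequency energy before peak picking.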

  11. [Frequency and Type of Traumatic Events in Children and Adolescents with a Posttraumatic Stress Disorder].

    Science.gov (United States)

    Loos, Sabine; Wolf, Saskia; Tutus, Dunja; Goldbeck, Lutz

    2015-01-01

    The risk for children and adolescents to be exposed to a potentially traumatic event (PTE) is high. The present study examines the frequency of PTEs in children and adolescents with Posttraumatic Stress Disorder (PTSD), the type of index trauma, and its relation to PTSD symptom severity and gender. A clinical sample of 159 children and adolescents between 7 and 16 years was assessed using the Clinician-Administered PTSD Scale for Children and Adolescents (CAPS-CA). All PTEs reported on the checklist were analyzed for frequency. The index events were categorized by cause (random vs. intentional), relation to the offender (intrafamilial vs. extrafamilial), patient's role (victim, witness, or vicarious traumatization), and type of PTE (physical or sexual violence). Relations between these categories and PTSD symptom severity and sex were analyzed with inferential statistics. On average, participants reported five PTEs, most frequently physical violence without weapons (57.9%), loss of a loved one through death (45.9%), and sexual abuse/assault (44%). The most frequent index traumas were intentional (76.7%). Regarding trauma type, symptom severity was significantly higher in children and adolescents who experienced sexual abuse/assault compared to physical violence (t(109) = -1.913, p = 0.05). Significantly higher symptom severity was found for girls compared to boys for the trauma categories extrafamilial offender (z = -2.27, p = 0.02), victim (z = -2.11, p = 0.04), and sexual abuse/assault (z = -2.43, p = 0.01). Clinical and diagnostic implications are discussed in relation to the amended PTSD diagnostic criteria in DSM-5.

  12. Framework for modeling high-impact, low-frequency power grid events to support risk-informed decisions

    Energy Technology Data Exchange (ETDEWEB)

    Veeramany, Arun; Unwin, Stephen D.; Coles, Garill A.; Dagle, Jeffery E.; Millard, David W.; Yao, Juan; Glantz, Cliff S.; Gourisetti, Sri N. G.

    2016-06-25

    Natural and man-made hazardous events resulting in loss of grid infrastructure assets challenge the security and resilience of the electric power grid. However, the planning and allocation of appropriate contingency resources for such events requires an understanding of their likelihood and the extent of their potential impact. Where these events are of low likelihood, a risk-informed perspective on planning can be difficult, as the statistical basis needed to directly estimate the probabilities and consequences of their occurrence does not exist. Because risk-informed decisions rely on such knowledge, a basis for modeling the risk associated with high-impact, low-frequency events (HILFs) is essential. Insights from such a model indicate where resources are most rationally and effectively expended. A risk-informed realization of designing and maintaining a grid resilient to HILFs will demand consideration of a spectrum of hazards/threats to infrastructure integrity, an understanding of their likelihoods of occurrence, treatment of the fragilities of critical assets to the stressors induced by such events, and through modeling grid network topology, the extent of damage associated with these scenarios. The model resulting from integration of these elements will allow sensitivity assessments based on optional risk management strategies, such as alternative pooling, staging and logistic strategies, and emergency contingency planning. This study is focused on the development of an end-to-end HILF risk-assessment framework. Such a framework is intended to provide the conceptual and overarching technical basis for the development of HILF risk models that can inform decision-makers across numerous stakeholder groups in directing resources optimally towards the management of risks to operational continuity.
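The frequency–fragility–consequence decomposition described here can be illustrated with a toy risk roll-up; every hazard name and number below is a placeholder for illustration, not a value from the study:

```python
# Toy HILF risk roll-up: annual frequency x asset fragility x consequence.
# All entries are invented placeholders, not figures from the framework.
hazards = {
    # name: (annual frequency, P(critical asset fails | event), outage days)
    "geomagnetic storm":  (0.010, 0.30, 30.0),
    "major earthquake":   (0.002, 0.60, 90.0),
    "coordinated attack": (0.001, 0.50, 60.0),
}

# Expected annual outage days aggregated across hazards
expected_outage_days = sum(f * p * c for f, p, c in hazards.values())

# Ranking hazards by risk contribution tells a decision-maker where
# contingency resources are most rationally expended
ranked = sorted(hazards, key=lambda h: -(hazards[h][0] * hazards[h][1] * hazards[h][2]))
```

Even this crude product form shows why a low-frequency hazard (the earthquake here) can dominate the risk ranking once fragility and consequence are folded in.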

  13. The Influence of ENSO and Interdecadal Variability on the Frequency of Extreme Precipitation Events in South America

    Science.gov (United States)

    Grimm, A. M.; Tedeschi, R. G.; Pscheidt, I.

    2006-12-01

    Different phases of the El Nino Southern Oscillation (ENSO) produce significant impacts on monthly and seasonal precipitation over several regions of South America, as shown by previous studies. This paper examines how El Nino (EN) and La Nina (LN) episodes, as well as interdecadal oscillations, modify the frequency of extreme precipitation events in South America, and the reason for this modification. Gamma distributions are fit to precipitation series for each day of the year, in the period 1956-2002, provided by about 5000 stations all over Brazil. Daily station data are gridded to 1.0 degree to achieve more homogeneous distribution of data. Daily precipitation data are then replaced by their respective percentiles. Extreme events are those with a three-day average percentile above 90. The number of extreme events was computed for each month of each year. Years were classified as EN, LN, and normal years, and the mean frequency of extreme events for each month, within each category of year, is computed. Maps of the difference (and its statistical significance) between these mean frequencies for EN and normal years, and for LN and normal years show that EN and LN episodes influence significantly the frequency of extreme events in several regions in Brazil during certain periods. The relationships between large-scale atmospheric perturbations and variations in the frequency of extreme precipitation events are sought through composites of anomalous atmospheric fields during extreme events in EN and LN episodes, in three regions in which there is significant change in the frequency of these events. The general features of those anomalous fields are similar for extreme events during any category of year (EN, LN or normal). They show the essential ingredients for much precipitation: moisture convergence and mechanisms for lifting the air to the condensation level. 
In the regions where the frequency of extreme events increases (decreases) the anomaly composites during
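The percentile-based definition of extreme events used in this study is straightforward to reproduce. The sketch below uses synthetic rainfall and a single pooled gamma fit rather than the per-calendar-day fits of the paper, and all distribution parameters are invented:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical daily wet-season rainfall (mm) for one grid point:
# 4 seasons x 210 days, gamma-distributed depths (parameters invented)
rain = rng.gamma(shape=0.8, scale=8.0, size=4 * 210)

# Fit a gamma distribution to the daily amounts (the paper fits one
# distribution per calendar day; a pooled fit keeps the sketch short)
a, loc, scale = stats.gamma.fit(rain, floc=0)

# Replace each daily amount by its percentile under the fitted distribution
pct = 100 * stats.gamma.cdf(rain, a, loc=loc, scale=scale)

# An extreme event: three-day running-mean percentile above 90
run3 = np.convolve(pct, np.ones(3) / 3, mode="valid")
n_extreme = int(np.sum(run3 > 90))
```

Counting `n_extreme` per month and per year-category (EN, LN, normal) then yields the frequency maps the study compares.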

  14. Relative frequencies of constrained events in stochastic processes: An analytical approach.

    Science.gov (United States)

    Rusconi, S; Akhmatskaya, E; Sokolovski, D; Ballard, N; de la Cal, J C

    2015-10-01

    The stochastic simulation algorithm (SSA) and the corresponding Monte Carlo (MC) method are among the most common approaches for studying stochastic processes. They rely on knowledge of interevent probability density functions (PDFs) and on information about dependencies between all possible events. In many real-life applications, analytical representations of the PDFs are difficult to specify in advance. Knowing the shapes of the PDFs, and using experimental data, different optimization schemes can be applied to evaluate the probability density functions and, therefore, the properties of the studied system. Such methods, however, are computationally demanding and often not feasible. We show that, when the experimentally accessed properties are directly related to the frequencies of the events involved, it may be possible to replace the heavy Monte Carlo core of the optimization schemes with an analytical solution. Such a replacement not only provides a more accurate estimation of the properties of the process, but also reduces the simulation time by a factor on the order of the sample size (at least ≈10^4). The proposed analytical approach is valid for any choice of PDF. The accuracy, computational efficiency, and advantages of the method over MC procedures are demonstrated in an exactly solvable case and in the evaluation of branching fractions in controlled radical polymerization (CRP) of acrylic monomers. This polymerization can be modeled as a constrained stochastic process. Constrained systems are quite common, which makes the method useful for various applications.
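The slow Monte Carlo convergence that the analytical replacement avoids is easy to see in the exactly solvable constant-rate case, where the relative event frequencies are simply the normalized rates. A minimal SSA sketch (the rates are arbitrary illustrative values):

```python
import numpy as np

def gillespie_event_counts(rates, t_end, rng):
    """Minimal stochastic simulation algorithm (SSA) for competing events
    with constant rates: draw exponential waiting times, pick each event
    with probability proportional to its rate, and count occurrences."""
    counts = np.zeros(len(rates), dtype=int)
    total = float(sum(rates))
    p = np.asarray(rates) / total
    t = 0.0
    while True:
        t += rng.exponential(1.0 / total)       # time to next event
        if t > t_end:
            break
        counts[rng.choice(len(rates), p=p)] += 1
    return counts

rng = np.random.default_rng(1)
rates = [2.0, 1.0, 0.5]                         # hypothetical event rates
counts = gillespie_event_counts(rates, t_end=2000.0, rng=rng)
rel = counts / counts.sum()
# For constant rates the analytical relative frequencies are rates/sum(rates),
# i.e. the MC estimate slowly approaches [4/7, 2/7, 1/7]
```

The MC estimate carries sampling error that shrinks only as the square root of the event count, which is the cost the paper's analytical solution eliminates.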

  15. Power Grid Vulnerable to High-Impact, Low-Frequency Events

    Science.gov (United States)

    Tretkoff, Ernie

    2010-07-01

    The power grid is vulnerable to large solar storms and other infrequently occurring events, according to a report released in June by the U.S. Department of Energy (DOE) and the North American Electric Reliability Corporation (NERC). The report was the product of a workshop held in Washington, D. C., in November 2009 in which more than 100 attendees, including representatives of power companies and government agencies, met with experts on risks associated with several types of high-impact, low-frequency (HILF) events, which occur rarely but have the potential to cause extensive damage. One section of the workshop focused on geomagnetic disturbances caused by space weather. When a solar storm occurs, large surges of geomagnetically induced currents can flow through the power grid and damage electrical systems, especially high-voltage power transformers. For example, a March 1989 solar storm left millions of people in Quebec, Canada, without power for hours. Much larger geomagnetic storms have been observed, and very large geomagnetic storms could cause catastrophic damage to large portions of the North American power grid.

  16. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events appendices

    Energy Technology Data Exchange (ETDEWEB)

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L. (Science Applications International Corp., Albuquerque, NM (USA); Sandia National Labs., Albuquerque, NM (USA))

    1989-08-01

    This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report are directed toward PRA practitioners who need to know how the work was done and the details for use in further studies. The mean core damage frequency is 4.5E-6 with 5% and 95% uncertainty bounds of 3.5E-7 and 1.3E-5, respectively. Station-blackout-type accidents (loss of all ac power) contributed about 46% of the core damage frequency, with Anticipated Transient Without Scram (ATWS) accidents contributing another 42%. The numerical results are driven by loss of offsite power, transients with the power conversion system initially available, operator errors, and mechanical failure to scram. 13 refs., 345 figs., 171 tabs.
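The mean-plus-percentile-bounds presentation used in such PRA studies typically comes from Monte Carlo propagation of sequence-level uncertainties. A toy sketch with two accident classes; the medians and error factors below are invented, not the study's values:

```python
import numpy as np

rng = np.random.default_rng(11)

def lognormal_from_median_ef(median, ef, size, rng):
    """Sample a lognormal given its median and error factor EF = p95/median,
    a common parameterisation in PRA uncertainty analysis."""
    sigma = np.log(ef) / 1.645          # 1.645 = standard normal 95th pct
    return rng.lognormal(np.log(median), sigma, size)

n = 100_000
# Two hypothetical accident-class frequencies (per reactor-year)
blackout = lognormal_from_median_ef(2.0e-6, 5.0, n, rng)
atws = lognormal_from_median_ef(1.8e-6, 5.0, n, rng)

cdf_total = blackout + atws             # total core damage frequency samples
mean = cdf_total.mean()
p05, p95 = np.percentile(cdf_total, [5, 95])
```

Because the summed distribution stays right-skewed, the mean sits well above the median, which is why PRA results are quoted with explicit percentile bounds rather than a single point value.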

  17. The influence of non-stationarity in extreme hydrological events on flood frequency estimation

    Directory of Open Access Journals (Sweden)

    Šraj Mojca

    2016-12-01

    Full Text Available Substantial evidence shows that the frequency of hydrological extremes has been changing and is likely to continue to change in the near future. Non-stationary models for flood frequency analysis are one method of accounting for these changes in estimating design values. The objective of the present study is to compare four models in terms of goodness of fit, their uncertainties, the parameter estimation methods, and the implications for estimating flood quantiles. Stationary and non-stationary models using the GEV distribution were considered, with parameters dependent on time and on annual precipitation. Furthermore, in order to study the influence of the parameter estimation approach on the results, the maximum likelihood (MLE) and Bayesian Markov chain Monte Carlo (MCMC) methods were compared. The methods were tested for two gauging stations in Slovenia that exhibit significantly increasing trends in annual maximum (AM) discharge series. The comparison of the models suggests that the stationary model tends to underestimate flood quantiles relative to the non-stationary models in recent years. The model with annual precipitation as a covariate exhibits the best goodness-of-fit performance. For a 10% increase in annual precipitation, the 10-year flood increases by 8%. Use of the model for design purposes requires scenarios of future annual precipitation. It is argued that these may be obtained more reliably than scenarios of extreme event precipitation, which makes the proposed model more practically useful than alternative models.
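A non-stationary GEV with a covariate-dependent location parameter, as in the best-performing model above, can be fitted by maximum likelihood in a few lines. The data, covariate, and starting values below are synthetic stand-ins, not the Slovenian gauging records:

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(7)
years = np.arange(1960, 2015)
# Standardised covariate, playing the role of annual precipitation
cov = (years - years.mean()) / years.std()
# Synthetic annual-maximum discharge with a trend in the GEV location
am = stats.genextreme.rvs(c=-0.1, loc=100 + 15 * cov, scale=20,
                          random_state=rng)

def nll(theta):
    """Negative log-likelihood of a GEV whose location depends linearly
    on the covariate: loc = b0 + b1 * cov (shape and scale constant)."""
    b0, b1, log_scale, c = theta
    return -np.sum(stats.genextreme.logpdf(
        am, c=c, loc=b0 + b1 * cov, scale=np.exp(log_scale)))

res = optimize.minimize(nll, x0=[am.mean(), 0.0, np.log(am.std()), -0.1],
                        method="Nelder-Mead")
b0, b1, log_scale, c = res.x
# 10-year flood for the latest covariate value: the 0.9 quantile of the
# fitted conditional GEV (this is what the stationary model underestimates)
q10 = stats.genextreme.ppf(0.9, c=c, loc=b0 + b1 * cov[-1],
                           scale=np.exp(log_scale))
```

Feeding a precipitation scenario into `cov` then yields the design-quantile scenarios discussed in the abstract.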

  18. Analysis of core damage frequency due to external events at the DOE (Department of Energy) N-Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lambright, J.A.; Bohn, M.P.; Daniel, S.L. (Sandia National Labs., Albuquerque, NM (USA)); Baxter, J.T. (Westinghouse Hanford Co., Richland, WA (USA)); Johnson, J.J.; Ravindra, M.K.; Hashimoto, P.O.; Mraz, M.J.; Tong, W.H.; Conoscente, J.P. (EQE, Inc., San Francisco, CA (USA)); Brosseau, D.A. (ERCE, Inc., Albuquerque, NM (USA))

    1990-11-01

    A complete external events probabilistic risk assessment has been performed for the N-Reactor power plant, making full use of all insights gained during the past ten years' developments in risk assessment methodologies. A detailed screening analysis was performed which showed that all external events had negligible contribution to core damage frequency except fires, seismic events, and external flooding. A limited scope analysis of the external flooding risk indicated that it is not a major risk contributor. Detailed analyses of the fire and seismic risks resulted in total (mean) core damage frequencies of 1.96E-5 and 4.60E-05 per reactor year, respectively. Detailed uncertainty analyses were performed for both fire and seismic risks. These results show that the core damage frequency profile for these events is comparable to that found for existing commercial power plants if proposed fixes are completed as part of the restart program. 108 refs., 85 figs., 80 tabs.

  19. Threshold effects in catchment storm response and the occurrence and magnitude of flood events: implications for flood frequency

    Science.gov (United States)

    Kusumastuti, D. I.; Struthers, I.; Sivapalan, M.; Reynolds, D. A.

    2007-08-01

    The aim of this paper is to illustrate the effects of selected catchment storage thresholds upon runoff behaviour, and specifically their impact upon flood frequency. The analysis is carried out with the use of a stochastic rainfall model, incorporating rainfall variability at intra-event, inter-event and seasonal timescales, as well as infrequent summer tropical cyclones, coupled with deterministic rainfall-runoff models that incorporate runoff generation by both saturation excess and subsurface stormflow mechanisms. Changing runoff generation mechanisms (i.e. from subsurface flow to surface runoff) associated with a given threshold (i.e. saturation storage capacity) is shown to be manifested in the flood frequency curve as a break in slope. It is observed that the inclusion of infrequent summer storm events increases the temporal frequency occurrence and magnitude of surface runoff events, in this way contributing to steeper flood frequency curves, and an additional break in the slope of the flood frequency curve. The results of this study highlight the importance of thresholds on flood frequency, and provide insights into the complex interactions between rainfall variability and threshold nonlinearities in the rainfall-runoff process, which are shown to have a significant impact on the resulting flood frequency curves.
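The break in the flood frequency curve at a storage threshold can be illustrated with a toy bucket model; the rainfall statistics, storage capacity, and drainage coefficient below are invented for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
n_years, days = 200, 365
storage_cap = 60.0        # saturation storage threshold (mm), hypothetical
k = 0.05                  # subsurface drainage coefficient per day

annual_max = np.empty(n_years)
for y in range(n_years):
    # Stochastic daily rainfall: ~30% wet days, exponential depths (mm)
    wet = rng.random(days) < 0.3
    rain = np.where(wet, rng.exponential(12.0, days), 0.0)
    s, qmax = 0.0, 0.0
    for r in rain:
        s += r
        sat_excess = max(0.0, s - storage_cap)  # fast surface runoff
        s -= sat_excess                          # above the threshold
        subsurface = k * s                       # slow subsurface stormflow
        s -= subsurface
        qmax = max(qmax, sat_excess + subsurface)
    annual_max[y] = qmax

# Flood frequency curve: sorted annual maxima vs Weibull return period
am_sorted = np.sort(annual_max)[::-1]
return_period = (n_years + 1) / (np.arange(n_years) + 1)
```

Plotting `am_sorted` against `return_period` shows the slope break: years dominated by subsurface flow plot low, while years in which the storage threshold is exceeded switch to the much steeper surface-runoff branch.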

  20. Role of dynamical injection locking and characteristic pulse events for low frequency fluctuations in semiconductor lasers

    Science.gov (United States)

    Hicke, K.; Brunner, D.; Soriano, M. C.; Fischer, I.

    2017-11-01

    We investigate the dynamics of semiconductor lasers subject to time-delayed optical feedback from the perspective of dynamical self-injection locking. Based on the Lang-Kobayashi model, we perform an analysis of the well-known Low Frequency Fluctuations (LFFs) in the frequency-intensity plane. Moreover, we investigate a recently found dynamical regime of fragmented LFFs by means of a locking-range analysis, spectral comparison and precursor pulse identification. We show that LFF dynamics can be explained by dynamical optical injection locking due to the delayed optical feedback. Moreover, the fragmented LFFs occur due to a re-injection locking induced by a particular optical pulse structure in the chaotic feedback dynamics. This is corroborated by experiments with a semiconductor laser experiencing delayed feedback from an optical fiber loop. The dynamical nature of the feedback injection results in an eventual loss, but also possible regaining, of the locking, explaining the recently observed phenomenon of fragmented LFFs.

  1. Optical-frequency-comb based ultrasound sensor

    Science.gov (United States)

    Minamikawa, Takeo; Ogura, Takashi; Masuoka, Takashi; Hase, Eiji; Nakajima, Yoshiaki; Yamaoka, Yoshihisa; Minoshima, Kaoru; Yasui, Takeshi

    2017-03-01

    Photo-acoustic imaging is a promising modality for deep-tissue imaging with high spatial resolution in biology and medicine. Its high penetration depth and spatial resolution combine the advantages of optical and ultrasound imaging: a tightly focused beam confines the ultrasound-generating region to the micrometer scale, and the ultrasound can propagate through tissue without significant energy loss. To enhance the detection sensitivity and penetration depth of photo-acoustic imaging, a highly sensitive ultrasound detector is greatly desired. In this study, we proposed a novel ultrasound detector employing an optical frequency comb (OFC) cavity. Ultrasound generated by the excitation of a tightly focused laser beam onto a sample was sensed with a part of the OFC cavity, being encoded into the OFC. The spectrally encoded OFC was converted to radio frequency by the frequency-link nature of the OFC, so the ultrasound-encoded radio frequency can be directly measured with a high-speed photodetector. We constructed an OFC cavity for ultrasound sensing with a ring-cavity erbium-doped fiber laser and provided a proof-of-principle demonstration of the detection of ultrasound generated by a transducer operating at 10 MHz. Our proposed approach will serve as a unique and powerful tool for detecting ultrasound for photo-acoustic imaging in the future.

  2. Towards a unified understanding of event-related changes in the EEG: the firefly model of synchronization through cross-frequency phase modulation.

    Directory of Open Access Journals (Sweden)

    Adrian P Burgess

    Full Text Available Although event-related potentials (ERPs) are widely used to study sensory, perceptual and cognitive processes, it remains unknown whether they are phase-locked signals superimposed upon the ongoing electroencephalogram (EEG) or result from phase-alignment of the EEG. Previous attempts to discriminate between these hypotheses have been unsuccessful but here a new test is presented based on the prediction that ERPs generated by phase-alignment will be associated with event-related changes in frequency whereas evoked-ERPs will not. Using empirical mode decomposition (EMD), which allows measurement of narrow-band changes in the EEG without predefining frequency bands, evidence was found for transient frequency slowing in recognition memory ERPs but not in simulated data derived from the evoked model. Furthermore, the timing of phase-alignment was frequency dependent with the earliest alignment occurring at high frequencies. Based on these findings, the Firefly model was developed, which proposes that both evoked and induced power changes derive from frequency-dependent phase-alignment of the ongoing EEG. Simulated data derived from the Firefly model provided a close match with empirical data and the model was able to account for (i) the shape and timing of ERPs at different scalp sites, (ii) the event-related desynchronization in alpha and synchronization in theta, and (iii) changes in the power density spectrum from the pre-stimulus baseline to the post-stimulus period. The Firefly Model, therefore, provides not only a unifying account of event-related changes in the EEG but also a possible mechanism for cross-frequency information processing.

  3. Towards a unified understanding of event-related changes in the EEG: the firefly model of synchronization through cross-frequency phase modulation.

    Science.gov (United States)

    Burgess, Adrian P

    2012-01-01

    Although event-related potentials (ERPs) are widely used to study sensory, perceptual and cognitive processes, it remains unknown whether they are phase-locked signals superimposed upon the ongoing electroencephalogram (EEG) or result from phase-alignment of the EEG. Previous attempts to discriminate between these hypotheses have been unsuccessful but here a new test is presented based on the prediction that ERPs generated by phase-alignment will be associated with event-related changes in frequency whereas evoked-ERPs will not. Using empirical mode decomposition (EMD), which allows measurement of narrow-band changes in the EEG without predefining frequency bands, evidence was found for transient frequency slowing in recognition memory ERPs but not in simulated data derived from the evoked model. Furthermore, the timing of phase-alignment was frequency dependent with the earliest alignment occurring at high frequencies. Based on these findings, the Firefly model was developed, which proposes that both evoked and induced power changes derive from frequency-dependent phase-alignment of the ongoing EEG. Simulated data derived from the Firefly model provided a close match with empirical data and the model was able to account for i) the shape and timing of ERPs at different scalp sites, ii) the event-related desynchronization in alpha and synchronization in theta, and iii) changes in the power density spectrum from the pre-stimulus baseline to the post-stimulus period. The Firefly Model, therefore, provides not only a unifying account of event-related changes in the EEG but also a possible mechanism for cross-frequency information processing.
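The phase-alignment hypothesis tested here is easy to demonstrate numerically: averaging trials whose ongoing rhythm has random phases yields a flat ERP, while resetting the phase at stimulus onset produces an ERP with no evoked component added at all. A minimal sketch (all parameters illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
fs, n_trials, n = 250, 200, 250        # 1-s epochs sampled at 250 Hz
t = np.arange(n) / fs
stim = 100                             # stimulus onset sample

# Ongoing 10 Hz "alpha" with a random phase on every trial
phase = rng.uniform(0, 2 * np.pi, n_trials)
trials = np.sin(2 * np.pi * 10 * t[None, :] + phase[:, None])

# Phase-alignment model: after the stimulus the ongoing rhythm is reset
# to a common phase on every trial, with no added evoked component
aligned = trials.copy()
aligned[:, stim:] = np.sin(2 * np.pi * 10 * (t[stim:] - t[stim]))[None, :]

erp_pre = trials.mean(axis=0)          # random phases average to ~0
erp_post = aligned.mean(axis=0)        # alignment alone produces an "ERP"
```

The post-stimulus average shows an oscillatory deflection despite no energy having been added to any single trial, which is exactly why the frequency-change signature, not the averaged waveform, is needed to discriminate the two models.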

  4. DNA based Frequency Selective Electromagnetic Interference Shielding (Preprint)

    Science.gov (United States)

    2017-11-03

    AFRL-RX-WP-JA-2017-0495: DNA-Based Frequency Selective Electromagnetic Interference Shielding (Preprint). Fahima Ouchen, Eric Kreit, et al. Interim report, 24 January 2014 – 30 September 2017.

  5. An event-based account of conformity.

    Science.gov (United States)

    Kim, Diana; Hommel, Bernhard

    2015-04-01

    People often change their behavior and beliefs when confronted with deviating behavior and beliefs of others, but the mechanisms underlying such phenomena of conformity are not well understood. Here we suggest that people cognitively represent their own actions and others' actions in comparable ways (theory of event coding), so that they may fail to distinguish these two categories of actions. If so, other people's actions that have no social meaning should induce conformity effects, especially if those actions are similar to one's own actions. We found that female participants adjusted their manual judgments of the beauty of female faces in the direction consistent with distracting information without any social meaning (numbers falling within the range of the judgment scale) and that this effect was enhanced when the distracting information was presented in movies showing the actual manual decision-making acts. These results confirm that similarity between an observed action and one's own action matters. We also found that the magnitude of the standard conformity effect was statistically equivalent to the movie-induced effect. © The Author(s) 2015.

  6. An Impact-Based Frequency Up-Converting Hybrid Vibration Energy Harvester for Low Frequency Application

    Directory of Open Access Journals (Sweden)

    Zhenlong Xu

    2017-11-01

    Full Text Available In this paper, a novel impact-based frequency up-converting hybrid energy harvester (FUCHEH) was proposed. It consisted of a piezoelectric cantilever beam and a driving beam with a magnetic tip mass. A solenoid coil was attached at the end of the piezoelectric beam. This innovative configuration amplified the relative motion velocity between magnet and coil, resulting in an enhancement of the induced electromotive force in the coil. An electromechanical coupling model was developed and a numerical simulation was performed to study the principle of impact-based frequency up-conversion. A prototype was fabricated and experimentally tested, and time-domain and frequency-domain analyses were performed. Fast Fourier transform (FFT) analysis verified that the fundamental frequencies and the coupled vibration frequency contribute most of the output voltage. The measured maximum output power was 769.13 µW at a frequency of 13 Hz and an acceleration amplitude of 1 m/s², which was 3249.4% and 100.6% larger than that of the frequency up-converting piezoelectric energy harvester (FUCPEH) and the frequency up-converting electromagnetic energy harvester (FUCEMEH), respectively. The root mean square (RMS) voltage of the piezoelectric energy harvester subsystem (0.919 V) was more than 16 times that of the stand-alone PEH (0.055 V). This paper provides a new scheme for improving the generating performance of vibration energy harvesters with high resonant frequencies operating in low-frequency vibration environments.
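The FFT and RMS analyses reported above follow a standard recipe; the sketch below uses an invented two-component voltage signal (13 Hz drive plus a higher coupled-vibration component), not the measured harvester data:

```python
import numpy as np

fs, dur = 1000.0, 2.0                      # sampling rate (Hz), duration (s)
t = np.arange(int(fs * dur)) / fs
# Hypothetical harvester output: 13 Hz drive plus a coupled-vibration term
v = 0.9 * np.sin(2 * np.pi * 13 * t) + 0.2 * np.sin(2 * np.pi * 47 * t)

# One-sided amplitude spectrum; the dominant bin is the fundamental
spec = np.abs(np.fft.rfft(v)) / len(v)
freqs = np.fft.rfftfreq(len(v), d=1 / fs)
f_peak = freqs[np.argmax(spec)]

# RMS voltage, the figure of merit quoted for the PEH subsystem
v_rms = np.sqrt(np.mean(v ** 2))
```

With a 2-s record the bin spacing is 0.5 Hz, so both components fall exactly on FFT bins and the peak identifies the 13 Hz fundamental without leakage.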

  7. Self-reported frequency and impact of hypoglycaemic events in insulin-treated diabetic patients in Austria.

    Science.gov (United States)

    Weitgasser, Raimund; Lopes, Sandra

    2015-01-01

    Hypoglycaemia is a common side effect of insulin therapy and presents a barrier to diabetes management; however, limited data exist on the real-world frequency of events. We investigated the self-reported rates of non-severe and severe hypoglycaemic events in Austria. We also explored hypoglycaemia awareness, patient-physician communication, and the health-related and economic impact of events. People with Type-1 or insulin-treated Type-2 diabetes > 15 years of age completed up to 4 questionnaires (weekly intervals). Non-severe hypoglycaemic events were defined as requiring no assistance, while severe hypoglycaemic events required help from a third party. Overall, 553 respondents (40 % Type-1, 60 % Type-2) enrolled, providing a total of 1,773 patient-weeks. The mean annual non-severe event frequencies were 85 for Type-1 and 15-28 for Type-2 (depending on insulin regimen). Among respondents who experienced ≥ 1 non-severe event in the study period, annual rates were 18 % higher in Type-1 and 77 % higher in Type-2. The proportion of respondents reporting 'awareness' of hypoglycaemic symptoms was 48 % for Type-1 and 43-61 % for Type-2 respondents. The proportion of respondents who rarely/never inform their physician of hypoglycaemic events was 67 % (Type-1) and 43-53 % (Type-2). The most commonly reported health-related impacts were tiredness/fatigue (58 % of events) and reduced alertness (41 % of events). Non-severe hypoglycaemic events are common in Type-1 and insulin-treated Type-2 diabetes patients in Austria. There may be subgroups of patients who are predisposed to higher rates of non-severe events. Even non-severe events have a negative impact on physical and emotional well-being.

  8. Frequency, Magnitude, and Possible Causes of Stranding and Mass-Mortality Events of the Beach Clam Tivela mactroides (Bivalvia: Veneridae).

    Directory of Open Access Journals (Sweden)

    Alexander Turra

    Full Text Available Stranding combined with mass-mortality events of sandy-beach organisms is a frequent but little-understood phenomenon, which is generally studied based on discrete episodes. The frequency, magnitude, and possible causes of stranding and mass-mortality events of the trigonal clam Tivela mactroides were assessed based on censuses of stranded individuals, every four days from September 2007 through December 2008, in Caraguatatuba Bay, southeastern Brazil. Stranded clams were classified as dying (closed valves did not open when forced) or dead (closed valves were easily opened). Information on wave parameters and the living intertidal clam population was used to assess possible causes of stranding. This fine-scale monitoring showed that stranding occurred widely along the shore and year-round, with peaks interspersed with periods of low or no mortality. Dead clams showed a higher mean density than dying individuals, but a lower mean shell length, attributed to a higher tolerance to desiccation of larger individuals. Wave height had a significant negative relationship to the density of dying individuals, presumed to be due to the accretive nature of low-energy waves: when digging out, clams would be more prone to be carried upward and unable to return, while larger waves, breaking farther from the beach and with a stronger backwash, would prevent stranding in the uppermost areas. This ecological finding highlights the need for refined temporal studies on mortality events, in order to understand them more clearly. Lastly, the similar size structure of stranded clams and the living population indicated that the stranded individuals are from the intertidal or shallow subtidal zone, and reinforces the ecological and behavioral components of this process, which have important ecological and socioeconomic implications for the management of this population.

  9. Laser frequency-offset locking based on the frequency modulation spectroscopy with higher harmonic detection

    Science.gov (United States)

    Wang, Anqi; Meng, Zhixin; Feng, Yanying

    2017-10-01

    We design a fiber electro-optic modulator (FEOM)-based laser frequency-offset locking system using frequency modulation spectroscopy (FMS) with 3F (third-harmonic) modulation. The modulation signal and the frequency-offset control signal are simultaneously loaded onto the FEOM by a mixer in order to suppress the frequency and power jitter caused by internal modulation of the current or the piezoelectric ceramic transducer (PZT). The scheme is expected to accomplish fast locking, a widely tunable frequency offset, and sensitive, rapid detection of narrow spectral features with the 3F modulation. The laser frequency fluctuation is limited to ±1 MHz and its overlapping Allan deviation is around 10^-12 over twenty minutes, which successfully meets the requirements of the cold atom interferometer.
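    The abstract above quantifies lock stability with an overlapping Allan deviation. As a minimal sketch of how that statistic is computed from time-error (phase) samples (the function name and NumPy implementation are my own; this is not the authors' code):

```python
import numpy as np

def overlapping_adev(phase, tau0, m):
    """Overlapping Allan deviation at averaging time tau = m * tau0,
    computed from time-error (phase) samples spaced tau0 seconds apart."""
    x = np.asarray(phase, dtype=float)
    n = len(x)
    # overlapping second differences of the phase at stride m
    d2 = x[2 * m:] - 2.0 * x[m:n - m] + x[:n - 2 * m]
    avar = np.sum(d2 ** 2) / (2.0 * (m * tau0) ** 2 * (n - 2 * m))
    return np.sqrt(avar)
```

    A useful sanity check: a constant frequency offset (linear phase) gives zero deviation, and a linear frequency drift of rate a gives a deviation of a·tau/sqrt(2).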

  10. Extreme Weather Events and Their Relationship to Low Frequency Teleconnection Patterns

    Science.gov (United States)

    Chang, Yehui; Schubert, Siegfried

    2002-01-01

    A new method for identifying the structure and other characteristics of extreme weather events is introduced and applied to both model simulations and observations. The approach is based on a linear regression model that links daily extreme precipitation amounts at a particular point on the globe to precipitation and related quantities at all other points. We present here some initial results of our analysis of extreme precipitation events over the United States, including how they are influenced by ENSO and various large-scale teleconnection patterns such as the PNA. The results are based on simulations made with the NASA/NCAR AGCM (Lin and Rood 1996). The quality of the simulated climate for the NASA/NCAR AGCM forced with observed SSTs is described in Chang et al. (2001). The runs analyzed here consist of three 20-year runs forced with idealized cold, neutral and warm ENSO SST anomalies (superimposed on the mean seasonal cycle of SST). The idealized warm or cold SST anomalies are fixed throughout each 20-year simulation and consist of the first EOF (±3 standard deviations) of monthly SST data. Comparisons are made with the results obtained from a similar analysis that uses daily NOAA precipitation observations (Higgins et al. 1996) over the United States and NCEP/NCAR reanalysis data for the period 1949-1998.
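    The regression setup used in several of these studies (e.g., heavy-rain-day counts regressed on the PNAI and SOI indices in the Hawaii work above) can be sketched in a few lines. The data here are synthetic, noiseless stand-ins with a made-up linear relation, purely to illustrate an ordinary-least-squares multiple linear regression (MLR):

```python
import numpy as np

rng = np.random.default_rng(0)
n_seasons = 48
# hypothetical standardized climate indices, one value per wet season
pnai = rng.standard_normal(n_seasons)
soi = rng.standard_normal(n_seasons)
# synthetic heavy-rain-day counts: fewer events with positive PNAI, more with positive SOI
counts = 12.0 - 2.0 * pnai + 3.0 * soi

# design matrix with an intercept column, fit by ordinary least squares
X = np.column_stack([np.ones(n_seasons), pnai, soi])
coef, *_ = np.linalg.lstsq(X, counts, rcond=None)
```

    With noiseless synthetic data the fit recovers the intercept and the two slopes exactly; with real counts one would add cross validation, as the abstracts describe.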

  11. Framework for Modeling High-Impact, Low-Frequency Power Grid Events to Support Risk-Informed Decisions

    Energy Technology Data Exchange (ETDEWEB)

    Veeramany, Arun; Unwin, Stephen D.; Coles, Garill A.; Dagle, Jeffery E.; Millard, W. David; Yao, Juan; Glantz, Clifford S.; Gourisetti, Sri Nikhil Gup

    2015-12-03

    Natural and man-made hazardous events resulting in loss of grid infrastructure assets challenge the electric power grid’s security and resilience. However, the planning and allocation of appropriate contingency resources for such events requires an understanding of their likelihood and the extent of their potential impact. Where these events are of low likelihood, a risk-informed perspective on planning can be problematic, as there exists an insufficient statistical basis to directly estimate the probabilities and consequences of their occurrence. Since risk-informed decisions rely on such knowledge, a basis for modeling the risk associated with high-impact, low-frequency events (HILFs) is essential. Insights from such a model can inform where resources are most rationally and effectively expended. The present effort is focused on the development of a HILF risk assessment framework. Such a framework is intended to provide the conceptual and overarching technical basis for the development of HILF risk models that can inform decision makers across numerous stakeholder sectors. The North American Electric Reliability Corporation (NERC) 2014 Standard TPL-001-4 considers severe events for transmission reliability planning, but does not address events of such severity that they have the potential to fail a substantial fraction of grid assets over a region, such as geomagnetic disturbances (GMD), extreme seismic events, and coordinated cyber-physical attacks. These are beyond current planning guidelines. As noted, the risks associated with such events cannot be statistically estimated based on historic experience; however, there does exist a stable of risk modeling techniques for rare events that have proven of value across a wide range of engineering application domains. There is an active and growing interest in evaluating the value of risk management techniques in the State transmission planning and emergency response communities, some of this interest in the context of

  12. Towards the Realization of Graphene Based Flexible Radio Frequency Receiver

    Directory of Open Access Journals (Sweden)

    Maruthi N. Yogeesh

    2015-11-01

    Full Text Available We report on our progress in the development of high-speed flexible graphene field effect transistors (GFETs) with high electron and hole mobilities (~3000 cm²/V·s) and intrinsic transit frequencies in the microwave (GHz) regime. We also describe the design and fabrication of a flexible graphene-based radio frequency system. This RF communication system consists of a graphite patch antenna at 2.4 GHz, graphene-based frequency translation blocks (a frequency doubler and an AM demodulator), and a graphene speaker. The communication blocks are utilized to demonstrate a graphene-based amplitude modulated (AM) radio receiver operating at 2.4 GHz.

  13. Event-based Simulation Model for Quantum Optics Experiments

    NARCIS (Netherlands)

    De Raedt, H.; Michielsen, K.; Jaeger, G; Khrennikov, A; Schlosshauer, M; Weihs, G

    2011-01-01

    We present a corpuscular simulation model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory by generating detection events one-by-one. The event-based corpuscular model gives a unified

  14. Time-frequency analysis of chemosensory event-related potentials to characterize the cortical representation of odors in humans.

    Directory of Open Access Journals (Sweden)

    Caroline Huart

    Full Text Available BACKGROUND: The recording of olfactory and trigeminal chemosensory event-related potentials (ERPs) has been proposed as an objective and non-invasive technique to study the cortical processing of odors in humans. Until now, the responses have been characterized mainly using across-trial averaging in the time domain. Unfortunately, chemosensory ERPs, in particular olfactory ERPs, exhibit a relatively low signal-to-noise ratio. Hence, although the technique is increasingly used in basic research as well as in clinical practice to evaluate people suffering from olfactory disorders, its current clinical relevance remains very limited. Here, we used a time-frequency analysis based on the wavelet transform to reveal EEG responses that are not strictly phase-locked to the onset of the chemosensory stimulus. We hypothesized that this approach would significantly enhance the signal-to-noise ratio of the EEG responses to chemosensory stimulation because, as compared to conventional time-domain averaging, (1) it is less sensitive to temporal jitter and (2) it can reveal non-phase-locked EEG responses such as event-related synchronization and desynchronization. METHODOLOGY/PRINCIPAL FINDINGS: EEG responses to selective trigeminal and olfactory stimulation were recorded in 11 normosmic subjects. A Morlet wavelet was used to characterize the elicited responses in the time-frequency domain. We found that this approach markedly improved the signal-to-noise ratio of the obtained EEG responses, in particular following olfactory stimulation. Furthermore, the approach allowed characterizing non-phase-locked components that could not be identified using conventional time-domain averaging. CONCLUSION/SIGNIFICANCE: By providing a more robust and complete view of how odors are represented in the human brain, our approach could constitute the basis for a robust tool to study olfaction, both for basic research and clinicians.
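    A time-frequency decomposition of the kind described (complex Morlet wavelets applied to single trials, with power taken before averaging so that non-phase-locked activity survives) can be sketched with plain NumPy. The function name, the 6-cycle wavelet width, and the unit-energy normalization are illustrative assumptions, not the authors' exact pipeline:

```python
import numpy as np

def morlet_power(signal, fs, freqs, n_cycles=6):
    """Time-frequency power via convolution with complex Morlet wavelets.
    Assumes each wavelet is shorter than the signal."""
    sig = np.asarray(signal, dtype=float)
    power = np.empty((len(freqs), len(sig)))
    for k, f in enumerate(freqs):
        sigma_t = n_cycles / (2 * np.pi * f)            # wavelet width in time
        t = np.arange(-4 * sigma_t, 4 * sigma_t, 1 / fs)
        wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma_t**2))
        wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))  # unit energy
        analytic = np.convolve(sig, wavelet, mode="same")
        power[k] = np.abs(analytic) ** 2                # single-trial power
    return power
```

    Averaging these single-trial power maps across trials, rather than averaging the raw traces first, is what preserves induced (non-phase-locked) responses.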

  15. Frequency of adverse events and mortality in patients with pleural empyema in a public referral hospital in Mexico City.

    Science.gov (United States)

    Herrera-Kiengelher, L; Báez-Saldaña, R; Salas-Hernández, J; Avalos-Bracho, A; Pérez-Padilla, R; Torre-Bouscoulet, L

    2010-09-01

    Adverse events (AEs) that occur during medical treatment are a public health problem. Our aims were: 1) to measure the prevalence of AEs, 2) to characterize those that occur in patients diagnosed with empyema, and 3) to analyze the mortality rate associated with the presence of empyema. Retrospective case series based on a review of files of patients diagnosed with empyema. A total of 347 files were assessed, covering 96.6% of the total number of patients diagnosed with empyema in that period. There were 176 AEs reported for 150 of the patients. The frequency of at least one AE was 43%, with prolonged hospitalization being the most frequent condition. In these cases, 97% of the AEs were considered preventable. Intrahospital mortality was 4.8%, with age (HR for every 5 years 1.21, 95% CI 1.08-1.35) and diabetes mellitus (HR 2.26, 95% CI 1.0-5.0, P = 0.04) being significantly associated factors. There was a high frequency of AEs in patients with empyema, but most were considered preventable, especially the prolonged hospitalization, which could be reduced through timely surgery.

  16. Synchronization for space based ultra low frequency interferometry

    NARCIS (Netherlands)

    Rajan, R.T.; Bentum, Marinus Jan; Boonstra, A.J.

    2013-01-01

    Recently, there has been renewed interest in space-based arrays for low frequency observations. The Orbiting Low Frequency Antenna Array (OLFAR) is one such project, which investigates the feasibility of a space-based array of ≥ 10 satellites observing the cosmos in the 0.3 - 30 MHz spectrum. The OLFAR

  17. IBES: A Tool for Creating Instructions Based on Event Segmentation

    Directory of Open Access Journals (Sweden)

    Katharina Mura

    2013-12-01

    Full Text Available Receiving informative, well-structured, and well-designed instructions supports performance and memory in assembly tasks. We describe IBES, a tool with which users can quickly and easily create multimedia, step-by-step instructions by segmenting a video of a task into segments. In a validation study we demonstrate that the step-by-step structure of the visual instructions created by the tool corresponds to the natural event boundaries, which are assessed by event segmentation and are known to play an important role in memory processes. In one part of the study, twenty participants created instructions based on videos of two different scenarios using the proposed tool. In the other part of the study, ten and twelve participants, respectively, segmented videos of the same scenarios, yielding event boundaries for coarse and fine events. We found that the visual steps chosen by the participants for creating the instruction manual had corresponding events in the event segmentation. The number of instructional steps was a compromise between the number of fine and coarse events. Our interpretation of the results is that the tool picks up on the natural human event perception process of segmenting an ongoing activity into events and enables the convenient transfer into meaningful multimedia instructions for assembly tasks. We discuss practical applications of IBES, for example, creating manuals for differing expertise levels, and give suggestions for research on user-oriented instructional design based on this tool.

  18. Water based fluidic radio frequency metamaterials

    Science.gov (United States)

    Cai, Xiaobing; Zhao, Shaolin; Hu, Mingjun; Xiao, Junfeng; Zhang, Naibo; Yang, Jun

    2017-11-01

    Electromagnetic metamaterials offer great flexibility for wave manipulation and enable exceptional functionality design, ranging from negative refraction, anomalous reflection, super-resolution imaging and transformation optics to cloaking. However, demonstration of metamaterials with unprecedented functionalities is still challenging and costly due to the structural complexity or special material properties. Here, we demonstrate for the first time versatile fluidic radio frequency metamaterials with negative refraction, using a water-embedded and metal-coated 3D architecture. Effective medium analysis confirms that the metallic frames create an evanescent environment while the water cylinders simultaneously produce negative permeability under Mie resonance. The water-metal coupled 3D architectures and the accessory devices for measurement are fabricated by 3D printing with post electroless deposition. Our study also reveals the great potential of fluidic metamaterials and the versatility of the 3D printing process in rapid prototyping of customized metamaterials.

  19. Very Flat Optical Frequency Comb Generation based on Polarization Modulator and Recirculation Frequency Shifter

    Science.gov (United States)

    Li, Ruxing; Wu, Shibao; Ye, Shujian; Cui, Yaling

    2017-12-01

    We propose and experimentally demonstrate a novel scheme to generate a flat and stable optical frequency comb by using a recirculation frequency shifter (RFS) loop and a polarization-modulator-based single-sideband modulator (PSSBM). An ultra-flat 5-carrier comb generated by a polarization modulator serves as the seed light source of the recirculating loop, and the number of recirculations is greatly reduced compared with the regular RFS-based scheme. Theoretical analysis and simulation experiments show that a high-quality optical frequency comb with 50 spectral lines and a flatness fluctuation of less than 1 dB can be achieved.

  20. Power quality events recognition using a SVM-based method

    Energy Technology Data Exchange (ETDEWEB)

    Cerqueira, Augusto Santiago; Ferreira, Danton Diego; Ribeiro, Moises Vidal; Duque, Carlos Augusto [Department of Electrical Circuits, Federal University of Juiz de Fora, Campus Universitario, 36036 900, Juiz de Fora MG (Brazil)

    2008-09-15

    In this paper, a novel SVM-based method for power quality event classification is proposed. A simple approach for feature extraction is introduced, based on the subtraction of the fundamental component from the acquired voltage signal. The resulting signal is presented to a support vector machine for event classification. Results from simulation are presented and compared with two other methods, the OTFR and the LCEC. The proposed method showed improved performance at a reasonable computational cost. (author)
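    The feature-extraction step described above (subtracting the fundamental component from the acquired voltage) can be sketched as a least-squares fit of a sine/cosine pair at the line frequency; the residual then serves as the disturbance feature fed to the SVM. The function name and the 60 Hz line-frequency assumption are mine, not the paper's:

```python
import numpy as np

def remove_fundamental(v, fs, f0=60.0):
    """Estimate the fundamental (sine/cosine pair at f0) by least squares
    and subtract it, leaving the disturbance residual."""
    v = np.asarray(v, dtype=float)
    t = np.arange(len(v)) / fs
    basis = np.column_stack([np.sin(2 * np.pi * f0 * t),
                             np.cos(2 * np.pi * f0 * t)])
    coef, *_ = np.linalg.lstsq(basis, v, rcond=None)
    return v - basis @ coef   # residual: harmonics, transients, sags, etc.
```

    For a clean sinusoid the residual is essentially zero; any harmonic or transient content passes through untouched and can be summarized (e.g., by its RMS) as a classifier feature.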

  1. Do changes in the frequency, magnitude and timing of extreme climatic events threaten the population viability of coastal birds?

    NARCIS (Netherlands)

    van de Pol, Martijn; Ens, Bruno J.; Heg, Dik; Brouwer, Lyanne; Krol, Johan; Maier, Martin; Exo, Klaus-Michael; Oosterbeek, Kees; Lok, Tamar; Eising, Corine M.; Koffijberg, Kees

    1. Climate change encompasses changes in both the means and the extremes of climatic variables, but the population consequences of the latter are intrinsically difficult to study. 2. We investigated whether the frequency, magnitude and timing of rare but catastrophic flooding events have changed

  2. Assessing loss event frequencies of smart grid cyber threats: Encoding flexibility into FAIR using Bayesian network approach

    NARCIS (Netherlands)

    Le, Anhtuan; Chen, Yue; Chai, Kok Keong; Vasenev, Alexandr; Montoya, L.

    Assessing loss event frequencies (LEF) of smart grid cyber threats is essential for planning cost-effective countermeasures. Factor Analysis of Information Risk (FAIR) is a well-known framework that can be applied to consider threats in a structured manner by using look-up tables related to a

  3. Spatiotemporal Features for Asynchronous Event-based Data

    Directory of Open Access Journals (Sweden)

    Xavier Lagorce

    2015-02-01

    Full Text Available Bio-inspired asynchronous event-based vision sensors are currently introducing a paradigm shift in visual information processing. These new sensors rely on a stimulus-driven principle of light acquisition similar to biological retinas. They are event-driven and fully asynchronous, thereby reducing redundancy and encoding exact times of input signal changes, leading to a very precise temporal resolution. Approaches for higher-level computer vision often rely on the reliable detection of features in visual frames, but similar definitions of features for the novel dynamic and event-based visual input representation of silicon retinas have so far been lacking. This article addresses the problem of learning and recognizing features for event-based vision sensors, which capture properties of truly spatiotemporal volumes of sparse visual event information. A novel computational architecture for learning and encoding spatiotemporal features is introduced, based on a set of predictive recurrent reservoir networks competing via winner-take-all selection. Features are learned in an unsupervised manner from real-world input recorded with event-based vision sensors. It is shown that the networks in the architecture learn distinct and task-specific dynamic visual features, and can predict their trajectories over time.

  4. Prestress Force Identification for Externally Prestressed Concrete Beam Based on Frequency Equation and Measured Frequencies

    Directory of Open Access Journals (Sweden)

    Luning Shi

    2014-01-01

    Full Text Available A prestress force identification method for externally prestressed concrete uniform beams, based on the frequency equation and measured frequencies, is developed. To ensure identification accuracy, we first look for an appropriate method to solve the free vibration equation of an externally prestressed concrete beam and then combine the measured frequencies with the frequency equation to identify the prestress force. To obtain the exact solution of the free vibration equation of a multispan externally prestressed concrete beam, an analytical model is set up based on Bernoulli-Euler beam theory, and the functional relation between prestress variation and vibration displacement is established. The multispan externally prestressed concrete beam is treated as multiple single-span beams which must satisfy the bending moment and rotation angle boundary conditions; the free vibration equation is solved using the sublevel simultaneous method, and a semi-analytical solution which considers the influence of prestress on section rigidity and beam length is obtained. Taking a simply supported concrete beam and a two-span concrete beam with external tendons as examples, frequency function curves are obtained by substituting the measured frequencies, and the prestress force is identified from the abscissa of the cross point of the frequency functions. The identified value of the prestress force is in good agreement with the test results. The method can accurately identify the prestress force of externally prestressed concrete beams and trace the trend of the effective prestress force.
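    The identification idea above (reading the prestress force off a frequency-force relation at the measured frequency) can be illustrated with a deliberately simplified model, f(N) = f0·sqrt(1 - N/Ncr) for a beam under axial force, inverted numerically by bisection. This toy relation and the function name are illustrative assumptions, not the paper's semi-analytical frequency equation:

```python
import math

def identify_prestress(f_measured, f0, n_cr, tol=1e-6):
    """Recover the axial force N from a measured frequency, assuming the
    simplified compression-softening relation f(N) = f0 * sqrt(1 - N/Ncr)."""
    def g(n):
        # positive while the model frequency is still above the measurement
        return f0 * math.sqrt(max(0.0, 1.0 - n / n_cr)) - f_measured

    a, b = 0.0, n_cr          # g(a) > 0 >= g(b) since f decreases with N
    while b - a > tol:
        mid = 0.5 * (a + b)
        if g(mid) > 0:
            a = mid           # force still too small: frequency too high
        else:
            b = mid
    return 0.5 * (a + b)
```

    In the paper the same inversion is done graphically, as the abscissa of the cross point between the frequency-function curve and the measured frequency.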

  5. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes the input in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs.

  6. Long-range vibration sensor based on correlation analysis of optical frequency-domain reflectometry signals.

    Science.gov (United States)

    Ding, Zhenyang; Yao, X Steve; Liu, Tiegen; Du, Yang; Liu, Kun; Han, Qun; Meng, Zhuo; Chen, Hongxin

    2012-12-17

    We present a novel method to achieve a space-resolved, long-range vibration detection system based on correlation analysis of optical frequency-domain reflectometry (OFDR) signals. By performing two separate measurements of the vibrated and non-vibrated states on a test fiber, the vibration frequency and position of a vibration event can be obtained by analyzing the cross-correlation between beat signals of the vibrated and non-vibrated states in the spatial domain, where the beat signals are generated from interference between local Rayleigh backscattering signals of the test fiber and the local light oscillator. Using the proposed technique, we constructed a standard single-mode-fiber-based vibration sensor with a dynamic range of 12 km and a measurable vibration frequency of up to 2 kHz with a spatial resolution of 5 m. Moreover, preliminary investigation results of two vibration events located at different positions along the test fiber are also reported.
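    The localization principle above (comparing local beat signals of the vibrated and non-vibrated states window by window) can be sketched with a windowed normalized cross-correlation: where the two traces stay correlated the fiber was undisturbed, and the window where similarity collapses marks the vibration position. The signals below are synthetic Gaussian stand-ins, not real OFDR data:

```python
import numpy as np

def windowed_similarity(ref, vib, win):
    """Normalized cross-correlation of non-overlapping windows of two traces.
    Values near 1 mean undisturbed fiber; a dip marks the vibration event."""
    sims = []
    for start in range(0, len(ref) - win + 1, win):
        a = ref[start:start + win] - ref[start:start + win].mean()
        b = vib[start:start + win] - vib[start:start + win].mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        sims.append((a * b).sum() / denom if denom > 0 else 1.0)
    return np.array(sims)

# synthetic beat signals: identical except for a de-correlated segment
rng = np.random.default_rng(1)
ref = rng.standard_normal(1000)
vib = ref.copy()
vib[600:700] = rng.standard_normal(100)   # "vibrated" region of the fiber
sims = windowed_similarity(ref, vib, win=100)
```

    With 100-sample windows, the dip in `sims` lands in window 6, i.e., samples 600-699, recovering the disturbance position to within one window.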

  7. Frequency of Early Predialysis Nephrology Care and Postdialysis Cardiovascular Events.

    Science.gov (United States)

    Yang, Ju-Yeh; Huang, Jenq-Wen; Chen, Likwang; Chen, Yun-Yi; Pai, Mei-Fen; Tung, Kuei-Ting; Peng, Yu-Sen; Hung, Kuan-Yu

    2017-08-01

    Patients with kidney failure are at a high risk for cardiovascular events. Predialysis nephrology care has been reported to improve postdialysis survival, but its effects on postdialysis major adverse cardiovascular events (MACEs) have not been comprehensively studied. Observational cohort study. We used data from the National Health Insurance Research Database in Taiwan. Adult patients who initiated maintenance dialysis therapy in 1999 to 2010 were enrolled. We created 3 subtypes of predialysis nephrology care based on the time between the first nephrology visit and the initiation of dialysis therapy: early frequent (duration ≥ 6 months; at least 1 nephrology visit every 3 months), early infrequent (duration ≥ 6 months; fewer than 1 nephrology visit every 3 months), and late (duration < 6 months), and examined the association of predialysis nephrology care with postdialysis 1-year MACEs. Among the 60,329 eligible patients, 24,477 (40.6%) had early frequent, 12,763 (21.2%) had early infrequent, and 23,089 (38.3%) had late nephrology care. Compared to the late-nephrology-care group, the early-frequent group was associated with an ~10% lower risk for 1-year MACEs (HR of 0.89 [95% CI, 0.82-0.96] for first MACE and relative risk of 0.91 [95% CI, 0.84-0.98] for recurrent MACEs). However, the early-infrequent-care group had risks for MACEs similar to the late group (HR of 0.95 [95% CI, 0.86-1.05] for first MACE and relative risk of 0.94 [95% CI, 0.86-1.02] for recurrent MACEs). A limitation is the lack of physical and biochemical information because of inherent limitations of administrative claims data. Early frequent nephrology care for 6 or more months before the initiation of long-term dialysis therapy may improve 1-year postdialysis major cardiovascular outcomes. Copyright © 2017 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.

  8. Event-based prospective memory performance in autism spectrum disorder.

    Science.gov (United States)

    Altgassen, Mareike; Schmitz-Hübsch, Maren; Kliegel, Matthias

    2010-03-01

    The purpose of the present study was to investigate event-based prospective memory performance in individuals with autism spectrum disorder and to explore possible relations between laboratory-based prospective memory performance and everyday performance. Nineteen children and adolescents with autism spectrum disorder and 19 matched neurotypical controls participated. The laboratory-based prospective memory test was embedded in a visuo-spatial working memory test and required participants to remember to respond to a cue-event. Everyday planning performance was assessed with proxy ratings. Although parents of the autism group rated their children's everyday performance as significantly poorer than controls' parents, no group differences were found in event-based prospective memory. Nevertheless, individual differences in laboratory-based and everyday performances were related. Clinical implications of these findings are discussed.

  9. Sequencing biological and physical events affects specific frequency bands within the human premotor cortex: an intracerebral EEG study.

    Directory of Open Access Journals (Sweden)

    Fausto Caruana

    Full Text Available Evidence that the human premotor cortex (PMC) is activated by cognitive functions involving the motor domain is classically explained as the reactivation of a motor program decoupled from its executive functions, and exploited for different purposes by means of a motor simulation. In contrast, the evidence that the PMC contributes to the sequencing of non-biological events cannot be explained by the simulationist theory. Here we investigated how motor simulation and event sequencing coexist within the PMC and how these mechanisms interact when both functions are executed. We asked patients with depth electrodes implanted in the PMC to passively observe a randomized arrangement of images depicting biological actions and physical events and, in a second block, to sequence them in the correct order. This task allowed us to disambiguate between the simple observation of actions, their sequencing (recruiting different motor simulation processes), and the sequencing of non-biological events (recruiting a sequencer mechanism not dependent on motor simulation). We analysed the response of the gamma, alpha and beta frequency bands to evaluate the contribution of each brain rhythm to the observation and sequencing of both biological and non-biological stimuli. We found that motor simulation (biological > physical) and event sequencing (sequencing > observation) differently affect the three investigated frequency bands: motor simulation was reflected in the gamma and, partially, the beta band, but not in the alpha band. In contrast, event sequencing was also reflected in the alpha band.

  10. Radio Frequency Based Water Level Monitor and Controller for ...

    African Journals Online (AJOL)

    This paper elucidates a radio frequency (RF) based transmission and reception system used to remotely monitor and control the water level of an overhead tank placed up to 100 meters away from the pump and controller. It uses two radio frequency transceivers, along with a controller, each installed at the overhead tank ...

  11. Do the frequencies of adverse events increase, decrease, or stay the same with long-term use of statins?

    Science.gov (United States)

    Huddy, Karlyn; Dhesi, Pavittarpaul; Thompson, Paul D

    2013-02-01

    Statins are widely used for their cholesterol-lowering properties and proven reduction of cardiovascular disease risk. Many patients take statins as long-term treatment for a variety of conditions without a clear-cut understanding of how treatment duration affects the frequency of adverse effects. We aimed to evaluate whether the frequencies of documented adverse events increase, decrease, or remain unchanged with long-term statin use. We reviewed the established literature to define the currently known adverse effects of statin therapy, including myopathy, central nervous system effects, and the appearance of diabetes, and the frequency of these events with long-term medication use. The frequency of adverse effects associated with long-term statin therapy appears to be low. Many patients who develop side effects from statin therapy do so relatively soon after initiation of therapy, so the frequency of side effects from statin therapy when expressed as a percentage of current users decreases over time. Nevertheless, patients may develop side effects such as muscle pain and weakness years after starting statin therapy; however, the absolute number of patients affected by statin myopathy increases with treatment duration. Also, clinical trials of statin therapy rarely exceed 5 years, so it is impossible to determine with certainty the frequency of long-term side effects with these drugs.

  12. Transformer-based design techniques for oscillators and frequency dividers

    CERN Document Server

    Luong, Howard Cam

    2016-01-01

    This book provides in-depth coverage of transformer-based design techniques that enable CMOS oscillators and frequency dividers to achieve state-of-the-art performance.  Design, optimization, and measured performance of oscillators and frequency dividers for different applications are discussed in detail, focusing on not only ultra-low supply voltage but also ultra-wide frequency tuning range and locking range.  This book will be an invaluable reference for anyone working or interested in CMOS radio-frequency or mm-Wave integrated circuits and systems.

  13. Frequency and Character of Extreme Aerosol Events in the Southwestern United States: A Case Study Analysis in Arizona

    Directory of Open Access Journals (Sweden)

    David H. Lopez

    2015-12-01

    Full Text Available This study uses more than a decade's worth of data across Arizona to characterize the spatiotemporal distribution, frequency, and source of extreme aerosol events, defined as when the concentration of a species on a particular day exceeds the average plus two standard deviations for that given month. Depending on which of the eight sites studied, between 5% and 7% of the total days exhibited an extreme aerosol event due to extreme levels of PM10, PM2.5, and/or fine soil. Grand Canyon exhibited the most extreme event days (120, i.e., 7% of its total days). Fine soil is the pollutant type that most frequently impacted multiple sites at once at an extreme level. PM10, PM2.5, fine soil, non-Asian dust, and elemental carbon extreme events occurred most frequently in August. Nearly all Asian dust extreme events occurred between March and June. Extreme elemental carbon events have decreased as a function of time with statistical significance, while the other pollutant categories did not show any significant change. Extreme events were most frequent for the various pollutant categories on either Wednesday or Thursday, but there was no statistically significant difference in the number of events on any particular day or on weekends versus weekdays.
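    The extreme-event definition used above (a day whose concentration exceeds that month's mean plus two standard deviations) is straightforward to express in code; a minimal NumPy sketch with a hypothetical helper name:

```python
import numpy as np

def extreme_event_days(values, months):
    """Flag days whose value exceeds that month's mean + 2 standard deviations.
    `values` are daily concentrations; `months` gives each day's month label."""
    values = np.asarray(values, dtype=float)
    months = np.asarray(months)
    flags = np.zeros(len(values), dtype=bool)
    for m in np.unique(months):
        sel = months == m
        thresh = values[sel].mean() + 2.0 * values[sel].std()
        flags[sel] = values[sel] > thresh
    return flags
```

    Grouping by calendar month before thresholding, as the study does, keeps the strong seasonal cycle of dust and smoke from dominating the flagged days.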

  14. SUBTLEX- AL: Albanian word frequencies based on film subtitles

    Directory of Open Access Journals (Sweden)

    Dr.Sc. Rrezarta Avdyli

    2013-06-01

    Full Text Available Recently, several studies have shown that word frequency estimates based on subtitle files explain the variance in word recognition performance better than traditional word frequency estimates do. The present study presents such a frequency estimate for Albanian, derived from more than 2M words of film subtitles. Our results show a high correlation between the RTs from a lexical decision study (120 stimuli) and SUBTLEX-AL, as well as a high correlation between SUBTLEX-AL and the only existing frequency list of the hundred most frequent Albanian words. These findings suggest that SUBTLEX-AL is a good frequency estimate; furthermore, it is the first Albanian frequency database larger than 100 words.
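
    The basic computation behind a subtitle-based frequency norm is a token count over the subtitle corpus, usually reported both raw and per million words. A minimal sketch (the function name and the simplified tokenizer are assumptions; SUBTLEX-style corpora involve much more careful cleaning):

    ```python
    import re
    from collections import Counter

    def subtitle_frequencies(lines):
        """Count word tokens in subtitle text (lowercased; the character class
        includes the Albanian letters e-diaeresis and c-cedilla) and return,
        per word, (raw count, frequency per million words)."""
        counts = Counter()
        for line in lines:
            counts.update(re.findall(r"[a-zëç]+", line.lower()))
        total = sum(counts.values())
        return {w: (n, n * 1_000_000 / total) for w, n in counts.items()}
    ```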

  15. Effects of the major sudden stratospheric warming event of 2009 on the subionospheric very low frequency/low frequency radio signals

    Science.gov (United States)

    Pal, S.; Hobara, Y.; Chakrabarti, S. K.; Schnoor, P. W.

    2017-07-01

    This paper presents effects of the major sudden stratospheric warming (SSW) event of 2009 on subionospheric very low frequency/low frequency (VLF/LF) radio signals propagating in the Earth-ionosphere waveguide. Signal amplitudes from four transmitters received by the VLF/LF radio networks of Germany and Japan during the major SSW event are investigated for possible anomalies and atmospheric influence on the high- to middle-latitude ionosphere. Significant anomalous increases or decreases of nighttime and daytime VLF/LF signal amplitudes by ˜3-5 dB during the SSW event have been found for all propagation paths, associated with the stratospheric temperature rise at the 10 hPa level. These daytime and nighttime amplitude changes are attributed to the modification of the lower ionospheric boundary conditions, in terms of electron density and electron-neutral collision frequency profiles, and to the associated modal interference effects between the different propagating waveguide modes during the SSW period. TIMED/SABER mission data are also used to investigate the upper mesospheric conditions over the VLF/LF propagation paths during the same time period. We observe a decrease in neutral temperature and an increase in pressure at heights of 75-80 km around the peak time of the event. The VLF/LF anomalies are correlated and in phase with the stratospheric temperature and mesospheric pressure variations, while the minimum of mesospheric cooling lags the maximum VLF/LF anomalies by 2-3 days. Simulations of VLF/LF diurnal variation are performed using the well-known Long Wavelength Propagation Capability (LWPC) code within the Earth-ionosphere waveguide to explain the VLF/LF anomalies qualitatively.

  16. Energy based correlation criteria in the mid-frequency range

    Science.gov (United States)

    Biedermann, J.; Winter, R.; Wandel, M.; Böswald, M.

    2017-07-01

    Aircraft structures are characterized by their lightweight design. As such, they are prone to vibrations. Numerical models based on the Finite Element Method often show significant deviations when the mid-frequency range is considered, where strong interaction between vibrations and acoustics is present. Model validation based on experimental modal data is often not possible due to the high modal density that aircraft fuselage structures exhibit in this frequency range. Classical correlation criteria like the Modal Assurance Criterion require mode shapes and can therefore not be applied. Other correlation criteria using frequency response data, such as the Frequency Domain Assurance Criterion, are highly sensitive to even small structural modifications and fail to indicate the correlation between test and analysis data in the mid-frequency range. Nevertheless, validated numerical models for the mid- to high-frequency ranges are a prerequisite for acoustic comfort predictions of the aircraft cabin. This paper presents a new method for the correlation of response data from test and analysis, to support model validation and enable the use of finite element models in the mid-frequency range. The method is validated on a stiffened cylindrical shell structure, which represents a scale model of an aircraft fuselage. The correlation criterion presented here is inspired by Statistical Energy Analysis and is based on kinetic energies integrated over frequency bands and spatially integrated over surface areas of the structure. The objective is to indicate frequency bands where the finite element model needs to be adjusted to better match the experimental observations, and to locate the areas where these adjustments should be applied.
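
    The idea of comparing band- and area-integrated energies rather than mode shapes can be sketched as below. This is a loose illustration of the SEA-inspired principle, not the paper's criterion: the function names, the unit-mass kinetic-energy proxy, and the simple min/max ratio comparison are all assumptions.

    ```python
    def band_energies(velocity_spectrum, bands, df):
        """Kinetic-energy-like quantity per frequency band for one surface area:
        |v|^2 summed over the spectral lines in each band (unit mass assumed).
        velocity_spectrum: list of (frequency, velocity amplitude) pairs."""
        energies = []
        for f_lo, f_hi in bands:
            e = sum(abs(v) ** 2 * df
                    for f, v in velocity_spectrum if f_lo <= f < f_hi)
            energies.append(e)
        return energies

    def energy_correlation(e_test, e_fem):
        """Band-wise ratio criterion: 1.0 means the test and FE model agree
        on the energy in that band; small values flag bands to adjust."""
        return [min(a, b) / max(a, b) if max(a, b) > 0 else 1.0
                for a, b in zip(e_test, e_fem)]
    ```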

  17. A Kalman-based Fundamental Frequency Estimation Algorithm

    DEFF Research Database (Denmark)

    Shi, Liming; Nielsen, Jesper Kjær; Jensen, Jesper Rindom

    2017-01-01

    Fundamental frequency estimation is an important task in speech and audio analysis. Harmonic model-based methods typically have superior estimation accuracy. However, such methods usually assume that the fundamental frequency and amplitudes are stationary over a short time frame. In this paper, we propose a Kalman filter-based fundamental frequency estimation algorithm using the harmonic model, where the fundamental frequency and amplitudes can be truly nonstationary by modeling their time variations as first-order Markov chains. The Kalman observation equation is derived from the harmonic model and formulated as a compact nonlinear matrix form, which is further used to derive an extended Kalman filter. Detailed and continuous fundamental frequency and amplitude estimates for speech, the sustained vowel /a/ and solo musical tones with vibrato are demonstrated.
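
    The first-order Markov (random-walk) modeling of pitch variation can be illustrated with a much-simplified scalar Kalman filter that smooths noisy frame-wise pitch observations. This is only a sketch of the state-space idea; the function name, gains, and noise variances are assumptions, and the paper's actual method is an extended Kalman filter on the full nonlinear harmonic model, not this scalar filter.

    ```python
    def kalman_track_pitch(observations, q=1.0, r=25.0, f0=100.0, p0=100.0):
        """Scalar Kalman filter: the fundamental frequency follows a random
        walk (first-order Markov chain) with process variance q; observations
        are noisy pitch measurements with variance r (assumed values)."""
        f, p, track = f0, p0, []
        for z in observations:
            p = p + q                 # predict: random-walk state model
            k = p / (p + r)           # Kalman gain
            f = f + k * (z - f)       # update with the innovation
            p = (1 - k) * p
            track.append(f)
        return track
    ```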

  18. Linear encoder based low frequency inertial sensor

    Directory of Open Access Journals (Sweden)

    Collette Christophe

    2015-01-01

    Full Text Available For many applications, there is an increasing demand for low-cost, high-resolution inertial sensors that are capable of operating in harsh environments. Recently, a prototype of a small optical inertial sensor was built using a Michelson interferometer. A resolution of 3 pm/√Hz has been obtained above 4 Hz using only low-cost components. Compared to most state-of-the-art devices, this prototype did not contain any coil, which offers several important advantages, including low thermal noise in the suspension and full compatibility with magnetic environments (like particle colliders). On the other hand, the Michelson interferometer is known to be tricky to tune, especially when one attempts to miniaturize the sensor. In this paper, we propose a novel concept of inertial sensor based on a linear encoder. Compared to the Michelson interferometer, the encoder is much easier to mount, and its calibration is more stable. The price to pay is a reduced resolution. In order to overcome this limitation, we mechanically amplify the relative motion between the support and the inertial mass. First results obtained with the new sensor are discussed and compared with the Michelson inertial sensor.

  19. Event-based prospective memory performance in autism spectrum disorder

    NARCIS (Netherlands)

    Altgassen, A.M.; Schmitz-Hübsch, M.; Kliegel, M.

    2010-01-01

    The purpose of the present study was to investigate event-based prospective memory performance in individuals with autism spectrum disorder and to explore possible relations between laboratory-based prospective memory performance and everyday performance. Nineteen children and adolescents with autism spectrum disorder and 19 matched neurotypical controls participated.

  20. Lung Cancer Screening CT-Based Prediction of Cardiovascular Events

    NARCIS (Netherlands)

    Mets, Onno M.; Vliegenthart, Rozemarijn; Gondrie, Martijn J.; Viergever, Max A.; Oudkerk, Matthijs; de Koning, Harry J.; Mali, Willem P. Th M.; Prokop, Mathias; van Klaveren, Rob J.; van der Graaf, Yolanda; Buckens, Constantinus F. M.; Zanen, Pieter; Lammers, Jan-Willem J.; Groen, Harry J. M.; Isgum, Ivana; de Jong, Pim A.

    OBJECTIVES The aim of this study was to derivate and validate a prediction model for cardiovascular events based on quantification of coronary and aortic calcium volume in lung cancer screening chest computed tomography (CT). BACKGROUND CT-based lung cancer screening in heavy smokers is a very

  1. An evaluation of entity and frequency based query completion methods

    NARCIS (Netherlands)

    Meij, E.; Mika, P.; Zaragoza, H.; Sanderson, M.; Zhai, C.; Zobel, J.; Allan, J.; Aslam, J.A.

    2009-01-01

    We present a semantic approach to suggesting query completions which leverages entity and type information. When compared to a frequency-based approach, we show that such information mostly helps rare queries.

  2. An Oracle-based Event Index for ATLAS

    CERN Document Server

    Gallas, Elizabeth; The ATLAS collaboration; Petrova, Petya Tsvetanova; Baranowski, Zbigniew; Canali, Luca; Formica, Andrea; Dumitru, Andrei

    2016-01-01

    The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies and assess which best serve the various use cases, as well as consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS, the services we have built based on this architecture, and our experience with it. We have indexed about 15 billion real data events and about 25 billion simulated events thus far and have designed the system to accommodate future data, which is expected to arrive at rates of 5 and 20 billion events per year for real data and simulation, respectively. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data ...

  3. Event-based prospective memory performance in autism spectrum disorder

    OpenAIRE

    Altgassen, Mareike; Schmitz-Hübsch, Maren; Kliegel, Matthias

    2009-01-01

    The purpose of the present study was to investigate event-based prospective memory performance in individuals with autism spectrum disorder and to explore possible relations between laboratory-based prospective memory performance and everyday performance. Nineteen children and adolescents with autism spectrum disorder and 19 matched neurotypical controls participated. The laboratory-based prospective memory test was embedded in a visuo-spatial working memory test and required participants to ...

  4. Using the DOE Knowledge Base for Special Event Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, H.M.; Harris, J.M.; Young, C.J.

    1998-10-20

    The DOE Knowledge Base is a library of detailed information whose purpose is to support the United States National Data Center (USNDC) in its mission to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the important tasks which the USNDC must accomplish is to periodically perform detailed analysis of events of high interest, so-called "Special Events", to provide the national authority with information needed to make policy decisions. In this paper we investigate some possible uses of the Knowledge Base for Special Event Analysis (SEA), and make recommendations for improving Knowledge Base support for SEA. To analyze an event in detail, there are two basic types of data which must be used: sensor-derived data (waveforms, arrivals, events, etc.) and regionalized contextual data (known sources, geological characteristics, etc.). Currently there is no single package which can provide full access to both types of data, so for our study we use a separate package for each: MatSeis, the Sandia Labs-developed MATLAB-based seismic analysis package, for waveform data analysis, and ArcView, an ESRI product, for contextual data analysis. Both packages are well-suited to prototyping because they provide a rich set of currently available functionality and yet are also flexible and easily extensible. Using these tools and Phase I Knowledge Base data sets, we show how the Knowledge Base can improve both the speed and the quality of SEA. Empirically-derived interpolated correction information can be accessed to improve both location estimates and associated error estimates. This information can in turn be used to identify any known nearby sources (e.g. mines, volcanoes), which may then trigger specialized processing of the sensor data. Based on the location estimate, preferred magnitude formulas and discriminants can be retrieved, and any known blockages can be identified to prevent miscalculations.
    Relevant historic events can be identified either by

  5. Accelerometer-Based Event Detector for Low-Power Applications

    Directory of Open Access Journals (Sweden)

    József Smidla

    2013-10-01

    Full Text Available In this paper, an adaptive, autocovariance-based event detection algorithm is proposed, which can be used with micro-electro-mechanical systems (MEMS) accelerometer sensors to build inexpensive and power-efficient event detectors. The algorithm works well with low signal-to-noise-ratio input signals, and its computational complexity is very low, allowing its use on inexpensive low-end embedded sensor devices. The proposed algorithm decreases its energy consumption by lowering its duty cycle, as far as the event to be detected allows it. The performance of the algorithm is tested and compared to the conventional filter-based approach. The comparison was performed in an application where illegal entry of vehicles into restricted areas was detected.
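
    The gist of an adaptive autocovariance detector can be sketched as follows: correlated structure in a window (an event) produces a large lag-k autocovariance relative to uncorrelated sensor noise, against which a baseline adapts slowly. All names and thresholds here are illustrative assumptions, not the paper's algorithm.

    ```python
    def autocov_detector(samples, win=16, lag=1, factor=5.0, alpha=0.99):
        """Flag non-overlapping windows whose lag-`lag` autocovariance
        magnitude exceeds `factor` times an adaptive noise baseline.
        The baseline adapts (exponential moving average) only in quiet
        windows, so a sustained event does not raise its own threshold."""
        baseline, events = None, []
        for start in range(0, len(samples) - win + 1, win):
            w = samples[start:start + win]
            m = sum(w) / win
            ac = sum((w[i] - m) * (w[i + lag] - m)
                     for i in range(win - lag)) / (win - lag)
            score = abs(ac)
            if baseline is None:
                baseline = score                     # initialize on first window
            elif score > factor * baseline:
                events.append(start)                 # correlated structure: event
            else:
                baseline = alpha * baseline + (1 - alpha) * score
        return events
    ```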

  6. Detection of comfortable temperature based on thermal events detection indoors

    Science.gov (United States)

    Szczurek, Andrzej; Maciejewska, Monika; Uchroński, Mariusz

    2017-11-01

    This work focused on thermal comfort as the basis for controlling indoor conditions. Its objective is a method to determine the thermal preferences of office occupants. The method is based on the detection of thermal events, which occur when indoor conditions are under the control of occupants. Thermal events are associated with the use of local heating/cooling sources that have user-adjustable settings. The detection is based on Fourier analysis of indoor temperature time series; the relevant data are collected by a temperature sensor. We achieved a thermal event recognition rate of 86%. Periods when indoor conditions were beyond the occupants' control were detected with a 95.6% success rate. Using experimental data, it was demonstrated that the method allows key elements of the temperature statistics associated with occupant-controlled thermal comfort to be reproduced.
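
    One way Fourier analysis can expose such events: occupant-driven heating/cooling cycles leave a strong periodic component in the temperature series, which a single-bin DFT (Goertzel) test can pick out. This sketch is an illustration of the principle only; the function names, the target-bin heuristic, and the threshold ratio are assumptions, not the paper's method.

    ```python
    import math

    def goertzel_power(x, k):
        """Squared magnitude of DFT bin k of x, via the Goertzel recursion."""
        n = len(x)
        coeff = 2 * math.cos(2 * math.pi * k / n)
        s_prev, s_prev2 = 0.0, 0.0
        for sample in x:
            s = sample + coeff * s_prev - s_prev2
            s_prev2, s_prev = s_prev, s
        return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2

    def thermal_event_present(temps, k, ratio=10.0):
        """Heuristic: a periodic component at bin k dominating the mean power
        of the other nonzero bins by `ratio` is taken as the signature of
        occupant-controlled heating/cooling cycling."""
        n = len(temps)
        others = [goertzel_power(temps, b) for b in range(1, n // 2) if b != k]
        target = goertzel_power(temps, k)
        return target > ratio * sum(others) / len(others)
    ```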

  7. Multivariate hydrological frequency analysis for extreme events using Archimedean copula. Case study: Lower Tunjuelo River basin (Colombia)

    Science.gov (United States)

    Gómez, Wilmar

    2017-04-01

    By analyzing the spatial and temporal variability of extreme precipitation events, we can prevent or reduce threats and risks. Many water resources projects require joint probability distributions of random variables such as precipitation intensity and duration, which are generally not independent of each other. The problem of defining a probability model for observations of several dependent variables is greatly simplified by expressing the joint distribution in terms of the marginals through copulas. This document presents a general framework for bivariate and multivariate frequency analysis using Archimedean copulas for extreme hydroclimatological events such as severe storms. The analysis was conducted for precipitation events in the lower Tunjuelo River basin in Colombia. The results show that, for a joint study of intensity-duration-frequency, IDF curves can be obtained through copulas, yielding more accurate and reliable information on design storms and the associated risks. The analysis also shows how the use of copulas greatly simplifies the study of multivariate distributions by introducing the concept of the joint return period, which properly represents the requirements of hydrological design in frequency analysis.
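
    A worked example of the two ingredients named above, an Archimedean copula and the joint return period: the Gumbel copula C(u, v) = exp(-((-ln u)^θ + (-ln v)^θ)^(1/θ)) couples the marginals, and the "AND" joint return period divides the mean interarrival time by the joint exceedance probability P(U > u, V > v) = 1 - u - v + C(u, v). This is a generic textbook sketch (the specific copula family and parameter values are not claimed to be the paper's choices):

    ```python
    import math

    def gumbel_copula(u, v, theta):
        """Gumbel (Archimedean) copula C(u, v); theta >= 1 controls the
        strength of upper-tail dependence (theta = 1 is independence)."""
        s = (-math.log(u)) ** theta + (-math.log(v)) ** theta
        return math.exp(-s ** (1 / theta))

    def joint_return_period_and(u, v, theta, mu=1.0):
        """'AND' joint return period: mean interarrival time mu divided by
        P(U > u and V > v) = 1 - u - v + C(u, v)."""
        p_exceed = 1 - u - v + gumbel_copula(u, v, theta)
        return mu / p_exceed
    ```

    With theta = 1 the copula reduces to the product u*v, recovering the independent case; larger theta raises C(u, v) and hence shortens the joint return period of simultaneous extremes.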

  8. A Kalman-based Fundamental Frequency Estimation Algorithm

    DEFF Research Database (Denmark)

    Shi, Liming; Nielsen, Jesper Kjær; Jensen, Jesper Rindom

    2017-01-01

    Fundamental frequency estimation is an important task in speech and audio analysis. Harmonic model-based methods typically have superior estimation accuracy. However, such methods usually assume that the fundamental frequency and amplitudes are stationary over a short time frame. In this paper, we propose a Kalman filter-based fundamental frequency estimation algorithm using the harmonic model, where the fundamental frequency and amplitudes can be truly nonstationary by modeling their time variations as first-order Markov chains. The Kalman observation equation is derived from the harmonic model and formulated as a compact nonlinear matrix form, which is further used to derive an extended Kalman filter. Detailed and continuous fundamental frequency and amplitude estimates for speech, the sustained vowel /a/ and solo musical tones with vibrato are demonstrated.

  9. Cooperative Game Study of Airlines Based on Flight Frequency Optimization

    Directory of Open Access Journals (Sweden)

    Wanming Liu

    2014-01-01

    Full Text Available By applying game theory, the relationship between airline ticket price and optimal flight frequency is analyzed. The paper establishes the payoff matrix of flight frequency in the noncooperation scenario and a flight frequency optimization model in the cooperation scenario. The airline alliance profit distribution is converted into a profit distribution game based on cooperative game theory. The profit distribution game is proved to be convex, so an optimal distribution strategy exists. The results show that joining the airline alliance can increase an airline's overall profit, that changes in negotiated prices and costs benefit the profit distribution of large airlines, and that the distribution result is in accordance with aviation development.
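
    For a convex cooperative game, a standard profit-distribution solution is the Shapley value (which for convex games lies in the core). The abstract does not state which allocation rule the authors use, so the following is a generic illustration of distributing alliance profit, with invented coalition values:

    ```python
    import math
    from itertools import permutations

    def shapley_values(players, value):
        """Shapley value of a cooperative (e.g. airline alliance) profit game:
        each player's average marginal contribution over all join orders.
        `value` maps frozenset-of-players -> coalition profit."""
        shares = dict.fromkeys(players, 0.0)
        for order in permutations(players):
            coalition = frozenset()
            for p in order:
                shares[p] += value[coalition | {p}] - value[coalition]
                coalition = coalition | {p}
        n_fact = math.factorial(len(players))
        return {p: s / n_fact for p, s in shares.items()}
    ```

    By construction the shares sum to the grand-coalition profit, so the whole alliance gain is distributed (efficiency).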

  10. An Oracle-based event index for ATLAS

    Science.gov (United States)

    Gallas, E. J.; Dimitrov, G.; Vasileva, P.; Baranowski, Z.; Canali, L.; Dumitru, A.; Formica, A.; ATLAS Collaboration

    2017-10-01

    The ATLAS EventIndex System has amassed a set of key quantities for a large number of ATLAS events into a Hadoop based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies and assess which best serve the various use cases, as well as consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS (relational database management system), the services we have built based on this architecture, and our experience with it. We have indexed about 26 billion real data events thus far and have designed the system to accommodate future data, which is expected to arrive at rates of 5 and 20 billion events per year for real data and simulation, respectively. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data with other complementary metadata in ATLAS, the system has been easily extended to perform essential assessments of data integrity and completeness and to identify event duplication, including at what step in processing the duplication occurred.

  11. Wavelet based denoising of power quality events for characterization

    African Journals Online (AJOL)

    The effectiveness of wavelet transform (WT) methods for analyzing different power quality (PQ) events with or without noise has been demonstrated in this paper. Multi-resolution signal decomposition based on discrete WT is used to localize and to classify different power quality disturbances. The energy distribution at ...
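
    Multi-resolution decomposition of the kind described can be illustrated with a one-level Haar DWT: a power-quality transient produces large detail coefficients exactly where it occurs, which both localizes it in time and separates it from the smooth part of the waveform. This standalone sketch uses illustrative names and thresholds and is not the paper's classifier:

    ```python
    def haar_dwt(x):
        """One-level Haar DWT: (approximation, detail) coefficient lists."""
        approx = [(x[i] + x[i + 1]) / 2 ** 0.5 for i in range(0, len(x) - 1, 2)]
        detail = [(x[i] - x[i + 1]) / 2 ** 0.5 for i in range(0, len(x) - 1, 2)]
        return approx, detail

    def locate_disturbance(x, factor=5.0):
        """Sample indices where the detail-coefficient magnitude exceeds
        factor * mean magnitude: a crude localization of a PQ transient."""
        _, d = haar_dwt(x)
        m = sum(abs(c) for c in d) / len(d)
        return [2 * i for i, c in enumerate(d) if abs(c) > factor * m]
    ```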

  12. Towards an event-based corpuscular model for optical phenomena

    NARCIS (Netherlands)

    De Raedt, H.; Jin, F.; Michielsen, K.; Roychoudhuri, C; Khrennikov, AY; Kracklauer, AF

    2011-01-01

    We discuss an event-based corpuscular model of optical phenomena that does not require the knowledge of the solution of a wave equation of the whole system and reproduces the results of Maxwell's theory through a series of cause-and-effect processes, starting with the emission and ending with the

  13. Training Team Problem Solving Skills: An Event-Based Approach.

    Science.gov (United States)

    Oser, R. L.; Gualtieri, J. W.; Cannon-Bowers, J. A.; Salas, E.

    1999-01-01

    Discusses how to train teams in problem-solving skills. Topics include team training, the use of technology, instructional strategies, simulations and training, theoretical framework, and an event-based approach for training teams to perform in naturalistic environments. Contains 68 references. (Author/LRW)

  14. Deterministic event-based simulation of universal quantum computation

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, H. De; Raedt, K. De; Landau, DP; Lewis, SP; Schuttler, HB

    2006-01-01

    We demonstrate that locally connected networks of classical processing units that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of universal quantum computation. The new simulation method is applied to implement Shor's factoring algorithm.

  15. An XML-Based Protocol for Distributed Event Services

    Science.gov (United States)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)

    2001-01-01

    This viewgraph presentation provides information on the application of an XML (extensible mark-up language)-based protocol to the developing field of distributed processing by way of a computational grid which resembles an electric power grid. XML tags would be used to transmit events between the participants of a transaction, namely, the consumer and the producer of the grid scheme.

  16. Simulation of quantum computation : A deterministic event-based approach

    NARCIS (Netherlands)

    Michielsen, K; De Raedt, K; De Raedt, H

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  17. Simulation of Quantum Computation : A Deterministic Event-Based Approach

    NARCIS (Netherlands)

    Michielsen, K.; Raedt, K. De; Raedt, H. De

    2005-01-01

    We demonstrate that locally connected networks of machines that have primitive learning capabilities can be used to perform a deterministic, event-based simulation of quantum computation. We present simulation results for basic quantum operations such as the Hadamard and the controlled-NOT gate, and

  18. Frequency of adverse events in plateletpheresis donors in regional transfusion centre in North India.

    Science.gov (United States)

    Patidar, Gopal Kumar; Sharma, Ratti Ram; Marwaha, Neelam

    2013-10-01

    Although automated cell separators have undergone many technical refinements, attention has focused more on the quality of platelet concentrates than on donor safety. We planned this prospective study to look into the donor safety aspect by studying adverse events in normal healthy plateletpheresis donors. The study included 500 healthy, first-time (n=301) and repeat (n=199) plateletpheresis donors after informed consent. The plateletpheresis procedures were performed on Trima Accel (5.1 version, GAMBRO BCT) and Amicus (3.2 version, FENWAL) cell separators. Adverse events during the procedures were recorded and classified according to their nature. The pre- and post-procedure hematological and biochemical profiles of these donors were also assessed with the help of an automated cell counter and analyser, respectively. Adverse events were recorded in 18% (n=90) of the 500 plateletpheresis donors, of which 9% were hypocalcaemic in nature, followed by hematoma (7.4%), vasovagal reactions (0.8%) and kit-related adverse events (0.8%). There was a significant post-procedure drop in the Hb, Hct and platelet count of the donors on both the Trima Accel (5.1 version, GAMBRO BCT) and Amicus (3.2 version, FENWAL) cell separators. Donor reactions can adversely affect voluntary donor recruitment and strategies to increase public awareness regarding the constant need for blood and blood products. Commonly observed adverse events in plateletpheresis donors were hypocalcemia, hematoma formation and vasovagal reactions, which can be prevented by pre-donation education of the donors and changes of machine configuration. Nevertheless, more prospective studies on this aspect are required in order to establish guidelines for donor safety in apheresis and also to help in assessing donor suitability, especially given the present trend of double-product apheresis collections. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Audio Effects Based on Biorthogonal Time-Varying Frequency Warping

    Directory of Open Access Journals (Sweden)

    Sergio Cavaliere

    2001-03-01

    Full Text Available We illustrate the mathematical background and musical use of a class of audio effects based on frequency warping. These effects alter the frequency content of a signal via spectral mapping. They can be implemented in dispersive tapped delay lines based on a chain of all-pass filters. In a homogeneous line with first-order all-pass sections, the signal formed by the output samples at a given time is related to the input via the Laguerre transform. However, most musical signals require a time-varying frequency modification in order to be properly processed. Vibrato in musical instruments or voice intonation in the case of vocal sounds may be modeled as small and slow pitch variations. Simulation of these effects requires techniques for time-varying pitch and/or brightness modification that are very useful for sound processing. The basis for time-varying frequency warping is a time-varying version of the Laguerre transformation. The corresponding implementation structure is obtained as a dispersive tapped delay line, where each of the frequency-dependent delay elements has its own phase response. Thus, time-varying warping results in a space-varying, inhomogeneous propagation structure. We show that time-varying frequency warping is associated with an expansion over biorthogonal sets generalizing the discrete Laguerre basis. Slowly time-varying characteristics lead to slowly varying parameter sequences. The corresponding sound transformation does not suffer from the discontinuities typical of delay lines based on unit delays.
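
    The time-invariant building block can be sketched as follows: a first-order all-pass section A(z) = (-λ + z⁻¹)/(1 - λ z⁻¹) maps each input frequency ω to the warped frequency ω + 2·atan(λ·sin ω / (1 - λ·cos ω)), and chaining identical sections gives the homogeneous dispersive tapped delay line mentioned above. Function names and the direct-form realization are illustrative; the time-varying (per-section λ) case of the paper is not shown.

    ```python
    import math

    def warped_frequency(omega, lam):
        """Frequency map of a first-order all-pass section (Laguerre warping):
        omega in rad/sample, |lam| < 1; lam = 0 is the identity (unit delay)."""
        return omega + 2 * math.atan(lam * math.sin(omega)
                                     / (1 - lam * math.cos(omega)))

    def allpass_step(state, x, lam):
        """One sample through A(z) = (-lam + z^-1)/(1 - lam z^-1), direct form."""
        w = x + lam * state       # state holds w[n-1]
        y = -lam * w + state
        return w, y

    def dispersive_line(x, lam, sections):
        """Run signal x through a chain of identical all-pass sections and
        return the tap outputs (one output signal per section)."""
        taps, signal = [], x
        for _ in range(sections):
            state, out = 0.0, []
            for sample in signal:
                state, y = allpass_step(state, sample, lam)
                out.append(y)
            taps.append(out)
            signal = out
        return taps
    ```

    With λ = 0 each section degenerates to a unit delay, so the taps reproduce an ordinary (non-dispersive) tapped delay line; positive λ stretches the low-frequency region upward.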

  20. Events

    Directory of Open Access Journals (Sweden)

    Igor V. Karyakin

    2016-02-01

    Full Text Available The 9th ARRCN Symposium 2015 was held during 21st–25th October 2015 at the Novotel Hotel, Chumphon, Thailand, one of the most favored travel destinations in Asia. The 10th ARRCN Symposium 2017 will be held during October 2017 in Davao, Philippines. The International Symposium on the Montagu's Harrier (Circus pygargus) «The Montagu's Harrier in Europe. Status. Threats. Protection», organized by the environmental organization «Landesbund für Vogelschutz in Bayern e.V.» (LBV), was held on November 20-22, 2015 in Germany. The location of this event was the city of Würzburg in Bavaria.

  1. Topic Modeling Based Image Clustering by Events in Social Media

    Directory of Open Access Journals (Sweden)

    Bin Xu

    2016-01-01

    Full Text Available Social event detection in large photo collections is very challenging, and multimodal clustering is an effective methodology for dealing with the problem. Geographic information is important in event detection. This paper proposes a topic model based approach to estimating the missing geographic information of photos. The approach utilizes a supervised multimodal topic model to estimate the joint distribution of time, geographic, content, and attached textual information. We then annotate photos with missing geographic information with predicted geographic coordinates. Experimental results indicate that the annotated geographic information improves clustering performance.

  2. Decision Model of Flight Safety Based on Flight Event

    Science.gov (United States)

    Xiao-yu, Zhang; Jiu-sheng, Chen

    To improve the management of flight safety for airline companies, a hierarchy model is established for evaluating flight safety from flight events. Flight safety is evaluated by an improved analytical hierarchy process (AHP). A method to rectify the consistency of the judgment matrix is given to improve the AHP, so that the weights can be obtained directly without a consistency check of the judgment matrix; this ensures an absolutely consistent judgment matrix. Using historical statistics of flight event incidence, the flight safety analysis is performed by means of both static and dynamic evaluation. The hierarchy structure model is implemented based on .NET, and the simulation results demonstrate the validity of the method.
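
    The core AHP computation, deriving priority weights as the normalized principal eigenvector of a pairwise-comparison judgment matrix, can be sketched with power iteration. This is the standard AHP step, not the paper's specific consistency-rectification method, and the function name is an assumption:

    ```python
    def ahp_weights(matrix, iters=100):
        """Priority weights from an AHP pairwise-comparison matrix:
        the normalized principal eigenvector, estimated by power iteration."""
        n = len(matrix)
        w = [1.0 / n] * n
        for _ in range(iters):
            w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
            s = sum(w)
            w = [x / s for x in w]   # renormalize so the weights sum to 1
        return w
    ```

    For a perfectly consistent matrix (a_ij = w_i / w_j) the iteration recovers the underlying weights exactly, which is the situation the paper's rectification method aims to guarantee.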

  3. Development of a GCR Event-based Risk Model

    Science.gov (United States)

    Cucinotta, Francis A.; Ponomarev, Artem L.; Plante, Ianik; Carra, Claudio; Kim, Myung-Hee

    2009-01-01

    A goal at NASA is to develop event-based systems biology models of space radiation risks that will replace the current dose-based empirical models. Complex and varied biochemical signaling processes transmit the initial DNA and oxidative damage from space radiation into cellular and tissue responses. Mis-repaired damage or aberrant signals can lead to genomic instability, persistent oxidative stress or inflammation, which are causative of cancer and CNS risks. Protective signaling through adaptive responses or cell repopulation is also possible. We are developing a computational simulation approach to galactic cosmic ray (GCR) effects that is based on biological events rather than average quantities such as dose, fluence, or dose equivalent. The goal of the GCR Event-based Risk Model (GERMcode) is to provide a simulation tool to describe and integrate physical and biological events into stochastic models of space radiation risks. We used the quantum multiple scattering model of heavy ion fragmentation (QMSFRG) and well known energy loss processes to develop a stochastic Monte-Carlo based model of GCR transport in spacecraft shielding and tissue. We validated the accuracy of the model by comparing to physical data from the NASA Space Radiation Laboratory (NSRL). Our simulation approach allows us to time-tag each GCR proton or heavy ion interaction in tissue including correlated secondary ions often of high multiplicity. Conventional space radiation risk assessment employs average quantities, and assumes linearity and additivity of responses over the complete range of GCR charge and energies. To investigate possible deviations from these assumptions, we studied several biological response pathway models of varying induction and relaxation times including the ATM, TGF-Smad, and WNT signaling pathways.
We then considered small volumes of interacting cells and the time-dependent biophysical events that the GCR would produce within these tissue volumes to estimate how

  4. Effect of statin withdrawal on frequency of cardiac events after vascular surgery

    NARCIS (Netherlands)

    Schouten, Olaf; Hoeks, Sanne E.; Welten, Gijs M. J. M.; Davignon, Jean; Kastelein, John J. P.; Vidakovic, Radosav; Feringa, Harm H. H.; Dunkelgrun, Martin; van Domburg, Ron T.; Bax, Jeroen J.; Poldermans, Don

    2007-01-01

    The discontinuation of statin therapy in patients with acute coronary syndromes has been associated with an increase of adverse coronary events. Patients who undergo major surgery frequently are not able to take oral medication shortly after surgery. Because there is no intravenous formula for

  5. Event-Based control of depth of hypnosis in anesthesia.

    Science.gov (United States)

    Merigo, Luca; Beschi, Manuel; Padula, Fabrizio; Latronico, Nicola; Paltenghi, Massimiliano; Visioli, Antonio

    2017-08-01

    In this paper, we propose the use of an event-based control strategy for the closed-loop control of the depth of hypnosis in anesthesia, using propofol administration and the bispectral index as the controlled variable. A new event generator with high noise-filtering properties is employed in addition to a PIDPlus controller. The tuning of the parameters is performed off-line using genetic algorithms, considering a given data set of patients. The effectiveness and robustness of the method are verified in simulation by implementing a Monte Carlo method to address intra-patient and inter-patient variability. A comparison with a standard PID control structure shows that the event-based control system achieves a reduction of the total variation of the manipulated variable of 93% in the induction phase and of 95% in the maintenance phase. The use of event-based automatic control in anesthesia yields a fast induction phase with bounded overshoot and an acceptable disturbance rejection. A comparison with a standard PID control structure shows that the technique effectively mimics the behavior of the anesthesiologist by providing a significant decrement of the total variation of the manipulated variable. Copyright © 2017 Elsevier B.V. All rights reserved.
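The paper's event generator and PIDPlus controller are not specified in the abstract, but the core idea of recomputing the control action only when the measurement has changed enough can be sketched as a send-on-delta PI loop. All names, gains and the `delta` threshold below are illustrative assumptions, not the authors' implementation:

```python
def event_based_pi(setpoint, measurements, kp=0.5, ki=0.1, delta=1.0, dt=1.0):
    """Send-on-delta event-triggered PI loop (illustrative sketch).

    The control action is recomputed only when the measurement has moved
    by more than `delta` since the last event; between events the last
    control value is held, mimicking the reduction in actuator activity
    reported for event-based control.
    """
    u, integral = 0.0, 0.0
    last_event_y = None
    outputs, events = [], 0
    for y in measurements:
        if last_event_y is None or abs(y - last_event_y) > delta:
            events += 1
            last_event_y = y
            e = setpoint - y
            integral += e * dt
            u = kp * e + ki * integral  # PI update only at event instants
        outputs.append(u)  # held value between events
    return outputs, events
```

With a slowly drifting measurement, most samples fall inside the dead-band and trigger no recomputation, which is what reduces the total variation of the manipulated variable.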

  6. Event-based cluster synchronization of coupled genetic regulatory networks

    Science.gov (United States)

    Yue, Dandan; Guan, Zhi-Hong; Li, Tao; Liao, Rui-Quan; Liu, Feng; Lai, Qiang

    2017-09-01

    In this paper, the cluster synchronization of coupled genetic regulatory networks with a directed topology is studied by using the event-based strategy and pinning control. An event-triggered condition with a threshold consisting of the neighbors' discrete states at their own event time instants and a state-independent exponential decay function is proposed. The intra-cluster states information and extra-cluster states information are involved in the threshold in different ways. By using the Lyapunov function approach and the theories of matrices and inequalities, we establish the cluster synchronization criterion. It is shown that both the avoidance of continuous transmission of information and the exclusion of the Zeno behavior are ensured under the presented triggering condition. Explicit conditions on the parameters in the threshold are obtained for synchronization. The stability criterion of a single GRN is also given under the reduced triggering condition. Numerical examples are provided to validate the theoretical results.

  7. Event-based state estimation a stochastic perspective

    CERN Document Server

    Shi, Dawei; Chen, Tongwen

    2016-01-01

    This book explores event-based estimation problems. It shows how several stochastic approaches are developed to maintain estimation performance when sensors perform their updates at slower rates only when needed. The self-contained presentation makes this book suitable for readers with no more than a basic knowledge of probability analysis, matrix algebra and linear systems. The introduction and literature review provide information, while the main content deals with estimation problems from four distinct angles in a stochastic setting, using numerous illustrative examples and comparisons. The text elucidates both theoretical developments and their applications, and is rounded out by a review of open problems. This book is a valuable resource for researchers and students who wish to expand their knowledge and work in the area of event-triggered systems. At the same time, engineers and practitioners in industrial process control will benefit from the event-triggering technique that reduces communication costs ...

  8. Event-Based User Classification in Weibo Media

    Science.gov (United States)

    Wang, Wendong; Cheng, Shiduan; Que, Xirong

    2014-01-01

    Weibo, the real-time microblogging service, has attracted massive attention and support from social network users. The Weibo platform offers an opportunity for people to access information and significantly changes the way people acquire and disseminate information. Meanwhile, it enables people to respond to social events in a more convenient way. Much of the information in Weibo media is related to events. Users who post different content, and who exhibit different behaviors or attitudes, may contribute differently to a specific event. Therefore, automatically classifying the large number of uncategorized social circles generated in Weibo media from the perspective of events is a promising task. Under this circumstance, in order to effectively organize and manage the huge number of users, and thereby further manage their content, we address the task of user classification in a more granular, event-based approach in this paper. By analyzing real data collected from Sina Weibo, we investigate Weibo's properties and utilize both content information and social network information to classify the numerous users into four primary groups: celebrities, organizations/media accounts, grassroots stars, and ordinary individuals. The experimental results show that our method identifies the user categories accurately. PMID:25133235

  9. Distributed Frequency Control of Prosumer-Based Electric Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Nazari, MH; Costello, Z; Feizollahi, MJ; Grijalva, S; Egerstedt, M

    2014-11-01

    In this paper, we propose a distributed frequency regulation framework for prosumer-based electric energy systems, where a prosumer (producer-consumer) is defined as an intelligent agent which can produce, consume, and/or store electricity. Despite the frequency regulators being distributed, stability can be ensured while avoiding inter-area oscillations using a limited control effort. To achieve this, a fully distributed one-step model-predictive control protocol is proposed and analyzed, whereby each prosumer communicates solely with its neighbors in the network. The efficacy of the proposed frequency regulation framework is shown through simulations on two real-world electric energy systems of different scale and complexity. We show that prosumers can indeed bring frequency and power deviations to their desired values after small perturbations.
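The paper's one-step MPC protocol is not reproduced in the abstract; as a loose sketch of the neighbors-only communication pattern, the update below combines a consensus term over neighbor states with local damping that pulls each prosumer's frequency deviation toward zero. The gains, topology and update rule are illustrative assumptions, not the authors' controller:

```python
import numpy as np

def distributed_frequency_step(freq_dev, adjacency, gain=0.2, damping=0.1):
    """One distributed update: each prosumer i adjusts its frequency
    deviation using only its own state and its neighbors' states
    (a consensus-plus-damping sketch, not the paper's one-step MPC)."""
    x = np.asarray(freq_dev, dtype=float)
    A = np.asarray(adjacency, dtype=float)
    new = x.copy()
    for i in range(len(x)):
        nbrs = np.nonzero(A[i])[0]           # neighbors of prosumer i
        consensus = np.sum(x[i] - x[nbrs])   # disagreement with neighbors
        new[i] -= gain * consensus + damping * x[i]
    return new
```

Iterating this map on a connected graph with small enough gains drives all deviations to zero, illustrating how purely local communication can still regulate a global quantity.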

  10. Acoustic frequency filter based on anisotropic topological phononic crystals

    KAUST Repository

    Chen, Zeguo

    2017-11-02

    We present a design of an acoustic frequency filter based on a two-dimensional anisotropic phononic crystal. The anisotropic band structure exhibits either a directional or a combined (global + directional) bandgap in certain frequency regions, depending on the geometry. When time-reversal symmetry is broken, a topologically nontrivial bandgap may be introduced. The induced nontrivial bandgap and the original directional bandgap result in various interesting wave propagation behaviors, such as frequency filtering. We develop a tight-binding model to characterize the effective Hamiltonian of the system, from which the contribution of anisotropy is explicitly shown. Different from the isotropic cases, the Zeeman-type splitting is not linear, and the anisotropic bandgap makes it possible to achieve anisotropic propagation characteristics along different directions and at different frequencies.

  11. Regadenoson versus Dipyridamole: A Comparison of the Frequency of Adverse Events in Patients Undergoing Myocardial Perfusion Imaging.

    Science.gov (United States)

    Amer, Kallie A; Hurren, Jeff R; Edwin, Stephanie B; Cohen, Gerald

    2017-06-01

    To compare the frequency of adverse events in patients undergoing myocardial perfusion imaging (MPI) with either regadenoson or dipyridamole. Single-center, retrospective cohort study. Large community teaching hospital. A total of 568 adults who underwent single-photon emission tomography MPI with either regadenoson (284 patients) or dipyridamole (284 patients) as the vasodilator agent, following an institutional conversion from regadenoson to dipyridamole in the MPI protocol on July 15, 2013, for cost-saving purposes. Data were collected from the patients' electronic medical records. The primary endpoint was the composite occurrence of any documented adverse event in each group. Secondary endpoints were the individual components of the primary endpoint, the reason for termination of the MPI examination (protocol completion or premature end due to an adverse event), the use of an interventional agent to treat an adverse event, and cost-related outcomes. A higher proportion of patients in the regadenoson group experienced an adverse event than those who received dipyridamole (84.9% vs 56.7%); fewer adverse events occurred with dipyridamole than with regadenoson in patients undergoing MPI. Dipyridamole offers a safe and cost-effective alternative to regadenoson for cardiac imaging studies. © 2017 Pharmacotherapy Publications, Inc.

  12. Biomedical event trigger detection by dependency-based word embedding.

    Science.gov (United States)

    Wang, Jian; Zhang, Jianhai; An, Yuan; Lin, Hongfei; Yang, Zhihao; Zhang, Yijia; Sun, Yuanyuan

    2016-08-10

    In biomedical research, events revealing complex relations between entities play an important role. Biomedical event trigger identification has become a research hotspot given its important role in biomedical event extraction. Traditional machine learning methods, such as support vector machines (SVM) and maxent classifiers, which rely on manually designed features fed to the classifiers, depend on an understanding of the specific task and cannot generalize to new domains or new examples. In this paper, we propose an approach that utilizes a neural network model based on dependency-based word embedding to automatically learn significant features from raw input for trigger classification. First, we employ Word2vecf, a modified version of Word2vec, to learn word embeddings with rich semantic and functional information based on the dependency relation tree. Then a neural network architecture is used to learn a more significant feature representation from the raw dependency-based word embedding. Meanwhile, we dynamically adjust the embedding during training to adapt to the trigger classification task. Finally, a softmax classifier labels the examples with specific trigger classes using the features learned by the model. The experimental results show that our approach achieves a micro-averaging F1 score of 78.27% and a macro-averaging F1 score of 76.94% on significant trigger classes, and performs better than baseline methods. In addition, we can achieve a semantic distributed representation of every trigger word.

  13. PREVENTING MEDICATION ERROR BASED ON KNOWLEDGE MANAGEMENT AGAINST ADVERSE EVENT

    Directory of Open Access Journals (Sweden)

    Apriyani Puji Hastuti

    2017-06-01

    Full Text Available Introduction: Medication error is one of many types of errors that can decrease the quality and safety of healthcare. An increasing number of adverse events (AE) reflects the number of medication errors. This study aimed to develop a model of medication error prevention based on knowledge management. The model is expected to improve the knowledge and skill of nurses in preventing medication errors, as indicated by a decrease in adverse events (AE). Methods: This study consisted of two stages. The first stage was an explanative survey using a cross-sectional approach involving 15 respondents selected by purposive sampling. The second stage was a pre-test experiment involving 29 respondents selected by cluster sampling. Partial Least Squares (PLS) was used to examine the factors affecting the medication error prevention model, while the Wilcoxon Signed Rank Test was used to test the effect of the medication error prevention model on adverse events (AE). Results: Individual factors (path coefficient 12:56, t = 4.761) played an important role in nurses' behavioral changes regarding medication error prevention based on knowledge management; organizational factors (path coefficient = 0.276, t = 2.504) played an important role in nurses' behavioral changes regarding medication error prevention based on knowledge management; and work characteristic factors (path coefficient = 0.309, t = 1.98) played an important role in nurses' behavioral changes regarding medication error prevention based on knowledge management. The medication error prevention model based on knowledge management also significantly decreased adverse events (p = 0.000, α < 0.05). Discussion: Individual, organizational and work characteristic factors were important in the development of medication error prevention models based on knowledge management.

  14. Extreme rainfall analysis based on precipitation events classification in Northern Italy

    Science.gov (United States)

    Campo, Lorenzo; Fiori, Elisabetta; Molini, Luca

    2016-04-01

    Extreme rainfall statistical analysis comprises a consolidated family of techniques that allows the frequency and statistical properties of high-intensity meteorological events to be studied. These techniques are well established and include standard approaches such as fitting the GEV (Generalized Extreme Value) or TCEV (Two-Component Extreme Value) probability distribution to the data recorded by a given raingauge at a given location. Regionalization techniques, which aim to spatialize the analysis over medium-to-large regions, are also well established and operationally used. In this work a novel procedure is proposed to statistically characterize the rainfall extremes in a given region, based on an "event-based" approach. Given a temporal sequence of continuous rain maps, an "event" is defined as an aggregate of cells, continuous in time and space, whose rainfall height is above a certain threshold. Based on this definition it is possible to classify, for a given region and period, a population of events and characterize them with a number of statistics, such as their total volume, maximum spatial extension, duration, average intensity, etc. The population of events so obtained then constitutes the input of a novel extreme value characterization technique: given a certain spatial scale, a moving window analysis is performed and all the events that fall in the window are analysed from an extreme value point of view. For each window, the extreme annual events are considered: maximum total volume, maximum spatial extension, maximum intensity, and maximum duration are all considered for an extreme analysis, and the corresponding probability distributions are fitted. The analysis thus statistically characterizes the most intense events and, at the same time, spatializes these rain characteristics by exploring their variability in space. This methodology was employed on rainfall fields obtained by interpolation of
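The "event" definition above, an aggregate of above-threshold cells that are continuous in time and space, can be sketched as 6-connected component labeling on a (time, y, x) grid. This is a simplified illustration of the idea, not the authors' code:

```python
import numpy as np
from collections import deque

def label_rain_events(rain, threshold):
    """Label spatio-temporally connected rain 'events': cells above
    `threshold` that are adjacent in time or space (6-connectivity on a
    (t, y, x) grid). Returns a label array and the number of events."""
    mask = np.asarray(rain) > threshold
    labels = np.zeros(mask.shape, dtype=int)
    n_events = 0
    for idx in zip(*np.nonzero(mask)):
        if labels[idx]:
            continue  # already assigned to an event
        n_events += 1
        labels[idx] = n_events
        queue = deque([idx])
        while queue:  # breadth-first flood fill over the 6 neighbors
            t, y, x = queue.popleft()
            for dt_, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                                (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                n = (t + dt_, y + dy, x + dx)
                if all(0 <= n[k] < mask.shape[k] for k in range(3)) \
                        and mask[n] and not labels[n]:
                    labels[n] = n_events
                    queue.append(n)
    return labels, n_events
```

Once cells are labeled, per-event statistics such as total volume or duration reduce to aggregations over each label.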

  15. An Oracle-based event index for ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00083337; The ATLAS collaboration; Dimitrov, Gancho

    2017-01-01

    The ATLAS EventIndex system has amassed a set of key quantities for a large number of ATLAS events into a Hadoop-based infrastructure for the purpose of providing the experiment with a number of event-wise services. Collecting this data in one place provides the opportunity to investigate various storage formats and technologies, assess which best serve the various use cases, and consider what other benefits alternative storage systems provide. In this presentation we describe how the data are imported into an Oracle RDBMS (relational database management system), the services we have built based on this architecture, and our experience with it. We have indexed about 26 billion real data events thus far and have designed the system to accommodate future data, which has expected rates of 5 and 20 billion events per year. We have found this system offers outstanding performance for some fundamental use cases. In addition, profiting from the co-location of this data with other complementary metadata in AT...

  16. Frequency of cancer events with saxagliptin in the SAVOR-TIMI 53 trial.

    Science.gov (United States)

    Leiter, L A; Teoh, H; Mosenzon, O; Cahn, A; Hirshberg, B; Stahre, C A M; Hoekstra, J B L; Alvarsson, M; Im, K; Scirica, B M; Bhatt, D L; Raz, I

    2016-02-01

    The Saxagliptin Assessment of Vascular Outcomes Recorded in Patients with Diabetes Mellitus (SAVOR)-Thrombolysis in Myocardial Infarction (TIMI) 53 trial, a randomized trial of 16,492 patients (placebo, n = 8212; saxagliptin, n = 8280) treated and followed for a median of 2.1 years, afforded an opportunity to explore whether there was any association with cancer reported as a serious adverse event. At least one cancer event was reported by 688 patients (4.1%): 362 (4.3%) and 326 (3.8%) in the placebo and saxagliptin arms, respectively (p = 0.13). There were 59 (0.6%) deaths adjudicated as malignancy deaths with placebo and 53 (0.6%) with saxagliptin. Stratification by gender, age, race and ethnicity, diabetes duration, baseline glycated haemoglobin and pharmacotherapy did not show any clinically meaningful differences between the two study arms. The overall number of cancer events and malignancy-associated mortality rates were generally balanced between the placebo and saxagliptin groups, suggesting a null relationship with saxagliptin use over the median follow-up of 2.1 years. Multivariable modelling showed that male gender, dyslipidaemia and current smoking were independent predictors of cancer. These randomized data with adequate numbers of cancer cases are reassuring but limited by the short follow-up in a trial not designed to test this hypothesis. © 2015 John Wiley & Sons Ltd.

  17. Carbon nanotube transistor based high-frequency electronics

    Science.gov (United States)

    Schroter, Michael

    At the nanoscale, carbon nanotubes (CNTs) have higher carrier mobility and carrier velocity than most incumbent semiconductors. Thus CNT-based field-effect transistors (FETs) are being considered as strong candidates for replacing existing MOSFETs in digital applications. In addition, the predicted high intrinsic transit frequency and the more recent finding of ways to achieve highly linear transfer characteristics have inspired investigations of analog high-frequency (HF) applications. High linearity is extremely valuable for energy-efficient usage of the frequency spectrum, particularly in mobile communications. Compared to digital applications, the much more relaxed constraints on CNT placement and lithography, combined with already achieved operating frequencies of at least 10 GHz for fabricated devices, make an early entry into the low-GHz HF market more feasible than into large-scale digital circuits. Such a market entry would be extremely beneficial for funding the development of a production CNTFET-based process technology. This talk will provide an overview of the present status and feasibility of HF CNTFET technology from an engineering point of view, including device modeling, experimental results, and existing roadblocks.

  18. A New Instantaneous Frequency Measure Based on The Stockwell Transform

    Science.gov (United States)

    Yedlin, M. J.; Ben-Horrin, Y.; Fraser, J. D.

    2011-12-01

    We propose the use of a new transform, the Stockwell transform [1], as a means of creating time-frequency maps and applying them to distinguish blasts from earthquakes. The Stockwell transform can be considered a variant of the continuous wavelet transform that preserves the absolute phase. It employs a complex Morlet mother wavelet. The novelty of this transform lies in its resolution properties: high frequencies in the candidate signal are well resolved in time but poorly resolved in frequency, while the converse is true for low-frequency signal components. The goal of this research is to obtain the instantaneous frequency as a function of time for both earthquakes and blasts. Two methods will be compared. In the first method, we compute the analytic signal, the envelope and the instantaneous phase as a function of time [2]; the derivative of the instantaneous phase yields the instantaneous angular frequency. The second method is based on time-frequency analysis using the Stockwell transform, computed in non-redundant fashion using a dyadic representation [3]. For each time point, the frequency centroid is computed as a representation of the most likely frequency at that time. A detailed comparison will be presented for both approaches to the computation of the instantaneous frequency. An advantage of the Stockwell approach is that no differentiation is applied, while the Hilbert transform method can be less sensitive to edge effects. The goal of this research is to see if the new Stockwell-based method could be used as a discriminant between earthquakes and blasts. References: [1] Stockwell, R.G., Mansinha, L. and Lowe, R.P., "Localization of the complex spectrum: the S transform", IEEE Trans. Signal Processing, vol. 44, no. 4, pp. 998-1001 (1996). [2] Taner, M.T., Koehler, F., "Complex seismic trace analysis", Geophysics, vol. 44, issue 6, pp. 1041-1063 (1979). [3] Brown, R
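The first method described, building the analytic signal and differentiating its instantaneous phase [2], can be sketched with an FFT-based Hilbert transform. This is the standard construction, not the authors' specific code:

```python
import numpy as np

def instantaneous_frequency(signal, fs):
    """Instantaneous frequency (Hz) from the analytic signal.

    The analytic signal is built with an FFT-based Hilbert transform
    (zero the negative frequencies, double the positive ones); the
    unwrapped phase derivative then gives the instantaneous frequency.
    """
    x = np.asarray(signal, dtype=float)
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    analytic = np.fft.ifft(X * h)           # analytic signal x + i*H{x}
    phase = np.unwrap(np.angle(analytic))   # instantaneous phase
    return np.diff(phase) * fs / (2 * np.pi)
```

For a pure sinusoid the estimate is flat at the tone frequency away from the edges, which is also where the edge-effect caveat mentioned in the abstract shows up.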

  19. Frequency of Extreme Heat Event as a Surrogate Exposure Metric for Examining the Human Health Effects of Climate Change.

    Directory of Open Access Journals (Sweden)

    Crystal Romeo Upperman

    Full Text Available Epidemiological investigation of the impact of climate change on human health, particularly chronic diseases, is hindered by the lack of exposure metrics that can be used as markers of climate change and that are compatible with health data. Here, we present a surrogate exposure metric created using a 30-year baseline (1960-1989) that allows users to quantify long-term changes in exposure to the frequency of extreme heat events, with near-unabridged spatial coverage, at a scale that is compatible with national/state health outcome data. We evaluate the exposure metric by decade, seasonality, area of the country, and its ability to capture long-term changes in weather (climate), including natural climate modes. Our findings show that this generic exposure metric is potentially useful for monitoring trends in the frequency of extreme heat events across varying regions because it captures long-term changes; is sensitive to the natural climate modes (ENSO events); responds well to spatial variability; and is amenable to spatial/temporal aggregation, making it useful for epidemiological studies.
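A simplified version of such a baseline-referenced exposure metric, counting days that exceed a quantile computed over the 1960-1989 baseline, can be sketched as follows. The 95th-percentile choice is an illustrative assumption, not necessarily the paper's definition:

```python
import numpy as np

def extreme_heat_frequency(temps, baseline_temps, quantile=0.95):
    """Count days in `temps` exceeding the `quantile` threshold of the
    baseline period. Returns (count, threshold) so the same threshold
    can be reused across decades for trend comparison."""
    thresh = float(np.quantile(baseline_temps, quantile))
    count = int(np.sum(np.asarray(temps) > thresh))
    return count, thresh
```

Because the threshold is fixed by the baseline, the count is directly comparable across decades and regions, which is what makes it usable as a long-term exposure marker.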

  20. Mars Science Laboratory; A Model for Event-Based EPO

    Science.gov (United States)

    Mayo, Louis; Lewis, E.; Cline, T.; Stephenson, B.; Erickson, K.; Ng, C.

    2012-10-01

    The NASA Mars Science Laboratory (MSL) and its Curiosity rover, part of NASA's Mars Exploration Program, represent the most ambitious undertaking to date to explore the red planet. MSL/Curiosity was designed primarily to determine whether Mars ever had an environment capable of supporting microbial life. NASA's MSL education program was designed to take advantage of existing, highly successful event-based education programs to communicate Mars science and education themes to worldwide audiences through live webcasts, video interviews with scientists, TV broadcasts, professional development for teachers, and the latest social media frameworks. We report here on the success of the MSL education program and discuss how this methodological framework can be used to enhance other event-based education programs.

  1. A Physics-Based Vibrotactile Feedback Library for Collision Events.

    Science.gov (United States)

    Park, Gunhyuk; Choi, Seungmoon

    2017-01-01

    We present PhysVib: a software solution on the mobile platform that extends an open-source physics engine in a multi-rate rendering architecture for automatic vibrotactile feedback upon collision events. PhysVib runs concurrently with a physics engine at a low update rate and generates vibrotactile feedback commands at a high update rate, based on the simulation results of the physics engine, using an exponentially-decaying sinusoidal model. We demonstrate through a user study that this vibration model is more appropriate to our purpose in terms of perceptual quality than more complex models based on sound synthesis. We also evaluated the perceptual performance of PhysVib by comparing eight vibrotactile rendering methods. Experimental results suggested that PhysVib enables more realistic vibrotactile feedback than the other methods with respect to perceived similarity to the visual events. PhysVib is an effective solution for providing physically plausible vibrotactile responses while reducing application development time to a great extent.
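The exponentially-decaying sinusoidal model named in the abstract can be sketched as below; the parameter names, sampling rate and mapping from collision to waveform are illustrative assumptions, not PhysVib's actual API:

```python
import numpy as np

def collision_vibration(amplitude, freq_hz, decay_rate, duration, fs=8000):
    """Exponentially-decaying sinusoid for a single collision event:
    a(t) = A * exp(-decay_rate * t) * sin(2*pi*f*t).
    In a PhysVib-style pipeline, A and f would be derived from the
    physics engine's contact impulse and the materials involved."""
    t = np.arange(int(duration * fs)) / fs
    return amplitude * np.exp(-decay_rate * t) * np.sin(2 * np.pi * freq_hz * t)
```

The envelope decays to near zero within the burst, so successive collisions can be superposed without the actuator signal growing unbounded.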

  2. Time-variable frequency of events in domains of Tilia cambium

    Directory of Open Access Journals (Sweden)

    Wiesław Włoch

    2014-01-01

    Full Text Available In the cambium of linden, which produces xylem with interlocked grain, domains that are active with respect to the occurrence of events can be distinguished from inactive ones. The area of cambium investigated was an assemblage of small domains, among which at certain periods the Z domains were active and at other periods the S domains. The inclination of the grain changed in the direction corresponding to the type of the active domains. The alternating occurrence of periods of activity of the Z and S domains led to the formation of interlocked grain in the xylem, with a wave much longer than the height of a pair of domains.

  3. Time-frequency representation based on time-varying ...

    Indian Academy of Sciences (India)

    Abstract. A parametric time-frequency representation is presented based on a time-varying autoregressive model (TVAR), followed by applications to non-stationary vibration signal processing. The identification of time-varying model coefficients and the determination of model order are addressed by means of neural networks and ...

  5. Track-based event recognition in a realistic crowded environment

    Science.gov (United States)

    van Huis, Jasper R.; Bouma, Henri; Baan, Jan; Burghouts, Gertjan J.; Eendebak, Pieter T.; den Hollander, Richard J. M.; Dijk, Judith; van Rest, Jeroen H.

    2014-10-01

    Automatic detection of abnormal behavior in CCTV cameras is important for improving security in crowded environments, such as shopping malls, airports and railway stations. This behavior can be characterized at different time scales, e.g., by small-scale subtle and obvious actions or by large-scale walking patterns and interactions between people. For example, pickpocketing can be recognized by the actual snatch (small scale), by the thief following the victim, or by the thief interacting with an accomplice before and after the incident (longer time scale). This paper focuses on event recognition by detecting large-scale track-based patterns. Our event recognition method consists of several steps: pedestrian detection, object tracking, track-based feature computation and rule-based event classification. In the experiment, we focused on single-track actions (walk, run, loiter, stop, turn) and track interactions (pass, meet, merge, split). The experiment includes a controlled setup, where 10 actors perform these actions. The method is also applied to all tracks generated in a crowded shopping mall in a selected time frame. The results show that most of the actions can be detected reliably (on average 90%) at a low false positive rate (1.1%), and that the interactions obtain lower detection rates (70% at 0.3% FP). This method may become one of the components that assist operators in finding threatening behavior and enrich the selection of videos to be observed.
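The "track-based feature computation and rule-based event classification" steps can be illustrated with a toy speed-threshold classifier for three of the single-track actions. The thresholds, units and feature choice are assumptions for illustration, not the paper's rules:

```python
import math

def classify_track(positions, dt=1.0, run_speed=3.0, stop_speed=0.3):
    """Toy rule-based classifier for single-track actions.

    `positions` is a list of (x, y) points sampled every `dt` seconds;
    the mean speed (track feature) is compared against illustrative
    thresholds (m/s) to pick an action label (rule-based classification).
    """
    speeds = [math.dist(positions[i + 1], positions[i]) / dt
              for i in range(len(positions) - 1)]
    mean_speed = sum(speeds) / len(speeds)
    if mean_speed < stop_speed:
        return "stop"
    return "run" if mean_speed > run_speed else "walk"
```

Actions like "loiter" or interactions like "meet" would need additional features (dwell area, inter-track distance), but follow the same feature-then-rule pattern.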

  6. Orthogonal frequency division multiplexing simulation based on MATLAB

    Science.gov (United States)

    Qiao, Yuan

    2017-09-01

    OFDM (Orthogonal Frequency Division Multiplexing) is one of the core technologies in the fourth-generation mobile communication system. It is a widely used multi-carrier modulation method based on the IFFT and FFT transforms, which achieves low complexity and effectively combats frequency-selective fading. In this paper, we use MATLAB to simulate OFDM and obtain good results: the original signal is successfully recovered under realistic channel conditions, with an error of less than 5% relative to the original signal.
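The paper's simulation is in MATLAB; an equivalent minimal chain (BPSK mapping, IFFT, cyclic prefix, ideal channel, FFT) can be sketched in Python/NumPy. Subcarrier count, prefix length and the ideal-channel assumption are illustrative choices, not the paper's settings:

```python
import numpy as np

def ofdm_roundtrip(bits, n_sub=64, cp_len=16):
    """Minimal OFDM modulate/demodulate chain over an ideal channel.

    Transmitter: BPSK-map bits, IFFT per block, prepend cyclic prefix.
    Receiver: strip prefix, FFT, slice back to bits.
    """
    symbols = 1.0 - 2.0 * np.asarray(bits, dtype=float)  # BPSK: 0 -> +1, 1 -> -1
    symbols = symbols.reshape(-1, n_sub)                 # one row per OFDM symbol
    time_blocks = np.fft.ifft(symbols, axis=1)
    with_cp = np.hstack([time_blocks[:, -cp_len:], time_blocks])  # cyclic prefix
    # ideal channel: receiver sees the transmitted samples unchanged
    rx = with_cp[:, cp_len:]                             # strip prefix
    rx_symbols = np.fft.fft(rx, axis=1)
    return (rx_symbols.real < 0).astype(int).ravel()     # BPSK decision
```

In a fading simulation the cyclic prefix is what turns the channel's linear convolution into a circular one, letting a one-tap equalizer per subcarrier undo the channel after the FFT.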

  7. Analysis of core damage frequency: Peach Bottom, Unit 2 internal events

    Energy Technology Data Exchange (ETDEWEB)

    Kolaczkowski, A.M.; Cramond, W.R.; Sype, T.T.; Maloney, K.J.; Wheeler, T.A.; Daniel, S.L. (Science Applications International Corp., Albuquerque, NM (USA); Sandia National Labs., Albuquerque, NM (USA))

    1989-08-01

    This document contains the appendices for the accident sequence analysis of internally initiated events for the Peach Bottom, Unit 2 Nuclear Power Plant. This is one of the five plant analyses conducted as part of the NUREG-1150 effort for the Nuclear Regulatory Commission. The work performed and described here is an extensive reanalysis of that published in October 1986 as NUREG/CR-4550, Volume 4. It addresses comments from numerous reviewers and significant changes to the plant systems and procedures made since the first report. The uncertainty analysis and presentation of results are also much improved, and considerable effort was expended on an improved analysis of loss of offsite power. The content and detail of this report are directed toward PRA practitioners who need to know how the work was done and the details for use in further studies. 58 refs., 58 figs., 52 tabs.

  8. Event-related potentials reflecting the frequency of unattended spoken words

    DEFF Research Database (Denmark)

    Shtyrov, Yury; Kimppa, Lilli; Pulvermüller, Friedemann

    2011-01-01

    How are words represented in the human brain and can these representations be qualitatively assessed with respect to their structure and properties? Recent research demonstrates that neurophysiological signatures of individual words can be measured when subjects do not focus their attention ... in passive non-attend conditions, with acoustically matched high- and low-frequency words along with pseudo-words. Using factorial and correlation analyses, we found that already at ~120 ms after the spoken stimulus information was available, the amplitude of brain responses was modulated by the words' lexical ... for the most frequent word stimuli; later on (~270 ms), a more global lexicality effect with bilateral perisylvian sources was found for all stimuli, suggesting faster access to more frequent lexical entries. Our results support the account of word memory traces as interconnected neuronal circuits, and suggest ...

  9. Frequency-Based Precursory Acoustic Emission Failure Sequences In Sedimentary And Igneous Rocks Under Uniaxial Compression

    Science.gov (United States)

    Colin, C.; Anderson, R. C.; Chasek, M. D.; Peters, G. H.; Carey, E. M.

    2016-12-01

    Identifiable precursors to rock failure have been a long-pursued and infrequently encountered phenomenon in rock mechanics and acoustic emission studies. Since acoustic emissions in compressed rocks were found to follow the Gutenberg-Richter law, failure-prediction strategies based on temporal changes in b-value have been recurrent. In this study, we extend the results of Ohnaka and Mogi [Journal of Geophysical Research, Vol. 87, No. B5, pp. 3873-3884 (1982)], where the bulk frequency characteristics of rocks under incremental uniaxial compression were observed in relation to changes in b-value before and after failure. Based on the proposition that the number of low-frequency acoustic emissions is proportional to the number of high-amplitude acoustic emissions in compressed rocks, Ohnaka and Mogi (1982) demonstrated that b-value changes in granite and andesite cores under incremental uniaxial compression could be expressed in terms of the percent abundance of low-frequency events. In this study, we attempt to demonstrate that the results of Ohnaka and Mogi (1982) hold true for different rock types (basalt, sandstone, and limestone) and different sample geometries (rectangular prisms). To do so, the design of the compression tests was kept similar to that of Ohnaka and Mogi (1982). Two high-frequency piezoelectric transducers, of 1 MHz and 500 kHz, coupled to the sides of the samples detected the higher- and lower-frequency acoustic emission signals. However, rather than gathering parametric data from an analog signal using a counter as per Ohnaka and Mogi (1982), we used an oscilloscope as an analog-to-digital converter interfacing with LabVIEW 2015 to record the complete waveforms. The digitally stored waveforms were then processed, detecting acoustic emission events using a statistical method, and filtered using a 2nd-order Butterworth filter. In addition to calculating the percent abundance of low-frequency events over time, the peak frequency of the
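The b-value central to this analysis is commonly estimated with Aki's maximum-likelihood formula; a minimal sketch follows (the abstract does not state which estimator the authors used, so this is the standard textbook form, not necessarily theirs):

```python
import math

def b_value(magnitudes, m_min):
    """Aki's maximum-likelihood b-value for a Gutenberg-Richter
    distribution: b = log10(e) / (mean(M) - Mmin), computed over events
    at or above the completeness magnitude m_min."""
    mags = [m for m in magnitudes if m >= m_min]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - m_min)
```

Tracking this estimate in a sliding window over the AE catalog is the usual way to look for the pre-failure b-value drop that such studies target.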

  10. Multi Agent System Based Wide Area Protection against Cascading Events

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Liu, Leo

    2012-01-01

    In this paper, a multi-agent system based wide area protection scheme is proposed in order to prevent cascading events induced by long-term voltage instability. The distributed relays and controllers work as device agents which not only execute their normal functions automatically but can also be modified to fulfill extra functions according to external requirements. The control center is designed as the highest-level agent in the MAS to coordinate all the lower agents to prevent system-wide voltage disturbances. A hybrid simulation platform with MATLAB and RTDS is set up to demonstrate the effectiveness of the proposed protection strategy. The simulation results indicate that the proposed multi-agent control system can effectively coordinate the distributed relays and controllers to prevent cascading events induced by long-term voltage instability.

  11. Temporal and Location Based RFID Event Data Management and Processing

    Science.gov (United States)

    Wang, Fusheng; Liu, Peiya

    Advances in sensor and RFID technology provide significant new power for humans to sense, understand and manage the world. RFID provides fast data collection with precise identification of objects with unique IDs without line of sight, so it can be used for identifying, locating, tracking and monitoring physical objects. Despite these benefits, RFID poses many challenges for data processing and management: RFID data are temporal and history oriented, multi-dimensional, and carry implicit semantics. Moreover, RFID applications are heterogeneous. RFID data management or data warehouse systems need to support generic and expressive data modeling for tracking and monitoring physical objects, and provide automated data interpretation and processing. We develop a powerful temporal and location oriented data model for modeling and querying RFID data, and a declarative event and rule based framework for automated complex RFID event processing. The approach is general and can be easily adapted for different RFID-enabled applications, thus significantly reducing the cost of RFID data integration.
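The essence of a temporal, location-oriented RFID model is storing each tag sighting as a time interval at a location and answering "where was tag X at time t" queries. A minimal sketch (the field names, `Observation` record, and `where_was` query are hypothetical illustrations, not the authors' schema):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Observation:
    """One RFID location record: tag `epc` seen at `location`
    during the half-open interval [t_start, t_end)."""
    epc: str
    location: str
    t_start: float
    t_end: Optional[float]  # None = tag still at this location (open interval)

def where_was(history, epc, t):
    """Temporal query: location of a tag at time t, or None if unseen."""
    for ob in history:
        if ob.epc == epc and ob.t_start <= t and (ob.t_end is None or t < ob.t_end):
            return ob.location
    return None

log = [
    Observation("EPC-01", "dock", 0, 10),
    Observation("EPC-01", "shelf-A", 10, None),
]
print(where_was(log, "EPC-01", 5))   # dock
print(where_was(log, "EPC-01", 42))  # shelf-A
```

Complex-event rules in such frameworks are then predicates over these intervals (e.g., "tag left dock without passing checkout").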

  12. Arbitrary frequency tunable radio frequency bandpass filter based on nano-patterned Permalloy coplanar waveguide (invited)

    Science.gov (United States)

    Wang, Tengxing; Rahman, B. M. Farid; Peng, Yujia; Xia, Tian; Wang, Guoan

    2015-05-01

    A well-designed coplanar waveguide (CPW) based bandpass filter (BPF) with a tunable center frequency around 4 GHz, enabled by patterned Permalloy (Py) thin film, has been implemented. The operating frequency of the BPF is tunable with only a DC current, without the use of any external magnetic field. An electromagnetic bandgap resonator structure is adopted in the BPF, so an external DC current can be applied between the input and output of the filter to tune the Py permeability. Special resonator configurations with multiple narrow parallel sections have been considered for larger inductance tunability; the tunability of CPW transmission lines of different widths, with patterned Py thin film on top of the signal lines, is compared and measured. Py thin film patterned as bars is deposited on top of the multiple narrow parallel sections of the designed filter. No extra area is required for the designed filter configuration. The filter was measured, and the results show that its center frequency can be tuned from 4 GHz to 4.02 GHz as the DC current is increased from 0 mA to 400 mA.

  13. Frequency-agile vector signal generation based on optical frequency comb and pre-coding

    Science.gov (United States)

    Qu, Kun; Zhao, ShangHong; Tan, QingGui; Liang, DanYa

    2017-06-01

    In this paper, we experimentally demonstrate the generation of frequency-agile vector signals based on an optical frequency comb (OFC) and unbalanced pre-coding technology, employing a dual-drive Mach-Zehnder modulator (DD-MZM) and an intensity modulator (IM). The OFC is generated by the DD-MZM and sent to the IM as a carrier. The IM is driven by a 5 GHz, 2 Gbaud quadrature phase-shift keying (QPSK) vector signal with unbalanced pre-coding. The -1st order sideband of one OFC line and the +1st order sideband of another OFC line are selected by a programmable pulse shaper (PPS); after square-law photodiode detection, the frequency-agile vector signal is obtained. The results show that 2 Gbaud QPSK vector signals at 30 GHz, 50 GHz, 70 GHz and 90 GHz can be generated with only a single pre-coding step. A bit-error rate (BER) below 1e-3 is achievable for wireless transmission over 0.5 m using this method.

  14. Event-Based Control Strategy for Mobile Robots in Wireless Environments.

    Science.gov (United States)

    Socas, Rafael; Dormido, Sebastián; Dormido, Raquel; Fabregas, Ernesto

    2015-12-02

    In this paper, a new event-based control strategy for mobile robots is presented. It has been designed to work in wireless environments where a centralized controller has to exchange information with the robots over an RF (radio frequency) interface. The event-based architectures have been developed for differential wheeled robots, although they can be applied to other kinds of robots in a simple way. The solution has been tested with classical navigation algorithms, such as wall following and obstacle avoidance, in scenarios with a single robot or multiple robots. A comparison between the proposed architectures and the classical discrete-time strategy is also carried out. The experimental results show that the proposed solution uses communication resources more efficiently than the classical discrete-time strategy while achieving the same accuracy.
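The communication saving in event-based control comes from transmitting only when the measured state deviates sufficiently from the last transmitted value. A generic send-on-delta sketch (an illustration of the principle, not the authors' exact architecture):

```python
def event_triggered(samples, delta):
    """Transmit a sample over the (simulated) RF link only when it
    deviates from the last transmitted value by more than delta."""
    sent = []
    last = None
    for t, x in enumerate(samples):
        if last is None or abs(x - last) > delta:
            sent.append((t, x))  # event: worth a transmission
            last = x
    return sent

# Seven periodic sensor readings; only three cross the trigger threshold.
readings = [0.0, 0.02, 0.05, 0.3, 0.31, 0.8, 0.79]
msgs = event_triggered(readings, delta=0.1)
print(len(msgs), "of", len(readings), "samples transmitted")  # 3 of 7
```

A classical discrete-time controller would send all seven samples; the event-based rule sends three with bounded error delta.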

  15. Improved Goldstein Interferogram Filter Based on Local Fringe Frequency Estimation.

    Science.gov (United States)

    Feng, Qingqing; Xu, Huaping; Wu, Zhefeng; You, Yanan; Liu, Wei; Ge, Shiqi

    2016-11-23

    The quality of an interferogram, which is degraded by various sources of phase noise, greatly affects subsequent InSAR processing steps such as phase unwrapping. For interferometric SAR (InSAR) geophysical measurements such as height or displacement, phase filtering is therefore an essential step. In this work, an improved Goldstein interferogram filter is proposed to suppress the phase noise while preserving the fringe edges. First, an adaptive filtering step, performed before frequency estimation, is employed to improve the estimation accuracy. Subsequently, to preserve the fringe characteristics, the estimated fringe frequency in each fixed filtering patch is removed from the original noisy phase. Then, the residual phase is smoothed with a modified Goldstein filter whose parameter alpha depends on both the coherence map and the residual phase frequency. Finally, the filtered residual phase and the removed fringe frequency are combined to generate the filtered interferogram, minimizing signal loss while reducing the noise level. The effectiveness of the proposed method is verified by experimental results on both simulated and real data.
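The core Goldstein operation being modified above is patch-wise spectral weighting: amplify each spectral component by its own magnitude raised to a power alpha, so the dominant fringe peak grows relative to broadband noise. A minimal sketch of that classic fixed-alpha step on a synthetic patch (the paper's variant additionally removes the estimated fringe frequency first and makes alpha coherence-dependent):

```python
import numpy as np

def goldstein_patch(patch, alpha):
    """Classic Goldstein-style filtering of one complex interferogram patch:
    weight the patch spectrum by its magnitude raised to the power alpha."""
    spec = np.fft.fft2(patch)
    return np.fft.ifft2(spec * np.abs(spec) ** alpha)

# Synthetic 32x32 patch: a horizontal fringe pattern plus phase noise.
rng = np.random.default_rng(0)
fringe = np.exp(1j * np.linspace(0, 4 * np.pi, 32))[None, :] * np.ones((32, 1))
noisy = fringe * np.exp(1j * 0.5 * rng.standard_normal((32, 32)))
filtered = goldstein_patch(noisy, alpha=0.8)
```

After filtering, the spectral energy is more concentrated at the fringe peak, which is exactly why over-filtering (alpha too large) blurs genuine fringe edges, motivating the adaptive alpha of the record above.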

  16. Trend of annual temperature and frequency of extreme events in the MATOPIBA region of Brazil

    Science.gov (United States)

    Salvador, Mozar de A.; de Brito, J. I. B.

    2017-06-01

    During the 1980s, a new agricultural frontier arose in Brazil, occupying part of the states of Maranhão, Tocantins, Piauí, and Bahia. Currently, this frontier is known as the MATOPIBA region. The region has gone through intense transformations in its social and environmental characteristics, with the emergence of extensive areas of intensive agriculture and large herds. The purpose of this research was to study the climatic variability of temperature in the MATOPIBA region through extreme climate indices from the ClimAp tool. Data from 11 weather stations were analyzed for annual maximum and minimum air temperature over the period 1970 to 2012. To verify trends in the series, we used linear regression analysis and the Kendall tau test. The annual analysis of maximum and minimum temperatures and of the temperature extreme indices showed a strong positive trend in practically every series (with p values less than 0.05). These results indicate that the region underwent significant warming in the last three decades. The extreme indices also showed a significant positive trend at most of the analyzed stations, indicating a higher frequency of warm days during the year.
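The Kendall-tau-style trend test used in records like this one can be sketched in a few lines: count concordant minus discordant pairs (the S statistic) and normalize to a Z score. A minimal sketch on an invented annual-temperature series (the variance formula below assumes no tied values):

```python
import math
from itertools import combinations

def mann_kendall(series):
    """Mann-Kendall trend test: returns (S, Z).
    Positive Z indicates an upward trend; |Z| > 1.96 is significant
    at the 5% level. No-ties variance formula assumed."""
    n = len(series)
    s = sum((b > a) - (b < a) for a, b in combinations(series, 2))
    var = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var)
    elif s < 0:
        z = (s + 1) / math.sqrt(var)
    else:
        z = 0.0
    return s, z

# Invented annual maximum temperatures with a warming drift:
tmax = [28.1, 28.3, 28.0, 28.6, 28.9, 28.7, 29.2, 29.4, 29.3, 29.8]
s, z = mann_kendall(tmax)
print(s, round(z, 2))  # S = 37, Z ≈ 3.22 (> 1.96: significant at the 5% level)
```

Being rank-based, the test needs no assumption of normality, which is why it is the standard recommendation for climate trend detection.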

  17. Full-degrees-of-freedom frequency based substructuring

    Science.gov (United States)

    Drozg, Armin; Čepon, Gregor; Boltežar, Miha

    2018-01-01

    Dividing a whole system into multiple subsystems that are analyzed separately is common practice in the field of structural dynamics. The substructuring process improves computational efficiency and enables an effective realization of local optimization, model updating and sensitivity analyses. This paper focuses on frequency-based substructuring methods using experimentally obtained data. An efficient substructuring process has already been demonstrated using numerically obtained frequency-response functions (FRFs). However, the experimental process suffers from several difficulties, many of which are related to the rotational degrees of freedom. Thus, several attempts have been made to measure, expand or combine numerical correction methods in order to obtain a complete response model. The proposed methods have numerous limitations and are not yet generally applicable. Therefore, in this paper an alternative approach based on experimentally obtained data only is proposed. The force-excited part of the FRF matrix is measured with piezoelectric translational and rotational direct accelerometers. The incomplete moment-excited part of the FRF matrix is expanded based on the modal model. The proposed procedure is integrated in a Lagrange Multiplier Frequency Based Substructuring method and demonstrated on a simple beam structure, where the connection coordinates are mainly associated with the rotational degrees of freedom.
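The Lagrange Multiplier FBS coupling step itself is a compact matrix identity, Yc = Y − Y Bᵀ (B Y Bᵀ)⁻¹ B Y, where Y is the block-diagonal matrix of subsystem FRFs and B the signed Boolean compatibility matrix. A numerical sketch on assumed toy substructures (one grounded mass-spring and one free mass, coupled rigidly at a single frequency), checked against the directly assembled system:

```python
import numpy as np

def lm_fbs(Y, B):
    """LM-FBS coupling: Yc = Y - Y B' (B Y B')^-1 B Y."""
    BYBt = B @ Y @ B.T
    return Y - Y @ B.T @ np.linalg.solve(BYBt, B @ Y)

# Two substructures evaluated at one frequency w (toy example):
w, k1, m1, m2 = 2.0, 10.0, 1.0, 0.5
Ya = 1.0 / (k1 - w**2 * m1)   # grounded mass-spring, driving-point receptance
Yb = 1.0 / (-(w**2) * m2)     # free mass, driving-point receptance
Y = np.diag([Ya, Yb])
B = np.array([[1.0, -1.0]])   # compatibility constraint: u_a - u_b = 0
Yc = lm_fbs(Y, B)

# Direct reference: coupled system is mass (m1 + m2) on spring k1.
exact = 1.0 / (k1 - w**2 * (m1 + m2))
print(np.isclose(Yc[0, 0], exact))  # True
```

In practice Y holds full measured FRF matrices per frequency line, and the quality of the rotational rows and columns of Y is exactly what the record above addresses.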

  18. Event-based internet biosurveillance: relation to epidemiological observation

    Directory of Open Access Journals (Sweden)

    Nelson Noele P

    2012-06-01

    Background: The World Health Organization (WHO) collects and publishes surveillance data and statistics for select diseases, but traditional methods of gathering such data are time and labor intensive. Event-based biosurveillance, which utilizes a variety of Internet sources, complements traditional surveillance. In this study we assess the reliability of Internet biosurveillance and evaluate disease-specific alert criteria against epidemiological data. Methods: We reviewed and compared WHO epidemiological data and Argus biosurveillance system data for pandemic (H1N1) 2009 (April 2009 - January 2010) from 8 regions and 122 countries to: identify reliable alert criteria among 15 Argus-defined categories; determine the degree of data correlation for disease progression; and assess timeliness of Internet information. Results: Argus generated a total of 1,580 unique alerts; 5 alert categories generated statistically significant results. Conclusion: Confirmed pandemic (H1N1) 2009 cases collected by Argus and WHO methods returned consistent results and confirmed the reliability and timeliness of Internet information. Disease-specific alert criteria provide situational awareness and may serve as proxy indicators of event progression and escalation in lieu of traditional surveillance data; alerts may identify early-warning indicators of another pandemic, preparing the public health community for disease events.

  19. CHANGES IN FREQUENCY, PERSISTENCE AND INTENSITY OF EXTREME HIGH-TEMPERATURE EVENTS IN THE ROMANIAN PLAIN

    Directory of Open Access Journals (Sweden)

    DRAGOTĂ CARMEN-SOFIA

    2015-03-01

    Recent summer heat waves (2003, 2010) had a strong socio-economic impact in different parts of the continent by means of crop shortfalls and forest fires. Sustained hot days became more frequent in recent decades in many European regions, affecting human health and leading to additional deaths. This signal has been outlined in many studies conducted in Romania, suggesting that the southern region of Romania is particularly subject to large temperature increases. This work investigates changing annual and seasonal heat waves at the regional scale of the Romanian Plain over the period 1961-2014. Daily maximum temperatures recorded at six weather stations, available from the ECA&D project (European Climate Assessment and Dataset), were analyzed. The changes in the seasonal frequency, duration and intensity of heat waves were studied using the Mann-Kendall nonparametric trend test, as recommended by the scientific expert team on climate change detection. The likelihood of higher maximum temperatures rose, particularly after the mid-1980s, and the changes in the upper tail of the probability density functions of these temperatures, within the extreme domain (beyond the 95th percentile level), explain the persistence and intensity of heat waves. The upward trends are dominant for most of the year, and many of the calculated decadal slopes were found statistically significant (relative to the 5% level), proving an ongoing and strong warming all over the region. Our findings are in good agreement with several recent studies carried out at the European and national scales and call for further scientific analyses, e.g., of heat stress impact on public health and agriculture.

  20. Time and frequency domain characteristics of detrending-operation-based scaling analysis: Exact DFA and DMA frequency responses

    Science.gov (United States)

    Kiyono, Ken; Tsujimoto, Yutaka

    2016-07-01

    We develop a general framework to study the time and frequency domain characteristics of detrending-operation-based scaling analysis methods, such as detrended fluctuation analysis (DFA) and detrending moving average (DMA) analysis. In this framework, using either the time or frequency domain approach, the frequency responses of detrending operations are calculated analytically. Although the frequency domain approach based on conventional linear analysis techniques is only applicable to linear detrending operations, the time domain approach presented here is applicable to both linear and nonlinear detrending operations. Furthermore, using the relationship between the time and frequency domain representations of the frequency responses, the frequency domain characteristics of nonlinear detrending operations can be obtained. Based on the calculated frequency responses, it is possible to establish a direct connection between the root-mean-square deviation of the detrending-operation-based scaling analysis and the power spectrum for linear stochastic processes. Here, by applying our methods to DFA and DMA, including higher-order cases, exact frequency responses are calculated. In addition, we analytically investigate the cutoff frequencies of DFA and DMA detrending operations and show that these frequencies are not optimally adjusted to coincide with the corresponding time scale.
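The detrending operation whose frequency response the record above analyzes can be made concrete with a plain DFA implementation: integrate the signal, fit and subtract a polynomial trend in windows of length n, and record the RMS residual F(n). A minimal sketch (first-order detrending, non-overlapping windows; scale choices are illustrative):

```python
import numpy as np

def dfa(x, scales, order=1):
    """Detrended fluctuation analysis: RMS deviation F(n) around a
    polynomial trend of given order, fitted in non-overlapping
    windows of length n of the integrated profile."""
    y = np.cumsum(x - np.mean(x))  # integrated profile
    F = []
    for n in scales:
        n_win = len(y) // n
        segs = y[:n_win * n].reshape(n_win, n)
        t = np.arange(n)
        res = []
        for seg in segs:
            coef = np.polyfit(t, seg, order)
            res.append(seg - np.polyval(coef, t))
        F.append(np.sqrt(np.mean(np.concatenate(res) ** 2)))
    return np.array(F)

rng = np.random.default_rng(1)
white = rng.standard_normal(2 ** 14)
scales = np.array([16, 32, 64, 128, 256])
F = dfa(white, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"estimated scaling exponent: {alpha:.2f}")  # theory: 0.5 for white noise
```

The paper's contribution is precisely to characterize, in the frequency domain, what the per-window polynomial subtraction above does to the signal, which connects the slope of log F(n) to the power-spectral exponent.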

  1. Structural eigenfrequency optimization based on local sub-domain "frequencies"

    DEFF Research Database (Denmark)

    Pedersen, Pauli; Pedersen, Niels Leergaard

    2013-01-01

    The engineering approach of fully stressed design is a practical tool with a theoretical foundation. The analog approach to structural eigenfrequency optimization is presented here with its theoretical foundation. A numerical redesign procedure is proposed and illustrated with examples. For the ideal case, an optimality criterion is fulfilled if the design has the same sub-domain "frequency" (local Rayleigh quotient) throughout. Sensitivity analysis shows an important relation between the squared system eigenfrequency and the squared local sub-domain frequency for a given eigenmode. Higher-order eigenfrequencies may also be controlled in this manner. The presented examples are based on 2D finite element models with the use of subspace iteration for analysis and a recursive design procedure based on the derived optimality condition. The design that maximizes a frequency depends on the total amount…

  2. A Frequency Multiplier Based on Time Recursive Processing

    Directory of Open Access Journals (Sweden)

    D. M. Perisic

    2017-12-01

    This paper describes a digital frequency multiplier for a pulse rate. The multiplier is based on recursive processing of the input and output periods and their time differences. Special emphasis is devoted to the techniques that enable the development of multipliers based on this principle. The circuit is defined by two system parameters: one is the ratio of two clock frequencies and the other is the division factor of a binary counter. The realization of the circuit is described, and the region of system parameters for which the circuit is stable is presented. Different aspects of applications and limitations in the realization of the circuit are considered. All mathematical analyses are performed using a Z-transform approach. It is shown that the circuit can also be used in tracking and prediction applications. Computer simulations are performed to confirm the correctness of the mathematics and of the overall approach.

  3. The InfiniBand based Event Builder implementation for the LHCb upgrade

    Science.gov (United States)

    Falabella, A.; Giacomini, F.; Manzali, M.; Marconi, U.; Neufeld, N.; Valat, S.; Voneki, B.

    2017-10-01

    The LHCb experiment will undergo a major upgrade during the second long shutdown (2019 - 2020). The upgrade will concern both the detector and the Data Acquisition system, which are to be rebuilt in order to optimally exploit the foreseen higher event rate. The Event Builder is the key component of the DAQ system, for it gathers data from the sub-detectors and builds up the whole event. The Event Builder network has to manage an incoming data rate of 32 Tb/s from a 40 MHz bunch-crossing frequency, with a cardinality of about 500 nodes. In this contribution we present an Event Builder implementation based on the InfiniBand network technology. This software relies on the InfiniBand verbs, which offers a user space interface to employ the Remote Direct Memory Access capabilities provided by the InfiniBand network devices. We will present the performance of the software on a cluster connected with 100 Gb/s InfiniBand network.
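Independently of the InfiniBand transport, the event-building logic itself is simple: collect one fragment per data source for each event ID and emit the complete event once every source has reported. A minimal sketch (the sub-detector source names below are invented for illustration):

```python
from collections import defaultdict

class EventBuilder:
    """Minimal event-builder logic: gather one fragment per source
    for each event ID; emit the full event when all sources reported."""
    def __init__(self, n_sources):
        self.n_sources = n_sources
        self.partial = defaultdict(dict)  # event_id -> {source: payload}

    def add_fragment(self, event_id, source, payload):
        self.partial[event_id][source] = payload
        if len(self.partial[event_id]) == self.n_sources:
            return self.partial.pop(event_id)  # complete event, release memory
        return None

eb = EventBuilder(n_sources=3)
eb.add_fragment(7, "velo", b"aa")           # incomplete -> None
eb.add_fragment(7, "rich", b"bb")           # incomplete -> None
event = eb.add_fragment(7, "calo", b"cc")   # third fragment completes the event
print(sorted(event))  # ['calo', 'rich', 'velo']
```

At 32 Tb/s the hard part is of course the transport, not this bookkeeping, which is why the record above focuses on RDMA via InfiniBand verbs rather than on assembly logic.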

  4. Investigation of the frequency content of ground motions recorded during strong Vrancea earthquakes, based on deterministic and stochastic indices

    OpenAIRE

    Craifaleanu, Iolanda-Gabriela

    2013-01-01

    The paper presents results from a recent study in progress, involving an extensive analysis, based on several deterministic and stochastic indices, of the frequency content of ground motions recorded during strong Vrancea seismic events. The study, continuing those initiated by Lungu et al. in the early nineties, aims to better reveal the characteristics of the analyzed ground motions. Over 300 accelerograms, recorded during the strong Vrancea seismic events mentioned above and recently re-di...

  5. Event-based image recognition applied in tennis training assistance

    Science.gov (United States)

    Wawrzyniak, Zbigniew M.; Kowalski, Adam

    2016-09-01

    This paper presents a concept for a real-time system for individual tennis training assistance. The system is intended to provide the user (player) with information on stroke accuracy as well as other training quality parameters, such as the velocity and rotation of the ball during its flight. The method is based on image processing combined with exploratory analysis of detected events and their description by movement parameters. A concept for further development into a complete system that could assist a tennis player during individual training is also presented.

  6. Intelligent Transportation Control based on Proactive Complex Event Processing

    Directory of Open Access Journals (Sweden)

    Wang Yongheng

    2016-01-01

    Complex Event Processing (CEP) has become a key part of the Internet of Things (IoT). Proactive CEP can predict future system states and execute actions to avoid unwanted states, which brings new possibilities to intelligent transportation control. In this paper, we propose a proactive CEP architecture and method for intelligent transportation control. Based on basic CEP technology and predictive analytics, a networked distributed Markov decision process model with state prediction is proposed as the sequential decision model, and a Q-learning method is proposed for this model. The experimental evaluations show that this method works well when used to control congestion in intelligent transportation systems.
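The Q-learning component can be sketched with a toy two-road intersection: the state is which road is congested, the action is which phase to turn green, and matching the green phase to the congested road is rewarded. All states, actions, and rewards below are invented for illustration; the paper's networked distributed MDP is far richer:

```python
import random
from collections import defaultdict

def q_learning(env_step, states, actions, episodes=2000,
               alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning with epsilon-greedy exploration:
    Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    rng = random.Random(seed)
    Q = defaultdict(float)
    for _ in range(episodes):
        s = rng.choice(states)
        for _ in range(20):
            if rng.random() < eps:
                a = rng.choice(actions)                    # explore
            else:
                a = max(actions, key=lambda x: Q[(s, x)])  # exploit
            s2, r = env_step(s, a, rng)
            Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, b)] for b in actions)
                                  - Q[(s, a)])
            s = s2
    return Q

# Toy intersection: state = congested road ('NS' or 'EW'); action = green phase.
def step(state, action, rng):
    reward = 1.0 if action == state else -1.0  # clear the congested road
    return rng.choice(["NS", "EW"]), reward

Q = q_learning(step, ["NS", "EW"], ["NS", "EW"])
print(max(["NS", "EW"], key=lambda a: Q[("NS", a)]))  # NS
```

The proactive element of the record above enters through the predicted next states s', which here are just sampled; a predictive model would supply them instead.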

  7. A Bayesian Model for Event-based Trust

    DEFF Research Database (Denmark)

    Nielsen, Mogens; Krukow, Karl; Sassone, Vladimiro

    2007-01-01

    The application scenarios envisioned for ‘global ubiquitous computing’ have unique requirements that are often incompatible with traditional security paradigms. One alternative currently being investigated is to support security decision-making by explicit representation of principals' trusting...... of the systems from the computational trust literature; the comparison is derived formally, rather than obtained via experimental simulation as traditionally done. With this foundation in place, we formalise a general notion of information about past behaviour, based on event structures. This yields a flexible...

  8. MAS Based Event-Triggered Hybrid Control for Smart Microgrids

    DEFF Research Database (Denmark)

    Dou, Chunxia; Liu, Bin; Guerrero, Josep M.

    2013-01-01

    This paper is focused on advanced control for autonomous microgrids. In order to improve performance regarding security and stability, a hierarchical decentralized coordinated control scheme is proposed based on a multi-agent structure. Moreover, corresponding to the multi-mode and hybrid characteristics of microgrids, an event-triggered hybrid control, including three kinds of switching controls, is designed to intelligently reconstruct the operation mode when the security-stability assessment indexes or the constraint conditions are violated. The validity of the proposed control scheme is demonstrated…

  9. Identification of new events in Apollo 16 lunar seismic data by Hidden Markov Model-based event detection and classification

    Science.gov (United States)

    Knapmeyer-Endrun, Brigitte; Hammer, Conny

    2015-10-01

    Detection and identification of interesting events in single-station seismic data with little prior knowledge and under tight time constraints is a typical scenario in planetary seismology. The Apollo lunar seismic data, with the only confirmed events recorded on any extraterrestrial body yet, provide a valuable test case. Here we present the application of a stochastic event detector and classifier to the data of station Apollo 16. Based on a single-waveform example for each event class and some hours of background noise, the system is trained to recognize deep moonquakes, impacts, and shallow moonquakes, and performs reliably over 3 years of data. The algorithm's demonstrated ability to detect rare events and flag previously undefined signal classes as new event types is of particular interest in the analysis of the first seismic recordings from a completely new environment. We are able to classify more than 50% of previously unclassified lunar events, and additionally find over 200 new events not listed in the current lunar event catalog. These events include deep moonquakes as well as impacts and could be used to update studies on temporal variations in event rate or deep moonquake stacks used in phase picking for localization. No unambiguous new shallow moonquake was detected, but application to the data of the other Apollo stations has the potential for additional new discoveries 40 years after the data were recorded. In addition, the classification system could be useful for future seismometer missions to other planets, e.g., the InSight mission to Mars.
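The scoring step of an HMM-based classifier evaluates the observation sequence under each class model and picks the most likely class. A minimal discrete-observation forward algorithm with scaling illustrates this; the two toy "event class" models below are invented, not the trained lunar-event HMMs:

```python
import numpy as np

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM
    (forward algorithm with per-step normalization to avoid underflow).
    pi: initial state probs; A: transition matrix; B: emission matrix."""
    alpha = pi * B[:, obs[0]]
    ll = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        ll += np.log(alpha.sum())
        alpha /= alpha.sum()
    return ll

# Two toy event classes over the symbols {0: quiet, 1: loud}:
# 'impact' starts loud, 'noise' stays mostly quiet.
pi = np.array([1.0, 0.0])
A = np.array([[0.8, 0.2], [0.2, 0.8]])
B_impact = np.array([[0.1, 0.9], [0.7, 0.3]])
B_noise = np.array([[0.9, 0.1], [0.7, 0.3]])

obs = [1, 1, 1, 0, 1]
scores = {"impact": forward_loglik(obs, pi, A, B_impact),
          "noise": forward_loglik(obs, pi, A, B_noise)}
print(max(scores, key=scores.get))  # impact
```

Flagging "new event types", as in the record above, then amounts to rejecting sequences whose best class likelihood is still below a threshold.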

  10. Nano-Scale Devices for Frequency-Based Magnetic Biosensing

    Science.gov (United States)

    2017-01-31

    ... (which itself contains a magnetic vortex). Magnetoresistive rectification (which depends on anisotropic magnetoresistance) leads to a voltage ... — Peter Metaxas, University of Western Australia, Final Report AFRL-AFOSR-JP-TR-2017-0010, 09-02-2017.

  11. Rain-on-snow events over North America based on two Canadian regional climate models

    Science.gov (United States)

    Il Jeong, Dae; Sushama, Laxmi

    2018-01-01

    This study evaluates projected changes to rain-on-snow (ROS) characteristics (i.e., frequency, rainfall amount, and runoff) for the future 2041-2070 period with respect to the current 1976-2005 period over North America, using six simulations based on two Canadian RCMs driven by two GCMs for the RCP4.5 and 8.5 emission pathways. Prior to assessing projected changes, the two RCMs are evaluated by comparing ERA-Interim-driven RCM simulations with available observations; the results indicate that both models reproduce reasonably well the observed spatial patterns of ROS event frequency and other related features. Analysis of current and future simulations suggests general increases in ROS characteristics during the November-March period for most regions of Canada and for the northwestern US in the future period, due to an increase in rainfall frequency with warmer air temperatures. Future ROS runoff is often projected to increase more than future ROS rainfall amounts, particularly for northeastern North America during snowmelt months, as ROS events usually accelerate snowmelt. The simulations show that ROS events are a primary flood-generating mechanism over most of Canada and the northwestern and north-central US for the January-May period in the current climate, and this is projected to continue in the future period. More focused analysis over selected basins shows decreases in future spring runoff due to decreases in both snow cover and ROS runoff. The above results highlight the need to take ROS events into consideration in water resources management adaptation strategies for the future climate.

  12. Memory-based mismatch response to frequency changes in rats.

    Directory of Open Access Journals (Sweden)

    Piia Astikainen

    Occasional changes in the acoustic environment are of potential importance for survival. In humans, the preattentive detection of such changes generates the mismatch negativity (MMN) component of event-related brain potentials. MMN is elicited by rare changes ('deviants') in a series of otherwise regularly repeating stimuli ('standards'). Deviant stimuli are detected on the basis of a neural comparison process between the input from the current stimulus and the sensory memory trace of the standard stimuli. It is, however, unclear to what extent animals show a similar comparison process in response to auditory changes. To resolve this issue, epidural potentials were recorded above the primary auditory cortex of urethane-anesthetized rats. In an oddball condition, tone frequency was used to differentiate deviants interspersed randomly among a standard tone. Mismatch responses were observed at 60-100 ms after stimulus onset for frequency increases of 5% and 12.5%, but not for similarly descending deviants. The response diminished when the silent inter-stimulus interval was increased from 375 ms to 600 ms for +5% deviants and from 600 ms to 1000 ms for +12.5% deviants. In comparison to the oddball condition, the response also diminished in a control condition in which no repetitive standards were presented (equiprobable condition). These findings suggest that the rat mismatch response is similar to the human MMN and indicate that anesthetized rats provide a valuable model for studies of central auditory processing.

  13. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job-shop type manufacturing, but certain facilities make it suitable for FMS as well as production-line manufacturing. This type of simulation is very useful in the analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP (work in process), the costs or the net profit can be analysed. And this can be done before the changes are made, and without disturbing the real system. Simulation takes into consideration, unlike other tools for the analysis of manufacturing systems, uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects that a job which is late at one machine has on the remaining machines in its route through the layout. It is these effects that cause production plans not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object-oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user-friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages with business graphics and statistical functions is convenient in the result presentation phase.
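The event-list mechanism at the heart of discrete event simulation can be sketched with one machine and a queue: a time-ordered heap of events drives all state changes, and random arrival and service times capture the uncertainty the record above mentions. A minimal sketch (not SIMMEK itself; rates and job count are illustrative):

```python
import heapq
import random

def simulate_machine(arrival_rate, service_rate, n_jobs, seed=0):
    """Minimal discrete-event simulation of one machine with a FIFO queue.
    Returns the mean time jobs wait in the queue."""
    rng = random.Random(seed)
    events, t = [], 0.0
    for i in range(n_jobs):                      # pre-schedule all arrivals
        t += rng.expovariate(arrival_rate)
        heapq.heappush(events, (t, "arrival", i))
    queue, server_free = [], True
    arrived, waits = {}, {}
    while events:                                # main event loop
        now, kind, job = heapq.heappop(events)
        if kind == "arrival":
            arrived[job] = now
            queue.append(job)
        else:                                    # "done": machine freed
            server_free = True
        if server_free and queue:                # dispatch next job, if any
            nxt = queue.pop(0)
            waits[nxt] = now - arrived[nxt]
            server_free = False
            heapq.heappush(events,
                           (now + rng.expovariate(service_rate), "done", nxt))
    return sum(waits.values()) / n_jobs

w = simulate_machine(arrival_rate=0.5, service_rate=1.0, n_jobs=10000)
print(round(w, 2), "mean wait at 50% utilization")
```

Multi-machine job-shop models like SIMMEK's are the same loop with more event kinds and per-machine queues; the late-job interaction effects described above emerge from exactly this event ordering.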

  14. Lessons Learned from Real-Time, Event-Based Internet Science Communications

    Science.gov (United States)

    Phillips, T.; Myszka, E.; Gallagher, D. L.; Adams, M. L.; Koczor, R. J.; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    For the last several years the Science Directorate at Marshall Space Flight Center has carried out a diverse program of Internet-based science communication. The Directorate's Science Roundtable includes active researchers, NASA public relations, educators, and administrators. The Science@NASA award-winning family of Web sites features science, mathematics, and space news. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. The focus of sharing science activities in real-time has been to involve and excite students and the public about science. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases, broadcasts accommodate active feedback and questions from Internet participants. Through these projects a pattern has emerged in the level of interest or popularity with the public. The pattern differentiates projects that include science from those that do not, All real-time, event-based Internet activities have captured public interest at a level not achieved through science stories or educator resource material exclusively. The worst event-based activity attracted more interest than the best written science story. One truly rewarding lesson learned through these projects is that the public recognizes the importance and excitement of being part of scientific discovery. Flying a camera to 100,000 feet altitude isn't as interesting to the public as searching for viable life-forms at these oxygen-poor altitudes. The details of these real-time, event-based projects and lessons learned will be discussed.

  15. Quantification of LOCA core damage frequency based on thermal-hydraulics analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jaehyun, E-mail: chojh@kaeri.re.kr; Park, Jin Hee; Kim, Dong-San; Lim, Ho-Gon

    2017-04-15

Highlights: • We quantified the LOCA core damage frequency based on best-estimate success criteria analysis. • A thermal-hydraulic analysis using the MARS code was applied to the Korea Standard Nuclear Power Plant. • Five new event trees with new break size boundaries and new success criteria were developed. • The core damage frequency is 5.80E−07 (/y), 12% less than with the conventional PSA event trees. - Abstract: A loss-of-coolant accident (LOCA) has always been considered one of the most important initiating events. Most probabilistic safety assessment (PSA) models, however, have retained the conventional three LOCA groups, with break size boundaries taken unchanged from the WASH-1400 report published in 1975. With an awareness of the importance of a realistic PSA for risk-informed applications, several studies have tried to determine the realistic thermal-hydraulic behavior of a LOCA and improve the PSA model. The purpose of this research is to obtain realistic results for the LOCA core damage frequency based on a success criteria analysis using a best-estimate thermal-hydraulics code. To do so, the Korea Standard Nuclear Power Plant (KSNP) was selected for this study. The MARS code was used for the thermal-hydraulics analysis and the AIMS code was used for the core damage quantification. One of the major findings of the thermal-hydraulics analysis was that the decay power is well removed by normal secondary cooling alone in LOCAs below 1.4 in and by high-pressure safety injection alone in LOCAs of 0.8–9.4 in. Based on the thermal-hydraulics results, regarding new break size boundaries and new success criteria, five new event trees (ETs) were developed. The core damage frequency of the new LOCA ETs is 5.80E−07 (/y), which is 12% less than that of the conventional PSA ETs. In this research, we obtained not only thermal-hydraulics characteristics for the entire break size range of a LOCA in view of the deterministic safety
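The quantification step described above reduces, at its core, to summing accident-sequence frequencies over an event tree: initiator frequency times the branch probabilities along each core-damage path. A minimal sketch with hypothetical numbers (not the KSNP model or the AIMS code):

```python
# Minimal event-tree quantification sketch (hypothetical values, not the
# KSNP model): core damage frequency (CDF) is the sum, over all accident
# sequences ending in core damage, of the initiating-event frequency times
# the product of branch probabilities along that sequence.

def sequence_frequency(init_freq, branch_probs):
    """Frequency (per year) of one accident sequence."""
    f = init_freq
    for p in branch_probs:
        f *= p
    return f

def core_damage_frequency(init_freq, cd_sequences):
    """Sum sequence frequencies over all core-damage end states."""
    return sum(sequence_frequency(init_freq, seq) for seq in cd_sequences)

# Hypothetical small-break LOCA: 1e-4 /y initiator, two core-damage
# sequences (HPSI fails; or HPSI works but recirculation fails).
cdf = core_damage_frequency(1.0e-4, [
    [1.0e-3],                 # HPSI failure
    [(1 - 1.0e-3), 5.0e-4],   # HPSI success, recirculation failure
])
print(cdf)  # ~1.5e-07 per year
```

Real PSA quantification additionally propagates fault-tree minimal cut sets and common-cause terms into each branch probability; this sketch shows only the top-level arithmetic.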

  16. Time-Frequency Spectral Differences in Event-Related Potentials between Neurotic and Stable Persons in Human EEG

    Directory of Open Access Journals (Sweden)

    Christova C.

    2008-12-01

Full Text Available The aim of this work is to show how wavelet and S-transform power spectrum analysis can be used to detect time-frequency spectral differences in series of event-related potentials recorded from neurotic and stable persons. We compared the EEG records, in a simple counting-task condition, of 30 healthy subjects divided into stable and neurotic groups according to their scores on the neuroticism scale of Eysenck's Personality Questionnaire. Significant differences were found in the theta and alpha EEG bands. The stable persons are characterized by more prominent theta and less prominent alpha spectral power compared to the neurotic group. The application of complex decomposed functions for both wavelet and S-transform power spectrum analysis proved more useful for discriminating between the two groups of subjects.
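The band-power comparison at the heart of this result can be illustrated with a plain DFT (the paper itself uses wavelet and S-transform spectra; the sampling rate, record length, and band edges below are assumptions):

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Power in [f_lo, f_hi) Hz from a naive DFT. Illustrative only; a
    time-frequency analysis would use wavelets or the S-transform."""
    n = len(signal)
    total = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f < f_hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            total += (re * re + im * im) / n
    return total

fs = 128.0  # assumed sampling rate (Hz)
t = [i / fs for i in range(256)]
# Synthetic "EEG": strong 6 Hz theta component plus weaker 10 Hz alpha.
x = [2.0 * math.sin(2 * math.pi * 6 * ti) + 0.5 * math.sin(2 * math.pi * 10 * ti)
     for ti in t]
theta = band_power(x, fs, 4.0, 8.0)   # conventional theta band
alpha = band_power(x, fs, 8.0, 13.0)  # conventional alpha band
print(theta > alpha)  # True: theta dominates this synthetic record
```

A "more prominent theta, less prominent alpha" profile, as reported for the stable group, corresponds to `theta > alpha` on such band powers.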

  17. SPREAD: a high-resolution daily gridded precipitation dataset for Spain - an extreme events frequency and intensity overview

    Science.gov (United States)

    Serrano-Notivoli, Roberto; Beguería, Santiago; Ángel Saz, Miguel; Longares, Luis Alberto; de Luis, Martín

    2017-09-01

A high-resolution daily gridded precipitation dataset was built from raw data from 12 858 observatories covering the period from 1950 to 2012 in peninsular Spain and 1971 to 2012 in the Balearic and Canary Islands. The original data were quality-controlled and gaps were filled on each day and location independently. Using the serially complete dataset, a grid with a 5 × 5 km spatial resolution was constructed by estimating daily precipitation amounts and their corresponding uncertainty at each grid node. Daily precipitation estimates were compared to the original observations to assess the quality of the gridded dataset. Four daily precipitation indices were computed to characterise the spatial distribution of daily precipitation, and nine extreme precipitation indices were used to describe the frequency and intensity of extreme precipitation events. The Mediterranean coast and the Central Range showed the highest frequency and intensity of extreme events, while the number of wet days and dry and wet spells followed a north-west to south-east gradient in peninsular Spain, from high to low values in the number of wet days and wet spells, and the reverse in dry spells. The use of all available data in Spain, the independent estimation of precipitation for each day, and the high spatial resolution of the grid allowed for a precise spatial and temporal assessment of daily precipitation that is difficult to achieve with other methods, pre-selected long-term stations, or global gridded datasets. The SPREAD dataset is publicly available at https://doi.org/10.20350/digitalCSIC/7393.
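Extreme precipitation indices of the kind used in such analyses are simple functionals of the daily series. A sketch of one common index, the total rain falling on "very wet" days (wet days above the 95th percentile of wet-day amounts); the index definition and thresholds here are illustrative, not necessarily the paper's exact choices:

```python
# Illustrative extreme-precipitation index: total precipitation on wet
# days (>= 1 mm) exceeding the 95th percentile of wet-day amounts.
# Threshold and percentile choices are assumptions, not SPREAD's exact
# definitions.

def percentile(sorted_vals, q):
    """Simple linear-interpolation percentile (q in [0, 100])."""
    if not sorted_vals:
        raise ValueError("empty data")
    pos = (len(sorted_vals) - 1) * q / 100.0
    lo = int(pos)
    hi = min(lo + 1, len(sorted_vals) - 1)
    frac = pos - lo
    return sorted_vals[lo] * (1 - frac) + sorted_vals[hi] * frac

def very_wet_day_total(daily_mm, wet_threshold=1.0, q=95.0):
    wet = sorted(v for v in daily_mm if v >= wet_threshold)
    p = percentile(wet, q)
    return sum(v for v in wet if v > p)

# Hypothetical daily series (mm); only the wettest day exceeds the
# 95th percentile of wet-day amounts.
rain = [0.0, 0.2, 3.1, 12.0, 0.0, 45.5, 2.2, 0.0, 7.8, 1.4, 60.1, 5.0]
print(very_wet_day_total(rain))
```

In a gridded product this computation would run independently at each of the 5 × 5 km grid nodes over the full daily record.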

  18. SPREAD: a high-resolution daily gridded precipitation dataset for Spain – an extreme events frequency and intensity overview

    Directory of Open Access Journals (Sweden)

    R. Serrano-Notivoli

    2017-09-01

Full Text Available A high-resolution daily gridded precipitation dataset was built from raw data from 12 858 observatories covering the period from 1950 to 2012 in peninsular Spain and 1971 to 2012 in the Balearic and Canary Islands. The original data were quality-controlled and gaps were filled on each day and location independently. Using the serially complete dataset, a grid with a 5 × 5 km spatial resolution was constructed by estimating daily precipitation amounts and their corresponding uncertainty at each grid node. Daily precipitation estimates were compared to the original observations to assess the quality of the gridded dataset. Four daily precipitation indices were computed to characterise the spatial distribution of daily precipitation, and nine extreme precipitation indices were used to describe the frequency and intensity of extreme precipitation events. The Mediterranean coast and the Central Range showed the highest frequency and intensity of extreme events, while the number of wet days and dry and wet spells followed a north-west to south-east gradient in peninsular Spain, from high to low values in the number of wet days and wet spells, and the reverse in dry spells. The use of all available data in Spain, the independent estimation of precipitation for each day, and the high spatial resolution of the grid allowed for a precise spatial and temporal assessment of daily precipitation that is difficult to achieve with other methods, pre-selected long-term stations, or global gridded datasets. The SPREAD dataset is publicly available at https://doi.org/10.20350/digitalCSIC/7393.

  19. Electrophysiological correlates of strategic monitoring in event-based and time-based prospective memory.

    Directory of Open Access Journals (Sweden)

    Giorgia Cona

Full Text Available Prospective memory (PM) is the ability to remember to accomplish an action when a particular event occurs (i.e., event-based PM), or at a specific time (i.e., time-based PM), while performing an ongoing activity. Strategic monitoring is one of the basic cognitive functions supporting PM tasks, and involves two mechanisms: a retrieval mode, which consists of keeping the intention active in memory; and target checking, engaged to verify the presence of the PM cue in the environment. The present study aims to provide the first evidence of event-related potentials (ERPs) associated with time-based PM, and to examine differences and commonalities in the ERPs related to strategic monitoring mechanisms between event- and time-based PM tasks. The addition of an event-based or a time-based PM task to an ongoing activity led to a similar sustained positive modulation of the ERPs in the ongoing trials, mainly expressed over prefrontal and frontal regions. This modulation might index the retrieval mode mechanism, similarly engaged in the two PM tasks. On the other hand, two further ERP modulations were shown specifically in the event-based PM task. An increased positivity was shown at 400-600 ms post-stimulus over occipital and parietal regions, and might be related to target checking. Moreover, an early modulation at 130-180 ms post-stimulus seems to reflect the recruitment of attentional resources for being ready to respond to the event-based PM cue. This latter modulation suggests the existence of a third mechanism specific to event-based PM; that is, the "readiness mode".

  20. Modeling the Frequency and Costs Associated with Postsurgical Gastrointestinal Adverse Events for Tapentadol IR versus Oxycodone IR

    Science.gov (United States)

    Paris, Andrew; Kozma, Chris M.; Chow, Wing; Patel, Anisha M.; Mody, Samir H.; Kim, Myoung S.

    2013-01-01

    Background Few studies have estimated the economic effect of using an opioid that is associated with lower rates of gastrointestinal (GI) adverse events (AEs) than another opioid for postsurgical pain. Objective To estimate the number of postsurgical GI events and incremental hospital costs, including potential savings, associated with lower GI AE rates, for tapentadol immediate release (IR) versus oxycodone IR, using a literature-based calculator. Methods An electronic spreadsheet–based cost calculator was developed to estimate the total number of GI AEs (ie, nausea, vomiting, or constipation) and incremental costs to a hospital when using tapentadol IR 100 mg versus oxycodone IR 15 mg, in a hypothetical cohort of 1500 hospitalized patients requiring short-acting opioids for postsurgical pain. Data inputs were chosen from recently published, well-designed studies, including GI AE rates from a previously published phase 3 clinical trial of postsurgical patients who received these 2 opioids; GI event–related incremental length of stay from a large US hospital database; drug costs using wholesale acquisition costs in 2011 US dollars; and average hospitalization cost from the 2009 Healthcare Cost and Utilization Project database. The base case assumed that 5% (chosen as a conservative estimate) of patients admitted to the hospital would shift from oxycodone IR to tapentadol IR. Results In this hypothetical cohort of 1500 hospitalized patients, replacing 5% of oxycodone IR 15-mg use with tapentadol IR 100-mg use predicted reductions in the total number of GI events from 1095 to 1085, and in the total cost of GI AEs from $2,978,400 to $2,949,840. This cost reduction translates to a net savings of $22,922 after factoring in drug cost. For individual GI events, the net savings were $26,491 for nausea; $12,212 for vomiting; and $7187 for constipation. Conclusion Using tapentadol IR in place of a traditional μ-opioid shows the potential for reduced GI events and
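The calculator's core arithmetic is straightforward: expected GI adverse events equal cohort size times the share of patients on each drug times that drug's AE rate, and the cost impact follows from the change in events. A sketch with illustrative rates chosen to roughly reproduce the reported 1095 → 1085 event change (these are assumptions, not the published trial inputs):

```python
# Minimal sketch of a spreadsheet-style AE calculator. The rates below
# (0.73 vs 0.60 GI AEs per patient) are illustrative assumptions picked to
# roughly match the reported totals, not the published study inputs.

def expected_gi_events(n_patients, shift_share, rate_a, rate_b):
    """GI AEs before and after `shift_share` of patients move from
    drug A to drug B."""
    n_shift = n_patients * shift_share
    baseline = n_patients * rate_a
    shifted = (n_patients - n_shift) * rate_a + n_shift * rate_b
    return baseline, shifted

base, shifted = expected_gi_events(1500, 0.05, rate_a=0.73, rate_b=0.60)
events_avoided = base - shifted
print(round(events_avoided, 2))  # ~10 events avoided by the 5% shift
```

Multiplying avoided events by the event-related incremental cost, then subtracting the drug-cost difference, yields the net savings figure the calculator reports.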

  1. Event-based total suspended sediment particle size distribution model

    Science.gov (United States)

    Thompson, Jennifer; Sattar, Ahmed M. A.; Gharabaghi, Bahram; Warner, Richard C.

    2016-05-01

    One of the most challenging modelling tasks in hydrology is prediction of the total suspended sediment particle size distribution (TSS-PSD) in stormwater runoff generated from exposed soil surfaces at active construction sites and surface mining operations. The main objective of this study is to employ gene expression programming (GEP) and artificial neural networks (ANN) to develop a new model with the ability to more accurately predict the TSS-PSD by taking advantage of both event-specific and site-specific factors in the model. To compile the data for this study, laboratory scale experiments using rainfall simulators were conducted on fourteen different soils to obtain TSS-PSD. This data is supplemented with field data from three construction sites in Ontario over a period of two years to capture the effect of transport and deposition within the site. The combined data sets provide a wide range of key overlooked site-specific and storm event-specific factors. Both parent soil and TSS-PSD in runoff are quantified by fitting each to a lognormal distribution. Compared to existing regression models, the developed model more accurately predicted the TSS-PSD using a more comprehensive list of key model input parameters. Employment of the new model will increase the efficiency of deployment of required best management practices, designed based on TSS-PSD, to minimize potential adverse effects of construction site runoff on aquatic life in the receiving watercourses.
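Both the parent soil and the runoff PSD are characterised by lognormal fits. One minimal way to sketch that fitting step is the method of moments in log space (the GEP/ANN modelling itself, and the exact fitting procedure used in the paper, are beyond this sketch):

```python
import math
import statistics

# Sketch: fit lognormal parameters (mu, sigma) to particle diameters by
# taking moments of the log-transformed data. The sample diameters are
# hypothetical, and this moment fit is an assumption about the fitting
# step, not the paper's exact procedure.

def fit_lognormal(diameters_um):
    logs = [math.log(d) for d in diameters_um]
    mu = statistics.fmean(logs)
    sigma = statistics.stdev(logs)
    return mu, sigma

def lognormal_median(mu, sigma):
    return math.exp(mu)  # median diameter of the fitted distribution

# Hypothetical measured particle diameters (micrometres).
sample = [2.0, 5.0, 8.0, 12.0, 20.0, 35.0, 60.0]
mu, sigma = fit_lognormal(sample)
print(round(lognormal_median(mu, sigma), 1))
```

Summarising each PSD by two parameters (mu, sigma) is what lets a regression or machine-learning model predict the full runoff distribution from site- and event-specific inputs.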

  2. Possible shallow slow slip events in Hyuga-nada, Nankai subduction zone, inferred from migration of very low frequency earthquakes

    Science.gov (United States)

    Asano, Youichi; Obara, Kazushige; Matsuzawa, Takanori; Hirose, Hitoshi; Ito, Yoshihiro

    2015-01-01

We investigated the spatiotemporal evolution of a shallow very low frequency earthquake (sVLFE) swarm linked to the 2009/2010 long-term slow slip event (SSE) in the Bungo channel, southwestern Japan. Broadband seismograms were analyzed using a cross-correlation technique to detect sVLFEs with waveforms similar to template sVLFEs, and their relative locations were estimated. The sVLFEs exhibit clear migration over a distance of 150 km along the Nankai trough, similar to nonvolcanic tremors and deep very low frequency earthquakes (dVLFEs) accompanied by short-term SSEs on the downward extension of the seismogenic zone. This similarity between sVLFEs and dVLFEs suggests that SSEs occur in both the deeper and shallower extensions of the seismogenic zone. The analyzed sVLFEs were likely caused by a shallow SSE that occurred from January to March 2010, following the initiation and acceleration of the long-term SSE. This temporal evolution may be caused by stress interaction between the shallow SSE and the long-term SSE.

  3. Frequency of skeletal-related events and associated healthcare resource use and costs in US patients with multiple myeloma.

    Science.gov (United States)

    Nash Smyth, Emily; Conti, Ilaria; Wooldridge, James E; Bowman, Lee; Li, Li; Nelson, David R; Ball, Daniel E

    2016-01-01

    A potential complication for all new multiple myeloma (MM) patients is the clinical presentation of osteolytic lesions which increase the risk for skeletal-related events (SREs). However, the contribution of SREs to the overall economic impact of MM is unclear. The impact of SREs on healthcare resource utilization (HCRU) and costs for US patients with MM was analyzed in Truven Health Marketscan Commercial Claims and Medicare Supplemental Databases. Adults diagnosed with MM between January 1, 2005 and December 31, 2010 with ≥2 claims ≥30 days apart (first claim = index date) were included. SREs included: hypercalcemia, pathologic fracture, surgery for the prevention and treatment of pathologic fractures or spinal cord compression, and radiation for bone pain. Rates of HCRU (outpatient [OP], inpatient [IP], emergency room [ER], orthopedic consultation [OC], and ancillary) and healthcare costs were compared between MM patients with and without SREs. Inverse propensity weighting was applied to adjust for potential bias. Of 1028 MM patients (mean age = 67, standard deviation = 13.2), 596 patients with ≥1 SRE and 432 without SREs were assessed. HCRU rates in IP, ER, and ancillary (p < 0.01) and mean total costs of OP, IP, and ER were significantly higher (p < 0.05) for patients with vs without SREs during follow-up. HCRU rates also increased with SRE frequency (p < 0.05 in OP, IP, ER, OC, and ancillary), as did mean total healthcare costs, except for OC (p < 0.001). A broad assessment of pharmacotherapy for the treatment of MM was not an objective of the current study. Bisphosphonate use was evaluated; however, results were descriptively focused on frequency of utilization only and were not included in the broader cost and HCRU analysis. Among US patients with MM, higher SRE frequency was associated with a significant trend of higher HCRU and total healthcare costs in several settings.

  4. Neural correlates of multimodal metaphor comprehension: Evidence from event-related potentials and time-frequency decompositions.

    Science.gov (United States)

    Ma, Qingguo; Hu, Linfeng; Xiao, Can; Bian, Jun; Jin, Jia; Wang, Qiuzhen

    2016-11-01

The present study examined the event-related potential (ERP) and time-frequency correlates of the comprehension process for multimodal metaphors represented by the combination of "a vehicle picture + a written word of an animal". Electroencephalogram data were recorded while participants decided whether the metaphor using an animal word for the vehicle rendered by a picture was appropriate or not. There were two conditions: appropriate (e.g., sport utility vehicle + tiger) vs. inappropriate (e.g., sport utility vehicle + cat). The ERP results showed that the inappropriate metaphor elicited larger N300 (280-360 ms) and N400 (380-460 ms) amplitudes than the appropriate one, which differs from previous exclusively verbal metaphor studies that rarely observed the N300 effect. A P600 (550-750 ms) was also observed and was larger in the appropriate metaphor condition. In addition, time-frequency principal component analysis revealed that two independent theta activities indexed the separable processes (retrieval of semantic features and semantic integration) underlying the N300 and N400. A delta band response was also induced within a later time window and best characterized the integration process underlying the P600. These results indicate a cognitive mechanism specific to multimodal metaphor comprehension that differs from verbal metaphor processing, mirrored by several separable processes indexed by ERP components and time-frequency components. The present study extends metaphor research by uncovering the functional roles of delta and theta as well as their unique contributions to the ERP components during multimodal metaphor comprehension.

  5. Music, clicks, and their imaginations favor differently the event-based timing component for rhythmic movements.

    Science.gov (United States)

    Bravi, Riccardo; Quarta, Eros; Del Tongo, Claudia; Carbonaro, Nicola; Tognetti, Alessandro; Minciacchi, Diego

    2015-06-01

The involvement or noninvolvement of a clock-like neural process, an effector-independent representation of the time intervals to produce, is described as the essential difference between event-based and emergent timing. In a previous work (Bravi et al. in Exp Brain Res 232:1663-1675, 2014a. doi: 10.1007/s00221-014-3845-9), we studied repetitive isochronous wrist flexion-extensions (IWFEs), performed while minimizing visual and tactile information, to clarify whether non-temporal and temporal characteristics of paced auditory stimuli affect the precision and accuracy of rhythmic motor performance. Here, with the inclusion of new recordings, we expand the examination of the dataset described in our previous study to investigate whether simple and complex paced auditory stimuli (clicks and music) and their imaginations influence the timing mechanisms for repetitive IWFEs in different ways. Sets of IWFEs were analyzed by the windowed (lag-one) autocorrelation wγ(1), a statistical method recently introduced for distinguishing between event-based and emergent timing. Our findings provide evidence that paced auditory information and its imagination favor the engagement of a clock-like neural process, and specifically that music, unlike clicks, lacks the power to elicit event-based timing, not counteracting the natural shift of wγ(1) toward positive values as the frequency of movements increases.

  6. Robust spike classification based on frequency domain neural waveform features.

    Science.gov (United States)

    Yang, Chenhui; Yuan, Yuan; Si, Jennie

    2013-12-01

We introduce a new spike classification algorithm based on frequency domain features of the spike snippets. The goals for the algorithm are to provide high classification accuracy, low false misclassification, ease of implementation, robustness to signal degradation, and objectivity in classification outcomes. In this paper, we propose a spike classification algorithm based on frequency domain features (CFDF), which makes use of the frequency domain contents of the recorded neural waveforms for spike classification. The self-organizing map (SOM) is used as a tool to determine the cluster number intuitively and directly by viewing the SOM output map. After that, spike classification can be easily performed using clustering algorithms such as k-means. In conjunction with our previously developed multiscale correlation of wavelet coefficients (MCWC) spike detection algorithm, we show that the combined MCWC and CFDF detection and classification system is robust when tested on several sets of artificial and real neural waveforms. The CFDF is comparable to or outperforms some popular automatic spike classification algorithms on artificial and real neural data. The detection and classification of neural action potentials, or neural spikes, is an important step in single-unit-based neuroscientific studies and applications. After the detection of neural snippets potentially containing neural spikes, a robust classification algorithm is applied to the snippets to (1) group similar waveforms into one class, so that they can be considered as coming from one unit, and (2) remove noise snippets that do not contain any features of an action potential. Usually, a snippet is a small 2 or 3 ms segment of the recorded waveform, and differences in neural action potentials can be subtle from one unit to another. Therefore, a robust, high-performance classification system like the CFDF is necessary. In addition, the proposed algorithm does not require any assumptions on statistical
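The pipeline described, frequency-domain features per snippet followed by clustering, can be sketched as follows. This is a loose analogue of the CFDF idea, not the published algorithm: the feature set (low DFT-bin magnitudes), the fixed cluster count of 2 (the paper uses an SOM to choose it), and the synthetic spike shapes are all assumptions:

```python
import math
import random

def dft_features(snippet, n_bins=4):
    """Magnitudes of the first few DFT bins of a spike snippet."""
    n = len(snippet)
    feats = []
    for k in range(1, n_bins + 1):
        re = sum(snippet[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(snippet[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        feats.append(math.hypot(re, im) / n)
    return feats

def dist2(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q))

def two_means(points, iters=10):
    """k-means with k = 2, deterministically seeded from the first and
    last point (the paper picks the cluster count via an SOM instead)."""
    centroids = [points[0], points[-1]]
    labels = [0] * len(points)
    for _ in range(iters):
        labels = [0 if dist2(p, centroids[0]) <= dist2(p, centroids[1]) else 1
                  for p in points]
        for c in (0, 1):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centroids[c] = [sum(col) / len(members) for col in zip(*members)]
    return labels

# Two hypothetical spike shapes (narrow vs. broad Gaussian) with noise.
rng = random.Random(1)
narrow = [[math.exp(-((t - 8) / 2.0) ** 2) + rng.gauss(0, 0.02) for t in range(32)]
          for _ in range(5)]
broad = [[0.8 * math.exp(-((t - 16) / 6.0) ** 2) + rng.gauss(0, 0.02) for t in range(32)]
         for _ in range(5)]
labels = two_means([dft_features(s) for s in narrow + broad])
print(len(set(labels[:5])) == 1 and len(set(labels[5:])) == 1
      and labels[0] != labels[-1])  # the two shapes land in separate clusters
```

Frequency-domain magnitudes are insensitive to small temporal misalignments of the snippets, which is one motivation for preferring them over raw time-domain samples.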

  7. Frequency selective surfaces based high performance microstrip antenna

    CERN Document Server

    Narayan, Shiv; Jha, Rakesh Mohan

    2016-01-01

This book focuses on performance enhancement of printed antennas using frequency selective surface (FSS) technology. The growing demand for stealth technology in strategic areas requires high-performance, low-RCS (radar cross section) antennas. Such requirements may be accomplished by incorporating FSS into the antenna structure, either in its ground plane or as the superstrate, owing to the filter characteristics of the FSS structure. In view of this, a novel approach based on FSS technology is presented in this book to enhance the performance of printed antennas, including out-of-band structural RCS reduction. In this endeavor, the EM design of microstrip patch antennas (MPA) loaded with FSS-based (i) high impedance surface (HIS) ground planes, and (ii) superstrates is discussed in detail. The EM analysis of the proposed FSS-based antenna structures has been carried out using transmission line analogy, in combination with the reciprocity theorem. Further, various types of novel FSS structures are considered in desi...

  8. Frequency based design of modal controllers for adaptive optics systems.

    Science.gov (United States)

    Agapito, Guido; Battistelli, Giorgio; Mari, Daniele; Selvi, Daniela; Tesi, Alberto; Tesi, Pietro

    2012-11-19

    This paper addresses the problem of reducing the effects of wavefront distortions in ground-based telescopes within a "Modal-Control" framework. The proposed approach allows the designer to optimize the Youla parameter of a given modal controller with respect to a relevant adaptive optics performance criterion defined on a "sampled" frequency domain. This feature makes it possible to use turbulence/vibration profiles of arbitrary complexity (even empirical power spectral densities from data), while keeping the controller order at a moderate value. Effectiveness of the proposed solution is also illustrated through an adaptive optics numerical simulator.

  9. Barometric pressure and triaxial accelerometry-based falls event detection.

    Science.gov (United States)

    Bianchi, Federico; Redmond, Stephen J; Narayanan, Michael R; Cerutti, Sergio; Lovell, Nigel H

    2010-12-01

Falls and fall-related injuries are a significant cause of morbidity, disability, and health care utilization, particularly among the age group of 65 years and over. The ability to detect fall events in an unsupervised manner would lead to improved prognoses for fall victims. Several wearable accelerometry- and gyroscope-based falls detection devices have been described in the literature; however, they all suffer from unacceptable false positive rates. This paper investigates the augmentation of such systems with a barometric pressure sensor, as a surrogate measure of altitude, to assist in discriminating real fall events from normal activities of daily living. The acceleration and air pressure data are recorded using a wearable device attached to the subject's waist and analyzed offline. The study incorporates several protocols, including simulated falls onto a mattress and simulated activities of daily living, in a cohort of 20 young healthy volunteers (12 male and 8 female; age: 23.7 ± 3.0 years). A heuristically trained decision tree classifier is used to label suspected falls. The proposed system demonstrated considerable improvements over an existing accelerometry-based technique, showing an accuracy, sensitivity, and specificity of 96.9%, 97.5%, and 96.5%, respectively, in the indoor environment, with no false positives generated during extended testing during activities of daily living. This compares to 85.3%, 75%, and 91.5% for the same measures, respectively, when using accelerometry alone. The increased specificity of this system may enhance the usage of falls detectors among the elderly population.
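The key idea, requiring both an acceleration impact and a barometric altitude drop before declaring a fall, can be sketched with plain thresholds (the paper trains a decision tree; the thresholds, the sea-level pressure-to-altitude factor of ~0.12 hPa per metre, and the sensor traces below are illustrative assumptions):

```python
# Heuristic sketch (not the paper's trained decision tree): flag a fall
# only when a high-acceleration impact coincides with a drop in altitude
# inferred from rising air pressure. All thresholds and traces are
# illustrative assumptions.

def pressure_to_altitude_m(p_hpa, p0_hpa=1013.25):
    # Near sea level, pressure falls by roughly 0.12 hPa per metre of rise.
    return (p0_hpa - p_hpa) / 0.12

def detect_fall(acc_mag_g, pressure_hpa, acc_thresh=2.5, drop_thresh_m=0.4):
    impact = max(acc_mag_g) > acc_thresh            # acceleration spike?
    alt = [pressure_to_altitude_m(p) for p in pressure_hpa]
    altitude_drop = (alt[0] - alt[-1]) > drop_thresh_m  # net descent?
    return impact and altitude_drop

# Simulated fall: impact spike plus ~1 m altitude loss (pressure rises).
fall_acc = [1.0, 1.1, 3.8, 1.4, 1.0]
fall_pressure = [1013.25, 1013.27, 1013.33, 1013.37, 1013.37]
# Sitting down quickly: similar altitude change but no impact spike.
sit_acc = [1.0, 1.3, 1.6, 1.2, 1.0]
print(detect_fall(fall_acc, fall_pressure), detect_fall(sit_acc, fall_pressure))
```

Requiring both cues is what suppresses the false positives that plague accelerometry-only detectors: sitting down or dropping the device trips only one of the two conditions.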

  10. Adverse events in transcatheter interventions for congenital heart disease: a population-based long-term study.

    Science.gov (United States)

    Larsen, Signe Holm; Emmertsen, Kristian; Hjortdal, Vibeke Elisabeth; Nielsen-Kudsk, Jens Erik

    2015-01-01

Few studies have reported procedure complications or adverse events for transcatheter interventions in unselected congenital heart disease cohorts. We report our 23-year experience with transcatheter interventions in congenital heart disease and examine predictors of adverse events. This was a population-based cohort study of children and adults with congenital heart disease, covering a population of 3 million inhabitants in western Denmark. Adverse events were subdivided into 5 levels according to their severity. Procedure-type risk category, age, weight, and year of procedure were included in multivariate logistic models to identify risk factors for adverse events. Between 1990 and 2012, 1595 patients had 1878 catheter-based interventions performed. We identified 241 adverse events, corresponding to 13% of the procedures; 58 (3%) were considered to be of high severity. During the study period, there was an increase in the number of procedures per year (P < .001) and a decrease in the frequency of adverse events (P = .01). Procedure-type categories 3 and 4 had an increased risk of adverse events when compared with category 2, with odds ratios of 1.7 (95% confidence interval [CI]: 1.2-2.3) for category 3 and 2.3 (95% CI: 1.4-3.6) for category 4. Age and weight at catheterization were not independently associated with adverse events. We found an increase in the number of procedures over time and a decrease in the frequency of adverse events. Higher procedure-type risk categories were associated with an increased risk of adverse events.

  11. Investigation of the frequency content of ground motions recorded during strong Vrancea earthquakes, based on deterministic and stochastic indices

    CERN Document Server

    Craifaleanu, Iolanda-Gabriela

    2013-01-01

The paper presents results from a recent study in progress involving an extensive analysis, based on several deterministic and stochastic indices, of the frequency content of ground motions recorded during strong Vrancea seismic events. The study, continuing those initiated by Lungu et al. in the early nineties, aims to better reveal the characteristics of the analyzed ground motions. Over 300 accelerograms, recorded during the strong Vrancea seismic events mentioned above and recently re-digitized, are used in the study. Various analytical estimators of the frequency content, such as those based on Fourier spectra, power spectral density, response spectra, and peak ground motion values, are evaluated and compared. The results are correlated and validated using the information provided by various spectral bandwidth measures, such as the Vanmarcke index and the Cartwright and Longuet-Higgins index. The capacity of the analyzed estimators to describe the frequency content of the analyzed ground motions is assessed com...

  12. Identification of trends in intensity and frequency of extreme rainfall events in part of the Indian Himalaya

    Science.gov (United States)

    Bhardwaj, Alok; Ziegler, Alan D.; Wasson, Robert J.; Chow, Winston; Sharma, Mukat L.

    2017-04-01

-whitening and variance correction methods was obtained. Further, an increasing trend in frequency with a slope magnitude of 0.01 was identified in the south of the catchment by three of the four methods (all except block bootstrap). With the APHRODITE data set, we obtained a significant increasing trend in intensity with a slope magnitude of 1.27 in the middle of the catchment, identified by all four methods. Collectively, both datasets show signals of increasing intensity, and IMD shows results for increasing frequency in the Mandakini Catchment. The increasing occurrence of extreme events, as identified here, is becoming more disastrous because of the rising human population and growing infrastructure in the Mandakini Catchment. For example, the 2013 flood due to extreme rainfall was catastrophic in terms of the loss of human and animal lives and the destruction of the local economy. We believe our results will help improve understanding of extreme rainfall events in the Mandakini Catchment and in the Indian Himalaya.

  13. Comparison of metatranscriptomic samples based on k-tuple frequencies.

    Directory of Open Access Journals (Sweden)

    Ying Wang

Full Text Available The comparison of samples, or beta diversity, is one of the essential problems in ecological studies. Next generation sequencing (NGS) technologies make it possible to obtain large amounts of metagenomic and metatranscriptomic short read sequences across many microbial communities. De novo assembly of the short reads can be especially challenging because the number of genomes and their sequences are generally unknown and the coverage of each genome can be very low, so the traditional alignment-based sequence comparison methods cannot be used. Alignment-free approaches based on k-tuple frequencies, on the other hand, have yielded promising results for the comparison of metagenomic samples. However, it is not known whether these approaches can be used for the comparison of metatranscriptome datasets and which dissimilarity measures perform best. We applied several beta diversity measures based on k-tuple frequencies to real metatranscriptomic datasets from the 454 pyrosequencing and Illumina sequencing platforms to evaluate their effectiveness for the clustering of metatranscriptomic samples, including three d2-type dissimilarity measures, one dissimilarity measure from CVTree, one relative-entropy-based measure S2, and three classical Lp-norm distances. Results showed that the measure d2S can achieve superior performance on clustering metatranscriptomic samples into different groups under different sequencing depths for both 454 and Illumina datasets, recovering environmental gradients affecting microbial samples, classifying coexisting metagenomic and metatranscriptomic datasets, and being robust to sequencing errors. We also investigated the effects of tuple size and the order of the background Markov model. A software pipeline implementing all the steps of the analysis is available at http://code.google.com/p/d2-tools/. The k-tuple based sequence signature measures can effectively reveal major groups and gradient variation among
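The d2S measure centres observed k-tuple counts by their expectation under a background model, then combines the centred counts into a normalised dissimilarity in [0, 1]. A sketch with the simplest order-0 (i.i.d.) background; the paper also considers higher-order Markov backgrounds, and details such as how the background frequencies are estimated here are assumptions:

```python
import math
from collections import Counter
from itertools import product

# Sketch of a d2S-style alignment-free dissimilarity on k-tuple counts
# with an i.i.d. (order-0) background model estimated from the pooled
# sequences. Illustrative only; not the d2-tools implementation.

def kmer_counts(seq, k):
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def d2s(seq_x, seq_y, k=3):
    pooled = seq_x + seq_y
    base_freq = {b: pooled.count(b) / len(pooled) for b in "ACGT"}
    cx, cy = kmer_counts(seq_x, k), kmer_counts(seq_y, k)
    nx, ny = len(seq_x) - k + 1, len(seq_y) - k + 1
    num = sx = sy = 0.0
    for w in map("".join, product("ACGT", repeat=k)):
        pw = math.prod(base_freq[b] for b in w)   # expected tuple frequency
        xt = cx.get(w, 0) - nx * pw               # centred counts
        yt = cy.get(w, 0) - ny * pw
        denom = math.sqrt(xt * xt + yt * yt)
        if denom == 0:
            continue
        num += xt * yt / denom
        sx += xt * xt / denom
        sy += yt * yt / denom
    # Normalise to a dissimilarity in [0, 1]: 0 = identical signatures.
    return 0.5 * (1 - num / math.sqrt(sx * sy))

a = "ACGTACGTACGTACGTACGT"
b = "ACGTACGTACGTACGAACGT"   # one substitution relative to a
c = "TTTTGGGGCCCCAAAATTTT"   # very different composition
print(d2s(a, b, k=3) < d2s(a, c, k=3))  # similar sequences score lower
```

Centring by the background expectation is what distinguishes d2S from the plain d2 count correlation: it discounts tuples that are common merely because of base composition, which is why d2S is reported to be more robust across sequencing depths.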

  14. Ontology-Based Vaccine Adverse Event Representation and Analysis.

    Science.gov (United States)

    Xie, Jiangan; He, Yongqun

    2017-01-01

    Vaccines are among the greatest inventions of modern medicine and have contributed most to the relief of human misery and the striking increase in life expectancy. In 1796, an English country physician, Edward Jenner, discovered that inoculation with cowpox can protect against smallpox (Riedel S, Edward Jenner and the history of smallpox and vaccination. Proceedings (Baylor University. Medical Center) 18(1):21, 2005). Thanks to worldwide vaccination, smallpox was finally eradicated in 1977 (Henderson, Vaccine 29:D7-D9, 2011). Other disabling and lethal diseases, like poliomyelitis and measles, are targeted for eradication (Bonanni, Vaccine 17:S120-S125, 1999). Although vaccine development and administration are tremendously successful and cost-effective practices for human health, no vaccine is 100% safe for everyone, because each person reacts to vaccination differently given different genetic backgrounds and health conditions. Although all licensed vaccines are generally safe for the majority of people, vaccinees may still suffer adverse events (AEs) in reaction to various vaccines, some of which can be serious or even fatal (Haber et al., Drug Saf 32(4):309-323, 2009). Hence, the double-edged sword of vaccination remains a concern. To support integrative AE data collection and analysis, it is critical to adopt an AE normalization strategy. In the past decades, different controlled terminologies have been developed, including the Medical Dictionary for Regulatory Activities (MedDRA) (Brown EG, Wood L, Wood S, et al., Drug Saf 20(2):109-117, 1999), the Common Terminology Criteria for Adverse Events (CTCAE) (NCI, The Common Terminology Criteria for Adverse Events (CTCAE). Available from: http://evs.nci.nih.gov/ftp1/CTCAE/About.html . Accessed 7 Oct 2015), and the World Health Organization (WHO) Adverse Reactions Terminology (WHO-ART) (WHO, The WHO Adverse Reaction Terminology - WHO-ART. Available from: https://www.umc-products.com/graphics/28010.pdf

  15. High Frequency Vibration Based Fatigue Testing of Developmental Alloys

    Science.gov (United States)

    Holycross, Casey M.; Srinivasan, Raghavan; George, Tommy J.; Tamirisakandala, Seshacharyulu; Russ, Stephan M.

    Many fatigue test methods have been previously developed to rapidly evaluate fatigue behavior. This increased test speed can come at some expense, since these methods may require non-standard specimen geometry or increased facility and equipment capability. One such method, developed by George et al., involves a base-excited plate specimen driven into a high frequency bending resonant mode. This resonant mode is of sufficient frequency (typically 1200 to 1700 Hertz) to accumulate 10⁷ cycles in a few hours. One of the main limitations of this test method is that fatigue cracking is almost guaranteed to be surface initiated at regions of high stress. This brings into question the validity of the fatigue test results, as compared to more traditional uniaxial, smooth-bar testing, since the high stresses subject only a small volume of material to fatigue damage. This limitation also brings into question the suitability of the method for screening developmental alloys, should their initiation life be governed by subsurface flaws. However, if applicable, the rapid generation of fatigue data using this method would facilitate faster design iterations, identifying material and manufacturing process deficiencies more quickly. The developmental alloy used in this study was a powder metallurgy boron-modified Ti-6Al-4V, a new alloy currently being considered for gas turbine engine fan blades. Plate specimens were subjected to fully reversed bending fatigue. Results are compared with existing data from commercially available Ti-6Al-4V using both vibration based and more traditional fatigue test methods.

  16. A Frequency-Based Approach to Intrusion Detection

    Directory of Open Access Journals (Sweden)

    Mian Zhou

    2004-06-01

    Full Text Available Research on network security and intrusion detection strategies presents many challenging issues to both theoreticians and practitioners. Hackers apply an array of intrusion and exploit techniques to cause disruption of normal system operations, but on the defense side, firewalls and intrusion detection systems (IDS) are typically only effective in defending against known intrusion types using their signatures, and are far less mature when faced with novel attacks. In this paper, we adapt frequency analysis techniques from signal processing, such as the Discrete Fourier Transform (DFT), to the design of intrusion detection algorithms. We demonstrate the effectiveness of the frequency-based detection strategy by running synthetic network intrusion data in simulated networks using the OPNET software. The simulation results indicate that the proposed intrusion detection strategy is effective in detecting anomalous traffic data that exhibit patterns over time, including several types of DoS and probe attacks. The significance of this new strategy is that it does not depend on prior knowledge of attack signatures, so it has the potential to be a useful supplement to existing signature-based IDS and firewalls.
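    The frequency-based idea can be sketched independently of the paper's actual algorithm or the OPNET simulations: bin traffic into per-interval counts, compute a DFT magnitude spectrum, and flag an observation whose normalized spectrum drifts from a known-good baseline. All traffic shapes and the threshold below are invented for illustration.

    ```python
    import cmath
    import math

    def dft_magnitudes(x):
        """Naive DFT; returns the magnitude spectrum of a real-valued series."""
        n = len(x)
        return [abs(sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n)))
                for k in range(n // 2)]   # real input: keep the non-redundant half

    def spectral_distance(baseline, observed):
        """Euclidean distance between unit-normalized magnitude spectra."""
        a, b = dft_magnitudes(baseline), dft_magnitudes(observed)
        na = math.sqrt(sum(v * v for v in a))
        nb = math.sqrt(sum(v * v for v in b))
        return math.sqrt(sum((x / na - y / nb) ** 2 for x, y in zip(a, b)))

    # Baseline: smooth diurnal-like traffic. Observed: same traffic plus a
    # periodic probe every 4th interval, which adds energy at a new frequency bin.
    n = 64
    baseline = [100 + 20 * math.sin(2 * math.pi * t / 32) for t in range(n)]
    probe    = [baseline[t] + (100 if t % 4 == 0 else 0) for t in range(n)]

    print(spectral_distance(baseline, baseline))        # 0.0
    print(spectral_distance(baseline, probe) > 0.1)     # True: probe shifts the spectrum
    ```

    Because the comparison is made in the frequency domain, the detector responds to the *periodicity* of the probe rather than to any known signature, which is the property the abstract highlights.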

  17. LPI Radar Waveform Recognition Based on Time-Frequency Distribution

    Directory of Open Access Journals (Sweden)

    Ming Zhang

    2016-10-01

    Full Text Available In this paper, an automatic radar waveform recognition system for high noise environments is proposed. Signal waveform recognition techniques are widely applied in the fields of cognitive radio, spectrum management and radar applications, etc. We devise a system to classify the modulating signals widely used in low probability of intercept (LPI) radar detection systems. The radar signals are divided into eight classes, including linear frequency modulation (LFM), BPSK (Barker code modulation), Costas codes and polyphase codes (comprising Frank, P1, P2, P3 and P4). The classifier is an Elman neural network (ENN), performing supervised classification on features extracted by the system. Through the techniques of image filtering, image opening operation, skeleton extraction, principal component analysis (PCA), an image binarization algorithm and Pseudo–Zernike moments, etc., the features are extracted from the Choi–Williams time-frequency distribution (CWD) image of the received data. In order to reduce redundant features and simplify calculation, a feature selection algorithm based on the mutual information between classes and feature vectors is applied. The superiority of the proposed classification system is demonstrated by the simulations and analysis. Simulation results show that the overall ratio of successful recognition (RSR) is 94.7% at a signal-to-noise ratio (SNR) of −2 dB.
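    One ingredient of the pipeline above, feature selection by mutual information between features and class labels, can be sketched as follows for discrete features. The feature values and class labels are invented for illustration and do not come from the paper.

    ```python
    from collections import Counter
    from math import log2

    def mutual_information(feature, labels):
        """I(F;C) in bits, for two discrete sequences of equal length."""
        n = len(feature)
        pf, pc = Counter(feature), Counter(labels)
        joint = Counter(zip(feature, labels))
        return sum((nfc / n) * log2((nfc / n) / ((pf[f] / n) * (pc[c] / n)))
                   for (f, c), nfc in joint.items())

    # Hypothetical example: a binarized feature that mirrors the class labels is
    # maximally informative; a constant feature carries zero mutual information.
    labels      = ["LFM", "BPSK", "LFM", "BPSK", "LFM", "BPSK"]
    informative = [1, 0, 1, 0, 1, 0]
    constant    = [7, 7, 7, 7, 7, 7]

    print(mutual_information(informative, labels))  # 1.0 (perfectly predictive)
    print(mutual_information(constant, labels))     # 0.0
    ```

    Ranking candidate features by this score and keeping the top-scoring ones is the usual way such a criterion is used to prune redundant features before training a classifier.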

  18. Tunable antenna radome based on graphene frequency selective surface

    Directory of Open Access Journals (Sweden)

    Meijun Qu

    2017-09-01

    Full Text Available In this paper, a graphene-based frequency selective surface (FSS) is proposed. The proposed FSS exhibits a tunable bandpass filtering characteristic due to the alterable conductivity of the graphene strips, which is controlled by the chemical potential. Based on the reconfigurable bandpass property of the proposed FSS, a cylindrical antenna radome is designed using the FSS unit cells. A conventional omnidirectional dipole can realize a two-beam directional pattern when it is placed into the proposed antenna radome. Forward and backward endfire radiation of the dipole loaded with the radome is realized by properly adjusting the chemical potential. The proposed antenna radome is extremely promising for beam-scanning in terahertz and mid-infrared plasmonic devices and systems when the gain of a conventional antenna needs to be enhanced.

  19. Tunable antenna radome based on graphene frequency selective surface

    Science.gov (United States)

    Qu, Meijun; Rao, Menglou; Li, Shufang; Deng, Li

    2017-09-01

    In this paper, a graphene-based frequency selective surface (FSS) is proposed. The proposed FSS exhibits a tunable bandpass filtering characteristic due to the alterable conductivity of the graphene strips, which is controlled by the chemical potential. Based on the reconfigurable bandpass property of the proposed FSS, a cylindrical antenna radome is designed using the FSS unit cells. A conventional omnidirectional dipole can realize a two-beam directional pattern when it is placed into the proposed antenna radome. Forward and backward endfire radiation of the dipole loaded with the radome is realized by properly adjusting the chemical potential. The proposed antenna radome is extremely promising for beam-scanning in terahertz and mid-infrared plasmonic devices and systems when the gain of a conventional antenna needs to be enhanced.

  20. Analysis of low-frequency seismic signals generated during a multiple-iceberg calving event at Jakobshavn Isbræ, Greenland

    Science.gov (United States)

    Walter, Fabian; Amundson, Jason M.; O'Neel, Shad; Truffer, Martin; Fahnestock, Mark; Fricker, Helen A.

    2012-01-01

    We investigated seismic signals generated during a large-scale, multiple-iceberg calving event that occurred at Jakobshavn Isbræ, Greenland, on 21 August 2009. The event was recorded by a high-rate time-lapse camera and five broadband seismic stations located within a few hundred kilometers of the terminus. During the event two full-glacier-thickness icebergs calved from the grounded (or nearly grounded) terminus and immediately capsized; the second iceberg to calve was two to three times smaller than the first. The individual calving and capsize events were well-correlated with the radiation of low-frequency seismic signals (the larger iceberg generated a larger seismic signal).

  1. Short term administration of glucocorticoids in patients with protracted and chronic gout arthritis. Part III – frequency of adverse events

    Directory of Open Access Journals (Sweden)

    A A Fedorova

    2009-01-01

    Full Text Available Objective. To assess the frequency of adverse events during short-term administration of glucocorticoids (GC) in protracted and chronic gout arthritis. Material and methods. 59 pts with tophaceous gout (crystal-verified diagnosis) and arthritis of three or more joints lasting more than a month in spite of treatment with sufficient doses of nonsteroidal anti-inflammatory drugs were included. Median age of pts was 56 [48; 63] years, median disease duration 15.2 years [7.4; 20], median swollen joint count at examination 8 [5; 11]. The patients were randomized into 2 groups. One group received methylprednisolone (MP) 500 mg/day iv during 2 days and placebo im once; the other received betamethasone (BM) 7 mg im once and placebo iv twice. Clinical evaluation of inflamed joints was performed every day. Standard laboratory examination and ECG were done before drug administration and at the 3rd, 7th, and 14th days of follow-up. Immunoreactive insulin level was evaluated before drug administration and at day 14. Blood pressure (BP) was measured every day. Results. After the first GC administration, BP was elevated in 28 (47%) pts. Among pts without appropriate baseline BP values, BP was elevated in 73% of cases; pts with appropriate BP values showed less frequent BP elevation, 38% (p=0.02). In 8 (13%) pts, ECG signs of myocardial blood supply deterioration were revealed at day 3 after GC administration. Glucose level was elevated in 10 (17%) pts and, after the second BM administration, in 5 (8%) pts. Cholesterol level did not change significantly after 14 days of follow-up, but in 28 (47%) pts it increased in comparison with baseline. Triglyceride level significantly decreased at day 14 from 149 [106; 187] to 108 [66.5; 172] mg/dl (p=0.02). 26 (44%) pts had face hyperemia, 4 (7%) palpitation, and 2 (3.4%) a bitter taste. Conclusion. Administration of a short course of GC in pts with gout requires monitoring of possible adverse events. Antihypertensive therapy providing appropriate BP

  2. What a Shame: Increased Rates of OMS Resident Burnout May Be Related to the Frequency of Shamed Events During Training.

    Science.gov (United States)

    Shapiro, Michael C; Rao, Sowmya R; Dean, Jason; Salama, Andrew R

    2017-03-01

    Shame is an ineffective tool in residency education that often results in depression, isolation, and worse patient care. This study aimed to assess burnout, depersonalization, and personal achievement levels in current oral and maxillofacial surgery (OMS) residents, to assess the prevalence of the use of shame in OMS residency training, and to determine whether there is a relation between shame exposure and resident burnout, depersonalization, and personal achievement levels. An anonymous 20-question cross-sectional survey was developed incorporating the Maslach Burnout Index and a previously validated shame questionnaire and sent to all OMS program directors affiliated with the American Association of Oral and Maxillofacial Surgeons for distribution among their respective residents in 2016. Univariate analyses were used to determine the distribution of the predictor (shame) and outcome (burnout) by gender and by frequency of shaming events. Multivariable logistic regression analysis was used to assess the relation of shame to burnout. A 2-sided P value less than .05 was considered statistically significant. Two hundred seventeen responses were received; 82% of respondents were men (n = 178), 95% were 25 to 34 years old (n = 206), and 58% (n = 126) were enrolled in a 4-year program. Frequently shamed residents were more likely to have depression (58 vs 22%; P < .05) than those rarely shamed, and those frequently shamed were more likely to experience moderate to severe burnout (odds ratio = 4.6; 95% confidence interval, 2.1-10.0; P < .05) than those rarely shamed. There is a clear relation between the number of shame events and burnout and depersonalization levels. It is important to understand the negative impact that the experience of shame has on residents, including its unintended consequences. Published by Elsevier Inc.

  3. Method for Assessing Grid Frequency Deviation Due to Wind Power Fluctuation Based on “Time-Frequency Transformation”

    DEFF Research Database (Denmark)

    Jin, Lin; Yuan-zhang, Sun; Sørensen, Poul Ejnar

    2012-01-01

    Grid frequency deviation caused by wind power fluctuation has been a major concern for secure operation of a power system with integrated large-scale wind power. Many approaches have been proposed to assess this negative effect on grid frequency due to wind power fluctuation. Unfortunately, most published studies are based entirely on deterministic methodology. This paper presents a novel assessment method based on Time-Frequency Transformation to overcome the shortcomings of existing methods. The main contribution of the paper is to propose a stochastic process simulation model which is a better alternative to the existing dynamic frequency deviation simulation model. In this way, the method takes the stochastic wind power fluctuation into full account so as to give a quantitative risk assessment of grid frequency deviation to grid operators, even without using any dynamic simulation tool. The case...

  4. Rydberg-atom based radio-frequency electrometry using frequency modulation spectroscopy in room temperature vapor cells.

    Science.gov (United States)

    Kumar, Santosh; Fan, Haoquan; Kübler, Harald; Jahangiri, Akbar J; Shaffer, James P

    2017-04-17

    Rydberg atom-based electrometry enables traceable electric field measurements with high sensitivity over a large frequency range, from gigahertz to terahertz. Such measurements are particularly useful for the calibration of radio frequency and terahertz devices, as well as other applications like near field imaging of electric fields. We utilize frequency modulated spectroscopy with active control of residual amplitude modulation to improve the signal to noise ratio of the optical readout of Rydberg atom-based radio frequency electrometry. Matched filtering of the signal is also implemented. Although we have reached similarly high sensitivity with other read-out methods, frequency modulated spectroscopy is advantageous because it is well-suited for building a compact, portable sensor. In the current experiment, ∼3 µV cm−1 Hz−1/2 sensitivity is achieved and is found to be photon shot noise limited.

  5. Disentangling the effect of event-based cues on children's time-based prospective memory performance.

    Science.gov (United States)

    Redshaw, Jonathan; Henry, Julie D; Suddendorf, Thomas

    2016-10-01

    Previous time-based prospective memory research, both with children and with other groups, has measured the ability to perform an action with the arrival of a time-dependent yet still event-based cue (e.g., the occurrence of a specific clock pattern) while also engaged in an ongoing activity. Here we introduce a novel means of operationalizing time-based prospective memory and assess children's growing capacities when the availability of an event-based cue is varied. Preschoolers aged 3, 4, and 5 years (N=72) were required to ring a bell when a familiar 1-min sand timer had completed a cycle under four conditions. In a 2×2 within-participants design, the timer was either visible or hidden and was either presented in the context of a single task or embedded within a dual picture-naming task. Children were more likely to ring the bell before 2 min had elapsed in the visible-timer and single-task conditions, with performance improving with age across all conditions. These results suggest a divergence in the development of time-based prospective memory in the presence versus absence of event-based cues, and they also suggest that performance on typical time-based tasks may be partly driven by event-based prospective memory. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. A process-oriented event-based programming language

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Zanitti, Francesco

    2012-01-01

    We present the first version of PEPL, a declarative Process-oriented, Event-based Programming Language based on the recently introduced Dynamic Condition Response (DCR) Graphs model. DCR Graphs allow specification, distributed execution and verification of pervasive event-based ... defined and executed in an ordinary web browser.

  7. Deterministic event-based simulation of quantum phenomena

    NARCIS (Netherlands)

    De Raedt, K; De Raedt, H; Michielsen, K

    2005-01-01

    We propose and analyse simple deterministic algorithms that can be used to construct machines that have primitive learning capabilities. We demonstrate that locally connected networks of these machines can be used to perform blind classification on an event-by-event basis, without storing the

  8. Nanoionics-Based Switches for Radio-Frequency Applications

    Science.gov (United States)

    Nessel, James; Lee, Richard

    2010-01-01

    Nanoionics-based devices have shown promise as alternatives to microelectromechanical systems (MEMS) and semiconductor diode devices for switching radio-frequency (RF) signals in diverse systems. Examples of systems that utilize RF switches include phase shifters for electronically steerable phased-array antennas, multiplexers, cellular telephones and other radio transceivers, and other portable electronic devices. Semiconductor diode switches can operate at low potentials (about 1 to 3 V) and high speeds (switching times of the order of nanoseconds) but are characterized by significant insertion loss, high DC power consumption, low isolation, and generation of third-order harmonics and intermodulation distortion (IMD). MEMS-based switches feature low insertion loss (of the order of 0.2 dB), low DC power consumption (picowatts), high isolation (>30 dB), and low IMD, but contain moving parts, are not highly reliable, and must be operated at high actuation potentials (20 to 60 V) generated and applied by use of complex circuitry. In addition, fabrication of MEMS is complex, involving many processing steps. Nanoionics-based switches offer the superior RF performance and low power consumption of MEMS switches, without need for the high potentials and complex circuitry necessary for operation of MEMS switches. At the same time, nanoionics-based switches offer the high switching speed of semiconductor devices. Also, like semiconductor devices, nanoionics-based switches can be fabricated relatively inexpensively by use of conventional integrated-circuit fabrication techniques. Moreover, nanoionics-based switches have simple planar structures that can easily be integrated into RF power-distribution circuits.

  9. AN EMPIRICAL ANALYSIS OF THE INFLUENCE OF RISK FACTORS ON THE FREQUENCY AND IMPACT OF SEVERE EVENTS ON THE SUPPLY CHAIN IN THE CZECH REPUBLIC

    Directory of Open Access Journals (Sweden)

    José María Caridad

    2014-12-01

    Full Text Available Purpose: This paper is focused on the analysis and evaluation of severe events according to their frequency of occurrence and their impact on companies' manufacturing and distribution supply chain performance in the Czech Republic. Risk factors are introduced for critical events. Design/methodology: An identification and classification of severe events were realized on the basis of median mapping and mapping of ordinal variability acquired through a questionnaire survey of 82 companies. The 46 risk factors analyzed were sorted into 5 groups. We used the asymmetric Somers' d statistic to test the dependence of the frequency and impact of a severe event on selected risk sources. Hierarchical cluster analysis was performed to identify relatively homogeneous groups of critical severe events according to their dependency on risk factors and its strength. Findings: Results showed that 'a lack of contracts' is considered the most critical severe event. The demand-side, supply-side and external risk factor groups were identified as the most significant sources of risk factors. The worst cluster encompasses 11% of the examined risk factors, which should be prevented. We conclude that organizations need to adopt appropriate precautions and risk management methods in logistics. Originality: In this paper, a methodology for evaluating severe events in the supply chain is designed. This methodology involves assessing the critical factors which influence the critical events and which should be prevented.
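    The asymmetric Somers' d statistic used above can be computed directly from concordant and discordant pairs; statistical packages (e.g. SciPy's somersd) provide an equivalent. Below is a minimal pure-Python sketch; the ordinal frequency/impact ratings are invented for illustration and do not come from the survey.

    ```python
    def somers_d(x, y):
        """Asymmetric Somers' d_yx: how well ordinal x predicts ordinal y.
        d_yx = (C - D) / (C + D + T_y), where T_y counts pairs tied on y only;
        pairs tied on x are excluded entirely."""
        conc = disc = tied_y = 0
        n = len(x)
        for i in range(n):
            for j in range(i + 1, n):
                dx = (x[i] > x[j]) - (x[i] < x[j])
                dy = (y[i] > y[j]) - (y[i] < y[j])
                if dx == 0:
                    continue
                if dy == 0:
                    tied_y += 1
                elif dx == dy:
                    conc += 1
                else:
                    disc += 1
        return (conc - disc) / (conc + disc + tied_y)

    # Hypothetical ratings: frequency of a risk factor (x) vs. impact of the
    # resulting severe event (y), both on ordinal scales.
    freq   = [1, 2, 2, 3, 4]
    impact = [1, 2, 3, 4, 5]
    print(somers_d(freq, impact))  # 1.0: every untied pair is concordant
    ```

    Values near +1 or −1 indicate a strong monotone dependence of event impact on the risk factor; ties on y pull the statistic toward zero, which is why the asymmetric form is preferred when y is the dependent variable.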

  10. Adaptive Window Zero-Crossing-Based Instantaneous Frequency Estimation

    Directory of Open Access Journals (Sweden)

    S. Chandra Sekhar

    2004-09-01

    Full Text Available We address the problem of estimating the instantaneous frequency (IF) of a real-valued constant-amplitude time-varying sinusoid. Estimation of polynomial IF is formulated using the zero-crossings of the signal. We propose an algorithm to estimate nonpolynomial IF by local approximation with a low-order polynomial over a short segment of the signal. This involves the choice of window length to minimize the mean square error (MSE). The optimal window length found by directly minimizing the MSE is a function of the higher-order derivatives of the IF, which are not available a priori. However, an optimum solution is formulated using an adaptive window technique based on the concept of intersection of confidence intervals. The adaptive algorithm enables minimum MSE-IF (MMSE-IF) estimation without requiring a priori information about the IF. Simulation results show that the adaptive window zero-crossing-based IF estimation method is superior to fixed window methods and is also better than adaptive spectrogram and adaptive Wigner-Ville distribution (WVD) based IF estimators for different signal-to-noise ratios (SNR).
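    The core zero-crossing idea can be sketched as follows: consecutive zero crossings of a constant-amplitude sinusoid are half a period apart, so their spacing yields a local frequency estimate. This minimal example omits the paper's polynomial fitting and adaptive windowing, and the test tone is invented.

    ```python
    import math

    def zero_crossing_if(samples, fs):
        """Instantaneous-frequency estimates from successive zero crossings.
        Adjacent crossings span half a period, so f = fs / (2 * dt_samples)."""
        crossings = []
        for i in range(len(samples) - 1):
            a, b = samples[i], samples[i + 1]
            if a == 0.0 or a * b < 0:
                # linear interpolation of the crossing instant (in samples)
                crossings.append(i + a / (a - b) if a != b else i)
        return [fs / (2.0 * (t2 - t1)) for t1, t2 in zip(crossings, crossings[1:])]

    # Constant-frequency test tone: every estimate should sit near 50 Hz.
    fs, f0 = 8000.0, 50.0
    tone = [math.sin(2 * math.pi * f0 * t / fs) for t in range(2000)]
    estimates = zero_crossing_if(tone, fs)
    print(all(abs(f - f0) < 0.5 for f in estimates))  # True
    ```

    For a time-varying IF, each half-period spacing gives one local sample of the IF curve; the paper's contribution is fitting a low-order polynomial through such samples over an adaptively chosen window.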

  11. Dust events in Beijing, China (2004–2006): comparison of ground-based measurements with columnar integrated observations

    Directory of Open Access Journals (Sweden)

    Z. J. Wu

    2009-09-01

    Full Text Available Ambient particle number size distributions spanning three years were used to characterize the frequency and intensity of atmospheric dust events in the urban areas of Beijing, China, in combination with AERONET sun/sky radiometer data. Dust events were classified into two types based on the differences in particle number and volume size distributions and local weather conditions. This categorization was confirmed by aerosol index images, columnar aerosol optical properties, and vertical potential temperature profiles. During the type-1 events, dust particles dominated the total particle volume concentration (<10 μm), with a relative share over 70%. Anthropogenic particles in the Aitken and accumulation mode played a subordinate role here because of high wind speeds (>4 m s−1). The type-2 events occurred in rather stagnant air masses and were characterized by a lower volume fraction of coarse mode particles (on average, 55%). Columnar optical properties showed that the superposition of dust and anthropogenic aerosols in type-2 events resulted in a much higher AOD (average: 1.51) than for the rather pure dust aerosols in type-1 events (average AOD: 0.36). A discrepancy was found between the ground-based and column integrated particle volume size distributions, especially for the coarse mode particles. This discrepancy likely originates from both the limited comparability of particle volume size distributions derived from Sun photometer and in situ number size distributions, and the inhomogeneous vertical distribution of particles during dust events.

  12. Event-based home safety problem detection under the CPS home safety architecture

    OpenAIRE

    Yang, Zhengguo; Lim, Azman Osman; Tan, Yasuo

    2013-01-01

    This paper presents a CPS (Cyber-Physical System) home safety architecture for home safety problem detection and reaction, and shows some example cases. For home safety problem detection, three levels of events are defined: the elementary event, the semantic event and the entire event, representing meaning from a single parameter, to a single safety problem, and then to the whole safety status of a house. For the relationship between these events and raw data, a Finite State Machine (FSM) based m...
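    The three event levels described above can be illustrated with a stateless simplification (the paper uses an FSM-based model; all sensor names, thresholds and event labels here are invented for the example).

    ```python
    # Hypothetical thresholds and event names for illustration only.
    def elementary_events(reading):
        """Map raw sensor parameters to parameter-level (elementary) events."""
        events = set()
        if reading["temperature_c"] > 60:
            events.add("TEMP_HIGH")
        if reading["smoke_ppm"] > 300:
            events.add("SMOKE_DETECTED")
        if reading["co_ppm"] > 50:
            events.add("CO_HIGH")
        return events

    def semantic_events(elem):
        """Combine elementary events into single safety problems (semantic events)."""
        sem = set()
        if {"TEMP_HIGH", "SMOKE_DETECTED"} <= elem:
            sem.add("FIRE")
        if "CO_HIGH" in elem:
            sem.add("CO_POISONING_RISK")
        return sem

    def entire_event(sem):
        """Whole-house status (entire event): the most severe problem wins."""
        if "FIRE" in sem:
            return "EMERGENCY"
        if sem:
            return "WARNING"
        return "SAFE"

    reading = {"temperature_c": 75, "smoke_ppm": 450, "co_ppm": 10}
    status = entire_event(semantic_events(elementary_events(reading)))
    print(status)  # EMERGENCY
    ```

    An FSM-based version would additionally carry state between readings (e.g. requiring a condition to persist across several samples before escalating), which this sketch deliberately leaves out.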

  13. A joint renewal process used to model event based data

    National Research Council Canada - National Science Library

    Mergenthaler, Wolfgang; Jaroszewski, Daniel; Feller, Sebastian; Laumann, Larissa

    2016-01-01

    .... Event data, herein defined as a collection of triples containing a time stamp, a failure code and, optionally, a descriptive text, can best be evaluated by using the paradigm of joint renewal processes...

  14. Synthetic drought event sets: thousands of meteorological drought events for risk-based management under present and future conditions

    Science.gov (United States)

    Guillod, Benoit P.; Massey, Neil; Otto, Friederike E. L.; Allen, Myles R.; Jones, Richard; Hall, Jim W.

    2016-04-01

    Droughts and related water scarcity can have large impacts on societies and consist of interactions between a number of natural and human factors. Meteorological conditions are usually the first natural trigger of droughts, and climate change is expected to impact these and thereby the frequency and intensity of the events. However, extreme events such as droughts are, by definition, rare, and accurately quantifying the risk related to such events is therefore difficult. The MaRIUS project (Managing the Risks, Impacts and Uncertainties of drought and water Scarcity) aims at quantifying the risks associated with droughts in the UK under present and future conditions. To do so, a large number of drought events, from climate model simulations downscaled at 25km over Europe, are being fed into hydrological models of various complexity and used for the estimation of drought risk associated with human and natural systems, including impacts on the economy, industry, agriculture, terrestrial and aquatic ecosystems, and socio-cultural aspects. Here, we present the hydro-meteorological drought event set that has been produced by weather@home [1] for MaRIUS. Using idle processor time on volunteers' computers around the world, we have run a very large number (10'000s) of Global Climate Model (GCM) simulations, downscaled at 25km over Europe by a nested Regional Climate Model (RCM). Simulations include the past 100 years as well as two future horizons (2030s and 2080s), and provide a large number of sequences of spatio-temporally consistent weather, which are consistent with the boundary forcing such as the ocean, greenhouse gases and solar forcing. The drought event set for use in impact studies is constructed by extracting sequences of dry conditions from these model runs, leading to several thousand drought events. In addition to describing methodological and validation aspects of the synthetic drought event sets, we provide insights into drought risk in the UK, its

  15. Artificial intelligence based event detection in wireless sensor networks

    OpenAIRE

    Bahrepour, M.

    2013-01-01

    Wireless sensor networks (WSNs) are composed of a large number of small, inexpensive devices, called sensor nodes, which are equipped with sensing, processing, and communication capabilities. While traditional applications of wireless sensor networks focused on periodic monitoring, the focus of more recent applications is on fast and reliable identification of out-of-ordinary situations and events. This new functionality of wireless sensor networks is known as event detection. Due to the fact t...

  16. RELATIONSHIP BETWEEN CULTURAL/ARTISTIC EVENTS VISITATION AND OTHER ACTIVITY-BASED TOURISM SEGMENTS

    National Research Council Canada - National Science Library

    Ana Tezak; Darko Saftic; Zdravko Sergo

    2011-01-01

    .... One of these specific forms of tourism is event tourism. The aim of this research is to determine the relationship between cultural/artistic events visitation and other activity-based tourism segments...

  17. Agreement between event-based and trend-based glaucoma progression analyses.

    Science.gov (United States)

    Rao, H L; Kumbar, T; Kumar, A U; Babu, J G; Senthil, S; Garudadri, C S

    2013-07-01

    To evaluate the agreement between event- and trend-based analyses in determining visual field (VF) progression in glaucoma. VFs of 175 glaucoma eyes with ≥5 VFs were analyzed by the proprietary software of the VF analyzer to determine progression. Agreement (κ) between trend-based analysis of the VF index (VFI) and event-based analysis (glaucoma progression analysis, GPA) was evaluated. For eyes progressing by both event- and trend-based methods, time to progression by the two methods was calculated. Median number of VFs per eye was 7 and median follow-up 7.5 years. GPA classified 101 eyes (57.7%) as stable, 30 eyes (17.1%) as possible and 44 eyes (25.2%) as likely progression. Trend-based analysis classified 122 eyes (69.7%) as stable (slope >-1% per year, or any slope magnitude with P>0.05) and 53 eyes (30.3%) as progressing. Agreement (κ) between the sensitive criteria of GPA and trend-based analysis was 0.48, and between the specific criteria of GPA (possible clubbed with no progression) and trend-based analysis was 0.50. In eyes progressing by sensitive criteria of both methods (42 eyes), median time to progression by GPA (4.9 years) was similar (P=0.30) to that by the trend-based method (5.0 years). This was also similar in eyes progressing by specific criteria of both methods (25 eyes; 5.6 years versus 5.9 years, P=0.23). Agreement between event- and trend-based progression analysis was moderate. GPA seemed to detect progression earlier than trend-based analysis, but this was not statistically significant.
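    The trend-based side of the comparison reduces to an ordinary least-squares slope of VFI against time, with the -1% per year cutoff mentioned above. The sketch below omits the accompanying significance test (P<0.05), and the VFI series are invented for illustration.

    ```python
    def vfi_slope(years, vfi):
        """Ordinary least-squares slope of VFI over time (% per year)."""
        n = len(years)
        mx, my = sum(years) / n, sum(vfi) / n
        sxx = sum((t - mx) ** 2 for t in years)
        sxy = sum((t - mx) * (v - my) for t, v in zip(years, vfi))
        return sxy / sxx

    def trend_classification(years, vfi, cutoff=-1.0):
        """'progressing' if VFI declines faster than the cutoff (%/year).
        (The study's criterion also requires P < 0.05, omitted here.)"""
        return "progressing" if vfi_slope(years, vfi) < cutoff else "stable"

    # Hypothetical series: seven annual fields each.
    years     = [0, 1, 2, 3, 4, 5, 6]
    declining = [98, 96.5, 94, 92, 90.5, 88, 86]   # ~ -2%/year
    steady    = [95, 95.5, 94.8, 95.2, 94.9, 95.1, 95.0]

    print(trend_classification(years, declining))  # progressing
    print(trend_classification(years, steady))     # stable
    ```

    Event-based GPA, by contrast, flags progression when individual test locations worsen beyond baseline variability, which is why the two methods can disagree on the same series.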

  18. SParSE++: improved event-based stochastic parameter search.

    Science.gov (United States)

    Roh, Min K; Daigle, Bernie J

    2016-11-25

    Despite the increasing availability of high performance computing capabilities, analysis and characterization of stochastic biochemical systems remain a computational challenge. To address this challenge, the Stochastic Parameter Search for Events (SParSE) was developed to automatically identify reaction rates that yield a probabilistic user-specified event. SParSE consists of three main components: the multi-level cross-entropy method, which identifies biasing parameters to push the system toward the event of interest, the related inverse biasing method, and an optional interpolation of identified parameters. While effective for many examples, SParSE depends on the existence of a sufficient amount of intrinsic stochasticity in the system of interest. In the absence of this stochasticity, SParSE can either converge slowly or not at all. We have developed SParSE++, a substantially improved algorithm for characterizing target events in terms of system parameters. SParSE++ makes use of a series of novel parameter leaping methods that accelerate the convergence rate to the target event, particularly in low stochasticity cases. In addition, the interpolation stage is modified to compute multiple interpolants and to choose the optimal one in a statistically rigorous manner. We demonstrate the performance of SParSE++ on four example systems: a birth-death process, a reversible isomerization model, SIRS disease dynamics, and a yeast polarization model. In all four cases, SParSE++ shows significantly improved computational efficiency over SParSE, with the largest improvements resulting from analyses with the strictest error tolerances. As researchers continue to model realistic biochemical systems, the need for efficient methods to characterize target events will grow. The algorithmic advancements provided by SParSE++ fulfill this need, enabling characterization of computationally intensive biochemical events that are currently resistant to analysis.

  19. A MEMS-based high frequency x-ray chopper

    Energy Technology Data Exchange (ETDEWEB)

    Siria, A; Schwartz, W; Chevrier, J [Institut Neel, CNRS-Universite Joseph Fourier Grenoble, BP 166, F-38042 Grenoble Cedex 9 (France); Dhez, O; Comin, F [ESRF, 6 rue Jules Horowitz, F-38043 Grenoble Cedex 9 (France); Torricelli, G [Department of Physics and Astronomy, University of Leicester, University Road, Leicester LE1 7RH (United Kingdom)

    2009-04-29

Time-resolved x-ray experiments require intensity modulation at high frequencies (advanced rotating choppers have nowadays reached the kHz range). We here demonstrate that a silicon microlever oscillating at 13 kHz with nanometric amplitude can be used as a high frequency x-ray chopper. We claim that using micro- and nanoelectromechanical systems (MEMS and NEMS), it will be possible to achieve higher frequencies in excess of hundreds of megahertz. Working at such frequencies can open a wealth of possibilities in time-resolved experiments in chemistry, biology and physics.

  20. A MEMS-based high frequency x-ray chopper.

    Science.gov (United States)

    Siria, A; Dhez, O; Schwartz, W; Torricelli, G; Comin, F; Chevrier, J

    2009-04-29

Time-resolved x-ray experiments require intensity modulation at high frequencies (advanced rotating choppers have nowadays reached the kHz range). We here demonstrate that a silicon microlever oscillating at 13 kHz with nanometric amplitude can be used as a high frequency x-ray chopper. We claim that using micro- and nanoelectromechanical systems (MEMS and NEMS), it will be possible to achieve higher frequencies in excess of hundreds of megahertz. Working at such frequencies can open a wealth of possibilities in time-resolved experiments in chemistry, biology and physics.

  1. Contributions of cerebellar event-based temporal processing and preparatory function to speech perception.

    Science.gov (United States)

    Schwartze, Michael; Kotz, Sonja A

    2016-10-01

    The role of the cerebellum in the anatomical and functional architecture of the brain is a matter of ongoing debate. We propose that cerebellar temporal processing contributes to speech perception on a number of accounts: temporally precise cerebellar encoding and rapid transmission of an event-based representation of the temporal structure of the speech signal serves to prepare areas in the cerebral cortex for the subsequent perceptual integration of sensory information. As speech dynamically evolves in time this fundamental preparatory function may extend its scope to the predictive allocation of attention in time and supports the fine-tuning of temporally specific models of the environment. In this framework, an oscillatory account considering a range of frequencies may best serve the linking of the temporal and speech processing systems. Lastly, the concerted action of these processes may not only advance predictive adaptation to basic auditory dynamics but optimize the perceptual integration of speech. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. Event-based text mining for biology and functional genomics

    Science.gov (United States)

    Thompson, Paul; Nawaz, Raheel; McNaught, John; Kell, Douglas B.

    2015-01-01

    The assessment of genome function requires a mapping between genome-derived entities and biochemical reactions, and the biomedical literature represents a rich source of information about reactions between biological components. However, the increasingly rapid growth in the volume of literature provides both a challenge and an opportunity for researchers to isolate information about reactions of interest in a timely and efficient manner. In response, recent text mining research in the biology domain has been largely focused on the identification and extraction of ‘events’, i.e. categorised, structured representations of relationships between biochemical entities, from the literature. Functional genomics analyses necessarily encompass events as so defined. Automatic event extraction systems facilitate the development of sophisticated semantic search applications, allowing researchers to formulate structured queries over extracted events, so as to specify the exact types of reactions to be retrieved. This article provides an overview of recent research into event extraction. We cover annotated corpora on which systems are trained, systems that achieve state-of-the-art performance and details of the community shared tasks that have been instrumental in increasing the quality, coverage and scalability of recent systems. Finally, several concrete applications of event extraction are covered, together with emerging directions of research. PMID:24907365

  3. Decentralized Event-Based Communication Strategy on Leader-Follower Consensus Control

    Directory of Open Access Journals (Sweden)

    Duosi Xie

    2016-01-01

This paper addresses the leader-follower consensus problem of networked systems by using a decentralized event-based control strategy. The event-based control strategy makes the controllers of agents update at aperiodic event instants. Two decentralized event functions are designed to generate these event instants. In particular, the second event function uses only its own information and the neighbors' states at their latest event instants, so no continuous communication among followers is required. As the followers communicate only at these discrete event instants, this strategy saves communication resources and reduces channel occupation. It is analytically shown that the leader-follower networked system reaches consensus under the proposed control strategy. Simulation examples illustrate the effectiveness of the proposed control strategy.
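
A minimal sketch of the event-triggered idea described above: each follower broadcasts its state only when its measurement error since the last broadcast exceeds a threshold, and controllers use only the last-broadcast states. The topology, gains, threshold, and pinning choice below are illustrative assumptions, not the paper's design.

```python
import numpy as np

dt, T = 0.01, 30.0
leader = 1.0                              # static leader state
x = np.array([0.0, 0.5, -0.5])            # follower states
x_hat = x.copy()                          # states broadcast at last event instants
neighbors = {0: [1], 1: [0, 2], 2: [1]}   # follower-follower links (a chain)
pinned = {0}                              # follower 0 also hears the leader
threshold, events = 0.02, 0

for _ in range(int(T / dt)):
    u = np.zeros_like(x)
    for i in range(len(x)):
        # Consensus control computed from broadcast states only
        u[i] = sum(x_hat[j] - x_hat[i] for j in neighbors[i])
        if i in pinned:
            u[i] += leader - x_hat[i]
    x = x + dt * u
    for i in range(len(x)):
        # Event function: broadcast when measurement error exceeds the threshold
        if abs(x[i] - x_hat[i]) > threshold:
            x_hat[i] = x[i]
            events += 1

print(round(float(np.max(np.abs(x - leader))), 3), events)
```

The followers converge to a neighborhood of the leader while broadcasting far fewer than once per sampling step, which is the communication saving the abstract refers to.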

  4. Physiologically-based toxicokinetic models help identifying the key factors affecting contaminant uptake during flood events

    Energy Technology Data Exchange (ETDEWEB)

    Brinkmann, Markus; Eichbaum, Kathrin [Department of Ecosystem Analysis, Institute for Environmental Research,ABBt – Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); Kammann, Ulrike [Thünen-Institute of Fisheries Ecology, Palmaille 9, 22767 Hamburg (Germany); Hudjetz, Sebastian [Department of Ecosystem Analysis, Institute for Environmental Research,ABBt – Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Cofalla, Catrina [Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Buchinger, Sebastian; Reifferscheid, Georg [Federal Institute of Hydrology (BFG), Department G3: Biochemistry, Ecotoxicology, Am Mainzer Tor 1, 56068 Koblenz (Germany); Schüttrumpf, Holger [Institute of Hydraulic Engineering and Water Resources Management, RWTH Aachen University, Mies-van-der-Rohe-Straße 1, 52056 Aachen (Germany); Preuss, Thomas [Department of Environmental Biology and Chemodynamics, Institute for Environmental Research,ABBt- Aachen Biology and Biotechnology, RWTH Aachen University, Worringerweg 1, 52074 Aachen (Germany); and others

    2014-07-01

Highlights: • A PBTK model for trout was coupled with a sediment equilibrium partitioning model. • The influence of physical exercise on pollutant uptake was studied using the model. • Physical exercise during flood events can increase the level of biliary metabolites. • Cardiac output and effective respiratory volume were identified as relevant factors. • These confounding factors need to be considered also for bioconcentration studies. - Abstract: As a consequence of global climate change, we will likely be facing an increasing frequency and intensity of flood events. Thus, the ecotoxicological relevance of sediment re-suspension is of growing concern. It is vital to understand contaminant uptake from suspended sediments and relate it to effects in aquatic biota. Here we report on a computational study that utilizes a physiologically based toxicokinetic model to predict uptake, metabolism and excretion of sediment-borne pyrene in rainbow trout (Oncorhynchus mykiss). To this end, data from two experimental studies were compared with the model predictions: (a) batch re-suspension experiments with a constant concentration of suspended particulate matter at two different temperatures (12 and 24 °C), and (b) simulated flood events in an annular flume. The model predicted well both the final concentrations and the kinetics of 1-hydroxypyrene secretion into the gall bladder of exposed rainbow trout. We were able to show that exhaustive exercise during exposure in simulated flood events can lead to increased levels of biliary metabolites, and we identified cardiac output and effective respiratory volume as the two most important factors for contaminant uptake. The results of our study clearly demonstrate the relevance of, and the necessity to investigate, uptake of contaminants from suspended sediments under realistic exposure scenarios.
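
The qualitative mechanism behind the key finding can be illustrated with a one-compartment toxicokinetic sketch: gill uptake scales with effective respiratory volume, so higher ventilation during exercise raises internal contaminant levels. All parameter values below are illustrative assumptions, not the calibrated PBTK model of the study.

```python
def internal_conc(c_water, vent_rate, k_elim=0.1, ext_eff=0.7, t_end=48.0, dt=0.01):
    """Euler integration of dC/dt = uptake - elimination (per unit body mass).

    c_water: water concentration; vent_rate: effective respiratory volume
    per unit time; ext_eff: gill extraction efficiency; k_elim: first-order
    elimination/metabolism rate constant (all hypothetical units).
    """
    c = 0.0
    for _ in range(int(t_end / dt)):
        uptake = ext_eff * vent_rate * c_water   # gill uptake flux
        c += dt * (uptake - k_elim * c)
    return c

resting = internal_conc(c_water=1.0, vent_rate=0.2)
exercise = internal_conc(c_water=1.0, vent_rate=0.6)   # ~3x ventilation
print(exercise > resting)
```

Because the model is linear in the uptake flux, tripling the ventilation rate triples the near-steady-state internal concentration, mirroring the exercise effect reported for biliary metabolites.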

  5. Naive Probability: Model-Based Estimates of Unique Events.

    Science.gov (United States)

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. © 2014 Cognitive Science Society, Inc.

  6. Predicting Future Random Events Based on Past Performance

    OpenAIRE

    Donald G. Morrison; David C. Schmittlein

    1981-01-01

    There are many situations where one is interested in predicting the expected number of events in period 2 given that x events occurred in period 1. For example, insurance companies must decide whether or not to cancel the insurance of drivers who had 3 or more accidents during the previous year. In analyzing marketing research data an analyst may wish to predict the number of future purchases to be made by those customers who made x purchases in the previous 3 months. The owner of a baseball ...
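
One classical form of this prediction problem (hedged: a standard gamma-Poisson/NBD sketch of the idea, not necessarily the exact model of the paper) shrinks the observed period-1 count toward the population mean. If individual event rates follow a gamma prior with shape r and rate alpha, the expected period-2 count given x period-1 events is (r + x)/(alpha + 1) for equal-length periods. The prior parameters below are illustrative assumptions.

```python
# Gamma prior on individual Poisson rates: shape r, rate alpha (mean r/alpha).
r, alpha = 2.0, 4.0

def expected_next_period(x, r=r, alpha=alpha):
    """E[X2 | X1 = x] for equal-length periods: the posterior mean rate."""
    return (r + x) / (alpha + 1.0)

# A driver with 3 accidents last year is predicted to have
# (2 + 3) / 5 = 1.0 next year -- far fewer than a naive repeat of 3.
print(expected_next_period(3))  # -> 1.0
```

This regression-to-the-mean effect is exactly why an insurer should not extrapolate last year's 3 accidents directly.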

  7. Demosaicking Based on Optimization and Projection in Different Frequency Bands

    Directory of Open Access Journals (Sweden)

    Omer OsamaA

    2008-01-01

A fast and effective iterative demosaicking algorithm is described for reconstructing a full-color image from single-color filter array data. The missing color values are interpolated on the basis of optimization and projection in different frequency bands. A filter bank is used to decompose an initially interpolated image into low-frequency and high-frequency bands. In the low-frequency band, a quadratic cost function is minimized in accordance with the observation that the low-frequency components of chrominance slowly vary within an object region. In the high-frequency bands, the high-frequency components of the unknown values are projected onto the high-frequency components of the known values. Comparison of the proposed algorithm with seven state-of-the-art demosaicking algorithms showed that it outperforms all of them for 20 images on average in terms of objective quality and that it is competitive with them from the subjective quality and complexity points of view.

  8. Base-Level Guide for Electromagnetic Frequency Radiation

    Science.gov (United States)

    2012-12-01

situations per AFOSH 48-9, Section 2.6. (10) Inductive Capacitive Coupling: Capacitive (electric) fields are voltage fields. The effects depend upon the... inductive coupling, capacitive coupling and by radiation. Some frequencies are transmitted predominantly by one form of coupling and some frequencies by... 81 H-2. AN/TPS-75 Search Radar at Keesler AFB

  9. A Studio Project Based on the Events of September 11

    Science.gov (United States)

    Ruby, Nell

    2004-01-01

    A week after the 9/11 WTC event, the collage project that Nell Ruby and her class had been working on in a basic design classroom lacked relevance. They had been working from master works, analyzing hue and value relationships, color schemes, shape, and composition. The master works seemed unimportant because of the immense emotional impact of the…

  10. Classification and Prediction of Event-based Suspended Sediment Dynamics using Artificial Neural Networks

    Science.gov (United States)

    Hamshaw, S. D.; Underwood, K.; Wemple, B. C.; Rizzo, D.

    2016-12-01

Sediment transport can be an immensely complex process, yet plays a vital role in the transport of substances and nutrients that can impact receiving waters. Advancements in the use of sensors for indirect measurement of suspended sediments have allowed access to high frequency sediment data. This has promoted the use of more advanced computational tools to identify patterns in sediment data to improve our understanding of physical processes occurring in the watershed. In this study, a network of weather stations and in-stream turbidity sensors was deployed to capture more than three years of sediment dynamics and meteorological data in the Mad River watershed in central Vermont. Monitoring sites were located along the main stem of the Mad River and on five tributaries. Separate storm events were identified from the data at each site to study event sediment dynamics associated with erosion and deposition over space and time. Two types of artificial neural networks (ANNs), a self-organizing map (SOM) and a radial basis function (RBF) network, were used to cluster the storm event data based on hydrometeorological metrics and were subsequently compared to traditional classes of hysteresis patterns in suspended sediment concentration-discharge (SSC-Q) relationships. Hysteresis patterns were also directly used as inputs to both ANNs to identify distinct patterns and test the applicability of performing pattern recognition on hysteresis patterns. The results of this study will be used to gain insight into the dynamic physical processes (both spatial and temporal) occurring in the watershed based on patterns observed in SSC-Q data.
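
A common, simple way to label the traditional SSC-Q hysteresis classes mentioned above is the signed area of the closed loop in the Q-SSC plane (a sketch of that index, not the authors' ANN method; the storm data below are synthetic illustrations).

```python
import numpy as np

def loop_area(q, ssc):
    """Signed shoelace area of the closed SSC-Q loop
    (positive = counterclockwise traversal)."""
    q, ssc = np.asarray(q, float), np.asarray(ssc, float)
    return 0.5 * np.sum(q * np.roll(ssc, -1) - np.roll(q, -1) * ssc)

def classify(q, ssc, tol=1e-9):
    a = loop_area(q, ssc)
    if a > tol:
        return "counterclockwise"   # SSC peaks after discharge (distal sources)
    if a < -tol:
        return "clockwise"          # SSC peaks before discharge (proximal sources)
    return "no hysteresis"

# Synthetic clockwise event: SSC is higher on the rising limb of discharge.
q   = [1, 2, 3, 4, 3, 2]
ssc = [5, 40, 30, 20, 10, 7]
print(classify(q, ssc))
```

Clockwise loops are conventionally read as sediment supply near the gauge flushing early in the storm; counterclockwise loops suggest more distant or delayed sources.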

  11. Capacitance-Based Frequency Adjustment of Micro Piezoelectric Vibration Generator

    Directory of Open Access Journals (Sweden)

    Xinhua Mao

    2014-01-01

Micro piezoelectric vibration generators have wide application in the field of microelectronics, but a generator's natural frequency is fixed once it is manufactured. Resonance cannot occur when the natural frequency of the generator does not match the frequency of the vibration source; the output voltage then declines sharply, and the generator cannot adequately power electronic devices. In order to make the natural frequency of the generator approach the frequency of the vibration source, capacitance-based frequency modulation (FM) is adopted in this paper. Different capacitance FM schemes are designed for different locations of the adjustment layer, and the corresponding capacitance FM models are established. The characteristics and effect of capacitance FM are simulated with these models. Experimental results show that the natural frequency of the generator varies from 46.5 Hz to 42.4 Hz as the bypass capacitance increases from 0 nF to 30 nF, demonstrating that the natural frequency of a piezoelectric vibration generator can be continuously adjusted by this method.

  12. Optical Tunable-Based Transmitter for Multiple Radio Frequency Bands

    Science.gov (United States)

    Nguyen, Hung (Inventor); Simons, Rainee N. (Inventor); Wintucky, Edwin G. (Inventor); Freeman, Jon C. (Inventor)

    2016-01-01

    An optical tunable transmitter is used to transmit multiple radio frequency bands on a single beam. More specifically, a tunable laser is configured to generate a plurality of optical wavelengths, and an optical tunable transmitter is configured to modulate each of the plurality of optical wavelengths with a corresponding radio frequency band. The optical tunable transmitter is also configured to encode each of the plurality of modulated optical wavelengths onto a single laser beam for transmission of a plurality of radio frequency bands using the single laser beam.

  13. The Effects of Semantic Transparency and Base Frequency on the Recognition of English Complex Words

    Science.gov (United States)

    Xu, Joe; Taft, Marcus

    2015-01-01

    A visual lexical decision task was used to examine the interaction between base frequency (i.e., the cumulative frequencies of morphologically related forms) and semantic transparency for a list of derived words. Linear mixed effects models revealed that high base frequency facilitates the recognition of the complex word (i.e., a "base…

  14. A Dynamically Configurable Log-based Distributed Security Event Detection Methodology using Simple Event Correlator

    Science.gov (United States)

    2010-06-01

gathered from Wikipedia, the free online encyclopedia; and the content from the Marketing server was gathered from the GNU Operating System's homepage...Francoise and Julien Bourgeois. "Log-based Distributed Intrusion Detection for Hybrid Networks". CSIIRW, 2008. 197 30. Salem, Malek Ben, Shlomo Hershkop

  15. Event Management for Teacher-Coaches: Risk and Supervision Considerations for School-Based Sports

    Science.gov (United States)

    Paiement, Craig A.; Payment, Matthew P.

    2011-01-01

    A professional sports event requires considerable planning in which years are devoted to the success of that single activity. School-based sports events do not have that luxury, because high schools across the country host athletic events nearly every day. It is not uncommon during the fall sports season for a combination of boys' and girls'…

  16. From low-level events to activities - A pattern-based approach

    NARCIS (Netherlands)

    Mannhardt, Felix; De Leoni, Massimiliano; Reijers, Hajo A.; Van Der Aalst, Wil M P; Toussaint, Pieter J.

    2016-01-01

    Process mining techniques analyze processes based on event data. A crucial assumption for process analysis is that events correspond to occurrences of meaningful activities. Often, low-level events recorded by information systems do not directly correspond to these. Abstraction methods, which

  17. DNA-based frequency selective electromagnetic interference shielding

    Science.gov (United States)

    Grote, James; Ouchen, Fahima; Kreit, Eric; Buskohl, Phillip; Steffan, Thomas; Rogers, Charles; Salour, Michael

    2017-10-01

A method of modeling the RF properties of multilayered polymer-host/metal-nanoparticle-guest composite films using the transmission matrix method (TMM) is presented. This is an alternative, patternless, dielectric approach to frequency-selective-surface electromagnetic interference shielding.
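
The TMM referred to above multiplies a 2×2 characteristic matrix per layer and extracts transmittance from the stack matrix. A minimal normal-incidence sketch for a lossless multilayer follows; the layer values are generic illustrations, not the DNA-composite parameters of the paper.

```python
import numpy as np

def tmm_transmittance(n_layers, d_layers, wavelength, n_in=1.0, n_out=1.0):
    """Power transmittance of a lossless stack at normal incidence.

    n_layers: refractive indices; d_layers: thicknesses (same length
    units as wavelength); n_in/n_out: ambient and exit media."""
    M = np.eye(2, dtype=complex)
    for n, d in zip(n_layers, d_layers):
        delta = 2 * np.pi * n * d / wavelength     # phase thickness
        M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
    m11, m12, m21, m22 = M[0, 0], M[0, 1], M[1, 0], M[1, 1]
    t = 2 * n_in / (n_in * m11 + n_in * n_out * m12 + m21 + n_out * m22)
    return (n_out / n_in) * abs(t) ** 2

# Quarter-wave layer of n = 1.5 in air at its design wavelength (500 nm):
T = tmm_transmittance([1.5], [500 / (4 * 1.5)], 500.0)
print(round(T, 4))
```

For this single quarter-wave film the result matches the closed-form Fresnel value, T = 1 - ((1 - n^2)/(1 + n^2))^2 ≈ 0.852, a useful sanity check before modeling many layers.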

  18. Quantum frequency doubling based on tripartite entanglement with cavities

    Science.gov (United States)

    Juan, Guo; Zhi-Feng, Wei; Su-Ying, Zhang

    2016-02-01

    We analyze the entanglement characteristics of three harmonic modes, which are the output fields from three cavities with an input tripartite entangled state at fundamental frequency. The entanglement properties of the input beams can be maintained after their frequencies have been up-converted by the process of second harmonic generation. We have calculated the parametric dependences of the correlation spectrum on the initial squeezing factor, the pump power, the transmission coefficient, and the normalized analysis frequency of cavity. The numerical results provide references to choose proper experimental parameters for designing the experiment. The frequency conversion of the multipartite entangled state can also be applied to a quantum communication network. Project supported by the National Natural Science Foundation of China (Grant No. 91430109), the Specialized Research Fund for the Doctoral Program of Higher Education of China (Grant No. 20111401110004), and the Natural Science Foundation of Shanxi Province, China (Grant No. 2014011005-3).

  19. Simulation of Greenhouse Climate Monitoring and Control with Wireless Sensor Network and Event-Based Control

    Directory of Open Access Journals (Sweden)

    Andrzej Pawlowski

    2009-01-01

Monitoring and control of the greenhouse environment play a decisive role in greenhouse production processes. Assurance of optimal climate conditions has a direct influence on crop growth performance, but it usually increases the required equipment cost. Traditionally, greenhouse installations have required a great effort to connect and distribute all the sensors and data acquisition systems. These installations need many data and power wires to be distributed along the greenhouses, making the system complex and expensive. For this reason, and others such as the unavailability of distributed actuators, individual sensors are usually located at a fixed point that is selected as representative of the overall greenhouse dynamics. On the other hand, the actuation system in greenhouses is usually composed of mechanical devices controlled by relays, and it is desirable to reduce the number of commutations of the control signals from security and economic points of view. Therefore, in order to face these drawbacks, this paper describes how greenhouse climate control can be represented as an event-based system in combination with wireless sensor networks, where low-frequency dynamic variables have to be controlled and control actions are mainly calculated in response to events produced by external disturbances. The proposed control system saves costs by minimizing wear and prolonging actuator life, while keeping promising performance results. Analysis and conclusions are given by means of simulation results.
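
The commutation-saving idea can be sketched with a toy event-based thermostat: the relay is commanded only when the temperature leaves a hysteresis band, so the number of switchings is far below the number of sampling instants. The first-order greenhouse "model", band width, and gains below are illustrative assumptions.

```python
import random

random.seed(1)
setpoint, temp = 22.0, 15.0     # degrees C; ambient equilibrium is 15 C
deadband = 0.5                  # half-width of the hysteresis band
heater_on, switches = False, 0

for step in range(2000):        # 1 s sampling interval
    # Event function: command the relay only when temp leaves the band
    if not heater_on and temp < setpoint - deadband:
        heater_on, switches = True, switches + 1
    elif heater_on and temp > setpoint + deadband:
        heater_on, switches = False, switches + 1
    # Toy first-order thermal lag with small measurement/process noise
    heat = 10.0 if heater_on else 0.0
    temp += 0.01 * (heat - (temp - 15.0)) + random.gauss(0, 0.02)

print(switches, round(temp, 2))
```

The temperature settles into the band around the setpoint while the relay switches on the order of tens of times over 2000 sampling instants, which is the wear reduction the abstract argues for.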

  20. Qualitative Event-based Diagnosis with Possible Conflicts Applied to Spacecraft Power Distribution Systems

    Data.gov (United States)

    National Aeronautics and Space Administration — Model-based diagnosis enables efficient and safe operation of engineered systems. In this paper, we describe two algorithms based on a qualitative event-based fault...

  1. "Cognitive" visual acuity estimation based on the event-related potential P300 component.

    Science.gov (United States)

    Heinrich, Sven P; Marhöfer, David; Bach, Michael

    2010-09-01

    An objective assessment of sensory and sensuo-cognitive function, based on physiological signals rather than on the behavioral response of a patient, is often advisable, albeit challenging. Evoked potentials are frequently used as an objective measure, but usually fail to detect defects beyond primary sensory areas, including those of psychogenic origin. Here we assess whether the event-related P300 component may be used to measure "cognitive" visual acuity. A visual oddball paradigm was used to elicit P300 responses in subjects with normal or artificially blurred vision. Probe stimuli consisted of infrequently presented gratings with spatial frequencies in the range of 2.2-16.2 cycles per degree, which could be either target or non-target stimuli depending on their orientation. Frequent stimuli were homogeneously grey fields. Without blur, rare stimuli of all spatial frequencies produced reliable P300 responses. Blur abolished the P300 to fine gratings consistently in 10 of 11 subjects. The drop in P300 amplitude was steep, rather than gradual, between visible and invisible gratings. The P300 is sensitive to identify the resolution threshold and thus may serve as a tool for estimating acuity in cases of visual impairments. The study presents a tool for the objective assessment of acuity in cases of higher-level visual impairments. The concept can most likely be extended to other sensory modalities. 2010 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
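
The oddball analysis underlying P300 measurement rests on stimulus-locked averaging: the event-related component survives averaging across epochs while background EEG cancels out. A synthetic sketch of that logic (amplitudes, latency, noise level, and trial counts are illustrative assumptions, not the study's recordings):

```python
import numpy as np

rng = np.random.default_rng(0)
fs, epoch_len = 250, 200                 # 250 Hz sampling, 0.8 s epochs
t = np.arange(epoch_len) / fs
# Synthetic P300: a ~5 uV positivity peaking near 300 ms after rare stimuli
p300 = 5.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))

# Rare (target) epochs contain the component plus noise; frequent ones do not
rare = [p300 + rng.normal(0, 5, epoch_len) for _ in range(60)]
frequent = [rng.normal(0, 5, epoch_len) for _ in range(300)]

erp_rare = np.mean(rare, axis=0)         # averaging suppresses noise by ~1/sqrt(N)
erp_freq = np.mean(frequent, axis=0)

idx = int(np.argmin(np.abs(t - 0.3)))    # sample nearest 300 ms
print(round(float(erp_rare[idx] - erp_freq[idx]), 2))
```

The rare-minus-frequent difference at ~300 ms recovers the embedded component; in the acuity paradigm above, the presence or absence of this difference across grating spatial frequencies is what localizes the resolution threshold.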

  2. DNA Hybridization Detection Based on Resonance Frequency Readout in Graphene on Au SPR Biosensor

    Directory of Open Access Journals (Sweden)

    Md. Biplob Hossain

    2016-01-01

This paper demonstrates a numerical model of a surface plasmon resonance (SPR) biosensor for detecting DNA hybridization by recording the resonance frequency characteristics (RFC). The proposed sensor is designed based on graphene as the biomolecular recognition element (BRE) and the sharp SPR curve of gold (Au). Numerical analysis shows that the variation of RFC for mismatched DNA strands is quite negligible whereas that for complementary DNA strands is considerable. Here, graphene is used to perform faster immobilization between target DNA and probe DNA. The use of graphene also changes the RFC, confirming the DNA hybridization event through its optochemical property. In addition, the proposed sensor successfully distinguishes between hybridization and single-nucleotide polymorphisms (SNPs) by observing the variation level of RFC and the maximum transmittance. Therefore, the proposed frequency-readout-based SPR sensor could potentially open a new window of detection for biomolecular interactions. We also highlight the advantage of using a graphene sublayer by performing a sensitivity analysis. Sandwiching each graphene sublayer enhances sensitivity by 95% compared with a conventional SPR sensor.

  3. Noise pollution filters bird communities based on vocal frequency.

    Directory of Open Access Journals (Sweden)

    Clinton D Francis

BACKGROUND: Human-generated noise pollution now permeates natural habitats worldwide, presenting evolutionarily novel acoustic conditions unprecedented to most landscapes. These acoustics not only harm humans, but threaten wildlife, and especially birds, via changes to species densities, foraging behavior, reproductive success, and predator-prey interactions. Explanations for negative effects of noise on birds include disruption of acoustic communication through energetic masking, potentially forcing species that rely upon acoustic communication to abandon otherwise suitable areas. However, this hypothesis has not been adequately tested because confounding stimuli often co-vary with noise and are difficult to separate from noise exposure. METHODOLOGY/PRINCIPAL FINDINGS: Using a natural experiment that controls for confounding stimuli, we evaluate whether species vocal features or urban-tolerance classifications explain their responses to noise measured through habitat use. Two data sets representing nesting and abundance responses reveal that noise filters bird communities nonrandomly. Signal duration and urban tolerance failed to explain species-specific responses, but birds with low-frequency signals that are more susceptible to masking from noise avoided noisy areas and birds with higher frequency vocalizations remained. Signal frequency was also negatively correlated with body mass, suggesting that larger birds may be more sensitive to noise due to the link between body size and vocal frequency. CONCLUSIONS/SIGNIFICANCE: Our findings suggest that acoustic masking by noise may be a strong selective force shaping the ecology of birds worldwide. Larger birds with lower frequency signals may be excluded from noisy areas, whereas smaller species persist via transmission of higher frequency signals. We discuss our findings as they relate to interspecific relationships among body size, vocal amplitude and frequency and suggest that they are

  4. Assessment of System Frequency Support Effect of PMSG-WTG Using Torque-Limit-Based Inertial Control: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Xiao; Gao, Wenzhong; Wang, Jianhui; Wu, Ziping; Yan, Weihang; Gevorgian, Vahan; Zhang, Yingchen; Muljadi, Eduard; Kang, Moses; Hwang, Min; Kang, Yong Cheol

    2017-05-12

    To release the 'hidden inertia' of variable-speed wind turbines for temporary frequency support, a method of torque-limit-based inertial control is proposed in this paper. This method aims to improve the frequency support capability considering the maximum torque restriction of a permanent magnet synchronous generator. The advantages of the proposed method are improved frequency nadir (FN) in the event of an under-frequency disturbance; and avoidance of over-deceleration and a second frequency dip during the inertial response. The system frequency response is different, with different slope values in the power-speed plane when the inertial response is performed. The proposed method is evaluated in a modified three-machine, nine-bus system. The simulation results show that there is a trade-off between the recovery time and FN, such that a gradual slope tends to improve the FN and restrict the rate of change of frequency aggressively while causing an extension of the recovery time. These results provide insight into how to properly design such kinds of inertial control strategies for practical applications.

  5. Assessment of System Frequency Support Effect of PMSG-WTG Using Torque-Limit-Based Inertial Control

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Xiao; Gao, Wenzhong; Wang, Jianhui; Wu, Ziping; Yan, Weihang; Gevorgian, Vahan; Zhang, Yingchen; Muljadi, Eduard; Kang, Moses; Hwang, Min; Kang, Yong Cheol

    2017-02-16

    To release the 'hidden inertia' of variable-speed wind turbines for temporary frequency support, a method of torque-limit based inertial control is proposed in this paper. This method aims to improve the frequency support capability considering the maximum torque restriction of a permanent magnet synchronous generator. The advantages of the proposed method are improved frequency nadir (FN) in the event of an under-frequency disturbance; and avoidance of over-deceleration and a second frequency dip during the inertial response. The system frequency response is different, with different slope values in the power-speed plane when the inertial response is performed. The proposed method is evaluated in a modified three-machine, nine-bus system. The simulation results show that there is a trade-off between the recovery time and FN, such that a gradual slope tends to improve the FN and restrict the rate of change of frequency aggressively while causing an extension of the recovery time. These results provide insight into how to properly design such kinds of inertial control strategies for practical applications.

  6. Merging plot and Landsat data to estimate the frequency distribution of Central Amazon mortality event size for landscape-scale ecosystem simulations

    Science.gov (United States)

    Di Vittorio, A. V.; Chambers, J. Q.

    2012-12-01

    Mitigation strategies and estimates of land use change emissions assume initial states of landscapes that respond to prescribed scenarios. The Amazon basin is a target for both mitigation (e.g. maintenance of old-growth forest) and land use change (e.g. agriculture), but the current states of its old-growth and secondary forest landscapes are uncertain with respect to carbon cycling. Contributing to this uncertainty in old-growth forest ecosystems is a mosaic of patches in different successional stages, with the areal fraction of any particular stage relatively constant over large temporal and spatial scales. Old-growth mosaics are generally created through ongoing effects of tree mortality, with the Central Amazon mosaic generated primarily by wind mortality. Unfortunately, estimation of generalizable frequency distributions of mortality event size has been hindered by limited spatial and temporal scales of observations. To overcome these limitations we merge field and remotely sensed tree mortality data and fit the top two candidate distributions (power law and exponential) to these data to determine the most appropriate statistical mortality model for use in landscape-scale ecosystem simulations. Our results show that the power law model better represents the distribution of mortality event size than the exponential model. We also use an individual-tree-based forest stand model to simulate a 100 ha landscape using the best fit of each candidate distribution to demonstrate the effects of different mortality regimes on above ground biomass in the Central Amazon forest mosaic. We conclude that the correct mortality distribution model is critical for robust simulation of patch succession dynamics and above ground biomass.
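
    The model comparison described above (power law vs. exponential fits to mortality event sizes) can be sketched with maximum-likelihood fits on synthetic data. The estimators below follow standard forms (e.g. the continuous power-law MLE popularized by Clauset et al.), not necessarily the authors' exact fitting procedure, and the data are simulated rather than the merged plot/Landsat observations.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic 'mortality event sizes': a continuous power law with
# exponent alpha = 2.5 above xmin, drawn by inverse-transform sampling
alpha_true, xmin = 2.5, 1.0
u = rng.random(5000)
sizes = xmin * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))

def fit_power_law(x, xmin):
    """MLE for p(x) = ((alpha-1)/xmin) * (x/xmin)^-alpha, x >= xmin;
    returns (alpha, log-likelihood)."""
    n, s = len(x), np.sum(np.log(x / xmin))
    alpha = 1.0 + n / s
    ll = n * np.log((alpha - 1.0) / xmin) - alpha * s
    return alpha, ll

def fit_exponential(x, xmin):
    """MLE for a shifted exponential p(x) = lam * exp(-lam*(x - xmin));
    returns (lam, log-likelihood)."""
    lam = 1.0 / np.mean(x - xmin)
    ll = len(x) * np.log(lam) - lam * np.sum(x - xmin)
    return lam, ll

alpha_hat, ll_pl = fit_power_law(sizes, xmin)
lam_hat, ll_exp = fit_exponential(sizes, xmin)
# On power-law data the power-law log-likelihood should exceed the
# exponential one, mirroring the candidate-model comparison in the study
```

A likelihood-ratio or information-criterion comparison of `ll_pl` and `ll_exp` then selects the statistical mortality model to feed into the landscape simulation.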

  7. Snow anomaly events from historical documents in eastern China during the past two millennia and implication for low-frequency variability of AO/NAO and PDO

    Science.gov (United States)

    Chu, Guoqiang; Sun, Qing; Wang, Xiaohua; Sun, Junying

    2008-07-01

    Models and instrumental data indicate that the spatial and temporal variations of snow cover are significantly related to atmospheric circulation (e.g. the AO/NAO). Here, we present historical snow anomaly events from the past two millennia that provide a unique temporal window for studying the long-term AO/NAO, a prominent wintertime phenomenon. Direct descriptions such as ``no snow during the winter'' and ``pray God for snow'' are interpreted as convincing evidence for snow anomalies. The positive/negative snow anomaly events show clear decadal- to century-scale variations over the past two millennia. Based on previous instrumental research and comparison with other reconstruction data, we suggest the Index of Abnormal Snow (IAS) may reflect an AO-like atmospheric variability. Winters during the Medieval Warm Period (MWP) (AD 900-1300) might have been strongly influenced by a predominantly positive AO with less snow, whereas the Little Ice Age (LIA) (AD 1300-1900) was dominated by a negative AO concomitant with heavier snowfalls in East Asia. Our data show that a warm climate period (the MWP) or a cold period (the LIA) can be perturbed by a cold or warm spell linked with a change in atmospheric circulation. Low-frequency variability of snow records may be intrinsic to the natural climate system. Although the dynamic mechanisms linking snow anomalies with atmospheric circulation (the AO/NAO, the PDO) are unclear on decadal to century time scales, the Pacific Ocean may play an important role in regulating atmospheric circulation, since the IAS is highly correlated with the reconstruction of the PDO.

  8. Long-term high frequency measurements of ethane, benzene and methyl chloride at Ragged Point, Barbados: Identification of long-range transport events

    Directory of Open Access Journals (Sweden)

    A.T. Archibald

    2015-09-01

    Here we present high-frequency, long-term observations of ethane, benzene and methyl chloride from the AGAGE Ragged Point, Barbados, monitoring station, made using a custom-built GC-MS system. Our analysis focuses on the first three years of data (2005-2007) and on the interpretation of periodic episodes of high concentrations of these compounds. We focus specifically on an exemplar episode during September 2007 to assess whether these measurements were impacted by long-range transport of biomass burning and biogenic emissions. We use the Lagrangian particle dispersion model NAME, run forwards and backwards in time, to identify transport of air masses from the northeast of Brazil during these events. To assess whether biomass burning was the cause, we used hot spots detected by the MODIS instrument as point sources for simulating the release of biomass burning plumes. Excellent agreement between the arrival time of the simulated biomass burning plumes and the observed enhancements in the trace gases indicates that biomass burning strongly influenced these measurements. The modelling data were then used to determine the emissions required to match the observations, which were compared with bottom-up estimates based on burnt area and literature emission factors. Good agreement was found between the two techniques, highlighting the important role of biomass burning. The modelling constrained by in situ observations suggests that the emission factors were at their known upper limits, with the in situ data suggesting slightly greater emissions of ethane than the literature emission factors account for. Further analysis concluded that biogenic emissions of methyl chloride from South America played only a small role in the measurements at Ragged Point. These results highlight the importance of long-term high frequency measurements of NMHC and ODS and how these data can be used to determine sources of emissions

  9. Issues in Informal Education: Event-Based Science Communication Involving Planetaria and the Internet

    Science.gov (United States)

    Adams, M.; Gallagher, D. L.; Whitt, A.; Six, N. Frank (Technical Monitor)

    2002-01-01

    For the past four years, the Science Directorate at Marshall Space Flight Center has carried out a diverse program of science communication through web resources on the Internet. The program includes extended stories about NASA science, a curriculum resource for teachers tied to national education standards, on-line activities for students, and webcasts of real-time events. Events have involved meteor showers, solar eclipses, natural very low frequency radio emissions, and amateur balloon flights. In some cases broadcasts accommodate active feedback and questions from Internet participants. We give here examples of events, problems, and lessons learned from these activities.

  10. Towards an Event Based Indicator for Monitoring Change in Extreme Precipitation in Support of the US National Climate Assessment

    Science.gov (United States)

    Slinskey, E. A.; Loikith, P. C.; Waliser, D. E.

    2016-12-01

    Extreme precipitation events are associated with numerous societal and environmental impacts. Recent observational analysis suggests increasing trends in precipitation intensity across portions of the Continental United States (CONUS) consistent with expectations associated with anthropogenic climate change. Therefore, a spatial understanding and intuitive means of monitoring extreme precipitation over time is critical. In support of the ongoing efforts of the US National Climate Assessment (NCA), we present a gridded climate indicator, based on high-resolution gridded NASA satellite-based precipitation data from NASA's Tropical Rainfall Measuring Mission (TRMM) and the Global Precipitation Measurement (GPM) product, to monitor and track extreme precipitation events over the CONUS. The indicator is based on categorized storm totals over the CONUS defined as 3-day total accumulated precipitation events, ensuring a spatially and temporally balanced regional representation of synoptic-scale and short-duration storm events alike. A precipitation categorization scheme mirroring that of the widely understood Saffir-Simpson hurricane intensity index is assigned to each 3-day precipitation event with each precipitation category referred to as a P-Cat. The magnitude of each event lies between P-Cat 1, the lightest category of storm totals, and P-Cat 5, the heaviest, allowing for easy interpretation and visualization. With all precipitation events assigned to a P-Cat, point-wise statistics are computed across the CONUS including the maximum P-Cat, the mean P-Cat, and the frequency of each P-Cat at each grid point providing a comprehensive climatology of precipitation event intensity and a baseline for monitoring change. A novel aspect of this indicator is that it will accurately display discernable spatial variations with regional specificity in extreme precipitation event frequency and intensity over relevant temporal scales. Changes in variability will be observable at
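
    The P-Cat scheme described above can be sketched as a threshold lookup on sliding 3-day accumulated totals. The breakpoints below are hypothetical placeholders (the abstract does not give the actual category thresholds), as is the toy daily series.

```python
import numpy as np

# Hypothetical category breakpoints (mm) for 3-day storm totals; the
# indicator's real P-Cat thresholds are not specified in the abstract
P_CAT_EDGES = np.array([50.0, 100.0, 200.0, 300.0])

def p_cat(total_mm):
    """Map a 3-day accumulated precipitation total (mm) onto a
    Saffir-Simpson-style category from P-Cat 1 (lightest) to P-Cat 5."""
    return int(np.searchsorted(P_CAT_EDGES, total_mm, side='right')) + 1

def three_day_totals(daily_mm):
    """Sliding 3-day accumulated totals from a daily series."""
    return np.convolve(daily_mm, np.ones(3), mode='valid')

daily = np.array([0, 5, 60, 80, 10, 0, 0, 120, 150, 40], dtype=float)
cats = [p_cat(t) for t in three_day_totals(daily)]
# Point-wise statistics such as max and mean P-Cat follow directly
max_cat, mean_cat = max(cats), sum(cats) / len(cats)
```

Applied per grid point of the TRMM/GPM fields, the same lookup yields the maximum P-Cat, mean P-Cat, and per-category frequency maps the indicator is built on.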

  11. Broadband metamaterial absorber based on coupling resistive frequency selective surface.

    Science.gov (United States)

    Sun, LiangKui; Cheng, HaiFeng; Zhou, YongJiang; Wang, Jun

    2012-02-13

    We report the design, fabrication, and measurement of a broadband metamaterial absorber, which consists of a lossy frequency selective surface (FSS) and a metallic ground plane separated by a dielectric layer. The compact single unit cell of the FSS contains a crisscross and a fractal square patch which couple with each other. Both qualitative analysis by an equivalent circuit and accurate numerical calculation show that the coupling between the crisscross and the fractal square patch can enhance the bandwidth with reflectivity below -10 dB in the frequency range of 2-18 GHz by producing a third absorption null. Finally, the designed absorber was realized and verified experimentally.

  12. Simulation Analysis of SPWM Variable Frequency Speed Based on Simulink

    Directory of Open Access Journals (Sweden)

    Min-Yan DI

    2014-01-01

    This article studies the sinusoidal pulse width modulation (SPWM) variable-frequency speed control system, currently a very active research field, with emphasis on its simulation model built with the MATLAB/Simulink/Power System toolboxes, in order to find the best simulation approach. We apply the model to an actual conveyor belt driven by a variable-frequency motor; when the simulation results are compared with the measured data, the method proves practical and effective. The results of our research can guide engineering and technical personnel in the CAD design of asynchronous motor SPWM variable-voltage variable-frequency (VVVF) drives.
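
    The SPWM principle underlying such drives, comparing a sinusoidal reference with a triangular carrier to produce gate pulses, can also be sketched outside Simulink. This minimal numerical version assumes illustrative reference/carrier frequencies and modulation index, not values from the article's conveyor-belt system.

```python
import numpy as np

def spwm(t, f_ref=50.0, f_carrier=2000.0, m=0.8):
    """Generate an SPWM gate signal by comparing a sinusoidal reference
    (modulation index m, frequency f_ref) with a triangular carrier."""
    ref = m * np.sin(2 * np.pi * f_ref * t)
    # Triangular carrier sweeping [-1, 1] at f_carrier
    carrier = 2.0 * np.abs(2.0 * ((t * f_carrier) % 1.0) - 1.0) - 1.0
    return (ref > carrier).astype(int)

t = np.arange(0, 0.02, 1e-6)   # one full 50 Hz reference cycle
gate = spwm(t)
duty = gate.mean()             # ~0.5 over a symmetric full cycle
```

The low-frequency content of `gate` tracks the sinusoidal reference, which is why varying `f_ref` and `m` together gives the VVVF behavior the article simulates.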

  13. Measurement frequency and sampling spatial domains required to characterize turbidity and salinity events in the Guadalquivir estuary (Spain

    Directory of Open Access Journals (Sweden)

    E. Contreras

    2012-08-01

    Estuaries are complex systems in which long water quality data series are not always available at the proper scale. Data from several water quality networks, with different measuring frequencies (monthly, weekly and 15 min) and different numbers of sampling points, were compared throughout the main channel of the Guadalquivir estuary. Higher-frequency turbidity sampling is required in the upper estuary. In the lower estuary, sampling points help to locate the estuarine turbidity maximum (ETM), and higher-frequency sampling of electrical conductivity (EC) is required because of the effect of the tidal and river components. These findings could feed back into the implementation of monitoring networks in estuaries.

  14. Measurement frequency and sampling spatial domains required to characterize turbidity and salinity events in the Guadalquivir estuary (Spain)

    Science.gov (United States)

    Contreras, E.; Polo, M. J.

    2012-08-01

    Estuaries are complex systems in which long water quality data series are not always available at the proper scale. Data from several water quality networks, with different measuring frequencies (monthly, weekly and 15 min) and different numbers of sampling points, were compared throughout the main channel of the Guadalquivir estuary. Higher-frequency turbidity sampling is required in the upper estuary. In the lower estuary, sampling points help to locate the estuarine turbidity maximum (ETM), and higher-frequency sampling of electrical conductivity (EC) is required because of the effect of the tidal and river components. These findings could feed back into the implementation of monitoring networks in estuaries.

  15. neural network based load frequency control for restructuring power

    African Journals Online (AJOL)

    2012-03-01

    In this study, an artificial neural network (ANN) application to load frequency control (LFC) of a multi-area power system using a neural network controller is presented. The comparison between a conventional Proportional-Integral (PI) controller and the proposed artificial neural networks ...

  16. Remote sensing-based fire frequency mapping in a savannah ...

    African Journals Online (AJOL)

    Linear pixel unmixing was used for image classification and subsequent mapping of burnt areas. The results showed that it was feasible to have discrimination of burnt areas and 'un-burnt' areas as well as generating a six year fire frequency map of the study area. Accuracy assessment of the classified images was carried ...

  17. High resolution mid-infrared spectroscopy based on frequency upconversion

    DEFF Research Database (Denmark)

    Dam, Jeppe Seidelin; Hu, Qi; Tidemand-Lichtenberg, Peter

    2013-01-01

    We present high resolution upconversion of incoherent infrared radiation by means of sum-frequency mixing with a laser followed by simple CCD Si-camera detection. Noise associated with upconversion is, in strong contrast to room temperature direct mid-IR detection, extremely small, thus very faint...

  18. Ionospheric correction for spaceborne single-frequency GPS based ...

    Indian Academy of Sciences (India)

    A modified ionospheric correction method and the corresponding approximate algorithm for spaceborne single-frequency Global Positioning System (GPS) users are proposed in this study. Single Layer Model (SLM) mapping function for spaceborne GPS was analyzed. SLM mapping functions at different altitudes were ...

  19. neural network based load frequency control for restructuring power

    African Journals Online (AJOL)

    2012-03-01

    A power system is chosen and load frequency control of this system is performed by an ANN controller and a conventional PI controller. Basically, the power system consists of a governor, a turbine, and a generator with feedback of the regulation constant. The system also includes a step load change input to the generator.

  20. Possibility of reinforcement learning based on event-related potential.

    Science.gov (United States)

    Yamagishi, Yuya; Tsubone, Tadashi; Wada, Yasuhiro

    2008-01-01

    We applied event-related potentials (ERPs) as reinforcement signals equivalent to reward and punishment signals. We recorded an electroencephalogram (EEG) while volunteers identified the success or failure of a task. We confirmed that there were differences in the EEG depending on whether the task was successful or not, suggesting that the ERP might be used as a reward signal for reinforcement learning. We used a support vector machine (SVM) to recognize the P300. The feature vector for the SVM was composed of 50 ms averages for each of the six channels (C3, Cz, C4, P3, Pz, P4) over a total of 700 ms. We suggest that reinforcement learning using the P300 can be performed accurately.
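
    The feature vector described above (50 ms window averages over 700 ms on six channels, giving 14 x 6 = 84 values) can be sketched as follows. The 200 Hz sampling rate is an assumption for illustration, and the SVM classification step itself is omitted.

```python
import numpy as np

def p300_features(eeg, fs=200):
    """Build the described feature vector: the mean of each 50 ms
    window across 700 ms post-stimulus for six channels
    (C3, Cz, C4, P3, Pz, P4) -> 14 windows x 6 channels = 84 values.
    `eeg` has shape (6, n_samples), sampled at fs Hz from stimulus onset."""
    win = int(0.05 * fs)             # samples per 50 ms window
    n_win = int(0.7 * fs) // win     # 14 windows in 700 ms
    seg = eeg[:, :n_win * win]
    # (channels, windows, samples) -> average within each window
    means = seg.reshape(eeg.shape[0], n_win, win).mean(axis=2)
    return means.ravel()

x = p300_features(np.ones((6, 160)))  # toy input: 0.8 s at 200 Hz
```

Feeding `x` into a trained SVM then yields the success/failure decision used as the reinforcement signal.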

  1. Long-term changes in regular and low-frequency earthquake inter-event times near Parkfield, CA

    Science.gov (United States)

    Wu, C.; Shelly, D. R.; Johnson, P. A.; Gomberg, J. S.; Peng, Z.

    2012-12-01

    The temporal evolution of earthquake inter-event time may provide important clues for the timing of future events and underlying physical mechanisms of earthquake nucleation. In this study, we examine inter-event times from 12-yr catalogs of ~50,000 earthquakes and ~730,000 LFEs in the vicinity of the Parkfield section of the San Andreas Fault. We focus on the long-term evolution of inter-event times after the 2003 Mw6.5 San Simeon and 2004 Mw6.0 Parkfield earthquakes. We find that inter-event times decrease by ~4 orders of magnitude after the Parkfield and San Simeon earthquakes and are followed by a long-term recovery with time scales of ~3 years and more than 8 years for earthquakes along and to the southwest of the San Andreas fault, respectively. The differing long-term recovery of the earthquake inter-event times is likely a manifestation of different aftershock recovery time scales that reflect the different tectonic loading rates in the two regions. We also observe a possible decrease of LFE inter-event times in some LFE families, followed by a recovery with time scales of ~4 months to several years. The drop in the recurrence time of LFEs after the Parkfield earthquake is likely caused by a combination of the dynamic and positive static stress induced by the Parkfield earthquake, and the long-term recovery in LFE recurrence time could be due to post-seismic relaxation or gradual recovery of the fault zone material properties. Our on-going work includes better constraining and understanding the physical mechanisms responsible for the observed long-term recovery in earthquake and LFE inter-event times.
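
    The basic quantity in this study, the inter-event time and its evolution through a catalog, can be sketched on a toy catalog as follows. The windowed-median summary is an illustrative choice for visualising a post-mainshock drop and recovery, not the authors' analysis method.

```python
import numpy as np

def inter_event_times(event_times):
    """Inter-event times from a catalog of event times."""
    return np.diff(np.sort(np.asarray(event_times)))

def median_iet_by_window(event_times, window):
    """Median inter-event time in consecutive time windows: a simple
    way to track a post-mainshock drop and gradual recovery."""
    t = np.sort(np.asarray(event_times))
    edges = np.arange(t[0], t[-1] + window, window)
    med = []
    for a, b in zip(edges[:-1], edges[1:]):
        seg = t[(t >= a) & (t < b)]
        med.append(np.median(np.diff(seg)) if len(seg) > 1 else np.nan)
    return np.array(med)

# Toy catalog: dense 'aftershocks' for 10 s, then sparse background
times = np.concatenate([np.arange(0.0, 10.0, 0.1),
                        np.arange(10.0, 100.0, 5.0)])
med = median_iet_by_window(times, window=10.0)
# med rises from ~0.1 s in the dense period to ~5 s afterwards
```

In the study the same idea is applied on time scales of months to years, with the recovery time extracted from the trend rather than window medians.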

  2. Measurement frequency and sampling spatial domains required to characterize turbidity and salinity events in the Guadalquivir estuary (Spain)

    OpenAIRE

    Contreras, E.; M. J. Polo

    2012-01-01

    Estuaries are complex systems in which long water quality data series are not always available at the proper scale. Data proceeding from several water quality networks, with different measuring frequencies (monthly, weekly and 15 min) and different numbers of sampling points, were compared throughout the main channel of the Guadalquivir estuary. Higher frequency of turbidity sampling in the upper estuary is required. In the lower estuary, sampling points help to find out the ETM, and higher f...

  3. Asteroid! An Event-Based Science Module. Teacher's Guide. Astronomy Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science or general science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event- based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  4. Asteroid! An Event-Based Science Module. Student Edition. Astronomy Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  5. Oil Spill! An Event-Based Science Module. Student Edition. Oceanography Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  6. Oil Spill!: An Event-Based Science Module. Teacher's Guide. Oceanography Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science or general science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event- based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  7. Volcano!: An Event-Based Science Module. Teacher's Guide. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research,…

  8. Volcano!: An Event-Based Science Module. Student Edition. Geology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  9. Hurricane! An Event-Based Science Module. Student Edition. Meteorology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  10. Hurricane!: An Event-Based Science Module. Teacher's Guide. Meteorology Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school earth science teachers to help their students learn about problems with hurricanes and scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning,…

  11. Fraud! An Event-Based Science Module. Student Edition. Chemistry Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school students to learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork, independent research, hands-on investigations, and…

  12. Fraud! An Event-Based Science Module. Teacher's Guide. Chemistry Module.

    Science.gov (United States)

    Wright, Russell G.

    This book is designed for middle school life science or physical science teachers to help their students learn scientific literacy through event-based science. Unlike traditional curricula, the event-based earth science module is a student-centered, interdisciplinary, inquiry-oriented program that emphasizes cooperative learning, teamwork,…

  13. Automatic detection of esophageal pressure events. Is there an alternative to rule-based criteria?

    DEFF Research Database (Denmark)

    Kruse-Andersen, S; Rütz, K; Kolberg, Jens Godsk

    1995-01-01

    curves generated by muscular contractions, rule-based criteria do not always select the pressure events most relevant for further analysis. We have therefore been searching for a new concept for automatic event recognition. The present study describes a new system, based on the method of neurocomputing...

  14. Event detection using population-based health care databases in randomized clinical trials

    DEFF Research Database (Denmark)

    Thuesen, Leif; Jensen, Lisette Okkels; Tilsted, Hans Henrik

    2013-01-01

    To describe a new research tool, designed to reflect routine clinical practice and relying on population-based health care databases to detect clinical events in randomized clinical trials.

  15. Transformation of frequency-magnitude relation prior to large events in the model of block structure dynamics

    Directory of Open Access Journals (Sweden)

    A. Soloviev

    2008-02-01

    The b-value change in the frequency-magnitude (FM) distribution for a synthetic earthquake catalogue obtained by means of the model of block structure dynamics has been studied. The catalogue is divided into time periods preceding strong earthquakes and time periods that do not precede strong earthquakes. Separate analysis of these periods shows that the b-value is smaller before strong earthquakes. A similar phenomenon has also been found for the observed seismicity of Southern California. The model of block structure dynamics represents a seismic region as a system of perfectly rigid blocks divided by infinitely thin plane faults. The blocks interact between themselves and with the underlying medium. The system of blocks moves as a consequence of prescribed motion of the boundary blocks and of the underlying medium. As the blocks are perfectly rigid, all deformation takes place in the fault zones and at the block base in contact with the underlying medium. Relative block displacements take place along the fault zones. Block motion is defined so that the system is in a quasistatic equilibrium state. The interaction of blocks along the fault zones is viscous-elastic ("normal state") while the ratio of the stress to the pressure remains below a certain strength level. When the critical level is exceeded in some part of a fault zone, a stress-drop ("failure") occurs (in accordance with the dry friction model), possibly causing failure in other parts of the fault zones. These failures produce earthquakes. Immediately after the earthquake and for some time after, the affected parts of the fault zones are in a state of creep. This state differs from the normal state because of a faster growth of inelastic displacements, lasting until the stress falls below some other level. This numerical simulation gives rise to a synthetic earthquake catalogue.
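
    The b-value tracked in this study is conventionally estimated from the frequency-magnitude distribution by maximum likelihood (Aki 1965). A minimal sketch on a synthetic Gutenberg-Richter catalogue (not the block model's catalogue) is:

```python
import numpy as np

def b_value(magnitudes, m_c, dm=0.0):
    """Maximum-likelihood b-value (Aki 1965; dm/2 is Utsu's correction
    for binned magnitudes) for events at or above completeness m_c."""
    m = np.asarray(magnitudes)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

# Synthetic Gutenberg-Richter catalogue with a true b-value of 1.0:
# magnitudes above m_c are exponential with rate b * ln(10)
rng = np.random.default_rng(1)
mags = 2.0 + rng.exponential(1.0 / np.log(10.0), size=20000)
b_hat = b_value(mags, m_c=2.0)  # should be close to 1.0
```

Computing `b_hat` separately for windows preceding and not preceding strong events reproduces the kind of comparison the abstract reports, where a lower b-value precedes strong earthquakes.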

  16. Adapted wavelet transform improves time-frequency representations: a study of auditory elicited P300-like event-related potentials in rats

    Science.gov (United States)

    Richard, Nelly; Laursen, Bettina; Grupe, Morten; Drewes, Asbjørn M.; Graversen, Carina; Sørensen, Helge B. D.; Bastlund, Jesper F.

    2017-04-01

    Objective. Active auditory oddball paradigms are simple tone discrimination tasks used to study the P300 deflection of event-related potentials (ERPs). These ERPs may be quantified by time-frequency analysis. As auditory stimuli cause early high-frequency and late low-frequency ERP oscillations, the continuous wavelet transform (CWT) is often chosen for decomposition due to its multi-resolution properties. However, as the conventional CWT traditionally applies only one mother wavelet to represent the entire spectrum, the time-frequency resolution is not optimal across all scales. To account for this, we developed and validated a novel method specifically refined to analyse P300-like ERPs in rats. Approach. An adapted CWT (aCWT) was implemented to preserve high time-frequency resolution across all scales by commissioning multiple wavelets operating at different scales. First, decomposition of simulated ERPs was illustrated using the classical CWT and the aCWT. Next, the two methods were applied to EEG recordings obtained from the prefrontal cortex of rats performing a two-tone auditory discrimination task. Main results. While only early ERP frequency changes between responses to target and non-target tones were detected by the CWT, both early and late changes were successfully described with strong accuracy by the aCWT in rat ERPs. Increased frontal gamma power and phase synchrony were observed particularly within theta and gamma frequency bands during deviant tones. Significance. The study suggests superior performance of the aCWT over the CWT in terms of detailed quantification of time-frequency properties of ERPs. Our methodological investigation indicates that accurate and complete assessment of time-frequency components of short-time neural signals is feasible with the novel analysis approach, which may be advantageous for characterisation of several types of evoked potentials, particularly in rodents.
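
    The conventional CWT that the aCWT is compared against can be sketched with a naive Morlet implementation. This is an illustrative single-wavelet version on a synthetic signal (the aCWT itself, with band-dependent mother wavelets, is not reproduced here), and all signal parameters are hypothetical.

```python
import numpy as np

def morlet(scale, fs, w0=6.0):
    """Complex Morlet wavelet at a given scale (s), sampled at fs (Hz)."""
    M = int(10 * scale * fs) | 1  # odd length, ~10 scales wide
    t = (np.arange(M) - M // 2) / fs
    return np.exp(1j * w0 * t / scale - 0.5 * (t / scale) ** 2)

def cwt(x, scales, fs):
    """Naive CWT: convolve the signal with one Morlet per scale. An
    adapted CWT in the spirit of the paper would instead swap mother
    wavelets between frequency bands."""
    return np.array([np.convolve(x, morlet(s, fs), mode='same')
                     for s in scales])

fs = 500.0
t = np.arange(0, 2.0, 1.0 / fs)
x = np.sin(2 * np.pi * 10.0 * t)            # 10 Hz test oscillation
freqs = np.array([5.0, 10.0, 20.0, 40.0])
scales = 6.0 / (2.0 * np.pi * freqs)        # Morlet scale <-> frequency
power = np.abs(cwt(x, scales, fs)) ** 2
peak = freqs[int(np.argmax(power.mean(axis=1)))]  # strongest band
```

Replacing the single `morlet` call with a wavelet chosen per frequency band is the structural change the aCWT makes to keep time-frequency resolution balanced across scales.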

  17. Model-Checking of Component-Based Event-Driven Real-Time Embedded Software

    National Research Council Canada - National Science Library

    Gu, Zonghua; Shin, Kang G

    2005-01-01

    .... We discuss application of model-checking to verify system-level concurrency properties of component-based real-time embedded software based on CORBA Event Service, using Avionics Mission Computing...

  18. Speaker Normalization Based on Time-Frequency Warp with Inter-Frame Consistency

    OpenAIRE

    山田, 圭; 内田, 誠一; 迫江, 博昭

    1998-01-01

    A new algorithm for speaker-independent spoken word recognition is presented. The algorithm is based on the time-frequency warping technique, where frequency-axis warping is performed to adjust for individual spectral differences, in addition to time-axis warping. In the conventional algorithm, frequency-axis warping is determined independently at each frame (i.e., time). In that case, such warping tends to yield excessive deformations of the time-frequency plane. In order...

  19. Very-Narrow-Line Semiconductor Laser and Optical Clocks Based on Spectral Hole Burning Frequency Standards

    National Research Council Canada - National Science Library

    Cone, Rufus

    2000-01-01

    .... The achieved frequency stabilization provides ideal lasers for high-resolution spectroscopy, real time optical signal processing based on spectral holography, and other applications requiring ultra...

  20. Common time-frequency analysis of local field potential and pyramidal cell activity in seizure-like events of the rat hippocampus

    Science.gov (United States)

    Cotic, M.; Chiu, A. W. L.; Jahromi, S. S.; Carlen, P. L.; Bardakjian, B. L.

    2011-08-01

    To study cell-field dynamics, physiologists simultaneously record local field potentials and the activity of individual cells from animals performing cognitive tasks, during various brain states or under pathological conditions. However, apart from spike shape and spike timing analyses, few studies have focused on elucidating the common time-frequency structure of local field activity relative to surrounding cells across different phases of these phenomena. We have used two algorithms, multi-window time-frequency analysis and wavelet phase coherence (WPC), to study common intracellular-extracellular (I-E) spectral features in spontaneous seizure-like events (SLEs) from rat hippocampal slices in a low-magnesium epilepsy model. Both algorithms were applied to 'pairs' of simultaneously observed I-E signals from slices in the CA1 hippocampal region. Analyses were performed over a frequency range of 1-100 Hz. I-E spectral commonality varied in frequency and time. Higher commonality was observed from 1 to 15 Hz, and lower commonality was observed in the 15-100 Hz frequency range. WPC was lower in the non-SLE region compared to SLE activity; however, there was no statistical difference in the 30-45 Hz band between SLE and non-SLE modes. This work provides evidence of strong commonality in various frequency bands of I-E SLEs in the rat hippocampus, not only during SLEs but also immediately before and after.
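    Wavelet phase coherence of the kind used above can be sketched as the magnitude of the time-averaged phase-difference phasor between two band-filtered signals. The NumPy stand-in below is a minimal sketch, not the study's implementation; the frequency, cycle count, and surrogate "field" and "cell" signals are assumptions:

```python
import numpy as np

def wavelet_phase_coherence(x, y, fs, f, n_cycles=6):
    """Phase coherence at frequency f: |time average of exp(i*(phi_x - phi_y))|.
    Near 1 for phase-locked signals, small for independent phases."""
    sigma = n_cycles / (2 * np.pi * f)
    t = np.arange(-3 * sigma, 3 * sigma, 1 / fs)
    w = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * sigma**2))
    px = np.angle(np.convolve(x, w, mode="same"))   # instantaneous phase of x
    py = np.angle(np.convolve(y, w, mode="same"))   # instantaneous phase of y
    return np.abs(np.mean(np.exp(1j * (px - py))))

rng = np.random.default_rng(0)
fs = 1000.0
t = np.arange(0, 2, 1 / fs)
lfp = np.sin(2 * np.pi * 10 * t)                          # surrogate "field" signal
cell = np.sin(2 * np.pi * 10 * t + 0.5) + 0.3 * rng.standard_normal(t.size)
locked = wavelet_phase_coherence(lfp, cell, fs, f=10)     # constant 0.5 rad lag
unlocked = wavelet_phase_coherence(lfp, rng.standard_normal(t.size), fs, f=10)
```

    Note that coherence rewards a *consistent* phase lag, not a zero lag, which is why the 0.5 rad offset above still yields a value near 1.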

  1. Simulation of Frequency Hopping Communication System Based On MATLAB

    OpenAIRE

    Xu Xiaoping; Wang Weiqi; Yi Lan; Rong Jie; Wang Anqi

    2016-01-01

    In today’s information age, how to carry out accurate information communication is a crucial issue in the field of communication, with wide application in both civil and military fields. Frequency hopping communication is a typical spread-spectrum communication system; it has a very wide range of applications in military communication, mobile communication, computer wireless data transmission, wireless LANs, and other fields, and in current short-wave communication has become one of the im...

  2. Maximal potential patent foramen diameter does not correlate with the type or frequency of the neurologic event prior to closure.

    Science.gov (United States)

    Kutty, Shelby; Brown, Kimberly; Qureshi, Athar M; Latson, Larry A

    2009-01-01

    We analyzed our data on patients undergoing transcatheter patent foramen ovale (PFO) closure to determine if the maximal potential PFO diameter (MPPD) by balloon sizing correlates with important clinical characteristics in this population. We defined stroke as a focal neurologic deficit lasting >24 h, or focal deficit of shorter duration associated with permanent MRI/CT changes consistent with a focal infarction. Parameters analyzed included age, gender, anticoagulation, hypertension, smoking, MRI/CT findings and MPPD at catheterization. We specifically analyzed the type of neurologic event (stroke/transient ischemic attack, TIA), and number of recorded preceding clinical neurologic events. In 216 consecutive patients, 167 suffered a stroke. MRI/CT changes consistent with one or more embolic events were seen in 156 patients; 49 had a clinical TIA. There was no significant difference in MPPD between stroke (11.0 +/- 3.6 mm) and TIA groups (10.9 +/- 3.9 mm; 95% confidence interval for difference: -1.33 to 1.00). MPPD did not differ between MRI/CT-positive vs. -negative strokes, and had no correlation with the number of identified pre-closure clinical neurologic events. Continued investigation is needed to determine whether other PFO characteristics, or other anatomic/physiologic parameters, may be useful to identify patients at high risk for cryptogenic stroke/TIA, even before they have their first neurologic event. Copyright 2008 S. Karger AG, Basel.

  3. Iterative interferometry-based method for picking microseismic events

    Science.gov (United States)

    Iqbal, Naveed; Al-Shuhail, Abdullatif A.; Kaka, SanLinn I.; Liu, Entao; Raj, Anupama Govinda; McClellan, James H.

    2017-05-01

    Continuous microseismic monitoring of hydraulic fracturing is commonly used in many engineering, environmental, mining, and petroleum applications. Microseismic signals recorded at the surface suffer from excessive noise that complicates first-break picking and subsequent data processing and analysis. This study presents a new first-break picking algorithm that employs concepts from seismic interferometry and time-frequency (TF) analysis. The algorithm first uses a TF plot to manually pick a reference first break and then iterates the steps of cross-correlation, alignment, and stacking to enhance the signal-to-noise ratio of the relative first breaks. The reference first break is subsequently used to calculate final first breaks from the relative ones. Testing on synthetic and real data sets at high levels of additive noise shows that the algorithm enhances first-break picking considerably. Furthermore, results show that only two iterations are needed to converge to the true first breaks. Indeed, iterating more can have detrimental effects on the algorithm due to increasing correlation of random noise.
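    The iterate-correlate-align-stack loop described above can be sketched in a few lines of NumPy. This is an illustrative reconstruction under stated assumptions, not the authors' code: the traces are synthetic, the pulse shape and noise level are invented, and the abstract's two-iteration convergence is taken as the default:

```python
import numpy as np

def align_to_stack(traces, n_iter=2):
    """Iteratively cross-correlate each trace with the current stack,
    align, and re-stack; returns the relative shift (in samples) that
    aligns each trace to the stack. Two iterations, per the abstract."""
    n = traces.shape[1]
    shifts = np.zeros(len(traces), dtype=int)
    aligned = traces.copy()
    for _ in range(n_iter):
        stack = aligned.mean(axis=0)
        for i, tr in enumerate(traces):
            xc = np.correlate(stack, tr, mode="full")
            shifts[i] = xc.argmax() - (n - 1)      # advance needed for trace i
            aligned[i] = np.roll(tr, shifts[i])
    return shifts

rng = np.random.default_rng(1)
n = 400
wavelet = np.diff(np.hanning(21))                  # zero-mean test pulse
true_onsets = np.array([100, 106, 97, 109, 103])   # first breaks (samples)
traces = np.zeros((len(true_onsets), n))
for i, onset in enumerate(true_onsets):
    traces[i, onset:onset + wavelet.size] = wavelet
traces += 0.02 * rng.standard_normal(traces.shape)
shifts = align_to_stack(traces)
# onset + shift should be (nearly) constant across traces: the relative
# first breaks are recovered up to a common constant, which the abstract's
# manually picked reference first break then resolves.
```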

  4. You Never Walk Alone: Recommending Academic Events Based on Social Network Analysis

    Science.gov (United States)

    Klamma, Ralf; Cuong, Pham Manh; Cao, Yiwei

    Combining Social Network Analysis and recommender systems is a challenging research field. In scientific communities, recommender systems have been applied to provide useful tools for papers and books, as well as for expert finding. However, academic events (conferences, workshops, international symposiums, etc.) are an important driving force for cooperation among research communities. We realize an SNA-based approach to the academic event recommendation problem. Analysis and visualization of scientific communities are performed to provide insight into the communities of event series. A prototype is implemented based on data from DBLP and EventSeer.net, and the results are examined in order to validate the approach.

  5. Event-Based Proof of the Mutual Exclusion Property of Peterson’s Algorithm

    Directory of Open Access Journals (Sweden)

    Ivanov Ievgen

    2015-12-01

    Full Text Available Proving properties of distributed algorithms is still a highly challenging problem and various approaches that have been proposed to tackle it [1] can be roughly divided into state-based and event-based proofs. Informally speaking, state-based approaches define the behavior of a distributed algorithm as a set of sequences of memory states during its executions, while event-based approaches treat the behaviors by means of events which are produced by the executions of an algorithm. Of course, combined approaches are also possible.

  6. Liraglutide Treatment Is Associated with a Low Frequency and Magnitude of Antibody Formation with No Apparent Impact on Glycemic Response or Increased Frequency of Adverse Events

    DEFF Research Database (Denmark)

    Buse, John B; Garber, Alan; Rosenstock, Julio

    2011-01-01

    the impact on glycemic control and safety, and to compare it with exenatide, an agent in the same class. Design: Antibody data were collected during six Liraglutide Effect and Action in Diabetes (LEAD) trials (26–104 wk duration). Setting: Samples for determination of antibody formation were collected...... (high or low level)]. Results: After 26 wk, 32 of 369 (8.7%) and 49 of 587 (8.3%) patients had low-level antibodies to liraglutide 1.2 and 1.8 mg, respectively [mean 3.3% antibody-bound radioactivity out of total radioactivity (%B/T), range 1.6–10.7%B/T], which did not attenuate glycemic efficacy (HbA1c.......0022). After switching from exenatide to liraglutide, anti-exenatide antibodies did not compromise a further glycemic response to liraglutide (additional 0.4% HbA1c reduction). Conclusions: Liraglutide was less immunogenic than exenatide; the frequency and levels of anti-liraglutide antibodies were low and did...

  7. A novel probabilistic framework for event-based speech recognition

    Science.gov (United States)

    Juneja, Amit; Espy-Wilson, Carol

    2003-10-01

    One of the reasons for unsatisfactory performance of the state-of-the-art automatic speech recognition (ASR) systems is the inferior acoustic modeling of low-level acoustic-phonetic information in the speech signal. An acoustic-phonetic approach to ASR, on the other hand, explicitly targets linguistic information in the speech signal, but such a system for continuous speech recognition (CSR) is not known to exist. A probabilistic and statistical framework for CSR based on the idea of the representation of speech sounds by bundles of binary valued articulatory phonetic features is proposed. Multiple probabilistic sequences of linguistically motivated landmarks are obtained using binary classifiers of manner phonetic features (syllabic, sonorant, and continuant) and the knowledge-based acoustic parameters (APs) that are acoustic correlates of those features. The landmarks are then used for the extraction of knowledge-based APs for source and place phonetic features and their binary classification. Probabilistic landmark sequences are constrained using manner class language models for isolated or connected word recognition. The proposed method could overcome the disadvantages encountered by the early acoustic-phonetic knowledge-based systems that led the ASR community to switch to systems highly dependent on statistical pattern analysis methods and probabilistic language or grammar models.

  8. Wavelet based denoising of power quality events for characterization

    African Journals Online (AJOL)

    user

    Angrisani L., Daponte P., D'Apuzzo M. and Testa A., 1996, A new wavelet transform based procedure for electrical power quality analysis, Proceedings of the International Conference on Harmonics and Quality of Power (ICHQP), Las Vegas, Nevada, USA, pp. 608-614. Bollen Math H.J., 2000, Understanding power quality ...

  9. Event Highlight: Nigeria Evidence-based Health System Initiative

    International Development Research Centre (IDRC) Digital Library (Canada)

    2012-06-01

    Jun 1, 2012 ... Since limited resources are available for life-saving health services in Nigeria, those who plan health programs need to know which interventions are most effective and how to prioritise them. An important objective of the Nigeria Evidence-based Health System Initiative (NEHSI) is to build the capacity of.

  10. Wireless Chalcogenide Nanoionic-Based Radio-Frequency Switch

    Science.gov (United States)

    Nessel, James; Miranda, Felix

    2013-01-01

    A new nonvolatile nanoionic switch is powered and controlled through wireless radio-frequency (RF) transmission. A thin layer of chalcogenide glass doped with a metal ion, such as silver, comprises the operational portion of the switch. For the switch to function, an oxidizable electrode is made positive (anode) with respect to an opposing electrode (cathode) when sufficient bias, typically on the order of a few tenths of a volt or more, is applied. This action causes the metal ions to flow toward the cathode through a coordinated hopping mechanism. At the cathode, a reduction reaction occurs to form a metal deposit. This metal deposit creates a conductive path that bridges the gap between electrodes to turn the switch on. Once this conductive path is formed, no further power is required to maintain it. To reverse this process, the metal deposit is made positive with respect to the original oxidizable electrode, causing the dissolution of the metal bridge and thereby turning the switch off. Once the metal deposit has been completely dissolved, the process self-terminates. This switching process features the following attributes. It requires very little power to change states (i.e., on and off). Furthermore, no power is required to maintain the states; hence, the state of the switch is nonvolatile. Because of these attributes, the integration of a rectenna to provide the necessary power and control is unique to this embodiment. A rectenna, or rectifying antenna, generates DC power from an incident RF signal. The low voltages and power required for the nanoionic switch control are easily generated from this system and provide the switch with a novel capability to be operated and powered from an external wireless device. In one realization, an RF signal of a specific frequency can be used to set the switch into an off state, while another frequency can be used to set the switch to an on state. The wireless, miniaturized, and no-moving-part features of this switch make it

  11. The effect of high-frequency conditioning stimulation of human skin on reported pain intensity and event-related potentials

    NARCIS (Netherlands)

    Broeke, E.N. van den; Heck, C.H. van; Ceelen, L.A.J.M.; Rijn, C.M. van; Goor, H. van; Wilder-Smith, O.H.G.

    2012-01-01

    High-frequency conditioning electrical stimulation (HFS) of human skin induces an increased pain sensitivity to mechanical stimuli in the surrounding nonconditioned skin. The aim of this study was to investigate the effect of HFS on reported pain sensitivity to single electrical stimuli applied

  12. Event-based media processing and analysis: A survey of the literature

    OpenAIRE

    Tzelepis, Christos; Ma, Zhigang; MEZARIS, Vasileios; Ionescu, Bogdan; Kompatsiaris, Ioannis; Boato, Giulia; Sebe, Nicu; Yan, Shuicheng

    2016-01-01

    Research on event-based processing and analysis of media is receiving an increasing attention from the scientific community due to its relevance for an abundance of applications, from consumer video management and video surveillance to lifelogging and social media. Events have the ability to semantically encode relationships of different informational modalities, such as visual-audio-text, time, involved agents and objects, with the spatio-temporal component of events being a key feature for ...

  13. Personalized Event-Based Surveillance and Alerting Support for the Assessment of Risk

    OpenAIRE

    Stewar, Avaré; Lage, Ricardo; Diaz-Aviles, Ernesto; Dolog, Peter

    2011-01-01

    In a typical Event-Based Surveillance setting, a stream of web documents is continuously monitored for disease reporting. A structured representation of the disease reporting events is extracted from the raw text, and the events are then aggregated to produce signals, which are intended to represent early warnings against potential public health threats. To public health officials, these warnings represent an overwhelming list of "one-size-fits-all" information for risk assessment. To reduce ...

  14. Galactic Cosmic Ray Event-Based Risk Model (GERM) Code

    Science.gov (United States)

    Cucinotta, Francis A.; Plante, Ianik; Ponomarev, Artem L.; Kim, Myung-Hee Y.

    2013-01-01

    This software describes the transport and energy deposition of galactic cosmic rays passing through astronaut tissues during space travel, or of heavy ion beams in patients in cancer therapy. Space radiation risk is a probability distribution, and time-dependent biological events must be accounted for in the physical description of space radiation transport in tissues and cells. A stochastic model can calculate the probability density directly, without unverified assumptions about the shape of the probability density function. The prior art of transport codes calculates the average flux and dose of particles behind spacecraft and tissue shielding. Because of the signaling times for activation and relaxation in the cell and tissue, a transport code must describe temporal and microspatial densities in order to correlate DNA and oxidative damage with non-targeted effects (signaling, bystander effects, etc.); these are ignored in, or impossible for, the prior art. The GERM code provides scientists with data interpretation of experiments; modeling of beam lines, shielding of target samples, and sample holders; and estimation of the basic physical and biological outputs of their experiments. For mono-energetic ion beams, basic physical and biological properties are calculated for a selected ion type, such as kinetic energy, mass, charge number, absorbed dose, or fluence. Evaluated quantities are linear energy transfer (LET), range (R), absorption and fragmentation cross-sections, and the probability of nuclear interactions after 1 or 5 cm of water-equivalent material. In addition, a set of biophysical properties is evaluated, such as the Poisson distribution for a specified cellular area, cell survival curves, and DNA damage yields per cell. Also, the GERM code calculates the radiation transport of the beam line for either a fixed number of user-specified depths or at multiple positions along the Bragg curve of the particle in a selected material. The GERM code makes the numerical estimates of basic

  15. Tag and Neighbor based Recommender systems for Medical events

    DEFF Research Database (Denmark)

    Bayyapu, Karunakar Reddy; Dolog, Peter

    2010-01-01

    This paper presents an extension of a multifactor recommendation approach based on user tagging with term neighbours. Neighbours of words in tag vectors and documents provide for hitting a larger set of documents, and not only those matching the direct tag vectors or the content of the documents. Tag...... in the situations where the quality of tags is lower. We discuss the approach on examples from the existing Medworm system to indicate the usefulness of the approach....

  16. Location-Based Events Detection on Micro-Blogs

    OpenAIRE

    Santos, Augusto Dias Pereira dos; Wives, Leandro Krug; Alvares, Luis Otavio

    2012-01-01

    The increasing use of social networks generates enormous amounts of data that can be used for many types of analysis. Some of these data have temporal and geographical information, which can be used for comprehensive examination. In this paper, we propose a new method to analyze the massive volume of messages available in Twitter to identify places in the world where topics such as TV shows, climate change, disasters, and sports are emerging. The proposed method is based on a neural network t...

  17. GPS-based PWV for precipitation forecasting and its application to a typhoon event

    Science.gov (United States)

    Zhao, Qingzhi; Yao, Yibin; Yao, Wanqiang

    2018-01-01

    The temporal variability of precipitable water vapour (PWV) derived from Global Navigation Satellite System (GNSS) observations can be used to forecast precipitation events. A number of case studies of precipitation events have been analysed in Zhejiang Province, and a forecasting method for precipitation events is proposed. The PWV time series retrieved from Global Positioning System (GPS) observations was processed using a least-squares fitting method to obtain the linear trends of ascents and descents in PWV. The increment of PWV over a short time (two to six hours) and the PWV slope over a longer time (a few hours to more than ten hours) during the PWV ascending period are considered as predictive factors with which to forecast precipitation events. The numerical results show that about 80%-90% of precipitation events, and more than 90% of heavy rain events, can be forecast two to six hours in advance based on the proposed method. 5-minute PWV data derived from GPS observations based on real-time precise point positioning (RT-PPP) were used for the typhoon event that passed over Zhejiang Province between 10 and 12 July 2015. A good result was acquired using the proposed method: about 74% of precipitation events were predicted some ten to thirty minutes before their onset, with a false alarm rate of 18%. This study shows that GPS-based PWV is promising for short-term and nowcasting precipitation forecasting.
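    The two predictive factors described above (the short-time PWV increment and the longer-term least-squares slope of the ascent) can be sketched as follows. The synthetic PWV series, window lengths, and decision thresholds below are illustrative assumptions, not the values calibrated in the study:

```python
import numpy as np

def pwv_predictors(pwv, hours, short_window=4):
    """Two predictive factors from a PWV time series during an ascending
    period: the increment (mm) over the last `short_window` hours and the
    least-squares slope (mm/h) over the whole ascent."""
    slope = np.polyfit(hours, pwv, 1)[0]          # mm per hour
    recent = pwv[hours >= hours[-1] - short_window]
    increment = recent[-1] - recent[0]
    return increment, slope

# synthetic 12 h ascent sampled every 5 min, mimicking the RT-PPP cadence
hours = np.arange(0, 12, 5 / 60)
pwv = 40 + 1.5 * hours + 0.3 * np.sin(hours)      # mm, rising trend
inc, slope = pwv_predictors(pwv, hours)
likely_rain = (inc > 5.0) and (slope > 1.0)       # illustrative thresholds
```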

  18. Coupling urban event-based and catchment continuous modelling for combined sewer overflow river impact assessment

    Directory of Open Access Journals (Sweden)

    I. Andrés-Doménech

    2010-10-01

    Full Text Available Since the Water Framework Directive (WFD) was passed in the year 2000, the conservation of water bodies in the EU must be understood in a completely different way. Regarding combined sewer overflows (CSOs) from urban drainage networks, the WFD implies that CSOs cannot be accepted or rejected on the basis of their intrinsic features alone; they must be assessed for their impact on the receiving water bodies, in agreement with specific environmental aims. Consequently, both the urban system and the receiving water body must be jointly analysed to evaluate the environmental impact generated on the latter. In this context, a coupled scheme is presented in this paper to assess the CSO impact on a river system in Torrelavega (Spain). First, an urban model is developed to statistically characterise CSO frequency, volume and duration. The main feature of this first model is that it is event-based: the system is modelled with synthetic design storms which adequately cover the probability range of the main rainfall descriptors, i.e., rainfall event volume and peak intensity. Thus, CSOs are characterised in terms of their occurrence probability. Secondly, a continuous and distributed basin model is built to assess the river response at different points in the river network. This model was calibrated initially on a daily scale and downscaled later to an hourly scale. The main objective of this second element of the scheme is to provide the most likely state of the receiving river when a CSO occurs. By combining the results of both models, CSO and river flows are homogeneously characterised from a statistical point of view. Finally, results from both models were coupled to estimate the final concentrations of the analysed pollutants (biochemical oxygen demand, BOD, and total ammonium, NH4+) within the river just after the spills.

  19. Predictive Event Triggered Control based on Heuristic Dynamic Programming for Nonlinear Continuous Time Systems

    Science.gov (United States)

    2015-08-17

    In this paper, a novel predictive event-triggered control method based on the heuristic dynamic programming (HDP) algorithm is developed for nonlinear continuous-time systems. A model network is used to estimate...

  20. Knowledge-Driven Event Extraction in Russian: Corpus-Based Linguistic Resources.

    Science.gov (United States)

    Solovyev, Valery; Ivanov, Vladimir

    2016-01-01

    Automatic event extraction from text is an important step in knowledge acquisition and knowledge base population. Manual work in the development of an extraction system is indispensable, whether in corpus annotation or in the creation of vocabularies and patterns for a knowledge-based system. Recent works have focused on the adaptation of existing systems (for extraction from English texts) to new domains. Event extraction in other languages has not been studied, due to the lack of resources and algorithms necessary for natural language processing. In this paper we define a set of linguistic resources that are necessary for the development of a knowledge-based event extraction system in Russian: a vocabulary of subordination models, a vocabulary of event triggers, and a vocabulary of Frame Elements that are basic building blocks for semantic patterns. We propose a set of methods for the creation of such vocabularies in Russian and other languages using the Google Books Ngram Corpus. The methods are evaluated in the development of an event extraction system for Russian.

  1. Adverse Event Profile of Pyrimethamine-Based Therapy in Toxoplasmosis: A Systematic Review.

    Science.gov (United States)

    Ben-Harari, Ruben R; Goodwin, Elizabeth; Casoy, Julio

    2017-12-01

    Approximately a third of the population worldwide is chronically infected with Toxoplasma gondii. Pyrimethamine-based regimens are recommended for the treatment of toxoplasmosis. The aim was to evaluate the safety profile of pyrimethamine-based treatment for the three main Toxoplasma manifestations: toxoplasmic encephalitis (TE), ocular toxoplasmosis, and congenital toxoplasmosis. PubMed, Cochrane Library, and Google Scholar databases were searched through August 1, 2016. Randomized, observational, prospective/retrospective, and cohort studies were eligible. Thirty-one studies were included, with a total of 2975 patients. Of these, 13 were in congenital toxoplasmosis (n = 929), 11 in ocular toxoplasmosis (n = 1284), and seven in TE (n = 687). Across manifestations, adverse event (AE)-related treatment discontinuation and/or change in therapy involved ≤37% of patients and occurred in >55% of studies: 100% for ocular toxoplasmosis, 57.1% for TE, and 61.5% for congenital toxoplasmosis. The most commonly observed AEs were bone marrow suppression, dermatologic, and gastrointestinal (GI). The prevalence of bone marrow suppression-related AEs was ≤50% in congenital toxoplasmosis, ≤42.7% in TE, and ≤9.0% in ocular toxoplasmosis. The frequencies of GI and dermatologic AEs were ≤100 and ≤11.1%, respectively, for ocular toxoplasmosis, ≤10.7 and ≤17.9% for TE, and ≤10.8 and ≤2.1% for congenital toxoplasmosis. Stevens-Johnson syndrome was reported in two patients with ocular toxoplasmosis and one with TE. The AE profile associated with pyrimethamine-based treatments differed by each manifestation of toxoplasmosis and within a given manifestation. Hematologic AEs occurred across all manifestations, indicating the importance of monitoring the blood of patients administered pyrimethamine-based regimens.

  2. Structural and Thermal Controls on the Frequency and Magnitude of Small-size Rockfall Events (European Swiss Alps)

    Science.gov (United States)

    Messenzehl, K.; Blöthe, J. H.; Dikau, R.

    2016-12-01

    In steep mountain terrain, rockwall erosion and sediment deposition commonly occur by rockfalls of different magnitudes and frequencies. To assess the natural hazard potential, it is essential to understand and predict the causes and frequencies of rockwall failure and to characterise which block sizes are deposited at a specific location. Contrary to large catastrophic instabilities, small-size rockfalls (systems in the Swiss Alps and aim to estimate their rockfall frequency-magnitude spectrum with respect to their controls. We present an integrated approach combining: (i) geotechnical scanline surveys and thermal studies of the source rockwall with (ii) engineering and geomorphic analyses of the talus slopes. (i) Each rockwall is linked to specific rockfall-prone block volumes (0.1-100 m3), in which wedge sliding or toppling dominates depending on the specific joint sets. The 2-year rock temperature data reveal further contrasting thermal regimes. Using the model by Hales and Roering (2007), we calculated cracking intensities from a few cm down to 200 cm bedrock depth. (ii) Along the talus slopes, we identified typical downslope gravity sorting, with mean block diameters ranging from 10 cm (apex) to 1.2 m (foot). Applying the approach by Evans and Hungr (1993), we found significant differences in the annual rockfall frequency with respect to block magnitude and landing position. Surprisingly, despite their similar topo-climatic, paraglacial and geological setting, each landform complex is characterised by different frequency-magnitude relationships. We attribute this to the mechanical and thermal variability of the source rock mass. While large-size wedge sliding due to seasonal frost cracking might result in low-frequency but high-magnitude talus deposition, a more homogeneous depositional signature is linked to small-size toppling released by near-surface frost action. Therefore, to improve natural hazard assessment in steep terrain, integrative studies on
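    A magnitude-cumulative-frequency analysis in the spirit of the Evans and Hungr (1993) approach cited above can be sketched as follows. The inventory is synthetic, and the record length, volume bins, and power-law exponent are illustrative assumptions, not values from the study:

```python
import numpy as np

def magnitude_frequency(volumes_m3, record_years, bins_m3):
    """Annual cumulative frequency of events >= each magnitude class, plus a
    power-law fit F(V) = a * V**(-b) in log-log space."""
    v = np.asarray(volumes_m3, dtype=float)
    cum_freq = np.array([(v >= b).sum() for b in bins_m3]) / record_years
    slope, intercept = np.polyfit(np.log10(bins_m3), np.log10(cum_freq), 1)
    return cum_freq, 10.0**intercept, -slope

# hypothetical 50-year inventory: many small blocks, few large ones
rng = np.random.default_rng(2)
vols = 0.1 * (1 - rng.random(2000)) ** (-1 / 1.2)  # Pareto tail, exponent ~1.2
bins = np.array([0.1, 0.3, 1.0, 3.0, 10.0])        # block volumes (m^3)
freq, a, b = magnitude_frequency(vols, record_years=50, bins_m3=bins)
```

    The fitted exponent `b` summarises exactly the kind of frequency-magnitude relationship the abstract reports as differing between landform complexes.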

  3. A New Quantum Key Distribution Scheme based on Frequency and Time Coding

    OpenAIRE

    Zhu, Chang-hua; Pei, Chang-xing; Quan, Dong-xiao; Chen, Nan; Yi, Yun-hui

    2010-01-01

    A new scheme of quantum key distribution (QKD) using frequency and time coding is proposed, in which the security is based on the frequency-time uncertainty relation. In this scheme, the binary information sequence is encoded randomly on either the central frequency or the time delay at the sender. The central frequency of the single-photon pulse is set to omega1 for bit "0" and to omega2 for bit "1" when frequency coding is selected. Meanwhile, the single-photon pulse is not delayed for bit ...

  4. Hail frequency estimation across Europe based on a combination of overshooting top detections and the ERA-INTERIM reanalysis

    Science.gov (United States)

    Punge, H. J.; Bedka, K. M.; Kunz, M.; Reinbold, A.

    2017-12-01

    This article presents a hail frequency estimation based on the detection of cold overshooting cloud tops (OTs) from the Meteosat Second Generation (MSG) operational weather satellites, in combination with a hail-specific filter derived from the ERA-INTERIM reanalysis. This filter has been designed based on the atmospheric properties in the vicinity of hail reports registered in the European Severe Weather Database (ESWD). These include Convective Available Potential Energy (CAPE), 0-6-km bulk wind shear and freezing level height, evaluated at the nearest time step and interpolated from the reanalysis grid to the location of the hail report. Regions highly exposed to hail events include Northern Italy, followed by South-Eastern Austria and Eastern Spain. Pronounced hail frequency is also found in large parts of Eastern Europe, around the Alps, the Czech Republic, Southern Germany, Southern and Eastern France, and in the Iberic and Apennine mountain ranges.
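    A minimal sketch of such a reanalysis-based environment filter is shown below. The threshold values are illustrative placeholders only, not the filter calibrated against the ESWD reports in the article:

```python
import numpy as np

def hail_filter(cape_jkg, shear_0_6km_ms, freezing_level_m):
    """Boolean hail-environment mask combining the three reanalysis
    variables named in the abstract; thresholds are illustrative."""
    return (cape_jkg > 400) & (shear_0_6km_ms > 5) & (freezing_level_m < 4500)

# three example grid points (values are invented)
cape = np.array([1200.0, 150.0, 900.0])    # J/kg
shear = np.array([12.0, 20.0, 3.0])        # m/s
fzl = np.array([3800.0, 3500.0, 3900.0])   # m
mask = hail_filter(cape, shear, fzl)       # only the first point passes
```

    In the article's workflow, a mask of this kind is evaluated at satellite overshooting-top detections so that only OTs in hail-supportive environments contribute to the frequency estimate.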

  5. Declarative Event-Based Workflow as Distributed Dynamic Condition Response Graphs

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2010-01-01

    We present Dynamic Condition Response Graphs (DCR Graphs) as a declarative, event-based process model inspired by the workflow language employed by our industrial partner and conservatively generalizing prime event structures. A dynamic condition response graph is a directed graph with nodes repr...

  6. Wildfire and debris-flows in South East Australian catchments: threshold conditions and magnitude-frequency of post-fire erosion events.

    Science.gov (United States)

    Sheridan, G. J.; Nyman, P.; Lane, P. J.

    2009-04-01

    The effect of wildfire on catchment processes is highly variable, depending on environmental and physical factors. Data from a range of conditions show elevated post-fire erosion rates on hillslopes and increased sediment export from burnt catchments. However, most controlled experiments and catchment monitoring studies are unlikely, and in most cases unable, to capture rare events such as debris flows. In south east Australia, extreme erosion events have been reported in a range of catchments following major wildfires over the last 50 years. The circumstances under which these events occurred, the processes involved, the volumes of material exported and the impacts on water quality have remained largely unexplored. In 2007, severely burnt catchments in north east Victoria produced a large number of mass erosion events following high-intensity convective storms. Data on rainfall duration and intensity were available from nearby locations. More than 30 catchments were studied and mapped in detail to provide quantitative data on the conditions required for the initiation of major erosion events after fire, and to estimate loads of sediment and other constituents delivered from the hillslopes and the channels of the study areas. The study established that the erosion events were triggered by runoff processes and sediment entrainment rather than mass failure, typical of fire-related debris flows reported from research in South Western USA. These loads are compared and contrasted with sediment load magnitude-frequency data collected from 5 fully instrumented burnt catchments in SE Australia with contrasting soil properties. This analysis has allowed an initial estimation of i) threshold conditions for debris flow initiation, and ii) recurrence intervals for fire-related debris flow events.

  7. Interaction between Gender and Skill on Competitive State Anxiety Using the Time-to-Event Paradigm: What Roles Do Intensity, Direction, and Frequency Dimensions Play?

    Directory of Open Access Journals (Sweden)

    John E. Hagan

    2017-05-01

    Full Text Available Background and purpose: The functional understanding and examination of competitive anxiety responses as temporal events that unfold as time-to-competition moves closer has emerged as a topical research area within sport psychology. However, little is known from an inclusive, interaction-oriented perspective. Using multidimensional anxiety theory as a framework, the present study examined the temporal patterning of competitive anxiety, focusing on the dimensions of intensity, direction, and frequency of intrusions in athletes across gender and skill level. Methods: Elite and semi-elite table tennis athletes from the Ghanaian league (N = 90) completed a modified version of the Competitive State Anxiety Inventory-2 (CSAI-2), with the addition of directional and frequency-of-intrusion scales, at three temporal phases (7 days, 2 days, and 1 h prior to a competitive fixture). Results: Repeated-measures multivariate analyses of variance with follow-up analyses revealed significant interactions for between-subjects factors on all anxiety dimensions (intensity, direction, and frequency). Notably, elite (international) female athletes were less cognitively anxious, showed more facilitative interpretation of somatic anxiety symptoms, and experienced somatic anxiety symptoms less frequently than their male counterparts. However, both elite groups displayed appreciable levels of self-confidence. For time-to-event effects, both cognitive and somatic anxiety intensity fluctuated, whereas self-confidence showed a steady rise as competition neared. Debilitative interpretation of somatic anxiety improved slightly 1 h before competition, whereas the frequency of cognitive anxiety intrusions increased progressively during the entire preparatory phase. Conclusion: Findings suggest a more dynamic picture of elite athletes' pre-competitive anxiety responses than suggested by earlier studies, potentially influenced by cultural differences. The use of psychological

  8. Mercury Atomic Frequency Standards for Space Based Navigation and Timekeeping

    Science.gov (United States)

    Tjoelker, R. L.; Burt, E. A.; Chung, S.; Hamell, R. L.; Prestage, J. D.; Tucker, B.; Cash, P.; Lutwak, R.

    2012-01-01

    A low-power Mercury Atomic Frequency Standard (MAFS) has been developed and demonstrated on the path towards future space clock applications. A self-contained mercury-ion breadboard clock, emulating flight clock interfaces, steering a USO local oscillator, and consuming approximately 40 W, has been operating at JPL for more than a year. This complete, modular ion clock instrument demonstrates that key GNSS size, weight, and power (SWaP) requirements can be achieved while still maintaining the short- and long-term performance demonstrated in previous ground ion clocks. The MAFS breadboard serves as a flexible platform for optimizing further space clock development and guides engineering-model design trades towards fabrication of an ion clock for space flight.

  9. High frequency modulation circuits based on photoconductive wide bandgap switches

    Energy Technology Data Exchange (ETDEWEB)

    Sampayan, Stephen

    2018-02-13

    Methods, systems, and devices for high-voltage and/or high-frequency modulation. In one aspect, an optoelectronic modulation system includes an array of two or more photoconductive switch units, each including a wide bandgap photoconductive (WBGP) material coupled between a first electrode and a second electrode; a light source optically coupled to the WBGP material of each photoconductive switch unit via a light path, in which the light path splits into multiple paths to optically interface with each WBGP material, such that a time delay of the emitted light exists along each subsequent split light path, and in which the WBGP material conducts an electrical signal when a light signal is transmitted to it; and an output to transmit the electrical signal conducted by each photoconductive switch unit. The time delay of the photons emitted through the light path is substantially equivalent to the time delay of the electrical signal.

  10. The Hassles Assessment Scale for Students in College: measuring the frequency and unpleasantness of and dwelling on stressful events.

    Science.gov (United States)

    Sarafino, E P; Ewing, M

    1999-09-01

    Development of the Hassles Assessment Scale for Students in College, a new scale to measure students' stress, is described. In the scale, students rate each of 54 hassles for its frequency and unpleasantness in the past month and indicate the degree to which they dwelt or ruminated on it. Very high levels of internal consistency for the frequency, unpleasantness, and dwelling measures were found. Correlational analyses demonstrated the scale's criterion validity (scores were negatively correlated with the number of hours respondents reported engaging in physical exercise) and congruent validity (scores were positively correlated with scores on the Inventory of College Students' Recent Life Experience, an established scale for assessing student hassles). Exploratory factor analyses suggested the possibility that many items on the scale are independent, with each contributing some specific variance to the total variance of the item pool that is not shared with other items.

  11. A new EMG frequency-based fatigue threshold test.

    Science.gov (United States)

    Hendrix, C Russell; Housh, Terry J; Johnson, Glen O; Mielke, Michelle; Camic, Clayton L; Zuniga, Jorge M; Schmidt, Richard J

    2009-06-30

    Theoretically, the critical torque (CT) and the electromyographic mean power frequency fatigue threshold (EMG MPF(FT)) describe the maximal non-fatiguing isometric torque level. The purposes of this study were two-fold: (1) to determine whether the mathematical model for estimating the EMG fatigue threshold (EMG(FT)) from the amplitude of the EMG signal is applicable to the frequency domain of the EMG signal, yielding a new fatigue threshold called the EMG MPF(FT); and (2) to compare the torque level derived from the CT test to that of the EMG MPF(FT) test for the vastus lateralis (VL) muscle during isometric muscle actions of the leg extensors. Nine adults (4 men and 5 women; mean ± SD age = 21.6 ± 1.2 yr) performed three or four continuous, fatiguing, isometric muscle actions of the leg extensors at 30, 45, 60, and 75% of maximum voluntary isometric contraction (MVIC) to determine the time-to-exhaustion (T(lim)) values. The slope coefficient of the linear relationship between total isometric "work" (W(lim) in N·m·s = Torque × T(lim)) and T(lim) was defined as the CT. Surface EMG signals were recorded from the VL muscle during each fatiguing isometric muscle action. The EMG MPF(FT) was defined as the y-intercept of the isometric torque versus slope coefficient (EMG MPF versus time) plot. There were no significant differences between CT (19.7 ± 5.8% MVIC) and EMG MPF(FT) (21.4 ± 8.7% MVIC). These findings provided indirect validation of the EMG MPF(FT) test.
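    The two regression steps described here (CT as the slope of the W(lim)-versus-T(lim) line, and EMG MPF(FT) as the y-intercept of the torque-versus-MPF-slope line) can be sketched with ordinary least squares. All numbers below are illustrative stand-ins, not the study's data.

```python
import numpy as np

# Illustrative (not the study's) time-to-exhaustion data at the four
# isometric torque levels used in the protocol.
torque = np.array([30.0, 45.0, 60.0, 75.0])   # %MVIC
t_lim = np.array([600.0, 180.0, 90.0, 50.0])  # time to exhaustion, s

# Critical torque: slope of total isometric "work" W_lim = torque * T_lim
# regressed on T_lim.
w_lim = torque * t_lim
ct, _ = np.polyfit(t_lim, w_lim, 1)           # slope -> CT, in %MVIC

# EMG MPF(FT): for each torque level, the EMG mean power frequency is
# regressed on time; the fatigue threshold is the y-intercept of the
# torque-versus-slope plot (the slope values below are hypothetical).
mpf_slope = np.array([-0.30, -0.90, -1.70, -2.60])  # Hz/s
_, emg_mpf_ft = np.polyfit(mpf_slope, torque, 1)    # intercept -> EMG MPF(FT)
```

    With these stand-in numbers both estimates land in the mid-20s of %MVIC; the study itself reports thresholds near 20% MVIC.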

  12. Future frequencies of extreme weather events in the National Wildlife Refuges of the conterminous U.S.

    Science.gov (United States)

    Martinuzzi, Sebastian; Allstadt, Andrew J.; Bateman, Brooke L.; Heglund, Patricia J.; Pidgeon, Anna M.; Thogmartin, Wayne E.; Vavrus, Stephen J.; Radeloff, Volker C.

    2016-01-01

    Climate change is a major challenge for managers of protected areas world-wide, and managers need information about future climate conditions within protected areas. Prior studies of climate change effects in protected areas have largely focused on average climatic conditions. However, extreme weather may have stronger effects on wildlife populations and habitats than changes in averages. Our goal was to quantify future changes in the frequency of extreme heat, drought, and false springs, during the avian breeding season, in 415 National Wildlife Refuges in the conterminous United States. We analyzed spatially detailed data on extreme weather frequencies during the historical period (1950–2005) and under different scenarios of future climate change by mid- and late-21st century. We found that all wildlife refuges will likely experience substantial changes in the frequencies of extreme weather, but the types of projected changes differed among refuges. Extreme heat is projected to increase dramatically in all wildlife refuges, whereas changes in droughts and false springs are projected to increase or decrease on a regional basis. Half of all wildlife refuges are projected to see increases in frequency (> 20% higher than the current rate) in at least two types of weather extremes by mid-century. Wildlife refuges in the Southwest and Pacific Southwest are projected to exhibit the fastest rates of change, and may deserve extra attention. Climate change adaptation strategies in protected areas, such as the U.S. wildlife refuges, may need to seriously consider future changes in extreme weather, including the considerable spatial variation of these changes.

  13. Optimizing graph-based patterns to extract biomedical events from the literature

    OpenAIRE

    Liu, Haibin; Verspoor, Karin; Comeau, Donald C; MacKinlay, Andrew D; Wilbur, W John

    2015-01-01

    We participated in the BioNLP-ST 2013 shared tasks on event extraction. Our extraction method is based on the search for an approximate subgraph isomorphism between key context dependencies of events and graphs of input sentences. Our system was able to address both the GENIA (GE) task, focusing on 13 molecular biology related event types, and the Cancer Genetics (CG) task, targeting a challenging group of 40 cancer biology related event types with varying arguments concerning 18 ...

  14. Repeating LP events and increases in high-frequency seismic energy preceding the December 1999 eruption of the quiescently active Telica Volcano, Nicaragua

    Science.gov (United States)

    Rodgers, M.; Roman, D. C.; Geirsson, H.; Lafemina, P.; Muñoz, A.; Guzman, C.; Tenorio, V.

    2010-12-01

    Telica volcano, Nicaragua, is a 'quiescently active' basaltic andesite stratovolcano located in the Central American volcanic front. A high rate of long-period (LP) seismicity has been recorded at Telica since the installation of a single vertical-component 1 Hz seismic sensor (TELN) near its summit in 1993 by the Instituto Nicaragüense de Estudios Territoriales (INETER). Because of the continuously high rate of LPs at Telica, traditional methods of forecasting volcanic activity may not be applicable; an understanding of the nature of precursory changes in Telica's seismicity is therefore necessary to accurately forecast future volcanic activity. A VEI 2 eruption of Telica occurred on 29 December 1999, preceded by a series of small explosions between 3 and 15 October 1999. Here we analyse an eight-month period of seismicity bracketing this activity, in an attempt to identify precursory changes with respect to background seismicity. Between August 1999 and March 2000 over 18,000 seismic events were recorded on TELN. We first calculated the dominant frequencies (i.e., the frequency with dominant spectral energy) for all events recorded during this period. A time series of the dominant event frequencies between August 1999 and March 2000 shows a significant increase in the number of high-frequency (> 5 Hz) events and, in LP events, a shift in the two dominant spectral energy peaks from 2 Hz and 4 Hz to 2 Hz and 3 Hz in the month before the October 1999 explosions. Next, we selected six representative eight-day periods, three from before the explosions and three from after, for multiplet analysis using waveform cross-correlation. Multiplet analysis of the six selected time periods reveals significant changes in behaviour. In period 1 (more than one month before the explosions) events are poorly correlated. In periods 2 and 3 (less than one month before the explosions) we identified several unique families of LP events, each having high cross-correlation values

  15. The typology, frequency and magnitude of some behaviour events in case of torrential hydrographical management works in the upper Tarlung watershed

    Directory of Open Access Journals (Sweden)

    Ioan Clinciu

    2010-09-01

    Full Text Available During the 20-25 years since their startup, the torrential hydrographical management works carried out in the upper Tărlung watershed (55 dams, 22 sills, 25 traverses and 4 outlet canals) have exhibited 24 types of behaviour events: 13 of them reduce the safety of exploitation and the sustainability of the works (hereinafter called damages), while the other 11 reduce the functionality of the works (hereinafter called dysfunctionalities). The following behaviour events have the highest frequency: (i) damages caused by water and alluvia erosion (erosive damages), followed by breakages, in the category of damages, and (ii) unsupervised installation of forest vegetation on the managed torrential hydrographical network and apron siltation, in the category of dysfunctionalities. For methodological reasons, only the erosive damage of works was analysed successively, according to two criteria: the average depth (cm) of the eroded area and the percentage of the eroded area out of the total surface. Further on, by combining the two criteria, five representation areas with the same damage intensity were defined (very low, low, medium, high and very high intensity). With the aid of the event frequency values recorded in these areas and of the coefficients attributed to each intensity class (from 1 for very low intensity to 5 for very high intensity), the author reached the conclusion that the recorded intensity of the damage caused by water and alluvia erosion ranged from very low to low.

  16. Multi-Crop Specific Area Frame Stratification Based on Geospatial Crop Planting Frequency Data Layers

    Science.gov (United States)

    Boryan, C. G.; Yang, Z.; Willis, P.; Di, L.

    2016-12-01

    Area sampling frames (ASFs) are the basis of many statistical programs around the world. When an ASF's stratification is based on generalized percent cultivation, the ASF usually cannot identify the planting locations of specific crops targeted by agricultural surveys. To improve the accuracy, objectivity and efficiency of crop survey estimates, an automated stratification method based on geospatial crop planting frequency data is proposed. The Crop Planting Frequency Data Layers are crop-specific geospatial data sets derived from multi-year Cropland Data Layers; an ASF stratification based on them is therefore crop specific. This paper investigates using the 2008-2013 geospatial Crop Frequency Data Layers to create a novel multi-crop specific stratification for South Dakota, U.S. The crop-specific ASF stratification is developed from crop frequency statistics calculated at the primary sampling unit (PSU) level from the corn, soybean and wheat planting frequency data layers, the three major crops in South Dakota. Strata are formed using a k-means clustering algorithm. The crop frequency based ASF stratification predicts corn, soybean and wheat planting patterns well, as verified by 2014 Farm Service Agency (FSA) Common Land Unit (CLU) and 578 administrative data. This finding demonstrates that the novel multi-crop specific stratification based on crop planting frequency data is crop-type independent and applicable to all major crops. Further, these results indicate that the new multi-crop specific ASF stratification has great potential to improve ASF accuracy, efficiency and crop estimates.
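    The stratification step (k-means on PSU-level planting frequencies) can be reproduced in a few lines. The PSU features, cluster count, and separations below are invented for illustration, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical PSU-level features: number of years (out of 2008-2013)
# each PSU was planted to corn, soybean, and wheat.
psu = np.vstack([
    rng.normal([5.0, 4.5, 0.5], 0.5, size=(40, 3)),  # row-crop PSUs
    rng.normal([0.5, 0.5, 4.5], 0.5, size=(40, 3)),  # wheat PSUs
    rng.normal([0.3, 0.3, 0.3], 0.3, size=(40, 3)),  # non-cropland PSUs
])

def kmeans(x, k, iters=50, seed=0):
    """Plain k-means: assign each PSU to its nearest centroid, then
    recompute centroids from the assignments, and repeat."""
    r = np.random.default_rng(seed)
    centroids = x[r.choice(len(x), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(x[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = x[labels == j].mean(axis=0)
    return labels, centroids

strata, centers = kmeans(psu, k=3)   # stratum label per PSU
```

    In practice the paper computes the frequency statistics from the raster Crop Frequency Data Layers before clustering; here the features are drawn directly for brevity.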

  17. Joint Estimation of Time-Frequency Signature and DOA Based on STFD for Multicomponent Chirp Signals.

    Science.gov (United States)

    Zhao, Ziyue; Liu, Congfeng

    2014-01-01

    In the study of the joint estimation of the time-frequency signature and direction of arrival (DOA) for multicomponent chirp signals, an estimation method based on spatial time-frequency distributions (STFDs) is proposed in this paper. First, an array signal model for multicomponent chirp signals is presented, and array processing is applied in the time-frequency analysis to mitigate cross-terms. A Hough transform is then performed on the array processing results to obtain the estimate of the time-frequency signature. Subsequently, DOA estimation is achieved with a subspace method based on the STFD matrix. Simulation results demonstrate the validity of the proposed method.
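    A minimal numpy sketch of the subspace step: a single chirp impinges on a hypothetical 8-element half-wavelength ULA, and the plain sample covariance stands in for the STFD matrix averaged along the chirp's time-frequency signature. That substitution is a simplification; the eigendecomposition and MUSIC-style spectrum search are the same.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, d = 8, 2048, 0.5          # sensors, snapshots, spacing (wavelengths)
theta_true = 20.0               # hypothetical DOA, degrees

# Linear chirp (normalized instantaneous frequency 0.05 -> ~0.25).
t = np.arange(n)
chirp = np.exp(1j * 2 * np.pi * (0.05 * t + 0.5 * 1e-4 * t**2))

# Array snapshots: steering vector times chirp, plus complex noise.
a = np.exp(-2j * np.pi * d * np.arange(m) * np.sin(np.radians(theta_true)))
x = np.outer(a, chirp) + 0.05 * (rng.standard_normal((m, n))
                                 + 1j * rng.standard_normal((m, n)))

r_xx = x @ x.conj().T / n                 # covariance (STFD stand-in)
w, v = np.linalg.eigh(r_xx)               # eigenvalues ascending
noise = v[:, :-1]                         # noise subspace (one source)

# MUSIC-style spectrum search over a DOA grid.
grid = np.linspace(-90, 90, 721)
steer = np.exp(-2j * np.pi * d * np.arange(m)[:, None]
               * np.sin(np.radians(grid))[None, :])
p_music = 1.0 / np.linalg.norm(noise.conj().T @ steer, axis=0)**2
theta_hat = grid[p_music.argmax()]
```

    With multiple chirps, averaging the STFD along each component's estimated time-frequency signature (the Hough step) is what separates the sources before this subspace stage.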

  18. Soil Moisture Sensing via Swept Frequency Based Microwave Sensors

    Directory of Open Access Journals (Sweden)

    Greg A. Holt

    2012-01-01

    Full Text Available There is a need for low-cost, high-accuracy measurement of water content in various materials. This study assesses the performance of a new microwave swept frequency domain instrument (SFI) that promises to provide a low-cost, high-accuracy alternative to the traditional and more expensive time domain reflectometry (TDR). The technique obtains permittivity measurements of soils in the frequency domain using a through-transmission configuration (transmissometry), which provides a frequency domain transmissometry (FDT) measurement. The measurement is comparable to time domain transmissometry (TDT), with the added advantage of being able to separately quantify the real and imaginary parts of the complex permittivity, so that the measured bulk permittivity is more accurate than that provided by TDR, in which the apparent permittivity is impacted by signal loss, which can be significant in heavier soils. The experimental SFI was compared with a high-end 12 GHz TDR/TDT system across a range of soils at varying soil water contents and densities. As propagation delay is the fundamental measurement of interest to the well-established TDR and TDT techniques, the first set of tests utilized precision propagation delay lines to test the SFI instrument's ability to resolve propagation delays across the range that a soil probe would present when subjected to the expected range of soil types and soil moisture typical of an agronomic cropping system. The results of the precision delay-line testing suggest the instrument is capable of predicting propagation delays with an RMSE of ±105 ps across delays ranging from 0 to 12,000 ps, with a coefficient of determination of r2 = 0.998. The second phase of tests leveraged the rich history of TDR for prediction of soil moisture by utilizing TDT measured with a high-end Hewlett Packard TDR/TDT instrument to directly benchmark the

  19. Pymote: High Level Python Library for Event-Based Simulation and Evaluation of Distributed Algorithms

    National Research Council Canada - National Science Library

    Arbula, Damir; Lenac, Kristijan

    2013-01-01

    .... Simulation is a fundamental part of distributed algorithm design and evaluation process. In this paper, we present a library for event-based simulation and evaluation of distributed algorithms...

  20. Tunable antenna radome based on graphene frequency selective surface

    National Research Council Canada - National Science Library

    Qu, Meijun; Rao, Menglou; Li, Shufang; Deng, Li

    2017-01-01

    ... to the alterable conductivity of the graphene strips which is controlled by chemical potential. Based on the reconfigurable bandpass property of the proposed FSS, a cylindrical antenna radome is designed using the FSS unit cells...

  1. Dynamic phasor based frequency scanning for grid-connected ...

    Indian Academy of Sciences (India)

    M K Das

    2017-10-11

    ... connected power electronic systems (PESs). These models are required ... electronics-based power system.

  2. High-performance radio frequency transistors based on diameter-separated semiconducting carbon nanotubes

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Yu; Che, Yuchi; Zhou, Chongwu, E-mail: chongwuz@usc.edu [Department of Electrical Engineering, University of Southern California, Los Angeles, California 90089 (United States); Seo, Jung-Woo T.; Hersam, Mark C. [Department of Materials Science and Engineering and Department of Chemistry, Northwestern University, Evanston, Illinois 60208 (United States); Gui, Hui [Department of Chemical Engineering and Materials Science, University of Southern California, Los Angeles, California 90089 (United States)

    2016-06-06

    In this paper, we report high-performance radio-frequency transistors based on single-walled semiconducting carbon nanotubes with a refined average diameter of ∼1.6 nm. These diameter-separated carbon nanotube transistors show an excellent transconductance of 55 μS/μm and desirable drain-current saturation with an output resistance of ∼100 kΩ·μm. Exceptional radio-frequency performance is also achieved, with current-gain and power-gain cut-off frequencies of 23 GHz and 20 GHz (extrinsic) and 65 GHz and 35 GHz (intrinsic), respectively. These radio-frequency metrics are among the highest reported for carbon nanotube thin-film transistors. This study demonstrates radio-frequency transistors based on carbon nanotubes with tailored diameter distributions, which will guide the future application of carbon nanotubes in radio-frequency electronics.

  3. Knowledge-Driven Event Extraction in Russian: Corpus-Based Linguistic Resources

    OpenAIRE

    Solovyev, Valery; Ivanov, Vladimir

    2016-01-01

    Automatic event extraction from text is an important step in knowledge acquisition and knowledge base population. Manual work in the development of an extraction system is indispensable, whether in corpus annotation or in creating vocabularies and patterns for a knowledge-based system. Recent work has focused on adapting existing systems (for extraction from English texts) to new domains. Event extraction in other languages has not been studied, due to the lack of resources and algorithms necessar...

  4. Real-Time Gait Event Detection Based on Kinematic Data Coupled to a Biomechanical Model ?

    OpenAIRE

    Lambrecht, Stefan; Harutyunyan, Anna; Tanghe, Kevin; Afschrift, Maarten; De Schutter, Joris; Jonkers, Ilse

    2017-01-01

    Real-time detection of multiple stance events, more specifically initial contact (IC), foot flat (FF), heel off (HO), and toe off (TO), could greatly benefit neurorobotic (NR) and neuroprosthetic (NP) control. Three real-time threshold-based algorithms have been developed, detecting the aforementioned events based on kinematic data in combination with a biomechanical model. Data from seven subjects walking at three speeds on an instrumented treadmill were used to validate the presented algori...

  5. Multiple frequencies sequential coding for SSVEP-based brain-computer interface.

    Directory of Open Access Journals (Sweden)

    Yangsong Zhang

    Full Text Available BACKGROUND: The steady-state visual evoked potential (SSVEP)-based brain-computer interface (BCI) has become one of the most promising modalities for a practical noninvasive BCI system. Owing both to the limited refresh rate of liquid crystal display (LCD) or cathode ray tube (CRT) monitors, and to the specific physiological response property that only a very small number of stimuli at certain frequencies can evoke strong SSVEPs, the frequencies available for SSVEP stimuli are limited. Therefore, it may not be possible to code enough targets with traditional frequency coding protocols, which poses a big challenge for the design of a practical SSVEP-based BCI. This study aimed to provide an innovative coding method to tackle this problem. METHODOLOGY/PRINCIPAL FINDINGS: In this study, we present a novel protocol termed multiple frequencies sequential coding (MFSC) for SSVEP-based BCI. In MFSC, multiple frequencies are sequentially used in each cycle to code the targets. To fulfill the sequential coding, each cycle is divided into several coding epochs, and during each epoch a certain frequency is used. Different frequencies or the same frequency can be presented in the coding epochs, and different epoch sequences correspond to different targets. To show the feasibility of MFSC, we used two frequencies to realize four targets and carried out an offline experiment. The current study shows that: (1) MFSC is feasible and efficient; (2) the performance of an SSVEP-based BCI based on MFSC can be comparable to some existing systems. CONCLUSIONS/SIGNIFICANCE: The proposed protocol could potentially implement many more targets with the limited available frequencies compared with traditional frequency coding protocols. The efficiency of the new protocol was confirmed by a real-data experiment. We propose that the SSVEP-based BCI under MFSC might be a promising choice in the future.
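    The coding idea is compact enough to sketch: with n frequencies and m coding epochs per cycle, n**m targets can be distinguished. The frequencies and the nearest-frequency decoding rule below are hypothetical placeholders for the actual per-epoch SSVEP frequency-detection step.

```python
from itertools import product

# Hypothetical stimulus frequencies and epochs per cycle: 2 frequencies
# over 2 epochs code 2**2 = 4 targets.
freqs_hz = [10.0, 15.0]
n_epochs = 2

# Codebook: target index -> frequency sequence, one per epoch.
codebook = {i: seq for i, seq in enumerate(product(freqs_hz, repeat=n_epochs))}
# e.g. target 0 -> (10, 10), target 1 -> (10, 15), target 2 -> (15, 10), ...

def decode(epoch_freqs):
    """Map per-epoch dominant SSVEP frequencies back to a target index
    by snapping each epoch to the nearest stimulus frequency."""
    seq = tuple(min(freqs_hz, key=lambda f: abs(f - g)) for g in epoch_freqs)
    return next(i for i, s in codebook.items() if s == seq)
```

    The same construction scales: 4 frequencies over 3 epochs would already code 64 targets, which is the protocol's main advantage over one-frequency-per-target coding.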

  6. THE EFFECT OF DEVOTEE-BASED BRAND EQUITY ON RELIGIOUS EVENTS

    Directory of Open Access Journals (Sweden)

    MUHAMMAD JAWAD IQBAL

    2016-04-01

    Full Text Available The objective of this research is to apply the devotee-based brand equity (DBBE) model to discover the constructs that measure a religious event as a business brand, based on devotees' perceptions. SEM was applied to test the hypothesized model, with CFA used to analyze the model and assess model fit. The sample size was 500. Brand loyalty was directly affected by image and quality. This information may benefit event management and sponsors in brand building and in operating visitor destinations. More importantly, the brands of these religious events in Pakistan can be built into strong tourism products.

  7. Event-Based Control for Average Consensus of Wireless Sensor Networks with Stochastic Communication Noises

    Directory of Open Access Journals (Sweden)

    Chuan Ji

    2013-01-01

    Full Text Available This paper focuses on the average consensus problem for wireless sensor networks (WSNs) with fixed or Markovian-switching, undirected and connected network topologies in a noisy environment. An event-based protocol is applied at each sensor node to reach consensus. An event-triggering strategy is designed based on a Lyapunov function. Under the event-trigger condition, sufficient conditions for average consensus in mean square are obtained. Finally, numerical simulations are given to illustrate the effectiveness of the results derived in this paper.
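    A minimal sketch of the event-triggered idea (noise-free and fixed topology for brevity): each node rebroadcasts its state only when it drifts from its last broadcast value past a threshold, and neighbours integrate the broadcast values. The graph, gain, and deadband threshold below are invented for illustration; the paper's trigger is Lyapunov-based rather than this simple deadband.

```python
import numpy as np

# Fixed, undirected, connected 4-node path graph (adjacency matrix).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

x = np.array([4.0, 0.0, 2.0, 6.0])   # initial states; average = 3
x_hat = x.copy()                     # last broadcast states
eps, delta, steps = 0.05, 0.01, 400  # gain, trigger threshold, iterations

for _ in range(steps):
    # Event trigger: rebroadcast only when deviation exceeds delta.
    trigger = np.abs(x - x_hat) > delta
    x_hat[trigger] = x[trigger]
    # Consensus update driven by broadcast (not true) states; with a
    # symmetric graph this preserves the average exactly.
    x = x + eps * (A @ x_hat - A.sum(axis=1) * x_hat)
```

    Because updates use only broadcast values, communication happens at events rather than every step, and the states settle into a neighbourhood of the initial average whose size scales with the threshold.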

  8. Duration and frequency of migraines affect cognitive function: evidence from neuropsychological tests and event-related potentials.

    Science.gov (United States)

    Huang, Lifang; Juan Dong, Hong; Wang, Xi; Wang, Yan; Xiao, Zheman

    2017-12-01

    The aim of this study was to evaluate changes in the cognitive performance of migraine patients using a comprehensive series of cognitive/behavioral and electrophysiological tests. A randomized, cross-sectional, within-subject approach was used to compare neuropsychological and electrophysiological evaluations from migraine-affected and healthy subjects. Thirty-four patients with migraine (6 males, 28 females, average 36 years old) were included. Migraineurs performed worse on the majority of the Montreal Cognitive Assessment (MoCA) (p = 0.007) compared to the healthy subjects, significantly so in language (p = 0.005), memory (p = 0.006), executive functions (p = 0.042), calculation (p = 0.018) and orientation (p = 0.012). Migraineurs had a lower score on the memory trial of the Rey-Osterrieth complex figure test (ROCF) (p = 0.012). The P3 latency at Fz, Cz and Pz was prolonged in migraineurs compared with the normal control group (P migraine. We also observed that decreases in the MoCA executive-functions and calculation scores and in the ROCF recall score were both correlated with the frequency of migraine. Migraineurs were more anxious than healthy subjects (p = 0.001), independent of cognitive testing. Differences were unrelated to age, gender and literacy. Cognitive performance decreases during migraine, and cognitive dysfunction can be related to the duration and frequency of migraine attacks.

  9. Gait Event Detection in Real-World Environment for Long-Term Applications: Incorporating Domain Knowledge Into Time-Frequency Analysis.

    Science.gov (United States)

    Khandelwal, Siddhartha; Wickstrom, Nicholas

    2016-12-01

    Detecting gait events is the key to many gait analysis applications that would benefit from continuous monitoring or long-term analysis. Most gait event detection algorithms using wearable sensors that offer a potential for use in daily living have been developed from data collected in controlled indoor experiments. However, for real-world applications, it is essential that the analysis is carried out in humans' natural environment, which involves different gait speeds, changing walking terrains, varying surface inclinations and regular turns, among other factors. Existing domain knowledge, in the form of principles or underlying fundamental gait relationships, can be utilized to drive and support the data analysis in order to develop robust algorithms that can tackle real-world challenges in gait analysis. This paper presents a novel approach that shows how domain knowledge about human gait can be incorporated into time-frequency analysis to detect gait events from long-term accelerometer signals. The accuracy and robustness of the proposed algorithm are validated by experiments done in indoor and outdoor environments with approximately 93 600 gait events in total. The proposed algorithm exhibits consistently high performance scores across all datasets in both indoor and outdoor environments.

  10. An expert elicitation process to project the frequency and magnitude of Florida manatee mortality events caused by red tide (Karenia brevis)

    Science.gov (United States)

    Martin, Julien; Runge, Michael C.; Flewelling, Leanne J.; Deutsch, Charles J.; Landsberg, Jan H.

    2017-11-20

    Red tides (blooms of the harmful alga Karenia brevis) are one of the major sources of mortality for the Florida manatee (Trichechus manatus latirostris), especially in southwest Florida. It has been hypothesized that the frequency and severity of red tides may increase in the future because of global climate change and other factors. To improve our ecological forecast for the effects of red tides on manatee population dynamics and long-term persistence, we conducted a formal expert judgment process to estimate probability distributions for the frequency and relative magnitude of red-tide-related manatee mortality (RTMM) events over a 100-year time horizon in three of the four regions recognized as manatee management units in Florida. This information was used to update a population viability analysis for the Florida manatee (the Core Biological Model). We convened a panel of 12 experts in manatee biology or red-tide ecology; the panel met to frame, conduct, and discuss the elicitation. Each expert provided a best estimate and plausible low and high values (bounding a confidence level of 80 percent) for each parameter in each of three regions (Northwest, Southwest, and Atlantic) of the subspecies’ range (excluding the Upper St. Johns River region) for two time periods (0−40 and 41−100 years from present). We fitted probability distributions for each parameter, time period, and expert by using these three elicited values. We aggregated the parameter estimates elicited from individual experts and fitted a parametric distribution to the aggregated results.Across regions, the experts expected the future frequency of RTMM events to be higher than historical levels, which is consistent with the hypothesis that global climate change (among other factors) may increase the frequency of red-tide blooms. The experts articulated considerable uncertainty, however, about the future frequency of RTMM events. The historical frequency of moderate and intense RTMM (combined) in
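    One common way to turn three elicited values into the probability distributions the abstract describes is to fit a two-parameter family to the elicited quantiles. The stdlib-only sketch below fits a lognormal by regressing log quantiles on standard-normal z-scores; the elicited numbers are invented for illustration, not the panel's.

```python
from statistics import NormalDist
import math

# Hypothetical expert elicitation: 10th, 50th, and 90th percentiles
# (bounding an 80% interval) for, say, RTMM events per decade.
elicited = {0.10: 1.0, 0.50: 3.0, 0.90: 8.0}   # p -> elicited value

# Lognormal fit: ln(q_p) = mu + sigma * z_p, solved by least squares.
z = [NormalDist().inv_cdf(p) for p in elicited]
y = [math.log(v) for v in elicited.values()]

n = len(z)
zbar, ybar = sum(z) / n, sum(y) / n
sigma = (sum((zi - zbar) * (yi - ybar) for zi, yi in zip(z, y))
         / sum((zi - zbar) ** 2 for zi in z))
mu = ybar - sigma * zbar

median = math.exp(mu)   # fitted median of the elicited quantity
```

    Per-expert fits produced this way can then be pooled, as the study does, by aggregating across experts and fitting a parametric distribution to the combined result.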

  11. A semi-supervised learning framework for biomedical event extraction based on hidden topics.

    Science.gov (United States)

    Zhou, Deyu; Zhong, Dayou

    2015-05-01

    Scientists have devoted decades of effort to understanding interactions between proteins and RNA production. This information could extend current knowledge of drug reactions or the development of certain diseases. Nevertheless, the lack of explicit structure in life-science literature, one of the most important sources of this information, prevents computer-based systems from accessing it. Therefore, biomedical event extraction, which automatically acquires knowledge of molecular events from research articles, has recently attracted community-wide effort. Most approaches are based on statistical models and require large-scale annotated corpora to estimate model parameters precisely, which are usually difficult to obtain in practice. Employing un-annotated data through semi-supervised learning is therefore a feasible solution for biomedical event extraction and is attracting growing interest. In this paper, a semi-supervised learning framework based on hidden topics for biomedical event extraction is presented. In this framework, sentences in the un-annotated corpus are automatically assigned event annotations based on their distances to sentences in the annotated corpus. More specifically, not only the structures of the sentences but also the hidden topics embedded in them are used to define this distance. The sentences with newly assigned event annotations, together with the annotated corpus, are then employed for training. Experiments were conducted on the multi-level event extraction corpus, a gold-standard corpus. Experimental results show that the proposed framework achieves an improvement of more than 2.2% in F-score on biomedical event extraction compared to the state-of-the-art approach. The results suggest that by incorporating un-annotated data, the proposed framework indeed improves the performance of the state-of-the-art event extraction system and the similarity between sentences might be precisely
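The pseudo-labeling step described in record 11 (assigning annotations to un-annotated sentences based on distance to annotated ones) can be illustrated with nearest-neighbor assignment over feature vectors. This is a simplified sketch, not the paper's exact method: the construction of the structural and hidden-topic features themselves (e.g. topic inference) is omitted, and the event labels are hypothetical:

```python
import numpy as np

def assign_pseudo_labels(unlab_feats, lab_feats, labels):
    """Assign each un-annotated sentence the label of its nearest
    annotated sentence under cosine similarity of feature vectors."""
    a = unlab_feats / np.linalg.norm(unlab_feats, axis=1, keepdims=True)
    b = lab_feats / np.linalg.norm(lab_feats, axis=1, keepdims=True)
    nearest = (a @ b.T).argmax(axis=1)   # index of most similar annotated sentence
    return [labels[i] for i in nearest]

# Toy feature vectors (e.g. concatenated structure + topic proportions).
lab = np.array([[1.0, 0.0], [0.0, 1.0]])
unlab = np.array([[0.9, 0.1], [0.2, 0.7]])
pseudo = assign_pseudo_labels(unlab, lab, ["Binding", "Phosphorylation"])
```

The pseudo-labeled sentences would then be pooled with the annotated corpus for training, as the abstract describes.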

  12. Spatial distribution and frequency of precipitation during an extreme event: July 2006 mesoscale convective complexes and floods in southeastern Arizona

    Science.gov (United States)

    Griffiths, P.G.; Magirl, C.S.; Webb, R.H.; Pytlak, E.; Troch, Peter A.; Lyon, S.W.

    2009-01-01

    An extreme, multiday rainfall event over southeastern Arizona during 27-31 July 2006 caused record flooding and a historically unprecedented number of slope failures and debris flows in the Santa Catalina Mountains north of Tucson. An unusual synoptic weather pattern induced repeated nocturnal mesoscale convective systems over southeastern Arizona for five continuous days, generating multiday rainfall totals up to 360 mm. Analysis of point rainfall and weather radar data yielded storm totals for the southern Santa Catalina Mountains at 754 grid cells approximately 1 km × 1 km in size. Precipitation intensity for the 31 July storms was not unusual for typical monsoonal precipitation in this region, but recurrence intervals (RI) for the multiday totals exceeded 50 years, and individual grid cells had RI exceeding 1000 years. The 31 July storms fell on watersheds that were essentially saturated following 4 days of rainfall. Copyright 2009 by the American Geophysical Union.

  13. Temporal changes in allele frequencies in a small marble trout Salmo marmoratus population threatened by extreme flood events.

    Science.gov (United States)

    Pujolar, J M; Vincenzi, S; Zane, L; Crivelli, A J

    2016-03-01

    The effect of extreme floods on the genetic composition of marble trout Salmo marmoratus living in Lipovscek, a tributary of the Soca River in Slovenia that has been affected by multiple destructive flood events for centuries, was investigated. By monitoring genetic variability during the period 2004-2011, apparent signatures of genetic erosion, including a decline in observed and expected heterozygosities and allelic richness, were observed. Contemporary effective population size was estimated at between 11 and 55 individuals, which is congruent with census data. The data suggest asymmetric gene flow between the two sections of the river. The existence of substantial downstream migration (15-19%) was confirmed by paternity analysis. A small (1-3%) upstream migration was also suggested and was confirmed by tagging data. Overall, low genetic diversity has not prevented the survival of the Lipovscek population, which might be a common feature of salmonid freshwater populations. © 2016 The Fisheries Society of the British Isles.

  14. Comparison of Frequency of Cardiovascular Events and Mortality in Patients With Heart Failure Using Versus Not Using Cocaine.

    Science.gov (United States)

    Nguyen, Peter; Kamran, Hassan; Nasir, Saifullah; Chan, Wenyaw; Shah, Tina; Deswal, Anita; Bozkurt, Biykem

    2017-06-15

    Beta-blocker treatment improves left ventricular function, morbidity, and survival in patients with systolic heart failure (HF). However, there are limited data addressing the safety and efficacy of β blockers in the setting of cocaine use, as there is a perceived risk of adverse outcomes. Our aim was to determine whether beta-blocker treatment was safe in HF patients with a history of cocaine use compared with HF patients without a history of cocaine use. We also examined whether effects differed between cardioselective and noncardioselective β blockers. Ninety systolic HF patients with cocaine use were compared with 177 patients with nonischemic systolic HF and no cocaine use. Outcomes were HF readmissions, major adverse cardiovascular events, and death, using multivariable Cox proportional hazard models adjusted for age, black race, hypertension, diabetes, coronary artery disease, renal insufficiency, and angiotensin-converting enzyme inhibitors. Beta-blocker treatment in systolic HF patients with cocaine use did not show significant differences in HF readmissions (hazard ratio [HR] 0.66, 95% CI 0.31 to 1.38), major adverse cardiovascular events (HR 0.58, 95% CI 0.27 to 1.09), death (HR 0.96, 95% CI 0.39 to 2.34), or all combined outcomes (HR 0.76, 95% CI 0.39 to 1.47) compared with beta-blocker treatment in HF patients without cocaine use. Within HF patients with cocaine use, mortality rates (HR 1.50, 95% CI 0.28 to 8.23) were not significantly different between patients treated with noncardioselective versus cardioselective β blockers. In conclusion, beta-blocker treatment in systolic HF patients with cocaine use was not associated with adverse outcomes. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Prospective memory while driving: comparison of time- and event-based intentions.

    Science.gov (United States)

    Trawley, Steven L; Stephens, Amanda N; Rendell, Peter G; Groeger, John A

    2017-06-01

    Prospective memories can divert attentional resources from ongoing activities. However, it is unclear whether these effects and the theoretical accounts that seek to explain them will generalise to a complex real-world task such as driving. Twenty-four participants drove two simulated routes while maintaining a fixed headway with a lead vehicle. Drivers were given either event-based (e.g. arriving at a filling station) or time-based errands (e.g. on-board clock shows 3:30). In contrast to the predominant view in the literature which suggests time-based tasks are more demanding, drivers given event-based errands showed greater difficulty in mirroring lead vehicle speed changes compared to the time-based group. Results suggest that common everyday secondary tasks, such as scouting the roadside for a bank, may have a detrimental impact on driving performance. The additional finding that this cost was only evident with the event-based task highlights a potential area of both theoretical and practical interest. Practitioner Summary: Drivers were given either time- or event-based errands whilst engaged in a simulated drive. We examined the effect of errands on an ongoing vehicle follow task. In contrast to previous non-driving studies, event-based errands are more disruptive. Common everyday errands may have a detrimental impact on driving performance.

  16. Study on frequency characteristics of wireless power transmission system based on magnetic coupling resonance

    Science.gov (United States)

    Liang, L. H.; Liu, Z. Z.; Hou, Y. J.; Zeng, H.; Yue, Z. K.; Cui, S.

    2017-11-01

    In order to study the frequency characteristics of a wireless power transmission system based on magnetic coupling resonance, a circuit model of the magnetically coupled resonant wireless power transmission system is established. The influence of the load on the frequency characteristics of the system is analysed, and circuit coupling theory is used to derive the minimum load required to suppress frequency splitting. Simulation and experimental results verify that when the load falls below a certain value, the system exhibits frequency splitting, and that increasing the load can effectively suppress the frequency-splitting phenomenon. A power regulation scheme for the wireless charging system based on magnetic coupling resonance is given. This study provides a theoretical basis for load selection and power regulation in wireless power transmission systems.
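The load-dependent frequency splitting described in record 16 can be reproduced numerically for a series-series two-coil circuit model: below a certain load resistance the transfer response splits into two peaks, and a larger load restores a single resonance. The topology and all component values here are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Series-series magnetically coupled resonant link (illustrative values).
L = 100e-6                                   # coil inductance, H
C = 1 / ((2 * np.pi * 100e3) ** 2 * L)       # tune both coils to 100 kHz
R_coil = 0.5                                 # coil loss resistance, ohm
k = 0.05                                     # coupling coefficient
M = k * L                                    # mutual inductance

def load_current(f, R_load, V=1.0):
    """|I2| through the load vs. drive frequency, from mesh analysis."""
    w = 2 * np.pi * f
    Z1 = R_coil + 1j * w * L + 1 / (1j * w * C)
    Z2 = R_coil + R_load + 1j * w * L + 1 / (1j * w * C)
    return np.abs(1j * w * M * V / (Z1 * Z2 + (w * M) ** 2))

def count_peaks(y):
    """Number of strict interior local maxima."""
    return int(np.sum((y[1:-1] > y[:-2]) & (y[1:-1] > y[2:])))

f = np.linspace(90e3, 110e3, 2001)
n_small_load = count_peaks(load_current(f, 2.0))    # over-coupled: splits
n_large_load = count_peaks(load_current(f, 50.0))   # damped: single peak
```

With these values, a 2 Ω load leaves the link over-coupled (two response peaks), while 50 Ω lowers the secondary quality factor enough to suppress the splitting, mirroring the abstract's minimum-load condition.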

  17. Central FPGA-based Destination and Load Control in the LHCb MHz Event Readout

    CERN Document Server

    Jacobsson, Richard

    2012-01-01

    The readout strategy of the LHCb experiment [1] is based on complete event readout at 1 MHz [2]. Over 300 sub-detector readout boards transmit event fragments at 1 MHz over a commercial 70 Gigabyte/s switching network to a distributed event building and trigger processing farm with 1470 individual multi-core computer nodes [3]. In the original specifications, the readout was based on a pure push protocol. This paper describes the proposal, implementation, and experience of a powerful non-conventional mixture of a push and a pull protocol, akin to credit-based flow control. A high-speed FPGA-based central master module controls the event fragment packing in the readout boards, the assignment of the farm node destination for each event, and controls the farm load based on an asynchronous pull mechanism from each farm node. This dynamic readout scheme relies on generic event requests and the concept of node credit allowing load balancing and trigger rate regulation as a function of the global farm load. It also ...

  18. Model-based Estimation of High Frequency Jump Diffusions with Microstructure Noise and Stochastic Volatility

    NARCIS (Netherlands)

    Bos, Charles S.

    2008-01-01

    When analysing the volatility related to high frequency financial data, mostly non-parametric approaches based on realised or bipower variation are applied. This article instead starts from a continuous time diffusion model and derives a parametric analog at high frequency for it, allowing

  19. Streambed gravel sampling and frequency base conversion: A solution to data set sharing

    Science.gov (United States)

    Shirazi, Mostafa A.; Faustini, John M.; Kaufmann, Philip R.

    2009-01-01

    The analysis of streambed particle size distribution is fundamental to geology, geomorphology, engineering, ecology, and hydrology. There is a continued need for standard analytical methods to accommodate different practices in sample collection, particle size characterization, frequency analysis, and frequency base conversion. We focus upon two related topics: (1) quantitative description of size of irregular particles and (2) frequency base conversion procedures. The first is needed to accurately determine physical particle properties (diameter, surface area, volume, and weight), and the second to determine the statistical influence on one or more of these properties of each particle in a mixture. We collected natural streambed particles, measured various calipered diameters including a nominal diameter using each particle volume, and calculated a shape factor for each diameter that converts it to an equivalent sieved diameter. Next, we extended a model originally derived in 1929 for a lognormal distribution to a streambed particle size distribution that severely deviated from a lognormal distribution. After successfully converting from number to weight frequency (and the reverse) of samples collected by grid, area, and volume, we extended the conversion to apply across these collection methods. These results make streambed frequency analysis independent of the particular diameter used, the observed frequency base, and the sample collection procedure. The immediate utility of our analysis is to facilitate data sharing among disciplines. The ultimate benefit is to free researchers to select the most convenient diameter measurement, size frequency classification, frequency base, and sample collection procedure from the many alternative strategies available.
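The number-to-weight frequency conversion at the heart of record 19 can be sketched for the simplest case, where each particle's weight is taken to scale with the cube of its nominal diameter (a shape factor of 1 and uniform density are assumed here; the paper calibrates shape factors from measured particles):

```python
import numpy as np

def number_to_weight_frequency(diameters, counts):
    """Convert a number-based size frequency to a weight-based frequency,
    assuming particle weight proportional to diameter cubed."""
    w = counts * diameters ** 3      # relative weight in each size class
    return w / w.sum()

d = np.array([2.0, 8.0, 32.0])       # nominal class diameters, mm
n = np.array([100, 50, 5])           # particles counted per class
wf = number_to_weight_frequency(d, n)
```

Note how a handful of cobbles dominates the weight frequency even though fine particles dominate the counts, which is why the frequency base must be stated when data sets are shared.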

  20. Spatial Frequency Scheduling for Uplink SC-FDMA based Linearly Precoded LTE Multiuser MIMO Systems

    DEFF Research Database (Denmark)

    Lin, Zihuai; Xiao, Pei; Sørensen, Troels Bundgaard

    2010-01-01

    This paper investigates the performance of the 3GPP Long Term Evolution (LTE) uplink Single Carrier (SC) Frequency Division Multiple Access (FDMA) based linearly precoded multiuserMultiple InputMultiple Output (MIMO) systems with frequency domain packet scheduling. A mathematical expression...

  1. The RFI situation for a space-based low-frequency radio astronomy instrument

    NARCIS (Netherlands)

    Bentum, Marinus Jan; Boonstra, A.J.

    2016-01-01

    Space based ultra-long wavelength radio astronomy has recently gained a lot of interest. Techniques to open the virtually unexplored frequency band below 30 MHz are becoming within reach at this moment. Due to the ionosphere and the radio interference (RFI) on Earth exploring this frequency band

  2. Joint Angle and Frequency Estimation Using Multiple-Delay Output Based on ESPRIT

    Science.gov (United States)

    Xudong, Wang

    2010-12-01

    This paper presents a novel ESPRIT-based algorithm for joint angle and frequency estimation using multiple-delay output (MDJAFE). The algorithm can estimate angles and frequencies jointly, and the use of multiple delayed outputs greatly improves estimation accuracy compared with a conventional algorithm. The useful behavior of the proposed algorithm is verified by simulations.
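The rotational-invariance idea behind ESPRIT can be illustrated with a minimal single-channel frequency estimator. This is a simplified sketch only: the paper's joint angle estimation and its multiple-delay output structure are not reproduced:

```python
import numpy as np

def esprit_frequencies(x, n_freqs, m=50):
    """Estimate normalized frequencies of complex exponentials in x via ESPRIT."""
    # Hankel data matrix from overlapping length-m snapshots
    X = np.array([x[i:i + m] for i in range(len(x) - m + 1)]).T
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    Us = U[:, :n_freqs]                       # signal subspace
    # Shift invariance: Us[1:] ~ Us[:-1] @ Phi, eig(Phi) = exp(j*2*pi*f)
    Phi = np.linalg.pinv(Us[:-1]) @ Us[1:]
    return np.sort(np.angle(np.linalg.eigvals(Phi)) / (2 * np.pi))

n = np.arange(200)
x = np.exp(2j * np.pi * 0.10 * n) + 0.8 * np.exp(2j * np.pi * 0.23 * n)
f_hat = esprit_frequencies(x, 2)
```

In the noise-free case the two normalized frequencies (0.10 and 0.23) are recovered essentially exactly; the multiple-delay outputs in MDJAFE serve to enlarge the data matrix and improve accuracy under noise.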

  3. Blood velocity estimation using spatio-temporal encoding based on frequency division approach

    DEFF Research Database (Denmark)

    Gran, Fredrik; Nikolov, Svetoslav; Jensen, Jørgen Arendt

    2005-01-01

    In this paper a feasibility study of using a spatial encoding technique based on frequency division for blood flow estimation is presented. The spatial encoding is carried out by dividing the available bandwidth of the transducer into a number of narrow frequency bands with approximately disjoint...

  4. Prediction Equations of Energy Expenditure in Chinese Youth Based on Step Frequency during Walking and Running

    Science.gov (United States)

    Sun, Bo; Liu, Yu; Li, Jing Xian; Li, Haipeng; Chen, Peijie

    2013-01-01

    Purpose: This study set out to examine the relationship between step frequency and velocity to develop a step frequency-based equation to predict Chinese youth's energy expenditure (EE) during walking and running. Method: A total of 173 boys and girls aged 11 to 18 years old participated in this study. The participants walked and ran on a…

  5. Optical fiber strain sensor using fiber resonator based on frequency comb Vernier spectroscopy

    DEFF Research Database (Denmark)

    Zhang, Liang; Lu, Ping; Chen, Li

    2012-01-01

    A novel (to our best knowledge) optical fiber strain sensor using a fiber ring resonator based on frequency comb Vernier spectroscopy is proposed and demonstrated. A passively mode-locked optical fiber laser is employed to generate a phased-locked frequency comb. Strain applied to the optical fiber...

  6. Site correction of a high-frequency strong-ground-motion simulation based on an empirical transfer function

    Science.gov (United States)

    Huang, Jyun-Yan; Wen, Kuo-Liang; Lin, Che-Min; Kuo, Chun-Hsiang; Chen, Chun-Te; Chang, Shuen-Chiang

    2017-05-01

    In this study, an empirical transfer function (ETF), the difference in Fourier amplitude spectra between observed strong ground motion and synthetic motion obtained by a stochastic point-source simulation technique, is constructed for the Taipei Basin, Taiwan. The baseline stochastic point-source simulations can be treated as reference rock site conditions in order to account for site effects. The parameters of the stochastic point-source approach related to source and path effects are taken from previous well-verified studies. A database of shallow, small-magnitude earthquakes is selected to construct the ETFs so that the point-source approach for synthetic motions might be more widely applicable. The high-frequency synthetic motion obtained from the ETF procedure is site-corrected in the strong site-response area of the Taipei Basin. The site-response characteristics of the ETF are similar to those found in previous studies, which indicates that the base synthetic model is suitable for the reference rock conditions in the Taipei Basin. The dominant-frequency contour corresponds to the shape of the bottom of the geological basement (the top of the Tertiary period), which is the Sungshan formation. Two clear high-amplification areas are identified in the deepest region of the Sungshan formation, as shown by an amplification contour at 0.5 Hz, whereas a high-amplification area shifts to the basin's edge, as shown by an amplification contour at 2.0 Hz. Three target earthquakes with different source conditions relative to the ETF database, including shallow small-magnitude events, shallow and relatively large-magnitude events, and deep small-magnitude events, are tested to verify the site correction. The results indicate that ETF-based site correction is effective for shallow earthquakes, even those with higher magnitudes, but is not suitable for deep earthquakes. Finally, one of the most significant shallow large-magnitude earthquakes (the
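The core of the ETF construction in record 6 is a spectral ratio: the Fourier amplitude spectrum of the observed motion divided by that of the synthetic reference motion. A minimal sketch, using synthetic waveforms in place of real records:

```python
import numpy as np

def empirical_transfer_function(observed, synthetic, dt):
    """Ratio of Fourier amplitude spectra: observed / synthetic."""
    f = np.fft.rfftfreq(len(observed), dt)
    # small epsilon guards bins where the synthetic spectrum is empty
    etf = np.abs(np.fft.rfft(observed)) / (np.abs(np.fft.rfft(synthetic)) + 1e-12)
    return f, etf

dt = 0.01                                     # 100 Hz sampling
t = np.arange(0, 10, dt)
synthetic = np.sin(2 * np.pi * 1.0 * t)       # reference rock motion (toy)
observed = 3.0 * np.sin(2 * np.pi * 1.0 * t)  # site amplifies 1 Hz by a factor of 3
f, etf = empirical_transfer_function(observed, synthetic, dt)
amp_1hz = etf[int(np.argmin(np.abs(f - 1.0)))]
```

Applying the ETF to a new synthetic spectrum (multiplying bin by bin) is what performs the site correction described in the abstract.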

  7. Digital disease detection: A systematic review of event-based internet biosurveillance systems.

    Science.gov (United States)

    O'Shea, Jesse

    2017-05-01

    Internet access and usage have changed how people seek and report health information, while infectious diseases continue to threaten humanity. The analysis of Big Data, or vast digital data, presents an opportunity to improve disease surveillance and epidemic intelligence. Epidemic intelligence contains two components: indicator-based and event-based. A relatively new surveillance type has emerged, called event-based Internet biosurveillance systems. These systems use information on events impacting health from Internet sources, such as social media or news aggregates, and circumvent the limitations of traditional reporting systems by being inexpensive, transparent, and flexible. Yet the innovations and functionality of these systems can change rapidly. This review updates the current state of knowledge on event-based Internet biosurveillance systems by identifying all such systems and their current functionality, with the aim of helping decision makers determine whether to incorporate new methods into comprehensive programmes of surveillance. A systematic review was performed through the PubMed, Scopus, and Google Scholar databases, also including grey literature and other publication types. Fifty event-based Internet systems, described in 99 articles, were identified, and 15 attributes were extracted for each system. Each system uses different innovative technology and data sources to gather, process, and disseminate data to detect infectious disease outbreaks. The review emphasises the importance of using both formal and informal sources for timely and accurate infectious disease outbreak surveillance and catalogues all event-based Internet biosurveillance systems, so that future researchers will be able to use this review as a library for referencing systems, with hopes of learning, building, and expanding Internet-based surveillance systems. Event-based Internet biosurveillance should act as an extension of traditional systems, to be utilised as an

  8. Frequency support capability of variable speed wind turbine based on electromagnetic coupler

    DEFF Research Database (Denmark)

    You, Rui; Barahona Garzón, Braulio; Chai, Jianyun

    2015-01-01

    ...... capability has to be enhanced. In this paper, the frequency support capability of WT-EMC is studied at three typical wind conditions and with two control strategies, droop control and inertial control, to enhance its frequency support capability. The synchronous generator speed, more stable than the grid frequency which is the input signal for Type 3 and Type 4 wind turbine frequency support controllers, is used for the calculation of the WT-EMC supplementary torque command. The integrated simulation environment based on the aeroelastic code HAWC2 and software Matlab/Simulink is used to build a 2 MW WT-EMC model......

  9. Model-based compressed sensing of fiber Bragg grating arrays in the frequency domain

    Science.gov (United States)

    Werzinger, Stefan; Gottinger, Michael; Gussner, Sandra; Bergdolt, Sven; Engelbrecht, Rainer; Schmauss, Bernhard

    2017-04-01

    We propose model-based compressed sensing (MBCS) of FBG arrays (FBGAs) interrogated with wavelength-scanning incoherent optical frequency domain reflectometry. This method measures the frequency response of an FBGA with an electrical vector network analyzer combined with a tunable laser. Instead of the usual inverse discrete Fourier transform (IDFT), we apply a direct estimation of the grating reflectivities with a simple frequency-domain model. A reconstruction of 10 gratings spaced by 20 cm is demonstrated. Compared with an IDFT, MBCS allows the number of measurement frequencies to be reduced from 120 to 8 while using a bandwidth of just 500 MHz.
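The frequency-domain model underlying record 9 treats the array response as a sum of delayed reflections, H(f) = Σ_k r_k · exp(-j2πf·τ_k), and estimates the reflectivities r_k directly. The sketch below solves this by plain linear least squares; the delay spacing (~2 ns round trip for 20 cm in fiber) and 16 probe frequencies are assumptions, and the paper's compressed-sensing step, which cuts the frequency count further, is not reproduced:

```python
import numpy as np

taus = np.arange(10) * 2e-9                  # assumed round-trip delays, 10 gratings
r_true = np.linspace(0.01, 0.1, 10)          # assumed grating reflectivities
freqs = np.linspace(0, 500e6, 16)            # probe frequencies within 500 MHz

# Model matrix: A[m, k] = exp(-j*2*pi*f_m*tau_k)
A = np.exp(-2j * np.pi * np.outer(freqs, taus))
H = A @ r_true                               # noise-free 'measured' response
r_est, *_ = np.linalg.lstsq(A, H, rcond=None)
```

With noise-free data and well-chosen frequencies the reflectivities are recovered exactly, illustrating why a parametric model needs far fewer frequency samples than an IDFT over the full band.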

  10. Ancillary Frequency Control of Direct Drive Full-Scale Converter Based Wind Power Plants

    DEFF Research Database (Denmark)

    Hu, Weihao; Su, Chi; Fang, Jiakun

    2013-01-01

    This paper presents a simulation model of a wind power plant based on a MW-level variable speed wind turbine with a full-scale back-to-back power converter developed in the simulation tool of DIgSILENT Power Factory. Three different kinds of ancillary frequency control strategies, namely inertia emulation, primary frequency control and secondary frequency control, are proposed in order to improve the frequency stability of power systems. The modified IEEE 39-bus test system with a large-scale wind power penetration is chosen as the studied power system. Simulation results show that the proposed
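The primary frequency control mentioned in record 10 is commonly implemented as a proportional droop characteristic: the plant raises its active power output when grid frequency falls and lowers it when frequency rises. A minimal sketch, with the droop setting, rating, and nominal frequency as illustrative assumptions:

```python
def droop_power(f, f_nom=50.0, droop=0.04, p_rated=2e6):
    """Active power adjustment (W) demanded by a 4 % droop characteristic
    for a 2 MW unit: a 4 % frequency deviation maps to full rated power."""
    return -p_rated * (f - f_nom) / (droop * f_nom)
```

For example, a 0.1 Hz under-frequency event (49.9 Hz) calls for an extra 100 kW of output; inertia emulation would add a further term proportional to df/dt, which is omitted here.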

  11. Statistical analysis of nature frequencies of hemispherical resonator gyroscope based on probability theory

    Science.gov (United States)

    Yu, Xudong; Long, Xingwu; Wei, Guo; Li, Geng; Qu, Tianliang

    2015-04-01

    A finite element model of the hemispherical resonator gyro (HRG) is established and the natural frequencies and vibration modes are investigated. The matrix perturbation technology in the random finite element method is first introduced to analyze the statistical characteristics of the natural frequencies of HRG. The influences of random material parameters and dimensional parameters on the natural frequencies are quantitatively described based on the probability theory. The statistics expressions of the random parameters are given and the influences of three key parameters on natural frequency are pointed out. These results are important for design and improvement of high accuracy HRG.
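The probabilistic treatment in record 11 (random material and dimensional parameters propagated to natural-frequency statistics) can be illustrated with a Monte Carlo surrogate. A single-degree-of-freedom oscillator stands in for the finite element model here, and all parameter values and scatter levels are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
k = rng.normal(1.0e4, 2.0e2, n)      # stiffness, N/m (2 % scatter)
m = rng.normal(1.0e-3, 1.0e-5, n)    # modal mass, kg (1 % scatter)

# Natural frequency of each random sample: f = sqrt(k/m) / (2*pi)
f = np.sqrt(k / m) / (2 * np.pi)
f_mean, f_std = f.mean(), f.std()
```

The resulting spread (roughly half the combined relative scatter of k and m, since f depends on their square root) is the kind of statistic the matrix perturbation method delivers analytically for the full HRG model.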

  12. Fast LCMV-based Methods for Fundamental Frequency Estimation

    DEFF Research Database (Denmark)

    Jensen, Jesper Rindom; Glentis, George-Othon; Christensen, Mads Græsbøll

    2013-01-01

    as such either the classic time domain averaging covariance matrix estimator, or, if aiming for an increased spectral resolution, the covariance matrix resulting from the application of the recent iterative adaptive approach (IAA). The proposed exact implementations reduce the required computational complexity by several orders of magnitude, but, as we show, further computational savings can be obtained by the adoption of an approximative IAA-based data covariance matrix estimator, reminiscent of the recently proposed Quasi-Newton IAA technique. Furthermore, it is shown how the considered pitch estimators can

  13. submitter Linear encoder based low frequency inertial sensor

    CERN Document Server

    Hellegouarch, Sylvain; Artoos, Kurt; Lambert, Pierre; Collette, Christophe

    2016-01-01

    In this article, we present a novel concept of inertial sensor based on a linear encoder. Compared to other interferometric sensors, the encoder is much easier to mount, and its calibration is more stable. A prototype has been built and validated experimentally by comparison with a commercial seismometer. It has a resolution of about 10 pm/√Hz. In order to further improve the resolution, two concepts of mechanical amplifiers have been studied and compared. One of them is shown to be extremely promising, provided that the amplifier does not stiffen the sensor.

  14. An Event-Based Approach to Distributed Diagnosis of Continuous Systems

    Science.gov (United States)

    Daigle, Matthew; Roychoudhurry, Indranil; Biswas, Gautam; Koutsoukos, Xenofon

    2010-01-01

    Distributed fault diagnosis solutions are becoming necessary due to the complexity of modern engineering systems, and the advent of smart sensors and computing elements. This paper presents a novel event-based approach for distributed diagnosis of abrupt parametric faults in continuous systems, based on a qualitative abstraction of measurement deviations from the nominal behavior. We systematically derive dynamic fault signatures expressed as event-based fault models. We develop a distributed diagnoser design algorithm that uses these models for designing local event-based diagnosers based on global diagnosability analysis. The local diagnosers each generate globally correct diagnosis results locally, without a centralized coordinator, and by communicating a minimal number of measurements between themselves. The proposed approach is applied to a multi-tank system, and results demonstrate a marked improvement in scalability compared to a centralized approach.

  15. Trust Index Based Fault Tolerant Multiple Event Localization Algorithm for WSNs

    Directory of Open Access Journals (Sweden)

    Jian Wan

    2011-06-01

    This paper investigates the use of wireless sensor networks for multiple event source localization using binary information from the sensor nodes. The events continually emit signals whose strength is attenuated inversely proportional to the distance from the source. In this context, faults occur for various reasons and are manifested when a node reports a wrong decision. In order to reduce the impact of node faults on the accuracy of multiple event localization, we introduce a trust index model to evaluate the fidelity of the information that the nodes report and use in the event detection process, and propose the Trust Index based Subtract on Negative Add on Positive (TISNAP) localization algorithm, which reduces the impact of faulty nodes on event localization by decreasing their trust index, to improve the accuracy of event localization and the fault tolerance of multiple event source localization. The algorithm includes three phases. First, the sink identifies the cluster nodes to determine the number of events that occurred in the entire region by analyzing the binary data reported by all nodes. Then, it constructs the likelihood matrix related to the cluster nodes and estimates the location of all events according to the alarmed status and trust index of the nodes around the cluster nodes. Finally, the sink updates the trust index of all nodes according to the fidelity of their information in the previous reporting cycle. The algorithm improves localization accuracy and fault-tolerance performance in multiple event source localization. The experimental results show that even when the probability of node fault is close to 50%, the algorithm can still accurately determine the number of events and achieves better localization accuracy than other algorithms.

  16. Structural Health Monitoring Based on Combined Structural Global and Local Frequencies

    Directory of Open Access Journals (Sweden)

    Jilin Hou

    2014-01-01

    This paper presents a parameter estimation method for structural health monitoring based on combined measured structural global frequencies and structural local frequencies. First, a global test is performed to obtain the low-order modes, which reflect the global information of the structure. Second, mass is added to a member of the structure to enhance its local dynamic characteristics and give the member a local primary frequency, which belongs to the structural local frequencies and is sensitive to local parameters. The parameters of the structure can then be optimized accurately using the combined structural global and local frequencies. The effectiveness and accuracy of the proposed method are verified by an experiment on a space truss.

  17. Micro-Doppler Signal Time-Frequency Algorithm Based on STFRFT

    Directory of Open Access Journals (Sweden)

    Cunsuo Pang

    2016-09-01

    This paper proposes a time-frequency algorithm based on the short-time fractional order Fourier transform (STFRFT) for identification of targets with complicated motion. The algorithm, consisting of an STFRFT order-changing and quick selection method, is effective in reducing the computational load. A multi-order STFRFT time-frequency algorithm is also developed that makes use of the time-frequency feature of each micro-Doppler component signal. This algorithm improves the estimation accuracy of time-frequency curve fitting through multi-order matching. Finally, experimental data were used to demonstrate STFRFT’s performance in micro-Doppler time-frequency analysis. The results validated the higher estimation accuracy of the proposed algorithm. It may be applied to LFM (linear frequency modulated) pulse radar, SAR (synthetic aperture radar), or ISAR (inverse synthetic aperture radar) to improve the probability of target recognition.

  18. The Tracking Resonance Frequency Method for Photoacoustic Measurements Based on the Phase Response

    Science.gov (United States)

    Suchenek, Mariusz

    2017-04-01

    One of the major issues in the use of the resonant photoacoustic cell is the resonance frequency of the cell. The frequency is not stable, and its changes depend mostly on temperature and gas mixture. This paper presents a new method for tracking resonance frequency, where both the amplitude and phase are calculated from the input samples. The stimulating frequency can be adjusted to the resonance frequency of the cell based on the phase. This method was implemented using a digital measurement system with an analog to digital converter, field programmable gate array (FPGA) and a microcontroller. The resonance frequency was changed by the injection of carbon dioxide into the cell. A theoretical description and experimental results are also presented.
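The phase-based tracking in record 18 exploits the fact that a driven resonator's phase response passes through -90° exactly at resonance, so the stimulating frequency can be nudged until the measured phase reaches that target. The resonator model, quality factor, and loop gain below are illustrative assumptions, not the paper's hardware values:

```python
import math

F0, Q = 4000.0, 20.0                      # 'true' cell resonance (Hz) and quality factor

def measured_phase(f):
    """Phase of a driven damped resonator; equals -pi/2 at f = F0."""
    return -math.atan2(f * F0 / Q, F0**2 - f**2)

def track_resonance(f_start, gain=F0 / (2 * Q), n_iter=50):
    """Iteratively steer the drive frequency until the phase error vanishes."""
    f = f_start
    for _ in range(n_iter):
        error = measured_phase(f) + math.pi / 2   # zero exactly at resonance
        f += gain * error
    return f
```

Because the phase slope at resonance is about -2Q/F0, a gain near F0/(2Q) gives fast local convergence; in the paper both amplitude and phase are computed digitally from the sampled microphone signal.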

  19. Frequency-shaped and observer-based discrete-time sliding mode control

    CERN Document Server

    Mehta, Axaykumar

    2015-01-01

    It is well established that the sliding mode control strategy provides an effective and robust method of controlling a deterministic system due to its well-known invariance property to a class of bounded disturbances and parameter variations. Advances in microcomputer technologies have made digital control increasingly popular among researchers worldwide, which has led to the study of discrete-time sliding mode control design and its implementation. This brief presents a method for multi-rate frequency-shaped sliding mode controller design based on switching and non-switching types of reaching law. In this approach, the frequency-dependent compensator dynamics are introduced through a frequency-shaped sliding surface by assigning frequency-dependent weighting matrices in a linear quadratic regulator (LQR) design procedure. In this way, undesired high-frequency dynamics or certain frequency disturbances can be eliminated. The states are implicitly obtained by measuring the output at a faster rate than th...

  20. Tracing the Spatial-Temporal Evolution of Events Based on Social Media Data

    Directory of Open Access Journals (Sweden)

    Xiaolu Zhou

    2017-03-01

    Full Text Available Social media data provide a great opportunity to investigate event flow in cities. Despite the advantages of social media data in these investigations, the data heterogeneity and big data size pose challenges to researchers seeking to identify useful information about events from the raw data. In addition, few studies have used social media posts to capture how events develop in space and time. This paper demonstrates an efficient approach based on machine learning and geovisualization to identify events and trace the development of these events in real-time. We conducted an empirical study to delineate the temporal and spatial evolution of a natural event (heavy precipitation) and a social event (Pope Francis’ visit to the US) in the New York City–Washington, DC region. By investigating multiple features of Twitter data (message, author, time, and geographic location information), this paper demonstrates how voluntary local knowledge from tweets can be used to depict city dynamics, discover spatiotemporal characteristics of events, and convey real-time information.

  1. Improving the extraction of complex regulatory events from scientific text by using ontology-based inference

    Directory of Open Access Journals (Sweden)

    Kim Jung-jae

    2011-10-01

    Full Text Available Abstract Background The extraction of complex events from biomedical text is a challenging task and requires in-depth semantic analysis. Previous approaches associate lexical and syntactic resources with ontologies for the semantic analysis, but fall short in testing the benefits from the use of domain knowledge. Results We developed a system that deduces implicit events from explicitly expressed events by using inference rules that encode domain knowledge. We evaluated the system with the inference module on three tasks: First, when tested against a corpus with manually annotated events, the inference module of our system contributes 53.2% of correct extractions, but does not cause any incorrect results. Second, the system overall reproduces 33.1% of the transcription regulatory events contained in RegulonDB (up to 85.0% precision), and the inference module is required for 93.8% of the reproduced events. Third, we applied the system with minimum adaptations to the identification of cell activity regulation events, confirming that the inference improves the performance of the system also on this task. Conclusions Our research shows that inference based on domain knowledge plays a significant role in extracting complex events from text. This approach has great potential in recognizing the complex concepts of such biomedical ontologies as Gene Ontology in the literature.

  2. A robust neural network-based approach for microseismic event detection

    KAUST Repository

    Akram, Jubran

    2017-08-17

    We present an artificial neural network based approach for robust event detection from low S/N waveforms. We use a feed-forward network with a single hidden layer that is tuned on a training dataset and later applied to the entire example dataset for event detection. The input features used include the average of absolute amplitudes, variance, energy ratio and polarization rectilinearity. These features are calculated in a moving window of the same length for the entire waveform. The output is set as a user-specified relative probability curve, which provides a robust way of distinguishing between weak and strong events. An optimal network is selected by studying the weight-based saliency and the effect of the number of neurons on the predicted results. Using synthetic data examples, we demonstrate that this approach is effective in detecting weaker events and reduces the number of false positives.
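    The moving-window feature extraction that feeds the network can be sketched in a few lines. The exact feature definitions below (trailing-quarter energy ratio, window length, burst shape) are assumptions for illustration, not the authors' implementation.

    ```python
    import math
    import random

    def window_features(trace, win):
        """Moving-window attributes similar in spirit to the paper's inputs:
        average absolute amplitude, variance, and a short/long energy ratio
        (energy of the trailing quarter over the whole window)."""
        feats = []
        for i in range(len(trace) - win + 1):
            w = trace[i:i + win]
            mean = sum(w) / win
            avg_abs = sum(abs(x) for x in w) / win
            var = sum((x - mean) ** 2 for x in w) / win
            short = sum(x * x for x in w[-win // 4:])   # trailing quarter
            long_ = sum(x * x for x in w) + 1e-12
            feats.append((avg_abs, var, short / long_))
        return feats

    # Noise with an embedded weak "event" burst: the energy ratio
    # should spike where the trailing quarter covers the burst onset.
    random.seed(0)
    trace = [random.gauss(0, 0.1) for _ in range(200)]
    for i in range(120, 140):
        trace[i] += 2.0 * math.sin(0.8 * (i - 120))
    feats = window_features(trace, win=40)
    ratios = [f[2] for f in feats]
    ```

    A network trained on such feature vectors then maps them to the relative probability curve described in the abstract.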

  3. Population based allele frequencies of disease associated polymorphisms in the Personalized Medicine Research Project

    Directory of Open Access Journals (Sweden)

    Stefanski Elisha L

    2010-06-01

    Full Text Available Abstract Background There is a lack of knowledge regarding the frequency of disease-associated polymorphisms in populations, and the population attributable risk for many populations remains unknown. Factors that could affect the association of the allele with disease, either positively or negatively, such as race, ethnicity, and gender, may not be possible to determine without population-based allele frequencies. Here we used a panel of 51 polymorphisms previously associated with at least one disease and determined the allele frequencies within the entire Personalized Medicine Research Project population-based cohort. We compared these allele frequencies to those in dbSNP and other data sources stratified by race. Differences in allele frequencies between self-reported race, region of origin, and sex were determined. Results There were 19,544 individuals who self-reported a single racial category; 19,027 (97.4%) self-reported white Caucasian, and 11,205 (57.3%) individuals were female. Of the 11,208 (57%) individuals with an identifiable region of origin, 8,337 (74.4%) were German. 41 polymorphisms were significantly different between self-reported races at the 0.05 level. Stratification of our Caucasian population by self-reported region of origin revealed 19 polymorphisms that were significantly different (p = 0.05) between individuals of different origins. Further stratification of the population by gender revealed few significant differences in allele frequencies between the genders. Conclusions This represents one of the largest population-based allele frequency studies to date. Stratification by self-reported race and region of origin revealed wide differences in allele frequencies not only by race but also by region of origin within a single racial group. We report allele frequencies for our Asian/Hmong and American Indian populations; these two minority groups are not typically selected for population allele frequency detection. Population wide
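    The underlying computation is simple: an allele frequency is derived from genotype counts, and strata are compared frequency against frequency. The genotype counts below are hypothetical, not taken from the study.

    ```python
    def allele_frequencies(n_AA, n_Aa, n_aa):
        """Major/minor allele frequencies from genotype counts:
        each individual carries two alleles, heterozygotes one of each.
        Illustrative only; the study's 51-SNP panel and significance
        tests across strata are not reproduced here."""
        total_alleles = 2 * (n_AA + n_Aa + n_aa)
        freq_A = (2 * n_AA + n_Aa) / total_alleles
        return freq_A, 1.0 - freq_A

    # Hypothetical counts for one polymorphism in two self-reported strata:
    f_A_white, f_a_white = allele_frequencies(5200, 3600, 1200)  # -> 0.70 / 0.30
    f_A_other, f_a_other = allele_frequencies(120, 240, 140)     # -> 0.48 / 0.52
    ```

    Comparing `f_A_white` with `f_A_other` (e.g., via a chi-squared test on the underlying counts) is the kind of stratified contrast reported in the Results.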

  4. Non-Cooperative Regulation Coordination Based on Game Theory for Wind Farm Clusters during Ramping Events

    DEFF Research Database (Denmark)

    Qi, Yongzhi; Liu, Yutian; Wu, Qiuwei

    2017-01-01

    With increasing penetration of wind power in power systems, it is important to track scheduled wind power output as much as possible during ramping events to ensure security of the system. In this paper, a non-cooperative coordination strategy based on game theory is proposed for the regulation...... of wind farm clusters (WFCs) in order to track the scheduled wind power of the WFC during ramping events. In the proposed strategy, a non-cooperative game is formulated and wind farms compete to provide regulation to the WFC during ramping events. A regulation revenue function is proposed to evaluate...

  5. Frequency shift keying by current modulation in a MTJ-based STNO with high data rate

    Science.gov (United States)

    Ruiz-Calaforra, A.; Purbawati, A.; Brächer, T.; Hem, J.; Murapaka, C.; Jiménez, E.; Mauri, D.; Zeltser, A.; Katine, J. A.; Cyrille, M.-C.; Buda-Prejbeanu, L. D.; Ebels, U.

    2017-08-01

    Spin torque nano-oscillators are nanoscopic microwave frequency generators which excel due to their large frequency tuning range and agility for amplitude and frequency modulation. Due to their compactness, they are regarded as suitable candidates for applications in wireless communications, where cost-effective and complementary metal-oxide semiconductor-compatible standalone devices are required. In this work, we study the ability of a magnetic-tunnel-junction-based spin torque nano-oscillator to respond to a binary input sequence encoded in a square-shaped current pulse for its application as a frequency-shift-keying (FSK) based emitter. We demonstrate that below the limit imposed by the spin torque nano-oscillator intrinsic relaxation frequency, an agile variation between discrete oscillator states is possible. For this kind of device, we demonstrate FSK up to data rates of 400 Mbps, which is well suited for the application of such oscillators in wireless networks.
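    The FSK scheme itself is easy to sketch: each bit of the input sequence selects one of two carrier frequencies. The frequencies, sample rate, and bit pattern below are hypothetical; the paper's STNO shifts its oscillation frequency via the drive current rather than a synthesized carrier.

    ```python
    import math

    def fsk_waveform(bits, f0, f1, samples_per_bit, fs):
        """Binary frequency-shift keying: each bit selects one of two
        carrier frequencies, with phase kept continuous across bit
        boundaries (as an oscillator's output would be)."""
        out, phase = [], 0.0
        for b in bits:
            f = f1 if b else f0
            for _ in range(samples_per_bit):
                phase += 2 * math.pi * f / fs
                out.append(math.sin(phase))
        return out

    # Four bits at two tones: a '1' segment packs in twice as many
    # carrier cycles as a '0' segment.
    wave = fsk_waveform([1, 0, 1, 1], f0=1.05e9, f1=2.1e9,
                        samples_per_bit=50, fs=10e9)
    ```

    A receiver recovers the bits by estimating the instantaneous frequency per bit slot, e.g. by counting zero crossings in each segment.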

  6. Frequency Control in Autonomous Microgrid in the Presence of DFIG-based Wind Turbine

    Directory of Open Access Journals (Sweden)

    Ghazanfar Shahgholian

    2015-10-01

    Full Text Available Despite their ever-increasing power injection into the power grid, wind turbines play no role in frequency control. On the other hand, power network frequency is mainly adjusted by conventional power plants. DFIG-based wind turbines not only are able to produce power at various mechanical speeds, but they can also reduce speed instantaneously, which, in turn, leads to mechanical energy release. Thus, they can aid conventional units in system frequency control. In this paper, the effect of wind energy conversion systems, especially variable-speed DFIG-based wind turbines, on frequency control and tuning is investigated when different penetration coefficients are considered in an isolated microgrid comprising conventional thermal and non-thermal generating units. To do this, optimal tuning of the DFIG's speed controller is performed at different penetration levels using the particle swarm optimization (PSO) technique. In addition, optimum penetration of the wind energy conversion system is studied considering frequency change parameters in a microgrid.

  7. Frequency-based similarity detection of structures in human brain

    Science.gov (United States)

    Sims, Dave I.; Siadat, Mohammad-Reza

    2017-03-01

    Advancements in 3D scanning and volumetric imaging methods have motivated researchers to tackle new challenges related to storing, retrieving and comparing 3D models, especially in the medical domain. Comparing natural rigid shapes and detecting subtle changes in 3D models of brain structures is of great importance. Precision in capturing surface details and insensitivity to shape orientation are highly desirable properties of good shape descriptors. In this paper, we propose a new method, Spherical Harmonics Distance (SHD), which leverages the power of spherical harmonics to provide more accurate representation of surface details. At the same time, the proposed method incorporates the features of a shape distribution method (D2) and inherits its insensitivity to shape orientation. Comparing SHD to a spherical harmonics based method (SPHARM) shows that the performance of the proposed method is less sensitive to rotation. Also, comparing SHD to D2 shows that the proposed method is more accurate in detecting subtle changes. The performance of the proposed method is verified by calculating the Fisher measure (FM) of extracted feature vectors. The FM of the vectors generated by SHD on average shows 27 times higher values than that of D2. Our preliminary results show that SHD successfully combines desired features from two different methods and paves the way towards better detection of subtle dissimilarities among natural rigid shapes (e.g., structures of interest in the human brain). Detecting these subtle changes can be instrumental in more accurate diagnosis, prognosis and treatment planning.

  8. The typology, frequency and magnitude of some behaviour events in case of torrential hydrographical management works in the upper Tarlung watershed

    Directory of Open Access Journals (Sweden)

    Ioan Clinciu

    2013-12-01

    Full Text Available During the 20-25 years since their startup, the torrential hydrographical management works carried out in the upper Tărlung Watershed (55 dams, 22 sills, 25 traverses and 4 outlet canals) have exposed a number of 24 behaviour event types: 13 of them reduce the safety of exploitation and the sustainability of the works (hereinafter called damages), while the other 11 reduce the functionality of the works (hereinafter called disfunctionalities). The following behaviour events have the highest frequency: (i) damages caused by water and alluvia erosion (erosive damages), followed by breakages, in the category of damages, and (ii) unsupervised installation of forest vegetation on the managed torrential hydrographical network and apron siltation, in the category of disfunctionalities. For methodological reasons, only the erosive damage of works was successively analysed, according to two criteria: the average depth (cm) in the eroded area and the percentage of the erosive area out of the total surface. Further on, by combining the two criteria for analysis, five representation areas with the same damage intensity were defined (very low, low, medium, high and very high intensity). With the aid of the event frequency values recorded in these areas and of the coefficients attributed to each intensity class (from 1 for very low intensity to 5 for very high intensity), the author reached the conclusion that the level of the recorded intensity of the damage caused by water and alluvia erosion ranged from very low to low.

  9. Alcohol sponsorship of a summer of sport: a frequency analysis of alcohol marketing during major sports events on New Zealand television.

    Science.gov (United States)

    Chambers, Tim; Signal, Louise; Carter, Mary-Ann; McConville, Samuel; Wong, Rebecca; Zhu, Wendy

    2017-01-13

    This research aims to assess the nature and extent of alcohol marketing through sport sponsorship over a summer of televised sport in New Zealand. Frequency analysis of New Zealand television broadcasts of five international sporting events during the summer of 2014-2015. Broadcasts were analysed to identify the percentage of time when alcohol brands were visible during game-play. The number of independent alcohol brand exposures was recorded. Alcohol brands were observed during every televised event. Audiences were exposed to between 1.6 and 3.8 alcohol brand exposures per minute. Alcohol brands were visible between 42 and 777 times across the games examined. For three out of the five events alcohol brands were visible for almost half of the game. Alcohol sponsorship was prevalent in international sport on New Zealand television. Given the popularity of broadcast sport, especially with children, there is an urgent need for regulation of alcohol sponsorship of sport. There are viable models of alcohol sponsorship replacement but their implementation requires the will of both sporting organisations and politicians. This research adds weight to arguments to implement recommendations to remove all alcohol sponsorship of sport.

  10. A multiple distributed representation method based on neural network for biomedical event extraction.

    Science.gov (United States)

    Wang, Anran; Wang, Jian; Lin, Hongfei; Zhang, Jianhai; Yang, Zhihao; Xu, Kan

    2017-12-20

    Biomedical event extraction is one of the most frontier domains in biomedical research. The two main subtasks of biomedical event extraction are trigger identification and argument detection, which can both be considered as classification problems. However, traditional state-of-the-art methods are based on support vector machines (SVM) with massive manually designed one-hot represented features, which require enormous work but lack semantic relations among words. In this paper, we propose a multiple distributed representation method for biomedical event extraction. The method combines context features, consisting of dependency-based word embeddings, with task-based features represented in a distributed way, as the input used to train deep learning models. Finally, we used a softmax classifier to label the example candidates. The experimental results on the Multi-Level Event Extraction (MLEE) corpus show higher F-scores of 77.97% in trigger identification and 58.31% overall compared to the state-of-the-art SVM method. Our distributed representation method for biomedical event extraction avoids the problems of the semantic gap and dimension disaster found in traditional one-hot representation methods. The promising results demonstrate that our proposed method is effective for biomedical event extraction.

  11. High-resolution mid-IR spectrometer based on frequency upconversion

    DEFF Research Database (Denmark)

    Hu, Qi; Dam, Jeppe Seidelin; Pedersen, Christian

    2012-01-01

    We demonstrate a novel approach for high-resolution spectroscopy based on frequency upconversion and postfiltering by means of a scanning Fabry–Perot interferometer. The system is based on sum-frequency mixing, shifting the spectral content from the mid-infrared to the near-visible region al......-frequency 1064 nm laser. We investigate water vapor emission lines from a butane burner and compare the measured results to model data. We suggest the presented method for real-time monitoring of specific gas lines and reference signals....

  12. Time-Frequency Distribution of Music based on Sparse Wavelet Packet Representations

    DEFF Research Database (Denmark)

    Endelt, Line Ørtoft

    We introduce a new method for generating time-frequency distributions, which is particularly useful for the analysis of music signals. The method presented here is based on ℓ1 sparse representations of music signals in a redundant wavelet packet dictionary. The representations are found using the minimization methods basis pursuit and best orthogonal basis. Visualizations of the time-frequency distribution are constructed based on a simplified energy distribution in the wavelet packet decomposition. The time-frequency distributions emphasize structured musical content, including non-stationary content, by masking the energy from less structured music instruments. We present four examples for visualizing structured content, including vocal and single instrument.

  13. Assessing distractors and teamwork during surgery: developing an event-based method for direct observation.

    Science.gov (United States)

    Seelandt, Julia C; Tschan, Franziska; Keller, Sandra; Beldi, Guido; Jenni, Nadja; Kurmann, Anita; Candinas, Daniel; Semmer, Norbert K

    2014-11-01

    To develop a behavioural observation method to simultaneously assess distractors and communication/teamwork during surgical procedures through direct, on-site observations; to establish the reliability of the method for long (>3 h) procedures. Observational categories for an event-based coding system were developed based on expert interviews, observations and a literature review. Using Cohen's κ and the intraclass correlation coefficient, interobserver agreement was assessed for 29 procedures. Agreement was calculated for the entire surgery, and for the 1st hour. In addition, interobserver agreement was assessed between two tired observers and between a tired and a non-tired observer after 3 h of surgery. The observational system has five codes for distractors (door openings, noise distractors, technical distractors, side conversations and interruptions), eight codes for communication/teamwork (case-relevant communication, teaching, leadership, problem solving, case-irrelevant communication, laughter, tension and communication with external visitors) and five contextual codes (incision, last stitch, personnel changes in the sterile team, location changes around the table and incidents). Based on 5-min intervals, Cohen's κ was good to excellent for distractors (0.74-0.98) and for communication/teamwork (0.70-1). Based on frequency counts, intraclass correlation coefficient was excellent for distractors (0.86-0.99) and good to excellent for communication/teamwork (0.45-0.99). After 3 h of surgery, Cohen's κ was 0.78-0.93 for distractors, and 0.79-1 for communication/teamwork. The observational method developed allows a single observer to simultaneously assess distractors and communication/teamwork. Even for long procedures, high interobserver agreement can be achieved. Data collected with this method allow for investigating separate or combined effects of distractions and communication/teamwork on surgical performance and patient outcomes. Published by the

  14. Efficiency of Event-Based Sampling According to Error Energy Criterion

    Directory of Open Access Journals (Sweden)

    Marek Miskowicz

    2010-03-01

    Full Text Available The paper belongs to the studies that deal with the effectiveness of the particular event-based sampling scheme compared to the conventional periodic sampling as a reference. In the present study, the event-based sampling according to a constant energy of sampling error is analyzed. This criterion is suitable for applications where the energy of sampling error should be bounded (i.e., in building automation, or in greenhouse climate monitoring and control). Compared to the integral sampling criteria, the error energy criterion gives more weight to extreme sampling error values. The proposed sampling principle extends a range of event-based sampling schemes and makes the choice of particular sampling criterion more flexible to application requirements. In the paper, it is proved analytically that the proposed event-based sampling criterion is more effective than the periodic sampling by a factor defined by the ratio of the maximum to the mean of the cubic root of the signal time-derivative square in the analyzed time interval. Furthermore, it is shown that the sampling according to energy criterion is less effective than the send-on-delta scheme but more effective than the sampling according to integral criterion. On the other hand, it is indicated that higher effectiveness in sampling according to the selected event-based criterion is obtained at the cost of increasing the total sampling error defined as the sum of errors for all the samples taken.

  15. Efficiency of event-based sampling according to error energy criterion.

    Science.gov (United States)

    Miskowicz, Marek

    2010-01-01

    The paper belongs to the studies that deal with the effectiveness of the particular event-based sampling scheme compared to the conventional periodic sampling as a reference. In the present study, the event-based sampling according to a constant energy of sampling error is analyzed. This criterion is suitable for applications where the energy of sampling error should be bounded (i.e., in building automation, or in greenhouse climate monitoring and control). Compared to the integral sampling criteria, the error energy criterion gives more weight to extreme sampling error values. The proposed sampling principle extends a range of event-based sampling schemes and makes the choice of particular sampling criterion more flexible to application requirements. In the paper, it is proved analytically that the proposed event-based sampling criterion is more effective than the periodic sampling by a factor defined by the ratio of the maximum to the mean of the cubic root of the signal time-derivative square in the analyzed time interval. Furthermore, it is shown that the sampling according to energy criterion is less effective than the send-on-delta scheme but more effective than the sampling according to integral criterion. On the other hand, it is indicated that higher effectiveness in sampling according to the selected event-based criterion is obtained at the cost of increasing the total sampling error defined as the sum of errors for all the samples taken.
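    The effectiveness factor stated in the abstract (the ratio of the maximum to the mean of the cubic root of the squared time derivative) can be approximated numerically on sampled derivatives. The discretization below is a sketch of that formula, not the paper's derivation.

    ```python
    import math

    def effectiveness_factor(deriv_samples):
        """Gain of error-energy event-based sampling over periodic
        sampling, per the stated result:
            max |x'(t)|^(2/3) / mean |x'(t)|^(2/3)
        i.e. the cubic root of the squared derivative, approximated
        here on discrete derivative samples."""
        g = [abs(d) ** (2.0 / 3.0) for d in deriv_samples]
        return max(g) / (sum(g) / len(g))

    # For x(t) = sin(t), x'(t) = cos(t): the factor exceeds 1, so the
    # event-based scheme needs fewer samples for the same error energy.
    deriv = [math.cos(2 * math.pi * k / 1000) for k in range(1000)]
    gain = effectiveness_factor(deriv)
    ```

    For a signal with a constant-magnitude derivative the factor is exactly 1, reflecting that event-based sampling offers no advantage over periodic sampling in that degenerate case.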

  16. Effects of GPR antenna configuration on subpavement drain detection based on the frequency-shift phenomenon

    Science.gov (United States)

    Bai, Hao; Sinfield, Joseph V.

    2017-11-01

    The water and clay content of subsurface soil can significantly influence the detection results obtained from ground penetrating radar (GPR). Due to the variation of the material properties underground, the center frequency of transmitted GPR signals shifts to a lower range as wave attenuation increases. Examination of wave propagation in the subsurface employing an attenuation filter based on a linear system model shows that received GPR signals will be shifted to lower frequencies than those originally transmitted. The amount of the shift is controlled by a wave attenuation factor, which is determined by the dielectric constant, electric conductivity, and magnetic susceptibility of the transmitted medium. This paper introduces a receiver-transmitter-receiver dual-frequency configuration for GPR that employs two operational frequencies for a given test - one higher and one slightly lower - to take advantage of this phenomenon to improve subpavement drain detection results. In this configuration, the original signal is transmitted from the higher frequency transmitter. After traveling through underground materials, the signal is received by two receivers with different frequencies. One of the receivers has the same higher center frequency as the transmitter, and the other receiver has a lower center frequency. This configuration can be expressed as Rx(low-frequency)-Tx(high-frequency)-Rx(high-frequency) and was applied in both laboratory experiments and field tests. Results are analyzed in the frequency domain to evaluate and compare the properties of the signal obtained by both receivers. The laboratory experiment used the configuration of Rx(400MHz)-Tx(900MHz)-Rx(900MHz). The field tests, in addition to the configuration used in the lab tests, employed another configuration of Rx(270MHz)-Tx(400MHz)-Rx(400MHz) to obtain more information about this phenomenon. Both lab and field test results illustrate the frequency-shift phenomenon described by theoretical
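    The downward shift of the received centre frequency can be reproduced with the linear-system model the abstract describes: multiply a transmitted amplitude spectrum by a frequency-dependent attenuation factor and compare spectral centroids. The Gaussian spectrum and attenuation coefficient below are illustrative assumptions, not calibrated to any soil.

    ```python
    import math

    def spectral_centroid(freqs, spectrum):
        """Amplitude-weighted mean frequency of a spectrum."""
        total = sum(spectrum)
        return sum(f * s for f, s in zip(freqs, spectrum)) / total

    # Transmitted 900 MHz Gaussian spectrum through an exp(-a*f)
    # attenuation filter: the received centroid shifts to lower
    # frequency, motivating the lower-frequency second receiver.
    freqs = [i * 10.0 for i in range(1, 201)]               # 10..2000 MHz
    tx = [math.exp(-((f - 900.0) ** 2) / (2 * 300.0 ** 2)) for f in freqs]
    atten = [math.exp(-0.0015 * f) for f in freqs]          # wave attenuation
    rx = [t * a for t, a in zip(tx, atten)]
    shift = spectral_centroid(freqs, tx) - spectral_centroid(freqs, rx)
    ```

    With these toy numbers the centroid drops by roughly 100-150 MHz, which is the phenomenon the Rx(low)-Tx(high)-Rx(high) configuration exploits.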

  17. Managing wildfire events: risk-based decision making among a group of federal fire managers

    Science.gov (United States)

    Robyn S. Wilson; Patricia L. Winter; Lynn A. Maguire; Timothy. Ascher

    2011-01-01

    Managing wildfire events to achieve multiple management objectives involves a high degree of decision complexity and uncertainty, increasing the likelihood that decisions will be informed by experience-based heuristics triggered by available cues at the time of the decision. The research reported here tests the prevalence of three risk-based biases among 206...

  18. Full-waveform detection of non-impulsive seismic events based on time-reversal methods

    Science.gov (United States)

    Solano, Ericka Alinne; Hjörleifsdóttir, Vala; Liu, Qinya

    2017-12-01

    We present a full-waveform detection method for non-impulsive seismic events, based on time-reversal principles. We use the strain Green's tensor as a matched filter, correlating it with continuous observed seismograms, to detect non-impulsive seismic events. We show that this is mathematically equivalent to an adjoint method for detecting earthquakes. We define the detection function, a scalar-valued function, which depends on the stacked correlations for a group of stations. Event detections are given by the times at which the amplitude of the detection function exceeds a given value relative to the noise level. The method can make use of the whole seismic waveform or any combination of time windows with different filters. It is expected to have an advantage over traditional detection methods for events that do not produce energetic and impulsive P waves, for example glacial events, landslides, volcanic events and transform-fault earthquakes, provided the velocity structure along the path is relatively well known. Furthermore, the method has advantages over empirical Green's function template-matching methods, as it does not depend on records from previously detected events, and therefore is not limited to events occurring in similar regions and with similar focal mechanisms as those events. The method is not specific to any particular way of calculating the synthetic seismograms, and therefore complicated structural models can be used. This is particularly beneficial for intermediate-size events that are registered on regional networks, for which the effect of lateral structure on the waveforms can be significant. To demonstrate the feasibility of the method, we apply it to two different areas located along the mid-oceanic ridge system west of Mexico where non-impulsive events have been reported. The first study area is between the Clipperton and Siqueiros transform faults (9°N), during the time of two earthquake swarms, occurring in March 2012 and May
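    The core mechanics of the detection function can be sketched as a single-trace matched filter: slide a template along a continuous record, and declare a detection wherever the correlation exceeds a noise-relative threshold. This is a toy stand-in for the strain Green's tensor filter, without the station stacking; the template, noise level, and threshold are assumptions.

    ```python
    import random

    def detection_function(template, record):
        """Correlation of a waveform template with a continuous record
        at every lag. Detections are lags where the curve exceeds a
        threshold set relative to the noise level."""
        n = len(template)
        return [sum(t * record[lag + i] for i, t in enumerate(template))
                for lag in range(len(record) - n + 1)]

    random.seed(1)
    template = [0.0, 0.5, 1.0, 0.5, -0.5, -1.0, -0.5, 0.0]  # toy emergent wavelet
    record = [random.gauss(0, 0.05) for _ in range(300)]
    for i, t in enumerate(template):                        # bury one event at lag 150
        record[150 + i] += t

    d = detection_function(template, record)
    noise_level = max(abs(v) for v in d[:100])              # event-free early segment
    detections = [lag for lag, v in enumerate(d) if v > 3 * noise_level]
    ```

    The paper's method replaces the template with synthetic strain Green's tensor seismograms and stacks such correlation curves over a station group before thresholding.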

  19. Ground roll wave suppression based on wavelet frequency division and radial trace transform

    Science.gov (United States)

    Wang, Wan-Li; Yang, Wu-Yang; Wei, Xin-Jian; He, Xin

    2017-03-01

    Ground roll waves interfere with seismic data. The suppression of ground roll waves based on the division of wavelet frequencies considers the low-frequency characteristics of ground roll waves. However, this method will not be effective when the ground roll wave and the effective signal have the same frequency bands because of overlapping. The radial trace transform (RTT) considers the apparent velocity difference between the effective signal and the ground roll wave to suppress the latter, but affects the low-frequency components of the former. This study proposes a ground roll wave suppression method by combining the wavelet frequency division and the RTT based on the difference between the ground roll wave velocity and the effective signal and their energy difference in the wavelet domain, thus making full use of the advantages of both methods. First, we decompose the seismic data into different frequency bands through wavelet transform. Second, the RTT and low-cut filtering are applied to the low-frequency band, where the ground roll waves appear. Third, we reconstruct the seismic record without ground roll waves by using the inverse RTT and the remaining frequency bands. The proposed method not only improves the ground roll wave suppression, but also protects the signal integrity. The numerical simulation and real seismic data processing results suggest that the proposed method has a strong ability to denoise while preserving the amplitude.

  20. Regional frequency analysis of extreme rainfall in Belgium based on radar estimates

    Directory of Open Access Journals (Sweden)

    E. Goudenhoofdt

    2017-10-01

    Full Text Available In Belgium, only rain gauge time series have been used so far to study extreme rainfall at a given location. In this paper, the potential of a 12-year quantitative precipitation estimation (QPE from a single weather radar is evaluated. For the period 2005–2016, 1 and 24 h rainfall extremes from automatic rain gauges and collocated radar estimates are compared. The peak intensities are fitted to the exponential distribution using regression in Q-Q plots with a threshold rank which minimises the mean squared error. A basic radar product used as reference exhibits unrealistic high extremes and is not suitable for extreme value analysis. For 24 h rainfall extremes, which occur partly in winter, the radar-based QPE needs a bias correction. A few missing events are caused by the wind drift associated with convective cells and strong radar signal attenuation. Differences between radar and gauge rainfall values are caused by spatial and temporal sampling, gauge underestimations and radar errors. Nonetheless the fit to the QPE data is within the confidence interval of the gauge fit, which remains large due to the short study period. A regional frequency analysis for 1 h duration is performed at the locations of four gauges with 1965–2008 records using the spatially independent QPE data in a circle of 20 km. The confidence interval of the radar fit, which is small due to the sample size, contains the gauge fit for the two closest stations from the radar. In Brussels, the radar extremes are significantly higher than the gauge rainfall extremes, but similar to those observed by an automatic gauge during the same period. The extreme statistics exhibit slight variations related to topography. The radar-based extreme value analysis can be extended to other durations.
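    The fitting step described here (exponential distribution fitted by regression in a Q-Q plot above a rank threshold) can be sketched directly. The version below fixes the threshold rank rather than searching for the rank that minimises the mean squared error, and the synthetic intensities are illustrative.

    ```python
    import math
    import random

    def exponential_qq_fit(values, threshold_rank):
        """Least-squares line through a Q-Q plot of the upper order
        statistics against exponential quantiles -ln(1 - p); the slope
        estimates the exponential scale parameter."""
        tail = sorted(values)[threshold_rank:]
        n = len(values)
        # plotting positions for the retained order statistics
        x = [-math.log(1.0 - (threshold_rank + i + 0.5) / n) for i in range(len(tail))]
        mx, my = sum(x) / len(x), sum(tail) / len(tail)
        slope = (sum((a - mx) * (b - my) for a, b in zip(x, tail))
                 / sum((a - mx) ** 2 for a in x))
        return slope, my - slope * mx  # (scale, intercept)

    # Synthetic 1 h peak intensities, true exponential scale 12 (mm/h):
    random.seed(2)
    sample = [random.expovariate(1 / 12.0) for _ in range(2000)]
    scale, intercept = exponential_qq_fit(sample, threshold_rank=1600)
    ```

    On real radar QPE data the same regression is repeated over candidate threshold ranks, and the rank giving the smallest mean squared error of the fit is kept.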

  2. On the predictability of volcano-tectonic events by low frequency seismic noise analysis at Teide-Pico Viejo volcanic complex, Canary Islands

    Directory of Open Access Journals (Sweden)

    M. Tárraga

    2006-01-01

    Full Text Available The island of Tenerife (Canary Islands, Spain) is showing possible signs of reawakening after its last basaltic strombolian eruption, dated 1909 at Chinyero. The main concern relates to the central active volcanic complex Teide - Pico Viejo, which poses serious hazards to the properties and population of the island, and which has erupted several times during the last 5000 years, including a subplinian phonolitic eruption (Montaña Blanca) about 2000 years ago. In this paper we show the presence of low-frequency seismic noise, which possibly includes tremor of volcanic origin, and we investigate the feasibility of using it to forecast, via the material failure forecast method, the time of occurrence of discrete events that could be called Volcano-Tectonic or simply Tectonic (i.e. non-volcanic) on the basis of their relationship to volcanic activity. To avoid subjectivity in the forecast procedure, an automatic program has been developed to generate forecasts, validated by Bayes' theorem. A parameter called 'forecast gain' measures, for the first time quantitatively, what is gained in probabilistic terms by applying the (automatic) failure forecast method. The clear correlation between the obtained forecasts and the occurrence of (Volcano-Tectonic) seismic events - a clear indication of a relationship between the continuous seismic noise and the discrete seismic events - is the explanation for the high value of this 'forecast gain' in both 2004 and 2005, and an indication that the events are Volcano-Tectonic rather than purely Tectonic.
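
    The material failure forecast method referred to above can be illustrated with its simplest form, in which the inverse of the precursor rate decays linearly and crosses zero at the event time. The rate history below is synthetic and purely illustrative.

```python
import numpy as np

# Synthetic precursor: a seismic-noise rate accelerating toward an event at
# t_f = 100 s following the inverse-rate form rate(t) = C / (t_f - t).
t_f_true = 100.0
t = np.linspace(0.0, 80.0, 200)        # observation window before the event
rate = 5.0 / (t_f_true - t)

# Failure forecast method: 1/rate decays linearly and crosses zero at t_f,
# so a straight-line fit to the inverse rate forecasts the event time.
slope, intercept = np.polyfit(t, 1.0 / rate, 1)
t_f_forecast = -intercept / slope
```

    In practice the rate data are noisy, which is why the paper wraps this extrapolation in an automatic procedure validated with Bayes' theorem.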

  3. Evaluation of Distributive Frequency of Oral Contraceptive Pills Consumption in Women with Cerebrovascular Events Admitted in Farshchian Hospital of Hamadan between 1997-2007

    Directory of Open Access Journals (Sweden)

    Mehrdokht Mazdeh

    2011-06-01

    Full Text Available Background & Objectives: Although not much time has elapsed since the widespread introduction of oral contraceptive pills (OCP), case reports have demonstrated the occurrence of pulmonary embolism and cerebral infarction in women using these pills. The present study was conducted to determine the frequency of OCP consumption among women with cerebrovascular events admitted to Farshchian hospital of Hamadan between 1997 and 2007. Materials & Methods: Every woman with a cerebrovascular event admitted to Farshchian hospital of Hamadan during 1997-2007 whose dossier was available in the hospital archive was carefully checked, and those who did not meet the exclusion criteria, a total of 1587, were included in this study. Characteristics such as type of cerebrovascular event, age, type of oral contraceptive pill and duration of pill use were extracted from the patient dossiers and registered in the respective checklist. Results: 24.1% of patients used oral contraceptive pills and 76.9% were non-users. The mean age of both OCP users and non-users was 45 years. The mean duration of pill use among these patients was 33 months. Regarding the type of vascular event, ischemic stroke occurred in 73.1% of OCP users and 66.4% of non-users, a statistically significant difference. Hemorrhagic stroke occurred in 24.6% of OCP users and 29.1% of non-users, and sagittal sinus thrombosis in 2.3% of OCP users and 4.5% of non-users, with no significant difference. Among OCP users, 85% of the patients used low-dose (LD) OCP and 15% high-dose (HD) OCP. Conclusion: The present study showed that the rate of ischemic stroke among patients with OCP consumption was significantly higher than among non-users.

  4. Neural correlates of attentional and mnemonic processing in event-based prospective memory

    Directory of Open Access Journals (Sweden)

    Justin B Knight

    2010-02-01

    Full Text Available Prospective memory, or memory for realizing delayed intentions, was examined with an event-based paradigm while simultaneously measuring neural activity with high-density EEG recordings. Specifically, the neural substrates of monitoring for an event-based cue were examined, as well as those perhaps associated with the cognitive processes supporting detection of cues and fulfillment of intentions. Participants engaged in a baseline lexical decision task (LDT), followed by an LDT with an embedded prospective memory (PM) component. Event-based cues were constituted by color and lexicality (red words). Behavioral data provided evidence that monitoring, or preparatory attentional processes, was used to detect cues. Analysis of the event-related potentials (ERPs) revealed visual attentional modulations at 140 and 220 ms post-stimulus associated with preparatory attentional processes. In addition, ERP components at 220, 350, and 400 ms post-stimulus were enhanced for intention-related items. Our results suggest that preparatory attention may operate by selectively modulating the processing of features related to a previously formed event-based intention, and they provide further evidence for the proposal that dissociable component processes support the fulfillment of delayed intentions.

  5. A process mining-based investigation of adverse events in care processes.

    Science.gov (United States)

    Caron, Filip; Vanthienen, Jan; Vanhaecht, Kris; Van Limbergen, Erik; Deweerdt, Jochen; Baesens, Bart

    2014-01-01

    This paper proposes the Clinical Pathway Analysis Method (CPAM) approach that enables the extraction of valuable organisational and medical information on past clinical pathway executions from the event logs of healthcare information systems. The method deals with the complexity of real-world clinical pathways by introducing a perspective-based segmentation of the date-stamped event log. CPAM enables the clinical pathway analyst to effectively and efficiently acquire a profound insight into the clinical pathways. By comparing the specific medical conditions of patients with the factors used for characterising the different clinical pathway variants, the medical expert can identify the best therapeutic option. Process mining-based analytics enables the acquisition of valuable insights into clinical pathways, based on the complete audit traces of previous clinical pathway instances. Additionally, the methodology is suited to assess guideline compliance and analyse adverse events. Finally, the methodology provides support for eliciting tacit knowledge and providing treatment selection assistance.

  6. Seismology-based early identification of dam-formation landquake events.

    Science.gov (United States)

    Chao, Wei-An; Zhao, Li; Chen, Su-Chin; Wu, Yih-Min; Chen, Chi-Hsuan; Huang, Hsin-Hua

    2016-01-12

    Flooding resulting from the bursting of dams formed by landquake events such as rock avalanches, landslides and debris flows can lead to serious bank erosion and inundation of populated areas near rivers. Seismic waves can be generated by landquake events, which can be described as time-dependent forces (unloading/reloading cycles) acting on the Earth. In this study, we conduct inversions of long-period (LP, period ≥20 s) waveforms for the landquake force histories (LFHs) of ten events, which provide quantitative characterization of the initiation, propagation and termination stages of the slope failures. When the results obtained from LP waveforms are analyzed together with high-frequency (HF, 1-3 Hz) seismic signals, we find a relatively strong late-arriving seismic phase (dubbed the dam-forming phase or D-phase) recorded clearly in the HF waveforms at the closest stations, which potentially marks the time when the collapsed masses slide into the river and perhaps even impact the topographic barrier on the opposite bank. Consequently, the approach to analyzing the LP and HF waveforms developed in this study has a high potential for identifying the five dam-forming landquake events (DFLEs) in near real-time using broadband seismic records, which can provide timely warnings of impending floods to downstream residents.

  7. Pinning cluster synchronization in an array of coupled neural networks under event-based mechanism.

    Science.gov (United States)

    Li, Lulu; Ho, Daniel W C; Cao, Jinde; Lu, Jianquan

    2016-04-01

    Cluster synchronization is a typical collective behavior in coupled dynamical systems, where synchronization occurs within each group while there is no synchronization among different groups. In this paper, pinning cluster synchronization in an array of coupled neural networks is studied under an event-based mechanism. A new event-triggered sampled-data transmission strategy, in which only local and event-triggering states are utilized to update the broadcast state of each agent, is proposed to realize cluster synchronization of the coupled neural networks. Furthermore, a self-triggered pinning cluster synchronization algorithm is proposed, and a set of iterative procedures is given to compute the event-triggered time instants, which reduces the computational load significantly. Finally, an example is given to demonstrate the effectiveness of the theoretical results.
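
    The event-triggered transmission idea above (an agent re-broadcasts its state only when it has drifted sufficiently from its last broadcast value) can be sketched for a single cluster of linearly coupled scalar agents; the graph, gains and threshold are illustrative assumptions, not the paper's neural-network model.

```python
import numpy as np

A = np.array([[0., 1., 0., 1.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [1., 0., 1., 0.]])        # ring coupling graph (one cluster)
deg = A.sum(axis=1)
x = np.array([1.0, -2.0, 3.0, 0.5])     # agent states
x_hat = x.copy()                        # last broadcast states
eps, threshold, broadcasts = 0.1, 0.05, 0

for _ in range(300):
    # Each agent is driven only by broadcast states, not true neighbour states.
    x = x + eps * (A @ x_hat - deg * x_hat)
    # Event trigger: re-broadcast only when the local error exceeds the threshold.
    stale = np.abs(x - x_hat) > threshold
    x_hat[stale] = x[stale]
    broadcasts += int(stale.sum())

spread = x.max() - x.min()
```

    The states reach practical consensus (a spread on the order of the trigger threshold) while broadcasting far fewer than the 1200 messages that per-step transmission would require.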

  8. Frequency domain indirect identification of AMB rotor systems based on fictitious proportional feedback gain

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Hyeong Joon [Dept. of Mechanical Engineering, Soongsil University, Seoul (Korea, Republic of); Kim, Chan Jung [Dept. of Mechanical Design Engineering, Pukyong National University, Busan(Korea, Republic of)

    2016-12-15

    It is very difficult to directly identify an unstable system with uncertain dynamics from frequency domain input-output data. In such cases, closed-loop frequency responses calculated using a fictitious feedback can be more identifiable than open-loop data. This paper presents a frequency domain indirect identification of AMB rotor systems based on a fictitious proportional feedback gain (FPFG). The closed-loop effect of the FPFG can enhance the detectability of the system by moving the system poles and can significantly weight the target mode in the frequency domain. The effectiveness of the proposed identification method was verified through the frequency domain identification of active magnetic bearing rotor systems.
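
    The indirect identification step can be illustrated with a scalar unstable plant: a fictitious proportional gain k turns the poorly behaved open-loop FRF into a well-behaved closed-loop one, and the algebra is then inverted to recover the open-loop response. The plant and gain below are illustrative assumptions.

```python
import numpy as np

# Unstable plant G(s) = 1/(s - 1): hard to identify directly in open loop.
# A fictitious proportional feedback gain k stabilises it on paper:
#   G_cl = G / (1 + k*G),   and hence   G = G_cl / (1 - k*G_cl).
k = 5.0
omega = np.logspace(-1, 2, 50)
s = 1j * omega
G_true = 1.0 / (s - 1.0)

G_cl = G_true / (1.0 + k * G_true)      # well-behaved closed-loop FRF (pole at s = -4)
G_recovered = G_cl / (1.0 - k * G_cl)   # indirect identification step
```

    In the paper the closed-loop FRF is what gets identified from data; here the inversion is verified against the known plant.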

  9. EEMD-MUSIC-Based Analysis for Natural Frequencies Identification of Structures Using Artificial and Natural Excitations

    Directory of Open Access Journals (Sweden)

    David Camarena-Martinez

    2014-01-01

    Full Text Available This paper presents a new EEMD-MUSIC-based (ensemble empirical mode decomposition and multiple signal classification) methodology to identify modal frequencies in structures from free and ambient vibration signals produced by artificial and natural excitations, also considering several factors, such as nonstationary effects, close modal frequencies, and noisy environments, which are common situations where several techniques reported in the literature fail. The EEMD and MUSIC methods are used to decompose the vibration signal into a set of IMFs (intrinsic mode functions) and to identify the natural frequencies of a structure, respectively. The effectiveness of the proposed methodology has been validated and tested with synthetic signals and under real operating conditions. The experiments are focused on extracting the natural frequencies of a truss-type scaled structure and of a bridge used for both highway traffic and pedestrians. Results show the proposed methodology to be a suitable solution for the identification of natural frequencies of structures from free and ambient vibration signals.
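
    The MUSIC stage of the methodology can be sketched on a synthetic two-mode signal with close natural frequencies; the EEMD decomposition is omitted here, and all signal parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 100.0
t = np.arange(400) / fs
# Two close modal frequencies in noise: the regime where plain FFT peak-picking
# struggles and MUSIC's noise-subspace pseudospectrum helps.
x = (np.sin(2 * np.pi * 10.0 * t) + 0.8 * np.sin(2 * np.pi * 11.5 * t)
     + 0.3 * rng.standard_normal(t.size))

m = 40                                    # covariance dimension
snaps = np.lib.stride_tricks.sliding_window_view(x, m)
R = snaps.T @ snaps / snaps.shape[0]      # sample covariance matrix

# Signal subspace: 2 eigenvectors per real sinusoid -> 4; the rest span noise.
_, eigvecs = np.linalg.eigh(R)            # eigenvalues sorted ascending
E_noise = eigvecs[:, :-4]

f_grid = np.linspace(5.0, 20.0, 1501)
a = np.exp(2j * np.pi * np.outer(np.arange(m), f_grid) / fs)  # steering vectors
pseudo = 1.0 / np.sum(np.abs(E_noise.conj().T @ a) ** 2, axis=0)

# The two tallest local maxima of the pseudospectrum estimate the modal frequencies.
is_peak = (pseudo[1:-1] > pseudo[:-2]) & (pseudo[1:-1] > pseudo[2:])
peak_f = f_grid[1:-1][is_peak]
peak_h = pseudo[1:-1][is_peak]
modes = np.sort(peak_f[np.argsort(peak_h)[-2:]])
```

    A 40-sample FFT would have a resolution of 2.5 Hz and could not separate these 1.5 Hz-apart modes; the subspace pseudospectrum can.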

  10. Application of Model Based Parameter Estimation for RCS Frequency Response Calculations Using Method of Moments

    Science.gov (United States)

    Reddy, C. J.

    1998-01-01

    An implementation of the Model Based Parameter Estimation (MBPE) technique is presented for obtaining the frequency response of the Radar Cross Section (RCS) of arbitrarily shaped, three-dimensional perfect electric conductor (PEC) bodies. An Electric Field Integral Equation (EFIE) is solved using the Method of Moments (MoM) to compute the RCS. The electric current is expanded in a rational function and the coefficients of the rational function are obtained using the frequency derivatives of the EFIE. Using the rational function, the electric current on the PEC body is obtained over a frequency band. Using the electric current at different frequencies, the RCS of the PEC body is obtained over a wide frequency band. Numerical results for a square plate, a cube, and a sphere are presented over a bandwidth. Good agreement between MBPE and the exact solution over the bandwidth is observed.
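
    The MBPE idea of fitting a low-order rational function to a few expensive frequency samples and then sweeping the cheap model across the band can be sketched as follows; a simple resonance stands in for the MoM solution, and the fit uses plain least squares on samples rather than the paper's frequency derivatives of the EFIE.

```python
import numpy as np

def resonance(x):
    # "Measured" frequency response at normalized frequency x (a rational
    # function standing in for expensive MoM solutions at sample frequencies).
    return 1.0 / (1.0 - x**2 + 0.1j * x)

x_s = np.linspace(0.5, 1.5, 6)          # sparse sample frequencies
H_s = resonance(x_s)

# Rational model H(x) ~ (p0 + p1*x) / (1 + q1*x + q2*x^2), linearized as
# p0 + p1*x - H*(q1*x + q2*x^2) = H and solved by least squares.
M = np.column_stack([np.ones_like(x_s, dtype=complex), x_s,
                     -H_s * x_s, -H_s * x_s**2])
p0, p1, q1, q2 = np.linalg.lstsq(M, H_s, rcond=None)[0]

x_d = np.linspace(0.5, 1.5, 201)        # dense band swept with the cheap model
H_model = (p0 + p1 * x_d) / (1.0 + q1 * x_d + q2 * x_d**2)
err = np.max(np.abs(H_model - resonance(x_d)))
```

    Six expensive samples suffice to reproduce the whole band because the underlying response is (nearly) rational in frequency, which is the premise of MBPE.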

  11. Analyzing mobile WiMAX base station deployment under different frequency planning strategies

    Science.gov (United States)

    Salman, M. K.; Ahmad, R. B.; Ali, Ziad G.; Aldhaibani, Jaafar A.; Fayadh, Rashid A.

    2015-05-01

    The frequency spectrum is a precious and scarce resource in the communication market. Therefore, different techniques are adopted to utilize the available spectrum when deploying WiMAX base stations (BS) in cellular networks. In this paper several types of frequency planning techniques are illustrated, and a comprehensive comparative study between the conventional frequency reuse of 1 (FR of 1) and fractional frequency reuse (FFR) is presented. These techniques are widely used in network deployment because they employ universal frequency reuse (using all the available bandwidth) in base station installation/configuration within the network. This paper presents a network model of 19 base stations for the comparison of the aforesaid frequency planning techniques. Users are randomly distributed within the base stations, and users' resource mapping and burst profile selection are based on the measured signal-to-interference-plus-noise ratio (SINR). Simulation results reveal that FFR has advantages over the conventional FR of 1 in various metrics. 98% of downlink resources (slots) are exploited when FFR is applied, versus 81% for FR of 1. The data rate increases to 10.6 Mbps under FFR, compared with 7.98 Mbps under FR of 1. The spectral efficiency is better (1.072 bps/Hz) at FR of 1 than under FFR (0.808 bps/Hz), since FR of 1 exploits all the bandwidth. The subcarrier efficiency shows how many data bits can be carried by subcarriers under different frequency planning techniques; the system can carry more data bits under FFR (2.40 bit/subcarrier) than under FR of 1 (1.998 bit/subcarrier). This study confirms that FFR can perform better than conventional frequency planning (FR of 1), which makes it a strong candidate for WiMAX BS deployment in cellular networks.

  12. Simulation of stress-modulated magnetization precession frequency in Heusler-based spin torque oscillator

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Houbing, E-mail: hbhuang@ustb.edu.cn; Zhao, Congpeng; Ma, Xingqiao, E-mail: xqma@sas.ustb.edu.cn

    2017-03-15

    We investigated the stress-modulated magnetization precession frequency in a Heusler-based spin transfer torque oscillator by combining micromagnetic simulations with phase field microelasticity theory, encapsulating the magnetic tunnel junction into multilayer structures. We propose a novel method of using an external stress, instead of an external magnetic field, to control the magnetization precession in a spin torque oscillator. The magnetization precession frequency can be linearly modulated by an externally applied uniaxial in-plane stress, with a tunable range of 4.4–7.0 GHz under a stress of 10 MPa. By comparison, out-of-plane stress has a negligible influence on the precession frequency due to the large out-of-plane demagnetization field. The results offer new inspiration for the design of spin torque oscillator devices that simultaneously provide high frequency, a narrow output band, and tunability over a wide range of frequencies via external stress. - Highlights: • We proposed stress-modulated magnetization precession in spin torque oscillators. • The magnetization precession frequency can be linearly modulated by in-plane stress. • The stress can also widen the magnetization frequency range to 4.4–7.0 GHz. • Stress-modulated oscillation frequency can simplify STO devices.

  13. Study on frequency control method for DBR laser diode based on FPGA

    Science.gov (United States)

    Hao, Jiepeng; Hong, Jintao; Chen, Linlin; Lei, Guanqun; Zhou, Binquan

    2015-10-01

    In recent years, the distributed Bragg reflector (DBR) laser diode (LD) has been widely used in precision measurement, optical information processing, quantum research and other fields thanks to its small size, high efficiency and low power consumption. Precision measurement places strict requirements on the output frequency of the DBR laser diode, so controlling the laser frequency accurately is of great significance. Currently, most frequency control schemes for laser diodes stabilize the frequency through an external system, the drawback of which is that it is not conducive to system integration. This paper therefore proposes an FPGA-based method for controlling the output frequency of the laser diode; the main purpose of the control is to study the frequency characteristics of the laser diode. An FPGA chip is used as a microcontroller and, combined with a PID control algorithm, constitutes a closed-loop control circuit. The control algorithm is programmed into the FPGA device, which maximizes the operating speed of the control system. For a different laser frequency, only the control parameters need to be modified to achieve stable control of the light source. Tests show that, near the operating temperature of the laser diode, the temperature stability is better than ±0.01°C; as a result, the laser frequency stability can be controlled to 0.1%.
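
    The FPGA-hosted PID loop described above can be sketched in software against a crude first-order thermal model of the laser mount; the plant constants, gains and setpoint are illustrative assumptions, not values from the paper.

```python
# Discrete PID loop of the kind that would run in the FPGA fabric, here closing
# a loop around a toy first-order thermal plant (illustrative constants).
dt, tau, gain, t_amb = 0.01, 2.0, 1.5, 22.0   # step (s), time const, plant gain, ambient
kp, ki, kd = 4.0, 2.0, 0.05                   # PID gains
setpoint = 25.0                               # target temperature (deg C)

temp, integral, prev_err = t_amb, 0.0, setpoint - t_amb
for _ in range(20000):                        # 200 s of simulated time
    err = setpoint - temp
    integral += err * dt
    deriv = (err - prev_err) / dt
    u = kp * err + ki * integral + kd * deriv  # heater drive
    prev_err = err
    temp += dt * (-(temp - t_amb) + gain * u) / tau  # first-order plant update
```

    The integral term removes the steady-state error, which is what lets a real loop hold the mount temperature within hundredths of a degree.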

  14. Knowledge-Driven Event Extraction in Russian: Corpus-Based Linguistic Resources

    Directory of Open Access Journals (Sweden)

    Valery Solovyev

    2016-01-01

    Full Text Available Automatic event extraction from text is an important step in knowledge acquisition and knowledge base population. Manual work in the development of an extraction system is indispensable, either in corpus annotation or in the creation of vocabularies and patterns for a knowledge-based system. Recent works have focused on adapting existing systems (for extraction from English texts) to new domains. Event extraction in other languages has not been studied due to the lack of resources and algorithms necessary for natural language processing. In this paper we define a set of linguistic resources that are necessary in the development of a knowledge-based event extraction system in Russian: a vocabulary of subordination models, a vocabulary of event triggers, and a vocabulary of Frame Elements that are basic building blocks for semantic patterns. We propose a set of methods for the creation of such vocabularies in Russian and other languages using the Google Books NGram Corpus. The methods are evaluated in the development of an event extraction system for Russian.

  15. An event-based neurobiological recognition system with orientation detector for objects in multiple orientations

    Directory of Open Access Journals (Sweden)

    Hanyu Wang

    2016-11-01

    Full Text Available A new multiple-orientation event-based neurobiological recognition system, which integrates recognition and tracking functions, is proposed in this paper for asynchronous address-event representation (AER) image sensors. The system can recognize objects in multiple orientations using only training samples moving in a single orientation. It extracts multi-scale and multi-orientation line features inspired by models of the primate visual cortex. An orientation detector based on a modified Gaussian blob tracking algorithm is introduced for object tracking and orientation detection. The orientation detector and the feature extraction block work simultaneously, without any increase in categorization time. An address lookup table (address LUT) is also presented to adjust the feature maps by address mapping and reordering, and the features are categorized in the trained spiking neural network. The recognition system is evaluated on the MNIST dataset, which has played an important role in the development of computer vision, and the accuracy is increased owing to the use of both ON and OFF events. AER data acquired by a DVS, such as moving digits, pokers, and vehicles, are also tested on the system. The experimental results show that the proposed system can realize event-based multi-orientation recognition. The work presented in this paper makes a number of contributions to event-based vision processing for multi-orientation object recognition: it develops a new tracking-recognition architecture for the feedforward categorization system and an address-reordering approach to classify multi-orientation objects using event-based data, and it provides a new way to recognize objects in multiple orientations with training samples in only a single orientation.

  16. Limits on the Efficiency of Event-Based Algorithms for Monte Carlo Neutron Transport

    Energy Technology Data Exchange (ETDEWEB)

    Romano, Paul K.; Siegel, Andrew R.

    2017-04-16

    The traditional form of parallelism in Monte Carlo particle transport simulations, wherein each individual particle history is considered a unit of work, does not lend itself well to data-level parallelism. Event-based algorithms, which were originally used for simulations on vector processors, may offer a path toward better utilizing data-level parallelism in modern computer architectures. In this study, a simple model is developed for estimating the efficiency of the event-based particle transport algorithm under two sets of assumptions. Data collected from simulations of four reactor problems using OpenMC was then used in conjunction with the models to calculate the speedup due to vectorization as a function of two parameters: the size of the particle bank and the vector width. When each event type is assumed to have constant execution time, the achievable speedup is directly related to the particle bank size. We observed that the bank size generally needs to be at least 20 times greater than the vector size in order to achieve a vector efficiency greater than 90%. When the execution times for events are allowed to vary, however, the vector speedup is also limited by differences in execution time for events being carried out in a single event-iteration. For some problems, this implies that vector efficiencies over 50% may not be attainable. While there are many factors impacting performance of an event-based algorithm that are not captured by our model, it nevertheless provides insights into factors that may be limiting in a real implementation.
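
    The dependence of vector efficiency on bank size can be illustrated with a toy model under the constant-event-time assumption: surviving particles are repacked into fixed-width vectors each event-iteration, and idle lanes in partially filled vectors waste work. The survival probability and sizes below are illustrative assumptions, not the paper's measured OpenMC data.

```python
import math

def vector_efficiency(bank_size, vector_width, survival=0.9):
    """Toy event-based transport model: each event-iteration processes all
    surviving particles in vectors of fixed width; efficiency is the fraction
    of issued lanes doing useful work (constant per-event cost assumed)."""
    alive, useful, issued = float(bank_size), 0.0, 0.0
    while alive >= 1.0:
        lanes = math.ceil(alive / vector_width) * vector_width
        useful += alive                 # lanes occupied by real particles
        issued += lanes                 # lanes issued, including idle ones
        alive *= survival               # expected survivors of this event
    return useful / issued

e_small = vector_efficiency(32, 16)     # bank only 2x the vector width
e_large = vector_efficiency(2048, 16)   # bank 128x the vector width
```

    The drained tail of the bank, where a full vector is issued for a handful of particles, is what penalizes small banks, in line with the paper's observation that the bank must be much larger than the vector width.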

  17. Frequency weightings based on biodynamics of fingers-hand-arm system.

    Science.gov (United States)

    Dong, Ren G; Welcome, Daniel E; Wu, John Z

    2005-07-01

    The frequency weighting for assessing hand-transmitted vibration exposure is critical to obtaining a true dose-response relationship. Any valid weighting must have a solid theoretical foundation. The objectives of this study are to examine the biodynamic foundation for assessing the vibration exposure and to develop a set of biodynamic methods to formulate the frequency weightings for different anatomical locations of the fingers-hand-arm system. The vibration transmissibility measured on the fingers, hand, wrist, elbow, shoulder, and head was used to define the transmitted acceleration-based (TAB) frequency weighting. The apparent masses measured at the fingers and the palm of the hand were used to construct the biodynamic force-based (BFB) weightings. These weightings were compared with the ISO weighting specified in ISO 5349-1 (2001). The results of this study suggest that the frequency weightings for the vibration-induced problems at different anatomical locations of the hand-arm system can be basically divided into three groups: (a) the weighting for the fingers and hand, (b) the weighting for the wrist, elbow, and shoulder, and (c) the weighting for the head. The ISO weighting is highly correlated with the weighting for the second group but not with the first and third groups. The TAB and BFB finger weightings are quite different at frequencies lower than 100 Hz, but they show similar trends at higher frequencies. Both TAB and BFB finger weightings at frequencies higher than 20 Hz are greater than the ISO weighting.

  18. Forecasting dose-time profiles of solar particle events using a dosimetry-based forecasting methodology

    Science.gov (United States)

    Neal, John Stuart

    2001-10-01

    A dosimetry-based Bayesian methodology for forecasting astronaut radiation doses in deep space due to radiologically significant solar particle event proton fluences is developed. Three non-linear sigmoidal growth curves (Gompertz, Weibull, logistic) are used with hierarchical, non-linear, regression models to forecast solar particle event dose-time profiles from doses obtained early in the development of the event. Since there are no detailed measurements of dose versus time for actual events, surrogate dose data are provided by calculational methods. Proton fluence data are used as input to the deterministic, coupled neutron-proton space radiation computer code, BRYNTRN, for transporting protons and their reaction products (protons, neutrons, 2H, 3H, 3He, and 4He) through aluminum shielding material and water. Calculated doses and dose rates for ten historical solar particle events are used as the input data by grouping similar historical solar particle events, using asymptotic dose and maximum dose rate as the grouping criteria. These historical data are then used to lend strength to predictions of dose and dose rate-time profiles for new solar particle events. Bayesian inference techniques are used to make parameter estimates and predictive forecasts. Markov Chain Monte Carlo (MCMC) methods are used to sample from the posterior distributions. Hierarchical, non-linear regression models provide useful predictions of asymptotic dose and dose-time profiles for the November 8, 2000 and August 12, 1989 solar particle events. Predicted dose rate-time profiles are adequate for the November 8, 2000 solar particle event. Predictions of dose rate-time profiles for the August 12, 1989 solar particle event suffer due to a more complex dose rate-time profile. Forecasts provide a valuable tool to space operations planners when making recommendations concerning operations in which radiological exposure might jeopardize personal safety or mission completion. This work
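
    The use of a sigmoidal growth curve for a dose-time profile can be sketched with the Gompertz form D(t) = A*exp(-b*exp(-c*t)): a synthetic "event" is fitted from its early record by a simple grid search, standing in for the paper's hierarchical Bayesian/MCMC estimation, and all parameter values are illustrative assumptions.

```python
import numpy as np

# Synthetic Gompertz dose build-up; A is the asymptotic dose (arbitrary units).
A_true, b_true, c_true = 2.0, 5.0, 0.1
t = np.linspace(0.0, 24.0, 49)                  # early dose record (hours)
dose = A_true * np.exp(-b_true * np.exp(-c_true * t))

# Grid search over shape (b) and rate (c); the scale A is solved in closed form
# for each candidate pair by linear least squares.
best = (np.inf, None, None, None)
for b in np.linspace(1.0, 10.0, 91):
    for c in np.linspace(0.02, 0.3, 141):
        g = np.exp(-b * np.exp(-c * t))
        A = (dose @ g) / (g @ g)                # optimal scale for this (b, c)
        resid = np.sum((dose - A * g) ** 2)
        if resid < best[0]:
            best = (resid, A, b, c)

_, A_hat, b_hat, c_hat = best
# Forecast: the asymptotic dose an astronaut would accrue is simply A_hat.
```

    Fitting the curve from only the first 24 h of the event is what makes the asymptotic dose A a forecast rather than a post-hoc fit; the paper adds noise handling and posterior uncertainty via MCMC.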

  19. Application of energies of optimal frequency bands for fault diagnosis based on modified distance function

    Energy Technology Data Exchange (ETDEWEB)

    Zamanian, Amir Hosein [Southern Methodist University, Dallas (United States); Ohadi, Abdolreza [Amirkabir University of Technology (Tehran Polytechnic), Tehran (Iran, Islamic Republic of)

    2017-06-15

    Low-dimensional relevant feature sets are ideal to avoid extra data mining for classification. The current work investigates the feasibility of utilizing the energies of vibration signals in optimal frequency bands as features for machine fault diagnosis applications. Energies in different frequency bands were derived based on Parseval's theorem. The optimal feature sets were extracted by optimization of the related frequency bands using a genetic algorithm and a modified distance function (MDF). The frequency bands and the number of bands were optimized based on the MDF, which is designed to (a) maximize the distance between class centers, (b) minimize the dispersion of features within each class, and (c) minimize the dimension of the extracted feature sets. Experimental signals from two different gearboxes were used to demonstrate the efficiency of the presented technique. The results show the effectiveness of the presented technique in gear fault diagnosis applications.
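
    The Parseval-based band-energy features can be sketched as follows; the genetic-algorithm optimisation of the bands and the modified distance function are omitted, and the signal and band edges are illustrative assumptions.

```python
import numpy as np

fs = 1000.0
t = np.arange(1000) / fs
# Toy vibration signal with spectral lines at 50 Hz and 240 Hz.
vib = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 240 * t)

spec = np.fft.rfft(vib)
freqs = np.fft.rfftfreq(vib.size, 1 / fs)
# Parseval scaling for the rfft of a real signal of even length:
power = np.abs(spec) ** 2
power[1:-1] *= 2.0              # interior bins represent two conjugate FFT bins
power /= vib.size

# Band energies as features; candidate band edges (here fixed, in the paper
# chosen by a genetic algorithm with a modified distance function).
bands = [(0, 100), (100, 200), (200, 300), (300, 500)]
features = np.array([power[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands])
total_energy = np.sum(vib ** 2)
```

    Because of Parseval's theorem the band energies partition the total signal energy, so each feature is a well-defined share of it regardless of how the bands are later optimised.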

  20. Frequency hopping signal detection based on wavelet decomposition and Hilbert-Huang transform

    Science.gov (United States)

    Zheng, Yang; Chen, Xihao; Zhu, Rui

    2017-07-01

    Frequency hopping (FH) signals are widely adopted by military communications as a kind of low-probability-of-interception signal. Therefore, it is very important to study FH signal detection algorithms. Existing detection algorithms for FH signals based on time-frequency analysis cannot satisfy the time and frequency resolution requirements at the same time, due to the influence of the window function. To solve this problem, an algorithm based on wavelet decomposition and the Hilbert-Huang transform (HHT) is proposed. The proposed algorithm removes the noise of the received signals by wavelet decomposition and detects the FH signals by the Hilbert-Huang transform. Simulation results show that the proposed algorithm takes both the time resolution and the frequency resolution into account; correspondingly, the accuracy of FH signal detection can be improved.
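
    The HHT stage can be sketched on a clean two-hop FH signal: an FFT-based Hilbert transform yields the analytic signal, whose phase derivative gives the instantaneous (hop) frequency with per-sample time resolution. The wavelet-denoising stage and the EMD sifting are omitted, and all parameters are illustrative assumptions.

```python
import numpy as np

fs = 1000.0
n = 2048
t = np.arange(n) / fs
# Two-hop FH signal: 100 Hz for the first half, 250 Hz for the second.
f_inst_true = np.where(t < n / (2 * fs), 100.0, 250.0)
phase = 2 * np.pi * np.cumsum(f_inst_true) / fs
sig = np.cos(phase)

# Analytic signal via an FFT-based Hilbert transform.
spec = np.fft.fft(sig)
h = np.zeros(n)
h[0] = h[n // 2] = 1.0
h[1:n // 2] = 2.0              # double positive frequencies, drop negative ones
analytic = np.fft.ifft(spec * h)

# Instantaneous frequency = derivative of the unwrapped analytic phase.
inst_f = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)
hop1 = np.median(inst_f[100:900])       # estimate within the first dwell
hop2 = np.median(inst_f[1200:2000])     # estimate within the second dwell
```

    The medians are taken away from the hop edge, where the instantaneous frequency transiently overshoots; in the full algorithm, wavelet denoising makes this step robust to noise.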

  1. Tunable microwave-photonic filter using frequency-to-time mapping-based delay lines.

    Science.gov (United States)

    Mokhtari, Arash; Jamshidi, Kambiz; Preußler, Stefan; Zadok, Avi; Schneider, Thomas

    2013-09-09

    A new implementation of microwave-photonic filters (MPFs) based on tunable optical delay lines is proposed and demonstrated. The variable delay is based on mapping of the spectral components of an incoming waveform onto the time domain, the application of linearly-varying temporal phase offsets, and an inverse mapping back to the frequency domain. The linear phase correction is equivalent to a frequency offset, and is realized through suppressed-carrier single-sideband modulation by a radio-frequency sine wave. The variable delay element, controlled by the selected frequency, is used in one arm of a two-tap MPF. In a proof-of-concept experiment, the free spectral range (FSR) of the MPF was varied by over a factor of four: between 1.2 GHz and 5.3 GHz.
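    In a two-tap MPF the FSR is set by the differential delay between the arms (FSR = 1/τ), which is why a tunable delay line tunes the filter. A small numpy sketch of that relationship, using the 1.2 GHz figure from the abstract (the corresponding delay value is derived here, not reported in the paper):

```python
import numpy as np

def two_tap_response(freqs_hz, delay_s):
    """Magnitude response of a two-tap filter with one arm delayed by delay_s."""
    return np.abs(1 + np.exp(-2j * np.pi * freqs_hz * delay_s)) / 2

delay = 1 / 1.2e9                      # ~0.83 ns differential delay -> 1.2 GHz FSR
f = np.array([0.6e9, 1.2e9, 1.8e9])    # first null, first passband peak, next null
h = two_tap_response(f, delay)
```

Shrinking the delay by the factor of four reported in the experiment stretches the FSR from 1.2 GHz toward the 5.3 GHz figure.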

  2. A Method for Abstraction and Reproduction of Environment Based on Frequency Characteristic

    Science.gov (United States)

    Hyodo, Shoyo; Ohnishi, Kouhei

    With the advancement of information technology, technologies for recording and reproducing sound and images have matured, while recording and reproduction for haptic displays are still developing. This paper proposes a method for the abstraction and reproduction of environmental stiffness based on its frequency characteristic. In the proposed method, the position response of the environment to a force input is measured, and the Fourier transform of the position response divided by the Fourier transform of the force input yields the environmental frequency characteristic. A control system for the reproduction of the environment is designed from this abstracted frequency characteristic, with a finite impulse response (FIR) filter used to reproduce it. Finally, the validity of the proposed method is shown by experimental results.
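    A toy numpy sketch of the abstraction and reproduction steps: the frequency characteristic is the transformed position response divided by the transformed force input, and FIR taps for reproduction follow by inverse transform. The environment here is an invented 4-tap system, and a periodic (circular) excitation is assumed so the DFT division is exact:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 512
force = rng.standard_normal(n)                # measured force input
h_true = np.array([0.5, 0.3, 0.15, 0.05])     # hidden environment dynamics
# Simulated position response (circular convolution of force with h_true).
position = np.real(np.fft.ifft(np.fft.fft(force) * np.fft.fft(h_true, n)))

# Abstraction: environmental frequency characteristic H(w) = X(w) / F(w).
H = np.fft.fft(position) / np.fft.fft(force)

# Reproduction: FIR filter taps are the inverse transform of H.
fir_taps = np.real(np.fft.ifft(H))[: len(h_true)]
```

With real measured data the division is only approximate (noise, non-periodic excitation), so the FIR length and any regularization become design choices.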

  3. Effect of write voltage and frequency on the reliability aspects of memristor-based RRAM

    Science.gov (United States)

    Dongale, T. D.; Khot, K. V.; Mohite, S. V.; Desai, N. D.; Shinde, S. S.; Patil, V. L.; Vanalkar, S. A.; Moholkar, A. V.; Rajpure, K. Y.; Bhosale, P. N.; Patil, P. S.; Gaikwad, P. K.; Kamat, R. K.

    2017-08-01

    In this paper, we report the effect of the write voltage and frequency on memristor-based resistive random access memory (RRAM). These parameters have been investigated on the linear drift model of the memristor. For write voltages from 0.2 to 1.2 V and frequencies of 1, 2, 4, 10, 100 and 200 Hz, the corresponding effects on the memory window, the low resistance state (LRS) and the high resistance state (HRS) are reported, and a lifetime (τ) reliability analysis of memristor-based RRAM is carried out using these results. It is found that the HRS is independent of the write voltage, whereas the LRS depends on both write voltage and frequency. The simulation results show that the memristor possesses a larger memory window and lifetime (τ) in the higher-voltage, lower-frequency region, which is attributed to smaller data losses in the memory architecture.
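    A minimal simulation of the linear ionic drift model the paper builds on (Strukov-type), driven by a sinusoidal write voltage; all parameter values below are illustrative, not the paper's:

```python
import numpy as np

# Linear ionic drift memristor model under a sinusoidal write voltage.
Ron, Roff = 100.0, 16e3        # LRS / HRS resistances (ohm)
D, mu_v = 10e-9, 1e-14         # device thickness (m), dopant mobility (m^2/(s*V))
amp, freq = 1.0, 1.0           # write-voltage amplitude (V) and frequency (Hz)

dt = 1e-4
t = np.arange(0.0, 1.0, dt)
v = amp * np.sin(2 * np.pi * freq * t)

w = 0.5 * D                    # initial doped-region width
resistance = np.empty_like(t)
for k, vk in enumerate(v):
    R = Ron * (w / D) + Roff * (1 - w / D)
    resistance[k] = R
    i = vk / R
    w += mu_v * Ron / D * i * dt   # linear drift of the doped boundary
    w = min(max(w, 0.0), D)        # hard window: clamp state to the device
```

Sweeping `amp` and `freq` over the ranges in the abstract and recording the resistance extrema reproduces the qualitative memory-window behavior: slower, stronger write signals move the state boundary further per cycle.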

  4. Two-frequency picosecond laser based on composite vanadate crystals with σ-polarised radiation

    Energy Technology Data Exchange (ETDEWEB)

    Sirotkin, A A; Sadovskiy, S P; Garnov, Sergei V [A M Prokhorov General Physics Institute, Russian Academy of Sciences, Moscow (Russian Federation)

    2013-07-31

    A two-frequency picosecond laser based on α-cut Nd:YVO₄-YVO₄ composite vanadate crystals is experimentally studied for σ-polarised radiation at the ⁴F₃/₂ - ⁴I₁₁/₂ transition, with frequency tuning using Fabry-Perot etalons of different thickness. The difference between the radiation wavelengths was tuned within the range of 1.2-4.4 nm. In the mode-locking regime, the two-frequency radiation power was 280 mW at an absorbed pump power of 12 W.

  5. An adaptive pneumatic suspension based on the estimation of the excitation frequency

    Science.gov (United States)

    Nieto, A. J.; Morales, A. L.; Trapero, J. R.; Chicharro, J. M.; Pintado, P.

    2011-04-01

    A pneumatic suspension that can adapt itself to the incoming vibration is presented in this paper. A switching control strategy between two different configurations is proposed and studied. The objective is to avoid undesirable resonant frequencies. The control procedure is based on the pre-knowledge of the incoming vibration frequency, and when this frequency is unknown, a very efficient prediction technique is used. The results show that the adaptable suspension has improved performance as compared to any of its passive counterparts. The transient response when switching typically takes less than three cycles and does not hinder the suspension performance.

  6. A Verilog-A Based Fractional Frequency Synthesizer Model for Fast and Accurate Noise Assessment

    Directory of Open Access Journals (Sweden)

    V. R. Gonzalez-Diaz

    2016-04-01

    This paper presents a new strategy for simulating fractional frequency synthesizer behavioral models with better performance and reduced simulation time. The models are described in Verilog-A with accurate phase noise predictions and are based on a time-jitter to power-spectral-density transformation of the principal noise sources in a synthesizer. The results of a fractional frequency synthesizer simulation are compared with state-of-the-art Verilog-A descriptions, showing a reduction in simulation time of nearly 20 times. In addition, experimental results from a fractional frequency synthesizer are compared with the simulation results to validate the proposed model.

  7. Fundamental Frequency Estimation using Polynomial Rooting of a Subspace-Based Method

    DEFF Research Database (Denmark)

    Jensen, Jesper Rindom; Christensen, Mads Græsbøll; Jensen, Søren Holdt

    2010-01-01

    We consider the problem of estimating the fundamental frequency of periodic signals such as audio and speech. A novel estimation method based on polynomial rooting of the harmonic MUltiple SIgnal Classification (HMUSIC) is presented. By applying polynomial rooting, we obtain two significant improvements compared to HMUSIC. First, by using the proposed method we can obtain an estimate of the fundamental frequency without doing a grid search like in HMUSIC. This is because the fundamental frequency is estimated as the argument of the root lying closest to the unit circle. Second, we obtain...

  8. Econometric analysis of realized covariation: high frequency based covariance, regression, and correlation in financial economics

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Shephard, N.

    2004-01-01

    This paper analyses multivariate high frequency financial data using realized covariation. We provide a new asymptotic distribution theory for standard methods such as regression, correlation analysis, and covariance. It will be based on a fixed interval of time (e.g., a day or week), allowing the number of high frequency returns during this period to go to infinity. Our analysis allows us to study how high frequency correlations, regressions, and covariances change through time. In particular we provide confidence intervals for each of these quantities.
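    The realized covariation estimator itself is simply the sum of outer products of the high-frequency return vectors within the fixed interval. A toy numpy sketch (the return-generating loadings are invented):

```python
import numpy as np

def realized_covariance(returns):
    """Realized covariation: sum of outer products of high-frequency
    return vectors over a fixed interval (e.g. one trading day)."""
    r = np.asarray(returns)            # shape (n_intervals, n_assets)
    return r.T @ r

# Toy example: 390 one-minute returns for two correlated assets.
rng = np.random.default_rng(0)
z = rng.standard_normal((390, 2))
loadings = np.array([[1.0, 0.0], [0.8, 0.6]])   # hypothetical factor loadings
r = (z @ loadings.T) * 1e-3
rcov = realized_covariance(r)
rcorr = rcov[0, 1] / np.sqrt(rcov[0, 0] * rcov[1, 1])
```

Computing `rcov` day by day yields the time series of realized covariances and correlations whose sampling distribution the paper characterizes.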

  9. Influence of laser frequency noise on scanning Fabry-Perot interferometer based laser Doppler velocimetry

    DEFF Research Database (Denmark)

    Rodrigo, Peter John; Pedersen, Christian

    2014-01-01

    In this work, we study the performance of a scanning Fabry-Perot interferometer based laser Doppler velocimeter (sFPI-LDV) and compare two candidate 1.5 μm single-frequency laser sources for the system – a fiber laser (FL) and a semiconductor laser (SL). We describe a straightforward calibration procedure for the sFPI-LDV and investigate the effect of the different degrees of laser frequency noise of the FL and the SL on the velocimeter's performance...

  10. Realization of fiber-based laser Doppler vibrometer with serrodyne frequency shifting.

    Science.gov (United States)

    Li, Yanlu; Meersman, Stijn; Baets, Roel

    2011-06-10

    We demonstrate a laser Doppler vibrometer (LDV) based on the serrodyne frequency shifting technique. A proof-of-principle system is implemented on the basis of fiber-optic components but opens the way toward an ultracompact integrated LDV system on a silicon chip. With a low laser power of 50  μW, the serrodyne LDV was able to measure submicrometer vibrations with frequencies in the audio range.

  11. Integrated Instantaneous Frequency Measurement Subsystem Based on Multi-Band-Stop Filters

    OpenAIRE

    Oliveira, B. G. M. de; Melo, M. T. de; Llamas-Garro, Ignacio; Espinosa-Espinosa, Moisés; de Oliveira, M.R.T.; de Oliveira, E.M.F.

    2014-01-01

    A 4-bit instantaneous frequency measurement (IFM) subsystem operating in the 1.5–4.66 GHz frequency band has been developed based on multi-band-stop filters. The design instantaneously detects the incoming signal, which is then associated with one of 16 sub-bands. Design, simulation and measurement of the device are presented in this paper, with good agreement obtained between simulations and measurements.
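    The sub-band-to-word mapping at the heart of an IFM receiver can be sketched as follows. Whether the real filter bank divides the band uniformly is an assumption, and the binary-reflected Gray encoding below merely stands in for the physical discriminator outputs:

```python
F_LO, F_HI, BITS = 1.5e9, 4.66e9, 4   # band edges and word width from the abstract

def ifm_word(freq_hz):
    """Map a frequency to a 4-bit Gray-coded sub-band word (uniform sub-bands
    assumed; a real IFM derives the bits from band-stop filter outputs)."""
    band = int((freq_hz - F_LO) / (F_HI - F_LO) * 2**BITS)
    band = min(max(band, 0), 2**BITS - 1)
    gray = band ^ (band >> 1)          # binary-reflected Gray code
    return format(gray, f"0{BITS}b")

# One probe frequency at the center of each of the 16 sub-bands.
words = [ifm_word(F_LO + (k + 0.5) * (F_HI - F_LO) / 2**BITS) for k in range(16)]
```

Gray coding ensures adjacent sub-bands differ in a single bit, so a signal near a band edge produces at most a one-band ambiguity.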

  12. A Novel Microstrip Frequency Discriminator for IFM Based on Balanced Gray-code

    OpenAIRE

    de Oliveira, Elias M.F.; Pedrosa, Túlio L.; de Souza, S.R.O.; Melo, M. T. de; Oliveira, B. G. M. de; Llamas-Garro, Ignacio

    2017-01-01

    This work presents the design, simulation, fabrication and measurement of a novel set of microstrip filters to perform the task of frequency discriminators. These filters’ frequency responses are based on the balanced Gray-code. Results show that the use of the balanced Gray-code, as opposed to the traditional Gray-code, allowed 20% circuit size reduction by using 60% less resonators due to a change in the resonators’ orientation.

  13. Measurement of the underlying event using track-based event shapes in Z→l{sup +}l{sup -} events with ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Schulz, Holger

    2014-09-11

    This thesis describes a measurement of hadron-collider event shapes in proton-proton collisions at a centre-of-mass energy of 7 TeV at the Large Hadron Collider (LHC) at CERN (Conseil Européen pour la Recherche Nucléaire) near Geneva, Switzerland. The analysed data (integrated luminosity: 1.1 fb⁻¹) were recorded in 2011 with the ATLAS experiment. Events in which a Z boson was produced in the hard sub-process and subsequently decayed into an electron-positron or muon-antimuon pair were selected for this analysis. The observables are calculated using all reconstructed tracks of charged particles within the acceptance of the inner detector of ATLAS, except those of the leptons of the Z decay; this is the first measurement of its kind. The observables were corrected for background processes using data-driven methods. For the correction of so-called "pile-up" (multiple overlapping proton-proton collisions) a novel technique was developed and successfully applied. The data were further unfolded to correct for remaining detector effects. The obtained distributions are especially sensitive to the so-called "Underlying Event" and can be compared with predictions of Monte Carlo event generators directly, i.e. without the need to run time-consuming simulations of the ATLAS detector. Finally, an attempt was made to improve the predictions of the event generators Pythia8 and Sherpa by finding an optimised setting of relevant model parameters in a technique called "tuning". It became apparent, however, that the underlying Sjöstrand-Zijl model is unable to give a good description of the measured event-shape distributions.

  14. The effect of a low-frequency noise signal on a single-frequency millimeter-band oscillator based on an avalanche-transit diode

    Science.gov (United States)

    Kotov, V. D.; Myasin, E. A.

    2017-11-01

    Noise-wave generation in a single-frequency oscillator based on a 7-mm-band avalanche-transit diode has been implemented for the first time under the action of a low-frequency, narrow-band (3 MHz) noise signal applied to the avalanche-transit-diode feed circuit.

  15. Quantile-based bias correction and uncertainty quantification of extreme event attribution statements

    Directory of Open Access Journals (Sweden)

    Soyoung Jeon

    2016-06-01

    Extreme event attribution characterizes how anthropogenic climate change may have influenced the probability and magnitude of selected individual extreme weather and climate events. Attribution statements often involve quantification of the fraction of attributable risk (FAR) or the risk ratio (RR) and associated confidence intervals. Many such analyses use climate model output to characterize extreme event behavior with and without anthropogenic influence. However, such climate models may have biases in their representation of extreme events. To account for discrepancies in the probabilities of extreme events between observational datasets and model datasets, we demonstrate an appropriate rescaling of the model output based on the quantiles of the datasets to estimate an adjusted risk ratio. Our methodology accounts for various components of uncertainty in estimation of the risk ratio. In particular, we present an approach to construct a one-sided confidence interval on the lower bound of the risk ratio when the estimated risk ratio is infinity. We demonstrate the methodology using the summer 2011 central US heatwave and output from the Community Earth System Model. In this example, we find that the lower bound of the risk ratio is relatively insensitive to the magnitude and probability of the actual event.
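    A minimal sketch of the quantile-based idea: instead of applying the observed event threshold to biased model output directly, find the model value at the same quantile and compute the risk ratio from exceedances of that adjusted threshold. All distributions and numbers below are invented, and the paper's uncertainty quantification (including the one-sided interval for infinite RR estimates) is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)
obs = rng.normal(30.0, 2.0, 1000)         # hypothetical observed summer temps
model_nat = rng.normal(28.0, 3.0, 5000)   # biased "natural-only" simulation
model_forced = model_nat + 1.0            # "all-forcings" simulation

# Quantile-based correction: map the observed event threshold to the model
# value at the same quantile, rather than comparing raw values across
# differently biased datasets.
event_threshold = 34.0                    # the observed extreme event
p_obs = (obs >= event_threshold).mean()   # exceedance prob. in observations
adj_threshold = np.quantile(model_nat, 1 - p_obs)

p0 = (model_nat >= adj_threshold).mean()      # counterfactual probability
p1 = (model_forced >= adj_threshold).mean()   # factual probability
rr = p1 / p0                                  # adjusted risk ratio
```

When `p0` is estimated as zero the naive `rr` is infinite, which is the case the paper's one-sided confidence-interval construction addresses.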

  16. Multi-scale event synchronization analysis for unravelling climate processes: a wavelet-based approach

    Science.gov (United States)

    Agarwal, Ankit; Marwan, Norbert; Rathinasamy, Maheswaran; Merz, Bruno; Kurths, Jürgen

    2017-10-01

    The temporal dynamics of climate processes are spread across different timescales and, as such, the study of these processes at only one selected timescale might not reveal the complete mechanisms and interactions within and between the (sub-)processes. To capture the non-linear interactions between climatic events, the method of event synchronization has found increasing attention recently. The main drawback with the present estimation of event synchronization is its restriction to analysing the time series at one reference timescale only. The study of event synchronization at multiple scales would be of great interest to comprehend the dynamics of the investigated climate processes. In this paper, the wavelet-based multi-scale event synchronization (MSES) method is proposed by combining the wavelet transform and event synchronization. Wavelets are used extensively to comprehend multi-scale processes and the dynamics of processes across various timescales. The proposed method allows the study of spatio-temporal patterns across different timescales. The method is tested on synthetic and real-world time series in order to check its replicability and applicability. The results indicate that MSES is able to capture relationships that exist between processes at different timescales.
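    The single-scale event synchronization measure that MSES generalizes can be sketched as below; the wavelet decomposition that makes it multi-scale is omitted, and the original symmetric, directional counting is simplified to a one-directional count:

```python
import numpy as np

def event_sync(tx, ty, tau):
    """Simplified event synchronization: fraction of events in `tx` with an
    event in `ty` within `tau` time units, normalized as in Quian Quiroga's
    measure. MSES would first wavelet-decompose each series and apply this
    scale by scale."""
    tx, ty = np.asarray(tx, float), np.asarray(ty, float)
    close = sum(np.any(np.abs(ty - t) <= tau) for t in tx)
    return close / np.sqrt(len(tx) * len(ty))

a = [10.0, 50.0, 90.0, 130.0]    # event times in series A
b = [11.0, 52.0, 91.0, 200.0]    # event times in series B
q = event_sync(a, b, tau=3.0)    # 3 of 4 events in A are matched
```

Applying the measure to wavelet coefficients at several scales, as MSES does, reveals couplings that a single reference timescale would miss.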

  17. Multitask Learning-Based Security Event Forecast Methods for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Hui He

    2016-01-01

    Wireless sensor networks exhibit strong dynamics and uncertainty, including topological changes and node disappearance or addition, and they face various threats. First, to strengthen the detection adaptability of wireless sensor networks to various security attacks, a region-similarity multitask-based security event forecast method for wireless sensor networks is proposed. This method performs topology partitioning on a large-scale sensor network and calculates the similarity degree among regional subnetworks. The trend of unknown network security events can be predicted through multitask learning of the occurrence and transmission characteristics of known network security events. Second, when regional data are lacking, the quantitative trend of unknown regional network security events can still be calculated: this study introduces a forecast method named Prediction Network Security Incomplete Unmarked Data (PNSIUD) to forecast missing attack data in the target region according to the known partial data in similar regions. Experimental results indicate that for unknown security events the forecast accuracy of the similarity forecast algorithm is better than that of the single-task learning method, while the forecast accuracy of the PNSIUD method is better than that of the traditional support vector machine method.

  18. Optimization of base-to-emitter spacer thickness to maximize the frequency response of bipolar transistors

    Science.gov (United States)

    Lee, Wai-Kit; Chan, Alain C. K.; Chan, Mansun

    2005-04-01

    The impacts of base-to-emitter spacer thickness on the unity gain frequency (fT), base resistance (rB), base-collector capacitance (CBC) and maximum oscillation frequency (fmax) of a bipolar junction transistor (BJT) are studied. Using Y-parameters extracted from a simulated device with structural parameters calibrated to an actual process, the resulting fT and fmax for different spacer thicknesses are reported. A tradeoff between peak fT and fmax is observed, and a process window to obtain high fT and fmax is proposed.
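    The tradeoff follows from the standard first-order relation fmax = sqrt(fT / (8π·rB·CBC)): a spacer change that raises fT can still lower fmax if it also raises CBC or rB. A small sketch with illustrative numbers (not the paper's):

```python
import math

def f_max(f_t_hz, r_b_ohm, c_bc_f):
    """First-order maximum oscillation frequency of a bipolar transistor:
    f_max = sqrt(f_T / (8*pi*r_B*C_BC))."""
    return math.sqrt(f_t_hz / (8 * math.pi * r_b_ohm * c_bc_f))

base = f_max(50e9, 80.0, 5e-15)         # ~70 GHz for fT=50 GHz, rB=80 ohm, CBC=5 fF
thin_spacer = f_max(55e9, 75.0, 9e-15)  # fT improves, but the larger CBC wins
```

Sweeping spacer thickness through such parameter sets is exactly the kind of exploration that exposes the process window for jointly high fT and fmax.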

  19. Improved eigensubspace-based approach for radio frequency interference filtering of synthetic aperture radar images

    Science.gov (United States)

    Zhou, Chunhui; Li, Fei; Li, Ning; Zheng, Huifang; Wang, Robert; Wang, Xiangyu

    2017-04-01

    Radio frequency interference (RFI) adversely affects the useful signals and can seriously degrade image quality. An improved eigensubspace-based approach for RFI filtering of synthetic aperture radar images is developed. In the preprocessing stage of the proposed approach, the data sets that need subsequent processing are selected in both the frequency and time domains; the data are then processed by the traditional eigensubspace-based approach. Compared with the traditional eigensubspace-based approach, ours works more efficiently and effectively.

  20. Gear-box fault detection using time-frequency based methods

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Stoustrup, Jakob

    2015-01-01

    Gear-box fault monitoring and detection is important for optimizing power generation and the availability of wind turbines. The current industrial approach is to use condition monitoring systems, which run in parallel with the wind turbine control system, using expensive additional sensors... in the gear-box resonance frequency can be detected. Two different time-frequency based approaches are presented in this paper: one is filter based and the other is based on a Karhunen-Loève basis. Both detect the gear-box fault with an acceptable detection delay of at most 100 s, which...

  1. Design and Development of Thermistor based Power Meter at 140 GHz Frequency Band

    Science.gov (United States)

    Roy, Rajesh; Kush, Abhimanyue Kumar; Dixit, Rajendra Prasad

    2011-12-01

    The design and development of a thermistor-based power meter for the 140 gigahertz (GHz) frequency band are presented. The power meter comprises a power sensor, an amplifier circuit and a dialog-based graphical user interface (GUI) in Visual C++ for average power measurement. The output power level of a component or system is a very critical design factor, so a power meter was needed for the development of millimeter-wave components in the 140 GHz band. The power sensor has been designed and developed using NTC (negative temperature coefficient) thermistors. The design aims at a direct, simple and inexpensive power meter that can measure absolute power in the 140 GHz band. Owing to absorption of 140 GHz radiation, the resistance of the thermistor changes to a new value; this resistance change is converted to a DC voltage change, and the amplified voltage change is fed to a computer through a data acquisition card. The dialog-based GUI developed in Visual C++ reports the average power in dBm. A WR6 standard rectangular waveguide is the input port of the power sensor. Temperature compensation has been achieved, and a moderate sensor return loss greater than 20 dB has been obtained over the frequency range 110 to 170 GHz. The response time of the power sensor is 10 seconds, and the average power accuracy is better than ±0.25 dB within the power range from -10 to 10 dBm in the 140 GHz band.
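    The sensing principle is the NTC beta model: absorbed millimeter-wave power heats the thermistor bead, its temperature rises, and its resistance falls; the bridge and amplifier then convert that resistance change to a DC voltage. A sketch of the resistance side (R0, beta, and the thermal resistance are illustrative values, not the paper's):

```python
import math

R0, T0, B = 10e3, 298.15, 3950.0   # ohm at 25 C, reference temp (K), beta (K)
THETA = 500.0                      # bead thermal resistance, K/W (assumed)

def ntc_resistance(power_w):
    """NTC beta model with steady-state self-heating from absorbed RF power:
    R(T) = R0 * exp(B * (1/T - 1/T0)), with T = T0 + THETA * P."""
    t = T0 + THETA * power_w
    return R0 * math.exp(B * (1.0 / t - 1.0 / T0))

r_cold = ntc_resistance(0.0)       # no absorbed power
r_hot = ntc_resistance(1e-3)       # 1 mW absorbed (0 dBm)
```

Calibrating the voltage-versus-power curve against a known source, and compensating for ambient temperature, turns this monotonic resistance change into an absolute power reading.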

  2. Lessons derived from two high-frequency sea level events in the Atlantic: implications for coastal risk analysis and tsunami detection

    Directory of Open Access Journals (Sweden)

    Begoña Pérez-Gómez

    2016-11-01

    The upgrade and enhancement of sea level networks worldwide for integration in sea level hazard warning systems have significantly increased the possibilities for measuring and analyzing high frequency sea level oscillations, with typical periods ranging from a few minutes to a few hours. Many tide gauges now provide sampling at intervals of 1 min or less and have shown such events to be a common occurrence. Their origins and spatial distribution are diverse and must be well understood in order to correctly design and interpret, for example, the automatic detection algorithms used by tsunami warning centers. Two events recorded recently in European Atlantic waters are analyzed here: possible wave-induced seiches that occurred along the north coast of Spain during the storms of January and February 2014, and oscillations detected after an earthquake in the mid-Atlantic on 13 February 2015. The former caused significant flooding in towns and villages and a huge increase in wave-induced coastal damage that was reported in the media for weeks. The latter was a smaller signal, present in several tide gauges along the Atlantic coast, that coincided with the occurrence of the earthquake, leading to a debate on the potential detection of a very small tsunami and how it might yield significant information for tsunami wave modelers and for the development of tsunami detection software. These kinds of events inform us about the limitations of automatic algorithms for tsunami warning and help to improve the information provided to tsunami warning centers, whilst also emphasizing the importance of other forcings in generating extreme sea levels and their associated potential for causing damage to infrastructure.

  3. Focal mechanisms and inter-event times of low-frequency earthquakes reveal quasi-continuous deformation and triggered slow slip on the deep Alpine Fault

    Science.gov (United States)

    Baratin, Laura-May; Chamberlain, Calum J.; Townend, John; Savage, Martha K.

    2018-02-01

    Characterising the seismicity associated with slow deformation in the vicinity of the Alpine Fault may provide constraints on the stresses acting on a major transpressive margin prior to an anticipated great (≥M8) earthquake. Here, we use recently detected tremor and low-frequency earthquakes (LFEs) to examine how slow tectonic deformation is loading the Alpine Fault late in its typical ∼300-yr seismic cycle. We analyse a continuous seismic dataset recorded between 2009 and 2016 using a network of 10-13 short-period seismometers, the Southern Alps Microearthquake Borehole Array. Fourteen primary LFE templates are used in an iterative matched-filter and stacking routine, allowing the detection of similar signals corresponding to LFE families sharing common locations. This yields an 8-yr catalogue containing 10,000 LFEs that are combined for each of the 14 LFE families using phase-weighted stacking to produce signals with the highest possible signal-to-noise ratios. We show that LFEs occur almost continuously during the 8-yr study period and highlight two types of LFE distributions: (1) discrete behaviour with an inter-event time exceeding 2 min; (2) burst-like behaviour with an inter-event time below 2 min. We interpret the discrete events as small-scale frequent deformation on the deep extent of the Alpine Fault and LFE bursts (corresponding in most cases to known episodes of tremor or large regional earthquakes) as brief periods of increased slip activity indicative of slow slip. We compute improved non-linear earthquake locations using a 3-D velocity model. LFEs occur below the seismogenic zone at depths of 17-42 km, on or near the hypothesised deep extent of the Alpine Fault. The first estimates of LFE focal mechanisms associated with continental faulting, in conjunction with recurrence intervals, are consistent with quasi-continuous shear faulting on the deep extent of the Alpine Fault.

  4. Frequency scanning-based stability analysis method for grid-connected inverter system

    DEFF Research Database (Denmark)

    Wang, Yanbo; Wang, Xiongfei; Blaabjerg, Frede

    2017-01-01

    This paper proposes a frequency scanning-based impedance analysis for stability assessment of grid-connected inverter system, which is able to perform stability assessment without using system mathematical models and inherit the superior feature of impedance-based stability criterion with conside...

  5. Lexical Frequency and Exemplar-Based Learning Effects in Language Acquisition: Evidence from Sentential Complements

    Science.gov (United States)

    Kidd, Evan; Lieven, Elena V. M.; Tomasello, Michael

    2010-01-01

    Usage-based approaches to language acquisition argue that children acquire the grammar of their target language using general-cognitive learning principles. The current paper reports on an experiment that tested a central assumption of the usage-based approach: argument structure patterns are connected to high frequency verbs that facilitate…

  6. A browser-based event display for the CMS experiment at the LHC

    Science.gov (United States)

    Hategan, M.; McCauley, T.; Nguyen, P.

    2012-12-01

    The line between native and web applications is becoming increasingly blurred as modern web browsers are becoming powerful platforms on which applications can be run. Such applications are trivial to install and are readily extensible and easy to use. In an educational setting, web applications permit a way to deploy tools in a highly-restrictive computing environment. The I2U2 collaboration has developed a browser-based event display for viewing events in data collected and released to the public by the CMS experiment at the LHC. The application itself reads a JSON event format and uses the JavaScript 3D rendering engine pre3d. The only requirement is a modern browser using HTML5 canvas. The event display has been used by thousands of high school students in the context of programs organized by I2U2, QuarkNet, and IPPOG. This browser-based approach to display of events can have broader usage and impact for experts and public alike.

  7. Event-based Plausibility Immediately Influences On-line Language Comprehension

    Science.gov (United States)

    Matsuki, Kazunaga; Chow, Tracy; Hare, Mary; Elman, Jeffrey L.; Scheepers, Christoph; McRae, Ken

    2011-01-01

    In some theories of sentence comprehension, linguistically-relevant lexical knowledge such as selectional restrictions is privileged in terms of the time-course of its access and influence. We examined whether event knowledge computed by combining multiple concepts can rapidly influence language understanding even in the absence of selectional restriction violations. Specifically, we investigated whether instruments can combine with actions to influence comprehension of ensuing patients. Instrument-verb-patient triplets were created in a norming study designed to tap directly into event knowledge. In self-paced reading (Experiment 1), participants were faster to read patient nouns such as hair when they were typical of the instrument-action pair (Donna used the shampoo to wash vs. the hose to wash). Experiment 2 showed that these results were not due to direct instrument-patient relations. Experiment 3 replicated Experiment 1 using eyetracking, with effects of event typicality observed in first fixation and gaze durations on the patient noun. This research demonstrates that conceptual event-based expectations are computed and used rapidly and dynamically during on-line language comprehension. We discuss relationships among plausibility and predictability, as well as their implications. We conclude that selectional restrictions may be best considered as event-based conceptual knowledge, rather than lexical-grammatical knowledge. PMID:21517222

  8. A browser-based event display for the CMS experiment at the LHC

    Energy Technology Data Exchange (ETDEWEB)

    Hategan, M. [Chicago U.; McCauley, T. [Fermilab; Nguyen, P. [Fermilab

    2012-01-01

    The line between native and web applications is becoming increasingly blurred as modern web browsers are becoming powerful platforms on which applications can be run. Such applications are trivial to install and are readily extensible and easy to use. In an educational setting, web applications permit a way to deploy tools in a highly-restrictive computing environment. The I2U2 collaboration has developed a browser-based event display for viewing events in data collected and released to the public by the CMS experiment at the LHC. The application itself reads a JSON event format and uses the JavaScript 3D rendering engine pre3d. The only requirement is a modern browser using HTML5 canvas. The event display has been used by thousands of high school students in the context of programs organized by I2U2, QuarkNet, and IPPOG. This browser-based approach to display of events can have broader usage and impact for experts and public alike.

  9. Multi-agent system-based event-triggered hybrid control scheme for energy internet

    DEFF Research Database (Denmark)

    Dou, Chunxia; Yue, Dong; Han, Qing Long

    2017-01-01

    This paper is concerned with an event-triggered hybrid control for the energy Internet based on a multi-agent system approach with which renewable energy resources can be fully utilized to meet load demand with high security and well dynamical quality. In the design of control, a multi-agent system...

  10. Component-Based Data-Driven Predictive Maintenance to Reduce Unscheduled Maintenance Events

    NARCIS (Netherlands)

    Verhagen, W.J.C.; Curran, R.; de Boer, L.W.M.; Chen, C.H.; Trappey, A.C.; Peruzzini, M.; Stjepandić, J.; Wognum, N.

    2017-01-01

    Costs associated with unscheduled and preventive maintenance can contribute significantly to an airline's expenditure. Reliability analysis can help to identify and plan for maintenance events. Reliability analysis in industry is often limited to statistically based

  11. Crisis response simulation combining discrete-event and agent-based modeling

    NARCIS (Netherlands)

    Gonzalez, R.A.

    2009-01-01

    This paper presents a crisis response simulation model architecture combining a discrete-event simulation (DES) environment for a crisis scenario with an agent-based model of the response organization. In multi-agent systems (MAS) as a computational organization, agents are modeled and implemented

  12. Event-based prospective memory in depression: The impact of cue focality

    NARCIS (Netherlands)

    Altgassen, A.M.; Kliegel, M.; Martin, M.

    2009-01-01

    This study is the first to compare event-based prospective memory performance in individuals with depression and healthy controls. The degree to which self-initiated processing is required to perform the prospective memory task was varied. Twenty-eight individuals with depression and 32 healthy

  13. Using Story-Based Causal Diagrams to Analyze Disagreements about Complex Events.

    Science.gov (United States)

    Shapiro, Brian P.; And Others

    1995-01-01

    Describes procedures for constructing story-based causal diagrams. Discusses the cognitive and pragmatic constraints that govern the tendency to attribute events to incomplete causes. Uses causal diagrams to analyze major disagreements about the 1987 stock market crash. Explores how causal diagrams may mitigate the constraints on causal…

  14. Ontology-based combinatorial comparative analysis of adverse events associated with killed and live influenza vaccines.

    Directory of Open Access Journals (Sweden)

    Sirarat Sarntivijai

    Vaccine adverse events (VAEs) are adverse bodily changes occurring after vaccination. Understanding adverse event (AE) profiles is a crucial step in identifying serious AEs. Two different types of seasonal influenza vaccine have been used on the market: trivalent (killed) inactivated influenza vaccine (TIV) and trivalent live attenuated influenza vaccine (LAIV). The different adverse event profiles induced by these two groups of seasonal influenza vaccines were studied based on data drawn from the CDC Vaccine Adverse Event Reporting System (VAERS). Extracted from VAERS were 37,621 AE reports for four TIVs (Afluria, Fluarix, Fluvirin, and Fluzone) and 3,707 AE reports for the only LAIV (FluMist). The AE report data were analyzed by a novel method of combinatorial, ontology-based detection of AEs (CODAE). CODAE detects AEs using the Proportional Reporting Ratio (PRR), a Chi-square significance test, and base-level filtration, and groups identified AEs by ontology-based hierarchical classification. In total, 48 TIV-enriched and 68 LAIV-enriched AEs were identified (PRR > 2, Chi-square score > 4, and number of cases > 0.2% of total reports). These AE terms were classified using the Ontology of Adverse Events (OAE), MedDRA, and SNOMED-CT. The OAE method provided better classification results than the two other methods. Thirteen of the 48 TIV-enriched AEs were related to neurological and muscular processing, such as paralysis, movement disorders, and muscular weakness. In contrast, 15 of the 68 LAIV-enriched AEs were associated with inflammatory response and respiratory system disorders. There was evidence of two severe adverse events (Guillain-Barre Syndrome and paralysis) in TIV. Although these severe adverse events occurred at a low incidence rate, they were found to be significantly more enriched in TIV-vaccinated patients than in LAIV-vaccinated patients. Therefore, our novel combinatorial bioinformatics analysis discovered that LAIV had lower chance of inducing these
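As a rough, self-contained illustration of the PRR and chi-square screen quoted in this abstract (the thresholds PRR > 2, chi-square > 4, and cases > 0.2% of reports come from the abstract; the counts and helper names below are invented, not VAERS data or the authors' CODAE code):

```python
def prr(a, b, c, d):
    """Proportional Reporting Ratio for a 2x2 table:
    a = reports of this AE with the vaccine of interest, b = its other AEs,
    c = this AE with comparator vaccines,                d = their other AEs."""
    return (a / (a + b)) / (c / (c + d))

def chi_square(a, b, c, d):
    """Pearson chi-square statistic (1 df, no continuity correction)."""
    n = a + b + c + d
    observed = [a, b, c, d]
    expected = [(a + b) * (a + c) / n, (a + b) * (b + d) / n,
                (c + d) * (a + c) / n, (c + d) * (b + d) / n]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

def enriched(a, b, c, d, total_reports):
    """Apply the three thresholds quoted in the abstract."""
    return prr(a, b, c, d) > 2 and chi_square(a, b, c, d) > 4 \
        and a > 0.002 * total_reports

print(enriched(30, 970, 10, 2990, 1000))   # → True
```

With these invented counts the PRR is 9.0 and the chi-square statistic is well above 4, so the hypothetical AE would be flagged as enriched.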

  15. The natural frequency identification of tunnel lining based on the microtremor method

    Directory of Open Access Journals (Sweden)

    Yujing Jiang

    2016-12-01

    Many tunnels all over the world have been in service for several decades and require effective inspection methods to assess their health conditions. Microtremor, a type of ambient vibration originating from natural or artificial oscillations without specific sources, has attracted increasing attention in recent studies of the dynamic properties of concrete structures. In this study, the microtremors of the tunnel lining were simulated numerically based on the Distinct Element Method (DEM). The Power Spectral Density (PSD) of signals obtained from numerical simulations was calculated and the natural frequencies were identified using the peak-picking method. The influences of the rock-concrete joint, the rock type and the concrete type on the natural frequencies were also evaluated. The results of a comprehensive numerical analysis show that natural frequencies lower than 100 Hz can be identified. As the bonding condition becomes worse, the natural frequencies decrease. The natural frequencies change proportionally with the normal stiffness of the rock-concrete joint. As the concrete grade decreases, the third mode frequency also decreases gradually, while the variation of the first two mode frequencies can hardly be identified. Additionally, field microtremor measurements of the tunnel lining were carried out to verify the numerical results.
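The PSD peak-picking step described above can be sketched in a few lines. This is a generic illustration on a synthetic signal (the direct DFT, sampling rate, and test tone are our assumptions, not the paper's DEM data):

```python
import math

def psd_peak(signal, fs):
    """Return the frequency (Hz) of the largest power-spectrum peak,
    computed with a direct DFT (fine for short illustrative records)."""
    n = len(signal)
    best_k, best_p = 1, -1.0
    for k in range(1, n // 2):              # skip DC, stop below Nyquist
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        if re * re + im * im > best_p:
            best_k, best_p = k, re * re + im * im
    return best_k * fs / n

fs = 100.0                                   # sampling rate, Hz (assumed)
sig = [math.sin(2 * math.pi * 20.0 * t / fs) for t in range(100)]
print(psd_peak(sig, fs))                     # → 20.0
```

A real analysis would use an FFT and spectral averaging; the peak-picking logic is the same.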

  16. The role of musical training in emergent and event-based timing

    Directory of Open Access Journals (Sweden)

    Lawrence Baer

    2013-05-01

    Musical performance is thought to rely predominantly on event-based timing, which involves a clock-like neural process and an explicit internal representation of the time interval. Some aspects of musical performance may rely on emergent timing, which is established through the optimization of movement kinematics and can be maintained without reference to any explicit representation of the time interval. We predicted that musical training would have its largest effect on event-based timing, supporting the dissociability of these timing processes and the dominance of event-based timing in musical performance. We compared 22 musicians and 17 non-musicians on the prototypical event-based timing task of finger tapping and on the typically emergently timed task of circle drawing. For each task, participants first responded in synchrony with a metronome (Paced) and then responded at the same rate without the metronome (Unpaced). Analyses of the Unpaced phase revealed that non-musicians were more variable in their inter-response intervals for finger tapping compared to circle drawing. Musicians did not differ between the two tasks. Between groups, non-musicians were more variable than musicians for tapping but not for drawing. We were able to show that the differences were due to less timer variability in musicians on the tapping task. Correlational analyses of movement jerk and inter-response interval variability revealed a negative association for tapping and a positive association for drawing in non-musicians only. These results suggest that musical training affects temporal variability in tapping but not drawing. Additionally, musicians and non-musicians may be employing different movement strategies to maintain accurate timing in the two tasks. These findings add to our understanding of how musical training affects timing and support the dissociability of event-based and emergent timing modes.

  17. Visual Sensor Based Abnormal Event Detection with Moving Shadow Removal in Home Healthcare Applications

    OpenAIRE

    Young-Sook Lee; Wan-Young Chung

    2012-01-01

    Vision-based abnormal event detection for home healthcare systems can be greatly improved using visual sensor-based techniques able to detect, track and recognize objects in the scene. However, in moving object detection and tracking processes, moving cast shadows can be misclassified as part of objects or as moving objects. Shadow removal is an essential step in developing video surveillance systems. The primary goal is to design novel computer vision techniques that can extract objects...

  18. Detection of vulnerable relays and sensitive controllers under cascading events based on performance indices

    DEFF Research Database (Denmark)

    Liu, Zhou; Chen, Zhe; Hu, Yanting

    2014-01-01

    ) based detection strategy is proposed to identify the vulnerable relays and sensitive controllers under the overloading situation during cascading events. Based on the impedance margin sensitivity, diverse performance indices are proposed to help improve this detection. A study case of voltage...... instability induced cascaded blackout built in real time digital simulator (RTDS) will be used to demonstrate the proposed strategy. The simulation results indicate this strategy can effectively detect the vulnerable relays and sensitive controllers under overloading situations....

  19. Implementation of Time and Frequency Response Analysis for Web-Based Laboratories

    Directory of Open Access Journals (Sweden)

    Teyana Sapula

    2011-04-01

    The University of Dar es Salaam has developed a web-based laboratory for time and frequency response analysis. The purpose of this web-based laboratory is the utilization of real data from real experiments, in terms of instrumentation and experimental circuits, rather than simulations. The web-based laboratory was adopted after realizing the difficulties imposed by traditional laboratories. Web-based laboratories allow students and educators to interact with real laboratory equipment located anywhere in the world at any time. This paper presents the implementation of a web-based laboratory of a single-stage common-emitter, resistor-capacitor-coupled amplifier using the National Instruments Educational Laboratory Virtual Instrumentation Suite platform. Two components are deployed: time response analysis and frequency response analysis. The experiment allows students to carry out time and frequency analysis of the amplifier. The module can be applied to any microelectronic circuit to carry out time response and frequency response analysis. Both the time response and frequency response analysis results of the amplifier are validated.

  20. Frequency-comb-based BOTDA sensors for high-spatial-resolution/long-distance sensing.

    Science.gov (United States)

    Jia, Xin-Hong; Chang, Han-Qing; Lin, Kai; Xu, Cong; Wu, Jia-Gui

    2017-03-20

    Frequency-comb-based Brillouin optical time-domain analysis (BOTDA) sensors were developed to achieve acquisition-time reduction and high-spatial-resolution/long-distance sensing simultaneously. We found that, for the standard frequency-comb-based BOTDA, the use of a double-sideband (DSB) pulse generates a series of pulse pairs that simultaneously propagate along the sensing fiber, leading to a nonlinear interaction between the two sidebands of each frequency comb pulse and a significant splitting of the Brillouin gain spectrum (BGS). This problem prevents its application in high-spatial-resolution sensing due to the higher pulse power requirement. Thus, using only one of the sidebands of the DSB pulse was proposed to greatly suppress the BGS distortion. In combination with the phonon pre-excitation technique based on a phase-shifted pulse, a sensor with a spatial resolution of approximately 60 cm along a fiber approximately 592 m in length was demonstrated. Furthermore, we explored the detailed performance of long-distance sensing by frequency-comb-based BOTDA. The use of a frequency comb for the probe wave can suppress pulse distortion and the non-local effect, which is helpful for extending the sensing distance. A spatial resolution of approximately 6 m along a sensing fiber approximately 74.2 km in length was successfully demonstrated.

  1. Waveguiding Effect in the Gigahertz Frequency Range in Pillar-based Phononic-Crystal Slabs

    Science.gov (United States)

    Pourabolghasem, Reza; Dehghannasiri, Razi; Eftekhar, Ali Asghar; Adibi, Ali

    2018-01-01

    The waveguiding effect for a phononic-crystal (PnC)-based device operating in the gigahertz (GHz) frequency regime is experimentally demonstrated. To that end, a metallic pillar-based PnC membrane with a PnC band gap in the GHz frequency range is designed, and, based on that, an acoustic waveguide operating in the GHz regime is designed and fabricated. To characterize the fabricated PnC waveguide, a set of focusing interdigital transducers is designed and fabricated, enabling efficient excitation and detection of acoustic signals inside the PnC waveguide. The finite-element method is used to study the acoustic properties of the proposed structures and optimize their design. Experimental evidence supporting the existence of the waveguiding effect in the proposed structure in the GHz frequency regime is provided, showing reasonable agreement with the numerical calculations.

  2. Multi-agent system-based event-triggered hybrid control scheme for energy internet

    DEFF Research Database (Denmark)

    Dou, Chunxia; Yue, Dong; Han, Qing Long

    2017-01-01

    This paper is concerned with an event-triggered hybrid control for the energy Internet based on a multi-agent system approach with which renewable energy resources can be fully utilized to meet load demand with high security and well dynamical quality. In the design of control, a multi-agent system...... of event-triggered hybrid control strategies whereby the multi-agent system implements the hierarchical hybrid control to achieve multiple control objectives. Finally, the effectiveness of the proposed control is validated by means of simulation results....

  3. A mobile robots experimental environment with event-based wireless communication.

    Science.gov (United States)

    Guinaldo, María; Fábregas, Ernesto; Farias, Gonzalo; Dormido-Canto, Sebastián; Chaos, Dictino; Sánchez, José; Dormido, Sebastián

    2013-07-22

    An experimental platform to communicate between a set of mobile robots through a wireless network has been developed. The mobile robots obtain their positions through a camera which acts as a sensor. The video images are processed in a PC, and a Waspmote card sends the corresponding position to each robot using the ZigBee standard. A distributed control algorithm based on event-triggered communications has been designed and implemented to bring the robots into the desired formation. Each robot communicates with its neighbors only at event times. Furthermore, a simulation tool has been developed to design and perform experiments with the system. An example of usage is presented.
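The event-triggered idea in this abstract can be sketched as a toy 1-D consensus loop. This is our own minimal illustration, not the authors' controller: each robot rebroadcasts its position only when it has drifted past a fixed threshold since its last broadcast, and all constants are invented:

```python
def run(positions, threshold=0.05, gain=0.3, steps=200):
    """1-D rendezvous where broadcasts happen only at event times."""
    last_sent = list(positions)      # positions the neighbours believe
    events = 0
    for _ in range(steps):
        # consensus update uses only broadcast (possibly stale) information
        mean = sum(last_sent) / len(last_sent)
        positions = [p + gain * (mean - p) for p in positions]
        for i, p in enumerate(positions):
            if abs(p - last_sent[i]) > threshold:   # trigger condition
                last_sent[i] = p                    # event: broadcast position
                events += 1
    return positions, events

final, events = run([0.0, 1.0, 4.0])
spread = max(final) - min(final)
print(round(spread, 6), events)      # robots agree; spread shrinks to ~0
```

The point of the trigger condition is that communication stops once the formation has settled, instead of broadcasting at every sampling instant.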

  4. Neural bases of event knowledge and syntax integration in comprehension of complex sentences.

    Science.gov (United States)

    Malaia, Evie; Newman, Sharlene

    2015-01-01

    Comprehension of complex sentences is necessarily supported by both syntactic and semantic knowledge, but what linguistic factors trigger a reader's reliance on a specific system? This functional neuroimaging study orthogonally manipulated argument plausibility and verb event type to investigate cortical bases of the semantic effect on argument comprehension during reading. The data suggest that telic verbs facilitate online processing by means of consolidating the event schemas in episodic memory and by easing the computation of syntactico-thematic hierarchies in the left inferior frontal gyrus. The results demonstrate that syntax-semantics integration relies on trade-offs among a distributed network of regions for maximum comprehension efficiency.

  5. A study of position independent algorithms for phone-based gait frequency detection.

    Science.gov (United States)

    Tarashansky, Alexander; Vathsangam, Harshvardhan; Sukhatme, Gaurav S

    2014-01-01

    Estimating gait frequency is an important component in the detection and diagnosis of various medical conditions. Smartphone-based kinematic sensors offer a window of opportunity in free-living gait frequency estimation. The main issue with smartphone-based gait frequency estimation algorithms is how to adjust for variations in orientation and location of the phone on the human body. While numerous algorithms have been implemented to account for these differences, little work has been done in comparing these algorithms. In this study, we compare various position independent algorithms to determine which are more suited to robust gait frequency estimation. Using sensor data collected from volunteers walking with a smartphone, we examine the effect of using three different time series with the magnitude, weighted sum, and closest vertical component algorithms described in the paper. We also test two different methods of extracting step frequency: time domain peak counting and spectral analysis. The results show that the choice of time series does not significantly affect the accuracy of frequency measurements. Furthermore, both time domain and spectral approaches show comparable results. However, time domain approaches are sensitive to false-positives while spectral approaches require a minimum set of repetitive measurements. Our study suggests a hybrid approach where both time-domain and spectral approaches be used together to complement each other's shortcomings.
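The two estimator families compared above can be sketched on a synthetic trace. Both functions below are generic illustrations (the 2 Hz test signal, sampling rate, and lag range are assumptions, not the study's data or algorithms); the second uses autocorrelation, which like spectral methods needs several repetitions of the movement:

```python
import math

fs = 50.0                                    # Hz; synthetic 2 Hz "step" signal
sig = [math.sin(2 * math.pi * 2.0 * t / fs) for t in range(250)]   # 5 s

def peaks_per_second(x, fs):
    """Time-domain estimate: count local maxima above zero."""
    n_peaks = sum(1 for i in range(1, len(x) - 1)
                  if x[i - 1] < x[i] > x[i + 1] and x[i] > 0)
    return n_peaks / (len(x) / fs)

def autocorr_freq(x, fs, fmin=0.5, fmax=5.0):
    """Repetition-based estimate: pick the best-matching lag in a
    plausible step-frequency range."""
    lags = range(int(fs / fmax), int(fs / fmin) + 1)
    best = max(lags, key=lambda L: sum(x[t] * x[t - L] for t in range(L, len(x))))
    return fs / best

print(peaks_per_second(sig, fs), autocorr_freq(sig, fs))   # → 2.0 2.0
```

On noisy free-living data the peak counter would need thresholds against false positives, which is exactly the sensitivity the abstract notes for time-domain approaches.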

  6. Interference graph-based dynamic frequency reuse in optical attocell networks

    Science.gov (United States)

    Liu, Huanlin; Xia, Peijie; Chen, Yong; Wu, Lan

    2017-11-01

    An indoor optical attocell network may achieve higher capacity than radio frequency (RF) or infrared (IR)-based wireless systems. It is proposed as a special type of visible light communication (VLC) system using Light Emitting Diodes (LEDs). However, the system spectral efficiency may be severely degraded owing to inter-cell interference (ICI), particularly in dense deployment scenarios. To address these issues, we construct the spectral interference graph for an indoor optical attocell network and propose the Dynamic Frequency Reuse (DFR) and Weighted Dynamic Frequency Reuse (W-DFR) algorithms to decrease ICI and improve spectral efficiency. The interference graph allows LEDs to transmit data without interference and determines the minimum number of sub-bands needed for frequency reuse. The DFR algorithm then reuses the system frequency equally across service-providing cells to mitigate spectrum interference, while the W-DFR algorithm reuses the system frequency using the bandwidth weight (BW), which is defined based on the number of service users. Numerical results show that both of the proposed schemes can effectively improve the average spectral efficiency (ASE) of the system. An improvement in the user data rate is also obtained, as shown by analyzing its cumulative distribution function (CDF).
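Assigning sub-bands from an interference graph is essentially graph coloring. A minimal greedy sketch (the toy graph and function names are invented; the paper's DFR/W-DFR algorithms add equal or weighted bandwidth splitting on top of this idea):

```python
def assign_subbands(interference):
    """interference: dict cell -> set of interfering cells (symmetric).
    Greedily colour highest-degree cells first so that no two
    interfering cells share a sub-band."""
    band = {}
    for cell in sorted(interference, key=lambda c: -len(interference[c])):
        used = {band[n] for n in interference[cell] if n in band}
        band[cell] = next(b for b in range(len(interference)) if b not in used)
    return band

# Toy attocell layout: A, B, C mutually interfere; D only interferes with C.
graph = {"A": {"B", "C"}, "B": {"A", "C"}, "C": {"A", "B", "D"}, "D": {"C"}}
bands = assign_subbands(graph)
print(max(bands.values()) + 1)   # → 3 sub-bands suffice for this graph
```

The triangle A-B-C forces three sub-bands, while D can safely reuse one of them.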

  7. Increased frequency of single base substitutions in a population of transcripts expressed in cancer cells

    Directory of Open Access Journals (Sweden)

    Bianchetti Laurent

    2012-11-01

    Background: Single Base Substitutions (SBS) that alter transcripts expressed in cancer originate from somatic mutations. However, recent studies report SBS in transcripts that are not supported by the genomic DNA of tumor cells. Methods: We used sequence-based whole genome expression profiling, namely Long-SAGE (L-SAGE) and Tag-seq (a combination of L-SAGE and deep sequencing), and computational methods to identify transcripts with greater SBS frequencies in cancer. Millions of tags produced by 40 healthy and 47 cancer L-SAGE experiments were compared to 1,959 Reference Tags (RT), i.e. tags matching the human genome exactly once. Similarly, tens of millions of tags produced by 7 healthy and 8 cancer Tag-seq experiments were compared to 8,572 RT. For each transcript, SBS frequencies in healthy and cancer cells were statistically tested for equality. Results: In the L-SAGE and Tag-seq experiments, 372 and 4,289 transcripts, respectively, showed greater SBS frequencies in cancer. Increased SBS frequencies could not be attributed to known Single Nucleotide Polymorphisms (SNP), catalogued somatic mutations or RNA-editing enzymes. Hypothesizing that Single Tags (ST), i.e. tags sequenced only once, were indicators of SBS, we observed that ST proportions were heterogeneously distributed across Embryonic Stem Cells (ESC), healthy differentiated and cancer cells. ESC had the lowest ST proportions, whereas cancer cells had the greatest. Finally, in a series of experiments carried out on a single patient at 1 healthy and 3 consecutive tumor stages, we could show that SBS frequencies increased during cancer progression. Conclusion: If the mechanisms generating the base substitutions were known, increased SBS frequency in transcripts would be a useful new biomarker of cancer. With the reduction of sequencing cost, sequence-based whole genome expression profiling could be used to characterize increased SBS frequency in a patient's tumor and aid diagnosis.
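The per-transcript equality test described above can be illustrated with a standard two-proportion z-test. This is a generic sketch with invented tag counts, not the paper's actual statistical procedure or data:

```python
import math

def two_prop_z(sbs_a, tags_a, sbs_b, tags_b):
    """z-statistic for H0: the SBS rate is equal in the two tag pools
    (a = healthy libraries, b = cancer libraries, counts assumed)."""
    p1, p2 = sbs_a / tags_a, sbs_b / tags_b
    p = (sbs_a + sbs_b) / (tags_a + tags_b)          # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / tags_a + 1 / tags_b))
    return (p2 - p1) / se

# e.g. 120 substituted tags out of 10,000 (healthy) vs 260 out of 10,000 (cancer)
z = two_prop_z(120, 10_000, 260, 10_000)
print(round(z, 2))    # → 7.25 (positive z: higher SBS frequency in cancer)
```

A z this large would reject equality at any conventional significance level; a real analysis would also correct for testing thousands of transcripts.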

  8. An assessment of envelope-based demodulation in case of proximity of carrier and modulation frequencies

    Science.gov (United States)

    Shahriar, Md Rifat; Borghesani, Pietro; Randall, R. B.; Tan, Andy C. C.

    2017-11-01

    Demodulation is a necessary step in the field of diagnostics to reveal faults whose signatures appear as an amplitude and/or frequency modulation. The Hilbert transform has conventionally been used for the calculation of the analytic signal required in the demodulation process. However, the carrier and modulation frequencies must meet the conditions set by the Bedrosian identity for the Hilbert transform to be applicable for demodulation. This condition, basically requiring the carrier frequency to be sufficiently higher than the frequency of the modulation harmonics, is usually satisfied in many traditional diagnostic applications (e.g. vibration analysis of gear and bearing faults) due to the order-of-magnitude ratio between the carrier and modulation frequency. However, the diversification of diagnostic approaches and applications shows cases (e.g. electrical signature analysis-based diagnostics) where the carrier frequency is in close proximity to the modulation frequency, thus challenging the applicability of the Bedrosian theorem. This work presents an analytic study to quantify the error introduced by the Hilbert transform-based demodulation when the Bedrosian identity is not satisfied and proposes a mitigation strategy to combat the error. An experimental study is also carried out to verify the analytical results. The outcome of the error analysis sets a confidence limit on the estimated modulation (both shape and magnitude) achieved through the Hilbert transform-based demodulation when the Bedrosian identity is violated. However, the proposed mitigation strategy is found effective in combating the demodulation error arising in this scenario, thus extending the applicability of the Hilbert transform-based demodulation.
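The baseline Hilbert-transform demodulation discussed above can be sketched directly. This is a generic pure-Python illustration (the O(n^2) DFT, signal lengths, and frequencies are our assumptions, not the paper's method); with the carrier well above the modulation, the Bedrosian condition holds and the recovered envelope matches the true one:

```python
import cmath, math

def envelope(x):
    """|analytic signal|: DFT, zero the negative frequencies, double the
    positive ones, inverse DFT (direct O(n^2) version for clarity)."""
    n = len(x)
    X = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n)]
    X = [2 * v if 0 < k < n / 2 else (v if k in (0, n // 2) else 0)
         for k, v in enumerate(X)]
    return [abs(sum(X[k] * cmath.exp(2j * math.pi * k * t / n)
                    for k in range(n)) / n) for t in range(n)]

n, fc, fm = 128, 32, 2        # carrier and modulation cycles per record (assumed)
true_env = [1 + 0.5 * math.cos(2 * math.pi * fm * t / n) for t in range(n)]
x = [true_env[t] * math.cos(2 * math.pi * fc * t / n) for t in range(n)]
err = max(abs(a - b) for a, b in zip(envelope(x), true_env))
print(err < 1e-6)             # → True (fc >> fm, so the envelope is exact)
```

Pushing fc down toward fm in this sketch reproduces the failure mode the abstract analyzes: the modulation sidebands spill into negative frequencies and the recovered envelope is distorted.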

  9. Laser heterodyne interferometric signal processing method based on rising edge locking with high frequency clock signal.

    Science.gov (United States)

    Zhang, Enzheng; Chen, Benyong; Yan, Liping; Yang, Tao; Hao, Qun; Dong, Wenjun; Li, Chaorong

    2013-02-25

    A novel phase measurement method combining rising-edge-locked signal processing and digital frequency mixing is proposed for laser heterodyne interferometry. The rising-edge-locked signal processing, which employs a high-frequency clock signal to lock the rising edges of the reference and measurement signals, not only improves the steepness of the rising edge but also eliminates the error counting caused by the multi-rising-edge phenomenon in fringe counting. The digital frequency mixing is realized by mixing the digital interference signal with a digital base signal, which differs from conventional frequency mixing with analogue signals. These signal processing techniques improve measurement accuracy and enhance anti-interference capability and measurement stability. The principle and implementation of the method are described in detail. An experimental setup was constructed, and a series of experiments verified the feasibility of the method for large displacement measurement with high speed and nanometer resolution.

  10. Polarization-insensitive FSS-based perfect metamaterial absorbers for GHz and THz frequencies

    Science.gov (United States)

    Sabah, Cumali; Dincer, Furkan; Karaaslan, Muharrem; Unal, Emin; Akgol, Oguzhan

    2014-04-01

    New perfect frequency selective surface (FSS) metamaterial absorbers (MAs) based on a resonator with dielectric configuration are numerically presented and investigated for both microwave and terahertz frequency ranges. To verify the behavior of the FSS MAs, one of the MAs is experimentally analyzed and tested in the microwave frequency range. The suggested FSS MAs have a simple configuration, which introduces flexibility to adjust their FSS metamaterial properties and to rescale the structure easily for any desired frequency range. To the best of our knowledge, no study in the literature simultaneously includes microwave and terahertz absorbers in a single design. Besides, numerical simulations verify that the FSS MAs can achieve very high absorption levels at wide angles of incidence for both transverse electric and transverse magnetic waves. The proposed FSS MAs and their variations enable many potential applications in radar systems, communication, stealth technologies, and so on.

  11. On-Chip Power-Combining for High-Power Schottky Diode-Based Frequency Multipliers

    Science.gov (United States)

    Chattopadhyay, Goutam; Mehdi, Imran; Schlecht, Erich T.; Lee, Choonsup; Siles, Jose V.; Maestrini, Alain E.; Thomas, Bertrand; Jung, Cecile D.

    2013-01-01

    A 1.6-THz power-combined Schottky frequency tripler was designed to handle approximately 30 mW input power. The design of Schottky-based triplers at this frequency range is mainly constrained by the shrinkage of the waveguide dimensions with frequency and the minimum diode mesa sizes, which limits the maximum number of diodes that can be placed on the chip to no more than two. Hence, multiple-chip power-combined schemes become necessary to increase the power-handling capabilities of high-frequency multipliers. The design presented here overcomes difficulties by performing the power-combining directly on-chip. Four E-probes are located at a single input waveguide in order to equally pump four multiplying structures (featuring two diodes each). The produced output power is then recombined at the output using the same concept.

  12. Observer-Based Load Frequency Control for Island Microgrid with Photovoltaic Power

    Directory of Open Access Journals (Sweden)

    Chaoxu Mu

    2017-01-01

    As renewable energy is widely integrated into the power system, the stochastic and intermittent power generation from renewable energy may cause the system frequency to deviate from the prescribed level, especially for a microgrid. In this paper, the load frequency control (LFC) of an island microgrid with photovoltaic (PV) power and electric vehicles (EVs) is investigated, where the EVs can be treated as distributed energy storages. Considering the disturbances from load change and PV power, an observer-based integral sliding mode (OISM) controller is designed to regulate the frequency back to the prescribed value, where a neural network observer is used to estimate the PV power online. Simulation studies on a benchmark microgrid system are presented to illustrate the effectiveness of the OISM controller, and comparative results also demonstrate that the proposed method has superior performance for stabilizing the frequency over PID control.

  13. A frequency domain radar interferometric imaging (FII) technique based on high-resolution methods

    Science.gov (United States)

    Luce, H.; Yamamoto, M.; Fukao, S.; Helal, D.; Crochet, M.

    2001-01-01

    In the present work, we propose a frequency-domain interferometric imaging (FII) technique for a better knowledge of the vertical distribution of the atmospheric scatterers detected by MST radars. This is an extension of the dual frequency-domain interferometry (FDI) technique to multiple frequencies. Its objective is to reduce the ambiguity, inherent in the FDI technique, that results from the use of only two adjacent frequencies. Different methods, commonly used in antenna array processing, are first described within the context of application to the FII technique. These methods are Fourier-based imaging, Capon's method, and the singular value decomposition method used with the MUSIC algorithm. Some preliminary simulations and tests performed on data collected with the middle and upper atmosphere (MU) radar (Shigaraki, Japan) are also presented. This work is a first step in the development of the FII technique, which seems to be very promising.

  14. Real-Time Gait Event Detection Based on Kinematic Data Coupled to a Biomechanical Model.

    Science.gov (United States)

    Lambrecht, Stefan; Harutyunyan, Anna; Tanghe, Kevin; Afschrift, Maarten; De Schutter, Joris; Jonkers, Ilse

    2017-03-24

    Real-time detection of multiple stance events, more specifically initial contact (IC), foot flat (FF), heel off (HO), and toe off (TO), could greatly benefit neurorobotic (NR) and neuroprosthetic (NP) control. Three real-time threshold-based algorithms have been developed, detecting the aforementioned events based on kinematic data in combination with a biomechanical model. Data from seven subjects walking at three speeds on an instrumented treadmill were used to validate the presented algorithms, accumulating to a total of 558 steps. The reference for the gait events was obtained using marker and force plate data. All algorithms had excellent precision and no false positives were observed. Timing delays of the presented algorithms were similar to current state-of-the-art algorithms for the detection of IC and TO, whereas smaller delays were achieved for the detection of FF. Our results indicate that, based on their high precision and low delays, these algorithms can be used for the control of an NR/NP, with the exception of the HO event. Kinematic data is used in most NR/NP control schemes and is thus available at no additional cost, resulting in a minimal computational burden. The presented methods can also be applied for screening pathological gait or gait analysis in general in/outside of the laboratory.
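The four stance events named above lend themselves to simple threshold logic on foot-marker heights. The sketch below is our own illustration on invented traces (the `contact` threshold, signal names, and state ordering are assumptions, not the authors' validated algorithms):

```python
def detect_events(heel_z, toe_z, contact=0.02):
    """Scan marker-height traces (metres) and return the first sample index
    of each stance event: IC, FF, HO, TO (hypothetical thresholds)."""
    events = {}
    for i in range(1, len(heel_z)):
        if "IC" not in events and heel_z[i] < contact <= heel_z[i - 1]:
            events["IC"] = i                        # heel strikes the ground
        elif "IC" in events and "FF" not in events and toe_z[i] < contact:
            events["FF"] = i                        # toe also down: foot flat
        elif "FF" in events and "HO" not in events and heel_z[i] >= contact:
            events["HO"] = i                        # heel rises
        elif "HO" in events and "TO" not in events and toe_z[i] >= contact:
            events["TO"] = i                        # toe rises: swing begins
    return events

heel = [0.10, 0.05, 0.01, 0.01, 0.01, 0.01, 0.05, 0.08, 0.10, 0.12]
toe  = [0.12, 0.08, 0.05, 0.01, 0.01, 0.01, 0.01, 0.01, 0.05, 0.10]
print(detect_events(heel, toe))   # → {'IC': 2, 'FF': 3, 'HO': 6, 'TO': 8}
```

Coupling such thresholds to a biomechanical model, as the paper does, is what makes the detection robust enough for real-time prosthetic control.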

  15. An electronic trigger based on care escalation to identify preventable adverse events in hospitalised patients.

    Science.gov (United States)

    Bhise, Viraj; Sittig, Dean F; Vaghani, Viralkumar; Wei, Li; Baldwin, Jessica; Singh, Hardeep

    2017-09-21

    Methods to identify preventable adverse events typically have low yield and efficiency. We refined the methods of Institute of Healthcare Improvement's Global Trigger Tool (GTT) application and leveraged electronic health record (EHR) data to improve detection of preventable adverse events, including diagnostic errors. We queried the EHR data repository of a large health system to identify an 'index hospitalization' associated with care escalation (defined as transfer to the intensive care unit (ICU) or initiation of rapid response team (RRT) within 15 days of admission) between March 2010 and August 2015. To enrich the record review sample with unexpected events, we used EHR clinical data to modify the GTT algorithm and limited eligible patients to those at lower risk for care escalation based on younger age and presence of minimal comorbid conditions. We modified the GTT review methodology; two physicians independently reviewed eligible 'e-trigger' positive records to identify preventable diagnostic and care management events. Of 88 428 hospitalisations, 887 were associated with care escalation (712 ICU transfers and 175 RRTs), of which 92 were flagged as trigger-positive and reviewed. Preventable adverse events were detected in 41 cases, yielding a trigger positive predictive value of 44.6% (reviewer agreement 79.35%; Cohen's kappa 0.573). We identified 7 (7.6%) diagnostic errors and 34 (37.0%) care management-related events: 24 (26.1%) adverse drug events, 4 (4.3%) patient falls, 4 (4.3%) procedure-related complications and 2 (2.2%) hospital-associated infections. In most events (73.1%), there was potential for temporary harm. We developed an approach using an EHR data-based trigger and modified review process to efficiently identify hospitalised patients with preventable adverse events, including diagnostic errors. Such e-triggers can help overcome limitations of currently available methods to detect preventable harm in hospitalised patients.
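The e-trigger selection rule described above (care escalation within 15 days, restricted to lower-risk patients) can be sketched as a simple filter. All field names, thresholds, and records below are invented for illustration, not the study's actual query or risk criteria:

```python
def trigger_positive(rec, max_age=60, max_comorbidities=1, window_days=15):
    """Flag a low-risk admission whose care escalated within the window
    (hypothetical risk criteria: age and comorbidity count)."""
    escalated = rec.get("icu_transfer_day") or rec.get("rrt_day")
    return (escalated is not None
            and escalated <= window_days
            and rec["age"] <= max_age
            and rec["n_comorbidities"] <= max_comorbidities)

admissions = [
    {"age": 45, "n_comorbidities": 0, "icu_transfer_day": 3},
    {"age": 82, "n_comorbidities": 4, "icu_transfer_day": 2},   # high risk: excluded
    {"age": 50, "n_comorbidities": 1, "rrt_day": 20},           # outside window
    {"age": 38, "n_comorbidities": 0},                          # no escalation
]
flagged = [r for r in admissions if trigger_positive(r)]
print(len(flagged))    # → 1 (only the first admission is trigger-positive)
```

Records flagged this way would then go to the manual two-physician review step, which is where the 44.6% positive predictive value in the abstract comes from.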

  16. Improvement of hydrological flood forecasting through an event based output correction method

    Science.gov (United States)

    Klotz, Daniel; Nachtnebel, Hans Peter

    2014-05-01

    This contribution presents an output correction method for hydrological models. A conceptualisation of the method is presented and tested in an alpine basin in Salzburg, Austria. The aim is to develop a method which is not prone to the drawbacks of autoregressive models. Output correction methods are an attractive option for improving hydrological predictions. They complement the main modelling process and do not interfere with the modelling process itself. In general, output correction models estimate the future error of a prediction and use that estimate to improve the given prediction. Different estimation techniques are available, depending on the information used and the estimation procedure itself. Autoregressive error models are widely used for such corrections. Autoregressive models with exogenous inputs (ARX) allow the use of additional information for the error modelling, e.g. measurements from upstream basins or predicted input signals. Autoregressive models do, however, exhibit deficiencies, since the errors of hydrological models generally do not behave in an autoregressive manner. The decay of the error is usually different from an autoregressive function, and furthermore the residuals exhibit different patterns under different circumstances. For example, one might consider different error-propagation behaviours under high-flow, low-flow or snowmelt-driven conditions. This contribution presents a conceptualisation of an event-based correction model and focuses on flood events only. The correction model uses information about the history of the residuals and exogenous variables to give an error estimate. The structure and parameters of the correction models can be adapted to given event classes. An event class is a set of flood events that exhibit a similar pattern for the residuals or the hydrological conditions. In total, four different event classes have been identified in this study. Each of them represents a different
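The ARX error model described above can be sketched with a least-squares fit: the model error e[t] is regressed on its own recent history plus one exogenous signal, and the estimate is added back onto the raw forecast. This is a generic illustration, not the authors' event-class formulation; in their scheme a separate coefficient set would be fitted per event class. The upstream-gauge interpretation of `exog` is an assumption:

```python
import numpy as np

def fit_arx_error_model(residuals, exog, p=2):
    """Least-squares fit of e[t] ~ a1*e[t-1] + ... + ap*e[t-p] + b*x[t],
    where e[t] = observed - simulated discharge and x[t] is an
    exogenous signal (e.g. an upstream gauge)."""
    rows, targets = [], []
    for t in range(p, len(residuals)):
        # regressors: the p most recent residuals (newest first) plus x[t]
        rows.append(list(residuals[t - p:t][::-1]) + [exog[t]])
        targets.append(residuals[t])
    coef, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return coef

def corrected_forecast(raw_forecast, recent_residuals, exog_now, coef):
    """Add the estimated future error back onto the raw model output."""
    feats = np.array(list(recent_residuals[::-1]) + [exog_now])
    return raw_forecast + feats @ coef
```

With a synthetic residual series that truly follows e[t] = 0.5 e[t-1] + 0.3 x[t], the fit recovers the coefficients exactly, and the correction shifts the forecast by the one-step-ahead error estimate.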

  17. Life events, salivary cortisol, and cognitive performance in nondemented subjects: a population-based study.

    Science.gov (United States)

    Ouanes, Sami; Castelao, Enrique; Gebreab, Sirak; von Gunten, Armin; Preisig, Martin; Popp, Julius

    2017-03-01

    Older people are particularly exposed to stressful events, known to activate the hypothalamus-pituitary-adrenal axis, resulting in increased cortisol levels. High cortisol has been associated with deleterious effects on cognition. We hypothesized that stressful life events could increase cortisol secretion, leading to cognitive impairment. A cross-sectional analysis was conducted using data from Colaus/PsyColaus, a longitudinal population-based study among Lausanne residents. Salivary cortisol samples were obtained from 796 nondemented subjects aged at least 65. A neuropsychological battery was used to assess cognitive performance and determine the Clinical Dementia Rating Sum of Boxes (CDRSOB). Lifetime life events and their subjective impact were assessed using a validated questionnaire. The total impact of life events was associated neither with cortisol area under the curve (AUC) nor with CDRSOB nor with any cognitive domain performance. The CDRSOB was associated with the cortisol AUC, controlling for age, sex, body mass index, education and depressive symptoms (p = 0.003; B = 0.686 [0.240; 1.333]; r = 0.114). This association between CDRSOB and the cortisol AUC remained significant after controlling for life events total impact (p = 0.040; B = 0.591 [0.027; 1.155]; r = 0.106). These findings do not support the hypothesis that stressful life events increase cortisol secretion leading to cognitive impairment. The association of higher cortisol levels with poorer cognition might not be a mere reflection of stressful events but rather explained by other factors, yet to be elucidated. Copyright © 2016 Elsevier Inc. All rights reserved.
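The cortisol area under the curve used above is commonly computed with the trapezoidal rule over the day's sampling times ("AUC with respect to ground"). The abstract does not state the exact formula used, so this is a sketch of the standard computation; the sample times and levels in the usage line are hypothetical:

```python
def auc_ground(times_h, levels):
    """Trapezoidal area under the salivary cortisol curve with respect
    to ground (AUCg); times in hours since waking, levels in nmol/L."""
    total = 0.0
    for i in range(1, len(times_h)):
        total += (levels[i] + levels[i - 1]) / 2.0 * (times_h[i] - times_h[i - 1])
    return total

# hypothetical four-sample day: waking, +30 min, afternoon, evening
print(auc_ground([0.0, 0.5, 4.0, 12.0], [12.0, 18.0, 8.0, 4.0]))
```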

  18. A Two-Account Life Insurance Model for Scenario-Based Valuation Including Event Risk

    DEFF Research Database (Denmark)

    Jensen, Ninna Reitzel; Schomacker, Kristian Juul

    2015-01-01

    Using a two-account model with event risk, we model life insurance contracts taking into account both guaranteed and non-guaranteed payments in participating life insurance as well as in unit-linked insurance. Here, event risk is used as a generic term for life insurance events, such as death… By use of a two-account model, we are able to illustrate general concepts without making the model too abstract. To allow for complicated financial markets without dramatically increasing the mathematical complexity, we focus on economic scenarios. We illustrate the use of our model by conducting scenario analysis based on Monte Carlo simulation, but the model applies to scenarios in general and to worst-case and best-estimate scenarios in particular. In addition to easy computations, our model offers a common framework for the valuation of life insurance payments across… and unit-linked insurance.
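A scenario-based Monte Carlo valuation of the kind mentioned above can be sketched in miniature: a guaranteed account accrues at a fixed rate while a unit-linked account follows simulated lognormal market returns, and the maturity payment is the larger of the two, discounted and averaged over paths. This is a toy illustration under assumed parameters, not the paper's two-account model; in particular, event risk (death, surrender) is omitted here:

```python
import math
import random

def scenario_value(premium=100.0, years=10, g_rate=0.01,
                   mu=0.05, sigma=0.15, r=0.02,
                   n_paths=5000, seed=42):
    """Monte Carlo value of a maturity payment
    max(guaranteed account, unit-linked account)."""
    random.seed(seed)
    guaranteed = premium * (1.0 + g_rate) ** years     # guaranteed account
    discount = math.exp(-r * years)
    total = 0.0
    for _ in range(n_paths):
        unit = premium                                 # unit-linked account
        for _ in range(years):                         # lognormal yearly returns
            unit *= math.exp(mu - 0.5 * sigma ** 2
                             + sigma * random.gauss(0.0, 1.0))
        total += discount * max(guaranteed, unit)
    return total / n_paths
```

Because the payoff never falls below the guarantee, the value is bounded below by the discounted guaranteed amount; worst-case and best-estimate scenarios would replace the random paths with fixed stress paths.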

  19. Power and sample size calculation for paired recurrent events data based on robust nonparametric tests.

    Science.gov (United States)

    Su, Pei-Fang; Chung, Chia-Hua; Wang, Yu-Wen; Chi, Yunchan; Chang, Ying-Ju

    2017-05-20

    The purpose of this paper is to develop a formula for calculating the required sample size for paired recurrent events data. The developed formula is based on robust non-parametric tests for comparing the marginal mean function of events between paired samples. This calculation can accommodate the associations among a sequence of paired recurrent event times with a specification of correlated gamma frailty variables for a proportional intensity model. We evaluate the performance of the proposed method with comprehensive simulations including the impacts of paired correlations, homogeneous or nonhomogeneous processes, marginal hazard rates, censoring rate, accrual and follow-up times, as well as the sensitivity analysis for the assumption of the frailty distribution. The use of the formula is also demonstrated using a premature infant study from the neonatal intensive care unit of a tertiary center in southern Taiwan. Copyright © 2017 John Wiley & Sons, Ltd.
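The gamma-frailty setup above can be illustrated with a simulation-based power check: paired event counts share a gamma frailty (mean 1) that induces within-pair correlation, and the pairs are compared with a test of the paired difference. This sketch substitutes a Wilcoxon signed-rank test on total counts for the authors' marginal-mean-function test, and all rates and shapes are assumed values; sample size would then be found by increasing `n_pairs` until the target power is reached:

```python
import numpy as np
from scipy import stats

def empirical_power(n_pairs, rate_control, rate_treated,
                    frailty_shape=2.0, n_sims=200, alpha=0.05, seed=7):
    """Empirical power under a shared gamma frailty: each pair's counts
    are Poisson with means frailty * rate, and pairs are compared with
    a Wilcoxon signed-rank test."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_sims):
        frailty = rng.gamma(frailty_shape, 1.0 / frailty_shape, size=n_pairs)
        counts_c = rng.poisson(frailty * rate_control)
        counts_t = rng.poisson(frailty * rate_treated)
        res = stats.wilcoxon(counts_c, counts_t, zero_method="zsplit")
        rejections += res.pvalue < alpha
    return rejections / n_sims
```

With 50 pairs and rates 2 vs. 4, the detected difference is large, so power should be high; under equal rates the rejection rate should sit near the nominal alpha.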

  20. Stabilization of Networked Distributed Systems with Partial and Event-Based Couplings

    Directory of Open Access Journals (Sweden)

    Sufang Zhang

    2015-01-01

    Full Text Available The stabilization problem of networked distributed systems with partial and event-based couplings is investigated. The channels, which are used to transmit different levels of information between agents, are considered. A channel matrix is introduced to indicate the working state of the channels. An event condition is designed for each channel to govern its sampling instants. Since the event conditions are given separately for different channels, the sampling instants of the channels are mutually independent. To stabilize the system, state feedback controllers are implemented; the control signals also suffer from the two communication constraints. Sufficient conditions in terms of linear matrix inequalities (LMIs) are proposed to ensure stabilization of the controlled system. Finally, a numerical example is given to demonstrate the advantage of the results.
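The event-triggered mechanism described above can be sketched for a single channel: the controller holds the last transmitted state, and a new sample is sent only when the gap between the true state and the held state exceeds a threshold proportional to the current state norm. This is a generic discrete-time illustration with an assumed relative-error event condition and assumed system matrices, not the paper's LMI design:

```python
import numpy as np

def simulate_event_triggered(A, B, K, x0, steps=200, sigma=0.3):
    """State feedback u = K @ x_hat, where x_hat is the last transmitted
    state; the channel transmits only when the event condition fires."""
    x = np.array(x0, dtype=float)
    x_hat = x.copy()              # state currently held by the controller
    events = 0
    for _ in range(steps):
        # event condition: measurement gap exceeds sigma * ||x||
        if np.linalg.norm(x - x_hat) > sigma * np.linalg.norm(x):
            x_hat = x.copy()      # sampling instant: transmit current state
            events += 1
        u = K @ x_hat
        x = A @ x + B @ u
    return x, events

A = np.array([[1.10, 0.10],
              [0.00, 1.05]])     # open-loop unstable
B = np.eye(2)
K = 0.9 * np.eye(2) - A         # places the closed-loop poles at 0.9
x_final, events = simulate_event_triggered(A, B, K, [1.0, 1.0])
```

The state still converges even though far fewer than `steps` transmissions occur, which is the point of event-based sampling.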