Interrupted time-series analysis: studying trends in neurosurgery.
Wong, Ricky H; Smieliauskas, Fabrice; Pan, I-Wen; Lam, Sandi K
2015-12-01
OBJECT Neurosurgery studies traditionally have evaluated the effects of interventions on health care outcomes by studying overall changes in measured outcomes over time. Yet, this type of linear analysis is limited due to lack of consideration of the trend's effects both pre- and postintervention and the potential for confounding influences. The aim of this study was to illustrate interrupted time-series analysis (ITSA) as applied to an example in the neurosurgical literature and highlight ITSA's potential for future applications. METHODS The methods used in previous neurosurgical studies were analyzed and then compared with the methodology of ITSA. RESULTS The ITSA method was identified in the neurosurgical literature as an important technique for isolating the effect of an intervention (such as a policy change or a quality and safety initiative) on a health outcome independent of other factors driving trends in the outcome. The authors determined that ITSA allows for analysis of the intervention's immediate impact on outcome level and on subsequent trends and enables a more careful measure of the causal effects of interventions on health care outcomes. CONCLUSIONS ITSA represents a significant improvement over traditional observational study designs in quantifying the impact of an intervention. ITSA is a useful statistical procedure to understand, consider, and implement as the field of neurosurgery evolves in sophistication in big-data analytics, economics, and health services research.
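The segmented-regression model underlying most ITSA applications can be sketched in a few lines. The example below is a hypothetical illustration with simulated data, not taken from any study summarized here: it fits Y_t = b0 + b1·t + b2·D_t + b3·(t − T0)·D_t by ordinary least squares, where D_t indicates the post-intervention period, so b2 estimates the immediate level change and b3 the change in trend.

```python
# Illustrative segmented regression for a single-group interrupted time series.
# All data are simulated; pure-stdlib implementation for clarity.

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_itsa(y, t0):
    """OLS for the four-parameter ITSA model via the normal equations."""
    n, p = len(y), 4
    X = [[1.0, t, 1.0 if t >= t0 else 0.0, float(t - t0) if t >= t0 else 0.0]
         for t in range(n)]
    XtX = [[sum(X[i][j] * X[i][k] for i in range(n)) for k in range(p)] for j in range(p)]
    Xty = [sum(X[i][j] * y[i] for i in range(n)) for j in range(p)]
    return solve(XtX, Xty)

# Simulated series: baseline level 50, slope +0.5/period; at t0 = 20 the
# intervention drops the level by 8 and flattens the slope to 0.
t0 = 20
y = [50 + 0.5 * t if t < t0 else 50 + 0.5 * t0 - 8 for t in range(40)]
b0, b1, b2, b3 = fit_itsa(y, t0)
print(round(b2, 2), round(b3, 2))  # level change ≈ -8, trend change ≈ -0.5
```

With noise-free simulated data the fit recovers the built-in level drop and slope change exactly; real applications add a noise term and, as later entries in this listing note, checks for autocorrelation and seasonality.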
Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.
Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias
2016-01-01
To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
The Impact of the Hotel Room Tax: An Interrupted Time Series Approach
Bonham, Carl; Fujii, Edwin; Im, Eric; Mak, James
1992-01-01
Employs interrupted time series analysis to estimate ex post the impact of a hotel room tax on real net hotel revenues by analyzing that time series before and after the imposition of the tax. Finds that the tax had a negligible effect on real hotel revenues.
Jandoc, Racquel; Burden, Andrea M; Mamdani, Muhammad; Lévesque, Linda E; Cadarette, Suzanne M
2015-08-01
To describe the use and reporting of interrupted time series methods in drug utilization research. We completed a systematic search of MEDLINE, Web of Science, and reference lists to identify English language articles through to December 2013 that used interrupted time series methods in drug utilization research. We tabulated the number of studies by publication year and summarized methodological detail. We identified 220 eligible empirical applications since 1984. Only 17 (8%) were published before 2000, and 90 (41%) were published since 2010. Segmented regression was the most commonly applied interrupted time series method (67%). Most studies assessed drug policy changes (51%, n = 112); 22% (n = 48) examined the impact of new evidence, 18% (n = 39) examined safety advisories, and 16% (n = 35) examined quality improvement interventions. Autocorrelation was considered in 66% of studies, 31% reported adjusting for seasonality, and 15% accounted for nonstationarity. Use of interrupted time series methods in drug utilization research has increased, particularly in recent years. Despite methodological recommendations, there is large variation in reporting of analytic methods. Developing methodological and reporting standards for interrupted time series analysis is important to improve its application in drug utilization research, and we provide recommendations for consideration. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
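The autocorrelation issue flagged above (considered in only 66% of studies) is easy to screen for. The sketch below computes the Durbin-Watson statistic on two simulated residual series; values near 2 suggest little lag-1 autocorrelation, while values well below 2 suggest positive autocorrelation. The data are invented for illustration.

```python
import random

def durbin_watson(resid):
    """Durbin-Watson statistic: ~2 for independent residuals, <2 when
    successive residuals are positively correlated."""
    num = sum((resid[i] - resid[i - 1]) ** 2 for i in range(1, len(resid)))
    den = sum(r * r for r in resid)
    return num / den

random.seed(1)
white = [random.gauss(0, 1) for _ in range(500)]  # independent noise
ar = [0.0]
for e in white:
    ar.append(0.8 * ar[-1] + e)                   # AR(1) residuals, rho = 0.8

print(round(durbin_watson(white), 2))   # near 2: little autocorrelation
print(round(durbin_watson(ar[1:]), 2))  # well below 2: positive autocorrelation
```

A low value on segmented-regression residuals would argue for a correction such as Newey-West standard errors or an ARIMA error structure rather than plain OLS.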
Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew
2014-01-01
Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is a graphical method useful to support time series analyses in general but in particular in relation to interrupted time series designs. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.
Price, Cristofer; Unlu, Fatih
2014-01-01
The Comparative Short Interrupted Time Series (C-SITS) design is a frequently employed quasi-experimental method in which the pre- and post-intervention changes observed in the outcome levels of a treatment group are compared with those of a comparison group, and the difference between the former and the latter is attributed to the treatment. The…
Comparison Groups in Short Interrupted Time-Series: An Illustration Evaluating No Child Left Behind
Wong, Manyee; Cook, Thomas D.; Steiner, Peter M.
2009-01-01
Interrupted time-series (ITS) are often used to assess the causal effect of a planned or even unplanned shock introduced into an on-going process. The pre-intervention slope is supposed to index the causal counterfactual, and deviations from it in mean, slope or variance are used to indicate an effect. However, a secure causal inference is only…
Linden, Ariel; Yarnold, Paul R
2016-12-01
Single-group interrupted time series analysis (ITSA) is a popular evaluation methodology in which a single unit of observation is being studied, the outcome variable is serially ordered as a time series and the intervention is expected to 'interrupt' the level and/or trend of the time series, subsequent to its introduction. Given that the internal validity of the design rests on the premise that the interruption in the time series is associated with the introduction of the treatment, treatment effects may seem less plausible if a parallel trend already exists in the time series prior to the actual intervention. Thus, sensitivity analyses should focus on detecting structural breaks in the time series before the intervention. In this paper, we introduce a machine-learning algorithm called optimal discriminant analysis (ODA) as an approach to determine if structural breaks can be identified in years prior to the initiation of the intervention, using data from California's 1988 voter-initiated Proposition 99 to reduce smoking rates. The ODA analysis indicates that numerous structural breaks occurred prior to the actual initiation of Proposition 99 in 1989, including perfect structural breaks in 1983 and 1985, thereby casting doubt on the validity of treatment effects estimated for the actual intervention when using a single-group ITSA design. Given the widespread use of ITSA for evaluating observational data and the increasing use of machine-learning techniques in traditional research, we recommend that structural break sensitivity analysis is routinely incorporated in all research using the single-group ITSA design. © 2016 John Wiley & Sons, Ltd.
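ODA itself is a specialised machine-learning method; as a simple stand-in (not the ODA algorithm), the sketch below scans a simulated pre-intervention series for the split point that most reduces within-segment squared error, a crude structural-break screen in the same spirit as the sensitivity analysis recommended above.

```python
# Crude structural-break screen on a simulated pre-intervention series.
# This is a stand-in illustration, not optimal discriminant analysis (ODA).

def sse(xs):
    """Sum of squared deviations from the segment mean."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs)

def best_break(y):
    """Return the split index minimising two-segment SSE, plus the
    proportional error reduction relative to a single-mean fit."""
    total = sse(y)
    best = min(range(2, len(y) - 2), key=lambda k: sse(y[:k]) + sse(y[k:]))
    return best, 1 - (sse(y[:best]) + sse(y[best:])) / total

# Simulated pre-intervention series with a hidden level shift at t = 12
y = [10.0] * 12 + [7.0] * 13
k, gain = best_break(y)
print(k, round(gain, 2))  # break located at t = 12 with near-total error reduction
```

A large error reduction at a pre-intervention split point, as here, is exactly the kind of evidence that would cast doubt on attributing a later "interruption" to the intervention.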
Linden, Ariel
2018-05-11
Interrupted time series analysis (ITSA) is an evaluation methodology in which a single treatment unit's outcome is studied serially over time and the intervention is expected to "interrupt" the level and/or trend of that outcome. ITSA is commonly evaluated using methods which may produce biased results if model assumptions are violated. In this paper, treatment effects are alternatively assessed by using forecasting methods to closely fit the preintervention observations and then forecast the post-intervention trend. A treatment effect may be inferred if the actual post-intervention observations diverge from the forecasts by some specified amount. The forecasting approach is demonstrated using the effect of California's Proposition 99 for reducing cigarette sales. Three forecast models are fit to the preintervention series: linear regression (REG), Holt-Winters (HW) non-seasonal smoothing, and autoregressive integrated moving average (ARIMA). Forecasts are then generated into the post-intervention period, and the actual observations are compared with the forecasts to assess intervention effects. The preintervention data were fit best by HW, followed closely by ARIMA; REG fit the data poorly. The actual post-intervention observations were above the forecasts in HW and ARIMA, suggesting no intervention effect, but below the forecasts in REG (suggesting a treatment effect), thereby raising doubts about any definitive conclusion of a treatment effect. In a single-group ITSA, treatment effects are likely to be biased if the model is misspecified. Therefore, evaluators should consider using forecast models to accurately fit the preintervention data and generate plausible counterfactual forecasts, thereby improving causal inference of treatment effects in single-group ITSA studies. © 2018 John Wiley & Sons, Ltd.
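The forecasting logic can be sketched with the simplest of the three models (REG): fit the pre-intervention trend, project it forward as the counterfactual, and inspect how far the actual post-intervention observations diverge. The numbers below are simulated, not the Proposition 99 series.

```python
# Forecast-based single-group ITSA sketch: linear (REG) counterfactual only.
# Data are simulated for illustration.

def ols_line(y):
    """Fit y_t = a + b*t over t = 0..n-1; return (intercept, slope)."""
    n = len(y)
    tbar = (n - 1) / 2
    ybar = sum(y) / n
    slope = (sum((t - tbar) * (y[t] - ybar) for t in range(n))
             / sum((t - tbar) ** 2 for t in range(n)))
    return ybar - slope * tbar, slope

pre = [100 - 1.0 * t for t in range(20)]           # declining pre-trend
post = [100 - 1.0 * t - 5 for t in range(20, 30)]  # extra drop after intervention

a, b = ols_line(pre)                               # fit pre-period only
forecast = [a + b * t for t in range(20, 30)]      # project the counterfactual
divergence = [round(actual - f, 2) for actual, f in zip(post, forecast)]
print(divergence[0], divergence[-1])  # actuals sit about 5 units below forecast
```

In practice one would also generate HW and ARIMA forecasts, as the paper does, and treat a divergence as an effect only if it exceeds a prespecified threshold such as the forecast interval.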
Kastner, Monika; Sawka, Anna M; Hamid, Jemila; Chen, Maggie; Thorpe, Kevin; Chignell, Mark; Ewusie, Joycelyne; Marquez, Christine; Newton, David; Straus, Sharon E
2014-09-25
Osteoporosis affects over 200 million people worldwide at a high cost to healthcare systems, yet gaps in management still exist. In response, we developed a multi-component osteoporosis knowledge translation (Op-KT) tool involving a patient-initiated risk assessment questionnaire (RAQ), which generates individualized best practice recommendations for physicians and customized education for patients at the point of care. The objective of this study was to evaluate the effectiveness of the Op-KT tool for appropriate disease management by physicians. The Op-KT tool was evaluated using an interrupted time series design. This involved multiple assessments of the outcomes 12 months before (baseline) and 12 months after tool implementation (52 data points in total). Inclusion criteria were family physicians and their patients at risk for osteoporosis (women aged ≥ 50 years, men aged ≥ 65 years). Primary outcomes were the initiation of appropriate osteoporosis screening and treatment. Analyses included segmented linear regression modeling and analysis of variance. The Op-KT tool was implemented in three family practices in Ontario, Canada, representing five family physicians with 2840 age-eligible patients (mean age 67 years; 76% women). Time series regression models showed an overall increase from baseline in the initiation of screening (3.4%; P …) … management addressed by their physician. Study limitations included the inherent susceptibility of our design compared with a randomized trial. The multicomponent Op-KT tool significantly increased osteoporosis investigations in three family practices, and highlights its potential to facilitate patient self-management. Next steps include wider implementation and evaluation of the tool in primary care.
Morgan, Lauren; Pickering, Sharon P; Hadi, Mohammed; Robertson, Eleanor; New, Steve; Griffin, Damian; Collins, Gary; Rivero-Arias, Oliver; Catchpole, Ken; McCulloch, Peter
2015-02-01
Teamwork training and system standardisation have both been proposed to reduce error and harm in surgery. Since the approaches differ markedly, there is potential for synergy between them. Controlled interrupted time series with a 3-month intervention and observation phases before and after. Operating theatres conducting elective orthopaedic surgery in a single hospital system (UK Hospital Trust). Teamwork training based on crew resource management plus training and follow-up support in developing standardised operating procedures. Focus of subsequent standardisation efforts decided by theatre staff. Paired observers watched whole procedures together. We assessed non-technical skills using NOTECHS II, technical performance using glitch rate, and compliance with the WHO checklist using a simple quality tool. We measured complication and readmission rates and hospital stay using hospital administrative records. Before/after change was compared in the active and control groups using two-way ANOVA and regression models. 1121 patients were operated on before and 1100 after intervention. 44 operations were observed before and 50 afterwards. Non-technical skills (p=0.002) and WHO compliance (p …) … teamwork and system improvement causes marked improvements in team behaviour and WHO performance, but not technical performance or outcome. These findings are consistent with the synergistic hypothesis, but larger controlled studies with a strong implementation strategy are required to test potential outcome effects. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Milojevic, Ai; Armstrong, Ben; Wilkinson, Paul
2017-10-01
There is emerging evidence that people affected by flooding suffer adverse impacts on their mental well-being, mostly based on self-reports. We examined prescription records for drugs used in the management of common mental disorder among primary care practices located in the vicinity of recent large flood events in England, 2011-2014. A controlled interrupted time series analysis was conducted of the number of prescribing items for antidepressant drugs in the year before and after the flood onset. Pre-post changes were compared by distance of the practice from the inundated boundaries among 930 practices located within 10 km of a flood. After control for deprivation and population density, there was an increase of 0.59% (95% CI 0.24 to 0.94) in prescriptions in the postflood year among practices located within 1 km of a flood, over and above the change observed in the furthest distance band. The increase was greater in more deprived areas. This study suggests an increase in prescribed antidepressant drugs in the year after flooding in primary care practices close to recent major floods in England. The degree to which the increase is actually concentrated in those flooded can only be determined by more detailed linkage studies. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
St.Clair, Travis; Cook, Thomas D.; Hallberg, Kelly
2014-01-01
Although evaluators often use an interrupted time series (ITS) design to test hypotheses about program effects, there are few empirical tests of the design's validity. We take a randomized experiment on an educational topic and compare its effects to those from a comparative ITS (CITS) design that uses the same treatment group as the experiment…
St. Clair, Travis; Hallberg, Kelly; Cook, Thomas D.
2016-01-01
We explore the conditions under which short, comparative interrupted time-series (CITS) designs represent valid alternatives to randomized experiments in educational evaluations. To do so, we conduct three within-study comparisons, each of which uses a unique data set to test the validity of the CITS design by comparing its causal estimates to…
Effect of an evidence-based website on healthcare usage: an interrupted time-series study
Spoelman, Wouter A; Bonten, Tobias N; de Waal, Margot W M; Drenthen, Ton; Smeele, Ivo J M; Nielen, Markus M J; Chavannes, Niels H
2016-01-01
Objectives Healthcare costs and usage are rising. Evidence-based online health information may reduce healthcare usage, but the evidence is scarce. The objective of this study was to determine whether the release of a nationwide evidence-based health website was associated with a reduction in healthcare usage. Design Interrupted time series analysis of observational primary care data of healthcare use in the Netherlands from 2009 to 2014. Setting General community primary care. Population 912 000 patients who visited their general practitioners 18.1 million times during the study period. Intervention In March 2012, an evidence-based health information website was launched by the Dutch College of General Practitioners. It was easily accessible and understandable, using plain language. At the end of the study period, the website had 2.9 million unique page views per month. Main outcome measures The primary outcome was the change in consultation rate (consultations/1000 patients/month) before and after the release of the website. Additionally, a reference group was created by including consultations about topics not viewed on the website. Subgroup analyses were performed for type of consultation, sex, age and socioeconomic status. Results After launch of the website, the trend in consultation rate decreased by 1.620 consultations/1000 patients/month (p<0.001). This corresponds to a 12% decline in consultations 2 years after launch of the website. The trend in consultation rate of the reference group showed no change. The subgroup analyses showed a specific decline for consultations by phone, and the decline was significant for all other subgroups except the youngest age group. Conclusions Healthcare usage decreased by 12% after providing high-quality evidence-based online health information. These findings show that e-Health can be effective to improve self-management and reduce healthcare usage in times of increasing healthcare costs. PMID:28186945
Directory of Open Access Journals (Sweden)
David K Humphreys
On November 24th 2005, the Government of England and Wales removed regulatory restrictions on the times at which licensed premises could sell alcohol. This study tests availability theory by treating the implementation of the Licensing Act (2003) as a natural experiment in alcohol policy. An interrupted time series design was employed to estimate the Act's immediate and delayed impact on violence in the City of Manchester (population 464,200). We collected police recorded rates of violence, robbery, and total crime between the 1st of February 2004 and the 31st of December 2007. Events were aggregated by week, yielding a total of 204 observations (95 pre- and 109 post-intervention). Secondary analysis examined changes in daily patterns of violence. Pre- and post-intervention events were separated into four three-hour segments: 18:00-20:59, 21:00-23:59, 00:00-02:59, and 03:00-05:59. Analysis found no evidence that the Licensing Act (2003) affected the overall volume of violence. However, analyses of night-time violence found a gradual and permanent shift of weekend violence into later parts of the night. The results estimated an initial increase of 27.5% between 03:00 and 06:00 (ω = 0.2433, 95% CI = 0.06, 0.42), which increased to 36% by the end of the study period (δ = -0.897, 95% CI = -1.02, -0.77). This study found no evidence that a national policy increasing the physical availability of alcohol affected the overall volume of violence. There was, however, evidence suggesting that the policy may be associated with changes to patterns of violence in the early morning (3 a.m. to 6 a.m.).
Repetitive Series Interrupter II.
1977-07-01
Scott, John W; Nyinawankusi, Jeanne D'Arc; Enumah, Samuel; Maine, Rebecca; Uwitonze, Eric; Hu, Yihan; Kabagema, Ignace; Byiringiro, Jean Claude; Riviello, Robert; Jayaraman, Sudha
2017-07-01
Injury is a major cause of premature death and disability in East Africa, and high-quality pre-hospital care is essential for optimal trauma outcomes. The Rwandan pre-hospital emergency care service (SAMU) uses an electronic database to evaluate and optimize pre-hospital care through a continuous quality improvement programme (CQIP), beginning March 2014. The SAMU database was used to assess pre-hospital quality metrics including supplementary oxygen for hypoxia (O2), intravenous fluids for hypotension (IVF), cervical collar placement for head injuries (c-collar), and either splinting (splint) or administration of pain medications (pain) for long bone fractures. Targets of >90% were set for each metric, and daily team meetings and monthly feedback sessions were implemented to address opportunities for improvement. These five pre-hospital quality metrics were assessed monthly before and after implementation of the CQIP. Met and unmet needs for O2, IVF, and c-collar were combined into a summative monthly SAMU Trauma Quality Score (STQ score). An interrupted time series linear regression model compared the STQ score during the 14 months before CQIP implementation to the first 14 months after. During the 29-month study period 3,822 patients met study criteria. 1,028 patients needed one or more of the five studied interventions during the study period. All five endpoints had a significant increase between the pre-CQIP and post-CQIP periods (p …) … Rwanda. This programme may be used as an example for additional efforts engaging frontline staff with real-time data feedback in order to rapidly translate data collection efforts into improved care for the injured in a resource-limited setting. Copyright © 2017 Elsevier Ltd. All rights reserved.
Interrupted time-series analysis of regulations to reduce paracetamol (acetaminophen) poisoning.
Oliver W Morgan
2007-04-01
Paracetamol (acetaminophen) poisoning is the leading cause of acute liver failure in Great Britain and the United States. Successful interventions to reduce harm from paracetamol poisoning are needed. To achieve this, the government of the United Kingdom introduced legislation in 1998 limiting the pack size of paracetamol sold in shops. Several studies have reported recent decreases in fatal poisonings involving paracetamol. We use interrupted time-series analysis to evaluate whether the recent fall in the number of paracetamol deaths is different to trends in fatal poisoning involving aspirin, paracetamol compounds, antidepressants, or nondrug poisoning suicide. We calculated directly age-standardised mortality rates for paracetamol poisoning in England and Wales from 1993 to 2004. We used an ordinary least-squares regression model divided into pre- and postintervention segments at 1999. The model included a term for autocorrelation within the time series. We tested for changes in the level and slope between the pre- and postintervention segments. To assess whether observed changes in the time series were unique to paracetamol, we compared against poisoning deaths involving compound paracetamol (not covered by the regulations), aspirin, antidepressants, and nonpoisoning suicide deaths. We did this comparison by calculating a ratio of each comparison series with paracetamol and applying a segmented regression model to the ratios. No change in the ratio level or slope indicated no difference compared to the control series. There were about 2,200 deaths involving paracetamol. The age-standardised mortality rate rose from 8.1 per million in 1993 to 8.8 per million in 1997, subsequently falling to about 5.3 per million in 2004. After the regulations were introduced, deaths dropped by 2.69 per million (p = 0.003). Trends in the age-standardised mortality rate for paracetamol compounds, aspirin, and antidepressants were broadly similar to paracetamol…
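The ratio-based control comparison described above can be sketched as follows, with invented numbers: each comparison series is divided by the paracetamol series, and a shift in the ratio at the intervention would indicate that the two series did not move in parallel.

```python
# Ratio-of-series control check, in the spirit of the study above.
# All figures are invented for illustration (rates per million).

paracetamol      = [8.0, 8.2, 8.4, 8.6, 8.8, 7.5, 7.0, 6.5, 6.0, 5.5]
aspirin_parallel = [4.0, 4.1, 4.2, 4.3, 4.4, 3.75, 3.5, 3.25, 3.0, 2.75]

# Divide the comparison series by the paracetamol series, element-wise.
ratio = [a / p for a, p in zip(aspirin_parallel, paracetamol)]

# Compare the mean ratio before and after the intervention (index 5).
pre_mean = sum(ratio[:5]) / 5
post_mean = sum(ratio[5:]) / 5
print(round(post_mean - pre_mean, 3))  # ≈ 0: comparison moved in parallel
```

A segmented regression on the ratio series, as the study applies, generalises this mean comparison by also testing for a change in the ratio's slope.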
A robust interrupted time series model for analyzing complex health care intervention data
Cruz, Maricela; Bender, Miriam; Ombao, Hernando
2017-01-01
Current health policy calls for greater use of evidence-based care delivery services to improve patient quality and safety outcomes. Care delivery is complex, with interacting and interdependent components that challenge traditional statistical analytic techniques, in particular, when modeling a time series of outcomes data that might be…
Anne Margreet van Dishoeck
2014-06-01
Background: In patients with acute ischemic stroke, early treatment with recombinant tissue plasminogen activator (rtPA) improves functional outcome by effectively reducing disability and dependency. Timely thrombolysis, within 1 h, is a vital aspect of acute stroke treatment, and is reflected in the widely used performance indicator 'door-to-needle time' (DNT). DNT measures the time from the moment the patient enters the emergency department until he/she receives intravenous rtPA. The purpose of the study was to measure quality improvement from the first implementation of thrombolysis in stroke patients in a university hospital in the Netherlands. We further aimed to identify specific interventions that affect DNT. Methods: We included all patients with acute ischemic stroke consecutively admitted to a large university hospital in the Netherlands between January 2006 and December 2012, and focused on those treated with thrombolytic therapy on admission. Data were collected routinely for research purposes and internal quality measurement (the Erasmus Stroke Study). We used a retrospective interrupted time series design to study the trend in DNT, analyzed by means of segmented regression. Results: Between January 2006 and December 2012, 1,703 patients with ischemic stroke were admitted and 262 (17%) were treated with rtPA. Patients treated with thrombolysis were on average 63 years old at the time of the stroke and 52% were male. Mean age (p = 0.58) and sex distribution (p = 0.98) did not change over the years. The proportion treated with thrombolysis increased from 5% in 2006 to 22% in 2012. In 2006, none of the patients were treated within 1 h. In 2012, this had increased to 81%. In a logistic regression analysis, this trend was significant (OR 1.6 per year, CI 1.4-1.8). The median DNT was reduced from 75 min in 2006 to 45 min in 2012 (p …). Conclusion and Implications: The DNT steadily improved from the first implementation of thrombolysis. Specific…
Pridemore, William Alex; Chamlin, Mitchell B; Cochran, John K
2007-06-01
The dissolution of the Soviet Union resulted in sudden, widespread, and fundamental changes to Russian society. The former social welfare system-with its broad guarantees of employment, healthcare, education, and other forms of social support-was dismantled in the shift toward democracy, rule of law, and a free-market economy. This unique natural experiment provides a rare opportunity to examine the potentially disintegrative effects of rapid social change on deviance, and thus to evaluate one of Durkheim's core tenets. We took advantage of this opportunity by performing interrupted time-series analyses of annual age-adjusted homicide, suicide, and alcohol-related mortality rates for the Russian Federation using data from 1956 to 2002, with 1992-2002 as the postintervention time-frame. The ARIMA models indicate that, controlling for the long-term processes that generated these three time series, the breakup of the Soviet Union was associated with an appreciable increase in each of the cause-of-death rates. We interpret these findings as being consistent with the Durkheimian hypothesis that rapid social change disrupts social order, thereby increasing the level of crime and deviance.
Directory of Open Access Journals (Sweden)
Sylvie Bastuji-Garin
In uncontrolled before-after studies, CONSORT was shown to improve the reporting of randomised trials. Before-after studies ignore underlying secular trends and may overestimate the impact of interventions. Our aim was to assess the impact of the 2007 STROBE statement publication on the quality of observational study reporting, using both uncontrolled before-after analyses and interrupted time series. For this quasi-experimental study, original articles reporting cohort, case-control, and cross-sectional studies published between 2004 and 2010 in the four dermatological journals with the highest 5-year impact factors (≥ 4) were selected. We compared the proportions of STROBE items (STROBE score) adequately reported in each article during three periods: two pre-STROBE periods (2004-2005 and 2006-2007) and one post-STROBE period (2008-2010). Segmented regression analysis of interrupted time series was also performed. Of the 456 included articles, 187 (41%) reported cohort studies, 166 (36.4%) cross-sectional studies, and 103 (22.6%) case-control studies. The median STROBE score was 57% (range, 18%-98%). Before-after analysis evidenced significant STROBE score increases between the two pre-STROBE periods and between the earliest pre-STROBE period and the post-STROBE period (median score 2004-05, 48% versus median score 2008-10, 58%; p<0.001) but not between the immediate pre-STROBE period and the post-STROBE period (median score 2006-07, 58% versus median score 2008-10, 58%; p = 0.42). In the pre-STROBE period, the six-monthly mean STROBE score increased significantly, by 1.19% per six-month period (absolute increase 95% CI, 0.26% to 2.11%; p = 0.016). By segmented analysis, no significant change in STROBE score trend occurred after STROBE statement publication (-0.40%; 95% CI, -2.20 to 1.41; p = 0.64). The quality of reports increased over time but was not affected by STROBE. Our findings raise concerns about the relevance of uncontrolled before-after analyses.
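The segmented regression model this record relies on is conventionally parameterised with three terms: a baseline trend, an immediate level change at the interruption, and a change in trend afterwards. A minimal sketch on simulated data (all variable names, periods, and effect sizes here are illustrative, not taken from the study):

```python
# Segmented regression of an interrupted time series on simulated data.
# 'post' captures the immediate level change; 'time_after' the trend change.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n, t0 = 24, 12                                        # interruption at period 12
df = pd.DataFrame({"time": np.arange(n)})
df["post"] = (df["time"] >= t0).astype(int)           # level-change indicator
df["time_after"] = np.maximum(0, df["time"] - t0)     # slope-change term
# simulate: baseline slope 1.2, level jump 5, extra slope 0.8 post-interruption
df["score"] = (50 + 1.2 * df["time"] + 5 * df["post"]
               + 0.8 * df["time_after"] + rng.normal(0, 1, n))

fit = smf.ols("score ~ time + post + time_after", data=df).fit()
print(fit.params)   # 'post' = immediate level change, 'time_after' = trend change
```

Fitting the pre-interruption trend explicitly is what lets this design separate a genuine intervention effect from the secular improvement that an uncontrolled before-after comparison would absorb into its estimate.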
van de Pol, Ineke; van Iterson, Mat; Maaskant, Jolanda
2017-08-01
Delirium in critically ill patients is a common multifactorial disorder that is associated with various negative outcomes. It is assumed that sleep disturbances can result in an increased risk of delirium. This study hypothesized that implementing a protocol that reduces overall nocturnal sound levels improves quality of sleep and reduces the incidence of delirium in Intensive Care Unit (ICU) patients. This interrupted time series study was performed in an adult mixed medical and surgical 24-bed ICU. A pre-intervention group of 211 patients was compared with a post-intervention group of 210 patients after implementation of a nocturnal sound-reduction protocol. Primary outcome measures were incidence of delirium, measured by the Intensive Care Delirium Screening Checklist (ICDSC), and quality of sleep, measured by the Richards-Campbell Sleep Questionnaire (RCSQ). Secondary outcome measures were use of sleep-inducing medication, delirium treatment medication, and patient-perceived nocturnal noise. A significant difference in slope in the percentage of delirium was observed between the pre- and post-intervention periods (-3.7% per time period, p=0.02). Quality of sleep was unaffected (0.3 per time period, p=0.85). The post-intervention group used significantly less sleep-inducing medication. The incidence of delirium decreased after implementation of the nocturnal sound-reduction protocol. However, reported sleep quality did not improve. Copyright © 2017. Published by Elsevier Ltd.
Dayer, Mark J; Jones, Simon; Prendergast, Bernard; Baddour, Larry M; Lockhart, Peter B; Thornhill, Martin H
2015-03-28
Antibiotic prophylaxis given before invasive dental procedures in patients at risk of developing infective endocarditis has historically been the focus of infective endocarditis prevention. Recent changes in antibiotic prophylaxis guidelines in the USA and Europe have substantially reduced the number of patients for whom antibiotic prophylaxis is recommended. In the UK, guidelines from the National Institute for Health and Clinical Excellence (NICE) recommended complete cessation of antibiotic prophylaxis for prevention of infective endocarditis in March, 2008. We aimed to investigate changes in the prescribing of antibiotic prophylaxis and the incidence of infective endocarditis since the introduction of these guidelines. We did a retrospective secular trend study, analysed as an interrupted time series, to investigate the effect of antibiotic prophylaxis versus no prophylaxis on the incidence of infective endocarditis in England. We analysed data for the prescription of antibiotic prophylaxis from Jan 1, 2004, to March 31, 2013, and hospital discharge episode statistics for patients with a primary diagnosis of infective endocarditis from Jan 1, 2000, to March 31, 2013. We compared the incidence of infective endocarditis before and after the introduction of the NICE guidelines using segmented regression analysis of the interrupted time series. Prescriptions of antibiotic prophylaxis for the prevention of infective endocarditis fell substantially after introduction of the NICE guidance (mean 10,900 prescriptions per month [Jan 1, 2004, to March 31, 2008] vs 2236 prescriptions per month [April 1, 2008, to March 31, 2013]). The incidence of infective endocarditis increased significantly above the projected historical trend, by 0·11 cases per 10 million people per month (95% CI 0·05-0·16). This increase in incidence was significant both for individuals at high risk of infective endocarditis and for those at lower risk. Although our data do not establish a causal association, antibiotic prophylaxis prescriptions fell substantially and the incidence of infective endocarditis increased significantly after introduction of the NICE guidance.
Helder, Onno K.; Brug, Johannes; van Goudoever, Johannes B.; Looman, Caspar W. N.; Reiss, Irwin K. M.; Kornelisse, René F.
2014-01-01
Sustained high compliance with hand hygiene (HH) is needed to reduce nosocomial bloodstream infections (NBSIs). However, over time, a wash-out effect often occurs. We studied the long-term effect of sequential HH-promoting interventions. An observational study with an interrupted time series design was performed.
Lee, John Tayu; Netuveli, Gopalakrishnan; Majeed, Azeem; Millett, Christopher
2011-01-01
The Quality and Outcomes Framework (QOF), a major pay-for-performance programme, was introduced into United Kingdom primary care in April 2004. The impact of this programme on disparities in health care remains unclear. This study examines the following questions: has this pay-for-performance programme improved the quality of care for coronary heart disease, stroke and hypertension in white, black and south Asian patients? Has this programme reduced disparities in the quality of care between these ethnic groups? Did general practices with different baseline performance respond differently to this programme? Retrospective cohort study of patients registered with family practices in Wandsworth, London during 2007. Segmented regression analysis of interrupted time series was used to take into account the previous time trend. Primary outcome measures were mean systolic and diastolic blood pressure, and cholesterol levels. Our findings suggest that the implementation of QOF resulted in significant short-term improvements in blood pressure control. The magnitude of benefit varied between ethnic groups, with a statistically significant short-term reduction in systolic BP in white and black but not in south Asian patients with hypertension. Disparities in risk factor control were attenuated on only a few measures and largely remained intact at the end of the study period. Pay-for-performance programmes such as the QOF in the UK should set challenging but achievable targets. Specific targets aimed at reducing ethnic disparities in health care may also be needed.
Abimpaye, Monique; Kirk, Catherine M; Iyer, Hari S; Gupta, Neil; Remera, Eric; Mugwaneza, Placidie; Law, Michael R
2018-01-01
Nearly a quarter of a million children have acquired HIV, prompting the implementation of new protocols, Option B and B+, for treating HIV-positive pregnant women. While efficacy has been demonstrated in randomized trials, there is limited real-world evidence on the impact of these changes. Using longitudinal, routinely collected data, we assessed the impact of the adoption of WHO Option B in Rwanda on mother-to-infant transmission. We used interrupted time series analysis to evaluate the impact of Option B on mother-to-child HIV transmission in Rwanda. Our primary outcome was the proportion of HIV tests in infants with positive results at six weeks of age. We included data for 20 months before and 22 months after the 2010 policy change. Of the 15,830 HIV tests conducted during our study period, 392 tested positive. We found a significant decrease in the level of positive tests (-2.08 positive tests per 100 tests conducted, 95% CI: -2.71 to -1.45) after the policy change. The adoption of Option B in Rwanda contributed to an immediate decrease in the rate of HIV transmission from mother to child. This suggests other countries may benefit from adopting these WHO guidelines.
Fuller, Daniel; Sahlqvist, Shannon; Cummins, Steven; Ogilvie, David
2012-01-01
To investigate the immediate and sustained effects of two London Underground strikes on use of a public bicycle share program. An interrupted time series design was used to examine the impact of two 24 hour strikes on the total number of trips per day and mean trip duration per day on the London public bicycle share program. The strikes occurred on September 6th and October 4th 2010 and limited service on the London Underground. The mean total number of trips per day over the whole study period was 14,699 (SD=5390) while the mean trip duration was 18.5 minutes (SD=3.7). Significant increases in daily trip count were observed following strike 1 (3864: 95% CI 125 to 7604) and strike 2 (11,293: 95% CI 5169 to 17,416). Events that greatly constrain the primary motorised mode of transportation for a population may have unintended short-term effects on travel behaviour. These findings suggest that limiting transportation options may have the potential to increase population levels of physical activity by promoting the use of cycling. Copyright © 2011 Elsevier Inc. All rights reserved.
Crofts, J F; Lenguerrand, E; Bentham, G L; Tawfik, S; Claireaux, H A; Odd, D; Fox, R; Draycott, T J
2016-01-01
To investigate the management and outcomes of cases of shoulder dystocia in the 12 years following the introduction of an obstetric emergencies training programme. Interrupted time-series study comparing management and neonatal outcome of births complicated by shoulder dystocia over three 4-year periods: (i) Pre-training (1996-99), (ii) Early training (2001-04), and (iii) Late training (2009-12). Southmead Hospital, Bristol, UK, with approximately 6000 births per annum. Infants and their mothers who experienced shoulder dystocia. A bi-monthly multi-professional 1-day intrapartum emergencies training course, which included a 30-minute practical session on shoulder dystocia management, commenced in 2000. Outcome measures were neonatal morbidity (brachial plexus injury, humeral fracture, clavicular fracture, 5-minute Apgar score <7) and management of shoulder dystocia (resolution manoeuvres performed, traction applied, head-to-body delivery interval). Compliance with national guidance improved with continued training. At least one recognised resolution manoeuvre was used in 99.8% (561/562) of cases of shoulder dystocia in the late training period, demonstrating a continued improvement from 46.3% (150/324) in the pre-training period. © 2015 Royal College of Obstetricians and Gynaecologists.
Yang, Caijun; Shen, Qian; Cai, Wenfang; Zhu, Wenwen; Li, Zongjie; Wu, Lina; Fang, Yu
2017-02-01
To assess the long-term effects of the introduction of China's zero-markup drug policy on hospitalisation expenditure and hospitalisation expenditure after reimbursement. An interrupted time series was used to evaluate the impact of the zero-markup drug policy on hospitalisation expenditure and hospitalisation expenditure after reimbursement at primary health institutions in Fufeng County of Shaanxi Province, western China. Two regression models were developed. Monthly average hospitalisation expenditure and monthly average hospitalisation expenditure after reimbursement in primary health institutions were analysed covering the period 2009 through to 2013. For the monthly average hospitalisation expenditure, the increasing trend slowed after the introduction of the zero-markup drug policy (coefficient = -16.49, P = 0.009). For the monthly average hospitalisation expenditure after reimbursement, the increasing trend slowed after the introduction of the zero-markup drug policy (coefficient = -10.84, P = 0.064), and a significant decrease in the intercept was noted after the second intervention of changes in reimbursement schemes of the new rural cooperative medical insurance (coefficient = -220.64). These findings indicate a moderating effect of the zero-markup drug policy in western China. However, hospitalisation expenditure and hospitalisation expenditure after reimbursement were still increasing. More effective policies are needed to prevent these costs from continuing to rise. © 2016 John Wiley & Sons Ltd.
Zvoch, Keith
2016-01-01
Piecewise growth models (PGMs) were used to estimate and model changes in the preliteracy skill development of kindergartners in a moderately sized school district in the Pacific Northwest. PGMs were applied to interrupted time-series (ITS) data that arose within the context of a response-to-intervention (RtI) instructional framework. During the…
Kastner, Monika; Sawka, Anna; Thorpe, Kevin; Chignel, Mark; Marquez, Christine; Newton, David; Straus, Sharon E
2011-07-22
Osteoporosis affects over 200 million people worldwide at a high cost to healthcare systems. Although guidelines on assessing and managing osteoporosis are available, many patients are not receiving appropriate diagnostic testing or treatment. Findings from a systematic review of osteoporosis interventions, a series of mixed-methods studies, and advice from experts in osteoporosis and human-factors engineering were used collectively to develop a multicomponent tool (targeted to family physicians and patients at risk for osteoporosis) that may support clinical decision making in osteoporosis disease management at the point of care. A three-phased approach will be used to evaluate the osteoporosis tool. In phase 1, the tool will be implemented in three family practices. It will involve ensuring optimal functioning of the tool while minimizing disruption to usual practice. In phase 2, the tool will be pilot tested in a quasi-experimental interrupted time series (ITS) design to determine if it can improve osteoporosis disease management at the point of care. Phase 3 will involve conducting a qualitative postintervention follow-up study to better understand participants' experiences and perceived utility of the tool and readiness to adopt the tool at the point of care. The osteoporosis tool has the potential to make several contributions to the development and evaluation of complex, chronic disease interventions, such as the inclusion of an implementation strategy prior to conducting an evaluation study. Anticipated benefits of the tool may be to increase awareness for patients about osteoporosis and its associated risks and provide an opportunity to discuss a management plan with their physician, which may all facilitate patient self-management.
Directory of Open Access Journals (Sweden)
Marvin A. H. Berrevoets
2017-08-01
Background: Timely switch from intravenous (iv) antibiotics to oral therapy is a key component of antimicrobial stewardship programs, in order to improve patient safety, promote early discharge and reduce costs. We have introduced a time-efficient and easily implementable intervention that relies on a computerized trigger tool, which identifies patients who are candidates for an iv to oral antibiotic switch. Methods: The intervention was introduced on all internal medicine wards in a teaching hospital. Patients were automatically identified by an electronic trigger tool when parenteral antibiotics were used for >48 h and clinical or pharmacological data did not preclude switch therapy. A weekly educational session was introduced to alert the physicians on the intervention wards. The intervention wards were compared with control wards, which included all other hospital wards. An interrupted time-series analysis was performed to compare the pre-intervention period with the post-intervention period, using ‘% of i.v. prescriptions >72 h’ and ‘median duration of iv therapy per prescription’ as outcomes. We performed a detailed prospective evaluation on a subset of 244 prescriptions to evaluate the efficacy and appropriateness of the intervention. Results: The number of intravenous prescriptions longer than 72 h was reduced by 19% in the intervention group (n = 1519; p < 0.01), and the median duration of iv antibiotics was reduced by 0.8 days (p < 0.05). Compared to the control group (n = 4366), the intervention was responsible for an additional decrease of 13% (p < 0.05) in prolonged prescriptions. The detailed prospective evaluation of a subgroup of patients showed that adherence to the electronic reminder was 72%. Conclusions: An electronic trigger tool combined with a weekly educational session was effective in reducing the duration of intravenous antimicrobial therapy.
Dowding, Dawn W; Turley, Marianne; Garrido, Terhilda
2012-01-01
To evaluate the impact of electronic health record (EHR) implementation on nursing care processes and outcomes. Interrupted time series analysis, 2003-2009. A large US not-for-profit integrated health care organization. 29 hospitals in Northern and Southern California. An integrated EHR including computerized physician order entry, nursing documentation, risk assessment tools, and documentation tools. Percentage of patients with completed risk assessments for hospital acquired pressure ulcers (HAPUs) and falls (process measures) and rates of HAPU and falls (outcome measures). EHR implementation was significantly associated with an increase in documentation rates for HAPU risk (coefficient 2.21, 95% CI 0.67 to 3.75); the increase for fall risk was not statistically significant (0.36; -3.58 to 4.30). EHR implementation was associated with a 13% decrease in HAPU rates (coefficient -0.76, 95% CI -1.37 to -0.16) but no decrease in fall rates (-0.091; -0.29 to 0.11). Irrespective of EHR implementation, HAPU rates decreased significantly over time (-0.16; -0.20 to -0.13), while fall rates did not (0.0052; -0.01 to 0.02). Hospital region was a significant predictor of variation for both HAPU (0.72; 0.30 to 1.14) and fall rates (0.57; 0.41 to 0.72). The introduction of an integrated EHR was associated with a reduction in the number of HAPUs but not in patient fall rates. Other factors, such as changes over time and hospital region, were also associated with variation in outcomes. The findings suggest that EHR impact on nursing care processes and outcomes is dependent on a number of factors that should be further explored.
Directory of Open Access Journals (Sweden)
Anthony A Laverty
We evaluated the impact of a COPD discharge care bundle on readmission rates following hospitalisation with an acute exacerbation. Interrupted time series analysis, comparing readmission rates for COPD exacerbations at nine trusts that introduced the bundle to two comparison groups: (1) other NHS trusts in London and (2) all other NHS trusts in England. Care bundles were implemented at different times for different NHS trusts, ranging from October 2009 to April 2011. Nine NHS acute trusts in London, England. Patients aged 45 years and older admitted to an NHS acute hospital in England for acute exacerbation of COPD. Data come from Hospital Episode Statistics, April 2002 to March 2012. Annual trend in readmission rates (and in total bed days) within 7, 28 and 90 days, before and after implementation. In hospitals introducing the bundle, readmission rates were rising before implementation and falling afterwards (e.g. readmissions within 28 days: +2.13% per annum (pa) pre and -5.32% pa post; p for difference in trends = 0.012). Following implementation, readmission rates within 7 and 28 days were falling faster than among other trusts in London, although this was not statistically significant (e.g. readmissions within 28 days: -4.6% pa vs. -3.2% pa, p = 0.44). Comparisons with a national control group were similar. The COPD discharge care bundle appeared to be associated with a reduction in readmission rate among hospitals using it. The significance of this is unclear because of changes to background trends in London and nationally.
Directory of Open Access Journals (Sweden)
Alexander M Aiken
In low-income countries, Surgical Site Infection (SSI) is a common form of hospital-acquired infection. Antibiotic prophylaxis is an effective method of preventing these infections, if given immediately before the start of surgery. Although several studies in Africa have compared pre-operative versus post-operative prophylaxis, there are no studies describing the implementation of policies to improve prescribing of surgical antibiotic prophylaxis in African hospitals. We conducted SSI surveillance at a typical Government hospital in Kenya over a 16-month period between August 2010 and December 2011, using standard definitions of SSI and the extent of contamination of surgical wounds. As an intervention, we developed a hospital policy that advised pre-operative antibiotic prophylaxis and discouraged extended post-operative antibiotic use. We measured process, outcome and balancing effects of this intervention using an interrupted time series design. From a starting point of near-exclusive post-operative antibiotic use, after policy introduction in February 2011 there was rapid adoption of pre-operative antibiotic prophylaxis (60% of operations at 1 week; 98% at 6 weeks) and a substantial decrease in the use of post-operative antibiotics (40% of operations at 1 week; 10% at 6 weeks) in Clean and Clean-Contaminated surgery. There was no immediate step-change in risk of SSI, but overall, there appeared to be a moderate reduction in the risk of superficial SSI across all levels of wound contamination. There were marked reductions in the costs associated with antibiotic use, the number of intravenous injections performed and the nursing time spent administering these. Implementation of a locally developed policy regarding surgical antibiotic prophylaxis is an achievable quality improvement target for hospitals in low-income countries, and can lead to substantial benefits for individual patients and the institution.
Aiken, Alexander M; Wanyoro, Anthony K; Mwangi, Jonah; Juma, Francis; Mugoya, Isaac K; Scott, J Anthony G
2013-01-01
In low-income countries, Surgical Site Infection (SSI) is a common form of hospital-acquired infection. Antibiotic prophylaxis is an effective method of preventing these infections, if given immediately before the start of surgery. Although several studies in Africa have compared pre-operative versus post-operative prophylaxis, there are no studies describing the implementation of policies to improve prescribing of surgical antibiotic prophylaxis in African hospitals. We conducted SSI surveillance at a typical Government hospital in Kenya over a 16-month period between August 2010 and December 2011, using standard definitions of SSI and the extent of contamination of surgical wounds. As an intervention, we developed a hospital policy that advised pre-operative antibiotic prophylaxis and discouraged extended post-operative antibiotic use. We measured process, outcome and balancing effects of this intervention using an interrupted time series design. From a starting point of near-exclusive post-operative antibiotic use, after policy introduction in February 2011 there was rapid adoption of pre-operative antibiotic prophylaxis (60% of operations at 1 week; 98% at 6 weeks) and a substantial decrease in the use of post-operative antibiotics (40% of operations at 1 week; 10% at 6 weeks) in Clean and Clean-Contaminated surgery. There was no immediate step-change in risk of SSI, but overall, there appeared to be a moderate reduction in the risk of superficial SSI across all levels of wound contamination. There were marked reductions in the costs associated with antibiotic use, the number of intravenous injections performed and the nursing time spent administering these. Implementation of a locally developed policy regarding surgical antibiotic prophylaxis is an achievable quality improvement target for hospitals in low-income countries, and can lead to substantial benefits for individual patients and the institution.
Harris, Alex H S; Bowe, Thomas; Hagedorn, Hildi; Nevedal, Andrea; Finlay, Andrea K; Gidwani, Risha; Rosen, Craig; Kay, Chad; Christopher, Melissa
2016-09-15
Active consideration of effective medications to treat alcohol use disorder (AUD) is a consensus standard of care, yet knowledge and use of these medications are very low across diverse settings. This study evaluated the overall effectiveness of a multifaceted academic detailing program to address this persistent quality problem in the US Veterans Health Administration (VHA), as well as the context and process factors that explained variation in effectiveness across sites. An interrupted time series design, analyzed with mixed-effects segmented logistic regression, was used to evaluate changes in level and rate of change in the monthly percent of patients with a clinically documented AUD who received naltrexone, acamprosate, disulfiram, or topiramate. Using data from a 20-month post-implementation period, intervention sites (n = 37) were compared to their own 16-month pre-implementation performance and separately to the rest of VHA. From immediately pre-intervention to the end of the observation period, the percent of patients in the intervention sites with AUD who received medication increased by over 3.4% in absolute terms and 68% in relative terms (i.e., 4.9-8.3%). This change was significant compared to the pre-implementation period in the intervention sites and secular trends in control sites. Sites with lower pre-implementation adoption, more person-hours of detailing, but fewer people detailed had larger immediate increases in medication receipt after implementation. The average number of detailing encounters per person was associated with steeper increases in slope over time. This study found empirical support for a multifaceted quality improvement strategy aimed at increasing access to and utilization of pharmacotherapy for AUD. Future studies should focus on determining how to enhance the program's effects, especially in non-responsive locations.
Lopez Bernal, James A; Gasparrini, Antonio; Artundo, Carlos M; McKee, Martin
2013-10-01
The current financial crisis is having a major impact on European economies, especially that of Spain. Past evidence suggests that adverse macro-economic conditions exacerbate mental illness, but evidence from the current crisis is limited. This study analyses the association between the financial crisis and suicide rates in Spain. An interrupted time-series analysis of national suicides data between 2005 and 2010 was used to establish whether there has been any deviation in the underlying trend in suicide rates associated with the financial crisis. Segmented regression with a seasonally adjusted quasi-Poisson model was used for the analysis. Stratified analyses were performed to establish whether the effect of the crisis on suicides varied by region, sex and age group. The mean monthly suicide rate in Spain during the study period was 0.61 per 100 000 with an underlying trend of a 0.3% decrease per month. We found an 8.0% increase in the suicide rate above this underlying trend since the financial crisis (95% CI: 1.009-1.156; P = 0.03); this was robust to sensitivity analysis. A control analysis showed no change in deaths from accidental falls associated with the crisis. Stratified analyses suggested that the association between the crisis and suicide rates is greatest in the Mediterranean and Northern areas, in males and amongst those of working age. The financial crisis in Spain has been associated with a relative increase in suicides. Males and those of working age may be at particular risk of suicide associated with the crisis and may benefit from targeted interventions.
Nistal-Nuño, Beatriz
2017-09-01
In Chile, a new law introduced in March 2012 decreased the legal blood alcohol concentration (BAC) limit for driving while impaired from 1 to 0.8 g/l and the legal BAC limit for driving under the influence of alcohol from 0.5 to 0.3 g/l. The goal is to assess the impact of this new law on mortality and morbidity outcomes in Chile. A review of national databases in Chile was conducted from January 2003 to December 2014. Segmented regression analysis of interrupted time series was used for analyzing the data. In a series of multivariable linear regression models, the change in intercept and slope in the monthly incidence rate of traffic deaths and injuries associated with alcohol per 100,000 inhabitants was estimated from pre-intervention to post-intervention, while controlling for secular changes. In nested regression models, potential confounding seasonal effects were accounted for. All analyses were performed at a two-sided significance level of 0.05. Immediate level drops from the end of the pre-law period were observed in all the monthly rates in the majority of models and in all the de-seasonalized models, although statistical significance was reached only in the model for injuries related to alcohol. After the law, the estimated monthly rate dropped abruptly by -0.869 for injuries related to alcohol, and by -0.859 adjusting for seasonality (P < 0.001). Regarding the post-law long-term trends, a steeper decreasing trend after the law was evident in the models for deaths related to alcohol, although these differences were not statistically significant. Strong evidence of a reduction in traffic injuries related to alcohol was found following the law in Chile. Although insufficient evidence was found of a statistically significant effect on deaths and overall injuries, potentially clinically important effects cannot be ruled out. Copyright © 2017 The Royal Society for Public Health. Published by Elsevier Ltd.
Directory of Open Access Journals (Sweden)
Jing Sun
2017-08-01
Background: It is globally agreed that a well-designed health system delivers timely and convenient access to health services for all patients. Many interventions aiming to reduce waiting times have been implemented in Chinese public tertiary hospitals to improve patients' satisfaction. However, few were well-documented, and the effects were rarely measured with robust methods. Methods: We conducted a longitudinal study of the length of waiting times in a public tertiary hospital in Southern China which developed comprehensive data collection systems. An average of around 60,000 outpatients and 70,000 prescribed outpatients per month were targeted for the study during Oct 2014-February 2017. We analyzed longitudinal time series data using a segmented linear regression model to assess changes in levels and trends of waiting times before and after the introduction of waiting time reduction interventions. Pearson correlation analysis was conducted to indicate the strength of association between waiting times and patient satisfaction. The statistical significance level was set at 0.05. Results: The monthly average length of waiting time decreased by 3.49 min (P = 0.003) for consultations and 8.70 min (P = 0.02) for filling prescriptions in the month when the respective interventions were introduced. The trend shifted from a slight increase at baseline to a significant decrease afterwards for filling prescriptions (P = 0.003). There was a significant negative correlation between waiting time for filling prescriptions and outpatient satisfaction with pharmacy services (r = −0.71, P = 0.004). Conclusions: The interventions aimed at reducing waiting time and raising patient satisfaction in Fujian Provincial Hospital are effective. A long-lasting reduction effect on waiting time for filling prescriptions was observed because of carefully designed continuous efforts, rather than a one-time campaign, and with appropriate incentives implemented by a taskforce authorized by the hospital managers.
Sun, Jing; Lin, Qian; Zhao, Pengyu; Zhang, Qiongyao; Xu, Kai; Chen, Huiying; Hu, Cecile Jia; Stuntz, Mark; Li, Hong; Liu, Yuanli
2017-08-22
It is globally agreed that a well-designed health system delivers timely and convenient access to health services for all patients. Many interventions aiming to reduce waiting times have been implemented in Chinese public tertiary hospitals to improve patients' satisfaction. However, few were well documented, and the effects were rarely measured with robust methods. We conducted a longitudinal study of the length of waiting times in a public tertiary hospital in Southern China which developed comprehensive data collection systems. An average of around 60,000 outpatients and 70,000 prescribed outpatients per month were targeted for the study during October 2014-February 2017. We analyzed longitudinal time series data using a segmented linear regression model to assess changes in levels and trends of waiting times before and after the introduction of waiting time reduction interventions. Pearson correlation analysis was conducted to indicate the strength of association between waiting times and patient satisfaction. The statistical significance level was set at 0.05. The monthly average length of waiting time decreased by 3.49 min (P = 0.003) for consultations and 8.70 min (P = 0.02) for filling prescriptions in the month when the respective interventions were introduced. For filling prescriptions, the trend shifted from slightly increasing at baseline to significantly decreasing afterwards (P = 0.003). There was a significant negative correlation between waiting time for filling prescriptions and outpatient satisfaction towards pharmacy services (r = -0.71, P = 0.004). The interventions aimed at reducing waiting time and raising patient satisfaction in Fujian Provincial Hospital are effective. A long-lasting reduction effect on waiting time for filling prescriptions was observed because of carefully designed continuous efforts, rather than a one-time campaign, and with appropriate incentives implemented by a taskforce authorized by the hospital managers.
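The segmented linear regression used in the study above estimates a pre-intervention trend, an immediate level change, and a change in trend at the intervention point. A minimal NumPy-only sketch of that model on simulated monthly waiting times (all numbers, variable names, and effect sizes here are illustrative assumptions, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)
n, t0 = 29, 15                          # 29 months of data, intervention at month 15
t = np.arange(n, dtype=float)
post = (t >= t0).astype(float)          # indicator for the post-intervention period
t_since = np.clip(t - t0, 0.0, None)    # months elapsed since the intervention

# Simulated waiting times: flat-ish baseline, an 8-minute immediate drop,
# and a steeper downward trend after the intervention
y = 40.0 + 0.1 * t - 8.0 * post - 0.5 * t_since + rng.normal(0.0, 1.0, n)

# Segmented model: Y_t = b0 + b1*t + b2*post_t + b3*(t - t0)*post_t + e_t
X = np.column_stack([np.ones(n), t, post, t_since])
b0, b1, b2, b3 = np.linalg.lstsq(X, y, rcond=None)[0]
# b1: baseline trend; b2: immediate level change; b3: change in trend
```

In this parameterization b2 plays the role of the immediate reduction reported for the month an intervention was introduced, and b3 the shift from an increasing to a decreasing trend.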
Jeandron, Aurélie; Saidi, Jaime Mufitini; Kapama, Alois; Burhole, Manu; Birembano, Freddy; Vandevelde, Thierry; Gasparrini, Antonio; Armstrong, Ben; Cairncross, Sandy; Ensink, Jeroen H. J.
2015-01-01
Background The eastern provinces of the Democratic Republic of the Congo have been identified as endemic areas for cholera transmission, and despite continuous control efforts, they continue to experience regular cholera outbreaks that occasionally spread to the rest of the country. In a region where access to improved water sources is particularly poor, the question of which improvements in water access should be prioritized to address cholera transmission remains unresolved. This study aimed at investigating the temporal association between water supply interruptions and Cholera Treatment Centre (CTC) admissions in a medium-sized town. Methods and Findings Time-series patterns of daily incidence of suspected cholera cases admitted to the Cholera Treatment Centre in Uvira in South Kivu Province between 2009 and 2014 were examined in relation to the daily variations in volume of water supplied by the town water treatment plant. Quasi-Poisson regression and distributed lag nonlinear models up to 12 d were used, adjusting for daily precipitation rates, day of the week, and seasonal variations. A total of 5,745 patients over 5 y of age with acute watery diarrhoea symptoms were admitted to the CTC over the study period of 1,946 d. Following a day without tap water supply, the suspected cholera incidence rate increased on average by 155% over the next 12 d, corresponding to a rate ratio of 2.55 (95% CI: 1.54–4.24), compared to the incidence experienced after a day with optimal production (defined as the 95th percentile: 4,794 m3). Suspected cholera cases attributable to a suboptimal tap water supply reached 23.2% of total admissions (95% CI 11.4%–33.2%). Although generally reporting fewer admissions to the CTC, neighbourhoods with a higher consumption of tap water were more affected by water supply interruptions, with a rate ratio of 3.71 (95% CI: 1.91–7.20) and an attributable fraction of cases of 31.4% (95% CI: 17.3%–42.5%). The analysis did not suggest any
Jeandron, Aurélie; Saidi, Jaime Mufitini; Kapama, Alois; Burhole, Manu; Birembano, Freddy; Vandevelde, Thierry; Gasparrini, Antonio; Armstrong, Ben; Cairncross, Sandy; Ensink, Jeroen H J
2015-10-01
The eastern provinces of the Democratic Republic of the Congo have been identified as endemic areas for cholera transmission, and despite continuous control efforts, they continue to experience regular cholera outbreaks that occasionally spread to the rest of the country. In a region where access to improved water sources is particularly poor, the question of which improvements in water access should be prioritized to address cholera transmission remains unresolved. This study aimed at investigating the temporal association between water supply interruptions and Cholera Treatment Centre (CTC) admissions in a medium-sized town. Time-series patterns of daily incidence of suspected cholera cases admitted to the Cholera Treatment Centre in Uvira in South Kivu Province between 2009 and 2014 were examined in relation to the daily variations in volume of water supplied by the town water treatment plant. Quasi-Poisson regression and distributed lag nonlinear models up to 12 d were used, adjusting for daily precipitation rates, day of the week, and seasonal variations. A total of 5,745 patients over 5 y of age with acute watery diarrhoea symptoms were admitted to the CTC over the study period of 1,946 d. Following a day without tap water supply, the suspected cholera incidence rate increased on average by 155% over the next 12 d, corresponding to a rate ratio of 2.55 (95% CI: 1.54-4.24), compared to the incidence experienced after a day with optimal production (defined as the 95th percentile: 4,794 m3). Suspected cholera cases attributable to a suboptimal tap water supply reached 23.2% of total admissions (95% CI 11.4%-33.2%). Although generally reporting fewer admissions to the CTC, neighbourhoods with a higher consumption of tap water were more affected by water supply interruptions, with a rate ratio of 3.71 (95% CI: 1.91-7.20) and an attributable fraction of cases of 31.4% (95% CI: 17.3%-42.5%). The analysis did not suggest any association between levels of residual
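The quasi-Poisson regression with lagged exposure described in the cholera study can be sketched with a hand-rolled iteratively reweighted least squares (IRLS) fit. The 12-day moving average below is a deliberately crude stand-in for the study's distributed lag nonlinear model, and the data, names, and coefficients are simulated assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n_days, max_lag = 400, 12
# Hypothetical daily water-plant output (scaled 0-1), varying in ~10-day spells
base = rng.uniform(0.0, 1.0, n_days // 10 + 1)
supply = np.repeat(base, 10)[:n_days]

# Crude distributed-lag exposure: mean supply over the previous 12 days
lagged = np.array([supply[max(0, i - max_lag):i].mean() if i > 0 else supply[0]
                   for i in range(n_days)])

# Simulated daily admissions: incidence rises when recent supply was low
cases = rng.poisson(np.exp(1.0 - 1.2 * lagged))

# Poisson regression with a log link, fitted by IRLS
X = np.column_stack([np.ones(n_days), lagged])
beta = np.zeros(2)
for _ in range(25):
    mu = np.exp(X @ beta)
    z = X @ beta + (cases - mu) / mu                        # working response
    beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))

rate_ratio = np.exp(beta[1])   # effect of a one-unit rise in 12-day mean supply
mu = np.exp(X @ beta)
dispersion = np.sum((cases - mu) ** 2 / mu) / (n_days - 2)  # quasi-Poisson scale
```

A rate_ratio below 1 reproduces the qualitative finding that better supply lowers incidence; the Pearson dispersion is what distinguishes a quasi-Poisson fit from a plain Poisson one, since standard errors are inflated by its square root.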
Weiss, Deborah; Dunn, Sandra I; Sprague, Ann E; Fell, Deshayne B; Grimshaw, Jeremy M; Darling, Elizabeth; Graham, Ian D; Harrold, JoAnn; Smith, Graeme N; Peterson, Wendy E; Reszel, Jessica; Lanes, Andrea; Walker, Mark C; Taljaard, Monica
2018-06-01
To assess the effect of the Maternal Newborn Dashboard on six key clinical performance indicators in the province of Ontario, Canada. Interrupted time series using population-based data from the provincial birth registry covering a 3-year period before implementation of the Dashboard and 2.5 years after implementation (November 2009 through March 2015). All hospitals in the province of Ontario providing maternal-newborn care (n=94). A hospital-based online audit and feedback programme. Rates of the six performance indicators included in the Dashboard. 2.5 years after implementation, the audit and feedback programme was associated with statistically significant absolute decreases in the rates of episiotomy (decrease of 1.5 per 100 women, 95% CI 0.64 to 2.39), induction for postdates in women who were less than 41 weeks at delivery (decrease of 11.7 per 100 women, 95% CI 7.4 to 16.0), repeat caesarean delivery in low-risk women performed before 39 weeks (decrease of 10.4 per 100 women, 95% CI 9.3 to 11.5) and an absolute increase in the rate of appropriately timed group B streptococcus screening (increase of 2.8 per 100, 95% CI 2.2 to 3.5). The audit and feedback programme did not significantly affect the rates of unsatisfactory newborn screening blood samples or formula supplementation at discharge. No statistically significant effects were observed for the two internal control outcomes or the four external control indicators; in fact, two external control indicators (episiotomy and postdates induction) worsened relative to before implementation. An electronic audit and feedback programme implemented in maternal-newborn hospitals was associated with clinically relevant practice improvements at the provincial level in the majority of targeted indicators. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Matthews, Anthony; Herrett, Emily; Gasparrini, Antonio; Van Staa, Tjeerd; Goldacre, Ben; Smeeth, Liam; Bhaskaran, Krishnan
2016-06-28
To quantify how a period of intense media coverage of controversy over the risk:benefit balance of statins affected their use. Interrupted time series analysis of prospectively collected electronic data from primary care. Clinical Practice Research Datalink (CPRD) in the United Kingdom. Patients newly eligible for or currently taking statins for primary and secondary cardiovascular disease prevention in each month in January 2011-March 2015. Adjusted odds ratios for starting/stopping taking statins after the media coverage (October 2013-March 2014). There was no evidence that the period of high media coverage was associated with changes in statin initiation among patients with a high recorded risk score for cardiovascular disease (primary prevention) or a recent cardiovascular event (secondary prevention) (odds ratio 0.99 (95% confidence interval 0.87 to 1.13; P=0.92) and 1.04 (0.92 to 1.18; P=0.54), respectively), though there was a decrease in the overall proportion of patients with a recorded risk score. Patients already taking statins were more likely to stop taking them for both primary and secondary prevention after the high media coverage period (1.11 (1.05 to 1.18; P<0.001) and 1.12 (1.04 to 1.21; P=0.003), respectively). Stratified analyses showed that older patients and those with a longer continuous prescription were more likely to stop taking statins after the media coverage. In post hoc analyses, the increased rates of cessation were no longer observed after six months. A period of intense public discussion over the risk:benefit balance of statins, covered widely in the media, was followed by a transient rise in the proportion of people who stopped taking statins. This research highlights the potential for widely covered health stories in the lay media to impact on healthcare related behaviour. Published by the BMJ Publishing Group Limited.
Lane, Tyler J; Gray, Shannon; Hassani-Mahmooei, Behrooz; Collie, Alex
2018-01-05
Early intervention following occupational injury can improve health outcomes and reduce the duration and cost of workers' compensation claims. Financial early reporting incentives (ERIs) for employers may shorten the time between injury and access to compensation benefits and services. We examined ERI effect on time spent in the claim lodgement process in two Australian states: South Australia (SA), which introduced them in January 2009, and Tasmania (TAS), which introduced them in July 2010. Using administrative records of 1.47 million claims lodged between July 2006 and June 2012, we conducted an interrupted time series study of ERI impact on monthly median days in the claim lodgement process. Time periods included claim reporting, insurer decision, and total time. The 18-month gap in implementation between the states allowed for a multiple baseline design. In SA, we analysed periods within claim reporting: worker and employer reporting times (similar data were not available in TAS). To account for external threats to validity, we examined impact in reference to a comparator of other Australian workers' compensation jurisdictions. Total time in the process did not immediately change, though the trend significantly decreased in both jurisdictions (SA: -0.36 days per month, 95% CI -0.63 to -0.09; TAS: -0.35, -0.50 to -0.20). Claim reporting time also decreased in both (SA: -1.6 days, -2.4 to -0.8; TAS: -5.4, -7.4 to -3.3). In TAS, there was a significant increase in insurer decision time (4.6, 3.9 to 5.4) and a similar but non-significant pattern in SA. In SA, worker reporting time significantly decreased (-4.7, -5.8 to -3.5), but employer reporting time did not (-0.3, -0.8 to 0.2). The results suggest that ERIs reduced claim lodgement time and, in the long-term, reduced total time in the claim lodgement process. However, only worker reporting time significantly decreased in SA, indicating that ERIs may not have shortened the process through the intended target of
Leopold, Christine; Zhang, Fang; Mantel-Teeuwisse, Aukje K; Vogler, Sabine; Valkova, Silvia; Ross-Degnan, Dennis; Wagner, Anita K
2014-07-25
To analyze the impacts of pharmaceutical sector policies implemented to contain country spending during the economic recession (a reference price system in Finland, and a mix of policies in Portugal including changes in reimbursement rates, a generic promotion campaign and discounts granted to the public payer) on utilization of, as a proxy for access to, antipsychotic medicines. We obtained monthly IMS Health sales data in standard units of antipsychotic medicines in Portugal and Finland for the period January 2007 to December 2011. We used an interrupted time series design to estimate changes in overall use and generic market shares by comparing pre-policy and post-policy levels and trends. Both countries' policy approaches were associated with slight, likely unintended, decreases in overall use of antipsychotic medicines and with increases in generic market shares of major antipsychotic products. In Finland, quetiapine and risperidone generic market shares increased substantially (estimates one year post-policy compared to before, quetiapine: 6.80% [3.92%, 9.68%]; risperidone: 11.13% [6.79%, 15.48%]). The policy interventions in Portugal resulted in a substantially increased generic market share for amisulpride (estimate one year post-policy compared to before: 22.95% [21.01%, 24.90%]); generic risperidone already dominated the market prior to the policy interventions. The different policy approaches to contain pharmaceutical expenditures in times of economic recession in Finland and Portugal had intended impacts (increased use of generics) and likely unintended ones (slightly decreased overall sales, possibly consistent with decreased access to needed medicines). These findings highlight the importance of monitoring and evaluating the effects of pharmaceutical policy interventions on use of medicines and health outcomes.
Helder, Onno K; Brug, Johannes; van Goudoever, Johannes B; Looman, Caspar W N; Reiss, Irwin K M; Kornelisse, René F
2014-07-01
Sustained high compliance with hand hygiene (HH) is needed to reduce nosocomial bloodstream infections (NBSIs). However, over time, a wash-out effect often occurs. We studied the long-term effect of sequential HH-promoting interventions. An observational study with an interrupted time series analysis of the occurrence of NBSI was performed in very low-birth weight (VLBW) infants. Interventions consisted of an education program, gain-framed screen saver messages, and an infection prevention week with an introduction on consistent glove use. A total of 1,964 VLBW infants admitted between January 1, 2002, and December 31, 2011, were studied. The proportion of infants with ≥1 NBSI decreased from 47.6% to 21.2%.
Harper, Sam; Bruckner, Tim A
2017-07-01
Research suggests that the Great Recession of 2007-2009 led to nearly 5000 excess suicides in the United States. However, prior work has not accounted for seasonal patterning and unique suicide trends by age and gender. We calculated monthly suicide rates from 1999 to 2013 for men and women aged 15 and above. Suicide rates before the Great Recession were used to predict the rate during and after the Great Recession. Death rates for each age-gender group were modeled using Poisson regression with robust variance, accounting for seasonal and nonlinear suicide trajectories. There were 56,658 suicide deaths during the Great Recession. Age- and gender-specific suicide trends before the recession demonstrated clear seasonal and nonlinear trajectories. Our models predicted 57,140 expected suicide deaths, leading to 482 fewer observed than expected suicides (95% confidence interval -2079, 943). We found little evidence to suggest that the Great Recession interrupted existing trajectories of suicide rates. Suicide rates were already increasing before the Great Recession for middle-aged men and women. Future studies estimating the impact of recessions on suicide should account for the diverse and unique suicide trajectories of different social groups. Copyright © 2017 Elsevier Inc. All rights reserved.
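The observed-versus-expected design in the suicide study above fits pre-event trend and seasonality, then extrapolates a counterfactual over the event period. A NumPy sketch on simulated monthly counts with no built-in event effect, so the estimated excess should hover near zero (all values below are illustrative, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(3)
months = np.arange(120, dtype=float)      # 10 years of monthly data
event = months >= 84                      # hypothetical recession onset at month 84

# Simulated monthly death counts: linear trend plus annual seasonality
truth = 300.0 + 0.5 * months + 20.0 * np.sin(2 * np.pi * months / 12)
deaths = rng.poisson(truth)

# Trend + annual harmonic design, fitted on pre-event months only
def design(t):
    return np.column_stack([np.ones_like(t), t,
                            np.sin(2 * np.pi * t / 12),
                            np.cos(2 * np.pi * t / 12)])

coef = np.linalg.lstsq(design(months[~event]), deaths[~event], rcond=None)[0]

expected = design(months[event]) @ coef   # extrapolated counterfactual deaths
excess = deaths[event].sum() - expected.sum()
```

The study's conclusion that there were fewer observed than expected suicides corresponds to a small (here near-zero) value of excess relative to its sampling uncertainty; omitting the harmonic terms is exactly the kind of seasonal misspecification the authors warn against.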
Laliotis, Ioannis; Ioannidis, John P A; Stavropoulou, Charitini
2016-12-01
Greece was one of the countries hit the hardest by the 2008 financial crisis in Europe. Yet, evidence on the effect of the crisis on total and cause-specific mortality remains unclear. We explored whether the economic crisis affected the trend of overall and cause-specific mortality rates. We used regional panel data from the Hellenic Statistical Authority to assess mortality trends by age, sex, region, and cause in Greece between January, 2001, and December, 2013. We used Eurostat data to calculate monthly age-standardised mortality rates per 100 000 inhabitants for each region. Data were divided into two subperiods: before the crisis (January, 2001, to August, 2008) and after the onset of the crisis (September, 2008, to December, 2013). We tested for changes in the slope of mortality by doing an interrupted time-series analysis. Overall mortality continued to decline after the onset of the financial crisis (-0·065, 95% CI -0·080 to -0·049), but at a slower pace than before the crisis (-0·13, -0·15 to -0·10; trend difference 0·062, 95% CI 0·041 to 0·083). Comparing the period after the onset of the crisis with extrapolated values based on the period before the crisis, we estimate that an extra 242 deaths per month occurred after the onset of the crisis. Mortality trends changed after the onset of the crisis compared with before it, but the changes vary by age, sex, and cause of death. The increase in deaths due to adverse events during medical treatment might reflect the effects of deterioration in quality of care during economic recessions. Copyright © 2016 The Author(s). Published by Elsevier Ltd. This is an Open Access article under the CC BY license.
Hassett, Leanne; Simpson, Grahame; Cotter, Rachel; Whiting, Diane; Hodgkinson, Adeline; Martin, Diane
2015-04-01
To investigate whether the introduction of an electronic goals system followed by staff training improved the quality, rating, framing and structure of goals written by a community-based brain injury rehabilitation team. Interrupted time series design. Two interventions were introduced six months apart. The first intervention comprised the introduction of an electronic goals system. The second intervention comprised a staff goal training workshop. An audit protocol was devised to evaluate the goals. A random selection of goal statements from the 12 months prior to the interventions (Time 1, baseline) were compared with all goal statements written after the introduction of the electronic goals system (Time 2) and staff training (Time 3). All goals were de-identified for client and time period, and randomly ordered. A total of 745 goals (Time 1 n = 242; Time 2 n = 283; Time 3 n = 220) were evaluated. Compared with baseline, the introduction of the electronic goals system alone significantly increased goal rating, framing and structure (χ² tests 144.7, 18.9, 48.1, respectively). The improvement in goal quality, which was only a trend at Time 2, was statistically significant at Time 3 (χ² 15.0, p ≤ 0.001). The training also led to a further significant increase in the framing and structuring of goals over the electronic goals system (χ² 11.5, 12.5, respectively, p ≤ 0.001). An electronic goals system combined with staff training improved the quality, rating, framing and structure of goal statements. © The Author(s) 2014.
Directory of Open Access Journals (Sweden)
Amy B. Martin
2012-09-01
Disasters serve as shocks and precipitate unanticipated disturbances to the health care system. Public health surveillance is generally focused on monitoring latent health and environmental exposure effects, rather than health system performance in response to these local shocks. The following intervention study sought to determine the long-term effects of the 2005 chlorine spill in Graniteville, South Carolina on primary care access for vulnerable populations. We used an interrupted time-series approach to model monthly visits for Ambulatory Care Sensitive Conditions, an indicator of unmet primary care need, to quantify the impact of the disaster on unmet primary care need in Medicaid beneficiaries. The results showed Medicaid beneficiaries in the directly impacted service area experienced improved access to primary care in the 24 months post-disaster. We provide evidence that a health system serving the medically underserved can prove resilient and display improved adaptive capacity under adverse circumstances (i.e., technological disasters) to ensure access to primary care for vulnerable sub-groups. The results suggest a new application for ambulatory care sensitive conditions as a population-based metric to advance anecdotal evidence of secondary surge and evaluate pre- and post-health system surge capacity following a disaster.
Tavares, Margarida; Carvalho, Ana Cláudia; Almeida, José Pedro; Andrade, Paulo; São-Simão, Ricardo; Soares, Pedro; Alves, Carlos; Pinto, Rui; Fontanet, Arnaud; Watier, Laurence
2018-06-01
A prospective audit and feedback antimicrobial stewardship intervention conducted in the Orthopaedics Department of a university hospital in Portugal was evaluated by comparing an interrupted time series in the intervention group with a non-intervention (control) group. Monthly antibiotic use (except cefazolin) was measured as the World Health Organization's Anatomical Therapeutic Chemical defined daily doses (ATC-DDD) from January 2012 to September 2016, excluding the 6-month phase of intervention implementation starting on 1 January 2015. Compared with the control group, the intervention group had a monthly decrease in the use of fluoroquinolones by 2.3 DDD/1000 patient-days [95% confidence interval (CI) -3.97 to -0.63]. An increase in the use of penicillins by 103.3 DDD/1000 patient-days (95% CI 47.42 to 159.10) was associated with intervention implementation, followed by a decrease during the intervention period (slope = -5.2, 95% CI -8.56 to -1.82). In the challenging scenario of treatment of osteoarticular and prosthetic joint infections, an audit and feedback intervention reduced antibiotic exposure and spectrum. Copyright © 2018 Elsevier B.V. and International Society of Chemotherapy. All rights reserved.
Cordtz, René Lindholm; Hawley, Samuel; Prieto-Alhambra, Daniel; Højgaard, Pil; Zobbe, Kristian; Overgaard, Søren; Odgaard, Anders; Kristensen, Lars Erik; Dreyer, Lene
2018-05-01
To study the impact of the introduction of biological disease-modifying anti-rheumatic drugs (bDMARDs) and associated rheumatoid arthritis (RA) management guidelines on the incidence of total hip (THR) and knee replacements (TKR) in Denmark. Nationwide register-based cohort and interrupted time-series analysis. Patients with incident RA between 1996 and 2011 were identified in the Danish National Patient Register. Patients with RA were matched on age, sex and municipality with up to 10 general population comparators (GPCs). Standardised 5-year incidence rates of THR and TKR per 1000 person-years were calculated for patients with RA and GPCs in 6-month periods. Levels and trends in the pre-bDMARD era (1996-2001) were compared with the bDMARD era (2003-2016) using segmented linear regression interrupted by a 1-year lag period (2002). We identified 30 404 patients with incident RA and 297 916 GPCs. In 1996, the incidence rate of THR and TKR was 8.72 and 5.87, respectively, among patients with RA, and 2.89 and 0.42 in GPCs. From 1996 to 2016, the incidence rate of THR decreased among patients with RA, but increased among GPCs. Among patients with RA, the incidence rate of TKR increased from 1996 to 2001, but started to decrease from 2003 and throughout the bDMARD era. The incidence of TKR increased among GPCs from 1996 to 2016. We report that the incidence rate of THR and TKR was 3-fold and 14-fold higher, respectively, among patients with RA compared with GPCs in 1996. In patients with RA, introduction of bDMARDs was associated with a decreasing incidence rate of TKR, whereas the incidence of THR had started to decrease before bDMARD introduction. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Brals, Daniëlla; Aderibigbe, Sunday A; Wit, Ferdinand W; van Ophem, Johannes C M; van der List, Marijn; Osagbemi, Gordon K; Hendriks, Marleen E; Akande, Tanimola M; Boele van Hensbroek, Michael; Schultsz, Constance
2017-09-01
Access to quality obstetric care is considered essential to reducing maternal and new-born mortality. We evaluated the effect of the introduction of a multifaceted voluntary health insurance programme on hospital deliveries in rural Nigeria. We used an interrupted time-series design, including a control group. The intervention consisted of providing voluntary health insurance covering primary and secondary healthcare, including antenatal and obstetric care, combined with improving the quality of healthcare facilities. We compared changes in hospital deliveries from 1 May 2005 to 30 April 2013 between the programme area and control area in a difference-in-differences analysis with multiple time periods, adjusting for observed confounders. Data were collected through household surveys. Eligible households (n = 1500) were selected from a stratified probability sample of enumeration areas. All deliveries during the 4-year baseline period (n = 460) and 4-year follow-up period (n = 380) were included. Insurance coverage increased from 0% before the insurance was introduced to 70.2% in April 2013 in the programme area. In the control area insurance coverage remained 0% between May 2005 and April 2013. Although hospital deliveries followed a common stable trend over the 4 pre-programme years (P = 0.89), the increase in hospital deliveries during the 4-year follow-up period in the programme area was 29.3 percentage points (95% CI: 16.1 to 42.6). Women in the programme area without health insurance but who could make use of the upgraded care delivered significantly more often in a hospital during the follow-up period than women living in the control area (P = 0.04). Voluntary health insurance combined with quality healthcare services is highly effective in increasing hospital deliveries in rural Nigeria, by improving access to healthcare for insured and uninsured women in the programme area. © The Author 2017. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.
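The difference-in-differences contrast in the Nigerian insurance evaluation can be sketched as a linear probability model with a treated-by-post interaction. The simulated data below merely echo the reported 29-point effect and are otherwise hypothetical (names, baseline rates, and sample sizes are assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
treated = rng.integers(0, 2, n).astype(float)   # programme area vs control area
post = rng.integers(0, 2, n).astype(float)      # delivery after vs before rollout

# Hospital-delivery probability: common secular rise of 5 points,
# plus a 29-point gain only for programme-area deliveries in the post period
p = 0.35 + 0.05 * post + 0.29 * treated * post
hospital = rng.binomial(1, p).astype(float)

# Linear probability model: Y = b0 + b1*treated + b2*post + b3*treated*post
X = np.column_stack([np.ones(n), treated, post, treated * post])
b = np.linalg.lstsq(X, hospital, rcond=None)[0]
did_effect = b[3]   # difference-in-differences estimate of the programme effect
```

The interaction coefficient b3 isolates the programme effect from the shared secular trend (captured by b2) and any fixed baseline difference between areas (captured by b1), which is the core identification idea of the design.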
Hopewell, Sally; Ravaud, Philippe; Baron, Gabriel; Boutron, Isabelle
2012-06-22
To investigate the effect of the CONSORT for Abstracts guidelines, and different editorial policies used by five leading general medical journals to implement the guidelines, on the reporting quality of abstracts of randomised trials. Interrupted time series analysis. We randomly selected up to 60 primary reports of randomised trials per journal per year from five high impact, general medical journals in 2006-09, if indexed in PubMed with an electronic abstract. We excluded reports that did not include an electronic abstract, and any secondary trial publications or economic analyses. We classified journals in three categories: those not mentioning the guidelines in their instructions to authors (JAMA and New England Journal of Medicine), those referring to the guidelines in their instructions to authors but with no specific policy to implement them (BMJ), and those referring to the guidelines in their instructions to authors with an active policy to implement them (Annals of Internal Medicine and Lancet). Two authors extracted data independently using the CONSORT for Abstracts checklist. Mean number of CONSORT items reported in selected abstracts, among nine items reported in fewer than 50% of the abstracts published across the five journals in 2006. We assessed 955 reports of abstracts of randomised trials. Journals with an active policy to enforce the guidelines showed an immediate increase in the level of mean number of items reported (increase of 1.50 items; P=0.0037). At 23 months after publication of the guidelines, the mean number of items reported per abstract for the primary outcome was 5.41 of nine items, a 53% increase compared with the expected level estimated on the basis of pre-intervention trends. The change in level or trend did not increase in journals with no policy to enforce the guidelines (BMJ, JAMA, and New England Journal of Medicine). Active implementation of the CONSORT for Abstracts guidelines by journals can lead to improvements in the
Allen, Chenoa D; McNeely, Clea A
2017-10-01
In the United States, there is concern that recent state laws restricting undocumented immigrants' rights could threaten access to Medicaid and the Children's Health Insurance Program (CHIP) for citizen children of immigrant parents. Of particular concern are omnibus immigration laws, state laws that include multiple provisions increasing immigration enforcement and restricting rights for undocumented immigrants. These laws could limit Medicaid/CHIP access for citizen children in immigrant families by creating misinformation about their eligibility and fostering fear and mistrust of government among immigrant parents. This study uses nationally-representative data from the National Health Interview Survey (2005-2014; n = 70,187) and comparative interrupted time series methods to assess whether passage of state omnibus immigration laws reduced access to Medicaid/CHIP for US citizen Latino children. We found that law passage did not reduce enrollment for children with noncitizen parents and actually resulted in temporary increases in coverage among Latino children with at least one citizen parent. These findings are surprising in light of prior research. We offer potential explanations for this finding and conclude with a call for future research to be expanded in three ways: 1) examine whether policy effects vary for children of undocumented parents, compared to children whose noncitizen parents are legally present; 2) examine the joint effects of immigration-related policies at different levels, from the city or county to the state to the federal; and 3) draw on the large social movements and political mobilization literature that describes when and how Latinos and immigrants push back against restrictive immigration laws. Copyright © 2017 Elsevier Ltd. All rights reserved.
Zombré, David; De Allegri, Manuela; Ridde, Valéry
2017-04-01
Little is known about the long-term effects of user fee exemption policies on health care use in developing countries. We examined the association between user fee exemption and health care use among children under five in Burkina Faso. We also examined how factors related to characteristics of health facilities and their environment moderate this association. We used a multilevel controlled interrupted time-series design to examine the immediate and long-term effects of the user fee exemption policy on the rate of health service utilization in children under five between January 2004 and December 2014. The initiation of the intervention more than doubled the utilization rate, with an immediate 132.6% increase in intervention facilities (IRR 2.326; 95% CI 1.980-2.672). The effect of the intervention was 32.8% higher in facilities with higher workforce density (IRR 1.328; 95% CI 1.209-1.446) and 20.0% higher during the rainy season (IRR 1.200; 95% CI 1.095-1.315), but was not significant in facilities serving more dispersed populations (IRR 1.075; 95% CI 0.942-1.207). Although the intervention effect was significant immediately following its inception, the pace of growth, while positive at first, decelerated and stabilized three years and seven months later, before slowly decreasing towards the end of the study period. This study provides additional evidence to support user fee exemption policies complemented by improvements in health care quality. Future work should include an assessment of the impact of user fee exemption on infant morbidity and mortality and examine the factors that could explain the slowdown in this upward trend in utilization rates three and a half years after the intervention onset. Copyright © 2017. Published by Elsevier Ltd.
Hungerford, Daniel; Vivancos, Roberto; Read, Jonathan M; Iturriza-Gόmara, Miren; French, Neil; Cunliffe, Nigel A
2018-01-29
Rotavirus causes severe gastroenteritis in infants and young children worldwide. The UK introduced the monovalent rotavirus vaccine (Rotarix®) in July 2013. Vaccination is free of charge to parents, with two doses delivered at 8 and 12 weeks of age. We evaluated vaccine impact across a health system in relation to socioeconomic deprivation. We used interrupted time-series analyses to assess changes in monthly health-care attendances in Merseyside, UK, for all ages, from July 2013 to June 2016, compared to predicted counterfactual attendances without vaccination, based on 3-11 years of pre-vaccine data. Outcome measures included laboratory-confirmed rotavirus gastroenteritis (RVGE) hospitalisations, acute gastroenteritis (AGE) hospitalisations, emergency department (ED) attendances for gastrointestinal conditions, and consultations for infectious gastroenteritis at community walk-in centres (WIC) and general practices (GP). All analyses were stratified by age. Hospitalisations were additionally stratified by vaccine uptake and small-area-level socioeconomic deprivation. The uptake of the first and second doses of rotavirus vaccine was 91.4% (29,108/31,836) and 86.7% (27,594/31,836), respectively. Among young children, impact was greatest during the rotavirus season and for vaccine-eligible age groups. In adults aged 65+ years, AGE hospitalisations fell by 25% (95% CI 19-30%). Hospitalisation rates were higher in the most socioeconomically deprived communities (adjusted incidence rate ratio 1.57; 95% CI 1.51-1.64), and vaccine impact was greatest among the most deprived populations, despite lower vaccine uptake. Prioritising vaccine uptake in socioeconomically deprived communities should give the greatest health benefit in terms of population disease burden.
Thomsen, Jakob Louis Demant; Mathiesen, Ole; Hägi-Pedersen, Daniel; Skovgaard, Lene Theil; Østergaard, Doris; Engbaek, Jens; Gätke, Mona Ring
2017-10-06
Muscle relaxants facilitate endotracheal intubation under general anesthesia and improve surgical conditions. Residual neuromuscular blockade occurs when the patient is still partially paralyzed when awakened after surgery. The condition is associated with subjective discomfort and an increased risk of respiratory complications. Use of an objective neuromuscular monitoring device may prevent residual block. Despite this, many anesthetists refrain from using the device. Efforts to increase the use of objective monitoring are time consuming and require the presence of expert personnel. A neuromuscular monitoring e-learning module might support consistent use of neuromuscular monitoring devices. The aim of the study is to assess the effect of a neuromuscular monitoring e-learning module on anesthesia staff's use of objective neuromuscular monitoring and the incidence of residual neuromuscular blockade in surgical patients at 6 Danish teaching hospitals. In this interrupted time series study, we are collecting data repeatedly, in consecutive 3-week periods, before and after the intervention, and we will analyze the effect using segmented regression analysis. Anesthesia departments in the Zealand Region of Denmark are included, and data from all patients receiving a muscle relaxant are collected from the anesthesia information management system MetaVision. We will assess the effect of the module at all levels of potential effect: staff's knowledge and skills, patient care practice, and patient outcomes. The primary outcome is use of neuromuscular monitoring in patients according to the type of muscle relaxant received. Secondary outcomes include last recorded train-of-four value, administration of reversal agents, and time to discharge from the postanesthesia care unit, as well as a multiple-choice test to assess knowledge. The e-learning module was developed based on a needs assessment process, including focus group interviews, surveys, and expert opinions.
McLintock, Kate; Russell, Amy M; Alderson, Sarah L; West, Robert; House, Allan; Westerman, Karen; Foy, Robbie
2014-08-20
To evaluate the effects of Quality and Outcomes Framework (QOF) incentivised case finding for depression on diagnosis and treatment in targeted and non-targeted long-term conditions. Interrupted time series analysis. General practices in Leeds, UK. 65 (58%) of 112 general practices shared data on 37,229 patients with diabetes and coronary heart disease targeted by case finding incentives, and 101,008 patients with four other long-term conditions not targeted (hypertension, epilepsy, chronic obstructive pulmonary disease and asthma). Incentivised case finding for depression using two standard screening questions. Clinical codes indicating new depression-related diagnoses and new prescriptions of antidepressants. We extracted routinely recorded data from February 2002 through April 2012. The number of new diagnoses and prescriptions for those on registers was modelled with binomial regression, which provided the strength of associations between time periods and their rates. New diagnoses of depression increased from 21 to 94/100,000 per month in targeted patients between the periods 2002-2004 and 2007-2011 (OR 2.09; 95% CI 1.92 to 2.27). The rate increased from 27 to 77/100,000 per month in non-targeted patients (OR 1.53; 95% CI 1.46 to 1.62). The slopes in prescribing for both groups flattened to zero immediately after QOF was introduced but before incentivised case finding (p<0.01 for both). Antidepressant prescribing in targeted patients returned to the pre-QOF secular upward trend (Wald test for equivalence of slope, z=0.73, p=0.47); the slope was less steep for non-targeted patients (z=-4.14, p<0.01). Incentivised case finding increased new depression-related diagnoses. The establishment of QOF disrupted rising trends in new prescriptions of antidepressants, which resumed following the introduction of incentivised case finding. Prescribing trends are of concern given that they may include people with mild-to-moderate depression unlikely to respond to such treatment.
Directory of Open Access Journals (Sweden)
Sham Lal
Malaria endemic countries have scaled up community health worker (CHW) interventions to diagnose and treat malaria in communities with limited access to public health systems. Evaluations of these programmes have centred on CHWs' compliance with guidelines, but the broader changes at public health centres, including utilisation and diagnoses made, have received limited attention. This analysis was conducted during a CHW intervention for malaria in Rukungiri District, Western Uganda. Outpatient department (OPD) visit data were collected for children under 5 attending three health centres one year before the CHW intervention started (pre-intervention period) and for 20 months during the intervention (intervention period). An interrupted time series analysis with segmented regression models was used to compare the trends in malaria, non-malaria and overall OPD visits during the pre-intervention and intervention periods. After the introduction of the CHW intervention, the frequency of diagnoses of diarrhoeal diseases, pneumonia and helminths increased, whilst the frequency of malaria diagnoses declined at health centres. In May 2010, when the intervention began, overall health centre utilisation decreased by 63% compared to the pre-intervention period, and the health centres saw 32 fewer overall visits per month compared to the pre-intervention period (p<0.001). Malaria visits also declined shortly after the intervention began, with 27 fewer visits per month during the intervention period compared with the pre-intervention period (p<0.05). The declines in overall and malaria visits were sustained for the entire intervention period. In contrast, there were no observable changes in trends of non-malarial visits between the pre-intervention and intervention periods. This analysis suggests that introducing a CHW intervention can reduce the number of child malaria visits and change the profile of cases presenting at health centres.
Morales, Daniel R; Donnan, Peter T; Daly, Fergus; Staa, Tjeerd Van; Sullivan, Frank M
2013-01-01
Objectives To measure the incidence of Bell's palsy and determine the impact of clinical trial findings on Bell's palsy management in the UK. Design Interrupted time series regression analysis and incidence measures. Setting General practices in the UK contributing to the Clinical Practice Research Datalink (CPRD). Participants Patients ≥16 years with a diagnosis of Bell's palsy between 2001 and 2012. Interventions (1) Publication of the 2004 Cochrane reviews of clinical trials on corticosteroids and antivirals for Bell's palsy, which made no clear recommendation on their use, and (2) publication of the 2007 Scottish Bell's Palsy Study (SBPS), which made a clear recommendation that treatment with prednisolone alone improves chances for complete recovery. Main outcome measures Incidence of Bell's palsy per 100 000 person-years. Changes in the management of Bell's palsy with either prednisolone therapy, antiviral therapy, combination therapy (prednisolone with antiviral therapy) or untreated cases. Results During the 12-year period, 14 460 cases of Bell's palsy were identified, with an overall incidence of 37.7/100 000 person-years. The 2004 Cochrane reviews were associated with immediate falls in prednisolone therapy (−6.3% (−11.0 to −1.6)), rising trends in combination therapy (1.1% per quarter (0.5 to 1.7)) and falling trends for untreated cases (−0.8% per quarter (−1.4 to −0.3)). SBPS was associated with immediate increases in prednisolone therapy (5.1% (0.9 to 9.3)) and rising trends in prednisolone therapy (0.7% per quarter (0.4 to 1.2)); falling trends in combination therapy (−1.7% per quarter (−2.2 to −1.3)); and rising trends for untreated cases (1.2% per quarter (0.8 to 1.6)). Despite improvements, 44% still remain untreated. Conclusions SBPS was clearly associated with change in management, but a significant proportion of patients failed to receive effective treatment, which cannot be fully explained.
Directory of Open Access Journals (Sweden)
Kang-Cheng Su
To investigate the effect of a simplified prevention bundle with alcohol-based, dual hand hygiene (HH) audit on the incidence of early-onset ventilator-associated pneumonia (VAP). This 3-year, quasi-experimental study with interrupted time-series analysis was conducted in two cardiovascular surgery intensive care units in a medical center. An unaware external HH audit (eHH), performed by non-unit-based observers, was a routine task before and after bundle implementation. Based on the realistic ICU settings, we implemented a 3-component bundle, which included: a compulsory education program, a knowing internal HH audit (iHH) performed by unit-based observers, and a standardized oral care (OC) protocol with 0.1% chlorhexidine gluconate. The study comprised 4 phases: a 12-month pre-implementation phase 1 (eHH+/education-/iHH-/OC-), a 3-month run-in phase 2 (eHH+/education+/iHH+/OC+), a 15-month implementation phase 3 (eHH+/education+/iHH+/OC+), and a 6-month post-implementation phase 4 (eHH+/education-/iHH+/OC-). A total of 2553 ventilator-days were observed. VAP incidences (events/1000 ventilator-days) in phases 1-4 were 39.1, 40.5, 15.9, and 20.4, respectively. VAP was significantly reduced by 59% in phase 3 (vs. phase 1, incidence rate ratio [IRR] 0.41, P = 0.002), but rebounded in phase 4. Moreover, VAP incidence was inversely correlated with compliance with OC (r2 = 0.531, P = 0.001) and eHH (r2 = 0.878, P < 0.001), but not with iHH, even though iHH compliance was higher than eHH compliance during phases 2 to 4. Compared to eHH, iHH provided more efficient and faster improvements in standard HH practice. The minimal compliances required for significant VAP reduction were 85% for OC and 75% for eHH (both P < 0.05; IRR 0.28 and 0.42, respectively). This simplified prevention bundle effectively reduces early-onset VAP incidence. An unaware HH audit correlates with VAP incidence, while a knowing HH audit provides better improvement in HH practice.
Bergen, Helen; Simkin, Sue; Dodd, Sue; Pocock, Phil; Bernal, William; Gunnell, David; Kapur, Navneet
2013-01-01
Objective To assess the long term effect of United Kingdom legislation introduced in September 1998 to restrict pack sizes of paracetamol on deaths from paracetamol poisoning and liver unit activity. Design Interrupted time series analyses to assess mean quarterly changes from October 1998 to the end of 2009 relative to projected deaths without the legislation based on pre-legislation trends. Setting Mortality (1993-2009) and liver unit activity (1995-2009) in England and Wales, using information from the Office for National Statistics and NHS Blood and Transplant, respectively. Participants Residents of England and Wales. Main outcome measures Suicide, deaths of undetermined intent, and accidental poisoning deaths involving single drug ingestion of paracetamol and paracetamol compounds in people aged 10 years and over, and liver unit registrations and transplantations for paracetamol induced hepatotoxicity. Results Compared with the pre-legislation level, following the legislation there was an estimated average reduction of 17 (95% confidence interval −25 to −9) deaths per quarter in England and Wales involving paracetamol alone (with or without alcohol) that received suicide or undetermined verdicts. This decrease represented a 43% reduction, or an estimated 765 fewer deaths, over the 11¼ years after the legislation. A similar effect was found when accidental poisoning deaths were included, and when a conservative method of analysis was used. This decrease was largely unaltered after controlling for a non-significant reduction in deaths involving other methods of poisoning and also suicides by all methods. There was a 61% reduction in registrations for liver transplantation for paracetamol induced hepatotoxicity (−11 (−20 to −1) registrations per quarter), but no reduction was seen in actual transplantations (−3 (−12 to 6)), nor in registrations after a conservative method of analysis was used. Conclusions UK legislation to reduce pack sizes of paracetamol was followed by significant reductions in deaths involving paracetamol overdose.
Rotter, Thomas; Plishka, Christopher; Hansia, Mohammed Rashaad; Goodridge, Donna; Penz, Erika; Kinsman, Leigh; Lawal, Adegboyega; O'Quinn, Sheryl; Buchan, Nancy; Comfort, Patricia; Patel, Prakesh; Anderson, Sheila; Winkel, Tanya; Lang, Rae Lynn; Marciniuk, Darcy D
2017-11-28
Chronic obstructive pulmonary disease (COPD) has substantial economic and human costs; it is expected to be the third leading cause of death worldwide by 2030. To minimize these costs, high-quality guidelines have been developed. However, guidelines alone rarely result in meaningful change. One method of integrating guidelines into practice is the use of clinical pathways (CPWs). CPWs bring available evidence to a range of healthcare professionals by detailing the essential steps in care and adapting guidelines to the local context. We are working with local stakeholders to develop CPWs for COPD with the aims of improving care while reducing utilization. The CPWs will employ several steps including: standardizing diagnostic training, unifying components of chronic disease care, coordinating education and reconditioning programs, and ensuring care uses best practices. Further, we have worked to identify evidence-informed implementation strategies which will be tailored to the local context. We will conduct a three-year research project using an interrupted time series (ITS) design in the form of a multiple baseline approach with control groups. The CPW will be implemented in two health regions (experimental groups) and two health regions will act as controls (control groups). The experimental and control groups will each contain an urban and a rural health region. Primary outcomes for the study will be quality of care, operationalized using hospital readmission rates and emergency department (ED) presentation rates. Secondary outcomes will be healthcare utilization and guideline adherence, operationalized using hospital admission rates, hospital length of stay and general practitioner (GP) visits. Results will be analyzed using segmented regression analysis. Funding has been procured from multiple stakeholders. The project has been deemed exempt from ethics review as it is a quality improvement project. Intervention implementation is expected to begin in summer 2017.
Booth, Richard G; Allen, Britney N; Bray Jenkyn, Krista M; Li, Lihua; Shariff, Salimah Z
2018-04-06
Despite the uptake of mass media campaigns, their overall impact remains unclear. Since 2011, a Canadian telecommunications company has operated an annual, large-scale mental health advocacy campaign (Bell Let's Talk) focused on mental health awareness and stigma reduction. In February 2012, the campaign began to explicitly leverage the social media platform Twitter and incented participation from the public by promising donations of Can $0.05 for each interaction with a campaign-specific username (@Bell_LetsTalk). The intent of the study was to examine the impact of this 2012 campaign on youth outpatient mental health services in the province of Ontario, Canada. Monthly outpatient mental health visits (primary health care and psychiatric services) were obtained for Ontario youth aged 10 to 24 years (approximately 5.66 million visits) from January 1, 2006 to December 31, 2015. Interrupted time series, autoregressive integrated moving average modeling was implemented to evaluate the impact of the campaign on rates of monthly outpatient mental health visits. A lagged intervention date of April 1, 2012 was selected to account for the delay required for a patient to schedule and attend a mental health-related physician visit. The inclusion of Twitter in the 2012 Bell Let's Talk campaign was temporally associated with an increase in outpatient mental health utilization for both males and females. Within primary health care environments, female adolescents aged 10 to 17 years experienced a monthly increase in the mental health visit rate from 10.2/1000 in April 2006 to 14.1/1000 in April 2015 (slope change of 0.094 following the campaign). Smaller but significant post-campaign slope changes were also observed (0.005, P=.02, and 0.003, P=.005). For young adults aged 18 to 24 years, females who used primary health care experienced the most significant increases in mental health visit rates, from 26.5/1000 in April 2006 to 29.2/1000 in April 2015 (slope change of 0.17 following the campaign).
Molinari, Noelle-Angelique M; LeBlanc, Tanya Telfair; Stephens, William
2018-03-20
The first Ebola virus disease (EVD) case in the United States (US) was confirmed on September 30, 2014 in a 45-year-old man. This event created considerable media attention, and there was fear of an EVD outbreak in the US. This study examined whether emergency department (ED) visits changed in metropolitan Dallas-Fort Worth, Texas (DFW) after this EVD case was confirmed. Using Texas Health Services Region 2/3 syndromic surveillance data and focusing on DFW, interrupted time series analyses were conducted using segmented regression models with autoregressive errors for overall ED visits and rates of several chief complaints, including fever with gastrointestinal distress (FGI). The date of fatal case confirmation was the "event." Results indicated that the event was highly significant for overall ED visits. These findings have implications for ED capacity as well as for public health messaging in the wake of a public health emergency.
van der Waal, Zelda; Rushton, Steven; Rankin, Judith
2018-01-01
Objectives To determine whether introduction or withdrawal of a maternal financial incentive was associated with changes in timing of first attendance for antenatal care (‘booking’), or incidence of small for gestational age. Design A natural experimental evaluation using interrupted time series analysis. Setting A hospital-based maternity unit in the north of England. Participants 34 589 women (and their live-born babies) who delivered at the study hospital and completed the 25th week of pregnancy in the 75 months before (January 2003 to March 2009), 21 months during (April 2009 to December 2010) and 36 months after (January 2011 to December 2013) the incentive was available. Intervention The Health in Pregnancy Grant was a financial incentive of £190 ($235; €211) payable to pregnant women in the UK from the 25th week of pregnancy, contingent on them receiving routine antenatal care. Primary and secondary outcome measures The primary outcome was mean gestational age at booking. Secondary outcomes were proportion of women booking by 10, 18 and 25 weeks’ gestation; and proportion of babies that were small for gestational age. Results By 21 months after introduction of the grant (ie, immediately prior to withdrawal), compared with what was predicted given prior trends, there was a reduction in mean gestational age at booking of 4.8 days (95% CI 2.3 to 8.2). The comparable figure for 24 months after withdrawal was an increase of 14.0 days (95% CI 2.8 to 16.8). No changes in incidence of small for gestational age babies were seen. Conclusions The introduction of a universal financial incentive for timely attendance at antenatal care was associated with a reduction in mean gestational age at first attendance, but not the proportion of babies that were small for gestational age. Future research should explore the effects of incentives offered at different times in pregnancy and of differing values; and how stakeholders view such incentives. PMID:29391362
Park, Hee-Jung; Lee, Jun Hyup; Park, Sujin; Kim, Tae-Il
2018-02-01
This study utilized a strong quasi-experimental design to test the hypothesis that the implementation of a policy to expand dental care services resulted in an increase in the usage of dental outpatient services. A total of 45,650,000 subjects with diagnoses of gingivitis or advanced periodontitis who received dental scaling were selected and examined, utilizing National Health Insurance claims data from July 2010 through November 2015. We performed a segmented regression analysis of the interrupted time-series to analyze the time-series trend in dental costs before and after the policy implementation, and assessed immediate changes in dental costs. After the policy change was implemented, a statistically significant 18% increase occurred in the observed total dental cost per patient, after adjustment for age, sex, and residence area. In addition, the dental costs of outpatient gingivitis treatment increased immediately by almost 47%, compared with a 15% increase in treatment costs for advanced periodontitis outpatients. This policy effect appears to be sustainable. The introduction of the new policy positively impacted the immediate and long-term outpatient utilization of dental scaling treatment in South Korea. While the policy was intended to entice patients to prevent periodontal disease, thus benefiting the insurance system, our results showed that the policy also increased treatment accessibility for potential periodontal disease patients and may improve long-term periodontal health in the South Korean population.
Tung, Yu-Chi; Chang, Guann-Ming; Cheng, Shou-Hsia
2015-01-01
As healthcare spending continues to increase, reimbursement cuts have become one type of healthcare reform to contain costs. Little is known about the long-term impact of cuts in reimbursement, especially under a global budget cap with fee-for-service (FFS) reimbursement, on processes and outcomes of care. FFS-based reimbursement cuts have been implemented since July 2002 in Taiwan. We examined the long-term association of FFS-based reimbursement cuts with trends in processes and outcomes of care for stroke. We analyzed all 411,487 patients with stroke admitted to general acute care hospitals in Taiwan during the period 1997 to 2010 through Taiwan's National Health Insurance Research Database. We used a quasi-experimental design with quarterly measures of healthcare utilization and outcomes and used segmented autoregressive integrated moving average models for the analysis. After accounting for secular trends and other confounders, the implementation of the FFS-based reimbursement cuts was associated with trend changes in computed tomography/magnetic resonance imaging scanning (0.31% per quarter; P=0.013) and antiplatelet/anticoagulant use (-0.20% per quarter), among other changes in processes and outcomes of care over time. The reimbursement cuts from the FFS-based global budget cap were thus associated with trend changes in processes and outcomes of care for stroke, and may have long-term positive and negative associations with stroke care. © 2014 American Heart Association, Inc.
Directory of Open Access Journals (Sweden)
Leonardi Giovanni
2011-02-01
Background Limited evidence suggests that being flooded may increase mortality and morbidity among affected householders, not just at the time of the flood but for months afterwards. The objective of this study is to explore methods for quantifying such long-term health effects of flooding by analysis of routine mortality registrations in England and Wales. Methods Mortality data, geo-referenced by postcode of residence, were linked to a national database of flood events for 1994 to 2005. The ratio of mortality in the post-flood year to that in the pre-flood year within flooded postcodes was compared with that in non-flooded boundary areas (within 5 km of a flood). Further analyses compared the observed number of flood-area deaths in the year after flooding with the number expected from analysis of mortality trends stratified by region, age-group, sex, deprivation group and urban-rural status. Results Among the 319 recorded floods, there were 771 deaths in the year before flooding and 693 deaths in the year after (post-/pre-flood ratio of 0.90, 95% CI 0.82-1.00). This ratio did not vary substantially by age, sex, population density or deprivation. A similar post-flood 'deficit' of deaths was suggested by the analyses based on observed/expected deaths. Conclusions The observed post-flood 'deficit' of deaths is counter-intuitive and difficult to interpret because of the possible influence of population displacement caused by flooding. The bias that might arise from such displacement remains unquantified but has important implications for future studies that use place of residence as a marker of exposure.
Stocks, S J; McNamee, R; Turner, S; Carder, M; Agius, R M
2015-07-01
Reducing healthcare-associated infections (HCAI) has been a priority in the U.K. over recent decades and this has been reflected in interventions focusing on improving hygiene procedures. To evaluate whether these interventions coincided with an increased incidence of work-related irritant contact dermatitis (ICD) attributed to hand hygiene and/or other hygiene measures in healthcare workers (HCWs), a quasi-experimental (interrupted time series) design was used to compare trends in incidence of ICD in HCWs attributed to hygiene before and after interventions to reduce HCAI with trends in the same periods in control groups (ICD in other workers). Cases of ICD reported to a U.K. surveillance scheme from 1996 to 2012 were analysed. The time periods compared were defined objectively based on the dates of the publication of national evidence-based guidelines, the U.K. Health Act 2006 and the Cleanyourhands campaign. The reported incidence of ICD in HCWs attributed to hygiene increased steadily from 1996 to 2012 [annual incidence rate ratio (95% confidence interval): hand hygiene only 1.10 (1.07-1.12); all hygiene 1.05 (1.03-1.07)], whereas the incidence in other workers is declining. An increase in incidence of ICD in HCWs attributed to hand hygiene was observed at the beginning of the Cleanyourhands campaign. The increasing incidence of ICD in HCWs combined with the popularity of interventions to reduce HCAI warrants increased efforts towards identifying products and implementing practices posing the least risk of ICD. © 2015 British Association of Dermatologists.
Stone, Sheldon Paul; Fuller, Christopher; Savage, Joan; Cookson, Barry; Hayward, Andrew; Cooper, Ben; Duckworth, Georgia; Michie, Susan; Murray, Miranda; Jeanes, Annette; Roberts, J; Teare, Louise; Charlett, Andre
2012-05-03
To evaluate the impact of the Cleanyourhands campaign on rates of hospital procurement of alcohol hand rub and soap, report trends in selected healthcare associated infections, and investigate the association between infections and procurement. Prospective, ecological, interrupted time series study from 1 July 2004 to 30 June 2008. 187 acute trusts in England and Wales. Installation of bedside alcohol hand rub, materials promoting hand hygiene and institutional engagement, regular hand hygiene audits, rolled out nationally from 1 December 2004. Quarterly (that is, every three months) rates for each trust of hospital procurement of alcohol hand rub and liquid soap; Staphylococcus aureus bacteraemia (meticillin resistant (MRSA) and meticillin sensitive (MSSA)) and Clostridium difficile infection for each trust. Associations between procurement and infection rates assessed by mixed effect Poisson regression model (which also accounted for effect of bed occupancy, hospital type, and timing of other national interventions targeting these infections). Combined procurement of soap and alcohol hand rub tripled from 21.8 to 59.8 mL per patient bed day; procurement rose in association with each phase of the campaign. Rates fell for MRSA bacteraemia (1.88 to 0.91 cases per 10,000 bed days) and C difficile infection (16.75 to 9.49 cases). MSSA bacteraemia rates did not fall. Increased procurement of soap was independently associated with reduced C difficile infection throughout the study (adjusted incidence rate ratio for 1 mL increase per patient bed day 0.993, 95% confidence interval 0.990 to 0.996; P<…). […] hospital procurement of alcohol rub and soap, which the results suggest has an important role in reducing rates of some healthcare associated infections. National interventions for infection control undertaken in the context of a high profile political drive can reduce selected healthcare associated infections.
Rawat, Angeli; Uebel, Kerry; Moore, David; Yassi, Annalee
2018-04-15
Noncommunicable diseases (NCDs), specifically diabetes and hypertension, are rising in high HIV-burdened countries such as South Africa. How integrating HIV care into primary health care (PHC) influences NCD care is unknown. We aimed to understand whether differences existed in NCD care (pre- versus post-integration) and how changes may relate to HIV patient numbers. Public sector PHC clinics in Free State, South Africa. Using a quasi-experimental design, we analyzed monthly administrative data on 4 indicators for diabetes and hypertension (clinic and population levels) during 4 years as HIV integration was implemented in PHC. Data represented 131 PHC clinics with a catchment population of 1.5 million. We used interrupted time series analysis at ±18 and ±30 months from HIV integration in each clinic to identify changes in trends postintegration compared with those in preintegration. We used linear mixed-effect models to study relationships between HIV and NCD indicators. Patients receiving antiretroviral therapy in the 131 PHC clinics studied increased from 1614 (April 2009) to 57,958 (April 2013). Trends in new diabetes patients on treatment remained unchanged. However, population-level new hypertensives on treatment decreased at ±30 months from integration by 6/100,000 (SE = 3, P < 0.02), and this was associated with the number of new patients with HIV on treatment at the clinics. Our findings suggest that during the implementation of integrated HIV care into PHC clinics, care for hypertensive patients could be compromised. Further research is needed to understand determinants of NCD care in South Africa and other high HIV-burdened settings to ensure patient-centered PHC.
Chun, Sung-Youn; Park, Hye-Ki; Han, Kyu-Tae; Kim, Woorim; Lee, Hyo-Jung; Park, Eun-Cheol
2017-07-12
We evaluated the effectiveness of a policy allowing the sale of over-the-counter drugs outside of pharmacies by examining its effect on the number of monthly outpatient visits for acute upper respiratory infections, dyspepsia, and migraine. We used medical claims data extracted from the Korean National Health Insurance Cohort Database from 2009 to 2013. The Korean National Health Insurance Cohort Database comprises a nationally representative sample of claims, about 2% of the entire population, obtained from the medical record data held by the Korean National Health Insurance Corporation (which has data on the entire nation). The analysis included 26,284,706 person-months of 1,042,728 individuals. An interrupted time series analysis was performed. Outcome measures were monthly outpatient visits for acute upper respiratory infections, dyspepsia, and migraine. To investigate the effect of the policy, we compared the number of monthly visits before and after the policy's implementation in 2012. For acute upper respiratory infections, monthly outpatient visits showed a decreasing trend before the policy (β = -0.0003); after it, a prompt change and an increasing trend in monthly outpatient visits were observed, but these were non-significant. For dyspepsia, the trend was increasing before implementation (β = -0.0101), but this reversed after implementation (β = -0.007). For migraine, an increasing trend was observed before the policy (β = 0.0057). After it, we observed a significant prompt change (β = -0.0314) but no significant trend. Deregulation of selling over-the-counter medication outside of pharmacies reduced monthly outpatient visits for dyspepsia and migraine symptoms, but not acute upper respiratory infections.
Thinking aloud in the presence of interruptions and time constraints
DEFF Research Database (Denmark)
Hertzum, Morten; Holmegaard, Kristin Due
2013-01-01
Thinking aloud is widely used for usability evaluation and its reactivity is therefore important to the quality of evaluation results. This study investigates whether thinking aloud (i.e., verbalization at levels 1 and 2) affects the behaviour of users who perform tasks that involve interruptions and time constraints, two frequent elements of real-world activities. We find that the presence of auditory, visual, audiovisual, or no interruptions interacts with thinking aloud for task solution rate, task completion time, and participants' fixation rate. Thinking-aloud participants also spend longer responding to interruptions than control participants. Conversely, the absence or presence of time constraints does not interact with thinking aloud, suggesting that time pressure is less likely to make thinking aloud reactive than previously assumed. Our results inform practitioners faced with the decision…
Directory of Open Access Journals (Sweden)
M Muzigaba
2017-04-01
Background. Case fatality rates for childhood severe acute malnutrition (SAM) remain high in some resource-limited facilities in South Africa (SA), despite the widespread availability of the World Health Organization treatment guidelines. There is a need to develop reproducible interventions that reinforce the implementation of these guidelines and assess their effect and sustainability. Objectives. To assess the short-term and sustained effects of a health system strengthening intervention on mortality attributable to SAM in two hospitals located in the Eastern Cape Province of SA. Methods. This was a theory-driven evaluation conducted in two rural hospitals in SA over a 69-month period (2009 - 2014). In both facilities, a health system strengthening intervention was implemented within the first 32 months, and thereafter discontinued. Sixty-nine monthly data series were collected on: (i) monthly total SAM case fatality rate (CFR); (ii) monthly SAM CFR within 24 hours of admission; and (iii) monthly SAM CFR among HIV-positive cases, to determine the intervention's effect within the first 32 months and sustainability over the remaining 37 months. The data were analysed using Linden's method for analysing interrupted time series data. Results. The study revealed that the intervention was associated with a statistically significant decrease of up to 0.4% in monthly total SAM CFR, a non-statistically significant decrease of up to 0.09% in monthly SAM CFR within 24 hours of admission and a non-statistically significant decrease of up to 0.11% in monthly SAM CFR among HIV-positive cases. The decrease in mortality trends for both outcomes was only slightly reversed upon the discontinuation of the intervention. No autocorrelation was detected in the regression models generated during data analyses. Conclusion. The study findings suggest that although the intervention was designed to be self-sustaining, this may not have been the case. A qualitative enquiry…
Jewkes, Rachel; Gibbs, Andrew; Jama-Shai, Nwabisa; Willan, Samantha; Misselhorn, Alison; Mushinga, Mildred; Washington, Laura; Mbatha, Nompumelelo; Skiweyiya, Yandisa
2014-12-29
Gender-based violence and HIV are highly prevalent in the harsh environment of informal settlements, and reducing violence there is very challenging. The group intervention Stepping Stones has been shown to reduce men's perpetration of violence in more rural areas, but violence experienced by women in the study was not affected. Economic empowerment interventions with gender training can protect older women from violence, but microloan interventions have proved challenging with young women. We investigated whether combining a broad economic empowerment intervention and Stepping Stones could impact violence among young men and women. The intervention, Creating Futures, was developed as a new generation of economic empowerment intervention, which enabled livelihood strengthening through helping participants find work or set up a business, and did not give cash or make loans. We piloted Stepping Stones with Creating Futures in two informal settlements of Durban with 232 out-of-school youth, mostly aged 18-30, and evaluated it with a shortened interrupted time series of two baseline surveys and surveys at 28 and 58 weeks post-baseline. 94/110 men and 111/122 women completed the last assessment, 85.5% and 90.2% respectively of those enrolled. To determine trend, we built random effects regression models with each individual as the cluster for each variable, and measured the slope of the line across the time points. Men's mean earnings in the past month increased by 247% from R411 (~$40) to R1015 (~$102), and women's by 278% from R174 (~$17) to R484 (~$48) (trend test, p < 0.0001). There was a significant reduction in women's experience of the combined measure of physical and/or sexual IPV in the prior three months from 30.3% to 18.9% (p = 0.037). This was not seen for men. However, both men and women scored significantly better on gender attitudes, and men significantly reduced their controlling practices in their relationships. The prevalence of moderate or severe depression…
Cornelsen, Laura; Mytton, Oliver T; Adams, Jean; Gasparrini, Antonio; Iskander, Dalia; Knai, Cecile; Petticrew, Mark; Scott, Courtney; Smith, Richard; Thompson, Claire; White, Martin; Cummins, Steven
2017-11-01
This study evaluates changes in sales of non-alcoholic beverages in Jamie's Italian, a national chain of commercial restaurants in the UK, following the introduction of a £0.10 per-beverage levy on sugar-sweetened beverages (SSBs) and supporting activity including beverage menu redesign, new products and establishment of a children's health fund from levy proceeds. We used an interrupted time series design to quantify changes in sales of non-alcoholic beverages 12 weeks and 6 months after implementation of the levy, using itemised electronic point of sale data. Main outcomes were number of SSBs and other non-alcoholic beverages sold per customer. Linear regression and multilevel random effects models, adjusting for seasonality and clustering, were used to investigate changes in SSB sales across all restaurants (n=37) and by tertiles of baseline restaurant SSB sales per customer. Compared with the prelevy period, the number of SSBs sold per customer declined by 11.0% (-17.3% to -4.3%) at 12 weeks and 9.3% (-15.2% to -3.2%) at 6 months. For non-levied beverages, sales per customer of children's fruit juice declined by 34.7% (-55.3% to -4.3%) at 12 weeks and 9.9% (-16.8% to -2.4%) at 6 months. At 6 months, sales per customer of fruit juice increased by 21.8% (14.0% to 30.2%) but sales of diet cola (-7.3%; -11.7% to -2.8%) and bottled waters (-6.5%; -11.0% to -1.7%) declined. Changes in sales were only observed in restaurants in the medium and high tertiles of baseline SSB sales per customer. Introduction of a £0.10 levy on SSBs alongside complementary activities is associated with declines in SSB sales per customer in the short and medium term, particularly in restaurants with higher baseline sales of SSBs. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Talboom-Kamp, Esther Pwa; Verdijk, Noortje A; Kasteleyn, Marise J; Harmans, Lara M; Talboom, Irvin Jsh; Looijmans-van den Akker, Ingrid; van Geloven, Nan; Numans, Mattijs E; Chavannes, Niels H
2017-08-16
Worldwide nearly 3 million people die from chronic obstructive pulmonary disease (COPD) every year. Integrated disease management (IDM) improves quality of life for COPD patients and can reduce hospitalization. Self-management of COPD through eHealth is an effective method to improve IDM and clinical outcomes. The objective of this implementation study was to investigate the effect of 3 chronic obstructive pulmonary disease eHealth programs applied in primary care on health status. The e-Vita COPD study compares different levels of integration of Web-based self-management platforms in IDM in 3 primary care settings. Patient health status is examined using the Clinical COPD Questionnaire (CCQ). The parallel cohort design includes 3 levels of integration in IDM (groups 1, 2, 3) and randomization of 2 levels of personal assistance for patients (group A, high assistance, group B, low assistance). Interrupted time series (ITS) design was used to collect CCQ data at multiple time points before and after intervention, and multilevel linear regression modeling was used to analyze CCQ data. Of the 702 invited patients, 215 (30.6%) registered to a platform. Of these, 82 participated in group 1 (high integration IDM), 36 in group 1A (high assistance), and 46 in group 1B (low assistance); 96 participated in group 2 (medium integration IDM), 44 in group 2A (high assistance) and 52 in group 2B (low assistance); also, 37 participated in group 3 (no integration IDM). In the total group, no significant difference was found in change in CCQ trend (P=.334) before (-0.47% per month) and after the intervention (-0.084% per month). Also, no significant difference was found in CCQ changes before versus after the intervention between the groups with high versus low personal assistance. In all subgroups, there was no significant change in the CCQ trend before and after the intervention (group 1A, P=.237; 1B, P=.991; 2A, P=.120; 2B, P=.166; 3, P=.945). The e-Vita eHealth-supported COPD
Enhanced Interrupt Response Time in the nMPRA based on Embedded Real Time Microcontrollers
Directory of Open Access Journals (Sweden)
GAITAN, N. C.
2017-08-01
In any real-time operating system, task switching and scheduling, interrupts, and synchronization and communication between processes represent major problems. The implementation of these mechanisms in software generates significant delays for many applications. The nMPRA (Multi Pipeline Register Architecture) is designed for the implementation of real-time embedded microcontrollers. It supports the competitive execution of n tasks, enabling very fast switching between them, with a usual delay of one machine cycle and a maximum of 3 machine cycles for the memory-related work instructions. This is because each task has its own PC (Program Counter), set of pipeline registers, and general register file. The nMPRA is provided with an advanced distributed interrupt controller that implements the concept of "interrupts as threads". This allows the attachment of one or more interrupts to the same task. In this context, the original contribution of this article is to present solutions for improving the response time to interrupts when a task has a large number of attached interrupts. The proposed solutions enhance the original architecture's interrupt logic in order to transfer control to the interrupt handler as soon as possible and to create interrupt prioritization at the task level.
Cane, James E; Cauchard, Fabrice; Weger, Ulrich W
2012-01-01
Two experiments examined how interruptions impact reading and how interruption lags and the reader's spatial memory affect the recovery from such interruptions. Participants read paragraphs of text and were interrupted unpredictably by a spoken news story while their eye movements were monitored. Time made available for consolidation prior to responding to the interruption did not aid reading resumption. However, providing readers with a visual cue that indicated the interruption location did aid task resumption substantially in Experiment 2. Taken together, the findings show that the recovery from interruptions during reading draws on spatial memory resources and can be aided by processes that support spatial memory. Practical implications are discussed.
National Research Council Canada - National Science Library
Adler, Robert
1997-01-01
We describe how to take a stable, ARMA, time series through the various stages of model identification, parameter estimation, and diagnostic checking, and accompany the discussion with a goodly number...
Multivariate Time Series Search
National Aeronautics and Space Administration — Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical...
DEFF Research Database (Denmark)
Hisdal, H.; Holmqvist, E.; Hyvärinen, V.
Awareness that emission of greenhouse gases will raise the global temperature and change the climate has led to studies trying to identify such changes in long-term climate and hydrologic time series. This report, written by the…
DEFF Research Database (Denmark)
Fischer, Paul; Hilbert, Astrid
2012-01-01
We introduce a platform which supplies an easy-to-handle, interactive, extendable, and fast analysis tool for time series analysis. In contrast to other software suites like Maple, Matlab, or R, which use a command-line-like interface and where the user has to memorize/look-up the appropriate commands, our application is select-and-click-driven. It allows the user to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model fits, the user may choose between manual and automated parameter selection. The user can define new transformations and add them to the system. The application contains efficient implementations of advanced and recent techniques for time series analysis, including techniques related to extreme value analysis and filtering…
DEFF Research Database (Denmark)
Moskowitz, Tobias J.; Ooi, Yao Hua; Heje Pedersen, Lasse
2012-01-01
We document significant "time series momentum" in equity index, currency, commodity, and bond futures for each of the 58 liquid instruments we consider. We find persistence in returns for one to 12 months that partially reverses over longer horizons, consistent with sentiment theories of initial under-reaction and delayed over-reaction. A diversified portfolio of time series momentum strategies across all asset classes delivers substantial abnormal returns with little exposure to standard asset pricing factors and performs best during extreme markets. Examining the trading activities…
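The core of a time series momentum strategy, going long after a positive trailing 12-month return and short after a negative one, can be sketched on simulated data. The return process, single instrument, and lookback are assumptions for illustration; the paper studies 58 futures instruments with volatility scaling.

```python
import numpy as np

rng = np.random.default_rng(1)
# hypothetical monthly returns for one instrument with mild persistence (AR(1))
r = np.empty(240)
r[0] = rng.normal(0, 0.03)
for i in range(1, 240):
    r[i] = 0.2 * r[i - 1] + rng.normal(0, 0.03)

lookback = 12
# signal: sign of trailing 12-month cumulative return; hold that position next month
sig = np.sign([r[i - lookback:i].sum() for i in range(lookback, 240)])
strat = sig * r[lookback:]           # strategy's monthly returns
print(strat.mean())
```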
Woodward, Wayne A; Elliott, Alan C
2011-01-01
"There is scarcely a standard technique that the reader will find left out … this book is highly recommended for those requiring a ready introduction to applicable methods in time series and serves as a useful resource for pedagogical purposes." -International Statistical Review (2014), 82. "Current time series theory for practice is well summarized in this book." -Emmanuel Parzen, Texas A&M University. "What an extraordinary range of topics covered, all very insightfully. I like [the authors'] innovations very much, such as the AR factor table." -David Findley, U.S. Census Bureau (retired). …
Predicting chaotic time series
International Nuclear Information System (INIS)
Farmer, J.D.; Sidorowich, J.J.
1987-01-01
We present a forecasting technique for chaotic data. After embedding a time series in a state space using delay coordinates, we "learn" the induced nonlinear mapping using local approximation. This allows us to make short-term predictions of the future behavior of a time series, using information based only on past values. We present an error estimate for this technique, and demonstrate its effectiveness by applying it to several examples, including data from the Mackey-Glass delay differential equation, Rayleigh-Benard convection, and Taylor-Couette flow
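The delay-embedding and local-approximation idea can be sketched with a nearest-neighbour forecaster. The logistic map stands in for the paper's examples (Mackey-Glass etc.), and the embedding dimension and neighbour count are assumptions:

```python
import numpy as np

# generate a chaotic series from the logistic map (stand-in for Mackey-Glass data)
x = np.empty(600)
x[0] = 0.4
for i in range(1, 600):
    x[i] = 3.9 * x[i - 1] * (1 - x[i - 1])

m, train = 3, 500                    # embedding dimension, training length (assumed)
emb = np.array([x[i - m:i] for i in range(m, train)])   # delay-coordinate vectors
nxt = x[m:train]                                        # value following each vector

def predict(history):
    """One-step forecast: average the successors of the nearest delay vectors."""
    q = np.asarray(history[-m:])
    d = np.linalg.norm(emb - q, axis=1)
    k = np.argsort(d)[:4]            # 4 nearest neighbours = local approximation
    return nxt[k].mean()

err = [abs(predict(x[:i]) - x[i]) for i in range(train, 600)]
print(np.mean(err))                  # small out-of-sample one-step error
```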
International Nuclear Information System (INIS)
Vajna, Szabolcs; Kertész, János; Tóth, Bálint
2013-01-01
Many human-related activities show power-law decaying interevent time distribution with exponents usually varying between 1 and 2. We study a simple task-queuing model, which produces bursty time series due to the non-trivial dynamics of the task list. The model is characterized by a priority distribution as an input parameter, which describes the choice procedure from the list. We give exact results on the asymptotic behaviour of the model and we show that the interevent time distribution is power-law decaying for any kind of input distributions that remain normalizable in the infinite list limit, with exponents tunable between 1 and 2. The model satisfies a scaling law between the exponents of interevent time distribution (β) and autocorrelation function (α): α + β = 2. This law is general for renewal processes with power-law decaying interevent time distribution. We conclude that slowly decaying autocorrelation function indicates long-range dependence only if the scaling law is violated. (paper)
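The flavour of such a task-queuing model can be sketched in a few lines: a fixed-length list, the highest-priority task executed each step, and the executed task replaced by a fresh one. The list length, uniform priorities, and seed are assumptions, and this is a simplified cousin of the paper's model rather than its exact dynamics; the point is that waiting times become heavily skewed, with most tasks served immediately and a long tail of delayed ones.

```python
import random

random.seed(2)
L = 10
queue = [(random.random(), 0) for _ in range(L)]   # (priority, birth step)
waits = []
for step in range(1, 50001):
    i = max(range(L), key=lambda j: queue[j][0])   # execute the highest-priority task
    waits.append(step - queue[i][1])               # record how long it sat on the list
    queue[i] = (random.random(), step)             # replace it with a fresh task
print(max(waits), sum(w == 1 for w in waits) / len(waits))
```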
Real time interrupt handling using FORTRAN IV plus under RSX-11M
International Nuclear Information System (INIS)
Schultz, D.E.
1981-01-01
A real-time data acquisition application for a linear accelerator is described. The important programming features of this application are the use of connect-to-interrupt, a shared library, map-to-I/O-page, and a shared data area. The paper explains how to provide rapid interrupt handling from FORTRAN IV PLUS using these tools
Introduction to Time Series Modeling
Kitagawa, Genshiro
2010-01-01
In time series modeling, the behavior of a certain phenomenon is expressed in relation to the past values of itself and other covariates. Since many important phenomena in statistical analysis are actually time series and the identification of conditional distribution of the phenomenon is an essential part of the statistical modeling, it is very important and useful to learn fundamental methods of time series modeling. Illustrating how to build models for time series using basic methods, "Introduction to Time Series Modeling" covers numerous time series models and the various tools f
GPS Position Time Series @ JPL
Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen
2013-01-01
Different flavors of GPS time series analysis at JPL all use the same GPS Precise Point Positioning Analysis raw time series; variations in time series analysis/post-processing are driven by different users. • JPL Global Time Series/Velocities: researchers studying the reference frame, combining with VLBI/SLR/DORIS. • JPL/SOPAC Combined Time Series/Velocities: crustal deformation for tectonic, volcanic, and ground water studies. • ARIA Time Series/Coseismic Data Products: hazard monitoring and response focused. • The ARIA data system is designed to integrate GPS and InSAR: GPS tropospheric delay is used for correcting InSAR, and Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR. • Zhen Liu is talking tomorrow on InSAR time series analysis.
International Nuclear Information System (INIS)
Kwong, Dora L.W.; Sham, Jonathan S.T.; Chua, Daniel T.T.; Choy, Damon T.K.; Au, Gordon K.H.; Wu, P.M.
1997-01-01
Purpose: The effect of interruptions and prolonged overall treatment time in radiotherapy for nasopharyngeal carcinoma and the significance of timing of interruption was investigated. Methods and Materials: Treatment records of 229 patients treated with continuous course (CC) and 567 patients treated with split course (SC) radiotherapy for nonmetastatic NPC were reviewed. Overall treatment time without inclusion of time for boost was calculated. Treatment that extended 1 week beyond scheduled time was considered prolonged. Outcome in patients who completed treatment 'per schedule' were compared with those who had 'prolonged' treatment. Because of known patient selection bias between CC and SC, patients on the two schedules were analyzed separately. Multivariate analysis was performed for patients on SC. Total number of days of interruption, age, sex, T and N stage, and the use of boost were tested for the whole SC group. Analysis on the effect of timing of interruption was performed in a subgroup of 223 patients on SC who had a single unplanned interruption. Timing of interruption, either before or after the fourth week for the unplanned interruption, was tested in addition to the other variables in multivariate analysis for this subgroup of SC. Results: Twenty-seven (11.8%) patients on CC and 96 (16.9%) patients on SC had prolonged treatment. Patients on SC who had prolonged treatment had significantly poorer loco-regional control rate and disease free survival when compared with those who completed radiotherapy per schedule (p = 0.0063 and 0.001, respectively, with adjustment for stage). For CC, the effect of prolonged treatment on outcome was not significant. The small number of events for patients on CC probably account for the insignificant finding. The number of days of interruption was confirmed as prognostic factor, independent of T and N stages, for loco-regional control and disease-free survival in multivariate analysis for SC. The hazard rate for loco
Shimada, Yutaka; Ikeguchi, Tohru; Shigehara, Takaomi
2012-10-01
In this Letter, we propose a framework to transform a complex network to a time series. The transformation from complex networks to time series is realized by the classical multidimensional scaling. Applying the transformation method to a model proposed by Watts and Strogatz [Nature (London) 393, 440 (1998)], we show that ring lattices are transformed to periodic time series, small-world networks to noisy periodic time series, and random networks to random time series. We also show that these relationships are analytically held by using the circulant-matrix theory and the perturbation theory of linear operators. The results are generalized to several high-dimensional lattices.
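The transformation can be sketched for the ring-lattice case: compute shortest-path distances on a cycle graph, apply classical multidimensional scaling (double-centring the squared distances), and read the leading coordinate node-by-node as a time series, which comes out periodic. The graph size is an assumption:

```python
import numpy as np

n = 40
idx = np.arange(n)
# shortest-path distances on a ring lattice (cycle graph)
gap = np.abs(idx[:, None] - idx[None, :])
D = np.minimum(gap, n - gap)

# classical multidimensional scaling: double-centre the squared distance matrix
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ (D.astype(float) ** 2) @ J
vals, vecs = np.linalg.eigh(B)             # eigenvalues in ascending order
series = vecs[:, -1] * np.sqrt(vals[-1])   # leading MDS coordinate, node-by-node
print(series[:5].round(3))                 # traces out a (co)sinusoid around the ring
```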
西埜, 晴久
2004-01-01
The paper investigates an application of long-memory processes to economic time series. We show properties of long-memory processes, which are motivated to model a long-memory phenomenon in economic time series. An FARIMA model is described as an example of long-memory model in statistical terms. The paper explains basic limit theorems and estimation methods for long-memory processes in order to apply long-memory models to economic time series.
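A long-memory ARFIMA(0, d, 0) series can be simulated directly from its infinite moving-average representation with weights ψ_k = Γ(k+d)/(Γ(d)Γ(k+1)), truncated at the sample length. The parameter value, truncation, and burn-in are assumptions; the slowly decaying sample autocorrelation is the long-memory signature the paper discusses:

```python
import numpy as np

def arfima_0d0(n, d, burn=500, seed=3):
    """Simulate ARFIMA(0, d, 0): x_t = sum_k psi_k * eps_{t-k}, truncated MA form."""
    rng = np.random.default_rng(seed)
    m = n + burn
    psi = np.empty(m)
    psi[0] = 1.0
    for k in range(1, m):
        psi[k] = psi[k - 1] * (k - 1 + d) / k   # recursion for Gamma(k+d)/(Gamma(d) k!)
    eps = rng.normal(size=m)
    return np.convolve(eps, psi)[:m][burn:]     # drop burn-in to reduce truncation bias

x = arfima_0d0(2000, 0.3)
ac = [np.corrcoef(x[:-k], x[k:])[0, 1] for k in (1, 10, 50)]
print(np.round(ac, 2))   # autocorrelation decays slowly (hyperbolically) for d = 0.3
```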
Hospital admission interviews are time-consuming with several interruptions
DEFF Research Database (Denmark)
Ghazanfar, Misbah N; Honoré, Per Gustaf Hartvig; Nielsen, Trine R H
2012-01-01
The admission interview is an important procedure to reduce medication errors. Studies indicate that physicians do not spend much time on the interview and that the major obstacles are lack of time and heavy workload. The aim of this study was to measure the time physicians spend on admission interviews and to describe factors that affect time consumption.
Power Supply Interruption Costs: Models and Methods Incorporating Time Dependent Patterns
International Nuclear Information System (INIS)
Kjoelle, G.H.
1996-12-01
This doctoral thesis develops models and methods for estimation of annual interruption costs for delivery points, emphasizing the handling of time dependent patterns and uncertainties in the variables determining the annual costs. It presents an analytical method for calculation of annual expected interruption costs for delivery points in radial systems, based on a radial reliability model, with time dependent variables. And a similar method for meshed systems, based on a list of outage events, assuming that these events are found in advance from load flow and contingency analyses. A Monte Carlo simulation model is given which handles both time variations and stochastic variations in the input variables and is based on the same list of outage events. This general procedure for radial and meshed systems provides expectation values and probability distributions for interruption costs from delivery points. There is also a procedure for handling uncertainties in input variables by a fuzzy description, giving annual interruption costs as a fuzzy membership function. The methods are developed for practical applications in radial and meshed systems, based on available data from failure statistics, load registrations and customer surveys. Traditional reliability indices such as annual interruption time, power- and energy not supplied, are calculated as by-products. The methods are presented as algorithms and/or procedures which are available as prototypes. 97 refs., 114 figs., 62 tabs
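The Monte Carlo procedure summarized in this abstract can be sketched with a toy delivery point, assuming an hourly, time-dependent failure rate and a time-dependent cost per interruption. The rate and cost profiles below are invented for illustration, not taken from the thesis.

```python
# Monte Carlo estimate of annual interruption cost with time-dependent patterns.
import random

random.seed(0)
HOURS = 8760  # one year

def failure_rate(h):       # expected failures per hour (higher in daytime)
    return 0.0008 if 8 <= (h % 24) < 20 else 0.0002

def interruption_cost(h):  # cost of one interruption (arbitrary currency)
    return 500.0 if 8 <= (h % 24) < 20 else 100.0

def simulate_year():
    cost = 0.0
    for h in range(HOURS):
        # With small hourly rates, at most one event per hour is a fair model.
        if random.random() < failure_rate(h):
            cost += interruption_cost(h)
    return cost

years = [simulate_year() for _ in range(300)]
mc_mean = sum(years) / len(years)

# Analytic expectation for comparison.
exact = sum(failure_rate(h) * interruption_cost(h) for h in range(HOURS))
print(round(mc_mean, 1), round(exact, 1))
```

The simulated distribution of `years` is the kind of output the thesis turns into probability distributions for delivery-point interruption costs; the analytic sum mirrors its expectation-value methods.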
Time series with tailored nonlinearities
Räth, C.; Laut, I.
2015-10-01
It is demonstrated how to generate time series with tailored nonlinearities by inducing well-defined constraints on the Fourier phases. Correlations between the phase information of adjacent phases and (static and dynamic) measures of nonlinearities are established and their origin is explained. By applying a set of simple constraints on the phases of an originally linear and uncorrelated Gaussian time series, the observed scaling behavior of the intensity distribution of empirical time series can be reproduced. The power law character of the intensity distributions being typical for, e.g., turbulence and financial data can thus be explained in terms of phase correlations.
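The core construction above, keeping Fourier amplitudes while imposing constraints on the phases, can be sketched as follows. The particular phase rule (a fixed increment plus small jitter between adjacent phases) is an illustrative assumption, not the paper's exact constraint set.

```python
# Surrogate series with tailored phase correlations but unchanged amplitudes.
import numpy as np

rng = np.random.default_rng(42)
n = 1024
x = rng.standard_normal(n)      # original linear, uncorrelated Gaussian series

X = np.fft.rfft(x)
amps = np.abs(X)

# Constrained phases: phi_{k+1} = phi_k + delta + jitter, so adjacent phases
# are correlated instead of independent and uniform.
delta = 0.3
jitter = 0.05 * rng.standard_normal(len(X))
phases = np.cumsum(delta + jitter)
phases[0] = 0.0                 # DC bin must be real
phases[-1] = 0.0                # Nyquist bin must be real for a real series

y = np.fft.irfft(amps * np.exp(1j * phases), n=n)

# Same amplitude spectrum (hence same linear autocorrelation), new phases.
print(np.allclose(np.abs(np.fft.rfft(y)), amps))
```

Since the amplitude spectrum is untouched, any change in static or dynamic nonlinearity measures between `x` and `y` is attributable purely to the induced phase correlations, which is the point of the method.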
Models for dependent time series
Tunnicliffe Wilson, Granville; Haywood, John
2015-01-01
Models for Dependent Time Series addresses the issues that arise and the methodology that can be applied when the dependence between time series is described and modeled. Whether you work in the economic, physical, or life sciences, the book shows you how to draw meaningful, applicable, and statistically valid conclusions from multivariate (or vector) time series data. The first four chapters discuss the two main pillars of the subject that have been developed over the last 60 years: vector autoregressive modeling and multivariate spectral analysis. These chapters provide the foundational material
Clustering of financial time series
D'Urso, Pierpaolo; Cappelli, Carmela; Di Lallo, Dario; Massari, Riccardo
2013-05-01
This paper addresses the topic of classifying financial time series in a fuzzy framework, proposing two fuzzy clustering models, both based on GARCH models. In general, clustering of financial time series, due to their peculiar features, requires the definition of suitable distance measures. To this end, the first fuzzy clustering model exploits the autoregressive representation of GARCH models and employs, in the framework of a partitioning-around-medoids algorithm, the classical autoregressive metric. The second fuzzy clustering model, also based on the partitioning-around-medoids algorithm, uses the Caiado distance, a Mahalanobis-like distance based on estimated GARCH parameters and covariances that takes into account information about the volatility structure of the time series. In order to illustrate the merits of the proposed fuzzy approaches, an application to the problem of classifying 29 time series of Euro exchange rates against international currencies is presented and discussed, also comparing the fuzzy models with their crisp versions.
Time series analysis time series analysis methods and applications
Rao, Tata Subba; Rao, C R
2012-01-01
The field of statistics not only affects all areas of scientific activity, but also many other matters such as public policy. It is branching rapidly into so many different subjects that a series of handbooks is the only way of comprehensively presenting the various aspects of statistical methodology, applications, and recent developments. The Handbook of Statistics is a series of self-contained reference books. Each volume is devoted to a particular topic in statistics, with Volume 30 dealing with time series. The series is addressed to the entire community of statisticians and scientists in various disciplines who use statistical methodology in their work. At the same time, special emphasis is placed on applications-oriented techniques, with the applied statistician in mind as the primary audience. Comprehensively presents the various aspects of statistical methodology Discusses a wide variety of diverse applications and recent developments Contributors are internationally renowned experts in their respective fields.
Forecasting Cryptocurrencies Financial Time Series
Catania, Leopoldo; Grassi, Stefano; Ravazzolo, Francesco
2018-01-01
This paper studies the predictability of cryptocurrencies time series. We compare several alternative univariate and multivariate models in point and density forecasting of four of the most capitalized series: Bitcoin, Litecoin, Ripple and Ethereum. We apply a set of crypto–predictors and rely on Dynamic Model Averaging to combine a large set of univariate Dynamic Linear Models and several multivariate Vector Autoregressive models with different forms of time variation. We find statistical si...
Directory of Open Access Journals (Sweden)
Novian Habibie
2016-02-01
Communication between microcontrollers is one of the crucial points in embedded systems. At the same time, an embedded system must be able to run many parallel tasks simultaneously. To handle this, we need a reliable system that can multitask without decreasing any task's performance. The most widely used methods for multitasking in embedded systems are the Interrupt Service Routine (ISR) and the Real Time Operating System (RTOS). This research compared the performance of USART communication on a system with an RTOS to a system that uses interrupts. Experiments ran on two identical XMega A3BU-Xplained development boards, which used internal sensors (light and temperature) and a servo as an external component. The performance comparison was done by measuring ping time (the elapsed time to transmit data and receive a reply confirming that the data has been received). The experiments were divided into two scenarios: (1) a system loaded with many tasks, and (2) a system loaded with few tasks. The results show that communication is faster if the system is loaded with only a few tasks. The system with the RTOS outperformed the interrupt-based system in case (1), but lost to it in case (2).
Stochastic models for time series
Doukhan, Paul
2018-01-01
This book presents essential tools for modelling non-linear time series. The first part of the book describes the main standard tools of probability and statistics that directly apply to the time series context to obtain a wide range of modelling possibilities. Functional estimation and bootstrap are discussed, and stationarity is reviewed. The second part describes a number of tools from Gaussian chaos and proposes a tour of linear time series models. It goes on to address nonlinearity from polynomial or chaotic models for which explicit expansions are available, then turns to Markov and non-Markov linear models and discusses Bernoulli shifts time series models. Finally, the volume focuses on the limit theory, starting with the ergodic theorem, which is seen as the first step for statistics of time series. It defines the distributional range to obtain generic tools for limit theory under long or short-range dependences (LRD/SRD) and explains examples of LRD behaviours. More general techniques (central limit ...
van den Akker, R.
2007-01-01
This thesis adresses statistical problems in econometrics. The first part contributes statistical methodology for nonnegative integer-valued time series. The second part of this thesis discusses semiparametric estimation in copula models and develops semiparametric lower bounds for a large class of
A Time Series Forecasting Method
Directory of Open Access Journals (Sweden)
Wang Zhao-Yu
2017-01-01
This paper proposes a novel time series forecasting method based on a weighted self-constructing clustering technique. The weighted self-constructing clustering processes all the data patterns incrementally. If a data pattern is not similar enough to any existing cluster, it forms a new cluster of its own. However, if a data pattern is similar enough to an existing cluster, it is removed from the cluster it currently belongs to and added to the most similar cluster. During the clustering process, weights are learned for each cluster. Given a series of time-stamped data up to time t, we divide it into a set of training patterns. Using the weighted self-constructing clustering, the training patterns are grouped into a set of clusters. To estimate the value at time t + 1, we find the k nearest neighbors of the input pattern and use these k neighbors to form the estimate. Experimental results demonstrate the effectiveness of the proposed approach.
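The final estimation step described in this abstract can be sketched in pure Python: divide the series into lagged training patterns and estimate the value at t+1 from the k nearest neighbours of the current pattern. The clustering and cluster-weight learning of the paper are omitted here for brevity; this shows only the k-NN estimation it feeds into.

```python
# k-nearest-neighbour forecast over lagged training patterns.
def make_patterns(series, window):
    """(pattern, next value) pairs extracted from the series."""
    return [(tuple(series[i:i + window]), series[i + window])
            for i in range(len(series) - window)]

def knn_forecast(series, window=3, k=3):
    patterns = make_patterns(series, window)
    query = tuple(series[-window:])
    dist = lambda p: sum((a - b) ** 2 for a, b in zip(p, query))
    nearest = sorted(patterns, key=lambda pn: dist(pn[0]))[:k]
    return sum(nxt for _, nxt in nearest) / k

# On a perfectly periodic series every nearest neighbour agrees, so the
# forecast reproduces the next value of the cycle.
series = [1, 2, 3, 4] * 10
print(knn_forecast(series, window=3, k=3))
```

In the paper's full method the neighbours would be drawn from the learned clusters with their weights; the plain version above is the unweighted baseline.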
International Work-Conference on Time Series
Pomares, Héctor; Valenzuela, Olga
2017-01-01
This volume of selected and peer-reviewed contributions on the latest developments in time series analysis and forecasting updates the reader on topics such as analysis of irregularly sampled time series, multi-scale analysis of univariate and multivariate time series, linear and non-linear time series models, advanced time series forecasting methods, applications in time series analysis and forecasting, advanced methods and online learning in time series and high-dimensional and complex/big data time series. The contributions were originally presented at the International Work-Conference on Time Series, ITISE 2016, held in Granada, Spain, June 27-29, 2016. The series of ITISE conferences provides a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of comput...
Multiple Indicator Stationary Time Series Models.
Sivo, Stephen A.
2001-01-01
Discusses the propriety and practical advantages of specifying multivariate time series models in the context of structural equation modeling for time series and longitudinal panel data. For time series data, the multiple indicator model specification improves on classical time series analysis. For panel data, the multiple indicator model…
Grieger, I; Atkinson, G H
1985-09-24
An investigation of the photolytic conditions used to initiate and spectroscopically monitor the bacteriorhodopsin (BR) photocycle utilizing time-resolved resonance Raman (TR3) spectroscopy has revealed and characterized two photoinduced reactions that interrupt the thermal pathway. One reaction involves the photolytic interconversion of M-412 and M', and the other involves the direct photolytic conversion of the BR-570/K-590 photostationary mixture either to M-412 and M' or to M-like intermediates within 10 ns. The photolytic threshold conditions describing both reactions have been quantitatively measured and are discussed in terms of experimental parameters.
International Nuclear Information System (INIS)
Sinclair, Judith A.; Oates, Jason P.; Dale, Roger G.
1999-01-01
Purpose: The use of radiobiological modelling to examine the likely consequences of interruptions to radiotherapy schedules and to assess various compensatory measures. Methods and Materials: An effect-time graphical display, the BED-time chart, has been developed using the linear-quadratic (LQ) model. This is used to examine the effects on tumour and normal tissues of treatment interruption scenarios representative of clinical situations. The mathematical criteria governing successful salvage have also been drafted and applied to typical situations. Results: The successful salvage of an interrupted treatment is dependent on a number of interacting factors and the method presented here can be used to examine the trade-offs that exist. Although the mathematics may be complex, it is shown that the dilemmas posed by an interrupted treatment may be more easily appreciated with reference to BED-time charts. These may therefore have a useful role as a teaching aid for portraying a wider variety of radiotherapy problems and also in the documentation of interruptions to treatment and the measures taken to compensate for them. Conclusions: Interruptions to radiotherapy regimes are undesirable and compensatory measures need to be initiated as soon as possible after the gap, with a view to completing the amended treatment within the originally prescribed treatment time. Adequate compensation is particularly difficult for long gaps and gaps which occur towards the end of the scheduled treatment. Modelling exercises can help establish guidelines on the available windows of opportunity
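The BED-time idea discussed in this abstract can be sketched with the standard LQ model plus a repopulation correction: BED = n*d*(1 + d/(alpha/beta)) - K*(T - Tk) for overall time T beyond the kick-off time Tk. The parameter values below are typical textbook numbers used for illustration, not the paper's.

```python
# Tumour BED as a function of overall treatment time (LQ model with
# time-factor correction), the quantity plotted on a BED-time chart.
def tumour_bed(n_fractions, dose_per_fx, overall_days,
               alpha_beta=10.0, k_per_day=0.9, kickoff_day=28):
    """Biologically effective dose (Gy) with repopulation correction."""
    bed = n_fractions * dose_per_fx * (1.0 + dose_per_fx / alpha_beta)
    if overall_days > kickoff_day:
        bed -= k_per_day * (overall_days - kickoff_day)
    return bed

# 30 x 2 Gy completed on schedule (39 days) vs after a 7-day interruption.
on_schedule = tumour_bed(30, 2.0, overall_days=39)
prolonged = tumour_bed(30, 2.0, overall_days=46)
print(round(on_schedule, 1), round(prolonged, 1))
```

With these assumed parameters a 7-day gap after the kick-off time costs 0.9 * 7 = 6.3 Gy of tumour BED, which is why compensation should aim to finish within the originally prescribed overall time.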
A Course in Time Series Analysis
Peña, Daniel; Tsay, Ruey S
2011-01-01
New statistical methods and future directions of research in time series A Course in Time Series Analysis demonstrates how to build time series models for univariate and multivariate time series data. It brings together material previously available only in the professional literature and presents a unified view of the most advanced procedures available for time series model building. The authors begin with basic concepts in univariate time series, providing an up-to-date presentation of ARIMA models, including the Kalman filter, outlier analysis, automatic methods for building ARIMA models, a
The analysis of time series: an introduction
National Research Council Canada - National Science Library
Chatfield, Christopher
1989-01-01
… A variety of practical examples are given to support the theory. The book covers a wide range of time-series topics, including probability models for time series, Box-Jenkins forecasting, spectral analysis, linear systems and system identification...
Prediction and Geometry of Chaotic Time Series
National Research Council Canada - National Science Library
Leonardi, Mary
1997-01-01
This thesis examines the topic of chaotic time series. An overview of chaos, dynamical systems, and traditional approaches to time series analysis is provided, followed by an examination of state space reconstruction...
Global Population Density Grid Time Series Estimates
National Aeronautics and Space Administration — Global Population Density Grid Time Series Estimates provide a back-cast time series of population density grids based on the year 2000 population grid from SEDAC's...
Kolmogorov Space in Time Series Data
Kanjamapornkul, K.; Pinčák, R.
2016-01-01
We provide a proof that the space of time series data is a Kolmogorov space with the $T_{0}$-separation axiom, using the loop space of time series data. In our approach we define a cyclic coordinate of the intrinsic time scale of time series data after empirical mode decomposition. A spinor field of time series data comes from the rotation of data around the price and time axes, obtained by defining a new extra dimension for the time series data. We show that there exist eight hidden dimensions in Kolmogorov space for ...
Effective Feature Preprocessing for Time Series Forecasting
DEFF Research Database (Denmark)
Zhao, Junhua; Dong, Zhaoyang; Xu, Zhao
2006-01-01
Time series forecasting is an important area in data mining research. Feature preprocessing techniques have significant influence on forecasting accuracy and are therefore essential in a forecasting model. Although several feature preprocessing techniques have been applied in time series forecasting … performance in time series forecasting. Our experiments demonstrate that effective feature preprocessing can significantly enhance forecasting accuracy. This research can be useful guidance for researchers on effectively selecting feature preprocessing techniques and integrating them with time series forecasting models.
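Two common preprocessing steps of the kind compared in this abstract can be sketched in pure Python: z-score standardisation and first differencing, with the inverse needed to map forecasts back to the original scale. Which techniques the paper actually evaluates is not stated here, so these are representative examples.

```python
# Feature preprocessing helpers for time series forecasting.
from statistics import mean, pstdev

def standardise(series):
    """Z-score the series; return the transform and its (mean, std) params."""
    m, s = mean(series), pstdev(series)
    return [(x - m) / s for x in series], (m, s)

def difference(series):
    """First differences, removing level/trend before modelling."""
    return [b - a for a, b in zip(series, series[1:])]

def undifference(first_value, diffs):
    """Invert differencing to return forecasts to the original scale."""
    out = [first_value]
    for d in diffs:
        out.append(out[-1] + d)
    return out

series = [112, 118, 132, 129, 121, 135, 148, 148]
z, (m, s) = standardise(series)
d = difference(series)
print(undifference(series[0], d) == series)
```

Keeping the fitted parameters (here `(m, s)` and the first value) is what makes the preprocessing reversible, so forecast accuracy can be measured on the original scale.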
Time Series Analysis and Forecasting by Example
Bisgaard, Soren
2011-01-01
An intuition-based approach enables you to master time series analysis with ease Time Series Analysis and Forecasting by Example provides the fundamental techniques in time series analysis using various examples. By introducing necessary theory through examples that showcase the discussed topics, the authors successfully help readers develop an intuitive understanding of seemingly complicated time series models and their implications. The book presents methodologies for time series analysis in a simplified, example-based approach. Using graphics, the authors discuss each presented example in
Duality between Time Series and Networks
Campanharo, Andriana S. L. O.; Sirer, M. Irmak; Malmgren, R. Dean; Ramos, Fernando M.; Amaral, Luís A. Nunes.
2011-01-01
Studying the interaction between a system's components and the temporal evolution of the system are two common ways to uncover and characterize its internal workings. Recently, several maps from a time series to a network have been proposed with the intent of using network metrics to characterize time series. Although these maps demonstrate that different time series result in networks with distinct topological properties, it remains unclear how these topological properties relate to the original time series. Here, we propose a map from a time series to a network with an approximate inverse operation, making it possible to use network statistics to characterize time series and time series statistics to characterize networks. As a proof of concept, we generate an ensemble of time series ranging from periodic to random and confirm that application of the proposed map retains much of the information encoded in the original time series (or networks) after application of the map (or its inverse). Our results suggest that network analysis can be used to distinguish different dynamic regimes in time series and, perhaps more importantly, time series analysis can provide a powerful set of tools that augment the traditional network analysis toolkit to quantify networks in new and useful ways. PMID:21858093
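One simple map from a time series to a network, in the spirit of the approach above, partitions the values into Q quantile bins (the nodes) and weights directed edges by observed transition frequencies; a random walk on the resulting network is the approximate inverse. This is an illustrative sketch of a quantile-transition map, not necessarily the exact map of the paper.

```python
# Time series -> weighted transition network over quantile bins.
def quantile_bins(series, q):
    """Assign each value to one of q quantile bins (0..q-1)."""
    ranked = sorted(series)
    def bin_of(x):
        rank = sum(1 for v in ranked if v <= x) - 1
        return min(q - 1, rank * q // len(series))
    return [bin_of(x) for x in series]

def transition_network(series, q):
    """Row-stochastic matrix of transition frequencies between bins."""
    bins = quantile_bins(series, q)
    w = [[0.0] * q for _ in range(q)]
    for a, b in zip(bins, bins[1:]):
        w[a][b] += 1.0
    for row in w:                        # normalise rows to probabilities
        total = sum(row)
        if total:
            row[:] = [x / total for x in row]
    return w

# A strictly periodic series maps to a deterministic cycle on the network.
w = transition_network([0, 5, 9] * 20, q=3)
print(w)
```

Sampling a random walk from `w` and reading off bin indices recovers a series with the same transition statistics, which is the sense in which the map has an approximate inverse.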
A Review of Subsequence Time Series Clustering
Directory of Open Access Journals (Sweden)
Seyedjamal Zolhavarieh
2014-01-01
Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: preproof, interproof, and postproof period. Moreover, various state-of-the-art approaches in performing subsequence time series clustering are discussed under each of the following categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies.
Directory of Open Access Journals (Sweden)
Alison L Hill
2016-04-01
Monitoring the efficacy of novel reservoir-reducing treatments for HIV is challenging. The limited ability to sample and quantify latent infection means that supervised antiretroviral therapy (ART) interruption studies are generally required. Here we introduce a set of mathematical and statistical modeling tools to aid in the design and interpretation of ART-interruption trials. We show how the likely size of the remaining reservoir can be updated in real time as patients continue off treatment, by combining the output of laboratory assays with insights from models of reservoir dynamics and rebound. We design an optimal schedule for viral load sampling during interruption, whereby the frequency of follow-up can be decreased as patients continue off ART without rebound. While this scheme can minimize costs when the chance of rebound between visits is low, we find that the reservoir will be almost completely reseeded before rebound is detected unless sampling occurs at least every two weeks and the most sensitive viral load assays are used. We use simulated data to predict the clinical trial size needed to estimate treatment effects in the face of highly variable patient outcomes and imperfect reservoir assays. Our findings suggest that large numbers of patients, between 40 and 150, will be necessary to reliably estimate the reservoir-reducing potential of a new therapy and to compare this across interventions. As an example, we apply these methods to the two "Boston patients", recipients of allogeneic hematopoietic stem cell transplants who experienced large reductions in latent infection and underwent ART interruption. We argue that the timing of viral rebound was not particularly surprising given the information available before treatment cessation. Additionally, we show how other clinical data can be used to estimate the relative contribution that remaining HIV+ cells in the recipient versus newly infected cells from the donor made to the
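The scheduling idea in this abstract can be sketched under a strong simplifying assumption: if rebound occurs at a constant per-day hazard lam, the probability of rebounding before the next visit after dt days is 1 - exp(-lam * dt), so capping the per-interval rebound probability at p_max gives the longest safe interval. The hazard values below are invented for illustration.

```python
# Follow-up interval that keeps the per-interval rebound probability bounded.
import math

def next_visit_interval(lam_per_day, p_max=0.1):
    """Longest interval (days) with P(rebound before next visit) <= p_max,
    assuming an exponential time-to-rebound with rate lam_per_day."""
    return -math.log(1.0 - p_max) / lam_per_day

# Hypothetical declining hazard estimates as time off ART accumulates
# without rebound: visits can be spaced progressively further apart.
for lam in (0.05, 0.02, 0.005):
    print(round(next_visit_interval(lam), 1))
```

This captures the paper's qualitative design: as continued suppression lowers the estimated rebound hazard, the safe sampling interval grows, trading assay cost against the risk of missing a rebound between visits.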
DEFF Research Database (Denmark)
Korsholm, Stephan; Schoeberl, Martin; Ravn, Anders Peter
2008-01-01
An important part of implementing device drivers is to control the interrupt facilities of the hardware platform and to program interrupt handlers. Current methods for handling interrupts in Java use a server thread waiting for the VM to signal an interrupt occurrence. This means that the interrupt is handled at a later time, which has some disadvantages. We present constructs that allow interrupts to be handled directly and not at a later point decided by a scheduler. A desirable feature of our approach is that we do not require a native middleware layer but can handle interrupts entirely with Java code. We have implemented our approach using an interpreter and a Java processor, and give an example demonstrating its use.
Data mining in time series databases
Kandel, Abraham; Bunke, Horst
2004-01-01
Adding the time dimension to real-world databases produces Time SeriesDatabases (TSDB) and introduces new aspects and difficulties to datamining and knowledge discovery. This book covers the state-of-the-artmethodology for mining time series databases. The novel data miningmethods presented in the book include techniques for efficientsegmentation, indexing, and classification of noisy and dynamic timeseries. A graph-based method for anomaly detection in time series isdescribed and the book also studies the implications of a novel andpotentially useful representation of time series as strings. Theproblem of detecting changes in data mining models that are inducedfrom temporal databases is additionally discussed.
International Work-Conference on Time Series
Pomares, Héctor
2016-01-01
This volume presents selected peer-reviewed contributions from The International Work-Conference on Time Series, ITISE 2015, held in Granada, Spain, July 1-3, 2015. It discusses topics in time series analysis and forecasting, advanced methods and online learning in time series, high-dimensional and complex/big data time series as well as forecasting in real problems. The International Work-Conferences on Time Series (ITISE) provide a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of computer science, mathematics, statistics and econometrics.
BRITS: Bidirectional Recurrent Imputation for Time Series
Cao, Wei; Wang, Dong; Li, Jian; Zhou, Hao; Li, Lei; Li, Yitan
2018-01-01
Time series are widely used as signals in many classification/regression tasks. It is ubiquitous that time series contains many missing values. Given multiple correlated time series data, how to fill in missing values and to predict their class labels? Existing imputation methods often impose strong assumptions of the underlying data generating process, such as linear dynamics in the state space. In this paper, we propose BRITS, a novel method based on recurrent neural networks for missing va...
Geometric noise reduction for multivariate time series.
Mera, M Eugenia; Morán, Manuel
2006-03-01
We propose an algorithm for the reduction of observational noise in chaotic multivariate time series. The algorithm is based on a maximum likelihood criterion, and its goal is to reduce the mean distance of the points of the cleaned time series to the attractor. We give evidence of the convergence of the empirical measure associated with the cleaned time series to the underlying invariant measure, implying the possibility to predict the long run behavior of the true dynamics.
Frontiers in Time Series and Financial Econometrics
Ling, S.; McAleer, M.J.; Tong, H.
2015-01-01
Two of the fastest growing frontiers in econometrics and quantitative finance are time series and financial econometrics. Significant theoretical contributions to financial econometrics have been made by experts in statistics, econometrics, mathematics, and time series analysis. The purpose of this special issue of the journal on "Frontiers in Time Series and Financial Econometrics" is to highlight several areas of research by leading academics in which novel methods have contrib...
Neural Network Models for Time Series Forecasts
Tim Hill; Marcus O'Connor; William Remus
1996-01-01
Neural networks have been advocated as an alternative to traditional statistical forecasting methods. In the present experiment, time series forecasts produced by neural networks are compared with forecasts from six statistical time series methods generated in a major forecasting competition (Makridakis et al. [Makridakis, S., A. Anderson, R. Carbone, R. Fildes, M. Hibon, R. Lewandowski, J. Newton, E. Parzen, R. Winkler. 1982. The accuracy of extrapolation (time series) methods: Results of a ...
Forecasting Enrollments with Fuzzy Time Series.
Song, Qiang; Chissom, Brad S.
The concept of fuzzy time series is introduced and used to forecast the enrollment of a university. Fuzzy time series, an aspect of fuzzy set theory, forecasts enrollment using a first-order time-invariant model. To evaluate the model, the conventional linear regression technique is applied and the predicted values obtained are compared to the…
Analysis of Heavy-Tailed Time Series
DEFF Research Database (Denmark)
Xie, Xiaolei
This thesis is about the analysis of heavy-tailed time series. We discuss tail properties of real-world equity return series and investigate the possibility that a single tail index is shared by all return series of actively traded equities in a market. Conditions for this hypothesis to be true are identified. We study the eigenvalues and eigenvectors of sample covariance and sample auto-covariance matrices of multivariate heavy-tailed time series, particularly for time series with very high dimensions. Asymptotic approximations of the eigenvalues and eigenvectors of such matrices are found and expressed in terms of the parameters of the dependence structure, among others. Furthermore, we study an importance sampling method for estimating rare-event probabilities of multivariate heavy-tailed time series generated by matrix recursion. We show that the proposed algorithm is efficient in the sense...
Statistical criteria for characterizing irradiance time series.
Energy Technology Data Exchange (ETDEWEB)
Stein, Joshua S.; Ellis, Abraham; Hansen, Clifford W.
2010-10-01
We propose and examine several statistical criteria for characterizing time series of solar irradiance. Time series of irradiance are used in analyses that seek to quantify the performance of photovoltaic (PV) power systems over time. Time series of irradiance are either measured or are simulated using models. Simulations of irradiance are often calibrated to or generated from statistics for observed irradiance and simulations are validated by comparing the simulation output to the observed irradiance. Criteria used in this comparison should derive from the context of the analyses in which the simulated irradiance is to be used. We examine three statistics that characterize time series and their use as criteria for comparing time series. We demonstrate these statistics using observed irradiance data recorded in August 2007 in Las Vegas, Nevada, and in June 2009 in Albuquerque, New Mexico.
The foundations of modern time series analysis
Mills, Terence C
2011-01-01
This book develops the analysis of time series from its formal beginnings in the 1890s through to Box and Jenkins' watershed publication in 1970, showing how these methods laid the foundations for the modern techniques of time series analysis in use today.
Lag space estimation in time series modelling
DEFF Research Database (Denmark)
Goutte, Cyril
1997-01-01
The purpose of this article is to investigate some techniques for finding the relevant lag-space, i.e. input information, for time series modelling. This is an important aspect of time series modelling, as it conditions the design of the model through the regressor vector a.k.a. the input layer...
Entropic Analysis of Electromyography Time Series
Kaufman, Miron; Sung, Paul
2005-03-01
We are in the process of assessing the effectiveness of fractal and entropic measures for the diagnosis of low back pain from surface electromyography (EMG) time series. Surface EMG is used to assess patients with low back pain. In a typical EMG measurement, the voltage is measured every millisecond. We observed back muscle fatigue over one minute, which results in a time series with 60,000 entries. We characterize the complexity of time series by computing the time dependence of the Shannon entropy. The analysis of the time series from the relevant muscles of healthy and low back pain (LBP) individuals provides evidence that the level of variability of back muscle activity is much larger for healthy individuals than for individuals with LBP. In general, the time dependence of the entropy shows a crossover from a diffusive regime to a regime characterized by long-time correlations (self-organization) at about 0.01 s.
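The windowed Shannon-entropy computation this abstract describes can be illustrated in a few lines. The sketch below is generic, not the authors' exact procedure: it bins amplitudes within each window and tracks entropy across successive windows; the bin count, window length, and step are illustrative assumptions.

```python
import math

def shannon_entropy(window, n_bins=8):
    """Shannon entropy (bits) of the amplitude distribution in one window."""
    lo, hi = min(window), max(window)
    if hi == lo:
        return 0.0  # constant signal carries no amplitude information
    counts = [0] * n_bins
    for x in window:
        idx = min(int((x - lo) / (hi - lo) * n_bins), n_bins - 1)
        counts[idx] += 1
    n = len(window)
    return -sum((c / n) * math.log2(c / n) for c in counts if c)

def entropy_profile(series, window=200, step=100):
    """Entropy of successive overlapping windows -> its time dependence."""
    return [shannon_entropy(series[i:i + window])
            for i in range(0, len(series) - window + 1, step)]
```

A high-variability signal yields entropy near the log2(n_bins) ceiling, while a flat signal yields zero, which is the kind of contrast the abstract reports between healthy and LBP muscle activity.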
Correlation and multifractality in climatological time series
International Nuclear Information System (INIS)
Pedron, I T
2010-01-01
Climate can be described by statistical analysis of mean values of atmospheric variables over a period. It is possible to detect correlations in climatological time series and to classify their behavior. In this work the Hurst exponent, which can characterize correlation and persistence in time series, is obtained by using the Detrended Fluctuation Analysis (DFA) method. Data series of temperature, precipitation, humidity, solar radiation, wind speed, maximum squall, atmospheric pressure and random series are studied. Furthermore, the multifractality of such series is analyzed by applying the Multifractal Detrended Fluctuation Analysis (MF-DFA) method. The results indicate the presence of correlation (persistent character) in all climatological series, as well as multifractality. A larger and longer set of data could provide better results, indicating the universality of the exponents.
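The DFA procedure referenced here (integrate the series, detrend it in boxes, regress the fluctuation against box size on log-log axes) can be sketched in pure Python. The box-size grid is an illustrative assumption; a production analysis would use a vetted library.

```python
import math

def dfa_alpha(series, scales=(4, 8, 16, 32, 64)):
    """Estimate the DFA scaling exponent (Hurst-like) of a 1-D series."""
    # Profile: cumulative sum of the mean-subtracted series.
    mean = sum(series) / len(series)
    profile, s = [], 0.0
    for x in series:
        s += x - mean
        profile.append(s)
    log_s, log_f = [], []
    for scale in scales:
        n_boxes = len(profile) // scale
        if n_boxes < 2:
            continue
        msq_sum = 0.0
        for b in range(n_boxes):
            seg = profile[b * scale:(b + 1) * scale]
            # Least-squares linear detrend inside the box.
            t = list(range(scale))
            tm = sum(t) / scale
            ym = sum(seg) / scale
            slope = (sum((ti - tm) * (yi - ym) for ti, yi in zip(t, seg))
                     / sum((ti - tm) ** 2 for ti in t))
            resid = [yi - (ym + slope * (ti - tm)) for ti, yi in zip(t, seg)]
            msq_sum += sum(r * r for r in resid) / scale
        log_s.append(math.log(scale))
        log_f.append(0.5 * math.log(msq_sum / n_boxes))  # log F(scale)
    # Slope of log F(s) versus log s is the DFA exponent alpha.
    n = len(log_s)
    sm, fm = sum(log_s) / n, sum(log_f) / n
    return (sum((a - sm) * (b - fm) for a, b in zip(log_s, log_f))
            / sum((a - sm) ** 2 for a in log_s))
```

For uncorrelated noise alpha is near 0.5; persistent series, like the climatological records in the abstract, give alpha above 0.5.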
Homogenising time series: beliefs, dogmas and facts
Domonkos, P.
2011-06-01
In recent decades various homogenisation methods have been developed, but the real effects of their application on time series are still not sufficiently known. The ongoing COST action HOME (COST ES0601) is devoted to revealing the real impacts of homogenisation methods in more detail and with higher confidence than before. As part of the COST activity, a benchmark dataset was built whose characteristics closely approximate those of real networks of observed time series. This dataset offers a much better opportunity than ever before to test the wide variety of homogenisation methods and to analyse the real effects of selected theoretical recommendations. Empirical results show that real observed time series usually include several inhomogeneities of different sizes. Small inhomogeneities often have statistical characteristics similar to natural changes caused by climatic variability, thus the pure application of the classic theory, that change-points of observed time series can be found and corrected one by one, is impossible. However, after homogenisation the linear trends, seasonal changes and long-term fluctuations of time series are usually much closer to reality than in raw time series. Some problems around detecting multiple structures of inhomogeneities, as well as problems of time series comparison within homogenisation procedures, are discussed briefly in the study.
Network structure of multivariate time series.
Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito
2015-10-21
Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exist, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows one to extract information on a high-dimensional dynamical system through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow one to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.
Modeling Time Series Data for Supervised Learning
Baydogan, Mustafa Gokce
2012-01-01
Temporal data are increasingly prevalent and important in analytics. Time series (TS) data are chronological sequences of observations and an important class of temporal data. Fields such as medicine, finance, learning science and multimedia naturally generate TS data. Each series provides a high-dimensional data vector that challenges the learning…
Time series modeling, computation, and inference
Prado, Raquel
2010-01-01
The authors systematically develop a state-of-the-art analysis and modeling of time series. "… this book is well organized and well written. The authors present various statistical models for engineers to solve problems in time series analysis. Readers no doubt will learn state-of-the-art techniques from this book." (Hsun-Hsien Chang, Computing Reviews, March 2012) "My favorite chapters were on dynamic linear models and vector AR and vector ARMA models." (William Seaver, Technometrics, August 2011) "… a very modern entry to the field of time-series modelling, with a rich reference list of the current lit…"
Time Series Analysis Forecasting and Control
Box, George E P; Reinsel, Gregory C
2011-01-01
A modernized new edition of one of the most trusted books on time series analysis. Since publication of the first edition in 1970, Time Series Analysis has served as one of the most influential and prominent works on the subject. This new edition maintains its balanced presentation of the tools for modeling and analyzing time series and also introduces the latest developments that have occurred in the field over the past decade through applications from areas such as business, finance, and engineering. The Fourth Edition provides a clearly written exploration of the key methods for building, cl…
Visibility Graph Based Time Series Analysis.
Stephen, Mutua; Gu, Changgui; Yang, Huijie
2015-01-01
Network based time series analysis has made considerable achievements in recent years. By mapping mono- or multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states, and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of networks of networks.
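The natural visibility graph, the building block that methods of this kind map series segments onto, can be constructed with a direct O(n^2) scan. This is a generic sketch of the standard visibility criterion, not the authors' full network-of-networks pipeline:

```python
def visibility_graph(series):
    """Natural visibility graph of a time series: node i = sample i;
    edge (i, j) iff every sample between them lies strictly below the
    straight line joining (i, series[i]) and (j, series[j])."""
    n = len(series)
    edges = set()
    for i in range(n - 1):
        for j in range(i + 1, n):
            visible = all(
                series[k] < series[j]
                + (series[i] - series[j]) * (j - k) / (j - i)
                for k in range(i + 1, j))
            if visible:
                edges.add((i, j))
    return edges
```

Consecutive samples are always mutually visible, while a tall intermediate peak blocks the line of sight between the points on either side of it.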
Data Mining Smart Energy Time Series
Directory of Open Access Journals (Sweden)
Janina POPEANGA
2015-07-01
With the advent of smart metering technology the amount of energy data will increase significantly, and the utilities industry will have to face another big challenge: to find relationships within time-series data and, even more, to analyze huge numbers of time series to find useful patterns and trends with fast or even real-time response. This study makes a small review of the literature in the field, aiming to demonstrate how essential the application of data mining techniques to time series is for making the best use of this large quantity of data, despite all the difficulties. The most important time series data mining techniques are also presented, highlighting their applicability in the energy domain.
Time series prediction: statistical and neural techniques
Zahirniak, Daniel R.; DeSimio, Martin P.
1996-03-01
In this paper we compare the performance of nonlinear neural network techniques to those of linear filtering techniques in the prediction of time series. Specifically, we compare the results of using the nonlinear systems, known as multilayer perceptron and radial basis function neural networks, with the results obtained using the conventional linear Wiener filter, Kalman filter and Widrow-Hoff adaptive filter in predicting future values of stationary and non-stationary time series. Our results indicate the performance of each type of system is heavily dependent upon the form of the time series being predicted and the size of the system used. In particular, the linear filters perform adequately for linear or near-linear processes while the nonlinear systems perform better for nonlinear processes. Since the linear systems take much less time to develop, they should be tried prior to using the nonlinear systems when the linearity properties of the time series process are unknown.
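Of the linear filters compared, the Widrow-Hoff (LMS) adaptive filter is the simplest to sketch: predict the next sample as a weighted sum of recent samples and nudge the weights toward the observed error. The filter order and step size below are illustrative assumptions, not the paper's settings.

```python
def lms_predict(series, order=4, mu=0.05):
    """One-step-ahead prediction with a Widrow-Hoff (LMS) adaptive filter.
    Returns (predictions, errors), aligned with series[order:]."""
    w = [0.0] * order
    preds, errs = [], []
    for t in range(order, len(series)):
        x = series[t - order:t]                      # most recent samples
        y_hat = sum(wi * xi for wi, xi in zip(w, x))  # linear prediction
        e = series[t] - y_hat
        w = [wi + mu * e * xi for wi, xi in zip(w, x)]  # LMS weight update
        preds.append(y_hat)
        errs.append(e)
    return preds, errs
```

On a predictable (e.g. sinusoidal) input the squared error shrinks as the weights converge, which is the "adequate for near-linear processes" behavior the abstract describes.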
Detecting nonlinear structure in time series
International Nuclear Information System (INIS)
Theiler, J.
1991-01-01
We describe an approach for evaluating the statistical significance of evidence for nonlinearity in a time series. The formal application of our method requires the careful statement of a null hypothesis which characterizes a candidate linear process, the generation of an ensemble of "surrogate" data sets which are similar to the original time series but consistent with the null hypothesis, and the computation of a discriminating statistic for the original and for each of the surrogate data sets. The idea is to test the original time series against the null hypothesis by checking whether the discriminating statistic computed for the original time series differs significantly from the statistics computed for each of the surrogate sets. While some data sets very cleanly exhibit low-dimensional chaos, there are many cases where the evidence is sketchy and difficult to evaluate. We hope to provide a framework within which such claims of nonlinearity can be evaluated.
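The surrogate-data logic described here can be illustrated with the simplest possible null hypothesis: temporally independent data, for which shuffling the series produces valid surrogates. The paper's actual nulls are linear processes (which require, e.g., phase-randomized surrogates); the shuffle variant below only conveys the structure of the test, and the lag-1 autocorrelation statistic is an illustrative choice.

```python
import random

def lag1_autocorr(x):
    """Lag-1 autocorrelation, used here as the discriminating statistic."""
    n = len(x)
    m = sum(x) / n
    var = sum((xi - m) ** 2 for xi in x)
    return sum((x[i] - m) * (x[i + 1] - m) for i in range(n - 1)) / var

def surrogate_test(series, n_surrogates=99, seed=1):
    """One-sided p-value: rank of the observed statistic among surrogates.
    Null hypothesis (simplest case): samples are temporally independent."""
    rng = random.Random(seed)
    observed = abs(lag1_autocorr(series))
    exceed = 0
    for _ in range(n_surrogates):
        s = series[:]
        rng.shuffle(s)   # destroys temporal structure, keeps amplitudes
        if abs(lag1_autocorr(s)) >= observed:
            exceed += 1
    return (exceed + 1) / (n_surrogates + 1)
```

A strongly autocorrelated series yields a statistic no shuffled surrogate can match, so the null of temporal independence is rejected.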
Nonparametric factor analysis of time series
Rodríguez-Poo, Juan M.; Linton, Oliver Bruce
1998-01-01
We introduce a nonparametric smoothing procedure for nonparametric factor analysis of multivariate time series. The asymptotic properties of the proposed procedures are derived. We present an application based on the residuals from the Fair macromodel.
Applied time series analysis and innovative computing
Ao, Sio-Iong
2010-01-01
This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.
Measuring multiscaling in financial time-series
International Nuclear Information System (INIS)
Buonocore, R.J.; Aste, T.; Di Matteo, T.
2016-01-01
We discuss the origin of multiscaling in financial time-series and investigate how to best quantify it. Our methodology consists in separating the different sources of measured multifractality by analyzing the multi/uni-scaling behavior of synthetic time-series with known properties. We use the results from the synthetic time-series to interpret the measure of multifractality of real log-returns time-series. The main finding is that the aggregation horizon of the returns can introduce a strong bias effect on the measure of multifractality. This effect can become especially important when returns distributions have power law tails with exponents in the range (2, 5). We discuss the right aggregation horizon to mitigate this bias.
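One standard way to quantify the multiscaling discussed here is the generalized Hurst exponent H(q), estimated from the scaling of q-th order structure functions of the returns; if H(q) varies with q, the series multiscales. A minimal sketch (the lag grid and simple log-log regression are illustrative assumptions, not the authors' estimator):

```python
import math

def generalized_hurst(series, q=2.0, lags=(1, 2, 4, 8, 16)):
    """Generalized Hurst exponent H(q) from structure-function scaling:
    E|x(t+tau) - x(t)|^q ~ tau^(q*H(q))."""
    log_l, log_s = [], []
    for lag in lags:
        inc = [abs(series[i + lag] - series[i])
               for i in range(len(series) - lag)]
        sq = sum(v ** q for v in inc) / len(inc)   # q-th structure function
        log_l.append(math.log(lag))
        log_s.append(math.log(sq))
    # Log-log regression slope equals q * H(q).
    n = len(lags)
    lm, sm = sum(log_l) / n, sum(log_s) / n
    slope = (sum((a - lm) * (b - sm) for a, b in zip(log_l, log_s))
             / sum((a - lm) ** 2 for a in log_l))
    return slope / q
```

For uni-scaling processes such as Brownian motion, H(q) stays flat at 0.5 for all q; the bias effects the abstract warns about show up precisely in how this estimate changes with the aggregation horizon.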
Complex network approach to fractional time series
Energy Technology Data Exchange (ETDEWEB)
Manshour, Pouya [Physics Department, Persian Gulf University, Bushehr 75169 (Iran, Islamic Republic of)
2015-10-15
In order to extract correlation information inherited in stochastic time series, the visibility graph algorithm has been recently proposed, by which a time series can be mapped onto a complex network. We demonstrate that the visibility algorithm is not an appropriate one to study the correlation aspects of a time series. We then employ the horizontal visibility algorithm, as a much simpler one, to map fractional processes onto complex networks. The degree distributions are shown to have parabolic exponential forms with Hurst dependent fitting parameter. Further, we take into account other topological properties such as maximum eigenvalue of the adjacency matrix and the degree assortativity, and show that such topological quantities can also be used to predict the Hurst exponent, with an exception for anti-persistent fractional Gaussian noises. To solve this problem, we take into account the Spearman correlation coefficient between nodes' degrees and their corresponding data values in the original time series.
Multivariate Time Series Decomposition into Oscillation Components.
Matsuda, Takeru; Komaki, Fumiyasu
2017-08-01
Many time series are considered to be a superposition of several oscillation components. We have proposed a method for decomposing univariate time series into oscillation components and estimating their phases (Matsuda & Komaki, 2017). In this study, we extend that method to multivariate time series. We assume that several oscillators underlie the given multivariate time series and that each variable corresponds to a superposition of the projections of the oscillators. Thus, the oscillators superpose on each variable with amplitude and phase modulation. Based on this idea, we develop Gaussian linear state-space models and use them to decompose the given multivariate time series. The model parameters are estimated from data using the empirical Bayes method, and the number of oscillators is determined using the Akaike information criterion. Therefore, the proposed method extracts underlying oscillators in a data-driven manner and enables investigation of phase dynamics in a given multivariate time series. Numerical results show the effectiveness of the proposed method. From monthly mean north-south sunspot number data, the proposed method reveals an interesting phase relationship.
Introduction to time series analysis and forecasting
Montgomery, Douglas C; Kulahci, Murat
2008-01-01
An accessible introduction to the most current thinking in and practicality of forecasting techniques in the context of time-oriented data. Analyzing time-oriented data and forecasting are among the most important problems that analysts face across many fields, ranging from finance and economics to production operations and the natural sciences. As a result, there is a widespread need for large groups of people in a variety of fields to understand the basic concepts of time series analysis and forecasting. Introduction to Time Series Analysis and Forecasting presents the time series analysis branch of applied statistics as the underlying methodology for developing practical forecasts, and it also bridges the gap between theory and practice by equipping readers with the tools needed to analyze time-oriented data and construct useful, short- to medium-term, statistically based forecasts.
Detecting chaos in irregularly sampled time series.
Kulp, C W
2013-09-01
Recently, Wiebe and Virgin [Chaos 22, 013136 (2012)] developed an algorithm which detects chaos by analyzing a time series' power spectrum, computed using the Discrete Fourier Transform (DFT). Their algorithm, like other time series characterization algorithms, requires that the time series be regularly sampled. Real-world data, however, are often irregularly sampled, thus making the detection of chaotic behavior difficult or impossible with those methods. In this paper, a characterization algorithm is presented which effectively detects chaos in irregularly sampled time series. The work presented here is a modification of Wiebe and Virgin's algorithm that uses the Lomb-Scargle Periodogram (LSP) to compute a series' power spectrum instead of the DFT. The DFT is not appropriate for irregularly sampled time series, but the LSP is capable of computing the frequency content of irregularly sampled data. Furthermore, a new method of analyzing the power spectrum is developed, which can be useful for differentiating between chaotic and non-chaotic behavior. The new characterization algorithm is successfully applied to irregularly sampled data generated by a model as well as data consisting of observations of variable stars.
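The Lomb-Scargle periodogram at the heart of this modification can be computed directly from its classical definition. A minimal pure-Python sketch follows (in practice one would use a library implementation, e.g. from scipy or astropy; the frequency grid here is an illustrative choice):

```python
import math

def lomb_scargle(t, y, freqs):
    """Classical Lomb-Scargle periodogram for irregularly sampled data.
    t, y: sample times and values; freqs: angular frequencies to evaluate."""
    ym = sum(y) / len(y)
    yc = [yi - ym for yi in y]          # remove the mean
    power = []
    for w in freqs:
        # Time offset tau makes the periodogram invariant to time shifts.
        s2 = sum(math.sin(2 * w * ti) for ti in t)
        c2 = sum(math.cos(2 * w * ti) for ti in t)
        tau = math.atan2(s2, c2) / (2 * w)
        cos_t = [math.cos(w * (ti - tau)) for ti in t]
        sin_t = [math.sin(w * (ti - tau)) for ti in t]
        ct = sum(yi * c for yi, c in zip(yc, cos_t))
        st = sum(yi * s for yi, s in zip(yc, sin_t))
        cc = sum(c * c for c in cos_t)
        ss = sum(s * s for s in sin_t)
        power.append(0.5 * (ct * ct / cc + st * st / ss))
    return power
```

Unlike the DFT, the sums above need no evenly spaced grid, so a sinusoid sampled at random times still produces a clear spectral peak at its true frequency.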
Clinical and epidemiological rounds. Time series
Directory of Open Access Journals (Sweden)
León-Álvarez, Alba Luz
2016-07-01
Analysis of time series is a technique that involves the study of individuals or groups observed at successive moments in time. This type of analysis allows the study of potential causal relationships between different variables that change over time and relate to each other. It is the most important technique for making inferences about the future, predicting on the basis of what has happened in the past, and it is applied in different disciplines of knowledge. Here we discuss the different components of time series, the analysis technique, and specific examples in health research.
Time Series Forecasting with Missing Values
Directory of Open Access Journals (Sweden)
Shin-Fu Wu
2015-11-01
Time series prediction has become more popular in various kinds of applications such as weather prediction, control engineering, financial analysis, industrial monitoring, etc. To deal with real-world problems, we are often faced with missing values in the data due to sensor malfunctions or human errors. Traditionally, the missing values are simply omitted or replaced by means of imputation methods. However, omitting those missing values may cause temporal discontinuity, while imputation methods may alter the original time series. In this study, we propose a novel forecasting method based on the least squares support vector machine (LSSVM). We employ input patterns with temporal information, defined as a local time index (LTI). Time series data as well as local time indexes are fed to the LSSVM for forecasting without imputation. We compare the forecasting performance of our method with that of other imputation methods. Experimental results show that the proposed method is promising and worth further investigation.
Efficient Approximate OLAP Querying Over Time Series
DEFF Research Database (Denmark)
Perera, Kasun Baruhupolage Don Kasun Sanjeewa; Hahmann, Martin; Lehner, Wolfgang
2016-01-01
The ongoing trend for data gathering not only produces larger volumes of data, but also increases the variety of recorded data types. Out of these, especially time series, e.g. various sensor readings, have attracted attention in the domains of business intelligence and decision making. As OLAP queries play a major role in these domains, it is desirable to also execute them on time series data. While this is not a problem on the conceptual level, it can become a bottleneck with regard to query run-time. In general, processing OLAP queries gets more computationally intensive as the volume of data grows. This is a particular problem when querying time series data, which generally contains multiple measures recorded at fine time granularities. Usually, this issue is addressed either by scaling up hardware or by employing workload based query optimization techniques. However, these solutions…
Time averaging, ageing and delay analysis of financial time series
Cherstvy, Andrey G.; Vinod, Deepak; Aghion, Erez; Chechkin, Aleksei V.; Metzler, Ralf
2017-06-01
We introduce three strategies for the analysis of financial time series based on time averaged observables. These comprise the time averaged mean squared displacement (MSD) as well as the ageing and delay time methods for varying fractions of the financial time series. We explore these concepts via statistical analysis of historic time series for several Dow Jones Industrial indices for the period from the 1960s to 2015. Remarkably, we discover a simple universal law for the delay time averaged MSD. The observed features of the financial time series dynamics agree well with our analytical results for the time averaged measurables for geometric Brownian motion, underlying the famed Black-Scholes-Merton model. The concepts we promote here are shown to be useful for financial data analysis and enable one to unveil new universal features of stock market dynamics.
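The time averaged MSD that anchors these three strategies is straightforward to compute: for a lag Δ, average the squared displacement x(t+Δ) - x(t) along the whole trajectory. A minimal sketch (for ordinary Brownian motion, as in the Black-Scholes-Merton benchmark the abstract mentions, the result grows linearly in the lag):

```python
def ta_msd(x, lag):
    """Time averaged mean squared displacement of trajectory x at a lag."""
    n = len(x)
    return sum((x[t + lag] - x[t]) ** 2 for t in range(n - lag)) / (n - lag)
```

Evaluating `ta_msd` over a grid of lags, and over trailing windows of varying length for the ageing and delay variants, reproduces the kind of scaling curves the authors analyse for the Dow Jones indices.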
Highly comparative time-series analysis: the empirical structure of time series and their methods.
Fulcher, Ben D; Little, Max A; Jones, Nick S
2013-06-06
The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
Turbulencelike Behavior of Seismic Time Series
International Nuclear Information System (INIS)
Manshour, P.; Saberi, S.; Sahimi, Muhammad; Peinke, J.; Pacheco, Amalio F.; Rahimi Tabar, M. Reza
2009-01-01
We report on a stochastic analysis of Earth's vertical velocity time series by using methods originally developed for complex hierarchical systems and, in particular, for turbulent flows. Analysis of the fluctuations of the detrended increments of the series reveals a pronounced transition in their probability density function from Gaussian to non-Gaussian. The transition occurs 5-10 hours prior to a moderate or large earthquake, hence representing a new and reliable precursor for detecting such earthquakes
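A common scalar summary of the Gaussian-to-non-Gaussian transition in increment distributions reported here is the excess kurtosis, which is 0 for a Gaussian and positive for heavy-tailed distributions. The sketch below is a generic illustration of that summary, not the authors' full stochastic analysis:

```python
def excess_kurtosis(x):
    """Excess kurtosis of a sample: ~0 for Gaussian, > 0 for heavy tails."""
    n = len(x)
    m = sum(x) / n
    m2 = sum((xi - m) ** 2 for xi in x) / n
    m4 = sum((xi - m) ** 4 for xi in x) / n
    return m4 / (m2 * m2) - 3.0

def increments(series, lag=1):
    """Increments of a series at a given lag, e.g. of detrended velocity."""
    return [series[i + lag] - series[i] for i in range(len(series) - lag)]
```

Tracking `excess_kurtosis(increments(window))` over sliding windows would flag the kind of departure from Gaussianity that the authors observe hours before an earthquake.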
Snow, Marcellus S.
1989-09-01
A mathematical model is presented of costs and operational factors involved in provision for service interruptions of both a mature and typically large incumbent satellite system and of a smaller, more recently operational system. The equation expresses the required launch frequency for the new system as a function of the launch spacing of the mature system; the time disparity between the inauguration of the two systems; and the rate of capacity depreciation. In addition, a technique is presented to compare the relative extent to which the discounted costs of the new system exceed those of the mature system in furnishing the same effective capacity in orbit, and thus the same service liability, at a given point in time. It is determined that a mature incumbent communications satellite system, having more capacity in orbit, will on balance have a lower probability of service interruption than a newer, smaller system.
Introduction to time series analysis and forecasting
Montgomery, Douglas C; Kulahci, Murat
2015-01-01
Praise for the First Edition: "…[t]he book is great for readers who need to apply the methods and models presented but have little background in mathematics and statistics." (MAA Reviews) Thoroughly updated throughout, Introduction to Time Series Analysis and Forecasting, Second Edition presents the underlying theories of time series analysis that are needed to analyze time-oriented data and construct real-world short- to medium-term statistical forecasts. Authored by highly experienced academics and professionals in engineering statistics, the Second Edition features discussions on both…
Time series modeling in traffic safety research.
Lavrenz, Steven M; Vlahogianni, Eleni I; Gkritza, Konstantina; Ke, Yue
2018-08-01
The use of statistical models for analyzing traffic safety (crash) data has been well-established. However, time series techniques have traditionally been underrepresented in the corresponding literature, due to challenges in data collection, along with a limited knowledge of proper methodology. In recent years, new types of high-resolution traffic safety data, especially in measuring driver behavior, have made time series modeling techniques an increasingly salient topic of study. Yet there remains a dearth of information to guide analysts in their use. This paper provides an overview of the state of the art in using time series models in traffic safety research, and discusses some of the fundamental techniques and considerations in classic time series modeling. It also presents ongoing and future opportunities for expanding the use of time series models, and explores newer modeling techniques, including computational intelligence models, which hold promise in effectively handling ever-larger data sets. The information contained herein is meant to guide safety researchers in understanding this broad area of transportation data analysis, and provide a framework for understanding safety trends that can influence policy-making.
Forecasting autoregressive time series under changing persistence
DEFF Research Database (Denmark)
Kruse, Robinson
Changing persistence in time series models means that a structural change from nonstationarity to stationarity, or vice versa, occurs over time. Such a change has important implications for forecasting, as neglecting it may lead to inaccurate model predictions. This paper derives generally applicable…
Building Chaotic Model From Incomplete Time Series
Siek, Michael; Solomatine, Dimitri
2010-05-01
This paper presents a number of novel techniques for building a predictive chaotic model from incomplete time series. A predictive chaotic model is built by reconstructing the time-delayed phase space from the observed time series, and the prediction is made by a global model or by adaptive local models based on the dynamical neighbors found in the reconstructed phase space. In general, the building of any data-driven model depends on the completeness and quality of the data itself. However, complete data availability cannot always be guaranteed, since measurement or data transmission may intermittently fail. We propose two main solutions for dealing with incomplete time series: imputing and non-imputing methods. For imputing methods, we utilized interpolation methods (weighted sum of linear interpolations, Bayesian principal component analysis and cubic spline interpolation) and predictive models (neural network, kernel machine, chaotic model) for estimating the missing values. After imputing the missing values, the phase space reconstruction and chaotic model prediction are executed as a standard procedure. For non-imputing methods, we reconstructed the time-delayed phase space from the observed time series with missing values. This reconstruction results in non-continuous trajectories. However, the local model prediction can still be made from the other dynamical neighbors reconstructed from non-missing values. We implemented and tested these methods to construct a chaotic model for predicting storm surges at Hoek van Holland, the entrance to the Port of Rotterdam. The hourly surge time series is available for the period 1990-1996. For measuring the performance of the proposed methods, a synthetic time series with missing values, generated by applying a random missingness pattern to the original (complete) time series, is utilized. There are two main performance measures used in this work: (1) error measures between the actual…
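The core of the pipeline described here, time-delay phase-space reconstruction followed by prediction from dynamical neighbors, can be sketched with a single nearest-neighbor local model. The embedding dimension, delay, and one-neighbor rule below are illustrative assumptions, far simpler than the paper's adaptive local models:

```python
import math

def embed(series, dim=3, delay=2):
    """Time-delay phase-space reconstruction: one point per delay window."""
    span = (dim - 1) * delay
    return [tuple(series[i + k * delay] for k in range(dim))
            for i in range(len(series) - span)]

def local_predict(series, dim=3, delay=2, horizon=1):
    """Simplest local model: find the historical state closest to the
    current one and return what followed it 'horizon' steps later."""
    pts = embed(series, dim, delay)
    current = pts[-1]
    best, best_d = None, float("inf")
    # Exclude recent states whose future would overlap the value we predict.
    for i, p in enumerate(pts[:-horizon - 1]):
        d = math.dist(p, current)
        if d < best_d:
            best, best_d = i, d
    follow_idx = best + (dim - 1) * delay + horizon
    return series[follow_idx]
```

The non-imputing strategy in the abstract amounts to running the same neighbor search while skipping any reconstructed state that contains a missing sample.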
Layered Ensemble Architecture for Time Series Forecasting.
Rahman, Md Mustafizur; Islam, Md Monirul; Murase, Kazuyuki; Yao, Xin
2016-01-01
Time series forecasting (TSF) has been widely used in many application areas such as science, engineering, and finance. The phenomena generating time series are usually unknown, and the information available for forecasting is limited to the past values of the series. It is, therefore, necessary to use an appropriate number of past values, termed the lag, for forecasting. This paper proposes a layered ensemble architecture (LEA) for TSF problems. Our LEA consists of two layers, each of which uses an ensemble of multilayer perceptron (MLP) networks. While the first ensemble layer tries to find an appropriate lag, the second ensemble layer employs the obtained lag for forecasting. Unlike most previous work on TSF, the proposed architecture considers both accuracy and diversity of the individual networks in constructing an ensemble. LEA trains different networks in the ensemble using different training sets, with the aim of maintaining diversity among the networks. However, it uses the appropriate lag and combines the best trained networks to construct the ensemble. This reflects LEA's emphasis on the accuracy of the networks. The proposed architecture has been tested extensively on time series data from the NN3 and NN5 competitions. It has also been tested on several standard benchmark time series datasets. In terms of forecasting accuracy, our experimental results clearly reveal that LEA is better than other ensemble and non-ensemble methods.
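The two-stage idea above (first pick a lag, then forecast with it) can be sketched with a much simpler stand-in: instead of an MLP ensemble, a linear autoregression is fit for each candidate lag and the lag with the lowest holdout error is kept. All names and parameters here are illustrative, not from the paper.

```python
import numpy as np

def make_lagged(y, lag):
    """Build (X, target) pairs: row k = [y_k, ..., y_{k+lag-1}] predicts y_{k+lag}."""
    X = np.column_stack([y[i:len(y) - lag + i] for i in range(lag)])
    return X, y[lag:]

def select_lag(y, max_lag=10, holdout=50):
    """Pick the lag with the lowest one-step error on a holdout tail.

    A linear-model stand-in for LEA's first ensemble layer; the real
    architecture votes across an ensemble of MLP networks instead.
    """
    best_lag, best_err = None, np.inf
    for lag in range(1, max_lag + 1):
        X, t = make_lagged(y, lag)
        Xtr, ttr = X[:-holdout], t[:-holdout]
        Xte, tte = X[-holdout:], t[-holdout:]
        w, *_ = np.linalg.lstsq(Xtr, ttr, rcond=None)
        err = np.mean((Xte @ w - tte) ** 2)
        if err < best_err:
            best_lag, best_err = lag, err
    return best_lag

rng = np.random.default_rng(4)
# Toy AR(2) process: y_t = 0.6 y_{t-1} - 0.3 y_{t-2} + noise
y = np.zeros(500)
for i in range(2, 500):
    y[i] = 0.6 * y[i - 1] - 0.3 * y[i - 2] + rng.normal(0, 0.5)
lag = select_lag(y)
print(lag)
```

The selected lag would then be fed to the second (forecasting) layer, which in LEA is again an ensemble trained on diverse training sets.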
Time series clustering in large data sets
Directory of Open Access Journals (Sweden)
Jiří Fejfar
2011-01-01
The clustering of time series is a widely researched area, and there are many methods for dealing with this task. We are currently using the self-organizing map (SOM) with an unsupervised learning algorithm for clustering of time series. After the first experiment (Fejfar, Weinlichová, Šťastný, 2009), it seems that the whole concept of the clustering algorithm is correct, but that we have to perform time series clustering on a much larger dataset to obtain more accurate results and to find the correlation between configured parameters and results more precisely. A second requirement arose from the need for a well-defined evaluation of results. It seems useful to use sound recordings as instances of time series again: there are many recordings available in digital libraries, and many interesting features and patterns can be found in this area. In this experiment we search for recordings with a similar development of information density; this can be used for musical form investigation, cover song detection and many other applications. The objective of the presented paper is to compare clustering results obtained with different parameters of the feature vectors and of the SOM itself. We describe time series in a simplistic way, evaluating standard deviations for separate parts of recordings. The resulting feature vectors are clustered with the SOM in batch training mode with different topologies, varying from a few neurons to large maps. Other algorithms usable for finding similarities between time series are also discussed, and conclusions for further research are presented, together with an overview of the related literature and projects.
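The simplistic description mentioned above (standard deviations of separate parts of a recording) can be sketched as follows; the function name and the choice of eight segments are illustrative, and the SOM clustering step itself is omitted.

```python
import numpy as np

def stddev_features(signal, n_parts=8):
    """Describe a time series by the standard deviations of equal parts.

    The recording is split into n_parts segments and each segment is
    summarized by its standard deviation, yielding a short feature
    vector suitable as SOM input.
    """
    segments = np.array_split(np.asarray(signal, dtype=float), n_parts)
    return np.array([seg.std() for seg in segments])

# Two toy "recordings": one with growing volatility, one steady.
rng = np.random.default_rng(0)
loud_ending = rng.normal(0, np.linspace(0.1, 2.0, 1000))
steady = rng.normal(0, 1.0, 1000)

fv1 = stddev_features(loud_ending)
fv2 = stddev_features(steady)
print(fv1.shape)  # one 8-dimensional feature vector per recording
```

Each recording becomes a fixed-length vector regardless of its duration, which is what makes batch SOM training over a large, heterogeneous library feasible.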
Introduction to time series and forecasting
Brockwell, Peter J
2016-01-01
This book is aimed at the reader who wishes to gain a working knowledge of time series and forecasting methods as applied to economics, engineering and the natural and social sciences. It assumes knowledge only of basic calculus, matrix algebra and elementary statistics. This third edition contains detailed instructions for the use of the professional version of the Windows-based computer package ITSM2000, now available as a free download from the Springer Extras website. The logic and tools of time series model-building are developed in detail. Numerous exercises are included and the software can be used to analyze and forecast data sets of the user's own choosing. The book can also be used in conjunction with other time series packages such as those included in R. The programs in ITSM2000 however are menu-driven and can be used with minimal investment of time in the computational details. The core of the book covers stationary processes, ARMA and ARIMA processes, multivariate time series and state-space mod...
Complex dynamic in ecological time series
Peter Turchin; Andrew D. Taylor
1992-01-01
Although the possibility of complex dynamical behaviors (limit cycles, quasiperiodic oscillations, and aperiodic chaos) has been recognized theoretically, most ecologists are skeptical of their importance in nature. In this paper we develop a methodology for reconstructing endogenous (or deterministic) dynamics from ecological time series. Our method consists of fitting...
Inferring interdependencies from short time series
Indian Academy of Sciences (India)
Abstract. Complex networks provide an invaluable framework for the study of interlinked dynamical systems. In many cases, such networks are constructed from observed time series by first estimating the ...
On modeling panels of time series
Ph.H.B.F. Franses (Philip Hans)
2002-01-01
This paper reviews research issues in modeling panels of time series. Examples of this type of data are annually observed macroeconomic indicators for all countries in the world, daily returns on the individual stocks listed in the S&P500, and the sales records of all items in a
25 years of time series forecasting
de Gooijer, J.G.; Hyndman, R.J.
2006-01-01
We review the past 25 years of research into time series forecasting. In this silver jubilee issue, we naturally highlight results published in journals managed by the International Institute of Forecasters (Journal of Forecasting 1982-1985 and International Journal of Forecasting 1985-2005). During
Nonlinear Time Series Analysis via Neural Networks
Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin
This article deals with time series analysis based on neural networks, aimed at effective pattern recognition in the forex market [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)]. Our goal is to find and recognize important patterns which repeatedly appear in the market history, and to adapt our trading system's behaviour based on them.
Markov Trends in Macroeconomic Time Series
R. Paap (Richard)
1997-01-01
Many macroeconomic time series are characterised by long periods of positive growth (expansions) and short periods of negative growth (recessions). A popular model to describe this phenomenon is the Markov trend, which is a stochastic segmented trend where the slope depends on the
Modeling vector nonlinear time series using POLYMARS
de Gooijer, J.G.; Ray, B.K.
2003-01-01
A modified multivariate adaptive regression splines method for modeling vector nonlinear time series is investigated. The method results in models that can capture certain types of vector self-exciting threshold autoregressive behavior, as well as provide good predictions for more general vector
Modeling seasonality in bimonthly time series
Ph.H.B.F. Franses (Philip Hans)
1992-01-01
A recurring issue in modeling seasonal time series variables is the choice of the most adequate model for the seasonal movements. One selection method for quarterly data is proposed in Hylleberg et al. (1990). Market response models are often constructed for bimonthly variables, and
Time Series Modelling using Proc Varmax
DEFF Research Database (Denmark)
Milhøj, Anders
2007-01-01
In this paper it is demonstrated how various time series problems can be addressed using Proc Varmax. The procedure is rather new, and hence newer features such as cointegration and testing for Granger causality are included, but it also means that more traditional ARIMA modelling as outlined by Box...
On clustering fMRI time series
DEFF Research Database (Denmark)
Goutte, Cyril; Toft, Peter Aundal; Rostrup, E.
1999-01-01
Analysis of fMRI time series is often performed by extracting one or more parameters for the individual voxels. Methods based, e.g., on various statistical tests are then used to yield parameters corresponding to probability of activation or activation strength. However, these methods do...
Robust Control Charts for Time Series Data
Croux, C.; Gelper, S.; Mahieu, K.
2010-01-01
This article presents a control chart for time series data, based on the one-step-ahead forecast errors of the Holt-Winters forecasting method. We use robust techniques to prevent outliers from affecting the estimation of the control limits of the chart. Moreover, robustness is important to maintain
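To illustrate the general idea (a control chart on one-step-ahead forecast errors, with limits estimated robustly so outliers barely move them), here is a minimal sketch that substitutes simple exponential smoothing for Holt-Winters and median/MAD limits for the authors' robust estimators; all names and parameters are illustrative.

```python
import numpy as np

def one_step_errors_ses(y, alpha=0.3):
    """One-step-ahead forecast errors from simple exponential smoothing.

    A stand-in for the Holt-Winters forecasts used in the article.
    """
    level = y[0]
    errors = []
    for obs in y[1:]:
        errors.append(obs - level)      # forecast for t is the current level
        level = alpha * obs + (1 - alpha) * level
    return np.array(errors)

def robust_limits(errors, k=3.0):
    """Control limits from median and MAD; a single outlier has
    almost no influence on where the limits end up."""
    med = np.median(errors)
    mad = 1.4826 * np.median(np.abs(errors - med))  # ~sigma for normal data
    return med - k * mad, med + k * mad

rng = np.random.default_rng(1)
y = np.cumsum(rng.normal(0, 0.1, 300)) + rng.normal(0, 1, 300)
y[200] += 10.0                          # inject a single large disturbance

e = one_step_errors_ses(y)
lo, hi = robust_limits(e)
flags = (e < lo) | (e > hi)
print(flags.sum())                      # the disturbance is signalled
```

Using median/MAD rather than mean/standard deviation is what keeps the chart's limits stable in the presence of the very outliers the chart is meant to detect.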
Optimal transformations for categorical autoregressive time series
Buuren, S. van
1996-01-01
This paper describes a method for finding optimal transformations for analyzing time series by autoregressive models. 'Optimal' implies that the agreement between the autoregressive model and the transformed data is maximal. Such transformations help 1) to increase the model fit, and 2) to analyze
Lecture notes for Advanced Time Series Analysis
DEFF Research Database (Denmark)
Madsen, Henrik; Holst, Jan
1997-01-01
A first version of these notes was used for the lectures in Grenoble; they have since been extended and improved (together with Jan Holst) and used in Ph.D. courses on Advanced Time Series Analysis at IMM and at the Department of Mathematical Statistics, University of Lund, 1994, 1997, ...
Forecasting with periodic autoregressive time series models
Ph.H.B.F. Franses (Philip Hans); R. Paap (Richard)
1999-01-01
This paper is concerned with forecasting univariate seasonal time series data using periodic autoregressive models. We show how one should account for unit roots and deterministic terms when generating out-of-sample forecasts. We illustrate the models for various quarterly UK consumption
2005-01-01
On Sunday 12 June 2005, a site-wide security software upgrade will be performed on all CERN network equipment. This maintenance operation will cause at least 2 short network interruptions of 2 minutes on each equipment item. There are hundreds of such items across the CERN site (Meyrin, Prévessin and all SPS and LHC pits), and it will thus take the whole day to treat them all. All network users and services will be affected. Central batch computing services will be interrupted during this period, expected to last from 8 a.m. until late evening. Job submission will still be possible but no jobs will actually be run. It is hoped to complete the computer centre upgrades in the morning so that stable access can be restored to lxplus, afs and nice services as soon as possible; this cannot be guaranteed, however. The opportunity will be used to interrupt and perform upgrades on the CERN Document Servers.
Stochastic nature of series of waiting times
Anvari, Mehrnaz; Aghamohammadi, Cina; Dashti-Naserabadi, H.; Salehi, E.; Behjat, E.; Qorbani, M.; Khazaei Nezhad, M.; Zirak, M.; Hadjihosseini, Ali; Peinke, Joachim; Tabar, M. Reza Rahimi
2013-06-01
Although fluctuations in the waiting time series have been studied for a long time, some important issues such as its long-range memory and its stochastic features in the presence of nonstationarity have so far remained unstudied. Here we find that the “waiting times” series for a given increment level have long-range correlations with Hurst exponents belonging to the interval 1/2
The Statistical Analysis of Time Series
Anderson, T W
2011-01-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences George
Horváth, Csilla; Kornelis, Marcel; Leeflang, Peter S.H.
2002-01-01
In this review, we give a comprehensive summary of time series techniques in marketing, and discuss a variety of time series analysis (TSA) techniques and models. We classify them in the sets (i) univariate TSA, (ii) multivariate TSA, and (iii) multiple TSA. We provide relevant marketing
Algorithm for Compressing Time-Series Data
Hawkins, S. Edward, III; Darlington, Edward Hugo
2012-01-01
An algorithm based on Chebyshev polynomials effects lossy compression of time-series data or other one-dimensional data streams (e.g., spectral data) that are arranged in blocks for sequential transmission. The algorithm was developed for use in transmitting data from spacecraft scientific instruments to Earth stations. In spite of its lossy nature, the algorithm preserves the information needed for scientific analysis. The algorithm is computationally simple, yet compresses data streams by factors much greater than two. The algorithm is not restricted to spacecraft or scientific uses: it is applicable to time-series data in general. The algorithm can also be applied to general multidimensional data that have been converted to time-series data, a typical example being image data acquired by raster scanning. However, unlike most prior image-data-compression algorithms, this algorithm neither depends on nor exploits the two-dimensional spatial correlations that are generally present in images. In order to understand the essence of this compression algorithm, it is necessary to understand that the net effect of this algorithm and the associated decompression algorithm is to approximate the original stream of data as a sequence of finite series of Chebyshev polynomials. For the purpose of this algorithm, a block of data or interval of time for which a Chebyshev polynomial series is fitted to the original data is denoted a fitting interval. Chebyshev approximation has two properties that make it particularly effective for compressing serial data streams with minimal loss of scientific information: The errors associated with a Chebyshev approximation are nearly uniformly distributed over the fitting interval (this is known in the art as the "equal error property"); and the maximum deviations of the fitted Chebyshev polynomial from the original data have the smallest possible values (this is known in the art as the "min-max property").
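The core of the scheme (fit a Chebyshev series per block, transmit only the coefficients) can be sketched with NumPy's Chebyshev utilities; the block length, degree, and names here are illustrative, not the flight parameters.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def compress_block(block, degree=8):
    """Fit a Chebyshev series to one block ("fitting interval") of samples.

    Only the degree+1 coefficients need to be transmitted, giving a
    compression factor of len(block) / (degree + 1). A sketch of the
    idea in the abstract, not the actual spacecraft algorithm.
    """
    x = np.linspace(-1.0, 1.0, len(block))  # map the interval to the Chebyshev domain
    return C.chebfit(x, block, degree)

def decompress_block(coeffs, n_samples):
    """Evaluate the transmitted series to reconstruct the samples."""
    x = np.linspace(-1.0, 1.0, n_samples)
    return C.chebval(x, coeffs)

# A smooth 64-sample telemetry block compressed to 9 coefficients (~7x).
t = np.linspace(0.0, 1.0, 64)
block = np.sin(2 * np.pi * t) + 0.1 * t
coeffs = compress_block(block)
recon = decompress_block(coeffs, len(block))
print(np.max(np.abs(recon - block)))    # near-uniform, small residual
```

The near-uniform distribution of the residual over the fitting interval is the "equal error property" the abstract refers to; it is what lets a fixed degree bound the worst-case loss per block.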
Wong, Manyee; Cook, Thomas D.; Steiner, Peter M.
2015-01-01
Some form of a short interrupted time series (ITS) is often used to evaluate state and national programs. An ITS design with a single treatment group assumes that the pretest functional form can be validly estimated and extrapolated into the postintervention period where it provides a valid counterfactual. This assumption is problematic. Ambiguous…
Inverse statistical approach in heartbeat time series
International Nuclear Information System (INIS)
Ebadi, H; Shirazi, A H; Mani, Ali R; Jafari, G R
2011-01-01
We present an investigation on heart cycle time series, using inverse statistical analysis, a concept borrowed from studying turbulence. Using this approach, we studied the distribution of the exit times needed to achieve a predefined level of heart rate alteration. Such analysis uncovers the most likely waiting time needed to reach a certain change in the rate of heart beat. This analysis showed a significant difference between the raw data and shuffled data, when the heart rate accelerates or decelerates to a rare event. We also report that inverse statistical analysis can distinguish between the electrocardiograms taken from healthy volunteers and patients with heart failure
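A minimal reading of the exit-time idea (how many beats until the heart rate has changed by a predefined level) might look like the following; the estimator and names are a simplification for illustration, not the authors' exact procedure.

```python
import numpy as np

def exit_times(rr, level):
    """For each starting beat, count the steps until the series first
    changes by at least `level`; the distribution of these waiting
    times is what inverse statistics examines."""
    times = []
    n = len(rr)
    for i in range(n - 1):
        for j in range(i + 1, n):
            if abs(rr[j] - rr[i]) >= level:
                times.append(j - i)
                break
    return np.array(times)

rng = np.random.default_rng(2)
rr = np.cumsum(rng.normal(0, 0.01, 500)) + 0.8   # toy RR-interval series (s)
tau = exit_times(rr, level=0.05)
print(len(tau), tau.min())
```

The mode of `tau` estimates the most likely waiting time to reach the chosen rate alteration; comparing this distribution for raw versus shuffled data is what reveals the temporal structure the abstract describes.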
Visibility graphlet approach to chaotic time series
Energy Technology Data Exchange (ETDEWEB)
Mutua, Stephen [Business School, University of Shanghai for Science and Technology, Shanghai 200093 (China); Computer Science Department, Masinde Muliro University of Science and Technology, P.O. Box 190-50100, Kakamega (Kenya); Gu, Changgui, E-mail: gu-changgui@163.com, E-mail: hjyang@ustc.edu.cn; Yang, Huijie, E-mail: gu-changgui@163.com, E-mail: hjyang@ustc.edu.cn [Business School, University of Shanghai for Science and Technology, Shanghai 200093 (China)
2016-05-15
Many novel methods have been proposed for mapping time series into complex networks. Although some dynamical behaviors can be effectively captured by existing approaches, the preservation and tracking of the temporal behaviors of a chaotic system remains an open problem. In this work, we extended the visibility graphlet approach to investigate both discrete and continuous chaotic time series. We applied visibility graphlets to capture the reconstructed local states, so that each is treated as a node and tracked downstream to create a temporal chain link. Our empirical findings show that the approach accurately captures the dynamical properties of chaotic systems. Networks constructed from periodic dynamic phases all converge to regular networks and to unique network structures for each model in the chaotic zones. Furthermore, our results show that the characterization of chaotic and non-chaotic zones in the Lorenz system corresponds to the maximal Lyapunov exponent, thus providing a simple and straightforward way to analyze chaotic systems.
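The style of time-series-to-network mapping that the graphlet approach builds on can be sketched via the standard natural visibility graph: two samples are linked if the straight line between them clears all intermediate samples. This shows only the basic graph construction, not the local-state graphlets or temporal chain links described above.

```python
import numpy as np

def visibility_edges(series):
    """Natural visibility graph edges for a time series: samples i and j
    see each other if every sample k between them lies strictly below
    the straight line connecting (i, y_i) and (j, y_j)."""
    y = np.asarray(series, dtype=float)
    n = len(y)
    edges = set()
    for i in range(n - 1):
        edges.add((i, i + 1))               # neighbours always see each other
        for j in range(i + 2, n):
            ks = np.arange(i + 1, j)
            line = y[j] + (y[i] - y[j]) * (j - ks) / (j - i)
            if np.all(y[ks] < line):
                edges.add((i, j))
    return edges

e = visibility_edges([3.0, 1.0, 2.0, 0.5, 4.0])
print(sorted(e))
```

Chaotic and periodic regimes leave different signatures in the resulting network's structure, which is what makes this family of mappings useful for characterizing dynamics.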
Time-Series Analysis: A Cautionary Tale
Damadeo, Robert
2015-01-01
Time-series analysis has often been a useful tool in atmospheric science for deriving long-term trends in various atmospherically important parameters (e.g., temperature or the concentration of trace gas species). In particular, time-series analysis has been repeatedly applied to satellite datasets in order to derive the long-term trends in stratospheric ozone, which is a critical atmospheric constituent. However, many of the potential pitfalls relating to the non-uniform sampling of the datasets were often ignored and the results presented by the scientific community have been unknowingly biased. A newly developed and more robust application of this technique is applied to the Stratospheric Aerosol and Gas Experiment (SAGE) II version 7.0 ozone dataset and the previous biases and newly derived trends are presented.
Time Series Analysis Using Geometric Template Matching.
Frank, Jordan; Mannor, Shie; Pineau, Joelle; Precup, Doina
2013-03-01
We present a novel framework for analyzing univariate time series data. At the heart of the approach is a versatile algorithm for measuring the similarity of two segments of time series called geometric template matching (GeTeM). First, we use GeTeM to compute a similarity measure for clustering and nearest-neighbor classification. Next, we present a semi-supervised learning algorithm that uses the similarity measure with hierarchical clustering in order to improve classification performance when unlabeled training data are available. Finally, we present a boosting framework called TDEBOOST, which uses an ensemble of GeTeM classifiers. TDEBOOST augments the traditional boosting approach with an additional step in which the features used as inputs to the classifier are adapted at each step to improve the training error. We empirically evaluate the proposed approaches on several datasets, such as accelerometer data collected from wearable sensors and ECG data.
Forecasting with nonlinear time series models
DEFF Research Database (Denmark)
Kock, Anders Bredahl; Teräsvirta, Timo
In this paper, nonlinear models are restricted to mean nonlinear parametric models. Several such models popular in time series econometrics are presented and some of their properties discussed. This includes two models based on universal approximators: the Kolmogorov-Gabor polynomial model and two versions of a simple artificial neural network model. Techniques for generating multi-period forecasts from nonlinear models recursively are considered, and the direct (non-recursive) method for this purpose is mentioned as well. Forecasting with complex dynamic systems, albeit less frequently applied to economic forecasting problems, is briefly highlighted. A number of large published studies comparing macroeconomic forecasts obtained using different time series models are discussed, and the paper also contains a small simulation study comparing recursive and direct forecasts in a partic...
Nonlinear time series analysis with R
Huffaker, Ray; Rosa, Rodolfo
2017-01-01
In the process of data analysis, the investigator often faces highly volatile and random-appearing observed data. A vast body of literature shows that the assumption of underlying stochastic processes did not necessarily represent the nature of the processes under investigation and that, when other tools were used, deterministic features emerged. Nonlinear Time Series Analysis (NLTS) allows researchers to test whether observed volatility conceals systematic nonlinear behavior, and to rigorously characterize governing dynamics. Behavioral patterns detected by nonlinear time series analysis, along with scientific principles and other expert information, guide the specification of mechanistic models that serve to explain real-world behavior rather than merely reproducing it. There is often a misconception regarding the complexity of the mathematics needed to understand and utilize the tools of NLTS (for instance chaos theory). However, the mathematics used in NLTS is much simpler than many other subjec...
Reconstruction of tritium time series in precipitation
International Nuclear Information System (INIS)
Celle-Jeanton, H.; Gourcy, L.; Aggarwal, P.K.
2002-01-01
Tritium is commonly used in groundwater studies to calculate the recharge rate and to identify the presence of modern recharge. Knowledge of the 3H precipitation time series is therefore very important for the study of groundwater recharge. Rozanski and Araguas provided good information on the tritium content of precipitation at 180 stations of the GNIP network up to the end of 1987, but it shows some gaps in measurements, either within one record or within one region (the Southern hemisphere, for instance). Therefore, it seems essential to find a method to recalculate data for a region where no measurement is available. To solve this problem, we propose another method, based on triangulation. It requires the 3H time series of three stations geographically surrounding the fourth station, for which the tritium input curve has to be reconstructed.
Directory of Open Access Journals (Sweden)
Silvia Erina
2012-10-01
The study was carried out on 9 Romanian Black and White primiparous cows. The aim of this study was to determine some aspects of the nutritional behaviour of the cows. During the experiments, the following behaviour aspects were determined: the number of interruptions and their duration during feed consumption time. Results showed that the administration order of forages had an influence on the number of interruptions, which was 0.74 less for hay in fibrous-succulent order (O1). For silage, the interruption number was 0.42 higher in fibrous-succulent order (O1). Between portion 1 (P1) and portion 3 (P3), a significant difference (p<0.05) was found for interruption duration during silage consumption, in favour of portion P1. Distinctly significant differences (p<0.01) were observed for the interruption number during silage consumption (0.95 higher in P1 than in P3) and for interruption duration (5.96 sec higher in P1 than in P3). Between P2 and P3, a significant difference (p<0.05) was observed for the number of interruptions during silage consumption and for average interruption duration during beet consumption, in favour of portion P2. Regarding the number of feedings per portion, the differences were always higher in the second feeding (F2) than in the first feeding (F1).
Time Series Forecasting with Missing Values
Shin-Fu Wu; Chia-Yung Chang; Shie-Jue Lee
2015-01-01
Time series prediction has become more popular in various kinds of applications such as weather prediction, control engineering, financial analysis, industrial monitoring, etc. To deal with real-world problems, we are often faced with missing values in the data due to sensor malfunctions or human errors. Traditionally, the missing values are simply omitted or replaced by means of imputation methods. However, omitting those missing values may cause temporal discontinuity. Imputation methods, o...
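The contrast drawn above (omitting missing values breaks temporal continuity, while imputation preserves the sampling grid) can be illustrated with linear interpolation over the time index; this is a minimal sketch of the imputation baseline, not any specific method from the paper.

```python
import numpy as np

def impute_linear(t, y):
    """Fill NaNs in y by linear interpolation over the time index t.

    Omitting the NaN samples instead would distort the spacing between
    consecutive observations, causing the temporal discontinuity the
    abstract warns about.
    """
    y = np.asarray(y, dtype=float)
    missing = np.isnan(y)
    filled = y.copy()
    filled[missing] = np.interp(t[missing], t[~missing], y[~missing])
    return filled

t = np.arange(8.0)
y = np.array([0.0, 1.0, np.nan, 3.0, 4.0, np.nan, 6.0, 7.0])
filled = impute_linear(t, y)
print(filled.tolist())  # [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0]
```

Interpolating over the actual time index (rather than the sample position) is what keeps the method correct when observations are irregularly spaced.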
Time series analysis of barometric pressure data
International Nuclear Information System (INIS)
La Rocca, Paola; Riggi, Francesco; Riggi, Daniele
2010-01-01
Time series of atmospheric pressure data, collected over a period of several years, were analysed to provide undergraduate students with educational examples of application of simple statistical methods of analysis. In addition to basic methods for the analysis of periodicities, a comparison of two forecast models, one based on autoregression algorithms, and the other making use of an artificial neural network, was made. Results show that the application of artificial neural networks may give slightly better results compared to traditional methods.
Directory of Open Access Journals (Sweden)
Mohammad Kazemi
2012-04-01
This paper considers a single-machine scheduling problem with a just-in-time (JIT) approach in which preemption and idle time are allowed. It incorporates earliness/tardiness (E/T) penalties, interruption penalties and the holding cost of jobs waiting to be processed as work-in-process (WIP). Generally, in non-preemptive problems, E/T penalties are a function of the completion time of the jobs. Here, we introduce a non-linear preemptive scheduling model where the earliness penalty depends on the starting time of a job. The model is linearized by an elaborately designed procedure to reach the optimum solution. To validate and verify the performance of the proposed model, computational results are presented by solving a number of numerical examples.
Causal strength induction from time series data.
Soo, Kevin W; Rottman, Benjamin M
2018-04-01
One challenge when inferring the strength of cause-effect relations from time series data is that the cause and/or effect can exhibit temporal trends. If temporal trends are not accounted for, a learner could infer that a causal relation exists when it does not, or even infer that there is a positive causal relation when the relation is negative, or vice versa. We propose that learners use a simple heuristic to control for temporal trends-that they focus not on the states of the cause and effect at a given instant, but on how the cause and effect change from one observation to the next, which we call transitions. Six experiments were conducted to understand how people infer causal strength from time series data. We found that participants indeed use transitions in addition to states, which helps them to reach more accurate causal judgments (Experiments 1A and 1B). Participants use transitions more when the stimuli are presented in a naturalistic visual format than a numerical format (Experiment 2), and the effect of transitions is not driven by primacy or recency effects (Experiment 3). Finally, we found that participants primarily use the direction in which variables change rather than the magnitude of the change for estimating causal strength (Experiments 4 and 5). Collectively, these studies provide evidence that people often use a simple yet effective heuristic for inferring causal strength from time series data. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
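The transitions heuristic can be illustrated numerically: two series that share only a temporal trend look strongly related in their raw states but not in their step-to-step changes. The setup below is a toy illustration, not the experimental stimuli from the study.

```python
import numpy as np

def strength_from_states(cause, effect):
    """Correlate raw states; a shared trend can masquerade as causal strength."""
    return np.corrcoef(cause, effect)[0, 1]

def strength_from_transitions(cause, effect):
    """Correlate step-to-step changes ("transitions"), which controls
    for a shared temporal trend."""
    return np.corrcoef(np.diff(cause), np.diff(effect))[0, 1]

rng = np.random.default_rng(3)
trend = np.linspace(0.0, 10.0, 200)
cause = trend + rng.normal(0, 1, 200)
effect = trend + rng.normal(0, 1, 200)   # no causal link, shared trend only

print(strength_from_states(cause, effect))       # large, spurious
print(strength_from_transitions(cause, effect))  # near zero
```

Differencing removes the common trend, so the transition-based estimate correctly reports no relation where the state-based estimate would report a strong one.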
Interpretable Categorization of Heterogeneous Time Series Data
Lee, Ritchie; Kochenderfer, Mykel J.; Mengshoel, Ole J.; Silbermann, Joshua
2017-01-01
We analyze data from simulated aircraft encounters to validate and inform the development of a prototype aircraft collision avoidance system. The high-dimensional and heterogeneous time series dataset is analyzed to discover properties of near mid-air collisions (NMACs) and categorize the NMAC encounters. Domain experts use these properties to better organize and understand NMAC occurrences. Existing solutions either are not capable of handling high-dimensional and heterogeneous time series datasets or do not provide explanations that are interpretable by a domain expert. The latter is critical to the acceptance and deployment of safety-critical systems. To address this gap, we propose grammar-based decision trees along with a learning algorithm. Our approach extends decision trees with a grammar framework for classifying heterogeneous time series data. A context-free grammar is used to derive decision expressions that are interpretable, application-specific, and support heterogeneous data types. In addition to classification, we show how grammar-based decision trees can also be used for categorization, which is a combination of clustering and generating interpretable explanations for each cluster. We apply grammar-based decision trees to a simulated aircraft encounter dataset and evaluate the performance of four variants of our learning algorithm. The best algorithm is used to analyze and categorize near mid-air collisions in the aircraft encounter dataset. We describe each discovered category in detail and discuss its relevance to aircraft collision avoidance.
Interpretation of a compositional time series
Tolosana-Delgado, R.; van den Boogaart, K. G.
2012-04-01
Common methods for multivariate time series analysis use linear operations, from the definition of a time-lagged covariance/correlation to the prediction of new outcomes. However, when the time series response is a composition (a vector of positive components showing the relative importance of a set of parts in a total, like percentages and proportions), then linear operations are afflicted by several problems. For instance, it has long been recognised that (auto/cross-)correlations between raw percentages are spurious, more dependent on which other components are being considered than on any natural link between the components of interest. Also, a long-term forecast of a composition in models with a linear trend will ultimately predict negative components. In general terms, compositional data should not be treated on a raw scale, but after a log-ratio transformation (Aitchison, 1986: The Statistical Analysis of Compositional Data. Chapman and Hall). This is so because the information conveyed by compositional data is relative, as stated in their definition. The principle of working in coordinates allows any sort of multivariate analysis to be applied to a log-ratio transformed composition, as long as this transformation is invertible. This principle is fully applicable to time series analysis. We discuss how results (both auto/cross-correlation functions and predictions) can be back-transformed, viewed and interpreted in a meaningful way. One view is to use the exhaustive set of all possible pairwise log-ratios, which allows the results to be expressed as D(D-1)/2 separate, interpretable sets of one-dimensional models showing the behaviour of each possible pairwise log-ratio. Another view is the interpretation of estimated coefficients or correlations back-transformed in terms of compositions. These two views are compatible and complementary. These issues are illustrated with time series of seasonal precipitation patterns at different rain gauges of the USA.
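The first view described above (modelling all pairwise log-ratios as ordinary real-valued series) starts from a transform like the following sketch; the function name is illustrative.

```python
import numpy as np

def pairwise_logratios(comp):
    """All D(D-1)/2 pairwise log-ratios of a compositional time series.

    comp: array of shape (T, D); rows are positive parts of a total.
    Returns shape (T, D*(D-1)//2). Each column is an ordinary
    real-valued series that standard time series tools can model,
    with results back-transformed to compositions afterwards.
    """
    comp = np.asarray(comp, dtype=float)
    T, D = comp.shape
    cols = [np.log(comp[:, i] / comp[:, j])
            for i in range(D) for j in range(i + 1, D)]
    return np.column_stack(cols)

# Three-part composition (e.g. seasonal precipitation shares) over 4 times.
x = np.array([[0.20, 0.30, 0.50],
              [0.25, 0.25, 0.50],
              [0.30, 0.20, 0.50],
              [0.40, 0.10, 0.50]])
lr = pairwise_logratios(x)
print(lr.shape)  # (4, 3): one column per pair of parts
```

Because log-ratios are unconstrained reals, a linear trend fitted to them can never forecast a negative component once the forecast is mapped back to the simplex.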
Timing calibration and spectral cleaning of LOFAR time series data
Corstanje, A.; Buitink, S.; Enriquez, J. E.; Falcke, H.; Horandel, J. R.; Krause, M.; Nelles, A.; Rachen, J. P.; Schellart, P.; Scholten, O.; ter Veen, S.; Thoudam, S.; Trinh, T. N. G.
We describe a method for spectral cleaning and timing calibration of short time series data of the voltage in individual radio interferometer receivers. It makes use of phase differences in fast Fourier transform (FFT) spectra across antenna pairs. For strong, localized terrestrial sources these are
Sasikala, S.; Indhira, K.; Chandrasekaran, V. M.
2017-11-01
In this paper, we consider an MX/(a,b)/1 queueing system with server breakdown without interruption, multiple vacations, setup times and N-policy. After a batch of service, if the queue length ξ is less than a, the server leaves for a vacation. After a vacation, if the server finds at least N customers waiting for service, then the server needs a setup time to start service. After a batch of service, if the number of waiting customers in the queue is ξ (≥ a), then the server serves a batch of min(ξ, b) customers, where b ≥ a. We derive the probability generating function of the queue length at an arbitrary time epoch and obtain some important performance measures.
Outlier Detection in Structural Time Series Models
DEFF Research Database (Denmark)
Marczak, Martyna; Proietti, Tommaso
Structural change affects the estimation of economic signals, like the underlying growth rate or the seasonally adjusted series. An important issue, which has attracted a great deal of attention also in the seasonal adjustment literature, is its detection by an expert procedure. The general-to-specific approach to the detection of structural change, currently implemented in Autometrics via indicator saturation, has proven to be both practical and effective in the context of stationary dynamic regression models and unit-root autoregressions. By focusing on impulse- and step-indicator saturation, we investigate via Monte Carlo simulations how this approach performs for detecting additive outliers and level shifts in the analysis of nonstationary seasonal time series. The reference model is the basic structural model, featuring a local linear trend, possibly integrated of order two, and stochastic seasonality…
Effect of an evidence-based website on healthcare usage: an interrupted time-series study.
Spoelman, W.A.; Bonten, T.N.; Waal, M.W.M. de; Drenthen, T.; Smeele, I.J.M.; Nielen, M.M.; Chavannes, N.
2016-01-01
Objectives: Healthcare costs and usage are rising. Evidence-based online health information may reduce healthcare usage, but the evidence is scarce. The objective of this study was to determine whether the release of a nationwide evidence-based health website was associated with a reduction in
Analysis of JET ELMy time series
International Nuclear Information System (INIS)
Zvejnieks, G.; Kuzovkov, V.N.
2005-01-01
Full text: Achievement of the planned operational regime in the next generation of tokamaks (such as ITER) still faces principal problems. One of the main challenges is obtaining control of edge localized modes (ELMs), which should lead to both long plasma pulse times and a reasonable divertor lifetime. In order to control ELMs, Degeling [1] proposed the hypothesis that ELMs exhibit features of chaotic dynamics, so that standard chaos control methods might be applicable. However, our findings, which are based on a nonlinear autoregressive (NAR) model, contradict this hypothesis for JET ELMy time series. In turn, this means that ELM behavior is of a relaxation or random type. These conclusions coincide with our previous results obtained for ASDEX Upgrade time series [2]. [1] A.W. Degeling, Y.R. Martin, P.E. Bak, J.B. Lister, and X. Llobet, Plasma Phys. Control. Fusion 43, 1671 (2001). [2] G. Zvejnieks, V.N. Kuzovkov, O. Dumbrajs, A.W. Degeling, W. Suttrop, H. Urano, and H. Zohm, Physics of Plasmas 11, 5658 (2004)
Fourier analysis of time series an introduction
Bloomfield, Peter
2000-01-01
A new, revised edition of a yet unrivaled work on frequency domain analysis Long recognized for his unique focus on frequency domain methods for the analysis of time series data as well as for his applied, easy-to-understand approach, Peter Bloomfield brings his well-known 1976 work thoroughly up to date. With a minimum of mathematics and an engaging, highly rewarding style, Bloomfield provides in-depth discussions of harmonic regression, harmonic analysis, complex demodulation, and spectrum analysis. All methods are clearly illustrated using examples of specific data sets, while ample
Estimating High-Dimensional Time Series Models
DEFF Research Database (Denmark)
Medeiros, Marcelo C.; Mendes, Eduardo F.
We study the asymptotic properties of the Adaptive LASSO (adaLASSO) in sparse, high-dimensional, linear time-series models. We assume that both the number of covariates in the model and the number of candidate variables can increase with the number of observations, and that the number of candidate variables is possibly larger than the number of observations. We show that the adaLASSO consistently chooses the relevant variables as the number of observations increases (model selection consistency) and has the oracle property, even when the errors are non-Gaussian and conditionally heteroskedastic. A simulation study shows…
Inferring causality from noisy time series data
DEFF Research Database (Denmark)
Mønster, Dan; Fusaroli, Riccardo; Tylén, Kristian
2016-01-01
Convergent Cross-Mapping (CCM) has shown high potential to perform causal inference in the absence of models. We assess the strengths and weaknesses of the method by varying coupling strength and noise levels in coupled logistic maps. We find that CCM fails to infer accurate coupling strength, and even causality direction, in synchronized time series and in the presence of intermediate coupling. We find that the presence of noise deterministically reduces the level of cross-mapping fidelity, while the convergence rate exhibits higher levels of robustness. Finally, we propose that controlled noise…
Useful Pattern Mining on Time Series
DEFF Research Database (Denmark)
Goumatianos, Nikitas; Christou, Ioannis T; Lindgren, Peter
2013-01-01
We present the architecture of a "useful pattern" mining system that is capable of detecting thousands of different candlestick sequence patterns at the tick or any higher granularity level. The system architecture is highly distributed and performs most of its highly compute-intensive aggregation calculations as complex but efficient distributed SQL queries on the relational databases that store the time series. We present initial results from mining all frequent candlestick sequences with the characteristic property that when they occur then, with an average probability of at least 60%, they signal a 2…
Trottini, Mario; Vigo, Isabel; Belda, Santiago
2015-01-01
Given a time series, running trends analysis (RTA) involves evaluating least squares trends over overlapping time windows of L consecutive time points, with overlap by all but one observation. This produces a new series called the “running trends series,” which is used as summary statistics of the original series for further analysis. In recent years, RTA has been widely used in climate applied research as summary statistics for time series and time series association. There is no doubt that ...
Time series analysis of temporal networks
Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh
2016-01-01
A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies, only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without knowledge of the full network structure; rather, an estimate of some of the properties is sufficient to launch the attack. In this paper, we show that even if the network structure at a future time point is not available, one can still manage to estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them and, using a standard time series forecast model, predict the properties of the temporal network at a later time instance. To this aim, we consider eight properties, such as the number of active nodes, average degree and clustering coefficient, and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as an autoregressive integrated moving average (ARIMA) process. We use cross-validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency-domain properties of the time series, obtained from spectrogram analysis, could be used to refine the prediction framework by identifying beforehand the cases where the error in prediction is likely to be high. This leads to an improvement of 7.96% (for error level ≤ 20%) in prediction accuracy on average across all datasets. As an application, we show how such a prediction scheme can be used to launch targeted attacks on temporal networks. Contribution to the Topical Issue
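The mapping from network snapshots to property time series, followed by a standard forecast, can be sketched as follows. This is a simplified stand-in: a least-squares AR(1) step replaces the full ARIMA machinery of the paper, only two of the eight properties are computed, and all names are illustrative.

```python
def network_properties(snapshots):
    """Map a temporal network (list of edge lists, one per time step)
    to time series of two properties: active nodes and average degree."""
    series = {"active_nodes": [], "avg_degree": []}
    for edges in snapshots:
        nodes = {u for e in edges for u in e}
        series["active_nodes"].append(len(nodes))
        series["avg_degree"].append(2 * len(edges) / len(nodes) if nodes else 0.0)
    return series

def ar1_forecast(xs):
    """One-step forecast from a least-squares fit of x[t+1] = a*x[t] + b."""
    x, y = xs[:-1], xs[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx if sxx else 0.0
    b = my - a * mx
    return a * xs[-1] + b
```

Each property series is forecast independently; the predicted properties can then parameterize, e.g., a targeted-attack strategy without knowing the future network.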
Anomaly on Superspace of Time Series Data
Capozziello, Salvatore; Pincak, Richard; Kanjamapornkul, Kabin
2017-11-01
We apply G-theory and the anomaly of ghost and antighost fields in the theory of supersymmetry to study a superspace over time series data for the detection of hidden general supply and demand equilibrium in the financial market. We provide proof of the existence of a general equilibrium point over the 14 extra dimensions of the new G-theory, compared with the M-theory of Edward Witten's 11-dimensional model. We found that the process of coupling between nonequilibrium and equilibrium spinor fields of expectation ghost fields in the superspace of time series data induces an infinitely long exact sequence of cohomology from a short exact sequence of the moduli state space model. If we assume that the financial market is separated into two topological spaces of supply and demand, as in the D-brane and anti-D-brane model, then we can use a cohomology group to compute the stability of the market as a stable point of the general equilibrium of the interaction between D-branes of the market. We obtain the result that the general equilibrium will exist if and only if the 14th Batalin-Vilkovisky cohomology group with the negative dimensions underlying 14 major hidden factors influencing the market is zero.
Tool Wear Monitoring Using Time Series Analysis
Song, Dong Yeul; Ohara, Yasuhiro; Tamaki, Haruo; Suga, Masanobu
A tool wear monitoring approach considering the nonlinear behavior of the cutting mechanism caused by tool wear and/or localized chipping is proposed, and its effectiveness is verified through cutting experiments and actual turning machining. Moreover, the variation in the surface roughness of the machined workpiece is also discussed using this approach. In this approach, the residual error between the actually measured vibration signal and the estimated signal obtained from the time series model corresponding to the dynamic model of cutting is introduced as the diagnostic feature. Consequently, it is found that the early tool wear state (i.e. flank wear under 40 µm) can be monitored, and that the optimal tool exchange time and the tool wear state in actual turning machining can be judged from this change in the residual error. Moreover, the variation of surface roughness Pz in the range of 3 to 8 µm can be estimated by monitoring the residual error.
Time Series Based for Online Signature Verification
Directory of Open Access Journals (Sweden)
I Ketut Gede Darma Putra
2013-11-01
A signature verification system matches a tested signature against a claimed signature. This paper proposes a time-series-based feature extraction method and dynamic time warping as the matching method. The system was tested on 900 signatures belonging to 50 participants, with 3 signatures per participant as references and 5 signatures each from the original user, simple impostors and trained impostors as test signatures. The final system was tested with 50 participants and 3 references each. Without impostors, system accuracy is 90.44897959% at threshold 44, with a false non-match rate (FNMR) of 5.2% and a false match rate (FMR) of 4.35102%. With impostors, system accuracy is 80.1361% at threshold 27, with an FNMR of 15.6% and an average FMR of 4.263946%, broken down as follows: FMR for original users 0.391837%, for simple impostors 3.2% and for trained impostors 9.2%.
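The matching step can be illustrated with the textbook dynamic time warping distance (a generic sketch, not the authors' implementation; the feature extraction is omitted, and the `verify` decision rule and its threshold are assumptions for illustration):

```python
def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping distance
    with absolute-difference local cost."""
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def verify(test_sig, reference_sigs, threshold):
    """Accept when the best DTW distance to any reference is below threshold."""
    return min(dtw_distance(test_sig, r) for r in reference_sigs) < threshold
```

Because DTW aligns sequences nonlinearly in time, two signatures written at different speeds can still match closely, which is the main reason it is popular for online signature data.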
International Nuclear Information System (INIS)
Moonen, L.; Voet, H. van der; Nijs, R. de; Horenblas, S.; Hart, A.A.M.; Bartelink, H.
1998-01-01
Purpose: To evaluate and, if possible, quantify the influence of tumor proliferation during the external radiation course on local control in muscle-invasive bladder cancer. Methods and Materials: The influence of total dose, overall treatment time, and treatment interruption was retrospectively analyzed in a series of 379 patients with nonmetastasized, muscle-invasive transitional cell carcinoma of the urinary bladder. All patients received external beam radiotherapy at the Netherlands Cancer Institute between 1977 and 1990. Total dose varied between 50 and 75 Gy with a mean of 60.5 Gy and a median of 60.4 Gy. Overall treatment time varied between 20 and 270 days with a mean of 49 days and a median of 41 days. The number of fractions varied between 17 and 36 with a mean of 27 and a median of 26. Two hundred and forty-four patients had a continuous radiation course, whereas 135 had an intended split course or an unintended treatment interruption. Median follow-up was 22 months for all patients and 82 months for the 30 patients still alive at last follow-up. A stepwise procedure using proportional hazards regression was used to identify prognostic treatment factors with respect to local recurrence as sole first recurrence. Results: One hundred and thirty-six patients experienced a local recurrence, and 120 of these occurred before regional or distant metastases. The actuarial local control rate was 40.3% at 5 years and 32.3% at 10 years. In a multivariate analysis, total dose showed a significant association with local control (p = 0.0039), however in a markedly nonlinear way. In fact, only those patients treated with a dose below 57.5 Gy had a significantly higher bladder relapse rate, whereas no difference in relapse rate was found among patients treated with doses above 57.5 Gy. This remained the case even after adjustment for overall treatment time and all significant tumor and patient characteristics. The Normalized Tumor Dose (NTD) (α/β = 10) and NTD (
Automated time series forecasting for biosurveillance.
Burkom, Howard S; Murphy, Sean Patrick; Shmueli, Galit
2007-09-30
For robust detection performance, traditional control chart monitoring for biosurveillance is based on input data free of trends, day-of-week effects, and other systematic behaviour. Time series forecasting methods may be used to remove this behaviour by subtracting forecasts from observations to form residuals for algorithmic input. We describe three forecast methods and compare their predictive accuracy on each of 16 authentic syndromic data streams. The methods are (1) a non-adaptive regression model using a long historical baseline, (2) an adaptive regression model with a shorter, sliding baseline, and (3) the Holt-Winters method for generalized exponential smoothing. Criteria for comparing the forecasts were the root-mean-square error, the median absolute per cent error (MedAPE), and the median absolute deviation. The median-based criteria showed best overall performance for the Holt-Winters method. The MedAPE measures over the 16 test series averaged 16.5, 11.6, and 9.7 for the non-adaptive regression, adaptive regression, and Holt-Winters methods, respectively. The non-adaptive regression forecasts were degraded by changes in the data behaviour in the fixed baseline period used to compute model coefficients. The mean-based criterion was less conclusive because of the effects of poor forecasts on a small number of calendar holidays. The Holt-Winters method was also most effective at removing serial autocorrelation, with most 1-day-lag autocorrelation coefficients below 0.15. The forecast methods were compared without tuning them to the behaviour of individual series. We achieved improved predictions with such tuning of the Holt-Winters method, but practical use of such improvements for routine surveillance will require reliable data classification methods.
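A hedged sketch of the third forecast method, additive Holt-Winters smoothing with one-step-ahead forecasts, together with the MedAPE criterion. The initialization of level, trend and seasonal states and the smoothing constants here are illustrative choices, not those of the paper:

```python
import statistics

def holt_winters_additive(xs, m, alpha=0.4, beta=0.1, gamma=0.3):
    """One-step-ahead additive Holt-Winters forecasts for xs[m:].
    Level/trend/seasonal states are initialized from the first two seasons
    (requires len(xs) >= 2*m)."""
    level = sum(xs[:m]) / m
    trend = (sum(xs[m:2 * m]) - sum(xs[:m])) / (m * m)
    season = [xs[i] - level for i in range(m)]
    forecasts = []
    for t in range(m, len(xs)):
        forecasts.append(level + trend + season[t % m])
        prev_level = level
        level = alpha * (xs[t] - season[t % m]) + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
        season[t % m] = gamma * (xs[t] - level) + (1 - gamma) * season[t % m]
    return forecasts

def medape(observed, forecast):
    """Median absolute percentage error of forecasts vs. observations."""
    return statistics.median(
        abs(o - f) / abs(o) * 100 for o, f in zip(observed, forecast)
    )
```

Subtracting `holt_winters_additive(xs, m)` from `xs[m:]` yields the residual series that the paper feeds into detection algorithms after trend and day-of-week effects have been removed.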
Palmprint Verification Using Time Series Method
Directory of Open Access Journals (Sweden)
A. A. Ketut Agung Cahyawan Wiranatha
2013-11-01
The use of biometrics for automatic recognition systems is growing rapidly as a solution to security problems, and the palmprint is one of the most frequently used biometrics. This paper uses a two-step center-of-mass moment method for region-of-interest (ROI) segmentation and applies the time series method combined with a block window method as the feature representation. Normalized Euclidean distance is used to measure the similarity of two palmprint feature vectors. System testing was done using 500 palm samples, with 4 samples as reference images and 6 samples as test images. Experimental results show that the system achieves high performance, with a success rate of about 97.33% (FNMR = 1.67%, FMR = 1.00%, T = 0.036).
Deconvolution of time series in the laboratory
John, Thomas; Pietschmann, Dirk; Becker, Volker; Wagner, Christian
2016-10-01
In this study, we present two practical applications of the deconvolution of time series in Fourier space. First, we reconstruct a filtered input signal of sound cards that has been heavily distorted by a built-in high-pass filter using a software approach. Using deconvolution, we can partially bypass the filter and extend the dynamic frequency range by two orders of magnitude. Second, we construct required input signals for a mechanical shaker in order to obtain arbitrary acceleration waveforms, referred to as feedforward control. For both situations, experimental and theoretical approaches are discussed to determine the system-dependent frequency response. Moreover, for the shaker, we propose a simple feedback loop as an extension to the feedforward control in order to handle nonlinearities of the system.
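The core idea, division of spectra in Fourier space with a small regularizer where the frequency response vanishes, can be sketched as follows. A naive O(n²) DFT is used for clarity, and the regularization constant `eps` is an assumption made here, not a detail from the paper:

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (O(n^2), for illustration)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    """Inverse DFT."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def deconvolve(measured, response, eps=1e-9):
    """Estimate the input signal by dividing spectra; eps regularizes
    bins where the frequency response is near zero (Wiener-like)."""
    M, H = dft(measured), dft(response)
    X = [m * h.conjugate() / (abs(h) ** 2 + eps) for m, h in zip(M, H)]
    return [v.real for v in idft(X)]
```

In practice the system's frequency response `response` is measured (or modeled) first, exactly as the paper does for the sound card filter and the mechanical shaker; the same division then either undoes the filter or pre-distorts the drive signal for feedforward control.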
Using entropy to cut complex time series
Mertens, David; Poncela Casasnovas, Julia; Spring, Bonnie; Amaral, L. A. N.
2013-03-01
Using techniques from statistical physics, physicists have modeled and analyzed human phenomena varying from academic citation rates to disease spreading to vehicular traffic jams. The last decade's explosion of digital information and the growing ubiquity of smartphones have led to a wealth of human self-reported data. This wealth of data comes at a cost, including non-uniform sampling and statistically significant but physically insignificant correlations. In this talk I present our work using entropy to identify stationary sub-sequences of self-reported human weight from a weight management web site. Our entropic approach, inspired by the Infomap network community detection algorithm, is far less biased by rare fluctuations than more traditional time series segmentation techniques. Supported by the Howard Hughes Medical Institute.
Normalizing the causality between time series
Liang, X. San
2015-08-01
Recently, a rigorous yet concise formula was derived to evaluate information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing a Lyapunov exponent-like, one-dimensional phase-space stretching rate and a noise-to-signal ratio from the rate of information flow in the balance of the marginal entropy evolution of the flow recipient. It is verified with autoregressive models and applied to a real financial analysis problem. An unusually strong one-way causality is identified from IBM (International Business Machines Corporation) to GE (General Electric Company) in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a giant for the mainframe computer market.
Phase correlation of foreign exchange time series
Wu, Ming-Chya
2007-03-01
Correlation of foreign exchange rates in currency markets is investigated based on the empirical data of USD/DEM and USD/JPY exchange rates for a period from February 1 1986 to December 31 1996. The return of exchange time series is first decomposed into a number of intrinsic mode functions (IMFs) by the empirical mode decomposition method. The instantaneous phases of the resultant IMFs calculated by the Hilbert transform are then used to characterize the behaviors of pricing transmissions, and the correlation is probed by measuring the phase differences between two IMFs in the same order. From the distribution of phase differences, our results show explicitly that the correlations are stronger in daily time scale than in longer time scales. The demonstration for the correlations in periods of 1986-1989 and 1990-1993 indicates two exchange rates in the former period were more correlated than in the latter period. The result is consistent with the observations from the cross-correlation calculation.
Costationarity of Locally Stationary Time Series Using costat
Cardinali, Alessandro; Nason, Guy P.
2013-01-01
This article describes the R package costat. This package enables a user to (i) perform a test for time series stationarity; (ii) compute and plot time-localized autocovariances; and (iii) determine and explore any costationary relationship between two locally stationary time series. Two locally stationary time series are said to be costationary if there exist two time-varying combination functions such that the linear combination of the two series with the functions produces another time...
Fisher information framework for time series modeling
Venkatesan, R. C.; Plastino, A.
2017-08-01
A robust prediction model invoking the Takens embedding theorem, whose working hypothesis is obtained via an inference procedure based on the minimum Fisher information principle, is presented. The coefficients of the ansatz, central to the working hypothesis, satisfy a time-independent Schrödinger-like equation in a vector setting. The inference of (i) the probability density function of the coefficients of the working hypothesis and (ii) the constraint-driven pseudo-inverse condition for the modeling phase of the prediction scheme is made, for the case of normal distributions, with the aid of the quantum mechanical virial theorem. The well-known reciprocity relations and the associated Legendre transform structure for the Fisher information measure (FIM, hereafter)-based model in a vector setting (with least squares constraints) are self-consistently derived. These relations are demonstrated to yield an intriguing form of the FIM for the modeling phase, which defines the working hypothesis solely in terms of the observed data. Prediction cases employ time series obtained from: (i) the Mackey-Glass delay-differential equation, (ii) an ECG signal from the MIT-Beth Israel Deaconess Hospital (MIT-BIH) cardiac arrhythmia database, and (iii) an ECG signal from the Creighton University ventricular tachyarrhythmia database. The ECG samples were obtained from the PhysioNet online repository. These examples demonstrate the efficiency of the prediction model. Numerical examples for exemplary cases are provided.
Time series modeling for syndromic surveillance
Directory of Open Access Journals (Sweden)
Mandl Kenneth D
2003-01-01
Background: Emergency department (ED)-based syndromic surveillance systems identify abnormally high visit rates that may be an early signal of a bioterrorist attack. For example, an anthrax outbreak might first be detectable as an unusual increase in the number of patients reporting to the ED with respiratory symptoms. Reliably identifying these abnormal visit patterns requires a good understanding of the normal patterns of healthcare usage. Unfortunately, systematic methods for determining the expected number of ED visits on a particular day have not yet been well established. We present here a generalized methodology for developing models of expected ED visit rates. Methods: Using time-series methods, we developed robust models of ED utilization for the purpose of defining expected visit rates. The models were based on nearly a decade of historical data at a major metropolitan academic tertiary-care pediatric emergency department. The historical data were fit using trimmed-mean seasonal models, and additional models were fit with autoregressive integrated moving average (ARIMA) residuals to account for recent trends in the data. The detection capabilities of the model were tested with simulated outbreaks. Results: Models were built both for overall visits and for respiratory-related visits, classified according to the chief complaint recorded at the beginning of each visit. The mean absolute percentage error of the ARIMA models was 9.37% for overall visits and 27.54% for respiratory visits. A simple detection system based on the ARIMA model of overall visits was able to detect 7-day-long simulated outbreaks of 30 visits per day with 100% sensitivity and 97% specificity. Sensitivity decreased with outbreak size, dropping to 94% for outbreaks of 20 visits per day and 57% for 10 visits per day, all while maintaining a 97% benchmark specificity. Conclusions: Time series methods applied to historical ED utilization data are an important tool
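The trimmed-mean seasonal baseline can be sketched as follows. This is a simplified day-of-week model with an illustrative alarm rule; the ARIMA residual models, thresholds and outbreak simulations of the paper are not reproduced:

```python
def trimmed_mean(values, trim=0.1):
    """Mean after dropping the lowest and highest `trim` fraction."""
    s = sorted(values)
    k = int(len(s) * trim)
    core = s[k:len(s) - k] if len(s) > 2 * k else s
    return sum(core) / len(core)

def expected_visits(history, period=7, trim=0.1):
    """Expected ED visit count for each position in a weekly cycle,
    using a trimmed mean over historical counts at that position."""
    buckets = [[] for _ in range(period)]
    for t, count in enumerate(history):
        buckets[t % period].append(count)
    return [trimmed_mean(b, trim) for b in buckets]

def alarm(observed, expected, ratio=1.5):
    """Illustrative detection rule: flag a day whose count exceeds
    the seasonal baseline by the factor `ratio`."""
    return observed > ratio * expected
```

In the paper's two-stage design, the trimmed-mean seasonal model supplies the expected rates, and an ARIMA model of the residuals then captures recent trends before detection.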
Climate Prediction Center (CPC) Global Temperature Time Series
National Oceanic and Atmospheric Administration, Department of Commerce — The global temperature time series provides time series charts using station based observations of daily temperature. These charts provide information about the...
Climate Prediction Center (CPC) Global Precipitation Time Series
National Oceanic and Atmospheric Administration, Department of Commerce — The global precipitation time series provides time series charts showing observations of daily precipitation as well as accumulated precipitation compared to normal...
Foundations of Sequence-to-Sequence Modeling for Time Series
Kuznetsov, Vitaly; Mariet, Zelda
2018-01-01
The availability of large amounts of time series data, paired with the performance of deep-learning algorithms on a broad class of problems, has recently led to significant interest in the use of sequence-to-sequence models for time series forecasting. We provide the first theoretical analysis of this time series forecasting framework. We include a comparison of sequence-to-sequence modeling to classical time series models, and as such our theory can serve as a quantitative guide for practitioners…
Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong; Ding, Yinghui
2014-07-01
The linear regression parameters between two time series can be different under different lengths of observation period. If we study the whole period by the sliding window of a short period, the change of the linear regression parameters is a process of dynamic transmission over time. We tackle fundamental research that presents a simple and efficient computational scheme: a linear regression patterns transmission algorithm, which transforms linear regression patterns into directed and weighted networks. The linear regression patterns (nodes) are defined by the combination of intervals of the linear regression parameters and the results of the significance testing under different sizes of the sliding window. The transmissions between adjacent patterns are defined as edges, and the weights of the edges are the frequency of the transmissions. The major patterns, the distance, and the medium in the process of the transmission can be captured. The statistical results of weighted out-degree and betweenness centrality are mapped on timelines, which shows the features of the distribution of the results. Many measurements in different areas that involve two related time series variables could take advantage of this algorithm to characterize the dynamic relationships between the time series from a new perspective.
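A minimal version of the pattern-transmission construction might look like this; the slope bins, window size, and the omission of the significance test and intercept patterns are all simplifications of the algorithm described, and the names are invented here:

```python
import bisect

def window_slope(x, y):
    """OLS slope of y regressed on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return sxy / sxx

def pattern_network(x, y, window=5, bins=(-0.5, 0.0, 0.5)):
    """Label each sliding window by the bin its regression slope falls
    into, then count transitions between adjacent labels as the weighted,
    directed edges of a pattern-transmission network."""
    labels = []
    for s in range(len(x) - window + 1):
        labels.append(bisect.bisect(bins, window_slope(x[s:s + window],
                                                       y[s:s + window])))
    edges = {}
    for u, v in zip(labels, labels[1:]):
        edges[(u, v)] = edges.get((u, v), 0) + 1
    return edges
```

From the resulting weighted digraph one can then compute the out-degree and betweenness statistics that the paper maps onto timelines.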
Effectiveness of firefly algorithm based neural network in time series ...
African Journals Online (AJOL)
Effectiveness of firefly algorithm based neural network in time series forecasting. ... In the experiments, three well known time series were used to evaluate the performance. Results obtained were compared with ... Keywords: Time series, Artificial Neural Network, Firefly Algorithm, Particle Swarm Optimization, Overfitting ...
Time Series Observations in the North Indian Ocean
Digital Repository Service at National Institute of Oceanography (India)
Shenoy, D.M.; Naik, H.; Kurian, S.; Naqvi, S.W.A.; Khare, N.
Ocean and the ongoing time series study (Candolim Time Series; CaTS) off Goa. In addition, this article also focuses on the new time series initiative in the Arabian Sea and the Bay of Bengal under Sustained Indian Ocean Biogeochemistry and Ecosystem...
Modeling of Volatility with Non-linear Time Series Model
Kim Song Yon; Kim Mun Chol
2013-01-01
In this paper, non-linear time series models are used to describe volatility in financial time series data. To describe volatility, two non-linear time series models are combined to form a TAR (threshold autoregressive) model with an AARCH (asymmetric autoregressive conditional heteroskedasticity) error term, and its parameter estimation is studied.
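A two-regime TAR(1) process can be simulated in a few lines; note that the AARCH error term is replaced here by i.i.d. Gaussian noise for brevity, so this is only a sketch of the threshold mechanism, not the paper's full model:

```python
import random

def simulate_tar1(n, threshold=0.0, phi_low=0.8, phi_high=-0.5,
                  sigma=1.0, seed=42):
    """Simulate a two-regime TAR(1): the AR coefficient switches
    depending on whether the previous value is below or above the
    threshold. Noise is i.i.d. Gaussian (AARCH errors omitted)."""
    rng = random.Random(seed)
    x = [0.0]
    for _ in range(n - 1):
        phi = phi_low if x[-1] <= threshold else phi_high
        x.append(phi * x[-1] + rng.gauss(0.0, sigma))
    return x
```

The regime switch makes the response to shocks asymmetric around the threshold, which is the qualitative behavior the TAR component contributes to the volatility model.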
Hidden Markov Models for Time Series An Introduction Using R
Zucchini, Walter
2009-01-01
Illustrates the flexibility of HMMs as general-purpose models for time series data. This work presents an overview of HMMs for analyzing time series data, from continuous-valued, circular, and multivariate series to binary data, bounded and unbounded counts and categorical observations.
Efficient Algorithms for Segmentation of Item-Set Time Series
Chundi, Parvathi; Rosenkrantz, Daniel J.
We propose a special type of time series, which we call an item-set time series, to facilitate the temporal analysis of software version histories, email logs, stock market data, etc. In an item-set time series, each observed data value is a set of discrete items. We formalize the concept of an item-set time series and present efficient algorithms for segmenting a given item-set time series. Segmentation of a time series partitions the time series into a sequence of segments where each segment is constructed by combining consecutive time points of the time series. Each segment is associated with an item set that is computed from the item sets of the time points in that segment, using a function which we call a measure function. We then define a concept called the segment difference, which measures the difference between the item set of a segment and the item sets of the time points in that segment. The segment difference values are required to construct an optimal segmentation of the time series. We describe novel and efficient algorithms to compute segment difference values for each of the measure functions described in the paper. We outline a dynamic programming based scheme to construct an optimal segmentation of the given item-set time series. We use the item-set time series segmentation techniques to analyze the temporal content of three different data sets—Enron email, stock market data, and a synthetic data set. The experimental results show that an optimal segmentation of item-set time series data captures much more temporal content than a segmentation constructed based on the number of time points in each segment, without examining the item set data at the time points, and can be used to analyze different types of temporal data.
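The dynamic-programming scheme can be sketched as follows; the intersection measure function and symmetric-difference cost are one concrete choice among those the paper allows, and this O(n²·k) sketch fixes the number of segments k rather than reproducing the authors' algorithms:

```python
def segment_difference(sets, measure=lambda ss: set.intersection(*ss)):
    """Difference between a segment's item set (here computed with the
    intersection measure function) and the item sets of its time points,
    measured by symmetric difference."""
    seg_set = measure(sets)
    return sum(len(s ^ seg_set) for s in sets)

def optimal_segmentation(series, k):
    """Split `series` (a list of item sets) into k contiguous segments
    minimizing the total segment difference, by dynamic programming."""
    n = len(series)
    INF = float("inf")
    cost = [[0] * n for _ in range(n)]          # cost[i][j]: segment i..j
    for i in range(n):
        for j in range(i, n):
            cost[i][j] = segment_difference(series[i:j + 1])
    best = [[INF] * (k + 1) for _ in range(n + 1)]
    cut = [[0] * (k + 1) for _ in range(n + 1)]
    best[0][0] = 0
    for j in range(1, n + 1):
        for seg in range(1, k + 1):
            for i in range(seg - 1, j):
                c = best[i][seg - 1] + cost[i][j - 1]
                if c < best[j][seg]:
                    best[j][seg], cut[j][seg] = c, i
    bounds, j = [], n                            # recover segment bounds
    for seg in range(k, 0, -1):
        i = cut[j][seg]
        bounds.append((i, j))
        j = i
    return best[n][k], bounds[::-1]
```

On a toy series where the item sets change abruptly, the optimal cut falls exactly at the change point, illustrating how an optimal segmentation tracks temporal content rather than segment length.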
Nwaogu, Chima J.; Dietz, Maurine W.; Tieleman, B. Irene; Cresswell, Will
Birds should store body reserves if starvation risk is anticipated; this is known as an ‘interrupted foraging response’. If foraging remains unrestricted, however, body mass should remain low to limit the predation risk that gaining and carrying body reserves entails. In temperate environments mass
TIME SERIES ANALYSIS USING A UNIQUE MODEL OF TRANSFORMATION
Directory of Open Access Journals (Sweden)
Goran Klepac
2007-12-01
Full Text Available REFII1 model is an authorial mathematical model for time series data mining. The main purpose of that model is to automate time series analysis, through a unique transformation model of time series. An advantage of this approach of time series analysis is the linkage of different methods for time series analysis, linking traditional data mining tools in time series, and constructing new algorithms for analyzing time series. It is worth mentioning that REFII model is not a closed system, which means that we have a finite set of methods. At first, this is a model for transformation of values of time series, which prepares data used by different sets of methods based on the same model of transformation in a domain of problem space. REFII model gives a new approach in time series analysis based on a unique model of transformation, which is a base for all kind of time series analysis. The advantage of REFII model is its possible application in many different areas such as finance, medicine, voice recognition, face recognition and text mining.
An Energy-Based Similarity Measure for Time Series
Directory of Open Access Journals (Sweden)
Pierre Brunagel
2007-11-01
Full Text Available A new similarity measure for time series analysis, called SimilB, based on the cross-ΨB-energy operator (2004), is introduced. ΨB is a nonlinear measure which quantifies the interaction between two time series. Compared to the Euclidean distance (ED) or the Pearson correlation coefficient (CC), SimilB includes the temporal information and relative changes of the time series by using their first and second derivatives. SimilB is well suited for both nonstationary and stationary time series, particularly those presenting discontinuities. Some new properties of ΨB are presented. In particular, we show that ΨB as a similarity measure is robust to both scale and time shift. SimilB is illustrated with synthetic time series and an artificial dataset and compared to the CC and ED measures.
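The key idea, combining the raw signals with their first and second derivatives, can be sketched with a simple normalized similarity. This is not the exact SimilB operator (which is built on the cross-ΨB-energy operator); it is a hedged illustration of why derivative terms make a similarity sensitive to relative changes, not just amplitudes.

```python
def diffs(x):
    # discrete first difference, standing in for the derivative
    return [b - a for a, b in zip(x, x[1:])]

def cross_energy(x, y):
    # illustrative stand-in for a cross-energy term: the inner product
    return sum(a * b for a, b in zip(x, y))

def simil_sketch(x, y):
    """Similarity combining the raw signals with their first and second
    differences, loosely in the spirit of SimilB (not the exact operator).
    Returns 1.0 for identical (or uniformly rescaled) series."""
    num = 0.0
    den = 0.0
    for u, v in ((x, y),
                 (diffs(x), diffs(y)),
                 (diffs(diffs(x)), diffs(diffs(y)))):
        num += cross_energy(u, v)
        den += (cross_energy(u, u) * cross_energy(v, v)) ** 0.5
    return num / den if den else 0.0
```

Because each term is normalized by the geometric mean of the self-energies, the sketch is scale-robust, mirroring the scale-robustness property the abstract claims for ΨB.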
Time-series prediction and applications a machine intelligence approach
Konar, Amit
2017-01-01
This book presents machine learning and type-2 fuzzy sets for the prediction of time series, with a particular focus on business forecasting applications. It also proposes new uncertainty management techniques for economic time series, using type-2 fuzzy sets to predict the series at a given time point from its preceding value in fluctuating business environments. It employs machine learning to determine repetitively occurring similar structural patterns in the time series and uses a stochastic automaton to predict the most probable structure at a given partition of the time series. Such predictions help in determining probabilistic moves in a stock index time series. Primarily written for graduate students and researchers in computer science, the book is equally useful for researchers/professionals in business intelligence and stock index prediction. A background of undergraduate level mathematics is presumed, although not mandatory, for most of the sections. Exercises with tips are provided at...
Effects of Interruptibility-Aware Robot Behavior
Banerjee, Siddhartha; Silva, Andrew; Feigh, Karen; Chernova, Sonia
2018-01-01
As robots become increasingly prevalent in human environments, there will inevitably be times when a robot needs to interrupt a human to initiate an interaction. Our work introduces the first interruptibility-aware mobile robot system, and evaluates the effects of interruptibility-awareness on human task performance, robot task performance, and on human interpretation of the robot's social aptitude. Our results show that our robot is effective at predicting interruptibility at high accuracy, ...
Directory of Open Access Journals (Sweden)
Cynthia Firnhaber
Full Text Available The clinical outcomes of short interruptions of PI-based ART regimens remain undefined. A 2-arm non-inferiority trial was conducted on 53 HIV-1-infected South African participants with viral load 450 cells/µl on stavudine (or zidovudine), lamivudine and lopinavir/ritonavir. Subjects were randomized to (a) sequential 2-, 4- and 8-week ART interruptions or (b) continuous ART (cART). The primary analysis was based on the proportion of CD4 counts >350 cells/µl over 72 weeks. Adherence, HIV-1 drug resistance, and CD4 count rise over time were analyzed as secondary endpoints. The proportions of CD4 counts >350 cells/µl were 82.12% for the intermittent arm and 93.73% for the cART arm; the difference of 11.95% was above the defined 10% threshold for non-inferiority (upper limit of 97.5% CI, 24.1%; 2-sided CI: -0.16, 23.1). No clinically significant differences in opportunistic infections, adverse events, adherence or viral resistance were noted; after randomization, a long-term CD4 rise was observed only in the cART arm. We are unable to conclude that short PI-based ART interruptions are non-inferior to cART in retention of immune reconstitution; however, short interruptions did not lead to a greater rate of resistance mutations or adverse events than cART, suggesting that this regimen may be more forgiving than NNRTIs if interruptions in therapy occur. ClinicalTrials.gov NCT00100646.
Vector bilinear autoregressive time series model and its superiority ...
African Journals Online (AJOL)
In this research, a vector bilinear autoregressive time series model was proposed and used to model three revenue series (X1, X2, X3) . The “orders” of the three series were identified on the basis of the distribution of autocorrelation and partial autocorrelation functions and were used to construct the vector bilinear models.
A novel weight determination method for time series data aggregation
Xu, Paiheng; Zhang, Rong; Deng, Yong
2017-09-01
Aggregation in time series is of great importance in time series smoothing, prediction and other time series analysis processes, which makes it crucial to determine the weights in time series correctly and reasonably. In this paper, a novel method to obtain the weights in time series is proposed, in which we adopt the induced ordered weighted aggregation (IOWA) operator and the visibility graph averaging (VGA) operator and linearly combine the weights separately generated by the two operators. The IOWA operator is introduced to the weight determination of time series, through which the time decay factor is taken into consideration. The VGA operator generates weights with respect to the degree distribution in the visibility graph constructed from the corresponding time series, which reflects the relative importance of vertices in the time series. The proposed method is applied to two practical datasets to illustrate its merits. The aggregation of the Construction Cost Index (CCI) demonstrates the ability of the proposed method to smooth time series, while the aggregation of the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) illustrates how the proposed method maintains the variation tendency of the original data.
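The VGA half of the method can be illustrated concretely: build the natural visibility graph of a series and normalize vertex degrees into weights. This is a sketch of the idea only; the paper's full method also applies the IOWA operator and a linear combination, which are omitted here.

```python
def visibility_degrees(x):
    """Degree of each point in the natural visibility graph of series x:
    points i < j are linked if every point k between them lies strictly
    below the line segment joining (i, x[i]) and (j, x[j])."""
    n = len(x)
    deg = [0] * n
    for i in range(n):
        for j in range(i + 1, n):
            if all(x[k] < x[j] + (x[i] - x[j]) * (j - k) / (j - i)
                   for k in range(i + 1, j)):
                deg[i] += 1
                deg[j] += 1
    return deg

def vga_weights(x):
    # normalize degrees into aggregation weights: better-connected
    # (structurally more important) points receive larger weights
    deg = visibility_degrees(x)
    total = sum(deg)
    return [d / total for d in deg]
```

On a short monotone series such as `[1, 2, 3]`, only adjacent points see each other, so the middle point (degree 2) gets twice the weight of the endpoints.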
Capturing Structure Implicitly from Time-Series having Limited Data
Emaasit, Daniel; Johnson, Matthew
2018-01-01
Scientific fields such as insider-threat detection and highway-safety planning often lack sufficient amounts of time-series data to estimate statistical models for the purpose of scientific discovery. Moreover, the available limited data are quite noisy. This presents a major challenge when estimating time-series models that are robust to overfitting and have well-calibrated uncertainty estimates. Most of the current literature in these fields involve visualizing the time-series for noticeabl...
Sound, memory and interruption
DEFF Research Database (Denmark)
Pinder, David
2016-01-01
This chapter considers how art can interrupt the times and spaces of urban development so they might be imagined, experienced and understood differently. It focuses on the construction of the M11 Link Road through north-east London during the 1990s that demolished hundreds of homes and displaced...... around a thousand people. The highway was strongly resisted and it became the site of one of the country’s longest and largest anti-road struggles. The chapter addresses specifically Graeme Miller’s sound walk LINKED (2003), which for more than a decade has been broadcasting memories and stories...... of people who were violently displaced by the road as well as those who actively sought to halt it. Attention is given to the walk’s interruption of senses of the given and inevitable in two main ways. The first is in relation to the pace of the work and its deployment of slowness and arrest in a context...
Mathematical foundations of time series analysis a concise introduction
Beran, Jan
2017-01-01
This book provides a concise introduction to the mathematical foundations of time series analysis, with an emphasis on mathematical clarity. The text is reduced to the essential logical core, mostly using the symbolic language of mathematics, thus enabling readers to very quickly grasp the essential reasoning behind time series analysis. It appeals to anybody wanting to understand time series in a precise, mathematical manner. It is suitable for graduate courses in time series analysis but is equally useful as a reference work for students and researchers alike.
Trend time-series modeling and forecasting with neural networks.
Qi, Min; Zhang, G Peter
2008-05-01
Despite its great importance, there has been no general consensus on how to model the trends in time-series data. Compared to traditional approaches, neural networks (NNs) have shown some promise in time-series forecasting. This paper investigates how to best model trend time series using NNs. Four different strategies (raw data, raw data with time index, detrending, and differencing) are used to model various trend patterns (linear, nonlinear, deterministic, stochastic, and breaking trend). We find that with NNs differencing often gives meritorious results regardless of the underlying data generating processes (DGPs). This finding is also confirmed by the real gross national product (GNP) series.
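Of the four strategies compared, differencing is the one the authors find most robust. A minimal sketch of differencing and its inversion for one-step-ahead forecasting (illustrative pre-/post-processing only, not the paper's NN pipeline):

```python
def difference(x):
    # first differences remove a (locally) linear trend from the series
    return [b - a for a, b in zip(x, x[1:])]

def undifference(last_level, dx):
    # invert differencing: cumulate predicted changes on top of the
    # last observed level to recover forecasts on the original scale
    out, level = [], last_level
    for d in dx:
        level += d
        out.append(level)
    return out
```

For a linear-trend series like `[1, 3, 5, 7]`, the differenced series is constant (`[2, 2, 2]`); any model (an NN in the paper, a simple mean here) predicts the next difference, and undifferencing turns that into a level forecast of 9.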
Time series analysis in the social sciences the fundamentals
Shin, Youseop
2017-01-01
Time Series Analysis in the Social Sciences is a practical and highly readable introduction written exclusively for students and researchers whose mathematical background is limited to basic algebra. The book focuses on fundamental elements of time series analysis that social scientists need to understand so they can employ time series analysis for their research and practice. Through step-by-step explanations and using monthly violent crime rates as case studies, this book explains univariate time series from the preliminary visual analysis through the modeling of seasonality, trends, and re
Stochastic time series analysis of hydrology data for water resources
Sathish, S.; Khadar Babu, S. K.
2017-11-01
This article addresses stochastic time series analysis in hydrology, with a focus on seasonality. Different statistical tests for predicting hydrologic time series with the Thomas-Fiering model are considered. Hydrologic time series of flood flows have received a great deal of attention worldwide, and interest in stochastic time series analysis methods is expanding with growing concerns about seasonal periods and global warming. A recent trend among researchers is to test for seasonal periods in hydrologic flow series using stochastic processes based on the Thomas-Fiering model. The present article proposes to predict seasonal periods in hydrology using the Thomas-Fiering model.
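The Thomas-Fiering recursion generates a synthetic flow for each month from the previous month's flow plus a random innovation. The sketch below uses the standard (non-transformed) form of the model; the monthly statistics passed in are hypothetical inputs, and real applications often add a log or skewness transformation.

```python
import math
import random

def thomas_fiering(means, sds, lag1_corr, n_years, seed=0):
    """Generate synthetic monthly flows with the standard Thomas-Fiering
    recursion; `means`, `sds`, `lag1_corr` are 12-element lists of monthly
    means, standard deviations, and month-to-next-month lag-1 correlations."""
    rng = random.Random(seed)
    flows = [means[0]]  # start the series at the first month's mean
    for t in range(1, 12 * n_years):
        m, m_next = (t - 1) % 12, t % 12
        b = lag1_corr[m] * sds[m_next] / sds[m]  # regression coefficient
        eps = rng.gauss(0.0, 1.0)
        q = (means[m_next]
             + b * (flows[-1] - means[m])
             + eps * sds[m_next] * math.sqrt(1.0 - lag1_corr[m] ** 2))
        flows.append(q)
    return flows
```

With lag-1 correlations of exactly 1 the innovation term vanishes and the recursion reproduces the monthly means plus the carried-over anomaly, which is a useful sanity check.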
Interpretable Early Classification of Multivariate Time Series
Ghalwash, Mohamed F.
2013-01-01
Recent advances in technology have led to an explosion in data collection over time rather than in a single snapshot. For example, microarray technology allows us to measure gene expression levels in different conditions over time. Such temporal data grants the opportunity for data miners to develop algorithms to address domain-related problems,…
Studies on time series applications in environmental sciences
Bărbulescu, Alina
2016-01-01
Time series analysis and modelling represent a large study field, implying the approach from the perspective of the time and frequency, with applications in different domains. Modelling hydro-meteorological time series is difficult due to the characteristics of these series, as long range dependence, spatial dependence, the correlation with other series. Continuous spatial data plays an important role in planning, risk assessment and decision making in environmental management. In this context, in this book we present various statistical tests and modelling techniques used for time series analysis, as well as applications to hydro-meteorological series from Dobrogea, a region situated in the south-eastern part of Romania, less studied till now. Part of the results are accompanied by their R code. .
DTW-APPROACH FOR UNCORRELATED MULTIVARIATE TIME SERIES IMPUTATION
Phan , Thi-Thu-Hong; Poisson Caillault , Emilie; Bigand , André; Lefebvre , Alain
2017-01-01
International audience; Missing data are inevitable in almost all domains of applied science. Data analysis with missing values can lead to a loss of efficiency and to unreliable results, especially for large missing sub-sequence(s). Some well-known methods for multivariate time series imputation require high correlations between series or their features. In this paper, we propose an approach based on the shape-behaviour relation in low- or un-correlated multivariate time series, under an assumption of...
Two-fractal overlap time series: Earthquakes and market crashes
Indian Academy of Sciences (India)
velocity over the other and time series of stock prices. An anticipation method for some of the crashes has been proposed here, based on these observations. Keywords. Cantor set; time series; earthquake; market crash. PACS Nos 05.00; 02.50.-r; 64.60; 89.65.Gh; 95.75.Wx. 1. Introduction. Capturing dynamical patterns of ...
Metagenomics meets time series analysis: unraveling microbial community dynamics
Faust, K.; Lahti, L.M.; Gonze, D.; Vos, de W.M.; Raes, J.
2015-01-01
The recent increase in the number of microbial time series studies offers new insights into the stability and dynamics of microbial communities, from the world's oceans to human microbiota. Dedicated time series analysis tools allow taking full advantage of these data. Such tools can reveal periodic
forecasting with nonlinear time series model: a monte-carlo
African Journals Online (AJOL)
PUBLICATIONS1
erated recursively up to any step greater than one. For nonlinear time series model, point forecast for step one can be done easily like in the linear case but forecast for a step greater than or equal to ..... London. Franses, P. H. (1998). Time series models for business and Economic forecasting, Cam- bridge University press.
Critical values for unit root tests in seasonal time series
Ph.H.B.F. Franses (Philip Hans); B. Hobijn (Bart)
1997-01-01
In this paper, we present tables with critical values for a variety of tests for seasonal and non-seasonal unit roots in seasonal time series. We consider (extensions of) the Hylleberg et al. and Osborn et al. test procedures. These extensions concern time series with increasing seasonal
Measurements of spatial population synchrony: influence of time series transformations.
Chevalier, Mathieu; Laffaille, Pascal; Ferdy, Jean-Baptiste; Grenouillet, Gaël
2015-09-01
Two mechanisms have been proposed to explain spatial population synchrony: dispersal among populations, and the spatial correlation of density-independent factors (the "Moran effect"). To identify which of these two mechanisms is driving spatial population synchrony, time series transformations (TSTs) of abundance data have been used to remove the signature of one mechanism, and highlight the effect of the other. However, several issues with TSTs remain, and to date no consensus has emerged about how population time series should be handled in synchrony studies. Here, by using 3131 time series involving 34 fish species found in French rivers, we computed several metrics commonly used in synchrony studies to determine whether a large-scale climatic factor (temperature) influenced fish population dynamics at the regional scale, and to test the effect of three commonly used TSTs (detrending, prewhitening and a combination of both) on these metrics. We also tested whether the influence of TSTs on time series and population synchrony levels was related to the features of the time series using both empirical and simulated time series. For several species, and regardless of the TST used, we evidenced a Moran effect on freshwater fish populations. However, these results were globally biased downward by TSTs which reduced our ability to detect significant signals. Depending on the species and the features of the time series, we found that TSTs could lead to contradictory results, regardless of the metric considered. Finally, we suggest guidelines on how population time series should be processed in synchrony studies.
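Two of the time series transformations (TSTs) the study compares, detrending and prewhitening, can be sketched directly. These are generic textbook versions (least-squares linear detrend; AR(1) prewhitening filter), offered as an illustration of what the study's transformations do, not as its exact code.

```python
def detrend(x):
    """Remove a least-squares linear trend from the series."""
    n = len(x)
    t_mean = (n - 1) / 2
    x_mean = sum(x) / n
    cov = sum((t - t_mean) * (v - x_mean) for t, v in enumerate(x))
    var = sum((t - t_mean) ** 2 for t in range(n))
    slope = cov / var
    return [v - (x_mean + slope * (t - t_mean)) for t, v in enumerate(x)]

def prewhiten_ar1(x):
    """Remove lag-1 autocorrelation by filtering with an estimated AR(1)
    coefficient, leaving approximately white residuals (one value shorter)."""
    m = sum(x) / len(x)
    c = [v - m for v in x]
    rho = sum(a * b for a, b in zip(c, c[1:])) / sum(v * v for v in c)
    return [c[t] - rho * c[t - 1] for t in range(1, len(c))]
```

The study's point is that each transformation removes the signature of one synchronizing mechanism: after detrending, a shared trend can no longer drive apparent synchrony; after prewhitening, shared autocorrelation cannot.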
Transition Icons for Time-Series Visualization and Exploratory Analysis.
Nickerson, Paul V; Baharloo, Raheleh; Wanigatunga, Amal A; Manini, Todd M; Tighe, Patrick J; Rashidi, Parisa
2018-03-01
The modern healthcare landscape has seen the rapid emergence of techniques and devices that temporally monitor and record physiological signals. The prevalence of time-series data within the healthcare field necessitates the development of methods that can analyze the data in order to draw meaningful conclusions. Time-series behavior is notoriously difficult to intuitively understand due to its intrinsic high-dimensionality, which is compounded in the case of analyzing groups of time series collected from different patients. Our framework, which we call transition icons, renders common patterns in a visual format useful for understanding the shared behavior within groups of time series. Transition icons are adept at detecting and displaying subtle differences and similarities, e.g., between measurements taken from patients receiving different treatment strategies or stratified by demographics. We introduce various methods that collectively allow for exploratory analysis of groups of time series, while being free of distribution assumptions and including simple heuristics for parameter determination. Our technique extracts discrete transition patterns from symbolic aggregate approXimation representations, and compiles transition frequencies into a bag of patterns constructed for each group. These transition frequencies are normalized and aligned in icon form to intuitively display the underlying patterns. We demonstrate the transition icon technique for two time-series datasets-postoperative pain scores, and hip-worn accelerometer activity counts. We believe transition icons can be an important tool for researchers approaching time-series data, as they give rich and intuitive information about collective time-series behaviors.
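The pipeline described (symbolic aggregate approximation, then a bag of transition frequencies per group) can be sketched in a few lines. The three-symbol breakpoints below are the standard equiprobable Gaussian cut points used in SAX; everything else is a simplified stand-in for the transition-icon construction, not the authors' implementation.

```python
import statistics

def sax(x, breakpoints=(-0.43, 0.43)):
    """z-normalize a series and discretize it into symbols 0..len(breakpoints);
    the default breakpoints give three roughly equiprobable Gaussian bins."""
    mu, sd = statistics.mean(x), statistics.pstdev(x) or 1.0
    z = [(v - mu) / sd for v in x]
    return [sum(v > b for b in breakpoints) for v in z]

def transition_counts(symbols, n_symbols=3):
    # bag of symbol-to-symbol transition frequencies: the raw material
    # that a transition icon renders visually for a group of series
    counts = [[0] * n_symbols for _ in range(n_symbols)]
    for a, b in zip(symbols, symbols[1:]):
        counts[a][b] += 1
    return counts
```

Normalizing and aligning such matrices across groups (e.g. treatment arms) is what lets the icons surface subtle differences in collective time-series behavior.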
Time Series Econometrics for the 21st Century
Hansen, Bruce E.
2017-01-01
The field of econometrics largely started with time series analysis because many early datasets were time-series macroeconomic data. As the field developed, more cross-sectional and longitudinal datasets were collected, which today dominate the majority of academic empirical research. In nonacademic (private sector, central bank, and governmental)…
The Prediction of Teacher Turnover Employing Time Series Analysis.
Costa, Crist H.
The purpose of this study was to combine knowledge of teacher demographic data with time-series forecasting methods to predict teacher turnover. Moving averages and exponential smoothing were used to forecast discrete time series. The study used data collected from the 22 largest school districts in Iowa, designated as FACT schools. Predictions…
Small Sample Properties of Bayesian Multivariate Autoregressive Time Series Models
Price, Larry R.
2012-01-01
The aim of this study was to compare the small sample (N = 1, 3, 5, 10, 15) performance of a Bayesian multivariate vector autoregressive (BVAR-SEM) time series model relative to frequentist power and parameter estimation bias. A multivariate autoregressive model was developed based on correlated autoregressive time series vectors of varying…
Time series forecasting based on deep extreme learning machine
Guo, Xuqi; Pang, Y.; Yan, Gaowei; Qiao, Tiezhu; Yang, Guang-Hong; Yang, Dan
2017-01-01
Multi-layer Artificial Neural Networks (ANN) has caught widespread attention as a new method for time series forecasting due to the ability of approximating any nonlinear function. In this paper, a new local time series prediction model is established with the nearest neighbor domain theory, in
Parameterizing unconditional skewness in models for financial time series
DEFF Research Database (Denmark)
He, Changli; Silvennoinen, Annastiina; Teräsvirta, Timo
In this paper we consider the third-moment structure of a class of time series models. It is often argued that the marginal distribution of financial time series such as returns is skewed. Therefore it is of importance to know what properties a model should possess if it is to accommodate...
Robust Forecasting of Non-Stationary Time Series
Croux, C.; Fried, R.; Gijbels, I.; Mahieu, K.
2010-01-01
This paper proposes a robust forecasting method for non-stationary time series. The time series is modelled using non-parametric heteroscedastic regression, and fitted by a localized MM-estimator, combining high robustness and large efficiency. The proposed method is shown to produce reliable
Efficient use of correlation entropy for analysing time series data
Indian Academy of Sciences (India)
Abstract. The correlation dimension D2 and correlation entropy K2 are both important quantifiers in nonlinear time series analysis. However, use of D2 has been more common compared to K2 as a discriminating measure. One reason for this is that D2 is a static measure and can be easily evaluated from a time series.
Time series prediction of apple scab using meteorological ...
African Journals Online (AJOL)
A new prediction model for the early warning of apple scab is proposed in this study. The method is based on artificial intelligence and time series prediction. The infection period of apple scab was evaluated as the time series prediction model instead of summation of wetness duration. Also, the relations of different ...
A Dynamic Fuzzy Cluster Algorithm for Time Series
Directory of Open Access Journals (Sweden)
Min Ji
2013-01-01
Full Text Available This paper addresses clustering time series by introducing the definition of key point and improving the FCM algorithm. The proposed algorithm works by determining those time series whose class labels are vague and further partitioning them into different clusters over time. The main advantage of this approach compared with other existing algorithms is that the property of some time series belonging to different clusters over time can be partially revealed. Results from simulation-based experiments on geographical data demonstrate excellent performance, and the desired results have been obtained. The proposed algorithm can be applied to solve other clustering problems in data mining.
Variable Selection in Time Series Forecasting Using Random Forests
Directory of Open Access Journals (Sweden)
Hristos Tyralis
2017-10-01
Full Text Available Time series forecasting using machine learning algorithms has gained popularity recently. Random forest is a machine learning algorithm that has been applied to time series forecasting; however, most of its forecasting properties have remained unexplored. Here we focus on assessing the performance of random forests in one-step forecasting using two large datasets of short time series, with the aim of suggesting an optimal set of predictor variables. Furthermore, we compare its performance to benchmarking methods. The first dataset is composed of 16,000 simulated time series from a variety of Autoregressive Fractionally Integrated Moving Average (ARFIMA) models. The second dataset consists of 135 mean annual temperature time series. The highest predictive performance of random forests is observed when using a low number of recent lagged predictor variables. This outcome could be useful in relevant future applications, with the prospect of achieving higher predictive accuracy.
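The "lagged predictor variables" setup is the same regardless of the regressor used. The model-agnostic step, turning a series into a supervised learning table of lag features and one-step targets, can be sketched as follows (the fitted model, a random forest in the paper, is then trained on `X`, `y`):

```python
def lag_matrix(x, n_lags):
    """Turn a series into (features, target) rows for one-step forecasting:
    row t holds the `n_lags` most recent values, and the target is the
    next value. Any regressor (e.g. a random forest) can be fit on X, y."""
    X, y = [], []
    for t in range(n_lags, len(x)):
        X.append(x[t - n_lags:t])
        y.append(x[t])
    return X, y
```

The paper's finding that a *low* number of recent lags works best corresponds to keeping `n_lags` small here.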
GEKF, GUKF and GGPF based prediction of chaotic time-series with additive and multiplicative noises
International Nuclear Information System (INIS)
Wu Xuedong; Song Zhihuan
2008-01-01
On the assumption that random interruptions in the observation process are modelled by a sequence of independent Bernoulli random variables, this paper generalizes the extended Kalman filter (EKF), the unscented Kalman filter (UKF) and the Gaussian particle filter (GPF) to the case in which there is a positive probability that each observation consists of noise alone and does not contain the chaotic signal (these generalized algorithms are referred to as GEKF, GUKF and GGPF, respectively). The weights and network outputs of neural networks constitute the state and observation equations for chaotic time-series prediction, yielding a linear state transition equation with a continuous update scheme in an online fashion, with the prediction of the chaotic time series represented by the predicted observation value. The proposed algorithms are applied to the prediction of Mackey-Glass time series with additive and multiplicative noises. Simulation results show that GGPF provides relatively better prediction performance than GEKF and GUKF. (general)
Frontiers in Time Series and Financial Econometrics : An overview
S. Ling (Shiqing); M.J. McAleer (Michael); H. Tong (Howell)
2015-01-01
Two of the fastest growing frontiers in econometrics and quantitative finance are time series and financial econometrics. Significant theoretical contributions to financial econometrics have been made by experts in statistics, econometrics, mathematics, and time
Effectiveness of Multivariate Time Series Classification Using Shapelets
Directory of Open Access Journals (Sweden)
A. P. Karpenko
2015-01-01
Full Text Available Typically, time series classifiers require signal pre-processing (filtering signals from noise, artifact removal, etc.), enhancement of signal features (amplitude, frequency, spectrum, etc.), and classification of signal features in space using classical techniques and classification algorithms for multivariate data. We consider a method of classifying time series which does not require enhancement of the signal features. The method uses time series shapelets, i.e. small fragments of the series which best reflect the properties of one of its classes. Despite the significant number of publications on the theory and applications of shapelets for time series classification, the task of evaluating the effectiveness of this technique remains relevant. The objective of this publication is to study the effectiveness of a number of modifications of the original shapelet method as applied to multivariate series classification, a little-studied problem. The paper presents the problem statement of multivariate time series classification using shapelets and describes the shapelet-based basic method of binary classification, as well as various generalizations and a proposed modification of the method. It also offers software that implements the modified method and results of computational experiments confirming the effectiveness of the algorithmic and software solutions. The paper shows that the modified method and its software allow us to reach a classification accuracy of about 85% at best. The shapelet search time increases in proportion to the input data dimension.
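The core primitive of any shapelet method is the distance from a candidate shapelet to its best-matching subsequence of a series, turned into a thresholded decision rule. A minimal univariate sketch (the paper's multivariate generalizations and shapelet search are omitted; the threshold here is a hypothetical input, normally learned from data):

```python
def subsequence_distance(shapelet, series):
    """Minimum Euclidean distance between a shapelet and any equal-length
    sliding window of the series."""
    m = len(shapelet)
    best = float('inf')
    for start in range(len(series) - m + 1):
        d = sum((a - b) ** 2
                for a, b in zip(shapelet, series[start:start + m])) ** 0.5
        best = min(best, d)
    return best

def classify_by_shapelet(series, shapelet, threshold):
    # binary decision rule: class 1 if the shapelet occurs closely enough
    return 1 if subsequence_distance(shapelet, series) <= threshold else 0
```

A shapelet is informative when series of one class contain a close match and series of the other class do not, which is exactly what the threshold separates.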
Pseudo-random bit generator based on lag time series
García-Martínez, M.; Campos-Cantón, E.
2014-12-01
In this paper, we present a pseudo-random bit generator (PRBG) based on two lag time series of the logistic map, using positive and negative values of the bifurcation parameter. In order to hide the map used to build the pseudo-random series, we have introduced a delay in the generation of the time series. When these new series are mapped as xn against xn+1, they present a cloud of points unrelated to the logistic map. Finally, the pseudo-random sequences have been tested with the NIST suite, giving satisfactory results for use in stream ciphers.
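A stripped-down illustration of the idea: iterate a logistic map, discard an initial lag so the output is decorrelated from the seed, and threshold the state into bits. This is a single-map sketch only; the paper's generator combines two lag series and both positive and negative bifurcation parameters, and any real use would require passing the NIST tests.

```python
def logistic_prbg(x0, r, lag, n_bits):
    """Pseudo-random bits from a logistic map x -> r*x*(1-x). The first
    `lag` iterates are discarded (the delay that hides the underlying map),
    then each subsequent state is thresholded at 0.5 to produce a bit."""
    x = x0
    for _ in range(lag):
        x = r * x * (1.0 - x)
    bits = []
    for _ in range(n_bits):
        x = r * x * (1.0 - x)
        bits.append(1 if x >= 0.5 else 0)
    return bits
```

With r = 4.0 the map is in its fully chaotic regime, so the bit stream is deterministic given the seed but highly sensitive to it.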
DEFF Research Database (Denmark)
Ejrnæs, Anders
2011-01-01
for their careers. On the one hand, our findings confirm the hypothesis that long-term absence from the labour market due to full-time care has negative consequences for women's occupational careers. On the other hand, our findings show that countries with well paid leave schemes combined with access to high...... quality childcare reduce the perceived negative occupational consequences of the time spent on full-time care. This is the case independently of the duration of the career interruption due to care-giving....
Analysis of time series and size of equivalent sample
International Nuclear Information System (INIS)
Bernal, Nestor; Molina, Alicia; Pabon, Daniel; Martinez, Jorge
2004-01-01
In a meteorological context, a first approach to the modeling of time series is to use models of autoregressive type. This allows one to take into account the meteorological persistence or temporal behavior, thereby identifying the memory of the analyzed process. This article seeks to present the concept of the size of an equivalent sample, which helps to identify sub-periods with a similar structure in the data series. Moreover, in this article we examine the alternative of adjusting the variance of the series, keeping in mind its temporal structure, as well as an adjustment to the covariance of two time series. This article presents two examples, the first corresponding to seven simulated series with a first-order autoregressive structure, and the second corresponding to seven meteorological series of anomalies of the air temperature at the surface in two Colombian regions.
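For a first-order autoregressive process, a standard approximation connects the nominal sample size to the equivalent (effective) sample size through the lag-1 autocorrelation. The formula below is the common textbook version, offered as an illustration of the concept the article discusses rather than its exact formulation:

```python
def equivalent_sample_size(n, rho):
    """Effective number of independent observations in an AR(1) series of
    length n with lag-1 autocorrelation rho (standard approximation:
    n_eff = n * (1 - rho) / (1 + rho))."""
    return n * (1.0 - rho) / (1.0 + rho)
```

So 100 observations with rho = 0.5 carry about as much information as 33 independent ones, which is why variance estimates must be adjusted for the series' temporal structure.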
Characterizing time series: when Granger causality triggers complex networks
International Nuclear Information System (INIS)
Ge Tian; Cui Yindong; Lin Wei; Liu Chong; Kurths, Jürgen
2012-01-01
In this paper, we propose a new approach to characterize time series with noise perturbations in both the time and frequency domains by combining Granger causality and complex networks. We construct directed and weighted complex networks from time series and use representative network measures to describe their physical and topological properties. Through analyzing the typical dynamical behaviors of some physical models and the MIT-BIH human electrocardiogram data sets, we show that the proposed approach is able to capture and characterize various dynamics and has much potential for analyzing real-world time series of rather short length. (paper)
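The building block of such a network is a pairwise Granger-style test: does adding the lagged values of one series improve the prediction of another beyond its own past? The lag-1 sketch below compares residual sums of squares of a restricted and a full model; it is an illustration of the principle (a proper Granger test uses an F statistic and multiple lags, and the paper additionally applies network measures to the resulting graph).

```python
def rss(resid):
    return sum(e * e for e in resid)

def granger_edge(x, y, threshold=0.05):
    """Does y[t-1] help predict x[t]? Compare the residual sum of squares
    of a restricted AR(1) model of x with a model that adds y[t-1], and
    declare a directed edge y -> x when the relative RSS reduction
    exceeds `threshold` (a hypothetical cut-off, chosen for illustration)."""
    xt, x1, y1 = x[1:], x[:-1], y[:-1]
    # restricted model: x[t] = a * x[t-1]
    a = sum(p * q for p, q in zip(xt, x1)) / sum(v * v for v in x1)
    rss_r = rss([p - a * q for p, q in zip(xt, x1)])
    # full model: x[t] = a * x[t-1] + b * y[t-1]  (2x2 normal equations)
    s11 = sum(v * v for v in x1)
    s12 = sum(u * v for u, v in zip(x1, y1))
    s22 = sum(v * v for v in y1)
    t1 = sum(p * q for p, q in zip(xt, x1))
    t2 = sum(p * q for p, q in zip(xt, y1))
    det = s11 * s22 - s12 * s12
    if det == 0:
        return False
    af = (t1 * s22 - t2 * s12) / det
    bf = (t2 * s11 - t1 * s12) / det
    rss_f = rss([p - af * q - bf * r for p, q, r in zip(xt, x1, y1)])
    return (rss_r - rss_f) / rss_r > threshold if rss_r else False
```

Running this over all ordered pairs of series yields the directed, weighted adjacency structure on which the network measures are computed.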
Characterizing time series: when Granger causality triggers complex networks
Ge, Tian; Cui, Yindong; Lin, Wei; Kurths, Jürgen; Liu, Chong
2012-08-01
In this paper, we propose a new approach to characterize time series with noise perturbations in both the time and frequency domains by combining Granger causality and complex networks. We construct directed and weighted complex networks from time series and use representative network measures to describe their physical and topological properties. Through analyzing the typical dynamical behaviors of some physical models and the MIT-BIH (Massachusetts Institute of Technology-Beth Israel Hospital) human electrocardiogram data sets, we show that the proposed approach is able to capture and characterize various dynamics and has much potential for analyzing real-world time series of rather short length.
Sensor-Generated Time Series Events: A Definition Language
Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan
2012-01-01
There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an events definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device, called posturograph, is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is generally applicable and accurate, for identifying the events contained in the time series.
Current interruption transients calculation
Peelo, David F
2014-01-01
Provides an original, detailed and practical description of current interruption transients, their origins, the circuits involved, and how they can be calculated. Current Interruption Transients Calculation is a comprehensive resource for the understanding, calculation and analysis of the transient recovery voltages (TRVs) and related re-ignition or re-striking transients associated with fault current interruption and the switching of inductive and capacitive load currents in circuits.
Time Series Decomposition into Oscillation Components and Phase Estimation.
Matsuda, Takeru; Komaki, Fumiyasu
2017-02-01
Many time series are naturally considered as a superposition of several oscillation components. For example, electroencephalogram (EEG) time series include oscillation components such as alpha, beta, and gamma. We propose a method for decomposing time series into such oscillation components using state-space models. Based on the concept of random frequency modulation, Gaussian linear state-space models for oscillation components are developed. In this model, the frequency of an oscillator fluctuates by noise. Time series decomposition is accomplished with this model in a manner similar to the Bayesian seasonal adjustment method. Since the model parameters are estimated from data by the empirical Bayes method, the amplitudes and the frequencies of oscillation components are determined in a data-driven manner. Also, the appropriate number of oscillation components is determined with the Akaike information criterion (AIC). In this way, the proposed method provides a natural decomposition of the given time series into oscillation components. In neuroscience, the phase of neural time series plays an important role in neural information processing. The proposed method can be used to estimate the phase of each oscillation component and has several advantages over a conventional method based on the Hilbert transform. Thus, the proposed method enables an investigation of the phase dynamics of time series. Numerical results show that the proposed method succeeds in extracting intermittent oscillations like ripples and detecting the phase reset phenomena. We apply the proposed method to real data from various fields such as astronomy, ecology, tidology, and neuroscience.
Signal Processing for Time-Series Functions on a Graph
2018-02-01
ARL-TR-8276, US Army Research Laboratory, February 2018, by Humberto Muñoz-Barona, Jean Vettel, and... Approved for public release; distribution is unlimited. Fig. 1: time-series function on a fixed graph.
Clinical time series prediction: Toward a hierarchical dynamical system framework.
Liu, Zitao; Hauskrecht, Milos
2015-09-01
Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding of the patient condition, the dynamics of a disease, effect of various patient management interventions and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Our hierarchical dynamical system framework for modeling clinical time series combines advantages of the two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. We tested our framework by first learning the time series model from data for the patients in the training set, and then using it to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. Copyright © 2014 Elsevier B.V. All rights reserved.
Clinical time series prediction: towards a hierarchical dynamical system framework
Liu, Zitao; Hauskrecht, Milos
2014-01-01
Objective Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding of the patient condition, the dynamics of a disease, effect of various patient management interventions and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Materials and methods Our hierarchical dynamical system framework for modeling clinical time series combines advantages of the two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. Results We tested our framework by first learning the time series model from data for the patient in the training set, and then applying the model in order to predict future time series values on the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. Conclusion A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive
Conditional time series forecasting with convolutional neural networks
A. Borovykh (Anastasia); S.M. Bohte (Sander); C.W. Oosterlee (Cornelis)
2017-01-01
Forecasting financial time series using past observations has been a significant topic of interest. While temporal relationships in the data exist, they are difficult to analyze and predict accurately due to the non-linear trends and noise present in the series. We propose to learn these
Analysis of complex time series using refined composite multiscale entropy
International Nuclear Information System (INIS)
Wu, Shuen-De; Wu, Chiu-Wen; Lin, Shiou-Gwo; Lee, Kung-Yen; Peng, Chung-Kang
2014-01-01
Multiscale entropy (MSE) is an effective algorithm for measuring the complexity of a time series that has been applied in many fields successfully. However, MSE may yield an inaccurate estimation of entropy or induce undefined entropy because the coarse-graining procedure reduces the length of a time series considerably at large scales. Composite multiscale entropy (CMSE) was recently proposed to improve the accuracy of MSE, but it does not resolve undefined entropy. Here we propose a refined composite multiscale entropy (RCMSE) to improve CMSE. For short time series analyses, we demonstrate that RCMSE increases the accuracy of entropy estimation and reduces the probability of inducing undefined entropy.
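The refinement described above can be sketched directly: at scale τ, instead of computing sample entropy from a single coarse-grained series (MSE) or averaging τ separate entropies (CMSE), RCMSE pools the template-match counts from all τ coarse-grained series and takes one logarithm of the pooled ratio, so the result stays defined even when an individual offset has zero matches. The sketch below is an O(n²) illustration with the tolerance r given in the units of the series (commonly 0.15 times its standard deviation); it is a simplified reading of the algorithm, not the authors' reference code.

```python
import math


def _match_counts(x, m, r):
    """Count template pairs of length m (and their length-m+1 extensions)
    whose Chebyshev distance is within tolerance r."""
    n = len(x)
    cm = cm1 = 0
    for i in range(n - m):
        for j in range(i + 1, n - m):
            if max(abs(x[i + k] - x[j + k]) for k in range(m)) <= r:
                cm += 1
                if abs(x[i + m] - x[j + m]) <= r:
                    cm1 += 1
    return cm, cm1


def rcmse(x, scale, m=2, r=0.15):
    """Refined composite multiscale entropy at one scale: pool match counts
    over all coarse-graining offsets, then take a single log ratio."""
    total_m = total_m1 = 0
    for offset in range(scale):
        cg = [sum(x[i:i + scale]) / scale
              for i in range(offset, len(x) - scale + 1, scale)]
        cm, cm1 = _match_counts(cg, m, r)
        total_m += cm
        total_m1 += cm1
    if total_m == 0 or total_m1 == 0:
        return float('nan')  # entropy undefined only if *all* offsets lack matches
    return -math.log(total_m1 / total_m)
```

At scale 1 this reduces to ordinary sample entropy; the pooling only matters at scales greater than 1, where MSE's single coarse-grained series becomes short.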
Forecasting daily meteorological time series using ARIMA and regression models
Murat, Małgorzata; Malinowska, Iwona; Gos, Magdalena; Krzyszczak, Jaromir
2018-04-01
The daily air temperature and precipitation time series recorded between January 1, 1980 and December 31, 2010 at four European sites (Jokioinen, Dikopshof, Lleida and Lublin) from different climatic zones were modeled and forecasted. In our forecasting we used the Box-Jenkins and Holt-Winters seasonal autoregressive integrated moving-average methods, the autoregressive integrated moving-average with external regressors in the form of Fourier terms, and time series regression including trend and seasonality components, implemented with R software. It was demonstrated that the obtained models are able to capture the dynamics of the time series data and to produce sensible forecasts.
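The core of this style of modelling, remove seasonality, fit an autoregressive model to the anomalies, then forecast recursively, can be sketched in pure Python. This is a deliberately minimal stand-in (per-season means plus an AR(1) term), not the Box-Jenkins machinery or the R implementations used in the study.

```python
def fit_ar1(x):
    """Least-squares fit of x[t] = c + phi * x[t-1] + e[t]."""
    xp, xc = x[:-1], x[1:]
    n = len(xp)
    mp, mc = sum(xp) / n, sum(xc) / n
    cov = sum((a - mp) * (b - mc) for a, b in zip(xp, xc))
    var = sum((a - mp) ** 2 for a in xp)
    phi = cov / var
    return mc - phi * mp, phi


def forecast_ar1(x, c, phi, steps):
    """Iterate the fitted AR(1) recursion forward from the last observation."""
    out, last = [], x[-1]
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out


def seasonal_ar1_forecast(x, period, steps):
    """Deseasonalize by per-season means, fit AR(1) to the anomalies,
    forecast, then add the seasonal means back."""
    means, counts = [0.0] * period, [0] * period
    for i, v in enumerate(x):
        means[i % period] += v
        counts[i % period] += 1
    means = [m / c for m, c in zip(means, counts)]
    anom = [v - means[i % period] for i, v in enumerate(x)]
    c, phi = fit_ar1(anom)
    fc = forecast_ar1(anom, c, phi, steps)
    return [f + means[(len(x) + k) % period] for k, f in enumerate(fc)]
```

A real ARIMA analysis would also difference the series, choose lag orders from the autocorrelation structure, and diagnose residuals; the sketch only shows why the seasonal and persistent parts are handled separately.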
Segmentation of Nonstationary Time Series with Geometric Clustering
DEFF Research Database (Denmark)
Bocharov, Alexei; Thiesson, Bo
2013-01-01
We introduce a non-parametric method for segmentation in regime-switching time-series models. The approach is based on spectral clustering of target-regressor tuples and derives a switching regression tree, where regime switches are modeled by oblique splits. Such models can be learned efficiently from data, where clustering is used to propose one single split candidate at each split level. We use the class of ART time series models to serve as illustration, but because of the non-parametric nature of our segmentation approach, it readily generalizes to a wide range of time-series models that go...
Modelling road accidents: An approach using structural time series
Junus, Noor Wahida Md; Ismail, Mohd Tahir
2014-09-01
In this paper, the trend of road accidents in Malaysia for the years 2001 until 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals for each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level with a seasonal model.
Multivariate time series analysis with R and financial applications
Tsay, Ruey S
2013-01-01
Since the publication of his first book, Analysis of Financial Time Series, Ruey Tsay has become one of the most influential and prominent experts on the topic of time series. Different from the traditional and oftentimes complex approach to multivariate (MV) time series, this sequel book emphasizes structural specification, which results in simplified parsimonious VARMA modeling and, hence, eases comprehension. Through a fundamental balance between theory and applications, the book supplies readers with an accessible approach to financial econometric models and their applications to real-world...
Scalable Prediction of Energy Consumption using Incremental Time Series Clustering
Energy Technology Data Exchange (ETDEWEB)
Simmhan, Yogesh; Noor, Muhammad Usman
2013-10-09
Time series datasets are a canonical form of high velocity Big Data, and often generated by pervasive sensors, such as found in smart infrastructure. Performing predictive analytics on time series data can be computationally complex, and requires approximation techniques. In this paper, we motivate this problem using a real application from the smart grid domain. We propose an incremental clustering technique, along with a novel affinity score for determining cluster similarity, which help reduce the prediction error for cumulative time series within a cluster. We evaluate this technique, along with optimizations, using real datasets from smart meters, totaling ~700,000 data points, and show the efficacy of our techniques in improving the prediction error of time series data within polynomial time.
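The incremental assignment step behind such one-pass clustering can be sketched in one dimension: each arriving point joins the nearest existing centroid if it is within a threshold (updating the centroid as a running mean) and otherwise seeds a new cluster. This is only a sketch of the generic incremental idea; the paper's affinity score for cluster similarity and its smart-meter specifics are not reproduced.

```python
def incremental_cluster(stream, threshold):
    """One-pass clustering of scalar values: assign each point to the nearest
    centroid within `threshold`, updating that centroid by a running mean,
    or start a new cluster. Returns (centroids, per-point labels)."""
    centroids, counts, labels = [], [], []
    for p in stream:
        best, bd = -1, threshold
        for j, c in enumerate(centroids):
            d = abs(p - c)
            if d <= bd:
                best, bd = j, d
        if best < 0:
            centroids.append(float(p))
            counts.append(1)
            labels.append(len(centroids) - 1)
        else:
            counts[best] += 1
            centroids[best] += (p - centroids[best]) / counts[best]  # running mean
            labels.append(best)
    return centroids, labels
```

Because each point is touched once and only centroids are kept, memory and time stay bounded as the stream grows, which is the property that makes the approach attractive for high-velocity meter data.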
Characterizing interdependencies of multiple time series theory and applications
Hosoya, Yuzo; Takimoto, Taro; Kinoshita, Ryo
2017-01-01
This book introduces academic researchers and professionals to the basic concepts and methods for characterizing interdependencies of multiple time series in the frequency domain. Detecting causal directions between a pair of time series and the extent of their effects, as well as testing the nonexistence of a feedback relation between them, have constituted major focal points in multiple time series analysis since Granger introduced the celebrated definition of causality in view of prediction improvement. Causality analysis has since been widely applied in many disciplines. Although most analyses are conducted from the perspective of the time domain, a frequency domain method introduced in this book sheds new light on another aspect that disentangles the interdependencies between multiple time series in terms of long-term or short-term effects, quantitatively characterizing them. The frequency domain method includes the Granger noncausality test as a special case. Chapters 2 and 3 of the book introduce an i...
Understanding Emergency Medicine Physicians Multitasking Behaviors Around Interruptions.
Fong, Allan; Ratwani, Raj M
2018-06-11
Interruptions can adversely impact human performance, particularly in fast-paced and high-risk environments such as the emergency department (ED). Understanding physician behaviors before, during, and after interruptions is important to the design and promotion of safe and effective workflow solutions. However, traditional human factors based interruption models do not accurately reflect the complexities of real-world environments like the ED and may not capture multiple interruptions and multitasking. We present a more comprehensive framework for understanding interruptions that is composed of three phases, each with multiple levels: Interruption Start Transition, Interruption Engagement, and Interruption End Transition. This three-phase framework is not constrained to discrete task transitions, providing a robust method to categorize multitasking behaviors around interruptions. We apply this framework to 457 interruption episodes captured during 36 hours of observation. The interrupted task was immediately suspended 348 (76.1%) times. Participants engaged in new self-initiated tasks during the interrupting task 164 (35.9%) times and did not directly resume the interrupted task in 284 (62.1%) interruption episodes. Using this framework provides a more detailed description of the types of physician behaviors in complex environments. Understanding the different types of interruption and resumption patterns, which may have a different impact on performance, can support the design of interruption mitigation strategies. This article is protected by copyright. All rights reserved.
Hamilton, Ian; Lloyd, Charlie; Hewitt, Catherine; Godfrey, Christine
2014-01-01
The UK Misuse of Drugs Act (1971) divided controlled drugs into three classes, A, B and C, with descending criminal sanctions attached to each class. Cannabis was originally assigned by the Act to Class B, but in 2004 it was transferred to the lowest-risk class, Class C. Then in 2009, on the basis of increasing concerns about a link between high-strength cannabis and schizophrenia, it was moved back to Class B. The aim of this study is to test the assumption that changes in classification lead to changes in levels of psychosis. In particular, it explores whether the two changes in 2004 and 2009 were associated with changes in the numbers of people admitted for cannabis psychosis. An interrupted time series was used to investigate the relationship between the two changes in cannabis classification and their impact on hospital admissions for cannabis psychosis. Reflecting the two policy changes, two interruptions to the time series were made. Hospital Episode Statistics admissions data was analysed covering the period 1999 through to 2010. There was a significantly increasing trend in cannabis psychosis admissions from 1999 to 2004. However, following the reclassification of cannabis from B to C in 2004, there was a significant change in the trend such that cannabis psychosis admissions declined to 2009. Following the second reclassification of cannabis back to Class B in 2009, there was a significant change to increasing admissions. This study shows a statistical association between the reclassification of cannabis and hospital admissions for cannabis psychosis in the opposite direction to that predicted by the presumed relationship between the two. However, the reasons for this statistical association are unclear. It is unlikely to be due to changes in cannabis use over this period. Other possible explanations include changes in policing and systemic changes in mental health services unrelated to classification decisions. Copyright © 2013 Elsevier B.V. All rights reserved.
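The interrupted time-series design used above (and discussed in the ITSA article at the head of this listing) is usually fitted as a segmented regression with four terms: baseline level, pre-intervention trend, level change at the interruption, and trend change after it. A self-contained pure-Python sketch follows; a real analysis would additionally model autocorrelation in the errors (e.g. with ARIMA error terms), which this sketch omits.

```python
def ols(X, y):
    """Solve least squares via normal equations with Gaussian elimination."""
    n, k = len(X), len(X[0])
    A = [[sum(X[i][p] * X[i][q] for i in range(n)) for q in range(k)] for p in range(k)]
    b = [sum(X[i][p] * y[i] for i in range(n)) for p in range(k)]
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta


def itsa_fit(y, t0):
    """Segmented regression for an interrupted time series with one
    interruption at time t0. Returns
    [baseline level, pre-trend, level change, trend change]."""
    X = [[1.0, float(t), float(t >= t0), float(max(0, t - t0))] for t in range(len(y))]
    return ols(X, y)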
Scale-dependent intrinsic entropies of complex time series.
Yeh, Jia-Rong; Peng, Chung-Kang; Huang, Norden E
2016-04-13
Multi-scale entropy (MSE) was developed as a measure of complexity for complex time series, and it has been applied widely in recent years. The MSE algorithm is based on the assumption that biological systems possess the ability to adapt and function in an ever-changing environment, and these systems need to operate across multiple temporal and spatial scales, such that their complexity is also multi-scale and hierarchical. Here, we present a systematic approach to apply the empirical mode decomposition algorithm, which can detrend time series on various time scales, prior to analysing a signal's complexity by measuring the irregularity of its dynamics on multiple time scales. Simulated time series of fractal Gaussian noise and human heartbeat time series were used to study the performance of this new approach. We show that our method can successfully quantify the fractal properties of the simulated time series and can accurately distinguish modulations in human heartbeat time series in health and disease. © 2016 The Author(s).
Quantifying memory in complex physiological time-series.
Shirazi, Amir H; Raoufy, Mohammad R; Ebadi, Haleh; De Rui, Michele; Schiff, Sami; Mazloom, Roham; Hajizadeh, Sohrab; Gharibzadeh, Shahriar; Dehpour, Ahmad R; Amodio, Piero; Jafari, G Reza; Montagnese, Sara; Mani, Ali R
2013-01-01
In a time-series, memory is a statistical feature that lasts for a period of time and distinguishes the time-series from a random, or memory-less, process. In the present study, the concept of "memory length" was used to define the time period, or scale over which rare events within a physiological time-series do not appear randomly. The method is based on inverse statistical analysis and provides empiric evidence that rare fluctuations in cardio-respiratory time-series are 'forgotten' quickly in healthy subjects while the memory for such events is significantly prolonged in pathological conditions such as asthma (respiratory time-series) and liver cirrhosis (heart-beat time-series). The memory length was significantly higher in patients with uncontrolled asthma compared to healthy volunteers. Likewise, it was significantly higher in patients with decompensated cirrhosis compared to those with compensated cirrhosis and healthy volunteers. We also observed that the cardio-respiratory system has simple low order dynamics and short memory around its average, and high order dynamics around rare fluctuations.
Elements of nonlinear time series analysis and forecasting
De Gooijer, Jan G
2017-01-01
This book provides an overview of the current state-of-the-art of nonlinear time series analysis, richly illustrated with examples, pseudocode algorithms and real-world applications. Avoiding a “theorem-proof” format, it shows concrete applications on a variety of empirical time series. The book can be used in graduate courses in nonlinear time series and at the same time also includes interesting material for more advanced readers. Though it is largely self-contained, readers require an understanding of basic linear time series concepts, Markov chains and Monte Carlo simulation methods. The book covers time-domain and frequency-domain methods for the analysis of both univariate and multivariate (vector) time series. It makes a clear distinction between parametric models on the one hand, and semi- and nonparametric models/methods on the other. This offers the reader the option of concentrating exclusively on one of these nonlinear time series analysis methods. To make the book as user friendly as possible...
Growth And Export Expansion In Mauritius - A Time Series Analysis ...
African Journals Online (AJOL)
Growth And Export Expansion In Mauritius - A Time Series Analysis. ... RV Sannassee, R Pearce ... Using Granger Causality tests, the short-run analysis results revealed that there is significant reciprocal causality between real export earnings ...
On robust forecasting of autoregressive time series under censoring
Kharin, Y.; Badziahin, I.
2009-01-01
Problems of robust statistical forecasting are considered for autoregressive time series observed under distortions generated by interval censoring. Three types of robust forecasting statistics are developed; meansquare risk is evaluated for the developed forecasting statistics. Numerical results are given.
AFSC/ABL: Ugashik sockeye salmon scale time series
National Oceanic and Atmospheric Administration, Department of Commerce — A time series of scale samples (1956 b?? 2002) collected from adult sockeye salmon returning to Ugashik River were retrieved from the Alaska Department of Fish and...
Unsupervised land cover change detection: meaningful sequential time series analysis
CSIR Research Space (South Africa)
Salmon, BP
2011-06-01
Full Text Available An automated land cover change detection method is proposed that uses coarse spatial resolution hyper-temporal earth observation satellite time series data. The study compared three different unsupervised clustering approaches that operate on short...
Fast and Flexible Multivariate Time Series Subsequence Search
National Aeronautics and Space Administration — Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical...
AFSC/ABL: Naknek sockeye salmon scale time series
National Oceanic and Atmospheric Administration, Department of Commerce — A time series of scale samples (1956 2002) collected from adult sockeye salmon returning to Naknek River were retrieved from the Alaska Department of Fish and Game....
Constructing ordinal partition transition networks from multivariate time series.
Zhang, Jiayang; Zhou, Jie; Tang, Ming; Guo, Heng; Small, Michael; Zou, Yong
2017-08-10
A growing number of algorithms have been proposed to map a scalar time series into ordinal partition transition networks. However, most observable phenomena in the empirical sciences are of a multivariate nature. We construct ordinal partition transition networks for multivariate time series. This approach yields weighted directed networks representing the pattern transition properties of time series in velocity space, which hence provides dynamic insights of the underling system. Furthermore, we propose a measure of entropy to characterize ordinal partition transition dynamics, which is sensitive to capturing the possible local geometric changes of phase space trajectories. We demonstrate the applicability of pattern transition networks to capture phase coherence to non-coherence transitions, and to characterize paths to phase synchronizations. Therefore, we conclude that the ordinal partition transition network approach provides complementary insight to the traditional symbolic analysis of nonlinear multivariate time series.
forecasting with nonlinear time series model: a monte-carlo
African Journals Online (AJOL)
PUBLICATIONS1
Carlo method of forecasting using a special nonlinear time series model, called logistic smooth transition ... We illustrate this new method using some simulation ..... in MATLAB 7.5.0. ... process (DGP) using the logistic smooth transi-.
Chaotic time series prediction: From one to another
International Nuclear Information System (INIS)
Zhao Pengfei; Xing Lei; Yu Jun
2009-01-01
In this Letter, a new local linear prediction model is proposed to predict a chaotic time series of a component x(t) by using the chaotic time series of another component y(t) in the same system with x(t). Our approach is based on the phase space reconstruction coming from the Takens embedding theorem. To illustrate our results, we present an example of Lorenz system and compare with the performance of the original local linear prediction model.
The use of synthetic input sequences in time series modeling
International Nuclear Information System (INIS)
Oliveira, Dair Jose de; Letellier, Christophe; Gomes, Murilo E.D.; Aguirre, Luis A.
2008-01-01
In many situations time series models obtained from noise-like data settle to trivial solutions under iteration. This Letter proposes a way of producing a synthetic (dummy) input, that is included to prevent the model from settling down to a trivial solution, while maintaining features of the original signal. Simulated benchmark models and a real time series of RR intervals from an ECG are used to illustrate the procedure
Advances in Antithetic Time Series Analysis : Separating Fact from Artifact
Directory of Open Access Journals (Sweden)
Dennis Ridley
2016-01-01
Full Text Available The problem of biased time series mathematical model parameter estimates is well known to be insurmountable. When used to predict future values by extrapolation, even a de minimis bias will eventually grow into a large bias, with misleading results. This paper elucidates how combining antithetic time series' solves this baffling problem of bias in the fitted and forecast values by dynamic bias cancellation. Instead of growing to infinity, the average error can converge to a constant. (original abstract
Multiple Time Series Ising Model for Financial Market Simulations
International Nuclear Information System (INIS)
Takaishi, Tetsuya
2015-01-01
In this paper we propose an Ising model which simulates multiple financial time series. Our model introduces the interaction which couples to spins of other systems. Simulations from our model show that time series exhibit the volatility clustering that is often observed in the real financial markets. Furthermore we also find non-zero cross correlations between the volatilities from our model. Thus our model can simulate stock markets where volatilities of stocks are mutually correlated
Stacked Heterogeneous Neural Networks for Time Series Forecasting
Directory of Open Access Journals (Sweden)
Florin Leon
2010-01-01
Full Text Available A hybrid model for time series forecasting is proposed. It is a stacked neural network, containing one normal multilayer perceptron with bipolar sigmoid activation functions, and the other with an exponential activation function in the output layer. As shown by the case studies, the proposed stacked hybrid neural model performs well on a variety of benchmark time series. The combination of weights of the two stack components that leads to optimal performance is also studied.
Robust Forecasting of Non-Stationary Time Series
Croux, C.; Fried, R.; Gijbels, I.; Mahieu, K.
2010-01-01
This paper proposes a robust forecasting method for non-stationary time series. The time series is modelled using non-parametric heteroscedastic regression, and fitted by a localized MM-estimator, combining high robustness and large efficiency. The proposed method is shown to produce reliable forecasts in the presence of outliers, non-linearity, and heteroscedasticity. In the absence of outliers, the forecasts are only slightly less precise than those based on a localized Least Squares estima...
Automated Feature Design for Time Series Classification by Genetic Programming
Harvey, Dustin Yewell
2014-01-01
Time series classification (TSC) methods discover and exploit patterns in time series and other one-dimensional signals. Although many accurate, robust classifiers exist for multivariate feature sets, general approaches are needed to extend machine learning techniques to make use of signal inputs. Numerous applications of TSC can be found in structural engineering, especially in the areas of structural health monitoring and non-destructive evaluation. Additionally, the fields of process contr...
Geomechanical time series and its singularity spectrum analysis
Czech Academy of Sciences Publication Activity Database
Lyubushin, Alexei A.; Kaláb, Zdeněk; Lednická, Markéta
2012-01-01
Roč. 47, č. 1 (2012), s. 69-77 ISSN 1217-8977 R&D Projects: GA ČR GA105/09/0089 Institutional research plan: CEZ:AV0Z30860518 Keywords : geomechanical time series * singularity spectrum * time series segmentation * laser distance meter Subject RIV: DC - Siesmology, Volcanology, Earth Structure Impact factor: 0.347, year: 2012 http://www.akademiai.com/content/88v4027758382225/fulltext.pdf
Time Series Analysis of Insar Data: Methods and Trends
Osmanoglu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cano-Cabral, Enrique
2015-01-01
Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal ''unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.
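The wrapping ambiguity described above can be illustrated in a few lines: phase wraps into (-pi, pi] whenever motion exceeds half the radar wavelength, and a temporal unwrapping step recovers the deformation. This is only a one-dimensional sketch of the problem (real InSAR unwrapping is spatio-temporal); the wavelength and displacement rate are assumed values.

```python
import numpy as np

# Simulated line-of-sight displacement converted to interferometric phase.
wavelength = 0.056                       # assumed C-band wavelength in metres
t = np.linspace(0, 1, 200)
displacement = 0.12 * t                  # 12 cm of subsidence over the period
true_phase = 4 * np.pi * displacement / wavelength
wrapped = np.angle(np.exp(1j * true_phase))   # wraps phase into (-pi, pi]

# Temporal unwrapping: valid only if successive samples differ by < pi.
unwrapped = np.unwrap(wrapped)
recovered = unwrapped * wavelength / (4 * np.pi)
max_err = np.max(np.abs(recovered - displacement))
```

When sampling is too sparse (phase jumps of more than pi between acquisitions), this simple unwrapping fails, which is exactly why the specialized algorithms surveyed in the article exist.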
Similarity estimators for irregular and age uncertain time series
Rehfeld, K.; Kurths, J.
2013-09-01
Paleoclimate time series are often irregularly sampled and age uncertain, which is an important technical challenge to overcome for successful reconstruction of past climate variability and dynamics. Visual comparison and interpolation-based linear correlation approaches have been used to infer dependencies from such proxy time series. While the first is subjective, not measurable and not suitable for the comparison of many datasets at a time, the latter introduces interpolation bias, and both face difficulties if the underlying dependencies are nonlinear. In this paper we investigate similarity estimators that could be suitable for the quantitative investigation of dependencies in irregular and age uncertain time series. We compare the Gaussian-kernel based cross correlation (gXCF, Rehfeld et al., 2011) and mutual information (gMI, Rehfeld et al., 2013) against their interpolation-based counterparts and the new event synchronization function (ESF). We test the efficiency of the methods in estimating coupling strength and coupling lag numerically, using ensembles of synthetic stalagmites with short, autocorrelated, linear and nonlinearly coupled proxy time series, and in the application to real stalagmite time series. In the linear test case coupling strength increases are identified consistently for all estimators, while in the nonlinear test case the correlation-based approaches fail. The lag at which the time series are coupled is identified correctly as the maximum of the similarity functions in around 60-55% (in the linear case) to 53-42% (for the nonlinear processes) of the cases when the dating of the synthetic stalagmite is perfectly precise. If the age uncertainty increases beyond 5% of the time series length, however, the true coupling lag is not identified more often than the others for which the similarity function was estimated. Age uncertainty contributes up to half of the uncertainty in the similarity estimation process. Time series irregularity
Similarity estimators for irregular and age-uncertain time series
Rehfeld, K.; Kurths, J.
2014-01-01
Paleoclimate time series are often irregularly sampled and age uncertain, which is an important technical challenge to overcome for successful reconstruction of past climate variability and dynamics. Visual comparison and interpolation-based linear correlation approaches have been used to infer dependencies from such proxy time series. While the first is subjective, not measurable and not suitable for the comparison of many data sets at a time, the latter introduces interpolation bias, and both face difficulties if the underlying dependencies are nonlinear. In this paper we investigate similarity estimators that could be suitable for the quantitative investigation of dependencies in irregular and age-uncertain time series. We compare the Gaussian-kernel-based cross-correlation (gXCF, Rehfeld et al., 2011) and mutual information (gMI, Rehfeld et al., 2013) against their interpolation-based counterparts and the new event synchronization function (ESF). We test the efficiency of the methods in estimating coupling strength and coupling lag numerically, using ensembles of synthetic stalagmites with short, autocorrelated, linear and nonlinearly coupled proxy time series, and in the application to real stalagmite time series. In the linear test case, coupling strength increases are identified consistently for all estimators, while in the nonlinear test case the correlation-based approaches fail. The lag at which the time series are coupled is identified correctly as the maximum of the similarity functions in around 60-55% (in the linear case) to 53-42% (for the nonlinear processes) of the cases when the dating of the synthetic stalagmite is perfectly precise. If the age uncertainty increases beyond 5% of the time series length, however, the true coupling lag is not identified more often than the others for which the similarity function was estimated. Age uncertainty contributes up to half of the uncertainty in the similarity estimation process. Time series irregularity
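The Gaussian-kernel cross-correlation idea can be sketched directly: instead of interpolating the irregular series onto a common grid, every observation pair is weighted by how close its time difference is to the candidate lag. This is a minimal illustration of the approach, not the authors' implementation; the bandwidth, sampling density, and test signal are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_xcf(tx, x, ty, y, lags, h):
    """Gaussian-kernel cross-correlation for irregularly sampled series:
    observation pairs are weighted by a Gaussian kernel on the difference
    between their time separation and the candidate lag (bandwidth h)."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    dt = ty[None, :] - tx[:, None]           # all pairwise time differences
    out = []
    for lag in lags:
        w = np.exp(-0.5 * ((dt - lag) / h) ** 2)
        out.append(np.sum(w * np.outer(x, y)) / np.sum(w))
    return np.array(out)

# Two irregularly sampled copies of a signal, the second delayed by 2 units.
tx = np.sort(rng.uniform(0, 100, 300))
ty = np.sort(rng.uniform(0, 100, 300))
x = np.sin(2 * np.pi * tx / 20)
y = np.sin(2 * np.pi * (ty - 2) / 20)
lags = np.arange(-5, 6)
xcf = gaussian_xcf(tx, x, ty, y, lags, h=0.5)
best_lag = lags[np.argmax(xcf)]
```

The coupling lag is then read off as the maximum of the similarity function, mirroring the lag-identification test in the abstract.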
Data imputation analysis for Cosmic Rays time series
Fernandes, R. C.; Lucio, P. S.; Fernandez, J. H.
2017-05-01
The occurrence of missing data in Galactic Cosmic Ray (GCR) time series is inevitable, since data loss results from mechanical and human failure, technical problems, and differing periods of operation of GCR stations. The aim of this study was to perform multiple-dataset imputation in order to reconstruct the observational dataset. The study used the monthly time series of GCR Climax (CLMX) and Roma (ROME) from 1960 to 2004 to simulate scenarios of 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80% and 90% missing data relative to the observed ROME series, with 50 replicates, using the CLMX station as a proxy for allocating these scenarios. Three methods for monthly dataset imputation were selected: AMELIA II, which runs a bootstrap Expectation-Maximization algorithm; MICE, which runs an algorithm via Multivariate Imputation by Chained Equations; and MTSDI, an Expectation-Maximization-based method for imputing missing values in multivariate normal time series. The synthetic time series were compared with the observed ROME series using several skill measures, such as RMSE, NRMSE, the Agreement Index, R, R2, the F-test and the t-test. The results showed that for CLMX and ROME, the R2 and R statistics were 0.98 and 0.96, respectively. Increases in the number of gaps degrade the quality of the time series. Data imputation was most efficient with the MTSDI method, with negligible errors and the best skill coefficients. The results suggest a limit of about 60% missing data for imputation of monthly averages. It is noteworthy that the CLMX, ROME and KIEL stations present no missing data in the target period. This methodology allowed 43 time series to be reconstructed.
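The evaluation design, deleting a known fraction of points, filling the gaps, and scoring the fill against the withheld truth, can be sketched as below. Plain linear interpolation stands in for the paper's AMELIA II/MICE/MTSDI methods, and the synthetic "GCR-like" series and gap fractions are assumptions; the point is only the skill-versus-gap-fraction experiment.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for a monthly GCR record: a slow solar-cycle component,
# a faster component, and noise (all assumed for illustration).
n = 540
t = np.arange(n)
series = (100 + 10 * np.sin(2 * np.pi * t / 132)
          + 3 * np.sin(2 * np.pi * t / 6) + rng.normal(0, 0.5, n))

def impute_and_score(frac, reps=30):
    """Delete a random fraction of points, fill by linear interpolation,
    and score the fill with RMSE against the withheld truth (averaged
    over replicates, as in the paper's replicated scenarios)."""
    errs = []
    for _ in range(reps):
        miss = rng.choice(n, size=int(frac * n), replace=False)
        obs = series.copy()
        obs[miss] = np.nan
        good = ~np.isnan(obs)
        filled = np.interp(t, t[good], obs[good])
        errs.append(np.sqrt(np.mean((filled[miss] - series[miss]) ** 2)))
    return float(np.mean(errs))

rmse_10 = impute_and_score(0.10)   # sparse gaps: fills track the truth well
rmse_60 = impute_and_score(0.60)   # dense gaps: fill quality degrades
```

As in the abstract, skill deteriorates as the missing fraction grows, which motivates the reported practical limit on how much of a series can be imputed.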
Correlation measure to detect time series distances, whence economy globalization
Miśkiewicz, Janusz; Ausloos, Marcel
2008-11-01
An instantaneous time series distance is defined through the equal-time correlation coefficient. The idea is applied to the Gross Domestic Product (GDP) yearly increments of 21 rich countries between 1950 and 2005 in order to test the process of economic globalization. Some data discussion is first presented to decide which (EKS, GK, or derived) GDP series should be studied. Distances are then calculated from the correlation coefficient values between pairs of series. The role of time averaging of the distances over finite-size windows is discussed. Three network structures are next constructed based on the hierarchy of distances. It is shown that the mean distance between the most developed countries on several networks actually decreases in time, which we consider a proof of globalization. An empirical law is found for the evolution after 1990, similar to that found in flux creep. The optimal observation time window size is found to be ≃15 years.
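Turning a correlation coefficient into a distance can be sketched with one of the standard choices from the econophysics literature, d = sqrt((1 - c)/2), which maps perfect correlation to 0 and perfect anti-correlation to 1. The exact functional form used by the authors may differ; the toy "GDP increment" series below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def corr_distance(x, y):
    """Distance from the equal-time correlation coefficient:
    0 for perfectly correlated series, 1 for anti-correlated ones."""
    c = np.corrcoef(x, y)[0, 1]
    return np.sqrt(0.5 * (1.0 - c))

# Three toy yearly-increment series: two driven by a common factor
# (co-moving economies), one independent.
common = rng.normal(0, 1, 56)
a = common + 0.2 * rng.normal(0, 1, 56)
b = common + 0.2 * rng.normal(0, 1, 56)
c = rng.normal(0, 1, 56)

d_ab = corr_distance(a, b)
d_ac = corr_distance(a, c)
```

A hierarchy (and hence a network) is then built from the matrix of such pairwise distances; decreasing mean distance over moving windows is the paper's globalization signal.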
Multiresolution analysis of Bursa Malaysia KLCI time series
Ismail, Mohd Tahir; Dghais, Amel Abdoullah Ahmed
2017-05-01
In general, a time series is simply a sequence of numbers collected at regular intervals over a period. Financial time series processing is concerned with the theory and practice of processing asset prices over time, such as currency, commodity, and stock market data. The primary aim of this study is to understand the fundamental characteristics of selected financial time series using both time-domain and frequency-domain analysis, after which prediction can be carried out for the desired system for in-sample forecasting. In this study, multiresolution analysis with the aid of discrete wavelet transforms (DWT) and the maximal overlap discrete wavelet transform (MODWT) is used to pinpoint special characteristics of Bursa Malaysia KLCI (Kuala Lumpur Composite Index) daily closing prices and return values. In addition, further case study discussions include the modeling of Bursa Malaysia KLCI using linear ARIMA with wavelets to address how the multiresolution approach improves fitting and forecasting results.
Time domain series system definition and gear set reliability modeling
International Nuclear Information System (INIS)
Xie, Liyang; Wu, Ningxiang; Qian, Wenxue
2016-01-01
Time-dependent multi-configuration is a typical feature for mechanical systems such as gear trains and chain drives. As a series system, a gear train is distinct from a traditional series system, such as a chain, in load transmission path, system-component relationship, system functioning manner, as well as time-dependent system configuration. Firstly, the present paper defines time-domain series system to which the traditional series system reliability model is not adequate. Then, system specific reliability modeling technique is proposed for gear sets, including component (tooth) and subsystem (tooth-pair) load history description, material priori/posterior strength expression, time-dependent and system specific load-strength interference analysis, as well as statistically dependent failure events treatment. Consequently, several system reliability models are developed for gear sets with different tooth numbers in the scenario of tooth root material ultimate tensile strength failure. The application of the models is discussed in the last part, and the differences between the system specific reliability model and the traditional series system reliability model are illustrated by virtue of several numerical examples. - Highlights: • A new type of series system, i.e. time-domain multi-configuration series system is defined, that is of great significance to reliability modeling. • Multi-level statistical analysis based reliability modeling method is presented for gear transmission system. • Several system specific reliability models are established for gear set reliability estimation. • The differences between the traditional series system reliability model and the new model are illustrated.
State-level gonorrhea rates and expedited partner therapy laws: insights from time series analyses.
Owusu-Edusei, K; Cramer, R; Chesson, H W; Gift, T L; Leichliter, J S
2017-06-01
In this study, we examined state-level monthly gonorrhea morbidity and assessed the potential impact of existing expedited partner therapy (EPT) laws in relation to the time that the laws were enacted. Longitudinal study. We obtained state-level monthly gonorrhea morbidity (number of cases/100,000 for males, females and total) from the national surveillance data. We used visual examination (of morbidity trends) and an autoregressive time series model in a panel format with intervention (interrupted time series) analysis to assess the impact of state EPT laws based on the months in which the laws were enacted. For over 84% of the states with EPT laws, the monthly morbidity trends did not show any noticeable decreases on or after the laws were enacted. Although we found statistically significant decreases in gonorrhea morbidity within four of the states with EPT laws (Alaska, Illinois, Minnesota, and Vermont), there were no significant decreases when the decreases in the four states were compared contemporaneously with the decreases in states that do not have the laws. We found no impact (decrease in gonorrhea morbidity) attributable exclusively to the EPT law(s). However, these results do not imply that the EPT laws themselves were not effective (or failed to reduce gonorrhea morbidity), because the effectiveness of the EPT law is dependent on necessary intermediate events/outcomes, including sexually transmitted infection service providers' awareness and practice, as well as acceptance by patients and their partners. Published by Elsevier Ltd.
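The intervention analysis described here can be sketched as segmented regression: a pre-existing trend, an immediate level change at enactment, and a change in trend afterwards. The example below uses plain OLS on synthetic morbidity data; the paper's model additionally handles autoregressive errors and a panel of states, and all numbers here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Monthly morbidity with a level drop and slope change after the "law" at t0.
n, t0 = 120, 60
t = np.arange(n)
post = (t >= t0).astype(float)
y = (50 + 0.10 * t                 # pre-existing trend
     - 5.0 * post                  # immediate level change at enactment
     - 0.08 * post * (t - t0)      # change in trend after enactment
     + rng.normal(0, 1, n))

# Segmented regression: y = b0 + b1*t + b2*post + b3*post*(t - t0)
X = np.column_stack([np.ones(n), t, post, post * (t - t0)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
level_change, trend_change = beta[2], beta[3]
```

The two interruption coefficients (level and trend change) are exactly what an interrupted time-series analysis reports and tests, rather than a single overall before/after difference.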
Evaluation of scaling invariance embedded in short time series.
Directory of Open Access Journals (Sweden)
Xue Pan
Scaling invariance of time series has been making great contributions in diverse research fields. But how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuation and bias in statistical quantities, and consequent invalidation of currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale invariance in very short time series with length ~10^2. Calculations with specified Hurst exponent values of 0.2, 0.3, ..., 0.9 show that by using the standard central moving average de-trending procedure this method can evaluate the scaling exponents for short time series with negligible bias (≤0.03) and a sharp confidence interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximately oval path of a specified length, we observe that though the averages and deviations of the scaling exponents are close, their evolutionary behaviors display rich patterns. It has potential use in analyzing physiological signals, detecting early warning signals, and so on. As an emphasis, our core contribution is that by means of the proposed method one can precisely estimate the Shannon entropy from limited records.
Evaluation of scaling invariance embedded in short time series.
Pan, Xue; Hou, Lei; Stephen, Mutua; Yang, Huijie; Zhu, Chenping
2014-01-01
Scaling invariance of time series has been making great contributions in diverse research fields. But how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuation and bias in statistical quantities, and consequent invalidation of currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale invariance in very short time series with length ~10^2. Calculations with specified Hurst exponent values of 0.2, 0.3, ..., 0.9 show that by using the standard central moving average de-trending procedure this method can evaluate the scaling exponents for short time series with negligible bias (≤0.03) and a sharp confidence interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximately oval path of a specified length, we observe that though the averages and deviations of the scaling exponents are close, their evolutionary behaviors display rich patterns. It has potential use in analyzing physiological signals, detecting early warning signals, and so on. As an emphasis, our core contribution is that by means of the proposed method one can precisely estimate the Shannon entropy from limited records.
Modeling Non-Gaussian Time Series with Nonparametric Bayesian Model.
Xu, Zhiguang; MacEachern, Steven; Xu, Xinyi
2015-02-01
We present a class of Bayesian copula models whose major components are the marginal (limiting) distribution of a stationary time series and the internal dynamics of the series. We argue that these are the two features with which an analyst is typically most familiar, and hence that these are natural components with which to work. For the marginal distribution, we use a nonparametric Bayesian prior distribution along with a cdf-inverse cdf transformation to obtain large support. For the internal dynamics, we rely on the traditionally successful techniques of normal-theory time series. Coupling the two components gives us a family of (Gaussian) copula transformed autoregressive models. The models provide coherent adjustments of time scales and are compatible with many extensions, including changes in volatility of the series. We describe basic properties of the models, show their ability to recover non-Gaussian marginal distributions, and use a GARCH modification of the basic model to analyze stock index return series. The models are found to provide better fit and improved short-range and long-range predictions than Gaussian competitors. The models are extensible to a large variety of fields, including continuous time models, spatial models, models for multiple series, models driven by external covariate streams, and non-stationary models.
Drunk driving detection based on classification of multivariate time series.
Li, Zhenlong; Jin, Xue; Zhao, Xiaohua
2015-09-01
This paper addresses the problem of detecting drunk driving based on classification of multivariate time series. First, driving performance measures were collected from a test in a driving simulator located in the Traffic Research Center, Beijing University of Technology. Lateral position and steering angle were used to detect drunk driving. Second, multivariate time series analysis was performed to extract the features. A piecewise linear representation was used to represent multivariate time series. A bottom-up algorithm was then employed to separate multivariate time series. The slope and time interval of each segment were extracted as the features for classification. Third, a support vector machine classifier was used to classify driver's state into two classes (normal or drunk) according to the extracted features. The proposed approach achieved an accuracy of 80.0%. Drunk driving detection based on the analysis of multivariate time series is feasible and effective. The approach has implications for drunk driving detection. Copyright © 2015 Elsevier Ltd and National Safety Council. All rights reserved.
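The feature-extraction step, a piecewise linear representation obtained by bottom-up segmentation, with the slope and duration of each segment kept as features, can be sketched as follows. This is a generic bottom-up merge on a toy piecewise-linear "driving measure", not the authors' exact algorithm or data; segment count and signal shape are assumptions.

```python
import numpy as np

def fit_err(t, y):
    # residual sum of squares of the least-squares line through (t, y)
    A = np.column_stack([t, np.ones(len(t))])
    _, res, *_ = np.linalg.lstsq(A, y, rcond=None)
    return res[0] if res.size else 0.0

def bottom_up(t, y, k):
    """Start from tiny segments and repeatedly merge the adjacent pair
    whose joint linear fit is cheapest, until k segments remain."""
    bounds = list(range(0, len(t), 2)) + [len(t)]
    while len(bounds) - 1 > k:
        costs = [fit_err(t[bounds[i]:bounds[i + 2]], y[bounds[i]:bounds[i + 2]])
                 for i in range(len(bounds) - 2)]
        del bounds[int(np.argmin(costs)) + 1]
    return bounds

# Piecewise-linear toy signal with three regimes (breaks at 20 and 40).
t = np.arange(60.0)
y = np.piecewise(t, [t < 20, (t >= 20) & (t < 40), t >= 40],
                 [lambda s: s, lambda s: 40 - s, lambda s: 0.5 * s - 20])

bounds = bottom_up(t, y, 3)
# slope and duration of each segment: the classification features
features = [(np.polyfit(t[a:b], y[a:b], 1)[0], b - a)
            for a, b in zip(bounds, bounds[1:])]
```

The resulting (slope, duration) pairs are the kind of segment features that would then be fed to a support vector machine classifier.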
Recurrent Neural Networks for Multivariate Time Series with Missing Values.
Che, Zhengping; Purushotham, Sanjay; Cho, Kyunghyun; Sontag, David; Liu, Yan
2018-04-17
Multivariate time series data in practical applications, such as health care, geoscience, and biology, are characterized by a variety of missing values. In time series prediction and other related tasks, it has been noted that missing values and their missing patterns are often correlated with the target labels, a.k.a., informative missingness. There is very limited work on exploiting the missing patterns for effective imputation and improving prediction performance. In this paper, we develop novel deep learning models, namely GRU-D, as one of the early attempts. GRU-D is based on Gated Recurrent Unit (GRU), a state-of-the-art recurrent neural network. It takes two representations of missing patterns, i.e., masking and time interval, and effectively incorporates them into a deep model architecture so that it not only captures the long-term temporal dependencies in time series, but also utilizes the missing patterns to achieve better prediction results. Experiments of time series classification tasks on real-world clinical datasets (MIMIC-III, PhysioNet) and synthetic datasets demonstrate that our models achieve state-of-the-art performance and provide useful insights for better understanding and utilization of missing values in time series analysis.
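GRU-D's two representations of missingness, a masking vector and a time interval since the last observation, are easy to construct outside any neural network. The sketch below builds them for one variable and applies a simple exponential input decay toward the empirical mean, in the spirit of GRU-D's decay mechanism; the decay rate 0.1 is an assumed constant standing in for the model's learned parameters.

```python
import numpy as np

# One variable at integer time stamps; NaN marks a missing value.
x = np.array([0.5, np.nan, np.nan, 1.2, np.nan, 0.7])
s = np.arange(len(x), dtype=float)          # time stamps

m = (~np.isnan(x)).astype(float)            # masking vector: 1 = observed

# Time interval since the last observed value (the delta representation):
delta = np.zeros_like(x)
for t in range(1, len(x)):
    gap = s[t] - s[t - 1]
    delta[t] = gap if m[t - 1] == 1 else gap + delta[t - 1]

# Forward-fill the last observation, then decay it toward the mean:
x_last = np.where(m == 1, x, np.nan)
for t in range(1, len(x)):
    if np.isnan(x_last[t]):
        x_last[t] = x_last[t - 1]
gamma = np.exp(-np.maximum(0.0, 0.1 * delta))   # assumed decay rate 0.1
x_mean = np.nanmean(x)
x_hat = np.where(m == 1, x, gamma * x_last + (1 - gamma) * x_mean)
```

In GRU-D proper, m and delta are fed into the recurrent cell alongside x_hat, so the network can exploit informative missingness rather than discard it.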
Self-affinity in the dengue fever time series
Azevedo, S. M.; Saba, H.; Miranda, J. G. V.; Filho, A. S. Nascimento; Moret, M. A.
2016-06-01
Dengue is a complex public health problem that is common in tropical and subtropical regions. This disease has risen substantially in the last three decades, and the physical symptoms depict the self-affine behavior of the occurrences of reported dengue cases in Bahia, Brazil. This study uses detrended fluctuation analysis (DFA) to verify the scale behavior in a time series of dengue cases and to evaluate the long-range correlations that are characterized by the power law α exponent for different cities in Bahia, Brazil. The scaling exponent (α) presents different long-range correlations, i.e. uncorrelated, anti-persistent, persistent and diffusive behaviors. The long-range correlations highlight the complex behavior of the time series of this disease. The findings show that there are two distinct types of scale behavior. In the first behavior, the time series presents a persistent α exponent for a one-month period. For large periods, the time series signal approaches subdiffusive behavior. The hypothesis of the long-range correlations in the time series of the occurrences of reported dengue cases was validated. The observed self-affinity is useful as a forecasting tool for future periods through extrapolation of the α exponent behavior. This complex system has a higher predictability in a relatively short time (approximately one month), and it suggests a new tool in epidemiological control strategies. However, predictions for large periods using DFA are hidden by the subdiffusive behavior.
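Detrended fluctuation analysis itself is compact enough to sketch: integrate the series, detrend it linearly in windows of each size, and read the scaling exponent alpha off a log-log fit of fluctuation versus window size. The sketch below checks the method on uncorrelated noise (where alpha should be near 0.5); the scales and series length are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def dfa(x, scales):
    """Detrended fluctuation analysis: scaling exponent alpha from the
    log-log fit of fluctuation versus window size."""
    y = np.cumsum(x - np.mean(x))            # integrated profile
    F = []
    for s in scales:
        n = len(y) // s
        segs = y[:n * s].reshape(n, s)
        t = np.arange(s)
        ms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)     # local linear detrend
            ms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(ms)))
    return np.polyfit(np.log(scales), np.log(F), 1)[0]

x = rng.normal(0, 1, 4096)                   # uncorrelated noise: alpha ~ 0.5
scales = np.array([8, 16, 32, 64, 128, 256])
alpha = dfa(x, scales)
```

Applied to a dengue case-count series, alpha > 0.5 would indicate the persistent behavior the abstract reports, and alpha near 0.5 the uncorrelated regime.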
Stochastic modeling of hourly rainfall times series in Campania (Italy)
Giorgio, M.; Greco, R.
2009-04-01
Occurrence of flowslides and floods in small catchments is difficult to predict, since it is affected by a number of variables, such as mechanical and hydraulic soil properties, slope morphology, vegetation coverage, and rainfall spatial and temporal variability. Consequently, landslide risk assessment procedures and early warning systems still rely on simple empirical models based on correlation between recorded rainfall data and observed landslides and/or river discharges. The effectiveness of such systems could be improved by reliable quantitative rainfall prediction, which would allow larger lead times. Analysis of on-site recorded rainfall height time series represents the most effective approach for reliable prediction of the local temporal evolution of rainfall. Hydrological time series analysis is a widely studied field in hydrology, often carried out by means of autoregressive models, such as AR, ARMA, ARX and ARMAX (e.g. Salas [1992]). Such models give the best results when applied to autocorrelated hydrological time series, like river flow or level time series. Conversely, they are not able to model the behaviour of intermittent time series, like point rainfall height series usually are, especially when recorded with short sampling intervals. More useful for this issue are the so-called DRIP (Disaggregated Rectangular Intensity Pulse) and NSRP (Neyman-Scott Rectangular Pulse) models [Heneker et al., 2001; Cowpertwait et al., 2002], usually adopted to generate synthetic point rainfall series. In this paper, the DRIP model approach is adopted, in which the sequence of rain storms and dry intervals constituting the structure of the rainfall time series is modeled as an alternating renewal process. The final aim of the study is to provide a useful tool for implementing an early warning system for hydrogeological risk management. Model calibration has been carried out with hourly rainfall height data provided by the rain gauges of Campania Region civil
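The alternating-renewal structure, dry spells and storms drawn in turn from duration distributions, can be sketched as a synthetic hourly rainfall generator. The exponential durations, constant per-storm intensity, and parameter values below are assumptions for illustration; a calibrated DRIP model would use distributions fitted to gauge data.

```python
import numpy as np

rng = np.random.default_rng(11)

# Alternating renewal sketch: dry spell, then storm, repeated.
mean_dry, mean_wet, mean_intensity = 30.0, 6.0, 2.0   # hours, hours, mm/h

hours, rain = 0, []
while hours < 10000:
    dry = int(rng.exponential(mean_dry)) + 1          # dry interval length
    wet = int(rng.exponential(mean_wet)) + 1          # storm duration
    intensity = rng.exponential(mean_intensity)       # one intensity per storm
    rain.extend([0.0] * dry)
    rain.extend([intensity] * wet)
    hours += dry + wet
rain = np.array(rain[:10000])

wet_fraction = np.mean(rain > 0)
```

The generator reproduces the intermittency that defeats AR/ARMA-type models: long runs of exact zeros punctuated by rectangular pulses of rain.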
Arbitrage, market definition and monitoring a time series approach
Burke, S; Hunter, J
2012-01-01
This article considers the application to regional price data of time series methods to test stationarity, multivariate cointegration and exogeneity. The discovery of stationary price differentials in a bivariate setting implies that the series are rendered stationary by capturing a common trend, and through this mechanism we observe long-run arbitrage. This is indicative of a broader market definition and efficiency. The problem is considered in relation to more than 700 weekly data points on...
Time Series Analysis of Wheat Futures Reward in China
Institute of Scientific and Technical Information of China (English)
Anonymous
2005-01-01
Unlike most research, which focuses on single futures contracts and lacks comparison across periods, this paper describes the statistical characteristics of the wheat futures reward time series of the Zhengzhou Commodity Exchange over the recent three years. Besides basic statistical analysis, the paper used GARCH and EGARCH models to describe the series, which exhibited ARCH effects, and analyzed the persistence of volatility shocks and the leverage effect. The results showed that, compared with a normal distribution, the wheat futures reward series were non-normal, with leptokurtic, heavy-tailed distributions. The study also found that two parts of the reward series had no autocorrelation. Among the six correlated series, three presented ARCH effects. Using an autoregressive distributed lag model, a GARCH model, and an EGARCH model, the paper demonstrates the persistence of volatility shocks and the leverage effect in the wheat futures reward time series. The results reveal that, on the one hand, the statistical characteristics of the wheat futures reward are broadly similar to those of mature futures markets abroad; on the other hand, they reflect some shortcomings, such as the immaturity of, and over-control by the government in, the Chinese futures market.
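The two stylized facts the abstract appeals to, heavy tails and persistent volatility shocks, fall out of a GARCH(1,1) simulation directly. The parameters below are assumed illustrative values, not estimates from the wheat futures data.

```python
import numpy as np

rng = np.random.default_rng(6)

# Simulate a GARCH(1,1) return series: sigma2_t = omega + alpha*r_{t-1}^2 + beta*sigma2_{t-1}
n, omega, alpha, beta = 20000, 0.05, 0.10, 0.85
r = np.zeros(n)
sig2 = omega / (1 - alpha - beta)            # start at the unconditional variance
for t in range(1, n):
    sig2 = omega + alpha * r[t - 1] ** 2 + beta * sig2
    r[t] = np.sqrt(sig2) * rng.normal()

# Leptokurtosis: kurtosis above the Gaussian value of 3.
excess_kurtosis = np.mean(r ** 4) / np.mean(r ** 2) ** 2 - 3.0
# Volatility clustering: squared returns are positively autocorrelated.
acf1 = np.corrcoef(r[1:] ** 2, r[:-1] ** 2)[0, 1]
```

A leverage effect (negative shocks raising volatility more than positive ones) would additionally require the EGARCH asymmetry term, which this symmetric sketch omits.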
Unstable Periodic Orbit Analysis of Histograms of Chaotic Time Series
International Nuclear Information System (INIS)
Zoldi, S.M.
1998-01-01
Using the Lorenz equations, we have investigated whether unstable periodic orbits (UPOs) associated with a strange attractor may predict the occurrence of the robust sharp peaks in histograms of some experimental chaotic time series. Histograms with sharp peaks occur for the Lorenz parameter value r=60.0 but not for r=28.0 , and the sharp peaks for r=60.0 do not correspond to a histogram derived from any single UPO. However, we show that histograms derived from the time series of a non-Axiom-A chaotic system can be accurately predicted by an escape-time weighting of UPO histograms. copyright 1998 The American Physical Society
Minimum entropy density method for the time series analysis
Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae
2009-01-01
The entropy density is an intuitive and powerful concept to study the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, which is defined as the scale in which the uncertainty is minimized, hence the pattern is revealed most. The MEDM is applied to the financial time series of Standard and Poor’s 500 index from February 1983 to April 2006. Then the temporal behavior of structure scale is obtained and analyzed in relation to the information delivery time and efficient market hypothesis.
Multi-Scale Dissemination of Time Series Data
DEFF Research Database (Denmark)
Guo, Qingsong; Zhou, Yongluan; Su, Li
2013-01-01
In this paper, we consider the problem of continuous dissemination of time series data, such as sensor measurements, to a large number of subscribers. These subscribers fall into multiple subscription levels, where each subscription level is specified by the bandwidth constraint of a subscriber, which is an abstract indicator for both the physical limits and the amount of data that the subscriber would like to handle. To handle this problem, we propose a system framework for multi-scale time series data dissemination that employs a typical tree-based dissemination network and existing time...
Compounding approach for univariate time series with nonstationary variances
Schäfer, Rudi; Barkhofen, Sonja; Guhr, Thomas; Stöckmann, Hans-Jürgen; Kuhl, Ulrich
2015-12-01
A defining feature of nonstationary systems is the time dependence of their statistical parameters. Measured time series may exhibit Gaussian statistics on short time horizons, due to the central limit theorem. The sample statistics for long time horizons, however, averages over the time-dependent variances. To model the long-term statistical behavior, we compound the local distribution with the distribution of its parameters. Here, we consider two concrete, but diverse, examples of such nonstationary systems: the turbulent air flow of a fan and a time series of foreign exchange rates. Our main focus is to empirically determine the appropriate parameter distribution for the compounding approach. To this end, we extract the relevant time scales by decomposing the time signals into windows and determine the distribution function of the thus obtained local variances.
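The compounding argument can be reproduced numerically: a series that is locally Gaussian but has a time-dependent variance shows heavy tails (excess kurtosis) on long horizons, and the empirical distribution of windowed local variances is exactly the ingredient the compounding ansatz needs. The variance profile and window size below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Nonstationary toy series: locally Gaussian, with a slowly drifting variance.
n, win = 20000, 200
local_sigma = 1.0 + 0.8 * np.sin(2 * np.pi * np.arange(n) / 5000) ** 2
x = rng.normal(0, 1, n) * local_sigma

# Local variances over non-overlapping windows: the distribution whose
# spread the compounding approach integrates over.
local_var = x.reshape(-1, win).var(axis=1)

# Long-horizon kurtosis exceeds the Gaussian value 3: heavy tails emerge
# purely from averaging over the time-dependent variances.
kurt = np.mean((x - x.mean()) ** 4) / np.var(x) ** 2
```

On short windows each segment looks Gaussian; only the full-sample statistics reveal the compounded, heavy-tailed shape the paper models.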
Characterizing time series via complexity-entropy curves
Ribeiro, Haroldo V.; Jauregui, Max; Zunino, Luciano; Lenzi, Ervin K.
2017-06-01
The search for patterns in time series is a very common task when dealing with complex systems. This is usually accomplished by employing a complexity measure such as entropies and fractal dimensions. However, such measures usually only capture a single aspect of the system dynamics. Here, we propose a family of complexity measures for time series based on a generalization of the complexity-entropy causality plane. By replacing the Shannon entropy by a monoparametric entropy (Tsallis q entropy) and after considering the proper generalization of the statistical complexity (q complexity), we build up a parametric curve (the q -complexity-entropy curve) that is used for characterizing and classifying time series. Based on simple exact results and numerical simulations of stochastic processes, we show that these curves can distinguish among different long-range, short-range, and oscillating correlated behaviors. Also, we verify that simulated chaotic and stochastic time series can be distinguished based on whether these curves are open or closed. We further test this technique in experimental scenarios related to chaotic laser intensity, stock price, sunspot, and geomagnetic dynamics, confirming its usefulness. Finally, we prove that these curves enhance the automatic classification of time series with long-range correlations and interbeat intervals of healthy subjects and patients with heart disease.
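The entropy axis of the complexity-entropy plane is the normalized Shannon entropy of ordinal patterns, which is straightforward to sketch. The example below computes only this permutation entropy H (not the full q-parametrized curve of the paper); the embedding dimension and test signals are assumptions.

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(9)

def permutation_entropy(x, d=4):
    """Normalized Shannon entropy of ordinal patterns of dimension d
    (the H axis of the complexity-entropy plane)."""
    n = len(x) - d + 1
    windows = np.lib.stride_tricks.sliding_window_view(x, d)
    patterns = np.argsort(windows, axis=1)
    # encode each ordinal pattern as a single base-d integer key
    keys = (patterns * (d ** np.arange(d))).sum(axis=1)
    _, counts = np.unique(keys, return_counts=True)
    p = counts / n
    return -(p * np.log(p)).sum() / np.log(factorial(d))

noise = rng.normal(0, 1, 5000)      # fully disordered: H near 1
ramp = np.arange(5000.0)            # fully ordered: H = 0
H_noise = permutation_entropy(noise)
H_ramp = permutation_entropy(ramp)
```

The paper's generalization replaces the Shannon entropy here with the Tsallis q-entropy and traces H against the statistical complexity as q varies, producing the open or closed curves used for classification.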
Recurrent Neural Network Applications for Astronomical Time Series
Protopapas, Pavlos
2017-06-01
The benefits of good predictive models in astronomy lie in early event prediction systems and effective resource allocation. Current time series methods applicable to regular time series have not evolved to generalize to irregular time series. In this talk, I will describe two recurrent neural network methods, Long Short-Term Memory (LSTM) and Echo State Networks (ESNs), for predicting irregular time series. Feature engineering along with non-linear modeling proved to be an effective predictor. For noisy time series, the prediction is improved by training the network on error realizations using the error estimates from astronomical light curves. In addition to this, we propose a new neural network architecture to remove correlation from the residuals in order to improve prediction and compensate for the noisy data. Finally, I show how to correctly set hyperparameters for a stable and performant solution. In this work, we circumvent the difficulty of manual tuning by optimizing ESN hyperparameters using Bayesian optimization with Gaussian process priors. This automates the tuning procedure, enabling users to employ the power of RNNs without needing an in-depth understanding of it.
Multi-granular trend detection for time-series analysis
van Goethem, A.I.; Staals, F.; Löffler, M.; Dykes, J.; Speckmann, B.
2017-01-01
Time series (such as stock prices) and ensembles (such as model runs for weather forecasts) are two important types of one-dimensional time-varying data. Such data is readily available in large quantities but visual analysis of the raw data quickly becomes infeasible, even for moderately sized data
Time Series Analysis Based on Running Mann Whitney Z Statistics
A sensitive and objective time series analysis method based on the calculation of Mann-Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
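The core running Mann-Whitney Z computation can be sketched directly. The version below uses the standard normal approximation (counting ties as 1/2, with no continuity or tie-variance correction, and omitting the Monte Carlo step truncated above); the function names are invented for illustration.

```python
from math import sqrt

def mann_whitney_z(sample_a, sample_b):
    """Mann-Whitney U for two samples, normalized to a Z statistic.

    U counts pairs (a, b) with a > b, with ties contributing 1/2; the normal
    approximation Z = (U - mean_U) / sqrt(var_U) is then applied."""
    n1, n2 = len(sample_a), len(sample_b)
    u = sum((a > b) + 0.5 * (a == b) for a in sample_a for b in sample_b)
    mean_u = n1 * n2 / 2
    var_u = n1 * n2 * (n1 + n2 + 1) / 12   # tie correction omitted in this sketch
    return (u - mean_u) / sqrt(var_u)

def running_z(series, window):
    """Slide two adjacent windows along the series; a positive Z flags an
    upward shift of the newer window relative to the older one."""
    return [mann_whitney_z(series[i + window:i + 2 * window],
                           series[i:i + window])
            for i in range(len(series) - 2 * window + 1)]
```

On a series with an abrupt level shift, the running Z spikes exactly where the two windows straddle the change point.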
The Photoplethismographic Signal Processed with Nonlinear Time Series Analysis Tools
International Nuclear Information System (INIS)
Hernandez Caceres, Jose Luis; Hong, Rolando; Garcia Lanz, Abel; Garcia Dominguez, Luis; Cabannas, Karelia
2001-01-01
Finger photoplethismography (PPG) signals were submitted to nonlinear time series analysis. The applied analytical techniques were: (i) high-degree polynomial fitting for baseline estimation; (ii) FFT analysis for estimating power spectra; (iii) fractal dimension estimation via Higuchi's time-domain method; and (iv) kernel nonparametric estimation for reconstructing noise-free attractors and for estimating the signal's stochastic components.
Time Series Outlier Detection Based on Sliding Window Prediction
Directory of Open Access Journals (Sweden)
Yufeng Yu
2014-01-01
In order to detect outliers in hydrological time series data, and thereby improve data quality and the quality of decisions related to the design, operation, and management of water resources, this research develops a time series outlier detection method for hydrologic data that can be used to identify data that deviate from historical patterns. The method first builds a forecasting model on the historical data and then uses it to predict future values. Anomalies are assumed to take place if the observed values fall outside a given prediction confidence interval (PCI), which can be calculated from the predicted value and a confidence coefficient. The use of the PCI as a threshold rests mainly on the fact that it considers the uncertainty in the data series parameters of the forecasting model, addressing the problem of suitable threshold selection. The method performs fast, incremental evaluation of data as it becomes available, scales to large quantities of data, and requires no preclassification of anomalies. Experiments with different real-world hydrologic time series showed that the proposed method is fast, correctly identifies abnormal data, and can be used for hydrologic time series analysis.
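A stripped-down version of PCI-based detection is easy to sketch. Here the forecasting model is replaced by a moving-average one-step predictor, so this is only an illustration of the PCI idea, not the paper's model; `detect_outliers` and its defaults are assumptions.

```python
from math import sqrt

def detect_outliers(series, window=5, z=1.96):
    """Flag points falling outside a prediction confidence interval (PCI).

    The one-step forecast is the mean of the previous `window` values, and the
    PCI half-width is z times the sample standard deviation of those values
    (a simple stand-in for a fitted forecasting model)."""
    flags = []
    for i in range(window, len(series)):
        history = series[i - window:i]
        pred = sum(history) / window
        sd = sqrt(sum((v - pred) ** 2 for v in history) / (window - 1))
        flags.append(abs(series[i] - pred) > z * sd)
    return flags
```

Because the flag depends only on already-observed history, the evaluation is incremental: each new observation is tested the moment it arrives, matching the streaming use case described in the abstract.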
Grammar-based feature generation for time-series prediction
De Silva, Anthony Mihirana
2015-01-01
This book proposes a novel approach for time-series prediction using machine learning techniques with automatic feature generation. Application of machine learning techniques to predict time-series continues to attract considerable attention due to the difficulty of the prediction problems compounded by the non-linear and non-stationary nature of the real world time-series. The performance of machine learning techniques, among other things, depends on suitable engineering of features. This book proposes a systematic way for generating suitable features using context-free grammar. A number of feature selection criteria are investigated and a hybrid feature generation and selection algorithm using grammatical evolution is proposed. The book contains graphical illustrations to explain the feature generation process. The proposed approaches are demonstrated by predicting the closing price of major stock market indices, peak electricity load and net hourly foreign exchange client trade volume. The proposed method ...
Learning of time series through neuron-to-neuron instruction
Energy Technology Data Exchange (ETDEWEB)
Miyazaki, Y [Department of Physics, Kyoto University, Kyoto 606-8502, (Japan); Kinzel, W [Institut fuer Theoretische Physik, Universitaet Wurzburg, 97074 Wurzburg (Germany); Shinomoto, S [Department of Physics, Kyoto University, Kyoto (Japan)
2003-02-07
A model neuron with delayline feedback connections can learn a time series generated by another model neuron. It has been known that some student neurons that have completed such learning under the instruction of a teacher's quasi-periodic sequence mimic the teacher's time series over a long interval, even after instruction has ceased. We found that in addition to such faithful students, there are unfaithful students whose time series eventually diverge exponentially from that of the teacher. In order to understand the circumstances that allow for such a variety of students, the orbit dimension was estimated numerically. The quasi-periodic orbits in question were found to be confined in spaces with dimensions significantly smaller than that of the full phase space.
Time series analysis and its applications with R examples
Shumway, Robert H
2017-01-01
The fourth edition of this popular graduate textbook, like its predecessors, presents a balanced and comprehensive treatment of both time and frequency domain methods with accompanying theory. Numerous examples using nontrivial data illustrate solutions to problems such as discovering natural and anthropogenic climate change, evaluating pain perception experiments using functional magnetic resonance imaging, and monitoring a nuclear test ban treaty. The book is designed as a textbook for graduate level students in the physical, biological, and social sciences and as a graduate level text in statistics. Some parts may also serve as an undergraduate introductory course. Theory and methodology are separated to allow presentations on different levels. In addition to coverage of classical methods of time series regression, ARIMA models, spectral analysis and state-space models, the text includes modern developments including categorical time series analysis, multivariate spectral methods, long memory series, nonli...
Nonlinear time series analysis of the human electrocardiogram
International Nuclear Information System (INIS)
Perc, Matjaz
2005-01-01
We analyse the human electrocardiogram with simple nonlinear time series analysis methods that are appropriate for graduate as well as undergraduate courses. In particular, attention is devoted to the notions of determinism and stationarity in physiological data. We emphasize that methods of nonlinear time series analysis can be successfully applied only if the studied data set originates from a deterministic stationary system. After positively establishing the presence of determinism and stationarity in the studied electrocardiogram, we calculate the maximal Lyapunov exponent, thus providing interesting insights into the dynamics of the human heart. Moreover, to facilitate interest and enable the integration of nonlinear time series analysis methods into the curriculum at an early stage of the educational process, we also provide user-friendly programs for each implemented method
Neural network versus classical time series forecasting models
Nor, Maria Elena; Safuan, Hamizah Mohd; Shab, Noorzehan Fazahiyah Md; Asrul, Mohd; Abdullah, Affendi; Mohamad, Nurul Asmaa Izzati; Lee, Muhammad Hisyam
2017-05-01
Artificial neural networks (ANNs) have an advantage in time series forecasting, as they have the potential to solve complex forecasting problems. This is because an ANN is a data-driven approach that can be trained to map past values of a time series. In this study, the forecast performance of a neural network and a classical time series forecasting method, namely the seasonal autoregressive integrated moving average model, was compared using gold price data. Moreover, the effect of different data preprocessing on the forecast performance of the neural network was examined. The forecast accuracy was evaluated using the mean absolute deviation, root mean square error and mean absolute percentage error. It was found that the ANN produced the most accurate forecast when the Box-Cox transformation was used as data preprocessing.
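The preprocessing step and the three accuracy measures named above are small enough to sketch; a minimal version (invented function names, no inverse transform or bias correction) might look like:

```python
from math import log, sqrt

def box_cox(x, lam):
    """Box-Cox transform; lam == 0 gives the log transform. Requires x > 0."""
    return [log(v) if lam == 0 else (v ** lam - 1) / lam for v in x]

def mad(actual, forecast):
    """Mean absolute deviation of forecast errors."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def rmse(actual, forecast):
    """Root mean square error of forecast errors."""
    return sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual))

def mape(actual, forecast):
    """Mean absolute percentage error (in percent); requires nonzero actuals."""
    return 100 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)
```

MAPE is scale-free while MAD and RMSE are in the units of the series, which is why studies like this one typically report all three.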
Track Irregularity Time Series Analysis and Trend Forecasting
Directory of Open Access Journals (Sweden)
Jia Chaolong
2012-01-01
The combination of linear and nonlinear methods is widely used in the prediction of time series data. This paper analyzes track irregularity time series data by using gray incidence degree models and methods of data transformation, trying to find the connotative relationship between the time series data. In this paper, GM(1,1), which is based on a first-order, single-variable linear differential equation, is used, after an adaptive improvement and error correction, to predict the long-term changing trend of track irregularity at a fixed measuring point; the stochastic linear AR model, Kalman filtering model, and artificial neural network model are applied to predict the short-term changing trend of track irregularity at the unit section. Both the long-term and short-term predictions show that the model is effective and can achieve the expected accuracy.
A multidisciplinary database for geophysical time series management
Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.
2013-12-01
The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. The term time series refers to a set of observations of a given phenomenon acquired sequentially in time; when the time intervals are equally spaced, one speaks of the period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), to acquire time series from different data sources and standardize them within a relational database. The standardization step provides the ability to perform operations, such as queries and visualization, on many measures, synchronizing them on a common time scale. The proposed architecture follows a multiple-layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources, such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the Loaders layer performs a security check of the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although the system has to manage huge amounts of data, performance is guaranteed by a smart table-partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which provide the possibility to query different time series over a specified time range, or to follow real-time signal acquisition, according to a data access policy for the users.
A novel time series link prediction method: Learning automata approach
Moradabadi, Behnaz; Meybodi, Mohammad Reza
2017-09-01
Link prediction is a main social network challenge that uses the network structure to predict future links. Common link prediction approaches to predicting hidden links use a static graph representation, in which a snapshot of the network is analyzed to find hidden or future links. For example, similarity-metric-based link prediction is a common traditional approach that calculates a similarity metric for each non-connected link, sorts the links by their similarity metrics, and labels the links with higher similarity scores as the future links. Because people's activities in social networks are dynamic and uncertain, and the structure of the networks changes over time, using deterministic graphs for modeling and analysis of social networks may not be appropriate. In the time-series link prediction problem, the time series of link occurrences is used to predict future links. In this paper, we propose a new time series link prediction method based on learning automata. In the proposed algorithm, for each link that must be predicted there is one learning automaton, and each learning automaton tries to predict the existence or non-existence of the corresponding link. To predict the link occurrence at time T, there is a chain consisting of stages 1 through T - 1, and the learning automaton passes through these stages to learn the existence or non-existence of the corresponding link. Our preliminary link prediction experiments with co-authorship and email networks have provided satisfactory results when time series of link occurrences are considered.
Time series patterns and language support in DBMS
Telnarova, Zdenka
2017-07-01
This contribution is focused on the pattern type Time Series as a semantically rich representation of data. Some examples of the implementation of this pattern type in traditional database management systems are briefly presented. There are many approaches to manipulating and querying patterns. A crucial issue is a systematic approach to pattern management and a specific pattern query language that takes the semantics of patterns into consideration. The query language SQL-TS for manipulating patterns is demonstrated on Time Series data.
Testing for intracycle determinism in pseudoperiodic time series.
Coelho, Mara C S; Mendes, Eduardo M A M; Aguirre, Luis A
2008-06-01
A determinism test is proposed based on the well-known method of the surrogate data. Assuming predictability to be a signature of determinism, the proposed method checks for intracycle (e.g., short-term) determinism in the pseudoperiodic time series for which standard methods of surrogate analysis do not apply. The approach presented is composed of two steps. First, the data are preprocessed to reduce the effects of seasonal and trend components. Second, standard tests of surrogate analysis can then be used. The determinism test is applied to simulated and experimental pseudoperiodic time series and the results show the applicability of the proposed test.
Bootstrap Power of Time Series Goodness of fit tests
Directory of Open Access Journals (Sweden)
Sohail Chand
2013-10-01
In this article, we examine the power of various versions of the Box-Pierce statistic and of the Cramér-von Mises test. An extensive simulation study has been conducted to compare the power of these tests. Algorithms are provided for the power calculations, and a comparison is also made between the semi-parametric bootstrap methods used for time series. Results show that the Box-Pierce statistic and its various versions have good power against linear time series models but poor power against nonlinear models, while the situation reverses for the Cramér-von Mises test. Moreover, we found that the dynamic bootstrap method is better than the fixed-design bootstrap method.
Handbook of Time Series Analysis Recent Theoretical Developments and Applications
Schelter, Björn; Timmer, Jens
2006-01-01
This handbook provides an up-to-date survey of current research topics and applications of time series analysis methods written by leading experts in their fields. It covers recent developments in univariate as well as bivariate and multivariate time series analysis techniques, ranging from physics to life-science applications. Each chapter comprises both methodological aspects and applications to real-world complex systems, such as the human brain or Earth's climate. Covering an exceptionally broad spectrum of topics, it serves beginners, experts and practitioners who seek to understand the latest developments.
Time Series Analysis of Onchocerciasis Data from Mexico: A Trend towards Elimination
Pérez-Rodríguez, Miguel A.; Adeleke, Monsuru A.; Orozco-Algarra, María E.; Arrendondo-Jiménez, Juan I.; Guo, Xianwu
2013-01-01
Background In Latin America, there are 13 geographically isolated endemic foci distributed among Mexico, Guatemala, Colombia, Venezuela, Brazil and Ecuador. The communities of the three endemic foci found within Mexico have been receiving ivermectin treatment since 1989. In this study, we predicted the trend of occurrence of cases in Mexico by applying time series analysis to monthly onchocerciasis data reported by the Mexican Secretariat of Health between 1988 and 2011 using the software R. Results A total of 15,584 cases were reported in Mexico from 1988 to 2011. The data on onchocerciasis cases are mainly from the main endemic foci of Chiapas and Oaxaca. The last case in Oaxaca was reported in 1998, but new cases were reported in the Chiapas foci up to 2011. Time series analysis performed for the foci in Mexico showed a decreasing trend of the disease over time. The best-fitted models with the smallest Akaike Information Criterion (AIC) were Auto-Regressive Integrated Moving Average (ARIMA) models, which were used to predict the tendency of onchocerciasis cases two years ahead. According to the ARIMA model predictions, a very low number of cases (below 1) is expected between 2012 and 2013 in Chiapas, the last endemic region in Mexico. Conclusion The endemic regions of Mexico evolved from highly onchocerciasis-endemic states to interruption of transmission, due to the strategies followed by the MSH, based on treatment with ivermectin. The extremely low number of expected cases predicted by the ARIMA models for the next two years suggests that onchocerciasis is being eliminated in Mexico. To our knowledge, this is the first study utilizing time series for predicting case dynamics of onchocerciasis, which could be used as a benchmark during monitoring and post-treatment surveillance. PMID:23459370
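The AIC-based model selection used in studies like this one can be illustrated with a hand-rolled autoregressive fit. This is a stand-in for full ARIMA estimation, not the study's model: `ar_aic`, the restriction to AR(1)/AR(2), and the least-squares shortcut are all illustrative assumptions.

```python
from math import log

def ar_aic(x, p):
    """Fit an AR(p) model (p = 1 or 2) by least squares on the demeaned series
    and return AIC = n * log(RSS / n) + 2 * (p + 1); smaller is better."""
    m = sum(x) / len(x)
    y = [v - m for v in x]
    if p == 1:
        num = sum(y[t - 1] * y[t] for t in range(1, len(y)))
        den = sum(y[t - 1] ** 2 for t in range(1, len(y)))
        phi = num / den
        resid = [y[t] - phi * y[t - 1] for t in range(1, len(y))]
    elif p == 2:
        # 2x2 normal equations solved in closed form
        a = sum(y[t - 1] ** 2 for t in range(2, len(y)))
        b = sum(y[t - 1] * y[t - 2] for t in range(2, len(y)))
        c = sum(y[t - 2] ** 2 for t in range(2, len(y)))
        d1 = sum(y[t - 1] * y[t] for t in range(2, len(y)))
        d2 = sum(y[t - 2] * y[t] for t in range(2, len(y)))
        det = a * c - b * b
        phi1, phi2 = (c * d1 - b * d2) / det, (a * d2 - b * d1) / det
        resid = [y[t] - phi1 * y[t - 1] - phi2 * y[t - 2] for t in range(2, len(y))]
    else:
        raise ValueError("this sketch supports p = 1 or 2 only")
    n = len(resid)
    rss = sum(r * r for r in resid)
    return n * log(rss / n) + 2 * (p + 1)
```

Comparing `ar_aic(x, 1)` with `ar_aic(x, 2)` mirrors the paper's procedure of picking the candidate with the smallest AIC, with the extra parameter penalized by the +2 term.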
Quantifying Selection with Pool-Seq Time Series Data.
Taus, Thomas; Futschik, Andreas; Schlötterer, Christian
2017-11-01
Allele frequency time series data constitute a powerful resource for unraveling mechanisms of adaptation, because the temporal dimension captures important information about evolutionary forces. In particular, Evolve and Resequence (E&R), the whole-genome sequencing of replicated experimentally evolving populations, is becoming increasingly popular. Based on computer simulations, several studies have proposed experimental parameters to optimize the identification of selection targets. No such recommendations are available for the underlying parameters, selection strength and dominance. Here, we introduce a highly accurate method to estimate selection parameters from replicated time series data, which is fast enough to be applied on a genome scale. Using this new method, we evaluate how experimental parameters can be optimized to obtain the most reliable estimates for selection parameters. We show that the effective population size (Ne) and the number of replicates have the largest impact. Because the number of time points and sequencing coverage had only a minor effect, we suggest that time series analysis is feasible without a major increase in sequencing costs. We anticipate that time series analysis will become routine in E&R studies. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
A window-based time series feature extraction method.
Katircioglu-Öztürk, Deniz; Güvenir, H Altay; Ravens, Ursula; Baykal, Nazife
2017-10-01
This study proposes a robust similarity score-based time series feature extraction method that is termed as Window-based Time series Feature ExtraCtion (WTC). Specifically, WTC generates domain-interpretable results and involves significantly low computational complexity thereby rendering itself useful for densely sampled and populated time series datasets. In this study, WTC is applied to a proprietary action potential (AP) time series dataset on human cardiomyocytes and three precordial leads from a publicly available electrocardiogram (ECG) dataset. This is followed by comparing WTC in terms of predictive accuracy and computational complexity with shapelet transform and fast shapelet transform (which constitutes an accelerated variant of the shapelet transform). The results indicate that WTC achieves a slightly higher classification performance with significantly lower execution time when compared to its shapelet-based alternatives. With respect to its interpretable features, WTC has a potential to enable medical experts to explore definitive common trends in novel datasets. Copyright © 2017 Elsevier Ltd. All rights reserved.
Stochastic generation of hourly wind speed time series
International Nuclear Information System (INIS)
Shamshad, A.; Wan Mohd Ali Wan Hussin; Bawadi, M.A.; Mohd Sanusi, S.A.
2006-01-01
In the present study, hourly wind speed data from Kuala Terengganu in Peninsular Malaysia are simulated using the transition matrix approach of a Markovian process. The wind speed time series is divided into various states based on certain criteria. The next wind speed state is selected based on the previous state. The cumulative probability transition matrix has been formed, in which each row ends with 1. Using uniform random numbers between 0 and 1, a series of future states is generated. These states are then converted to the corresponding wind speed values using another uniform random number generator. The accuracy of the model has been determined by comparing statistical characteristics such as the average, standard deviation, root mean square error, probability density function and autocorrelation function of the generated data with those of the original data. The generated wind speed time series is capable of preserving the wind speed characteristics of the observed data.
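The cumulative transition matrix construction described here is straightforward to sketch. This minimal version works on already-discretized states and omits the final back-conversion of states to wind speed values; the function names are illustrative, not from the paper.

```python
import random

def build_cumulative_matrix(states, n_states):
    """Row-wise cumulative transition probabilities; each row ends with 1.0."""
    counts = [[0] * n_states for _ in range(n_states)]
    for a, b in zip(states, states[1:]):
        counts[a][b] += 1
    cum = []
    for row in counts:
        total = sum(row) or 1          # unseen source state: degenerate row
        acc, cum_row = 0.0, []
        for c in row:
            acc += c / total
            cum_row.append(acc)
        cum_row[-1] = 1.0              # guard against floating-point shortfall
        cum.append(cum_row)
    return cum

def generate_states(cum, start, length, rng=random):
    """Walk the chain: draw u in [0, 1) and take the first state whose
    cumulative probability exceeds u."""
    out = [start]
    for _ in range(length - 1):
        u = rng.random()
        row = cum[out[-1]]
        out.append(next(i for i, edge in enumerate(row) if u < edge))
    return out
```

In the full method each generated state index would then be mapped back to a wind speed drawn uniformly within that state's speed bin, which is the second random number generator the abstract mentions.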
Segmentation of time series with long-range fractal correlations
Bernaola-Galván, P.; Oliver, J.L.; Hackenberg, M.; Coronado, A.V.; Ivanov, P.Ch.; Carpena, P.
2012-01-01
Segmentation is a standard method of data analysis to identify change-points dividing a nonstationary time series into homogeneous segments. However, for long-range fractal correlated series, most of the segmentation techniques detect spurious change-points which are simply due to the heterogeneities induced by the correlations and not to real nonstationarities. To avoid this oversegmentation, we present a segmentation algorithm which takes as a reference for homogeneity, instead of a random i.i.d. series, a correlated series modeled by a fractional noise with the same degree of correlations as the series to be segmented. We apply our algorithm to artificial series with long-range correlations and show that it systematically detects only the change-points produced by real nonstationarities and not those created by the correlations of the signal. Further, we apply the method to the sequence of the long arm of human chromosome 21, which is known to have long-range fractal correlations. We obtain only three segments that clearly correspond to the three regions of different G + C composition revealed by means of a multi-scale wavelet plot. Similar results have been obtained when segmenting all human chromosome sequences, showing the existence of previously unknown huge compositional superstructures in the human genome. PMID:23645997
Lavergne, M Ruth; Law, Michael R; Peterson, Sandra; Garrison, Scott; Hurley, Jeremiah; Cheng, Lucy; McGrail, Kimberlyn
2018-02-01
We studied the effects of incentive payments to primary care physicians for the care of patients with diabetes, hypertension, and Chronic Obstructive Pulmonary Disease (COPD) in British Columbia, Canada. We used linked administrative health data to examine monthly primary care visits, continuity of care, laboratory testing, pharmaceutical dispensing, hospitalizations, and total health care spending. We examined periods two years before and two years after each incentive was introduced, and used segmented regression to assess whether there were changes in level or trend of outcome measures across all eligible patients following incentive introduction, relative to pre-intervention periods. We observed no increases in primary care visits or continuity of care after incentives were introduced. Rates of ACR testing and antihypertensive dispensing increased among patients with hypertension, but none of the other modest increases in laboratory testing or prescriptions dispensed reached statistical significance. Rates of hospitalizations for stroke and heart failure among patients with hypertension fell relative to pre-intervention patterns, while hospitalizations for COPD increased. Total hospitalizations and hospitalizations via the emergency department did not change. Health care spending increased for patients with hypertension. This large-scale incentive scheme for primary care physicians showed some positive effects for patients with hypertension, but we observe no similar changes in patient management, reductions in hospitalizations, or changes in spending for patients with diabetes and COPD. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
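Segmented regression of the kind used in this study (and in interrupted time-series analysis generally) can be sketched with ordinary least squares on level-change and trend-change terms. The following is a minimal, dependency-free illustration, not the authors' model: it has no seasonality or autocorrelation adjustment, and the function names are invented.

```python
def solve(a, b):
    """Gauss-Jordan elimination with partial pivoting for a small dense system."""
    n = len(a)
    m = [row[:] + [bv] for row, bv in zip(a, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(n):
            if r != col and m[r][col]:
                f = m[r][col] / m[col][col]
                m[r] = [v - f * w for v, w in zip(m[r], m[col])]
    return [m[i][n] / m[i][i] for i in range(n)]

def segmented_regression(y, t0):
    """OLS fit of y_t = b0 + b1*t + b2*D_t + b3*(t - t0)*D_t with D_t = 1{t >= t0}.

    b2 estimates the immediate level change at the intervention time t0,
    and b3 the change in trend relative to the pre-intervention slope b1."""
    xs = [[1.0, float(t), float(t >= t0), float((t - t0) * (t >= t0))]
          for t in range(len(y))]
    k = 4
    xtx = [[sum(r[i] * r[j] for r in xs) for j in range(k)] for i in range(k)]
    xty = [sum(r[i] * yv for r, yv in zip(xs, y)) for i in range(k)]
    return solve(xtx, xty)
```

On a noiseless series built from known coefficients, the fit recovers the pre-intervention level and slope plus the exact level and trend changes, which is what the study's "changes in level or trend" language refers to.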
van de Pol, Ineke; van Iterson, Mat; Maaskant, Jolanda
2017-01-01
Delirium in critically-ill patients is a common multifactorial disorder that is associated with various negative outcomes. It is assumed that sleep disturbances can result in an increased risk of delirium. This study hypothesized that implementing a protocol that reduces overall nocturnal sound
Dayer, Mark J; Jones, Simon; Prendergast, Bernard; Baddour, Larry M.; Lockhart, Peter B; Thornhill, Martin H
2017-01-01
Background Antibiotic prophylaxis (AP) administered prior to invasive procedures in patients at risk of developing infective endocarditis (IE) has historically been the focus of IE prevention. Recent changes in AP guidelines in the US and Europe have substantially reduced the numbers for whom AP is recommended. In the UK, the National Institute for Health and Care Excellence (NICE) guidelines recommended complete cessation of AP in March 2008. We report the impact of these guidelines on AP prescribing; in addition, IE incidence was examined following the introduction of the guidelines. Methods We analyzed English AP prescribing data from January 2004 to March 2013 and hospital discharge episode statistics for patients with a primary diagnosis of IE from January 2000 to March 2013. Findings AP prescribing rates fell dramatically after introduction of the NICE guidance (10,935 prescriptions/month vs. 2,236 prescriptions/month, p<0·0001). Commencing in March 2008, there was also a significant increase in the number of IE cases/month (0·11 cases/10million/month, CI 0·05–0·16, p<0·0001) above the projected historical trend. By March 2013, there were an additional 35 cases/month than would have been expected if the previous trend had continued. This increase in IE incidence was significant for both ‘high-risk’ and ‘lower-risk’ individuals. Interpretation Although our data do not establish a causal relationship, there has been a substantial reduction in AP prescribing and a significant increase in IE incidence in England since introduction of the NICE guidelines in 2008. Funding Different aspects of this study were supported by Heart Research UK and Simplyhealth [Grant Ref: RG2632/13/14] and NIDCR R03 grant [Ref: 1R03DE023092-01] from the National Institutes for Health. PMID:25467569
Current interruption by density depression
International Nuclear Information System (INIS)
Wagner, J.S.; Tajima, T.; Akasofu, S.I.
1985-04-01
Using a one-dimensional electrostatic particle code, we examine processes associated with current interruption in a collisionless plasma when a density depression is present along the current channel. Current interruption due to double layers was suggested by Alfven and Carlqvist (1967) as a cause of solar flares. At a local density depression, plasma instabilities caused by an electron current flow are accentuated, leading to current disruption. Our simulation study covers a wide range of parameters, such that under appropriate conditions both the Alfven and Carlqvist (1967) regime and the Smith and Priest (1972) regime take place. In the latter regime the density depression decays into a stationary structure (''ion-acoustic layer'') which spawns a series of ion-acoustic ''solitons'' and ion phase space holes travelling upstream. A large inductance of the current circuit tends to enhance the plasma instabilities.
Detecting macroeconomic phases in the Dow Jones Industrial Average time series
Wong, Jian Cheng; Lian, Heng; Cheong, Siew Ann
2009-11-01
In this paper, we perform statistical segmentation and clustering analysis of the Dow Jones Industrial Average (DJI) time series between January 1997 and August 2008. Modeling the index movements and log-index movements as stationary Gaussian processes, we find a total of 116 and 119 statistically stationary segments respectively. These can then be grouped into between five and seven clusters, each representing a different macroeconomic phase. The macroeconomic phases are distinguished primarily by their volatilities. We find that the US economy, as measured by the DJI, spends most of its time in a low-volatility phase and a high-volatility phase. The former can be roughly associated with economic expansion, while the latter contains the economic contraction phase in the standard economic cycle. Both phases are interrupted by a moderate-volatility market correction phase, but extremely-high-volatility market crashes are found mostly within the high-volatility phase. From the temporal distribution of various phases, we see a high-volatility phase from mid-1998 to mid-2003, and another starting mid-2007 (the current global financial crisis). Transitions from the low-volatility phase to the high-volatility phase are preceded by a series of precursor shocks, whereas the transition from the high-volatility phase to the low-volatility phase is preceded by a series of inverted shocks. The time scale for both types of transitions is about a year. We also identify the July 1997 Asian Financial Crisis to be the trigger for the mid-1998 transition, and an unnamed May 2006 market event related to corrections in the Chinese markets to be the trigger for the mid-2007 transition.
Ocean time-series near Bermuda: Hydrostation S and the US JGOFS Bermuda Atlantic time-series study
Michaels, Anthony F.; Knap, Anthony H.
1992-01-01
Bermuda is the site of two ocean time-series programs. At Hydrostation S, the ongoing biweekly profiles of temperature, salinity and oxygen now span 37 years. This is one of the longest open-ocean time-series data sets and provides a view of decadal scale variability in ocean processes. In 1988, the U.S. JGOFS Bermuda Atlantic Time-series Study began a wide range of measurements at a frequency of 14-18 cruises each year to understand temporal variability in ocean biogeochemistry. On each cruise, the data range from chemical analyses of discrete water samples to data from electronic packages of hydrographic and optics sensors. In addition, a range of biological and geochemical rate measurements are conducted that integrate over time-periods of minutes to days. This sampling strategy yields a reasonable resolution of the major seasonal patterns and of decadal scale variability. The Sargasso Sea also has a variety of episodic production events on scales of days to weeks and these are only poorly resolved. In addition, there is a substantial amount of mesoscale variability in this region and some of the perceived temporal patterns are caused by the intersection of the biweekly sampling with the natural spatial variability. In the Bermuda time-series programs, we have added a series of additional cruises to begin to assess these other sources of variation and their impacts on the interpretation of the main time-series record. However, the adequate resolution of higher frequency temporal patterns will probably require the introduction of new sampling strategies and some emerging technologies such as biogeochemical moorings and autonomous underwater vehicles.
Complexity analysis of the turbulent environmental fluid flow time series
Mihailović, D. T.; Nikolić-Đorić, E.; Drešković, N.; Mimić, G.
2014-02-01
We have used the Kolmogorov complexities, sample and permutation entropies to quantify the randomness degree in river flow time series of two mountain rivers in Bosnia and Herzegovina, representing the turbulent environmental fluid, for the period 1926-1990. In particular, we have examined the monthly river flow time series from two rivers (the Miljacka and the Bosnia) in the mountain part of their flow and then calculated the Kolmogorov complexity (KL) based on the Lempel-Ziv Algorithm (LZA) (lower-KLL and upper-KLU), sample entropy (SE) and permutation entropy (PE) values for each time series. The results indicate that the KLL, KLU, SE and PE values in two rivers are close to each other regardless of the amplitude differences in their monthly flow rates. We have illustrated the changes in mountain river flow complexity by experiments using (i) the data set for the Bosnia River and (ii) anticipated human activities and projected climate changes. We have explored the sensitivity of considered measures in dependence on the length of time series. In addition, we have divided the period 1926-1990 into three subintervals: (a) 1926-1945, (b) 1946-1965, (c) 1966-1990, and calculated the KLL, KLU, SE, PE values for the various time series in these subintervals. It is found that during the period 1946-1965, there is a decrease in their complexities, and corresponding changes in the SE and PE, in comparison to the period 1926-1990. This complexity loss may be primarily attributed to (i) human interventions, after the Second World War, on these two rivers because of their use for water consumption and (ii) climate change in recent times.
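A minimal sketch of the complexity-measure family used here: an LZ78-style phrase count, a common proxy for Lempel-Ziv-based Kolmogorov complexity estimation (the paper's lower/upper KL variants differ in algorithmic detail). The flow data below are synthetic:

```python
import random

def lz_phrase_count(symbols):
    """LZ78-style parse: count phrases not seen before. A simple proxy
    for Lempel-Ziv-based Kolmogorov complexity estimates (the paper's
    lower/upper KL variants differ in algorithmic detail)."""
    seen, phrase, count = set(), "", 0
    for ch in symbols:
        phrase += ch
        if phrase not in seen:
            seen.add(phrase)
            count += 1
            phrase = ""
    return count + (1 if phrase else 0)

def binarize(series):
    """Encode a series as 0/1 around its mean, the usual first step."""
    mean = sum(series) / len(series)
    return "".join("1" if v > mean else "0" for v in series)

random.seed(1)
regular = binarize([i % 12 for i in range(2000)])             # periodic "flow"
irregular = binarize([random.random() for _ in range(2000)])  # random "flow"
```

A periodic series parses into far fewer novel phrases than a random one of the same length, which is the sense in which regulated river flow shows "complexity loss".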
Outlier detection algorithms for least squares time series regression
DEFF Research Database (Denmark)
Johansen, Søren; Nielsen, Bent
We review recent asymptotic results on some robust methods for multiple regression. The regressors include stationary and non-stationary time series as well as polynomial terms. The methods include the Huber-skip M-estimator, 1-step Huber-skip M-estimators, in particular the Impulse Indicator Sat...
Tempered fractional time series model for turbulence in geophysical flows
Meerschaert, Mark M.; Sabzikar, Farzad; Phanikumar, Mantha S.; Zeleke, Aklilu
2014-09-01
We propose a new time series model for velocity data in turbulent flows. The new model employs tempered fractional calculus to extend the classical 5/3 spectral model of Kolmogorov. Application to wind speed and water velocity in a large lake are presented, to demonstrate the practical utility of the model.
Tempered fractional time series model for turbulence in geophysical flows
International Nuclear Information System (INIS)
Meerschaert, Mark M; Sabzikar, Farzad; Phanikumar, Mantha S; Zeleke, Aklilu
2014-01-01
We propose a new time series model for velocity data in turbulent flows. The new model employs tempered fractional calculus to extend the classical 5/3 spectral model of Kolmogorov. Application to wind speed and water velocity in a large lake are presented, to demonstrate the practical utility of the model. (paper)
Classical pooling of cross-section and time series data
International Nuclear Information System (INIS)
Nuamah, N.N.N.N.
2000-04-01
This paper discusses the classical pooling of cross-section and time series data. The re-expressions of the normal equations of this model are given to indicate the source of the paradox that arises in the estimation of the regression coefficient. (author)
Time series analysis in chaotic diode resonator circuit
Energy Technology Data Exchange (ETDEWEB)
Hanias, M.P.; Giannaris, G.; Spyridakis, A.; Rigas, A. (all: TEI of Chalkis, GR 34400, Evia, Chalkis, Greece; e-mail: mhanias@teihal.gr)
2006-01-01
A diode resonator chaotic circuit is presented. Multisim is used to simulate the circuit and show the presence of chaos. Time series analysis was performed by the method proposed by Grassberger and Procaccia. The correlation dimension ν and the minimum embedding dimension m_min were calculated, as was the corresponding Kolmogorov entropy.
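The Grassberger-Procaccia approach behind this entry can be sketched in a few lines of numpy: delay-embed the series, count pairs of embedded points closer than r, and read the correlation dimension off the log-log slope. This is a generic illustration on a logistic-map signal, not the authors' circuit data:

```python
import numpy as np

def correlation_sum(x, r, dim=3, delay=1):
    """Grassberger-Procaccia correlation sum C(r): fraction of pairs of
    delay-embedded points lying closer than r."""
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    off_diag = ~np.eye(n, dtype=bool)  # exclude self-pairs
    return (dists[off_diag] < r).mean()

# Logistic map in the chaotic regime as a stand-in test signal
x = np.empty(800)
x[0] = 0.4
for t in range(len(x) - 1):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

# Correlation dimension estimate: slope of log C(r) vs log r at small r
r1, r2 = 0.05, 0.2
slope = (np.log(correlation_sum(x, r2))
         - np.log(correlation_sum(x, r1))) / np.log(r2 / r1)
```

In practice the slope is estimated over a scaling region of r and repeated for increasing embedding dimensions until it saturates, which yields m_min.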
Time series analysis in chaotic diode resonator circuit
International Nuclear Information System (INIS)
Hanias, M.P.; Giannaris, G.; Spyridakis, A.; Rigas, A.
2006-01-01
A diode resonator chaotic circuit is presented. Multisim is used to simulate the circuit and show the presence of chaos. Time series analysis was performed by the method proposed by Grassberger and Procaccia. The correlation dimension ν and the minimum embedding dimension m_min were calculated, as was the corresponding Kolmogorov entropy.
Time Series Factor Analysis with an Application to Measuring Money
Gilbert, Paul D.; Meijer, Erik
2005-01-01
Time series factor analysis (TSFA) and its associated statistical theory is developed. Unlike dynamic factor analysis (DFA), TSFA obviates the need for explicitly modeling the process dynamics of the underlying phenomena. It also differs from standard factor analysis (FA) in important respects: the
Time series analysis of monthly pulpwood use in the Northeast
James T. Bones
1980-01-01
Time series analysis was used to develop a model that depicts pulpwood use in the Northeast. The model is useful in forecasting future pulpwood requirements (short term) or monitoring pulpwood-use activity in relation to past use patterns. The model predicted a downturn in use during 1980.
Time series prediction with simple recurrent neural networks ...
African Journals Online (AJOL)
A hybrid of the two called the Elman-Jordan (or multi-recurrent) neural network is also being used. In this study, we evaluated the performance of these neural networks on three established benchmark time series prediction problems. Results from the experiments showed that the Jordan neural network performed significantly ...
Dynamic Factor Analysis of Nonstationary Multivariate Time Series.
Molenaar, Peter C. M.; And Others
1992-01-01
The dynamic factor model proposed by P. C. Molenaar (1985) is exhibited, and a dynamic nonstationary factor model (DNFM) is constructed with latent factor series that have time-varying mean functions. The use of a DNFM is illustrated using data from a television viewing habits study. (SLD)
Single-Index Additive Vector Autoregressive Time Series Models
LI, YEHUA; GENTON, MARC G.
2009-01-01
We study a new class of nonlinear autoregressive models for vector time series, where the current vector depends on single-indexes defined on the past lags and the effects of different lags have an additive form. A sufficient condition is provided
Daily time series evapotranspiration maps for Oklahoma and Texas panhandle
Evapotranspiration (ET) is an important process in ecosystems’ water budget and closely linked to its productivity. Therefore, regional scale daily time series ET maps developed at high and medium resolutions have large utility in studying the carbon-energy-water nexus and managing water resources. ...
United States forest disturbance trends observed with landsat time series
Jeffrey G. Masek; Samuel N. Goward; Robert E. Kennedy; Warren B. Cohen; Gretchen G. Moisen; Karen Schleweiss; Chengquan. Huang
2013-01-01
Disturbance events strongly affect the composition, structure, and function of forest ecosystems; however, existing US land management inventories were not designed to monitor disturbance. To begin addressing this gap, the North American Forest Dynamics (NAFD) project has examined a geographic sample of 50 Landsat satellite image time series to assess trends in forest...
Koopman Operator Framework for Time Series Modeling and Analysis
Surana, Amit
2018-01-01
We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator, which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties without requiring explicit knowledge of the generative model. We also introduce different notions of distance on the space of such model forms, which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in a power grid application.
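One standard finite-dimensional route to the Koopman spectral properties mentioned here is dynamic mode decomposition (DMD): fit a linear map between successive snapshots and take its eigenvalues. A minimal sketch on exactly linear synthetic data (the paper's framework is considerably more general):

```python
import numpy as np

def dmd_eigenvalues(snapshots):
    """Fit the best linear map Y ~ A X between successive snapshot
    matrices by least squares and return eig(A): a finite-dimensional
    approximation of Koopman spectral properties for linear dynamics."""
    X, Y = snapshots[:, :-1], snapshots[:, 1:]
    A = Y @ np.linalg.pinv(X)
    return np.linalg.eigvals(A)

# Data from a known damped rotation; DMD should recover its spectrum
theta, decay = 0.3, 0.95
A_true = decay * np.array([[np.cos(theta), -np.sin(theta)],
                           [np.sin(theta),  np.cos(theta)]])
states = [np.array([1.0, 0.0])]
for _ in range(50):
    states.append(A_true @ states[-1])
eigs = dmd_eigenvalues(np.array(states).T)
```

The recovered eigenvalues have modulus 0.95 (decay rate) and phase ±0.3 (oscillation frequency); for genuinely nonlinear series one first lifts the data through observable functions, which is where the Koopman viewpoint earns its keep.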
Time series analysis in astronomy: Limits and potentialities
DEFF Research Database (Denmark)
Vio, R.; Kristensen, N.R.; Madsen, Henrik
2005-01-01
In this paper we consider the problem of the limits concerning the physical information that can be extracted from the analysis of one or more time series ( light curves) typical of astrophysical objects. On the basis of theoretical considerations and numerical simulations, we show that with no a...
Time Series Analysis of 3D Coordinates Using Nonstochastic Observations
Velsink, H.
2016-01-01
Adjustment and testing of a combination of stochastic and nonstochastic observations is applied to the deformation analysis of a time series of 3D coordinates. Nonstochastic observations are constant values that are treated as if they were observations. They are used to formulate constraints on
Time Series Analysis of 3D Coordinates Using Nonstochastic Observations
Hiddo Velsink
2016-01-01
Adjustment and testing of a combination of stochastic and nonstochastic observations is applied to the deformation analysis of a time series of 3D coordinates. Nonstochastic observations are constant values that are treated as if they were observations. They are used to
A Hybrid Joint Moment Ratio Test for Financial Time Series
P.A. Groenendijk (Patrick); A. Lucas (André); C.G. de Vries (Casper)
1998-01-01
We advocate the use of absolute moment ratio statistics in conjunction with standard variance ratio statistics in order to disentangle linear dependence, non-linear dependence, and leptokurtosis in financial time series. Both statistics are computed for multiple return horizons
Time Series, Stochastic Processes and Completeness of Quantum Theory
International Nuclear Information System (INIS)
Kupczynski, Marian
2011-01-01
Most physical experiments are described as repeated measurements of some random variables. Experimental data registered by on-line computers form time series of outcomes. The frequencies of different outcomes are compared with the probabilities provided by the algorithms of quantum theory (QT). In spite of the statistical character of QT's predictions, a claim was made that it provided the most complete description of the data and of the underlying physical phenomena. This claim could easily be rejected if some fine structures, averaged out in standard descriptive statistical analysis, were found in time series of experimental data. To search for these structures one has to use more subtle statistical tools which were developed to study time series produced by various stochastic processes. In this talk we review some of these tools. As an example we show how standard descriptive statistical analysis of the data is unable to reveal a fine structure in a simulated sample of an AR(2) stochastic process. We emphasize once again that the violation of Bell inequalities gives no information on the completeness or the nonlocality of QT. The appropriate way to test the completeness of quantum theory is to search for fine structures in time series of the experimental data by means of purity tests or by studying the autocorrelation and partial autocorrelation functions.
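The autocorrelation and partial autocorrelation diagnostics mentioned at the end can be computed directly; the sketch below simulates an AR(2) sample (echoing the talk's example) and shows the PACF cutting off after lag 2. Coefficients and sample size are illustrative:

```python
import numpy as np

def acf(x, max_lag):
    """Sample autocorrelation function for lags 1..max_lag."""
    x = np.asarray(x, float) - np.mean(x)
    var = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / var
                     for k in range(1, max_lag + 1)])

def pacf_at(x, k):
    """Partial autocorrelation at lag k: the last coefficient of an
    order-k autoregression fitted by least squares."""
    x = np.asarray(x, float)
    n = len(x)
    y = x[k:]
    X = np.column_stack([x[k - j:n - j] for j in range(1, k + 1)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef[-1]

# Simulated AR(2) sample: x_t = 0.5 x_(t-1) + 0.3 x_(t-2) + noise
rng = np.random.default_rng(2)
x = np.zeros(4000)
for t in range(2, 4000):
    x[t] = 0.5 * x[t - 1] + 0.3 * x[t - 2] + rng.normal()
```

A histogram of x looks featureless, while the PACF is clearly nonzero at lags 1-2 and negligible beyond, which is exactly the kind of fine structure descriptive statistics misses.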
factor high order fuzzy time series with applications to temperature
African Journals Online (AJOL)
In this paper, a novel two-factor high-order fuzzy time series forecasting method based on .... to balance between local and global exploitations of the swarms. While, .... Although there were a number of outliers, the spread at the spot in ...
RADON CONCENTRATION TIME SERIES MODELING AND APPLICATION DISCUSSION.
Stránský, V; Thinová, L
2017-11-01
In 2010, continual radon measurement was established at Mladeč Caves in the Czech Republic using a continuous radon monitor, RADIM3A. To model the radon time series over the years 2010-15, the Box-Jenkins methodology, often used in econometrics, was applied. Because of the behavior of radon concentrations (RCs), a seasonal autoregressive integrated moving average model with exogenous variables (SARIMAX) was chosen to model the measured time series. This model uses the time series' seasonality, previously acquired values and delayed atmospheric parameters to forecast RC. The developed model for the RC time series is called regARIMA(5,1,3). Model residuals could be retrospectively compared with seismic evidence of local or global earthquakes that occurred during the RC measurement. This technique enables us to assess whether continuously measured RC could serve as an earthquake precursor.
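The seasonal "I" step that a SARIMAX model builds on can be shown in isolation: differencing at the seasonal lag removes a stable yearly cycle before AR/MA terms are fitted. The radon-like data below are synthetic; the actual paper fits a regARIMA(5,1,3) with full Box-Jenkins machinery:

```python
import numpy as np

def seasonal_difference(x, period):
    """y_t = x_t - x_(t-period): the seasonal differencing step of a
    SARIMA-type model, which removes a stable seasonal cycle."""
    x = np.asarray(x, float)
    return x[period:] - x[:-period]

# Synthetic monthly radon-like series: yearly cycle + slow trend + noise
rng = np.random.default_rng(3)
months = np.arange(6 * 12)
radon = (100.0
         + 30.0 * np.sin(2 * np.pi * months / 12)  # yearly cycle
         + 0.5 * months                            # slow trend
         + rng.normal(0, 5, months.size))          # measurement noise
deseasoned = seasonal_difference(radon, period=12)
```

After seasonal differencing the dominant yearly swing is gone and the remaining variation (trend increment plus noise) is what the AR and MA terms, and any exogenous atmospheric regressors, are fitted to.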
Identification of human operator performance models utilizing time series analysis
Holden, F. M.; Shinners, S. M.
1973-01-01
The results of an effort performed by Sperry Systems Management Division for AMRL in applying time series analysis as a tool for modeling the human operator are presented. This technique is utilized for determining the variation of the human transfer function under various levels of stress. The human operator's model is determined based on actual input and output data from a tracking experiment.
Notes on economic time series analysis system theoretic perspectives
Aoki, Masanao
1983-01-01
In seminars and graduate level courses I have had several opportunities to discuss modeling and analysis of time series with economists and economic graduate students during the past several years. These experiences made me aware of a gap between what economic graduate students are taught about vector-valued time series and what is available in recent system literature. Wishing to fill or narrow the gap that I suspect is more widely spread than my personal experiences indicate, I have written these notes to augment and reorganize materials I have given in these courses and seminars. I have endeavored to present, in as much a self-contained way as practicable, a body of results and techniques in system theory that I judge to be relevant and useful to economists interested in using time series in their research. I have essentially acted as an intermediary and interpreter of system theoretic results and perspectives in time series by filtering out non-essential details, and presenting coherent accounts of wha...
Book Review: "Hidden Markov Models for Time Series: An ...
African Journals Online (AJOL)
Hidden Markov Models for Time Series: An Introduction Using R, by Walter Zucchini and Iain L. MacDonald. Chapman & Hall (CRC Press), 2009. http://dx.doi.org/10.4314/saaj.v10i1.61717
Long-memory time series theory and methods
Palma, Wilfredo
2007-01-01
Wilfredo Palma, PhD, is Chairman and Professor of Statistics in the Department of Statistics at Pontificia Universidad Católica de Chile. Dr. Palma has published several refereed articles and has received over a dozen academic honors and awards. His research interests include time series analysis, prediction theory, state space systems, linear models, and econometrics.
ISO 9000 Series Certification Over Time: what have we learnt?
A. van der Wiele (Ton); A.M. Brown (Alan)
2002-01-01
The ISO 9000 experiences of the same sample of organisations over a five-year time period are examined in this paper. The responses to a questionnaire sent out at the end of 1999 to companies with reasonably long-term experience of the ISO 9000 series quality system are analysed.
Detection of "noisy" chaos in a time series
DEFF Research Database (Denmark)
Chon, K H; Kanters, J K; Cohen, R J
1997-01-01
Time series from biological systems often display fluctuations in the measured variables. Much effort has been directed at determining whether this variability reflects deterministic chaos, or whether it is merely "noise". The output from most biological systems is probably the result of both...
Conditional mode regression: Application to functional time series prediction
Dabo-Niang, Sophie; Laksaci, Ali
2008-01-01
We consider $\alpha$-mixing observations and deal with the estimation of the conditional mode of a scalar response variable $Y$ given a random variable $X$ taking values in a semi-metric space. We provide a convergence rate in the $L^p$ norm for the estimator. A useful and typical application to functional time series prediction is given.
Tests for nonlinearity in short stationary time series
International Nuclear Information System (INIS)
Chang, T.; Sauer, T.; Schiff, S.J.
1995-01-01
To compare direct tests for detecting determinism in chaotic time series, data from Henon, Lorenz, and Mackey--Glass equations were contaminated with various levels of additive colored noise. These data were analyzed with a variety of recently developed tests for determinism, and the results compared
Seasonal time series forecasting: a comparative study of arima and ...
African Journals Online (AJOL)
This paper addresses the concerns of Faraway and Chatfield (1998) who questioned the forecasting ability of Artificial Neural Networks (ANN). In particular the paper compares the performance of Artificial Neural Networks (ANN) and ARIMA models in forecasting of seasonal (monthly) Time series. Using the Airline data ...
Multivariate time series modeling of selected childhood diseases in ...
African Journals Online (AJOL)
This paper is focused on modeling the five most prevalent childhood diseases in Akwa Ibom State using a multivariate approach to time series. An aggregate of 78,839 reported cases of malaria, upper respiratory tract infection (URTI), Pneumonia, anaemia and tetanus were extracted from five randomly selected hospitals in ...
multivariate time series modeling of selected childhood diseases
African Journals Online (AJOL)
2016-06-17
Jun 17, 2016 ... KEYWORDS: Multivariate Approach, Pre-whitening, Vector Time Series, .... Alternatively, the process may be written in mean-adjusted form as .... The AIC criterion asymptotically overestimates the order with positive probability, whereas the BIC and HQC criteria ... has the same asymptotic distribution as Q.
Static Checking of Interrupt-driven Software
DEFF Research Database (Denmark)
Brylow, Dennis; Damgaard, Niels; Palsberg, Jens
2001-01-01
at the assembly level. In this paper we present the design and implementation of a static checker for interrupt-driven Z86-based software with hard real-time requirements. For six commercial microcontrollers, our checker has produced upper bounds on interrupt latencies and stack sizes, as well as verified...
Classification of time series patterns from complex dynamic systems
Energy Technology Data Exchange (ETDEWEB)
Schryver, J.C.; Rao, N.
1998-07-01
An increasing availability of high-performance computing and data storage media at decreasing cost is making possible the proliferation of large-scale numerical databases and data warehouses. Numeric warehousing enterprises on the order of hundreds of gigabytes to terabytes are a reality in many fields such as finance, retail sales, process systems monitoring, biomedical monitoring, surveillance and transportation. Large-scale databases are becoming more accessible to larger user communities through the internet, web-based applications and database connectivity. Consequently, most researchers now have access to a variety of massive datasets. This trend will probably only continue to grow over the next several years. Unfortunately, the availability of integrated tools to explore, analyze and understand the data warehoused in these archives is lagging far behind the ability to gain access to the same data. In particular, locating and identifying patterns of interest in numerical time series data is an increasingly important problem for which there are few available techniques. Temporal pattern recognition poses many interesting problems in classification, segmentation, prediction, diagnosis and anomaly detection. This research focuses on the problem of classification or characterization of numerical time series data. Highway vehicles and their drivers are examples of complex dynamic systems (CDS) which are being used by transportation agencies for field testing to generate large-scale time series datasets. Tools for effective analysis of numerical time series in databases generated by highway vehicle systems are not yet available, or have not been adapted to the target problem domain. However, analysis tools from similar domains may be adapted to the problem of classification of numerical time series data.
Normalization methods in time series of platelet function assays
Van Poucke, Sven; Zhang, Zhongheng; Roest, Mark; Vukicevic, Milan; Beran, Maud; Lauwereins, Bart; Zheng, Ming-Hua; Henskens, Yvonne; Lancé, Marcus; Marcus, Abraham
2016-01-01
Platelet function can be quantitatively assessed by specific assays such as light-transmission aggregometry; multiple-electrode aggregometry measuring the response to adenosine diphosphate (ADP), arachidonic acid, collagen, and thrombin-receptor activating peptide; and viscoelastic tests such as rotational thromboelastometry (ROTEM). The task of extracting meaningful statistical and clinical information from high-dimensional temporal multivariate clinical data represented as multivariate time series is complex. Building insightful visualizations for multivariate time series demands adequate use of normalization techniques. In this article, various methods for data normalization (z-transformation, range transformation, proportion transformation, and interquartile range) are presented and visualized, discussing the most suitable approach for platelet function data series. Normalization was calculated per assay (test) for all time points and per time point for all tests. Interquartile range, range transformation, and z-transformation preserved the correlation, as calculated by the Spearman correlation test, when normalized per assay (test) for all time points. When normalizing per time point for all tests, no correlation could be abstracted from the charts, as was also the case when using all data as one dataset for normalization. PMID:27428217
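The normalization variants compared in this article are simple to state in numpy; the sketch below applies z-transformation per assay (rows) and per time point (columns), plus range and interquartile-range normalization. The assay values are made up for illustration:

```python
import numpy as np

def z_norm(a, axis):
    """z-transformation: zero mean, unit standard deviation."""
    return (a - a.mean(axis=axis, keepdims=True)) / a.std(axis=axis, keepdims=True)

def range_norm(a, axis):
    """Range transformation: rescale into [0, 1]."""
    lo = a.min(axis=axis, keepdims=True)
    hi = a.max(axis=axis, keepdims=True)
    return (a - lo) / (hi - lo)

def iqr_norm(a, axis):
    """Center on the median, scale by the interquartile range."""
    q1, q3 = np.percentile(a, [25, 75], axis=axis, keepdims=True)
    return (a - np.median(a, axis=axis, keepdims=True)) / (q3 - q1)

# Rows = assays (hypothetical ADP / collagen / TRAP tests), columns = time points
assays = np.array([[55.0, 60.0, 62.0, 70.0],
                   [ 8.0, 11.0, 10.0, 14.0],
                   [30.0, 28.0, 35.0, 40.0]])

per_assay = z_norm(assays, axis=1)      # normalize each assay across time points
per_timepoint = z_norm(assays, axis=0)  # normalize each time point across assays
```

Normalizing per assay makes tests with very different raw scales visually comparable over time, which is why that direction preserved the correlation structure in the article's comparison.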
Wavelet transform approach for fitting financial time series data
Ahmed, Amel Abdoullah; Ismail, Mohd Tahir
2015-10-01
This study investigates a newly developed technique, a combined wavelet filtering and VEC model, to study the dynamic relationships among financial time series. A wavelet filter has been used to remove noise from the daily data of the NASDAQ stock market of the US and three stock markets of the Middle East and North Africa (MENA) region, namely Egypt, Jordan, and Istanbul. The data cover 6/29/2001 to 5/5/2009. The returns of the series generated by the wavelet filter and the original series are then analyzed by a cointegration test and a VEC model. The results show that the cointegration test affirms the existence of cointegration between the studied series, and that there is a long-term relationship between the US stock market and the MENA stock markets. A comparison between the proposed model and the traditional model demonstrates that the proposed model (DWT with VEC model) outperforms the traditional model (VEC model) in fitting the financial stock market series and shows real information about the relationships among the stock markets.
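The wavelet filtering step can be illustrated with a single-level Haar transform and hard thresholding, a minimal stand-in for the DWT denoising applied before the VEC model is fitted (the signal and threshold here are synthetic choices, not the study's data):

```python
import numpy as np

def haar_denoise(x, threshold):
    """One-level Haar wavelet filter: transform, zero small detail
    coefficients (mostly noise), inverse transform. Assumes an
    even-length input."""
    x = np.asarray(x, float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # smooth coefficients
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    detail[np.abs(detail) < threshold] = 0.0    # hard threshold
    out = np.empty_like(x)
    out[0::2] = (approx + detail) / np.sqrt(2)  # inverse transform
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out

rng = np.random.default_rng(4)
t = np.linspace(0, 1, 256)
clean = np.sin(2 * np.pi * 3 * t)        # slow underlying movement
noisy = clean + rng.normal(0, 0.3, 256)  # observed "returns"
denoised = haar_denoise(noisy, threshold=0.5)
```

Real applications use deeper decompositions and data-driven thresholds, but the principle is the same: the cointegration test and VEC model are then run on the filtered series.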
Optimization of recurrent neural networks for time series modeling
DEFF Research Database (Denmark)
Pedersen, Morten With
1997-01-01
The present thesis is about optimization of recurrent neural networks applied to time series modeling. In particular, fully recurrent networks are considered, working from only a single external input, one layer of nonlinear hidden units and a linear output unit applied to prediction of discrete time...... series. The overall objectives are to improve training by application of second-order methods and to improve generalization ability by architecture optimization accomplished by pruning. The major topics covered in the thesis are: 1. The problem of training recurrent networks is analyzed from a numerical...... of solution obtained as well as computation time required. 3. A theoretical definition of the generalization error for recurrent networks is provided. This definition justifies a commonly adopted approach for estimating generalization ability. 4. The viability of pruning recurrent networks by the Optimal...
Recursive Bayesian recurrent neural networks for time-series modeling.
Mirikitani, Derrick T; Nikolaev, Nikolay
2010-02-01
This paper develops a probabilistic approach to recursive second-order training of recurrent neural networks (RNNs) for improved time-series modeling. A general recursive Bayesian Levenberg-Marquardt algorithm is derived to sequentially update the weights and the covariance (Hessian) matrix. The main strengths of the approach are a principled handling of the regularization hyperparameters that leads to better generalization, and stable numerical performance. The framework involves the adaptation of a noise hyperparameter and local weight prior hyperparameters, which represent the noise in the data and the uncertainties in the model parameters. Experimental investigations using artificial and real-world data sets show that RNNs equipped with the proposed approach outperform standard real-time recurrent learning and extended Kalman training algorithms for recurrent networks, as well as other contemporary nonlinear neural models, on time-series modeling.
On the plurality of times: disunified time and the A-series | Nefdt ...
African Journals Online (AJOL)
Then, I attempt to show that disunified time is a problem for a semantics based on the A-series since A-truthmakers are hard to come by in a universe of temporally disconnected time-series. Finally, I provide a novel argument showing that presentists should be particularly fearful of such a universe. South African Journal of ...
Recurrence and symmetry of time series: Application to transition detection
International Nuclear Information System (INIS)
Girault, Jean-Marc
2015-01-01
Highlights: • A new theoretical framework based on the symmetry concept is proposed. • Four types of symmetry present in any time series were analyzed. • New descriptors make possible the analysis of regime changes in logistic systems. • Chaos–chaos, chaos–periodic, symmetry-breaking and symmetry-increasing bifurcations can be detected. Abstract: The study of transitions in low-dimensional, nonlinear dynamical systems is a complex problem for which there is not yet a simple, global numerical method able to detect chaos–chaos and chaos–periodic bifurcations or symmetry-breaking and symmetry-increasing bifurcations. We present here for the first time a general framework focusing on the symmetry concept of time series that at the same time reveals new kinds of recurrence. We propose several numerical tools based on the symmetry concept allowing both the qualification and quantification of different kinds of possible symmetry. By using several examples based on periodic symmetrical time series and on logistic and cubic maps, we show that it is possible with simple numerical tools to detect a large number of bifurcations of chaos–chaos, chaos–periodic, broken-symmetry and increased-symmetry types.
Reconstruction of ensembles of coupled time-delay systems from time series.
Sysoev, I V; Prokhorov, M D; Ponomarenko, V I; Bezruchko, B P
2014-06-01
We propose a method to recover from time series the parameters of coupled time-delay systems and the architecture of couplings between them. The method is based on a reconstruction of model delay-differential equations and estimation of statistical significance of couplings. It can be applied to networks composed of nonidentical nodes with an arbitrary number of unidirectional and bidirectional couplings. We test our method on chaotic and periodic time series produced by model equations of ensembles of diffusively coupled time-delay systems in the presence of noise, and apply it to experimental time series obtained from electronic oscillators with delayed feedback coupled by resistors.
Basic interrupt and command structures and applications
International Nuclear Information System (INIS)
Davies, R.C.
1974-01-01
Interrupt and command structures of a real-time system are described through specific examples. References to applications of a real-time system and programming development references are supplied. (auth)
Topological data analysis of financial time series: Landscapes of crashes
Gidea, Marian; Katz, Yuri
2018-02-01
We explore the evolution of daily returns of four major US stock market indices during the technology crash of 2000 and the financial crisis of 2007-2009. Our methodology is based on topological data analysis (TDA). We use persistent homology to detect and quantify topological patterns that appear in multidimensional time series. Using a sliding window, we extract time-dependent point cloud data sets, to which we associate a topological space. We detect transient loops that appear in this space, and we measure their persistence. This is encoded in real-valued functions referred to as 'persistence landscapes'. We quantify the temporal changes in persistence landscapes via their Lp-norms. We test this procedure on multidimensional time series generated by various non-linear and non-equilibrium models. We find that, in the vicinity of financial meltdowns, the Lp-norms exhibit strong growth prior to the primary peak, which ascends during a crash. Remarkably, the average spectral density at low frequencies of the time series of Lp-norms of the persistence landscapes demonstrates a strong rising trend for 250 trading days prior to either the dotcom crash on 03/10/2000 or the Lehman bankruptcy on 09/15/2008. Our study suggests that TDA provides a new type of econometric analysis, which complements the standard statistical measures. The method can be used to detect early warning signals of imminent market crashes. We believe that this approach can be used beyond the analysis of financial time series presented here.
FTSPlot: fast time series visualization for large datasets.
Directory of Open Access Journals (Sweden)
Michael Riss
Full Text Available The analysis of electrophysiological recordings often involves visual inspection of time series data to locate specific experiment epochs, mask artifacts, and verify the results of signal processing steps, such as filtering or spike detection. Long-term experiments with continuous data acquisition generate large amounts of data. Rapid browsing through these massive datasets poses a challenge to conventional data plotting software because the plotting time increases proportionately to the increase in the volume of data. This paper presents FTSPlot, which is a visualization concept for large-scale time series datasets using techniques from the field of high performance computer graphics, such as hierarchic level of detail and out-of-core data handling. In a preprocessing step, time series data, event, and interval annotations are converted into an optimized data format, which then permits fast, interactive visualization. The preprocessing step has a computational complexity of O(n × log(N)); the visualization itself can be done with a complexity of O(1) and is therefore independent of the amount of data. A demonstration prototype has been implemented and benchmarks show that the technology is capable of displaying large amounts of time series data, event, and interval annotations lag-free with < 20 ms delay. The current 64-bit implementation theoretically supports datasets with up to 2^64 bytes; on the x86_64 architecture currently up to 2^48 bytes are supported, and benchmarks have been conducted with 2^40 bytes (1 TiB), or 1.3 × 10^11 double precision samples. The presented software is freely available and can be included as a Qt GUI component in future software projects, providing a standard visualization method for long-term electrophysiological experiments.
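The hierarchic level-of-detail idea behind such viewers can be sketched as a min/max pyramid: each level halves the number of bins while preserving the envelope of the signal, so any zoom level can be drawn from a precomputed level with a bounded number of bins per screen. This is a generic sketch, not FTSPlot's actual data format:

```python
def build_minmax_pyramid(samples, base_bin=2):
    """Return a list of levels; each level is a list of (min, max) bins."""
    level = [(min(samples[i:i + base_bin]), max(samples[i:i + base_bin]))
             for i in range(0, len(samples) - base_bin + 1, base_bin)]
    pyramid = [level]
    while len(level) > 1:
        # merge adjacent bins: the envelope of two bins is (min of mins, max of maxes)
        level = [(min(level[i][0], level[i + 1][0]),
                  max(level[i][1], level[i + 1][1]))
                 for i in range(0, len(level) - 1, 2)]
        pyramid.append(level)
    return pyramid

# 16 samples -> levels with 8, 4, 2, 1 bins
pyramid = build_minmax_pyramid(list(range(16)))
```

A renderer would pick the coarsest level with at least one bin per screen pixel, which is what makes the drawing cost independent of the total data size.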
Dynamical analysis and visualization of tornadoes time series.
Directory of Open Access Journals (Sweden)
António M Lopes
Full Text Available In this paper we analyze the behavior of tornado time-series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series spanning 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.
Dynamical analysis and visualization of tornadoes time series.
Lopes, António M; Tenreiro Machado, J A
2015-01-01
In this paper we analyze the behavior of tornado time-series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series spanning 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.
Financial time series analysis based on information categorization method
Tian, Qiang; Shang, Pengjian; Feng, Guochen
2014-12-01
The paper mainly applies the information categorization method to analyze financial time series. The method examines the similarity of different sequences by calculating the distances between them. We apply this method to quantify the similarity of different stock markets, and report the results of similarity in US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show the differences in similarity between different stock markets in different time periods, and that the similarity of the two stock markets becomes larger after these two crises. We also obtain the results of similarity of 10 stock indices in three areas, meaning the method can distinguish different areas' markets from the phylogenetic trees. The results show that we can get satisfactory information from financial markets by this method. The information categorization method can be used not only in physiologic time series, but also in financial time series.
"Observation Obscurer" - Time Series Viewer, Editor and Processor
Andronov, I. L.
The program is described, which contains a set of subroutines suitable for fast viewing and interactive filtering and processing of regularly and irregularly spaced time series. Being a 32-bit DOS application, it may be used as a default fast viewer/editor of time series in any computer shell ("commander") or in Windows. It allows one to view the data in the "time" or "phase" mode, to remove ("obscure") or filter outstanding bad points, and to make scale transformations and smoothing using a few methods (e.g. mean with phase binning, determination of the statistically optimal number of phase bins, and a "running parabola" (Andronov, 1997, As. Ap. Suppl., 125, 207) fit), and to perform time series analysis using some methods, e.g. correlation, autocorrelation and histogram analysis, determination of extrema, etc. Some features have been developed specially for variable star observers, e.g. the barycentric correction and the creation and fast analysis of "O-C" diagrams. The manual for "hot keys" is presented. The computer code was compiled with 32-bit Free Pascal (www.freepascal.org).
Cluster analysis of activity-time series in motor learning
DEFF Research Database (Denmark)
Balslev, Daniela; Nielsen, Finn Å; Futiger, Sally A
2002-01-01
Neuroimaging studies of learning focus on brain areas where the activity changes as a function of time. To circumvent the difficult problem of model selection, we used a data-driven analytic tool, cluster analysis, which extracts representative temporal and spatial patterns from the voxel-time series. The optimal number of clusters was chosen using a cross-validated likelihood method, which highlights the clustering pattern that generalizes best over the subjects. Data were acquired with PET at different time points during practice of a visuomotor task. The results from cluster analysis show...
Incremental fuzzy C medoids clustering of time series data using dynamic time warping distance
Chen, Jingli; Wu, Shuai; Liu, Zhizhong; Chao, Hao
2018-01-01
Clustering time series data is of great significance since it can extract meaningful statistics and other characteristics. Especially in biomedical engineering, good clustering algorithms for time series may help improve people's health. Considering data scale and the time shifts of time series, in this paper we introduce two incremental fuzzy clustering algorithms based on the Dynamic Time Warping (DTW) distance. By adopting the Single-Pass and Online patterns, our algorithms can handle large-scale time series data by splitting the data into a set of chunks that are processed sequentially. Besides, our algorithms use DTW to measure the distance between pairs of time series, which encourages higher clustering accuracy because DTW can determine an optimal match between any two time series by stretching or compressing segments of temporal data. Our new algorithms are compared to some existing prominent incremental fuzzy clustering algorithms on 12 benchmark time series datasets. The experimental results show that the proposed approaches yield high-quality clusters and outperform all the competitors in terms of clustering accuracy. PMID:29795600
Incremental fuzzy C medoids clustering of time series data using dynamic time warping distance.
Liu, Yongli; Chen, Jingli; Wu, Shuai; Liu, Zhizhong; Chao, Hao
2018-01-01
Clustering time series data is of great significance since it can extract meaningful statistics and other characteristics. Especially in biomedical engineering, good clustering algorithms for time series may help improve people's health. Considering data scale and the time shifts of time series, in this paper we introduce two incremental fuzzy clustering algorithms based on the Dynamic Time Warping (DTW) distance. By adopting the Single-Pass and Online patterns, our algorithms can handle large-scale time series data by splitting the data into a set of chunks that are processed sequentially. Besides, our algorithms use DTW to measure the distance between pairs of time series, which encourages higher clustering accuracy because DTW can determine an optimal match between any two time series by stretching or compressing segments of temporal data. Our new algorithms are compared to some existing prominent incremental fuzzy clustering algorithms on 12 benchmark time series datasets. The experimental results show that the proposed approaches yield high-quality clusters and outperform all the competitors in terms of clustering accuracy.
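The DTW distance these algorithms build on can be sketched with the classic dynamic-programming recurrence (variable names are illustrative, not taken from the paper):

```python
def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic time warping with absolute-difference cost."""
    inf = float("inf")
    n, m = len(a), len(b)
    # dp[i][j] = cost of the best warping path aligning a[:i] with b[:j]
    dp = [[inf] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # a step may advance either series or both (stretch/compress)
            dp[i][j] = cost + min(dp[i - 1][j], dp[i][j - 1], dp[i - 1][j - 1])
    return dp[n][m]
```

Note how `dtw_distance([1, 2, 3], [1, 2, 2, 3])` is 0: the repeated `2` is absorbed by stretching, which is exactly the time-shift robustness the abstract credits DTW with.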
A Non-standard Empirical Likelihood for Time Series
DEFF Research Database (Denmark)
Nordman, Daniel J.; Bunzel, Helle; Lahiri, Soumendra N.
Standard blockwise empirical likelihood (BEL) for stationary, weakly dependent time series requires specifying a fixed block length as a tuning parameter for setting confidence regions. This aspect can be difficult and impacts coverage accuracy. As an alternative, this paper proposes a new version of BEL based on a simple, though non-standard, data-blocking rule which uses a data block of every possible length. Consequently, the method involves no block selection and is also anticipated to exhibit better coverage performance. Its non-standard blocking scheme, however, induces non-standard asymptotics and requires a significantly different development compared to standard BEL. We establish the large-sample distribution of log-ratio statistics from the new BEL method for calibrating confidence regions for mean or smooth function parameters of time series. This limit law is not the usual chi...
Models for Pooled Time-Series Cross-Section Data
Directory of Open Access Journals (Sweden)
Lawrence E Raffalovich
2015-07-01
Full Text Available Several models are available for the analysis of pooled time-series cross-section (TSCS) data, defined as “repeated observations on fixed units” (Beck and Katz 1995). In this paper, we run the following models: (1) a completely pooled model, (2) fixed effects models, and (3) multi-level/hierarchical linear models. To illustrate these models, we use a Generalized Least Squares (GLS) estimator with cross-section weights and panel-corrected standard errors (with EViews 8) on the cross-national homicide trends data of forty countries from 1950 to 2005, which we source from published research (Messner et al. 2011). We describe and discuss the similarities and differences between the models, and what information each can contribute to help answer substantive research questions. We conclude with a discussion of how the models we present may help to mitigate validity threats inherent in pooled time-series cross-section data analysis.
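The fixed effects model in (2) can be sketched, for a single regressor, with the "within" transformation: demean y and x inside each cross-sectional unit, then run OLS on the demeaned data. This is a minimal stand-in for the paper's GLS estimation with panel-corrected standard errors, using synthetic data:

```python
def within_estimator(panel):
    """panel: dict mapping unit -> list of (x, y) observations. Returns the slope."""
    sxx = sxy = 0.0
    for obs in panel.values():
        xbar = sum(x for x, _ in obs) / len(obs)
        ybar = sum(y for _, y in obs) / len(obs)
        for x, y in obs:
            # demeaning removes each unit's fixed effect (its intercept)
            sxx += (x - xbar) ** 2
            sxy += (x - xbar) * (y - ybar)
    return sxy / sxx

# Two units with very different intercepts but a common slope of 2:
panel = {
    "A": [(1, 2 + 2 * 1), (2, 2 + 2 * 2), (3, 2 + 2 * 3)],
    "B": [(1, 50 + 2 * 1), (2, 50 + 2 * 2), (3, 50 + 2 * 3)],
}
slope = within_estimator(panel)
```

A completely pooled OLS on these data would be pulled off the true slope by the between-unit intercept gap; the within estimator recovers 2.0 regardless.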
Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective
Chen, Shyi-Ming
2013-01-01
Temporal and spatiotemporal data form an inherent fabric of the society as we are faced with streams of data coming from numerous sensors, data feeds, recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume is aimed to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...
Recurrence Density Enhanced Complex Networks for Nonlinear Time Series Analysis
Costa, Diego G. De B.; Reis, Barbara M. Da F.; Zou, Yong; Quiles, Marcos G.; Macau, Elbert E. N.
We introduce a new method, which is entitled Recurrence Density Enhanced Complex Network (RDE-CN), to properly analyze nonlinear time series. Our method first transforms a recurrence plot into a figure of a reduced number of points, yet preserving the main and fundamental recurrence properties of the original plot. This resulting figure is then reinterpreted as a complex network, which is further characterized by network statistical measures. We illustrate the computational power of the RDE-CN approach with time series from both the logistic map and experimental fluid flows, which show that our method distinguishes different dynamics as well as traditional recurrence analysis. Therefore, the proposed methodology characterizes the recurrence matrix adequately, while using a reduced set of points from the original recurrence plots.
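The recurrence plot that RDE-CN starts from is just a thresholded distance matrix; the density-reduction and network-construction steps of the paper are not reproduced in this sketch:

```python
def recurrence_matrix(series, eps):
    """R[i][j] = 1 when |x_i - x_j| < eps, else 0 (scalar series, Euclidean norm)."""
    n = len(series)
    return [[1 if abs(series[i] - series[j]) < eps else 0 for j in range(n)]
            for i in range(n)]

# A logistic-map trajectory as the test signal, as in the paper:
x, traj = 0.4, []
for _ in range(50):
    traj.append(x)
    x = 4.0 * x * (1.0 - x)
R = recurrence_matrix(traj, eps=0.1)
```

By construction the matrix is symmetric with ones on the diagonal; RDE-CN would then thin this matrix to a reduced point set before reading it as a network adjacency structure.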
Time series prediction by feedforward neural networks - is it difficult?
International Nuclear Information System (INIS)
Rosen-Zvi, Michal; Kanter, Ido; Kinzel, Wolfgang
2003-01-01
The difficulties that a neural network faces when trying to learn from a quasi-periodic time series are studied analytically using a teacher-student scenario where the random input is divided into two macroscopic regions with different variances, 1 and 1/γ² (γ ≫ 1). The generalization error is found to decrease as ε_g ∝ exp(-α/γ²), where α is the number of examples per input dimension. In contradiction to this very slowly vanishing generalization error, the next output prediction is found to be almost free of mistakes. This picture is consistent with learning quasi-periodic time series produced by feedforward neural networks, which is dominated by enhanced components of the Fourier spectrum of the input. Simulation results are in good agreement with the analytical results.
Time series analysis methods and applications for flight data
Zhang, Jianye
2017-01-01
This book focuses on different facets of flight data analysis, including the basic goals, methods, and implementation techniques. As mass flight data possesses the typical characteristics of time series, the time series analysis methods and their application for flight data have been illustrated from several aspects, such as data filtering, data extension, feature optimization, similarity search, trend monitoring, fault diagnosis, and parameter prediction, etc. An intelligent information-processing platform for flight data has been established to assist in aircraft condition monitoring, training evaluation and scientific maintenance. The book will serve as a reference resource for people working in aviation management and maintenance, as well as researchers and engineers in the fields of data analysis and data mining.
Nonparametric autocovariance estimation from censored time series by Gaussian imputation.
Park, Jung Wook; Genton, Marc G; Ghosh, Sujit K
2009-02-01
One of the most frequently used methods to model the autocovariance function of a second-order stationary time series is to use the parametric framework of autoregressive and moving average models developed by Box and Jenkins. However, such parametric models, though very flexible, may not always be adequate to model autocovariance functions with sharp changes. Furthermore, if the data do not follow the parametric model and are censored at a certain value, the estimation results may not be reliable. We develop a Gaussian imputation method to estimate an autocovariance structure via nonparametric estimation of the autocovariance function in order to address both censoring and incorrect model specification. We demonstrate the effectiveness of the technique in terms of bias and efficiency with simulations under various rates of censoring and underlying models. We describe its application to a time series of silicon concentrations in the Arctic.
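The quantity being estimated above, the sample autocovariance, can be written down directly. The sketch below uses the biased (1/n) normalization, which is standard because it keeps the implied autocovariance matrix positive semi-definite; the paper's censoring-aware imputation is not reproduced:

```python
def sample_autocovariance(x, lag):
    """Biased sample autocovariance of a scalar series at the given lag."""
    n = len(x)
    mean = sum(x) / n
    return sum((x[t] - mean) * (x[t + lag] - mean) for t in range(n - lag)) / n

x = [1.0, 2.0, 3.0, 4.0]
gamma0 = sample_autocovariance(x, 0)   # lag 0 is the sample variance
gamma1 = sample_autocovariance(x, 1)
```

Under censoring, plugging censored values straight into this formula is exactly what biases the estimate; the Gaussian imputation approach replaces them with conditional draws before computing these sums.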
Deviations from uniform power law scaling in nonstationary time series
Viswanathan, G. M.; Peng, C. K.; Stanley, H. E.; Goldberger, A. L.
1997-01-01
A classic problem in physics is the analysis of highly nonstationary time series that typically exhibit long-range correlations. Here we test the hypothesis that the scaling properties of the dynamics of healthy physiological systems are more stable than those of pathological systems by studying beat-to-beat fluctuations in the human heart rate. We develop techniques based on the Fano factor and Allan factor functions, as well as on detrended fluctuation analysis, for quantifying deviations from uniform power-law scaling in nonstationary time series. By analyzing extremely long data sets of up to N = 10^5 beats for 11 healthy subjects, we find that the fluctuations in the heart rate scale approximately uniformly over several temporal orders of magnitude. By contrast, we find that in data sets of comparable length for 14 subjects with heart disease, the fluctuations grow erratically, indicating a loss of scaling stability.
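Detrended fluctuation analysis, one of the techniques named above, can be sketched compactly: integrate the mean-centered series, detrend it linearly inside non-overlapping boxes, and measure the RMS residual as a function of box size. The scaling exponent is then read off from log F(n) versus log n over several box sizes:

```python
def dfa_fluctuation(x, box):
    """RMS fluctuation F(box) of the integrated, per-box linearly detrended profile."""
    mean = sum(x) / len(x)
    # integrated profile (cumulative sum of deviations from the mean)
    y, s = [], 0.0
    for v in x:
        s += v - mean
        y.append(s)
    n_boxes = len(y) // box
    total, count = 0.0, 0
    for b in range(n_boxes):
        seg = y[b * box:(b + 1) * box]
        # least-squares linear trend inside the box
        t = list(range(box))
        tbar, sbar = sum(t) / box, sum(seg) / box
        num = sum((ti - tbar) * (si - sbar) for ti, si in zip(t, seg))
        den = sum((ti - tbar) ** 2 for ti in t)
        slope = num / den
        for ti, si in zip(t, seg):
            resid = si - (sbar + slope * (ti - tbar))
            total += resid ** 2
            count += 1
    return (total / count) ** 0.5

import random
random.seed(0)
noise = [random.gauss(0.0, 1.0) for _ in range(1024)]
f4 = dfa_fluctuation(noise, 4)
f64 = dfa_fluctuation(noise, 64)   # for white noise, F(n) grows roughly as n^0.5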
An integral time series on simulated labeling using fractal structure
International Nuclear Information System (INIS)
Djainal, D.D.
1997-01-01
This research deals with the detection of time series of vertical two-phase flow, in an attempt to develop an objective indicator of time-series flow patterns. One newer method is fractal analysis, which can complement conventional methods in the description of highly irregular fluctuations. In the present work, fractal analysis is applied to analyze a simulated boiling coolant signal. This simulated signal is built by summing random elements in small subchannels of the coolant channel. Two modes are defined and both are characterized by their void fractions. In the case of unimodal-PDF signals, the difference between these modes is relatively small; bimodal-PDF signals, on the other hand, have a relatively large range. In this research, the fractal dimension can indicate the characteristics of the simulated signals.
Chaotic time series. Part II. System Identification and Prediction
Directory of Open Access Journals (Sweden)
Bjørn Lillekjendlie
1994-10-01
Full Text Available This paper is the second in a series of two, and describes the current state of the art in modeling and prediction of chaotic time series. Sample data from deterministic non-linear systems may look stochastic when analysed with linear methods. However, the deterministic structure may be uncovered and non-linear models constructed that allow improved prediction. We give the background for such methods from a geometrical point of view, and briefly describe the following types of methods: global polynomials, local polynomials, multilayer perceptrons and semi-local methods including radial basis functions. Some illustrative examples from known chaotic systems are presented, emphasising the increase in prediction error with time. We compare some of the algorithms with respect to prediction accuracy and storage requirements, and list applications of these methods to real data from widely different areas.
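Among the local methods surveyed above, the simplest is a zeroth-order (nearest-neighbour) predictor: embed the series, find the past state closest to the current one, and predict that state's successor. The sketch below is a generic illustration on the logistic map, not any specific algorithm from the paper:

```python
def nn_predict(history, query):
    """Predict the value following `query` from a 1-d embedded history."""
    best_i, best_d = None, float("inf")
    for i in range(len(history) - 1):   # leave room for a successor
        d = abs(history[i] - query)
        if d < best_d:
            best_i, best_d = i, d
    return history[best_i + 1]

# Deterministic but chaotic test signal: the logistic map x -> 4x(1-x)
x, traj = 0.123, []
for _ in range(2000):
    traj.append(x)
    x = 4.0 * x * (1.0 - x)
true_next = 4.0 * traj[-1] * (1.0 - traj[-1])
pred = nn_predict(traj[:-1], traj[-1])
```

As the paper emphasises, this works well for one-step prediction because the series "looks stochastic" only to linear methods; iterating the predictor many steps ahead would see the error grow due to sensitive dependence on initial conditions.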
Time series analysis of ozone data in Isfahan
Omidvari, M.; Hassanzadeh, S.; Hosseinibalam, F.
2008-07-01
Time series analysis was used to investigate the stratospheric ozone formation and decomposition processes. Different time series methods were applied to detect the reason for extremely high ozone concentrations in each season. Data were converted into a seasonal component and the frequency domain; the latter was evaluated using Fast Fourier Transform (FFT) spectral analysis. The power density spectrum estimated from the ozone data showed peaks at cycle durations of 22, 20, 36, 186, 365 and 40 days. According to the seasonal component analysis, the most fluctuation occurred in 1999 and 2000, and the least fluctuation in 2003. The best correlation between ozone and sun radiation was found in 2000. Other variables, which are not available, caused this fluctuation in 1999 and 2001. The trend of ozone is increasing in 1999 and decreasing in the other years.
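The spectral step described above, finding the cycle duration at which the power density peaks, can be sketched with a naive discrete Fourier transform (a real analysis would use an FFT library; the O(n²) version below is enough to show the idea):

```python
import cmath
import math

def power_spectrum(x):
    """Squared DFT magnitudes for the non-negative frequencies below Nyquist."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))) ** 2
            for k in range(n // 2)]

# 120 "daily" samples with a 12-day seasonal cycle:
series = [math.sin(2 * math.pi * t / 12) for t in range(120)]
spec = power_spectrum(series)
peak = max(range(1, len(spec)), key=spec.__getitem__)   # skip the k=0 mean term
```

Here the peak lands at frequency index k = 10, i.e. 120/10 = 12 samples per cycle; applied to daily ozone data, the same readout yields the 22-, 36-, 365-day peaks the abstract reports.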
Detecting structural breaks in time series via genetic algorithms
DEFF Research Database (Denmark)
Doerr, Benjamin; Fischer, Paul; Hilbert, Astrid
2016-01-01
of the time series under consideration is available. Therefore, a black-box optimization approach is our method of choice for detecting structural breaks. We describe a genetic algorithm framework which easily adapts to a large number of statistical settings. To evaluate the usefulness of different crossover and mutation operations for this problem, we conduct extensive experiments to determine good choices for the parameters and operators of the genetic algorithm. One surprising observation is that use of uniform and one-point crossover together gave significantly better results than using either crossover operator alone. Moreover, we present a specific fitness function which exploits the sparse structure of the break points and which can be evaluated particularly efficiently. The experiments on artificial and real-world time series show that the resulting algorithm detects break points with high precision...
Time series analysis of nuclear instrumentation in EBR-II
International Nuclear Information System (INIS)
Imel, G.R.
1996-01-01
Results of a time series analysis of the scaler count data from the 3 wide range nuclear detectors in the Experimental Breeder Reactor-II are presented. One of the channels was replaced, and it was desired to determine if there was any statistically significant change (i.e., improvement) in the channel's response after the replacement. Data were collected from all 3 channels for 16-day periods before and after detector replacement. Time series analysis and statistical tests showed that there was no significant change after the detector replacement. Also, there were no statistically significant differences among the 3 channels, either before or after the replacement. Finally, it was determined that errors in the reactivity change inferred from subcritical count monitoring during fuel handling would be on the order of 20-30 cents for single count intervals.
Mathematical methods in time series analysis and digital image processing
Kurths, J; Maass, P; Timmer, J
2008-01-01
The aim of this volume is to bring together research directions in theoretical signal and imaging processing developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms and methods, as well as the investigation of connections between methods from time series analysis and image processing are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and geophysical/environmental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.
Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis
Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.
2015-06-01
This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index (PSEi) and its volatility using the finite mixture of ARIMA models with conditional variance equations such as the ARCH, GARCH, EGARCH, TARCH and PARCH models. Also, the study aimed to find out the reason behind the behavior of PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers' Price Index and terms of trade - can be used in projecting future values of PSEi; this was examined using the Granger Causality Test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5)-ARCH(1). Also, Consumer Price Index, crude oil price and foreign exchange rate are factors concluded to Granger-cause the Philippine Stock Exchange Composite Index.
Forecasting the Reference Evapotranspiration Using Time Series Model
Directory of Open Access Journals (Sweden)
H. Zare Abyaneh
2016-10-01
Full Text Available Introduction: Reference evapotranspiration is one of the most important factors in irrigation timing and field management. Moreover, reference evapotranspiration forecasting can play a vital role in future developments. Therefore, in this study, the seasonal autoregressive integrated moving average (ARIMA) model was used to forecast the reference evapotranspiration time series at the Esfahan, Semnan, Shiraz, Kerman, and Yazd synoptic stations. Materials and Methods: In all stations (characteristics of the synoptic stations are given in Table 1), the meteorological data, including mean, maximum and minimum air temperature, relative humidity, dry- and wet-bulb temperature, dew-point temperature, wind speed, precipitation, air vapor pressure and sunshine hours, were collected from the Islamic Republic of Iran Meteorological Organization (IRIMO) for the 41 years from 1965 to 2005. The FAO Penman-Monteith equation was used to calculate the monthly reference evapotranspiration at the five synoptic stations, and the evapotranspiration time series were formed. The unit root test was used to identify whether the time series was stationary; then, using the Box-Jenkins method, seasonal ARIMA models were applied to the sample data.

Table 1. The geographical location and climate conditions of the synoptic stations

Station   Longitude (E)  Latitude (N)  Altitude (m)  Mean annual air temp. (°C)  Min.-Max. temp. (°C)  Mean precipitation (mm)  Climate (De Martonne index)
Esfahan   51° 40'        32° 37'       1550.4        16.36                       9.4-23.3              122                      Arid
Semnan    53° 33'        35° 35'       1130.8        18.0                        12.4-23.8             140                      Arid
Shiraz    52° 36'        29° 32'       1484          18.0                        10.2-25.9             324                      Semi-arid
Kerman    56° 58'        30° 15'       1753.8        15.6                        6.7-24.6              142                      Arid
Yazd      54° 17'        31° 54'       1237.2        19.2                        11.8-26.0             61                       Arid

Results and Discussion: The monthly meteorological data were used as input for the Ref-ET software and monthly reference
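Fitting a full seasonal ARIMA model needs a proper library (e.g. statsmodels, assumed and not shown here). As a minimal stand-in for the idea, an AR(1) coefficient can be fitted from the lag-1 autocorrelation (a one-parameter Yule-Walker fit) and used to forecast ahead:

```python
def fit_ar1(x):
    """Return (mean, phi) of an AR(1) fit via the lag-1 autocorrelation."""
    n = len(x)
    mean = sum(x) / n
    num = sum((x[t] - mean) * (x[t + 1] - mean) for t in range(n - 1))
    den = sum((v - mean) ** 2 for v in x)
    return mean, num / den

def forecast_ar1(x, steps=1):
    """Iterate the fitted AR(1) recursion from the last observation."""
    mean, phi = fit_ar1(x)
    last, out = x[-1], []
    for _ in range(steps):
        last = mean + phi * (last - mean)
        out.append(last)
    return out

# Synthetic AR(1) data with true phi = 0.8:
import random
random.seed(1)
series, v = [], 0.0
for _ in range(2000):
    v = 0.8 * v + random.gauss(0.0, 1.0)
    series.append(v)
mean, phi = fit_ar1(series)          # phi should land close to 0.8
next_values = forecast_ar1(series, steps=3)
```

A seasonal ARIMA adds differencing, moving-average terms and a seasonal lag (e.g. 12 months for evapotranspiration) on top of this same autoregressive idea.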
Quality Control Procedure Based on Partitioning of NMR Time Series
Directory of Open Access Journals (Sweden)
Michał Staniszewski
2018-03-01
Full Text Available The quality of magnetic resonance spectroscopy (MRS) depends on the stability of magnetic resonance (MR) system performance and optimal hardware functioning, which ensure adequate levels of signal-to-noise ratio (SNR) as well as good spectral resolution and minimal artifacts in the spectral data. MRS quality control (QC) protocols and methodologies are based on phantom measurements that are repeated regularly. In this work, a signal partitioning algorithm based on a dynamic programming (DP) method for QC assessment of the spectral data is described. The proposed algorithm allows detection of change points, i.e., abrupt variations in the time series data. The proposed QC method was tested using simulated and real phantom data. Simulated data were randomly generated time series distorted by white noise. The real data were taken from the phantom quality control studies of the MRS scanner, collected over four and a half years and analyzed by LCModel software. Along with the proposed algorithm, the performance of various literature methods was evaluated for a predefined number of change points, based on the error values calculated by subtracting the mean values calculated for the periods between the change points from the original data points. The time series were checked using external software, a set of external methods and the proposed tool, and the obtained results were comparable. The application of dynamic programming in the analysis of the phantom MRS data is a novel approach to QC. The obtained results confirm that the presented change-point-detection tool can be used either for independent analysis of MRS time series (or any other) or as a part of quality control.
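A dynamic-programming change-point detector in the spirit described above chooses k change points minimizing the total within-segment squared error (piecewise-constant mean model). This is a generic sketch, not the authors' exact algorithm:

```python
def segment_cost(prefix, prefix_sq, i, j):
    """Squared error of fitting x[i:j] by its mean, computed via prefix sums."""
    n = j - i
    s = prefix[j] - prefix[i]
    sq = prefix_sq[j] - prefix_sq[i]
    return sq - s * s / n

def change_points(x, k):
    """Optimal positions of k change points for a piecewise-constant mean."""
    n = len(x)
    prefix, prefix_sq = [0.0], [0.0]
    for v in x:
        prefix.append(prefix[-1] + v)
        prefix_sq.append(prefix_sq[-1] + v * v)
    inf = float("inf")
    # cost[m][j]: best cost of splitting x[:j] into m+1 segments
    cost = [[inf] * (n + 1) for _ in range(k + 1)]
    back = [[0] * (n + 1) for _ in range(k + 1)]
    for j in range(1, n + 1):
        cost[0][j] = segment_cost(prefix, prefix_sq, 0, j)
    for m in range(1, k + 1):
        for j in range(m + 1, n + 1):
            for i in range(m, j):
                c = cost[m - 1][i] + segment_cost(prefix, prefix_sq, i, j)
                if c < cost[m][j]:
                    cost[m][j], back[m][j] = c, i
    # walk the back-pointers to recover the boundaries
    points, j = [], n
    for m in range(k, 0, -1):
        j = back[m][j]
        points.append(j)
    return sorted(points)

# A step signal with a single mean shift at index 30:
signal = [0.0] * 30 + [5.0] * 30
```

The same error criterion the abstract describes (mean per inter-change-point period, subtracted from the data) is exactly what `segment_cost` accumulates, so the DP returns the globally optimal partition for the chosen k.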
Financial Time Series Prediction Using Elman Recurrent Random Neural Networks
Directory of Open Access Journals (Sweden)
Jie Wang
2016-01-01
(ERNN), the empirical results show that the proposed neural network displays the best performance among these neural networks in financial time series forecasting. Further, the empirical research is performed in testing the predictive effects of SSE, TWSE, KOSPI, and Nikkei225 with the established model, and the corresponding statistical comparisons of the above market indices are also exhibited. The experimental results show that this approach gives good performance in predicting the values from the stock market indices.
Appropriate use of the increment entropy for electrophysiological time series.
Liu, Xiaofeng; Wang, Xue; Zhou, Xu; Jiang, Aimin
2018-04-01
The increment entropy (IncrEn) is a new measure for quantifying the complexity of a time series. There are three critical parameters in the IncrEn calculation: N (length of the time series), m (dimensionality), and q (quantifying precision). However, the question of how to choose the most appropriate combination of IncrEn parameters for short datasets has not been extensively explored. The purpose of this research was to provide guidance on choosing suitable IncrEn parameters for short datasets by exploring the effects of varying the parameter values. We used simulated data, epileptic EEG data and cardiac interbeat (RR) data to investigate the effects of the parameters on the calculated IncrEn values. The results reveal that IncrEn is sensitive to changes in m, q and N for short datasets (N≤500). However, IncrEn reaches stability at a data length of N=1000 with m=2 and q=2, and for short datasets (N=100), it shows better relative consistency with 2≤m≤6 and 2≤q≤8. We suggest that the value of N should be no less than 100. To enable a clear distinction between different classes based on IncrEn, we recommend that m and q should take values between 2 and 4. With appropriate parameters, IncrEn enables the effective detection of complexity variations in physiological time series, suggesting that IncrEn should be useful for the analysis of physiological time series in clinical applications. Copyright © 2018 Elsevier Ltd. All rights reserved.
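The roles of m and q can be illustrated with a simplified increment-entropy-style measure: quantize each increment to a signed level controlled by q, form words of m consecutive levels, and take the Shannon entropy of the word distribution. This is an illustrative reconstruction from the description above, not the authors' exact formulation:

```python
import math

def increment_entropy(x, m=2, q=2):
    """Entropy of m-length words of q-level quantized increments (a sketch)."""
    incs = [x[i + 1] - x[i] for i in range(len(x) - 1)]
    scale = max(abs(v) for v in incs) or 1.0   # guard against an all-zero series
    coded = [int(round(q * v / scale)) for v in incs]
    # count words of m consecutive coded increments
    counts = {}
    for i in range(len(coded) - m + 1):
        w = tuple(coded[i:i + m])
        counts[w] = counts.get(w, 0) + 1
    total = sum(counts.values())
    return -sum(c / total * math.log(c / total) for c in counts.values())
```

A monotone ramp produces a single repeated word and hence zero entropy, while an irregular series spreads probability over many words; raising q or m refines the word alphabet, which is why short series (small N) become sensitive to those choices.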
Which DTW Method Applied to Marine Univariate Time Series Imputation
Phan , Thi-Thu-Hong; Caillault , Émilie; Lefebvre , Alain; Bigand , André
2017-01-01
International audience; Missing data are ubiquitous in all domains of applied sciences. Processing datasets containing missing values can lead to a loss of efficiency and unreliable results, especially for large missing sub-sequence(s). Therefore, the aim of this paper is to build a framework for filling missing values in univariate time series and to perform a comparison of different similarity metrics used for the imputation task. This allows us to suggest the most suitable methods for the imp...
Time series regression model for infectious disease and weather.
Imai, Chisato; Armstrong, Ben; Chalabi, Zaid; Mangtani, Punam; Hashizume, Masahiro
2015-10-01
Time series regression has been developed and long used to evaluate the short-term associations of air pollution and weather with mortality or morbidity of non-infectious diseases. The application of the regression approaches from this tradition to infectious diseases, however, is less well explored and raises some new issues. We discuss and present potential solutions for five issues often arising in such analyses: changes in immune population, strong autocorrelations, a wide range of plausible lag structures and association patterns, seasonality adjustments, and large overdispersion. The potential approaches are illustrated with datasets of cholera cases and rainfall from Bangladesh and influenza and temperature in Tokyo. Though this article focuses on the application of the traditional time series regression to infectious diseases and weather factors, we also briefly introduce alternative approaches, including mathematical modeling, wavelet analysis, and autoregressive integrated moving average (ARIMA) models. Modifications proposed to standard time series regression practice include using sums of past cases as proxies for the immune population, and using the logarithm of lagged disease counts to control autocorrelation due to true contagion, both of which are motivated from "susceptible-infectious-recovered" (SIR) models. The complexity of lag structures and association patterns can often be informed by biological mechanisms and explored by using distributed lag non-linear models. For overdispersed models, alternative distribution models such as quasi-Poisson and negative binomial should be considered. Time series regression can be used to investigate dependence of infectious diseases on weather, but may need modifying to allow for features specific to this context. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
Analyses of GIMMS NDVI Time Series in Kogi State, Nigeria
Palka, Jessica; Wessollek, Christine; Karrasch, Pierre
2017-10-01
The value of remote sensing data is particularly evident where an areal monitoring is needed to provide information on the earth's surface development. The use of temporal high resolution time series data allows for detecting short-term changes. In Kogi State in Nigeria different vegetation types can be found. As the major population in this region is living in rural communities with crop farming the existing vegetation is slowly being altered. The expansion of agricultural land causes loss of natural vegetation, especially in the regions close to the rivers which are suitable for crop production. With regard to these facts, two questions can be dealt with covering different aspects of the development of vegetation in the Kogi state, the determination and evaluation of the general development of the vegetation in the study area (trend estimation) and analyses on a short-term behavior of vegetation conditions, which can provide information about seasonal effects in vegetation development. For this purpose, the GIMMS-NDVI data set, provided by the NOAA, provides information on the normalized difference vegetation index (NDVI) in a geometric resolution of approx. 8 km. The temporal resolution of 15 days allows the already described analyses. For the presented analysis data for the period 1981-2012 (31 years) were used. The implemented workflow mainly applies methods of time series analysis. The results show that in addition to the classical seasonal development, artefacts of different vegetation periods (several NDVI maxima) can be found in the data. The trend component of the time series shows a consistently positive development in the entire study area considering the full investigation period of 31 years. However, the results also show that this development has not been continuous and a simple linear modeling of the NDVI increase is only possible to a limited extent. For this reason, the trend modeling was extended by procedures for detecting structural breaks in
Identification of neutral biochemical network models from time series data.
Vilela, Marco; Vinga, Susana; Maia, Marco A Grivet Mattoso; Voit, Eberhard O; Almeida, Jonas S
2009-05-05
The major difficulty in modeling biological systems from multivariate time series is the identification of parameter sets that endow a model with dynamical behaviors sufficiently similar to the experimental data. Directly related to this parameter estimation issue is the task of identifying the structure and regulation of ill-characterized systems. Both tasks are simplified if the mathematical model is canonical, i.e., if it is constructed according to strict guidelines. In this report, we propose a method for the identification of admissible parameter sets of canonical S-systems from biological time series. The method is based on a Monte Carlo process that is combined with an improved version of our previous parameter optimization algorithm. The method maps the parameter space into the network space, which characterizes the connectivity among components, by creating an ensemble of decoupled S-system models that imitate the dynamical behavior of the time series with sufficient accuracy. The concept of sloppiness is revisited in the context of these S-system models with an exploration not only of different parameter sets that produce similar dynamical behaviors but also different network topologies that yield dynamical similarity. The proposed parameter estimation methodology was applied to actual time series data from the glycolytic pathway of the bacterium Lactococcus lactis and led to ensembles of models with different network topologies. In parallel, the parameter optimization algorithm was applied to the same dynamical data upon imposing a pre-specified network topology derived from prior biological knowledge, and the results from both strategies were compared. The results suggest that the proposed method may serve as a powerful exploration tool for testing hypotheses and the design of new experiments.
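The canonical S-system form underlying this approach can be illustrated with a minimal numerical sketch. The power-law form is standard; the parameter ranges and acceptance tolerance below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def s_system_step(x, alpha, g, beta, h, dt=0.01):
    # Canonical S-system: dx_i/dt = alpha_i * prod_j x_j^g_ij - beta_i * prod_j x_j^h_ij
    return x + dt * (alpha * np.prod(x ** g, axis=1) - beta * np.prod(x ** h, axis=1))

def simulate(x0, alpha, g, beta, h, steps=200, dt=0.01):
    x = np.array(x0, dtype=float)
    traj = [x.copy()]
    for _ in range(steps):
        x = s_system_step(x, alpha, g, beta, h, dt)
        traj.append(x.copy())
    return np.array(traj)

def admissible_sets(reference, x0, n_draws=200, tol=0.05, seed=0):
    # Monte Carlo search for parameter sets whose trajectory imitates a
    # reference trajectory within tol -- the "ensemble" idea of the paper.
    rng = np.random.default_rng(seed)
    kept = []
    for _ in range(n_draws):
        alpha, beta = rng.uniform(0.5, 1.5, 2)
        g = rng.uniform(-0.5, 0.5, (1, 1))
        h = rng.uniform(-0.5, 0.5, (1, 1))
        traj = simulate(x0, np.array([alpha]), g, np.array([beta]), h)
        if np.max(np.abs(traj - reference)) < tol:
            kept.append((alpha, g, beta, h))
    return kept
```

Each retained parameter set reproduces the reference dynamics, so inspecting the ensemble reveals how loosely the data constrain individual parameters (the "sloppiness" the abstract revisits).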
Generation and prediction of time series by a neural network
International Nuclear Information System (INIS)
Eisenstein, E.; Kanter, I.; Kessler, D.A.; Kinzel, W.
1995-01-01
Generation and prediction of time series are analyzed for the case of a bit generator: a perceptron where in each time step the input units are shifted one bit to the right, with the state of the leftmost input unit set equal to the output unit in the previous time step. The long-time dynamical behavior of the bit generator consists of cycles whose typical period scales polynomially with the size of the network and whose spatial structure is periodic with a typical finite wavelength. The generalization error on a cycle is zero for a finite training set, and global dynamical behaviors can also be learned in a finite time. Hence, a projection of a rule can be learned in a finite time.
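The bit-generator dynamics described here are easy to reproduce. The sketch below uses a threshold perceptron over a shift register of ±1 bits; the weights and initial window are arbitrary choices for illustration.

```python
import numpy as np

def bit_generator(w, x0, steps):
    """Perceptron bit generator: the output bit sign(w . x) is fed back
    into the input window, which is shifted one position each step."""
    x = list(x0)                      # bits in {-1, +1}
    out = []
    for _ in range(steps):
        s = 1 if np.dot(w, x) >= 0 else -1
        out.append(s)
        x = [s] + x[:-1]              # shift right; new bit enters on the left
    return out

def cycle_length(seq):
    """Smallest period of the second half of seq (brute force)."""
    tail = seq[len(seq) // 2:]
    for p in range(1, len(tail) // 2 + 1):
        if all(tail[i] == tail[i + p] for i in range(len(tail) - p)):
            return p
    return None
```

Because the state space of an N-bit window has only 2^N configurations, the output is eventually periodic with period at most 2^N, consistent with the cyclic long-time behavior the abstract describes.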
Comparison of correlation analysis techniques for irregularly sampled time series
Directory of Open Access Journals (Sweden)
K. Rehfeld
2011-06-01
Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.
All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e. very irregular sampling, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate case, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.
We illustrate the performance of interpolation vs. the Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ^{18}O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques, and is suitable for large-scale application to paleo-data.
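A Gaussian-kernel correlation estimator of the kind compared here can be sketched in a few lines: products of deviations are weighted by how close the observed pairwise time differences are to the requested lag, so no interpolation of the irregular series is needed. The bandwidth heuristic is an assumption, not the rule used by the authors.

```python
import numpy as np

def gaussian_kernel_acf(t, x, lag, h=None):
    """Kernel-based autocorrelation at a given lag for an irregularly
    sampled series (t, x)."""
    t, x = np.asarray(t, float), np.asarray(x, float)
    if h is None:
        h = 0.25 * np.mean(np.diff(t))      # bandwidth heuristic (assumption)
    xc = x - x.mean()
    dt = t[None, :] - t[:, None]            # all pairwise time differences
    w = np.exp(-0.5 * ((dt - lag) / h) ** 2)
    return float(np.sum(w * np.outer(xc, xc)) / (np.sum(w) * x.var()))
```

On a regularly sampled sine wave the estimator recovers the expected cosine-shaped ACF; its advantage over interpolation appears when the sampling times are strongly irregular.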
Acute ischaemic stroke prediction from physiological time series patterns
Directory of Open Access Journals (Sweden)
Qing Zhang
2013-05-01
Full Text Available Background: Stroke is one of the major causes of human mortality. Recent clinical research has indicated that early changes in common physiological variables represent a potential therapeutic target; thus the manipulation of these variables may eventually yield an effective way to optimise stroke recovery. Aims: We examined correlations between physiological parameters of patients during the first 48 hours after a stroke and their stroke outcomes after 3 months. We wanted to discover physiological determinants that could be used to improve health outcomes by supporting the medical decisions that need to be made early in a patient's stroke experience. Method: We applied regression-based machine learning techniques to build a prediction algorithm that can forecast 3-month outcomes from physiological time series data collected during the first 48 hours after stroke. In our method, we used not only statistical characteristics as traditional prediction features, but also trend patterns of the time series data as new key features. Results: We tested our prediction method on a real physiological data set of stroke patients. The experiments yielded an average precision of 90%; prediction methods considering only statistical characteristics of the physiological data achieved an average precision of 71%. Conclusion: We demonstrated that using trend-pattern features in prediction methods improved the accuracy of stroke outcome prediction. Therefore, trend patterns of physiological time series data have an important role in the early treatment of patients with acute ischaemic stroke.
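Extracting trend-pattern features alongside statistical ones, as this abstract describes, might look like the following sketch. The piecewise-slope representation is an illustrative choice, not necessarily the authors' actual feature set.

```python
import numpy as np

def trend_features(x, n_segments=4):
    """Illustrative trend-pattern features: split the series into
    segments and return each segment's least-squares slope, alongside
    simple statistical features (mean, std)."""
    x = np.asarray(x, float)
    segs = np.array_split(x, n_segments)
    slopes = [np.polyfit(np.arange(len(s)), s, 1)[0] for s in segs]
    return np.array(slopes + [x.mean(), x.std()])
```

Concatenating segment slopes with the usual summary statistics gives a feature vector that captures how a physiological variable is moving, not just where it sits, which is the gain the abstract reports.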
Time series analysis for psychological research: examining and forecasting change.
Jebb, Andrew T; Tay, Louis; Wang, Wei; Huang, Qiming
2015-01-01
Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials.
Toward automatic time-series forecasting using neural networks.
Yan, Weizhong
2012-07-01
Over the past few decades, application of artificial neural networks (ANN) to time-series forecasting (TSF) has been growing rapidly due to several unique features of ANN models. However, to date, a consistent ANN performance over different studies has not been achieved. Many factors contribute to the inconsistency in the performance of neural network models. One such factor is that ANN modeling involves determining a large number of design parameters, and the current design practice is essentially heuristic and ad hoc, which does not exploit the full potential of neural networks. Systematic ANN modeling processes and strategies for TSF are, therefore, greatly needed. Motivated by this need, this paper attempts to develop an automatic ANN modeling scheme. It is based on the generalized regression neural network (GRNN), a special type of neural network. By taking advantage of several GRNN properties (i.e., a single design parameter and fast learning) and by incorporating several design strategies (e.g., fusing multiple GRNNs), we have been able to make the proposed modeling scheme effective for modeling large-scale business time series. The initial model was entered into the NN3 time-series competition. It was awarded the best prediction on the reduced dataset among approximately 60 different models submitted by scholars worldwide.
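The GRNN at the heart of this scheme is equivalent to Nadaraya-Watson kernel regression with a single smoothing parameter. A minimal sketch for one-step-ahead forecasting follows; the lag-embedding size and sigma values are illustrative assumptions.

```python
import numpy as np

def embed(series, m):
    # Lag-vector embedding: rows are windows of length m, targets the next value.
    series = np.asarray(series, float)
    X = np.array([series[i:i + m] for i in range(len(series) - m)])
    return X, series[m:]

def grnn_predict(X_train, y_train, x, sigma=0.5):
    # GRNN = kernel-weighted average of training targets; sigma is the
    # single design parameter the abstract highlights.
    d2 = np.sum((X_train - x) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return float(np.dot(w, y_train) / np.sum(w))
```

With a very small sigma the prediction collapses onto the nearest stored pattern; larger sigma averages over more patterns, which is the entire "training" procedure — there is no iterative weight fitting, hence the fast learning mentioned above.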
Modeling financial time series with S-plus
Zivot, Eric
2003-01-01
The field of financial econometrics has exploded over the last decade. This book represents an integration of theory, methods, and examples using the S-PLUS statistical modeling language and the S+FinMetrics module to facilitate the practice of financial econometrics. It is the first book to show the power of S-PLUS for the analysis of time series data. It is written for researchers and practitioners in the finance industry, academic researchers in economics and finance, and advanced MBA and graduate students in economics and finance. Readers are assumed to have a basic knowledge of S-PLUS and a solid grounding in basic statistics and time series concepts. Eric Zivot is an associate professor and Gary Waterman Distinguished Scholar in the Economics Department at the University of Washington, and is co-director of the nascent Professional Master's Program in Computational Finance. He regularly teaches courses on econometric theory, financial econometrics and time series econometrics, and is the recipient of the He...
Reconstruction of network topology using status-time-series data
Pandey, Pradumn Kumar; Badarla, Venkataramana
2018-01-01
Uncovering the heterogeneous connection pattern of a networked system from available status-time-series (STS) data of a dynamical process on the network is of great interest in network science and is known as a reverse engineering problem. Dynamical processes on a network are affected by its structure, and this dependency between the diffusion dynamics and the structure of the network can be exploited to retrieve the connection pattern from the diffusion data; knowledge of the network structure in turn helps to devise control of the dynamics on the network. In this paper, we consider the problem of network reconstruction from STS data using matrix analysis. The proposed method is tested successfully under susceptible-infected-susceptible (SIS) diffusion dynamics on real-world and computer-generated benchmark networks. The high accuracy and efficiency of the proposed reconstruction procedure define the novelty of the method, and it outperforms a compressed sensing theory (CST) based method of network reconstruction from STS data. Further, the same procedure is applied to weighted networks, where the ordering of the edges is identified with high accuracy.
Spectral Unmixing Analysis of Time Series Landsat 8 Images
Zhuo, R.; Xu, L.; Peng, J.; Chen, Y.
2018-05-01
Temporal analysis of Landsat 8 images opens up new opportunities in the unmixing procedure. Although spectral analysis of time series Landsat imagery has its own advantages, it has rarely been studied; nevertheless, using the temporal information can improve unmixing performance compared with independent image analyses. Moreover, different land cover types may exhibit different temporal patterns, which can aid their discrimination. Therefore, this letter presents time series K-P-Means, a new solution to the problem of unmixing time series Landsat imagery. The proposed approach obtains "purified" pixels in order to achieve optimal unmixing performance. Vertex component analysis (VCA) is used to extract endmembers for endmember initialization. First, nonnegative least squares (NNLS) is used to estimate abundance maps from the endmembers. Then, each endmember is updated as the mean value of its "purified" pixels, i.e., the residual of the mixed pixel after excluding the contributions of all nondominant endmembers. Assembling the two main steps (abundance estimation and endmember update) into an iterative optimization framework generates the complete algorithm. Experiments using both simulated and real Landsat 8 images show that the proposed "joint unmixing" approach provides more accurate endmember and abundance estimates than a "separate unmixing" approach.
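The NNLS abundance-estimation step can be sketched without any specialised library. The projected-gradient solver below is a stand-in for whatever NNLS routine the authors used; the endmember matrix and spectrum are toy values.

```python
import numpy as np

def nnls_pg(E, y, iters=500):
    """Nonnegative least squares by projected gradient (sketch):
    estimate abundances a >= 0 minimising ||E a - y||^2, given the
    endmember matrix E (bands x endmembers) and pixel spectrum y."""
    lr = 1.0 / np.linalg.norm(E.T @ E, 2)   # step size from the Lipschitz constant
    a = np.zeros(E.shape[1])
    for _ in range(iters):
        a = np.maximum(0.0, a - lr * (E.T @ (E @ a - y)))
    return a
```

Running this per pixel yields the abundance maps of the first K-P-Means step; the endmember update then averages the "purified" pixels assigned to each endmember.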
Clustering Multivariate Time Series Using Hidden Markov Models
Directory of Open Access Journals (Sweden)
Shima Ghassempour
2014-03-01
Full Text Available In this paper we describe an algorithm for clustering multivariate time series with variables taking both categorical and continuous values. Time series of this type are frequent in health care, where they represent the health trajectories of individuals. The problem is challenging because categorical variables make it difficult to define a meaningful distance between trajectories. We propose an approach based on Hidden Markov Models (HMMs), where we first map each trajectory into an HMM, then define a suitable distance between HMMs and finally proceed to cluster the HMMs with a method based on a distance matrix. We test our approach on a simulated, but realistic, data set of 1,255 trajectories of individuals of age 45 and over, on a synthetic validation set with known clustering structure, and on a smaller set of 268 trajectories extracted from the longitudinal Health and Retirement Survey. The proposed method can be implemented quite simply using standard packages in R and Matlab and may be a good candidate for solving the difficult problem of clustering multivariate time series with categorical variables using tools that do not require advanced statistical knowledge, and therefore are accessible to a wide range of researchers.
Cross-sample entropy of foreign exchange time series
Liu, Li-Zhi; Qian, Xi-Yuan; Lu, Heng-Yao
2010-11-01
The correlation of foreign exchange rates in currency markets is investigated based on the empirical data of DKK/USD, NOK/USD, CAD/USD, JPY/USD, KRW/USD, SGD/USD, THB/USD and TWD/USD for the period from 1995 to 2002. The cross-SampEn (cross-sample entropy) method is used to compare the returns of every two exchange rate time series to assess their degree of asynchrony. The calculation method for the confidence interval of SampEn is extended and applied to cross-SampEn. The cross-SampEn and its confidence interval for every two of the exchange rate time series in the periods 1995-1998 (before the Asian currency crisis) and 1999-2002 (after the Asian currency crisis) are calculated. The results show that the cross-SampEn of every two of these exchange rates becomes higher after the Asian currency crisis, indicating a higher asynchrony between the exchange rates. Especially for Singapore, Thailand and Taiwan, the cross-SampEn values after the Asian currency crisis are significantly higher than those before the crisis. Comparison with the correlation coefficient shows that cross-SampEn is superior in describing the correlation between time series.
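Cross-SampEn, as used here, counts template matches drawn from two different series rather than from one. A compact sketch follows; the tolerance convention (r times the pooled standard deviation) is a common choice and an assumption on our part.

```python
import numpy as np

def cross_sampen(u, v, m=2, r=0.2):
    """Cross-sample entropy between series u and v (sketch): B counts
    m-length cross-series template matches, A counts (m+1)-length
    matches, and cross-SampEn = -ln(A/B)."""
    u, v = np.asarray(u, float), np.asarray(v, float)
    tol = r * np.concatenate([u, v]).std()   # tolerance convention (assumption)

    def matches(mm):
        Un = np.array([u[i:i + mm] for i in range(len(u) - mm + 1)])
        Vn = np.array([v[j:j + mm] for j in range(len(v) - mm + 1)])
        # Chebyshev distance between every cross-series template pair
        d = np.max(np.abs(Un[:, None, :] - Vn[None, :, :]), axis=2)
        return np.sum(d <= tol)

    return -np.log(matches(m + 1) / matches(m))
```

Higher values indicate greater asynchrony between the two series, which is how the abstract interprets the post-crisis increase in cross-SampEn.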
Earthquake forecasting studies using radon time series data in Taiwan
Walia, Vivek; Kumar, Arvind; Fu, Ching-Chou; Lin, Shih-Jung; Chou, Kuang-Wu; Wen, Kuo-Liang; Chen, Cheng-Hong
2017-04-01
For a few decades, a growing number of studies have shown the usefulness of data in the field of seismo-geochemistry, interpreted as geochemical precursory signals of impending earthquakes, and radon is identified as one of the most reliable geochemical precursors. Radon is recognized as a short-term precursor and is being monitored in many countries. This study is aimed at developing an effective earthquake forecasting system by inspecting long-term radon time series data obtained from a network of radon monitoring stations established along different faults of Taiwan. Continuous time series radon data for earthquake studies have been recorded, and some significant variations associated with strong earthquakes have been observed. The data are also examined to evaluate earthquake precursory signals against environmental factors. An automated real-time database operating system has been developed recently to improve the data processing for earthquake precursory studies. In addition, the study is aimed at the appraisal and filtration of these environmental parameters in order to create a real-time database that supports our earthquake precursory study. In recent years, an automatic real-time database has been developed using R, an open source programming language, to carry out statistical computation on the data. To integrate the data with our working procedure, we use the popular open source web application stack AMP (Apache, MySQL, and PHP), creating a website that helps us effectively display and manage the real-time database.
Institute of Scientific and Technical Information of China (English)
尹华站; 李丹; 袁祥勇; 黄希庭
2013-01-01
solution as it uses a blank interruption instead. Researchers have consistently found similar position and interruption effects in both paradigms (Casini & Macar, 1997; Fortin & Tremblay, 2006; Tremblay & Fortin, 2003). Furthermore, the results showed that both the discontinuity and the interference of current information processing belong to the interruption effect, but to varying extents (Fortin & Massé, 2000; Macar, 2002). However, though the position and interruption effects were similar in the two paradigms, they have not been explored in the same stimulus series. As we know, information exchange with the outside world does not depend on a single sensory channel, but rather on the interaction of cross-modal information processing. It would be valuable to explore the position and interruption effects in the context of cross-modal processing. This would not only help to uncover the cognitive mechanism of time processing, but would also have important practical value, as it is closer to daily life. Therefore, the present study was designed to investigate the position and interruption effects in the two paradigms under cross-modal conditions. To this end, the study consisted of two experiments. In Experiment 1, 2500 ms and 4500 ms were set as the target time intervals; using the same stimulus sequence (visual presentation, with aural interruption), participants were allocated to the control, break and interference conditions respectively. In Experiment 2, the target intervals were set to 1500 ms and 2500 ms. Results of Experiment 1 showed that the interruption effect was more significant in the break condition regardless of target time interval. Furthermore, under the 2500 ms condition, a position effect was found in all three conditions, whereas under the 4500 ms condition, the position effect only existed in the break condition. Experiment 2 found a consistent position effect, regardless of the interpolation conditions or target time intervals. Besides, the interrupt effect was more
A method for generating high resolution satellite image time series
Guo, Tao
2014-10-01
There is an increasing demand for satellite remote sensing data with both high spatial and high temporal resolution in many applications. However, it remains a challenge to improve spatial resolution and temporal frequency simultaneously due to the technical limits of current satellite observation systems. To this end, much R&D effort has been ongoing for years and has led to some successes, roughly in two aspects: on one hand, super-resolution, pan-sharpening and similar methods can effectively enhance spatial resolution and generate good visual effects, but they hardly preserve spectral signatures and thus have limited analytical value; on the other hand, time interpolation is a straightforward way to increase temporal frequency, but it adds little informative content. In this paper we present a novel method to simulate high resolution time series data by combining low resolution time series data with only a very small number of high resolution images. Our method starts with a pair of high and low resolution data sets, and a spatial registration is done by introducing an LDA model to map high and low resolution pixels to each other. Afterwards, temporal change information is captured through a comparison of the low resolution time series data, projected onto the high resolution data plane, and assigned to each high resolution pixel according to predefined temporal change patterns for each type of ground object. Finally, the simulated high resolution data are generated. A preliminary experiment shows that our method can simulate high resolution data with reasonable accuracy. The contribution of our method is to enable timely monitoring of temporal changes through analysis of a time sequence of low resolution images only, so that the use of costly high resolution data can be reduced as much as possible; it presents a highly effective way to build an economically operational monitoring solution for agriculture, forest, and land use investigation
Forecasting long memory time series under a break in persistence
DEFF Research Database (Denmark)
Heinen, Florian; Sibbertsen, Philipp; Kruse, Robinson
We consider the problem of forecasting time series with long memory when the memory parameter is subject to a structural break. By means of a large-scale Monte Carlo study we show that ignoring such a change in persistence leads to substantially reduced forecasting precision. The strength of this effect depends on whether the memory parameter is increasing or decreasing over time. A comparison of six forecasting strategies allows us to conclude that pre-testing for a change in persistence is highly recommendable in our setting. In addition we provide an empirical example which underlines...
Extracting the relevant delays in time series modelling
DEFF Research Database (Denmark)
Goutte, Cyril
1997-01-01
In this contribution, we suggest a convenient way to use generalisation error to extract the relevant delays from a time-varying process, i.e. the delays that lead to the best prediction performance. We design a generalisation-based algorithm that takes its inspiration from traditional variable selection, and more precisely stepwise forward selection. The method is compared to other forward selection schemes, as well as to nonparametric tests aimed at estimating the embedding dimension of time series. The final application extends these results to the efficient estimation of FIR filters on some...
Satellite Image Time Series Decomposition Based on EEMD
Directory of Open Access Journals (Sweden)
Yun-long Kong
2015-11-01
Full Text Available Satellite Image Time Series (SITS) have recently been of great interest due to the emerging remote sensing capabilities for Earth observation. Trend and seasonal components are two crucial elements of SITS. In this paper, a novel framework for SITS decomposition based on Ensemble Empirical Mode Decomposition (EEMD) is proposed. EEMD is achieved by sifting an ensemble of adaptive orthogonal components called Intrinsic Mode Functions (IMFs). EEMD is noise-assisted and overcomes the drawback of mode mixing in conventional Empirical Mode Decomposition (EMD). Inspired by these advantages, the aim of this work is to employ EEMD to decompose SITS into IMFs and to choose relevant IMFs for the separation of seasonal and trend components. In a series of simulations, IMFs extracted by EEMD achieved a clear representation with physical meaning. The experimental results of 16-day compositions of Moderate Resolution Imaging Spectroradiometer (MODIS), Normalized Difference Vegetation Index (NDVI), and Global Environment Monitoring Index (GEMI) time series with disturbance illustrated the effectiveness and stability of the proposed approach for monitoring tasks, such as the detection of abrupt changes.
Deriving crop calendar using NDVI time-series
Patel, J. H.; Oza, M. P.
2014-11-01
Agricultural intensification is defined in terms of cropping intensity, which is the number of crops (single, double and triple) per year in a unit cropland area. Information about the crop calendar (i.e. the number of crops in a parcel of land, their planting and harvesting dates, and the date of peak vegetative stage) is essential for proper management of agriculture. Remote sensing sensors provide a regular, consistent and reliable measurement of vegetation response at various growth stages of a crop, and are therefore ideally suited for monitoring purposes. The spectral response of vegetation, as measured by the Normalized Difference Vegetation Index (NDVI) and its profiles, can provide a new dimension for describing the vegetation growth cycle. Analysis based on values of NDVI at regular time intervals provides useful information about various crop growth stages and the performance of a crop in a season. However, the NDVI data series has a considerable amount of local fluctuation in the time domain and needs to be smoothed so that the dominant seasonal behavior is enhanced. Based on temporal analysis of the smoothed NDVI series, it is possible to extract the number of crop cycles per year and their crop calendar. In the present study, a methodology is developed to extract key elements of the crop growth cycle (i.e. the number of crops per year and their planting, peak and harvesting dates). This is illustrated by analysing the MODIS-NDVI data series of one agricultural year (from June 2012 to May 2013) over Gujarat. Such an analysis is very useful for analysing the dynamics of kharif and rabi crops.
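The peak-counting step described above (smooth the NDVI series, then count peak vegetative stages) can be sketched as follows. The smoothing window and prominence threshold are illustrative choices, not the paper's method:

```python
import numpy as np

def crop_cycles(ndvi, window=3, min_prominence=0.15):
    """Smooth an NDVI series with a moving average, then keep local
    maxima that rise at least `min_prominence` above the series minimum.
    Each surviving peak is taken as the peak vegetative stage of one crop."""
    kernel = np.ones(window) / window
    smooth = np.convolve(ndvi, kernel, mode="same")
    peaks = []
    for i in range(1, len(smooth) - 1):
        if smooth[i] > smooth[i - 1] and smooth[i] >= smooth[i + 1]:
            if smooth[i] - smooth.min() >= min_prominence:
                peaks.append(i)
    return peaks

# Two synthetic crop seasons over 23 sixteen-day composites
t = np.arange(23)
ndvi = (0.2 + 0.5 * np.exp(-0.5 * ((t - 5) / 2.0) ** 2)
            + 0.5 * np.exp(-0.5 * ((t - 16) / 2.0) ** 2))
print(len(crop_cycles(ndvi)))  # → 2, i.e. a double-cropped parcel
```

The peak indices themselves would give the (composite-period) dates of the peak vegetative stages; planting and harvesting dates would be read off the rising and falling flanks around each peak.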
Linear and nonlinear dynamic systems in financial time series prediction
Directory of Open Access Journals (Sweden)
Salim Lahmiri
2012-10-01
Full Text Available Autoregressive moving average (ARMA) processes and dynamic neural networks, namely the nonlinear autoregressive moving average with exogenous inputs (NARX), are compared by evaluating their ability to predict financial time series; for instance, the S&P500 returns. Two classes of ARMA are considered. The first one is the standard ARMA model, which is a linear static system. The second one uses a Kalman filter (KF) to estimate and predict the ARMA coefficients. This model is a linear dynamic system. The forecasting ability of each system is evaluated by means of mean absolute error (MAE) and mean absolute deviation (MAD) statistics. Simulation results indicate that the ARMA-KF system performs better than the standard ARMA alone. Thus, introducing dynamics into the ARMA process improves the forecasting accuracy. In addition, the ARMA-KF outperformed the NARX. This result may suggest that the linear component found in the S&P500 return series is more dominant than the nonlinear part. In sum, we conclude that introducing dynamics into the ARMA process provides an effective system for S&P500 time series prediction.
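The ARMA-KF idea above treats the model coefficients themselves as a hidden state tracked by a Kalman filter. A minimal sketch of that idea for a single AR(1) coefficient (not the authors' implementation; the noise variances `q` and `r` are illustrative choices):

```python
import numpy as np

def kalman_ar1(y, q=1e-4, r=1.0):
    """Track a (possibly drifting) AR(1) coefficient phi_t with a scalar
    Kalman filter: the state phi follows a random walk (variance q), and
    the observation equation is y_t = phi_t * y_{t-1} + noise (variance r).
    Returns the filtered coefficient path and one-step-ahead predictions."""
    phi, p = 0.0, 1.0
    phis, preds = [], []
    for t in range(1, len(y)):
        p += q                        # time update of the state covariance
        h = y[t - 1]                  # observation "design" value
        k = p * h / (h * h * p + r)   # Kalman gain
        preds.append(phi * h)         # one-step prediction before the update
        phi += k * (y[t] - phi * h)   # measurement update
        p *= (1 - k * h)
        phis.append(phi)
    return np.array(phis), np.array(preds)

rng = np.random.default_rng(0)
y = np.zeros(500)
for t in range(1, 500):
    y[t] = 0.7 * y[t - 1] + rng.normal(scale=0.5)
phis, preds = kalman_ar1(y, r=0.25)
print(phis[-1])  # typically settles near the true coefficient 0.7
```

Because the state is allowed to drift, the same filter would also follow a slowly time-varying coefficient, which is the "dynamic system" advantage the abstract describes.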
STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS
Energy Technology Data Exchange (ETDEWEB)
Scargle, Jeffrey D. [Space Science and Astrobiology Division, MS 245-3, NASA Ames Research Center, Moffett Field, CA 94035-1000 (United States); Norris, Jay P. [Physics Department, Boise State University, 2110 University Drive, Boise, ID 83725-1570 (United States); Jackson, Brad [The Center for Applied Mathematics and Computer Science, Department of Mathematics, San Jose State University, One Washington Square, MH 308, San Jose, CA 95192-0103 (United States); Chiang, James, E-mail: jeffrey.d.scargle@nasa.gov [W. W. Hansen Experimental Physics Laboratory, Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics and SLAC National Accelerator Laboratory, Stanford University, Stanford, CA 94305 (United States)
2013-02-20
This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it-an improved and generalized version of Bayesian Blocks-that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.
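The core of Bayesian Blocks is a dynamic program over the position of the last block. A toy sketch of that structure, using a least-squares block fitness and a flat per-block penalty in place of the paper's likelihood and marginal-posterior fitness functions:

```python
import numpy as np

def segment(x, penalty=1.0):
    """Optimal piecewise-constant segmentation by dynamic programming,
    mirroring the structure of Bayesian Blocks: opt[j] is the best score
    of the first j points, found by scanning every possible last block.
    Block fitness here is the negative within-block sum of squares, and
    each block pays `penalty` (a stand-in for the prior on block count)."""
    n = len(x)
    s = np.concatenate(([0.0], np.cumsum(x)))              # prefix sums
    s2 = np.concatenate(([0.0], np.cumsum(np.square(x))))  # prefix sums of squares
    opt = np.zeros(n + 1)
    last = np.zeros(n + 1, dtype=int)
    for j in range(1, n + 1):
        best, arg = -np.inf, 0
        for i in range(j):                                 # last block is x[i:j]
            m = j - i
            sse = (s2[j] - s2[i]) - (s[j] - s[i]) ** 2 / m
            score = opt[i] - sse - penalty
            if score > best:
                best, arg = score, i
        opt[j], last[j] = best, arg
    cps, j = [], n                                         # backtrack change points
    while j > 0:
        cps.append(int(last[j]))
        j = last[j]
    return sorted(cps)

x = np.array([0.0] * 20 + [5.0] * 20)                      # one clear level shift
print(segment(x))  # → [0, 20]
```

The real algorithm uses the same O(n^2) recurrence but with the event/binned/point-measurement fitness functions of the paper, which is what allows it to run in a real-time trigger mode: `opt` and `last` can be updated incrementally as each datum arrives.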
Assessing Coupling Dynamics from an Ensemble of Time Series
Directory of Open Access Journals (Sweden)
Germán Gómez-Herrero
2015-04-01
Full Text Available Finding interdependency relations between time series provides valuable knowledge about the processes that generated the signals. Information theory sets a natural framework for important classes of statistical dependencies. However, a reliable estimation from information-theoretic functionals is hampered when the dependency to be assessed is brief or evolves in time. Here, we show that these limitations can be partly alleviated when we have access to an ensemble of independent repetitions of the time series. In particular, we gear a data-efficient estimator of probability densities to make use of the full structure of trial-based measures. By doing so, we can obtain time-resolved estimates for a family of entropy combinations (including mutual information, transfer entropy and their conditional counterparts, which are more accurate than the simple average of individual estimates over trials. We show with simulated and real data generated by coupled electronic circuits that the proposed approach allows one to recover the time-resolved dynamics of the coupling between different subsystems.
Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations
Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James
2013-01-01
This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it-an improved and generalized version of Bayesian Blocks [Scargle 1998]-that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. 2008] all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.
STUDIES IN ASTRONOMICAL TIME SERIES ANALYSIS. VI. BAYESIAN BLOCK REPRESENTATIONS
International Nuclear Information System (INIS)
Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James
2013-01-01
This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it—an improved and generalized version of Bayesian Blocks—that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.
Time-Series Analysis of Supergranule Characteristics at Solar Minimum
Williams, Peter E.; Pesnell, W. Dean
2013-01-01
Sixty days of Doppler images from the Solar and Heliospheric Observatory (SOHO) / Michelson Doppler Imager (MDI) investigation during the 1996 and 2008 solar minima have been analyzed to show that certain supergranule characteristics (size, size range, and horizontal velocity) exhibit fluctuations of three to five days. Cross-correlating parameters showed a good, positive correlation between supergranulation size and size range, and a moderate, negative correlation between size range and velocity. The size and velocity do exhibit a moderate, negative correlation, but with a small time lag (less than 12 hours). Supergranule sizes during five days of co-temporal data from MDI and the Solar Dynamics Observatory (SDO) / Helioseismic Magnetic Imager (HMI) exhibit similar fluctuations with a high level of correlation between them. This verifies the solar origin of the fluctuations, which cannot be caused by instrumental artifacts according to these observations. Similar fluctuations are also observed in data simulations that model the evolution of the MDI Doppler pattern over a 60-day period. Correlations between the supergranule size and size range time-series derived from the simulated data are similar to those seen in MDI data. A simple toy-model using cumulative, uncorrelated exponential growth and decay patterns at random emergence times produces a time-series similar to the data simulations. The qualitative similarities between the simulated and the observed time-series suggest that the fluctuations arise from stochastic processes occurring within the solar convection zone. This behavior, propagating to surface manifestations of supergranulation, may assist our understanding of magnetic-field-line advection, evolution, and interaction.
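Cross-correlating two parameter time-series with a time lag, as done above for size and velocity, can be sketched like this. The data are synthetic and `best_lag` is an illustrative helper, not the authors' pipeline:

```python
import numpy as np

def best_lag(a, b, max_lag=12):
    """Cross-correlate two standardized series and return the lag (in
    samples) at which |correlation| peaks, plus the correlation there.
    A positive lag means `b` trails `a`."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    lags = list(range(-max_lag, max_lag + 1))
    corrs = []
    for L in lags:
        if L >= 0:
            corrs.append(np.mean(a[: len(a) - L] * b[L:]) if L else np.mean(a * b))
        else:
            corrs.append(np.mean(a[-L:] * b[: len(b) + L]))
    k = int(np.argmax(np.abs(corrs)))
    return lags[k], corrs[k]

# Smooth aperiodic noise stands in for a fluctuating supergranule "size";
# "velocity" is its anti-correlated copy, trailing by 5 samples.
rng = np.random.default_rng(1)
raw = np.convolve(rng.normal(size=240), np.ones(20) / 20, mode="valid")
size = raw[5:205]
velocity = -raw[:200]
lag, c = best_lag(size, velocity)
print(lag, c)  # lag 5, correlation near -1
```

This reproduces the structure of the abstract's finding: a moderate-to-strong negative correlation whose peak sits at a small positive lag rather than at zero.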
Time series analysis of the behavior of brazilian natural rubber
Directory of Open Access Journals (Sweden)
Antônio Donizette de Oliveira
2009-03-01
Full Text Available The natural rubber is a non-wood product obtained from the coagulation of the latex of some forest species, Hevea brasiliensis being the main one. Native to the Amazon Region, this species was already known by the Indians before the discovery of America. Natural rubber became a globally valued product due to its multiple applications in the economy, its almost perfect substitute being synthetic rubber derived from petroleum. As with countless other products, the forecasting of future prices of natural rubber has been the object of many studies. The use of univariate time-series forecasting models stands out as the most accurate and useful way to reduce uncertainty in the economic decision-making process. This study analyzed the historical series of prices of Brazilian natural rubber (R$/kg) in the Jan/99 - Jun/2006 period, in order to characterize rubber price behavior in the domestic market; estimated a model for the time series of monthly natural rubber prices; and forecast the domestic prices of natural rubber in the Jul/2006 - Jun/2007 period, based on the estimated models. The models studied were those belonging to the ARIMA family. The main results were: the domestic market for natural rubber is expanding due to the growth of the world economy; among the adjusted models, the ARIMA(1,1,1) model provided the best adjustment of the time series of prices of natural rubber (R$/kg); and the prognoses accomplished for the series supplied statistically adequate fittings.
Razavi, Saman; Vogel, Richard
2018-02-01
Prewhitening, the process of eliminating or reducing short-term stochastic persistence to enable detection of deterministic change, has been extensively applied to time series analysis of a range of geophysical variables. Despite the controversy around its utility, methodologies for prewhitening time series continue to be a critical feature of a variety of analyses including: trend detection of hydroclimatic variables and reconstruction of climate and/or hydrology through proxy records such as tree rings. With a focus on the latter, this paper presents a generalized approach to exploring the impact of a wide range of stochastic structures of short- and long-term persistence on the variability of hydroclimatic time series. Through this approach, we examine the impact of prewhitening on the inferred variability of time series across time scales. We document how a focus on prewhitened, residual time series can be misleading, as it can drastically distort (or remove) the structure of variability across time scales. Through examples with actual data, we show how such loss of information in prewhitened time series of tree rings (so-called "residual chronologies") can lead to the underestimation of extreme conditions in climate and hydrology, particularly droughts, reconstructed for centuries preceding the historical period.
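For reference, the standard AR(1) prewhitening step that the paper scrutinizes looks like this (one common variant; the 1/(1 - r1) rescaling follows trend-preserving prewhitening practice, and is an assumption of this sketch rather than something taken from the paper):

```python
import numpy as np

def prewhiten_ar1(x):
    """Remove lag-1 persistence: estimate the AR(1) coefficient r1 from
    the sample autocorrelation and return the residual series
    w_t = (x_t - r1 * x_{t-1}) / (1 - r1).  The rescaling keeps the
    long-term mean and trend on the original scale."""
    x = np.asarray(x, dtype=float)
    d = x - x.mean()
    r1 = np.sum(d[1:] * d[:-1]) / np.sum(d * d)
    w = (x[1:] - r1 * x[:-1]) / (1.0 - r1)
    return r1, w

rng = np.random.default_rng(2)
y = np.zeros(2000)
for t in range(1, 2000):
    y[t] = 0.6 * y[t - 1] + rng.normal()
r1, w = prewhiten_ar1(y)
d = w - w.mean()
r1_after = np.sum(d[1:] * d[:-1]) / np.sum(d * d)
print(r1, r1_after)  # persistence drops from ~0.6 to ~0
```

The paper's point is visible in `w` itself: the low-frequency excursions of `y` (the analogue of multi-year droughts in a residual chronology) are strongly damped, which is exactly the loss of variability across time scales that the authors warn about.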
Monitoring Forest Regrowth Using a Multi-Platform Time Series
Sabol, Donald E., Jr.; Smith, Milton O.; Adams, John B.; Gillespie, Alan R.; Tucker, Compton J.
1996-01-01
Over the past 50 years, the forests of western Washington and Oregon have been extensively harvested for timber. This has resulted in a heterogeneous mosaic of remaining mature forests, clear-cuts, new plantations, and second-growth stands that now occur in areas that formerly were dominated by extensive old-growth forests and younger forests resulting from fire disturbance. Traditionally, determinations of seral stage and stand condition have been made using aerial photography and spot field observations, a methodology that is not only time- and resource-intensive, but falls short of providing current information on a regional scale. These limitations may be solved, in part, through the use of multispectral images which can cover large areas at spatial resolutions on the order of tens of meters. A time series of multiple images can potentially be used to monitor land use (e.g. cutting and replanting) and to observe natural processes such as regeneration, maturation and phenologic change. These processes are more likely to be spectrally observed in a time series composed of images taken during different seasons over a long period of time. Therefore, for many areas, it may be necessary to use a variety of images taken with different imaging systems. A common framework for interpretation is needed that reduces topographic, atmospheric, and instrumental effects, as well as differences in lighting geometry between images. The present state of remote-sensing technology in general use does not realize the full potential of the multispectral data in areas of high topographic relief. For example, the primary method for analyzing images of forested landscapes in the Northwest has been with statistical classifiers (e.g. parallelepiped, nearest-neighbor, maximum likelihood, etc.), often applied to uncalibrated multispectral data. Although this approach has produced useful information from individual images in some areas, landcover classes defined by these
State-space prediction model for chaotic time series
Alparslan, A. K.; Sayar, M.; Atilgan, A. R.
1998-08-01
A simple method for predicting the continuation of scalar chaotic time series ahead in time is proposed. The false nearest neighbors technique in connection with the time-delayed embedding is employed so as to reconstruct the state space. A local forecasting model based upon the time evolution of the topological neighboring in the reconstructed phase space is suggested. A moving root-mean-square error is utilized in order to monitor the error along the prediction horizon. The model is tested for the convection amplitude of the Lorenz model. The results indicate that for approximately 100 cycles of the training data, the prediction follows the actual continuation very closely about six cycles. The proposed model, like other state-space forecasting models, captures the long-term behavior of the system due to the use of spatial neighbors in the state space.
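The prediction scheme described above (time-delayed embedding plus topological neighbours in the reconstructed state space) can be caricatured with a single-nearest-neighbour sketch. Names and parameters are illustrative, and the false-nearest-neighbours step for choosing the embedding dimension is omitted:

```python
import numpy as np

def nn_forecast(x, dim=3, tau=1, steps=5):
    """Forecast by delay embedding: build state vectors
    v_i = (x[i], x[i+tau], ..., x[i+(dim-1)*tau]), find the historical
    state nearest to the current one, and copy its successor.  Repeat,
    feeding each prediction back in, to extend the horizon."""
    x = list(x)
    span = (dim - 1) * tau
    for _ in range(steps):
        vecs = np.array([[x[i + k * tau] for k in range(dim)]
                         for i in range(len(x) - span)])
        current, history = vecs[-1], vecs[:-1]
        succ_of = np.arange(len(history)) + span + 1   # successor index of each state
        dists = np.linalg.norm(history - current, axis=1)
        j = int(np.argmin(dists))
        x.append(x[succ_of[j]])
    return x[-steps:]

# Noise-free sine wave: the neighbour's continuation tracks the true one
t = np.linspace(0, 8 * np.pi, 400)
series = np.sin(t)
pred = nn_forecast(series[:300], steps=20)
err = np.max(np.abs(np.array(pred) - series[300:320]))
print(err)  # small for a clean periodic signal
```

A moving error measure along the prediction horizon, as in the paper, would simply track `pred` against held-out data step by step; for chaotic data such as the Lorenz convection amplitude, the error grows with the horizon instead of staying flat.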
Discovering significant evolution patterns from satellite image time series.
Petitjean, François; Masseglia, Florent; Gançarski, Pierre; Forestier, Germain
2011-12-01
Satellite Image Time Series (SITS) provide us with precious information on land cover evolution. By studying these series of images we can both understand the changes of specific areas and discover global phenomena that spread over larger areas. Changes that can occur throughout the sensing time can spread over very long periods and may have different start time and end time depending on the location, which complicates the mining and the analysis of series of images. This work focuses on frequent sequential pattern mining (FSPM) methods, since this family of methods fits the above-mentioned issues. This family of methods consists of finding the most frequent evolution behaviors, and is actually able to extract long-term changes as well as short term ones, whenever the change may start and end. However, applying FSPM methods to SITS implies confronting two main challenges, related to the characteristics of SITS and the domain's constraints. First, satellite images associate multiple measures with a single pixel (the radiometric levels of different wavelengths corresponding to infra-red, red, etc.), which makes the search space multi-dimensional and thus requires specific mining algorithms. Furthermore, the non evolving regions, which are the vast majority and overwhelm the evolving ones, challenge the discovery of these patterns. We propose a SITS mining framework that enables discovery of these patterns despite these constraints and characteristics. Our proposal is inspired from FSPM and provides a relevant visualization principle. Experiments carried out on 35 images sensed over 20 years show the proposed approach makes it possible to extract relevant evolution behaviors.
Remote-Sensing Time Series Analysis, a Vegetation Monitoring Tool
McKellip, Rodney; Prados, Donald; Ryan, Robert; Ross, Kenton; Spruce, Joseph; Gasser, Gerald; Greer, Randall
2008-01-01
The Time Series Product Tool (TSPT) is software, developed in MATLAB, which creates and displays high signal-to-noise Vegetation Indices imagery and other higher-level products derived from remotely sensed data. This tool enables automated, rapid, large-scale regional surveillance of crops, forests, and other vegetation. TSPT temporally processes high-revisit-rate satellite imagery produced by the Moderate Resolution Imaging Spectroradiometer (MODIS) and by other remote-sensing systems. Although MODIS imagery is acquired daily, cloudiness and other sources of noise can greatly reduce the effective temporal resolution. To improve cloud statistics, the TSPT combines MODIS data from multiple satellites (Aqua and Terra). The TSPT produces MODIS products as single time-frame and multitemporal change images, as time-series plots at a selected location, or as temporally processed image videos. Using the TSPT program, MODIS metadata is used to remove and/or correct bad and suspect data. Bad pixel removal, multiple satellite data fusion, and temporal processing techniques create high-quality plots and animated image video sequences that depict changes in vegetation greenness. This tool provides several temporal processing options not found in other comparable imaging software tools. Because the framework to generate and use other algorithms is established, small modifications to this tool will enable the use of a large range of remotely sensed data types. An effective remote-sensing crop monitoring system must be able to detect subtle changes in plant health in the earliest stages, before the effects of a disease outbreak or other adverse environmental conditions can become widespread and devastating. The integration of the time series analysis tool with ground-based information, soil types, crop types, meteorological data, and crop growth models in a Geographic Information System, could provide the foundation for a large-area crop-surveillance system that could identify
Weighted statistical parameters for irregularly sampled time series
Rimoldini, Lorenzo
2014-01-01
Unevenly spaced time series are common in astronomy because of the day-night cycle, weather conditions, dependence on the source position in the sky, allocated telescope time and corrupt measurements, for example, or inherent to the scanning law of satellites like Hipparcos and the forthcoming Gaia. Irregular sampling often causes clumps of measurements and gaps with no data which can severely disrupt the values of estimators. This paper aims at improving the accuracy of common statistical parameters when linear interpolation (in time or phase) can be considered an acceptable approximation of a deterministic signal. A pragmatic solution is formulated in terms of a simple weighting scheme, adapting to the sampling density and noise level, applicable to large data volumes at minimal computational cost. Tests on time series from the Hipparcos periodic catalogue led to significant improvements in the overall accuracy and precision of the estimators with respect to the unweighted counterparts and those weighted by inverse-squared uncertainties. Automated classification procedures employing statistical parameters weighted by the suggested scheme confirmed the benefits of the improved input attributes. The classification of eclipsing binaries, Mira, RR Lyrae, Delta Cephei and Alpha2 Canum Venaticorum stars employing exclusively weighted descriptive statistics achieved an overall accuracy of 92 per cent, about 6 per cent higher than with unweighted estimators.
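The interpolation-based weighting idea can be sketched as follows (the published scheme also adapts to the noise level; this toy shows only the sampling-density part):

```python
import numpy as np

def interp_weights(t):
    """Per-point weights proportional to the time interval each sample
    'covers' under linear interpolation: half the gap to each neighbour,
    with the end points getting half of their single gap.  Clumped
    measurements share weight; isolated ones gain it."""
    t = np.asarray(t, dtype=float)
    dt = np.diff(t)
    w = np.empty_like(t)
    w[0] = dt[0] / 2
    w[-1] = dt[-1] / 2
    w[1:-1] = (dt[:-1] + dt[1:]) / 2
    return w / w.sum()

def weighted_mean(x, w):
    return float(np.sum(w * x))

# A clump of five rapid samples reading 1.0 and one isolated sample at 0.0:
t = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 10.0])
x = np.array([1.0, 1.0, 1.0, 1.0, 1.0, 0.0])
print(round(float(np.mean(x)), 2),
      round(weighted_mean(x, interp_weights(t)), 2))  # → 0.83 0.52
```

The unweighted mean is dominated by the clump, while the weighted mean is close to the time-average of the interpolated signal, which is the behaviour the paper's estimators are designed to recover.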
Detecting switching and intermittent causalities in time series
Zanin, Massimiliano; Papo, David
2017-04-01
During the last decade, complex network representations have emerged as a powerful instrument for describing the cross-talk between different brain regions both at rest and as subjects are carrying out cognitive tasks, in healthy brains and neurological pathologies. The transient nature of such cross-talk has nevertheless by and large been neglected, mainly due to the inherent limitations of some metrics, e.g., causality ones, which require a long time series in order to yield statistically significant results. Here, we present a methodology to account for intermittent causal coupling in neural activity, based on the identification of non-overlapping windows within the original time series in which the causality is strongest. The result is a less coarse-grained assessment of the time-varying properties of brain interactions, which can be used to create a high temporal resolution time-varying network. We apply the proposed methodology to the analysis of the brain activity of control subjects and alcoholic patients performing an image recognition task. Our results show that short-lived, intermittent, local-scale causality is better at discriminating both groups than global network metrics. These results highlight the importance of the transient nature of brain activity, at least under some pathological conditions.
Ngan, Chun-Kit
2013-01-01
Making decisions over multivariate time series is an important topic which has gained significant interest in the past decade. A time series is a sequence of data points which are measured and ordered over uniform time intervals. A multivariate time series is a set of multiple, related time series in a particular domain in which domain experts…
The Hierarchical Spectral Merger Algorithm: A New Time Series Clustering Procedure
Euán, Carolina; Ombao, Hernando; Ortega, Joaquín
2018-01-01
We present a new method for time series clustering which we call the Hierarchical Spectral Merger (HSM) method. This procedure is based on the spectral theory of time series and identifies series that share similar oscillations or waveforms
Aerosol Climate Time Series Evaluation In ESA Aerosol_cci
Popp, T.; de Leeuw, G.; Pinnock, S.
2015-12-01
Within the ESA Climate Change Initiative (CCI) Aerosol_cci (2010 - 2017) conducts intensive work to improve algorithms for the retrieval of aerosol information from European sensors. By the end of 2015 full mission time series of 2 GCOS-required aerosol parameters are completely validated and released: Aerosol Optical Depth (AOD) from dual view ATSR-2 / AATSR radiometers (3 algorithms, 1995 - 2012), and stratospheric extinction profiles from star occultation GOMOS spectrometer (2002 - 2012). Additionally, a 35-year multi-sensor time series of the qualitative Absorbing Aerosol Index (AAI) together with sensitivity information and an AAI model simulator is available. Complementary aerosol properties requested by GCOS are in a "round robin" phase, where various algorithms are inter-compared: fine mode AOD, mineral dust AOD (from the thermal IASI spectrometer), absorption information and aerosol layer height. As a quasi-reference for validation in few selected regions with sparse ground-based observations the multi-pixel GRASP algorithm for the POLDER instrument is used. Validation of first dataset versions (vs. AERONET, MAN) and inter-comparison to other satellite datasets (MODIS, MISR, SeaWIFS) proved the high quality of the available datasets comparable to other satellite retrievals and revealed needs for algorithm improvement (for example for higher AOD values) which were taken into account for a reprocessing. The datasets contain pixel level uncertainty estimates which are also validated. The paper will summarize and discuss the results of major reprocessing and validation conducted in 2015. The focus will be on the ATSR, GOMOS and IASI datasets. Pixel level uncertainties validation will be summarized and discussed including unknown components and their potential usefulness and limitations. Opportunities for time series extension with successor instruments of the Sentinel family will be described and the complementarity of the different satellite aerosol products
Characterizability of metabolic pathway systems from time series data.
Voit, Eberhard O
2013-12-01
Over the past decade, the biomathematical community has devoted substantial effort to the complicated challenge of estimating parameter values for biological systems models. An even more difficult issue is the characterization of functional forms for the processes that govern these systems. Most parameter estimation approaches tacitly assume that these forms are known or can be assumed with some validity. However, this assumption is not always true. The recently proposed method of Dynamic Flux Estimation (DFE) addresses this problem in a genuinely novel fashion for metabolic pathway systems. Specifically, DFE allows the characterization of fluxes within such systems through an analysis of metabolic time series data. Its main drawback is the fact that DFE can only directly be applied if the pathway system contains as many metabolites as unknown fluxes. This situation is unfortunately rare. To overcome this roadblock, earlier work in this field had proposed strategies for augmenting the set of unknown fluxes with independent kinetic information, which however is not always available. Employing Moore-Penrose pseudo-inverse methods of linear algebra, the present article discusses an approach for characterizing fluxes from metabolic time series data that is applicable even if the pathway system is underdetermined and contains more fluxes than metabolites. Intriguingly, this approach is independent of a specific modeling framework and unaffected by noise in the experimental time series data. The results reveal whether any fluxes may be characterized and, if so, which subset is characterizable. They also help with the identification of fluxes that, if they could be determined independently, would allow the application of DFE. Copyright © 2013 Elsevier Inc. All rights reserved.
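The pseudo-inverse step can be illustrated on a hypothetical two-metabolite, three-flux pathway (toy stoichiometry, not from the paper). The slopes `dxdt` stand in for derivatives estimated from metabolic time series data:

```python
import numpy as np

# Toy linear pathway: -> X1 -> X2 ->  with fluxes v1, v2, v3
#   dX1/dt = v1 - v2
#   dX2/dt = v2 - v3
N = np.array([[1.0, -1.0, 0.0],
              [0.0, 1.0, -1.0]])

dxdt = np.array([0.5, -0.2])      # slopes estimated from the time series

v = np.linalg.pinv(N) @ dxdt      # minimum-norm solution of N v = dxdt

print(np.allclose(N @ v, dxdt))   # → True
```

This system is underdetermined (three fluxes, two metabolites), so individual fluxes are not characterizable: any multiple of the null-space direction (1, 1, 1) can be added to `v` without changing the fit. The flux *differences* v1 - v2 and v2 - v3 are invariant across all solutions, which is the kind of "characterizable subset" the analysis in the paper identifies; pinning down one flux independently would then allow full DFE.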
JTSA: an open source framework for time series abstractions.
Sacchi, Lucia; Capozzi, Davide; Bellazzi, Riccardo; Larizza, Cristiana
2015-10-01
The evaluation of the clinical status of a patient is frequently based on the temporal evolution of some parameters, making the detection of temporal patterns a priority in data analysis. Temporal abstraction (TA) is a methodology widely used in medical reasoning for summarizing and abstracting longitudinal data. This paper describes JTSA (Java Time Series Abstractor), a framework including a library of algorithms for time series preprocessing and abstraction and an engine to execute a workflow for temporal data processing. The JTSA framework is grounded on a comprehensive ontology that models temporal data processing from both the data storage and the abstraction computation perspective. The JTSA framework is designed to allow users to build their own analysis workflows by combining different algorithms. Thanks to the modular structure of a workflow, simple to highly complex patterns can be detected. The JTSA framework has been developed in Java 1.7 and is distributed under GPL as a jar file. JTSA provides: a collection of algorithms to perform temporal abstraction and preprocessing of time series, a framework for defining and executing data analysis workflows based on these algorithms, and a GUI for workflow prototyping and testing. The whole JTSA project relies on a formal model of the data types and of the algorithms included in the library. This model is the basis for the design and implementation of the software application. Taking into account this formalized structure, the user can easily extend the JTSA framework by adding new algorithms. Results are shown in the context of the EU project MOSAIC, extracting relevant patterns from data related to the long-term monitoring of diabetic patients. The proof that JTSA is a versatile tool to be adapted to different needs is given by its possible uses, both as a standalone tool for data summarization and as a module to be embedded into other architectures to select specific phenotypes based on TAs in a large
Causes of unplanned interruption of radiotherapy
International Nuclear Information System (INIS)
Diegues, Sylvia Suelotto; Ciconelli, Rozana Mesquita; Segreto, Roberto Araujo
2008-01-01
Objective: To evaluate the occurrence and causes of unplanned interruption of radiotherapy. Materials and methods: Retrospective study developed in the Division of Radiotherapy of Hospital Alemao Oswaldo Cruz in Sao Paulo, SP, Brazil, with data collected from 560 dossiers of patients submitted to radiotherapy in the period between January 1, 2005 and December 31, 2005. Chi-squared and Student t tests were utilized in the data analysis, and p < 0.05 was considered as statistically significant. Results: Interruption of treatment was identified in 350 cases, corresponding to 62.5% of the patients. The reasons for treatment interruption were the following: preventive device maintenance (55%), patient's own private reasons (13%), adverse reactions to the treatment or to combined radiotherapy/chemotherapy (6%), clinical worsening (3%), two or more combined reasons (23%). The interruption time interval ranged between 1 and 24 days (mean 1.4 day). One-day interruption was mostly due to preventive device maintenance (84.4%); two-five-day interruption was due to combined reasons (48.28%). Conclusion: The most frequent cause of interruption was preventive device maintenance, with maximum two-day time interval. (author)
Time series analysis of brain regional volume by MR image
International Nuclear Information System (INIS)
Tanaka, Mika; Tarusawa, Ayaka; Nihei, Mitsuyo; Fukami, Tadanori; Yuasa, Tetsuya; Wu, Jin; Ishiwata, Kiichi; Ishii, Kenji
2010-01-01
The present study proposed a methodology for time series analysis of the volumes of the frontal, parietal, temporal and occipital lobes and the cerebellum, because volumetric reports following the course of an individual's aging have scarcely been presented. The subjects analyzed were brain images of 20 healthy adults (2 males and 18 females; average age, 69.0 years), for whom T1-weighted 3D SPGR (spoiled gradient recalled in the steady state) acquisitions with a GE SIGNA EXCITE HD 1.5T machine were conducted 4 times over a time series of 42-50 months. The image size was 256 x 256 x (86-124) voxels with a digitization level of 16 bits. As the template for the regions, the standard gray matter atlas (icbn452_atlas_probability_gray) and its labeled counterpart (icbn.Labels), provided by the UCLA Laboratory of Neuro Imaging, were used for individual standardization. Segmentation, normalization and coregistration were performed with the MR imaging software SPM8 (Statistical Parametric Mapping 8). Regional volumes were calculated as the ratio of their voxel counts to the whole-brain voxel count, in percent. It was found that the regional volumes decreased with aging in all the lobes examined and the cerebellum, at average rates of -0.11, -0.07, -0.04, -0.02, and -0.03 percent per year, respectively. The procedure for calculating the regional volumes, hitherto operated manually, can be conducted automatically for an individual brain using the standard atlases above. (T.T.)
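The study's core quantity, a regional volume expressed as a percentage of whole-brain volume with a linear trend fitted across scan dates, can be sketched as follows. The voxel counts and scan times below are synthetic, not the study's measurements.

```python
import numpy as np

# Sketch: regional volume as percent of whole brain, then a linear trend
# (percent-points per year) fitted across the scan time series.
# All numbers here are synthetic placeholders.

months = np.array([0.0, 14.0, 28.0, 42.0])            # scan times (months)
frontal_vox = np.array([31000.0, 30950.0, 30890.0, 30840.0])
whole_vox = np.array([100000.0, 100000.0, 100000.0, 100000.0])

pct = 100.0 * frontal_vox / whole_vox                 # volume as % of whole brain
years = months / 12.0
slope, intercept = np.polyfit(years, pct, 1)          # percent-points per year
print(round(slope, 3))                                # negative: atrophy with age
```

The study's reported figures (e.g. -0.11 %/year for the frontal lobe) are exactly this kind of fitted slope, computed per region after SPM8 segmentation.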
Artificial neural networks applied to forecasting time series.
Montaño Moreno, Juan J; Palmer Pol, Alfonso; Muñoz Gracia, Pilar
2011-04-01
This study offers a description and comparison of the main models of Artificial Neural Networks (ANN) which have proved to be useful in time series forecasting, as well as a standard procedure for the practical application of ANN to this type of task. The Multilayer Perceptron (MLP), Radial Basis Function (RBF), Generalized Regression Neural Network (GRNN), and Recurrent Neural Network (RNN) models are analyzed. With this aim in mind, we use a time series made up of 244 time points. A comparative study establishes that the error made by each of the four neural network models analyzed is less than 10%. In accordance with the interpretation criteria of this performance, it can be concluded that the neural network models show a close fit regarding their forecasting capacity. The model with the best performance is the RBF, followed by the RNN and MLP. The GRNN model has the worst performance. Finally, we analyze the advantages and limitations of ANN, the possible solutions to these limitations, and provide an orientation towards future research.
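The standard procedure described, training a network on sliding windows of the series to predict the next value, can be sketched with a tiny MLP in plain numpy. The architecture, window length, learning rate and the synthetic 244-point series are illustrative assumptions, not the paper's configurations.

```python
import numpy as np

# Minimal sketch of one-step-ahead forecasting: a one-hidden-layer MLP
# trained by gradient descent on sliding windows of the series.

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 8 * np.pi, 244))   # 244 points, as in the study

win = 8
X = np.array([series[i:i + win] for i in range(len(series) - win)])
y = series[win:]

W1 = rng.normal(0, 0.5, (win, 12)); b1 = np.zeros(12)
W2 = rng.normal(0, 0.5, (12, 1));   b2 = np.zeros(1)

def forward(X):
    H = np.tanh(X @ W1 + b1)
    return H, (H @ W2 + b2).ravel()

loss0 = np.mean((forward(X)[1] - y) ** 2)         # error before training
lr = 0.05
for _ in range(500):
    H, pred = forward(X)
    err = (pred - y)[:, None] / len(y)            # dMSE/dpred (up to factor 2)
    dH = (err @ W2.T) * (1 - H ** 2)              # backprop through tanh
    W2 -= lr * (H.T @ err); b2 -= lr * err.sum(0)
    W1 -= lr * (X.T @ dH);  b1 -= lr * dH.sum(0)

loss = np.mean((forward(X)[1] - y) ** 2)
print(loss0, loss)                                # training error decreases
```

The RBF, GRNN and RNN variants differ only in the mapping from window to prediction; the sliding-window framing stays the same.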
On the maximum-entropy/autoregressive modeling of time series
Chao, B. F.
1984-01-01
The autoregressive (AR) model of a random process is interpreted in the light of Prony's relation, which relates a complex conjugate pair of poles of the AR process in the z-plane (or the z domain) on the one hand to the complex frequency of one complex harmonic function in the time domain on the other. Thus the AR model of a time series is one that models the time series as a linear combination of complex harmonic functions, which include pure sinusoids and real exponentials as special cases. An AR model is completely determined by its z-domain pole configuration. The maximum-entropy/autoregressive (ME/AR) spectrum, defined on the unit circle of the z-plane (or the frequency domain), is nothing but a convenient, though ambiguous, visual representation. It is asserted that the position and shape of a spectral peak are determined by the corresponding complex frequency, and that the height of the spectral peak contains little information about the complex amplitude of the complex harmonic functions.
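The pole-frequency correspondence is easy to demonstrate: fit an AR(2) model to a sinusoid by least squares and read the oscillation frequency off the pole angle in the z-plane. The frequency and series length below are arbitrary choices.

```python
import numpy as np

# Fit an AR(2) model to a pure sinusoid; its complex conjugate pole pair
# sits on the unit circle at angle +/- w, the sinusoid's frequency.

w = 0.6                                    # true angular frequency (rad/sample)
n = np.arange(200)
x = np.cos(w * n)

# AR(2): x[t] = a1*x[t-1] + a2*x[t-2], solved by least squares
A = np.column_stack([x[1:-1], x[:-2]])
a1, a2 = np.linalg.lstsq(A, x[2:], rcond=None)[0]

# poles are the roots of z^2 - a1*z - a2
poles = np.roots([1.0, -a1, -a2])
freq = abs(np.angle(poles[0]))             # pole angle = harmonic frequency
print(round(freq, 4))                      # ≈ 0.6
```

For a pure sinusoid the poles lie exactly on the unit circle; a damped (real-exponential-times-sinusoid) component would pull them inside it, which is Prony's relation in miniature.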
Quantifying evolutionary dynamics from variant-frequency time series
Khatri, Bhavin S.
2016-09-01
From Kimura’s neutral theory of protein evolution to Hubbell’s neutral theory of biodiversity, quantifying the relative importance of neutrality versus selection has long been a basic question in evolutionary biology and ecology. With deep sequencing technologies, this question is taking on a new form: given a time series of the frequency of different variants in a population, what is the likelihood that the observation has arisen due to selection or neutrality? To tackle the 2-variant case, we exploit Fisher’s angular transformation, which, despite being discovered by Ronald Fisher a century ago, has remained an intellectual curiosity. We show that, combined with a heuristic approach, it provides a simple solution for the transition probability density at short times, including drift, selection and mutation. Our results show that under strong selection and sufficiently frequent sampling these evolutionary parameters can be accurately determined from simulation data, and so they provide a theoretical basis for techniques to detect selection from variant or polymorphism frequency time series.
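The appeal of Fisher's angular transformation is that it stabilizes the variance of drift: with phi = 2·arcsin(sqrt(p)), the per-generation variance of phi is approximately 1/(2N) regardless of the frequency p. A quick simulation check (population size and replicate counts are arbitrary):

```python
import numpy as np

# Check that phi = 2*arcsin(sqrt(p)) is variance-stabilizing for one
# generation of Wright-Fisher binomial drift: Var(dphi) ≈ 1/(2N) for any p.

rng = np.random.default_rng(1)
N = 500                                    # diploid population size (2N gametes)
reps = 200000

ratios = []
for p in (0.1, 0.5, 0.9):
    p_next = rng.binomial(2 * N, p, reps) / (2 * N)   # one generation of drift
    dphi = 2 * np.arcsin(np.sqrt(p_next)) - 2 * np.arcsin(np.sqrt(p))
    ratios.append(np.var(dphi) * 2 * N)               # ≈ 1.0 for every p
print([round(r, 2) for r in ratios])
```

On the raw frequency scale the drift variance p(1-p)/(2N) depends strongly on p; on the angular scale it is flat, which is what makes the short-time transition density tractable.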
Exploratory joint and separate tracking of geographically related time series
Balasingam, Balakumar; Willett, Peter; Levchuk, Georgiy; Freeman, Jared
2012-05-01
Target tracking techniques have usually been applied to physical systems via radar, sonar or imaging modalities. But the same techniques - filtering, association, classification, track management - can be applied to nontraditional data such as one might find in other fields such as economics, business and national defense. In this paper we explore a particular data set. The measurements are time series collected at various sites; but other than that little is known about them. We shall refer to the data as representing the Megawatt hour (MWH) output of various power plants located in Afghanistan. We pose such questions as: 1. Which power plants seem to share a common model? 2. Do any power plants change their models over time? 3. Can power plant behavior be predicted, and if so, how far into the future? 4. Are some of the power plants stochastically linked? That is, does an observed lack of power demand at one power plant imply a surfeit of demand elsewhere? The observations seem well modeled as hidden Markov. This HMM modeling is compared to other approaches, and the tests are extended to other (albeit self-generated) data sets with similar characteristics. Keywords: Time-series analysis, hidden Markov models, statistical similarity, weighted clustering
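The HMM framing above reduces to standard machinery: given transition and emission matrices, Viterbi decoding recovers the most likely hidden "mode" sequence from observed output levels. The two-state matrices and observation coding below are invented to illustrate the idea, not the paper's estimates.

```python
import numpy as np

# Viterbi decoding for a 2-state HMM over discrete observations.
# States: 0 = low-demand mode, 1 = high-demand mode (hypothetical);
# observations: 0 = low output, 1 = high output.

T = np.array([[0.9, 0.1],          # transition probabilities (sticky modes)
              [0.1, 0.9]])
E = np.array([[0.8, 0.2],          # P(obs | state)
              [0.2, 0.8]])
pi = np.array([0.5, 0.5])

def viterbi(obs):
    logd = np.log(pi) + np.log(E[:, obs[0]])
    back = []
    for o in obs[1:]:
        scores = logd[:, None] + np.log(T)     # scores[i, j]: best path i -> j
        back.append(scores.argmax(axis=0))
        logd = scores.max(axis=0) + np.log(E[:, o])
    path = [int(logd.argmax())]
    for b in reversed(back):
        path.append(int(b[path[-1]]))
    return path[::-1]

states = viterbi([0, 0, 0, 1, 1, 1])
print(states)   # [0, 0, 0, 1, 1, 1]: a single mode switch is inferred
```

Questions 1 and 4 in the abstract then become comparisons between the fitted HMMs of different plants (shared parameters, coupled state sequences).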
GPS time series at Campi Flegrei caldera (2000-2013)
Directory of Open Access Journals (Sweden)
Prospero De Martino
2014-05-01
The Campi Flegrei caldera is an active volcanic system associated with high volcanic risk, and represents a well known and peculiar example of ground deformation (bradyseism), characterized by intense uplift periods followed by subsidence phases with episodic superimposed mini-uplifts. Ground deformation is an important volcanic precursor, and its continuous monitoring is one of the main tools for short-term forecasting of eruptive activity. This paper provides an overview of the continuous GPS monitoring of the Campi Flegrei caldera from January 2000 to July 2013, including network operations, data recording and processing, and data products. In this period the GPS time series allowed continuous and accurate tracking of ground deformation of the area. Seven main uplift episodes were detected; during each uplift period the recurrent horizontal displacement pattern, radial from the “caldera center”, suggests that no significant change in deformation source geometry and location occurred. The complete archive of GPS time series at the Campi Flegrei area is reported in the Supplementary materials. These data can be useful for the scientific community in improving research on Campi Flegrei caldera dynamics and hazard assessment.
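One simple way to flag uplift episodes in a vertical GPS series, not necessarily the authors' procedure, is to threshold a smoothed displacement rate. The synthetic series, window and threshold below are all assumptions for illustration.

```python
import numpy as np

# Sketch: flag uplift episodes as runs where the smoothed weekly uplift
# rate exceeds a threshold. Series, noise level and threshold are synthetic.

rng = np.random.default_rng(2)
weeks = 200
up = np.zeros(weeks)
up[80:110] = 0.5                                  # injected uplift: +0.5 cm/week
height = np.cumsum(up + rng.normal(0, 0.05, weeks))  # vertical component (cm)

rate = np.convolve(np.diff(height), np.ones(5) / 5, mode="same")  # cm/week
uplifting = rate > 0.25                           # episode flag per week
print(int(uplifting[90]), int(uplifting[20]))     # inside vs. outside episode
```

Real processing would first remove plate motion and seasonal terms from the coordinate series before any episode detection.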
Estimation of dynamic flux profiles from metabolic time series data
Directory of Open Access Journals (Sweden)
Chou I-Chun
2012-07-01
Background: Advances in modern high-throughput techniques of molecular biology have enabled top-down approaches for the estimation of parameter values in metabolic systems, based on time series data. Notable among them is the recent method of dynamic flux estimation (DFE), which uses such data not only for parameter estimation but also for the identification of functional forms of the processes governing a metabolic system. DFE furthermore provides diagnostic tools for the evaluation of model validity and of the quality of a model fit beyond residual errors. Unfortunately, DFE works only when the data are more or less complete and the system contains as many independent fluxes as metabolites. These drawbacks may be ameliorated with other types of estimation and information. However, such supplementations incur their own limitations. In particular, assumptions must be made regarding the functional forms of some processes, and detailed kinetic information must be available in addition to the time series data. Results: The authors propose here a systematic approach that supplements DFE and overcomes some of its shortcomings. Like DFE, the approach is model-free and requires only minimal assumptions. If sufficient time series data are available, the approach allows the determination of a subset of fluxes that enables the subsequent applicability of DFE to the rest of the flux system. The authors demonstrate the procedure with three artificial pathway systems exhibiting distinct characteristics and with actual data of the trehalose pathway in Saccharomyces cerevisiae. Conclusions: The results demonstrate that the proposed method successfully complements DFE under various situations and without a priori assumptions regarding the model representation. The proposed method also permits an examination of whether at all, to what degree, or within what range the available time series data can be validly represented in a particular functional format.
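The core DFE step is model-free: estimate the slopes dX/dt from the metabolite time series, then solve the stoichiometric system S·v = dX/dt for the fluxes at each time point. The toy two-metabolite pathway (-> X1 -> X2 ->) and its closed-form "data" below are invented for illustration.

```python
import numpy as np

# Sketch of dynamic flux estimation on a toy linear pathway:
#   influx v_in -> X1 -(v1)-> X2 -(v2)->
# Synthetic "data" satisfy X1' = v_in - 2*X1 and X2' = 2*X1 - X2.

t = np.linspace(0, 5, 51)
v_in = 1.0
X1 = 0.5 * (1 - np.exp(-2 * t))
X2 = 1.0 - 2 * np.exp(-t) + np.exp(-2 * t)

S = np.array([[1.0, -1.0,  0.0],       # X1 balance: +v_in - v1
              [0.0,  1.0, -1.0]])      # X2 balance: +v1 - v2
dX = np.vstack([np.gradient(X1, t), np.gradient(X2, t)])  # numerical slopes

# v_in is known; solve S[:,1:] @ [v1; v2] = dX - S[:,0]*v_in pointwise
v = np.linalg.solve(S[:, 1:], dX - S[:, :1] * v_in)
print(v[:, 25])                        # estimated fluxes v1, v2 at t = 2.5
```

With the fluxes in hand, DFE's second stage plots each flux against its substrate concentrations to reveal the functional form, without ever assuming one.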
Chaotic time series analysis in economics: Balance and perspectives
International Nuclear Information System (INIS)
Faggini, Marisa
2014-01-01
The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, now appears not to be such an interesting and promising research area.
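A representative chaos-detection diagnostic, mentioned in virtually every survey of this literature, is the largest Lyapunov exponent. For the logistic map x -> r·x·(1-x) it can be computed directly from the derivative along the orbit; at r = 4 the analytical value is ln 2. The seed and iteration count below are arbitrary.

```python
import math

# Largest Lyapunov exponent of the logistic map at r = 4, computed as the
# orbit average of ln|f'(x)| with f'(x) = r*(1 - 2x). Analytically ln 2.

r, x = 4.0, 0.3
total, n = 0.0, 100000
for _ in range(n):
    total += math.log(abs(r * (1 - 2 * x)))   # local stretching rate
    x = r * x * (1 - x)                       # iterate the map
lyap = total / n
print(round(lyap, 3))    # positive exponent: sensitive dependence, i.e. chaos
```

For empirical economic series one cannot differentiate an unknown map, so estimators such as Rosenstein's nearest-neighbour divergence method play the role of |f'(x)| here.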
Ensemble Deep Learning for Biomedical Time Series Classification
Directory of Open Access Journals (Sweden)
Lin-peng Jin
2016-01-01
Ensemble learning has been proven, in both theory and practice, to improve generalization ability effectively. In this paper, we first briefly outline the current status of research on it. Then, a new deep neural network-based ensemble method that integrates filtering views, local views, distorted views, explicit training, implicit training, subview prediction, and Simple Average is proposed for biomedical time series classification. Finally, we validate its effectiveness on the Chinese Cardiovascular Disease Database, which contains a large number of electrocardiogram recordings. The experimental results show that the proposed method has certain advantages compared to well-known ensemble methods such as Bagging and AdaBoost.
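The "Simple Average" combiner named above just averages the members' probability outputs before thresholding. The sketch below uses stand-in members (noisy copies of a true score rather than trained networks) to show why the averaged ensemble classifies more accurately than a typical member; all numbers are invented.

```python
import numpy as np

# Simple Average ensembling: average member probabilities, then threshold.
# Members are simulated as noisy estimates of the true class probability.

rng = np.random.default_rng(3)
n = 5000
true_prob = rng.uniform(0, 1, n)                   # ground-truth P(class = 1)
labels = (rng.uniform(0, 1, n) < true_prob).astype(int)

# 7 "members": independently perturbed versions of the true probability
members = [np.clip(true_prob + rng.normal(0, 0.3, n), 0, 1) for _ in range(7)]
ensemble = np.mean(members, axis=0)                # Simple Average combiner

acc = lambda p: np.mean((p > 0.5) == labels)
member_acc = np.mean([acc(m) for m in members])
print(round(member_acc, 3), round(acc(ensemble), 3))  # ensemble is higher
```

Averaging shrinks the independent noise by roughly 1/sqrt(7), pulling the combined score closer to the Bayes-optimal decision; Bagging and AdaBoost differ in how the members are trained, not in this combination step.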