WorldWideScience

Sample records for interrupted time series

  1. Clinical and epidemiological round: Interrupted time series

    Directory of Open Access Journals (Sweden)

    León-Álvarez, Alba Luz

    2017-07-01

    Full Text Available In quasi-experimental research, interrupted time series analysis is commonly used to measure the effect of an intervention from a specific time point. This technique integrates longitudinal data and allows researchers to identify detailed trends before and after the intervention. It is considered an important tool for understanding patterns of change after any event, is applicable in different disciplines, and has great potential for drawing conclusions in research with long follow-up periods that require objective evaluation of interventions.
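
    To make the basic single-group design concrete, the sketch below (not part of the record above; simulated data, Python with statsmodels assumed) fits the standard segmented regression: a pre-intervention trend, a level-change dummy and a post-intervention trend term, with Newey-West standard errors to allow for autocorrelation.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      n_pre, n_post = 24, 24                               # monthly points before/after
      t = np.arange(n_pre + n_post)
      level = (t >= n_pre).astype(int)                     # 1 from the first post-intervention month
      trend = np.where(level == 1, t - n_pre + 1, 0)       # months elapsed since the intervention

      # Illustrative outcome: baseline drift plus an immediate drop and a slope change.
      y = 50 + 0.3 * t - 5 * level - 0.4 * trend + rng.normal(0, 2, t.size)
      df = pd.DataFrame({"y": y, "time": t, "level": level, "trend": trend})

      # 'level' estimates the immediate change, 'trend' the change in slope;
      # HAC (Newey-West) covariance guards against serial correlation.
      fit = smf.ols("y ~ time + level + trend", data=df).fit(
          cov_type="HAC", cov_kwds={"maxlags": 3})
      print(fit.summary())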

  2. Innovating patient care delivery: DSRIP's interrupted time series analysis paradigm.

    Science.gov (United States)

    Shenoy, Amrita G; Begley, Charles E; Revere, Lee; Linder, Stephen H; Daiger, Stephen P

    2017-12-07

    Adoption of a Medicaid Section 1115 waiver is one of many ways of innovating the healthcare delivery system. The Delivery System Reform Incentive Payment (DSRIP) pool, one of the two funding pools of the waiver, has four categories: infrastructure development, program innovation and redesign, quality improvement reporting and, lastly, population health improvement. A metric of the fourth category, the preventable hospitalization (PH) rate, was analyzed for eight conditions across two time periods, pre-reporting years (2010-2012) and post-reporting years (2013-2015), for two hospital cohorts, DSRIP-participating and non-participating hospitals. The study explains how DSRIP affected PH rates for eight conditions in both hospital cohorts across the two time periods. The eight PH rates were regressed as the dependent variable with time, intervention and post-DSRIP intervention as independent variables. The PH rates for the eight conditions were then consolidated into one rate and regressed on the same independent variables to evaluate the overall impact of DSRIP. An interrupted time series regression was performed after accounting for auto-correlation, stationarity and seasonality in the dataset. In the individual regression models, PH rates showed statistically significant coefficients for seven of eight conditions in DSRIP-participating hospitals. In the combined regression model, the PH rate showed a statistically significant decrease, with negative regression coefficients in DSRIP-participating hospitals compared with positive (increased) coefficients in non-participating hospitals. Several macro- and micro-level factors likely contributed to DSRIP-participating hospitals outperforming non-participating hospitals, including healthcare organization/provider collaboration, support from healthcare professionals, DSRIP's design, state reimbursement and coordination in care delivery methods.

  3. The Impact of the Hotel Room Tax: An Interrupted Time Series Approach

    OpenAIRE

    Bonham, Carl; Fujii, Edwin; Im, Eric; Mak, James

    1992-01-01

    Employs interrupted time series analysis to estimate ex post the impact of a hotel room tax on real net hotel revenues by analyzing that time series before and after the imposition of the tax. Finds that the tax had a negligible effect on real hotel revenues.

  4. Using machine learning to identify structural breaks in single-group interrupted time series designs.

    Science.gov (United States)

    Linden, Ariel; Yarnold, Paul R

    2016-12-01

    Single-group interrupted time series analysis (ITSA) is a popular evaluation methodology in which a single unit of observation is studied, the outcome variable is serially ordered as a time series, and the intervention is expected to 'interrupt' the level and/or trend of the time series subsequent to its introduction. Given that the internal validity of the design rests on the premise that the interruption in the time series is associated with the introduction of the treatment, treatment effects may seem less plausible if a parallel trend already exists in the time series prior to the actual intervention. Thus, sensitivity analyses should focus on detecting structural breaks in the time series before the intervention. In this paper, we introduce a machine-learning algorithm called optimal discriminant analysis (ODA) as an approach to determine if structural breaks can be identified in years prior to the initiation of the intervention, using data from California's 1988 voter-initiated Proposition 99 to reduce smoking rates. The ODA analysis indicates that numerous structural breaks occurred prior to the actual initiation of Proposition 99 in 1989, including perfect structural breaks in 1983 and 1985, thereby casting doubt on the validity of treatment effects estimated for the actual intervention when using a single-group ITSA design. Given the widespread use of ITSA for evaluating observational data and the increasing use of machine-learning techniques in traditional research, we recommend that structural break sensitivity analysis be routinely incorporated in all research using the single-group ITSA design. © 2016 John Wiley & Sons, Ltd.
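
    The record above uses optimal discriminant analysis (ODA), which is not reproduced here. As a rough analog of the same falsification idea, the sketch below (simulated pre-intervention data; statsmodels assumed) scans candidate break years with segmented OLS and reports the best-fitting candidate by AIC; a strong break well before the policy year would, as the authors argue, weaken a causal reading of the post-policy change.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(1)
      years = np.arange(1970, 1989)                        # pre-Proposition 99 years (illustrative)
      sales = 120 - 1.5 * np.arange(years.size) + rng.normal(0, 3, years.size)
      df = pd.DataFrame({"year": years, "t": np.arange(years.size), "y": sales})

      results = []
      for brk in years[3:-3]:                              # keep a few observations on each side
          post = (df["year"] >= brk).astype(int)
          t_break = int(df.loc[post.idxmax(), "t"])
          d = df.assign(level=post, trend=post * (df["t"] - t_break))
          results.append((brk, smf.ols("y ~ t + level + trend", data=d).fit().aic))

      best_year, best_aic = min(results, key=lambda r: r[1])
      print(f"best-fitting candidate break year: {best_year} (AIC = {best_aic:.1f})")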

  5. A knowledge translation tool improved osteoporosis disease management in primary care: an interrupted time series analysis.

    Science.gov (United States)

    Kastner, Monika; Sawka, Anna M; Hamid, Jemila; Chen, Maggie; Thorpe, Kevin; Chignell, Mark; Ewusie, Joycelyne; Marquez, Christine; Newton, David; Straus, Sharon E

    2014-09-25

    Osteoporosis affects over 200 million people worldwide at a high cost to healthcare systems, yet gaps in management still exist. In response, we developed a multi-component osteoporosis knowledge translation (Op-KT) tool involving a patient-initiated risk assessment questionnaire (RAQ), which generates individualized best practice recommendations for physicians and customized education for patients at the point of care. The objective of this study was to evaluate the effectiveness of the Op-KT tool for appropriate disease management by physicians. The Op-KT tool was evaluated using an interrupted time series design. This involved multiple assessments of the outcomes 12 months before (baseline) and 12 months after tool implementation (52 data points in total). Inclusion criteria were family physicians and their patients at risk for osteoporosis (women aged ≥ 50 years, men aged ≥ 65 years). Primary outcomes were the initiation of appropriate osteoporosis screening and treatment. Analyses included segmented linear regression modeling and analysis of variance. The Op-KT tool was implemented in three family practices in Ontario, Canada representing 5 family physicians with 2840 age eligible patients (mean age 67 years; 76% women). Time series regression models showed an overall increase from baseline in the initiation of screening (3.4%; P management addressed by their physician. Study limitations included the inherent susceptibility of our design compared with a randomized trial. The multicomponent Op-KT tool significantly increased osteoporosis investigations in three family practices, and highlights its potential to facilitate patient self-management. Next steps include wider implementation and evaluation of the tool in primary care.

  6. A combined teamwork training and work standardisation intervention in operating theatres: controlled interrupted time series study.

    Science.gov (United States)

    Morgan, Lauren; Pickering, Sharon P; Hadi, Mohammed; Robertson, Eleanor; New, Steve; Griffin, Damian; Collins, Gary; Rivero-Arias, Oliver; Catchpole, Ken; McCulloch, Peter

    2015-02-01

    Teamwork training and system standardisation have both been proposed to reduce error and harm in surgery. Since the approaches differ markedly, there is potential for synergy between them. Controlled interrupted time series with a 3 month intervention and observation phases before and after. Operating theatres conducting elective orthopaedic surgery in a single hospital system (UK Hospital Trust). Teamwork training based on crew resource management plus training and follow-up support in developing standardised operating procedures. Focus of subsequent standardisation efforts decided by theatre staff. Paired observers watched whole procedures together. We assessed non-technical skills using NOTECHS II, technical performance using glitch rate and compliance with WHO checklist using a simple quality tool. We measured complication and readmission rates and hospital stay using hospital administrative records. Before/after change was compared in the active and control groups using two-way ANOVA and regression models. 1121 patients were operated on before and 1100 after intervention. 44 operations were observed before and 50 afterwards. Non-technical skills (p=0.002) and WHO compliance (pteamwork and system improvement causes marked improvements in team behaviour and WHO performance, but not technical performance or outcome. These findings are consistent with the synergistic hypothesis, but larger controlled studies with a strong implementation strategy are required to test potential outcome effects. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  7. Reduction in male suicide mortality following the 2006 Russian alcohol policy: an interrupted time series analysis.

    Science.gov (United States)

    Pridemore, William Alex; Chamlin, Mitchell B; Andreev, Evgeny

    2013-11-01

    We took advantage of a natural experiment to assess the impact on suicide mortality of a suite of Russian alcohol policies. We obtained suicide counts from anonymous death records collected by the Russian Federal State Statistics Service. We used autoregressive integrated moving average (ARIMA) interrupted time series techniques to model the effect of the alcohol policy (implemented in January 2006) on monthly male and female suicide counts between January 2000 and December 2010. Monthly male and female suicide counts decreased during the period under study. Although the ARIMA analysis showed no impact of the policy on female suicide mortality, the results revealed an immediate and permanent reduction of about 9% in male suicides (Ln ω0 = -0.096; P = .01). Despite a recent decrease in mortality, rates of alcohol consumption and suicide in Russia remain among the highest in the world. Our analysis revealed that the 2006 alcohol policy in Russia led to a 9% reduction in male suicide mortality, meaning the policy was responsible for saving 4000 male lives annually that would otherwise have been lost to suicide. Together with recent similar findings elsewhere, our results suggest an important role for public health and other population level interventions, including alcohol policy, in reducing alcohol-related harm.
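
    A minimal sketch of the kind of ARIMA intervention model described above, with assumed (simulated) monthly counts rather than the study's mortality records: the log of the monthly male suicide count is modelled with SARIMAX and a permanent step dummy for January 2006. The step coefficient plays the role of ln(ω0), so exp(-0.096) - 1 ≈ -0.09 corresponds to the roughly 9% reduction reported.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      idx = pd.date_range("2000-01-01", periods=132, freq="MS")    # Jan 2000 - Dec 2010
      step = pd.Series((idx >= "2006-01-01").astype(int), index=idx, name="policy_step")

      # Illustrative series: slow downward drift plus a ~10% drop at the policy date.
      log_counts = pd.Series(
          7.5 - 0.002 * np.arange(idx.size) - 0.10 * step.to_numpy()
          + rng.normal(0, 0.05, idx.size), index=idx)

      model = sm.tsa.statespace.SARIMAX(log_counts, exog=step, order=(1, 0, 0),
                                        seasonal_order=(1, 0, 0, 12), trend="ct")
      fit = model.fit(disp=False)
      print(fit.params["policy_step"])      # exp(coef) - 1 ~ proportional change in counts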

  8. The effect of the Swedish bicycle helmet law for children: an interrupted time series study.

    Science.gov (United States)

    Bonander, Carl; Nilson, Finn; Andersson, Ragnar

    2014-12-01

    Previous population-based research has shown that bicycle helmet laws can reduce head injury rates among cyclists. According to deterrence theory, such laws are mainly effective if there is a high likelihood of being apprehended. In this study, we investigated the effect of the Swedish helmet law for children under the age of 15, a population that cannot be fined. An interrupted time series design was used. Monthly inpatient data on injured cyclists from 1998-2012, stratified by age (0-14, 15+), sex, and injury diagnosis, was obtained from the National Patient Register. The main outcome measure was the proportion of head injury admissions per month. Intervention effect estimates were obtained using generalized autoregressive moving average (GARMA) models. Pre-legislation trend and seasonality was adjusted for, and differences-in-differences estimation was obtained using adults as a non-equivalent control group. There was a statistically significant intervention effect among male children, where the proportion of head injuries dropped by 7.8 percentage points. There was no evidence of an intervention effect on the proportion of head injuries among female children. According to hospital admission data, the bicycle helmet law appears to have had an effect only on male children. This study, while quasi-experimental and thus not strictly generalizable, can contribute to increased knowledge regarding the effects of bicycle helmet laws. Copyright © 2014 National Safety Council and Elsevier Ltd. All rights reserved.
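
    GARMA models are not available off the shelf in statsmodels, so the sketch below is only a simplified analog of the analysis described above: the monthly proportion of head injuries is modelled with a binomial GLM and a group-by-period interaction (children versus adult controls, before versus after an assumed 2005 law date), ignoring the seasonal and autoregressive structure the authors adjusted for. Data and the law date are illustrative.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(3)
      months = pd.date_range("1998-01-01", "2012-12-01", freq="MS")
      post = (months >= "2005-01-01").astype(int)          # assumed law date, for illustration
      frames = []
      for group, drop in [("child", 0.078), ("adult", 0.0)]:
          p = 0.30 - drop * post                           # proportion of admissions that are head injuries
          n = rng.integers(80, 160, months.size)           # injured cyclists admitted per month
          heads = rng.binomial(n, p)
          frames.append(pd.DataFrame({"group": group, "post": post,
                                      "heads": heads, "fails": n - heads}))
      df = pd.concat(frames, ignore_index=True)

      # 'heads + fails' gives the binomial numerator/denominator;
      # the group:post interaction is the difference-in-differences estimate.
      fit = smf.glm("heads + fails ~ group * post", data=df,
                    family=sm.families.Binomial()).fit()
      print(fit.summary())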

  9. Evaluating the impact of flexible alcohol trading hours on violence: an interrupted time series analysis.

    Directory of Open Access Journals (Sweden)

    David K Humphreys

    Full Text Available On November 24th 2005, the Government of England and Wales removed regulatory restrictions on the times at which licensed premises could sell alcohol. This study tests availability theory by treating the implementation of the Licensing Act (2003) as a natural experiment in alcohol policy. An interrupted time series design was employed to estimate the Act's immediate and delayed impact on violence in the City of Manchester (population 464,200). We collected police-recorded rates of violence, robbery, and total crime between the 1st of February 2004 and the 31st of December 2007. Events were aggregated by week, yielding a total of 204 observations (95 pre- and 109 post-intervention). Secondary analysis examined changes in daily patterns of violence. Pre- and post-intervention events were separated into four three-hour segments: 18:00-20:59, 21:00-23:59, 00:00-02:59 and 03:00-05:59. Analysis found no evidence that the Licensing Act (2003) affected the overall volume of violence. However, analyses of night-time violence found a gradual and permanent shift of weekend violence into later parts of the night. The results estimated an initial increase of 27.5% between 03:00 and 06:00 (ω = 0.2433, 95% CI = 0.06, 0.42), which increased to 36% by the end of the study period (δ = -0.897, 95% CI = -1.02, -0.77). This study found no evidence that a national policy increasing the physical availability of alcohol affected the overall volume of violence. There was, however, evidence suggesting that the policy may be associated with changes to patterns of violence in the early morning (3 a.m. to 6 a.m.).

  10. Methods, applications, interpretations and challenges of interrupted time series (ITS) data: protocol for a scoping review.

    Science.gov (United States)

    Ewusie, Joycelyne E; Blondal, Erik; Soobiah, Charlene; Beyene, Joseph; Thabane, Lehana; Straus, Sharon E; Hamid, Jemila S

    2017-07-02

    Interrupted time series (ITS) design involves collecting data across multiple time points before and after the implementation of an intervention to assess the effect of the intervention on an outcome. ITS designs have become increasingly common and are frequently used to assess the impact of evidence implementation interventions. Several statistical methods are currently available for analysing data from ITS designs; however, there is a lack of guidance on which methods are optimal for different data types and on their implications for interpreting results. Our objective is to conduct a scoping review of existing methods for analysing ITS data, to summarise their characteristics and properties, and to examine how the results are reported. We also aim to identify gaps and methodological deficiencies. We will search electronic databases from inception until August 2016 (eg, MEDLINE and JSTOR). Two reviewers will independently screen titles, abstracts and full-text articles and complete the data abstraction. The anticipated outcome will be a summarised description of all the methods that have been used in analysing ITS data in health research, how those methods were applied, their strengths and limitations, and the transparency of interpretation/reporting of the results. We will provide summary tables of the characteristics of the included studies. We will also describe the similarities and differences of the various methods. Ethical approval is not required for this study since we are only considering the methods used in the analysis and no identifiable patient data will be used. Results will be disseminated through open access peer-reviewed publications. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  11. Impact of bariatric surgery on clinical depression. Interrupted time series study with matched controls.

    Science.gov (United States)

    Booth, Helen; Khan, Omar; Prevost, A Toby; Reddy, Marcus; Charlton, Judith; Gulliford, Martin C

    2015-03-15

    Obesity is associated with depression. This study aimed to evaluate whether clinical depression is reduced after bariatric surgery (BS). Obese adults who received BS procedures from 2002 to 2014 were sampled from the UK Clinical Practice Research Datalink. An interrupted time series design, with matched controls, was conducted from three years before to a maximum of seven years after surgery. Controls were matched for body mass index (BMI), age, gender and year of procedure. Clinical depression was defined as a medical diagnosis recorded in a given year, or an antidepressant prescribed in a given year to a participant ever diagnosed with depression. Adjusted odds ratios (AOR) were estimated. There were 3045 participants (mean age 45.9; mean BMI 44.0 kg/m²) who received BS, including laparoscopic gastric banding in 1297 (43%), gastric bypass in 1265 (42%), sleeve gastrectomy in 477 (16%) and six undefined. Before surgery, 36% of BS participants, and 21% of controls, had clinical depression; between-group AOR 2.02, 95% CI 1.75-2.33, P<0.001. In the second post-operative year 32% had depression; AOR, compared to time without surgery, 0.83 (0.76-0.90, P<0.001). By the seventh year, the prevalence of depression had increased to 37%; AOR 0.99 (0.76-1.29, P=0.959). Despite matching, there were differences in depression between BS and control patients, reflecting the highly selective nature of BS. Depression is frequent among individuals selected to undergo bariatric surgery. Bariatric surgery may be associated with a modest reduction in clinical depression over the initial post-operative years, but this is not maintained. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.

  12. Dollarization and economic development in Zimbabwe: An interrupted time-series analysis

    Directory of Open Access Journals (Sweden)

    Raphael Tabani Mpofu

    2015-10-01

    Full Text Available This paper examines the impact of dollarization on the performance of the Zimbabwean economy from 2003 to 2014 using an interrupted time-series analysis. In Zimbabwe's case, dollarization was the official replacement of the Zimbabwean dollar with the U.S. dollar. Rapid dollarization in the economy was accelerated by the exogenous shock caused by the injection of cash dollars into the Zimbabwean economy, mostly from international transfers. Since the official adoption of dollarization, Zimbabwe has been largely a cash-based economy, with a huge amount of U.S. dollars in circulation outside the banking system. A hands-off approach to currency management has served Zimbabwe well since 2009, but a number of risks are beginning to emerge as the economy has slowly regenerated itself and the need for large capital injections has increased. Macroeconomic data obtained from the World Bank and from the Reserve Bank of Zimbabwe's Monthly Economic Review are analysed. According to the tests conducted, dollarization did introduce some macroeconomic stability in Zimbabwe, although only a few key macroeconomic variables showed a sustained improvement. Statistical analysis shows that increased dollarization helped reverse the spiralling effects of hyperinflation that were prevalent prior to 2009, although inflationary pressures still continued, albeit at a slower pace. This research has implications not just for Zimbabwean policy makers as they grapple with decisions pertaining to the re-adoption of a local currency, the continued use of the US dollar, or the adoption of a regional currency such as the South African rand. The African Union and, specifically, the Southern African Development Community should look at these policy issues very closely in order to provide policy direction to their member states.

  13. Improving PICC use and outcomes in hospitalised patients: an interrupted time series study using MAGIC criteria.

    Science.gov (United States)

    Swaminathan, Lakshmi; Flanders, Scott; Rogers, Mary; Calleja, Yvonne; Snyder, Ashley; Thyagarajan, Rama; Bercea, Priscila; Chopra, Vineet

    2018-04-01

    Although important in clinical care, reports of inappropriate peripherally inserted central catheter (PICC) use are growing. To test whether implementation of the Michigan Appropriateness Guide for Intravenous Catheters (MAGIC) can improve PICC use and patient outcomes. Quasi-experimental, interrupted time series design at one study site with nine contemporaneous external controls. Ten hospitals participating in a state-wide quality collaborative from 1 August 2014 to 31 July 2016. 963 hospitalised patients who received a PICC at the study site vs 6613 patients at nine control sites. A multimodal intervention (tool, training, electronic changes, education) derived from MAGIC. Appropriateness of PICC use and rates of PICC-associated complications. Segmented Poisson regression was used for analyses. Absolute rates of inappropriate PICC use decreased substantially at the study site versus controls (91.3% to 65.3% (-26.0%) vs 72.2% to 69.6% (-2.6%); PPICC use occurred at the study site (incidence rate ratio 0.86 (95% CI 0.74 to 0.99; P=0.048)); no change was observed at control sites. While the incidence of all PICC complications decreased to a greater extent at the study site, the absolute difference between controls and intervention was small (33.9% to 26.7% (-7.2%) vs 22.4% to 20.8% (-1.6%); P=0.036). Non-randomised design limits inference; the most effective component of the multimodal intervention is unknown; effects following implementation were modest. In a multihospital quality improvement project, implementation of MAGIC improved PICC appropriateness and reduced complications to a modest extent. Given the size and resources required for this study, future work should consider cost-to-benefit ratio of similar approaches. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
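
    The record above reports segmented Poisson regression; the sketch below shows that model on assumed (simulated) monthly data rather than the study's registry: complication counts with the number of PICCs placed as an exposure offset, plus level-change and post-intervention trend terms, with robust standard errors.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(4)
      month = np.arange(24)
      level = (month >= 12).astype(int)                    # intervention assumed at month 12
      trend = np.where(level == 1, month - 11, 0)
      piccs = rng.integers(30, 60, month.size)             # PICCs placed per month (exposure)
      rate = np.exp(-1.0 + 0.01 * month - 0.15 * level - 0.02 * trend)
      y = rng.poisson(rate * piccs)                        # complications per month

      df = pd.DataFrame({"y": y, "month": month, "level": level, "trend": trend,
                         "piccs": piccs})
      fit = smf.glm("y ~ month + level + trend", data=df, offset=np.log(df["piccs"]),
                    family=sm.families.Poisson()).fit(cov_type="HC0")
      print(np.exp(fit.params))        # incidence rate ratios, cf. the IRR of 0.86 above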

  14. Interrupted time-series analysis of regulations to reduce paracetamol (acetaminophen) poisoning.

    Directory of Open Access Journals (Sweden)

    Oliver W Morgan

    2007-04-01

    Full Text Available Paracetamol (acetaminophen) poisoning is the leading cause of acute liver failure in Great Britain and the United States. Successful interventions to reduce harm from paracetamol poisoning are needed. To achieve this, the government of the United Kingdom introduced legislation in 1998 limiting the pack size of paracetamol sold in shops. Several studies have reported recent decreases in fatal poisonings involving paracetamol. We use interrupted time-series analysis to evaluate whether the recent fall in the number of paracetamol deaths is different to trends in fatal poisoning involving aspirin, paracetamol compounds, antidepressants, or nondrug poisoning suicide. We calculated directly age-standardised mortality rates for paracetamol poisoning in England and Wales from 1993 to 2004. We used an ordinary least-squares regression model divided into pre- and postintervention segments at 1999. The model included a term for autocorrelation within the time series. We tested for changes in the level and slope between the pre- and postintervention segments. To assess whether observed changes in the time series were unique to paracetamol, we compared against poisoning deaths involving compound paracetamol (not covered by the regulations), aspirin, antidepressants, and nonpoisoning suicide deaths. We did this comparison by calculating a ratio of each comparison series with paracetamol and applying a segmented regression model to the ratios. No change in the ratio level or slope indicated no difference compared to the control series. There were about 2,200 deaths involving paracetamol. The age-standardised mortality rate rose from 8.1 per million in 1993 to 8.8 per million in 1997, subsequently falling to about 5.3 per million in 2004. After the regulations were introduced, deaths dropped by 2.69 per million (p = 0.003). Trends in the age-standardised mortality rate for paracetamol compounds, aspirin, and antidepressants were broadly similar to paracetamol
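
    The control-series device described above (regressing the ratio of each comparison series to the paracetamol series) can be sketched as follows; the data are simulated and only the 1999 segmentation point is taken from the record, so the example is illustrative, not a reanalysis.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(5)
      years = np.arange(1993, 2005)
      post = (years >= 1999).astype(int)
      since = np.where(post == 1, years - 1998, 0)

      # Illustrative rates per million: both series decline proportionally after 1999,
      # so their ratio should show no level or slope change.
      paracetamol = 8.5 * 0.94 ** since + rng.normal(0, 0.2, years.size)
      aspirin = 3.0 * 0.94 ** since + rng.normal(0, 0.1, years.size)

      df = pd.DataFrame({"t": years - 1993, "level": post, "trend": since,
                         "ratio": aspirin / paracetamol})
      fit = smf.ols("ratio ~ t + level + trend", data=df).fit(
          cov_type="HAC", cov_kwds={"maxlags": 2})
      # Non-significant 'level' and 'trend' terms suggest the control series moved
      # in parallel with paracetamol deaths, i.e. no paracetamol-specific change.
      print(fit.params)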

  15. Improving prehospital trauma care in Rwanda through continuous quality improvement: an interrupted time series analysis.

    Science.gov (United States)

    Scott, John W; Nyinawankusi, Jeanne D'Arc; Enumah, Samuel; Maine, Rebecca; Uwitonze, Eric; Hu, Yihan; Kabagema, Ignace; Byiringiro, Jean Claude; Riviello, Robert; Jayaraman, Sudha

    2017-07-01

    Injury is a major cause of premature death and disability in East Africa, and high-quality pre-hospital care is essential for optimal trauma outcomes. The Rwandan pre-hospital emergency care service (SAMU) uses an electronic database to evaluate and optimize pre-hospital care through a continuous quality improvement programme (CQIP), beginning March 2014. The SAMU database was used to assess pre-hospital quality metrics including supplementary oxygen for hypoxia (O2), intravenous fluids for hypotension (IVF), cervical collar placement for head injuries (c-collar), and either splinting (splint) or administration of pain medications (pain) for long bone fractures. Targets of >90% were set for each metric and daily team meetings and monthly feedback sessions were implemented to address opportunities for improvement. These five pre-hospital quality metrics were assessed monthly before and after implementation of the CQIP. Met and unmet needs for O2, IVF, and c-collar were combined into a summative monthly SAMU Trauma Quality Scores (STQ score). An interrupted time series linear regression model compared the STQ score during 14 months before the CQIP implementation to the first 14 months after. During the 29-month study period 3,822 patients met study criteria. 1,028 patients needed one or more of the five studied interventions during the study period. All five endpoints had a significant increase between the pre-CQI and post-CQI periods (pimprovement of +6.1% (p=0.017) and sustained monthly improvements in care delivery-improving at a rate of 0.7% per month (p=0.028). The SAMU experience demonstrates the utility of a responsive, data-driven quality improvement programme to yield significant immediate and sustained improvements in pre-hospital care for trauma in Rwanda. This programme may be used as an example for additional efforts engaging frontline staff with real-time data feedback in order to rapidly translate data collection efforts into improved care for the

  16. Impact of STROBE Statement Publication on Quality of Observational Study Reporting: Interrupted Time Series versus Before-After Analysis

    OpenAIRE

    Bastuji-Garin, Sylvie; Sbidian, Emilie; Gaudy-Marqueste, Caroline; Ferrat, Emilie; Roujeau, Jean-Claude; Richard, Marie-Aleth; Canoui-Poitrine, Florence; Bouwes Bavinck, Jan Nico; Coenraads, Pieter-Jan; Diepgen, T.L.; Elsner, Peter; Garcia-Doval, Ignacio; Grob, J.J.; Langan, Sinead; Naldi, L.

    2013-01-01

    Background: In uncontrolled before-after studies, CONSORT was shown to improve the reporting of randomised trials. Before-after studies ignore underlying secular trends and may overestimate the impact of interventions. Our aim was to assess the impact of the 2007 STROBE statement publication on the quality of observational study reporting, using both uncontrolled before-after analyses and interrupted time series. Methods: For this quasi-experimental study, original articles reporting...

  17. A robust interrupted time series model for analyzing complex health care intervention data

    KAUST Repository

    Cruz, Maricela

    2017-08-29

    Current health policy calls for greater use of evidence-based care delivery services to improve patient quality and safety outcomes. Care delivery is complex, with interacting and interdependent components that challenge traditional statistical analytic techniques, in particular, when modeling a time series of outcomes data that might be

  18. Measuring Quality Improvement in Acute Ischemic Stroke Care: Interrupted Time Series Analysis of Door-to-Needle Time

    Directory of Open Access Journals (Sweden)

    Anne Margreet van Dishoeck

    2014-06-01

    Full Text Available Background: In patients with acute ischemic stroke, early treatment with recombinant tissue plasminogen activator (rtPA) improves functional outcome by effectively reducing disability and dependency. Timely thrombolysis, within 1 h, is a vital aspect of acute stroke treatment, and is reflected in the widely used performance indicator ‘door-to-needle time' (DNT). DNT measures the time from the moment the patient enters the emergency department until he/she receives intravenous rtPA. The purpose of the study was to measure quality improvement from the first implementation of thrombolysis in stroke patients in a university hospital in the Netherlands. We further aimed to identify specific interventions that affect DNT. Methods: We included all patients with acute ischemic stroke consecutively admitted to a large university hospital in the Netherlands between January 2006 and December 2012, and focused on those treated with thrombolytic therapy on admission. Data were collected routinely for research purposes and internal quality measurement (the Erasmus Stroke Study). We used a retrospective interrupted time series design to study the trend in DNT, analyzed by means of segmented regression. Results: Between January 2006 and December 2012, 1,703 patients with ischemic stroke were admitted and 262 (17%) were treated with rtPA. Patients treated with thrombolysis were on average 63 years old at the time of the stroke and 52% were male. Mean age (p = 0.58) and sex distribution (p = 0.98) did not change over the years. The proportion treated with thrombolysis increased from 5% in 2006 to 22% in 2012. In 2006, none of the patients were treated within 1 h. In 2012, this had increased to 81%. In a logistic regression analysis, this trend was significant (OR 1.6 per year, CI 1.4-1.8). The median DNT was reduced from 75 min in 2006 to 45 min in 2012 (p Conclusion and Implications: The DNT steadily improved from the first implementation of thrombolysis. Specific

  19. Effects of a Brief Team Training Program on Surgical Teams' Nontechnical Skills: An Interrupted Time-Series Study.

    Science.gov (United States)

    Gillespie, Brigid M; Harbeck, Emma; Kang, Evelyn; Steel, Catherine; Fairweather, Nicole; Panuwatwanich, Kriengsak; Chaboyer, Wendy

    2017-04-27

    Up to 60% of adverse events in surgery are the result of poor communication and teamwork. Nontechnical skills in surgery (NOTSS) are critical to the success of surgery and patient safety. The study aim was to evaluate the effect of a brief team training intervention on teams' observed NOTSS. Pretest-posttest interrupted time-series design with statistical process control analysis was used to detect longitudinal changes in teams' NOTSS. We evaluated NOTSS using the revised NOTECHS weekly for 20 to 25 weeks before and after implementation of a team training program. We observed 179 surgical procedures with cardiac, vascular, upper gastrointestinal, and hepatobiliary teams. Mean posttest NOTECHS scores increased across teams, showing special cause variation. There were also significant before and after improvements in NOTECHS scores in respect to professional role and in the use of the Surgical Safety Checklist. Our results suggest associated improvements in teams' NOTSS after implementation of the team training program.
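
    A minimal sketch of the statistical process control idea used above, on assumed weekly scores rather than the study's NOTECHS data: the centre line and 3-sigma limits are estimated from the pre-training weeks with an individuals (X) chart, and post-training points outside the limits are flagged as special-cause variation (only the simplest rule is shown).

      import numpy as np

      rng = np.random.default_rng(6)
      pre = rng.normal(34, 2.0, 22)        # weekly team scores before the training program
      post = rng.normal(37, 2.0, 22)       # weekly team scores after the training program

      centre = pre.mean()
      sigma = np.abs(np.diff(pre)).mean() / 1.128     # moving-range estimate (d2 = 1.128)
      ucl, lcl = centre + 3 * sigma, centre - 3 * sigma

      outside = [(week, round(score, 1)) for week, score in enumerate(post, start=1)
                 if score > ucl or score < lcl]
      print(f"centre = {centre:.1f}, control limits = ({lcl:.1f}, {ucl:.1f})")
      print("post-intervention points signalling special cause:", outside)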

  20. An Interrupted Time-Series Analysis of Durkheim's Social Deregulation Thesis: The Case of the Russian Federation

    Science.gov (United States)

    Pridemore, William Alex; Chamlin, Mitchell B.; Cochran, John K.

    2009-01-01

    The dissolution of the Soviet Union resulted in sudden, widespread, and fundamental changes to Russian society. The former social welfare system-with its broad guarantees of employment, healthcare, education, and other forms of social support-was dismantled in the shift toward democracy, rule of law, and a free-market economy. This unique natural experiment provides a rare opportunity to examine the potentially disintegrative effects of rapid social change on deviance, and thus to evaluate one of Durkheim's core tenets. We took advantage of this opportunity by performing interrupted time-series analyses of annual age-adjusted homicide, suicide, and alcohol-related mortality rates for the Russian Federation using data from 1956 to 2002, with 1992-2002 as the postintervention time-frame. The ARIMA models indicate that, controlling for the long-term processes that generated these three time series, the breakup of the Soviet Union was associated with an appreciable increase in each of the cause-of-death rates. We interpret these findings as being consistent with the Durkheimian hypothesis that rapid social change disrupts social order, thereby increasing the level of crime and deviance. PMID:20165565

  1. Persistent threats to validity in single-group interrupted time series analysis with a cross over design.

    Science.gov (United States)

    Linden, Ariel

    2017-04-01

    The basic single-group interrupted time series analysis (ITSA) design has been shown to be susceptible to the most common threat to validity, history: the possibility that some other event caused the observed effect in the time series. A single-group ITSA with a crossover design (in which the intervention is introduced and withdrawn 1 or more times) should be more robust. In this paper, we describe and empirically assess the susceptibility of this design to bias from history. Time series data from 2 natural experiments (the effect of multiple repeals and reinstatements of Louisiana's motorcycle helmet law on motorcycle fatalities and the association between the implementation and withdrawal of Gorbachev's antialcohol campaign with Russia's mortality crisis) are used to illustrate that history remains a threat to ITSA validity, even in a crossover design. Both empirical examples reveal that the single-group ITSA with a crossover design may be biased because of history. In the case of motorcycle fatalities, helmet laws appeared effective in reducing mortality (while repealing the law increased mortality), but when a control group was added, it was shown that this trend was similar in both groups. In the case of Gorbachev's antialcohol campaign, only when contrasting the results against those of a control group was the withdrawal of the campaign found to be the more likely culprit in explaining the Russian mortality crisis than the collapse of the Soviet Union. Even with a robust crossover design, single-group ITSA models remain susceptible to bias from history. Therefore, a comparable control group design should be included, whenever possible. © 2016 John Wiley & Sons, Ltd.
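
    To illustrate the point made above, the sketch below (simulated data; statsmodels assumed) fits a crossover ITSA in which the 'intervention' is switched on and then off, first for the treated series alone and then with a control series added. Because both series share the same external shock, the single-group model shows a large apparent effect that disappears in the group-by-intervention interaction.

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(7)
      t = np.arange(60)
      on = ((t >= 20) & (t < 40)).astype(int)      # intervention in force only in the middle block
      frames = []
      for grp in ["treated", "control"]:
          # both groups experience the same shock while 'on' = 1, i.e. the culprit is history
          y = 100 - 0.2 * t - 8 * on + rng.normal(0, 2, t.size)
          frames.append(pd.DataFrame({"y": y, "t": t, "on": on, "group": grp}))
      df = pd.concat(frames, ignore_index=True)

      single = smf.ols("y ~ t + on", data=df[df.group == "treated"]).fit(
          cov_type="HAC", cov_kwds={"maxlags": 3})
      contrast = smf.ols("y ~ t + on * group", data=df).fit(
          cov_type="HAC", cov_kwds={"maxlags": 3})
      print(round(single.params["on"], 2))                       # large apparent 'effect'
      print(round(contrast.params["on:group[T.treated]"], 2))    # near zero once a control is added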

  2. Impact of STROBE statement publication on quality of observational study reporting: interrupted time series versus before-after analysis.

    Directory of Open Access Journals (Sweden)

    Sylvie Bastuji-Garin

    Full Text Available In uncontrolled before-after studies, CONSORT was shown to improve the reporting of randomised trials. Before-after studies ignore underlying secular trends and may overestimate the impact of interventions. Our aim was to assess the impact of the 2007 STROBE statement publication on the quality of observational study reporting, using both uncontrolled before-after analyses and interrupted time series. For this quasi-experimental study, original articles reporting cohort, case-control, and cross-sectional studies published between 2004 and 2010 in the four dermatological journals having the highest 5-year impact factors (≥ 4) were selected. We compared the proportions of STROBE items (STROBE score) adequately reported in each article during three periods, two pre-STROBE periods (2004-2005 and 2006-2007) and one post-STROBE period (2008-2010). Segmented regression analysis of interrupted time series was also performed. Of the 456 included articles, 187 (41%) reported cohort studies, 166 (36.4%) cross-sectional studies, and 103 (22.6%) case-control studies. The median STROBE score was 57% (range, 18%-98%). Before-after analysis evidenced significant STROBE score increases between the two pre-STROBE periods and between the earliest pre-STROBE period and the post-STROBE period (median score 2004-05 48% versus median score 2008-10 58%, p<0.001) but not between the immediate pre-STROBE period and the post-STROBE period (median score 2006-07 58% versus median score 2008-10 58%, p = 0.42). In the pre-STROBE period, the six-monthly mean STROBE score increased significantly, by 1.19% per six-month period (absolute increase 95% CI, 0.26% to 2.11%; p = 0.016). By segmented analysis, no significant changes in STROBE score trends occurred (-0.40%; 95% CI, -2.20 to 1.41; p = 0.64) in the post-STROBE statement publication. The quality of reports increased over time but was not affected by STROBE. Our findings raise concerns about the relevance of uncontrolled before

  3. Impact of STROBE statement publication on quality of observational study reporting: interrupted time series versus before-after analysis.

    Science.gov (United States)

    Bastuji-Garin, Sylvie; Sbidian, Emilie; Gaudy-Marqueste, Caroline; Ferrat, Emilie; Roujeau, Jean-Claude; Richard, Marie-Aleth; Canoui-Poitrine, Florence

    2013-01-01

    In uncontrolled before-after studies, CONSORT was shown to improve the reporting of randomised trials. Before-after studies ignore underlying secular trends and may overestimate the impact of interventions. Our aim was to assess the impact of the 2007 STROBE statement publication on the quality of observational study reporting, using both uncontrolled before-after analyses and interrupted time series. For this quasi-experimental study, original articles reporting cohort, case-control, and cross-sectional studies published between 2004 and 2010 in the four dermatological journals having the highest 5-year impact factors (≥ 4) were selected. We compared the proportions of STROBE items (STROBE score) adequately reported in each article during three periods, two pre-STROBE periods (2004-2005 and 2006-2007) and one post-STROBE period (2008-2010). Segmented regression analysis of interrupted time series was also performed. Of the 456 included articles, 187 (41%) reported cohort studies, 166 (36.4%) cross-sectional studies, and 103 (22.6%) case-control studies. The median STROBE score was 57% (range, 18%-98%). Before-after analysis evidenced significant STROBE score increases between the two pre-STROBE periods and between the earliest pre-STROBE period and the post-STROBE period (median score 2004-05 48% versus median score 2008-10 58%, p<0.001) but not between the immediate pre-STROBE period and the post-STROBE period (median score 2006-07 58% versus median score 2008-10 58%, p = 0.42). In the pre-STROBE period, the six-monthly mean STROBE score increased significantly, by 1.19% per six-month period (absolute increase 95% CI, 0.26% to 2.11%; p = 0.016). By segmented analysis, no significant changes in STROBE score trends occurred (-0.40%; 95% CI, -2.20 to 1.41; p = 0.64) in the post-STROBE statement publication. The quality of reports increased over time but was not affected by STROBE. Our findings raise concerns about the relevance of uncontrolled before-after analysis for estimating the impact of

  4. Effect of nocturnal sound reduction on the incidence of delirium in intensive care unit patients: An interrupted time series analysis.

    Science.gov (United States)

    van de Pol, Ineke; van Iterson, Mat; Maaskant, Jolanda

    2017-08-01

    Delirium in critically-ill patients is a common multifactorial disorder that is associated with various negative outcomes. It is assumed that sleep disturbances can result in an increased risk of delirium. This study hypothesized that implementing a protocol that reduces overall nocturnal sound levels improves quality of sleep and reduces the incidence of delirium in Intensive Care Unit (ICU) patients. This interrupted time series study was performed in an adult mixed medical and surgical 24-bed ICU. A pre-intervention group of 211 patients was compared with a post-intervention group of 210 patients after implementation of a nocturnal sound-reduction protocol. Primary outcome measures were incidence of delirium, measured by the Intensive Care Delirium Screening Checklist (ICDSC) and quality of sleep, measured by the Richards-Campbell Sleep Questionnaire (RCSQ). Secondary outcome measures were use of sleep-inducing medication, delirium treatment medication, and patient-perceived nocturnal noise. A significant difference in slope in the percentage of delirium was observed between the pre- and post-intervention periods (-3.7% per time period, p=0.02). Quality of sleep was unaffected (0.3 per time period, p=0.85). The post-intervention group used significantly less sleep-inducing medication (psound-reduction protocol. However, reported sleep quality did not improve. Copyright © 2017. Published by Elsevier Ltd.

  5. The effects of pay for performance on disparities in stroke, hypertension, and coronary heart disease management: interrupted time series study.

    Directory of Open Access Journals (Sweden)

    John Tayu Lee

    Full Text Available The Quality and Outcomes Framework (QOF), a major pay-for-performance programme, was introduced into United Kingdom primary care in April 2004. The impact of this programme on disparities in health care remains unclear. This study examines the following questions: has this pay for performance programme improved the quality of care for coronary heart disease, stroke and hypertension in white, black and south Asian patients? Has this programme reduced disparities in the quality of care between these ethnic groups? Did general practices with different baseline performance respond differently to this programme? Retrospective cohort study of patients registered with family practices in Wandsworth, London during 2007. Segmented regression analysis of interrupted time series was used to take into account the previous time trend. Primary outcome measures were mean systolic and diastolic blood pressure, and cholesterol levels. Our findings suggest that the implementation of QOF resulted in significant short term improvements in blood pressure control. The magnitude of benefit varied between ethnic groups with a statistically significant short term reduction in systolic BP in white and black but not in south Asian patients with hypertension. Disparities in risk factor control were attenuated only on a few measures and largely remained intact at the end of the study period. Pay for performance programmes such as the QOF in the UK should set challenging but achievable targets. Specific targets aimed at reducing ethnic disparities in health care may also be needed.

  6. The effects of pay for performance on disparities in stroke, hypertension, and coronary heart disease management: interrupted time series study.

    Science.gov (United States)

    Lee, John Tayu; Netuveli, Gopalakrishnan; Majeed, Azeem; Millett, Christopher

    2011-01-01

    The Quality and Outcomes Framework (QOF), a major pay-for-performance programme, was introduced into United Kingdom primary care in April 2004. The impact of this programme on disparities in health care remains unclear. This study examines the following questions: has this pay for performance programme improved the quality of care for coronary heart disease, stroke and hypertension in white, black and south Asian patients? Has this programme reduced disparities in the quality of care between these ethnic groups? Did general practices with different baseline performance respond differently to this programme? Retrospective cohort study of patients registered with family practices in Wandsworth, London during 2007. Segmented regression analysis of interrupted time series was used to take into account the previous time trend. Primary outcome measures were mean systolic and diastolic blood pressure, and cholesterol levels. Our findings suggest that the implementation of QOF resulted in significant short term improvements in blood pressure control. The magnitude of benefit varied between ethnic groups with a statistically significant short term reduction in systolic BP in white and black but not in south Asian patients with hypertension. Disparities in risk factor control were attenuated only on few measures and largely remained intact at the end of the study period. Pay for performance programmes such as the QOF in the UK should set challenging but achievable targets. Specific targets aimed at reducing ethnic disparities in health care may also be needed.

  7. Use of a glucose management service improves glycemic control following vascular surgery: an interrupted time-series study.

    Science.gov (United States)

    Wallaert, Jessica B; Chaidarun, Sushela S; Basta, Danielle; King, Kathryn; Comi, Richard; Ogrinc, Greg; Nolan, Brian W; Goodney, Philip P

    2015-05-01

    The optimal method for obtaining good blood glucose control in noncritically ill patients undergoing peripheral vascular surgery remains a topic of debate for surgeons, endocrinologists, and others involved in the care of patients with peripheral arterial disease and diabetes. A prospective trial was performed to evaluate the impact of routine use of a glucose management service (GMS) on glycemic control within 24 hours of lower-extremity revascularization (LER). In an interrupted time-series design (May 1, 2011-April 30, 2012), surgeon-directed diabetic care (Baseline phase) was compared with routine GMS involvement (Intervention phase) following LER. GMS assumed responsibility for glucose management through discharge. The main outcome measure was glycemic control, assessed by (1) mean hospitalization glucose and (2) the percentage of recorded glucose values within target range. Statistical process control charts were used to assess the impact of the intervention. Clinically important differences in patient demographics were noted between groups; the 19 patients in the Intervention arm had worse peripheral vascular disease than the 19 patients in the Baseline arm (74% critical limb ischemia versus 58%; p = .63). Routine use of GMS significantly reduced mean hospitalization glucose (191 mg/dL Baseline versus 150 mg/dL Intervention, p improved glycemic control in patients undergoing LER. Future work is needed to examine the impact of improved glycemic control on clinical outcomes following LER.

  8. Effects of fluoroquinolone restriction (from 2007 to 2012) on Clostridium difficile infections: interrupted time-series analysis.

    Science.gov (United States)

    Sarma, J B; Marshall, B; Cleeve, V; Tate, D; Oswald, T; Woolfrey, S

    2015-09-01

    Antimicrobial stewardship is a key component in the reduction of healthcare-associated infections, particularly Clostridium difficile infection (CDI). We successfully restricted the use of cephalosporins and, subsequently, fluoroquinolones. From an endemically high level of >280 cases per year in 2007-08, the number of CDIs reduced to 72 cases in 2011-12. To describe the implementation and impact of fluoroquinolone restriction on CDI. This was an interrupted time-series analysis pre and post fluoroquinolone restriction for 60 months based on a Poisson distribution model. In June 2008, fluoroquinolone consumption halved to about 5 defined daily doses (DDD) per 100 occupied bed-days (OBD). This was followed by a significant fall in CDI number [rate ratio (RR): 0.332; 95% confidence interval (CI): 0.240-0.460] which remained low over the subsequent months. Subsequently, fluoroquinolone consumption was further reduced to about 2 DDD/100 OBD in June 2010 accompanied by further reduction in CDI rate (RR: 0.394; 95% CI: 0.199-0.781). In a univariate Poisson model the CDI rate was associated with fluoroquinolone usage (RR: 1.086; 95% CI: 1.077-1.094). We conclude that in an environment where cephalosporin usage is already low, the reduction in fluoroquinolone usage was associated with an immediate, large, and significant reduction in CDI cases. Copyright © 2015 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  9. Smoke-free parks and beaches: an interrupted time-series study of behavioural impact in New York City.

    Science.gov (United States)

    Johns, Michael; Farley, Shannon M; Rajulu, Deepa T; Kansagra, Susan M; Juster, Harlan R

    2015-09-01

    In 2011, New York City (NYC) parks and beaches became smoke-free. There is currently little research evaluating the impact of such laws on smoking behaviour at the population level. We used an interrupted time-series study design to analyse data from the New York State Adult Tobacco Survey to assess the law's impact using the rest of New York State as a comparison. Trends in how frequently respondents noticed people smoking in parks and beaches were analysed between the third quarter of 2009 and the fourth quarter of 2012, comparing NYC to the rest of the state. The trend in the frequency of NYC residents noticing people smoking in local parks and beaches decreased significantly over the six quarters after the law took effect. There was no comparable decline among residents in the rest of the state. An increase in the number of respondents who never noticed people smoking in NYC contributed to this decline. These results are consistent with previous studies and provide population-level evidence that suggest the law has reduced smoking in parks and on beaches. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  10. Prevention of brachial plexus injury-12 years of shoulder dystocia training: an interrupted time-series study.

    Science.gov (United States)

    Crofts, J F; Lenguerrand, E; Bentham, G L; Tawfik, S; Claireaux, H A; Odd, D; Fox, R; Draycott, T J

    2016-01-01

    To investigate management and outcomes of incidences of shoulder dystocia in the 12 years following the introduction of an obstetric emergencies training programme. Interrupted time-series study comparing management and neonatal outcome of births complicated by shoulder dystocia over three 4-year periods: (i) Pre-training (1996-99), (ii) Early training (2001-04), and (iii) Late training (2009-12). Southmead Hospital, Bristol, UK, with approximately 6000 births per annum. Infants and their mothers who experienced shoulder dystocia. A bi-monthly multi-professional 1-day intrapartum emergencies training course, that included a 30-minute practical session on shoulder dystocia management, commenced in 2000. Neonatal morbidity (brachial plexus injury, humeral fracture, clavicular fracture, 5-minute Apgar score shoulder dystocia (resolution manoeuvres performed, traction applied, head-to-body delivery interval). Compliance with national guidance improved with continued training. At least one recognised resolution manoeuvre was used in 99.8% (561/562) of cases of shoulder dystocia in the late training period, demonstrating a continued improvement from 46.3% (150/324, P shoulder dystocia. © 2015 Royal College of Obstetricians and Gynaecologists.

  11. The impact of public transportation strikes on use of a bicycle share program in London: interrupted time series design.

    Science.gov (United States)

    Fuller, Daniel; Sahlqvist, Shannon; Cummins, Steven; Ogilvie, David

    2012-01-01

    To investigate the immediate and sustained effects of two London Underground strikes on use of a public bicycle share program. An interrupted time series design was used to examine the impact of two 24 hour strikes on the total number of trips per day and mean trip duration per day on the London public bicycle share program. The strikes occurred on September 6th and October 4th 2010 and limited service on the London Underground. The mean total number of trips per day over the whole study period was 14,699 (SD=5390) while the mean trip duration was 18.5 minutes (SD=3.7). Significant increases in daily trip count were observed following strike 1 (3864: 95% CI 125 to 7604) and strike 2 (11,293: 95% CI 5169 to 17,416). Events that greatly constrain the primary motorised mode of transportation for a population may have unintended short-term effects on travel behaviour. These findings suggest that limiting transportation options may have the potential to increase population levels of physical activity by promoting the use of cycling. Copyright © 2011 Elsevier Inc. All rights reserved.

  12. Evaluating a community-based exercise intervention with adults living with HIV: protocol for an interrupted time series study.

    Science.gov (United States)

    O'Brien, Kelly K; Bayoumi, Ahmed M; Solomon, Patricia; Tang, Ada; Murzin, Kate; Chan Carusone, Soo; Zobeiry, Mehdi; Nayar, Ayesha; Davis, Aileen M

    2016-10-20

    Our aim was to evaluate a community-based exercise (CBE) intervention with the goal of reducing disability and enhancing health for community-dwelling people living with HIV (PLWH). We will use a mixed-methods implementation science study design, including a prospective longitudinal interrupted time series study, to evaluate a CBE intervention with PLWH in Toronto, Canada. We will recruit PLWH who consider themselves medically stable and safe to participate in exercise. In the baseline phase (0-8 months), participants will be monitored bimonthly. In the intervention phase (8-14 months), participants will take part in a 24-week CBE intervention that includes aerobic, resistance, balance and flexibility exercise at the YMCA 3 times per week, with weekly supervision by a fitness instructor, and monthly educational sessions. In the follow-up phase (14-22 months), participants will be encouraged to continue to engage in unsupervised exercise 3 times per week. Quantitative assessment: We will assess cardiopulmonary fitness, strength, weight, body composition and flexibility outcomes followed by the administration of self-reported questionnaires to assess disability and contextual factor outcomes (coping, mastery, stigma, social support) bimonthly. We will use time series regression analysis to determine the level and trend of outcomes across each phase in relation to the intervention. Qualitative assessment: We will conduct a series of face-to-face interviews with a subsample of participants and recreation providers at initiation, midpoint and completion of the 24-week CBE intervention. We will explore experiences and anticipated benefits with exercise, perceived impact of CBE for PLWH and the strengths and challenges of implementing a CBE intervention. Interviews will be audio recorded and analysed thematically. Protocol approved by the University of Toronto HIV/AIDS Research Ethics Board. Knowledge translation will occur with stakeholders in the form of

  13. An electronic trigger tool to optimise intravenous to oral antibiotic switch: a controlled, interrupted time series study

    Directory of Open Access Journals (Sweden)

    Marvin A. H. Berrevoets

    2017-08-01

    Full Text Available Background: Timely switch from intravenous (iv) antibiotics to oral therapy is a key component of antimicrobial stewardship programs in order to improve patient safety, promote early discharge and reduce costs. We have introduced a time-efficient and easily implementable intervention that relies on a computerized trigger tool, which identifies patients who are candidates for an iv to oral antibiotic switch. Methods: The intervention was introduced on all internal medicine wards in a teaching hospital. Patients were automatically identified by an electronic trigger tool when parenteral antibiotics were used for >48 h and clinical or pharmacological data did not preclude switch therapy. A weekly educational session was introduced to alert the physicians on the intervention wards. The intervention wards were compared with control wards, which included all other hospital wards. An interrupted time-series analysis was performed to compare the pre-intervention period with the post-intervention period using ‘% of i.v. prescriptions >72 h' and ‘median duration of iv therapy per prescription' as outcomes. We performed a detailed prospective evaluation on a subset of 244 prescriptions to evaluate the efficacy and appropriateness of the intervention. Results: The number of intravenous prescriptions longer than 72 h was reduced by 19% in the intervention group (n = 1519; p < 0.01) and the median duration of iv antibiotics was reduced by 0.8 days (p < 0.05). Compared to the control group (n = 4366), the intervention was responsible for an additional decrease of 13% (p < 0.05) in prolonged prescriptions. The detailed prospective evaluation of a subgroup of patients showed that adherence to the electronic reminder was 72%. Conclusions: An electronic trigger tool combined with a weekly educational session was effective in reducing the duration of intravenous antimicrobial therapy.

  14. Changing use of surgical antibiotic prophylaxis in Thika Hospital, Kenya: a quality improvement intervention with an interrupted time series design.

    Directory of Open Access Journals (Sweden)

    Alexander M Aiken

    In low-income countries, Surgical Site Infection (SSI) is a common form of hospital-acquired infection. Antibiotic prophylaxis is an effective method of preventing these infections, if given immediately before the start of surgery. Although several studies in Africa have compared pre-operative versus post-operative prophylaxis, there are no studies describing the implementation of policies to improve prescribing of surgical antibiotic prophylaxis in African hospitals. We conducted SSI surveillance at a typical Government hospital in Kenya over a 16 month period between August 2010 and December 2011, using standard definitions of SSI and the extent of contamination of surgical wounds. As an intervention, we developed a hospital policy that advised pre-operative antibiotic prophylaxis and discouraged extended post-operative antibiotic use. We measured process, outcome and balancing effects of this intervention using an interrupted time series design. From a starting point of near-exclusive post-operative antibiotic use, after policy introduction in February 2011 there was rapid adoption of pre-operative antibiotic prophylaxis (60% of operations at 1 week; 98% at 6 weeks) and a substantial decrease in the use of post-operative antibiotics (40% of operations at 1 week; 10% at 6 weeks) in Clean and Clean-Contaminated surgery. There was no immediate step-change in risk of SSI, but overall there appeared to be a moderate reduction in the risk of superficial SSI across all levels of wound contamination. There were marked reductions in the costs associated with antibiotic use, the number of intravenous injections performed and the nursing time spent administering these. Implementation of a locally developed policy regarding surgical antibiotic prophylaxis is an achievable quality improvement target for hospitals in low-income countries, and can lead to substantial benefits for individual patients and the institution.

  15. A multifaceted intervention to improve sepsis management in general hospital wards with evaluation using segmented regression of interrupted time series.

    Science.gov (United States)

    Marwick, Charis A; Guthrie, Bruce; Pringle, Jan E C; Evans, Josie M M; Nathwani, Dilip; Donnan, Peter T; Davey, Peter G

    2014-12-01

    Antibiotic administration to inpatients developing sepsis in general hospital wards was frequently delayed. We aimed to reproduce improvements in sepsis management reported in other settings. Ninewells Hospital, an 860-bed teaching hospital with quality improvement (QI) experience, in Scotland, UK. The intervention wards were 22 medical, surgical and orthopaedic inpatient wards. A multifaceted intervention, informed by baseline process data and questionnaires and interviews with junior doctors, evaluated using segmented regression analysis of interrupted time series (ITS) data. MEASURES FOR IMPROVEMENT: Primary outcome measure: antibiotic administration within 4 hours of sepsis onset. Secondary measures: antibiotics within 8 hours; mean and median time to antibiotics; medical review within 30 min for patients with a standardised early warning system score ≥4; blood cultures taken before antibiotic administration; blood lactate level measured. The intervention included printed and electronic clinical guidance, educational clinical team meetings including baseline performance data, audit and monthly feedback on performance. Performance against all study outcome measures improved postintervention but differences were small and ITS analysis did not attribute the observed changes to the intervention. Rigorous analysis of this carefully designed improvement intervention could not confirm significant effects. Statistical analysis of many such studies is inadequate, and there is insufficient reporting of negative studies. In light of recent evidence, involving senior clinical team members in verbal feedback and action planning may have made the intervention more effective. Our focus on rigorous intervention design and evaluation was at the expense of iterative refinement, which likely reduced the effect. This highlights the necessary, but challenging, requirement to invest in all three components for effective QI.
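
    The two-segment model described above can be sketched in a few lines. The Python example below is not the study's code: the monthly series, the intervention month and the variable names are simulated assumptions, and it only illustrates how the level-change and slope-change terms of a single-group interrupted time series regression are typically coded.

```python
# Minimal two-segment ITS sketch (simulated data, not the study's analysis).
# Model: y_t = b0 + b1*time + b2*level + b3*slope + e_t, where 'level' switches to 1
# at the intervention and 'slope' counts months since the intervention.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_months = 36
start = 18                                   # hypothetical intervention month

df = pd.DataFrame({"time": np.arange(n_months)})
df["level"] = (df["time"] >= start).astype(int)
df["slope"] = np.maximum(0, df["time"] - start)
# Simulated outcome: % of sepsis patients given antibiotics within 4 h of onset
df["pct_within_4h"] = (50 + 0.2 * df["time"] + 5 * df["level"]
                       + 0.4 * df["slope"] + rng.normal(0, 3, n_months))

# Newey-West (HAC) standard errors are one simple guard against autocorrelated residuals.
fit = smf.ols("pct_within_4h ~ time + level + slope", data=df).fit(
    cov_type="HAC", cov_kwds={"maxlags": 3})
print(fit.params)   # 'level' = immediate change, 'slope' = change in monthly trend
```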

  16. Multifaceted academic detailing program to increase pharmacotherapy for alcohol use disorder: interrupted time series evaluation of effectiveness.

    Science.gov (United States)

    Harris, Alex H S; Bowe, Thomas; Hagedorn, Hildi; Nevedal, Andrea; Finlay, Andrea K; Gidwani, Risha; Rosen, Craig; Kay, Chad; Christopher, Melissa

    2016-09-15

    Active consideration of effective medications to treat alcohol use disorder (AUD) is a consensus standard of care, yet knowledge and use of these medications are very low across diverse settings. This study evaluated the overall effectiveness of a multifaceted academic detailing program to address this persistent quality problem in the US Veterans Health Administration (VHA), as well as the context and process factors that explained variation in effectiveness across sites. An interrupted time series design, analyzed with mixed-effects segmented logistic regression, was used to evaluate changes in level and rate of change in the monthly percent of patients with a clinically documented AUD who received naltrexone, acamprosate, disulfiram, or topiramate. Using data from a 20 month post-implementation period, intervention sites (n = 37) were compared to their own 16 month pre-implementation performance and separately to the rest of VHA. From immediately pre-intervention to the end of the observation period, the percent of patients in the intervention sites with AUD who received medication increased by over 3.4% in absolute terms and 68% in relative terms (i.e., from 4.9% to 8.3%). This change was significant compared to the pre-implementation period in the intervention sites and secular trends in control sites. Sites with lower pre-implementation adoption, more person hours of detailing, but fewer people detailed, had larger immediate increases in medication receipt after implementation. The average number of detailing encounters per person was associated with steeper increases in slope over time. This study found empirical support for a multifaceted quality improvement strategy aimed at increasing access to and utilization of pharmacotherapy for AUD. Future studies should focus on determining how to enhance the program's effects, especially in non-responsive locations.
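
    As a rough illustration of the segmented logistic approach named above, the sketch below fits a binomial model to one simulated monthly series of treated/untreated counts. The study itself used mixed-effects segmented logistic regression across 37 sites; the site-level random effects are omitted here, and every number and name is an invented placeholder.

```python
# Simplified, single-series version of a segmented logistic (binomial) ITS model.
# Site-level random effects from the original mixed-effects design are omitted.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
months = np.arange(36)
level = (months >= 16).astype(float)         # hypothetical start of academic detailing
slope = np.maximum(0, months - 16)

n_aud = rng.integers(800, 1200, size=36)     # patients with documented AUD per month
p = 1 / (1 + np.exp(-(-3.0 + 0.005 * months + 0.3 * level + 0.02 * slope)))
treated = rng.binomial(n_aud, p)             # patients receiving an AUD medication

# Two-column endog = (successes, failures); coefficients are log odds ratios.
endog = np.column_stack([treated, n_aud - treated])
exog = sm.add_constant(np.column_stack([months, level, slope]))
fit = sm.GLM(endog, exog, family=sm.families.Binomial()).fit()
print(np.exp(fit.params))                    # odds ratios for baseline trend, level and slope changes
```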

  17. Helmet legislation and admissions to hospital for cycling related head injuries in Canadian provinces and territories: interrupted time series analysis

    Science.gov (United States)

    Ramsay, Tim; Turgeon, Alexis F; Zarychanski, Ryan

    2013-01-01

    Objective To investigate the association between helmet legislation and admissions to hospital for cycling related head injuries among young people and adults in Canada. Design Interrupted time series analysis using data from the National Trauma Registry Minimum Data Set. Setting Canadian provinces and territories; between 1994 and 2003, six of 10 provinces implemented helmet legislation. Participants All admissions (n=66 716) to acute care hospitals in Canada owing to cycling related injury between 1994 and 2008. Main outcome measure Rate of admissions to hospital for cycling related head injuries before and after the implementation of provincial helmet legislation. Results Between 1994 and 2008, 66 716 hospital admissions were for cycling related injuries in Canada. Between 1994 and 2003, the rate of head injuries among young people decreased by 54.0% (95% confidence interval 48.2% to 59.8%) in provinces with helmet legislation compared with 33.1% (23.3% to 42.9%) in provinces and territories without legislation. Among adults, the rate of head injuries decreased by 26.0% (16.0% to 36.3%) in provinces with legislation but remained constant in provinces and territories without legislation. After taking baseline trends into consideration, however, we were unable to detect an independent effect of legislation on the rate of hospital admissions for cycling related head injuries. Conclusions Reductions in the rates of admissions to hospital for cycling related head injuries were greater in provinces with helmet legislation, but injury rates were already decreasing before the implementation of legislation and the rate of decline was not appreciably altered on introduction of legislation. While helmets reduce the risk of head injuries and we encourage their use, in the Canadian context of existing safety campaigns, improvements to the cycling infrastructure, and the passive uptake of helmets, the incremental contribution of provincial helmet legislation to reduce hospital admissions for head injuries seems to have been minimal.

  18. Impact of a hospital-wide hand hygiene initiative on healthcare-associated infections: results of an interrupted time series.

    Science.gov (United States)

    Kirkland, Kathryn B; Homa, Karen A; Lasky, Rosalind A; Ptak, Judy A; Taylor, Eileen A; Splaine, Mark E

    2012-12-01

    Evidence that hand hygiene (HH) reduces healthcare-associated infections has been available for almost two centuries. Yet HH compliance among healthcare professionals continues to be low, and most efforts to improve it have failed. To improve healthcare workers' HH, and reduce healthcare-associated infections. 3-year interrupted time series with multiple sequential interventions and 1-year post-intervention follow-up. Teaching hospital in rural New Hampshire. In five categories: (1) leadership/accountability; (2) measurement/feedback; (3) hand sanitiser availability; (4) education/training; and (5) marketing/communication. Monthly changes in observed HH compliance (%) and rates of healthcare-associated infection (including Staphylococcus aureus infections, Clostridium difficile infections and bloodstream infections) per 1000 inpatient days. The subset of S aureus infections attributable to the operating room served as a tracer condition. We used statistical process control charts to identify significant changes. HH compliance increased significantly from 41% to 87% (p<0.01) during the initiative, and improved further to 91% (p<0.01) the following year. Nurses achieved higher HH compliance (93%) than physicians (78%). There was a significant, sustained decline in the healthcare-associated infection rate from 4.8 to 3.3 (p<0.01) per 1000 inpatient days. The rate of S aureus infections attributable to the operating room rose, while the rate of other S aureus infections fell. Our initiative was associated with a large and significant hospital-wide improvement in HH which was sustained through the following year and a significant, sustained reduction in the incidence of healthcare-associated infection. The observed increased incidence of the tracer condition supports the assertion that HH improvement contributed to infection reduction. Persistent variation in HH performance among different groups requires further study.

  19. Effects of fluoroquinolone restriction (from 2007 to 2012) on resistance in Enterobacteriaceae: interrupted time-series analysis.

    Science.gov (United States)

    Sarma, J B; Marshall, B; Cleeve, V; Tate, D; Oswald, T; Woolfrey, S

    2015-09-01

    Antibiotic stewardship is a key component in the effort to reduce healthcare-associated infections. To describe the implementation and analyse the impact of fluoroquinolone restriction on resistance in Enterobacteriaceae, focusing on urinary isolates of extended-spectrum β-lactamase (ESBL)-producing Escherichia coli, which were historically almost universally resistant to fluoroquinolones. ESBL-producing E. coli hospital and community isolates, obtained between April 2009 and March 2012 from consecutive non-duplicate urine samples, were included in an interrupted time-series analysis based on a Poisson distribution model. Periods before and after fluoroquinolone restriction were compared. The trends in fluoroquinolone resistance in all urinary isolates of Enterobacteriaceae (N ≈ 20,000 per year) and blood culture isolates of E. coli (N ≈ 350) between 2009 and 2013 were also analysed. A large decline in the percentage of ciprofloxacin-resistant ESBL-producing urinary E. coli isolates was observed in both hospital (risk ratio: 0.473; 95% confidence interval: 0.315-0.712) and community settings (0.098; 0.062-0.157). The decline was also marked in all urinary isolates of Enterobacteriaceae and E. coli isolates from blood cultures. We conclude that reducing fluoroquinolone usage to a level of ≤2 defined daily doses per 100 occupied bed-days in hospital sufficiently removed selection pressure to allow resistant Enterobacteriaceae – specifically, the UK endemic strains of ESBL-producing E. coli – to revert to fluoroquinolone susceptibility within a short span of four months. This was accompanied by a concomitant reduction in overall ESBL burden. Copyright © 2015 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  20. Bridging the Gap: using an interrupted time series design to evaluate systems reform addressing refugee maternal and child health inequalities.

    Science.gov (United States)

    Yelland, Jane; Riggs, Elisha; Szwarc, Josef; Casey, Sue; Dawson, Wendy; Vanpraag, Dannielle; East, Chris; Wallace, Euan; Teale, Glyn; Harrison, Bernie; Petschel, Pauline; Furler, John; Goldfeld, Sharon; Mensah, Fiona; Biro, Mary Anne; Willey, Sue; Cheng, I-Hao; Small, Rhonda; Brown, Stephanie

    2015-04-30

    The risk of poor maternal and perinatal outcomes in high-income countries such as Australia is greatest for those experiencing extreme social and economic disadvantage. Australian data show that women of refugee background have higher rates of stillbirth, fetal death in utero and perinatal mortality compared with Australian born women. Policy and health system responses to such inequities have been slow and poorly integrated. This protocol describes an innovative programme of quality improvement and reform in publically funded universal health services in Melbourne, Australia, that aims to address refugee maternal and child health inequalities. A partnership of 11 organisations spanning health services, government and research is working to achieve change in the way that maternity and early childhood health services support families of refugee background. The aims of the programme are to improve access to universal health care for families of refugee background and build organisational and system capacity to address modifiable risk factors for poor maternal and child health outcomes. Quality improvement initiatives are iterative, co-designed by partners and implemented using the Plan Do Study Act framework in four maternity hospitals and two local government maternal and child health services. Bridging the Gap is designed as a multi-phase, quasi-experimental study. Evaluation methods include use of interrupted time series design to examine health service use and maternal and child health outcomes over a 3-year period of implementation. Process measures will examine refugee families' experiences of specific initiatives and service providers' views and experiences of innovation and change. It is envisaged that the Bridging the Gap program will provide essential evidence to support service and policy innovation and knowledge about what it takes to implement sustainable improvements in the way that health services support vulnerable populations, within the constraints

  1. The effect of the late 2000s financial crisis on suicides in Spain: an interrupted time-series analysis.

    Science.gov (United States)

    Lopez Bernal, James A; Gasparrini, Antonio; Artundo, Carlos M; McKee, Martin

    2013-10-01

    The current financial crisis is having a major impact on European economies, especially that of Spain. Past evidence suggests that adverse macro-economic conditions exacerbate mental illness, but evidence from the current crisis is limited. This study analyses the association between the financial crisis and suicide rates in Spain. An interrupted time-series analysis of national suicide data between 2005 and 2010 was used to establish whether there has been any deviation in the underlying trend in suicide rates associated with the financial crisis. Segmented regression with a seasonally adjusted quasi-Poisson model was used for the analysis. Stratified analyses were performed to establish whether the effect of the crisis on suicides varied by region, sex and age group. The mean monthly suicide rate in Spain during the study period was 0.61 per 100 000 with an underlying trend of a 0.3% decrease per month. We found an 8.0% increase in the suicide rate above this underlying trend since the financial crisis (rate ratio 95% CI: 1.009-1.156; P = 0.03); this was robust to sensitivity analysis. A control analysis showed no change in deaths from accidental falls associated with the crisis. Stratified analyses suggested that the association between the crisis and suicide rates is greatest in the Mediterranean and Northern areas, in males and amongst those of working age. The financial crisis in Spain has been associated with a relative increase in suicides. Males and those of working age may be at particular risk of suicide associated with the crisis and may benefit from targeted interventions.
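
    A hedged sketch of a seasonally adjusted, quasi-Poisson segmented regression of monthly counts follows. The counts, the crisis month and the harmonic seasonal terms are simulated placeholders rather than values from the study, and a real analysis would add a population offset if the denominator changed over time.

```python
# Segmented quasi-Poisson sketch with simple harmonic seasonal adjustment (simulated data).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 72
t = np.arange(n)
crisis = (t >= 44).astype(int)               # hypothetical crisis onset month
month = t % 12
mu = np.exp(5.6 - 0.003 * t + 0.08 * crisis + 0.05 * np.sin(2 * np.pi * month / 12))
df = pd.DataFrame({"suicides": rng.poisson(mu), "t": t, "crisis": crisis, "month": month})

model = smf.glm(
    "suicides ~ t + crisis + np.sin(2*np.pi*month/12) + np.cos(2*np.pi*month/12)",
    data=df, family=sm.families.Poisson())
# scale='X2' estimates dispersion from the Pearson chi-square, mimicking quasi-Poisson.
fit = model.fit(scale="X2")
print(np.exp(fit.params["crisis"]))          # rate ratio for the step change at the crisis
```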

  2. The impact of economic austerity and prosperity events on suicide in Greece: a 30-year interrupted time-series analysis

    Science.gov (United States)

    Branas, Charles C; Kastanaki, Anastasia E; Michalodimitrakis, Manolis; Tzougas, John; Kranioti, Elena F; Theodorakis, Pavlos N; Carr, Brendan G; Wiebe, Douglas J

    2015-01-01

    Objectives To complete a 30-year interrupted time-series analysis of the impact of austerity-related and prosperity-related events on the occurrence of suicide across Greece. Setting Greece from 1 January 1983 to 31 December 2012. Participants A total of 11 505 suicides, 9079 by men and 2426 by women, occurring in Greece over the study period. Primary and secondary outcomes National data from the Hellenic Statistical Authority assembled as 360 monthly counts of: all suicides, male suicides, female suicides and all suicides plus potentially misclassified suicides. Results In 30 years, the highest months of suicide in Greece occurred in 2012. The passage of new austerity measures in June 2011 marked the beginning of significant, abrupt and sustained increases in total suicides (+35.7%). Suicides by men also underwent a significant, abrupt and sustained increase when the Greek recession began (+13.1%, p<0.01), and an abrupt but temporary increase in April 2012 following a public suicide committed in response to austerity conditions (+29.7%, p<0.05). Suicides by women in Greece also underwent an abrupt and sustained increase in May 2011 following austerity-related events (+35.8%, p<0.05). One prosperity-related event, the January 2002 launch of the Euro in Greece, marked an abrupt but temporary decrease in male suicides (−27.1%, p<0.05). Conclusions This is the first multidecade, national analysis of suicide in Greece using monthly data. Select austerity-related events in Greece corresponded to statistically significant increases for suicides overall, as well as for suicides among men and women. The consideration of future austerity measures should give greater weight to the unintended mental health consequences that may follow and the public messaging of these policies and related events. PMID:25643700

  3. Reducing waiting time and raising outpatient satisfaction in a Chinese public tertiary general hospital-an interrupted time series study

    Directory of Open Access Journals (Sweden)

    Jing Sun

    2017-08-01

    Abstract Background It is globally agreed that a well-designed health system delivers timely and convenient access to health services for all patients. Many interventions aiming to reduce waiting times have been implemented in Chinese public tertiary hospitals to improve patients’ satisfaction. However, few were well-documented, and the effects were rarely measured with robust methods. Methods We conducted a longitudinal study of the length of waiting times in a public tertiary hospital in Southern China which developed comprehensive data collection systems. On average, around 60,000 outpatients and 70,000 outpatients with prescriptions per month were targeted for the study during October 2014-February 2017. We analyzed longitudinal time series data using a segmented linear regression model to assess changes in levels and trends of waiting times before and after the introduction of waiting time reduction interventions. Pearson correlation analysis was conducted to indicate the strength of association between waiting times and patient satisfaction. The statistical significance level was set at 0.05. Results The monthly average length of waiting time decreased by 3.49 min (P = 0.003) for consultations and 8.70 min (P = 0.02) for filling prescriptions in the corresponding month when the respective interventions were introduced. For filling prescriptions, the trend shifted from a slight increase at baseline to a significant decrease afterwards (P = 0.003). There was a significant negative correlation between waiting time for filling prescriptions and outpatient satisfaction with pharmacy services (r = −0.71, P = 0.004). Conclusions The interventions aimed at reducing waiting time and raising patient satisfaction in Fujian Provincial Hospital are effective. A long-lasting reduction effect on waiting time for filling prescriptions was observed because of carefully designed continuous efforts, rather than a one-time campaign, and with appropriate incentives

  4. Water Supply Interruptions and Suspected Cholera Incidence: A Time-Series Regression in the Democratic Republic of the Congo

    Science.gov (United States)

    Jeandron, Aurélie; Saidi, Jaime Mufitini; Kapama, Alois; Burhole, Manu; Birembano, Freddy; Vandevelde, Thierry; Gasparrini, Antonio; Armstrong, Ben; Cairncross, Sandy; Ensink, Jeroen H. J.

    2015-01-01

    Background The eastern provinces of the Democratic Republic of the Congo have been identified as endemic areas for cholera transmission, and despite continuous control efforts, they continue to experience regular cholera outbreaks that occasionally spread to the rest of the country. In a region where access to improved water sources is particularly poor, the question of which improvements in water access should be prioritized to address cholera transmission remains unresolved. This study aimed at investigating the temporal association between water supply interruptions and Cholera Treatment Centre (CTC) admissions in a medium-sized town. Methods and Findings Time-series patterns of daily incidence of suspected cholera cases admitted to the Cholera Treatment Centre in Uvira in South Kivu Province between 2009 and 2014 were examined in relation to the daily variations in volume of water supplied by the town water treatment plant. Quasi-Poisson regression and distributed lag nonlinear models up to 12 d were used, adjusting for daily precipitation rates, day of the week, and seasonal variations. A total of 5,745 patients over 5 y of age with acute watery diarrhoea symptoms were admitted to the CTC over the study period of 1,946 d. Following a day without tap water supply, the suspected cholera incidence rate increased on average by 155% over the next 12 d, corresponding to a rate ratio of 2.55 (95% CI: 1.54–4.24), compared to the incidence experienced after a day with optimal production (defined as the 95th percentile: 4,794 m³). Suspected cholera cases attributable to a suboptimal tap water supply reached 23.2% of total admissions (95% CI 11.4%–33.2%). Although generally reporting fewer admissions to the CTC, neighbourhoods with a higher consumption of tap water were more affected by water supply interruptions, with a rate ratio of 3.71 (95% CI: 1.91–7.20) and an attributable fraction of cases of 31.4% (95% CI: 17.3%–42.5%). The analysis did not suggest any
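
    The distributed-lag idea can be sketched crudely as follows. The study used distributed lag non-linear models within a quasi-Poisson framework; the example below collapses that to plain linear lag terms on simulated data, so it shows only the mechanics of entering lagged exposure into the regression, not the study's actual model.

```python
# Heavily simplified distributed-lag sketch (linear lags only, simulated data).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_days, max_lag = 400, 12
water = rng.uniform(0, 4800, n_days)               # daily water production in m3 (hypothetical)
cases = rng.poisson(np.exp(1.2 - 0.0002 * water))  # admissions rise after low-supply days
df = pd.DataFrame({"cases": cases, "water": water, "dow": np.arange(n_days) % 7})

for lag in range(max_lag + 1):                     # exposure today and on the previous 12 days
    df[f"water_lag{lag}"] = df["water"].shift(lag)
df = df.dropna()

lag_terms = " + ".join(f"water_lag{lag}" for lag in range(max_lag + 1))
fit = smf.glm(f"cases ~ {lag_terms} + C(dow)", data=df,
              family=sm.families.Poisson()).fit(scale="X2")   # quasi-Poisson-style dispersion
# Cumulative log rate ratio per m3 of water, summed over lags 0-12:
print(fit.params.filter(like="water_lag").sum())
```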

  5. Data for Improvement and Clinical Excellence: a report of an interrupted time series trial of feedback in home care.

    Science.gov (United States)

    Fraser, Kimberly D; Sales, Anne E; Baylon, Melba Andrea B; Schalm, Corinne; Miklavcic, John J

    2017-05-18

    There is substantial evidence about the effectiveness of audit with feedback, but to our knowledge no such study has been conducted in home care settings. The primary purpose of the Data for Improvement and Clinical Excellence - Home Care (DICE-HC) project was to evaluate the effects of audit and feedback delivered to care providers on home care client outcomes. The objective of this paper is to report the effects of feedback on four specific quality indicators: pain, falls, delirium, and hospital visits. A 10-month audit with feedback intervention study was conducted with care providers in seven home care offices in Alberta, Canada, which involved delivery of four quarterly feedback reports consisting of data derived from the Resident Assessment Instrument - Home Care (RAI-HC). The primary evaluation employed an interrupted time series design using segmented regression analysis to assess the effects of feedback reporting on the four quality indicators: pain, falls, delirium, and hospitalization. Changes in level and trend of the quality indicators were measured before, during, and after the implementation of feedback reports. Pressure ulcer reporting was analyzed as a comparator condition not included in the feedback report. Care providers were surveyed on responses to feedback reporting which informed a process evaluation. At initiation of feedback report implementation, the percentage of clients reporting pain and falls significantly increased. Though the percentage of clients reporting pain and falls tended to increase and reporting of delirium and hospital visits tended to decrease relative to the pre-intervention period, there was no significant effect of feedback reporting on quality indicators during the 10-month intervention. The percentage of clients reporting falls, delirium, and hospital visits significantly increased in the 6-month period following feedback reporting relative to the intervention period. About 50% of the care providers that read and understand
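
    Because this design measures level and trend before, during and after feedback reporting, the segmented regression needs two sets of level and slope terms. The sketch below uses simulated data and hypothetical phase boundaries; it illustrates the general three-phase specification, not the DICE-HC analysis itself.

```python
# Three-phase segmented regression sketch (pre-intervention, feedback period, follow-up).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 30
t = np.arange(n)
start, end = 10, 20                        # hypothetical feedback start and end months

df = pd.DataFrame({"t": t})
df["during"] = ((t >= start) & (t < end)).astype(int)
df["after"] = (t >= end).astype(int)
df["t_during"] = np.clip(t - start, 0, end - start)
df["t_after"] = np.maximum(0, t - end)
# Simulated quality indicator: % of clients with documented pain
df["pct_pain"] = (30 + 0.1 * t + 3 * df["during"] + 0.2 * df["t_during"]
                  + 1.0 * df["after"] + rng.normal(0, 1.5, n))

# 'during'/'after' give level changes at each phase boundary; 't_during'/'t_after'
# give the corresponding slope changes relative to the baseline trend in 't'.
fit = smf.ols("pct_pain ~ t + during + t_during + after + t_after", data=df).fit(
    cov_type="HAC", cov_kwds={"maxlags": 2})
print(fit.params)
```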

  6. Effect of a population-level performance dashboard intervention on maternal-newborn outcomes: an interrupted time series study.

    Science.gov (United States)

    Weiss, Deborah; Dunn, Sandra I; Sprague, Ann E; Fell, Deshayne B; Grimshaw, Jeremy M; Darling, Elizabeth; Graham, Ian D; Harrold, JoAnn; Smith, Graeme N; Peterson, Wendy E; Reszel, Jessica; Lanes, Andrea; Walker, Mark C; Taljaard, Monica

    2017-11-24

    To assess the effect of the Maternal Newborn Dashboard on six key clinical performance indicators in the province of Ontario, Canada. Interrupted time series using population-based data from the provincial birth registry covering a 3-year period before implementation of the Dashboard and 2.5 years after implementation (November 2009 through March 2015). All hospitals in the province of Ontario providing maternal-newborn care (n=94). A hospital-based online audit and feedback programme. Rates of the six performance indicators included in the Dashboard. 2.5 years after implementation, the audit and feedback programme was associated with statistically significant absolute decreases in the rates of episiotomy (decrease of 1.5 per 100 women, 95% CI 0.64 to 2.39), induction for postdates in women who were less than 41 weeks at delivery (decrease of 11.7 per 100 women, 95% CI 7.4 to 16.0), repeat caesarean delivery in low-risk women performed before 39 weeks (decrease of 10.4 per 100 women, 95% CI 9.3 to 11.5) and an absolute increase in the rate of appropriately timed group B streptococcus screening (increase of 2.8 per 100, 95% CI 2.2 to 3.5). The audit and feedback programme did not significantly affect the rates of unsatisfactory newborn screening blood samples or formula supplementation at discharge. No statistically significant effects were observed for the two internal control outcomes or the four external control indicators-in fact, two external control indicators (episiotomy and postdates induction) worsened relative to before implementation. An electronic audit and feedback programme implemented in maternal-newborn hospitals was associated with clinically relevant practice improvements at the provincial level in the majority of targeted indicators. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  7. Effects of prescription restrictive interventions on antibiotic procurement in primary care settings: a controlled interrupted time series study in China.

    Science.gov (United States)

    Tang, Yuqing; Liu, Chaojie; Zhang, Zinan; Zhang, Xinping

    2018-01-01

    The overuse of antibiotics has been identified as a major challenge in regard to the rational prescription of medicines in low and middle income countries. Extensive studies on the effectiveness of persuasive interventions, such as guidelines have been undertaken. There is a dearth of research pertaining to the effects of restrictive interventions. This study aimed to evaluate the impacts of prescription restrictions in relation to types and administration routes of antibiotics on antibiotic procurement in primary care settings in China. Data were drawn from the monthly procurement records of medicines for primary care institutions in Hubei province over a 31-month period from May 2011 to November 2013. We analyzed the monthly procurement volume and costs of antibiotics. Interrupted time series analyses with a difference-in-difference approach were performed to evaluate the effect of the restrictive intervention (started in August 2012) on antibiotic procurement in comparison with those for cardiovascular conditions. Sensitivity tests were performed by replacing outliers using a simple linear interpolation technique. Over the entire study period, antibiotics accounted for 33.65% of the total costs of medicines procured for primary care institutions: mostly non-restricted antibiotics (86.03%) and antibiotics administered through parenteral routes (79.59%). On average, 17.14 million defined daily doses (DDDs) of antibiotics were procured per month, with the majority (93.09%) for non-restricted antibiotics and over half (52.38%) for parenteral administered antibiotics. The restrictive intervention was associated with a decline in the secular trend of costs for non-restricted oral antibiotics (- 0.36 million Yuan per month, p = 0.029), and for parenteral administered restricted antibiotics (- 0.28 million Yuan per month, p = 0.019), as well as a decline in the secular trend of procurement volume for parenteral administered non-restricted antibiotics (- 0
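
    A controlled interrupted time series with a difference-in-difference parameterisation, as described above, can be sketched as follows; the series names, the policy month and the simulated costs are assumptions, not the study's data.

```python
# Controlled ITS / difference-in-differences sketch: intervention vs. control series.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 31                                                  # months of procurement data
frames = []
for name, trend_change in [("antibiotic", -0.36), ("cardio", 0.0)]:
    t = np.arange(n)
    level = (t >= 15).astype(int)                       # hypothetical policy month
    slope = np.maximum(0, t - 15)
    cost = 10 + 0.05 * t + trend_change * slope + rng.normal(0, 0.3, n)
    frames.append(pd.DataFrame({"treated": int(name == "antibiotic"), "t": t,
                                "level": level, "slope": slope, "cost": cost}))
df = pd.concat(frames, ignore_index=True)

# The treated:level and treated:slope interactions are the controlled-ITS estimates of
# the policy's immediate effect and its effect on trend, net of the control series.
fit = smf.ols("cost ~ treated * (t + level + slope)", data=df).fit()
print(fit.params.filter(like="treated:"))
# In practice, serial correlation within each series would also need handling
# (e.g. Newey-West errors or ARMA error terms).
```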

  8. Effectiveness of employer financial incentives in reducing time to report worker injury: an interrupted time series study of two Australian workers' compensation jurisdictions.

    Science.gov (United States)

    Lane, Tyler J; Gray, Shannon; Hassani-Mahmooei, Behrooz; Collie, Alex

    2018-01-05

    Early intervention following occupational injury can improve health outcomes and reduce the duration and cost of workers' compensation claims. Financial early reporting incentives (ERIs) for employers may shorten the time between injury and access to compensation benefits and services. We examined the effect of ERIs on time spent in the claim lodgement process in two Australian states: South Australia (SA), which introduced them in January 2009, and Tasmania (TAS), which introduced them in July 2010. Using administrative records of 1.47 million claims lodged between July 2006 and June 2012, we conducted an interrupted time series study of ERI impact on monthly median days in the claim lodgement process. Time periods included claim reporting, insurer decision, and total time. The 18-month gap in implementation between the states allowed for a multiple baseline design. In SA, we analysed periods within claim reporting: worker and employer reporting times (similar data were not available in TAS). To account for external threats to validity, we examined impact in reference to a comparator of other Australian workers' compensation jurisdictions. Total time in the process did not immediately change, though the trend significantly decreased in both jurisdictions (SA: -0.36 days per month, 95% CI -0.63 to -0.09; TAS: -0.35, -0.50 to -0.20). Claim reporting time also decreased in both (SA: -1.6 days, -2.4 to -0.8; TAS: -5.4, -7.4 to -3.3). In TAS, there was a significant increase in insurer decision time (4.6, 3.9 to 5.4) and a similar but non-significant pattern in SA. In SA, worker reporting time significantly decreased (-4.7, -5.8 to -3.5), but employer reporting time did not (-0.3, -0.8 to 0.2). The results suggest that ERIs reduced claim lodgement time and, in the long-term, reduced total time in the claim lodgement process. However, only worker reporting time significantly decreased in SA, indicating that ERIs may not have shortened the process through the intended target of

  9. An Interrupted Time-Series Analysis to Assess Impact of Introduction of Co-Payment on Emergency Room Visits in Cyprus.

    Science.gov (United States)

    Petrou, Panagiotis

    2015-10-01

    A co-payment fee of EUR10 was introduced in Cyprus, in order to cope with overcrowding of emergency room services. The scope of this paper is the assessment of the short-term impact of this measure. We used an interrupted time-series autoregressive integrated moving average model, and we analyzed official data from Cyprus' largest emergency room facility for three years. Co-payment is associated with a 16% statistically significant reduction of emergency room visits. No impact was observed in categories of teenagers, children, infants, and people over 70 years old. Co-payment was proven to be effective in Cyprus' emergency room setting and is expected to lessen congestion in the emergency room. The price insensitivity of people aged over 70 years, teenagers, children and infants, merits additional research for the identification of the underlying reasons.
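
    An ARIMA intervention analysis of this kind can be sketched with a step regressor marking the introduction of the fee. The model order, dates and simulated visit counts below are illustrative placeholders, not the specification used in the study.

```python
# ARIMA intervention sketch: a step dummy for the co-payment enters as an exogenous regressor.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(6)
n = 36
idx = pd.date_range("2011-01-01", periods=n, freq="MS")
step = pd.Series((np.arange(n) >= 24).astype(float), index=idx, name="copayment")
visits = (5000 + 200 * np.sin(2 * np.pi * np.arange(n) / 12)
          - 800 * step.to_numpy() + rng.normal(0, 100, n))
y = pd.Series(visits, index=idx, name="er_visits")

model = SARIMAX(y, exog=step, order=(1, 0, 0), seasonal_order=(1, 0, 0, 12), trend="c")
fit = model.fit(disp=False)
print(fit.params["copayment"])   # estimated level shift in monthly ER visits after the fee
```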

  10. Did the Great Recession increase suicides in the USA? Evidence from an interrupted time-series analysis.

    Science.gov (United States)

    Harper, Sam; Bruckner, Tim A

    2017-07-01

    Research suggests that the Great Recession of 2007-2009 led to nearly 5000 excess suicides in the United States. However, prior work has not accounted for seasonal patterning and unique suicide trends by age and gender. We calculated monthly suicide rates from 1999 to 2013 for men and women aged 15 and above. Suicide rates before the Great Recession were used to predict the rate during and after the Great Recession. Death rates for each age-gender group were modeled using Poisson regression with robust variance, accounting for seasonal and nonlinear suicide trajectories. There were 56,658 suicide deaths during the Great Recession. Age- and gender-specific suicide trends before the recession demonstrated clear seasonal and nonlinear trajectories. Our models predicted 57,140 expected suicide deaths, leading to 482 fewer observed than expected suicides (95% confidence interval -2079, 943). We found little evidence to suggest that the Great Recession interrupted existing trajectories of suicide rates. Suicide rates were already increasing before the Great Recession for middle-aged men and women. Future studies estimating the impact of recessions on suicide should account for the diverse and unique suicide trajectories of different social groups. Copyright © 2017 Elsevier Inc. All rights reserved.
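
    The "observed versus expected" strategy described above can be sketched as follows: fit a seasonal model with a flexible trend to the pre-recession months only, project it forward, and compare the projection with the observed counts. All dates, counts and the cut point in the sketch are simulated.

```python
# Counterfactual "expected vs. observed" sketch using a seasonal Poisson model (simulated data).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 180                                       # 15 years of monthly data
t = np.arange(n)
month = t % 12
deaths = rng.poisson(np.exp(7.0 + 0.001 * t + 0.06 * np.sin(2 * np.pi * month / 12)))
df = pd.DataFrame({"deaths": deaths, "t": t, "month": month})

cut = 108                                     # hypothetical index of the recession onset
pre, post = df[df["t"] < cut], df[df["t"] >= cut]

fit = smf.glm("deaths ~ t + I(t**2) + np.sin(2*np.pi*month/12) + np.cos(2*np.pi*month/12)",
              data=pre, family=sm.families.Poisson()).fit(cov_type="HC0")  # robust variance

expected = fit.predict(post)                  # counterfactual monthly counts
print(f"observed minus expected deaths: {post['deaths'].sum() - expected.sum():.0f}")
```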

  11. Reductions in cardiovascular, cerebrovascular, and respiratory mortality following the national irish smoking ban: interrupted time-series analysis.

    Directory of Open Access Journals (Sweden)

    Sericea Stallings-Smith

    BACKGROUND: Previous studies have shown decreases in cardiovascular mortality following the implementation of comprehensive smoking bans. It is not known whether cerebrovascular or respiratory mortality decreases post-ban. On March 29, 2004, the Republic of Ireland became the first country in the world to implement a national workplace smoking ban. The aim of this study was to assess the effect of this policy on all-cause and cause-specific, non-trauma mortality. METHODS: A time-series epidemiologic assessment was conducted, utilizing Poisson regression to examine weekly age and gender-standardized rates for 215,878 non-trauma deaths in the Irish population, ages ≥35 years. The study period was from January 1, 2000, to December 31, 2007, with a post-ban follow-up of 3.75 years. All models were adjusted for time trend, season, influenza, and smoking prevalence. RESULTS: Following ban implementation, an immediate 13% decrease in all-cause mortality (RR: 0.87; 95% CI: 0.76-0.99), a 26% reduction in ischemic heart disease (IHD) mortality (RR: 0.74; 95% CI: 0.63-0.88), a 32% reduction in stroke mortality (RR: 0.68; 95% CI: 0.54-0.85), and a 38% reduction in chronic obstructive pulmonary disease (COPD) mortality (RR: 0.62; 95% CI: 0.46-0.83) were observed. Post-ban reductions in IHD, stroke, and COPD mortalities were seen in ages ≥65 years, but not in ages 35-64 years. COPD mortality reductions were found only in females (RR: 0.47; 95% CI: 0.32-0.70). Post-ban annual trend reductions were not detected for any smoking-related causes of death. Unadjusted estimates indicate that 3,726 (95% CI: 2,305-4,629) smoking-related deaths were likely prevented post-ban. Mortality decreases were primarily due to reductions in passive smoking. CONCLUSIONS: The national Irish smoking ban was associated with immediate reductions in early mortality. Importantly, post-ban risk differences did not change with a longer follow-up period. This study corroborates previous evidence for cardiovascular

  12. An electronic trigger tool to optimise intravenous to oral antibiotic switch: a controlled, interrupted time series study

    NARCIS (Netherlands)

    Berrevoets, M.A.H.; Pot, J.; Houterman, A.E.; Dofferhoff, A.; Nabuurs-Franssen, M.H.; Fleuren, H.; Kullberg, B.J.; Schouten, J.A.; Sprong, T.

    2017-01-01

    BACKGROUND: Timely switch from intravenous (iv) antibiotics to oral therapy is a key component of antimicrobial stewardship programs in order to improve patient safety, promote early discharge and reduce costs. We have introduced a time-efficient and easily implementable intervention that relies on

  13. Interrupted time series analysis of children’s blood lead levels: A case study of lead hazard control program in Syracuse, New York

    Science.gov (United States)

    Shao, Liyang; Zhang, Lianjun; Zhen, Zhen

    2017-01-01

    Children’s blood lead concentrations have been closely monitored over the last two decades in the United States. The bio-monitoring surveillance data collected in local agencies reflected the local temporal trends of children’s blood lead levels (BLLs). However, the analysis and modeling of the long-term time series of BLLs have rarely been reported. We attempted to quantify the long-term trends of children’s BLLs in the city of Syracuse, New York and evaluate the impacts of local lead poisoning prevention programs and the Lead Hazard Control Program on reducing children’s BLLs. We applied interrupted time series analysis to the monthly time series of BLL surveillance data and used ARMA (autoregressive moving average) models to measure the average blood lead level shift and detect changes in the seasonal pattern. Our results showed that there were three intervention stages over the past 20 years to reduce children’s BLLs in the city of Syracuse, NY. The average of children’s BLLs decreased significantly after the interventions, declining from 8.77 μg/dL to 3.94 μg/dL between 1992 and 2011. The seasonal variation diminished over the past decade, but more short-term influences appeared in the variation. The lead hazard control treatment intervention proved effective in reducing children’s blood lead levels in Syracuse, NY. Also, the reduction of the seasonal variation of children’s BLLs reflected the impacts of the local lead-based paint mitigation program. Window and door replacement was the major cost of household lead abatement. However, soil lead was not considered a major source of lead hazard in our analysis. PMID:28182688

  14. Total and cause-specific mortality before and after the onset of the Greek economic crisis: an interrupted time-series analysis.

    Science.gov (United States)

    Laliotis, Ioannis; Ioannidis, John P A; Stavropoulou, Charitini

    2016-12-01

    Greece was one of the countries hit the hardest by the 2008 financial crisis in Europe. Yet, evidence on the effect of the crisis on total and cause-specific mortality remains unclear. We explored whether the economic crisis affected the trend of overall and cause-specific mortality rates. We used regional panel data from the Hellenic Statistical Authority to assess mortality trends by age, sex, region, and cause in Greece between January, 2001, and December, 2013. We used Eurostat data to calculate monthly age-standardised mortality rates per 100 000 inhabitants for each region. Data were divided into two subperiods: before the crisis (January, 2001, to August, 2008) and after the onset of the crisis (September, 2008, to December, 2013). We tested for changes in the slope of mortality by doing an interrupted time-series analysis. Overall mortality continued to decline after the onset of the financial crisis (-0·065, 95% CI -0·080 to -0·049), but at a slower pace than before the crisis (-0·13, -0·15 to -0·10; trend difference 0·062, 95% CI 0·041 to 0·083). Comparing mortality in the period after the onset of the crisis with extrapolated values based on the period before the crisis, we estimate that an extra 242 deaths per month occurred after the onset of the crisis. Mortality trends were interrupted after the onset of the crisis compared with before it, but changes vary by age, sex, and cause of death. The increase in deaths due to adverse events during medical treatment might reflect the effects of deterioration in quality of care during economic recessions. Copyright © 2016 The Author(s). Published by Elsevier Ltd. This is an Open Access article under the CC BY license. All rights reserved.

  15. The effect of reduced street lighting on road casualties and crime in England and Wales: controlled interrupted time series analysis.

    Science.gov (United States)

    Steinbach, Rebecca; Perkins, Chloe; Tompson, Lisa; Johnson, Shane; Armstrong, Ben; Green, Judith; Grundy, Chris; Wilkinson, Paul; Edwards, Phil

    2015-11-01

    Many local authorities in England and Wales have reduced street lighting at night to save money and reduce carbon emissions. There is no evidence to date on whether these reductions impact on public health. We quantified the effect of 4 street lighting adaptation strategies (switch off, part-night lighting, dimming and white light) on casualties and crime in England and Wales. Observational study based on analysis of geographically coded police data on road traffic collisions and crime in 62 local authorities. Conditional Poisson models were used to analyse longitudinal changes in the counts of night-time collisions occurring on affected roads during 2000-2013, and crime within census Middle Super Output Areas during 2010-2013. Effect estimates were adjusted for regional temporal trends in casualties and crime. There was no evidence that any street lighting adaptation strategy was associated with a change in collisions at night. There was significant statistical heterogeneity in the effects on crime estimated at police force level. Overall, there was no evidence for an association between the aggregate count of crime and switch off (RR 0.11; 95% CI 0.01 to 2.75) or part-night lighting (RR 0.96; 95% CI 0.86 to 1.06). There was weak evidence for a reduction in the aggregate count of crime and dimming (RR 0.84; 95% CI 0.70 to 1.02) and white light (RR 0.89; 95% CI 0.77 to 1.03). This study found little evidence of harmful effects of switch off, part-night lighting, dimming, or changes to white light/LEDs on road collisions or crime in England and Wales. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
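
    The conditional Poisson model used here conditions out each area's baseline rate. For Poisson regression, an ordinary Poisson GLM with area fixed effects yields the same slope estimates, so the simulated sketch below uses that unconditional analogue; the areas, periods and effect sizes are all hypothetical.

```python
# Fixed-effects Poisson sketch as an analogue of a conditional Poisson area-level model.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
n_areas, n_periods = 40, 8
rows = []
for area in range(n_areas):
    baseline = rng.uniform(0.5, 3.0)                   # area-specific collision rate
    adapted = rng.random() < 0.5                       # did this area adapt its lighting?
    for period in range(n_periods):
        post = int(period >= 4)                        # adaptation midway through the series
        rate = baseline * np.exp(0.02 * period - 0.05 * post * adapted)
        rows.append({"area": area, "period": period,
                     "post_adapted": post * int(adapted),
                     "collisions": rng.poisson(rate * 10)})  # 10 = arbitrary exposure scale
df = pd.DataFrame(rows)

# Area dummies absorb baseline differences; period dummies absorb common temporal trends.
fit = smf.glm("collisions ~ C(area) + C(period) + post_adapted",
              data=df, family=sm.families.Poisson()).fit()
print(np.exp(fit.params["post_adapted"]))              # rate ratio for adapted areas post-change
```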

  16. Evaluation of the impact of support for nursing research on scientific productivity in seven Italian hospitals: A multiple interrupted time series study.

    Science.gov (United States)

    Chiari, Paolo; Forni, Cristiana; Zeneli, Anita; Gianesini, Gloria; Zanin, Roberta; Braglia, Luca; Cavuto, Silvio; Guberti, Monica

    2016-05-01

    Nursing research is not well-developed in Italy, and knowledge of the methodologies for conducting research is lacking. In several hospitals, including those in which this study was conducted, a research center has been established to support and educate nurses on how to conduct clinical research. In this observational study, we sought to assess whether establishing a support center for nursing research has resulted in an increase in scientific production in terms of the numbers of protocols approved (primary outcome), articles published and nurse authors involved in the publications (secondary outcomes). Multiple interrupted time series. Data from 2002 to 2012 were collected in seven hospitals. Research centers have been established at various times in only four of these hospitals. A statistically significant increase in the primary outcome (the number of protocols approved by the Research Ethics Committee in which the principal investigator was a nurse) was observed in two hospitals approximately 2 years after establishing a research center. The number of nursing research articles published in scientific journals with an impact factor increased but was not statistically significant. Finally, the number of nurse authors increased significantly in two hospitals with support units. Definitive conclusions could not be reached for the other two experimental hospitals because notably few post-intervention data were available. In the control hospitals, the scientific production outcomes did not change. This study shows that establishing a support center for nursing research inside hospitals can facilitate the production of research. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Acute effects of aircraft noise on cardiovascular admissions - an interrupted time-series analysis of a six-day closure of London Heathrow Airport caused by volcanic ash.

    Science.gov (United States)

    Pearson, Tim; Campbell, Michael J; Maheswaran, Ravi

    2016-08-01

    Acute noise exposure may acutely increase blood pressure but the hypothesis that acute exposure to aircraft noise may trigger cardiovascular events has not been investigated. This study took advantage of a six-day closure of a major airport in April 2010 caused by volcanic ash to examine if there was a decrease in emergency cardiovascular hospital admissions during or immediately after the closure period, using an interrupted daily time-series study design. The population living within the 55dB(A) noise contour was substantial at 0.7 million. The average daily admission count was 13.9 (SD 4.4). After adjustment for covariates, there was no evidence of a decreased risk of hospital admission from cardiovascular disease during the closure period (relative risk 0.97 (95% CI 0.75-1.26)). Using lags of 1-7 days gave similar results. Further studies are needed to investigate if transient aircraft noise exposure can trigger acute cardiovascular events. Copyright © 2016. Published by Elsevier Ltd.

  18. Incidence of hip and knee replacement in patients with rheumatoid arthritis following the introduction of biological DMARDs: an interrupted time-series analysis using nationwide Danish healthcare registers.

    Science.gov (United States)

    Cordtz, René Lindholm; Hawley, Samuel; Prieto-Alhambra, Daniel; Højgaard, Pil; Zobbe, Kristian; Overgaard, Søren; Odgaard, Anders; Kristensen, Lars Erik; Dreyer, Lene

    2018-05-01

    To study the impact of the introduction of biological disease-modifying anti-rheumatic drugs (bDMARDs) and associated rheumatoid arthritis (RA) management guidelines on the incidence of total hip (THR) and knee replacements (TKR) in Denmark. Nationwide register-based cohort and interrupted time-series analysis. Patients with incident RA between 1996 and 2011 were identified in the Danish National Patient Register. Patients with RA were matched on age, sex and municipality with up to 10 general population comparators (GPCs). Standardised 5-year incidence rates of THR and TKR per 1000 person-years were calculated for patients with RA and GPCs in 6-month periods. Levels and trends in the pre-bDMARD (1996-2001) were compared with the bDMARD era (2003-2016) using segmented linear regression interrupted by a 1-year lag period (2002). We identified 30 404 patients with incident RA and 297 916 GPCs. In 1996, the incidence rate of THR and TKR was 8.72 and 5.87, respectively, among patients with RA, and 2.89 and 0.42 in GPCs. From 1996 to 2016, the incidence rate of THR decreased among patients with RA, but increased among GPCs. Among patients with RA, the incidence rate of TKR increased from 1996 to 2001, but started to decrease from 2003 and throughout the bDMARD era. The incidence of TKR increased among GPCs from 1996 to 2016. We report that the incidence rate of THR and TKR was 3-fold and 14-fold higher, respectively among patients with RA compared with GPCs in 1996. In patients with RA, introduction of bDMARDs was associated with a decreasing incidence rate of TKR, whereas the incidence of THR had started to decrease before bDMARD introduction. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  19. Impact of prescription drug monitoring programs and pill mill laws on high-risk opioid prescribers: A comparative interrupted time series analysis.

    Science.gov (United States)

    Chang, Hsien-Yen; Lyapustina, Tatyana; Rutkow, Lainie; Daubresse, Matthew; Richey, Matt; Faul, Mark; Stuart, Elizabeth A; Alexander, G Caleb

    2016-08-01

    Prescription drug monitoring programs (PDMPs) and pill mill laws were implemented to reduce opioid-related injuries/deaths. We evaluated their effects on high-risk prescribers in Florida. We used IMS Health's LRx Lifelink database between July 2010 and September 2012 to identify opioid-prescribing prescribers in Florida (intervention state, N: 38,465) and Georgia (control state, N: 18,566). The pre-intervention, intervention, and post-intervention periods were: July 2010-June 2011, July 2011-September 2011, and October 2011-September 2012. High-risk prescribers were those in the top 5th percentile of opioid volume during four consecutive calendar quarters. We applied comparative interrupted time series models to evaluate policy effects on clinical practices and monthly prescribing measures for low-risk/high-risk prescribers. We identified 1526 (4.0%) high-risk prescribers in Florida, accounting for 67% of total opioid volume and 40% of total opioid prescriptions. Relative to their lower-risk counterparts, they wrote sixteen times more monthly opioid prescriptions (79 vs. 5) and had a higher proportion of prescription-filling patients receiving opioids (47% vs. 19%). Following policy implementation, high-risk prescribers experienced statistically significant relative reductions in monthly patients receiving opioids and in opioid prescriptions (-536 patients/month, 95% confidence intervals [CI] -829 to -243; -847 prescriptions/month, CI -1498 to -197), morphine equivalent dose (-0.88 mg/month, CI -1.13 to -0.62), and total opioid volume (-3.88 kg/month, CI -5.14 to -2.62). Low-risk providers did not experience statistically significant relative reductions, nor did policy implementation affect the status of being high- vs. low-risk prescribers. High-risk prescribers are disproportionately responsive to state policies. However, opioid prescribing remains highly concentrated among high-risk providers. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  20. Impact of an infectious disease specialist on antifungal use: an interrupted time-series analysis in a tertiary hospital in Tokyo.

    Science.gov (United States)

    Morii, D; Ichinose, N; Yokozawa, T; Oda, T

    2018-01-08

    Antimicrobial stewardship programmes are considered essential for optimizing antimicrobial use in order to improve patient outcomes, reduce the number of adverse sequelae, prevent resistance, and ensure cost-effective therapy. To assess the efficacy and the limitations of antifungal antimicrobial stewardship programmes. A bundle to manage infectious diseases was implemented in our hospital in October 2010. Data regarding antimicrobial use density (AUD) from April 2006 to May 2016 were collected. Trends in AUD were assessed using an interrupted time-series model for three separate periods: the pre-bundle, the bundle implementation, and the long-term follow-up periods. The primary and secondary outcomes were AUD (defined daily dose (DDD) per 1000 patient-days) of intravenous antifungals and expenditure on antifungals per fiscal year, respectively. The AUD for all intravenous antifungals decreased from 26.1 in 2006 to 9.9 in 2015. Whereas the change in the trend during the pre-bundle period was not significant (slope: 0.062; 95% confidence interval (CI): -0.180 to 0.305), a significant decrease was observed in the bundle implementation period (slope: -0.535; 95% CI: -0.907 to -0.164). The trend slowed during the long-term follow-up period (slope: -0.040; 95% CI: -0.218 to 0.138). Total expenditure on antifungals decreased by 73%, from ¥52,354,411 in fiscal year 2006 to ¥14,073,099 in fiscal year 2015. The bundle significantly reduced the use of antifungals and decreased costs over time, but this effect was limited in that it had stabilized within three years. Copyright © 2018 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.

  1. Health Facility Utilisation Changes during the Introduction of Community Case Management of Malaria in South Western Uganda: An Interrupted Time Series Approach.

    Science.gov (United States)

    Lal, Sham; Ndyomugenyi, Richard; Alexander, Neal D; Lagarde, Mylene; Paintain, Lucy; Magnussen, Pascal; Chandramohan, Daniel; Clarke, Siân E

    2015-01-01

    Malaria endemic countries have scaled up community health worker (CHW) interventions to diagnose and treat malaria in communities with limited access to public health systems. The evaluations of these programmes have centred on CHWs' compliance with guidelines, but the broader changes at public health centres, including utilisation and diagnoses made, have received limited attention. This analysis was conducted during a CHW intervention for malaria in Rukungiri District, Western Uganda. Outpatient department (OPD) visit data were collected for children under 5 attending three health centres one year before the CHW intervention started (pre-intervention period) and for 20 months during the intervention (intervention period). An interrupted time series analysis with segmented regression models was used to compare the trends in malaria, non-malaria and overall OPD visits during the pre-intervention and intervention periods. Following the introduction of the CHW intervention, the frequency of diagnoses of diarrhoeal diseases, pneumonia and helminths appeared to increase, whilst the frequency of malaria diagnoses declined at health centres. In May 2010, when the intervention began, overall health centre utilisation decreased by 63% compared to the pre-intervention period, and the health centres saw 32 fewer overall visits per month compared to the pre-intervention period. Malaria visits also declined shortly after the intervention began, with 27 fewer visits per month during the intervention period compared with the pre-intervention period; these reductions in malaria visits were sustained for the entire intervention period. In contrast, there were no observable changes in trends of non-malarial visits between the pre-intervention and intervention periods. This analysis suggests introducing a CHW intervention can reduce the number of child malaria visits and change the profile of cases presenting at health centres. The reduction in workload of health workers may allow them to spend more time with

  2. The impact of “Option B” on HIV transmission from mother to child in Rwanda: An interrupted time series analysis

    Science.gov (United States)

    Abimpaye, Monique; Iyer, Hari S.; Gupta, Neil; Remera, Eric; Mugwaneza, Placidie; Law, Michael R.

    2018-01-01

    Background Nearly a quarter of a million children have acquired HIV, prompting the implementation of new protocols—Option B and B+—for treating HIV+ pregnant women. While efficacy has been demonstrated in randomized trials, there is limited real-world evidence on the impact of these changes. Using longitudinal, routinely collected data we assessed the impact of the adoption of WHO Option B in Rwanda on mother to infant transmission. Methods We used interrupted time series analysis to evaluate the impact of Option B on mother-to-child HIV transmission in Rwanda. Our primary outcome was the proportion of HIV tests in infants with positive results at six weeks of age. We included data for 20 months before and 22 months after the 2010 policy change. Results Of the 15,830 HIV tests conducted during our study period, 392 tested positive. We found a significant decrease in both the level (-2.08 positive tests per 100 tests conducted, 95% CI: -2.71 to -1.45, p < 0.001) and trend (-0.11 positive tests per 100 tests conducted per month, 95% CI: -0.16 to -0.07, p < 0.001) of test positivity. This represents an estimated 297 fewer children born with HIV in the post-policy period, or a 46% reduction in HIV transmission from mother to child. Conclusions The adoption of Option B in Rwanda contributed to an immediate decrease in the rate of HIV transmission from mother to child. This suggests other countries may benefit from adopting these WHO guidelines. PMID:29451925
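
    The level and trend changes quoted above are the two post-interruption terms of a single-interruption segmented regression. A minimal sketch, assuming monthly test-positivity values in a pandas DataFrame (the data, column names and change point below are simulated, not the study's):

```python
# Single-interruption segmented regression: immediate level change + trend change.
# Monthly positivity values are simulated; names and change point are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
months = np.arange(42)                       # 20 pre- and 22 post-policy months
post = (months >= 20).astype(int)            # 1 from the policy change onwards
time_after = np.clip(months - 20, 0, None)   # months since the policy change

positivity = (5.0 - 0.01 * months - 2.0 * post - 0.05 * time_after
              + rng.normal(0, 0.3, months.size))   # positives per 100 tests

df = pd.DataFrame({"positivity": positivity, "month": months,
                   "post": post, "time_after": time_after})

# 'post' estimates the level change at the interruption; 'time_after' the trend change.
fit = smf.ols("positivity ~ month + post + time_after", data=df).fit()
print(fit.params[["post", "time_after"]])
```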

  3. Effect of editors' implementation of CONSORT guidelines on the reporting of abstracts in high impact medical journals: interrupted time series analysis.

    Science.gov (United States)

    Hopewell, Sally; Ravaud, Philippe; Baron, Gabriel; Boutron, Isabelle

    2012-06-22

    To investigate the effect of the CONSORT for Abstracts guidelines, and different editorial policies used by five leading general medical journals to implement the guidelines, on the reporting quality of abstracts of randomised trials. Interrupted time series analysis. We randomly selected up to 60 primary reports of randomised trials per journal per year from five high impact, general medical journals in 2006-09, if indexed in PubMed with an electronic abstract. We excluded reports that did not include an electronic abstract, and any secondary trial publications or economic analyses. We classified journals into three categories: those not mentioning the guidelines in their instructions to authors (JAMA and New England Journal of Medicine), those referring to the guidelines in their instructions to authors but with no specific policy to implement them (BMJ), and those referring to the guidelines in their instructions to authors with an active policy to implement them (Annals of Internal Medicine and Lancet). Two authors extracted data independently using the CONSORT for Abstracts checklist. The primary outcome was the mean number of CONSORT items reported in selected abstracts, among nine items reported in fewer than 50% of the abstracts published across the five journals in 2006. We assessed 955 reports of abstracts of randomised trials. Journals with an active policy to enforce the guidelines showed an immediate increase in the mean number of items reported (increase of 1.50 items; P=0.0037). At 23 months after publication of the guidelines, the mean number of items reported per abstract for the primary outcome was 5.41 of nine items, a 53% increase compared with the expected level estimated on the basis of pre-intervention trends. No significant change in level or trend was seen in journals without an active policy to enforce the guidelines (BMJ, JAMA, and New England Journal of Medicine). Active implementation of the CONSORT for Abstracts guidelines by journals can lead to improvements in the

  4. Retrospective interrupted time series examining hypertension and diabetes medicines usage following changes in patient cost sharing in the 'Farmácia Popular' programme in Brazil.

    Science.gov (United States)

    Emmerick, Isabel Cristina Martins; Campos, Monica Rodrigues; Luiza, Vera Lucia; Chaves, Luisa Arueira; Bertoldi, Andrea Dâmaso; Ross-Degnan, Dennis

    2017-11-03

    The 'Farmácia Popular' (FP) programme was launched in 2004, expanded in 2006, and changed the cost sharing for oral hypoglycaemic (OH) and antihypertensive (AH) medicines in 2009 and in 2011. This paper describes patterns of usage and continuity of coverage for OH and AH medicines following changes in patient cost sharing in the FP. Interrupted time series study using retrospective administrative data. Monthly programme participation (PP) and proportion of days covered (PDC) were the two outcome measures. The open cohort included all patients with two or more dispensings for a given study medicine in 2008-2012. The interventions were an increase in patient cost sharing in 2009 and zero patient cost sharing for key medicines in 2011. A total of 3.6 and 9.5 million patients receiving treatment for diabetes and hypertension, respectively, qualified for the study. Before the interventions, PP was growing by 7.3% per month; median PDC varied by medicine from 50% to 75%. After patient cost sharing increased in 2009, PP fell by 56.5% and PDC decreased for most medicines (median 60.3%). After the 2011 free medicine programme, PP surged by 121 000 new dispensings per month and PDC increased for all covered medicines (80.7%). Cost sharing was found to be a barrier to continuity of treatment in Brazil's private sector FP programme. Making essential medicines free to patients appears to increase participation and continuity of treatment to clinically beneficial levels (PDC >80%). © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
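
    Proportion of days covered (PDC), the adherence measure used above, is conventionally computed as the number of days covered by dispensed supply divided by the days in the observation window. A minimal pandas sketch under that assumption (field names, the 365-day window and the example records are hypothetical):

```python
# Proportion of days covered (PDC) from dispensing records.
# Field names, the fixed 365-day window, and the records are illustrative only.
import pandas as pd

dispensings = pd.DataFrame({
    "patient_id": [1, 1, 1, 2, 2],
    "dispense_date": pd.to_datetime(["2011-01-10", "2011-02-12", "2011-04-01",
                                     "2011-01-05", "2011-07-20"]),
    "days_supply": [30, 30, 30, 90, 90],
})

WINDOW_START = pd.Timestamp("2011-01-01")
WINDOW_DAYS = 365

def pdc(group: pd.DataFrame) -> float:
    """Share of window days covered by at least one dispensing."""
    covered = set()
    for _, row in group.iterrows():
        start = (row["dispense_date"] - WINDOW_START).days
        stop = min(start + row["days_supply"], WINDOW_DAYS)
        covered.update(range(max(start, 0), stop))
    return len(covered) / WINDOW_DAYS

print(dispensings.groupby("patient_id").apply(pdc))
```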

  5. Effectiveness of facilitated introduction of a standard operating procedure into routine processes in the operating theatre: a controlled interrupted time series.

    Science.gov (United States)

    Morgan, Lauren; New, Steve; Robertson, Eleanor; Collins, Gary; Rivero-Arias, Oliver; Catchpole, Ken; Pickering, Sharon P; Hadi, Mohammed; Griffin, Damian; McCulloch, Peter

    2015-02-01

    Standard operating procedures (SOPs) should improve safety in the operating theatre, but controlled studies evaluating the effect of staff-led implementation are needed. In a controlled interrupted time series, we evaluated three team process measures (compliance with WHO surgical safety checklist, non-technical skills and technical performance) and three clinical outcome measures (length of hospital stay, complications and readmissions) before and after a 3-month staff-led development of SOPs. Process measures were evaluated by direct observation, using Oxford Non-Technical Skills II for non-technical skills and the 'glitch count' for technical performance. All staff in two orthopaedic operating theatres were trained in the principles of SOPs and then assisted to develop standardised procedures. Staff in a control operating theatre underwent the same observations but received no training. The change in difference between active and control groups was compared before and after the intervention using repeated measures analysis of variance. We observed 50 operations before and 55 after the intervention and analysed clinical data on 1022 and 861 operations, respectively. The staff chose to structure their efforts around revising the 'whiteboard' which documented and prompted tasks, rather than directly addressing specific task problems. Although staff preferred and sustained the new system, we found no significant differences in process or outcome measures before/after intervention in the active versus the control group. There was a secular trend towards worse outcomes in the postintervention period, seen in both active and control theatres. SOPs when developed and introduced by frontline staff do not necessarily improve operative processes or outcomes. The inherent tension in improvement work between giving staff ownership of improvement and maintaining control of direction needs to be managed, to ensure staff are engaged but invest energy in appropriate change
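
    A controlled interrupted time series of this kind is often parameterised by interacting the level- and slope-change terms with an active/control indicator, so the coefficients of interest are the between-group differences in those changes. The sketch below illustrates that regression form on simulated data; it is not the repeated-measures ANOVA the authors report:

```python
# Controlled ITS: interact the level- and slope-change terms with a group flag.
# Simulated weekly 'glitch' counts; a regression sketch, not the authors' ANOVA.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
weeks = np.arange(40)
post = (weeks >= 20).astype(int)             # 1 after the SOP intervention
since = np.clip(weeks - 20, 0, None)         # weeks since the intervention

frames = []
for active in (0, 1):                        # 1 = intervention theatres, 0 = control
    y = 10 - 0.05 * weeks - 1.5 * post * active + rng.normal(0, 1, weeks.size)
    frames.append(pd.DataFrame({"glitches": y, "week": weeks, "post": post,
                                "since": since, "active": active}))
df = pd.concat(frames, ignore_index=True)

# active:post = between-group difference in the immediate change;
# active:since = between-group difference in the slope change.
fit = smf.ols("glitches ~ week + post + since + active"
              " + active:week + active:post + active:since", data=df).fit()
print(fit.params[["active:post", "active:since"]])
```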

  6. Impact of the economic recession and subsequent austerity on suicide and self-harm in Ireland: An interrupted time series analysis.

    Science.gov (United States)

    Corcoran, Paul; Griffin, Eve; Arensman, Ella; Fitzgerald, Anthony P; Perry, Ivan J

    2015-06-01

    The recent economic recession has been associated with short-term increases in suicide in many countries. Data are lacking on the longer-term effect on suicide and on the impact on non-fatal suicidal behaviour. Using interrupted time series analyses, we have assessed the impact of economic recession and austerity in Ireland on national rates of suicide mortality and self-harm presentations to hospital in 2008-12. By the end of 2012, the male suicide rate was 57% higher [+8.7 per 100,000, 95% confidence interval (CI), 4.8 to 12.5] than if the pre-recession trend continued, whereas female suicide was almost unchanged (+0.3 per 100,000, 95% CI, -1.1 to 1.8). Male and female self-harm rates were 31% higher (+74.1 per 100,000, 95% CI, -6.3 to 154.6) and 22% higher (+63.2 per 100,000, 95% CI, 4.1 to 122.2), respectively. There were 476 more male (95% CI, 274 to 678) and 85 more female (95% CI, -9 to 180) suicide deaths and 5029 more male (95% CI, 626 to 9432) and 3833 more female (95% CI, 321 to 7345) self-harm presentations to hospital in 2008-12 than if pre-recession trends had continued. Men aged 25-64 years were affected in terms of suicide and self-harm with the greatest impact observed in 25-44 year-olds. The increase in self-harm by women was among 15-24 year-olds. Five years of economic recession and austerity in Ireland have had a significant negative impact on rates of suicide in men and on self-harm in both sexes. © The Author 2015; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.
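
    The excess-event estimates above come from projecting the pre-recession trend forward and summing observed minus expected counts. A minimal sketch of that projection step on simulated quarterly counts (confidence intervals, omitted here, would come from the model's prediction variance):

```python
# Project the pre-interruption trend forward and sum observed minus expected events.
# Quarterly death counts are simulated; the change point is hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
quarters = np.arange(40)
interruption = 20                            # hypothetical recession-onset quarter
deaths = (110 + 0.2 * quarters + 9 * (quarters >= interruption)
          + rng.normal(0, 5, quarters.size))
df = pd.DataFrame({"deaths": deaths, "quarter": quarters})

pre = df[df.quarter < interruption]
post = df[df.quarter >= interruption]

trend = smf.ols("deaths ~ quarter", data=pre).fit()   # pre-interruption trend
expected = trend.predict(post)                        # counterfactual projection
excess = (post["deaths"] - expected).sum()
print(f"estimated excess events after the interruption: {excess:.0f}")
```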

  7. Impact of rapid diagnostic tests for the diagnosis and treatment of malaria at a peripheral health facility in Western Uganda: an interrupted time series analysis.

    Science.gov (United States)

    Boyce, Ross M; Muiru, Anthony; Reyes, Raquel; Ntaro, Moses; Mulogo, Edgar; Matte, Michael; Siedner, Mark J

    2015-05-15

    The World Health Organization recommends that all suspected malaria cases receive a parasitological diagnosis prior to treatment with artemisinin-based combination therapy. A recent meta-analysis of clinical trials evaluating RDTs for the management of patients with fever found substantial reductions in anti-malarial prescriptions when health workers adhered to treatment protocols based on test results. However, few studies have reported on the impact of RDTs on health systems outside research settings. The study comprised a retrospective interrupted time series analysis comparing rates of malaria diagnosis, treatment, and resource utilization before and after the introduction of RDTs at a peripheral health facility in rural Western Uganda. The use of malaria diagnostic tests was depicted graphically throughout the study period, and regression models were fitted to identify correlates of three outcomes of interest: (1) length of stay, (2) the proportion of patients referred to a higher-level health facility, and (3) administration of antibiotics. Over the course of the study period, 14,357 individuals underwent diagnostic testing for malaria with either an RDT (9,807) or microscopy (4,550). The proportion of patients with parasite-based diagnoses more than tripled to 34% after the introduction of RDTs. RDTs largely replaced microscopy as the diagnostic method of choice. Compared to patients admitted during the pre-RDT period, patients admitted to the health centre with malaria in the post-RDT period had significantly reduced odds of being referred to another health centre (AOR=0.49, P=0.038) and of receiving antibiotics (AOR=0.42) following the introduction of RDTs for the diagnosis of malaria at this rural health facility in Uganda. The results show a reduction in referrals and shorter mean inpatient LOS even as antibiotics were prescribed less frequently. This change greatly increased laboratory throughput and the resultant proportion of patients receiving a parasite-based diagnosis.

  8. Use of interrupted time-series method to evaluate the impact of cigarette excise tax increases in Pennsylvania, 2000-2009.

    Science.gov (United States)

    Ma, Zhen-qiang; Kuller, Lewis H; Fisher, Monica A; Ostroff, Stephen M

    2013-10-03

    Scientific evidence shows that cigarette price increases can significantly reduce smoking prevalence and smoking initiation among adolescents and young adults. However, data are lacking regarding the effectiveness of increasing Pennsylvania's cigarette tax to reduce smoking and/or adverse health effects of smoking. The objective of our study was to assess the impact of cigarette tax increases and resulting price increases on smoking prevalence, acute myocardial infarction (AMI) and asthma hospitalization rates, and sudden cardiac death (SCD) rates in Pennsylvania. We used segmented regression analyses of interrupted time series to evaluate the level and trend changes in Pennsylvania adults' current smoking prevalence, age-adjusted AMI and asthma hospitalization rates, age-specific asthma hospitalization rates, and age-adjusted SCD rates following 2 cigarette excise tax increases. After the first excise tax increase, no beneficial effects were noted on the outcomes of interest. The second tax increase was associated with significant declines in smoking prevalence for people aged 18 to 39, age-adjusted AMI hospitalization rates for men, age-adjusted asthma hospitalization rates, and SCD rates among men. Overall smoking prevalence declined by 5.2% (P = .01), with a quarterly decrease of 1.4% (P = .01) for people aged 18 to 39 years. The age-adjusted AMI hospitalization rate for men showed a decline of 3.87/100,000 population (P = .04). The rate of age-adjusted asthma hospitalizations decreased by 10.05/100,000 population. Policy makers seeking to reduce tobacco consumption should be cognizant of the psychological tipping points at which overall price affects smoking patterns.

  9. Improving Neuromuscular Monitoring and Reducing Residual Neuromuscular Blockade With E-Learning: Protocol for the Multicenter Interrupted Time Series INVERT Study.

    Science.gov (United States)

    Thomsen, Jakob Louis Demant; Mathiesen, Ole; Hägi-Pedersen, Daniel; Skovgaard, Lene Theil; Østergaard, Doris; Engbaek, Jens; Gätke, Mona Ring

    2017-10-06

    Muscle relaxants facilitate endotracheal intubation under general anesthesia and improve surgical conditions. Residual neuromuscular blockade occurs when the patient is still partially paralyzed when awakened after surgery. The condition is associated with subjective discomfort and an increased risk of respiratory complications. Use of an objective neuromuscular monitoring device may prevent residual block. Despite this, many anesthetists refrain from using the device. Efforts to increase the use of objective monitoring are time consuming and require the presence of expert personnel. A neuromuscular monitoring e-learning module might support consistent use of neuromuscular monitoring devices. The aim of the study is to assess the effect of a neuromuscular monitoring e-learning module on anesthesia staff's use of objective neuromuscular monitoring and the incidence of residual neuromuscular blockade in surgical patients at 6 Danish teaching hospitals. In this interrupted time series study, we are collecting data repeatedly, in consecutive 3-week periods, before and after the intervention, and we will analyze the effect using segmented regression analysis. Anesthesia departments in the Zealand Region of Denmark are included, and data from all patients receiving a muscle relaxant are collected from the anesthesia information management system MetaVision. We will assess the effect of the module on all levels of potential effect: staff's knowledge and skills, patient care practice, and patient outcomes. The primary outcome is use of neuromuscular monitoring in patients according to the type of muscle relaxant received. Secondary outcomes include last recorded train-of-four value, administration of reversal agents, and time to discharge from the postanesthesia care unit as well as a multiple-choice test to assess knowledge. The e-learning module was developed based on a needs assessment process, including focus group interviews, surveys, and expert opinions. The e

  10. The effects of financial incentives for case finding for depression in patients with diabetes and coronary heart disease: interrupted time series analysis.

    Science.gov (United States)

    McLintock, Kate; Russell, Amy M; Alderson, Sarah L; West, Robert; House, Allan; Westerman, Karen; Foy, Robbie

    2014-08-20

    To evaluate the effects of Quality and Outcomes Framework (QOF) incentivised case finding for depression on diagnosis and treatment in targeted and non-targeted long-term conditions. Interrupted time series analysis. General practices in Leeds, UK. 65 (58%) of 112 general practices shared data on 37,229 patients with diabetes and coronary heart disease targeted by case finding incentives, and 101,008 patients with four other long-term conditions not targeted (hypertension, epilepsy, chronic obstructive pulmonary disease and asthma). Incentivised case finding for depression using two standard screening questions. Clinical codes indicating new depression-related diagnoses and new prescriptions of antidepressants. We extracted routinely recorded data from February 2002 through April 2012. The number of new diagnoses and prescriptions for those on registers was modelled with a binomial regression, which provided the strength of associations between time periods and their rates. New diagnoses of depression increased from 21 to 94/100,000 per month in targeted patients between the periods 2002-2004 and 2007-2011 (OR 2.09; 1.92 to 2.27). The rate increased from 27 to 77/100,000 per month in non-targeted patients (OR 1.53; 1.46 to 1.62). The slopes in prescribing for both groups flattened to zero immediately after QOF was introduced but before incentivised case finding (p<0.01 for both). Antidepressant prescribing in targeted patients returned to the pre-QOF secular upward trend (Wald test for equivalence of slope, z=0.73, p=0.47); the slope was less steep for non-targeted patients (z=-4.14, p<0.01). Incentivised case finding increased new depression-related diagnoses. The establishment of QOF disrupted rising trends in new prescriptions of antidepressants, which resumed following the introduction of incentivised case finding. Prescribing trends are of concern given that they may include people with mild-to-moderate depression unlikely to respond to such treatment
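
    Monthly diagnosis rates per 100,000 register patients are typically modelled with a count GLM in which the register size enters as an offset, so a period indicator yields the rate (or odds) ratio between periods. A minimal sketch using a Poisson GLM on simulated data (a close relative of, not a reproduction of, the binomial model described above):

```python
# Monthly counts of new diagnoses modelled with a Poisson GLM and a log offset,
# so the 'period' coefficient is a log rate ratio. All data are simulated.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
months = np.arange(120)
period = (months >= 60).astype(int)          # later vs earlier time period
register = rng.integers(35_000, 40_000, months.size)   # patients on the register
rate = 25e-5 * np.exp(0.7 * period)          # true new diagnoses per patient-month
new_dx = rng.poisson(rate * register)

df = pd.DataFrame({"new_dx": new_dx, "period": period, "register": register})

fit = smf.glm("new_dx ~ period", data=df, family=sm.families.Poisson(),
              offset=np.log(df["register"])).fit()
print(np.exp(fit.params["period"]))          # rate ratio between the two periods
```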

  11. The impact of regional co-payment and national reimbursement criteria on statins use in Italy: an interrupted time-series analysis.

    Science.gov (United States)

    Damiani, Gianfranco; Federico, Bruno; Anselmi, Angela; Bianchi, Caterina Bianca Neve Aurora; Silvestrini, Giulia; Iodice, Lanfranco; Navarra, Pierluigi; Da Cas, Roberto; Raschetti, Roberto; Ricciardi, Walter

    2014-01-06

    Statins are among the most commonly prescribed drugs worldwide in the prevention of cardiovascular diseases and their effectiveness is largely acknowledged. The consumption of statins increased four-fold during the 2000-2010 decade in Italy, and national and regional control policies were developed. Restrictions to reimbursement were fixed at the national level, whereas co-payment was introduced in some, but not all, regions. The aim of the present study is to assess the impact of such policies on the consumption of statins among outpatients in Italy between 2001 and 2007. Statin use was measured in terms of defined daily doses per 1,000 inhabitants per day (DDD/1000 inh. day) from May 2001 to December 2007. The study was conducted in 17 out of 21 regions, nine of which had implemented a co-payment policy. Time trends in consumption before and after the introduction of co-payment policies and reimbursement criteria were examined using segmented regression analysis of interrupted time-series, adjusting for seasonal components. The consumption of statins increased from 22.9 DDD/1000 inh. day in May 2001 to 54.7 DDD/1000 inh. day in December 2007. On average, there was a 1.7% increase in statin use each month before the national guideline changed, while the increase was about 0.5% afterwards. The revision of the reimbursement criteria was associated with a significant decrease in level (coefficient = -2.80, 95% CI -3.70 to -1.90), whereas the introduction of regional co-payment was associated with a significant change in the trend of consumption, so that the overall use of the drug increased by 0.04 (95% CI 0.02 to 0.07, p-value < 0.001) DDD/1000 inh. day per month in the post-intervention period, but there was no evidence of a change in level of consumption (p-value = 0.163). Consumption of statins in Italy increased almost three-fold during the study period. The restriction of reimbursement criteria was associated with an immediate drop and a decrease in trend of statin use, while the regional
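
    Adjusting a segmented model for seasonality is commonly done with annual harmonic (sine/cosine) terms or month indicators alongside the level- and trend-change terms. A minimal sketch with one pair of annual harmonics, on a simulated monthly DDD/1000 inhabitants/day series (change point and coefficients are invented):

```python
# Segmented regression with annual harmonic terms to adjust for seasonality.
# Simulated monthly statin consumption (DDD/1000 inhabitants/day); invented values.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
t = np.arange(80)
policy = 45                                   # hypothetical change-point month

df = pd.DataFrame({
    "time": t,
    "level": (t >= policy).astype(int),           # 1 after the policy change
    "trend": np.clip(t - policy, 0, None),        # months since the policy change
    "sin12": np.sin(2 * np.pi * t / 12),          # annual harmonic pair
    "cos12": np.cos(2 * np.pi * t / 12),
})
df["ddd"] = (23 + 0.4 * df.time - 2.8 * df.level + 0.04 * df.trend
             + 1.5 * df.sin12 + rng.normal(0, 0.8, t.size))

fit = smf.ols("ddd ~ time + level + trend + sin12 + cos12", data=df).fit()
print(fit.params[["level", "trend"]])
```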

  12. Long term effect of reduced pack sizes of paracetamol on poisoning deaths and liver transplant activity in England and Wales: interrupted time series analyses

    Science.gov (United States)

    Bergen, Helen; Simkin, Sue; Dodd, Sue; Pocock, Phil; Bernal, William; Gunnell, David; Kapur, Navneet

    2013-01-01

    Objective To assess the long term effect of United Kingdom legislation introduced in September 1998 to restrict pack sizes of paracetamol on deaths from paracetamol poisoning and liver unit activity. Design Interrupted time series analyses to assess mean quarterly changes from October 1998 to the end of 2009 relative to projected deaths without the legislation based on pre-legislation trends. Setting Mortality (1993-2009) and liver unit activity (1995-2009) in England and Wales, using information from the Office for National Statistics and NHS Blood and Transplant, respectively. Participants Residents of England and Wales. Main outcome measures Suicide, deaths of undetermined intent, and accidental poisoning deaths involving single drug ingestion of paracetamol and paracetamol compounds in people aged 10 years and over, and liver unit registrations and transplantations for paracetamol induced hepatotoxicity. Results Compared with the pre-legislation level, following the legislation there was an estimated average reduction of 17 (95% confidence interval −25 to −9) deaths per quarter in England and Wales involving paracetamol alone (with or without alcohol) that received suicide or undetermined verdicts. This decrease represented a 43% reduction or an estimated 765 fewer deaths over the 11¼ years after the legislation. A similar effect was found when accidental poisoning deaths were included, and when a conservative method of analysis was used. This decrease was largely unaltered after controlling for a non-significant reduction in deaths involving other methods of poisoning and also suicides by all methods. There was a 61% reduction in registrations for liver transplantation for paracetamol induced hepatotoxicity (−11 (−20 to −1) registrations per quarter). But no reduction was seen in actual transplantations (−3 (−12 to 6)), nor in registrations after a conservative method of analysis was used. Conclusions UK legislation to reduce pack sizes of

  13. Impact of clinical trial findings on Bell's palsy management in general practice in the UK 2001–2012: interrupted time series regression analysis

    Science.gov (United States)

    Morales, Daniel R; Donnan, Peter T; Daly, Fergus; Staa, Tjeerd Van; Sullivan, Frank M

    2013-01-01

    Objectives To measure the incidence of Bell's palsy and determine the impact of clinical trial findings on Bell's palsy management in the UK. Design Interrupted time series regression analysis and incidence measures. Setting General practices in the UK contributing to the Clinical Practice Research Datalink (CPRD). Participants Patients ≥16 years with a diagnosis of Bell's palsy between 2001 and 2012. Interventions (1) Publication of the 2004 Cochrane reviews of clinical trials on corticosteroids and antivirals for Bell's palsy, which made no clear recommendation on their use and (2) publication of the 2007 Scottish Bell's Palsy Study (SBPS), which made a clear recommendation that treatment with prednisolone alone improves chances for complete recovery. Main outcome measures Incidence of Bell's palsy per 100 000 person-years. Changes in the management of Bell's palsy with either prednisolone therapy, antiviral therapy, combination therapy (prednisolone with antiviral therapy) or untreated cases. Results During the 12-year period, 14 460 cases of Bell's palsy were identified with an overall incidence of 37.7/100 000 person-years. The 2004 Cochrane reviews were associated with immediate falls in prednisolone therapy (−6.3% (−11.0 to −1.6)), rising trends in combination therapy (1.1% per quarter (0.5 to 1.7)) and falling trends for untreated cases (−0.8% per quarter (−1.4 to −0.3)). SBPS was associated with immediate increases in prednisolone therapy (5.1% (0.9 to 9.3)) and rising trends in prednisolone therapy (0.7% per quarter (0.4 to 1.2)); falling trends in combination therapy (−1.7% per quarter (−2.2 to −1.3)); and rising trends for untreated cases (1.2% per quarter (0.8 to 1.6)). Despite improvements, 44% still remain untreated. Conclusions SBPS was clearly associated with change in management, but a significant proportion of patients failed to receive effective treatment, which cannot be fully explained. Clarity and uncertainty in

  14. Impact of clinical trial findings on Bell's palsy management in general practice in the UK 2001-2012: interrupted time series regression analysis.

    Science.gov (United States)

    Morales, Daniel R; Donnan, Peter T; Daly, Fergus; Staa, Tjeerd Van; Sullivan, Frank M

    2013-01-01

    To measure the incidence of Bell's palsy and determine the impact of clinical trial findings on Bell's palsy management in the UK. Interrupted time series regression analysis and incidence measures. General practices in the UK contributing to the Clinical Practice Research Datalink (CPRD). Patients ≥16 years with a diagnosis of Bell's palsy between 2001 and 2012. (1) Publication of the 2004 Cochrane reviews of clinical trials on corticosteroids and antivirals for Bell's palsy, which made no clear recommendation on their use and (2) publication of the 2007 Scottish Bell's Palsy Study (SBPS), which made a clear recommendation that treatment with prednisolone alone improves chances for complete recovery. Incidence of Bell's palsy per 100 000 person-years. Changes in the management of Bell's palsy with either prednisolone therapy, antiviral therapy, combination therapy (prednisolone with antiviral therapy) or untreated cases. During the 12-year period, 14 460 cases of Bell's palsy were identified with an overall incidence of 37.7/100 000 person-years. The 2004 Cochrane reviews were associated with immediate falls in prednisolone therapy (-6.3% (-11.0 to -1.6)), rising trends in combination therapy (1.1% per quarter (0.5 to 1.7)) and falling trends for untreated cases (-0.8% per quarter (-1.4 to -0.3)). SBPS was associated with immediate increases in prednisolone therapy (5.1% (0.9 to 9.3)) and rising trends in prednisolone therapy (0.7% per quarter (0.4 to 1.2)); falling trends in combination therapy (-1.7% per quarter (-2.2 to -1.3)); and rising trends for untreated cases (1.2% per quarter (0.8 to 1.6)). Despite improvements, 44% still remain untreated. SBPS was clearly associated with change in management, but a significant proportion of patients failed to receive effective treatment, which cannot be fully explained. Clarity and uncertainty in clinical trial recommendations may change clinical practice. However, better ways are needed to understand and circumvent

  15. A simplified prevention bundle with dual hand hygiene audit reduces early-onset ventilator-associated pneumonia in cardiovascular surgery units: An interrupted time-series analysis.

    Directory of Open Access Journals (Sweden)

    Kang-Cheng Su

    Full Text Available To investigate the effect of a simplified prevention bundle with an alcohol-based, dual hand hygiene (HH) audit on the incidence of early-onset ventilator-associated pneumonia (VAP). This 3-year, quasi-experimental study with interrupted time-series analysis was conducted in two cardiovascular surgery intensive care units in a medical center. An unaware external HH audit (eHH) performed by non-unit-based observers was a routine task before and after bundle implementation. Based on the realistic ICU settings, we implemented a 3-component bundle, which included a compulsory education program, a knowing internal HH audit (iHH) performed by unit-based observers, and a standardized oral care (OC) protocol with 0.1% chlorhexidine gluconate. The study period comprised 4 phases: a 12-month pre-implementation phase 1 (eHH+/education-/iHH-/OC-), a 3-month run-in phase 2 (eHH+/education+/iHH+/OC+), a 15-month implementation phase 3 (eHH+/education+/iHH+/OC+), and a 6-month post-implementation phase 4 (eHH+/education-/iHH+/OC-). A total of 2553 ventilator-days were observed. VAP incidences (events/1000 ventilator-days) in phases 1-4 were 39.1, 40.5, 15.9, and 20.4, respectively. VAP was significantly reduced by 59% in phase 3 (vs. phase 1, incidence rate ratio [IRR] 0.41, P = 0.002), but rebounded in phase 4. Moreover, VAP incidence was inversely correlated with compliance of OC (r2 = 0.531, P = 0.001) and eHH (r2 = 0.878, P < 0.001), but not with iHH, even though iHH compliance was higher than eHH compliance during phases 2 to 4. Compared to eHH, iHH provided more efficient and faster improvements in standard HH practice. The minimal compliances required for significant VAP reduction were 85% and 75% for OC and eHH, respectively (both P < 0.05, IRR 0.28 and 0.42). This simplified prevention bundle effectively reduces early-onset VAP incidence. Unaware HH compliance correlates with VAP incidence. A knowing HH audit provides better improvement in HH practice. Accordingly, we suggest

  16. Youth Mental Health Services Utilization Rates After a Large-Scale Social Media Campaign: Population-Based Interrupted Time-Series Analysis.

    Science.gov (United States)

    Booth, Richard G; Allen, Britney N; Bray Jenkyn, Krista M; Li, Lihua; Shariff, Salimah Z

    2018-04-06

    Despite the uptake of mass media campaigns, their overall impact remains unclear. Since 2011, a Canadian telecommunications company has operated an annual, large-scale mental health advocacy campaign (Bell Let's Talk) focused on mental health awareness and stigma reduction. In February 2012, the campaign began to explicitly leverage the social media platform Twitter and incented participation from the public by promising donations of Can $0.05 for each interaction with a campaign-specific username (@Bell_LetsTalk). The intent of the study was to examine the impact of this 2012 campaign on youth outpatient mental health services in the province of Ontario, Canada. Monthly outpatient mental health visits (primary health care and psychiatric services) were obtained for Ontario youth aged 10 to 24 years (approximately 5.66 million visits) from January 1, 2006 to December 31, 2015. Interrupted time series autoregressive integrated moving average (ARIMA) modeling was implemented to evaluate the impact of the campaign on rates of monthly outpatient mental health visits. A lagged intervention date of April 1, 2012 was selected to account for the delay required for a patient to schedule and attend a mental health-related physician visit. The inclusion of Twitter in the 2012 Bell Let's Talk campaign was temporally associated with an increase in outpatient mental health utilization for both males and females. Within primary health care environments, female adolescents aged 10 to 17 years experienced a monthly increase in the mental health visit rate from 10.2/1000 in April 2006 to 14.1/1000 in April 2015 (slope change of 0.094 following the campaign). Smaller but significant post-campaign slope changes were also observed in other groups (slope change of 0.005, P=.02, and slope change of 0.003, P=.005, respectively). For young adults aged 18 to 24 years, females who used primary health care experienced the most significant increases in mental health visit rates, from 26.5/1000 in April 2006 to 29.2/1000 in April 2015 (slope change of 0.17 following
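
    An ARIMA-based interrupted time series like the one described above can be sketched by supplying a step (level-change) and ramp (slope-change) regressor as exogenous inputs to the model. The illustration below uses statsmodels on simulated monthly visit rates; the ARIMA order, change point and trend specification are placeholders, not the authors' model:

```python
# ARIMA interrupted time series: step and ramp regressors as exogenous inputs.
# Simulated monthly visit rates; order, trend and change point are placeholders.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(6)
n, change = 120, 75                                   # hypothetical campaign month
t = np.arange(n)
step = (t >= change).astype(float)                    # immediate level change
ramp = np.clip(t - change, 0, None).astype(float)     # post-campaign slope change

rate = 10 + 0.02 * t + 0.09 * ramp + rng.normal(0, 0.5, n)
index = pd.period_range("2006-01", periods=n, freq="M")
y = pd.Series(rate, index=index)
exog = pd.DataFrame({"step": step, "ramp": ramp}, index=index)

model = ARIMA(y, exog=exog, order=(1, 0, 0),
              seasonal_order=(1, 0, 0, 12), trend="ct")
res = model.fit()
print(res.params[["step", "ramp"]])
```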

  17. Evaluating the Impact of Florida's "Stand Your Ground" Self-defense Law on Homicide and Suicide by Firearm: An Interrupted Time Series Study.

    Science.gov (United States)

    Humphreys, David K; Gasparrini, Antonio; Wiebe, Douglas J

    2017-01-01

    In 2005, Florida amended its self-defense laws to provide legal immunity to individuals using lethal force in self-defense. The enactment of "stand your ground" laws in the United States has been controversial and their effect on rates of homicide and homicide by firearm is uncertain. To estimate the impact of Florida's stand your ground law on rates of homicide and homicide by firearm. Using an interrupted time series design, we analyzed monthly rates of homicide and homicide by firearm in Florida between 1999 and 2014. Data were collected from the Wide-ranging Online Data for Epidemiologic Research (WONDER) web portal at the Centers for Disease Control and Prevention. We used seasonally adjusted segmented Poisson regression models to assess whether the onset of the law was associated with changes in the underlying trends for homicide and homicide by firearm in Florida. We also assessed the association using comparison states without stand your ground laws (New York, New Jersey, Ohio, and Virginia) and control outcomes (all suicides and suicides by firearm in Florida). October 1, 2005, the effective date of the law, was used to define homicides before and after the change. Monthly rates of homicide, firearm-related homicide, suicide, and suicide by firearm in Florida and the 4 comparison states. Prior to the stand your ground law, the mean monthly homicide rate in Florida was 0.49 deaths per 100 000 (mean monthly count, 81.93), and the rate of homicide by firearm was 0.29 deaths per 100 000 (mean monthly count, 49.06). Both rates had an underlying trend of 0.1% decrease per month. After accounting for underlying trends, these results estimate that after the law took effect there was an abrupt and sustained increase in the monthly homicide rate of 24.4% (relative risk [RR], 1.24; 95%CI, 1.16-1.33) and in the rate of homicide by firearm of 31.6% (RR, 1.32; 95% CI, 1.21-1.44). No evidence of change was found in the analyses of comparison states for either

  18. Trends in the utilization of dental outpatient services affected by the expansion of health care benefits in South Korea to include scaling: a 6-year interrupted time-series study.

    Science.gov (United States)

    Park, Hee-Jung; Lee, Jun Hyup; Park, Sujin; Kim, Tae-Il

    2018-02-01

    This study utilized a strong quasi-experimental design to test the hypothesis that the implementation of a policy to expand dental care services resulted in an increase in the usage of dental outpatient services. A total of 45,650,000 subjects with diagnoses of gingivitis or advanced periodontitis who received dental scaling were selected and examined, utilizing National Health Insurance claims data from July 2010 through November 2015. We performed a segmented regression analysis of the interrupted time-series to analyze the time-series trend in dental costs before and after the policy implementation, and assessed immediate changes in dental costs. After the policy change was implemented, a statistically significant 18% increase occurred in the observed total dental cost per patient, after adjustment for age, sex, and residence area. In addition, the dental costs of outpatient gingivitis treatment increased immediately by almost 47%, compared with a 15% increase in treatment costs for advanced periodontitis outpatients. This policy effect appears to be sustainable. The introduction of the new policy positively impacted the immediate and long-term outpatient utilization of dental scaling treatment in South Korea. While the policy was intended to entice patients to prevent periodontal disease, thus benefiting the insurance system, our results showed that the policy also increased treatment accessibility for potential periodontal disease patients and may improve long-term periodontal health in the South Korean population.

  19. Facilitating needs based cancer care for people with a chronic disease: Evaluation of an intervention using a multi-centre interrupted time series design

    Directory of Open Access Journals (Sweden)

    Sibbritt David

    2010-01-01

    Full Text Available Abstract Background Palliative care should be provided according to the individual needs of the patient, caregiver and family, so that the type and level of care provided, as well as the setting in which it is delivered, are dependent on the complexity and severity of individual needs, rather than prognosis or diagnosis [1]. This paper presents a study designed to assess the feasibility and efficacy of an intervention to assist in the allocation of palliative care resources according to need, within the context of a population of people with advanced cancer. Methods/design People with advanced cancer and their caregivers completed bi-monthly telephone interviews over a period of up to 18 months to assess unmet needs, anxiety and depression, quality of life, satisfaction with care and service utilisation. The intervention, introduced after at least two baseline phone interviews, involved (a) training medical, nursing and allied health professionals at each recruitment site in the use of the Palliative Care Needs Assessment Guidelines and the Needs Assessment Tool: Progressive Disease - Cancer (NAT: PD-C); and (b) health professionals completing the NAT: PD-C with participating patients approximately monthly for the rest of the study period. Changes in outcomes will be compared pre- and post-intervention. Discussion The study will determine whether the routine, systematic and regular use of the Guidelines and NAT: PD-C in a range of clinical settings is a feasible and effective strategy for facilitating the timely provision of needs based care. Trial registration ISRCTN21699701

  20. Long-term effects of flooding on mortality in England and Wales, 1994-2005: controlled interrupted time-series analysis.

    Science.gov (United States)

    Milojevic, Ai; Armstrong, Ben; Kovats, Sari; Butler, Bridget; Hayes, Emma; Leonardi, Giovanni; Murray, Virginia; Wilkinson, Paul

    2011-02-02

    Limited evidence suggests that being flooded may increase mortality and morbidity among affected householders not just at the time of the flood but for months afterwards. The objective of this study is to explore the methods for quantifying such long-term health effects of flooding by analysis of routine mortality registrations in England and Wales. Mortality data, geo-referenced by postcode of residence, were linked to a national database of flood events for 1994 to 2005. The ratio of mortality in the post-flood year to that in the pre-flood year within flooded postcodes was compared with that in non-flooded boundary areas (within 5 km of a flood). Further analyses compared the observed number of flood-area deaths in the year after flooding with the number expected from analysis of mortality trends stratified by region, age-group, sex, deprivation group and urban-rural status. Among the 319 recorded floods, there were 771 deaths in the year before flooding and 693 deaths in the year after (post-/pre-flood ratio of 0.90, 95% CI 0.82, 1.00). This ratio did not vary substantially by age, sex, population density or deprivation. A similar post-flood 'deficit' of deaths was suggested by the analyses based on observed/expected deaths. The observed post-flood 'deficit' of deaths is counter-intuitive and difficult to interpret because of the possible influence of population displacement caused by flooding. The bias that might arise from such displacement remains unquantified but has important implications for future studies that use place of residence as a marker of exposure.

  1. Evaluation of the national Cleanyourhands campaign to reduce Staphylococcus aureus bacteraemia and Clostridium difficile infection in hospitals in England and Wales by improved hand hygiene: four year, prospective, ecological, interrupted time series study.

    Science.gov (United States)

    Stone, Sheldon Paul; Fuller, Christopher; Savage, Joan; Cookson, Barry; Hayward, Andrew; Cooper, Ben; Duckworth, Georgia; Michie, Susan; Murray, Miranda; Jeanes, Annette; Roberts, J; Teare, Louise; Charlett, Andre

    2012-05-03

    To evaluate the impact of the Cleanyourhands campaign on rates of hospital procurement of alcohol hand rub and soap, report trends in selected healthcare associated infections, and investigate the association between infections and procurement. Prospective, ecological, interrupted time series study from 1 July 2004 to 30 June 2008. 187 acute trusts in England and Wales. Installation of bedside alcohol hand rub, materials promoting hand hygiene and institutional engagement, and regular hand hygiene audits, rolled out nationally from 1 December 2004. Quarterly (that is, every three months) rates for each trust of hospital procurement of alcohol hand rub and liquid soap; Staphylococcus aureus bacteraemia (meticillin resistant (MRSA) and meticillin sensitive (MSSA)) and Clostridium difficile infection for each trust. Associations between procurement and infection rates were assessed by a mixed effect Poisson regression model (which also accounted for the effect of bed occupancy, hospital type, and timing of other national interventions targeting these infections). Combined procurement of soap and alcohol hand rub tripled from 21.8 to 59.8 mL per patient bed day; procurement rose in association with each phase of the campaign. Rates fell for MRSA bacteraemia (1.88 to 0.91 cases per 10,000 bed days) and C difficile infection (16.75 to 9.49 cases). MSSA bacteraemia rates did not fall. Increased procurement of soap was independently associated with reduced C difficile infection throughout the study (adjusted incidence rate ratio for 1 mL increase per patient bed day 0.993, 95% confidence interval 0.990 to 0.996). The campaign was associated with sustained increases in hospital procurement of alcohol hand rub and soap, which the results suggest had an important role in reducing rates of some healthcare associated infections. National interventions for infection control undertaken in the context of a high profile political drive can reduce selected healthcare associated infections.
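
    The procurement-infection association above was estimated with a mixed-effects Poisson model across trusts; a closely related formulation that is easy to sketch is a Poisson GEE with an exchangeable working correlation and bed-days as the exposure offset. The code below is that stand-in on simulated data, not the authors' model:

```python
# Poisson GEE relating quarterly C. difficile counts to soap procurement,
# clustered by trust, with bed-days as the exposure offset. Simulated data;
# a stand-in for the mixed-effects Poisson model described in the abstract.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
rows = []
for trust in range(50):
    trust_effect = rng.normal(0, 0.2)
    for quarter in range(16):
        soap = 20 + 2.5 * quarter + rng.normal(0, 3)       # mL per bed day
        bed_days = int(rng.integers(40_000, 60_000))
        lam = np.exp(np.log(16e-4) + trust_effect - 0.007 * soap) * bed_days
        rows.append({"trust": trust, "quarter": quarter, "soap": soap,
                     "bed_days": bed_days, "cdiff": rng.poisson(lam)})
df = pd.DataFrame(rows)

fit = smf.gee("cdiff ~ soap + quarter", groups="trust", data=df,
              family=sm.families.Poisson(),
              cov_struct=sm.cov_struct.Exchangeable(),
              offset=np.log(df["bed_days"])).fit()
print(np.exp(fit.params["soap"]))   # incidence rate ratio per 1 mL increase
```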

  2. Thinking aloud in the presence of interruptions and time constraints

    DEFF Research Database (Denmark)

    Hertzum, Morten; Holmegaard, Kristin Due

    2013-01-01

    Thinking aloud is widely used for usability evaluation and its reactivity is therefore important to the quality of evaluation results. This study investigates whether thinking aloud (i.e., verbalization at levels 1 and 2) affects the behaviour of users who perform tasks that involve interruptions and time constraints, two frequent elements of real-world activities. We find that the presence of auditory, visual, audiovisual, or no interruptions interacts with thinking aloud for task solution rate, task completion time, and participants’ fixation rate. Thinking-aloud participants also spend longer responding to interruptions than control participants. Conversely, the absence or presence of time constraints does not interact with thinking aloud, suggesting that time pressure is less likely to make thinking aloud reactive than previously assumed. Our results inform practitioners faced with the decision...

  3. An Interrupted Time Series Analysis to Determine the Effect of an Electronic Health Record-Based Intervention on Appropriate Screening for Type 2 Diabetes in Urban Primary Care Clinics in New York City.

    Science.gov (United States)

    Albu, Jeanine B; Sohler, Nancy; Li, Rui; Li, Xuan; Young, Edwin; Gregg, Edward W; Ross-Degnan, Dennis

    2017-08-01

    To determine the impact of a health system-wide primary care diabetes management system, which included targeted guidelines for type 2 diabetes (T2DM) and prediabetes (dysglycemia) screening, on detection of previously undiagnosed dysglycemia cases. The intervention included electronic health record (EHR)-based decision support and standardized provider and staff training in using the American Diabetes Association guidelines for dysglycemia screening. Using EHR data, we identified 40,456 adults without T2DM or recent screening with a face-to-face visit (March 2011-December 2013) in five urban clinics. Interrupted time series analyses examined the impact of the intervention on trends in three outcomes: (1) monthly proportion of eligible patients receiving dysglycemia testing, (2) two negative comparison conditions (dysglycemia testing among ineligible patients and cholesterol screening), and (3) yield of undiagnosed dysglycemia among those tested. The baseline monthly proportion of eligible patients receiving testing was 7.4-10.4%. After the intervention, screening doubled (mean increase +11.0% [95% CI 9.0, 13.0], proportion range 18.6-25.3%). The proportion of ineligible patients tested also increased (+5.0% [95% CI 3.0, 8.0]) with no concurrent change in cholesterol testing (+0% [95% CI -0.02, 0.05]). About 59% of test results in eligible patients showed dysglycemia both before and after the intervention. Implementation of a policy for systematic dysglycemia screening including formal training and EHR templates in urban academic primary care clinics resulted in a doubling of appropriate testing and of the number of patients who could be targeted for treatment to prevent or delay T2DM. © 2017 by the American Diabetes Association.

  4. Integrated HIV-Care Into Primary Health Care Clinics and the Influence on Diabetes and Hypertension Care: An Interrupted Time Series Analysis in Free State, South Africa Over 4 Years.

    Science.gov (United States)

    Rawat, Angeli; Uebel, Kerry; Moore, David; Yassi, Annalee

    2018-04-15

    Noncommunicable diseases (NCDs), specifically diabetes and hypertension, are rising in high HIV-burdened countries such as South Africa. How integrated HIV care into primary health care (PHC) influences NCD care is unknown. We aimed to understand whether differences existed in NCD care (pre- versus post-integration) and how changes may relate to HIV patient numbers. Public sector PHC clinics in Free State, South Africa. Using a quasiexperimental design, we analyzed monthly administrative data on 4 indicators for diabetes and hypertension (clinic and population levels) during 4 years as HIV integration was implemented in PHC. Data represented 131 PHC clinics with a catchment population of 1.5 million. We used interrupted time series analysis at ±18 and ±30 months from HIV integration in each clinic to identify changes in trends postintegration compared with those in preintegration. We used linear mixed-effect models to study relationships between HIV and NCD indicators. Patients receiving antiretroviral therapy in the 131 PHC clinics studied increased from 1614 (April 2009) to 57,958 (April 2013). Trends in new diabetes patients on treatment remained unchanged. However, population-level new hypertensives on treatment decreased at ±30 months from integration by 6/100,000 (SE = 3, P < 0.02) and was associated with the number of new patients with HIV on treatment at the clinics. Our findings suggest that during the implementation of integrated HIV care into PHC clinics, care for hypertensive patients could be compromised. Further research is needed to understand determinants of NCD care in South Africa and other high HIV-burdened settings to ensure patient-centered PHC.
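
    The relationship between clinic HIV caseload and NCD indicators was examined with linear mixed-effect models; a minimal statsmodels sketch with a random intercept per clinic is shown below (clinic counts, variable names and effect sizes are simulated, not study data):

```python
# Linear mixed-effects model: monthly new hypertension treatment starts per clinic,
# with a random intercept for clinic. All values are simulated for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(8)
rows = []
for clinic in range(131):
    clinic_effect = rng.normal(0, 3)             # clinic-level random intercept
    for month in range(48):
        art_patients = 50 + 10 * month + rng.normal(0, 100)   # HIV caseload proxy
        new_htn = (40 + clinic_effect - 0.01 * art_patients
                   + rng.normal(0, 4))
        rows.append({"clinic": clinic, "month": month,
                     "art_patients": art_patients, "new_htn": new_htn})
df = pd.DataFrame(rows)

fit = smf.mixedlm("new_htn ~ art_patients + month", data=df,
                  groups=df["clinic"]).fit()
print(fit.params["art_patients"])   # change in new starts per additional ART patient
```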

  5. Short-term and sustained effects of a health system strengthening intervention to improve mortality trends for paediatric severe malnutrition in rural South African hospitals: An interrupted time series design

    Directory of Open Access Journals (Sweden)

    M Muzigaba

    2017-04-01

    Full Text Available Background. Case fatality rates for childhood severe acute malnutrition (SAM) remain high in some resource-limited facilities in South Africa (SA), despite the widespread availability of the World Health Organization treatment guidelines. There is a need to develop reproducible interventions that reinforce the implementation of these guidelines and assess their effect and sustainability. Objectives. To assess the short-term and sustained effects of a health system strengthening intervention on mortality attributable to SAM in two hospitals located in the Eastern Cape Province of SA. Methods. This was a theory-driven evaluation conducted in two rural hospitals in SA over a 69-month period (2009 - 2014). In both facilities, a health system strengthening intervention was implemented within the first 32 months, and thereafter discontinued. Sixty-nine monthly data series were collected on: (i) monthly total SAM case fatality rate (CFR); (ii) monthly SAM CFR within 24 hours of admission; and (iii) monthly SAM CFR among HIV-positive cases, to determine the intervention’s effect within the first 32 months and sustainability over the remaining 37 months. The data were analysed using Linden’s method for analysing interrupted time series data. Results. The study revealed that the intervention was associated with a statistically significant decrease of up to 0.4% in monthly total SAM CFR, a non-statistically significant decrease of up to 0.09% in monthly SAM CFR within 24 hours of admission and a non-statistically significant decrease of up to 0.11% in monthly SAM CFR among HIV-positive cases. The decrease in mortality trends for both outcomes was only slightly reversed upon the discontinuation of the intervention. No autocorrelation was detected in the regression models generated during data analyses. Conclusion. The study findings suggest that although the intervention was designed to be self-sustaining, this may not have been the case. A qualitative enquiry
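
    Linden's approach to single- and multiple-group interrupted time series is, at its core, a segmented linear regression with autocorrelation-robust (Newey-West) standard errors. A minimal Python analogue is sketched below, treating the discontinuation of the intervention as the interruption of interest (the change point, lag length and data are illustrative only):

```python
# Segmented regression with Newey-West (HAC) standard errors: a Python analogue
# of Linden's Stata -itsa- approach. Monthly case-fatality rates are simulated,
# and the interruption is the discontinuation of the intervention at month 32.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(9)
t = np.arange(69)                               # 69 monthly observations
discont = 32                                    # hypothetical discontinuation month
post = (t >= discont).astype(int)               # 1 after discontinuation
since_post = np.clip(t - discont, 0, None)      # months since discontinuation

cfr = 10 - 0.05 * t + 0.5 * post + 0.03 * since_post + rng.normal(0, 1, t.size)
df = pd.DataFrame({"cfr": cfr, "month": t, "post": post, "since_post": since_post})

fit = smf.ols("cfr ~ month + post + since_post", data=df).fit(
    cov_type="HAC", cov_kwds={"maxlags": 6})    # autocorrelation-robust inference
print(fit.summary())
```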

  6. Change in non-alcoholic beverage sales following a 10-pence levy on sugar-sweetened beverages within a national chain of restaurants in the UK: interrupted time series analysis of a natural experiment.

    Science.gov (United States)

    Cornelsen, Laura; Mytton, Oliver T; Adams, Jean; Gasparrini, Antonio; Iskander, Dalia; Knai, Cecile; Petticrew, Mark; Scott, Courtney; Smith, Richard; Thompson, Claire; White, Martin; Cummins, Steven

    2017-11-01

    This study evaluates changes in sales of non-alcoholic beverages in Jamie's Italian, a national chain of commercial restaurants in the UK, following the introduction of a £0.10 per-beverage levy on sugar-sweetened beverages (SSBs) and supporting activity including beverage menu redesign, new products and establishment of a children's health fund from levy proceeds. We used an interrupted time series design to quantify changes in sales of non-alcoholic beverages 12 weeks and 6 months after implementation of the levy, using itemised electronic point of sale data. Main outcomes were number of SSBs and other non-alcoholic beverages sold per customer. Linear regression and multilevel random effects models, adjusting for seasonality and clustering, were used to investigate changes in SSB sales across all restaurants (n=37) and by tertiles of baseline restaurant SSB sales per customer. Compared with the prelevy period, the number of SSBs sold per customer declined by 11.0% (-17.3% to -4.3%) at 12 weeks and 9.3% (-15.2% to -3.2%) at 6 months. For non-levied beverages, sales per customer of children's fruit juice declined by 34.7% (-55.3% to -4.3%) at 12 weeks and 9.9% (-16.8% to -2.4%) at 6 months. At 6 months, sales per customer of fruit juice increased by 21.8% (14.0% to 30.2%) but sales of diet cola (-7.3%; -11.7% to -2.8%) and bottled waters (-6.5%; -11.0% to -1.7%) declined. Changes in sales were only observed in restaurants in the medium and high tertiles of baseline SSB sales per customer. Introduction of a £0.10 levy on SSBs alongside complementary activities is associated with declines in SSB sales per customer in the short and medium term, particularly in restaurants with higher baseline sales of SSBs. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  7. The Effect of Integration of Self-Management Web Platforms on Health Status in Chronic Obstructive Pulmonary Disease Management in Primary Care (e-Vita Study): Interrupted Time Series Design.

    Science.gov (United States)

    Talboom-Kamp, Esther Pwa; Verdijk, Noortje A; Kasteleyn, Marise J; Harmans, Lara M; Talboom, Irvin Jsh; Looijmans-van den Akker, Ingrid; van Geloven, Nan; Numans, Mattijs E; Chavannes, Niels H

    2017-08-16

    Worldwide nearly 3 million people die from chronic obstructive pulmonary disease (COPD) every year. Integrated disease management (IDM) improves quality of life for COPD patients and can reduce hospitalization. Self-management of COPD through eHealth is an effective method to improve IDM and clinical outcomes. The objective of this implementation study was to investigate the effect of 3 chronic obstructive pulmonary disease eHealth programs applied in primary care on health status. The e-Vita COPD study compares different levels of integration of Web-based self-management platforms in IDM in 3 primary care settings. Patient health status is examined using the Clinical COPD Questionnaire (CCQ). The parallel cohort design includes 3 levels of integration in IDM (groups 1, 2, 3) and randomization of 2 levels of personal assistance for patients (group A, high assistance, group B, low assistance). Interrupted time series (ITS) design was used to collect CCQ data at multiple time points before and after intervention, and multilevel linear regression modeling was used to analyze CCQ data. Of the 702 invited patients, 215 (30.6%) registered to a platform. Of these, 82 participated in group 1 (high integration IDM), 36 in group 1A (high assistance), and 46 in group 1B (low assistance); 96 participated in group 2 (medium integration IDM), 44 in group 2A (high assistance) and 52 in group 2B (low assistance); also, 37 participated in group 3 (no integration IDM). In the total group, no significant difference was found in change in CCQ trend (P=.334) before (-0.47% per month) and after the intervention (-0.084% per month). Also, no significant difference was found in CCQ changes before versus after the intervention between the groups with high versus low personal assistance. In all subgroups, there was no significant change in the CCQ trend before and after the intervention (group 1A, P=.237; 1B, P=.991; 2A, P=.120; 2B, P=.166; 3, P=.945). The e-Vita eHealth-supported COPD

  8. The time-course of recovery from interruption during reading: eye movement evidence for the role of interruption lag and spatial memory.

    Science.gov (United States)

    Cane, James E; Cauchard, Fabrice; Weger, Ulrich W

    2012-01-01

    Two experiments examined how interruptions impact reading and how interruption lags and the reader's spatial memory affect the recovery from such interruptions. Participants read paragraphs of text and were interrupted unpredictably by a spoken news story while their eye movements were monitored. Time made available for consolidation prior to responding to the interruption did not aid reading resumption. However, providing readers with a visual cue that indicated the interruption location did aid task resumption substantially in Experiment 2. Taken together, the findings show that the recovery from interruptions during reading draws on spatial memory resources and can be aided by processes that support spatial memory. Practical implications are discussed.

  9. Time Series Momentum

    DEFF Research Database (Denmark)

    Moskowitz, Tobias J.; Ooi, Yao Hua; Heje Pedersen, Lasse

    2012-01-01

    We document significant “time series momentum” in equity index, currency, commodity, and bond futures for each of the 58 liquid instruments we consider. We find persistence in returns for one to 12 months that partially reverses over longer horizons, consistent with sentiment theories of initial under-reaction and delayed over-reaction. A diversified portfolio of time series momentum strategies across all asset classes delivers substantial abnormal returns with little exposure to standard asset pricing factors and performs best during extreme markets. Examining the trading activities of speculators and hedgers, we find that speculators profit from time series momentum at the expense of hedgers.

  10. Analysing Stable Time Series

    National Research Council Canada - National Science Library

    Adler, Robert

    1997-01-01

    We describe how to take a stable, ARMA, time series through the various stages of model identification, parameter estimation, and diagnostic checking, and accompany the discussion with a goodly number...

  11. Multivariate Time Series Search

    Data.gov (United States)

    National Aeronautics and Space Administration — Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical...

  12. Long time series

    DEFF Research Database (Denmark)

    Hisdal, H.; Holmqvist, E.; Hyvärinen, V.

    Awareness that emission of greenhouse gases will raise the global temperature and change the climate has led to studies trying to identify such changes in long-term climate and hydrologic time series. This report, written by the …

  13. Visual time series analysis

    DEFF Research Database (Denmark)

    Fischer, Paul; Hilbert, Astrid

    2012-01-01

    We introduce a platform which supplies an easy-to-handle, interactive, extendable, and fast analysis tool for time series analysis. In contrast to other software suites like Maple, Matlab, or R, which use a command-line-like interface and where the user has to memorize/look up the appropriate commands, our application is select-and-click-driven. It allows the user to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model fits, the user may choose between manual and automated parameter selection. The user can define new transformations and add them to the system. The application contains efficient implementations of advanced and recent techniques for time series analysis, including techniques related to extreme value analysis and filtering …

  14. Applied time series analysis

    CERN Document Server

    Woodward, Wayne A; Elliott, Alan C

    2011-01-01

    "There is scarcely a standard technique that the reader will find left out … this book is highly recommended for those requiring a ready introduction to applicable methods in time series and serves as a useful resource for pedagogical purposes." -International Statistical Review (2014), 82. "Current time series theory for practice is well summarized in this book." -Emmanuel Parzen, Texas A&M University. "What an extraordinary range of topics covered, all very insightfully. I like [the authors'] innovations very much, such as the AR factor table." -David Findley, U.S. Census Bureau (retired). …

  15. Time series analysis

    CERN Document Server

    Madsen, Henrik

    2007-01-01

    "In this book the author gives a detailed account of estimation, identification methodologies for univariate and multivariate stationary time-series models. The interesting aspect of this introductory book is that it contains several real data sets and the author made an effort to explain and motivate the methodology with real data. … this introductory book will be interesting and useful not only to undergraduate students in the UK universities but also to statisticians who are keen to learn time-series techniques and keen to apply them. I have no hesitation in recommending the book." -Journa

  16. Predicting chaotic time series

    International Nuclear Information System (INIS)

    Farmer, J.D.; Sidorowich, J.J.

    1987-01-01

    We present a forecasting technique for chaotic data. After embedding a time series in a state space using delay coordinates, we "learn" the induced nonlinear mapping using local approximation. This allows us to make short-term predictions of the future behavior of a time series, using information based only on past values. We present an error estimate for this technique, and demonstrate its effectiveness by applying it to several examples, including data from the Mackey-Glass delay differential equation, Rayleigh-Benard convection, and Taylor-Couette flow
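
    A minimal sketch of the delay-coordinate embedding plus local (nearest-neighbour) prediction idea described above; the logistic map stands in for the chaotic data sets mentioned, and the zeroth-order local model (an average over neighbours) is an assumption rather than the authors' exact local approximation.

```python
# Delay embedding followed by a simple nearest-neighbour (local) one-step forecast.
import numpy as np

def logistic_map(n, x0=0.4, r=3.9):
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    return x

def embed(series, dim, tau):
    """Delay-coordinate embedding: row t is the state [x_t, x_{t-tau}, ..., x_{t-(dim-1)tau}]."""
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau: i * tau + n] for i in range(dim)][::-1])

def local_predict(train, query, dim=3, tau=1, k=5):
    """Predict the value following the last state of `query` from k nearest states in `train`."""
    states = embed(train, dim, tau)
    X, y = states[:-1], train[(dim - 1) * tau + 1:]   # each state paired with its next value
    q = embed(query, dim, tau)[-1]                    # current state of the query series
    idx = np.argsort(np.linalg.norm(X - q, axis=1))[:k]
    return y[idx].mean()                              # zeroth-order local approximation

x = logistic_map(2000)
pred = local_predict(x[:1900], x[:1950])
print(pred, x[1950])   # one-step-ahead prediction vs the true next value
```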

  17. Modelling bursty time series

    International Nuclear Information System (INIS)

    Vajna, Szabolcs; Kertész, János; Tóth, Bálint

    2013-01-01

    Many human-related activities show power-law decaying interevent time distribution with exponents usually varying between 1 and 2. We study a simple task-queuing model, which produces bursty time series due to the non-trivial dynamics of the task list. The model is characterized by a priority distribution as an input parameter, which describes the choice procedure from the list. We give exact results on the asymptotic behaviour of the model and we show that the interevent time distribution is power-law decaying for any kind of input distributions that remain normalizable in the infinite list limit, with exponents tunable between 1 and 2. The model satisfies a scaling law between the exponents of interevent time distribution (β) and autocorrelation function (α): α + β = 2. This law is general for renewal processes with power-law decaying interevent time distribution. We conclude that slowly decaying autocorrelation function indicates long-range dependence only if the scaling law is violated. (paper)

  18. Real time interrupt handling using FORTRAN IV plus under RSX-11M

    International Nuclear Information System (INIS)

    Schultz, D.E.

    1981-01-01

    A real-time data acquisition application for a linear accelerator is described. The important programming features of this application are the use of connect-to-interrupt, a shared library, map-to-I/O-page, and a shared data area. The paper explains how rapid interrupt handling can be provided from FORTRAN IV PLUS using these tools.

  19. Introduction to Time Series Modeling

    CERN Document Server

    Kitagawa, Genshiro

    2010-01-01

    In time series modeling, the behavior of a certain phenomenon is expressed in relation to the past values of itself and other covariates. Since many important phenomena in statistical analysis are actually time series and the identification of conditional distribution of the phenomenon is an essential part of the statistical modeling, it is very important and useful to learn fundamental methods of time series modeling. Illustrating how to build models for time series using basic methods, "Introduction to Time Series Modeling" covers numerous time series models and the various tools f

  20. GPS Position Time Series @ JPL

    Science.gov (United States)

    Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen

    2013-01-01

    Different flavors of GPS time series analysis at JPL - all use the same GPS Precise Point Positioning Analysis raw time series, with variations in time series analysis/post-processing driven by different users. • JPL Global Time Series/Velocities - researchers studying the reference frame, combining with VLBI/SLR/DORIS. • JPL/SOPAC Combined Time Series/Velocities - crustal deformation for tectonic, volcanic, and ground water studies. • ARIA Time Series/Coseismic Data Products - hazard monitoring and response focused. • ARIA data system designed to integrate GPS and InSAR - GPS tropospheric delay used for correcting InSAR; Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR. • Zhen Liu is talking tomorrow on InSAR Time Series analysis.

  1. Evaluation of vaccination herd immunity effects for anogenital warts in a low coverage setting with human papillomavirus vaccine-an interrupted time series analysis from 2005 to 2010 using health insurance data.

    Science.gov (United States)

    Thöne, Kathrin; Horn, Johannes; Mikolajczyk, Rafael

    2017-08-14

    Shortly after the human papillomavirus (HPV) vaccine recommendation and hence the reimbursement of vaccination costs for the respective age groups in Germany in 2007, changes in the incidence of anogenital warts (AGWs) were observed, but it was not clear at what level the incidence would stabilize and to what extent herd immunity would be present. Given the relatively low HPV vaccination coverage in Germany, we aimed to assess potential vaccination herd immunity effects in the German setting. A retrospective open cohort study with data from more than nine million statutory health insurance members from 2005 to 2010 was conducted. AGW cases were identified using ICD-10-codes. The incidence of AGWs was estimated by age, sex, and calendar quarter. Age and sex specific incidence rate ratios were estimated comparing the years 2009-2010 (post-vaccination period) with 2005-2007 (pre-vaccination period). Incidence rate ratio of AGWs for the post-vaccination period compared to the pre-vaccination period showed a u-shaped decrease among the 14- to 24-year-old females and males which corresponds well with the reported HPV vaccination uptake in 2008. A maximum reduction of up to 60% was observed for the 16- to 20-year-old females and slightly less pronounced (up to 50%) for the 16- and 18-year-old males. Age groups outside of the range 14-24 years demonstrated no decrease. The decrease of incidence occurred in both sexes early after the vaccine recommendation and stabilized at lower levels in 2009-2010. A relative reduction of up to 50% among males of approximately similar age groups as that of females receiving the HPV vaccination suggests herd protection resulting from assortative mixing by age. The early decrease among males can be reduced over time due to partner change.
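
    A minimal sketch of estimating a post- versus pre-vaccination incidence rate ratio with a Poisson model and a person-time offset; the counts, person-years and strata below are invented for illustration and do not reproduce the study's age- and sex-specific, quarter-by-quarter analysis.

```python
# Incidence rate ratio (post vs pre period) from aggregated counts and person-time.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "cases":        [1200, 520, 900, 610],          # hypothetical AGW cases per stratum-period
    "person_years": [4.0e5, 3.8e5, 3.9e5, 3.7e5],   # hypothetical observed person-time
    "post":         [0, 1, 0, 1],                   # 0 = 2005-2007, 1 = 2009-2010
    "female":       [1, 1, 0, 0],
})

X = sm.add_constant(df[["post", "female"]])
fit = sm.GLM(df["cases"], X, family=sm.families.Poisson(),
             exposure=df["person_years"]).fit()
irr = np.exp(fit.params["post"])                    # incidence rate ratio, post vs pre
print(f"IRR (post vs pre): {irr:.2f}")
```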

  2. Hospital admission interviews are time-consuming with several interruptions

    DEFF Research Database (Denmark)

    Ghazanfar, Misbah N; Honoré, Per Gustaf Hartvig; Nielsen, Trine R H

    2012-01-01

    The admission interview is an important procedure to reduce medication errors. Studies indicate that physicians do not spend much time on the interview and that the major obstacles are lack of time and heavy workload. The aim of this study was to measure the time physicians spend on admission interviews and to describe factors that affect time consumption.

  3. Forecasting Cryptocurrencies Financial Time Series

    DEFF Research Database (Denmark)

    Catania, Leopoldo; Grassi, Stefano; Ravazzolo, Francesco

    2018-01-01

    This paper studies the predictability of cryptocurrencies time series. We compare several alternative univariate and multivariate models in point and density forecasting of four of the most capitalized series: Bitcoin, Litecoin, Ripple and Ethereum. We apply a set of crypto–predictors and rely...

  4. Models for dependent time series

    CERN Document Server

    Tunnicliffe Wilson, Granville; Haywood, John

    2015-01-01

    Models for Dependent Time Series addresses the issues that arise and the methodology that can be applied when the dependence between time series is described and modeled. Whether you work in the economic, physical, or life sciences, the book shows you how to draw meaningful, applicable, and statistically valid conclusions from multivariate (or vector) time series data.The first four chapters discuss the two main pillars of the subject that have been developed over the last 60 years: vector autoregressive modeling and multivariate spectral analysis. These chapters provide the foundational mater

  5. Time series with tailored nonlinearities

    Science.gov (United States)

    Räth, C.; Laut, I.

    2015-10-01

    It is demonstrated how to generate time series with tailored nonlinearities by inducing well-defined constraints on the Fourier phases. Correlations between the phase information of adjacent phases and (static and dynamic) measures of nonlinearities are established and their origin is explained. By applying a set of simple constraints on the phases of an originally linear and uncorrelated Gaussian time series, the observed scaling behavior of the intensity distribution of empirical time series can be reproduced. The power law character of the intensity distributions being typical for, e.g., turbulence and financial data can thus be explained in terms of phase correlations.
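
    A minimal sketch of the basic Fourier-phase manipulation that the approach above builds on: the amplitudes of an originally linear, uncorrelated Gaussian series are kept while its phases are replaced, here by fully random phases rather than the specific constraints studied in the paper.

```python
# Keep the Fourier amplitudes of a Gaussian series, replace its phases.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=1024)                  # originally linear, uncorrelated Gaussian series

X = np.fft.rfft(x)
amplitudes = np.abs(X)
phases = rng.uniform(0.0, 2.0 * np.pi, size=X.size)
phases[0] = 0.0                            # keep the zero-frequency (mean) term real
if x.size % 2 == 0:
    phases[-1] = 0.0                       # the Nyquist component must stay real as well

surrogate = np.fft.irfft(amplitudes * np.exp(1j * phases), n=x.size)

# The new series shares the power spectrum (linear correlations) of x; any nonlinear
# structure is encoded purely in how the phases are chosen.
print(np.allclose(np.abs(np.fft.rfft(surrogate)), amplitudes))
```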

  6. Clustering of financial time series

    Science.gov (United States)

    D'Urso, Pierpaolo; Cappelli, Carmela; Di Lallo, Dario; Massari, Riccardo

    2013-05-01

    This paper addresses the topic of classifying financial time series in a fuzzy framework, proposing two fuzzy clustering models, both based on GARCH models. In general, clustering of financial time series, due to their peculiar features, needs the definition of suitable distance measures. To this aim, the first fuzzy clustering model exploits the autoregressive representation of GARCH models and employs, in the framework of a partitioning around medoids algorithm, the classical autoregressive metric. The second fuzzy clustering model, also based on the partitioning around medoids algorithm, uses the Caiado distance, a Mahalanobis-like distance based on estimated GARCH parameters and covariances that takes into account the information about the volatility structure of the time series. In order to illustrate the merits of the proposed fuzzy approaches, an application to the problem of classifying 29 time series of Euro exchange rates against international currencies is presented and discussed, also comparing the fuzzy models with their crisp version.

  7. Time series analysis time series analysis methods and applications

    CERN Document Server

    Rao, Tata Subba; Rao, C R

    2012-01-01

    The field of statistics not only affects all areas of scientific activity, but also many other matters such as public policy. It is branching rapidly into so many different subjects that a series of handbooks is the only way of comprehensively presenting the various aspects of statistical methodology, applications, and recent developments. The Handbook of Statistics is a series of self-contained reference books. Each volume is devoted to a particular topic in statistics, with Volume 30 dealing with time series. The series is addressed to the entire community of statisticians and scientists in various disciplines who use statistical methodology in their work. At the same time, special emphasis is placed on applications-oriented techniques, with the applied statistician in mind as the primary audience. • Comprehensively presents the various aspects of statistical methodology • Discusses a wide variety of diverse applications and recent developments • Contributors are internationally renowned experts in their respect...

  8. PERFORMANCE COMPARISON OF USART COMMUNICATION BETWEEN REAL TIME OPERATING SYSTEM (RTOS) AND NATIVE INTERRUPT

    Directory of Open Access Journals (Sweden)

    Novian Habibie

    2016-02-01

    Full Text Available Communication between microcontrollers is one of the crucial points in embedded systems. At the same time, an embedded system must be able to run many parallel tasks simultaneously. To handle this, we need a reliable system that can do multitasking without decreasing every task's performance. The most widely used methods for multitasking in embedded systems are using an Interrupt Service Routine (ISR) or using a Real Time Operating System (RTOS). This research compared the performance of USART communication on a system with RTOS to a system that uses interrupts. Experiments ran on two identical XMega A3BU-Xplained development boards, which used internal sensors (light and temperature) and a servo as an external component. The performance comparison was done by counting the ping time (the time elapsed to transmit data and receive a reply as a mark that the data has been received) and comparing it. The experiments were divided into two scenarios: (1) a system loaded with many tasks, and (2) a system loaded with few tasks. Results of the experiments show that communication is faster if the system is only loaded with few tasks. The system with RTOS beat the interrupt-based system in case (1), but lost to it in case (2).

  9. Stochastic models for time series

    CERN Document Server

    Doukhan, Paul

    2018-01-01

    This book presents essential tools for modelling non-linear time series. The first part of the book describes the main standard tools of probability and statistics that directly apply to the time series context to obtain a wide range of modelling possibilities. Functional estimation and bootstrap are discussed, and stationarity is reviewed. The second part describes a number of tools from Gaussian chaos and proposes a tour of linear time series models. It goes on to address nonlinearity from polynomial or chaotic models for which explicit expansions are available, then turns to Markov and non-Markov linear models and discusses Bernoulli shifts time series models. Finally, the volume focuses on the limit theory, starting with the ergodic theorem, which is seen as the first step for statistics of time series. It defines the distributional range to obtain generic tools for limit theory under long or short-range dependences (LRD/SRD) and explains examples of LRD behaviours. More general techniques (central limit ...

  10. Benchmarking of energy time series

    Energy Technology Data Exchange (ETDEWEB)

    Williamson, M.A.

    1990-04-01

    Benchmarking consists of the adjustment of time series data from one source in order to achieve agreement with similar data from a second source. The data from the latter source are referred to as the benchmark(s), and often differ in that they are observed at a lower frequency, represent a higher level of temporal aggregation, and/or are considered to be of greater accuracy. This report provides an extensive survey of benchmarking procedures which have appeared in the statistical literature, and reviews specific benchmarking procedures currently used by the Energy Information Administration (EIA). The literature survey includes a technical summary of the major benchmarking methods and their statistical properties. Factors influencing the choice and application of particular techniques are described and the impact of benchmark accuracy is discussed. EIA applications and procedures are reviewed and evaluated for residential natural gas deliveries series and coal production series. It is found that the current method of adjusting the natural gas series is consistent with the behavior of the series and the methods used in obtaining the initial data. As a result, no change is recommended. For the coal production series, a staged approach based on a first differencing technique is recommended over the current procedure. A comparison of the adjustments produced by the two methods is made for the 1987 Indiana coal production series. 32 refs., 5 figs., 1 tab.

  11. A Time Series Forecasting Method

    Directory of Open Access Journals (Sweden)

    Wang Zhao-Yu

    2017-01-01

    Full Text Available This paper proposes a novel time series forecasting method based on a weighted self-constructing clustering technique. The weighted self-constructing clustering processes all the data patterns incrementally. If a data pattern is not similar enough to an existing cluster, it forms a new cluster of its own. However, if a data pattern is similar enough to an existing cluster, it is removed from the cluster it currently belongs to and added to the most similar cluster. During the clustering process, weights are learned for each cluster. Given a series of time-stamped data up to time t, we divide it into a set of training patterns. By using the weighted self-constructing clustering, the training patterns are grouped into a set of clusters. To estimate the value at time t + 1, we find the k nearest neighbors of the input pattern and use these k neighbors to decide the estimation. Experimental results are shown to demonstrate the effectiveness of the proposed approach.
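
    A minimal sketch of the pattern-based k-nearest-neighbour estimation step described above, assuming a synthetic seasonal series; the weighted self-constructing clustering stage proposed in the paper is omitted.

```python
# Sliding-window training patterns plus k-nearest-neighbour one-step forecast.
import numpy as np

def knn_forecast(series, window=12, k=5):
    """Estimate the value at time t+1 from the k training patterns closest to the last window."""
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]                         # value following each training pattern
    query = series[-window:]                    # the most recent (input) pattern
    dist = np.linalg.norm(X - query, axis=1)
    idx = np.argsort(dist)[:k]
    return y[idx].mean()

rng = np.random.default_rng(3)
t = np.arange(400)
series = np.sin(2 * np.pi * t / 24) + 0.1 * rng.normal(size=t.size)
print(knn_forecast(series, window=12, k=5))
```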

  12. International Work-Conference on Time Series

    CERN Document Server

    Pomares, Héctor; Valenzuela, Olga

    2017-01-01

    This volume of selected and peer-reviewed contributions on the latest developments in time series analysis and forecasting updates the reader on topics such as analysis of irregularly sampled time series, multi-scale analysis of univariate and multivariate time series, linear and non-linear time series models, advanced time series forecasting methods, applications in time series analysis and forecasting, advanced methods and online learning in time series and high-dimensional and complex/big data time series. The contributions were originally presented at the International Work-Conference on Time Series, ITISE 2016, held in Granada, Spain, June 27-29, 2016. The series of ITISE conferences provides a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of comput...

  13. Discretization of time series data.

    Science.gov (United States)

    Dimitrova, Elena S; Licona, M Paola Vera; McGee, John; Laubenbacher, Reinhard

    2010-06-01

    An increasing number of algorithms for biochemical network inference from experimental data require discrete data as input. For example, dynamic Bayesian network methods and methods that use the framework of finite dynamical systems, such as Boolean networks, all take discrete input. Experimental data, however, are typically continuous and represented by computer floating point numbers. The translation from continuous to discrete data is crucial in preserving the variable dependencies and thus has a significant impact on the performance of the network inference algorithms. We compare the performance of two such algorithms that use discrete data using several different discretization algorithms. One of the inference methods uses a dynamic Bayesian network framework, the other a time- and state-discrete dynamical system framework. The discretization algorithms are quantile, interval discretization, and a new algorithm introduced in this article, SSD. SSD is especially designed for short time series data and is capable of determining the optimal number of discretization states. The experiments show that both inference methods perform better with SSD than with the other methods. In addition, SSD is demonstrated to preserve the dynamic features of the time series, as well as to be robust to noise in the experimental data. A C++ implementation of SSD is available from the authors at http://polymath.vbi.vt.edu/discretization.
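
    A minimal sketch of the quantile and interval discretization baselines mentioned above, applied to a short synthetic series; the SSD algorithm introduced in the article is not reproduced here.

```python
# Interval (equal-width) versus quantile (equal-frequency) discretization into q states.
import numpy as np

def interval_discretize(x, q):
    """Split the value range into q equal-width intervals and label each point 0..q-1."""
    edges = np.linspace(x.min(), x.max(), q + 1)
    return np.clip(np.digitize(x, edges[1:-1]), 0, q - 1)

def quantile_discretize(x, q):
    """Split at empirical quantiles so each state holds roughly the same number of points."""
    edges = np.quantile(x, np.linspace(0, 1, q + 1))
    return np.clip(np.digitize(x, edges[1:-1]), 0, q - 1)

rng = np.random.default_rng(4)
expr = np.cumsum(rng.normal(size=30))      # a short, noisy "expression-like" series
print(interval_discretize(expr, 3))
print(quantile_discretize(expr, 3))
```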

  14. Interruption Management in the Intensive Care Unit: Predicting Resumption Times and Assessing Distributed Support

    Science.gov (United States)

    Grundgeiger, Tobias; Sanderson, Penelope; MacDougall, Hamish G.; Venkatesh, Balasubramanian

    2010-01-01

    Interruptions are frequent in many work domains. Researchers in health care have started to study interruptions extensively, but their studies usually do not use a theoretically guided approach. Conversely, researchers conducting theoretically rich laboratory studies on interruptions have not usually investigated how effectively their findings…

  15. A Course in Time Series Analysis

    CERN Document Server

    Peña, Daniel; Tsay, Ruey S

    2011-01-01

    New statistical methods and future directions of research in time series A Course in Time Series Analysis demonstrates how to build time series models for univariate and multivariate time series data. It brings together material previously available only in the professional literature and presents a unified view of the most advanced procedures available for time series model building. The authors begin with basic concepts in univariate time series, providing an up-to-date presentation of ARIMA models, including the Kalman filter, outlier analysis, automatic methods for building ARIMA models, a

  16. Interrupting Hate: Homophobia in Schools and What Literacy Can Do about It. Language & Literacy Series

    Science.gov (United States)

    Blackburn, Mollie V.

    2011-01-01

    This timely and important book focuses on the problems of heterosexism and homophobia in schools and explores how these forms of oppression impact LGBTQQ youth, as well as all young people. The author shows how concerned teachers can engage students in literacy practices both in and out of school to develop positive learning environments. The…

  17. Global Population Density Grid Time Series Estimates

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Population Density Grid Time Series Estimates provide a back-cast time series of population density grids based on the year 2000 population grid from SEDAC's...

  18. Global Population Count Grid Time Series Estimates

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Population Count Grid Time Series Estimates provide a back-cast time series of population grids based on the year 2000 population grid from SEDAC's Global...

  19. Time Series Analysis and Forecasting by Example

    CERN Document Server

    Bisgaard, Soren

    2011-01-01

    An intuition-based approach enables you to master time series analysis with ease Time Series Analysis and Forecasting by Example provides the fundamental techniques in time series analysis using various examples. By introducing necessary theory through examples that showcase the discussed topics, the authors successfully help readers develop an intuitive understanding of seemingly complicated time series models and their implications. The book presents methodologies for time series analysis in a simplified, example-based approach. Using graphics, the authors discuss each presented example in

  20. Acceptability of smartphone technology to interrupt sedentary time in adults with diabetes.

    Science.gov (United States)

    Pellegrini, Christine A; Hoffman, Sara A; Daly, Elyse R; Murillo, Manuel; Iakovlev, Gleb; Spring, Bonnie

    2015-09-01

    Breaking up sitting time with light- or moderate-intensity physical activity may help to alleviate some negative health effects of sedentary behavior, but few studies have examined ways to effectively intervene. This feasibility study examined the acceptability of a new technology (NEAT!) developed to interrupt prolonged bouts (≥20 min) of sedentary time among adults with type 2 diabetes. Eight of nine participants completed a 1-month intervention and agreed that NEAT! made them more conscious of sitting time. Most participants (87.5 %) expressed a desire to use NEAT! in the future. Sedentary time decreased by 8.1 ± 4.5 %, and light physical activity increased by 7.9 ± 5.5 % over the 1-month period. The results suggest that NEAT! is an acceptable technology to intervene on sedentary time among adults with type 2 diabetes. Future studies are needed to examine the use of the technology among larger samples and determine its effects on glucose and insulin levels.

  1. A review of subsequence time series clustering.

    Science.gov (United States)

    Zolhavarieh, Seyedjamal; Aghabozorgi, Saeed; Teh, Ying Wah

    2014-01-01

    Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and background related to subsequence time series clustering. The reviewed literature is categorized into three groups: the pre-proof, inter-proof, and post-proof periods. Moreover, various state-of-the-art approaches to performing subsequence time series clustering are discussed under each of these categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies.
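
    A minimal sketch of what subsequence time series clustering typically looks like in practice, assuming a sliding window and k-means on z-normalised subsequences; the window length, number of clusters and synthetic series are illustrative only, and the well-known caveats about sliding-window clustering still apply.

```python
# Sliding-window subsequence extraction followed by k-means clustering.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(5)
t = np.arange(1000)
series = np.sin(2 * np.pi * t / 50) + 0.2 * rng.normal(size=t.size)

window = 25
subsequences = np.array([series[i:i + window] for i in range(len(series) - window + 1)])

# z-normalise each subsequence so clusters reflect shape rather than level
subsequences = (subsequences - subsequences.mean(axis=1, keepdims=True)) / \
               subsequences.std(axis=1, keepdims=True)

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(subsequences)
print(np.bincount(labels))   # how many subsequences fall into each cluster
```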

  2. Real-Time Predictions of Reservoir Size and Rebound Time during Antiretroviral Therapy Interruption Trials for HIV.

    Directory of Open Access Journals (Sweden)

    Alison L Hill

    2016-04-01

    Full Text Available Monitoring the efficacy of novel reservoir-reducing treatments for HIV is challenging. The limited ability to sample and quantify latent infection means that supervised antiretroviral therapy (ART interruption studies are generally required. Here we introduce a set of mathematical and statistical modeling tools to aid in the design and interpretation of ART-interruption trials. We show how the likely size of the remaining reservoir can be updated in real-time as patients continue off treatment, by combining the output of laboratory assays with insights from models of reservoir dynamics and rebound. We design an optimal schedule for viral load sampling during interruption, whereby the frequency of follow-up can be decreased as patients continue off ART without rebound. While this scheme can minimize costs when the chance of rebound between visits is low, we find that the reservoir will be almost completely reseeded before rebound is detected unless sampling occurs at least every two weeks and the most sensitive viral load assays are used. We use simulated data to predict the clinical trial size needed to estimate treatment effects in the face of highly variable patient outcomes and imperfect reservoir assays. Our findings suggest that large numbers of patients-between 40 and 150-will be necessary to reliably estimate the reservoir-reducing potential of a new therapy and to compare this across interventions. As an example, we apply these methods to the two "Boston patients", recipients of allogeneic hematopoietic stem cell transplants who experienced large reductions in latent infection and underwent ART-interruption. We argue that the timing of viral rebound was not particularly surprising given the information available before treatment cessation. Additionally, we show how other clinical data can be used to estimate the relative contribution that remaining HIV+ cells in the recipient versus newly infected cells from the donor made to the

  3. Interrupt Handlers in Java

    DEFF Research Database (Denmark)

    Korsholm, Stephan; Schoeberl, Martin; Ravn, Anders Peter

    2008-01-01

    An important part of implementing device drivers is to control the interrupt facilities of the hardware platform and to program interrupt handlers. Current methods for handling interrupts in Java use a server thread waiting for the VM to signal an interrupt occurrence. It means that the interrupt is handled at a later time, which has some disadvantages. We present constructs that allow interrupts to be handled directly and not at a later point decided by a scheduler. A desirable feature of our approach is that we do not require a native middleware layer but can handle interrupts entirely with Java code. We have implemented our approach using an interpreter and a Java processor, and give an example demonstrating its use.

  4. Effective Feature Preprocessing for Time Series Forecasting

    DEFF Research Database (Denmark)

    Zhao, Junhua; Dong, Zhaoyang; Xu, Zhao

    2006-01-01

    Time series forecasting is an important area in data mining research. Feature preprocessing techniques have significant influence on forecasting accuracy and are therefore essential in a forecasting model. Although several feature preprocessing techniques have been applied in time series forecasting … performance in time series forecasting. It is demonstrated in our experiment that effective feature preprocessing can significantly enhance forecasting accuracy. This research can be a useful guidance for researchers on effectively selecting feature preprocessing techniques and integrating them with time …

  5. International Work-Conference on Time Series

    CERN Document Server

    Pomares, Héctor

    2016-01-01

    This volume presents selected peer-reviewed contributions from The International Work-Conference on Time Series, ITISE 2015, held in Granada, Spain, July 1-3, 2015. It discusses topics in time series analysis and forecasting, advanced methods and online learning in time series, high-dimensional and complex/big data time series as well as forecasting in real problems. The International Work-Conferences on Time Series (ITISE) provide a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of computer science, mathematics, statistics and econometrics.

  6. Trend Filtering Techniques for Time Series Analysis

    OpenAIRE

    López Arias, Daniel

    2016-01-01

    Time series can be found almost everywhere in our lives and, because of this, being capable of analysing them is an important task. Most of the time series we can think of are quite noisy, this being one of the main problems in extracting information from them. In this work we use Trend Filtering techniques to try to remove this noise from a series and understand its underlying trend, which gives us information about the behaviour of the series aside from the particular...

  7. Analysis of Heavy-Tailed Time Series

    DEFF Research Database (Denmark)

    Xie, Xiaolei

    This thesis is about analysis of heavy-tailed time series. We discuss tail properties of real-world equity return series and investigate the possibility that a single tail index is shared by all return series of actively traded equities in a market. Conditions for this hypothesis to be true are identified. We study the eigenvalues and eigenvectors of sample covariance and sample auto-covariance matrices of multivariate heavy-tailed time series, and particularly for time series with very high dimensions. Asymptotic approximations of the eigenvalues and eigenvectors of such matrices are found and expressed in terms of the parameters of the dependence structure, among others. Furthermore, we study an importance sampling method for estimating rare-event probabilities of multivariate heavy-tailed time series generated by matrix recursion. We show that the proposed algorithm is efficient in the sense …

  8. The foundations of modern time series analysis

    CERN Document Server

    Mills, Terence C

    2011-01-01

    This book develops the analysis of Time Series from its formal beginnings in the 1890s through to the publication of Box and Jenkins' watershed publication in 1970, showing how these methods laid the foundations for the modern techniques of Time Series analysis that are in use today.

  9. Lag space estimation in time series modelling

    DEFF Research Database (Denmark)

    Goutte, Cyril

    1997-01-01

    The purpose of this article is to investigate some techniques for finding the relevant lag-space, i.e. input information, for time series modelling. This is an important aspect of time series modelling, as it conditions the design of the model through the regressor vector a.k.a. the input layer...

  10. Time-frequency analysis of econometric time series

    Science.gov (United States)

    Corinaldi, Sharif; Cohen, Leon

    2007-06-01

    We review the basic concepts of time-frequency analysis, methods that indicate not only which frequencies are present in a time series but also when they existed. A number of examples are given to illustrate the possible application of these methods to econometric series. The methods are applied to the Beveridge Wheat Price Series.

  11. Entropic Analysis of Electromyography Time Series

    Science.gov (United States)

    Kaufman, Miron; Sung, Paul

    2005-03-01

    We are in the process of assessing the effectiveness of fractal and entropic measures for the diagnosis of low back pain from surface electromyography (EMG) time series. Surface electromyography (EMG) is used to assess patients with low back pain. In a typical EMG measurement, the voltage is measured every millisecond. We observed back muscle fatigue over one minute, which results in a time series with 60,000 entries. We characterize the complexity of time series by computing the time dependence of the Shannon entropy. The analysis of the time series from different relevant muscles from healthy and low back pain (LBP) individuals provides evidence that the level of variability of back muscle activities is much larger for healthy individuals than for individuals with LBP. In general, the time dependence of the entropy shows a crossover from a diffusive regime to a regime characterized by long time correlations (self-organization) at about 0.01 s.
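
    A minimal sketch of a windowed Shannon-entropy profile for an EMG-like voltage record, assuming 1 kHz sampling, one-second windows and a histogram-based entropy estimate; the fatigue-related amplitude decay is simulated, not measured.

```python
# Shannon entropy computed over successive windows of a surrogate EMG signal.
import numpy as np

def shannon_entropy(window, bins=32):
    counts, _ = np.histogram(window, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(6)
fs = 1000                                      # samples per second (1 ms sampling, assumption)
signal = rng.normal(size=60 * fs)              # 60 s surrogate for a 60,000-point EMG record
signal *= np.linspace(1.0, 0.4, signal.size)   # crude stand-in for fatigue-related amplitude decay

win = fs                                       # 1 s windows
entropy = [shannon_entropy(signal[i:i + win]) for i in range(0, signal.size - win + 1, win)]
print(np.round(entropy[:5], 3), "...", np.round(entropy[-5:], 3))
```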

  12. Modeling Time Series Data for Supervised Learning

    Science.gov (United States)

    Baydogan, Mustafa Gokce

    2012-01-01

    Temporal data are increasingly prevalent and important in analytics. Time series (TS) data are chronological sequences of observations and an important class of temporal data. Fields such as medicine, finance, learning science and multimedia naturally generate TS data. Each series provides a high-dimensional data vector that challenges the learning…

  13. Time series modeling, computation, and inference

    CERN Document Server

    Prado, Raquel

    2010-01-01

    The authors systematically develop a state-of-the-art analysis and modeling of time series. … this book is well organized and well written. The authors present various statistical models for engineers to solve problems in time series analysis. Readers no doubt will learn state-of-the-art techniques from this book.-Hsun-Hsien Chang, Computing Reviews, March 2012My favorite chapters were on dynamic linear models and vector AR and vector ARMA models.-William Seaver, Technometrics, August 2011… a very modern entry to the field of time-series modelling, with a rich reference list of the current lit

  14. Visibility Graph Based Time Series Analysis.

    Science.gov (United States)

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both its microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as being descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide us with rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of network of networks.
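
    A minimal sketch of the natural visibility graph mapping that the method above relies on, using the standard visibility criterion and a plain O(n^2) construction on a short random-walk segment.

```python
# Natural visibility graph of a series segment: two samples are linked if the straight
# line between them passes above every intermediate sample.
import numpy as np

def visibility_graph(y):
    """Return the adjacency matrix of the natural visibility graph of series y."""
    n = len(y)
    adj = np.zeros((n, n), dtype=bool)
    for i in range(n - 1):
        for j in range(i + 1, n):
            ks = np.arange(i + 1, j)
            line = y[j] + (y[i] - y[j]) * (j - ks) / (j - i)
            if np.all(y[ks] < line):
                adj[i, j] = adj[j, i] = True
    return adj

rng = np.random.default_rng(7)
segment = np.cumsum(rng.normal(size=50))    # e.g. one segment of a fractional-motion-like walk
adj = visibility_graph(segment)
print("average degree:", adj.sum(axis=1).mean())
```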

  15. Methods comparison by time series analysis

    International Nuclear Information System (INIS)

    Giovino, J.

    1986-01-01

    One role of the U.S. Environmental Protection Agency (EPA) is that of monitor for laboratories under contract to perform chemical analyses. In general this program involves periodic analyses and reporting of unknown radionuclides in water. The radiochemistry data for the years 1980-1984 have been summarized. They represent several radionuclides and various methods used by numerous laboratories. Any series of measurements taken at successive time points is a time series, and is thus a candidate for time series analysis. The purpose of such an analysis is to see what changes take place over time in the event being observed, to see if the performance is better or worse than it was expected to be, and to predict future behavior. To illustrate the step-by-step process of a time series analysis, the radionuclide ²²⁶Ra was selected. The available data were generated by two methods: total radium alpha and ²²²Rn emanation. The results of the analysis are presented

  16. Data Mining Smart Energy Time Series

    Directory of Open Access Journals (Sweden)

    Janina POPEANGA

    2015-07-01

    Full Text Available With the advent of smart metering technology the amount of energy data will increase significantly, and the utilities industry will have to face another big challenge: to find relationships within time-series data and, even more, to analyze such huge numbers of time series to find useful patterns and trends with fast or even real-time response. This study makes a small review of the literature in the field, trying to demonstrate how essential the application of data mining techniques to time series is in order to make the best use of this large quantity of data, despite all the difficulties. Also, the most important Time Series Data Mining techniques are presented, highlighting their applicability in the energy domain.

  17. Time series prediction: statistical and neural techniques

    Science.gov (United States)

    Zahirniak, Daniel R.; DeSimio, Martin P.

    1996-03-01

    In this paper we compare the performance of nonlinear neural network techniques to those of linear filtering techniques in the prediction of time series. Specifically, we compare the results of using the nonlinear systems, known as multilayer perceptron and radial basis function neural networks, with the results obtained using the conventional linear Wiener filter, Kalman filter and Widrow-Hoff adaptive filter in predicting future values of stationary and non-stationary time series. Our results indicate the performance of each type of system is heavily dependent upon the form of the time series being predicted and the size of the system used. In particular, the linear filters perform adequately for linear or near linear processes while the nonlinear systems perform better for nonlinear processes. Since the linear systems take much less time to be developed, they should be tried prior to using the nonlinear systems when the linearity properties of the time series process are unknown.

  18. Measuring multiscaling in financial time-series

    International Nuclear Information System (INIS)

    Buonocore, R.J.; Aste, T.; Di Matteo, T.

    2016-01-01

    We discuss the origin of multiscaling in financial time-series and investigate how to best quantify it. Our methodology consists in separating the different sources of measured multifractality by analyzing the multi/uni-scaling behavior of synthetic time-series with known properties. We use the results from the synthetic time-series to interpret the measure of multifractality of real log-returns time-series. The main finding is that the aggregation horizon of the returns can introduce a strong bias effect on the measure of multifractality. This effect can become especially important when returns distributions have power law tails with exponents in the range (2, 5). We discuss the right aggregation horizon to mitigate this bias.

  19. Detecting nonlinear structure in time series

    International Nuclear Information System (INIS)

    Theiler, J.

    1991-01-01

    We describe an approach for evaluating the statistical significance of evidence for nonlinearity in a time series. The formal application of our method requires the careful statement of a null hypothesis which characterizes a candidate linear process, the generation of an ensemble of "surrogate" data sets which are similar to the original time series but consistent with the null hypothesis, and the computation of a discriminating statistic for the original and for each of the surrogate data sets. The idea is to test the original time series against the null hypothesis by checking whether the discriminating statistic computed for the original time series differs significantly from the statistics computed for each of the surrogate sets. While some data sets very cleanly exhibit low-dimensional chaos, there are many cases where the evidence is sketchy and difficult to evaluate. We hope to provide a framework within which such claims of nonlinearity can be evaluated. 5 refs., 4 figs
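
    A minimal sketch of the surrogate-data test described above, assuming phase-randomized (FT) surrogates for the linear Gaussian null hypothesis and a time-reversal-asymmetry discriminating statistic; the chaotic logistic map serves as the illustrative nonlinear series, and the 99-surrogate ensemble size is arbitrary.

```python
# Surrogate-data test: compare a discriminating statistic on the original series
# with its distribution over phase-randomized surrogates.
import numpy as np

rng = np.random.default_rng(2)

def ft_surrogate(x):
    """Surrogate with the same power spectrum as x but randomized Fourier phases."""
    X = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, X.size)
    phases[0] = 0.0
    if x.size % 2 == 0:
        phases[-1] = 0.0
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=x.size)

def trev(x, lag=1):
    """Time-reversal asymmetry statistic; close to zero for linear Gaussian processes."""
    d = x[lag:] - x[:-lag]
    return np.mean(d ** 3) / np.mean(d ** 2) ** 1.5

# nonlinear test series: the fully chaotic logistic map
x = np.empty(2000)
x[0] = 0.4
for t in range(1, x.size):
    x[t] = 4.0 * x[t - 1] * (1.0 - x[t - 1])

stat0 = trev(x)
null = np.array([trev(ft_surrogate(x)) for _ in range(99)])
exceed = np.sum(np.abs(null) >= abs(stat0))
print(f"original statistic {stat0:.2f}; matched or exceeded by {exceed} of 99 surrogates")
```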

  20. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  1. Simulating multivariate time series using flocking

    OpenAIRE

    Schruben, Lee W.; Singham, Dashi I.

    2010-01-01

    Refereed Conference Paper Notions from agent based modeling (ABM) can be used to simulate multivariate time series. An example is given using the ABM concept of flocking, which models the behaviors of birds (called boids) in a flock. A multivariate time series is mapped into the coordinates of a bounded orthotope. This represents the flight path of a boid. Other boids are generated that flock around this data boid. The coordinates of these new boids are mapped back to simulate replicates o...

  2. Dimensionality reduction for time series data

    OpenAIRE

    Vidaurre, Diego; Rezek, Iead; Harrison, Samuel L.; Smith, Stephen S.; Woolrich, Mark

    2014-01-01

    Despite the fact that they do not consider the temporal nature of data, classic dimensionality reduction techniques, such as PCA, are widely applied to time series data. In this paper, we introduce a factor decomposition specific for time series that builds upon the Bayesian multivariate autoregressive model and hence evades the assumption that data points are mutually independent. The key is to find a low-rank estimation of the autoregressive matrices. As in the probabilistic version of othe...

  3. DROP: Dimensionality Reduction Optimization for Time Series

    OpenAIRE

    Suri, Sahaana; Bailis, Peter

    2017-01-01

    Dimensionality reduction is critical in analyzing increasingly high-volume, high-dimensional time series. In this paper, we revisit a now-classic study of time series dimensionality reduction operators and find that for a given quality constraint, Principal Component Analysis (PCA) uncovers representations that are over 2x smaller than those obtained via alternative techniques favored in the literature. However, as classically implemented via Singular Value Decomposition (SVD), PCA is incredi...

  4. Boosting Nonlinear Additive Autoregressive Time Series

    OpenAIRE

    Shafik, Nivien; Tutz, Gerhard

    2007-01-01

    In recent years several methods for the analysis of nonlinear autoregressive time series have been proposed. As in linear autoregressive models, the main problems are model identification, estimation and prediction. A boosting method is proposed that performs model identification and estimation simultaneously within the framework of nonlinear autoregressive time series. The method allows the selection of influential terms from a large number of potential lags and exogenous variables. The influence...

  5. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat

    2008-01-01

    An accessible introduction to the most current thinking in and practicality of forecasting techniques in the context of time-oriented data. Analyzing time-oriented data and forecasting are among the most important problems that analysts face across many fields, ranging from finance and economics to production operations and the natural sciences. As a result, there is a widespread need for large groups of people in a variety of fields to understand the basic concepts of time series analysis and forecasting. Introduction to Time Series Analysis and Forecasting presents the time series analysis branch of applied statistics as the underlying methodology for developing practical forecasts, and it also bridges the gap between theory and practice by equipping readers with the tools needed to analyze time-oriented data and construct useful, short- to medium-term, statistically based forecasts.

  6. Time averaging, ageing and delay analysis of financial time series

    Science.gov (United States)

    Cherstvy, Andrey G.; Vinod, Deepak; Aghion, Erez; Chechkin, Aleksei V.; Metzler, Ralf

    2017-06-01

    We introduce three strategies for the analysis of financial time series based on time averaged observables. These comprise the time averaged mean squared displacement (MSD) as well as the ageing and delay time methods for varying fractions of the financial time series. We explore these concepts via statistical analysis of historic time series for several Dow Jones Industrial indices for the period from the 1960s to 2015. Remarkably, we discover a simple universal law for the delay time averaged MSD. The observed features of the financial time series dynamics agree well with our analytical results for the time averaged measurables for geometric Brownian motion, underlying the famed Black-Scholes-Merton model. The concepts we promote here are shown to be useful for financial data analysis and enable one to unveil new universal features of stock market dynamics.
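
    A minimal sketch of the time-averaged MSD of a log-price trajectory over a range of lag (delay) times, with geometric Brownian motion simulated in place of the Dow Jones data analysed in the paper.

```python
# Time-averaged mean squared displacement along a single simulated log-price trajectory.
import numpy as np

def time_averaged_msd(x, lags):
    """TAMSD(Delta) = mean over t of (x(t+Delta) - x(t))^2 along one trajectory."""
    return np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags])

rng = np.random.default_rng(8)
n, dt, mu, sigma = 5000, 1.0, 5e-5, 0.01      # hypothetical daily parameters
log_price = np.cumsum((mu - 0.5 * sigma ** 2) * dt
                      + sigma * np.sqrt(dt) * rng.normal(size=n))

lags = np.arange(1, 101)
tamsd = time_averaged_msd(log_price, lags)
print(tamsd[:3], tamsd[-1])   # grows roughly linearly in the lag for Brownian-type motion
```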

  7. Efficient Approximate OLAP Querying Over Time Series

    DEFF Research Database (Denmark)

    Perera, Kasun Baruhupolage Don Kasun Sanjeewa; Hahmann, Martin; Lehner, Wolfgang

    2016-01-01

    The ongoing trend for data gathering not only produces larger volumes of data, but also increases the variety of recorded data types. Out of these, especially time series, e.g. various sensor readings, have attracted attention in the domains of business intelligence and decision making. As OLAP queries play a major role in these domains, it is desirable to also execute them on time series data. While this is not a problem on the conceptual level, it can become a bottleneck with regards to query run-time. In general, processing OLAP queries gets more computationally intensive as the volume … are either costly or require continuous maintenance. In this paper we propose an approach for approximate OLAP querying of time series that offers constant latency and is maintenance-free. To achieve this, we identify similarities between aggregation cuboids and propose algorithms that eliminate

  8. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat

    2015-01-01

    Praise for the First Edition: "…[t]he book is great for readers who need to apply the methods and models presented but have little background in mathematics and statistics." -MAA Reviews. Thoroughly updated throughout, Introduction to Time Series Analysis and Forecasting, Second Edition presents the underlying theories of time series analysis that are needed to analyze time-oriented data and construct real-world short- to medium-term statistical forecasts. Authored by highly-experienced academics and professionals in engineering statistics, the Second Edition features discussions on both

  9. Building Chaotic Model From Incomplete Time Series

    Science.gov (United States)

    Siek, Michael; Solomatine, Dimitri

    2010-05-01

    This paper presents a number of novel techniques for building a predictive chaotic model from incomplete time series. A predictive chaotic model is built by reconstructing the time-delayed phase space from observed time series, and the prediction is made by a global model or adaptive local models based on the dynamical neighbors found in the reconstructed phase space. In general, the building of any data-driven model depends on the completeness and quality of the data itself. However, the completeness of the data availability cannot always be guaranteed, since the measurement or data transmission is intermittently not working properly for various reasons. We propose two main solutions for dealing with incomplete time series: using imputing and non-imputing methods. For imputing methods, we utilized interpolation methods (weighted sum of linear interpolations, Bayesian principal component analysis and cubic spline interpolation) and predictive models (neural network, kernel machine, chaotic model) for estimating the missing values. After imputing the missing values, the phase space reconstruction and chaotic model prediction are executed as a standard procedure. For non-imputing methods, we reconstructed the time-delayed phase space from the observed time series with missing values. This reconstruction results in non-continuous trajectories. However, the local model prediction can still be made from the other dynamical neighbors reconstructed from non-missing values. We implemented and tested these methods to construct a chaotic model for predicting storm surges at Hoek van Holland, the entrance of Rotterdam Port. The hourly surge time series is available for the period 1990-1996. For measuring the performance of the proposed methods, a synthetic time series, with missing values generated by applying a particular random process to the original (complete) time series, is utilized. Two main performance measures are used in this work: (1) error measures between the actual
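
    A minimal sketch of the imputing route described above, assuming plain linear interpolation of the missing values (the simplest of the interpolation options listed) before delay-coordinate reconstruction; the tidal-like signal and missingness pattern stand in for the Hoek van Holland surge data.

```python
# Linear interpolation of missing samples followed by delay-coordinate reconstruction.
import numpy as np

rng = np.random.default_rng(9)
t = np.arange(2000, dtype=float)
surge = np.sin(2 * np.pi * t / 12.42) + 0.2 * rng.normal(size=t.size)   # crude hourly "surge"

missing = rng.random(t.size) < 0.1              # knock out ~10% of the samples
observed = ~missing
imputed = np.interp(t, t[observed], surge[observed])   # linear interpolation of the gaps

def embed(series, dim=3, tau=1):
    """Delay-coordinate embedding of the (now complete) series."""
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau: i * tau + n] for i in range(dim)][::-1])

phase_space = embed(imputed)
print(phase_space.shape)     # reconstructed trajectory ready for local-model prediction
```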

  10. Layered Ensemble Architecture for Time Series Forecasting.

    Science.gov (United States)

    Rahman, Md Mustafizur; Islam, Md Monirul; Murase, Kazuyuki; Yao, Xin

    2016-01-01

    Time series forecasting (TSF) has been widely used in many application areas such as science, engineering, and finance. The phenomena generating time series are usually unknown and information available for forecasting is only limited to the past values of the series. It is, therefore, necessary to use an appropriate number of past values, termed lag, for forecasting. This paper proposes a layered ensemble architecture (LEA) for TSF problems. Our LEA consists of two layers, each of which uses an ensemble of multilayer perceptron (MLP) networks. While the first ensemble layer tries to find an appropriate lag, the second ensemble layer employs the obtained lag for forecasting. Unlike most previous work on TSF, the proposed architecture considers both accuracy and diversity of the individual networks in constructing an ensemble. LEA trains different networks in the ensemble by using different training sets with an aim of maintaining diversity among the networks. However, it uses the appropriate lag and combines the best trained networks to construct the ensemble. This indicates LEA's emphasis on the accuracy of the networks. The proposed architecture has been tested extensively on time series data from the NN3 and NN5 neural network competitions. It has also been tested on several standard benchmark time series data. In terms of forecasting accuracy, our experimental results have revealed clearly that LEA is better than other ensemble and nonensemble methods.

  11. Time series clustering in large data sets

    Directory of Open Access Journals (Sweden)

    Jiří Fejfar

    2011-01-01

    Full Text Available The clustering of time series is a widely researched area. There are many methods for dealing with this task. We are actually using the Self-organizing map (SOM) with the unsupervised learning algorithm for clustering of time series. After the first experiment (Fejfar, Weinlichová, Šťastný, 2009) it seems that the whole concept of the clustering algorithm is correct but that we have to perform time series clustering on a much larger dataset to obtain more accurate results and to find the correlation between configured parameters and results more precisely. The second requirement arose from a need for a well-defined evaluation of results. It seems useful to use sound recordings as instances of time series again. There are many recordings to use in digital libraries, and many interesting features and patterns can be found in this area. In this experiment we are searching for recordings with a similar development of information density. It can be used for musical form investigation, cover song detection and many other applications. The objective of the presented paper is to compare clustering results made with different parameters of feature vectors and the SOM itself. We are describing time series in a simplistic way, evaluating standard deviations for separated parts of recordings. The resulting feature vectors are clustered with the SOM in batch training mode with different topologies varying from few neurons to large maps. There are other algorithms discussed, usable for finding similarities between time series, and finally conclusions for further research are presented. We also present an overview of the related current literature and projects.

  12. Forecasting with nonlinear time series models

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    In this paper, nonlinear models are restricted to mean nonlinear parametric models. Several such models popular in time series econometrics are presented and some of their properties discussed. This includes two models based on universal approximators: the Kolmogorov-Gabor polynomial model ... applied to economic forecasting problems, is briefly highlighted. A number of large published studies comparing macroeconomic forecasts obtained using different time series models are discussed, and the paper also contains a small simulation study comparing recursive and direct forecasts in a partic...

  13. Introduction to time series and forecasting

    CERN Document Server

    Brockwell, Peter J

    2016-01-01

    This book is aimed at the reader who wishes to gain a working knowledge of time series and forecasting methods as applied to economics, engineering and the natural and social sciences. It assumes knowledge only of basic calculus, matrix algebra and elementary statistics. This third edition contains detailed instructions for the use of the professional version of the Windows-based computer package ITSM2000, now available as a free download from the Springer Extras website. The logic and tools of time series model-building are developed in detail. Numerous exercises are included, and the software can be used to analyze and forecast data sets of the user's own choosing. The book can also be used in conjunction with other time series packages such as those included in R. The programs in ITSM2000, however, are menu-driven and can be used with minimal investment of time in the computational details. The core of the book covers stationary processes, ARMA and ARIMA processes, multivariate time series and state-space mod...

  14. TimeSeer: Scagnostics for high-dimensional time series.

    Science.gov (United States)

    Dang, Tuan Nhon; Anand, Anushka; Wilkinson, Leland

    2013-03-01

    We introduce a method (Scagnostic time series) and an application (TimeSeer) for organizing multivariate time series and for guiding interactive exploration through high-dimensional data. The method is based on nine characterizations of the 2D distributions of orthogonal pairwise projections on a set of points in multidimensional Euclidean space. These characterizations include measures such as density, skewness, shape, outliers, and texture. Working directly with these Scagnostic measures, we can locate anomalous or interesting subseries for further analysis. Our application is designed to handle the types of doubly multivariate data series that are often found in security, financial, social, and other sectors.

  15. Complex dynamics in ecological time series

    Science.gov (United States)

    Peter Turchin; Andrew D. Taylor

    1992-01-01

    Although the possibility of complex dynamical behaviors (limit cycles, quasiperiodic oscillations, and aperiodic chaos) has been recognized theoretically, most ecologists are skeptical of their importance in nature. In this paper we develop a methodology for reconstructing endogenous (or deterministic) dynamics from ecological time series. Our method consists of fitting...

  16. On clustering fMRI time series

    DEFF Research Database (Denmark)

    Goutte, Cyril; Toft, Peter Aundal; Rostrup, E.

    1999-01-01

    Analysis of fMRI time series is often performed by extracting one or more parameters for the individual voxels. Methods based, e.g., on various statistical tests are then used to yield parameters corresponding to probability of activation or activation strength. However, these methods do not indi...

  17. Lecture notes for Advanced Time Series Analysis

    DEFF Research Database (Denmark)

    Madsen, Henrik; Holst, Jan

    1997-01-01

    A first version of these notes was used for the lectures in Grenoble; they have since been extended and improved (together with Jan Holst) and are used in Ph.D. courses on Advanced Time Series Analysis at IMM and at the Department of Mathematical Statistics, University of Lund, 1994, 1997, ...

  18. Inferring interdependencies from short time series

    Indian Academy of Sciences (India)

    chance – a much weaker null hypothesis than when trying to ensure that the observed value of a test statis... for short time series and performs better than existing methods. The details are discussed in the ... seen to perform well in a significant number of combinations, although without any discernible relation to the.

  19. Argos: An Optimized Time-Series Photometer

    Indian Academy of Sciences (India)

    2016-01-27

    We designed a prime focus CCD photometer, Argos, optimized for high speed time-series measurements of blue variables (Nather & Mukadam 2004) for the 2.1 m telescope at McDonald Observatory. Lack of any intervening optics between the primary mirror and the CCD makes the instrument highly ...

  20. Markov Trends in Macroeconomic Time Series

    NARCIS (Netherlands)

    R. Paap (Richard)

    1997-01-01

    textabstractMany macroeconomic time series are characterised by long periods of positive growth, expansion periods, and short periods of negative growth, recessions. A popular model to describe this phenomenon is the Markov trend, which is a stochastic segmented trend where the slope depends on the

  1. Nonlinear Time Series Analysis via Neural Networks

    Science.gov (United States)

    Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin

    This article deals with a time series analysis based on neural networks in order to achieve effective forex market pattern recognition [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)]. Our goal is to find and recognize important patterns which repeatedly appear in the market history, so as to adapt our trading system's behaviour based on them.

  2. Inferring interdependencies from short time series

    Indian Academy of Sciences (India)

    underlying structural difference in their overall economies, as well as their agricultural sectors. Keywords. Interdependence; correlation; inner composition alignment; time series ... ables – sharing common properties within a climate zone – and socio-economic indicators, where information is aggregated only on a ...

  3. Recent Advances in Energy Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    Francisco Martínez-Álvarez

    2017-06-01

    Full Text Available This editorial summarizes the performance of the special issue entitled Energy Time Series Forecasting, which was published in MDPI’s Energies journal. The special issue took place in 2016 and accepted a total of 21 papers from twelve different countries. Electrical, solar, or wind energy forecasting were the most analyzed topics, introducing brand new methods with very sound results.

  4. The Statistical Analysis of Time Series

    CERN Document Server

    Anderson, T W

    2011-01-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences George

  5. Hurst exponents for short time series

    Science.gov (United States)

    Qi, Jingchao; Yang, Huijie

    2011-12-01

    A concept called the balanced estimator of diffusion entropy is proposed to quantitatively detect scalings in short time series. Its effectiveness is verified by successfully detecting scaling properties for a large number of artificial fractional Brownian motions. Calculations show that this method can give reliable scalings for short time series with length of order 10^2. It is also used to detect scalings in the Shanghai Stock Index, five stock catalogs, and a total of 134 stocks collected from the Shanghai Stock Exchange Market. The scaling exponent for each catalog is significantly larger compared with that for the stocks included in the catalog. Selecting a window of size 650, the evolution of scaling for the Shanghai Stock Index is obtained by sliding the window along the series. Global patterns in the evolutionary process are captured from the smoothed evolutionary curve. By comparing these patterns with the list of important events in the history of the considered stock market, the evolution of scaling is matched with the stock index series. We find that the important events fit very well with global transitions of the scaling behaviors.
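
    For orientation, the snippet below estimates a scaling exponent with the classical rescaled-range (R/S) statistic; it is a textbook stand-in, not the balanced diffusion-entropy estimator proposed in the paper, and the chunk sizes are arbitrary choices.

      import numpy as np

      def hurst_rs(x, min_chunk=8):
          """Classical rescaled-range (R/S) estimate of a scaling (Hurst) exponent."""
          x = np.asarray(x, dtype=float)
          n = len(x)
          sizes = np.unique(np.logspace(np.log10(min_chunk), np.log10(n // 2), 10).astype(int))
          rs = []
          for s in sizes:
              ratios = []
              for start in range(0, n - s + 1, s):
                  chunk = x[start:start + s]
                  dev = np.cumsum(chunk - chunk.mean())
                  if chunk.std() > 0:
                      ratios.append((dev.max() - dev.min()) / chunk.std())
              rs.append(np.mean(ratios))
          slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
          return slope

      rng = np.random.default_rng(1)
      print(hurst_rs(rng.standard_normal(200)))   # close to 0.5 for uncorrelated noise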

  6. Inverse statistical approach in heartbeat time series

    International Nuclear Information System (INIS)

    Ebadi, H; Shirazi, A H; Mani, Ali R; Jafari, G R

    2011-01-01

    We present an investigation of heart cycle time series using inverse statistical analysis, a concept borrowed from the study of turbulence. Using this approach, we studied the distribution of the exit times needed to achieve a predefined level of heart rate alteration. Such analysis uncovers the most likely waiting time needed to reach a certain change in the heart rate. This analysis showed a significant difference between the raw data and shuffled data when the heart rate accelerates or decelerates to a rare event. We also report that inverse statistical analysis can distinguish between electrocardiograms taken from healthy volunteers and those taken from patients with heart failure.
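
    The exit-time idea can be sketched as follows; the tolerance level rho and the synthetic RR intervals are illustrative assumptions, not the alteration levels or data used in the study.

      import numpy as np

      def exit_times(rr_intervals, rho=0.05):
          """For each beat, count how many beats pass before the heart rate changes
          by at least a fraction rho of its starting value (inverse-statistics exit time)."""
          rate = 60.0 / np.asarray(rr_intervals, dtype=float)   # beats per minute
          times = []
          for i in range(len(rate) - 1):
              moved = np.flatnonzero(np.abs(rate[i + 1:] - rate[i]) >= rho * rate[i])
              if moved.size:
                  times.append(moved[0] + 1)       # beats waited until the level is reached
          return np.array(times)

      rng = np.random.default_rng(2)
      rr = 0.8 + 0.05 * rng.standard_normal(5000)  # synthetic RR intervals in seconds
      tau = exit_times(rr, rho=0.05)
      print("most likely waiting time (beats):", np.bincount(tau).argmax())
      # comparing this distribution for raw vs. shuffled data mimics the test in the paper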

  7. Researches on Nutritional Behaviour in Romanian Black and White Primiparous Cows. Interruptions Number and their Duration in the Ration Consumption Time

    Directory of Open Access Journals (Sweden)

    Silvia Erina

    2012-10-01

    Full Text Available The study was carried out on 9 Romanian Black and White primiparous cows. The aim of this study was to determine some aspects of the nutritional behaviour of the cows. During the experiments, the following behaviour aspects were determined: the number of interruptions and their duration during the feed consumption time. Results showed that the administration order of forages had an influence on the interruption number, which was 0.74 less for hay in fibrous-succulent order (O1). For silage, the interruption number was 0.42 higher in fibrous-succulent order (O1). Between portion 1 (P1) and portion 3 (P3), the significant difference (p<0.05) was for interruption duration during silage consumption, in favour of portion P1. Distinctly significant differences (p<0.01) were observed for the interruption number during silage consumption (0.95 sec. higher in P1 than in P3) and for interruption duration (5.96 sec. higher in P1 than in P3). Between P2 and P3, a significant difference (p<0.05) was observed for the interruption number during silage consumption and for the average interruption duration during beet consumption, in favour of portion P2. Regarding the number of feedings per portion, the differences were always higher in the second feeding F1 than in the first feeding F2.

  8. Nonlinear time series analysis with R

    CERN Document Server

    Huffaker, Ray; Rosa, Rodolfo

    2017-01-01

    In the process of data analysis, the investigator often faces highly volatile and random-appearing observed data. A vast body of literature shows that the assumption of underlying stochastic processes did not necessarily represent the nature of the processes under investigation and, when other tools were used, deterministic features emerged. Nonlinear Time Series Analysis (NLTS) allows researchers to test whether observed volatility conceals systematic nonlinear behavior, and to rigorously characterize the governing dynamics. Behavioral patterns detected by nonlinear time series analysis, along with scientific principles and other expert information, guide the specification of mechanistic models that serve to explain real-world behavior rather than merely reproduce it. Often there is a misconception regarding the complexity of the level of mathematics needed to understand and utilize the tools of NLTS (for instance chaos theory). However, the mathematics used in NLTS is much simpler than many other subjec...

  9. COMPUTATION OF IMAGE SIMILARITY WITH TIME SERIES

    Directory of Open Access Journals (Sweden)

    V. Balamurugan

    2011-11-01

    Full Text Available Searching for similar sequences in a large database is an important task in temporal data mining. Similarity search is concerned with efficiently locating subsequences or whole sequences in large archives of sequences. It is useful in typical data mining applications and can easily be extended to image retrieval. In this work, time series similarity analysis involving dimensionality reduction and clustering is adapted to digital images to find the similarity between them. The dimensionality-reduced time series is represented as clusters by the use of K-means clustering, and the similarity distance between two images is found as the distance between the signatures of their clusters. To quantify the extent of similarity between two sequences, the Earth Mover's Distance (EMD) is used. Experiments on different sets of images show that this technique is well suited for measuring the subjective similarity between two images.

  10. Visibility graphlet approach to chaotic time series

    Energy Technology Data Exchange (ETDEWEB)

    Mutua, Stephen [Business School, University of Shanghai for Science and Technology, Shanghai 200093 (China); Computer Science Department, Masinde Muliro University of Science and Technology, P.O. Box 190-50100, Kakamega (Kenya); Gu, Changgui, E-mail: gu-changgui@163.com, E-mail: hjyang@ustc.edu.cn; Yang, Huijie, E-mail: gu-changgui@163.com, E-mail: hjyang@ustc.edu.cn [Business School, University of Shanghai for Science and Technology, Shanghai 200093 (China)

    2016-05-15

    Many novel methods have been proposed for mapping time series into complex networks. Although some dynamical behaviors can be effectively captured by existing approaches, the preservation and tracking of the temporal behaviors of a chaotic system remains an open problem. In this work, we extended the visibility graphlet approach to investigate both discrete and continuous chaotic time series. We applied visibility graphlets to capture the reconstructed local states, so that each is treated as a node and tracked downstream to create a temporal chain link. Our empirical findings show that the approach accurately captures the dynamical properties of chaotic systems. Networks constructed from periodic dynamic phases all converge to regular networks and to unique network structures for each model in the chaotic zones. Furthermore, our results show that the characterization of chaotic and non-chaotic zones in the Lorenz system corresponds to the maximal Lyapunov exponent, thus providing a simple and straightforward way to analyze chaotic systems.
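
    The underlying time-series-to-network mapping can be illustrated with the standard natural visibility graph (nodes are time points, edges follow the visibility criterion); this is the plain visibility graph, not the graphlet extension developed in the paper, and the logistic-map orbit below is only a toy input.

      import numpy as np
      import networkx as nx

      def natural_visibility_graph(x):
          """Nodes are time points; (a, x[a]) and (b, x[b]) are linked when every
          sample between them lies below the straight line joining them."""
          x = np.asarray(x, dtype=float)
          n = len(x)
          g = nx.Graph()
          g.add_nodes_from(range(n))
          for a in range(n - 1):
              g.add_edge(a, a + 1)                 # consecutive points always see each other
              for b in range(a + 2, n):
                  t = np.arange(a + 1, b)
                  line = x[b] + (x[a] - x[b]) * (b - t) / (b - a)
                  if np.all(x[t] < line):
                      g.add_edge(a, b)
          return g

      # toy usage: visibility graph of a chaotic logistic-map orbit
      x = np.empty(200)
      x[0] = 0.4
      for i in range(199):
          x[i + 1] = 4.0 * x[i] * (1.0 - x[i])
      g = natural_visibility_graph(x)
      print(g.number_of_nodes(), "nodes,", g.number_of_edges(), "edges")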

  11. Markov Trends in Macroeconomic Time Series

    OpenAIRE

    Paap, Richard

    1997-01-01

    textabstractMany macroeconomic time series are characterised by long periods of positive growth, expansion periods, and short periods of negative growth, recessions. A popular model to describe this phenomenon is the Markov trend, which is a stochastic segmented trend where the slope depends on the value of an unobserved two-state first-order Markov process. The two slopes of the Markov trend describe the growth rates in the two phases of the business cycle. This thesis deals with a Bayesian ...

  12. General bulk service queueing system with N-policy, multiplevacations, setup time and server breakdown without interruption

    Science.gov (United States)

    Sasikala, S.; Indhira, K.; Chandrasekaran, V. M.

    2017-11-01

    In this paper, we have considered an MX/(a,b)/1 queueing system with server breakdown without interruption, multiple vacations, setup times and N-policy. After a batch of service, if the number of customers in the queue, ξ, is less than a, the server leaves for a vacation. After a vacation, if the server finds at least N customers waiting for service, then the server needs a setup time to start the service. After a batch of service, if the number of waiting customers in the queue is ξ (≥ a), then the server serves a batch of min(ξ,b) customers, where b ≥ a. We derived the probability generating function of the queue length at an arbitrary time epoch. Further, we obtained some important performance measures.

  13. Forecasting autoregressive time series under changing persistence

    DEFF Research Database (Denmark)

    Kruse, Robinson

    Changing persistence in time series models means that a structural change from nonstationarity to stationarity, or vice versa, occurs over time. Such a change has important implications for forecasting, as negligence may lead to inaccurate model predictions. This paper derives generally applicable ... recommendations, no matter whether a change in persistence occurs or not. Seven different forecasting strategies based on a bias-corrected estimator are compared by means of a large-scale Monte Carlo study. The results for decreasing and increasing persistence are highly asymmetric and new to the literature. Its...

  14. Effect of an evidence-based website on healthcare usage: an interrupted time-series study.

    NARCIS (Netherlands)

    Spoelman, W.A.; Bonten, T.N.; Waal, M.W.M. de; Drenthen, T.; Smeele, I.J.M.; Nielen, M.M.; Chavannes, N.

    2016-01-01

    Objectives: Healthcare costs and usage are rising. Evidence-based online health information may reduce healthcare usage, but the evidence is scarce. The objective of this study was to determine whether the release of a nationwide evidence-based health website was associated with a reduction in

  15. Analysis of JET ELMy time series

    International Nuclear Information System (INIS)

    Zvejnieks, G.; Kuzovkov, V.N.

    2005-01-01

    Full text: Achieving the planned operational regime in next-generation tokamaks (such as ITER) still faces principal problems. One of the main challenges is obtaining control of edge localized modes (ELMs), which should lead to both long plasma pulse times and a reasonable divertor lifetime. In order to control ELMs, the hypothesis was proposed by Degeling [1] that ELMs exhibit features of chaotic dynamics and thus standard chaos control methods might be applicable. However, our findings, which are based on the nonlinear autoregressive (NAR) model, contradict this hypothesis for JET ELMy time series. In turn, this means that ELM behavior is of a relaxation or random type. These conclusions coincide with our previous results obtained for ASDEX Upgrade time series [2]. [1] A.W. Degeling, Y.R. Martin, P.E. Bak, J. B.Lister, and X. Llobet, Plasma Phys. Control. Fusion 43, 1671 (2001). [2] G. Zvejnieks, V.N. Kuzovkov, O. Dumbrajs, A.W. Degeling, W. Suttrop, H. Urano, and H. Zohm, Physics of Plasmas 11, 5658 (2004)

  16. Timing calibration and spectral cleaning of LOFAR time series data

    OpenAIRE

    Corstanje, A.; Buitink, S.; Enriquez, J. E.; Falcke, H.; Hörandel, J. R.; Krause, M.; Nelles, A.; Rachen, J. P.; Schellart, P.; Scholten, O.; ter Veen, S.; Thoudam, S.; Trinh, T. N. G.

    2016-01-01

    We describe a method for spectral cleaning and timing calibration of short time series data of the voltage in individual radio interferometer receivers. It makes use of phase differences in fast Fourier transform (FFT) spectra across antenna pairs. For strong, localized terrestrial sources these are stable over time, while being approximately uniform-random for a sum over many sources or for noise. Using only milliseconds-long datasets, the method finds the strongest interfering transmitters,...

  17. Interruptions disrupt reading comprehension.

    Science.gov (United States)

    Foroughi, Cyrus K; Werner, Nicole E; Barragán, Daniela; Boehm-Davis, Deborah A

    2015-06-01

    Previous research suggests that being interrupted while reading a text does not disrupt the later recognition or recall of information from that text. This research is used as support for Ericsson and Kintsch's (1995) long-term working memory (LT-WM) theory, which posits that disruptions while reading (e.g., interruptions) do not impair subsequent text comprehension. However, to fully comprehend a text, individuals may need to do more than recognize or recall information that has been presented in the text at a later time. Reading comprehension often requires individuals to connect and synthesize information across a text (e.g., successfully identifying complex topics such as themes and tones) and not just make a familiarity-based decision (i.e., recognition). The goal of this study was to determine whether interruptions while reading disrupt reading comprehension when the questions assessing comprehension require participants to connect and synthesize information across the passage. In Experiment 1, interruptions disrupted reading comprehension. In Experiment 2, interruptions disrupted reading comprehension but not recognition of information from the text. In Experiment 3, the addition of a 15-s time-out prior to the interruption successfully removed these negative effects. These data suggest that the time it takes to process the information needed to successfully comprehend text when reading is greater than that required for recognition. Any interference (e.g., an interruption) that occurs during the comprehension process may disrupt reading comprehension. This evidence supports the need for transient activation of information in working memory for successful text comprehension and does not support LT-WM theory. (c) 2015 APA, all rights reserved.

  18. Fractal fluctuations in cardiac time series

    Science.gov (United States)

    West, B. J.; Zhang, R.; Sanders, A. W.; Miniyar, S.; Zuckerman, J. H.; Levine, B. D.; Blomqvist, C. G. (Principal Investigator)

    1999-01-01

    Human heart rate, controlled by complex feedback mechanisms, is a vital index of systemic circulation. However, it has been shown that beat-to-beat values of heart rate fluctuate continually over a wide range of time scales. Herein we use the relative dispersion, the ratio of the standard deviation to the mean, to show, by systematically aggregating the data, that the correlation in the beat-to-beat cardiac time series is a modulated inverse power law. This scaling property indicates the existence of long-time memory in the underlying cardiac control process and supports the conclusion that heart rate variability is a temporal fractal. We argue that the cardiac control system has allometric properties that enable it to respond to a dynamical environment through scaling.

  19. Period Estimation in Astronomical Time Series

    Science.gov (United States)

    Protopapas, Pavlos

    2011-09-01

    Detection of periodicity and period estimation in non-uniformly sampled time series data is frequently a goal in Astronomical data analysis. There are various problems faced: Firstly, data is sampled non-uniformly which makes it difficult to use simple Fourier transform for performing spectral analysis. Secondly, there are large gaps in data which makes it difficult to interpolate the signal for re-sampling. Finally, in data sets with smaller time periods the non-uniformity in sampling and noise in data pose even greater problems because of the lesser number of samples per period. In this talk we review existing methods and then we propose new approaches in determining periods. We first use correntropy (an alternative to autocorrelation) that encapsulates non-linear correlations using a spatio-temporal kernel to estimate accurately the time period of the data. The other uses periodic kernels in non-parametric Gaussian process. These new techniques are also used for identifying periodic signals.

  20. Inferring causality from noisy time series data

    DEFF Research Database (Denmark)

    Mønster, Dan; Fusaroli, Riccardo; Tylén, Kristian

    2016-01-01

    Convergent Cross-Mapping (CCM) has shown high potential to perform causal inference in the absence of models. We assess the strengths and weaknesses of the method by varying coupling strength and noise levels in coupled logistic maps. We find that CCM fails to infer accurate coupling strength... and even causality direction in synchronized time series and in the presence of intermediate coupling. We find that the presence of noise deterministically reduces the level of cross-mapping fidelity, while the convergence rate exhibits higher levels of robustness. Finally, we propose that controlled noise...

  1. Time Series Modeling for Structural Response Prediction

    Science.gov (United States)

    1988-11-14

    (No abstract is available for this record; the retrieved text consists only of front-matter fragments: figure captions for second-mode results, 3DOF simulated data, experimental data, simulated data, and MPEM estimates for MDOF data with closely spaced modes; a table caption for 3DOF simulated data; and an abbreviation list including 2DOF (two-degree-of-freedom), 2LS (two-stage least squares method) and 3DOF (three-degree-of-freedom).)

  2. Fourier analysis of time series an introduction

    CERN Document Server

    Bloomfield, Peter

    2000-01-01

    A new, revised edition of a yet unrivaled work on frequency domain analysis Long recognized for his unique focus on frequency domain methods for the analysis of time series data as well as for his applied, easy-to-understand approach, Peter Bloomfield brings his well-known 1976 work thoroughly up to date. With a minimum of mathematics and an engaging, highly rewarding style, Bloomfield provides in-depth discussions of harmonic regression, harmonic analysis, complex demodulation, and spectrum analysis. All methods are clearly illustrated using examples of specific data sets, while ample

  3. Time series analysis of temporal networks

    Science.gov (United States)

    Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh

    2016-01-01

    A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application-oriented studies only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without knowledge of the full network structure; rather, an estimate of some of the properties is sufficient to launch the attack. In this paper we show that even if the network structure at a future time point is not available, one can still estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them and, using a standard time series forecast model, predict the properties of the temporal network at a later time instance. To this aim, we consider eight properties, such as the number of active nodes, average degree and clustering coefficient, and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as an autoregressive integrated moving average (ARIMA) process. We use cross-validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency-domain properties of the time series obtained from spectrogram analysis can be used to refine the prediction framework by identifying beforehand the cases where the prediction error is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on average across all datasets. As an application, we show how such a prediction scheme can be used to launch targeted attacks on temporal networks. Contribution to the Topical Issue

  4. Anomaly on Superspace of Time Series Data

    Science.gov (United States)

    Capozziello, Salvatore; Pincak, Richard; Kanjamapornkul, Kabin

    2017-11-01

    We apply the G-theory and anomaly of ghost and antighost fields in the theory of supersymmetry to study a superspace over time series data for the detection of hidden general supply and demand equilibrium in the financial market. We provide proof of the existence of a general equilibrium point over 14 extradimensions of the new G-theory compared with the M-theory of the 11 dimensions model of Edward Witten. We found that the process of coupling between nonequilibrium and equilibrium spinor fields of expectation ghost fields in the superspace of time series data induces an infinitely long exact sequence of cohomology from a short exact sequence of moduli state space model. If we assume that the financial market is separated into two topological spaces of supply and demand as the D-brane and anti-D-brane model, then we can use a cohomology group to compute the stability of the market as a stable point of the general equilibrium of the interaction between D-branes of the market. We obtain the result that the general equilibrium will exist if and only if the 14th Batalin-Vilkovisky cohomology group with the negative dimensions underlying 14 major hidden factors influencing the market is zero.

  5. Muscle-invasive bladder cancer treated with external beam radiation: influence of total dose, overall treatment time, and treatment interruption on local control

    International Nuclear Information System (INIS)

    Moonen, L.; Voet, H. van der; Nijs, R. de; Horenblas, S.; Hart, A.A.M.; Bartelink, H.

    1998-01-01

    Purpose: To evaluate and eventually quantify a possible influence of tumor proliferation during the external radiation course on local control in muscle-invasive bladder cancer. Methods and Materials: The influence of total dose, overall treatment time, and treatment interruption has retrospectively been analyzed in a series of 379 patients with nonmetastasized, muscle-invasive transitional cell carcinoma of the urinary bladder. All patients received external beam radiotherapy at the Netherlands Cancer Institute between 1977 and 1990. Total dose varied between 50 and 75 Gy with a mean of 60.5 Gy and a median of 60.4 Gy. Overall treatment time varied between 20 and 270 days with a mean of 49 days and a median of 41 days. The number of fractions varied between 17 and 36 with a mean of 27 and a median of 26. Two hundred and forty-four patients had a continuous radiation course, whereas 135 had an intended split course or an unintended treatment interruption. Median follow-up was 22 months for all patients and 82 months for the 30 patients still alive at last follow-up. A stepwise procedure using proportional hazard regression has been used to identify prognostic treatment factors with respect to local recurrence as sole first recurrence. Results: One hundred and thirty-six patients experienced a local recurrence, and 120 of these occurred before regional or distant metastases. The actuarial local control rate was 40.3% at 5 years and 32.3% at 10 years. In a multivariate analysis, total dose showed a significant association with local control (p = 0.0039), however in a markedly nonlinear way. In fact, only those patients treated with a dose below 57.5 Gy had a significantly higher bladder relapse rate, whereas no difference in relapse rate was found among patients treated with doses above 57.5 Gy. This remained the case even after adjustment for overall treatment time and all significant tumor and patient characteristics. The Normalized Tumor Dose (NTD) (α/β = 10) and NTD (
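
    For reference, the normalized tumor dose re-expresses a schedule of total dose D delivered in fractions of size d as the equivalent dose given in 2 Gy fractions under the linear-quadratic model; a sketch of the usual definition (the exact expression used in the paper is truncated above) is

      \mathrm{NTD}_{2\,\mathrm{Gy}} \;=\; D \,\frac{d + \alpha/\beta}{2\,\mathrm{Gy} + \alpha/\beta}, \qquad \alpha/\beta = 10\ \mathrm{Gy},

    where D is the total dose, d the dose per fraction, and α/β = 10 Gy the ratio conventionally assumed for tumor effect.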

  6. Time Series Based for Online Signature Verification

    Directory of Open Access Journals (Sweden)

    I Ketut Gede Darma Putra

    2013-11-01

    Full Text Available A signature verification system matches a tested signature against a claimed signature. This paper proposes a time-series-based feature extraction method and dynamic time warping as the matching method. The system was built by testing 900 signatures belonging to 50 participants: 3 signatures as references and 5 signatures each from the original user, simple imposters and trained imposters as test signatures. The final system was tested with 50 participants with 3 references each. The tests showed that the system accuracy without imposters is 90.449% at threshold 44, with a rejection error (FNMR) of 5.2% and an acceptance error (FMR) of 4.351%; with imposters, the system accuracy is 80.136% at threshold 27, with a rejection error (FNMR) of 15.6% and an average acceptance error (FMR) of 4.264%, with details as follows: acceptance error 0.392%, acceptance error for simple imposters 3.2%, and acceptance error for trained imposters 9.2%.
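
    The matching step can be sketched with a classic dynamic time warping (DTW) distance and a threshold decision; the feature sequences and the threshold below are hypothetical stand-ins, not the features or threshold values reported above.

      import numpy as np

      def dtw_distance(a, b):
          """Classic O(len(a)*len(b)) dynamic time warping distance between two
          1-D feature sequences, using absolute difference as the local cost."""
          a, b = np.asarray(a, float), np.asarray(b, float)
          n, m = len(a), len(b)
          D = np.full((n + 1, m + 1), np.inf)
          D[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  cost = abs(a[i - 1] - b[j - 1])
                  D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
          return D[n, m]

      # accept the tested signature if its DTW distance to a reference is below a threshold
      reference = np.sin(np.linspace(0, 3, 120))                  # stand-in reference features
      test = np.sin(np.linspace(0, 3, 140)) + 0.02 * np.random.randn(140)
      threshold = 5.0                                             # hypothetical decision threshold
      print("match" if dtw_distance(reference, test) < threshold else "reject")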

  7. Automated time series forecasting for biosurveillance.

    Science.gov (United States)

    Burkom, Howard S; Murphy, Sean Patrick; Shmueli, Galit

    2007-09-30

    For robust detection performance, traditional control chart monitoring for biosurveillance is based on input data free of trends, day-of-week effects, and other systematic behaviour. Time series forecasting methods may be used to remove this behaviour by subtracting forecasts from observations to form residuals for algorithmic input. We describe three forecast methods and compare their predictive accuracy on each of 16 authentic syndromic data streams. The methods are (1) a non-adaptive regression model using a long historical baseline, (2) an adaptive regression model with a shorter, sliding baseline, and (3) the Holt-Winters method for generalized exponential smoothing. Criteria for comparing the forecasts were the root-mean-square error, the median absolute per cent error (MedAPE), and the median absolute deviation. The median-based criteria showed best overall performance for the Holt-Winters method. The MedAPE measures over the 16 test series averaged 16.5, 11.6, and 9.7 for the non-adaptive regression, adaptive regression, and Holt-Winters methods, respectively. The non-adaptive regression forecasts were degraded by changes in the data behaviour in the fixed baseline period used to compute model coefficients. The mean-based criterion was less conclusive because of the effects of poor forecasts on a small number of calendar holidays. The Holt-Winters method was also most effective at removing serial autocorrelation, with most 1-day-lag autocorrelation coefficients below 0.15. The forecast methods were compared without tuning them to the behaviour of individual series. We achieved improved predictions with such tuning of the Holt-Winters method, but practical use of such improvements for routine surveillance will require reliable data classification methods.
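
    As a concrete illustration of the Holt-Winters approach, the sketch below fits generalized exponential smoothing to a synthetic daily series with a day-of-week effect and forms residuals for control-chart input, using the statsmodels implementation; the additive trend/seasonal choices and the synthetic data are assumptions made for the example, not the settings used on the 16 authentic data streams.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.holtwinters import ExponentialSmoothing

      # synthetic daily syndromic counts with a day-of-week effect and a slow trend
      rng = np.random.default_rng(3)
      days = pd.date_range("2023-01-01", periods=365, freq="D")
      counts = (50 + 0.02 * np.arange(365)
                + 8 * np.sin(2 * np.pi * np.arange(365) / 7)
                + rng.normal(0, 3, 365))
      y = pd.Series(counts, index=days)

      # Holt-Winters (generalized exponential smoothing) with additive trend and seasonality
      fit = ExponentialSmoothing(y, trend="add", seasonal="add", seasonal_periods=7).fit()
      residuals = y - fit.fittedvalues     # residuals would feed the detection algorithm
      print(residuals.tail())
      print(fit.forecast(7))               # one-week-ahead forecasts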

  8. Normalizing the causality between time series

    Science.gov (United States)

    Liang, X. San

    2015-08-01

    Recently, a rigorous yet concise formula was derived to evaluate information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing a Lyapunov exponent-like, one-dimensional phase-space stretching rate and a noise-to-signal ratio from the rate of information flow in the balance of the marginal entropy evolution of the flow recipient. It is verified with autoregressive models and applied to a real financial analysis problem. An unusually strong one-way causality is identified from IBM (International Business Machines Corporation) to GE (General Electric Company) in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a giant for the mainframe computer market.

  9. Palmprint Verification Using Time Series Method

    Directory of Open Access Journals (Sweden)

    A. A. Ketut Agung Cahyawan Wiranatha

    2013-11-01

    Full Text Available The use of biometrics as automatic recognition systems is growing rapidly in solving security problems, and the palmprint is one of the biometric modalities that is often used. This paper uses two steps: the center-of-mass moment method for region of interest (ROI) segmentation, and the time series method combined with a block window method as the feature representation. The normalized Euclidean distance is used to measure the degree of similarity between two palmprint feature vectors. System testing is done using 500 palm samples, with 4 samples as reference images and 6 samples as test images. Experiment results show that this system can achieve a high performance with a success rate of about 97.33% (FNMR=1.67%, FMR=1.00%, T=0.036).

  10. Learning and Prediction of Relational Time Series

    Science.gov (United States)

    2013-03-01

    (Figure residue omitted: time per subgraph isomorphism (sec) versus the number of constants in one situation, Snort Datasets 1 & 2, runtime over constant count, Attention vs. BFS, illustrating the scalability of the attention technique.) Segment: A segment in the relational time series r = p_1 p_2 ... p_n is comprised of the percept subsequence [p_a p_(a+1) p_(a+2) ... p_(a+m) p_b) such that p_a

  11. Fisher information framework for time series modeling

    Science.gov (United States)

    Venkatesan, R. C.; Plastino, A.

    2017-08-01

    A robust prediction model invoking the Takens embedding theorem, whose working hypothesis is obtained via an inference procedure based on the minimum Fisher information principle, is presented. The coefficients of the ansatz, central to the working hypothesis, satisfy a time-independent Schrödinger-like equation in a vector setting. The inference of (i) the probability density function of the coefficients of the working hypothesis and (ii) the establishment of a constraint-driven pseudo-inverse condition for the modeling phase of the prediction scheme is made, for the case of normal distributions, with the aid of the quantum mechanical virial theorem. The well-known reciprocity relations and the associated Legendre transform structure for the Fisher information measure (FIM, hereafter)-based model in a vector setting (with least squares constraints) are self-consistently derived. These relations are demonstrated to yield an intriguing form of the FIM for the modeling phase, which defines the working hypothesis solely in terms of the observed data. Prediction cases are presented employing time series obtained from: (i) the Mackey-Glass delay-differential equation, (ii) one ECG signal from the MIT-Beth Israel Deaconess Hospital (MIT-BIH) cardiac arrhythmia database, and (iii) one ECG signal from the Creighton University ventricular tachyarrhythmia database. The ECG samples were obtained from the PhysioNet online repository. These examples demonstrate the efficiency of the prediction model. Numerical examples for exemplary cases are provided.

  12. Time series modeling for syndromic surveillance

    Directory of Open Access Journals (Sweden)

    Mandl Kenneth D

    2003-01-01

    Full Text Available Abstract Background Emergency department (ED) based syndromic surveillance systems identify abnormally high visit rates that may be an early signal of a bioterrorist attack. For example, an anthrax outbreak might first be detectable as an unusual increase in the number of patients reporting to the ED with respiratory symptoms. Reliably identifying these abnormal visit patterns requires a good understanding of the normal patterns of healthcare usage. Unfortunately, systematic methods for determining the expected number of ED visits on a particular day have not yet been well established. We present here a generalized methodology for developing models of expected ED visit rates. Methods Using time-series methods, we developed robust models of ED utilization for the purpose of defining expected visit rates. The models were based on nearly a decade of historical data at a major metropolitan academic, tertiary care pediatric emergency department. The historical data were fit using trimmed-mean seasonal models, and additional models were fit with autoregressive integrated moving average (ARIMA) residuals to account for recent trends in the data. The detection capabilities of the model were tested with simulated outbreaks. Results Models were built both for overall visits and for respiratory-related visits, classified according to the chief complaint recorded at the beginning of each visit. The mean absolute percentage error of the ARIMA models was 9.37% for overall visits and 27.54% for respiratory visits. A simple detection system based on the ARIMA model of overall visits was able to detect 7-day-long simulated outbreaks of 30 visits per day with 100% sensitivity and 97% specificity. Sensitivity decreased with outbreak size, dropping to 94% for outbreaks of 20 visits per day, and 57% for 10 visits per day, all while maintaining a 97% benchmark specificity. Conclusions Time series methods applied to historical ED utilization data are an important tool
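
    A minimal version of the ARIMA-based detection idea is sketched below with statsmodels: fit a seasonal ARIMA to historical daily counts and flag days whose residuals exceed a fixed multiple of the residual standard deviation. The model orders, the 3-sigma rule and the synthetic visit counts are illustrative assumptions, not the models or thresholds evaluated in the paper.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.arima.model import ARIMA

      # synthetic daily ED visit counts with a weekly pattern
      rng = np.random.default_rng(4)
      n = 730
      visits = 120 + 15 * np.sin(2 * np.pi * np.arange(n) / 7) + rng.normal(0, 6, n)
      y = pd.Series(visits, index=pd.date_range("2022-01-01", periods=n, freq="D"))

      # seasonal ARIMA as a model of expected visit rates (orders are illustrative only)
      fit = ARIMA(y, order=(1, 0, 1), seasonal_order=(1, 0, 1, 7)).fit()
      resid = fit.resid

      # flag days whose residual exceeds three residual standard deviations
      alarms = resid[resid > 3 * resid.std()]
      print(alarms)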

  13. Climate Prediction Center (CPC) Global Precipitation Time Series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The global precipitation time series provides time series charts showing observations of daily precipitation as well as accumulated precipitation compared to normal...

  14. Climate Prediction Center (CPC) Global Temperature Time Series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The global temperature time series provides time series charts using station based observations of daily temperature. These charts provide information about the...

  15. Transmission of linear regression patterns between time series: from relationship in time series to complex networks.

    Science.gov (United States)

    Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong; Ding, Yinghui

    2014-07-01

    The linear regression parameters between two time series can differ under different lengths of the observation period. If we study the whole period through a sliding window of a short period, the change of the linear regression parameters is a process of dynamic transmission over time. We present a simple and efficient computational scheme: a linear regression pattern transmission algorithm, which transforms linear regression patterns into directed and weighted networks. The linear regression patterns (nodes) are defined by the combination of intervals of the linear regression parameters and the results of significance testing under different sizes of the sliding window. The transmissions between adjacent patterns are defined as edges, and the weights of the edges are the frequencies of the transmissions. The major patterns, the distance, and the medium in the process of the transmission can be captured. The statistical results of weighted out-degree and betweenness centrality are mapped onto timelines, which shows the features of the distribution of the results. Many measurements in different areas that involve two related time series variables could take advantage of this algorithm to characterize the dynamic relationships between the time series from a new perspective.
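
    The core of the algorithm can be sketched as follows: label each sliding window by a (simplified) regression pattern and count transitions between consecutive patterns as weighted directed edges. The pattern definition below (slope sign plus 5% significance) is a reduced stand-in for the parameter-interval patterns described above, and the window length is arbitrary.

      import numpy as np
      import networkx as nx
      from scipy import stats

      def pattern(x, y):
          """Label a window by the sign of the regression slope and whether the
          slope is significant at the 5% level (a simplified pattern definition)."""
          slope, _, _, pvalue, _ = stats.linregress(x, y)
          return ("up" if slope >= 0 else "down", "sig" if pvalue < 0.05 else "nonsig")

      def pattern_transmission_network(x, y, window=30):
          """Slide a window over two series, label each window with its regression
          pattern, and count transitions between consecutive patterns as edge weights."""
          g = nx.DiGraph()
          prev = None
          for start in range(len(x) - window + 1):
              cur = pattern(x[start:start + window], y[start:start + window])
              if prev is not None:
                  w = g.get_edge_data(prev, cur, {"weight": 0})["weight"]
                  g.add_edge(prev, cur, weight=w + 1)
              prev = cur
          return g

      rng = np.random.default_rng(5)
      a = np.cumsum(rng.standard_normal(500))
      b = 0.5 * a + np.cumsum(rng.standard_normal(500))
      net = pattern_transmission_network(a, b, window=30)
      print(list(net.edges(data=True)))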

  16. Emergency Physician Use of Cognitive Strategies to Manage Interruptions.

    Science.gov (United States)

    Ratwani, Raj M; Fong, Allan; Puthumana, Josh S; Hettinger, Aaron Z

    2017-11-01

    The purpose of this study is to examine whether emergency physicians use strategies to manage interruptions during clinical work. Interruption management strategies include immediately engaging the interruption by discontinuing the current task and starting the interruption, continuing the current task while engaging the interruption, rejecting the interruption, or delaying the interruption. An observational time and motion study was conducted in 3 different urban, academic emergency departments with 18 attending emergency physicians. Each physician was observed for 2 hours, and the number of interruptions, the source of interruptions, the type of task being interrupted, and the use of interruption management strategies were documented. Participants were interrupted an average of 12.5 times per hour. The majority of interruptions were in person from other staff, including nurses, residents, and other attending physicians. When participants were interrupted, they were often working on their computer. Participants almost always immediately engaged the interruption task (75.4% of the time), followed by multitasking, in which the primary task was continued while the interrupting task was performed (22.2%). Physicians rejected or delayed interruptions less than 2% of the time. Our results suggest there is an opportunity to introduce emergency physicians to the use of interruption management strategies as a method of handling the frequent interruptions they are exposed to. Use of these strategies when high-risk primary tasks are performed may reduce the disruptiveness of some interruptions and improve patient safety. Copyright © 2017 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.

  17. Hidden Markov Models for Time Series An Introduction Using R

    CERN Document Server

    Zucchini, Walter

    2009-01-01

    Illustrates the flexibility of HMMs as general-purpose models for time series data. This work presents an overview of HMMs for analyzing time series data, from continuous-valued, circular, and multivariate series to binary data, bounded and unbounded counts and categorical observations.

  18. Approximate Entropies for Stochastic Time Series and EKG Time Series of Patients with Epilepsy and Pseudoseizures

    Science.gov (United States)

    Vyhnalek, Brian; Zurcher, Ulrich; O'Dwyer, Rebecca; Kaufman, Miron

    2009-10-01

    A wide range of heart rate irregularities have been reported in small studies of patients with temporal lobe epilepsy [TLE]. We hypothesize that patients with TLE display cardiac dysautonomia in either a subclinical or clinical manner. In a small study, we have retrospectively identified (2003-8) two groups of patients from the epilepsy monitoring unit [EMU] at the Cleveland Clinic. No patients were diagnosed with cardiovascular morbidities. The control group consisted of patients with confirmed pseudoseizures, and the experimental group had confirmed right temporal lobe epilepsy through a seizure-free outcome after temporal lobectomy. We quantified the heart rate variability using the approximate entropy [ApEn]. We found similar values of the ApEn in all three states of consciousness (awake, sleep, and preceding seizure onset). In the TLE group, there is some evidence for greater variability in the awake state than in either the sleep or pre-seizure state. Here we present results for mathematically generated time series: the heart rate fluctuations ξ follow gamma statistics, i.e., p(ξ) = Γ⁻¹(k) ξ^k exp(−ξ). This probability function has well-known properties and its Shannon entropy can be expressed in terms of the Γ-function. The parameter k allows us to generate a family of heart rate time series with different statistics. The ApEn calculated for the generated time series for different values of k mimics the properties found for the TLE and pseudoseizure groups. Our results suggest that the ApEn is an effective tool to probe differences in the statistics of heart rate fluctuations.
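
    Approximate entropy itself is easy to state in code; the sketch below is the standard ApEn definition with the usual m = 2 and r = 0.2·SD defaults, which are conventional choices rather than the settings used in this study.

      import numpy as np

      def approximate_entropy(x, m=2, r=None):
          """Approximate entropy (ApEn): the average log-likelihood that runs of
          length m that match within tolerance r still match at length m + 1."""
          x = np.asarray(x, dtype=float)
          if r is None:
              r = 0.2 * x.std()

          def phi(m):
              n = len(x) - m + 1
              blocks = np.array([x[i:i + m] for i in range(n)])
              # Chebyshev distance between all pairs of m-length blocks
              d = np.max(np.abs(blocks[:, None, :] - blocks[None, :, :]), axis=2)
              c = (d <= r).sum(axis=1) / n
              return np.log(c).mean()

          return phi(m) - phi(m + 1)

      rng = np.random.default_rng(6)
      print(approximate_entropy(rng.standard_normal(500)))          # irregular -> larger ApEn
      print(approximate_entropy(np.sin(np.linspace(0, 50, 500))))   # regular -> smaller ApEn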

  19. Efficient Algorithms for Segmentation of Item-Set Time Series

    Science.gov (United States)

    Chundi, Parvathi; Rosenkrantz, Daniel J.

    We propose a special type of time series, which we call an item-set time series, to facilitate the temporal analysis of software version histories, email logs, stock market data, etc. In an item-set time series, each observed data value is a set of discrete items. We formalize the concept of an item-set time series and present efficient algorithms for segmenting a given item-set time series. Segmentation of a time series partitions the time series into a sequence of segments where each segment is constructed by combining consecutive time points of the time series. Each segment is associated with an item set that is computed from the item sets of the time points in that segment, using a function which we call a measure function. We then define a concept called the segment difference, which measures the difference between the item set of a segment and the item sets of the time points in that segment. The segment difference values are required to construct an optimal segmentation of the time series. We describe novel and efficient algorithms to compute segment difference values for each of the measure functions described in the paper. We outline a dynamic programming based scheme to construct an optimal segmentation of the given item-set time series. We use the item-set time series segmentation techniques to analyze the temporal content of three different data sets—Enron email, stock market data, and a synthetic data set. The experimental results show that an optimal segmentation of item-set time series data captures much more temporal content than a segmentation constructed based on the number of time points in each segment, without examining the item set data at the time points, and can be used to analyze different types of temporal data.

  20. TIME SERIES ANALYSIS USING A UNIQUE MODEL OF TRANSFORMATION

    Directory of Open Access Journals (Sweden)

    Goran Klepac

    2007-12-01

    Full Text Available The REFII model is an authorial mathematical model for time series data mining. The main purpose of the model is to automate time series analysis through a unique transformation model of time series. An advantage of this approach to time series analysis is the linkage of different methods for time series analysis, linking traditional data mining tools for time series and constructing new algorithms for analyzing time series. It is worth mentioning that the REFII model is not a closed system, which means that it is not restricted to a finite set of methods. First of all, it is a model for the transformation of time series values, which prepares the data used by different sets of methods based on the same transformation model in a given problem domain. The REFII model offers a new approach to time series analysis based on a unique transformation model, which is a basis for all kinds of time series analysis. The advantage of the REFII model is its possible application in many different areas such as finance, medicine, voice recognition, face recognition and text mining.

  1. MODELLING OF ORDINAL TIME SERIES BY PROPORTIONAL ODDS MODEL

    Directory of Open Access Journals (Sweden)

    Serpil AKTAŞ ALTUNAY

    2013-06-01

    Full Text Available Categorical time series data with random time-dependent covariates often arise when the study variables are recorded on a categorical scale. Several models have been proposed in the literature for the analysis of categorical time series; for example, Markov chain models, integer autoregressive processes and discrete ARMA models can be utilized for modeling categorical time series. In general, the choice of model depends on the measurement level of the study variables: nominal, ordinal or interval. Regression theory, which is based on generalized linear models and partial likelihood inference, is a successful approach for categorical time series. One of the models for ordinal time series in regression theory is the proportional odds model. In this study, the proportional odds model approach to ordinal categorical time series is investigated based on a real air pollution data set, and the results are discussed.
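
    In the cumulative-logit (proportional odds) formulation, an ordinal response Y_t with categories 1, ..., J and covariate vector x_t (which may include past values of the series) is modelled, in one common sign convention, as

      \operatorname{logit} P(Y_t \le j \mid x_t) \;=\; \theta_j - \beta^{\top} x_t, \qquad j = 1, \dots, J - 1,

    with ordered cut-points \theta_1 < \dots < \theta_{J-1} and a single slope vector \beta shared across all cut-points (the proportional odds assumption). This is the textbook form of the model, not necessarily the exact parameterization used in the study.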

  2. Time-series prediction and applications a machine intelligence approach

    CERN Document Server

    Konar, Amit

    2017-01-01

    This book presents machine learning and type-2 fuzzy sets for the prediction of time-series with a particular focus on business forecasting applications. It also proposes new uncertainty management techniques in an economic time-series using type-2 fuzzy sets for prediction of the time-series at a given time point from its preceding value in fluctuating business environments. It employs machine learning to determine repetitively occurring similar structural patterns in the time-series and uses stochastic automaton to predict the most probabilistic structure at a given partition of the time-series. Such predictions help in determining probabilistic moves in a stock index time-series Primarily written for graduate students and researchers in computer science, the book is equally useful for researchers/professionals in business intelligence and stock index prediction. A background of undergraduate level mathematics is presumed, although not mandatory, for most of the sections. Exercises with tips are provided at...

  3. Randomized trial of time-limited interruptions of protease inhibitor-based antiretroviral therapy (ART vs. continuous therapy for HIV-1 infection.

    Directory of Open Access Journals (Sweden)

    Cynthia Firnhaber

    Full Text Available The clinical outcomes of short interruptions of PI-based ART regimens remain undefined. A 2-arm non-inferiority trial was conducted on 53 HIV-1 infected South African participants with viral load 450 cells/µl on stavudine (or zidovudine), lamivudine and lopinavir/ritonavir. Subjects were randomized to (a) sequential 2-, 4- and 8-week ART interruptions or (b) continuous ART (cART). The primary analysis was based on the proportion of CD4 counts >350 cells/µl over 72 weeks. Adherence, HIV-1 drug resistance, and CD4 count rise over time were analyzed as secondary endpoints. The proportions of CD4 counts >350 cells/µl were 82.12% for the intermittent arm and 93.73% for the cART arm; the difference of 11.95% was above the defined 10% threshold for non-inferiority (upper limit of 97.5% CI, 24.1%; 2-sided CI: -0.16, 23.1). No clinically significant differences in opportunistic infections, adverse events, adherence or viral resistance were noted; after randomization, a long-term CD4 rise was observed only in the cART arm. We are unable to conclude that short PI-based ART interruptions are non-inferior to cART in retention of immune reconstitution; however, short interruptions did not lead to a greater rate of resistance mutations or adverse events than cART, suggesting that this regimen may be more forgiving than NNRTIs if interruptions in therapy occur. ClinicalTrials.gov NCT00100646.

  4. Modelling the time at which overcrowding and feed interruption emerge on the swine premises under movement restrictions during a classical swine fever outbreak.

    Science.gov (United States)

    Weng, H Y; Yadav, S; Olynk Widmar, N J; Croney, C; Ash, M; Cooper, M

    2017-03-01

    A stochastic risk model was developed to estimate the time elapsed before overcrowding (TOC) or feed interruption (TFI) emerged on the swine premises under movement restrictions during a classical swine fever (CSF) outbreak in Indiana, USA. Nursery (19 to 65 days of age) and grow-to-finish (40 to 165 days of age) pork production operations were modelled separately. Overcrowding was defined as the total weight of pigs on premises exceeding 100% to 115% of the maximum capacity of the premises, which was computed as the total weight of the pigs at harvest/transition age. Algorithms were developed to estimate age-specific weight of the pigs on premises and to compare the daily total weight of the pigs with the threshold weight defining overcrowding to flag the time when the total weight exceeded the threshold (i.e. when overcrowding occurred). To estimate TFI, an algorithm was constructed to model a swine producer's decision to discontinue feed supply by incorporating the assumptions that a longer estimated epidemic duration, a longer time interval between the age of pigs at the onset of the outbreak and the harvest/transition age, or a longer progression of an ongoing outbreak would increase the probability of a producer's decision to discontinue the feed supply. Adverse animal welfare conditions were modelled to emerge shortly after an interruption of feed supply. Simulations were run with 100 000 iterations each for a 365-day period. Overcrowding occurred in all simulated iterations, and feed interruption occurred in 30% of the iterations. The median (5th and 95th percentiles) TOC was 24 days (10, 43) in nursery operations and 78 days (26, 134) in grow-to-finish operations. Most feed interruptions, if they emerged, occurred within 15 days of an outbreak. The median (5th and 95th percentiles) time at which either overcrowding or feed interruption emerged was 19 days (4, 42) in nursery and 57 days (4, 130) in grow-to-finish operations. The study findings suggest that

  5. Vector bilinear autoregressive time series model and its superiority ...

    African Journals Online (AJOL)

    In this research, a vector bilinear autoregressive time series model was proposed and used to model three revenue series (X1, X2, X3). The “orders” of the three series were identified on the basis of the distribution of autocorrelation and partial autocorrelation functions and were used to construct the vector bilinear models.

  6. Analysis of Nonstationary Time Series for Biological Rhythms Research.

    Science.gov (United States)

    Leise, Tanya L

    2017-06-01

    This article is part of a Journal of Biological Rhythms series exploring analysis and statistics topics relevant to researchers in biological rhythms and sleep research. The goal is to provide an overview of the most common issues that arise in the analysis and interpretation of data in these fields. In this article on time series analysis for biological rhythms, we describe some methods for assessing the rhythmic properties of time series, including tests of whether a time series is indeed rhythmic. Because biological rhythms can exhibit significant fluctuations in their period, phase, and amplitude, their analysis may require methods appropriate for nonstationary time series, such as wavelet transforms, which can measure how these rhythmic parameters change over time. We illustrate these methods using simulated and real time series.
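
    The wavelet approach mentioned above can be illustrated with a small sketch. This is not the article's code: the Morlet wavelet, the sampling interval, the scale range and the synthetic circadian-like signal with a drifting period are all illustrative assumptions; only the PyWavelets calls shown are real library functions.

```python
# Minimal sketch (illustrative, not the article's code): continuous wavelet transform
# of a synthetic nonstationary rhythm whose period slowly drifts over time.
import numpy as np
import pywt

dt = 0.5                                  # sampling interval in hours (assumed)
t = np.arange(0, 240, dt)                 # ten days of half-hourly samples
period = 24.0 + 0.01 * t                  # period slowly lengthens (nonstationarity)
signal = np.sin(2 * np.pi * np.cumsum(dt / period)) + 0.3 * np.random.randn(t.size)

# Scales chosen so the Morlet wavelet covers periods from a few hours to ~3 days.
scales = np.arange(8, 120)
coeffs, freqs = pywt.cwt(signal, scales, 'morl', sampling_period=dt)

# The "ridge" of maximal power at each time gives the instantaneous period,
# which is how wavelets reveal drifting period/amplitude in biological rhythms.
power = np.abs(coeffs) ** 2
ridge = freqs[np.argmax(power, axis=0)]   # dominant frequency (cycles/hour) per time point
instantaneous_period = 1.0 / ridge
print(instantaneous_period[::100])        # should drift upward from roughly 24 h
```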

  7. Track Irregularity Time Series Analysis and Trend Forecasting

    OpenAIRE

    Jia Chaolong; Xu Weixiang; Wang Futian; Wang Hanning

    2012-01-01

    The combination of linear and nonlinear methods is widely used in the prediction of time series data. This paper analyzes track irregularity time series data by using gray incidence degree models and methods of data transformation, trying to find the connotative relationship between the time series data. In this paper, GM (1,1) is based on first-order, single variable linear differential equations; after an adaptive improvement and error correction, it is used to predict the long-term changin...

  8. A Comparative Analysis of Short Time Series Processing Methods

    OpenAIRE

    Kiršners, A; Borisovs, A

    2012-01-01

    This article analyzes the traditional time series processing methods that are used to perform the task of short time series analysis in demand forecasting. The main aim of this paper is to scrutinize the ability of these methods to be used when analyzing short time series. The analyzed methods include exponential smoothing, exponential smoothing with the development trend and moving average method. The paper gives the description of the structure and main operating princi...

  9. Bag-of-Temporal-SIFT-Words for Time Series Classification

    OpenAIRE

    Bailly , Adeline; Malinowski , Simon; Tavenard , Romain; Guyet , Thomas; Chapel , Laetitia

    2015-01-01

    Time series classification is an application of particular interest with the increase of data to monitor. Classical techniques for time series classification rely on point-to-point distances. Recently, Bag-of-Words approaches have been used in this context. Words are quantized versions of simple features extracted from sliding windows. The SIFT framework has proved efficient for image classification. In this paper, we design a time series classification scheme that bui...

  10. TIME SERIES WORKSHOP” OBSERVATIONS DATA PROCESSING TOOL

    OpenAIRE

    Shapovalova, L. L

    2017-01-01

    A new tool for mathematical and visual processing of time series is presented. The program ”Time Series WorkShop” (TSW) is specialized for processing visual observations of variable stars. The open structure of the program allows any old or new mathematical methods to be applied for searching for any parameters of variability. The program also allows the time series and any calculation results (periodograms, histograms, light curves and their smoothing curves) to be visualized in camera-ready form. The foll...

  11. Capturing Structure Implicitly from Time-Series having Limited Data

    OpenAIRE

    Emaasit, Daniel; Johnson, Matthew

    2018-01-01

    Scientific fields such as insider-threat detection and highway-safety planning often lack sufficient amounts of time-series data to estimate statistical models for the purpose of scientific discovery. Moreover, the available limited data are quite noisy. This presents a major challenge when estimating time-series models that are robust to overfitting and have well-calibrated uncertainty estimates. Most of the current literature in these fields involve visualizing the time-series for noticeabl...

  12. The sample autocorrelation function of non-linear time series

    NARCIS (Netherlands)

    Basrak, Bojan

    2000-01-01

    When studying a real-life time series, it is frequently reasonable to assume, possibly after a suitable transformation, that the data come from a stationary time series (Xt). This means that the finite-dimensional distributions of this sequence are invariant under shifts of time. Various stationary

  13. Mathematical foundations of time series analysis a concise introduction

    CERN Document Server

    Beran, Jan

    2017-01-01

    This book provides a concise introduction to the mathematical foundations of time series analysis, with an emphasis on mathematical clarity. The text is reduced to the essential logical core, mostly using the symbolic language of mathematics, thus enabling readers to very quickly grasp the essential reasoning behind time series analysis. It appeals to anybody wanting to understand time series in a precise, mathematical manner. It is suitable for graduate courses in time series analysis but is equally useful as a reference work for students and researchers alike.

  14. Time series analysis in the social sciences the fundamentals

    CERN Document Server

    Shin, Youseop

    2017-01-01

    Time Series Analysis in the Social Sciences is a practical and highly readable introduction written exclusively for students and researchers whose mathematical background is limited to basic algebra. The book focuses on fundamental elements of time series analysis that social scientists need to understand so they can employ time series analysis for their research and practice. Through step-by-step explanations and using monthly violent crime rates as case studies, this book explains univariate time series from the preliminary visual analysis through the modeling of seasonality, trends, and re

  15. Stochastic time series analysis of hydrology data for water resources

    Science.gov (United States)

    Sathish, S.; Khadar Babu, S. K.

    2017-11-01

    This work concerns the application of stochastic time series analysis to hydrology, with a focus on seasonality. Different statistical tests are applied for predicting hydrological time series with the Thomas-Fiering model. Hydrological time series of flood flow have received a great deal of consideration worldwide, and the stochastic-process side of time series analysis is expanding with growing concerns about seasonal periods and global warming. A recent trend among researchers is testing for seasonal periods in hydrologic flow series using a stochastic process based on the Thomas-Fiering model. The present article proposes to predict the seasonal periods in hydrology using the Thomas-Fiering model.
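
    For readers unfamiliar with the Thomas-Fiering scheme the abstract refers to, a minimal sketch follows. The monthly means, standard deviations and lag-1 correlations below are invented for illustration and are not the study's data; only the recursion itself is the standard Thomas-Fiering form.

```python
# Minimal Thomas-Fiering sketch (illustrative statistics, not the study's data):
# q[j] = mean[j] + b*(q[j-1] - mean[j-1]) + e*std[j]*sqrt(1 - r^2),
# where b = r*std[j]/std[j-1] and r is the month (j-1) -> j correlation.
import numpy as np

rng = np.random.default_rng(0)
mean = np.array([30, 35, 50, 80, 120, 150, 140, 110, 80, 60, 45, 35], float)  # m^3/s, assumed
std = 0.3 * mean                                       # assumed month-wise std dev
r = np.full(12, 0.6)                                   # assumed month-to-month correlation

def generate(n_years):
    q_prev = mean[-1]                                  # start from an average December
    flows = []
    for _ in range(n_years):
        for j in range(12):
            p = (j - 1) % 12                           # previous calendar month
            b = r[p] * std[j] / std[p]
            q = mean[j] + b * (q_prev - mean[p]) + rng.standard_normal() * std[j] * np.sqrt(1 - r[p] ** 2)
            q = max(q, 0.0)                            # flows cannot be negative
            flows.append(q)
            q_prev = q
    return np.array(flows)

synthetic = generate(100)
print(synthetic.reshape(-1, 12).mean(axis=0))          # should roughly track the monthly means
```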

  16. Interpretable Early Classification of Multivariate Time Series

    Science.gov (United States)

    Ghalwash, Mohamed F.

    2013-01-01

    Recent advances in technology have led to an explosion in data collection over time rather than in a single snapshot. For example, microarray technology allows us to measure gene expression levels in different conditions over time. Such temporal data grants the opportunity for data miners to develop algorithms to address domain-related problems,…

  17. Studies on time series applications in environmental sciences

    CERN Document Server

    Bărbulescu, Alina

    2016-01-01

    Time series analysis and modelling represent a large field of study, approached from the perspectives of both time and frequency, with applications in different domains. Modelling hydro-meteorological time series is difficult due to the characteristics of these series, such as long-range dependence, spatial dependence, and correlation with other series. Continuous spatial data play an important role in planning, risk assessment and decision making in environmental management. In this context, this book presents various statistical tests and modelling techniques used for time series analysis, as well as applications to hydro-meteorological series from Dobrogea, a region situated in the south-eastern part of Romania and less studied till now. Part of the results are accompanied by their R code.

  18. Assessing Local Turbulence Strength from a Time Series

    Directory of Open Access Journals (Sweden)

    Mayer Humi

    2010-01-01

    Full Text Available We study the possible link between “local turbulence strength” in a flow which is represented by a finite time series and a “chaotic invariant”, namely, the leading Lyaponuv exponent that characterizes this series. To validate a conjecture about this link, we analyze several time series of measurements taken by a plane flying at constant height in the upper troposphere. For each of these time series we estimate the leading Lyaponuv exponent which we then correlate with the structure constants for the temperature. In addition, we introduce a quantitative technique to educe the scale contents of the flow and a methodology to validate its spectrum.
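
    A rough sketch of how a leading Lyapunov exponent can be estimated from a finite time series is given below, in the spirit of the common nearest-neighbour (Rosenstein-type) estimators. The embedding parameters, the Theiler window and the logistic-map test signal are assumptions for illustration, not the aircraft data or the exact method of the paper.

```python
# Rough nearest-neighbour sketch of the leading Lyapunov exponent (Rosenstein-style).
# Embedding parameters and the logistic-map test series are assumptions.
import numpy as np

def largest_lyapunov(x, dim=3, tau=1, theiler=10, n_steps=8):
    n = len(x) - (dim - 1) * tau
    emb = np.array([x[i:i + (dim - 1) * tau + 1:tau] for i in range(n)])
    # pairwise distances, with a Theiler window excluding temporal neighbours
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    for i in range(n):
        d[i, max(0, i - theiler):i + theiler + 1] = np.inf
    nn = np.argmin(d, axis=1)
    # average log-divergence of each point from its nearest neighbour over time
    divergence = []
    for k in range(1, n_steps):
        dists = [np.linalg.norm(emb[i + k] - emb[nn[i] + k])
                 for i in range(n) if i + k < n and nn[i] + k < n]
        dists = [s for s in dists if s > 0]
        divergence.append(np.mean(np.log(dists)))
    slope, _ = np.polyfit(np.arange(1, n_steps), divergence, 1)
    return slope   # exponent per sample; divide by the sampling interval for per-unit-time

# Test on the fully chaotic logistic map; a sensible fit range should give a value
# in the neighbourhood of the known exponent ln(2) ~= 0.69.
x = np.empty(1000); x[0] = 0.4
for i in range(999):
    x[i + 1] = 4.0 * x[i] * (1.0 - x[i])
print(largest_lyapunov(x))
```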

  19. Time Series Periodicity for Lava Compositions of Shield Volcanoes Above Hot Spots

    Science.gov (United States)

    Sharapov, V.; Zhmodik, A.

    2005-05-01

    Tholeiitic sheets for the Siberian Platform trapps and basalts of Hawaii and Iceland have similar time series for lava profiles in the following respects: 1. periodicity 2. interruption of distribution functions both at the suite boundaries and within continuous discharge of active volcanoes 3. presence or absence of time trends 4. substantial difference in distribution function types for profiles resulting from fissure conduits located at 30-40 km distance from each other. Wavelet analysis shows the functions for tholeiitic shield volcanoes are similar for time intervals from 20 to 20,000 years. Spectral characteristics of petrogenous and trace components as well as isotope ratios show coherent and incoherent values for frequency spectra. The main tendencies for evolution of compositions in completed sequences are similar. The spatial zoning depends on structural and geodynamic conditions of a lava sheet formation and its size. The trapps of the Siberian Platform can be regarded as unique structures from this viewpoint. The time and spatial series for the Hawaiian Islands are the most varied. Two types of time series evolution can be distinguished: `Icelandic' and `Hawaiian'. It is interesting to note that for the Western Siberian Slab the former type is typical, and for the Siberian Platform - the latter. The reasons for this phenomenon are still not clear. This work was supported by the RFBR (Grants No. 04-05-64276, No. 04-05-64107, and No. 04-05-64332).

  20. Useful Pattern Mining on Time Series

    DEFF Research Database (Denmark)

    Goumatianos, Nikitas; Christou, Ioannis T; Lindgren, Peter

    2013-01-01

    % or higher increase (or, alternatively, decrease) in a chosen property of the stock (e.g. close-value) within a given time-window (e.g. 5 days). Initial results from a first prototype implementation of the architecture show that after training on a large set of stocks, the system is capable of finding...

  1. Two-fractal overlap time series: Earthquakes and market crashes

    Indian Academy of Sciences (India)

    velocity over the other and time series of stock prices. An anticipation method for some of the crashes has been proposed here, based on these observations. Keywords: Cantor set; time series; earthquake; market crash.

  2. forecasting with nonlinear time series model: a monte-carlo

    African Journals Online (AJOL)

    PUBLICATIONS1

    erated recursively up to any step greater than one. For nonlinear time series models, a point forecast for step one can be done easily as in the linear case, but a forecast for a step greater than or equal to ...

  3. Transition Icons for Time-Series Visualization and Exploratory Analysis.

    Science.gov (United States)

    Nickerson, Paul V; Baharloo, Raheleh; Wanigatunga, Amal A; Manini, Todd M; Tighe, Patrick J; Rashidi, Parisa

    2018-03-01

    The modern healthcare landscape has seen the rapid emergence of techniques and devices that temporally monitor and record physiological signals. The prevalence of time-series data within the healthcare field necessitates the development of methods that can analyze the data in order to draw meaningful conclusions. Time-series behavior is notoriously difficult to intuitively understand due to its intrinsic high-dimensionality, which is compounded in the case of analyzing groups of time series collected from different patients. Our framework, which we call transition icons, renders common patterns in a visual format useful for understanding the shared behavior within groups of time series. Transition icons are adept at detecting and displaying subtle differences and similarities, e.g., between measurements taken from patients receiving different treatment strategies or stratified by demographics. We introduce various methods that collectively allow for exploratory analysis of groups of time series, while being free of distribution assumptions and including simple heuristics for parameter determination. Our technique extracts discrete transition patterns from symbolic aggregate approXimation representations, and compiles transition frequencies into a bag of patterns constructed for each group. These transition frequencies are normalized and aligned in icon form to intuitively display the underlying patterns. We demonstrate the transition icon technique for two time-series datasets-postoperative pain scores, and hip-worn accelerometer activity counts. We believe transition icons can be an important tool for researchers approaching time-series data, as they give rich and intuitive information about collective time-series behaviors.
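
    A toy sketch of the core idea in this record (SAX symbols, then transition frequencies aggregated per group) is shown below. The alphabet size, the PAA segment count and the random example "group" are assumptions, not the authors' implementation or data.

```python
# Toy sketch of the transition-icon idea: SAX-symbolize each series, count symbol
# transitions, and average the normalized transition matrix within a group.
# Alphabet size, PAA segment count and the example data are assumptions.
import numpy as np
from scipy.stats import norm

def sax_symbols(x, n_segments=20, alphabet=4):
    x = (x - x.mean()) / (x.std() + 1e-12)                 # z-normalize
    paa = np.array([seg.mean() for seg in np.array_split(x, n_segments)])
    breakpoints = norm.ppf(np.linspace(0, 1, alphabet + 1)[1:-1])  # equiprobable bins
    return np.digitize(paa, breakpoints)                   # symbols 0 .. alphabet-1

def transition_matrix(symbols, alphabet=4):
    counts = np.zeros((alphabet, alphabet))
    for a, b in zip(symbols[:-1], symbols[1:]):
        counts[a, b] += 1
    return counts / max(counts.sum(), 1)                   # normalized transition frequencies

rng = np.random.default_rng(1)
group = [np.cumsum(rng.standard_normal(200)) for _ in range(30)]   # e.g. one patient group
icon = np.mean([transition_matrix(sax_symbols(x)) for x in group], axis=0)
print(np.round(icon, 3))   # the "icon": shared transition behaviour of the group
```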

  4. Time series analyses of mean monthly rainfall for drought ...

    African Journals Online (AJOL)

    This paper analyses the time series characteristics of rainfall data for Sokoto metropolis for 40 years with a view to understanding drought management. Data for this study was obtained from the Nigeria Metrological Agency (NIMET), Sokoto Airport; Sokoto. The data was subjected to time series tests (trend, cycle, seasonal ...

  5. Critical values for unit root tests in seasonal time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); B. Hobijn (Bart)

    1997-01-01

    In this paper, we present tables with critical values for a variety of tests for seasonal and non-seasonal unit roots in seasonal time series. We consider (extensions of) the Hylleberg et al. and Osborn et al. test procedures. These extensions concern time series with increasing seasonal

  6. Time Series Econometrics for the 21st Century

    Science.gov (United States)

    Hansen, Bruce E.

    2017-01-01

    The field of econometrics largely started with time series analysis because many early datasets were time-series macroeconomic data. As the field developed, more cross-sectional and longitudinal datasets were collected, which today dominate the majority of academic empirical research. In nonacademic (private sector, central bank, and governmental)…

  7. Time series forecasting based on deep extreme learning machine

    NARCIS (Netherlands)

    Guo, Xuqi; Pang, Y.; Yan, Gaowei; Qiao, Tiezhu; Yang, Guang-Hong; Yang, Dan

    2017-01-01

    Multi-layer Artificial Neural Networks (ANN) has caught widespread attention as a new method for time series forecasting due to the ability of approximating any nonlinear function. In this paper, a new local time series prediction model is established with the nearest neighbor domain theory, in

  8. Two-fractal overlap time series: Earthquakes and market crashes

    Indian Academy of Sciences (India)

    We find prominent similarities in the features of the time series for the (model earthquakes or) overlap of two Cantor sets when one set moves with uniform relative velocity over the other and time series of stock prices. An anticipation method for some of the crashes has been proposed here, based on these observations.

  9. 461 TIME SERIES ANALYSES OF MEAN MONTHLY RAINFALL ...

    African Journals Online (AJOL)

    Osondu

    Abstract. This paper analyses the time series characteristics of rainfall data for Sokoto metropolis for 40 years with a view to understanding drought management. Data for this study was obtained from the. Nigeria Metrological Agency (NIMET), Sokoto Airport; Sokoto. The data was subjected to time series tests (trend, cycle ...

  10. Time series prediction of apple scab using meteorological ...

    African Journals Online (AJOL)

    A new prediction model for the early warning of apple scab is proposed in this study. The method is based on artificial intelligence and time series prediction. The infection period of apple scab was evaluated as the time series prediction model instead of summation of wetness duration. Also, the relations of different ...

  11. A Dynamic Fuzzy Cluster Algorithm for Time Series

    Directory of Open Access Journals (Sweden)

    Min Ji

    2013-01-01

    clustering time series by introducing the definition of key point and improving FCM algorithm. The proposed algorithm works by determining those time series whose class labels are vague and further partitions them into different clusters over time. The main advantage of this approach compared with other existing algorithms is that the property of some time series belonging to different clusters over time can be partially revealed. Results from simulation-based experiments on geographical data demonstrate the excellent performance and the desired results have been obtained. The proposed algorithm can be applied to solve other clustering problems in data mining.

  12. Frontiers in Time Series and Financial Econometrics : An overview

    NARCIS (Netherlands)

    S. Ling (Shiqing); M.J. McAleer (Michael); H. Tong (Howell)

    2015-01-01

    Two of the fastest growing frontiers in econometrics and quantitative finance are time series and financial econometrics. Significant theoretical contributions to financial econometrics have been made by experts in statistics, econometrics, mathematics, and time

  13. Frontiers in Time Series and Financial Econometrics: An Overview

    NARCIS (Netherlands)

    S. Ling (Shiqing); M.J. McAleer (Michael); H. Tong (Howell)

    2015-01-01

    Two of the fastest growing frontiers in econometrics and quantitative finance are time series and financial econometrics. Significant theoretical contributions to financial econometrics have been made by experts in statistics, econometrics, mathematics, and time

  14. Effectiveness of Multivariate Time Series Classification Using Shapelets

    Directory of Open Access Journals (Sweden)

    A. P. Karpenko

    2015-01-01

    Full Text Available Typically, time series classifiers require signal pre-processing (filtering signals from noise, artifact removal, etc.), enhancement of signal features (amplitude, frequency, spectrum, etc.), and classification of signal features in space using classical techniques and classification algorithms for multivariate data. We consider a method of classifying time series which does not require enhancement of the signal features. The method uses shapelets of time series (time series shapelets), i.e. small fragments of this series which reflect the properties of one of its classes most of all. Despite the significant number of publications on the theory and applications of shapelets for the classification of time series, the task of evaluating the effectiveness of this technique remains relevant. The objective of this publication is to study the effectiveness of a number of modifications of the original shapelet method as applied to multivariate series classification, which is a little-studied problem. The paper presents the problem statement of multivariate time series classification using shapelets and describes the shapelet-based basic method of binary classification, as well as various generalizations and a proposed modification of the method. It also offers software that implements the modified method and results of computational experiments confirming the effectiveness of the algorithmic and software solutions. The paper shows that the modified method and the accompanying software allow a classification accuracy of about 85% to be reached, at best. The shapelet search time increases in proportion to the input data dimension.
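
    For orientation, a bare-bones sketch of shapelet-based binary classification follows: slide a candidate subsequence over each series, use the minimum Euclidean distance as a one-dimensional feature, and threshold it. The exhaustive candidate search, the crude difference-of-means selection criterion (instead of the information gain used in the shapelet literature) and the synthetic two-class data are simplifying assumptions, not the authors' method.

```python
# Bare-bones shapelet sketch: the minimum distance between a candidate subsequence
# and each series becomes a 1-D feature that is thresholded to classify.
# Exhaustive search, the selection criterion and the synthetic data are assumptions.
import numpy as np

def min_dist(series, shapelet):
    L = len(shapelet)
    return min(np.linalg.norm(series[i:i + L] - shapelet)
               for i in range(len(series) - L + 1))

rng = np.random.default_rng(2)
def make(label, n=40, length=120):
    x = rng.standard_normal((n, length))
    if label == 1:
        x[:, 50:60] += np.sin(np.linspace(0, np.pi, 10)) * 3   # class-1 "bump"
    return x

X = np.vstack([make(0), make(1)]); y = np.array([0] * 40 + [1] * 40)

# Search candidate subsequences from class-1 series for the one whose distances
# best separate the two classes (difference of class means as a crude criterion).
best, best_gap = None, -np.inf
for row in X[y == 1][:10]:                       # limit the search for brevity
    for start in range(0, 111, 10):
        cand = row[start:start + 10]
        d = np.array([min_dist(s, cand) for s in X])
        gap = d[y == 0].mean() - d[y == 1].mean()
        if gap > best_gap:
            best, best_gap = cand, gap

d = np.array([min_dist(s, best) for s in X])
threshold = (d[y == 0].mean() + d[y == 1].mean()) / 2
accuracy = np.mean((d < threshold).astype(int) == y)
print(f"training accuracy with one shapelet: {accuracy:.2f}")
```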

  15. Period Estimation in Astronomical Time Series Using Slotted Correntropy

    OpenAIRE

    Huijse, Pablo; Estévez, Pablo A.; Zegers, Pablo; Príncipe, José; Protopapas, Pavlos

    2011-01-01

    In this letter, we propose a method for period estimation in light curves from periodic variable stars using correntropy. Light curves are astronomical time series of stellar brightness over time, and are characterized as being noisy and unevenly sampled. We propose to use slotted time lags in order to estimate correntropy directly from irregularly sampled time series. A new information theoretic metric is proposed for discriminating among the peaks of the correntropy spectral density. The sl...

  16. Using SAR satellite data time series for regional glacier mapping

    Directory of Open Access Journals (Sweden)

    S. H. Winsvold

    2018-03-01

    Full Text Available With dense SAR satellite data time series it is possible to map surface and subsurface glacier properties that vary in time. On Sentinel-1A and RADARSAT-2 backscatter time series images over mainland Norway and Svalbard, we outline how to map glaciers using descriptive methods. We present five application scenarios. The first shows potential for tracking transient snow lines with SAR backscatter time series and correlates with both optical satellite images (Sentinel-2A and Landsat 8) and equilibrium line altitudes derived from in situ surface mass balance data. In the second application scenario, time series representation of glacier facies corresponding to SAR glacier zones shows potential for a more accurate delineation of the zones and how they change in time. The third application scenario investigates the firn evolution using dense SAR backscatter time series together with a coupled energy balance and multilayer firn model. We find strong correlation between backscatter signals with both the modeled firn air content and modeled wetness in the firn. In the fourth application scenario, we highlight how winter rain events can be detected in SAR time series, revealing important information about the area extent of internal accumulation. In the last application scenario, averaged summer SAR images were found to have potential in assisting the process of mapping glaciers outlines, especially in the presence of seasonal snow. Altogether we present examples of how to map glaciers and to further understand glaciological processes using the existing and future massive amount of multi-sensor time series data.

  17. Using SAR satellite data time series for regional glacier mapping

    Science.gov (United States)

    Winsvold, Solveig H.; Kääb, Andreas; Nuth, Christopher; Andreassen, Liss M.; van Pelt, Ward J. J.; Schellenberger, Thomas

    2018-03-01

    With dense SAR satellite data time series it is possible to map surface and subsurface glacier properties that vary in time. On Sentinel-1A and RADARSAT-2 backscatter time series images over mainland Norway and Svalbard, we outline how to map glaciers using descriptive methods. We present five application scenarios. The first shows potential for tracking transient snow lines with SAR backscatter time series and correlates with both optical satellite images (Sentinel-2A and Landsat 8) and equilibrium line altitudes derived from in situ surface mass balance data. In the second application scenario, time series representation of glacier facies corresponding to SAR glacier zones shows potential for a more accurate delineation of the zones and how they change in time. The third application scenario investigates the firn evolution using dense SAR backscatter time series together with a coupled energy balance and multilayer firn model. We find strong correlation between backscatter signals with both the modeled firn air content and modeled wetness in the firn. In the fourth application scenario, we highlight how winter rain events can be detected in SAR time series, revealing important information about the area extent of internal accumulation. In the last application scenario, averaged summer SAR images were found to have potential in assisting the process of mapping glaciers outlines, especially in the presence of seasonal snow. Altogether we present examples of how to map glaciers and to further understand glaciological processes using the existing and future massive amount of multi-sensor time series data.

  18. Automated analysis of brachial ultrasound time series

    Science.gov (United States)

    Liang, Weidong; Browning, Roger L.; Lauer, Ronald M.; Sonka, Milan

    1998-07-01

    Atherosclerosis begins in childhood with the accumulation of lipid in the intima of arteries to form fatty streaks and advances through adult life, when occlusive vascular disease may result in coronary heart disease, stroke and peripheral vascular disease. Non-invasive B-mode ultrasound has been found useful in studying risk factors in the symptom-free population. A large amount of data is acquired from continuous imaging of the vessels in a large study population. A high-quality brachial vessel diameter measurement method is necessary so that accurate diameters can be measured consistently in all frames in a sequence and across different observers. Though a human expert has the advantage over automated computer methods in recognizing noise during diameter measurement, manual measurement suffers from inter- and intra-observer variability. It is also time-consuming. An automated measurement method is presented in this paper which utilizes quality assurance approaches to adapt to specific image features and to recognize and minimize the effect of noise. Experimental results showed the method's potential for clinical use in epidemiological studies.

  19. Sensor-Generated Time Series Events: A Definition Language

    Directory of Open Access Journals (Sweden)

    Juan Pazos

    2012-08-01

    Full Text Available There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an events definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device, called posturograph, is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is generally applicable and accurate, for identifying the events contained in the time series.

  20. Sensor-Generated Time Series Events: A Definition Language

    Science.gov (United States)

    Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan

    2012-01-01

    There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an events definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device, called posturograph, is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is generally applicable and accurate, for identifying the events contained in the time series.

  1. Fractal dimension of wind speed time series

    International Nuclear Information System (INIS)

    Chang, Tian-Pau; Ko, Hong-Hsi; Liu, Feng-Jiao; Chen, Pai-Hsun; Chang, Ying-Pin; Liang, Ying-Hsin; Jang, Horng-Yuan; Lin, Tsung-Chi; Chen, Yi-Hwa

    2012-01-01

    Highlights: ► Fractal dimension of wind speeds in Taiwan is studied considering climate factors. ► Relevant algorithms for the calculation of fractal dimension are presented graphically. ► Fractal dimension reveals negative correlation with mean wind speed. ► Fractal dimension is not necessarily lower even when the wind distribution is well described by the Weibull pdf. - Abstract: The fluctuation of wind speed within a specific time period strongly affects the energy conversion rate of a wind turbine. In this paper, the concept of fractal dimension from chaos theory is applied to investigate wind speed characterizations; numerical algorithms for the calculation of the fractal dimension are presented graphically. The wind data analyzed were observed at three wind farms experiencing different climatic conditions in Taiwan from 2006 to 2008, where the wind speed distribution can be properly classified into a high wind season from October to March and a low wind season from April to September. The variations of fractal dimensions among different wind farms are analyzed from the viewpoint of climatic conditions. The results show that the wind speeds studied are characterized by medium to high values of fractal dimension; the annual dimension values lie between 1.61 and 1.66. Because of the monsoon factor, the fluctuation of wind speed during high wind months is not as significant as that during low wind months; the value of fractal dimension reveals a negative correlation with that of mean wind speed, irrespective of the wind farm considered. For a location where the wind distribution is well described by the Weibull function, its fractal dimension is not necessarily lower. These findings are useful for wind analysis.
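
    One common estimator of the fractal dimension of a time series is Higuchi's method; a compact sketch is given below. The choice of kmax and the synthetic white-noise and random-walk test series are assumptions for illustration, not the paper's data or its specific algorithm.

```python
# Compact sketch of Higuchi's fractal dimension estimator for a time series.
# kmax and the synthetic test series are assumptions, not the paper's choices.
import numpy as np

def higuchi_fd(x, kmax=10):
    x = np.asarray(x, float)
    N = len(x)
    lengths = []
    for k in range(1, kmax + 1):
        Lk = []
        for m in range(k):
            idx = np.arange(m, N, k)           # sub-sampled series starting at offset m
            if len(idx) < 2:
                continue
            dist = np.abs(np.diff(x[idx])).sum()
            norm = (N - 1) / ((len(idx) - 1) * k)   # Higuchi normalization factor
            Lk.append(dist * norm / k)
        lengths.append(np.mean(Lk))
    # L(k) ~ k^(-D), so the slope of log L versus log(1/k) estimates the dimension D.
    slope, _ = np.polyfit(np.log(1.0 / np.arange(1, kmax + 1)), np.log(lengths), 1)
    return slope

rng = np.random.default_rng(3)
white = rng.standard_normal(5000)              # white noise: dimension close to 2
walk = np.cumsum(white)                        # random walk: dimension close to 1.5
print(higuchi_fd(white), higuchi_fd(walk))
```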

  2. Current interruption transients calculation

    CERN Document Server

    Peelo, David F

    2014-01-01

    Provides an original, detailed and practical description of current interruption transients, origins, and the circuits involved, and how they can be calculated. Current Interruption Transients Calculation is a comprehensive resource for the understanding, calculation and analysis of the transient recovery voltages (TRVs) and related re-ignition or re-striking transients associated with fault current interruption and the switching of inductive and capacitive load currents in circuits. This book provides an original, detailed and practical description of current interruption transients, origins,

  3. DEM time series of an agricultural watershed

    Science.gov (United States)

    Pineux, Nathalie; Lisein, Jonathan; Swerts, Gilles; Degré, Aurore

    2014-05-01

    In agricultural landscapes the soil surface evolves notably due to erosion and deposition phenomena. Even though most field data come from plot-scale studies, the watershed scale seems more appropriate for understanding them. Currently, small unmanned aircraft systems and image treatments are improving. In this way, 3D models are built from multiple overlapping shots. Where techniques for large areas would be too expensive for a watershed-level study, and techniques for small areas too time-consuming, the unmanned aerial system seems to be a promising solution for quantifying erosion and deposition patterns. The increasing technical improvements in this growing field allow us to obtain very good data quality and a very high spatial resolution with high Z accuracy. In the center of Belgium, we equipped an agricultural watershed of 124 ha. For three years (2011-2013), we have been monitoring weather (including rainfall erosivity using a spectropluviograph), discharge at three different locations, sediment in runoff water, and watershed microtopography through unmanned airborne imagery (Gatewing X100). We also collected all available historical data to try to capture the "long-term" changes in watershed morphology during the last decades: old topographic maps, historical soil descriptions, etc. An erosion model (LANDSOIL) is also used to assess the evolution of the relief. Short-term evolution of the surface is now observed through flights done at 200 m height. The pictures are taken with a side overlap equal to 80%. To precisely georeference the DEM produced, ground control points are placed on the study site and surveyed using a Leica GPS1200 (accuracy of 1 cm for the x and y coordinates and 1.5 cm for the z coordinate). Flights are done each year in December to have as bare a ground surface as possible. Specific treatments are developed to counteract the vegetation effect because it is known as a key source of error in the DEM produced by small unmanned aircraft

  4. The impact of family policy and career interruptions on women's perceptions of negative occupational consequences of full-time home care

    DEFF Research Database (Denmark)

    Ejrnæs, Anders

    2011-01-01

    This article examines the role of family policy in shaping mothers' choice between work and care and the perceived occupational consequences of that choice. A central question concerns how parental/maternal leave and childcare policies affect the occupational consequences for mothers who spend time...... on full-time caring. Using comparative data from the second round of the 2004/05 European Social Survey, the analysis shows that the duration of career interruption due to care-giving and different care policies influence mothers' subjective feelings about caring for children having negative consequences...

  5. Database for Hydrological Time Series of Inland Waters (DAHITI)

    Science.gov (United States)

    Schwatke, Christian; Dettmering, Denise

    2016-04-01

    Satellite altimetry was designed for ocean applications. However, for some years satellite altimetry has also been used over inland water to estimate water level time series of lakes, rivers and wetlands. The resulting water level time series can help in understanding the water cycle of system Earth and make altimetry a very useful instrument for hydrological applications. In this poster, we introduce the "Database for Hydrological Time Series of Inland Waters" (DAHITI). Currently, the database contains about 350 water level time series of lakes, reservoirs, rivers, and wetlands which are freely available after a short registration process via http://dahiti.dgfi.tum.de. We introduce the products of DAHITI and the functionality of the DAHITI web service, and selected examples of inland water targets are presented in detail. DAHITI provides time series of water level heights of inland water bodies and their formal errors. These time series are available within the period 1992-2015 and have varying temporal resolutions depending on the data coverage of the investigated water body. The accuracies of the water level time series depend mainly on the extent of the investigated water body and the quality of the altimeter measurements. An external validation with in-situ data reveals RMS differences between 5 cm and 40 cm for lakes and between 10 cm and 140 cm for rivers, respectively.

  6. vector bilinear autoregressive time series model and its superiority ...

    African Journals Online (AJOL)

    In this research, a vector bilinear autoregressive time series model was proposed and used to model three revenue series (X1, X2, X3) ... showed that vector bilinear autoregressive (BIVAR) models provide better estimates than the long embraced linear models. ... order moving average (MA) polynomials on backward shift operator B ...

  7. Conditional time series forecasting with convolutional neural networks

    NARCIS (Netherlands)

    A. Borovykh (Anastasia); S.M. Bohte (Sander); C.W. Oosterlee (Cornelis)

    2017-01-01

    Forecasting financial time series using past observations has been a significant topic of interest. While temporal relationships in the data exist, they are difficult to analyze and predict accurately due to the non-linear trends and noise present in the series. We propose to learn these

  8. A Dimensionality Reduction Technique for Efficient Time Series Similarity Analysis

    Science.gov (United States)

    Wang, Qiang; Megalooikonomou, Vasileios

    2008-01-01

    We propose a dimensionality reduction technique for time series analysis that significantly improves the efficiency and accuracy of similarity searches. In contrast to piecewise constant approximation (PCA) techniques that approximate each time series with constant value segments, the proposed method--Piecewise Vector Quantized Approximation--uses the closest (based on a distance measure) codeword from a codebook of key-sequences to represent each segment. The new representation is symbolic and it allows for the application of text-based retrieval techniques into time series similarity analysis. Experiments on real and simulated datasets show that the proposed technique generally outperforms PCA techniques in clustering and similarity searches. PMID:18496587

  9. A Dimensionality Reduction Technique for Efficient Time Series Similarity Analysis.

    Science.gov (United States)

    Wang, Qiang; Megalooikonomou, Vasileios

    2008-03-01

    We propose a dimensionality reduction technique for time series analysis that significantly improves the efficiency and accuracy of similarity searches. In contrast to piecewise constant approximation (PCA) techniques that approximate each time series with constant value segments, the proposed method--Piecewise Vector Quantized Approximation--uses the closest (based on a distance measure) codeword from a codebook of key-sequences to represent each segment. The new representation is symbolic and it allows for the application of text-based retrieval techniques into time series similarity analysis. Experiments on real and simulated datasets show that the proposed technique generally outperforms PCA techniques in clustering and similarity searches.
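
    A condensed sketch of the piecewise vector-quantized idea follows: learn a codebook of key-sequences and replace each fixed-length segment by the index of its nearest codeword, giving a symbolic representation. Using k-means to build the codebook, together with the segment length, codebook size and random training data, are illustrative assumptions rather than the authors' exact settings.

```python
# Condensed sketch of a piecewise vector quantized approximation: learn a codebook
# of key-sequences (here via k-means) and encode each fixed-length segment of a
# series by its closest codeword index. Segment length, codebook size and the
# random training data are assumptions, not the authors' settings.
import numpy as np
from sklearn.cluster import KMeans

segment_len, codebook_size = 16, 32
rng = np.random.default_rng(4)
train = np.cumsum(rng.standard_normal((100, 256)), axis=1)      # 100 example series

# Build the codebook from all training segments (each row is one segment).
segments = train.reshape(-1, segment_len)
codebook = KMeans(n_clusters=codebook_size, n_init=10, random_state=0).fit(segments)

def encode(series):
    """Symbolic representation: one codeword index per segment."""
    return codebook.predict(series.reshape(-1, segment_len))

def reconstruct(symbols):
    return codebook.cluster_centers_[symbols].ravel()

x = np.cumsum(rng.standard_normal(256))
sym = encode(x)
print(sym)                                                       # compact symbolic form
print(np.mean((x - reconstruct(sym)) ** 2))                      # reconstruction error
```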

  10. Multivariate time series analysis with R and financial applications

    CERN Document Server

    Tsay, Ruey S

    2013-01-01

    Since the publication of his first book, Analysis of Financial Time Series, Ruey Tsay has become one of the most influential and prominent experts on the topic of time series. Different from the traditional and oftentimes complex approach to multivariate (MV) time series, this sequel book emphasizes structural specification, which results in simplified parsimonious VARMA modeling and, hence, eases comprehension. Through a fundamental balance between theory and applications, the book supplies readers with an accessible approach to financial econometric models and their applications to real-worl

  11. The Relevance of Coordinate Time Series Analysis In Permanent GPS Networks: The European Example

    Science.gov (United States)

    Kenyeres, A.

    An evolving research tendency in Earth sciences is the use and exploitation of multi-purpose, permanently operating GNSS stations. They may serve everyday practice (navigation, geodesy), specific research activities (e.g. geokinematics) and near-future applications (e.g. meteorology) as well. The long-term maintenance of station and product consistency is vital to guarantee the quality in all cases. This paper concentrates on the geokinematic application of GPS, where station velocities are derived from the long-term observation series of the permanent stations. This application requires the most sophisticated and careful analysis of station coordinate time series in order to derive mm-accuracy velocities. However, the long-term consistency of the time series is corrupted by events caused by technical interruptions (hardware configuration changes), environmental effects (emerging noise and multipath sources) and human interactions (changes in network configuration and processing scheme). The general effects of all these phenomena are seen in the time series as jumps and outlier periods and they decrease the accuracy and reliability of the estimated velocities. The careful analysis, identification, bookkeeping and elimination of the nuisance factors should be a basic research activity in order to recover the capabilities of the observing system. Within the EUREF Permanent Network (EPN) an effort was started in 2000 to perform a review-analysis of the EPN weekly combined solutions. The coordinate time series of all stations are examined, the jumps and outliers are eliminated and a cleaned multi-year combination will be computed. This retrospective work will be completed in 2002, which will be followed by continuous tracking of station performance. In this paper the experiences and the actual status of this work are presented.

  12. Characterizing interdependencies of multiple time series theory and applications

    CERN Document Server

    Hosoya, Yuzo; Takimoto, Taro; Kinoshita, Ryo

    2017-01-01

    This book introduces academic researchers and professionals to the basic concepts and methods for characterizing interdependencies of multiple time series in the frequency domain. Detecting causal directions between a pair of time series and the extent of their effects, as well as testing the non existence of a feedback relation between them, have constituted major focal points in multiple time series analysis since Granger introduced the celebrated definition of causality in view of prediction improvement. Causality analysis has since been widely applied in many disciplines. Although most analyses are conducted from the perspective of the time domain, a frequency domain method introduced in this book sheds new light on another aspect that disentangles the interdependencies between multiple time series in terms of long-term or short-term effects, quantitatively characterizing them. The frequency domain method includes the Granger noncausality test as a special case. Chapters 2 and 3 of the book introduce an i...

  13. Scalable Prediction of Energy Consumption using Incremental Time Series Clustering

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Noor, Muhammad Usman

    2013-10-09

    Time series datasets are a canonical form of high velocity Big Data, and often generated by pervasive sensors, such as found in smart infrastructure. Performing predictive analytics on time series data can be computationally complex, and requires approximation techniques. In this paper, we motivate this problem using a real application from the smart grid domain. We propose an incremental clustering technique, along with a novel affinity score for determining cluster similarity, which help reduce the prediction error for cumulative time series within a cluster. We evaluate this technique, along with optimizations, using real datasets from smart meters, totaling ~700,000 data points, and show the efficacy of our techniques in improving the prediction error of time series data within polynomial time.

  14. Detecting structural breaks in time series via genetic algorithms

    DEFF Research Database (Denmark)

    Doerr, Benjamin; Fischer, Paul; Hilbert, Astrid

    2016-01-01

    Detecting structural breaks is an essential task for the statistical analysis of time series, for example, for fitting parametric models to it. In short, structural breaks are points in time at which the behaviour of the time series substantially changes. Typically, no solid background knowledge...... of the time series under consideration is available. Therefore, a black-box optimization approach is our method of choice for detecting structural breaks. We describe a genetic algorithm framework which easily adapts to a large number of statistical settings. To evaluate the usefulness of different crossover...... operator alone. Moreover, we present a specific fitness function which exploits the sparse structure of the break points and which can be evaluated particularly efficiently. The experiments on artificial and real-world time series show that the resulting algorithm detects break points with high precision...

  15. Elements of nonlinear time series analysis and forecasting

    CERN Document Server

    De Gooijer, Jan G

    2017-01-01

    This book provides an overview of the current state-of-the-art of nonlinear time series analysis, richly illustrated with examples, pseudocode algorithms and real-world applications. Avoiding a “theorem-proof” format, it shows concrete applications on a variety of empirical time series. The book can be used in graduate courses in nonlinear time series and at the same time also includes interesting material for more advanced readers. Though it is largely self-contained, readers require an understanding of basic linear time series concepts, Markov chains and Monte Carlo simulation methods. The book covers time-domain and frequency-domain methods for the analysis of both univariate and multivariate (vector) time series. It makes a clear distinction between parametric models on the one hand, and semi- and nonparametric models/methods on the other. This offers the reader the option of concentrating exclusively on one of these nonlinear time series analysis methods. To make the book as user friendly as possible...

  16. Financial Time-series Analysis: a Brief Overview

    Science.gov (United States)

    Chakraborti, A.; Patriarca, M.; Santhanam, M. S.

    Prices of commodities or assets produce what is called time-series. Different kinds of financial time-series have been recorded and studied for decades. Nowadays, all transactions on a financial market are recorded, leading to a huge amount of data available, either for free in the Internet or commercially. Financial time-series analysis is of great interest to practitioners as well as to theoreticians, for making inferences and predictions. Furthermore, the stochastic uncertainties inherent in financial time-series and the theory needed to deal with them make the subject especially interesting not only to economists, but also to statisticians and physicists [1]. While it would be a formidable task to make an exhaustive review on the topic, with this review we try to give a flavor of some of its aspects.

  17. AFSC/ABL: Naknek sockeye salmon scale time series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A time series of scale samples (1956-2002) collected from adult sockeye salmon returning to Naknek River were retrieved from the Alaska Department of Fish and Game....

  18. AFSC/ABL: Ugashik sockeye salmon scale time series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A time series of scale samples (1956-2002) collected from adult sockeye salmon returning to Ugashik River were retrieved from the Alaska Department of Fish and...

  19. Fast and Flexible Multivariate Time Series Subsequence Search

    Data.gov (United States)

    National Aeronautics and Space Administration — Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical...

  20. Unsupervised land cover change detection: meaningful sequential time series analysis

    CSIR Research Space (South Africa)

    Salmon, BP

    2011-06-01

    Full Text Available An automated land cover change detection method is proposed that uses coarse spatial resolution hyper-temporal earth observation satellite time series data. The study compared three different unsupervised clustering approaches that operate on short...

  1. Geomechanical time series and its singularity spectrum analysis

    Czech Academy of Sciences Publication Activity Database

    Lyubushin, Alexei A.; Kaláb, Zdeněk; Lednická, Markéta

    2012-01-01

    Roč. 47, č. 1 (2012), s. 69-77 ISSN 1217-8977 R&D Projects: GA ČR GA105/09/0089 Institutional research plan: CEZ:AV0Z30860518 Keywords : geomechanical time series * singularity spectrum * time series segmentation * laser distance meter Subject RIV: DC - Siesmology, Volcanology, Earth Structure Impact factor: 0.347, year: 2012 http://www.akademiai.com/content/88v4027758382225/fulltext.pdf

  2. Signal Processing for Time-Series Functions on a Graph

    Science.gov (United States)

    2018-02-01

    ARL-TR-8276, FEB 2018, US Army Research Laboratory: Signal Processing for Time-Series Functions on a Graph, by Humberto Muñoz-Barona (Southern University), Jean Vettel, and ... Previous research introduced signal processing on graphs, an approach to generalize signal processing tools such

  3. Multiple Time Series Ising Model for Financial Market Simulations

    International Nuclear Information System (INIS)

    Takaishi, Tetsuya

    2015-01-01

    In this paper we propose an Ising model which simulates multiple financial time series. Our model introduces the interaction which couples to spins of other systems. Simulations from our model show that time series exhibit the volatility clustering that is often observed in the real financial markets. Furthermore we also find non-zero cross correlations between the volatilities from our model. Thus our model can simulate stock markets where volatilities of stocks are mutually correlated

  4. Stacked Heterogeneous Neural Networks for Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    Florin Leon

    2010-01-01

    Full Text Available A hybrid model for time series forecasting is proposed. It is a stacked neural network, containing one normal multilayer perceptron with bipolar sigmoid activation functions and another with an exponential activation function in the output layer. As shown by the case studies, the proposed stacked hybrid neural model performs well on a variety of benchmark time series. The combination of weights of the two stack components that leads to optimal performance is also studied.

  5. Extracting Chaos Control Parameters from Time Series Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Santos, R B B [Centro Universitario da FEI, Avenida Humberto de Alencar Castelo Branco 3972, 09850-901, Sao Bernardo do Campo, SP (Brazil); Graves, J C, E-mail: rsantos@fei.edu.br [Instituto Tecnologico de Aeronautica, Praca Marechal Eduardo Gomes 50, 12228-900, Sao Jose dos Campos, SP (Brazil)

    2011-03-01

    We present a simple method to analyze time series, and estimate the parameters needed to control chaos in dynamical systems. Application of the method to a system described by the logistic map is also shown. Analyzing only two 100-point time series, we achieved results within 2% of the analytical ones. With these estimates, we show that OGY control method successfully stabilized a period-1 unstable periodic orbit embedded in the chaotic attractor.
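
    The paper's exact procedure is not reproduced here; as a toy illustration of the same idea (recovering control-relevant parameters from a short observed series), the logistic-map parameter r can be estimated by least squares, after which the unstable fixed point and local slope that OGY-type control relies on follow directly. The noise level, series length and parameter value below are assumptions.

```python
# Toy illustration (not the paper's method): estimate the logistic-map parameter r
# from a short observed series by least squares, then compute the unstable period-1
# fixed point and its local derivative, the quantities OGY-type control needs.
import numpy as np

rng = np.random.default_rng(5)
r_true = 3.9
x = np.empty(100); x[0] = 0.3
for i in range(99):
    x[i + 1] = r_true * x[i] * (1.0 - x[i])
x_obs = x + 1e-4 * rng.standard_normal(100)        # small measurement noise (assumed)

# x[t+1] = r * x[t] * (1 - x[t])  =>  least-squares estimate of r
u = x_obs[:-1] * (1.0 - x_obs[:-1])
r_hat = np.dot(u, x_obs[1:]) / np.dot(u, u)

x_fixed = 1.0 - 1.0 / r_hat                        # unstable period-1 orbit
slope_at_fixed = r_hat * (1.0 - 2.0 * x_fixed)     # local map derivative (equals 2 - r)
print(f"r_hat = {r_hat:.4f}, fixed point = {x_fixed:.4f}, local slope = {slope_at_fixed:.3f}")
```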

  6. An innovation approach to non-Gaussian time series analysis

    OpenAIRE

    Ozaki, Tohru; Iino, Mitsunori

    2001-01-01

    The paper shows that the use of both types of random noise, white noise and Poisson noise, can be justified when using an innovations approach. The historical background for this is sketched, and then several methods of whitening dependent time series are outlined, including a mixture of Gaussian white noise and a compound Poisson process: this appears as a natural extension of the Gaussian white noise model for the prediction errors of a non-Gaussian time series. A stati...

  7. SEASONAL AUTOREGRESSIVE INTEGRATED MOVING AVERAGE MODEL FOR PRECIPITATION TIME SERIES

    OpenAIRE

    Yan Wang; Meng Gao; Xinghua Chang; Xiyong Hou

    2012-01-01

    Predicting the trend of precipitation is a difficult task in meteorology and environmental sciences. Statistical approaches from time series analysis provide an alternative way for precipitation prediction. An ARIMA model incorporating seasonal characteristics, referred to as a seasonal ARIMA (SARIMA) model, is presented. The time series data are the monthly precipitation data in Yantai, China, for the period from 1961 to 2011. The model was denoted as SARIMA (1, 0, 1) (0, 1, 1)12 in this stu...
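
    The SARIMA(1,0,1)(0,1,1)[12] specification named in the abstract can be fitted with statsmodels, as sketched below. The synthetic monthly series stands in for the Yantai precipitation data, which are not reproduced here, and the use of statsmodels is an assumption rather than the authors' software.

```python
# Fitting the SARIMA(1,0,1)(0,1,1)[12] specification from the abstract with
# statsmodels; the synthetic monthly series is a stand-in for the real data.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(6)
months = pd.date_range("1961-01", periods=51 * 12, freq="MS")
seasonal = 40 + 30 * np.sin(2 * np.pi * (np.arange(len(months)) % 12) / 12)
y = pd.Series(np.maximum(seasonal + rng.normal(0, 10, len(months)), 0), index=months)

model = SARIMAX(y, order=(1, 0, 1), seasonal_order=(0, 1, 1, 12))
result = model.fit(disp=False)
print(result.summary().tables[1])          # estimated AR, MA and seasonal MA terms

forecast = result.get_forecast(steps=12)   # next 12 months
print(forecast.predicted_mean.round(1))
```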

  8. Time Series Analysis of Insar Data: Methods and Trends

    Science.gov (United States)

    Osmanoglu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cano-Cabral, Enrique

    2015-01-01

    Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.

  9. Combined Forecasts from Linear and Nonlinear Time Series Models

    NARCIS (Netherlands)

    N. Terui (Nobuhiko); H.K. van Dijk (Herman)

    1999-01-01

    Combined forecasts from a linear and a nonlinear model are investigated for time series with possibly nonlinear characteristics. The forecasts are combined by a constant coefficient regression method as well as a time varying method. The time varying method allows for a locally

  10. Combined forecasts from linear and nonlinear time series models

    NARCIS (Netherlands)

    N. Terui (Nobuhiko); H.K. van Dijk (Herman)

    1999-01-01

    Combined forecasts from a linear and a nonlinear model are investigated for time series with possibly nonlinear characteristics. The forecasts are combined by a constant coefficient regression method as well as a time varying method. The time varying method allows for a locally

  11. Data imputation analysis for Cosmic Rays time series

    Science.gov (United States)

    Fernandes, R. C.; Lucio, P. S.; Fernandez, J. H.

    2017-05-01

    The occurrence of missing data in Galactic Cosmic Ray (GCR) time series is inevitable, since data losses arise from mechanical and human failure, technical problems, and the different periods of operation of GCR stations. The aim of this study was to perform multiple imputation in order to reconstruct the observational dataset. The study used the monthly time series of GCR Climax (CLMX) and Roma (ROME) from 1960 to 2004 to simulate scenarios of 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80% and 90% of missing data relative to the observed ROME series, with 50 replicates; the CLMX station was used as a proxy for allocating these scenarios. Three different methods for monthly dataset imputation were selected: Amelia II, which runs a bootstrap Expectation-Maximization algorithm; MICE, which runs an algorithm based on Multivariate Imputation by Chained Equations; and MTSDI, an Expectation-Maximization-based method for imputation of missing values in multivariate normal time series. The synthetic time series were compared with the observed ROME series using several skill measures, such as RMSE, NRMSE, the Agreement Index, R, R2, the F-test and the t-test. The results showed that for CLMX and ROME the R2 and R statistics were equal to 0.98 and 0.96, respectively. It was observed that increasing the number of gaps degrades the quality of the time series. Data imputation was most efficient with the MTSDI method, with negligible errors and the best skill coefficients. The results suggest that, for monthly averages, imputation remains viable up to a limit of about 60% missing data. It is noteworthy that the CLMX, ROME and KIEL stations present no missing data in the target period. This methodology allowed 43 time series to be reconstructed.
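    The study itself used the R packages Amelia II, mice and mtsdi. As a rough Python analogue of the chained-equations idea only, scikit-learn's IterativeImputer can be applied to a complete "proxy" station and a gappy "target" station; the synthetic data, gap fraction and RMSE check below are illustrative assumptions.

```python
# Rough Python analogue of chained-equations imputation (the study itself used R packages).
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401  (activates the estimator)
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(1)
n = 540                                        # monthly values, 1960-2004
common = np.cumsum(rng.normal(size=n))         # shared signal, standing in for the GCR modulation
clmx = common + rng.normal(scale=0.3, size=n)  # "proxy" station, kept complete
rome = common + rng.normal(scale=0.3, size=n)  # "target" station

rome_gappy = rome.copy()
missing = rng.random(n) < 0.30                 # punch 30% gaps into the target series
rome_gappy[missing] = np.nan

X = np.column_stack([clmx, rome_gappy])
filled = IterativeImputer(max_iter=20, random_state=0).fit_transform(X)[:, 1]

rmse = np.sqrt(np.mean((filled[missing] - rome[missing]) ** 2))
print(f"RMSE on the imputed gaps: {rmse:.3f}")
```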

  12. Period Estimation in Astronomical Time Series Using Slotted Correntropy

    Science.gov (United States)

    Huijse, Pablo; Estevez, Pablo A.; Zegers, Pablo; Principe, José C.; Protopapas, Pavlos

    2011-06-01

    In this letter, we propose a method for period estimation in light curves from periodic variable stars using correntropy. Light curves are astronomical time series of stellar brightness over time, and are characterized as being noisy and unevenly sampled. We propose to use slotted time lags in order to estimate correntropy directly from irregularly sampled time series. A new information theoretic metric is proposed for discriminating among the peaks of the correntropy spectral density. The slotted correntropy method outperformed slotted correlation, string length, VarTools (Lomb-Scargle periodogram and Analysis of Variance), and SigSpec applications on a set of light curves drawn from the MACHO survey.
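    A bare-bones sketch of the slotting idea for irregular sampling is given below: sample pairs are binned by their time lag, and a Gaussian kernel of the value differences is averaged within each slot. The function name, kernel width, slot size and synthetic light curve are illustrative assumptions; the information-theoretic peak-discrimination metric and the periodogram machinery of the letter are not reproduced.

```python
# Bare-bones slotted correntropy for an irregularly sampled series (kernel width and
# slot size are arbitrary illustrative choices, not those of the original method).
import numpy as np

def slotted_correntropy(t, y, lags, slot_width, sigma):
    """Average Gaussian kernel of value differences for sample pairs whose
    time separation falls within +/- slot_width/2 of each requested lag."""
    dt = np.abs(t[:, None] - t[None, :])            # all pairwise time separations
    dy = y[:, None] - y[None, :]
    kernel = np.exp(-dy ** 2 / (2.0 * sigma ** 2))
    out = np.empty(len(lags))
    for k, lag in enumerate(lags):
        mask = np.abs(dt - lag) <= slot_width / 2.0
        out[k] = kernel[mask].mean() if mask.any() else np.nan
    return out

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0.0, 100.0, 400))           # uneven sampling times (days)
period = 7.3
y = np.sin(2 * np.pi * t / period) + 0.3 * rng.normal(size=t.size)

lags = np.arange(0.5, 30.0, 0.5)
v = slotted_correntropy(t, y, lags, slot_width=0.5, sigma=np.std(y))
idx = np.argmin(np.abs(lags - period))
print("correntropy at the ~7.3-day lag:", round(float(v[idx]), 3))
print("mean correntropy over all lags: ", round(float(np.nanmean(v)), 3))
```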

  13. Multiresolution analysis of Bursa Malaysia KLCI time series

    Science.gov (United States)

    Ismail, Mohd Tahir; Dghais, Amel Abdoullah Ahmed

    2017-05-01

    In general, a time series is simply a sequence of numbers collected at regular intervals over a period. Financial time series data processing is concerned with the theory and practice of processing asset prices over time, such as currency, commodity and stock market data. The primary aim of this study is to understand the fundamental characteristics of selected financial time series by using time domain as well as frequency domain analysis. After that, prediction can be carried out for the desired system for in-sample forecasting. In this study, multiresolution analysis with the aid of the discrete wavelet transform (DWT) and the maximal overlap discrete wavelet transform (MODWT) is used to pinpoint special characteristics of Bursa Malaysia KLCI (Kuala Lumpur Composite Index) daily closing prices and return values. In addition, further case study discussion includes the modeling of the Bursa Malaysia KLCI using linear ARIMA with wavelets to address how the multiresolution approach improves fitting and forecasting results.
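    A small multiresolution sketch is shown below using the third-party PyWavelets package (an assumed dependency). Only a plain DWT of a toy return series is shown; the MODWT step and the ARIMA-with-wavelets modelling of the actual KLCI data are omitted for brevity.

```python
# Minimal multiresolution sketch with PyWavelets (assumed installed as `pywt`);
# a plain DWT is shown here, standing in for the DWT/MODWT analysis of KLCI returns.
import numpy as np
import pywt

rng = np.random.default_rng(3)
prices = 1500 * np.exp(np.cumsum(rng.normal(0.0, 0.01, size=1024)))  # toy index level
returns = np.diff(np.log(prices))

coeffs = pywt.wavedec(returns, wavelet="db4", level=4)   # [cA4, cD4, cD3, cD2, cD1]
for name, c in zip(["A4", "D4", "D3", "D2", "D1"], coeffs):
    print(f"{name}: {len(c):4d} coefficients, energy = {np.sum(c**2):.5f}")

# A smoothed view of the returns: keep only the coarse approximation level.
smooth = pywt.waverec([coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]], wavelet="db4")
print("reconstructed length:", len(smooth))
```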

  14. Correlation measure to detect time series distances, whence economy globalization

    Science.gov (United States)

    Miśkiewicz, Janusz; Ausloos, Marcel

    2008-11-01

    An instantaneous time series distance is defined through the equal time correlation coefficient. The idea is applied to the Gross Domestic Product (GDP) yearly increments of 21 rich countries between 1950 and 2005 in order to test the process of economic globalisation. Some data discussion is first presented to decide what (EKS, GK, or derived) GDP series should be studied. Distances are then calculated from the correlation coefficient values between pairs of series. The role of time averaging of the distances over finite size windows is discussed. Three network structures are next constructed based on the hierarchy of distances. It is shown that the mean distance between the most developed countries on several networks actually decreases in time, which we consider as a proof of globalisation. An empirical law is found for the evolution after 1990, similar to that found in flux creep. The optimal observation time window size is found to be ≃15 years.
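    The general recipe of a correlation-based distance matrix averaged over a moving window can be sketched as follows. The mapping d = sqrt(2(1 - C)) is one common choice of the correlation-to-distance form and is not necessarily the paper's exact definition; the synthetic GDP increments and the 15-year window are illustrative.

```python
# Sketch of a correlation-based distance matrix over a moving window; the mapping
# d = sqrt(2*(1 - C)) is one common choice, not necessarily the paper's exact form.
import numpy as np

rng = np.random.default_rng(4)
n_countries, n_years = 21, 56                  # 21 rich countries, 1950-2005
global_factor = rng.normal(size=n_years)
increments = 0.6 * global_factor + 0.8 * rng.normal(size=(n_countries, n_years))

def distance_matrix(x):
    c = np.corrcoef(x)                         # equal-time correlation coefficients
    return np.sqrt(np.clip(2.0 * (1.0 - c), 0.0, None))

window = 15                                    # years, echoing the study's optimal window
mean_dist = []
for start in range(n_years - window + 1):
    d = distance_matrix(increments[:, start:start + window])
    mean_dist.append(d[np.triu_indices(n_countries, k=1)].mean())

print("mean pairwise distance, first vs last window: "
      f"{mean_dist[0]:.3f} -> {mean_dist[-1]:.3f}")
```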

  15. Time domain series system definition and gear set reliability modeling

    International Nuclear Information System (INIS)

    Xie, Liyang; Wu, Ningxiang; Qian, Wenxue

    2016-01-01

    Time-dependent multi-configuration is a typical feature of mechanical systems such as gear trains and chain drives. As a series system, a gear train is distinct from a traditional series system, such as a chain, in its load transmission path, system-component relationship, system functioning manner, and time-dependent system configuration. Firstly, the present paper defines the time-domain series system, for which the traditional series system reliability model is not adequate. Then, a system-specific reliability modeling technique is proposed for gear sets, including component (tooth) and subsystem (tooth-pair) load history description, material prior/posterior strength expression, time-dependent and system-specific load-strength interference analysis, and treatment of statistically dependent failure events. Consequently, several system reliability models are developed for gear sets with different tooth numbers in the scenario of tooth root material ultimate tensile strength failure. The application of the models is discussed in the last part, and the differences between the system-specific reliability model and the traditional series system reliability model are illustrated by means of several numerical examples. - Highlights: • A new type of series system, i.e. the time-domain multi-configuration series system, is defined, which is of great significance to reliability modeling. • A multi-level statistical-analysis-based reliability modeling method is presented for gear transmission systems. • Several system-specific reliability models are established for gear set reliability estimation. • The differences between the traditional series system reliability model and the new model are illustrated.

  16. Evaluation of Scaling Invariance Embedded in Short Time Series

    Science.gov (United States)

    Pan, Xue; Hou, Lei; Stephen, Mutua; Yang, Huijie; Zhu, Chenping

    2014-01-01

    Scaling invariance of time series has made great contributions in diverse research fields, but how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuation and bias in statistical quantities, and consequently invalidate currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale invariance in very short time series. Calculations with specified Hurst exponent values show that, by using the standard central moving average de-trending procedure, this method can evaluate the scaling exponents for short time series with negligible bias and a sharp confidence interval. Considering the stride series from ten volunteers along an approximate oval path of a specified length, we observe that though the averages and deviations of the scaling exponents are close, their evolutionary behaviors display rich patterns. The method has potential use in analyzing physiological signals, detecting early warning signals, and so on. We emphasize that our core contribution is that the proposed method allows the Shannon entropy to be estimated precisely from limited records. PMID:25549356

  17. State-level gonorrhea rates and expedited partner therapy laws: insights from time series analyses.

    Science.gov (United States)

    Owusu-Edusei, K; Cramer, R; Chesson, H W; Gift, T L; Leichliter, J S

    2017-06-01

    In this study, we examined state-level monthly gonorrhea morbidity and assessed the potential impact of existing expedited partner therapy (EPT) laws in relation to the time that the laws were enacted. Longitudinal study. We obtained state-level monthly gonorrhea morbidity (number of cases/100,000 for males, females and total) from the national surveillance data. We used visual examination (of morbidity trends) and an autoregressive time series model in a panel format with intervention (interrupted time series) analysis to assess the impact of state EPT laws based on the months in which the laws were enacted. For over 84% of the states with EPT laws, the monthly morbidity trends did not show any noticeable decreases on or after the laws were enacted. Although we found statistically significant decreases in gonorrhea morbidity within four of the states with EPT laws (Alaska, Illinois, Minnesota, and Vermont), there were no significant decreases when the decreases in the four states were compared contemporaneously with the decreases in states that do not have the laws. We found no impact (decrease in gonorrhea morbidity) attributable exclusively to the EPT law(s). However, these results do not imply that the EPT laws themselves were not effective (or failed to reduce gonorrhea morbidity), because the effectiveness of the EPT law is dependent on necessary intermediate events/outcomes, including sexually transmitted infection service providers' awareness and practice, as well as acceptance by patients and their partners. Published by Elsevier Ltd.

  18. Self-affinity in the dengue fever time series

    Science.gov (United States)

    Azevedo, S. M.; Saba, H.; Miranda, J. G. V.; Filho, A. S. Nascimento; Moret, M. A.

    2016-06-01

    Dengue is a complex public health problem that is common in tropical and subtropical regions. This disease has risen substantially in the last three decades, and the physical symptoms depict the self-affine behavior of the occurrences of reported dengue cases in Bahia, Brazil. This study uses detrended fluctuation analysis (DFA) to verify the scale behavior in a time series of dengue cases and to evaluate the long-range correlations that are characterized by the power law α exponent for different cities in Bahia, Brazil. The scaling exponent (α) presents different long-range correlations, i.e. uncorrelated, anti-persistent, persistent and diffusive behaviors. The long-range correlations highlight the complex behavior of the time series of this disease. The findings show that there are two distinct types of scale behavior. In the first behavior, the time series presents a persistent α exponent for a one-month period. For large periods, the time series signal approaches subdiffusive behavior. The hypothesis of the long-range correlations in the time series of the occurrences of reported dengue cases was validated. The observed self-affinity is useful as a forecasting tool for future periods through extrapolation of the α exponent behavior. This complex system has a higher predictability in a relatively short time (approximately one month), and it suggests a new tool in epidemiological control strategies. However, predictions for large periods using DFA are hidden by the subdiffusive behavior.
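    A generic first-order DFA sketch is given below: integrate the series into a profile, detrend it linearly in windows of increasing size, and read the scaling exponent α from the slope of the log-log fluctuation curve. The white-noise and random-walk test inputs and the scale range are illustrative only; this is not the exact pipeline applied to the Bahia case counts.

```python
# Compact first-order DFA: profile -> windowed polynomial detrending -> log-log slope.
# Generic sketch, not the exact pipeline applied to the Bahia dengue counts.
import numpy as np

def dfa_exponent(x, scales):
    profile = np.cumsum(x - np.mean(x))
    flucts = []
    for s in scales:
        n_seg = len(profile) // s
        segs = profile[: n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        mse = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)               # local linear trend
            mse.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        flucts.append(np.sqrt(np.mean(mse)))
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

rng = np.random.default_rng(5)
white = rng.normal(size=4096)
scales = np.unique(np.logspace(2, 9, 12, base=2).astype(int))
print("alpha (white noise, expected ~0.5):", round(dfa_exponent(white, scales), 2))
print("alpha (random walk, expected ~1.5):", round(dfa_exponent(np.cumsum(white), scales), 2))
```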

  19. Drunk driving detection based on classification of multivariate time series.

    Science.gov (United States)

    Li, Zhenlong; Jin, Xue; Zhao, Xiaohua

    2015-09-01

    This paper addresses the problem of detecting drunk driving based on classification of multivariate time series. First, driving performance measures were collected from a test in a driving simulator located in the Traffic Research Center, Beijing University of Technology. Lateral position and steering angle were used to detect drunk driving. Second, multivariate time series analysis was performed to extract the features. A piecewise linear representation was used to represent multivariate time series. A bottom-up algorithm was then employed to separate multivariate time series. The slope and time interval of each segment were extracted as the features for classification. Third, a support vector machine classifier was used to classify driver's state into two classes (normal or drunk) according to the extracted features. The proposed approach achieved an accuracy of 80.0%. Drunk driving detection based on the analysis of multivariate time series is feasible and effective. The approach has implications for drunk driving detection. Copyright © 2015 Elsevier Ltd and National Safety Council. All rights reserved.
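    A toy sketch of the overall pipeline is shown below. Fixed windows stand in for the bottom-up piecewise-linear segmentation, the per-window slopes of lateral position and steering angle are used as features, and an SVM separates the two classes. The simulate_drive function, window length and SVM settings are invented for illustration and do not reproduce the simulator data or the reported 80% accuracy.

```python
# Toy sketch: fixed windows stand in for bottom-up piecewise-linear segmentation;
# per-window slopes of lateral position and steering angle feed an SVM classifier.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(6)

def simulate_drive(drunk, n=600):
    wander = 0.6 if drunk else 0.2                       # drunk driving: larger weaving
    lateral = np.cumsum(rng.normal(0, wander, n)) * 0.01
    steering = np.gradient(lateral) * 50 + rng.normal(0, 0.5, n)
    return np.column_stack([lateral, steering])

def slope_features(drive, window=50):
    t = np.arange(window)
    feats = []
    for start in range(0, drive.shape[0] - window + 1, window):
        seg = drive[start:start + window]
        feats.extend(np.polyfit(t, seg, 1)[0])           # slope of each channel in the window
    return feats

X = np.array([slope_features(simulate_drive(d % 2 == 0)) for d in range(200)])
y = np.array([d % 2 == 0 for d in range(200)], dtype=int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
print("toy classification accuracy:", round(clf.score(X_te, y_te), 2))
```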

  20. Stochastic modeling of hourly rainfall times series in Campania (Italy)

    Science.gov (United States)

    Giorgio, M.; Greco, R.

    2009-04-01

    The occurrence of flowslides and floods in small catchments is difficult to predict, since it is affected by a number of variables, such as mechanical and hydraulic soil properties, slope morphology, vegetation coverage, and the spatial and temporal variability of rainfall. Consequently, landslide risk assessment procedures and early warning systems still rely on simple empirical models based on correlation between recorded rainfall data and observed landslides and/or river discharges. The effectiveness of such systems could be improved by reliable quantitative rainfall prediction, which can provide larger lead times. Analysis of on-site recorded rainfall height time series represents the most effective approach for a reliable prediction of the local temporal evolution of rainfall. Hydrological time series analysis is a widely studied field in hydrology, often carried out by means of autoregressive models, such as AR, ARMA, ARX, ARMAX (e.g. Salas [1992]). Such models gave the best results when applied to the analysis of autocorrelated hydrological time series, like river flow or level time series. Conversely, they are not able to model the behaviour of intermittent time series, like point rainfall height series usually are, especially when recorded with short sampling time intervals. More useful for this issue are the so-called DRIP (Disaggregated Rectangular Intensity Pulse) and NSRP (Neyman-Scott Rectangular Pulse) models [Heneker et al., 2001; Cowpertwait et al., 2002], usually adopted to generate synthetic point rainfall series. In this paper, the DRIP model approach is adopted, in which the sequence of rain storms and dry intervals constituting the structure of the rainfall time series is modeled as an alternating renewal process. The final aim of the study is to provide a useful tool to implement an early warning system for hydrogeological risk management. Model calibration has been carried out with hourly rainfall height data provided by the rain gauges of Campania Region civil

  1. Biogeochemistry from Gliders at the Hawaii Ocean Times-Series

    Science.gov (United States)

    Nicholson, D. P.; Barone, B.; Karl, D. M.

    2016-02-01

    At the Hawaii Ocean Time-series (HOT), autonomous underwater gliders equipped with biogeochemical sensors observe the oceans for months at a time, sampling spatiotemporal scales missed by the ship-based programs. Over the last decade, glider data augmented by a foundation of time-series observations have shed light on biogeochemical dynamics occurring spatially at meso- and submesoscales and temporally on scales from diel to annual. We present insights gained from the synergy between glider observations, time-series measurements and remote sensing in the subtropical North Pacific. We focus on diel variability observed in dissolved oxygen and bio-optics and approaches to autonomously quantify net community production and gross primary production (GPP) as developed during the 2012 Hawaii Ocean Experiment - DYnamics of Light And Nutrients (HOE-DYLAN). Glider-based GPP measurements were extended to explore the relationship between GPP and mesoscale context over multiple years of Seaglider deployments.

  2. Compounding approach for univariate time series with nonstationary variances

    Science.gov (United States)

    Schäfer, Rudi; Barkhofen, Sonja; Guhr, Thomas; Stöckmann, Hans-Jürgen; Kuhl, Ulrich

    2015-12-01

    A defining feature of nonstationary systems is the time dependence of their statistical parameters. Measured time series may exhibit Gaussian statistics on short time horizons, due to the central limit theorem. The sample statistics for long time horizons, however, averages over the time-dependent variances. To model the long-term statistical behavior, we compound the local distribution with the distribution of its parameters. Here, we consider two concrete, but diverse, examples of such nonstationary systems: the turbulent air flow of a fan and a time series of foreign exchange rates. Our main focus is to empirically determine the appropriate parameter distribution for the compounding approach. To this end, we extract the relevant time scales by decomposing the time signals into windows and determine the distribution function of the thus obtained local variances.
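    The compounding idea can be written down in a few lines: draw a variance for each short window from a parameter distribution, draw locally Gaussian values with that variance, and compare the short-horizon statistics with the heavy-tailed long-horizon mixture. The inverse-gamma variance distribution and window sizes below are illustrative choices, not the empirically determined parameter distributions of the paper.

```python
# Compounding sketch: locally Gaussian data whose variance is itself random.
# The inverse-gamma variance distribution is only an illustrative choice.
import numpy as np

rng = np.random.default_rng(7)
n_windows, window_len = 2000, 50

# One variance per short window, drawn from a wide parameter distribution.
variances = 1.0 / rng.gamma(shape=3.0, scale=1.0, size=n_windows)

# Locally Gaussian data; globally, the sample mixes all the variances.
samples = np.concatenate([rng.normal(0.0, np.sqrt(v), window_len) for v in variances])

def excess_kurtosis(x):
    z = (x - x.mean()) / x.std()
    return np.mean(z ** 4) - 3.0

print("excess kurtosis of a single window (near 0 for Gaussian):",
      round(excess_kurtosis(samples[:window_len]), 2))
print("excess kurtosis of the full compounded sample (> 0, heavy tails):",
      round(excess_kurtosis(samples), 2))
```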

  3. The Application of Bayesian Spectral Analysis in Photometric Time Series

    Directory of Open Access Journals (Sweden)

    Saeideh Latif

    2017-11-01

    Full Text Available The present paper introduces Bayesian spectral analysis as a powerful and efficient method for the spectral analysis of photometric time series. For this purpose, Bayesian spectral analysis has been programmed in Matlab for the XZ Dra photometric time series, which is non-uniformly sampled with large gaps, and the resulting power spectrum has been compared with the power spectrum obtained from the Period04 software, which is designed for statistical analysis of astronomical time series and uses artificial data to make the time series uniform. Although the main spectral peak, representing the main oscillation frequency of the variable star XZ Dra at f = 2.09864 day⁻¹, is evident in the power spectrum from this software, false spectral peaks are also seen; moreover, it is not clear in this software how the synthetic data are generated. These false peaks are removed in the power spectrum obtained from the Bayesian analysis; in addition, the spectral peak around the desired frequency is narrower and more accurate. It should be noted that in Bayesian spectral analysis it is not necessary to make the time series uniform in order to obtain the desired power spectrum. Moreover, the researcher also becomes aware of the exact calculation process.
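    The paper's own Bayesian analysis is Matlab code that is not reproduced here. As a readily available point of comparison for unevenly sampled photometry, the closely related (frequentist) Lomb-Scargle periodogram from astropy can be computed as below; the synthetic gappy light curve and frequency grid are assumptions standing in for the XZ Dra data.

```python
# Lomb-Scargle periodogram from astropy as a stand-in illustration for analysing an
# unevenly sampled light curve (the paper's Bayesian Matlab code is not shown here).
import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(8)
f_true = 2.09864                                   # cycles per day, the XZ Dra main frequency
t = np.sort(rng.uniform(0.0, 60.0, 500))           # non-uniform sampling times (days)
t = t[(t < 20) | (t > 35)]                         # carve out a large gap
mag = 0.1 * np.sin(2 * np.pi * f_true * t) + 0.02 * rng.normal(size=t.size)

frequency, power = LombScargle(t, mag).autopower(minimum_frequency=0.1,
                                                 maximum_frequency=5.0)
print("recovered frequency:", round(frequency[np.argmax(power)], 4), "cycles/day")
```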

  4. Recurrent Neural Network Applications for Astronomical Time Series

    Science.gov (United States)

    Protopapas, Pavlos

    2017-06-01

    The benefits of good predictive models in astronomy lie in early event prediction systems and effective resource allocation. Current time series methods applicable to regular time series have not evolved to generalize for irregular time series. In this talk, I will describe two Recurrent Neural Network methods, Long Short-Term Memory (LSTM) and Echo State Networks (ESNs), for predicting irregular time series. Feature engineering along with non-linear modeling proved to be an effective predictor. For noisy time series, the prediction is improved by training the network on error realizations using the error estimates from astronomical light curves. In addition to this, we propose a new neural network architecture to remove correlation from the residuals in order to improve prediction and compensate for the noisy data. Finally, I show how to correctly set hyperparameters for a stable and performant solution. In this work, we circumvent the difficulty of manual tuning by optimizing ESN hyperparameters using Bayesian optimization with Gaussian Process priors. This automates the tuning procedure, enabling users to employ the power of RNNs without needing an in-depth understanding of the tuning procedure.

  5. Using empirical mode decomposition to correlate paleoclimatic time-series

    Directory of Open Access Journals (Sweden)

    J. Solé

    2007-01-01

    Full Text Available Determination of the timing and duration of paleoclimatic events is a challenging task. Classical techniques for time-series analysis rely too strongly on having a constant sampling rate, which poorly adapts to the uneven time recording of paleoclimatic variables; new, more flexible methods originating from nonlinear physics are hence required. In this paper, we have used Huang's Empirical Mode Decomposition (EMD) for the analysis of paleoclimatic series. We have studied three different time series of temperature proxies, characterizing oscillation patterns by using EMD. To measure the degree of temporal correlation of two variables, we have developed a method that relates pairs of modes from different series by calculating the instantaneous phase differences among the associated modes. We observed that when two modes exhibited a constant phase difference, their frequencies were nearly equal to those of Milankovitch cycles. Our results show that EMD is a good methodology not only for synchronization of different records but also for determination of the different local frequencies in each time series. Some of the obtained modes may be interpreted as the result of global forcing mechanisms.
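    A sketch of the phase-difference idea is shown below, assuming the third-party PyEMD package for the decomposition and scipy's Hilbert transform for the instantaneous phase. The two synthetic series sharing a common cycle stand in for the temperature proxies, and the highest-energy-mode pairing rule is a crude stand-in for the paper's mode-association procedure.

```python
# Sketch of EMD + instantaneous-phase comparison between two series sharing a common cycle.
# Assumes the third-party PyEMD package (pip install EMD-signal) and scipy.
import numpy as np
from PyEMD import EMD
from scipy.signal import hilbert

rng = np.random.default_rng(9)
t = np.linspace(0.0, 100.0, 2000)
common = np.sin(2 * np.pi * t / 23.0)                      # shared "orbital-like" cycle
series_a = common + 0.4 * np.sin(2 * np.pi * t / 3.0) + 0.2 * rng.normal(size=t.size)
series_b = np.roll(common, 50) + 0.3 * rng.normal(size=t.size)

imfs_a = EMD().emd(series_a)
imfs_b = EMD().emd(series_b)

def dominant_mode(imfs):
    return imfs[np.argmax(np.var(imfs, axis=1))]           # highest-energy mode (crude pairing rule)

def instantaneous_phase(mode):
    return np.unwrap(np.angle(hilbert(mode)))

diff = instantaneous_phase(dominant_mode(imfs_a)) - instantaneous_phase(dominant_mode(imfs_b))
print("std of the phase difference:", round(float(np.std(diff)), 2),
      "(small, drift-free values suggest a shared quasi-periodic mode)")
```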

  6. Analysis and generation of groundwater concentration time series

    Science.gov (United States)

    Crăciun, Maria; Vamoş, Călin; Suciu, Nicolae

    2018-01-01

    Concentration time series are provided by simulated concentrations of a nonreactive solute transported in groundwater, integrated over the transverse direction of a two-dimensional computational domain and recorded at the plume center of mass. The analysis of a statistical ensemble of time series reveals subtle features that are not captured by the first two moments which characterize the approximate Gaussian distribution of the two-dimensional concentration fields. The concentration time series exhibit a complex preasymptotic behavior driven by a nonstationary trend and correlated fluctuations with time-variable amplitude. Time series with almost the same statistics are generated by successively adding to a time-dependent trend a sum of linear regression terms, accounting for correlations between fluctuations around the trend and their increments in time, and terms of an amplitude modulated autoregressive noise of order one with time-varying parameter. The algorithm generalizes mixing models used in probability density function approaches. The well-known interaction by exchange with the mean mixing model is a special case consisting of a linear regression with constant coefficients.

  7. Continuous baseflow separation from time series of daily and ...

    African Journals Online (AJOL)

    Continuous baseflow separation procedures have been frequently used to differentiate total flows into the high-frequency, lowamplitude 'baseflow' component and the low-frequency, high-amplitude 'flood' flows. In the past, such procedures have normally been applied to streamflow time-series data with time steps of 1 day ...

  8. Efficient use of correlation entropy for analysing time series data

    Indian Academy of Sciences (India)

    specific data sets. The technique uses the scalar time series to reconstruct the dynamics in an embedding space of dimension M using delay coordinates scanned at a suitable time delay τ. But a major difficulty in implementing this procedure is that the scaling region in the correlation sum for the computation of D2 and K2 ...

  9. Sparse time series chain graphical models for reconstructing genetic networks

    NARCIS (Netherlands)

    Abegaz, Fentaw; Wit, Ernst

    We propose a sparse high-dimensional time series chain graphical model for reconstructing genetic networks from gene expression data parametrized by a precision matrix and autoregressive coefficient matrix. We consider the time steps as blocks or chains. The proposed approach explores patterns of

  10. Time Series Prediction based on Hybrid Neural Networks

    Directory of Open Access Journals (Sweden)

    S. A. Yarushev

    2016-01-01

    Full Text Available In this paper, we suggest using a hybrid approach to the time series forecasting problem. In the first part of the paper, we give a literature review of time series forecasting methods based on hybrid neural networks and neuro-fuzzy approaches. Hybrid neural networks are especially effective for specific types of applications, such as forecasting or classification problems, in contrast to traditional monolithic neural networks. These classes of problems include problems with different characteristics in different modules. The main part of the paper gives a detailed overview of the benefits of hybrid networks, their architectures, and their performance compared with traditional neural networks. Hybrid neural network models for time series forecasting are discussed, and experiments with modular neural networks are given.

  11. Appropriate Algorithms for Nonlinear Time Series Analysis in Psychology

    Science.gov (United States)

    Scheier, Christian; Tschacher, Wolfgang

    Chaos theory has a strong appeal for psychology because it allows for the investigation of the dynamics and nonlinearity of psychological systems. Consequently, chaos-theoretic concepts and methods have recently gained increasing attention among psychologists and positive claims for chaos have been published in nearly every field of psychology. Less attention, however, has been paid to the appropriateness of chaos-theoretic algorithms for psychological time series. An appropriate algorithm can deal with short, noisy data sets and yields 'objective' results. In the present paper it is argued that most of the classical nonlinear techniques don't satisfy these constraints and thus are not appropriate for psychological data. A methodological approach is introduced that is based on nonlinear forecasting and the method of surrogate data. In artificial data sets and empirical time series we can show that this methodology reliably assesses nonlinearity and chaos in time series even if they are short and contaminated by noise.
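    One widely used ingredient of such an approach, phase-randomized surrogate data, can be generated in a few lines: the surrogate keeps the linear (spectral) structure of the original series while destroying nonlinear structure, and a discriminating statistic is then compared between the data and the surrogates. The sketch below uses time-reversal asymmetry purely as an illustrative statistic, not the nonlinear-forecasting statistic of the paper, and the logistic-map input is an assumed test signal.

```python
# Phase-randomized surrogates: keep the power spectrum, destroy nonlinear structure.
# Time-reversal asymmetry is used here only as a simple illustrative discriminating statistic.
import numpy as np

def phase_randomized_surrogate(x, rng):
    spectrum = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, size=spectrum.size)
    phases[0] = 0.0                              # keep the mean untouched
    return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=len(x))

def time_reversal_asymmetry(x, lag=1):
    return np.mean((x[lag:] - x[:-lag]) ** 3)

rng = np.random.default_rng(10)
# A short, noisy nonlinear series (logistic map plus observational noise).
x = np.empty(500); x[0] = 0.4
for i in range(1, len(x)):
    x[i] = 3.9 * x[i - 1] * (1.0 - x[i - 1])
x += 0.01 * rng.normal(size=x.size)

stat_data = time_reversal_asymmetry(x)
stat_surr = [time_reversal_asymmetry(phase_randomized_surrogate(x, rng)) for _ in range(99)]
rank = np.sum(np.abs(stat_data) > np.abs(stat_surr))
print(f"data statistic more extreme than {rank}/99 surrogates")
```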

  12. Grammar-based feature generation for time-series prediction

    CERN Document Server

    De Silva, Anthony Mihirana

    2015-01-01

    This book proposes a novel approach for time-series prediction using machine learning techniques with automatic feature generation. Application of machine learning techniques to predict time-series continues to attract considerable attention due to the difficulty of the prediction problems compounded by the non-linear and non-stationary nature of the real world time-series. The performance of machine learning techniques, among other things, depends on suitable engineering of features. This book proposes a systematic way for generating suitable features using context-free grammar. A number of feature selection criteria are investigated and a hybrid feature generation and selection algorithm using grammatical evolution is proposed. The book contains graphical illustrations to explain the feature generation process. The proposed approaches are demonstrated by predicting the closing price of major stock market indices, peak electricity load and net hourly foreign exchange client trade volume. The proposed method ...

  13. Track Irregularity Time Series Analysis and Trend Forecasting

    Directory of Open Access Journals (Sweden)

    Jia Chaolong

    2012-01-01

    Full Text Available The combination of linear and nonlinear methods is widely used in the prediction of time series data. This paper analyzes track irregularity time series data by using gray incidence degree models and methods of data transformation, trying to find the underlying relationship within the time series data. In this paper, GM(1,1), which is based on a first-order, single-variable linear differential equation, is used, after an adaptive improvement and error correction, to predict the long-term changing trend of track irregularity at a fixed measuring point; the stochastic linear AR model, the Kalman filtering model, and an artificial neural network model are applied to predict the short-term changing trend of track irregularity at the unit section. Both long-term and short-term predictions show that the model is effective and can achieve the expected accuracy.

  14. Time series analysis and its applications with R examples

    CERN Document Server

    Shumway, Robert H

    2017-01-01

    The fourth edition of this popular graduate textbook, like its predecessors, presents a balanced and comprehensive treatment of both time and frequency domain methods with accompanying theory. Numerous examples using nontrivial data illustrate solutions to problems such as discovering natural and anthropogenic climate change, evaluating pain perception experiments using functional magnetic resonance imaging, and monitoring a nuclear test ban treaty. The book is designed as a textbook for graduate level students in the physical, biological, and social sciences and as a graduate level text in statistics. Some parts may also serve as an undergraduate introductory course. Theory and methodology are separated to allow presentations on different levels. In addition to coverage of classical methods of time series regression, ARIMA models, spectral analysis and state-space models, the text includes modern developments including categorical time series analysis, multivariate spectral methods, long memory series, nonli...

  15. Semi-autonomous remote sensing time series generation tool

    Science.gov (United States)

    Babu, Dinesh Kumar; Kaufmann, Christof; Schmidt, Marco; Dhams, Thorsten; Conrad, Christopher

    2017-10-01

    High spatial and temporal resolution data are vital for crop monitoring and phenology change detection. Due to limitations of satellite architecture and frequent cloud cover, the availability of daily data at high spatial resolution is still far from reality. Generating remote sensing time series of high spatial and temporal resolution by data fusion seems to be a practical alternative. However, it is not an easy process, since it involves multiple steps and also requires multiple tools. In this paper, a Geographic Information System (GIS) based tool framework is presented for semi-autonomous time series generation. This tool eliminates the difficulties by automating all the steps and enables users to generate synthetic time series data with ease. Firstly, all the steps required for the time series generation process are identified and grouped into blocks based on their functionalities. Then two main frameworks are created, one to perform all the pre-processing steps on various satellite data and the other to perform data fusion to generate the time series. The two frameworks can be used individually to perform specific tasks, or combined to perform both processes in one go. The tool can handle most of the known geo data formats currently available, which makes it a generic tool for time series generation from various remote sensing satellite data. It is developed as a common platform with a convenient interface that provides many functionalities to enable the further development of more remote sensing applications. A detailed description of the capabilities and advantages of the frameworks is given in this paper.

  16. A multidisciplinary database for geophysical time series management

    Science.gov (United States)

    Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.

    2013-12-01

    The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. With the term time series we refer to a set of observations of a given phenomenon acquired sequentially in time. When the time intervals are equally spaced, one speaks of the period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), in order to acquire time series from different data sources and standardize them within a relational database. The standardization step provides the ability to perform operations, such as querying and visualization, on many measures, synchronizing them on a common time scale. The proposed architecture follows a multiple-layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources, such as ASCII, Excel, ODBC (Open DataBase Connectivity) and files accessible from the Internet (web pages, XML). In particular, the loader layer performs a security check of the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although our system has to manage huge amounts of data, performance is guaranteed by a smart table partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which provide the possibility of querying different time series over a specified time range, or of following the real-time signal acquisition, according to a user data access policy.

  17. A novel time series link prediction method: Learning automata approach

    Science.gov (United States)

    Moradabadi, Behnaz; Meybodi, Mohammad Reza

    2017-09-01

    Link prediction is a major social network challenge that uses the network structure to predict future links. Common link prediction approaches for predicting hidden links use a static graph representation, where a snapshot of the network is analyzed to find hidden or future links. For example, similarity-metric-based link prediction is a common traditional approach that calculates a similarity metric for each non-connected pair of nodes, sorts the pairs by their similarity metrics, and labels those with higher similarity scores as future links. Because people's activities in social networks are dynamic and uncertain, and the structure of the networks changes over time, using deterministic graphs for modeling and analysis of a social network may not be appropriate. In the time-series link prediction problem, the time series of link occurrences is used to predict future links. In this paper, we propose a new time series link prediction method based on learning automata. In the proposed algorithm, for each link that must be predicted there is one learning automaton, and each learning automaton tries to predict the existence or non-existence of the corresponding link. To predict the link occurrence at time T, there is a chain consisting of stages 1 through T - 1, and the learning automaton passes through these stages to learn the existence or non-existence of the corresponding link. Our preliminary link prediction experiments with co-authorship and email networks have provided satisfactory results when time series of link occurrences are considered.

  18. Easily adaptable complexity measure for finite time series.

    Science.gov (United States)

    Ke, Da-Guan; Tong, Qin-Ye

    2008-06-01

    We present a complexity measure for any finite time series. This measure has invariance under any monotonic transformation of the time series, has a degree of robustness against noise, and has the adaptability of satisfying almost all the widely accepted but conflicting criteria for complexity measurements. Surprisingly, the measure is developed from Kolmogorov complexity, which is traditionally believed to represent only randomness and to satisfy one criterion to the exclusion of the others. For familiar iterative systems, our treatment may imply a heuristic approach to transforming symbolic dynamics into permutation dynamics and vice versa.

  19. Handbook of Time Series Analysis Recent Theoretical Developments and Applications

    CERN Document Server

    Schelter, Björn; Timmer, Jens

    2006-01-01

    This handbook provides an up-to-date survey of current research topics and applications of time series analysis methods written by leading experts in their fields. It covers recent developments in univariate as well as bivariate and multivariate time series analysis techniques ranging from physics' to life sciences' applications. Each chapter comprises both methodological aspects and applications to real world complex systems, such as the human brain or Earth's climate. Covering an exceptionally broad spectrum of topics, beginners, experts and practitioners who seek to understand the latest de

  20. Complex network approach for recurrence analysis of time series

    Energy Technology Data Exchange (ETDEWEB)

    Marwan, Norbert, E-mail: marwan@pik-potsdam.d [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany); Donges, Jonathan F. [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany)] [Department of Physics, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin (Germany); Zou Yong [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany); Donner, Reik V. [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany)] [Institute for Transport and Economics, Dresden University of Technology, Andreas-Schubert-Str. 23, 01062 Dresden (Germany)] [Graduate School of Science, Osaka Prefecture University, 1-1 Gakuencho, Naka-ku, Sakai 599-8531 (Japan); Kurths, Juergen [Potsdam Institute for Climate Impact Research, PO Box 601203, 14412 Potsdam (Germany)] [Department of Physics, Humboldt University Berlin, Newtonstr. 15, 12489 Berlin (Germany)

    2009-11-09

    We propose a novel approach for analysing time series using complex network theory. We identify the recurrence matrix (calculated from time series) with the adjacency matrix of a complex network and apply measures for the characterisation of complex networks to this recurrence matrix. By using the logistic map, we illustrate the potential of these complex network measures for the detection of dynamical transitions. Finally, we apply the proposed approach to a marine palaeo-climate record and identify the subtle changes to the climate regime.
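    The core construction can be sketched in a few lines: embed the series, build the recurrence matrix from a distance threshold, remove the diagonal, and read the result as the adjacency matrix of an undirected network. The logistic-map input, embedding parameters and the fixed 10% recurrence-rate threshold below are illustrative choices only.

```python
# Sketch: recurrence matrix of an embedded series reinterpreted as a network adjacency matrix.
# The logistic-map input and the 10% recurrence-rate threshold are illustrative choices.
import numpy as np

def embed(x, dim=2, delay=1):
    n = len(x) - (dim - 1) * delay
    return np.column_stack([x[i * delay: i * delay + n] for i in range(dim)])

def recurrence_adjacency(x, dim=2, delay=1, rate=0.10):
    vectors = embed(x, dim, delay)
    dists = np.linalg.norm(vectors[:, None, :] - vectors[None, :, :], axis=-1)
    eps = np.quantile(dists, rate)               # threshold chosen to fix the recurrence rate
    adj = (dists <= eps).astype(int)
    np.fill_diagonal(adj, 0)                     # no self-loops
    return adj

x = np.empty(1000); x[0] = 0.3
for i in range(1, len(x)):
    x[i] = 3.9 * x[i - 1] * (1.0 - x[i - 1])     # chaotic logistic map

adj = recurrence_adjacency(x)
degree = adj.sum(axis=0)
print("nodes:", adj.shape[0], " mean degree:", round(degree.mean(), 1),
      " link density:", round(adj.mean(), 3))
```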

  1. Testing for intracycle determinism in pseudoperiodic time series.

    Science.gov (United States)

    Coelho, Mara C S; Mendes, Eduardo M A M; Aguirre, Luis A

    2008-06-01

    A determinism test is proposed based on the well-known method of the surrogate data. Assuming predictability to be a signature of determinism, the proposed method checks for intracycle (e.g., short-term) determinism in the pseudoperiodic time series for which standard methods of surrogate analysis do not apply. The approach presented is composed of two steps. First, the data are preprocessed to reduce the effects of seasonal and trend components. Second, standard tests of surrogate analysis can then be used. The determinism test is applied to simulated and experimental pseudoperiodic time series and the results show the applicability of the proposed test.

  2. Deep Learning in Multiple Multistep Time Series Prediction

    OpenAIRE

    Zang, Chuanyun

    2017-01-01

    The project researches combining deep learning, specifically Long Short-Term Memory (LSTM), and basic statistics in multiple multistep time series prediction. LSTM can dive into all the pages and learn the general trends of variation in a large scope, while well-selected medians for each page can keep the particular seasonality of different pages, so that the future trend will not fluctuate too much from reality. A recent Kaggle competition on 145K Web Traffic Time Series Forecasting [1...

  3. Time series analysis of onchocerciasis data from Mexico: a trend towards elimination.

    Science.gov (United States)

    Lara-Ramírez, Edgar E; Rodríguez-Pérez, Mario A; Pérez-Rodríguez, Miguel A; Adeleke, Monsuru A; Orozco-Algarra, María E; Arrendondo-Jiménez, Juan I; Guo, Xianwu

    2013-01-01

    In Latin America, there are 13 geographically isolated endemic foci distributed among Mexico, Guatemala, Colombia, Venezuela, Brazil and Ecuador. The communities of the three endemic foci found within Mexico have been receiving ivermectin treatment since 1989. In this study, we predicted the trend of occurrence of cases in Mexico by applying time series analysis to monthly onchocerciasis data reported by the Mexican Secretariat of Health between 1988 and 2011 using the software R. A total of 15,584 cases were reported in Mexico from 1988 to 2011. The onchocerciasis case data are mainly from the main endemic foci of Chiapas and Oaxaca. The last case in Oaxaca was reported in 1998, but new cases were reported in the Chiapas foci up to 2011. Time series analysis performed for the foci in Mexico showed a decreasing trend of the disease over time. The best-fitted models with the smallest Akaike Information Criterion (AIC) were Auto-Regressive Integrated Moving Average (ARIMA) models, which were used to predict the tendency of onchocerciasis cases two years ahead. According to the ARIMA model predictions, a very low number of cases (below 1) is expected between 2012 and 2013 in Chiapas, the last endemic region in Mexico. The endemic regions of Mexico evolved from highly onchocerciasis-endemic states to the interruption of transmission due to the strategies followed by the MSH, based on treatment with ivermectin. The extremely low level of expected cases predicted by the ARIMA models for the next two years suggests that onchocerciasis is being eliminated in Mexico. To our knowledge, this is the first study utilizing time series to predict case dynamics of onchocerciasis, which could be used as a benchmark during monitoring and post-treatment surveillance.

  4. A methodology to filter time series: application to minute-by-minute electric load series

    Directory of Open Access Journals (Sweden)

    Mayte Suarez-Farinas

    2004-12-01

    Full Text Available In this article a methodology for filtering a time series is presented, with application to high frequency series such as the minute-by-minute electric load series. The goal of this approach is to detect and substitute the irregularities of the time series that can produce distortions on the modelling stage. Outlier values are detected through a dynamic linear model and the Bayes factor tool; missing values are then interpolated with a Smoothing Cubic Spline. The performance of the proposed approach is illustrated using real data and evaluated through a series of tests where the irregularities have been simulated.

  5. Effect of incentive payments on chronic disease management and health services use in British Columbia, Canada: Interrupted time series analysis.

    Science.gov (United States)

    Lavergne, M Ruth; Law, Michael R; Peterson, Sandra; Garrison, Scott; Hurley, Jeremiah; Cheng, Lucy; McGrail, Kimberlyn

    2018-02-01

    We studied the effects of incentive payments to primary care physicians for the care of patients with diabetes, hypertension, and Chronic Obstructive Pulmonary Disease (COPD) in British Columbia, Canada. We used linked administrative health data to examine monthly primary care visits, continuity of care, laboratory testing, pharmaceutical dispensing, hospitalizations, and total health care spending. We examined periods two years before and two years after each incentive was introduced, and used segmented regression to assess whether there were changes in level or trend of outcome measures across all eligible patients following incentive introduction, relative to pre-intervention periods. We observed no increases in primary care visits or continuity of care after incentives were introduced. Rates of ACR testing and antihypertensive dispensing increased among patients with hypertension, but none of the other modest increases in laboratory testing or prescriptions dispensed reached statistical significance. Rates of hospitalizations for stroke and heart failure among patients with hypertension fell relative to pre-intervention patterns, while hospitalizations for COPD increased. Total hospitalizations and hospitalizations via the emergency department did not change. Health care spending increased for patients with hypertension. This large-scale incentive scheme for primary care physicians showed some positive effects for patients with hypertension, but we observed no similar changes in patient management, reductions in hospitalizations, or changes in spending for patients with diabetes and COPD. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
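    The segmented-regression specification used in this kind of interrupted time series analysis can be written as a baseline level, a pre-intervention trend, a level-change indicator and a slope-change term. A minimal sketch with statsmodels and Newey-West (HAC) standard errors on synthetic monthly data is shown below; the variable names, the 24+24-month window and the simulated effect sizes are illustrative assumptions, not the study's data or exact model.

```python
# Minimal interrupted-time-series (segmented regression) sketch on synthetic monthly data,
# with Newey-West (HAC) standard errors. Window lengths and effect sizes are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)
n_pre, n_post = 24, 24
time = np.arange(n_pre + n_post)
post = (time >= n_pre).astype(int)
time_after = np.where(post == 1, time - n_pre, 0)

# Synthetic outcome: baseline trend, then a level drop of 3 and a slope change of -0.2.
y = 100 + 0.5 * time - 3.0 * post - 0.2 * time_after + rng.normal(scale=1.5, size=time.size)
df = pd.DataFrame({"y": y, "time": time, "post": post, "time_after": time_after})

model = smf.ols("y ~ time + post + time_after", data=df)
fit = model.fit(cov_type="HAC", cov_kwds={"maxlags": 3})   # robust to autocorrelation
print(fit.params.round(2))          # 'post' ~ level change, 'time_after' ~ trend change
```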

  6. Impact of STROBE Statement Publication on Quality of Observational Study Reporting: Interrupted Time Series versus Before-After Analysis

    NARCIS (Netherlands)

    S. Bastuji-Garin (Sylvie); E. Sbidian (Emilie); C. Gaudy-Marqueste (Caroline); E. Ferrat (Emilie); J.C. Roujeau (Jean-Claude); M.A. Richard (Marie-Aleth); F. Canoui-Poitrine (Florence); J.N. Bouwes Bavinck (Jan Nico); P.J. Coenraads (Pieter-Jan); T.L. Diepgen; P. Elsner (Peter); I. Garcia-Doval (Ignacio); J.J. Grob; S. Langan (Sinead); L. Naldi; T.E.C. Nijsten (Tamar); J. Schmitt (Julien); Å. Svensson (Åke); H. Williams

    2013-01-01

    Background: In uncontrolled before-after studies, CONSORT was shown to improve the reporting of randomised trials. Before-after studies ignore underlying secular trends and may overestimate the impact of interventions. Our aim was to assess the impact of the 2007 STROBE statement

  7. Effect of nocturnal sound reduction on the incidence of delirium in intensive care unit patients: An interrupted time series analysis

    NARCIS (Netherlands)

    van de Pol, Ineke; van Iterson, Mat; Maaskant, Jolanda

    2017-01-01

    Delirium in critically-ill patients is a common multifactorial disorder that is associated with various negative outcomes. It is assumed that sleep disturbances can result in an increased risk of delirium. This study hypothesized that implementing a protocol that reduces overall nocturnal sound

  8. A window-based time series feature extraction method.

    Science.gov (United States)

    Katircioglu-Öztürk, Deniz; Güvenir, H Altay; Ravens, Ursula; Baykal, Nazife

    2017-10-01

    This study proposes a robust similarity score-based time series feature extraction method that is termed as Window-based Time series Feature ExtraCtion (WTC). Specifically, WTC generates domain-interpretable results and involves significantly low computational complexity thereby rendering itself useful for densely sampled and populated time series datasets. In this study, WTC is applied to a proprietary action potential (AP) time series dataset on human cardiomyocytes and three precordial leads from a publicly available electrocardiogram (ECG) dataset. This is followed by comparing WTC in terms of predictive accuracy and computational complexity with shapelet transform and fast shapelet transform (which constitutes an accelerated variant of the shapelet transform). The results indicate that WTC achieves a slightly higher classification performance with significantly lower execution time when compared to its shapelet-based alternatives. With respect to its interpretable features, WTC has a potential to enable medical experts to explore definitive common trends in novel datasets. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Current interruption by density depression

    International Nuclear Information System (INIS)

    Wagner, J.S.; Tajima, T.; Akasofu, S.I.

    1985-04-01

    Using a one-dimensional electrostatic particle code, we examine processes associated with current interruption in a collisionless plasma when a density depression is present along the current channel. Current interruption due to double layers was suggested by Alfven and Carlqvist (1967) as a cause of solar flares. At a local density depression, plasma instabilities caused by an electron current flow are accentuated, leading to current disruption. Our simulation study encompasses a wide range of the parameters in such a way that under appropriate conditions, both the Alfven and Carlqvist (1967) regime and the Smith and Priest (1972) regime take place. In the latter regime the density depression decays into a stationary structure (''ion-acoustic layer'') which spawns a series of ion-acoustic ''solitons'' and ion phase space holes travelling upstream. A large inductance of the current circuit tends to enhance the plasma instabilities

  10. Stochastic generation of hourly wind speed time series

    International Nuclear Information System (INIS)

    Shamshad, A.; Wan Mohd Ali Wan Hussin; Bawadi, M.A.; Mohd Sanusi, S.A.

    2006-01-01

    In the present study, hourly wind speed data for Kuala Terengganu in Peninsular Malaysia are simulated using the transition matrix approach of a Markov process. The wind speed time series is divided into various states based on certain criteria, and the next wind speed state is selected based on the previous state. The cumulative probability transition matrix is formed so that each row ends with 1. Using uniform random numbers between 0 and 1, a series of future states is generated; these states are then converted to the corresponding wind speed values using another uniform random number generator. The accuracy of the model has been determined by comparing statistical characteristics such as the average, standard deviation, root mean square error, probability density function and autocorrelation function of the generated data to those of the original data. The generated wind speed time series is able to preserve the wind speed characteristics of the observed data.
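    The transition-matrix generation step can be sketched compactly: discretize observed speeds into states, count state-to-state transitions, normalize rows into probabilities, walk the chain, and map each generated state back to a speed drawn uniformly within that state's bin. The AR(1)-style stand-in data, bin width and series length below are arbitrary illustrative choices.

```python
# Sketch of first-order Markov-chain synthesis of hourly wind speeds: discretize into states,
# estimate the transition matrix, walk the chain, map states back to speeds. Bin width is arbitrary.
import numpy as np

rng = np.random.default_rng(12)

# Stand-in for observed hourly wind speeds (m/s): a positive, autocorrelated series.
obs = np.empty(5000)
obs[0] = 6.0
for i in range(1, obs.size):
    obs[i] = max(0.0, 6.0 + 0.9 * (obs[i - 1] - 6.0) + rng.normal(0.0, 0.8))

bin_width = 1.0
edges = np.arange(0.0, obs.max() + bin_width, bin_width)
states = np.digitize(obs, edges) - 1
n_states = len(edges) - 1

# Count transitions and normalize each row into probabilities (uniform fallback for empty rows).
counts = np.zeros((n_states, n_states))
for a, b in zip(states[:-1], states[1:]):
    counts[a, b] += 1
row_sums = counts.sum(axis=1, keepdims=True)
P = np.divide(counts, row_sums, out=np.full_like(counts, 1.0 / n_states), where=row_sums > 0)

# Generate a synthetic series by walking the chain and sampling uniformly within each state's bin.
synthetic_states = np.empty(5000, dtype=int)
synthetic_states[0] = states[0]
for i in range(1, len(synthetic_states)):
    synthetic_states[i] = rng.choice(n_states, p=P[synthetic_states[i - 1]])
synthetic = edges[synthetic_states] + rng.uniform(0, bin_width, size=synthetic_states.size)

print("observed  mean/std:", round(obs.mean(), 2), round(obs.std(), 2))
print("synthetic mean/std:", round(synthetic.mean(), 2), round(synthetic.std(), 2))
```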

  11. Complexity analysis of the turbulent environmental fluid flow time series

    Science.gov (United States)

    Mihailović, D. T.; Nikolić-Đorić, E.; Drešković, N.; Mimić, G.

    2014-02-01

    We have used the Kolmogorov complexities, sample and permutation entropies to quantify the randomness degree in river flow time series of two mountain rivers in Bosnia and Herzegovina, representing the turbulent environmental fluid, for the period 1926-1990. In particular, we have examined the monthly river flow time series from two rivers (the Miljacka and the Bosnia) in the mountain part of their flow and then calculated the Kolmogorov complexity (KL) based on the Lempel-Ziv Algorithm (LZA) (lower-KLL and upper-KLU), sample entropy (SE) and permutation entropy (PE) values for each time series. The results indicate that the KLL, KLU, SE and PE values in two rivers are close to each other regardless of the amplitude differences in their monthly flow rates. We have illustrated the changes in mountain river flow complexity by experiments using (i) the data set for the Bosnia River and (ii) anticipated human activities and projected climate changes. We have explored the sensitivity of considered measures in dependence on the length of time series. In addition, we have divided the period 1926-1990 into three subintervals: (a) 1926-1945, (b) 1946-1965, (c) 1966-1990, and calculated the KLL, KLU, SE, PE values for the various time series in these subintervals. It is found that during the period 1946-1965, there is a decrease in their complexities, and corresponding changes in the SE and PE, in comparison to the period 1926-1990. This complexity loss may be primarily attributed to (i) human interventions, after the Second World War, on these two rivers because of their use for water consumption and (ii) climate change in recent times.
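    Permutation entropy, one of the measures used above, can be computed in a few lines: slide a window of length m over the series, record the ordinal pattern (ranking) of each window, and take the Shannon entropy of the pattern frequencies normalized by log(m!). The order m = 4, delay 1 and synthetic inputs below are illustrative choices, not the settings used for the Miljacka and Bosnia series.

```python
# Permutation entropy: Shannon entropy of ordinal-pattern frequencies, normalized by log(m!).
# The embedding order m = 4 and the synthetic inputs are illustrative choices.
import math
from collections import Counter
import numpy as np

def permutation_entropy(x, order=4, delay=1):
    patterns = Counter()
    for i in range(len(x) - (order - 1) * delay):
        window = x[i: i + order * delay: delay]
        patterns[tuple(np.argsort(window))] += 1
    counts = np.array(list(patterns.values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p)) / math.log(math.factorial(order))

rng = np.random.default_rng(13)
noise = rng.normal(size=2000)                                        # highly random -> PE near 1
seasonal = np.sin(2 * np.pi * np.arange(2000) / 12) + 0.1 * rng.normal(size=2000)

print("PE of white noise:         ", round(permutation_entropy(noise), 3))
print("PE of a noisy seasonal flow:", round(permutation_entropy(seasonal), 3))
```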

  12. Tempered fractional time series model for turbulence in geophysical flows

    Science.gov (United States)

    Meerschaert, Mark M.; Sabzikar, Farzad; Phanikumar, Mantha S.; Zeleke, Aklilu

    2014-09-01

    We propose a new time series model for velocity data in turbulent flows. The new model employs tempered fractional calculus to extend the classical 5/3 spectral model of Kolmogorov. Application to wind speed and water velocity in a large lake are presented, to demonstrate the practical utility of the model.

  13. Tempered fractional time series model for turbulence in geophysical flows

    International Nuclear Information System (INIS)

    Meerschaert, Mark M; Sabzikar, Farzad; Phanikumar, Mantha S; Zeleke, Aklilu

    2014-01-01

    We propose a new time series model for velocity data in turbulent flows. The new model employs tempered fractional calculus to extend the classical 5/3 spectral model of Kolmogorov. Application to wind speed and water velocity in a large lake are presented, to demonstrate the practical utility of the model. (paper)

  14. On the Application of Information in Time Series Analysis

    Czech Academy of Sciences Publication Activity Database

    Klán, Petr; Wilkie, J.; Ankenbrand, T.

    1998-01-01

    Roč. 8, č. 1 (1998), s. 39-49 ISSN 1210-0552 Grant - others:Fonds National Suisse de la Recherche Scientifique (XE) CP93:9630 Keywords : time series analysis * measurement and application of information Subject RIV: BA - General Mathematics

  15. A Non-standard Empirical Likelihood for Time Series

    DEFF Research Database (Denmark)

    Nordman, Daniel J.; Bunzel, Helle; Lahiri, Soumendra N.

    -standard asymptotics and requires a significantly different development compared to standard BEL. We establish the large-sample distribution of log-ratio statistics from the new BEL method for calibrating confidence regions for mean or smooth function parameters of time series. This limit law is not the usual chi...

  16. Segmentation of Nonstationary Time Series with Geometric Clustering

    DEFF Research Database (Denmark)

    Bocharov, Alexei; Thiesson, Bo

    2013-01-01

    We introduce a non-parametric method for segmentation in regimeswitching time-series models. The approach is based on spectral clustering of target-regressor tuples and derives a switching regression tree, where regime switches are modeled by oblique splits. Such models can be learned efficiently...

  17. Two-fractal overlap time series: Earthquakes and market crashes

    Indian Academy of Sciences (India)

    pp. 203–210. Bikas K. Chakrabarti, Arnab Chatterjee and Pratip Bhattacharyya, Theoretical Condensed Matter Physics Division and Centre for Applied Mathematics and Computational Science, Saha Institute of Nuclear Physics, ...

  18. Time Series Data Visualization in World Wide Telescope

    Science.gov (United States)

    Fay, J.

    WorldWide Telescope provides a rich set of time series visualizations for both archival and real-time data. WWT consists of interactive desktop tools for immersive visualization and HTML5 web-based controls that can be utilized in customized web pages. WWT supports a range of display options including full dome, power walls, stereo and virtual reality headsets.

  19. Seasonal time series data imputation: Comparison between feed ...

    African Journals Online (AJOL)

    Specifically, we examine how recursive and direct estimates from forward and backward learning Artificial Neural Networks (ANN) compare with seasonal ARIMA estimates and interpolation estimates of additive outliers in seasonal ARIMA models. A comparison statistic is also proposed. Keywords: Time Series; Artificial ...

  20. Detecting cognizable trends of gene expression in a time series ...

    Indian Academy of Sciences (India)

    Detecting cognizable trends of gene expression in a time series RNA-sequencing experiment: a bootstrap approach. Shatakshee Chatterjee, Partha P. Majumder, Priyanka Pandey. Research Article, Journal of Genetics, Volume 95, Issue 3, September 2016, pp. 587- ...

  1. Buys – Ballot Estimates for time series decomposition | Iwueze ...

    African Journals Online (AJOL)

    An estimation procedure based on the Buys – Ballot (1847) table for time series decomposition is given in this paper. We give two alternative methods called the Chain Base Estimation and Fixed Base Estimation methods. Simulated examples are used to illustrate the methods, while comparing them with the least squares ...
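
    The Buys-Ballot arrangement the estimators above are built on is simply the series laid out with one row per period (e.g. year) and one column per season. A minimal Python sketch of that table and its row, column and grand means is given below; the simulated monthly series is purely illustrative, not the paper's simulation setup.

```python
# Sketch: arrange a monthly series into a Buys-Ballot table (rows = years,
# columns = months) and read off row, column and grand means, the quantities
# that chain-base / fixed-base decomposition estimators are built from.
import numpy as np
import pandas as pd

idx = pd.date_range("2015-01-01", periods=60, freq="MS")     # 5 years of monthly data
y = pd.Series(100 + 2 * np.arange(60) + 10 * np.sin(2 * np.pi * idx.month / 12), index=idx)

table = (y.to_frame("y")
           .assign(year=idx.year, month=idx.month)
           .pivot(index="year", columns="month", values="y"))  # the Buys-Ballot table

row_means = table.mean(axis=1)    # yearly averages (trend component)
col_means = table.mean(axis=0)    # monthly averages (seasonal component)
grand_mean = y.mean()
print(table.round(1), row_means.round(1), col_means.round(1), sep="\n\n")
```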

  2. Time series analysis in astronomy: Limits and potentialities

    DEFF Research Database (Denmark)

    Vio, R.; Kristensen, N.R.; Madsen, Henrik

    2005-01-01

    In this paper we consider the problem of the limits concerning the physical information that can be extracted from the analysis of one or more time series (light curves) typical of astrophysical objects. On the basis of theoretical considerations and numerical simulations, we show that with no a...

  3. Forecasting with nonlinear time series model: A Monte-Carlo ...

    African Journals Online (AJOL)

    In this paper, we propose a new method of forecasting with a nonlinear time series model using the Monte-Carlo Bootstrap method. This new method gives better results in terms of forecast root mean squared error (RMSE) when compared with the traditional Bootstrap method and Monte-Carlo method of forecasting using a ...

  4. Outlier detection algorithms for least squares time series regression

    DEFF Research Database (Denmark)

    Johansen, Søren; Nielsen, Bent

    We review recent asymptotic results on some robust methods for multiple regression. The regressors include stationary and non-stationary time series as well as polynomial terms. The methods include the Huber-skip M-estimator, 1-step Huber-skip M-estimators, in particular the Impulse Indicator Sat...

  5. a model for nonlinear innovation in time series

    African Journals Online (AJOL)


    Heteroscedastic errors are common in financial and econometric time series. The conditional variance may be specified as nonlinear autoregressive conditional heteroscedasticity ...

  6. Time series analysis in chaotic diode resonator circuit

    Energy Technology Data Exchange (ETDEWEB)

    Hanias, M.P. [TEI of Chalkis, GR 34400, Evia, Chalkis (Greece)] e-mail: mhanias@teihal.gr; Giannaris, G. [TEI of Chalkis, GR 34400, Evia, Chalkis (Greece); Spyridakis, A. [TEI of Chalkis, GR 34400, Evia, Chalkis (Greece); Rigas, A. [TEI of Chalkis, GR 34400, Evia, Chalkis (Greece)

    2006-01-01

    A diode resonator chaotic circuit is presented. Multisim is used to simulate the circuit and show the presence of chaos. Time series analysis was performed using the method proposed by Grassberger and Procaccia. The correlation dimension and minimum embedding dimension, ν and m_min respectively, were calculated. Also the corresponding Kolmogorov entropy was calculated.

  7. Time series analysis in chaotic diode resonator circuit

    International Nuclear Information System (INIS)

    Hanias, M.P.; Giannaris, G.; Spyridakis, A.; Rigas, A.

    2006-01-01

    A diode resonator chaotic circuit is presented. Multisim is used to simulate the circuit and show the presence of chaos. Time series analysis was performed using the method proposed by Grassberger and Procaccia. The correlation dimension and minimum embedding dimension, ν and m_min respectively, were calculated. Also the corresponding Kolmogorov entropy was calculated
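
    For readers unfamiliar with the Grassberger-Procaccia procedure used in these two records, a minimal Python sketch of the correlation sum on a delay-embedded scalar series is shown below. The embedding delay, embedding dimension, radii and the logistic-map stand-in data are all assumptions for illustration, not the circuit measurements.

```python
# Sketch of the Grassberger-Procaccia correlation sum C(r) on a delay-embedded
# scalar series; the correlation dimension is the slope of log C(r) vs log r.
# Delay tau, embedding dimension m, radii and the logistic-map data are assumptions.
import numpy as np
from scipy.spatial.distance import pdist

def delay_embed(x, m, tau):
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(m)])

def correlation_sum(x, m, tau, radii):
    d = pdist(delay_embed(x, m, tau))              # all pairwise distances
    return np.array([(d < r).mean() for r in radii])

# logistic map as a stand-in for the measured chaotic voltage series
x = np.empty(3000); x[0] = 0.4
for i in range(1, len(x)):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])

radii = np.logspace(-3, -0.5, 15)
C = correlation_sum(x, m=3, tau=1, radii=radii)
slope = np.polyfit(np.log(radii), np.log(C), 1)[0]
print(f"estimated correlation dimension ≈ {slope:.2f}")
```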

  8. Financial Intermediation and the Nigerian Economy: A Time Series ...

    African Journals Online (AJOL)

    This paper examines the level of development of financial intermediation and how it impacts on the economic growth of Nigeria. Using time series data covering a period of 40 years (1970–2009) and employing the econometric tools of Ordinary Least Squares (OLS) and cointegration analysis based on Engle-Granger ...

  9. Notes on economic time series analysis system theoretic perspectives

    CERN Document Server

    Aoki, Masanao

    1983-01-01

    In seminars and graduate level courses I have had several opportunities to discuss modeling and analysis of time series with economists and economic graduate students during the past several years. These experiences made me aware of a gap between what economic graduate students are taught about vector-valued time series and what is available in recent system literature. Wishing to fill or narrow the gap that I suspect is more widely spread than my personal experiences indicate, I have written these notes to augment and reorganize materials I have given in these courses and seminars. I have endeavored to present, in as much a self-contained way as practicable, a body of results and techniques in system theory that I judge to be relevant and useful to economists interested in using time series in their research. I have essentially acted as an intermediary and interpreter of system theoretic results and perspectives in time series by filtering out non-essential details, and presenting coherent accounts of wha...

  10. Long-memory time series theory and methods

    CERN Document Server

    Palma, Wilfredo

    2007-01-01

    Wilfredo Palma, PhD, is Chairman and Professor of Statistics in the Department of Statistics at Pontificia Universidad Católica de Chile. Dr. Palma has published several refereed articles and has received over a dozen academic honors and awards. His research interests include time series analysis, prediction theory, state space systems, linear models, and econometrics.

  11. RADON CONCENTRATION TIME SERIES MODELING AND APPLICATION DISCUSSION.

    Science.gov (United States)

    Stránský, V; Thinová, L

    2017-11-01

    In the year 2010 a continual radon measurement was established at Mladeč Caves in the Czech Republic using a continual radon monitor RADIM3A. In order to model radon time series in the years 2010-15, the Box-Jenkins Methodology, often used in econometrics, was applied. Because of the behavior of radon concentrations (RCs), a seasonal autoregressive integrated moving average model with exogenous variables (SARIMAX) was chosen to model the measured time series. This model uses the time series seasonality, previously acquired values and delayed atmospheric parameters to forecast RC. The developed model for the RC time series is called regARIMA(5,1,3). Model residuals could be retrospectively compared with seismic evidence of local or global earthquakes that occurred during the RC measurement. This technique enables us to assess whether continuously measured RC could serve as an earthquake precursor. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
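
    A hedged sketch of the kind of seasonal ARIMAX fit described in this record is given below, using statsmodels. The non-seasonal order (5,1,3) follows the abstract, but the 24-hour seasonal term, the column names and the data file are illustrative assumptions, not the authors' exact specification.

```python
# Sketch: seasonal ARIMAX fit for an hourly radon series with lagged atmospheric
# covariates, in the spirit of the regARIMA(5,1,3) model described above.
# Column names, the 24-hour seasonality and the CSV file are assumptions.
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

df = pd.read_csv("radon_hourly.csv", parse_dates=["time"], index_col="time")  # hypothetical file
exog = df[["pressure", "temperature"]].shift(1).bfill()    # delayed atmospheric parameters

model = SARIMAX(df["radon"], exog=exog,
                order=(5, 1, 3),                # non-seasonal (p, d, q) from the abstract
                seasonal_order=(1, 0, 1, 24))   # assumed daily cycle for hourly data
res = model.fit(disp=False)
print(res.summary())

# next-day forecast sketch (reuses the last day's covariates as a placeholder)
forecast = res.get_forecast(steps=24, exog=exog.iloc[-24:])
print(forecast.predicted_mean.head())
```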

  12. Multivariate Time Series Analysis for Optimum Production Forecast ...

    African Journals Online (AJOL)

    ... by 0.002579KG/Month. Finally, this work adds to the growing body of literature on data-driven production and inventory management by utilizing historical data in the development of useful forecasting mathematical model. Keywords: production model, inventory management, multivariate time series, production forecast ...

  13. A Hybrid Joint Moment Ratio Test for Financial Time Series

    NARCIS (Netherlands)

    P.A. Groenendijk (Patrick); A. Lucas (André); C.G. de Vries (Casper)

    1998-01-01

    We advocate the use of absolute moment ratio statistics in conjunction with standard variance ratio statistics in order to disentangle linear dependence, non-linear dependence, and leptokurtosis in financial time series. Both statistics are computed for multiple return horizons

  14. Koopman Operator Framework for Time Series Modeling and Analysis

    Science.gov (United States)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties without requiring the explicit knowledge of the generative model. We also introduce different notions of distances on the space of such model forms which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in power grid application.
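
    As a generic illustration of how Koopman spectral properties can be identified "directly from data", the sketch below implements standard dynamic mode decomposition (DMD), one common route to approximate Koopman eigenvalues and modes. It is not the authors' specific model forms or distance constructions; the toy snapshot data are an assumption.

```python
# Sketch: standard dynamic mode decomposition (DMD), a common data-driven way to
# approximate Koopman eigenvalues and modes from snapshots. Generic illustration,
# not the authors' specific model forms, distances, or applications.
import numpy as np

def dmd(X, r):
    """X: (n_features, n_snapshots) data matrix; r: truncation rank."""
    X1, X2 = X[:, :-1], X[:, 1:]
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    U, s, V = U[:, :r], s[:r], Vh.conj().T[:, :r]
    A_tilde = U.conj().T @ X2 @ V / s          # reduced linear (Koopman-approximating) operator
    eigvals, W = np.linalg.eig(A_tilde)        # approximate Koopman eigenvalues
    modes = X2 @ V / s @ W                     # DMD modes
    return eigvals, modes

# toy data: two damped oscillators observed through 10 random coordinates
t = np.linspace(0, 10, 200)
latent = np.vstack([np.exp(-0.05 * t) * np.cos(2 * t), np.exp(-0.02 * t) * np.sin(3 * t)])
X = np.random.default_rng(1).standard_normal((10, 2)) @ latent

eigvals, modes = dmd(X, r=4)
print("continuous-time eigenvalues:", np.log(eigvals) / (t[1] - t[0]))
```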

  15. Detection of "noisy" chaos in a time series

    DEFF Research Database (Denmark)

    Chon, K H; Kanters, J K; Cohen, R J

    1997-01-01

    Time series from biological systems often display fluctuations in the measured variables. Much effort has been directed at determining whether this variability reflects deterministic chaos, or whether it is merely "noise". The output from most biological systems is probably the result of both the...

  16. Multivariate Time Series Analysis for Optimum Production Forecast ...

    African Journals Online (AJOL)


    on data-driven production and inventory management by utilizing historical data in the development of useful forecasting mathematical model. Keywords: production model, inventory management, multivariate time series, production forecast. Introduction. A large assortment of forecasting techniques has been developed ...

  17. Growth And Export Expansion In Mauritius - A Time Series Analysis ...

    African Journals Online (AJOL)

    This paper analyses the empirical relationship between economic growth and export expansion in Mauritius as observed through time series data. Using Granger Causality tests, the short-run analysis results revealed that there is significant reciprocal causality between real export earnings (total, textiles and manufacturing) ...

  18. Tests for nonlinearity in short stationary time series

    International Nuclear Information System (INIS)

    Chang, T.; Sauer, T.; Schiff, S.J.

    1995-01-01

    To compare direct tests for detecting determinism in chaotic time series, data from the Hénon, Lorenz, and Mackey-Glass equations were contaminated with various levels of additive colored noise. These data were analyzed with a variety of recently developed tests for determinism, and the results compared

  19. forecasting with nonlinear time series model: a monte-carlo ...

    African Journals Online (AJOL)


    with nonlinear time series model by comparing the RMSE with the traditional bootstrap and Monte-Carlo method of forecasting. We use the logistic smooth transition autoregressive (LSTAR) model as a case study. We first consider a linear model called the AR(p) model of order p which satisfies the following linear ...

  20. Time Series Factor Analysis with an Application to Measuring Money

    NARCIS (Netherlands)

    Gilbert, Paul D.; Meijer, Erik

    2005-01-01

    Time series factor analysis (TSFA) and its associated statistical theory is developed. Unlike dynamic factor analysis (DFA), TSFA obviates the need for explicitly modeling the process dynamics of the underlying phenomena. It also differs from standard factor analysis (FA) in important respects: the

  1. Seasonal time series forecasting: a comparative study of arima and ...

    African Journals Online (AJOL)

    This paper addresses the concerns of Faraway and Chatfield (1998) who questioned the forecasting ability of Artificial Neural Networks (ANN). In particular the paper compares the performance of Artificial Neural Networks (ANN) and ARIMA models in forecasting of seasonal (monthly) Time series. Using the Airline data ...

  2. Time series prediction with simple recurrent neural networks ...

    African Journals Online (AJOL)

    Simple recurrent neural networks are widely used in time series prediction. Most researchers and application developers often choose arbitrarily between Elman or Jordan simple recurrent neural networks for their applications. A hybrid of the two called Elman-Jordan (or Multi-recurrent) neural network is also being used.

  3. Multivariate time series modeling of selected childhood diseases in ...

    African Journals Online (AJOL)

    This paper is focused on modeling the five most prevalent childhood diseases in Akwa Ibom State using a multivariate approach to time series. An aggregate of 78,839 reported cases of malaria, upper respiratory tract infection (URTI), Pneumonia, anaemia and tetanus were extracted from five randomly selected hospitals in ...

  4. Multiple imputation for time series data with Amelia package.

    Science.gov (United States)

    Zhang, Zhongheng

    2016-02-01

    Time series data are common in medical research. Many laboratory variables or study endpoints can be measured repeatedly over time. Multiple imputation (MI) that does not take the time trend of a variable into account may produce unreliable imputations. The article illustrates how to perform MI with the Amelia package in a clinical scenario. The Amelia package is powerful in that it allows MI for time series data. External information on the variable of interest can also be incorporated by using the prior or bound arguments. Such information may be based on previously published observations, academic consensus, and personal experience. Diagnostics of the imputation model can be performed by examining the distributions of imputed and observed values, or by using the over-imputation technique.
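
    Amelia is an R package; as a hedged Python analogue of the same idea (letting the imputer see time information through lags, leads and a time polynomial, then drawing several imputations), the sketch below uses scikit-learn's IterativeImputer. It is not the Amelia API, and the variable names and data file are assumptions.

```python
# Sketch (Python, not the R Amelia package): multiple imputation of one
# time-dependent lab variable, giving the imputer lags, leads and a time
# polynomial -- the same intuition as Amelia's time-series options.
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

df = pd.read_csv("labs.csv")                    # hypothetical: columns 'day', 'lactate'
X = pd.DataFrame({
    "lactate":      df["lactate"],
    "lactate_lag":  df["lactate"].shift(1),
    "lactate_lead": df["lactate"].shift(-1),
    "t":  df["day"],
    "t2": df["day"] ** 2,
})

imputations = []
for m in range(5):                              # five imputed datasets
    imp = IterativeImputer(sample_posterior=True, random_state=m, max_iter=20)
    completed = imp.fit_transform(X)
    imputations.append(pd.Series(completed[:, 0], name=f"imp_{m}"))

# Analyses would be run on each imputed dataset and pooled (Rubin's rules).
print(pd.concat(imputations, axis=1).describe().round(2))
```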

  5. Classification of time series patterns from complex dynamic systems

    Energy Technology Data Exchange (ETDEWEB)

    Schryver, J.C.; Rao, N.

    1998-07-01

    An increasing availability of high-performance computing and data storage media at decreasing cost is making possible the proliferation of large-scale numerical databases and data warehouses. Numeric warehousing enterprises on the order of hundreds of gigabytes to terabytes are a reality in many fields such as finance, retail sales, process systems monitoring, biomedical monitoring, surveillance and transportation. Large-scale databases are becoming more accessible to larger user communities through the internet, web-based applications and database connectivity. Consequently, most researchers now have access to a variety of massive datasets. This trend will probably only continue to grow over the next several years. Unfortunately, the availability of integrated tools to explore, analyze and understand the data warehoused in these archives is lagging far behind the ability to gain access to the same data. In particular, locating and identifying patterns of interest in numerical time series data is an increasingly important problem for which there are few available techniques. Temporal pattern recognition poses many interesting problems in classification, segmentation, prediction, diagnosis and anomaly detection. This research focuses on the problem of classification or characterization of numerical time series data. Highway vehicles and their drivers are examples of complex dynamic systems (CDS) which are being used by transportation agencies for field testing to generate large-scale time series datasets. Tools for effective analysis of numerical time series in databases generated by highway vehicle systems are not yet available, or have not been adapted to the target problem domain. However, analysis tools from similar domains may be adapted to the problem of classification of numerical time series data.

  6. Displaying time series, spatial, and space-time data with R

    CERN Document Server

    Perpinan Lamigueiro, Oscar

    2014-01-01

    Code and Methods for Creating High-Quality Data Graphics. A data graphic is not only a static image, but it also tells a story about the data. It activates cognitive processes that are able to detect patterns and discover information not readily available with the raw data. This is particularly true for time series, spatial, and space-time datasets. Focusing on the exploration of data with visual methods, Displaying Time Series, Spatial, and Space-Time Data with R presents methods and R code for producing high-quality graphics of time series, spatial, and space-time data. Practical examples using

  7. Normalization methods in time series of platelet function assays

    Science.gov (United States)

    Van Poucke, Sven; Zhang, Zhongheng; Roest, Mark; Vukicevic, Milan; Beran, Maud; Lauwereins, Bart; Zheng, Ming-Hua; Henskens, Yvonne; Lancé, Marcus; Marcus, Abraham

    2016-01-01

    Abstract Platelet function can be quantitatively assessed by specific assays such as light-transmission aggregometry, multiple-electrode aggregometry measuring the response to adenosine diphosphate (ADP), arachidonic acid, collagen, and thrombin-receptor activating peptide, and by viscoelastic tests such as rotational thromboelastometry (ROTEM). The task of extracting meaningful statistical and clinical information from high-dimensional data spaces in temporal multivariate clinical data represented in multivariate time series is complex. Building insightful visualizations for multivariate time series demands adequate usage of normalization techniques. In this article, various methods for data normalization (z-transformation, range transformation, proportion transformation, and interquartile range) are presented and visualized, and the most suitable approach for platelet function data series is discussed. Normalization was calculated per assay (test) for all time points and per time point for all tests. Interquartile range, range transformation, and z-transformation demonstrated the correlation as calculated by the Spearman correlation test when normalized per assay (test) for all time points. When normalizing per time point for all tests, no correlation could be abstracted from the charts, as was the case when using all data as one dataset for normalization. PMID:27428217
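
    A minimal Python sketch of the normalization variants named in this record, applied per assay (column) across all time points, is shown below. The toy data frame and assay names are made up for illustration; they are not the authors' dataset.

```python
# Sketch of the normalization variants named above, applied per assay (column)
# across all time points. Toy data; assay names are illustrative assumptions.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
data = pd.DataFrame({
    "ADP_aggregometry": rng.normal(60, 15, 12),
    "ROTEM_MCF":        rng.normal(55, 5, 12),
}, index=pd.RangeIndex(12, name="time_point"))

z_norm     = (data - data.mean()) / data.std()                       # z-transformation
range_norm = (data - data.min()) / (data.max() - data.min())         # range transformation to [0, 1]
iqr_norm   = (data - data.median()) / (data.quantile(0.75) - data.quantile(0.25))  # IQR scaling

# Normalizing per time point (across assays) would instead use row-wise (axis=1)
# statistics, the variant the authors found to obscure the between-assay correlation.
print(z_norm.round(2).head())
```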

  8. An entropic approach to the analysis of time series

    Science.gov (United States)

    Scafetta, Nicola

    Statistical analysis of time series. With compelling arguments we show that the Diffusion Entropy Analysis (DEA) is the only method of the literature of the Science of Complexity that correctly determines the scaling hidden within a time series reflecting a Complex Process. The time series is thought of as a source of fluctuations, and the DEA is based on the Shannon entropy of the diffusion process generated by these fluctuations. All traditional methods of scaling analysis, instead, are based on the variance of this diffusion process. The variance methods detect the real scaling only if the Gaussian assumption holds true. We call H the scaling exponent detected by the variance methods and delta the real scaling exponent. If the time series is characterized by Fractional Brownian Motion, we have H = delta and the scaling can be safely determined, in this case, by using the variance methods. If, on the contrary, the time series is characterized, for example, by Levy statistics, H ≠ delta and the variance methods cannot be used to detect the true scaling. Levy walk yields the relation delta = 1/(3 - 2H). In the case of Levy flights, the variance diverges and the exponent H cannot be determined, whereas the scaling delta exists and can be established by using the DEA. Therefore, only the joint use of two different scaling analysis methods, the variance scaling analysis and the DEA, can assess the real nature, Gauss or Levy or something else, of a time series. Moreover, the DEA determines the information content, under the form of Shannon entropy, or of any other convenient entropic indicator, at each time step of the process that, given a sufficiently large number of data, is expected to become diffusion with scaling. This makes it possible to study the regime of transition from dynamics to thermodynamics, non-stationary regimes, and the saturation regime as well. First of all, the efficiency of the DEA is proved with theoretical arguments and with numerical work

  9. On the plurality of times: disunified time and the A-series | Nefdt ...

    African Journals Online (AJOL)

    Then, I attempt to show that disunified time is a problem for a semantics based on the A-series since A-truthmakers are hard to come by in a universe of temporally disconnected time-series. Finally, I provide a novel argument showing that presentists should be particularly fearful of such a universe. South African Journal of ...

  10. Multi-Scale Dissemination of Time Series Data

    DEFF Research Database (Denmark)

    Guo, Qingsong; Zhou, Yongluan; Su, Li

    2013-01-01

    In this paper, we consider the problem of continuous dissemination of time series data, such as sensor measurements, to a large number of subscribers. These subscribers fall into multiple subscription levels, where each subscription level is specified by the bandwidth constraint of a subscriber......, which is an abstract indicator for both the physical limits and the amount of data that the subscriber would like to handle. To handle this problem, we propose a system framework for multi-scale time series data dissemination that employs a typical tree-based dissemination network and existing time...... to optimize the average accuracies of the data received by all subscribers within the dissemination network. Finally, we have conducted extensive experiments to study the performance of the algorithms....

  11. Optimization of recurrent neural networks for time series modeling

    DEFF Research Database (Denmark)

    Pedersen, Morten With

    1997-01-01

    The present thesis is about optimization of recurrent neural networks applied to time series modeling. In particular is considered fully recurrent networks working from only a single external input, one layer of nonlinear hidden units and a linear output unit applied to prediction of discrete time...... series. The overall objectives are to improve training by application of second-order methods and to improve generalization ability by architecture optimization accomplished by pruning. The major topics covered in the thesis are: 1. The problem of training recurrent networks is analyzed from a numerical...... of solution obtained as well as computation time required. 3. A theoretical definition of the generalization error for recurrent networks is provided. This definition justifies a commonly adopted approach for estimating generalization ability. 4. The viability of pruning recurrent networks by the Optimal...

  12. Estimating density dependence from time series of population age structure.

    Science.gov (United States)

    Lande, Russell; Engen, Steinar; Saether, Bernt-Erik; Coulson, Tim

    2006-07-01

    Population fluctuations are caused by demographic and environmental stochasticity, time lags due to life history, and density dependence. We model a general life history allowing density dependence within and among age or stage classes in a population undergoing small or moderate fluctuations around a stable equilibrium. We develop a method for estimating the overall strength of density dependence measured by the rate of return toward equilibrium, and we also consider a simplified population description and forecasting using the density-dependent reproductive value. This generality comes at the cost of requiring a time series of the population age or stage structure instead of a univariate time series of adult or total population size. The method is illustrated by analyzing the dynamics of a fully censused population of red deer (Cervus elaphus) based on annual fluctuations of age structure through 21 years.

  13. Reconstruction of ensembles of coupled time-delay systems from time series.

    Science.gov (United States)

    Sysoev, I V; Prokhorov, M D; Ponomarenko, V I; Bezruchko, B P

    2014-06-01

    We propose a method to recover from time series the parameters of coupled time-delay systems and the architecture of couplings between them. The method is based on a reconstruction of model delay-differential equations and estimation of statistical significance of couplings. It can be applied to networks composed of nonidentical nodes with an arbitrary number of unidirectional and bidirectional couplings. We test our method on chaotic and periodic time series produced by model equations of ensembles of diffusively coupled time-delay systems in the presence of noise, and apply it to experimental time series obtained from electronic oscillators with delayed feedback coupled by resistors.

  14. Basic interrupt and command structures and applications

    International Nuclear Information System (INIS)

    Davies, R.C.

    1974-01-01

    Interrupt and command structures of a real-time system are described through specific examples. References to applications of a real-time system and programming development references are supplied. (auth)

  15. Perception of acoustically presented time series with varied intervals.

    Science.gov (United States)

    Wackermann, Jiří; Pacer, Jakob; Wittmann, Marc

    2014-03-01

    Data from three experiments on serial perception of temporal intervals in the supra-second domain are reported. Sequences of short acoustic signals ("pips") separated by periods of silence were presented to the observers. Two types of time series, geometric or alternating, were used, where the modulus 1+δ of the inter-pip series and the base duration Tb (range from 1.1 to 6s) were varied as independent parameters. The observers had to judge whether the series were accelerating, decelerating, or uniform (3 paradigm), or to distinguish regular from irregular sequences (2 paradigm). "Intervals of subjective uniformity" (isus) were obtained by fitting Gaussian psychometric functions to individual subjects' responses. Progression towards longer base durations (Tb=4.4 or 6s) shifts the isus towards negative δs, i.e., accelerating series. This finding is compatible with the phenomenon of "subjective shortening" of past temporal intervals, which is naturally accounted for by the lossy integration model of internal time representation. The opposite effect observed for short durations (Tb=1.1 or 1.5s) remains unexplained by the lossy integration model, and presents a challenge for further research. © 2013 Elsevier B.V. All rights reserved.

  16. Topological data analysis of financial time series: Landscapes of crashes

    Science.gov (United States)

    Gidea, Marian; Katz, Yuri

    2018-02-01

    We explore the evolution of daily returns of four major US stock market indices during the technology crash of 2000, and the financial crisis of 2007-2009. Our methodology is based on topological data analysis (TDA). We use persistent homology to detect and quantify topological patterns that appear in multidimensional time series. Using a sliding window, we extract time-dependent point cloud data sets, to which we associate a topological space. We detect transient loops that appear in this space, and we measure their persistence. This is encoded in real-valued functions referred to as 'persistence landscapes'. We quantify the temporal changes in persistence landscapes via their Lp-norms. We test this procedure on multidimensional time series generated by various non-linear and non-equilibrium models. We find that, in the vicinity of financial meltdowns, the Lp-norms exhibit strong growth prior to the primary peak, which ascends during a crash. Remarkably, the average spectral density at low frequencies of the time series of Lp-norms of the persistence landscapes demonstrates a strong rising trend for 250 trading days prior to either the dotcom crash on 03/10/2000 or the Lehman bankruptcy on 09/15/2008. Our study suggests that TDA provides a new type of econometric analysis, which complements the standard statistical measures. The method can be used to detect early warning signals of imminent market crashes. We believe that this approach can be used beyond the analysis of financial time series presented here.
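
    The sliding-window construction described here can be sketched with the ripser package, as below. Total persistence of the H1 diagram is used as a crude stand-in for the Lp-norm of a persistence landscape, and the synthetic 4-dimensional return series replaces the real index data, so this is a simplification of the authors' methodology, not a reproduction of it.

```python
# Sketch of the sliding-window pipeline: point clouds of multidimensional returns,
# H1 persistence diagrams via ripser, and a per-window persistence summary.
# Total persistence is a simple stand-in for the landscape Lp-norm used in the paper.
import numpy as np
from ripser import ripser

rng = np.random.default_rng(3)
returns = rng.standard_normal((1500, 4)) * 0.01      # stand-in for 4 index return series

window, summaries = 50, []
for start in range(0, len(returns) - window, 10):
    cloud = returns[start:start + window]             # 4-D point cloud for this window
    h1 = ripser(cloud, maxdim=1)["dgms"][1]           # H1 persistence diagram
    finite = h1[np.isfinite(h1[:, 1])]
    summaries.append((finite[:, 1] - finite[:, 0]).sum())   # total persistence of loops

print("windows:", len(summaries), "| peak summary:", max(summaries))
```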

  17. FTSPlot: fast time series visualization for large datasets.

    Directory of Open Access Journals (Sweden)

    Michael Riss

    Full Text Available The analysis of electrophysiological recordings often involves visual inspection of time series data to locate specific experiment epochs, mask artifacts, and verify the results of signal processing steps, such as filtering or spike detection. Long-term experiments with continuous data acquisition generate large amounts of data. Rapid browsing through these massive datasets poses a challenge to conventional data plotting software because the plotting time increases proportionately to the increase in the volume of data. This paper presents FTSPlot, which is a visualization concept for large-scale time series datasets using techniques from the field of high performance computer graphics, such as hierarchic level of detail and out-of-core data handling. In a preprocessing step, time series data, event, and interval annotations are converted into an optimized data format, which then permits fast, interactive visualization. The preprocessing step has a computational complexity of O(n × log(N)); the visualization itself can be done with a complexity of O(1) and is therefore independent of the amount of data. A demonstration prototype has been implemented and benchmarks show that the technology is capable of displaying large amounts of time series data, event, and interval annotations lag-free with < 20 ms. The current 64-bit implementation theoretically supports datasets with up to 2^64 bytes, on the x86_64 architecture currently up to 2^48 bytes are supported, and benchmarks have been conducted with 2^40 bytes / 1 TiB or 1.3 × 10^11 double precision samples. The presented software is freely available and can be included as a Qt GUI component in future software projects, providing a standard visualization method for long-term electrophysiological experiments.
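
    The core level-of-detail idea can be sketched in a few lines of Python: precompute a pyramid of per-block min/max summaries once, then answer any zoom request from the coarsest level that still gives enough points. This is a generic illustration of the technique, not FTSPlot's actual Qt/C++ implementation or file format; block sizes and thresholds are assumptions.

```python
# Sketch of hierarchic level-of-detail for fast time-series browsing: build a
# min/max pyramid once, then draw any zoom level from a bounded number of points.
import numpy as np

def build_pyramid(x, block=2, levels=12):
    pyramid = []
    lo = hi = x.astype(float)
    for _ in range(levels):
        n = (len(lo) // block) * block
        lo = lo[:n].reshape(-1, block).min(axis=1)    # block-wise minima
        hi = hi[:n].reshape(-1, block).max(axis=1)    # block-wise maxima
        pyramid.append((lo, hi))
    return pyramid

def render(x, pyramid, start, stop, max_points=2000):
    span, level = stop - start, 0
    while span // (2 ** (level + 1)) > max_points and level < len(pyramid) - 1:
        level += 1
    if level == 0:
        return x[start:stop]                           # raw samples are few enough
    lo, hi = pyramid[level - 1]
    s, e = start // 2 ** level, stop // 2 ** level
    return np.column_stack([lo[s:e], hi[s:e]])         # min/max envelope to draw

x = np.random.default_rng(4).standard_normal(2_000_000)
pyr = build_pyramid(x)
print(render(x, pyr, 0, len(x)).shape)                 # coarse view of the whole trace
```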

  18. Financial time series analysis based on information categorization method

    Science.gov (United States)

    Tian, Qiang; Shang, Pengjian; Feng, Guochen

    2014-12-01

    The paper mainly applies the information categorization method to analyze financial time series. The method is used to examine the similarity of different sequences by calculating the distances between them. We apply this method to quantify the similarity of different stock markets. We report the results of similarity in US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis) by using this method. The results show the difference of similarity between different stock markets in different time periods, and the similarity of the two stock markets becomes larger after these two crises. Also we acquire the results of similarity of 10 stock indices in three areas; it means the method can distinguish different areas' markets from the phylogenetic trees. The results show that we can get satisfactory information from financial markets by this method. The information categorization method can not only be used in physiologic time series, but also in financial time series.

  19. Dynamical analysis and visualization of tornadoes time series.

    Directory of Open Access Journals (Sweden)

    António M Lopes

    Full Text Available In this paper we analyze the behavior of tornado time-series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series involving 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.

  20. Dynamical analysis and visualization of tornadoes time series.

    Science.gov (United States)

    Lopes, António M; Tenreiro Machado, J A

    2015-01-01

    In this paper we analyze the behavior of tornado time-series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series involving 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.

  1. Cluster analysis of activity-time series in motor learning

    DEFF Research Database (Denmark)

    Balslev, Daniela; Nielsen, Finn Å; Futiger, Sally A

    2002-01-01

    Neuroimaging studies of learning focus on brain areas where the activity changes as a function of time. To circumvent the difficult problem of model selection, we used a data-driven analytic tool, cluster analysis, which extracts representative temporal and spatial patterns from the voxel......-time series. The optimal number of clusters was chosen using a cross-validated likelihood method, which highlights the clustering pattern that generalizes best over the subjects. Data were acquired with PET at different time points during practice of a visuomotor task. The results from cluster analysis show...

  2. Efficient Processing of Multiple DTW Queries in Time Series Databases

    DEFF Research Database (Denmark)

    Kremer, Hardy; Günnemann, Stephan; Ivanescu, Anca-Maria

    2011-01-01

    Dynamic Time Warping (DTW) is a widely used distance measure for time series that has been successfully used in science and many other application domains. As DTW is computationally expensive, there is a strong need for efficient query processing algorithms. Such algorithms exist for single queries....... In many of today’s applications, however, large numbers of queries arise at any given time. Existing DTW techniques do not process multiple DTW queries simultaneously, a serious limitation which slows down overall processing. In this paper, we propose an efficient processing approach for multiple DTW...
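
    For context, the primitive that multi-query DTW processing builds on is the textbook dynamic-programming distance sketched below in Python. This is not the paper's simultaneous multiple-query algorithm; the sine-wave query and candidate are illustrative.

```python
# Sketch: the basic O(n*m) dynamic-programming DTW distance that multi-query
# processing schemes build on (not the paper's multiple-query algorithm).
import numpy as np

def dtw(a, b):
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j],      # insertion
                                 D[i, j - 1],      # deletion
                                 D[i - 1, j - 1])  # match
    return D[n, m]

t = np.linspace(0, 2 * np.pi, 80)
query, candidate = np.sin(t), np.sin(t * 1.1 + 0.3)
print(f"DTW distance: {dtw(query, candidate):.3f}")
```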

  3. Interruptions and multitasking in nursing care.

    Science.gov (United States)

    Kalisch, Beatrice J; Aebersold, Michelle

    2010-03-01

    The environment surrounding registered nurses (RNs) has been described as fast-paced and unpredictable, and nurses' cognitive load as exceptionally heavy. Studies of interruptions and multitasking in health care are limited, and most have focused on physicians. The extent and type of interruptions and multitasking of nurses, as well as patient errors, were studied using a natural-setting observational field design. The study was conducted in seven patient care units in two Midwestern hospitals--an academic medical center and a community-based teaching hospital. A total of 35 nurses were observed for four-hour periods of time by experienced clinical nurses, who underwent training until they reached an interrater reliability of 0.90. In the 36 RN observations (total, 136 hours) 3,441 events were captured. There were a total of 1,354 interruptions, 46 hours of multitasking, and 200 errors. Nurses were interrupted 10 times per hour, or 1 interruption per 6 minutes. However, RNs in one of the hospitals had significantly more interruptions--1 interruption every 4 1/2 minutes in Hospital 1 (versus 1 every 13.3 minutes in Hospital 2). Nurses were observed to be multitasking 34% of the time (range, 23%- 41%). Overall, the error rate was 1.5 per hour (1.02 per hour in Hospital 1 and 1.89 per hour in Hospital 2). Although there was no significant relationship between interruptions, multitasking, and patient errors, the results of this study show that nurses' work environment is complex and error prone. RNs observed in both hospitals and on all patient care units experienced a high level of discontinuity in the execution of their work. Although nurses manage interruptions and multitasking well, the potential for errors is present, and strategies to decrease interruptions are needed.

  4. Parameterizing unconditional skewness in models for financial time series

    DEFF Research Database (Denmark)

    He, Changli; Silvennoinen, Annastiina; Teräsvirta, Timo

    In this paper we consider the third-moment structure of a class of time series models. It is often argued that the marginal distribution of financial time series such as returns is skewed. Therefore it is of importance to know what properties a model should possess if it is to accommodate...... unconditional skewness. We consider modelling the unconditional mean and variance using models that respond nonlinearly or asymmetrically to shocks. We investigate the implications of these models on the third-moment structure of the marginal distribution as well as conditions under which the unconditional...... distribution exhibits skewness and nonzero third-order autocovariance structure. In this respect, an asymmetric or nonlinear specification of the conditional mean is found to be of greater importance than the properties of the conditional variance. Several examples are discussed and, whenever possible...

  5. Time Series Analysis, Modeling and Applications A Computational Intelligence Perspective

    CERN Document Server

    Chen, Shyi-Ming

    2013-01-01

    Temporal and spatiotemporal data form an inherent fabric of the society as we are faced with streams of data coming from numerous sensors, data feeds, recordings associated with numerous areas of application embracing physical and human-generated phenomena (environmental data, financial markets, Internet activities, etc.). A quest for a thorough analysis, interpretation, modeling and prediction of time series comes with an ongoing challenge for developing models that are both accurate and user-friendly (interpretable). The volume is aimed to exploit the conceptual and algorithmic framework of Computational Intelligence (CI) to form a cohesive and comprehensive environment for building models of time series. The contributions covered in the volume are fully reflective of the wealth of the CI technologies by bringing together ideas, algorithms, and numeric studies, which convincingly demonstrate their relevance, maturity and visible usefulness. It reflects upon the truly remarkable diversity of methodological a...

  6. Model and Variable Selection Procedures for Semiparametric Time Series Regression

    Directory of Open Access Journals (Sweden)

    Risa Kato

    2009-01-01

    Full Text Available Semiparametric regression models are very useful for time series analysis. They facilitate the detection of features resulting from external interventions. The complexity of semiparametric models poses new challenges for issues of nonparametric and parametric inference and model selection that frequently arise from time series data analysis. In this paper, we propose penalized least squares estimators which can simultaneously select significant variables and estimate unknown parameters. An innovative class of variable selection procedures is proposed to select significant variables and basis functions in a semiparametric model. The asymptotic normality of the resulting estimators is established. Information criteria for model selection are also proposed. We illustrate the effectiveness of the proposed procedures with numerical simulations.

  7. A Generalization of Some Classical Time Series Tools

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Madsen, Henrik

    2001-01-01

    In classical time series analysis the sample autocorrelation function (SACF) and the sample partial autocorrelation function (SPACF) have gained wide application for structural identification of linear time series models. We suggest generalizations, founded on smoothing techniques, applicable ... or linearity. The generalizations do not prescribe a particular smoothing technique. In fact, when the smoother is replaced by a linear regression the generalizations reduce to close approximations of SACF and SPACF. For this reason a smooth transition from the linear to the non-linear case can be obtained ... by varying the bandwidth of a local linear smoother. By adjusting the flexibility of the smoother the power of the tests for independence and linearity against specific alternatives can be adjusted. The generalizations allow for graphical presentations, very similar to those used for SACF and SPACF...

  8. Deviations from uniform power law scaling in nonstationary time series

    Science.gov (United States)

    Viswanathan, G. M.; Peng, C. K.; Stanley, H. E.; Goldberger, A. L.

    1997-01-01

    A classic problem in physics is the analysis of highly nonstationary time series that typically exhibit long-range correlations. Here we test the hypothesis that the scaling properties of the dynamics of healthy physiological systems are more stable than those of pathological systems by studying beat-to-beat fluctuations in the human heart rate. We develop techniques based on the Fano factor and Allan factor functions, as well as on detrended fluctuation analysis, for quantifying deviations from uniform power-law scaling in nonstationary time series. By analyzing extremely long data sets of up to N = 10(5) beats for 11 healthy subjects, we find that the fluctuations in the heart rate scale approximately uniformly over several temporal orders of magnitude. By contrast, we find that in data sets of comparable length for 14 subjects with heart disease, the fluctuations grow erratically, indicating a loss of scaling stability.
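
    A minimal Python sketch of detrended fluctuation analysis (DFA), one of the techniques named in this record, is given below; the Fano and Allan factor variants are omitted, and the white-noise input is only a stand-in for the heartbeat interval data.

```python
# Minimal detrended fluctuation analysis (DFA) sketch: integrate the series,
# detrend it in windows of varying size, and read the scaling exponent from the
# log-log slope of the fluctuation function.
import numpy as np

def dfa(x, scales):
    y = np.cumsum(x - np.mean(x))                  # integrated (profile) series
    F = []
    for s in scales:
        n = len(y) // s
        segs = y[:n * s].reshape(n, s)
        t = np.arange(s)
        rms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)           # local linear trend
            rms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(rms)))
    return np.array(F)

x = np.random.default_rng(5).standard_normal(10_000)    # stand-in for RR intervals
scales = np.unique(np.logspace(1, 3, 15).astype(int))
alpha = np.polyfit(np.log(scales), np.log(dfa(x, scales)), 1)[0]
print(f"DFA scaling exponent alpha ≈ {alpha:.2f} (≈ 0.5 for white noise)")
```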

  9. Mathematical methods in time series analysis and digital image processing

    CERN Document Server

    Kurths, J; Maass, P; Timmer, J

    2008-01-01

    The aim of this volume is to bring together research directions in theoretical signal and imaging processing developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms, and methods as well as the investigation of connections between methods from time series analysis and image processing are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and geophysical/enviromental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.

  10. Time series analysis of nuclear instrumentation in EBR-II

    Energy Technology Data Exchange (ETDEWEB)

    Imel, G.R.

    1996-05-01

    Results of a time series analysis of the scaler count data from the 3 wide range nuclear detectors in the Experimental Breeder Reactor-II are presented. One of the channels was replaced, and it was desired to determine if there was any statistically significant change (ie, improvement) in the channel's response after the replacement. Data were collected from all 3 channels for 16-day periods before and after detector replacement. Time series analysis and statistical tests showed that there was no significant change after the detector replacement. Also, there were no statistically significant differences among the 3 channels, either before or after the replacement. Finally, it was determined that errors in the reactivity change inferred from subcritical count monitoring during fuel handling would be on the order of 20-30 cents for single count intervals.

  11. Models for Pooled Time-Series Cross-Section Data

    Directory of Open Access Journals (Sweden)

    Lawrence E Raffalovich

    2015-07-01

    Full Text Available Several models are available for the analysis of pooled time-series cross-section (TSCS) data, defined as "repeated observations on fixed units" (Beck and Katz 1995). In this paper, we run the following models: (1) a completely pooled model, (2) fixed effects models, and (3) multi-level/hierarchical linear models. To illustrate these models, we use a Generalized Least Squares (GLS) estimator with cross-section weights and panel-corrected standard errors (with EViews 8) on the cross-national homicide trends data of forty countries from 1950 to 2005, which we source from published research (Messner et al. 2011). We describe and discuss the similarities and differences between the models, and what information each can contribute to help answer substantive research questions. We conclude with a discussion of how the models we present may help to mitigate validity threats inherent in pooled time-series cross-section data analysis.
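
    A hedged Python sketch of the three model families named in this record (completely pooled, fixed effects, multi-level) is shown below, fitted with statsmodels formulas on a long-format panel. It is not the EViews GLS / panel-corrected-standard-errors setup used in the paper, and the file and variable names are assumptions.

```python
# Sketch of the three pooled TSCS model families: completely pooled OLS, country
# fixed effects, and a multi-level (random-intercept) model. Variable names and
# the CSV file are assumptions; this is not the paper's EViews GLS estimator.
import pandas as pd
import statsmodels.formula.api as smf

panel = pd.read_csv("homicide_panel.csv")    # hypothetical: country, year, homicide, gdp

pooled = smf.ols("homicide ~ gdp + year", data=panel).fit()

fixed = smf.ols("homicide ~ gdp + year + C(country)", data=panel).fit()   # country fixed effects

multilevel = smf.mixedlm("homicide ~ gdp + year", data=panel,
                         groups=panel["country"]).fit()                   # random intercepts

for name, res in [("pooled", pooled), ("fixed effects", fixed), ("multilevel", multilevel)]:
    print(name, "gdp coefficient:", round(res.params["gdp"], 3))
```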

  12. Time series analysis of nuclear instrumentation in EBR-II

    International Nuclear Information System (INIS)

    Imel, G.R.

    1996-01-01

    Results of a time series analysis of the scaler count data from the 3 wide range nuclear detectors in the Experimental Breeder Reactor-II are presented. One of the channels was replaced, and it was desired to determine if there was any statistically significant change (ie, improvement) in the channel's response after the replacement. Data were collected from all 3 channels for 16-day periods before and after detector replacement. Time series analysis and statistical tests showed that there was no significant change after the detector replacement. Also, there were no statistically significant differences among the 3 channels, either before or after the replacement. Finally, it was determined that errors in the reactivity change inferred from subcritical count monitoring during fuel handling would be on the order of 20-30 cents for single count intervals

  13. Modeling Philippine Stock Exchange Composite Index Using Time Series Analysis

    Science.gov (United States)

    Gayo, W. S.; Urrutia, J. D.; Temple, J. M. F.; Sandoval, J. R. D.; Sanglay, J. E. A.

    2015-06-01

    This study was conducted to develop a time series model of the Philippine Stock Exchange Composite Index and its volatility using the finite mixture of ARIMA model with conditional variance equations such as ARCH, GARCH, EGARCH, TARCH and PARCH models. Also, the study aimed to find out the reason behind the behavior of PSEi, that is, which of the economic variables - Consumer Price Index, crude oil price, foreign exchange rate, gold price, interest rate, money supply, price-earnings ratio, Producers' Price Index and terms of trade - can be used in projecting future values of PSEi, and this was examined using the Granger Causality Test. The findings showed that the best time series model for the Philippine Stock Exchange Composite Index is ARIMA(1,1,5)-ARCH(1). Also, Consumer Price Index, crude oil price and foreign exchange rate are factors concluded to Granger cause the Philippine Stock Exchange Composite Index.
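
    The two ingredients described here can be sketched in Python with statsmodels and the arch package: a Granger-causality check of a candidate driver against the index, and an ARCH-type fit to the differenced index. The column names and data file are hypothetical, and the paper's exact finite-mixture ARIMA(1,1,5)-ARCH(1) specification is not reproduced.

```python
# Sketch: Granger-causality test of CPI against PSEi, and an AR(1)-ARCH(1) fit to
# index changes. Column names and the CSV are hypothetical; this is a simplified
# stand-in for the paper's ARIMA(1,1,5)-ARCH(1) specification.
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests
from arch import arch_model

df = pd.read_csv("psei_monthly.csv", parse_dates=["date"], index_col="date")

# does CPI Granger-cause PSEi? (test on differenced, roughly stationary series)
grangercausalitytests(df[["psei", "cpi"]].diff().dropna(), maxlag=4)

# AR(1) mean with ARCH(1) conditional variance on index changes
d_psei = df["psei"].diff().dropna()
res = arch_model(d_psei, mean="AR", lags=1, vol="ARCH", p=1).fit(disp="off")
print(res.summary())
```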

  14. Time series analysis methods and applications for flight data

    CERN Document Server

    Zhang, Jianye

    2017-01-01

    This book focuses on different facets of flight data analysis, including the basic goals, methods, and implementation techniques. As mass flight data possesses the typical characteristics of time series, the time series analysis methods and their application for flight data have been illustrated from several aspects, such as data filtering, data extension, feature optimization, similarity search, trend monitoring, fault diagnosis, and parameter prediction, etc. An intelligent information-processing platform for flight data has been established to assist in aircraft condition monitoring, training evaluation and scientific maintenance. The book will serve as a reference resource for people working in aviation management and maintenance, as well as researchers and engineers in the fields of data analysis and data mining.

  15. Time series prediction by feedforward neural networks - is it difficult?

    CERN Document Server

    Rosen-Zvi, M; Kinzel, W

    2003-01-01

    The difficulties that a neural network faces when trying to learn from a quasi-periodic time series are studied analytically using a teacher-student scenario where the random input is divided into two macroscopic regions with different variances, 1 and 1/γ² (γ ≫ 1). The generalization error is found to decrease as ε_g ∝ exp(−α/γ²), where α is the number of examples per input dimension. In contradiction to this very slow vanishing generalization error, the next output prediction is found to be almost free of mistakes. This picture is consistent with learning quasi-periodic time series produced by feedforward neural networks, which is dominated by enhanced components of the Fourier spectrum of the input. Simulation results are in good agreement with the analytical results.

  16. Time series prediction by feedforward neural networks - is it difficult?

    Science.gov (United States)

    Rosen-Zvi, Michal; Kanter, Ido; Kinzel, Wolfgang

    2003-04-01

    The difficulties that a neural network faces when trying to learn from a quasi-periodic time series are studied analytically using a teacher-student scenario where the random input is divided into two macroscopic regions with different variances, 1 and 1/γ² (γ ≫ 1). The generalization error is found to decrease as ε_g ∝ exp(−α/γ²), where α is the number of examples per input dimension. In contradiction to this very slow vanishing generalization error, the next output prediction is found to be almost free of mistakes. This picture is consistent with learning quasi-periodic time series produced by feedforward neural networks, which is dominated by enhanced components of the Fourier spectrum of the input. Simulation results are in good agreement with the analytical results.

  17. A Comparative Study of Portmanteau Tests for Univariate Time Series Models

    Directory of Open Access Journals (Sweden)

    Sohail Chand

    2006-07-01

    Full Text Available Time series model diagnostic checking is the most important stage of time series model building. In this paper, a comparison among several suggested diagnostic tests is made using simulated time series data.
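
    For reference, the classical portmanteau checks that such comparisons are about can be computed in a few lines of Python with statsmodels, as in the hedged sketch below; the simulated AR(1) data and the fitted model order are illustrative, not the paper's simulation design.

```python
# Sketch: Ljung-Box and Box-Pierce portmanteau checks on the residuals of a
# fitted ARMA model, the kind of diagnostic compared in the paper. The simulated
# AR(1) data and the model order are illustrative assumptions.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(6)
y = np.zeros(500)
for t in range(1, 500):                       # simulate an AR(1) series
    y[t] = 0.6 * y[t - 1] + rng.standard_normal()

resid = ARIMA(y, order=(1, 0, 0)).fit().resid
print(acorr_ljungbox(resid, lags=[10, 20], boxpierce=True))   # both portmanteau statistics
```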

  18. Forecasting the Reference Evapotranspiration Using Time Series Model

    Directory of Open Access Journals (Sweden)

    H. Zare Abyaneh

    2016-10-01

    Full Text Available Introduction: Reference evapotranspiration is one of the most important factors in irrigation timing and field management. Moreover, reference evapotranspiration forecasting can play a vital role in future developments. Therefore in this study, the seasonal autoregressive integrated moving average (ARIMA) model was used to forecast the reference evapotranspiration time series in the Esfahan, Semnan, Shiraz, Kerman, and Yazd synoptic stations. Materials and Methods: In the present study in all stations (characteristics of the synoptic stations are given in Table 1), the meteorological data, including mean, maximum and minimum air temperature, relative humidity, dry- and wet-bulb temperature, dew-point temperature, wind speed, precipitation, air vapor pressure and sunshine hours were collected from the Islamic Republic of Iran Meteorological Organization (IRIMO) for the 41 years from 1965 to 2005. The FAO Penman-Monteith equation was used to calculate the monthly reference evapotranspiration in the five synoptic stations and the evapotranspiration time series were formed. The unit root test was used to identify whether the time series was stationary, then using the Box-Jenkins method, seasonal ARIMA models were applied to the sample data.
    Table 1. The geographical location and climate conditions of the synoptic stations
    Station | Longitude (E) | Latitude (N) | Altitude (m) | Mean annual air temperature (°C) | Min.-max. air temperature (°C) | Mean precipitation (mm) | Climate (De Martonne index classification)
    Esfahan | 51° 40' | 32° 37' | 1550.4 | 16.36 | 9.4-23.3 | 122 | Arid
    Semnan | 53° 33' | 35° 35' | 1130.8 | 18.0 | 12.4-23.8 | 140 | Arid
    Shiraz | 52° 36' | 29° 32' | 1484 | 18.0 | 10.2-25.9 | 324 | Semi-arid
    Kerman | 56° 58' | 30° 15' | 1753.8 | 15.6 | 6.7-24.6 | 142 | Arid
    Yazd | 54° 17' | 31° 54' | 1237.2 | 19.2 | 11.8-26.0 | 61 | Arid
    Results and Discussion: The monthly meteorological data were used as input for the Ref-ET software and monthly reference ...
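
    Since this record references the FAO Penman-Monteith equation, a hedged Python sketch of the standard FAO-56 daily reference evapotranspiration formula is given below. The input values are illustrative single-day numbers, not the stations' data, and this is the textbook form rather than the authors' exact computation chain.

```python
# Sketch of the standard FAO-56 Penman-Monteith reference evapotranspiration
# (ET0, mm/day) used to build such monthly series before SARIMA modelling.
# Inputs are illustrative values, not the stations' measurements.
import numpy as np

def fao56_et0(t_mean, rh_mean, u2, rn, g=0.0, elevation=1500.0):
    """t_mean degC, rh_mean %, u2 wind at 2 m (m/s), rn net radiation (MJ/m2/day)."""
    p = 101.3 * ((293 - 0.0065 * elevation) / 293) ** 5.26   # atmospheric pressure, kPa
    gamma = 0.000665 * p                                     # psychrometric constant, kPa/degC
    es = 0.6108 * np.exp(17.27 * t_mean / (t_mean + 237.3))  # saturation vapour pressure, kPa
    ea = es * rh_mean / 100.0                                # actual vapour pressure, kPa
    delta = 4098 * es / (t_mean + 237.3) ** 2                # slope of the es curve, kPa/degC
    num = 0.408 * delta * (rn - g) + gamma * (900 / (t_mean + 273)) * u2 * (es - ea)
    return num / (delta + gamma * (1 + 0.34 * u2))

print(f"ET0 ≈ {fao56_et0(t_mean=25.0, rh_mean=30.0, u2=2.0, rn=20.0):.2f} mm/day")
```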

  19. Deriving dynamic marketing effectiveness from econometric time series models

    OpenAIRE

    Horváth, C.; Franses, Ph.H.B.F.

    2003-01-01

    To understand the relevance of marketing efforts, it has become standard practice to estimate the long-run and short-run effects of the marketing-mix, using, say, weekly scanner data. A common vehicle for this purpose is an econometric time series model. Issues that are addressed in the literature are unit roots, cointegration, structural breaks and impulse response functions. In this paper we summarize the most important concepts by reviewing all possible empirical cases that can...

  20. Identification of neutral biochemical network models from time series data

    Directory of Open Access Journals (Sweden)

    Maia Marco

    2009-05-01

    Full Text Available Background: The major difficulty in modeling biological systems from multivariate time series is the identification of parameter sets that endow a model with dynamical behaviors sufficiently similar to the experimental data. Directly related to this parameter estimation issue is the task of identifying the structure and regulation of ill-characterized systems. Both tasks are simplified if the mathematical model is canonical, i.e., if it is constructed according to strict guidelines. Results: In this report, we propose a method for the identification of admissible parameter sets of canonical S-systems from biological time series. The method is based on a Monte Carlo process that is combined with an improved version of our previous parameter optimization algorithm. The method maps the parameter space into the network space, which characterizes the connectivity among components, by creating an ensemble of decoupled S-system models that imitate the dynamical behavior of the time series with sufficient accuracy. The concept of sloppiness is revisited in the context of these S-system models with an exploration not only of different parameter sets that produce similar dynamical behaviors but also different network topologies that yield dynamical similarity. Conclusion: The proposed parameter estimation methodology was applied to actual time series data from the glycolytic pathway of the bacterium Lactococcus lactis and led to ensembles of models with different network topologies. In parallel, the parameter optimization algorithm was applied to the same dynamical data upon imposing a pre-specified network topology derived from prior biological knowledge, and the results from both strategies were compared. The results suggest that the proposed method may serve as a powerful exploration tool for testing hypotheses and the design of new experiments.

  1. Identification of neutral biochemical network models from time series data.

    Science.gov (United States)

    Vilela, Marco; Vinga, Susana; Maia, Marco A Grivet Mattoso; Voit, Eberhard O; Almeida, Jonas S

    2009-05-05

    The major difficulty in modeling biological systems from multivariate time series is the identification of parameter sets that endow a model with dynamical behaviors sufficiently similar to the experimental data. Directly related to this parameter estimation issue is the task of identifying the structure and regulation of ill-characterized systems. Both tasks are simplified if the mathematical model is canonical, i.e., if it is constructed according to strict guidelines. In this report, we propose a method for the identification of admissible parameter sets of canonical S-systems from biological time series. The method is based on a Monte Carlo process that is combined with an improved version of our previous parameter optimization algorithm. The method maps the parameter space into the network space, which characterizes the connectivity among components, by creating an ensemble of decoupled S-system models that imitate the dynamical behavior of the time series with sufficient accuracy. The concept of sloppiness is revisited in the context of these S-system models with an exploration not only of different parameter sets that produce similar dynamical behaviors but also different network topologies that yield dynamical similarity. The proposed parameter estimation methodology was applied to actual time series data from the glycolytic pathway of the bacterium Lactococcus lactis and led to ensembles of models with different network topologies. In parallel, the parameter optimization algorithm was applied to the same dynamical data upon imposing a pre-specified network topology derived from prior biological knowledge, and the results from both strategies were compared. The results suggest that the proposed method may serve as a powerful exploration tool for testing hypotheses and the design of new experiments.
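
    The Monte Carlo ensemble idea described in the two records above can be illustrated with a toy decoupled S-system for one variable: parameter sets are sampled at random, scored against the observed slopes, and the best-fitting draws are kept as the model ensemble. Everything below (data, parameter bounds, acceptance rule) is an illustrative simplification, not the authors' implementation.

```python
# Toy Monte Carlo ensemble for one equation of a decoupled S-system.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 50)
X = np.vstack([1 + 0.5 * np.sin(0.3 * t), 2 + 0.2 * np.cos(0.3 * t)]).T  # toy 2-variable time series
slopes = np.gradient(X[:, 0], t)          # observed dX1/dt (decoupled formulation)

def s_system_rhs(X, alpha, g, beta, h):
    """dX1/dt = alpha * prod_j(Xj^g_j) - beta * prod_j(Xj^h_j)."""
    return alpha * np.prod(X ** g, axis=1) - beta * np.prod(X ** h, axis=1)

draws, errors = [], []
for _ in range(20000):
    theta = (rng.uniform(0, 5), rng.uniform(-2, 2, 2), rng.uniform(0, 5), rng.uniform(-2, 2, 2))
    draws.append(theta)
    errors.append(np.mean((s_system_rhs(X, *theta) - slopes) ** 2))

# Keep the best-fitting draws as the ensemble (the paper uses an explicit
# accuracy criterion; a simple quantile cut-off is used here for brevity).
keep = np.argsort(errors)[:200]
ensemble = [draws[i] for i in keep]
print(len(ensemble), "parameter sets; near-zero kinetic orders g, h suggest absent connections")
```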

  2. Financial Time Series Prediction Using Elman Recurrent Random Neural Networks

    Directory of Open Access Journals (Sweden)

    Jie Wang

    2016-01-01

    (ERNN), the empirical results show that the proposed neural network displays the best performance among these neural networks in financial time series forecasting. Further, the empirical research is performed in testing the predictive effects of SSE, TWSE, KOSPI, and Nikkei225 with the established model, and the corresponding statistical comparisons of the above market indices are also exhibited. The experimental results show that this approach gives good performance in predicting the values from the stock market indices.

  3. Time series regression model for infectious disease and weather.

    Science.gov (United States)

    Imai, Chisato; Armstrong, Ben; Chalabi, Zaid; Mangtani, Punam; Hashizume, Masahiro

    2015-10-01

    Time series regression has been developed and long used to evaluate the short-term associations of air pollution and weather with mortality or morbidity of non-infectious diseases. The application of the regression approaches from this tradition to infectious diseases, however, is less well explored and raises some new issues. We discuss and present potential solutions for five issues often arising in such analyses: changes in immune population, strong autocorrelations, a wide range of plausible lag structures and association patterns, seasonality adjustments, and large overdispersion. The potential approaches are illustrated with datasets of cholera cases and rainfall from Bangladesh and influenza and temperature in Tokyo. Though this article focuses on the application of the traditional time series regression to infectious diseases and weather factors, we also briefly introduce alternative approaches, including mathematical modeling, wavelet analysis, and autoregressive integrated moving average (ARIMA) models. Modifications proposed to standard time series regression practice include using sums of past cases as proxies for the immune population, and using the logarithm of lagged disease counts to control autocorrelation due to true contagion, both of which are motivated from "susceptible-infectious-recovered" (SIR) models. The complexity of lag structures and association patterns can often be informed by biological mechanisms and explored by using distributed lag non-linear models. For overdispersed models, alternative distribution models such as quasi-Poisson and negative binomial should be considered. Time series regression can be used to investigate dependence of infectious diseases on weather, but may need modifying to allow for features specific to this context. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
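
    As a hedged illustration of the modifications listed above, the sketch below fits a quasi-Poisson time series regression with a lagged log-cases term and simple seasonal harmonics using statsmodels; the data file and column names are hypothetical.

```python
# Quasi-Poisson time series regression of weekly case counts on temperature,
# with a lagged log-cases term (SIR-motivated autocorrelation control) and
# sine/cosine seasonality terms. The CSV and columns are placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("weekly_cases.csv")                      # columns: cases, temperature, week
df["log_lag_cases"] = np.log(df["cases"].shift(1) + 1)    # proxy for contagion-driven autocorrelation
df["sin52"] = np.sin(2 * np.pi * df["week"] / 52)         # simple seasonality adjustment
df["cos52"] = np.cos(2 * np.pi * df["week"] / 52)
df = df.dropna()

X = sm.add_constant(df[["temperature", "log_lag_cases", "sin52", "cos52"]])
# A Poisson family with a freely estimated (Pearson chi-square) scale gives
# quasi-Poisson inference, which accommodates overdispersion.
result = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit(scale="X2")
print(result.summary())
```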

  4. Seglearn: A Python Package for Learning Sequences and Time Series

    OpenAIRE

    Burns, David M.; Whyne, Cari M.

    2018-01-01

    Seglearn is an open-source python package for machine learning time series or sequences using a sliding window segmentation approach. The implementation provides a flexible pipeline for tackling classification, regression, and forecasting problems with multivariate sequence and contextual data. This package is compatible with scikit-learn and is listed under scikit-learn Related Projects. The package depends on numpy, scipy, and scikit-learn. Seglearn is distributed under the BSD 3-Clause Lic...
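
    The sliding-window segmentation that Seglearn builds its pipeline around can be sketched in plain NumPy (this shows the general idea only, not the package's own API):

```python
# Generic sliding-window segmentation of a multivariate series into fixed-width,
# overlapping windows, from which per-window features can then be computed.
import numpy as np

def sliding_windows(X, width=100, overlap=0.5):
    """Return an array of shape (n_windows, width, n_channels)."""
    step = max(1, int(width * (1 - overlap)))
    starts = range(0, len(X) - width + 1, step)
    return np.stack([X[s:s + width] for s in starts])

X = np.random.randn(1000, 3)            # e.g. a 3-channel sensor recording
windows = sliding_windows(X, width=100, overlap=0.5)
features = windows.mean(axis=1)         # one simple feature per channel per window
print(windows.shape, features.shape)    # (19, 100, 3) (19, 3)
```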

  5. Analyses of GIMMS NDVI Time Series in Kogi State, Nigeria

    Science.gov (United States)

    Palka, Jessica; Wessollek, Christine; Karrasch, Pierre

    2017-10-01

    The value of remote sensing data is particularly evident where an areal monitoring is needed to provide information on the earth's surface development. The use of temporal high resolution time series data allows for detecting short-term changes. In Kogi State in Nigeria different vegetation types can be found. As the major population in this region is living in rural communities with crop farming the existing vegetation is slowly being altered. The expansion of agricultural land causes loss of natural vegetation, especially in the regions close to the rivers which are suitable for crop production. With regard to these facts, two questions can be dealt with covering different aspects of the development of vegetation in the Kogi state, the determination and evaluation of the general development of the vegetation in the study area (trend estimation) and analyses on a short-term behavior of vegetation conditions, which can provide information about seasonal effects in vegetation development. For this purpose, the GIMMS-NDVI data set, provided by the NOAA, provides information on the normalized difference vegetation index (NDVI) in a geometric resolution of approx. 8 km. The temporal resolution of 15 days allows the already described analyses. For the presented analysis data for the period 1981-2012 (31 years) were used. The implemented workflow mainly applies methods of time series analysis. The results show that in addition to the classical seasonal development, artefacts of different vegetation periods (several NDVI maxima) can be found in the data. The trend component of the time series shows a consistently positive development in the entire study area considering the full investigation period of 31 years. However, the results also show that this development has not been continuous and a simple linear modeling of the NDVI increase is only possible to a limited extent. For this reason, the trend modeling was extended by procedures for detecting structural breaks in

  6. The complexity of carbon flux time series in Europe

    Science.gov (United States)

    Lange, Holger; Sippel, Sebastian

    2014-05-01

    Observed geophysical time series usually exhibit pronounced variability, part of which is process-related and deterministic ("signal"), another part is due to random fluctuations ("noise"). To discern these two sources for fluctuations is notoriously difficult using conventional analysis methods, unless sophisticated model assumptions are made. Here, we present an almost parameter-free innovative approach with the potential to draw a distinction between deterministic processes and structured noise, based on ordinal pattern statistics. The method determines one measure for the information content of time series (Shannon entropy) and two complexity measures, one based on global properties of the order pattern distribution (Jensen-Shannon complexity) and one based on local (derivative) properties (Fisher information or complexity). Each time series gets classified via its location in an entropy-complexity plane; using this representation, the method draws a qualitative distinction between different types of natural processes. As a case study, we investigate Gross Primary Productivity (GPP) and respiration which are key variables in terrestrial ecosystems quantifying carbon allocation and biomass growth of vegetation. Changes in GPP and ecosystem respiration can be induced by land use change, environmental disasters or extreme events, and changing climate. Numerous attempts to quantify these variables on larger spatial scales exist. Here, we investigate gridded time series at monthly resolution for the European continent either based on upscaled measurements ("observations") or modelled with two different process-based terrestrial ecosystem models ("simulations"). The complexity analysis is either visualized as maps of Europe showing "hotspots" of complexity for GPP and respiration, or used to provide a detailed observations-simulations and model-model comparison. Values found for information and complexity will be compared to known artificial reference processes
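
    A compact sketch of the ordinal-pattern statistics mentioned above (permutation entropy and Jensen-Shannon complexity); the embedding order and the test series are illustrative choices.

```python
# Permutation (Shannon) entropy and Jensen-Shannon statistical complexity
# from ordinal patterns of a time series.
import math
from collections import Counter
from itertools import permutations
import numpy as np

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def entropy_complexity(x, order=4):
    n_pat = math.factorial(order)
    counts = Counter(tuple(np.argsort(x[i:i + order]).tolist())
                     for i in range(len(x) - order + 1))
    p = np.array([counts.get(pat, 0) for pat in permutations(range(order))], float)
    p /= p.sum()
    pe = np.full(n_pat, 1.0 / n_pat)                 # uniform reference distribution
    h_norm = shannon(p) / np.log(n_pat)              # normalised permutation entropy
    js = shannon((p + pe) / 2) - shannon(p) / 2 - shannon(pe) / 2
    js_max = -0.5 * ((n_pat + 1) / n_pat * np.log(n_pat + 1)
                     - 2 * np.log(2 * n_pat) + np.log(n_pat))
    return h_norm, (js / js_max) * h_norm            # (entropy, Jensen-Shannon complexity)

rng = np.random.default_rng(1)
print(entropy_complexity(rng.normal(size=5000)))     # white noise: high entropy, low complexity
```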

  7. Statistical Inference Methods for Sparse Biological Time Series Data

    Directory of Open Access Journals (Sweden)

    Voit Eberhard O

    2011-04-01

    Full Text Available Abstract Background Comparing metabolic profiles under different biological perturbations has become a powerful approach to investigating the functioning of cells. The profiles can be taken as single snapshots of a system, but more information is gained if they are measured longitudinally over time. The results are short time series consisting of relatively sparse data that cannot be analyzed effectively with standard time series techniques, such as autocorrelation and frequency domain methods. In this work, we study longitudinal time series profiles of glucose consumption in the yeast Saccharomyces cerevisiae under different temperatures and preconditioning regimens, which we obtained with methods of in vivo nuclear magnetic resonance (NMR) spectroscopy. For the statistical analysis we first fit several nonlinear mixed effect regression models to the longitudinal profiles and then used an ANOVA likelihood ratio method in order to test for significant differences between the profiles. Results The proposed methods are capable of distinguishing metabolic time trends resulting from different treatments and associate significance levels to these differences. Among several nonlinear mixed-effects regression models tested, a three-parameter logistic function represents the data with highest accuracy. ANOVA and likelihood ratio tests suggest that there are significant differences between the glucose consumption rate profiles for cells that had been--or had not been--preconditioned by heat during growth. Furthermore, pair-wise t-tests reveal significant differences in the longitudinal profiles for glucose consumption rates between optimal conditions and heat stress, optimal and recovery conditions, and heat stress and recovery conditions (p-values Conclusion We have developed a nonlinear mixed effects model that is appropriate for the analysis of sparse metabolic and physiological time profiles. The model permits sound statistical inference procedures

  8. Reconstruction of network topology using status-time-series data

    Science.gov (United States)

    Pandey, Pradumn Kumar; Badarla, Venkataramana

    2018-01-01

    Uncovering the heterogeneous connection pattern of a networked system from the available status-time-series (STS) data of a dynamical process on the network is of great interest in network science and known as a reverse engineering problem. Dynamical processes on a network are affected by the structure of the network. The dependency between the diffusion dynamics and structure of the network can be utilized to retrieve the connection pattern from the diffusion data. Information of the network structure can help to devise the control of dynamics on the network. In this paper, we consider the problem of network reconstruction from the available status-time-series (STS) data using matrix analysis. The proposed method of network reconstruction from the STS data is tested successfully under susceptible-infected-susceptible (SIS) diffusion dynamics on real-world and computer-generated benchmark networks. High accuracy and efficiency of the proposed reconstruction procedure from the status-time-series data define the novelty of the method. Our proposed method outperforms compressed sensing theory (CST) based method of network reconstruction using STS data. Further, the same procedure of network reconstruction is applied to the weighted networks. The ordering of the edges in the weighted networks is identified with high accuracy.

  9. Time series analysis for psychological research: examining and forecasting change

    Science.gov (United States)

    Jebb, Andrew T.; Tay, Louis; Wang, Wei; Huang, Qiming

    2015-01-01

    Psychological research has increasingly recognized the importance of integrating temporal dynamics into its theories, and innovations in longitudinal designs and analyses have allowed such theories to be formalized and tested. However, psychological researchers may be relatively unequipped to analyze such data, given its many characteristics and the general complexities involved in longitudinal modeling. The current paper introduces time series analysis to psychological research, an analytic domain that has been essential for understanding and predicting the behavior of variables across many diverse fields. First, the characteristics of time series data are discussed. Second, different time series modeling techniques are surveyed that can address various topics of interest to psychological researchers, including describing the pattern of change in a variable, modeling seasonal effects, assessing the immediate and long-term impact of a salient event, and forecasting future values. To illustrate these methods, an illustrative example based on online job search behavior is used throughout the paper, and a software tutorial in R for these analyses is provided in the Supplementary Materials. PMID:26106341

  10. Genetic programming and serial processing for time series classification.

    Science.gov (United States)

    Alfaro-Cid, Eva; Sharman, Ken; Esparcia-Alcázar, Anna I

    2014-01-01

    This work describes an approach devised by the authors for time series classification. In our approach genetic programming is used in combination with a serial processing of data, where the last output is the result of the classification. The use of genetic programming for classification, although still a field where more research is needed, is not new. However, the application of genetic programming to classification tasks is normally done by considering the input data as a feature vector. That is, to the best of our knowledge, there are no examples in the genetic programming literature of approaches where the time series data are processed serially and the last output is considered as the classification result. The serial processing approach presented here fills a gap in the existing literature. This approach was tested in three different problems. Two of them are real world problems whose data were gathered for online or conference competitions. As there are published results of these two problems this gives us the chance to compare the performance of our approach against top performing methods. The serial processing of data in combination with genetic programming obtained competitive results in both competitions, showing its potential for solving time series classification problems. The main advantage of our serial processing approach is that it can easily handle very large datasets.

  11. Modeling financial time series with S-plus

    CERN Document Server

    Zivot, Eric

    2003-01-01

    The field of financial econometrics has exploded over the last decade. This book represents an integration of theory, methods, and examples using the S-PLUS statistical modeling language and the S+FinMetrics module to facilitate the practice of financial econometrics. This is the first book to show the power of S-PLUS for the analysis of time series data. It is written for researchers and practitioners in the finance industry, academic researchers in economics and finance, and advanced MBA and graduate students in economics and finance. Readers are assumed to have a basic knowledge of S-PLUS and a solid grounding in basic statistics and time series concepts. Eric Zivot is an associate professor and Gary Waterman Distinguished Scholar in the Economics Department at the University of Washington, and is co-director of the nascent Professional Master's Program in Computational Finance. He regularly teaches courses on econometric theory, financial econometrics and time series econometrics, and is the recipient of the He...

  12. Clustering Multivariate Time Series Using Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Shima Ghassempour

    2014-03-01

    Full Text Available In this paper we describe an algorithm for clustering multivariate time series with variables taking both categorical and continuous values. Time series of this type are frequent in health care, where they represent the health trajectories of individuals. The problem is challenging because categorical variables make it difficult to define a meaningful distance between trajectories. We propose an approach based on Hidden Markov Models (HMMs), where we first map each trajectory into an HMM, then define a suitable distance between HMMs and finally proceed to cluster the HMMs with a method based on a distance matrix. We test our approach on a simulated, but realistic, data set of 1,255 trajectories of individuals of age 45 and over, on a synthetic validation set with known clustering structure, and on a smaller set of 268 trajectories extracted from the longitudinal Health and Retirement Survey. The proposed method can be implemented quite simply using standard packages in R and Matlab and may be a good candidate for solving the difficult problem of clustering multivariate time series with categorical variables using tools that do not require advanced statistical knowledge, and therefore are accessible to a wide range of researchers.
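
    A rough sketch of the clustering recipe just described: fit one HMM per trajectory, build a pairwise dissimilarity matrix from cross-likelihoods, and cluster that matrix. It assumes continuous-valued trajectories and the hmmlearn package, whereas the paper also handles categorical variables.

```python
# HMM-based clustering of trajectories: one GaussianHMM per trajectory, a
# symmetrised cross-likelihood dissimilarity, then hierarchical clustering.
import numpy as np
from hmmlearn.hmm import GaussianHMM
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(2)
trajectories = [rng.normal(loc=k % 2, size=(50, 2)) for k in range(10)]  # toy data, two groups

models = []
for traj in trajectories:
    m = GaussianHMM(n_components=2, covariance_type="diag", n_iter=50)
    m.fit(traj)
    models.append(m)

n = len(trajectories)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        # per-sample log-likelihood loss when swapping models, symmetrised
        dij = (models[i].score(trajectories[i]) - models[j].score(trajectories[i])) / len(trajectories[i])
        dji = (models[j].score(trajectories[j]) - models[i].score(trajectories[j])) / len(trajectories[j])
        D[i, j] = D[j, i] = max(0.0, (dij + dji) / 2)

labels = fcluster(linkage(squareform(D, checks=False), method="average"), t=2, criterion="maxclust")
print(labels)
```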

  13. Learning restricted Boolean network model by time-series data.

    Science.gov (United States)

    Ouyang, Hongjia; Fang, Jie; Shen, Liangzhong; Dougherty, Edward R; Liu, Wenbin

    2014-01-01

    Restricted Boolean networks are simplified Boolean networks that are required for either negative or positive regulations between genes. Higa et al. (BMC Proc 5:S5, 2011) proposed a three-rule algorithm to infer a restricted Boolean network from time-series data. However, the algorithm suffers from a major drawback, namely, it is very sensitive to noise. In this paper, we systematically analyze the regulatory relationships between genes based on the state switch of the target gene and propose an algorithm with which restricted Boolean networks may be inferred from time-series data. We compare the proposed algorithm with the three-rule algorithm and the best-fit algorithm based on both synthetic networks and a well-studied budding yeast cell cycle network. The performance of the algorithms is evaluated by three distance metrics: the normalized-edge Hamming distance [Formula: see text], the normalized Hamming distance of state transition [Formula: see text], and the steady-state distribution distance μ (ssd). Results show that the proposed algorithm outperforms the others according to both [Formula: see text] and [Formula: see text], whereas its performance according to μ (ssd) is intermediate between best-fit and the three-rule algorithms. Thus, our new algorithm is more appropriate for inferring interactions between genes from time-series data.

  14. Cross-sample entropy of foreign exchange time series

    Science.gov (United States)

    Liu, Li-Zhi; Qian, Xi-Yuan; Lu, Heng-Yao

    2010-11-01

    The correlation of foreign exchange rates in currency markets is investigated based on the empirical data of DKK/USD, NOK/USD, CAD/USD, JPY/USD, KRW/USD, SGD/USD, THB/USD and TWD/USD for a period from 1995 to 2002. Cross-SampEn (cross-sample entropy) method is used to compare the returns of every two exchange rate time series to assess their degree of asynchrony. The calculation method of confidence interval of SampEn is extended and applied to cross-SampEn. The cross-SampEn and its confidence interval for every two of the exchange rate time series in periods 1995-1998 (before the Asian currency crisis) and 1999-2002 (after the Asian currency crisis) are calculated. The results show that the cross-SampEn of every two of these exchange rates becomes higher after the Asian currency crisis, indicating a higher asynchrony between the exchange rates. Especially for Singapore, Thailand and Taiwan, the cross-SampEn values after the Asian currency crisis are significantly higher than those before the Asian currency crisis. Comparison with the correlation coefficient shows that cross-SampEn is superior to describe the correlation between time series.
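
    The cross-sample entropy computation itself is short enough to sketch directly; m = 2 and r = 0.2 standard deviations below are conventional choices, not necessarily those of the study.

```python
# Minimal cross-sample entropy (cross-SampEn) between two return series.
import numpy as np

def cross_sampen(u, v, m=2, r=0.2):
    u = (u - u.mean()) / u.std()
    v = (v - v.mean()) / v.std()

    def count_matches(k):
        # templates of length k from u compared against all templates in v
        ut = np.array([u[i:i + k] for i in range(len(u) - k)])
        vt = np.array([v[i:i + k] for i in range(len(v) - k)])
        dist = np.max(np.abs(ut[:, None, :] - vt[None, :, :]), axis=2)  # Chebyshev distance
        return np.sum(dist <= r)

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b)          # higher value = more asynchrony

rng = np.random.default_rng(3)
x = rng.normal(size=500)
y = 0.7 * x + 0.3 * rng.normal(size=500)         # partly synchronised series
print(cross_sampen(x, y), cross_sampen(x, rng.normal(size=500)))
```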

  15. Coastline detection with time series of SAR images

    Science.gov (United States)

    Ao, Dongyang; Dumitru, Octavian; Schwarz, Gottfried; Datcu, Mihai

    2017-10-01

    For maritime remote sensing, coastline detection is a vital task. With continuous coastline detection results from satellite image time series, the actual shoreline, the sea level, and environmental parameters can be observed to support coastal management and disaster warning. Established coastline detection methods are often based on SAR images and well-known image processing approaches. These methods involve a lot of complicated data processing, which is a big challenge for remote sensing time series. Additionally, a number of SAR satellites operating with polarimetric capabilities have been launched in recent years, and many investigations of target characteristics in radar polarization have been performed. In this paper, a fast and efficient coastline detection method is proposed which comprises three steps. First, we calculate a modified correlation coefficient of two SAR images of different polarization. This coefficient differs from the traditional computation where normalization is needed. Through this modified approach, the separation between sea and land becomes more prominent. Second, we set a histogram-based threshold to distinguish between sea and land within the given image. The histogram is derived from the statistical distribution of the polarized SAR image pixel amplitudes. Third, we extract continuous coastlines using a Canny image edge detector that is rather immune to speckle noise. Finally, the individual coastlines derived from time series of SAR images can be checked for changes.
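
    The three steps can be sketched as follows, assuming two co-registered SAR amplitude channels as NumPy arrays and using scikit-image for the threshold and edge detector; the "modified correlation" below is only a plausible stand-in for the authors' definition.

```python
# Three-step coastline sketch: channel correlation, histogram threshold, Canny edges.
import numpy as np
from scipy.ndimage import uniform_filter
from skimage.filters import threshold_otsu
from skimage.feature import canny

def coastline(vv, vh, window=9):
    corr = uniform_filter(vv * vh, size=window)   # local mean of the channel product (no normalisation)
    land = corr > threshold_otsu(corr)            # histogram-based sea/land split
    return canny(land.astype(float), sigma=2)     # coastline = edge of the land mask

rng = np.random.default_rng(4)
vv = rng.rayleigh(size=(256, 256)); vv[:, 128:] *= 4      # brighter "land" half
vh = rng.rayleigh(size=(256, 256)); vh[:, 128:] *= 4
edges = coastline(vv, vh)
print(edges.sum(), "edge pixels")
```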

  16. Earthquake forecasting studies using radon time series data in Taiwan

    Science.gov (United States)

    Walia, Vivek; Kumar, Arvind; Fu, Ching-Chou; Lin, Shih-Jung; Chou, Kuang-Wu; Wen, Kuo-Liang; Chen, Cheng-Hong

    2017-04-01

    For a few decades, a growing number of studies have shown the usefulness of data in the field of seismogeochemistry, interpreted as geochemical precursory signals for impending earthquakes, and radon is identified as one of the most reliable geochemical precursors. Radon is recognized as a short-term precursor and is being monitored in many countries. This study is aimed at developing an effective earthquake forecasting system by inspecting long-term radon time series data. The data are obtained from a network of radon monitoring stations established along different faults of Taiwan. The continuous time series radon data for earthquake studies have been recorded and some significant variations associated with strong earthquakes have been observed. The data are also examined to evaluate earthquake precursory signals against environmental factors. An automated real-time database operating system has been developed recently to improve the data processing for earthquake precursory studies. In addition, the study is aimed at the appraisal and filtering of these environmental parameters, in order to create a real-time database that helps our earthquake precursory study. In recent years, an automatically operating real-time database has been developed using R, an open source programming language, to carry out statistical computation on the data. To integrate our data with our working procedure, we use the popular open source web application stack AMP (Apache, MySQL, and PHP), creating a website that effectively shows and helps us manage the real-time database.

  17. Forecasting long memory time series under a break in persistence

    DEFF Research Database (Denmark)

    Heinen, Florian; Sibbertsen, Philipp; Kruse, Robinson

    We consider the problem of forecasting time series with long memory when the memory parameter is subject to a structural break. By means of a large-scale Monte Carlo study we show that ignoring such a change in persistence leads to substantially reduced forecasting precision. The strength of this effect depends on whether the memory parameter is increasing or decreasing over time. A comparison of six forecasting strategies allows us to conclude that pre-testing for a change in persistence is highly recommendable in our setting. In addition we provide an empirical example which underlines...

  18. Extracting the relevant delays in time series modelling

    DEFF Research Database (Denmark)

    Goutte, Cyril

    1997-01-01

    In this contribution, we suggest a convenient way to use generalisation error to extract the relevant delays from a time-varying process, i.e. the delays that lead to the best prediction performance. We design a generalisation-based algorithm that takes its inspiration from traditional variable selection, and more precisely stepwise forward selection. The method is compared to other forward selection schemes, as well as to a nonparametric test aimed at estimating the embedding dimension of time series. The final application extends these results to the efficient estimation of FIR filters on some...

  19. Accelerating molecular dynamics simulations by linear prediction of time series

    Science.gov (United States)

    Brutovsky, B.; Mülders, T.; Kneller, G. R.

    2003-04-01

    We present a molecular dynamics simulation scheme which allows one to speed up molecular dynamics simulations by linear prediction of force time series. The explicit calculation of nonbonding forces is periodically replaced by linear prediction from past values. Applying our method to liquid oxygen consisting of flexible molecules, we obtained real speedups between 5.4 and 6.5 compared to conventional molecular dynamics simulations. Here only the bond-stretching forces were calculated at each time step. We demonstrate that essential dynamical quantities, such as the mean-square displacement and the velocity autocorrelation function, are preserved.
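
    The linear-prediction step can be illustrated with a least-squares predictor of the next sample of a (here synthetic) force time series from its p previous values:

```python
# Fit linear-prediction coefficients from past force values and predict the
# next sample; the "force" series here is synthetic.
import numpy as np

def fit_predictor(f, p=8):
    # rows are lag vectors [f[t-1], ..., f[t-p]], target is f[t]
    A = np.array([f[t - p:t][::-1] for t in range(p, len(f))])
    b = f[p:]
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs

t = np.linspace(0, 20, 2000)
force = np.sin(3 * t) + 0.05 * np.random.default_rng(5).normal(size=t.size)
c = fit_predictor(force, p=8)
predicted_next = force[-8:][::-1] @ c      # linear prediction of the next sample
print(predicted_next)
```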

  20. Satellite Image Time Series Decomposition Based on EEMD

    Directory of Open Access Journals (Sweden)

    Yun-long Kong

    2015-11-01

    Full Text Available Satellite Image Time Series (SITS) have recently been of great interest due to the emerging remote sensing capabilities for Earth observation. Trend and seasonal components are two crucial elements of SITS. In this paper, a novel framework of SITS decomposition based on Ensemble Empirical Mode Decomposition (EEMD) is proposed. EEMD is achieved by sifting an ensemble of adaptive orthogonal components called Intrinsic Mode Functions (IMFs). EEMD is noise-assisted and overcomes the drawback of mode mixing in conventional Empirical Mode Decomposition (EMD). Inspired by these advantages, the aim of this work is to employ EEMD to decompose SITS into IMFs and to choose relevant IMFs for the separation of seasonal and trend components. In a series of simulations, IMFs extracted by EEMD achieved a clear representation with physical meaning. The experimental results of 16-day compositions of Moderate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) and Global Environment Monitoring Index (GEMI) time series with disturbance illustrated the effectiveness and stability of the proposed approach to monitoring tasks, such as applications for the detection of abrupt changes.
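
    A sketch of the decomposition, assuming the PyEMD package (pip install EMD-signal) is available; the synthetic NDVI-like series and the split into "seasonal" and "trend" IMFs are illustrative, since in practice the relevant IMFs would be chosen from their mean periods.

```python
# Decompose an NDVI-like series with EEMD and regroup IMFs into seasonal
# and trend parts (assumes the PyEMD package).
import numpy as np
from PyEMD import EEMD

t = np.arange(0, 10, 1 / 23)                       # roughly 23 composites per year, 10 years
ndvi = (0.3 + 0.02 * t + 0.2 * np.sin(2 * np.pi * t)
        + 0.03 * np.random.default_rng(6).normal(size=t.size))

eemd = EEMD(trials=100)
imfs = eemd.eemd(ndvi, t)                          # rows: IMFs, last rows are low-frequency/residue

seasonal = imfs[:-2].sum(axis=0)                   # higher-frequency IMFs
trend = imfs[-2:].sum(axis=0)                      # lowest-frequency IMFs plus residue
print(imfs.shape, seasonal.std(), trend[-1] - trend[0])
```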

  1. Time-series animation techniques for visualizing urban growth

    Science.gov (United States)

    Acevedo, W.; Masuoka, P.

    1997-01-01

    Time-series animation is a visually intuitive way to display urban growth. Animations of landuse change for the Baltimore-Washington region were generated by showing a series of images one after the other in sequential order. Before creating an animation, various issues which will affect the appearance of the animation should be considered, including the number of original data frames to use, the optimal animation display speed, the number of intermediate frames to create between the known frames, and the output media on which the animations will be displayed. To create new frames between the known years of data, the change in each theme (i.e. urban development, water bodies, transportation routes) must be characterized and an algorithm developed to create the in-between frames. Example time-series animations were created using a temporal GIS database of the Baltimore-Washington area. Creating the animations involved generating raster images of the urban development, water bodies, and principal transportation routes; overlaying the raster images on a background image; and importing the frames to a movie file. Three-dimensional perspective animations were created by draping each image over digital elevation data prior to importing the frames to a movie file. © 1997 Elsevier Science Ltd.

  2. Connectionist Architectures for Time Series Prediction of Dynamical Systems

    Science.gov (United States)

    Weigend, Andreas Sebastian

    We investigate the effectiveness of connectionist networks for predicting the future continuation of temporal sequences. The problem of overfitting, particularly serious for short records of noisy data, is addressed by the method of weight-elimination: a term penalizing network complexity is added to the usual cost function in back-propagation. We describe the dynamics of the procedure and clarify the meaning of the parameters involved. From a Bayesian perspective, the complexity term can be usefully interpreted as an assumption about prior distribution of the weights. We analyze three time series. On the benchmark sunspot series, the networks outperform traditional statistical approaches. We show that the network performance does not deteriorate when there are more input units than needed. In the second example, the notoriously noisy foreign exchange rates series, we pick one weekday and one currency (DM vs. US). Given exchange rate information up to and including a Monday, the task is to predict the rate for the following Tuesday. Weight-elimination manages to extract a significant part of the dynamics and makes the solution interpretable. In the third example, the networks predict the resource utilization of a chaotic computational ecosystem for hundreds of steps forward in time.

  3. Seasonality of Tuberculosis in Delhi, India: A Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Varun Kumar

    2014-01-01

    Full Text Available Background. It is highly cost effective to detect a seasonal trend in tuberculosis in order to optimize disease control and intervention. Although seasonal variation of tuberculosis has been reported from different parts of the world, no definite and consistent pattern has been observed. Therefore, the study was designed to find the seasonal variation of tuberculosis in Delhi, India. Methods. Retrospective record based study was undertaken in a Directly Observed Treatment Short course (DOTS) centre located in the south district of Delhi. Six-year data from January 2007 to December 2012 was analyzed. Expert modeler of SPSS ver. 21 software was used to fit the best suitable model for the time series data. Results. Autocorrelation function (ACF) and partial autocorrelation function (PACF) at lag 12 show significant peak suggesting seasonal component of the TB series. Seasonal adjusted factor (SAF) showed peak seasonal variation from March to May. Univariate model by expert modeler in the SPSS showed that Winter’s multiplicative model could best predict the time series data with 69.8% variability. The forecast shows declining trend with seasonality. Conclusion. A seasonal pattern and declining trend with variable amplitudes of fluctuation were observed in the incidence of tuberculosis.

  4. Seasonality of tuberculosis in delhi, India: a time series analysis.

    Science.gov (United States)

    Kumar, Varun; Singh, Abhay; Adhikary, Mrinmoy; Daral, Shailaja; Khokhar, Anita; Singh, Saudan

    2014-01-01

    Background. It is highly cost effective to detect a seasonal trend in tuberculosis in order to optimize disease control and intervention. Although seasonal variation of tuberculosis has been reported from different parts of the world, no definite and consistent pattern has been observed. Therefore, the study was designed to find the seasonal variation of tuberculosis in Delhi, India. Methods. Retrospective record based study was undertaken in a Directly Observed Treatment Short course (DOTS) centre located in the south district of Delhi. Six-year data from January 2007 to December 2012 was analyzed. Expert modeler of SPSS ver. 21 software was used to fit the best suitable model for the time series data. Results. Autocorrelation function (ACF) and partial autocorrelation function (PACF) at lag 12 show significant peak suggesting seasonal component of the TB series. Seasonal adjusted factor (SAF) showed peak seasonal variation from March to May. Univariate model by expert modeler in the SPSS showed that Winter's multiplicative model could best predict the time series data with 69.8% variability. The forecast shows declining trend with seasonality. Conclusion. A seasonal pattern and declining trend with variable amplitudes of fluctuation were observed in the incidence of tuberculosis.
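
    The "Winters' multiplicative" fit reported in the two records above can be reproduced in outline with statsmodels' Holt-Winters implementation; the CSV file and column name are placeholders, not the study's data, and a multiplicative seasonal component requires strictly positive counts.

```python
# Holt-Winters fit with multiplicative seasonality for a monthly TB-notification series.
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

tb = pd.read_csv("tb_monthly.csv", index_col=0, parse_dates=True)["cases"]

model = ExponentialSmoothing(tb, trend="add", seasonal="mul", seasonal_periods=12)
fit = model.fit()
print(fit.forecast(12))    # one-year forecast retaining the seasonal peaks
```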

  5. Linear and nonlinear dynamic systems in financial time series prediction

    Directory of Open Access Journals (Sweden)

    Salim Lahmiri

    2012-10-01

    Full Text Available Autoregressive moving average (ARMA) processes and dynamic neural networks, namely the nonlinear autoregressive moving average with exogenous inputs (NARX), are compared by evaluating their ability to predict financial time series; for instance the S&P500 returns. Two classes of ARMA are considered. The first one is the standard ARMA model which is a linear static system. The second one uses a Kalman filter (KF) to estimate and predict ARMA coefficients. This model is a linear dynamic system. The forecasting ability of each system is evaluated by means of mean absolute error (MAE) and mean absolute deviation (MAD) statistics. Simulation results indicate that the ARMA-KF system performs better than the standard ARMA alone. Thus, introducing dynamics into the ARMA process improves the forecasting accuracy. In addition, the ARMA-KF outperformed the NARX. This result may suggest that the linear component found in the S&P500 return series is more dominant than the nonlinear part. In sum, we conclude that introducing dynamics into the ARMA process provides an effective system for S&P500 time series prediction.

  6. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    Science.gov (United States)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-01-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it-an improved and generalized version of Bayesian Blocks [Scargle 1998]-that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by [Arias-Castro, Donoho and Huo 2003]. In the spirit of Reproducible Research [Donoho et al. (2008)] all of the code and data necessary to reproduce all of the figures in this paper are included as auxiliary material.
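
    The optimal-segmentation idea is available off the shelf in astropy; the sketch below applies it to a simulated measurement series with known errors (fitness="measures").

```python
# Bayesian Blocks segmentation of a simulated light curve with one change point.
import numpy as np
from astropy.stats import bayesian_blocks

rng = np.random.default_rng(7)
t = np.sort(rng.uniform(0, 100, 300))
flux = np.where(t < 60, 1.0, 1.8) + rng.normal(0, 0.2, t.size)   # step at t = 60
edges = bayesian_blocks(t, flux, sigma=0.2, fitness="measures")
print(edges)    # block edges of the optimal piecewise-constant representation
```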

  7. Assessing Coupling Dynamics from an Ensemble of Time Series

    Directory of Open Access Journals (Sweden)

    Germán Gómez-Herrero

    2015-04-01

    Full Text Available Finding interdependency relations between time series provides valuable knowledge about the processes that generated the signals. Information theory sets a natural framework for important classes of statistical dependencies. However, a reliable estimation from information-theoretic functionals is hampered when the dependency to be assessed is brief or evolves in time. Here, we show that these limitations can be partly alleviated when we have access to an ensemble of independent repetitions of the time series. In particular, we gear a data-efficient estimator of probability densities to make use of the full structure of trial-based measures. By doing so, we can obtain time-resolved estimates for a family of entropy combinations (including mutual information, transfer entropy and their conditional counterparts), which are more accurate than the simple average of individual estimates over trials. We show with simulated and real data generated by coupled electronic circuits that the proposed approach allows one to recover the time-resolved dynamics of the coupling between different subsystems.

  8. Studies in Astronomical Time Series Analysis. VI. Bayesian Block Representations

    Science.gov (United States)

    Scargle, Jeffrey D.; Norris, Jay P.; Jackson, Brad; Chiang, James

    2013-02-01

    This paper addresses the problem of detecting and characterizing local variability in time series and other forms of sequential data. The goal is to identify and characterize statistically significant variations, at the same time suppressing the inevitable corrupting observational errors. We present a simple nonparametric modeling technique and an algorithm implementing it—an improved and generalized version of Bayesian Blocks—that finds the optimal segmentation of the data in the observation interval. The structure of the algorithm allows it to be used in either a real-time trigger mode, or a retrospective mode. Maximum likelihood or marginal posterior functions to measure model fitness are presented for events, binned counts, and measurements at arbitrary times with known error distributions. Problems addressed include those connected with data gaps, variable exposure, extension to piecewise linear and piecewise exponential representations, multivariate time series data, analysis of variance, data on the circle, other data modes, and dispersed data. Simulations provide evidence that the detection efficiency for weak signals is close to a theoretical asymptotic limit derived by Arias-Castro et al. In the spirit of Reproducible Research all of the code and data necessary to reproduce all of the figures in this paper are included as supplementary material.

  9. Time series analysis of the behavior of brazilian natural rubber

    Directory of Open Access Journals (Sweden)

    Antônio Donizette de Oliveira

    2009-03-01

    Full Text Available The natural rubber is a non-wood product obtained of the coagulation of some lattices of forest species, being Hevea brasiliensis the main one. Native from the Amazon Region, this species was already known by the Indians before the discovery of America. The natural rubber became a product globally valued due to its multiple applications in the economy, being its almost perfect substitute the synthetic rubber derived from the petroleum. Similarly to what happens with other countless products the forecast of future prices of the natural rubber has been object of many studies. The use of models of forecast of univariate time series stands out as the more accurate and useful to reduce the uncertainty in the economic decision making process. This study analyzed the historical series of prices of the Brazilian natural rubber (R$/kg), in the Jan/99 - Jun/2006 period, in order to characterize the rubber price behavior in the domestic market; estimated a model for the time series of monthly natural rubber prices; and foresaw the domestic prices of the natural rubber, in the Jul/2006 - Jun/2007 period, based on the estimated models. The studied models were the ones belonging to the ARIMA family. The main results were: the domestic market of the natural rubber is expanding due to the growth of the world economy; among the adjusted models, the ARIMA (1,1,1) model provided the best adjustment of the time series of prices of the natural rubber (R$/kg); the prognosis accomplished for the series supplied statistically adequate fittings.

  10. Prewhitening of hydroclimatic time series? Implications for inferred change and variability across time scales

    Science.gov (United States)

    Razavi, Saman; Vogel, Richard

    2018-02-01

    Prewhitening, the process of eliminating or reducing short-term stochastic persistence to enable detection of deterministic change, has been extensively applied to time series analysis of a range of geophysical variables. Despite the controversy around its utility, methodologies for prewhitening time series continue to be a critical feature of a variety of analyses including: trend detection of hydroclimatic variables and reconstruction of climate and/or hydrology through proxy records such as tree rings. With a focus on the latter, this paper presents a generalized approach to exploring the impact of a wide range of stochastic structures of short- and long-term persistence on the variability of hydroclimatic time series. Through this approach, we examine the impact of prewhitening on the inferred variability of time series across time scales. We document how a focus on prewhitened, residual time series can be misleading, as it can drastically distort (or remove) the structure of variability across time scales. Through examples with actual data, we show how such loss of information in prewhitened time series of tree rings (so-called "residual chronologies") can lead to the underestimation of extreme conditions in climate and hydrology, particularly droughts, reconstructed for centuries preceding the historical period.
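
    A minimal example of the prewhitening operation discussed above, removing lag-1 (AR(1)) persistence before further analysis, which also shows how the residual series can understate low-frequency variability:

```python
# AR(1) prewhitening: subtract the lag-1 persistence from a series.
import numpy as np

def prewhiten_ar1(x):
    x = np.asarray(x, float)
    r1 = np.corrcoef(x[:-1], x[1:])[0, 1]          # lag-1 autocorrelation
    return x[1:] - r1 * x[:-1], r1

rng = np.random.default_rng(8)
raw = np.cumsum(rng.normal(size=500)) * 0.05 + rng.normal(size=500)   # persistent series
residual, r1 = prewhiten_ar1(raw)
print(f"r1={r1:.2f}, raw std={raw.std():.2f}, prewhitened std={residual.std():.2f}")
```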

  11. Sequential Monte Carlo for inference of latent ARMA time-series with innovations correlated in time

    Science.gov (United States)

    Urteaga, Iñigo; Bugallo, Mónica F.; Djurić, Petar M.

    2017-12-01

    We consider the problem of sequential inference of latent time-series with innovations correlated in time and observed via nonlinear functions. We accommodate time-varying phenomena with diverse properties by means of a flexible mathematical representation of the data. We characterize statistically such time-series by a Bayesian analysis of their densities. The density that describes the transition of the state from time t to the next time instant t+1 is used for implementation of novel sequential Monte Carlo (SMC) methods. We present a set of SMC methods for inference of latent ARMA time-series with innovations correlated in time for different assumptions in knowledge of parameters. The methods operate in a unified and consistent manner for data with diverse memory properties. We show the validity of the proposed approach by comprehensive simulations of the challenging stochastic volatility model.

  12. Cluster analysis of activity-time series in motor learning

    DEFF Research Database (Denmark)

    Balslev, Daniela; Nielsen, Finn Årup; Frutiger, Sally A.

    2002-01-01

    Neuroimaging studies of learning focus on brain areas where the activity changes as a function of time. To circumvent the difficult problem of model selection, we used a data-driven analytic tool, cluster analysis, which extracts representative temporal and spatial patterns from the voxel-time series. The optimal number of clusters was chosen using a cross-validated likelihood method, which highlights the clustering pattern that generalizes best over the subjects. Data were acquired with PET at different time points during practice of a visuomotor task. The results from cluster analysis show practice-related activity in a fronto-parieto-cerebellar network, in agreement with previous studies of motor learning. These voxels were separated from a group of voxels showing an unspecific time-effect and another group of voxels, whose activation was an artifact from smoothing. Hum. Brain Mapping 15...

  13. Weighted statistical parameters for irregularly sampled time series

    Science.gov (United States)

    Rimoldini, Lorenzo

    2014-01-01

    Unevenly spaced time series are common in astronomy because of the day-night cycle, weather conditions, dependence on the source position in the sky, allocated telescope time and corrupt measurements, for example, or inherent to the scanning law of satellites like Hipparcos and the forthcoming Gaia. Irregular sampling often causes clumps of measurements and gaps with no data which can severely disrupt the values of estimators. This paper aims at improving the accuracy of common statistical parameters when linear interpolation (in time or phase) can be considered an acceptable approximation of a deterministic signal. A pragmatic solution is formulated in terms of a simple weighting scheme, adapting to the sampling density and noise level, applicable to large data volumes at minimal computational cost. Tests on time series from the Hipparcos periodic catalogue led to significant improvements in the overall accuracy and precision of the estimators with respect to the unweighted counterparts and those weighted by inverse-squared uncertainties. Automated classification procedures employing statistical parameters weighted by the suggested scheme confirmed the benefits of the improved input attributes. The classification of eclipsing binaries, Mira, RR Lyrae, Delta Cephei and Alpha2 Canum Venaticorum stars employing exclusively weighted descriptive statistics achieved an overall accuracy of 92 per cent, about 6 per cent higher than with unweighted estimators.

  14. Time-Series Analysis of the Impact of Prescription Drug Monitoring Programs on Heroin Treatment Admissions.

    Science.gov (United States)

    Branham, Douglas Keith

    2018-03-21

    Prescription drug abuse has become a major issue in the United States in recent years. Prescription drug monitoring programs (PDMPs) are designed to help health care providers to prevent such abuses. There may be unintended effects of these programs. Specifically, PDMPs may move prescription opioid users to begin use of heroin. This article aims to evaluate the impact of PDMPs on heroin abuse across several different states through use of treatment admissions records obtained from the Treatment Episode Data Set. Operational dates and other characteristics of state PDMPs were obtained from the Prescription Drug Monitoring Program Training and Technical Assistance Center. Data for the dependent variable were collected from the Treatment Episodes Data Set from 1992 to 2012. Interrupted time-series analyses using autoregressive integrated moving average modeling were used to estimate the effect of presence of an operational PDMP on the number of admissions reporting heroin as their primary drug being used. The relationship between heroin admissions and prescription opioid admissions was significant for the average data (β = 0.41, p = 0.0017) and the 5-year data (β = 0.5, p = 0.036), both showing positive associations between heroin and prescription drug admissions in states in the post PDMP implementation period. Conclusions/Importance: The study found a positive relationship between heroin and prescription opioid admissions post PDMP implementation. Future research should attempt to identify what this relationship means and how this information can be used to improve opioid policy.
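
    Tying this record back to the interrupted time series theme, a sketch of an ARIMA-with-intervention analysis: monthly admissions regressed on a step indicator for PDMP implementation inside a SARIMAX model. The data file, implementation date and model orders are hypothetical.

```python
# Interrupted time-series sketch: SARIMAX with a step-intervention regressor.
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

adm = pd.read_csv("heroin_admissions.csv", index_col=0, parse_dates=True)["admissions"]
pdmp = (adm.index >= "2007-01-01").astype(int)     # 0 before, 1 after the (hypothetical) PDMP start

model = SARIMAX(adm, exog=pdmp, order=(1, 1, 1), seasonal_order=(0, 1, 1, 12))
fit = model.fit(disp=False)
print(fit.summary())    # the exog coefficient estimates the post-intervention level shift
```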

  15. Mapping Brazilian savanna vegetation gradients with Landsat time series

    Science.gov (United States)

    Schwieder, Marcel; Leitão, Pedro J.; da Cunha Bustamante, Mercedes Maria; Ferreira, Laerte Guimarães; Rabe, Andreas; Hostert, Patrick

    2016-10-01

    Global change has tremendous impacts on savanna systems around the world. Processes related to climate change or agricultural expansion threaten the ecosystem's state, function and the services it provides. A prominent example is the Brazilian Cerrado that has an extent of around 2 million km2 and features high biodiversity with many endemic species. It is characterized by landscape patterns from open grasslands to dense forests, defining a heterogeneous gradient in vegetation structure throughout the biome. While it is undisputed that the Cerrado provides a multitude of valuable ecosystem services, it is exposed to changes, e.g. through large scale land conversions or climatic changes. Monitoring of the Cerrado is thus urgently needed to assess the state of the system as well as to analyze and further understand ecosystem responses and adaptations to ongoing changes. Therefore we explored the potential of dense Landsat time series to derive phenological information for mapping vegetation gradients in the Cerrado. Frequent data gaps, e.g. due to cloud contamination, impose a serious challenge for such time series analyses. We synthetically filled data gaps based on Radial Basis Function convolution filters to derive continuous pixel-wise temporal profiles capable of representing Land Surface Phenology (LSP). Derived phenological parameters revealed differences in the seasonal cycle between the main Cerrado physiognomies and could thus be used to calibrate a Support Vector Classification model to map their spatial distribution. Our results show that it is possible to map the main spatial patterns of the observed physiognomies based on their phenological differences, whereat inaccuracies occurred especially between similar classes and data-scarce areas. The outcome emphasizes the need for remote sensing based time series analyses at fine scales. Mapping heterogeneous ecosystems such as savannas requires spatial detail, as well as the ability to derive important
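
    The Radial Basis Function gap filling mentioned above can be sketched as a Gaussian-kernel weighted average over the valid observations; the series, compositing interval and kernel width are illustrative.

```python
# Gaussian-kernel (RBF) convolution filling of gaps in a per-pixel time series.
import numpy as np

def rbf_fill(t, y, sigma=20.0):
    """t in days, y with np.nan at cloudy dates; returns a gap-free series."""
    valid = ~np.isnan(y)
    w = np.exp(-0.5 * ((t[:, None] - t[None, valid]) / sigma) ** 2)
    return (w @ y[valid]) / w.sum(axis=1)

t = np.arange(0, 365, 8, dtype=float)                        # 8-day compositing interval
y = 0.4 + 0.25 * np.sin(2 * np.pi * t / 365)
y[np.random.default_rng(9).random(t.size) < 0.3] = np.nan    # simulated cloud gaps
print(rbf_fill(t, y)[:5])
```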

  16. GPS time series at Campi Flegrei caldera (2000-2013

    Directory of Open Access Journals (Sweden)

    Prospero De Martino

    2014-05-01

    Full Text Available The Campi Flegrei caldera is an active volcanic system associated with high volcanic risk, and represents a well-known and peculiar example of ground deformation (bradyseism), characterized by intense uplift periods followed by subsidence phases with some episodic superimposed mini-uplifts. Ground deformation is an important volcanic precursor, and its continuous monitoring is one of the main tools for short-term forecasting of eruptive activity. This paper provides an overview of the continuous GPS monitoring of the Campi Flegrei caldera from January 2000 to July 2013, including network operations, data recording and processing, and data products. In this period the GPS time series allowed continuous and accurate tracking of ground deformation of the area. Seven main uplift episodes were detected, and during each uplift period the recurrent horizontal displacement pattern, radial from the “caldera center”, suggests that no significant change in deformation source geometry and location occurred. The complete archive of GPS time series for the Campi Flegrei area is reported in the Supplementary materials. These data can be useful for the scientific community in improving research on Campi Flegrei caldera dynamics and hazard assessment.

  17. Exploratory joint and separate tracking of geographically related time series

    Science.gov (United States)

    Balasingam, Balakumar; Willett, Peter; Levchuk, Georgiy; Freeman, Jared

    2012-05-01

    Target tracking techniques have usually been applied to physical systems via radar, sonar or imaging modalities. But the same techniques - filtering, association, classification, track management - can be applied to nontraditional data such as one might find in fields like economics, business and national defense. In this paper we explore a particular data set. The measurements are time series collected at various sites; but other than that little is known about it. We shall refer to the data as representing the Megawatt hour (MWH) output of various power plants located in Afghanistan. We pose such questions as: 1. Which power plants seem to have a common model? 2. Do any power plants change their models with time? 3. Can power plant behavior be predicted, and if so, how far into the future? 4. Are some of the power plants stochastically linked? That is, does an observed lack of power demand at one power plant imply a surfeit of demand elsewhere? The observations seem well modeled as hidden Markov. This HMM modeling is compared to other approaches; and tests are extended to other (albeit self-generated) data sets with similar characteristics. Keywords: Time-series analysis, hidden Markov models, statistical similarity, clustering weighted

  18. Centrality measures in temporal networks with time series analysis

    Science.gov (United States)

    Huang, Qiangjuan; Zhao, Chengli; Zhang, Xue; Wang, Xiaojie; Yi, Dongyun

    2017-05-01

    The study of identifying important nodes in networks has wide application in different fields. However, current research is mostly based on static or aggregated networks. Recently, the increasing attention to networks with time-varying structure has promoted the study of node centrality in temporal networks. In this paper, we define a supra-evolution matrix to depict the temporal network structure. Using time series analysis, the relationships between different time layers can be learned automatically. Based on the special form of the supra-evolution matrix, the eigenvector centrality calculation problem is turned into the calculation of eigenvectors of several low-dimensional matrices through iteration, which effectively reduces the computational complexity. Experiments are carried out on two real-world temporal networks, the Enron email communication network and the DBLP co-authorship network, the results of which show that our method is more efficient at discovering important nodes than the common aggregating method.

  19. Detecting and characterising ramp events in wind power time series

    International Nuclear Information System (INIS)

    Gallego, Cristóbal; Cuerva, Álvaro; Costa, Alexandre

    2014-01-01

    In order to implement accurate models for wind power ramp forecasting, ramps need to be previously characterised. This issue has been typically addressed by performing binary ramp/non-ramp classifications based on ad-hoc assessed thresholds. However, recent works question this approach. This paper presents the ramp function, an innovative wavelet-based tool which detects and characterises ramp events in wind power time series. The underlying idea is to assess a continuous index related to the ramp intensity at each time step, which is obtained by considering large power output gradients evaluated under different time scales (up to typical ramp durations). The ramp function overcomes some of the drawbacks shown by the aforementioned binary classification and permits forecasters to easily reveal specific features of the ramp behaviour observed at a wind farm. As an example, the daily profile of the ramp-up and ramp-down intensities are obtained for the case of a wind farm located in Spain
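
    In the same spirit as the ramp function (though as a simplified multi-scale difference filter rather than the published wavelet construction), a ramp-intensity index can be sketched as the largest absolute power gradient over a set of time scales:

```python
# Simplified ramp-intensity index: max absolute mean gradient over several time scales.
import numpy as np
import pandas as pd

def ramp_index(power, scales=(1, 2, 4, 6)):
    s = pd.Series(power)
    grads = [(s.shift(-k) - s).abs() / k for k in scales]   # mean gradient over k steps ahead
    return pd.concat(grads, axis=1).max(axis=1)

rng = np.random.default_rng(10)
power = np.clip(np.cumsum(rng.normal(0, 0.05, 200)), 0, 1)  # normalised wind farm output
power[100:110] += np.linspace(0, 0.6, 10)                   # an injected ramp-up event
print(ramp_index(power).idxmax())                           # time step of the strongest ramp
```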

  20. Assemblage time series reveal biodiversity change but not systematic loss.

    Science.gov (United States)

    Dornelas, Maria; Gotelli, Nicholas J; McGill, Brian; Shimadzu, Hideyasu; Moyes, Faye; Sievers, Caya; Magurran, Anne E

    2014-04-18

    The extent to which biodiversity change in local assemblages contributes to global biodiversity loss is poorly understood. We analyzed 100 time series from biomes across Earth to ask how diversity within assemblages is changing through time. We quantified patterns of temporal α diversity, measured as change in local diversity, and temporal β diversity, measured as change in community composition. Contrary to our expectations, we did not detect systematic loss of α diversity. However, community composition changed systematically through time, in excess of predictions from null models. Heterogeneous rates of environmental change, species range shifts associated with climate change, and biotic homogenization may explain the different patterns of temporal α and β diversity. Monitoring and understanding change in species composition should be a conservation priority.

  1. Chaotic time series analysis in economics: Balance and perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Faggini, Marisa, E-mail: mfaggini@unisa.it [Dipartimento di Scienze Economiche e Statistiche, Università di Salerno, Fisciano 84084 (Italy)

    2014-12-15

    The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, appears now not to be such an interesting and promising research area.

  2. ALBEDO PATTERN RECOGNITION AND TIME-SERIES ANALYSES IN MALAYSIA

    Directory of Open Access Journals (Sweden)

    S. A. Salleh

    2012-07-01

    Full Text Available Pattern recognition and time-series analyses enable one to evaluate and generate predictions of specific phenomena. Albedo pattern and time-series analyses are particularly useful for climate condition monitoring. This study was conducted to identify changes in Malaysia's albedo pattern. The recognised patterns and changes will be useful for a variety of environmental and climate monitoring studies such as carbon budgeting and aerosol mapping. Ten years (2000–2009) of MODIS satellite images were used for the analyses and interpretation. These images were processed using ERDAS Imagine remote sensing software, ArcGIS 9.3, the 6S code for atmospheric calibration and several MODIS tools (MRT, HDF2GIS, Albedo tools). Several methods of time-series analysis were explored; this paper demonstrates trend and seasonal time-series analyses using the converted HDF-format MODIS MCD43A3 albedo land product. The results revealed significant changes in albedo percentages over the past 10 years and in the pattern with regard to Malaysia's nebulosity index (NI) and aerosol optical depth (AOD). A noticeable trend can be identified with regard to the maximum and minimum values of the albedo. The rise and fall of the line graph show a similar trend with regard to the daily observations; the difference lies in the magnitude of the rises and falls of albedo. Thus, it can be concluded that the temporal behaviour of land surface albedo in Malaysia is uniform with respect to the local monsoons. However, although the average albedo shows a linear trend with the nebulosity index, the pattern of albedo changes with respect to the nebulosity index indicates that external factors also affect the albedo values, as the plotted sky conditions and diffusion do not have a uniform trend over the years, especially when the trend at 5-year intervals is examined; 2000 shows high

  3. Ensemble Deep Learning for Biomedical Time Series Classification.

    Science.gov (United States)

    Jin, Lin-Peng; Dong, Jun

    2016-01-01

    Ensemble learning has been proved to improve the generalization ability effectively in both theory and practice. In this paper, we first briefly outline the current status of research on it. Then, a new deep neural network-based ensemble method that integrates filtering views, local views, distorted views, explicit training, implicit training, subview prediction, and Simple Average is proposed for biomedical time series classification. Finally, we validate its effectiveness on the Chinese Cardiovascular Disease Database, which contains a large number of electrocardiogram recordings. The experimental results show that the proposed method has certain advantages compared to some well-known ensemble methods, such as Bagging and AdaBoost.
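
    For illustration of the "Simple Average" combination rule only (the base deep networks and the different views are not reproduced): average the class probabilities produced by several classifiers and take the argmax.

        # Simple-average ensemble of class probabilities from several stand-in models.
        import numpy as np

        rng = np.random.default_rng(3)
        n_models, n_samples, n_classes = 5, 10, 4
        probs = rng.dirichlet(np.ones(n_classes), size=(n_models, n_samples))  # per-model outputs
        ensemble_probs = probs.mean(axis=0)           # simple average across models
        predictions = ensemble_probs.argmax(axis=1)   # final class labels
        print(predictions)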

  4. Ensemble Deep Learning for Biomedical Time Series Classification

    Directory of Open Access Journals (Sweden)

    Lin-peng Jin

    2016-01-01

    Full Text Available Ensemble learning has been proved to improve the generalization ability effectively in both theory and practice. In this paper, we briefly outline the current status of research on it first. Then, a new deep neural network-based ensemble method that integrates filtering views, local views, distorted views, explicit training, implicit training, subview prediction, and Simple Average is proposed for biomedical time series classification. Finally, we validate its effectiveness on the Chinese Cardiovascular Disease Database containing a large number of electrocardiogram recordings. The experimental results show that the proposed method has certain advantages compared to some well-known ensemble methods, such as Bagging and AdaBoost.

  5. SaaS Platform for Time Series Data Handling

    Directory of Open Access Journals (Sweden)

    Oplachko Ekaterina

    2018-01-01

    Full Text Available The paper is devoted to the description of MathBrain, a cloud-based resource, which works as a “Software as a Service” model. It is designed to maximize the efficiency of the current technology and to provide a tool for time series data handling. The resource provides access to the following analysis methods: direct and inverse Fourier transforms, Principal component analysis and Independent component analysis decompositions, quantitative analysis, magnetoencephalography inverse problem solution in a single dipole model based on multichannel spectral data.

  6. SaaS Platform for Time Series Data Handling

    Science.gov (United States)

    Oplachko, Ekaterina; Rykunov, Stanislav; Ustinin, Mikhail

    2018-02-01

    The paper is devoted to the description of MathBrain, a cloud-based resource, which works as a "Software as a Service" model. It is designed to maximize the efficiency of the current technology and to provide a tool for time series data handling. The resource provides access to the following analysis methods: direct and inverse Fourier transforms, Principal component analysis and Independent component analysis decompositions, quantitative analysis, magnetoencephalography inverse problem solution in a single dipole model based on multichannel spectral data.

  7. Chaotic time series analysis in economics: Balance and perspectives

    International Nuclear Information System (INIS)

    Faggini, Marisa

    2014-01-01

    The aim of the paper is not to review the large body of work concerning nonlinear time series analysis in economics, about which much has been written, but rather to focus on the new techniques developed to detect chaotic behaviours in economic data. More specifically, our attention will be devoted to reviewing some of these techniques and their application to economic and financial data in order to understand why chaos theory, after a period of growing interest, appears now not to be such an interesting and promising research area

  8. Quality Quandaries- Time Series Model Selection and Parsimony

    DEFF Research Database (Denmark)

    Bisgaard, Søren; Kulahci, Murat

    2009-01-01

    Some of the issues involved in selecting adequate models for time series data are discussed using an example concerning the number of users of an Internet server. The process of selecting an appropriate model is subjective and requires experience and judgment. The authors believe an important consideration in model selection should be parameter parsimony. They favor the use of parsimonious mixed ARMA models, noting that research has shown that a model building strategy that considers only autoregressive representations will lead to non-parsimonious models and to loss of forecasting accuracy.
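
    A minimal sketch of parsimony-guided selection, assuming statsmodels is available: fit a few low-order candidates to a simulated ARMA-like series and compare information criteria; a mixed ARMA(1,1) specification would be expected to compete well with higher-order pure AR fits.

        # Compare AIC of a few candidate models on a simulated ARMA(1,1)-like series.
        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(4)
        e = rng.normal(size=300)
        y = np.zeros(300)
        for t in range(1, 300):
            y[t] = 0.6 * y[t-1] + e[t] + 0.3 * e[t-1]

        for order in [(1, 0, 0), (2, 0, 0), (3, 0, 0), (1, 0, 1)]:
            res = ARIMA(y, order=order).fit()
            print(order, round(res.aic, 1))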

  9. Estimation of dynamic flux profiles from metabolic time series data

    Directory of Open Access Journals (Sweden)

    Chou I-Chun

    2012-07-01

    Full Text Available Abstract Background Advances in modern high-throughput techniques of molecular biology have enabled top-down approaches for the estimation of parameter values in metabolic systems, based on time series data. Special among them is the recent method of dynamic flux estimation (DFE, which uses such data not only for parameter estimation but also for the identification of functional forms of the processes governing a metabolic system. DFE furthermore provides diagnostic tools for the evaluation of model validity and of the quality of a model fit beyond residual errors. Unfortunately, DFE works only when the data are more or less complete and the system contains as many independent fluxes as metabolites. These drawbacks may be ameliorated with other types of estimation and information. However, such supplementations incur their own limitations. In particular, assumptions must be made regarding the functional forms of some processes and detailed kinetic information must be available, in addition to the time series data. Results The authors propose here a systematic approach that supplements DFE and overcomes some of its shortcomings. Like DFE, the approach is model-free and requires only minimal assumptions. If sufficient time series data are available, the approach allows the determination of a subset of fluxes that enables the subsequent applicability of DFE to the rest of the flux system. The authors demonstrate the procedure with three artificial pathway systems exhibiting distinct characteristics and with actual data of the trehalose pathway in Saccharomyces cerevisiae. Conclusions The results demonstrate that the proposed method successfully complements DFE under various situations and without a priori assumptions regarding the model representation. The proposed method also permits an examination of whether at all, to what degree, or within what range the available time series data can be validly represented in a particular functional format of

  10. Nonlinear analysis and prediction of time series in multiphase reactors

    CERN Document Server

    Liu, Mingyan

    2014-01-01

    This book reports on important nonlinear aspects or deterministic chaos issues in the systems of multi-phase reactors. The reactors treated in the book include gas-liquid bubble columns, gas-liquid-solid fluidized beds and gas-liquid-solid magnetized fluidized beds. The authors take pressure fluctuations in the bubble columns  as time series for nonlinear analysis, modeling and forecasting. They present qualitative and quantitative non-linear analysis tools which include attractor phase plane plot, correlation dimension, Kolmogorov entropy and largest Lyapunov exponent calculations and local non-linear short-term prediction.

  11. Real coded genetic algorithm for fuzzy time series prediction

    Science.gov (United States)

    Jain, Shilpa; Bisht, Dinesh C. S.; Singh, Phool; Mathpal, Prakash C.

    2017-10-01

    Genetic Algorithm (GA) forms a subset of evolutionary computing, a rapidly growing area of Artificial Intelligence (A.I.). Some variants of GA are binary GA, real GA, messy GA, micro GA, saw-tooth GA and differential evolution GA. This research article presents a real coded GA for predicting enrollments of the University of Alabama. The enrollment data of the University of Alabama form a fuzzy time series. Here, fuzzy logic is used to predict enrollments and a genetic algorithm optimizes the fuzzy intervals. Results are compared with the work of other eminent authors and found satisfactory, showing that real coded GAs are fast and accurate.

  12. Feasibility of real-time calculation of correlation integral derived statistics applied to EEG time series

    NARCIS (Netherlands)

    Broek, P.L.C. van den; Egmond, J. van; Rijn, C.M. van; Takens, F.; Coenen, A.M.L.; Booij, L.H.D.J.

    2005-01-01

    This study assessed the feasibility of online calculation of the correlation integral (C(r)) aiming to apply C(r)-derived statistics. For real-time application it is important to reduce calculation time. It is shown how our method works for EEG time series. Methods: To achieve online calculation of
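
    The correlation integral itself can be sketched as a plain Grassberger-Procaccia estimate on a delay-embedded toy signal; the fast online update that makes real-time use feasible is the subject of the study and is not reproduced here.

        # Correlation integral C(r): fraction of embedded point pairs closer than r.
        import numpy as np

        def correlation_integral(x, r, dim=5, tau=2):
            emb = np.column_stack([x[i * tau: len(x) - (dim - 1 - i) * tau] for i in range(dim)])
            d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
            iu = np.triu_indices(len(emb), k=1)       # distinct pairs only
            return np.mean(d[iu] < r)

        rng = np.random.default_rng(5)
        eeg_like = np.sin(np.linspace(0, 60, 1000)) + 0.3 * rng.normal(size=1000)
        for r in (0.2, 0.5, 1.0):
            print(r, round(correlation_integral(eeg_like, r), 4))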

  13. Feasibility of real-time calculation of correlation integral derived statistics applied to EEG time series

    NARCIS (Netherlands)

    van den Broek, PLC; van Egmond, J; van Rijn, CM; Takens, F; Coenen, AML; Booij, LHDJ

    2005-01-01

    Background: This study assessed the feasibility of online calculation of the correlation integral (C(r)) aiming to apply C(r)-derived statistics. For real-time application it is important to reduce calculation time. It is shown how our method works for EEG time series. Methods: To achieve online

  14. Application of Time Series Analysis in Determination of Lag Time in Jahanbin Basin

    Directory of Open Access Journals (Sweden)

    Seied Yahya Mirzaee

    2005-11-01

    One of the important issues that plays a significant role in the hydrological study of a basin is the determination of lag time. Lag time plays a significant role in hydrological studies. The rainfall-related lag time depends on several factors, such as permeability, vegetation cover, catchment slope, rainfall intensity, storm duration and type of rain. Determination of lag time is an important parameter in many projects such as dam design and water resource studies. The lag time of a basin can be calculated using various methods, one of which is time series analysis of spectral density. The analysis is based on Fourier series: the time series is approximated with sine and cosine functions, and harmonically significant quantities with individual frequencies are presented. Spectral density of multiple time series can be used to obtain the basin lag time for annual runoff and short-term rainfall fluctuations. A long lag time could be due to snowmelt, as well as ice melting caused by rainfall on freezing days. In this research the lag time of the Jahanbin basin has been determined using the spectral density method. The catchment is subject to both rainfall and snowfall. For short-term rainfall fluctuations with return periods of 2, 3 and 4 months, the lag times were found to be 0.18, 0.5 and 0.083 month, respectively.
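
    The spectral-density idea can be illustrated with a toy periodogram, assuming scipy is available; the basin study relies on a fuller spectral analysis, so this only shows how a dominant period is located in rainfall and runoff series.

        # Locate the dominant period (in months) of monthly rainfall and runoff series.
        import numpy as np
        from scipy.signal import periodogram

        rng = np.random.default_rng(6)
        months = np.arange(240)
        rain = 50 + 30 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 5, 240)
        runoff = 20 + 15 * np.sin(2 * np.pi * (months - 2) / 12) + rng.normal(0, 3, 240)  # lagged response

        for name, series in [("rain", rain), ("runoff", runoff)]:
            f, pxx = periodogram(series - series.mean(), fs=1.0)   # fs = 1 sample per month
            peak = f[np.argmax(pxx)]
            print(name, "dominant period (months):", round(1 / peak, 1))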

  15. Visualizing trends and clusters in ranked time-series data

    Science.gov (United States)

    Gousie, Michael B.; Grady, John; Branagan, Melissa

    2013-12-01

    There are many systems that provide visualizations for time-oriented data. Of those, few provide the means of finding patterns in time-series data in which rankings are also important. Fewer still have the fine granularity necessary to visually follow individual data points through time. We propose the Ranking Timeline, a novel visualization method for modestly-sized multivariate data sets that include the top ten rankings over time. The system includes two main visualization components: a ranking over time and a cluster analysis. The ranking visualization, loosely based on line plots, allows the user to track individual data points so as to facilitate comparisons within a given time frame. Glyphs represent additional attributes within the framework of the overall system. The user has control over many aspects of the visualization, including viewing a subset of the data and/or focusing on a desired time frame. The cluster analysis tool shows the relative importance of individual items in conjunction with a visualization showing the connection(s) to other, similar items, while maintaining the aforementioned glyphs and user interaction. The user controls the clustering according to a similarity threshold. The system has been implemented as a Web application, and has been tested with data showing the top ten actors/actresses from 1929-2010. The experiments have revealed patterns in the data heretofore not explored.

  16. High-resolution (noble) gas time series for aquatic research

    Science.gov (United States)

    Popp, A. L.; Brennwald, M. S.; Weber, U.; Kipfer, R.

    2017-12-01

    We developed a portable mass spectrometer (miniRUEDI) for on-site quantification of gas concentrations (He, Ar, Kr, N2, O2, CO2, CH4, etc.) in terrestrial gases [1,2]. Using the gas-equilibrium membrane-inlet technique (GE-MIMS), the miniRUEDI for the first time also allows accurate on-site and long-term dissolved-gas analysis in water bodies. The miniRUEDI is designed for operation in the field and at remote locations, using battery power and ambient air as a calibration gas. In contrast to conventional sampling and subsequent lab analysis, the miniRUEDI provides real-time and continuous time series of gas concentrations with a time resolution of a few seconds. Such high-resolution time series and immediate data availability open up new opportunities for research in highly dynamic and heterogeneous environmental systems. In addition, the combined analysis of inert and reactive gas species provides direct information on the linkages of physical and biogeochemical processes, such as the air/water gas exchange, excess air formation, O2 turnover, or N2 production by denitrification [1,3,4]. We present the miniRUEDI instrument and discuss its use for environmental research based on recent applications of tracking gas dynamics related to rapid and short-term processes in aquatic systems. [1] Brennwald, M.S., Schmidt, M., Oser, J., and Kipfer, R. (2016). Environmental Science and Technology, 50(24):13455-13463, doi: 10.1021/acs.est.6b03669. [2] Gasometrix GmbH, gasometrix.com. [3] Mächler, L., Peter, S., Brennwald, M.S., and Kipfer, R. (2013). Excess air formation as a mechanism for delivering oxygen to groundwater. Water Resources Research, doi:10.1002/wrcr.20547. [4] Mächler, L., Brennwald, M.S., and Kipfer, R. (2013). Argon Concentration Time-Series As a Tool to Study Gas Dynamics in the Hyporheic Zone. Environmental Science and Technology, doi: 10.1021/es305309b

  17. Land surface phenology from SPOT VEGETATION time series

    Directory of Open Access Journals (Sweden)

    A. Verger

    2016-12-01

    Full Text Available Land surface phenology from time series of satellite data is expected to contribute to an improved representation of vegetation phenology in earth system models. We characterized the baseline phenology of the vegetation at the global scale from GEOCLIM-LAI, a global climatology of leaf area index (LAI) derived from 1-km SPOT VEGETATION time series for 1999-2010. The calibration with ground measurements showed that the start and end of season were best identified using 30% and 40% thresholds of the LAI amplitude, respectively. The satellite-derived phenology was spatially consistent with the global distributions of climatic drivers and biome land cover. The accuracy of the derived phenological metrics, evaluated using available ground observations for birch forests in Europe, cherry in Asia and lilac shrubs in North America, showed an overall root mean square error lower than 19 days for the start, end and length of season, and good agreement between the latitudinal gradients of VEGETATION LAI phenology and the ground data.

  18. Multiscale Symbolic Phase Transfer Entropy in Financial Time Series Classification

    Science.gov (United States)

    Zhang, Ningning; Lin, Aijing; Shang, Pengjian

    We address the challenge of classifying financial time series via a newly proposed multiscale symbolic phase transfer entropy (MSPTE). Using the MSPTE method, we are able to quantify the strength and direction of information flow between financial systems and to classify financial time series, which are the stock indices from Europe, America and China during the period from 2006 to 2016 and the stocks of banking, the aviation industry and pharmacy during the period from 2007 to 2016. The MSPTE analysis shows that the value of symbolic phase transfer entropy (SPTE) among stocks decreases with increasing scale factor. It is demonstrated that the MSPTE method can effectively divide stocks into groups by area and industry. In addition, it can be concluded that the MSPTE analysis quantifies the similarity among the stock markets. The SPTE between two stocks from the same area is far less than the SPTE between stocks from different areas. The results also indicate that four stocks from America and Europe have a relatively high degree of similarity and that the stocks of the banking and pharmaceutical industries have higher similarity for CA. It is worth mentioning that the pharmaceutical industry has a weaker particular market mechanism than the banking and aviation industries.

  19. Unsupervised Classification During Time-Series Model Building.

    Science.gov (United States)

    Gates, Kathleen M; Lane, Stephanie T; Varangis, E; Giovanello, K; Guiskewicz, K

    2017-01-01

    Researchers who collect multivariate time-series data across individuals must decide whether to model the dynamic processes at the individual level or at the group level. A recent innovation, group iterative multiple model estimation (GIMME), offers one solution to this dichotomy by identifying group-level time-series models in a data-driven manner while also reliably recovering individual-level patterns of dynamic effects. GIMME is unique in that it does not assume homogeneity in processes across individuals in terms of the patterns or weights of temporal effects. However, it can be difficult to make inferences from the nuances in varied individual-level patterns. The present article introduces an algorithm that arrives at subgroups of individuals that have similar dynamic models. Importantly, the researcher does not need to decide the number of subgroups. The final models contain reliable group-, subgroup-, and individual-level patterns that enable generalizable inferences, subgroups of individuals with shared model features, and individual-level patterns and estimates. We show that integrating community detection into the GIMME algorithm improves upon current standards in two important ways: (1) providing reliable classification and (2) increasing the reliability in the recovery of individual-level effects. We demonstrate this method on functional MRI from a sample of former American football players.

  20. Intermittency and multifractional Brownian character of geomagnetic time series

    Directory of Open Access Journals (Sweden)

    G. Consolini

    2013-07-01

    Full Text Available The Earth's magnetosphere exhibits a complex behavior in response to the solar wind conditions. This behavior, which is described in terms of multifractional Brownian motions, could be the consequence of the occurrence of dynamical phase transitions. On the other hand, it has been shown that the dynamics of the geomagnetic signals is also characterized by intermittency at the smallest temporal scales. Here, we focus on the existence of a possible relationship in the geomagnetic time series between the multifractional Brownian motion character and the occurrence of intermittency. In detail, we investigate the multifractional nature of two long time series of the horizontal intensity of the Earth's magnetic field as measured at L'Aquila Geomagnetic Observatory during two years (2001 and 2008), which correspond to different conditions of solar activity. We propose a possible double origin of the intermittent character of the small-scale magnetic field fluctuations, which is related to both the multifractional nature of the geomagnetic field and the intermittent character of the disturbance level. Our results suggest a more complex nature of the geomagnetic response to solar wind changes than previously thought.

  1. Intermittency and multifractional Brownian character of geomagnetic time series

    Science.gov (United States)

    Consolini, G.; De Marco, R.; De Michelis, P.

    2013-07-01

    The Earth's magnetosphere exhibits a complex behavior in response to the solar wind conditions. This behavior, which is described in terms of multifractional Brownian motions, could be the consequence of the occurrence of dynamical phase transitions. On the other hand, it has been shown that the dynamics of the geomagnetic signals is also characterized by intermittency at the smallest temporal scales. Here, we focus on the existence of a possible relationship in the geomagnetic time series between the multifractional Brownian motion character and the occurrence of intermittency. In detail, we investigate the multifractional nature of two long time series of the horizontal intensity of the Earth's magnetic field as measured at L'Aquila Geomagnetic Observatory during two years (2001 and 2008), which correspond to different conditions of solar activity. We propose a possible double origin of the intermittent character of the small-scale magnetic field fluctuations, which is related to both the multifractional nature of the geomagnetic field and the intermittent character of the disturbance level. Our results suggest a more complex nature of the geomagnetic response to solar wind changes than previously thought.

  2. Time series clustering analysis of health-promoting behavior

    Science.gov (United States)

    Yang, Chi-Ta; Hung, Yu-Shiang; Deng, Guang-Feng

    2013-10-01

    Health promotion must be emphasized to achieve the World Health Organization goal of health for all. Since the global population is aging rapidly, the ComCare elder health-promoting service was developed by the Taiwan Institute for Information Industry in 2011. Based on the Pender health promotion model, the ComCare service offers five categories of health-promoting functions to address the everyday needs of seniors: nutrition management, social support, exercise management, health responsibility, and stress management. To assess the overall ComCare service and to improve understanding of the health-promoting behavior of elders, this study analyzed health-promoting behavioral data automatically collected by the ComCare monitoring system. In the 30638 session records collected for 249 elders from January 2012 to March 2013, behavior patterns were identified by a fuzzy c-means time series clustering algorithm combined with autocorrelation-based representation schemes. The analysis showed that time series data for elder health-promoting behavior can be classified into four different clusters. Each type reveals different health-promoting needs, frequencies, function numbers and behaviors. The data analysis results can assist policymakers, health-care providers, and experts in medicine, public health, nursing and psychology, and have been provided to the Taiwan National Health Insurance Administration to assess elder health-promoting behavior.
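
    A rough sketch of the clustering step only, with simulated data: each usage series is summarized by its autocorrelation signature and the signatures are grouped with a small fuzzy c-means loop. This is a simplified stand-in for the combined scheme used in the study.

        # Autocorrelation features + a tiny fuzzy c-means implementation.
        import numpy as np

        def acf(x, nlags=10):
            x = x - x.mean()
            return np.array([np.dot(x[:len(x) - k], x[k:]) for k in range(nlags + 1)]) / np.dot(x, x)

        def fuzzy_cmeans(X, c=4, m=2.0, iters=100, seed=0):
            rng = np.random.default_rng(seed)
            U = rng.dirichlet(np.ones(c), size=len(X))            # membership matrix
            for _ in range(iters):
                W = U ** m
                centers = (W.T @ X) / W.sum(axis=0)[:, None]
                d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=-1) + 1e-12
                U = d ** (-2.0 / (m - 1))
                U /= U.sum(axis=1, keepdims=True)
            return U, centers

        rng = np.random.default_rng(7)
        series = [np.sin(np.arange(60) / p) + 0.2 * rng.normal(size=60) for p in rng.choice([2, 4, 7, 11], 40)]
        features = np.array([acf(s) for s in series])
        U, _ = fuzzy_cmeans(features, c=4)
        print(U.argmax(axis=1))                                   # hard cluster labels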

  3. Blind source separation problem in GPS time series

    Science.gov (United States)

    Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.

    2016-04-01

    A critical point in the analysis of ground displacement time series, as those recorded by space geodetic techniques, is the development of data-driven methods that allow the different sources of deformation to be discerned and characterized in the space and time domains. Multivariate statistic includes several approaches that can be considered as a part of data-driven methods. A widely used technique is the principal component analysis (PCA), which allows us to reduce the dimensionality of the data space while maintaining most of the variance of the dataset explained. However, PCA does not perform well in finding the solution to the so-called blind source separation (BSS) problem, i.e., in recovering and separating the original sources that generate the observed data. This is mainly due to the fact that PCA minimizes the misfit calculated using an L2 norm (χ 2), looking for a new Euclidean space where the projected data are uncorrelated. The independent component analysis (ICA) is a popular technique adopted to approach the BSS problem. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we test the use of a modified variational Bayesian ICA (vbICA) method to recover the multiple sources of ground deformation even in the presence of missing data. The vbICA method models the probability density function (pdf) of each source signal using a mix of Gaussian distributions, allowing for more flexibility in the description of the pdf of the sources with respect to standard ICA, and giving a more reliable estimate of them. Here we present its application to synthetic global positioning system (GPS) position time series, generated by simulating deformation near an active fault, including inter-seismic, co-seismic, and post-seismic signals, plus seasonal signals and noise, and an additional time-dependent volcanic source. We evaluate the ability of the PCA and ICA decomposition
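
    As a baseline illustration only, the sketch below runs PCA (via SVD) on a simulated station-by-epoch displacement matrix; the point of the study is that a variational Bayesian ICA separates such sources better, and that method is not reproduced here.

        # PCA by SVD on synthetic GPS-like displacements (20 stations x 365 epochs).
        import numpy as np

        rng = np.random.default_rng(8)
        epochs = np.arange(365)
        seasonal = np.sin(2 * np.pi * epochs / 365)               # common seasonal source
        postseismic = np.exp(-np.maximum(epochs - 180, 0) / 30)   # relaxation after day 180
        mixing = rng.normal(size=(20, 2))                         # station responses to the 2 sources
        data = mixing @ np.vstack([seasonal, postseismic]) + 0.05 * rng.normal(size=(20, 365))

        X = data - data.mean(axis=1, keepdims=True)
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        explained = s**2 / np.sum(s**2)
        print("variance explained by first two PCs:", explained[:2].round(3))
        # Vt[0] and Vt[1] approximate (mixtures of) the seasonal and post-seismic signals.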

  4. Dependency Structures in Differentially Coded Cardiovascular Time Series

    Directory of Open Access Journals (Sweden)

    Tatjana Tasic

    2017-01-01

    Full Text Available Objectives. This paper analyses temporal dependency in the time series recorded from aging rats, the healthy ones and those with early developed hypertension. The aim is to explore effects of age and hypertension on mutual sample relationship along the time axis. Methods. A copula method is applied to raw and to differentially coded signals. The latter ones were additionally binary encoded for a joint conditional entropy application. The signals were recorded from freely moving male Wistar rats and from spontaneous hypertensive rats, aged 3 months and 12 months. Results. The highest level of comonotonic behavior of pulse interval with respect to systolic blood pressure is observed at time lags τ=0, 3, and 4, while a strong counter-monotonic behavior occurs at time lags τ=1 and 2. Conclusion. Dynamic range of aging rats is considerably reduced in hypertensive groups. Conditional entropy of systolic blood pressure signal, compared to unconditional, shows an increased level of discrepancy, except for a time lag 1, where the equality is preserved in spite of the memory of differential coder. The antiparallel streams play an important role at single beat time lag.

  5. Fundamental State Space Time Series Models for JEPX Electricity Prices

    Science.gov (United States)

    Ofuji, Kenta; Kanemoto, Shigeru

    Time series models are popular in attempts to model and forecast price dynamics in various markets. In this paper, we have formulated two state space models and tested their applicability to power price modeling and forecasting using JEPX (Japan Electric Power eXchange) data. State space models generally have a high degree of flexibility with their time-dependent state transition matrix and system equation configurations. Based on empirical data analysis and the past literature, we used calculation assumptions to (a) extract a stochastic trend component to capture non-stationarity, and (b) detect structural changes underlying the market. The stepwise calculation algorithm followed that of the Kalman filter. We then evaluated the two models' forecasting capabilities in comparison with ordinary AR (autoregressive) and ARCH (autoregressive conditional heteroskedasticity) models. With properly chosen explanatory variables, the latter state space model yielded as good a forecasting capability as that of the AR and ARCH models for a short forecasting horizon.
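
    A bare-bones Kalman filter for a local-level (random-walk trend) model, the kind of stochastic-trend component mentioned above; the state dimension, noise variances and the toy price series are illustrative assumptions rather than the authors' specification.

        # One-dimensional Kalman filter: latent level observed through noisy prices.
        import numpy as np

        def local_level_filter(y, q=0.1, r=1.0):
            level = np.zeros(len(y))
            a, p = y[0], 1.0                    # initial state mean and variance
            for t in range(len(y)):
                p = p + q                       # predict (random-walk transition)
                k = p / (p + r)                 # Kalman gain
                a = a + k * (y[t] - a)          # update with the new observation
                p = (1 - k) * p
                level[t] = a
            return level

        rng = np.random.default_rng(9)
        price = 10 + np.cumsum(rng.normal(0, 0.3, 200)) + rng.normal(0, 1.0, 200)
        trend = local_level_filter(price)
        print(trend[:5].round(2), trend[-5:].round(2))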

  6. Estimation of Hurst Exponent for the Financial Time Series

    Science.gov (United States)

    Kumar, J.; Manchanda, P.

    2009-07-01

    Until recently, statistical methods and Fourier analysis were employed to study fluctuations in stock markets in general and the Indian stock market in particular. However, the current trend is to apply the concepts of wavelet methodology and the Hurst exponent; see for example the work of Manchanda, J. Kumar and Siddiqi, Journal of the Franklin Institute 144 (2007), 613-636, and the paper of Cajueiro and B. M. Tabak. Cajueiro and Tabak, Physica A, 2003, have checked the efficiency of emerging markets by computing the Hurst exponent over a time window of 4 years of data. Our goal in the present paper is to understand the dynamics of the Indian stock market. We look for persistence in the stock market through the Hurst exponent and the fractal dimension of time series data of BSE 100 and NIFTY 50.
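
    A minimal rescaled-range (R/S) sketch of a Hurst exponent estimate; the window sizes and the toy series are arbitrary, and the studies cited above use more careful corrections as well as wavelet-based alternatives.

        # Rescaled-range Hurst estimate: slope of log(R/S) against log(window size).
        import numpy as np

        def hurst_rs(x, windows=(16, 32, 64, 128, 256)):
            x = np.asarray(x, dtype=float)
            rs, ns = [], []
            for n in windows:
                vals = []
                for start in range(0, len(x) - n + 1, n):
                    seg = x[start:start + n]
                    dev = np.cumsum(seg - seg.mean())
                    s = seg.std(ddof=1)
                    if s > 0:
                        vals.append((dev.max() - dev.min()) / s)
                rs.append(np.mean(vals))
                ns.append(n)
            slope, _ = np.polyfit(np.log(ns), np.log(rs), 1)
            return slope

        rng = np.random.default_rng(10)
        returns = rng.normal(size=2048)          # white noise: H should be close to 0.5
        print(round(hurst_rs(returns), 2))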

  7. Computational intelligence in time series forecasting theory and engineering applications

    CERN Document Server

    Palit, Ajoy K

    2005-01-01

    Foresight in an engineering enterprise can make the difference between success and failure, and can be vital to the effective control of industrial systems. Applying time series analysis in the on-line milieu of most industrial plants has been problematic owing to the time and computational effort required. The advent of soft computing tools offers a solution. The authors harness the power of intelligent technologies individually and in combination. Examples of the particular systems and processes susceptible to each technique are investigated, cultivating a comprehensive exposition of the improvements on offer in quality, model building and predictive control and the selection of appropriate tools from the plethora available. Application-oriented engineers in process control, manufacturing, production industry and research centres will find much to interest them in this book. It is suitable for industrial training purposes, as well as serving as valuable reference material for experimental researchers.

  8. Multifractal analysis of visibility graph-based Ito-related connectivity time series.

    Science.gov (United States)

    Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano

    2016-02-01

    In this study, we investigate multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by the Ito equations with multiplicative power-law noise. We show that multifractality of the connectivity time series (i.e., the series of numbers of links outgoing any node) increases with the exponent of the power-law noise. The multifractality of the connectivity time series could be due to the width of connectivity degree distribution that can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the procedure of visibility graph that, connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is sensitive for detecting wide "depressions" in input time series.
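
    A naive natural visibility graph construction, for illustration only: two samples are linked if every intermediate sample lies below the straight line joining them, and the resulting degree sequence is the "connectivity time series" analysed above.

        # O(N^2) natural visibility graph; degree[i] is the connectivity series.
        import numpy as np

        def visibility_degrees(y):
            n = len(y)
            degree = np.zeros(n, dtype=int)
            for a in range(n):
                for b in range(a + 1, n):
                    # visibility criterion: every c between a and b lies below the line a-b
                    visible = all(y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                                  for c in range(a + 1, b))
                    if visible:
                        degree[a] += 1
                        degree[b] += 1
            return degree

        rng = np.random.default_rng(11)
        ito_like = np.cumsum(rng.normal(size=300) * np.abs(rng.normal(size=300)) ** 0.5)
        print(visibility_degrees(ito_like)[:20])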

  9. Time series power flow analysis for distribution connected PV generation.

    Energy Technology Data Exchange (ETDEWEB)

    Broderick, Robert Joseph; Quiroz, Jimmy Edward; Ellis, Abraham; Reno, Matthew J.; Smith, Jeff; Dugan, Roger

    2013-01-01

    Distributed photovoltaic (PV) projects must go through an interconnection study process before connecting to the distribution grid. These studies are intended to identify the likely impacts and mitigation alternatives. In the majority of the cases, system impacts can be ruled out or mitigation can be identified without an involved study, through a screening process or a simple supplemental review study. For some proposed projects, expensive and time-consuming interconnection studies are required. The challenges to performing the studies are twofold. First, every study scenario is potentially unique, as the studies are often highly specific to the amount of PV generation capacity that varies greatly from feeder to feeder and is often unevenly distributed along the same feeder. This can cause location-specific impacts and mitigations. The second challenge is the inherent variability in PV power output which can interact with feeder operation in complex ways, by affecting the operation of voltage regulation and protection devices. The typical simulation tools and methods in use today for distribution system planning are often not adequate to accurately assess these potential impacts. This report demonstrates how quasi-static time series (QSTS) simulation and high time-resolution data can be used to assess the potential impacts in a more comprehensive manner. The QSTS simulations are applied to a set of sample feeders with high PV deployment to illustrate the usefulness of the approach. The report describes methods that can help determine how PV affects distribution system operations. The simulation results are focused on enhancing the understanding of the underlying technical issues. The examples also highlight the steps needed to perform QSTS simulation and describe the data needed to drive the simulations. The goal of this report is to make the methodology of time series power flow analysis readily accessible to utilities and others responsible for evaluating

  10. Seismic assessment of a site using the time series method

    International Nuclear Information System (INIS)

    Krutzik, N.J.; Rotaru, I.; Bobei, M.; Mingiuc, C.; Serban, V.; Androne, M.

    1997-01-01

    To increase the safety of an NPP located on a seismic site, the seismic acceleration level to which the NPP should be qualified must be as representative as possible for that site, with a conservative but not exaggerated degree of safety. Considering the seismic events affecting the site as independent events and using statistical methods to define safety levels with a very low annual occurrence probability (10⁻⁴) may lead to exaggeration of the seismic safety level. The use of a very high value for the seismic acceleration, imposed by the seismic safety levels required by the hazard analysis, may lead to very costly technical solutions that can make plant operation more difficult and increase maintenance costs. Considering seismic events as a time series with dependence among the events may lead to a more representative assessment of the seismic activity of an NPP site and, consequently, to a prognosis of the seismic levels for which the NPP should be ensured throughout its life-span. That prognosis should consider the actual seismic activity (including small earthquakes in real time) of the foci that affect the plant site. The paper proposes the application of autoregressive time series models to issue a prognosis of the seismic activity of a focus and presents the analysis, by this method, of the Vrancea focus that affects the NPP Cernavoda site. The paper also presents the manner of analysing the focus activity according to the new approach and assesses the maximum seismic acceleration that may affect NPP Cernavoda throughout its life-span (∼ 30 years). Development and application of new mathematical analysis methods, both for long and short time intervals, may lead to important contributions to the process of forecasting future seismic events. (authors)

  11. Miscarriage: A Dream Interrupted

    Science.gov (United States)

    Trepal, Heather C.; Semivan, Suzanne Gibson; Caley-Bruce, Mary

    2005-01-01

    Pregnancy is a developmental task that requires women to become accustomed to inherent and sometimes profound biological, somatic, and psychological changes. When pregnancy is interrupted by miscarriage, it may become a pivotal crisis point in the development of a woman's maternal identity as well as an issue in family development. This manuscript…

  12. On randomly interrupted diffusion

    International Nuclear Information System (INIS)

    Luczka, J.

    1993-01-01

    Processes driven by randomly interrupted Gaussian white noise are considered. An evolution equation for single-event probability distributions is presented. Stationary states are considered as solutions of a second-order ordinary differential equation with two imposed conditions. A linear model is analyzed and its stationary distributions are explicitly given. (author). 10 refs

  13. Wrapper Feature Extraction for Time Series Classification Using Singular Value Decomposition

    OpenAIRE

    Hui, Zhang; Tu, Bao Ho; Kawasaki, Saori

    2005-01-01

    Time series classification is an important aspect of time series mining. Recently, time series classification has attracted increasing interest in various domains. However, the high dimensionality of time series makes time series classification a difficult problem. The so-called curse of dimensionality not only slows down the process of classification but also decreases the classification quality. Many dimensionality reduction techniques have been proposed to circumvent the curse of...

  14. The Outlier Interval Detection Algorithms on Astronautical Time Series Data

    Directory of Open Access Journals (Sweden)

    Wei Hu

    2013-01-01

    Full Text Available Outlier interval detection (OID) is a crucial technique for analyzing spacecraft faults, locating exceptions, and implementing intelligent fault diagnosis systems. The paper proposes two OID algorithms for astronautical time series data (TSD): variance-based OID (VOID) and FFT and k-nearest-neighbour-based OID (FKOID). The VOID algorithm divides the TSD into many intervals and measures each interval's outlier score according to its variance. This algorithm can detect outlier intervals with great fluctuation in the time domain. It is a simple and fast algorithm with low time complexity, but it ignores frequency information. The FKOID algorithm extracts the frequency information of each interval by means of the Fast Fourier Transform, calculates the distances between frequency features, and adopts the KNN method to measure the outlier score as the sum of the distances between the interval's frequency vector and its K nearest frequency vectors. It detects outlier intervals in a refined way, at an appropriate expense of time, and is valid for detecting outlier intervals in both the frequency and time domains.
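
    The variance-based idea (VOID) can be roughly sketched as follows; the interval length and the median/MAD thresholding rule are assumptions made for illustration, not the paper's exact scoring.

        # Flag fixed-length intervals whose variance is far above the typical level.
        import numpy as np

        def variance_outlier_intervals(x, interval=50, z=3.0):
            x = np.asarray(x, dtype=float)
            n_int = len(x) // interval
            scores = np.array([x[i*interval:(i+1)*interval].var() for i in range(n_int)])
            med = np.median(scores)
            mad = np.median(np.abs(scores - med)) + 1e-12
            return np.where((scores - med) / mad > z)[0], scores

        rng = np.random.default_rng(12)
        telemetry = rng.normal(0, 1, 2000)
        telemetry[800:850] += rng.normal(0, 6, 50)          # inject a high-fluctuation interval
        flagged, _ = variance_outlier_intervals(telemetry)
        print("outlier intervals:", flagged)                 # expect interval index 16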

  15. United States Forest Disturbance Trends Observed Using Landsat Time Series

    Science.gov (United States)

    Masek, Jeffrey G.; Goward, Samuel N.; Kennedy, Robert E.; Cohen, Warren B.; Moisen, Gretchen G.; Schleeweis, Karen; Huang, Chengquan

    2013-01-01

    Disturbance events strongly affect the composition, structure, and function of forest ecosystems; however, existing U.S. land management inventories were not designed to monitor disturbance. To begin addressing this gap, the North American Forest Dynamics (NAFD) project has examined a geographic sample of 50 Landsat satellite image time series to assess trends in forest disturbance across the conterminous United States for 1985-2005. The geographic sample design used a probability-based scheme to encompass major forest types and maximize geographic dispersion. For each sample location disturbance was identified in the Landsat series using the Vegetation Change Tracker (VCT) algorithm. The NAFD analysis indicates that, on average, 2.77 Mha/yr of forests were disturbed annually, representing 1.09%/yr of US forestland. These satellite-based national disturbance rates estimates tend to be lower than those derived from land management inventories, reflecting both methodological and definitional differences. In particular the VCT approach used with a biennial time step has limited sensitivity to low-intensity disturbances. Unlike prior satellite studies, our biennial forest disturbance rates vary by nearly a factor of two between high and low years. High western US disturbance rates were associated with active fire years and insect activity, while variability in the east is more strongly related to harvest rates in managed forests. We note that generating a geographic sample based on representing forest type and variability may be problematic since the spatial pattern of disturbance does not necessarily correlate with forest type. We also find that the prevalence of diffuse, non-stand clearing disturbance in US forests makes the application of a biennial geographic sample problematic. Future satellite-based studies of disturbance at regional and national scales should focus on wall-to-wall analyses with annual time step for improved accuracy.

  16. A Time Series Evaluation of the FAST National Stroke Awareness Campaign in England

    Science.gov (United States)

    Flynn, Darren; Ford, Gary A.; Rodgers, Helen; Price, Christopher; Steen, Nick; Thomson, Richard G.

    2014-01-01

    Objective In February 2009, the Department of Health in England launched the Face, Arm, Speech, and Time (FAST) mass media campaign, to raise public awareness of stroke symptoms and the need for an emergency response. We aimed to evaluate the impact of three consecutive phases of FAST using population-level measures of behaviour in England. Methods Interrupted time series (May 2007 to February 2011) assessed the impact of the campaign on: access to a national stroke charity's information resources (Stroke Association [SA]); emergency hospital admissions with a primary diagnosis of stroke (Hospital Episode Statistics for England); and thrombolysis activity from centres in England contributing data to the Safe Implementation of Thrombolysis in Stroke UK database. Results Before the campaign, emergency admissions (and patients admitted via accident and emergency [A&E]) and thrombolysis activity was increasing significantly over time, whereas emergency admissions via general practitioners (GPs) were decreasing significantly. SA webpage views, calls to their helpline and information materials dispatched increased significantly after phase one. Website hits/views, and information materials dispatched decreased after phase one; these outcomes increased significantly during phases two and three. After phase one there were significant increases in overall emergency admissions (505, 95% CI = 75 to 935) and patients admitted via A&E (451, 95% CI = 26 to 875). Significantly fewer monthly emergency admissions via GPs were reported after phase three (−19, 95% CI = −29 to −9). Thrombolysis activity per month significantly increased after phases one (3, 95% CI = 1 to 6), and three (3, 95% CI = 1 to 4). Conclusions Phase one had a statistically significant impact on information seeking behaviour and emergency admissions, with additional impact that may be attributable to subsequent phases on information seeking behaviour, emergency admissions via GPs, and
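
    For readers unfamiliar with the design, a generic segmented-regression sketch for a single-interruption series (level and slope change at the campaign launch) is given below, assuming statsmodels is available; it shows the standard interrupted time series setup, not the exact specification used in this evaluation.

        # Segmented regression: baseline level, pre-trend, level change and slope change.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(13)
        months = np.arange(46)                       # e.g. May 2007 to Feb 2011
        launch = 22                                  # index of the intervention month
        post = (months >= launch).astype(float)
        time_since = np.where(post == 1, months - launch, 0)
        admissions = 900 + 4 * months + 60 * post + 2 * time_since + rng.normal(0, 25, 46)

        X = sm.add_constant(np.column_stack([months, post, time_since]))
        fit = sm.OLS(admissions, X).fit()
        print(fit.params.round(1))                   # [baseline, pre-trend, level change, slope change]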

  17. A time series evaluation of the FAST National Stroke Awareness Campaign in England.

    Directory of Open Access Journals (Sweden)

    Darren Flynn

    Full Text Available In February 2009, the Department of Health in England launched the Face, Arm, Speech, and Time (FAST) mass media campaign, to raise public awareness of stroke symptoms and the need for an emergency response. We aimed to evaluate the impact of three consecutive phases of FAST using population-level measures of behaviour in England. Interrupted time series (May 2007 to February 2011) assessed the impact of the campaign on: access to a national stroke charity's information resources (Stroke Association [SA]); emergency hospital admissions with a primary diagnosis of stroke (Hospital Episode Statistics for England); and thrombolysis activity from centres in England contributing data to the Safe Implementation of Thrombolysis in Stroke UK database. Before the campaign, emergency admissions (and patients admitted via accident and emergency [A&E]) and thrombolysis activity was increasing significantly over time, whereas emergency admissions via general practitioners (GPs) were decreasing significantly. SA webpage views, calls to their helpline and information materials dispatched increased significantly after phase one. Website hits/views, and information materials dispatched decreased after phase one; these outcomes increased significantly during phases two and three. After phase one there were significant increases in overall emergency admissions (505, 95% CI = 75 to 935) and patients admitted via A&E (451, 95% CI = 26 to 875). Significantly fewer monthly emergency admissions via GPs were reported after phase three (-19, 95% CI = -29 to -9). Thrombolysis activity per month significantly increased after phases one (3, 95% CI = 1 to 6), and three (3, 95% CI = 1 to 4). Phase one had a statistically significant impact on information seeking behaviour and emergency admissions, with additional impact that may be attributable to subsequent phases on information seeking behaviour, emergency admissions via GPs, and thrombolysis activity. Future campaigns should be

  18. Assimilation of LAI time-series in crop production models

    Science.gov (United States)

    Kooistra, Lammert; Rijk, Bert; Nannes, Louis

    2014-05-01

    Agriculture is worldwide a large consumer of freshwater, nutrients and land. Spatial explicit agricultural management activities (e.g., fertilization, irrigation) could significantly improve efficiency in resource use. In previous studies and operational applications, remote sensing has shown to be a powerful method for spatio-temporal monitoring of actual crop status. As a next step, yield forecasting by assimilating remote sensing based plant variables in crop production models would improve agricultural decision support both at the farm and field level. In this study we investigated the potential of remote sensing based Leaf Area Index (LAI) time-series assimilated in the crop production model LINTUL to improve yield forecasting at field level. The effect of assimilation method and amount of assimilated observations was evaluated. The LINTUL-3 crop production model was calibrated and validated for a potato crop on two experimental fields in the south of the Netherlands. A range of data sources (e.g., in-situ soil moisture and weather sensors, destructive crop measurements) was used for calibration of the model for the experimental field in 2010. LAI from cropscan field radiometer measurements and actual LAI measured with the LAI-2000 instrument were used as input for the LAI time-series. The LAI time-series were assimilated in the LINTUL model and validated for a second experimental field on which potatoes were grown in 2011. Yield in 2011 was simulated with an R2 of 0.82 when compared with field measured yield. Furthermore, we analysed the potential of assimilation of LAI into the LINTUL-3 model through the 'updating' assimilation technique. The deviation between measured and simulated yield decreased from 9371 kg/ha to 8729 kg/ha when assimilating weekly LAI measurements in the LINTUL model over the season of 2011. LINTUL-3 furthermore shows the main growth reducing factors, which are useful for farm decision support. The combination of crop models and sensor

  19. Time series Analysis of Integrateds Building System Variables

    Science.gov (United States)

    Georgiev, Tz.; Jonkov, T.; Yonchev, E.

    2010-10-01

    This article deals with time series analysis of indoor and outdoor variables of an integrated building system. The kernel of such systems is the heating, ventilation and air conditioning (HVAC) problem. Important outdoor and indoor variables are: air temperature, global and diffuse radiation, wind speed and direction, temperature, relative humidity, mean radiant temperature, and so on. The aim of this article is to select the structure of, and investigate, linear autoregressive (AR) and autoregressive with external inputs (ARX) models. The investigation of the obtained models is based on real-life data. All computations are carried out in the MATLAB environment. Further research will focus on the synthesis of robust energy-saving control algorithms.

  20. Earthquake magnitude time series: scaling behavior of visibility networks

    Science.gov (United States)

    Aguilar-San Juan, B.; Guzmán-Vargas, L.

    2013-11-01

    We present a statistical analysis of earthquake magnitude sequences in terms of the visibility graph method. Magnitude time series from Italy, Southern California, and Mexico are transformed into networks and some organizational graph properties are discussed. Connectivities are characterized by a scale-free distribution with a noticeable effect for large scales due to either the presence or the lack of large events. Also, a scaling behavior is observed between different node measures like betweenness centrality, clustering coefficient, nearest neighbor connectivity, and earthquake magnitude. Moreover, parameters which quantify the difference between forward and backward links, are proposed to evaluate the asymmetry of visibility attachment mechanism. Our results show an alternating average behavior of these parameters as earthquake magnitude changes. Finally, we evaluate the effects of reducing temporal and spatial windows of observation upon visibility network properties for main-shocks.

  1. Quantifying the Dynamical Complexity of Chaotic Time Series

    Science.gov (United States)

    Politi, Antonio

    2017-04-01

    A powerful approach is proposed for the characterization of chaotic signals. It is based on the combined use of two classes of indicators: (i) the probability of suitable symbolic sequences (obtained from the ordinal patterns of the corresponding time series); (ii) the width of the corresponding cylinder sets. This way, much information can be extracted and used to quantify the complexity of a given signal. As an example of the potentiality of the method, I introduce a modified permutation entropy which allows for quantitative estimates of the Kolmogorov-Sinai entropy in hyperchaotic models, where other methods would be unpractical. As a by-product, estimates of the fractal dimension of the underlying attractors are possible as well.
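
    Plain (unmodified) permutation entropy can be sketched as follows; the modified version described above additionally weights ordinal patterns by the widths of the corresponding cylinder sets, which this toy version omits.

        # Normalised permutation entropy of order m from ordinal-pattern frequencies.
        import numpy as np
        from math import factorial

        def permutation_entropy(x, m=4, tau=1):
            counts = {}
            for i in range(len(x) - (m - 1) * tau):
                pattern = tuple(np.argsort(x[i:i + m * tau:tau]))
                counts[pattern] = counts.get(pattern, 0) + 1
            p = np.array(list(counts.values()), dtype=float)
            p /= p.sum()
            return -np.sum(p * np.log(p)) / np.log(factorial(m))   # normalised to [0, 1]

        rng = np.random.default_rng(14)
        print(round(permutation_entropy(rng.normal(size=5000)), 3))           # noise: near 1
        print(round(permutation_entropy(np.sin(np.arange(5000) * 0.05)), 3))  # regular: much lower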

  2. Optimal estimation of recurrence structures from time series

    Science.gov (United States)

    beim Graben, Peter; Sellers, Kristin K.; Fröhlich, Flavio; Hutt, Axel

    2016-05-01

    Recurrent temporal dynamics is a phenomenon observed frequently in high-dimensional complex systems and its detection is a challenging task. Recurrence quantification analysis utilizing recurrence plots may extract such dynamics, however it still encounters an unsolved pertinent problem: the optimal selection of distance thresholds for estimating the recurrence structure of dynamical systems. The present work proposes a stochastic Markov model for the recurrent dynamics that allows for the analytical derivation of a criterion for the optimal distance threshold. The goodness of fit is assessed by a utility function which assumes a local maximum for that threshold reflecting the optimal estimate of the system's recurrence structure. We validate our approach by means of the nonlinear Lorenz system and its linearized stochastic surrogates. The final application to neurophysiological time series obtained from anesthetized animals illustrates the method and reveals novel dynamic features of the underlying system. We propose the number of optimal recurrence domains as a statistic for classifying an animal's state of consciousness.
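
    A minimal recurrence-plot sketch for a chosen distance threshold; the contribution above is a principled way to select that threshold, which is only caricatured here by scanning a few candidate values and reporting the recurrence rate.

        # Recurrence matrix of a delay-embedded signal for a given threshold eps.
        import numpy as np

        def recurrence_matrix(x, eps, dim=3, tau=5):
            emb = np.column_stack([x[i * tau: len(x) - (dim - 1 - i) * tau] for i in range(dim)])
            d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
            return (d < eps).astype(int)

        t = np.linspace(0, 40, 800)
        signal = np.sin(t) + 0.1 * np.random.default_rng(15).normal(size=800)
        for eps in (0.2, 0.5, 1.0):
            R = recurrence_matrix(signal, eps)
            print(eps, "recurrence rate:", round(R.mean(), 3))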

  3. A Multivariate Time Series Method for Monte Carlo Reactor Analysis

    International Nuclear Information System (INIS)

    Taro Ueki

    2008-01-01

    A robust multivariate time series method has been established for the Monte Carlo calculation of neutron multiplication problems. The method is termed Coarse Mesh Projection Method (CMPM) and can be implemented using the coarse statistical bins for acquisition of nuclear fission source data. A novel aspect of CMPM is the combination of the general technical principle of projection pursuit in the signal processing discipline and the neutron multiplication eigenvalue problem in the nuclear engineering discipline. CMPM enables reactor physicists to accurately evaluate major eigenvalue separations of nuclear reactors with continuous energy Monte Carlo calculation. CMPM was incorporated in the MCNP Monte Carlo particle transport code of Los Alamos National Laboratory. The great advantage of CMPM over the traditional Fission Matrix method is demonstrated for the three space-dimensional modeling of the initial core of a pressurized water reactor

  4. Multivariate time series with linear state space structure

    CERN Document Server

    Gómez, Víctor

    2016-01-01

    This book presents a comprehensive study of multivariate time series with linear state space structure. The emphasis is put on both the clarity of the theoretical concepts and on efficient algorithms for implementing the theory. In particular, it investigates the relationship between VARMA and state space models, including canonical forms. It also highlights the relationship between Wiener-Kolmogorov and Kalman filtering both with an infinite and a finite sample. The strength of the book also lies in the numerous algorithms included for state space models that take advantage of the recursive nature of the models. Many of these algorithms can be made robust, fast, reliable and efficient. The book is accompanied by a MATLAB package called SSMMATLAB and a webpage presenting implemented algorithms with many examples and case studies. Though it lays a solid theoretical foundation, the book also focuses on practical application, and includes exercises in each chapter. It is intended for researchers and students wor...

  5. Indirect inference with time series observed with error

    DEFF Research Database (Denmark)

    Rossi, Eduardo; Santucci de Magistris, Paolo

    We analyze the properties of the indirect inference estimator when the observed series are contaminated by measurement error. We show that the indirect inference estimates are asymptotically biased when the nuisance parameters of the measurement error distribution are neglected in the indirect estimation. We propose to solve this inconsistency by jointly estimating the nuisance and the structural parameters. Under standard assumptions, this estimator is consistent and asymptotically normal. A condition for the identification of ARMA plus noise is obtained. The proposed methodology is used to estimate the parameters of continuous-time stochastic volatility models with auxiliary specifications based on realized volatility measures. Monte Carlo simulations show the bias reduction of the indirect estimates obtained when the microstructure noise is explicitly modeled. Finally, an empirical...

  6. Single-Index Additive Vector Autoregressive Time Series Models

    KAUST Repository

    LI, YEHUA

    2009-09-01

    We study a new class of nonlinear autoregressive models for vector time series, where the current vector depends on single-indexes defined on the past lags and the effects of different lags have an additive form. A sufficient condition is provided for stationarity of such models. We also study estimation of the proposed model using P-splines, hypothesis testing, asymptotics, selection of the order of the autoregression and of the smoothing parameters and nonlinear forecasting. We perform simulation experiments to evaluate our model in various settings. We illustrate our methodology on a climate data set and show that our model provides more accurate yearly forecasts of the El Niño phenomenon, the unusual warming of water in the Pacific Ocean. © 2009 Board of the Foundation of the Scandinavian Journal of Statistics.

  7. SPITZER IRAC PHOTOMETRY FOR TIME SERIES IN CROWDED FIELDS

    Energy Technology Data Exchange (ETDEWEB)

    Novati, S. Calchi; Beichman, C. [NASA Exoplanet Science Institute, MS 100-22, California Institute of Technology, Pasadena, CA 91125 (United States); Gould, A.; Fausnaugh, M.; Gaudi, B. S.; Pogge, R. W.; Wibking, B.; Zhu, W.; Poleski, R. [Department of Astronomy, Ohio State University, 140 W. 18th Ave., Columbus, OH 43210 (United States); Yee, J. C. [Harvard-Smithsonian Center for Astrophysics, 60 Garden St., Cambridge, MA 02138 (United States); Bryden, G.; Henderson, C. B.; Shvartzvald, Y. [Jet Propulsion Laboratory, California Institute of Technology, 4800 Oak Grove Drive, Pasadena, CA 91109 (United States); Carey, S. [Spitzer, Science Center, MS 220-6, California Institute of Technology, Pasadena, CA (United States); Udalski, A.; Pawlak, M.; Szymański, M. K.; Skowron, J.; Mróz, P.; Kozłowski, S. [Warsaw University Observatory, Al. Ujazdowskie 4, 00-478 Warszawa (Poland); Collaboration: Spitzer team; OGLE group; and others

    2015-12-01

    We develop a new photometry algorithm that is optimized for the Infrared Array Camera (IRAC) Spitzer time series in crowded fields and that is particularly adapted to faint or heavily blended targets. We apply this to the 170 targets from the 2015 Spitzer microlensing campaign and present the results of three variants of this algorithm in an online catalog. We present detailed accounts of the application of this algorithm to two difficult cases, one very faint and the other very crowded. Several of Spitzer's instrumental characteristics that drive the specific features of this algorithm are shared by Kepler and WFIRST, implying that these features may prove to be a useful starting point for algorithms designed for microlensing campaigns by these other missions.

  8. Predicting the Market Potential Using Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Halmet Bradosti

    2015-12-01

    Full Text Available The aim of this analysis is to forecast a mini-market's sales volume for the period of twelve months from August 2015 to August 2016. The study is based on the monthly sales, in Iraqi Dinar, of a private local mini-market from April 2014 to July 2015. As revealed by the graph, and assuming the stagnant economic conditions continue, the trend of future sales is downward. Based on the time series analysis, the business may continue to operate and generate small revenues until August 2016. However, due to low sales volume, a low profit margin and operating expenses, the revenues may not be adequate to produce positive net income, and the business may not be able to operate afterward. The principal question arising from this is that forecasting sales in the region will be difficult where the business cycle is so dynamic and volatile due to systematic risks and an unforeseeable future.

  9. Efficient Bayesian inference for natural time series using ARFIMA processes

    Science.gov (United States)

    Graves, Timothy; Gramacy, Robert; Franzke, Christian; Watkins, Nicholas

    2016-04-01

    Many geophysical quantities, such as atmospheric temperature, water levels in rivers, and wind speeds, have shown evidence of long memory (LM). LM implies that these quantities experience non-trivial temporal memory, which potentially not only enhances their predictability, but also hampers the detection of externally forced trends. Thus, it is important to reliably identify whether or not a system exhibits LM. We present a modern and systematic approach to the inference of LM. We use the flexible autoregressive fractional integrated moving average (ARFIMA) model, which is widely used in time series analysis, and of increasing interest in climate science. Unlike most previous work on the inference of LM, which is frequentist in nature, we provide a systematic treatment of Bayesian inference. In particular, we provide a new approximate likelihood for efficient parameter inference, and show how nuisance parameters (e.g., short-memory effects) can be integrated over in order to focus on long-memory parameters and hypothesis testing more directly. We illustrate our new methodology on the Nile water level data and the central England temperature (CET) time series, with favorable comparison to the standard estimators [1]. In addition we show how the method can be used to perform joint inference of the stability exponent and the memory parameter when ARFIMA is extended to allow for alpha-stable innovations. Such models can be used to study systems where heavy tails and long range memory coexist. [1] Graves et al, Nonlin. Processes Geophys., 22, 679-700, 2015; doi:10.5194/npg-22-679-2015.
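
    The long-memory part of an ARFIMA(p, d, q) model is the fractional differencing operator (1 - B)^d, whose weights follow a simple recursion. The sketch below is a generic illustration of that operator (not the authors' Bayesian sampler), applied to a synthetic persistent series for an assumed memory parameter d.

        import numpy as np

        def frac_diff_weights(d, n):
            """First n weights of (1 - B)**d: w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k."""
            w = np.empty(n)
            w[0] = 1.0
            for k in range(1, n):
                w[k] = w[k - 1] * (k - 1 - d) / k
            return w

        def fractional_difference(x, d):
            """Apply the truncated fractional differencing filter to series x."""
            w = frac_diff_weights(d, len(x))
            return np.array([w[: t + 1][::-1] @ x[: t + 1] for t in range(len(x))])

        rng = np.random.default_rng(3)
        x = np.cumsum(rng.normal(size=1000))      # strongly persistent toy series
        y = fractional_difference(x, d=0.4)       # removes part of the long memory
        print(y[:5])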

  10. Seasonal dynamics of bacterial meningitis: a time-series analysis.

    Science.gov (United States)

    Paireau, Juliette; Chen, Angelica; Broutin, Helene; Grenfell, Bryan; Basta, Nicole E

    2016-06-01

    Bacterial meningitis, which is caused mainly by Neisseria meningitidis, Haemophilus influenzae, and Streptococcus pneumoniae, inflicts a substantial burden of disease worldwide. Yet, the temporal dynamics of this disease are poorly characterised and many questions remain about the ecology of the disease. We aimed to comprehensively assess seasonal trends in bacterial meningitis on a global scale. We developed the first bacterial meningitis global database by compiling monthly incidence data as reported by country-level surveillance systems. Using country-level wavelet analysis, we identified whether a 12 month periodic component (annual seasonality) was detected in time-series that had at least 5 years of data with at least 40 cases reported per year. We estimated the mean timing of disease activity by computing the centre of gravity of the distribution of cases and investigated whether synchrony exists between the three pathogens responsible for most cases of bacterial meningitis. We used country-level data from 66 countries, including from 47 countries outside the meningitis belt in sub-Saharan Africa. A persistent seasonality was detected in 49 (96%) of the 51 time-series from 38 countries eligible for inclusion in the wavelet analyses. The mean timing of disease activity had a latitudinal trend, with bacterial meningitis seasons peaking during the winter months in countries in both the northern and southern hemispheres. The three pathogens shared similar seasonality, but time-shifts differed slightly by country. Our findings provide key insight into the seasonal dynamics of bacterial meningitis and add to knowledge about the global epidemiology of meningitis and the host, environment, and pathogen characteristics driving these patterns. Comprehensive understanding of global seasonal trends in meningitis could be used to design more effective prevention and control strategies. Princeton University Health Grand Challenge, US National Institutes of Health (NIH
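
    The 'centre of gravity' of a seasonal case distribution is commonly computed as a circular mean of month angles weighted by case counts. A minimal sketch of that calculation follows; the monthly counts are synthetic and stand in for the study's surveillance data.

        import numpy as np

        def mean_timing(monthly_cases):
            """Circular mean month (1..12) of a 12-element vector of case counts."""
            months = np.arange(12)
            angles = 2 * np.pi * months / 12.0
            w = np.asarray(monthly_cases, dtype=float)
            s = (w * np.sin(angles)).sum()
            c = (w * np.cos(angles)).sum()
            mean_angle = np.arctan2(s, c) % (2 * np.pi)
            return 1 + 12 * mean_angle / (2 * np.pi)   # back to a month scale

        # Synthetic northern-hemisphere pattern peaking in late winter.
        cases = [80, 95, 70, 40, 20, 10, 8, 9, 15, 25, 40, 60]
        print("mean timing (month):", round(mean_timing(cases), 2))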

  11. Time series inversion of spectra from ground-based radiometers

    Directory of Open Access Journals (Sweden)

    O. M. Christensen

    2013-07-01

    Full Text Available Retrieving time series of atmospheric constituents from ground-based spectrometers often requires different temporal averaging depending on the altitude region in focus. This can lead to several datasets existing for one instrument, which complicates validation and comparisons between instruments. This paper puts forth a possible solution by incorporating the temporal domain into the maximum a posteriori (MAP) retrieval algorithm. The state vector is increased to include measurements spanning a time period, and the temporal correlations between the true atmospheric states are explicitly specified in the a priori uncertainty matrix. This allows the MAP method to effectively select the best temporal smoothing for each altitude, removing the need for several datasets to cover different altitudes. The method is compared to traditional averaging of spectra using a simulated retrieval of water vapour in the mesosphere. The simulations show that the method offers a significant advantage compared to the traditional method, extending the sensitivity an additional 10 km upwards without reducing the temporal resolution at lower altitudes. The method is also tested on the Onsala Space Observatory (OSO) water vapour microwave radiometer, confirming the advantages found in the simulation. Additionally, it is shown how the method can interpolate data in time and provide diagnostic values to evaluate the interpolated data.
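
    For a linear forward model y = Kx + e, the MAP estimate is x_hat = x_a + S_a K' (K S_a K' + S_e)^(-1) (y - K x_a). The sketch below is generic (an identity forward model and an assumed exponential temporal correlation, not the OSO retrieval code) and shows how a temporal correlation length enters the a priori covariance once the state vector spans several measurement times.

        import numpy as np

        def map_retrieval(y, K, x_a, S_a, S_e):
            """Maximum a posteriori estimate for the linear model y = K x + noise."""
            G = S_a @ K.T @ np.linalg.inv(K @ S_a @ K.T + S_e)   # gain matrix
            return x_a + G @ (y - K @ x_a)

        # State vector: one scalar profile value at each of n_t measurement times.
        n_t = 24
        times = np.arange(n_t, dtype=float)
        corr_time = 6.0                                          # assumed correlation length (hours)
        sigma_a = 2.0
        S_a = sigma_a**2 * np.exp(-np.abs(times[:, None] - times[None, :]) / corr_time)

        K = np.eye(n_t)                                          # toy forward model
        S_e = 1.5**2 * np.eye(n_t)                               # measurement noise covariance
        x_true = np.sin(2 * np.pi * times / 24.0)
        rng = np.random.default_rng(4)
        y = K @ x_true + rng.normal(scale=1.5, size=n_t)

        x_hat = map_retrieval(y, K, x_a=np.zeros(n_t), S_a=S_a, S_e=S_e)
        print(np.round(x_hat[:6], 2))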

  12. Time to virological failure, treatment change and interruption for individuals treated within 12 months of HIV seroconversion and in chronic infection

    NARCIS (Netherlands)

    Zugna, Daniela; Geskus, Ronald B.; de Stavola, Bianca; Rosinska, Magdalena; Bartmeyer, Barbara; Boufassa, Faroudy; Chaix, Marie-Laure; Babiker, Abdel; Porter, Kholoud; del Amo, Julia; Meyer, Lawrence; Bucher, Heiner C.; Chêne, Geneviève; Hamouda, Osamah; Pillay, Deenan; Prins, Maria; Rosinska, Magda; Sabin, Caroline; Touloumi, Giota; Olson, Ashley; Coughlin, Kate; Walker, Sarah; de Luca, Andrea; Fisher, Martin; Muga, Robert; Zangerle, Robert; Kelleher, Tony; Cooper, David; Grey, Pat; Finlayson, Robert; Bloch, Mark; Ramacciotti, Tim; Gelgor, Linda; Smith, Don; Gill, John; Lutsar, Irja; Dabis, Francois; Thiebaut, Rodolphe; Masquelier, Bernard; Costagliola, Dominique; Guiguet, Marguerite; Vanhems, Philippe; Ghosn, Jade; Kücherer, Claudia; Paparizos, V.; Gargalianos-Kakolyris, P.; Lazanas, M.; Pantazis, Nikos; Katsarou, Olga; Rezza, Giovanni; Dorrucci, Maria; D'Arminio Monforte, Antonella; van der Helm, Jannie; Schuitemaker, Hanneke; Sannes, Mette; Brubakk, Oddbjorn; Kran, Anne-Marte Bakken; Tor, Jordi; Garcia de Olalla, Patricia; Cayla, Joan; Moreno, Santiago; Monge, Susana; del Romero, Jorge; Pérez-Hoyos, Santiago; Rickenbach, Martin; Francioli, Patrick; Malyuta, Ruslan; Murphy, Gary; Johnson, Anne; Phillips, Andrew; Morrison, Charles; Salata, Robert; Mugerwa, Roy; Chipato, Tsungai; Amornkul, Pauli N.; Gilmour, Jill; Kamali, Anatoli; Karita, Etienne; Giaquinto, Carlo; Gibb, Di; Grarup, Jesper; Kirk, Ole; Ledergerber, Bruno; Panteleev, Alex; Thorne, Claire; Welch, Stephen; Aboulker, Jean-Pierre; Albert, Jan; Asandi, Silvia; de Wit, Stéphane; de Wolf, Frank; Gatell, José; Karpov, Igor; Lundgren, Jens; Møller, Claus; Rakhmanova, Aza; Rockstroh, Jürgen; Anne, Alain Volny; Dedes, Nikos; Fenton, Kevin; Pizzuti, David; Vitoria, Marco; Ellefson, Michelle; Faggion, Silvia; Fradette, Lorraine; Frost, Richard; Schwimmer, Christine; Scott, Martin

    2012-01-01

    Background: Estimates of treatment failure, change and interruption are lacking for individuals treated in early HIV infection. Methods: Using CASCADE data, we compared the effect of treatment in early infection (within 12 months of seroconversion) with that seen in chronic infection on risk of

  13. Predicting Time Series Outputs and Time-to-Failure for an Aircraft Controller Using Bayesian Modeling

    Science.gov (United States)

    He, Yuning

    2015-01-01

    Safety of unmanned aerial systems (UAS) is paramount, but the large number of dynamically changing controller parameters makes it hard to determine if the system is currently stable, and the time before loss of control if not. We propose a hierarchical statistical model using Treed Gaussian Processes to predict (i) whether a flight will be stable (success) or become unstable (failure), (ii) the time-to-failure if unstable, and (iii) time series outputs for flight variables. We first classify the current flight input into success or failure types, and then use separate models for each class to predict the time-to-failure and time series outputs. As different inputs may cause failures at different times, we have to model variable length output curves. We use a basis representation for curves and learn the mappings from input to basis coefficients. We demonstrate the effectiveness of our prediction methods on a NASA neuro-adaptive flight control system.

  14. Spectral Time Series of the Cas A Supernova

    Science.gov (United States)

    Rest, Armin

    2016-10-01

    We propose to obtain time-resolved spectroscopy of the outburst of the enigmatic historical supernova Cas A using STIS spectroscopy of light scattered by a narrow filament of interstellar dust. Our group has identified recent, high-surface brightness filaments that are likely to provide high signal-to-noise reproduction of the evolving spectrum of the Cas A outburst using verified, published techniques developed by us. The timescales to see any appreciable evolution in individual astrophysical objects are typically many orders of magnitude larger than a human life. As a result, astronomers study large numbers of objects at different stages of their evolution to infer how a single object should change with time. Cas A can provide us with the ability to look back in time to the point of explosion by observing its light echoes - SN light scattered off of dust in the Milky Way, which causes a time delay in reaching us. In obtaining spectra of light echoes, we have been able to determine the maximum-light characteristics of the SN. Our goal here is to obtain a single STIS spectrum of a bright Cas A LE, which will provide us with a time series of spectra and a spatially resolved light curve of the Cas A SN. With these data, we will measure the properties of the cooling envelope after the shock breakout of the SN to estimate the radius of the progenitor star. We will then be able to connect the progenitor star to the explosion to the SN to the SNR.

  15. Interglacial climate dynamics and advanced time series analysis

    Science.gov (United States)

    Mudelsee, Manfred; Bermejo, Miguel; Köhler, Peter; Lohmann, Gerrit

    2013-04-01

    Studying the climate dynamics of past interglacials (IGs) helps to better assess the anthropogenically influenced dynamics of the current IG, the Holocene. We select the IG portions from the EPICA Dome C ice core archive, which covers the past 800 ka, to apply methods of statistical time series analysis (Mudelsee 2010). The analysed variables are deuterium/H (indicating temperature) (Jouzel et al. 2007), greenhouse gases (Siegenthaler et al. 2005, Loulergue et al. 2008, Lüthi et al. 2008) and a model-co-derived climate radiative forcing (Köhler et al. 2010). We additionally select high-resolution sea-surface-temperature records from the marine sedimentary archive. The first statistical method, persistence time estimation (Mudelsee 2002), lets us infer the 'climate memory' property of IGs. Second, linear regression informs about long-term climate trends during IGs. Third, ramp function regression (Mudelsee 2000) is adapted to look at abrupt climate changes during IGs. We compare the Holocene with previous IGs in terms of these mathematical approaches, interpret results in a climate context, assess uncertainties and the requirements to data from old IGs for yielding results of 'acceptable' accuracy. This work receives financial support from the Deutsche Forschungsgemeinschaft (Project ClimSens within the DFG Research Priority Program INTERDYNAMIK) and the European Commission (Marie Curie Initial Training Network LINC, No. 289447, within the 7th Framework Programme). References Jouzel J, Masson-Delmotte V, Cattani O, Dreyfus G, Falourd S, Hoffmann G, Minster B, Nouet J, Barnola JM, Chappellaz J, Fischer H, Gallet JC, Johnsen S, Leuenberger M, Loulergue L, Luethi D, Oerter H, Parrenin F, Raisbeck G, Raynaud D, Schilt A, Schwander J, Selmo E, Souchez R, Spahni R, Stauffer B, Steffensen JP, Stenni B, Stocker TF, Tison JL, Werner M, Wolff EW (2007) Orbital and millennial Antarctic climate variability over the past 800,000 years. Science 317:793. Köhler P, Bintanja R

  16. Some problems in inference from time series of geophysical processes

    Science.gov (United States)

    Koutsoyiannis, Demetris

    2010-05-01

    Due to the complexity of geophysical processes, their modelling and the conducting of typical tasks, such as estimation, prediction and hypothesis testing, heavily rely on available data series and their statistical processing. The classical statistical approaches, which are often used in geophysical modelling, are based upon several simplifying assumptions, which are invalidated in natural processes. Central among these is the (usually tacit) time independence assumption, which is regarded as simplifying modelling and statistical testing at no substantial cost for the validity of results. Moreover, the perception of the general behaviour of the natural processes and the implied uncertainty is heavily affected by the classical statistical paradigm that is in common use. However, the study of natural behaviours reveals the dominance of change at a multitude of time scales, which in statistical terms is translated into strong time dependence, decaying very slowly with lag time. In its simplest form, this dependence, and equivalently the multi-scale change, can be described by a Hurst-Kolmogorov process using a single parameter additional to those of the marginal distribution. Remarkably, the Hurst-Kolmogorov stochastic dynamics results in much higher uncertainty in comparison to either nonstationary descriptions, or to typical stationary descriptions with independent random processes and common Markov-type processes. In addition, as far as typical statistical estimation is concerned, the Hurst-Kolmogorov dynamics implies dramatically higher intervals in the estimation of location statistical parameters (e.g., mean) and highly negative bias in the estimation of dispersion parameters (e.g., standard deviation), not to mention the bias and uncertainty in higher order moments. Surprisingly, all these differences are commonly unaccounted for in most studies of geophysical processes, which may result in inappropriate modelling, wrong inferences and false claims about the
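
    One consequence is easy to quantify: for a Hurst-Kolmogorov (fractional Gaussian noise) process with Hurst coefficient H and variance sigma^2, the variance of the sample mean of n values is sigma^2 * n^(2H-2) rather than the classical sigma^2 / n, so the equivalent number of independent observations is n^(2-2H). A short numerical illustration of this standard formula follows.

        n = 1000          # nominal sample size
        sigma2 = 1.0      # process variance

        for H in (0.5, 0.7, 0.9):
            var_mean = sigma2 * n ** (2 * H - 2)      # HK variance of the sample mean
            n_equiv = n ** (2 - 2 * H)                # equivalent independent sample size
            print(f"H={H:.1f}  Var(mean)={var_mean:.4f}  equivalent n={n_equiv:.1f}")
        # H=0.5 recovers the classical 1/n result; H=0.9 leaves only ~4 effective samples.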

  17. Urban Area Monitoring using MODIS Time Series Data

    Science.gov (United States)

    Devadiga, S.; Sarkar, S.; Mauoka, E.

    2015-12-01

    Growing urban sprawl and its impact on global climate due to urban heat island effects has been an active area of research over recent years. This is especially significant in light of the rapid urbanization that is happening in some of the fast-developing nations across the globe. But so far the study of urban area growth has been largely restricted to local and regional scales, using high to medium resolution satellite observations taken at distinct time periods. In this presentation we propose a new approach to detect and monitor urban area expansion using a long time series of MODIS data. This work characterizes data points using a vector of several annual metrics computed from the MODIS 8-day and 16-day composite L3 data products, at 250 m resolution and over several years, and then uses a vector angle mapping classifier to detect and segment the urban area. The classifier is trained using a set of training points obtained from a reference vector point and polygon pre-filtered using the MODIS VI product. This work gains additional significance given that, despite unprecedented urban growth since 2000, the area covered by the urban class in the MODIS Global Land Cover (MCD12Q1, MCDLCHKM and MCDLC1KM) product hasn't changed since the launch of Terra and Aqua. The proposed approach was applied to delineate the urban area around several cities in Asia known to have had maximum growth in the last 15 years. Results were verified using high resolution Landsat data.

  18. Impact of Sensor Degradation on the MODIS NDVI Time Series

    Science.gov (United States)

    Wang, Dongdong; Morton, Douglas Christopher; Masek, Jeffrey; Wu, Aisheng; Nagol, Jyoteshwar; Xiong, Xiaoxiong; Levy, Robert; Vermote, Eric; Wolfe, Robert

    2012-01-01

    Time series of satellite data provide unparalleled information on the response of vegetation to climate variability. Detecting subtle changes in vegetation over time requires consistent satellite-based measurements. Here, the impact of sensor degradation on trend detection was evaluated using Collection 5 data from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensors on the Terra and Aqua platforms. For Terra MODIS, the impact of blue band (Band 3, 470 nm) degradation on simulated surface reflectance was most pronounced at near-nadir view angles, leading to a 0.001-0.004 yr-1 decline in Normalized Difference Vegetation Index (NDVI) under a range of simulated aerosol conditions and surface types. Observed trends in MODIS NDVI over North America were consistent with simulated results, with nearly a threefold difference in negative NDVI trends derived from Terra (17.4%) and Aqua (6.7%) MODIS sensors during 2002-2010. Planned adjustments to Terra MODIS calibration for Collection 6 data reprocessing will largely eliminate this negative bias in detection of NDVI trends.

  19. Patch-Based Forest Change Detection from Landsat Time Series

    Directory of Open Access Journals (Sweden)

    M. Joseph Hughes

    2017-05-01

    Full Text Available In the species-rich and structurally complex forests of the Eastern United States, disturbance events are often partial and therefore difficult to detect using remote sensing methods. Here we present a set of new algorithms, collectively called Vegetation Regeneration and Disturbance Estimates through Time (VeRDET), which employ a novel patch-based approach to detect periods of vegetation disturbance, stability, and growth from the historical Landsat image records. VeRDET generates a yearly clear-sky composite from satellite imagery, calculates a spectral vegetation index for each pixel in that composite, spatially segments the vegetation index image into patches, temporally divides the time series into differently sloped segments, and then labels those segments as disturbed, stable, or regenerating. Segmentation at both the spatial and temporal steps is performed using total variation regularization, an algorithm originally designed for signal denoising. This study explores VeRDET's effectiveness in detecting forest change using four vegetation indices and two parameters controlling the spatial and temporal scales of segmentation within a calibration region. We then evaluate algorithm effectiveness within a 386,000 km2 area in the Eastern United States, where VeRDET has an overall error of 23% and omission error across disturbances ranging from 22% to 78% depending on agent.
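
    The temporal step described above fits differently sloped segments through total variation regularization. The sketch below is a heavily simplified stand-in, not the published VeRDET code: it applies L1 trend filtering (an l1 penalty on second differences, solved with cvxpy) to a synthetic yearly index trajectory and labels the resulting segments by the sign of their fitted slope.

        import numpy as np
        import cvxpy as cp

        # Synthetic yearly vegetation-index trajectory: stable, abrupt disturbance, regrowth.
        years = np.arange(1985, 2015)
        y = np.concatenate([np.full(12, 0.80),
                            np.linspace(0.80, 0.35, 3),
                            np.linspace(0.35, 0.75, 15)])
        y = y + np.random.default_rng(5).normal(scale=0.02, size=y.size)

        # L1 trend filtering: penalizing second differences yields a piecewise-linear fit,
        # i.e. a small number of differently sloped temporal segments.
        lam = 1.0
        x = cp.Variable(y.size)
        problem = cp.Problem(cp.Minimize(cp.sum_squares(y - x) + lam * cp.norm1(cp.diff(x, 2))))
        problem.solve()

        slope = np.diff(x.value)
        labels = np.where(slope < -0.01, "disturbed",
                          np.where(slope > 0.01, "regenerating", "stable"))
        for yr, lab in zip(years[1:], labels):
            print(yr, lab)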

  20. Noninvertibility and resonance in discrete-time neural networks for time-series processing

    Science.gov (United States)

    Gicquel, N.; Anderson, J. S.; Kevrekidis, I. G.

    1998-01-01

    We present a computer-assisted study emphasizing certain elements of the dynamics of artificial neural networks (ANNs) used for discrete time-series processing and nonlinear system identification. The structure of the network gives rise to the possibility of multiple inverses of a phase point backward in time; this is not possible for the continuous-time system from which the time series are obtained. Using a two-dimensional illustrative model in an oscillatory regime, we study here the interaction of attractors predicted by the discrete-time ANN model (invariant circles and periodic points locked on them) with critical curves. These curves constitute a generalization of critical points for maps of the interval (in the sense of Julia-Fatou); their interaction with the model-predicted attractors plays a crucial role in the organization of the bifurcation structure and ultimately in determining the dynamic behavior predicted by the neural network.

  1. Mackenzie River Delta morphological change based on Landsat time series

    Science.gov (United States)

    Vesakoski, Jenni-Mari; Alho, Petteri; Gustafsson, David; Arheimer, Berit; Isberg, Kristina

    2015-04-01

    Arctic rivers are sensitive, yet largely unexplored, river systems on which climate change will have an impact. Research has not focused in detail on the fluvial geomorphology of Arctic rivers, mainly due to the remoteness and vastness of the watersheds, problems with data availability, and difficult accessibility. Nowadays, wide collaborative spatial databases in hydrology as well as extensive remote sensing datasets over the Arctic are available, and they enable improved investigation of Arctic watersheds. It is therefore also important to develop and improve methods that enable detecting fluvio-morphological processes from the available data. Furthermore, it is essential to reconstruct and improve the understanding of past fluvial processes in order to better understand prevailing and future fluvial processes. In this study we sum up the fluvial geomorphological change in the Mackenzie River Delta during the last ~30 years. The Mackenzie River Delta (~13 000 km2) is situated in the Northwest Territories, Canada, where the Mackenzie River enters the Beaufort Sea, Arctic Ocean, near the city of Inuvik. The Mackenzie River Delta is a lake-rich, productive ecosystem and an ecologically sensitive environment. The research objective is achieved through two sub-objectives: 1) interpretation of the deltaic river channel planform change by applying a Landsat time series; 2) definition of the variables that have impacted the most on the detected changes by applying statistics and long hydrological time series derived from the Arctic-HYPE model (HYdrologic Predictions for Environment) developed by the Swedish Meteorological and Hydrological Institute. According to our satellite interpretation, field observations and statistical analyses, notable spatio-temporal changes have occurred in the morphology of the river channel and delta during the past 30 years. For example, the channels have been evolving in terms of braiding and sinuosity. In addition, various linkages between the studied

  2. Factor models in high-dimensional time series : A time-domain approach

    NARCIS (Netherlands)

    Hallin, M.; Lippi, M.

    2013-01-01

    High-dimensional time series may well be the most common type of dataset in the so-called “big data” revolution, and have entered current practice in many areas, including meteorology, genomics, chemometrics, connectomics, complex physics simulations, biological and environmental research, finance

  3. Time-variant power spectral analysis of heart-rate time series by ...

    Indian Academy of Sciences (India)

    Frequency domain representation of a short-term heart-rate time series (HRTS) signal is a popular method for evaluating the cardiovascular control system. The spectral parameters, viz. percentage power in low frequency band (%PLF), percentage power in high frequency band (%PHF), power ratio of low frequency to high ...

  4. Academic Workload and Working Time: Retrospective Perceptions versus Time-Series Data

    Science.gov (United States)

    Kyvik, Svein

    2013-01-01

    The purpose of this article is to examine the validity of perceptions by academic staff about their past and present workload and working hours. Retrospective assessments are compared with time-series data. The data are drawn from four mail surveys among academic staff in Norwegian universities undertaken in the period 1982-2008. The findings show…

  5. Time-variant power spectral analysis of heart-rate time series by ...

    Indian Academy of Sciences (India)

    From this observation we conclude that during acute myocardial infarction, the anterior wall MI has stimulated sympathetic activity, while the acute inferior wall MI has stimulated parasympathetic activity. Results obtained from ARMA-based analysis of heart-rate time series signals are capable of complementing the clinical ...

  6. Estimation of vegetation cover resilience from satellite time series

    Directory of Open Access Journals (Sweden)

    T. Simoniello

    2008-07-01

    Full Text Available Resilience is a fundamental concept for understanding vegetation as a dynamic component of the climate system. It expresses the ability of ecosystems to tolerate disturbances and to recover their initial state. Recovery times are basic parameters of the vegetation's response to forcing and, therefore, are essential for describing realistic vegetation within dynamical models. Healthy vegetation tends to rapidly recover from shock and to persist in growth and expansion. On the contrary, climatic and anthropic stress can reduce resilience thus favouring persistent decrease in vegetation activity.

    In order to characterize resilience, we analyzed the time series 1982–2003 of 8 km GIMMS AVHRR-NDVI maps of the Italian territory. Persistence probability of negative and positive trends was estimated according to the vegetation cover class, altitude, and climate. Generally, mean recovery times from negative trends were shorter than those estimated for positive trends, as expected for vegetation of healthy status. Some signatures of inefficient resilience were found in high-level mountainous areas and in the Mediterranean sub-tropical ones. This analysis was refined by aggregating pixels according to phenology. This multitemporal clustering synthesized information on vegetation cover, climate, and orography rather well. The consequent persistence estimations confirmed and detailed hints obtained from the previous analyses. Under the same climatic regime, different vegetation resilience levels were found. In particular, within the Mediterranean sub-tropical climate, clustering was able to identify features with different persistence levels in areas that are liable to different levels of anthropic pressure. Moreover, it was capable of enhancing reduced vegetation resilience also in the southern areas under Warm Temperate sub-continental climate. The general consistency of the obtained results showed that, with the help of suited analysis

  7. Representative locations from time series of soil water content using time stability and wavelet analysis.

    Science.gov (United States)

    Rivera, Diego; Lillo, Mario; Granda, Stalin

    2014-12-01

    The concept of time stability has been widely used in the design and assessment of soil moisture monitoring networks, as well as in hydrological studies, because it is a technique that allows the identification of particular locations that represent the mean soil moisture of the field. In this work, we assess the effect of time stability calculations as new information is added and how time stability calculations are affected at shorter periods, subsampled from the original time series, containing different amounts of precipitation. In doing so, we defined two experiments to explore the time stability behavior. The first experiment sequentially adds new data to the previous time series to investigate the long-term influence of new data on the results. The second experiment applies a windowing approach, taking sequential subsamples from the entire time series to investigate the influence of short-term changes associated with the precipitation in each window. Our results from an operating network (seven monitoring points equipped with four sensors each in a 2-ha blueberry field) show that as information is added to the time series, there are changes in the location of the most stable point (MSP), and that, taking moving 21-day windows, it is clear that most of the variability of soil water content changes is associated with both the amount and intensity of rainfall. The changes of the MSP over each window depend on the amount of water entering the soil and the previous state of the soil water content. For our case study, the upper strata are proxies for hourly to daily changes in soil water content, while the deeper strata are proxies for medium-range stored water. Thus, different locations and depths are representative of processes at different time scales. This situation must be taken into account when water management depends on soil water content values from fixed locations.
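
    Time stability analyses of this kind typically rank locations by the mean relative difference (MRD) from the spatial mean and its standard deviation; the most stable point is the one whose MRD is near zero with a small spread. A minimal sketch with synthetic data (not the blueberry-field network) follows.

        import numpy as np

        rng = np.random.default_rng(6)
        n_times, n_sites = 200, 7
        field_mean = 0.25 + 0.05 * np.sin(np.linspace(0, 8 * np.pi, n_times))
        offsets = rng.normal(scale=0.03, size=n_sites)                  # persistent site biases
        theta = field_mean[:, None] + offsets[None, :] \
                + rng.normal(scale=0.01, size=(n_times, n_sites))       # soil water content

        spatial_mean = theta.mean(axis=1, keepdims=True)
        rel_diff = (theta - spatial_mean) / spatial_mean                # relative difference per time
        mrd = rel_diff.mean(axis=0)                                     # mean relative difference
        sdrd = rel_diff.std(axis=0)                                     # its standard deviation

        # Rank sites: small |MRD| and small SDRD indicate a time-stable, representative location.
        rank = np.argsort(np.hypot(mrd, sdrd))
        print("most stable site:", rank[0], " MRD:", round(mrd[rank[0]], 3))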

  8. Long-term Hydrologic Time Series in Maine

    Science.gov (United States)

    Huntington, T. G.; Dudley, R. W.; Hodgkins, G. A.

    2002-05-01

    Long-term hydrologic data are valuable for improving our understanding of how water resources are likely to respond to changes in climate. The hydrologic regimes of rivers and lakes integrate climatological, geophysical, and biological processes that are difficult to model. Hydrologic variables record a synthesis of these complex interactions in metrics that are relatively easy to measure, compare among regions, and relate to measured climatic and land use variables. Here we present representative case studies using datasets including lake and river ice-out dates, seasonal center-of-volume date (SCVD, the date by which half of the snowmelt-dominated discharge volume for the period 1 January to 31 May has occurred), water temperature, snow water equivalent, total annual discharge, and river ice thickness. These datasets were collected mainly by the U.S. Geological Survey (USGS). The snow data were collected by the Maine Geological Survey, the USGS, and private companies. The lake ice-out data were collected by various citizen observers and utility companies. Sea surface temperature measurements at Boothbay Harbor, Maine, are recorded by the Maine Department of Marine Resources. Because the calculation of ice thickness was peripheral to making these river flow measurements, the existence of these ice thickness data is fortuitous and provides a valuable data set that can be used in hydroclimatological investigations for the detection of environmental change. Time-series analysis of lake and river ice-out dates, SCVD, and water temperature shows a consistent hydrologic response indicating earlier spring warming in recent decades. The dates for Damariscotta Lake and the Piscataquis River ice-out have advanced significantly over their respective periods of record. Our analyses show that a majority of the lakes and rivers in Maine having long-term records (>100 years for lakes, and >50 years for rivers) show significant advances. The date of the SCVD, which is associated

  9. Instantaneous parameter estimation in cardiovascular time series by harmonic and time-frequency analysis.

    Science.gov (United States)

    Monti, Alessandro; Médigue, Claire; Mangin, Laurence

    2002-12-01

    Time-frequency and harmonic methods, such as the smoothed pseudo Wigner-Ville distribution (SPWVD) and complex demodulation (CDM), provide useful time-varying spectral parameter estimators. However, each of these methods has limitations that a joint utilization could largely reduce, due to their interesting complementary features. The aim of this paper is to validate the joint SPWVD-CDM method on synthetic and real cardiovascular time series with normal and reduced variability, such as in autonomic blockade or autonomic deficiency. We propose two indexes related to the noise present in the signal and to the dispersion of the power spectrum in order to validate instantaneous parameter estimation. In the low-frequency band, the interpretation of the instantaneous frequency and phase of cardiovascular time-series should be discarded in many real-life situations. Conversely, in the high frequency band, under paced breathing, the reliability of the instantaneous parameters is demonstrated even in conditions of reduced cardiovascular variability.
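
    Complex demodulation, one of the two estimators combined above, is easy to sketch: multiply the signal by a complex exponential at the frequency of interest, low-pass filter, and read off instantaneous amplitude and phase. The cut-off, filter order and toy signal below are arbitrary illustrative choices.

        import numpy as np
        from scipy.signal import butter, filtfilt

        def complex_demodulation(x, fs, f0, cutoff=0.04, order=4):
            """Instantaneous amplitude and phase of the component of x near frequency f0."""
            t = np.arange(len(x)) / fs
            shifted = x * np.exp(-2j * np.pi * f0 * t)           # shift f0 down to 0 Hz
            b, a = butter(order, cutoff, btype="low", fs=fs)     # keep only the slow envelope
            low = filtfilt(b, a, shifted.real) + 1j * filtfilt(b, a, shifted.imag)
            return 2 * np.abs(low), np.unwrap(np.angle(low))

        # Toy signal: a 0.25 Hz respiratory-band component with a slow amplitude drift.
        fs = 4.0                                                 # samples per second
        t = np.arange(0, 300, 1 / fs)
        x = (1 + 0.5 * np.sin(2 * np.pi * 0.005 * t)) * np.sin(2 * np.pi * 0.25 * t)
        amp, phase = complex_demodulation(x, fs, f0=0.25)
        print("amplitude range:", round(amp.min(), 2), "to", round(amp.max(), 2))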

  10. Change classification in SAR time series: a functional approach

    Science.gov (United States)

    Boldt, Markus; Thiele, Antje; Schulz, Karsten; Hinz, Stefan

    2017-10-01

    Change detection represents a broad field of research in SAR remote sensing, consisting of many different approaches. Besides the simple recognition of change areas, the analysis of type, category or class of the change areas is at least as important for creating a comprehensive result. Conventional strategies for change classification are based on supervised or unsupervised landuse / landcover classifications. The main drawback of such approaches is that the quality of the classification result directly depends on the selection of training and reference data. Additionally, supervised processing methods require an experienced operator who capably selects the training samples. This training step is not necessary when using unsupervised strategies, but nevertheless meaningful reference data must be available for identifying the resulting classes. Consequently, an experienced operator is indispensable. In this study, an innovative concept for the classification of changes in SAR time series data is proposed. Regarding the drawbacks of traditional strategies given above, it copes without using any training data. Moreover, the method can be applied by an operator, who does not have detailed knowledge about the available scenery yet. This knowledge is provided by the algorithm. The final step of the procedure, which main aspect is given by the iterative optimization of an initial class scheme with respect to the categorized change objects, is represented by the classification of these objects to the finally resulting classes. This assignment step is subject of this paper.

  11. Distinguishing deterministic and noise components in ELM time series

    International Nuclear Information System (INIS)

    Zvejnieks, G.; Kuzovkov, V.N

    2004-01-01

    Full text: One of the main problems in preliminary data analysis is distinguishing the deterministic and noise components in experimental signals. For example, in plasma physics the question arises when analyzing edge localized modes (ELMs): is the observed ELM behavior governed by complicated deterministic chaos or just by random processes? We have developed a methodology based on financial engineering principles, which allows us to distinguish deterministic and noise components. We extended the linear auto regression method (AR) by including non-linearity (NAR method). As a starting point we have chosen the non-linearity in polynomial form; however, the NAR method can be extended to any other type of non-linear functions. The best polynomial model describing the experimental ELM time series was selected using the Bayesian Information Criterion (BIC). With this method we have analyzed type I ELM behavior in a subset of ASDEX Upgrade shots. The obtained results indicate that a linear AR model can describe the ELM behavior. In turn, this means that type I ELM behavior is of a relaxation or random type.
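
    The model-selection step can be illustrated with ordinary linear AR(p) models fitted by least squares and compared via BIC; the polynomial (NAR) terms are omitted, so this is a generic sketch rather than the authors' procedure.

        import numpy as np

        def fit_ar_bic(x, p):
            """Least-squares AR(p) fit; returns (coefficients, BIC)."""
            X = np.column_stack([x[p - k - 1: len(x) - k - 1] for k in range(p)])
            y = x[p:]
            design = np.column_stack([np.ones(len(y)), X])
            coef, *_ = np.linalg.lstsq(design, y, rcond=None)
            resid = y - design @ coef
            n = len(y)
            bic = n * np.log(resid.var()) + (p + 1) * np.log(n)
            return coef, bic

        rng = np.random.default_rng(7)
        x = np.zeros(2000)
        for t in range(2, 2000):                       # toy AR(2) signal plus noise
            x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.normal()

        for p in range(1, 6):
            _, bic = fit_ar_bic(x, p)
            print(f"AR({p})  BIC = {bic:.1f}")         # minimum BIC picks the preferred order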

  12. Enhancing time-series detection algorithms for automated biosurveillance.

    Science.gov (United States)

    Tokars, Jerome I; Burkom, Howard; Xing, Jian; English, Roseanne; Bloom, Steven; Cox, Kenneth; Pavlin, Julie A

    2009-04-01

    BioSense is a US national system that uses data from health information systems for automated disease surveillance. We studied 4 time-series algorithm modifications designed to improve sensitivity for detecting artificially added data. To test these modified algorithms, we used reports of daily syndrome visits from 308 Department of Defense (DoD) facilities and 340 hospital emergency departments (EDs). At a constant alert rate of 1%, sensitivity was improved for both datasets by using a minimum standard deviation (SD) of 1.0, a 14-28 day baseline duration for calculating mean and SD, and an adjustment for total clinic visits as a surrogate denominator. Stratifying baseline days into weekdays versus weekends to account for day-of-week effects increased sensitivity for the DoD data but not for the ED data. These enhanced methods may increase sensitivity without increasing the alert rate and may improve the ability to detect outbreaks by using automated surveillance system data.
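
    A stripped-down detector in the spirit of these modifications is sketched below: a trailing 28-day baseline, a floor on the standard deviation, and an adjustment for total visits as a surrogate denominator. The alert threshold, window length and synthetic counts are illustrative assumptions, not the BioSense configuration.

        import numpy as np

        def detect(counts, visits, baseline_days=28, min_sd=1.0, z_alert=3.0):
            """Flag days whose count exceeds the visit-adjusted expectation + z_alert * SD."""
            alerts = []
            for t in range(baseline_days, len(counts)):
                base = slice(t - baseline_days, t)
                rate = counts[base].sum() / max(visits[base].sum(), 1)   # baseline syndrome rate
                expected = rate * visits[t]                              # adjust for today's volume
                sd = max(np.std(counts[base] - rate * visits[base], ddof=1), min_sd)
                if counts[t] > expected + z_alert * sd:
                    alerts.append(t)
            return alerts

        rng = np.random.default_rng(8)
        days = 120
        visits = rng.poisson(400, size=days)                             # total clinic visits per day
        counts = rng.poisson(8, size=days).astype(float)                 # syndrome visits per day
        counts[100:103] += 25                                            # injected outbreak signal
        print("alert days:", detect(counts, visits))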

  13. Industrial electricity demand for Turkey: A structural time series analysis

    International Nuclear Information System (INIS)

    Dilaver, Zafer; Hunt, Lester C.

    2011-01-01

    This research investigates the relationship between Turkish industrial electricity consumption, industrial value added and electricity prices in order to forecast future Turkish industrial electricity demand. To achieve this, an industrial electricity demand function for Turkey is estimated by applying the structural time series technique to annual data over the period 1960 to 2008. In addition to identifying the size and significance of the price and industrial value added (output) elasticities, this technique also uncovers the electricity Underlying Energy Demand Trend (UEDT) for the Turkish industrial sector and is, as far as is known, the first attempt to do this. The results suggest that output, real electricity prices and a UEDT all have an important role to play in driving Turkish industrial electricity demand. Consequently, they should all be incorporated when modelling Turkish industrial electricity demand, and the estimated UEDT should arguably be considered in future energy policy decisions concerning the Turkish electricity industry. The output and price elasticities are estimated to be 0.15 and -0.16 respectively, with a UEDT that is increasing, but at a decreasing rate. Based on the estimated equation and different forecast assumptions, it is predicted that Turkish industrial electricity demand will be somewhere between 97 and 148 TWh by 2020. -- Research Highlights: → Estimated output and price elasticities of 0.15 and -0.16 respectively. → Estimated upward sloping UEDT (i.e. energy using) but at a decreasing rate. → Predicted Turkish industrial electricity demand between 97 and 148 TWh in 2020.
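
    In spirit, the structural time series approach decomposes (log) demand into a stochastic trend (the UEDT) plus price and output effects. A rough sketch using statsmodels' UnobservedComponents on synthetic data is given below; the trend specification and all numbers are assumptions for illustration, not the paper's estimated model.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(9)
        n = 49                                             # annual data, e.g. 1960-2008
        log_output = np.cumsum(rng.normal(0.04, 0.03, n))  # synthetic industrial value added (log)
        log_price = np.cumsum(rng.normal(0.00, 0.05, n))   # synthetic real electricity price (log)
        trend = np.cumsum(rng.normal(0.01, 0.005, n))      # underlying energy demand trend (UEDT)
        log_demand = 0.15 * log_output - 0.16 * log_price + trend + rng.normal(0, 0.02, n)

        exog = np.column_stack([log_output, log_price])
        model = sm.tsa.UnobservedComponents(log_demand, level='local linear trend', exog=exog)
        res = model.fit(disp=False)
        print(res.summary())                               # exog betas ~ output and price elasticities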

  14. Imputation of missing data in time series for air pollutants

    Science.gov (United States)

    Junger, W. L.; Ponce de Leon, A.

    2015-02-01

    Missing data are a major concern in epidemiological studies of the health effects of environmental air pollutants. This article presents an imputation-based method that is suitable for multivariate time series data, which uses the EM algorithm under the assumption of a normal distribution. Different approaches are considered for filtering the temporal component. A simulation study was performed to assess the validity and performance of the proposed method in comparison with some frequently used methods. Simulations showed that when the amount of missing data was as low as 5%, the complete data analysis yielded satisfactory results regardless of the generating mechanism of the missing data, whereas the validity began to degenerate when the proportion of missing values exceeded 10%. The proposed imputation method exhibited good accuracy and precision in different settings with respect to the patterns of missing observations. Most of the imputations yielded valid results, even under a missing-not-at-random mechanism. The methods proposed in this study are implemented as a package called mtsdi for the statistical software system R.
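
    The mtsdi package itself is R software; as a loose Python analogue (an assumption for illustration, not the authors' EM implementation), one can impute a multivariate pollutant series with scikit-learn's IterativeImputer after appending simple seasonal covariates to stand in for the filtered temporal component.

        import numpy as np
        import pandas as pd
        from sklearn.experimental import enable_iterative_imputer  # noqa: F401
        from sklearn.impute import IterativeImputer

        rng = np.random.default_rng(10)
        n = 365
        t = np.arange(n)
        pm10 = 30 + 10 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 3, n)
        no2 = 0.8 * pm10 + rng.normal(0, 2, n)
        o3 = 60 - 0.5 * pm10 + rng.normal(0, 4, n)
        data = pd.DataFrame({"pm10": pm10, "no2": no2, "o3": o3})

        # Knock out 10% of the values at random to mimic missing observations.
        mask = rng.random(data.shape) < 0.10
        data_missing = data.mask(mask)

        # Seasonal harmonics serve as crude temporal covariates.
        covariates = np.column_stack([np.sin(2 * np.pi * t / 365), np.cos(2 * np.pi * t / 365)])
        X = np.column_stack([data_missing.to_numpy(), covariates])

        imputed = IterativeImputer(max_iter=20, random_state=0).fit_transform(X)[:, :3]
        rmse = np.sqrt(np.mean((imputed[mask] - data.to_numpy()[mask]) ** 2))
        print("imputation RMSE:", round(rmse, 2))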

  15. Chaos Time Series Prediction Based on Membrane Optimization Algorithms

    Directory of Open Access Journals (Sweden)

    Meng Li

    2015-01-01

    Full Text Available This paper puts forward a prediction model based on a membrane computing optimization algorithm for chaotic time series; the model simultaneously optimizes the parameters of the phase space reconstruction (τ, m) and of the least squares support vector machine (LS-SVM) (γ, σ) by using the membrane computing optimization algorithm. It is an important basis for spectrum management to accurately predict the change trend of parameters in the electromagnetic environment, which can help decision makers to adopt an optimal action. The model presented in this paper is then used to forecast the band occupancy rate of the frequency modulation (FM) broadcasting band and the interphone band. To show the applicability and superiority of the proposed model, this paper compares the forecast model presented in it with conventional similar models. The experimental results show that, for both single-step and multistep prediction, the proposed model performs best based on three error measures, namely, normalized mean square error (NMSE), root mean square error (RMSE), and mean absolute percentage error (MAPE).
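
    The phase space reconstruction step that the membrane algorithm tunes (delay τ and embedding dimension m) is easy to sketch. Below, a delay embedding feeds scikit-learn's SVR as a rough stand-in for LS-SVM, with τ, m and the hyperparameters fixed by hand rather than optimized.

        import numpy as np
        from sklearn.svm import SVR

        def delay_embed(x, m, tau):
            """Return (X, y): m-dimensional delay vectors and the next value of the series."""
            rows = len(x) - (m - 1) * tau - 1
            X = np.array([x[i: i + m * tau: tau] for i in range(rows)])
            y = x[(m - 1) * tau + 1: (m - 1) * tau + 1 + rows]
            return X, y

        # Toy chaotic series: the logistic map.
        x = np.empty(2000)
        x[0] = 0.4
        for t in range(1, 2000):
            x[t] = 3.9 * x[t - 1] * (1 - x[t - 1])

        m, tau = 3, 1                       # hand-picked reconstruction parameters
        X, y = delay_embed(x, m, tau)
        split = 1500
        model = SVR(C=10.0, gamma="scale").fit(X[:split], y[:split])
        pred = model.predict(X[split:])
        print("one-step NMSE:", np.mean((pred - y[split:]) ** 2) / np.var(y[split:]))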

  16. Nonlinear time series analysis of normal and pathological human walking

    Science.gov (United States)

    Dingwell, Jonathan B.; Cusumano, Joseph P.

    2000-12-01

    Characterizing locomotor dynamics is essential for understanding the neuromuscular control of locomotion. In particular, quantifying dynamic stability during walking is important for assessing people who have a greater risk of falling. However, traditional biomechanical methods of defining stability have not quantified the resistance of the neuromuscular system to perturbations, suggesting that more precise definitions are required. For the present study, average maximum finite-time Lyapunov exponents were estimated to quantify the local dynamic stability of human walking kinematics. Local scaling exponents, defined as the local slopes of the correlation sum curves, were also calculated to quantify the local scaling structure of each embedded time series. Comparisons were made between overground and motorized treadmill walking in young healthy subjects and between diabetic neuropathic (NP) patients and healthy controls (CO) during overground walking. A modification of the method of surrogate data was developed to examine the stochastic nature of the fluctuations overlying the nominally periodic patterns in these data sets. Results demonstrated that having subjects walk on a motorized treadmill artificially stabilized their natural locomotor kinematics by small but statistically significant amounts. Furthermore, a paradox previously present in the biomechanical literature that resulted from mistakenly equating variability with dynamic stability was resolved. By slowing their self-selected walking speeds, NP patients adopted more locally stable gait patterns, even though they simultaneously exhibited greater kinematic variability than CO subjects. Additionally, the loss of peripheral sensation in NP patients was associated with statistically significant differences in the local scaling structure of their walking kinematics at those length scales where it was anticipated that sensory feedback would play the greatest role. Lastly, stride-to-stride fluctuations in the

  17. On the C++ Object Programming for Time Series, in the Linux framework

    OpenAIRE

    Mateescu, George Daniel

    2013-01-01

    We study the implementation of time series through C++ classes, using the fundamentals of the C++ programming language, in the Linux framework. Such an implementation may be useful in time series modelling.

  18. A time-series approach to dynamical systems from classical and quantum worlds

    Energy Technology Data Exchange (ETDEWEB)

    Fossion, Ruben [Instituto Nacional de Geriatría, Periférico Sur No. 2767, Col. San Jerónimo Lídice, Del. Magdalena Contreras, 10200 México D.F., Mexico and Centro de Ciencias de la Complejidad (C3), Universidad Nacional Autó (Mexico)

    2014-01-08

    This contribution discusses some recent applications of time-series analysis in Random Matrix Theory (RMT), and applications of RMT in the statistical analysis of eigenspectra of correlation matrices of multivariate time series.

  19. Predicting forest structure across space and time using lidar and Landsat time series (Invited)

    Science.gov (United States)

    Cohen, W. B.; Pflugmacher, D.; Yang, Z.

    2013-12-01

    Lidar is unprecedented in its ability to provide detailed characterizations of forest structure. However, use of lidar is currently limited to relatively small areas associated with specific projects. Moreover, lidar data are even more severely limited historically, which inhibits retrospective analyses of structure change. Landsat data is commonly dismissed when considering a need to map forest structure due to its lack of sensitivity to structural variability. But with the opening of the archive by USGS, Landsat data can now be used in creative ways that take advantage of dense time series to describe historic disturbance and recovery. Because the condition and state of a forest at any given location is largely a function of its disturbance history, this provides an opportunity to use Landsat time series to inform statistical models that predict current forest structure. Additionally, because Landsat time series go back to 1972, it becomes possible to extend those models back in time to derive structure trajectories for retrospective analyses. We will present the results from one or two studies in the Pacific Northwest, USA that use disturbance history metrics derived from Landsat time series to demonstrate the new power of Landsat to predict forest structure (e.g., aboveground live biomass, height). The primary metrics used relate to the magnitude of the greatest disturbance, pre- and post- disturbance spectral trends, and current spectral properties. This is accomplished using a limited field dataset to translate a lidar coverage into the structure measures of interest, and then sampling the lidar data to build a robust statistical relationship between lidar-derived structure and disturbance history. We examined the effect of number of years of history on prediction strength and found that R2 increases and RMSE decreases for a period of ~20 years. This means we can predict forest structure as far back as 1992, using the 20 years of history information contained

  20. Quirky patterns in time-series of estimates of recruitment could be artefacts

    DEFF Research Database (Denmark)

    Dickey-Collas, M.; Hinzen, N.T.; Nash, R.D.M.

    2015-01-01

    employed, and the associated modelling assumptions, can have an important influence on the characteristics of each time-series. We explore this idea by investigating recruitment time-series with three different recruitment parameterizations: a stock–recruitment model, a random-walk time-series model...