WorldWideScience

Sample records for pre-pleistocene time series

  1. Modeling 100,000-year climate fluctuations in pre-Pleistocene time series

    Science.gov (United States)

    Crowley, Thomas J.; Kim, Kwang-Yul; Mengel, John G.; Short, David A.

    1992-01-01

    A number of pre-Pleistocene climate records exhibit significant fluctuations at the 100,000-year (100-ky) eccentricity period, before the time of such fluctuations in global ice volume. The origin of these fluctuations has been obscure. Results reported here from a modeling study suggest that such a response can occur over low-altitude land areas involved in monsoon fluctuations. The twice yearly passage of the sun across the equator and the seasonal timing of perihelion interact to increase both 100-ky and 400-ky power in the modeled temperature field. The magnitude of the temperature response is sufficiently large to leave an imprint on the geologic record, and simulated fluctuations resemble those found in records of Triassic lake levels.
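
    The abstract above reports enhanced 100-ky and 400-ky power in a modeled temperature field. Below is a minimal, self-contained sketch of the kind of spectral check involved: build an evenly sampled proxy-like series and inspect its periodogram at the main orbital bands. The synthetic series, the 1-kyr sample spacing and the band list are illustrative assumptions, not the authors' model or data.

```python
# Illustrative sketch (not the authors' climate model): estimate spectral power
# of an evenly sampled proxy time series and inspect the orbital bands.
# The synthetic series below is a stand-in for a real pre-Pleistocene record.
import numpy as np

dt_kyr = 1.0                               # sample spacing: 1 kyr
t = np.arange(0, 2000, dt_kyr)             # 2 Myr synthetic record
rng = np.random.default_rng(0)
proxy = (np.sin(2 * np.pi * t / 100.0)          # 100-kyr eccentricity band
         + 0.5 * np.sin(2 * np.pi * t / 400.0)  # 400-kyr band
         + rng.normal(0, 1.0, t.size))          # noise stand-in

# Periodogram via the real FFT; remove the mean to suppress zero-frequency leakage.
proxy = proxy - proxy.mean()
freqs = np.fft.rfftfreq(t.size, d=dt_kyr)  # cycles per kyr
power = np.abs(np.fft.rfft(proxy)) ** 2

for period in (400.0, 100.0, 41.0, 23.0):  # main orbital periods, kyr
    idx = np.argmin(np.abs(freqs - 1.0 / period))
    print(f"{period:6.0f} kyr band: relative power {power[idx] / power.max():.3f}")
```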

  2. Stable isotope time series and dentin increments elucidate Pleistocene proboscidean paleobiology

    Science.gov (United States)

    Fisher, Daniel; Rountrey, Adam; Smith, Kathlyn; Fox, David

    2010-05-01

    Investigations of stable isotope composition of mineralized tissues have added greatly to our knowledge of past climates and dietary behaviors of organisms, even when they are implemented through 'bulk sampling', in which a single assay yields a single, time-averaged value. Likewise, the practice of 'sclerochronology', which documents periodic structural increments comprising a growth record for accretionary tissues, offers insights into rates of growth and age data at a scale of temporal resolution permitted by the nature of structural increments. We combine both of these approaches to analyze dental tissues of late Pleistocene proboscideans. Tusk dentin typically preserves a record of accretionary growth consisting of histologically distinct increments on daily, approximately weekly, and yearly time scales. Working on polished transverse or longitudinal sections, we mill out a succession of temporally controlled dentin samples bounded by clear structural increments with a known position in the sequence of tusk growth. We further subject each sample (or an aliquot thereof) to multiple compositional analyses - most frequently to assess δ18O and δ13C of hydroxyapatite carbonate, and δ13C and δ15N of collagen. This yields, for each animal and each series of years investigated, a set of parallel compositional time series with a temporal resolution of 1-2 months (or finer if we need additional precision). Patterns in variation of thickness of periodic sub-annual increments yield insight into intra-annual and inter-annual variation of tusk growth rate. This is informative even by itself, but it is still more valuable when coupled with compositional time series. Further, the controls on different stable isotope systems are sufficiently different that the data ensemble yields 'much more than the sum of its parts.' By assessing how compositions and growth rates covary, we monitor with greater confidence changes in local climate, diet, behavior, and health status. We

  3. U-series component dating for late Pleistocene basalt, Longgang, Jilin Province

    International Nuclear Information System (INIS)

    Yu Fusheng; Yuan Wanming; Han Song

    2003-01-01

    The Longgang volcanic swarm is one of the volcanic areas that have remained active into modern times. Given the multiple eruptions in its history, determining the age of each eruption is very important for evaluating volcanic hazards. Alkaline basalt samples taken from Dayizishan and Diaoshuihu were analyzed by the U-series component method after magnetic separation. The ages of the two samples are (71 ± 9) ka and (106 ± 13) ka before present, respectively. These data indicate intense eruptive activity during the late Pleistocene.

  4. Ancient divergence time estimates in Eutropis rugifera support the existence of Pleistocene barriers on the exposed Sunda Shelf

    Directory of Open Access Journals (Sweden)

    Benjamin R. Karin

    2017-10-01

    Episodic sea level changes that repeatedly exposed and inundated the Sunda Shelf characterize the Pleistocene. Available evidence points to a more xeric central Sunda Shelf during periods of low sea levels, and despite the broad land connections that persisted during this time, some organisms are assumed to have faced barriers to dispersal between land-masses on the Sunda Shelf. Eutropis rugifera is a secretive, forest-adapted scincid lizard that ranges across the Sunda Shelf. In this study, we sequenced one mitochondrial (ND2) and four nuclear (BRCA1, BRCA2, RAG1, and MC1R) markers and generated a time-calibrated phylogeny in BEAST to test whether divergence times between Sundaic populations of E. rugifera occurred during Pleistocene sea-level changes, or if they predate the Pleistocene. We find that E. rugifera shows pre-Pleistocene divergences between populations on different Sundaic land-masses. The earliest divergence within E. rugifera separates the Philippine samples from the Sundaic samples approximately 16 Ma; the Philippine populations thus cannot be considered conspecific with Sundaic congeners. Sundaic populations diverged approximately 6 Ma, and populations within Borneo from Sabah and Sarawak separated approximately 4.5 Ma in the early Pliocene, followed by further cladogenesis in Sarawak through the Pleistocene. Divergence of peninsular Malaysian populations from the Mentawai Archipelago occurred approximately 5 Ma. Separation among island populations from the Mentawai Archipelago likely dates to the Pliocene/Pleistocene boundary approximately 3.5 Ma, and our samples from peninsular Malaysia appear to coalesce in the middle Pleistocene, about 1 Ma. Coupled with the monophyly of these populations, these divergence times suggest that despite consistent land-connections between these regions throughout the Pleistocene, E. rugifera still faced barriers to dispersal, which may be a result of environmental shifts that accompanied the

  5. Direct U-series analysis of the Lezetxiki humerus reveals a Middle Pleistocene age for human remains in the Basque Country (northern Iberia).

    Science.gov (United States)

    de-la-Rúa, Concepción; Altuna, Jesús; Hervella, Monserrat; Kinsley, Leslie; Grün, Rainer

    2016-04-01

    In 1964, a human humerus was found in a sedimentary deposit in Lezetxiki Cave (Basque Country, northern Iberia). The first studies on the stratigraphy, associated mammal faunal remains and lithic implements placed the deposits containing the humerus into the Riss glacial stage. Direct chronometric evidence has so far been missing, and the previous chronostratigraphic framework and faunal dating gave inconsistent results. Here we report laser ablation U-series analyses on the humerus yielding a minimum age of 164 ± 9 ka, corresponding to MIS 6. This is the only direct dating analysis of the Lezetxiki humerus and confirms a Middle Pleistocene age for this hominin fossil. Morphometric analyses suggest that the Lezetxiki humerus has close affinities to other Middle Pleistocene archaic hominins, such as those from La Sima de los Huesos at Atapuerca. This emphasizes the significance of the Lezetxiki fossil within the populations that predate the Neanderthals in south-western Europe. It is thus an important key fossil for the understanding of human evolution in Europe during the Middle Pleistocene, a time period when a great morphological diversity is observed but whose phylogenetic meaning is not yet fully understood. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Direct dating of Pleistocene stegodon from Timor Island, East Nusa Tenggara

    Directory of Open Access Journals (Sweden)

    Julien Louys

    2016-03-01

    Stegodons are a commonly recovered extinct proboscidean (elephants and allies) from the Pleistocene record of Southeast Asian oceanic islands. Estimates of when stegodons arrived on individual islands and the timings of their extinctions are poorly constrained due to few reported direct geochronological analyses of their remains. Here we report on uranium-series dating of a stegodon tusk recovered from the Ainaro Gravels of Timor. The six dates obtained indicate the local presence of stegodons in Timor at or before 130 ka, significantly pre-dating the earliest evidence of humans on the island. On the basis of current data, we find no evidence for significant environmental changes or the presence of modern humans in the region during that time. Thus, we do not consider either of these factors to have contributed significantly to their extinction. In the absence of these, we propose that their extinction was possibly the result of long-term demographic and genetic declines associated with an isolated island population.

  7. The challenge of dating Early Pleistocene fossil teeth by the combined uranium series-electron spin resonance method: the Venta Micena palaeontological site (Orce, Spain)

    International Nuclear Information System (INIS)

    Duval, M.; Falgueres, Ch.; Bahain, J.J.; Shao, Q.; Grun, R.; Aubert, M.; Hellstrom, J.; Dolo, J.M.; Agusti, J.; Martinez-Navarro, B.; Palmqvist, P.; Toro-Moyano, I.

    2011-01-01

    The palaeontological site of Venta Micena (Orce, Andalusia, Spain) lies in the eastern sector of the Guadix-Baza basin, one of the best documented areas in Europe for Plio-Pleistocene bio-stratigraphy. The bio-chronological and palaeo-magnetic results, combined with the radiometric data obtained for Atapuerca Sima del Elefante, indicated that the Venta Micena stratum was formed between the Jaramillo and Olduvai palaeo-magnetic events, most likely between 1.22 and 1.77 Ma. Five fossil teeth from two outcrops (sites A and B) were selected to assess the potential of combined uranium series-electron spin resonance (US-ESR) dating of Early Pleistocene sites. Although the US-ESR results from the first outcrop showed a large scatter between the three teeth, the mean age of 1.37 ± 0.24 Ma can be considered a reasonable age estimate for Venta Micena. The mean ESR age of 0.62 ± 0.03 Ma obtained for site B seems to be a severe underestimation when compared with the independent age control. This underestimation is attributed to a relatively recent U-mobilization event that led to some U-leaching. The results show that ESR age calculations for old samples are extremely sensitive to variations in the measured 230Th/234U ratios in dental tissues. Although the results demonstrate that ESR can in principle be applied to Early Pleistocene sites, they also reveal the complexity of dating such old teeth. It is necessary to continue research in several directions, such as the study of the behaviour of ESR signals in old teeth and the understanding of recent U-mobilization processes, to improve the reliability of the combined US-ESR dating method applied to Early Pleistocene times, a period for which the number of available numerical dating techniques is very limited. (authors)

  8. A Pleistocene coastal alluvial fan complex produced by Middle Pleistocene glacio-fluvial processes

    Science.gov (United States)

    Adamson, Kathryn; Woodward, Jamie; Hughes, Philip; Giglio, Federico; Del Bianco, Fabrizio

    2014-05-01

    A coarse-grained alluvial fan sequence at Lipci, Kotor Bay, in western Montenegro, provides a sedimentary record of meltwater streams draining from the Orjen Massif (1,894 m a.s.l.) to the coastal zone. At Lipci sedimentary evidence and U-series ages have been used alongside offshore bathymetric imagery and seismic profiles to establish the size of the fan and constrain the nature and timing of its formation. Establishing the depositional history of such coastal fans is important for our understanding of cold stage sediment flux from glaciated uplands to the offshore zone, and for exploring the impact of sea level change on fan reworking. There is evidence of at least four phases of Pleistocene glaciation on the Orjen massif, which have been U-series dated and correlated to MIS 12, MIS 6, MIS 5d-2 and the Younger Dryas. A series of meltwater channels delivered large volumes of coarse- and fine-grained limestone sediment from the glaciated uplands into the Bay of Kotor. At the southern margin of the Orjen massif, a series of large (>700 m long) alluvial fans has developed. Some of these extend offshore for up to 600 m. Lipci fan lies downstream of end moraines in the valley immediately above, which were formed by an extensive outlet glacier of the Orjen ice cap during MIS 12. The terrestrial deposits are part of the fan apex (50 m a.s.l.) that lies at the foot of a steep bedrock channel, but the majority of the fan is now more than 25 m below sea level. The terrestrial fan sediments are strongly cemented by multiple generations of calcite precipitates: the oldest U-series ages are infinite indicating that the fan is >350 ka in age. These ages are in agreement with alluvial sedimentary evidence and U-series ages from other fluvial units on Mount Orjen. The terrestrial portion of the Lipci fan surface contains several channels. These are well preserved due to cementation with calcium carbonate. Submarine imagery indicates that the now submerged portion of the fan also

  9. Effectiveness of Multivariate Time Series Classification Using Shapelets

    Directory of Open Access Journals (Sweden)

    A. P. Karpenko

    2015-01-01

    Typically, time series classifiers require signal pre-processing (filtering signals from noise, artifact removal, etc.), enhancement of signal features (amplitude, frequency, spectrum, etc.), and classification of the signal features in feature space using classical techniques and classification algorithms for multivariate data. We consider a method of classifying time series which does not require enhancement of the signal features. The method uses time series shapelets, i.e. small fragments of the series that best reflect the properties of one of its classes. Despite the significant number of publications on the theory and applications of shapelets for the classification of time series, the task of evaluating the effectiveness of this technique remains relevant. The objective of this publication is to study the effectiveness of a number of modifications of the original shapelet method as applied to multivariate series classification, which is a little-studied problem. The paper presents the problem statement of multivariate time series classification using shapelets and describes the shapelet-based basic method of binary classification, as well as various generalizations and a proposed modification of the method. It also offers software that implements the modified method and results of computational experiments confirming the effectiveness of the algorithmic and software solutions. The paper shows that the modified method and its software allow a classification accuracy of about 85% at best. The shapelet search time increases in proportion to the input data dimension.
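
    The abstract describes the shapelet idea but not its mechanics. Below is a minimal, brute-force sketch of univariate shapelet selection for binary classification: every subsequence of a fixed length is scored by how well a distance threshold separates the two classes. The scoring rule, the toy data and the fixed shapelet length are simplifying assumptions; the paper's modified multivariate method and software are not reproduced here.

```python
# Minimal, brute-force sketch of shapelet-based classification (univariate case).
# This illustrates the idea only, not the modified method from the paper.
import numpy as np

def min_distance(series, shapelet):
    """Smallest Euclidean distance between the shapelet and any window of the series."""
    L = len(shapelet)
    return min(np.linalg.norm(series[i:i + L] - shapelet)
               for i in range(len(series) - L + 1))

def best_shapelet(train, labels, length):
    """Pick the subsequence whose mean-distance threshold best separates the classes
    (scored here by simple split accuracy, a stand-in for information gain)."""
    best, best_thr, best_score = None, None, -1.0
    for s in train:
        for i in range(len(s) - length + 1):
            cand = s[i:i + length]
            d = np.array([min_distance(x, cand) for x in train])
            thr = d.mean()
            pred = (d < thr).astype(int)
            score = max((pred == labels).mean(), (pred != labels).mean())
            if score > best_score:
                best, best_thr, best_score = cand, thr, score
    return best, best_thr

# Toy usage: class 1 contains a local bump, class 0 does not.
rng = np.random.default_rng(1)
X = [rng.normal(0, 0.3, 60) for _ in range(10)]
y = np.array([0] * 5 + [1] * 5)
for k in range(5, 10):
    X[k][20:30] += 2.0                     # embed a discriminative local shape
shp, thr = best_shapelet(X, y, length=10)
print("learned distance threshold:", round(thr, 3))
```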

  10. Timing of Pleistocene glaciations in the High Atlas, Morocco: New 10Be and 36Cl exposure ages

    Science.gov (United States)

    Hughes, Philip D.; Fink, David; Rodés, Ángel; Fenton, Cassandra R.; Fujioka, Toshiyuki

    2018-01-01

    This paper presents data from 42 new samples yielding Late Pleistocene cosmogenic 10Be and 36Cl exposure ages of moraine boulders across a series of glaciated valleys in the Toubkal Massif (4167 m a.s.l.), High Atlas, Morocco. This represents the first comprehensive Pleistocene glacial chronology in North Africa and one of the largest datasets from the Mediterranean region. The timing of these glacier advances has major implications for understanding the influence of Atlantic depressions on moisture supply to North Africa and the Mediterranean basin during the Pleistocene. The oldest and lowest moraines which span elevations from ∼1900 to 2400 m a.s.l. indicate that the maximum glacier advance occurred from MIS 5 to 3 with a combined mean 10Be and 36Cl age of 50.2 ± 19.5 ka (1 SD; n = 12, 7 outliers). The next moraine units up-valley at higher elevations (∼2200-2600 m a.s.l.) yielded exposure ages close to the global Last Glacial Maximum (LGM) with a combined mean 10Be and 36Cl age of 22.0 ± 4.9 ka (1 SD; n = 9, 7 outliers). The youngest exposure ages are from moraines that were emplaced during the Younger Dryas with a combined mean 10Be and 36Cl age of 12.3 ± 0.9 ka (1 SD; n = 7, no outliers) and are found in cirques at the highest elevations ranging from ∼2900 to 3300 m a.s.l. From moraines predating the Younger Dryas, a large number of young outliers are spread evenly between 6 and 13 ka suggesting a continuing process of exhumation or repositioning of boulders during the early to mid-Holocene. This attests to active seismic processes and possibly intense erosion during this period.

  11. Analysis of time series and size of equivalent sample

    International Nuclear Information System (INIS)

    Bernal, Nestor; Molina, Alicia; Pabon, Daniel; Martinez, Jorge

    2004-01-01

    In a meteorological context, a first approach to the modeling of time series is to use models of autoregressive type. This allows one to take into account the meteorological persistence or temporal behavior, thereby identifying the memory of the analyzed process. This article seeks to present the concept of the size of an equivalent sample, which helps to identify sub-periods with a similar structure within a data series. Moreover, in this article we examine the alternative of adjusting the variance of the series, keeping in mind its temporal structure, as well as an adjustment to the covariance of two time series. The article presents two examples, the first corresponding to seven simulated series with a first-order autoregressive structure, and the second corresponding to seven meteorological series of anomalies of the air temperature at the surface in two Colombian regions.
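
    For an AR(1) process, a standard large-sample approximation of the equivalent (effective) sample size is n_eff ≈ n(1 − ρ)/(1 + ρ), where ρ is the lag-1 autocorrelation. The sketch below illustrates this on a simulated first-order autoregressive series; it is the textbook formulation of the concept, not necessarily the exact expression used by the authors.

```python
# Sketch of the "equivalent sample size" idea for an AR(1) series using the
# standard approximation n_eff = n * (1 - rho) / (1 + rho), where rho is the
# lag-1 autocorrelation.
import numpy as np

def lag1_autocorr(x):
    x = np.asarray(x, dtype=float) - np.mean(x)
    return np.dot(x[:-1], x[1:]) / np.dot(x, x)

def equivalent_sample_size(x):
    rho = lag1_autocorr(x)
    return len(x) * (1 - rho) / (1 + rho)

# Simulate an AR(1) temperature-anomaly-like series with phi = 0.6.
rng = np.random.default_rng(42)
phi, n = 0.6, 500
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()
print(f"n = {n}, lag-1 r = {lag1_autocorr(x):.2f}, "
      f"equivalent sample size ~ {equivalent_sample_size(x):.0f}")
```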

  12. Multivariate time series modeling of selected childhood diseases

    African Journals Online (AJOL)

    2016-06-17

    KEYWORDS: Multivariate Approach, Pre-whitening, Vector Time Series, .... Alternatively, the process may be written in mean-adjusted form as .... The AIC criterion asymptotically overestimates the order with positive probability, whereas the BIC and HQC criteria ... has the same asymptotic distribution as Q.

  13. Palaeogeographical time series maps of the Pleistocene and Holocene lowland landscapes in and around the North Sea.

    NARCIS (Netherlands)

    Cohen, K.M.

    2017-01-01

    Palaeogeographical time series maps provide powerful text figures that narrate the chronological story of landscape change in prehistoric and historic times in a quite accessible way. They inspire and communicate the broad idea of developments quickly, no matter how the maps are produced

  14. SensL B-Series and C-Series silicon photomultipliers for time-of-flight positron emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    O'Neill, K., E-mail: koneill@sensl.com; Jackson, C., E-mail: cjackson@sensl.com

    2015-07-01

    Silicon photomultipliers from SensL are designed for high performance, uniformity and low cost. They demonstrate a peak photon detection efficiency of 41% at 420 nm, which is matched to the output spectrum of cerium-doped lutetium orthosilicate. A coincidence resolving time of less than 220 ps is demonstrated. New process improvements have led to the development of the C-Series SiPM, which reduces the dark noise by over an order of magnitude. In this paper we will show characterization test results, including photon detection efficiency, dark count rate, crosstalk probability, afterpulse probability and coincidence resolving time, comparing the B-Series to the newest pre-production C-Series. Additionally, we will discuss the effect of silicon photomultiplier microcell size on coincidence resolving time, allowing the optimal microcell size to be chosen for time-of-flight positron emission tomography systems.

  15. Complete mitochondrial genome and phylogeny of Pleistocene mammoth Mammuthus primigenius.

    Directory of Open Access Journals (Sweden)

    Evgeny I Rogaev

    2006-03-01

    Phylogenetic relationships between the extinct woolly mammoth (Mammuthus primigenius) and the Asian (Elephas maximus) and African savanna (Loxodonta africana) elephants remain unresolved. Here, we report the sequence of the complete mitochondrial genome (16,842 base pairs) of a woolly mammoth extracted from permafrost-preserved remains from the Pleistocene epoch, the oldest mitochondrial genome sequence determined to date. We demonstrate that well-preserved mitochondrial genome fragments, as long as approximately 1,600-1,700 base pairs, can be retrieved from pre-Holocene remains of an extinct species. Phylogenetic reconstruction of the Elephantinae clade suggests that M. primigenius and E. maximus are sister species that diverged soon after their common ancestor split from the L. africana lineage. Low nucleotide diversity found between independently determined mitochondrial genomic sequences of woolly mammoths separated geographically and in time suggests that north-eastern Siberia was occupied by a relatively homogeneous population of M. primigenius throughout the late Pleistocene.

  16. Nitrogen isotopes suggest a change in nitrogen dynamics between the Late Pleistocene and modern time in Yukon, Canada

    Science.gov (United States)

    Longstaffe, Fred J.; Zazula, Grant

    2018-01-01

    A magnificent repository of Late Pleistocene terrestrial megafauna fossils is contained in ice-rich loess deposits of Alaska and Yukon, collectively eastern Beringia. The stable carbon (δ13C) and nitrogen (δ15N) isotope compositions of bone collagen from these fossils are routinely used to determine paleodiet and reconstruct the paleoecosystem. This approach requires consideration of changes in C- and N-isotope dynamics over time and their effects on the terrestrial vegetation isotopic baseline. To test for such changes between the Late Pleistocene and modern time, we compared δ13C and δ15N for vegetation and bone collagen and structural carbonate of some modern Yukon arctic ground squirrels with vegetation and bones from Late Pleistocene fossil arctic ground squirrel nests preserved in Yukon loess deposits. The isotopic discrimination between arctic ground squirrel bone collagen and their diet was measured using modern samples, as were isotopic changes during plant decomposition. Over-wintering decomposition of typical vegetation following senescence resulted in a minor change (~0–1 ‰) in δ13C of modern Yukon grasses. A major change (~2–10 ‰) in δ15N was measured for decomposing Yukon grasses thinly covered by loess. As expected, the collagen-diet C-isotope discrimination measured for modern samples confirms that modern vegetation δ13C is a suitable proxy for the Late Pleistocene vegetation in Yukon Territory, after correction for the Suess effect. The N-isotope composition of vegetation from the fossil arctic ground squirrel nests, however, is determined to be ~2.8 ‰ higher than modern grasslands in the region, after correction for decomposition effects. This result suggests a change in N dynamics in this region between the Late Pleistocene and modern time. PMID:29447202

  17. Pleistocene Palaeoart of the Americas

    Directory of Open Access Journals (Sweden)

    Robert G. Bednarik

    2014-04-01

    In contrast to the great time depth of Pleistocene rock art and mobiliary 'art' in the four other continents, the available evidence from the Americas is very limited, and restricted at best to the last part of the final Pleistocene. A review of what has so far become available is hampered by a considerable burden of literature presenting material contended to be of the Ice Age, even of the Mesozoic in some cases, that needs to be sifted through to find a minute number of credible claims. Even the timing of the first colonization of the Americas remains unresolved, and the lack of clear-cut substantiation of palaeoart finds predating about 12,000 years BP is conspicuous. There are vague hints of earlier human presence, rendering it likely that archaeology has failed to define its manifestations adequately, and Pleistocene palaeoart remains almost unexplored at this stage.

  18. New ESR/U-series data for the early Middle Pleistocene site of Isernia la Pineta, Italy

    International Nuclear Information System (INIS)

    Shao Qingfeng; Bahain, Jean-Jacques; Falgueres, Christophe; Peretto, Carlo; Arzarello, Marta; Minelli, Antonella; Thun Hohenstein, Ursula; Dolo, Jean-Michel; Garcia, Tristan; Frank, Norbert; Douville, Eric

    2011-01-01

    Located in Southern Italy, the Early Palaeolithic site of Isernia la Pineta has provided numerous palaeontological remains and artefacts in a well-defined fluvio-lacustrine sequence. The normal magnetization of the main archaeological layer t3a and the 39Ar/40Ar date of 610 ± 10 (2σ) ka, obtained from the immediately overlying geological level, place the Isernia assemblage in the first part of the Middle Pleistocene. Previous ESR/U-series analyses of Isernia fossil teeth have displayed both recent U-uptake and severe underestimation of the ESR/U-series dates in comparison with the 39Ar/40Ar age. In order to identify the cause of this age underestimation, new analyses were performed in the present study on four bovid teeth directly recovered from the archaeological surface t3a. The ESR/U-series dates obtained were once again strongly underestimated, with an error-weighted mean age of 435 ± 24 (1σ) ka. These overly young dates could be associated with a change in the environmental γ-dose rate during the geological history of the Isernia site, related to the recent U-uptake revealed in the palaeontological remains of the archaeological level. If this dose rate change is assumed to be coeval with a wet interglacial period, and taking the 39Ar/40Ar age as the geochronological reference, simulations with two dose rate steps indicate that this change could be correlated with marine isotopic stage 7 (MIS 7).

  19. New ESR/U-series data for the early Middle Pleistocene site of Isernia la Pineta, Italy

    Energy Technology Data Exchange (ETDEWEB)

    Shao Qingfeng, E-mail: shao@mnhn.fr [Departement de Prehistoire du Museum National d' Histoire Naturelle, UMR 7194 CNRS, 1 rue Rene Panhard, F-75013 Paris (France); Bahain, Jean-Jacques; Falgueres, Christophe [Departement de Prehistoire du Museum National d' Histoire Naturelle, UMR 7194 CNRS, 1 rue Rene Panhard, F-75013 Paris (France); Peretto, Carlo; Arzarello, Marta; Minelli, Antonella; Thun Hohenstein, Ursula [Dipartimento di Biologia ed Evoluzione, Universita di Ferrara, C.so Ercole I d' Este 32, I-44121 Ferrara (Italy); Dolo, Jean-Michel; Garcia, Tristan [CEA, LIST, Laboratoire National Henri Becquerel, F-91191 Gif-sur-Yvette cedex (France); Frank, Norbert; Douville, Eric [Laboratoire des Sciences du Climat et de l' Environnement, LSCE/IPSL, UMR 8212 CNRS-CEA-UVSQ, Domaine du CNRS, F-91198 Gif/Yvette cedex (France)

    2011-09-15

    Located in Southern Italy, the Early Palaeolithic site of Isernia la Pineta has provided numerous palaeontological remains and artefacts in a well-defined fluvio-lacustrine sequence. The normal magnetization of the main archaeological layer t3a and the 39Ar/40Ar date of 610 ± 10 (2σ) ka, obtained from the immediately overlying geological level, place the Isernia assemblage in the first part of the Middle Pleistocene. Previous ESR/U-series analyses of Isernia fossil teeth have displayed both recent U-uptake and severe underestimation of the ESR/U-series dates in comparison with the 39Ar/40Ar age. In order to identify the cause of this age underestimation, new analyses were performed in the present study on four bovid teeth directly recovered from the archaeological surface t3a. The ESR/U-series dates obtained were once again strongly underestimated, with an error-weighted mean age of 435 ± 24 (1σ) ka. These overly young dates could be associated with a change in the environmental γ-dose rate during the geological history of the Isernia site, related to the recent U-uptake revealed in the palaeontological remains of the archaeological level. If this dose rate change is assumed to be coeval with a wet interglacial period, and taking the 39Ar/40Ar age as the geochronological reference, simulations with two dose rate steps indicate that this change could be correlated with marine isotopic stage 7 (MIS 7).

  20. Fabrication and testing of W7-X pre-series target elements

    International Nuclear Information System (INIS)

    Boscary, J; Boeswirth, B; Greuner, H; Grigull, P; Missirlian, M; Plankensteiner, A; Schedler, B; Friedrich, T; Schlosser, J; Streibl, B; Traxler, H

    2007-01-01

    The assembly of the highly loaded target plates of the WENDELSTEIN 7-X (W7-X) divertor requires the fabrication of 890 target elements (TEs). The plasma-facing material is made of CFC NB31 flat tiles bonded to a CuCrZr copper alloy water-cooled heat sink. The elements are designed to remove a stationary heat flux and power of up to 10 MW m⁻² and 100 kW, respectively. Before launching the serial fabrication, pre-series activities aimed at qualifying the design, the manufacturing route and the non-destructive examinations (NDEs). High heat flux (HHF) tests performed on full-scale pre-series TEs resulted in an improvement of the design of the bond between tiles and heat sink to reduce the stresses during operation. The consequence is the fabrication of additional pre-series TEs to be tested in the HHF facility GLADIS. NDEs of this bond, based on thermography methods, are being developed to define acceptance criteria suitable for serial fabrication.

  1. Frozen in Time? Microbial strategies for survival and carbon metabolism over geologic time in a Pleistocene permafrost chronosequence

    Science.gov (United States)

    Mackelprang, R.; Douglas, T. A.; Waldrop, M. P.

    2014-12-01

    Permafrost soils have received tremendous interest due to their importance as a global carbon store with the potential to be thawed over the coming centuries. Instead of being 'frozen in time,' permafrost contains active microbes. Most metagenomic studies have focused on Holocene aged permafrost. Here, we target Pleistocene aged ice and carbon rich permafrost (Yedoma), which can differ in carbon content and stage of decay. Our aim was to understand how microbes in the permafrost transform organic matter over geologic time and to identify physiological and biochemical adaptations that enable long-term survival. We used next-generation sequencing to characterize microbial communities along a permafrost age gradient. Samples were collected from the Cold Regions Research and Engineering Laboratory (CRREL) Permafrost Tunnel near Fox, AK, which penetrates a hillside providing access to permafrost ranging in age from 12 to 40 kyr. DNA was extracted directly from unthawed samples. 16S rRNA amplicon (16S) and shotgun metagenome sequencing revealed significant age-driven differences. First, microbial diversity declines with permafrost age, likely due to long-term exposure to environmental stresses and a reduction in metabolic resources. Second, we observed taxonomic differences among ages, with an increasing abundance of Firmicutes (endospore-formers) in older samples, suggesting that dormancy is a common survival strategy in older permafrost. Ordination of 16S and metagenome data revealed age-based clustering. Genes differing significantly between age categories included those involved in lipopolysaccharide assembly, cold-response, and carbon processing. These data point to the physiological adaptations to long-term frozen conditions and to the metabolic processes utilized in ancient permafrost. In fact, a gene common in older samples is involved in cadaverine production, which could potentially explain the putrefied smell of Pleistocene aged permafrost. Coupled with soil

  2. The Timeseries Toolbox - A Web Application to Enable Accessible, Reproducible Time Series Analysis

    Science.gov (United States)

    Veatch, W.; Friedman, D.; Baker, B.; Mueller, C.

    2017-12-01

    The vast majority of data analyzed by climate researchers are repeated observations of a physical process, i.e. time series data. These data lend themselves to a common set of statistical techniques and models designed to determine trends and variability (e.g., seasonality) of these repeated observations. Often, these same techniques and models can be applied to a wide variety of different time series data. The Timeseries Toolbox is a web application designed to standardize and streamline these common approaches to time series analysis and modeling, with particular attention to hydrologic time series used in climate preparedness and resilience planning and design by the U.S. Army Corps of Engineers. The application performs much of the pre-processing of time series data necessary for more complex techniques (e.g. interpolation, aggregation). With this tool, users can upload any dataset that conforms to a standard template and immediately begin applying these techniques to analyze their time series data.
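
    The pre-processing steps named above (interpolation, aggregation) can be illustrated in a few lines of pandas; this is an independent sketch of that kind of workflow, not the Timeseries Toolbox itself, and the daily stage series shown is a hypothetical example.

```python
# Illustration of the kind of pre-processing the abstract mentions (interpolation,
# aggregation) using pandas; this is not the Timeseries Toolbox itself.
import numpy as np
import pandas as pd

# A hypothetical daily stage series with gaps, following a simple date/value template.
idx = pd.date_range("2015-01-01", periods=10, freq="D")
values = [1.2, np.nan, 1.5, 1.7, np.nan, np.nan, 2.0, 2.1, 2.3, 2.2]
ts = pd.Series(values, index=idx, name="stage_m")

filled = ts.interpolate(method="time")      # fill gaps by time-weighted interpolation
monthly = filled.resample("MS").mean()      # aggregate to monthly means
print(filled.round(2))
print(monthly.round(2))
```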

  3. OceanXtremes: Scalable Anomaly Detection in Oceanographic Time-Series

    Science.gov (United States)

    Wilson, B. D.; Armstrong, E. M.; Chin, T. M.; Gill, K. M.; Greguska, F. R., III; Huang, T.; Jacob, J. C.; Quach, N.

    2016-12-01

    The oceanographic community must meet the challenge to rapidly identify features and anomalies in complex and voluminous observations to further science and improve decision support. Given this data-intensive reality, we are developing an anomaly detection system, called OceanXtremes, powered by an intelligent, elastic Cloud-based analytic service backend that enables execution of domain-specific, multi-scale anomaly and feature detection algorithms across the entire archive of 15 to 30-year ocean science datasets. Our parallel analytics engine is extending the NEXUS system and exploits multiple open-source technologies: Apache Cassandra as a distributed spatial "tile" cache, Apache Spark for in-memory parallel computation, and Apache Solr for spatial search and storing pre-computed tile statistics and other metadata. OceanXtremes provides these key capabilities: Parallel generation (Spark on a compute cluster) of 15 to 30-year Ocean Climatologies (e.g. sea surface temperature or SST) in hours or overnight, using simple pixel averages or customizable Gaussian-weighted "smoothing" over latitude, longitude, and time; Parallel pre-computation, tiling, and caching of anomaly fields (daily variables minus a chosen climatology) with pre-computed tile statistics; Parallel detection (over the time-series of tiles) of anomalies or phenomena by regional area-averages exceeding a specified threshold (e.g. high SST in El Niño or SST "blob" regions), or more complex, custom data mining algorithms; Shared discovery and exploration of ocean phenomena and anomalies (facet search using Solr), along with unexpected correlations between key measured variables; Scalable execution for all capabilities on a hybrid Cloud, using our on-premise OpenStack Cloud cluster or at Amazon. The key idea is that the parallel data-mining operations will be run "near" the ocean data archives (a local "network" hop) so that we can efficiently access the thousands of files making up a three decade time-series
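
    Stripped of the Spark/Cassandra/Solr machinery, the core anomaly computation described above is "daily value minus climatology, then threshold the regional average". The toy, serial sketch below illustrates that step on a synthetic SST-like series; the climatology method, threshold and injected event are illustrative assumptions, not the OceanXtremes backend.

```python
# Hedged sketch of the anomaly idea above: subtract a day-of-year climatology
# from an SST-like series and flag exceedances above a threshold.
import numpy as np

rng = np.random.default_rng(7)
years, days = 20, 365
# Synthetic daily SST for one region: seasonal cycle + noise + a warm "blob" event.
doy = np.tile(np.arange(days), years)
sst = 15 + 5 * np.sin(2 * np.pi * doy / 365.0) + rng.normal(0, 0.5, years * days)
sst[-200:-150] += 3.0                          # injected anomaly in the last year

climatology = np.array([sst[doy == d].mean() for d in range(days)])
anomaly = sst - climatology[doy]               # daily value minus climatology

threshold = 2.0                                # degrees above climatology
flagged = np.flatnonzero(anomaly > threshold)
print(f"{flagged.size} anomalous days; first flagged index: {flagged[0]}")
```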

  4. GPS Position Time Series @ JPL

    Science.gov (United States)

    Owen, Susan; Moore, Angelyn; Kedar, Sharon; Liu, Zhen; Webb, Frank; Heflin, Mike; Desai, Shailen

    2013-01-01

    Different flavors of GPS time series analysis at JPL: all use the same GPS Precise Point Positioning Analysis raw time series, with variations in time series analysis/post-processing driven by different users. • JPL Global Time Series/Velocities - researchers studying the reference frame, combining with VLBI/SLR/DORIS. • JPL/SOPAC Combined Time Series/Velocities - crustal deformation for tectonic, volcanic, and ground water studies. • ARIA Time Series/Coseismic Data Products - hazard monitoring and response focused. • ARIA data system designed to integrate GPS and InSAR - GPS tropospheric delay used for correcting InSAR; Caltech's GIANT time series analysis uses GPS to correct orbital errors in InSAR. Zhen Liu is talking tomorrow on InSAR time series analysis.

  5. Attempt at ESR dating of tooth enamel of French Middle Pleistocene sites

    International Nuclear Information System (INIS)

    Bahain, J.J.; Sarcia, M.N.; Falgueres, C.; Yokoyama, Y.

    1993-01-01

    Tooth enamel samples from four important French middle Pleistocene sites are analyzed by the ESR method. ESR ages were calculated using uranium uptake mathematical models and compared with U-series results. (author)

  6. Late Pleistocene dune activity in the central Great Plains, USA

    Science.gov (United States)

    Mason, J.A.; Swinehart, J.B.; Hanson, P.R.; Loope, D.B.; Goble, R.J.; Miao, X.; Schmeisser, R.L.

    2011-01-01

    Stabilized dunes of the central Great Plains, especially the megabarchans and large barchanoid ridges of the Nebraska Sand Hills, provide dramatic evidence of late Quaternary environmental change. Episodic Holocene dune activity in this region is now well-documented, but Late Pleistocene dune mobility has remained poorly documented, despite early interpretations of the Sand Hills dunes as Pleistocene relicts. New optically stimulated luminescence (OSL) ages from drill cores and outcrops provide evidence of Late Pleistocene dune activity at sites distributed across the central Great Plains. In addition, Late Pleistocene eolian sands deposited at 20-25 ka are interbedded with loess south of the Sand Hills. Several of the large dunes sampled in the Sand Hills clearly contain a substantial core of Late Pleistocene sand; thus, they had developed by the Late Pleistocene and were fully mobile at that time, although substantial sand deposition and extensive longitudinal dune construction occurred during the Holocene. Many of the Late Pleistocene OSL ages fall between 17 and 14 ka, but it is likely that these ages represent only the later part of a longer period of dune construction and migration. At several sites, significant Late Pleistocene or Holocene large-dune migration also probably occurred after the time represented by the Pleistocene OSL ages. Sedimentary structures in Late Pleistocene eolian sand and the forms of large dunes potentially constructed in the Late Pleistocene both indicate sand transport dominated by northerly to westerly winds, consistent with Late Pleistocene loess transport directions. Numerical modeling of the climate of the Last Glacial Maximum has often yielded mean monthly surface winds southwest of the Laurentide Ice Sheet that are consistent with this geologic evidence, despite strengthened anticyclonic circulation over the ice sheet. Mobility of large dunes during the Late Pleistocene on the central Great Plains may have been the result of

  7. Time series analysis: methods and applications

    CERN Document Server

    Rao, Tata Subba; Rao, C R

    2012-01-01

    The field of statistics not only affects all areas of scientific activity, but also many other matters such as public policy. It is branching rapidly into so many different subjects that a series of handbooks is the only way of comprehensively presenting the various aspects of statistical methodology, applications, and recent developments. The Handbook of Statistics is a series of self-contained reference books. Each volume is devoted to a particular topic in statistics, with Volume 30 dealing with time series. The series is addressed to the entire community of statisticians and scientists in various disciplines who use statistical methodology in their work. At the same time, special emphasis is placed on applications-oriented techniques, with the applied statistician in mind as the primary audience. Comprehensively presents the various aspects of statistical methodology Discusses a wide variety of diverse applications and recent developments Contributors are internationally renowened experts in their respect...

  8. Highly comparative time-series analysis: the empirical structure of time series and their methods.

    Science.gov (United States)

    Fulcher, Ben D; Little, Max A; Jones, Nick S

    2013-06-06

    The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
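
    A toy version of the feature-based representation described above: each time series is reduced to a short vector of summary statistics, and series are then compared in that feature space. The published work uses thousands of features and operations; the four features here are stand-ins chosen for brevity.

```python
# Reduce each series to a small feature vector and compare series in feature space.
# A miniature stand-in for the highly comparative approach, not its actual library.
import numpy as np

def features(x):
    x = np.asarray(x, dtype=float)
    dx = np.diff(x)
    lag1 = np.corrcoef(x[:-1], x[1:])[0, 1]        # lag-1 autocorrelation
    return np.array([x.mean(), x.std(), lag1, np.abs(dx).mean()])

rng = np.random.default_rng(3)
t = np.arange(500)
periodic = np.sin(2 * np.pi * t / 50) + 0.1 * rng.normal(size=t.size)
noise = rng.normal(size=t.size)
walk = np.cumsum(rng.normal(size=t.size))

F = np.vstack([features(s) for s in (periodic, noise, walk)])
# Pairwise distances in feature space separate the three dynamical regimes.
D = np.linalg.norm(F[:, None, :] - F[None, :, :], axis=-1)
print(np.round(D, 2))
```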

  9. Introduction to Time Series Modeling

    CERN Document Server

    Kitagawa, Genshiro

    2010-01-01

    In time series modeling, the behavior of a certain phenomenon is expressed in relation to the past values of itself and other covariates. Since many important phenomena in statistical analysis are actually time series and the identification of conditional distribution of the phenomenon is an essential part of the statistical modeling, it is very important and useful to learn fundamental methods of time series modeling. Illustrating how to build models for time series using basic methods, "Introduction to Time Series Modeling" covers numerous time series models and the various tools f

  10. Pliocene warmth, polar amplification, and stepped Pleistocene cooling recorded in NE Arctic Russia.

    Science.gov (United States)

    Brigham-Grette, Julie; Melles, Martin; Minyuk, Pavel; Andreev, Andrei; Tarasov, Pavel; DeConto, Robert; Koenig, Sebastian; Nowaczyk, Norbert; Wennrich, Volker; Rosén, Peter; Haltia, Eeva; Cook, Tim; Gebhardt, Catalina; Meyer-Jacob, Carsten; Snyder, Jeff; Herzschuh, Ulrike

    2013-06-21

    Understanding the evolution of Arctic polar climate from the protracted warmth of the middle Pliocene into the earliest glacial cycles in the Northern Hemisphere has been hindered by the lack of continuous, highly resolved Arctic time series. Evidence from Lake El'gygytgyn, in northeast (NE) Arctic Russia, shows that 3.6 to 3.4 million years ago, summer temperatures were ~8°C warmer than today, when the partial pressure of CO2 was ~400 parts per million. Multiproxy evidence suggests extreme warmth and polar amplification during the middle Pliocene, sudden stepped cooling events during the Pliocene-Pleistocene transition, and warmer than present Arctic summers until ~2.2 million years ago, after the onset of Northern Hemispheric glaciation. Our data are consistent with sea-level records and other proxies indicating that Arctic cooling was insufficient to support large-scale ice sheets until the early Pleistocene.

  11. Quantifying the astronomical contribution to Pleistocene climate change: A non-linear, statistical approach

    Science.gov (United States)

    Crucifix, Michel; Wilkinson, Richard; Carson, Jake; Preston, Simon; Alemeida, Carlos; Rougier, Jonathan

    2013-04-01

    The existence of an action of astronomical forcing on the Pleistocene climate is almost undisputed. However, quantifying this action is not straightforward. In particular, the phenomenon of deglaciation is generally interpreted as a manifestation of instability, which is typical of non-linear systems. As a consequence, explaining the Pleistocene climate record as the sum of an astronomical contribution and noise, as is often done using harmonic analysis tools, is potentially deceptive. Rather, we advocate a methodology in which non-linear stochastic dynamical systems are calibrated on the Pleistocene climate record. The exercise, though, requires careful statistical reasoning and state-of-the-art techniques. In fact, the problem has been judged to be mathematically 'intractable and unsolved' and some pragmatism is justified. In order to illustrate the methodology, we consider one dynamical system that potentially captures four dynamical features of the Pleistocene climate: the existence of a saddle-node bifurcation in at least one of its slow components, a time-scale separation between a slow and a fast component, the action of astronomical forcing, and the existence of a stochastic contribution to the system dynamics. This model is obviously not the only possible representation of Pleistocene dynamics, but it encapsulates well enough both our theoretical and empirical knowledge into a very simple form to constitute a valid starting point. The purpose of this poster is to outline the practical challenges in calibrating such a model on paleoclimate observations. Just as in time series analysis, there is no single, universal test or criterion that would demonstrate the validity of an approach. Several methods exist to calibrate the model, and judgement develops by confronting the results of the different methods. In particular, we consider here the Kalman filter variants, the Particle Monte-Carlo Markov Chain, and two other variants of Sequential Monte
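
    One concrete piece of the calibration machinery mentioned above (the particle methods) is the bootstrap particle filter, which returns an approximate likelihood of a parameter value that a particle MCMC sampler would then embed in a parameter-space random walk. The sketch below runs such a filter on a toy forced double-well model; the model, forcing, noise levels and parameter grid are illustrative assumptions, not the authors' system.

```python
# Minimal bootstrap particle filter sketch: approximate the likelihood of one
# parameter of a toy forced, noisy double-well model given noisy observations.
import numpy as np

rng = np.random.default_rng(0)

def forcing(t):                        # crude stand-in for astronomical forcing
    return np.sin(2 * np.pi * t / 41.0)

def step(x, t, beta, sigma):           # slow double-well dynamics + forcing + noise
    drift = x - x**3 + beta * forcing(t)
    return x + 0.1 * drift + sigma * np.sqrt(0.1) * rng.normal(size=np.shape(x))

# Simulate "observations" with beta_true = 0.3 and observation noise 0.2.
T, beta_true, sigma, obs_sd = 400, 0.3, 0.3, 0.2
x, obs = np.array([0.0]), []
for t in range(T):
    x = step(x, t, beta_true, sigma)
    obs.append(x[0] + obs_sd * rng.normal())

def log_likelihood(beta, n_particles=500):
    particles = rng.normal(0.0, 1.0, n_particles)
    ll = 0.0
    for t, y in enumerate(obs):
        particles = step(particles, t, beta, sigma)          # propagate
        w = np.exp(-0.5 * ((y - particles) / obs_sd) ** 2) + 1e-300
        ll += np.log(w.mean())                               # likelihood increment
        particles = rng.choice(particles, n_particles, p=w / w.sum())  # resample
    return ll

for beta in (0.0, 0.3, 0.6):
    print(f"beta = {beta:.1f}: approximate log-likelihood {log_likelihood(beta):.1f}")
```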

  12. International Work-Conference on Time Series

    CERN Document Server

    Pomares, Héctor; Valenzuela, Olga

    2017-01-01

    This volume of selected and peer-reviewed contributions on the latest developments in time series analysis and forecasting updates the reader on topics such as analysis of irregularly sampled time series, multi-scale analysis of univariate and multivariate time series, linear and non-linear time series models, advanced time series forecasting methods, applications in time series analysis and forecasting, advanced methods and online learning in time series, and high-dimensional and complex/big data time series. The contributions were originally presented at the International Work-Conference on Time Series, ITISE 2016, held in Granada, Spain, June 27-29, 2016. The series of ITISE conferences provides a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of comput...

  13. Shaped by uneven Pleistocene climate

    DEFF Research Database (Denmark)

    Li, Xinlei; Dong, Feng; Lei, Fumin

    2016-01-01

    had different impacts on different populations: clade N expanded after the last glacial maximum (LGM), whereas milder Pleistocene climate of east Asia allowed clade SE a longer expansion time (since MIS 5); clade SW expanded over a similarly long time as clade SE, which is untypical for European...

  14. Pliocene–Pleistocene geomorphological evolution of the Adriatic side of Central Italy

    Directory of Open Access Journals (Sweden)

    Gentili Bernardino

    2017-02-01

    This work is a significant contribution to knowledge of the Quaternary and pre-Quaternary morphogenesis of a wide sector of central Italy, from the Apennine chain to the Adriatic Sea. The goal is achieved through a careful analysis and interpretation of stratigraphic and tectonic data relating to marine and continental sediments and, mostly, through the study of relict limbs of ancient landscapes (erosional surfaces shaped by prevailing planation processes). The most important scientific datum is the definition of the time span in which the modelling of the oldest morphological element (the "summit relict surface") occurred: it started during the Messinian in the westernmost portion and, after a significant phase during the middle-late Pliocene, ended in the early Pleistocene. During the middle and late Pleistocene, the rapid tectonic uplift of the area and the climate fluctuations favoured the deepening of the hydrographic network and the genesis of three orders of fluvial terraces, thus completing the fundamental features of the landscape. The subsequent Holocene evolution reshaped the minor elements, but not the basic ones.

  15. From Networks to Time Series

    Science.gov (United States)

    Shimada, Yutaka; Ikeguchi, Tohru; Shigehara, Takaomi

    2012-10-01

    In this Letter, we propose a framework to transform a complex network to a time series. The transformation from complex networks to time series is realized by the classical multidimensional scaling. Applying the transformation method to a model proposed by Watts and Strogatz [Nature (London) 393, 440 (1998)], we show that ring lattices are transformed to periodic time series, small-world networks to noisy periodic time series, and random networks to random time series. We also show that these relationships are analytically held by using the circulant-matrix theory and the perturbation theory of linear operators. The results are generalized to several high-dimensional lattices.
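
    A simplified reading of the construction described above: compute shortest-path distances on the graph, embed the nodes with classical multidimensional scaling (MDS), and read the leading coordinate in node order as the resulting time series. The sketch below applies this to Watts-Strogatz graphs; the graph sizes, parameters and one-dimensional embedding are assumptions made for brevity, and the code is not the authors' implementation.

```python
# Network-to-time-series sketch: shortest-path distances, classical MDS, then the
# leading coordinate read in node order as a "time series".
import numpy as np
import networkx as nx

def classical_mds(D, dim=1):
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                  # double-centred squared distances
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:dim]         # largest eigenvalues first
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

for p, label in [(0.0, "ring lattice"), (0.1, "small world"), (1.0, "random")]:
    G = nx.connected_watts_strogatz_graph(n=200, k=4, p=p, seed=1)
    D = np.asarray(nx.floyd_warshall_numpy(G), dtype=float)
    series = classical_mds(D, dim=1)[:, 0]       # one value per node, in node order
    print(f"{label:12s} -> series std {series.std():.2f}")
```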

  16. Duality between Time Series and Networks

    Science.gov (United States)

    Campanharo, Andriana S. L. O.; Sirer, M. Irmak; Malmgren, R. Dean; Ramos, Fernando M.; Amaral, Luís A. Nunes.

    2011-01-01

    Studying the interaction between a system's components and the temporal evolution of the system are two common ways to uncover and characterize its internal workings. Recently, several maps from a time series to a network have been proposed with the intent of using network metrics to characterize time series. Although these maps demonstrate that different time series result in networks with distinct topological properties, it remains unclear how these topological properties relate to the original time series. Here, we propose a map from a time series to a network with an approximate inverse operation, making it possible to use network statistics to characterize time series and time series statistics to characterize networks. As a proof of concept, we generate an ensemble of time series ranging from periodic to random and confirm that application of the proposed map retains much of the information encoded in the original time series (or networks) after application of the map (or its inverse). Our results suggest that network analysis can be used to distinguish different dynamic regimes in time series and, perhaps more importantly, time series analysis can provide a powerful set of tools that augment the traditional network analysis toolkit to quantify networks in new and useful ways. PMID:21858093

  17. On the limits of using combined U-series/ESR method to date fossil teeth from two Early Pleistocene archaeological sites of the Orce area (Guadix-Baza basin, Spain)

    International Nuclear Information System (INIS)

    Duval, Mathieu; Falgueres, Christophe; Bahain, Jean-Jacques; Shao, Qingfeng; Grun, Rainer; Aubert, Maxime; Dolo, Jean-Michel; Agusti, Jordi; Martinez-Navarro, Bienvenido; Palmqvist, Paul; Toro-Moyano, Isidro

    2012-01-01

    The combined U-series/electron spin resonance (ESR) dating method was applied to nine teeth from two Early Pleistocene archaeological sites located in the Orce area (Guadix-Baza Basin, Southern Spain): Fuente Nueva-3 (FN-3) and Barranco Leon (BL). The combination of bio-stratigraphy and magneto-stratigraphy places both sites between the Olduvai and Jaramillo sub-chrons (1.78-1.07 Ma). Our results highlight the difficulty of dating such old sites and point out the limits of the combined U-series/ESR dating method based on the US model. We identified several sources of uncertainty that may lead to inaccurate age estimates. Seven samples could not be dated because the dental tissues had (230Th/234U) activity ratios higher than equilibrium, indicating that uranium had probably leached from these tissues. It was, however, possible to calculate numerical estimates for two of the teeth, both from FN-3. One yielded a Middle Pleistocene age that seems to be strongly underestimated; the other provided an age of 1.19 ± 0.21 Ma, in agreement with data obtained from independent methods. The latter result gives encouragement that there are samples that can be used for routine dating of old sites. (authors)

  18. Long time series

    DEFF Research Database (Denmark)

    Hisdal, H.; Holmqvist, E.; Hyvärinen, V.

    Awareness that emission of greenhouse gases will raise the global temperature and change the climate has led to studies trying to identify such changes in long-term climate and hydrologic time series. This report, written by the...

  19. A Course in Time Series Analysis

    CERN Document Server

    Peña, Daniel; Tsay, Ruey S

    2011-01-01

    New statistical methods and future directions of research in time series A Course in Time Series Analysis demonstrates how to build time series models for univariate and multivariate time series data. It brings together material previously available only in the professional literature and presents a unified view of the most advanced procedures available for time series model building. The authors begin with basic concepts in univariate time series, providing an up-to-date presentation of ARIMA models, including the Kalman filter, outlier analysis, automatic methods for building ARIMA models, a

  20. Effects of Pleistocene sea-level fluctuations on mangrove population dynamics: a lesson from Sonneratia alba.

    Science.gov (United States)

    Yang, Yuchen; Li, Jianfang; Yang, Shuhuan; Li, Xinnian; Fang, Lu; Zhong, Cairong; Duke, Norman C; Zhou, Renchao; Shi, Suhua

    2017-01-18

    A large-scale, systematic investigation of the influence of Pleistocene climate oscillation on mangrove population dynamics could enrich our knowledge of the evolutionary history during times of historical climate change, which in turn may provide important information for their conservation. In this study, the phylogeography of the mangrove tree Sonneratia alba was studied by sequencing three chloroplast fragments and seven nuclear genes. A low level of genetic diversity at the population level was detected across its range, especially at the range margins, which was mainly attributed to the steep sea-level drop and associated climate fluctuations during the Pleistocene glacial periods. Extremely small effective population sizes (Ne) were inferred in populations from both the eastern and western Malay Peninsula (44 and 396, respectively), mirroring the fragility of mangrove plants and their lack of robustness to future climate perturbations and human activity. Two major genetic lineages of high divergence were identified in the two mangrove biodiversity centres: the Indo-Malesia and Australasia regions. The estimated splitting time between these two lineages was 3.153 million years ago (MYA), suggesting a role for pre-Pleistocene events in shaping the major diversity patterns of mangrove species. Within the Indo-Malesia region, a subdivision was implicated between the South China Sea (SCS) and the remaining area, with a divergence time of 1.874 MYA, corresponding to glacial vicariance when the emerged Sunda Shelf halted genetic exchange between the western and eastern coasts of the Malay Peninsula during Pleistocene sea-level drops. Notably, genetic admixture was observed in populations at the boundary regions, especially in the two populations near the Malacca Strait, indicating secondary contact between divergent lineages during interglacial periods. These interregional genetic exchanges provided ample opportunity for the re-use of standing genetic variation

  1. Kolmogorov Space in Time Series Data

    OpenAIRE

    Kanjamapornkul, K.; Pinčák, R.

    2016-01-01

    We provide the proof that the space of time series data is a Kolmogorov space with $T_{0}$-separation axiom using the loop space of time series data. In our approach we define a cyclic coordinate of intrinsic time scale of time series data after empirical mode decomposition. A spinor field of time series data comes from the rotation of data around price and time axis by defining a new extradimension to time series data. We show that there exist hidden eight dimensions in Kolmogorov space for ...

  2. On the Use of Running Trends as Summary Statistics for Univariate Time Series and Time Series Association

    OpenAIRE

    Trottini, Mario; Vigo, Isabel; Belda, Santiago

    2015-01-01

    Given a time series, running trends analysis (RTA) involves evaluating least squares trends over overlapping time windows of L consecutive time points, with overlap by all but one observation. This produces a new series called the “running trends series,” which is used as summary statistics of the original series for further analysis. In recent years, RTA has been widely used in climate applied research as summary statistics for time series and time series association. There is no doubt that ...
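
    As a concrete illustration of the running-trends idea summarized above, the short Python sketch below computes least-squares slopes over overlapping windows that share all but one observation; the window length and the synthetic ramp-plus-noise series are arbitrary choices, not data from the paper.

```python
import numpy as np

def running_trends(y, window):
    """Least-squares slope over every block of `window` consecutive points;
    successive windows overlap by all but one observation."""
    y = np.asarray(y, dtype=float)
    t = np.arange(window)
    slopes = []
    for start in range(len(y) - window + 1):
        slope, _intercept = np.polyfit(t, y[start:start + window], 1)
        slopes.append(slope)
    return np.array(slopes)

# Illustrative input: a weak linear trend buried in noise
rng = np.random.default_rng(0)
series = 0.05 * np.arange(200) + rng.normal(0.0, 1.0, 200)
print(running_trends(series, window=10)[:5])
```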

  3. Multiple Indicator Stationary Time Series Models.

    Science.gov (United States)

    Sivo, Stephen A.

    2001-01-01

    Discusses the propriety and practical advantages of specifying multivariate time series models in the context of structural equation modeling for time series and longitudinal panel data. For time series data, the multiple indicator model specification improves on classical time series analysis. For panel data, the multiple indicator model…

  4. Detection of Outliers and Imputing of Missing Values for Water Quality UV-VIS Absorbance Time Series

    Directory of Open Access Journals (Sweden)

    Leonardo Plazas-Nossa

    2017-01-01

    Full Text Available Context: The UV-Vis absorbance collection using online optical captors for water quality detection may yield outliers and/or missing values. Therefore, data pre-processing is a necessary pre-requisite to monitoring data processing. Thus, the aim of this study is to propose a method that detects and removes outliers as well as fills gaps in time series. Method: Outliers are detected using a Winsorising procedure, and the Discrete Fourier Transform (DFT) and the Inverse Fast Fourier Transform (IFFT) are applied to complete the time series. Together, these tools were used to analyse a case study comprising three sites in Colombia: (i) Bogotá D.C. Salitre-WWTP (Waste Water Treatment Plant), influent; (ii) Bogotá D.C. Gibraltar Pumping Station (GPS); and (iii) Itagüí, San Fernando-WWTP, influent (Medellín metropolitan area), analysed via UV-Vis (Ultraviolet and Visible) spectra. Results: Outlier detection with the proposed method obtained promising results when window parameter values are small and self-similar, even though the three time series exhibited different sizes and behaviours. The DFT allowed gaps of different lengths with missing values to be processed. To assess the validity of the proposed method, continuous subsets (sections of the absorbance time series without outliers or missing values) were removed from the original time series, obtaining an average 12% error rate in the three testing time series. Conclusions: The application of the DFT and the IFFT, using the 10% most important harmonics of useful values, can be useful for later use in different applications, specifically for time series of water quality and quantity in urban sewer systems. One potential application would be the analysis of dry weather in contrast to rain events, achieved by detecting values that correspond to unusual behaviour in a time series. Additionally, the result hints at the potential of the method for correcting other hydrologic time series.
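
    A rough sketch of the two ingredients described in this record, Winsorising for outliers and reconstruction from the strongest DFT harmonics for gap filling, is given below. The quantile limits, the 10% harmonic fraction and the synthetic signal are illustrative assumptions rather than the paper's exact settings.

```python
import numpy as np

def winsorize(x, lower=0.05, upper=0.95):
    """Clip values outside the given quantiles (a simple Winsorising step)."""
    lo, hi = np.nanquantile(x, [lower, upper])
    return np.clip(x, lo, hi)

def fill_gaps_dft(x, frac_harmonics=0.10):
    """Impute NaN gaps from a reconstruction built with the largest DFT harmonics."""
    x = np.asarray(x, dtype=float)
    gaps = np.isnan(x)
    provisional = np.where(gaps, np.nanmean(x), x)       # temporary fill for the FFT
    spectrum = np.fft.fft(provisional)
    k = max(1, int(frac_harmonics * len(spectrum)))
    keep = np.argsort(np.abs(spectrum))[::-1][:k]        # strongest harmonics
    reduced = np.zeros_like(spectrum)
    reduced[keep] = spectrum[keep]
    reconstruction = np.fft.ifft(reduced).real
    filled = x.copy()
    filled[gaps] = reconstruction[gaps]                  # only the gaps are replaced
    return filled

signal = np.sin(np.linspace(0, 20, 500)) + 0.1 * np.random.default_rng(1).normal(size=500)
signal[100:120] = np.nan                                 # artificial gap
print(fill_gaps_dft(winsorize(signal))[98:104])
```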

  5. Forecasting long memory time series under a break in persistence

    DEFF Research Database (Denmark)

    Heinen, Florian; Sibbertsen, Philipp; Kruse, Robinson

    We consider the problem of forecasting time series with long memory when the memory parameter is subject to a structural break. By means of a large-scale Monte Carlo study we show that ignoring such a change in persistence leads to substantially reduced forecasting precision. The strength...... of this effect depends on whether the memory parameter is increasing or decreasing over time. A comparison of six forecasting strategies allows us to conclude that pre-testing for a change in persistence is highly recommendable in our setting. In addition we provide an empirical example which underlines...

  6. Time Series Momentum

    DEFF Research Database (Denmark)

    Moskowitz, Tobias J.; Ooi, Yao Hua; Heje Pedersen, Lasse

    2012-01-01

    We document significant “time series momentum” in equity index, currency, commodity, and bond futures for each of the 58 liquid instruments we consider. We find persistence in returns for one to 12 months that partially reverses over longer horizons, consistent with sentiment theories of initial...... under-reaction and delayed over-reaction. A diversified portfolio of time series momentum strategies across all asset classes delivers substantial abnormal returns with little exposure to standard asset pricing factors and performs best during extreme markets. Examining the trading activities...

  7. International Work-Conference on Time Series

    CERN Document Server

    Pomares, Héctor

    2016-01-01

    This volume presents selected peer-reviewed contributions from The International Work-Conference on Time Series, ITISE 2015, held in Granada, Spain, July 1-3, 2015. It discusses topics in time series analysis and forecasting, advanced methods and online learning in time series, high-dimensional and complex/big data time series as well as forecasting in real problems. The International Work-Conferences on Time Series (ITISE) provide a forum for scientists, engineers, educators and students to discuss the latest ideas and implementations in the foundations, theory, models and applications in the field of time series analysis and forecasting. It focuses on interdisciplinary and multidisciplinary research encompassing the disciplines of computer science, mathematics, statistics and econometrics.

  8. Spectral Estimation of UV-Vis Absorbance Time Series for Water Quality Monitoring

    Directory of Open Access Journals (Sweden)

    Leonardo Plazas-Nossa

    2017-05-01

    Full Text Available Context: Signals recorded as multivariate time series by UV-Vis absorbance captors installed in urban sewer systems can be non-stationary, yielding complications in the analysis of water quality monitoring. This work proposes to perform spectral estimation using the Box-Cox transformation and differentiation in order to obtain multivariate time series that are stationary in the wide sense. Additionally, Principal Component Analysis (PCA) is applied to reduce their dimensionality. Method: Three different UV-Vis absorbance time series for different Colombian locations were studied: (i) El-Salitre Wastewater Treatment Plant (WWTP) in Bogotá; (ii) Gibraltar Pumping Station (GPS) in Bogotá; and (iii) San-Fernando WWTP in Itagüí. Each UV-Vis absorbance time series had an equal number of samples (5705). The estimation of the spectral power density is obtained using the average of modified periodograms with a rectangular window and an overlap of 50%, with the 20 most important harmonics from the Discrete Fourier Transform (DFT) and Inverse Fast Fourier Transform (IFFT). Results: Absorbance time series dimensionality reduction using PCA resulted in 6, 8 and 7 principal components for each study site respectively, altogether explaining more than 97% of their variability. Differences below 30% for the UV range were obtained for the three study sites, while for the visible range the maximum differences obtained were: (i) 35% for El-Salitre WWTP; (ii) 61% for GPS; and (iii) 75% for San-Fernando WWTP. Conclusions: The Box-Cox transformation and the differentiation process applied to the UV-Vis absorbance time series for the study sites (El-Salitre, GPS and San-Fernando) allowed the variance to be reduced and the trend of the time series to be eliminated. Pre-processing of UV-Vis absorbance time series is recommended to detect and remove outliers before applying the proposed process for spectral estimation. Language: Spanish.
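
    The pre-processing chain described above (Box-Cox transformation, differencing, PCA, and averaged modified periodograms with a rectangular window and 50% overlap) can be approximated as in the following sketch. The stand-in absorbance data, the number of wavelengths and the use of scipy/sklearn routines are assumptions for illustration, not the authors' implementation.

```python
import numpy as np
from scipy import stats, signal
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
# Stand-in for a multivariate UV-Vis absorbance record: 5705 samples x 15 wavelengths
absorbance = np.cumsum(rng.normal(0.0, 0.01, size=(5705, 15)), axis=0) + 2.0

# 1) Box-Cox (needs positive data) and first differencing, per wavelength
stationary = np.column_stack([
    np.diff(stats.boxcox(absorbance[:, j] - absorbance[:, j].min() + 1e-3)[0])
    for j in range(absorbance.shape[1])
])

# 2) PCA keeping enough components to explain 97% of the variance
pca = PCA(n_components=0.97)
scores = pca.fit_transform(stationary)
print("principal components kept:", pca.n_components_)

# 3) Averaged modified periodograms (Welch), rectangular window, 50% overlap
freqs, psd = signal.welch(scores[:, 0], window="boxcar", nperseg=256, noverlap=128)
print(freqs[:3], psd[:3])
```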

  9. Stochastic models for time series

    CERN Document Server

    Doukhan, Paul

    2018-01-01

    This book presents essential tools for modelling non-linear time series. The first part of the book describes the main standard tools of probability and statistics that directly apply to the time series context to obtain a wide range of modelling possibilities. Functional estimation and bootstrap are discussed, and stationarity is reviewed. The second part describes a number of tools from Gaussian chaos and proposes a tour of linear time series models. It goes on to address nonlinearity from polynomial or chaotic models for which explicit expansions are available, then turns to Markov and non-Markov linear models and discusses Bernoulli shifts time series models. Finally, the volume focuses on the limit theory, starting with the ergodic theorem, which is seen as the first step for statistics of time series. It defines the distributional range to obtain generic tools for limit theory under long or short-range dependences (LRD/SRD) and explains examples of LRD behaviours. More general techniques (central limit ...

  10. On the Paleobiogeography of Pleistocenic Italian Mammals / Osservazioni sulla paleobiogeografia dei mammiferi del Pleistocene italiano

    Directory of Open Access Journals (Sweden)

    Lucia Caloi

    1986-06-01

    Full Text Available Abstract In this paper the main Italian Pleistocene mammalofaunas are examined and a chronological sequence of the main deposits is proposed. Centers of spreading, times of first appearance in Italy and ranges through the peninsula of the more representative species are indicated, as far as possible. The insular faunas and the different degrees of endemism they show are particularly discussed. Riassunto: The main Pleistocene mammal faunas of Italy are briefly reviewed and a chronological succession is proposed for the main deposits. For the most representative species, the areas of origin, the time of first appearance and their spread through the peninsula are indicated, as far as possible. Particular attention is given to the insular forms and to their endemic character.

  11. Graphical Data Analysis on the Circle: Wrap-Around Time Series Plots for (Interrupted) Time Series Designs.

    Science.gov (United States)

    Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew

    2014-01-01

    Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is a graphical method useful to support time series analyses in general but in particular in relation to interrupted time series designs. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.
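
    Wrap-around displays of a seasonal series can be approximated with an ordinary polar scatter plot, as in the sketch below. This is a generic illustration, not the RRose or WATS Plots software, and the monthly counts are made up.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical monthly counts over ten years with a mild seasonal cycle
rng = np.random.default_rng(3)
months = np.tile(np.arange(12), 10)
counts = 100 + 10 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 3, months.size)

# Wrap the series around the circle: one angular position per calendar month
theta = 2 * np.pi * months / 12
ax = plt.subplot(projection="polar")
ax.scatter(theta, counts, s=8, alpha=0.6)
ax.set_xticks(2 * np.pi * np.arange(12) / 12)
ax.set_xticklabels(["J", "F", "M", "A", "M", "J", "J", "A", "S", "O", "N", "D"])
ax.set_title("Wrap-around view of a seasonal series (illustrative)")
plt.show()
```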

  12. Automated Bayesian model development for frequency detection in biological time series

    Directory of Open Access Journals (Sweden)

    Oldroyd Giles ED

    2011-06-01

    Full Text Available Abstract Background A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. Results In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlights the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Conclusions Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and

  13. Automated Bayesian model development for frequency detection in biological time series.

    Science.gov (United States)

    Granqvist, Emma; Oldroyd, Giles E D; Morris, Richard J

    2011-06-24

    A first step in building a mathematical model of a biological system is often the analysis of the temporal behaviour of key quantities. Mathematical relationships between the time and frequency domain, such as Fourier Transforms and wavelets, are commonly used to extract information about the underlying signal from a given time series. This one-to-one mapping from time points to frequencies inherently assumes that both domains contain the complete knowledge of the system. However, for truncated, noisy time series with background trends this unique mapping breaks down and the question reduces to an inference problem of identifying the most probable frequencies. In this paper we build on the method of Bayesian Spectrum Analysis and demonstrate its advantages over conventional methods by applying it to a number of test cases, including two types of biological time series. Firstly, oscillations of calcium in plant root cells in response to microbial symbionts are non-stationary and noisy, posing challenges to data analysis. Secondly, circadian rhythms in gene expression measured over only two cycles highlights the problem of time series with limited length. The results show that the Bayesian frequency detection approach can provide useful results in specific areas where Fourier analysis can be uninformative or misleading. We demonstrate further benefits of the Bayesian approach for time series analysis, such as direct comparison of different hypotheses, inherent estimation of noise levels and parameter precision, and a flexible framework for modelling the data without pre-processing. Modelling in systems biology often builds on the study of time-dependent phenomena. Fourier Transforms are a convenient tool for analysing the frequency domain of time series. However, there are well-known limitations of this method, such as the introduction of spurious frequencies when handling short and noisy time series, and the requirement for uniformly sampled data. Biological time
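
    The Bayesian frequency detection discussed in the two records above can be illustrated with a minimal single-frequency model. The sketch below uses the standard closed form in which amplitude and noise level are marginalised under uniform priors, so the posterior over frequency depends on the data only through the periodogram; it is a sketch under that assumption, not the authors' implementation, and the test signal is invented.

```python
import numpy as np

def log_posterior_frequency(d, freqs):
    """Log posterior over trial frequencies for a single stationary sinusoid in
    white noise, with amplitude and noise variance marginalised out."""
    d = np.asarray(d, dtype=float)
    d = d - d.mean()
    N = len(d)
    t = np.arange(N)
    msq = np.mean(d ** 2)
    logp = []
    for f in freqs:
        # Schuster periodogram C(f) = |sum_t d_t exp(-2*pi*i*f*t)|^2 / N
        C = np.abs(np.sum(d * np.exp(-2j * np.pi * f * t))) ** 2 / N
        logp.append((2 - N) / 2 * np.log(np.maximum(1 - 2 * C / (N * msq), 1e-12)))
    return np.array(logp)

# A short, noisy record (e.g. two cycles of an oscillation) with a known frequency
rng = np.random.default_rng(4)
n = 120
x = np.sin(2 * np.pi * 0.04 * np.arange(n)) + rng.normal(0, 0.8, n)
trial = np.linspace(0.005, 0.25, 500)
best = trial[np.argmax(log_posterior_frequency(x, trial))]
print(f"most probable frequency: {best:.3f} (true value 0.040)")
```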

  14. Time Series with Long Memory

    OpenAIRE

    西埜, 晴久

    2004-01-01

    The paper investigates an application of long-memory processes to economic time series. We show properties of long-memory processes, which are motivated to model a long-memory phenomenon in economic time series. An FARIMA model is described as an example of long-memory model in statistical terms. The paper explains basic limit theorems and estimation methods for long-memory processes in order to apply long-memory models to economic time series.

  15. Visibility Graph Based Time Series Analysis.

    Science.gov (United States)

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both its microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide us with rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.

  16. Visibility Graph Based Time Series Analysis.

    Directory of Open Access Journals (Sweden)

    Mutua Stephen

    Full Text Available Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both its microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide us with rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.
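
    The visibility-graph mapping used in the two records above has a compact definition: two samples are linked whenever the straight line between them passes above every sample in between. A minimal sketch of that natural visibility criterion, built with networkx on a random test series, follows; degree statistics of such graphs are the kind of state descriptors the method works with.

```python
import numpy as np
import networkx as nx

def natural_visibility_graph(y):
    """Natural visibility graph: nodes are time points; an edge (a, b) exists if
    every intermediate sample lies strictly below the line joining a and b."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    g = nx.Graph()
    g.add_nodes_from(range(n))
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                g.add_edge(a, b)
    return g

series = np.random.default_rng(5).normal(size=200)
graph = natural_visibility_graph(series)
print("mean degree:", np.mean([d for _, d in graph.degree()]))
```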

  17. Late Pleistocene fishes of the Tennessee River Basin: an analysis of a late Pleistocene freshwater fish fauna from Bell Cave (site ACb-2) in Colbert County, Alabama, USA.

    Science.gov (United States)

    Jacquemin, Stephen J; Ebersole, Jun A; Dickinson, William C; Ciampaglio, Charles N

    2016-01-01

    The Tennessee River Basin is considered one of the most important regions for freshwater biodiversity anywhere on the globe. The Tennessee River Basin currently includes populations of at least half of the described contemporary diversity of extant North American freshwater fishes, crayfish, mussel, and gastropod species. However, comparatively little is known about the biodiversity of this basin from the Pleistocene Epoch, particularly the late Pleistocene (∼10,000 to 30,000 years B.P.) leading to modern Holocene fish diversity patterns. The objective of this study was to describe the fish assemblages of the Tennessee River Basin from the late Pleistocene using a series of faunas from locales throughout the basin documented from published literature, unpublished reports, and an undocumented fauna from Bell Cave (site ACb-2, Colbert County, AL). Herein we discuss 41 unequivocal taxa from 10 late Pleistocene localities within the basin and include a systematic discussion of 11 families, 19 genera, and 24 identifiable species (28 unequivocal taxa) specific to the Bell Cave locality. Among the described fauna are several extirpated (e.g., Northern Pike Esox lucius, Northern Madtom Noturus stigmosus) and a single extinct (Harelip Sucker Moxostoma lacerum) taxa that suggest a combination of late Pleistocene displacement events coupled with more recent changes in habitat that have resulted in modern basin diversity patterns. The Bell Cave locality represents one of the most intact Pleistocene freshwater fish deposits anywhere in North America. Significant preservational, taphonomic, sampling, and identification biases preclude the identification of additional taxa. Overall, this study provides a detailed look into paleo-river ecology, as well as freshwater fish diversity and distribution leading up to the contemporary biodiversity patterns of the Tennessee River Basin and Mississippi River Basin as a whole.

  18. Late Pleistocene fishes of the Tennessee River Basin: an analysis of a late Pleistocene freshwater fish fauna from Bell Cave (site ACb-2 in Colbert County, Alabama, USA

    Directory of Open Access Journals (Sweden)

    Stephen J. Jacquemin

    2016-02-01

    Full Text Available The Tennessee River Basin is considered one of the most important regions for freshwater biodiversity anywhere on the globe. The Tennessee River Basin currently includes populations of at least half of the described contemporary diversity of extant North American freshwater fishes, crayfish, mussel, and gastropod species. However, comparatively little is known about the biodiversity of this basin from the Pleistocene Epoch, particularly the late Pleistocene (∼10,000 to 30,000 years B.P.) leading to modern Holocene fish diversity patterns. The objective of this study was to describe the fish assemblages of the Tennessee River Basin from the late Pleistocene using a series of faunas from locales throughout the basin documented from published literature, unpublished reports, and an undocumented fauna from Bell Cave (site ACb-2, Colbert County, AL). Herein we discuss 41 unequivocal taxa from 10 late Pleistocene localities within the basin and include a systematic discussion of 11 families, 19 genera, and 24 identifiable species (28 unequivocal taxa) specific to the Bell Cave locality. Among the described fauna are several extirpated (e.g., Northern Pike Esox lucius, Northern Madtom Noturus stigmosus) and a single extinct (Harelip Sucker Moxostoma lacerum) taxa that suggest a combination of late Pleistocene displacement events coupled with more recent changes in habitat that have resulted in modern basin diversity patterns. The Bell Cave locality represents one of the most intact Pleistocene freshwater fish deposits anywhere in North America. Significant preservational, taphonomic, sampling, and identification biases preclude the identification of additional taxa. Overall, this study provides a detailed look into paleo-river ecology, as well as freshwater fish diversity and distribution leading up to the contemporary biodiversity patterns of the Tennessee River Basin and Mississippi River Basin as a whole.

  19. Using Landsat Spectral Indices in Time-Series to Assess Wildfire Disturbance and Recovery

    Directory of Open Access Journals (Sweden)

    Samuel Hislop

    2018-03-01

    Full Text Available Satellite earth observation is being increasingly used to monitor forests across the world. Freely available Landsat data stretching back four decades, coupled with advances in computer processing capabilities, has enabled new time-series techniques for analyzing forest change. Typically, these methods track individual pixel values over time, through the use of various spectral indices. This study examines the utility of eight spectral indices for characterizing fire disturbance and recovery in sclerophyll forests, in order to determine their relative merits in the context of Landsat time-series. Although existing research into Landsat indices is comprehensive, this study presents a new approach, by comparing the distributions of pre- and post-fire pixels using Glass's delta, for evaluating indices without the need of detailed field information. Our results show that in the sclerophyll forests of southeast Australia, common indices, such as the Normalized Difference Vegetation Index (NDVI) and the Normalized Burn Ratio (NBR), both accurately capture wildfire disturbance in a pixel-based time-series approach, especially if images from soon after the disturbance are available. However, for tracking forest regrowth and recovery, indices, such as NDVI, which typically capture chlorophyll concentration or canopy ‘greenness’, are not as reliable, with values returning to pre-fire levels in 3–5 years. In comparison, indices that are more sensitive to forest moisture and structure, such as NBR, indicate much longer (8–10 years) recovery timeframes. This finding is consistent with studies that were conducted in other forest types. We also demonstrate that additional information regarding forest condition, particularly in relation to recovery, can be extracted from less well known indices, such as NBR2, as well as textural indices incorporating spatial variance. With Landsat time-series gaining in popularity in recent years, it is critical to
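
    The two indices named above are simple normalized band differences. The sketch below computes NDVI and NBR for a made-up per-pixel reflectance time series with a simulated disturbance in 2005; band values and the recovery trajectory are purely illustrative, but they mimic the pattern reported above (NDVI rebounds faster than NBR).

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def nbr(nir, swir2):
    """Normalized Burn Ratio from NIR and shortwave-infrared (SWIR2) reflectance."""
    return (nir - swir2) / (nir + swir2)

# Illustrative annual surface-reflectance values for one pixel, "burned" in 2005
years = np.arange(2000, 2016)
nir   = np.array([0.30] * 5 + [0.12] + list(np.linspace(0.15, 0.29, 10)))
red   = np.array([0.05] * 5 + [0.10] + list(np.linspace(0.08, 0.05, 10)))
swir2 = np.array([0.08] * 5 + [0.25] + list(np.linspace(0.20, 0.09, 10)))

for year, v, b in zip(years, ndvi(nir, red), nbr(nir, swir2)):
    print(year, round(float(v), 2), round(float(b), 2))
```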

  20. Continuity versus discontinuity of the human settlement of Europe between the late Early Pleistocene and the early Middle Pleistocene. The mandibular evidence

    Science.gov (United States)

    Bermúdez de Castro, José María; Martinón-Torres, María; Rosell, Jordi; Blasco, Ruth; Arsuaga, Juan Luís; Carbonell, Eudald

    2016-12-01

    One of the most interesting aspects of the settlement of Europe is the possible continuity or discontinuity of the populations living in this continent during the Early and Middle Pleistocene. In this paper we present an analysis of the mandibular fossil record from four important Pleistocene European sites, Gran Dolina-TD6-2 (Sierra de Atapuerca), Mauer, Arago, and Atapuerca-Sima de los Huesos. We focus this study on the recognition of key derived mandibular features that may be useful to assess the relationship among the populations represented at these sites. In order to approach the ecological scenario, we also present a short review and discussion of the archaeological and paleoenvironmental evidence at that time. Our results suggest that there was probably a demographic discontinuity between the late Early Pleistocene populations (MIS 21-MIS 19) and those dated to MIS 15. Hybridization between residents and new settlers cannot be ruled out. However, some features of the Gran Dolina-TD6 hominins point to some relationship between the population represented in this site (probably dated to MIS 21) and the European Middle Pleistocene and early Late Pleistocene populations. A hypothetical scenario is presented in order to understand this apparent contradiction with the model of discontinuity.

  1. Network structure of multivariate time series.

    Science.gov (United States)

    Lacasa, Lucas; Nicosia, Vincenzo; Latora, Vito

    2015-10-21

    Our understanding of a variety of phenomena in physics, biology and economics crucially depends on the analysis of multivariate time series. While a wide range of tools and techniques for time series analysis already exist, the increasing availability of massive data structures calls for new approaches for multidimensional signal processing. We present here a non-parametric method to analyse multivariate time series, based on the mapping of a multidimensional time series into a multilayer network, which allows one to extract information on a high dimensional dynamical system through the analysis of the structure of the associated multiplex network. The method is simple to implement, general, scalable, does not require ad hoc phase space partitioning, and is thus suitable for the analysis of large, heterogeneous and non-stationary time series. We show that simple structural descriptors of the associated multiplex networks allow us to extract and quantify nontrivial properties of coupled chaotic maps, including the transition between different dynamical phases and the onset of various types of synchronization. As a concrete example we then study financial time series, showing that a multiplex network analysis can efficiently discriminate crises from periods of financial stability, where standard methods based on time-series symbolization often fail.

  2. High heat flux tests of the WENDELSTEIN 7-X pre-series target elements

    International Nuclear Information System (INIS)

    Greuner, H.; Boeswirth, B.; Boscary, J.; Plankensteiner, A.; Schedler, B.

    2007-01-01

    The high heat flux (HHF) testing of WENDELSTEIN 7-X pre-series target elements is an indispensable step in the qualification of the manufacturing process. A set of 20 full scale pre-series elements was manufactured by PLANSEE SE to validate the materials and manufacturing technologies prior to the start of the series production. The HHF tests were performed in the ion beam test facility GLADIS. All actively water-cooled elements were tested for about 100 cycles at 10 MW/m² (10-15 s pulse duration). Several elements were loaded with even higher cycle numbers (up to 1000) and heat loads up to 24 MW/m². Hot spots were observed at the edges of several tiles during the HHF tests, indicating local bonding problems of the CFC. The thermo-mechanical behaviour under HHF loading has been evaluated and compared to the FEM predictions. The measured temperatures and strains confirm the chosen FEM approach. This allows a component optimisation to achieve a successful series production of the W7-X divertor target elements.

  3. What marketing scholars should know about time series analysis : time series applications in marketing

    NARCIS (Netherlands)

    Horváth, Csilla; Kornelis, Marcel; Leeflang, Peter S.H.

    2002-01-01

    In this review, we give a comprehensive summary of time series techniques in marketing, and discuss a variety of time series analysis (TSA) techniques and models. We classify them in the sets (i) univariate TSA, (ii) multivariate TSA, and (iii) multiple TSA. We provide relevant marketing

  4. Data mining in time series databases

    CERN Document Server

    Kandel, Abraham; Bunke, Horst

    2004-01-01

    Adding the time dimension to real-world databases produces Time Series Databases (TSDB) and introduces new aspects and difficulties to data mining and knowledge discovery. This book covers the state-of-the-art methodology for mining time series databases. The novel data mining methods presented in the book include techniques for efficient segmentation, indexing, and classification of noisy and dynamic time series. A graph-based method for anomaly detection in time series is described and the book also studies the implications of a novel and potentially useful representation of time series as strings. The problem of detecting changes in data mining models that are induced from temporal databases is additionally discussed.

  5. 3D Architecture and evolution of the Po Plain-Northern Adriatic Foreland basin during Plio-Pleistocene time

    Science.gov (United States)

    Amadori, Chiara; Toscani, Giovanni; Ghielmi, Manlio; Maesano, Francesco Emanuele; D'Ambrogi, Chiara; Lombardi, Stefano; Milanesi, Riccardo; Panara, Yuri; Di Giulio, Andrea

    2017-04-01

    The Pliocene-Pleistocene tectonic and sedimentary evolution of the eastern Po Plain and northern Adriatic Foreland Basin (PPAF) (extending over ca. 35,000 km²) was the consequence of severe Northern Apennine compressional activity and climate-driven eustatic changes. According to the 2D seismic interpretation, facies analysis and sequence stratigraphy approach by Ghielmi et al. (2013 and references therein), these tectono-eustatic phases generated six basin-scale unconformities referred to as Base Pliocene (PL1), Intra-Zanclean (PL2), Intra-Piacenzian (PL3), Gelasian (PL4), Base Calabrian (PS1) and Late Calabrian (PS2). We present a basin-wide detailed 3D model of the PPAF region, derived from the interpretation of these unconformities in a dense network of seismic lines (ca. 6,000 km) correlated with more than 200 well stratigraphies (courtesy of ENI E&P). The initial 3D time-model has been time-to-depth converted using the 3D velocity model created with Vel-IO 3D, a tool for 3D depth conversions, and then validated and integrated with depth-domain datasets from the literature and well logs. The resulting isobath and isopach maps are used to inspect, step by step, the paleogeographic evolution of the basin, which occurred through alternating stages of simple and fragmented foredeeps. Changes in the basin geometry through time, from the inner sector located in the Emilia-Romagna Apennines to the outermost region (Veneto and northern Adriatic Sea), were marked by repeated phases of outward migration of two large deep depocenters located in front of the Emilia arcs on the west, and in front of the Ferrara-Romagna thrusts on the east. During the late Pliocene-early Pleistocene, the inner side of the Emilia-Romagna arcs evolved into an elongated deep thrust-top basin due to strong foredeep fragmentation; an overall tectono-stratigraphic analysis also shows a decreasing trend in the tectonic intensity of the Northern Apennines from the Pleistocene to the present.

  6. Models for dependent time series

    CERN Document Server

    Tunnicliffe Wilson, Granville; Haywood, John

    2015-01-01

    Models for Dependent Time Series addresses the issues that arise and the methodology that can be applied when the dependence between time series is described and modeled. Whether you work in the economic, physical, or life sciences, the book shows you how to draw meaningful, applicable, and statistically valid conclusions from multivariate (or vector) time series data. The first four chapters discuss the two main pillars of the subject that have been developed over the last 60 years: vector autoregressive modeling and multivariate spectral analysis. These chapters provide the foundational mater

  7. Visual time series analysis

    DEFF Research Database (Denmark)

    Fischer, Paul; Hilbert, Astrid

    2012-01-01

    We introduce a platform which supplies an easy-to-handle, interactive, extendable, and fast analysis tool for time series analysis. In contrast to other software suites like Maple, Matlab, or R, which use a command-line-like interface and where the user has to memorize/look-up the appropriate...... commands, our application is select-and-click-driven. It allows the user to derive many different sequences of deviations for a given time series and to visualize them in different ways in order to judge their expressive power and to reuse the procedure found. For many transformations or model-fits, the user may...... choose between manual and automated parameter selection. The user can define new transformations and add them to the system. The application contains efficient implementations of advanced and recent techniques for time series analysis including techniques related to extreme value analysis and filtering...

  8. Time-Transgressive Nature of the Magnetic Susceptibility Record across the Chinese Loess Plateau at the Pleistocene/Holocene Transition

    Science.gov (United States)

    Dong, Yajie; Wu, Naiqin; Li, Fengjiang; Huang, Linpei; Wen, Wenwen

    2015-01-01

    The loess stratigraphic boundary at the Pleistocene/Holocene transition defined by the magnetic susceptibility (MS) has previously been assumed to be synchronous with the Marine Isotope Stage (MIS) 2/1 boundary, and approximately time-synchronous at different sections across the Chinese Loess Plateau (CLP). However, although this assumption has been used as a basis for proxy-age model of Chinese loess deposits, it has rarely been tested by using absolute dating methods. In this study, we applied a single-aliquot regenerative-dose (SAR) protocol to the 45–63 μm quartz grain-size fraction to derive luminescence ages for the last glacial and Holocene sections of three loess sections on a transect from southeast to northwest across the CLP. Based on the 33 closely spaced optically stimulated luminescence (OSL) samples from the three sections, OSL chronologies were established using a polynomial curve fit at each section. Based on the OSL chronology, the timing of the Pleistocene/Holocene boundary, as defined by rapid changes in MS values, is dated at ~10.5 ka, 8.5 ka and 7.5 ka in the Yaoxian section, Jingchuan and Huanxian sections respectively. These results are clearly inconsistent with the MIS 2/1 boundary age of 12.05 ka, and therefore we conclude that the automatic correlation of the Pleistocene/Holocene transition, as inferred from the MS record, with the MIS 2/1 boundary is incorrect. The results clearly demonstrate that the marked changes in MS along the southeast to northwest transect are time-transgressive among the different sites, with the timing of significant paleosol development as indicated by the MS record being delayed by 3–4 ka in the northwest compared to the southeast. Our results suggest that this asynchronous paleosol development during the last deglacial was caused by the delayed arrival of the summer monsoon in the northwest CLP compared to the southeast. PMID:26186443

  9. Time-Transgressive Nature of the Magnetic Susceptibility Record across the Chinese Loess Plateau at the Pleistocene/Holocene Transition.

    Directory of Open Access Journals (Sweden)

    Yajie Dong

    Full Text Available The loess stratigraphic boundary at the Pleistocene/Holocene transition defined by the magnetic susceptibility (MS) has previously been assumed to be synchronous with the Marine Isotope Stage (MIS) 2/1 boundary, and approximately time-synchronous at different sections across the Chinese Loess Plateau (CLP). However, although this assumption has been used as a basis for proxy-age model of Chinese loess deposits, it has rarely been tested by using absolute dating methods. In this study, we applied a single-aliquot regenerative-dose (SAR) protocol to the 45-63 μm quartz grain-size fraction to derive luminescence ages for the last glacial and Holocene sections of three loess sections on a transect from southeast to northwest across the CLP. Based on the 33 closely spaced optically stimulated luminescence (OSL) samples from the three sections, OSL chronologies were established using a polynomial curve fit at each section. Based on the OSL chronology, the timing of the Pleistocene/Holocene boundary, as defined by rapid changes in MS values, is dated at ~10.5 ka, 8.5 ka and 7.5 ka in the Yaoxian section, Jingchuan and Huanxian sections respectively. These results are clearly inconsistent with the MIS 2/1 boundary age of 12.05 ka, and therefore we conclude that the automatic correlation of the Pleistocene/Holocene transition, as inferred from the MS record, with the MIS 2/1 boundary is incorrect. The results clearly demonstrate that the marked changes in MS along the southeast to northwest transect are time-transgressive among the different sites, with the timing of significant paleosol development as indicated by the MS record being delayed by 3-4 ka in the northwest compared to the southeast. Our results suggest that this asynchronous paleosol development during the last deglacial was caused by the delayed arrival of the summer monsoon in the northwest CLP compared to the southeast.

  10. Is the modern koala ( Phascolarctos cinereus) a derived dwarf of a Pleistocene giant? Implications for testing megafauna extinction hypotheses

    Science.gov (United States)

    Price, Gilbert J.

    2008-12-01

    The modern Australian koala ( Phascolarctos cinereus) is commonly regarded as a dwarf descendant of a Late Pleistocene giant koala ( Ph. stirtoni). The implication of that hypothesis is that the giant koala survived the Late Pleistocene megafaunal extinction "event", albeit as a smaller body-sized form. It is important to be able to constrain rates of Late Pleistocene faunal turnover, an aspect reliant on having accurate taxonomic information for extinct species. The koala dwarfing hypothesis is tested here by using a temporally-constrained biogeographical record of fossil koalas, and a morphological character analysis. The contemporary occurrence of both taxa in pre-Late Pleistocene deposits and significant differences in dental morphologies between those forms suggest that the modern koala is not a derived dwarf of the Pleistocene giant koala. Thus, the giant form was among a number of other giant mammals, lizards and birds that suffered extinction sometime during the Late Pleistocene. The potential phenomenon of dwarfing of other Late Pleistocene and Recent faunas, such as grey kangaroos, is commonly used as a test for or against various megafaunal extinction hypotheses. However, the results of this study also demonstrate that the dwarfing hypothesis has not been adequately tested for a suite of other taxa. Thus, until the dwarfing hypothesis can be more fully tested, a clear understanding of the fate of Late Pleistocene faunas that apparently survived the extinction "event", and the origins of many extant forms will remain elusive.

  11. A Review of Subsequence Time Series Clustering

    Directory of Open Access Journals (Sweden)

    Seyedjamal Zolhavarieh

    2014-01-01

    Full Text Available Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: preproof, interproof, and postproof period. Moreover, various state-of-the-art approaches in performing subsequence time series clustering are discussed under each of the following categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies.

  12. A review of subsequence time series clustering.

    Science.gov (United States)

    Zolhavarieh, Seyedjamal; Aghabozorgi, Saeed; Teh, Ying Wah

    2014-01-01

    Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: preproof, interproof, and postproof period. Moreover, various state-of-the-art approaches in performing subsequence time series clustering are discussed under each of the following categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies.

  13. A Review of Subsequence Time Series Clustering

    Science.gov (United States)

    Teh, Ying Wah

    2014-01-01

    Clustering of subsequence time series remains an open issue in time series clustering. Subsequence time series clustering is used in different fields, such as e-commerce, outlier detection, speech recognition, biological systems, DNA recognition, and text mining. One of the useful fields in the domain of subsequence time series clustering is pattern recognition. To improve this field, a sequence of time series data is used. This paper reviews some definitions and backgrounds related to subsequence time series clustering. The categorization of the literature reviews is divided into three groups: preproof, interproof, and postproof period. Moreover, various state-of-the-art approaches in performing subsequence time series clustering are discussed under each of the following categories. The strengths and weaknesses of the employed methods are evaluated as potential issues for future studies. PMID:25140332

  14. Analysis of Heavy-Tailed Time Series

    DEFF Research Database (Denmark)

    Xie, Xiaolei

    This thesis is about analysis of heavy-tailed time series. We discuss tail properties of real-world equity return series and investigate the possibility that a single tail index is shared by all return series of actively traded equities in a market. Conditions for this hypothesis to be true...... are identified. We study the eigenvalues and eigenvectors of sample covariance and sample auto-covariance matrices of multivariate heavy-tailed time series, and particularly for time series with very high dimensions. Asymptotic approximations of the eigenvalues and eigenvectors of such matrices are found...... and expressed in terms of the parameters of the dependence structure, among others. Furthermore, we study an importance sampling method for estimating rare-event probabilities of multivariate heavy-tailed time series generated by matrix recursion. We show that the proposed algorithm is efficient in the sense...
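
    Tail behaviour of the kind studied in this thesis is often summarised by a tail index. As a small, generic illustration (not the thesis's own methodology), the sketch below applies the Hill estimator to the k largest absolute values of a simulated return series; the choice of k and the Student-t test data are arbitrary.

```python
import numpy as np

def hill_tail_index(returns, k=200):
    """Hill estimator of the tail index alpha from the k largest absolute values."""
    x = np.sort(np.abs(np.asarray(returns, dtype=float)))[::-1]   # descending
    gamma = np.mean(np.log(x[:k]) - np.log(x[k]))                 # mean log-excess
    return 1.0 / gamma

# Student-t returns with 3 degrees of freedom have a true tail index of 3
sample = np.random.default_rng(6).standard_t(df=3, size=20000)
print("estimated tail index:", round(hill_tail_index(sample), 2))
```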

  15. Adaptive time-variant models for fuzzy-time-series forecasting.

    Science.gov (United States)

    Wong, Wai-Keung; Bai, Enjian; Chu, Alice Wai-Ching

    2010-12-01

    A fuzzy time series has been applied to the prediction of enrollment, temperature, stock indices, and other domains. Related studies mainly focus on three factors, namely, the partition of discourse, the content of forecasting rules, and the methods of defuzzification, all of which greatly influence the prediction accuracy of forecasting models. These studies use fixed analysis window sizes for forecasting. In this paper, an adaptive time-variant fuzzy-time-series forecasting model (ATVF) is proposed to improve forecasting accuracy. The proposed model automatically adapts the analysis window size of fuzzy time series based on the prediction accuracy in the training phase and uses heuristic rules to generate forecasting values in the testing phase. The performance of the ATVF model is tested using both simulated and actual time series including the enrollments at the University of Alabama, Tuscaloosa, and the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX). The experiment results show that the proposed ATVF model achieves a significant improvement in forecasting accuracy as compared to other fuzzy-time-series forecasting models.
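
    For readers unfamiliar with this family of models, the sketch below implements a very basic fixed-partition, Chen-style fuzzy time series forecaster (fuzzification into intervals, fuzzy logical relationship groups, midpoint defuzzification). It is not the adaptive time-variant ATVF model itself; the number of intervals is arbitrary and the enrollment figures are used only as an illustrative input.

```python
import numpy as np

def chen_fuzzy_forecast(series, n_intervals=7):
    """One-step-ahead forecast with a basic Chen-style fuzzy time series model."""
    x = np.asarray(series, dtype=float)
    lo, hi = x.min(), x.max()
    pad = 0.1 * (hi - lo)
    edges = np.linspace(lo - pad, hi + pad, n_intervals + 1)
    mids = (edges[:-1] + edges[1:]) / 2
    # Fuzzify: index of the interval containing each observation
    states = np.clip(np.digitize(x, edges) - 1, 0, n_intervals - 1)
    # Fuzzy logical relationship groups A_i -> {A_j, ...}
    groups = {i: set() for i in range(n_intervals)}
    for current, following in zip(states[:-1], states[1:]):
        groups[int(current)].add(int(following))
    last = int(states[-1])
    targets = groups[last] or {last}              # fall back to the current state
    return float(np.mean([mids[j] for j in targets]))

enrollment = [13055, 13563, 13867, 14696, 15460, 15311, 15603, 15861, 16807,
              16919, 16388, 15433, 15497, 15145, 15163, 15984, 16859, 18150]
print("next-step forecast:", round(chen_fuzzy_forecast(enrollment), 1))
```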

  16. Time Series Analysis and Forecasting by Example

    CERN Document Server

    Bisgaard, Soren

    2011-01-01

    An intuition-based approach enables you to master time series analysis with ease Time Series Analysis and Forecasting by Example provides the fundamental techniques in time series analysis using various examples. By introducing necessary theory through examples that showcase the discussed topics, the authors successfully help readers develop an intuitive understanding of seemingly complicated time series models and their implications. The book presents methodologies for time series analysis in a simplified, example-based approach. Using graphics, the authors discuss each presented example in

  17. Time series with tailored nonlinearities

    Science.gov (United States)

    Räth, C.; Laut, I.

    2015-10-01

    It is demonstrated how to generate time series with tailored nonlinearities by inducing well-defined constraints on the Fourier phases. Correlations between the phase information of adjacent phases and (static and dynamic) measures of nonlinearities are established and their origin is explained. By applying a set of simple constraints on the phases of an originally linear and uncorrelated Gaussian time series, the observed scaling behavior of the intensity distribution of empirical time series can be reproduced. The power law character of the intensity distributions being typical for, e.g., turbulence and financial data can thus be explained in terms of phase correlations.
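
    One rough way to mimic the idea of inducing nonlinearity through Fourier-phase constraints is sketched below: the amplitude spectrum of an uncorrelated Gaussian series is kept fixed while the phases are forced to follow an AR(1)-like walk over frequency. The coupling value and the diagnostic at the end are arbitrary choices, not the specific constraints used in the paper.

```python
import numpy as np

def phase_constrained_series(n=4096, coupling=0.8, seed=7):
    """Keep the Fourier amplitudes of a Gaussian white-noise series but impose
    correlations between adjacent Fourier phases before transforming back."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n)
    spectrum = np.fft.rfft(x)
    amplitudes = np.abs(spectrum)
    # Correlated phases: an AR(1)-like walk over the frequency index
    innovations = rng.uniform(-np.pi, np.pi, size=len(spectrum))
    phases = np.zeros(len(spectrum))
    for k in range(1, len(spectrum)):
        phases[k] = coupling * phases[k - 1] + (1 - coupling) * innovations[k]
    new_spectrum = amplitudes * np.exp(1j * phases)
    new_spectrum[0] = spectrum[0].real            # keep the mean term real
    return np.fft.irfft(new_spectrum, n=n)

y = phase_constrained_series()
skew = float(np.mean((y - y.mean()) ** 3) / y.std() ** 3)
print("third standardized moment (zero in expectation for a linear Gaussian series):", round(skew, 3))
```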

  18. Clustering of financial time series

    Science.gov (United States)

    D'Urso, Pierpaolo; Cappelli, Carmela; Di Lallo, Dario; Massari, Riccardo

    2013-05-01

    This paper addresses the topic of classifying financial time series in a fuzzy framework, proposing two fuzzy clustering models both based on GARCH models. In general, clustering of financial time series, due to their peculiar features, needs the definition of suitable distance measures. To this aim, the first fuzzy clustering model exploits the autoregressive representation of GARCH models and employs, in the framework of a partitioning around medoids algorithm, the classical autoregressive metric. The second fuzzy clustering model, also based on the partitioning around medoids algorithm, uses the Caiado distance, a Mahalanobis-like distance, based on estimated GARCH parameters and covariances that takes into account the information about the volatility structure of time series. In order to illustrate the merits of the proposed fuzzy approaches, an application to the problem of classifying 29 time series of Euro exchange rates against international currencies is presented and discussed, also comparing the fuzzy models with their crisp version.

  19. Updating Landsat time series of surface-reflectance composites and forest change products with new observations

    Science.gov (United States)

    Hermosilla, Txomin; Wulder, Michael A.; White, Joanne C.; Coops, Nicholas C.; Hobart, Geordie W.

    2017-12-01

    The use of time series satellite data allows for the temporally dense, systematic, transparent, and synoptic capture of land dynamics over time. Subsequent to the opening of the Landsat archive, several time series approaches for characterizing landscape change have been developed, often representing a particular analytical time window. The information richness and widespread utility of these time series data have created a need to maintain the currency of time series information via the addition of new data, as it becomes available. When an existing time series is temporally extended, it is critical that previously generated change information remains consistent, thereby not altering reported change statistics or science outcomes based on that change information. In this research, we investigate the impacts and implications of adding additional years to an existing 29-year annual Landsat time series for forest change. To do so, we undertook a spatially explicit comparison of the 29 overlapping years of a time series representing 1984-2012, with a time series representing 1984-2016. Surface reflectance values, and presence, year, and type of change were compared. We found that the addition of years to extend the time series had minimal effect on the annual surface reflectance composites, with slight band-specific differences (r ≥ 0.1) in the final years of the original time series being updated. The area of stand replacing disturbances and determination of change year are virtually unchanged for the overlapping period between the two time-series products. Over the overlapping temporal period (1984-2012), the total area of change differs by 0.53%, equating to an annual difference in change area of 0.019%. Overall, the spatial and temporal agreement of the changes detected by both time series was 96%. Further, our findings suggest that the entire pre-existing historic time series does not need to be re-processed during the update process. Critically, given the time

  20. GPS Time Series and Geodynamic Implications for the Hellenic Arc Area, Greece

    Science.gov (United States)

    Hollenstein, Ch.; Heller, O.; Geiger, A.; Kahle, H.-G.; Veis, G.

    The quantification of crustal deformation and its temporal behavior is an important contribution to earthquake hazard assessment. With GPS measurements, especially from continuously operating stations, pre-, co-, post- and interseismic movements can be recorded and monitored. We present results of a continuous GPS network which has been operated in the Hellenic Arc area, Greece, since 1995. In order to obtain coordinate time series of high precision which are representative of crustal deformation, a main goal was to eliminate effects which are not of tectonic origin. By applying different steps of improvement, non-tectonic irregularities were reduced significantly, and the precision could be improved by an average of 40%. The improved time series are used to study the crustal movements in space and time. They serve as a base for the estimation of velocities and for the visualization of the movements in terms of trajectories. Special attention is given to large earthquakes (M>6), which occurred near GPS sites during the measuring time span.

  1. Identification of neutral biochemical network models from time series data.

    Science.gov (United States)

    Vilela, Marco; Vinga, Susana; Maia, Marco A Grivet Mattoso; Voit, Eberhard O; Almeida, Jonas S

    2009-05-05

    The major difficulty in modeling biological systems from multivariate time series is the identification of parameter sets that endow a model with dynamical behaviors sufficiently similar to the experimental data. Directly related to this parameter estimation issue is the task of identifying the structure and regulation of ill-characterized systems. Both tasks are simplified if the mathematical model is canonical, i.e., if it is constructed according to strict guidelines. In this report, we propose a method for the identification of admissible parameter sets of canonical S-systems from biological time series. The method is based on a Monte Carlo process that is combined with an improved version of our previous parameter optimization algorithm. The method maps the parameter space into the network space, which characterizes the connectivity among components, by creating an ensemble of decoupled S-system models that imitate the dynamical behavior of the time series with sufficient accuracy. The concept of sloppiness is revisited in the context of these S-system models with an exploration not only of different parameter sets that produce similar dynamical behaviors but also different network topologies that yield dynamical similarity. The proposed parameter estimation methodology was applied to actual time series data from the glycolytic pathway of the bacterium Lactococcus lactis and led to ensembles of models with different network topologies. In parallel, the parameter optimization algorithm was applied to the same dynamical data upon imposing a pre-specified network topology derived from prior biological knowledge, and the results from both strategies were compared. The results suggest that the proposed method may serve as a powerful exploration tool for testing hypotheses and the design of new experiments.
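
    For readers unfamiliar with the canonical form mentioned above, a minimal S-system simulation is sketched below: each state has one power-law production term and one power-law degradation term. The parameter values are invented for illustration, and the sketch covers only the forward model, not the Monte Carlo parameter-identification procedure of the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Canonical S-system: dX_i/dt = alpha_i * prod_j X_j**g_ij - beta_i * prod_j X_j**h_ij
# (illustrative parameters, not those estimated for Lactococcus lactis)
alpha = np.array([2.0, 1.5])
beta = np.array([1.0, 1.0])
G = np.array([[0.0, -0.5],     # production of X1 is inhibited by X2
              [0.5, 0.0]])     # production of X2 is activated by X1
H = np.array([[0.5, 0.0],
              [0.0, 0.5]])

def s_system(t, x):
    production = alpha * np.prod(x ** G, axis=1)
    degradation = beta * np.prod(x ** H, axis=1)
    return production - degradation

solution = solve_ivp(s_system, (0.0, 20.0), [0.5, 2.0], dense_output=True)
print("state at t = 20:", solution.y[:, -1])
```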

  2. Data Mining Smart Energy Time Series

    Directory of Open Access Journals (Sweden)

    Janina POPEANGA

    2015-07-01

    Full Text Available With the advent of smart metering technology the amount of energy data will increase significantly and the utilities industry will have to face another big challenge - to find relationships within time-series data and even more - to analyze such huge numbers of time series to find useful patterns and trends with fast or even real-time response. This study presents a small review of the literature in the field, trying to demonstrate how essential the application of data mining techniques to time series is for making the best use of this large quantity of data, despite all the difficulties. Also, the most important Time Series Data Mining techniques are presented, highlighting their applicability in the energy domain.

  3. Predicting chaotic time series

    International Nuclear Information System (INIS)

    Farmer, J.D.; Sidorowich, J.J.

    1987-01-01

    We present a forecasting technique for chaotic data. After embedding a time series in a state space using delay coordinates, we ''learn'' the induced nonlinear mapping using local approximation. This allows us to make short-term predictions of the future behavior of a time series, using information based only on past values. We present an error estimate for this technique, and demonstrate its effectiveness by applying it to several examples, including data from the Mackey-Glass delay differential equation, Rayleigh-Benard convection, and Taylor-Couette flow
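
    A minimal sketch of the two ingredients named above, delay-coordinate embedding and a zeroth-order local (nearest-neighbour) predictor. The embedding dimension, delay, neighbourhood size and the test signal are illustrative choices, not values from the paper.

      import numpy as np

      def delay_embed(x, dim=3, tau=1):
          # Delay vectors [x(t), x(t+tau), ..., x(t+(dim-1)*tau)] stacked as rows
          n = len(x) - (dim - 1) * tau
          return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

      def local_predict(x, dim=3, tau=1, k=5):
          # One-step forecast: average the successors of the k nearest delay vectors
          emb = delay_embed(x, dim, tau)
          query, history = emb[-1], emb[:-1]
          successors = x[(dim - 1) * tau + 1 :]    # value following each history vector
          dists = np.linalg.norm(history - query, axis=1)
          nearest = np.argsort(dists)[:k]
          return successors[nearest].mean()        # zeroth-order local approximation

      t = np.linspace(0, 20 * np.pi, 2000)
      x = np.sin(t) + 0.01 * np.random.randn(t.size)   # noisy periodic stand-in series
      print(local_predict(x, dim=4, tau=5, k=10))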

  4. Measuring multiscaling in financial time-series

    International Nuclear Information System (INIS)

    Buonocore, R.J.; Aste, T.; Di Matteo, T.

    2016-01-01

    We discuss the origin of multiscaling in financial time-series and investigate how to best quantify it. Our methodology consists in separating the different sources of measured multifractality by analyzing the multi/uni-scaling behavior of synthetic time-series with known properties. We use the results from the synthetic time-series to interpret the measure of multifractality of real log-returns time-series. The main finding is that the aggregation horizon of the returns can introduce a strong bias effect on the measure of multifractality. This effect can become especially important when returns distributions have power law tails with exponents in the range (2, 5). We discuss the right aggregation horizon to mitigate this bias.
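
    One common way to quantify multiscaling of returns is the generalized Hurst exponent H(q), estimated from how the q-th order moments of increments scale with the aggregation horizon. The sketch below is a generic estimator of that kind, under illustrative parameter choices; it is not the authors' exact methodology.

      import numpy as np

      def generalized_hurst(x, q=2, taus=range(1, 20)):
          # Fit E[|x(t+tau) - x(t)|^q] ~ tau^(q*H(q)) in log-log space and return H(q)
          taus = np.asarray(list(taus))
          moments = [np.mean(np.abs(x[tau:] - x[:-tau]) ** q) for tau in taus]
          slope, _ = np.polyfit(np.log(taus), np.log(moments), 1)
          return slope / q

      # For a uniscaling process (e.g. Brownian motion) H(q) is roughly flat in q;
      # a clear dependence of H(q) on q is the signature of multiscaling.
      log_price = np.cumsum(np.random.randn(10000))    # surrogate log-price series
      print([round(generalized_hurst(log_price, q), 3) for q in (1, 2, 3)])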

  5. A Cryogenic Test Set-Up for the Qualification of Pre-Series Test Cells for the LHC Cryogenic Distribution Line

    CERN Document Server

    Livran, J; Parente, C; Riddone, G; Rybkowski, D; Veillet, N

    2000-01-01

    Three pre-series Test Cells of the LHC Cryogenic Distribution Line (QRL) [1], manufactured by three European industrial companies, will be tested in the year 2000 to qualify the design chosen and verify the thermal and mechanical performances. A dedicated test stand (170 m x 13 m) has been built for extensive testing and performance assessment of the pre-series units in parallel. They will be fed with saturated liquid helium at 4.2 K supplied by a mobile helium dewar. In addition, LN2 cooled helium will be used for cool-down and thermal shielding. For each of the three pre-series units, a set of end boxes has been designed and manufactured at CERN. This paper presents the layout of the cryogenic system for the pre-series units, the calorimetric methods as well as the results of the thermal calculation of the end box test.

  6. Time averaging, ageing and delay analysis of financial time series

    Science.gov (United States)

    Cherstvy, Andrey G.; Vinod, Deepak; Aghion, Erez; Chechkin, Aleksei V.; Metzler, Ralf

    2017-06-01

    We introduce three strategies for the analysis of financial time series based on time averaged observables. These comprise the time averaged mean squared displacement (MSD) as well as the ageing and delay time methods for varying fractions of the financial time series. We explore these concepts via statistical analysis of historic time series for several Dow Jones Industrial indices for the period from the 1960s to 2015. Remarkably, we discover a simple universal law for the delay time averaged MSD. The observed features of the financial time series dynamics agree well with our analytical results for the time averaged measurables for geometric Brownian motion, underlying the famed Black-Scholes-Merton model. The concepts we promote here are shown to be useful for financial data analysis and enable one to unveil new universal features of stock market dynamics.
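
    A sketch of the central observable, the time averaged MSD over a range of lags, applied here to a surrogate geometric Brownian motion; the drift, volatility and lag range are arbitrary illustration values rather than parameters from the study.

      import numpy as np

      def time_averaged_msd(x, lags):
          # delta^2(lag) = <(x(t+lag) - x(t))^2>, averaged along the single trajectory
          return np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags])

      # Geometric Brownian motion as a stand-in for an index level
      n, dt, mu, sigma = 5000, 1.0, 1e-4, 0.01
      steps = (mu - 0.5 * sigma ** 2) * dt + sigma * np.sqrt(dt) * np.random.randn(n)
      price = np.exp(np.cumsum(steps))
      msd = time_averaged_msd(price, lags=range(1, 100))
      print(msd[:5])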

  7. Applied time series analysis

    CERN Document Server

    Woodward, Wayne A; Elliott, Alan C

    2011-01-01

    ""There is scarcely a standard technique that the reader will find left out … this book is highly recommended for those requiring a ready introduction to applicable methods in time series and serves as a useful resource for pedagogical purposes.""-International Statistical Review (2014), 82""Current time series theory for practice is well summarized in this book.""-Emmanuel Parzen, Texas A&M University""What an extraordinary range of topics covered, all very insightfully. I like [the authors'] innovations very much, such as the AR factor table.""-David Findley, U.S. Census Bureau (retired)""…

  8. Entropic Analysis of Electromyography Time Series

    Science.gov (United States)

    Kaufman, Miron; Sung, Paul

    2005-03-01

    We are in the process of assessing the effectiveness of fractal and entropic measures for the diagnosis of low back pain from surface electromyography (EMG) time series. Surface EMG is used to assess patients with low back pain. In a typical EMG measurement, the voltage is measured every millisecond. We observed back muscles fatiguing over one minute, which results in a time series with 60,000 entries. We characterize the complexity of the time series by computing the time dependence of the Shannon entropy. The analysis of the time series from different relevant muscles of healthy and low back pain (LBP) individuals provides evidence that the level of variability of back muscle activity is much larger for healthy individuals than for individuals with LBP. In general, the time dependence of the entropy shows a crossover from a diffusive regime to a regime characterized by long-time correlations (self-organization) at about 0.01 s.
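
    A sketch of the kind of entropy-versus-time curve described above: Shannon entropy computed in sliding windows over the signal. The window length, step, bin count and the surrogate "fatiguing" signal are illustrative, not the study's actual processing choices.

      import numpy as np

      def shannon_entropy(window, bins=32):
          # Shannon entropy (in bits) of the amplitude histogram of one window
          counts, _ = np.histogram(window, bins=bins)
          p = counts[counts > 0] / counts.sum()
          return -np.sum(p * np.log2(p))

      def entropy_time_series(x, win=1000, step=100):
          return np.array([shannon_entropy(x[i:i + win])
                           for i in range(0, len(x) - win, step)])

      # 60 s of a 1 kHz surrogate signal whose amplitude decays as the muscle "fatigues"
      emg = np.random.randn(60_000) * np.linspace(1.0, 0.5, 60_000)
      entropy_curve = entropy_time_series(emg)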

  9. Evolution of Early Pleistocene fluvial systems in central Poland prior to the first ice sheet advance – a case study from the Bełchatów lignite mine

    Directory of Open Access Journals (Sweden)

    Goździk Jan

    2017-06-01

    Full Text Available Deposits formed between the Neogene/Pleistocene transition and into the Early Pleistocene have been studied, mainly on the basis of drillings and at rare, small outcrops in the lowland part of Polish territory. At the Bełchatów lignite mine (Kleszczów Graben, central Poland), one of the largest opencast pits in Europe, strata of this age have long been exposed in extensive outcrops. The present paper is based on our field studies and laboratory analyses, as well as on research data presented by other authors. For that reason, it can be seen as an overview of current knowledge of lowermost Pleistocene deposits at Bełchatów, where exploitation of the Quaternary overburden has just been completed. The results of cartographic work, sedimentological, mineralogical and palynological analyses as well as assessment of sand grain morphology have been considered. All of these studies have allowed the distinction of three Lower Pleistocene series, i.e., the Łękińsko, Faustynów and Krzaki series. These were laid down in fluvial environments between the end of the Pliocene and the advance of the first Scandinavian ice sheet on central Poland. The following environmental features have been interpreted: phases of river incision and aggradation, changes of river channel patterns, source sediments for alluvia, rates of aeolian supply to rivers and roles of fluvial systems in the morphological and geological development of the area. The two older series studied, i.e., Łękińsko and Faustynów, share common characteristics. They were formed by sinuous rivers in boreal forest and open forest environments. The Neogene substratum was the source of the alluvium. The younger series (Krzaki) formed mainly in a braided river setting, under conditions of progressive climatic cooling. Over time, a gradual increase of aeolian supply to the fluvial system can be noted; initially, silt and sand were laid down, followed by sand only during cold desert conditions. These

  10. Comparison Groups in Short Interrupted Time-Series: An Illustration Evaluating No Child Left Behind

    Science.gov (United States)

    Wong, Manyee; Cook, Thomas D.; Steiner, Peter M.

    2009-01-01

    Interrupted time-series (ITS) are often used to assess the causal effect of a planned or even unplanned shock introduced into an on-going process. The pre-intervention slope is supposed to index the causal counterfactual, and deviations from it in mean, slope or variance are used to indicate an effect. However, a secure causal inference is only…

  11. Quantifying memory in complex physiological time-series.

    Science.gov (United States)

    Shirazi, Amir H; Raoufy, Mohammad R; Ebadi, Haleh; De Rui, Michele; Schiff, Sami; Mazloom, Roham; Hajizadeh, Sohrab; Gharibzadeh, Shahriar; Dehpour, Ahmad R; Amodio, Piero; Jafari, G Reza; Montagnese, Sara; Mani, Ali R

    2013-01-01

    In a time-series, memory is a statistical feature that lasts for a period of time and distinguishes the time-series from a random, or memory-less, process. In the present study, the concept of "memory length" was used to define the time period, or scale over which rare events within a physiological time-series do not appear randomly. The method is based on inverse statistical analysis and provides empiric evidence that rare fluctuations in cardio-respiratory time-series are 'forgotten' quickly in healthy subjects while the memory for such events is significantly prolonged in pathological conditions such as asthma (respiratory time-series) and liver cirrhosis (heart-beat time-series). The memory length was significantly higher in patients with uncontrolled asthma compared to healthy volunteers. Likewise, it was significantly higher in patients with decompensated cirrhosis compared to those with compensated cirrhosis and healthy volunteers. We also observed that the cardio-respiratory system has simple low order dynamics and short memory around its average, and high order dynamics around rare fluctuations.

  12. Effective Feature Preprocessing for Time Series Forecasting

    DEFF Research Database (Denmark)

    Zhao, Junhua; Dong, Zhaoyang; Xu, Zhao

    2006-01-01

    Time series forecasting is an important area in data mining research. Feature preprocessing techniques have significant influence on forecasting accuracy, therefore are essential in a forecasting model. Although several feature preprocessing techniques have been applied in time series forecasting...... performance in time series forecasting. It is demonstrated in our experiment that, effective feature preprocessing can significantly enhance forecasting accuracy. This research can be a useful guidance for researchers on effectively selecting feature preprocessing techniques and integrating them with time...... series forecasting models....

  13. Identification of neutral biochemical network models from time series data

    Directory of Open Access Journals (Sweden)

    Maia Marco

    2009-05-01

    Full Text Available Abstract Background The major difficulty in modeling biological systems from multivariate time series is the identification of parameter sets that endow a model with dynamical behaviors sufficiently similar to the experimental data. Directly related to this parameter estimation issue is the task of identifying the structure and regulation of ill-characterized systems. Both tasks are simplified if the mathematical model is canonical, i.e., if it is constructed according to strict guidelines. Results In this report, we propose a method for the identification of admissible parameter sets of canonical S-systems from biological time series. The method is based on a Monte Carlo process that is combined with an improved version of our previous parameter optimization algorithm. The method maps the parameter space into the network space, which characterizes the connectivity among components, by creating an ensemble of decoupled S-system models that imitate the dynamical behavior of the time series with sufficient accuracy. The concept of sloppiness is revisited in the context of these S-system models with an exploration not only of different parameter sets that produce similar dynamical behaviors but also different network topologies that yield dynamical similarity. Conclusion The proposed parameter estimation methodology was applied to actual time series data from the glycolytic pathway of the bacterium Lactococcus lactis and led to ensembles of models with different network topologies. In parallel, the parameter optimization algorithm was applied to the same dynamical data upon imposing a pre-specified network topology derived from prior biological knowledge, and the results from both strategies were compared. The results suggest that the proposed method may serve as a powerful exploration tool for testing hypotheses and the design of new experiments.

  14. MicroEcos: Micro-Scale Explorations of Large-Scale Late Pleistocene Ecosystems

    Science.gov (United States)

    Gellis, B. S.

    2017-12-01

    Pollen data can inform the reconstruction of early-floral environments by providing data for artistic representations of what early-terrestrial ecosystems looked like, and how existing terrestrial landscapes have evolved. For example, what did the Bighorn Basin look like when large ice sheets covered modern Canada, the Yellowstone Plateau had an ice cap, and the Bighorn Mountains were mantled with alpine glaciers? MicroEcos is an immersive, multimedia project that aims to strengthen human-nature connections through the understanding and appreciation of biological ecosystems. Collected pollen data elucidates flora that are visible in the fossil record - associated with the Late Pleistocene - and have been illustrated and described in botanical literature. It aims to make scientific data accessible and interesting to all audiences through a series of interactive-digital sculptures, large-scale photography and field-based videography. While this project is driven by scientific data, it is rooted in deeply artistic and outreach-based practices, which include broad artistic practices, e.g.: digital design, illustration, photography, video and sound design. Using 3D modeling and printing technology, MicroEcos centers around a series of 3D-printed models of the Last Canyon rock shelter on the Wyoming and Montana border, Little Windy Hill pond site in Wyoming's Medicine Bow National Forest, and Natural Trap Cave site in Wyoming's Big Horn Basin. These digital, interactive 3D sculptures provide audiences with glimpses of three-dimensional Late Pleistocene environments and help create a dialogue about how grass-, sagebrush-, and spruce-based ecosystems form. To help audiences better contextualize how MicroEcos bridges notions of time, space, and place, modern photography and videography of the Last Canyon, Little Windy Hill and Natural Trap Cave sites surround these 3D-digital reconstructions.

  15. MAGNETOSTRATIGRAPHY OF THE HOMO-BEARING PLEISTOCENE DANDIERO BASIN (DANAKIL DEPRESSION, ERITREA)

    Directory of Open Access Journals (Sweden)

    ANDREA ALBIANELLI

    2004-12-01

    Full Text Available Four magnetozones have been found in the 530 m thick profile of the Dandiero Group. The lower unit, the Bukra Sand and Gravel, extends in the R1 reversed magnetozone from 150 m below the tephra level which was used as the reference marker between the sampled sections. The normal magnetozone N1 is almost completely covered by the lacustrine and deltaic sediments of the Alat Formation, while the following reversed magnetozone contains both the Wara Sand and Gravel and the lacustrine Goreya Fm. The N2 polarity zone is completely occupied by the Aro Sand. This polarity sequence has been calibrated to the geomagnetic time scale using the Early to Middle Pleistocene age of the associated vertebrate fauna and fission-track dating. The four magnetozones were thus regarded as representing the chrons by which the Pleistocene is correlated with magnetochronology. Their three reversal boundaries provided the dates of 1.07, 0.99 and 0.78 Ma, allowing average sedimentation rates close to 1 m/ky to be determined. Cyclostratigraphy of the magnetic signal, analysed by the spectral analysis of the time series across the Jaramillo and late Matuyama chrons, confirmed that value. The cyclicities evidenced were directly related to the alternating lithofacies, and both were related to the astronomical parameters driving the climate changes during the deposition of the Dandiero Group (some five hundred thousand years). The section with the Homo site covers the Jaramillo/Matuyama boundary, and the Homo bed located 2 m below this limit is dated 0.992 Ma.

  16. Pleistocene Palaeoart of Asia

    Directory of Open Access Journals (Sweden)

    Robert G. Bednarik

    2013-06-01

    Full Text Available This comprehensive overview considers the currently known Pleistocene palaeoart of Asia on a common basis, which suggests that the available data are entirely inadequate to form any cohesive synthesis about this corpus. In comparison to the attention lavished on the corresponding record available from Eurasia’s small western appendage, Europe, it is evident that Pleistocene palaeoart from the rest of the world has been severely neglected. Southern Asia, in particular, holds great promise for the study of early cognitive development of hominins, and yet this potential has remained almost entirely unexplored. Asia is suggested to be the key continent in any global synthesis of ‘art’ origins, emphasising the need for a comprehensive pan-continental research program. This is not just to counter-balance the incredible imbalance in favour of Europe, but to examine the topic of Middle Pleistocene palaeoart development effectively.

  17. Statistical criteria for characterizing irradiance time series.

    Energy Technology Data Exchange (ETDEWEB)

    Stein, Joshua S.; Ellis, Abraham; Hansen, Clifford W.

    2010-10-01

    We propose and examine several statistical criteria for characterizing time series of solar irradiance. Time series of irradiance are used in analyses that seek to quantify the performance of photovoltaic (PV) power systems over time. Time series of irradiance are either measured or are simulated using models. Simulations of irradiance are often calibrated to or generated from statistics for observed irradiance and simulations are validated by comparing the simulation output to the observed irradiance. Criteria used in this comparison should derive from the context of the analyses in which the simulated irradiance is to be used. We examine three statistics that characterize time series and their use as criteria for comparing time series. We demonstrate these statistics using observed irradiance data recorded in August 2007 in Las Vegas, Nevada, and in June 2009 in Albuquerque, New Mexico.

  18. Homogenising time series: beliefs, dogmas and facts

    Science.gov (United States)

    Domonkos, P.

    2011-06-01

    In the recent decades various homogenisation methods have been developed, but the real effects of their application on time series are still not known sufficiently. The ongoing COST action HOME (COST ES0601) is devoted to reveal the real impacts of homogenisation methods more detailed and with higher confidence than earlier. As a part of the COST activity, a benchmark dataset was built whose characteristics approach well the characteristics of real networks of observed time series. This dataset offers much better opportunity than ever before to test the wide variety of homogenisation methods, and analyse the real effects of selected theoretical recommendations. Empirical results show that real observed time series usually include several inhomogeneities of different sizes. Small inhomogeneities often have similar statistical characteristics than natural changes caused by climatic variability, thus the pure application of the classic theory that change-points of observed time series can be found and corrected one-by-one is impossible. However, after homogenisation the linear trends, seasonal changes and long-term fluctuations of time series are usually much closer to the reality than in raw time series. Some problems around detecting multiple structures of inhomogeneities, as well as that of time series comparisons within homogenisation procedures are discussed briefly in the study.

  19. Ocean time-series near Bermuda: Hydrostation S and the US JGOFS Bermuda Atlantic time-series study

    Science.gov (United States)

    Michaels, Anthony F.; Knap, Anthony H.

    1992-01-01

    Bermuda is the site of two ocean time-series programs. At Hydrostation S, the ongoing biweekly profiles of temperature, salinity and oxygen now span 37 years. This is one of the longest open-ocean time-series data sets and provides a view of decadal scale variability in ocean processes. In 1988, the U.S. JGOFS Bermuda Atlantic Time-series Study began a wide range of measurements at a frequency of 14-18 cruises each year to understand temporal variability in ocean biogeochemistry. On each cruise, the data range from chemical analyses of discrete water samples to data from electronic packages of hydrographic and optics sensors. In addition, a range of biological and geochemical rate measurements are conducted that integrate over time-periods of minutes to days. This sampling strategy yields a reasonable resolution of the major seasonal patterns and of decadal scale variability. The Sargasso Sea also has a variety of episodic production events on scales of days to weeks and these are only poorly resolved. In addition, there is a substantial amount of mesoscale variability in this region and some of the perceived temporal patterns are caused by the intersection of the biweekly sampling with the natural spatial variability. In the Bermuda time-series programs, we have added a series of additional cruises to begin to assess these other sources of variation and their impacts on the interpretation of the main time-series record. However, the adequate resolution of higher frequency temporal patterns will probably require the introduction of new sampling strategies and some emerging technologies such as biogeochemical moorings and autonomous underwater vehicles.

  20. Multivariate Time Series Decomposition into Oscillation Components.

    Science.gov (United States)

    Matsuda, Takeru; Komaki, Fumiyasu

    2017-08-01

    Many time series are considered to be a superposition of several oscillation components. We have proposed a method for decomposing univariate time series into oscillation components and estimating their phases (Matsuda & Komaki, 2017 ). In this study, we extend that method to multivariate time series. We assume that several oscillators underlie the given multivariate time series and that each variable corresponds to a superposition of the projections of the oscillators. Thus, the oscillators superpose on each variable with amplitude and phase modulation. Based on this idea, we develop gaussian linear state-space models and use them to decompose the given multivariate time series. The model parameters are estimated from data using the empirical Bayes method, and the number of oscillators is determined using the Akaike information criterion. Therefore, the proposed method extracts underlying oscillators in a data-driven manner and enables investigation of phase dynamics in a given multivariate time series. Numerical results show the effectiveness of the proposed method. From monthly mean north-south sunspot number data, the proposed method reveals an interesting phase relationship.

  1. Forecasting Enrollments with Fuzzy Time Series.

    Science.gov (United States)

    Song, Qiang; Chissom, Brad S.

    The concept of fuzzy time series is introduced and used to forecast the enrollment of a university. Fuzzy time series, an aspect of fuzzy set theory, forecasts enrollment using a first-order time-invariant model. To evaluate the model, the conventional linear regression technique is applied and the predicted values obtained are compared to the…

  2. Middle and Late Pleistocene glaciations in the southwestern Pamir and their effects on topography [Topography of the SW Pamir shaped by middle-late Pleistocene glaciation]

    International Nuclear Information System (INIS)

    Stübner, Konstanze; Grin, Elena; Hidy, Alan J.; Schaller, Mirjam; Gold, Ryan D.

    2017-01-01

    Glacial chronologies provide insight into the evolution of paleo-landscapes, paleoclimate, topography, and the erosion processes that shape mountain ranges. In the Pamir of Central Asia, glacial morphologies and deposits indicate extensive past glaciations, whose timing and extent remain poorly constrained. Geomorphic data and 15 new 10Be exposure ages from moraine boulders and roches moutonnées in the southwestern Pamir document multiple Pleistocene glacial stages. The oldest exposure ages, 113 ± 10 ka, underestimate the age of the earliest preserved glacial advance and imply that the modern relief of the southwestern Pamir (peaks at ~5000–6000 m a.s.l.; valleys at ~2000–3000 m a.s.l.) already existed in the late Middle Pleistocene. Younger exposure ages (~40–80 ka, ~30 ka) complement the existing Central Asian glacial chronology and reflect successively less extensive Late Pleistocene glaciations. The topography of the Pamir and the glacial chronologies suggest that, in the Middle Pleistocene, an ice cap or ice field occupied the eastern Pamir high-altitude plateau, whereas westward flowing valley glaciers incised the southwestern Pamir. Since the Late Pleistocene deglaciation, the rivers of the southwestern Pamir adjusted to the glacially shaped landscape. As a result, localized rapid fluvial incision and drainage network reorganization reflect the transient nature of the deglaciated landscape.

  3. Pleistocene Paleoart of Australia

    Directory of Open Access Journals (Sweden)

    Robert G. Bednarik

    2014-02-01

    Full Text Available Pleistocene rock art is abundant in Australia, but has so far received only limited attention. Instead there has been a trend, begun over a century ago, to search for presumed depictions of extinct megafauna and the tracks of such species. All these notions have been discredited, however, and the current evidence suggests that figurative depiction was introduced only during the Holocene, never reaching Tasmania. Nevertheless, some Australian rock art has been attributed to the Pleistocene by direct dating methods, and its nature implies that a significant portion of the surviving corpus of rock art may also be of such age. In particular much of Australian cave art is of the Ice Age, or appears to be so, and any heavily weathered or patinated petroglyphs on particularly hard rocks are good candidates for Pleistocene antiquity. On the other hand, there is very limited evidence of mobiliary paleoart of such age in Australia.

  4. Results of the examinations of the W7-X pre-series target elements

    International Nuclear Information System (INIS)

    Boscary, J.; Boeswirth, B.; Greuner, H.; Missirlian, M.; Schedler, B.; Scheiber, K.; Schlosser, J.; Streibl, B.

    2007-01-01

    The target elements of the WENDELSTEIN 7-X (W7-X) divertor are designed to sustain a stationary heat flux of 10 MW/m2 and to remove a maximum power of 100 kW. CFC Sepcarb NB31 tiles are bonded to a water-cooled CuCrZr heat sink in two steps: active metal casting (AMC) of an AMC-copper interlayer to CFC tiles, electron beam welding (EBW) or hot isostatic pressing (HIP) of the AMC-NB31 tiles to CuCrZr. The fabrication of the whole amount of CFC NB31 has been completed. The key target of the pre-series phase is the qualification of this bond based on a series of examinations. The introduction of silicon during the AMC process significantly improved the strength of the joint between CFC and AMC-copper. The strength of the bond is preserved after either EBW or HIP processes. High heat flux testing carried out in the ion beam facility GLADIS exhibited too high a percentage of defective tiles. Pre-series activities have been extended to reduce the stress concentration at the interface between tiles and heat sink by optimizing the design

  5. Forecasting Cryptocurrencies Financial Time Series

    DEFF Research Database (Denmark)

    Catania, Leopoldo; Grassi, Stefano; Ravazzolo, Francesco

    2018-01-01

    This paper studies the predictability of cryptocurrencies time series. We compare several alternative univariate and multivariate models in point and density forecasting of four of the most capitalized series: Bitcoin, Litecoin, Ripple and Ethereum. We apply a set of crypto–predictors and rely...

  6. Forecasting Cryptocurrencies Financial Time Series

    OpenAIRE

    Catania, Leopoldo; Grassi, Stefano; Ravazzolo, Francesco

    2018-01-01

    This paper studies the predictability of cryptocurrencies time series. We compare several alternative univariate and multivariate models in point and density forecasting of four of the most capitalized series: Bitcoin, Litecoin, Ripple and Ethereum. We apply a set of crypto–predictors and rely on Dynamic Model Averaging to combine a large set of univariate Dynamic Linear Models and several multivariate Vector Autoregressive models with different forms of time variation. We find statistical si...

  7. Time series modeling, computation, and inference

    CERN Document Server

    Prado, Raquel

    2010-01-01

    The authors systematically develop a state-of-the-art analysis and modeling of time series. … this book is well organized and well written. The authors present various statistical models for engineers to solve problems in time series analysis. Readers no doubt will learn state-of-the-art techniques from this book.-Hsun-Hsien Chang, Computing Reviews, March 2012My favorite chapters were on dynamic linear models and vector AR and vector ARMA models.-William Seaver, Technometrics, August 2011… a very modern entry to the field of time-series modelling, with a rich reference list of the current lit

  8. Time Series Analysis Forecasting and Control

    CERN Document Server

    Box, George E P; Reinsel, Gregory C

    2011-01-01

    A modernized new edition of one of the most trusted books on time series analysis. Since publication of the first edition in 1970, Time Series Analysis has served as one of the most influential and prominent works on the subject. This new edition maintains its balanced presentation of the tools for modeling and analyzing time series and also introduces the latest developments that have occurred in the field over the past decade through applications from areas such as business, finance, and engineering. The Fourth Edition provides a clearly written exploration of the key methods for building, cl

  9. Costationarity of Locally Stationary Time Series Using costat

    OpenAIRE

    Cardinali, Alessandro; Nason, Guy P.

    2013-01-01

    This article describes the R package costat. This package enables a user to (i) perform a test for time series stationarity; (ii) compute and plot time-localized autocovariances, and (iii) to determine and explore any costationary relationship between two locally stationary time series. Two locally stationary time series are said to be costationary if there exists two time-varying combination functions such that the linear combination of the two series with the functions produces another time...

  10. Detecting nonlinear structure in time series

    International Nuclear Information System (INIS)

    Theiler, J.

    1991-01-01

    We describe an approach for evaluating the statistical significance of evidence for nonlinearity in a time series. The formal application of our method requires the careful statement of a null hypothesis which characterizes a candidate linear process, the generation of an ensemble of ''surrogate'' data sets which are similar to the original time series but consistent with the null hypothesis, and the computation of a discriminating statistic for the original and for each of the surrogate data sets. The idea is to test the original time series against the null hypothesis by checking whether the discriminating statistic computed for the original time series differs significantly from the statistics computed for each of the surrogate sets. While some data sets very cleanly exhibit low-dimensional chaos, there are many cases where the evidence is sketchy and difficult to evaluate. We hope to provide a framework within which such claims of nonlinearity can be evaluated. 5 refs., 4 figs
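
    The sketch below illustrates the surrogate-data logic described above: phase-randomised surrogates consistent with a linear null hypothesis are generated, a discriminating statistic is computed for the original series and for each surrogate, and the original is flagged if it falls in the tails of the surrogate distribution. The statistic used here (time-reversal asymmetry), the surrogate count and the test signal are illustrative choices, not the paper's specific examples.

      import numpy as np

      def phase_randomized_surrogate(x, rng):
          # Same power spectrum as x, Fourier phases randomised (linear Gaussian null)
          spec = np.fft.rfft(x)
          phases = rng.uniform(0, 2 * np.pi, spec.size)
          phases[0] = 0.0                              # keep the mean component untouched
          return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=len(x))

      def reversal_asymmetry(x, lag=1):
          return np.mean((x[lag:] - x[:-lag]) ** 3)    # ~0 in expectation for linear processes

      rng = np.random.default_rng(0)
      x = np.cumsum(rng.standard_normal(2048))         # stand-in for an observed series
      stat = reversal_asymmetry(x)
      null = [reversal_asymmetry(phase_randomized_surrogate(x, rng)) for _ in range(99)]
      nonlinear = stat < np.percentile(null, 2.5) or stat > np.percentile(null, 97.5)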

  11. Introduction to time series and forecasting

    CERN Document Server

    Brockwell, Peter J

    2016-01-01

    This book is aimed at the reader who wishes to gain a working knowledge of time series and forecasting methods as applied to economics, engineering and the natural and social sciences. It assumes knowledge only of basic calculus, matrix algebra and elementary statistics. This third edition contains detailed instructions for the use of the professional version of the Windows-based computer package ITSM2000, now available as a free download from the Springer Extras website. The logic and tools of time series model-building are developed in detail. Numerous exercises are included and the software can be used to analyze and forecast data sets of the user's own choosing. The book can also be used in conjunction with other time series packages such as those included in R. The programs in ITSM2000 however are menu-driven and can be used with minimal investment of time in the computational details. The core of the book covers stationary processes, ARMA and ARIMA processes, multivariate time series and state-space mod...

  12. A Unified Framework for Estimating Minimum Detectable Effects for Comparative Short Interrupted Time Series Designs

    Science.gov (United States)

    Price, Cristofer; Unlu, Fatih

    2014-01-01

    The Comparative Short Interrupted Time Series (C-SITS) design is a frequently employed quasi-experimental method, in which the pre- and post-intervention changes observed in the outcome levels of a treatment group is compared with those of a comparison group where the difference between the former and the latter is attributed to the treatment. The…

  13. Interglacial climate dynamics and advanced time series analysis

    Science.gov (United States)

    Mudelsee, Manfred; Bermejo, Miguel; Köhler, Peter; Lohmann, Gerrit

    2013-04-01


  14. Expansion of the known distribution of Asiatic mouflon (Ovis orientalis) in the Late Pleistocene of the Southern Levant

    Science.gov (United States)

    Yeomans, Lisa; Martin, Louise; Richter, Tobias

    2017-08-01

    Wild sheep (Ovis orientalis) bones recovered from the Natufian site of Shubayqa 1 demonstrate a wider distribution of mouflon in the Late Pleistocene of the Southern Levant than previously known. Early Epipalaeolithic sites are common in the limestone steppe region of eastern Jordan but have yielded only a handful of caprine bones that cannot be identified to species level and few faunal remains from excavated Late Epipalaeolithic sites have been reported. Analysis of animal bone from Shubayqa 1 suggests a significant population of wild sheep could be found concentrated in the basalt desert environment of eastern Jordan during the Late Pleistocene, especially where higher rainfall over the Jebel Druze provided more water. A population of wild sheep was still present in the Pre-Pottery Neolithic A when the nearby site of Shubayqa 6 was occupied. Hunting of diverse, locally available resources including wild sheep at the end of the Pleistocene illustrates the flexible and adaptive exploitation strategies that hunter-forager groups engaged in. This provides further evidence to the increasing body of data showing the creative and opportunistic approach of terminal Pleistocene groups allowing continued occupation even in more marginal environments in a period of environmental change.

  15. TIME SERIES ANALYSIS USING A UNIQUE MODEL OF TRANSFORMATION

    Directory of Open Access Journals (Sweden)

    Goran Klepac

    2007-12-01

    Full Text Available The REFII model is an original mathematical model for time series data mining. The main purpose of the model is to automate time series analysis through a unique transformation model of time series. An advantage of this approach to time series analysis is the linkage of different methods for time series analysis, linking traditional data mining tools for time series and constructing new algorithms for analyzing time series. It is worth mentioning that the REFII model is not a closed system, which means that we have a finite set of methods. First of all, this is a model for the transformation of time series values, which prepares data used by different sets of methods based on the same transformation model in a given problem domain. The REFII model offers a new approach to time series analysis based on a unique model of transformation, which is a basis for all kinds of time series analysis. The advantage of the REFII model is its possible application in many different areas such as finance, medicine, voice recognition, face recognition and text mining.

  16. Frontiers in Time Series and Financial Econometrics

    OpenAIRE

    Ling, S.; McAleer, M.J.; Tong, H.

    2015-01-01

    Two of the fastest growing frontiers in econometrics and quantitative finance are time series and financial econometrics. Significant theoretical contributions to financial econometrics have been made by experts in statistics, econometrics, mathematics, and time series analysis. The purpose of this special issue of the journal on “Frontiers in Time Series and Financial Econometrics” is to highlight several areas of research by leading academics in which novel methods have contrib...

  17. Scale-dependent intrinsic entropies of complex time series.

    Science.gov (United States)

    Yeh, Jia-Rong; Peng, Chung-Kang; Huang, Norden E

    2016-04-13

    Multi-scale entropy (MSE) was developed as a measure of complexity for complex time series, and it has been applied widely in recent years. The MSE algorithm is based on the assumption that biological systems possess the ability to adapt and function in an ever-changing environment, and these systems need to operate across multiple temporal and spatial scales, such that their complexity is also multi-scale and hierarchical. Here, we present a systematic approach to apply the empirical mode decomposition algorithm, which can detrend time series on various time scales, prior to analysing a signal's complexity by measuring the irregularity of its dynamics on multiple time scales. Simulated time series of fractal Gaussian noise and human heartbeat time series were used to study the performance of this new approach. We show that our method can successfully quantify the fractal properties of the simulated time series and can accurately distinguish modulations in human heartbeat time series in health and disease. © 2016 The Author(s).
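
    A compact sketch of the multi-scale entropy computation referred to above: coarse-grain the series at each scale, then evaluate sample entropy of the coarse-grained series. The parameters (m = 2, r = 0.15 times the standard deviation) follow common practice, the pairwise-distance implementation is a simplification, and the EMD detrending step proposed by the authors is omitted here.

      import numpy as np

      def sample_entropy(x, m=2, r=0.15):
          # SampEn = -ln(A/B), with B (A) the number of template pairs of length m (m+1)
          # whose Chebyshev distance stays below the tolerance r
          def matches(length):
              tpl = np.array([x[i:i + length] for i in range(len(x) - length + 1)])
              d = np.max(np.abs(tpl[:, None] - tpl[None, :]), axis=2)
              return (np.sum(d <= r) - len(tpl)) / 2   # exclude self-matches
          b, a = matches(m), matches(m + 1)
          return -np.log(a / b) if a > 0 and b > 0 else np.inf

      def multiscale_entropy(x, scales=range(1, 11), m=2):
          r = 0.15 * np.std(x)                         # tolerance fixed from the original series
          return [sample_entropy(x[:len(x) // s * s].reshape(-1, s).mean(axis=1), m, r)
                  for s in scales]

      print(multiscale_entropy(np.random.randn(1000)))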

  18. Elements of nonlinear time series analysis and forecasting

    CERN Document Server

    De Gooijer, Jan G

    2017-01-01

    This book provides an overview of the current state-of-the-art of nonlinear time series analysis, richly illustrated with examples, pseudocode algorithms and real-world applications. Avoiding a “theorem-proof” format, it shows concrete applications on a variety of empirical time series. The book can be used in graduate courses in nonlinear time series and at the same time also includes interesting material for more advanced readers. Though it is largely self-contained, readers require an understanding of basic linear time series concepts, Markov chains and Monte Carlo simulation methods. The book covers time-domain and frequency-domain methods for the analysis of both univariate and multivariate (vector) time series. It makes a clear distinction between parametric models on the one hand, and semi- and nonparametric models/methods on the other. This offers the reader the option of concentrating exclusively on one of these nonlinear time series analysis methods. To make the book as user friendly as possible...

  19. An Energy-Based Similarity Measure for Time Series

    Directory of Open Access Journals (Sweden)

    Pierre Brunagel

    2007-11-01

    Full Text Available A new similarity measure, called SimilB, for time series analysis, based on the cross-ΨB-energy operator (2004), is introduced. ΨB is a nonlinear measure which quantifies the interaction between two time series. Compared to Euclidean distance (ED) or the Pearson correlation coefficient (CC), SimilB includes the temporal information and relative changes of the time series using the first and second derivatives of the time series. SimilB is well suited for both nonstationary and stationary time series and particularly those presenting discontinuities. Some new properties of ΨB are presented. Particularly, we show that ΨB as similarity measure is robust to both scale and time shift. SimilB is illustrated with synthetic time series and an artificial dataset and compared to the CC and the ED measures.

  20. A Cryogenic Test Station for the Pre-series 2400 W @ 1.8 K Refrigeration Units for the LHC

    CERN Document Server

    Claudet, S; Gully, P; Jäger, B; Millet, F; Roussel, P; Tavian, L

    2002-01-01

    The cooling capacity below 2 K for the superconducting magnets in the Large Hadron Collider (LHC), at CERN, will be provided by eight refrigeration units at 1.8 K, each of them coupled to a 4.5 K refrigerator. The supply of the series units is linked to successful testing and acceptance of the pre-series delivered by the two selected vendors. To properly assess the performance of specific components such as cold compressors and some process specificities a dedicated test station is necessary. The test station is able to process up to 130 g/s between 4.5 & 20 K and aims at simulating the steady and transient operational modes foreseen for the LHC. After recalling the basic characteristics of the 1.8 K refrigeration units and the content of the acceptance tests of the pre-series, the principle of the test cryostat is detailed. The components of the test station and corresponding layout are described. The first testing experience is presented as well as preliminary results of the pre-series units.

  1. Dietary traits of the late Early Pleistocene Bison menneri (Bovidae, Mammalia) from its type site Untermassfeld (Central Germany) and the problem of Pleistocene 'wood bison'

    Science.gov (United States)

    van Asperen, Eline N.; Kahlke, Ralf-Dietrich

    2017-12-01

    Over the course of the Early and early Middle Pleistocene, a climatic cooling trend led to the partial opening up of landscapes in the western Palaearctic. This led to a gradual replacement of browsers by grazers, whilst some herbivore species shifted their diet towards including more grass. Wear patterns of herbivore cheek teeth can inform our understanding of the timing and extent of this change and indicate levels of dietary plasticity. One of the indicator species of the faunal turnover is the first large-sized form of bison in the Palaearctic, Bison menneri. The dental mesowear of the palaeopopulation from the species' late Early Pleistocene type site of Untermassfeld in Central Germany and the Late Pleistocene B. priscus from Taubach, both from habitat mosaics of forested habitats and more open landscapes, have a mixed feeder profile similar to that of North American wood bison, which has a distinct preference for open habitats but occasionally consumes a high amount of browse as a fall-back food. In contrast, the grazer mesowear signature of early Middle Pleistocene B. schoetensacki voigtstedtensis from Voigtstedt indicates these animals likely did not regularly feed in the densely forested area around the site. The mesowear of B. schoetensacki from Süssenborn, in a more open environment, is similar to that of extant European bison. Both Pleistocene and extant bison are grazers to mixed feeders with relatively high tolerance of a suboptimal browsing diet. None of these species can be regarded as true 'wood bison'.

  2. Detecting chaos in irregularly sampled time series.

    Science.gov (United States)

    Kulp, C W

    2013-09-01

    Recently, Wiebe and Virgin [Chaos 22, 013136 (2012)] developed an algorithm which detects chaos by analyzing a time series' power spectrum which is computed using the Discrete Fourier Transform (DFT). Their algorithm, like other time series characterization algorithms, requires that the time series be regularly sampled. Real-world data, however, are often irregularly sampled, thus, making the detection of chaotic behavior difficult or impossible with those methods. In this paper, a characterization algorithm is presented, which effectively detects chaos in irregularly sampled time series. The work presented here is a modification of Wiebe and Virgin's algorithm and uses the Lomb-Scargle Periodogram (LSP) to compute a series' power spectrum instead of the DFT. The DFT is not appropriate for irregularly sampled time series. However, the LSP is capable of computing the frequency content of irregularly sampled data. Furthermore, a new method of analyzing the power spectrum is developed, which can be useful for differentiating between chaotic and non-chaotic behavior. The new characterization algorithm is successfully applied to irregularly sampled data generated by a model as well as data consisting of observations of variable stars.
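
    A sketch of the spectral step for irregular sampling: scipy's Lomb-Scargle periodogram evaluated on an unevenly sampled signal. The signal, sample times and frequency grid are made up, and the subsequent chaos-detection heuristic applied to the spectrum is not reproduced here.

      import numpy as np
      from scipy.signal import lombscargle

      rng = np.random.default_rng(0)
      t = np.sort(rng.uniform(0, 100, 800))            # irregular sample times
      x = np.sin(2 * np.pi * 0.5 * t) + 0.3 * rng.standard_normal(t.size)
      freqs = np.linspace(0.01, 2.0, 2000)             # ordinary frequencies
      power = lombscargle(t, x - x.mean(), 2 * np.pi * freqs)   # lombscargle expects angular frequencies
      # A broadband, noise-like spectrum rather than a few sharp peaks is the kind of
      # signature a chaos-detection criterion would then quantify.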

  3. Building Chaotic Model From Incomplete Time Series

    Science.gov (United States)

    Siek, Michael; Solomatine, Dimitri

    2010-05-01

    This paper presents a number of novel techniques for building a predictive chaotic model from incomplete time series. A predictive chaotic model is built by reconstructing the time-delayed phase space from observed time series and the prediction is made by a global model or adaptive local models based on the dynamical neighbors found in the reconstructed phase space. In general, the building of any data-driven models depends on the completeness and quality of the data itself. However, the completeness of the data availability can not always be guaranteed since the measurement or data transmission is intermittently not working properly due to some reasons. We propose two main solutions dealing with incomplete time series: using imputing and non-imputing methods. For imputing methods, we utilized the interpolation methods (weighted sum of linear interpolations, Bayesian principle component analysis and cubic spline interpolation) and predictive models (neural network, kernel machine, chaotic model) for estimating the missing values. After imputing the missing values, the phase space reconstruction and chaotic model prediction are executed as a standard procedure. For non-imputing methods, we reconstructed the time-delayed phase space from observed time series with missing values. This reconstruction results in non-continuous trajectories. However, the local model prediction can still be made from the other dynamical neighbors reconstructed from non-missing values. We implemented and tested these methods to construct a chaotic model for predicting storm surges at Hoek van Holland as the entrance of Rotterdam Port. The hourly surge time series is available for duration of 1990-1996. For measuring the performance of the proposed methods, a synthetic time series with missing values generated by a particular random variable to the original (complete) time series is utilized. There exist two main performance measures used in this work: (1) error measures between the actual

  4. Multivariate Time Series Search

    Data.gov (United States)

    National Aeronautics and Space Administration — Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical...

  5. Analysing Stable Time Series

    National Research Council Canada - National Science Library

    Adler, Robert

    1997-01-01

    We describe how to take a stable, ARMA, time series through the various stages of model identification, parameter estimation, and diagnostic checking, and accompany the discussion with a goodly number...

  6. Direct evidence for human reliance on rainforest resources in late Pleistocene Sri Lanka.

    Science.gov (United States)

    Roberts, Patrick; Perera, Nimal; Wedage, Oshan; Deraniyagala, Siran; Perera, Jude; Eregama, Saman; Gledhill, Andrew; Petraglia, Michael D; Lee-Thorp, Julia A

    2015-03-13

    Human occupation of tropical rainforest habitats is thought to be a mainly Holocene phenomenon. Although archaeological and paleoenvironmental data have hinted at pre-Holocene rainforest foraging, earlier human reliance on rainforest resources has not been shown directly. We applied stable carbon and oxygen isotope analysis to human and faunal tooth enamel from four late Pleistocene-to-Holocene archaeological sites in Sri Lanka. The results show that human foragers relied primarily on rainforest resources from at least ~20,000 years ago, with a distinct preference for semi-open rainforest and rain forest edges. Homo sapiens' relationship with the tropical rainforests of South Asia is therefore long-standing, a conclusion that indicates the time-depth of anthropogenic reliance and influence on these habitats. Copyright © 2015, American Association for the Advancement of Science.

  7. People of the ancient rainforest: late Pleistocene foragers at the Batadomba-lena rockshelter, Sri Lanka.

    Science.gov (United States)

    Perera, Nimal; Kourampas, Nikos; Simpson, Ian A; Deraniyagala, Siran U; Bulbeck, David; Kamminga, Johan; Perera, Jude; Fuller, Dorian Q; Szabó, Katherine; Oliveira, Nuno V

    2011-09-01

    Batadomba-lena, a rockshelter in the rainforest of southwestern Sri Lanka, has yielded some of the earliest evidence of Homo sapiens in South Asia. H. sapiens foragers were present at Batadomba-lena from ca. 36,000 cal BP to the terminal Pleistocene and Holocene. Human occupation was sporadic before the global Last Glacial Maximum (LGM). Batadomba-lena's Late Pleistocene inhabitants foraged for a broad spectrum of plant and mainly arboreal animal resources (monkeys, squirrels and abundant rainforest snails), derived from a landscape that retained equatorial rainforest cover through periods of pronounced regional aridity during the LGM. Juxtaposed hearths, palaeofloors with habitation debris, postholes, excavated pits, and animal and plant remains, including abundant Canarium nutshells, reflect intensive habitation of the rockshelter in times of monsoon intensification and biome reorganisation after ca. 16,000 cal BP. This period corresponds with further broadening of the economic spectrum, evidenced though increased contribution of squirrels, freshwater snails and Canarium nuts in the diet of the rockshelter occupants. Microliths are more abundant and morphologically diverse in the earliest, pre-LGM layer and decline markedly during intensified rockshelter use on the wane of the LGM. We propose that changing toolkits and subsistence base reflect changing foraging practices, from shorter-lived visits of highly mobile foraging bands in the period before the LGM, to intensified use of Batadomba-lena and intense foraging for diverse resources around the site during and, especially, following the LGM. Traces of ochre, marine shell beads and other objects from an 80 km-distant shore, and, possibly burials reflect symbolic practices from the outset of human presence at the rockshelter. Evidence for differentiated use of space (individual hearths, possible habitation structures) is present in LGM and terminal Pleistocene layers. The record of Batadomba-lena demonstrates

  8. Neural Network Models for Time Series Forecasts

    OpenAIRE

    Tim Hill; Marcus O'Connor; William Remus

    1996-01-01

    Neural networks have been advocated as an alternative to traditional statistical forecasting methods. In the present experiment, time series forecasts produced by neural networks are compared with forecasts from six statistical time series methods generated in a major forecasting competition (Makridakis et al. [Makridakis, S., A. Anderson, R. Carbone, R. Fildes, M. Hibon, R. Lewandowski, J. Newton, E. Parzen, R. Winkler. 1982. The accuracy of extrapolation (time series) methods: Results of a ...

  9. Time Series Observations in the North Indian Ocean

    Digital Repository Service at National Institute of Oceanography (India)

    Shenoy, D.M.; Naik, H.; Kurian, S.; Naqvi, S.W.A.; Khare, N.

    Ocean and the ongoing time series study (Candolim Time Series; CaTS) off Goa. In addition, this article also focuses on the new time series initiative in the Arabian Sea and the Bay of Bengal under Sustained Indian Ocean Biogeochemistry and Ecosystem...

  10. Geometric noise reduction for multivariate time series.

    Science.gov (United States)

    Mera, M Eugenia; Morán, Manuel

    2006-03-01

    We propose an algorithm for the reduction of observational noise in chaotic multivariate time series. The algorithm is based on a maximum likelihood criterion, and its goal is to reduce the mean distance of the points of the cleaned time series to the attractor. We give evidence of the convergence of the empirical measure associated with the cleaned time series to the underlying invariant measure, implying the possibility to predict the long run behavior of the true dynamics.

  11. BRITS: Bidirectional Recurrent Imputation for Time Series

    OpenAIRE

    Cao, Wei; Wang, Dong; Li, Jian; Zhou, Hao; Li, Lei; Li, Yitan

    2018-01-01

    Time series are widely used as signals in many classification/regression tasks. It is ubiquitous that time series contains many missing values. Given multiple correlated time series data, how to fill in missing values and to predict their class labels? Existing imputation methods often impose strong assumptions of the underlying data generating process, such as linear dynamics in the state space. In this paper, we propose BRITS, a novel method based on recurrent neural networks for missing va...

  12. Efficient Algorithms for Segmentation of Item-Set Time Series

    Science.gov (United States)

    Chundi, Parvathi; Rosenkrantz, Daniel J.

    We propose a special type of time series, which we call an item-set time series, to facilitate the temporal analysis of software version histories, email logs, stock market data, etc. In an item-set time series, each observed data value is a set of discrete items. We formalize the concept of an item-set time series and present efficient algorithms for segmenting a given item-set time series. Segmentation of a time series partitions the time series into a sequence of segments where each segment is constructed by combining consecutive time points of the time series. Each segment is associated with an item set that is computed from the item sets of the time points in that segment, using a function which we call a measure function. We then define a concept called the segment difference, which measures the difference between the item set of a segment and the item sets of the time points in that segment. The segment difference values are required to construct an optimal segmentation of the time series. We describe novel and efficient algorithms to compute segment difference values for each of the measure functions described in the paper. We outline a dynamic programming based scheme to construct an optimal segmentation of the given item-set time series. We use the item-set time series segmentation techniques to analyze the temporal content of three different data sets—Enron email, stock market data, and a synthetic data set. The experimental results show that an optimal segmentation of item-set time series data captures much more temporal content than a segmentation constructed based on the number of time points in each segment, without examining the item set data at the time points, and can be used to analyze different types of temporal data.
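
    A hedged sketch of the segmentation idea: here the measure function is the union of the item sets in a segment, the segment difference is the summed size of the symmetric differences against that union, and a small dynamic program finds the minimum-cost split into k contiguous segments. The cost definitions and the plain recursion are illustrative, not the paper's exact formulation or its efficient algorithms.

      from functools import lru_cache

      def segment_cost(points, i, j):
          # Difference of points[i:j] against the segment's item set (union measure)
          union = set().union(*points[i:j])
          return sum(len(union ^ p) for p in points[i:j])

      def optimal_segmentation(points, k):
          # Split a list of item sets into k contiguous segments of minimal total cost
          n = len(points)

          @lru_cache(maxsize=None)
          def best(start, segs):
              if segs == 1:
                  return segment_cost(points, start, n), (start,)
              options = []
              for cut in range(start + 1, n - segs + 2):   # leave room for the remaining segments
                  tail_cost, tail_starts = best(cut, segs - 1)
                  options.append((segment_cost(points, start, cut) + tail_cost, (start,) + tail_starts))
              return min(options)
          return best(0, k)

      points = [{'a'}, {'a', 'b'}, {'a'}, {'x', 'y'}, {'y'}, {'y', 'z'}]
      cost, starts = optimal_segmentation(points, 2)       # expected segment starts: (0, 3)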

  13. Studies on time series applications in environmental sciences

    CERN Document Server

    Bărbulescu, Alina

    2016-01-01

    Time series analysis and modelling represent a large field of study, approached from both the time and the frequency perspective, with applications in different domains. Modelling hydro-meteorological time series is difficult due to the characteristics of these series, such as long-range dependence, spatial dependence, and correlation with other series. Continuous spatial data play an important role in planning, risk assessment and decision making in environmental management. In this context, this book presents various statistical tests and modelling techniques used for time series analysis, as well as applications to hydro-meteorological series from Dobrogea, a region situated in the south-eastern part of Romania that has been little studied until now. Part of the results are accompanied by their R code.

  14. Global Population Density Grid Time Series Estimates

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Population Density Grid Time Series Estimates provide a back-cast time series of population density grids based on the year 2000 population grid from SEDAC's...

  15. Prediction and Geometry of Chaotic Time Series

    National Research Council Canada - National Science Library

    Leonardi, Mary

    1997-01-01

    This thesis examines the topic of chaotic time series. An overview of chaos, dynamical systems, and traditional approaches to time series analysis is provided, followed by an examination of state space reconstruction...

  16. Synchronous genetic turnovers across Western Eurasia in Late Pleistocene collared lemmings.

    Science.gov (United States)

    Palkopoulou, Eleftheria; Baca, Mateusz; Abramson, Natalia I; Sablin, Mikhail; Socha, Paweł; Nadachowski, Adam; Prost, Stefan; Germonpré, Mietje; Kosintsev, Pavel; Smirnov, Nickolay G; Vartanyan, Sergey; Ponomarev, Dmitry; Nyström, Johanna; Nikolskiy, Pavel; Jass, Christopher N; Litvinov, Yuriy N; Kalthoff, Daniela C; Grigoriev, Semyon; Fadeeva, Tatyana; Douka, Aikaterini; Higham, Thomas F G; Ersmark, Erik; Pitulko, Vladimir; Pavlova, Elena; Stewart, John R; Węgleński, Piotr; Stankovic, Anna; Dalén, Love

    2016-05-01

    Recent palaeogenetic studies indicate a highly dynamic history in collared lemmings (Dicrostonyx spp.), with several demographical changes linked to climatic fluctuations that took place during the last glaciation. At the western range margin of D. torquatus, these changes were characterized by a series of local extinctions and recolonizations. However, it is unclear whether this pattern represents a local phenomenon, possibly driven by ecological edge effects, or a global phenomenon that took place across large geographical scales. To address this, we explored the palaeogenetic history of the collared lemming using a next-generation sequencing approach for pooled mitochondrial DNA amplicons. Sequences were obtained from over 300 fossil remains sampled across Eurasia and two sites in North America. We identified five mitochondrial lineages of D. torquatus that succeeded each other through time across Europe and western Russia, indicating a history of repeated population extinctions and recolonizations, most likely from eastern Russia, during the last 50 000 years. The observation of repeated extinctions across such a vast geographical range indicates large-scale changes in the steppe-tundra environment in western Eurasia during the last glaciation. All Holocene samples, from across the species' entire range, belonged to only one of the five mitochondrial lineages. Thus, extant D. torquatus populations only harbour a small fraction of the total genetic diversity that existed across different stages of the Late Pleistocene. In North American samples, haplotypes belonging to both D. groenlandicus and D. richardsoni were recovered from a Late Pleistocene site in south-western Canada. This suggests that D. groenlandicus had a more southern and D. richardsoni a more northern glacial distribution than previously thought. This study provides significant insights into the population dynamics of a small mammal at a large geographical scale and reveals a rather complex

  17. Sensor-Generated Time Series Events: A Definition Language

    Science.gov (United States)

    Anguera, Aurea; Lara, Juan A.; Lizcano, David; Martínez, Maria Aurora; Pazos, Juan

    2012-01-01

    There are now a great many domains where information is recorded by sensors over a limited time period or on a permanent basis. This data flow leads to sequences of data known as time series. In many domains, like seismography or medicine, time series analysis focuses on particular regions of interest, known as events, whereas the remainder of the time series contains hardly any useful information. In these domains, there is a need for mechanisms to identify and locate such events. In this paper, we propose an events definition language that is general enough to be used to easily and naturally define events in time series recorded by sensors in any domain. The proposed language has been applied to the definition of time series events generated within the branch of medicine dealing with balance-related functions in human beings. A device, called posturograph, is used to study balance-related functions. The platform has four sensors that record the pressure intensity being exerted on the platform, generating four interrelated time series. As opposed to the existing ad hoc proposals, the results confirm that the proposed language is valid, that is generally applicable and accurate, for identifying the events contained in the time series.

  18. Size variation in Middle Pleistocene humans.

    Science.gov (United States)

    Arsuaga, J L; Carretero, J M; Lorenzo, C; Gracia, A; Martínez, I; Bermúdez de Castro, J M; Carbonell, E

    1997-08-22

    It has been suggested that European Middle Pleistocene humans, Neandertals, and prehistoric modern humans had a greater sexual dimorphism than modern humans. Analysis of body size variation and cranial capacity variation in the large sample from the Sima de los Huesos site in Spain showed instead that the sexual dimorphism is comparable in Middle Pleistocene and modern populations.

  19. Correlation and multifractality in climatological time series

    International Nuclear Information System (INIS)

    Pedron, I T

    2010-01-01

    Climate can be described by statistical analysis of mean values of atmospheric variables over a period. It is possible to detect correlations in climatological time series and to classify their behavior. In this work the Hurst exponent, which can characterize correlation and persistence in time series, is obtained by using the Detrended Fluctuation Analysis (DFA) method. Data series of temperature, precipitation, humidity, solar radiation, wind speed, maximum squall, atmospheric pressure and random series are studied. Furthermore, the multifractality of such series is analyzed by applying the Multifractal Detrended Fluctuation Analysis (MF-DFA) method. The results indicate the presence of correlation (persistent character), as well as multifractality, in all climatological series. A larger and longer set of data could provide better results indicating the universality of the exponents.
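    A minimal DFA sketch of the kind used to estimate such a Hurst-like scaling exponent (NumPy only; the scale range and detrending order are illustrative choices, and the MF-DFA generalization is not shown):

```python
import numpy as np

def dfa_hurst(x, scales=None, order=1):
    """Estimate the DFA scaling exponent (Hurst-like) of a 1-D series."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                 # integrated profile
    if scales is None:
        scales = np.unique(np.logspace(np.log10(8), np.log10(len(x) // 4), 20).astype(int))
    flucts = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[: n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        rms = []
        for seg in segs:
            coeffs = np.polyfit(t, seg, order)   # local polynomial trend
            rms.append(np.sqrt(np.mean((seg - np.polyval(coeffs, t)) ** 2)))
        flucts.append(np.mean(rms))
    # slope of log F(s) versus log s is the scaling exponent (~ Hurst exponent)
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

white = np.random.default_rng(0).standard_normal(4096)
print(round(dfa_hurst(white), 2))   # close to 0.5 for uncorrelated noise
```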

  20. Constraining Middle Pleistocene Glaciations in Birmingham, England; Using Optical Stimulated Luminescence (OSL) Dating.

    Science.gov (United States)

    Gibson, S. M.; Gibbard, P. L.; Bateman, M. D.; Boreham, S.

    2014-12-01

    Birmingham is built on a complex sequence of Middle Pleistocene sediments, representing at least three lowland glaciations (MIS12, MIS6, and MIS2). British Geological Survey mapping accounts for 75% of the land mass as Quaternary deposits, predominantly glacial sandy tills, glacio-fluvial sands, clays, and organic silts and peats. Understanding the age of fluvio-glacial outwash related to specific glaciations is critical in establishing a geochronology of Birmingham. Shotton (1953) found a series of Middle Pleistocene glacial sediments, termed the Wolstonian, intermediate in age between the MIS11 and MIS5e interglacials. Uncertainty surrounding the relation to the East Anglian sequences developed by Rose (1987) implies that the Birmingham sequences should be referred to MIS12. Despite this, younger Middle Pleistocene glacial sequences occur in Birmingham, yet uncertainty has deepened over our understanding of the complex, inaccessible sediments, especially as these deposits have a similar extent to the MIS2 sequences. Five Optical Stimulated Luminescence (OSL) samples from three sites around Birmingham have been dated. East of Birmingham, ice advanced from the Irish Sea and later from the north east. In Wolston, a sample of outwash sand associated with the Thrussington Till is dated. In Meriden, two samples of outwash sands associated with a distal Oadby Till are dated. West of Birmingham, ice advanced from the Welsh Ice Sheet. In Seisdon, two samples of an esker and outwash sand associated with a Ridgeacre Till are dated. Correlation of the OSL dates provides an important constraint on understanding the history of Birmingham. Using GSI3D modelling to correlate geochronology and sedimentology, the significance of OSL dating can be understood within the complex sequences (and the regional stratigraphy), complemented by cosmogenic and palynological dates taken in the south west and north east. OSL dating of Birmingham's outwash sands, deposited by extensive repeated Middle Pleistocene glaciations, asserts the

  1. A novel model for Time-Series Data Clustering Based on piecewise SVD and BIRCH for Stock Data Analysis on Hadoop Platform

    Directory of Open Access Journals (Sweden)

    Ibgtc Bowala

    2017-06-01

    Full Text Available With the rapid growth of financial markets, analysts are paying more attention to prediction. Stock data are time series data of huge volume. A feasible solution for handling the increasing amount of data is to use a cluster for parallel processing, and the Hadoop parallel computing platform is a typical representative. There are various statistical models for forecasting time series data, but accurate clusters are a prerequisite. Clustering analysis for time series data is one of the main methods for mining time series data for many other analysis processes. However, general clustering algorithms cannot handle time series data directly, because such data have a special structure, a high dimensionality, and highly correlated values due to a high noise level. A novel model for time series clustering is presented using BIRCH, based on piecewise SVD, leading to a novel dimension reduction approach. Highly correlated features are handled using SVD with a novel approach to dimensionality reduction in order to preserve the correlated behavior, and BIRCH is then used for clustering. The algorithm is a novel model that can handle massive time series data. Finally, this new model is successfully applied to real stock time series data from Yahoo Finance with satisfactory results.

  2. Time Series Forecasting with Missing Values

    Directory of Open Access Journals (Sweden)

    Shin-Fu Wu

    2015-11-01

    Full Text Available Time series prediction has become more popular in various kinds of applications such as weather prediction, control engineering, financial analysis, industrial monitoring, etc. To deal with real-world problems, we are often faced with missing values in the data due to sensor malfunctions or human errors. Traditionally, the missing values are simply omitted or replaced by means of imputation methods. However, omitting those missing values may cause temporal discontinuity. Imputation methods, on the other hand, may alter the original time series. In this study, we propose a novel forecasting method based on least squares support vector machine (LSSVM. We employ the input patterns with the temporal information which is defined as local time index (LTI. Time series data as well as local time indexes are fed to LSSVM for doing forecasting without imputation. We compare the forecasting performance of our method with other imputation methods. Experimental results show that the proposed method is promising and is worth further investigations.
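    The LSSVM regression at the core of the proposed method reduces to solving one linear system; a minimal sketch with an RBF kernel follows (the local-time-index feature and the missing-value handling of the paper are omitted, and all names and parameter values are illustrative):

```python
import numpy as np

def rbf(A, B, sigma=1.0):
    """RBF kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    """Least-squares SVM regression: solve the linear KKT system for bias and alphas."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                       # bias b, coefficients alpha

def lssvm_predict(Xtrain, b, alpha, Xnew, sigma=1.0):
    return rbf(Xnew, Xtrain, sigma) @ alpha + b

# toy usage: one-step-ahead forecasting from the three preceding values
x = np.sin(0.2 * np.arange(300))
X = np.array([x[t - 3:t] for t in range(3, 290)])
y = x[3:290]
b, alpha = lssvm_fit(X, y)
x_next = lssvm_predict(X, b, alpha, np.array([x[287:290]]))
print(x_next[0], x[290])                         # forecast vs. actual value
```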

  3. Reconstruction of ensembles of coupled time-delay systems from time series.

    Science.gov (United States)

    Sysoev, I V; Prokhorov, M D; Ponomarenko, V I; Bezruchko, B P

    2014-06-01

    We propose a method to recover from time series the parameters of coupled time-delay systems and the architecture of couplings between them. The method is based on a reconstruction of model delay-differential equations and estimation of statistical significance of couplings. It can be applied to networks composed of nonidentical nodes with an arbitrary number of unidirectional and bidirectional couplings. We test our method on chaotic and periodic time series produced by model equations of ensembles of diffusively coupled time-delay systems in the presence of noise, and apply it to experimental time series obtained from electronic oscillators with delayed feedback coupled by resistors.

  4. Assessing error sources for Landsat time series analysis for tropical test sites in Viet Nam and Ethiopia

    Science.gov (United States)

    Schultz, Michael; Verbesselt, Jan; Herold, Martin; Avitabile, Valerio

    2013-10-01

    Researchers who use remotely sensed data can spend half of their total effort analysing prior data. If this data preprocessing does not match the application, this time spent on data analysis can increase considerably and can lead to inaccuracies. Despite the existence of a number of methods for pre-processing Landsat time series, each method has shortcomings, particularly for mapping forest changes under varying illumination, data availability and atmospheric conditions. Based on the requirements of mapping forest changes as defined by the United Nations (UN) Reducing Emissions from Forest Degradation and Deforestation (REDD) program, the accurate reporting of the spatio-temporal properties of these changes is necessary. We compared the impact of three fundamentally different radiometric preprocessing techniques, Moderate Resolution Atmospheric TRANsmission (MODTRAN), Second Simulation of a Satellite Signal in the Solar Spectrum (6S) and simple Dark Object Subtraction (DOS), on mapping forest changes using Landsat time series data. A modification of the Breaks For Additive Season and Trend (BFAST) monitor was used to jointly map the spatial and temporal agreement of forest changes at test sites in Ethiopia and Viet Nam. The suitability of the pre-processing methods for the occurring forest change drivers was assessed using recently captured ground truth and high resolution data (1000 points). A method for creating robust generic forest maps used for the sampling design is presented. An assessment of error sources has been performed, identifying haze as a major source of commission error in the time series analysis.

  5. The analysis of time series: an introduction

    National Research Council Canada - National Science Library

    Chatfield, Christopher

    1989-01-01

    .... A variety of practical examples are given to support the theory. The book covers a wide range of time-series topics, including probability models for time series, Box-Jenkins forecasting, spectral analysis, linear systems and system identification...

  6. Plio-Pleistocene aardvarks (Mammalia, Tubulidentata) from East Africa

    Directory of Open Access Journals (Sweden)

    T. Lehmann

    2008-08-01

    Full Text Available The Tubulidentata are unique among mammals for being the only order represented nowadays by a single living species, Orycteropus afer: the aardvark. Nevertheless, it is one of the least studied mammalian orders. Aardvarks are currently distributed all over sub-Saharan Africa, but the fossil record extends their spatial range to Europe and Asia. The earliest known Tubulidentata are ca. 20 million years old. About 14 species and three to four genera have been recognised so far, but since the late Pliocene, aardvarks have only been represented by a single genus and are restricted to Africa. The extant aardvark is the only species of Tubulidentata with a large distribution area, i.e. the African continent. There are three known Plio-Pleistocene African species of aardvark: Orycteropus afer (Pallas, 1766), O. crassidens MacInnes, 1956, and O. djourabensis Lehmann et al., 2004. Fossils of these species have been discovered in North Africa, Kenya, and Chad respectively. The present study is focused on the aardvark material found in the Plio-Pleistocene of East Africa (Ethiopia, Kenya). New specimens from Asa Issie (Ethiopia) and East Turkana (Kenya) are described, and published ones are re-examined in the light of the latest discoveries. This study demonstrates that Kenyan specimens identified as O. crassidens are in fact representatives of the Chadian O. djourabensis. Moreover, additional material from Ethiopia and Kenya shows a close relationship with the latter species too. The presence of specimens of O. djourabensis in Chad and in Kenya during the Plio-Pleistocene implies that this taxon is the oldest-known species of aardvark to have experienced a continental dispersal. It also shows that Tubulidentates were able to cross Africa from east to west during Plio-Pleistocene times, despite the presence of the Rift Valley. It is however not possible to infer the centre of origin of O. djourabensis. Finally, this study suggests that two species of aardvark

  7. Time series modeling in traffic safety research.

    Science.gov (United States)

    Lavrenz, Steven M; Vlahogianni, Eleni I; Gkritza, Konstantina; Ke, Yue

    2018-08-01

    The use of statistical models for analyzing traffic safety (crash) data has been well-established. However, time series techniques have traditionally been underrepresented in the corresponding literature, due to challenges in data collection, along with a limited knowledge of proper methodology. In recent years, new types of high-resolution traffic safety data, especially in measuring driver behavior, have made time series modeling techniques an increasingly salient topic of study. Yet there remains a dearth of information to guide analysts in their use. This paper provides an overview of the state of the art in using time series models in traffic safety research, and discusses some of the fundamental techniques and considerations in classic time series modeling. It also presents ongoing and future opportunities for expanding the use of time series models, and explores newer modeling techniques, including computational intelligence models, which hold promise in effectively handling ever-larger data sets. The information contained herein is meant to guide safety researchers in understanding this broad area of transportation data analysis, and provide a framework for understanding safety trends that can influence policy-making. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Time series prediction: statistical and neural techniques

    Science.gov (United States)

    Zahirniak, Daniel R.; DeSimio, Martin P.

    1996-03-01

    In this paper we compare the performance of nonlinear neural network techniques to those of linear filtering techniques in the prediction of time series. Specifically, we compare the results of using the nonlinear systems, known as multilayer perceptron and radial basis function neural networks, with the results obtained using the conventional linear Wiener filter, Kalman filter and Widrow-Hoff adaptive filter in predicting future values of stationary and non-stationary time series. Our results indicate that the performance of each type of system is heavily dependent upon the form of the time series being predicted and the size of the system used. In particular, the linear filters perform adequately for linear or near linear processes while the nonlinear systems perform better for nonlinear processes. Since the linear systems take much less time to be developed, they should be tried prior to using the nonlinear systems when the linearity properties of the time series process are unknown.
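    As a minimal illustration of the linear side of such a comparison, a Widrow-Hoff (LMS) adaptive filter used as a one-step-ahead predictor can be sketched as follows (filter length and step size are illustrative choices, not those of the paper):

```python
import numpy as np

def lms_predict(x, n_taps=8, mu=0.01):
    """One-step-ahead prediction with a Widrow-Hoff (LMS) adaptive filter."""
    x = np.asarray(x, dtype=float)
    w = np.zeros(n_taps)                     # filter weights, adapted on-line
    preds = np.full(len(x), np.nan)
    for t in range(n_taps, len(x)):
        u = x[t - n_taps:t][::-1]            # most recent samples first
        preds[t] = w @ u                     # predict x[t]
        e = x[t] - preds[t]                  # prediction error
        w += 2.0 * mu * e * u                # Widrow-Hoff weight update
    return preds

t = np.arange(2000)
x = np.sin(0.05 * t) + 0.1 * np.random.default_rng(1).standard_normal(2000)
p = lms_predict(x)
print(np.mean((x - p)[1000:] ** 2))          # error after the filter has adapted
```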

  9. Pleistocene North African genomes link Near Eastern and sub-Saharan African human populations.

    Science.gov (United States)

    van de Loosdrecht, Marieke; Bouzouggar, Abdeljalil; Humphrey, Louise; Posth, Cosimo; Barton, Nick; Aximu-Petri, Ayinuer; Nickel, Birgit; Nagel, Sarah; Talbi, El Hassan; El Hajraoui, Mohammed Abdeljalil; Amzazi, Saaïd; Hublin, Jean-Jacques; Pääbo, Svante; Schiffels, Stephan; Meyer, Matthias; Haak, Wolfgang; Jeong, Choongwon; Krause, Johannes

    2018-05-04

    North Africa is a key region for understanding human history, but the genetic history of its people is largely unknown. We present genomic data from seven 15,000-year-old modern humans, attributed to the Iberomaurusian culture, from Morocco. We find a genetic affinity with early Holocene Near Easterners, best represented by Levantine Natufians, suggesting a pre-agricultural connection between Africa and the Near East. We do not find evidence for gene flow from Paleolithic Europeans to Late Pleistocene North Africans. The Taforalt individuals derive one-third of their ancestry from sub-Saharan Africans, best approximated by a mixture of genetic components preserved in present-day West and East Africans. Thus, we provide direct evidence for genetic interactions between modern humans across Africa and Eurasia in the Pleistocene. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.

  10. Pleistocene changes in the fauna and flora of South America.

    Science.gov (United States)

    Vuilleumier, B S

    1971-08-27

    In recent years, the view that Pleistocene climatic events played a major role in the evolution of the biotas of southern, primarily tropical continents has begun to displace the previously held conviction that these areas remained relatively stable during the Quaternary. Studies of speciation patterns of high Andean plant and avian taxa (7-14) have led to the conclusion that Pleistocene climatic events were the factors that ultimately shaped the patterns now observed in the paramo-puna and the related Patagonian flora and fauna. The final uplift of the Andes at the end of the Tertiary automatically limits the age of the high Andean habitats and their biotas to the Quaternary. Within this period, the number of ecological fluctuations caused by the glaciations could easily have provided the mechanism behind the patterns now present in these habitats (Appendix, 1; Figs. 1 and 2; Table 1). In glacial periods, when vegetation belts, were lowered, organisms in the paramo-puna habitat were allowed to expand their ranges. In interglacial periods, these taxa were isolated on disjunct peaks, where differentiation could occur. At times of ice expansion, glacial tongues and lakes provided local barriers to gene exchange, whereas in warm, interglacial times, dry river valleys were a major deterrent to the interbreeding of populations on different mountains (Fig. 2; Table 2). A preliminary analysis of about 10 to 12 percent of the total South American avifauna (14), subsequent to the study of the high Andean biota, suggested that the birds of all the major habitats of the continent possess, with about equal frequency, similar stages of speciation. This correspondence in levels of evolution indicated that the avifauna of vegetation zones which were thought to have been more stable (for example, tropical rainforests) are as actively speciating as are those of the more recent paramo-puna habitats. More intensive work on lowland tropical taxa (16, 19-21) and recent work on montane

  11. Time-series-analysis techniques applied to nuclear-material accounting

    International Nuclear Information System (INIS)

    Pike, D.H.; Morrison, G.W.; Downing, D.J.

    1982-05-01

    This document is designed to introduce the reader to the applications of Time Series Analysis techniques to Nuclear Material Accountability data. Time series analysis techniques are designed to extract information from a collection of random variables ordered by time by seeking to identify any trends, patterns, or other structure in the series. Since nuclear material accountability data is a time series, one can extract more information using time series analysis techniques than by using other statistical techniques. Specifically, the objective of this document is to examine the applicability of time series analysis techniques to enhance loss detection of special nuclear materials. An introductory section examines the current industry approach, which utilizes inventory differences. The error structure of inventory differences is presented. Time series analysis techniques discussed include the Shewhart Control Chart, the Cumulative Summation of Inventory Differences Statistics (CUSUM) and the Kalman Filter and Linear Smoother.
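    A minimal sketch of the CUSUM idea applied to a simulated sequence of standardized inventory differences follows (the baseline, slack and threshold values are illustrative, not those of the report; the Kalman filter and linear smoother are not shown):

```python
import numpy as np

def cusum_alarm(inventory_diffs, baseline=20, slack=0.5, threshold=5.0):
    """Tabular one-sided CUSUM on standardized inventory differences.
    Returns the CUSUM path and the first period at which it exceeds the threshold."""
    d = np.asarray(inventory_diffs, dtype=float)
    z = (d - d[:baseline].mean()) / d[:baseline].std()   # standardize against an in-control period
    s = np.zeros(len(z))
    alarm = None
    for t in range(1, len(z)):
        s[t] = max(0.0, s[t - 1] + z[t] - slack)         # accumulate positive (loss-like) drift only
        if alarm is None and s[t] > threshold:
            alarm = t
    return s, alarm

rng = np.random.default_rng(2)
d = rng.normal(0.0, 1.0, 100)
d[60:] += 1.0                                            # simulated protracted loss from period 60 on
_, alarm = cusum_alarm(d)
print("alarm raised at period:", alarm)
```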

  12. Clinical and epidemiological rounds. Time series

    Directory of Open Access Journals (Sweden)

    León-Álvarez, Alba Luz

    2016-07-01

    Full Text Available Analysis of time series is a technique that involves the study of individuals or groups observed at successive moments in time. This type of analysis allows the study of potential causal relationships between different variables that change over time and relate to each other. It is the most important technique for making inferences about the future, predicting on the basis of what has happened in the past, and it is applied in different disciplines of knowledge. Here we discuss the different components of time series, the analysis technique, and specific examples in health research.

  13. Pollen analyses of Pleistocene hyaena coprolites from Montenegro and Serbia

    Directory of Open Access Journals (Sweden)

    Argant Jacqueline

    2007-01-01

    Full Text Available The results of pollen analyses of hyaena coprolites from the Early Pleistocene cave of Trlica in northern Montenegro and the Late Pleistocene cave of Baranica in southeast Serbia are described. The Early Pleistocene Pachycrocuta brevirostris and the Late Pleistocene Crocuta spelaea are the coprolite-producing species. Although the pollen concentration was rather low, the presented analyses add considerably to the much-needed knowledge of the vegetation of the central Balkans during the Pleistocene. Pollen extracted from a coprolite from the Baranica cave indicates an open landscape with the presence of steppe taxa, which is in accordance with the recorded conditions and faunal remains. Pollen analysis of the Early Pleistocene samples from Trlica indicates fresh and temperate humid climatic conditions, as well as the co-existence of several biotopes which formed a mosaic landscape in the vicinity of the cave.

  14. Integer-valued time series

    NARCIS (Netherlands)

    van den Akker, R.

    2007-01-01

    This thesis addresses statistical problems in econometrics. The first part contributes statistical methodology for nonnegative integer-valued time series. The second part of this thesis discusses semiparametric estimation in copula models and develops semiparametric lower bounds for a large class of

  15. Robust Forecasting of Non-Stationary Time Series

    NARCIS (Netherlands)

    Croux, C.; Fried, R.; Gijbels, I.; Mahieu, K.

    2010-01-01

    This paper proposes a robust forecasting method for non-stationary time series. The time series is modelled using non-parametric heteroscedastic regression, and fitted by a localized MM-estimator, combining high robustness and large efficiency. The proposed method is shown to produce reliable

  16. Modeling pollen time series using seasonal-trend decomposition procedure based on LOESS smoothing.

    Science.gov (United States)

    Rojo, Jesús; Rivero, Rosario; Romero-Morte, Jorge; Fernández-González, Federico; Pérez-Badia, Rosa

    2017-02-01

    Analysis of airborne pollen concentrations provides valuable information on plant phenology and is thus a useful tool in agriculture-for predicting harvests in crops such as the olive and for deciding when to apply phytosanitary treatments-as well as in medicine and the environmental sciences. Variations in airborne pollen concentrations, moreover, are indicators of changing plant life cycles. By modeling pollen time series, we can not only identify the variables influencing pollen levels but also predict future pollen concentrations. In this study, airborne pollen time series were modeled using a seasonal-trend decomposition procedure based on LOcally wEighted Scatterplot Smoothing (LOESS) smoothing (STL). The data series-daily Poaceae pollen concentrations over the period 2006-2014-was broken up into seasonal and residual (stochastic) components. The seasonal component was compared with data on Poaceae flowering phenology obtained by field sampling. Residuals were fitted to a model generated from daily temperature and rainfall values, and daily pollen concentrations, using partial least squares regression (PLSR). This method was then applied to predict daily pollen concentrations for 2014 (independent validation data) using results for the seasonal component of the time series and estimates of the residual component for the period 2006-2013. Correlation between predicted and observed values was r = 0.79 (correlation coefficient) for the pre-peak period (i.e., the period prior to the peak pollen concentration) and r = 0.63 for the post-peak period. Separate analysis of each of the components of the pollen data series enables the sources of variability to be identified more accurately than by analysis of the original non-decomposed data series, and for this reason, this procedure has proved to be a suitable technique for analyzing the main environmental factors influencing airborne pollen concentrations.
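    A decomposition in the same spirit can be sketched with the LOESS-based STL implementation in statsmodels (a synthetic daily series stands in for the pollen data; the PLSR modelling of the residual component is not shown, and the parameter values are illustrative):

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

# synthetic daily series with an annual "pollen season" plus noise (stand-in data)
idx = pd.date_range("2006-01-01", "2013-12-31", freq="D")
doy = idx.dayofyear.to_numpy()
rng = np.random.default_rng(3)
values = np.maximum(0.0, 80.0 * np.exp(-((doy - 150) ** 2) / 800.0) + rng.normal(0, 5, len(idx)))
series = pd.Series(values, index=idx)

result = STL(series, period=365, robust=True).fit()   # seasonal-trend decomposition using LOESS
seasonal, trend, resid = result.seasonal, result.trend, result.resid
print(resid.std())    # the stochastic component that would feed the regression model
```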

  17. Characterizing time series via complexity-entropy curves

    Science.gov (United States)

    Ribeiro, Haroldo V.; Jauregui, Max; Zunino, Luciano; Lenzi, Ervin K.

    2017-06-01

    The search for patterns in time series is a very common task when dealing with complex systems. This is usually accomplished by employing a complexity measure such as entropies and fractal dimensions. However, such measures usually only capture a single aspect of the system dynamics. Here, we propose a family of complexity measures for time series based on a generalization of the complexity-entropy causality plane. By replacing the Shannon entropy by a monoparametric entropy (Tsallis q entropy) and after considering the proper generalization of the statistical complexity (q complexity), we build up a parametric curve (the q -complexity-entropy curve) that is used for characterizing and classifying time series. Based on simple exact results and numerical simulations of stochastic processes, we show that these curves can distinguish among different long-range, short-range, and oscillating correlated behaviors. Also, we verify that simulated chaotic and stochastic time series can be distinguished based on whether these curves are open or closed. We further test this technique in experimental scenarios related to chaotic laser intensity, stock price, sunspot, and geomagnetic dynamics, confirming its usefulness. Finally, we prove that these curves enhance the automatic classification of time series with long-range correlations and interbeat intervals of healthy subjects and patients with heart disease.
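    The entropy coordinate of such a curve can be sketched from the ordinal-pattern distribution as follows (Shannon and normalized Tsallis q-entropy only; the accompanying q-complexity term, which requires the generalized disequilibrium, is omitted, and all names are illustrative):

```python
import numpy as np
from itertools import permutations

def ordinal_probs(x, d=4, tau=1):
    """Relative frequencies of the d! ordinal patterns of a series."""
    x = np.asarray(x, dtype=float)
    counts = {p: 0 for p in permutations(range(d))}
    for i in range(len(x) - (d - 1) * tau):
        counts[tuple(np.argsort(x[i:i + d * tau:tau]))] += 1
    p = np.array(list(counts.values()), dtype=float)
    return p / p.sum()

def normalized_q_entropy(p, q):
    """Normalized Tsallis q-entropy of a distribution (q -> 1 recovers Shannon)."""
    n = len(p)                      # number of possible ordinal patterns (d!)
    p = p[p > 0]
    if abs(q - 1.0) < 1e-9:
        return -np.sum(p * np.log(p)) / np.log(n)
    return ((1.0 - np.sum(p ** q)) / (q - 1.0)) / ((1.0 - n ** (1.0 - q)) / (q - 1.0))

rng = np.random.default_rng(4)
noise = rng.standard_normal(20000)
logistic = np.empty(20000)
logistic[0] = 0.4
for i in range(1, 20000):
    logistic[i] = 4.0 * logistic[i - 1] * (1.0 - logistic[i - 1])   # chaotic map

for q in (0.5, 1.0, 2.0):
    print(q,
          round(normalized_q_entropy(ordinal_probs(noise), q), 3),      # near 1 for noise
          round(normalized_q_entropy(ordinal_probs(logistic), q), 3))   # lower for chaos
```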

  18. Complex network approach to fractional time series

    Energy Technology Data Exchange (ETDEWEB)

    Manshour, Pouya [Physics Department, Persian Gulf University, Bushehr 75169 (Iran, Islamic Republic of)

    2015-10-15

    In order to extract correlation information inherited in stochastic time series, the visibility graph algorithm has been recently proposed, by which a time series can be mapped onto a complex network. We demonstrate that the visibility algorithm is not an appropriate one to study the correlation aspects of a time series. We then employ the horizontal visibility algorithm, as a much simpler one, to map fractional processes onto complex networks. The degree distributions are shown to have parabolic exponential forms with Hurst dependent fitting parameter. Further, we take into account other topological properties such as maximum eigenvalue of the adjacency matrix and the degree assortativity, and show that such topological quantities can also be used to predict the Hurst exponent, with an exception for anti-persistent fractional Gaussian noises. To solve this problem, we take into account the Spearman correlation coefficient between nodes' degrees and their corresponding data values in the original time series.
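    A small sketch of the horizontal visibility mapping and its degree distribution for uncorrelated noise (NumPy only; the further topological quantities discussed in the paper, such as assortativity or the adjacency spectrum, are not computed):

```python
import numpy as np
from collections import Counter

def hvg_degrees(x):
    """Degrees of the horizontal visibility graph: i and j (i < j) are linked
    when every value strictly between them is lower than min(x[i], x[j])."""
    x = np.asarray(x, dtype=float)
    deg = np.zeros(len(x), dtype=int)
    for i in range(len(x) - 1):
        top = -np.inf                      # running maximum of the values between i and j
        for j in range(i + 1, len(x)):
            if x[j] > top:                 # nothing in between blocks the horizontal line
                deg[i] += 1
                deg[j] += 1
            top = max(top, x[j])
            if top >= x[i]:                # later points can no longer see i
                break
    return deg

x = np.random.default_rng(5).standard_normal(5000)
counts = Counter(hvg_degrees(x))
ks = np.array(sorted(counts))
pk = np.array([counts[k] for k in ks], dtype=float) / len(x)
# for uncorrelated noise P(k) = (1/3)(2/3)**(k - 2), i.e. a log-slope of ln(2/3)
fit_slope = np.polyfit(ks[2:12], np.log(pk[2:12]), 1)[0]
print(fit_slope, np.log(2.0 / 3.0))
```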

  19. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat

    2008-01-01

    An accessible introduction to the most current thinking in and practicality of forecasting techniques in the context of time-oriented data. Analyzing time-oriented data and forecasting are among the most important problems that analysts face across many fields, ranging from finance and economics to production operations and the natural sciences. As a result, there is a widespread need for large groups of people in a variety of fields to understand the basic concepts of time series analysis and forecasting. Introduction to Time Series Analysis and Forecasting presents the time series analysis branch of applied statistics as the underlying methodology for developing practical forecasts, and it also bridges the gap between theory and practice by equipping readers with the tools needed to analyze time-oriented data and construct useful, short- to medium-term, statistically based forecasts.

  20. Speleothem Mg-isotope time-series data from different climate belts

    Science.gov (United States)

    Riechelmann, S.; Buhl, D.; Richter, D. K.; Schröder-Ritzrau, A.; Riechelmann, D. F. C.; Niedermayr, A.; Vonhof, H. B.; Wassenburg, J.; Immenhauser, A.

    2012-04-01

    The magnesium isotope proxy in Earth surface research is still underexplored. Recently, field and laboratory experiments have shed light on the complex suite of processes affecting Mg isotope fractionation in continental weathering systems. Magnesium-isotope fractionation in speleothems depends on a series of factors including biogenic activity and composition of soils, mineralogy of the host rock, changes in silicate versus carbonate weathering ratios, water residence time in the soil and host rock, and disequilibrium factors such as the precipitation rate of calcite in speleothems. Furthermore, the silicate (here mainly Mg-bearing clays) versus carbonate weathering ratio depends on air temperature and rainfall amount, which also influence the soil biogenic activity. It must be emphasized that carbonate weathering is generally dominant, but under increasingly warm and more arid climate conditions, silicate weathering rates increase and release 26Mg-enriched isotopes to the soil water. Furthermore, as shown in laboratory experiments, increasing calcite precipitation rates lead to elevated δ26Mg values and vice versa. Here, data from six stalagmite time-series Mg-isotope records (Thermo Fisher Scientific Neptune MC-ICP-MS) are shown. Stalagmites

  1. Incremental fuzzy C medoids clustering of time series data using dynamic time warping distance.

    Science.gov (United States)

    Liu, Yongli; Chen, Jingli; Wu, Shuai; Liu, Zhizhong; Chao, Hao

    2018-01-01

    Clustering time series data is of great significance since it could extract meaningful statistics and other characteristics. Especially in biomedical engineering, outstanding clustering algorithms for time series may help improve the health level of people. Considering data scale and time shifts of time series, in this paper, we introduce two incremental fuzzy clustering algorithms based on a Dynamic Time Warping (DTW) distance. For recruiting Single-Pass and Online patterns, our algorithms could handle large-scale time series data by splitting it into a set of chunks which are processed sequentially. Besides, our algorithms select DTW to measure distance of pair-wise time series and encourage higher clustering accuracy because DTW could determine an optimal match between any two time series by stretching or compressing segments of temporal data. Our new algorithms are compared to some existing prominent incremental fuzzy clustering algorithms on 12 benchmark time series datasets. The experimental results show that the proposed approaches could yield high quality clusters and were better than all the competitors in terms of clustering accuracy.
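    The dynamic time warping distance underlying the clustering can be computed with the classic dynamic program; a minimal sketch follows (without the Single-Pass/Online chunking or the medoid updates of the algorithms themselves):

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-programming DTW distance between two 1-D series."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

t = np.linspace(0, 2 * np.pi, 80)
x, y = np.sin(t), np.sin(t + 0.6)               # same shape, shifted in time
z = np.random.default_rng(6).standard_normal(80)
print(dtw_distance(x, y), dtw_distance(x, z))   # DTW tolerates the time shift
```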

  2. Incremental fuzzy C medoids clustering of time series data using dynamic time warping distance

    Science.gov (United States)

    Chen, Jingli; Wu, Shuai; Liu, Zhizhong; Chao, Hao

    2018-01-01

    Clustering time series data is of great significance since it could extract meaningful statistics and other characteristics. Especially in biomedical engineering, outstanding clustering algorithms for time series may help improve the health level of people. Considering data scale and time shifts of time series, in this paper, we introduce two incremental fuzzy clustering algorithms based on a Dynamic Time Warping (DTW) distance. For recruiting Single-Pass and Online patterns, our algorithms could handle large-scale time series data by splitting it into a set of chunks which are processed sequentially. Besides, our algorithms select DTW to measure distance of pair-wise time series and encourage higher clustering accuracy because DTW could determine an optimal match between any two time series by stretching or compressing segments of temporal data. Our new algorithms are compared to some existing prominent incremental fuzzy clustering algorithms on 12 benchmark time series datasets. The experimental results show that the proposed approaches could yield high quality clusters and were better than all the competitors in terms of clustering accuracy. PMID:29795600

  3. The foundations of modern time series analysis

    CERN Document Server

    Mills, Terence C

    2011-01-01

    This book develops the analysis of Time Series from its formal beginnings in the 1890s through to Box and Jenkins' watershed publication in 1970, showing how these methods laid the foundations for the modern techniques of Time Series analysis that are in use today.

  4. Time series clustering in large data sets

    Directory of Open Access Journals (Sweden)

    Jiří Fejfar

    2011-01-01

    Full Text Available The clustering of time series is a widely researched area. There are many methods for dealing with this task. We are currently using the Self-organizing map (SOM) with the unsupervised learning algorithm for clustering of time series. After the first experiment (Fejfar, Weinlichová, Šťastný, 2009) it seems that the whole concept of the clustering algorithm is correct but that we have to perform time series clustering on a much larger dataset to obtain more accurate results and to find the correlation between configured parameters and results more precisely. The second requirement arose from a need for a well-defined evaluation of results. It seems useful to use sound recordings as instances of time series again. There are many recordings to use in digital libraries, and many interesting features and patterns can be found in this area. In this experiment we are searching for recordings with a similar development of information density. This can be used for musical form investigation, cover song detection and many other applications. The objective of the presented paper is to compare clustering results made with different parameters of feature vectors and the SOM itself. We describe time series in a simplistic way, evaluating standard deviations for separated parts of recordings. The resulting feature vectors are clustered with the SOM in batch training mode with different topologies varying from a few neurons to large maps. Other algorithms usable for finding similarities between time series are also discussed, and finally conclusions for further research are presented. We also present an overview of the related current literature and projects.

  5. Transmission of linear regression patterns between time series: from relationship in time series to complex networks.

    Science.gov (United States)

    Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong; Ding, Yinghui

    2014-07-01

    The linear regression parameters between two time series can be different under different lengths of observation period. If we study the whole period by the sliding window of a short period, the change of the linear regression parameters is a process of dynamic transmission over time. We tackle fundamental research that presents a simple and efficient computational scheme: a linear regression patterns transmission algorithm, which transforms linear regression patterns into directed and weighted networks. The linear regression patterns (nodes) are defined by the combination of intervals of the linear regression parameters and the results of the significance testing under different sizes of the sliding window. The transmissions between adjacent patterns are defined as edges, and the weights of the edges are the frequency of the transmissions. The major patterns, the distance, and the medium in the process of the transmission can be captured. The statistical results of weighted out-degree and betweenness centrality are mapped on timelines, which shows the features of the distribution of the results. Many measurements in different areas that involve two related time series variables could take advantage of this algorithm to characterize the dynamic relationships between the time series from a new perspective.
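    A minimal sketch of the sliding-window idea follows; the patterns here are a coarse two-attribute version (slope sign and significance) of the interval-based patterns defined in the paper, and the names and window length are illustrative:

```python
import numpy as np
from collections import Counter
from scipy.stats import linregress

def window_pattern(xw, yw, alpha=0.05):
    """Map one sliding window of the two series to a coarse linear-regression pattern."""
    slope, _, _, pvalue, _ = linregress(xw, yw)
    return ("pos" if slope > 0 else "neg",
            "sig" if pvalue < alpha else "nonsig")

def transmission_network(x, y, window=30):
    """Directed, weighted network: edges are transitions between successive patterns,
    weights are transition frequencies."""
    nodes = [window_pattern(x[i:i + window], y[i:i + window])
             for i in range(len(x) - window + 1)]
    return Counter(zip(nodes[:-1], nodes[1:]))

rng = np.random.default_rng(7)
x = np.cumsum(rng.standard_normal(500))
y = 0.5 * x + rng.standard_normal(500)
for (src, dst), weight in transmission_network(x, y).most_common(3):
    print(src, "->", dst, "weight:", weight)
```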

  6. Lag space estimation in time series modelling

    DEFF Research Database (Denmark)

    Goutte, Cyril

    1997-01-01

    The purpose of this article is to investigate some techniques for finding the relevant lag-space, i.e. input information, for time series modelling. This is an important aspect of time series modelling, as it conditions the design of the model through the regressor vector a.k.a. the input layer...

  7. The role of ice sheets in the pleistocene climate

    NARCIS (Netherlands)

    Oerlemans, J.

    1991-01-01

    Northern hemisphere ice sheets have played an important role in the climatic evolution of the Pleistocene. The characteristic time-scale of ice-sheet growth has the same order of magnitude as that of the orbital insolation variations. The interaction with the solid earth, the importance of the

  8. Time-series prediction and applications a machine intelligence approach

    CERN Document Server

    Konar, Amit

    2017-01-01

    This book presents machine learning and type-2 fuzzy sets for the prediction of time-series with a particular focus on business forecasting applications. It also proposes new uncertainty management techniques in an economic time-series using type-2 fuzzy sets for prediction of the time-series at a given time point from its preceding value in fluctuating business environments. It employs machine learning to determine repetitively occurring similar structural patterns in the time-series and uses a stochastic automaton to predict the most probable structure at a given partition of the time-series. Such predictions help in determining probabilistic moves in a stock index time-series. Primarily written for graduate students and researchers in computer science, the book is equally useful for researchers/professionals in business intelligence and stock index prediction. A background of undergraduate level mathematics is presumed, although not mandatory, for most of the sections. Exercises with tips are provided at...

  9. Pleistocene and Holocene Iberian flora: a complete picture and review

    Science.gov (United States)

    González Sampériz, Penélope

    2010-05-01

    A detailed analysis of the location and composition of Iberian vegetation types during the whole Pleistocene and Holocene periods shows a complex patched landscape with persistence of different types of ecosystems, even during glacial times. In addition, recent, high-resolution palaeoecological records are changing the traditional picture of post-glacial vegetation succession in the Iberian Peninsula. The main available charcoal and pollen sequences include, coniferous and deciduous forest, steppes, shrublands, savannahs and glacial refugia during the Pleistocene for Meso-thermophytes (phytodiversity reservoirs), in different proportions. This panorama suggests an environmental complexity that relates biotic responses to climate changes forced by Milankovitch cycles, suborbital forcings and by the latitudinal and physiographic particularities of the Iberian Peninsula. Thus, many factors are critical in the course of vegetational developments and strong regional differences are observed since the Early Pleistocene. Currently, the flora of Iberia is located in two biogeographical/climatic regions: the Eurosiberian and the Mediterranean. The first one includes northern and northwestern areas of the peninsula, where post-glacial responses of vegetation are very similar to Central Europe, although with some particularities due to its proximity to both the Atlantic Ocean and the Mediterranean region. The second one comprises the main territory of Iberia and shows more complex patterns and singularities, now and in the past. Steppe landscapes dominated extensive areas over all the territory during the cold spells of the Quaternary, especially during the Late Pleistocene up to the Last Glacial Maximum, but differences in composition of the dominant taxa (Compositae versus Artemisia) are observed since the Early Pleistocene, probably related to moisture regional gradients. Coastal shelves and intramountainous valleys, even in continental areas, are spots of floristic

  10. A Time Series Forecasting Method

    Directory of Open Access Journals (Sweden)

    Wang Zhao-Yu

    2017-01-01

    Full Text Available This paper proposes a novel time series forecasting method based on a weighted self-constructing clustering technique. The weighted self-constructing clustering processes all the data patterns incrementally. If a data pattern is not similar enough to an existing cluster, it forms a new cluster of its own. However, if a data pattern is similar enough to an existing cluster, it is removed from the cluster it currently belongs to and added to the most similar cluster. During the clustering process, weights are learned for each cluster. Given a series of time-stamped data up to time t, we divide it into a set of training patterns. By using the weighted self-constructing clustering, the training patterns are grouped into a set of clusters. To estimate the value at time t + 1, we find the k nearest neighbors of the input pattern and use these k neighbors to decide the estimation. Experimental results are shown to demonstrate the effectiveness of the proposed approach.
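    Only the k-nearest-pattern estimation step is sketched below; the weighted self-constructing clustering that groups the training patterns in the paper is not reproduced, and all names and parameter values are illustrative:

```python
import numpy as np

def knn_forecast(x, lag=5, k=3):
    """Forecast the next value from the k historical lag-patterns nearest to the latest one."""
    x = np.asarray(x, dtype=float)
    patterns = np.array([x[i:i + lag] for i in range(len(x) - lag)])
    targets = x[lag:]                               # value that followed each pattern
    query = x[-lag:]                                # the most recent lag window
    d = np.linalg.norm(patterns - query, axis=1)
    nearest = np.argsort(d)[:k]
    weights = 1.0 / (d[nearest] + 1e-12)            # closer patterns weigh more
    return float(np.average(targets[nearest], weights=weights))

x = np.sin(0.3 * np.arange(400))
print(knn_forecast(x), np.sin(0.3 * 400))           # forecast vs. true next value
```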

  11. Stochastic nature of series of waiting times

    Science.gov (United States)

    Anvari, Mehrnaz; Aghamohammadi, Cina; Dashti-Naserabadi, H.; Salehi, E.; Behjat, E.; Qorbani, M.; Khazaei Nezhad, M.; Zirak, M.; Hadjihosseini, Ali; Peinke, Joachim; Tabar, M. Reza Rahimi

    2013-06-01

    Although fluctuations in waiting time series have been studied for a long time, some important issues such as their long-range memory and their stochastic features in the presence of nonstationarity have so far remained unstudied. Here we find that the “waiting times” series for a given increment level have long-range correlations with Hurst exponents belonging to the interval 1/2 < H < 1. We find that the logarithmic difference of the waiting times series has a short-range correlation, and we then study its stochastic nature using the Markovian method and determine the corresponding Kramers-Moyal coefficients. As an example, we analyze the velocity fluctuations in high-Reynolds-number turbulence and determine the level dependence of the Markov time scales, as well as the drift and diffusion coefficients. We show that the waiting time distributions exhibit power-law tails, and we were able to model the distribution with a continuous time random walk.

  12. Efficient Approximate OLAP Querying Over Time Series

    DEFF Research Database (Denmark)

    Perera, Kasun Baruhupolage Don Kasun Sanjeewa; Hahmann, Martin; Lehner, Wolfgang

    2016-01-01

    The ongoing trend for data gathering not only produces larger volumes of data, but also increases the variety of recorded data types. Out of these, especially time series, e.g. various sensor readings, have attracted attention in the domains of business intelligence and decision making. As OLAP queries play a major role in these domains, it is desirable to also execute them on time series data. While this is not a problem on the conceptual level, it can become a bottleneck with regards to query run-time. In general, processing OLAP queries gets more computationally intensive as the volume of data grows. This is a particular problem when querying time series data, which generally contains multiple measures recorded at fine time granularities. Usually, this issue is addressed either by scaling up hardware or by employing workload based query optimization techniques. However, these solutions...

  13. A Dynamic Fuzzy Cluster Algorithm for Time Series

    Directory of Open Access Journals (Sweden)

    Min Ji

    2013-01-01

    clustering time series by introducing the definition of key point and improving FCM algorithm. The proposed algorithm works by determining those time series whose class labels are vague and further partitions them into different clusters over time. The main advantage of this approach compared with other existing algorithms is that the property of some time series belonging to different clusters over time can be partially revealed. Results from simulation-based experiments on geographical data demonstrate the excellent performance and the desired results have been obtained. The proposed algorithm can be applied to solve other clustering problems in data mining.

  14. A novel weight determination method for time series data aggregation

    Science.gov (United States)

    Xu, Paiheng; Zhang, Rong; Deng, Yong

    2017-09-01

    Aggregation in time series is of great importance in time series smoothing, prediction and other time series analysis processes, which makes it crucial to determine the weights in time series correctly and reasonably. In this paper, a novel method to obtain the weights in time series is proposed, in which we adopt the induced ordered weighted aggregation (IOWA) operator and the visibility graph averaging (VGA) operator and linearly combine the weights separately generated by the two operators. The IOWA operator is introduced into the weight determination for time series, through which the time decay factor is taken into consideration. The VGA operator is able to generate weights with respect to the degree distribution in the visibility graph constructed from the corresponding time series, which reflects the relative importance of vertices in the time series. The proposed method is applied to two practical datasets to illustrate its merits. The aggregation of the Construction Cost Index (CCI) demonstrates the ability of the proposed method to smooth time series, while the aggregation of the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) illustrates how the proposed method maintains the variation tendency of the original data.

  15. Foundations of Sequence-to-Sequence Modeling for Time Series

    OpenAIRE

    Kuznetsov, Vitaly; Mariet, Zelda

    2018-01-01

    The availability of large amounts of time series data, paired with the performance of deep-learning algorithms on a broad class of problems, has recently led to significant interest in the use of sequence-to-sequence models for time series forecasting. We provide the first theoretical analysis of this time series forecasting framework. We include a comparison of sequence-to-sequence modeling to classical time series models, and as such our theory can serve as a quantitative guide for practiti...

  16. Climate Prediction Center (CPC) Global Precipitation Time Series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The global precipitation time series provides time series charts showing observations of daily precipitation as well as accumulated precipitation compared to normal...

  17. Climate Prediction Center (CPC) Global Temperature Time Series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The global temperature time series provides time series charts using station based observations of daily temperature. These charts provide information about the...

  18. High-latitude steppe vegetation and the mineral nutrition of Pleistocene herbivores

    Science.gov (United States)

    Davydov, S. P.; Davydova, A.; Makarevich, R.; Loranty, M. M.; Boeskorov, G.

    2014-12-01

    High-latitude steppes were widespread and zonal in the Late Pleistocene and formed a landscape basis for the Mammoth Biome. Today, patches of these steppes survive on steep slopes with southern aspects. These steppes serve as unique information sources about the Late Pleistocene "Mammoth" steppe. Numerous data obtained by palynological, carpological, and DNA analysis of plant remains from feces and stomach contents of Pleistocene herbivore mummies, as well as from buried soils and enclosing deposits, show that they are similar to modern steppe plant assemblages in taxa composition. Plant nutrient concentrations are of fundamental importance across Pleistocene grass-rich ecosystems because of their role in the support of large herbivores. The average weight of an adult mammoth skeleton (about 0.5 tons) and of a woolly rhinoceros (about 0.2 tons) clearly suggests this. Detailed studies on fossil bone remains showed mineral deficiency in large Pleistocene herbivores. A three-year study of ash and mineral contents of two types of relict steppe vegetation in the Kolyma Lowland, Arctic Siberia, has been carried out. Nowadays refugia of similar vegetation are located not far (1-15 km) from the Yedoma permafrost outcrops where abundant fossil remains are found. Dominant species of the steppe vegetation were sampled. Preliminary studies indicate that the ash content varied 1.5-2 times among species of steppe herbs. The Ca, P, Mg, K element contents were higher for most steppe species than in the local herbaceous vegetation, especially in Ca and P. One of the most important elements of mineral nutrition, phosphorus, was always found in higher concentrations in the steppe vegetation than in plants of the recently dominant landscapes of the study area. It should be noted that the mineral nutrient content of the modern steppe vegetation of the Siberian Arctic is comparable to that of the recent zonal steppe of the Transbaikal Region. This study supports the hypothesis that

  19. Recurrent Neural Network Applications for Astronomical Time Series

    Science.gov (United States)

    Protopapas, Pavlos

    2017-06-01

    The benefits of good predictive models in astronomy lie in early event prediction systems and effective resource allocation. Current time series methods applicable to regular time series have not evolved to generalize for irregular time series. In this talk, I will describe two Recurrent Neural Network methods, Long Short-Term Memory (LSTM) and Echo State Networks (ESNs), for predicting irregular time series. Feature engineering along with non-linear modeling proved to be an effective predictor. For noisy time series, the prediction is improved by training the network on error realizations using the error estimates from astronomical light curves. In addition to this, we propose a new neural network architecture to remove correlation from the residuals in order to improve prediction and compensate for the noisy data. Finally, I show how to correctly set hyperparameters for a stable and performant solution. In this work, we circumvent the difficulty of manual tuning by optimizing ESN hyperparameters using Bayesian optimization with Gaussian Process priors. This automates the tuning procedure, enabling users to employ the power of RNNs without needing an in-depth understanding of the tuning process.

  20. The British Lower Palaeolithic of the early Middle Pleistocene

    Science.gov (United States)

    Hosfield, Robert

    2011-06-01

    The archaeology of Britain during the early Middle Pleistocene (MIS 19-12) is represented by a number of key sites across eastern and southern England. These sites include Pakefield, Happisburgh 1, High Lodge, Warren Hill, Waverley Wood, Boxgrove, Kent's Cavern, and Westbury-sub-Mendip, alongside a 'background scatter' lithic record associated with the principal river systems (Bytham, pre-diversion Thames, and Solent) and raised beaches (Westbourne-Arundel). Hominin behaviour can be characterised in terms of: preferences for temperate or cool temperate climates and open/woodland mosaic habitats (indicated by mammalian fauna, mollusca, insects, and sediments); a biface-dominated material culture characterised by technological diversity, although with accompanying evidence for distinctive core and flake (Pakefield) and flake tool (High Lodge) assemblages; probable direct hunting-based subsistence strategies (with a focus upon large mammal fauna); and generally locally-focused spatial and landscape behaviours (principally indicated by raw material sources data), although with some evidence of dynamic, mobile and structured technological systems. The British data continues to support a 'modified short chronology' to the north of the Alps and the Pyrenees, with highly sporadic evidence for a hominin presence prior to 500-600 ka, although the ages of key assemblages are subject to ongoing debates regarding the chronology of the Bytham river terraces and the early Middle Pleistocene glaciations of East Anglia.

  1. Transition Icons for Time-Series Visualization and Exploratory Analysis.

    Science.gov (United States)

    Nickerson, Paul V; Baharloo, Raheleh; Wanigatunga, Amal A; Manini, Todd M; Tighe, Patrick J; Rashidi, Parisa

    2018-03-01

    The modern healthcare landscape has seen the rapid emergence of techniques and devices that temporally monitor and record physiological signals. The prevalence of time-series data within the healthcare field necessitates the development of methods that can analyze the data in order to draw meaningful conclusions. Time-series behavior is notoriously difficult to intuitively understand due to its intrinsic high-dimensionality, which is compounded in the case of analyzing groups of time series collected from different patients. Our framework, which we call transition icons, renders common patterns in a visual format useful for understanding the shared behavior within groups of time series. Transition icons are adept at detecting and displaying subtle differences and similarities, e.g., between measurements taken from patients receiving different treatment strategies or stratified by demographics. We introduce various methods that collectively allow for exploratory analysis of groups of time series, while being free of distribution assumptions and including simple heuristics for parameter determination. Our technique extracts discrete transition patterns from symbolic aggregate approXimation representations, and compiles transition frequencies into a bag of patterns constructed for each group. These transition frequencies are normalized and aligned in icon form to intuitively display the underlying patterns. We demonstrate the transition icon technique for two time-series datasets-postoperative pain scores, and hip-worn accelerometer activity counts. We believe transition icons can be an important tool for researchers approaching time-series data, as they give rich and intuitive information about collective time-series behaviors.
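
    The core of the approach (symbolize each series with SAX, then count symbol-to-symbol transitions into a bag of patterns) can be sketched as follows. This is a simplified illustration with a three-symbol alphabet and fixed breakpoints, not the full transition-icon pipeline, and the function names are illustrative.

    ```python
    import numpy as np
    from collections import Counter

    def sax_symbols(series, n_segments=20, alphabet="abc"):
        """Z-normalize, apply piecewise aggregate approximation, then symbolize."""
        z = (series - series.mean()) / series.std()
        paa = z[: n_segments * (len(z) // n_segments)].reshape(n_segments, -1).mean(axis=1)
        breakpoints = [-0.43, 0.43]  # equiprobable bins under a standard normal assumption
        return "".join(alphabet[np.searchsorted(breakpoints, v)] for v in paa)

    def transition_frequencies(series):
        """Bag of symbol-to-symbol transition patterns, normalized to frequencies."""
        s = sax_symbols(series)
        counts = Counter(s[i:i + 2] for i in range(len(s) - 1))
        total = sum(counts.values())
        return {pattern: c / total for pattern, c in counts.items()}

    rng = np.random.default_rng(1)
    print(transition_frequencies(np.cumsum(rng.standard_normal(400))))
    ```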

  2. Upper Middle Pleistocene climate and landscape development of Northern Germany

    Science.gov (United States)

    Urban, B.

    2009-04-01

    The Pleistocene sequence of the Schöningen lignite mine contains a number of interglacial and interstadial limnic and peat deposits, travertine tuff, soils, tills and fluvioglacial sediments as well as loess deposits. The complex Quaternary sequence contains six major cycles with evidence of four interglacials younger than the Elsterian glaciation and preceding the Holocene. The sequence begins with Late Elsterian glacial and three interstadial deposits formed in shallow basins. Cycle I is assigned to late parts of the Holsteinian interglacial. A strong cooling is recorded by a significant increase of Artemisia and grasses during the following Buschhaus A Stadial, which is considered to mark the onset of the Saalian Complex sensu lato (penultimate glacial-complex). The lacustrine sediments of Cycle II, Reinsdorf interglacial sequence (Urban, 1995), have been found to occur at archaeological sites Schöningen 12 and 13 (Thieme,1997). Recent investigations give evidence for at least 13 Local Pollen Assemblage Zones showing a five-fold division of the interglacial and a sequence of five climatic oscillations following the interglacial (Urban, 2006). From the relative high values for grasses and herbs in the inferred forested periods of the interglacial, a warm dry forest steppe climate can be deduced. The stratigraphic position of throwing spears (Thieme, 1997), can clearly be allocated to Reinsdorf Interstadial B (level II-4) characterized by an open pine-birch forest. Uppermost parts (level II-5) represent the transition into a periglacial environment indicating the definite end of cycle II. The Schöningen Interglacial (Cycle III) represents the youngest of the pre-Drenthe (Early Saalian Stadial) interglacials (Urban, 1995). In summary, it can be concluded that the Middle Pleistocene terrestrial pollen record of the Schöningen sequence represents tentative correlatives of MIS 7, 9 and 11. North of Leck (North Friesland, Schleswig-Holstein) sediments of the centre

  3. Multifractal analysis of visibility graph-based Ito-related connectivity time series.

    Science.gov (United States)

    Czechowski, Zbigniew; Lovallo, Michele; Telesca, Luciano

    2016-02-01

    In this study, we investigate multifractal properties of connectivity time series resulting from the visibility graph applied to normally distributed time series generated by the Ito equations with multiplicative power-law noise. We show that multifractality of the connectivity time series (i.e., the series of the number of links outgoing from each node) increases with the exponent of the power-law noise. The multifractality of the connectivity time series could be due to the width of the connectivity degree distribution, which can be related to the exit time of the associated Ito time series. Furthermore, the connectivity time series are characterized by persistence, although the original Ito time series are random; this is due to the visibility graph procedure, which, by connecting the values of the time series, generates persistence but destroys most of the nonlinear correlations. Moreover, the visibility graph is sensitive in detecting wide "depressions" in the input time series.
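
    For reference, a minimal natural visibility graph construction that returns the connectivity (degree) series described above; an O(n^2) brute-force visibility check is used for clarity, not efficiency, and the example input is synthetic.

    ```python
    import numpy as np

    def visibility_degree_series(y):
        """Degree (number of links) of each node in the natural visibility graph."""
        n = len(y)
        degree = np.zeros(n, dtype=int)
        for a in range(n):
            for b in range(a + 1, n):
                # a and b are connected if every intermediate sample lies strictly
                # below the straight line joining (a, y[a]) and (b, y[b]).
                visible = all(
                    y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                    for c in range(a + 1, b)
                )
                if visible:
                    degree[a] += 1
                    degree[b] += 1
        return degree

    rng = np.random.default_rng(0)
    print(visibility_degree_series(rng.standard_normal(50)))
    ```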

  4. Mathematical foundations of time series analysis a concise introduction

    CERN Document Server

    Beran, Jan

    2017-01-01

    This book provides a concise introduction to the mathematical foundations of time series analysis, with an emphasis on mathematical clarity. The text is reduced to the essential logical core, mostly using the symbolic language of mathematics, thus enabling readers to very quickly grasp the essential reasoning behind time series analysis. It appeals to anybody wanting to understand time series in a precise, mathematical manner. It is suitable for graduate courses in time series analysis but is equally useful as a reference work for students and researchers alike.

  5. Time series analysis in the social sciences the fundamentals

    CERN Document Server

    Shin, Youseop

    2017-01-01

    Times Series Analysis in the Social Sciences is a practical and highly readable introduction written exclusively for students and researchers whose mathematical background is limited to basic algebra. The book focuses on fundamental elements of time series analysis that social scientists need to understand so they can employ time series analysis for their research and practice. Through step-by-step explanations and using monthly violent crime rates as case studies, this book explains univariate time series from the preliminary visual analysis through the modeling of seasonality, trends, and re

  6. Data imputation analysis for Cosmic Rays time series

    Science.gov (United States)

    Fernandes, R. C.; Lucio, P. S.; Fernandez, J. H.

    2017-05-01

    The occurrence of missing data in Galactic Cosmic Ray (GCR) time series is inevitable, since data are lost to mechanical and human failures, technical problems, and the different periods of operation of GCR stations. The aim of this study was to perform multiple imputation of the dataset in order to reconstruct the observational record. The study used the monthly time series of GCR Climax (CLMX) and Roma (ROME) from 1960 to 2004 to simulate scenarios of 10%, 20%, 30%, 40%, 50%, 60%, 70%, 80% and 90% missing data relative to the observed ROME series, with 50 replicates; the CLMX station was then used as a proxy for allocating these scenarios. Three different methods for monthly dataset imputation were selected: AMELIA II, which runs the bootstrap Expectation Maximization algorithm; MICE, which runs an algorithm via Multivariate Imputation by Chained Equations; and MTSDI, an Expectation Maximization algorithm-based method for imputation of missing values in multivariate normal time series. The synthetic time series were compared with the observed ROME series using several skill measures, such as RMSE, NRMSE, the Agreement Index, R, R2, the F-test and the t-test. The results showed that for CLMX and ROME, the R2 and R statistics were equal to 0.98 and 0.96, respectively. Increases in the number of gaps were observed to degrade the quality of the time series. Data imputation was most efficient with the MTSDI method, with negligible errors and the best skill coefficients. The results suggest a limit of about 60% missing data for imputation of monthly averages. It is noteworthy that the CLMX, ROME and KIEL stations present no missing data in the target period. This methodology allowed 43 time series to be reconstructed.
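
    The skill measures used to compare imputed and observed series are straightforward to compute. The sketch below covers RMSE, NRMSE, Willmott's index of agreement, and Pearson R; the function name and test data are illustrative, not part of the cited packages.

    ```python
    import numpy as np

    def imputation_skill(observed, imputed):
        """Basic skill measures for comparing an imputed series with observations."""
        obs, imp = np.asarray(observed, float), np.asarray(imputed, float)
        err = imp - obs
        rmse = np.sqrt(np.mean(err ** 2))
        nrmse = rmse / (obs.max() - obs.min())
        # Willmott's index of agreement
        denom = np.sum((np.abs(imp - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
        agreement = 1.0 - np.sum(err ** 2) / denom
        r = np.corrcoef(obs, imp)[0, 1]
        return {"RMSE": rmse, "NRMSE": nrmse, "d": agreement, "R": r, "R2": r ** 2}

    obs = np.array([3.1, 2.9, 3.4, 3.8, 3.6])
    imp = np.array([3.0, 3.0, 3.3, 3.9, 3.5])
    print(imputation_skill(obs, imp))
    ```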

  7. Algorithm for Compressing Time-Series Data

    Science.gov (United States)

    Hawkins, S. Edward, III; Darlington, Edward Hugo

    2012-01-01

    An algorithm based on Chebyshev polynomials effects lossy compression of time-series data or other one-dimensional data streams (e.g., spectral data) that are arranged in blocks for sequential transmission. The algorithm was developed for use in transmitting data from spacecraft scientific instruments to Earth stations. In spite of its lossy nature, the algorithm preserves the information needed for scientific analysis. The algorithm is computationally simple, yet compresses data streams by factors much greater than two. The algorithm is not restricted to spacecraft or scientific uses: it is applicable to time-series data in general. The algorithm can also be applied to general multidimensional data that have been converted to time-series data, a typical example being image data acquired by raster scanning. However, unlike most prior image-data-compression algorithms, this algorithm neither depends on nor exploits the two-dimensional spatial correlations that are generally present in images. In order to understand the essence of this compression algorithm, it is necessary to understand that the net effect of this algorithm and the associated decompression algorithm is to approximate the original stream of data as a sequence of finite series of Chebyshev polynomials. For the purpose of this algorithm, a block of data or interval of time for which a Chebyshev polynomial series is fitted to the original data is denoted a fitting interval. Chebyshev approximation has two properties that make it particularly effective for compressing serial data streams with minimal loss of scientific information: The errors associated with a Chebyshev approximation are nearly uniformly distributed over the fitting interval (this is known in the art as the "equal error property"); and the maximum deviations of the fitted Chebyshev polynomial from the original data have the smallest possible values (this is known in the art as the "min-max property").
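
    The essence of the scheme (fit a low-order Chebyshev series over each fitting interval and keep only the coefficients) can be sketched with NumPy; the block length and polynomial degree below are illustrative, not the flight parameters.

    ```python
    import numpy as np
    from numpy.polynomial import chebyshev as C

    def compress_block(block, degree=8):
        """Fit a Chebyshev series over the block; the coefficients are the payload."""
        x = np.linspace(-1.0, 1.0, len(block))   # map the fitting interval to [-1, 1]
        return C.chebfit(x, block, degree)

    def decompress_block(coeffs, n_samples):
        """Reconstruct the block from its Chebyshev coefficients."""
        x = np.linspace(-1.0, 1.0, n_samples)
        return C.chebval(x, coeffs)

    t = np.linspace(0, 1, 256)
    block = np.exp(-3 * t) * np.sin(20 * t)      # a smooth 256-sample block
    coeffs = compress_block(block)               # 256 samples -> 9 coefficients
    recon = decompress_block(coeffs, len(block))
    print(len(coeffs), np.max(np.abs(recon - block)))  # size and worst-case error
    ```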

  8. Modeling of Volatility with Non-linear Time Series Model

    OpenAIRE

    Kim Song Yon; Kim Mun Chol

    2013-01-01

    In this paper, non-linear time series models are used to describe volatility in financial time series data. To describe volatility, two non-linear time series models are combined to form a TAR (Threshold Auto-Regressive) model with an AARCH (Asymmetric Auto-Regressive Conditional Heteroskedasticity) error term, and its parameter estimation is studied.

  9. Interrupted time-series analysis of regulations to reduce paracetamol (acetaminophen) poisoning.

    Directory of Open Access Journals (Sweden)

    Oliver W Morgan

    2007-04-01

    Full Text Available Paracetamol (acetaminophen) poisoning is the leading cause of acute liver failure in Great Britain and the United States. Successful interventions to reduce harm from paracetamol poisoning are needed. To achieve this, the government of the United Kingdom introduced legislation in 1998 limiting the pack size of paracetamol sold in shops. Several studies have reported recent decreases in fatal poisonings involving paracetamol. We use interrupted time-series analysis to evaluate whether the recent fall in the number of paracetamol deaths differs from trends in fatal poisoning involving aspirin, paracetamol compounds, antidepressants, or nondrug poisoning suicide. We calculated directly age-standardised mortality rates for paracetamol poisoning in England and Wales from 1993 to 2004. We used an ordinary least-squares regression model divided into pre- and post-intervention segments at 1999. The model included a term for autocorrelation within the time series. We tested for changes in the level and slope between the pre- and post-intervention segments. To assess whether observed changes in the time series were unique to paracetamol, we compared against poisoning deaths involving compound paracetamol (not covered by the regulations), aspirin, antidepressants, and nonpoisoning suicide deaths. We did this comparison by calculating a ratio of each comparison series with paracetamol and applying a segmented regression model to the ratios. No change in the ratio level or slope indicated no difference compared to the control series. There were about 2,200 deaths involving paracetamol. The age-standardised mortality rate rose from 8.1 per million in 1993 to 8.8 per million in 1997, subsequently falling to about 5.3 per million in 2004. After the regulations were introduced, deaths dropped by 2.69 per million (p = 0.003). Trends in the age-standardised mortality rate for paracetamol compounds, aspirin, and antidepressants were broadly similar to paracetamol
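
    A minimal segmented-regression sketch of the interrupted time-series idea is shown below, with level-change and slope-change terms around an intervention year. The data are synthetic and the model is a simplified version of the one described (no autocorrelation term).

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Synthetic annual mortality rates (per million), intervention taking effect in 1999
    years = np.arange(1993, 2005)
    rates = np.array([8.1, 8.3, 8.5, 8.6, 8.8, 8.7, 6.4, 6.1, 5.9, 5.7, 5.5, 5.3])

    time = years - years[0]                 # time since series start (pre-intervention trend)
    post = (years >= 1999).astype(float)    # level-change indicator
    time_post = post * (years - 1999)       # slope change after the intervention

    X = sm.add_constant(np.column_stack([time, post, time_post]))
    fit = sm.OLS(rates, X).fit()
    print(fit.params)    # intercept, pre-trend, level change, slope change
    print(fit.pvalues)
    ```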

  10. Layered Ensemble Architecture for Time Series Forecasting.

    Science.gov (United States)

    Rahman, Md Mustafizur; Islam, Md Monirul; Murase, Kazuyuki; Yao, Xin

    2016-01-01

    Time series forecasting (TSF) has been widely used in many application areas such as science, engineering, and finance. The phenomena generating time series are usually unknown, and the information available for forecasting is limited to the past values of the series. It is, therefore, necessary to use an appropriate number of past values, termed the lag, for forecasting. This paper proposes a layered ensemble architecture (LEA) for TSF problems. Our LEA consists of two layers, each of which uses an ensemble of multilayer perceptron (MLP) networks. While the first ensemble layer tries to find an appropriate lag, the second ensemble layer employs the obtained lag for forecasting. Unlike most previous work on TSF, the proposed architecture considers both accuracy and diversity of the individual networks in constructing an ensemble. LEA trains different networks in the ensemble by using different training sets, with the aim of maintaining diversity among the networks. However, it uses the appropriate lag and combines the best trained networks to construct the ensemble. This indicates LEA's emphasis on the accuracy of the networks. The proposed architecture has been tested extensively on time series data from the NN3 and NN5 neural network competitions. It has also been tested on several standard benchmark time series datasets. In terms of forecasting accuracy, our experimental results clearly reveal that LEA is better than other ensemble and non-ensemble methods.

  11. A window-based time series feature extraction method.

    Science.gov (United States)

    Katircioglu-Öztürk, Deniz; Güvenir, H Altay; Ravens, Ursula; Baykal, Nazife

    2017-10-01

    This study proposes a robust similarity score-based time series feature extraction method that is termed as Window-based Time series Feature ExtraCtion (WTC). Specifically, WTC generates domain-interpretable results and involves significantly low computational complexity thereby rendering itself useful for densely sampled and populated time series datasets. In this study, WTC is applied to a proprietary action potential (AP) time series dataset on human cardiomyocytes and three precordial leads from a publicly available electrocardiogram (ECG) dataset. This is followed by comparing WTC in terms of predictive accuracy and computational complexity with shapelet transform and fast shapelet transform (which constitutes an accelerated variant of the shapelet transform). The results indicate that WTC achieves a slightly higher classification performance with significantly lower execution time when compared to its shapelet-based alternatives. With respect to its interpretable features, WTC has a potential to enable medical experts to explore definitive common trends in novel datasets. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Prewhitening of hydroclimatic time series? Implications for inferred change and variability across time scales

    Science.gov (United States)

    Razavi, Saman; Vogel, Richard

    2018-02-01

    Prewhitening, the process of eliminating or reducing short-term stochastic persistence to enable detection of deterministic change, has been extensively applied to time series analysis of a range of geophysical variables. Despite the controversy around its utility, methodologies for prewhitening time series continue to be a critical feature of a variety of analyses including: trend detection of hydroclimatic variables and reconstruction of climate and/or hydrology through proxy records such as tree rings. With a focus on the latter, this paper presents a generalized approach to exploring the impact of a wide range of stochastic structures of short- and long-term persistence on the variability of hydroclimatic time series. Through this approach, we examine the impact of prewhitening on the inferred variability of time series across time scales. We document how a focus on prewhitened, residual time series can be misleading, as it can drastically distort (or remove) the structure of variability across time scales. Through examples with actual data, we show how such loss of information in prewhitened time series of tree rings (so-called "residual chronologies") can lead to the underestimation of extreme conditions in climate and hydrology, particularly droughts, reconstructed for centuries preceding the historical period.
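
    For context, the most common prewhitening step removes lag-1 (AR(1)) persistence before trend testing; a minimal sketch is given below with synthetic data. The paper's point is precisely that such residual series can misrepresent variability across time scales, so this is the operation under scrutiny, not a recommendation.

    ```python
    import numpy as np

    def prewhiten_ar1(x):
        """Remove lag-1 persistence: r_t = x_t - phi * x_{t-1}."""
        x = np.asarray(x, float)
        xc = x - x.mean()
        phi = np.sum(xc[1:] * xc[:-1]) / np.sum(xc[:-1] ** 2)  # lag-1 autocorrelation
        residual = x[1:] - phi * x[:-1]
        return phi, residual

    rng = np.random.default_rng(0)
    ar = np.zeros(500)
    for t in range(1, 500):
        ar[t] = 0.7 * ar[t - 1] + rng.standard_normal()   # AR(1) persistence
    x = ar + 0.005 * np.arange(500)                       # plus a weak deterministic trend

    phi, resid = prewhiten_ar1(x)
    print(round(phi, 2), round(resid.var(), 2), round(x.var(), 2))  # variance is reduced
    ```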

  13. Signatures of Late Pleistocene fluvial incision in an Alpine landscape

    Science.gov (United States)

    Leith, Kerry; Fox, Matthew; Moore, Jeffrey R.

    2018-02-01

    Uncertainty regarding the relative efficacy of fluvial and glacial erosion has hindered attempts to quantitatively analyse the Pleistocene evolution of alpine landscapes. Here we show that the morphology of major tributaries of the Rhone River, Switzerland, is consistent with that predicted for a landscape shaped primarily by multiple phases of fluvial incision following a period of intense glacial erosion after the mid-Pleistocene transition (∼0.7 Ma). This is despite major ice sheets reoccupying the region during cold intervals since the mid-Pleistocene. We use high-resolution LiDAR data to identify a series of convex reaches within the long-profiles of 18 tributary channels. We propose these reaches represent knickpoints, which developed as regional uplift raised tributary bedrock channels above the local fluvial baselevel during glacial intervals, and migrated upstream as the fluvial system was re-established during interglacial periods. Using a combination of integral long-profile analysis and stream-power modelling, we find that the locations of ∼80% of knickpoints in our study region are consistent with that predicted for a fluvial origin, while the mean residual error over ∼100 km of modelled channels is just 26.3 m. Breaks in cross-valley profiles project toward the elevation of former end-of-interglacial channel elevations, supporting our model results. Calculated long-term uplift rates are within ∼15% of present-day measurements, while modelled rates of bedrock incision range from ∼1 mm/yr for low gradient reaches between knickpoints to ∼6-10 mm/yr close to retreating knickpoints, typical of observed rates in alpine settings. Together, our results reveal approximately 800 m of regional uplift, river incision, and hillslope erosion in the lower half of each tributary catchment since 0.7 Ma.

  14. DTW-APPROACH FOR UNCORRELATED MULTIVARIATE TIME SERIES IMPUTATION

    OpenAIRE

    Phan, Thi-Thu-Hong; Poisson Caillault, Emilie; Bigand, André; Lefebvre, Alain

    2017-01-01

    International audience; Missing data are inevitable in almost all domains of applied science. Data analysis with missing values can lead to a loss of efficiency and unreliable results, especially for large missing sub-sequence(s). Some well-known methods for multivariate time series imputation require high correlations between series or their features. In this paper, we propose an approach based on the shape-behaviour relation in low/un-correlated multivariate time series under an assumption of...
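
    DTW itself, the building block named in the title, is compact to state. Below is a minimal dynamic-programming implementation (no window constraint), given only as a reference point; it is not the proposed imputation method.

    ```python
    import numpy as np

    def dtw_distance(a, b):
        """Classic dynamic time warping distance between two 1-D sequences."""
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    print(dtw_distance([0, 1, 2, 3, 2, 1], [0, 0, 1, 2, 3, 2, 1, 0]))
    ```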

  15. Variable Selection in Time Series Forecasting Using Random Forests

    Directory of Open Access Journals (Sweden)

    Hristos Tyralis

    2017-10-01

    Full Text Available Time series forecasting using machine learning algorithms has gained popularity recently. Random forest is a machine learning algorithm implemented in time series forecasting; however, most of its forecasting properties have remained unexplored. Here we focus on assessing the performance of random forests in one-step forecasting using two large datasets of short time series, with the aim of suggesting an optimal set of predictor variables. Furthermore, we compare its performance to benchmarking methods. The first dataset is composed of 16,000 simulated time series from a variety of Autoregressive Fractionally Integrated Moving Average (ARFIMA) models. The second dataset consists of 135 mean annual temperature time series. The highest predictive performance of RF is observed when using a low number of recent lagged predictor variables. This outcome could be useful in relevant future applications, with the prospect of achieving higher predictive accuracy.
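
    The one-step-ahead setup with a few recent lags as predictors can be reproduced in a few lines with scikit-learn. The lag count, forest size, and synthetic series below are illustrative assumptions standing in for the study's datasets.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    def lagged_matrix(series, n_lags):
        """Build (X, y) where each row of X holds the n_lags values preceding the target."""
        X = np.column_stack([series[i:len(series) - n_lags + i] for i in range(n_lags)])
        y = series[n_lags:]
        return X, y

    rng = np.random.default_rng(42)
    # Synthetic autocorrelated series standing in for a mean annual temperature record
    s = np.zeros(300)
    for t in range(2, 300):
        s[t] = 0.6 * s[t - 1] - 0.2 * s[t - 2] + rng.standard_normal()

    X, y = lagged_matrix(s, n_lags=3)   # a low number of recent lags, as the study suggests
    X_train, X_test, y_train, y_test = X[:250], X[250:], y[:250], y[250:]
    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
    print("one-step RMSE:", np.sqrt(np.mean((rf.predict(X_test) - y_test) ** 2)))
    ```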

  16. Trend time-series modeling and forecasting with neural networks.

    Science.gov (United States)

    Qi, Min; Zhang, G Peter

    2008-05-01

    Despite its great importance, there has been no general consensus on how to model the trends in time-series data. Compared to traditional approaches, neural networks (NNs) have shown some promise in time-series forecasting. This paper investigates how to best model trend time series using NNs. Four different strategies (raw data, raw data with time index, detrending, and differencing) are used to model various trend patterns (linear, nonlinear, deterministic, stochastic, and breaking trend). We find that with NNs differencing often gives meritorious results regardless of the underlying data generating processes (DGPs). This finding is also confirmed by the real gross national product (GNP) series.

  17. Paleomagnetism and geochronology of the Pliocene-Pleistocene lavas in Iceland

    NARCIS (Netherlands)

    McDougall, Ian; Wensink, H.

    Potassium-argon dates are reported on five basalt samples from the Pliocene-Pleistocene sequence of lavas in the Jökuldalur area, northeastern Iceland. These dates confirm the correlations previously made with the geological time scale by means of paleomagnetic stratigraphy. The R1 and N2 polarity

  18. Paleoscatology in the Sopas Formation (Upper Pleistocene) from Uruguay: a paleobiologic focus

    International Nuclear Information System (INIS)

    Verde, M.; Ubilla, M.; Soloviy, J.

    1998-01-01

    Continental tetrapod coprolites are reported for the first time for Uruguay; these remains come from the Sopas Formation (Upper Pleistocene). They are assigned to carnivorous mammals based on morphology and inclusions of micromammal remains, besides other attributes. (author)

  19. Segmentation of Nonstationary Time Series with Geometric Clustering

    DEFF Research Database (Denmark)

    Bocharov, Alexei; Thiesson, Bo

    2013-01-01

    We introduce a non-parametric method for segmentation in regime-switching time-series models. The approach is based on spectral clustering of target-regressor tuples and derives a switching regression tree, where regime switches are modeled by oblique splits. Such models can be learned efficiently... from data, where clustering is used to propose one single split candidate at each split level. We use the class of ART time series models to serve as illustration, but because of the non-parametric nature of our segmentation approach, it readily generalizes to a wide range of time-series models that go...

  20. Non-parametric characterization of long-term rainfall time series

    Science.gov (United States)

    Tiwari, Harinarayan; Pandey, Brij Kishor

    2018-03-01

    The statistical study of rainfall time series is one of the approaches for efficient hydrological system design. Identifying and characterizing long-term rainfall time series could aid in improving hydrological forecasting. In the present study, eventual statistics were applied to the long-term (1851-2006) rainfall time series of seven meteorological regions of India. Linear trend analysis was carried out using the Mann-Kendall test for the observed rainfall series. The observed trend from this approach was then verified using the innovative trend analysis method. Innovative trend analysis has been found to be a strong tool to detect the general trend of rainfall time series. The sequential Mann-Kendall test has also been carried out to examine nonlinear trends in the series. The partial sum of cumulative deviation test is also found to be suitable to detect nonlinear trends. Innovative trend analysis, the sequential Mann-Kendall test and the partial cumulative deviation test have the potential to detect general as well as nonlinear trends in rainfall time series. Annual rainfall analysis suggests that the maximum increase in mean rainfall is 11.53% for West Peninsular India, whereas the maximum fall in mean rainfall is 7.8% for the North Mountainous India region. The innovative trend analysis method is also capable of finding the number of change points present in the time series. Additionally, we have performed the von Neumann ratio test and the cumulative deviation test to estimate the departure from homogeneity. Singular spectrum analysis has been applied in this study to evaluate the order of departure from homogeneity in the rainfall time series. The monsoon season (JS) of the North Mountainous India and West Peninsular India zones has a higher departure from homogeneity, and the singular spectrum analysis results are coherent with this finding.
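
    A minimal Mann-Kendall test (standard normal approximation, no tie correction) is sketched below on synthetic annual rainfall; the innovative trend analysis and sequential variants used in the study are not reproduced here.

    ```python
    import numpy as np
    from scipy.stats import norm

    def mann_kendall(x):
        """Mann-Kendall trend test: S statistic, Z score and two-sided p-value."""
        x = np.asarray(x, float)
        n = len(x)
        s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
        var_s = n * (n - 1) * (2 * n + 5) / 18.0     # variance of S, ignoring ties
        if s > 0:
            z = (s - 1) / np.sqrt(var_s)
        elif s < 0:
            z = (s + 1) / np.sqrt(var_s)
        else:
            z = 0.0
        return s, z, 2 * (1 - norm.cdf(abs(z)))

    rng = np.random.default_rng(3)
    rain = 900 + 0.8 * np.arange(150) + 40 * rng.standard_normal(150)  # synthetic annual rainfall
    print(mann_kendall(rain))
    ```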

  1. Time Series Decomposition into Oscillation Components and Phase Estimation.

    Science.gov (United States)

    Matsuda, Takeru; Komaki, Fumiyasu

    2017-02-01

    Many time series are naturally considered as a superposition of several oscillation components. For example, electroencephalogram (EEG) time series include oscillation components such as alpha, beta, and gamma. We propose a method for decomposing time series into such oscillation components using state-space models. Based on the concept of random frequency modulation, Gaussian linear state-space models for oscillation components are developed. In this model, the frequency of an oscillator fluctuates with noise. Time series decomposition is accomplished by this model in a manner similar to the Bayesian seasonal adjustment method. Since the model parameters are estimated from data by the empirical Bayes method, the amplitudes and the frequencies of oscillation components are determined in a data-driven manner. Also, the appropriate number of oscillation components is determined with the Akaike information criterion (AIC). In this way, the proposed method provides a natural decomposition of the given time series into oscillation components. In neuroscience, the phase of neural time series plays an important role in neural information processing. The proposed method can be used to estimate the phase of each oscillation component and has several advantages over a conventional method based on the Hilbert transform. Thus, the proposed method enables an investigation of the phase dynamics of time series. Numerical results show that the proposed method succeeds in extracting intermittent oscillations like ripples and detecting the phase reset phenomena. We apply the proposed method to real data from various fields such as astronomy, ecology, tidology, and neuroscience.
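
    The conventional Hilbert-transform phase mentioned as the baseline is easy to illustrate: a narrowband component is typically band-pass filtered first and the analytic-signal phase taken. This is a sketch of that baseline, not the state-space decomposition proposed in the paper; the sampling rate, band, and signal are illustrative.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    fs = 250.0                                   # sampling rate in Hz (illustrative)
    t = np.arange(0, 10, 1 / fs)
    x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)   # noisy 10 Hz "alpha"

    # Band-pass around the alpha band, then take the analytic-signal phase
    b, a = butter(4, [8 / (fs / 2), 12 / (fs / 2)], btype="band")
    alpha = filtfilt(b, a, x)
    phase = np.angle(hilbert(alpha))             # instantaneous phase in radians
    amplitude = np.abs(hilbert(alpha))           # instantaneous amplitude envelope
    print(phase[:5], amplitude[:5])
    ```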

  2. Plio-Pleistocene basanitic and melilititic series of the Bohemian Massif: K-Ar ages, major/trace element and Sr–Nd isotopic data

    Czech Academy of Sciences Publication Activity Database

    Ulrych, Jaromír; Ackerman, Lukáš; Balogh, K.; Hegner, E.; Jelínek, E.; Pécskay, Z.; Přichystal, A.; Upton, B. G. J.; Zimák, J.; Foltýnová, R.

    2013-01-01

    Roč. 73, č. 4 (2013), s. 429-450 ISSN 0009-2819 Institutional support: RVO:67985831 Keywords : Bohemian Massif * Plio-Pleistocene * Basanite * Melilitite * K-Ar age * Magmatism * Sr–Nd isotopes Subject RIV: DD - Geochemistry Impact factor: 1.397, year: 2013

  3. Introduction to time series analysis and forecasting

    CERN Document Server

    Montgomery, Douglas C; Kulahci, Murat

    2015-01-01

    Praise for the First Edition: "…[t]he book is great for readers who need to apply the methods and models presented but have little background in mathematics and statistics." -MAA Reviews Thoroughly updated throughout, Introduction to Time Series Analysis and Forecasting, Second Edition presents the underlying theories of time series analysis that are needed to analyze time-oriented data and construct real-world short- to medium-term statistical forecasts.    Authored by highly-experienced academics and professionals in engineering statistics, the Second Edition features discussions on both

  4. Multi-Scale Dissemination of Time Series Data

    DEFF Research Database (Denmark)

    Guo, Qingsong; Zhou, Yongluan; Su, Li

    2013-01-01

    In this paper, we consider the problem of continuous dissemination of time series data, such as sensor measurements, to a large number of subscribers. These subscribers fall into multiple subscription levels, where each subscription level is specified by the bandwidth constraint of a subscriber......, which is an abstract indicator for both the physical limits and the amount of data that the subscriber would like to handle. To handle this problem, we propose a system framework for multi-scale time series data dissemination that employs a typical tree-based dissemination network and existing time...

  5. RADON CONCENTRATION TIME SERIES MODELING AND APPLICATION DISCUSSION.

    Science.gov (United States)

    Stránský, V; Thinová, L

    2017-11-01

    In 2010, continual radon measurement was established at the Mladeč Caves in the Czech Republic using a continual radon monitor RADIM3A. In order to model the radon time series for the years 2010-15, the Box-Jenkins methodology, often used in econometrics, was applied. Because of the behavior of radon concentrations (RCs), a seasonal integrated autoregressive moving average model with exogenous variables (SARIMAX) was chosen to model the measured time series. This model uses the seasonality of the time series, previously acquired values and delayed atmospheric parameters to forecast RC. The developed model for the RC time series is called regARIMA(5,1,3). Model residuals could be retrospectively compared with seismic evidence of local or global earthquakes that occurred during the RC measurements. This technique enables us to assess whether continuously measured RC could serve as an earthquake precursor. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
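
    A fitting call of this general form (seasonal ARIMA with exogenous atmospheric predictors) can be written with statsmodels. The orders, regressors, and synthetic data below are placeholders, not the regARIMA(5,1,3) specification of the study.

    ```python
    import numpy as np
    from statsmodels.tsa.statespace.sarimax import SARIMAX

    rng = np.random.default_rng(0)
    n = 500
    temp = 10 + 8 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.standard_normal(n)
    pressure = 1013 + 5 * rng.standard_normal(n)
    radon = 50 - 1.5 * temp + 0.2 * pressure + 3 * rng.standard_normal(n)  # synthetic RC

    exog = np.column_stack([temp, pressure])   # delayed atmospheric parameters would go here
    model = SARIMAX(radon, exog=exog, order=(1, 0, 1), seasonal_order=(1, 0, 0, 7))
    fit = model.fit(disp=False)
    print(fit.summary().tables[1])

    # One-week-ahead forecast; the last week of regressors is reused purely for illustration
    forecast = fit.forecast(steps=7, exog=exog[-7:])
    print(forecast)
    ```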

  6. Similarity estimators for irregular and age uncertain time series

    Science.gov (United States)

    Rehfeld, K.; Kurths, J.

    2013-09-01

    Paleoclimate time series are often irregularly sampled and age uncertain, which is an important technical challenge to overcome for successful reconstruction of past climate variability and dynamics. Visual comparison and interpolation-based linear correlation approaches have been used to infer dependencies from such proxy time series. While the first is subjective, not measurable and not suitable for the comparison of many datasets at a time, the latter introduces interpolation bias, and both face difficulties if the underlying dependencies are nonlinear. In this paper we investigate similarity estimators that could be suitable for the quantitative investigation of dependencies in irregular and age uncertain time series. We compare the Gaussian-kernel based cross correlation (gXCF, Rehfeld et al., 2011) and mutual information (gMI, Rehfeld et al., 2013) against their interpolation-based counterparts and the new event synchronization function (ESF). We test the efficiency of the methods in estimating coupling strength and coupling lag numerically, using ensembles of synthetic stalagmites with short, autocorrelated, linear and nonlinearly coupled proxy time series, and in the application to real stalagmite time series. In the linear test case coupling strength increases are identified consistently for all estimators, while in the nonlinear test case the correlation-based approaches fail. The lag at which the time series are coupled is identified correctly as the maximum of the similarity functions in around 60-55% (in the linear case) to 53-42% (for the nonlinear processes) of the cases when the dating of the synthetic stalagmite is perfectly precise. If the age uncertainty increases beyond 5% of the time series length, however, the true coupling lag is not identified more often than the others for which the similarity function was estimated. Age uncertainty contributes up to half of the uncertainty in the similarity estimation process. Time series irregularity

  7. Similarity estimators for irregular and age-uncertain time series

    Science.gov (United States)

    Rehfeld, K.; Kurths, J.

    2014-01-01

    Paleoclimate time series are often irregularly sampled and age uncertain, which is an important technical challenge to overcome for successful reconstruction of past climate variability and dynamics. Visual comparison and interpolation-based linear correlation approaches have been used to infer dependencies from such proxy time series. While the first is subjective, not measurable and not suitable for the comparison of many data sets at a time, the latter introduces interpolation bias, and both face difficulties if the underlying dependencies are nonlinear. In this paper we investigate similarity estimators that could be suitable for the quantitative investigation of dependencies in irregular and age-uncertain time series. We compare the Gaussian-kernel-based cross-correlation (gXCF, Rehfeld et al., 2011) and mutual information (gMI, Rehfeld et al., 2013) against their interpolation-based counterparts and the new event synchronization function (ESF). We test the efficiency of the methods in estimating coupling strength and coupling lag numerically, using ensembles of synthetic stalagmites with short, autocorrelated, linear and nonlinearly coupled proxy time series, and in the application to real stalagmite time series. In the linear test case, coupling strength increases are identified consistently for all estimators, while in the nonlinear test case the correlation-based approaches fail. The lag at which the time series are coupled is identified correctly as the maximum of the similarity functions in around 60-55% (in the linear case) to 53-42% (for the nonlinear processes) of the cases when the dating of the synthetic stalagmite is perfectly precise. If the age uncertainty increases beyond 5% of the time series length, however, the true coupling lag is not identified more often than the others for which the similarity function was estimated. Age uncertainty contributes up to half of the uncertainty in the similarity estimation process. Time series irregularity
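
    The Gaussian-kernel idea (weighting products of standardized observations by how close their time separation is to the trial lag) can be sketched as follows. This is a simplified illustration with synthetic irregular series, not the gXCF implementation of Rehfeld et al.; the function name and bandwidth are assumptions.

    ```python
    import numpy as np

    def gaussian_kernel_xcorr(tx, x, ty, y, lag, h):
        """Kernel-weighted cross-correlation of two irregularly sampled series at a trial lag.

        tx, ty: sampling times; x, y: values; h: Gaussian kernel bandwidth.
        """
        xs = (x - x.mean()) / x.std()
        ys = (y - y.mean()) / y.std()
        dt = ty[None, :] - tx[:, None] - lag          # time separations relative to the lag
        w = np.exp(-0.5 * (dt / h) ** 2)              # Gaussian weights
        return np.sum(w * xs[:, None] * ys[None, :]) / np.sum(w)

    rng = np.random.default_rng(1)
    tx = np.sort(rng.uniform(0, 100, 80))             # irregular sampling times
    ty = np.sort(rng.uniform(0, 100, 90))
    x = np.sin(2 * np.pi * tx / 20) + 0.3 * rng.standard_normal(80)
    y = np.sin(2 * np.pi * (ty - 3) / 20) + 0.3 * rng.standard_normal(90)  # lagged by 3

    lags = np.arange(-10, 11)
    xcf = [gaussian_kernel_xcorr(tx, x, ty, y, L, h=2.0) for L in lags]
    print(lags[int(np.argmax(xcf))])                  # estimated coupling lag (about 3)
    ```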

  8. Towards a New Generation of Time-Series Visualization Tools in the ESA Heliophysics Science Archives

    Science.gov (United States)

    Perez, H.; Martinez, B.; Cook, J. P.; Herment, D.; Fernandez, M.; De Teodoro, P.; Arnaud, M.; Middleton, H. R.; Osuna, P.; Arviset, C.

    2017-12-01

    During the last decades, a varied set of Heliophysics missions has allowed the scientific community to gain a better knowledge of the solar atmosphere and activity. The remote sensing images of missions such as SOHO have paved the ground for helio-based spatial data visualization software such as JHelioViewer/Helioviewer. On the other hand, the huge amount of in-situ measurements provided by other missions such as Cluster provides a wide base for plot visualization software whose reach is still far from being fully exploited. The Heliophysics Science Archives within the ESAC Science Data Center (ESDC) already provide a first generation of tools for time-series visualization focusing on each mission's needs: visualization of quicklook plots, cross-calibration time series, pre-generated/on-demand multi-plot stacks (Cluster), basic plot zoom in/out options (Ulysses), and easy navigation through the plots in time (Ulysses, Cluster, ISS-Solaces). However, as needs evolve, scientists involved in new missions require plotting of multi-variable data, heat-map stacks, interactive synchronization, and axis variable selection, among other improvements. The new Heliophysics archives (such as Solar Orbiter) and the evolution of existing ones (Cluster) intend to address these new challenges. This paper provides an overview of the different approaches for visualizing time series followed within the ESA Heliophysics Archives and their foreseen evolution.

  9. Revising time series of the Elbe river discharge for flood frequency determination at gauge Dresden

    Directory of Open Access Journals (Sweden)

    S. Bartl

    2009-11-01

    Full Text Available The German research programme RIsk MAnagement of eXtreme flood events has accomplished the improvement of regional hazard assessment for the large rivers in Germany. Here we focused on the Elbe river at its gauge Dresden, which is one of the oldest gauges in Europe, with officially available daily discharge time series beginning on 1 January 1890. The project aimed on the one hand to extend and revise the existing time series, and on the other hand to examine the variability of the Elbe river discharge conditions on a greater time scale. Therefore, one major task was the historical search for, and examination of, the retrieved documents and the information they contain. After analysing this information, the development of the river course and the discharge conditions were discussed. Using the knowledge thus provided, a historical hydraulic model was established in another subproject; its results were in turn used here. A further purpose was the determination of flood frequency based on all pre-processed data. The knowledge obtained about historical changes was also used to get an idea of possible future variations under climate change conditions. In particular, variations in the runoff characteristics of the Elbe river over the course of the year were analysed. We succeeded in obtaining a much longer discharge time series containing fewer errors and uncertainties. Hence, an optimized regional hazard assessment was realised.

  10. Revising time series of the Elbe river discharge for flood frequency determination at gauge Dresden

    Science.gov (United States)

    Bartl, S.; Schümberg, S.; Deutsch, M.

    2009-11-01

    The German research programme RIsk MAnagement of eXtreme flood events has accomplished the improvement of regional hazard assessment for the large rivers in Germany. Here we focused on the Elbe river at its gauge Dresden, which is one of the oldest gauges in Europe, with officially available daily discharge time series beginning on 1 January 1890. The project aimed on the one hand to extend and revise the existing time series, and on the other hand to examine the variability of the Elbe river discharge conditions on a greater time scale. Therefore, one major task was the historical search for, and examination of, the retrieved documents and the information they contain. After analysing this information, the development of the river course and the discharge conditions were discussed. Using the knowledge thus provided, a historical hydraulic model was established in another subproject; its results were in turn used here. A further purpose was the determination of flood frequency based on all pre-processed data. The knowledge obtained about historical changes was also used to get an idea of possible future variations under climate change conditions. In particular, variations in the runoff characteristics of the Elbe river over the course of the year were analysed. We succeeded in obtaining a much longer discharge time series containing fewer errors and uncertainties. Hence, an optimized regional hazard assessment was realised.

  11. Robust Forecasting of Non-Stationary Time Series

    OpenAIRE

    Croux, C.; Fried, R.; Gijbels, I.; Mahieu, K.

    2010-01-01

    This paper proposes a robust forecasting method for non-stationary time series. The time series is modelled using non-parametric heteroscedastic regression, and fitted by a localized MM-estimator, combining high robustness and large efficiency. The proposed method is shown to produce reliable forecasts in the presence of outliers, non-linearity, and heteroscedasticity. In the absence of outliers, the forecasts are only slightly less precise than those based on a localized Least Squares estima...

  12. Pleistocene megafaunal interaction networks became more vulnerable after human arrival.

    Science.gov (United States)

    Pires, Mathias M; Koch, Paul L; Fariña, Richard A; de Aguiar, Marcus A M; dos Reis, Sérgio F; Guimarães, Paulo R

    2015-09-07

    The end of the Pleistocene was marked by the extinction of almost all large land mammals worldwide except in Africa. Although the debate on Pleistocene extinctions has focused on the roles of climate change and humans, the impact of perturbations depends on properties of ecological communities, such as species composition and the organization of ecological interactions. Here, we combined palaeoecological and ecological data, food-web models and community stability analysis to investigate if differences between Pleistocene and modern mammalian assemblages help us understand why the megafauna died out in the Americas while persisting in Africa. We show Pleistocene and modern assemblages share similar network topology, but differences in richness and body size distributions made Pleistocene communities significantly more vulnerable to the effects of human arrival. The structural changes promoted by humans in Pleistocene networks would have increased the likelihood of unstable dynamics, which may favour extinction cascades in communities facing extrinsic perturbations. Our findings suggest that the basic aspects of the organization of ecological communities may have played an important role in major extinction events in the past. Knowledge of community-level properties and their consequences to dynamics may be critical to understand past and future extinctions. © 2015 The Author(s).

  13. Time Series Econometrics for the 21st Century

    Science.gov (United States)

    Hansen, Bruce E.

    2017-01-01

    The field of econometrics largely started with time series analysis because many early datasets were time-series macroeconomic data. As the field developed, more cross-sectional and longitudinal datasets were collected, which today dominate the majority of academic empirical research. In nonacademic (private sector, central bank, and governmental)…

  14. Effectiveness of firefly algorithm based neural network in time series ...

    African Journals Online (AJOL)

    Effectiveness of firefly algorithm based neural network in time series forecasting. ... In the experiments, three well known time series were used to evaluate the performance. Results obtained were compared with ... Keywords: Time series, Artificial Neural Network, Firefly Algorithm, Particle Swarm Optimization, Overfitting ...

  15. Phylogeography of the Alcippe morrisonia (Aves: Timaliidae): long population history beyond late Pleistocene glaciations

    Directory of Open Access Journals (Sweden)

    Li Shouhsien

    2009-06-01

    Full Text Available Abstract Background The role of Pleistocene glacial oscillations in current biodiversity and distribution patterns varies with latitude, physical topology and population life history and has long been a topic of discussion. However, there had been little phylogeographical research in south China, where the geophysical complexity is associated with great biodiversity. A bird endemic in Southeast Asia, the Grey-cheeked Fulvetta, Alcippe morrisonia, has been reported to show deep genetic divergences among its seven subspecies. In the present study, we investigated the phylogeography of A. morrisonia to explore its population structure and evolutionary history, in order to gain insight into the effect of geological events on the speciation and diversity of birds endemic in south China. Results Mitochondrial genes cytochrome b (Cytb) and cytochrome c oxidase I (COI) were represented by 1236 nucleotide sites from 151 individuals from 29 localities. Phylogenetic analysis showed seven monophyletic clades congruent with the geographically separated groups, which were identified as major sources of molecular variance (90.92%) by AMOVA. TCS analysis revealed four disconnected networks, and that no haplotype was shared among the geographical groups. The common ancestor of these populations was dated to 11.6 Mya and several divergence events were estimated along the population evolutionary history. Isolation by distance was inferred by NCPA to be responsible for the current intra-population genetic pattern and gene flow among geographical groups was interrupted. A late Pleistocene demographic expansion was detected in the eastern geographical groups, while the expansion time (0.2–0.4 Mya) was earlier than the Last Glacial Maximum. Conclusion It is proposed that the complicated topology preserves high genetic diversity and ancient lineages for geographical groups of A. morrisonia in China mainland and its two major islands, and restricts gene exchange during

  16. Time Series Analysis of Insar Data: Methods and Trends

    Science.gov (United States)

    Osmanoglu, Batuhan; Sunar, Filiz; Wdowinski, Shimon; Cano-Cabral, Enrique

    2015-01-01

    Time series analysis of InSAR data has emerged as an important tool for monitoring and measuring the displacement of the Earth's surface. Changes in the Earth's surface can result from a wide range of phenomena such as earthquakes, volcanoes, landslides, variations in ground water levels, and changes in wetland water levels. Time series analysis is applied to interferometric phase measurements, which wrap around when the observed motion is larger than one-half of the radar wavelength. Thus, the spatio-temporal "unwrapping" of phase observations is necessary to obtain physically meaningful results. Several different algorithms have been developed for time series analysis of InSAR data to solve for this ambiguity. These algorithms may employ different models for time series analysis, but they all generate a first-order deformation rate, which can be compared to each other. However, there is no single algorithm that can provide optimal results in all cases. Since time series analyses of InSAR data are used in a variety of applications with different characteristics, each algorithm possesses inherently unique strengths and weaknesses. In this review article, following a brief overview of InSAR technology, we discuss several algorithms developed for time series analysis of InSAR data using an example set of results for measuring subsidence rates in Mexico City.

  17. Interpretation of a compositional time series

    Science.gov (United States)

    Tolosana-Delgado, R.; van den Boogaart, K. G.

    2012-04-01

    Common methods for multivariate time series analysis use linear operations, from the definition of a time-lagged covariance/correlation to the prediction of new outcomes. However, when the time series response is a composition (a vector of positive components showing the relative importance of a set of parts in a total, like percentages and proportions), then linear operations are afflicted by several problems. For instance, it has long been recognised that (auto/cross-)correlations between raw percentages are spurious, more dependent on which other components are being considered than on any natural link between the components of interest. Also, a long-term forecast of a composition in models with a linear trend will ultimately predict negative components. In general terms, compositional data should not be treated on a raw scale, but after a log-ratio transformation (Aitchison, 1986: The statistical analysis of compositional data. Chapman and Hall). This is so because the information conveyed by compositional data is relative, as stated in their definition. The principle of working in coordinates allows any sort of multivariate analysis to be applied to a log-ratio transformed composition, as long as this transformation is invertible. This principle applies fully to time series analysis. We will discuss how results (both auto/cross-correlation functions and predictions) can be back-transformed, viewed and interpreted in a meaningful way. One view is to use the exhaustive set of all possible pairwise log-ratios, which allows the results to be expressed as D(D - 1)/2 separate, interpretable sets of one-dimensional models showing the behaviour of each possible pairwise log-ratio. Another view is the interpretation of estimated coefficients or correlations back-transformed in terms of compositions. These two views are compatible and complementary. These issues are illustrated with time series of seasonal precipitation patterns at different rain gauges of the USA
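
    A minimal sketch of the centred log-ratio (clr) transform, one of the invertible log-ratio transformations under which standard time-series tools can be applied and then back-transformed; the seasonal precipitation shares are made-up numbers.

    ```python
    import numpy as np

    def clr(composition):
        """Centred log-ratio transform of compositions (rows are closed to proportions)."""
        comp = np.asarray(composition, float)
        comp = comp / comp.sum(axis=1, keepdims=True)     # closure to proportions
        logc = np.log(comp)
        return logc - logc.mean(axis=1, keepdims=True)

    def clr_inverse(z):
        """Back-transform clr coordinates to compositions summing to 1."""
        e = np.exp(z)
        return e / e.sum(axis=1, keepdims=True)

    # Seasonal precipitation shares (winter, spring, summer, autumn) over 3 years
    shares = np.array([[0.30, 0.25, 0.20, 0.25],
                       [0.28, 0.27, 0.22, 0.23],
                       [0.35, 0.22, 0.18, 0.25]])
    z = clr(shares)              # model these coordinates rather than raw percentages
    print(np.allclose(clr_inverse(z), shares))
    ```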

  18. Impact of STROBE statement publication on quality of observational study reporting: interrupted time series versus before-after analysis.

    Directory of Open Access Journals (Sweden)

    Sylvie Bastuji-Garin

    Full Text Available In uncontrolled before-after studies, CONSORT was shown to improve the reporting of randomised trials. Before-after studies ignore underlying secular trends and may overestimate the impact of interventions. Our aim was to assess the impact of the 2007 STROBE statement publication on the quality of observational study reporting, using both uncontrolled before-after analyses and interrupted time series.For this quasi-experimental study, original articles reporting cohort, case-control, and cross-sectional studies published between 2004 and 2010 in the four dermatological journals having the highest 5-year impact factors (≥ 4 were selected. We compared the proportions of STROBE items (STROBE score adequately reported in each article during three periods, two pre STROBE period (2004-2005 and 2006-2007 and one post STROBE period (2008-2010. Segmented regression analysis of interrupted time series was also performed.Of the 456 included articles, 187 (41% reported cohort studies, 166 (36.4% cross-sectional studies, and 103 (22.6% case-control studies. The median STROBE score was 57% (range, 18%-98%. Before-after analysis evidenced significant STROBE score increases between the two pre-STROBE periods and between the earliest pre-STROBE period and the post-STROBE period (median score2004-05 48% versus median score2008-10 58%, p<0.001 but not between the immediate pre-STROBE period and the post-STROBE period (median score2006-07 58% versus median score2008-10 58%, p = 0.42. In the pre STROBE period, the six-monthly mean STROBE score increased significantly, by 1.19% per six-month period (absolute increase 95%CI, 0.26% to 2.11%, p = 0.016. By segmented analysis, no significant changes in STROBE score trends occurred (-0.40%; 95%CI, -2.20 to 1.41; p = 0.64 in the post STROBE statement publication.The quality of reports increased over time but was not affected by STROBE. Our findings raise concerns about the relevance of uncontrolled before

  19. Capturing Structure Implicitly from Time-Series having Limited Data

    OpenAIRE

    Emaasit, Daniel; Johnson, Matthew

    2018-01-01

    Scientific fields such as insider-threat detection and highway-safety planning often lack sufficient amounts of time-series data to estimate statistical models for the purpose of scientific discovery. Moreover, the available limited data are quite noisy. This presents a major challenge when estimating time-series models that are robust to overfitting and have well-calibrated uncertainty estimates. Most of the current literature in these fields involve visualizing the time-series for noticeabl...

  20. Self-affinity in the dengue fever time series

    Science.gov (United States)

    Azevedo, S. M.; Saba, H.; Miranda, J. G. V.; Filho, A. S. Nascimento; Moret, M. A.

    2016-06-01

    Dengue is a complex public health problem that is common in tropical and subtropical regions. This disease has risen substantially in the last three decades, and the physical symptoms depict the self-affine behavior of the occurrences of reported dengue cases in Bahia, Brazil. This study uses detrended fluctuation analysis (DFA) to verify the scale behavior in a time series of dengue cases and to evaluate the long-range correlations that are characterized by the power law α exponent for different cities in Bahia, Brazil. The scaling exponent (α) presents different long-range correlations, i.e. uncorrelated, anti-persistent, persistent and diffusive behaviors. The long-range correlations highlight the complex behavior of the time series of this disease. The findings show that there are two distinct types of scale behavior. In the first behavior, the time series presents a persistent α exponent for a one-month period. For large periods, the time series signal approaches subdiffusive behavior. The hypothesis of the long-range correlations in the time series of the occurrences of reported dengue cases was validated. The observed self-affinity is useful as a forecasting tool for future periods through extrapolation of the α exponent behavior. This complex system has a higher predictability in a relatively short time (approximately one month), and it suggests a new tool in epidemiological control strategies. However, predictions for large periods using DFA are hidden by the subdiffusive behavior.
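
    A compact DFA sketch (first-order detrending) that returns the scaling exponent α is given below; window sizes and the synthetic inputs are illustrative, and no dengue data are used.

    ```python
    import numpy as np

    def dfa_alpha(x, scales=(8, 16, 32, 64, 128)):
        """Detrended fluctuation analysis with linear (order-1) detrending."""
        profile = np.cumsum(x - np.mean(x))               # integrated series
        flucts = []
        for s in scales:
            n_seg = len(profile) // s
            rms = []
            for k in range(n_seg):
                seg = profile[k * s:(k + 1) * s]
                t = np.arange(s)
                trend = np.polyval(np.polyfit(t, seg, 1), t)
                rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
            flucts.append(np.mean(rms))
        # Slope of log F(s) versus log s is the scaling exponent alpha
        return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

    rng = np.random.default_rng(0)
    print(round(dfa_alpha(rng.standard_normal(4000)), 2))              # ~0.5 for white noise
    print(round(dfa_alpha(np.cumsum(rng.standard_normal(4000))), 2))   # ~1.5 for a random walk
    ```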

  1. On the plurality of times: disunified time and the A-series | Nefdt ...

    African Journals Online (AJOL)

    Then, I attempt to show that disunified time is a problem for a semantics based on the A-series since A-truthmakers are hard to come by in a universe of temporally disconnected time-series. Finally, I provide a novel argument showing that presentists should be particularly fearful of such a universe. South African Journal of ...

  2. Time-series modeling of long-term weight self-monitoring data.

    Science.gov (United States)

    Helander, Elina; Pavel, Misha; Jimison, Holly; Korhonen, Ilkka

    2015-08-01

    Long-term self-monitoring of weight is beneficial for weight maintenance, especially after weight loss. Connected weight scales accumulate time series information over the long term and hence enable time series analysis of the data. The analysis can reveal individual patterns, provide more sensitive detection of significant weight trends, and enable more accurate and timely prediction of weight outcomes. However, long-term self-weighing data pose several challenges that complicate the analysis; in particular, irregular sampling, missing data, and periodic (e.g. diurnal and weekly) patterns are common. In this study, we apply a time series modeling approach to daily weight time series from two individuals and describe the information that can be extracted from this kind of data. We study the properties of weight time series data, missing data and its link to individual behavior, periodic patterns, and weight series segmentation. Being able to understand behavior through weight data and to give relevant feedback should lead to positive interventions on health behaviors.

  3. Time series prediction of apple scab using meteorological ...

    African Journals Online (AJOL)

    A new prediction model for the early warning of apple scab is proposed in this study. The method is based on artificial intelligence and time series prediction. The infection period of apple scab was evaluated as the time series prediction model instead of summation of wetness duration. Also, the relations of different ...

  4. Characterization of time series via Rényi complexity-entropy curves

    Science.gov (United States)

    Jauregui, M.; Zunino, L.; Lenzi, E. K.; Mendes, R. S.; Ribeiro, H. V.

    2018-05-01

    One of the most useful tools for distinguishing between chaotic and stochastic time series is the so-called complexity-entropy causality plane. This diagram involves two complexity measures: the Shannon entropy and the statistical complexity. Recently, this idea has been generalized by considering the Tsallis monoparametric generalization of the Shannon entropy, yielding complexity-entropy curves. These curves have proven to enhance the discrimination among different time series related to stochastic and chaotic processes of numerical and experimental nature. Here we further explore these complexity-entropy curves in the context of the Rényi entropy, which is another monoparametric generalization of the Shannon entropy. By combining the Rényi entropy with the proper generalization of the statistical complexity, we associate a parametric curve (the Rényi complexity-entropy curve) with a given time series. We explore this approach in a series of numerical and experimental applications, demonstrating the usefulness of this new technique for time series analysis. We show that the Rényi complexity-entropy curves enable the differentiation among time series of chaotic, stochastic, and periodic nature. In particular, time series of stochastic nature are associated with curves displaying positive curvature in a neighborhood of their initial points, whereas curves related to chaotic phenomena have a negative curvature; finally, periodic time series are represented by vertical straight lines.
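
    A hedged sketch of the entropy half of this construction: the Rényi entropy of the ordinal (Bandt-Pompe) pattern distribution of a series, evaluated over several values of the parameter q; the companion statistical-complexity measure of the paper is omitted, and the test series is an assumption:

```python
# Renyi entropy of ordinal patterns for a range of q (normalized by log d!).
import numpy as np
from itertools import permutations
from math import factorial

def ordinal_probs(x, d=4):
    pats = list(permutations(range(d)))
    counts = dict.fromkeys(pats, 0)
    for i in range(len(x) - d + 1):
        counts[tuple(np.argsort(x[i:i + d]))] += 1     # Bandt-Pompe pattern
    p = np.array(list(counts.values()), dtype=float)
    return p / p.sum()

def renyi_entropy(p, q):
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))                  # Shannon limit
    return np.log(np.sum(p ** q)) / (1.0 - q)

rng = np.random.default_rng(1)
x = rng.normal(size=20000)                             # stochastic example series
d = 4
hmax = np.log(factorial(d))                            # maximum (uniform patterns)
for q in (0.5, 1.0, 2.0, 5.0):
    print(q, renyi_entropy(ordinal_probs(x, d), q) / hmax)
```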

  5. Improving prehospital trauma care in Rwanda through continuous quality improvement: an interrupted time series analysis.

    Science.gov (United States)

    Scott, John W; Nyinawankusi, Jeanne D'Arc; Enumah, Samuel; Maine, Rebecca; Uwitonze, Eric; Hu, Yihan; Kabagema, Ignace; Byiringiro, Jean Claude; Riviello, Robert; Jayaraman, Sudha

    2017-07-01

    Injury is a major cause of premature death and disability in East Africa, and high-quality pre-hospital care is essential for optimal trauma outcomes. The Rwandan pre-hospital emergency care service (SAMU) uses an electronic database to evaluate and optimize pre-hospital care through a continuous quality improvement programme (CQIP), beginning March 2014. The SAMU database was used to assess pre-hospital quality metrics including supplementary oxygen for hypoxia (O2), intravenous fluids for hypotension (IVF), cervical collar placement for head injuries (c-collar), and either splinting (splint) or administration of pain medications (pain) for long bone fractures. Targets of >90% were set for each metric, and daily team meetings and monthly feedback sessions were implemented to address opportunities for improvement. These five pre-hospital quality metrics were assessed monthly before and after implementation of the CQIP. Met and unmet needs for O2, IVF, and c-collar were combined into a summative monthly SAMU Trauma Quality Score (STQ score). An interrupted time series linear regression model compared the STQ score during the 14 months before the CQIP implementation to the first 14 months after. During the 29-month study period 3,822 patients met study criteria. 1,028 patients needed one or more of the five studied interventions during the study period. All five endpoints showed a significant increase between the pre-CQIP and post-CQIP periods, indicating improved pre-hospital trauma care in Rwanda. This programme may be used as an example for additional efforts engaging frontline staff with real-time data feedback in order to rapidly translate data collection efforts into improved care for the injured in a resource-limited setting. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. 1.8 K Refrigeration Units for the LHC: Performance Assessment of Pre-series Units

    CERN Document Server

    Claudet, S; Millet, F; Tavian, L; 20th International Cryogenic Engineering Conference (ICEC20)

    2005-01-01

    The cooling capacity below 2 K for the superconducting magnets of the Large Hadron Collider (LHC), at CERN, will be provided by eight refrigeration units of 2400 W at 1.8 K, each of them coupled to a 4.5 K refrigerator. The two selected vendors have proposed cycles based on centrifugal cold compressors combined with volumetric screw compressors with sub-atmospheric suction, as previously identified by CERN as “reference cycle”. The supply of the series units was linked to successful testing and acceptance of the pre-series temporarily installed in a dedicated test station. The global capacity, the performance of cold compressors and some process specificities have been thoroughly tested and will be presented.

  7. Quantifying Selection with Pool-Seq Time Series Data.

    Science.gov (United States)

    Taus, Thomas; Futschik, Andreas; Schlötterer, Christian

    2017-11-01

    Allele frequency time series data constitute a powerful resource for unraveling mechanisms of adaptation, because the temporal dimension captures important information about evolutionary forces. In particular, Evolve and Resequence (E&R), the whole-genome sequencing of replicated experimentally evolving populations, is becoming increasingly popular. Based on computer simulations, several studies have proposed experimental parameters to optimize the identification of the selection targets. No such recommendations are available for the underlying parameters, selection strength and dominance. Here, we introduce a highly accurate method to estimate selection parameters from replicated time series data, which is fast enough to be applied on a genome scale. Using this new method, we evaluate how experimental parameters can be optimized to obtain the most reliable estimates for selection parameters. We show that the effective population size (Ne) and the number of replicates have the largest impact. Because the number of time points and sequencing coverage had only a minor effect, we suggest that time series analysis is feasible without a major increase in sequencing costs. We anticipate that time series analysis will become routine in E&R studies. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  8. Results of the examinations of the W7-X pre-series target elements

    International Nuclear Information System (INIS)

    Boscary, J.; Boeswirth, B.; Greuner, H.; Streibl, B.; Missirlian, M.; Schlosser, J.; Schedler, B.; Scheiber, K.

    2006-01-01

    The highly heat-loaded area of the target plates of the WENDELSTEIN 7-X (W7-X) divertor is formed by 890 water-cooled target elements (TEs). This surface is designed to sustain a maximum stationary heat flux of 10 MW/m² and to remove a maximum power of 100 kW per element. Flat tiles made of CFC Sepcarb® NB31 are bonded to a CuCrZr heat sink. This joint is manufactured in two stages. In the first step, an OFHC copper interlayer is cast onto the tile by active metal casting (AMC®). At this stage, the joint copper-CFC is inspected by X-ray and lock-in thermography. In the second step, the AMC®-NB31 tiles are joined to CuCrZr. Two technologies have been investigated: electron beam welding (EBW) and hot isostatic pressing (HIP). The joint copper-CuCrZr is examined by ultrasonic method. At the end of the fabrication, the bond between the heat sink and the CFC tiles is inspected by thermography methods. The produced CFC NB31 material for W7-X showed a large scatter in the tensile strength in the ex-pitch direction in the range of 50-110 MPa. Pre-series TEs have been manufactured to qualify the design, the fabrication, the relevant non-destructive examinations (NDEs) and the delivered CFC for the serial production. The whole manufacturing route is validated if the delivered elements withstand operating conditions similar to those in W7-X in the high heat flux (HHF) test facility GLADIS without degradation of performance and integrity. HHF tests did not show any effect that could be attributed to the CFC grade or to the joining method. The HHF test results exhibited a high percentage of defective tiles, indicated by hot spots at the border of the CFC surface. Visual inspections after HHF tests have mostly correlated these spots to the initiation and/or propagation of cracks at the lateral edge of the tiles in CFC at the interface CFC-copper. The pre-series activities have been extended to reduce the stresses at the critical AMC® interface. By means of

  9. Lethal Interpersonal Violence in the Middle Pleistocene

    OpenAIRE

    Sala, Nohemi; Arsuaga, Juan Luis; Pantoja-Pérez, Ana; Pablos, Adrián; Martínez, Ignacio; Quam, Rolf M.; Gómez-Olivencia, Asier; Bermúdez de Castro, José María; Carbonell, Eudald

    2015-01-01

    Evidence of interpersonal violence has been documented previously in Pleistocene members of the genus Homo, but only very rarely has this been posited as the possible manner of death. Here we report the earliest evidence of lethal interpersonal violence in the hominin fossil record. Cranium 17 recovered from the Sima de los Huesos Middle Pleistocene site shows two clear perimortem depression fractures on the frontal bone, interpreted as being produced by two episodes of localized blunt force ...

  10. Transformation-cost time-series method for analyzing irregularly sampled data.

    Science.gov (United States)

    Ozken, Ibrahim; Eroglu, Deniz; Stemler, Thomas; Marwan, Norbert; Bagci, G Baris; Kurths, Jürgen

    2015-06-01

    Irregular sampling of data sets is one of the challenges often encountered in time-series analysis, since traditional methods cannot be applied and the frequently used interpolation approach can corrupt the data and bias the subsequence analysis. Here we present the TrAnsformation-Cost Time-Series (TACTS) method, which allows us to analyze irregularly sampled data sets without degenerating the quality of the data set. Instead of using interpolation we consider time-series segments and determine how close they are to each other by determining the cost needed to transform one segment into the following one. Using a limited set of operations-with associated costs-to transform the time series segments, we determine a new time series, that is our transformation-cost time series. This cost time series is regularly sampled and can be analyzed using standard methods. While our main interest is the analysis of paleoclimate data, we develop our method using numerical examples like the logistic map and the Rössler oscillator. The numerical data allows us to test the stability of our method against noise and for different irregular samplings. In addition we provide guidance on how to choose the associated costs based on the time series at hand. The usefulness of the TACTS method is demonstrated using speleothem data from the Secret Cave in Borneo that is a good proxy for paleoclimatic variability in the monsoon activity around the maritime continent.

  11. Transformation-cost time-series method for analyzing irregularly sampled data

    Science.gov (United States)

    Ozken, Ibrahim; Eroglu, Deniz; Stemler, Thomas; Marwan, Norbert; Bagci, G. Baris; Kurths, Jürgen

    2015-06-01

    Irregular sampling of data sets is one of the challenges often encountered in time-series analysis, since traditional methods cannot be applied and the frequently used interpolation approach can corrupt the data and bias the subsequence analysis. Here we present the TrAnsformation-Cost Time-Series (TACTS) method, which allows us to analyze irregularly sampled data sets without degenerating the quality of the data set. Instead of using interpolation we consider time-series segments and determine how close they are to each other by determining the cost needed to transform one segment into the following one. Using a limited set of operations—with associated costs—to transform the time series segments, we determine a new time series, that is our transformation-cost time series. This cost time series is regularly sampled and can be analyzed using standard methods. While our main interest is the analysis of paleoclimate data, we develop our method using numerical examples like the logistic map and the Rössler oscillator. The numerical data allows us to test the stability of our method against noise and for different irregular samplings. In addition we provide guidance on how to choose the associated costs based on the time series at hand. The usefulness of the TACTS method is demonstrated using speleothem data from the Secret Cave in Borneo that is a good proxy for paleoclimatic variability in the monsoon activity around the maritime continent.

  12. A multidisciplinary database for geophysical time series management

    Science.gov (United States)

    Montalto, P.; Aliotta, M.; Cassisi, C.; Prestifilippo, M.; Cannata, A.

    2013-12-01

    The variables collected by a sensor network constitute a heterogeneous data source that needs to be properly organized in order to be used in research and geophysical monitoring. By the term time series we refer to a set of observations of a given phenomenon acquired sequentially in time. When the time intervals are equally spaced, one speaks of the period or sampling frequency. Our work describes in detail a possible methodology for the storage and management of time series using a specific data structure. We designed a framework, hereinafter called TSDSystem (Time Series Database System), in order to acquire time series from different data sources and standardize them within a relational database. The standardization operation provides the ability to perform operations, such as querying and visualization, on many measures, synchronizing them on a common time scale. The proposed architecture follows a multiple-layer paradigm (Loaders layer, Database layer and Business Logic layer). Each layer is specialized in performing particular operations for the reorganization and archiving of data from different sources such as ASCII, Excel, ODBC (Open DataBase Connectivity), and files accessible from the Internet (web pages, XML). In particular, the Loaders layer performs a security check of the working status of each running software component through a heartbeat system, in order to automate the discovery of acquisition issues and other warning conditions. Although our system has to manage huge amounts of data, performance is guaranteed by using a smart table partitioning strategy that keeps the percentage of data stored in each database table balanced. TSDSystem also contains modules for the visualization of acquired data, which provide the possibility to query different time series over a specified time range, or to follow the real-time signal acquisition, according to a user data access policy.

  13. Modeling financial time series with S-plus

    CERN Document Server

    Zivot, Eric

    2003-01-01

    The field of financial econometrics has exploded over the last decade. This book represents an integration of theory, methods, and examples using the S-PLUS statistical modeling language and the S+FinMetrics module to facilitate the practice of financial econometrics. This is the first book to show the power of S-PLUS for the analysis of time series data. It is written for researchers and practitioners in the finance industry, academic researchers in economics and finance, and advanced MBA and graduate students in economics and finance. Readers are assumed to have a basic knowledge of S-PLUS and a solid grounding in basic statistics and time series concepts. Eric Zivot is an associate professor and Gary Waterman Distinguished Scholar in the Economics Department at the University of Washington, and is co-director of the nascent Professional Master's Program in Computational Finance. He regularly teaches courses on econometric theory, financial econometrics and time series econometrics, and is the recipient of the He...

  14. Application of Time Series Analysis in Determination of Lag Time in Jahanbin Basin

    Directory of Open Access Journals (Sweden)

    Seied Yahya Mirzaee

    2005-11-01

    One of the important issues that plays a significant role in the hydrological study of a basin is the determination of lag time. Lag time has a significant role in hydrological studies. The lag time associated with rainfall depends on several factors, such as permeability, vegetation cover, catchment slope, rainfall intensity, storm duration and type of rain. The lag time is an important parameter in many projects, such as dam design, and also in water resource studies. The lag time of a basin can be calculated using various methods. One of these methods is time series analysis of the spectral density. The analysis is based on Fourier series: the time series is approximated with sine and cosine functions, and harmonically significant quantities with individual frequencies are identified. The spectral density of multiple time series can be used to obtain the basin lag time for annual runoff and short-term rainfall fluctuations. A long lag time may be due to snowmelt, as well as to ice melting caused by rainfall on freezing days. In this research the lag time of the Jahanbin basin has been determined using the spectral density method. The catchment is subject to both rainfall and snowfall. For short-term rainfall fluctuations with return periods of 2, 3 and 4 months, the lag times were found to be 0.18, 0.5 and 0.083 months, respectively.
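
    As a simple time-domain counterpart to the spectral-density approach described above, a basin lag time can be estimated from the cross-correlation peak between rainfall and runoff; the series, true lag and maximum lag below are synthetic assumptions:

```python
# Lag-time estimate from the rainfall-runoff cross-correlation peak (synthetic data).
import numpy as np

rng = np.random.default_rng(2)
n = 240                                       # 20 years of monthly data
rain = rng.gamma(shape=2.0, scale=30.0, size=n)
true_lag = 2                                  # months (assumed for this example)
runoff = 0.6 * np.roll(rain, true_lag) + rng.normal(0, 5, n)

def lag_by_xcorr(x, y, max_lag=12):
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    cc = [np.mean(x[:len(x) - k] * y[k:]) for k in range(max_lag + 1)]  # corr(rain_t, runoff_{t+k})
    return int(np.argmax(cc))

print("estimated lag (months):", lag_by_xcorr(rain, runoff))
```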

  15. Modeling Time Series Data for Supervised Learning

    Science.gov (United States)

    Baydogan, Mustafa Gokce

    2012-01-01

    Temporal data are increasingly prevalent and important in analytics. Time series (TS) data are chronological sequences of observations and an important class of temporal data. Fields such as medicine, finance, learning science and multimedia naturally generate TS data. Each series provides a high-dimensional data vector that challenges the learning…

  16. Empirical method to measure stochasticity and multifractality in nonlinear time series

    Science.gov (United States)

    Lin, Chih-Hao; Chang, Chia-Seng; Li, Sai-Ping

    2013-12-01

    An empirical algorithm is used here to study the stochastic and multifractal nature of nonlinear time series. A parameter can be defined to quantitatively measure the deviation of the time series from a Wiener process so that the stochasticity of different time series can be compared. The local volatility of the time series under study can be constructed using this algorithm, and the multifractal structure of the time series can be analyzed by using this local volatility. As an example, we employ this method to analyze financial time series from different stock markets. The result shows that while developed markets evolve very much like an Ito process, the emergent markets are far from efficient. Differences about the multifractal structures and leverage effects between developed and emergent markets are discussed. The algorithm used here can be applied in a similar fashion to study time series of other complex systems.

  17. Pleistocene speleothem fracturing in the foreland of the Western Carpathians: a case study from the seismically active eastern margin of the Bohemian Massif

    Czech Academy of Sciences Publication Activity Database

    Bábek, O.; Briestenský, Miloš; Přecechtělová, G.; Štěpančíková, Petra; Hellstrom, J.C.; Drysdale, R.N.

    2015-01-01

    Vol. 59, No. 3 (2015), pp. 491-506. ISSN 1641-7291. R&D Projects: GA ČR GAP210/12/0573. Institutional support: RVO:67985891. Keywords: speleothems * U/Th series dating * palaeoseismicity * Pleistocene * Bohemian Massif. Subject RIV: DB - Geology; Mineralogy. Impact factor: 0.858, year: 2015

  18. The timing of Late Pleistocene glaciation at Mount Wilhelm, Papua New Guinea

    Science.gov (United States)

    Mills, Stephanie; Barrows, Timothy; Hope, Geoff; Pillans, Brad; Fifield, Keith

    2016-04-01

    The highlands of New Guinea were the most extensively glaciated area in the Asian tropical region during the Late Pleistocene. Evidence for glaciation is widespread on most of the mountain peaks above ~3500 m. Glacial landforms include both valley and ice cap forms, but the timing of glaciation remains constrained to only a few local areas. This paper focuses on Mount Wilhelm, which is situated in the central southern region of Papua New Guinea at 5.78°S and is the highest peak (4510 m a.s.l.). We focus on a south-easterly valley (Pindaunde Valley) emanating from the peak, where large moraines indicate the maximum ice extent of a valley glacier ~5 km long. Within this extensive moraine complex, recessional moraines document the retreat of the glacier towards the summit region. In order to determine the timing of deglaciation, we collected samples for surface exposure dating using 36Cl and 10Be from diorite boulders positioned on moraine crests. The ages indicate that maximum ice extent was attained during the last glacial maximum (LGM) and that ice remained near its maximum extent until after 15 ka but persisted at higher elevations almost until the Holocene. These results are similar to those described from Mt Giluwe to the northwest of Mount Wilhelm, where an ice cap reached its maximum extent at the LGM and remained there for around 3,000-4,000 years. This indicates that full glacial conditions were brief in this region of the tropics.

  19. Clinical time series prediction: Toward a hierarchical dynamical system framework.

    Science.gov (United States)

    Liu, Zitao; Hauskrecht, Milos

    2015-09-01

    Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding of the patient condition, the dynamics of a disease, effect of various patient management interventions and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Our hierarchical dynamical system framework for modeling clinical time series combines advantages of the two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. We tested our framework by first learning the time series model from data for the patients in the training set, and then using it to predict future time series values for the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive performance. Copyright © 2014 Elsevier B.V. All rights reserved.
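
    A minimal sketch of the lower-level component described above, Gaussian-process regression of one irregularly sampled lab series; the linear-dynamical-system layer that ties multiple sequences together is omitted, and the data, kernel and prediction times are illustrative:

```python
# GP regression for an irregularly sampled clinical-style time series (synthetic).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(3)
t_obs = np.sort(rng.uniform(0, 30, size=25))          # irregular observation days
y_obs = 10 + 2 * np.sin(t_obs / 4.0) + rng.normal(0, 0.3, t_obs.size)

kernel = 1.0 * RBF(length_scale=5.0) + WhiteKernel(noise_level=0.1)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(t_obs.reshape(-1, 1), y_obs)

t_new = np.linspace(0, 35, 8).reshape(-1, 1)          # includes a short extrapolation
mean, std = gp.predict(t_new, return_std=True)
for t, m, s in zip(t_new.ravel(), mean, std):
    print(f"day {t:5.1f}: predicted {m:6.2f} +/- {1.96 * s:4.2f}")
```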

  20. Clinical time series prediction: towards a hierarchical dynamical system framework

    Science.gov (United States)

    Liu, Zitao; Hauskrecht, Milos

    2014-01-01

    Objective Developing machine learning and data mining algorithms for building temporal models of clinical time series is important for understanding of the patient condition, the dynamics of a disease, effect of various patient management interventions and clinical decision making. In this work, we propose and develop a novel hierarchical framework for modeling clinical time series data of varied length and with irregularly sampled observations. Materials and methods Our hierarchical dynamical system framework for modeling clinical time series combines advantages of the two temporal modeling approaches: the linear dynamical system and the Gaussian process. We model the irregularly sampled clinical time series by using multiple Gaussian process sequences in the lower level of our hierarchical framework and capture the transitions between Gaussian processes by utilizing the linear dynamical system. The experiments are conducted on the complete blood count (CBC) panel data of 1000 post-surgical cardiac patients during their hospitalization. Our framework is evaluated and compared to multiple baseline approaches in terms of the mean absolute prediction error and the absolute percentage error. Results We tested our framework by first learning the time series model from data for the patient in the training set, and then applying the model in order to predict future time series values on the patients in the test set. We show that our model outperforms multiple existing models in terms of its predictive accuracy. Our method achieved a 3.13% average prediction accuracy improvement on ten CBC lab time series when it was compared against the best performing baseline. A 5.25% average accuracy improvement was observed when only short-term predictions were considered. Conclusion A new hierarchical dynamical system framework that lets us model irregularly sampled time series data is a promising new direction for modeling clinical time series and for improving their predictive

  1. Turbulencelike Behavior of Seismic Time Series

    International Nuclear Information System (INIS)

    Manshour, P.; Saberi, S.; Sahimi, Muhammad; Peinke, J.; Pacheco, Amalio F.; Rahimi Tabar, M. Reza

    2009-01-01

    We report on a stochastic analysis of Earth's vertical velocity time series by using methods originally developed for complex hierarchical systems and, in particular, for turbulent flows. Analysis of the fluctuations of the detrended increments of the series reveals a pronounced transition in their probability density function from Gaussian to non-Gaussian. The transition occurs 5-10 hours prior to a moderate or large earthquake, hence representing a new and reliable precursor for detecting such earthquakes

  2. Characterizing time series: when Granger causality triggers complex networks

    Science.gov (United States)

    Ge, Tian; Cui, Yindong; Lin, Wei; Kurths, Jürgen; Liu, Chong

    2012-08-01

    In this paper, we propose a new approach to characterize time series with noise perturbations in both the time and frequency domains by combining Granger causality and complex networks. We construct directed and weighted complex networks from time series and use representative network measures to describe their physical and topological properties. Through analyzing the typical dynamical behaviors of some physical models and the MIT-BIH (Massachusetts Institute of Technology-Beth Israel Hospital) human electrocardiogram data sets, we show that the proposed approach is able to capture and characterize various dynamics and has much potential for analyzing real-world time series of rather short length.

  3. Characterizing time series: when Granger causality triggers complex networks

    International Nuclear Information System (INIS)

    Ge Tian; Cui Yindong; Lin Wei; Liu Chong; Kurths, Jürgen

    2012-01-01

    In this paper, we propose a new approach to characterize time series with noise perturbations in both the time and frequency domains by combining Granger causality and complex networks. We construct directed and weighted complex networks from time series and use representative network measures to describe their physical and topological properties. Through analyzing the typical dynamical behaviors of some physical models and the MIT-BIH human electrocardiogram data sets, we show that the proposed approach is able to capture and characterize various dynamics and has much potential for analyzing real-world time series of rather short length. (paper)

  4. Multivariate time series analysis with R and financial applications

    CERN Document Server

    Tsay, Ruey S

    2013-01-01

    Since the publication of his first book, Analysis of Financial Time Series, Ruey Tsay has become one of the most influential and prominent experts on the topic of time series. Different from the traditional and oftentimes complex approach to multivariate (MV) time series, this sequel book emphasizes structural specification, which results in simplified parsimonious VARMA modeling and, hence, eases comprehension. Through a fundamental balance between theory and applications, the book supplies readers with an accessible approach to financial econometric models and their applications to real-worl

  5. Measurements of spatial population synchrony: influence of time series transformations.

    Science.gov (United States)

    Chevalier, Mathieu; Laffaille, Pascal; Ferdy, Jean-Baptiste; Grenouillet, Gaël

    2015-09-01

    Two mechanisms have been proposed to explain spatial population synchrony: dispersal among populations, and the spatial correlation of density-independent factors (the "Moran effect"). To identify which of these two mechanisms is driving spatial population synchrony, time series transformations (TSTs) of abundance data have been used to remove the signature of one mechanism, and highlight the effect of the other. However, several issues with TSTs remain, and to date no consensus has emerged about how population time series should be handled in synchrony studies. Here, by using 3131 time series involving 34 fish species found in French rivers, we computed several metrics commonly used in synchrony studies to determine whether a large-scale climatic factor (temperature) influenced fish population dynamics at the regional scale, and to test the effect of three commonly used TSTs (detrending, prewhitening and a combination of both) on these metrics. We also tested whether the influence of TSTs on time series and population synchrony levels was related to the features of the time series using both empirical and simulated time series. For several species, and regardless of the TST used, we evidenced a Moran effect on freshwater fish populations. However, these results were globally biased downward by TSTs which reduced our ability to detect significant signals. Depending on the species and the features of the time series, we found that TSTs could lead to contradictory results, regardless of the metric considered. Finally, we suggest guidelines on how population time series should be processed in synchrony studies.
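
    A sketch of the time series transformations (TSTs) discussed above, assuming simulated populations driven by a shared climatic term: linear detrending, AR(1) prewhitening, their combination, and a Pearson correlation as the synchrony metric:

```python
# Time series transformations before a pairwise synchrony estimate (synthetic data).
import numpy as np

def detrend(x):
    t = np.arange(len(x))
    return x - np.polyval(np.polyfit(t, x, 1), t)

def prewhiten_ar1(x):
    xc = x - x.mean()
    rho = np.corrcoef(xc[:-1], xc[1:])[0, 1]      # lag-1 autocorrelation
    return xc[1:] - rho * xc[:-1]                 # AR(1) residuals

def synchrony(a, b):
    return np.corrcoef(a, b)[0, 1]

rng = np.random.default_rng(4)
climate = rng.normal(size=200)                    # shared "Moran" driver
pop1 = 0.05 * np.arange(200) + climate + rng.normal(size=200)
pop2 = -0.02 * np.arange(200) + climate + rng.normal(size=200)

raw = synchrony(pop1, pop2)
dt = synchrony(detrend(pop1), detrend(pop2))
pw = synchrony(prewhiten_ar1(detrend(pop1)), prewhiten_ar1(detrend(pop2)))
print(f"raw: {raw:.2f}  detrended: {dt:.2f}  detrended+prewhitened: {pw:.2f}")
```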

  6. Stochastic time series analysis of hydrology data for water resources

    Science.gov (United States)

    Sathish, S.; Khadar Babu, S. K.

    2017-11-01

    This paper addresses the prediction of hydrological time series and their seasonal stages using stochastic time series analysis, applying different statistical tests to hydrological series modelled with the Thomas-Fiering model. Hydrological time series of flood flows have received a great deal of attention worldwide, and interest in stochastic time series methods is expanding with growing concerns about seasonal periods and global warming. A recent trend among researchers is to test for seasonal periods in hydrological flow series using stochastic processes based on the Thomas-Fiering model. The present article proposes to predict the seasonal periods in hydrology using the Thomas-Fiering model.
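
    A minimal sketch of the Thomas-Fiering monthly streamflow model referred to above, assuming a simulated historical record; month-to-month means, standard deviations and lag-1 correlations drive the synthetic generation:

```python
# Thomas-Fiering monthly flow generation from a (simulated) historical record.
import numpy as np

rng = np.random.default_rng(5)
years, months = 40, 12
seasonal = 50 + 30 * np.sin(2 * np.pi * np.arange(months) / months)
hist = seasonal + rng.normal(0, 10, size=(years, months))      # hypothetical record

mean = hist.mean(axis=0)
std = hist.std(axis=0, ddof=1)
# Lag-1 correlation between month j and month j+1 (December wraps to January
# of the same year here -- an approximation acceptable for a sketch).
r = np.array([np.corrcoef(hist[:, j], np.roll(hist, -1, axis=1)[:, j])[0, 1]
              for j in range(months)])

def thomas_fiering(n_years):
    q = np.empty((n_years, months))
    prev = mean[-1]                                   # warm-up value
    for y in range(n_years):
        for j in range(months):
            jp = (j - 1) % months                     # previous month index
            b = r[jp] * std[j] / std[jp]
            eps = rng.normal()
            q[y, j] = mean[j] + b * (prev - mean[jp]) + eps * std[j] * np.sqrt(1 - r[jp] ** 2)
            prev = q[y, j]
    return q

synthetic = thomas_fiering(100)
print("historical mean:", mean.round(1))
print("synthetic  mean:", synthetic.mean(axis=0).round(1))
```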

  7. Neural network versus classical time series forecasting models

    Science.gov (United States)

    Nor, Maria Elena; Safuan, Hamizah Mohd; Shab, Noorzehan Fazahiyah Md; Asrul, Mohd; Abdullah, Affendi; Mohamad, Nurul Asmaa Izzati; Lee, Muhammad Hisyam

    2017-05-01

    Artificial neural networks (ANNs) have an advantage in time series forecasting, as they have the potential to solve complex forecasting problems. This is because an ANN is a data-driven approach that can be trained to map past values of a time series. In this study the forecast performance of a neural network and a classical time series forecasting method, namely the seasonal autoregressive integrated moving average (SARIMA) model, was compared using gold price data. Moreover, the effect of different data preprocessing on the forecast performance of the neural network was examined. The forecast accuracy was evaluated using mean absolute deviation, root mean square error and mean absolute percentage error. It was found that the ANN produced the most accurate forecast when the Box-Cox transformation was used as data preprocessing.

  8. Nonlinear time series analysis of the human electrocardiogram

    International Nuclear Information System (INIS)

    Perc, Matjaz

    2005-01-01

    We analyse the human electrocardiogram with simple nonlinear time series analysis methods that are appropriate for graduate as well as undergraduate courses. In particular, attention is devoted to the notions of determinism and stationarity in physiological data. We emphasize that methods of nonlinear time series analysis can be successfully applied only if the studied data set originates from a deterministic stationary system. After positively establishing the presence of determinism and stationarity in the studied electrocardiogram, we calculate the maximal Lyapunov exponent, thus providing interesting insights into the dynamics of the human heart. Moreover, to facilitate interest and enable the integration of nonlinear time series analysis methods into the curriculum at an early stage of the educational process, we also provide user-friendly programs for each implemented method

  9. Multichannel biomedical time series clustering via hierarchical probabilistic latent semantic analysis.

    Science.gov (United States)

    Wang, Jin; Sun, Xiangping; Nahavandi, Saeid; Kouzani, Abbas; Wu, Yuchuan; She, Mary

    2014-11-01

    Biomedical time series clustering that automatically groups a collection of time series according to their internal similarity is of importance for medical record management and inspection such as bio-signals archiving and retrieval. In this paper, a novel framework that automatically groups a set of unlabelled multichannel biomedical time series according to their internal structural similarity is proposed. Specifically, we treat a multichannel biomedical time series as a document and extract local segments from the time series as words. We extend a topic model, i.e., the Hierarchical probabilistic Latent Semantic Analysis (H-pLSA), which was originally developed for visual motion analysis to cluster a set of unlabelled multichannel time series. The H-pLSA models each channel of the multichannel time series using a local pLSA in the first layer. The topics learned in the local pLSA are then fed to a global pLSA in the second layer to discover the categories of multichannel time series. Experiments on a dataset extracted from multichannel Electrocardiography (ECG) signals demonstrate that the proposed method performs better than previous state-of-the-art approaches and is relatively robust to the variations of parameters including length of local segments and dictionary size. Although the experimental evaluation used the multichannel ECG signals in a biometric scenario, the proposed algorithm is a universal framework for multichannel biomedical time series clustering according to their structural similarity, which has many applications in biomedical time series management. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  10. Hidden Markov Models for Time Series An Introduction Using R

    CERN Document Server

    Zucchini, Walter

    2009-01-01

    Illustrates the flexibility of HMMs as general-purpose models for time series data. This work presents an overview of HMMs for analyzing time series data, from continuous-valued, circular, and multivariate series to binary data, bounded and unbounded counts and categorical observations.

  11. Constructing ordinal partition transition networks from multivariate time series.

    Science.gov (United States)

    Zhang, Jiayang; Zhou, Jie; Tang, Ming; Guo, Heng; Small, Michael; Zou, Yong

    2017-08-10

    A growing number of algorithms have been proposed to map a scalar time series into an ordinal partition transition network. However, most observable phenomena in the empirical sciences are of a multivariate nature. We construct ordinal partition transition networks for multivariate time series. This approach yields weighted directed networks representing the pattern transition properties of time series in velocity space, which hence provides dynamic insight into the underlying system. Furthermore, we propose a measure of entropy to characterize ordinal partition transition dynamics, which is sensitive to possible local geometric changes of phase space trajectories. We demonstrate the applicability of pattern transition networks to capture phase coherence to non-coherence transitions, and to characterize paths to phase synchronization. Therefore, we conclude that the ordinal partition transition network approach provides complementary insight to the traditional symbolic analysis of nonlinear multivariate time series.
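
    A sketch of the scalar building block of this construction: an ordinal partition transition network whose nodes are ordinal patterns and whose edge weights are transition frequencies, summarized by the entropy of the transition distribution (the multivariate and velocity-space extensions of the paper are not reproduced):

```python
# Ordinal partition transition network for a scalar series, plus a transition entropy.
import numpy as np
from collections import Counter

def ordinal_symbols(x, d=3):
    return [tuple(np.argsort(x[i:i + d])) for i in range(len(x) - d + 1)]

def transition_network(symbols):
    edges = Counter(zip(symbols[:-1], symbols[1:]))    # weighted directed edges
    total = sum(edges.values())
    return {e: w / total for e, w in edges.items()}

def transition_entropy(weights):
    p = np.array(list(weights.values()))
    return -np.sum(p * np.log(p))

# Chaotic logistic map vs white noise as example dynamics.
x = np.empty(5000); x[0] = 0.4
for i in range(1, len(x)):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])
noise = np.random.default_rng(6).normal(size=5000)

for name, series in [("logistic map", x), ("white noise", noise)]:
    net = transition_network(ordinal_symbols(series))
    print(f"{name}: {len(net)} edges, transition entropy {transition_entropy(net):.2f}")
```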

  12. Detection of Outliers and Imputing of Missing Values for Water Quality UV-VIS Absorbance Time Series

    OpenAIRE

    Plazas-Nossa, Leonardo; Ávila Angulo, Miguel Antonio; Torres, Andrés

    2017-01-01

    Context: The collection of UV-Vis absorbance data using online optical sensors for water quality detection may yield outliers and/or missing values. Therefore, pre-processing to correct these anomalies is required to improve the analysis of monitoring data. The aim of this study is to propose a method to detect outliers as well as to fill in the gaps in time series. Method: Outliers are detected using a Winsorising procedure and the application of the Discrete Fourier Transform (DFT) and the Inverse of F...

  13. Permutation entropy of finite-length white-noise time series.

    Science.gov (United States)

    Little, Douglas J; Kane, Deb M

    2016-08-01

    Permutation entropy (PE) is commonly used to discriminate complex structure from white noise in a time series. While the PE of white noise is well understood in the long time-series limit, analysis in the general case is currently lacking. Here the expectation value and variance of white-noise PE are derived as functions of the number of ordinal pattern trials, N, and the embedding dimension, D. It is demonstrated that the probability distribution of the white-noise PE converges to a χ² distribution with D!-1 degrees of freedom as N becomes large. It is further demonstrated that the PE variance for an arbitrary time series can be estimated as the variance of a related metric, the Kullback-Leibler entropy (KLE), allowing the qualitative N≫D! condition to be recast as a quantitative estimate of the N required to achieve a desired PE calculation precision. Application of this theory to statistical inference is demonstrated in the case of an experimentally obtained noise series, where the probability of obtaining the observed PE value was calculated assuming a white-noise time series. Standard statistical inference can be used to draw conclusions whether the white-noise null hypothesis can be accepted or rejected. This methodology can be applied to other null hypotheses, such as discriminating whether two time series are generated from different complex system states.
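
    A sketch illustrating the finite-sample behaviour discussed above: normalized permutation entropy of white-noise series for a fixed embedding dimension D and an increasing number of ordinal-pattern trials N (the paper's exact χ² analysis is not reproduced):

```python
# Permutation entropy of finite white-noise series: bias and spread shrink with N.
import numpy as np
from math import factorial, log

def permutation_entropy(x, d):
    patterns = [tuple(np.argsort(x[i:i + d])) for i in range(len(x) - d + 1)]
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p)) / log(factorial(d))   # normalized to [0, 1]

rng = np.random.default_rng(7)
D = 4
for N in (50, 200, 1000, 10000):                 # number of ordinal-pattern trials
    pes = [permutation_entropy(rng.normal(size=N + D - 1), D) for _ in range(200)]
    print(f"N={N:5d}: mean PE {np.mean(pes):.3f}, std {np.std(pes):.4f}")
```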

  14. Multiresolution analysis of Bursa Malaysia KLCI time series

    Science.gov (United States)

    Ismail, Mohd Tahir; Dghais, Amel Abdoullah Ahmed

    2017-05-01

    In general, a time series is simply a sequence of numbers collected at regular intervals over a period. Financial time series data processing is concerned with the theory and practice of processing asset prices over time, such as currency, commodity and stock market data. The primary aim of this study is to understand the fundamental characteristics of selected financial time series by using both time- and frequency-domain analysis. Prediction can then be executed for the desired system for in-sample forecasting. In this study, multiresolution analysis, with the aid of the discrete wavelet transform (DWT) and the maximal overlap discrete wavelet transform (MODWT), is used to pinpoint special characteristics of the Bursa Malaysia KLCI (Kuala Lumpur Composite Index) daily closing prices and return values. In addition, further case study discussions include the modeling of the Bursa Malaysia KLCI using linear ARIMA with wavelets to address how the multiresolution approach improves fitting and forecasting results.

  15. Modelling bursty time series

    International Nuclear Information System (INIS)

    Vajna, Szabolcs; Kertész, János; Tóth, Bálint

    2013-01-01

    Many human-related activities show power-law decaying interevent time distribution with exponents usually varying between 1 and 2. We study a simple task-queuing model, which produces bursty time series due to the non-trivial dynamics of the task list. The model is characterized by a priority distribution as an input parameter, which describes the choice procedure from the list. We give exact results on the asymptotic behaviour of the model and we show that the interevent time distribution is power-law decaying for any kind of input distributions that remain normalizable in the infinite list limit, with exponents tunable between 1 and 2. The model satisfies a scaling law between the exponents of interevent time distribution (β) and autocorrelation function (α): α + β = 2. This law is general for renewal processes with power-law decaying interevent time distribution. We conclude that slowly decaying autocorrelation function indicates long-range dependence only if the scaling law is violated. (paper)
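
    A sketch of a closely related priority task-queue simulation (in the spirit of the task-list models this paper builds on, not its exact formulation): the waiting times of executed tasks form a heavy-tailed, bursty interevent series; list length and selection probability are illustrative choices:

```python
# Priority task-queue simulation producing heavy-tailed waiting times.
import numpy as np

def task_queue_waiting_times(steps=200_000, list_len=2, p=0.9999, seed=11):
    rng = np.random.default_rng(seed)
    priorities = rng.random(list_len)
    birth = np.zeros(list_len, dtype=int)         # step at which each task entered the list
    waits = []
    for t in range(steps):
        # Execute the highest-priority task with probability p, a random one otherwise.
        i = int(np.argmax(priorities)) if rng.random() < p else int(rng.integers(list_len))
        waits.append(t - birth[i])                # waiting time of the executed task
        priorities[i] = rng.random()              # replace it with a new random-priority task
        birth[i] = t
    return np.array(waits)

w = task_queue_waiting_times()
w = w[w > 0]
print("median waiting time:", np.median(w))
print("99.9th percentile:", np.percentile(w, 99.9))   # heavy tail: far above the median
```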

  16. Timing calibration and spectral cleaning of LOFAR time series data

    NARCIS (Netherlands)

    Corstanje, A.; Buitink, S.; Enriquez, J. E.; Falcke, H.; Horandel, J. R.; Krause, M.; Nelles, A.; Rachen, J. P.; Schellart, P.; Scholten, O.; ter Veen, S.; Thoudam, S.; Trinh, T. N. G.

    We describe a method for spectral cleaning and timing calibration of short time series data of the voltage in individual radio interferometer receivers. It makes use of phase differences in fast Fourier transform (FFT) spectra across antenna pairs. For strong, localized terrestrial sources these are

  17. Time series momentum and contrarian effects in the Chinese stock market

    Science.gov (United States)

    Shi, Huai-Long; Zhou, Wei-Xing

    2017-10-01

    This paper concentrates on the time series momentum or contrarian effects in the Chinese stock market. We evaluate the performance of the time series momentum strategy applied to major stock indices in mainland China and explore the relation between the performance of time series momentum strategies and some firm-specific characteristics. Our findings indicate that there is a time series momentum effect in the short run and a contrarian effect in the long run in the Chinese stock market. The performances of the time series momentum and contrarian strategies are highly dependent on the look-back and holding periods and firm-specific characteristics.
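
    A sketch of a time series momentum rule of the kind evaluated above, assuming synthetic monthly index returns; the look-back and holding periods are the free parameters discussed in the abstract:

```python
# Time series momentum backtest sketch: long (short) after a positive (negative)
# past k-month return, held for h months. Returns are simulated, not market data.
import numpy as np

def tsmom_returns(monthly_ret, k=12, h=1):
    strat = []
    for t in range(k, len(monthly_ret) - h + 1, h):
        signal = np.sign(np.prod(1 + monthly_ret[t - k:t]) - 1)   # past k-month return
        strat.append(signal * (np.prod(1 + monthly_ret[t:t + h]) - 1))
    return np.array(strat)

rng = np.random.default_rng(8)
ret = rng.normal(0.008, 0.06, size=360)          # 30 years of synthetic monthly returns

for k in (1, 3, 12):
    r = tsmom_returns(ret, k=k, h=1)
    print(f"look-back {k:2d}m: mean monthly return {r.mean():+.4f}, mean/std {r.mean()/r.std():+.2f}")
```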

  18. Time-Series Analysis: A Cautionary Tale

    Science.gov (United States)

    Damadeo, Robert

    2015-01-01

    Time-series analysis has often been a useful tool in atmospheric science for deriving long-term trends in various atmospherically important parameters (e.g., temperature or the concentration of trace gas species). In particular, time-series analysis has been repeatedly applied to satellite datasets in order to derive the long-term trends in stratospheric ozone, which is a critical atmospheric constituent. However, many of the potential pitfalls relating to the non-uniform sampling of the datasets were often ignored and the results presented by the scientific community have been unknowingly biased. A newly developed and more robust application of this technique is applied to the Stratospheric Aerosol and Gas Experiment (SAGE) II version 7.0 ozone dataset and the previous biases and newly derived trends are presented.

  19. Characterizing interdependencies of multiple time series theory and applications

    CERN Document Server

    Hosoya, Yuzo; Takimoto, Taro; Kinoshita, Ryo

    2017-01-01

    This book introduces academic researchers and professionals to the basic concepts and methods for characterizing interdependencies of multiple time series in the frequency domain. Detecting causal directions between a pair of time series and the extent of their effects, as well as testing the non existence of a feedback relation between them, have constituted major focal points in multiple time series analysis since Granger introduced the celebrated definition of causality in view of prediction improvement. Causality analysis has since been widely applied in many disciplines. Although most analyses are conducted from the perspective of the time domain, a frequency domain method introduced in this book sheds new light on another aspect that disentangles the interdependencies between multiple time series in terms of long-term or short-term effects, quantitatively characterizing them. The frequency domain method includes the Granger noncausality test as a special case. Chapters 2 and 3 of the book introduce an i...

  20. Interrupted time-series analysis: studying trends in neurosurgery.

    Science.gov (United States)

    Wong, Ricky H; Smieliauskas, Fabrice; Pan, I-Wen; Lam, Sandi K

    2015-12-01

    OBJECT Neurosurgery studies traditionally have evaluated the effects of interventions on health care outcomes by studying overall changes in measured outcomes over time. Yet, this type of linear analysis is limited due to lack of consideration of the trend's effects both pre- and postintervention and the potential for confounding influences. The aim of this study was to illustrate interrupted time-series analysis (ITSA) as applied to an example in the neurosurgical literature and highlight ITSA's potential for future applications. METHODS The methods used in previous neurosurgical studies were analyzed and then compared with the methodology of ITSA. RESULTS The ITSA method was identified in the neurosurgical literature as an important technique for isolating the effect of an intervention (such as a policy change or a quality and safety initiative) on a health outcome independent of other factors driving trends in the outcome. The authors determined that ITSA allows for analysis of the intervention's immediate impact on outcome level and on subsequent trends and enables a more careful measure of the causal effects of interventions on health care outcomes. CONCLUSIONS ITSA represents a significant improvement over traditional observational study designs in quantifying the impact of an intervention. ITSA is a useful statistical procedure to understand, consider, and implement as the field of neurosurgery evolves in sophistication in big-data analytics, economics, and health services research.

  1. A perturbative approach for enhancing the performance of time series forecasting.

    Science.gov (United States)

    de Mattos Neto, Paulo S G; Ferreira, Tiago A E; Lima, Aranildo R; Vasconcelos, Germano C; Cavalcanti, George D C

    2017-04-01

    This paper proposes a method to perform time series prediction based on perturbation theory. The approach is based on continuously adjusting an initial forecasting model to asymptotically approximate a desired time series model. First, a predictive model generates an initial forecasting for a time series. Second, a residual time series is calculated as the difference between the original time series and the initial forecasting. If that residual series is not white noise, then it can be used to improve the accuracy of the initial model and a new predictive model is adjusted using residual series. The whole process is repeated until convergence or the residual series becomes white noise. The output of the method is then given by summing up the outputs of all trained predictive models in a perturbative sense. To test the method, an experimental investigation was conducted on six real world time series. A comparison was made with six other methods experimented and ten other results found in the literature. Results show that not only the performance of the initial model is significantly improved but also the proposed method outperforms the other results previously published. Copyright © 2017 Elsevier Ltd. All rights reserved.
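
    A sketch of the perturbative idea described above, with simple least-squares AR models standing in for the paper's forecasting models: each new model is fitted to the residual series left by the previous ones, and the final forecast is the (suitably aligned) sum of the stage outputs:

```python
# Iterative residual modelling: fit, take residuals, fit again, stop when residuals
# look like white noise. AR(p) by ordinary least squares is used as the base model.
import numpy as np

def fit_ar(series, p=3):
    X = np.column_stack([series[i:len(series) - p + i] for i in range(p)])
    y = series[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    fitted = X @ coef
    return fitted, y - fitted                     # in-sample fit and residual series

rng = np.random.default_rng(9)
t = np.arange(500)
series = np.sin(t / 8.0) + 0.3 * np.sin(t / 3.0) + rng.normal(0, 0.1, t.size)

residual = series
for stage in range(3):                            # a few perturbative corrections
    fitted, residual = fit_ar(residual)           # next model is trained on the residuals
    print(f"stage {stage}: residual std {residual.std():.4f}")
```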

  2. Drunk driving detection based on classification of multivariate time series.

    Science.gov (United States)

    Li, Zhenlong; Jin, Xue; Zhao, Xiaohua

    2015-09-01

    This paper addresses the problem of detecting drunk driving based on classification of multivariate time series. First, driving performance measures were collected from a test in a driving simulator located in the Traffic Research Center, Beijing University of Technology. Lateral position and steering angle were used to detect drunk driving. Second, multivariate time series analysis was performed to extract the features. A piecewise linear representation was used to represent multivariate time series. A bottom-up algorithm was then employed to separate multivariate time series. The slope and time interval of each segment were extracted as the features for classification. Third, a support vector machine classifier was used to classify driver's state into two classes (normal or drunk) according to the extracted features. The proposed approach achieved an accuracy of 80.0%. Drunk driving detection based on the analysis of multivariate time series is feasible and effective. The approach has implications for drunk driving detection. Copyright © 2015 Elsevier Ltd and National Safety Council. All rights reserved.

  3. Evaluation of scaling invariance embedded in short time series.

    Directory of Open Access Journals (Sweden)

    Xue Pan

    Full Text Available Scaling invariance of time series has been making great contributions in diverse research fields. But how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuation and bias in statistical quantities and the consequent invalidation of currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale invariance in very short time series with length ~10². Calculations with specified Hurst exponent values of 0.2, 0.3, ..., 0.9 show that by using the standard central moving average de-trending procedure this method can evaluate the scaling exponents for short time series with negligible bias (≤0.03) and a sharp confidence interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximate oval path of a specified length, we observe that though the averages and deviations of scaling exponents are close, their evolutionary behaviors display rich patterns. It has potential use in analyzing physiological signals, detecting early warning signals, and so on. As an emphasis, our core contribution is that by means of the proposed method one can precisely estimate the Shannon entropy from limited records.

  4. Evaluation of scaling invariance embedded in short time series.

    Science.gov (United States)

    Pan, Xue; Hou, Lei; Stephen, Mutua; Yang, Huijie; Zhu, Chenping

    2014-01-01

    Scaling invariance of time series has been making great contributions in diverse research fields. But how to evaluate the scaling exponent from a real-world series is still an open problem. The finite length of a time series may induce unacceptable fluctuation and bias in statistical quantities and the consequent invalidation of currently used standard methods. In this paper a new concept called correlation-dependent balanced estimation of diffusion entropy is developed to evaluate scale invariance in very short time series with length ~10². Calculations with specified Hurst exponent values of 0.2, 0.3, ..., 0.9 show that by using the standard central moving average de-trending procedure this method can evaluate the scaling exponents for short time series with negligible bias (≤0.03) and a sharp confidence interval (standard deviation ≤0.05). Considering the stride series from ten volunteers along an approximate oval path of a specified length, we observe that though the averages and deviations of scaling exponents are close, their evolutionary behaviors display rich patterns. It has potential use in analyzing physiological signals, detecting early warning signals, and so on. As an emphasis, our core contribution is that by means of the proposed method one can precisely estimate the Shannon entropy from limited records.

  5. Modeling Non-Gaussian Time Series with Nonparametric Bayesian Model.

    Science.gov (United States)

    Xu, Zhiguang; MacEachern, Steven; Xu, Xinyi

    2015-02-01

    We present a class of Bayesian copula models whose major components are the marginal (limiting) distribution of a stationary time series and the internal dynamics of the series. We argue that these are the two features with which an analyst is typically most familiar, and hence that these are natural components with which to work. For the marginal distribution, we use a nonparametric Bayesian prior distribution along with a cdf-inverse cdf transformation to obtain large support. For the internal dynamics, we rely on the traditionally successful techniques of normal-theory time series. Coupling the two components gives us a family of (Gaussian) copula transformed autoregressive models. The models provide coherent adjustments of time scales and are compatible with many extensions, including changes in volatility of the series. We describe basic properties of the models, show their ability to recover non-Gaussian marginal distributions, and use a GARCH modification of the basic model to analyze stock index return series. The models are found to provide better fit and improved short-range and long-range predictions than Gaussian competitors. The models are extensible to a large variety of fields, including continuous time models, spatial models, models for multiple series, models driven by external covariate streams, and non-stationary models.
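
    A sketch of a copula-transformed autoregressive model in the spirit of the class described above, assuming a synthetic series with a skewed marginal: the empirical CDF maps the data to Gaussian scores, an AR(1) model captures the internal dynamics, and simulated paths are mapped back through the empirical quantiles (the paper's nonparametric Bayesian prior for the marginal is not reproduced):

```python
# Gaussian copula AR(1) sketch: empirical-CDF marginal + normal-scores dynamics.
import numpy as np
from scipy.stats import norm, rankdata

rng = np.random.default_rng(10)
# Observed series: positive, skewed marginal with persistent dynamics.
z = np.empty(2000); z[0] = 0.0
for i in range(1, len(z)):
    z[i] = 0.7 * z[i - 1] + rng.normal()
x = np.exp(0.5 * z)                          # non-Gaussian observed series

# 1) Marginal: probability integral transform via ranks, then to normal scores.
u = rankdata(x) / (len(x) + 1)
g = norm.ppf(u)

# 2) Dynamics: fit AR(1) on the Gaussian scale.
phi = np.corrcoef(g[:-1], g[1:])[0, 1]
sigma = np.sqrt(1 - phi ** 2)

# 3) Simulate on the Gaussian scale and map back with the empirical quantiles.
g_sim = np.empty(2000); g_sim[0] = 0.0
for i in range(1, len(g_sim)):
    g_sim[i] = phi * g_sim[i - 1] + sigma * rng.normal()
x_sim = np.quantile(x, norm.cdf(g_sim))      # inverse empirical CDF

print("observed  mean, frac above mean:", round(float(x.mean()), 2), round(float((x > x.mean()).mean()), 2))
print("simulated mean, frac above mean:", round(float(x_sim.mean()), 2), round(float((x_sim > x_sim.mean()).mean()), 2))
```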

  6. Geomechanical time series and its singularity spectrum analysis

    Czech Academy of Sciences Publication Activity Database

    Lyubushin, Alexei A.; Kaláb, Zdeněk; Lednická, Markéta

    2012-01-01

    Vol. 47, No. 1 (2012), pp. 69-77. ISSN 1217-8977. R&D Projects: GA ČR GA105/09/0089. Institutional research plan: CEZ:AV0Z30860518. Keywords: geomechanical time series * singularity spectrum * time series segmentation * laser distance meter. Subject RIV: DC - Seismology, Volcanology, Earth Structure. Impact factor: 0.347, year: 2012. http://www.akademiai.com/content/88v4027758382225/fulltext.pdf

  7. Pseudo-random bit generator based on lag time series

    Science.gov (United States)

    García-Martínez, M.; Campos-Cantón, E.

    2014-12-01

    In this paper, we present a pseudo-random bit generator (PRBG) based on two lag time series of the logistic map using positive and negative values of the bifurcation parameter. In order to hide the map used to build the pseudo-random series, we introduce a delay in the generation of the time series. When these new series are mapped as x_n against x_{n+1}, they present a cloud of points unrelated to the logistic map. Finally, the pseudo-random sequences have been tested with the NIST suite, giving satisfactory results for use in stream ciphers.
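
    A minimal sketch of the lag idea for a single orbit (the paper combines two lag series and both signs of the bifurcation parameter, which is not reproduced here). The parameter values, lag, burn-in and thresholding rule are illustrative assumptions.

```python
# Hedged sketch of a lag-based PRBG from the logistic map: bits are taken from a
# delayed copy of the orbit, so (x_n, x_{n+1}) plots of the output no longer
# trace the logistic parabola. Parameters here are illustrative only.
import numpy as np

def lag_logistic_bits(n_bits, r=3.99, x0=0.371, lag=7, burn_in=1000):
    """Generate n_bits pseudo-random bits from a lagged logistic-map orbit."""
    n = n_bits + lag + burn_in
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])        # logistic map
    lagged = x[burn_in + lag:]                        # delayed series hides the map
    return (lagged[:n_bits] > 0.5).astype(np.uint8)   # simple threshold to bits

bits = lag_logistic_bits(10_000)
print(bits[:32], bits.mean())                         # mean should be near 0.5
```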

  8. Non-linear forecasting in high-frequency financial time series

    Science.gov (United States)

    Strozzi, F.; Zaldívar, J. M.

    2005-08-01

    A new methodology based on state space reconstruction techniques has been developed for trading in financial markets. The methodology has been tested using 18 high-frequency foreign exchange time series. The results are in apparent contradiction with the efficient market hypothesis, which states that no profitable information about future movements can be obtained by studying the past price series. In our (off-line) analysis positive gain may be obtained in all those series. The trading methodology is quite general and may be adapted to other financial time series. Finally, the steps for its on-line application are discussed.

  9. Analysis of JET ELMy time series

    International Nuclear Information System (INIS)

    Zvejnieks, G.; Kuzovkov, V.N.

    2005-01-01

    Full text: Achievement of the planned operational regime in the next generation tokamaks (such as ITER) still faces principal problems. One of the main challenges is obtaining control of edge localized modes (ELMs), which should lead to both long plasma pulse times and a reasonable divertor lifetime. In order to control ELMs, Degeling [1] proposed the hypothesis that ELMs exhibit features of chaotic dynamics and thus standard chaos control methods might be applicable. However, our findings, which are based on the nonlinear autoregressive (NAR) model, contradict this hypothesis for JET ELMy time series. In turn, this means that ELM behavior is of a relaxation or random type. These conclusions coincide with our previous results obtained for ASDEX Upgrade time series [2]. [1] A.W. Degeling, Y.R. Martin, P.E. Bak, J.B. Lister, and X. Llobet, Plasma Phys. Control. Fusion 43, 1671 (2001). [2] G. Zvejnieks, V.N. Kuzovkov, O. Dumbrajs, A.W. Degeling, W. Suttrop, H. Urano, and H. Zohm, Physics of Plasmas 11, 5658 (2004)

  10. The Statistical Analysis of Time Series

    CERN Document Server

    Anderson, T W

    2011-01-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences George

  11. Late Pleistocene and Holocene landscape formation in a gully catchment area in Northern Hesse, Germany

    DEFF Research Database (Denmark)

    Döhler, Susanne; Damm, Bodo; Terhorst, Birgit

    2015-01-01

    Permanent gully channels under forest are common geomorphological features in Central European low mountain areas. In the Rehgraben/Fuchslöchergraben gully catchment in Northern Hesse, Germany, the Late Pleistocene landscape formation is reconstructed based on periglacial cover beds. In addition ... the differentiation between Pleistocene and Holocene landforms. Radiocarbon and optically stimulated luminescence dating are applied to add numerical data to the relative ages of the sediments and landforms. The gully channels are oriented along Pleistocene depressions that are built up of periglacial cover beds ... and intercalated reworked loess. As the gully channels cut through the periglacial cover beds, especially the upper layer, the gully system is of Holocene age. At least two phases of gully erosion are identified in the alluvial fan sediments. The initial gully erosion is dated to the time span between the Late...

  12. Late Pleistocene glacial stratigraphy of the Kumara-Moana region, West Coast of South Island, New Zealand

    Science.gov (United States)

    Barrows, Timothy T.; Almond, Peter; Rose, Robert; Keith Fifield, L.; Mills, Stephanie C.; Tims, Stephen G.

    2013-08-01

    On the South Island of New Zealand, large piedmont glaciers descended from an ice cap on the Southern Alps onto the coastal plain of the West Coast during the late Pleistocene. The series of moraine belts and outwash plains left by the Taramakau glacier are used as a type section for interpreting the glacial geology and timing of major climatic events of New Zealand and also as a benchmark for comparison with the wider Southern Hemisphere. In this paper we review the chronology of advances by the Taramakau glacier during the last or Otira Glaciation using a combination of exposure dating using the cosmogenic nuclides 10Be and 36Cl, and tephrochronology. We document three distinct glacial maxima, represented by the Loopline, Larrikins and Moana Formations, separated by brief interstadials. We find that the Loopline Formation, originally attributed to Oxygen Isotope Chronozone 4, is much younger than previously thought, with an advance culminating around 24,900 ± 800 yr. The widespread late Pleistocene Kawakawa/Oruanui tephra stratigraphically lies immediately above it. This Formation has the same age previously attributed to the older part of the Larrikins Formation. Dating of the Larrikins Formation demonstrates there is no longer a basis for subdividing it into older and younger phases, with an advance lasting about 1000 years between 20,800 ± 500 and 20,000 ± 400 yr. The Moana Formation represents the deposits of the last major advance of ice at 17,300 ± 500 yr and is younger than expected based on limited previous dating. The timing of major piedmont glaciation is restricted to between ˜25,000 and 17,000 yr, and this interval corresponds to a time of regionally cold sea surface temperatures, expansion of grasslands at the expense of forest on South Island, and hemisphere-wide glaciation.

  13. LAI, FAPAR and FCOVER products derived from AVHRR long time series: principles and evaluation

    Science.gov (United States)

    Verger, A.; Baret, F.; Weiss, M.; Lacaze, R.; Makhmara, H.; Pacholczyk, P.; Smets, B.; Kandasamy, S.; Vermote, E.

    2012-04-01

    Continuous and long-term global monitoring of the terrestrial biosphere has drawn intense interest in recent years in the context of climate and global change. Developing methodologies for generating historical data records from data collected with different satellite sensors over the past three decades, by taking advantage of the improvements identified in the processing of new-generation sensors, is a central issue in the remote sensing community. In this context, the Bio-geophysical Parameters (BioPar) service within the Geoland2 project (http://www.geoland2.eu) aims at developing pre-operational infrastructures for providing global land products both in near real time and off-line as long time series. In this contribution, we describe the principles of the GEOLAND algorithm for generating long-term datasets of three key biophysical variables, leaf area index (LAI), Fraction of Absorbed Photosynthetic Active Radiation (FAPAR) and cover fraction (FCOVER), which play a key role in several processes, including photosynthesis, respiration and transpiration. LAI, FAPAR and FCOVER are produced globally from the AVHRR Long Term Data Record (LTDR) for the 1981-2000 period at 0.05° spatial resolution and 10-day temporal sampling frequency. The proposed algorithm aims to ensure robustness of the derived long time series and consistency with those developed in recent years, particularly with GEOLAND products derived from the VEGETATION sensor. The approach is based on the capacity of neural networks to learn a particular biophysical product (GEOLAND) from the reflectances of another sensor (AVHRR normalized reflectances in the red and near infrared bands). Outliers due to possible cloud contamination or residual atmospheric correction are iteratively eliminated. Prior information based on the climatology is used to obtain more robust estimates. A specific gap filling and smoothing procedure was applied to generate continuous and smooth time series of decadal

  14. Scalable Prediction of Energy Consumption using Incremental Time Series Clustering

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Noor, Muhammad Usman

    2013-10-09

    Time series datasets are a canonical form of high velocity Big Data, and often generated by pervasive sensors, such as found in smart infrastructure. Performing predictive analytics on time series data can be computationally complex, and requires approximation techniques. In this paper, we motivate this problem using a real application from the smart grid domain. We propose an incremental clustering technique, along with a novel affinity score for determining cluster similarity, which help reduce the prediction error for cumulative time series within a cluster. We evaluate this technique, along with optimizations, using real datasets from smart meters, totaling ~700,000 data points, and show the efficacy of our techniques in improving the prediction error of time series data within polynomial time.

  15. Forecasting with nonlinear time series models

    DEFF Research Database (Denmark)

    Kock, Anders Bredahl; Teräsvirta, Timo

    In this paper, nonlinear models are restricted to mean nonlinear parametric models. Several such models popular in time series econometrics are presented and some of their properties discussed. This includes two models based on universal approximators: the Kolmogorov-Gabor polynomial model ... and two versions of a simple artificial neural network model. Techniques for generating multi-period forecasts from nonlinear models recursively are considered, and the direct (non-recursive) method for this purpose is mentioned as well. Forecasting with complex dynamic systems, albeit less frequently ... applied to economic forecasting problems, is briefly highlighted. A number of large published studies comparing macroeconomic forecasts obtained using different time series models are discussed, and the paper also contains a small simulation study comparing recursive and direct forecasts in a partic...

  16. Nonparametric factor analysis of time series

    OpenAIRE

    Rodríguez-Poo, Juan M.; Linton, Oliver Bruce

    1998-01-01

    We introduce a nonparametric smoothing procedure for nonparametric factor analysis of multivariate time series. The asymptotic properties of the proposed procedures are derived. We present an application based on the residuals from the Fair macromodel.

  17. Time Series Outlier Detection Based on Sliding Window Prediction

    Directory of Open Access Journals (Sweden)

    Yufeng Yu

    2014-01-01

    Full Text Available In order to detect outliers in hydrological time series data, and thereby improve data quality and the quality of decisions related to the design, operation, and management of water resources, this research develops a time series outlier detection method for hydrologic data that can be used to identify data that deviate from historical patterns. The method first builds a forecasting model on the historical data and then uses it to predict future values. Anomalies are assumed to take place if the observed values fall outside a given prediction confidence interval (PCI), which can be calculated from the predicted value and a confidence coefficient. The use of the PCI as a threshold rests mainly on the fact that it accounts for the uncertainty in the data series parameters of the forecasting model, which addresses the problem of selecting a suitable threshold. The method performs fast, incremental evaluation of data as they become available, scales to large quantities of data, and requires no preclassification of anomalies. Experiments with different real-world hydrologic time series showed that the proposed method is fast, correctly identifies abnormal data, and can be used for hydrologic time series analysis.
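
    A minimal sketch of the sliding-window/PCI idea described above, assuming a trailing window, a simple AR(1) forecaster fitted by least squares, and a fixed 95% confidence coefficient; the paper's actual forecasting model and threshold calibration may differ.

```python
# Sliding-window outlier detection sketch: fit a one-step forecaster on a
# trailing window, predict the next value, and flag it if it falls outside the
# prediction confidence interval (PCI). Window size and k are assumptions.
import numpy as np

def detect_outliers(y, window=48, k=1.96):
    """Return boolean flags: True where the observation leaves the PCI."""
    y = np.asarray(y, dtype=float)
    flags = np.zeros(len(y), dtype=bool)
    for t in range(window, len(y)):
        hist = y[t - window:t]
        # one-step AR(1)-style forecast fitted on the trailing window
        x, z = hist[:-1], hist[1:]
        a, b = np.polyfit(x, z, 1)                 # z ~ a*x + b
        pred = a * hist[-1] + b
        resid_sd = np.std(z - (a * x + b), ddof=2)
        half_width = k * resid_sd                  # PCI half-width
        flags[t] = abs(y[t] - pred) > half_width
    return flags
```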

  18. Metagenomics meets time series analysis: unraveling microbial community dynamics

    NARCIS (Netherlands)

    Faust, K.; Lahti, L.M.; Gonze, D.; Vos, de W.M.; Raes, J.

    2015-01-01

    The recent increase in the number of microbial time series studies offers new insights into the stability and dynamics of microbial communities, from the world's oceans to human microbiota. Dedicated time series analysis tools allow taking full advantage of these data. Such tools can reveal periodic

  19. Time series forecasting based on deep extreme learning machine

    NARCIS (Netherlands)

    Guo, Xuqi; Pang, Y.; Yan, Gaowei; Qiao, Tiezhu; Yang, Guang-Hong; Yang, Dan

    2017-01-01

    Multi-layer Artificial Neural Networks (ANNs) have caught widespread attention as a new method for time series forecasting due to their ability to approximate any nonlinear function. In this paper, a new local time series prediction model is established with the nearest neighbor domain theory, in

  20. False-nearest-neighbors algorithm and noise-corrupted time series

    International Nuclear Information System (INIS)

    Rhodes, C.; Morari, M.

    1997-01-01

    The false-nearest-neighbors (FNN) algorithm was originally developed to determine the embedding dimension for autonomous time series. For noise-free computer-generated time series, the algorithm does a good job in predicting the embedding dimension. However, the problem of predicting the embedding dimension when the time-series data are corrupted by noise was not fully examined in the original studies of the FNN algorithm. Here it is shown that with large data sets, even small amounts of noise can lead to incorrect prediction of the embedding dimension. Surprisingly, as the length of the time series analyzed by FNN grows larger, the cause of incorrect prediction becomes more pronounced. An analysis of the effect of noise on the FNN algorithm and a solution for dealing with the effects of noise are given here. Some results on the theoretically correct choice of the FNN threshold are also presented. copyright 1997 The American Physical Society
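
    For readers unfamiliar with the algorithm under discussion, a compact false-nearest-neighbors sketch follows; the delay, tolerance threshold Rtol and the noisy sine test series are assumptions for illustration, and the noise-handling refinements the paper proposes are not included.

```python
# False-nearest-neighbors sketch: for each embedding dimension m, count how
# often a point's nearest neighbor in dimension m stops being close once the
# (m+1)-th coordinate is added.
import numpy as np

def embed(x, m, tau=1):
    """Delay embedding of x into m dimensions with delay tau."""
    n = len(x) - (m - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(m)])

def fnn_fraction(x, m, tau=1, rtol=15.0):
    """Fraction of false nearest neighbors when going from dimension m to m+1."""
    em = embed(x, m, tau)
    em1 = embed(x, m + 1, tau)
    n = len(em1)                       # points that exist in both embeddings
    false = 0
    for i in range(n):
        d = np.linalg.norm(em[:n] - em[i], axis=1)
        d[i] = np.inf                  # exclude the point itself
        j = np.argmin(d)               # nearest neighbor in dimension m
        extra = abs(em1[i, -1] - em1[j, -1])
        if d[j] > 0 and extra / d[j] > rtol:
            false += 1
    return false / n

x = np.sin(0.3 * np.arange(2000)) + 0.01 * np.random.default_rng(2).standard_normal(2000)
print([round(fnn_fraction(x, m), 3) for m in (1, 2, 3)])
```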

  1. Some regularities of spatial and time distribution of organogenous material in Upper-Pleistocene and Holocene sediments of Central Asia (from the data of Carbon-isotope dating)

    International Nuclear Information System (INIS)

    Pshenin, G.N.; Steklenkov, A.P.; Varushchenko, A.N.

    1991-01-01

    The analysis of the space-time distribution of ancient organogenous material is carried out by generalizing practically all currently available data on the radiocarbon dating of Upper Pleistocene and Holocene sediments in Middle Asia. The investigations were performed to study the variability of humidification over a specific territory of Middle Asia within a given period of time. Three rather clearly delimited vertical height intervals are determined from the results of the isotope dating of wood, coal, peat and mollusc samples

  2. Paleocene-Eocene and Plio-Pleistocene sea-level changes as "species pumps" in Southeast Asia: Evidence from Althepus spiders.

    Science.gov (United States)

    Li, Fengyuan; Li, Shuqiang

    2018-05-17

    Sea-level change has been viewed as a primary driver in the formation of biodiversity. Early studies confirmed that Plio-Pleistocene sea-level changes led to the isolation and subsequent genetic differentiation of Southeast (SE) Asian organisms over short geological timescales. However, the long-term consequences of sea-level fluctuations remain unclear. Herein, we analyze the evolutionary history of Althepus (spiders), whose distribution encompasses Indo-Burma and the Sunda shelf islands, to understand how sea-level changes over shallow and deep timescales affected their history. Our integrative analyses, including phylogeny, divergence times, ancestral area reconstruction and diversification dynamics, reveal an intricate pattern of diversification, probably triggered by sea-level fluctuations during the Paleocene-Eocene and Plio-Pleistocene. The timing of one early divergence between the Indo-Burmese and Sundaic species coincides with late Paleocene and early Eocene high global sea levels, which induced the formation of inland seaways in the Thai-Malay Peninsula. Subsequently lowered sea levels could have provided a land bridge for dispersal and colonization across the Isthmus of Kra. Analyses suggest that Plio-Pleistocene sea-level rises contributed to the recent divergence of many species. Thus, our findings cannot reject the hypothesis that sea-level changes during the Paleocene-Eocene and Plio-Pleistocene played a major role in generating biodiversity in SE Asia; sea-level changes can act as "species pumps". Copyright © 2018 Elsevier Inc. All rights reserved.

  3. CauseMap: fast inference of causality from complex time series.

    Science.gov (United States)

    Maher, M Cyrus; Hernandez, Ryan D

    2015-01-01

    Background. Establishing health-related causal relationships is a central pursuit in biomedical research. Yet, the interdependent non-linearity of biological systems renders causal dynamics laborious and at times impractical to disentangle. This pursuit is further impeded by the dearth of time series that are sufficiently long to observe and understand recurrent patterns of flux. However, as data generation costs plummet and technologies like wearable devices democratize data collection, we anticipate a coming surge in the availability of biomedically-relevant time series data. Given the life-saving potential of these burgeoning resources, it is critical to invest in the development of open source software tools that are capable of drawing meaningful insight from vast amounts of time series data. Results. Here we present CauseMap, the first open source implementation of convergent cross mapping (CCM), a method for establishing causality from long time series data (≳25 observations). Compared to existing time series methods, CCM has the advantage of being model-free and robust to unmeasured confounding that could otherwise induce spurious associations. CCM builds on Takens' Theorem, a well-established result from dynamical systems theory that requires only mild assumptions. This theorem allows us to reconstruct high dimensional system dynamics using a time series of only a single variable. These reconstructions can be thought of as shadows of the true causal system. If reconstructed shadows can predict points from opposing time series, we can infer that the corresponding variables are providing views of the same causal system, and so are causally related. Unlike traditional metrics, this test can establish the directionality of causation, even in the presence of feedback loops. Furthermore, since CCM can extract causal relationships from time series of, e.g., a single individual, it may be a valuable tool for personalized medicine. We implement CCM in Julia, a
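
    CauseMap itself is written in Julia and is not reproduced here; the following is a rough Python sketch of the convergent cross mapping idea the abstract describes, assuming an illustrative embedding dimension, delay, exponential neighbor weighting, and a toy driver-response pair. Prediction skill should grow with library size when a causal link is present.

```python
# Rough convergent cross mapping (CCM) sketch: reconstruct the shadow manifold
# of y, use its nearest neighbors to cross-map x, and check whether prediction
# skill grows with library size. E, tau and the toy data are assumptions.
import numpy as np

def shadow(x, E, tau):
    n = len(x) - (E - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(E)])

def ccm_skill(x, y, lib_size, E=3, tau=1, rng=None):
    """Correlation between x and its cross-mapped estimate from y's shadow manifold."""
    if rng is None:
        rng = np.random.default_rng(0)
    My = shadow(y, E, tau)
    x_al = x[(E - 1) * tau:]                     # x aligned with shadow points
    lib = rng.choice(len(My), size=lib_size, replace=False)
    preds = np.empty(len(My))
    for i in range(len(My)):
        d = np.linalg.norm(My[lib] - My[i], axis=1)
        d[lib == i] = np.inf                     # never use the point itself
        order = np.argsort(d)[:E + 1]            # E+1 nearest library neighbors
        w = np.exp(-d[order] / max(d[order][0], 1e-12))
        preds[i] = np.sum(w * x_al[lib[order]]) / w.sum()
    return np.corrcoef(x_al, preds)[0, 1]

# toy driver-response pair: x drives y, so y's manifold should recover x
rng = np.random.default_rng(3)
x = np.cumsum(rng.standard_normal(500))
y = np.roll(x, 1) + 0.1 * rng.standard_normal(500)
print([round(ccm_skill(x, y, L, rng=rng), 2) for L in (50, 150, 400)])
```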

  4. CauseMap: fast inference of causality from complex time series

    Directory of Open Access Journals (Sweden)

    M. Cyrus Maher

    2015-03-01

    Full Text Available Background. Establishing health-related causal relationships is a central pursuit in biomedical research. Yet, the interdependent non-linearity of biological systems renders causal dynamics laborious and at times impractical to disentangle. This pursuit is further impeded by the dearth of time series that are sufficiently long to observe and understand recurrent patterns of flux. However, as data generation costs plummet and technologies like wearable devices democratize data collection, we anticipate a coming surge in the availability of biomedically-relevant time series data. Given the life-saving potential of these burgeoning resources, it is critical to invest in the development of open source software tools that are capable of drawing meaningful insight from vast amounts of time series data. Results. Here we present CauseMap, the first open source implementation of convergent cross mapping (CCM), a method for establishing causality from long time series data (≳25 observations). Compared to existing time series methods, CCM has the advantage of being model-free and robust to unmeasured confounding that could otherwise induce spurious associations. CCM builds on Takens' Theorem, a well-established result from dynamical systems theory that requires only mild assumptions. This theorem allows us to reconstruct high dimensional system dynamics using a time series of only a single variable. These reconstructions can be thought of as shadows of the true causal system. If reconstructed shadows can predict points from opposing time series, we can infer that the corresponding variables are providing views of the same causal system, and so are causally related. Unlike traditional metrics, this test can establish the directionality of causation, even in the presence of feedback loops. Furthermore, since CCM can extract causal relationships from time series of, e.g., a single individual, it may be a valuable tool for personalized medicine. We implement

  5. Time domain series system definition and gear set reliability modeling

    International Nuclear Information System (INIS)

    Xie, Liyang; Wu, Ningxiang; Qian, Wenxue

    2016-01-01

    Time-dependent multi-configuration is a typical feature for mechanical systems such as gear trains and chain drives. As a series system, a gear train is distinct from a traditional series system, such as a chain, in load transmission path, system-component relationship, system functioning manner, as well as time-dependent system configuration. Firstly, the present paper defines time-domain series system to which the traditional series system reliability model is not adequate. Then, system specific reliability modeling technique is proposed for gear sets, including component (tooth) and subsystem (tooth-pair) load history description, material priori/posterior strength expression, time-dependent and system specific load-strength interference analysis, as well as statistically dependent failure events treatment. Consequently, several system reliability models are developed for gear sets with different tooth numbers in the scenario of tooth root material ultimate tensile strength failure. The application of the models is discussed in the last part, and the differences between the system specific reliability model and the traditional series system reliability model are illustrated by virtue of several numerical examples. - Highlights: • A new type of series system, i.e. time-domain multi-configuration series system is defined, that is of great significance to reliability modeling. • Multi-level statistical analysis based reliability modeling method is presented for gear transmission system. • Several system specific reliability models are established for gear set reliability estimation. • The differences between the traditional series system reliability model and the new model are illustrated.

  6. Track Irregularity Time Series Analysis and Trend Forecasting

    Directory of Open Access Journals (Sweden)

    Jia Chaolong

    2012-01-01

    Full Text Available The combination of linear and nonlinear methods is widely used in the prediction of time series data. This paper analyzes track irregularity time series data by using gray incidence degree models and methods of data transformation, trying to find the connotative relationship within the time series data. In this paper, GM(1,1), which is based on a first-order, single-variable linear differential equation, is used, after an adaptive improvement and error correction, to predict the long-term changing trend of track irregularity at a fixed measuring point; the stochastic linear AR model, the Kalman filtering model, and an artificial neural network model are applied to predict the short-term changing trend of track irregularity at the unit-section level. Both the long-term and short-term predictions show that the models are effective and can achieve the expected accuracy.
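
    A self-contained sketch of the classical GM(1,1) building block mentioned above (without the paper's adaptive improvement or error correction); the short input series and forecast horizon are illustrative.

```python
# GM(1,1) grey model sketch: accumulate the series (AGO), fit dx1/dt + a*x1 = b
# by least squares, evaluate the response sequence, and difference back.
import numpy as np

def gm11_forecast(x0, steps=5):
    """Forecast `steps` future values of the positive series x0 with GM(1,1)."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                  # accumulated series (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                       # mean sequence
    B = np.column_stack([-z1, np.ones(len(z1))])
    Y = x0[1:]
    a, b = np.linalg.lstsq(B, Y, rcond=None)[0]         # grey parameters
    k = np.arange(1, len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a   # response sequence
    x1_hat = np.concatenate([[x0[0]], x1_hat])
    return np.diff(x1_hat)[len(x0) - 1:]                # inverse AGO, future part

print(gm11_forecast([2.87, 3.28, 3.34, 3.62, 3.70], steps=3))
```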

  7. PRESEE: an MDL/MML algorithm to time-series stream segmenting.

    Science.gov (United States)

    Xu, Kaikuo; Jiang, Yexi; Tang, Mingjie; Yuan, Changan; Tang, Changjie

    2013-01-01

    The time-series stream is one of the most common data types in the data mining field. It is prevalent in fields such as the stock market, ecology, and medical care. Segmentation is a key step to accelerate the processing speed of time-series stream mining. Previous segmentation algorithms mainly focused on improving precision rather than on efficiency. Moreover, the performance of these algorithms depends heavily on parameters, which are hard for the users to set. In this paper, we propose PRESEE (parameter-free, real-time, and scalable time-series stream segmenting algorithm), which greatly improves the efficiency of time-series stream segmenting. PRESEE is based on both MDL (minimum description length) and MML (minimum message length) methods, which can segment the data automatically. To evaluate the performance of PRESEE, we conduct several experiments on time-series streams of different types and compare it with a state-of-the-art algorithm. The empirical results show that PRESEE is very efficient for real-time stream datasets, improving segmenting speed nearly tenfold. The novelty of this algorithm is further demonstrated by the application of PRESEE to segmenting real-time stream datasets from the ChinaFLUX sensor network data stream.

  8. Time-varying surrogate data to assess nonlinearity in nonstationary time series: application to heart rate variability.

    Science.gov (United States)

    Faes, Luca; Zhao, He; Chon, Ki H; Nollo, Giandomenico

    2009-03-01

    We propose a method to extend to time-varying (TV) systems the procedure for generating typical surrogate time series, in order to test the presence of nonlinear dynamics in potentially nonstationary signals. The method is based on fitting a TV autoregressive (AR) model to the original series and then regressing the model coefficients with random replacements of the model residuals to generate TV AR surrogate series. The proposed surrogate series were used in combination with a TV sample entropy (SE) discriminating statistic to assess nonlinearity in both simulated and experimental time series, in comparison with traditional time-invariant (TIV) surrogates combined with the TIV SE discriminating statistic. Analysis of simulated time series showed that using TIV surrogates, linear nonstationary time series may be erroneously regarded as nonlinear and weak TV nonlinearities may remain unrevealed, while the use of TV AR surrogates markedly increases the probability of a correct interpretation. Application to short (500 beats) heart rate variability (HRV) time series recorded at rest (R), after head-up tilt (T), and during paced breathing (PB) showed: 1) modifications of the SE statistic that were well interpretable with the known cardiovascular physiology; 2) significant contribution of nonlinear dynamics to HRV in all conditions, with significant increase during PB at 0.2 Hz respiration rate; and 3) a disagreement between TV AR surrogates and TIV surrogates in about a quarter of the series, suggesting that nonstationarity may affect HRV recordings and bias the outcome of the traditional surrogate-based nonlinearity test.
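
    As a point of reference for the method described above, here is a sketch of ordinary time-invariant AR surrogates (the baseline the paper extends to time-varying AR coefficients); the AR order, least-squares fit and residual resampling scheme are assumptions.

```python
# Time-invariant AR surrogate sketch: fit an AR model by least squares, then
# rebuild the series from randomly resampled residuals so that linear structure
# is kept while nonlinear structure is destroyed.
import numpy as np

def ar_surrogate(y, order=8, rng=None):
    """Return one AR-based surrogate of y (linear structure kept, residuals shuffled)."""
    if rng is None:
        rng = np.random.default_rng()
    y = np.asarray(y, dtype=float)
    # design matrix of lagged values (lags 1..order) for ordinary least squares
    X = np.column_stack([y[order - k - 1:len(y) - k - 1] for k in range(order)])
    X = np.column_stack([np.ones(len(X)), X])
    target = y[order:]
    beta = np.linalg.lstsq(X, target, rcond=None)[0]
    resid = target - X @ beta
    surr = list(y[:order])                          # seed with observed values
    for _ in range(order, len(y)):
        past = np.array(surr[-order:][::-1])        # y_{t-1}, ..., y_{t-order}
        eps = rng.choice(resid)                     # random residual replacement
        surr.append(beta[0] + beta[1:] @ past + eps)
    return np.array(surr)

rng = np.random.default_rng(6)
y = np.sin(0.2 * np.arange(800)) + 0.3 * rng.standard_normal(800)
surr = ar_surrogate(y, rng=rng)      # same linear spectrum, scrambled nonlinearity
```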

  9. Local normalization: Uncovering correlations in non-stationary financial time series

    Science.gov (United States)

    Schäfer, Rudi; Guhr, Thomas

    2010-09-01

    The measurement of correlations between financial time series is of vital importance for risk management. In this paper we address an estimation error that stems from the non-stationarity of the time series. We put forward a method to rid the time series of local trends and variable volatility, while preserving cross-correlations. We test this method in a Monte Carlo simulation, and apply it to empirical data for the S&P 500 stocks.
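
    A minimal sketch of local normalization for a single return series, assuming a trailing window of 13 observations (the window length used in the paper may differ); in practice the same transformation is applied to every stock before computing cross-correlations.

```python
# Local normalization sketch: inside a short moving window, subtract the local
# mean and divide by the local standard deviation, so trends and volatility
# changes are removed while cross-correlations between series survive.
import numpy as np

def local_normalize(returns, n=13):
    """Locally normalize a 1-D return series with trailing windows of length n."""
    r = np.asarray(returns, dtype=float)
    out = np.full_like(r, np.nan)
    for t in range(n - 1, len(r)):
        window = r[t - n + 1:t + 1]
        sd = window.std(ddof=1)
        if sd > 0:
            out[t] = (r[t] - window.mean()) / sd
    return out                                      # first n-1 values stay NaN
```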

  10. Fuzzy time-series based on Fibonacci sequence for stock price forecasting

    Science.gov (United States)

    Chen, Tai-Liang; Cheng, Ching-Hsue; Jong Teoh, Hia

    2007-07-01

    Time-series models have been utilized to make reasonably accurate predictions in the areas of stock price movements, academic enrollments, weather, etc. To improve the forecasting performance of fuzzy time-series models, this paper proposes a new model, which incorporates the concept of the Fibonacci sequence, the framework of Song and Chissom's model and the weighted method of Yu's model. This paper employs 5 years of TSMC (Taiwan Semiconductor Manufacturing Company) stock price data and 13 years of TAIEX (Taiwan Stock Exchange Capitalization Weighted Stock Index) stock index data as experimental datasets. By comparing our forecasting performances with Chen's (Forecasting enrollments based on fuzzy time-series. Fuzzy Sets Syst. 81 (1996) 311-319), Yu's (Weighted fuzzy time-series models for TAIEX forecasting. Physica A 349 (2004) 609-624) and Huarng's (The application of neural networks to forecast fuzzy time series. Physica A 336 (2006) 481-491) models, we conclude that the proposed model surpasses these conventional fuzzy time-series models in accuracy.

  11. Parameterizing unconditional skewness in models for financial time series

    DEFF Research Database (Denmark)

    He, Changli; Silvennoinen, Annastiina; Teräsvirta, Timo

    In this paper we consider the third-moment structure of a class of time series models. It is often argued that the marginal distribution of financial time series such as returns is skewed. Therefore it is of importance to know what properties a model should possess if it is to accommodate...

  12. Self-organising mixture autoregressive model for non-stationary time series modelling.

    Science.gov (United States)

    Ni, He; Yin, Hujun

    2008-12-01

    Modelling non-stationary time series has been a difficult task for both parametric and nonparametric methods. One promising solution is to combine the flexibility of nonparametric models with the simplicity of parametric models. In this paper, the self-organising mixture autoregressive (SOMAR) network is adopted as such a mixture model. It breaks time series into underlying segments and at the same time fits local linear regressive models to the clusters of segments. In this way, a global non-stationary time series is represented by a dynamic set of local linear regressive models. Neural gas is used for a more flexible structure of the mixture model. Furthermore, a new similarity measure has been introduced in the self-organising network to better quantify the similarity of time series segments. The network can be used naturally in modelling and forecasting non-stationary time series. Experiments on artificial, benchmark time series (e.g. Mackey-Glass) and real-world data (e.g. numbers of sunspots and Forex rates) are presented and the results show that the proposed SOMAR network is effective and superior to other similar approaches.

  13. The Prediction of Teacher Turnover Employing Time Series Analysis.

    Science.gov (United States)

    Costa, Crist H.

    The purpose of this study was to combine knowledge of teacher demographic data with time-series forecasting methods to predict teacher turnover. Moving averages and exponential smoothing were used to forecast discrete time series. The study used data collected from the 22 largest school districts in Iowa, designated as FACT schools. Predictions…

  14. Stacked Heterogeneous Neural Networks for Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    Florin Leon

    2010-01-01

    Full Text Available A hybrid model for time series forecasting is proposed. It is a stacked neural network, containing one standard multilayer perceptron with bipolar sigmoid activation functions and another with an exponential activation function in the output layer. As shown by the case studies, the proposed stacked hybrid neural model performs well on a variety of benchmark time series. The combination of weights of the two stack components that leads to optimal performance is also studied.

  15. Chaotic time series prediction: From one to another

    International Nuclear Information System (INIS)

    Zhao Pengfei; Xing Lei; Yu Jun

    2009-01-01

    In this Letter, a new local linear prediction model is proposed to predict a chaotic time series of a component x(t) by using the chaotic time series of another component y(t) in the same system with x(t). Our approach is based on the phase space reconstruction coming from the Takens embedding theorem. To illustrate our results, we present an example of Lorenz system and compare with the performance of the original local linear prediction model.

  16. Grammar-based feature generation for time-series prediction

    CERN Document Server

    De Silva, Anthony Mihirana

    2015-01-01

    This book proposes a novel approach for time-series prediction using machine learning techniques with automatic feature generation. Application of machine learning techniques to predict time-series continues to attract considerable attention due to the difficulty of the prediction problems compounded by the non-linear and non-stationary nature of the real world time-series. The performance of machine learning techniques, among other things, depends on suitable engineering of features. This book proposes a systematic way for generating suitable features using context-free grammar. A number of feature selection criteria are investigated and a hybrid feature generation and selection algorithm using grammatical evolution is proposed. The book contains graphical illustrations to explain the feature generation process. The proposed approaches are demonstrated by predicting the closing price of major stock market indices, peak electricity load and net hourly foreign exchange client trade volume. The proposed method ...

  17. Forecasting autoregressive time series under changing persistence

    DEFF Research Database (Denmark)

    Kruse, Robinson

    Changing persistence in time series models means that a structural change from nonstationarity to stationarity or vice versa occurs over time. Such a change has important implications for forecasting, as negligence may lead to inaccurate model predictions. This paper derives generally applicable...

  18. Recurrent Neural Networks for Multivariate Time Series with Missing Values.

    Science.gov (United States)

    Che, Zhengping; Purushotham, Sanjay; Cho, Kyunghyun; Sontag, David; Liu, Yan

    2018-04-17

    Multivariate time series data in practical applications, such as health care, geoscience, and biology, are characterized by a variety of missing values. In time series prediction and other related tasks, it has been noted that missing values and their missing patterns are often correlated with the target labels, a.k.a., informative missingness. There is very limited work on exploiting the missing patterns for effective imputation and improving prediction performance. In this paper, we develop novel deep learning models, namely GRU-D, as one of the early attempts. GRU-D is based on Gated Recurrent Unit (GRU), a state-of-the-art recurrent neural network. It takes two representations of missing patterns, i.e., masking and time interval, and effectively incorporates them into a deep model architecture so that it not only captures the long-term temporal dependencies in time series, but also utilizes the missing patterns to achieve better prediction results. Experiments of time series classification tasks on real-world clinical datasets (MIMIC-III, PhysioNet) and synthetic datasets demonstrate that our models achieve state-of-the-art performance and provide useful insights for better understanding and utilization of missing values in time series analysis.

  19. Conditional time series forecasting with convolutional neural networks

    NARCIS (Netherlands)

    A. Borovykh (Anastasia); S.M. Bohte (Sander); C.W. Oosterlee (Cornelis)

    2017-01-01

    textabstractForecasting financial time series using past observations has been a significant topic of interest. While temporal relationships in the data exist, they are difficult to analyze and predict accurately due to the non-linear trends and noise present in the series. We propose to learn these

  20. Time Series Analysis of Wheat Futures Reward in China

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Unlike most existing research, which focuses on single futures contracts and lacks comparisons across different periods, this paper describes the statistical characteristics of the wheat futures reward time series of the Zhengzhou Commodity Exchange over the most recent three years. Besides the basic statistical analysis, the paper used the GARCH and EGARCH models to describe the time series that exhibited the ARCH effect and analyzed the persistence of volatility shocks and the leverage effect. The results showed that, compared with a normal distribution, the wheat futures reward series were non-normal, leptokurtic and thick-tailed. The study also found that part of the reward series had no autocorrelation. Among the six correlated series, three presented the ARCH effect. Using the autoregressive distributed lag model, the GARCH model and the EGARCH model, the paper demonstrates the persistence of volatility shocks and the leverage effect in the wheat futures reward time series. The results reveal that, on the one hand, the statistical characteristics of the wheat futures reward are broadly similar to those of mature futures markets abroad; on the other hand, they reflect some shortcomings, such as the immaturity of the Chinese futures market and over-regulation by the government.

  1. The first occurence of a pleistocenic coral along the Brazilian coast - Age dating of the maximum of the penultimate transgression

    International Nuclear Information System (INIS)

    Martin, L.; Bittencourt, A.C.S.P.; Silva Vilas Boas, G. da

    1982-01-01

    Age-dating work on a coral from Olivenca, Bahia, Brazil, has disclosed the first occurrence of a Pleistocene coral along the Brazilian coast. This coral has its top at the present high tide level and is covered by a series of beach ridges formed after the maximum of the penultimate transgression that rose above present sea level. Five determinations by the Ionium (230Th)/Uranium method produced ages ranging from 116,000 to 142,000 years B.P., indicating that the maximum in the area took place 120,000-125,000 years B.P., consistent with its documentation in other parts of the world. At that time, mean sea level was 8 ± 2 m above the present. (Author) [pt

  2. forecasting with nonlinear time series model: a monte-carlo

    African Journals Online (AJOL)

    PUBLICATIONS1

    erated recursively up to any step greater than one. For a nonlinear time series model, a point forecast for step one can be done easily, as in the linear case, but a forecast for a step greater than or equal to ..... London. Franses, P. H. (1998). Time Series Models for Business and Economic Forecasting, Cambridge University Press.

  3. Time series analysis of temporal networks

    Science.gov (United States)

    Sikdar, Sandipan; Ganguly, Niloy; Mukherjee, Animesh

    2016-01-01

    A common but important feature of all real-world networks is that they are temporal in nature, i.e., the network structure changes over time. Due to this dynamic nature, it becomes difficult to propose suitable growth models that can explain the various important characteristic properties of these networks. In fact, in many application oriented studies only knowing these properties is sufficient. For instance, if one wishes to launch a targeted attack on a network, this can be done even without the knowledge of the full network structure; rather an estimate of some of the properties is sufficient to launch the attack. We, in this paper, show that even if the network structure at a future time point is not available, one can still estimate its properties. We propose a novel method to map a temporal network to a set of time series instances, analyze them and, using a standard time series forecast model, try to predict the properties of a temporal network at a later time instance. To this aim, we consider eight properties, such as the number of active nodes, average degree, clustering coefficient, etc., and apply our prediction framework to them. We mainly focus on the temporal network of human face-to-face contacts and observe that it represents a stochastic process with memory that can be modeled as Auto-Regressive-Integrated-Moving-Average (ARIMA). We use cross validation techniques to find the percentage accuracy of our predictions. An important observation is that the frequency domain properties of the time series obtained from spectrogram analysis could be used to refine the prediction framework by identifying beforehand the cases where the error in prediction is likely to be high. This leads to an improvement of 7.96% (for error level ≤20%) in prediction accuracy on average across all datasets. As an application, we show how such a prediction scheme can be used to launch targeted attacks on temporal networks. Contribution to the Topical Issue
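
    A toy sketch of the pipeline described above, assuming networkx graphs as snapshots, three of the eight properties, an illustrative ARIMA order, and random graphs in place of the face-to-face contact data; the paper's spectrogram-based refinement is not included.

```python
# Map temporal-network snapshots to per-property time series and forecast them
# with a standard ARIMA model.
import networkx as nx
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def property_series(snapshots):
    """snapshots: list of networkx Graphs, one per time window."""
    rows = []
    for g in snapshots:
        degrees = [d for _, d in g.degree()]
        rows.append((g.number_of_nodes(),
                     float(np.mean(degrees)) if degrees else 0.0,
                     nx.average_clustering(g) if g.number_of_nodes() else 0.0))
    return np.array(rows)                     # shape: (time, property)

def forecast_property(series, steps=5, order=(2, 1, 1)):
    """One-property ARIMA forecast; the (p, d, q) order here is illustrative."""
    model = ARIMA(series, order=order).fit()
    return model.forecast(steps=steps)

# toy usage: random graphs standing in for face-to-face contact snapshots
snaps = [nx.gnp_random_graph(60, 0.05 + 0.001 * t, seed=t) for t in range(40)]
props = property_series(snaps)
print(forecast_property(props[:, 1]))         # forecast of average degree
```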

  4. The Hierarchical Spectral Merger Algorithm: A New Time Series Clustering Procedure

    KAUST Repository

    Euán, Carolina; Ombao, Hernando; Ortega, Joaquín

    2018-01-01

    We present a new method for time series clustering which we call the Hierarchical Spectral Merger (HSM) method. This procedure is based on the spectral theory of time series and identifies series that share similar oscillations or waveforms

  5. Notes on economic time series analysis system theoretic perspectives

    CERN Document Server

    Aoki, Masanao

    1983-01-01

    In seminars and graduate level courses I have had several opportunities to discuss modeling and analysis of time series with economists and economic graduate students during the past several years. These experiences made me aware of a gap between what economic graduate students are taught about vector-valued time series and what is available in recent system literature. Wishing to fill or narrow the gap that I suspect is more widely spread than my personal experiences indicate, I have written these notes to augment and reorganize materials I have given in these courses and seminars. I have endeavored to present, in as much a self-contained way as practicable, a body of results and techniques in system theory that I judge to be relevant and useful to economists interested in using time series in their research. I have essentially acted as an intermediary and interpreter of system theoretic results and perspectives in time series by filtering out non-essential details, and presenting coherent accounts of wha...

  6. Trophic interactions between larger crocodylians and giant tortoises on Aldabra Atoll, Western Indian Ocean, during the Late Pleistocene.

    Science.gov (United States)

    Scheyer, Torsten M; Delfino, Massimo; Klein, Nicole; Bunbury, Nancy; Fleischer-Dogley, Frauke; Hansen, Dennis M

    2018-01-01

    Today, the UNESCO World Heritage Site of Aldabra Atoll is home to about 100,000 giant tortoises, Aldabrachelys gigantea , whose fossil record goes back to the Late Pleistocene. New Late Pleistocene fossils (age ca. 90-125,000 years) from the atoll revealed some appendicular bones and numerous shell fragments of giant tortoises and cranial and postcranial elements of crocodylians. Several tortoise bones show circular holes, pits and scratch marks that are interpreted as bite marks of crocodylians. The presence of a Late Pleistocene crocodylian species, Aldabrachampsus dilophus , has been known for some time, but the recently found crocodylian remains presented herein are distinctly larger than those previously described. This indicates the presence of at least some larger crocodylians, either of the same or of a different species, on the atoll. These larger crocodylians, likely the apex predators in the Aldabra ecosystem at the time, were well capable of inflicting damage on even very large giant tortoises. We thus propose an extinct predator-prey interaction between crocodylians and giant tortoises during the Late Pleistocene, when both groups were living sympatrically on Aldabra, and we discuss scenarios for the crocodylians directly attacking the tortoises or scavenging on recently deceased animals.

  7. Dynamical analysis and visualization of tornadoes time series.

    Directory of Open Access Journals (Sweden)

    António M Lopes

    Full Text Available In this paper we analyze the behavior of tornado time-series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series involving 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.

  8. Dynamical analysis and visualization of tornadoes time series.

    Science.gov (United States)

    Lopes, António M; Tenreiro Machado, J A

    2015-01-01

    In this paper we analyze the behavior of tornado time-series in the U.S. from the perspective of dynamical systems. A tornado is a violently rotating column of air extending from a cumulonimbus cloud down to the ground. Such phenomena reveal features that are well described by power law functions and unveil characteristics found in systems with long range memory effects. Tornado time series are viewed as the output of a complex system and are interpreted as a manifestation of its dynamics. Tornadoes are modeled as sequences of Dirac impulses with amplitude proportional to the event size. First, a collection of time series involving 64 years is analyzed in the frequency domain by means of the Fourier transform. The amplitude spectra are approximated by power law functions and their parameters are read as an underlying signature of the system dynamics. Second, the concept of circular time is adopted and the collective behavior of tornadoes is analyzed. Clustering techniques are then adopted to identify and visualize the emerging patterns.

  9. "Observation Obscurer" - Time Series Viewer, Editor and Processor

    Science.gov (United States)

    Andronov, I. L.

    The program is described, which contains a set of subroutines suitable for fast viewing and interactive filtering and processing of regularly and irregularly spaced time series. Being a 32-bit DOS application, it may be used as a default fast viewer/editor of time series in any computer shell ("commander") or in Windows. It allows one to view the data in the "time" or "phase" mode, to remove ("obscure") or filter outstanding bad points, to make scale transformations and smoothing using a few methods (e.g. mean with phase binning, determination of the statistically optimal number of phase bins, a "running parabola" (Andronov, 1997, As. Ap. Suppl, 125, 207) fit) and to make time series analysis using some methods, e.g. correlation, autocorrelation and histogram analysis, and determination of extrema, etc. Some features have been developed specially for variable star observers, e.g. the barycentric correction, the creation and fast analysis of "O-C" diagrams, etc. The manual for the "hot keys" is presented. The computer code was compiled with 32-bit Free Pascal (www.freepascal.org).

  10. Modelling road accidents: An approach using structural time series

    Science.gov (United States)

    Junus, Noor Wahida Md; Ismail, Mohd Tahir

    2014-09-01

    In this paper, the trend of road accidents in Malaysia for the years 2001 until 2012 was modelled using a structural time series approach. The structural time series model was identified using a stepwise method, and the residuals for each model were tested. The best-fitted model was chosen based on the smallest Akaike Information Criterion (AIC) and prediction error variance. In order to check the quality of the model, a data validation procedure was performed by predicting the monthly number of road accidents for the year 2012. Results indicate that the best specification of the structural time series model to represent road accidents is the local level with a seasonal model.
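
    A hedged sketch of the kind of specification the paper selects (a local level plus a monthly seasonal component), using statsmodels and simulated monthly counts in place of the Malaysian accident data; the stepwise structure search and residual diagnostics are not reproduced.

```python
# Local level + seasonal structural time series sketch with statsmodels.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
months = 12 * 10
level = np.cumsum(rng.normal(0, 2, months)) + 400        # slowly drifting level
season = 30 * np.sin(2 * np.pi * np.arange(months) / 12) # yearly seasonality
accidents = level + season + rng.normal(0, 10, months)

model = sm.tsa.UnobservedComponents(accidents,
                                    level='local level',
                                    seasonal=12)
result = model.fit(disp=False)
print(result.aic)                      # compare AIC across candidate structures
print(result.forecast(12))             # hold-out style check on the next year
```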

  11. Multiscale Poincaré plots for visualizing the structure of heartbeat time series.

    Science.gov (United States)

    Henriques, Teresa S; Mariani, Sara; Burykin, Anton; Rodrigues, Filipa; Silva, Tiago F; Goldberger, Ary L

    2016-02-09

    Poincaré delay maps are widely used in the analysis of cardiac interbeat interval (RR) dynamics. To facilitate visualization of the structure of these time series, we introduce multiscale Poincaré (MSP) plots. Starting with the original RR time series, the method employs a coarse-graining procedure to create a family of time series, each of which represents the system's dynamics in a different time scale. Next, the Poincaré plots are constructed for the original and the coarse-grained time series. Finally, as an optional adjunct, color can be added to each point to represent its normalized frequency. We illustrate the MSP method on simulated Gaussian white and 1/f noise time series. The MSP plots of 1/f noise time series reveal relative conservation of the phase space area over multiple time scales, while those of white noise show a marked reduction in area. We also show how MSP plots can be used to illustrate the loss of complexity when heartbeat time series from healthy subjects are compared with those from patients with chronic (congestive) heart failure syndrome or with atrial fibrillation. This generalized multiscale approach to Poincaré plots may be useful in visualizing other types of time series.
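
    A small sketch of the coarse-graining step behind MSP plots, assuming a synthetic RR series and a handful of scales; plotting the returned point clouds (and the optional frequency-based coloring) is left out.

```python
# Multiscale Poincaré sketch: coarse-grain the RR series at several scales and
# collect the (RR_n, RR_{n+1}) point clouds.
import numpy as np

def coarse_grain(rr, scale):
    """Non-overlapping window averages of the RR series at the given scale."""
    n = len(rr) // scale
    return np.asarray(rr[:n * scale], dtype=float).reshape(n, scale).mean(axis=1)

def multiscale_poincare(rr, scales=(1, 2, 4, 8)):
    """Return {scale: (x, y)} pairs for Poincaré plots at each time scale."""
    clouds = {}
    for s in scales:
        cg = coarse_grain(rr, s)
        clouds[s] = (cg[:-1], cg[1:])          # RR_n on x, RR_{n+1} on y
    return clouds

rr = 0.8 + 0.05 * np.random.default_rng(5).standard_normal(3000)   # synthetic RR
clouds = multiscale_poincare(rr)
print({s: round(np.ptp(x), 3) for s, (x, y) in clouds.items()})    # spread shrinks for white noise
```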

  12. Middle Pleistocene protein sequences from the rhinoceros genus Stephanorhinus and the phylogeny of extant and extinct Middle/Late Pleistocene Rhinocerotidae.

    Science.gov (United States)

    Welker, Frido; Smith, Geoff M; Hutson, Jarod M; Kindler, Lutz; Garcia-Moreno, Alejandro; Villaluenga, Aritza; Turner, Elaine; Gaudzinski-Windheuser, Sabine

    2017-01-01

    Ancient protein sequences are increasingly used to elucidate the phylogenetic relationships between extinct and extant mammalian taxa. Here, we apply these recent developments to Middle Pleistocene bone specimens of the rhinoceros genus Stephanorhinus . No biomolecular sequence data is currently available for this genus, leaving phylogenetic hypotheses on its evolutionary relationships to extant and extinct rhinoceroses untested. Furthermore, recent phylogenies based on Rhinocerotidae (partial or complete) mitochondrial DNA sequences differ in the placement of the Sumatran rhinoceros ( Dicerorhinus sumatrensis ). Therefore, studies utilising ancient protein sequences from Middle Pleistocene contexts have the potential to provide further insights into the phylogenetic relationships between extant and extinct species, including Stephanorhinus and Dicerorhinus . ZooMS screening (zooarchaeology by mass spectrometry) was performed on several Late and Middle Pleistocene specimens from the genus Stephanorhinus , subsequently followed by liquid chromatography-tandem mass spectrometry (LC-MS/MS) to obtain ancient protein sequences from a Middle Pleistocene Stephanorhinus specimen. We performed parallel analysis on a Late Pleistocene woolly rhinoceros specimen and extant species of rhinoceroses, resulting in the availability of protein sequence data for five extant species and two extinct genera. Phylogenetic analysis additionally included all extant Perissodactyla genera ( Equus , Tapirus ), and was conducted using Bayesian (MrBayes) and maximum-likelihood (RAxML) methods. Various ancient proteins were identified in both the Middle and Late Pleistocene rhinoceros samples. Protein degradation and proteome complexity are consistent with an endogenous origin of the identified proteins. Phylogenetic analysis of informative proteins resolved the Perissodactyla phylogeny in agreement with previous studies in regards to the placement of the families Equidae, Tapiridae, and

  13. Time series patterns and language support in DBMS

    Science.gov (United States)

    Telnarova, Zdenka

    2017-07-01

    This contribution is focused on the pattern type Time Series as a semantically rich representation of data. Some examples of the implementation of this pattern type in traditional Data Base Management Systems are briefly presented. There are many approaches to manipulating and querying patterns. A crucial issue is a systematic approach to pattern management and a specific pattern query language that takes the semantics of patterns into consideration. The query language SQL-TS for manipulating patterns is demonstrated on Time Series data.

  14. Two-fractal overlap time series: Earthquakes and market crashes

    Indian Academy of Sciences (India)

    velocity over the other and time series of stock prices. An anticipation method for some of the crashes has been proposed here, based on these observations. Keywords. Cantor set; time series; earthquake; market crash. PACS Nos 05.00; 02.50.-r; 64.60; 89.65.Gh; 95.75.Wx. 1. Introduction. Capturing dynamical patterns of ...

  15. PLIO-PLEISTOCENE FOSSIL VERTEBRATES OF MONTE TUTTAVISTA (OROSEI, EASTERN SARDINIA, ITALY, AN OVERVIEW

    Directory of Open Access Journals (Sweden)

    LAURA ABBAZZI

    2004-11-01

    Full Text Available The preliminary results of the analysis of fossil vertebrate remains from 19 fissure fillings in the karst network at Monte Tuttavista (Orosei, Nuoro) are reported. About 80 taxa, among fishes, amphibians, reptiles, birds and mammals, have been recognised. These remains document the evolution of vertebrate assemblages in the Sardinian insular domain during a time interval apparently spanning the Late Pliocene to the Late Pleistocene or Holocene. A succession of at least four populating complexes has been identified, which documents the vertebrate colonisation phases from the Italian mainland and the following periods of insularity. Indeed, the occurrence of endemic taxa such as the murid Rhagapodemus minor, the primate Macaca cf. M. majori and the caprine Nesogoral suggests that some fissure fillings date to a phase close to the Plio/Pleistocene boundary, since these taxa occur at the Sardinian locality Capo Figari I, which has been dated to about 1.8 Ma. However, the presence of the "hunting-hyaena" Chasmaporthetes, never reported before in Sardinia, could suggest that the beginning of the vertebrate record of Monte Tuttavista is older, given that this carnivore is documented in European Middle Pliocene-Early Pleistocene localities. The vertebrate assemblages that document the most recent migratory phases in the karst network of Monte Tuttavista are characterised by the occurrence of the endemic megalocerine cervid Praemegaceros cazioti and the arvicolid Tyrrhenicola henseli, which are comparable with those occurring in other Late Pleistocene and early Holocene Sardinian sites.

  16. Nonlinear time series analysis with R

    CERN Document Server

    Huffaker, Ray; Rosa, Rodolfo

    2017-01-01

    In the process of data analysis, the investigator is often facing highly-volatile and random-appearing observed data. A vast body of literature shows that the assumption of underlying stochastic processes was not necessarily representing the nature of the processes under investigation and, when other tools were used, deterministic features emerged. Non Linear Time Series Analysis (NLTS) allows researchers to test whether observed volatility conceals systematic non linear behavior, and to rigorously characterize governing dynamics. Behavioral patterns detected by non linear time series analysis, along with scientific principles and other expert information, guide the specification of mechanistic models that serve to explain real-world behavior rather than merely reproducing it. Often there is a misconception regarding the complexity of the level of mathematics needed to understand and utilize the tools of NLTS (for instance Chaos theory). However, mathematics used in NLTS is much simpler than many other subjec...

  17. InSAR Deformation Time Series Processed On-Demand in the Cloud

    Science.gov (United States)

    Horn, W. B.; Weeden, R.; Dimarchi, H.; Arko, S. A.; Hogenson, K.

    2017-12-01

    During this past year, ASF has developed a cloud-based on-demand processing system known as HyP3 (http://hyp3.asf.alaska.edu/), the Hybrid Pluggable Processing Pipeline, for Synthetic Aperture Radar (SAR) data. The system makes it easy for a user who doesn't have the time or inclination to install and use complex SAR processing software to leverage SAR data in their research or operations. One such processing algorithm is generation of a deformation time series product, which is a series of images representing ground displacements over time, computed from a time series of interferometric SAR (InSAR) products. The set of software tools necessary to generate this useful product is difficult to install, configure, and use. Moreover, for a long time series with many images, the processing of just the interferograms can take days. Principally built by three undergraduate students at the ASF DAAC, the deformation time series processing relies on the new Amazon Batch service, which enables processing of jobs with complex interconnected dependencies in a straightforward and efficient manner. In the case of generating a deformation time series product from a stack of single-look complex SAR images, the system uses Batch to serialize the up-front processing, interferogram generation, optional tropospheric correction, and deformation time series generation. The most time consuming portion is the interferogram generation, because even for a fairly small stack of images many interferograms need to be processed. By using AWS Batch, the interferograms are all generated in parallel; the entire process completes in hours rather than days. Additionally, the individual interferograms are saved in Amazon's cloud storage, so that when new data is acquired in the stack, an updated time series product can be generated with minimal additional processing. This presentation will focus on the development techniques and enabling technologies that were used in developing the time

  18. [Vegetation spatial and temporal dynamic characteristics based on NDVI time series trajectories in grassland opencast coal mining].

    Science.gov (United States)

    Jia, Duo; Wang, Cang Jiao; Mu, Shou Guo; Zhao, Hua

    2017-06-18

    The spatiotemporal dynamic patterns of vegetation in mining areas are still unclear. This study utilized a time series trajectory segmentation algorithm to fit Landsat NDVI time series generated from fusion images at the peak of the growing season based on the ESTARFM algorithm. Based on the shape features of the fitted trajectories, this paper extracted five vegetation dynamic patterns: pre-disturbance type, continuous disturbance type, stabilization after disturbance type, stabilization between disturbance and recovery type, and recovery after disturbance type. The results indicated that the recovery after disturbance type was the dominant vegetation change pattern among the five types, accounting for 55.2% of the total number of pixels. It was followed by the stabilization after disturbance type and the continuous disturbance type, accounting for 25.6% and 11.0%, respectively. The pre-disturbance type and the stabilization between disturbance and recovery type accounted for 3.5% and 4.7%, respectively. Vegetation disturbance mainly occurred from 2004 to 2009 in the Shengli mining area. The onset of the stable state was in 2008 and its spatial locations were mainly distributed in the open-pit stope and waste dump. The recovery state mainly started in 2008 and 2010, while the areas were small and mainly distributed at the periphery of the open-pit stope and waste dump. The duration of disturbance was mainly 1 year. The stable period usually lasted 7 years. The recovery stage of the stabilization between disturbance and recovery type lasted 2 to 5 years, while the recovery after disturbance type often lasted 8 years.

  19. Vector bilinear autoregressive time series model and its superiority ...

    African Journals Online (AJOL)

    In this research, a vector bilinear autoregressive time series model was proposed and used to model three revenue series (X1, X2, X3) . The “orders” of the three series were identified on the basis of the distribution of autocorrelation and partial autocorrelation functions and were used to construct the vector bilinear models.

  20. The earliest settlers of Mesoamerica date back to the late Pleistocene.

    Directory of Open Access Journals (Sweden)

    Wolfgang Stinnesbeck

    Full Text Available Preceramic human skeletal remains preserved in submerged caves near Tulum in the Mexican state of Quintana Roo, Mexico, reveal conflicting results regarding 14C dating. Here we use U-series techniques for dating a stalagmite overgrowing the pelvis of a human skeleton discovered in the submerged Chan Hol cave. The oldest closed-system U/Th age comes from around 21 mm above the pelvis, defining the terminus ante quem for the pelvis at 11311±370 y BP. However, the skeleton might be considerably older, probably as old as 13 ky BP, as indicated by the speleothem stable isotope data. The Chan Hol individual confirms a late Pleistocene settling of Mesoamerica and represents one of the oldest human osteological remains in America.

  1. Paleohydrology and paleoenvironments at Bir Sahara: Pleistocene lithostratigraphy and sedimentology in the southern Egyptian Sahara

    Science.gov (United States)

    Hill, Christopher L.; Schild, Romuald

    2017-12-01

    The Bir Sahara area contains a remarkable record of Middle and Late Pleistocene hydrologic and environmental conditions for Saharan North Africa, based on lithostratigraphic and sedimentologic evidence from basin-fill deposits. Some of the deposits contain Lower Paleolithic (Acheulean) or Middle Paleolithic artifacts that help to constrain their age, since Acheulian artifacts are assigned to the Middle Pleistocene, while Middle Paleolithic artifacts are limited to either the Middle or Late Pleistocene. Locality BS-14 is in the southern part of Bir Sahara, while localities E-88-15, E-88-2, BS-13, and BS-16 are situated in the south-central part of the deflational basin, closer to the present-day water-hole. Lowered groundwater conditions during arid intervals resulted in erosional topographic basins. These deflational basins were later filled with sediments associated with wetter hydrologic conditions. The oldest studied sedimentary sequence in the Bir Sahara depression (BS-14) contains in situ Acheulian artifacts. Acheulian handaxes are found in sands underlying carbonates that are interpreted as evidence of spring-fed pond and marsh environments during a Middle Pleistocene wet interval. At the E-88-15 locality, the stratigraphic sequence documents deposition in a possible perennial pond or small lake that varied in extent and depth and is associated with Middle Paleolithic artifacts. At E-88-12 and BS-13, lateral and vertical variations in the lithofacies of the basin-fill sediments provide additional records of changing hydrologic conditions during the Late Pleistocene. These hydrologic conditions appear to reflect variations in water-table levels related to groundwater recharge and, at times, local rains.

  2. 25 years of time series forecasting

    NARCIS (Netherlands)

    de Gooijer, J.G.; Hyndman, R.J.

    2006-01-01

    We review the past 25 years of research into time series forecasting. In this silver jubilee issue, we naturally highlight results published in journals managed by the International Institute of Forecasters (Journal of Forecasting 1982-1985 and International Journal of Forecasting 1985-2005). During

  3. Markov Trends in Macroeconomic Time Series

    NARCIS (Netherlands)

    R. Paap (Richard)

    1997-01-01

    textabstractMany macroeconomic time series are characterised by long periods of positive growth, expansion periods, and short periods of negative growth, recessions. A popular model to describe this phenomenon is the Markov trend, which is a stochastic segmented trend where the slope depends on the

  4. Modeling seasonality in bimonthly time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans)

    1992-01-01

    textabstractA recurring issue in modeling seasonal time series variables is the choice of the most adequate model for the seasonal movements. One selection method for quarterly data is proposed in Hylleberg et al. (1990). Market response models are often constructed for bimonthly variables, and

  5. On clustering fMRI time series

    DEFF Research Database (Denmark)

    Goutte, Cyril; Toft, Peter Aundal; Rostrup, E.

    1999-01-01

    Analysis of fMRI time series is often performed by extracting one or more parameters for the individual voxels. Methods based, e.g., on various statistical tests are then used to yield parameters corresponding to probability of activation or activation strength. However, these methods do...

  6. FALSE DETERMINATIONS OF CHAOS IN SHORT NOISY TIME SERIES. (R828745)

    Science.gov (United States)

    A method (NEMG) proposed in 1992 for diagnosing chaos in noisy time series with 50 or fewer observations entails fitting the time series with an empirical function which predicts an observation in the series from previous observations, and then estimating the rate of divergenc...

  7. Multiscale multifractal multiproperty analysis of financial time series based on Rényi entropy

    Science.gov (United States)

    Yujun, Yang; Jianping, Li; Yimei, Yang

    This paper introduces a multiscale multifractal multiproperty analysis based on Rényi entropy (3MPAR) to analyze short-range and long-range characteristics of financial time series, and then applies this method to time series of five properties of four stock indices. Combining the two analysis techniques of Rényi entropy and multifractal detrended fluctuation analysis (MFDFA), the 3MPAR method focuses on the curves of Rényi entropy and the generalized Hurst exponent of five properties of four stock time series, which allows us to study more universal and subtle fluctuation characteristics of financial time series. By analyzing the curves of the Rényi entropy and the profiles of the logarithm distribution of MFDFA of five properties of four stock indices, the 3MPAR method shows some fluctuation characteristics of the financial time series and the stock markets. It also reveals richer information about the financial time series by comparing the profiles of five properties of four stock indices. In this paper, we focus not only on the multifractality of time series but also on the fluctuation characteristics of the financial time series and subtle differences between the time series of different properties. We find that financial time series are far more complex than reported in studies using a single property of the time series.
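
    The 3MPAR method combines Rényi entropy with MFDFA in a way specific to this paper; purely as context for the entropy component, the following is a minimal sketch of estimating Rényi entropy of several orders q from a histogram of returns. The synthetic price series, bin count, and choice of q values are illustrative assumptions.

```python
import numpy as np

def renyi_entropy(x, q_values, bins=50):
    """Renyi entropy of a 1-D sample for several orders q,
    estimated from a histogram of the data (in nats)."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()          # empirical probabilities
    entropies = {}
    for q in q_values:
        if np.isclose(q, 1.0):                     # q -> 1 limit is the Shannon entropy
            entropies[q] = -np.sum(p * np.log(p))
        else:
            entropies[q] = np.log(np.sum(p ** q)) / (1.0 - q)
    return entropies

# Toy example: log-returns of a simulated price series
rng = np.random.default_rng(0)
prices = np.cumprod(1.0 + 0.01 * rng.standard_normal(5000))
returns = np.diff(np.log(prices))
print(renyi_entropy(returns, q_values=[-2, 0, 1, 2, 4]))
```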

  8. A Literature Survey of Early Time Series Classification and Deep Learning

    OpenAIRE

    Santos, Tiago; Kern, Roman

    2017-01-01

    This paper provides an overview of current literature on time series classification approaches, in particular of early time series classification. A very common and effective time series classification approach is the 1-Nearest Neighbor classifier, with different distance measures such as the Euclidean or dynamic time warping distances. This paper starts by reviewing these baseline methods. More recently, with the gain in popularity in the application of deep neural networks to the field of...
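
    As a point of reference for the baseline this survey reviews, here is a minimal sketch of a 1-Nearest Neighbor classifier using a plain O(n·m) dynamic time warping distance (the Euclidean distance would simply compare aligned samples directly). The toy sine/square-wave data set is invented for illustration.

```python
import numpy as np

def dtw_distance(a, b):
    """Plain O(len(a)*len(b)) dynamic time warping distance."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def one_nn_classify(train_X, train_y, x, dist=dtw_distance):
    """Label of the nearest training series under the given distance."""
    d = [dist(t, x) for t in train_X]
    return train_y[int(np.argmin(d))]

# Toy example: distinguish noisy square waves (label 0) from noisy sines (label 1)
rng = np.random.default_rng(1)
t = np.linspace(0, 2 * np.pi, 60)
make = lambda kind: (np.sin(t) if kind else np.sign(np.sin(t))) + 0.2 * rng.standard_normal(t.size)
train_X = [make(k) for k in (0, 0, 1, 1)]
train_y = [0, 0, 1, 1]
print(one_nn_classify(train_X, train_y, make(1)))   # expected: 1
```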

  9. Signal Processing for Time-Series Functions on a Graph

    Science.gov (United States)

    2018-02-01

    [Only report front matter was captured in this record.] ARL-TR-8276, US Army Research Laboratory, February 2018: Signal Processing for Time-Series Functions on a Graph, by Humberto Muñoz-Barona, Jean Vettel, et al. Fig. 1: Time-series function on a fixed graph. Approved for public release; distribution is unlimited.

  10. Non-linear time series extreme events and integer value problems

    CERN Document Server

    Turkman, Kamil Feridun; Zea Bermudez, Patrícia

    2014-01-01

    This book offers a useful combination of probabilistic and statistical tools for analyzing nonlinear time series. Key features of the book include a study of the extremal behavior of nonlinear time series and a comprehensive list of nonlinear models that address different aspects of nonlinearity. Several inferential methods, including quasi likelihood methods, sequential Markov Chain Monte Carlo Methods and particle filters, are also included so as to provide an overall view of the available tools for parameter estimation for nonlinear models. A chapter on integer time series models based on several thinning operations, which brings together all recent advances made in this area, is also included. Readers should have attended a prior course on linear time series, and a good grasp of simulation-based inferential methods is recommended. This book offers a valuable resource for second-year graduate students and researchers in statistics and other scientific areas who need a basic understanding of nonlinear time ...

  11. Learning of time series through neuron-to-neuron instruction

    Energy Technology Data Exchange (ETDEWEB)

    Miyazaki, Y [Department of Physics, Kyoto University, Kyoto 606-8502, (Japan); Kinzel, W [Institut fuer Theoretische Physik, Universitaet Wurzburg, 97074 Wurzburg (Germany); Shinomoto, S [Department of Physics, Kyoto University, Kyoto (Japan)

    2003-02-07

    A model neuron with delayline feedback connections can learn a time series generated by another model neuron. It has been known that some student neurons that have completed such learning under the instruction of a teacher's quasi-periodic sequence mimic the teacher's time series over a long interval, even after instruction has ceased. We found that in addition to such faithful students, there are unfaithful students whose time series eventually diverge exponentially from that of the teacher. In order to understand the circumstances that allow for such a variety of students, the orbit dimension was estimated numerically. The quasi-periodic orbits in question were found to be confined in spaces with dimensions significantly smaller than that of the full phase space.

  12. Learning of time series through neuron-to-neuron instruction

    International Nuclear Information System (INIS)

    Miyazaki, Y; Kinzel, W; Shinomoto, S

    2003-01-01

    A model neuron with delayline feedback connections can learn a time series generated by another model neuron. It has been known that some student neurons that have completed such learning under the instruction of a teacher's quasi-periodic sequence mimic the teacher's time series over a long interval, even after instruction has ceased. We found that in addition to such faithful students, there are unfaithful students whose time series eventually diverge exponentially from that of the teacher. In order to understand the circumstances that allow for such a variety of students, the orbit dimension was estimated numerically. The quasi-periodic orbits in question were found to be confined in spaces with dimensions significantly smaller than that of the full phase space

  13. Quirky patterns in time-series of estimates of recruitment could be artefacts

    DEFF Research Database (Denmark)

    Dickey-Collas, M.; Hinzen, N.T.; Nash, R.D.M.

    2015-01-01

    The accessibility of databases of global or regional stock assessment outputs is leading to an increase in meta-analysis of the dynamics of fish stocks. In most of these analyses, each of the time-series is generally assumed to be directly comparable. However, the approach to stock assessment employed, and the associated modelling assumptions, can have an important influence on the characteristics of each time-series. We explore this idea by investigating recruitment time-series with three different recruitment parameterizations: a stock–recruitment model, a random-walk time-series model … … of recruitment time-series in databases is therefore not consistent across or within species and stocks. Caution is therefore required as perhaps the characteristics of the time-series of stock dynamics may be determined by the model used to generate them, rather than underlying ecological phenomena …

  14. The Hierarchical Spectral Merger Algorithm: A New Time Series Clustering Procedure

    KAUST Repository

    Euán, Carolina

    2018-04-12

    We present a new method for time series clustering which we call the Hierarchical Spectral Merger (HSM) method. This procedure is based on the spectral theory of time series and identifies series that share similar oscillations or waveforms. The extent of similarity between a pair of time series is measured using the total variation distance between their estimated spectral densities. At each step of the algorithm, every time two clusters merge, a new spectral density is estimated using the whole information present in both clusters, which is representative of all the series in the new cluster. The method is implemented in an R package, HSMClust. We present two applications of the HSM method, one to data coming from wave-height measurements in oceanography and the other to electroencephalogram (EEG) data.

  15. Significance of Two New Pleistocene Plant Records from Western Europe

    Science.gov (United States)

    Field, Michael H.; Velichkevich, Felix Y.; Andrieu-Ponel, Valerie; Woltz, Phillipe

    2000-09-01

    The first records of extinct Caulinia goretskyi (Dorofeev) Dorofeev (synonym Najas goretskyi Dorofeev) in western Europe and of Potamogeton occidentalis M.H. Field sp. nov. were obtained from plant macrofossil analyses of Middle Pleistocene temperate stage deposits exposed at Trez Rouz, Brittany, France. Palynological assemblages recovered suggest correlation with the Holsteinian Stage. This discovery greatly expands the western limit of the paleogeographical distribution of Caulinia goretskyi. The record of Potamogeton occidentalis indicates an affinity with the eastern Asiatic flora, as the fruits resemble those of the extant Potamogeton maackianus A. Bennett. Other extinct Pleistocene species related to P. maackianus have been described, and it is possible to follow the development of this group through the Pleistocene in the European fossil record. These new finds illustrate the importance of a complete paleobotanical approach (both plant macrofossil and palynological analyses). The plant macrofossil assemblages not only provide detailed insight into local vegetation and environment, because they are often not transported long distances (in temperate areas) and can frequently be identified to species level; they can also offer the opportunity to investigate Pleistocene evolutionary trends.

  16. Estimation of time-delayed mutual information and bias for irregularly and sparsely sampled time-series

    International Nuclear Information System (INIS)

    Albers, D.J.; Hripcsak, George

    2012-01-01

    Highlights: ► Time-delayed mutual information for irregularly sampled time-series. ► Estimation bias for the time-delayed mutual information calculation. ► Fast, simple, PDF estimator independent, time-delayed mutual information bias estimate. ► Quantification of data-set-size limits of the time-delayed mutual calculation. - Abstract: A method to estimate the time-dependent correlation via an empirical bias estimate of the time-delayed mutual information for a time-series is proposed. In particular, the bias of the time-delayed mutual information is shown to often be equivalent to the mutual information between two distributions of points from the same system separated by infinite time. Thus intuitively, estimation of the bias is reduced to estimation of the mutual information between distributions of data points separated by large time intervals. The proposed bias estimation techniques are shown to work for Lorenz equations data and glucose time series data of three patients from the Columbia University Medical Center database.
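
    The bias-estimation procedure and the treatment of irregular sampling are specific to this paper; purely as background, the sketch below estimates the ordinary time-delayed mutual information of a regularly sampled series with a 2-D histogram. The bin count, the AR(1) test signal, and the chosen lags are illustrative assumptions.

```python
import numpy as np

def time_delayed_mi(x, lag, bins=16):
    """Histogram estimate of I(x(t); x(t+lag)) in nats for a regularly sampled series."""
    a, b = x[:-lag], x[lag:]
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# Toy example: the mutual information decays with lag for an AR(1) process
rng = np.random.default_rng(2)
x = np.zeros(10000)
for i in range(1, x.size):
    x[i] = 0.9 * x[i - 1] + rng.standard_normal()
print([round(time_delayed_mi(x, lag), 3) for lag in (1, 5, 20, 100)])
```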

  17. Inferring interdependencies from short time series

    Indian Academy of Sciences (India)

    Abstract. Complex networks provide an invaluable framework for the study of interlinked dynamical systems. In many cases, such networks are constructed from observed time series by first estimating the … does not quantify causal relations (unlike IOTA, or …

  18. Stochastic modeling of hourly rainfall times series in Campania (Italy)

    Science.gov (United States)

    Giorgio, M.; Greco, R.

    2009-04-01

    Occurrence of flowslides and floods in small catchments is difficult to predict, since it is affected by a number of variables, such as mechanical and hydraulic soil properties, slope morphology, vegetation coverage, rainfall spatial and temporal variability. Consequently, landslide risk assessment procedures and early warning systems still rely on simple empirical models based on correlation between recorded rainfall data and observed landslides and/or river discharges. Effectiveness of such systems could be improved by reliable quantitative rainfall prediction, which can allow gaining larger lead-times. Analysis of on-site recorded rainfall height time series represents the most effective approach for a reliable prediction of local temporal evolution of rainfall. Hydrological time series analysis is a widely studied field in hydrology, often carried out by means of autoregressive models, such as AR, ARMA, ARX, ARMAX (e.g. Salas [1992]). Such models gave the best results when applied to the analysis of autocorrelated hydrological time series, like river flow or level time series. Conversely, they are not able to model the behaviour of intermittent time series, like point rainfall height series usually are, especially when recorded with short sampling time intervals. More useful for this issue are the so-called DRIP (Disaggregated Rectangular Intensity Pulse) and NSRP (Neyman-Scott Rectangular Pulse) models [Heneker et al., 2001; Cowpertwait et al., 2002], usually adopted to generate synthetic point rainfall series. In this paper, the DRIP model approach is adopted, in which the sequence of rain storms and dry intervals constituting the structure of rainfall time series is modeled as an alternating renewal process. The final aim of the study is to provide a useful tool to implement an early warning system for hydrogeological risk management. Model calibration has been carried out with hourly rainfall height data provided by the rain gauges of Campania Region civil
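
    The calibrated DRIP model has its own storm and dry-spell distributions fitted to the Campania gauges; the following is only a generic sketch of an alternating renewal process with rectangular intensity pulses, aggregated to hourly depths. All parameter values and the function name are invented for illustration.

```python
import numpy as np

def simulate_alternating_renewal(hours, mean_dry_h=40.0, mean_storm_h=6.0,
                                 mean_intensity_mm_h=2.5, seed=0):
    """Hourly rainfall depths (mm) from alternating dry spells and
    rectangular-intensity storm pulses (all durations exponential)."""
    rng = np.random.default_rng(seed)
    rain = np.zeros(hours)
    t = 0.0
    while t < hours:
        t += rng.exponential(mean_dry_h)                  # dry interval
        duration = rng.exponential(mean_storm_h)          # storm duration
        intensity = rng.exponential(mean_intensity_mm_h)  # constant pulse intensity
        start, stop = int(t), int(min(t + duration, hours))
        rain[start:stop] += intensity                     # fill the hours covered by the pulse
        t += duration
    return rain

series = simulate_alternating_renewal(hours=24 * 365)
print(f"annual total: {series.sum():.0f} mm, wet hours: {(series > 0).sum()}")
```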

  19. Using forbidden ordinal patterns to detect determinism in irregularly sampled time series.

    Science.gov (United States)

    Kulp, C W; Chobot, J M; Niskala, B J; Needhammer, C J

    2016-02-01

    It is known that when symbolizing a time series into ordinal patterns using the Bandt-Pompe (BP) methodology, there will be ordinal patterns called forbidden patterns that do not occur in a deterministic series. The existence of forbidden patterns can be used to identify deterministic dynamics. In this paper, the ability to use forbidden patterns to detect determinism in irregularly sampled time series is tested on data generated from a continuous model system. The study is done in three parts. First, the effects of sampling time on the number of forbidden patterns are studied on regularly sampled time series. The next two parts focus on two types of irregular-sampling, missing data and timing jitter. It is shown that forbidden patterns can be used to detect determinism in irregularly sampled time series for low degrees of sampling irregularity (as defined in the paper). In addition, comments are made about the appropriateness of using the BP methodology to symbolize irregularly sampled time series.
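
    As a concrete illustration of the idea tested in the paper, here is a minimal sketch of Bandt-Pompe symbolization and of counting the ordinal patterns that never occur ("forbidden" patterns), for regularly sampled data and ignoring tie handling. The logistic-map and white-noise test signals are illustrative.

```python
import itertools
import numpy as np

def ordinal_pattern_counts(x, order=3):
    """Count Bandt-Pompe ordinal patterns of the given order in x."""
    counts = {p: 0 for p in itertools.permutations(range(order))}
    for i in range(len(x) - order + 1):
        window = x[i:i + order]
        pattern = tuple(np.argsort(window))   # ordinal pattern of the window
        counts[pattern] += 1
    return counts

def n_forbidden(x, order=3):
    """Number of ordinal patterns that never occur in x."""
    return sum(1 for c in ordinal_pattern_counts(x, order).values() if c == 0)

rng = np.random.default_rng(3)
noise = rng.standard_normal(2000)
logistic = np.empty(2000); logistic[0] = 0.4
for i in range(1, 2000):
    logistic[i] = 4.0 * logistic[i - 1] * (1.0 - logistic[i - 1])

# Deterministic (chaotic) data tend to leave patterns unvisited; white noise does not.
print("noise   :", n_forbidden(noise, order=4))
print("logistic:", n_forbidden(logistic, order=4))
```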

  20. Forecasting the Reference Evapotranspiration Using Time Series Model

    Directory of Open Access Journals (Sweden)

    H. Zare Abyaneh

    2016-10-01

    Full Text Available Introduction: Reference evapotranspiration is one of the most important factors in irrigation timing and field management. Moreover, reference evapotranspiration forecasting can play a vital role in future developments. Therefore in this study, the seasonal autoregressive integrated moving average (ARIMA) model was used to forecast the reference evapotranspiration time series in the Esfahan, Semnan, Shiraz, Kerman, and Yazd synoptic stations.
    Materials and Methods: In the present study, in all stations (characteristics of the synoptic stations are given in Table 1), the meteorological data, including mean, maximum and minimum air temperature, relative humidity, dry- and wet-bulb temperature, dew-point temperature, wind speed, precipitation, air vapor pressure and sunshine hours, were collected from the Islamic Republic of Iran Meteorological Organization (IRIMO) for the 41 years from 1965 to 2005. The FAO Penman-Monteith equation was used to calculate the monthly reference evapotranspiration in the five synoptic stations and the evapotranspiration time series were formed. The unit root test was used to identify whether the time series was stationary; then, using the Box-Jenkins method, seasonal ARIMA models were applied to the sample data.
    Table 1. Geographical location and climate conditions of the synoptic stations
    Station   Longitude (E)   Latitude (N)   Altitude (m)   Mean annual air temp. (°C)   Min.–Max. air temp. (°C)   Mean precipitation (mm)   Climate (De Martonne index classification)
    Esfahan   51° 40'         32° 37'        1550.4         16.36                        9.4–23.3                   122                       Arid
    Semnan    53° 33'         35° 35'        1130.8         18.0                         12.4–23.8                  140                       Arid
    Shiraz    52° 36'         29° 32'        1484           18.0                         10.2–25.9                  324                       Semi-arid
    Kerman    56° 58'         30° 15'        1753.8         15.6                         6.7–24.6                   142                       Arid
    Yazd      54° 17'         31° 54'        1237.2         19.2                         11.8–26.0                  61                        Arid
    Results and Discussion: The monthly meteorological data were used as input for the Ref-ET software and monthly reference
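
    The model orders fitted in the study are not reproduced in this record; purely as a sketch of the general seasonal ARIMA workflow, the following uses statsmodels' SARIMAX on a synthetic monthly series with an annual cycle. The synthetic series, the (1,0,1)(1,1,1,12) order, and the 12-month forecast horizon are assumptions made for illustration, not the paper's settings.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Synthetic monthly series with an annual cycle, standing in for monthly ET0 (illustrative only)
rng = np.random.default_rng(4)
idx = pd.date_range("1965-01-01", periods=41 * 12, freq="MS")
month = np.arange(idx.size) % 12
et0 = 120 + 80 * np.sin(2 * np.pi * month / 12) + rng.normal(0, 8, idx.size)
series = pd.Series(et0, index=idx)

# Seasonal ARIMA(1,0,1)(1,1,1)_12 -- orders chosen for illustration, not taken from the study
model = SARIMAX(series, order=(1, 0, 1), seasonal_order=(1, 1, 1, 12))
fit = model.fit(disp=False)
print(fit.forecast(steps=12).round(1))   # forecast for the next 12 months
```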

  1. Complexity testing techniques for time series data: A comprehensive literature review

    International Nuclear Information System (INIS)

    Tang, Ling; Lv, Huiling; Yang, Fengmei; Yu, Lean

    2015-01-01

    Highlights: • A literature review of complexity testing techniques for time series data is provided. • Complexity measurements can generally fall into fractality, methods derived from nonlinear dynamics and entropy. • Different types investigate time series data from different perspectives. • Measures, applications and future studies for each type are presented. - Abstract: Complexity may be one of the most important measurements for analysing time series data; it covers or is at least closely related to different data characteristics within nonlinear system theory. This paper provides a comprehensive literature review examining the complexity testing techniques for time series data. According to different features, the complexity measurements for time series data can be divided into three primary groups, i.e., fractality (mono- or multi-fractality) for self-similarity (or system memorability or long-term persistence), methods derived from nonlinear dynamics (via attractor invariants or diagram descriptions) for attractor properties in phase-space, and entropy (structural or dynamical entropy) for the disorder state of a nonlinear system. These estimations analyse time series dynamics from different perspectives but are closely related to or even dependent on each other at the same time. In particular, a weaker self-similarity, a more complex structure of attractor, and a higher-level disorder state of a system consistently indicate that the observed time series data are at a higher level of complexity. Accordingly, this paper presents a historical tour of the important measures and works for each group, as well as ground-breaking and recent applications and future research directions.

  2. Complex dynamic in ecological time series

    Science.gov (United States)

    Peter Turchin; Andrew D. Taylor

    1992-01-01

    Although the possibility of complex dynamical behaviors-limit cycles, quasiperiodic oscillations, and aperiodic chaos-has been recognized theoretically, most ecologists are skeptical of their importance in nature. In this paper we develop a methodology for reconstructing endogenous (or deterministic) dynamics from ecological time series. Our method consists of fitting...

  3. Time Series Modelling using Proc Varmax

    DEFF Research Database (Denmark)

    Milhøj, Anders

    2007-01-01

    In this paper it will be demonstrated how various time series problems could be met using Proc Varmax. The procedure is rather new and hence new features like cointegration, testing for Granger causality are included, but it also means that more traditional ARIMA modelling as outlined by Box...

  4. Kriging Methodology and Its Development in Forecasting Econometric Time Series

    Directory of Open Access Journals (Sweden)

    Andrej Gajdoš

    2017-03-01

    Full Text Available One of the approaches for forecasting future values of a time series or unknown spatial data is kriging. The main objective of the paper is to introduce a general scheme of kriging in forecasting econometric time series using a family of linear regression time series models (abbreviated FDSLRM) which apply regression not only to a trend but also to a random component of the observed time series. Simultaneously performing a Monte Carlo simulation study with a real electricity consumption dataset in the R computational language and environment, we investigate the well-known problem of “negative” estimates of variance components when kriging predictions fail. Our subsequent theoretical analysis, which also draws on the modern apparatus of advanced multivariate statistics, gives us the formulation and proof of a general theorem about the explicit form of moments (up to sixth order) for a Gaussian time series observation. This result provides a basis for further theoretical and computational research in the kriging methodology development.

  5. Use of Time-Series, ARIMA Designs to Assess Program Efficacy.

    Science.gov (United States)

    Braden, Jeffery P.; And Others

    1990-01-01

    Illustrates use of time-series designs for determining efficacy of interventions with fictitious data describing drug-abuse prevention program. Discusses problems and procedures associated with time-series data analysis using Auto Regressive Integrated Moving Averages (ARIMA) models. Example illustrates application of ARIMA analysis for…

  6. An algorithm of Saxena-Easo on fuzzy time series forecasting

    Science.gov (United States)

    Ramadhani, L. C.; Anggraeni, D.; Kamsyakawuni, A.; Hadi, A. F.

    2018-04-01

    This paper presents a Saxena-Easo fuzzy time series forecast model to study the prediction of the Indonesian inflation rate in 1970-2016. We use MATLAB software to compute this method. The Saxena-Easo fuzzy time series algorithm does not require stationarity, unlike conventional forecasting methods; it is capable of dealing with time series values that are linguistic and has the advantage of reducing and simplifying the calculation process. Generally, it focuses on percentage change as the universe of discourse, interval partitioning and defuzzification. The results indicate that the actual data and the forecast data are close, with Root Mean Square Error (RMSE) = 1.5289.

  7. Evolutionary Algorithms for the Detection of Structural Breaks in Time Series

    DEFF Research Database (Denmark)

    Doerr, Benjamin; Fischer, Paul; Hilbert, Astrid

    2013-01-01

    Detecting structural breaks is an essential task for the statistical analysis of time series, for example, for fitting parametric models to it. In short, structural breaks are points in time at which the behavior of the time series changes. Typically, no solid background knowledge of the time...

  8. On modeling panels of time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans)

    2002-01-01

    textabstractThis paper reviews research issues in modeling panels of time series. Examples of this type of data are annually observed macroeconomic indicators for all countries in the world, daily returns on the individual stocks listed in the S&P500, and the sales records of all items in a

  9. Unsupervised Symbolization of Signal Time Series for Extraction of the Embedded Information

    Directory of Open Access Journals (Sweden)

    Yue Li

    2017-03-01

    Full Text Available This paper formulates an unsupervised algorithm for symbolization of signal time series to capture the embedded dynamic behavior. The key idea is to convert time series of the digital signal into a string of (spatially discrete) symbols from which the embedded dynamic information can be extracted in an unsupervised manner (i.e., no requirement for labeling of time series). The main challenges here are: (1) definition of the symbol assignment for the time series; (2) identification of the partitioning segment locations in the signal space of time series; and (3) construction of probabilistic finite-state automata (PFSA) from the symbol strings that contain temporal patterns. The reported work addresses these challenges by maximizing the mutual information measures between symbol strings and PFSA states. The proposed symbolization method has been validated by numerical simulation as well as by experimentation in a laboratory environment. Performance of the proposed algorithm has been compared to that of two commonly used algorithms of time series partitioning.
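
    The paper selects its partition by maximizing mutual information between symbol strings and PFSA states; the sketch below substitutes a much simpler quantile (equal-frequency) partition and estimates only a first-order symbol transition matrix as a stand-in for the PFSA construction. The symbol count and the noisy-sine test signal are illustrative assumptions.

```python
import numpy as np

def symbolize(x, n_symbols=4):
    """Map a real-valued series to symbols 0..n_symbols-1 by quantile partitioning."""
    edges = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(x, edges)

def transition_matrix(symbols, n_symbols=4):
    """Row-stochastic first-order transition matrix of the symbol string."""
    T = np.zeros((n_symbols, n_symbols))
    for a, b in zip(symbols[:-1], symbols[1:]):
        T[a, b] += 1
    row_sums = T.sum(axis=1, keepdims=True)
    return np.divide(T, row_sums, out=np.zeros_like(T), where=row_sums > 0)

rng = np.random.default_rng(5)
x = np.sin(np.linspace(0, 40 * np.pi, 4000)) + 0.3 * rng.standard_normal(4000)
s = symbolize(x)
print(transition_matrix(s).round(2))
```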

  10. Size and shape stasis in late Pleistocene mammals and birds from Rancho La Brea during the Last Glacial-Interglacial cycle

    Science.gov (United States)

    Prothero, Donald R.; Syverson, Valerie J.; Raymond, Kristina R.; Madan, Meena; Molina, Sarah; Fragomeni, Ashley; DeSantis, Sylvana; Sutyagina, Anastasiya; Gage, Gina L.

    2012-11-01

    Conventional neo-Darwinian theory views organisms as infinitely sensitive and responsive to their environments, and considers them able to readily change size or shape when they adapt to selective pressures. Yet since 1863 it has been well known that Pleistocene animals and plants do not show much morphological change or speciation in response to the glacial-interglacial climate cycles. We tested this hypothesis with all of the common birds (condors, golden and bald eagles, turkeys, caracaras) and mammals (dire wolves, saber-toothed cats, giant lions, horses, camels, bison, and ground sloths) from Rancho La Brea tar pits in Los Angeles, California, which preserves large samples of many bones from many well-dated pits spanning the 35,000 years of the Last Glacial-Interglacial cycle. Pollen evidence showed the climate changed from chaparral/oaks 35,000 years ago to snowy piñon-juniper forests at the peak glacial 20,000 years ago, then back to the modern chaparral since the glacial-interglacial transition. Based on Bergmann's rule, we would expect peak glacial specimens to have larger body sizes, and based on Allen's rule, peak glacial samples should have shorter and more robust limbs. Yet statistical analysis (ANOVA for parametric samples; Kruskal-Wallis test for non-parametric samples) showed that none of the Pleistocene pit samples is statistically distinct from the rest, indicating complete stasis from 35 ka to 9 ka. The sole exception was the Pit 13 sample of dire wolves (16 ka), which was significantly smaller than the rest, but this did not occur in response to climate change. We also performed a time series analysis of the pit samples. None showed directional change; all were either static or showed a random walk. Thus, the data show that birds and mammals at Rancho La Brea show complete stasis and were unresponsive to the major climate change that occurred at 20 ka, consistent with other studies of Pleistocene animals and plants. Most explanations for such

  11. Classification of time-series images using deep convolutional neural networks

    Science.gov (United States)

    Hatami, Nima; Gavet, Yann; Debayle, Johan

    2018-04-01

    Convolutional Neural Networks (CNNs) have achieved great success in image recognition tasks by automatically learning a hierarchical feature representation from raw data. While the majority of Time-Series Classification (TSC) literature is focused on 1D signals, this paper uses Recurrence Plots (RP) to transform time-series into 2D texture images and then takes advantage of a deep CNN classifier. Image representation of time-series introduces different feature types that are not available for 1D signals, and therefore TSC can be treated as a texture image recognition task. The CNN model also allows learning different levels of representations together with a classifier, jointly and automatically. Therefore, using RP and CNN in a unified framework is expected to boost the recognition rate of TSC. Experimental results on the UCR time-series classification archive demonstrate competitive accuracy of the proposed approach, compared not only to the existing deep architectures, but also to the state-of-the-art TSC algorithms.
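
    The CNN stage is omitted here; the following is only a minimal sketch of the first step, turning a 1-D series into a thresholded recurrence-plot image via delay embedding. The embedding dimension, delay, and threshold rule are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def recurrence_plot(x, dim=3, delay=2, eps=None):
    """Binary recurrence plot of a delay-embedded 1-D series."""
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])
    # pairwise Euclidean distances between embedded state vectors
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    if eps is None:
        eps = 0.1 * dists.max()        # simple default threshold
    return (dists <= eps).astype(np.uint8)

rng = np.random.default_rng(6)
x = np.sin(np.linspace(0, 8 * np.pi, 300)) + 0.05 * rng.standard_normal(300)
rp = recurrence_plot(x)
print(rp.shape, rp.mean().round(3))    # image size and recurrence rate
```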

  12. Long Range Dependence Prognostics for Bearing Vibration Intensity Chaotic Time Series

    Directory of Open Access Journals (Sweden)

    Qing Li

    2016-01-01

    Full Text Available According to the chaotic features and typical fractional order characteristics of the bearing vibration intensity time series, a forecasting approach based on long range dependence (LRD) is proposed. In order to reveal the internal chaotic properties, vibration intensity time series are reconstructed based on chaos theory in phase-space: the delay time is computed with the C-C method, and the optimal embedding dimension and saturated correlation dimension are calculated via the Grassberger–Procaccia (G-P) method, respectively, so that the chaotic characteristics of the vibration intensity time series can be jointly determined by the largest Lyapunov exponent and the phase plane trajectory; the largest Lyapunov exponent is calculated by the Wolf method and the phase plane trajectory is illustrated using the Duffing-Holmes oscillator (DHO). The Hurst exponent and the long range dependence prediction method are proposed to verify the typical fractional order features and to improve the prediction accuracy of bearing vibration intensity time series, respectively. Experiments show that the vibration intensity time series have chaotic properties and that the LRD prediction method is better than the other prediction methods (largest Lyapunov, autoregressive moving average (ARMA) and BP neural network (BPNN) models) in prediction accuracy and prediction performance, which provides a new approach for running tendency predictions for rotating machinery and some guidance for engineering practice.
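
    The paper's full pipeline (C-C delay selection, G-P correlation dimension, Wolf Lyapunov estimate, LRD forecasting) is not reproduced here; as a simpler stand-in for the reconstruction step only, the sketch below picks the delay from the autocorrelation's 1/e crossing and builds the delay-embedded trajectory. The noisy sine test signal and the embedding dimension are illustrative.

```python
import numpy as np

def delay_from_autocorrelation(x):
    """First lag where the autocorrelation drops below 1/e (a common
    simple alternative to the C-C method for picking the delay time)."""
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[x.size - 1:] / np.dot(x, x)
    below = np.where(acf < 1.0 / np.e)[0]
    return int(below[0]) if below.size else 1

def delay_embed(x, dim, delay):
    """Phase-space reconstruction: rows are delay vectors of length `dim`."""
    n = len(x) - (dim - 1) * delay
    return np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])

rng = np.random.default_rng(7)
x = np.sin(np.linspace(0, 60 * np.pi, 6000)) + 0.1 * rng.standard_normal(6000)
tau = delay_from_autocorrelation(x)
traj = delay_embed(x, dim=3, delay=tau)
print("delay:", tau, "trajectory shape:", traj.shape)
```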

  13. Critical values for unit root tests in seasonal time series

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); B. Hobijn (Bart)

    1997-01-01

    textabstractIn this paper, we present tables with critical values for a variety of tests for seasonal and non-seasonal unit roots in seasonal time series. We consider (extensions of) the Hylleberg et al. and Osborn et al. test procedures. These extensions concern time series with increasing seasonal

  14. Pleistocene lake level changes in Western Mongolia

    Science.gov (United States)

    Borodavko, P. S.

    2009-04-01

    cane sedges and horsetails dominant. The benthic fauna is poor, and only single specimens of molluscs and amphipods are found. The ichthyofauna is represented by Oreoleuciscus Pewzowi. Previous and modern investigations of these lakes, their morphologies and deposits, allow us to specify periods of extension of the lakes and palaeogeographical conditions. Two clear extension periods can be determined in the Mongolian Great Lakes Basin, corresponding to Mid- and Late Pleistocene transgressions. During the Mid-Pleistocene transgression the current Lakes Har-Us Nur, Dorgon Nur, Hara Nur, Airag Nur and Hyargas were integrated into a single lake, with a maximum level at 1265 m and a total water area of about 23,158 km2. The maximal thickness of Mid-Pleistocene lake deposits is 70 m. Late Pleistocene lake sediments are investigated in sections near the Dzabhan River and Hyargas Nuur shorelines. They consist of laminated sand, clay and gravel with cryogenic structures at the base and upper part of sections. The mean thickness of Late Pleistocene lake deposits is 20-35 m. The main characteristics of Late Pleistocene lake features are represented by a very bright "lake relief" — obvious steps of shorelines, gravel bands, bars and spits. The specific structure of Late Pleistocene lake cross-sections allows two transgressions to be distinguished within this period. In the first half of the Holocene a minor regression of several meters occurred. Elements of the modern aeolian relief were still inundated on the north shore of Lake Har-Us Nur. Research funded by RFBR (Grant 08-05-00037-a) References 1. Geomorfologiya Mongol'skoi Narodnoi Respubliki (Geomorphology of the Mongolian People Republic). M.: Nauka, pp. 135-148. 2. Ozera MNR i ikh mineral'nye resursy (Lakes of MPR and their mineral resources), 1991. Moscow, Nauka, 136 p. 3. Sevastyanov, D.V., Shuvalov, V.F. and Neustrueva, I. Yu. (Eds.), 1994. Limnologiya i paleolimnologiya Mongolii (Limnology and Palaeolimnology of Mongolia). St

  15. Classification of time series patterns from complex dynamic systems

    Energy Technology Data Exchange (ETDEWEB)

    Schryver, J.C.; Rao, N.

    1998-07-01

    An increasing availability of high-performance computing and data storage media at decreasing cost is making possible the proliferation of large-scale numerical databases and data warehouses. Numeric warehousing enterprises on the order of hundreds of gigabytes to terabytes are a reality in many fields such as finance, retail sales, process systems monitoring, biomedical monitoring, surveillance and transportation. Large-scale databases are becoming more accessible to larger user communities through the internet, web-based applications and database connectivity. Consequently, most researchers now have access to a variety of massive datasets. This trend will probably only continue to grow over the next several years. Unfortunately, the availability of integrated tools to explore, analyze and understand the data warehoused in these archives is lagging far behind the ability to gain access to the same data. In particular, locating and identifying patterns of interest in numerical time series data is an increasingly important problem for which there are few available techniques. Temporal pattern recognition poses many interesting problems in classification, segmentation, prediction, diagnosis and anomaly detection. This research focuses on the problem of classification or characterization of numerical time series data. Highway vehicles and their drivers are examples of complex dynamic systems (CDS) which are being used by transportation agencies for field testing to generate large-scale time series datasets. Tools for effective analysis of numerical time series in databases generated by highway vehicle systems are not yet available, or have not been adapted to the target problem domain. However, analysis tools from similar domains may be adapted to the problem of classification of numerical time series data.

  16. Sensitivity analysis of machine-learning models of hydrologic time series

    Science.gov (United States)

    O'Reilly, A. M.

    2017-12-01

    Sensitivity analysis traditionally has been applied to assessing model response to perturbations in model parameters, where the parameters are those model input variables adjusted during calibration. Unlike physics-based models where parameters represent real phenomena, the equivalent of parameters for machine-learning models are simply mathematical "knobs" that are automatically adjusted during training/testing/verification procedures. Thus the challenge of extracting knowledge of hydrologic system functionality from machine-learning models lies in their very nature, leading to the label "black box." Sensitivity analysis of the forcing-response behavior of machine-learning models, however, can provide understanding of how the physical phenomena represented by model inputs affect the physical phenomena represented by model outputs. As part of a previous study, hybrid spectral-decomposition artificial neural network (ANN) models were developed to simulate the observed behavior of hydrologic response contained in multidecadal datasets of lake water level, groundwater level, and spring flow. Model inputs used moving window averages (MWA) to represent various frequencies and frequency-band components of time series of rainfall and groundwater use. Using these forcing time series, the MWA-ANN models were trained to predict time series of lake water level, groundwater level, and spring flow at 51 sites in central Florida, USA. A time series of sensitivities for each MWA-ANN model was produced by perturbing forcing time-series and computing the change in response time-series per unit change in perturbation. Variations in forcing-response sensitivities are evident between types (lake, groundwater level, or spring), spatially (among sites of the same type), and temporally. Two generally common characteristics among sites are more uniform sensitivities to rainfall over time and notable increases in sensitivities to groundwater usage during significant drought periods.

  17. Fractal analysis and nonlinear forecasting of indoor 222Rn time series

    International Nuclear Information System (INIS)

    Pausch, G.; Bossew, P.; Hofmann, W.; Steger, F.

    1998-01-01

    Fractal analyses of indoor 222Rn time series were performed using different chaos-theory-based measures such as the time delay method, Hurst's rescaled range analysis, capacity (fractal) dimension, and Lyapunov exponent. For all time series we calculated only positive Lyapunov exponents, which is an indication of chaos, while the Hurst exponents were well below 0.5, indicating antipersistent behaviour (past trends tend to reverse in the future). These time series were also analyzed with a nonlinear prediction method which allowed an estimation of the embedding dimensions with some restrictions, limiting the prediction to about three relative time steps. (orig.)
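
    A minimal sketch of the rescaled-range (R/S) analysis mentioned above, using non-overlapping windows and a simple log-log fit; the window sizes and the white-noise test signal are illustrative. A Hurst exponent near 0.5 is the uncorrelated baseline, while values well below 0.5 would indicate the antipersistence reported in the paper.

```python
import numpy as np

def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
    """Hurst exponent from a simple rescaled-range (R/S) analysis."""
    x = np.asarray(x, dtype=float)
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_values = []
        for start in range(0, len(x) - n + 1, n):       # non-overlapping windows
            w = x[start:start + n]
            dev = np.cumsum(w - w.mean())               # cumulative deviations
            r = dev.max() - dev.min()                   # range
            s = w.std(ddof=1)                           # standard deviation
            if s > 0:
                rs_values.append(r / s)
        log_n.append(np.log(n))
        log_rs.append(np.log(np.mean(rs_values)))
    slope, _ = np.polyfit(log_n, log_rs, 1)             # H is the log-log slope
    return slope

rng = np.random.default_rng(8)
white = rng.standard_normal(4096)
print("white noise H ~", round(hurst_rs(white), 2))     # expected roughly 0.5
```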

  18. Koopman Operator Framework for Time Series Modeling and Analysis

    Science.gov (United States)

    Surana, Amit

    2018-01-01

    We propose an interdisciplinary framework for time series classification, forecasting, and anomaly detection by combining concepts from Koopman operator theory, machine learning, and linear systems and control theory. At the core of this framework is nonlinear dynamic generative modeling of time series using the Koopman operator which is an infinite-dimensional but linear operator. Rather than working with the underlying nonlinear model, we propose two simpler linear representations or model forms based on Koopman spectral properties. We show that these model forms are invariants of the generative model and can be readily identified directly from data using techniques for computing Koopman spectral properties without requiring the explicit knowledge of the generative model. We also introduce different notions of distances on the space of such model forms which is essential for model comparison/clustering. We employ the space of Koopman model forms equipped with distance in conjunction with classical machine learning techniques to develop a framework for automatic feature generation for time series classification. The forecasting/anomaly detection framework is based on using Koopman model forms along with classical linear systems and control approaches. We demonstrate the proposed framework for human activity classification, and for time series forecasting/anomaly detection in power grid application.
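
    The specific Koopman model forms and distances are the paper's own; purely as context, exact dynamic mode decomposition (DMD) is one standard data-driven way to approximate Koopman eigenvalues and modes from snapshot data. The sketch below assumes snapshots stored as columns; the toy two-frequency signal and the function name are illustrative.

```python
import numpy as np

def exact_dmd(X, r=None):
    """Exact dynamic mode decomposition of a snapshot matrix X (states in
    columns, consecutive in time). Returns eigenvalues and modes of the
    best-fit linear operator A with X[:, 1:] ~= A @ X[:, :-1]."""
    X1, X2 = X[:, :-1], X[:, 1:]
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    if r is not None:                              # optional rank truncation
        U, s, Vh = U[:, :r], s[:r], Vh[:r, :]
    A_tilde = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)
    eigvals, W = np.linalg.eig(A_tilde)
    modes = X2 @ Vh.conj().T @ np.diag(1.0 / s) @ W
    return eigvals, modes

# Toy example: two decaying oscillations sampled at 100 time steps
t = np.linspace(0, 4 * np.pi, 100)
X = np.vstack([np.cos(3 * t) * np.exp(-0.05 * t),
               np.sin(3 * t) * np.exp(-0.05 * t),
               np.cos(7 * t) * np.exp(-0.10 * t),
               np.sin(7 * t) * np.exp(-0.10 * t)])
eigvals, _ = exact_dmd(X, r=4)
print(np.round(eigvals, 3))        # complex eigenvalues encode frequency and decay
```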

  19. Testing for intracycle determinism in pseudoperiodic time series.

    Science.gov (United States)

    Coelho, Mara C S; Mendes, Eduardo M A M; Aguirre, Luis A

    2008-06-01

    A determinism test is proposed based on the well-known method of the surrogate data. Assuming predictability to be a signature of determinism, the proposed method checks for intracycle (e.g., short-term) determinism in the pseudoperiodic time series for which standard methods of surrogate analysis do not apply. The approach presented is composed of two steps. First, the data are preprocessed to reduce the effects of seasonal and trend components. Second, standard tests of surrogate analysis can then be used. The determinism test is applied to simulated and experimental pseudoperiodic time series and the results show the applicability of the proposed test.
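
    The pseudoperiodic preprocessing step is the paper's contribution and is not reproduced here; the sketch below only generates standard phase-randomized surrogates, which preserve the power spectrum while destroying temporal (and hence predictive) structure, and which are one common choice for the surrogate tests referred to above.

```python
import numpy as np

def phase_randomized_surrogate(x, rng):
    """Surrogate with the same power spectrum as x but randomized Fourier phases."""
    spec = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, spec.size)
    phases[0] = 0.0                       # keep the mean (zero-frequency) term real
    if x.size % 2 == 0:
        phases[-1] = 0.0                  # keep the Nyquist term real
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=x.size)

rng = np.random.default_rng(9)
x = np.sin(np.linspace(0, 20 * np.pi, 1024)) + 0.2 * rng.standard_normal(1024)
s = phase_randomized_surrogate(x, rng)
# Same amplitude spectrum (up to numerics), scrambled temporal structure
print(np.allclose(np.abs(np.fft.rfft(x)), np.abs(np.fft.rfft(s))))
```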

  20. Model for the respiratory modulation of the heart beat-to-beat time interval series

    Science.gov (United States)

    Capurro, Alberto; Diambra, Luis; Malta, C. P.

    2005-09-01

    In this study we present a model for the respiratory modulation of the heart beat-to-beat interval series. The model consists of a set of differential equations used to simulate the membrane potential of a single rabbit sinoatrial node cell, excited with a periodic input signal with added correlated noise. This signal, which simulates the input from the autonomic nervous system to the sinoatrial node, was included in the pacemaker equations as a modulation of the iNaK current pump and the potassium current iK. We focus on modeling the heart beat-to-beat time interval series from normal subjects during meditation with the Kundalini Yoga and Chi techniques. The analysis of the experimental data indicates that while the embedding of pre-meditation and control cases has a roughly circular shape, it acquires a polygonal shape during meditation, triangular for the Kundalini Yoga data and quadrangular in the case of Chi data. The model was used to assess the waveshape of the respiratory signals needed to reproduce the trajectory of the experimental data in the phase space. The embedding of the Chi data could be reproduced using a periodic signal obtained by smoothing a square wave. In the case of the Kundalini Yoga data, the embedding was reproduced with a periodic signal obtained by smoothing a triangular wave having a rising branch of longer duration than the decreasing branch. Our study provides an estimation of the respiratory signal using only the heart beat-to-beat time interval series.

  1. Time series analysis and its applications with R examples

    CERN Document Server

    Shumway, Robert H

    2017-01-01

    The fourth edition of this popular graduate textbook, like its predecessors, presents a balanced and comprehensive treatment of both time and frequency domain methods with accompanying theory. Numerous examples using nontrivial data illustrate solutions to problems such as discovering natural and anthropogenic climate change, evaluating pain perception experiments using functional magnetic resonance imaging, and monitoring a nuclear test ban treaty. The book is designed as a textbook for graduate level students in the physical, biological, and social sciences and as a graduate level text in statistics. Some parts may also serve as an undergraduate introductory course. Theory and methodology are separated to allow presentations on different levels. In addition to coverage of classical methods of time series regression, ARIMA models, spectral analysis and state-space models, the text includes modern developments including categorical time series analysis, multivariate spectral methods, long memory series, nonli...

  2. A KST framework for correlation network construction from time series signals

    Science.gov (United States)

    Qi, Jin-Peng; Gu, Quan; Zhu, Ying; Zhang, Ping

    2018-04-01

    A KST (Kolmogorov-Smirnov test and T statistic) method is used for construction of a correlation network based on the fluctuation of each time series within the multivariate time signals. In this method, each time series is divided equally into multiple segments, and the maximal data fluctuation in each segment is calculated by a KST change detection procedure. Connections between each time series are derived from the data fluctuation matrix, and are used for construction of the fluctuation correlation network (FCN). The method was tested with synthetic simulations and the result was compared with those from using KS or T only for detection of data fluctuation. The novelty of this study is that the correlation analyses was based on the data fluctuation in each segment of each time series rather than on the original time signals, which would be more meaningful for many real world applications and for analysis of large-scale time signals where prior knowledge is uncertain.

  3. Multivariate stochastic analysis for Monthly hydrological time series at Cuyahoga River Basin

    Science.gov (United States)

    zhang, L.

    2011-12-01

    Copulas have become a very powerful statistical and stochastic methodology for multivariate analysis in environmental and water resources engineering. In recent years, the popular one-parameter Archimedean copulas (e.g. the Gumbel-Hougaard, Cook-Johnson and Frank copulas) and the meta-elliptical copulas (e.g. the Gaussian and Student-t copulas) have been applied in multivariate hydrological analyses, e.g. multivariate rainfall (rainfall intensity, duration and depth), flood (peak discharge, duration and volume), and drought analyses (drought length, mean and minimum SPI values, and drought mean areal extent). Copulas have also been applied in flood frequency analysis at the confluences of river systems by taking into account the dependence among upstream gauge stations rather than by using the hydrological routing technique. In most of the studies above, the annual time series have been considered stationary signals, in which the values are assumed to be independent identically distributed (i.i.d.) random variables. But in reality, hydrological time series, especially daily and monthly hydrological time series, cannot be considered i.i.d. random variables due to the periodicity present in the data structure. The stationarity assumption is also in question due to climate change and land use and land cover (LULC) change in the past years. To this end, it is necessary to re-evaluate the classic approach to the study of hydrological time series by relaxing the stationarity assumption through a nonstationary approach. As to the study of the dependence structure of the hydrological time series, the assumption of the same type of univariate distribution also needs to be relaxed by adopting copula theory. In this paper, the univariate monthly hydrological time series will be studied through the nonstationary time series analysis approach. The dependence structure of the multivariate monthly hydrological time series will be

  4. Forecasting daily meteorological time series using ARIMA and regression models

    Science.gov (United States)

    Murat, Małgorzata; Malinowska, Iwona; Gos, Magdalena; Krzyszczak, Jaromir

    2018-04-01

    The daily air temperature and precipitation time series recorded between January 1, 1980 and December 31, 2010 at four European sites (Jokioinen, Dikopshof, Lleida and Lublin) from different climatic zones were modeled and forecasted. In our forecasting we used the Box-Jenkins and Holt-Winters seasonal autoregressive integrated moving-average methods, the autoregressive integrated moving-average with external regressors in the form of Fourier terms, and time series regression including trend and seasonality components, implemented with R software. It was demonstrated that the obtained models are able to capture the dynamics of the time series data and to produce sensible forecasts.

  5. Analysis of complex time series using refined composite multiscale entropy

    International Nuclear Information System (INIS)

    Wu, Shuen-De; Wu, Chiu-Wen; Lin, Shiou-Gwo; Lee, Kung-Yen; Peng, Chung-Kang

    2014-01-01

    Multiscale entropy (MSE) is an effective algorithm for measuring the complexity of a time series that has been applied in many fields successfully. However, MSE may yield an inaccurate estimation of entropy or induce undefined entropy because the coarse-graining procedure reduces the length of a time series considerably at large scales. Composite multiscale entropy (CMSE) was recently proposed to improve the accuracy of MSE, but it does not resolve undefined entropy. Here we propose a refined composite multiscale entropy (RCMSE) to improve CMSE. For short time series analyses, we demonstrate that RCMSE increases the accuracy of entropy estimation and reduces the probability of inducing undefined entropy.
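
    A minimal, unoptimized sketch of the refined composite idea as commonly stated: for each scale the series is coarse-grained at all shifted offsets, the template-match counts for lengths m and m+1 are summed across offsets, and the entropy is the negative log of their ratio. The values m = 2 and r = 0.15·SD are conventional defaults assumed here, and the white-noise test signal is illustrative.

```python
import numpy as np

def _template_counts(y, m, r):
    """Numbers of matching template pairs of length m and m+1 (Chebyshev distance <= r)."""
    n = len(y)
    counts = []
    for mm in (m, m + 1):
        templ = np.array([y[i:i + mm] for i in range(n - m)])   # same number of templates
        c = 0
        for i in range(len(templ)):
            d = np.max(np.abs(templ[i + 1:] - templ[i]), axis=1)
            c += np.sum(d <= r)
        counts.append(c)
    return counts[0], counts[1]

def rcmse(x, scale, m=2, r_factor=0.15):
    """Refined composite multiscale entropy at one scale: template counts are
    summed over the `scale` shifted coarse-grained series before taking the log."""
    r = r_factor * np.std(x)
    B_total, A_total = 0, 0
    for k in range(scale):
        y = x[k:]
        n_windows = len(y) // scale
        coarse = y[:n_windows * scale].reshape(n_windows, scale).mean(axis=1)
        B, A = _template_counts(coarse, m, r)
        B_total += B
        A_total += A
    return -np.log(A_total / B_total)

rng = np.random.default_rng(10)
noise = rng.standard_normal(3000)
print([round(rcmse(noise, s), 2) for s in (1, 2, 5)])
```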

  6. Compounding approach for univariate time series with nonstationary variances

    Science.gov (United States)

    Schäfer, Rudi; Barkhofen, Sonja; Guhr, Thomas; Stöckmann, Hans-Jürgen; Kuhl, Ulrich

    2015-12-01

    A defining feature of nonstationary systems is the time dependence of their statistical parameters. Measured time series may exhibit Gaussian statistics on short time horizons, due to the central limit theorem. The sample statistics for long time horizons, however, averages over the time-dependent variances. To model the long-term statistical behavior, we compound the local distribution with the distribution of its parameters. Here, we consider two concrete, but diverse, examples of such nonstationary systems: the turbulent air flow of a fan and a time series of foreign exchange rates. Our main focus is to empirically determine the appropriate parameter distribution for the compounding approach. To this end, we extract the relevant time scales by decomposing the time signals into windows and determine the distribution function of the thus obtained local variances.
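    The windowing step described above can be sketched as follows: a long, locally Gaussian signal with slowly drifting variance is divided into short windows, the local variance is estimated in each, and the empirical distribution of those variances guides the choice of the compounding (parameter) distribution. The window length and the synthetic signal are assumptions.

```python
# Sketch of the compounding idea: split a long, nonstationary signal into
# short windows, estimate the local variance in each, and inspect the
# distribution of those variances. Window length is an assumption.
import numpy as np

rng = np.random.default_rng(3)

# Synthetic nonstationary signal: locally Gaussian, slowly drifting variance
n, window = 100_000, 500
true_sigma = 1.0 + 0.5 * np.sin(2 * np.pi * np.arange(n) / 20_000)
x = rng.normal(0.0, true_sigma)

# Local variances on non-overlapping windows
local_var = x[: n // window * window].reshape(-1, window).var(axis=1)

# Empirical moments of the variance distribution guide the choice of the
# compounding (parameter) distribution, e.g. log-normal vs. inverse-gamma.
print("mean of local variances:", local_var.mean())
print("std  of local variances:", local_var.std())
print("skew (rough):", ((local_var - local_var.mean()) ** 3).mean() / local_var.std() ** 3)
```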

  7. Pre-series and testing route for the serial fabrication of W7-X target elements

    International Nuclear Information System (INIS)

    Boscary, J.; Greuner, H.; Friedrich, T.; Traxler, H.; Mendelevitch, B.; Boeswirth, B.; Schlosser, J.; Smirnow, M.; Stadler, R.

    2009-01-01

    The fabrication of the actively cooled high-heat-flux divertor of the WENDELSTEIN 7-X stellarator (W7-X) requires the delivery of 890 target elements, which are designed to withstand a stationary heat flux of 10 MW/m². The organization of the manufacturing and testing route for the serial fabrication is the result of the pre-series activities. Flat CFC Sepcarb NB31 tiles are bonded to a CuCrZr copper alloy cooling structure in consecutive steps. A copper layer is active-metal-cast onto the CFC tiles, and then an OF-copper layer is added by hot isostatic pressing to produce bi-layer tiles. These tiles are bonded by electron beam welding onto the cooling structure, which was manufactured independently. The introduction of the bi-layer technology proved to be a significant improvement of the bond reliability under thermal cycling loading. This result is also a consequence of the improved bond inspections performed throughout the manufacturing route in the ARGUS pulsed thermography facility of PLANSEE. The repair process for the bonding by electron beam welding was also qualified. The extended pre-series activities related to the qualification of fabrication processes, with the relevant non-destructive examinations, aim to minimize the risks for the serial manufacturing and to guarantee the steady-state operation of the W7-X divertor.

  8. Tools for Generating Useful Time-series Data from PhenoCam Images

    Science.gov (United States)

    Milliman, T. E.; Friedl, M. A.; Frolking, S.; Hufkens, K.; Klosterman, S.; Richardson, A. D.; Toomey, M. P.

    2012-12-01

    The PhenoCam project (http://phenocam.unh.edu/) is tasked with acquiring, processing, and archiving digital repeat photography to be used for scientific studies of vegetation phenological processes. Over the past 5 years the PhenoCam project has collected over 2 million time series images for a total of over 700 GB of image data. Several papers have been published describing derived "vegetation indices" (such as green-chromatic-coordinate or gcc) which can be compared to standard measures such as NDVI or EVI. Imagery from our archive is available for download but converting series of images for a particular camera into useful scientific data, while simple in principle, is complicated by a variety of factors. Cameras are often exposed to harsh weather conditions (high wind, rain, ice, snow pile up), which result in images where the field of view (FOV) is partially obscured or completely blocked for periods of time. The FOV can also change for other reasons (mount failures, tower maintenance, etc.). Some of the relatively inexpensive cameras that are being used can also temporarily lose color balance or exposure controls, resulting in loss of imagery. All these factors negatively influence the automated analysis of the image time series, making this a non-trivial task. Here we discuss the challenges of processing PhenoCam image time-series for vegetation monitoring and the associated data management tasks. We describe our current processing framework and a simple standardized output format for the resulting time-series data. The time-series data in this format will be generated for specific "regions of interest" (ROI's) for each of the cameras in the PhenoCam network. This standardized output (which will be updated daily) can be considered 'the pulse' of a particular camera and will provide a default phenological dynamic for said camera. The time-series data can also be viewed as a higher level product which can be used to generate "vegetation indices", like gcc, for
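    As a small illustration of the kind of derived index mentioned above, the sketch below computes the green chromatic coordinate (gcc = G / (R + G + B)) over a fixed region of interest of a single camera image; the file name and ROI coordinates are hypothetical.

```python
# Sketch: green chromatic coordinate (gcc = G / (R + G + B)) for a fixed
# region of interest in one camera image. File name and ROI are hypothetical.
import numpy as np
from PIL import Image

def gcc_for_roi(path, roi):
    """Mean gcc over a rectangular ROI given as (row0, row1, col0, col1)."""
    img = np.asarray(Image.open(path).convert("RGB"), dtype=float)
    r0, r1, c0, c1 = roi
    patch = img[r0:r1, c0:c1, :]
    r, g, b = patch[..., 0], patch[..., 1], patch[..., 2]
    total = r + g + b
    total[total == 0] = np.nan            # guard against all-black pixels
    return np.nanmean(g / total)

# Hypothetical usage for one image of a daily sequence:
# gcc = gcc_for_roi("site_2012_06_01_120000.jpg", roi=(100, 400, 200, 600))
```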

  9. Multiple Time Series Ising Model for Financial Market Simulations

    International Nuclear Information System (INIS)

    Takaishi, Tetsuya

    2015-01-01

    In this paper we propose an Ising model which simulates multiple financial time series. Our model introduces the interaction which couples to spins of other systems. Simulations from our model show that time series exhibit the volatility clustering that is often observed in the real financial markets. Furthermore we also find non-zero cross correlations between the volatilities from our model. Thus our model can simulate stock markets where volatilities of stocks are mutually correlated

  10. Pleistocene vertebrates of the Yukon Territory

    Science.gov (United States)

    Harington, C. R.

    2011-08-01

    Unglaciated parts of the Yukon constitute one of the most important areas in North America for yielding Pleistocene vertebrate fossils. Nearly 30 vertebrate faunal localities are reviewed spanning a period of about 1.6 Ma (million years ago) to the close of the Pleistocene some 10 000 BP (radiocarbon years before present, taken as 1950). The vertebrate fossils represent at least 8 species of fishes, 1 amphibian, 41 species of birds and 83 species of mammals. Dominant among the large mammals are: steppe bison (Bison priscus), horse (Equus sp.), woolly mammoth (Mammuthus primigenius), and caribou (Rangifer tarandus) - signature species of the Mammoth Steppe fauna (Fig. 1), which was widespread from the British Isles, through northern Europe, and Siberia to Alaska, Yukon and adjacent Northwest Territories. The Yukon faunas extend from Herschel Island in the north to Revenue Creek in the south and from the Alaskan border in the west to Ketza River in the east. The Yukon holds evidence of the earliest-known people in North America. Artifacts made from bison, mammoth and caribou bones from Bluefish Caves, Old Crow Basin and Dawson City areas show that people had a substantial knowledge of making and using bone tools at least by 25 000 BP, and possibly as early as 40 000 BP. A suggested chronological sequence of Yukon Pleistocene vertebrates (Table 1) facilitates comparison of selected faunas and indicates the known duration of various taxa.

  11. Human Remains from the Pleistocene-Holocene Transition of Southwest China Suggest a Complex Evolutionary History for East Asians

    Science.gov (United States)

    Curnoe, Darren; Xueping, Ji; Herries, Andy I. R.; Kanning, Bai; Taçon, Paul S. C.; Zhende, Bao; Fink, David; Yunsheng, Zhu; Hellstrom, John; Yun, Luo; Cassis, Gerasimos; Bing, Su; Wroe, Stephen; Shi, Hong; Parr, William C. H.; Shengmin, Huang; Rogers, Natalie

    2012-01-01

    Background Later Pleistocene human evolution in East Asia remains poorly understood owing to a scarcity of well described, reliably classified and accurately dated fossils. Southwest China has been identified from genetic research as a hotspot of human diversity, containing ancient mtDNA and Y-DNA lineages, and has yielded a number of human remains thought to derive from Pleistocene deposits. We have prepared, reconstructed, described and dated a new partial skull from a consolidated sediment block collected in 1979 from the site of Longlin Cave (Guangxi Province). We also undertook new excavations at Maludong (Yunnan Province) to clarify the stratigraphy and dating of a large sample of mostly undescribed human remains from the site. Methodology/Principal Findings We undertook a detailed comparison of cranial, including a virtual endocast for the Maludong calotte, mandibular and dental remains from these two localities. Both samples probably derive from the same population, exhibiting an unusual mixture of modern human traits, characters probably plesiomorphic for later Homo, and some unusual features. We dated charcoal with AMS radiocarbon dating and speleothem with the Uranium-series technique and the results show both samples to be from the Pleistocene-Holocene transition: ∼14.3-11.5 ka. Conclusions/Significance Our analysis suggests two plausible explanations for the morphology sampled at Longlin Cave and Maludong. First, it may represent a late-surviving archaic population, perhaps paralleling the situation seen in North Africa as indicated by remains from Dar-es-Soltane and Temara, and maybe also in southern China at Zhirendong. Alternatively, East Asia may have been colonised during multiple waves during the Pleistocene, with the Longlin-Maludong morphology possibly reflecting deep population substructure in Africa prior to modern humans dispersing into Eurasia. PMID:22431968

  12. Time Series Modelling of Syphilis Incidence in China from 2005 to 2012.

    Science.gov (United States)

    Zhang, Xingyu; Zhang, Tao; Pei, Jiao; Liu, Yuanyuan; Li, Xiaosong; Medrano-Gracia, Pau

    2016-01-01

    The infection rate of syphilis in China has increased dramatically in recent decades, becoming a serious public health concern. Early prediction of syphilis is therefore of great importance for health planning and management. In this paper, we analyzed surveillance time series data for primary, secondary, tertiary, congenital and latent syphilis in mainland China from 2005 to 2012. Seasonality and long-term trend were explored with decomposition methods. Autoregressive integrated moving average (ARIMA) was used to fit a univariate time series model of syphilis incidence. A separate multi-variable time series for each syphilis type was also tested using an autoregressive integrated moving average model with exogenous variables (ARIMAX). The syphilis incidence rates have increased three-fold from 2005 to 2012. All syphilis time series showed strong seasonality and increasing long-term trend. Both ARIMA and ARIMAX models fitted and estimated syphilis incidence well. All univariate time series showed highest goodness-of-fit results with the ARIMA(0,0,1)×(0,1,1) model. Time series analysis was an effective tool for modelling the historical and future incidence of syphilis in China. The ARIMAX model showed superior performance compared to the ARIMA model for the modelling of syphilis incidence. Time series correlations existed between the models for primary, secondary, tertiary, congenital and latent syphilis.
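    The two modelling steps reported above can be sketched on synthetic monthly incidence data as follows: a classical decomposition to inspect trend and seasonality, followed by a seasonal ARIMA(0,0,1)×(0,1,1)12 fit of the form the study found best. The simulated series stands in for the surveillance data, which are not reproduced here.

```python
# Sketch of the two steps described above on simulated monthly incidence data:
# classical decomposition, then a seasonal ARIMA(0,0,1)x(0,1,1)12 fit.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(4)
idx = pd.date_range("2005-01-01", "2012-12-01", freq="MS")
t = np.arange(len(idx))
incidence = 2 + 0.05 * t + 0.8 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.2, len(idx))
y = pd.Series(incidence, index=idx)

# Trend/seasonality exploration
decomp = seasonal_decompose(y, model="additive", period=12)
print(decomp.trend.dropna().head())

# Seasonal ARIMA of the form reported in the abstract
res = SARIMAX(y, order=(0, 0, 1), seasonal_order=(0, 1, 1, 12)).fit(disp=False)
print(res.forecast(steps=12))
```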

  13. Upper Pleistocene Human Dispersals out of Africa: A Review of the Current State of the Debate

    Science.gov (United States)

    Beyin, Amanuel

    2011-01-01

    Although there is a general consensus on African origin of early modern humans, there is disagreement about how and when they dispersed to Eurasia. This paper reviews genetic and Middle Stone Age/Middle Paleolithic archaeological literature from northeast Africa, Arabia, and the Levant to assess the timing and geographic backgrounds of Upper Pleistocene human colonization of Eurasia. At the center of the discussion lies the question of whether eastern Africa alone was the source of Upper Pleistocene human dispersals into Eurasia or were there other loci of human expansions outside of Africa? The reviewed literature hints at two modes of early modern human colonization of Eurasia in the Upper Pleistocene: (i) from multiple Homo sapiens source populations that had entered Arabia, South Asia, and the Levant prior to and soon after the onset of the Last Interglacial (MIS-5), (ii) from a rapid dispersal out of East Africa via the Southern Route (across the Red Sea basin), dating to ~74–60 kya. PMID:21716744

  14. FTSPlot: fast time series visualization for large datasets.

    Directory of Open Access Journals (Sweden)

    Michael Riss

    Full Text Available The analysis of electrophysiological recordings often involves visual inspection of time series data to locate specific experiment epochs, mask artifacts, and verify the results of signal processing steps, such as filtering or spike detection. Long-term experiments with continuous data acquisition generate large amounts of data. Rapid browsing through these massive datasets poses a challenge to conventional data plotting software because the plotting time increases proportionately to the increase in the volume of data. This paper presents FTSPlot, which is a visualization concept for large-scale time series datasets using techniques from the field of high performance computer graphics, such as hierarchic level of detail and out-of-core data handling. In a preprocessing step, time series data, event, and interval annotations are converted into an optimized data format, which then permits fast, interactive visualization. The preprocessing step has a computational complexity of O(n x log(N; the visualization itself can be done with a complexity of O(1 and is therefore independent of the amount of data. A demonstration prototype has been implemented and benchmarks show that the technology is capable of displaying large amounts of time series data, event, and interval annotations lag-free with < 20 ms ms. The current 64-bit implementation theoretically supports datasets with up to 2(64 bytes, on the x86_64 architecture currently up to 2(48 bytes are supported, and benchmarks have been conducted with 2(40 bytes/1 TiB or 1.3 x 10(11 double precision samples. The presented software is freely available and can be included as a Qt GUI component in future software projects, providing a standard visualization method for long-term electrophysiological experiments.
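    The hierarchic level-of-detail idea can be illustrated with a much simpler sketch than FTSPlot itself: per-block min/max envelopes are precomputed once, so any zoom level can later be drawn from a bounded number of points. The block factor and thresholds are arbitrary choices, and no out-of-core handling is attempted here.

```python
# Minimal illustration (not FTSPlot's implementation) of level-of-detail
# plotting: precompute per-block min/max envelopes so that any zoom level can
# be rendered from a bounded number of points regardless of dataset size.
import numpy as np

def build_minmax_pyramid(x, factor=16, min_len=4096):
    """Hierarchy of (min, max) envelopes, each level `factor` times coarser."""
    levels = [(x, x)]
    lo, hi = x, x
    while len(lo) > min_len:
        n = len(lo) // factor * factor
        lo = lo[:n].reshape(-1, factor).min(axis=1)
        hi = hi[:n].reshape(-1, factor).max(axis=1)
        levels.append((lo, hi))
    return levels

def envelope_for_view(levels, start, stop, max_points=4000, factor=16):
    """Pick the coarsest level that still gives <= max_points in the view."""
    span = stop - start
    for depth, (lo, hi) in enumerate(levels):
        step = factor ** depth
        if span // step <= max_points:
            return lo[start // step: stop // step], hi[start // step: stop // step]
    lo, hi = levels[-1]
    step = factor ** (len(levels) - 1)
    return lo[start // step: stop // step], hi[start // step: stop // step]

signal = np.random.default_rng(5).normal(size=10_000_000)
pyramid = build_minmax_pyramid(signal)
lo, hi = envelope_for_view(pyramid, start=0, stop=10_000_000)
print(len(lo), len(hi))   # bounded number of points to draw
```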

  15. Normalization methods in time series of platelet function assays

    Science.gov (United States)

    Van Poucke, Sven; Zhang, Zhongheng; Roest, Mark; Vukicevic, Milan; Beran, Maud; Lauwereins, Bart; Zheng, Ming-Hua; Henskens, Yvonne; Lancé, Marcus; Marcus, Abraham

    2016-01-01

    Abstract Platelet function can be quantitatively assessed by specific assays such as light-transmission aggregometry, multiple-electrode aggregometry measuring the response to adenosine diphosphate (ADP), arachidonic acid, collagen, and thrombin-receptor activating peptide, and viscoelastic tests such as rotational thromboelastometry (ROTEM). The task of extracting meaningful statistical and clinical information from high-dimensional data spaces in temporal multivariate clinical data represented in multivariate time series is complex. Building insightful visualizations for multivariate time series demands adequate usage of normalization techniques. In this article, various methods for data normalization (z-transformation, range transformation, proportion transformation, and interquartile range) are presented and visualized, and the most suitable approach for platelet function data series is discussed. Normalization was calculated per assay (test) for all time points and per time point for all tests. Interquartile range, range transformation, and z-transformation demonstrated the correlation, as calculated by the Spearman correlation test, when normalized per assay (test) for all time points. When normalizing per time point for all tests, no correlation could be extracted from the charts, as was also the case when using all data as one dataset for normalization. PMID:27428217
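    The four normalizations named in the abstract can be sketched per assay (per column, across all time points) as follows; the platelet data frame used here is hypothetical.

```python
# Sketch of the four normalizations named above, applied per assay (i.e. per
# column) across all time points. The data frame is hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(6)
df = pd.DataFrame({
    "ADP_aggregometry": rng.normal(60, 10, 24),
    "ROTEM_MCF": rng.normal(55, 5, 24),
}, index=pd.RangeIndex(24, name="time_point"))

def z_transform(col):
    return (col - col.mean()) / col.std()

def range_transform(col):
    return (col - col.min()) / (col.max() - col.min())

def proportion_transform(col):
    return col / col.sum()

def iqr_transform(col):
    q1, q3 = col.quantile([0.25, 0.75])
    return (col - col.median()) / (q3 - q1)

normalized = {
    "z": df.apply(z_transform),
    "range": df.apply(range_transform),
    "proportion": df.apply(proportion_transform),
    "iqr": df.apply(iqr_transform),
}
print(normalized["z"].head())
```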

  16. A geometric morphometric analysis of hominin upper second and third molars, with particular emphasis on European Pleistocene populations.

    Science.gov (United States)

    Gómez-Robles, Aida; Bermúdez de Castro, José María; Martinón-Torres, María; Prado-Simón, Leyre; Arsuaga, Juan Luis

    2012-09-01

    The study of dental morphology by means of geometric morphometric methods allows for a detailed and quantitative comparison of hominin species that is useful for taxonomic assignment and phylogenetic reconstruction. Upper second and third molars have been studied in a comprehensive sample of Plio- and Pleistocene hominins from African, Asian and European sites in order to complete our analysis of the upper postcanine dentition. Intraspecific variation in these two molars is high, but some interspecific trends can be identified. Both molars exhibit a strong reduction of the distal cusps in recent hominin species, namely European Homo heidelbergensis, Homo neanderthalensis and Homo sapiens, but this reduction shows specific patterns and proportions in the three groups. Second molars tend to show four well developed cusps in earlier hominin species and their morphology is only marginally affected by allometric effects. Third molars can be incipiently reduced in earlier species and they evince a significant allometric component, identified both inter- and intraspecifically. European Middle Pleistocene fossils from Sima de los Huesos (SH) show a very strong reduction of these two molars, even more marked than the reduction observed in Neanderthals and in modern human populations. The highly derived shape of SH molars points to an early acquisition of typical Neanderthal dental traits by pre-Neanderthal populations and to a deviation of this population from mean morphologies of other European Middle Pleistocene groups. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. Development and application of a modified dynamic time warping algorithm (DTW-S) to analyses of primate brain expression time series.

    Science.gov (United States)

    Yuan, Yuan; Chen, Yi-Ping Phoebe; Ni, Shengyu; Xu, Augix Guohua; Tang, Lin; Vingron, Martin; Somel, Mehmet; Khaitovich, Philipp

    2011-08-18

    Comparing biological time series data across different conditions, or different specimens, is a common but still challenging task. Algorithms aligning two time series represent a valuable tool for such comparisons. While many powerful computation tools for time series alignment have been developed, they do not provide significance estimates for time shift measurements. Here, we present an extended version of the original DTW algorithm that allows us to determine the significance of time shift estimates in time series alignments, the DTW-Significance (DTW-S) algorithm. The DTW-S combines important properties of the original algorithm and other published time series alignment tools: DTW-S calculates the optimal alignment for each time point of each gene, it uses interpolated time points for time shift estimation, and it does not require alignment of the time-series end points. As a new feature, we implement a simulation procedure based on parameters estimated from real time series data, on a series-by-series basis, allowing us to determine the false positive rate (FPR) and the significance of the estimated time shift values. We assess the performance of our method using simulation data and real expression time series from two published primate brain expression datasets. Our results show that this method can provide accurate and robust time shift estimates for each time point on a gene-by-gene basis. Using these estimates, we are able to uncover novel features of the biological processes underlying human brain development and maturation. The DTW-S provides a convenient tool for calculating accurate and robust time shift estimates at each time point for each gene, based on time series data. The estimates can be used to uncover novel biological features of the system being studied. The DTW-S is freely available as an R package TimeShift at http://www.picb.ac.cn/Comparative/data.html.
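    For orientation, the alignment step that DTW-S builds on is plain dynamic time warping, which can be written as a short dynamic program; the significance and simulation machinery of DTW-S itself is not reproduced here.

```python
# Compact dynamic-programming implementation of plain DTW, the alignment step
# that DTW-S builds on.
import numpy as np

def dtw_distance(x, y):
    """Classic DTW distance with unit step pattern and no window constraint."""
    n, m = len(x), len(y)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(x[i - 1] - y[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

a = np.sin(np.linspace(0, 2 * np.pi, 50))
b = np.sin(np.linspace(0, 2 * np.pi, 60) - 0.5)       # time-shifted counterpart
print(dtw_distance(a, b))
```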

  18. hctsa: A Computational Framework for Automated Time-Series Phenotyping Using Massive Feature Extraction.

    Science.gov (United States)

    Fulcher, Ben D; Jones, Nick S

    2017-11-22

    Phenotype measurements frequently take the form of time series, but we currently lack a systematic method for relating these complex data streams to scientifically meaningful outcomes, such as relating the movement dynamics of organisms to their genotype or measurements of brain dynamics of a patient to their disease diagnosis. Previous work addressed this problem by comparing implementations of thousands of diverse scientific time-series analysis methods in an approach termed highly comparative time-series analysis. Here, we introduce hctsa, a software tool for applying this methodological approach to data. hctsa includes an architecture for computing over 7,700 time-series features and a suite of analysis and visualization algorithms to automatically select useful and interpretable time-series features for a given application. Using exemplar applications to high-throughput phenotyping experiments, we show how hctsa allows researchers to leverage decades of time-series research to quantify and understand informative structure in time-series data. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  19. Small Sample Properties of Bayesian Multivariate Autoregressive Time Series Models

    Science.gov (United States)

    Price, Larry R.

    2012-01-01

    The aim of this study was to compare the small sample (N = 1, 3, 5, 10, 15) performance of a Bayesian multivariate vector autoregressive (BVAR-SEM) time series model relative to frequentist power and parameter estimation bias. A multivariate autoregressive model was developed based on correlated autoregressive time series vectors of varying…

  20. Time-series panel analysis (TSPA): multivariate modeling of temporal associations in psychotherapy process.

    Science.gov (United States)

    Ramseyer, Fabian; Kupper, Zeno; Caspar, Franz; Znoj, Hansjörg; Tschacher, Wolfgang

    2014-10-01

    Processes occurring in the course of psychotherapy are characterized by the simple fact that they unfold in time and that the multiple factors engaged in change processes vary highly between individuals (idiographic phenomena). Previous research, however, has neglected the temporal perspective by its traditional focus on static phenomena, which were mainly assessed at the group level (nomothetic phenomena). To support a temporal approach, the authors introduce time-series panel analysis (TSPA), a statistical methodology explicitly focusing on the quantification of temporal, session-to-session aspects of change in psychotherapy. TSPA models are initially built at the level of individuals and are subsequently aggregated at the group level, thus allowing the exploration of prototypical models. TSPA is based on vector auto-regression (VAR), an extension of univariate auto-regression models to multivariate time-series data. The application of TSPA is demonstrated in a sample of 87 outpatient psychotherapy patients who were monitored by postsession questionnaires. Prototypical mechanisms of change were derived from the aggregation of individual multivariate models of psychotherapy process. In a second step, the associations between mechanisms of change (TSPA) and pre- to postsymptom change were explored. TSPA allowed a prototypical process pattern to be identified, where the patient's alliance and self-efficacy were linked by a temporal feedback loop. Furthermore, the therapist's stability over time in both mastery and clarification interventions was positively associated with better outcomes. TSPA is a statistical tool that sheds new light on temporal mechanisms of change. Through this approach, clinicians may gain insight into prototypical patterns of change in psychotherapy. PsycINFO Database Record (c) 2014 APA, all rights reserved.
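    The individual-level building block of TSPA, a first-order vector autoregression over session-to-session ratings, can be sketched as below; the variable names and the simulated feedback loop are illustrative assumptions, and the aggregation across patients is only indicated in a comment.

```python
# Sketch of the building block of TSPA: a first-order vector autoregression
# fitted to one patient's session-to-session ratings. Variable names and the
# simulated data are hypothetical.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(7)
n_sessions = 40
alliance = np.zeros(n_sessions)
self_efficacy = np.zeros(n_sessions)
for t in range(1, n_sessions):      # simulate a feedback loop between the two
    alliance[t] = 0.5 * alliance[t - 1] + 0.3 * self_efficacy[t - 1] + rng.normal(0, 0.5)
    self_efficacy[t] = 0.4 * self_efficacy[t - 1] + 0.2 * alliance[t - 1] + rng.normal(0, 0.5)

data = pd.DataFrame({"alliance": alliance, "self_efficacy": self_efficacy})
result = VAR(data).fit(maxlags=1)
print(result.coefs)                  # lag-1 coefficient matrix for this patient
# In TSPA, such individual models would then be aggregated across patients.
```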

  1. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  2. On Stabilizing the Variance of Dynamic Functional Brain Connectivity Time Series.

    Science.gov (United States)

    Thompson, William Hedley; Fransson, Peter

    2016-12-01

    Assessment of dynamic functional brain connectivity based on functional magnetic resonance imaging (fMRI) data is an increasingly popular strategy to investigate temporal dynamics of the brain's large-scale network architecture. Current practice when deriving connectivity estimates over time is to use the Fisher transformation, which aims to stabilize the variance of correlation values that fluctuate around varying true correlation values. It is, however, unclear how well the stabilization of signal variance performed by the Fisher transformation works for each connectivity time series, when the true correlation is assumed to be fluctuating. This is of importance because many subsequent analyses either assume or perform better when the time series have stable variance or adhere to an approximate Gaussian distribution. In this article, using simulations and analysis of resting-state fMRI data, we analyze the effect of applying different variance stabilization strategies on connectivity time series. We focus our investigation on the Fisher transformation, the Box-Cox (BC) transformation and an approach that combines both transformations. Our results show that, if the intention of stabilizing the variance is to use metrics on the time series, where stable variance or a Gaussian distribution is desired (e.g., clustering), the Fisher transformation is not optimal and may even skew connectivity time series away from being Gaussian. Furthermore, we show that the suboptimal performance of the Fisher transformation can be substantially improved by including an additional BC transformation after the dynamic functional connectivity time series has been Fisher transformed.
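    The combined strategy discussed above can be sketched as follows: a sliding-window correlation series is Fisher transformed (arctanh) and then Box-Cox transformed. The window length and the simulated signals are assumptions rather than fMRI data.

```python
# Sketch of the combined strategy: a sliding-window correlation time series is
# Fisher transformed and then Box-Cox transformed. Window length and the
# simulated signals are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
n, window = 2000, 60
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(size=n)                 # two correlated "regions"

# Sliding-window correlation time series
r = np.array([np.corrcoef(x[i:i + window], y[i:i + window])[0, 1]
              for i in range(n - window)])

z = np.arctanh(r)                                # Fisher transformation

# Box-Cox requires positive input, so shift before transforming
shifted = z - z.min() + 1e-6
z_bc, lmbda = stats.boxcox(shifted)
print(f"estimated Box-Cox lambda: {lmbda:.3f}")
```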

  3. Characteristics of the transmission of autoregressive sub-patterns in financial time series

    Science.gov (United States)

    Gao, Xiangyun; An, Haizhong; Fang, Wei; Huang, Xuan; Li, Huajiao; Zhong, Weiqiong

    2014-09-01

    There are many types of autoregressive patterns in financial time series, and they form a transmission process. Here, we define autoregressive patterns quantitatively through an econometrical regression model. We present a computational algorithm that sets the autoregressive patterns as nodes and transmissions between patterns as edges, and then converts the transmission process of autoregressive patterns in a time series into a network. We utilised daily Shanghai (securities) composite index time series to study the transmission characteristics of autoregressive patterns. We found statistically significant evidence that the financial market is not random and that there are similar characteristics between parts and whole time series. A few types of autoregressive sub-patterns and transmission patterns drive the oscillations of the financial market. A clustering effect on fluctuations appears in the transmission process, and certain non-major autoregressive sub-patterns have high media capabilities in the financial time series. Different stock indexes exhibit similar characteristics in the transmission of fluctuation information. This work not only proposes a distinctive perspective for analysing financial time series but also provides important information for investors.
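    The general construction, patterns as nodes and consecutive transitions as weighted directed edges, can be sketched as below. The sign-based symbolization used here is a simple stand-in for the econometric autoregressive patterns defined in the paper.

```python
# Minimal sketch of turning a pattern sequence into a weighted, directed
# transition network. The symbolization below (signs of three consecutive
# returns) is a stand-in for the paper's regression-based patterns.
import numpy as np
from collections import Counter

rng = np.random.default_rng(12)
returns = rng.normal(0, 1, 500)

# Symbolize: each window of 3 returns becomes a +/- pattern such as "+-+"
patterns = ["".join("+" if r > 0 else "-" for r in returns[i:i + 3])
            for i in range(len(returns) - 2)]

# Nodes are patterns, edge weights count consecutive transitions
edges = Counter(zip(patterns[:-1], patterns[1:]))
for (src, dst), weight in sorted(edges.items(), key=lambda kv: -kv[1])[:5]:
    print(f"{src} -> {dst}: {weight}")
```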

  4. A Review of Some Aspects of Robust Inference for Time Series.

    Science.gov (United States)

    1984-09-01

    A Review of Some Aspects of Robust Inference for Time Series, by R. D. Martin, Department of Statistics, University of Washington, Seattle, September 1984 (technical report). From the abstract: "One cannot hope to have a good method for dealing with outliers in time series by using only an instantaneous nonlinear transformation of the data..."

  5. Refined composite multiscale weighted-permutation entropy of financial time series

    Science.gov (United States)

    Zhang, Yongping; Shang, Pengjian

    2018-04-01

    For quantifying the complexity of nonlinear systems, multiscale weighted-permutation entropy (MWPE) has recently been proposed. MWPE incorporates amplitude information and has been applied to account for the multiple inherent dynamics of time series. However, MWPE may be unreliable, because its estimated values fluctuate considerably under slight variations of the data locations and show a significant distinction only for different lengths of time series. Therefore, we propose the refined composite multiscale weighted-permutation entropy (RCMWPE). By comparing the RCMWPE results with those of other methods on both synthetic data and financial time series, we show that the RCMWPE method not only inherits the advantages of MWPE but is also less sensitive to the data locations, more stable, and much less dependent on the length of the time series. Moreover, we present and discuss the results of the RCMWPE method on daily price return series from Asian and European stock markets. There are significant differences between Asian markets and European markets, and the entropy values of the Hang Seng Index (HSI) are close to but higher than those of the European markets. The reliability of the proposed RCMWPE method is supported by simulations on generated and real data. It could be applied in a variety of fields to quantify the complexity of systems over multiple scales more accurately.

  6. Parametric, nonparametric and parametric modelling of a chaotic circuit time series

    Science.gov (United States)

    Timmer, J.; Rust, H.; Horbelt, W.; Voss, H. U.

    2000-09-01

    The determination of a differential equation underlying a measured time series is a frequently arising task in nonlinear time series analysis. In the validation of a proposed model one often faces the dilemma that it is hard to decide whether possible discrepancies between the time series and model output are caused by an inappropriate model or by bad estimates of parameters in a correct type of model, or both. We propose a combination of parametric modelling based on Bock's multiple shooting algorithm and nonparametric modelling based on optimal transformations as a strategy to test proposed models and if rejected suggest and test new ones. We exemplify this strategy on an experimental time series from a chaotic circuit where we obtain an extremely accurate reconstruction of the observed attractor.

  7. Synthetic river flow time series generator for dispatch and spot price forecast

    International Nuclear Information System (INIS)

    Flores, R.A.

    2007-01-01

    Decision-making in electricity markets is complicated by uncertainties in demand growth, power supplies and fuel prices. In Peru, where the electrical power system is highly dependent on water resources at dams and river flows, hydrological uncertainties play a primary role in planning, price and dispatch forecasts. This paper proposes a signal processing method for generating new synthetic river flow time series as a support for planning and spot market price forecasting. River flow time series are natural phenomena representing a continuous-time domain process. As an alternative synthetic representation of the original river flow time series, the proposed signal processing method preserves correlations, basic statistics and seasonality. It takes into account deterministic, periodic and non-periodic components such as those due to the El Niño Southern Oscillation phenomenon. The new synthetic time series has many correlations with the original river flow time series, rendering it suitable for possible replacement of the classical method of sorting historical river flow time series. As a dispatch and planning approach to spot pricing, the proposed method offers higher accuracy modeling by decomposing the signal into deterministic, periodic, non-periodic and stochastic sub-signals. 4 refs., 4 tabs., 13 figs

  8. Causality as a Rigorous Notion and Quantitative Causality Analysis with Time Series

    Science.gov (United States)

    Liang, X. S.

    2017-12-01

    … on paleoclimate scales the cause-effect relation may be completely reversed. Key words: Causation, Information flow, Uncertainty Generation, El Niño, IOD, CO2/Global warming. Reference: Liang, 2014: Unraveling the cause-effect relation between time series. PRE 90, 052150. News Report: http://scitation.aip.org/content/aip/magazine/physicstoday/news/10.1063/PT.5.7124

  9. Cross-sample entropy of foreign exchange time series

    Science.gov (United States)

    Liu, Li-Zhi; Qian, Xi-Yuan; Lu, Heng-Yao

    2010-11-01

    The correlation of foreign exchange rates in currency markets is investigated based on the empirical data of DKK/USD, NOK/USD, CAD/USD, JPY/USD, KRW/USD, SGD/USD, THB/USD and TWD/USD for a period from 1995 to 2002. Cross-SampEn (cross-sample entropy) method is used to compare the returns of every two exchange rate time series to assess their degree of asynchrony. The calculation method of confidence interval of SampEn is extended and applied to cross-SampEn. The cross-SampEn and its confidence interval for every two of the exchange rate time series in periods 1995-1998 (before the Asian currency crisis) and 1999-2002 (after the Asian currency crisis) are calculated. The results show that the cross-SampEn of every two of these exchange rates becomes higher after the Asian currency crisis, indicating a higher asynchrony between the exchange rates. Especially for Singapore, Thailand and Taiwan, the cross-SampEn values after the Asian currency crisis are significantly higher than those before the Asian currency crisis. Comparison with the correlation coefficient shows that cross-SampEn is superior to describe the correlation between time series.

  10. Clustering Multivariate Time Series Using Hidden Markov Models

    Directory of Open Access Journals (Sweden)

    Shima Ghassempour

    2014-03-01

    Full Text Available In this paper we describe an algorithm for clustering multivariate time series with variables taking both categorical and continuous values. Time series of this type are frequent in health care, where they represent the health trajectories of individuals. The problem is challenging because categorical variables make it difficult to define a meaningful distance between trajectories. We propose an approach based on Hidden Markov Models (HMMs), where we first map each trajectory into an HMM, then define a suitable distance between HMMs and finally proceed to cluster the HMMs with a method based on a distance matrix. We test our approach on a simulated, but realistic, data set of 1,255 trajectories of individuals of age 45 and over, on a synthetic validation set with known clustering structure, and on a smaller set of 268 trajectories extracted from the longitudinal Health and Retirement Survey. The proposed method can be implemented quite simply using standard packages in R and Matlab and may be a good candidate for solving the difficult problem of clustering multivariate time series with categorical variables using tools that do not require advanced statistical knowledge, and are therefore accessible to a wide range of researchers.

  11. TimesVector: a vectorized clustering approach to the analysis of time series transcriptome data from multiple phenotypes.

    Science.gov (United States)

    Jung, Inuk; Jo, Kyuri; Kang, Hyejin; Ahn, Hongryul; Yu, Youngjae; Kim, Sun

    2017-12-01

    Identifying biologically meaningful gene expression patterns from time series gene expression data is important to understand the underlying biological mechanisms. To identify significantly perturbed gene sets between different phenotypes, analysis of time series transcriptome data requires consideration of time and sample dimensions. Thus, the analysis of such time series data seeks to search gene sets that exhibit similar or different expression patterns between two or more sample conditions, constituting the three-dimensional data, i.e. gene-time-condition. Computational complexity for analyzing such data is very high, compared to the already difficult NP-hard two dimensional biclustering algorithms. Because of this challenge, traditional time series clustering algorithms are designed to capture co-expressed genes with similar expression pattern in two sample conditions. We present a triclustering algorithm, TimesVector, specifically designed for clustering three-dimensional time series data to capture distinctively similar or different gene expression patterns between two or more sample conditions. TimesVector identifies clusters with distinctive expression patterns in three steps: (i) dimension reduction and clustering of time-condition concatenated vectors, (ii) post-processing clusters for detecting similar and distinct expression patterns and (iii) rescuing genes from unclassified clusters. Using four sets of time series gene expression data, generated by both microarray and high throughput sequencing platforms, we demonstrated that TimesVector successfully detected biologically meaningful clusters of high quality. TimesVector improved the clustering quality compared to existing triclustering tools and only TimesVector detected clusters with differential expression patterns across conditions successfully. The TimesVector software is available at http://biohealth.snu.ac.kr/software/TimesVector/. sunkim.bioinfo@snu.ac.kr. Supplementary data are available at

  12. Stochastic generation of hourly wind speed time series

    International Nuclear Information System (INIS)

    Shamshad, A.; Wan Mohd Ali Wan Hussin; Bawadi, M.A.; Mohd Sanusi, S.A.

    2006-01-01

    In the present study hourly wind speed data of Kuala Terengganu in Peninsular Malaysia are simulated by using the transition matrix approach of a Markovian process. The wind speed time series is divided into various states based on certain criteria. The next wind speed states are selected based on the previous states. The cumulative probability transition matrix has been formed in which each row ends with 1. Using uniform random numbers between 0 and 1, a series of future states is generated. These states have been converted to the corresponding wind speed values using another uniform random number generator. The accuracy of the model has been determined by comparing statistical characteristics such as average, standard deviation, root mean square error, probability density function and autocorrelation function of the generated data to those of the original data. The generated wind speed time series data is capable of preserving the wind speed characteristics of the observed data
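    A compact sketch of the transition-matrix approach described above: hourly speeds are discretized into states, a first-order transition probability matrix is estimated, and a synthetic state sequence is sampled and mapped back to speeds by a uniform draw within each state's bin. The bin width and the stand-in data are assumptions.

```python
# Sketch of the transition-matrix approach: discretize hourly wind speeds into
# states, estimate the first-order transition matrix, and sample a synthetic
# state sequence. Bin edges are an assumption.
import numpy as np

rng = np.random.default_rng(9)
observed = rng.weibull(2.0, size=8760) * 6.0          # stand-in for hourly speeds (m/s)

bins = np.arange(0, 26, 2)                            # 2 m/s wide states
states = np.digitize(observed, bins) - 1
n_states = len(bins)

# First-order transition probability matrix
counts = np.zeros((n_states, n_states))
for s, s_next in zip(states[:-1], states[1:]):
    counts[s, s_next] += 1
row_sums = counts.sum(axis=1, keepdims=True)
P = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
P[row_sums[:, 0] == 0] = 1.0 / n_states              # guard: uniform row for unseen states

# Generate a synthetic state sequence and map back to speeds
synthetic_states = [states[0]]
for _ in range(8760 - 1):
    p = P[synthetic_states[-1]]
    synthetic_states.append(rng.choice(n_states, p=p))
# Uniform draw within each state's bin, as in the abstract
synthetic_speed = bins[synthetic_states] + rng.uniform(0, 2, size=len(synthetic_states))
print(synthetic_speed.mean(), observed.mean())
```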

  13. Causal strength induction from time series data.

    Science.gov (United States)

    Soo, Kevin W; Rottman, Benjamin M

    2018-04-01

    One challenge when inferring the strength of cause-effect relations from time series data is that the cause and/or effect can exhibit temporal trends. If temporal trends are not accounted for, a learner could infer that a causal relation exists when it does not, or even infer that there is a positive causal relation when the relation is negative, or vice versa. We propose that learners use a simple heuristic to control for temporal trends-that they focus not on the states of the cause and effect at a given instant, but on how the cause and effect change from one observation to the next, which we call transitions. Six experiments were conducted to understand how people infer causal strength from time series data. We found that participants indeed use transitions in addition to states, which helps them to reach more accurate causal judgments (Experiments 1A and 1B). Participants use transitions more when the stimuli are presented in a naturalistic visual format than a numerical format (Experiment 2), and the effect of transitions is not driven by primacy or recency effects (Experiment 3). Finally, we found that participants primarily use the direction in which variables change rather than the magnitude of the change for estimating causal strength (Experiments 4 and 5). Collectively, these studies provide evidence that people often use a simple yet effective heuristic for inferring causal strength from time series data. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  14. Interpretable Categorization of Heterogeneous Time Series Data

    Science.gov (United States)

    Lee, Ritchie; Kochenderfer, Mykel J.; Mengshoel, Ole J.; Silbermann, Joshua

    2017-01-01

    We analyze data from simulated aircraft encounters to validate and inform the development of a prototype aircraft collision avoidance system. The high-dimensional and heterogeneous time series dataset is analyzed to discover properties of near mid-air collisions (NMACs) and categorize the NMAC encounters. Domain experts use these properties to better organize and understand NMAC occurrences. Existing solutions either are not capable of handling high-dimensional and heterogeneous time series datasets or do not provide explanations that are interpretable by a domain expert. The latter is critical to the acceptance and deployment of safety-critical systems. To address this gap, we propose grammar-based decision trees along with a learning algorithm. Our approach extends decision trees with a grammar framework for classifying heterogeneous time series data. A context-free grammar is used to derive decision expressions that are interpretable, application-specific, and support heterogeneous data types. In addition to classification, we show how grammar-based decision trees can also be used for categorization, which is a combination of clustering and generating interpretable explanations for each cluster. We apply grammar-based decision trees to a simulated aircraft encounter dataset and evaluate the performance of four variants of our learning algorithm. The best algorithm is used to analyze and categorize near mid-air collisions in the aircraft encounter dataset. We describe each discovered category in detail and discuss its relevance to aircraft collision avoidance.

  15. Minimum entropy density method for the time series analysis

    Science.gov (United States)

    Lee, Jeong Won; Park, Joongwoo Brian; Jo, Hang-Hyun; Yang, Jae-Suk; Moon, Hie-Tae

    2009-01-01

    The entropy density is an intuitive and powerful concept to study the complicated nonlinear processes derived from physical systems. We develop the minimum entropy density method (MEDM) to detect the structure scale of a given time series, which is defined as the scale in which the uncertainty is minimized, hence the pattern is revealed most. The MEDM is applied to the financial time series of Standard and Poor’s 500 index from February 1983 to April 2006. Then the temporal behavior of structure scale is obtained and analyzed in relation to the information delivery time and efficient market hypothesis.

  16. Time series analysis of the developed financial markets' integration using visibility graphs

    Science.gov (United States)

    Zhuang, Enyu; Small, Michael; Feng, Gang

    2014-09-01

    A time series representing the developed financial markets' segmentation from 1973 to 2012 is studied. The time series reveals an obvious market integration trend. To further uncover the features of this time series, we divide it into seven windows and generate seven visibility graphs. The measuring capabilities of the visibility graphs provide means to quantitatively analyze the original time series. It is found that the important historical incidents that influenced market integration coincide with variations in the measured graphical node degree. Through the measure of neighborhood span, the frequencies of the historical incidents are disclosed. Moreover, it is also found that large "cycles" and significant noise in the time series are linked to large and small communities in the generated visibility graphs. For large cycles, how historical incidents significantly affected market integration is distinguished by density and compactness of the corresponding communities.
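    The mapping from a time series window to a network used in such studies is the natural visibility graph; a minimal O(n^2) construction is sketched below for a short window.

```python
# Minimal O(n^2) construction of a natural visibility graph: two points are
# connected when every sample between them lies below the connecting line.
import numpy as np

def visibility_edges(y):
    """Edges (i, j) such that y_k for every k between i and j lies below the
    straight line connecting (i, y_i) and (j, y_j)."""
    n = len(y)
    edges = []
    for i in range(n - 1):
        for j in range(i + 1, n):
            visible = all(
                y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                edges.append((i, j))
    return edges

window = np.random.default_rng(10).normal(size=60)
edges = visibility_edges(window)
degree = np.bincount(np.array(edges).ravel(), minlength=len(window))
print("mean node degree:", degree.mean())
```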

  17. A cluster merging method for time series microarray with production values.

    Science.gov (United States)

    Chira, Camelia; Sedano, Javier; Camara, Monica; Prieto, Carlos; Villar, Jose R; Corchado, Emilio

    2014-09-01

    A challenging task in time-course microarray data analysis is to cluster genes meaningfully, combining the information provided by multiple replicates covering the same key time points. This paper proposes a novel cluster merging method to accomplish this goal, obtaining groups with highly correlated genes. The main idea behind the proposed method is to generate a clustering starting from groups created based on individual temporal series (representing different biological replicates measured at the same time points) and merging them by taking into account the frequency with which two genes are assembled together in each clustering. The gene groups at the level of individual time series are generated using several shape-based clustering methods. This study is focused on a real-world time series microarray task with the aim of finding co-expressed genes related to the production and growth of a certain bacterium. The shape-based clustering methods used at the level of individual time series rely on identifying similar gene expression patterns over time which, in some models, are further matched to the pattern of production/growth. The proposed cluster merging method is able to produce meaningful gene groups which can be naturally ranked by the level of agreement on the clustering among individual time series. The list of clusters and genes is further sorted based on the information correlation coefficient and new problem-specific relevant measures. Computational experiments and results of the cluster merging method are analyzed from a biological perspective and further compared with the clustering generated based on the mean value of time series and the same shape-based algorithm.

  18. Constructing networks from a dynamical system perspective for multivariate nonlinear time series.

    Science.gov (United States)

    Nakamura, Tomomichi; Tanizawa, Toshihiro; Small, Michael

    2016-03-01

    We describe a method for constructing networks for multivariate nonlinear time series. We approach the interaction between the various scalar time series from a deterministic dynamical system perspective and provide a generic and algorithmic test for whether the interaction between two measured time series is statistically significant. The method can be applied even when the data exhibit no obvious qualitative similarity: a situation in which the naive method utilizing the cross correlation function directly cannot correctly identify connectivity. To establish the connectivity between nodes we apply the previously proposed small-shuffle surrogate (SSS) method, which can investigate whether there are correlation structures in short-term variabilities (irregular fluctuations) between two data sets from the viewpoint of deterministic dynamical systems. The procedure to construct networks based on this idea is composed of three steps: (i) each time series is considered as a basic node of a network, (ii) the SSS method is applied to verify the connectivity between each pair of time series taken from the whole multivariate time series, and (iii) the pair of nodes is connected with an undirected edge when the null hypothesis cannot be rejected. The network constructed by the proposed method indicates the intrinsic (essential) connectivity of the elements included in the system or the underlying (assumed) system. The method is demonstrated for numerical data sets generated by known systems and applied to several experimental time series.

  19. Time Series Modelling of Syphilis Incidence in China from 2005 to 2012

    Science.gov (United States)

    Zhang, Xingyu; Zhang, Tao; Pei, Jiao; Liu, Yuanyuan; Li, Xiaosong; Medrano-Gracia, Pau

    2016-01-01

    Background The infection rate of syphilis in China has increased dramatically in recent decades, becoming a serious public health concern. Early prediction of syphilis is therefore of great importance for health planning and management. Methods In this paper, we analyzed surveillance time series data for primary, secondary, tertiary, congenital and latent syphilis in mainland China from 2005 to 2012. Seasonality and long-term trend were explored with decomposition methods. Autoregressive integrated moving average (ARIMA) was used to fit a univariate time series model of syphilis incidence. A separate multi-variable time series for each syphilis type was also tested using an autoregressive integrated moving average model with exogenous variables (ARIMAX). Results The syphilis incidence rates have increased three-fold from 2005 to 2012. All syphilis time series showed strong seasonality and increasing long-term trend. Both ARIMA and ARIMAX models fitted and estimated syphilis incidence well. All univariate time series showed highest goodness-of-fit results with the ARIMA(0,0,1)×(0,1,1) model. Conclusion Time series analysis was an effective tool for modelling the historical and future incidence of syphilis in China. The ARIMAX model showed superior performance compared to the ARIMA model for the modelling of syphilis incidence. Time series correlations existed between the models for primary, secondary, tertiary, congenital and latent syphilis. PMID:26901682

  20. Reconstruction of tritium time series in precipitation

    International Nuclear Information System (INIS)

    Celle-Jeanton, H.; Gourcy, L.; Aggarwal, P.K.

    2002-01-01

    Tritium is commonly used in groundwater studies to calculate recharge rates and to identify the presence of modern recharge. Knowledge of the ³H precipitation time series is therefore very important for the study of groundwater recharge. Rozanski and Araguas provided good information on the tritium content of precipitation at 180 stations of the GNIP network up to the end of 1987, but the record shows gaps in measurements, either within a given chronicle or within a whole region (the Southern Hemisphere, for instance). It therefore seems essential to find a method to recalculate data for a region where no measurement is available. To solve this problem, we propose another method, based on triangulation. It requires the ³H time series of three stations geographically surrounding the fourth station for which the tritium input curve has to be reconstructed
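    The abstract specifies only that the reconstruction uses the ³H series of three stations surrounding the target station. As one plausible illustration (not necessarily the authors' exact scheme), the sketch below combines the three stations' series with inverse-distance weights.

```python
# Illustrative reconstruction of a target station's 3H series from three
# surrounding stations using inverse-distance weighting. This is an assumed
# stand-in for the triangulation scheme, not the authors' exact method.
import numpy as np

def reconstruct_idw(station_series, station_xy, target_xy, power=2.0):
    """station_series: array (3, n_months); station_xy: array-like (3, 2)."""
    d = np.linalg.norm(np.asarray(station_xy) - np.asarray(target_xy), axis=1)
    w = 1.0 / d ** power
    w /= w.sum()
    return w @ np.asarray(station_series)

# Hypothetical monthly 3H series (TU) at three GNIP stations and their coordinates
rng = np.random.default_rng(11)
series = rng.gamma(5.0, 3.0, size=(3, 120))
xy = [(0.0, 0.0), (2.0, 1.0), (1.0, 3.0)]
target = (1.0, 1.0)
print(reconstruct_idw(series, xy, target)[:6])
```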

  1. Time Series, Stochastic Processes and Completeness of Quantum Theory

    International Nuclear Information System (INIS)

    Kupczynski, Marian

    2011-01-01

    Most physical experiments are usually described as repeated measurements of some random variables. Experimental data registered by on-line computers form time series of outcomes. The frequencies of different outcomes are compared with the probabilities provided by the algorithms of quantum theory (QT). In spite of the statistical predictions of QT, a claim was made that it provided the most complete description of the data and of the underlying physical phenomena. This claim could be easily rejected if some fine structures, averaged out in the standard descriptive statistical analysis, were found in time series of experimental data. To search for these structures one has to use more subtle statistical tools which were developed to study time series produced by various stochastic processes. In this talk we review some of these tools. As an example we show how the standard descriptive statistical analysis of the data is unable to reveal a fine structure in a simulated sample of an AR(2) stochastic process. We emphasize once again that the violation of Bell inequalities gives no information on the completeness or the non-locality of QT. The appropriate way to test the completeness of quantum theory is to search for fine structures in time series of the experimental data by means of the purity tests or by studying the autocorrelation and partial autocorrelation functions.

  2. Efficient use of correlation entropy for analysing time series data

    Indian Academy of Sciences (India)

    Abstract. The correlation dimension D2 and correlation entropy K2 are both important quantifiers in nonlinear time series analysis. However, use of D2 has been more common compared to K2 as a discriminating measure. One reason for this is that D2 is a static measure and can be easily evaluated from a time series.

  3. Financial time series analysis based on information categorization method

    Science.gov (United States)

    Tian, Qiang; Shang, Pengjian; Feng, Guochen

    2014-12-01

    The paper applies the information categorization method to analyze financial time series. The method examines the similarity of different sequences by calculating the distances between them. We apply it to quantify the similarity of different stock markets, and we report results on the similarity of the US and Chinese stock markets in the periods 1991-1998 (before the Asian currency crisis), 1999-2006 (after the Asian currency crisis and before the global financial crisis), and 2007-2013 (during and after the global financial crisis). The results show how the similarity between different stock markets differs across time periods, and that the similarity of the two stock markets became larger after these two crises. We also obtain similarity results for 10 stock indices in three areas, showing that the method can distinguish the markets of different areas from the phylogenetic trees. The results show that satisfactory information about financial markets can be obtained with this method. The information categorization method can be used not only for physiological time series but also for financial time series.

  4. Time Series Analysis of the Effectiveness and Safety of Capsule Endoscopy between the Premarketing and Postmarketing Settings: A Meta-Analysis.

    Directory of Open Access Journals (Sweden)

    Kazuo Iijima

    Full Text Available Clinical studies for assessing the effectiveness and safety in a premarketing setting are conducted under time and cost constraints. In recent years, postmarketing data analysis has been given more attention. However, to our knowledge, no studies have compared the effectiveness and the safety between the pre- and postmarketing settings. In this study, we aimed to investigate the importance of the postmarketing data analysis using clinical data. Studies on capsule endoscopy with rich clinical data in both pre- and postmarketing settings were selected for the analysis. For effectiveness, clinical studies published before October 10, 2015 comparing capsule endoscopy and conventional flexible endoscopy measuring the detection ratio of obscure gastrointestinal bleeding were selected (premarketing: 4 studies and postmarketing: 8 studies) from PubMed (MEDLINE), Cochrane Library, EMBASE and Web of Science. Among the 12 studies, 5 were blinded and 7 were non-blinded. A time series meta-analysis was conducted. Effectiveness (odds ratio) decreased in the postmarketing setting (premarketing: 5.19 [95% confidence interval: 3.07-8.76] vs. postmarketing: 1.48 [0.81-2.69]). The change in odds ratio was caused by the increase in the detection ratio with flexible endoscopy as the control group. The efficacy of capsule endoscopy did not change between pre- and postmarketing settings. Heterogeneity (I²) increased in the postmarketing setting because of one study. For safety, in terms of endoscope retention in the body, data from the approval summary and adverse event reports were analyzed. The incidence of retention decreased in the postmarketing setting (premarketing: 0.75% vs. postmarketing: 0.095%). The introduction of the new patency capsule for checking the patency of the digestive tract might have contributed to the decrease. Effectiveness and safety could change in the postmarketing setting. Therefore, time series meta-analyses could be useful to continuously monitor the

  5. Classification of biosensor time series using dynamic time warping: applications in screening cancer cells with characteristic biomarkers.

    Science.gov (United States)

    Rai, Shesh N; Trainor, Patrick J; Khosravi, Farhad; Kloecker, Goetz; Panchapakesan, Balaji

    2016-01-01

    The development of biosensors that produce time series data will facilitate improvements in biomedical diagnostics and in personalized medicine. The time series produced by these devices often contain characteristic features arising from biochemical interactions between the sample and the sensor. To use such characteristic features for determining sample class, similarity-based classifiers can be utilized. However, the construction of such classifiers is complicated by the variability in the time domains of such series, which renders traditional distance metrics such as Euclidean distance ineffective in distinguishing between biological variance and time domain variance. The dynamic time warping (DTW) algorithm is a sequence alignment algorithm that can be used to align two or more series to facilitate quantifying similarity. In this article, we evaluated the performance of DTW distance-based similarity classifiers for classifying time series that mimic electrical signals produced by nanotube biosensors. Simulation studies demonstrated the positive performance of such classifiers in discriminating between time series containing characteristic features that are obscured by noise in the intensity and time domains. We then applied a DTW distance-based k-nearest neighbors classifier to distinguish the presence/absence of mesenchymal biomarker in cancer cells in buffy coats in a blinded test. Using a train-test approach, we find that the classifier had high sensitivity (90.9%) and specificity (81.8%) in differentiating between EpCAM-positive MCF7 cells spiked in buffy coats and those in plain buffy coats.
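
    A minimal sketch of the core machinery described above, a dynamic-programming DTW distance feeding a k-nearest-neighbour vote, is shown below. It is a generic illustration rather than the authors' pipeline; the function names, the absence of a warping-window constraint, and the simple majority vote are choices made here for brevity.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-programming DTW distance between two 1-D series."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def dtw_knn_classify(query, train_series, train_labels, k=1):
    """Label a query series by majority vote among its k DTW-nearest neighbours."""
    dists = [dtw_distance(query, s) for s in train_series]
    nearest = np.argsort(dists)[:k]
    votes = [train_labels[i] for i in nearest]
    return max(set(votes), key=votes.count)
```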

  6. A novel water quality data analysis framework based on time-series data mining.

    Science.gov (United States)

    Deng, Weihui; Wang, Guoyin

    2017-07-01

    The rapid development of time-series data mining provides an emerging method for water resource management research. In this paper, based on the time-series data mining methodology, we propose a novel and general analysis framework for water quality time-series data. It consists of two parts: implementation components and common tasks of time-series data mining in water quality data. In the first part, we propose to granulate the time series into several two-dimensional normal clouds and calculate the similarities at the granulated level. On the basis of the similarity matrix, the similarity search, anomaly detection, and pattern discovery tasks in the water quality time-series instance dataset can be easily implemented in the second part. We present a case study of this analysis framework on weekly Dissolved Oxygen (DO) time-series data collected from five monitoring stations on the upper reaches of the Yangtze River, China. The case study revealed the relationship between water quality in the mainstream and in the tributary as well as the main changing patterns of DO. The experimental results show that the proposed analysis framework is a feasible and efficient method to mine the hidden and valuable knowledge from historical water quality time-series data. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Development and application of a modified dynamic time warping algorithm (DTW-S) to analyses of primate brain expression time series

    Directory of Open Access Journals (Sweden)

    Vingron Martin

    2011-08-01

    Background: Comparing biological time series data across different conditions, or different specimens, is a common but still challenging task. Algorithms aligning two time series represent a valuable tool for such comparisons. While many powerful computation tools for time series alignment have been developed, they do not provide significance estimates for time shift measurements. Results: Here, we present an extended version of the original DTW algorithm that allows us to determine the significance of time shift estimates in time series alignments, the DTW-Significance (DTW-S) algorithm. The DTW-S combines important properties of the original algorithm and other published time series alignment tools: DTW-S calculates the optimal alignment for each time point of each gene, it uses interpolated time points for time shift estimation, and it does not require alignment of the time-series end points. As a new feature, we implement a simulation procedure based on parameters estimated from real time series data, on a series-by-series basis, allowing us to determine the false positive rate (FPR) and the significance of the estimated time shift values. We assess the performance of our method using simulation data and real expression time series from two published primate brain expression datasets. Our results show that this method can provide accurate and robust time shift estimates for each time point on a gene-by-gene basis. Using these estimates, we are able to uncover novel features of the biological processes underlying human brain development and maturation. Conclusions: The DTW-S provides a convenient tool for calculating accurate and robust time shift estimates at each time point for each gene, based on time series data. The estimates can be used to uncover novel biological features of the system being studied. The DTW-S is freely available as an R package TimeShift at http://www.picb.ac.cn/Comparative/data.html.

  8. PhilDB: the time series database with built-in change logging

    Directory of Open Access Journals (Sweden)

    Andrew MacDonald

    2016-03-01

    PhilDB is an open-source time series database that supports storage of time series datasets that are dynamic; that is, it records updates to existing values in a log as they occur. PhilDB eases loading of data for the user by utilising an intelligent data write method. It preserves existing values during updates and abstracts the update complexity required to achieve logging of data value changes. It implements fast reads to make it practical to select data for analysis. Recent open-source systems have been developed to indefinitely store long-period high-resolution time series data without change logging. Unfortunately, such systems generally require a large initial installation investment before use because they are designed to operate over a cluster of servers to achieve high-performance writing of static data in real time. In essence, they have a ‘big data’ approach to storage and access. Other open-source projects for handling time series data that avoid the ‘big data’ approach are also relatively new and are complex or incomplete. None of these systems gracefully handle revision of existing data while tracking values that change. Unlike ‘big data’ solutions, PhilDB has been designed for single machine deployment on commodity hardware, reducing the barrier to deployment. PhilDB takes a unique approach to meta-data tracking: optional attribute attachment. This facilitates scaling the complexities of storing a wide variety of data. That is, it allows time series data to be loaded as time series instances with minimal initial meta-data, yet additional attributes can be created and attached to differentiate the time series instances when a wider variety of data is needed. PhilDB was written in Python, leveraging existing libraries. While some existing systems come close to meeting the needs PhilDB addresses, none cover all the needs at once. PhilDB was written to fill this gap in existing solutions. This paper explores existing time
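
    The write-with-change-logging idea, keeping the latest value per timestamp while never silently discarding a superseded value, can be illustrated with a toy in-memory store. This is a conceptual sketch only; PhilDB's actual API, storage format, and attribute handling differ.

```python
from datetime import datetime, timezone

class LoggedSeries:
    """Toy time series store that keeps the latest value per timestamp and
    records every overwrite in a change log (conceptual sketch, not PhilDB)."""

    def __init__(self):
        self.values = {}   # timestamp -> current value
        self.log = []      # (timestamp, old_value, new_value, written_at)

    def write(self, timestamp, value):
        old = self.values.get(timestamp)
        if old is not None and old != value:
            # preserve the superseded value instead of silently overwriting it
            self.log.append((timestamp, old, value, datetime.now(timezone.utc)))
        self.values[timestamp] = value

    def read(self):
        return sorted(self.values.items())

ts = LoggedSeries()
ts.write("2016-01-01", 1.2)
ts.write("2016-01-01", 1.5)   # revision: logged, not lost
print(ts.read(), ts.log)
```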

  9. Human influence on distribution and extinctions of the late Pleistocene Eurasian megafauna.

    Science.gov (United States)

    Pushkina, Diana; Raia, Pasquale

    2008-06-01

    Late Pleistocene extinctions are of interest to paleontological and anthropological research. In North America and Australia, human occupation occurred during a short period of time and overexploitation may have led to the extinction of mammalian megafauna. In northern Eurasia megafaunal extinctions are believed to have occurred over a relatively longer period of time, perhaps as a result of changing environmental conditions, but the picture is much less clear. To consider megafaunal extinction in Eurasia, we compare differences in the geographical distribution and commonness of extinct and extant species between paleontological and archaeological localities from the late middle Pleistocene to Holocene. Purely paleontological localities, as well as most extinct species, were distributed north of archaeological sites and of the extant species, suggesting that apart from possible differences in adaptations between humans and other species, humans could also have had a detrimental effect on large mammal distribution. However, evidence for human overexploitation applies only to the extinct steppe bison Bison priscus. Other human-preferred species survive into the Holocene, including Rangifer tarandus, Equus ferus, Capreolus capreolus, Cervus elaphus, Equus hemionus, Saiga tatarica, and Sus scrofa. Mammuthus primigenius and Megaloceros giganteus were rare in archaeological sites. Carnivores appear little influenced by human presence, although they become rarer in Holocene archaeological sites. Overall, the data are consistent with the conclusion that humans acted as efficient hunters selecting for the most abundant species. Our study supports the idea that the late Pleistocene extinctions were environmentally driven by climatic changes that triggered habitat fragmentation, species range reduction, and population decrease, after which human interference either by direct hunting or via indirect activities probably became critical.

  10. Seasonal and annual precipitation time series trend analysis in North Carolina, United States

    Science.gov (United States)

    Sayemuzzaman, Mohammad; Jha, Manoj K.

    2014-02-01

    The present study performs spatial and temporal trend analysis of the annual and seasonal precipitation time series from a set of 249 uniformly distributed stations across the state of North Carolina, United States, over the period 1950-2009. The Mann-Kendall (MK) test, the Theil-Sen approach (TSA) and the Sequential Mann-Kendall (SQMK) test were applied to quantify the significance of trend, the magnitude of trend, and the trend shift, respectively. Regional (mountain, piedmont and coastal) precipitation trends were also analyzed using the above-mentioned tests. Prior to the application of the statistical tests, a pre-whitening technique was used to eliminate the effect of autocorrelation in the precipitation data series. The analysis shows a very notable statewide increasing trend for winter precipitation and a decreasing trend for fall precipitation. Statewide mixed (increasing/decreasing) trends were detected in the annual, spring, and summer precipitation time series. Significant trends (confidence level ≥ 95%) were detected at only 8, 7, 4 and 10 of the 249 stations in winter, spring, summer, and fall, respectively. The magnitude of the largest increasing (decreasing) precipitation trend was about 4 mm/season (-4.50 mm/season) in the fall (summer) season. Annual precipitation trend magnitudes varied between -5.50 mm/year and 9 mm/year. Regional trend analysis found increasing precipitation in the mountain and coastal regions in general, except during the winter. The piedmont region was found to have increasing trends in summer and fall, but decreasing trends in winter, spring and on an annual basis. The SQMK "trend shift analysis" identified a significant shift during 1960-70 in most parts of the state. Finally, the comparison of winter (summer) precipitation with the North Atlantic Oscillation (Southern Oscillation) indices concluded that the variability and trend of precipitation can be explained by the
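
    For orientation, the two core station-level statistics named above, a Mann-Kendall-type significance test and the Theil-Sen slope, can be computed with SciPy as sketched below. Kendall's tau against the time index stands in for the MK test, the pre-whitening step and the SQMK shift analysis are omitted, and the station values are made-up numbers.

```python
import numpy as np
from scipy import stats

def trend_summary(values):
    """Mann-Kendall-style trend significance plus Theil-Sen slope for one station."""
    t = np.arange(len(values))
    tau, p_value = stats.kendalltau(t, values)                # monotonic-trend significance
    slope, intercept, lo_slope, hi_slope = stats.theilslopes(values, t)  # robust magnitude
    return {"tau": tau, "p_value": p_value, "sen_slope": slope}

# hypothetical seasonal precipitation totals (mm) for one station
print(trend_summary([210, 198, 225, 240, 231, 255, 260, 249, 270, 281]))
```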

  11. Model-based Clustering of Categorical Time Series with Multinomial Logit Classification

    Science.gov (United States)

    Frühwirth-Schnatter, Sylvia; Pamminger, Christoph; Winter-Ebmer, Rudolf; Weber, Andrea

    2010-09-01

    A common problem in many areas of applied statistics is to identify groups of similar time series in a panel of time series. However, distance-based clustering methods cannot easily be extended to time series data, where an appropriate distance measure is rather difficult to define, particularly for discrete-valued time series. Markov chain clustering, proposed by Pamminger and Frühwirth-Schnatter [6], is an approach for clustering discrete-valued time series obtained by observing a categorical variable with several states. This model-based clustering method is based on finite mixtures of first-order time-homogeneous Markov chain models. In order to further explain group membership we present an extension to the approach of Pamminger and Frühwirth-Schnatter [6] by formulating a probabilistic model for the latent group indicators within the Bayesian classification rule, using a multinomial logit model. The parameters are estimated for a fixed number of clusters within a Bayesian framework using a Markov chain Monte Carlo (MCMC) sampling scheme representing a (full) Gibbs-type sampler which involves only draws from standard distributions. Finally, an application to a panel of Austrian wage mobility data is presented which leads to an interesting segmentation of the Austrian labour market.
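
    The building block of such a mixture, a first-order transition matrix estimated from one categorical series, is easy to sketch. The snippet below shows only that maximum-likelihood step; the mixture model, the multinomial logit prior on group membership, and the Gibbs sampler described above are beyond this illustration, and the example trajectory is invented.

```python
import numpy as np

def transition_matrix(series, n_states):
    """Maximum-likelihood estimate of a first-order Markov transition matrix
    from a single categorical time series with states coded 0..n_states-1."""
    counts = np.zeros((n_states, n_states))
    for current, nxt in zip(series[:-1], series[1:]):
        counts[current, nxt] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0   # avoid division by zero for unseen states
    return counts / row_sums

# e.g. a wage-category trajectory for one individual (states 0-2)
print(transition_matrix([0, 0, 1, 1, 2, 1, 0, 0, 1], n_states=3))
```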

  12. Assessment of geomorphological and hydrological changes produced by Pleistocene glaciations in a Patagonian basin

    Science.gov (United States)

    Scordo, Facundo; Seitz, Carina; Melo, Walter D.; Piccolo, M. Cintia; Perillo, Gerardo M. E.

    2018-04-01

    This work aims to assess how Pleistocene glaciations shaped the landscape of the upper Senguer River basin and how that history relates to current watershed features (drainage surface and fluvial hydrological regime). During the Pleistocene, six glacial lobes developed in the upper basin of the Senguer River, located east of the Andean range in southern Argentinean Patagonia between 43°36' and 46°27' S. To describe the topography and hydrology, map the geomorphology, and propose an evolution of the study area during the Pleistocene, we employed multitemporal Landsat images, national geological sheets and a mosaic of the digital elevation model (Shuttle Radar Topography Mission), along with fieldwork. The main conclusion is that until the Middle Pleistocene, the drainage divide of the Senguer River basin was located to the west of its current limits and its rivers drained the meltwater of the glaciers during interglacial periods. However, processes of drainage inversion and drainage surface reduction occurred in the headwaters of most rivers of the basin during the Late Pleistocene. Those processes were favored by a relatively shorter glacial extent during the LGM and the dam effect produced by the moraines of the Post GPG I and III glaciations. Thus, since the Late Pleistocene, the headwaters of several rivers in the basin have been reduced, and the moraines corresponding to the Middle Pleistocene glaciations currently divide the watersheds that drain towards the Senguer River from those that flow west towards the Pacific Ocean.

  13. Time Series Discord Detection in Medical Data using a Parallel Relational Database

    Energy Technology Data Exchange (ETDEWEB)

    Woodbridge, Diane; Rintoul, Mark Daniel; Wilson, Andrew T.; Goldstein, Richard

    2015-10-01

    Recent advances in sensor technology have made continuous real-time health monitoring available in both hospital and non-hospital settings. Because high-frequency medical sensors produce huge volumes of data, storing and processing continuous medical data is an emerging big-data problem. Detecting anomalies in real time is especially important for detecting and preventing patient emergencies. A time series discord is a subsequence that has the maximum difference to the rest of the time series subsequences, meaning that it has abnormal or unusual data trends. In this study, we implemented two versions of a time series discord detection algorithm on a high-performance parallel database management system (DBMS) and applied them to 240 Hz waveform data collected from 9,723 patients. The initial brute-force version of the discord detection algorithm takes each possible subsequence and calculates the distance to the nearest non-self match to find the biggest discords in a time series. For the heuristic version of the algorithm, a combination of an array and a trie structure was applied to order the time series data for better time efficiency. The study results showed efficient data loading, decoding and discord searches in a large amount of data, benefiting from the time series discord detection algorithm and the architectural characteristics of the parallel DBMS, including data compression, data pipelining, and task scheduling.
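
    A bare-bones version of the brute-force search described above, in which every subsequence is compared against its nearest non-self match under Euclidean distance, is sketched below. It is an O(n^2) illustration of the definition, not the parallel-DBMS or trie-accelerated implementations from the study; the window length and the synthetic series are placeholders.

```python
import numpy as np

def brute_force_discord(series, window):
    """Return the start index and distance of the top discord: the subsequence
    whose distance to its nearest non-overlapping (non-self) match is largest."""
    subs = np.array([series[i:i + window] for i in range(len(series) - window + 1)])
    best_idx, best_dist = -1, -np.inf
    for i, s in enumerate(subs):
        nearest = np.inf
        for j, t in enumerate(subs):
            if abs(i - j) >= window:                 # exclude trivial self-matches
                nearest = min(nearest, np.linalg.norm(s - t))
        if nearest > best_dist:
            best_idx, best_dist = i, nearest
    return best_idx, best_dist

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 6 * np.pi, 300)) + 0.1 * rng.normal(size=300)
series[150:170] += 1.5                               # inject an anomalous bump
print(brute_force_discord(series, window=20))        # discord found near index 150
```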

  14. Estimation of system parameters in discrete dynamical systems from time series

    International Nuclear Information System (INIS)

    Palaniyandi, P.; Lakshmanan, M.

    2005-01-01

    We propose a simple method to estimate the parameters involved in discrete dynamical systems from time series. The method is based on the concept of controlling chaos by constant feedback. The major advantages of the method are that it needs a minimal amount of time series data (either vector or scalar) and is applicable to dynamical systems of any dimension. The method also works extremely well even in the presence of noise in the time series. The method is specifically illustrated by means of the logistic and Hénon maps.
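
    As a point of reference for the parameter-estimation task, the sketch below recovers the control parameter of a logistic map from a scalar time series by ordinary least squares. This is a generic regression illustration, not the constant-feedback control scheme proposed in the paper, and the synthetic orbit is generated here purely for the check.

```python
import numpy as np

def estimate_logistic_r(x):
    """Least-squares estimate of r in x_{n+1} = r * x_n * (1 - x_n) from a scalar series.

    Generic regression illustration, not the paper's constant-feedback scheme.
    """
    x = np.asarray(x, dtype=float)
    g = x[:-1] * (1.0 - x[:-1])        # regressor
    y = x[1:]                          # response
    return float(np.dot(g, y) / np.dot(g, g))

# synthetic check: generate a short chaotic orbit with r = 3.9 and recover it
x = [0.2]
for _ in range(200):
    x.append(3.9 * x[-1] * (1 - x[-1]))
print(estimate_logistic_r(x))          # ~3.9
```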

  15. Evaluation of nonlinearity and validity of nonlinear modeling for complex time series.

    Science.gov (United States)

    Suzuki, Tomoya; Ikeguchi, Tohru; Suzuki, Masuo

    2007-10-01

    Even if an original time series exhibits nonlinearity, it is not always effective to approximate the time series by a nonlinear model, because such nonlinear models have high complexity from the viewpoint of information criteria. Therefore, we propose two measures, based on nonlinear predictability and information criteria, to evaluate both the nonlinearity of a time series and the validity of nonlinear modeling applied to it. Through numerical simulations, we confirm that the proposed measures effectively detect the nonlinearity of an observed time series and evaluate the validity of the nonlinear model. The measures are also robust against observational noise. We also analyze some real time series: the difference in the numbers of chickenpox and measles patients, the number of sunspots, five Japanese vowels, and the chaotic laser. We confirm that the nonlinear model is effective for the Japanese vowel /a/, the difference in the number of measles patients, and the chaotic laser.

  16. Late Middle Pleistocene hominin teeth from Panxian Dadong, South China.

    Science.gov (United States)

    Liu, Wu; Schepartz, Lynne A; Xing, Song; Miller-Antonio, Sari; Wu, Xiujie; Trinkaus, Erik; Martinón-Torres, María

    2013-05-01

    The hominin teeth and evidence of hominin activities recovered from 1991 to 2005 at the Panxian Dadong site in South China are dated to the late Middle Pleistocene (MIS 8-6 or ca. 130-300 ka), a period for which very little is known about the morphology of Asian populations. The present study provides the first detailed morphometric description and comparisons of four hominin teeth (I(1), C1, P(3) and P3) from this site. Our study shows that the Panxian Dadong teeth combine archaic and derived features that align them with Middle and Upper Pleistocene fossils from East and West Asia and Europe. These teeth do not display any typical Neanderthal features and they are generally more derived than other contemporaneous populations from Asia and Africa. However, the derived traits are not diagnostic enough to specifically link the Panxian Dadong teeth to Homo sapiens, a common problem when analyzing the Middle Pleistocene dental record from Africa and Asia. These findings are contextualized in the discussion of the evolutionary course of Asian Middle Pleistocene hominins, and they highlight the necessity of incorporating the Asian fossil record in the still open debate about the origin of H. sapiens. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. A Framework and Algorithms for Multivariate Time Series Analytics (MTSA): Learning, Monitoring, and Recommendation

    Science.gov (United States)

    Ngan, Chun-Kit

    2013-01-01

    Making decisions over multivariate time series is an important topic which has gained significant interest in the past decade. A time series is a sequence of data points which are measured and ordered over uniform time intervals. A multivariate time series is a set of multiple, related time series in a particular domain in which domain experts…

  18. NEW FOSSIL VERTEBRATE REMAINS FROM SAN GIOVANNI DI SINIS (LATE PLEISTOCENE, SARDINIA): THE LAST MAUREMYS (REPTILIA, TESTUDINES) IN THE CENTRAL MEDITERRANEAN

    Directory of Open Access Journals (Sweden)

    FRANCESCO CHESI

    2007-07-01

    New fossil vertebrates from the most representative Upper Pleistocene section (Tyrrhenian, MIS 5e) of the outcrop of San Giovanni di Sinis (Oristano, Sardinia) are reported and described here. The fossils, although scarce and fragmentary, document the occurrence of a terrapin (Mauremys sp.) and the endemic Sardinian deer (Praemegaceros cazioti). The occurrence of the terrapin is significant because it is the youngest representative of the genus in the central Mediterranean area, where it is extinct at present. The Late Pleistocene extinction of Mauremys in Italy follows the same pattern as that of other Mediterranean reptiles, in being in some cases delayed on the islands. A comparison of the modern range of Mauremys and that of the pond turtle, Emys, as well as of their past ranges as evidenced by the fossil record, might suggest that some sort of thermophily (at least during pre-hatching stages) characterized the former taxon and is responsible for its past and present distribution.

  19. How Far into Europe Did Pikas (Lagomorpha: Ochotonidae) Go during the Pleistocene? New Evidence from Central Iberia

    Science.gov (United States)

    Laplana, César; Sevilla, Paloma; Arsuaga, Juan Luis; Arriaza, Mari Carmen; Baquedano, Enrique; Pérez-González, Alfredo

    2015-01-01

    This paper reports the first find of pika remains in the Iberian Peninsula, at a site in central Spain. A fragmented mandible of Ochotona cf. pusilla was unearthed from Layer 3 (deposited some 63.4±5.5 ka ago as determined by thermoluminescence) of the Buena Pinta Cave. This record establishes new limits for the geographic distribution of the genus during the Pleistocene, shifting the previous edge of its known range southwest by some 500 km. It also supports the idea that, even though Europe’s alpine mountain ranges represented a barrier that prevented the southward dispersal of this and other small mammal taxa from central and eastern Europe, the barrier was crossed or circumvented during the coldest time intervals at the end of the Middle Pleistocene and in the Late Pleistocene. During those periods, both the reduction of the forest cover and the emergence of large areas of the continental shelf due to the drop in sea level probably provided these species a way to surpass this barrier. The pika mandible was found accompanying the remains of other small mammals adapted to cold climates, indicating the presence of steppe environments in central Iberia during the Late Pleistocene. PMID:26535576

  20. The discovery and character of Pleistocene calcrete uranium deposits in the Southern High Plains of west Texas, United States

    Science.gov (United States)

    Van Gosen, Bradley S.; Hall, Susan M.

    2017-12-18

    This report describes the discovery and geology of two near-surface uranium deposits within calcareous lacustrine strata of Pleistocene age in west Texas, United States. Calcrete uranium deposits have not been previously reported in the United States. The west Texas uranium deposits share characteristics with some calcrete uranium deposits in Western Australia: uranium-vanadium minerals hosted by nonpedogenic calcretes deposited in saline lacustrine environments. In the mid-1970s, Kerr-McGee Corporation conducted a regional uranium exploration program in the Southern High Plains province of the United States, which led to the discovery of two shallow uranium deposits (that were not publicly reported). With extensive drilling, Kerr-McGee delineated one deposit of about 2.1 million metric tons of ore with an average grade of 0.037 percent U3O8 and another deposit of about 0.93 million metric tons of ore averaging 0.047 percent U3O8. The west Texas calcrete uranium-vanadium deposits occur in calcareous, fine-grained sediments interpreted to have been deposited in saline lakes formed during dry interglacial periods of the Pleistocene. The lakes were associated with drainages upstream of a large Pleistocene lake. Age determinations of tephra in strata adjacent to one deposit indicate the host strata are middle Pleistocene in age. Examination of the uranium-vanadium mineralization by scanning electron microscopy indicated at least two generations of uranium-vanadium deposition in the lacustrine strata, identified as carnotite and a strontium-uranium-vanadium mineral. Preliminary uranium-series results indicate a two-component system in the host calcrete, with early lacustrine carbonate that was deposited (or recrystallized) about 190 kilo-annum, followed much later by carnotite-rich crusts and strontium-uranium-vanadium mineralization in the Holocene (about 5 kilo-annum). Differences in initial 234U/238U activity ratios indicate two separate, distinct fluid sources.

  1. Modeling vector nonlinear time series using POLYMARS

    NARCIS (Netherlands)

    de Gooijer, J.G.; Ray, B.K.

    2003-01-01

    A modified multivariate adaptive regression splines method for modeling vector nonlinear time series is investigated. The method results in models that can capture certain types of vector self-exciting threshold autoregressive behavior, as well as provide good predictions for more general vector

  2. Forecasting with periodic autoregressive time series models

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); R. Paap (Richard)

    1999-01-01

    This paper is concerned with forecasting univariate seasonal time series data using periodic autoregressive models. We show how one should account for unit roots and deterministic terms when generating out-of-sample forecasts. We illustrate the models for various quarterly UK consumption

  3. Vector bilinear autoregressive time series model and its superiority

    African Journals Online (AJOL)

    KEYWORDS: Linear time series, Autoregressive process, Autocorrelation function, Partial autocorrelation function, Vector time series.

  4. Correlation measure to detect time series distances, whence economy globalization

    Science.gov (United States)

    Miśkiewicz, Janusz; Ausloos, Marcel

    2008-11-01

    An instantaneous time series distance is defined through the equal-time correlation coefficient. The idea is applied to the Gross Domestic Product (GDP) yearly increments of 21 rich countries between 1950 and 2005 in order to test the process of economic globalisation. Some data discussion is first presented to decide what (EKS, GK, or derived) GDP series should be studied. Distances are then calculated from the correlation coefficient values between pairs of series. The role of time averaging of the distances over finite-size windows is discussed. Three network structures are next constructed based on the hierarchy of distances. It is shown that the mean distance between the most developed countries on several networks actually decreases in time, which we consider a proof of globalization. An empirical law is found for the evolution after 1990, similar to that found in flux creep. The optimal observation time window size is found to be ≃15 years.
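
    One common way to turn an equal-time correlation coefficient into a distance is d = sqrt(2(1 - c)), which is zero for perfectly correlated increments and 2 for perfectly anti-correlated ones; the paper's exact definition and windowing may differ. The sketch below applies this convention to two invented GDP-increment series.

```python
import numpy as np

def correlation_distance(x, y):
    """Distance between two series derived from their equal-time correlation.

    d = sqrt(2 * (1 - corr)) is a common convention; the paper's exact
    definition of the instantaneous distance may differ.
    """
    corr = np.corrcoef(x, y)[0, 1]
    return np.sqrt(2.0 * (1.0 - corr))

# yearly GDP increments (toy numbers) for two hypothetical countries
a = np.diff([100, 103, 107, 110, 115, 121])
b = np.diff([80, 82, 85, 88, 92, 97])
print(correlation_distance(a, b))
```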

  5. Time Series Analysis Using Geometric Template Matching.

    Science.gov (United States)

    Frank, Jordan; Mannor, Shie; Pineau, Joelle; Precup, Doina

    2013-03-01

    We present a novel framework for analyzing univariate time series data. At the heart of the approach is a versatile algorithm for measuring the similarity of two segments of time series called geometric template matching (GeTeM). First, we use GeTeM to compute a similarity measure for clustering and nearest-neighbor classification. Next, we present a semi-supervised learning algorithm that uses the similarity measure with hierarchical clustering in order to improve classification performance when unlabeled training data are available. Finally, we present a boosting framework called TDEBOOST, which uses an ensemble of GeTeM classifiers. TDEBOOST augments the traditional boosting approach with an additional step in which the features used as inputs to the classifier are adapted at each step to improve the training error. We empirically evaluate the proposed approaches on several datasets, such as accelerometer data collected from wearable sensors and ECG data.

  6. On-line analysis of reactor noise using time-series analysis

    International Nuclear Information System (INIS)

    McGevna, V.G.

    1981-10-01

    A method to allow the use of time series analysis for on-line noise analysis has been developed. On-line analysis of noise in nuclear power reactors has been limited primarily to spectral analysis and related frequency-domain techniques. Time series analysis has many distinct advantages over spectral analysis in the automated processing of reactor noise. However, fitting an autoregressive-moving average (ARMA) model to time series data involves non-linear least squares estimation. Unless a high-speed, general-purpose computer is available, the calculations become too time consuming for on-line applications. To eliminate this problem, a special-purpose algorithm was developed for fitting ARMA models. While it is based on a combination of steepest descent and Taylor series linearization, properties of the ARMA model are used so that the auto- and cross-correlation functions can be used to eliminate the need for estimating derivatives.
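
    For comparison with the special-purpose on-line algorithm described above, a standard off-line ARMA fit on synthetic noise-like data can be done with statsmodels, as sketched below. The AR coefficients, series length, and model order are arbitrary choices for the illustration, and the general nonlinear estimation performed by ARIMA.fit() is exactly the step the paper's algorithm is designed to avoid.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
# synthetic "reactor noise": an AR(2) process driven by white noise
y = np.zeros(2000)
for t in range(2, len(y)):
    y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + rng.normal()

# off-line ARMA(2,1) fit via general (nonlinear) maximum likelihood
result = ARIMA(y, order=(2, 0, 1)).fit()
print(result.params)
```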

  7. Improving GNSS time series for volcano monitoring: application to Canary Islands (Spain)

    Science.gov (United States)

    García-Cañada, Laura; Sevilla, Miguel J.; Pereda de Pablo, Jorge; Domínguez Cerdeña, Itahiza

    2017-04-01

    The number of permanent GNSS stations has increased significantly in recent years for different geodetic applications such as volcano monitoring, which require high precision. Coordinate time series have recently become long enough that different analyses and filters can be applied to improve the GNSS coordinate results. Following this idea, we have processed data from the GNSS permanent stations used by the Spanish Instituto Geográfico Nacional (IGN) for volcano monitoring in the Canary Islands, obtaining time series by the double-difference processing method with Bernese v5.0 for the period 2007-2014. We have identified the characteristics of these time series and obtained models to estimate velocities with greater accuracy and more realistic uncertainties. In order to improve the results we have used two kinds of filters. The first, a spatial filter, was computed using the series of residuals of all stations in the Canary Islands without anomalous behaviour, after removing a linear trend. This filter can be applied to all sets of coordinates of the permanent stations, reducing their dispersion. The second filter takes account of the temporal correlation in the coordinate time series of each station individually. A study of how the estimated velocity evolves with series length was carried out and demonstrated the need for time series of at least four years. Therefore, in those stations with more than four years of data, we calculated the velocity and the characteristic parameters in order to obtain time series of residuals. This methodology has been applied to the GNSS data network in El Hierro (Canary Islands) during the 2011-2012 eruption and the subsequent magmatic intrusions (2012-2014). The results show that anomalous behaviour in the coordinates is easier to detect in the new series, making them more useful for detecting crustal deformation in volcano monitoring.
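
    The spatial (common-mode) filtering step described above amounts to stacking the detrended residuals of the well-behaved stations epoch by epoch and subtracting that common signal from each series. The sketch below shows that operation on invented numbers; the station selection, the temporal-correlation filter, and the Bernese processing itself are outside its scope.

```python
import numpy as np

def spatial_filter(residuals):
    """Remove the common-mode signal from GNSS coordinate residuals.

    `residuals` is an (epochs x stations) array of detrended residuals for one
    coordinate component; the epoch-wise mean over well-behaved stations is the
    common mode, and subtracting it reduces the dispersion of every series.
    """
    residuals = np.asarray(residuals, dtype=float)
    common_mode = np.nanmean(residuals, axis=1, keepdims=True)
    return residuals - common_mode

# toy example: 5 epochs x 3 stations of detrended residuals (mm)
res = np.array([[2.0, 1.5, 2.5],
                [-1.0, -0.5, -1.5],
                [0.5, 1.0, 0.0],
                [3.0, 2.5, 3.5],
                [-2.0, -2.5, -1.5]])
print(spatial_filter(res).std(axis=0), res.std(axis=0))   # dispersion shrinks
```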

  8. Complexity analysis of the turbulent environmental fluid flow time series

    Science.gov (United States)

    Mihailović, D. T.; Nikolić-Đorić, E.; Drešković, N.; Mimić, G.

    2014-02-01

    We have used the Kolmogorov complexities, sample and permutation entropies to quantify the randomness degree in river flow time series of two mountain rivers in Bosnia and Herzegovina, representing the turbulent environmental fluid, for the period 1926-1990. In particular, we have examined the monthly river flow time series from two rivers (the Miljacka and the Bosnia) in the mountain part of their flow and then calculated the Kolmogorov complexity (KL) based on the Lempel-Ziv Algorithm (LZA) (lower-KLL and upper-KLU), sample entropy (SE) and permutation entropy (PE) values for each time series. The results indicate that the KLL, KLU, SE and PE values in two rivers are close to each other regardless of the amplitude differences in their monthly flow rates. We have illustrated the changes in mountain river flow complexity by experiments using (i) the data set for the Bosnia River and (ii) anticipated human activities and projected climate changes. We have explored the sensitivity of considered measures in dependence on the length of time series. In addition, we have divided the period 1926-1990 into three subintervals: (a) 1926-1945, (b) 1946-1965, (c) 1966-1990, and calculated the KLL, KLU, SE, PE values for the various time series in these subintervals. It is found that during the period 1946-1965, there is a decrease in their complexities, and corresponding changes in the SE and PE, in comparison to the period 1926-1990. This complexity loss may be primarily attributed to (i) human interventions, after the Second World War, on these two rivers because of their use for water consumption and (ii) climate change in recent times.
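
    Of the three quantifiers used above, permutation entropy is the simplest to reproduce; a minimal Bandt-Pompe implementation is sketched below for reference. The Kolmogorov complexity bounds and sample entropy are not shown, and the embedding order and delay are arbitrary defaults rather than the authors' settings.

```python
import numpy as np
from itertools import permutations
from math import factorial, log

def permutation_entropy(series, order=3, delay=1):
    """Normalised permutation entropy (Bandt-Pompe) of a 1-D series:
    0 for a fully regular series, 1 for a maximally random one."""
    series = np.asarray(series, dtype=float)
    patterns = {p: 0 for p in permutations(range(order))}
    n = len(series) - (order - 1) * delay
    for i in range(n):
        window = series[i:i + order * delay:delay]
        patterns[tuple(np.argsort(window))] += 1      # ordinal pattern of the window
    probs = np.array([c for c in patterns.values() if c > 0], dtype=float) / n
    return float(-(probs * np.log(probs)).sum() / log(factorial(order)))

rng = np.random.default_rng(0)
print(permutation_entropy(rng.normal(size=1000)))      # close to 1 for white noise
print(permutation_entropy(np.sin(np.arange(1000))))    # lower for a regular signal
```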

  9. Mapping Crop Cycles in China Using MODIS-EVI Time Series

    Directory of Open Access Journals (Sweden)

    Le Li

    2014-03-01

    As the Earth’s population continues to grow and demand for food increases, the need for improved and timely information related to the properties and dynamics of global agricultural systems is becoming increasingly important. Global land cover maps derived from satellite data provide indispensable information regarding the geographic distribution and areal extent of global croplands. However, land use information, such as cropping intensity (defined here as the number of cropping cycles per year), is not routinely available over large areas because mapping this information from remote sensing is challenging. In this study, we present a simple but efficient algorithm for automated mapping of cropping intensity based on data from NASA's MODerate Resolution Imaging Spectroradiometer (MODIS). The proposed algorithm first applies an adaptive Savitzky-Golay filter to smooth Enhanced Vegetation Index (EVI) time series derived from MODIS surface reflectance data. It then uses an iterative moving-window methodology to identify cropping cycles from the smoothed EVI time series. Comparison of results from our algorithm with national survey data at both the provincial and prefectural level in China shows that the algorithm provides estimates of gross sown area that agree well with inventory data. Accuracy assessment comparing visually interpreted time series with algorithm results for a random sample of agricultural areas in China indicates an overall accuracy of 91.0% for three classes defined based on the number of cycles observed in EVI time series. The algorithm therefore appears to provide a straightforward and efficient method for mapping cropping intensity from MODIS time series data.
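
    The smooth-then-count-cycles step can be approximated per pixel with SciPy, as sketched below: a fixed-parameter Savitzky-Golay filter followed by prominence-based peak detection. This is a simplified stand-in for the paper's adaptive filtering and iterative moving-window logic, and the EVI values, window length, and prominence threshold are invented for the example.

```python
import numpy as np
from scipy.signal import savgol_filter, find_peaks

def cropping_cycles(evi, window_length=7, polyorder=2, min_prominence=0.1):
    """Smooth one pixel's EVI trajectory and count growth cycles as peaks."""
    smoothed = savgol_filter(np.asarray(evi, dtype=float), window_length, polyorder)
    peaks, _ = find_peaks(smoothed, prominence=min_prominence)
    return len(peaks), smoothed

# toy 16-day composite EVI values for a double-cropping pixel (one year)
evi = [0.2, 0.25, 0.4, 0.55, 0.6, 0.5, 0.35, 0.25, 0.2, 0.22, 0.3,
       0.45, 0.58, 0.62, 0.5, 0.38, 0.27, 0.22, 0.2, 0.19, 0.2, 0.21, 0.2]
print(cropping_cycles(evi)[0])   # should detect 2 cycles
```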

  10. Toward automatic time-series forecasting using neural networks.

    Science.gov (United States)

    Yan, Weizhong

    2012-07-01

    Over the past few decades, application of artificial neural networks (ANN) to time-series forecasting (TSF) has been growing rapidly due to several unique features of ANN models. However, to date, a consistent ANN performance over different studies has not been achieved. Many factors contribute to the inconsistency in the performance of neural network models. One such factor is that ANN modeling involves determining a large number of design parameters, and the current design practice is essentially heuristic and ad hoc, which does not exploit the full potential of neural networks. Systematic ANN modeling processes and strategies for TSF are, therefore, greatly needed. Motivated by this need, this paper attempts to develop an automatic ANN modeling scheme. It is based on the generalized regression neural network (GRNN), a special type of neural network. By taking advantage of several GRNN properties (i.e., a single design parameter and fast learning) and by incorporating several design strategies (e.g., fusing multiple GRNNs), we have been able to make the proposed modeling scheme effective for modeling large-scale business time series. The initial model was entered into the NN3 time-series competition. It was awarded the best prediction on the reduced dataset among approximately 60 different models submitted by scholars worldwide.
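
    At its core a GRNN is Gaussian-kernel regression with a single smoothing parameter, which is what makes automated design tractable. The sketch below shows that core estimator applied to one-step-ahead forecasting over lagged inputs; the fusion of multiple GRNNs and the automatic parameter selection described above are not included, and the lag count and sigma are placeholder values.

```python
import numpy as np

def grnn_forecast(history, sigma=0.5, lags=4):
    """One-step-ahead forecast with a generalized regression neural network,
    i.e. Gaussian-kernel regression over lagged input vectors; sigma is the
    single smoothing parameter referred to in the abstract."""
    history = np.asarray(history, dtype=float)
    X = np.array([history[i:i + lags] for i in range(len(history) - lags)])
    y = history[lags:]
    query = history[-lags:]
    d2 = np.sum((X - query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return float(np.dot(w, y) / (w.sum() + 1e-12))

history = np.sin(np.linspace(0, 12 * np.pi, 240))   # toy seasonal "business" series
print(grnn_forecast(history, sigma=0.3, lags=4))
```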

  11. A NEW EARLY PLEISTOCENE BIRD ASSOCIATION FROM PIETRAFITTA (PERUGIA, CENTRAL ITALY)

    Directory of Open Access Journals (Sweden)

    GILDA ZUCCHETTA

    2003-11-01

    We present the preliminary results of the analysis of the fossil bird assemblages found in the lignite deposits of the Pietrafitta Mine (Perugia, Central Italy). A rich vertebrate association, mainly mammals, has been retrieved at Pietrafitta, which is the richest local fauna of the Farneta Faunal Unit (late Villafranchian, Early Pleistocene). Avian remains of Podicipedidae, Ardeidae, Phalacrocoracidae, Anatidae, Phasianidae and Rallidae have been identified, for most of which Pietrafitta represents the earliest occurrence in Italy. The Pietrafitta fossil bird association is the first Italian bird assemblage of the Early Pleistocene and seems to be one of the most important ones for the Early Pleistocene of Europe, especially because it contains mainly aquatic birds, which are often rare in many other European deposits.

  12. Multi-granular trend detection for time-series analysis

    NARCIS (Netherlands)

    van Goethem, A.I.; Staals, F.; Löffler, M.; Dykes, J.; Speckmann, B.

    2017-01-01

    Time series (such as stock prices) and ensembles (such as model runs for weather forecasts) are two important types of one-dimensional time-varying data. Such data is readily available in large quantities but visual analysis of the raw data quickly becomes infeasible, even for moderately sized data

  13. Optimal transformations for categorical autoregressive time series

    NARCIS (Netherlands)

    Buuren, S. van

    1996-01-01

    This paper describes a method for finding optimal transformations for analyzing time series by autoregressive models. 'Optimal' implies that the agreement between the autoregressive model and the transformed data is maximal. Such transformations help 1) to increase the model fit, and 2) to analyze

  14. Satellite Image Time Series Decomposition Based on EEMD

    Directory of Open Access Journals (Sweden)

    Yun-long Kong

    2015-11-01

    Satellite Image Time Series (SITS) have recently been of great interest due to the emerging remote sensing capabilities for Earth observation. Trend and seasonal components are two crucial elements of SITS. In this paper, a novel framework of SITS decomposition based on Ensemble Empirical Mode Decomposition (EEMD) is proposed. EEMD is achieved by sifting an ensemble of adaptive orthogonal components called Intrinsic Mode Functions (IMFs). EEMD is noise-assisted and overcomes the drawback of mode mixing in conventional Empirical Mode Decomposition (EMD). Inspired by these advantages, the aim of this work is to employ EEMD to decompose SITS into IMFs and to choose relevant IMFs for the separation of seasonal and trend components. In a series of simulations, IMFs extracted by EEMD achieved a clear representation with physical meaning. The experimental results of 16-day compositions of Moderate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) and Global Environment Monitoring Index (GEMI) time series with disturbance illustrated the effectiveness and stability of the proposed approach to monitoring tasks, such as applications for the detection of abrupt changes.
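
    A minimal EEMD decomposition of a synthetic NDVI-like series is sketched below, assuming the third-party PyEMD package (PyPI name EMD-signal) is installed; its exact API may vary across versions. The split of the IMFs into "seasonal" and "trend" groups here is a crude heuristic, not the relevance-based IMF selection proposed in the paper.

```python
import numpy as np
from PyEMD import EEMD   # assumption: third-party package, e.g. `pip install EMD-signal`

# toy NDVI-like series: seasonal cycle + slow trend + noise (ten years of 16-day steps)
t = np.arange(23 * 10, dtype=float)
rng = np.random.default_rng(0)
signal = 0.3 * np.sin(2 * np.pi * t / 23) + 0.001 * t + 0.02 * rng.normal(size=t.size)

eemd = EEMD(trials=100)          # number of noise-assisted sifting ensembles
imfs = eemd.eemd(signal)         # rows are IMFs, ordered from fast to slow oscillations

# crude grouping: fast IMFs approximate the seasonal/noise part, slow ones the trend
seasonal = imfs[:-2].sum(axis=0)
trend = imfs[-2:].sum(axis=0)
print(imfs.shape, trend[-1])
```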

  15. An accuracy assessment of real-time GNSS time series toward semi-real-time seafloor geodetic observation

    Science.gov (United States)

    Osada, Y.; Ohta, Y.; Demachi, T.; Kido, M.; Fujimoto, H.; Azuma, R.; Hino, R.

    2013-12-01

    Large interplate earthquakes have repeatedly occurred in the Japan Trench. Recently, detailed crustal deformation has been revealed by the nationwide inland GPS network GEONET, operated by GSI. However, the region of maximum displacement for interplate earthquakes is mainly located offshore. GPS/Acoustic seafloor geodetic observation (hereafter GPS/A) is therefore quite important and useful for understanding the shallower part of the interplate coupling between the subducting and overriding plates. We typically conduct GPS/A in specific ocean areas in a repeated campaign style using a research vessel or buoy. Therefore, we cannot monitor the temporal variation of seafloor crustal deformation in real time. One of the technical issues for real-time observation is kinematic GPS analysis, because it relies on both reference and rover data. If precise kinematic GPS analysis becomes possible in the offshore region, it should be a promising method for real-time GPS/A with a USV (Unmanned Surface Vehicle) or a moored buoy. We assessed the stability, precision and accuracy of the StarFire global satellite-based augmentation system. We first tested StarFire under static conditions. In order to assess coordinate precision and accuracy, we compared 1 Hz StarFire time series with post-processed precise point positioning (PPP) 1 Hz time series computed by the GIPSY-OASIS II processing software Ver. 6.1.2 with three different product types (ultra-rapid, rapid, and final orbits). We also used clock information at different intervals (30 and 300 seconds) for the post-processed PPP processing. The standard deviation of the real-time StarFire time series is less than 30 mm (horizontal components) and 60 mm (vertical component) based on 1 month of continuous processing. We also assessed the noise spectra of the time series estimated by StarFire and by post-processed GIPSY PPP. We found that the noise spectrum of the StarFire time series is similar in pattern to the GIPSY-OASIS II processing result based on the JPL rapid orbit

  16. The Exponential Model for the Spectrum of a Time Series: Extensions and Applications

    DEFF Research Database (Denmark)

    Proietti, Tommaso; Luati, Alessandra

    The exponential model for the spectrum of a time series and its fractional extensions are based on the Fourier series expansion of the logarithm of the spectral density. The coefficients of the expansion form the cepstrum of the time series. After deriving the cepstrum of important classes of time...

  17. Leveraging the Pre-DFG Residue Thr-406 To Obtain High Kinase Selectivity in an Aminopyrazole-Type PAK1 Inhibitor Series.

    Science.gov (United States)

    Rudolph, Joachim; Aliagas, Ignacio; Crawford, James J; Mathieu, Simon; Lee, Wendy; Chao, Qi; Dong, Ping; Rouge, Lionel; Wang, Weiru; Heise, Christopher; Murray, Lesley J; La, Hank; Liu, Yanzhou; Manning, Gerard; Diederich, François; Hoeflich, Klaus P

    2015-06-11

    To increase kinase selectivity in an aminopyrazole-based PAK1 inhibitor series, analogues were designed to interact with the PAK1 deep-front pocket pre-DFG residue Thr-406, a residue that is hydrophobic in most kinases. This goal was achieved by installing lactam head groups onto the aminopyrazole hinge-binding moiety. The corresponding analogues represent the most kinase-selective ATP-competitive Group I PAK inhibitors described to date. Hydrogen bonding with the Thr-406 side chain was demonstrated by X-ray crystallography, and inhibitory activities, particularly against kinases with hydrophobic pre-DFG residues, were mitigated. Leveraging hydrogen-bonding side chain interactions with polar pre-DFG residues is unprecedented, and similar strategies should be applicable to other appropriate kinases.

  18. Enteroclysis and small bowel series: Comparison of radiation dose and examination time

    International Nuclear Information System (INIS)

    Thoeni, R.F.; Gould, R.G.

    1991-01-01

    Respective radiation doses and total examination and fluoroscopy times were compared for 50 patients; 25 underwent enteroclysis and 25 underwent a small bowel series with (n = 17) or without (n = 8) an examination of the upper gastrointestinal (GI) tract. For enteroclysis, the mean skin entry radiation dose (12.3 rad [123 mGy]) and mean fluoroscopy time (18.4 minutes) were almost 1.5 times greater than those for the small bowel series with examination of the upper GI tract (8.4 rad [84 mGy]; 11.4 minutes) and almost three times greater than those for the small bowel series without upper GI examination (4.6 rad [46 mGy]; 6.3 minutes). However, the mean total examination completion time for enteroclysis (31.2 minutes) was almost half that of the small bowel series without upper GI examination (57.5 minutes) and roughly a quarter of that of the small bowel series with upper GI examination (114 minutes). The higher radiation dose of enteroclysis should be considered along with the short examination time, the age and clinical condition of the patient, and the reported higher accuracy when deciding on the appropriate radiographic examination of the small bowel

  19. Rotation in the dynamic factor modeling of multivariate stationary time series.

    NARCIS (Netherlands)

    Molenaar, P.C.M.; Nesselroade, J.R.

    2001-01-01

    A special rotation procedure is proposed for the exploratory dynamic factor model for stationary multivariate time series. The rotation procedure applies separately to each univariate component series of a q-variate latent factor series and transforms such a component, initially represented as white

  20. A simple and fast representation space for classifying complex time series

    International Nuclear Information System (INIS)

    Zunino, Luciano; Olivares, Felipe; Bariviera, Aurelio F.; Rosso, Osvaldo A.

    2017-01-01

    In the context of time series analysis considerable effort has been directed towards the implementation of efficient discriminating statistical quantifiers. Very recently, a simple and fast representation space has been introduced, namely the number of turning points versus the Abbe value. It is able to separate time series from stationary and non-stationary processes with long-range dependences. In this work we show that this bidimensional approach is useful for distinguishing complex time series: different sets of financial and physiological data are efficiently discriminated. Additionally, a multiscale generalization that takes into account the multiple time scales often involved in complex systems has been also proposed. This multiscale analysis is essential to reach a higher discriminative power between physiological time series in health and disease. - Highlights: • A bidimensional scheme has been tested for classification purposes. • A multiscale generalization is introduced. • Several practical applications confirm its usefulness. • Different sets of financial and physiological data are efficiently distinguished. • This multiscale bidimensional approach has high potential as discriminative tool.
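
    The two coordinates of the representation space named in the highlights, the number of turning points and the Abbe value, are both short computations; a sketch follows. The normalisation choices used here (turning-point fraction of interior points, Abbe value as the mean squared successive difference over twice the variance) are common conventions and may differ in detail from the paper's definitions.

```python
import numpy as np

def turning_points_and_abbe(x):
    """Turning-point fraction and Abbe value of a 1-D series; for white noise
    these are close to 2/3 and 1, respectively."""
    x = np.asarray(x, dtype=float)
    interior = x[1:-1]
    rises = (interior > x[:-2]) & (interior > x[2:])   # local maxima
    falls = (interior < x[:-2]) & (interior < x[2:])   # local minima
    tp_fraction = np.sum(rises | falls) / (len(x) - 2)
    abbe = np.sum(np.diff(x) ** 2) / (2.0 * np.sum((x - x.mean()) ** 2))
    return float(tp_fraction), float(abbe)

rng = np.random.default_rng(1)
print(turning_points_and_abbe(rng.normal(size=1000)))   # roughly (0.67, 1.0)
```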

  1. A simple and fast representation space for classifying complex time series

    Energy Technology Data Exchange (ETDEWEB)

    Zunino, Luciano, E-mail: lucianoz@ciop.unlp.edu.ar [Centro de Investigaciones Ópticas (CONICET La Plata – CIC), C.C. 3, 1897 Gonnet (Argentina); Departamento de Ciencias Básicas, Facultad de Ingeniería, Universidad Nacional de La Plata (UNLP), 1900 La Plata (Argentina); Olivares, Felipe, E-mail: olivaresfe@gmail.com [Instituto de Física, Pontificia Universidad Católica de Valparaíso (PUCV), 23-40025 Valparaíso (Chile); Bariviera, Aurelio F., E-mail: aurelio.fernandez@urv.cat [Department of Business, Universitat Rovira i Virgili, Av. Universitat 1, 43204 Reus (Spain); Rosso, Osvaldo A., E-mail: oarosso@gmail.com [Instituto de Física, Universidade Federal de Alagoas (UFAL), BR 104 Norte km 97, 57072-970, Maceió, Alagoas (Brazil); Instituto Tecnológico de Buenos Aires (ITBA) and CONICET, C1106ACD, Av. Eduardo Madero 399, Ciudad Autónoma de Buenos Aires (Argentina); Complex Systems Group, Facultad de Ingeniería y Ciencias Aplicadas, Universidad de los Andes, Av. Mons. Álvaro del Portillo 12.455, Las Condes, Santiago (Chile)

    2017-03-18

    In the context of time series analysis considerable effort has been directed towards the implementation of efficient discriminating statistical quantifiers. Very recently, a simple and fast representation space has been introduced, namely the number of turning points versus the Abbe value. It is able to separate time series from stationary and non-stationary processes with long-range dependences. In this work we show that this bidimensional approach is useful for distinguishing complex time series: different sets of financial and physiological data are efficiently discriminated. Additionally, a multiscale generalization that takes into account the multiple time scales often involved in complex systems has been also proposed. This multiscale analysis is essential to reach a higher discriminative power between physiological time series in health and disease. - Highlights: • A bidimensional scheme has been tested for classification purposes. • A multiscale generalization is introduced. • Several practical applications confirm its usefulness. • Different sets of financial and physiological data are efficiently distinguished. • This multiscale bidimensional approach has high potential as discriminative tool.

  2. Mammalian niche conservation through deep time.

    Directory of Open Access Journals (Sweden)

    Larisa R G DeSantis

    Climate change alters species distributions, causing plants and animals to move north or to higher elevations with current warming. Bioclimatic models predict species distributions based on extant realized niches and assume niche conservation. Here, we evaluate if proxies for niches (i.e., range areas) are conserved at the family level through deep time, from the Eocene to the Pleistocene. We analyze the occurrence of all mammalian families in the continental USA, calculating range area, percent range area occupied, range area rank, and range polygon centroids during each epoch. Percent range area occupied significantly increases from the Oligocene to the Miocene and again from the Pliocene to the Pleistocene; however, mammalian families maintain statistical concordance between rank orders across time. Families with greater taxonomic diversity occupy a greater percent of available range area during each epoch and net changes in taxonomic diversity are significantly positively related to changes in percent range area occupied from the Eocene to the Pleistocene. Furthermore, gains and losses in generic and species diversity are remarkably consistent with ~2.3 species gained per generic increase. Centroids demonstrate southeastern shifts from the Eocene through the Pleistocene that may correspond to major environmental events and/or climate changes during the Cenozoic. These results demonstrate range conservation at the family level and support the idea that niche conservation at higher taxonomic levels operates over deep time and may be controlled by life history traits. Furthermore, families containing megafauna and/or terminal Pleistocene extinction victims do not incur significantly greater declines in range area rank than families containing only smaller taxa and/or only survivors, from the Pliocene to Pleistocene. Collectively, these data evince the resilience of families to climate and/or environmental change in deep time, the absence of

  3. Visibility graphlet approach to chaotic time series

    Energy Technology Data Exchange (ETDEWEB)

    Mutua, Stephen [Business School, University of Shanghai for Science and Technology, Shanghai 200093 (China); Computer Science Department, Masinde Muliro University of Science and Technology, P.O. Box 190-50100, Kakamega (Kenya); Gu, Changgui, E-mail: gu-changgui@163.com, E-mail: hjyang@ustc.edu.cn; Yang, Huijie, E-mail: gu-changgui@163.com, E-mail: hjyang@ustc.edu.cn [Business School, University of Shanghai for Science and Technology, Shanghai 200093 (China)

    2016-05-15

    Many novel methods have been proposed for mapping time series into complex networks. Although some dynamical behaviors can be effectively captured by existing approaches, the preservation and tracking of the temporal behaviors of a chaotic system remains an open problem. In this work, we extended the visibility graphlet approach to investigate both discrete and continuous chaotic time series. We applied visibility graphlets to capture the reconstructed local states, so that each is treated as a node and tracked downstream to create a temporal chain link. Our empirical findings show that the approach accurately captures the dynamical properties of chaotic systems. Networks constructed from periodic dynamic phases all converge to regular networks and to unique network structures for each model in the chaotic zones. Furthermore, our results show that the characterization of chaotic and non-chaotic zones in the Lorenz system corresponds to the maximal Lyapunov exponent, thus providing a simple and straightforward way to analyze chaotic systems.
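
    As background for the graphlet construction, the sketch below builds the standard natural visibility graph of a short chaotic orbit: two samples are connected when the straight line between them passes above every intermediate sample. The paper's graphlet and state-tracking extensions go beyond this basic mapping, and the logistic-map orbit is just a convenient test signal.

```python
import numpy as np

def natural_visibility_graph(y):
    """Adjacency matrix of the natural visibility graph of a 1-D series:
    two samples are linked if the straight line between them stays above
    every intermediate sample."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    adj = np.zeros((n, n), dtype=bool)
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                          for c in range(a + 1, b))
            adj[a, b] = adj[b, a] = visible
    return adj

# short logistic-map orbit in the chaotic regime (r = 4)
x = [0.4]
for _ in range(50):
    x.append(4 * x[-1] * (1 - x[-1]))
print(natural_visibility_graph(x).sum() // 2, "edges")
```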

  4. Fast and Flexible Multivariate Time Series Subsequence Search

    Data.gov (United States)

    National Aeronautics and Space Administration — Multivariate Time-Series (MTS) are ubiquitous, and are generated in areas as disparate as sensor recordings in aerospace systems, music and video streams, medical...

  5. Automated Feature Design for Time Series Classification by Genetic Programming

    OpenAIRE

    Harvey, Dustin Yewell

    2014-01-01

    Time series classification (TSC) methods discover and exploit patterns in time series and other one-dimensional signals. Although many accurate, robust classifiers exist for multivariate feature sets, general approaches are needed to extend machine learning techniques to make use of signal inputs. Numerous applications of TSC can be found in structural engineering, especially in the areas of structural health monitoring and non-destructive evaluation. Additionally, the fields of process contr...

  6. The use of synthetic input sequences in time series modeling

    International Nuclear Information System (INIS)

    Oliveira, Dair Jose de; Letellier, Christophe; Gomes, Murilo E.D.; Aguirre, Luis A.

    2008-01-01

    In many situations time series models obtained from noise-like data settle to trivial solutions under iteration. This Letter proposes a way of producing a synthetic (dummy) input that is included to prevent the model from settling down to a trivial solution, while maintaining features of the original signal. Simulated benchmark models and a real time series of RR intervals from an ECG are used to illustrate the procedure

  7. Near-Real-Time Monitoring of Insect Defoliation Using Landsat Time Series

    Directory of Open Access Journals (Sweden)

    Valerie J. Pasquarella

    2017-07-01

    Full Text Available Introduced insects and pathogens impact millions of acres of forested land in the United States each year, and large-scale monitoring efforts are essential for tracking the spread of outbreaks and quantifying the extent of damage. However, monitoring the impacts of defoliating insects presents a significant challenge due to the ephemeral nature of defoliation events. Using the 2016 gypsy moth (Lymantria dispar) outbreak in Southern New England as a case study, we present a new approach for near-real-time defoliation monitoring using synthetic images produced from Landsat time series. By comparing predicted and observed images, we assessed changes in vegetation condition multiple times over the course of an outbreak. Initial measures can be made as imagery becomes available, and season-integrated products provide a wall-to-wall assessment of potential defoliation at 30 m resolution. Qualitative and quantitative comparisons suggest our Landsat Time Series (LTS) products improve identification of defoliation events relative to existing products and provide a repeatable metric of change in condition. Our synthetic-image approach is an important step toward using the full temporal potential of the Landsat archive for operational monitoring of forest health over large extents, and provides an important new tool for understanding spatial and temporal dynamics of insect defoliators.

  8. Analysis of three amphibian populations with quarter-century long time-series.

    OpenAIRE

    Meyer, A H; Schmidt, B R; Grossenbacher, K

    1998-01-01

    Amphibians are in decline in many parts of the world. Long time-series of amphibian populations are necessary to distinguish declines from the often strong fluctuations observed in natural populations. Time-series may also help to understand the causes of these declines. We analysed 23-28-year long time-series of the frog Rana temporaria. Only one of the three studied populations showed a negative trend which was probably caused by the introduction of fish. Two populations appeared to be densi...

  9. Hominin teeth from the Middle Pleistocene site of Yiyuan, Eastern China.

    Science.gov (United States)

    Xing, Song; Sun, Chengkai; Martinón-Torres, María; Bermúdez de Castro, José María; Han, Fei; Zhang, Yingqi; Liu, Wu

    2016-06-01

    In 1981-1982, some hominin fossils, including a relatively complete skull and seven isolated teeth, were recovered from the Middle Pleistocene site of Yiyuan in Eastern China. In the present study we provide a detailed metric and morphological comparison of the Yiyuan dental sample in order to better characterize the variability of the human populations that inhabited China during the Middle Pleistocene. Aside from taxonomic and phylogenetic questions, the lack of understanding and/or knowledge about the morphological variability of these populations has caused concern about the human versus non-human nature of some of the hominin dental remains found in East Asia during the Early and the Middle Pleistocene. Thus, our study aims to present a detailed description and comparison of the Yiyuan isolated teeth to 1) discuss and support their human nature and 2) explore their taxonomic affinities with regard to other penecontemporaneous populations from Asia. Our results clearly differentiate the Yiyuan sample from Pongo specimens and support a human attribution for the Yiyuan material. Our analyses also suggest that the Yiyuan teeth form a morphologically coherent group together with samples from Zhoukoudian, Chaoxian and Hexian. They are different from the more derived specimens from Panxian Dadong, suggesting a pattern of biogeographic isolation and different evolutionary trends between northern and southern China during the Middle Pleistocene. In addition, and despite sharing a common morphological bauplan with Homo erectus sensu stricto (s.s.), the Yiyuan, Zhoukoudian and Hexian teeth are also different from the Indonesian Early Pleistocene samples. In particular, the expression of a highly crenulated or dendritic enamel-dentine surface could be unique to these groups. Our study supports the notion that the taxonomy of the Pleistocene hominins from Asia may have been oversimplified. Future studies should explore the variability of the Asian specimens and

  10. Early Pleistocene sediments at Great Blakenham, Suffolk, England

    Science.gov (United States)

    Gibbard, P. L.; Allen, P.; Field, M. H.; Hallam, D. F.

    Detailed investigation of a fine sediment sequence, the College Farm Silty Clay Member, that overlies the Creeting Sands (Early Pleistocene) in Suffolk, is presented. The sedimentary sequence is thought to represent a freshwater pool accumulation in a small coastal embayment. Palaeobotanical investigation of the sediment indicates that it accumulated during the late temperate substage of a temperate (interglacial) event. The occurrence of Tsuga pollen, associated with abundant remains of the water fern Azolla tegeliensis, indicates that the deposits are of Early Pleistocene age and are correlated with a later part of the Antian-Bramertonian Stage. Correlation with the Tiglian TO substage in The Netherlands' sequence is most likely. The sediments' normal palaeomagnetic polarity reinforces the biostratigraphical correlation.

  11. Construction and Qualification of the Pre-Series MQM Superconducting Quadrupoles for the LHC Insertions

    CERN Document Server

    Ostojic, R; Lucas, J; Venturini-Delsolaro, W; Landgrebe, D

    2004-01-01

    The LHC insertions will be equipped with individually powered MQM superconducting quadrupoles, produced in three versions with magnetic lengths of 2.4 m, 3.4 m, and 4.8 m. The quadrupoles feature a 56 mm aperture coil, designed on the basis of an 8.8 mm wide Rutherford-type NbTi cable for a nominal gradient of 200 T/m at 1.9 K and 5390 A. A total of 96 quadrupoles are in production at Tesla Engineering, UK. In this report we describe the construction of the pre-series MQM quadrupoles and present the results of the qualification tests.

  12. Plio-Pleistocene phylogeography of the Southeast Asian Blue Panchax killifish, Aplocheilus panchax

    Science.gov (United States)

    Carvalho, Gary R.; Barlow, Axel; Rüber, Lukas; Hui Tan, Heok; Nugroho, Estu; Wowor, Daisy; Mohd Nor, Siti Azizah; Herder, Fabian; Muchlisin, Zainal A.; de Bruyn, Mark

    2017-01-01

    The complex climatic and geological history of Southeast Asia has shaped this region’s high biodiversity. In particular, sea level fluctuations associated with repeated glacial cycles during the Pleistocene both facilitated, and limited, connectivity between populations. In this study, we used data from two mitochondrial and three anonymous nuclear markers to determine whether a fresh/brackish water killifish, Aplocheilus panchax, Hamilton, 1822, could be used to further understand how climatic oscillations and associated sea level fluctuations have shaped the distribution of biota within this region, and whether such patterns show evidence of isolation within palaeodrainage basins. Our analyses revealed three major mitochondrial clades within A. panchax. The basal divergence of A. panchax mitochondrial lineages was approximately 3.5 Ma, whilst the subsequent divergences of these clades occurred in the early Pleistocene (~2.6 Ma) and proceeded through the Pleistocene. Continuous phylogeographic analysis showed a clear west-east dispersal followed by rapid radiation across Southeast Asia. Individuals from Krabi, just north of the Isthmus of Kra, were more closely related to the Indian lineages, providing further evidence for a freshwater faunal disjunction at the Isthmus of Kra biogeographic barrier. Our results suggest that Sulawesi, across the Wallace Line, was colonised relatively recently (~30 ka). Nuclear DNA is less geographically structured, although Mantel tests indicated that nuclear genetic distances were correlated with geographic proximity. Overall, these results imply that recent gene flow, as opposed to historical isolation, has been the key factor determining patterns of nuclear genetic variation in A. panchax; however, some evidence of historical isolation is retained within the mitochondrial genome. Our study further validates the existence of a major biogeographic boundary at the Kra Isthmus, and also demonstrates the use of widely distributed

  13. Robust Control Charts for Time Series Data

    NARCIS (Netherlands)

    Croux, C.; Gelper, S.; Mahieu, K.

    2010-01-01

    This article presents a control chart for time series data, based on the one-step-ahead forecast errors of the Holt-Winters forecasting method. We use robust techniques to prevent outliers from affecting the estimation of the control limits of the chart. Moreover, robustness is important to maintain
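    A minimal sketch of the general idea, monitoring one-step-ahead Holt-Winters forecast errors against control limits estimated robustly from the median and MAD, is given below. This is an illustration under assumptions (synthetic data, additive seasonality with period 12, 3-sigma limits), not the authors' estimator:

```python
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(1)
# synthetic seasonal series with a level shift near the end
t = np.arange(240)
y = 10 + 0.02 * t + 2 * np.sin(2 * np.pi * t / 12) + rng.normal(scale=0.5, size=t.size)
y[220:] += 4.0  # out-of-control episode to be flagged

# fit Holt-Winters on an in-control training window
train = y[:200]
hw = ExponentialSmoothing(train, trend="add", seasonal="add", seasonal_periods=12).fit()

# one-step-ahead forecast errors on the training window
errors = train - hw.fittedvalues

# robust control limits from median and MAD (1.4826 * MAD approximates sigma)
center = np.median(errors)
sigma = 1.4826 * np.median(np.abs(errors - center))
ucl, lcl = center + 3 * sigma, center - 3 * sigma

# monitor the remaining observations against their forecasts
forecast = hw.forecast(len(y) - 200)
new_errors = y[200:] - forecast
alarms = np.where((new_errors > ucl) | (new_errors < lcl))[0] + 200
print("alarms at indices:", alarms)
```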

  14. Lecture notes for Advanced Time Series Analysis

    DEFF Research Database (Denmark)

    Madsen, Henrik; Holst, Jan

    1997-01-01

    A first version of these notes was used at the lectures in Grenoble, and they are now extended and improved (together with Jan Holst), and used in Ph.D. courses on Advanced Time Series Analysis at IMM and at the Department of Mathematical Statistics, University of Lund, 1994, 1997, ...

  15. Predicting long-term catchment nutrient export: the use of nonlinear time series models

    Science.gov (United States)

    Valent, Peter; Howden, Nicholas J. K.; Szolgay, Jan; Komornikova, Magda

    2010-05-01

    After the Second World War, nitrate concentrations in European water bodies changed significantly as a result of increased nitrogen fertilizer use and changes in land use. In recent decades, however, as a consequence of the implementation of nitrate-reducing measures in Europe, nitrate concentrations in water bodies have slowly decreased. As a result, the mean and variance of the observed time series also change with time (non-stationarity and heteroscedasticity). In order to detect changes and properly describe the behaviour of such time series, linear models (such as autoregressive (AR), moving average (MA) and autoregressive moving average (ARMA) models) are no longer suitable. Time series with sudden changes in statistical characteristics can cause various problems in the calibration of traditional water quality models and thus give biased predictions. Proper statistical analysis of these non-stationary and heteroscedastic time series, with the aim of detecting and subsequently explaining the variations in their statistical characteristics, requires the use of nonlinear time series models. This information can then be used to improve the model building and calibration of conceptual water quality models, or to select the right calibration periods in order to produce reliable predictions. The objective of this contribution is to analyze two long time series of nitrate concentrations of the rivers Ouse and Stour with advanced nonlinear statistical modelling techniques and compare their performance with traditional linear models of the ARMA class in order to identify changes in the time series characteristics. The time series were analysed with nonlinear models with multiple regimes represented by self-exciting threshold autoregressive (SETAR) and Markov-switching (MSW) models. The analysis showed that, based on the value of the residual sum of squares (RSS) in both datasets, SETAR and MSW models described the time-series better than models of the
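    As a rough illustration of the SETAR idea used in the study, the sketch below fits a two-regime threshold AR(1) model by grid-searching the threshold that minimizes the residual sum of squares. The synthetic series, lag order and trimming fraction are assumptions, and the published analysis is not reproduced here:

```python
import numpy as np

def fit_ar1(y_t, y_lag):
    """OLS fit of y_t = a + b*y_lag + e; returns (coefficients, RSS)."""
    X = np.column_stack([np.ones_like(y_lag), y_lag])
    coef, *_ = np.linalg.lstsq(X, y_t, rcond=None)
    resid = y_t - X @ coef
    return coef, float(resid @ resid)

def fit_setar2(y, trim=0.15):
    """Two-regime SETAR(2;1,1): the regime is chosen by whether y[t-1] exceeds a threshold."""
    y_t, y_lag = y[1:], y[:-1]
    # candidate thresholds: interior quantiles of the threshold variable
    candidates = np.quantile(y_lag, np.linspace(trim, 1 - trim, 50))
    best = None
    for thr in candidates:
        low, high = y_lag <= thr, y_lag > thr
        if low.sum() < 10 or high.sum() < 10:
            continue
        c_low, rss_low = fit_ar1(y_t[low], y_lag[low])
        c_high, rss_high = fit_ar1(y_t[high], y_lag[high])
        rss = rss_low + rss_high
        if best is None or rss < best[0]:
            best = (rss, thr, c_low, c_high)
    return best  # (total RSS, threshold, low-regime coefs, high-regime coefs)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # synthetic two-regime series standing in for a nitrate record
    y = np.zeros(500)
    for t in range(1, 500):
        if y[t - 1] <= 0.0:
            y[t] = 0.5 + 0.7 * y[t - 1] + rng.normal(scale=0.3)
        else:
            y[t] = -0.2 + 0.3 * y[t - 1] + rng.normal(scale=0.3)
    rss, thr, c_low, c_high = fit_setar2(y)
    print(f"estimated threshold: {thr:.2f}, total RSS: {rss:.1f}")
```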

  16. Interactive Web-based Visualization of Atomic Position-time Series Data

    Science.gov (United States)

    Thapa, S.; Karki, B. B.

    2017-12-01

    Extracting and interpreting the information contained in large sets of time-varying three dimensional positional data for the constituent atoms of simulated material is a challenging task. We have recently implemented a web-based visualization system to analyze the position-time series data extracted from local or remote hosts. It includes a pre-processing step for data reduction, in which uninteresting parts of the data are skipped uniformly (at the full atomic configuration level) or non-uniformly (at the atomic species level or the individual atom level). An atomic configuration snapshot is rendered using the ball-stick representation and can be animated by rendering successive configurations. The entire atomic dynamics can be captured as trajectories by rendering the atomic positions at all time steps together as points. The trajectories can be manipulated at both the species and atomic levels so that we can focus on one or more trajectories of interest, and they can also be superimposed with the instantaneous atomic structure. The implementation was done using WebGL and Three.js for graphical rendering, HTML5 and Javascript for the GUI, and Elasticsearch and JSON for data storage and retrieval within the Grails Framework. We have applied our visualization system to the simulation datasets for proton-bearing forsterite (Mg2SiO4) - an abundant mineral of Earth's upper mantle. Visualization reveals that protons (hydrogen ions) incorporated as interstitials are much more mobile than protons substituting the host Mg and Si cation sites. The proton diffusion appears to be anisotropic with high mobility along the x-direction, showing limited discrete jumps in the other two directions.

  17. Extracting biologically significant patterns from short time series gene expression data

    Directory of Open Access Journals (Sweden)

    McGinnis Thomas

    2009-08-01

    Full Text Available Abstract Background Time series gene expression data analysis is used widely to study the dynamics of various cell processes. Most of the time series data available today consist of only a few time points, thus making the application of standard clustering techniques difficult. Results We developed two new algorithms that are capable of extracting biological patterns from short time point series gene expression data. The two algorithms, ASTRO and MiMeSR, are inspired by the rank order preserving framework and the minimum mean squared residue approach, respectively. However, ASTRO and MiMeSR differ from previous approaches in that they take advantage of the relatively small number of time points in order to reduce the problem from NP-hard to linear. Tested on well-defined short time expression data, we found that our approaches are robust to noise, as well as to random patterns, and that they can correctly detect the temporal expression profile of relevant functional categories. Evaluation of our methods was performed using Gene Ontology (GO) annotations and chromatin immunoprecipitation (ChIP-chip) data. Conclusion Our approaches generally outperform both standard clustering algorithms and algorithms designed specifically for clustering of short time series gene expression data. Both algorithms are available at http://www.benoslab.pitt.edu/astro/.

  18. The application of complex network time series analysis in turbulent heated jets

    International Nuclear Information System (INIS)

    Charakopoulos, A. K.; Karakasidis, T. E.; Liakopoulos, A.; Papanicolaou, P. N.

    2014-01-01

    In the present study, we applied the methodology of complex network-based time series analysis to experimental temperature time series from a vertical turbulent heated jet. More specifically, we approach the hydrodynamic problem of discriminating time series corresponding to various regions relative to the jet axis, i.e., separating time series corresponding to regions close to the jet axis from time series originating in regions with a different dynamical regime, based on the constructed network properties. Applying the phase space transformation method (k nearest neighbors) and also the visibility algorithm, we transformed time series into networks and evaluated the topological properties of the networks, such as degree distribution, average path length, diameter, modularity, and clustering coefficient. The results show that the complex network approach allows distinguishing, identifying, and exploring in detail various dynamical regions of the jet flow, and associating them with the corresponding physical behavior. In addition, in order to reject the hypothesis that the studied networks originate from a stochastic process, we generated random networks and compared their statistical properties with those of the networks obtained from the experimental data. As far as the efficiency of the two methods for network construction is concerned, we conclude that both methodologies lead to network properties that present almost the same qualitative behavior and allow us to reveal the underlying system dynamics.

  19. Trend analysis using non-stationary time series clustering based on the finite element method

    Science.gov (United States)

    Gorji Sefidmazgi, M.; Sayemuzzaman, M.; Homaifar, A.; Jha, M. K.; Liess, S.

    2014-05-01

    In order to analyze low-frequency variability of climate, it is useful to model the climatic time series with multiple linear trends and locate the times of significant changes. In this paper, we have used non-stationary time series clustering to find change points in the trends. Clustering in a multi-dimensional non-stationary time series is challenging, since the problem is mathematically ill-posed. Clustering based on the finite element method (FEM) is one of the methods that can analyze multidimensional time series. One important attribute of this method is that it is not dependent on any statistical assumption and does not need local stationarity in the time series. In this paper, it is shown how the FEM-clustering method can be used to locate change points in the trend of temperature time series from in situ observations. This method is applied to the temperature time series of North Carolina (NC) and the results represent region-specific climate variability despite higher frequency harmonics in climatic time series. Next, we investigated the relationship between the climatic indices and the clusters/trends detected with this clustering method. It appears that the natural variability of climate change in NC during 1950-2009 can be explained mostly by AMO and solar activity.

  20. Frontiers in Time Series and Financial Econometrics : An overview

    NARCIS (Netherlands)

    S. Ling (Shiqing); M.J. McAleer (Michael); H. Tong (Howell)

    2015-01-01

    Abstract: Two of the fastest growing frontiers in econometrics and quantitative finance are time series and financial econometrics. Significant theoretical contributions to financial econometrics have been made by experts in statistics, econometrics, mathematics, and time

  1. Frontiers in Time Series and Financial Econometrics: An Overview

    NARCIS (Netherlands)

    S. Ling (Shiqing); M.J. McAleer (Michael); H. Tong (Howell)

    2015-01-01

    Abstract: Two of the fastest growing frontiers in econometrics and quantitative finance are time series and financial econometrics. Significant theoretical contributions to financial econometrics have been made by experts in statistics, econometrics, mathematics, and time

  2. A four-stage hybrid model for hydrological time series forecasting.

    Science.gov (United States)

    Di, Chongli; Yang, Xiaohua; Wang, Xiaochao

    2014-01-01

    Hydrological time series forecasting remains a difficult task due to its complicated nonlinear, non-stationary and multi-scale characteristics. To solve this difficulty and improve the prediction accuracy, a novel four-stage hybrid model is proposed for hydrological time series forecasting based on the principle of 'denoising, decomposition and ensemble'. The proposed model has four stages, i.e., denoising, decomposition, components prediction and ensemble. In the denoising stage, the empirical mode decomposition (EMD) method is utilized to reduce the noises in the hydrological time series. Then, an improved method of EMD, the ensemble empirical mode decomposition (EEMD), is applied to decompose the denoised series into a number of intrinsic mode function (IMF) components and one residual component. Next, the radial basis function neural network (RBFNN) is adopted to predict the trend of all of the components obtained in the decomposition stage. In the final ensemble prediction stage, the forecasting results of all of the IMF and residual components obtained in the third stage are combined to generate the final prediction results, using a linear neural network (LNN) model. For illustration and verification, six hydrological cases with different characteristics are used to test the effectiveness of the proposed model. The proposed hybrid model performs better than conventional single models, the hybrid models without denoising or decomposition and the hybrid models based on other methods, such as the wavelet analysis (WA)-based hybrid models. In addition, the denoising and decomposition strategies decrease the complexity of the series and reduce the difficulties of the forecasting. With its effective denoising and accurate decomposition ability, high prediction precision and wide applicability, the new model is very promising for complex time series forecasting. This new forecast model is an extension of nonlinear prediction models.
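    A compressed sketch of the 'denoising, decomposition, prediction, ensemble' pipeline shape is given below. It assumes the third-party PyEMD package (installed as EMD-signal) for EMD/EEMD, and substitutes scikit-learn's RBF-kernel ridge regression for the RBFNN and ordinary least squares for the LNN ensemble, so it illustrates the four-stage structure rather than the published model:

```python
import numpy as np
from PyEMD import EMD, EEMD                        # assumption: pip install EMD-signal
from sklearn.kernel_ridge import KernelRidge       # RBF-kernel stand-in for an RBFNN
from sklearn.linear_model import LinearRegression  # stand-in for the LNN ensemble

def lag_matrix(x, p=4):
    """Design matrix of p lagged values for one-step-ahead prediction."""
    X = np.column_stack([x[i:len(x) - p + i] for i in range(p)])
    return X, x[p:]

rng = np.random.default_rng(2)
t = np.linspace(0, 20, 600)
series = np.sin(t) + 0.5 * np.sin(3.7 * t) + rng.normal(scale=0.2, size=t.size)

# Stage 1: "denoise" by dropping the first (highest-frequency) EMD mode
imfs = EMD().emd(series)
denoised = series - imfs[0]

# Stage 2: decompose the denoised series with EEMD
components = EEMD(trials=20).eemd(denoised)

# Stage 3: predict each component one step ahead with an RBF-kernel regressor
split = 500
component_preds = []
for c in components:
    X, y = lag_matrix(c)
    model = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.5).fit(X[:split], y[:split])
    component_preds.append(model.predict(X[split:]))

# Stage 4: combine the component forecasts with a linear model
P = np.column_stack(component_preds)
_, y_true = lag_matrix(denoised)
ensemble = LinearRegression().fit(P, y_true[split:])
final_forecast = ensemble.predict(P)
print("RMSE:", np.sqrt(np.mean((final_forecast - y_true[split:]) ** 2)))
```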

  3. A Four-Stage Hybrid Model for Hydrological Time Series Forecasting

    Science.gov (United States)

    Di, Chongli; Yang, Xiaohua; Wang, Xiaochao

    2014-01-01

    Hydrological time series forecasting remains a difficult task due to its complicated nonlinear, non-stationary and multi-scale characteristics. To solve this difficulty and improve the prediction accuracy, a novel four-stage hybrid model is proposed for hydrological time series forecasting based on the principle of ‘denoising, decomposition and ensemble’. The proposed model has four stages, i.e., denoising, decomposition, components prediction and ensemble. In the denoising stage, the empirical mode decomposition (EMD) method is utilized to reduce the noises in the hydrological time series. Then, an improved method of EMD, the ensemble empirical mode decomposition (EEMD), is applied to decompose the denoised series into a number of intrinsic mode function (IMF) components and one residual component. Next, the radial basis function neural network (RBFNN) is adopted to predict the trend of all of the components obtained in the decomposition stage. In the final ensemble prediction stage, the forecasting results of all of the IMF and residual components obtained in the third stage are combined to generate the final prediction results, using a linear neural network (LNN) model. For illustration and verification, six hydrological cases with different characteristics are used to test the effectiveness of the proposed model. The proposed hybrid model performs better than conventional single models, the hybrid models without denoising or decomposition and the hybrid models based on other methods, such as the wavelet analysis (WA)-based hybrid models. In addition, the denoising and decomposition strategies decrease the complexity of the series and reduce the difficulties of the forecasting. With its effective denoising and accurate decomposition ability, high prediction precision and wide applicability, the new model is very promising for complex time series forecasting. This new forecast model is an extension of nonlinear prediction models. PMID:25111782

  4. A Gaussian Process Based Online Change Detection Algorithm for Monitoring Periodic Time Series

    Energy Technology Data Exchange (ETDEWEB)

    Chandola, Varun [ORNL]; Vatsavai, Raju [ORNL]

    2011-01-01

    Online time series change detection is a critical component of many monitoring systems, such as space and air-borne remote sensing instruments, cardiac monitors, and network traffic profilers, which continuously analyze observations recorded by sensors. Data collected by such sensors typically has a periodic (seasonal) component. Most existing time series change detection methods are not directly applicable to handle such data, either because they are not designed to handle periodic time series or because they cannot operate in an online mode. We propose an online change detection algorithm which can handle periodic time series. The algorithm uses a Gaussian process based non-parametric time series prediction model and monitors the difference between the predictions and actual observations within a statistically principled control chart framework to identify changes. A key challenge in using a Gaussian process in an online mode is the need to solve a large system of equations involving the associated covariance matrix which grows with every time step. The proposed algorithm exploits the special structure of the covariance matrix and can analyze a time series of length T in O(T^2) time while maintaining an O(T) memory footprint, compared to the O(T^4) time and O(T^2) memory requirement of standard matrix manipulation methods. We experimentally demonstrate the superiority of the proposed algorithm over several existing time series change detection algorithms on a set of synthetic and real time series. Finally, we illustrate the effectiveness of the proposed algorithm for identifying land use land cover changes using Normalized Difference Vegetation Index (NDVI) data collected for an agricultural region in Iowa state, USA. Our algorithm is able to detect different types of changes in an NDVI validation data set (with ~80% accuracy) which occur due to crop type changes as well as disruptive changes (e.g., natural disasters).
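    The core idea, a Gaussian process with a periodic kernel fitted to an in-control history and new observations flagged when they leave the prediction band, can be sketched as below. The kernel choice, window length and 3-sigma rule are assumptions, and the paper's efficient incremental update of the covariance system is not reproduced:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import ConstantKernel, ExpSineSquared, WhiteKernel

rng = np.random.default_rng(3)
t = np.arange(200, dtype=float)
season = np.sin(2 * np.pi * t / 23)             # periodic (seasonal) component
y = season + rng.normal(scale=0.15, size=t.size)
y[160:] -= 1.2                                   # abrupt change to be detected

# fit a periodic GP to an in-control history
train_n = 150
kernel = (ConstantKernel() * ExpSineSquared(length_scale=1.0, periodicity=23.0)
          + WhiteKernel(noise_level=0.05))
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(t[:train_n, None], y[:train_n])

# predictive mean and std for the monitoring period
mean, std = gp.predict(t[train_n:, None], return_std=True)
score = np.abs(y[train_n:] - mean) / std         # standardized deviation from the forecast
alarms = np.where(score > 3.0)[0] + train_n
print("first alarm at t =", alarms[0] if alarms.size else None)
```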

  5. Time-variant power spectral analysis of heart-rate time series by ...

    Indian Academy of Sciences (India)

    Frequency domain representation of a short-term heart-rate time series (HRTS) signal is a popular method for evaluating the cardiovascular control system. The spectral parameters, viz. percentage power in low frequency band (%PLF), percentage power in high frequency band (%PHF), power ratio of low frequency to high ...

  6. Influence of Time-Series Normalization, Number of Nodes, Connectivity and Graph Measure Selection on Seizure-Onset Zone Localization from Intracranial EEG.

    Science.gov (United States)

    van Mierlo, Pieter; Lie, Octavian; Staljanssens, Willeke; Coito, Ana; Vulliémoz, Serge

    2018-04-26

    We investigated the influence of processing steps in the estimation of multivariate directed functional connectivity during seizures recorded with intracranial EEG (iEEG) on seizure-onset zone (SOZ) localization. We studied the effect of (i) the number of nodes, (ii) time-series normalization, (iii) the choice of multivariate time-varying connectivity measure: Adaptive Directed Transfer Function (ADTF) or Adaptive Partial Directed Coherence (APDC) and (iv) graph theory measure: outdegree or shortest path length. First, simulations were performed to quantify the influence of the various processing steps on the accuracy of localizing the SOZ. Afterwards, the SOZ was estimated from a 113-electrode iEEG seizure recording and compared with the resection that rendered the patient seizure-free. The simulations revealed that ADTF is preferred over APDC to localize the SOZ from ictal iEEG recordings. Normalizing the time series before analysis resulted in an increase of 25-35% of correctly localized SOZ, while adding more nodes to the connectivity analysis led to a moderate decrease of 10%, when comparing 128 with 32 input nodes. The real-seizure connectivity estimates localized the SOZ inside the resection area using the ADTF coupled to outdegree or shortest path length. Our study showed that normalizing the time-series is an important pre-processing step, while adding nodes to the analysis only marginally affected the SOZ localization. The study shows that directed multivariate Granger-based connectivity analysis is feasible with many input nodes (> 100) and that normalization of the time-series before connectivity analysis is preferred.

  7. Rotation in the Dynamic Factor Modeling of Multivariate Stationary Time Series.

    Science.gov (United States)

    Molenaar, Peter C. M.; Nesselroade, John R.

    2001-01-01

    Proposes a special rotation procedure for the exploratory dynamic factor model for stationary multivariate time series. The rotation procedure applies separately to each univariate component series of a q-variate latent factor series and transforms such a component, initially represented as white noise, into a univariate moving-average.…

  8. Frequency of fault occurrence at shallow depths during Plio-Pleistocene and estimation of the incident of new faults

    International Nuclear Information System (INIS)

    Shiratsuchi, H.; Yoshida, S.

    2009-01-01

    It is required that buried high-level radioactive waste not be disrupted directly by faulting in the future. Although a disposal site will be selected in an area where no active faults are present, the possibility of new faults occurring at the site has to be evaluated. The probability of new fault occurrence is estimated from the frequency of faults that exist in Pliocene and Pleistocene strata distributed beneath three large plains in Japan, where a large number of seismic profiles and borehole data have been obtained. The aim is to estimate the frequency of faults that formed and/or reached shallow depths during Plio-Pleistocene time. The frequency of fault occurrence was estimated by counting the number of faults that exist in Plio-Pleistocene strata widely distributed in the large plains of Japan. Three plains, the Kanto, Nobi and Osaka Plains, were selected for this purpose because highly precise geological profiles, prepared from numerous geological drillings and geophysical investigations, are available for them. (authors)

  9. Estimating the level of dynamical noise in time series by using fractal dimensions

    Energy Technology Data Exchange (ETDEWEB)

    Sase, Takumi, E-mail: sase@sat.t.u-tokyo.ac.jp [Graduate School of Information Science and Technology, The University of Tokyo, Tokyo 153-8505 (Japan); Ramírez, Jonatán Peña [CONACYT Research Fellow, Center for Scientific Research and Higher Education at Ensenada (CICESE), Carretera Ensenada-Tijuana No. 3918, Zona Playitas, C.P. 22860, Ensenada, Baja California (Mexico); Kitajo, Keiichi [BSI-Toyota Collaboration Center, RIKEN Brain Science Institute, Wako, Saitama 351-0198 (Japan); Aihara, Kazuyuki; Hirata, Yoshito [Graduate School of Information Science and Technology, The University of Tokyo, Tokyo 153-8505 (Japan); Institute of Industrial Science, The University of Tokyo, Tokyo 153-8505 (Japan)

    2016-03-11

    We present a method for estimating the dynamical noise level of a ‘short’ time series even if the dynamical system is unknown. The proposed method estimates the level of dynamical noise by calculating the fractal dimensions of the time series. Additionally, the method is applied to EEG data to demonstrate its possible effectiveness as an indicator of temporal changes in the level of dynamical noise. - Highlights: • A dynamical noise level estimator for time series is proposed. • The estimator does not need any information about the dynamics generating the time series. • The estimator is based on a novel definition of time series dimension (TSD). • It is demonstrated that there exists a monotonic relationship between the TSD and the level of dynamical noise. • We apply the proposed method to human electroencephalographic data.

  10. Estimating the level of dynamical noise in time series by using fractal dimensions

    International Nuclear Information System (INIS)

    Sase, Takumi; Ramírez, Jonatán Peña; Kitajo, Keiichi; Aihara, Kazuyuki; Hirata, Yoshito

    2016-01-01

    We present a method for estimating the dynamical noise level of a ‘short’ time series even if the dynamical system is unknown. The proposed method estimates the level of dynamical noise by calculating the fractal dimensions of the time series. Additionally, the method is applied to EEG data to demonstrate its possible effectiveness as an indicator of temporal changes in the level of dynamical noise. - Highlights: • A dynamical noise level estimator for time series is proposed. • The estimator does not need any information about the dynamics generating the time series. • The estimator is based on a novel definition of time series dimension (TSD). • It is demonstrated that there exists a monotonic relationship between the TSD and the level of dynamical noise. • We apply the proposed method to human electroencephalographic data.

  11. A New Modified Histogram Matching Normalization for Time Series Microarray Analysis.

    Science.gov (United States)

    Astola, Laura; Molenaar, Jaap

    2014-07-01

    Microarray data is often utilized in inferring regulatory networks. Quantile normalization (QN) is a popular method to reduce array-to-array variation. We show that in the context of time series measurements QN may not be the best choice for this task, especially not if the inference is based on a continuous time ODE model. We propose an alternative normalization method that is better suited for network inference from time series data.
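    For reference, the quantile normalization baseline discussed above can be written in a few lines of numpy; the array orientation (genes in rows, arrays/time points in columns) is an assumption:

```python
import numpy as np

def quantile_normalize(x):
    """Force every column (array) to share the same distribution:
    replace each value by the mean of the values having the same rank."""
    ranks = np.argsort(np.argsort(x, axis=0), axis=0)   # rank of each entry within its column
    reference = np.mean(np.sort(x, axis=0), axis=1)     # mean of the column-wise sorted values
    return reference[ranks]

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    expr = rng.lognormal(mean=2.0, sigma=1.0, size=(1000, 6))     # 1000 genes x 6 time points
    norm = quantile_normalize(expr)
    print(np.allclose(np.sort(norm[:, 0]), np.sort(norm[:, 1])))  # columns now share quantiles
```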

  12. Quantifying evolutionary dynamics from variant-frequency time series

    Science.gov (United States)

    Khatri, Bhavin S.

    2016-09-01

    From Kimura’s neutral theory of protein evolution to Hubbell’s neutral theory of biodiversity, quantifying the relative importance of neutrality versus selection has long been a basic question in evolutionary biology and ecology. With deep sequencing technologies, this question is taking on a new form: given a time-series of the frequency of different variants in a population, what is the likelihood that the observation has arisen due to selection or neutrality? To tackle the 2-variant case, we exploit Fisher’s angular transformation, which, despite being discovered by Ronald Fisher a century ago, has remained an intellectual curiosity. We show that, together with a heuristic approach, it provides a simple solution for the transition probability density at short times, including drift, selection and mutation. Our results show that under strong selection and sufficiently frequent sampling these evolutionary parameters can be accurately determined from simulation data, and so they provide a theoretical basis for techniques to detect selection from variant or polymorphism frequency time-series.
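    The angular transformation itself is a one-liner; the sketch below (with an assumed factor-of-two convention and a toy Wright-Fisher trajectory standing in for real variant-frequency data) shows how it approximately stabilizes the variance of drift increments:

```python
import numpy as np

def angular_transform(p):
    """Fisher's angular transformation: phi = 2*arcsin(sqrt(p)).
    It approximately stabilizes the variance of drift in the allele frequency p."""
    return 2.0 * np.arcsin(np.sqrt(p))

def wright_fisher(p0=0.3, n=500, s=0.02, generations=100, seed=5):
    """Toy 2-variant Wright-Fisher trajectory with selection coefficient s."""
    rng = np.random.default_rng(seed)
    p = np.empty(generations)
    p[0] = p0
    for t in range(1, generations):
        w = p[t - 1] * (1 + s) / (p[t - 1] * (1 + s) + (1 - p[t - 1]))  # selection
        p[t] = rng.binomial(n, w) / n                                    # binomial drift
    return p

if __name__ == "__main__":
    p = wright_fisher()
    phi = angular_transform(p)
    # per-generation increments of phi have roughly constant variance (~1/N)
    print(np.var(np.diff(phi)), 1.0 / 500)
```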

  13. RankExplorer: Visualization of Ranking Changes in Large Time Series Data.

    Science.gov (United States)

    Shi, Conglei; Cui, Weiwei; Liu, Shixia; Xu, Panpan; Chen, Wei; Qu, Huamin

    2012-12-01

    For many applications involving time series data, people are often interested in the changes of item values over time as well as their ranking changes. For example, people search many words via search engines like Google and Bing every day. Analysts are interested in both the absolute searching number for each word as well as their relative rankings. Both sets of statistics may change over time. For very large time series data with thousands of items, how to visually present ranking changes is an interesting challenge. In this paper, we propose RankExplorer, a novel visualization method based on ThemeRiver to reveal the ranking changes. Our method consists of four major components: 1) a segmentation method which partitions a large set of time series curves into a manageable number of ranking categories; 2) an extended ThemeRiver view with embedded color bars and changing glyphs to show the evolution of aggregation values related to each ranking category over time as well as the content changes in each ranking category; 3) a trend curve to show the degree of ranking changes over time; 4) rich user interactions to support interactive exploration of ranking changes. We have applied our method to some real time series data and the case studies demonstrate that our method can reveal the underlying patterns related to ranking changes which might otherwise be obscured in traditional visualizations.

  14. ESTIMATING RELIABILITY OF DISTURBANCES IN SATELLITE TIME SERIES DATA BASED ON STATISTICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Z.-G. Zhou

    2016-06-01

    Full Text Available Normally, the status of land cover is inherently dynamic and changes continuously on a temporal scale. However, disturbances or abnormal changes of land cover, caused by events such as forest fire, flood, deforestation, and plant disease, occur worldwide at unknown times and locations. Timely detection and characterization of these disturbances is of importance for land cover monitoring. Recently, many time-series-analysis methods have been developed for near real-time or online disturbance detection, using satellite image time series. However, the detection results are only labelled with "Change/No change" by most of the present methods, while few methods focus on estimating the reliability (or confidence level) of the detected disturbances in image time series. To this end, this paper proposes a statistical analysis method for estimating the reliability of disturbances in newly available remote sensing image time series, through analysis of the full temporal information contained in the time series data. The method consists of three main steps. (1) Segmenting and modelling of historical time series data based on Breaks for Additive Seasonal and Trend (BFAST). (2) Forecasting and detecting disturbances in new time series data. (3) Estimating the reliability of each detected disturbance using statistical analysis based on Confidence Intervals (CI) and Confidence Levels (CL). The method was validated by estimating the reliability of disturbance regions caused by a recent severe flood that occurred around the border of Russia and China. Results demonstrated that the method can estimate the reliability of disturbances detected in satellite image time series with an estimation error of less than 5% and an overall accuracy of up to 90%.

  15. Wavelet entropy of BOLD time series: An application to Rolandic epilepsy.

    Science.gov (United States)

    Gupta, Lalit; Jansen, Jacobus F A; Hofman, Paul A M; Besseling, René M H; de Louw, Anton J A; Aldenkamp, Albert P; Backes, Walter H

    2017-12-01

    To assess the wavelet entropy for the characterization of intrinsic aberrant temporal irregularities in the time series of resting-state blood-oxygen-level-dependent (BOLD) signal fluctuations. Further, to evaluate the temporal irregularities (disorder/order) on a voxel-by-voxel basis in the brains of children with Rolandic epilepsy. The BOLD time series was decomposed using the discrete wavelet transform and the wavelet entropy was calculated. Using a model time series consisting of multiple harmonics and nonstationary components, the wavelet entropy was compared with Shannon and spectral (Fourier-based) entropy. As an application, the wavelet entropy in 22 children with Rolandic epilepsy was compared to 22 age-matched healthy controls. The images were obtained by performing resting-state functional magnetic resonance imaging (fMRI) using a 3T system, an 8-element receive-only head coil, and an echo planar imaging pulse sequence (T2*-weighted). The wavelet entropy was also compared to spectral entropy, regional homogeneity, and Shannon entropy. Wavelet entropy was found to identify the nonstationary components of the model time series. In Rolandic epilepsy patients, a significantly elevated wavelet entropy was observed relative to controls for the whole cerebrum (P = 0.03). Spectral entropy (P = 0.41), regional homogeneity (P = 0.52), and Shannon entropy (P = 0.32) did not reveal significant differences. The wavelet entropy measure appeared more sensitive in detecting abnormalities in cerebral fluctuations represented by nonstationary effects in the BOLD time series than more conventional measures. This effect was observed in the model time series as well as in Rolandic epilepsy. These observations suggest that the brains of children with Rolandic epilepsy exhibit stronger nonstationary temporal signal fluctuations than controls. Level of Evidence: 2. Technical Efficacy: Stage 3. J. Magn. Reson. Imaging 2017;46:1728-1737. © 2017 International Society for Magnetic
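    A minimal sketch of a wavelet-entropy computation for a single time series is shown below, using PyWavelets; the wavelet family, decomposition depth and toy signals are assumptions and are not claimed to match the study's settings:

```python
import numpy as np
import pywt

def wavelet_entropy(signal, wavelet="db4", level=5):
    """Shannon entropy of the relative energy across wavelet decomposition levels."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    p = energies / energies.sum()
    return -np.sum(p * np.log(p))

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    t = np.linspace(0, 10, 1024)
    stationary = np.sin(2 * np.pi * 1.5 * t) + 0.1 * rng.normal(size=t.size)
    # nonstationary variant: the same oscillation plus a transient high-frequency burst
    nonstationary = stationary + np.where((t > 4) & (t < 5), np.sin(2 * np.pi * 12 * t), 0.0)
    print(wavelet_entropy(stationary), wavelet_entropy(nonstationary))
```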

  16. Time Series Forecasting with Missing Values

    OpenAIRE

    Shin-Fu Wu; Chia-Yung Chang; Shie-Jue Lee

    2015-01-01

    Time series prediction has become more popular in various kinds of applications such as weather prediction, control engineering, financial analysis, industrial monitoring, etc. To deal with real-world problems, we are often faced with missing values in the data due to sensor malfunctions or human errors. Traditionally, the missing values are simply omitted or replaced by means of imputation methods. However, omitting those missing values may cause temporal discontinuity. Imputation methods, o...

  17. Detection of chaotic determinism in time series from randomly forced maps

    Science.gov (United States)

    Chon, K. H.; Kanters, J. K.; Cohen, R. J.; Holstein-Rathlou, N. H.

    1997-01-01

    Time series from biological systems often display fluctuations in the measured variables. Much effort has been directed at determining whether this variability reflects deterministic chaos, or whether it is merely "noise". Despite this effort, it has been difficult to establish the presence of chaos in time series from biological systems. The output from a biological system is probably the result of both its internal dynamics and the input to the system from the surroundings. This implies that the system should be viewed as a mixed system with both stochastic and deterministic components. We present a method that appears to be useful in deciding whether determinism is present in a time series, and if this determinism has chaotic attributes, i.e., a positive characteristic exponent that leads to sensitivity to initial conditions. The method relies on fitting a nonlinear autoregressive model to the time series followed by an estimation of the characteristic exponents of the model over the observed probability distribution of states for the system. The method is tested by computer simulations, and applied to heart rate variability data.

  18. Time-series modeling: applications to long-term finfish monitoring data

    International Nuclear Information System (INIS)

    Bireley, L.E.

    1985-01-01

    The growing concern and awareness that developed during the 1970s over the effects that industry had on the environment caused the electric utility industry in particular to develop monitoring programs. These programs generate long-term series of data that are not very amenable to classical normal-theory statistical analysis. The monitoring data collected from three finfish programs (impingement, trawl and seine) at the Millstone Nuclear Power Station were typical of such series and thus were used to develop methodology that used the full extent of the information in the series. The basis of the methodology was classic Box-Jenkins time-series modeling; however, the models also included deterministic components that involved flow, season and time as predictor variables. Time entered into the models as harmonic regression terms. Of the 32 models fitted to finfish catch data, 19 were found to account for more than 70% of the historical variation. The models were then used to forecast finfish catches a year in advance and comparisons were made to actual data. Usually the confidence intervals associated with the forecasts encompassed most of the observed data. The technique can provide the basis for intervention analysis in future impact assessments
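    A modern equivalent of that model structure can be sketched with statsmodels: ARMA errors around deterministic harmonic (seasonal) regressors plus an exogenous flow covariate. The synthetic monthly data, model orders and single annual harmonic are illustrative assumptions only:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(7)
n = 120                                   # ten years of synthetic monthly catch data
month = np.arange(n)
flow = 50 + 10 * rng.normal(size=n)       # hypothetical river-flow covariate
catch = pd.Series(20 + 8 * np.sin(2 * np.pi * month / 12)
                  + 3 * np.cos(2 * np.pi * month / 12)
                  + 0.1 * flow + rng.normal(scale=2.0, size=n))

# deterministic components: one annual harmonic, a linear time term, and flow
exog = pd.DataFrame({
    "sin12": np.sin(2 * np.pi * month / 12),
    "cos12": np.cos(2 * np.pi * month / 12),
    "time": month,
    "flow": flow,
})

# ARMA(1,1) errors around the deterministic regression
model = SARIMAX(catch, exog=exog, order=(1, 0, 1), trend="c")
res = model.fit(disp=False)

# forecast twelve months ahead, supplying future values of the regressors
future_month = np.arange(n, n + 12)
future_exog = pd.DataFrame({
    "sin12": np.sin(2 * np.pi * future_month / 12),
    "cos12": np.cos(2 * np.pi * future_month / 12),
    "time": future_month,
    "flow": np.full(12, flow.mean()),
})
forecast = res.get_forecast(steps=12, exog=future_exog)
print(forecast.predicted_mean.values[:3])   # point forecasts
print(forecast.conf_int().values[:3])       # confidence intervals, as in the abstract
```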

  19. Time Series Imputation via L1 Norm-Based Singular Spectrum Analysis

    Science.gov (United States)

    Kalantari, Mahdi; Yarmohammadi, Masoud; Hassani, Hossein; Silva, Emmanuel Sirimal

    Missing values in time series data is a well-known and important problem which many researchers have studied extensively in various fields. In this paper, a new nonparametric approach for missing value imputation in time series is proposed. The main novelty of this research is applying the L1 norm-based version of Singular Spectrum Analysis (SSA), namely L1-SSA which is robust against outliers. The performance of the new imputation method has been compared with many other established methods. The comparison is done by applying them to various real and simulated time series. The obtained results confirm that the SSA-based methods, especially L1-SSA can provide better imputation in comparison to other methods.

  20. Multiple Time Series Forecasting Using Quasi-Randomized Functional Link Neural Networks

    Directory of Open Access Journals (Sweden)

    Thierry Moudiki

    2018-03-01

    Full Text Available We are interested in obtaining forecasts for multiple time series, by taking into account the potential nonlinear relationships between their observations. For this purpose, we use a specific type of regression model on an augmented dataset of lagged time series. Our model is inspired by dynamic regression models (Pankratz 2012), with the response variable’s lags included as predictors, and is known as Random Vector Functional Link (RVFL) neural networks. The RVFL neural networks have been successfully applied in the past to solving regression and classification problems. The novelty of our approach is to apply an RVFL model to multivariate time series, under two separate regularization constraints on the regression parameters.
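    A bare-bones RVFL regressor for one-step-ahead forecasting of a lagged series can be sketched in numpy as below (fixed random hidden weights, a tanh expansion, direct input-output links and ridge-regularized output weights). The lag order, hidden width, regularization constant and single-series setting are illustrative assumptions, not the paper's multivariate specification:

```python
import numpy as np

class RVFL:
    """Random Vector Functional Link: fixed random hidden layer plus direct links,
    with only the output weights learned by ridge regression."""

    def __init__(self, n_hidden=50, ridge=1e-2, seed=0):
        self.n_hidden, self.ridge, self.rng = n_hidden, ridge, np.random.default_rng(seed)

    def _features(self, X):
        H = np.tanh(X @ self.W + self.b)                  # random nonlinear expansion
        return np.hstack([X, H, np.ones((len(X), 1))])    # direct links + hidden units + bias

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        Z = self._features(X)
        # ridge solution: beta = (Z'Z + lambda*I)^-1 Z'y
        self.beta = np.linalg.solve(Z.T @ Z + self.ridge * np.eye(Z.shape[1]), Z.T @ y)
        return self

    def predict(self, X):
        return self._features(X) @ self.beta

def lagged(x, p=5):
    X = np.column_stack([x[i:len(x) - p + i] for i in range(p)])
    return X, x[p:]

if __name__ == "__main__":
    rng = np.random.default_rng(8)
    t = np.linspace(0, 30, 800)
    series = np.sin(t) + 0.3 * np.sin(2.3 * t) + 0.1 * rng.normal(size=t.size)
    X, y = lagged(series)
    model = RVFL().fit(X[:600], y[:600])
    pred = model.predict(X[600:])
    print("RMSE:", np.sqrt(np.mean((pred - y[600:]) ** 2)))
```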

  1. A novel time series link prediction method: Learning automata approach

    Science.gov (United States)

    Moradabadi, Behnaz; Meybodi, Mohammad Reza

    2017-09-01

    Link prediction is a main social network challenge that uses the network structure to predict future links. Common link prediction approaches use a static graph representation, in which a snapshot of the network is analyzed to find hidden or future links. For example, similarity-metric-based link prediction is a common traditional approach that calculates a similarity metric for each non-connected link, sorts the links based on their similarity metrics and labels the links with higher similarity scores as the future links. Because people's activities in social networks are dynamic and uncertain, and the structure of the networks changes over time, using deterministic graphs for modeling and analysis of the social network may not be appropriate. In the time-series link prediction problem, the time series of link occurrences is used to predict the future links. In this paper, we propose a new time series link prediction method based on learning automata. In the proposed algorithm, for each link that must be predicted there is one learning automaton, and each learning automaton tries to predict the existence or non-existence of the corresponding link. To predict the link occurrence at time T, there is a chain consisting of stages 1 through T - 1, and the learning automaton passes through these stages to learn the existence or non-existence of the corresponding link. Our preliminary link prediction experiments with co-authorship and email networks have provided satisfactory results when time series link occurrences are considered.

  2. Topological data analysis of financial time series: Landscapes of crashes

    Science.gov (United States)

    Gidea, Marian; Katz, Yuri

    2018-02-01

    We explore the evolution of daily returns of four major US stock market indices during the technology crash of 2000, and the financial crisis of 2007-2009. Our methodology is based on topological data analysis (TDA). We use persistent homology to detect and quantify topological patterns that appear in multidimensional time series. Using a sliding window, we extract time-dependent point cloud data sets, to which we associate a topological space. We detect transient loops that appear in this space, and we measure their persistence. This is encoded in real-valued functions referred to as 'persistence landscapes'. We quantify the temporal changes in persistence landscapes via their Lp-norms. We test this procedure on multidimensional time series generated by various non-linear and non-equilibrium models. We find that, in the vicinity of financial meltdowns, the Lp-norms exhibit strong growth prior to the primary peak, which ascends during a crash. Remarkably, the average spectral density at low frequencies of the time series of Lp-norms of the persistence landscapes demonstrates a strong rising trend for 250 trading days prior to either the dotcom crash on 03/10/2000 or the Lehman bankruptcy on 09/15/2008. Our study suggests that TDA provides a new type of econometric analysis, which complements the standard statistical measures. The method can be used to detect early warning signals of imminent market crashes. We believe that this approach can be used beyond the analysis of financial time series presented here.

  3. Snapshots of the Greenland ice sheet configuration in the Pliocene to early Pleistocene

    DEFF Research Database (Denmark)

    Solgaard, Anne M.; Reeh, Niels; Japsen, Peter

    2011-01-01

    The geometry of the ice sheets during the Pliocene to early Pleistocene is not well constrained. Here we apply an ice-flow model in the study of the Greenland ice sheet (GIS) during three extreme intervals of this period constrained by geological observations and climate reconstructions. We study...... the extent of the GIS during the Mid-Pliocene Warmth (3.3-3.0 Ma), its advance across the continental shelf during the late Pliocene to early Pleistocene glaciations (3.0-2.4 Ma) as implied by offshore geological studies, and the transition from glacial to interglacial conditions around 2.4 Ma as deduced...... the variability of the GIS during the Pliocene to early Pleistocene and underline the importance of including independent estimates of the GIS in studies of climate during this period. We conclude that the GIS did not exist throughout the Pliocene to early Pleistocene, and that it melted during interglacials even...

  4. Properties of Asymmetric Detrended Fluctuation Analysis in the time series of RR intervals

    Science.gov (United States)

    Piskorski, J.; Kosmider, M.; Mieszkowski, D.; Krauze, T.; Wykretowicz, A.; Guzik, P.

    2018-02-01

    Heart rate asymmetry is a phenomenon by which the accelerations and decelerations of heart rate behave differently, and this difference is consistent and unidirectional, i.e. in most of the analyzed recordings the inequalities have the same directions. So far, it has been established for variance and runs based types of descriptors of RR intervals time series. In this paper we apply the newly developed method of Asymmetric Detrended Fluctuation Analysis, which so far has mainly been used with economic time series, to the set of 420 stationary 30 min time series of RR intervals from young, healthy individuals aged between 20 and 40. This asymmetric approach introduces separate scaling exponents for rising and falling trends. We systematically study the presence of asymmetry in both global and local versions of this method. In this study global means "applying to the whole time series" and local means "applying to windows jumping along the recording". It is found that the correlation structure of the fluctuations left over after detrending in physiological time series shows strong asymmetric features in both magnitude, with α+ physiological data after shuffling or with a group of symmetric synthetic time series.
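    A compact numpy sketch of the asymmetric DFA idea follows: windows are labelled rising or falling by the sign of a linear fit to the original series within each window, and separate fluctuation functions and scaling exponents are then computed. The window sizes, the labelling rule and the synthetic RR-like series are assumptions:

```python
import numpy as np

def asymmetric_dfa(x, scales=(8, 16, 32, 64, 128)):
    """DFA with separate scaling exponents alpha+ / alpha- for windows in which
    the original series has a rising or falling linear trend."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    t = np.arange(len(x), dtype=float)
    used_scales, f_plus, f_minus = [], [], []
    for s in scales:
        sq_plus, sq_minus = [], []
        for w in range(len(x) // s):
            idx = slice(w * s, (w + 1) * s)
            coef = np.polyfit(t[idx], y[idx], 1)          # detrend the profile in the window
            resid = y[idx] - np.polyval(coef, t[idx])
            slope = np.polyfit(t[idx], x[idx], 1)[0]      # sign of the *original* series trend
            (sq_plus if slope >= 0 else sq_minus).append(np.mean(resid ** 2))
        if sq_plus and sq_minus:
            used_scales.append(s)
            f_plus.append(np.sqrt(np.mean(sq_plus)))
            f_minus.append(np.sqrt(np.mean(sq_minus)))
    log_s = np.log(used_scales)
    alpha_plus = np.polyfit(log_s, np.log(f_plus), 1)[0]
    alpha_minus = np.polyfit(log_s, np.log(f_minus), 1)[0]
    return alpha_plus, alpha_minus

if __name__ == "__main__":
    rng = np.random.default_rng(9)
    rr = 800 + np.cumsum(rng.normal(size=4096))   # toy RR-interval-like series (ms)
    print(asymmetric_dfa(rr))
```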

  5. Patterns of myoxid evolution in the Pliocene and Pleistocene of Europe

    Directory of Open Access Journals (Sweden)

    Adam Nadachowski

    1995-05-01

    Full Text Available Abstract The origin of recent species belonging to the genera Myoxus, Muscardinus, Glirulus, Eliomys, Dryomys and Myomimus is discussed. Evolution of myoxids in the Pliocene and Pleistocene is expressed by gradual size increase of their cheek teeth. No gradual change in the dental pattern is observed. Riassunto (Italian summary) - Patterns of myoxid evolution in the Pliocene and Pleistocene of Europe: The origin of the recent species belonging to the genera Myoxus, Muscardinus, Glirulus, Eliomys, Dryomys and Myomimus is discussed. The evolution of myoxids in the Pliocene and Pleistocene is expressed by a gradual increase in the size of their molars. No gradual change in the dental pattern has been observed.

  6. A New Modified Histogram Matching Normalization for Time Series Microarray Analysis

    Directory of Open Access Journals (Sweden)

    Laura Astola

    2014-07-01

    Full Text Available Microarray data is often utilized in inferring regulatory networks. Quantile normalization (QN is a popular method to reduce array-to-array variation. We show that in the context of time series measurements QN may not be the best choice for this task, especially not if the inference is based on continuous time ODE model. We propose an alternative normalization method that is better suited for network inference from time series data.

  7. Extent, timing, and climatic significance of latest Pleistocene and Holocene glaciation in the Sierra Nevada, California

    Energy Technology Data Exchange (ETDEWEB)

    Clark, Douglas Howe [Univ. of Washington, Seattle, WA (United States)

    1995-01-01

    Despite more than a century of study, scant attention has been paid to the glacial record in the northern end of the Sierra Nevada, and to the smaller moraines deposited after the retreat of the Tioga (last glacial maximum) glaciers. Equilibrium-line altitude (ELA) estimates of the ice fields indicate that the Tioga ELA gradients there are consistent with similar estimates for the southern half of the range, and with an intensification of the modern temperature/precipitation pattern in the region. The Recess Peak advance has traditionally been considered to be mid-Neoglacial age, about 2,000-3,000 yr B.P., on the basis of relative weathering estimates. Sediment cores of lakes dammed behind moraines correlative with Recess Peak in four widely spaced sites yield a series of high-resolution AMS radiocarbon dates which demonstrate that Recess Peak glaciers retreated before ~13,100 cal yr B.P. This minimum limiting age indicates that the advance predates the North Atlantic Younger Dryas cooling. It also implies that there have been no advances larger than the Matthes in the roughly 12,000 year interval between it and the Recess Peak advance. This finding casts doubt on several recent studies that claim Younger Dryas glacier advances in western North America. The 13,100 cal yr B.P. date is also a minimum age for deglaciation of the sample sites used to calibrate the in situ production rates of cosmogenic 10Be and 26Al. The discrepancy between this age and the 11,000 cal yr B.P. exposure age assumed in the original calibration introduces a large (> 19%) potential error in late-Pleistocene exposure ages calculated using these production rates.

  8. AFSC/ABL: Ugashik sockeye salmon scale time series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A time series of scale samples (1956 b?? 2002) collected from adult sockeye salmon returning to Ugashik River were retrieved from the Alaska Department of Fish and...

  9. AFSC/ABL: Naknek sockeye salmon scale time series

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A time series of scale samples (1956 2002) collected from adult sockeye salmon returning to Naknek River were retrieved from the Alaska Department of Fish and Game....

  10. Time series regression model for infectious disease and weather.

    Science.gov (United States)

    Imai, Chisato; Armstrong, Ben; Chalabi, Zaid; Mangtani, Punam; Hashizume, Masahiro

    2015-10-01

    Time series regression has been developed and long used to evaluate the short-term associations of air pollution and weather with mortality or morbidity of non-infectious diseases. The application of the regression approaches from this tradition to infectious diseases, however, is less well explored and raises some new issues. We discuss and present potential solutions for five issues often arising in such analyses: changes in immune population, strong autocorrelations, a wide range of plausible lag structures and association patterns, seasonality adjustments, and large overdispersion. The potential approaches are illustrated with datasets of cholera cases and rainfall from Bangladesh and influenza and temperature in Tokyo. Though this article focuses on the application of the traditional time series regression to infectious diseases and weather factors, we also briefly introduce alternative approaches, including mathematical modeling, wavelet analysis, and autoregressive integrated moving average (ARIMA) models. Modifications proposed to standard time series regression practice include using sums of past cases as proxies for the immune population, and using the logarithm of lagged disease counts to control autocorrelation due to true contagion, both of which are motivated from "susceptible-infectious-recovered" (SIR) models. The complexity of lag structures and association patterns can often be informed by biological mechanisms and explored by using distributed lag non-linear models. For overdispersed models, alternative distribution models such as quasi-Poisson and negative binomial should be considered. Time series regression can be used to investigate dependence of infectious diseases on weather, but may need modifying to allow for features specific to this context. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
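    A stripped-down sketch in the spirit of the modifications described above (a Poisson GLM of weekly counts on lagged temperature, Fourier terms for seasonality, the log of lagged cases as a contagion/autocorrelation proxy, and Pearson-based dispersion for quasi-Poisson behaviour) might look as follows; the data, lags and specification are all assumptions:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(10)
weeks = np.arange(260)  # five years of synthetic weekly data
temp = 20 + 8 * np.sin(2 * np.pi * weeks / 52) + rng.normal(scale=1.5, size=weeks.size)
# synthetic counts with seasonality and temperature dependence
lam = np.exp(1.5 + 0.03 * temp + 0.6 * np.sin(2 * np.pi * weeks / 52))
cases = rng.poisson(lam)

df = pd.DataFrame({"cases": cases, "temp": temp, "week": weeks})
df["temp_lag2"] = df["temp"].shift(2)                     # lagged exposure
df["log_cases_lag1"] = np.log(df["cases"].shift(1) + 1)   # autocorrelation / contagion proxy
df["sin52"] = np.sin(2 * np.pi * df["week"] / 52)         # seasonality adjustment
df["cos52"] = np.cos(2 * np.pi * df["week"] / 52)
df = df.dropna()

X = sm.add_constant(df[["temp_lag2", "log_cases_lag1", "sin52", "cos52"]])
# quasi-Poisson behaviour: Poisson GLM with dispersion estimated from Pearson chi-square
model = sm.GLM(df["cases"], X, family=sm.families.Poisson())
result = model.fit(scale="X2")
print(result.summary().tables[1])
```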

  11. Assessing Coupling Dynamics from an Ensemble of Time Series

    Directory of Open Access Journals (Sweden)

    Germán Gómez-Herrero

    2015-04-01

    Finding interdependency relations between time series provides valuable knowledge about the processes that generated the signals. Information theory sets a natural framework for important classes of statistical dependencies. However, a reliable estimation from information-theoretic functionals is hampered when the dependency to be assessed is brief or evolves in time. Here, we show that these limitations can be partly alleviated when we have access to an ensemble of independent repetitions of the time series. In particular, we gear a data-efficient estimator of probability densities to make use of the full structure of trial-based measures. By doing so, we can obtain time-resolved estimates for a family of entropy combinations (including mutual information, transfer entropy and their conditional counterparts), which are more accurate than the simple average of individual estimates over trials. We show with simulated and real data generated by coupled electronic circuits that the proposed approach allows one to recover the time-resolved dynamics of the coupling between different subsystems.
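
    The core idea of pooling values across repeated trials at each time point, so that a dependency measure can be resolved in time, can be illustrated as follows. This toy example uses a nearest-neighbour mutual information estimator from scikit-learn rather than the data-efficient ensemble estimator developed in the paper, so it demonstrates only the trial-pooling strategy.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(1)
n_trials, n_times, lag = 200, 150, 5

# Synthetic ensemble: y is driven by x with a 5-sample delay, but only during
# the second half of each trial (a coupling that evolves in time).
x = rng.standard_normal((n_trials, n_times))
y = rng.standard_normal((n_trials, n_times))
y[:, n_times // 2 + lag:] += 0.8 * x[:, n_times // 2:-lag]

# Time-resolved mutual information: at each time t, estimate MI between x(t)
# and y(t + lag) from the values pooled across the ensemble of trials.
mi = np.zeros(n_times - lag)
for t in range(n_times - lag):
    mi[t] = mutual_info_regression(
        x[:, t].reshape(-1, 1), y[:, t + lag], n_neighbors=4, random_state=0
    )[0]

print("mean MI, first half of the trial :", mi[: n_times // 2 - lag].mean())
print("mean MI, second half of the trial:", mi[n_times // 2:].mean())
```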

  12. Wavelet transform approach for fitting financial time series data

    Science.gov (United States)

    Ahmed, Amel Abdoullah; Ismail, Mohd Tahir

    2015-10-01

    This study investigates a newly developed technique, a combined wavelet filtering and VEC model, to study the dynamic relationship among financial time series. A wavelet filter has been used to remove noise from the daily data set of the NASDAQ stock market of the US and three stock markets of the Middle East and North Africa (MENA) region, namely Egypt, Jordan, and Istanbul. The data cover the period from 6/29/2001 to 5/5/2009. After that, the returns of the wavelet-filtered series and of the original series are analyzed by a cointegration test and a VEC model. The results show that the cointegration test affirms the existence of cointegration between the studied series, and that there is a long-term relationship between the US stock market and the MENA stock markets. A comparison between the proposed model and the traditional model demonstrates that the proposed model (DWT with VEC model) outperforms the traditional model (VEC model) in fitting the financial stock market series, and reveals information about the relationships among the stock markets.
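
    The workflow described (wavelet filtering of the series, then cointegration testing and a VEC model) can be sketched in Python with PyWavelets and statsmodels. The wavelet, thresholding rule, and synthetic data below are illustrative assumptions, not the settings used in the study.

```python
import numpy as np
import pywt
from statsmodels.tsa.vector_ar.vecm import VECM, coint_johansen

rng = np.random.default_rng(2)
n = 1500

# Two synthetic cointegrated (log-)price series sharing a common stochastic trend.
trend = np.cumsum(rng.standard_normal(n))
prices = np.column_stack([
    trend + rng.standard_normal(n),
    0.8 * trend + rng.standard_normal(n),
])

def wavelet_denoise(x, wavelet="db4", level=3):
    """Soft-threshold the detail coefficients (universal threshold)."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(len(x)))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

denoised = np.column_stack([wavelet_denoise(prices[:, i]) for i in range(2)])

# Johansen cointegration test on the filtered series ...
jres = coint_johansen(denoised, det_order=0, k_ar_diff=1)
print("trace statistics:    ", jres.lr1)
print("95% critical values: ", jres.cvt[:, 1])

# ... followed by a vector error-correction model.
vecm_res = VECM(denoised, k_ar_diff=1, coint_rank=1, deterministic="co").fit()
print(vecm_res.summary())
```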

  13. Evolution of the Sunspot Number and Solar Wind B Time Series

    Science.gov (United States)

    Cliver, Edward W.; Herbst, Konstantin

    2018-03-01

    The past two decades have witnessed significant changes in our knowledge of long-term solar and solar wind activity. The sunspot number time series (1700-present) developed by Rudolf Wolf during the second half of the 19th century was revised and extended by the group sunspot number series (1610-1995) of Hoyt and Schatten during the 1990s. The group sunspot number is significantly lower than the Wolf series before ˜1885. An effort from 2011-2015 to understand and remove differences between these two series via a series of workshops had the unintended consequence of prompting several alternative constructions of the sunspot number. Thus it has been necessary to expand and extend the sunspot number reconciliation process. On the solar wind side, after a decade of controversy, an ISSI International Team used geomagnetic and sunspot data to obtain a high-confidence time series of the solar wind magnetic field strength (B) from 1750-present that can be compared with two independent long-term (> ˜600 year) series of annual B-values based on cosmogenic nuclides. In this paper, we trace the twists and turns leading to our current understanding of long-term solar and solar wind activity.

  14. Pleistocene cave art from Sulawesi, Indonesia.

    Science.gov (United States)

    Aubert, M; Brumm, A; Ramli, M; Sutikna, T; Saptomo, E W; Hakim, B; Morwood, M J; van den Bergh, G D; Kinsley, L; Dosseto, A

    2014-10-09

    Archaeologists have long been puzzled by the appearance in Europe ∼40-35 thousand years (kyr) ago of a rich corpus of sophisticated artworks, including parietal art (that is, paintings, drawings and engravings on immobile rock surfaces) and portable art (for example, carved figurines), and the absence or scarcity of equivalent, well-dated evidence elsewhere, especially along early human migration routes in South Asia and the Far East, including Wallacea and Australia, where modern humans (Homo sapiens) were established by 50 kyr ago. Here, using uranium-series dating of coralloid speleothems directly associated with 12 human hand stencils and two figurative animal depictions from seven cave sites in the Maros karsts of Sulawesi, we show that rock art traditions on this Indonesian island are at least compatible in age with the oldest European art. The earliest dated image from Maros, with a minimum age of 39.9 kyr, is now the oldest known hand stencil in the world. In addition, a painting of a babirusa ('pig-deer') made at least 35.4 kyr ago is among the earliest dated figurative depictions worldwide, if not the earliest one. Among the implications, it can now be demonstrated that humans were producing rock art by ∼40 kyr ago at opposite ends of the Pleistocene Eurasian world.

  15. Time irreversibility and intrinsics revealing of series with complex network approach

    Science.gov (United States)

    Xiong, Hui; Shang, Pengjian; Xia, Jianan; Wang, Jing

    2018-06-01

    In this work, we analyze time series on the basis of the visibility graph algorithm, which maps the original series into a graph. By taking into account the full information carried by the signals, the time irreversibility and fractal behavior of the series are evaluated from a complex network perspective, and the signals considered are further classified from different aspects. The reliability of the proposed analysis is supported by numerical simulations on synthesized uncorrelated random noise, short-term correlated chaotic systems and long-term correlated fractal processes, and by an empirical analysis on daily closing prices of eleven worldwide stock indices. The obtained results suggest that finite size has a significant effect on the evaluation, and that there might be no direct relation between the time irreversibility and long-range correlation of series. Similarity and dissimilarity between stock indices are also indicated from respective regional and global perspectives, showing the existence of multiple features of the underlying systems.
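
    For readers unfamiliar with the mapping, the natural visibility graph connects two samples whenever every intermediate sample lies below the straight line of sight joining them; irreversibility and fractality statistics are then computed on the resulting graph. A minimal O(n^2) sketch of the graph construction (not the authors' implementation):

```python
import numpy as np

def natural_visibility_graph(y):
    """Edge list of the natural visibility graph of a 1-D series.

    Samples a < b are connected iff every intermediate sample c satisfies
    y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a), i.e. c lies strictly
    below the line of sight between a and b.
    """
    n = len(y)
    edges = []
    for a in range(n - 1):
        for b in range(a + 1, n):
            cs = np.arange(a + 1, b)
            line_of_sight = y[b] + (y[a] - y[b]) * (b - cs) / (b - a)
            if np.all(y[cs] < line_of_sight):
                edges.append((a, b))
    return edges

# Example: visibility graph of a short noisy oscillation.
rng = np.random.default_rng(3)
series = np.sin(np.linspace(0, 6 * np.pi, 100)) + 0.3 * rng.standard_normal(100)
edges = natural_visibility_graph(series)
degree = np.bincount(np.array(edges).ravel(), minlength=len(series))
print(f"{len(edges)} edges; mean degree {degree.mean():.2f}")
```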

  16. An interactive toolkit to extract phenological time series data from digital repeat photography

    Science.gov (United States)

    Seyednasrollah, B.; Milliman, T. E.; Hufkens, K.; Kosmala, M.; Richardson, A. D.

    2017-12-01

    Near-surface remote sensing and in situ photography are powerful tools to study how climate change and climate variability influence vegetation phenology and the associated seasonal rhythms of green-up and senescence. The rapidly growing PhenoCam network has been using in situ digital repeat photography to study phenology in almost 500 locations around the world, with an emphasis on North America. However, extracting time series data from multiple years of half-hourly imagery, where each set of images may contain several regions of interest (ROIs) corresponding to different species or vegetation types, is not always straightforward. Large volumes of data require substantial processing time, and changes (either intentional or accidental) in the camera field of view require adjustment of ROI masks. Here, we introduce and present "DrawROI" as an interactive web-based application for imagery from PhenoCam. DrawROI can also be used offline, as a fully independent toolkit that significantly facilitates extraction of phenological data from any stack of digital repeat photography images. DrawROI provides a responsive environment for phenological scientists to interactively a) delineate ROIs, b) handle field of view (FOV) shifts, and c) extract and export time series data characterizing image color (i.e. red, green and blue channel digital numbers for the defined ROI). The application utilizes artificial intelligence and advanced machine learning techniques and gives the user the opportunity to redraw new ROIs every time an FOV shift occurs. DrawROI also offers a quality control flag to indicate noisy data and images of low quality due to fog or snow conditions. The web-based application significantly accelerates the process of creating new ROIs and modifying pre-existing ROIs in the PhenoCam database. The offline toolkit is presented as an open source R-package that can be used with similar datasets with time-lapse photography to obtain more data for
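
    DrawROI itself is delivered as a web application and an open-source R package, but the quantity it exports, per-channel digital numbers and (typically) a green chromatic coordinate within an ROI mask, can be illustrated with a short Python sketch. The file paths, mask format, and index formula below are assumptions for illustration only.

```python
import glob
import numpy as np
from PIL import Image

# Hypothetical inputs: a stack of time-lapse JPEGs and a binary ROI mask image
# of the same dimensions (white pixels = inside the region of interest).
image_files = sorted(glob.glob("phenocam_images/*.jpg"))
roi_mask = np.array(Image.open("roi_mask.png").convert("L")) > 128

records = []
for path in image_files:
    img = np.asarray(Image.open(path).convert("RGB"), dtype=float)
    r, g, b = (img[..., k][roi_mask].mean() for k in range(3))
    gcc = g / (r + g + b)  # green chromatic coordinate for the ROI
    records.append((path, r, g, b, gcc))

for path, r, g, b, gcc in records[:5]:
    print(f"{path}: R={r:.1f} G={g:.1f} B={b:.1f} GCC={gcc:.3f}")
```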

  17. Recurrence and symmetry of time series: Application to transition detection

    International Nuclear Information System (INIS)

    Girault, Jean-Marc

    2015-01-01

    Highlights: • A new theoretical framework based on the symmetry concept is proposed. • Four types of symmetry present in any time series were analyzed. • New descriptors make possible the analysis of regime changes in logistic systems. • Chaos–chaos, chaos–periodic, symmetry-breaking, and symmetry-increasing bifurcations can be detected. Abstract: The study of transitions in low-dimensional, nonlinear dynamical systems is a complex problem for which there is not yet a simple, global numerical method able to detect chaos–chaos and chaos–periodic bifurcations as well as symmetry-breaking and symmetry-increasing bifurcations. We present here for the first time a general framework focusing on the symmetry concept of time series that at the same time reveals new kinds of recurrence. We propose several numerical tools based on the symmetry concept allowing both the qualification and quantification of different kinds of possible symmetry. By using several examples based on periodic symmetrical time series and on logistic and cubic maps, we show that it is possible with simple numerical tools to detect a large number of bifurcations of chaos–chaos, chaos–periodic, broken-symmetry and increased-symmetry types.
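
    The descriptors proposed in the article are built on a general symmetry framework; as a much simpler illustration of how a symmetry-based statistic can flag regime changes in the logistic map, the sketch below sweeps the control parameter and evaluates a standard time-reversal asymmetry measure (a normalized third moment of the increments). This is an illustrative stand-in, not the descriptors introduced in the paper.

```python
import numpy as np

def logistic_series(r, n=5000, burn=500, x0=0.4):
    """Iterate the logistic map x -> r * x * (1 - x), discarding a transient."""
    x, out = x0, np.empty(n)
    for i in range(n + burn):
        x = r * x * (1 - x)
        if i >= burn:
            out[i - burn] = x
    return out

def reversal_asymmetry(x):
    """Normalized third moment of the increments; near zero for time-reversible series."""
    dx = np.diff(x)
    return np.mean(dx**3) / np.mean(dx**2) ** 1.5

for r in (3.5, 3.58, 3.8, 4.0):  # periodic -> edge of chaos -> chaotic regimes
    print(f"r = {r:.2f}  asymmetry = {reversal_asymmetry(logistic_series(r)):+.3f}")
```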

  18. Bootstrap Power of Time Series Goodness of fit tests

    Directory of Open Access Journals (Sweden)

    Sohail Chand

    2013-10-01

    In this article, we examine the power of various versions of the Box-Pierce statistic and of the Cramér-von Mises test. An extensive simulation study has been conducted to compare the power of these tests. Algorithms are provided for the power calculations, and a comparison has also been made between the semiparametric bootstrap methods used for time series. Results show that the Box-Pierce statistic and its various versions have good power against linear time series models but poor power against nonlinear models, while the situation is reversed for the Cramér-von Mises test. Moreover, we find that the dynamic bootstrap method performs better than the fixed-design bootstrap method.
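
    The kind of experiment described, estimating the power of a portmanteau test with a residual-based (dynamic) bootstrap, can be sketched as follows. The data-generating process, lag length, and numbers of replications are illustrative; the article compares several versions of the statistics over a much wider simulation design.

```python
import numpy as np

rng = np.random.default_rng(4)

def box_pierce(resid, m=10):
    """Box-Pierce portmanteau statistic Q = n * sum of squared autocorrelations."""
    resid = resid - resid.mean()
    n = len(resid)
    acf = np.array([np.sum(resid[k:] * resid[:-k]) for k in range(1, m + 1)])
    acf /= np.sum(resid**2)
    return n * np.sum(acf**2)

def simulate_power(phi=0.5, theta=0.6, n=200, n_sim=100, n_boot=99, m=10):
    """Power of a bootstrapped Box-Pierce test when an AR(1) is fitted to data
    generated from an ARMA(1,1) alternative (the MA part is the misspecification)."""
    rejections = 0
    for _ in range(n_sim):
        # Generate the alternative: ARMA(1,1) with extra MA structure.
        e = rng.standard_normal(n + 1)
        y = np.empty(n)
        prev = 0.0
        for t in range(n):
            prev = phi * prev + e[t + 1] + theta * e[t]
            y[t] = prev
        # Fit an AR(1) by least squares and compute the observed statistic.
        phi_hat = np.sum(y[1:] * y[:-1]) / np.sum(y[:-1] ** 2)
        resid = y[1:] - phi_hat * y[:-1]
        q_obs = box_pierce(resid, m)
        # Dynamic bootstrap: rebuild the series recursively from resampled residuals,
        # refit the AR(1), and recompute the statistic under the null.
        q_boot = np.empty(n_boot)
        for b in range(n_boot):
            eb = rng.choice(resid, size=n, replace=True)
            yb = np.empty(n)
            prev = 0.0
            for t in range(n):
                prev = phi_hat * prev + eb[t]
                yb[t] = prev
            phi_b = np.sum(yb[1:] * yb[:-1]) / np.sum(yb[:-1] ** 2)
            q_boot[b] = box_pierce(yb[1:] - phi_b * yb[:-1], m)
        rejections += q_obs > np.quantile(q_boot, 0.95)
    return rejections / n_sim

print("estimated power against the ARMA(1,1) alternative:", simulate_power())
```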

  19. The late Middle Pleistocene hominin fossil record of eastern Asia: synthesis and review.

    Science.gov (United States)

    Bae, Christopher J

    2010-01-01

    Traditionally, Middle Pleistocene hominin fossils that cannot be allocated to Homo erectus sensu lato or modern H. sapiens have been assigned to different specific taxa. For example, in eastern Asia, these hominin fossils have been classified as archaic, early, or premodern H. sapiens. An increasing number of Middle Pleistocene hominin fossils are currently being assigned to H. heidelbergensis. This is particularly the case for the African and European Middle Pleistocene hominin fossil record. There have been suggestions that perhaps the eastern Asian late Middle Pleistocene hominins can also be allocated to the H. heidelbergensis hypodigm. In this article, I review the current state of the late Middle Pleistocene hominin fossil record from eastern Asia and examine the various arguments for assigning these hominins to the different specific taxa. The two primary conclusions drawn from this review are as follows: 1) little evidence currently exists in the eastern Asian Middle Pleistocene hominin fossil record to support their assignment to H. heidelbergensis; and 2) rather than add to the growing list of hominin fossil taxa by using taxonomic names like H. daliensis for northeast Asian fossils and H. mabaensis for Southeast Asian fossils, it is better to err on the side of caution and continue to use the term archaic H. sapiens to represent all of these hominin fossils. What should be evident from this review is the need for an increase in the quality and quantity of the eastern Asian hominin fossil data set. Fortunately, with the increasing number of large-scale multidisciplinary paleoanthropological field and laboratory research projects in eastern Asia, the record is quickly becoming better understood. Copyright © 2010 Wiley-Liss, Inc.

  20. CANIS LUPUS (MAMMALIA, CANIDAE) FROM THE LATE PLEISTOCENE DEPOSIT OF AVETRANA (TARANTO, SOUTHERN ITALY)

    Directory of Open Access Journals (Sweden)

    DAVIDE F. BERTÈ

    2014-11-01

    Here we describe the remains of Canis lupus from bed 8 of the Avetrana karst filling (Late Pleistocene; Taranto, Southern Italy). The studied specimens are larger than those collected from the early Late Pleistocene Apulian localities and those referred to the recent Italian wolf. Moreover, the remains from Avetrana are morphometrically close to Canis lupus maximus from France and to C. lupus collected from Central and Northern Italian localities chronologically related to MIS 2 and MIS 3. Morphologically, the studied specimens differ slightly from both C. l. maximus and other Pleistocene Apulian wolves. The dimensional differences between the Avetrana wolves and those collected from the other early Late Pleistocene Apulian localities could be explained by the spread of a large-sized morphotype from Northern Italy.